mosuka/laurus

Laurus : Lexical Augmented Unified Retrieval Using Semantics


Laurus is a search platform written in Rust; the name stands for Lexical Augmented Unified Retrieval Using Semantics. Built on a core library covering lexical, vector, and hybrid search, it provides multiple ready-to-use interfaces:

  • Core Library — Modular search engine embeddable into any application
  • CLI & REPL — Command-line tool for interactive search experiences
  • gRPC Server & HTTP Gateway — Seamless integration with microservices and existing systems
  • MCP Server — Direct integration with AI assistants such as Claude
  • Python Bindings — Native Python package for use in data science and AI workflows

Whether embedded as a library, deployed as a standalone server, called from Python, or woven into AI workflows, Laurus is a composable search foundation.

Documentation

Comprehensive documentation is available online.


Features

  • Pure Rust Implementation: Memory-safe and fast performance with zero-cost abstractions.
  • Hybrid Search: Seamlessly combine BM25 lexical search with HNSW vector search using configurable fusion strategies.
  • Multimodal Capabilities: Native support for text-to-image and image-to-image search via CLIP embeddings.
  • Rich Query DSL: Term, phrase, boolean, fuzzy, wildcard, range, geographic, and span queries.
  • Flexible Analysis: Configurable pipelines for tokenization, normalization, and stemming (including CJK support via Lindera).
  • Pluggable Storage: Interfaces for in-memory, file-system, and memory-mapped storage backends.
  • Scoring & Ranking: BM25 scoring with customizable fusion strategies for hybrid results.
  • Faceting & Highlighting: Built-in support for faceted navigation and search result highlighting.
  • Spelling Correction: Suggest corrections for misspelled query terms.
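
A common way to combine a lexical ranking with a vector ranking, as hybrid search does, is reciprocal rank fusion (RRF). The sketch below illustrates the general technique only; it is not the laurus API (laurus configures its fusion strategies on the search request), and the `rrf` function and the constant `k = 60.0` are illustrative choices.

```rust
use std::collections::HashMap;

/// Fuse several ranked result lists with reciprocal rank fusion.
/// Each list contributes 1 / (k + rank) to a document's fused score;
/// k dampens the dominance of top-ranked hits.
fn rrf(rankings: &[Vec<&str>], k: f64) -> Vec<(String, f64)> {
    let mut scores: HashMap<String, f64> = HashMap::new();
    for ranking in rankings {
        for (rank, doc) in ranking.iter().enumerate() {
            *scores.entry(doc.to_string()).or_insert(0.0) += 1.0 / (k + rank as f64 + 1.0);
        }
    }
    // Sort descending by fused score.
    let mut fused: Vec<(String, f64)> = scores.into_iter().collect();
    fused.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    fused
}

fn main() {
    let lexical = vec!["doc1", "doc3", "doc2"]; // BM25 ranking
    let vector = vec!["doc2", "doc1", "doc4"]; // HNSW ranking
    let fused = rrf(&[lexical, vector], 60.0);
    // doc1 appears near the top of both lists, so it ranks first.
    println!("{:?}", fused);
}
```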

Workspace Structure

Laurus is organized as a Cargo workspace with 5 crates:

Crate           Description
laurus          Core search library — schema, analysis, indexing, search, and storage
laurus-cli      Command-line interface with REPL for interactive search
laurus-server   gRPC server with HTTP gateway for deploying Laurus as a service
laurus-mcp      MCP server for AI assistants (Claude, etc.) via stdio transport
laurus-python   Python bindings (PyPI package) built with PyO3 and Maturin

Feature Flags

The laurus crate provides optional feature flags for embedding support:

Feature                 Description
embeddings-candle       Local BERT embeddings via Candle
embeddings-openai       Cloud-based embeddings via the OpenAI API
embeddings-multimodal   CLIP-based multimodal (text + image) embeddings
embeddings-all          Enable all embedding backends
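
In Cargo.toml, a flag is enabled as a feature on the laurus dependency. A minimal sketch (the version number is illustrative; check crates.io for the current release):

```toml
[dependencies]
laurus = { version = "0.1", features = ["embeddings-candle"] }
```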

Quick Start

use laurus::lexical::{TermQuery, TextOption};
use laurus::storage::memory::MemoryStorageConfig;
use laurus::storage::{StorageConfig, StorageFactory};
use laurus::{Document, Engine, LexicalSearchRequest, Schema, SearchRequestBuilder};

#[tokio::main]
async fn main() -> laurus::Result<()> {
    // 1. Create storage
    let storage = StorageFactory::create(StorageConfig::Memory(MemoryStorageConfig::default()))?;

    // 2. Define schema
    let schema = Schema::builder()
        .add_text_field("title", TextOption::default())
        .add_text_field("body", TextOption::default())
        .build();

    // 3. Create engine
    let engine = Engine::new(storage, schema).await?;

    // 4. Index documents
    engine
        .add_document(
            "doc1",
            Document::builder()
                .add_text("title", "Introduction to Rust")
                .add_text(
                    "body",
                    "Rust is a systems programming language focused on safety and performance.",
                )
                .build(),
        )
        .await?;
    engine
        .add_document(
            "doc2",
            Document::builder()
                .add_text("title", "Python for Data Science")
                .add_text(
                    "body",
                    "Python is a versatile language widely used in data science and machine learning.",
                )
                .build(),
        )
        .await?;
    engine.commit().await?;

    // 5. Search
    let results = engine
        .search(
            SearchRequestBuilder::new()
                .lexical_search_request(LexicalSearchRequest::new(Box::new(TermQuery::new(
                    "body", "rust",
                ))))
                .limit(5)
                .build(),
        )
        .await?;

    for hit in &results {
        println!("score={:.4}", hit.score);
    }

    Ok(())
}

Examples

You can find usage examples in the laurus/examples/ directory:

Example                Description                                                                  Feature Flag
quickstart             Basic full-text search
lexical_search         All query types (Term, Phrase, Boolean, Fuzzy, Wildcard, Range, Geo, Span)
vector_search          Semantic similarity search with embeddings
hybrid_search          Combining lexical and vector search with fusion
synonym_graph_filter   Synonym expansion in analysis pipeline
search_with_candle     Local BERT embeddings via Candle                                             embeddings-candle
search_with_openai     Cloud-based embeddings via OpenAI                                            embeddings-openai
multimodal_search      Text-to-image and image-to-image search                                      embeddings-multimodal
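
Examples that depend on a feature flag need that flag enabled when they are run; for instance (a sketch, run from the workspace root):

```shell
cargo run --example quickstart
cargo run --example search_with_candle --features embeddings-candle
```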

Contributing

We welcome contributions!

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.
