Laurus is a search platform written in Rust; the name stands for Lexical Augmented Unified Retrieval Using Semantics. Built on a core library covering lexical, vector, and hybrid search, it provides multiple ready-to-use interfaces:
- Core Library — Modular search engine embeddable into any application
- CLI & REPL — Command-line tool for interactive search experiences
- gRPC Server & HTTP Gateway — Seamless integration with microservices and existing systems
- MCP Server — Direct integration with AI assistants such as Claude
- Python Bindings — Native Python package for use in data science and AI workflows
Whether embedded as a library, deployed as a standalone server, called from Python, or woven into AI workflows, Laurus is a composable search foundation.
Comprehensive documentation is available online:
- English: https://mosuka.github.io/laurus/
- Japanese (日本語): https://mosuka.github.io/laurus/ja/
- Getting Started
- Core Concepts
- Schema & Fields
- Text Analysis
- Embeddings
- Storage
- Indexing (Lexical / Vector)
- Search (Lexical / Vector / Hybrid)
- Query DSL
- Crate Guides
- laurus (Library) — Engine, Scoring, Faceting, Highlighting, Spelling Correction, Persistence & WAL
- laurus-cli — Command-line interface, REPL, Schema Format
- laurus-server — gRPC server, HTTP Gateway, Configuration
- laurus-mcp — MCP server for AI assistants (Claude, etc.)
- laurus-python — Python bindings (PyPI package)
- Development
- API Reference (docs.rs)
- Pure Rust Implementation: Memory safety and high performance with zero-cost abstractions.
- Hybrid Search: Seamlessly combine BM25 lexical search with HNSW vector search using configurable fusion strategies.
- Multimodal Capabilities: Native support for text-to-image and image-to-image search via CLIP embeddings.
- Rich Query DSL: Term, phrase, boolean, fuzzy, wildcard, range, geographic, and span queries.
- Flexible Analysis: Configurable pipelines for tokenization, normalization, and stemming (including CJK support via Lindera).
- Pluggable Storage: Interfaces for in-memory, file-system, and memory-mapped storage backends.
- Scoring & Ranking: BM25 scoring with customizable fusion strategies for hybrid results.
- Faceting & Highlighting: Built-in support for faceted navigation and search result highlighting.
- Spelling Correction: Suggest corrections for misspelled query terms.
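The hybrid-search fusion step can be illustrated with Reciprocal Rank Fusion (RRF), one common strategy for merging a BM25 ranking with a vector ranking. The source does not specify which fusion strategies Laurus ships, so this is a standalone sketch that uses no Laurus APIs:

```rust
use std::collections::HashMap;

/// Reciprocal Rank Fusion: score(d) = sum over lists of 1 / (k + rank(d)),
/// where rank is 1-based and k dampens the influence of top ranks.
fn rrf_fuse(rankings: &[Vec<&str>], k: f64) -> Vec<(String, f64)> {
    let mut scores: HashMap<String, f64> = HashMap::new();
    for list in rankings {
        for (i, doc) in list.iter().enumerate() {
            // i is 0-based, so the RRF rank is i + 1
            *scores.entry((*doc).to_string()).or_insert(0.0) += 1.0 / (k + (i as f64 + 1.0));
        }
    }
    let mut fused: Vec<(String, f64)> = scores.into_iter().collect();
    // Sort by fused score, highest first
    fused.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    fused
}

fn main() {
    let lexical = vec!["doc1", "doc3", "doc2"]; // e.g. a BM25 ranking
    let vector = vec!["doc2", "doc1", "doc4"]; // e.g. an HNSW ranking
    for (doc, score) in rrf_fuse(&[lexical, vector], 60.0) {
        println!("{doc}: {score:.4}");
    }
}
```

A document ranked well in both lists (doc1 here) outscores one that is strong in only a single list, which is why RRF is a popular default for hybrid search.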
Laurus is organized as a Cargo workspace with five crates:

| Crate | Description |
|---|---|
| laurus | Core search library — schema, analysis, indexing, search, and storage |
| laurus-cli | Command-line interface with REPL for interactive search |
| laurus-server | gRPC server with HTTP gateway for deploying Laurus as a service |
| laurus-mcp | MCP server for AI assistants (Claude, etc.) via stdio transport |
| laurus-python | Python bindings (PyPI package) built with PyO3 and Maturin |
The laurus crate provides optional feature flags for embedding support:

| Feature | Description |
|---|---|
| embeddings-candle | Local BERT embeddings via Candle |
| embeddings-openai | Cloud-based embeddings via the OpenAI API |
| embeddings-multimodal | CLIP-based multimodal (text + image) embeddings |
| embeddings-all | Enable all embedding backends |
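As a sketch of how a downstream project might opt into one of these flags in its Cargo.toml (the version requirement below is a placeholder, not taken from the source):

```toml
[dependencies]
# Enable local BERT embeddings; swap in embeddings-openai,
# embeddings-multimodal, or embeddings-all as needed.
laurus = { version = "*", features = ["embeddings-candle"] }
```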
```rust
use laurus::lexical::{TermQuery, TextOption};
use laurus::storage::memory::MemoryStorageConfig;
use laurus::storage::{StorageConfig, StorageFactory};
use laurus::{Document, Engine, LexicalSearchRequest, Schema, SearchRequestBuilder};

#[tokio::main]
async fn main() -> laurus::Result<()> {
    // 1. Create storage
    let storage = StorageFactory::create(StorageConfig::Memory(MemoryStorageConfig::default()))?;

    // 2. Define schema
    let schema = Schema::builder()
        .add_text_field("title", TextOption::default())
        .add_text_field("body", TextOption::default())
        .build();

    // 3. Create engine
    let engine = Engine::new(storage, schema).await?;

    // 4. Index documents
    engine
        .add_document(
            "doc1",
            Document::builder()
                .add_text("title", "Introduction to Rust")
                .add_text(
                    "body",
                    "Rust is a systems programming language focused on safety and performance.",
                )
                .build(),
        )
        .await?;
    engine
        .add_document(
            "doc2",
            Document::builder()
                .add_text("title", "Python for Data Science")
                .add_text(
                    "body",
                    "Python is a versatile language widely used in data science and machine learning.",
                )
                .build(),
        )
        .await?;
    engine.commit().await?;

    // 5. Search
    let results = engine
        .search(
            SearchRequestBuilder::new()
                .lexical_search_request(LexicalSearchRequest::new(Box::new(TermQuery::new(
                    "body", "rust",
                ))))
                .limit(5)
                .build(),
        )
        .await?;
    for hit in &results {
        println!("score={:.4}", hit.score);
    }

    Ok(())
}
```

You can find usage examples in the laurus/examples/ directory:
| Example | Description | Feature Flag |
|---|---|---|
| quickstart | Basic full-text search | — |
| lexical_search | All query types (Term, Phrase, Boolean, Fuzzy, Wildcard, Range, Geo, Span) | — |
| vector_search | Semantic similarity search with embeddings | — |
| hybrid_search | Combining lexical and vector search with fusion | — |
| synonym_graph_filter | Synonym expansion in analysis pipeline | — |
| search_with_candle | Local BERT embeddings via Candle | embeddings-candle |
| search_with_openai | Cloud-based embeddings via OpenAI | embeddings-openai |
| multimodal_search | Text-to-image and image-to-image search | embeddings-multimodal |
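Assuming the standard Cargo example workflow (the source does not spell out the commands), the plain examples run directly, while feature-gated ones need the matching flag:

```sh
# Basic full-text search example
cargo run --example quickstart

# Feature-gated example: enable the flag listed in the table
cargo run --example search_with_candle --features embeddings-candle
```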
We welcome contributions!

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.