Overview
Features
AI-Powered Document Analysis (Gemini AI)
Enhanced with Google Gemini AI for advanced document analysis, contextual understanding, and insights (requires GEMINI_API_KEY).
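A minimal sketch of how such a call might look, assuming the official `@google/generative-ai` SDK and an illustrative model name; the server's actual prompt and wiring may differ:

```typescript
// Illustrative only: a document-analysis call to Gemini.
// Assumes @google/generative-ai and the "gemini-1.5-flash" model name.
import { GoogleGenerativeAI } from "@google/generative-ai";

async function analyzeDocument(text: string): Promise<string> {
  const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY!);
  const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" });
  const result = await model.generateContent(
    `Summarize the key points of this document:\n\n${text}`
  );
  return result.response.text();
}
```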
Traditional Semantic Search
Chunk-based search using embeddings plus an in-memory keyword index for fast, accurate retrieval.
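A hybrid retrieval sketch under assumed names (`Chunk`, `keywordIndex`) and an arbitrary keyword boost; it ranks chunks by cosine similarity and nudges up those with keyword hits:

```typescript
// Sketch: embedding similarity combined with an in-memory keyword index.
interface Chunk { id: string; text: string; embedding: number[] }

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) { dot += a[i] * b[i]; na += a[i] ** 2; nb += b[i] ** 2; }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

function search(
  query: string,
  queryEmbedding: number[],
  chunks: Chunk[],
  keywordIndex: Map<string, Set<string>> // keyword -> chunk IDs containing it
) {
  const keywordHits = new Set(
    query.toLowerCase().split(/\s+/).flatMap(w => [...(keywordIndex.get(w) ?? [])])
  );
  return chunks
    .map(c => ({
      chunk: c,
      score: cosine(queryEmbedding, c.embedding) + (keywordHits.has(c.id) ? 0.2 : 0),
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, 5); // top matches
}
```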
Context Window Retrieval
Fetch surrounding chunks to provide richer context for LLM responses.
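A sketch of the idea, assuming chunks are stored in document order; `windowSize` is an illustrative parameter:

```typescript
// Sketch: return a matched chunk plus its neighbours so the LLM
// receives surrounding context rather than an isolated snippet.
function withContextWindow<T>(chunks: T[], matchIndex: number, windowSize = 2): T[] {
  const start = Math.max(0, matchIndex - windowSize);
  const end = Math.min(chunks.length, matchIndex + windowSize + 1);
  return chunks.slice(start, end);
}
```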
O(1) Document Lookup
DocumentIndex provides constant-time document lookup.
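A sketch of the underlying idea: a `Map` keyed by document ID gives O(1) lookup. The `Doc` shape is an assumption for illustration:

```typescript
// Sketch of an O(1) index: a Map from document ID to document record.
interface Doc { id: string; title: string; path: string }

class DocumentIndex {
  private byId = new Map<string, Doc>();
  add(doc: Doc): void { this.byId.set(doc.id, doc); }
  get(id: string): Doc | undefined { return this.byId.get(id); } // constant-time lookup
}
```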
LRU Embedding Cache
EmbeddingCache avoids recomputing embeddings to speed up repeated queries.
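A sketch of an LRU cache keyed by chunk text, using a `Map`'s insertion order to track recency; the capacity and key choice are assumptions:

```typescript
// Sketch: re-inserting on access keeps recently used entries last,
// so the first key in the Map is always the least recently used.
class EmbeddingCache {
  private cache = new Map<string, number[]>();
  constructor(private capacity = 1000) {}

  get(text: string): number[] | undefined {
    const hit = this.cache.get(text);
    if (hit) { this.cache.delete(text); this.cache.set(text, hit); } // refresh recency
    return hit;
  }

  set(text: string, embedding: number[]): void {
    if (this.cache.size >= this.capacity) {
      const oldest = this.cache.keys().next().value; // least recently used
      if (oldest !== undefined) this.cache.delete(oldest);
    }
    this.cache.set(text, embedding);
  }
}
```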
Parallel Chunking & Batch Processing
Parallel processing accelerates ingestion of large documents.
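A sketch of batched, concurrent embedding; `embed` and the batch size stand in for whatever the server actually uses:

```typescript
// Sketch: process chunks in fixed-size batches so several embedding
// requests run concurrently without flooding the backend.
async function embedInBatches(
  chunks: string[],
  embed: (text: string) => Promise<number[]>,
  batchSize = 8
): Promise<number[][]> {
  const results: number[][] = [];
  for (let i = 0; i < chunks.length; i += batchSize) {
    const batch = chunks.slice(i, i + batchSize);
    results.push(...(await Promise.all(batch.map(embed)))); // batch runs in parallel
  }
  return results;
}
```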
Streaming File Reader
Reads large files with low memory usage via streaming.
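A sketch of streaming reads with Node's built-in modules, yielding lines instead of loading the whole file into memory:

```typescript
// Sketch: stream a large file line by line; downstream chunking can
// consume the async generator without buffering the full file.
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

async function* readLines(path: string): AsyncGenerator<string> {
  const rl = createInterface({
    input: createReadStream(path, "utf8"),
    crlfDelay: Infinity,
  });
  for await (const line of rl) yield line;
}
```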
Local-Only Storage with Disk Persistence
All data is stored locally in ~/.mcp-documentation-server/ and persisted to disk.
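A sketch of JSON persistence under that directory; the file name and store shape are assumptions:

```typescript
// Sketch: write and read the document store as JSON under
// ~/.mcp-documentation-server/ so data survives restarts.
import { mkdirSync, writeFileSync, readFileSync, existsSync } from "node:fs";
import { homedir } from "node:os";
import { join } from "node:path";

const DATA_DIR = join(homedir(), ".mcp-documentation-server");

function saveStore(store: unknown): void {
  mkdirSync(DATA_DIR, { recursive: true });
  writeFileSync(join(DATA_DIR, "store.json"), JSON.stringify(store, null, 2));
}

function loadStore(): unknown {
  const file = join(DATA_DIR, "store.json");
  return existsSync(file) ? JSON.parse(readFileSync(file, "utf8")) : null;
}
```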
Who Is This For?
- Developers: Build MCP-driven apps and workflows with local-first document management.
- Knowledge workers: Search, summarize, and relate content across documents using AI-enabled queries.
- Content teams: Organize and retrieve information from local files efficiently.
