Minima

MCP server for RAG on local files

1,012 Stars · 96 Forks · 1 Release

Overview

Minima is an open-source, on-premises RAG solution that runs in containers and integrates with ChatGPT and MCP. It supports three modes: an isolated on-prem deployment with all components (LLM, embedding, and reranker) running locally; Custom GPT mode, where you query your local documents through the ChatGPT app or web with custom GPTs while the indexer operates on your PC or cloud and ChatGPT acts as the primary LLM; and Anthropic Claude mode, which uses Claude as the primary LLM with a local indexer.

Minima indexes documents from LOCAL_FILES_PATH recursively and supports file types such as .pdf, .xls, .docx, .txt, .md, and .csv. The stack is configurable via EMBEDDING_MODEL_ID and EMBEDDING_SIZE for Qdrant storage, OLLAMA_MODEL for the LLM, and RERANKER_MODEL for reranking, with USER_ID for authentication and PASSWORD for a Firebase account.

Minima offers multiple docker-compose options for fully local, ChatGPT-enabled, and MCP-integrated deployments, plus guidance for MCP usage with Anthropic Desktop and Copilot. An MCP workflow can also be installed via Smithery.
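
The environment variables named above would typically be collected in a `.env` file along these lines. Only the variable names come from the description; every value shown is an illustrative assumption, not a documented default:

```env
# Illustrative .env sketch — values are assumptions, only the
# variable names come from the project description.
LOCAL_FILES_PATH=/home/user/docs          # root folder indexed recursively
EMBEDDING_MODEL_ID=sentence-transformers/all-mpnet-base-v2  # embedding model for Qdrant
EMBEDDING_SIZE=768                        # must match the embedding model's output dimension
OLLAMA_MODEL=qwen2:0.5b                   # local LLM served by Ollama
RERANKER_MODEL=BAAI/bge-reranker-base     # reranking model
USER_ID=you@example.com                   # authentication user
PASSWORD=your-firebase-password           # Firebase account password
```

Note that EMBEDDING_SIZE has to agree with the chosen embedding model's output dimension, since Qdrant collections are created with a fixed vector size.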

Details

Owner
dmayboroda
Language
Python
License
Mozilla Public License 2.0
Updated
2025-12-07

Features

On-Premises isolated deployment

Operate fully on-premises with containers; all LLM, embedding, and reranker components run locally without external dependencies.

Custom GPT local docs integration

Query local documents via the ChatGPT app or web with custom GPTs, while the indexer runs on your PC or cloud and ChatGPT serves as the LLM.

Anthropic Claude integration

Use Anthropic Claude as the primary LLM for local document querying with a local indexer.

MCP Desktop integration

Supports MCP integration via the MCP docker-compose file and the Anthropic Desktop app configuration.

Multi-mode deployment via docker-compose

Three docker-compose options (local, ChatGPT-enabled, MCP-enabled) to fit different usage scenarios.

Local indexing of documents

Recursively index LOCAL_FILES_PATH; supports .pdf, .xls, .docx, .txt, .md, .csv.

Configurable ML stack

Configure EMBEDDING_MODEL_ID and EMBEDDING_SIZE for Qdrant, OLLAMA_MODEL for LLM, and RERANKER_MODEL for reranking; includes USER_ID and PASSWORD for ChatGPT integration.

MCP installation and tooling

Smithery-based MCP installation path; Copilot- and Claude-specific MCP configuration supported; includes mcp.json guidance and OTP handling.
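
MCP desktop clients such as Claude Desktop generally register servers under an `mcpServers` key in their JSON configuration. A sketch of what the mcp.json guidance mentioned above might look like follows; the command and arguments are hypothetical placeholders, not Minima's documented values:

```json
{
  "mcpServers": {
    "minima": {
      "command": "uv",
      "args": ["--directory", "/path/to/minima/mcp-server", "run", "minima"]
    }
  }
}
```

Consult the repository's own MCP instructions for the exact command, arguments, and any required OTP step.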

Audience

Developers: Set up on-prem RAG with local document indexing and MCP integration.
IT/DevOps: Deploy containerized Minima across teams with MCP and local data access.
End users: Query local documents via ChatGPT, Claude, or Copilot using local indexing.
MCP developers: Integrate Minima with MCP desktop apps and Copilot workflows.

Tags

MCP · RAG · on-premises · local-files · embedding · Ollama · Reranker · ChatGPT · Anthropic Claude · Copilot · Minima · Docker · Smithery · document-search · local-indexing