Moorcheh

An MCP server integrating Moorcheh's embedding, vector store, semantic search, and AI answer services.

Stars: 2
Forks: 0
Releases: 0

Overview

The Moorcheh MCP Server is a Model Context Protocol (MCP) server that provides seamless integration with Moorcheh's Embedding, Vector Store, Search, and Gen AI Answer services. Through MCP, it enables document embedding, indexing, semantic search, and AI-powered answer generation.

The server supports namespace-based data organization, allowing you to create, list, and delete namespaces; upload and manage text documents and vector embeddings; and retrieve or remove data items by ID. It performs vector-similarity search across all namespaces and returns AI-generated responses grounded in your stored data.

The server can be run with no installation via NPX or installed locally by cloning the repository; either way it needs a Moorcheh API key stored in a .env file. It exposes tools such as list-namespaces, create-namespace, delete-namespace, upload-text, upload-vectors, get-data, delete-data, search, and answer. It also supports multiple Bedrock models (Claude, Llama) and includes environment configuration guidance for smooth setup.
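
As a rough illustration of the NPX path described above, the sketch below connects an MCP client to the server over stdio using the @modelcontextprotocol/sdk client. The npm package name and the environment variable name are assumptions for illustration, not values documented on this page; check the repository README for the actual ones.

    // Minimal connection sketch (Node 18+, ES module) using the official MCP SDK.
    // The package name "moorcheh-mcp-server" and the MOORCHEH_API_KEY variable
    // name are placeholders assumed for illustration.
    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    // Launch the server through NPX and hand it the Moorcheh API key from .env.
    const transport = new StdioClientTransport({
      command: "npx",
      args: ["-y", "moorcheh-mcp-server"], // placeholder package name
      env: { ...process.env, MOORCHEH_API_KEY: process.env.MOORCHEH_API_KEY ?? "" },
    });

    const client = new Client({ name: "moorcheh-example", version: "1.0.0" });
    await client.connect(transport);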

Details

Owner: moorcheh-ai
Language: JavaScript
License: Apache License 2.0
Updated: 2025-12-07

Features

Seamless Moorcheh integration

Provides a unified interface to Moorcheh's embedding, vector store, search, and AI answer services via MCP.
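
For instance, reusing the connected client from the sketch after the Overview, a caller can enumerate the tools the server exposes:

    // List the tools the server exposes (list-namespaces, upload-text, search,
    // answer, ...), reusing the `client` connected in the earlier sketch.
    const { tools } = await client.listTools();
    for (const tool of tools) {
      console.log(`${tool.name}: ${tool.description ?? ""}`);
    }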

Namespace management

Create, list, and delete namespaces to organize data efficiently.
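
A hedged sketch of the namespace tools named in the Overview; the namespace argument name is an assumption, since this listing does not document the tool schemas.

    // Namespace lifecycle via MCP tool calls; argument names are assumed.
    await client.callTool({ name: "create-namespace", arguments: { namespace: "docs" } });

    const namespaces = await client.callTool({ name: "list-namespaces", arguments: {} });
    console.log(namespaces.content);

    await client.callTool({ name: "delete-namespace", arguments: { namespace: "docs" } });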

Document and vector management

Upload and manage text documents and vector embeddings within namespaces.
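
A sketch of how upload-text and upload-vectors might be invoked; the document IDs, field names, and vector values are illustrative assumptions.

    // Upload a text document and a precomputed embedding; field names are assumed.
    await client.callTool({
      name: "upload-text",
      arguments: {
        namespace: "docs",
        documents: [{ id: "doc-1", text: "Moorcheh supports semantic search." }],
      },
    });

    await client.callTool({
      name: "upload-vectors",
      arguments: {
        namespace: "vectors",
        vectors: [{ id: "vec-1", vector: [0.12, -0.08, 0.33], metadata: { source: "demo" } }],
      },
    });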

Data retrieval and deletion

Retrieve documents by ID and remove specific items from a namespace.
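
A sketch of retrieval and deletion by ID, again with assumed argument names.

    // Fetch items by ID, then remove them; argument names are assumed.
    const items = await client.callTool({
      name: "get-data",
      arguments: { namespace: "docs", ids: ["doc-1"] },
    });
    console.log(items.content);

    await client.callTool({
      name: "delete-data",
      arguments: { namespace: "docs", ids: ["doc-1"] },
    });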

Semantic search

Perform vector-based semantic search across namespaces for relevant results.
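
A sketch of a search call; the query, namespaces, and top_k parameter names are assumptions.

    // Vector-similarity search across namespaces; parameter names are assumed.
    const results = await client.callTool({
      name: "search",
      arguments: {
        namespaces: ["docs"],
        query: "How does Moorcheh index documents?",
        top_k: 5,
      },
    });
    console.log(results.content);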

AI-powered answers

Generate AI-assisted responses grounded in stored data using Moorcheh's AI capabilities.
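
A sketch of the answer tool, which returns a response grounded in stored data; parameter names are assumptions.

    // Request an AI-generated answer grounded in a namespace's stored data.
    // Parameter names are assumed for illustration.
    const answer = await client.callTool({
      name: "answer",
      arguments: {
        namespace: "docs",
        query: "Summarize what the stored documents say about semantic search.",
      },
    });
    console.log(answer.content);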

Deployment flexibility

Run via NPX with no installation or install locally from source for full control.

Model support

Supports multiple Bedrock models (e.g., Claude, Llama) for inference.
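
This listing does not say how a model is chosen; if the answer tool accepts a model argument, a call might look like the sketch below. The aiModel parameter name is hypothetical, while the IDs shown are standard Bedrock identifiers for Claude and Llama.

    // Hypothetical model selection: "aiModel" is an assumed parameter name, and
    // which Bedrock models are available depends on the server's configuration.
    await client.callTool({
      name: "answer",
      arguments: {
        namespace: "docs",
        query: "Which topics are covered in this namespace?",
        aiModel: "anthropic.claude-3-5-sonnet-20240620-v1:0",
        // or, for Llama: "meta.llama3-70b-instruct-v1:0"
      },
    });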

Audience

Developers: Integrate embedding, vector storage, and AI answers to build apps.
Data engineers: Set up namespaces, upload data, and enable semantic search across documents.
AI teams: Design secure chatbots and RAG systems using Moorcheh MCP capabilities.

Tags

MCP server, embedding, vector store, semantic search, AI answers, namespace management, document management, RAG, Moorcheh, Bedrock models, environment configuration, NPX, Claude, Llama