Overview
Features
Query multiple Ollama models with a single question
Sends a single query to all configured Ollama models and collects their responses for synthesis.
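The fan-out step can be sketched with plain HTTP against Ollama's `/api/chat` endpoint. This is a minimal sketch, not the server's actual implementation; the model names and personas are illustrative assumptions.

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint


def build_payload(model: str, system_prompt: str, question: str) -> dict:
    """Assemble the request body that Ollama's /api/chat endpoint expects."""
    return {
        "model": model,
        "stream": False,  # return one complete response instead of a token stream
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": question},
        ],
    }


def ask_council(question: str, personas: dict[str, str]) -> dict[str, str]:
    """Send the same question to every configured model; map model -> answer."""
    answers = {}
    for model, system_prompt in personas.items():
        body = json.dumps(build_payload(model, system_prompt, question)).encode()
        req = request.Request(
            OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
        )
        with request.urlopen(req) as resp:
            answers[model] = json.load(resp)["message"]["content"]
    return answers
```

The collected `answers` dictionary is what a synthesis step (or Claude, via MCP) would then summarize into a single recommendation.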
Assign different roles/personas to each model
Configure a unique system prompt per model to encourage diverse viewpoints.
View all available Ollama models on your system
Uses the `list-available-models` tool to enumerate installed models and identify defaults.
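Under the hood, enumerating installed models amounts to querying Ollama's `/api/tags` endpoint. A sketch, assuming a local Ollama daemon on the default port:

```python
import json
from urllib import request


def extract_model_names(tags_payload: dict) -> list[str]:
    """Pull model names out of the JSON document /api/tags returns."""
    return [m["name"] for m in tags_payload.get("models", [])]


def list_ollama_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Ask a local Ollama daemon for every installed model."""
    with request.urlopen(f"{base_url}/api/tags") as resp:
        return extract_model_names(json.load(resp))
```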
Customize system prompts for each model
Set per-model system prompts via environment variables to shape behavior.
Configure via environment variables
Manage server behavior, model list, and prompts through a .env file for easy setup.
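A `.env` file might look like the following. The variable names here are hypothetical placeholders; check the project's own documentation for the exact keys it reads.

```shell
# Hypothetical variable names for illustration only
OLLAMA_HOST=http://localhost:11434
MODELS=llama3:8b,mistral:7b,gemma2:9b

# One system prompt per model to encourage diverse viewpoints
SYSTEM_PROMPT_LLAMA3=You are a pragmatic engineer. Favor simple, proven solutions.
SYSTEM_PROMPT_MISTRAL=You are a skeptic. Surface risks and edge cases.
SYSTEM_PROMPT_GEMMA2=You are a creative thinker. Propose unconventional options.
```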
Integrate with Claude for Desktop
Seamless integration with Claude for Desktop to enable an advisory workflow.
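Claude for Desktop discovers MCP servers through its `claude_desktop_config.json` file. A sketch of an entry for this server, where the server name, command, and script path are placeholders to adapt to your setup:

```json
{
  "mcpServers": {
    "ollama-council": {
      "command": "uv",
      "args": ["run", "server.py"]
    }
  }
}
```

After editing the config, restart Claude for Desktop so it picks up the new server.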
Who Is This For?
- Developers: Build and deploy an MCP server that orchestrates multiple Ollama models and synthesizes their perspectives.
- Claude for Desktop users: Connect Claude to orchestrate model perspectives and receive synthesized advice via MCP.
- AI teams / researchers: Experiment with ensemble reasoning and diverse model viewpoints for richer answers.