Multi-Model Advisor

An MCP server that queries multiple Ollama models and synthesizes their perspectives.

69 Stars · 19 Forks · 0 Releases

Overview

Multi-Model Advisor is an MCP server that orchestrates queries across several Ollama models, each assigned a distinct persona, to present diverse AI perspectives on a single question. It exposes two MCP tools, list-available-models and query-models, which let users discover installed models and run multi-model queries. The server collects each model's response and passes them to Claude for Desktop, which synthesizes a comprehensive answer, effectively forming a 'council of advisors.'

Configuration is handled through an environment file (.env), where you specify the server name and version, the Ollama API URL, and the default models. Each model can be given its own system prompt to shape its role; the example settings pair gemma3:1b, llama3.2:1b, and deepseek-r1:1.5b with prompts that encourage creativity, empathy, and analytical reasoning, respectively. The server requires Node.js 16+, a local Ollama installation, and Claude for Desktop integration. It plugs into Claude's desktop tooling for an enhanced advisory experience and includes troubleshooting guidance for Ollama connectivity, model availability, and memory constraints.
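The query flow described above can be sketched in TypeScript, assuming Ollama's standard REST chat endpoint (POST /api/chat); the function names and the labeled-report format here are illustrative, not the server's actual code:

```typescript
// Hypothetical sketch of the multi-model query flow; names are illustrative.
type ModelConfig = { model: string; systemPrompt: string };

// Ask one Ollama model the question, using its persona as the system prompt.
// POST /api/chat is Ollama's chat endpoint; stream: false returns one reply.
async function queryModel(
  baseUrl: string,
  cfg: ModelConfig,
  question: string,
): Promise<string> {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: cfg.model,
      stream: false,
      messages: [
        { role: "system", content: cfg.systemPrompt },
        { role: "user", content: question },
      ],
    }),
  });
  const data = (await res.json()) as { message: { content: string } };
  return data.message.content;
}

// Pure helper: label each answer with its model so Claude can synthesize
// the individual perspectives into one comprehensive response.
function formatAdvice(answers: Record<string, string>): string {
  return Object.entries(answers)
    .map(([model, text]) => `### ${model}\n${text}`)
    .join("\n\n");
}
```

In this sketch, query-models would call queryModel once per configured model and return the formatted report for Claude to synthesize.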

Details

Owner
YuChenSSR
Language
TypeScript
License
MIT License
Updated
2025-12-07

Features

Query multiple Ollama models with a single question

Sends a single query to all configured Ollama models and collects their responses for synthesis.

Assign different roles/personas to each model

Configure a unique system prompt per model to encourage diverse viewpoints.

View all available Ollama models on your system

Uses list-available-models to enumerate installed models and identify defaults.
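The enumeration step can be sketched as follows, assuming Ollama's GET /api/tags endpoint for listing pulled models (the helper names are illustrative, not the tool's actual implementation):

```typescript
// Hedged sketch of model discovery; names are illustrative.
type TagsResponse = { models: { name: string }[] };

// Pure step: pull model names out of Ollama's /api/tags response shape.
function extractModelNames(data: TagsResponse): string[] {
  return data.models.map((m) => m.name);
}

// GET /api/tags is Ollama's endpoint for listing locally installed models.
async function listAvailableModels(
  baseUrl = "http://localhost:11434",
): Promise<string[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  return extractModelNames((await res.json()) as TagsResponse);
}
```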

Customize system prompts for each model

Set per-model system prompts via environment variables to shape behavior.

Configure via environment variables

Manage server behavior, model list, and prompts through a .env file for easy setup.
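A sketch of what such a .env file might look like; the variable names below are assumptions based on the overview (server identity, Ollama URL, default models, per-model prompts), not the server's exact keys:

```env
# Hypothetical .env sketch; variable names are assumptions, not exact keys.
SERVER_NAME=multi-model-advisor
SERVER_VERSION=1.0.0
OLLAMA_API_URL=http://localhost:11434
DEFAULT_MODELS=gemma3:1b,llama3.2:1b,deepseek-r1:1.5b

# One system prompt per model to give each advisor a distinct persona.
GEMMA_SYSTEM_PROMPT=You are a creative advisor who proposes unconventional ideas.
LLAMA_SYSTEM_PROMPT=You are an empathetic advisor focused on human impact.
DEEPSEEK_SYSTEM_PROMPT=You are an analytical advisor who reasons step by step.
```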

Integrate with Claude for Desktop

Seamless integration with Claude for Desktop to enable an advisory workflow.
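Claude for Desktop registers MCP servers in its claude_desktop_config.json. A sketch of an entry for this server, where the server key and build path are assumptions for illustration:

```json
{
  "mcpServers": {
    "multi-model-advisor": {
      "command": "node",
      "args": ["/absolute/path/to/multi-model-advisor/build/index.js"]
    }
  }
}
```

After restarting Claude for Desktop, the list-available-models and query-models tools should appear in Claude's tool list.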

Audience

Developers: Build and deploy an MCP server that orchestrates multiple Ollama models and synthesizes their perspectives.
Claude for Desktop users: Connect Claude to orchestrate model perspectives and receive synthesized advice via MCP.
AI teams / researchers: Experiment with ensemble reasoning and diverse model viewpoints for richer answers.

Tags

MCP · Model Context Protocol · Ollama · Claude · multi-model · advisor · perspectives · integration