consult7

Consult7 is an MCP server enabling agents to consult large-context models via OpenRouter for analyzing large codebases.

269 Stars · 27 Forks · 1 Release

Overview

Consult7 is a Model Context Protocol (MCP) server that lets AI agents consult large-context-window models via OpenRouter to analyze extensive file collections: entire codebases, document repositories, or mixed content that exceeds the agent's local context limits. It collects files from the specified absolute paths (with optional wildcards in filenames), assembles them into a single context, and sends them to a chosen model together with your query; the model's response is then fed back directly to the agent.

Consult7 supports 500+ OpenRouter models with dynamic context windows, including the flagship Gemini 3 Pro with up to 1M tokens as well as Gemini 2.5 Pro/Flash, Claude Sonnet, Claude Opus, Grok, and others. Three performance modes (fast, mid, and think) balance speed against reasoning depth. An optional output_file parameter saves the result to disk and returns a concise message instead of the full response. The project also provides a CLI and a Python API, usage examples, mnemonics for referencing model+mode combinations, automatic file ignore rules, and size limits derived from each model's context window.
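
The workflow described above maps onto a small script. The sketch below is an illustration rather than Consult7's actual code: it assumes the files have already been collected into a dict (a collection sketch appears under Features below), assembles them into one prompt, and sends it to a model through OpenRouter's OpenAI-compatible chat completions endpoint. The model name and prompt framing are assumptions.

```python
# Illustrative sketch of the assemble-and-consult step, not Consult7's implementation.
# Assumes `files` maps absolute paths to file contents and OPENROUTER_API_KEY is set.
import os
import requests


def consult(files: dict[str, str], query: str, model: str = "google/gemini-2.5-pro") -> str:
    # Assemble every file into a single context block, each prefixed by its path.
    context = "\n\n".join(f"=== {path} ===\n{text}" for path, text in files.items())
    response = requests.post(
        "https://openrouter.ai/api/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
        json={
            "model": model,
            "messages": [
                {"role": "user", "content": f"{context}\n\nQuestion: {query}"},
            ],
        },
        timeout=600,
    )
    response.raise_for_status()
    # The model's answer is returned directly, mirroring how the MCP tool feeds it back to the agent.
    return response.json()["choices"][0]["message"]["content"]
```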

Details

Owner
szeider
Language
Python
License
MIT License
Updated
2025-12-07

Features

File collection and context assembly

Collects files from specified absolute paths (with wildcards in filenames) and assembles them into a single context for the model.
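
A minimal sketch of this collection step, assuming Python's glob semantics for the filename wildcards; Consult7's own matching rules may differ.

```python
# Sketch of collecting files from an absolute path with a wildcard in the filename.
import glob
import os


def collect_files(path_pattern: str) -> dict[str, str]:
    """Return {absolute path: contents} for every regular file matching the pattern."""
    collected = {}
    for path in sorted(glob.glob(path_pattern, recursive=True)):
        if os.path.isfile(path):
            with open(path, "r", encoding="utf-8", errors="replace") as f:
                collected[path] = f.read()
    return collected


# Example: gather all Python sources under a project tree.
files = collect_files("/home/user/project/**/*.py")
```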

OpenRouter-backed model support

Supports 500+ OpenRouter models with dynamic context window sizing to fit the assembled context.
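
OpenRouter publishes each model's advertised context length through its public model listing endpoint, which is one way a client could size the assembled context dynamically. The sketch below is illustrative, not Consult7's code.

```python
# Look up a model's advertised context window from OpenRouter's public model list.
import requests


def context_length(model_id: str) -> int:
    models = requests.get("https://openrouter.ai/api/v1/models", timeout=30).json()["data"]
    matches = [m for m in models if m["id"] == model_id]
    if not matches:
        raise ValueError(f"Unknown OpenRouter model: {model_id}")
    return matches[0]["context_length"]


print(context_length("google/gemini-2.5-pro"))  # prints the model's context window in tokens
```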

Performance modes

Offers fast, mid, and think modes to control reasoning depth and speed.

Output to file

Optional output_file parameter saves results to disk and returns a concise notification.
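
A sketch of how the output_file option changes what comes back to the agent; the parameter name output_file comes from the project description, while the function and its behavior here are assumptions.

```python
# When output_file is given, persist the full answer and hand back only a short note.
from pathlib import Path


def deliver(answer: str, output_file: str | None = None) -> str:
    if output_file is None:
        return answer  # full response goes straight back to the agent
    Path(output_file).write_text(answer, encoding="utf-8")
    return f"Response written to {output_file}"  # concise notification instead of the full text
```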

Dynamic size limits

File size limits scale dynamically with each model's context window; for example, Gemini 3 Pro accepts up to 1M tokens.
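
A rough sketch of a context-window-based size check; the 4-characters-per-token heuristic and the reply reserve are assumptions, not Consult7's actual accounting.

```python
# Reject an assembled context that would overflow the chosen model's window.
def check_fits(context: str, model_context_tokens: int, reserve_for_reply: int = 8_000) -> None:
    estimated_tokens = len(context) // 4  # crude ~4 chars/token heuristic
    budget = model_context_tokens - reserve_for_reply
    if estimated_tokens > budget:
        raise ValueError(
            f"Context of ~{estimated_tokens} tokens exceeds the {budget}-token budget; "
            "narrow the file pattern or pick a larger-context model."
        )
```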

CLI and API integration

Provides a command-line interface and a Python API for programmatic access, with JSON-based parameters.

Ignore rules

Automatically excludes common files/folders such as __pycache__, .env, secrets.py, .DS_Store, .git, node_modules.
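
A sketch of such an ignore filter over collected paths, using the names listed above; the matching rules themselves are assumptions.

```python
# Filter out commonly ignored files and folders before assembling the context.
IGNORED_NAMES = {"__pycache__", ".env", "secrets.py", ".DS_Store", ".git", "node_modules"}


def is_ignored(path: str) -> bool:
    # Ignore a file if any path component matches an ignored name.
    return any(part in IGNORED_NAMES for part in path.split("/"))


paths = ["/proj/app.py", "/proj/node_modules/x.js", "/proj/secrets.py"]
print([p for p in paths if not is_ignored(p)])  # ['/proj/app.py']
```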

Mnemonics for model/mode combos

Includes mnemonics like gemt, gptt, grot, gemf, ULTRA to reference model + mode configurations.
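
A sketch of how such mnemonics could resolve to a model plus a performance mode; the specific pairings and model ids below are illustrative guesses, not the project's actual table.

```python
# Hypothetical mnemonic table: short names resolving to (OpenRouter model id, mode).
MNEMONICS = {
    "gemt": ("google/gemini-2.5-pro", "think"),
    "gemf": ("google/gemini-2.5-flash", "fast"),
    "gptt": ("openai/gpt-5", "think"),
    "grot": ("x-ai/grok-4", "think"),
    "ULTRA": ("google/gemini-2.5-pro", "think"),
}


def resolve(mnemonic: str) -> tuple[str, str]:
    model, mode = MNEMONICS[mnemonic]
    return model, mode
```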

Audience

AI agents: Offload large-context file analysis to OpenRouter-backed models.
Developers: Analyze large codebases and document repositories beyond agent context limits.
Research teams: Summarize and analyze large document collections with high-context models.

Tags

MCP, Consult7, OpenRouter, large-context models, codebase analysis, file collection, Gemini 3 Pro, Grok 4, Claude Sonnet, performance modes