Overview
Features
- Dual Mode Support (OpenAI API): Operates in OpenAI API mode, using API_KEY and BASE_URL to access Deepseek's reasoning remotely.
- Dual Mode Support (Ollama Local): Operates in Ollama local mode, enabled by setting USE_OLLAMA=true, for fully local reasoning.
- Focused Reasoning: Captures Deepseek's thinking process and presents it as structured reasoning output.
- Exposed MCP Tool (get-deepseek-thinker): Performs reasoning with the Deepseek model; accepts an originPrompt parameter and returns structured reasoning.
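As a sketch, an MCP client invokes the tool with a JSON-RPC 2.0 `tools/call` request. The envelope below assumes the standard MCP tool-call shape; the prompt text and request id are illustrative:

```typescript
// Minimal sketch of the JSON-RPC 2.0 envelope an MCP client sends to
// invoke the get-deepseek-thinker tool. Prompt text is illustrative.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: {
    name: string;
    arguments: { originPrompt: string };
  };
}

function buildThinkerRequest(id: number, originPrompt: string): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: {
      name: "get-deepseek-thinker",
      arguments: { originPrompt },
    },
  };
}

const req = buildThinkerRequest(1, "Why is the sky blue?");
console.log(JSON.stringify(req));
```

The server's response carries the captured reasoning back in the standard MCP tool-result content.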
- Easy Client Integration: Configurable integration with MCP-enabled clients (e.g., Claude Desktop) via claude_desktop_config.json.
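A Claude Desktop entry might look like the fragment below. The server name key, the install path, and the placeholder values are assumptions; the env block mirrors the environment variables described in this document:

```json
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "node",
      "args": ["/absolute/path/to/deepseek-thinker-mcp/build/index.js"],
      "env": {
        "API_KEY": "<your-api-key>",
        "BASE_URL": "<deepseek-api-base-url>"
      }
    }
  }
}
```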
- Access via API or Local: Retrieves Deepseek reasoning content from either the Deepseek API service or a local Ollama server.
- Developer Workflow: npm-based development: install dependencies, build, and run the server (node build/index.js).
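The workflow above boils down to a few commands, run from the repository root (assuming standard npm scripts):

```shell
# Install dependencies, build, and start the server
npm install
npm run build
node build/index.js
```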
- Environment-Driven Configuration: Set API_KEY, BASE_URL, and USE_OLLAMA to switch modes and handle authorization.
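For example, mode selection could be driven entirely by the shell environment. The BASE_URL value below is a placeholder, not a confirmed endpoint:

```shell
# OpenAI API mode (remote Deepseek reasoning)
export API_KEY="sk-..."                     # your Deepseek API key
export BASE_URL="https://api.example.com"   # placeholder; use your provider's URL

# Or: Ollama local mode (no API key required)
export USE_OLLAMA=true
```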
Who Is This For?
- AI Clients: Integrate Deepseek reasoning with MCP-enabled AI clients like Claude Desktop.
- Developers: Configure environment variables and deploy the MCP server in OpenAI API or Ollama mode.
- Integrators: Embed the MCP server into applications to access Deepseek's thought processes via MCP.




