deepseek-thinker-mcp

An MCP server that exposes Deepseek's thinking process via OpenAI API or Ollama modes.

66 Stars · 15 Forks · 0 Releases

Overview

Deepseek Thinker MCP Server is a Model Context Protocol provider that surfaces Deepseek's internal reasoning to MCP-enabled AI clients such as Claude Desktop. It supports two operation modes: OpenAI API mode, which uses API_KEY and BASE_URL to reach Deepseek's reasoning remotely, and Ollama local mode, enabled by setting USE_OLLAMA=true for locally hosted models.

The server exposes a single MCP tool, get-deepseek-thinker, which takes an originPrompt and returns a structured text response containing the reasoning process. The reasoning can be sourced either from the Deepseek API service or from a local Ollama instance, allowing flexible deployment.

Integration with AI clients like Claude Desktop is configured through a claude_desktop_config.json snippet, and the server is started by building the project and running node build/index.js. The project is written in TypeScript, uses the @modelcontextprotocol/sdk, the OpenAI API client, Ollama, and Zod for parameter validation, and follows an npm-based development workflow (install, build, run). It is released under the MIT license.
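
For orientation, the sketch below shows what such a server can look like with the @modelcontextprotocol/sdk and Zod. It is a minimal illustration, not the repository's actual source: the ./reasoning.js helper module is hypothetical and is sketched under "Environment-driven configuration" further down.

```typescript
// Minimal sketch: register the get-deepseek-thinker tool and serve it over
// stdio, so an MCP client can launch it with `node build/index.js`.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
// Hypothetical helper that fetches Deepseek's reasoning for a prompt.
import { getReasoning } from "./reasoning.js";

const server = new McpServer({ name: "deepseek-thinker", version: "1.0.0" });

server.tool(
  "get-deepseek-thinker",
  // Zod validates the tool's arguments before the handler runs.
  { originPrompt: z.string().describe("The user's original prompt") },
  async ({ originPrompt }) => ({
    // The reasoning process is returned as structured text content.
    content: [{ type: "text", text: await getReasoning(originPrompt) }],
  })
);

await server.connect(new StdioServerTransport());
```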

Details

Owner: ruixingshi
Language: JavaScript
License: MIT
Updated: 2025-12-07

Features

Dual Mode Support (OpenAI API)

Operates in OpenAI API mode, using API_KEY and BASE_URL to access Deepseek's reasoning remotely.

Dual Mode Support (Ollama Local)

Operates in Ollama local mode, enabled by USE_OLLAMA=true for local reasoning.

Focused Reasoning

Captures Deepseek's thinking process and presents it as reasoning output.

Exposed MCP Tool: get-deepseek-thinker

Tool to perform reasoning using the Deepseek model; accepts originPrompt and returns structured reasoning.
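
As a rough illustration of the tool's input and output shape, the hypothetical client below spawns the built server over stdio and calls get-deepseek-thinker; the path, prompt, and environment values are placeholders, and MCP clients such as Claude Desktop handle this wiring for you.

```typescript
// Illustrative only: invoke get-deepseek-thinker from a custom MCP client.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const client = new Client({ name: "example-client", version: "1.0.0" });

// Spawn the built server over stdio; path and env values are placeholders.
await client.connect(
  new StdioClientTransport({
    command: "node",
    args: ["build/index.js"],
    env: { API_KEY: "<your-deepseek-api-key>", BASE_URL: "<deepseek-compatible-base-url>" },
  })
);

const result = await client.callTool({
  name: "get-deepseek-thinker",
  arguments: { originPrompt: "Why is the sky blue?" },
});
console.log(result.content); // structured text containing the reasoning process
```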

Easy Client Integration

Configurable integration with MCP-enabled clients (e.g., Claude Desktop) via claude_desktop_config.json.
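
A hedged example of what that claude_desktop_config.json entry might look like; the command, install path, and environment values are assumptions to adapt to your setup. For Ollama local mode, the env block would instead set "USE_OLLAMA": "true".

```json
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "node",
      "args": ["/path/to/deepseek-thinker-mcp/build/index.js"],
      "env": {
        "API_KEY": "<your-deepseek-api-key>",
        "BASE_URL": "<deepseek-compatible-base-url>"
      }
    }
  }
}
```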

Access via API or Local

Supports access to Deepseek reasoning content from the Deepseek API service or a local Ollama server.

Developer Workflow

npm-based development: install, build, and run the server (node build/index.js).

Environment-driven configuration

Configure API_KEY, BASE_URL, and USE_OLLAMA to select the operating mode and supply credentials, as sketched below.
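
A minimal sketch of how this environment-driven switching can work, assuming the server wraps the openai and ollama npm clients; the model names and the reasoning_content field reflect Deepseek's public API and are illustrative rather than taken from this repository.

```typescript
import OpenAI from "openai";
import ollama from "ollama";

// USE_OLLAMA=true selects the local Ollama backend; otherwise the
// OpenAI-compatible Deepseek API is used with API_KEY and BASE_URL.
const useOllama = process.env.USE_OLLAMA === "true";

export async function getReasoning(originPrompt: string): Promise<string> {
  if (useOllama) {
    // Local mode: talk to an Ollama server running a Deepseek model.
    const res = await ollama.chat({
      model: "deepseek-r1",
      messages: [{ role: "user", content: originPrompt }],
    });
    return res.message.content;
  }

  // API mode: OpenAI-compatible client pointed at a Deepseek endpoint.
  const client = new OpenAI({
    apiKey: process.env.API_KEY,
    baseURL: process.env.BASE_URL,
  });
  const completion = await client.chat.completions.create({
    model: "deepseek-reasoner",
    messages: [{ role: "user", content: originPrompt }],
  });

  // Deepseek's reasoning models expose the thinking in `reasoning_content`,
  // which is not part of the standard OpenAI response types, hence the cast.
  const msg = completion.choices[0].message as {
    reasoning_content?: string;
    content: string | null;
  };
  return msg.reasoning_content ?? msg.content ?? "";
}
```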

Audience

AI Clients: Integrate Deepseek reasoning with MCP-enabled AI clients like Claude Desktop.
Developers: Configure environment variables and deploy the MCP server in OpenAI API or Ollama mode.
Integrators: Embed the MCP server into applications to access Deepseek's thought processes via MCP.

Tags

Deepseek · MCP · Reasoning · OpenAI API · Ollama · Claude Desktop · TypeScript · SDK