
MCP-server-Deepseek_R1

MCP.Pizza Chef: 66julienmartin

MCP-server-Deepseek_R1 is a Node.js-based MCP server implementation that integrates Claude Desktop with Deepseek's R1 language model, which is optimized for reasoning tasks with an 8192-token context window. It leverages TypeScript for robust type safety and error handling, providing a stable and compatible environment for real-time Model Context Protocol interactions.

Use This MCP Server To

  • Integrate the Deepseek R1 language model with Claude Desktop via MCP
  • Enable reasoning tasks with large 8192-token context windows
  • Provide real-time context feeding to Deepseek R1 through MCP
  • Facilitate multi-step reasoning workflows using Deepseek R1
  • Develop AI applications leveraging Deepseek R1 via the MCP protocol
  • Bridge Claude Desktop with the Deepseek API for enhanced language tasks

README

Deepseek R1 MCP Server

A Model Context Protocol (MCP) server implementation for the Deepseek R1 language model. Deepseek R1 is a powerful language model optimized for reasoning tasks with a context window of 8192 tokens.

Why Node.js? This implementation uses Node.js/TypeScript as it provides the most stable integration with MCP servers. The Node.js SDK offers better type safety, error handling, and compatibility with Claude Desktop.


Quick Start

Installing manually

# Clone and install
git clone https://github.com/66julienmartin/MCP-server-Deepseek_R1.git deepseek-r1-mcp
cd deepseek-r1-mcp
npm install

# Set up environment
cp .env.example .env  # Then add your API key

# Build and run
npm run build
node build/index.js   # Or let Claude Desktop launch it (see Configuration)

Prerequisites

  • Node.js (v18 or higher)
  • npm
  • Claude Desktop
  • Deepseek API key

Model Selection

By default, this server uses the DeepSeek-R1 model. If you want to use DeepSeek-V3 instead, modify the model name in src/index.ts:

// For DeepSeek-R1 (default)
model: "deepseek-reasoner"

// For DeepSeek-V3
model: "deepseek-chat"

Project Structure

deepseek-r1-mcp/
├── src/
│   ├── index.ts             # Main server implementation
├── build/                   # Compiled files
│   ├── index.js
├── LICENSE
├── README.md
├── package.json
├── package-lock.json
└── tsconfig.json

Configuration

  1. Create a .env file:

DEEPSEEK_API_KEY=your-api-key-here

  2. Update your Claude Desktop configuration:
{
  "mcpServers": {
    "deepseek_r1": {
      "command": "node",
      "args": ["/path/to/deepseek-r1-mcp/build/index.js"],
      "env": {
        "DEEPSEEK_API_KEY": "your-api-key"
      }
    }
  }
}
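
Whichever way the key is supplied (the .env file or the env block in the Claude Desktop configuration), the server reads it from process.env at startup. A minimal sketch of that pattern, assuming the dotenv package is used to load .env:

import "dotenv/config";  // Loads DEEPSEEK_API_KEY from .env if present

const apiKey = process.env.DEEPSEEK_API_KEY;
if (!apiKey) {
  // Fail fast with a clear message instead of an opaque API error later.
  throw new Error("DEEPSEEK_API_KEY is not set");
}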

Development

npm run dev     # Watch mode
npm run build   # Build for production

Features

  • Advanced text generation with Deepseek R1 (8192 token context window)
  • Configurable parameters (max_tokens, temperature)
  • Robust error handling with detailed error messages
  • Full MCP protocol support
  • Claude Desktop integration
  • Support for both DeepSeek-R1 and DeepSeek-V3 models

API Usage

{
  "name": "deepseek_r1",
  "arguments": {
    "prompt": "Your prompt here",
    "max_tokens": 8192,    // Maximum tokens to generate
    "temperature": 0.2     // Controls randomness
  }
}
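
Under the hood, a tool call like this maps onto a chat completion request against DeepSeek's OpenAI-compatible API. The sketch below shows that mapping using the openai npm client pointed at api.deepseek.com; it is illustrative, not the server's exact code:

import OpenAI from "openai";

// DeepSeek exposes an OpenAI-compatible endpoint, so the standard client
// works once it is pointed at DeepSeek's base URL.
const client = new OpenAI({
  apiKey: process.env.DEEPSEEK_API_KEY,
  baseURL: "https://api.deepseek.com",
});

// Roughly what the "deepseek_r1" tool arguments become on the wire.
const completion = await client.chat.completions.create({
  model: "deepseek-reasoner",
  messages: [{ role: "user", content: "Your prompt here" }],
  max_tokens: 8192,
  temperature: 0.2,
});

console.log(completion.choices[0].message.content);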

The Temperature Parameter

The default value of temperature is 0.2.

Deepseek recommends setting the temperature according to your specific use case:

USE CASE                        TEMPERATURE   EXAMPLE
Coding / Math                   0.0           Code generation, mathematical calculations
Data Cleaning / Data Analysis   1.0           Data processing tasks
General Conversation            1.3           Chat and dialogue
Translation                     1.3           Language translation
Creative Writing / Poetry       1.5           Story writing, poetry generation

Error Handling

The server provides detailed error messages for common issues:

  • API authentication errors
  • Invalid parameters
  • Rate limiting
  • Network issues
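
As a rough illustration of how those cases can be distinguished, the openai client throws typed errors whose status codes map naturally onto the categories above. The exact wording of the server's messages will differ; this is an assumed mapping, not its actual code:

import OpenAI from "openai";

// Assumed mapping from API failures to human-readable messages.
function describeError(err: unknown): string {
  if (err instanceof OpenAI.APIError) {
    if (err.status === 401) return "Authentication failed: check DEEPSEEK_API_KEY.";
    if (err.status === 400 || err.status === 422) return "Invalid parameters in the request.";
    if (err.status === 429) return "Rate limited by the DeepSeek API; retry later.";
    return `DeepSeek API error (${err.status}): ${err.message}`;
  }
  return `Network or unexpected error: ${String(err)}`;
}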

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT

MCP-server-Deepseek_R1 FAQ

How do I install MCP-server-Deepseek_R1?
Clone the GitHub repo, install dependencies with npm, set your API key in .env, then build and run using npm scripts.
What are the prerequisites for running this MCP server?
You need Node.js v18+, npm, Claude Desktop, and a valid Deepseek API key.
Why is Node.js used for this MCP server implementation?
Node.js with TypeScript offers stable integration, better type safety, and error handling for MCP servers.
Can this server handle large context windows?
Yes, it supports Deepseek R1's 8192-token context window for complex reasoning tasks.
Is this MCP server compatible with other LLM providers?
This server is designed specifically for Deepseek R1, but the MCP protocol itself supports other providers such as OpenAI, Claude, and Gemini.
How does this server improve integration with Claude Desktop?
It provides a stable, type-safe bridge enabling seamless communication between Claude Desktop and Deepseek R1.
What programming languages and tools are used?
The server is built with Node.js and TypeScript for robust development and compatibility.
How do I configure the API key for Deepseek?
Place your Deepseek API key in the .env file as specified in the setup instructions.