
deepl-mcp-server


deepl-mcp-server is an MCP server that integrates DeepL's translation API to provide real-time multilingual text translation, rephrasing, and language detection. It supports all DeepL API languages and features, including formality control, enabling seamless language processing within MCP-enabled applications. This server facilitates natural language translation workflows with high accuracy and flexibility.

Use This MCP Server To

  • Translate text between multiple languages in real time
  • Automatically detect the source language of input text
  • Rephrase text to improve clarity or tone using DeepL
  • Control the formality level of translations for supported languages
  • Integrate translation capabilities into chatbots or virtual assistants
  • Enable multilingual content generation in AI workflows
  • Support localization workflows by translating UI strings or documents
  • Combine with other MCP servers for enriched multilingual context

README

deepl-mcp-server


A Model Context Protocol (MCP) server that provides translation capabilities using the DeepL API.

Features

  • Translate text between numerous languages
  • Rephrase text using DeepL's capabilities
  • Access to all DeepL API languages and features
  • Automatic language detection
  • Formality control for supported languages

Installation

You can install this using npm:

npm install deepl-mcp-server

Or you can clone this repository and install dependencies:

git clone https://github.com/DeepLcom/deepl-mcp-server.git
cd deepl-mcp-server
npm install

Configuration

DeepL API Key

You'll need a DeepL API key to use this server. You can get one by signing up at DeepL API. With a DeepL API Free account you can translate up to 500,000 characters/month for free.
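To get a feel for what the free tier covers, here is a minimal sketch (hypothetical helper names, not part of this server) that estimates whether a batch of texts fits within the 500,000-character monthly limit. DeepL counts characters of the source text you submit.

```javascript
// Hypothetical helpers for estimating usage against the DeepL API Free
// monthly quota of 500,000 source characters.
const FREE_TIER_LIMIT = 500_000;

function charactersUsed(texts) {
  // Sum the length of every source string in the batch.
  return texts.reduce((sum, text) => sum + text.length, 0);
}

function fitsFreeTier(texts, alreadyUsed = 0) {
  // True if translating this batch stays within the monthly limit.
  return alreadyUsed + charactersUsed(texts) <= FREE_TIER_LIMIT;
}

// Example: 1,000 UI strings of 32 characters each = 32,000 characters.
const strings = Array(1000).fill("A fairly typical UI string here.");
console.log(charactersUsed(strings)); // 32000
console.log(fitsFreeTier(strings));   // true
```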

Using with Claude Desktop

This MCP server integrates with Claude Desktop to provide translation capabilities directly in your conversations with Claude.

Configuration Steps

  1. Install Claude Desktop if you haven't already

  2. Create or edit the Claude Desktop configuration file:

    • On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
    • On Windows: %AppData%\Claude\claude_desktop_config.json
    • On Linux: ~/.config/Claude/claude_desktop_config.json
  3. Add the DeepL MCP server configuration:

{
  "mcpServers": {
    "deepl": {
      "command": "npx",
      "args": ["-y", "/path/to/deepl-mcp-server"],
      "env": {
        "DEEPL_API_KEY": "your-api-key-here"
      }
    }
  }
}
  4. Replace /path/to/deepl-mcp-server with the absolute path to your local copy of this repository - for example, /Users/robotwoman/Code/deepl-mcp-server
  5. Replace your-api-key-here with your actual DeepL API key
  6. Restart Claude Desktop

Once configured, Claude will be able to use the DeepL translation tools when needed. You can ask Claude to translate text between languages, and it will use the DeepL API behind the scenes.

Available Tools

This server provides the following tools:

  • get-source-languages: Get list of available source languages for translation
  • get-target-languages: Get list of available target languages for translation
  • translate-text: Translate text to a target language
  • rephrase-text: Rephrase text in the same or different language
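An MCP client such as Claude Desktop invokes these tools via JSON-RPC tools/call requests behind the scenes. As an illustration (a sketch following the MCP specification's request shape; the example values are hypothetical), a call to translate-text might look like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "translate-text",
    "arguments": {
      "text": "Hello, world!",
      "targetLang": "de"
    }
  }
}
```

You never need to write these requests by hand; the MCP client constructs them when the model decides to use a tool.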

Tool Details

translate-text

This tool translates text between languages using the DeepL API.

Parameters:

  • text: The text to translate
  • targetLang: Target language code (e.g., 'en-US', 'de', 'fr')
  • formality (optional): Controls formality level of the translation:
    • 'less': use informal language
    • 'more': use formal, more polite language
    • 'default': use default formality
    • 'prefer_less': use informal language if available, otherwise default
    • 'prefer_more': use formal language if available, otherwise default
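The difference between the strict and prefer_ variants is how they behave when the target language has no formal/informal distinction. A minimal sketch of that resolution logic (a hypothetical helper, not this server's actual implementation):

```javascript
// Hypothetical sketch: how formality options resolve depending on
// whether the target language distinguishes formal/informal register.
function resolveFormality(formality, languageSupportsFormality) {
  switch (formality) {
    case "less":
    case "more":
      // Strict options fail for languages without a formality distinction.
      if (!languageSupportsFormality) {
        throw new Error("formality not supported for this target language");
      }
      return formality;
    case "prefer_less":
      // prefer_ options fall back to default instead of failing.
      return languageSupportsFormality ? "less" : "default";
    case "prefer_more":
      return languageSupportsFormality ? "more" : "default";
    default:
      return "default";
  }
}

console.log(resolveFormality("prefer_more", true));  // "more"
console.log(resolveFormality("prefer_more", false)); // "default"
```

In practice this means prefer_less and prefer_more are the safer choices when you don't know in advance whether the target language supports formality.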

rephrase-text

This tool rephrases text in the same or different language using the DeepL API.

Parameters:

  • text: The text to rephrase

Supported Languages

The DeepL API supports a wide variety of languages for translation. You can use the get-source-languages and get-target-languages tools to see all currently supported languages.

Some examples of supported languages include:

  • English (en, en-US, en-GB)
  • German (de)
  • Spanish (es)
  • French (fr)
  • Italian (it)
  • Japanese (ja)
  • Chinese (zh)
  • Portuguese (pt-BR, pt-PT)
  • Russian (ru)
  • And many more
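Note that for some target languages DeepL expects a regional variant rather than the bare code (for example en-US or en-GB rather than en, and pt-BR or pt-PT rather than pt); verify the current rules with the get-target-languages tool. A small validation sketch under that assumption (hypothetical helper, not part of this server):

```javascript
// Hypothetical sketch: some DeepL target languages require a regional
// variant code instead of the bare two-letter code.
const VARIANT_ONLY_TARGETS = {
  en: ["en-US", "en-GB"],
  pt: ["pt-BR", "pt-PT"],
};

function checkTargetLang(code) {
  const variants = VARIANT_ONLY_TARGETS[code.toLowerCase()];
  if (variants) {
    return `Use a regional variant instead: ${variants.join(" or ")}`;
  }
  return null; // no known issue with this code
}

console.log(checkTargetLang("en")); // "Use a regional variant instead: en-US or en-GB"
console.log(checkTargetLang("de")); // null
```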

Debugging

For debugging information, visit the MCP debugging documentation.

Error Handling

If you encounter errors with the DeepL API, check the following:

  • Verify your API key is correct
  • Make sure you're not exceeding your API usage limits
  • Confirm the language codes you're using are supported
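The checklist above maps roughly onto the HTTP status codes the DeepL API returns. A sketch of that mapping (status codes as documented by DeepL at the time of writing; verify against the current API docs):

```javascript
// Hypothetical helper mapping common DeepL API error statuses to the
// troubleshooting checklist above.
function explainDeepLError(status) {
  switch (status) {
    case 403:
      return "Authentication failed - verify your DEEPL_API_KEY";
    case 429:
      return "Too many requests - slow down and retry later";
    case 456:
      return "Quota exceeded - you have used up your character limit";
    default:
      return status >= 500
        ? "DeepL service error - retry with backoff"
        : `Unexpected status ${status} - check the request parameters`;
  }
}

console.log(explainDeepLError(456)); // "Quota exceeded - you have used up your character limit"
```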

License

MIT

deepl-mcp-server FAQ

How do I install deepl-mcp-server?
Install via npm with 'npm install deepl-mcp-server' or clone the GitHub repo and run 'npm install'.
Does deepl-mcp-server support all languages available in the DeepL API?
Yes, it supports all languages and features provided by the DeepL API, including formality control.
Can deepl-mcp-server automatically detect the language of the input text?
Yes, it includes automatic language detection to simplify translation workflows.
How do I configure formality control in translations?
You can pass the optional formality parameter to the translate-text tool; it applies to target languages that support a formal/informal distinction.
Is deepl-mcp-server compatible with multiple LLM providers?
Yes, MCP is provider-agnostic, so it works with any MCP-compatible client regardless of the underlying model, including those using OpenAI, Anthropic Claude, or Google Gemini models.
Can I use deepl-mcp-server to rephrase text, not just translate?
Yes, it supports rephrasing capabilities leveraging DeepL's API features.
What are the prerequisites for running deepl-mcp-server?
You need a valid DeepL API key and a Node.js environment to run the server.
How does deepl-mcp-server handle API rate limits?
It relies on DeepL's API rate limiting policies; you should manage your API usage accordingly.