
mcp-server-dify

MCP.Pizza Chef: yuru-sha

mcp-server-dify is a Model Context Protocol server that integrates Dify AI's chat completion API with LLMs. It supports conversation context and streaming responses, and includes a restaurant recommendation tool. Implemented in TypeScript, it enables seamless interaction between LLMs and Dify AI services through a standardized protocol, facilitating advanced AI-driven chat workflows.

Use this MCP server to

  • Enable LLMs to perform chat completions using the Dify AI API
  • Support streaming chat responses in real-time applications
  • Maintain conversation context for multi-turn dialogues
  • Integrate the restaurant recommendation tool within chat workflows
  • Deploy as a Docker container for easy setup and scaling
  • Connect LLM clients like Claude Desktop to Dify AI services

README

mcp-server-dify

CI Status

Model Context Protocol Server for Dify AI. This server enables LLMs to interact with Dify AI's chat completion capabilities through a standardized protocol.

Features

  • Integration with Dify AI chat completion API
  • Restaurant recommendation tool (meshi-doko)
  • Support for conversation context
  • Streaming response support
  • TypeScript implementation
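
The chat completion integration can be pictured as a request to Dify's chat-messages API. The sketch below builds such a request body; the field names follow Dify's public API, but the `buildChatRequest` helper and the `"mcp-client"` user id are illustrative, not part of this package's exported interface.

```typescript
// Sketch of the payload sent to Dify's chat-messages endpoint.
// Field names mirror Dify's API; the helper itself is illustrative.
interface DifyChatRequest {
  inputs: Record<string, string>;
  query: string;
  response_mode: "streaming" | "blocking";
  conversation_id?: string;
  user: string;
}

function buildChatRequest(
  query: string,
  conversationId?: string
): DifyChatRequest {
  return {
    inputs: {},
    query,
    // "streaming" asks Dify for server-sent events, enabling the
    // streaming response support listed above
    response_mode: "streaming",
    // conversation_id is only included when continuing a dialogue
    ...(conversationId ? { conversation_id: conversationId } : {}),
    user: "mcp-client",
  };
}

const req = buildChatRequest("Find ramen near Shibuya", "conv-123");
console.log(req.response_mode); // "streaming"
```

Passing the same `conversation_id` on subsequent requests is what lets Dify maintain multi-turn context.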

Installation

Using Docker

# Build the Docker image
make docker

# Run with Docker
docker run -i --rm mcp/dify https://your-dify-api-endpoint your-dify-api-key

Usage

With Claude Desktop

Add the following configuration to your claude_desktop_config.json:

{
  "mcpServers": {
    "dify": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-dify",
        "https://your-dify-api-endpoint",
        "your-dify-api-key"
      ]
    }
  }
}

Replace your-dify-api-endpoint and your-dify-api-key with your actual Dify API credentials.

Tools

meshi-doko

Restaurant recommendation tool that interfaces with Dify AI:

Parameters:

  • LOCATION (string): Location of the restaurant
  • BUDGET (string): Budget constraints
  • query (string): Query to send to Dify AI
  • conversation_id (string, optional): For maintaining chat context
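
As a sketch of how a client invokes the tool, the following builds an MCP `tools/call` request carrying the parameters above. The JSON-RPC framing follows the Model Context Protocol; the `callMeshiDoko` helper is illustrative.

```typescript
// Illustrative shape of the MCP "tools/call" request used to invoke
// meshi-doko. Argument keys mirror the parameter list above.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: {
    name: string;
    arguments: Record<string, string>;
  };
}

function callMeshiDoko(
  location: string,
  budget: string,
  query: string,
  conversationId?: string
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id: 1,
    method: "tools/call",
    params: {
      name: "meshi-doko",
      arguments: {
        LOCATION: location,
        BUDGET: budget,
        query,
        // optional: pass a conversation_id to keep chat context
        ...(conversationId ? { conversation_id: conversationId } : {}),
      },
    },
  };
}
```

In practice an MCP client library produces this framing for you; the sketch only shows how the tool's parameters map onto a call.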

Development

# Initial setup
make setup

# Build the project
make build

# Format code
make format

# Run linter
make lint

License

This project is released under the MIT License.

Security

This server interacts with Dify AI using your provided API key. Be sure to:

  • Keep your API credentials secure
  • Use HTTPS for the API endpoint
  • Never commit API keys to version control
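
A minimal sketch of the first two points: read credentials from the environment instead of hard-coding them, and reject non-HTTPS endpoints. The `DIFY_API_ENDPOINT` variable name and the default endpoint are illustrative.

```typescript
// Reject endpoints that are not served over HTTPS.
function assertSecureEndpoint(endpoint: string): URL {
  const url = new URL(endpoint);
  if (url.protocol !== "https:") {
    throw new Error(`Dify endpoint must use HTTPS, got ${url.protocol}//`);
  }
  return url;
}

// Credentials come from the environment, never from source control.
const endpoint = process.env.DIFY_API_ENDPOINT ?? "https://api.dify.ai/v1";
const parsed = assertSecureEndpoint(endpoint);
console.log(parsed.hostname);
```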

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

mcp-server-dify FAQ

How do I install mcp-server-dify?
You can install it using Docker by building the image with 'make docker' and running it with your Dify API endpoint and key.
What programming language is mcp-server-dify implemented in?
It is implemented in TypeScript, ensuring type safety and modern JavaScript features.
Does mcp-server-dify support streaming responses?
Yes, it supports streaming chat completions for real-time interaction.
Can mcp-server-dify maintain conversation context?
Yes, it supports conversation context to enable multi-turn dialogues.
How do I configure mcp-server-dify with Claude Desktop?
Add the server configuration to your 'claude_desktop_config.json' with the appropriate command and arguments.
Is there any built-in tool included?
Yes, it includes a restaurant recommendation tool called meshi-doko.
Can I run mcp-server-dify without Docker?
While Docker is recommended for ease, you can run it using Node.js with the appropriate parameters.
Which LLM providers can use mcp-server-dify?
It is designed to work with any LLM client that supports MCP, including those backed by OpenAI, Claude, and Gemini models.