mcp-demo

MCP.Pizza Chef: xiaosl-cell

mcp-demo is a demonstration MCP server project illustrating how to build and use MCP services for integrating large language models (LLMs) with tools. It supports multiple communication protocols like stdio and SSE, provides basic computation tools, and enables LLM-driven tool invocation and orchestration with an extensible tool registration system. This project is ideal for developers learning MCP server construction and LLM integration workflows.

Use This MCP Server To

  • Demonstrate MCP server construction and deployment
  • Showcase LLM and tool integration via MCP
  • Implement multi-protocol MCP communication (stdio, SSE)
  • Provide basic computation tools as MCP services
  • Enable LLM-driven tool orchestration and calls
  • Extend the MCP server with custom tool registration

README

MCP Demo Project

Project Overview

This is a service demo project based on MCP (Model Context Protocol). It shows how to build and use MCP services to integrate large language models with external tools. The project demonstrates how to create an MCP server, build an MCP client, and use an LLM to invoke and orchestrate tools.

Project Structure

mcp-demo/
├── prompt/                  # Prompt templates
│   └── systemPrompt.txt     # System prompt
├── src/                     # Source code
│   └── demo/
│       ├── llm/             # Core code for LLM/MCP integration
│       │   ├── config/      # Configuration
│       │   ├── llm/         # LLM client implementation
│       │   ├── mcp_client/  # MCP client implementation
│       │   ├── mcp_server/  # MCP server implementation
│       │   └── host.py      # Main entry point
│       ├── 1-stdio/         # MCP demo over stdio
│       └── 2-sse/           # MCP demo over SSE
├── server_config.json       # Server configuration file
└── requirements.txt         # Project dependencies

Installation

  1. Clone the repository and change into the project directory
  2. Install the dependencies
pip install -r requirements.txt

Features

  • Multiple MCP communication transports:
    • Standard input/output (stdio)
    • Server-Sent Events (SSE)
  • Basic computation tools provided as MCP services
  • LLM-driven tool invocation and orchestration
  • Extensible tool registration mechanism

Usage

Configure the LLM

Create a .env file in the project root and set the following environment variables:

LLM_API_KEY=your_api_key
LLM_BASE_URL=proxy_base_url
MODEL=gpt-4o
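
A minimal sketch of how these variables might be consumed, assuming the demo loads them with python-dotenv and hands them to an OpenAI-compatible client (both packages are listed in the dependencies; the exact variable handling here is illustrative, not the project's actual code):

import os
from dotenv import load_dotenv
from openai import OpenAI

load_dotenv()  # reads .env from the current working directory

# Point the OpenAI SDK at the configured key and proxy base URL
client = OpenAI(
    api_key=os.environ["LLM_API_KEY"],
    base_url=os.environ["LLM_BASE_URL"],
)
model = os.getenv("MODEL", "gpt-4o")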

Run the Examples

Run the main programs:

1-stdio: python ${full_project_path}/src/demo/1-stdio/stdio_client.py
2-sse:
  - python ${full_project_path}/src/demo/2-sse/sse_server.py
  - python ${full_project_path}/src/demo/2-sse/sse_client.py
llm: python ${full_project_path}/src/demo/llm/host.py
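
For the 2-sse demo, start sse_server.py before running sse_client.py. As a rough sketch (not the project's actual client code), connecting to an SSE server with the mcp package can look like this; the URL is an assumption based on FastMCP's defaults (port 8000, endpoint /sse):

import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Endpoint path and port are assumptions (FastMCP defaults)
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])

asyncio.run(main())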

Custom MCP Services

You can create your own MCP service under the mcp_server directory, for example:

from mcp.server.fastmcp import FastMCP

# Create an MCP server
mcp = FastMCP("CustomServiceName")

@mcp.tool(name="add", description="Add two numbers and return the sum")
async def add(a: float, b: float) -> float:
    # Tool logic goes here
    return a + b

if __name__ == "__main__":
    # Start the server (stdio transport by default)
    mcp.run()
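
A hedged sketch of invoking such a service from the client side over stdio with the mcp package; the file name my_server.py and the add tool refer to the example above and are illustrative, not part of the project:

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the server as a subprocess and talk to it over stdio
    params = StdioServerParameters(command="python", args=["my_server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool("add", {"a": 1, "b": 2})
            print(result.content)

asyncio.run(main())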

Configure MCP Services

Edit the server_config.json file to configure MCP services:

{
  "mcpServers": {
    "Calculation": {
      "type": "stdio",
      "command": "python",
      "args": ["${与项目根目录相对路径}"]
    }
  }
}

  • type: transport type; stdio and sse are supported
  • command: command to execute (required in stdio mode)
  • args: command arguments (required in stdio mode)
  • url: server URL (required in sse mode)
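
For comparison, a hypothetical sse entry omits command and args and points at the server's URL instead (the URL shown is an assumption based on FastMCP's default SSE endpoint, not a value from the project):

{
  "mcpServers": {
    "Calculation": {
      "type": "sse",
      "url": "http://localhost:8000/sse"
    }
  }
}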

Dependencies

  • mcp >= 1.2.0
  • openai >= 1.10.0
  • asyncio >= 3.4.3
  • python-dotenv >= 1.0.0

License

This project is released under the MIT license.

mcp-demo FAQ

How do I install the mcp-demo server?
Clone the repository, navigate to the project directory, and run 'pip install -r requirements.txt' to install dependencies.
What communication protocols does mcp-demo support?
It supports standard input/output (stdio) and Server-Sent Events (SSE) protocols for MCP communication.
How can I configure the LLM for mcp-demo?
Create a '.env' file in the project root and set environment variables like LLM_API_KEY, LLM_BASE_URL, and MODEL accordingly.
Can I extend mcp-demo with additional tools?
Yes, mcp-demo includes an extensible tool registration mechanism to add custom tools.
What programming language is mcp-demo implemented in?
The project is implemented in Python: dependencies are installed with pip and the source tree consists of Python modules.
Does mcp-demo support multiple LLM providers?
While the demo primarily configures for GPT models, it can be adapted to work with other LLM providers like Claude and Gemini by adjusting environment variables.
How does mcp-demo handle LLM tool orchestration?
It demonstrates how to use LLMs to invoke and orchestrate tools via MCP, enabling complex workflows.
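
As a rough sketch (not the project's actual host.py code), such an orchestration loop with OpenAI-style function calling can look like the following; client, session, tools, and messages are assumed to be an OpenAI client, an initialized MCP ClientSession, the MCP tool list converted to function-calling schemas, and the chat history:

import json

async def orchestrate(client, session, tools, messages, model):
    # Ask the model; it may answer directly or request tool calls
    response = client.chat.completions.create(
        model=model, messages=messages, tools=tools
    )
    msg = response.choices[0].message
    while msg.tool_calls:
        messages.append(msg)
        for call in msg.tool_calls:
            # Route each requested call to the MCP server
            result = await session.call_tool(
                call.function.name, json.loads(call.function.arguments)
            )
            messages.append({
                "role": "tool",
                "tool_call_id": call.id,
                "content": str(result.content),
            })
        # Let the model continue with the tool results in context
        response = client.chat.completions.create(
            model=model, messages=messages, tools=tools
        )
        msg = response.choices[0].message
    return msg.content
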
Is there example code for MCP client and server in mcp-demo?
Yes, the project includes source code for both MCP client and server implementations under the 'src/demo/llm' directory.