mcp_client_openai

MCP.Pizza Chef: liangpn

mcp_client_openai is an MCP client adapted specifically for the OpenAI SDK, enabling seamless integration with the Model Context Protocol. It addresses gaps in official examples by providing robust lifecycle management, logging, and dynamic server tool updates. This client supports real-time interaction between LLMs and their environment, facilitating advanced AI workflows with OpenAI and other LLM providers.

Use This MCP Client To

  • Integrate the OpenAI SDK with MCP for real-time model context management
  • Manage the MCP client-server lifecycle with enhanced logging
  • Simulate dynamic server tool updates for adaptive AI workflows
  • Develop custom MCP clients on Windows with compatibility fixes
  • Extend MCP client functionality for advanced LLM interactions

README

Building an MCP Client Adapted for the OpenAI SDK

Official documentation


Project Background

I built this OpenAI SDK-adapted MCP Client because, while following the official examples to build an MCP Client, I found that the official example code did not include an adaptation for the OpenAI SDK.


File Overview

  • client.py: an MCP Client adapted for the OpenAI SDK (a minimal sketch of this adaptation follows this list).
  • client_new.py: a version adapted to work around issues I ran into on Windows.
  • client_20250316.py: adds logging and handling for certain messages received from the server. See my Zhihu article "Starting from the MCP Client-Server Lifecycle: A Deep Dive into MCP's Complete Interaction Chain", which covers this MCP Client's server lifecycle in detail.
  • weather_new.py: adds code that simulates dynamically updating the server's tools. Use it together with client_20250316.py.
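
For orientation, here is a minimal sketch of what the adaptation looks like in practice. It is not a copy of client.py: how the server is launched, the model name, and the example prompt are placeholders, and it assumes the official mcp and openai Python packages are installed.

```python
import asyncio
import json
from contextlib import AsyncExitStack

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from openai import AsyncOpenAI  # assumes OPENAI_API_KEY is set in the environment


async def main() -> None:
    # How the server is launched is a placeholder; adjust command/args to your setup.
    server = StdioServerParameters(command="python", args=["weather_new.py"])

    async with AsyncExitStack() as stack:
        # Connect to the MCP server over stdio and initialize the session.
        read, write = await stack.enter_async_context(stdio_client(server))
        session = await stack.enter_async_context(ClientSession(read, write))
        await session.initialize()

        # Convert the MCP tool definitions into the OpenAI function-calling schema.
        mcp_tools = (await session.list_tools()).tools
        openai_tools = [
            {
                "type": "function",
                "function": {
                    "name": t.name,
                    "description": t.description or "",
                    "parameters": t.inputSchema,
                },
            }
            for t in mcp_tools
        ]

        # Ask the model; if it requests a tool, forward the call to the MCP server.
        llm = AsyncOpenAI()
        messages = [{"role": "user", "content": "What is the weather in Beijing?"}]
        resp = await llm.chat.completions.create(
            model="gpt-4o-mini",  # model name is a placeholder
            messages=messages,
            tools=openai_tools,
        )
        for call in resp.choices[0].message.tool_calls or []:
            result = await session.call_tool(
                call.function.name, json.loads(call.function.arguments)
            )
            print(result.content)


asyncio.run(main())
```

The key step is translating each MCP tool's name, description, and inputSchema into the OpenAI function-calling schema, so the model can invoke MCP tools through ordinary tool calls.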

Issues Encountered

I ran into a few problems while building this. For details, see my Zhihu article "How to Build Your Own MCP Client", and feel free to follow my MCP column for updates.

Issue Screenshots


Final Notes

I hope this project is helpful to you.

Feedback

If you run into any problems while using it, please feel free to report them.


Contributing

If you are interested in this project, you are welcome to submit a Pull Request or an Issue to help improve this OpenAI SDK-adapted MCP Client.

mcp_client_openai FAQ

How does mcp_client_openai differ from the official MCP client examples?
It adapts MCP specifically for the OpenAI SDK, adding lifecycle management, logging, and dynamic server tool updates that are not present in the official examples.
Can mcp_client_openai be used on Windows?
Yes, there is a dedicated version client_new.py designed to resolve Windows-specific issues.
Does mcp_client_openai support dynamic updates from MCP servers?
Yes, it includes features to receive and handle specific messages from servers and simulate dynamic tool updates.
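
The repository's own mechanism lives in weather_new.py and client_20250316.py. As an illustration of the general idea only, a client can simply re-fetch the tool list before each model turn so that tools the server registered after startup become visible; the session argument and the refresh_tools name below are assumptions, not code from the repo.

```python
async def refresh_tools(session) -> list[dict]:
    """Re-query the MCP server so newly added server tools show up.

    Simplified stand-in for the repo's dynamic-update handling;
    `session` is assumed to be an initialized mcp.ClientSession.
    """
    tools = (await session.list_tools()).tools
    return [
        {
            "type": "function",
            "function": {
                "name": t.name,
                "description": t.description or "",
                "parameters": t.inputSchema,
            },
        }
        for t in tools
    ]
```

Calling this before each chat.completions.create request keeps the OpenAI tool schema in sync; the actual client additionally reacts to specific messages sent by the server, as described in the linked Zhihu article.
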
Where can I find example code for using mcp_client_openai?
Example Python code is available in the MCP Client Python quickstart repository linked in the official documentation.
Is mcp_client_openai limited to OpenAI models only?
While optimized for the OpenAI SDK, it can be adapted to work with other LLM providers such as Claude and Gemini with additional integration work.
How can I troubleshoot issues with mcp_client_openai?
The developer provides detailed troubleshooting articles and a dedicated MCP column on Zhihu for ongoing support.
What logging capabilities does mcp_client_openai provide?
It includes enhanced logging to track client-server interactions and lifecycle events for better debugging and monitoring.
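
The exact log format is defined in client_20250316.py. As a minimal sketch, standard library logging around the tool-call path is enough to trace client-server interactions; the helper name call_tool_logged is illustrative, not taken from the repo.

```python
import logging

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
log = logging.getLogger("mcp_client_openai")


async def call_tool_logged(session, name: str, arguments: dict):
    """Wrap an MCP tool call with request/response logging (illustrative helper)."""
    log.info("calling tool %s with %s", name, arguments)
    result = await session.call_tool(name, arguments)
    log.info("tool %s returned %d content block(s)", name, len(result.content))
    return result
```
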
How does mcp_client_openai handle MCP server lifecycle events?
It manages the full lifecycle with support for receiving server messages and maintaining stable client-server communication.