openai-mcp-client

MCP.Pizza Chef: ResoluteError

The openai-mcp-client is a basic implementation of the Model Context Protocol (MCP) designed to work with OpenAI's API, enabling the creation of chat agents that act within a structured chat context. It serves as a foundational example for developers to build upon, demonstrating how to integrate MCP with OpenAI's models while following Anthropic's MCP standards. The client supports text-based tool call responses and requires Deno v2 for setup. It allows users to connect to any MCP server, facilitating flexible and extensible AI agent workflows. Note that the current implementation sends the entire conversation context to the server, which may increase token usage and costs.

Use This MCP Client To

  • Create chat agents using OpenAI with the MCP protocol
  • Prototype MCP client integrations with the OpenAI API
  • Develop AI workflows with structured chat context
  • Test tool call responses in an MCP client environment
  • Connect to various MCP servers for AI orchestration

README

Intro

This is a simple example of how to use the Model Context Protocol (MCP) with OpenAI's API to create an agent that acts from a chat context. Feel free to use this as a starting point for your own projects.

Setup Guide

  • Ensure Deno v2 is installed
  • Run deno install to install dependencies
  • Copy .env.example to .env and fill in the values
    • You can choose any MCP server you like - bring your own or use one from the official MCP server list
  • Run deno run dev to start the application
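As a sketch, a filled-in `.env` might look like the following. The variable names here are assumptions for illustration; the actual keys are defined in `.env.example` (`DEBUG` is the one variable this README documents explicitly):

```env
# Hypothetical example values - the real variable names live in .env.example
OPENAI_API_KEY=sk-your-key-here
MCP_SERVER_COMMAND=deno run -A path/to/your/mcp-server.ts
DEBUG=false
```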

Warning

Chat messages are appended, and the entire conversation is currently sent to the server on every request. Depending on the length of the conversation, the model you are using, and the size of the context, this can consume a large number of tokens and become expensive.
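One way to bound that cost is to cap how much of the history is sent per request. This repo does not do this; the sketch below is a minimal, hypothetical truncation helper (the `ChatMessage` type and `MAX_CONTEXT_MESSAGES` constant are illustrative, not part of this project):

```typescript
// Hypothetical sketch: send only a sliding window of the conversation
// instead of the full history. Tune the cap to your model and budget.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

const MAX_CONTEXT_MESSAGES = 20;

function truncateContext(messages: ChatMessage[]): ChatMessage[] {
  // Always keep the system prompt (assumed to be the first message),
  // then the most recent messages up to the cap.
  if (messages.length <= MAX_CONTEXT_MESSAGES) return messages;
  const [system, ...rest] = messages;
  return [system, ...rest.slice(rest.length - (MAX_CONTEXT_MESSAGES - 1))];
}
```

Note that truncation changes what the model can see, so long-running agents may also want summarization of the dropped messages rather than plain dropping.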

Limitations

This implementation currently only supports tool call responses of type text. Support for other resource types can be added in applyToolCallIfExists in src/openai-utils.ts.
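As a rough sketch of what such an extension could look like: MCP tool results can carry non-text content (e.g. images), which would need to be serialized into the string-based chat messages this client builds. The `ToolResultContent` shape below is illustrative only; check the actual types used in `src/openai-utils.ts` before extending it:

```typescript
// Hypothetical sketch of handling a non-text tool result content item.
// The union below is an assumption modeled on common MCP content shapes.
type ToolResultContent =
  | { type: "text"; text: string }
  | { type: "image"; data: string; mimeType: string };

function contentToString(content: ToolResultContent): string {
  switch (content.type) {
    case "text":
      return content.text;
    case "image":
      // Chat message content is a string here, so serialize the image as a
      // data URL that a downstream renderer could display.
      return `data:${content.mimeType};base64,${content.data}`;
  }
}
```

The same switch pattern extends to any further content types the chosen MCP server emits.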

Notes

Your latest messages are saved to messages.json for debugging purposes. The file is overwritten every time you run the application, so make a copy of it first if you want to keep the previous messages.

If you want to run the application in debug mode, set the DEBUG environment variable to true in your .env file. This will print out more information about the messages and tool calls.

openai-mcp-client FAQ

How do I set up the openai-mcp-client?
Install Deno v2, run `deno install` to install dependencies, copy `.env.example` to `.env` and fill in values, then run `deno run dev` to start.
Can I use any MCP server with this client?
Yes, you can connect to any MCP server, including those from the official MCP server list or your own implementations.
Does the client support tool call responses other than text?
Currently, it only supports tool call responses of type `text`. Other types can be implemented by extending the `applyToolCallIfExists` function.
Will sending the entire conversation context increase costs?
Yes, appending all chat messages and sending the full conversation can rack up tokens and increase costs depending on model and context size.
Is this client compatible with LLM providers other than OpenAI?
While designed for OpenAI, the MCP protocol is provider-agnostic and can be adapted for use with providers like Anthropic's Claude and Google's Gemini.
What programming environment is required?
The client requires Deno v2 for installation and running.
Can I customize the client for my own projects?
Yes, the client is a rudimentary example intended as a starting point for custom MCP client development.
How does the client handle chat context?
It appends all chat messages and sends the entire conversation context to the MCP server for processing.