open-mcp-client

By CopilotKit

Open MCP Client is a modular client for the Model Context Protocol (MCP) ecosystem that orchestrates context flow, tool calling, and protocol logic. It integrates LLMs by managing real-time structured context and coordinating multi-step reasoning across providers and platforms. Setup involves configuring API keys in environment files and managing agent dependencies with Poetry, making it a practical base for developers building AI-enhanced workflows and agents.

Use This MCP Client To

  • Manage context flow between LLMs and MCP servers
  • Orchestrate tool calls and API interactions for AI workflows
  • Enable multi-step reasoning across different LLM providers
  • Integrate real-time structured data into AI models
  • Coordinate environment variables and dependency setups for agents
  • Run frontend and backend components for AI agent development
  • Switch between LLM providers without rewriting infrastructure (see the sketch below)
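
The last item is usually just a configuration change. A minimal, hypothetical sketch in Python: the MODEL_PROVIDER and MODEL_NAME variables and the call_model helper are illustrative only and not part of open-mcp-client's actual API.

import os

# Illustrative defaults only; use whatever models your providers expose.
PROVIDER_DEFAULTS = {
    "openai": "gpt-4o-mini",
    "anthropic": "claude-3-5-sonnet-latest",
}

def resolve_model() -> tuple[str, str]:
    # Read the provider and model from the environment so application code
    # never hard-codes a vendor.
    provider = os.environ.get("MODEL_PROVIDER", "openai").lower()
    model = os.environ.get("MODEL_NAME", PROVIDER_DEFAULTS.get(provider, "gpt-4o-mini"))
    return provider, model

def call_model(prompt: str) -> str:
    # Dispatch point: only this function changes when a new provider is added.
    provider, model = resolve_model()
    raise NotImplementedError(f"wire the {provider} SDK to model {model} here")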

README

Open MCP Client


Demo video: demospreadsheet.1.mp4

Getting Started

Set Up Environment Variables

Create a .env file at the root of your project:

touch .env

Add the following to .env:

LANGSMITH_API_KEY=lsv2_...
OPENAI_API_KEY=sk-...

Next, navigate to the agent folder and create another .env file:

cd agent
touch .env

Add the following inside agent/.env:

OPENAI_API_KEY=sk-...
LANGSMITH_API_KEY=lsv2_...
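
The agent service is Python-based (it uses Poetry, below), so these keys are typically read from agent/.env at startup. A minimal sketch, assuming the python-dotenv package is available in the agent environment:

import os
from dotenv import load_dotenv  # provided by the python-dotenv package

load_dotenv()  # picks up agent/.env when run from the agent folder

for key in ("OPENAI_API_KEY", "LANGSMITH_API_KEY"):
    if not os.getenv(key):
        raise RuntimeError(f"{key} is missing; add it to agent/.env")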

Set Up Poetry

Poetry manages dependencies for the agent service. Install it with:

pip install poetry

Verify the installation by running:

poetry --version
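
With Poetry installed, the agent's dependencies (declared in agent/pyproject.toml, assuming the standard Poetry layout) can be installed before starting development:

cd agent
poetry install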

Development

For easier debugging, run the frontend and agent in separate terminals:

# Terminal 1 - Frontend
pnpm run dev-frontend

# Terminal 2 - Agent
pnpm run dev-agent

Alternatively, launch both services together:

pnpm run dev

Visit http://localhost:3000 in your browser to view the application.

Architecture

The codebase is organized into two primary components:

  • Frontend - Handles the user interface.
  • Agent - Implements the core agent logic (context management and tool calls).
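
A rough repository layout under these assumptions (only the agent folder is confirmed by the setup steps above; the frontend directory name is a placeholder and may differ):

open-mcp-client/
├── .env           # root API keys
├── agent/         # Python agent service managed with Poetry
│   └── .env       # agent API keys
└── <frontend>/    # UI started with pnpm run dev-frontend (placeholder name)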

License

Distributed under the MIT License. See LICENSE for more info.

open-mcp-client FAQ

How do I set up environment variables for the Open MCP Client?
Create a .env file at the project root and another in the agent folder, then add your LANGSMITH_API_KEY and OPENAI_API_KEY to each.

What dependency manager does Open MCP Client use?
It uses Poetry to manage dependencies for the agent service, ensuring a consistent environment setup.

Can Open MCP Client work with multiple LLM providers?
Yes, it supports provider-agnostic interfaces, allowing seamless switching between providers like OpenAI, Claude, and Gemini.

How do I run the Open MCP Client during development?
Run the frontend and agent services in separate terminals using the provided commands for easier debugging.

Is Open MCP Client suitable for building AI-enhanced workflows?
Yes, it is designed to orchestrate complex AI workflows by managing context and tool interactions in real time.

Does Open MCP Client support multi-step reasoning?
Yes, it facilitates multi-step reasoning by coordinating context and tool calls across different components.

What is the role of the Open MCP Client in the MCP architecture?
It acts as the orchestrator, managing context flow, tool calling, and protocol logic between hosts and servers.