langchain-mcp-client

MCP.Pizza Chef: guinacio

The langchain-mcp-client is a Streamlit-based user interface client that connects to MCP servers using Server-Sent Events (SSE). It supports multiple server configurations and allows users to select from various LLM providers such as OpenAI, Anthropic, and Google. The client enables users to view, test, and execute MCP tools, interact with LLM agents through a chat interface, and display tool execution results in real time.

Use This MCP client To

  • Connect to multiple MCP servers via SSE for real-time interaction
  • Select and switch between different LLM providers dynamically
  • Test and execute MCP tools directly from a graphical interface
  • Use the chat interface to interact with LLM agents connected to MCP servers
  • Display and review tool execution results in real time
  • Manage and monitor multiple MCP server connections in one app

README

LangChain MCP Client Streamlit App

This Streamlit application provides a user interface for connecting to MCP (Model Context Protocol) servers and interacting with them using different LLM providers (OpenAI, Anthropic, Google...).

Features

  • Connect to MCP servers via SSE (Server-Sent Events)
  • Support for both single server and multiple server configurations
  • Select between different LLM providers (OpenAI, Anthropic, Google, and local models via Ollama)
  • View, test, and use available MCP tools directly from the UI
  • Chat interface for interacting with the LLM agent
  • Tool execution results display

Installation

  1. Clone this repository:
git clone https://github.com/guinacio/langchain-mcp-client.git
cd langchain-mcp-client
  2. Create a virtual environment and install dependencies:
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt

Running the Application

Run the Streamlit app with:

streamlit run app.py

The application will be available at http://localhost:8501
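
Streamlit listens on port 8501 by default; if that port is already taken, you can pass Streamlit's --server.port flag:

streamlit run app.py --server.port 8502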

Setting Up an MCP Server

To use this application, you'll need a running MCP server or a valid URL to one. For a quick test, use the simple MCP server provided in weather_server.py:

  1. Install the MCP library:
pip install mcp
  2. Run the server:
python weather_server.py

The server will start on port 8000 by default. In the Streamlit app, you can connect to it using the URL http://localhost:8000/sse.
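
For orientation, a server like weather_server.py can be written in a few lines with the mcp package's FastMCP helper. This is a hypothetical sketch, not necessarily the repository's actual implementation:

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Weather")

@mcp.tool()
def get_weather(city: str) -> str:
    """Return a placeholder forecast for the given city."""
    return f"It is sunny in {city} today."

if __name__ == "__main__":
    # The SSE transport exposes the endpoint at http://localhost:8000/sse by default
    mcp.run(transport="sse")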

Configuration Options

LLM Providers

  • OpenAI: Requires an OpenAI API key and supports models like gpt-4o, gpt-4, and gpt-3.5-turbo
  • Anthropic: Requires an Anthropic API key and supports Claude models
  • Google: Requires a Google API key for the Generative Language (Gemini) or Vertex AI API
  • Local LLMs: Supports running local models through Ollama
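
Under the hood, each provider maps to a LangChain chat-model class. Below is a minimal, hypothetical factory showing how the selection could be wired, assuming the langchain-openai, langchain-anthropic, langchain-google-genai, and langchain-ollama packages; the app's actual code may differ.

from langchain_anthropic import ChatAnthropic
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain_ollama import ChatOllama
from langchain_openai import ChatOpenAI

def make_llm(provider: str, model: str, api_key: str | None = None):
    # Hypothetical helper, not the app's actual code
    if provider == "OpenAI":
        return ChatOpenAI(model=model, api_key=api_key)
    if provider == "Anthropic":
        return ChatAnthropic(model=model, api_key=api_key)
    if provider == "Google":
        return ChatGoogleGenerativeAI(model=model, google_api_key=api_key)
    if provider == "Ollama":
        return ChatOllama(model=model)  # local model; no API key required
    raise ValueError(f"Unknown provider: {provider}")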

MCP Server Connection

  • Currently supports SSE (Server-Sent Events) connections
  • Enter the URL of your MCP server's SSE endpoint (e.g., http://localhost:8000/sse)
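
For reference, this is roughly what an SSE connection looks like at the protocol level, using the mcp Python SDK directly; the app performs the equivalent for you:

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    # Open an SSE transport to the server's /sse endpoint
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

asyncio.run(main())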

Server Modes

  • Single Server: Connect to a single MCP server
  • Multiple Servers: Connect to multiple MCP servers simultaneously
    • Add servers with unique names
    • Manage (add/remove) servers through the UI
    • Connect to all configured servers at once
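
A rough sketch of how multiple servers can be aggregated with the langchain-mcp-adapters package; the exact MultiServerMCPClient API has changed between releases, so treat this as illustrative rather than the app's actual code:

import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient

async def main() -> None:
    # Two named servers, mirroring the app's multi-server mode (URLs are placeholders)
    client = MultiServerMCPClient({
        "weather": {"url": "http://localhost:8000/sse", "transport": "sse"},
        "math": {"url": "http://localhost:8001/sse", "transport": "sse"},
    })
    tools = await client.get_tools()  # tools from all servers, merged
    print([tool.name for tool in tools])

asyncio.run(main())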

Available Tools

  • View all available tools from connected MCP servers in the sidebar
  • Each tool displays:
    • Name and description
    • Required and optional parameters with their types
    • Parameter descriptions and constraints
  • Tools are automatically available to the LLM agent in the chat interface
  • Tool executions and their results are tracked in the chat history
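
To illustrate how the tools reach the agent: LangChain tool objects derived from MCP servers can be handed straight to a LangGraph ReAct agent. Below is a minimal sketch, assuming langgraph's prebuilt create_react_agent and an llm/tools pair like those in the earlier snippets; the app's internal wiring may differ.

from langgraph.prebuilt import create_react_agent

async def chat(llm, tools, question: str) -> str:
    # Build a ReAct-style agent that can call the MCP tools on demand
    agent = create_react_agent(llm, tools)
    result = await agent.ainvoke({"messages": [("user", question)]})
    # The final message holds the agent's answer; intermediate tool calls
    # also appear in result["messages"] and can feed the chat history
    return result["messages"][-1].content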

Future Improvements

  • STDIO MCP Servers: Support for connecting to MCP servers using standard input/output (STDIO) for more flexible server configurations.
  • Test Tools Individually: Implement functionality to test each tool individually from the UI to ensure they work as expected.
  • Additional Local LLMs: Broader support for local models (Llama, DeepSeek, Qwen...) beyond the current Ollama integration
  • Agent Memory: Introduce memory capabilities for the agent to retain context across interactions.
  • RAG (File Upload): Enable Retrieval-Augmented Generation (RAG) by allowing users to upload files that the agent can use to enhance its responses.

Troubleshooting

  • Connection Issues: Ensure your MCP server is running and accessible
  • API Key Errors: Verify that you've entered the correct API key for your chosen LLM provider
  • Tool Errors: Check the server logs for details on any errors that occur when using tools
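
To rule out network problems, you can probe the SSE endpoint directly from a terminal; curl's -N flag disables buffering so the event stream prints as it arrives:

curl -N http://localhost:8000/sse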

langchain-mcp-client FAQ

How do I connect the langchain-mcp-client to an MCP server?
You connect by specifying the MCP server URL in the client UI, which uses Server-Sent Events (SSE) to establish a real-time connection.
Can I use multiple MCP servers simultaneously with this client?
Yes, the client supports both single and multiple MCP server configurations for concurrent interactions.
Which LLM providers can I use with langchain-mcp-client?
The client supports OpenAI, Anthropic, and Google, and can also run local models through Ollama.
How do I install and run the langchain-mcp-client?
Clone the repository, create a Python virtual environment, install dependencies via requirements.txt, and run the Streamlit app using 'streamlit run app.py'.
Can I execute MCP tools from the client interface?
Yes, the client allows you to view, test, and execute available MCP tools directly from the UI.
Does the client provide a chat interface for LLM interaction?
Yes, it includes a chat interface to interact with LLM agents connected through MCP servers.
How are tool execution results displayed?
Results from tool executions are shown in the client UI in real time for easy review.
Is the client limited to any specific operating system?
No, since it is a Streamlit app running on Python, it works on Windows, macOS, and Linux platforms.