mcp_client

MCP.Pizza Chef: theailanguage

The MCP Client is a Python-based implementation that integrates LangGraph and Google Gemini AI to execute tasks using the Model Context Protocol (MCP). It offers four client options that differ in LangChain integration, configuration-file support, and transport method (STDIO or SSE), enabling flexible and scalable AI-driven workflows across environments.

Use This MCP Client To

  • Run MCP tasks using Google Gemini AI with Python
  • Integrate LangChain for advanced MCP client workflows
  • Configure multi-server MCP clients with flexible settings
  • Use SSE transport for real-time MCP client-server communication
  • Execute legacy MCP client tasks via standard input/output
  • Develop AI-enhanced applications leveraging the MCP protocol
  • Test and prototype MCP client implementations quickly
  • Switch between different MCP client modes for varied use cases

README

🚀 MCP Client with Gemini AI

📢 Subscribe to The AI Language on YouTube!

Welcome! This project features multiple MCP clients integrated with Google Gemini AI to execute tasks via the Model Context Protocol (MCP) — with and without LangChain.

Happy building, and don't forget to subscribe!

MCP Client Options

This repository includes four MCP client options for various use cases:

Option  Client Script                    LangChain  Config Support  Transport          Tutorial
1       client.py                        ❌         ❌              STDIO              Legacy Client
2       langchain_mcp_client.py          ✅         ❌              STDIO              LangChain Client
3       langchain_mcp_client_wconfig.py  ✅         ✅              STDIO              Multi-Server
4       client_sse.py                    ❌         ❌              SSE (Local & Web)  SSE Client
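
Option 3 reads its server list from a JSON file. The exact schema is defined by langchain_mcp_client_wconfig.py, but a multi-server config typically looks something like the sketch below, assuming the common mcpServers layout used by many MCP clients; the server names, commands, and paths are placeholders, not values from this repo:

    {
      "mcpServers": {
        "terminal": {
          "command": "python",
          "args": ["path/to/terminal_server.py"]
        },
        "weather": {
          "command": "python",
          "args": ["path/to/weather_server.py"]
        }
      }
    }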

If you want to add or reuse MCP Servers, check out the MCP Servers repo.


✪ Features

✅ Connects to an MCP server (STDIO or SSE)
✅ Uses Google Gemini AI to interpret user prompts
✅ Allows Gemini to call MCP tools via the server
✅ Executes tool commands and returns results
✅ (Upcoming) Maintains context and history for conversations
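
To make the first two features concrete, here is a minimal sketch of the STDIO connection step using the official mcp Python SDK; the server path is a placeholder, and the repo's actual clients layer Gemini on top of a session like this:

    import asyncio
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main():
        # Spawn the MCP server as a subprocess and talk to it over STDIO
        params = StdioServerParameters(command="python", args=["path/to/server.py"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                # Discover the tools the server exposes; Gemini picks from these
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])

    asyncio.run(main())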


Running the MCP Client

Choose the appropriate command for your preferred client:

  • Legacy STDIO — uv run client.py path/to/server.py
  • LangChain STDIO — uv run langchain_mcp_client.py path/to/server.py
  • LangChain Multi-Server STDIO — uv run langchain_mcp_client_wconfig.py path/to/config.json
  • SSE Client — uv run client_sse.py sse_server_url
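
Each client also needs a Gemini API key, loaded from the .env file listed in the project structure below. The exact variable name is set by the scripts themselves; a common convention (assumed here, so verify against the code) is:

    GOOGLE_API_KEY=your-gemini-api-key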

Project Structure

mcp-client-gemini/
├── client.py                       # Basic client (STDIO)
├── langchain_mcp_client.py         # LangChain + Gemini
├── langchain_mcp_client_wconfig.py # LangChain + config.json (multi-server)
├── client_sse.py                   # SSE transport client (local or remote)
├── .env                            # API key environment file
├── README.md                       # Project documentation
├── requirements.txt                # Dependency list
├── .gitignore                      # Git ignore rules
└── LICENSE                         # License information

How It Works

  1. You send a prompt:

    Create a file named test.txt

  2. The prompt is sent to Google Gemini AI.
  3. Gemini inspects the available MCP tools and decides which one, if any, to call.
  4. The selected tool is executed on the connected server.
  5. The AI returns results and maintains conversation context (if supported).
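
As a rough illustration of steps 2 through 5 in the LangChain variants, the flow can be sketched with the langchain-mcp-adapters and langchain-google-genai packages. This is a hypothetical sketch, not the repo's exact code: adapter APIs vary by version, and the server name, path, and model name are placeholders.

    import asyncio
    from langchain_google_genai import ChatGoogleGenerativeAI
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langgraph.prebuilt import create_react_agent

    async def main():
        # Describe one or more MCP servers (name and path are placeholders)
        client = MultiServerMCPClient({
            "files": {
                "command": "python",
                "args": ["path/to/server.py"],
                "transport": "stdio",
            },
        })
        tools = await client.get_tools()        # MCP tools wrapped as LangChain tools
        llm = ChatGoogleGenerativeAI(model="gemini-2.0-flash")
        agent = create_react_agent(llm, tools)  # Gemini decides which tool to call
        result = await agent.ainvoke(
            {"messages": [("user", "Create a file named test.txt")]}
        )
        print(result["messages"][-1].content)

    asyncio.run(main())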

🤝 Contributing

At this time, this project does not accept external code contributions.

This is to keep licensing simple and avoid any shared copyright.

You're very welcome to:
✅ Report bugs or request features (via GitHub Issues)
✅ Fork the repo and build your own version
✅ Suggest documentation improvements

If you'd like to collaborate in another way, feel free to open a discussion!

mcp_client FAQ

How do I choose between the four MCP client options?
Choose based on your need for LangChain support, configuration, and transport method; options range from simple STDIO clients to SSE-enabled clients.

Can I use this MCP client without LangChain?
Yes, two client options do not require LangChain, allowing lightweight and straightforward MCP interactions.

What transport methods are supported by the MCP client?
The client supports STDIO for legacy and LangChain clients, and SSE for local and web-based real-time communication.
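
For reference, opening an SSE session with the official mcp Python SDK looks roughly like the sketch below; the URL is a placeholder of the same kind client_sse.py accepts on the command line:

    import asyncio
    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async def main():
        # Connect to an already-running MCP server over SSE (URL is a placeholder)
        async with sse_client("http://localhost:8000/sse") as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])

    asyncio.run(main())
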
Is configuration support available for all client options?
No, only the langchain_mcp_client_wconfig.py option supports configuration for multi-server setups.

Can this MCP client work with LLM providers other than Gemini?
While optimized for Gemini AI, the client can be adapted to work with other providers like OpenAI and Anthropic with appropriate modifications.

Where can I find tutorials for using the MCP client?
Tutorials are available via the linked YouTube videos covering legacy, LangChain, multi-server, and SSE client setups.

Is this MCP client suitable for production use?
It is designed for development and prototyping; production readiness depends on your integration and deployment practices.

How does the MCP client handle multi-server environments?
The client supports multi-server configurations through its configurable LangChain-enabled client option.