
mcp-client

MCP.Pizza Chef: rakesh-eltropy

The MCP-client is a simple yet powerful REST API and CLI client designed to interact with Model Context Protocol (MCP) servers. It supports any MCP-compatible server, including pre-configured defaults like SQLite and Brave Search, with easy addition of new servers via configuration. Integrated with LangChain, it enables executing LLM prompts across multiple MCP servers simultaneously, facilitating collaborative and comprehensive query responses. The client is compatible with a wide range of LLM providers that support APIs with function capabilities, such as OpenAI, Claude, Gemini, AWS Nova, Groq, and Ollama, making it highly flexible for diverse AI workflows.

Use This MCP client To

  • Query multiple MCP servers simultaneously for comprehensive data
  • Integrate MCP servers with LangChain for advanced LLM workflows
  • Add and configure custom MCP servers via JSON config
  • Use the CLI to interact with MCP servers for quick testing
  • Leverage the REST API for programmatic MCP server communication
  • Support multi-provider LLM prompt execution with function calls

README

MCP REST API and CLI Client

A simple REST API and CLI client to interact with Model Context Protocol (MCP) servers.

Key Features

1. MCP-Compatible Servers

  • Supports any MCP-compatible server.
  • Pre-configured default servers:
    • SQLite (a test.db with sample product data is provided)
    • Brave Search
  • Additional MCP servers can be added in the mcp-server-config.json file
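A new server entry in mcp-server-config.json might look like the sketch below. This is illustrative only: the field names (command, args, env) follow the common MCP server-launch convention, so check the bundled mcp-server-config.json for the exact schema this client expects.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
      "env": {}
    }
  }
}
```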

2. Integrated with LangChain

  • Leverages LangChain to execute LLM prompts.
  • Enables multiple MCP servers to collaborate and respond to a specific query simultaneously.

3. LLM Provider Support

  • Compatible with any LLM provider that supports APIs with function capabilities.
  • Examples:
    • OpenAI
    • Claude
    • Gemini
    • AWS Nova
    • Groq
    • Ollama
    • Essentially all LLM providers are supported as long as they offer a function-calling API. Please refer to the LangChain documentation for more details.

Setup

  1. Clone the repository:

    git clone https://github.com/rakesh-eltropy/mcp-client.git
  2. Navigate to the project directory:

    cd mcp-client
  3. Set the OPENAI_API_KEY environment variable:

    export OPENAI_API_KEY=your-openai-api-key

    You can also set the OPENAI_API_KEY in the mcp-server-config.json file.

    You can also set the provider and model in the mcp-server-config.json file, e.g., the provider can be ollama and the model can be llama3.2:3b.

  4. Set the BRAVE_API_KEY environment variable:

    export BRAVE_API_KEY=your-brave-api-key

    You can also set the BRAVE_API_KEY in the mcp-server-config.json file. You can get a free BRAVE_API_KEY from the Brave Search API.
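If you prefer keeping the model selection and API keys in the config file rather than environment variables, the relevant fragment might look like the sketch below. The key names here are illustrative assumptions; the actual names are defined in the repository's mcp-server-config.json.

```json
{
  "llm": {
    "provider": "ollama",
    "model": "llama3.2:3b"
  }
}
```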

  5. Running from the CLI:

    uv run cli.py

    To explore the available commands, use the help option. You can chat with the LLM using the chat command. Sample prompts:

      What is the capital city of India?
      Search for the most expensive product in the database and find more details about it on Amazon.
  6. Running from the REST API:

    uvicorn app:app --reload

    You can use the following curl command to chat with the LLM:

    curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?"}' http://localhost:8000/chat

    You can use the following curl command to chat with the LLM with streaming enabled:

    curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?", "streaming": true}' http://localhost:8000/chat
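    Beyond curl, the /chat endpoint can be called from any HTTP client. Below is a minimal Python sketch using only the standard library, assuming the server started above is running on localhost:8000; the helper names are illustrative, not part of this project.

```python
import json
import urllib.request

# Endpoint from the curl examples above (assumes `uvicorn app:app` is running).
API_URL = "http://localhost:8000/chat"

def build_payload(message: str, streaming: bool = False) -> bytes:
    """Serialize the JSON request body expected by the /chat endpoint."""
    return json.dumps({"message": message, "streaming": streaming}).encode("utf-8")

def chat(message: str, streaming: bool = False) -> str:
    """POST a chat message and return the raw response body as text."""
    req = urllib.request.Request(
        API_URL,
        data=build_payload(message, streaming),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

# Example usage (requires a running server):
# print(chat("list all the products from my local database?"))
```

    For streaming responses, pass streaming=True and read the response body incrementally instead of calling resp.read() once.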

Contributing

Feel free to submit issues and pull requests for improvements or bug fixes.

mcp-client FAQ

How do I add a new MCP server to the mcp-client?
You can add new MCP servers by editing the mcp-server-config.json file to include the server's details and endpoints.
Can mcp-client work with any LLM provider?
Yes, mcp-client supports any LLM provider that offers APIs with function call capabilities, including OpenAI, Claude, Gemini, AWS Nova, Groq, and Ollama.
How does mcp-client integrate with LangChain?
mcp-client leverages LangChain to execute LLM prompts, enabling multiple MCP servers to collaborate and respond to queries simultaneously.
Is there a way to interact with MCP servers without coding?
Yes, mcp-client provides a CLI interface for direct interaction with MCP servers, useful for quick testing and manual queries.
What default MCP servers come pre-configured with mcp-client?
The client includes pre-configured SQLite (with sample data) and Brave Search MCP servers by default.
Can I use mcp-client programmatically?
Yes, mcp-client exposes a REST API allowing programmatic communication with MCP servers for integration into applications.
Does mcp-client support concurrent queries to multiple servers?
Yes, it supports querying multiple MCP servers simultaneously to aggregate responses efficiently.
What formats are supported for configuring MCP servers?
MCP servers are configured using a JSON file named mcp-server-config.json for easy management.