mcp-openai-gemini-llama-example

MCP.Pizza Chef: philschmid

The mcp-openai-gemini-llama-example is a client that demonstrates how to build an AI agent with the Model Context Protocol (MCP) using OpenAI, Google Gemini, and Meta Llama 3 LLMs. It connects to an MCP server, loads its tools and resources, converts them into LLM-compatible function calls, and interacts with a SQLite database. This educational example showcases multi-LLM integration and database interaction through MCP in a simple CLI environment.

Use this MCP client to

  • Connect to MCP servers for multi-LLM orchestration
  • Interact with SQLite databases via LLM-driven queries
  • Demonstrate LLM function call conversions for tools
  • Build CLI AI agents using OpenAI, Gemini, and Llama 3
  • Experiment with multi-provider LLM workflows in MCP
  • Load and use MCP server tools and resources dynamically

README

How to use Anthropic MCP Server with open LLMs, OpenAI or Google Gemini

This repository contains a basic example of how to build an AI agent using the Model Context Protocol (MCP) with an open LLM (Meta Llama 3), OpenAI or Google Gemini, and a SQLite database. It's designed to be a simple, educational demonstration, not a production-ready framework.

OpenAI example: https://github.com/jalr4ever/Tiny-OAI-MCP-Agent

Setup

This code sets up a simple CLI agent that can interact with a SQLite database through an MCP server. It uses the official SQLite MCP server and demonstrates (see the sketch after this list):

  • Connecting to an MCP server
  • Loading and using tools and resources from the MCP server
  • Converting tools into LLM-compatible function calls
  • Interacting with an LLM using the openai SDK or the google-genai SDK
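
As a rough illustration of these steps, the sketch below connects to the official SQLite MCP server over stdio, loads its tools, and converts them into OpenAI-style function specs. It assumes the mcp Python SDK is installed; the Docker image name and database path follow the SQLite server's own documentation and may need adjusting, and the actual scripts in this repository can differ in detail.

    import asyncio
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Launch the official SQLite MCP server in Docker
    # (image name and db path per its docs; adjust as needed)
    server_params = StdioServerParameters(
        command="docker",
        args=["run", "--rm", "-i", "-v", "mcp-test:/mcp",
              "mcp/sqlite", "--db-path", "/mcp/test.db"],
    )

    async def main():
        async with stdio_client(server_params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()

                # Load the tools exposed by the MCP server
                tools = await session.list_tools()

                # Convert each MCP tool definition into an OpenAI-style function spec
                openai_tools = [
                    {
                        "type": "function",
                        "function": {
                            "name": tool.name,
                            "description": tool.description,
                            "parameters": tool.inputSchema,
                        },
                    }
                    for tool in tools.tools
                ]
                print([t["function"]["name"] for t in openai_tools])

    asyncio.run(main())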

Prerequisites

  • Docker installed and running
  • A Hugging Face account and access token (for the Llama 3 model)
  • A Google API key (for the Gemini model)

Installation

  1. Clone the repository:

    git clone https://github.com/philschmid/mcp-openai-gemini-llama-example
    cd mcp-openai-gemini-llama-example
  2. Install the required packages:

    pip install -r requirements.txt
  3. Log in to Hugging Face:

    huggingface-cli login --token YOUR_TOKEN

Examples

Llama 3

Run the following command:

python sqlite_llama_mcp_agent.py

The agent will start in interactive mode. You can type prompts to interact with the database. Type "quit", "exit", or "q" to stop the agent.

Example conversation:

Enter your prompt (or 'quit' to exit): what tables are available?

Response:  The available tables are: albums, artists, customers, employees, genres, invoice_items, invoices, media_types, playlists, playlist_track, tracks

Enter your prompt (or 'quit' to exit): how many artists are there

Response:  There are 275 artists in the database.
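
Each turn of the conversation above follows the same round trip: the prompt and the converted tool list go to the model, and any tool call the model returns is executed against the MCP server before a final answer is produced. Below is a minimal sketch of one such round trip, reusing session and openai_tools from the setup sketch; the Hugging Face endpoint and model name are assumptions, and the actual script may be structured differently.

    import json
    from openai import OpenAI

    # OpenAI-compatible endpoint for Llama 3 (URL and model name are assumptions)
    client = OpenAI(base_url="https://api-inference.huggingface.co/v1/",
                    api_key="hf_...")

    async def run_turn(session, openai_tools, messages):
        response = client.chat.completions.create(
            model="meta-llama/Meta-Llama-3-70B-Instruct",
            messages=messages,
            tools=openai_tools,
        )
        message = response.choices[0].message
        if message.tool_calls:
            messages.append(message)  # keep the assistant turn with its tool calls
            for tool_call in message.tool_calls:
                # Execute the requested tool on the MCP server
                result = await session.call_tool(
                    tool_call.function.name,
                    json.loads(tool_call.function.arguments),
                )
                messages.append({
                    "role": "tool",
                    "tool_call_id": tool_call.id,
                    "content": result.content[0].text,
                })
            # ...then call the model again with the tool results for a final answer
        return message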

Gemini

Run the following command:

GOOGLE_API_KEY=YOUR_API_KEY python sqlite_gemini_mcp_agent.py
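
The Gemini agent follows the same pattern with the google-genai SDK, passing the MCP tools as function declarations instead of OpenAI-style specs. A rough sketch, again reusing the MCP session from above; the model name is an assumption, and MCP input schemas may need JSON-schema fields that Gemini does not accept stripped out first.

    import os
    from google import genai
    from google.genai import types

    client = genai.Client(api_key=os.environ["GOOGLE_API_KEY"])

    async def ask_gemini(session, prompt):
        tools = await session.list_tools()
        # Convert MCP tools into Gemini function declarations
        # (unsupported JSON-schema fields may need to be stripped first)
        gemini_tools = types.Tool(function_declarations=[
            {
                "name": tool.name,
                "description": tool.description,
                "parameters": tool.inputSchema,
            }
            for tool in tools.tools
        ])
        response = client.models.generate_content(
            model="gemini-2.0-flash",
            contents=prompt,
            config=types.GenerateContentConfig(tools=[gemini_tools]),
        )
        # If Gemini requested a function call, execute it on the MCP server
        if response.function_calls:
            call = response.function_calls[0]
            result = await session.call_tool(call.name, dict(call.args))
            return result.content[0].text
        return response.text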

Future plans

I'm working on a toolkit to make implementing AI agents using MCP easier. Stay tuned for updates!

mcp-openai-gemini-llama-example FAQ

How do I set up the mcp-openai-gemini-llama-example client?
Install Docker, obtain a Hugging Face access token (for Llama 3) and a Google API key (for Gemini), then follow the repository's installation instructions.
Can this client work with multiple LLM providers simultaneously?
Yes; OpenAI, Google Gemini, and Meta Llama 3 are all supported through the same MCP client pattern, though each provider has its own example script.
Does this example support production use?
No, it is designed as a simple educational demonstration, not a production-ready framework.
How does the client interact with the SQLite database?
It connects to an MCP server exposing SQLite and converts tool calls into LLM-compatible function calls for database queries.
What SDKs are used for LLM interaction?
The client uses the OpenAI SDK for OpenAI models and the google-genai SDK for Google Gemini.
Is Docker required to run this client?
Yes, Docker must be installed and running; the example uses it to run the SQLite MCP server.
Can I extend this client to other MCP servers?
Yes, the client architecture allows connecting to other MCP servers exposing different tools or data sources.