
mcp-memgraph

MCP.Pizza Chef: memgraph

Memgraph MCP Server is a lightweight implementation of the Model Context Protocol designed to integrate the Memgraph graph database with large language models (LLMs). It enables real-time, structured context sharing between Memgraph and LLMs, facilitating advanced graph-based queries and reasoning within AI workflows. The server supports easy setup and configuration, making it ideal for developers looking to leverage graph data in AI-enhanced applications.

Use this MCP server to

- Enable LLMs to query and reason over graph data in Memgraph
- Integrate Memgraph graph database context into AI workflows
- Facilitate real-time graph data access for language models
- Support complex graph queries via natural-language interfaces
- Combine graph analytics with LLM-powered insights
- Automate knowledge graph exploration using LLMs
- Enhance AI agents with structured graph context from Memgraph

README

🚀 Memgraph MCP Server

Memgraph MCP Server is a lightweight server implementation of the Model Context Protocol (MCP) designed to connect Memgraph with LLMs.


⚡ Quick start

📹 Memgraph MCP Server Quick Start video

1. Run Memgraph MCP Server

  1. Install uv and create a virtual environment with uv venv. Activate it with source .venv/bin/activate (macOS/Linux) or .venv\Scripts\activate (Windows).
  2. Install dependencies: uv add "mcp[cli]" httpx
  3. Run Memgraph MCP server: uv run server.py.

2. Run MCP Client

  1. Install Claude for Desktop.
  2. Add the Memgraph server to Claude config:

macOS/Linux

code ~/Library/Application\ Support/Claude/claude_desktop_config.json

Windows

code $env:AppData\Claude\claude_desktop_config.json

Example config:

{
  "mcpServers": {
    "mcp-memgraph": {
      "command": "/Users/katelatte/.local/bin/uv",
      "args": [
        "--directory",
        "/Users/katelatte/projects/mcp-memgraph",
        "run",
        "server.py"
      ]
    }
  }
}
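If you prefer to script this step, the entry can be merged into an existing config with the Python standard library. A minimal sketch, assuming the same placeholder paths as the example above (substitute your own uv binary and project checkout):

```python
import json
from pathlib import Path

# Placeholder paths -- replace with your own uv binary and project location.
UV_PATH = "/Users/katelatte/.local/bin/uv"
PROJECT_DIR = "/Users/katelatte/projects/mcp-memgraph"


def add_memgraph_server(config_path: Path) -> dict:
    """Merge an mcp-memgraph entry into a Claude Desktop config file."""
    # Load the existing config if present; otherwise start from scratch.
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    servers = config.setdefault("mcpServers", {})
    servers["mcp-memgraph"] = {
        "command": UV_PATH,
        "args": ["--directory", PROJECT_DIR, "run", "server.py"],
    }
    config_path.write_text(json.dumps(config, indent=2))
    return config
```

Merging via setdefault keeps any MCP servers you have already configured instead of overwriting the file.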

Note

You may need to put the full path to the uv executable in the command field. You can get it by running which uv on macOS/Linux or where uv on Windows. Make sure you also pass the absolute path to your server.py.
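The same lookup that which/where performs is available from Python via shutil.which, which can be handy when generating the config programmatically:

```python
import shutil

# Search PATH for the uv executable, exactly like `which uv` / `where uv`.
# Returns the absolute path as a string, or None if uv is not on PATH.
uv_path = shutil.which("uv")
print(uv_path)
```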

3. Chat with the database

  1. Run Memgraph MAGE:
    docker run -p 7687:7687 memgraph/memgraph-mage --schema-info-enabled=True
    
    The --schema-info-enabled configuration setting is set to True so the LLM can run the SHOW SCHEMA INFO query.
  2. Open Claude Desktop and you should see the Memgraph tools and resources listed. Try them out! (You can load dummy data from Memgraph Lab Datasets.)
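Before opening Claude Desktop, it can help to confirm that the Memgraph container is actually accepting connections on its Bolt port. A small stdlib-only check, assuming the default host and the port 7687 mapping from the docker command above:

```python
import socket


def bolt_reachable(host: str = "localhost", port: int = 7687,
                   timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to Memgraph's Bolt port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and DNS failures alike.
        return False


print(bolt_reachable())  # True once the memgraph-mage container is up
```

This only verifies that the port is open; a failed check usually means the container is not running or the port mapping was omitted.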

🔧 Tools

run_query()

Run a Cypher query against Memgraph.

🗃️ Resources

get_schema()

Get Memgraph schema information (prerequisite: --schema-info-enabled=True).

🗺️ Roadmap

The Memgraph MCP Server is just getting started. We're actively working on expanding its capabilities and making it even easier to integrate Memgraph into modern AI workflows. In the near future, we'll be releasing a TypeScript version of the server to better support JavaScript-based environments. Additionally, we plan to migrate this project into our central AI Toolkit repository, where it will live alongside other tools and integrations for LangChain, LlamaIndex, and MCP. Our goal is to provide a unified, open-source toolkit that makes it seamless to build graph-powered applications and intelligent agents with Memgraph at the core.

mcp-memgraph FAQ

How do I install the Memgraph MCP Server?
Install 'uv' and create a virtual environment, then add dependencies 'mcp[cli]' and 'httpx', and run the server script.
How do I connect the Memgraph MCP Server to an MCP client like Claude?
Add the Memgraph server configuration to Claude's MCP client config file on your OS and start the client.
What prerequisites are needed before running the Memgraph MCP Server?
You need a Python environment with 'uv' installed and a Memgraph instance accessible for graph data.
Can the Memgraph MCP Server work with multiple LLM providers?
Yes, it is provider-agnostic and works with OpenAI, Claude, Gemini, and other LLMs supporting MCP.
Is the Memgraph MCP Server suitable for production use?
It is lightweight and designed for easy integration; suitability depends on your scale and performance needs.
How does the Memgraph MCP Server enhance AI applications?
By exposing graph data context to LLMs, it enables richer, graph-aware AI reasoning and workflows.
Where can I find a quick start guide or demo?
A quick start video is available on YouTube linked in the GitHub repository README.