mcp-server-weaviate

mcp-server-weaviate is an MCP server that connects the Weaviate vector search engine to the Model Context Protocol ecosystem. It enables LLMs to query and interact with Weaviate's semantic search capabilities in real time, facilitating advanced retrieval-augmented generation and context-aware AI workflows. This server supports seamless integration with clients like Claude Desktop and is installable via Smithery, making it easy to embed Weaviate-powered knowledge bases into AI applications.

Use this MCP server to:

  • Integrate Weaviate vector search with LLMs for semantic retrieval
  • Enable real-time context querying from Weaviate in AI workflows
  • Use Weaviate as a knowledge base for retrieval-augmented generation
  • Connect the Claude Desktop client to Weaviate via MCP
  • Automate semantic search queries within AI agents
  • Embed Weaviate data into multi-step reasoning processes

README

mcp-server-weaviate

MCP server for Weaviate

🏎️ Quickstart

Prerequisites

  • Ensure you have uv installed (see the docs for details)
  • Clone this repository

Install

Installing via Smithery

To install Weaviate MCP Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @weaviate/mcp-server-weaviate --client claude

Claude Desktop

On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json

On Windows: %APPDATA%/Claude/claude_desktop_config.json
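The two paths above can be resolved programmatically if you script your setup; a minimal sketch (the helper name claude_config_path is ours, not part of this project):

```python
from pathlib import Path

def claude_config_path(system: str, home: str, appdata: str = "") -> Path:
    """Return the Claude Desktop config path for an OS name.

    `system` follows platform.system(): "Darwin" for macOS,
    "Windows" for Windows. `home`/`appdata` stand in for $HOME
    and %APPDATA% respectively.
    """
    if system == "Darwin":
        return (Path(home) / "Library" / "Application Support"
                / "Claude" / "claude_desktop_config.json")
    if system == "Windows":
        return Path(appdata) / "Claude" / "claude_desktop_config.json"
    raise ValueError(f"Claude Desktop config path unknown for {system!r}")
```

In practice you would call it with `platform.system()` and `os.environ` values; the explicit parameters just make the two documented locations easy to see.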

Development/Unpublished Servers Configuration

{
  "mcpServers": {
    "mcp-server-weaviate": {
      "command": "PYTHON_PATH",
      "args": [
        "-m",
        "src.server",
        "--weaviate-url",
        "YOUR_WEAVIATE_URL",
        "--weaviate-api-key",
        "YOUR_WEAVIATE_API_KEY",
        "--search-collection-name",
        "YOUR_SEARCH_COLLECTION",
        "--store-collection-name",
        "YOUR_STORE_COLLECTION",
        "--openai-api-key",
        "YOUR_OPENAI_API_KEY"
      ],
      "env": {
        "PYTHONPATH": "PATH_TO_MCP_SERVER_WEAVIATE_DIRECTORY"
      }
    }
  }
}

mcp-server-weaviate FAQ

How do I install mcp-server-weaviate?
You can install it via Smithery CLI using 'npx -y @smithery/cli install @weaviate/mcp-server-weaviate --client claude' or clone the repo and follow the setup instructions.
What prerequisites are needed for mcp-server-weaviate?
You need to have 'uv' installed as per the Astral docs, and a working Python environment to run the server module.
How does mcp-server-weaviate integrate with LLM clients?
It exposes Weaviate's vector search capabilities as an MCP server, allowing clients like Claude Desktop to query semantic data in real time.
Can mcp-server-weaviate be used with multiple LLM providers?
Yes, it is provider-agnostic and works with any MCP-capable client, so models such as OpenAI's GPT series, Claude, and Gemini can use it through the MCP protocol.
Is mcp-server-weaviate suitable for production use?
Yes, it is designed to be lightweight and easily deployable, suitable for embedding Weaviate search in production AI workflows.
How do I configure mcp-server-weaviate for development?
You can configure it by editing the Claude Desktop config JSON or using environment variables as documented in the repo.
What is the role of Weaviate in this MCP server?
Weaviate acts as a vector search engine providing semantic search and knowledge base capabilities accessible via MCP.
Does mcp-server-weaviate support real-time updates from Weaviate?
Yes, it allows LLMs to query up-to-date vector data from Weaviate during interactions.