
assistant-mcp

MCP.Pizza Chef: pinecone-io

The assistant-mcp is an MCP server that interfaces with the Pinecone Assistant API to retrieve and manage information. It supports configurable multi-result retrieval, enabling efficient data access from Pinecone's vector database assistant. Designed for easy deployment via Docker, it requires a Pinecone API key and Assistant host details, facilitating seamless integration into AI workflows that leverage Pinecone's vector search capabilities.

Use this MCP server to

  • Retrieve multiple relevant results from the Pinecone Assistant API
  • Integrate Pinecone vector search data into AI workflows
  • Deploy Pinecone Assistant data retrieval in containerized environments
  • Configure result limits for optimized data fetching
  • Use as a backend server for AI agents needing Pinecone data access

README

Pinecone Assistant MCP Server

An MCP server implementation for retrieving information from Pinecone Assistant.

Features

  • Retrieves information from Pinecone Assistant
  • Supports multiple results retrieval with a configurable number of results

Prerequisites

  • Docker installed on your system
  • Pinecone API key - obtain from the Pinecone Console
  • Pinecone Assistant API host - after creating an Assistant (e.g. in the Pinecone Console), you can find the host on the Assistant details page

Building with Docker

To build the Docker image:

docker build -t pinecone/assistant-mcp .

Running with Docker

Run the server with your Pinecone API key:

docker run -i --rm \
  -e PINECONE_API_KEY=<YOUR_PINECONE_API_KEY_HERE> \
  -e PINECONE_ASSISTANT_HOST=<YOUR_PINECONE_ASSISTANT_HOST_HERE> \
  pinecone/assistant-mcp

Environment Variables

  • PINECONE_API_KEY (required): Your Pinecone API key
  • PINECONE_ASSISTANT_HOST (optional): Pinecone Assistant API host (default: https://prod-1-data.ke.pinecone.io)
  • LOG_LEVEL (optional): Logging level (default: info)
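For clarity, the configuration resolution described above can be sketched in Python. This is a hypothetical mirror of the server's startup behavior, not the actual Rust implementation; the default host and log level are taken from the list above, and `load_config` is an illustrative helper name:

```python
import os

def load_config(env=os.environ):
    """Resolve server settings from environment variables,
    using the defaults documented above."""
    api_key = env.get("PINECONE_API_KEY")
    if not api_key:
        raise RuntimeError("PINECONE_API_KEY is required")
    return {
        "api_key": api_key,
        "assistant_host": env.get("PINECONE_ASSISTANT_HOST",
                                  "https://prod-1-data.ke.pinecone.io"),
        "log_level": env.get("LOG_LEVEL", "info"),
    }

cfg = load_config({"PINECONE_API_KEY": "example-key"})
print(cfg["assistant_host"])  # falls back to the documented default host
print(cfg["log_level"])       # falls back to "info"
```

Note that only the API key aborts startup when missing; the other two variables silently fall back to their defaults.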

Usage with Claude Desktop

Add this to your claude_desktop_config.json:

{
  "mcpServers": {
    "pinecone-assistant": {
      "command": "docker",
      "args": [
        "run", 
        "-i", 
        "--rm", 
        "-e", 
        "PINECONE_API_KEY", 
        "-e", 
        "PINECONE_ASSISTANT_HOST", 
        "pinecone/assistant-mcp"
      ],
      "env": {
        "PINECONE_API_KEY": "<YOUR_PINECONE_API_KEY_HERE>",
        "PINECONE_ASSISTANT_HOST": "<YOUR_PINECONE_ASSISTANT_HOST_HERE>"
      }
    }
  }
}
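As a sanity check, the JSON above can be validated programmatically before restarting Claude Desktop. This small Python sketch only assumes the key names shown in the config; the consistency check (every variable passed with `-e` should also appear in the `env` map) is an illustrative heuristic, not part of the server:

```python
import json

config_text = """
{
  "mcpServers": {
    "pinecone-assistant": {
      "command": "docker",
      "args": ["run", "-i", "--rm",
               "-e", "PINECONE_API_KEY",
               "-e", "PINECONE_ASSISTANT_HOST",
               "pinecone/assistant-mcp"],
      "env": {
        "PINECONE_API_KEY": "<YOUR_PINECONE_API_KEY_HERE>",
        "PINECONE_ASSISTANT_HOST": "<YOUR_PINECONE_ASSISTANT_HOST_HERE>"
      }
    }
  }
}
"""

config = json.loads(config_text)  # raises ValueError on malformed JSON
server = config["mcpServers"]["pinecone-assistant"]
assert server["command"] == "docker"
# Every variable forwarded via -e should also be present in the "env" map.
forwarded = [a for a in server["args"] if a.isupper()]
missing = [v for v in forwarded if v not in server["env"]]
print("missing env vars:", missing)  # prints: missing env vars: []
```

A check like this catches the common failure mode of adding a new `-e FLAG` to `args` while forgetting to set its value under `env`.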

Building from Source

If you prefer to build from source without Docker:

  1. Make sure you have Rust installed (https://rustup.rs/)
  2. Clone this repository
  3. Run cargo build --release
  4. The binary will be available at target/release/assistant-mcp

Testing with the Inspector

export PINECONE_API_KEY=<YOUR_PINECONE_API_KEY_HERE>
export PINECONE_ASSISTANT_HOST=<YOUR_PINECONE_ASSISTANT_HOST_HERE>
# Run the inspector alone
npx @modelcontextprotocol/inspector cargo run
# Or run with Docker directly through the inspector
npx @modelcontextprotocol/inspector -- docker run -i --rm -e PINECONE_API_KEY -e PINECONE_ASSISTANT_HOST pinecone/assistant-mcp

License

This project is licensed under the terms specified in the LICENSE file.

assistant-mcp FAQ

How do I deploy the assistant-mcp server?
Build and run the Docker image with your Pinecone API key and Assistant host environment variables.

What prerequisites are needed to run assistant-mcp?
Docker installed, a Pinecone API key, and the Pinecone Assistant API host URL.

Can I configure how many results the server retrieves?
Yes, the server supports configurable multi-result retrieval from Pinecone Assistant.

Is the assistant-mcp server limited to a specific Pinecone Assistant?
No, you can specify the Assistant host to connect to any Pinecone Assistant instance.

What environment variables are required to run the server?
PINECONE_API_KEY (required) and PINECONE_ASSISTANT_HOST (optional).

Can assistant-mcp be integrated with other LLM providers?
Yes, it can be used alongside models from OpenAI, Anthropic Claude, and Google Gemini by feeding Pinecone data into their workflows.

Does assistant-mcp support real-time data retrieval?
It retrieves up-to-date information from Pinecone Assistant on demand during runtime.