
ollama-mcp

MCP.Pizza Chef: rawveg

The Ollama MCP Server enables seamless integration of Ollama's local large language models with MCP-compatible applications. It supports listing and pulling Ollama models, chatting via Ollama's chat API, and retrieving detailed model information. It features automatic port management and environment variable configuration, and requires only Node.js and a local Ollama installation. This server facilitates real-time, structured interaction between Ollama models and various MCP clients, enhancing local LLM workflows.

Use This MCP Server To

  • List available Ollama LLM models for selection
  • Pull and update Ollama models dynamically
  • Chat with Ollama models through MCP-compatible clients
  • Retrieve detailed metadata about Ollama models
  • Integrate Ollama LLMs into Claude Desktop or similar apps
  • Manage server ports automatically for hassle-free setup
  • Configure environment variables for customized Ollama usage

README

Ollama MCP Server

An MCP (Model Context Protocol) server for Ollama that enables seamless integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop.

Features

  • List available Ollama models
  • Pull new models from Ollama
  • Chat with models using Ollama's chat API
  • Get detailed model information
  • Automatic port management
  • Environment variable configuration

Prerequisites

  • Node.js (v16 or higher)
  • npm
  • Ollama installed and running locally

Installation

Manual Installation

Install globally via npm:

npm install -g @rawveg/ollama-mcp

Installing in Other MCP Applications

To install the Ollama MCP Server in other MCP-compatible applications (like Cline or Claude Desktop), add the following configuration to your application's MCP settings file:

{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    }
  }
}

The settings file location varies by application:

  • Claude Desktop: claude_desktop_config.json in the Claude app data directory
  • Cline: cline_mcp_settings.json in the VS Code global storage
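
If your Ollama instance is not listening on the default address, the same server entry can usually carry environment variables as well. Claude Desktop and Cline both accept an env block of this form, though exact support depends on the client; a minimal sketch:

{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ],
      "env": {
        "OLLAMA_API": "http://localhost:11434"
      }
    }
  }
}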

Usage

Starting the Server

Simply run:

ollama-mcp

The server will start on port 3456 by default. You can specify a different port using the PORT environment variable:

PORT=3457 ollama-mcp

Environment Variables

  • PORT: Server port (default: 3456). Can be used when running directly:
    # When running directly
    PORT=3457 ollama-mcp
  • OLLAMA_API: Ollama API endpoint (default: http://localhost:11434)
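
Both variables can be combined when ollama-mcp should talk to an Ollama instance running elsewhere; the address below is a placeholder, not a documented default:

# Point the server at a remote Ollama instance (placeholder address) on a non-default port
OLLAMA_API=http://192.168.1.50:11434 PORT=3457 ollama-mcp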

API Endpoints

  • GET /models - List available models
  • POST /models/pull - Pull a new model
  • POST /chat - Chat with a model
  • GET /models/:name - Get model details
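
The endpoints can be exercised directly with curl once the server is running. Request and response bodies are not documented in this README, so the payloads below are assumptions that mirror Ollama's own API (a name field for pulls, and model plus messages for chat); the model name llama3 is purely illustrative:

# List available models
curl http://localhost:3456/models

# Get details for a specific model (name is illustrative)
curl http://localhost:3456/models/llama3

# Pull a new model (body shape is an assumption)
curl -X POST http://localhost:3456/models/pull \
  -H "Content-Type: application/json" \
  -d '{"name": "llama3"}'

# Chat with a model (body shape is an assumption)
curl -X POST http://localhost:3456/chat \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "Hello"}]}'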

Development

  1. Clone the repository:
git clone https://github.com/rawveg/ollama-mcp.git
cd ollama-mcp
  2. Install dependencies:
npm install
  3. Build the project:
npm run build
  4. Start the server:
npm start

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

However, the acceptance of contributions does not grant permission to incorporate this project into third-party services or commercial platforms without prior discussion and agreement. While I previously accepted contributions (such as a Dockerfile and related README updates) to support integration with services like Smithery, recent actions by a similar service — Glama — have required a reassessment of this policy.

Glama has chosen to include open-source MCP projects in their commercial offering without notice or consent, and subsequently created issue requests asking maintainers to perform unpaid work to ensure compatibility with their platform. This behaviour — leveraging community labour for profit without dialogue or compensation — is not only inconsiderate, but ethically problematic.

As a result, and to protect the integrity of this project and its contributors, the licence has been updated to the GNU Affero General Public License v3.0 (AGPL-3.0). This change ensures that any use of the software — particularly in commercial or service-based platforms — must either remain fully compliant with the AGPL's terms or obtain a separate commercial licence. Merely linking to the original source is not sufficient where the project is being actively monetised. If you wish to include this project in a commercial offering, please get in touch first to discuss licensing terms.

License

AGPL v3.0

This project was previously MIT-licensed. As of 20th April 2025, it is now licensed under AGPL-3.0 to prevent unauthorised commercial exploitation. If your use of this project predates this change, please refer to the relevant Git tag or commit for the applicable licence.

ollama-mcp FAQ

How do I install the Ollama MCP Server?
Install globally via npm using 'npm install -g @rawveg/ollama-mcp', or configure it in your MCP app's settings.

What are the prerequisites for running the Ollama MCP Server?
You need Node.js v16 or higher, npm, and Ollama installed and running locally.

Can I use the Ollama MCP Server with other MCP-compatible applications?
Yes, it can be integrated into MCP clients like Claude Desktop or Cline by adding it to the MCP servers configuration.

How does the Ollama MCP Server handle port management?
It automatically manages ports to simplify setup and avoid conflicts.

What functionality does the Ollama MCP Server provide?
It lists available models, pulls new models, chats with models, and provides detailed model information via Ollama's API.

Is environment variable configuration supported?
Yes, you can configure environment variables to customize the Ollama MCP Server's behavior.

Does the Ollama MCP Server support real-time interaction with local LLMs?
Yes, it enables real-time, structured communication between Ollama's local models and MCP clients.

What LLM providers does the Ollama MCP Server work with?
It specifically integrates Ollama's local LLMs, but can be used alongside other MCP servers supporting providers like OpenAI, Claude, and Gemini.
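
Because MCP clients can register several servers in the same settings file, ollama-mcp can sit alongside servers for other providers. The sketch below is illustrative only; the second entry and its package name are hypothetical:

{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    },
    "openai-mcp-example": {
      "command": "npx",
      "args": [
        "-y",
        "openai-mcp-example"
      ]
    }
  }
}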