
mcp-ollama-agent

MCP.Pizza Chef: ausboss

The mcp-ollama-agent is a TypeScript-based MCP server that integrates the Ollama AI platform with the Model Context Protocol ecosystem. It enables AI agents to interact seamlessly with multiple MCP servers, providing a unified interface for tool usage such as file system operations and web research. Featuring an interactive command-line interface, it supports easy configuration and standalone demo mode for testing without an LLM. This server facilitates advanced AI workflows by combining Ollama's capabilities with MCP's modular tool ecosystem.

Use This MCP Server To

  • Integrate Ollama AI with multiple MCP servers
  • Enable AI agents to perform file system operations
  • Support web research through MCP tool integration
  • Provide an interactive CLI for AI agent tool usage
  • Test MCP tools in standalone demo mode without an LLM
  • Configure AI workflows via mcp-config.json
  • Demonstrate a TypeScript MCP server implementation

README

TypeScript MCP Agent with Ollama Integration

This project demonstrates integration between Model Context Protocol (MCP) servers and Ollama, allowing AI models to interact with various tools through a unified interface.

✨ Features

  • Supports multiple MCP servers (both uvx and npx tested)
  • Built-in support for file system operations and web research
  • Easy configuration through mcp-config.json similar to claude_desktop_config.json
  • Interactive chat interface with Ollama integration, designed to work with any tool-calling model
  • Standalone demo mode for testing web and filesystem tools without an LLM

🚀 Getting Started

  1. Prerequisites:

    • Node.js (version 18 or higher)

    • Ollama installed and running

    • Install the MCP tools globally that you want to use:

      # For filesystem operations
      npm install -g @modelcontextprotocol/server-filesystem
      
      # For web research
      npm install -g @mzxrai/mcp-webresearch
  2. Clone and install:

    git clone https://github.com/ausboss/mcp-ollama-agent.git
    cd mcp-ollama-agent
    npm install
    
  3. Configure your tools and an Ollama model that supports tool calling in mcp-config.json:

    {
      "mcpServers": {
        "filesystem": {
          "command": "npx",
          "args": ["@modelcontextprotocol/server-filesystem", "./"]
        },
        "webresearch": {
          "command": "npx",
          "args": ["-y", "@mzxrai/mcp-webresearch"]
        }
      },
      "ollama": {
        "host": "http://localhost:11434",
        "model": "qwen2.5:latest"
      }
    }
  4. Run the demo to test filesystem and webresearch tools without an LLM:

    npx tsx ./src/demo.ts
  5. Or start the chat interface with Ollama:

    npm start
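The mcp-config.json from step 3 can be loaded and validated with plain Node APIs. A minimal sketch, assuming the field names shown in the example config above (the `parseConfig` helper and `McpConfig` interface are illustrative, not this project's actual code; the config is inlined here so the snippet runs standalone):

```typescript
// Shape of the example mcp-config.json shown in step 3.
interface McpConfig {
  mcpServers: Record<string, { command: string; args: string[] }>;
  ollama: { host: string; model: string };
}

// Example config matching step 3, inlined for demonstration.
// In the real project this would come from readFileSync("./mcp-config.json").
const raw = `{
  "mcpServers": {
    "filesystem": { "command": "npx", "args": ["@modelcontextprotocol/server-filesystem", "./"] },
    "webresearch": { "command": "npx", "args": ["-y", "@mzxrai/mcp-webresearch"] }
  },
  "ollama": { "host": "http://localhost:11434", "model": "qwen2.5:latest" }
}`;

// Hypothetical helper: parse the config text and check the two
// required top-level sections before handing it to the agent.
function parseConfig(text: string): McpConfig {
  const config = JSON.parse(text) as McpConfig;
  if (!config.mcpServers || !config.ollama?.model) {
    throw new Error("invalid mcp-config: missing mcpServers or ollama.model");
  }
  return config;
}

const config = parseConfig(raw);
console.log(Object.keys(config.mcpServers)); // → [ 'filesystem', 'webresearch' ]
```

Each entry under mcpServers gives the command and args used to spawn that server over stdio, which is why any npx- or uvx-launchable MCP server slots in without code changes.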

⚙️ Configuration

  • MCP Servers: Add any MCP-compatible server to the mcpServers section
  • Ollama: Configure host and model (must support function calling)
  • Supports both Python (uvx) and Node.js (npx) MCP servers
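Bridging the two sides comes down to reshaping each MCP tool's JSON Schema into the function-calling format Ollama's chat API accepts. A hedged sketch of that mapping (the `toOllamaTool` name and `McpTool` interface are illustrative assumptions, not this project's actual API):

```typescript
// Tool definition shape as returned by an MCP server's listTools call.
interface McpTool {
  name: string;
  description?: string;
  inputSchema: object;
}

// Convert an MCP tool into the OpenAI-style function spec that the
// Ollama chat API takes in its `tools` array.
// (toOllamaTool is an illustrative name, not this project's code.)
function toOllamaTool(tool: McpTool) {
  return {
    type: "function" as const,
    function: {
      name: tool.name,
      description: tool.description ?? "",
      parameters: tool.inputSchema,
    },
  };
}

// Example: the filesystem server's list_directory tool from the demo below.
const listDirectory: McpTool = {
  name: "list_directory",
  description: "List files in a directory",
  inputSchema: {
    type: "object",
    properties: { path: { type: "string" } },
    required: ["path"],
  },
};

console.log(toOllamaTool(listDirectory).function.name); // → list_directory
```

Because the MCP inputSchema is already JSON Schema, it can be passed through as the function's parameters unchanged; this is also why the configured Ollama model must support function calling.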

💡 Example Usage

This example uses the qwen2.5:latest model:

Chat started. Type "exit" to end the conversation.
You: can you use your list directory tool to see whats in test-directory then use your read file tool to read it to me?
Model is using tools to help answer...
Using tool: list_directory
With arguments: { path: 'test-directory' }
Tool result: [ { type: 'text', text: '[FILE] test.txt' } ]
Assistant:
Model is using tools to help answer...
Using tool: read_file
With arguments: { path: 'test-directory/test.txt' }
Tool result: [ { type: 'text', text: 'rosebud' } ]
Assistant: The content of the file `test.txt` in the `test-directory` is:
rosebud
You: thanks
Assistant: You're welcome! If you have any other requests or need further assistance, feel free to ask.

System Prompts

Some local models may need help with tool selection. Customize the system prompt in ChatManager.ts to improve tool usage.
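As a starting point, a nudge like the following can steer smaller models toward calling tools instead of answering from memory (the wording and the `SYSTEM_PROMPT` constant are an illustrative example, not the prompt shipped in ChatManager.ts):

```typescript
// Illustrative system prompt for ChatManager.ts; an example only,
// not this project's shipped default.
const SYSTEM_PROMPT = [
  "You are a helpful assistant with access to tools.",
  "When a question involves files or the web, call the matching tool",
  "rather than answering from memory.",
  "After a tool returns, summarize its result for the user.",
].join(" ");

console.log(SYSTEM_PROMPT.includes("call the matching tool")); // → true
```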

🤝 Contributing

Contributions welcome! Feel free to submit issues or pull requests.

mcp-ollama-agent FAQ

How do I install the mcp-ollama-agent server?
Ensure Node.js 18+ and Ollama are installed, then clone the repo and follow the setup instructions including installing required MCP tools globally.
Can I use mcp-ollama-agent without an LLM?
Yes, it supports a standalone demo mode for testing web and filesystem tools without connecting to an LLM.
How do I configure the tools used by mcp-ollama-agent?
Configuration is done via the mcp-config.json file, similar to claude_desktop_config.json, allowing easy setup of multiple MCP servers and tools.
What platforms does mcp-ollama-agent support?
It runs on Node.js environments and integrates with Ollama, supporting any MCP server tools compatible with the protocol.
Does mcp-ollama-agent support multiple MCP servers simultaneously?
Yes, it supports multiple MCP servers, tested with uvx and npx MCP server tools.
Is the interactive CLI customizable?
The CLI is designed for interactive use with Ollama integration and can be extended or configured through the project’s TypeScript codebase.
What AI providers can I use with mcp-ollama-agent?
While designed for Ollama, it can be adapted to work with other LLM providers like OpenAI, Claude, and Gemini through MCP compatibility.