
ollama-mcp-db

MCP.Pizza Chef: robdodson

The ollama-mcp-db server integrates Ollama's powerful large language model capabilities with PostgreSQL database access through the Model Context Protocol (MCP). It provides an interactive chat interface that allows users to query their PostgreSQL databases using natural language. The server automatically generates SQL queries based on user input, delivers schema-aware responses, and ensures secure, read-only access to the database. Designed for developers and data analysts, it simplifies data exploration and retrieval by bridging conversational AI with structured database querying, leveraging the qwen2.5-coder:7b-instruct model from Ollama.

Use This MCP Server To

  • Query PostgreSQL databases using natural language
  • Generate SQL queries automatically from chat input
  • Explore database schemas interactively
  • Provide AI-powered data insights in chat
  • Enable secure, read-only database access via chat
  • Integrate conversational AI with database workflows

README

Ollama MCP Database Assistant

An interactive chat interface that combines Ollama's LLM capabilities with PostgreSQL database access through the Model Context Protocol (MCP). Ask questions about your data in natural language and get AI-powered responses backed by real SQL queries.

Features

  • Natural language interface to your PostgreSQL database
  • Automatic SQL query generation
  • Schema-aware responses
  • Interactive chat interface
  • Secure, read-only database access

Prerequisites

  • Node.js 16 or higher
  • A running PostgreSQL database
  • Ollama installed and running locally
  • The qwen2.5-coder:7b-instruct model pulled in Ollama

Setup

  1. Clone the repository:
git clone [your-repo-url]
cd [your-repo-name]
  2. Install dependencies:
npm install
  3. Pull the required Ollama model:
ollama pull qwen2.5-coder:7b-instruct
  4. Create a .env file in the project root:
DATABASE_URL=postgresql://user:password@localhost:5432/dbname
OLLAMA_MODEL=qwen2.5-coder:7b-instruct  # Optional - this is the default
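A minimal sketch of how the app could resolve this configuration at startup, requiring DATABASE_URL and falling back to the default model when OLLAMA_MODEL is unset. The `loadConfig` helper and `Config` shape are illustrative assumptions, not the project's actual code:

```typescript
// Hypothetical configuration loader: DATABASE_URL is required,
// OLLAMA_MODEL falls back to the documented default.
interface Config {
  databaseUrl: string;
  ollamaModel: string;
}

function loadConfig(env: Record<string, string | undefined>): Config {
  const databaseUrl = env.DATABASE_URL;
  if (!databaseUrl) {
    throw new Error("DATABASE_URL is required - set it in your .env file");
  }
  return {
    databaseUrl,
    ollamaModel: env.OLLAMA_MODEL ?? "qwen2.5-coder:7b-instruct",
  };
}
```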

Usage

  1. Start the chat interface:
npm start
  2. Ask questions about your data in natural language:
Connected to database. You can now ask questions about your data.
Type "exit" to quit.

What would you like to know about your data? Which products generated the most revenue last month?
Analyzing...

[AI will generate and execute a SQL query, then explain the results]
  3. Type "exit" to quit the application.

How It Works

  1. The application connects to your PostgreSQL database through the PostgreSQL MCP server
  2. It loads and caches your database schema
  3. When you ask a question:
    • The schema and question are sent to Ollama
    • Ollama generates an appropriate SQL query
    • The query is executed through MCP
    • Results are sent back to Ollama for interpretation
    • You receive a natural language response
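The first two steps of that loop can be sketched as plain string handling: building the prompt that pairs the cached schema with the user's question, and pulling the generated SQL back out of the model's reply. Both functions and the prompt wording below are assumptions for illustration, not the project's actual code:

```typescript
// Hypothetical prompt construction: the cached schema and the user's
// question are combined into one instruction for the model.
function buildPrompt(schema: string, question: string): string {
  return [
    "You are a SQL assistant. Answer with a single SELECT statement.",
    `Database schema:\n${schema}`,
    `Question: ${question}`,
  ].join("\n\n");
}

// Pull the first fenced SQL block out of the model's reply, falling
// back to the raw text when no fence is present.
function extractSql(reply: string): string {
  const match = reply.match(/`{3}sql\s*([\s\S]*?)`{3}/i);
  return (match ? match[1] : reply).trim();
}
```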

Environment Variables

Variable       Description                    Default
DATABASE_URL   PostgreSQL connection string   Required
OLLAMA_MODEL   Ollama model to use            qwen2.5-coder:7b-instruct

Security

  • All database access is read-only
  • SQL queries are restricted to SELECT statements
  • Database credentials are kept secure in your .env file
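A simplified sketch of the kind of guard the SELECT-only restriction implies: accept a single statement that begins with SELECT (or a WITH clause leading into one) and refuse everything else. This is an illustration of the idea, not the project's actual validation logic:

```typescript
// Hypothetical read-only guard: one statement, starting with
// SELECT or WITH (a CTE that resolves to a SELECT).
function isReadOnlyQuery(sql: string): boolean {
  // Drop a single trailing semicolon and surrounding whitespace.
  const normalized = sql.trim().replace(/;\s*$/, "");
  // Any remaining semicolon means multiple statements - reject.
  if (normalized.includes(";")) return false;
  return /^(select|with)\b/i.test(normalized);
}
```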

Development

Built with:

  • TypeScript
  • Model Context Protocol (MCP)
  • Ollama
  • PostgreSQL

Troubleshooting

Common Issues

  1. "Failed to connect to database"

    • Check your DATABASE_URL in .env
    • Verify PostgreSQL is running
    • Check network connectivity
  2. "Failed to connect to Ollama"

    • Ensure Ollama is running (ollama serve)
    • Verify the model is installed (ollama list)
  3. "Error executing query"

    • Check database permissions
    • Verify table/column names in the schema

License

MIT

Contributing

  1. Fork the repository
  2. Create your feature branch
  3. Commit your changes
  4. Push to the branch
  5. Open a Pull Request

ollama-mcp-db FAQ

How do I set up ollama-mcp-db?
Clone the repo, install dependencies with npm, pull the Ollama model qwen2.5-coder:7b-instruct, and configure your PostgreSQL DATABASE_URL in a .env file.
What are the prerequisites for running ollama-mcp-db?
You need Node.js 16+, a running PostgreSQL database, Ollama installed locally, and the qwen2.5-coder:7b-instruct model pulled in Ollama.
Is the database access read-only?
Yes, ollama-mcp-db provides secure, read-only access to your PostgreSQL database to prevent unintended data modifications.
Can I customize the Ollama model used?
Yes, you can pull and configure different Ollama models compatible with your use case, though qwen2.5-coder:7b-instruct is recommended.
How does ollama-mcp-db generate SQL queries?
It uses Ollama's LLM capabilities to translate natural language questions into SQL queries dynamically, leveraging schema awareness.
Is the chat interface interactive?
Yes, the server offers an interactive chat interface for iterative querying and data exploration.
Can ollama-mcp-db be integrated with other MCP clients?
Yes, as an MCP server, it can be integrated with any MCP client that supports database query workflows.
What security measures are in place?
The server restricts database access to read-only operations and requires secure environment variable configuration for connection strings.
Does ollama-mcp-db support other databases besides PostgreSQL?
Currently, it is designed specifically for PostgreSQL databases.
Can I use this with other LLM providers?
While designed for Ollama, the MCP protocol allows integration with other LLM providers like OpenAI, Claude, and Gemini through compatible servers.