podman-mcp-server

MCP.Pizza Chef: manusa

Podman MCP Server is a robust Model Context Protocol server that exposes container runtime data from Podman and Docker environments. It enables LLMs and AI agents to access real-time container states, configurations, and metadata, facilitating advanced container management, monitoring, and automation workflows within MCP-enabled applications.

Use this MCP server to

- Expose real-time container status and metadata to LLMs
- Enable AI-driven container lifecycle management
- Integrate container runtime data into developer IDEs
- Automate container monitoring and alerting workflows
- Provide context for AI-assisted container debugging
- Support multi-container orchestration insights for AI agents
- Feed container logs and metrics into AI analysis tools

README

Podman MCP Server


✨ Features | 🚀 Getting Started | 🎥 Demos | ⚙️ Configuration | 🧑‍💻 Development

✨ Features

A powerful and flexible MCP server for container runtimes supporting Podman and Docker.

🚀 Getting Started

Claude Desktop

Using npx

If you have npm installed, this is the fastest way to get started with podman-mcp-server on Claude Desktop.

Open your claude_desktop_config.json and add the MCP server to the mcpServers list:

{
  "mcpServers": {
    "podman": {
      "command": "npx",
      "args": [
        "-y",
        "podman-mcp-server@latest"
      ]
    }
  }
}
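The location of claude_desktop_config.json depends on your platform. The paths below are the defaults documented for Claude Desktop, shown here only as a convenience; they are not part of this project:

```shell
# Open the Claude Desktop config in your default editor.
# macOS default location:
open "$HOME/Library/Application Support/Claude/claude_desktop_config.json"
# Windows default location (run in PowerShell instead):
#   notepad "$env:APPDATA\Claude\claude_desktop_config.json"
```

Restart Claude Desktop after editing the file so the new server is picked up.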

VS Code / VS Code Insiders

Install the Podman MCP server extension in VS Code Insiders by clicking the following link:

Install in VS Code Insiders

Alternatively, you can install the extension manually by running the following command:

# For VS Code
code --add-mcp '{"name":"podman","command":"npx","args":["-y","podman-mcp-server@latest"]}'
# For VS Code Insiders
code-insiders --add-mcp '{"name":"podman","command":"npx","args":["-y","podman-mcp-server@latest"]}'

Goose CLI

Goose CLI is the easiest (and cheapest) way to get rolling with artificial intelligence (AI) agents.

Using npm

If you have npm installed, this is the fastest way to get started with podman-mcp-server.

Open your Goose config.yaml and add the MCP server to the extensions list:

extensions:
  podman:
    command: npx
    args:
      - -y
      - podman-mcp-server@latest

🎥 Demos

βš™οΈ Configuration

The Podman MCP server can be configured using command-line (CLI) arguments.

You can run the CLI executable either by using npx or by downloading the latest release binary.

# Run the Podman MCP server using npx (if you have npm installed)
npx podman-mcp-server@latest --help
# Run the Podman MCP server using the latest release binary
./podman-mcp-server --help

Configuration Options

Option Description
--sse-port Starts the MCP server in Server-Sent Event (SSE) mode and listens on the specified port.
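The --sse-port option can be exercised like this (the port value below is an arbitrary example):

```shell
# Start the MCP server in SSE mode, listening on port 8080
npx -y podman-mcp-server@latest --sse-port 8080
```

Without this option, the server communicates over stdio, which is what the Claude Desktop, VS Code, and Goose configurations above rely on.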

πŸ§‘β€πŸ’» Development

Running with mcp-inspector

Compile the project, then run the resulting binary with mcp-inspector to inspect the MCP server.

# Compile the project
make build
# Run the Podman MCP server with mcp-inspector
npx @modelcontextprotocol/inspector@latest $(pwd)/podman-mcp-server

podman-mcp-server FAQ

How do I install the podman-mcp-server?
You can install it globally via npm using 'npm install -g podman-mcp-server', or run it directly with npx.
Does podman-mcp-server support both Podman and Docker?
Yes, it supports both the Podman and Docker container runtimes, providing unified context access.
Can podman-mcp-server be used with multiple MCP hosts?
Yes, it is designed to serve multiple MCP clients and hosts simultaneously.
How does podman-mcp-server handle container security?
It follows MCP principles for secure, scoped, and observable interactions, ensuring safe data exposure.
Is podman-mcp-server compatible with different LLM providers?
Yes, it works with OpenAI, Anthropic Claude, and Google Gemini models via MCP clients.
What kind of container data does podman-mcp-server expose?
It exposes container states, configurations, logs, and runtime metrics in structured form.
How can I contribute to podman-mcp-server development?
Contributions are welcome via GitHub; you can fork the repo, make changes, and submit pull requests.