sauropod

MCP.Pizza Chef: sauropod-io

Sauropod is a local AI agent client that provides a user interface and orchestration layer for managing MCP servers and AI workflows. It can run and connect to multiple MCP servers, integrates with OpenAI-compatible backends such as Ollama, and enables features such as image support and event handling. Sauropod is designed to streamline local AI agent deployment and interaction, with extensible configuration and a roadmap for enhanced capabilities.

Use This MCP Client To

  • Run and manage local AI agents with a unified UI
  • Orchestrate multiple MCP servers and subprocesses
  • Connect to remote MCP servers for distributed workflows
  • Configure AI backends compatible with OpenAI, Claude, or Gemini
  • Enable image input support for AI workflows
  • Manage AI model selection and backend routing
  • Develop and test AI agent workflows locally
  • Integrate MCP tools for extended functionality

README

Sauropod

Local AI agents.

Documentation

See the docs/ directory.

Running Sauropod

See docs/config.md for configuration settings.

Example configuration

~/.config/sauropod/config.toml

# Run the server on port 8080
port = 8080
# Point the backend to an OpenAI-compatible server like Ollama.
backend = "http://localhost:11434"

[default_model]
model = "gemma3:27b"
type = "Gemma3"

[[mcp_servers]]
# Spawn an MCP server as a subprocess controlled by the server.
# Note: use -i rather than -it, since no TTY is attached to a subprocess.
command = ["docker", "run", "-i", "--rm", "markitdown-mcp:latest"]

[[mcp_servers]]
# Connect to a remote MCP server
url = "http://localhost:1234"

Roadmap

  • MCP tools support
  • Image support
  • Events
  • Notifications via Web Push
  • Multiple accounts
  • Secrets management
  • Access policies (possibly using Cedar)
  • Automatically generated SDKs for workflows

Build from source

Dependencies

Build a release

make release

The binary will be created in target/optimized-release/sauropod-server.

License

Most of the code is licensed under AGPL.

The code required to build custom clients - such as the schemas, client APIs, and OpenAPI specification - is licensed under Apache-2.0.

Sauropod FAQ

How do I configure Sauropod to connect to an AI backend?
You configure Sauropod by editing the config.toml file to specify the backend URL of an OpenAI-compatible server such as Ollama, or another supported provider like Gemini or Claude.
Can Sauropod run multiple MCP servers simultaneously?
Yes, Sauropod supports spawning MCP servers as subprocesses and connecting to multiple remote MCP servers concurrently.
Does Sauropod support image inputs for AI agents?
Yes, Sauropod includes support for image inputs to enhance AI workflows.
Is Sauropod limited to OpenAI models only?
No, Sauropod supports any OpenAI-compatible backend, including providers like Ollama, Claude, and Gemini.
How can I extend Sauropod with new MCP tools?
You can add MCP servers or tools by declaring them in config.toml; Sauropod's orchestration layer then manages them alongside your other servers.
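As a sketch, each additional [[mcp_servers]] entry in ~/.config/sauropod/config.toml registers one more server; the image name and port below are placeholders, not real tools shipped with Sauropod:

```toml
# Each [[mcp_servers]] entry registers one server.

[[mcp_servers]]
# Spawn a tool server as a subprocess (placeholder image name).
command = ["docker", "run", "-i", "--rm", "my-tools-mcp:latest"]

[[mcp_servers]]
# Or connect to an already-running remote MCP server (placeholder port).
url = "http://localhost:9000"
```

This mirrors the two connection styles in the example configuration above: subprocess servers use `command`, remote servers use `url`.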
What platforms does Sauropod run on?
Sauropod is designed to run locally on systems that support Docker and standard Linux/Unix environments.
Are there plans for notifications and secrets management?
Yes, the roadmap includes adding notifications via Web Push and secrets management features.
How do I get started with Sauropod?
Refer to the docs directory and config.md for detailed setup and configuration instructions.