A magical tool for using local LLMs with MCP servers
Tome is the simplest way to get started with local LLMs and MCP. Tome manages your MCP servers, so there's no fiddling with uv/npm or JSON files: connect it to Ollama, find an MCP server via our Smithery marketplace integration (or paste your own uvx/npx command), and chat with an MCP-powered model in seconds.
This is a Technical Preview so bear in mind things will be rough around the edges. Join us on Discord to share tips, tricks, and issues you run into. Star this repo to stay on top of updates and feature releases!
- Instant connection to Ollama (local or remote) for model management
- Chat with MCP-powered models, customize context window and temperature
- Install MCP servers by pasting in a command (e.g., `uvx mcp-server-fetch`) or through the built-in Smithery marketplace, which offers thousands of servers via a single click
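For the curious: chatting with an Ollama model ultimately comes down to a POST to Ollama's `/api/chat` endpoint, where the context window and temperature settings travel in the request's `options` field. A minimal sketch of that request body (the model name and values here are illustrative, not Tome's internals):

```python
import json

def build_chat_request(model, messages, temperature=0.7, num_ctx=8192):
    """Build the JSON body for Ollama's /api/chat endpoint.

    `temperature` and `num_ctx` (context window size) go under
    "options" -- the same knobs Tome exposes in its chat settings.
    """
    return {
        "model": model,
        "messages": messages,
        "stream": False,
        "options": {
            "temperature": temperature,
            "num_ctx": num_ctx,
        },
    }

body = build_chat_request(
    "qwen3:8b",  # illustrative model name
    [{"role": "user", "content": "Fetch the top story on Hacker News."}],
    temperature=0.6,
    num_ctx=16384,
)
print(json.dumps(body, indent=2))
```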
- macOS (Sequoia 15.0 or higher recommended. Windows and Linux support coming soon!)
- Ollama (Either local or remote, you can configure any Ollama URL in settings)
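If you're not sure whether your Ollama instance is reachable (whether it's the local default or a remote URL you've configured), Ollama exposes a `GET /api/tags` endpoint that lists installed models. A small helper for building that URL, using Ollama's default local address as an example:

```python
from urllib.parse import urljoin

def tags_url(base_url: str) -> str:
    """URL of the endpoint that lists models installed on an Ollama server."""
    return urljoin(base_url.rstrip("/") + "/", "api/tags")

# Works for a default local install or any remote URL you set in settings.
print(tags_url("http://localhost:11434"))
# To actually query it (requires a running Ollama), open the URL with
# urllib.request.urlopen and read the "models" list from the JSON response.
```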
- Download the latest release of Tome
- Install Tome and Ollama
- Install a tool-supported model (we're partial to Qwen3, either 14B or 8B depending on your RAM)
- Open the MCP tab in Tome and install your first MCP server (Fetch is an easy one to get started with; just paste `uvx mcp-server-fetch` into the server field)
- Chat with your MCP-powered model! Ask it to fetch the top story on Hacker News.
We want to make local LLMs and MCP accessible to everyone. We're building a tool that allows you to be creative with LLMs, regardless of whether you're an engineer, tinkerer, hobbyist, or anyone in between.
- Tome is local-first: You are in control of where your data goes.
- Tome is for everyone: You shouldn't have to manage programming languages, package managers, or json config files.
- Model support: Currently Tome uses Ollama for model management, but we'd like to expand support to other LLM engines and possibly even cloud models. Let us know if you have any requests!
- Operating system support: We're planning on adding support for Windows, followed by Linux.
- App builder: We believe that, long term, the best experiences won't be confined to a chat interface. We have plans to add tools that will enable you to create powerful applications and workflows.
- ??? Let us know what you'd like to see! Join our community via the links below; we'd love to hear from you.