open-imi

MCP.Pizza Chef: ProjectAI00

Open Imi is an open source MCP client designed as a Claude Desktop alternative for developers, engineers, and tech teams. It lets users hack, customize, and experiment with MCPs and AI agents, and connects multiple AI providers such as OpenAI, Anthropic, Google, and Ollama. Built with modern frameworks like Next.js and Vercel's AI SDK, it runs locally or on personal servers with minimal setup and supports file-based MCP management for easy integration of AI tools.

Use This MCP Client To

  • Customize AI agents for specific workflows
  • Experiment with multiple AI providers in one client
  • Run MCP clients locally without complex setup
  • Integrate and manage AI tools via file-based MCP
  • Develop and test AI-enhanced workflows
  • Hack and extend MCP capabilities for teams
  • Connect diverse AI models through a unified client

README

Open IMI

Open Imi is an open source Claude Desktop alternative for developers, engineers, and tech teams to hack MCPs and agents to their own liking.

It supports multiple AI model providers (OpenAI, Anthropic, Google, Ollama, etc.) while connecting powerful AI tools through the Model Context Protocol (MCP).

This project was developed using mcp-client-chatbot by cgoinglove (https://github.com/cgoinglove) and Vercel's open source libraries such as Next.js, the AI SDK, and shadcn/ui, and it is designed to run immediately in local environments or on personal servers without complex setup. You can easily add and experiment with AI tools through file-based MCP management.

Installation

This project uses pnpm as the recommended package manager.

Quick Start

# Install dependencies
pnpm i

# Initialize the project (creates .env file from .env.example and sets up the database)
pnpm initial

# Start the development server
pnpm dev

After running these commands, you can access the application at http://localhost:3000.

Environment Setup

After running pnpm initial, make sure to edit your .env file to add the necessary API keys for the providers you want to use:

GOOGLE_GENERATIVE_AI_API_KEY=****
OPENAI_API_KEY=****

By default, the application uses SQLite for data storage. If you prefer to use PostgreSQL, you can modify the USE_FILE_SYSTEM_DB value in your .env file and set up your database connection string.
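For example, switching to PostgreSQL could look like the following in .env. USE_FILE_SYSTEM_DB comes from the project itself; the POSTGRES_URL name and the placeholder connection string are assumptions, so check .env.example for the exact key your version expects:

# Switch off the file-based SQLite storage (variable from the project)
USE_FILE_SYSTEM_DB=false
# Assumed variable name for the PostgreSQL connection string — verify against .env.example
POSTGRES_URL=postgres://user:password@localhost:5432/open_imi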

Setting Up MCP Servers

You can add MCP servers in three ways:

  1. Using the UI: Navigate to http://localhost:3000/mcp in your browser and use the interface to add and configure MCP servers.
  2. Editing the config file: Directly modify the .mcp-config.json file in the project root directory (an example entry is shown after this list).
  3. Custom server logic: A customizable MCP server is already included in the project at ./custom-mcp-server/index.ts.
    You can modify this file to implement your own server logic or connect external tools as needed (a minimal sketch follows after this list).
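
As an illustration of the file-based approach, an .mcp-config.json entry typically maps a server name to the command used to launch it. The exact field names depend on the project, so the layout and the filesystem server shown here are assumptions rather than the definitive schema:

{
  "filesystem": {
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-filesystem", "./"]
  }
}

For the bundled custom server, ./custom-mcp-server/index.ts can follow the official MCP TypeScript SDK. The sketch below is only an outline built on the assumed dependencies @modelcontextprotocol/sdk and zod, not the project's actual file:

// Minimal MCP server sketch using the official TypeScript SDK (assumed dependency).
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Create the server with a name and version of your choosing.
const server = new McpServer({ name: "custom-mcp-server", version: "0.1.0" });

// Register a simple tool; replace this with your own logic or calls to external tools.
server.tool(
  "echo",
  { message: z.string() },
  async ({ message }) => ({
    content: [{ type: "text", text: `echo: ${message}` }],
  })
);

// Expose the server over stdio so the client can launch it as a subprocess.
const transport = new StdioServerTransport();
await server.connect(transport);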

Credits and Acknowledgements

Massive Shoutout To

  • cgoinglove, whose mcp-client-chatbot (https://github.com/cgoinglove) this project builds on
  • The Vercel open source ecosystem: Next.js, the AI SDK, and shadcn/ui

License

MIT license

Please refer to the respective repositories for more details on licensing.

What is next?

  • Chat and file search.
  • Multi-page workflows.
  • Project management page.

open-imi FAQ

How do I install Open Imi?
Use pnpm to install dependencies with 'pnpm i' and follow the quick start guide in the README.
Can Open Imi connect to multiple AI providers?
Yes, it supports OpenAI, Anthropic, Google, Ollama, and others via MCP.
Is Open Imi suitable for local development?
Yes, it is designed to run immediately in local environments or personal servers without complex setup.
How does Open Imi manage AI tools?
It uses file-based MCP management to easily add and experiment with AI tools.
What technologies is Open Imi built on?
It is built using Next.js, Vercel's AI SDK, and shadcn/ui for a modern developer experience.
Can I customize MCPs and agents with Open Imi?
Yes, it is specifically designed for hacking and customizing MCPs and AI agents.
Does Open Imi support running on personal servers?
Yes, it can run on personal servers as well as local machines.
Is Open Imi open source?
Yes, it is an open source project available for developers and teams.