saiki

MCP.Pizza Chef: truffle-ai

Saiki is a versatile, customizable AI agent client that supports the Model Context Protocol (MCP). It enables users to interact with and control computers, applications, and services using natural language commands. Saiki connects once and allows seamless command execution across multiple tools and environments, making it a powerful interface for AI-driven automation and control workflows.

Use This MCP Client To

  • Control multiple applications via natural language commands
  • Automate workflows by connecting various tools and services
  • Interact with computer systems using conversational AI
  • Integrate with MCP servers to orchestrate complex tasks
  • Use CLI mode for interactive AI-driven command execution
  • Customize AI agent behavior for specific user needs

README

Saiki

Status: Beta · License: Elastic License 2.0 · Discord

Use natural language to control your tools, apps, and services — connect once, command everything.

Saiki Demo

Installation

Global (npm)

npm install -g @truffle-ai/saiki
Build & Link from source
git clone https://github.com/truffle-ai/saiki.git
cd saiki
npm install
npm run build
npm link

After linking, the saiki command becomes available globally.

Quick Start

CLI Mode

Invoke the interactive CLI:

saiki
Alternative: without global install

You can also run directly via npm:

npm start

Web UI Mode

Serve the experimental web interface:

saiki --mode web --web-port 3000
Alternative: without global install
npm start -- --mode web --web-port 3000

Open http://localhost:3000 in your browser.

Bot Modes

Run Saiki as a Discord or Telegram bot.

Discord Bot:

saiki --mode discord

Make sure DISCORD_BOT_TOKEN is set in your environment. See the docs for setup details.

Telegram Bot:

saiki --mode telegram

Make sure TELEGRAM_BOT_TOKEN is set in your environment. See the docs for setup details.
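Both bot modes read their tokens from environment variables, so a minimal setup looks like the sketch below. The token values are placeholders, not real credentials; obtain real ones from the Discord developer portal and Telegram's BotFather.

```shell
# Placeholder values: replace with your real bot tokens.
export DISCORD_BOT_TOKEN="your-discord-bot-token"
export TELEGRAM_BOT_TOKEN="your-telegram-bot-token"

# The variables must be set in the same shell that launches saiki,
# e.g. before running `saiki --mode discord` or `saiki --mode telegram`.
```

To persist them across sessions, add the exports to your shell profile instead of typing them each time.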

Overview

Saiki is a flexible, modular AI agent that lets you perform tasks across your tools, apps, and services using natural language. You describe what you want to do — Saiki figures out which tools to invoke and orchestrates them seamlessly.

Why developers choose Saiki:

  1. Open & Extensible: Connect to any service via the Model Context Protocol (MCP). Drop in pre-built servers for GitHub, filesystem, terminal, or build your own.
  2. AI-Powered Orchestration: Natural language tasks are parsed into multi-step tool calls executed in the correct sequence.
  3. Multi-Interface Support: Use via CLI, wrap it in a web UI, or integrate into other systems – AI logic is decoupled from UI concerns.
  4. Production-Ready: Robust error handling, structured logging, and pluggable LLM providers (OpenAI, Anthropic, Google) ensure reliability.

Saiki is the missing natural language layer across your stack. Whether you're automating workflows, building agents, or prototyping new ideas, Saiki gives you the tools to move fast — and bend it to your needs. Interact with Saiki via the command line or the new experimental web UI.

Ready to jump in? Follow the Installation guide or explore demos below.

Examples & Demos

🛒 Amazon Shopping Assistant

Task: Can you go to amazon and add some snacks to my cart? I like trail mix, cheetos and maybe surprise me with something else?

# Use the default config, which includes Puppeteer for browser navigation
saiki
Saiki: Amazon shopping agent demo

📧 Email Summary to Slack

Task: Summarize emails and send highlights to Slack

saiki --config-file ./configuration/examples/email_slack.yml

Email to Slack Demo

🎨 AI Website Designer

Task: Design a landing page based on README.md

saiki --config-file ./configuration/examples/website_designer.yml

Website Designer Demo

For more examples, see the Examples section in the docs.

CLI Reference

The saiki command supports several options to customize its behavior. Run saiki --help for the full list.

Common Examples:

  • Specify a custom configuration file:

    cp configuration/saiki.yml configuration/custom_config.yml
    saiki --config-file configuration/custom_config.yml
  • Use a specific AI model (if configured):

    saiki -m gemini-2.5-pro-exp-03-25

Configuration

Saiki uses a YAML config file (configuration/saiki.yml by default) to configure tool servers (MCP servers) and LLM providers.

mcpServers:
  filesystem:
    type: stdio
    command: npx
    args:
      - -y
      - "@modelcontextprotocol/server-filesystem"
      - .
  puppeteer:
    type: stdio
    command: npx
    args:
      - -y
      - "@truffle-ai/puppeteer-server"

llm:
  provider: openai
  model: gpt-4.1-mini
  apiKey: $OPENAI_API_KEY
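Because LLM providers are pluggable, switching the agent to a different provider is a config-only change. A sketch, assuming Anthropic is configured with the same keys as the OpenAI block above (the model identifier is illustrative; check the docs for the names Saiki accepts):

```yaml
llm:
  provider: anthropic
  model: claude-3-5-sonnet-20240620   # illustrative model id, verify against the docs
  apiKey: $ANTHROPIC_API_KEY          # assumes the same $VAR env substitution as above
```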

Discovering & Connecting MCP Servers

Saiki communicates with your tools via Model Context Protocol (MCP) servers. You can discover and connect to MCP servers in several ways:

  1. Browse pre-built servers.

  2. Search on npm:

npm search @modelcontextprotocol/server

  3. Add servers to your configuration/saiki.yml under the mcpServers key (see the snippet above).

  4. Create custom servers.
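Wiring in one of the pre-built servers is just another mcpServers entry. A sketch, assuming the GitHub server is published under the same npm namespace as the filesystem server (package name unverified):

```yaml
mcpServers:
  github:
    type: stdio
    command: npx
    args:
      - -y
      - "@modelcontextprotocol/server-github"   # assumed package name; mirrors the filesystem entry
```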

Documentation

Find detailed guides, architecture, and API reference in the docs/ folder.

Contributing

We welcome contributions! Here's how to get started:

  1. Fork the repository to your GitHub account.
  2. Clone your fork:
    git clone https://github.com/your-username/saiki.git
    cd saiki
  3. Create a new feature branch:
    git checkout -b feature/your-branch-name
  4. Make your changes:
    • Follow existing TypeScript and code style conventions.
    • Run npm run lint:fix and npm run format before committing.
    • Add or update tests for new functionality.
  5. Commit and push your branch:
    git commit -m "Brief description of changes"
    git push origin feature/your-branch-name
  6. Open a Pull Request against the main branch with a clear description of your changes.

Tip: Open an issue first for discussion on larger enhancements or proposals.

Community & Support

Saiki was built by the team at Truffle AI.

Saiki is better with you! Join our Discord whether you want to say hello, share your projects, ask questions, or get help setting things up:

Join our Discord server

If you're enjoying Saiki, please give us a ⭐ on GitHub!

License

Elastic License 2.0. See LICENSE for details.

Contributors

Thanks to all these amazing people for contributing to Saiki! (full list):


saiki FAQ

How do I install Saiki globally?
You can install Saiki globally using npm with the command `npm install -g @truffle-ai/saiki`.
Can Saiki be customized?
Yes, Saiki is highly customizable to fit different user workflows and preferences.
How does Saiki interact with other applications?
Saiki uses natural language commands to control connected tools, apps, and services via MCP.
Is Saiki limited to a specific platform?
No, Saiki is designed to work across various platforms where MCP is supported.
How do I start using Saiki in CLI mode?
After installation, run the `saiki` command in your terminal to start the interactive CLI.
What license governs Saiki?
Saiki is licensed under the Elastic License 2.0.
Where can I get support or join the community?
You can join the Saiki Discord community via the invite link in the GitHub repository.
Does Saiki support real-time command execution?
Yes, Saiki supports real-time interaction and command execution through natural language input.