Use natural language to control your tools, apps, and services — connect once, command everything.
Global (npm)
npm install -g @truffle-ai/saiki
Build & Link from source
git clone https://github.com/truffle-ai/saiki.git
cd saiki
npm install
npm run build
npm link
After linking, the saiki command becomes available globally.
Invoke the interactive CLI:
saiki
Alternative: without global install
You can also run directly via npm:
npm start
Serve the experimental web interface:
saiki --mode web --web-port 3000
Alternative: without global install
npm start -- --mode web --web-port 3000
Open http://localhost:3000 in your browser.
Run Saiki as a Discord or Telegram bot.
Discord Bot:
saiki --mode discord
Make sure you have DISCORD_BOT_TOKEN set in your environment. See the docs for more details.
Telegram Bot:
saiki --mode telegram
Make sure you have TELEGRAM_BOT_TOKEN set in your environment. See the docs for more details.
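For example, launching the Discord bot from a shell might look like this (the token value is a placeholder; manage real secrets however you normally do):
# replace with the bot token from your Discord application
export DISCORD_BOT_TOKEN=your-bot-token
saiki --mode discord
The Telegram bot works the same way, using TELEGRAM_BOT_TOKEN.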
Saiki is a flexible, modular AI agent that lets you perform tasks across your tools, apps, and services using natural language. You describe what you want to do — Saiki figures out which tools to invoke and orchestrates them seamlessly.
Why developers choose Saiki:
- Open & Extensible: Connect to any service via the Model Context Protocol (MCP). Drop in pre-built servers for GitHub, filesystem, terminal, or build your own.
- AI-Powered Orchestration: Natural language tasks are parsed into multi-step tool calls executed in the correct sequence.
- Multi-Interface Support: Use via CLI, wrap it in a web UI, or integrate into other systems – AI logic is decoupled from UI concerns.
- Production-Ready: Robust error handling, structured logging, and pluggable LLM providers (OpenAI, Anthropic, Google) ensure reliability.
Saiki is the missing natural language layer across your stack. Whether you're automating workflows, building agents, or prototyping new ideas, Saiki gives you the tools to move fast and the flexibility to bend it to your needs. Interact with Saiki via the command line or the new experimental web UI.
Ready to jump in? Follow the Installation guide or explore demos below.
Task: Can you go to amazon and add some snacks to my cart? I like trail mix, cheetos and maybe surprise me with something else?
# Use default config which supports puppeteer for navigating the browser
saiki

Task: Summarize emails and send highlights to Slack
saiki --config-file ./configuration/examples/email_slack.yml
Task: Design a landing page based on README.md
saiki --config-file ./configuration/examples/website_designer.yml
For more examples, see the Examples section in the docs.
The saiki command supports several options to customize its behavior. Run saiki --help for the full list.
Common Examples:
- Specify a custom configuration file:
cp configuration/saiki.yml configuration/custom_config.yml
saiki --config-file configuration/custom_config.yml
- Use a specific AI model (if configured):
saiki -m gemini-2.5-pro-exp-03-25
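In practice these flags can be combined; assuming both options are accepted together (check saiki --help), a run with a custom config and a specific model would look like:
saiki --config-file configuration/custom_config.yml -m gemini-2.5-pro-exp-03-25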
Saiki uses a YAML config file (configuration/saiki.yml by default) to configure tool servers (MCP servers) and LLM providers.
mcpServers:
  filesystem:
    type: stdio
    command: npx
    args:
      - -y
      - "@modelcontextprotocol/server-filesystem"
      - .
  puppeteer:
    type: stdio
    command: npx
    args:
      - -y
      - "@truffle-ai/puppeteer-server"
llm:
  provider: openai
  model: gpt-4.1-mini
  apiKey: $OPENAI_API_KEY
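The apiKey entry references the OPENAI_API_KEY environment variable, so export your key before starting Saiki (the value below is a placeholder):
export OPENAI_API_KEY=your-openai-api-key
saiki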
Saiki communicates with your tools via Model Context Protocol (MCP) servers. You can discover and connect to MCP servers in several ways:
- Browse pre-built servers:
- Model Context Protocol reference servers: https://github.com/modelcontextprotocol/reference-servers
- Smithery.ai catalog: https://smithery.ai/tools
- Composio MCP registry: https://mcp.composio.dev/
- Search on npm:
npm search @modelcontextprotocol/server
- Add servers to your configuration/saiki.yml under the mcpServers key (see the snippet above).
- Create custom servers (see the sketch below):
- Use the MCP TypeScript SDK: https://github.com/modelcontextprotocol/typescript-sdk
- Follow the MCP spec: https://modelcontextprotocol.io/introduction
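As a rough sketch of what a custom server can look like, here is a minimal tool server built with the MCP TypeScript SDK; the server name, tool name, and greeting logic are illustrative, so treat the SDK repo linked above as the authoritative reference:
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Declare the server and the tools it exposes to MCP clients like Saiki.
const server = new McpServer({ name: "hello-server", version: "0.1.0" });

// A trivial tool: the agent can call it by name with a string argument.
server.tool("say_hello", { name: z.string() }, async ({ name }) => ({
  content: [{ type: "text", text: `Hello, ${name}!` }],
}));

// Speak MCP over stdio, matching the `type: stdio` entries in saiki.yml.
await server.connect(new StdioServerTransport());
Once compiled, the server can be wired into configuration/saiki.yml as another stdio entry under mcpServers, with command pointing at node and args pointing at the built file.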
Find detailed guides, architecture, and API reference in the docs/ folder:
- High-level design — docs/architecture.md
- Docker usage — README.Docker.md
We welcome contributions! Here's how to get started:
- Fork the repository to your GitHub account.
- Clone your fork:
git clone https://github.com/your-username/saiki.git
cd saiki
- Create a new feature branch:
git checkout -b feature/your-branch-name
- Make your changes:
- Follow existing TypeScript and code style conventions.
- Run npm run lint:fix and npm run format before committing.
- Add or update tests for new functionality.
- Commit and push your branch:
git commit -m "Brief description of changes"
git push origin feature/your-branch-name
- Open a Pull Request against the main branch with a clear description of your changes.
Tip: Open an issue first for discussion on larger enhancements or proposals.
Saiki was built by the team at Truffle AI.
Saiki is better with you! Join our Discord whether you want to say hello, share your projects, ask questions, or get help setting things up:
If you're enjoying Saiki, please give us a ⭐ on GitHub!
Elastic License 2.0. See LICENSE for details.
Thanks to all these amazing people for contributing to Saiki! (full list):