y-cli

MCP.Pizza Chef: luohy15

y-cli is a compact command-line chat application for AI conversations directly in your terminal. It supports flexible storage options, including local JSONL files and Cloudflare KV/R2 for cloud backup. y-cli provides interactive chat with tool execution visualization and supports multiple bot configurations using various API formats such as OpenAI and Dify. It also integrates reasoning models and fully supports the Model Context Protocol (MCP) for seamless AI model orchestration and interaction.

Use This MCP client To

  • Chat with AI models directly from the terminal
  • Store and sync chat history locally or in the cloud
  • Visualize tool execution during AI conversations
  • Configure multiple AI bots with different APIs and models
  • Integrate reasoning models for advanced AI responses
  • Use MCP to manage AI context and tool interactions

README

y-cli πŸš€

A tiny command-line interface chat application that brings AI conversations to your terminal.

Check out y-gui for a web-based version of y-cli.

✨ Features

  • πŸ“ Flexible storage options:
    • Local JSONL files for easy access and sync (see the example record after this list)
    • Cloudflare KV and R2 for cloud storage and backup
  • πŸ’¬ Interactive chat interface with tool execution visualization
  • πŸ€– Support for multiple bot configurations (any base_url/api_key/model combination). Supported API format types:
    • OpenAI chat completion streaming format
    • Dify chat-messages streaming format
  • πŸ€” Support for reasoning models (e.g., DeepSeek-R1, OpenAI o3-mini)
  • πŸ”— MCP (Model Context Protocol) integration:
    • Client support with multiple server configurations (stdio/SSE)
    • Persistent daemon
    • Custom prompt configurations
  • 🧐 Simple "Deep Research" mode by prompt configuration
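
For illustration, a chat record stored as local JSONL might look like the following line. The file location and field names here are assumptions for the sketch, not y-cli's exact schema:

{"id": "abc123", "timestamp": "2024-01-01T12:00:00Z", "messages": [{"role": "user", "content": "hello"}, {"role": "assistant", "content": "Hi! How can I help?"}]}

One JSON object per line keeps the history easy to inspect, grep, and sync with plain-text tools.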

Demo

(Demo GIF and asciicast recording not shown here.)

Multiple bot configurations

➜  ~ y-cli bot list
Name       API Key      API Type  Base URL                             Model                                Print Speed  Description  OpenRouter Config  MCP Servers  Reasoning Effort
---------  -----------  --------  -----------------------------------  -----------------------------------  -----------  -----------  -----------------  -----------  ----------------
default    sk-or-v1...  N/A       https://gateway.ai.cloudflare.co...  google/gemini-2.0-flash-001          None         N/A          Yes                No           N/A
claude     sk-or-v1...  N/A       https://gateway.ai.cloudflare.co...  anthropic/claude-3.7-sonnet:beta     None         N/A          Yes                todo         N/A
o3-mini    sk-or-v1...  N/A       https://gateway.ai.cloudflare.co...  openai/o3-mini                       None         N/A          Yes                No           low
ds-chat    sk-or-v1...  N/A       https://gateway.ai.cloudflare.co...  deepseek/deepseek-chat-v3-0324:free  None         N/A          Yes                tavily       N/A
dify-bot   app-2drF...  dify      https://api.dify.ai/v1                                                    None         N/A          No                 No           N/A
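
Bot entries can be added or removed with the bot subcommands listed under Usage below. A minimal sketch, assuming the add and delete commands gather their details interactively (exact prompts and any option flags may differ):

y-cli bot add      # register a new base_url/api_key/model combination
y-cli bot list     # verify the new entry appears in the table above
y-cli bot delete   # remove a configuration that is no longer needed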

Multiple MCP servers

➜  ~ y-cli mcp list
Name            Type    Command/URL          Arguments/Token    Environment     Auto-Confirm
--------------  ------  -------------------  -----------------  --------------  --------------
brave-search    sse     https://router.m...                                     brave_web_s...
todo            stdio   uvx                  mcp-todo
exa-mcp-server  stdio   npx                  exa-mcp-server     EXA_API_KEY...
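
MCP servers are registered with the mcp subcommands, and the persistent daemon keeps configured servers available across chats. A minimal sketch using only the documented subcommands; how each command collects its arguments is assumed to be interactive and may differ:

y-cli mcp add        # register a stdio or SSE server configuration
y-cli daemon start   # start the persistent MCP daemon
y-cli daemon status  # confirm the daemon is running
y-cli chat           # tools from configured MCP servers are available during the conversation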

⚑ Quick Start

Prerequisites

Required:

  1. uv
  2. OpenRouter API key

Run without Installation

uvx y-cli

Install with uv tool

uv tool install y-cli

Initialize

y-cli init

Start Chat

y-cli chat

πŸ› οΈ Usage

y-cli [OPTIONS] COMMAND [ARGS]...

Commands

  • chat Start a new chat conversation or continue an existing one
  • list List chat conversations with optional filtering
  • share Share a chat conversation by generating a shareable link
  • bot Manage bot configurations:
    • add Add a new bot configuration
    • list List all configured bots
    • delete Delete a bot configuration
  • mcp Manage MCP server configurations:
    • add Add a new MCP server configuration
    • list List all configured MCP servers
    • delete Delete an MCP server configuration
  • daemon Manage the MCP daemon:
    • start Start the MCP daemon
    • stop Stop the MCP daemon
    • status Check daemon status
    • log View daemon logs
    • restart Restart the daemon
  • prompt Manage prompt configurations:
    • add Add a new prompt configuration
    • list List all configured prompts
    • delete Delete a prompt configuration
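
A typical session might combine these commands as follows. This is only a sketch built from the commands listed above; arguments and interactive prompts may differ:

y-cli daemon start   # make configured MCP servers available to chats
y-cli chat           # start a new conversation
y-cli list           # review past conversations
y-cli share          # generate a shareable link for a conversation
y-cli daemon stop    # shut the daemon down when finished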

Options

  • --help Show help message and exit

πŸ“š Documentation

Visit the deepwiki page for comprehensive project documentation and guides.

y-cli FAQ

How does y-cli handle chat history storage?
y-cli supports flexible storage with local JSONL files and cloud options like Cloudflare KV and R2 for backup and sync.
Can y-cli support multiple AI models or bots?
Yes, it supports multiple bot configurations with any base_url, api_key, and model combination, including OpenAI and Dify formats.
What API formats does y-cli support for AI interactions?
It supports OpenAI chat completion streaming format and Dify chat-messages streaming format for flexible AI integration.
How does y-cli visualize tool execution?
y-cli provides an interactive chat interface that shows tool execution steps during AI conversations for better transparency.
Does y-cli support reasoning models?
Yes, it supports reasoning models such as DeepSeek-R1 and OpenAI o3-mini for enhanced AI reasoning capabilities.
What is MCP support in y-cli?
y-cli fully supports the Model Context Protocol, enabling structured, real-time context feeding and tool orchestration with AI models.
Is y-cli suitable for cloud and local environments?
Yes, it offers both local file storage and cloud storage options, making it versatile for different deployment scenarios.
Can y-cli be extended or integrated with other MCP tools?
Yes, as an MCP client, y-cli can orchestrate context and tools from various MCP servers and resources seamlessly.