
kuon

MCP.Pizza Chef: lissettecarlr

Kuon is a lightweight, developer-friendly server implementing a large language model-based voice assistant. It emphasizes simplicity and usability, supporting selective memory in conversations and integration with the Model Context Protocol (MCP) to extend the assistant's capabilities. Kuon accepts text input and produces both text and speech output, connecting to the OpenAI API directly and to other LLM providers through an OpenAI-compatible bridge. It is designed for easy onboarding and practical deployment, with planned improvements to TTS, memory handling, and GUI interaction.

Use This MCP server To

  • Enable voice assistant capabilities with LLM integration
  • Provide selective memory in conversational AI
  • Serve as an MCP server for context-aware AI workflows
  • Convert text input to speech output in assistant apps
  • Integrate with OpenAI and other LLM APIs for AI tasks

README

Kuon

Kuon Logo


Kuon (久远) is a large language model voice assistant under development. The previous codebase had grown bloated, so this branch is a rewrite with the focus on ease of use, aiming to make it genuinely practical.

Development Overview

To keep the code simple, local models are no longer bundled; even local deployments should connect to this program through an API. For LLMs, only the OpenAI API is supported directly; models from other vendors can be adapted via oneapi. Voice input has been dropped for now because it was rarely used; whether to add it back will be considered later. MCP is used to extend the assistant's capabilities.

Current features:

  • Memory storage for the LLM
  • Text-based interaction: text input, with text and speech output
  • MCP support

Planned:

  • Improve TTS
  • Optimize memory storage to make stored memories more valuable
  • GUI interaction

Installation and Usage

Environment Setup

  1. Create and activate a conda environment (optional)
conda create -n kuon python=3.10
conda activate kuon
  2. Clone the repository
git clone https://github.com/yourusername/kuon.git
cd kuon
  3. Install dependencies
pip install -r requirements.txt

Configuration

1. API Key Configuration

Chat key (required)

# Windows (PowerShell)
$env:OPENAI_API_KEY = "your OpenAI API key"
$env:OPENAI_BASE_URL = "API base URL"

# Linux/macOS
export OPENAI_API_KEY="your OpenAI API key"
export OPENAI_BASE_URL="API base URL"
  • TTS key (optional; currently only Aliyun TTS is supported, and text-only interaction works without it)
# Windows (PowerShell)
$env:ALIYUN_ACCESS_KEY_ID = ""
# Linux/macOS
export ALIYUN_ACCESS_KEY_ID=""
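Kuon presumably reads these variables through `os.environ`; the helper below is a minimal sketch of that lookup, not Kuon's actual code, and the fallback default for the base URL is an assumption:

```python
import os

def load_llm_credentials():
    """Read the OpenAI-compatible API settings from the environment.

    Fails fast with a clear error when the required key is missing,
    instead of erroring later inside the client library.
    """
    api_key = os.environ.get("OPENAI_API_KEY")
    if not api_key:
        raise RuntimeError("OPENAI_API_KEY is not set")
    # OPENAI_BASE_URL is optional; fall back to the official endpoint.
    base_url = os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")
    return api_key, base_url
```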
2. Configuration File

The config.yaml file in the project root:

tts:
  enabled: true  # whether to enable TTS
  engine: "aliyun"  # TTS engine; currently only "aliyun" is supported

mcp:
  enabled: true  # whether MCP tools are enabled by default
  config_path: "mcp_server/temp_mcp_server.json"  # path to the MCP server configuration file
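A program reading this file would typically overlay the user's values on built-in defaults. The sketch below shows that defaulting logic only; the `merge_config` helper is hypothetical (not part of Kuon), and the default values simply mirror the file above:

```python
# Defaults mirroring the config.yaml keys shown above (assumed, not
# taken from Kuon's source).
DEFAULTS = {
    "tts": {"enabled": True, "engine": "aliyun"},
    "mcp": {"enabled": True, "config_path": "mcp_server/temp_mcp_server.json"},
}

def merge_config(user: dict) -> dict:
    """Overlay user-supplied settings on top of the defaults.

    Only known sections and keys are honored, so a typo in config.yaml
    cannot silently introduce a new setting.
    """
    merged = {}
    for section, defaults in DEFAULTS.items():
        user_section = user.get(section) or {}
        merged[section] = {k: user_section.get(k, v) for k, v in defaults.items()}
    return merged
```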
3. MCP Configuration (optional)

To use MCP features, refer to the mcp_server/temp_mcp_server.json configuration file:

{
    "mcpServers": {
      "general": {
        "type": "stdio",
        "command": "command to run",
        "args": ["command arguments"],
        "env": {
          "OPENWEATHERMAP_API_KEY": "extra environment variable"
        }
      },
      "mcp-hotnews-server": {
        "type": "sse",
        "url": "https://mcp.modelscope.cn/sse/"
      }
    }
}
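Such a configuration mixes two transport types: stdio servers that are launched as subprocesses, and SSE servers reached over HTTP. A hypothetical loader (not Kuon's actual code) might split the entries like this:

```python
import json

def load_mcp_servers(path):
    """Parse the mcpServers section and group entries by transport type.

    Returns two dicts: stdio servers (launched as subprocesses) and
    SSE servers (reached over HTTP). Unknown types are rejected early.
    """
    with open(path, encoding="utf-8") as f:
        servers = json.load(f)["mcpServers"]
    stdio, sse = {}, {}
    for name, spec in servers.items():
        transport = spec.get("type")
        if transport == "stdio":
            stdio[name] = spec
        elif transport == "sse":
            sse[name] = spec
        else:
            raise ValueError(f"unknown MCP transport for {name!r}: {transport}")
    return stdio, sse
```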

Startup

Run the main program:

python kuon.py

After the program starts, type text directly to interact with the AI. Enter "exit" or "quit" to leave the program.
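The interaction loop just described can be sketched as below. All function names are hypothetical; kuon.py's actual structure may differ. The I/O is injected as callables so the loop stays testable:

```python
def should_exit(text: str) -> bool:
    """Return True when the user asked to leave the program."""
    return text.strip().lower() in ("exit", "quit")

def repl(readline, respond, write):
    """Drive the text loop: read a line, stop on exit/quit, else reply.

    `readline` returns the next user line (or None on end of input),
    `respond` maps user text to the assistant's answer, and `write`
    emits the answer.
    """
    while True:
        text = readline()
        if text is None or should_exit(text):
            break
        if text.strip():
            write(respond(text))
```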

Example

The screenshot below shows an actual interaction with the Kuon assistant:

Interaction example

Miscellaneous

Conversation memory is currently stored directly in chat_engines/memory.json and can be edited or deleted as needed, especially when odd entries end up stored there.
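The exact schema of memory.json is not documented here; assuming it holds a JSON list of message entries, a small pruning helper might look like the sketch below (hypothetical, adjust to the real schema before use):

```python
import json

def prune_memory(path, keep_last=50):
    """Trim the stored conversation to its most recent entries.

    Assumes the file contains a JSON list (an assumption about Kuon's
    memory format); leaves the file untouched if it is already short.
    """
    with open(path, encoding="utf-8") as f:
        entries = json.load(f)
    if isinstance(entries, list) and len(entries) > keep_last:
        entries = entries[-keep_last:]
        with open(path, "w", encoding="utf-8") as f:
            json.dump(entries, f, ensure_ascii=False, indent=2)
    return entries
```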

kuon FAQ

How do I install Kuon?
Clone the GitHub repo, create a Python 3.10+ environment, and follow the setup instructions in the README.
Does Kuon support voice input?
Currently, Kuon does not support voice input but outputs speech from text input; voice input may be added later.
How does Kuon handle memory in conversations?
Kuon supports selective memory storage to retain relevant conversational context for improved interactions.
Can Kuon integrate with multiple LLM providers?
Yes, Kuon primarily uses OpenAI APIs but can also connect to other providers like Claude and Gemini via oneapi.
What is the role of MCP in Kuon?
MCP enables Kuon to extend its assistant capabilities by providing structured, real-time context and tool integration.
Is Kuon suitable for production use?
Kuon is under active development focusing on usability; it is suitable for experimentation and prototyping.
What programming language is Kuon built with?
Kuon is developed in Python 3.10+ for ease of use and extensibility.
Does Kuon have a graphical user interface?
Currently, Kuon is command-line based, with GUI interaction planned for future releases.