tuui

MCP.Pizza Chef: AI-QL

Tuui is a client application that accelerates AI tool integration using the Model Context Protocol (MCP) and orchestrates multiple LLM APIs from different vendors. It provides a unified interface for managing AI workflows, supports automated testing, and is built with TypeScript for robust development. Tuui enables seamless cross-vendor LLM API configuration and interaction, making it ideal for developers building AI-enhanced applications and workflows.

Use This MCP client To

  • Integrate multiple AI tools into a single client interface
  • Orchestrate cross-vendor LLM API calls dynamically
  • Manage AI workflows with unified context and tool access
  • Automate testing of AI-driven applications
  • Develop AI applications with TypeScript support
  • Experiment with AI-generated code and project components
  • Ensure syntax and naming convention compliance via linting

README

TUUI

A Tool-Unified User Interface accelerating AI tool integration via MCP (Model Context Protocol) and orchestrating cross-vendor LLM APIs

💡 Introduction

This repository is essentially an LLM chat desktop application based on MCP. It is also a bold experiment in creating a complete project with AI: many components within the project were converted or generated directly from the prototype project by AI.

Given concerns about the quality and safety of AI-generated content, this project enforces strict syntax checks and naming conventions. For any further development, please use the linting tools I've set up to check and automatically fix syntax issues.

✨ Features

  • ✨ Accelerate AI tool integration via MCP
  • ✨ Orchestrate cross-vendor LLM APIs through dynamic configuration
  • ✨ Automated application testing support
  • ✨ TypeScript support
  • ✨ Multilingual support
  • ✨ Basic layout manager
  • ✨ Global state management through the Pinia store
  • ✨ Quick support through the GitHub community and official documentation

📖 Getting Started

To explore the project, please check the wiki page: TUUI.com

You can also check the documentation for the current project by section: Getting Started | 快速入门

For features related to MCP, you'll need to set up your own LLM backend that supports tool calls.

For guidance on configuring the LLM, refer to the template below (e.g., Qwen):

{
  "chatbotStore": {
    "chatbots": [
      {
        "name": "Qwen",
        "apiKey": "",
        "url": "https://dashscope.aliyuncs.com/compatible-mode",
        "path": "/v1/chat/completions",
        "model": "qwen-turbo",
        "modelList": ["qwen-turbo", "qwen-plus", "qwen-max"],
        "maxTokensValue": "",
        "mcp": true
      }
    ]
  }
}

The full config and the corresponding types can be found in: Config Type
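
Read as TypeScript, the fields in the template above correspond roughly to the following shape. This is a sketch inferred from the example; the authoritative definitions are the ones in the linked Config Type.

// Sketch of the chatbot configuration shape, inferred from the Qwen template above.
interface ChatbotConfig {
  name: string // display name, e.g. "Qwen"
  apiKey: string // vendor API key
  url: string // base URL of the OpenAI-compatible endpoint
  path: string // completions path, e.g. "/v1/chat/completions"
  model: string // currently selected model
  modelList: string[] // models selectable in the UI
  maxTokensValue: string // max tokens, left empty in the example above
  mcp: boolean // whether MCP tool calls are enabled for this backend
}

interface ChatbotStore {
  chatbots: ChatbotConfig[]
}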

Once you modify or import the LLM configuration, it will be stored in your localStorage by default. You can use the developer tools to view or clear the corresponding cache.
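
For example, from the developer tools console you can inspect or reset the cached configuration along these lines. The storage key below is illustrative, not taken from the project source; check the Application > Local Storage panel for the key names your build actually uses.

// Inspect the cached chatbot configuration (the key name is hypothetical).
const cached = localStorage.getItem('chatbotStore')
console.log(cached ? JSON.parse(cached) : 'no cached configuration')

// Remove just that entry, or clear everything stored by the app.
localStorage.removeItem('chatbotStore')
// localStorage.clear()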

💄 Demo

MCP primitive visualization

Tool call tracing

Specify tool selection

LLM API setting

Native devtools

Remote MCP server

You can utilize Cloudflare's recommended mcp-remote to implement the full suite of remote MCP server functionalities (including Auth). For example, simply add the following to your config.json file:

{
  "mcpServers": {
    "cloudflare": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://YOURDOMAIN.com/sse"]
    }
  }
}

In this example, I have provided a test remote server: https://YOURDOMAIN.com on Cloudflare. This server will always approve your authentication requests.

If you encounter any issues, such as the common HTTP 400 error (please try to keep OAuth auto-redirect enabled to prevent callback delays that might cause failures), you can resolve them by clearing your browser cache on the authentication page and then attempting verification again.

📥 Contributing

We welcome contributions of any kind to this project, including feature enhancements, UI improvements, documentation updates, test case completions, and syntax corrections. I believe that a real developer can write better code than AI, so if you have concerns about certain parts of the code implementation, feel free to share your suggestions or submit a pull request.

Please review our Code of Conduct. It is in effect at all times. We expect it to be honored by everyone who contributes to this project.

For more information, please see the Contributing Guidelines.

🪲 Opening an Issue

Before creating an issue, check if you are using the latest version of the project. If you are not up-to-date, see if updating fixes your issue first.

🔒 Reporting Security Issues

Review our Security Policy. Do not file a public issue for security vulnerabilities.

🙏 Credits

Written by @AIQL.com.

Many of the ideas and much of the prose in this project were based on or inspired by work from the following communities:

You can review their specific technical details and licenses. We commend them for their efforts to facilitate collaboration in their projects.

tuui FAQ

How does Tuui handle multiple LLM providers?
Tuui dynamically configures and orchestrates APIs from various LLM providers, enabling seamless cross-vendor integration.
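
For illustration, a cross-vendor setup is just a matter of adding more entries to the chatbots array from the template in Getting Started. The second entry below uses placeholder endpoint and model names, not real vendor values.

const chatbotStore = {
  chatbots: [
    // Qwen entry from the Getting Started template
    {
      name: 'Qwen',
      apiKey: '',
      url: 'https://dashscope.aliyuncs.com/compatible-mode',
      path: '/v1/chat/completions',
      model: 'qwen-turbo',
      modelList: ['qwen-turbo', 'qwen-plus', 'qwen-max'],
      maxTokensValue: '',
      mcp: true
    },
    // Placeholder second vendor exposing an OpenAI-compatible endpoint
    {
      name: 'OtherVendor',
      apiKey: '',
      url: 'https://api.example.com',
      path: '/v1/chat/completions',
      model: 'example-model',
      modelList: ['example-model'],
      maxTokensValue: '',
      mcp: true
    }
  ]
}
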
What programming languages does Tuui support for development?
Tuui is built with TypeScript, providing strong typing and modern development features.
Does Tuui support automated testing?
Yes, Tuui includes automated application testing support to ensure reliability of AI workflows.
How does Tuui ensure the quality of AI-generated content?
Tuui employs strict syntax checks and naming conventions enforced by integrated linting tools.
Can Tuui be used to build complete AI projects?
Yes, Tuui is designed as a platform to create and manage complete AI projects using MCP and AI-generated components.
Is Tuui limited to a specific MCP host or environment?
No, Tuui is a client that can connect to any MCP host supporting the protocol.
How does Tuui accelerate AI tool integration?
By providing a unified interface and orchestrating tool interactions via MCP, Tuui simplifies and speeds up integration.
What kind of AI tools can be integrated with Tuui?
Any AI tool exposing functionality through MCP servers can be integrated and managed within Tuui.