
unichat-ts-mcp-server

Author: amidabuddha

The unichat-ts-mcp-server is a TypeScript-based MCP server designed to facilitate sending requests to multiple leading AI providers including OpenAI, MistralAI, Anthropic, xAI, Google AI, and DeepSeek. It supports both STDIO and SSE transport mechanisms, allowing flexible integration options. This server acts as a bridge using the MCP protocol, enabling developers to interact with various LLM APIs through a unified interface. It requires vendor API keys for authentication and provides a single tool named 'unichat' to send requests, making it ideal for multi-provider AI workflows and real-time model interaction.

Use This MCP Server To

  • Send AI requests to multiple LLM providers via the MCP protocol
  • Integrate OpenAI, Anthropic, and Google AI in one server
  • Use STDIO or SSE transport for flexible communication
  • Enable multi-vendor AI workflows with unified API access
  • Bridge TypeScript applications to diverse AI services

README

Unichat MCP Server in TypeScript

Also available in Python

Send requests to OpenAI, MistralAI, Anthropic, xAI, Google AI or DeepSeek using MCP protocol via tool or predefined prompts. Vendor API key required.

Both STDIO and SSE transport mechanisms supported via arguments.

Tools

The server implements one tool:

  • unichat: Send a chat request to the configured AI provider
    • Takes "messages" as a required argument (the chat messages to send)
    • Returns the model's response
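A call to this tool follows the standard MCP tools/call shape. The sketch below only builds the payload on the client side; the message type and exact argument schema are assumptions based on the description above, not the server's actual schema.

```typescript
// Hypothetical message and payload shapes for the `unichat` tool;
// the real server's argument schema may differ.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function buildUnichatCall(messages: ChatMessage[]) {
  // Standard MCP tools/call arguments: tool name plus an arguments object.
  return { name: "unichat", arguments: { messages } };
}

const call = buildUnichatCall([
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Summarize the MCP protocol in one sentence." },
]);
console.log(JSON.stringify(call, null, 2));
```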

Prompts

  • code_review
    • Review code for best practices, potential issues, and improvements
    • Arguments:
      • code (string, required): The code to review
  • document_code
    • Generate documentation for code including docstrings and comments
    • Arguments:
      • code (string, required): The code to comment
  • explain_code
    • Explain how a piece of code works in detail
    • Arguments:
      • code (string, required): The code to explain
  • code_rework
    • Apply requested changes to the provided code
    • Arguments:
      • changes (string, optional): The changes to apply
      • code (string, required): The code to rework
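The required/optional flags listed above can be checked on the client side before issuing a prompts/get request. The validator below is an illustrative sketch that mirrors the list in this README; it is not code from the server.

```typescript
// Argument specs copied from the README's prompt list; purely illustrative.
const promptArgs: Record<string, { name: string; required: boolean }[]> = {
  code_review: [{ name: "code", required: true }],
  document_code: [{ name: "code", required: true }],
  explain_code: [{ name: "code", required: true }],
  code_rework: [
    { name: "changes", required: false },
    { name: "code", required: true },
  ],
};

// Return the names of required arguments that were not provided.
function missingArgs(prompt: string, provided: Record<string, string>): string[] {
  return (promptArgs[prompt] ?? [])
    .filter((a) => a.required && !(a.name in provided))
    .map((a) => a.name);
}

console.log(missingArgs("code_rework", { changes: "rename foo to bar" }));
```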

Development

Install dependencies:

npm install

Build the server:

npm run build

For development with auto-rebuild:

npm run watch

Running evals

The evals package loads an MCP client that then runs the index.ts file, so there is no need to rebuild between tests. You can load environment variables by prefixing the npx command. Full documentation can be found here.

OPENAI_API_KEY=your-key npx mcp-eval src/evals/evals.ts src/server.ts

Installation

Installing via Smithery

To install Unichat MCP Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install unichat-ts-mcp-server --client claude

Installing manually

To use with Claude Desktop, add the server config:

On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

Run locally:

{
  "mcpServers": {
    "unichat-ts-mcp-server": {
      "command": "node",
      "args": [
        "{{/path/to}}/unichat-ts-mcp-server/build/index.js"
      ],
      "env": {
        "UNICHAT_MODEL": "YOUR_PREFERRED_MODEL_NAME",
        "UNICHAT_API_KEY": "YOUR_VENDOR_API_KEY"
      }
    }
  }
}

Run published:

{
  "mcpServers": {
    "unichat-ts-mcp-server": {
      "command": "npx",
      "args": [
        "-y",
        "unichat-ts-mcp-server"
      ],
      "env": {
        "UNICHAT_MODEL": "YOUR_PREFERRED_MODEL_NAME",
        "UNICHAT_API_KEY": "YOUR_VENDOR_API_KEY"
      }
    }
  }
}

The server runs in STDIO mode by default (or with the --stdio argument). To run in SSE mode, add the --sse argument:

npx -y unichat-ts-mcp-server --sse
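The transport selection described above amounts to a simple flag check. Below is a minimal sketch of that logic; the actual index.ts may parse its arguments differently.

```typescript
// STDIO is the default transport; --sse switches to Server-Sent Events.
// The flag handling here is an assumption based on the README, not the
// server's real argument parser.
function pickTransport(argv: string[]): "stdio" | "sse" {
  if (argv.includes("--sse")) return "sse";
  return "stdio"; // default, also selected explicitly via --stdio
}

console.log(pickTransport(["--sse"]));
```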

Supported Models

A list of currently supported models to be used as "YOUR_PREFERRED_MODEL_NAME" may be found here. Please make sure to add the relevant vendor API key as "YOUR_VENDOR_API_KEY".

Example:

"env": {
  "UNICHAT_MODEL": "gpt-4o-mini",
  "UNICHAT_API_KEY": "YOUR_OPENAI_API_KEY"
}
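Startup configuration reduces to reading these two variables. The sketch below is a hedged illustration: the variable names come from this README, but the validation and function shape are assumptions, not the server's actual code.

```typescript
// Read UNICHAT_MODEL and UNICHAT_API_KEY, failing fast if either is missing.
// The environment is passed in explicitly so the function is easy to test.
function readConfig(env: Record<string, string | undefined>) {
  const model = env.UNICHAT_MODEL;
  const apiKey = env.UNICHAT_API_KEY;
  if (!model || !apiKey) {
    throw new Error("UNICHAT_MODEL and UNICHAT_API_KEY must both be set");
  }
  return { model, apiKey };
}

console.log(readConfig({ UNICHAT_MODEL: "gpt-4o-mini", UNICHAT_API_KEY: "sk-example" }));
```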

Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:

npm run inspector

The Inspector will provide a URL to access debugging tools in your browser.

If you experience timeouts during testing in SSE mode, change the request URL in the Inspector interface to: http://localhost:3001/sse?timeout=600000

unichat-ts-mcp-server FAQ

How do I authenticate requests with unichat-ts-mcp-server?
You must provide valid vendor API keys for each AI provider you want to use, such as OpenAI, Anthropic, or Google AI.
What transport mechanisms does unichat-ts-mcp-server support?
It supports both STDIO and SSE transport mechanisms, configurable via server arguments.
Can I use unichat-ts-mcp-server with multiple AI vendors simultaneously?
Yes, it is designed to send requests to multiple providers like OpenAI, MistralAI, Anthropic, xAI, Google AI, and DeepSeek using the MCP protocol.
Is there a predefined tool available in this server?
Yes, the server implements a single tool called 'unichat' for sending requests to AI providers.
Does unichat-ts-mcp-server support real-time streaming responses?
Yes, through the SSE transport mechanism, it supports real-time streaming of responses.
Is the server implementation available in other languages?
Yes, there is also a Python version of the unichat MCP server available.
How do I configure the server for different AI providers?
Configuration is done by supplying the appropriate API keys and selecting the desired transport mechanism when starting the server.