any-chat-completions-mcp

MCP.Pizza Chef: pyroprompts

any-chat-completions-mcp is a TypeScript-based MCP server that exposes any OpenAI SDK compatible chat completion API as a tool within the Model Context Protocol ecosystem. It supports multiple AI chat providers, including OpenAI, Perplexity, Groq, xAI, and PyroPrompts, relaying chat queries to whichever model is configured. Because it implements the MCP server interface, developers can use diverse LLM chat completions as tools in their AI workflows, improving interoperability and flexibility across providers.

Use This MCP Server To

  • Integrate multiple AI chat providers via a single MCP server
  • Relay chat queries to any OpenAI SDK compatible LLM
  • Enable Claude Desktop to use OpenAI chat completions
  • Develop AI agents that switch between chat completion APIs
  • Test and compare chat completions from various LLM providers
  • Build multi-provider AI chat tools with a unified interface

README

any-chat-completions-mcp MCP Server

Integrate Claude with Any OpenAI SDK Compatible Chat Completion API - OpenAI, Perplexity, Groq, xAI, PyroPrompts and more.

This implements the Model Context Protocol Server. Learn more: https://modelcontextprotocol.io

This is a TypeScript-based MCP server that implements an integration with any OpenAI SDK compatible Chat Completions API.

It has one tool, chat, which relays a question to the configured AI Chat Provider.
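The relay is driven entirely by the AI_CHAT_* environment variables shown in the configuration examples below. As a rough sketch (the helper names here are illustrative, not the server's actual internals), the server reads that configuration and hands it to an OpenAI SDK client whenever the chat tool is invoked:

```typescript
// Illustrative sketch of environment-driven provider configuration.
// The AI_CHAT_* variable names come from this README; the helper and
// interface names are hypothetical, not the server's real identifiers.
interface ChatProviderConfig {
  apiKey: string;
  name: string;
  model: string;
  baseURL: string;
}

function loadConfig(env: Record<string, string | undefined>): ChatProviderConfig {
  const apiKey = env.AI_CHAT_KEY;
  const name = env.AI_CHAT_NAME;
  const model = env.AI_CHAT_MODEL;
  const baseURL = env.AI_CHAT_BASE_URL;
  if (!apiKey || !name || !model || !baseURL) {
    throw new Error(
      "AI_CHAT_KEY, AI_CHAT_NAME, AI_CHAT_MODEL and AI_CHAT_BASE_URL must all be set"
    );
  }
  return { apiKey, name, model, baseURL };
}

// A server like this would presumably pass the config to an OpenAI SDK
// client, e.g. new OpenAI({ apiKey: config.apiKey, baseURL: config.baseURL }),
// and call client.chat.completions.create({ model: config.model, messages: [...] })
// when the chat tool is invoked. Any OpenAI-compatible baseURL works the same way.
const config = loadConfig({
  AI_CHAT_KEY: "sk-example",
  AI_CHAT_NAME: "OpenAI",
  AI_CHAT_MODEL: "gpt-4o",
  AI_CHAT_BASE_URL: "https://api.openai.com/v1",
});
console.log(config.model); // gpt-4o
```

Swapping providers is then just a matter of changing the four environment variables, which is why the same server binary can be registered multiple times under different names.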


Development

Install dependencies:

npm install

Build the server:

npm run build

For development with auto-rebuild:

npm run watch

Installation

To add OpenAI to Claude Desktop, add the server config:

On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json

On Windows: %APPDATA%/Claude/claude_desktop_config.json

You can use it via npx in your Claude Desktop configuration like this:

{
  "mcpServers": {
    "chat-openai": {
      "command": "npx",
      "args": [
        "@pyroprompts/any-chat-completions-mcp"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}

Or, if you clone the repo, you can build and use in your Claude Desktop configuration like this:

{
  "mcpServers": {
    "chat-openai": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}

You can add multiple providers by registering the same MCP server multiple times under different names, each with its own env values:

{
  "mcpServers": {
    "chat-pyroprompts": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "PYROPROMPTS_KEY",
        "AI_CHAT_NAME": "PyroPrompts",
        "AI_CHAT_MODEL": "ash",
        "AI_CHAT_BASE_URL": "https://api.pyroprompts.com/openaiv1"
      }
    },
    "chat-perplexity": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "PERPLEXITY_KEY",
        "AI_CHAT_NAME": "Perplexity",
        "AI_CHAT_MODEL": "sonar",
        "AI_CHAT_BASE_URL": "https://api.perplexity.ai"
      }
    },
    "chat-openai": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}

With these three, you'll see a tool for each in the Claude Desktop Home:

Claude Desktop Home with Chat Tools

You can then chat with other LLMs, and the exchange appears in the conversation like this:

Claude Chat with OpenAI

Or, configure in LibreChat like:

  chat-perplexity:
    type: stdio
    command: npx
    args:
      - -y
      - @pyroprompts/any-chat-completions-mcp
    env:
      AI_CHAT_KEY: "pplx-012345679"
      AI_CHAT_NAME: Perplexity
      AI_CHAT_MODEL: sonar
      AI_CHAT_BASE_URL: "https://api.perplexity.ai"
      PATH: '/usr/local/bin:/usr/bin:/bin'

And it shows in LibreChat:

LibreChat with Perplexity Chat

Installing via Smithery

To install Any OpenAI Compatible API Integrations for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install any-chat-completions-mcp-server --client claude

Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:

npm run inspector

The Inspector will provide a URL to access debugging tools in your browser.

Acknowledgements

  • Obviously the modelcontextprotocol and Anthropic team for the MCP Specification and integration into Claude Desktop. https://modelcontextprotocol.io/introduction
  • PyroPrompts for sponsoring this project. Use code CLAUDEANYCHAT for 20 free automation credits on PyroPrompts.

any-chat-completions-mcp FAQ

How do I install the any-chat-completions-mcp server?
Install dependencies with 'npm install', then build using 'npm run build'. For development, use 'npm run watch' for auto-rebuild.
Which AI chat providers are supported by any-chat-completions-mcp?
It supports any OpenAI SDK compatible chat completion API, including OpenAI, Perplexity, Groq, xAI, and PyroPrompts.
What programming language is used for this MCP server?
The server is implemented in TypeScript, ensuring type safety and modern JavaScript features.
How does the 'chat' tool function in this MCP server?
The 'chat' tool relays user questions to the configured AI chat provider and returns the model's response.
Can I use this MCP server to switch between different LLM providers dynamically?
Yes, it is designed to integrate multiple providers compatible with OpenAI SDK chat completions, allowing flexible switching.
Is this MCP server compatible with Claude Desktop?
Yes, it can be added to Claude Desktop to enable OpenAI chat completions within that environment.
Where can I find more information about the Model Context Protocol?
Visit https://modelcontextprotocol.io for detailed documentation and protocol specifications.
Does this server support real-time chat interactions?
Yes, it supports real-time chat completions by relaying queries to the configured AI provider.