
mcp-openapi-server

MCP.Pizza Chef: ivo-toby

The mcp-openapi-server is a Model Context Protocol (MCP) server that transforms OpenAPI specifications into MCP resources. It enables Large Language Models (LLMs) to discover, understand, and interact with RESTful APIs defined by OpenAPI specs through the MCP protocol. The server acts as a bridge, exposing API endpoints in a structured, model-readable format and facilitating real-time API calls and data retrieval within AI workflows. Configuration and integration are straightforward: developers can connect any OpenAPI-compliant REST API to LLMs without cloning a repository, simply by setting environment variables. This enhances an AI agent's ability to perform multi-step reasoning and dynamic interaction with external services.

Use This MCP Server To

  • Expose REST APIs as MCP resources for LLM interaction
  • Enable LLMs to call OpenAPI-defined endpoints dynamically
  • Integrate external APIs into AI workflows via MCP
  • Facilitate real-time API data retrieval for models
  • Simplify API access configuration for AI agents

README

OpenAPI MCP Server

A Model Context Protocol (MCP) server that exposes OpenAPI endpoints as MCP resources. This server allows Large Language Models to discover and interact with REST APIs defined by OpenAPI specifications through the MCP protocol.

Quick Start

You do not need to clone this repository to use this MCP server. You can simply configure it in Claude Desktop:

  1. Locate or create your Claude Desktop configuration file:

    • On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  2. Add the following configuration to enable the OpenAPI MCP server:

{
  "mcpServers": {
    "openapi": {
      "command": "npx",
      "args": ["-y", "@ivotoby/openapi-mcp-server"],
      "env": {
        "API_BASE_URL": "https://api.example.com",
        "OPENAPI_SPEC_PATH": "https://api.example.com/openapi.json",
        "API_HEADERS": "Authorization:Bearer token123,X-API-Key:your-api-key"
      }
    }
  }
}
  3. Replace the environment variables with your actual API configuration (see the example after this list):
    • API_BASE_URL: The base URL of your API
    • OPENAPI_SPEC_PATH: URL or path to your OpenAPI specification
    • API_HEADERS: Comma-separated key:value pairs for API authentication headers
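
For example, the FAQ below notes that both JSON and YAML specifications are supported, via URL or local path. A configuration that reads a local YAML spec and sends two authentication headers could look like the following; the base URL, file path, and credentials are placeholders, so substitute your own values:

{
  "mcpServers": {
    "openapi": {
      "command": "npx",
      "args": ["-y", "@ivotoby/openapi-mcp-server"],
      "env": {
        "API_BASE_URL": "https://petstore.example.com/v2",
        "OPENAPI_SPEC_PATH": "/Users/you/specs/petstore.yaml",
        "API_HEADERS": "Authorization:Bearer <your-token>,X-API-Key:<your-key>"
      }
    }
  }
}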

Development Tools

This project includes several development tools to make your workflow easier:

Building

  • npm run build - Builds the TypeScript source
  • npm run clean - Removes build artifacts
  • npm run typecheck - Runs TypeScript type checking

Development Mode

  • npm run dev - Watches source files and rebuilds on changes
  • npm run inspect-watch - Runs the inspector with auto-reload on changes

Code Quality

  • npm run lint - Runs ESLint
  • npm run typecheck - Verifies TypeScript types

Configuration

The server can be configured through environment variables or command line arguments:

Environment Variables

  • API_BASE_URL - Base URL for the API endpoints
  • OPENAPI_SPEC_PATH - Path or URL to OpenAPI specification
  • API_HEADERS - Comma-separated key:value pairs for API headers
  • SERVER_NAME - Name for the MCP server (default: "mcp-openapi-server")
  • SERVER_VERSION - Version of the server (default: "1.0.0")
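
As an alternative to the Claude Desktop configuration, the same variables can be set in the shell when launching the published package directly. The sketch below assumes a POSIX shell and uses placeholder values; it runs the package with npx exactly as in the Quick Start entry:

API_BASE_URL=https://api.example.com \
OPENAPI_SPEC_PATH=https://api.example.com/openapi.json \
API_HEADERS="Authorization:Bearer token123,X-API-Key:your-api-key" \
SERVER_NAME=my-mcp-server \
npx -y @ivotoby/openapi-mcp-server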

Command Line Arguments

npm run inspect -- \
  --api-base-url https://api.example.com \
  --openapi-spec https://api.example.com/openapi.json \
  --headers "Authorization:Bearer token123,X-API-Key:your-api-key" \
  --name "my-mcp-server" \
  --version "1.0.0"

Development Workflow

  1. Start the development environment:
    npm run inspect-watch
  2. Make changes to the TypeScript files in src/
  3. The server will automatically rebuild and restart
  4. Use the MCP Inspector UI to test your changes

Debugging

The server outputs debug logs to stderr. To see these logs:

  1. In development mode:

    • Logs appear in the terminal running inspect-watch
  2. When running directly:

    npm run inspect 2>debug.log
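
If you redirect stderr to a file as shown above, you can follow the log from a second terminal with standard shell tooling (nothing project-specific is assumed here):

    tail -f debug.log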

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Run tests and linting:
    npm run typecheck
    npm run lint
  5. Submit a pull request

License

MIT

mcp-openapi-server FAQ

How do I configure the mcp-openapi-server without cloning the repo?
You can configure it directly in Claude Desktop by adding the server configuration with environment variables pointing to your API base URL, OpenAPI spec path, and headers.
Can the mcp-openapi-server handle authentication for APIs?
Yes, you can pass authentication headers such as Bearer tokens or API keys via the API_HEADERS environment variable.
Does the mcp-openapi-server support dynamic API endpoints?
Yes, it exposes all endpoints defined in the OpenAPI specification dynamically to the MCP client.
How does the mcp-openapi-server improve LLM interactions with APIs?
It converts OpenAPI specs into structured MCP resources, allowing LLMs to discover and invoke API endpoints safely and efficiently.
Is the mcp-openapi-server compatible with multiple LLM providers?
Yes, it works with any MCP client connected to LLMs like OpenAI, Anthropic Claude, and Google Gemini.
What formats of OpenAPI specs are supported?
The server supports standard OpenAPI JSON or YAML specifications accessible via URL or local path.
Can I customize headers for different API calls?
Currently, headers are set globally via environment variables, but you can update them as needed before starting the server.
How do I update the OpenAPI spec used by the server?
Update the OPENAPI_SPEC_PATH environment variable to point to the new spec URL or file path and restart the server.