
grpcmcp

MCP.Pizza Chef: adiom-data

grpcmcp is an MCP server that proxies requests to gRPC backends using either a provided descriptors file or server reflection. It supports multiple transport modes including SSE and STDIN, enabling seamless integration of gRPC services into MCP workflows. This server simplifies exposing gRPC APIs to LLMs by translating gRPC service definitions into MCP-compatible formats, facilitating real-time, structured interaction with gRPC endpoints.

Use This MCP Server To

  • Proxy gRPC backend services for MCP clients
  • Expose gRPC APIs to LLMs via the MCP protocol
  • Use server reflection to dynamically discover gRPC services
  • Serve gRPC health check services over MCP
  • Enable SSE transport for real-time gRPC streaming
  • Integrate gRPC services into AI-enhanced workflows
  • Translate gRPC descriptors into MCP-readable data

README

grpcmcp

A simple MCP server that proxies requests to a gRPC backend based on a provided descriptors file or using server reflection.

Quick Start

  1. Install the binary: go install . (from the repository root) or go install github.com/adiom-data/grpcmcp@latest. Ensure the Go bin directory is in your PATH.

  2. In a terminal, run the example gRPC server: go run example/main.go. This starts a gRPC health service on port 8090 with server reflection enabled. Note that this is the default port that grpcmcp connects to.

  3. SSE Transport: In another terminal, run grpcmcp --hostport=localhost:3000 --reflect. Specifying --hostport enables SSE. The SSE endpoint is served at http://localhost:3000/sse.
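As a quick smoke test (assuming curl is available and the server from the previous step is running), you can check that the SSE endpoint is responding:

```shell
# Stream events from the SSE endpoint started above.
# -N disables output buffering so events appear as they arrive.
curl -N http://localhost:3000/sse
```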

  4. STDIN Transport: Set up the MCP config, e.g.

"grpcmcp": {
    "command": "grpcmcp",
    "args": ["--reflect"]
}
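A sketch of a fuller STDIN config, pointing grpcmcp at a non-default backend and pulling a bearer token from the environment. The flag names (--url, --bearer-env) come from the options list below; the backend address and environment variable name here are placeholders:

```json
"grpcmcp": {
    "command": "grpcmcp",
    "args": [
        "--reflect",
        "--url", "localhost:8090",
        "--bearer-env", "GRPC_BACKEND_TOKEN"
    ]
}
```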

Options / Features

Run grpcmcp --help for a full list of options.

  • hostport string - When set, use SSE, and this serves as the server host:port.

  • descriptors string - Specify file location of the protobuf definitions generated from buf build -o protos.pb or protoc --descriptor_set_out=protos.pb instead of using gRPC reflection.

  • reflect - If set, use reflection to retrieve gRPC endpoints instead of a descriptors file.

  • url string - Specify the URL of the backend server.

  • services string - Comma-separated list of fully qualified gRPC service names to filter.

  • bearer string - Token to attach in an Authorization: Bearer header.

  • bearer-env string - Environment variable for token to attach in an Authorization: Bearer header. Overrides bearer.

  • header string (repeatable) - Headers to add in Key: Value format.
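Putting the descriptor-based options together, a sketch of a descriptors workflow (assuming a buf- or protoc-based project; the proto file name, service filter, and header value are placeholders, and the --url format shown is an assumption based on the example server's port):

```shell
# Generate a descriptors file with either tool named above:
buf build -o protos.pb
# or: protoc --descriptor_set_out=protos.pb your.proto

# Run grpcmcp against the descriptors file instead of reflection,
# exposing only the health service and adding a custom header.
grpcmcp --descriptors=protos.pb \
        --url=localhost:8090 \
        --services=grpc.health.v1.Health \
        --header="X-Env: staging"
```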

Help

Join our Discord at https://discord.gg/hDjx3DehwG

grpcmcp FAQ

How do I start grpcmcp with server reflection enabled?
Run grpcmcp with the --reflect flag and specify the hostport for SSE transport, e.g., grpcmcp --hostport=localhost:3000 --reflect.
What transport methods does grpcmcp support?
grpcmcp supports SSE transport when hostport is set and STDIN transport via MCP config commands.
How do I provide gRPC service definitions to grpcmcp?
You can provide a descriptors file or enable server reflection to dynamically discover services.
Can grpcmcp handle gRPC health checking?
Yes, grpcmcp can proxy gRPC health services, enabling health status monitoring through MCP.
Is grpcmcp compatible with multiple LLM providers?
Yes, grpcmcp works with any MCP client and LLM providers like OpenAI, Claude, and Gemini by exposing gRPC services in MCP format.
How do I install grpcmcp?
Install via Go with go install github.com/adiom-data/grpcmcp and ensure your Go bin directory is in your PATH.
Where can I find more options and configuration details?
Run grpcmcp --help to see all available options and flags for configuration.