
fastapi-sse-mcp

MCP.Pizza Chef: ragieai

fastapi-sse-mcp is a FastAPI-based MCP server that showcases Server-Sent Events (SSE) as a transport layer for real-time model context communication. It handles standard HTTP requests alongside MCP tools, resources, and prompts implemented with the MCP python-sdk, enabling efficient, event-driven streaming and interaction with LLMs.

Use This MCP Server To

  • Stream real-time model context updates via Server-Sent Events
  • Implement MCP tools and resources with a FastAPI backend
  • Build event-driven LLM workflows using SSE transport
  • Create echo services for testing MCP SSE integration
  • Handle HTTP requests alongside MCP SSE connections
  • Demonstrate MCP python-sdk usage in FastAPI applications

README

FastAPI SSE MCP

A FastAPI application that demonstrates Server-Sent Events (SSE) integration with the Model Context Protocol (MCP).

Overview

This project showcases how to use Server-Sent Events (SSE) as a transport layer for MCP in a FastAPI application. It provides a simple echo service that can:

  • Handle FastAPI HTTP requests
  • Implement MCP tools, resources and prompts using the MCP python-sdk

Requirements

  • Python 3.12+
  • FastAPI 0.115.11+
  • MCP 1.4.1+

Installation

  1. Clone this repository:

    git clone https://github.com/ragieai/fastapi-sse-mcp.git
    cd fastapi-sse-mcp
  2. Install dependencies:

    uv sync --dev

Running the Application

Start the FastAPI server:

uv run uvicorn app.main:app --reload

The server will be available at http://127.0.0.1:8000.

API Endpoints

  • GET / - Returns a simple JSON greeting
  • GET /sse/ - SSE endpoint for establishing connections
  • POST /messages/ - Endpoint for sending messages over SSE

Examples

The application provides three example MCP functions:

  1. Tool Function: Echoes messages

    @mcp.tool()
    def echo_tool(message: str) -> str:
        """Echo a message as a tool"""
        return f"Tool echo: {message}"
  2. Prompt Function: Creates echo prompts

    @mcp.prompt()
    def echo_prompt(message: str) -> str:
        """Create an echo prompt"""
        return f"Please process this message: {message}"
  3. Resource Function: Handles resource requests

    @mcp.resource("echo://{message}")
    def echo_resource(message: str) -> str:
        """Echo a message as a resource"""
        return f"Resource echo: {message}"

License

MIT License

Copyright (c) 2024 Ragie Corp

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

fastapi-sse-mcp FAQ

How do I start the fastapi-sse-mcp server?
Clone the repository, install dependencies, then run 'uv run uvicorn app.main:app --reload'.
What Python and FastAPI versions are required?
Python 3.12+ and FastAPI 0.115.11+ are required for compatibility.
How does fastapi-sse-mcp use Server-Sent Events?
It uses SSE to provide a real-time streaming transport layer for MCP context updates.
Can I use fastapi-sse-mcp to implement custom MCP tools?
Yes, it supports implementing MCP tools, resources, and prompts via the MCP python-sdk.
Is fastapi-sse-mcp suitable for production use?
It is primarily a demonstration project but can be extended for production with additional hardening.
What endpoints does the server expose?
It exposes '/' for greetings, '/sse/' for SSE connections, and '/messages/' for sending messages.
Does fastapi-sse-mcp support other LLM providers?
While focused on the MCP protocol itself, it can integrate with OpenAI, Anthropic Claude, and Google Gemini via MCP-compatible clients.
How does this server improve LLM interaction?
By using SSE, it enables efficient, low-latency streaming of context and responses to LLMs.