
epic-mcp

MCP.Pizza Chef: epicweb-dev

epic-mcp is a server example built on the Epic Stack demonstrating integration of the Model Context Protocol (MCP). It provides a standardized MCP endpoint using server-sent events (SSE) for real-time communication, enabling applications to expose structured context and tools to large language models. This server showcases how to implement MCP server-side setup and tool exposure within a modern web stack.

Use This MCP Server To

  • Implement an MCP server endpoint for real-time LLM context delivery
  • Demonstrate MCP integration in Epic Stack web applications
  • Expose custom tools and data sources to LLMs via MCP
  • Enable SSE transport for live model interaction
  • Prototype AI-enhanced workflows with a standardized MCP interface

README

Epic MCP

An Epic Stack example adding support for the Model Context Protocol (MCP).

What is MCP?

The Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to Large Language Models (LLMs). Think of MCP like a USB-C port for AI applications - it provides a standardized way to connect AI models to different data sources and tools.

Learn more from the MCP Documentation

Example Implementation

This repository demonstrates how to integrate MCP into an Epic Stack application. The implementation includes:

  1. Server-side MCP setup for handling client connections
  2. SSE (Server-Sent Events) transport layer for real-time communication
  3. Example tool implementations showing how to expose functionality to LLMs

Key Components

1. MCP Server Setup (app/routes/mcp+/mcp.server.ts)
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js'

export const server = new McpServer(
	{
		name: 'epic-mcp-a25d',
		version: '1.0.0',
	},
	{
		capabilities: {
			tools: {},
		},
	},
)

The MCP server is the core component that handles tool registration and execution. It's configured with a unique name and version, and defines the capabilities it provides.

2. Tool Implementation
import { z } from 'zod'

// `server` is the McpServer instance exported from mcp.server.ts
server.tool(
	'Find User',
	'Search for users in the Epic Notes database by their name or username',
	{ query: z.string().describe('The query to search for') },
	async ({ query }) => {
		// Implementation...
	},
)

Tools are the primary way to expose functionality to LLMs. Each tool:

  • Has a descriptive name and purpose
  • Uses Zod for type-safe parameter validation
  • Can return multiple content types (text, images, etc.)
  • Integrates with your existing application logic
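The handler body is elided above; the sketch below shows one plausible shape for it, following the MCP convention that a tool returns `{ content: [{ type: 'text', text }] }`. The `users` array and `findUsers` helper are stand-ins for a real database query (e.g. Prisma in the Epic Stack) and are not taken from the repository.

```typescript
type User = { id: string; name: string; username: string }

// Stub data source standing in for the Epic Notes database.
const users: User[] = [
	{ id: '1', name: 'Kody Koala', username: 'kody' },
	{ id: '2', name: 'Hannah Hippo', username: 'hannah' },
]

// Hypothetical query helper; a real implementation would hit the database.
function findUsers(query: string): User[] {
	const q = query.toLowerCase()
	return users.filter(
		(u) =>
			u.name.toLowerCase().includes(q) ||
			u.username.toLowerCase().includes(q),
	)
}

// Shape the matches the way an MCP tool handler returns them:
// an array of typed content parts.
function findUserToolResult(query: string) {
	const matches = findUsers(query)
	return {
		content: [
			{
				type: 'text' as const,
				text: matches.length
					? matches.map((u) => `${u.name} (@${u.username})`).join('\n')
					: `No users found for "${query}"`,
			},
		],
	}
}
```

The handler returns structured text rather than raw records so the LLM receives something it can quote directly.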
3. Transport Layer (app/routes/mcp+/fetch-transport.server.ts)

The transport layer handles the bi-directional communication between the MCP client and server:

  • Uses Server-Sent Events (SSE) for real-time server-to-client communication
  • Handles POST requests for client-to-server messages
  • Maintains session state for multiple concurrent connections
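The SSE wire format a transport like this produces is simple. The sketch below is a hypothetical, stripped-down illustration; `sseFrame` and `sseResponse` are illustrative names, not the repository's actual API, and it assumes a runtime with the web-standard `Response` and `ReadableStream` (Node 18+).

```typescript
// Encode one server-to-client message as an SSE frame:
// an `event:` line, a `data:` line, and a blank line terminator.
function sseFrame(event: string, data: unknown): string {
	return `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`
}

// Build a streaming Response whose body emits SSE frames.
function sseResponse(frames: string[]): Response {
	const body = new ReadableStream({
		start(controller) {
			const encoder = new TextEncoder()
			for (const frame of frames) controller.enqueue(encoder.encode(frame))
			controller.close()
		},
	})
	return new Response(body, {
		headers: {
			'Content-Type': 'text/event-stream',
			'Cache-Control': 'no-cache',
			Connection: 'keep-alive',
		},
	})
}
```

A real transport keeps the stream open for the session's lifetime and enqueues a frame per outgoing MCP message instead of closing after a fixed list.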
4. Route Integration (app/routes/mcp+/index.ts)
// `connect` looks up (or creates) the transport for this session
// (see fetch-transport.server.ts)
export async function loader({ request }: Route.LoaderArgs) {
	const url = new URL(request.url)
	const sessionId = url.searchParams.get('sessionId')
	const transport = await connect(sessionId)
	return transport.handleSSERequest(request)
}

The Remix route:

  • Establishes SSE connections for real-time communication
  • Handles incoming tool requests via POST endpoints
  • Manages session state for multiple clients

Learning Points

  1. Tool Design: When designing tools for LLMs:

    • Provide clear, descriptive names and purposes
    • Use strong type validation for parameters
    • Return structured responses that LLMs can understand
    • Consider supporting multiple content types (text, images, etc.)
  2. State Management: The implementation demonstrates:

    • Session-based connection tracking
    • Clean connection cleanup on client disconnect
    • Safe concurrent client handling
  3. Integration Patterns: Learn how to:

    • Connect MCP with existing application logic
    • Handle real-time communication in Remix
    • Structure your MCP implementation for maintainability
  4. Security Considerations:

    • Session-based access control
    • Safe handling of client connections
    • Proper cleanup of resources

epic-mcp FAQ

How does epic-mcp handle real-time communication with LLMs?
It uses Server-Sent Events (SSE) to stream context and responses in real time.
Can epic-mcp be integrated into existing Epic Stack applications?
Yes, it is designed as an example to add MCP support to Epic Stack projects.
What programming language and framework does epic-mcp use?
It is implemented in TypeScript using the Epic Stack framework.
Does epic-mcp support exposing custom tools to LLMs?
Yes, it includes example tool implementations to demonstrate this capability.
Is epic-mcp limited to a specific LLM provider?
No, MCP is provider-agnostic and works with OpenAI, Anthropic Claude, Google Gemini, and others.
Where can I find documentation to understand MCP better?
The MCP Documentation is available at https://modelcontextprotocol.io/introduction.
How do I start the epic-mcp server?
Follow the Epic Stack setup instructions and run the server with the provided MCP route configuration.
Can epic-mcp be used for production applications?
It is primarily an example implementation but can be extended for production use with customization.