
sample-serverless-mcp-server

MCP.Pizza Chef: aws-samples

The sample-serverless-mcp-server is a serverless MCP server implementation that supports the Streamable HTTP protocol and is deployed on AWS Lambda. It leverages MCP v2025.03.26 features such as HTTP Chunked Transfer Encoding for streaming, dynamic context priority control, and seamless integration with existing HTTP infrastructure. The server offers elastic deployment with an optimized cost-performance balance, making it well suited to scalable, real-time AI context delivery in cloud environments.

Use This MCP Server To

  • Deploy scalable MCP servers on AWS Lambda for real-time AI context streaming
  • Stream large context data efficiently using HTTP Chunked Transfer Encoding
  • Implement dynamic priority control for context data in MCP workflows
  • Integrate MCP servers seamlessly with existing HTTP-based infrastructures
  • Optimize cost and performance of MCP servers using serverless architecture
  • Demonstrate MCP v2025.03.26 features in a practical serverless environment

README

sample-serverless-mcp-server

A GitHub MCP Server implementation based on AWS Lambda and the Streamable HTTP protocol.

Project Description

This project demonstrates how to transform the official TypeScript MCP Server to support the Streamable HTTP protocol and deploy it on AWS Lambda. The implementation leverages new features introduced in MCP v2025.03.26, including:

  • Stream transmission based on HTTP Chunked Transfer Encoding

  • Dynamic context priority control

  • Seamless integration with existing HTTP infrastructure
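The streaming in question rides on standard HTTP/1.1 chunked framing, which Node.js applies automatically whenever a handler calls res.write() repeatedly without a Content-Length. As a minimal sketch for clarity (the helper names below are illustrative, not part of this repository):

```typescript
// Sketch of HTTP/1.1 Chunked Transfer Encoding framing (RFC 9112 §7.1).
// Node's http module performs this automatically; shown by hand for clarity.

// One chunk on the wire: <byte length in hex>\r\n<data>\r\n
function frameChunk(data: string): string {
  const bytes = Buffer.byteLength(data, "utf8");
  return `${bytes.toString(16)}\r\n${data}\r\n`;
}

// A chunked body: the framed chunks followed by a zero-length terminator.
function frameBody(chunks: string[]): string {
  return chunks.map(frameChunk).join("") + "0\r\n\r\n";
}

// Streaming two fragments of a JSON-RPC response as separate chunks:
const wire = frameBody(['{"jsonrpc":"2.0",', '"id":1,"result":{}}']);
```

Because each chunk declares its own length, the client can start consuming context data before the server has finished producing it, which is what makes the transport "streamable".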

Key Features

  • Streamable HTTP MCP protocol support

  • Elastic deployment on AWS Lambda

  • Optimized cost-performance balance

MCP Server on AWS Lambda Architecture

Architecture

Quick Start

Prerequisites

  • Node.js 20+
  • AWS CLI configured
  • OSS-Serverless CLI

Serverless Streamable HTTP MCP Server

  • GitHub MCP Server

How to deploy the GitHub MCP Server on AWS Lambda:

This project requires a GITHUB_PERSONAL_ACCESS_TOKEN, which is stored as an AWS Lambda environment variable.

Never commit your serverless.yml to a public GitHub repository.

# Clone repository
git clone https://gitlab.aws.dev/wsuam/sample-serverless-mcp-server.git
cd sample-serverless-mcp-server/src/github/

# Install dependencies
npm install
npm install -g osls

# Set your GitHub personal access token in serverless.yml
cp serverless.example.yml serverless.yml
# Edit serverless.yml: GITHUB_PERSONAL_ACCESS_TOKEN: <Your GitHub Personal Access Token>


# Test locally
sls offline

# Deploy to AWS Lambda
sls deploy
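For orientation, the token lands in the provider environment section of serverless.yml. The fragment below is a sketch only — the service name, runtime, handler path, and event wiring are illustrative assumptions and may differ from the repository's actual file:

```yaml
service: sample-serverless-mcp-server

provider:
  name: aws
  runtime: nodejs20.x
  environment:
    # Never commit a real token; keep serverless.yml out of public repos.
    GITHUB_PERSONAL_ACCESS_TOKEN: <Your GitHub Personal Access Token>

functions:
  mcpServer:
    handler: index.handler   # illustrative handler path
    events:
      - httpApi: "*"
```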

Architecture Overview

The project utilizes the following architecture:

  • API Gateway: Handles HTTP requests

  • Lambda: Executes MCP Server logic

  • Streamable HTTP: Implements streaming responses

  • DynamoDB and S3: Store request logs (not yet implemented)
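In protocol terms, the Lambda's job amounts to routing JSON-RPC messages that arrive over the Streamable HTTP transport. The following TypeScript sketch shows the shape of that dispatch; the method names and error code come from the MCP and JSON-RPC specifications, but the tool list and function names are hypothetical, not this repository's actual code:

```typescript
// Hypothetical sketch of MCP request routing inside the Lambda handler.
type JsonRpcRequest = { jsonrpc: "2.0"; id: number | string; method: string; params?: unknown };
type JsonRpcResponse = {
  jsonrpc: "2.0";
  id: number | string;
  result?: unknown;
  error?: { code: number; message: string };
};

// Illustrative tool list; the real GitHub MCP Server exposes its own tools.
const TOOLS = [{ name: "search_repositories", description: "Search GitHub repositories" }];

function handleMcpRequest(req: JsonRpcRequest): JsonRpcResponse {
  switch (req.method) {
    case "initialize":
      // Advertise the protocol revision this sample targets.
      return {
        jsonrpc: "2.0",
        id: req.id,
        result: { protocolVersion: "2025-03-26", capabilities: { tools: {} } },
      };
    case "tools/list":
      return { jsonrpc: "2.0", id: req.id, result: { tools: TOOLS } };
    default:
      // -32601 is JSON-RPC's standard "method not found" error code.
      return { jsonrpc: "2.0", id: req.id, error: { code: -32601, message: `Method not found: ${req.method}` } };
  }
}
```

API Gateway delivers each HTTP POST body to the Lambda, which parses it into a JsonRpcRequest, dispatches as above, and streams the response back chunk by chunk.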

Development Guide

  1. Configure AWS credentials
  2. Run sls offline
  3. Follow the prompts to complete deployment configuration

Contributing

Pull requests are welcome. Before submitting, please ensure:

  • Code follows project standards
  • All tests pass
  • Documentation is updated

License

This library is licensed under the MIT-0 License. See the LICENSE file.

Project Status

The project is under active development. Issues and suggestions are welcome.

sample-serverless-mcp-server FAQ

How does the sample-serverless-mcp-server handle streaming of context data?
It uses HTTP Chunked Transfer Encoding to stream context data efficiently over HTTP.

What are the benefits of deploying this MCP server on AWS Lambda?
It provides elastic scaling, cost optimization, and easy integration with AWS infrastructure.

What MCP version features does this server leverage?
It uses MCP v2025.03.26 features, including the Streamable HTTP protocol and dynamic context priority control.

What are the prerequisites for deploying this server?
Node.js 20+, a configured AWS CLI, and the OSS-Serverless CLI are required.

Can this server integrate with existing HTTP infrastructures?
Yes, it is designed for seamless integration with existing HTTP-based systems.

How does dynamic context priority control improve MCP workflows?
It allows prioritizing important context data dynamically to optimize model interactions.

Is this server suitable for cost-sensitive deployments?
Yes, the serverless architecture on AWS Lambda optimizes the cost-performance balance.

What programming language is used for this MCP server?
The server is implemented in TypeScript.