uber-eats-mcp-server

MCP.Pizza Chef: ericzakariasson

The uber-eats-mcp-server is a proof-of-concept MCP server that demonstrates how to build an MCP server on top of Uber Eats. It enables LLM applications to access and interact with Uber Eats data through the Model Context Protocol, facilitating real-time, structured context integration. Built with Python 3.12+, it supports multiple LLM providers like Anthropic and OpenAI, and uses stdio for MCP transport.

Use This MCP Server To

  • Integrate Uber Eats order data into LLM workflows
  • Enable real-time querying of Uber Eats menus via LLMs
  • Automate food order status updates through conversational agents
  • Build AI-powered restaurant recommendation systems using Uber Eats data
  • Test and prototype MCP server capabilities with Uber Eats context
  • Combine Uber Eats data with other APIs for multi-source insights

README

Uber Eats MCP Server

This is a POC of how you can build an MCP server on top of Uber Eats.

Demo video: mcp-uber-eats-github.mp4

What is MCP?

The Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external tools.
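To make that concrete, here is a minimal sketch of what an MCP server tool can look like with the official MCP Python SDK (FastMCP). The search_restaurants tool, its name, and its return value are hypothetical placeholders for illustration, not the actual tools this project exposes:

    from mcp.server.fastmcp import FastMCP

    # Create an MCP server instance (the name is arbitrary)
    mcp = FastMCP("uber-eats-demo")

    @mcp.tool()
    async def search_restaurants(query: str) -> str:
        """Search Uber Eats for restaurants matching the query (placeholder logic)."""
        # A real implementation would drive a browser or call an API here.
        return f"Results for {query!r} would be returned here."

    if __name__ == "__main__":
        # Talk to the MCP client over stdio, the transport this project uses
        mcp.run(transport="stdio")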

Prerequisites

  • Python 3.12 or higher
  • Anthropic API key (or an API key for another supported LLM provider)

Setup

  1. Ensure you have a virtual environment activated:

    uv venv
    source .venv/bin/activate  # On Unix/Mac
    
  2. Install required packages:

    uv pip install -r requirements.txt
    playwright install
    
  3. Update the .env file with your API key:

    ANTHROPIC_API_KEY=your_anthropic_api_key_here
    

Note

Since we're using stdio as the MCP transport, all output from browser-use has to be disabled so it doesn't interfere with the protocol messages on stdout.
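One common way to keep stdout clean for the protocol is to route diagnostics to stderr and silence the automation library's logging; the snippet below is a rough sketch of that idea, assuming the libraries log through Python's standard logging module (the logger names are assumptions):

    import logging
    import sys

    # Send our own diagnostics to stderr so stdout stays reserved for MCP messages
    logging.basicConfig(stream=sys.stderr, level=logging.WARNING)

    # Assumed logger names; suppress the browser automation libraries' output
    logging.getLogger("browser_use").setLevel(logging.CRITICAL)
    logging.getLogger("playwright").setLevel(logging.CRITICAL)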

Debugging

You can run the MCP Inspector tool with this command:

uv run mcp dev server.py

uber-eats-mcp-server FAQ

How do I set up the uber-eats-mcp-server?
Activate a Python 3.12+ virtual environment, install the dependencies with uv pip and Playwright, and set your LLM API key in the .env file.
Which LLM providers are supported?
The server supports Anthropic, OpenAI, and Gemini LLM providers via API keys.
How does the MCP transport work in this server?
It uses stdio as the MCP transport, which requires disabling browser output for compatibility.
Can I debug the server during development?
Yes, you can run the MCP inspector tool using the command 'uv run mcp dev server.py' to debug.
Is this server production-ready?
No, it is a proof-of-concept designed for experimentation and prototyping.
What programming language is used?
The server is implemented in Python and requires Python 3.12 or higher.
How do I update the API key?
Update the ANTHROPIC_API_KEY or relevant environment variable in the .env file.
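For illustration, a server typically reads that key at startup; a minimal sketch using python-dotenv (assuming the project loads .env this way) might look like:

    import os
    from dotenv import load_dotenv  # assumes python-dotenv is installed

    load_dotenv()  # read variables from the .env file into the environment
    api_key = os.getenv("ANTHROPIC_API_KEY")
    if not api_key:
        raise RuntimeError("ANTHROPIC_API_KEY is not set in .env")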
Does this server support multiple Uber Eats regions?
The documentation does not specify, but it can be extended to support multiple regions.