auto_mcp

MCP.Pizza Chef: snick-m

Auto MCP is an MCP server designed for ROS-based systems that automatically introspects live ROS topics and their interfaces. It serves MCP over an SSE-over-HTTP transport on port 5000, enabling seamless integration of LLMs with robotic systems. This server acts as a bridge, giving ROS robots an LLM brain with minimal setup, and supports both manual configuration and introspection for dynamic environments.

Use This MCP Server To

- Expose ROS topics and interfaces as MCP server endpoints
- Enable LLMs to interact with ROS-based robots in real time
- Automatically detect live ROS topics for dynamic MCP context
- Provide SSE over HTTP transport for MCP communication
- Integrate LLMs with robotic systems using Model Context Protocol
- Facilitate remote control and monitoring of ROS robots via LLMs
- Simplify setup of MCP servers in ROS environments
- Support manual configuration of MCP server parameters

README

Auto MCP βœ¨πŸ€–

This package works as a drop-in MCP server for any ROS-based system. Using introspection tools, the automcp_listener node picks up the list of topics, and their interfaces, that are live on the system. It then starts an SSE over HTTP transport MCP server on port 5000.

Because introspection happens at startup, it should be the last node to run.

What is it ❔

Essentially, this package lets you give your ROS-based robot a big LLM brain with very little work.

It works by creating an MCP server that you can connect to your LLM/client of choice, providing your robot a brain and the LLM a body.

Model Context Protocol, or MCP for short, is a protocol that lets LLMs interact with external tools. Anthropic, the company behind Claude, developed the protocol, and it has recently seen support from OpenAI as well. Learn more about it on the official website.
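For a concrete sense of the protocol: MCP messages are JSON-RPC 2.0, and a client opens a session with an initialize request roughly like the one below (the field values here are illustrative, not output from this package):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "1.0" }
  }
}
```

The server replies with its own capabilities, after which the client can list and call the tools the server exposes.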

Now, this Auto MCP ROS2 package aims to make the process of creating an MCP server for your robot effortless. Through introspection, the package detects all the topics running on a ROS system and provides the necessary tools. All you have to do is run its node after everything else and add the configuration to your preferred client.
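The package's actual tool-generation logic isn't reproduced here, but the idea can be sketched: take the (topic, types) pairs that ROS introspection returns (in a real node, from rclpy's `Node.get_topic_names_and_types()`) and derive an MCP tool name for each. Everything below, the sample topics and the naming scheme alike, is hypothetical.

```python
# Hypothetical sketch: turning introspected ROS topics into MCP tool names.
# In the real node this list would come from rclpy's
# Node.get_topic_names_and_types(); here it is hardcoded for illustration.
sample_topics = [
    ("/cmd_vel", ["geometry_msgs/msg/Twist"]),
    ("/odom", ["nav_msgs/msg/Odometry"]),
]

def tool_name_for(topic: str) -> str:
    """Derive a flat MCP tool name from a ROS topic name (made-up scheme)."""
    return "publish" + topic.replace("/", "_")

# Map each generated tool name to the topic's interface type
registry = {tool_name_for(topic): types[0] for topic, types in sample_topics}
print(registry)
```

Each entry in a registry like this could then back one MCP tool that publishes to (or subscribes from) the corresponding topic.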

Why is it ❕

As LLMs get more and more powerful, I figured many more robotics developers would want to build connections between their robots and LLMs. As I had been working on such a concept for the Mars rover prototype at TMR, I saw MCP become popular and realized it was perfect for this use case.

But as a firm believer in quality-of-life improvements, I wanted to make this super easy to use, and thus: Auto MCP.

Setup πŸ”§

The dependencies in package.xml are declared as rosdep dependencies. However, the MCP SDK for Python is not yet in the rosdep repository; I have a pull request open for that.

Until that is resolved, you can run pip install mcp to install the dependency, as noted in the python-sdk repo for the Model Context Protocol.

Installation πŸ“₯

To install this package, create a colcon workspace following the instructions on the official ROS website.

After that, clone this repo to the src directory in your workspace.

cd src
git clone https://github.com/snick-m/auto_mcp.git

Then, go back to the root workspace directory and run the build command.

cd ../

# If you want to only build this package
colcon build --packages-select auto_mcp

# If you want to build all packages in the workspace
colcon build

After this you should be able to proceed to using it.

Usage πŸƒπŸΌβ€βž‘οΈ

Once the package is built and your workspace is sourced, you can launch the node by simply running

ros2 run auto_mcp server

This launches the node, which first performs introspection and then starts the MCP server on 0.0.0.0:5000. On the local system, you can therefore reach the SSE endpoint at http://localhost:5000/sse
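The /sse endpoint speaks standard Server-Sent Events: a plain-text stream of `event:`/`data:` lines separated by blank lines. As a rough, stdlib-only illustration (the sample payload below is made up, not actual server output), a minimal parser for such frames looks like:

```python
def parse_sse(stream: str):
    """Minimal Server-Sent Events parser: returns (event, data) pairs, one per
    blank-line-separated frame. Illustration only; real clients should use a
    proper SSE/MCP client library."""
    events = []
    for frame in stream.split("\n\n"):
        event, data_lines = "message", []
        for line in frame.splitlines():
            if line.startswith("event:"):
                event = line[len("event:"):].strip()
            elif line.startswith("data:"):
                data_lines.append(line[len("data:"):].strip())
        if data_lines:
            events.append((event, "\n".join(data_lines)))
    return events

# Made-up sample resembling what an SSE transport might emit
sample = "event: endpoint\ndata: /messages?session=abc\n\n"
print(parse_sse(sample))
```

In practice you would point an MCP client at the endpoint rather than parse the stream by hand; this is just to show what travels over the wire.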

Adding to Claude Desktop πŸ”—

Currently I've only tested with Claude Desktop; I'll try to add more instructions for other clients over time.

At the time of writing, Claude Desktop doesn't support adding MCP servers with SSE over HTTP transport. However, a really simple workaround from Cloudflare, the mcp-remote proxy, makes it usable.

For the Claude Desktop MCP configuration, use the following, modifying the URL to match your setup.

{
  "mcpServers": {
    "auto_mcp": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://localhost:5000/sse"
      ]
    }
  }
}

Contribution πŸ«‚

Contribution guidelines are coming soon. For now, check the issues to see the planned upgrades to this package; if any of them interest you, feel free to ping me on the issue to work on it.

auto_mcp FAQ

How does Auto MCP detect ROS topics?
It uses introspection tools to pick up live ROS topics and their interfaces at startup.
What transport protocol does Auto MCP use?
It creates an SSE (Server-Sent Events) over HTTP transport MCP server on port 5000.
When should Auto MCP be started in the ROS node sequence?
It should be the last node to run due to its introspection at start behavior.
Can I manually configure the Auto MCP server?
Yes, it supports manual configuration alongside automatic introspection.
Which LLM providers can connect to Auto MCP?
Auto MCP is compatible with any LLM client that supports MCP; so far, it has only been tested with Claude Desktop.
Does Auto MCP support ROS1 or ROS2?
It is designed primarily for ROS2-based systems.
How does Auto MCP enhance robot capabilities?
By providing an MCP server, it enables LLMs to act as the robot's brain, facilitating advanced interaction and control.
Is Auto MCP open source?
Yes, it is available as an open-source package on GitHub.