mcp-demo

MCP.Pizza Chef: raca159

mcp-demo is a client application demonstrating a modular AI research assistant built with MCP, LangChain, FastAPI, and Streamlit. It enables searching and analyzing scientific papers, performing linguistic analysis on documents, and provides a user-friendly chatbot interface to interact with these AI capabilities. This client orchestrates multiple MCP servers to deliver a seamless research workflow.

Use This MCP Client To

  • Search and retrieve scientific papers from ArXiv
  • Perform linguistic analysis on research documents
  • Interact with an AI research assistant via a chatbot interface
  • Combine multiple MCP servers for modular AI workflows
  • Demonstrate integration of LangChain with MCP clients
  • Build custom research tools using MCP and FastAPI
  • Prototype AI-powered document analysis applications

README

MCP Research Assistant Demo

Overview

This project demonstrates a research assistant application built using Model Context Protocol (MCP), LangChain, FastAPI, and Streamlit. It showcases how to create a modular AI system from MCP servers and LangChain. The demo searches and analyzes scientific papers, performs linguistic analysis on documents, and provides a user-friendly interface for interacting with these AI capabilities.

Demo recordings: search and open an article; parse and answer.

Architecture

This demo consists of three main components:

  1. MCP Servers (a minimal server sketch follows this list):
  • ArXiv Server: Provides tools for searching and retrieving scientific papers from ArXiv
  • DocLing Server: Offers document linguistics tools for analyzing and understanding text
  2. FastAPI Client Server:
  • Acts as the coordination layer between the MCP servers
  • Implements a research assistant agent that uses tools from both MCP servers
  • Exposes an API for interacting with the assistant
  3. Streamlit UI:
  • Provides a user-friendly web interface
  • Allows users to query the research assistant
  • Displays search results, paper analyses, and other outputs
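
For a sense of what one of these servers looks like, here is a minimal sketch of an ArXiv-style MCP server built with the Python MCP SDK's FastMCP helper. The tool name, the use of the arxiv package, and the SSE transport are illustrative assumptions rather than the repository's actual code:

from mcp.server.fastmcp import FastMCP
import arxiv  # third-party ArXiv API wrapper (assumed dependency)

mcp = FastMCP("arxiv")

@mcp.tool()
def search_papers(query: str, max_results: int = 5) -> list[dict]:
    """Search ArXiv and return basic metadata for the top matches."""
    search = arxiv.Search(query=query, max_results=max_results)
    return [
        {"title": r.title, "summary": r.summary, "pdf_url": r.pdf_url}
        for r in arxiv.Client().results(search)
    ]

if __name__ == "__main__":
    # Serve the tool over SSE so the client server can connect to it
    mcp.run(transport="sse")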

How It Works

  1. User submits a research question or request through the Streamlit UI
  2. The request is sent to the FastAPI client server
  3. The client server uses the MultiServerMCPClient to coordinate with multiple MCP servers (sketched after this list)
  4. Based on the request, the appropriate tools are invoked (ArXiv search, document analysis, etc.)
  5. Results are processed and returned to the Streamlit UI for display
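
Step 3 is the glue. Below is a minimal sketch of that coordination layer using recent versions of the langchain-mcp-adapters package, assuming SSE transport on the ports listed under Getting Started, an OpenAI model, and a LangGraph ReAct agent; the repository's actual implementation may differ:

import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

async def main():
    # Connect to both MCP servers and gather their tools into one list
    client = MultiServerMCPClient({
        "arxiv": {"url": "http://localhost:8000/sse", "transport": "sse"},
        "docling": {"url": "http://localhost:8001/sse", "transport": "sse"},
    })
    tools = await client.get_tools()

    # A ReAct-style agent picks the right tool (ArXiv search, document
    # analysis, ...) for each user request
    agent = create_react_agent(ChatOpenAI(model="gpt-4o-mini"), tools)
    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "Find recent papers on MCP"}]}
    )
    print(result["messages"][-1].content)

asyncio.run(main())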

Getting Started

Environment Setup
Copy the template environment file:

cp .env.template .env

Then edit the .env file with your API keys and configuration values.
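
The actual variable names are listed in .env.template. Purely as an illustration (these keys are assumptions, not the template's real contents), such a file typically holds LLM provider credentials:

OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key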

Running with Docker Compose
The easiest way to run the entire application is using Docker Compose:

docker-compose up -d

This will start all components:

  • ArXiv server on port 8000
  • DocLing server on port 8001
  • Client server on port 8002
  • Streamlit UI on port 8501

Then visit http://localhost:8501 to access the Streamlit UI.
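
You can also call the FastAPI client server on port 8002 directly. The endpoint path and payload below are hypothetical, so check the repository's FastAPI routes for the real ones:

import httpx

# Hypothetical endpoint; the real route is defined in the FastAPI client server
response = httpx.post(
    "http://localhost:8002/query",
    json={"question": "Summarize recent work on retrieval-augmented generation"},
    timeout=120.0,
)
response.raise_for_status()
print(response.json())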

mcp-demo FAQ

How do I set up the mcp-demo client?
Clone the repository, install dependencies, and run the Streamlit app as documented in the GitHub README.
What MCP servers does mcp-demo integrate with?
It integrates with ArXiv Server for paper retrieval and DocLing Server for document linguistic analysis.
Can I extend mcp-demo with additional MCP servers?
Yes, the client is modular and designed to incorporate new MCP servers easily.
What technologies does mcp-demo use?
It uses the MCP protocol, LangChain for chaining LLM calls and tools, FastAPI for the backend API, and Streamlit for the frontend UI.
Is mcp-demo suitable for production use?
It is primarily a demo and research prototype but can be extended for production with additional development.
How does mcp-demo handle real-time interaction?
It combines Streamlit's interactive UI with MCP's real-time feeding of tool results and context to the LLM.
Which LLM providers can be used with mcp-demo?
It supports OpenAI, Anthropic Claude, and Google Gemini models via LangChain integration.
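
With LangChain, the provider is a configuration detail. For example, recent LangChain versions expose init_chat_model for selecting a provider and model by name (whether the demo uses this exact helper is an assumption):

from langchain.chat_models import init_chat_model

# Any supported provider/model pair works, given the matching API key in .env
model = init_chat_model("claude-3-5-sonnet-latest", model_provider="anthropic")
print(model.invoke("Hello").content)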
How does mcp-demo manage multi-step reasoning?
By orchestrating multiple MCP servers and chaining their outputs through LangChain workflows.