
Langflow-DOC-QA-SERVER

MCP.Pizza Chef: GongRzhe

Langflow-DOC-QA-SERVER is a TypeScript-based MCP server that provides a streamlined interface for querying documents using a Langflow backend. It exemplifies core MCP principles by enabling real-time document question answering through a customizable Langflow flow, supporting components like ChatInput, File Upload, and LLM integration. This server facilitates seamless document interaction workflows within MCP-enabled environments.

Use This MCP Server To

  • Query documents interactively using natural language
  • Integrate document Q&A into MCP-enabled applications
  • Test and demonstrate MCP server capabilities with Langflow
  • Build custom document Q&A workflows with a Langflow backend
  • Enable real-time document search and answer retrieval
  • Connect file uploads to Langflow for dynamic querying
  • Prototype document-based AI assistants using MCP

README

Langflow-DOC-QA-SERVER


Langflow Document Q&A Server MCP server

A Model Context Protocol server for document Q&A powered by Langflow

This is a TypeScript-based MCP server that implements a document Q&A system. It demonstrates core MCP concepts by providing a simple interface to query documents through a Langflow backend.

Prerequisites

1. Create Langflow Document Q&A Flow

  1. Open Langflow and create a new flow from the "Document Q&A" template
  2. Configure your flow with necessary components (ChatInput, File Upload, LLM, etc.)
  3. Save your flow


2. Get Flow API Endpoint

  1. Click the "API" button in the top right corner of Langflow
  2. Copy the API endpoint URL from the cURL command, for example: http://127.0.0.1:7860/api/v1/run/<flow-id>?stream=false
  3. Save this URL; it will be needed for the API_ENDPOINT configuration

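Before wiring the endpoint into the server, it can help to verify that it responds. The sketch below is illustrative only: the request body (input_value, input_type, output_type) mirrors the shape Langflow's generated cURL command typically uses, so adjust it to match the command shown in your own instance.

// Illustrative check of the Langflow run endpoint (Node 18+ for fetch, run as an ES module).
// The body shape is an assumption based on Langflow's typical cURL example.
const API_ENDPOINT = "http://127.0.0.1:7860/api/v1/run/<flow-id>?stream=false";

const response = await fetch(API_ENDPOINT, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    input_value: "What is this document about?",
    input_type: "chat",
    output_type: "chat",
  }),
});

console.log(JSON.stringify(await response.json(), null, 2));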

Features

Tools

  • query_docs - Query the document Q&A system
    • Takes a query string as input
    • Returns responses from the Langflow backend
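For illustration, an MCP client invokes this tool with a standard tools/call request like the one below; the argument key (query) is an assumption based on the tool description and may differ from the server's actual input schema.

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query_docs",
    "arguments": {
      "query": "What are the key points in the uploaded document?"
    }
  }
}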

Development

Install dependencies:

npm install

Build the server:

npm run build

For development with auto-rebuild:

npm run watch

Installation

To use with Claude Desktop, add the server config:

On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

{
  "mcpServers": {
    "langflow-doc-qa-server": {
      "command": "node",
      "args": [
        "/path/to/doc-qa-server/build/index.js"
      ],
      "env": {
        "API_ENDPOINT": "http://127.0.0.1:7860/api/v1/run/480ec7b3-29d2-4caa-b03b-e74118f35fac"
      }
    }
  }
}


Installing via Smithery

To install Document Q&A Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @GongRzhe/Langflow-DOC-QA-SERVER --client claude

Environment Variables

The server supports the following environment variables for configuration:

  • API_ENDPOINT: The endpoint URL for the Langflow API service. Defaults to http://127.0.0.1:7860/api/v1/run/480ec7b3-29d2-4caa-b03b-e74118f35fac if not specified.
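As a rough sketch (not the repository's exact code), the server can read this variable and fall back to the documented default when it is unset:

// Hypothetical sketch of how API_ENDPOINT might be consumed in the server.
const DEFAULT_ENDPOINT =
  "http://127.0.0.1:7860/api/v1/run/480ec7b3-29d2-4caa-b03b-e74118f35fac";

const apiEndpoint: string = process.env.API_ENDPOINT ?? DEFAULT_ENDPOINT;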

Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:

npm run inspector

The Inspector will provide a URL to access debugging tools in your browser.

📜 License

This project is licensed under the MIT License.

Langflow-DOC-QA-SERVER FAQ

How do I set up the Langflow-DOC-QA-SERVER?
Create a Langflow Document Q&A flow, configure components like ChatInput and File Upload, then connect the server to this flow.
What programming language is the server implemented in?
The server is implemented in TypeScript, ensuring strong typing and modern JavaScript features.
Can I customize the Langflow flow used by this server?
Yes, you can modify the Langflow Document Q&A flow to add or adjust components as needed.
Does this server support multiple document formats?
Support depends on the Langflow flow configuration; typically, common text-based document formats are supported.
How does this server demonstrate core MCP concepts?
It provides a simple, structured interface for document querying, showcasing real-time context feeding and model interaction via MCP.
Is this server compatible with different LLM providers?
Yes, it can work with various LLMs supported by Langflow, including OpenAI, Anthropic Claude, and Google Gemini.
What are the prerequisites before running this server?
You need to create and configure a Langflow Document Q&A flow and have the necessary environment to run the TypeScript server.
Can this server be extended for other document-related tasks?
Yes, by modifying the Langflow flow and server code, you can extend it for summarization, extraction, or other document AI tasks.