langgraph-mcp-pipeline

MCP.Pizza Chef: lalanikarim

langgraph-mcp-pipeline is an MCP client demonstrating AI image generation workflows by integrating LangGraph with MCP. It generates prompts and AI images based on topics, incorporating Human-in-the-Loop interaction for user feedback. The client leverages LangGraph's Functional and Graph APIs and integrates with Open WebUI Pipelines, using the Comfy MCP Server for image generation. It showcases practical AI workflow orchestration combining prompt creation and image synthesis.

Use This MCP Client To

  • Generate AI image prompts from user topics
  • Create AI-generated images via MCP and LangGraph
  • Incorporate Human-in-the-Loop feedback in workflows
  • Demonstrate LangGraph Functional API usage with MCP
  • Integrate AI image generation in Open WebUI Pipelines
  • Orchestrate multi-step AI workflows combining prompts and images

README

AI Image Generation Pipeline with LangGraph and MCP

This project demonstrates the use of the Model Context Protocol (MCP) with LangGraph to create workflows that generate prompts and AI-generated images based on a given topic. The project consists of three main files: app.py, graph.py, and ai-image-gen-pipeline.py. Each file showcases different aspects of using MCP with LangGraph, including the LangGraph Functional API, Graph API, and integration within Open WebUI Pipelines. These scripts utilize the Comfy MCP Server to generate AI image prompts and AI images.

Files

app.py

This script demonstrates the use of the LangGraph Functional API along with Human-in-the-Loop (HIL) interaction to generate prompts and AI-generated images based on a given topic. The workflow includes user feedback to approve generated prompts before generating the corresponding image. A minimal sketch of how these components might fit together follows the list below.

Key Components:

  • Dependencies: aiosqlite, langgraph, langgraph-checkpoint-sqlite, mcp[cli].
  • Functions:
    • run_tool(tool: str, args: dict) -> str: Runs a tool using the MCP server.
    • generate_prompt(topic: str) -> str: Generates a prompt for a given topic.
    • generate_image(prompt: str) -> str: Generates an image based on a given prompt.
    • get_feedback(topic: str, prompt: str) -> str: Collects user feedback on the generated prompt.
    • workflow_func(saver): Defines the workflow function with checkpointing.
  • Main Function:
    • Parses command-line arguments to get the thread ID and, optionally, the topic and feedback.
    • Initializes the workflow and runs it based on the provided input.
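
A minimal sketch of how these components might fit together, using the MCP Python SDK's stdio client and LangGraph's Functional API. The server launch command and the tool names (generate_prompt, generate_image) are illustrative assumptions, and a MemorySaver stands in for the SQLite checkpointer that workflow_func(saver) wires in:

    import asyncio

    from langgraph.checkpoint.memory import MemorySaver
    from langgraph.func import entrypoint, task
    from langgraph.types import interrupt
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client


    async def _call_mcp_tool(tool: str, args: dict) -> str:
        # Launch the MCP server over stdio and invoke a single tool.
        # "uvx comfy-mcp-server" is a placeholder launch command.
        server = StdioServerParameters(command="uvx", args=["comfy-mcp-server"])
        async with stdio_client(server) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                result = await session.call_tool(tool, arguments=args)
                return result.content[0].text


    def run_tool(tool: str, args: dict) -> str:
        return asyncio.run(_call_mcp_tool(tool, args))


    @task
    def generate_prompt(topic: str) -> str:
        return run_tool("generate_prompt", {"topic": topic})


    @task
    def generate_image(prompt: str) -> str:
        return run_tool("generate_image", {"prompt": prompt})


    @entrypoint(checkpointer=MemorySaver())
    def workflow(topic: str) -> str:
        prompt = generate_prompt(topic).result()
        # interrupt() pauses the run until the user resumes it with feedback.
        feedback = interrupt({"prompt": prompt})
        if str(feedback).lower().startswith("y"):
            return generate_image(prompt).result()
        return "Prompt rejected."

A paused run is resumed by invoking the workflow again with Command(resume="y") from langgraph.types and the same thread ID in the config, which is the role the feedback argument plays here.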

graph.py

This script demonstrates the use of the LangGraph Graph API along with Human-in-the-Loop (HIL) interaction to generate prompts and AI-generated images based on a given topic. The workflow includes user feedback to approve generated prompts before generating the corresponding image. A sketch of the corresponding graph wiring follows the list below.

Key Components:

  • Dependencies: aiosqlite, langgraph, langgraph-checkpoint-sqlite, mcp[cli].
  • Functions:
    • run_tool(tool: str, args: dict) -> str: Runs a tool using the MCP server.
    • generate_prompt(state: State) -> State: Generates a prompt for a given topic and updates the state.
    • generate_image(state: State) -> State: Generates an image based on a given prompt and updates the state.
    • prompt_feedback(state: State) -> State: Collects user feedback on the generated prompt.
    • process_feedback(state: State) -> str: Processes the user feedback to determine the next step in the workflow.
  • Main Function:
    • Parses command-line arguments to get the thread ID, topic, and feedback.
    • Initializes the state graph and runs it based on the provided input.
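
A minimal Graph API counterpart, reusing the run_tool helper sketched above; the node names follow the function list, and the conditional edge routes on the y/n feedback (the routing targets are assumptions):

    from typing import TypedDict

    from langgraph.checkpoint.memory import MemorySaver
    from langgraph.graph import END, START, StateGraph
    from langgraph.types import interrupt


    class State(TypedDict, total=False):
        topic: str
        prompt: str
        feedback: str
        image: str


    def generate_prompt(state: State) -> State:
        return {"prompt": run_tool("generate_prompt", {"topic": state["topic"]})}


    def prompt_feedback(state: State) -> State:
        # Pause the graph and wait for the user's y/n answer.
        return {"feedback": interrupt({"prompt": state["prompt"]})}


    def process_feedback(state: State) -> str:
        # Route to image generation on approval, otherwise regenerate the prompt.
        if str(state["feedback"]).lower().startswith("y"):
            return "generate_image"
        return "generate_prompt"


    def generate_image(state: State) -> State:
        return {"image": run_tool("generate_image", {"prompt": state["prompt"]})}


    builder = StateGraph(State)
    builder.add_node("generate_prompt", generate_prompt)
    builder.add_node("prompt_feedback", prompt_feedback)
    builder.add_node("generate_image", generate_image)
    builder.add_edge(START, "generate_prompt")
    builder.add_edge("generate_prompt", "prompt_feedback")
    builder.add_conditional_edges("prompt_feedback", process_feedback)
    builder.add_edge("generate_image", END)
    graph = builder.compile(checkpointer=MemorySaver())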

ai-image-gen-pipeline.py

This script demonstrates the integration of LangGraph API with Human-in-the-Loop (HIL) within Open WebUI Pipelines. It defines a pipeline for generating prompts and images using MCP, including nodes for generating prompts, processing feedback, and generating images. A skeleton of the Pipeline class follows the list below.

Key Components:

  • Dependencies: aiosqlite, langgraph, langgraph-checkpoint-sqlite, mcp[cli].
  • Classes:
    • Pipeline: Defines the pipeline with nodes for generating prompts, processing feedback, and generating images.
      • Valves(BaseModel): Contains environment variables for MCP server configuration.
  • Functions:
    • inlet(body: dict, user: dict) -> dict: Processes incoming messages.
    • outlet(body: dict, user: dict) -> dict: Processes outgoing messages.
    • pipe(user_message: str, model_id: str, messages: List[dict], body: dict) -> Union[str, Generator, Iterator]: Defines the main pipeline logic.
    • run_tool(tool: str, args: dict) -> str: Runs a tool using the MCP server.
    • generate_prompt(state: State) -> State: Generates a prompt for a given topic and updates the state.
    • generate_image(state: State) -> State: Generates an image based on a given prompt and updates the state.
    • prompt_feedback(state: State) -> State: Collects user feedback on the generated prompt.
    • process_feedback(state: State) -> str: Processes the user feedback to determine the next step in the workflow.
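
A skeleton of that class, following Open WebUI Pipelines conventions; the valve names mirror the Environment Variables section below, and the method bodies are placeholders rather than the repository's actual logic:

    import os
    from typing import Generator, Iterator, List, Union

    from pydantic import BaseModel


    class Pipeline:
        class Valves(BaseModel):
            # Comfy MCP Server settings, seeded from the environment.
            COMFY_URL: str = os.getenv("COMFY_URL", "")
            COMFY_URL_EXTERNAL: str = os.getenv("COMFY_URL_EXTERNAL", "")
            COMFY_WORKFLOW_JSON_FILE: str = os.getenv("COMFY_WORKFLOW_JSON_FILE", "")
            PROMPT_NODE_ID: str = os.getenv("PROMPT_NODE_ID", "")
            OUTPUT_NODE_ID: str = os.getenv("OUTPUT_NODE_ID", "")
            OLLAMA_API_BASE: str = os.getenv("OLLAMA_API_BASE", "")
            PROMPT_LLM: str = os.getenv("PROMPT_LLM", "")

        def __init__(self):
            self.name = "AI Image Gen Pipeline"
            self.valves = self.Valves()

        async def inlet(self, body: dict, user: dict) -> dict:
            # Inspect or normalize the incoming request before pipe() runs.
            return body

        async def outlet(self, body: dict, user: dict) -> dict:
            # Post-process the response, e.g. to attach the generated image.
            return body

        def pipe(
            self, user_message: str, model_id: str, messages: List[dict], body: dict
        ) -> Union[str, Generator, Iterator]:
            # Treat user_message as a new topic or as y/n feedback, then step
            # the LangGraph workflow accordingly (elided in this sketch).
            return f"Received: {user_message}"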

Usage

  1. Install Dependencies: Ensure you have the required dependencies installed.

    pip install aiosqlite langgraph langgraph-checkpoint-sqlite "mcp[cli]" comfy-mcp-server
  2. Run the Application:

    • For app.py:

      python app.py --topic "Your topic here"
    • For graph.py:

      python graph.py --thread_id "your-thread-id" --topic "Your topic here" 

      For feedback:

      python graph.py --thread_id "your-thread-id" --feedback "y/n" 
  3. Using the uv Utility: You can also launch app.py and graph.py with the uv utility, which manages the Python version and dependencies for you, so there is no need to preinstall them (see the inline metadata sketch after this list).

    • For app.py:

      uv run app.py --topic "Your topic here"
    • For graph.py:

      uv run graph.py --thread_id "your-thread-id" --topic "Your topic here" 

      For feedback:

      uv run graph.py --thread_id "your-thread-id" --feedback "y/n" 
  4. Environment Variables: Set the necessary environment variables for MCP server configuration.

    export COMFY_URL="comfy-url"
    export COMFY_URL_EXTERNAL="comfy-url-external"
    export COMFY_WORKFLOW_JSON_FILE="path-to-workflow-json-file"
    export PROMPT_NODE_ID="prompt-node-id"
    export OUTPUT_NODE_ID="output-node-id"
    export OLLAMA_API_BASE="ollama-api-base"
    export PROMPT_LLM="prompt-llm"
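
As noted in step 3, uv resolves a script's dependencies on the fly when they are declared as PEP 723 inline script metadata at the top of the file. A header along these lines (the exact entries and Python version are assumptions) is what lets uv run work without a prior pip install:

    # /// script
    # requires-python = ">=3.11"
    # dependencies = [
    #     "aiosqlite",
    #     "langgraph",
    #     "langgraph-checkpoint-sqlite",
    #     "mcp[cli]",
    #     "comfy-mcp-server",
    # ]
    # ///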

Contributing

Feel free to contribute to this project by submitting pull requests or issues. Ensure that any changes are well-documented and tested.

License

This project is licensed under the MIT License.

langgraph-mcp-pipeline FAQ

How does Human-in-the-Loop interaction work in this client?
It allows users to provide feedback during prompt generation to refine AI outputs dynamically.
What APIs from LangGraph does this client utilize?
It uses both the LangGraph Functional API and Graph API for building and managing AI workflows.
Can this client be integrated with other MCP servers?
Yes, while it uses the Comfy MCP Server by default, it can integrate with any compatible MCP server for image generation.
Is this client suitable for real-time AI image generation?
Yes, it supports interactive prompt generation and image creation workflows with user feedback in the loop, though end-to-end latency depends on the ComfyUI backend.
What programming languages and frameworks are used?
The client is implemented in Python and leverages LangGraph and MCP Python libraries.
Does this client support customization of AI prompts?
Yes, users can customize prompts interactively through the Human-in-the-Loop mechanism.
How does this client handle workflow orchestration?
It orchestrates multi-step workflows combining prompt generation, user feedback, and image synthesis using LangGraph and MCP.
Where can I find the source code and documentation?
The source code and documentation are available on the GitHub repository linked in the MCP entity details.