
interactive-mcp

MCP.Pizza Chef: ttommyth

interactive-mcp is a local, cross-platform MCP server designed to facilitate interactive prompts, chat, and notifications within LLM workflows. It enables real-time user interaction by asking questions and receiving input, enhancing dynamic conversations and task flows. Compatible with Windows, macOS, and Linux, it integrates seamlessly into MCP ecosystems to provide responsive, user-driven context and feedback.

Use this MCP server to:

  • Collect user input during LLM-driven workflows
  • Enable real-time interactive chat with users
  • Send notifications based on LLM events
  • Facilitate multi-turn conversations with user prompts
  • Integrate interactive prompts into AI-assisted applications
  • Support cross-platform user interaction in MCP environments

README

interactive-mcp


An MCP server implemented in Node.js/TypeScript that facilitates interactive communication between LLMs and users. Note: This server is designed to run locally alongside the MCP client (e.g., Claude Desktop, VS Code), as it needs direct access to the user's operating system to display notifications and command-line prompts.

(Note: This project is in its early stages.)

Want a quick overview? Check out the introductory blog post: Stop Your AI Assistant From Guessing — Introducing interactive-mcp

Demo Video

interactive-mcp MCP server

Tools

This server exposes the following tools via the Model Context Protocol (MCP):

  • request_user_input: Asks the user a question and returns their answer. Can display predefined options.
  • message_complete_notification: Sends a simple OS notification.
  • start_intensive_chat: Initiates a persistent command-line chat session.
  • ask_intensive_chat: Asks a question within an active intensive chat session.
  • stop_intensive_chat: Closes an active intensive chat session.
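
For illustration, here is a minimal sketch of how a generic MCP client could call request_user_input over stdio using the TypeScript MCP SDK. It is not taken from this project's codebase, and the argument names (projectName, message, predefinedOptions) are assumptions; check the tool schemas the server actually advertises.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch interactive-mcp locally over stdio, the same way an MCP client would.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "interactive-mcp"],
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} },
);
await client.connect(transport);

// Discover the tools the server advertises.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Ask the user a question with predefined options.
// NOTE: the argument names below are illustrative assumptions, not the
// server's confirmed schema.
const answer = await client.callTool({
  name: "request_user_input",
  arguments: {
    projectName: "my-project",
    message: "Which package manager should I use?",
    predefinedOptions: ["pnpm", "npm", "yarn"],
  },
});
console.log(answer.content);

await client.close();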

Demo

Here are demonstrations of the interactive features:

  • Normal question prompt
  • Completion notification
  • Starting an intensive chat session
  • Ending an intensive chat session

Usage Scenarios

This server is ideal for scenarios where an LLM needs to interact directly with the user on their local machine, such as:

  • Interactive setup or configuration processes.
  • Gathering feedback during code generation or modification.
  • Clarifying instructions or confirming actions in pair programming.
  • Any workflow requiring user input or confirmation during LLM operation.
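
For multi-step scenarios such as interactive setup, the intensive chat tools are meant to be chained: start a session, ask several questions, then close it. The sketch below reuses the connected client from the Tools section; the argument names (title, sessionId, question) are assumptions for illustration, not the server's confirmed schema.

// Start a persistent command-line chat session.
const started = await client.callTool({
  name: "start_intensive_chat",
  arguments: { title: "Project setup" },
});
console.log(started.content);

// The session id is returned in the tool result; its exact location in the
// payload is server-defined, so this placeholder stands in for the real value.
const sessionId = "session-id-from-start-result";

// Ask several questions within the same session.
for (const question of [
  "What should the project be called?",
  "Use TypeScript or JavaScript?",
  "Initialize a git repository?",
]) {
  const reply = await client.callTool({
    name: "ask_intensive_chat",
    arguments: { sessionId, question },
  });
  console.log(reply.content);
}

// Close the session when done.
await client.callTool({
  name: "stop_intensive_chat",
  arguments: { sessionId },
});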

Client Configuration

This section explains how to configure MCP clients to use the interactive-mcp server.

By default, user prompts will time out after 30 seconds. You can customize server options like timeout or disabled tools by adding command-line flags directly to the args array when configuring your client.

Please make sure you have the npx command available.

Usage with Claude Desktop / Cursor

Add the following minimal configuration to your claude_desktop_config.json (Claude Desktop) or mcp.json (Cursor):

{
  "mcpServers": {
    "interactive": {
      "command": "npx",
      "args": ["-y", "interactive-mcp"]
    }
  }
}

With a specific version:

{
  "mcpServers": {
    "interactive": {
      "command": "npx",
      "args": ["-y", "interactive-mcp@1.9.0"]
    }
  }
}

Example with Custom Timeout (30s):

{
  "mcpServers": {
    "interactive": {
      "command": "npx",
      "args": ["-y", "interactive-mcp", "-t", "30"]
    }
  }
}

Usage with VS Code

Add the following minimal configuration to your User Settings (JSON) file or .vscode/mcp.json:

{
  "mcp": {
    "servers": {
      "interactive-mcp": {
        "command": "npx",
        "args": ["-y", "interactive-mcp"]
      }
    }
  }
}

macOS Recommendations

For a smoother experience on macOS using the default Terminal.app, consider this profile setting:

  • (Shell Tab): Under "When the shell exits" (Terminal > Settings > Profiles > [Your Profile] > Shell), select "Close if the shell exited cleanly" or "Close the window". This helps manage windows when the MCP server starts and stops.

Development Setup

This section is primarily for developers looking to modify or contribute to the server. If you just want to use the server with an MCP client, see the "Client Configuration" section above.

Prerequisites

  • Node.js: Check package.json for version compatibility.
  • pnpm: Used for package management. Install via npm install -g pnpm after installing Node.js.

Installation (Developers)

  1. Clone the repository:

    git clone https://github.com/ttommyth/interactive-mcp.git
    cd interactive-mcp
  2. Install dependencies:

    pnpm install

Running the Application (Developers)

pnpm start

Command-Line Options

The interactive-mcp server accepts the following command-line options. These should typically be configured in your MCP client's JSON settings by adding them directly to the args array (see "Client Configuration" examples).

| Option | Alias | Description |
|--------|-------|-------------|
| --timeout | -t | Sets the default timeout (in seconds) for user input prompts. Defaults to 30 seconds. |
| --disable-tools | -d | Disables specific tools or groups (comma-separated list). Prevents the server from advertising or registering them. Options: request_user_input, message_complete_notification, intensive_chat. |

Example: Setting multiple options in the client config args array:

// Example combining options in client config's "args":
"args": [
  "-y", "interactive-mcp",
  "-t", "30", // Set timeout to 30 seconds
  "--disable-tools", "message_complete_notification,intensive_chat" // Disable notifications and intensive chat
]
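
When tools are disabled this way, the server simply stops advertising them, so an MCP client will not see them when listing tools. A quick check, reusing the connected client from the Tools section sketch and assuming the server was launched with the --disable-tools value shown above:

// With message_complete_notification and the intensive_chat group disabled,
// only request_user_input should appear here.
const { tools: remaining } = await client.listTools();
console.log(remaining.map((t) => t.name));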

Development Commands

  • Build: pnpm build
  • Lint: pnpm lint
  • Format: pnpm format

Guiding Principles for Interaction

When interacting with this MCP server (e.g., as an LLM client), please adhere to the following principles to ensure clarity and reduce unexpected changes:

  • Prioritize Interaction: Utilize the provided MCP tools (request_user_input, start_intensive_chat, etc.) frequently to engage with the user.
  • Seek Clarification: If requirements, instructions, or context are unclear, always ask clarifying questions before proceeding. Do not make assumptions.
  • Confirm Actions: Before performing significant actions (like modifying files, running complex commands, or making architectural decisions), confirm the plan with the user.
  • Provide Options: Whenever possible, present the user with predefined options through the MCP tools to facilitate quick decisions.

You can provide these instructions to an LLM client like this:

# Interaction

- Please use the interactive MCP tools
- Please provide options to interactive MCP if possible

# Reduce Unexpected Changes

- Do not make assumptions.
- Ask more questions before executing, until you think the requirements are clear enough.

Contributing

Contributions are welcome! Please follow standard development practices. (Further details can be added later).

License

MIT (see the LICENSE file for details).

interactive-mcp FAQ

How do I install interactive-mcp?
No separate installation is needed for normal use: configure your MCP client to launch it with npx (npx -y interactive-mcp) on Windows, macOS, or Linux. Developers can clone the repository and use pnpm.
Can interactive-mcp handle multi-turn conversations?
Yes; the intensive chat tools (start_intensive_chat, ask_intensive_chat, stop_intensive_chat) run a persistent command-line session for multi-turn conversations, and request_user_input handles one-off questions.
Is interactive-mcp compatible with different LLM providers?
Yes; it is model-agnostic. Because it speaks the Model Context Protocol, it works with any MCP-compatible client (such as Claude Desktop, Cursor, or VS Code), regardless of the underlying model provider.
Does interactive-mcp support notifications?
Yes; its message_complete_notification tool sends simple OS notifications, for example when the assistant finishes a task.
How does interactive-mcp ensure cross-platform compatibility?
It is built to run locally on Windows, macOS, and Linux environments seamlessly.
Can I customize the prompts and chat interface?
Yes; behavior such as the prompt timeout and which tools are enabled can be adjusted with command-line options in your client configuration.
Is interactive-mcp open source?
Yes, it is open source and available on GitHub under the MIT license.
How does interactive-mcp integrate with other MCP components?
It acts as a server exposing interactive prompt and chat functionality to MCP clients and hosts.