mcp-intercom

MCP.Pizza Chef: fabian1710

The mcp-intercom server is a Model Context Protocol (MCP) server designed to integrate Intercom chat data with large language models (LLMs). It provides structured access to Intercom conversations, allowing LLMs to query and analyze chat histories using filters such as date ranges, customer IDs, and conversation states. The server securely connects to Intercom via API keys and exposes rich conversation metadata including contact details, response statistics, and conversation priorities. This enables AI-powered workflows to leverage customer support data for insights, automation, and enhanced interaction capabilities.

Use This MCP Server To

  • Query Intercom conversations by date or customer ID
  • Analyze chat response statistics for support optimization
  • Filter conversations by state or priority for triage
  • Integrate Intercom chat data into AI-driven workflows
  • Enable LLMs to summarize or extract insights from chats

README

MCP Intercom Server

A Model Context Protocol (MCP) server that provides access to Intercom conversations and chats. This server allows LLMs to query and analyze your Intercom conversations with various filtering options.

Features

  • Query Intercom conversations with filtering options:
    • Date range (start and end dates)
    • Customer ID
    • Conversation state
  • Secure access using your Intercom API key
  • Rich conversation data including:
    • Basic conversation details
    • Contact information
    • Statistics (responses, reopens)
    • State and priority information
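
The exact fields returned depend on the server's implementation, but as a rough sketch the conversation records it surfaces can be modeled on Intercom's public conversation object, for example:

// Illustrative sketch only: an approximate shape for the conversation data
// the server can surface, loosely following Intercom's conversation model.
// Field names here are assumptions, not taken from this repository's code.
interface ConversationSummary {
  id: string;                        // Intercom conversation ID
  createdAt: number;                 // UNIX timestamp (seconds)
  updatedAt: number;                 // UNIX timestamp (seconds)
  state: "open" | "closed" | "snoozed";
  priority: "priority" | "not_priority";
  sourceType?: string;               // e.g. "email", "chat"
  contact?: { id: string; name?: string; email?: string };
  statistics?: {
    countReopens?: number;           // how many times the conversation was reopened
    countResponses?: number;         // rough measure of back-and-forth volume
  };
}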

Installation

  1. Clone the repository:
git clone https://github.com/fabian1710/mcp-intercom.git
cd mcp-intercom
  2. Install dependencies:
npm install
  3. Set up your environment:
cp .env.example .env
  4. Add your Intercom API key to .env:
INTERCOM_API_KEY=your_api_key_here
  5. Build the server:
npm run build
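
How the built server picks up the key is an implementation detail, but a typical Node/TypeScript setup loads .env at startup and fails fast if the key is missing. A minimal sketch, assuming the dotenv package is used (not necessarily this repository's actual bootstrap code):

// Hypothetical startup check, assuming dotenv loads the .env file.
import "dotenv/config";

const apiKey = process.env.INTERCOM_API_KEY;
if (!apiKey) {
  // Fail fast so a missing key is caught before any Intercom request is made.
  throw new Error("INTERCOM_API_KEY is not set; add it to your .env file");
}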

Usage

Running the Server

Start the server:

npm start

Using with Claude for Desktop

  1. Add the server to your Claude for Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS or %AppData%\Claude\claude_desktop_config.json on Windows):
{
  "mcpServers": {
    "intercom": {
      "command": "node",
      "args": ["/path/to/mcp-intercom/dist/index.js"],
      "env": {
        "INTERCOM_API_KEY": "your_api_key_here"
      }
    }
  }
}
  2. Restart Claude for Desktop

Available Tools

search-conversations

Searches Intercom conversations with optional filters.

Parameters:

  • createdAt (optional): Object with operator (e.g., ">", "<", "=") and value (UNIX timestamp) for filtering by creation date.
  • updatedAt (optional): Object with operator (e.g., ">", "<", "=") and value (UNIX timestamp) for filtering by update date.
  • sourceType (optional): Source type of the conversation (e.g., "email", "chat").
  • state (optional): Conversation state to filter by (e.g., "open", "closed").
  • open (optional): Boolean to filter by open status.
  • read (optional): Boolean to filter by read status.
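
As a concrete illustration, the argument object an MCP client might send to search-conversations could look like the sketch below; the values are hypothetical, and 1704067200 is January 1, 2024 (UTC) expressed as a UNIX timestamp in seconds.

// Hypothetical arguments for the search-conversations tool.
const searchArgs = {
  createdAt: {
    operator: ">",
    value: 1704067200,  // 2024-01-01T00:00:00Z as a UNIX timestamp (seconds)
  },
  state: "open",        // only open conversations
  sourceType: "email",  // restrict to email conversations
  read: false,          // unread only
};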

Example queries:

  • "Search for all conversations created after January 1, 2024"
  • "Find conversations updated before last week"
  • "List all open email conversations"
  • "Get all unread conversations"

Security

  • The server requires an Intercom API key to function
  • API key should be stored securely in environment variables
  • The server only provides read access to conversations
  • All API requests are made with proper authentication
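
In practice, authenticated access means attaching the API key as a bearer token on requests to Intercom's REST API. The snippet below is a minimal sketch of such a read-only call, assuming the key is available in the environment; it is not this server's actual code.

// Minimal sketch of an authenticated, read-only request to Intercom's REST API.
async function listConversations(): Promise<unknown> {
  const response = await fetch("https://api.intercom.io/conversations", {
    method: "GET", // read-only: no create or update calls are made
    headers: {
      Authorization: `Bearer ${process.env.INTERCOM_API_KEY}`,
      Accept: "application/json",
    },
  });
  if (!response.ok) {
    throw new Error(`Intercom request failed: ${response.status}`);
  }
  return response.json();
}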

Development

  1. Start development mode with auto-recompilation:
npm run dev
  2. Run linting:
npm run lint

Contributing

  1. Fork the repository
  2. Create a new branch for your feature
  3. Make your changes
  4. Submit a pull request

License

MIT

mcp-intercom FAQ

How do I securely connect the mcp-intercom server to my Intercom account?
You provide your Intercom API key in the server's .env configuration file, ensuring secure authenticated access.
What filtering options does mcp-intercom support for querying conversations?
It supports filtering by date range, customer ID, and conversation state to refine query results.
Can mcp-intercom provide detailed metadata about conversations?
Yes, it exposes conversation details including contact info, response counts, reopen statistics, and priority states.
How do I install and run the mcp-intercom server?
Clone the repository, install dependencies with npm, configure your API key in .env, then build and start the server using npm scripts.
Is mcp-intercom compatible with multiple LLM providers?
Yes, it works with any MCP-compatible client, so it can be used alongside LLMs from providers such as OpenAI, Anthropic (Claude), and Google (Gemini).
Can I use mcp-intercom to automate customer support workflows?
Absolutely, by enabling LLMs to access and analyze chat data, you can build AI-driven automation and insights.
Does mcp-intercom support real-time conversation updates?
The server primarily provides query access to stored conversations; real-time updates depend on Intercom's API capabilities.
What programming environment is required to run mcp-intercom?
It requires Node.js and npm for installation and running the server.