perplexity-mcp

MCP.Pizza Chef: DaInfernalCoder

Perplexity MCP is a Model Context Protocol server designed for intelligent research and documentation assistance. It leverages Perplexity AI's specialized models with automatic query complexity detection to route requests to the most suitable model, ensuring optimal results. It supports search capabilities for every task, making it versatile for both simple lookups and complex reasoning workflows.

Use this MCP server to

  • Perform quick searches for straightforward information queries
  • Route complex multi-step research tasks to advanced AI models
  • Integrate AI-powered research assistance into documentation workflows
  • Automatically detect query complexity to optimize model usage
  • Combine search and reasoning tools for comprehensive answers
  • Support academic and technical research with AI-driven insights
  • Enhance knowledge bases with real-time AI-generated content

README

Perplexity MCP Server

An intelligent research assistant powered by Perplexity's specialized AI models. It features automatic query complexity detection to route each request to the most appropriate model for optimal results. Unlike the official Perplexity server, it provides search capabilities for every task.

Tools

Quick note: the Deep Research tool may time out in some clients (such as Cline) but not in others (such as Cursor) due to implementation differences; when that happens, the Reason tool makes up for it.

1. Search (Sonar Pro)

Quick search for simple queries and basic information lookup. Best for straightforward questions that need concise, direct answers.

const result = await use_mcp_tool({
  server_name: "perplexity",
  tool_name: "search",
  arguments: {
    query: "What is the capital of France?",
    force_model: false // Optional: set to true to force this model even if the query seems complex
  }
});
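
What comes back depends on the client, but as a minimal sketch, assuming the standard MCP tool-result shape (a content array of text parts, which is an assumption and not confirmed by this README), the answer could be extracted like this:

// Sketch only: assumes the standard MCP tool-result shape,
// i.e. a `content` array whose text parts hold the answer.
const answer = result.content
  .filter((part) => part.type === "text")
  .map((part) => part.text)
  .join("\n");

console.log(answer);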

2. Reason (Sonar Reasoning Pro)

Handles complex, multi-step tasks requiring detailed analysis. Perfect for explanations, comparisons, and problem-solving.

const result = await use_mcp_tool({
  server_name: "perplexity",
  tool_name: "reason",
  arguments: {
    query: "Compare and contrast REST and GraphQL APIs, explaining their pros and cons",
    force_model: false // Optional: set to true to force this model even if the query seems simple
  }
});

3. Deep Research (Sonar Deep Research)

Conducts comprehensive research and generates detailed reports. Ideal for in-depth analysis of complex topics.

const result = await use_mcp_tool({
  server_name: "perplexity",
  tool_name: "deep_research",
  arguments: {
    query: "The impact of quantum computing on cryptography",
    focus_areas: [
      "Post-quantum cryptographic algorithms",
      "Timeline for quantum threats",
      "Practical mitigation strategies"
    ],
    force_model: false // Optional: set to true to force this model even if the query seems simple
  }
});

Intelligent Model Selection

The server automatically analyzes query complexity to route requests to the most appropriate model:

  1. Simple Queries → Sonar Pro

    • Basic information lookup
    • Straightforward questions
    • Quick facts
  2. Complex Queries → Sonar Reasoning Pro

    • How/why questions
    • Comparisons
    • Step-by-step explanations
    • Problem-solving tasks
  3. Research Queries → Sonar Deep Research

    • In-depth analysis
    • Comprehensive research
    • Detailed investigations
    • Multi-faceted topics

You can override the automatic selection using force_model: true in any tool's arguments.
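
The detection logic itself lives inside the server; the sketch below is purely illustrative (the keyword heuristics and length threshold are assumptions, not the actual implementation), but it shows the kind of routing described above:

// Illustrative sketch only: a hypothetical complexity classifier.
// The real server's detection logic may differ entirely.
function pickModel(query) {
  const q = query.toLowerCase();

  // Research-style queries: explicitly asking for depth, or very long prompts
  if (/\b(research|in-depth|comprehensive|report)\b/.test(q) || q.split(/\s+/).length > 40) {
    return "Sonar Deep Research";
  }

  // Complex queries: how/why questions, comparisons, step-by-step explanations
  if (/\b(how|why|compare|contrast|explain|step[- ]by[- ]step)\b/.test(q)) {
    return "Sonar Reasoning Pro";
  }

  // Everything else: quick lookups and simple facts
  return "Sonar Pro";
}

Setting force_model: true in a tool's arguments bypasses this detection and keeps the request on that tool's own model:

const result = await use_mcp_tool({
  server_name: "perplexity",
  tool_name: "search",
  arguments: {
    query: "Compare REST and GraphQL APIs", // would normally route to reasoning
    force_model: true // stay on Sonar Pro regardless of detected complexity
  }
});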

Setup

  1. Prerequisites: Node.js and a Perplexity API key (used below as the PERPLEXITY_API_KEY environment variable)

  2. Configure MCP Settings

Add to your MCP settings file (location varies by platform):

{
  "mcpServers": {
    "perplexity": {
      "command": "node",
      "args": ["/path/to/perplexity-server/build/index.js"],
      "env": {
        "PERPLEXITY_API_KEY": "YOUR_API_KEY_HERE"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}

Or use NPX to avoid installing it locally (recommended for macOS):

{
  "mcpServers": {
    "perplexity": {
      "command": "npx",
      "args": [
        "-y",
        "perplexity-mcp"
      ],
      "env": {
        "PERPLEXITY_API_KEY": "your_api_key"
      }
    }
  }
}

perplexity-mcp FAQ

How does Perplexity MCP determine which AI model to use?
It uses automatic query complexity detection to route requests to the most appropriate Perplexity AI model for optimal results.

Can Perplexity MCP handle both simple and complex queries?
Yes, it supports quick searches for simple queries and advanced reasoning for complex, multi-step tasks.

Is Perplexity MCP limited to Perplexity AI models only?
While optimized for Perplexity AI, it can be integrated with other MCP-compatible LLM providers like OpenAI, Claude, and Gemini.

What happens if a query times out with one tool?
The Deep Research tool may time out in some MCP clients due to implementation differences; in that case, the Reason tool can handle the task instead.

How does Perplexity MCP improve documentation workflows?
By providing AI-powered research assistance that automatically selects the best model and search tool for each query.

Is Perplexity MCP suitable for academic research?
Yes, it is designed to support detailed and complex research tasks with AI-driven insights.

Can I customize the search and reasoning tools used by Perplexity MCP?
Yes, the server architecture allows configuring and extending tools to fit specific research needs.

Does Perplexity MCP support real-time interaction?
Yes, it provides real-time AI responses by dynamically routing queries to appropriate models and tools.