
mcp-perplexity-search

MCP.Pizza Chef: spences10

mcp-perplexity-search is an MCP server that integrates Perplexity's AI API with large language models, offering advanced chat completion capabilities. It includes specialized prompt templates for technical documentation, security analysis, code review, and structured API documentation, enabling tailored AI interactions. Though no longer maintained, its functionality is now part of mcp-omnisearch, a unified MCP tool package.

Use this MCP server to:

  • Generate technical documentation using AI-driven chat completions
  • Analyze security best practices with specialized prompt templates
  • Review and improve code through AI-assisted suggestions
  • Create structured API documentation automatically
  • Customize prompt templates for specific domain needs
  • Integrate Perplexity AI chat completions into MCP workflows

README

mcp-perplexity-search


⚠️ Notice

This repository is no longer maintained.

The functionality of this tool is now available in mcp-omnisearch, which combines multiple MCP tools in one unified package.

Please use mcp-omnisearch instead.


A Model Context Protocol (MCP) server for integrating Perplexity's AI API with LLMs. This server provides advanced chat completion capabilities with specialized prompt templates for various use cases.

Features

  • 🤖 Advanced chat completion using Perplexity's AI models
  • 📝 Predefined prompt templates for common scenarios:
    • Technical documentation generation
    • Security best practices analysis
    • Code review and improvements
    • API documentation in structured format
  • 🎯 Custom template support for specialized use cases
  • 📊 Multiple output formats (text, markdown, JSON)
  • 🔍 Optional source URL inclusion in responses
  • ⚙️ Configurable model parameters (temperature, max tokens)
  • 🚀 Support for various Perplexity models including Sonar and LLaMA

Configuration

This server requires configuration through your MCP client. Here are examples for different environments:

Cline Configuration

Add this to your Cline MCP settings:

{
	"mcpServers": {
		"mcp-perplexity-search": {
			"command": "npx",
			"args": ["-y", "mcp-perplexity-search"],
			"env": {
				"PERPLEXITY_API_KEY": "your-perplexity-api-key"
			}
		}
	}
}

Claude Desktop with WSL Configuration

For WSL environments, add this to your Claude Desktop configuration:

{
	"mcpServers": {
		"mcp-perplexity-search": {
			"command": "wsl.exe",
			"args": [
				"bash",
				"-c",
				"source ~/.nvm/nvm.sh && PERPLEXITY_API_KEY=your-perplexity-api-key /home/username/.nvm/versions/node/v20.12.1/bin/npx mcp-perplexity-search"
			]
		}
	}
}

Environment Variables

The server requires the following environment variable:

  • PERPLEXITY_API_KEY: Your Perplexity API key (required)

API

The server implements a single MCP tool with configurable parameters:

chat_completion

Generate chat completions using the Perplexity API with support for specialized prompt templates.

Parameters:

  • messages (array, required): Array of message objects with:
    • role (string): 'system', 'user', or 'assistant'
    • content (string): The message content
  • prompt_template (string, optional): Predefined template to use:
    • technical_docs: Technical documentation with code examples
    • security_practices: Security implementation guidelines
    • code_review: Code analysis and improvements
    • api_docs: API documentation in JSON format
  • custom_template (object, optional): Custom prompt template with:
    • system (string): System message for assistant behaviour
    • format (string): Output format preference
    • include_sources (boolean): Whether to include sources
  • format (string, optional): 'text', 'markdown', or 'json' (default: 'text')
  • include_sources (boolean, optional): Include source URLs (default: false)
  • model (string, optional): Perplexity model to use (default: 'sonar')
  • temperature (number, optional): Output randomness (0-1, default: 0.7)
  • max_tokens (number, optional): Maximum response length (default: 1024)
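As an illustration, a `chat_completion` tool call combining these parameters might look like the following (the message contents and argument values here are hypothetical; only the parameter names and defaults come from the list above):

```json
{
	"messages": [
		{ "role": "system", "content": "You are a senior code reviewer." },
		{ "role": "user", "content": "Review this function for error handling issues." }
	],
	"prompt_template": "code_review",
	"format": "markdown",
	"include_sources": true,
	"model": "sonar",
	"temperature": 0.2,
	"max_tokens": 512
}
```

A lower `temperature` such as 0.2 is a reasonable choice for code review, where deterministic, focused output is usually preferable to creative variation.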

Development

Setup

  1. Clone the repository
  2. Install dependencies:
pnpm install
  3. Build the project:
pnpm build
  4. Run in development mode:
pnpm dev

Publishing

The project uses changesets for version management. To publish:

  1. Create a changeset:
pnpm changeset
  2. Version the package:
pnpm changeset version
  3. Publish to npm:
pnpm release

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT License - see the LICENSE file for details.

mcp-perplexity-search FAQ

How do I install mcp-perplexity-search?
You can run it directly with npx, as shown in the configuration examples above, or clone the GitHub repository and follow the setup instructions in the README. Either way, you need a Perplexity API key.
Is mcp-perplexity-search still maintained?
No, this repository is no longer maintained. Its features are now included in the mcp-omnisearch package.
Can I customize the prompt templates?
Yes, mcp-perplexity-search supports custom prompt templates to tailor AI responses to your specific use cases.
What kind of AI models does it use?
It uses Perplexity's AI models for chat completions, but MCP supports integration with other providers like OpenAI, Claude, and Gemini.
How does mcp-perplexity-search handle security-related tasks?
It includes predefined prompt templates designed to analyze and provide insights on security best practices.
Can I integrate this server with other MCP clients?
Yes, as an MCP server, it can be integrated with any MCP client that supports the protocol.
What should I use instead of mcp-perplexity-search now?
The recommended replacement is mcp-omnisearch, which combines multiple MCP tools including this server's functionality.