
gptr-mcp

Author: assafelovic

GPT Researcher MCP server enhances LLM applications by providing deep, structured research capabilities through the MCP protocol. Unlike standard search tools that return raw, often irrelevant results, this server filters and organizes information to maximize relevance and context efficiency. It empowers LLMs to perform comprehensive research workflows, improving accuracy and reducing manual effort in data gathering and analysis.

Use This MCP Server To

  • Provide LLMs with deep, filtered research data beyond simple web search
  • Enable structured retrieval of relevant academic and technical sources
  • Reduce noise and irrelevant information in LLM research workflows
  • Support multi-step reasoning with curated research context
  • Integrate with LLMs to automate literature review and data synthesis
  • Enhance knowledge discovery in specialized domains via MCP
  • Facilitate real-time research updates for dynamic content exploration

README


πŸ” GPT Researcher MCP Server

Website Documentation Discord Follow

Why GPT Researcher MCP?

While LLM apps can access web search tools with MCP, GPT Researcher MCP delivers deep research results. Standard search tools return raw results requiring manual filtering, often containing irrelevant sources and wasting context window space.

GPT Researcher autonomously explores and validates numerous sources, focusing only on relevant, trusted, and up-to-date information. Though slightly slower than standard search (typically around 30 seconds), it delivers:

  • ✨ Higher quality information
  • πŸ“Š Optimized context usage
  • πŸ”Ž Comprehensive results
  • 🧠 Better reasoning for LLMs

πŸ’» Claude Desktop Demo

(Demo video: gptr-mcp-demo.mp4)

Resources

  • research_resource: Get web resources related to a given task via research.

Primary Tools

  • deep_research: Performs deep web research on a topic, finding the most reliable and relevant information
  • quick_search: Performs a fast web search optimized for speed over quality, returning results with snippets. Supports any web retriever supported by GPT Researcher, such as Tavily, Bing, or Google; see the GPT Researcher documentation for the full list.
  • write_report: Generate a report based on research results
  • get_research_sources: Get the sources used in the research
  • get_research_context: Get the full context of the research
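Under the hood, an MCP client invokes these tools with JSON-RPC 2.0 `tools/call` requests. As a rough sketch, a request for `deep_research` might be shaped like this (the `query` argument name is an assumption about the tool's input schema, not confirmed by this README):

```python
import json

def tool_call_request(tool, arguments, request_id=1):
    """Build a JSON-RPC 2.0 'tools/call' message as sent by MCP clients."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

# Hypothetical invocation; the 'query' key is illustrative.
req = tool_call_request("deep_research", {"query": "NVIDIA stock outlook"})
print(json.dumps(req, indent=2))
```

In practice an MCP client library (or a host like Claude Desktop) constructs and sends these messages for you; the sketch only shows the wire shape.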

Prompts

  • research_query: Create a research query prompt

Prerequisites

Before running the MCP server, make sure you have:

  1. Python 3.10 or higher installed
  2. API keys for the services you plan to use:
    • OpenAI API key
    • Tavily API key

βš™οΈ Installation

  1. Clone the GPT Researcher repository:
git clone https://github.com/assafelovic/gpt-researcher.git
cd gpt-researcher
  2. Install the gptr-mcp dependencies:
cd gptr-mcp
pip install -r requirements.txt
  3. Set up your environment variables:
    • Copy the .env.example file to create a new file named .env:
    cp .env.example .env
    • Edit the .env file to add your API keys and configure other settings:
    OPENAI_API_KEY=your_openai_api_key
    TAVILY_API_KEY=your_tavily_api_key

You can also add any other env variable for your GPT Researcher configuration.

πŸš€ Running the MCP Server

You can start the MCP server in two ways:

Method 1: Directly using Python

python server.py

Method 2: Using the MCP CLI (if installed)

mcp run server.py

Once the server is running, you'll see output indicating that the server is ready to accept connections.

Integrating with Claude

You can integrate your MCP server with Claude using:

Claude Desktop Integration - for use with the Claude desktop application on macOS

For detailed instructions, follow the link above.

πŸ’» Claude Desktop Integration

To integrate your locally running MCP server with Claude for Mac, you'll need to:

  1. Make sure the MCP server is installed and running
  2. Configure Claude Desktop:
    • Locate or create the configuration file at ~/Library/Application Support/Claude/claude_desktop_config.json
    • Add your local GPT Researcher MCP server to the configuration
    • Restart Claude to apply the configuration

For complete step-by-step instructions, see the Claude Desktop Integration guide.
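As a minimal sketch, a claude_desktop_config.json entry might look like the following; the server name, paths, and key values are placeholders for your local setup:

```json
{
  "mcpServers": {
    "gpt-researcher": {
      "command": "python",
      "args": ["/absolute/path/to/gptr-mcp/server.py"],
      "env": {
        "OPENAI_API_KEY": "your_openai_api_key",
        "TAVILY_API_KEY": "your_tavily_api_key"
      }
    }
  }
}
```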

πŸ“ Example Usage with Claude

User: I'm considering investing in NVIDIA. Can you research the current state of the company, recent stock performance, latest product announcements, and analyst opinions?

Claude: I'll help you research NVIDIA for investment purposes. This will take a moment as I conduct comprehensive research to find the most up-to-date information about their stock performance, recent developments, and expert analysis.

[Claude uses the deep_research tool - this takes 30-40 seconds]

Claude: Based on my research, here's a comprehensive analysis of NVIDIA's current state:

## NVIDIA (NVDA) Current Status

### Recent Stock Performance
- Current price: $942.89 (as of my research)
- YTD performance: +90.4% 
- 52-week range: $460.30 - $974.00
- Market cap: ~$2.32 trillion

### Recent Key Developments
1. **Blackwell AI Platform**: At GTC 2024, NVIDIA unveiled the Blackwell architecture, their next-generation AI chip platform promising 4x performance improvement over Hopper. Major cloud providers (AWS, Google, Microsoft) have already committed to integrating Blackwell.

2. **Q1 FY2025 Earnings (May 2024)**: 
   - Revenue: $26.0 billion (+262% YoY)
   - Data Center revenue: $22.6 billion (+427% YoY)
   - Gross margin: 78.4%
   - Announced 10-for-1 stock split effective June 7, 2024

3. **Supply Chain Expansion**: Increased investments in manufacturing capacity with TSMC to address ongoing chip shortages for their high-demand AI GPUs.

### Analyst Consensus
- Strong Buy consensus with 37 analysts rating it "Buy," 5 "Hold," and 0 "Sell"
- Average price target: $1,042 (post-split adjusted: $104.20)
- Key bull thesis: Continued AI infrastructure spending and NVIDIA's dominant position in AI chips
- Key bear concerns: Valuation, potential competition from AMD, Intel, and custom silicon from tech giants

### Industry Position
- Dominant in AI GPU market with estimated 80-90% market share
- Expanding ecosystem through CUDA software platform and industry partnerships
- Diversifying beyond chips with DGX Cloud, AI software platforms, and automotive solutions

πŸ”§ Troubleshooting

If you encounter issues while running the MCP server:

  1. Make sure your API keys are correctly set in the .env file
  2. Check that you're using Python 3.10 or higher
  3. Ensure all dependencies are installed correctly
  4. Check the server logs for error messages
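For the first check, a small sketch like this can confirm the required keys are visible to your Python environment (the variable names follow the .env example above):

```python
import os

REQUIRED_KEYS = ("OPENAI_API_KEY", "TAVILY_API_KEY")

def missing_keys(env=None):
    """Return the required API keys that are absent or empty."""
    env = os.environ if env is None else env
    return [key for key in REQUIRED_KEYS if not env.get(key)]

if __name__ == "__main__":
    absent = missing_keys()
    if absent:
        print("Missing keys:", ", ".join(absent))
    else:
        print("All required API keys are set.")
```

Run it from the gptr-mcp directory after loading your .env (for example via your shell or python-dotenv) to verify the keys are actually exported.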

πŸ‘£ Next Steps

  • Explore the MCP protocol documentation to better understand how to integrate with Claude
  • Learn about GPT Researcher's core features to enhance your research capabilities
  • Check out the Advanced Usage guide for more configuration options

πŸ“„ License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ“ž Support / Contact


gptr-mcp FAQ

How does GPT Researcher MCP improve research results compared to standard search tools?
It filters and structures search results to provide more relevant, concise, and context-efficient data for LLMs, reducing noise and manual filtering.
Can GPT Researcher MCP integrate with multiple LLM providers?
Yes, it is designed to work with various LLMs including OpenAI, Anthropic Claude, and Google Gemini via the MCP protocol.
What types of data sources does GPT Researcher MCP support?
It supports diverse data sources such as academic papers, technical documents, web content, and other structured repositories accessible via MCP servers.
Is GPT Researcher MCP suitable for real-time research applications?
Yes, it supports real-time updates and dynamic content exploration to keep research context current.
How secure is the data handled by GPT Researcher MCP?
The server follows MCP principles for secure, scoped, and observable interactions, ensuring data privacy and controlled access.
What setup is required to deploy GPT Researcher MCP?
Deployment involves running the MCP server component and configuring it to connect with desired data sources and LLM clients via MCP protocol.
Does GPT Researcher MCP support multi-step reasoning workflows?
Yes, it enables LLMs to carry out complex, multi-step research and reasoning tasks by providing structured context and tools.
Can GPT Researcher MCP be customized for specific research domains?
Yes, it can be tailored to focus on particular domains by integrating relevant data sources and filtering criteria.