
mcp-semantic-scholar-server

MCP.Pizza Chef: benhaotang

The mcp-semantic-scholar-server is an MCP server that integrates the Semantic Scholar API to enable searching for academic papers within AI-enhanced workflows. Built with the mcp-python-sdk, it allows models to query and retrieve structured research paper data in real time. This server supports seamless installation and configuration with popular LLM hosts like Claude, facilitating research, literature review, and academic assistance directly through model interactions. It replaces deprecated FastMCP versions and requires Python dependencies installed via pip.

Use This MCP server To

  • Search academic papers via the Semantic Scholar API
  • Integrate research paper lookup in AI assistants
  • Enable real-time literature review in workflows
  • Provide citation and paper metadata retrieval
  • Support academic research automation
  • Enhance LLMs with scholarly knowledge access

README

Semantic Scholar API MCP server

Made with mcp-python-sdk

Important

If you are still using the FastMCP version of this MCP server, please pull this repo again and update to a newer version, as FastMCP is deprecated.

Usage

Requirements: pip install -r requirements.txt

Run mcp dev path/to/semantic-scholar-plugin.py to start the server in development mode.

Run mcp install path/to/semantic-scholar-plugin.py to install it into Claude, or add the following to your Claude/cline config:

"semantic-scholar": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mcp",
        "mcp",
        "run",
        "/path/to/semantic-scholar-plugin.py"
      ]
    }
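For reference, in Claude Desktop this snippet goes inside the "mcpServers" object of the claude_desktop_config.json file (cline keeps its MCP settings in a separate file, but the structure of each server entry is the same). A minimal sketch of the full file:

```json
{
  "mcpServers": {
    "semantic-scholar": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mcp",
        "mcp",
        "run",
        "/path/to/semantic-scholar-plugin.py"
      ]
    }
  }
}
```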

Note

Currently, using uv with mcp seems to break certain Linux/macOS versions of Claude-desktop; you may need to configure it as:

"semantic-scholar": {
      "command": "/path/to/mcp",
      "args": [
        "run",
        "/path/to/semantic-scholar-plugin.py"
      ]
    }

instead, where /path/to/mcp is the path printed by running which mcp in a terminal.

API Key

To use the Semantic Scholar API with higher rate limits, you can set your API key as an environment variable:

export SEMANTIC_SCHOLAR_API_KEY="your_api_key"

or set it by adding an env key to the mcp settings:

"semantic-scholar": {
      "command": ...,
      "args": ...,
      "env": {
        "SEMANTIC_SCHOLAR_API_KEY": "your_api_key"
      }
}

You can get an API key by filling out the form at: https://www.semanticscholar.org/product/api
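To illustrate what the API key buys you, here is a minimal sketch of the kind of request this server makes. The endpoint URL and the "x-api-key" header come from the public Semantic Scholar API documentation; the specific fields requested here are illustrative and may differ from what this server actually asks for. The helper only builds the request so nothing is sent over the network:

```python
import os
import urllib.parse

# Public Semantic Scholar Graph API paper-search endpoint.
API_URL = "https://api.semanticscholar.org/graph/v1/paper/search"

def build_search_request(query, limit=5, api_key=None):
    """Return (url, headers) for a paper search without sending it."""
    params = {
        "query": query,
        "limit": limit,
        # Illustrative field list; the server's actual fields may differ.
        "fields": "title,year,abstract,citationCount",
    }
    url = API_URL + "?" + urllib.parse.urlencode(params)
    # An API key raises the rate limit; anonymous access also works.
    headers = {"x-api-key": api_key} if api_key else {}
    return url, headers

url, headers = build_search_request(
    "transformer architectures",
    api_key=os.getenv("SEMANTIC_SCHOLAR_API_KEY"),
)
print(url)
```

Without the environment variable set, the request simply goes out unauthenticated at the lower anonymous rate limit.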

Suggested Agent System prompt

See: benhaotang/my_agent_system_prompt, the AI pre-research agent that can make full use of this mcp server.

Known issues

  • If you see messages like INFO Processing request of type __init__.py:431 ListToolsRequest in cline, you can safely ignore them; they do not affect functionality. This happens because cline parses the tool list together with console debug output, and the current python-sdk cannot disable console messages. Function calling works normally despite this warning.

mcp-semantic-scholar-server FAQ

How do I install the mcp-semantic-scholar-server?
Install required Python packages with 'pip install -r requirements.txt' and run the server using 'mcp dev path/to/semantic-scholar-plugin.py'.
How do I integrate this server with Claude or other LLM hosts?
Use 'mcp install path/to/semantic-scholar-plugin.py' or add the server configuration to your Claude or cline config JSON as documented.
Is the mcp-semantic-scholar-server compatible with FastMCP?
No, FastMCP is deprecated. Please update to the latest version of this server using the mcp-python-sdk.
Are there any known issues with running this server on Linux or macOS?
Yes, using 'uv' with 'mcp' may cause issues on some Linux/macOS versions of Claude-desktop. Adjust the command configuration as noted in the documentation.
What dependencies are required to run this MCP server?
The server requires Python dependencies listed in 'requirements.txt', installed via pip.
Can this server be used with LLM providers other than Claude?
Yes, it can be integrated with any MCP-compatible LLM host, including OpenAI and Gemini, by configuring the server accordingly.
How does this server enhance AI workflows?
It enables real-time access to scholarly papers and metadata, allowing AI models to assist with research, citation, and literature review tasks.