mcp-wolframalpha

MCP.Pizza Chef: ricocf

mcp-wolframalpha is a Python-based MCP server that connects chat applications to the Wolfram Alpha API, enabling real-time computational and data queries. It supports multi-client interactions, modular extensions, and includes an example client using Gemini via LangChain. This server facilitates advanced conversational AI by providing structured knowledge and computational capabilities through a user-friendly interface.

Use This MCP Server To

  • Perform real-time computational queries in chat applications

  • Retrieve structured scientific and mathematical knowledge

  • Integrate Wolfram Alpha data into AI conversational workflows

  • Enable multi-client access to Wolfram Alpha via MCP

  • Extend the MCP server to support additional APIs and features

  • Use the example client to connect the Gemini LLM with Wolfram Alpha

  • Provide a web UI for interactive Wolfram Alpha queries

README

MCP Wolfram Alpha (Server + Client)

Seamlessly integrate Wolfram Alpha into your chat applications.

This project implements an MCP (Model Context Protocol) server designed to interface with the Wolfram Alpha API. It enables chat-based applications to perform computational queries and retrieve structured knowledge, facilitating advanced conversational capabilities.

Included is an MCP-Client example utilizing Gemini via LangChain, demonstrating how to connect large language models to the MCP server for real-time interactions with Wolfram Alpha’s knowledge engine.


Features

  • Wolfram|Alpha Integration: math, science, and data queries.

  • Modular Architecture: easily extendable to support additional APIs and functionalities.

  • Multi-Client Support: seamlessly handles interactions from multiple clients or interfaces.

  • MCP-Client Example: connects Gemini to the server via LangChain.

  • UI Support: a Gradio-based web interface for interacting with Google AI and the Wolfram Alpha MCP server.


Installation

Clone the Repo

git clone https://github.com/ricocf/mcp-wolframalpha.git

cd mcp-wolframalpha

Set Up Environment Variables

Create a .env file based on the example:

  • WOLFRAM_API_KEY=your_wolframalpha_appid

  • GeminiAPI=your_google_gemini_api_key (optional; required only for the client examples below)
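
Based on the two variable names above, a complete .env file might look like this (the values shown are placeholders, not working keys):

```
WOLFRAM_API_KEY=your_wolframalpha_appid
GeminiAPI=your_google_gemini_api_key
```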

Install Requirements

pip install -r requirements.txt

Configuration

To use with the VSCode MCP Server:

  1. Create a configuration file at .vscode/mcp.json in your project root.
  2. Use the example provided in configs/vscode_mcp.json as a template.
  3. For more details, refer to the VSCode MCP Server Guide.
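
As a rough sketch only: a .vscode/mcp.json might resemble the following, mirroring the Claude Desktop configuration below with VS Code's top-level "servers" key. The path is a placeholder; the authoritative template is configs/vscode_mcp.json.

```json
{
  "servers": {
    "WolframAlphaServer": {
      "command": "python3",
      "args": ["/path/to/src/core/server.py"]
    }
  }
}
```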

To use with Claude Desktop:

{
  "mcpServers": {
    "WolframAlphaServer": {
      "command": "python3",
      "args": [
        "/path/to/src/core/server.py"
      ]
    }
  }
}

Client Usage Example

This project includes an LLM client that communicates with the MCP server.

Run with Gradio UI

  • Required: GeminiAPI
  • Provides a local web interface to interact with Google AI and Wolfram Alpha.
  • To launch the Gradio web interface from the command line:
python main.py --ui

Docker

To build and run the client inside a Docker container:

docker build -t wolframalphaui -f .devops/ui.Dockerfile .

docker run wolframalphaui


Run as CLI Tool

  • Required: GeminiAPI
  • To run the client directly from the command line:
python main.py

Docker

To build and run the client inside a Docker container:

docker build -t wolframalpha -f .devops/llm.Dockerfile .

docker run -it wolframalpha

mcp-wolframalpha FAQ

How do I set up the mcp-wolframalpha server?
Install the Python dependencies, configure your Wolfram Alpha API key, and run the server as described in the instructions above.

Can I connect multiple clients to the mcp-wolframalpha server?
Yes, the server supports multi-client interactions seamlessly.

Does the server support other LLMs besides Gemini?
While the example client uses Gemini via LangChain, the server is designed to be compatible with any LLM supporting MCP.

How is the Wolfram Alpha API integrated?
The server uses the official Wolfram Alpha API to perform computational and data queries, returning structured results to clients.
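
To illustrate the shape of such a request, here is a minimal sketch of building a query URL for Wolfram's public Short Answers API endpoint (v1/result, with parameters appid and i). This is an assumption for illustration; the server itself may use a different Wolfram Alpha endpoint internally.

```python
# Sketch: constructing a Wolfram Alpha Short Answers API request URL.
# Endpoint and parameter names follow Wolfram's public Short Answers API;
# the mcp-wolframalpha server may use a different endpoint internally.
from urllib.parse import urlencode

BASE_URL = "https://api.wolframalpha.com/v1/result"

def build_query_url(appid: str, query: str) -> str:
    """Return the GET URL for a Short Answers API request.

    urlencode percent-encodes the query, so spaces become '+'
    and characters such as '^' become %5E.
    """
    return f"{BASE_URL}?{urlencode({'appid': appid, 'i': query})}"

print(build_query_url("DEMO-APPID", "integrate x^2"))
# → https://api.wolframalpha.com/v1/result?appid=DEMO-APPID&i=integrate+x%5E2
```

Issuing a GET request to such a URL (with a valid AppID) returns a plain-text answer, which the server can wrap into a structured result for MCP clients.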
Is there a user interface to interact with the server?
Yes, a Gradio-based web UI is included for easy interaction with Wolfram Alpha queries.

Can I extend the server to support other APIs?
Yes, the modular architecture allows easy extension to additional APIs and functionalities.

What programming language is the server built with?
The server is implemented in Python.

How does the server handle query results?
It returns structured knowledge and computational results suitable for conversational AI workflows.