
chronulus-mcp

MCP.Pizza Chef: ChronulusAI

Chronulus-mcp is an MCP server that integrates Chronulus AI forecasting and prediction agents with Claude. It enables real-time interaction with Chronulus forecasting models, so users can query them and receive predictive insights directly within Claude-powered applications. The server acts as the bridge between Chronulus AI services and the LLM, bringing predictive analytics into decision-making workflows.

Use This MCP server To

  • Integrate Chronulus AI forecasting agents into Claude chat workflows
  • Query predictive models for real-time forecasting insights
  • Enable AI-driven prediction in enterprise applications
  • Combine LLM reasoning with Chronulus prediction data
  • Automate decision support using Chronulus forecasts in chat
  • Embed forecasting capabilities in AI copilots and assistants

README

Chronulus AI

MCP Server for Chronulus

Chat with Chronulus AI Forecasting & Prediction Agents in Claude

Quickstart: Claude for Desktop

Install

Claude for Desktop is currently available on macOS and Windows.

Install Claude for Desktop here

Configuration

Follow the general instructions here to configure the Claude desktop client.

You can find your Claude config at one of the following locations:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

Then choose the installation method below that best suits your needs and add the corresponding entry to your claude_desktop_config.json.

Using pip

(Option 1) Install release from PyPI

pip install chronulus-mcp

(Option 2) Install from Github

git clone https://github.com/ChronulusAI/chronulus-mcp.git
cd chronulus-mcp
pip install .

Then add the following to your claude_desktop_config.json:
{
  "mcpServers": {
    "chronulus-agents": {
      "command": "python",
      "args": ["-m", "chronulus_mcp"],
      "env": {
        "CHRONULUS_API_KEY": "<YOUR_CHRONULUS_API_KEY>"
      }
    }
  }
}

Note: if you get an error like "MCP chronulus-agents: spawn python ENOENT", you most likely need to provide the absolute path to python, for example /Library/Frameworks/Python.framework/Versions/3.11/bin/python3 instead of just python. See the example below.
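In that case, the command entry in your claude_desktop_config.json points at the absolute path instead (the exact path depends on how Python was installed on your machine):

{
  "mcpServers": {
    "chronulus-agents": {
      "command": "/Library/Frameworks/Python.framework/Versions/3.11/bin/python3",
      "args": ["-m", "chronulus_mcp"],
      "env": {
        "CHRONULUS_API_KEY": "<YOUR_CHRONULUS_API_KEY>"
      }
    }
  }
}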

Using docker

Here we will build a docker image called 'chronulus-mcp' that we can reuse in our Claude config.

git clone https://github.com/ChronulusAI/chronulus-mcp.git
cd chronulus-mcp
docker build . -t 'chronulus-mcp'
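Optionally, before editing your Claude config, you can confirm the image was built and carries the expected tag (a quick sanity check that simply lists images matching the name):

docker image ls chronulus-mcp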

In your Claude config, be sure that the final argument matches the name you give to the docker image in the build command.

{
  "mcpServers": {
    "chronulus-agents": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "CHRONULUS_API_KEY", "chronulus-mcp"],
      "env": {
        "CHRONULUS_API_KEY": "<YOUR_CHRONULUS_API_KEY>"
      }
    }
  }
}

Using uvx

uvx will pull the latest version of chronulus-mcp from the PyPI registry, install it, and then run it.

{
  "mcpServers": {
    "chronulus-agents": {
      "command": "uvx",
      "args": ["chronulus-mcp"],
      "env": {
        "CHRONULUS_API_KEY": "<YOUR_CHRONULUS_API_KEY>"
      }
    }
  }
}

Note: if you get an error like "MCP chronulus-agents: spawn uvx ENOENT", you most likely need to either:

  1. Install uv, or
  2. Provide the absolute path to uvx, for example /Users/username/.local/bin/uvx instead of just uvx (see the example below).
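For example, assuming uv was installed under your home directory as above, the config entry would look like this (run "which uvx" in a terminal to confirm the actual path on your machine):

{
  "mcpServers": {
    "chronulus-agents": {
      "command": "/Users/username/.local/bin/uvx",
      "args": ["chronulus-mcp"],
      "env": {
        "CHRONULUS_API_KEY": "<YOUR_CHRONULUS_API_KEY>"
      }
    }
  }
}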

Additional Servers (Filesystem, Fetch, etc)

In our demo, we use third-party servers like fetch and filesystem.

For details on installing and configuring third-party servers, please refer to the documentation provided by each server's maintainer.

Below is an example of how to configure filesystem and fetch alongside Chronulus in your claude_desktop_config.json:

{
  "mcpServers": {
    "chronulus-agents": {
      "command": "uvx",
      "args": ["chronulus-mcp"],
      "env": {
        "CHRONULUS_API_KEY": "<YOUR_CHRONULUS_API_KEY>"
      }
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/AIWorkspace"
      ]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
} 

Claude Preferences

To streamline your experience using Claude across multiple sets of tools, it is best to add your preferences under Claude Settings.

You can update your Claude preferences in a couple of ways:

  • From Claude Desktop: Settings -> General -> Claude Settings -> Profile (tab)
  • From claude.ai/settings: Profile (tab)

Preferences are shared across both Claude for Desktop and Claude.ai (the web interface), so your instructions need to work across both experiences.

Below are the preferences we used to achieve the results shown in our demos:

## Tools-Dependent Protocols
The following instructions apply only when tools/MCP Servers are accessible.

### Filesystem - Tool Instructions
- Do not use 'read_file' or 'read_multiple_files' on binary files (e.g., images, pdfs, docx).
- When working with binary files (e.g., images, pdfs, docx) use 'get_info' instead of 'read_*' tools to inspect a file.

### Chronulus Agents - Tool Instructions
- When using Chronulus, prefer to use input field types like TextFromFile, PdfFromFile, and ImageFromFile over scanning the files directly.
- When plotting forecasts from Chronulus, always include the Chronulus-provided forecast explanation below the plot and label it as Chronulus Explanation.

chronulus-mcp FAQ

How do I install the chronulus-mcp server?
Installation instructions are provided in the GitHub repository, including pip options and configuration details for macOS and Windows.

How do I configure chronulus-mcp with Claude for Desktop?
Add the chronulus-mcp configuration to your Claude desktop config file located in the user application support directory as per the quickstart guide.

Can chronulus-mcp work with LLM providers other than Claude?
While optimized for Claude, chronulus-mcp can be adapted to work with other LLM providers like OpenAI and Gemini with appropriate configuration.

What kind of forecasting models does chronulus-mcp support?
It supports Chronulus AI's proprietary forecasting and prediction models designed for various time series and predictive analytics tasks.

Is chronulus-mcp suitable for enterprise use?
Yes, it is designed to integrate predictive AI agents into enterprise workflows for enhanced decision-making.

How does chronulus-mcp handle real-time data updates?
The server supports real-time interaction, allowing models to access up-to-date forecasting data during conversations.

What platforms are supported for running chronulus-mcp?
It supports major desktop platforms including macOS and Windows, with configuration instructions provided for both.