mcp-client-jupyter-chat

MCP.Pizza Chef: ihrpr

mcp-client-jupyter-chat is a JupyterLab extension that provides an interactive chat interface with AI models using the Model Context Protocol (MCP). It supports real-time streaming responses, step-by-step reasoning, and integration with MCP servers for interactive tool usage. Designed for JupyterLab 4.0+, it currently supports Anthropic's Claude, with support for more models planned, enabling seamless AI-assisted workflows within Jupyter notebooks.

Use This MCP client To

  • Chat with AI models inside JupyterLab notebooks
  • Stream real-time AI responses with detailed reasoning
  • Integrate and interact with MCP servers for tool execution
  • Display rich content and interactive outputs in chat
  • Use AI to assist coding and data analysis workflows
  • Combine multiple MCP tools in a single chat interface
  • Enable stepwise problem solving with AI in Jupyter

README

mcp_client_jupyter_chat

A JupyterLab extension for chatting with AI models, with support for the Model Context Protocol (MCP). The extension integrates AI assistance into JupyterLab and provides interactive tool usage through MCP servers.

Demo

A short demo animation is included in the project repository.

Features

  • Seamless integration with AI models (currently Anthropic's Claude; more models coming soon)
  • Real-time streaming of responses with step-by-step reasoning
  • Support for MCP server tools with interactive execution
  • Rich content display
  • Interactive chat interface

Requirements

  • JupyterLab >= 4.0.0
  • An Anthropic API key for Claude access
  • Running MCP server(s) for tool integration (optional)

Model Configuration

The extension supports multiple Claude models through the Anthropic API. You'll need to:

  1. Obtain an Anthropic API key from Anthropic's website
  2. Configure your models in JupyterLab's Settings Editor under the "MCP Chat" section
  3. For each model, provide (see the sketch after this list):
    • Name (e.g., "claude-3-5-sonnet-latest")
    • API Key
    • Set as default (optional)
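
In JupyterLab's JSON Settings Editor, a model entry might look roughly like the sketch below. The key names here ("models", "name", "apiKey", "default") are illustrative assumptions rather than the extension's documented schema; consult the "MCP Chat" settings panel for the actual fields.

{
  "models": [
    {
      "name": "claude-3-5-sonnet-latest",
      "apiKey": "<your Anthropic API key>",
      "default": true
    }
  ]
}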

Install

To install the extension, execute:

pip install mcp_client_jupyter_chat
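
To confirm the frontend extension was registered, you can list the installed lab extensions with JupyterLab's standard command; the extension should appear as mcp-client-jupyter-chat:

jupyter labextension list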

Uninstall

To remove the extension, execute:

pip uninstall mcp_client_jupyter_chat

Contributing

Development install

Note: You will need NodeJS to build the extension package.

The jlpm command is JupyterLab's pinned version of yarn that is installed with JupyterLab. You may use yarn or npm in lieu of jlpm below.

# Clone the repo to your local environment
# Change directory to the mcp_client_jupyter_chat directory
# Install package in development mode
pip install -e "."
# Link your development version of the extension with JupyterLab
jupyter labextension develop . --overwrite
# Rebuild extension Typescript source after making changes
jlpm build

You can watch the source directory and run JupyterLab at the same time in different terminals to watch for changes in the extension's source and automatically rebuild the extension.

# Watch the source directory in one terminal, automatically rebuilding when needed
jlpm watch
# Run JupyterLab in another terminal
jupyter lab

With the watch command running, every saved change will immediately be built locally and available in your running JupyterLab. Refresh JupyterLab to load the change in your browser (you may need to wait several seconds for the extension to be rebuilt).

By default, the jlpm build command generates the source maps for this extension to make it easier to debug using the browser dev tools. To also generate source maps for the JupyterLab core extensions, you can run the following command:

jupyter lab build --minimize=False

Development uninstall

pip uninstall mcp_client_jupyter_chat

In development mode, you will also need to remove the symlink created by jupyter labextension develop command. To find its location, you can run jupyter labextension list to figure out where the labextensions folder is located. Then you can remove the symlink named mcp-client-jupyter-chat within that folder.
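
For reference, those steps look roughly like the following; the labextensions path is a placeholder that varies by environment:

# Locate the labextensions folder
jupyter labextension list
# Remove the development symlink (substitute the folder reported above)
rm <labextensions-folder>/mcp-client-jupyter-chat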

Testing the extension

Frontend tests

This extension uses Jest for JavaScript code testing.

To run the tests, execute:

jlpm
jlpm test

Integration tests

This extension uses Playwright for the integration tests (aka user level tests). More precisely, the JupyterLab helper Galata is used to handle testing the extension in JupyterLab.

More information is provided in the ui-tests README.

Packaging the extension

See RELEASE

mcp-client-jupyter-chat FAQ

How do I install mcp-client-jupyter-chat?
Install it with pip install mcp_client_jupyter_chat in an environment running JupyterLab 4.0 or higher, as described in the README above.
What AI models does this client support?
Currently, it supports Anthropic's Claude, with plans to add more models like OpenAI GPT and Google Gemini.
How does mcp-client-jupyter-chat integrate with MCP servers?
It connects to MCP servers to access tools and data sources, enabling interactive tool usage within the chat interface.
Can I use this client without MCP servers?
Yes, but tool integration features require running MCP servers for full functionality.
Does it support real-time streaming of AI responses?
Yes, it streams responses with step-by-step reasoning for better interactivity.
What are the requirements to use this client?
Requires JupyterLab 4.0+, an Anthropic API key for Claude, and optionally running MCP servers for tool integration.
Is the chat interface capable of displaying rich content?
Yes, it supports rich content display including formatted text and interactive elements.
Can this client be used for coding assistance?
Yes, it is designed to assist coding and data analysis workflows interactively within JupyterLab.