sherpa

MCP.Pizza Chef: CartographAI

Sherpa is an MCP client that enables users to chat with any codebase using a single command. It intelligently analyzes project files to provide relevant answers, supports multiple LLMs like Claude, Gemini, and GPT-4o, and ensures full privacy by running locally with your own API keys. Sherpa offers an interactive web UI for seamless codebase exploration and understanding, making it a powerful tool for developers to query and navigate complex projects efficiently.

Use This MCP Client To

  • Chat with a local or remote codebase via the command line
  • Analyze project structure to answer code-related questions
  • Use multiple LLMs for flexible code understanding
  • Run fully private codebase queries with local API keys
  • Explore codebases interactively through a web UI
  • Quickly get explanations of unfamiliar code segments
  • Debug or review code by querying specific files or functions

README

Sherpa

Sherpa, your friendly codebase guide

npx @cartographai/sherpa <folder or git url>

Chat with any codebase with a single command.


Asking Sherpa about itself

Features

  • Intelligent Code Analysis: Sherpa intelligently determines which files to read to answer your questions about a codebase. It uses a combination of tools to understand the project structure and content.
  • Multiple LLM Support: Sherpa supports various language models, including Anthropic's Claude 3.5 Sonnet, Google's Gemini 2.5, and OpenAI's GPT-4o and o3-mini models.
  • Fully Local and Private: Bring your own API key; no data is sent to our servers.
  • Interactive Chat Interface: A user-friendly web UI (built with SvelteKit) allows you to interact with Sherpa, view the tools being used, and see the context provided to the language model.
  • Chat History: Your conversation history is stored locally in your browser's local storage, so you can easily pick up where you left off.
  • Secure Filesystem Access: Sherpa uses a secure Model Context Protocol filesystem server that restricts access to a specified directory, preventing unauthorized file access.
  • Git Support: You can use Sherpa with local directories or directly with remote Git repositories.
  • Open Source: Sherpa is open source, allowing you to modify and extend it for your specific use cases.
  • ⭐ [New!] MCP Tools Support: Configure Sherpa with any MCP servers to extend its functionality.
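The "Secure Filesystem Access" point above boils down to a path check: every requested path is resolved against the permitted root, and anything that escapes it is rejected. The sketch below illustrates that idea under assumed names; it is not Sherpa's actual implementation.

```typescript
import * as path from "node:path";

// Illustrative sketch of the kind of check a sandboxed filesystem server
// performs: a requested path is only allowed if it resolves to a location
// inside the permitted root directory. Hypothetical function, not Sherpa's code.
function isPathAllowed(allowedRoot: string, requestedPath: string): boolean {
  const root = path.resolve(allowedRoot);
  const target = path.resolve(root, requestedPath);
  // Allow the root itself, or anything strictly inside it. Comparing with the
  // separator appended avoids matching sibling directories like "project2".
  return target === root || target.startsWith(root + path.sep);
}

console.log(isPathAllowed("/home/user/project", "src/index.ts")); // true
console.log(isPathAllowed("/home/user/project", "../secrets.txt")); // false
```

Resolving before comparing is what defeats `..` traversal: the relative path is normalized into an absolute one first, so an escaping path no longer shares the root prefix.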

Usage

  1. Run Sherpa: Execute the following command in your terminal, replacing <path/to/your/project> with the actual path to your project or <git_url> with a Git repository URL:
npx @cartographai/sherpa <path/to/your/project>

or

npx @cartographai/sherpa <git_url>

This command will:

  • Clone the repository to a cache directory if a Git URL is provided.
  • Start a local server (Hono) that handles API requests.
  • Open the Sherpa web app in your default browser (localhost:3031).
  2. Configure API Keys: In the web app's settings panel (click the "Config" button), enter your API keys for the language models you want to use (Anthropic, Gemini, or OpenAI).

  3. Start Chatting: Ask Sherpa questions about the codebase. You can refer to specific files or directories, and Sherpa will use its tools to find the relevant information.

Use with MCPs

Sherpa supports Model Context Protocol (MCP) tools.

  1. Configure MCP servers: Open or create ~/.config/sherpa/mcp_servers.json, and configure MCP servers following Claude Desktop's format.
Example config file
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    },
    "time": {
      "command": "uvx",
      "args": ["mcp-server-time", "--local-timezone=Asia/Singapore"]
    },
    "github": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN", "ghcr.io/github/github-mcp-server"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
      }
    }
  }
}
  2. Quit and re-run Sherpa. It will connect to all configured MCP servers on startup.

  3. Chat: The model can now use MCP tools in your chat. You can view configured MCP servers and their connection status in the Configuration panel.

List of configured MCP servers with connected or error status
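To make the expected config shape concrete, here is an illustrative parser for the mcp_servers.json format shown above (the same format Claude Desktop uses). It is a sketch for understanding the structure, not Sherpa's actual loader.

```typescript
// Each named server has a command, optional args, and optional env vars.
interface McpServerConfig {
  command: string;
  args?: string[];
  env?: Record<string, string>;
}

// Hypothetical loader: validates the top-level "mcpServers" object and that
// every entry has a "command" string. Not Sherpa's real implementation.
function parseMcpServers(json: string): Record<string, McpServerConfig> {
  const parsed = JSON.parse(json);
  const servers = parsed?.mcpServers;
  if (typeof servers !== "object" || servers === null) {
    throw new Error("config must contain an 'mcpServers' object");
  }
  for (const [name, cfg] of Object.entries(servers)) {
    if (typeof (cfg as McpServerConfig).command !== "string") {
      throw new Error(`server '${name}' is missing a 'command' string`);
    }
  }
  return servers as Record<string, McpServerConfig>;
}

// Example: the "time" server from the config above.
const example = `{"mcpServers":{"time":{"command":"uvx","args":["mcp-server-time"]}}}`;
console.log(Object.keys(parseMcpServers(example))); // logs the configured server names
```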

How it Works

Sherpa reads (or clones) a codebase, then:

  • Launches a web app and a sandboxed server to access your code
  • Lets you ask questions about the codebase through the chat interface
  • Intelligently uses tools to read and explore the codebase, finding the context needed to answer your question (or you can ask it to read all the files immediately)
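The explore-then-answer loop described above can be sketched roughly as follows. The tool names and the stub planner are assumptions for illustration; in Sherpa, the LLM itself decides which tool to call next.

```typescript
// Rough sketch of an agentic tool loop: the planner (standing in for the LLM)
// requests tool calls until it has gathered enough context to answer.
// Illustrative only; these names and types are not Sherpa's real API.
type Tool = (arg: string) => string;
type Step = { tool: string; arg: string } | null;

function gatherContext(
  tools: Record<string, Tool>,
  plan: (contextSoFar: string[]) => Step, // returns null when done
): string[] {
  const context: string[] = [];
  let step = plan(context);
  while (step !== null) {
    const output = tools[step.tool](step.arg);
    context.push(`${step.tool}(${step.arg}) -> ${output}`);
    step = plan(context);
  }
  return context; // what the LLM would answer from
}

// Usage: a one-step "read the README first" planner over a fake filesystem.
const files: Record<string, string> = {
  "README.md": "Sherpa, your friendly codebase guide",
};
const tools: Record<string, Tool> = {
  readFile: (p) => files[p] ?? "<not found>",
};
const context = gatherContext(tools, (c) =>
  c.length === 0 ? { tool: "readFile", arg: "README.md" } : null,
);
console.log(context);
```

The "read all the files immediately" mode mentioned above corresponds to a planner that queues every file up front instead of choosing incrementally.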

To learn more, ask Sherpa to explain itself:

npx @cartographai/sherpa https://github.com/CartographAI/sherpa.git

Development

To run Sherpa in development mode:

Setup Instructions

  1. Clone the repository:

    git clone https://github.com/CartographAI/sherpa
    cd sherpa
  2. Install dependencies:

    bun install
  3. Build the project:

    bun run build

Running the Application

Backend Server

Start the backend server with either a local project path or a Git URL:

bun run dev:server <path/to/your/project>

or

bun run dev:server <git_url>

The backend server will be available at http://localhost:3031.

Note that the backend bundles the web app, so if you are only developing the backend, you don't need to run the web application separately.

Web Application (Optional)

In a separate terminal:

bun run dev:web

This will start the web app with hot reloading. Open http://localhost:3030 in your browser. This requires the backend server to be running.

Sherpa FAQ

How does Sherpa determine which files to read in a codebase?
Sherpa uses intelligent code analysis tools to understand project structure and content, selecting relevant files to answer your queries efficiently.
Can I use Sherpa without sending my code to external servers?
Yes, Sherpa runs fully locally and privately, requiring you to provide your own API keys, ensuring no data leaves your environment.
Which language models does Sherpa support?
Sherpa supports multiple LLMs, including Anthropic's Claude 3.5 Sonnet, Google's Gemini 2.5, and OpenAI's GPT-4o and o3-mini models.
Is there a graphical interface for interacting with Sherpa?
Yes, Sherpa includes an interactive web UI built with SvelteKit for user-friendly codebase exploration.
How do I start using Sherpa on my codebase?
You can start Sherpa with a single command: `npx @cartographai/sherpa <folder or git url>` to chat with your codebase instantly.
Does Sherpa support remote repositories?
Yes, Sherpa can analyze both local folders and remote git URLs to provide codebase insights.
Can I switch between different LLM providers easily?
Yes, Sherpa supports multiple LLMs, allowing you to choose or switch providers based on your preferences or needs.
What programming languages does Sherpa support?
Sherpa is language-agnostic and can analyze any codebase structure, making it suitable for various programming languages.