mcp-windbg

MCP.Pizza Chef: svnscha

mcp-windbg is an MCP server that integrates with WinDBG/CDB to allow AI models to execute debugger commands for Windows crash dump analysis. It enables natural language interaction with crash dumps, providing immediate triage, categorization, and detailed inspection through AI-powered debugging commands, significantly improving productivity in crash analysis workflows.

Use This MCP Server To

  • Perform AI-assisted triage of Windows crash dumps
  • Execute WinDBG commands via natural language requests
  • Automatically categorize crash dumps for faster debugging
  • Inspect call stacks and memory regions using AI queries
  • Generate detailed crash analysis reports from dumps
  • Enable natural language debugging workflows with WinDBG
  • Integrate crash dump analysis into AI-enhanced developer tools

README

MCP Server for WinDBG Crash Analysis

A Model Context Protocol server providing tools to analyze Windows crash dumps using WinDBG/CDB.

Overview

This MCP server integrates with CDB to enable AI models to analyze Windows crash dumps.

TL;DR

What is this?

  • Primarily, a tool that enables AI to interact with WinDBG.
  • The whole "magic" is giving LLMs the ability to execute debugger commands. Used creatively, this is quite powerful and a big productivity improvement.

This means that this is:

  • A bridge connecting LLMs (AI) with WinDBG (CDB) for assisted crash dump analysis.
  • A way to get immediate first-level triage analysis, useful for categorizing crash dumps or auto-analyzing simple cases.
  • A platform for natural language-based "vibe" analysis, allowing you to ask the LLM to inspect specific areas:
    • Examples:
      • "Show me the call stack with k and explain what might be causing this access violation"
      • "Execute !peb and tell me if there are any environment variables that might affect this crash"
      • "Examine frame 3 and analyze the parameters passed to this function"
      • "Use dx -r2 on this object and explain its state" (equivalent to dx -r2 ((MyClass*)0x12345678))
      • "Analyze this heap address with !heap -p -a 0xABCD1234 and check for buffer overflow"
      • "Run .ecxr followed by k and explain the exception's root cause"
      • "Check for timing issues in the thread pool with !runaway and !threads"
      • "Examine memory around this address with db/dw/dd to identify corruption patterns"
      • ...and many other analytical approaches based on your specific crash scenario

What is this not?

  • A magical solution that automatically fixes all issues.
  • A full-featured product with custom AI. Instead, it's a simple Python wrapper around CDB that relies on the LLM's WinDBG expertise, best complemented by your own domain knowledge. A rough sketch of what such a wrapper looks like follows below.
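To make the "simple Python wrapper" idea concrete, here is a minimal, illustrative sketch of how a tool like this can drive cdb.exe from Python and capture its output. This is not the project's actual implementation; the function name, default paths, and batch-mode approach are assumptions chosen only for illustration.

import subprocess

def run_cdb_command(dump_path, command, cdb_path="cdb.exe", timeout=30):
    """Illustrative only: run one CDB command against a crash dump.

    Launches cdb.exe in batch mode: -z loads the dump, -c runs the
    command and then quits, and the debugger's text output is returned.
    """
    args = [cdb_path, "-z", dump_path, "-c", f"{command}; q"]
    result = subprocess.run(args, capture_output=True, text=True, timeout=timeout)
    return result.stdout

# Example (path is a placeholder): get CDB's automatic crash analysis.
# print(run_cdb_command(r"C:\dumps\example.dmp", "!analyze -v"))

The actual server exposes this capability through the MCP tools listed in the Tools section below rather than as a standalone function, but the core idea is the same: translate a requested command into a CDB invocation and hand the text output back to the model.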

Blog

I've written about the whole journey on my blog.

Prerequisites

  • Python 3.10 or higher
  • Windows operating system with Debugging Tools for Windows installed.
  • An LLM supporting the Model Context Protocol.

Development Setup

  1. Clone the repository:
git clone https://github.com/svnscha/mcp-windbg.git
cd mcp-windbg
  2. Create and activate a virtual environment:
python -m venv .venv
.\.venv\Scripts\activate
  3. Install the package in development mode:
pip install -e .
  4. Install test dependencies:
pip install -e ".[test]"
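
After these steps you can do a quick sanity check. Assuming the module uses a standard argument parser, asking it for help prints the flags described under Command Line Options below:

python -m mcp_server_windbg --help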

Usage

Integrating with VS Code

To integrate this MCP server with Visual Studio Code:

Create a .vscode/mcp.json file in your workspace with the following configuration:
{
    "servers": {
        "mcp_server_windbg": {
            "type": "stdio",
            "command": "${workspaceFolder}/.venv/Scripts/python",
            "args": [
                "-m",
                "mcp_server_windbg"
            ],
            "env": {
                "_NT_SYMBOL_PATH": "SRV*C:\\Symbols*https://msdl.microsoft.com/download/symbols"
            }
        }
    }
}

Alternatively, edit your user settings to enable it globally (independent of the workspace). Once the server is added and Model Context Protocol support in the Chat feature is enabled, its tools become available in Agent mode.

This is how it should look:

Visual Studio Code Integration

Starting the MCP Server (optional)

If the server is integrated through Copilot, you don't need this step; the IDE will start the MCP server automatically.

Start the server using the module command:

python -m mcp_server_windbg

Command Line Options

python -m mcp_server_windbg [options]

Available options:

  • --cdb-path CDB_PATH: Custom path to cdb.exe
  • --symbols-path SYMBOLS_PATH: Custom symbols path
  • --timeout TIMEOUT: Command timeout in seconds (default: 30)
  • --verbose: Enable verbose output
When using the VS Code configuration above, customize it as needed:

  • Adjust the Python interpreter path if needed
  • Set a custom path to CDB by adding "--cdb-path" followed by the path (e.g. "C:\\path\\to\\cdb.exe") to the args array
  • Set the symbol path environment variable as shown above, or add "--symbols-path" to the args

An example invocation combining these options on the command line is shown below.
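
For example, a fully specified invocation might look like this (the cdb.exe path below is the usual install location of the Debugging Tools for Windows on 64-bit systems; adjust it to your setup):

python -m mcp_server_windbg --cdb-path "C:\Program Files (x86)\Windows Kits\10\Debuggers\x64\cdb.exe" --symbols-path "SRV*C:\Symbols*https://msdl.microsoft.com/download/symbols" --timeout 60 --verbose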

Integration with Copilot

Once the server is configured in VS Code:

  1. Enable MCP in Chat feature in Copilot settings
  2. The MCP server will appear in Copilot's available tools
  3. The WinDBG analysis capabilities will be accessible through Copilot's interface
  4. You can now analyze crash dumps directly through Copilot using natural language queries

Tools

This server provides the following tools (a sketch of an example tool call follows the list):

  • open_windbg_dump: Analyze a Windows crash dump file using common WinDBG commands
  • run_windbg_cmd: Execute a specific WinDBG command on the loaded crash dump
  • list_windbg_dumps: List Windows crash dump (.dmp) files in the specified directory
  • close_windbg_dump: Unload a crash dump and release resources
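
As a rough illustration of how an MCP client might invoke one of these tools, the snippet below builds the payload for a single run_windbg_cmd call. The tool name comes from the list above, but the argument names (dump_path, command) and the dump path are illustrative assumptions, not the server's documented schema. A typical session would call list_windbg_dumps first, then open_windbg_dump, any number of run_windbg_cmd calls, and finally close_windbg_dump.

import json

# Hypothetical payload an MCP client could send; argument names are
# assumptions for illustration, not the server's documented schema.
tool_call = {
    "name": "run_windbg_cmd",
    "arguments": {
        "dump_path": r"C:\dumps\example.dmp",  # placeholder path
        "command": "!analyze -v",              # any WinDBG/CDB command
    },
}

print(json.dumps(tool_call, indent=2))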

Running Tests

To run the tests:

pytest

Troubleshooting

CDB Not Found

If you get a "CDB executable not found" error, make sure:

  1. WinDBG/CDB is installed on your system
  2. The CDB executable is in your system PATH, or
  3. You specify the path using the --cdb-path option

Symbol Path Issues

For proper crash analysis, set up your symbol path:

  1. Use the --symbols-path parameter, or
  2. Set the _NT_SYMBOL_PATH environment variable

Common Symbol Paths

SRV*C:\Symbols*https://msdl.microsoft.com/download/symbols
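
To make this symbol path persist across sessions, you can set the _NT_SYMBOL_PATH environment variable once from a command prompt (setx writes it to your user environment; newly opened terminals will pick it up):

setx _NT_SYMBOL_PATH "SRV*C:\Symbols*https://msdl.microsoft.com/download/symbols"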

License

MIT

mcp-windbg FAQ

How does mcp-windbg enable AI to analyze crash dumps?
It allows AI models to execute WinDBG/CDB debugger commands to inspect and analyze Windows crash dumps.
Can I use mcp-windbg to automate crash dump triage?
Yes, it provides immediate first-level triage and categorization of crash dumps using AI-driven analysis.
Does mcp-windbg support natural language queries?
Yes, it enables natural language-based interaction to request specific debugger commands and insights.
What platforms does mcp-windbg support?
It supports Windows crash dumps analyzed through WinDBG/CDB on Windows environments.
Is mcp-windbg compatible with multiple LLM providers?
Yes, it is designed to work with various LLMs including OpenAI, Anthropic Claude, and Google Gemini.
How does mcp-windbg improve debugging productivity?
By automating command execution and providing AI-driven insights, it reduces manual effort and speeds up crash analysis.
Can mcp-windbg generate detailed reports?
Yes, it can produce detailed crash analysis summaries and reports based on executed debugger commands.
What level of expertise is needed to use mcp-windbg?
Basic familiarity with WinDBG helps, but natural language queries make it accessible to less experienced users.