logfire-mcp

MCP.Pizza Chef: pydantic

The Logfire MCP Server provides LLMs with real-time access to OpenTelemetry traces and metrics. It enables detailed analysis of distributed traces, exception tracking by file, and execution of arbitrary SQL queries on telemetry data. This server empowers developers to monitor application performance and diagnose issues efficiently using structured telemetry insights.

Use this MCP server to

  • Retrieve exception counts grouped by source file within a time window
  • Analyze detailed trace information for exceptions in specific files
  • Execute custom SQL queries on OpenTelemetry telemetry data
  • Monitor application performance using distributed trace analysis
  • Diagnose errors and exceptions from telemetry data programmatically
  • Integrate telemetry insights into AI-driven debugging workflows

README

Logfire MCP Server

This repository contains a Model Context Protocol (MCP) server with tools that can access the OpenTelemetry traces and metrics you've sent to Logfire.

This MCP server enables LLMs to retrieve your application's telemetry data, analyze distributed traces, and make use of the results of arbitrary SQL queries executed using the Logfire APIs.

Available Tools

  • find_exceptions - Get exception counts from traces grouped by file

    • Required arguments:
      • age (int): Number of minutes to look back (e.g., 30 for last 30 minutes, max 7 days)
  • find_exceptions_in_file - Get detailed trace information about exceptions in a specific file

    • Required arguments:
      • filepath (string): Path to the file to analyze
      • age (int): Number of minutes to look back (max 7 days)
  • arbitrary_query - Run custom SQL queries on your OpenTelemetry traces and metrics

    • Required arguments:
      • query (string): SQL query to execute
      • age (int): Number of minutes to look back (max 7 days)
  • get_logfire_records_schema - Get the OpenTelemetry schema to help with custom queries

    • No required arguments
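
Each tool is invoked as an MCP tool call with a JSON arguments object, as in the examples later in this README. As an illustrative sketch, get_logfire_records_schema takes no arguments, so a call to it is simply:

{
  "name": "get_logfire_records_schema",
  "arguments": {}
}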

Setup

Install uv

The first thing to do is make sure uv is installed, as uv is used to run the MCP server.

For installation instructions, see the uv installation docs.

If you already have an older version of uv installed, you might need to update it with uv self update.

Obtain a Logfire read token

In order to make requests to the Logfire APIs, the Logfire MCP server requires a "read token".

You can create one under the "Read Tokens" section of your project settings in Logfire: https://logfire.pydantic.dev/-/redirect/latest-project/settings/read-tokens

Important

Logfire read tokens are project-specific, so you need to create one for the specific project you want to expose to the Logfire MCP server.

Manually run the server

Once you have uv installed and have a Logfire read token, you can manually run the MCP server using uvx (which is provided by uv).

You can specify your read token using the LOGFIRE_READ_TOKEN environment variable:

LOGFIRE_READ_TOKEN=YOUR_READ_TOKEN uvx logfire-mcp

or using the --read-token flag:

uvx logfire-mcp --read-token=YOUR_READ_TOKEN

Note

If you are using Cursor, Claude Desktop, Cline, or other MCP clients that manage your MCP servers for you, you do NOT need to manually run the server yourself. The next section will show you how to configure these clients to make use of the Logfire MCP server.

Configuration with well-known MCP clients

Configure for Cursor

Create a .cursor/mcp.json file in your project root:

{
  "mcpServers": {
    "logfire": {
      "command": "uvx",
      "args": ["logfire-mcp", "--read-token=YOUR-TOKEN"]
    }
  }
}

Note that Cursor doesn't accept the env field, so you need to use the --read-token flag instead.

Configure for Claude Desktop

Add to your Claude settings:

{
  "command": ["uvx"],
  "args": ["logfire-mcp"],
  "type": "stdio",
  "env": {
    "LOGFIRE_READ_TOKEN": "YOUR_TOKEN"
  }
}

Configure for Cline

Add to your Cline settings in cline_mcp_settings.json:

{
  "mcpServers": {
    "logfire": {
      "command": "uvx",
      "args": ["logfire-mcp"],
      "env": {
        "LOGFIRE_READ_TOKEN": "YOUR_TOKEN"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}

Customization - Base URL

By default, the server connects to the Logfire API at https://logfire-api.pydantic.dev. You can override this by:

  1. Using the --base-url argument:
uvx logfire-mcp --base-url=https://your-logfire-instance.com
  2. Setting the LOGFIRE_BASE_URL environment variable:
LOGFIRE_BASE_URL=https://your-logfire-instance.com uvx logfire-mcp
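
These options can also be combined with the client configurations above. For example, a Cursor configuration pointing at a self-hosted instance might look like the following sketch (https://your-logfire-instance.com is a placeholder for your own deployment):

{
  "mcpServers": {
    "logfire": {
      "command": "uvx",
      "args": [
        "logfire-mcp",
        "--read-token=YOUR-TOKEN",
        "--base-url=https://your-logfire-instance.com"
      ]
    }
  }
}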

Example Interactions

  1. Find all exceptions in traces from the last hour:
{
  "name": "find_exceptions",
  "arguments": {
    "age": 60
  }
}

Response:

[
  {
    "filepath": "app/api.py",
    "count": 12
  },
  {
    "filepath": "app/models.py",
    "count": 5
  }
]
  2. Get details about exceptions from traces in a specific file:
{
  "name": "find_exceptions_in_file",
  "arguments": {
    "filepath": "app/api.py",
    "age": 1440
  }
}

Response:

[
  {
    "created_at": "2024-03-20T10:30:00Z",
    "message": "Failed to process request",
    "exception_type": "ValueError",
    "exception_message": "Invalid input format",
    "function_name": "process_request",
    "line_number": "42",
    "attributes": {
      "service.name": "api-service",
      "code.filepath": "app/api.py"
    },
    "trace_id": "1234567890abcdef"
  }
]
  3. Run a custom query on traces:
{
  "name": "arbitrary_query",
  "arguments": {
    "query": "SELECT trace_id, message, created_at, attributes->>'service.name' as service FROM records WHERE severity_text = 'ERROR' ORDER BY created_at DESC LIMIT 10",
    "age": 1440
  }
}
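
The response follows the columns selected in the query; the values below are purely illustrative:

[
  {
    "trace_id": "1234567890abcdef",
    "message": "Failed to process request",
    "created_at": "2024-03-20T10:30:00Z",
    "service": "api-service"
  }
]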

Examples of Questions for Claude

  1. "What exceptions occurred in traces from the last hour across all services?"
  2. "Show me the recent errors in the file 'app/api.py' with their trace context"
  3. "How many errors were there in the last 24 hours per service?"
  4. "What are the most common exception types in my traces, grouped by service name?"
  5. "Get me the OpenTelemetry schema for traces and metrics"
  6. "Find all errors from yesterday and show their trace contexts"

Getting Started

  1. First, obtain a Logfire read token from: https://logfire.pydantic.dev/-/redirect/latest-project/settings/read-tokens

  2. Run the MCP server:

    uvx logfire-mcp --read-token=YOUR_TOKEN
  3. Configure your preferred client (Cursor, Claude Desktop, or Cline) using the configuration examples above

  4. Start using the MCP server to analyze your OpenTelemetry traces and metrics!

Contributing

We welcome contributions to help improve the Logfire MCP server. Whether you want to add new trace analysis tools, enhance metrics querying functionality, or improve documentation, your input is valuable.

For examples of other MCP servers and implementation patterns, see the Model Context Protocol servers repository.

License

Logfire MCP is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License.

logfire-mcp FAQ

How does Logfire MCP Server access telemetry data?
It queries the OpenTelemetry traces and metrics you've sent to Logfire via the Logfire APIs, enabling LLMs to analyze them.
What types of queries can I run with Logfire MCP Server?
You can run predefined exception queries and arbitrary SQL queries on your telemetry data.
What is the maximum lookback period for querying exceptions?
The server supports querying data up to 7 days in the past.
Can I get detailed trace information for a specific file?
Yes, using the find_exceptions_in_file tool with the file path and time range.
Is it possible to customize queries beyond predefined tools?
Yes, the arbitrary_query tool allows running any SQL query on your telemetry data.
Which LLM providers can integrate with Logfire MCP Server?
It is compatible with OpenAI, Anthropic Claude, and Google Gemini models.
How secure is the data access through this MCP server?
The server follows MCP principles for scoped, secure, and observable model interactions.