
mcp-server-bigquery

MCP.Pizza Chef: LucasHild

The mcp-server-bigquery is a Model Context Protocol server that integrates BigQuery access into LLM workflows. It allows language models to inspect database schemas, list tables, and execute SQL queries using the BigQuery dialect. Configurable with GCP project details and dataset filters, this server facilitates real-time, structured interaction with BigQuery data sources, enhancing AI-driven data analysis and automation.

Use This MCP Server To

  • Execute SQL queries on BigQuery from LLMs
  • Inspect BigQuery database schemas dynamically
  • List tables within specified BigQuery datasets
  • Integrate BigQuery data access into AI workflows
  • Automate data retrieval and analysis via LLMs

README

BigQuery MCP server


A Model Context Protocol server that provides access to BigQuery. This server enables LLMs to inspect database schemas and execute queries.

Components

Tools

The server implements the following tools (see the client sketch after the list):

  • execute-query: Executes a SQL query using BigQuery dialect
  • list-tables: Lists all tables in the BigQuery database
  • describe-table: Describes the schema of a specific table
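For illustration only, the following Python sketch shows how an MCP client could call these tools over stdio using the MCP Python SDK. The project/location values are placeholders, and the tool argument names (e.g. "query") are assumptions; check the schemas returned by list_tools() for the authoritative names.

# Hypothetical client sketch, not part of this repository: it drives the
# published server over stdio with the MCP Python SDK. The tool argument
# names ("query") are assumptions; inspect the schemas from list_tools().
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uvx",
    args=[
        "mcp-server-bigquery",
        "--project", "my-gcp-project",  # placeholder GCP project ID
        "--location", "europe-west9",   # placeholder GCP location
    ],
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools and their input schemas.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # List tables in the project's datasets.
            tables = await session.call_tool("list-tables", {})
            print(tables.content)

            # Run a query; "query" is an assumed argument name.
            result = await session.call_tool(
                "execute-query", {"query": "SELECT 1 AS probe"}
            )
            print(result.content)

asyncio.run(main())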

Configuration

The server can be configured with the following arguments:

  • --project (required): The GCP project ID.
  • --location (required): The GCP location (e.g. europe-west9).
  • --dataset (optional): Restrict the server to specific BigQuery datasets. Repeat the argument to include several datasets (e.g. --dataset my_dataset_1 --dataset my_dataset_2). If not provided, all datasets in the project are considered.
  • --key-file (optional): Path to a service account key file for BigQuery. If not provided, the server will use the default credentials.
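For example, a local run that restricts the server to two datasets and uses a service account key might look like this (the project ID, dataset names, and key path below are placeholders):

uvx mcp-server-bigquery --project my-gcp-project --location europe-west9 --dataset sales --dataset marketing --key-file /path/to/service-account.json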

Quickstart

Install

Installing via Smithery

To install BigQuery Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install mcp-server-bigquery --client claude

Claude Desktop

  • On MacOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
  • On Windows: %APPDATA%/Claude/claude_desktop_config.json

Development/Unpublished Servers Configuration
"mcpServers": {
  "bigquery": {
    "command": "uv",
    "args": [
      "--directory",
      "{{PATH_TO_REPO}}",
      "run",
      "mcp-server-bigquery",
      "--project",
      "{{GCP_PROJECT_ID}}",
      "--location",
      "{{GCP_LOCATION}}"
    ]
  }
}
Published Servers Configuration
"mcpServers": {
  "bigquery": {
    "command": "uvx",
    "args": [
      "mcp-server-bigquery",
      "--project",
      "{{GCP_PROJECT_ID}}",
      "--location",
      "{{GCP_LOCATION}}"
    ]
  }
}

Replace {{PATH_TO_REPO}}, {{GCP_PROJECT_ID}}, and {{GCP_LOCATION}} with the appropriate values.
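If you also want to restrict the server to specific datasets or point it at a service account key, append the optional flags to the args array in the same way; {{DATASET_NAME}} and {{PATH_TO_KEY_FILE}} below are placeholders to replace with your own values:

"mcpServers": {
  "bigquery": {
    "command": "uvx",
    "args": [
      "mcp-server-bigquery",
      "--project",
      "{{GCP_PROJECT_ID}}",
      "--location",
      "{{GCP_LOCATION}}",
      "--dataset",
      "{{DATASET_NAME}}",
      "--key-file",
      "{{PATH_TO_KEY_FILE}}"
    ]
  }
}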

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update lockfile:
uv sync
  2. Build package distributions:
uv build

This will create source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:
uv publish

Note: You'll need to set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD
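For example, publishing with a token held in the environment (the token value is a placeholder you obtain from PyPI):

export UV_PUBLISH_TOKEN="<your-pypi-token>"
uv publish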

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory {{PATH_TO_REPO}} run mcp-server-bigquery

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.

mcp-server-bigquery FAQ

How do I configure the mcp-server-bigquery?
Configure it with your GCP project ID and location, plus optional dataset filters and an optional service account key file.
Can I limit the server to specific BigQuery datasets?
Yes, use the --dataset argument multiple times to specify datasets to include.
What tools does the server provide?
It offers execute-query for SQL execution, list-tables to list database tables, and describe-table to inspect table schemas.
Is authentication required to use this server?
Yes, BigQuery access must be authenticated, but a key file is optional: pass --key-file to use a service account key, or omit it and the server falls back on the default Google Cloud credentials.
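If you rely on the default credentials rather than a key file, Application Default Credentials can typically be set up on a workstation with the standard gcloud command:

gcloud auth application-default login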
Can this server handle multiple datasets simultaneously?
Yes, by specifying multiple --dataset arguments, it can consider several datasets.
What SQL dialect does the execute-query tool use?
It uses the BigQuery SQL dialect for query execution.
How does this server enhance LLM capabilities?
It enables LLMs to interact directly with BigQuery data, allowing dynamic querying and schema inspection.
Is this server compatible with multiple LLM providers?
Yes. Any MCP-compatible client can use it, regardless of the underlying model, including models from OpenAI, Anthropic Claude, and Google Gemini, among others.