
deepview-mcp


DeepView MCP is a Model Context Protocol server designed to empower IDEs like Cursor and Windsurf to analyze large codebases efficiently. It leverages Gemini's extensive context window to load an entire codebase from a single text file and answer complex queries about it. It is configurable via command-line arguments, integrates with any MCP-compatible IDE, and requires Python 3.13+ and a Gemini API key to operate.

Use this MCP server to

  • Load entire codebases from single text files for analysis
  • Query large codebases using Gemini's extensive context window
  • Integrate with MCP-compatible IDEs like Cursor and Windsurf
  • Configure Gemini model selection via command-line arguments
  • Enable deep codebase understanding in AI-powered development environments

README

DeepView MCP

DeepView MCP is a Model Context Protocol server that enables IDEs like Cursor and Windsurf to analyze large codebases using Gemini's extensive context window.


Features

  • Load an entire codebase from a single text file (e.g., created with tools like repomix)
  • Query the codebase using Gemini's large context window
  • Connect to IDEs that support the MCP protocol, like Cursor and Windsurf
  • Configurable Gemini model selection via command-line arguments

Prerequisites

  • Python 3.13+
  • Gemini API key from Google AI Studio

Installation

Installing via Smithery

To install DeepView for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @ai-1st/deepview-mcp --client claude

Using pip

pip install deepview-mcp

Usage

Starting the Server

Note: you don't need to start the server manually; these parameters are configured in your IDE's MCP setup (see below).

# Basic usage with default settings
deepview-mcp [path/to/codebase.txt]

# Specify a different Gemini model
deepview-mcp [path/to/codebase.txt] --model gemini-2.0-pro

# Change log level
deepview-mcp [path/to/codebase.txt] --log-level DEBUG

The codebase file parameter is optional. If not provided, you'll need to specify it when making queries.

Command-line Options

  • --model MODEL: Specify the Gemini model to use (default: gemini-2.0-flash-lite)
  • --log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}: Set the logging level (default: INFO)

Using with an IDE (Cursor/Windsurf/...)

  1. Open IDE settings
  2. Navigate to the MCP configuration
  3. Add a new MCP server with the following configuration:
    {
      "mcpServers": {
        "deepview": {
          "command": "/path/to/deepview-mcp",
          "args": [],
          "env": {
            "GEMINI_API_KEY": "your_gemini_api_key"
          }
        }
      }
    }
    

Setting a codebase file is optional. If you are working with the same codebase, you can set the default codebase file using the following configuration:

{
   "mcpServers": {
     "deepview": {
       "command": "/path/to/deepview-mcp",
       "args": ["/path/to/codebase.txt"],
       "env": {
         "GEMINI_API_KEY": "your_gemini_api_key"
       }
     }
   }
 }

Here's how to specify which Gemini model to use:

{
   "mcpServers": {
     "deepview": {
       "command": "/path/to/deepview-mcp",
       "args": ["--model", "gemini-2.5-pro-exp-03-25"],
       "env": {
         "GEMINI_API_KEY": "your_gemini_api_key"
       }
     }
   }
}
  4. Reload the MCP servers configuration

Available Tools

The server provides one tool:

  1. deepview: Ask a question about the codebase
    • Required parameter: question - The question to ask about the codebase
    • Optional parameter: codebase_file - Path to a codebase file to load before querying
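Under the hood, the IDE invokes this tool over MCP's JSON-RPC transport. A tools/call request for deepview might look like the following sketch (shape per the MCP specification; the question text and file path are illustrative):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "deepview",
    "arguments": {
      "question": "Where is the authentication flow implemented?",
      "codebase_file": "/path/to/codebase.txt"
    }
  }
}

If codebase_file was already supplied on the command line or in the MCP server args, the arguments object can carry only the question.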

Preparing Your Codebase

DeepView MCP requires a single file containing your entire codebase. You can use repomix to prepare your codebase in an AI-friendly format.
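If you prefer not to use repomix, a minimal packing script is easy to write yourself. The sketch below (a hypothetical helper, not part of DeepView MCP; extensions and exclusions are illustrative) walks a project directory and concatenates matching source files into one text file, preceding each with a header naming its path:

```python
# Minimal alternative to repomix: pack source files into one text file
# that DeepView MCP can load. Adjust EXTENSIONS/EXCLUDED for your project.
from pathlib import Path

EXTENSIONS = {".py", ".js", ".ts"}
EXCLUDED = {"node_modules", "venv", "__pycache__"}

def pack_codebase(root: str, output: str) -> int:
    """Write every matching source file under `root` into `output`,
    each preceded by a header line naming its relative path.
    Returns the number of files packed."""
    root_path = Path(root)
    count = 0
    with open(output, "w", encoding="utf-8") as out:
        for path in sorted(root_path.rglob("*")):
            if not path.is_file() or path.suffix not in EXTENSIONS:
                continue
            if any(part in EXCLUDED for part in path.parts):
                continue
            out.write(f"===== {path.relative_to(root_path)} =====\n")
            out.write(path.read_text(encoding="utf-8", errors="replace"))
            out.write("\n")
            count += 1
    return count
```

Point DeepView MCP at the resulting file (e.g. deepview-mcp packed.txt) the same way you would a repomix output file.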

Using repomix

  1. Basic Usage: Run repomix in your project directory to create a default output file:
# Make sure you're using Node.js 18.17.0 or higher
npx repomix

This will generate a repomix-output.xml file containing your codebase.

  2. Custom Configuration: Create a configuration file to customize which files get packaged and the output format:
npx repomix --init

This creates a repomix.config.json file that you can edit to:

  • Include/exclude specific files or directories
  • Change the output format (XML, JSON, TXT)
  • Set the output filename
  • Configure other packaging options

Example repomix Configuration

Here's an example repomix.config.json file:

{
  "include": [
    "**/*.py",
    "**/*.js",
    "**/*.ts",
    "**/*.jsx",
    "**/*.tsx"
  ],
  "exclude": [
    "node_modules/**",
    "venv/**",
    "**/__pycache__/**",
    "**/test/**"
  ],
  "output": {
    "format": "xml",
    "filename": "my-codebase.xml"
  }
}

For more information on repomix, visit the repomix GitHub repository.

License

MIT

Author

Dmitry Degtyarev (ddegtyarev@gmail.com)

deepview-mcp FAQ

How do I install DeepView MCP?
You can install DeepView MCP via Smithery CLI or manually using Python 3.13+ and pip.
What are the prerequisites for running DeepView MCP?
DeepView MCP requires Python 3.13+ and a Gemini API key from Google AI Studio.
Can DeepView MCP work with IDEs other than Cursor and Windsurf?
Yes, it supports any IDE that implements the MCP protocol.
How do I configure which Gemini model DeepView MCP uses?
You can select and configure the Gemini model via command-line arguments when starting the server.
Does DeepView MCP support other LLM providers besides Gemini?
DeepView MCP is optimized for Gemini but can be extended to support other providers like OpenAI and Anthropic with additional integration.
Is DeepView MCP suitable for very large codebases?
Yes, it is designed to handle large codebases efficiently by leveraging Gemini's large context window.
How does DeepView MCP improve code analysis in IDEs?
It enables IDEs to query and understand entire codebases deeply, improving navigation, refactoring, and comprehension.
Where can I find the source code for DeepView MCP?
The source code is available on GitHub under the repository named deepview-mcp.