
shadcn-ui-mcp-unofficial

MCP.Pizza Chef: Jpisnice

shadcn-ui-mcp-unofficial is a TypeScript MCP server that enables AI models to access detailed context about shadcn/ui components, including source code, demos, and installation instructions. It supports communication via STDIO or HTTP/SSE, making it versatile for integration in various AI assistant workflows. Designed for containerized deployment with Docker or Podman, it simplifies AI-driven interactions with UI component libraries.

Use This MCP Server To

  • Fetch shadcn/ui component source code for AI analysis
  • Retrieve usage examples and demos of shadcn/ui components
  • Provide installation guides for shadcn/ui to AI models
  • Enable AI assistants to answer questions about shadcn/ui components
  • Integrate with AI workflows to dynamically explore UI components
  • Support web clients via HTTP/SSE for real-time component data
  • Run in containerized environments for easy deployment

README

Shadcn UI MCP Server

A TypeScript implementation of a Model Context Protocol (MCP) server designed to help AI assistants interact with shadcn/ui components. It allows AI models to fetch component source code, demos, and installation guides.

Running with Docker / Podman

The recommended way to run this server is using Docker containers, managed with Docker Compose (or podman-compose).

Server Modes:

  1. STDIO Server: Communicates via standard input/output.
  2. HTTP/SSE Server: Communicates via HTTP and Server-Sent Events (ideal for web clients).

Prerequisites

  • Docker with Docker Compose, or Podman with podman-compose

Environment Variables

Configure the server using a .env file (copy from .env.example):

cp .env.example .env

Modify .env as needed (e.g., PORT for the HTTP server, GITHUB_PERSONAL_ACCESS_TOKEN). Docker Compose automatically loads variables from this file.
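For reference, a minimal .env might look like the following. Only the variables mentioned above are shown, and the token value is a placeholder, not a real credential:

```shell
# Port the HTTP/SSE server listens on (must match the host mapping in compose.yaml)
PORT=3000
# GitHub token used when fetching component sources (placeholder value)
GITHUB_PERSONAL_ACCESS_TOKEN=ghp_your_token_here
```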

Using Docker Compose

The compose.yaml defines two services: mcp-server-stdio and mcp-server-http.

1. Build Images:

# Build all services
docker compose build

# Or a specific service (e.g., HTTP)
docker compose build mcp-server-http

2. Run HTTP/SSE Server:

# Start in detached mode
docker compose up -d mcp-server-http
  • Access: http://localhost:3000 (SSE: /sse, Messages: /messages)
  • Port 3000 (host) maps to 3000 (container). PORT is also set in compose.yaml.
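For a quick smoke test of the SSE endpoint (assuming the default port above), a client can subscribe with curl. SSE frames are plain-text `field: value` lines, so the JSON payload can be peeled out of a captured frame like this (the frame below is an illustrative sample, not actual server output):

```shell
# Subscribe to the live event stream (Ctrl-C to stop):
#   curl -N http://localhost:3000/sse

# Example of extracting the data payload from a captured SSE frame:
frame='event: message
data: {"jsonrpc":"2.0","id":1,"result":{}}'

# sed keeps only lines starting with "data: " and strips the prefix
data=$(printf '%s\n' "$frame" | sed -n 's/^data: //p')
echo "$data"
```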

3. Run STDIO Server:

# Start in foreground
docker compose up mcp-server-stdio

4. View Logs:

docker compose logs -f mcp-server-http # Or mcp-server-stdio

5. Stop Servers:

docker compose stop mcp-server-http # Or mcp-server-stdio
# Stop and remove all services
docker compose down

Features

This MCP server provides the following capabilities:

Tools

  1. get_component:

    • Retrieves the source code of a specified shadcn/ui component.
    • Parameter: componentName (string) - e.g., "button".
    • Returns: Component source code.
  2. get_component_demo:

    • Fetches demo code for a shadcn/ui component.
    • Parameter: componentName (string).
    • Returns: Demo code.
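As an illustration of how a client invokes these tools, here is a sketch of a tools/call request for get_component, following the JSON-RPC 2.0 shape defined by the Model Context Protocol; the /messages endpoint path comes from the HTTP/SSE section above:

```shell
# Illustrative MCP "tools/call" request for the get_component tool:
payload='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"get_component","arguments":{"componentName":"button"}}}'
echo "$payload"

# In HTTP/SSE mode this would be POSTed to the messages endpoint, e.g.:
#   curl -X POST http://localhost:3000/messages \
#     -H 'Content-Type: application/json' -d "$payload"
```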

Resources

  1. resource:get_components:
    • Lists all available shadcn/ui components.

Resource Templates

  1. resource-template:get_install_script_for_component:

    • Generates installation script for a component.
    • Parameters: packageManager (string - npm, pnpm, yarn, bun), component (string).
  2. resource-template:get_installation_guide:

    • Provides framework-specific installation guides for shadcn/ui.
    • Parameters: framework (string - next, vite, etc.), packageManager (string).
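To illustrate what the install-script template resolves to, here is a sketch of the per-package-manager commands. These mirror the shadcn/ui CLI documentation at the time of writing and may change between releases, so treat them as examples rather than the server's exact output:

```shell
component="button"
pm="pnpm"

# Map each supported package manager to its shadcn CLI invocation
case "$pm" in
  npm)  cmd="npx shadcn@latest add $component" ;;
  pnpm) cmd="pnpm dlx shadcn@latest add $component" ;;
  yarn) cmd="yarn dlx shadcn@latest add $component" ;;
  bun)  cmd="bunx shadcn@latest add $component" ;;
esac
echo "$cmd"
```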

Additional Resources

shadcn-ui-mcp-unofficial FAQ

How do I deploy the shadcn-ui-mcp-unofficial server?
You can deploy it using Docker or Podman containers, managed with Docker Compose or podman-compose for easy setup.
What communication protocols does this MCP server support?
It supports STDIO for command-line communication and HTTP with Server-Sent Events (SSE) for web clients.
How can I configure the server settings?
Configuration is done via a .env file where you can set parameters like PORT and GitHub access tokens.
Is this MCP server compatible with multiple LLM providers?
Yes, it is provider-agnostic and can work with OpenAI, Claude, Gemini, and others.
Can I use this server to get real-time updates on shadcn/ui components?
Yes, using the HTTP/SSE mode, clients can receive real-time streamed data about components.
What prerequisites are needed before running this server?
You need Docker installed, and optionally Podman with podman-compose for container management.
Does this server provide source code access for shadcn/ui components?
Yes, it allows AI models to fetch component source code, demos, and installation guides for detailed context.
How does this server enhance AI assistant capabilities?
By providing structured, real-time context about UI components, it enables AI to answer detailed questions and generate relevant code snippets.