ai-prompts-mcp

MCP.Pizza Chef: instructa

ai-prompts-mcp is a TypeScript-based MCP server implementation designed to manage and serve AI prompts via a structured API. It uses a monorepo architecture with pnpm workspaces, enabling scalable and configurable prompt management for AI models. This server facilitates real-time prompt delivery and environment-based configuration, making it ideal for integrating AI prompt workflows in development environments.

Use This MCP Server To

  • Serve AI prompt templates to LLMs in real time
  • Manage and version AI prompts in a centralized API
  • Configure prompt delivery dynamically via environment variables
  • Integrate prompt management into AI-enhanced applications
  • Support multi-model prompt orchestration with the MCP protocol
  • Enable developers to build custom AI prompt workflows
  • Host prompt APIs for scalable AI agent deployments

README

Instructa AI - MCP Prompts API

This repository contains a Model Context Protocol (MCP) implementation for managing and serving AI prompts. The project is built using TypeScript and follows a monorepo structure using pnpm workspaces.

🚀 Features

  • MCP (Model Context Protocol) implementation
  • TypeScript-based architecture
  • Monorepo structure using pnpm workspaces
  • Environment-based configuration

📋 Prerequisites

  • Node.js (LTS version recommended)
  • pnpm (Package manager)

πŸ› οΈ Installation

  1. Clone the repository:
git clone https://github.com/yourusername/instructa-ai.git
cd instructa-ai
  1. Install dependencies:
pnpm install
  1. Set up environment variables:
cp packages/mcp/.env.dist packages/mcp/.env
# Edit .env file with your configuration
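The actual keys live in packages/mcp/.env.dist; the variable names below are purely illustrative placeholders, not taken from that file. A typical server configuration might look like:

```shell
# Hypothetical example values; check packages/mcp/.env.dist for the real keys
PORT=3000                # port the MCP server listens on
NODE_ENV=development     # environment mode
```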

πŸƒβ€β™‚οΈ Development

To start the development server:

# Start MCP development server
pnpm dev:mcp

# Build MCP package
pnpm build:mcp

# Start MCP production server
pnpm start:mcp
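The server's internals aren't documented here, but the core idea of centralized, templated prompt management can be sketched in plain TypeScript. Everything below (the `PromptRegistry` class and the `{{var}}` placeholder syntax) is illustrative, not the project's actual API:

```typescript
// A minimal sketch of a centralized prompt registry with template
// substitution. Placeholders use a hypothetical {{name}} syntax.
class PromptRegistry {
  private prompts = new Map<string, string>();

  // Store a named prompt template.
  register(name: string, template: string): void {
    this.prompts.set(name, template);
  }

  // Fill a template's {{placeholders}} from the given variables;
  // unknown placeholders are left intact.
  render(name: string, vars: Record<string, string>): string {
    const template = this.prompts.get(name);
    if (!template) throw new Error(`Unknown prompt: ${name}`);
    return template.replace(/\{\{(\w+)\}\}/g, (_, key) => vars[key] ?? `{{${key}}}`);
  }
}

const registry = new PromptRegistry();
registry.register(
  "summarize",
  "Summarize the following text in {{style}} style:\n{{text}}",
);
const rendered = registry.render("summarize", {
  style: "bullet-point",
  text: "MCP servers expose prompts to LLM clients.",
});
```

A real MCP server would expose such templates over the protocol (for example via the TypeScript MCP SDK) rather than through a local class, but the registry captures the serve-and-substitute pattern.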

πŸ—οΈ Project Structure

.
β”œβ”€β”€ packages/
β”‚   └── mcp/          # MCP implementation package
β”‚       β”œβ”€β”€ src/      # Source code
β”‚       └── .env      # Environment configuration
β”œβ”€β”€ package.json      # Root package.json
└── pnpm-workspace.yaml

🤝 Contributing

Contributions, issues, and feature requests are welcome! Feel free to check the issues page.

🌐 Social

  • X/Twitter: @kregenrek
  • Bluesky: @kevinkern.dev

πŸ“ License

MIT License

This repository is open-source under the MIT license. You're free to use, modify, and distribute it under those terms. Enjoy building!

ai-prompts-mcp FAQ

How do I install ai-prompts-mcp?
Clone the repo, install dependencies with pnpm, and configure environment variables as per the README.
What prerequisites are needed to run ai-prompts-mcp?
Node.js LTS version and pnpm package manager are required.
How do I start the development server for ai-prompts-mcp?
Use the command 'pnpm dev:mcp' to start the development server.
Can ai-prompts-mcp be used in production?
Yes, you can build and start the production server using 'pnpm build:mcp' and 'pnpm start:mcp'.
Is ai-prompts-mcp limited to a specific LLM provider?
No. As an MCP server it works with any MCP-compatible client or host, so the underlying LLM provider (OpenAI, Claude, Gemini, and others) is not restricted.
How is configuration managed in ai-prompts-mcp?
Configuration is environment-based, allowing flexible setup via .env files.
What programming language is ai-prompts-mcp built with?
It is built using TypeScript for strong typing and maintainability.
Does ai-prompts-mcp support multi-prompt workflows?
Yes, it supports orchestrating multiple prompts through the MCP protocol.
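The multi-prompt point above can be illustrated with a minimal sketch of sequential prompt orchestration, where each stage's model output feeds the next stage's template. The pipeline and the echo model are stand-ins, not part of ai-prompts-mcp's actual API:

```typescript
// Illustrative only: chain prompt templates so each stage's model output
// fills the next stage's {{input}} slot.
type LLM = (prompt: string) => string;

// Stand-in for a real model call; simply echoes the prompt back.
const echoModel: LLM = (prompt) => `[response to: ${prompt}]`;

function runPipeline(model: LLM, templates: string[], input: string): string {
  return templates.reduce(
    (carry, template) => model(template.replace("{{input}}", carry)),
    input,
  );
}

const result = runPipeline(
  echoModel,
  ["Extract key facts from: {{input}}", "Summarize: {{input}}"],
  "MCP standardizes how prompts reach LLM clients.",
);
```

In a real deployment the templates would come from the MCP server and the model calls would go to an actual LLM client; the reduce step is what "orchestration" amounts to in the simplest sequential case.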