Build Effective Agents with Model Context Protocol in TypeScript
mcp-agent is a TypeScript framework inspired by the Python lastmile-ai/mcp-agent project. It provides a simple, composable, and type-safe way to build AI agents leveraging the Model Context Protocol (MCP).
This library aims to bring the powerful patterns and architecture of mcp-agent to the JavaScript ecosystem, enabling developers to create robust and controllable AI agents that can interact with MCP-aware services and tools.
First, create or update your `.npmrc` file with:

```
@joshuaalpuerto:registry=https://npm.pkg.github.com
```

Then install the package:

```
npm install @joshuaalpuerto/mcp-agent
```

mcp-agent empowers you to build sophisticated AI agents with the following core capabilities:
- Agent Abstraction: Define intelligent agents with clear instructions, access to tools (both local functions and MCP servers), and integrated LLM capabilities.
- Model Context Protocol (MCP) Integration: Seamlessly connect and interact with services and tools exposed through MCP servers.
- Local Function Tools: Extend agent capabilities with custom, in-process JavaScript/TypeScript functions that act as tools, alongside MCP server-based tools.
- LLM Flexibility: Integrate with various Large Language Models (LLMs). The library includes an example implementation for Fireworks AI, demonstrating extensibility for different LLM providers.
- Memory Management: Basic in-memory message history to enable conversational agents.
- Workflows: Implement complex agent workflows like the `Orchestrator` pattern to break down tasks into steps and coordinate multiple agents. Support for additional patterns from Anthropic's Building Effective Agents and OpenAI's Swarm coming soon.
- TypeScript & Type Safety: Built with TypeScript, providing strong typing, improved code maintainability, and enhanced developer experience.
Get started quickly with a basic example (using the library standalone):
```typescript
import { fileURLToPath } from 'url';
import path from 'path';
import { Agent, LLMFireworks, Orchestrator } from '@joshuaalpuerto/mcp-agent';
import { createSmitheryUrl } from '@smithery/sdk'; // check your SDK version for the exact import path
import { writeLocalSystem } from './tools/writeLocalSystem'; // example local function tool (sketched below)

const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);

async function runOrchestrator() {
  const llm = new LLMFireworks("accounts/fireworks/models/deepseek-v3", { // example Fireworks model
    maxTokens: 2048,
    temperature: 0.1
  });

  const researcher = await Agent.initialize({
    llm,
    name: "researcher",
    description: `Your expertise is to find information.`,
    serverConfigs: [ // example MCP server configurations
      {
        name: "read_file_from_local_file_system",
        type: "stdio",
        command: "node",
        args: ['--loader', 'ts-node/esm', path.resolve(__dirname, 'servers', 'readLocalFileSystem.ts')]
      },
      {
        name: "search_web",
        type: "ws",
        url: createSmitheryUrl( // example using a community MCP server via @smithery/sdk
          "https://server.smithery.ai/exa/ws",
          {
            exaApiKey: process.env.EXA_API_KEY
          }
        )
      },
    ],
  });

  const writer = await Agent.initialize({
    llm,
    name: "writer",
    description: `Your expertise is to write information to a file.`,
    functions: [writeLocalSystem], // example local function tool
  });

  const orchestrator = new Orchestrator({
    llm,
    agents: [researcher, writer],
  });

  const result = await orchestrator.generate('Search the latest developments about AI and write about them to `theory_on_ai.md` on my local machine. No need to verify the result.');
  console.log(JSON.stringify(result));

  await researcher.close();
  await writer.close();
}

runOrchestrator().catch(console.error);
```
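The quick start imports a `writeLocalSystem` local function tool that is not shown above. The library's exact tool shape isn't documented here, so the following is only a hedged sketch, assuming a descriptor with a JSON-schema `parameters` object and an async handler:

```typescript
// Hypothetical sketch of the `writeLocalSystem` tool from the quick start.
// The descriptor shape (name/description/parameters/execute) is an assumption;
// verify it against the library's `functions` type.
import fs from 'fs/promises';
import path from 'path';

export const writeLocalSystem = {
  name: 'write_local_file_system',
  description: 'Write text content to a file on the local file system.',
  parameters: {
    type: 'object',
    properties: {
      filePath: { type: 'string', description: 'Path of the file to write.' },
      content: { type: 'string', description: 'Text content to write.' },
    },
    required: ['filePath', 'content'],
  },
  // Handler invoked when the LLM calls this tool.
  execute: async ({ filePath, content }: { filePath: string; content: string }) => {
    const resolved = path.resolve(filePath);
    await fs.writeFile(resolved, content, 'utf-8');
    return `Wrote ${content.length} characters to ${resolved}`;
  },
};
```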
To run this example:
- Install Dependencies: `pnpm install`
- Set Environment Variables: Create a `.env` file (or set environment variables directly) and add your API keys (e.g., `EXA_API_KEY`, and your Fireworks AI API key if needed); a sample `.env` follows these steps.
- Run the Demo: `node --loader ts-node/esm ./demo/standalone/index.ts`
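A `.env` file for the demo might look like the following; `EXA_API_KEY` matches the quick start above, while the Fireworks variable name is an assumption to check against how your LLM client reads its key:

```
# EXA_API_KEY is used by the Smithery web-search server in the quick start.
EXA_API_KEY=your-exa-api-key
# Assumed variable name -- confirm what your Fireworks client expects.
FIREWORKS_API_KEY=your-fireworks-api-key
```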
For a complete Express.js integration example with multi-agent orchestration, check out `demo/express/README.md`.
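As a rough illustration of that integration style (a sketch, not the demo's actual code), agents can be initialized once at startup and reused across requests; the route, port, and the minimal `Agent.initialize` call here are assumptions:

```typescript
import express from 'express';
import { Agent, LLMFireworks, Orchestrator } from '@joshuaalpuerto/mcp-agent';

async function main() {
  // Initialize the LLM and agents once at startup, then reuse them per request.
  // serverConfigs / functions are omitted for brevity; see the quick start.
  const llm = new LLMFireworks('accounts/fireworks/models/deepseek-v3', { maxTokens: 2048 });
  const researcher = await Agent.initialize({
    llm,
    name: 'researcher',
    description: 'Your expertise is to find information.',
  });
  const orchestrator = new Orchestrator({ llm, agents: [researcher] });

  const app = express();
  app.use(express.json());

  // POST { "prompt": "..." } -> orchestrator result as JSON
  app.post('/generate', async (req, res) => {
    try {
      const result = await orchestrator.generate(req.body.prompt);
      res.json({ result });
    } catch (err) {
      res.status(500).json({ error: String(err) });
    }
  });

  app.listen(3000, () => console.log('Listening on :3000'));
}

main().catch(console.error);
```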
Core concepts:

- Agent: The fundamental building block. An `Agent` is an autonomous entity with a specific role, instructions, and access to tools.
- MCP Server Aggregator (`MCPServerAggregator`): Manages connections to multiple MCP servers, providing a unified interface for agents to access tools.
- MCP Connection Manager (`MCPConnectionManager`): Handles the lifecycle and reuse of MCP server connections, optimizing resource usage.
  - Supported transports: `stdio`, `sse`, `streamable-http` & `websockets` (see the config sketch after this list).
- LLM Integration (`LLMInterface`, `LLMFireworks`): Abstracts interaction with Large Language Models. `LLMFireworks` is an example implementation for Fireworks AI models.
- Tools: Functions or MCP server capabilities that Agents can use to perform actions. Tools can be:
  - MCP Server Tools: Capabilities exposed by external MCP servers (e.g., file system access, web search).
  - Local Function Tools: JavaScript/TypeScript functions defined directly within your application.
- Workflows: Composable patterns for building complex agent behaviors (see Anthropic's Building Effective Agents blog post).
  - Orchestrator: Coordinates multiple agents to achieve a larger objective.
  - Prompt chaining: Coming soon.
  - Routing: Coming soon.
  - Parallelization: Coming soon.
  - Evaluator-optimizer: Coming soon.
- Memory (`SimpleMemory`): Provides basic in-memory message history for conversational agents.
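To make the transport list above concrete, here is a hedged sketch of what `serverConfigs` entries for each transport could look like. The `stdio` and `ws` shapes mirror the quick start; the `sse` and `streamable-http` entries assume a `url` field by analogy and should be verified against the library's types:

```typescript
// Illustrative serverConfigs covering each supported transport.
const serverConfigs = [
  {
    name: 'local_tool_server',
    type: 'stdio',                      // spawn a local process and talk over stdin/stdout
    command: 'node',
    args: ['./servers/myServer.js'],
  },
  {
    name: 'sse_server',
    type: 'sse',                        // assumed shape: Server-Sent Events endpoint
    url: 'https://example.com/mcp/sse',
  },
  {
    name: 'http_server',
    type: 'streamable-http',            // assumed shape: streamable HTTP endpoint
    url: 'https://example.com/mcp',
  },
  {
    name: 'ws_server',
    type: 'ws',                         // WebSocket endpoint, as in the quick start
    url: 'wss://example.com/mcp/ws',
  },
];
```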
This project is heavily inspired by and builds upon the concepts and architecture of the excellent lastmile-ai/mcp-agent Python framework.
We encourage you to explore their repository for a deeper understanding of the underlying principles and patterns that have informed this TypeScript implementation.
Contributions are welcome!