mcp-client-langchain-ts

MCP.Pizza Chef: hideya

mcp-client-langchain-ts is a TypeScript-based CLI client for the Model Context Protocol (MCP) that leverages the LangChain ReAct Agent framework. It simplifies the integration of multiple MCP servers by converting their tools into LangChain-compatible StructuredTools, enabling parallel initialization and seamless interaction. This client supports LLMs from Anthropic, OpenAI, and Groq, making it versatile for various AI workflows. It is ideal for developers looking to build AI agents that interact with diverse MCP servers using a modern TypeScript environment. A Python counterpart is also available for cross-language compatibility.

Use This MCP Client To

  • Integrate multiple MCP servers into a single LangChain agent
  • Run MCP client workflows via a TypeScript CLI
  • Convert MCP server tools to LangChain StructuredTools
  • Initialize and manage parallel MCP server connections
  • Build AI agents using LangChain ReAct with MCP context
  • Support multi-provider LLMs such as Anthropic, OpenAI, and Groq

README

MCP Client Using LangChain / TypeScript

License: MIT

This simple Model Context Protocol (MCP) client demonstrates the use of MCP server tools by a LangChain ReAct agent.

It leverages the utility function convertMcpToLangchainTools() from @h1deya/langchain-mcp-tools.
This function handles the parallel initialization of multiple specified MCP servers and converts their available tools into an array of LangChain-compatible tools (StructuredTool[]).

LLMs from Anthropic, OpenAI, and Groq are currently supported.

A Python version of this MCP client is also available.

Prerequisites

  • Node.js 16+
  • npm 7+ (npx) to run Node.js-based MCP servers
  • [optional] uv (uvx) installed to run Python-based MCP servers
  • API keys from Anthropic, OpenAI, and/or Groq, as needed

Setup

  1. Install dependencies:

    npm install
  2. Setup API keys:

    cp .env.template .env
    • Update .env as needed.
    • .gitignore is configured to ignore .env to prevent accidental commits of the credentials.
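The resulting .env would then hold entries like the following. The variable names shown here follow the providers' common SDK conventions and are assumptions; check .env.template for the exact names this project expects, and keep only the keys you need:

```
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key
GROQ_API_KEY=your-groq-key
```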
  3. Configure the LLM and MCP server settings in llm_mcp_config.json5 as needed.

    • The configuration file format for MCP servers follows the same structure as Claude for Desktop, with one difference: the key name mcpServers has been changed to mcp_servers to follow the snake_case convention commonly used in JSON configuration files.
    • The file format is JSON5, where comments and trailing commas are allowed.
    • The format is further extended to replace ${...} notations with the values of corresponding environment variables.
    • Keep all the credentials and private info in the .env file and refer to them with ${...} notation as needed.
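Putting the points above together, a minimal llm_mcp_config.json5 might look like this. The llm values and server entries are illustrative placeholders, not defaults shipped with this repo:

```json5
{
  // LLM settings; provider and model names here are examples
  llm: {
    provider: "anthropic",
    model: "claude-3-5-haiku-latest",
  },

  // Same structure as Claude for Desktop's config,
  // but with the snake_case key name mcp_servers
  mcp_servers: {
    filesystem: {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
    "brave-search": {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-brave-search"],
      env: {
        // resolved from .env via the ${...} extension
        BRAVE_API_KEY: "${BRAVE_API_KEY}",
      },
    }, // JSON5 allows comments and trailing commas
  },
}
```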

Usage

Run the app:

npm start

Run in verbose mode:

npm run start:v

See command-line options:

npm run start:h

At the prompt, you can simply press Enter to use example queries that perform MCP server tool invocations.

Example queries can be configured in llm_mcp_config.json5.

mcp-client-langchain-ts FAQ

How do I install mcp-client-langchain-ts?
Ensure Node.js 16+ and npm 7+ are installed, then clone the repo and run npm install.
Which LLM providers are supported?
This client supports Anthropic, OpenAI, and Groq LLMs for flexible AI model usage.
Can I use this client with multiple MCP servers simultaneously?
Yes, it supports parallel initialization and integration of multiple MCP servers.
Is there a version of this client in other programming languages?
Yes, a Python version is available for users preferring Python environments.
How does the client convert MCP tools for LangChain?
It uses the convertMcpToLangchainTools() utility to transform MCP server tools into LangChain StructuredTools.
What Node.js version is required?
Node.js version 16 or higher is required to run this client.
Does this client provide a graphical interface?
No, it is a command-line interface (CLI) client designed for terminal use.
Where can I find the license information?
The project is licensed under MIT, with details available in the GitHub repository.