cf-mcp-client


CF-MCP-Client is a Spring chatbot application designed for deployment on Cloud Foundry. It integrates with platform AI services using the Model Context Protocol (MCP) and memGPT, enabling advanced conversational AI capabilities. The client requires Java 21+, Maven 3.8+, and access to Cloud Foundry with GenAI or other LLM services. It supports easy deployment and binding to LLM models within Cloud Foundry environments.

Use This MCP Client To

  • Deploy AI chatbots on Cloud Foundry platforms
  • Integrate platform LLM services with chatbot applications
  • Leverage MCP for structured AI context in chat workflows
  • Use memGPT for enhanced conversational AI capabilities
  • Bind and manage LLM services in Cloud Foundry environments
  • Build scalable AI chat clients with Spring and MCP
  • Enable real-time AI interactions in cloud-native apps

README

CF-MCP-Client: AI Chat Client for Cloud Foundry

Overview

CF-MCP-Client is a Spring chatbot application that can be deployed to Cloud Foundry to consume platform AI services. It's built with Spring AI and leverages the Model Context Protocol (MCP) and memGPT to provide advanced capabilities such as invoking external tools through MCP agents and maintaining extended conversational memory.

Prerequisites

  • Java 21 or higher
  • Maven 3.8+
  • Access to a Cloud Foundry Foundation with the GenAI tile or other LLM services
  • Developer access to your Cloud Foundry environment
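
Before building, it can help to confirm that the toolchain and Cloud Foundry access are in place. The commands below are a quick sanity check; the API endpoint, org, and space names are placeholders for your own environment:

     java -version     # should report Java 21 or newer
     mvn -version      # should report Maven 3.8 or newer
     cf login -a https://api.sys.example.com -o my-org -s my-space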

Deploying to Cloud Foundry

Preparing the Application

  1. Build the application package:
     mvn clean package
  2. Push the application to Cloud Foundry:
     cf push
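
If the project does not include a manifest.yml, the application name and artifact path can be passed to cf push directly; the JAR name below is a placeholder for whatever Maven produces under target/. Either way, cf apps and cf logs are a quick way to confirm the deployment:

     cf push ai-tool-chat -p target/cf-mcp-client-0.0.1-SNAPSHOT.jar -m 1G
     cf apps                          # the app should report "running"
     cf logs ai-tool-chat --recent    # inspect startup output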

Binding to LLM Models

  1. Create a service instance that provides chat LLM capabilities:
     cf create-service genai [plan-name] chat-llm
  2. Bind the service to your application:
     cf bind-service ai-tool-chat chat-llm
  3. Restart your application to apply the binding:
     cf restart ai-tool-chat

Now your chatbot will use the LLM to respond to chat requests.

[Screenshot: Binding to Models]
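
To confirm that the binding took effect, standard cf commands can be used; the credentials injected by the GenAI service appear under VCAP_SERVICES in the environment output:

     cf services              # chat-llm should list ai-tool-chat under "bound apps"
     cf env ai-tool-chat      # shows the injected credentials in VCAP_SERVICES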

Binding to Vector Databases

  1. Create a service instance that provides embedding LLM capabilities:
     cf create-service genai [plan-name] embedding-llm
  2. Create a Postgres service instance to use as a vector database:
     cf create-service postgres on-demand-postgres-db vector-db
  3. Bind the services to your application:
     cf bind-service ai-tool-chat embedding-llm
     cf bind-service ai-tool-chat vector-db
  4. Restart your application to apply the bindings:
     cf restart ai-tool-chat
  5. Click on the document tool on the right side of the screen and upload a PDF file.

Now your chatbot will respond to queries about the uploaded document.

[Screenshot: Vector DBs]
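
On-demand Postgres instances are typically provisioned asynchronously, so it is worth checking that both service instances finished creating and are bound before testing a document upload:

     cf service vector-db       # status should read "create succeeded"
     cf service embedding-llm
     cf services                # both instances should list ai-tool-chat as a bound app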

Binding to MCP Agents

Model Context Protocol (MCP) servers are lightweight programs that expose specific capabilities to AI models through a standardized interface. These servers act as bridges between LLMs and external tools, data sources, or services, allowing your AI application to perform actions like searching databases, accessing files, or calling external APIs without complex custom integrations.

  1. Create a user-provided service that provides the URL for an existing MCP server:
     cf cups mcp-server -p '{"mcpServiceURL":"https://your-mcp-server.example.com"}'
  2. Bind the MCP service to your application:
     cf bind-service ai-tool-chat mcp-server
  3. Restart your application:
     cf restart ai-tool-chat

Your chatbot will now register with the MCP agent, and the LLM will be able to invoke the agent's capabilities when responding to chat requests.

[Screenshot: Binding to Agents]
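
If the MCP server endpoint changes later, the user-provided service can be updated in place; the URL below is a placeholder. Restage (or restart) the app so the updated credentials are re-read:

     cf update-user-provided-service mcp-server -p '{"mcpServiceURL":"https://new-mcp-server.example.com"}'
     cf restage ai-tool-chat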

Binding to memGPT for Extended Memory

If you have access to a compatible memGPT implementation service:

  1. Create a user-provided service for the memGPT service:
     cf cups memGPT -p '{"memGPTUrl":"https://your-memgpt-service.example.com"}'
  2. Bind the memGPT service to your application:
     cf bind-service ai-tool-chat memGPT
  3. Restart your application:
     cf restart ai-tool-chat

[Screenshot: Binding to Memory]
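
If the extended-memory integration is no longer needed, it can be removed with the standard unbind flow:

     cf unbind-service ai-tool-chat memGPT
     cf restart ai-tool-chat
     cf delete-service memGPT     # optional: remove the user-provided service entirely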

cf-mcp-client FAQ

What are the prerequisites for deploying CF-MCP-Client?
You need Java 21 or higher, Maven 3.8+, and access to a Cloud Foundry environment with GenAI or other LLM services.

How do I deploy CF-MCP-Client to Cloud Foundry?
Build the app with Maven, then push it with the 'cf push' command.

How do I bind an LLM service to CF-MCP-Client?
Create a GenAI chat LLM service instance, bind it to the app, and restart the app to apply the changes.

Can CF-MCP-Client work with multiple LLM providers?
Yes, it supports integration with various LLM services available on Cloud Foundry, including OpenAI, Claude, and Gemini.

What programming language and framework is CF-MCP-Client built with?
It is built with Java and the Spring framework.

Does CF-MCP-Client support real-time AI chat interactions?
Yes, it leverages MCP and memGPT to enable advanced real-time conversational AI.

Is CF-MCP-Client suitable for cloud-native AI applications?
Yes, it is designed for scalable deployment on Cloud Foundry, making it well suited to cloud-native AI chat solutions.

How do I update CF-MCP-Client after deployment?
Rebuild the application package with Maven and push the updated app to Cloud Foundry with 'cf push'.