
agent-memory-server

Maintainer: redis-developer

Agent-memory-server is a Redis-based memory server designed for AI agents and applications. It manages both short-term conversational context and long-term memories, providing features like semantic search, automatic summarization, and flexible REST and MCP APIs. It supports token-aware context management for major LLMs and advanced filtering for memory retrieval, enabling efficient and scalable memory handling for AI workflows.

Use This MCP Server To

  • Store and manage conversational context for AI agents
  • Automatically summarize long conversations for context compression
  • Perform semantic search on long-term memories
  • Filter memories by session, topic, entity, or timestamp
  • Manage token limits based on client LLM context windows
  • Integrate memory management via REST or MCP APIs
  • Enable entity recognition and topic modeling on stored data

README

🔮 Redis Agent Memory Server

A Redis-powered memory server built for AI agents and applications. It manages both conversational context and long-term memories, offering semantic search, automatic summarization, and flexible APIs through both REST and MCP interfaces.

Features

  • Working Memory

    • Session-scoped storage for messages, structured memories, context, and metadata
    • Automatically summarizes conversations when they exceed the window size
    • Client model-aware token limit management (adapts to the context window of the client's LLM)
    • Supports all major OpenAI and Anthropic models
    • Automatic promotion of structured memories to long-term storage
  • Long-Term Memory

    • Persistent storage for memories across sessions
    • Semantic search to retrieve memories, with an advanced filtering system
    • Filter by session, namespace, topics, entities, timestamps, and more
    • Supports both exact match and semantic similarity search
    • Automatic topic modeling for stored memories with BERTopic or a configured LLM
    • Automatic Entity Recognition using BERT
    • Memory deduplication and compaction
  • Other Features

    • Namespace support for session and working memory isolation
    • Both a REST interface and MCP server
    • Background task processing for memory indexing and promotion
    • Unified search across working memory and long-term memory
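The window-size check behind automatic summarization can be sketched roughly as follows. This is a minimal illustration, not the server's actual implementation: the function names are invented here, and a crude word count stands in for the real model-aware token counting described above.

```python
# Minimal sketch of token-aware summarization (illustrative only).
# The real server counts tokens with the client model's tokenizer;
# here a word count stands in as a rough proxy.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~1 token per word."""
    return len(text.split())

def maybe_summarize(messages: list[str], context_window: int,
                    summarize) -> list[str]:
    """If the conversation exceeds the window, replace the oldest
    half of the messages with a single summary message."""
    total = sum(estimate_tokens(m) for m in messages)
    if total <= context_window:
        return messages
    half = len(messages) // 2
    summary = summarize(messages[:half])
    return [summary] + messages[half:]

# Example with a trivial stand-in "summarizer":
msgs = ["hello there friend"] * 10  # ~30 "tokens"
kept = maybe_summarize(
    msgs, context_window=20,
    summarize=lambda ms: f"[summary of {len(ms)} messages]",
)
```

The key design point this mirrors is that summarization is triggered by the client model's context window, so the same conversation may be compressed earlier for a small-context model than for a large-context one.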

For detailed information about memory types, their differences, and when to use each, see the Memory Types Guide.

Authentication

The Redis Agent Memory Server supports OAuth2/JWT Bearer token authentication for secure API access. It's compatible with Auth0, AWS Cognito, Okta, Azure AD, and other standard OAuth2 providers.

For complete authentication setup, configuration, and usage examples, see Authentication Documentation.

For manual Auth0 testing, see the manual OAuth testing guide.
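In practice, an authenticated client attaches a standard OAuth2 Bearer token to each request. The sketch below uses only the standard library; the base URL and `/v1/health` path are placeholders, not taken from the server's documentation.

```python
import urllib.request

# Hypothetical base URL and path; see the Authentication
# Documentation for the server's real endpoints.
BASE_URL = "http://localhost:8000"
token = "eyJhbGciOi..."  # JWT issued by your OAuth2 provider

req = urllib.request.Request(
    f"{BASE_URL}/v1/health",
    headers={"Authorization": f"Bearer {token}"},
)
# urllib.request.urlopen(req) would send the request; the
# Authorization header above is all a standard OAuth2/JWT
# resource server needs to validate the caller.
```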

System Diagram


Project Status and Roadmap

Project Status: In Development, Pre-Release

This project is under active development and is pre-release software. Think of it as an early beta!

Roadmap

  • Long-term memory deduplication and compaction
  • Use a background task system instead of BackgroundTask
  • Authentication/authorization hooks (OAuth2/JWT support)
  • Configurable strategy for moving working memory to long-term memory
  • Separate Redis connections for long-term and working memory

REST API Endpoints

The server provides REST endpoints for managing working memory, long-term memory, and memory search. Key endpoints include session management, memory storage/retrieval, semantic search, and memory-enriched prompts.

For complete API documentation with examples, see REST API Documentation.
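A semantic search request to such an API combines a query text with the filters mentioned earlier (session, namespace, topics, timestamps). The JSON body below is an illustrative sketch: the field names and filter operators are assumptions, not the documented schema.

```python
import json

# Illustrative search request body. Field names and operators
# ("eq", "any", "gt") are assumptions; consult the REST API
# Documentation for the actual schema.
payload = {
    "text": "What did the user say about shipping dates?",
    "session_id": {"eq": "session-123"},
    "namespace": {"eq": "support-bot"},
    "topics": {"any": ["shipping", "orders"]},
    "created_at": {"gt": "2024-01-01T00:00:00Z"},
    "limit": 5,
}
body = json.dumps(payload)  # POST this to the search endpoint
```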

MCP Server Interface

Agent Memory Server offers an MCP (Model Context Protocol) server interface powered by FastMCP, providing tool-based memory management for LLMs and agents. It includes tools for working memory, long-term memory, semantic search, and memory-enriched prompts.

For complete MCP setup and usage examples, see MCP Documentation.
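Under MCP, a client invokes a memory tool with a standard JSON-RPC 2.0 `tools/call` request. The message shape below follows the MCP specification; the tool name `search_long_term_memory` and its arguments are placeholders, not the server's published tool schema.

```python
import json

# A JSON-RPC 2.0 request as defined by the Model Context Protocol.
# The tool name and arguments are illustrative placeholders; see
# the MCP Documentation for the server's actual tools.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_long_term_memory",
        "arguments": {"text": "user preferences", "limit": 3},
    },
}
wire_message = json.dumps(request)  # sent over stdio or HTTP transport
```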

Command Line Interface

The agent-memory-server provides a comprehensive CLI for managing servers and tasks. Key commands include starting API/MCP servers, scheduling background tasks, running workers, and managing migrations.

For complete CLI documentation and examples, see CLI Documentation.

Getting Started

For complete setup instructions, see Getting Started Guide.

Configuration

Servers and workers are configured via environment variables, including settings for background task management, memory compaction, and data migrations.

For complete configuration details, see Configuration Guide.
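An environment-based configuration might look like the following `.env` fragment. The variable names here are illustrative assumptions; consult the Configuration Guide for the settings the server actually reads.

```shell
# Illustrative only — check the Configuration Guide for real names.
REDIS_URL=redis://localhost:6379
PORT=8000
OPENAI_API_KEY=sk-...
```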

License

Apache 2.0 License - see LICENSE file for details.

Development

For development setup, testing, and contributing guidelines, see Development Guide.

agent-memory-server FAQ

How does agent-memory-server handle token limits for different LLMs?
It adapts token count management based on the client model's context window, supporting major OpenAI, Anthropic, and other LLMs.

What types of memory does agent-memory-server support?
It supports both short-term conversational memory and long-term memory with semantic search capabilities.

How can I query memories stored in agent-memory-server?
You can perform exact match or semantic similarity searches with advanced filtering by session, namespace, topics, entities, and timestamps.

What APIs does agent-memory-server provide?
It offers flexible REST and MCP protocol APIs for easy integration with AI applications and agents.

Does agent-memory-server support automatic summarization?
Yes, it automatically and recursively summarizes conversations to manage context efficiently.

Can agent-memory-server recognize topics and entities automatically?
Yes, it uses BERTopic for topic modeling and BERT for entity recognition on stored memories.

Is agent-memory-server scalable for large AI applications?
Yes, leveraging Redis allows it to handle large volumes of memory data efficiently and quickly.

Can agent-memory-server be integrated with multiple LLM providers?
Yes, it supports major providers including OpenAI and Anthropic, making it versatile for various AI workflows.