Persistent memory infrastructure for AI agents. Store, recall, and reason across sessions – not just conversations.
Every session starts from zero. Your AI has amnesia.
AI agents forget everything after each session. Weeks of context – gone in an instant.
Context windows are temporary – your knowledge shouldn't be. Token limits force forgetting.
Teams waste hours re-explaining context to AI tools. The same instructions, over and over.
Inspired by how human memory actually works – from instant recall to permanent storage.
Hot cache for your most recent and frequently accessed memories. Lightning-fast retrieval.
Synthesizes, connects, and reasons about your memories. Turns raw data into structured insights.
Multi-signal search combines text matching, semantic similarity, and associative connections for best recall.
Long-term compressed storage. Searchable archive that never forgets. Smart deduplication.
Not just storage – a thinking memory system.
Memories connect to related concepts automatically, just like the human brain.
Multi-signal ranking combines text, semantic, and associative scores for best-of-all-worlds retrieval.
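As a rough illustration, multi-signal ranking can be sketched as a weighted blend of the three scores. The weights and the scoring callables below are hypothetical, not MemoryAI's actual internals:

```python
# Hypothetical multi-signal ranking sketch. The weights and the three
# scoring functions are illustrative assumptions, not MemoryAI's internals.
def rank(memories, text_score, semantic_score, assoc_score,
         weights=(0.3, 0.5, 0.2)):
    """Order memories by a weighted blend of three relevance signals."""
    w_text, w_sem, w_assoc = weights

    def combined(m):
        return (w_text * text_score(m)
                + w_sem * semantic_score(m)
                + w_assoc * assoc_score(m))

    return sorted(memories, key=combined, reverse=True)
```

A memory that only matches keywords can still outrank a semantically similar one when the text weight dominates, which is why blending the signals matters.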
Auto-detect conflicting memories. No more outdated info overriding current truth.
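The simplest version of this idea is last-write-wins per topic. Real conflict detection would compare memories by semantic similarity rather than an explicit key, but this keyed toy version (all names invented for illustration) conveys the principle:

```python
# Toy conflict resolution: the newest memory wins per topic key.
# Real detection would flag semantically contradictory content instead.
def resolve_conflicts(memories):
    """memories: iterable of (topic, timestamp, content) tuples."""
    latest = {}
    for topic, ts, content in memories:
        if topic not in latest or ts > latest[topic][0]:
            latest[topic] = (ts, content)
    return {topic: content for topic, (ts, content) in latest.items()}
```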
Shared knowledge pool across all users. The more people use it, the smarter everyone gets.
Auto-synthesized from community contributions. Verified knowledge, ready to use.
100x dedup for common knowledge. Semantic hashing means identical concepts stored once.
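A minimal sketch of content dedup, assuming normalize-then-hash: true semantic hashing would hash an embedding so paraphrases collide too, which this toy version cannot do.

```python
import hashlib

# Toy dedup sketch: identical (case/whitespace-insensitive) content maps to
# one key. Real semantic hashing would also merge paraphrases.
_store = {}

def content_key(text):
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def store_once(text):
    """Return True if the content was new, False if deduplicated."""
    key = content_key(text)
    if key in _store:
        return False
    _store[key] = text
    return True
```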
Natural memory aging + promotion. Frequently accessed memories stay hot, stale ones archive.
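Aging plus promotion can be pictured as a tiering rule over recency and access count. The seven-day window and hit threshold below are invented for illustration, not MemoryAI's actual policy:

```python
import time

# Illustrative tiering rule: recently or frequently accessed memories stay
# hot, the rest archive. Thresholds are assumptions, not MemoryAI's policy.
def tier(last_access_ts, access_count,
         now=None, hot_window=7 * 86400, min_hits=3):
    now = time.time() if now is None else now
    recently_used = (now - last_access_ts) < hot_window
    return "hot" if recently_used or access_count >= min_hits else "archive"
```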
Each agent gets its own brain. Complete isolation with optional shared knowledge layers.
Start free. Scale as your AI's memory grows.
Try persistent memory
Full memory for power users
Shared brain for your team
Dedicated infrastructure
REST API, Python SDK, MCP Server, OpenClaw, or Antigravity workflow. Your choice.
from memoryai import memoryai
# Initialize with your API key
memory = memoryai("hm_your_api_key")
# Store a memory
memory.store("User prefers dark theme and TypeScript")
# Recall with natural language
results = memory.recall("What does the user prefer?")
print(results[0].content)
# → "User prefers dark theme and TypeScript"
# Store a memory
curl -X POST https://memoryai.dev/v1/store \
  -H "Authorization: Bearer hm_your_api_key" \
  -H "Content-Type: application/json" \
  -d '{"content": "User prefers dark theme"}'

# Recall memories
curl -X POST https://memoryai.dev/v1/recall \
  -H "Authorization: Bearer hm_your_api_key" \
  -H "Content-Type: application/json" \
  -d '{"query": "user preferences", "depth": "deep"}'
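The same store call can be issued from Python's standard library. The URL and headers mirror the curl example above; the helper name and payload defaults are illustrative assumptions.

```python
import json
import urllib.request

# Build (but don't send) a POST to the /v1/store endpoint shown above.
# The helper name and payload defaults are illustrative assumptions.
def build_store_request(api_key, content, tags=None,
                        base_url="https://memoryai.dev"):
    payload = {"content": content, "tags": tags or []}
    return urllib.request.Request(
        f"{base_url}/v1/store",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Send with: urllib.request.urlopen(build_store_request("hm_your_api_key", "..."))
```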
{
  "mcpServers": {
    "memoryai": {
      "command": "npx",
      "args": ["@memoryai/mcp-server"],
      "env": {
        "HM_API_KEY": "hm_your_api_key"
      }
    }
  }
}
// Works with Cursor, Windsurf, Kiro, and any MCP client
# Install from ClawHub
clawhub install memoryai
# Auto-configured – just use it
# Your agent now has persistent memory across all sessions
# Store, recall, and reason – automatically
# Create: .agents/workflows/memoryai.md
# Antigravity calls the REST API via terminal – no MCP needed
curl -X POST https://memoryai.dev/v1/store \
  -H "Authorization: Bearer $MEMORYAI_KEY" \
  -H "Content-Type: application/json" \
  -d '{"content": "User prefers dark theme", "tags": ["prefs"]}'

curl -X POST https://memoryai.dev/v1/recall \
  -H "Authorization: Bearer $MEMORYAI_KEY" \
  -H "Content-Type: application/json" \
  -d '{"query": "user preferences", "depth": "deep"}'
Remember codebase architecture, patterns, past bugs, and team conventions. Never re-explain your stack.
Remember customer history across sessions. Personalized support without asking the same questions twice.
Accumulate knowledge over time. Build on previous findings instead of starting from scratch.
Shared brain for your entire team. Collective intelligence that grows with every interaction.
Not all memory solutions are created equal.
| Feature | Mem0 | Zep | Letta | MemoryAI |
|---|---|---|---|---|
| Architecture | Key-value | Session-based | Academic | 4-Layer Adaptive |
| Collective Memory | ✗ | ✗ | ✗ | ✓ |
| Associative Graph | ✗ | ✗ | ✗ | ✓ |
| Knowledge Cards | ✗ | ✗ | ✗ | ✓ |
| Content Dedup | Basic | Basic | ✗ | Semantic 100x |
| Price (Pro) | $25/mo | $99/mo | Free/complex | $19/mo |
| Self-hosted | ✗ | ✗ | ✓ | ✓ Enterprise |
Pick your platform, get your key, one command to install.
npx @memoryai/mcp-server --key YOUR_KEY
Your agent will auto-remember everything. No prompts needed.
{
  "mcpServers": {
    "memoryai": {
      "command": "npx",
      "args": ["@memoryai/mcp-server"],
      "env": {
        "HM_API_KEY": "YOUR_KEY"
      }
    }
  }
}
Save as ~/.cursor/mcp.json → restart Cursor. Full docs →
npx @memoryai/mcp-server --key YOUR_KEY
Works with GitHub Copilot agent mode.
{
  "servers": {
    "memoryai": {
      "command": "npx",
      "args": ["@memoryai/mcp-server"],
      "env": {
        "HM_API_KEY": "YOUR_KEY"
      }
    }
  }
}
Save as .vscode/mcp.json → restart. Full docs →
npx @memoryai/mcp-server --key YOUR_KEY
Claude will remember your conversations forever.
{
  "mcpServers": {
    "memoryai": {
      "command": "npx",
      "args": ["@memoryai/mcp-server"],
      "env": {
        "HM_API_KEY": "YOUR_KEY"
      }
    }
  }
}
Save as claude_desktop_config.json. Full docs →
npx @memoryai/mcp-server --key YOUR_KEY
Cascade will auto-store and recall context.
{
  "mcpServers": {
    "memoryai": {
      "command": "npx",
      "args": ["@memoryai/mcp-server"],
      "env": {
        "HM_API_KEY": "YOUR_KEY"
      }
    }
  }
}
Add to Windsurf MCP settings. Full docs →
clawhub install memoryai
# Then add your key to config:
# ~/.openclaw/workspace/skills/memoryai/config.json
{"endpoint":"https://memoryai.dev","api_key":"YOUR_KEY"}
Install skill → add key → restart OpenClaw. Done.
// In ~/.openclaw/openclaw.json:
{
  "skills": {
    "entries": {
      "memoryai": {
        "enabled": true,
        "env": {
          "MEMORYAI_KEY": "YOUR_KEY"
        }
      }
    }
  }
}
Save config → restart OpenClaw. Full docs →
curl -X POST https://memoryai.dev/v1/store \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"content": "...", "tags": ["..."]}'
Antigravity uses the REST API via terminal – no MCP needed.
# 1. Create workflow: .agents/workflows/memoryai.md
---
description: Store and recall context via MemoryAI
---

## Store memory
curl -X POST https://memoryai.dev/v1/store \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"content": "$CONTENT", "tags": $TAGS}'

## Recall memory
curl -X POST https://memoryai.dev/v1/recall \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"query": "$QUERY", "depth": "deep"}'
Save as .agents/workflows/memoryai.md. Full docs →
pip install hmc-memory
Then: from memoryai import memoryai
pip install hmc-memory
from memoryai import memoryai
mem = memoryai(api_key="YOUR_KEY")
mem.store("User prefers dark theme", tags=["prefs"])
results = mem.recall("user preferences")
Sync + async clients. Full SDK docs →
curl -X POST https://memoryai.dev/v1/store \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"content": "Hello, memory!"}'
Works from any language, any platform.
# Store
POST /v1/store {"content":"...", "tags":["..."]}
# Recall
POST /v1/recall {"query":"...", "depth":"deep"}
# Compact
POST /v1/context/compact {"content":"...", "task_context":"..."}
Base URL: https://memoryai.dev. Full API reference →