Features Overview

Everything your AI needs
to remember.

A complete memory layer: store, search, link, track, isolate, and integrate. Built for developers who want their agents to compound knowledge over time.

10
MCP Tools
14
REST Endpoints
3
Search Modes
<50ms
Search Latency

Core capabilities

Six primitives that cover the full memory lifecycle. Each one is production-tested.

Store

Save memories with tags and domains. Every memory is auto-indexed for full-text and semantic search the moment it lands. No manual curation required.

Search

Three modes: full-text (BM25 ranking), semantic (OpenAI embeddings), and hybrid (reciprocal rank fusion of both). Find by keyword or meaning.
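The fusion step behind hybrid mode can be sketched in a few lines. This is an illustration of the general reciprocal rank fusion technique, not Echo's internal implementation; the constant k=60 is the commonly used default, assumed here.

```python
def rrf_fuse(rankings, k=60):
    """Reciprocal rank fusion: score each document by the sum of
    1 / (k + rank) across every ranking it appears in, then sort
    by descending fused score."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

fts_order = ["m3", "m1", "m7"]       # full-text (BM25) result order
semantic_order = ["m1", "m7", "m2"]  # embedding-similarity result order
fused = rrf_fuse([fts_order, semantic_order])
```

A memory ranked well in both lists (`m1` here) beats one ranked first in only a single list, which is why hybrid mode catches both keyword and meaning matches.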

Link

Connect memories with typed relationships: relates_to, refines, contradicts, supports, implements, derived_from. Graph traversal at depth 1-2.
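Depth-limited traversal over typed links amounts to a bounded breadth-first search. A minimal sketch, using an in-memory adjacency map whose layout is illustrative (not Echo's schema):

```python
from collections import deque

def related(links, start, max_depth=2):
    """Breadth-first traversal up to max_depth hops.
    `links` maps a memory ID to a list of (neighbor_id, link_type)."""
    seen = {start}
    frontier = deque([(start, 0)])
    found = []
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_depth:
            continue  # don't expand beyond the depth limit
        for neighbor, link_type in links.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                found.append((neighbor, link_type, depth + 1))
                frontier.append((neighbor, depth + 1))
    return found

links = {
    "m1": [("m2", "refines"), ("m3", "supports")],
    "m2": [("m4", "derived_from")],
    "m4": [("m5", "relates_to")],  # 3 hops out: beyond the limit
}
neighbors = related(links, "m1")
```

Capping depth at 2 keeps traversal cheap and the recalled context tight: direct links plus one hop of supporting material, nothing more.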

Track

Session-level task management: create, update, and list tasks. Monthly usage tracking and token savings estimation so you know exactly what memory is worth.
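To make "token savings" concrete, here is one way such an estimate could work. The ~4 characters per token heuristic is an assumption for illustration; Echo's actual accounting may differ.

```python
def estimate_tokens(text):
    # Assumed heuristic: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

def estimated_savings(memory_texts, recalls_avoided):
    """Tokens saved by recalling stored memories on demand instead of
    re-pasting the same context into `recalls_avoided` later sessions."""
    per_recall = sum(estimate_tokens(t) for t in memory_texts)
    return per_recall * recalls_avoided
```

The intuition: every session that recalls a memory instead of replaying the original conversation saves roughly the memory's token footprint.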

Isolate

Multi-tenant with API key authentication. Domain separation within tenants. Tier-based rate limits and memory caps. Your data never touches another user's.

Integrate

MCP native for Claude Desktop, Claude Code, and Cursor. REST API for GPT, Gemini, and custom agents. Code examples in Python, JavaScript, and curl.

10 MCP tools, ready to use

Install @echo-memory/mcp and your AI gets these tools automatically.

Core Memory

  • echo_store Save a memory with tags and domain
  • echo_search Search memories (FTS, semantic, or hybrid)
  • echo_get Retrieve a specific memory by ID
  • echo_delete Remove a memory permanently

Links

  • echo_create_link Create a typed relationship between memories
  • echo_get_links List all links for a given memory
  • echo_related Traverse the graph to find related memories

Context

  • echo_context_recall Smart recall: search + graph in one call
  • echo_domains List all memory domains for the tenant
  • echo_stats Usage stats, memory count, token savings

14 REST endpoints

Every feature is accessible over HTTP. Use it from any language, any framework, any agent.

POST /api/v1/memories
GET /api/v1/memories/search
GET /api/v1/memories/hybrid-search
GET /api/v1/memories/semantic-search
GET /api/v1/memories/{id}
DELETE /api/v1/memories/{id}
POST /api/v1/links
GET /api/v1/related
GET /api/v1/tasks
GET /api/v1/domains
GET /api/v1/token-savings
GET /api/v1/usage
GET /api/v1/account
GET /api/v1/health
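The endpoints above map naturally onto a thin client. A minimal sketch: payload and query field names follow the curl example below; anything beyond that (e.g. the `delete` path parameter) is an assumption. `session` is any object with `.get`/`.post`/`.delete` methods, such as a `requests.Session` with the Authorization header pre-set.

```python
BASE = "https://echo-memory.fly.dev/api/v1"

class EchoClient:
    """Thin wrapper mapping the REST endpoints to methods."""

    def __init__(self, session, base=BASE):
        self.session = session
        self.base = base

    def store(self, content, tags=None):
        # POST /api/v1/memories
        return self.session.post(f"{self.base}/memories",
                                 json={"content": content, "tags": tags or []})

    def search(self, q):
        # GET /api/v1/memories/search
        return self.session.get(f"{self.base}/memories/search",
                                params={"q": q})

    def delete(self, memory_id):
        # DELETE /api/v1/memories/{id}
        return self.session.delete(f"{self.base}/memories/{memory_id}")

    def health(self):
        # GET /api/v1/health
        return self.session.get(f"{self.base}/health")
```

Injecting the session keeps the wrapper framework-agnostic and easy to test with a stub.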

Three ways to connect

MCP for Claude-native apps. REST for everything else. Code examples for the fastest path to production.

MCP claude_desktop_config.json
{
  "mcpServers": {
    "echo-memory": {
      "command": "npx",
      "args": [
        "@echo-memory/mcp"
      ],
      "env": {
        "ECHO_API_KEY": "echo_sk_..."
      }
    }
  }
}
REST API curl
# Store a memory
curl -X POST \
  https://echo-memory.fly.dev/api/v1/memories \
  -H "Authorization: Bearer echo_sk_..." \
  -H "Content-Type: application/json" \
  -d '{
    "content": "Ship v2 by Friday",
    "tags": ["sprint", "deadline"]
  }'

# Search memories
curl \
  "https://echo-memory.fly.dev/api/v1/memories/search?q=deadline" \
  -H "Authorization: Bearer echo_sk_..."
Python example.py
import requests

API = "https://echo-memory.fly.dev/api/v1"
headers = {"Authorization": "Bearer echo_sk_..."}

# Store
requests.post(f"{API}/memories", json={
    "content": "Ship v2 by Friday",
    "tags": ["sprint"]
}, headers=headers)

# Search
r = requests.get(f"{API}/memories/search",
    params={"q": "deadline"},
    headers=headers)
print(r.json()["data"])

Works with everything

One memory layer, every AI tool you use.

MCP Clients

  • Claude Desktop
  • Claude Code
  • Cursor
  • Any MCP client

REST API

  • GPT / OpenAI
  • Gemini
  • Custom agents
  • Any HTTP client

Platforms

  • macOS
  • Windows
  • Linux
  • Raspberry Pi

API Examples

  • Python (requests)
  • JavaScript (fetch)
  • curl

Ready to give your AI a memory?

Free tier. No credit card. Five minutes to first API call.

Questions? Want beta access? Reach out directly. We set up every beta tester personally.