Stores a new memory with automatic embedding generation. Always call search_memories first to check for duplicates before adding.

Parameters

content
string
required
The memory content text. Be specific and self-contained, since the memory will be retrieved out of context. Bad: "Fixed the bug". Good: "Fixed OOM in GraphService by adding a connection pool limit of 10."
layer
string
default: "long_term"
Memory priority layer.
| Value | Description |
| --- | --- |
| `rule` | Permanent constraints, always loaded first (e.g., "always use ESM") |
| `working` | Temporary session-scoped notes; clean up after use |
| `long_term` | Durable cross-session knowledge (default) |
type
string
default: "note"
Memory classification.
| Value | Description |
| --- | --- |
| `rule` | Behavioral directives (use with layer=rule) |
| `decision` | Architectural choices with rationale |
| `fact` | Verified objective information |
| `note` | General observations (default) |
| `skill` | Reusable techniques or solutions |
tags
string[]
Tags for categorization and search filtering. Use lowercase, descriptive terms. Indexed in FTS5 for full-text search.
expiresAt
string
Expiration date in ISO 8601 format (e.g., "2025-12-31T23:59:59Z"). Expired memories are excluded from search. Useful for time-sensitive information.
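A minimal sketch of building a valid expiresAt value, assuming a one-week lifetime (the duration is illustrative). `Date.prototype.toISOString()` always emits UTC with a trailing "Z", which matches the format shown above.

```typescript
// Compute an expiration timestamp one week from now, in ISO 8601 UTC.
const ONE_WEEK_MS = 7 * 24 * 60 * 60 * 1000;
const expiresAt: string = new Date(Date.now() + ONE_WEEK_MS).toISOString();
// expiresAt looks like "2025-12-31T23:59:59.000Z"
```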
scope
string
default: "project"
Storage scope.
| Value | Description |
| --- | --- |
| `project` | Stored under the current project (default) |
| `global` | Shared across all projects; use for universal preferences |

Example

```json
{
  "content": "This project uses React 19 with TypeScript strict mode",
  "layer": "rule",
  "type": "rule",
  "tags": ["tech-stack", "react"],
  "scope": "project"
}
```
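For comparison, a memory with scope=global might look like this; the content and tags are illustrative:

```json
{
  "content": "User prefers concise commit messages in imperative mood",
  "layer": "long_term",
  "type": "note",
  "tags": ["preferences", "git"],
  "scope": "global"
}
```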

Behavior

  1. Generates a 384-dimensional embedding of the content using all-MiniLM-L6-v2
  2. Assigns priority based on layer (rule=10, working=5, long_term=1)
  3. Inserts the row into the memories table and the vec_memories virtual table
  4. Updates the FTS5 index automatically via triggers
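The layer-to-priority mapping in step 2 can be sketched as follows. The function name is a hypothetical helper, but the priority values come from the list above:

```typescript
// Priority per layer, as documented: rule=10, working=5, long_term=1.
const LAYER_PRIORITY: Record<string, number> = {
  rule: 10,
  working: 5,
  long_term: 1,
};

// Hypothetical helper: unknown layers fall back to the long_term default.
function priorityFor(layer: string): number {
  return LAYER_PRIORITY[layer] ?? LAYER_PRIORITY.long_term;
}
```

Higher-priority rows sort first at load time, which is why layer=rule memories are always surfaced before working or long-term ones.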

Returns

The full memory object, including the generated id, embedding, and timestamps.