Cohaku AI is an MCP server that gives AI coding agents persistent, structured memory across sessions. Instead of losing context every time a conversation ends, Cohaku stores knowledge in a local SQLite database with vector search, full-text search, and a knowledge graph — all accessible through the Model Context Protocol.

Key Features

3-Layer Memory

Rule, Working, and Long-term layers with automatic priority ordering and TTL support.
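The three layer names come from the docs, but the priority ordering and TTL semantics are not spelled out here. A minimal sketch of how layered, TTL-aware context loading could work, with illustrative priorities and field names (not Cohaku's actual schema):

```python
import time
from dataclasses import dataclass, field
from typing import Optional

# Assumption: lower priority value = loaded earlier in the context window.
LAYER_PRIORITY = {"rule": 0, "working": 1, "long_term": 2}

@dataclass
class Memory:
    text: str
    layer: str
    created_at: float = field(default_factory=time.time)
    ttl_seconds: Optional[float] = None  # None = never expires

    def expired(self, now: float) -> bool:
        return self.ttl_seconds is not None and now - self.created_at > self.ttl_seconds

def load_context(memories: list[Memory], now: float) -> list[Memory]:
    """Drop expired memories, then order by layer priority, newest first within a layer."""
    live = [m for m in memories if not m.expired(now)]
    return sorted(live, key=lambda m: (LAYER_PRIORITY[m.layer], -m.created_at))
```

In this sketch, a short-lived working note disappears once its TTL elapses, while rules always sort ahead of long-term facts.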

Hybrid Search

Semantic vector search + BM25 full-text search with 4-component scoring.
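The page does not enumerate the four scoring components, so the following is only an illustration of the general pattern: normalize each signal to [0, 1] and take a weighted sum. The choice of components (vector similarity, BM25, recency, layer weight) and the weights are assumptions, not Cohaku's actual formula:

```python
def hybrid_score(vec_sim: float, bm25: float, recency: float, layer_weight: float,
                 w: tuple[float, float, float, float] = (0.4, 0.3, 0.2, 0.1)) -> float:
    """Weighted sum of four normalized components, each expected in [0, 1].

    Components and weights are illustrative placeholders for whatever
    Cohaku's 4-component scoring actually combines.
    """
    return w[0] * vec_sim + w[1] * bm25 + w[2] * recency + w[3] * layer_weight
```

A result that scores well on both semantic similarity and exact keyword match will outrank one that is strong on only a single signal.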

Knowledge Graph

Entity nodes, bi-temporal edges, and graph traversal for structured knowledge.
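Bi-temporal edges track two timelines: when a fact was true in the world (valid time) and when the system recorded it (transaction time). A minimal sketch with assumed field names, not Cohaku's actual schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Edge:
    source: str                # entity the edge starts from
    target: str                # entity the edge points to
    relation: str              # e.g. "depends_on"
    valid_from: float          # when the fact became true (valid time)
    valid_to: Optional[float]  # None = still true
    recorded_at: float         # when the system learned it (transaction time)

def current_edges(edges: list[Edge], as_of: float) -> list[Edge]:
    """Edges whose fact was true at `as_of`."""
    return [e for e in edges
            if e.valid_from <= as_of and (e.valid_to is None or as_of < e.valid_to)]

def neighbors(edges: list[Edge], node: str, as_of: float) -> list[str]:
    """One hop of graph traversal, restricted to edges valid at `as_of`."""
    return [e.target for e in current_edges(edges, as_of) if e.source == node]
```

Because superseded facts get a `valid_to` rather than being deleted, the graph can answer both "what is true now?" and "what did we believe then?".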

Session & Episode Tracking

Track work sessions with checkpoints and log significant events as episodes.

Config Generation

Export stored memories as rule files for Claude Code, Cursor, Copilot, Windsurf, and more.

Zero Cloud Dependencies

Runs entirely locally on SQLite + sqlite-vec. No API keys, no network calls for storage.

How It Works

AI Agent ←→ MCP Protocol ←→ Cohaku Server ←→ SQLite + sqlite-vec
                                  │
                           Embedding Model
                        (all-MiniLM-L6-v2, local)
  1. Your AI agent connects to Cohaku via MCP (stdio transport)
  2. The agent calls Cohaku's MCP tools to store, search, and manage memories
  3. Everything is persisted locally at ~/.config/cohaku/memory.db
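Registering a stdio MCP server in a client's configuration typically looks like the fragment below. The `cohaku` command name and its arguments are assumptions for illustration; check the Quickstart for the exact values.

```json
{
  "mcpServers": {
    "cohaku": {
      "command": "cohaku",
      "args": ["serve"]
    }
  }
}
```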

Tool Categories

Category            Tools   Description
Context             1       Load prioritized memory context at session start
Memory              6       Store, search, update, delete, and deduplicate memories
Knowledge Graph     5       Entities, edges, and traversal-based graph search
Episodes            3       Chronological event logging with entity references
Sessions            4       Work session lifecycle with checkpoints
Config Generation   9       Export memories to editor-specific rule files

Next Steps

Quickstart

Set up Cohaku in under 5 minutes

Architecture

Understand the system design

CLI Reference

Manage memories from the terminal

Integrations

12 supported AI coding tools