Release Notes

Version history for OpenCxMS products. Features, improvements, and fixes.

CxMS Pro

Commercial

Persistent AI memory engine for coding assistants. Works with Claude Code, Cursor, Windsurf, GitHub Copilot, and 5 more.

v1.1.0

Feature · March 24, 2026
  • Cortex Memory Engine — Brain-inspired memory architecture with 5 confidence tiers. Memories start as raw intake and promote to verified knowledge as they prove accurate over time. Automated review cycles merge similar memories, detect contradictions, and fade stale information.
  • Full Filesystem Search Indexing — The search re-indexer now scans the full project filesystem for new and modified files, catching content created by subagents, external editors, or sessions that exited abnormally. Previously, it re-indexed only files explicitly tracked during the session.
  • Background Worker Reliability (Windows) — Fixed a bug where the session-end background worker was killed when the terminal window closed on Windows. The worker now launches in its own process group, surviving terminal close to complete indexing, memory consolidation, and transcript capture.
  • Cortex-First Search Routing — The AI now queries the Cortex memory engine and ranked search index before falling back to file reads. When file searches are used, the system surfaces any relevant Cortex memories alongside the results.
Updated: cxms-search.mjs, cxms-session-end.mjs, cxms-session-end-worker.mjs, cxms-auto-rag.mjs
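The Cortex-first routing order can be sketched as below. This is a minimal illustration, not the shipped implementation: the function and object names (`routeSearch`, `cortex.query`, `index.search`, `files.grep`, `cortex.related`) are assumptions for the sketch.

```javascript
// Hypothetical sketch of Cortex-first search routing.
// All API names here are illustrative, not the real CxMS Pro interfaces.
async function routeSearch(query, { cortex, index, files }) {
  // 1. Ask the memory engine first.
  const memories = await cortex.query(query);
  if (memories.length > 0) return { source: "cortex", memories, hits: [] };

  // 2. Fall back to the ranked search index.
  const hits = await index.search(query);
  if (hits.length > 0) return { source: "index", memories: [], hits };

  // 3. Last resort: raw file reads, surfacing any loosely related
  //    memories alongside the results, as described above.
  const fileHits = await files.grep(query);
  const related = await cortex.related(query);
  return { source: "files", memories: related, hits: fileHits };
}
```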

v1.0.0

Initial Release · March 22, 2026
  • Persistent AI Memory — Tell your AI assistant something once. It remembers permanently. Corrections, preferences, and project knowledge survive across sessions, compaction events, and model changes.
  • 9 Agent Support — Works with Claude Code, Cursor, Windsurf, GitHub Copilot, Cline, Continue.dev, Zed, Amazon Q, and Aider. One memory system across all your coding assistants.
  • Ranked Full-Text Search — BM25 search engine replaces grep for finding information across project files. Results ranked by relevance, not just string match.
  • Multi-Project Memory Sharing — Projects opt into shared memory groups. What the AI learns in one project is available in another. Cross-project corrections propagate automatically.
  • License & Distribution — Ed25519 signed license keys, JWT validation, compiled engine (V8 bytecode), Stripe checkout, automated fulfillment.
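For a sense of why BM25 beats plain string matching, here is a minimal sketch of the standard Okapi BM25 score over tokenized documents. This is illustrative only; CxMS Pro's engine is compiled, and its actual tokenization and parameters are not documented here.

```javascript
// Minimal Okapi BM25 sketch. K1 and B are the conventional defaults,
// not necessarily what the CxMS Pro engine uses.
const K1 = 1.2, B = 0.75;

function bm25Score(queryTerms, doc, docs) {
  const avgdl = docs.reduce((s, d) => s + d.length, 0) / docs.length;
  let score = 0;
  for (const term of queryTerms) {
    const tf = doc.filter((w) => w === term).length;        // term frequency in doc
    const n = docs.filter((d) => d.includes(term)).length;  // docs containing term
    const idf = Math.log((docs.length - n + 0.5) / (n + 0.5) + 1);
    score += (idf * tf * (K1 + 1)) /
             (tf + K1 * (1 - B + (B * doc.length) / avgdl));
  }
  return score;
}
```

Unlike grep, a document that mentions the term often relative to its length outranks one with a single incidental match.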
$49/month per seat · Learn more

CxMS (Open Source)

Free

The open-source AI memory framework. File-based persistence, lifecycle hooks, and context monitoring for any AI coding assistant.

v3.0.0

Major Release · February 19, 2026
  • Total Recall v2: Session ID Coordination — SessionStart now reads session_id from stdin and registers each session in the coordination file’s new active_sessions array. Multiple sessions on the same project are tracked individually with status, context %, and model info.
  • Concurrent Session Detection — When a new session starts and another session on the same project is already active, the startup banner shows a CONCURRENT SESSIONS DETECTED warning with session age and context usage. Prevents accidental state conflicts.
  • Session Lifecycle Tracking — SessionEnd marks sessions as completed in the coordination file with final context % and model. Stale sessions auto-pruned (completed >1hr, active >4hr). Per-session state files cleaned up on exit.
  • Per-Session Message Tracking — Coordination messages now record which specific session read them (project + session_id + timestamp), not just which project. The mixed format is backward compatible with existing string-based read_by entries.
  • Coordination Schema 1.1 — Auto-migrated on first run. Adds active_sessions array alongside existing instances and messages. Old hooks reading new schema are unaffected.
Updated: cxms-session-start.mjs (v4.0), cxms-session-end.mjs (v2.2)
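The stale-session pruning rules above (completed >1hr, active >4hr) can be sketched as below. Field names such as `status` and `last_seen` are assumptions; the real coordination schema may use different keys.

```javascript
// Sketch of the stale-session pruning rules. Field names are
// assumptions, not the actual coordination-file schema.
const HOUR = 60 * 60 * 1000;

function pruneSessions(activeSessions, now = Date.now()) {
  return activeSessions.filter((s) => {
    const age = now - s.last_seen;
    if (s.status === "completed") return age <= 1 * HOUR; // completed >1hr: drop
    return age <= 4 * HOUR;                               // active >4hr: drop
  });
}
```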

v2.0.5

Bug Fix · February 19, 2026
  • Per-Session Context Isolation (Tools v3.0) — Fixed a bug where two Claude Code sessions running against the same project directory would contaminate each other’s context tracking. Now each session writes to its own status file.
  • Compaction Detection Logging — The statusline command now detects compaction events (context dropping 30%+ from a high point) and logs them for analysis.
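The compaction heuristic (context dropping 30%+ from a high point) can be sketched as below. The threshold, field names, and the choice to reset the high-water mark after a detected compaction are assumptions based on the note above.

```javascript
// Sketch of compaction detection: a drop of 30+ percentage points
// from the session's high-water mark counts as a compaction event.
// Field names and reset behavior are assumptions.
function detectCompaction(state, currentPct) {
  const high = Math.max(state.highWaterPct ?? 0, currentPct);
  const compacted = high - currentPct >= 30;
  // Reset the high-water mark after a compaction so the next
  // drop is measured from the post-compaction level.
  return { ...state, highWaterPct: compacted ? currentPct : high, compacted };
}
```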

v2.0.4

Bug Fix · February 12, 2026
  • False Compaction Detection Fix — Fixed a bug where every new session triggered a false compaction warning. The SessionStart hook now clears stale state files so only real compactions are detected.

v2.0.3

Bug Fix · February 11, 2026
  • Save-Once Gate Fix — Fixed a bug where the 80% save-once gate fired repeatedly instead of once.

v2.0.2

Improvement · February 10, 2026
  • Compaction Recovery Trust — The 80% context gate no longer hard-stops the session. It blocks once to force a checkpoint save, then trusts the compaction recovery system.

v2.0.1

Performance Fix · February 9, 2026
  • Startup Hang Fix — Matcher-based hooks eliminate unnecessary Node.js spawns during startup. Total overhead: ~80ms (was 8 seconds).

v2.0.0

Major Release · February 8, 2026
  • Enforced Startup Sequence — AI assistants can no longer skip critical project files. Hard enforcement via PreToolUse block.
  • Total Recall Memory Injection — Cross-instance memory sharing through a global coordination file. One instance writes a message; the other receives it at startup.
  • 3-Component Startup Gate — SessionStart creates startup state, PostToolUse tracks file reads, PreToolUse blocks until all required reads are complete.
  • Cross-Repo Coordination — Global coordination file enables sibling instance awareness across multiple projects.

v1.6.4

Feature · February 3, 2026
  • Cascading Configuration (E22) — CSS-like config inheritance with [REQUIRED], [INHERIT], [OVERRIDE] markers.
  • Multi-Agent Coordination Protocol — Live-tested protocol for multiple AI agents working in the same codebase.
  • Agent Monitor v2.1 — Per-agent TTL, 5-alert max, terminal state detection.
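One plausible reading of the cascading-config markers is sketched below: [OVERRIDE] replaces the parent value, [INHERIT] keeps it, and [REQUIRED] must resolve to some value after the cascade. The actual E22 marker semantics live in the CxMS docs; this is only an interpretation.

```javascript
// Hypothetical resolution of the [REQUIRED]/[INHERIT]/[OVERRIDE]
// markers. Marker semantics here are an interpretation, not the
// documented E22 behavior.
function resolveConfig(parent, child) {
  const out = { ...parent };
  for (const [key, { marker, value }] of Object.entries(child)) {
    if (marker === "[OVERRIDE]") out[key] = value; // child wins
    // "[INHERIT]": keep the parent's value, nothing to do
  }
  // "[REQUIRED]" keys must resolve to a value after the cascade.
  for (const [key, { marker }] of Object.entries(child)) {
    if (marker === "[REQUIRED]" && out[key] === undefined)
      throw new Error(`required config key unresolved: ${key}`);
  }
  return out;
}
```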

v1.0.0

Initial Release · February 2, 2026
  • Core File System — CLAUDE.md, Session.md, Tasks.md — the foundational files that give AI assistants persistent memory.
  • Session Lifecycle Management — Structured session start, checkpoint, and end protocols. State survives context compaction.
  • Context Monitoring — Automated warnings when AI context window approaches capacity.
  • Decision Logging — Numbered decision records that persist the “why” behind architectural choices across sessions.