Gnosys gives LLMs a knowledge layer that survives across sessions. Atomic markdown files, git-backed history, Obsidian-compatible. Works with any MCP client.
```shell
npm install -g gnosys-mcp
```
A complete memory system designed from the ground up for LLMs. No vector databases, no embeddings, no black boxes.
Each memory is one markdown file with YAML frontmatter. Human-readable, git-diffable, and browsable in Obsidian.
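On disk, a memory might look like the sketch below. The frontmatter field names here are illustrative — the exact schema Gnosys writes may differ:

```markdown
---
title: API rate-limit decision
category: decisions
tags: [api, rate-limiting]
created: 2025-01-15
---

We chose token-bucket rate limiting over fixed windows because
burst traffic from batch jobs kept tripping the per-minute cap.
```

Because it is plain markdown, the same file renders in Obsidian, diffs cleanly in git, and stays readable without any tooling.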
FTS5-powered search with relevance keyword clouds. Agents describe what they're working on and get relevant memories back.
Project, personal, and global knowledge layers. Project memory travels with the repo. Personal memory follows you everywhere.
Filter by category, tags, status, author, confidence, dates. Compound lenses combine criteria with AND/OR logic.
Every write auto-commits. View full version history, diff between versions, and roll back any memory non-destructively.
Cross-reference memories with [[wikilinks]]. Build a knowledge graph with backlinks, see connections, find orphaned links.
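In practice, a memory body links to others inline (the linked note names below are made up for illustration):

```markdown
This decision supersedes [[old-auth-flow]]; see also
[[session-token-spec]] for the header format.
```

Each `[[wikilink]]` becomes an edge in the knowledge graph, so backlinks and orphan detection fall out of the same syntax Obsidian already understands.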
Feed raw text and an LLM structures it into an atomic memory with title, tags, category, and relevance keywords.
Visualize how knowledge evolves over time. Group by day, week, month, or year. Track creation and modification patterns.
Import thousands of records from CSV, JSON, or JSONL. LLM-powered ingestion generates keyword clouds automatically. Batch commits, dedup, and resume support.
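A JSONL input for bulk import might look like the sketch below — one record per line. The field names are an assumption for illustration, not the importer's exact schema:

```jsonl
{"title": "Prefer pnpm over npm", "content": "Team standardized on pnpm for workspace installs.", "tags": ["tooling"]}
{"title": "Staging DB is read-only", "content": "All writes go through the nightly sync job.", "tags": ["infra"]}
```

During ingestion, the LLM pass adds the relevance keyword cloud to each record, so imported memories are discoverable the same way hand-written ones are.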
Runs on macOS, Linux, and Windows. Pure Node.js with zero platform-specific dependencies. Install once, works everywhere.
Gnosys is an MCP server. Point your AI client at it, and your agent gains persistent memory.
```json
{
  "mcpServers": {
    "gnosys": {
      "command": "npx",
      "args": ["gnosys-mcp"],
      "env": { "ANTHROPIC_API_KEY": "sk-..." }
    }
  }
}
```
```shell
$ claude mcp add gnosys npx gnosys-mcp
```
```toml
[mcp.gnosys]
type = "local"
command = ["npx", "gnosys-mcp"]

[mcp.gnosys.env]
ANTHROPIC_API_KEY = "your-key-here"
```
```json
{
  "mcp": {
    "gnosys": {
      "type": "local",
      "command": ["npx", "gnosys-mcp"],
      "env": { "ANTHROPIC_API_KEY": "your-key-here" }
    }
  }
}
```
```markdown
---
description: Gnosys persistent memory
alwaysApply: true
---

# Gnosys Memory System

## Retrieve memories
- At task start, call gnosys_discover with keywords
- Load results with gnosys_read
- Trigger on: "recall", "remember when", "what did we decide"

## Write memories
- Trigger on: "remember", "memorize", "save this", "don't forget"
- Also write on decisions, preferences, specs, post-task findings

## Key Tools
- gnosys_discover → find memories
- gnosys_add → write
- gnosys_read → load content
- gnosys_update → modify
- gnosys_hybrid_search → best search
- gnosys_ask → Q&A with citations
- … plus 25 more tools (maintain, history, graph, etc.)
```
```markdown
# Gnosys Memory

This project uses Gnosys for persistent memory via MCP.

## Read first
- At task start, call gnosys_discover with keywords
- Load results with gnosys_read
- On "recall", "remember when", "what did we decide" → search first

## Write automatically
- On "remember", "memorize", "save this" → call gnosys_add
- Decisions/preferences → commit to decisions/
- Specs → commit BEFORE starting work
- After implementation → commit findings

## Key tools
| Action | Tool |
| --- | --- |
| Find | gnosys_discover → gnosys_read |
| Search | gnosys_hybrid_search, gnosys_ask |
| Write | gnosys_add, gnosys_add_structured |
| Update | gnosys_update, gnosys_reinforce |
```
One command: npm install -g gnosys-mcp. Zero config needed to start.
Works with Claude Desktop, Cursor, Claude Code, Codex, OpenCode, or any MCP-compatible client on macOS, Linux, or Windows. Just add the server config.
Ask your agent to run gnosys_init in your project. It creates a .gnosys/ directory with git tracking.
Decisions, architecture choices, conventions, requirements — all persisted as atomic markdown files your agent can discover and reference.
Every memory is a plain markdown file you can read, edit, and browse in Obsidian. Git tracks every change. The filesystem is the source of truth.
Open source, MIT licensed. Built for developers who want their AI to remember what matters.