Local-first
Everything stays on your machine. No accounts, no cloud sync, no telemetry.
Search everything you've ever said to an AI. Organise it into rooms. Never start a session from scratch.
You talk to AI constantly. Claude, ChatGPT, Cursor, Claude Code. Hundreds of conversations. Thousands of hours of reasoning.
Then you close the tab and lose it.
loci fixes this. It indexes every conversation locally, makes it searchable in milliseconds, and organises it into persistent workspaces called rooms. When you return to a room, your context returns with you.
No cloud. No accounts. No sync that breaks. The index lives on your machine, in ~/.loci/. You own it completely.
loci has three user tiers. They share the same local index but expose different surfaces.
You want search. Install the browser extension, let it index your conversations, and search across everything from a single interface. Five minutes to set up. No configuration required.
Scholar is for people who use AI tools daily and keep losing track of useful conversations. It gives you a personal search engine for your AI memory.
You want the full system. Rooms, crystals, garden, MCP integration, local LLM routing. The "palace": a structured, local-first workspace that persists context across sessions and agents.
Wizard is for power users who work with AI agents, build with MCP, and need their context to survive beyond a single conversation window.
You want the primitive. loci as a CLI tool and MCP server. No GUI, no extension, just the indexing and retrieval layer. Embed it into your own tooling or use it from Claude Code.
LLMAGE is for AI engineers building LLM-native systems who need reliable local context persistence without cloud dependencies.
If you just want to search your AI conversations, start with Scholar. Five minutes.
If you want the full palace (rooms, crystals, persistent context across tools), start with Wizard. Twenty minutes.
If you're building tooling and want loci as a primitive, start with LLMAGE. Five minutes to integrate.
All three tiers use the same local index. You can start as Scholar and graduate to Wizard without losing data.
loci is built from four components:
| Component | Purpose | Location |
|---|---|---|
| Browser extension | Indexes conversations from Claude.ai, ChatGPT | Chrome (MV3) |
| Desktop app | Search interface, room management | macOS / Windows (Tauri v2) |
| MCP server | Exposes rooms as tools to your IDE | Local (Rust, port 3721) |
| Local index | Full-text search and storage (MiniSearch + SQLite) | ~/.loci/ |
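To make the index layer concrete: here is an illustrative analogue (not loci's actual schema, and using SQLite's built-in FTS5 rather than MiniSearch) of how a local full-text index over conversations works, with nothing leaving the machine. All table and column names are invented for this sketch.

```python
import sqlite3

# In-memory for the sketch; loci persists its index under ~/.loci/.
db = sqlite3.connect(":memory:")

# FTS5 virtual table: every column is full-text indexed.
db.execute("CREATE VIRTUAL TABLE messages USING fts5(room, source, body)")
db.executemany(
    "INSERT INTO messages VALUES (?, ?, ?)",
    [
        ("side-project", "claude", "Discussed streaming parser design"),
        ("side-project", "chatgpt", "Benchmarked SQLite FTS5 options"),
        ("thesis", "claude", "Outlined chapter on retrieval"),
    ],
)

# Full-text query, ranked by relevance; scoping to a room is one more
# WHERE clause away.
rows = db.execute(
    "SELECT room, body FROM messages WHERE messages MATCH ? ORDER BY rank",
    ("parser",),
).fetchall()
print(rows)  # [('side-project', 'Discussed streaming parser design')]
```

Because the index is a local file, search latency is a disk read, not a network round trip, which is what makes millisecond search across thousands of conversations feasible.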
Read the full architecture for implementation details.