FAQ

Does loci send data anywhere?

No. loci is local-first and local-only. Your conversations are indexed on your machine, stored on your machine, and never transmitted anywhere.

The browser extension runs entirely in your browser. The desktop app runs entirely on your device. The MCP server runs on localhost. There are no accounts, no cloud sync, no analytics, no telemetry.

If you use loci with a cloud-based AI (Claude, ChatGPT), those conversations are already on that provider's servers. loci does not add any additional data transmission.

What happens to my data if I uninstall the extension?

Your data stays on your machine. Uninstalling the browser extension does not delete your local index.

The index lives in ~/.loci/. If you want to delete it, delete that directory manually. If you reinstall the extension later, it can rebuild the index from your conversation history.

Is loci the same as a RAG system?

No. RAG (Retrieval-Augmented Generation) retrieves text chunks and injects them into prompts. loci is a context persistence layer with structured retrieval.

The differences:

  • RAG is chunk-based. loci is conversation-based and room-based.
  • RAG is stateless per query. loci maintains persistent context across sessions.
  • RAG focuses on retrieval. loci adds structure: rooms, crystals, garden, handovers.

You can build RAG on top of loci's index if you want. But loci's value is in the persistent structure, not just the retrieval.

Does it work with local models?

Yes. The Wizard tier includes local LLM routing. You can point a room at Ollama, LM Studio, or any OpenAI-compatible local endpoint.

The LLMAGE tier exposes loci as an MCP server that any MCP-compatible agent can use, regardless of which model powers it.
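Any OpenAI-compatible endpoint speaks the same chat-completions protocol, so pointing a room at a local model is just a matter of base URL. As a hedged sketch: the URL below is Ollama's default OpenAI-compatible endpoint (LM Studio listens on port 1234 by default), and the model name "llama3" is an assumption, not something loci configures for you.

```python
import json

def build_local_request(prompt: str,
                        base_url: str = "http://localhost:11434/v1",
                        model: str = "llama3") -> tuple[str, bytes]:
    """Build a chat-completions request for an OpenAI-compatible local endpoint.

    base_url is Ollama's default; swap in LM Studio's or any other
    compatible server. The model name is whatever you have pulled locally.
    """
    url = f"{base_url}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, body
```

The same request shape works against every compatible server, which is why a room only needs a base URL and a model name to switch backends.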

What is the difference between a room and a project?

A room is a persistent context workspace. A project is what you are building.

You might have one room per project. Or you might have multiple projects in one room. Or one project spanning multiple rooms (design room, dev room, research room).

Rooms map to how you think, not what you build. If you context-switch between "coding mode" and "writing mode," those are different rooms, even if they serve the same project.

Is there a web interface?

No. loci is local-first by design. A web interface would require data to leave your machine.

The desktop app (Tauri) provides the search interface. The browser extension captures conversations. The MCP server exposes context to your IDE. All local.

How is this different from just using a notes app?

A notes app stores what you write. loci indexes what you say to AI.

The differences:

  • Automatic capture: loci indexes conversations without you copying them
  • Structured search: Search by room, time, speaker, content
  • AI-native context: Rooms load into your AI's context window via MCP
  • Session bridging: Handovers connect sessions automatically
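The kind of structured query this enables can be sketched with a toy in-memory index. The record fields below (room, speaker, timestamp, text) are illustrative only, not loci's actual schema; the real index is MiniSearch-based.

```python
from datetime import datetime

# Hypothetical conversation turns, as a stand-in for the real index.
messages = [
    {"room": "dev", "speaker": "user", "ts": datetime(2025, 1, 10),
     "text": "switch the deploy target to Vercel"},
    {"room": "writing", "speaker": "assistant", "ts": datetime(2025, 1, 12),
     "text": "draft revised for tone"},
]

def search(msgs, room=None, speaker=None, since=None, contains=None):
    """Filter by room, speaker, time, and content -- any combination."""
    out = []
    for m in msgs:
        if room and m["room"] != room:
            continue
        if speaker and m["speaker"] != speaker:
            continue
        if since and m["ts"] < since:
            continue
        if contains and contains.lower() not in m["text"].lower():
            continue
        out.append(m)
    return out
```

A query like `search(messages, room="dev", contains="vercel")` combines structural filters with content matching, which is the difference from full-text search alone.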

You can use loci alongside a notes app. They solve different problems.

What AI tools are supported?

Currently: Claude (claude.ai) and ChatGPT (chatgpt.com).

The browser extension detects conversation content from these sites. Support for additional tools is planned.

For AI tools accessed via API (Claude Code, Cursor, custom agents), loci exposes an MCP server. Any MCP-compatible tool can read from and write to your rooms.
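Registering a local MCP server with a client typically looks like the sketch below, which follows the `mcpServers` convention used by Claude Desktop's configuration file. The `loci` command and its `mcp` argument are assumptions for illustration, not a documented CLI.

```json
{
  "mcpServers": {
    "loci": {
      "command": "loci",
      "args": ["mcp"]
    }
  }
}
```

Once registered, the client launches the server locally and your rooms become readable and writable from inside the agent, with no network hop.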

Can teams share a palace?

Not currently. loci is designed for personal use. Each person has their own local index.

Team sharing would require either:

  • Syncing data to a shared location (breaks local-first)
  • Peer-to-peer sync (complex, on the roadmap)

For now, teams share by exchanging CLAUDE.md files, handovers, and loci as plain text documents.

How do I back up my data?

Copy ~/.loci/ to your backup location. The directory contains:

  • index/: the MiniSearch index
  • conversations/: raw conversation data
  • rooms/: room configuration and crystals

Standard file backup tools work: rsync, Time Machine, or whatever you already use.
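If you want a scripted, timestamped copy instead of a manual one, a minimal sketch looks like this. The destination path is illustrative; only the `~/.loci` source reflects the actual layout.

```python
import shutil
import time
from pathlib import Path

def backup_loci(src: str = "~/.loci", dest_root: str = "~/backups") -> Path:
    """Copy the loci data directory to a timestamped backup directory."""
    src_dir = Path(src).expanduser()
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = Path(dest_root).expanduser() / f"loci-{stamp}"
    # copytree preserves the index/, conversations/, and rooms/ subtrees.
    shutil.copytree(src_dir, dest)
    return dest
```

Because everything lives under one directory, restoring is the same operation in reverse: copy the backup back to `~/.loci/`.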

What is a locus vs a crystal?

Both are persistent knowledge, but they serve different purposes.

A crystal is a fact for your AI to act on. It lives in CLAUDE.md. Your AI reads it at session start and applies it immediately. Crystals are operational.

A locus is an insight for navigation. It lives in a .locus file. It marks a significant point in your thinking that you or your AI might want to reference later. Loci are architectural.

Crystal: "Deploy target is Vercel" → Your AI knows where to deploy.

Locus: "Privacy tools that advertise privacy undermine themselves" → A reference point for future thinking.

When will loci be on the Chrome Web Store?

The extension is in development. Current target is Q2 2025 for a public beta.

Until then, you can install from source. See Getting Started: Scholar for instructions.

Built by Hux × Vesper · Apache 2.0