Desktop App (Tauri)
Mac + Windows · Scholar and Wizard tiers · Tauri v2
The loci desktop app is the memory hub: the primary search interface, room management UI, and config layer for users who want a first-class native experience beyond the browser extension.
The browser extension collects. The desktop app organises, navigates, and exposes.
Two builds, one codebase
The app ships in two tier configurations, controlled by a build flag (LOCI_TIER):
| | Scholar | Wizard |
|---|---|---|
| Theme | Green / cream | Dark stone / amber |
| Onboarding | 3-screen simple | 5-screen palace setup |
| Rooms UI | Hidden | Palace floor plan nav |
| MCP server | No | Yes (Rust, port 3721) |
| LLM provider | No | Yes (local or API) |
| Config UI | Simple | Full |
| Window chrome | Standard | Custom dark |
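One way the `LOCI_TIER` build flag could gate tier features at compile time is via Rust's `option_env!`; this is a sketch of the idea, not the app's actual wiring (the flag may equally be consumed by the frontend build):

```rust
// Sketch: read the LOCI_TIER build-time env var. option_env! resolves at
// compile time, so the tier is baked into the binary. The default and the
// helper below are illustrative assumptions, not the real implementation.
const TIER: &str = match option_env!("LOCI_TIER") {
    Some(t) => t,
    None => "scholar", // default to the simpler tier when unset
};

fn mcp_enabled() -> bool {
    // MCP server ships only in the Wizard build
    TIER == "wizard"
}

fn main() {
    println!("tier={} mcp_server={}", TIER, mcp_enabled());
}
```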
Download
| Platform | Tier | File |
|---|---|---|
| macOS (Universal) | Scholar | loci-scholar-mac.dmg |
| macOS (Universal) | Wizard | loci-wizard-mac.dmg |
| Windows x64 | Scholar | loci-scholar-windows.exe |
| Windows x64 | Wizard | loci-wizard-windows.exe |
What the app does that the extension can't
- Persistent, windowed search UI: full conversation browser, not just a floating overlay
- Room management: create, name, configure rooms; assign conversations to rooms
- Locus creation: crystallise insights from conversations into .locus files
- MCP server: expose rooms as MCP tools to Claude Code, Cursor, Zed (Wizard tier)
- LLM integration: connect a local Ollama instance or API key for summarisation (Wizard tier)
- System tray: quick-access search from the menu bar without opening the full app
- File system access: direct read/write to ~/.loci/ via Rust backend
Onboarding
Scholar (3 screens)
1: Welcome · Scholar crystal + headline: "Your AI memory, finally searchable." Single CTA: "Let's set it up"
2: Connect your tools · Checklist: Claude.ai / ChatGPT. Button: "Install Chrome extension" (opens CWS). Status: "Extension detected ✓" or "Waiting for extension…" (polls via native messaging).
3: Your first search · Live search bar. If the extension is connected: searches already-indexed conversations. If not: demo mode with example results. CTA: "Open loci"
Wizard (5 screens)
1: The Palace Awakens · Palace floor plan SVG, amber-lit. "Your palace is ready."
2: Choose your rooms · Checkboxes for each room. Deselected rooms exist but are hidden from nav.
3: Configure your LLM · Provider: Local (Ollama) / Claude API / OpenAI API / Skip. API keys are stored in the OS keychain, never in config.json.
4: MCP setup · Toggle: "Expose rooms as MCP tools." If on: port config + copyable IDE config snippet. "Test connection" button.
5: Enter the palace · Floor plan now lit with selected rooms. CTA: "Enter the palace"
File structure bootstrapped on first run
~/.loci/ is created by loci_init() (Rust command) on first launch. Idempotent: safe to call again.
→ See Architecture: file structure
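An idempotent bootstrap of this kind can be sketched with only `std::fs`, since `create_dir_all` is a no-op for directories that already exist. The subdirectory names below are assumptions for illustration; the real layout is documented under Architecture:

```rust
use std::fs;
use std::io;
use std::path::{Path, PathBuf};

// Sketch of loci_init(): create ~/.loci/ and its subdirectories.
// "rooms" and "conversations" are hypothetical names, not the real layout.
fn loci_init(home: &Path) -> io::Result<PathBuf> {
    let root = home.join(".loci");
    for dir in ["rooms", "conversations"] {
        // create_dir_all succeeds if the directory already exists,
        // which is what makes repeated calls safe.
        fs::create_dir_all(root.join(dir))?;
    }
    Ok(root)
}

fn main() -> io::Result<()> {
    let home = std::env::temp_dir().join("loci-demo-home");
    let root = loci_init(&home)?;
    loci_init(&home)?; // second call succeeds: idempotent
    println!("bootstrapped {}", root.display());
    Ok(())
}
```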
Building from source
Prerequisites: Rust (stable), Node.js ≥ 18, Tauri CLI v2
git clone https://github.com/loci-garden/loci
cd tauri-app
npm install
Development:
npm run tauri dev
Production build:
# Scholar, Mac universal
LOCI_TIER=scholar npm run tauri build -- --target universal-apple-darwin
# Wizard, Windows
LOCI_TIER=wizard npm run tauri build -- --target x86_64-pc-windows-msvc
Or use the build script:
./build.sh scholar mac
./build.sh wizard all
Rust backend commands
| Command | Description | Tier |
|---|---|---|
| loci_init() | Bootstrap ~/.loci/ file structure | All |
| get_config() | Read config.json | All |
| set_config(patch) | Write config.json | All |
| list_rooms() | List configured rooms | All |
| create_room(name) | Create new room directory + .room.json | All |
| start_mcp_server() | Start local MCP server on port 3721 | Wizard |
| register_native_host() | Register Chrome native messaging host | All |
| build_tray() | System tray setup | All |
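As one concrete example, `create_room(name)` can be sketched with the standard library alone: make the room directory, then drop a minimal `.room.json` marker into it. The field names and paths here are illustrative assumptions, not the real schema:

```rust
use std::fs;
use std::io;
use std::path::{Path, PathBuf};

// Sketch of create_room(name): room directory + .room.json marker.
// The real Tauri command would take state/error types from the app;
// the JSON field ("name") is a hypothetical schema.
fn create_room(loci_root: &Path, name: &str) -> io::Result<PathBuf> {
    let room = loci_root.join("rooms").join(name);
    fs::create_dir_all(&room)?;
    // Hand-rolled JSON keeps the sketch dependency-free; the real app
    // would serialise a proper config struct instead.
    let meta = format!("{{\"name\":{:?}}}\n", name);
    fs::write(room.join(".room.json"), meta)?;
    Ok(room)
}

fn main() -> io::Result<()> {
    let root = std::env::temp_dir().join("loci-demo");
    let room = create_room(&root, "library")?;
    println!("created {}", room.display());
    Ok(())
}
```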
System tray
Tray icon: loci crystal (22px, monochrome on dark menu bar).
Search… ← opens ⌘K overlay or focuses search
─────────────────
Open loci / Open Palace ← focuses main window
Settings
─────────────────
Quit
Extension ↔ App communication
The extension sends indexed conversations to the desktop app using Chrome native messaging.
Native host registration: The Tauri app registers a native messaging host (garden.loci.app) on first launch via register_native_host(). This writes the required JSON manifest to the OS-specific path.
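Chrome's native messaging manifest has a fixed shape; a sketch of what `register_native_host()` might write is below. The `path` and extension ID are placeholders, not real values:

```json
{
  "name": "garden.loci.app",
  "description": "loci desktop native messaging host",
  "path": "/path/to/loci-native-host",
  "type": "stdio",
  "allowed_origins": ["chrome-extension://<extension-id>/"]
}
```

On macOS the manifest goes under `~/Library/Application Support/Google/Chrome/NativeMessagingHosts/`; on Windows it is referenced via a registry key rather than a fixed path.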
Protocol: Simple JSON messages over the native messaging channel:
{ "type": "conversation_indexed", "id": "...", "platform": "claude", "title": "...", "turns": [...] }
{ "type": "sync_request" }
{ "type": "search", "query": "..." }
Note (open question): Native messaging vs local HTTP (localhost:3721). Native messaging is more secure (no port exposure) but requires native host registration. Local HTTP is simpler. Cipher to advise before Phase 2 implementation.
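On the wire, Chrome native messaging frames each JSON message with its byte length as a 32-bit little-endian integer; that part is standard, though the message contents above are loci-specific. A minimal framing sketch in Rust, using only the standard library (no serde):

```rust
use std::io::{self, Cursor, Read, Write};

// Write one native-messaging frame: 4-byte LE length prefix, then JSON bytes.
fn write_message<W: Write>(w: &mut W, json: &str) -> io::Result<()> {
    let len = json.len() as u32;
    w.write_all(&len.to_le_bytes())?;
    w.write_all(json.as_bytes())
}

// Read one frame back: length prefix first, then exactly that many bytes.
fn read_message<R: Read>(r: &mut R) -> io::Result<String> {
    let mut len_buf = [0u8; 4];
    r.read_exact(&mut len_buf)?;
    let len = u32::from_le_bytes(len_buf) as usize;
    let mut buf = vec![0u8; len];
    r.read_exact(&mut buf)?;
    String::from_utf8(buf).map_err(|e| io::Error::new(io::ErrorKind::InvalidData, e))
}

fn main() -> io::Result<()> {
    let msg = r#"{"type":"sync_request"}"#;
    let mut pipe = Vec::new();
    write_message(&mut pipe, msg)?;
    let round_trip = read_message(&mut Cursor::new(pipe))?;
    assert_eq!(round_trip, msg); // frame survives the round trip intact
    println!("round-tripped {} bytes", msg.len());
    Ok(())
}
```

In the real app these frames would flow over the host process's stdin/stdout, which is how Chrome connects to a native messaging host.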
Roadmap
| Phase | Features | Duration |
|---|---|---|
| 1 | Tauri scaffold, ~/.loci/ bootstrap, Scholar onboarding, basic search | 2 weeks |
| 2 | Native messaging with extension, "extension detected" status | 1 week |
| 3 | Wizard features: onboarding, palace nav, MCP server, LLM config | 2–3 weeks |
| 4 | Code signing, auto-update, distribution | 1 week |