# Getting Started: LLMAGE
For: AI engineers who need a local context persistence layer in LLM-native workflows. Setup time: 5 minutes. Requirements: Git, an MCP-compatible client (Claude Code, Cursor, Zed).
## What you get
The LLMAGE tier is loci stripped to its primitives:
- CLI-first: no GUI required
- MCP server: expose context as tools
- Namespace isolation: rooms stay separate
- Zero cloud: no external network calls in the core loop
- Headless mode: run as a daemon
## Install

```bash
git clone https://github.com/huximaxi/Loci.git
cd Loci
mkdir -p ~/.loci
cp config.example.json ~/.loci/config.json
```

Edit `~/.loci/config.json` to match your setup.
## Full config reference

| Field | Type | Default | Description |
|---|---|---|---|
| `version` | string | `"0.1.0"` | Config schema version |
| `tier` | string | `"llmage"` | User tier: `scholar`, `wizard`, or `llmage` |
| `index.auto_sync` | boolean | `true` | Automatically sync index on content change |
| `index.sync_interval` | string | `"5m"` | Interval between index syncs (e.g., `"1m"`, `"30s"`) |
| `index.full_text` | boolean | `true` | Enable full-text search indexing |
| `index.semantic` | boolean | `false` | Enable semantic/embedding-based search (requires LLM) |
| `llm.provider` | string | `"local"` | Provider: `local`, `anthropic`, `openai` |
| `llm.endpoint` | string | `"http://localhost:11434"` | Endpoint URL (for local provider) |
| `llm.model` | string | `"llama3"` | Model identifier |
| `mcp.expose_rooms` | array | `["dev"]` | Rooms to expose as MCP tools |
| `mcp.port` | number | `3721` | MCP server port |
| `mcp.auth` | string | `null` | Optional auth token for MCP server |
| `chronicle.enabled` | boolean | `false` | Enable session logging |
| `chronicle.path` | string | `"~/.loci/chronicle/"` | Chronicle storage location |
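If you prefer to generate `~/.loci/config.json` programmatically (for provisioning scripts, say), the defaults in the table above can be expressed as a plain dict and serialized. This is a convenience sketch, not part of loci itself:

```python
import json
from pathlib import Path

# Defaults taken from the config reference table; adjust the provider,
# rooms, and paths to your setup before writing the file out.
DEFAULT_CONFIG = {
    "version": "0.1.0",
    "tier": "llmage",
    "index": {"auto_sync": True, "sync_interval": "5m",
              "full_text": True, "semantic": False},
    "llm": {"provider": "local",
            "endpoint": "http://localhost:11434",
            "model": "llama3"},
    "mcp": {"expose_rooms": ["dev"], "port": 3721, "auth": None},
    "chronicle": {"enabled": False, "path": "~/.loci/chronicle/"},
}

def write_config(path: Path, config: dict) -> None:
    """Serialize the config as pretty-printed JSON, creating parent dirs."""
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(config, indent=2) + "\n")
```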
## Starting the MCP server

```bash
loci serve
```

The server starts on the port specified in config (default: 3721).

### With explicit port

```bash
loci serve --port 3800
```

### Headless mode (daemon)

```bash
loci serve --daemon
```

The process detaches and writes logs to `~/.loci/logs/mcp.log`. Stop with:

```bash
loci stop
```

## Available MCP tools
| Tool | Description | Parameters | Returns |
|---|---|---|---|
| `loci_search` | Full-text search across indexed content | `query: string`, `room?: string`, `limit?: number` | Array of matched results with excerpts |
| `room_{name}` | Load room context (soul + context.md + crystals) | None | Room context as structured text |
| `loci_get` | Retrieve a specific crystal or locus | `id: string` | Crystal content or error |
| `loci_garden` | Retrieve a garden plant | `slug: string` | Plant content with metadata |
| `loci_write` | Write content to a room's context | `room: string`, `content: string`, `append?: boolean` | Success confirmation |
| `loci_crystal` | Create a new crystal | `room: string`, `id: string`, `content: string`, `type?: string` | Crystal metadata |
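Your MCP client normally constructs these calls for you, but it can help to see the wire format. MCP uses JSON-RPC 2.0 with a `tools/call` method; the sketch below builds such a request for `loci_search` (transport details — stdio vs. HTTP — depend on your client and are not shown):

```python
import json

def make_tool_call(req_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP tools/call request in JSON-RPC 2.0 wire format."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# A search scoped to the dev room, using the loci_search parameters
# documented in the table above.
payload = make_tool_call(1, "loci_search",
                         {"query": "kafka vs sqs", "room": "dev", "limit": 5})
```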
## Querying from IDE

### Example 1: Search for past discussions

```
@loci loci_search "kafka vs sqs"
```

Output:

```json
{
  "results": [
    {
      "id": "conv-20260501-142200",
      "room": "dev",
      "excerpt": "...decided on SQS for the payment webhook pipeline. Lower ops overhead, sufficient throughput...",
      "score": 0.92
    }
  ],
  "total": 1
}
```

### Example 2: Load room context
```
@loci room_dev
```

Output:

```markdown
# Dev Room: Soul
You are operating inside the Dev Room...
---
## Recent context
Last session: 2026-05-03
State: Implementing auth middleware
## Crystals
- auth-decision: Using Clerk for authentication
- db-choice: PostgreSQL via Supabase
```

### Example 3: Retrieve a specific crystal
```
@loci loci_get auth-decision
```

Output:

```markdown
---
id: "auth-decision"
created: "2026-05-01T09:00:00Z"
room: "dev"
type: "decision"
---
# Auth provider decision
We use Clerk for authentication...
```

## Namespace isolation
Each room is an isolated context namespace. When you invoke room_dev, you get Dev Room context only. Research Room content does not appear.
This isolation is enforced at the MCP tool level:
- `loci_search` defaults to all rooms but accepts a `room` parameter for scoped queries
- `room_{name}` tools only return content from that room
- Crystals are namespaced by room in the filesystem
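Because crystals are namespaced by room on disk, you can resolve a crystal's location with nothing more than path composition. A sketch (the `.md` extension is an assumption; the docs only specify the per-room `crystals/` directory):

```python
from pathlib import Path

LOCI_ROOT = Path("~/.loci").expanduser()

def crystal_path(room: str, crystal_id: str) -> Path:
    """Resolve a crystal's on-disk location inside its room's namespace.
    Assumes crystals are stored as .md files under rooms/{room}/crystals/."""
    return LOCI_ROOT / "rooms" / room / "crystals" / f"{crystal_id}.md"
```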
Filesystem structure:

```
~/.loci/
├── config.json
├── rooms/
│   ├── dev/
│   │   ├── CLAUDE.md
│   │   ├── context.md
│   │   └── crystals/
│   └── research/
│       ├── CLAUDE.md
│       ├── context.md
│       └── crystals/
└── index/
    └── metadata.db
```

## Snapshot export
Export room state for backup or migration:
```bash
loci export --room dev --format json
```

Output: `~/.loci/exports/dev-2026-05-04.json`
### Export options

| Flag | Description |
|---|---|
| `--room` | Room to export (required) |
| `--format` | Output format: `json`, `markdown`, `archive` |
| `--include-chronicle` | Include session logs |
| `--output` | Custom output path |
### Export as markdown (for version control)

```bash
loci export --room dev --format markdown --output ./docs/dev-room/
```

Creates a folder of markdown files suitable for committing to git.
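To back up several rooms in one pass (e.g., from a nightly job), you can assemble the export invocations in a script. This sketch uses only the flags documented above; the room names are examples:

```python
from typing import List, Optional

def export_cmd(room: str, fmt: str = "json",
               output: Optional[str] = None) -> List[str]:
    """Assemble a `loci export` argv list from the documented flags."""
    cmd = ["loci", "export", "--room", room, "--format", fmt]
    if output:
        cmd += ["--output", output]
    return cmd

# One export per room; in practice run each command with
# subprocess.run(cmd, check=True).
commands = [export_cmd(r, "markdown", f"./docs/{r}-room/")
            for r in ["dev", "research"]]
```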
## Provider backends

loci supports three LLM provider backends.

### Local (Ollama)

```json
{
  "llm": {
    "provider": "local",
    "endpoint": "http://localhost:11434",
    "model": "llama3"
  }
}
```

Requires Ollama running locally. No external network calls.
### Claude API

```json
{
  "llm": {
    "provider": "anthropic",
    "model": "claude-sonnet-4-20250514"
  }
}
```

Set API key via environment variable:

```bash
export ANTHROPIC_API_KEY="sk-ant-..."
```

### OpenAI API
```json
{
  "llm": {
    "provider": "openai",
    "model": "gpt-4o"
  }
}
```

Set API key via environment variable:

```bash
export OPENAI_API_KEY="sk-..."
```

> **WARNING:** When using cloud providers, LLM calls leave your machine. The core index/search/serve loop remains local: only explicit LLM operations (semantic search, summarization) contact external APIs.
## Headless mode

For server or CI environments, run loci as a daemon:

```bash
loci serve --daemon --port 3721
```

### Daemon management
```bash
# Check status
loci status

# View logs
tail -f ~/.loci/logs/mcp.log

# Stop daemon
loci stop

# Restart
loci restart
```

### Systemd service (Linux)
```ini
[Unit]
Description=loci MCP server
After=network.target

[Service]
Type=simple
ExecStart=/usr/local/bin/loci serve --port 3721
Restart=on-failure
User=youruser

[Install]
WantedBy=multi-user.target
```

## Architecture notes
For LLMAGE-tier use:

- Search index: MiniSearch 7.x. Serializes via `toJSON()`/`loadJSON()`. No daemon required for the index itself.
- Content store: SQLite at `~/.loci/index/metadata.db`. Append-only, never modified in place.
- MCP server: Rust binary. Stateless between calls.
- Room contexts: Served from `~/.loci/rooms/{name}/context.md`: plaintext, diffable, version-controllable.
- No cloud path: Zero external network calls in the core loop.
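Because room contexts are plain files, you can read them without going through loci at all, from editor tooling, CI checks, or git hooks. A sketch (not a loci API):

```python
from pathlib import Path

def read_room_context(name: str, root: Path = Path("~/.loci")) -> str:
    """Read a room's context.md directly. It is plain markdown, so it can
    be read, diffed, or version-controlled like any other text file."""
    return (root.expanduser() / "rooms" / name / "context.md").read_text()
```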
Full details: Architecture documentation
## Next steps

- Read the MCP API reference for complete tool documentation
- See Architecture for the `.locus` file format specification
- Explore Rooms & Loci for namespace patterns and multi-project setups