Positioning

Why GigaBrain

A local-first brain that scales past thousands of markdown files — no cloud, no API keys, no compromise.

Zero cloud, zero keys

No subscription gate, no inference API, no dependency on a service staying alive somewhere else. Intelligence runs where the data lives.

Radical privacy

Your notes are never transmitted, trained on, or indexed by a third party. The machine doing the work is the machine you already own.

Instant retrieval

Hybrid local search keeps latency low enough to stay in the flow of thought instead of waiting on a remote pipeline to wake up.

Future proof

Markdown and SQLite are durable formats. Your knowledge stays readable and queryable long after proprietary products come and go.
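That durability claim is easy to see in practice: a vault mirrored into SQLite stays queryable with nothing beyond a language's standard library. A minimal sketch, using an illustrative schema that is not GigaBrain's actual one:

```python
# Notes stored in SQLite remain queryable with only the standard library.
# The table layout here is illustrative, not GigaBrain's real schema.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE notes (path TEXT PRIMARY KEY, body TEXT)")
con.executemany(
    "INSERT INTO notes VALUES (?, ?)",
    [("ideas.md", "# Ideas\nlocal-first tools"),
     ("log.md", "# Log\nshipped v0.9.6")],
)
rows = con.execute(
    "SELECT path FROM notes WHERE body LIKE ?", ("%local-first%",)
).fetchall()
print(rows)  # [('ideas.md',)]
```

No server, no driver, no export step: the file on disk is the database.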

The Digital Atelier

Built like a quiet studio for thought: warm materials, stable foundations, and zero dependence on the cloud.

The current landscape of knowledge management is fractured by dependency. We rent our memory from services that can change policy, disappear, or decide that semantic features now require yet another key and another bill.

GigaBrain flips that arrangement. It is software built with the philosophy of a physical library: heavy, stable, inspectable, and entirely yours.

Intelligence should not require a handshake with the cloud.

GigaBrain holds four positions that, together, no other knowledge tool occupies:

  • Local-first. Compute and storage both live on your machine. The model weights ship in the binary on the air-gapped channel; on the online channel they’re cached after a one-time download. Nothing leaves the box.
  • Agent-native. The MCP server is not an afterthought — it predates any GUI. The same surface that humans use through the CLI is exposed to Claude Code, Cursor, and any other MCP-compatible client over stdio.
  • Single binary. No container, no Python runtime, no service mesh. gbrain is one statically linked file. Drop it on a laptop, a Pi, or an air-gapped workstation and it works the same.
  • Hybrid retrieval. FTS5 for precise keyword matches and a local BGE vector index for semantic recall, fused by set-union with a short-match short-circuit. You don’t choose between keyword and semantic: both run, both rank, and the merged answer is what you read.
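The fusion step in that last bullet can be sketched in a few lines. Everything below is illustrative: the two search functions are toy stand-ins for FTS5 and the BGE index, not GigaBrain's implementation, but the set-union merge and the short-match short-circuit follow the shape described above.

```python
# Hypothetical sketch of hybrid retrieval fusion: union the keyword and
# semantic result sets, short-circuiting the semantic pass when a short
# query already has exact keyword hits.

def fts_search(query, notes):
    """Keyword pass: exact substring match (stand-in for FTS5)."""
    return {nid for nid, text in notes.items() if query.lower() in text.lower()}

def vector_search(query, notes, top_k=1):
    """Semantic pass: toy token-overlap score (stand-in for BGE embeddings)."""
    q = set(query.lower().split())
    ranked = sorted(notes, key=lambda nid: -len(q & set(notes[nid].lower().split())))
    return set(ranked[:top_k])

def hybrid_search(query, notes, short_len=3):
    keyword_hits = fts_search(query, notes)
    # Short-match short-circuit: a short query with exact keyword hits
    # doesn't need the semantic pass.
    if len(query.split()) <= short_len and keyword_hits:
        return keyword_hits
    return keyword_hits | vector_search(query, notes)  # set-union fusion

notes = {
    "a.md": "SQLite FTS5 powers keyword search",
    "b.md": "vector embeddings enable semantic recall",
    "c.md": "markdown notes are durable",
}
print(sorted(hybrid_search("FTS5", notes)))                      # ['a.md'] (short-circuit)
print(sorted(hybrid_search("semantic recall of ideas", notes)))  # ['b.md'] (semantic recall)
```

The second query matches nothing verbatim, yet still lands on the right note: that is the semantic half doing work the keyword half cannot.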

The same single-file brain answers full-text searches, semantic queries, graph exploration, contradiction checks, and MCP tool calls without handing custody of your data to anyone else.
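For the MCP side, what actually crosses stdio is newline-delimited JSON-RPC 2.0. A sketch of that framing; the tool name and arguments here are hypothetical, only the envelope follows the MCP specification.

```python
# Shape of an MCP tool call as it travels over the stdio transport:
# one JSON-RPC 2.0 message per line. "search" and its arguments are
# hypothetical placeholders, not a documented GigaBrain tool.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "search", "arguments": {"query": "local-first"}},
}
line = json.dumps(request)  # written to the server's stdin, newline-terminated
print(line)
```

Because the transport is just lines of JSON on a pipe, any MCP-compatible client can drive the same surface the CLI exposes.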

It also stays current. The v0.9.6 release added a live file watcher that follows your Obsidian vault as you edit — reconcile-backed, with a 1.5 s debounce and write-safety interlocks — so the brain that an AI agent reads is always the same brain you just updated. (Unix/macOS/Linux only.)
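The debounce behind the watcher can be pictured as a restartable timer: every edit event resets a quiet window, and only when the window elapses does a reconcile pass run. A minimal sketch, with class names and timings that are illustrative rather than GigaBrain's actual implementation (its window is 1.5 s):

```python
# Coalesce a burst of filesystem events into a single reconcile pass
# once the file has been quiet for the debounce window.
import threading

class Debouncer:
    def __init__(self, delay, action):
        self.delay = delay      # quiet window in seconds
        self.action = action    # e.g. a reconcile pass over the vault
        self._timer = None
        self._lock = threading.Lock()

    def trigger(self, path):
        """Called on every event; only the last one in a burst fires."""
        with self._lock:
            if self._timer is not None:
                self._timer.cancel()  # restart the quiet window
            self._timer = threading.Timer(self.delay, self.action, args=(path,))
            self._timer.start()

reconciled = []
d = Debouncer(0.1, reconciled.append)
for _ in range(5):      # a burst of rapid edits…
    d.trigger("note.md")
d._timer.join()         # …collapses to a single reconcile
print(reconciled)       # ['note.md']
```

The write-safety interlocks mentioned above would sit inside the action itself, ensuring a reconcile never races an in-flight save.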

Ready to own the stack end to end?

Start with the install tutorial, build your first brain in ten minutes, and wire it into the tools and agents you already use.