
Run airgapped vs online

Pick the build channel that fits your environment, and learn how to switch between them.

gbrain ships in two flavors. The defaults are tuned so most people never have to think about it; this page is the ten-minute guide for everyone else.

|                                    | Airgapped (default) | Online                 |
|------------------------------------|---------------------|------------------------|
| Cargo features                     | `embedded-model`    | `bundled,online-model` |
| Binary size                        | ~180 MB             | ~90 MB                 |
| Model assets                       | BGE-small embedded  | Cached on first use    |
| Custom model selection (`--model`) | Warning-only no-op  | Honored                |
| Network at runtime                 | Never               | Once, per model        |
Choose airgapped if:

  • You work on a plane, in a SCIF, or on a regulated workstation.
  • You want zero ambiguity about whether the brain is talking to anyone.
  • You’re fine with the BGE-small default (almost everyone is).

Choose online if:

  • You want a smaller binary.
  • You want to use a non-default model (base, large, m3, or any HF model ID).
  • You can tolerate a one-time download on first semantic use.

Airgapped is the default channel. To install it, either:

```sh
# Shell installer (default channel = airgapped)
curl -fsSL https://raw.githubusercontent.com/macro88/gigabrain/main/scripts/install.sh | sh

# Or build from source
cargo build --release
```

Either path produces a binary that contains the BGE-small weights inside it. The first semantic operation does no network work.
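If you want a rough sanity check that you got the embedded build, the binary-size figures from the table above (~180 MB airgapped vs ~90 MB online) are a usable tell. This is an illustrative sketch, not a gbrain feature; the fallback path and the GNU `stat -c` flag are assumptions (on macOS, use `stat -f %z`):

```sh
# Rough channel sanity check: an airgapped build (~180 MB) is roughly twice
# the size of an online build (~90 MB) because the BGE-small weights are
# embedded. Falls back to 0 MB if no binary is found.
bin="$(command -v gbrain || echo target/release/gbrain)"
size_mb=$(( $(stat -c %s "$bin" 2>/dev/null || echo 0) / 1024 / 1024 ))
echo "gbrain binary: ${size_mb} MB"
```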

To install the online channel instead, either:

```sh
# Shell installer with channel override
curl -fsSL https://raw.githubusercontent.com/macro88/gigabrain/main/scripts/install.sh \
| GBRAIN_CHANNEL=online sh

# Or build from source
cargo build --release --no-default-features --features bundled,online-model
```

On first semantic use, gbrain downloads the active model from GBRAIN_HF_BASE_URL (default https://huggingface.co) into GBRAIN_MODEL_CACHE_DIR (default ~/.gbrain/models/). Subsequent runs use the cache.
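Both knobs are plain environment variables with documented defaults. A minimal sketch of how the resolution described above behaves (the variable names and defaults come from this page; the expansion logic here is illustrative, not gbrain's actual code):

```sh
# How the online channel's settings resolve (defaults as documented above).
# ${VAR:-default} yields the default whenever VAR is unset or empty.
base_url="${GBRAIN_HF_BASE_URL:-https://huggingface.co}"
cache_dir="${GBRAIN_MODEL_CACHE_DIR:-$HOME/.gbrain/models}"
echo "model base URL: $base_url"
echo "model cache:    $cache_dir"
```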

The brain database is independent of the binary channel. To switch:

  1. Install the other channel.
  2. Keep using the same brain.db (move it if needed).
  3. Run gbrain version to confirm the switch, then gbrain stats to confirm the recorded model is still consistent.

There’s no schema migration involved.

To pin a specific release and channel:

```sh
curl -fsSL https://raw.githubusercontent.com/macro88/gigabrain/main/scripts/install.sh \
| GBRAIN_VERSION=v0.9.8 GBRAIN_CHANNEL=online sh
```

GBRAIN_VERSION pins the release tag; GBRAIN_CHANNEL pins the channel. The installer validates SHA-256 either way.
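If you want to double-check an artifact yourself, the same SHA-256 comparison can be sketched by hand. Everything here is illustrative: substitute the downloaded archive for /dev/null and the published checksum for your release tag (the digest below is the well-known SHA-256 of empty input, so the demo prints "checksum OK" as written):

```sh
# Manual SHA-256 verification sketch. Replace /dev/null with the downloaded
# archive and $expected with the checksum published for your release tag.
expected="e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
actual="$(sha256sum /dev/null | awk '{print $1}')"
if [ "$actual" = "$expected" ]; then
  echo "checksum OK"
else
  echo "checksum MISMATCH" >&2
fi
```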

For environments with restricted egress, point at an internal mirror:

```sh
export GBRAIN_HF_BASE_URL=https://hf-mirror.internal
export GBRAIN_MODEL_CACHE_DIR=/var/cache/gbrain/models
gbrain query "..."
```

Only the online channel honors these. The airgapped binary does no network work regardless.
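Before pointing the online channel at a mirror, a small preflight can catch obvious misconfiguration. This check is not part of gbrain; it is a hypothetical sketch that only validates the URL scheme, with the reachability probe commented out so it works offline:

```sh
# Hypothetical mirror preflight: insist on https, then (optionally) probe it.
mirror="${GBRAIN_HF_BASE_URL:-https://huggingface.co}"
case "$mirror" in
  https://*) echo "mirror URL ok: $mirror" ;;
  *)         echo "refusing non-https mirror: $mirror" >&2; exit 1 ;;
esac
# curl -fsI "$mirror" >/dev/null && echo "mirror reachable"
```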

Quick paranoia check on the airgapped channel:

```sh
# macOS: enable airplane mode, then run a query
# Linux: run the query inside an empty network namespace
unshare -n gbrain query "test" --db /tmp/empty.db || true
```

If the query path tries to fetch a model on the airgapped channel, it’s a bug — file an issue with gbrain version output.