# Installation

LocoPilot ships as a single npm package, `@infrarix/locopilot`. Installing it globally puts the `locopilot` binary on your `$PATH`.

## Requirements

| Requirement | Why | Auto-installed? |
| --- | --- | --- |
| Node.js 20+ | Runs the CLI and the local Fastify API | ❌ install yourself |
| Ollama | Runs models on your hardware | ✅ `locopilot init` will install it |
| Python 3.9+ | Required by the training adapters (Unsloth / Axolotl / MLX) | ❌ install yourself |
| SQLite | Local-tier persistence (inference logs, training jobs) | ✅ ships with the `sqlite3` npm package |

:::tip Apple Silicon
On macOS arm64 the training adapter automatically becomes MLX. Install the Python runtime once with `pip3 install mlx-lm`.
:::
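If you want to sanity-check the two requirements LocoPilot does not install before running the setup, a minimal preflight sketch looks like this (the version parsing is illustrative, not part of the `locopilot` CLI):

```shell
# Check the two manually-installed requirements: Node.js 20+ and Python 3.
# Illustrative sketch only; `locopilot init` performs its own checks.
ollama_node_major=$(node --version 2>/dev/null | sed 's/^v\([0-9][0-9]*\)\..*/\1/')
if [ "${ollama_node_major:-0}" -ge 20 ]; then
  echo "Node.js OK ($(node --version))"
else
  echo "Node.js 20+ required" >&2
fi
python3 --version 2>/dev/null || echo "Python 3.9+ required" >&2
```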

## Install the CLI

```shell
npm install -g @infrarix/locopilot
```

Verify it landed:

```shell
locopilot --version
# 1.0.0
```

## Initialize your machine

```shell
locopilot init
```

`locopilot init` runs four checks (the source is in `src/cli/commands/init.ts`):

1. **Ollama** — checks whether the binary is on `PATH`. If not, it auto-installs it:
   - macOS → `brew install --cask ollama`
   - Linux → `curl -fsSL https://ollama.com/install.sh | sh`
   - Windows → `winget install -e --id Ollama.Ollama`
2. **Python 3** — verifies that `python3` is reachable, and prints install instructions if it is missing.
3. **SQLite** — creates `~/.locopilot/db.sqlite` with the `inference_logs` and `training_jobs` tables.
4. **Config** — writes `~/.locopilot/.env` with sensible defaults.
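The platform dispatch in step 1 can be sketched as a small shell helper. The install commands are the ones listed above; the function itself is a hypothetical reconstruction, not the actual `init.ts` logic:

```shell
# Hypothetical sketch of how `locopilot init` picks an Ollama installer.
# The real implementation lives in src/cli/commands/init.ts and may differ.
ollama_install_command() {
  case "$1" in
    Darwin)              echo "brew install --cask ollama" ;;
    Linux)               echo "curl -fsSL https://ollama.com/install.sh | sh" ;;
    Windows*|MINGW*)     echo "winget install -e --id Ollama.Ollama" ;;
    *)                   echo "no auto-install for: $1" >&2; return 1 ;;
  esac
}

# Print the command that would be used on this machine.
ollama_install_command "$(uname -s)"
```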

A successful run looks like this:

```text
🐌 LocoPilot Setup

[1/4] Checking Ollama...
✔ Ollama installed (ollama version is 0.5.4)
✔ Ollama service running

[2/4] Checking Python 3 (required for training adapters)...
✔ Python 3.12.1 (python3)

[3/4] Initializing local database...
✔ SQLite ready at ~/.locopilot/db.sqlite

[4/4] Writing default config...
✔ Config written to ~/.locopilot/.env

Setup complete!
```

## Pull your first model

```shell
locopilot models pull llama3:8b
```

This is a thin wrapper around `ollama pull`. See `locopilot models`.

## Start the API server

```shell
locopilot start
```

The server listens on `http://localhost:8080`. Hit `/v1/locopilot/health` for a heartbeat.
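For example, with the server running in another terminal, you can poke the heartbeat endpoint with `curl` (the response body shown here is not specified by the docs, so none is assumed):

```shell
# Requires `locopilot start` to be running in another terminal.
curl -s http://localhost:8080/v1/locopilot/health
```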

## Verify everything

```shell
locopilot doctor
```

Runs every pre-flight check end to end. See `locopilot doctor` for the full list.

## Where things live

```text
~/.locopilot/
├── db.sqlite     # local inference + training history
├── .env          # OLLAMA_HOST, API_PORT, SQLITE_PATH, …
└── config.json   # Pro-tier token (created by `locopilot login`)
```

The CLI never reads `DATABASE_URL` or `REDIS_URL` — those live exclusively on the cloud control plane. Your machine stays fully self-contained.
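For reference, a default `~/.locopilot/.env` might look like the fragment below. The variable names come from the tree above; the values are illustrative assumptions (11434 is Ollama's standard port, 8080 is the API port used earlier), and `locopilot init` writes the real defaults:

```ini
# Illustrative contents only; `locopilot init` writes the actual defaults.
OLLAMA_HOST=http://localhost:11434
API_PORT=8080
SQLITE_PATH=~/.locopilot/db.sqlite
```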