Preview — early access · Phase 0 starting
Your decentralized cursor

Start the work.
Hand it off.
Close your laptop — it keeps coding.

An AI coding agent you can hand off to a remote counterpart that keeps working in the background.
Ships as a VS Code extension · a terminal CLI · a branded desktop app.
Fully open source · Ollama + llama.cpp local-first · No logs, no training on your code

Pick the mode that fits the task.

One keybind for a one-shot completion. A local tool loop for sharper refactors. A remote autonomous agent when the task outgrows your laptop. Switch per task — the product picks none of them for you.

01 · One-shot

Inline chat & Cmd+K

Stateless. Fast. Familiar.

  • Side-panel chat with your code in context
  • Cmd+K inline edit with streaming diffs
  • Tab autocomplete (FIM)
  • @web grounding via LibertAI search
02 · Local agent

Tool loop on your machine

Reads, writes, runs, grounds — with your approval.

  • read_file / write_file / edit_file / bash
  • Per-tool approval UI — nothing runs unseen
  • Checkpoint every turn, roll back instantly
  • Works fully offline with Ollama or llama.cpp
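The "checkpoint every turn, roll back instantly" behavior can be pictured as a stack of workspace snapshots. This is an illustrative sketch, not the extension's actual implementation; the class and method names are hypothetical.

```python
import copy

class CheckpointedWorkspace:
    """Toy model of per-turn checkpointing: snapshot the workspace at the
    start of each turn, roll back to any earlier turn instantly.
    Illustrative only -- not the extension's real mechanism."""

    def __init__(self, files=None):
        self.files = dict(files or {})   # path -> contents
        self.checkpoints = []            # one snapshot per turn

    def begin_turn(self):
        # Snapshot before the agent's tools touch anything this turn.
        self.checkpoints.append(copy.deepcopy(self.files))

    def write_file(self, path, contents):
        self.files[path] = contents

    def rollback(self, turn_index):
        # Restore the snapshot taken at the start of that turn and
        # discard everything after it.
        self.files = copy.deepcopy(self.checkpoints[turn_index])
        del self.checkpoints[turn_index:]

ws = CheckpointedWorkspace({"app.py": "v1"})
ws.begin_turn(); ws.write_file("app.py", "v2")   # turn 0
ws.begin_turn(); ws.write_file("app.py", "v3")   # turn 1
ws.rollback(1)
print(ws.files["app.py"])  # → v2 (state before turn 1's edits)
```

The point of the design: because every turn starts from a snapshot, approving a tool call is never irreversible.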
03 · Remote agent

Hand it off to LiberClaw

Autonomous. Persistent. Keeps going.

  • Runs on an Aleph Cloud VM — not your laptop
  • Reuse one of your existing LiberClaw agents
  • Auto-approve inside a sandboxed workspace
  • Close the lid. Reconnect when you want.

Five steps from your laptop to a background agent.

// Cursor can't do this. Claude Code can't do this. Copilot can't do this.
// None of them have a LiberClaw to hand off to.

01 · Start local.

    You're in VS Code. You open the local agent, rough out a plan, run a few tool calls. Maybe a refactor. Maybe a migration. A few turns in, you realize this is going to take hours.

02 · Hit Continue on LiberClaw agent.

    One action in the side panel. The extension packages your message history, tool history, and the current workspace state into a hand-off bundle.
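To make "hand-off bundle" concrete, here is a minimal sketch of what such a bundle could contain. The structure and field names are hypothetical; the real wire format is internal to the extension.

```python
import json
import time

def build_handoff_bundle(messages, tool_history, workspace_files):
    """Hypothetical shape of a hand-off bundle: everything the remote
    agent needs to resume the session. Field names are illustrative."""
    bundle = {
        "version": 1,
        "created_at": int(time.time()),
        "messages": messages,          # chat transcript so far
        "tool_history": tool_history,  # every tool call and its result
        "workspace": workspace_files,  # path -> contents snapshot
    }
    return json.dumps(bundle)

wire = build_handoff_bundle(
    messages=[{"role": "user", "content": "migrate the schema"}],
    tool_history=[{"tool": "read_file", "args": {"path": "schema.sql"}}],
    workspace_files={"schema.sql": "CREATE TABLE ..."},
)
```

The idea is that the remote agent replays nothing: it receives the full conversational and workspace state and simply continues the loop.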

03 · Pick an agent — reuse, don't respawn.

    You see the LiberClaw agents you already have: the research one, the ops one, the one with your personal skills and MCP setup. Pick one. It inherits the task without spawning a new VM or burning a slot. Or spin up a fresh dedicated coding agent if you prefer.

04 · Let it run autonomously.

    Auto-approve is on — scoped to the workspace sandbox. The agent keeps running tools, writing files, running tests. Your local editor stays in bidirectional sync, so incoming changes show up in your diff view as they happen.

05 · Close your laptop.

    The agent is on an Aleph Cloud VM — it doesn't need you. Reconnect in 20 minutes, 2 hours, tomorrow morning. Pick up where it left off. Review the diff. Merge it into your working tree.

Bring your own backend.

LibertAI is the default when you're signed in — no logging, no training on your code. Or point it at any OpenAI-compatible or Anthropic-compatible endpoint you already have. Tool calling works across both conventions.

LibertAI

Default

Decentralized inference. Private by policy — no logs, no training on user data. Billed through your LiberClaw plan, or use a pay-per-use API key.

OpenAI-compatible

BYO key

OpenAI, Groq, Together, vLLM, or any endpoint speaking the OpenAI Chat API.

Anthropic-compatible

BYO key

Anthropic proper, or a self-hosted vLLM/llama.cpp with the Anthropic messages shim.

Ollama

Local

Auto-detected on localhost:11434. Full offline mode — code never leaves your machine.

llama.cpp server

Local

Auto-detected on localhost:8080. Fast, configurable, your hardware.
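The auto-detection described above can be sketched as a simple probe order over the two documented ports. The ports come from this page; the function shape is an assumption (the probe is injected so the logic is testable offline), not the extension's real detection code.

```python
# Probe order for local backends; ports match the docs above.
LOCAL_BACKENDS = [
    ("ollama", "http://localhost:11434"),
    ("llama.cpp", "http://localhost:8080"),
]

def detect_local_backend(is_listening):
    """Return the first local backend whose endpoint answers.
    `is_listening` is injected (e.g. a wrapper around an HTTP HEAD
    request) so the ordering logic works without a network."""
    for name, base_url in LOCAL_BACKENDS:
        if is_listening(base_url):
            return name, base_url
    return None

# Pretend only the llama.cpp server is up:
print(detect_local_backend(lambda url: url.endswith(":8080")))
# → ('llama.cpp', 'http://localhost:8080')
```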

Aleph GPU

Soon

Spin up a private GPU VM preloaded with Ollama or vLLM. Routes like any OpenAI-compatible endpoint.

Tool use is automatic. OpenAI-compatible backends → OpenAI function calling. Anthropic-compatible → Anthropic tool-use blocks. The agent loop normalizes both. You don't configure it.
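What "normalizes both" means in practice: the two APIs emit tool calls in different wire shapes, and the loop maps them to one internal shape. The wire formats below follow the public OpenAI and Anthropic APIs; `normalize_tool_call` itself is a hypothetical sketch, not the extension's internal normalizer.

```python
import json

def normalize_tool_call(raw):
    """Map a tool call from either wire convention to one internal shape.
    Illustrative sketch; the real normalizer is internal to the agent loop."""
    if raw.get("type") == "tool_use":
        # Anthropic: {"type": "tool_use", "id", "name", "input": {...}}
        return {"id": raw["id"], "name": raw["name"], "args": raw["input"]}
    # OpenAI: {"id", "type": "function",
    #          "function": {"name", "arguments": "<JSON string>"}}
    fn = raw["function"]
    return {"id": raw["id"], "name": fn["name"], "args": json.loads(fn["arguments"])}

anthropic_call = {"type": "tool_use", "id": "tu_1", "name": "bash",
                  "input": {"cmd": "ls"}}
openai_call = {"id": "call_1", "type": "function",
               "function": {"name": "bash", "arguments": '{"cmd": "ls"}'}}

print(normalize_tool_call(openai_call)["args"])  # → {'cmd': 'ls'}
```

Note the asymmetry the normalizer hides: Anthropic sends arguments as structured JSON, OpenAI as a JSON-encoded string.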

Ride your LiberClaw plan. Or bring your own key.

The editor and CLI are free, open source, and work with any backend. Remote-agent mode uses your LiberClaw subscription — so a free LiberClaw account already gets you two concurrent coding agents at no extra cost.

Free
€0 forever
2 concurrent agents
  • All editor + CLI features
  • All local backends
  • Hand-off to 2 running agents
Starter
€7/mo
5 concurrent agents
  • Everything in Free
  • Larger agent sizes (up to 4×)
  • Priority VM pool
Team
€49/mo
25 concurrent agents
  • Everything in Starter
  • Shared agent workspaces
  • Org-level usage reporting

Or skip plans entirely: drop an OpenAI / Anthropic / LibertAI key in settings.json and use the editor without any LiberClaw account. Remote-agent mode is the only feature gated behind sign-in.
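As a rough picture of what that looks like, a settings.json entry might resemble the fragment below. The key names are hypothetical, so check the extension's documentation for the actual schema before copying anything.

```jsonc
{
  // Hypothetical keys; the real schema lives in the extension's docs.
  "liberclaw.backend": "openai-compatible",
  "liberclaw.baseUrl": "https://api.openai.com/v1",
  "liberclaw.apiKey": "sk-..."
}
```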

Three boundaries. Stated precisely.

We don't conflate "private" with "local." Here's exactly where your code goes in each mode.

Fully local

Local

Ollama or llama.cpp server running on your own machine. Nothing leaves your laptop. No network egress. The only mode where "local" honestly applies.

LibertAI

Remote · private

Inference is remote, but no logging and no training on user data by policy. Aleph TEE instances available for confidential-compute models. Materially stronger than default OpenAI or Anthropic terms.

Remote LiberClaw agent

Your VM

Your code lives on an Aleph Cloud VM for the agent's lifetime. You control when to destroy it. Same LibertAI inference guarantees apply for the model calls the agent makes.

House rule: "Private by default" never means "local". LibertAI is remote-but-private — a different claim, and one we will never blur in marketing.

Three ways to use it.

Same product. Pick the surface that fits how you already work.

VS Code extension

Recommended

Inline chat, Cmd+K, tab complete, local agent, and hand-off — in your existing VS Code. Marketplace + Open VSX.

$ code --install-extension liberclaw.liberclaw-code

Terminal CLI

Agentic

Claude-Code-class REPL. Run the tool loop locally, or pass --agent to offload to a LiberClaw agent you can reconnect to anytime.

$ libertai code

Desktop app

Soon

Signed VSCodium build with LiberClaw Code pre-installed. One download, zero setup. macOS, Windows, Linux.

liberclaw.ai/code — coming after extension GA