CARAPACE

VPS / Linux / Raspberry Pi · layers on top of OpenClaw
1 curl -fsSL https://openclaw.ai/install.sh | bash
2 curl -fsSL https://carapace.info/install.sh | bash
Two commands · ~8 min · non-destructive on existing OpenClaw
Vision + contextual awareness for OpenClaw

Get your mom on OpenClaw in ten minutes.

Your OpenClaw
has eyes.

OpenClaw runs the brain. Carapace gives it eyes, ears, location, and a body. Sandboxed on your VPS or Mac — only sees what you point it at. Ten-minute install.

Open the fridge, ask what's for dinner. Snap a receipt, file it. Walk into a thrift store and price every shelf in real time.

No new subscriptions. No new accounts. Uses the AI keys you already have — Gemini, ChatGPT, Claude, Grok. Free on unlimited Mac and Linux gateways.

How to

Yeah, it's really that easy.

The install

The menu bar app

Senses

What Carapace adds.
Eyes. Ears. Place. Motion.

OpenClaw can already think. It can search the web, run code, manage files. What it can't do is see what you see. Carapace is the sensory layer — your iPhone becomes the agent's input device.

👁️

Vision Mode

Point your camera at anything — a plant, a wine label, a dishwasher error code, a mushroom. Ask, get a real answer with sources.

📸

Deep Scan

60-second context capture. Walk the aisle, sweep the room, pan the desk — Carapace stitches the moment into one rich query so the agent sees the whole scene.

🔍

Sticker-Peel Focus

Pinch-zoom and lift the thing you actually mean out of the frame. The agent only sees what you point at — not the messy background.

🎙️

Voice Mode

Natural back-and-forth. Sub-second first audio so the reply starts before you've put the phone down. Apple Speech transcription happens on-device.

📍

Place & Motion

Knows you're at the hardware store before you say so. Knows you're walking, not driving. Context the agent gets for free, never has to ask for.

🔠

On-Device OCR

Apple Vision reads every word in the frame before anything leaves your phone. Receipts, prescriptions, signage — text first, image only if the agent needs it.

Carapace runs the perception. OpenClaw runs the brain. You stay in the loop, every step.

Cognition

A brain, not a history buffer.
Modeled on actual brain regions.

Most "AI memory" is a vector database of past chats with cosine similarity slapped on top. Yours runs deeper — a layered cognitive model that takes what your iPhone perceives, files it the way a brain would, and surfaces the right context the moment it's relevant.

🧠

Hippocampus

Episodic memory of specific moments — the wine you photographed in Sonoma, the rash you asked about Tuesday morning. Retrieved by time, place, or topic.

🗺️

Place Cells & Grid Cells

Learns the shape of rooms, blocks, and routes you visit. Knows the difference between "your kitchen" and "a kitchen" without being told.

🌙

Default Mode Network

Consolidates raw experience into stable schemas in the background — like sleep does for you. The bridge gets quietly smarter between turns, not just during them.

Out of the box

Web search · Browser control · Code execution · Cron jobs · Persistent memory · Multi-agent orchestration · Image generation · File management

Toggle in Settings → Rumination on iPhone — off the moment you want vanilla AI back.

Real things you can do

A Tuesday with CARAPACE.
Camera, voice, memory — all working at once.

Kitchen

"What can I make for dinner?"

Open the fridge, ask. Carapace scans what's there, cross-references your pantry log, and names three things you can cook right now — not 40-minute recipes that assume you have saffron.

Vision · Memory · Voice
Thrifting

Good-Will Hunting, priced in real time

Walk the thrift-store aisle with Carapace live. Point at anything interesting — instant eBay sold-listing average, Mercari range, resale spread. "That midcentury lamp? $12 here, $180 sold last week."

Live vision · Web search
Everyday

"What am I looking at?"

Landmark in a new city. Plant at the nursery. Error code on a dishwasher. Mushroom in the woods. Engine noise you can hear but can't describe. Point, ask, get a real answer with sources.

Vision · Audio
DIY

Home-repair co-pilot

Point at the leak. Carapace walks you through the fix while watching your progress. It remembers your house — "that's the same water heater we looked at in March; the pilot light's on the lower-left panel."

Vision · Long memory · Guided voice

Four examples. Yours will be different. Build the routine that fits your life — every capability is already in the box.

What's under the shell

Not just a chat wrapper.

Real project tracking. Live agent orchestration. Status, in your pocket.

CARAPACE — Projects view · Force-press to dive in

Project Board

Every project your agents are touching, with live progress and per-workstream status. Force-press any project to drill into workstreams — granular progress, active blockers, who touched what last.

CARAPACE — Agents view

Spinal Map

Live view of your running agents — main orchestrator, subagents, what they're chewing on. When something breaks, you see it. When something finishes, you know.

CARAPACE — Scheduled Jobs view · Force-press to open payload

Cron Tracker

Every scheduled job in one list — next-run ETAs, last-run results, RED / YELLOW / GREEN health at a glance. Force-press a job to open its payload, edit the schedule, or fire it now.

Your hub. Your data.

Every machine you own.
Sandboxed exactly the way you want.

Carapace is the front-end. The back-end is whatever box you point it at — your Mac mini, a $5/mo Hostinger VPS, a Raspberry Pi in the closet, a Hetzner workhorse. One phone, all of them.

One Phone, Many Machines

Control unlimited gateways from your pocket

Home Mac. Office VPS. Etsy bot on Hetzner. The Pi in the garage running a side hustle. One iPhone, one tap to switch — every gateway on its own Tailscale-secured pipe, every agent with its own personality and memory.

  • Auto-discovers machines on your Tailnet
  • Per-gateway nicknames + custom agent names
  • Live agent feed, project tracker, cron dashboard — for every box
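The "auto-discovers machines on your Tailnet" bullet maps directly onto what the standard Tailscale CLI already exposes. A minimal sketch, assuming the Tailscale CLI is installed and logged in on the box you run it from — Carapace's own discovery mechanism may differ; this just shows what the tailnet sees:

```shell
# List peers on the tailnet -- the candidate machines a gateway
# could live on. `tailscale status --json` is the real CLI call;
# the parsing below is illustrative.
tailscale status --json | python3 -c '
import json, sys
peers = json.load(sys.stdin).get("Peer") or {}
for peer in peers.values():
    print(peer["HostName"], peer["TailscaleIPs"][0])
'
```

Each line is a hostname plus its stable Tailscale IP — the same address an iPhone on the tailnet can reach with no public ports open.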
Your Data, Your Rules

Sandbox what the AI can see

Don't want an AI rummaging through your home Mac? Spin it up on a $5/mo VPS instead — give it access to only the data you copy over. Your photos, your bank statements, your kid's school stuff: all stay where you put them.

  • No Carapace cloud — we never see your data
  • Bring your own AI keys (Gemini, ChatGPT, Claude, Grok)
  • Tailscale end-to-end — no public ports, no cloud middleman

Every other AI assistant lives on someone else's server. Carapace lives on yours.

Pricing

Mac and Linux: always free. iPhone scales with you.

Run CARAPACE on as many Macs, VPSs, and Raspberry Pis as you want — zero cost. The iOS tiers below unlock how many of those gateways a single iPhone can connect to. One-time purchase, no subscriptions, uses the AI keys you already have.

Solo

Free forever

$0

iPhone pairs with 1 gateway. Mac & Linux always unlimited.

All apps, all features
Voice, vision, chat
Your own hardware
Get iPhone app — Free

Power

Heavy homelab

$7.99

iPhone pairs with up to 10 gateways.

Everything in Solo
Pair iPhone with 10 gateways
VPS fleet, Raspberry Pis
One-time, forever
Meta Ray-Ban beta — first 500, later this year
In-app purchase — $7.99

Corporate

Small team

$24.99

iPhone pairs with up to 50 gateways.

Everything in Power
Pair iPhone with 50 gateways
Dev, QA, production fleets
Still no subscription
Meta Ray-Ban beta — first 500, later this year
In-app purchase — $24.99
Mac & Linux apps are free. Tiers are iPhone-side in-app purchases.
🕶️

Coming this year

Carapace on Meta Ray-Bans

The same vision + contextual awareness — minus the phone in your hand. Reserved for Power and Corporate tiers; TestFlight invites go out the moment Meta clears the SDK.

Install

Pick your device.
One brain across all of them.

Each guide is a 3-to-5 step walkthrough. Send the link to a friend — they'll handle the rest.

Response speed depends on your setup. A beefy Mac with a fast provider answers in 1-3 seconds; a $5 VPS with a reasoning model can take 60-180 seconds on a cold start. Pick hardware + provider with the latency you can live with — you control both.
🖥️

macOS

Sonoma and later

  • Menu bar app — always one click away
  • Guided setup wizard — installs OpenClaw + Tailscale
  • Tailscale integration for secure remote access
  • Project tracking, agent grid, cron monitoring
  • Free — unlimited devices, no login
Download v2.0.12 (5.5 MB)

Notarized by Apple. macOS 14 Sonoma+, Apple silicon native.

Full install guide →
🐧

Linux

Headless / VPS

  • One-liner install — paste and go
  • Runs on any VPS — or a Raspberry Pi
  • Always-on headless daemon
  • Free — no license required for headless
Two commands — Carapace layers on top of OpenClaw:
1 curl -fsSL https://openclaw.ai/install.sh | bash
2 curl -fsSL https://carapace.info/install.sh | bash
Non-destructive on existing OpenClaw setups — your chats, keys, and identity files are preserved.
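"Non-destructive" is still worth a belt-and-suspenders check before the second command runs. A minimal sketch — the `~/.openclaw` path is an assumption, not confirmed by the docs; point `SRC` at wherever your chats, keys, and identity files actually live:

```shell
# Back up existing OpenClaw state before layering Carapace on top.
# SRC defaults to ~/.openclaw -- an assumed location; adjust to taste.
SRC="${SRC:-$HOME/.openclaw}"
BACKUP="${SRC}-backup-$(date +%Y%m%d)"
if [ -d "$SRC" ]; then
  cp -a "$SRC" "$BACKUP"   # -a preserves permissions and timestamps
  echo "backed up $SRC -> $BACKUP"
fi
```

If anything goes sideways, copying the backup directory back into place restores the pre-install state.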
Full install guide →
📱

iOS

iPhone companion

  • Chat, voice, and photo input
  • Live camera vision — AI sees what you see
  • CarPlay + Bluetooth device support
  • Control any gateway from anywhere
App Store — Free
Full install guide →