We're just helping you feed it the data it's craving.
Mac, VPS, or Raspberry Pi. Free on unlimited gateways. A friend can be onboarded in 5 minutes.
curl -fsSL https://openclaw.ai/install.sh | bash
curl -fsSL https://carapace.info/install.sh | bash
Why CARAPACE
Context tracking means you can switch between your VPS and your Mac and the AI doesn't get lost. It stays on task, tracks projects, remembers decisions. Work on your server, review on your laptop — same brain, zero re-explaining.
No monthly AI subscription. No enterprise per-seat tier. A VPS, a Raspberry Pi, an old Mac Mini in a closet — whatever hardware you already trust. A free Google API key gets you web search, browser control, code execution, cron jobs, and multi-agent orchestration.
Point your phone at anything and talk. Tap to peel a subject off the scene as a focus sticker. Sweep an area in scan mode to give your AI a panorama. The mic is listening too — Apple's on-device classifier picks up music, water running, your dog barking, and feeds those signals into the conversation. All perception runs on the iPhone; only the labels reach your gateway.
Not a substitute for professional medical, legal, or financial advice. Always verify what the AI says before acting on it.
Power users get headless Linux on a VPS with SSH and cron jobs. Your parents get a Mac app with a guided setup wizard that installs everything automatically. Same platform, different entry points.
This isn't a chat wrapper. It's a full agent runtime with real capabilities.
Vision Mode
Camera + microphone + location, fused into one tap. Your iPhone does the seeing, hearing, and locating; only the distilled labels reach your gateway.
Tap the camera, speak naturally. The frame and the question ship together — no shutter, no upload, no waiting.
Tap a subject in frame to peel it off as a labeled sticker. Your AI sees the original scene plus a labeled cutout — "what's the dosage on this?" actually means this.
Tap SCAN, sweep your phone across a fridge, a shelf, a workspace. Your AI gets a temporal contact-sheet of the area — perfect for inventory questions, "what's missing?", or panoramas.
Apple's on-device sound classifier feeds music, water running, dog barking, keyboard typing — ~300 categories — into the conversation. Audio never leaves the phone; only the labels do.
A reverse-geocoded place name ("San Clemente, CA") rides with every turn so questions like "where am I" or "what's the nearest…" answer themselves without an extra round-trip.
Wake-word listening means you can leave the camera open without holding a button. Whatever you say in vision, voice, or chat threads into one shared session — no context loss when you switch surfaces.
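The "only labels leave the phone" idea above can be sketched as a turn envelope that carries distilled signals instead of media. This is a minimal illustration, not CARAPACE's actual wire format — every field name here (`transcript`, `sound_labels`, `place`, `stickers`) is an assumption.

```python
# Hypothetical sketch of a label-only turn envelope. Raw audio and frames
# stay on the phone; only classifier labels, a place name, and sticker
# labels ride along with the user's words. Field names are illustrative.

def build_turn_envelope(transcript, sound_events, place, stickers,
                        min_confidence=0.5):
    """Assemble the distilled per-turn payload sent to the gateway."""
    labels = [
        {"label": name, "confidence": round(conf, 2)}
        for name, conf in sound_events
        if conf >= min_confidence          # drop low-confidence classifier hits
    ]
    return {
        "transcript": transcript,          # what the user said
        "sound_labels": labels,            # e.g. from an on-device classifier
        "place": place,                    # reverse-geocoded, e.g. "San Clemente, CA"
        "stickers": stickers,              # labels of tapped subjects, not pixels
    }

envelope = build_turn_envelope(
    transcript="what's the dosage on this?",
    sound_events=[("water_running", 0.91), ("speech", 0.32)],
    place="San Clemente, CA",
    stickers=["medicine bottle"],
)
```

Note there is deliberately no `audio` or `frame` key anywhere in the envelope — that absence is the privacy property the section describes.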
Cognition
Most "AI memory" is a vector database of past chats with cosine similarity slapped on top. Yours runs deeper — a layered cognitive model that takes what your iPhone perceives, files it the way a brain would, and surfaces the right context the moment it's relevant.
Episodic memory of specific moments — the wine you photographed in Sonoma, the rash you asked about Tuesday morning. Retrieved by time, place, or topic.
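Retrieval "by time, place, or topic" can be pictured as filters over a log of episodes. A minimal sketch, assuming a flat episode store — the real memory layer is internal to CARAPACE and certainly richer than this:

```python
from dataclasses import dataclass, field
from datetime import datetime

# Illustrative episodic store: each episode is anchored to a time, a place,
# and a set of topics, and recall composes any combination of those filters.

@dataclass
class Episode:
    when: datetime
    place: str
    topics: set = field(default_factory=set)
    note: str = ""

def recall(episodes, *, place=None, topic=None, after=None):
    """Return episodes matching every filter that was supplied."""
    hits = episodes
    if place is not None:
        hits = [e for e in hits if e.place == place]
    if topic is not None:
        hits = [e for e in hits if topic in e.topics]
    if after is not None:
        hits = [e for e in hits if e.when >= after]
    return hits

log = [
    Episode(datetime(2025, 6, 14, 17, 0), "Sonoma, CA", {"wine"},
            "photographed a Zinfandel"),
    Episode(datetime(2025, 6, 17, 8, 30), "San Clemente, CA", {"health"},
            "asked about a rash"),
]
```

So `recall(log, topic="wine")` surfaces the Sonoma bottle and `recall(log, place="San Clemente, CA")` the Tuesday-morning question, without either query naming a date.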
Learns the shape of rooms, blocks, and routes you visit. Knows the difference between "your kitchen" and "a kitchen" without being told.
Tags emotional weight. Notices when you correct an answer or push back on a take, and quietly weights future recall so the same mistake doesn't repeat.
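The correction-weighting idea can be sketched as a salience score that gets bumped whenever the user pushes back, then used to rank future recall. The boost size and cap are illustrative assumptions, not CARAPACE's tuning:

```python
# Sketch of correction-driven recall weighting: a pushback on a topic raises
# that topic's salience so related memories rank higher on the next turn.

def apply_correction(weights, topic, boost=0.25, cap=2.0):
    """Bump a topic's recall weight after the user corrects an answer."""
    weights[topic] = min(weights.get(topic, 1.0) + boost, cap)
    return weights

def rank_topics(weights, candidates):
    """Order candidate topics by salience, highest first (default weight 1.0)."""
    return sorted(candidates, key=lambda t: weights.get(t, 1.0), reverse=True)

w = {}
apply_correction(w, "medication dosage")   # user corrected a dosage answer
order = rank_topics(w, ["recipes", "medication dosage"])
```

After one correction, "medication dosage" outranks an untouched topic — the quiet re-weighting the blurb describes.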
Consolidates raw experience into stable schemas in the background — like sleep does for you. The bridge gets quietly smarter between turns, not just during them.
Every chat and voice turn flows in here. Not just the message — the cadence, the corrections, the things you didn't have to say twice.
Per-turn relay. Pulls only the memories that actually matter for what you just asked, packs them into the prompt, and gets out of the way. Nothing irrelevant ever reaches your AI provider.
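The relay's "pull only what matters, pack it, get out of the way" loop can be sketched as relevance-ranked selection under a token budget. Scoring here is naive word overlap purely for illustration; the budget, threshold, and token counting are all stand-in assumptions:

```python
# Sketch of per-turn context packing: rank stored snippets against the new
# query, then admit only relevant ones until the budget is spent.

def pack_context(query, memories, budget=50, min_score=2):
    """Select high-overlap memories whose combined length fits the budget."""
    q = set(query.lower().split())

    def score(m):
        return len(q & set(m.lower().split()))   # crude word-overlap relevance

    picked, used = [], 0
    for m in sorted(memories, key=score, reverse=True):
        if score(m) < min_score:
            break                                # nothing relevant left
        cost = len(m.split())                    # stand-in for a token count
        if used + cost <= budget:
            picked.append(m)
            used += cost
    return picked

mems = [
    "the water heater pilot light is on the lower-left panel",
    "favorite pizza topping is mushrooms",
]
ctx = pack_context("where is the water heater pilot light", mems)
```

The pizza memory never reaches the prompt: it scores below the relevance floor, so the provider only ever sees what the question actually needs.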
Ask "where did I see that?" three weeks later and get an actual answer. The bridge gets sharper at recognizing your patterns the longer you run it.
Replies are anchored in where you were, what you saw, and what you've cared about — not just the last few messages in the buffer.
Every memory lives on your bridge. There's no "shared brain" across users, no cloud index, nothing for us to subpoena or accidentally leak. Single-tenant by construction.
Built and refined over thousands of real-world turns. Toggle in Settings → Rumination on iPhone — off the moment you want vanilla AI back.
Install
Each guide is a 3-to-5 step walkthrough. Send the link to a friend — they'll handle the rest.
Ventura and later
Headless / VPS
1. curl -fsSL https://openclaw.ai/install.sh | bash
2. curl -fsSL https://carapace.info/install.sh | bash
iPhone companion
What's under the shell
Real project tracking. Live agent orchestration. Status, in your pocket.
Force-press to dive in
Every project your agents are touching, with live progress and per-workstream status. Force-press any project to drill into workstreams — granular progress, active blockers, who touched what last.
Live view of your running agents — main orchestrator, subagents, what they're chewing on. When something breaks, you see it. When something finishes, you know.
Force-press to open payload
Every scheduled job in one list — next-run ETAs, last-run results, RED / YELLOW / GREEN health at a glance. Force-press a job to open its payload, edit the schedule, or fire it now.
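The RED / YELLOW / GREEN glance can be sketched as a classification over a job's recent run results. The thresholds below are assumptions for illustration, not CARAPACE's actual health rules:

```python
# Sketch of job-health classification from recent run results (oldest first).
# RED: last run failed. YELLOW: passing now but flaky recently, or no
# history yet. GREEN: clean recent history.

def job_health(recent_results):
    """Map a list of run outcomes ("ok" / anything else) to a health color."""
    if not recent_results:
        return "YELLOW"                     # never run: needs attention, not alarm
    if recent_results[-1] != "ok":
        return "RED"                        # most recent run failed
    if any(r != "ok" for r in recent_results[:-1]):
        return "YELLOW"                     # currently passing but flaky
    return "GREEN"
```

A dashboard would run this per job next to its next-run ETA; force-press then drills into the payload of whichever row isn't green.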
Meta Ray-Ban glasses + CARAPACE. Hands-free voice, live vision, every direction you look — all routed through your own gateway.
Glasses support is targeted for later this year; the first 500 CARAPACE supporters on Power or Corporate will be first in line for TestFlight access — no waitlist, no extra cost. It is pending Meta opening third-party SDK access for Ray-Ban: we have no partnership with Meta and can't guarantee Meta will open that SDK on any specific timeline. The Power / Corporate IAPs remain fully valuable on their own (gateway pairing + device cap) regardless of whether glasses support ships.
What you can do with it
Voice + camera + your own AI's memory. Here's what people actually do with it once it's set up.
Open the fridge, ask. Carapace scans what's there, cross-references your pantry log, and names three things you can cook right now — not 40-minute recipes that assume you have saffron.
Walk the thrift-store aisle with Carapace live. Point at anything interesting — instant eBay sold-listing average, Mercari range, resale spread. "That midcentury lamp? $12 here, $180 sold last week."
Walk your shop or warehouse naming items out loud. Your Mac keeps the spreadsheet live, flags low stock, and drafts reorder emails when a SKU dips below threshold. No clipboards.
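The "flags low stock, drafts reorder emails when a SKU dips below threshold" step is a simple check over the live counts. A minimal sketch — the inventory shape, SKU names, and thresholds are all illustrative:

```python
# Sketch of the low-stock pass behind the warehouse walkthrough: compare
# live counts against per-SKU reorder thresholds and draft a line per hit.

def low_stock(inventory, thresholds):
    """Return SKUs whose count has dipped below their reorder threshold."""
    return sorted(
        sku for sku, count in inventory.items()
        if count < thresholds.get(sku, 0)   # no threshold set -> never flagged
    )

def draft_reorder(skus):
    """Draft one reorder line per flagged SKU (email body left to the agent)."""
    return [f"Reorder needed: {sku}" for sku in skus]

inv = {"WIDGET-12": 3, "BOLT-99": 40}
thr = {"WIDGET-12": 5, "BOLT-99": 25}
flags = low_stock(inv, thr)
```

Spoken counts update `inv` as you walk; the agent runs the check after each change, so the reorder draft exists before you reach the loading dock.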
Landmark in a new city. Plant at the nursery. Error code on a dishwasher. Mushroom in the woods. Engine noise you can hear but can't describe. Point, ask, get a real answer with sources.
Point at the leak. Carapace walks you through the fix while watching your progress. It remembers your house — "that's the same water heater we looked at in March; the pilot light's on the lower-left panel."
Menu abroad, product label, traffic sign. Translation with context: "this is a regional dialect for…," "this dish is usually served…," "this warning means stop within 50m." Beyond flat Google Translate.
Six examples out of endless possibilities. Your AI knows what you look at, what you said yesterday, and what's in your tools. Build whatever routine fits your life — it's already in the box.
Pricing
Run CARAPACE on as many Macs, VPSs, and Raspberry Pis as you want — zero cost. The iOS tiers below unlock how many of those gateways a single iPhone can connect to. One-time purchase, no subscriptions.
Free forever
iPhone pairs with 1 gateway. Mac & Linux always unlimited.
Friends & family
iPhone pairs with up to 5 gateways.
Heavy homelab
iPhone pairs with up to 10 gateways.
Small team
iPhone pairs with up to 50 gateways.
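The tier logic above reduces to a per-iPhone pairing cap. A sketch of that check — the tier keys are shorthand assumptions; only the caps (1 / 5 / 10 / 50) come from the table:

```python
# Per-tier gateway pairing caps from the pricing table. Gateways themselves
# are always free and unlimited; the cap only limits how many one iPhone
# can pair with. Tier key names are illustrative shorthand.

GATEWAY_CAPS = {
    "free": 1,        # Free forever
    "friends": 5,     # Friends & family
    "homelab": 10,    # Heavy homelab
    "team": 50,       # Small team
}

def can_pair(tier, currently_paired):
    """True if this iPhone may pair one more gateway on its tier."""
    return currently_paired < GATEWAY_CAPS[tier]
```

One-time purchase means the cap is a static lookup, not a metered subscription check.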
Pick your flavor. Each install guide is a short picture-by-picture walkthrough you can send to a non-technical friend.
1. curl -fsSL https://openclaw.ai/install.sh | bash
2. curl -fsSL https://carapace.info/install.sh | bash