# Next Session: Annie Drives a TurboPi Robot Car

## What

Build end-to-end integration so Annie can control the TurboPi robot car via voice, Telegram, and text. Three components: (1) benchmark Gemma 4 E2B with thinking disabled on Pi 5, (2) build a FastAPI server on Pi wrapping the TurboPi SDK, (3) integrate as Annie tool with creature registration, dashboard events, and audit trail.

## Plan

`~/.claude/plans/keen-jumping-grove.md` — **adversarial-reviewed v2** (18 findings, all addressed).

Read the plan first — it has the full implementation, all review findings, and design decisions.

## Key Design Decisions (from adversarial review)

1. **Bearer token auth** on every Pi endpoint (except `/health`). `ROBOT_API_TOKEN` env var required on both Pi and Titan. The server refuses to start if the token is empty.
2. **ESTOP endpoint** (`POST /estop`) bypasses the async lock for an immediate motor stop. Sets a flag that blocks future moves until cleared.
3. **`run_in_executor`** for ALL blocking I/O — OpenCV `cap.read()`, SDK serial calls. Never block the event loop.
4. **`Literal` types** in `DriveRobotInput` — prevents LLM hallucinating invalid actions.
5. **Telegram-only initially** — `channels=("telegram",)`. Expand to voice/phone/whatsapp after safety validation.
6. **`"complete"` event type** — reuses existing `VALID_EVENT_TYPES` entry. Do NOT add `"tool_call"` (it's missing and would need chronicler.py change).
7. **ExecStartPre motor-zero** in systemd unit — stops motors before process restart so STM32 doesn't keep running.
8. **Don't touch `phoenix` creature** — pre-existing dual-definition conflict in chronicler.py vs observability.py. Only add `chariot` and `compass` to types.ts.
9. **Photo description via Ollama E2B on Pi** — simpler than vLLM vision integration. Already proven in session 36.
10. **Split httpx timeouts** — `connect=2.0s` (fast fail if Pi offline), `read=5.0s` (reasonable for motor commands).
11. **Rate limiting** — max 10 drive commands per minute. Prevents an LLM tool-call loop from turning into a runaway car.
12. **Creature name: `chariot`** — zone: acting, service: annie-voice, process: vehicle-control, accent: burnt orange `{r:220, g:120, b:40}`.
13. **Architecture is single-tier** (Titan brain → Pi actuator). NOT two-tier. Local E2B safety loop is future work.
14. **SDK path**: `sys.path.insert(0, "/home/pi/TurboPi")` at top of main.py + `Environment=PYTHONPATH=/home/pi/TurboPi` in systemd unit.
15. **OpenCV from apt** (`python3-opencv`), NOT pip. Use `--system-site-packages` or system Python. `opencv-python` has no aarch64 wheels for Python 3.13.

## Files to Create

| File | Purpose |
|------|---------|
| `services/turbopi-server/main.py` | Pi FastAPI server (~300 lines, auth + ESTOP + executor) |
| `services/turbopi-server/requirements.txt` | fastapi, uvicorn, httpx (opencv from apt) |
| `services/turbopi-server/turbopi-server.service` | systemd unit with ExecStartPre motor-zero |
| `services/annie-voice/robot_tools.py` | Annie-side HTTP client (~150 lines) |
| `services/annie-voice/tests/test_robot_tools.py` | ~40 tests |
| `docs/RESEARCH-ANNIE-ROBOT-CAR.md` | Architecture doc |

## Files to Modify

| File | Change |
|------|--------|
| `services/annie-voice/tool_schemas.py` | Add DriveRobotInput, RobotPhotoInput, RobotLookInput, RobotStatusInput (Literal types) |
| `services/annie-voice/text_llm.py` | ROBOT_ENABLED flag + 4 ToolSpec entries (gated, telegram-only) + imports |
| `services/annie-voice/observability.py` | Add `chariot` creature to _CREATURES dict |
| `services/annie-voice/bot.py` | FunctionSchema + register_function for voice mode |
| `services/context-engine/chronicler.py` | Add `chariot` to CREATURE_REGISTRY |
| `services/context-engine/dashboard/src/types.ts` | Add `chariot` + `compass` to CreatureId |
| `services/context-engine/dashboard/src/creatures/registry.ts` | Add chariot + compass ENTRIES + LAYOUT_MAP |
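On the Annie side, the `Literal`-constrained actions (decision 4) and the per-minute rate limit (decision 11) might look roughly like this stdlib-only sketch — the action set and class names are illustrative, not the plan's actual identifiers:

```python
import time
from typing import Literal, get_args

# Hypothetical action vocabulary; the Literal type is what stops the LLM
# from emitting an action the Pi server has no handler for.
DriveAction = Literal["forward", "backward", "left", "right", "stop"]

def validate_action(action: str) -> str:
    if action not in get_args(DriveAction):
        raise ValueError(f"invalid action: {action!r}")
    return action

class RateLimiter:
    """Max N calls per sliding window -- guards against tool-call loops."""

    def __init__(self, max_calls: int = 10, window_s: float = 60.0):
        self.max_calls = max_calls
        self.window_s = window_s
        self._calls: list[float] = []

    def allow(self) -> bool:
        now = time.monotonic()
        # drop timestamps that have aged out of the window
        self._calls = [t for t in self._calls if now - t < self.window_s]
        if len(self._calls) >= self.max_calls:
            return False
        self._calls.append(now)
        return True
```

In `tool_schemas.py` the same `Literal` would live on a pydantic input model, so invalid actions are rejected during schema validation before any HTTP call leaves Titan.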

## Start Command

```bash
cat ~/.claude/plans/keen-jumping-grove.md
```

Then implement the plan phase by phase (A → B → C → D → E → F). All adversarial findings are already addressed in it.

## Verification

1. **Benchmark:** SSH to the Pi, run A1-A5, and fill in the with-think vs. no-think comparison table
2. **Pi server health:** `ssh pi 'curl localhost:8080/health'`
3. **Pi auth test:** `ssh pi 'curl -X POST localhost:8080/drive -d "{}" -H "Content-Type: application/json"'` → 401
4. **Pi drive test:** Authenticated `POST /drive` → car moves for 0.5 s
5. **ESTOP test:** Send drive then `/estop` → car stops instantly
6. **Dead-man test:** Send drive, wait 15s → car auto-stops
7. **Annie test:** From Telegram: "Annie, drive the car forward" → car moves, event in dashboard
8. **Dashboard:** Chariot creature visible in acting zone
9. **Audit trail:** `GET /v1/events?creature=chariot` → timestamped events
10. **Tests:** `pytest services/annie-voice/tests/test_robot_tools.py` → all pass
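The dead-man behavior in step 6 can be modeled as a resettable countdown: each drive command re-arms the timer, and the motors stop if it ever expires. A stdlib sketch with a scaled-down timeout (class and callback names are hypothetical; the plan's actual timeout handling may differ):

```python
import asyncio

class DeadManTimer:
    """Stops the motors if no drive command arrives within `timeout_s`."""

    def __init__(self, timeout_s: float, stop_cb):
        self.timeout_s = timeout_s
        self.stop_cb = stop_cb
        self._task: asyncio.Task | None = None

    def kick(self) -> None:
        # each drive command cancels the pending countdown and starts a new one
        if self._task is not None:
            self._task.cancel()
        self._task = asyncio.create_task(self._countdown())

    async def _countdown(self) -> None:
        await asyncio.sleep(self.timeout_s)
        self.stop_cb()  # timeout elapsed with no new command: stop

stopped = []

async def demo():
    timer = DeadManTimer(0.05, lambda: stopped.append(True))
    timer.kick()              # simulate a drive command
    await asyncio.sleep(0.02)
    timer.kick()              # a second command resets the countdown
    await asyncio.sleep(0.1)  # silence -> auto-stop fires exactly once

asyncio.run(demo())
```

The production timeout is 15 s rather than 0.05 s; the invariant to verify is that the stop callback fires once after the last command, never mid-stream.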
