# Next Session: Hailo-8 L1 Safety Reflex Activation

**Session 119 outcome:** v3 perspectives doc produced with DeepStream/Hailo-8/Orin NX findings. Three idle compute tiers surfaced: Hailo-8 on Pi 5 (26 TOPS, unused), Beast (2nd DGX Spark, always-on idle), Orin NX 16GB (owned, reserved for future robot). Across 26 lenses, **Hailo-8 activation was the single most convergent recommendation** — flagged as highest-leverage × lowest-risk in ≥15 lenses and as Synthesis convergence #7.

**This session's goal:** Get YOLOv8n running on Hailo-8 on Pi 5 at 430 FPS locally (zero WiFi dependency), wired into Annie's nav loop as an L1 safety reflex below the existing lidar ESTOP. Eliminates the WiFi cliff edge for the safety path per Lens 04 v3 reframe.

## Read First (context)

| File | Purpose |
|------|---------|
| `docs/RESEARCH-DEEPSTREAM-NAV.md` | Why Hailo-8 is highest-leverage (Recommendations section) |
| `docs/RESOURCE-REGISTRY.md` "Strategic Architecture" section | Tier roles, 4-tier → 5-tier transition |
| `docs/perspectives-vlm-primary-hybrid-nav-v3.html` Lens 04 + Synthesis #7 | Cliff-edge reframe + convergence analysis |
| MEMORY.md session 119 entry | Corrections + hardware inventory + skill install |
| [arXiv 2601.21506](https://arxiv.org/abs/2601.21506) (IROS dual-process paper) | Validates pattern: 66% latency reduction, 67.5% vs 5.83% success |

## Hardware Assumptions

- **Pi 5 on robot** at `192.168.68.61`, 16 GB RAM, aarch64, Debian 13 (Trixie)
- **Hailo-8 AI HAT+** attached via PCIe — 26 TOPS NPU, currently idle for navigation
- **Existing nav loop**: `services/turbopi-server/main.py` (FastAPI) wraps motor / lidar / IMU / camera. The NavController (not yet in the repo, per v3 findings) would be the integration point

## Suggested Phases

### Phase 1 — HailoRT + TAPPAS install on Pi 5
- Follow [github.com/hailo-ai/hailo-rpi5-examples](https://github.com/hailo-ai/hailo-rpi5-examples)
- Install `hailo-all` apt package + TAPPAS GStreamer pipeline
- Verify `hailortcli fw-control identify` returns firmware version
- Run a "hello world" smoke test: the YOLOv8n demo from the Hailo repo on a USB camera
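Before installing anything, the physical-presence check can be scripted. A minimal sketch (assuming the Hailo-8 shows up in `lspci` output with "Hailo" in its device string, which is typical but may vary by lspci version; the function names are illustrative):

```python
import subprocess

def hailo_device_present(lspci_output: str) -> bool:
    """Return True if any lspci line mentions a Hailo device.

    A healthy AI HAT+ typically appears as a 'Co-processor:
    Hailo Technologies Ltd ...' PCIe entry.
    """
    return any("hailo" in line.lower() for line in lspci_output.splitlines())

def check_pi_for_hailo() -> bool:
    """Run lspci locally and scan its output (run this on the Pi itself)."""
    out = subprocess.run(["lspci"], capture_output=True, text=True).stdout
    return hailo_device_present(out)
```

If this passes but `hailortcli fw-control identify` fails, the device is visible on the bus but the driver/firmware layer needs attention.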

### Phase 2 — Compile or fetch YOLOv8n .hef
- Prefer the prebuilt `.hef` from Hailo Model Zoo (saves DFC conversion step)
- If custom classes are needed later: use the Hailo Dataflow Compiler (DFC) on a separate dev box (not the Pi; the DFC requires x86_64)
- Target input: 640×640, COCO 80 classes — obstacle-detection-useful classes: person, chair, bottle, bowl, couch, potted plant, bed, dining table, toilet, tv, laptop, cell phone, book, backpack
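The class allowlist above can live as a small filter helper in the sidecar. A sketch (the detection tuple shape `(class_id, bbox, confidence)` and the `class_names` mapping are assumptions for illustration, not the Hailo API):

```python
# Obstacle-relevant COCO classes from the target list above.
OBSTACLE_CLASSES = {
    "person", "chair", "bottle", "bowl", "couch", "potted plant",
    "bed", "dining table", "toilet", "tv", "laptop", "cell phone",
    "book", "backpack",
}

def filter_obstacles(detections, class_names, min_confidence=0.5):
    """Keep only obstacle-class detections above a confidence floor.

    detections: list of (class_id, bbox, confidence) tuples (illustrative shape)
    class_names: dict mapping class_id -> COCO label string
    """
    return [
        d for d in detections
        if class_names[d[0]] in OBSTACLE_CLASSES and d[2] >= min_confidence
    ]
```

Filtering by label rather than hard-coded class IDs avoids baking in an index ordering that may differ between model zoo exports.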

### Phase 3 — Sidecar service on Pi
- New file: `services/turbopi-server/hailo_detect.py` (or similar)
- Use the HailoRT Python API rather than the TAPPAS GStreamer pipeline (simpler for a sidecar)
- Subscribe to camera frames (reuse existing MJPEG/V4L2 path)
- Expose detection stream over local socket or HTTP on `localhost:11437` (port `:11436` is panda_nav sidecar)
- Output: `{class_id, bbox_px, confidence, frame_timestamp}` records at 30+ Hz
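The record format above can be pinned down as a dataclass plus a wire encoding. A sketch, assuming newline-delimited JSON as the framing for the local socket/HTTP stream (the field names follow the output spec above; the framing choice is an assumption to be settled when the sidecar is written):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Detection:
    """One detection record as emitted by the sidecar."""
    class_id: int
    bbox_px: tuple        # (x_min, y_min, x_max, y_max) in pixels
    confidence: float
    frame_timestamp: float  # capture time, seconds

def to_wire(detections) -> str:
    """Serialize one frame's detections as a single JSON line.

    One line per frame keeps the consumer's parse loop trivial at 30+ Hz.
    """
    return json.dumps([asdict(d) for d in detections])
```

The nav-side consumer can `json.loads` each line and feed the records straight into the L1 gate.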

### Phase 4 — Integrate into nav as L1 safety gate
- **Additive, not replacement**: L1 Hailo joins lidar ESTOP, doesn't replace it
- Fusion rule: "obstacle if (Hailo detects AND bbox_pixel_height > N) OR (lidar shows <200mm in forward sector)"
- Hailo gives richer information (class + pixel position) than lidar (just distance)
- Ensure L1 Hailo decision path has **zero WiFi dependency** — purely local Pi inference → local motor command
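The fusion rule above is small enough to write out directly. A sketch (the 80 px bbox-height threshold is a placeholder to be tuned on the robot; the detection dict shape is illustrative):

```python
def l1_obstacle_gate(hailo_detections, lidar_forward_mm,
                     min_bbox_height_px=80, lidar_stop_mm=200):
    """L1 safety gate: fire if EITHER sensor says stop.

    - Hailo: any detection whose bbox is tall enough in pixels
      (a crude proximity proxy; threshold is a placeholder).
    - Lidar: anything closer than lidar_stop_mm in the forward sector.

    hailo_detections: list of dicts with a 'bbox_px'
    (x_min, y_min, x_max, y_max) key (illustrative shape).
    lidar_forward_mm: closest forward range in mm, or None if no return.
    """
    hailo_close = any(
        (d["bbox_px"][3] - d["bbox_px"][1]) > min_bbox_height_px
        for d in hailo_detections
    )
    lidar_close = (lidar_forward_mm is not None
                   and lidar_forward_mm < lidar_stop_mm)
    return hailo_close or lidar_close
```

Because the rule is a pure OR, the lidar ESTOP path is preserved bit-for-bit: removing the Hailo term degrades back to today's behavior, which is the "additive, not replacement" property.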

### Phase 5 — Verification
- **Golden path test**: drop a backpack in Annie's corridor during a nav run. Expected: Hailo detects "backpack" at >0.5 confidence, L1 gate triggers, Annie stops/veers before lidar ESTOP would have fired
- **WiFi-drop test**: simulate WiFi outage (block panda IP on Pi). Expected: Panda VLM unreachable, but Hailo keeps detecting obstacles and Annie keeps moving safely — "graceful wander, not 2-second freeze" per Lens 20 v3
- **Panda VRAM headroom check**: measure Panda VRAM usage before and after removing obstacle-detection queries from the Panda VLM multi-query. Expected: ~800 MB freed (unblocks SigLIP Phase 2d per Lens 06)
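The before/after VRAM measurement can be scripted against `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits`, which prints one integer (MiB used) per GPU line. A sketch of the parse and comparison (function names are illustrative):

```python
def vram_used_mb(nvidia_smi_csv: str) -> int:
    """Sum MiB used across GPUs from nvidia-smi's
    --query-gpu=memory.used --format=csv,noheader,nounits output."""
    return sum(
        int(line.strip())
        for line in nvidia_smi_csv.splitlines()
        if line.strip()
    )

def headroom_freed_mb(before_mb: int, after_mb: int) -> int:
    """MiB freed after dropping obstacle queries; compare to ~800 MB claim."""
    return before_mb - after_mb
```

Run the query once with the current multi-query in place, once after removing the obstacle-detection queries, and diff.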

## Non-goals (Session Discipline)

- **Do NOT touch Panda VLM nav** — it works; keep it as-is as L2 semantic layer
- **Do NOT start DeepStream on Beast yet** — separate next-next session
- **Do NOT set up Orin NX carrier board** — reserved for when future robot hardware arrives

## Also Pending (Lower Priority, Surface If Time)

- **Narrative pages audit** per CLAUDE.md post-research workflow:
  - `docs/day-in-life-annie.html` and `docs/day-in-life-rajesh.html`
  - 7:30 AM WiFi-hiccup scene needs post-Hailo reframing (graceful wander, not freeze)
  - **Bidirectional scene sync required** — any update to Annie's scene must sync to Rajesh's
  - Lens 20 v3 already reframed this in the perspectives doc; narrative pages must match

## Tools Available

- **Hailo Python API** (`hailo_platform` module via `pip install hailo-platform`)
- **start.sh / stop.sh** are laptop-local per user feedback memory — SSH to Pi from laptop shell

## Kickoff Instruction for Next Session's Claude

> Read `docs/RESEARCH-DEEPSTREAM-NAV.md` recommendations section and `docs/perspectives-vlm-primary-hybrid-nav-v3.html` Lens 04 + Synthesis #7 to confirm the architectural rationale. Then outline Phase 1 concrete steps (HailoRT install commands on Pi 5, verification command, fallback if firmware update needed). Before installing anything, confirm the Hailo-8 is physically attached and healthy via `lspci` or `hailortcli fw-control identify`.

## Success Criteria

- [ ] Hailo-8 identified and healthy on Pi 5 (firmware query succeeds)
- [ ] YOLOv8n `.hef` loaded and running at ≥30 FPS locally (should easily hit 430+ FPS per benchmarks)
- [ ] Detection sidecar exposes obstacles over local socket/HTTP
- [ ] Nav loop integrates L1 gate additively (lidar ESTOP preserved)
- [ ] WiFi-drop test: Annie keeps moving safely, no 2-second freeze
- [ ] Panda VRAM measurement before/after to validate ~800 MB freed claim
