# Next Session: ArUco Homing — Review & Takeaways

## What happened (Session 83)

**Goal**: Implement ArUco marker homing so Annie can tell the robot "go home" via Telegram.

**Result**: SUCCESS. 22 commits. Robot finds marker via camera pan sweep, approaches, stops at 0.5m, speaks "Arrived home!" through USB speaker. Full E2E verified from Telegram.

## Review Checklist

### 1. Replay the E2E
- [ ] Move robot ~2m from marker, at a different angle from last time
- [ ] Send "go home" via Telegram
- [ ] Observe: pan sweep search → acquire → approach → "Arrived home!" speech
- [ ] Record time from command to arrival

### 2. Document hardware findings (update MEMORY.md)
These were discovered empirically during session 83 E2E tuning:

**`_imu_turn` is broken for small angles on loaded Pi:**
- `asyncio.sleep(0.01)` at 100Hz polling actually sleeps 50-200ms due to event loop contention
- 5° requested → 37° achieved. 180° requested → 300°+ achieved
- Root cause: by the time first IMU poll fires, robot has already rotated past target
- Workaround: `_nudge_turn` uses fixed-duration motor pulse with `time.sleep` in executor thread
- Future fix: rewrite `_imu_turn` to use blocking poll loop in executor (not asyncio)

**Motor trim creates rotation asymmetry:**
- MOTOR_TRIM_FL/RL = 1.15 (15% boost on left motors for forward drift correction)
- During rotation: left turns weaker, right turns stronger
- Homing workaround: temporarily sets trim to 1.0 during nudge turns
- Future: consider separate trim for forward vs rotation

**Left turns have higher static friction breakaway:**
- Right: 0.10s pulse at speed 50 → ~7-15° rotation
- Left: 0.15s pulse at speed 50 → ~2-5° rotation (needs 50% longer pulse)
- Cause unclear: could be wheel contact, floor surface, or weight distribution
- Homing uses asymmetric pulse durations per direction
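A condensed sketch of how the trim and friction findings above translate into per-direction nudge parameters. Constant names are illustrative (not the real config); the values are the ones tuned in this session:

```python
# Values from session 83 tuning; names are illustrative, not the real config.
MOTOR_TRIM_FL = MOTOR_TRIM_RL = 1.15           # left boost for straight driving
NUDGE_PULSE_S = {"left": 0.15, "right": 0.10}  # left needs ~50% longer pulse
NUDGE_SPEED = 50                               # below this, static friction wins

def nudge_params(direction: str) -> tuple[float, float, float]:
    """Return (pulse_duration_s, speed, trim) for one calibrated nudge.

    Trim is forced to 1.0: the 15% left boost is tuned for straight
    driving and would skew rotation symmetry if left in place.
    """
    return NUDGE_PULSE_S[direction], NUDGE_SPEED, 1.0
```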

**Marker height vs camera height:**
- ArUco marker on table surface is ABOVE camera at close range (<40cm)
- Marker exits FOV as robot approaches
- TARGET_DISTANCE_M set to 0.50m (stops while marker still visible)
- Future: could tilt camera up during close approach
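The geometry can be checked with a quick pinhole sketch. The marker height above the camera axis and the vertical FOV below are assumptions for illustration, not measured values:

```python
import math

MARKER_HEIGHT_M = 0.10   # assumed: marker center ~10cm above camera axis
HALF_VFOV_DEG = 20.5     # assumed: ~41° vertical FOV, typical for a Pi camera

def marker_visible(distance_m: float) -> bool:
    """True if the marker's elevation angle still fits inside the FOV."""
    elevation_deg = math.degrees(math.atan2(MARKER_HEIGHT_M, distance_m))
    return elevation_deg <= HALF_VFOV_DEG

# At 0.50m the elevation is atan(0.10/0.50) ≈ 11.3° — comfortably visible.
# At 0.25m it is atan(0.10/0.25) ≈ 21.8° — outside a 20.5° half-FOV,
# consistent with the marker exiting the frame at close range.
```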

### 3. Document architecture evolution
The plan went through three design iterations during E2E:

**v1 (from plan)**: Body rotation for everything
- Search: rotate body 25° per step
- Center: rotate body to align with marker
- Result: marker lost on every turn at close range

**v2 (user insight)**: Camera-pan-first
- Search: pan servo sweeps 5 positions (~2s, no body movement)
- Center: no body rotation — just verify marker visible
- Approach: camera pan tracks marker, body nudges only when pan > 10° off center
- Result: much better — acquired and approached

**v3 (calibrated nudges)**: Fixed-duration pulses
- `_nudge_turn` replaces `_imu_turn` for all micro-corrections
- Trim disabled during rotation for symmetry
- Asymmetric pulse durations for L/R
- IMU measures achieved rotation after each pulse
- Result: E2E success
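The v2/v3 control flow can be summarized in a small decision function. This is a sketch only, with a hypothetical sign convention (positive pan = marker to the robot's left) and names that may not match the real state machine:

```python
PAN_NUDGE_THRESHOLD_DEG = 10.0  # body corrects only past this pan offset
TARGET_DISTANCE_M = 0.50        # stop while the marker is still visible

def approach_step(pan_angle_deg: float, distance_m: float) -> str:
    """Pick the next action: the pan servo tracks, the body only nudges
    when the marker drifts well off the camera's straight-ahead axis."""
    if distance_m <= TARGET_DISTANCE_M:
        return "arrive"
    if abs(pan_angle_deg) > PAN_NUDGE_THRESHOLD_DEG:
        # Fire one calibrated pulse, then re-measure with the IMU (v3 behavior)
        return "nudge_left" if pan_angle_deg > 0 else "nudge_right"
    return "drive_forward"
```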

### 4. Proper calibration TODO
- [ ] Run `scripts/calibrate_camera_intrinsics.py` with checkerboard (currently using approximate fx=fy=530)
- [ ] Re-measure actual TARGET_DISTANCE_M after calibration (distances will be more accurate)
- [ ] Consider dedicated `/nudge` endpoint for calibrated micro-turns (reusable beyond homing)
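For context on why the intrinsics matter: under the pinhole model, the estimated distance scales linearly with focal length, so the approximate fx=530 puts a proportional error on every distance reading. A sketch (the marker size is an assumed value):

```python
MARKER_SIZE_M = 0.10  # assumed physical marker edge length

def estimate_distance(pixel_width: float, fx: float = 530.0) -> float:
    """Pinhole estimate: distance ≈ fx * real_size / apparent_size."""
    return fx * MARKER_SIZE_M / pixel_width

# A 5% error in fx shifts every distance estimate — and therefore the
# effective TARGET_DISTANCE_M stop point — by the same 5%.
```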

### 5. Remaining improvements
- [ ] `_imu_turn` needs fundamental rewrite (blocking poll in executor, not asyncio)
- [ ] Tilt camera up during close approach (extend visible range)
- [ ] Use Annie's Kokoro TTS instead of espeak-ng for arrival announcement
- [ ] Replace approximate intrinsics with proper camera calibration
- [ ] Test from multiple starting positions (behind robot, far corner, etc.)

## Session 83 Commit History (22 commits)

```
1e19091 feat(homing): speak 'Arrived home!' via espeak-ng on Pi USB speaker
314543f feat(homing): two-tone beep on arrival, skip alignment spin
6d8723a fix(homing): skip 180° alignment — _imu_turn overshoots
efc87ed fix(homing): TARGET_DISTANCE_M 0.30→0.50 (marker above camera)
6a80335 fix(homing): asymmetric pulse — left 0.15s right 0.10s
322979c fix(homing): nudge speed 30→50 to overcome static friction
f2424fd fix(homing): use _drive_sync with trim=1.0 (correct formula)
ac6df44 fix(homing): trim-free rotation — bypass MOTOR_TRIM
ef9406a fix(homing): precise nudge timing via time.sleep in executor
42d4781 fix(homing): _nudge_turn — fixed pulse, bypasses asyncio polling
42ccea0 fix(homing): achieved_deg prediction + right turn speed 35
ec4c9e2 fix(homing): nudge 5° at speed 30 (3° didn't overcome friction)
0cd873a fix(homing): proportional body correction — loop decides
6d519ec fix(homing): search rotation 3° per step
29f2f94 fix(homing): micro rotations — 15° search, 3° approach
62f7169 fix(homing): cap body correction to 8° nudge
bc6aad8 fix(homing): no body rotation after acquisition — pan tracks
32712e7 fix(homing): camera-pan-first search and centering
0afb7c2 feat(annie): add go_home tool for ArUco homing via Telegram
269b5c3 feat(aruco): homing state machine, endpoints, 18 tests
dcf5825 feat(aruco): pure ArUco detection module with solvePnP
b69bb47 feat(aruco): /camera/v4l2_state + calibration script
d0125a6 fix(nav): is_timeout fix, stall threshold 3→6, reset on error
2951f97 refactor(nav): extract _imu_turn, fix C1, progressive search
```

## Key Takeaway

**The user's camera-pan-first insight was the breakthrough.** The original plan used body rotation for everything — the same approach that failed in sessions 76-77 (alternating search, marker loss on turns). Using the pan servo for search and tracking, with body nudges only when absolutely necessary, made the system work. The servo is instant, precise, and risk-free. The body is slow, imprecise, and destructive (loses visual lock). Let the camera do the seeing, the body do the driving.

## Start Command

```bash
cat docs/NEXT-SESSION-ARUCO-HOMING-REVIEW.md
```
