# Next Session: Robot Navigation V2 — Lidar Integration + SLAM

## What Was Done

### Session 42: RPLIDAR C1 Verified
- **SLAMTEC RPLIDAR C1** connected to Pi 5 at `/dev/ttyUSB1` (CP2102N USB-UART, 460800 baud)
- Model 65, firmware 1.1, hardware 18, serial `A03AE1F8C9E09CD1A7E79BF53EEF4D1D`
- **pyrplidar 0.1.2** installed (`pip3 install --break-system-packages`)
- **Both scan modes verified:**
  - Standard (mode 0): 5843 pts/3s, 4m range
  - **DenseBoost (mode 1): 6176 pts/3s, 10.24m range** — pyrplidar had byte-order bug in `PyRPlidarDenseCabin` (big-endian → little-endian), patched in-place at `/home/rajesh/.local/lib/python3.13/site-packages/pyrplidar_protocol.py`
- Room profile: 442mm–6529mm, ~514 pts/sector, 90%+ hit rate
- No firmware updates available (factory-embedded)
- No GPIO conflicts (USB only, ttyUSB0 = TurboPi motor controller CH340)

### Session 41: Navigation Infrastructure Deployed
- FrameGrabber (shared camera reader, V4L2 single-opener fix)
- HailoSafetyDaemon (YOLOv8s at 30 FPS on Hailo-8 NPU, auto-ESTOP <30cm)
- `_uart_lock` (threading.Lock protecting ALL Board UART access)
- `/obstacles` endpoint with Hailo NMS output parsing
- `navigate_robot` + `robot_obstacles` tools on the Annie side
- Combined multimodal vLLM call (halves GPU load)
- 120 tests (46 Pi + 74 Annie), all passing

**Key Hailo API learnings:**
- Must call `network_group.activate()` BEFORE `InferVStreams`
- NMS on-chip: output is list of per-class numpy arrays `[y1, x1, y2, x2, conf]`
- Use `yolov8s_h8.hef` (Hailo-8), not `yolov8s_h8l.hef` (Hailo-8L)
- Pipeline must be kept open for daemon lifetime

## What's Next

### Phase 1: Wire Lidar into Safety Daemon (HIGH PRIORITY)

**Goal**: Replace coarse bbox-distance heuristic with accurate lidar measurements.

**1a. LidarDaemon thread** (`services/turbopi-server/lidar.py`)
- New daemon thread (same pattern as HailoSafetyDaemon / FrameGrabber)
- DenseBoost mode (mode 1) for 10.24m range
- Buffer latest full 360° scan
- Aggregate into sectors (e.g., 36 sectors × 10° each)
- Thread-safe `get_full_scan()` → list of (angle_deg, distance_mm) + `get_sector_min(bearing_deg)` → nearest obstacle at bearing
- Motor PWM 660, flush serial buffer on start, handle disconnect gracefully

```python
# Skeleton
import threading
from pyrplidar import PyRPlidar

class LidarDaemon(threading.Thread):
    def __init__(self, port='/dev/ttyUSB1', baudrate=460800):
        super().__init__(daemon=True)
        self.port, self.baudrate = port, baudrate
        self._lock = threading.Lock()
        self._scan = []  # latest (angle_deg, distance_mm) pairs

    def run(self):
        lidar = PyRPlidar()
        lidar.connect(self.port, self.baudrate, timeout=3)
        lidar.set_motor_pwm(660)
        scan_gen = lidar.start_scan_express(1)  # DenseBoost
        for response in scan_gen():
            # accumulate into sector buffer under self._lock
            ...

    def get_sector_min(self, bearing_deg, cone_half=15) -> float:
        """Min distance (mm) in ±cone_half degrees around bearing."""

    def get_full_scan(self) -> list[tuple[float, float]]:
        """Latest 360° scan as (angle_deg, distance_mm) pairs."""
        with self._lock:
            return list(self._scan)
```

**1b. Fuse lidar into HailoSafetyDaemon**
- For each YOLO detection with `bearing_deg`, query `lidar.get_sector_min(bearing_deg)`
- Replace bbox-ratio distance estimate with lidar distance (when available)
- Fallback to bbox heuristic if lidar daemon unhealthy
- ESTOP threshold stays 30cm but now uses **real** distance
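
The fusion step in 1b could look roughly like this, a sketch only: the detection dict shape (`bearing_deg`, `distance_cm`) and the error-handling style are assumptions, not the final `safety.py` API.

```python
def fused_distance_cm(detection, lidar):
    """Prefer lidar distance at the detection's bearing; fall back to bbox.

    Returns (distance_cm, source) so the /obstacles endpoint can report
    distance_source as "lidar" or "bbox_estimate".
    """
    try:
        mm = lidar.get_sector_min(detection["bearing_deg"])
    except Exception:  # lidar daemon unhealthy / not running
        mm = None
    if mm is not None and mm != float('inf'):
        return mm / 10.0, "lidar"
    return detection["distance_cm"], "bbox_estimate"
```

The ESTOP comparison itself stays unchanged; only the distance feeding it gets more accurate.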

**1c. Enhanced `/obstacles` endpoint**
- Add `lidar_distance_mm` field to each detection
- Add `distance_source: "lidar" | "bbox_estimate"` field
- Keep existing `distance_cm` field for backwards compat
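
With all three fields, a detection entry might look like this (values and the `class`/`bearing_deg` fields are illustrative):

```
{
  "class": "person",
  "bearing_deg": 12,
  "distance_cm": 85,
  "lidar_distance_mm": 812,
  "distance_source": "lidar"
}
```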

**1d. New `/scan` endpoint**
```
GET /scan → {
  "sectors": [{"angle_start": 0, "angle_end": 10, "min_mm": 1234, "avg_mm": 2345, "points": 42}, ...],
  "raw_points": [...],  // optional, can be large
  "scan_age_ms": 50,
  "mode": "DenseBoost"
}
```
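
One way to compute the `sectors` array from a raw scan, sketched under the assumption that the daemon's `get_full_scan()` yields (angle_deg, distance_mm) pairs; field names mirror the endpoint spec above.

```python
def sectorize(points, n_sectors=36):
    """Aggregate (angle_deg, distance_mm) pairs into fixed-width sectors."""
    width = 360 / n_sectors
    sectors = []
    for i in range(n_sectors):
        lo, hi = i * width, (i + 1) * width
        # zero-distance returns are misses on the C1, so drop them
        dists = [d for a, d in points if lo <= a % 360 < hi and d > 0]
        sectors.append({
            "angle_start": lo, "angle_end": hi,
            "min_mm": min(dists) if dists else None,
            "avg_mm": sum(dists) / len(dists) if dists else None,
            "points": len(dists),
        })
    return sectors
```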

### Phase 2: Navigation Brain Upgrade

**Goal**: Give Titan's navigation brain lidar awareness.

**2a. Enhanced `_ask_nav_combined()`** in `robot_tools.py`
- Fetch `/scan` alongside `/photo` and `/obstacles` (parallel)
- Include sector summary in the LLM prompt: "Lidar: wall 0.5m at 90°, open path at 0° (3.2m), wall 1.0m at 180°"
- LLM can now make informed directional decisions instead of guessing from camera image
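
Turning sector data into that one-line summary could be as simple as the sketch below; it assumes sector dicts shaped like the `/scan` spec and orders nearest obstacles first, which is a design choice, not a requirement.

```python
def lidar_summary(sectors, max_entries=4):
    """e.g. 'Lidar: 0.5m at 90°, 3.2m at 0°' — nearest returns first."""
    hits = [(s["angle_start"], s["min_mm"]) for s in sectors if s["min_mm"]]
    hits.sort(key=lambda t: t[1])  # nearest obstacle first
    parts = [f"{mm / 1000:.1f}m at {int(a)}°" for a, mm in hits[:max_entries]]
    return "Lidar: " + ", ".join(parts) if parts else "Lidar: no returns"
```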

**2b. Obstacle-aware path selection**
- Before sending `forward`, check lidar forward cone (±15°) for clearance
- Suggest alternative directions if forward is blocked
- This is a soft check (Hailo ESTOP is still the hard safety layer)

### Phase 3: SLAM (Research + Prototype)

**Now viable with 360° lidar.** Research questions from session 41, updated:

| Question | Status | Notes |
|----------|--------|-------|
| Roomba algorithms (vSLAM vs LiDAR SLAM) | **ANSWERED** — LiDAR SLAM is now the path. GMapping, HectorSLAM, or BreezySLAM. | No ROS dependency with BreezySLAM (pure Python). |
| Semantic SLAM (text-based map) | Still worth exploring | Hybrid: lidar for geometry, LLM for semantic labels ("this is the kitchen") |
| Frontier exploration | **Much easier with lidar** | Detect openings (sectors with distance > threshold) as unexplored frontiers |
| Wheel odometry | Still needed | TurboPi has no encoders — estimate from speed × time, or visual odometry |
| ORB-SLAM3 on Pi | **Deprioritized** | Lidar SLAM is simpler and more robust than visual SLAM for 2D mapping |
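
The frontier idea from the table reduces to a filter over sector data; this sketch assumes the sector dict shape used elsewhere in this plan, and the 3 m "open" threshold is illustrative.

```python
def find_frontiers(sectors, open_mm=3000):
    """Bearings (sector start angles) that look like open, unexplored passages.

    A sector with no return at all (min_mm is None) also counts as open,
    since the C1 reports nothing beyond its max range.
    """
    return [s["angle_start"] for s in sectors
            if s["min_mm"] is None or s["min_mm"] > open_mm]
```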

**SLAM library candidates:**
- **BreezySLAM** (Python, no ROS): Simple 2D SLAM, accepts lidar scans directly. Best fit for our setup.
- **HectorSLAM** (ROS-based): High-quality 2D SLAM, but requires ROS install
- **GMapping** (ROS-based): Particle filter SLAM, good with odometry
- **RTAB-Map** (visual + lidar): Overkill for 2D but future-proof

### Phase 4: E2E Tests

1. Open Telegram → @her_os_bot
2. Send: "Annie, what obstacles do you see?" — should include lidar distances
3. Send: "Annie, explore the room" — multi-cycle with lidar-aware navigation
4. Test obstacle mid-navigation → verify ESTOP uses lidar distance (not bbox estimate)
5. Verify: `curl http://pi-car:8080/scan` returns valid sector data

### Phase 5: Remaining Fixes

- **Sonar I2C**: `sudo raspi-config nonint do_i2c 0 && sudo reboot` — may be redundant now that lidar provides accurate distance, but sonar gives precise frontal depth as backup
- **Battery UART**: Investigate `board.get_battery()` returning None
- **Distance calibration**: Place objects at known distances → compare lidar mm vs bbox estimate → quantify improvement
- **pyrplidar patch**: Consider forking pyrplidar or contributing the byte-order fix upstream (GitHub: Hyun-je/pyrplidar)

## RPLIDAR C1 Quick Reference

```python
import time
from pyrplidar import PyRPlidar

lidar = PyRPlidar()
lidar.connect(port='/dev/ttyUSB1', baudrate=460800, timeout=3)
lidar.set_motor_pwm(660)   # start spinning
time.sleep(2)              # motor spin-up

# Standard scan (mode 0, 4m range)
scan_gen = lidar.start_scan()

# DenseBoost scan (mode 1, 10.24m range) — USE THIS
scan_gen = lidar.start_scan_express(1)

for response in scan_gen():
    print(response.angle, response.distance)  # degrees, mm

lidar.set_motor_pwm(0)     # stop motor
lidar.stop()               # stop scanning
lidar.disconnect()
```

**Gotchas:**
- Baud rate MUST be 460800 (not default 115200)
- DenseBoost is mode **1**, not mode 2 (only 2 modes: 0=Standard, 1=DenseBoost)
- pyrplidar `PyRPlidarDenseCabin` byte-order bug: patched on Pi (big→little endian)
- Flush serial buffer before scan: `lidar.lidar_serial._serial.reset_input_buffer()`
- Motor needs ~2s spin-up before first valid scan
- `ttyUSB1` (lidar) vs `ttyUSB0` (TurboPi CH340) — check with `lsusb` if ports swap after reboot

## Files to Modify

### Pi (services/turbopi-server/)
- `lidar.py` — NEW: LidarDaemon thread
- `safety.py` — MODIFY: fuse lidar distance into Detection, fallback to bbox
- `main.py` — MODIFY: add `/scan` endpoint, wire LidarDaemon into startup/shutdown, add lidar health to `/health`
- `test_lidar.py` — NEW: LidarDaemon unit tests
- `test_safety.py` — MODIFY: tests for lidar-fused distance

### Annie (services/annie-voice/)
- `robot_tools.py` — MODIFY: fetch `/scan` in navigate loop, include in LLM prompt
- `tests/test_robot_tools.py` — MODIFY: tests for lidar-enhanced navigation

## Start Command

Read this file, then:
1. Implement LidarDaemon (Phase 1a)
2. Wire into safety daemon (Phase 1b)
3. Add endpoints (Phase 1c, 1d)
4. Test on Pi: `curl http://pi-car:8080/scan`
5. Update Annie navigation brain (Phase 2)
6. E2E test via Telegram (Phase 4)
