★ PROJECT HEXAPOD ★ INTERNAL
Project plan.
ACTIVE WORK · DECISIONS · HISTORY
▸Where we are
Currently shipping V1 — get the hexapod walking on USB serial in about 12 days. After that comes V2: the autonomous-creature build on a Raspberry Pi 5 brain that texts you when something happens. The docs below capture both.
01 · Documents
V1 · ACTIVE PLAN
Walking build plan (12 days)
The Field-Manual roadmap to V1: 22 tasks, day-by-day, from BOM inventory through the 1-metre walk acceptance. ESP32 controller, USB serial, tripod gait. Open this every evening of the build.
Open document →
DECISION · V2 CONTROLLER
Raspberry Pi 5 + AI HAT vs Arduino UNO Q
Comparison of the two long-term contenders for the V2 brain (autonomous creature with personality). Spec sheet, local-LLM ceiling, vision capability, and a clear recommendation: Pi 5 + ESP32 sidecar.
Open document →
DECISION · HISTORICAL
Arduino UNO Q vs the original ESP32 plan
The first dual-brain comparison: UNO Q's integrated Cortex-A53 + STM32 vs the original ESP32-only architecture. Superseded by the Pi 5 decision, kept for reference.
Open document →
02 · The arc, in one screen
V1 · 12 days
ESP32 dev-kit-v1 drives 2× PCA9685, 18× MG996R servos. 2S 3300 mAh LiPo → 30 A fuse → buck @ 6.0 V. Tripod gait. USB serial control: F/B/L/R/S. LVC e-stop. Acceptance test: 1 metre forward in under 30 seconds.
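For the V1 scheme above, the control surface is small enough to prototype on the host before touching firmware. A minimal sketch, assuming leg numbering and function names not fixed anywhere yet (the real code runs on the ESP32): the five serial bytes map to gait commands, and a tripod gait simply alternates two groups of three legs every half period.

```python
# Host-side sketch of the V1 control scheme. Leg numbers 0-5 and the
# names below are assumptions for illustration, not the firmware API.
# Tripod A = legs {0, 2, 4}, tripod B = legs {1, 3, 5}.

COMMANDS = {"F": "forward", "B": "backward",
            "L": "turn_left", "R": "turn_right", "S": "stop"}

def parse_command(byte: str) -> str:
    """Map one serial byte to a gait command; anything unknown stops."""
    return COMMANDS.get(byte.upper(), "stop")

def swing_tripod(t: float, period: float = 1.0) -> tuple:
    """Legs in swing phase at time t: the two tripods alternate
    every half gait period."""
    phase = (t % period) / period
    return (0, 2, 4) if phase < 0.5 else (1, 3, 5)
```

Defaulting unknown bytes to "stop" keeps line noise on the USB link from driving the robot, which is the same fail-safe instinct as the LVC e-stop.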
V2 · 4–6 months
Add a Raspberry Pi 5 (16 GB) on top. The ESP32 becomes the spinal cord; the Pi does vision, planning, memory, and an LLM persona. The robot wanders the house, recognises rooms, has a mood state, and texts you on Telegram when something happens — a sock under the couch, the back door open, "I think I saw the cat".
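The spinal-cord split implies a Pi → ESP32 link carrying the same F/B/L/R/S vocabulary as V1. Nothing is settled yet, so the frame layout, byte values, and names below are pure assumptions — a sketch of the kind of tiny checksummed framing that would let the ESP32 reject garbage on the UART:

```python
# Illustrative Pi -> ESP32 "spinal cord" framing (hypothetical; the
# frame layout and START byte are assumptions, not a decided protocol).
# Each frame is [START, command_byte, checksum].

START = 0xAA  # assumed start-of-frame marker

def encode_frame(cmd: str) -> bytes:
    """Wrap one command character in a 3-byte checksummed frame."""
    c = ord(cmd)
    return bytes([START, c, (START + c) & 0xFF])

def decode_frame(frame: bytes):
    """Return the command char, or None if the frame is malformed."""
    if len(frame) != 3 or frame[0] != START:
        return None
    if (frame[0] + frame[1]) & 0xFF != frame[2]:
        return None
    return chr(frame[1])
```

The point of the checksum is that the ESP32 side stays dumb and safe: a bad frame decodes to None and the gait falls back to stop, so a crashed or rebooting Pi can never wedge the legs mid-stride.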
FAR FUTURE
Better gaits (ripple, wave). Charging dock. Visual SLAM. AI HAT for heavier on-board vision. Voice commands. A second one so they can keep each other company.