A small ground robot I use to explore end-to-end robotics: from motor control and embedded sensing on Arduino, up to mapping and visualization on a Raspberry Pi. The goal is a clean stack I can iterate on quickly: simple, observable, and hackable.
Hardware
- Locomotion: DC motors via TB6612 driver
- Sensing: scanning HC-SR04 ultrasonic rangefinder on a micro-servo, two HW-201 IR obstacle sensors, and an MPU6050 IMU for yaw
- Compute: Arduino (low-level loop) + Raspberry Pi (Python, mapping & dashboard)
Behavior
The Arduino runs a tiny state machine for obstacle avoidance (RUN / EMERG_BACK / EMERG_TURN) and streams compact STAT telemetry at 10 Hz. The Pi listens, estimates a rough pose from PWM and IMU yaw, processes LiDAR scans, and maintains a rolling occupancy grid: a crisp "what's around me now" map that is ideal for debugging.
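On the Pi side, ingesting that telemetry amounts to a small line parser. A minimal sketch, assuming a hypothetical field layout (state, left/right PWM, sonar range, the two IR bits, yaw) that is illustrative rather than the actual wire format:

```python
# Hypothetical STAT line layout (an assumption, not the real protocol):
#   STAT,<state>,<pwm_l>,<pwm_r>,<sonar_cm>,<ir_l>,<ir_r>,<yaw_deg>
STATES = {"RUN", "EMERG_BACK", "EMERG_TURN"}

def parse_stat(line: str):
    """Parse one STAT telemetry line into a dict; return None on malformed input."""
    parts = line.strip().split(",")
    if len(parts) != 8 or parts[0] != "STAT":
        return None
    state = parts[1]
    if state not in STATES:
        return None
    try:
        pwm_l, pwm_r = int(parts[2]), int(parts[3])
        sonar_cm = float(parts[4])
        ir_l, ir_r = bool(int(parts[5])), bool(int(parts[6]))
        yaw_deg = float(parts[7])
    except ValueError:
        return None
    return {"state": state, "pwm": (pwm_l, pwm_r),
            "sonar_cm": sonar_cm, "ir": (ir_l, ir_r), "yaw_deg": yaw_deg}
```

Returning None on any malformed line keeps the serial thread robust to the partial lines that show up right after opening the port.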
Software layout
- Serial thread: parses Arduino telemetry, keeps shared state, integrates odometry
- LiDAR thread: full turns → world points → rolling occupancy update (free along ray, hit = occupied)
- Dashboard: Pygame, showing grid crop + last scan overlay + HUD (sonar, IR, IMU, PWM)
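The "free along ray, hit = occupied" update in the LiDAR thread can be sketched with Bresenham ray tracing over a log-odds grid. The grid shape and the log-odds increments below are illustrative assumptions, not the project's actual tuning:

```python
import numpy as np

def bresenham(x0, y0, x1, y1):
    """Integer grid cells on the segment from (x0, y0) to (x1, y1), endpoint included."""
    cells = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx - dy
    x, y = x0, y0
    while True:
        cells.append((x, y))
        if (x, y) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dx
            y += sy
    return cells

def update_grid(grid, robot_cell, hit_cell, l_free=-0.4, l_occ=0.85):
    """Log-odds occupancy update: decrement cells along the ray, increment the endpoint."""
    ray = bresenham(*robot_cell, *hit_cell)
    for cx, cy in ray[:-1]:                      # free space along the beam
        if 0 <= cx < grid.shape[1] and 0 <= cy < grid.shape[0]:
            grid[cy, cx] += l_free
    hx, hy = ray[-1]                             # the obstacle cell itself
    if 0 <= hx < grid.shape[1] and 0 <= hy < grid.shape[0]:
        grid[hy, hx] += l_occ
```

Keeping the grid in log-odds makes repeated scans accumulate evidence naturally; thresholding it only at render time keeps the dashboard crop cheap.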
Notes & next steps
- Add wheel encoders and fuse them with the IMU (complementary filter or EKF)
- Light ICP on downsampled scans to reduce drift; later a small pose-graph
- Expose LiDAR extrinsics (x, y, yaw) for easy calibration
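For the encoder/IMU fusion item above, a first pass could be a complementary filter on heading: trust the integrated gyro rate short-term and an absolute yaw reference (e.g. encoder-derived heading) long-term. A minimal sketch; the blend gain and the update signature are assumptions for illustration:

```python
import math

def wrap_angle(a):
    """Wrap an angle in radians to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

class ComplementaryYaw:
    """Blend integrated gyro rate (fast but drifting) with an absolute
    yaw reference (slow but bounded). alpha near 1 favors the gyro."""

    def __init__(self, alpha=0.98, yaw0=0.0):
        self.alpha = alpha
        self.yaw = yaw0

    def update(self, gyro_z, yaw_ref, dt):
        predicted = self.yaw + gyro_z * dt            # dead-reckon with the gyro
        err = wrap_angle(yaw_ref - predicted)         # pull toward the reference
        self.yaw = wrap_angle(predicted + (1.0 - self.alpha) * err)
        return self.yaw
```

The same structure extends to an EKF later: the gyro term becomes the predict step and the reference term the measurement update.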
Gallery
