MoonBot Navigation
Autonomous navigation and interaction stack for a lunar rover prototype. Built during my visiting research at Tohoku University's Space Robotics Lab (TESP 2025); winner of the TESP 2025 Competition.
Project Overview
- Built an autonomous mobile robot to navigate a sandy, uneven "lunar" arena, avoid obstacles, detect targets, and actuate a custom gripper to interact with objects.
- End-to-end pipeline: vision → mapping → Dijkstra path planning → PD control → onboard actuation and interaction.
- Space Robotics Lab project under Prof. K. Yoshida (Tohoku University); awarded 1st place and Research Certificate of Excellence at TESP 2025.

Hardware & Electronics
- Robot evolution: four iterations (Tsukikage → Seigetsu → Mikazuki → final "Tenshiko") to balance power, traction on sand, and gripper stability.
- Compute & control: Raspberry Pi for perception/planning, EV3 brick for motor control, camera module for target detection.
- Actuation: Loader-style linear gripper kept off the ground during navigation to reduce drag and slippage.

Navigation & Mapping
- Planner: Dijkstra over a binary occupancy map with a distance transform + retraction to pull paths away from obstacles.
- Controller: PD controller outputs linear/angular velocity for smooth tracking.
- Vision-to-map: threshold satellite-style image → binary map → distance map → safe navigation zone.
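The mapping and planning steps above can be sketched end to end. This is an illustrative toy, not the competition code: it uses pure-Python lists in place of real camera frames, and `distance_map`, `plan`, and the clearance-penalty weighting are hypothetical names and choices standing in for the distance transform + retraction described above.

```python
import heapq
from collections import deque

def distance_map(grid):
    """Multi-source BFS: distance (in cells, 4-connected) from each cell
    to the nearest obstacle. grid[r][c] == 1 marks an obstacle."""
    rows, cols = len(grid), len(grid[0])
    dist = [[float('inf')] * cols for _ in range(rows)]
    q = deque()
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1:
                dist[r][c] = 0
                q.append((r, c))
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and dist[nr][nc] == float('inf'):
                dist[nr][nc] = dist[r][c] + 1
                q.append((nr, nc))
    return dist

def plan(grid, start, goal, margin=1):
    """Dijkstra over free cells; cells within `margin` of an obstacle are
    forbidden, and a clearance penalty pulls the path away from walls."""
    dist = distance_map(grid)
    rows, cols = len(grid), len(grid[0])
    best, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        cost, node = heapq.heappop(pq)
        if node == goal:
            break
        if cost > best[node]:
            continue  # stale queue entry
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if not (0 <= nr < rows and 0 <= nc < cols):
                continue
            if grid[nr][nc] == 1 or dist[nr][nc] <= margin:
                continue  # obstacle, or inside the safety margin
            step = 1.0 + 4.0 / dist[nr][nc]  # prefer high-clearance cells
            ncost = cost + step
            if ncost < best.get((nr, nc), float('inf')):
                best[(nr, nc)] = ncost
                prev[(nr, nc)] = node
                heapq.heappush(pq, (ncost, (nr, nc)))
    if goal not in best:
        return None
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]
```

The clearance penalty (`1 + 4/dist`) is one simple way to get the "retraction" effect: the planner still finds short paths, but ties break toward the middle of the safe zone.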


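The PD controller can be sketched as a heading regulator that emits the linear/angular velocity pair. This is a minimal illustration assuming an (x, y, θ) pose estimate and a single waypoint; the gains, speed cap, and cosine speed-scaling rule are placeholders, not the values tuned on the rover.

```python
import math

def pd_heading_control(pose, target, prev_err=0.0, dt=0.1,
                       kp=1.5, kd=0.4, v_max=0.3):
    """One PD step on heading error -> (linear v, angular omega, error).

    pose   = (x, y, theta) in the map frame
    target = (x, y) waypoint from the planner
    Linear speed shrinks as heading error grows, so the rover turns
    in place when badly misaligned (less slippage on loose sand).
    """
    x, y, theta = pose
    desired = math.atan2(target[1] - y, target[0] - x)
    # Wrap the error to [-pi, pi] so the rover always takes the short turn.
    err = math.atan2(math.sin(desired - theta), math.cos(desired - theta))
    omega = kp * err + kd * (err - prev_err) / dt
    v = v_max * max(0.0, math.cos(err))  # face the goal before driving
    return v, omega, err
```

The caller would run this each control tick, feeding back `err` as the next tick's `prev_err`.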
Object Detection & Interaction
- Dataset: 240 labeled images; trained via Roboflow for lightweight detection of target "turtles."
- Model: Simple CV detector to center targets and trigger interaction.
- Flow: Short-range visual servoing keeps targets centered; gripper actuates once aligned.
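The detect → center → grip flow above might look like the following sketch. The frame width, pixel deadband, and turn gain are invented for illustration, and the real detector returns Roboflow bounding boxes rather than bare x-centres.

```python
def servo_step(bbox_center_x, frame_width=640, deadband_px=15, k=0.004):
    """One visual-servoing step: map the target's horizontal offset from
    the image centre to a turn command, and report whether it is aligned."""
    offset = bbox_center_x - frame_width / 2
    aligned = abs(offset) <= deadband_px
    omega = 0.0 if aligned else -k * offset  # turn toward the target
    return omega, aligned

def approach(detections, frame_width=640):
    """Drive the detect -> center -> grip flow over a stream of detections
    (each the x-centre of the best target box, or None if nothing seen)."""
    for cx in detections:
        if cx is None:
            yield ('search', 0.0)   # no target: keep scanning
        else:
            omega, aligned = servo_step(cx, frame_width)
            if aligned:
                yield ('grip', 0.0)  # centred: actuate the gripper
            else:
                yield ('turn', omega)
```

Keeping the trigger condition purely geometric (a pixel deadband) avoids needing range estimates from a monocular camera.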

Other Project
Indoor autonomous cleaning robot with SLAM, navigation, and obstacle avoidance built on ROS, Webots, and RViz.

GitHub Repository