## Overview & Academic Context
As part of the "Applied Design Methodology in Mechatronics" (ADMM) course at TUHH, our team developed a non-visual perception system that enables an existing Unmanned Ground Vehicle (UGV) to navigate autonomously without relying on conventional cameras or LiDAR.
## System Capabilities
Using an integrated array of ultrasonic sensors, force sensors, and an IMU, the perception stack can:
- **Obstacle Differentiation:** Distinguish between hard and soft obstacles and trigger the corresponding recovery maneuver when necessary.
- **Terrain Analysis:** Classify the terrain type (smooth or rough) and estimate slope inclination using a custom physical terrain probe.
- **Real-Time Adaptation:** Adapt the vehicle's motion dynamically based solely on tactile and inertial feedback.
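To make these capabilities concrete, here is a minimal sketch of two of the ideas above: telling hard from soft contacts with a force threshold, and estimating slope from a static accelerometer reading. The threshold value, function names, and sensor units are illustrative assumptions, not the project's actual tuned parameters.

```python
import math

# Hypothetical force threshold (in newtons) separating soft from hard
# contacts; the project's real tuned value is not stated in this document.
HARD_FORCE_THRESHOLD_N = 5.0

def classify_obstacle(peak_force_n: float) -> str:
    """Classify a bump as 'hard' or 'soft' from the peak contact force."""
    return "hard" if peak_force_n >= HARD_FORCE_THRESHOLD_N else "soft"

def slope_deg(ax: float, ay: float, az: float) -> float:
    """Estimate slope inclination in degrees from a static accelerometer
    sample (m/s^2): the angle between gravity and the body z-axis."""
    return math.degrees(math.atan2(math.hypot(ax, ay), az))

# Example readings: a firm impact, and the IMU resting on a gentle incline.
print(classify_obstacle(8.2))            # exceeds threshold -> "hard"
print(round(slope_deg(0.0, 1.7, 9.66)))  # roughly a 10-degree slope
```

In practice the threshold would be tuned from logged contact events, and the slope estimate would be filtered (e.g. complementary or low-pass) rather than taken from a single raw sample.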
## Development Methodology
We followed a structured V-model development process. The work moved systematically from requirement specification and black-box modeling through functional decomposition, ideation, concept selection, and rapid prototyping to comprehensive system testing.
## Key Takeaways
- **Sensor Constraints:** Designing and tuning a robust perception system entirely without visual input.
- **Resource Management:** Engineering effective robotic solutions under strict hardware and computing limitations.
- **Full-Stack Implementation:** Gaining hands-on experience spanning both software architecture and physical hardware development.