Autonomous Mobile Robot
In MAE5180, I learned about the challenges and techniques involved in creating autonomous mobile robots. Topics in the class included sensing, localization, mapping, path planning, motion planning, obstacle and collision avoidance, and multi-robot control. To understand the algorithms used to handle these problems, I first implemented them in a simulated environment and then on an iRobot Create. The localization algorithms implemented were an extended Kalman filter (EKF) and a particle filter. The path planning algorithms implemented were potential fields, probabilistic roadmaps (PRM), rapidly-exploring random trees (RRT), and bidirectional rapidly-exploring random trees (BIRRT).
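As a rough illustration of the localization side, below is a minimal particle filter (Monte Carlo localization) sketch for a planar robot with pose (x, y, θ), assuming a simple odometry motion model and a range measurement to a single known landmark. The noise values, landmark, and function names are illustrative and not the course code.

```python
import numpy as np

def motion_update(particles, d, dtheta, noise=(0.02, 0.01)):
    """Propagate each particle through a simple odometry model with added noise."""
    n = len(particles)
    d_noisy = d + np.random.normal(0.0, noise[0], n)
    th_noisy = particles[:, 2] + dtheta + np.random.normal(0.0, noise[1], n)
    particles[:, 0] += d_noisy * np.cos(th_noisy)
    particles[:, 1] += d_noisy * np.sin(th_noisy)
    particles[:, 2] = th_noisy
    return particles

def measurement_update(particles, z, landmark, sigma=0.1):
    """Weight particles by how well a range measurement to a known landmark matches."""
    expected = np.hypot(landmark[0] - particles[:, 0], landmark[1] - particles[:, 1])
    weights = np.exp(-0.5 * ((z - expected) / sigma) ** 2) + 1e-12
    return weights / weights.sum()

def resample(particles, weights):
    """Systematic (low-variance) resampling: keep particles in proportion to weight."""
    n = len(particles)
    positions = (np.arange(n) + np.random.uniform()) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx].copy()

# One filter iteration: predict with odometry, correct with a range measurement, resample.
particles = np.random.uniform([-1, -1, -np.pi], [1, 1, np.pi], size=(500, 3))
particles = motion_update(particles, d=0.1, dtheta=0.05)
weights = measurement_update(particles, z=1.2, landmark=(2.0, 0.0))
particles = resample(particles, weights)
pose_estimate = particles.mean(axis=0)  # naive mean; the heading really needs a circular mean
```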
The physical robot itself was equipped with several sensors: a bump sensor to detect frontal collisions and a RealSense camera to measure depth to obstacles. The camera could also detect several beacons placed throughout the environment, which helped the robot localize itself in regions of the map that would otherwise look identical based on distances to walls alone. I also gained experience with some of the challenges of working with a physical robot, such as sensor latency and sensor fusion.
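To illustrate why the beacons help, here is a sketch of a generic range-bearing beacon measurement model. The beacon IDs, positions, and noise parameters are made up for the example, and this is not the exact interface used in the class.

```python
import numpy as np

# Hypothetical map of beacon IDs to known world positions (meters).
beacon_map = {7: (2.5, 1.0), 12: (0.0, 3.0)}

def expected_beacon_measurement(pose, beacon_id):
    """Predict the range and bearing to a known beacon from a hypothesized robot pose."""
    x, y, theta = pose
    bx, by = beacon_map[beacon_id]
    dx, dy = bx - x, by - y
    rng = np.hypot(dx, dy)
    # Bearing is measured relative to the robot's heading, wrapped to [-pi, pi].
    bearing = np.arctan2(dy, dx) - theta
    bearing = np.arctan2(np.sin(bearing), np.cos(bearing))
    return rng, bearing

def beacon_likelihood(pose, beacon_id, measured_rng, measured_bearing,
                      sigma_r=0.15, sigma_b=0.1):
    """Score how well a beacon detection agrees with a hypothesized pose.

    Because each beacon ID is unique, a single detection can rule out poses in
    geometrically similar parts of the map that wall distances alone cannot separate.
    """
    rng, bearing = expected_beacon_measurement(pose, beacon_id)
    err_b = np.arctan2(np.sin(measured_bearing - bearing),
                       np.cos(measured_bearing - bearing))
    return (np.exp(-0.5 * ((measured_rng - rng) / sigma_r) ** 2) *
            np.exp(-0.5 * (err_b / sigma_b) ** 2))
```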
The final project for the class put many of these concepts together: the robot needed to handle localization, mapping, and navigation to several checkpoints. The robot was placed at one of the checkpoints in a random orientation and needed to navigate to the other checkpoints in the given map as quickly as possible. The given map also contained several missing walls that the robot needed to detect. For this project, I used an EKF for localization, an RRT to generate waypoints, and primarily the depth sensor to detect walls and confirm the robot's true position as it traveled.
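Below is a minimal 2D RRT sketch of the waypoint-generation step, assuming a rectangular workspace and circular obstacles for collision checking; the real project checked against the map's walls, so treat this as an outline of the algorithm rather than the code I ran, with all names and parameters chosen for illustration.

```python
import math
import random

class Node:
    def __init__(self, x, y, parent=None):
        self.x, self.y, self.parent = x, y, parent

def collision_free(p, q, obstacles, n_checks=20):
    """Check the straight segment p->q against circular obstacles by sampling points."""
    for i in range(n_checks + 1):
        t = i / n_checks
        x = p[0] + t * (q[0] - p[0])
        y = p[1] + t * (q[1] - p[1])
        for ox, oy, r in obstacles:
            if math.hypot(x - ox, y - oy) <= r:
                return False
    return True

def rrt(start, goal, obstacles, bounds, step=0.3, goal_tol=0.3, max_iters=5000):
    """Grow a tree from start toward random samples; return a list of waypoints."""
    nodes = [Node(*start)]
    for _ in range(max_iters):
        # Bias a fraction of samples toward the goal to speed up convergence.
        sample = goal if random.random() < 0.1 else (
            random.uniform(bounds[0], bounds[1]), random.uniform(bounds[2], bounds[3]))
        nearest = min(nodes, key=lambda n: math.hypot(n.x - sample[0], n.y - sample[1]))
        d = math.hypot(sample[0] - nearest.x, sample[1] - nearest.y)
        if d == 0:
            continue
        new = (nearest.x + step * (sample[0] - nearest.x) / d,
               nearest.y + step * (sample[1] - nearest.y) / d)
        if not collision_free((nearest.x, nearest.y), new, obstacles):
            continue
        node = Node(new[0], new[1], nearest)
        nodes.append(node)
        if math.hypot(new[0] - goal[0], new[1] - goal[1]) < goal_tol:
            # Walk parent pointers back to the start to recover the waypoint path.
            path = [goal]
            while node is not None:
                path.append((node.x, node.y))
                node = node.parent
            return path[::-1]
    return None  # No path found within the iteration budget.

waypoints = rrt(start=(0.0, 0.0), goal=(4.0, 3.0),
                obstacles=[(2.0, 1.5, 0.5)], bounds=(-1.0, 5.0, -1.0, 4.0))
```

In a setup like the final project, the returned waypoint list would be handed to a lower-level controller that drives between consecutive points while the EKF keeps the pose estimate up to date.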