Autonomous Navigation and Target Tracking on a Mobile Robot
This was my senior capstone project at UMass Amherst. As the sensing and navigation lead, I developed the software backbone of the project using ROS2. I tuned the navigation system to move around both static and dynamic obstacles while plotting a path to a moving ArUco marker in real time.

Here is our robot, Holly. It is a 4-wheeled robot with a camera mounted on a stand via a pan-tilt servo mechanism, allowing it to see everything in the hemisphere above the chassis. We also have a LiDAR near the base to detect obstacles and build maps with SLAM.
Under the hood, Holly is built on a GoBilda chassis with some minor modifications. It has 4 motor controllers, all driven by a Raspberry Pi Pico. The camera mechanism uses a separate Raspberry Pi Pico, as we needed more pins than were available on one board. We chose the Pico because of how easy it is to develop code on it (Python!).
Holly runs off 2 LiPo batteries (12V, 32A total) and a boost converter that steps 12V up to 19V for the mini PC seen at the front. This PC runs all of our target tracking and autonomous navigation code.
An ArUco marker is a type of fiducial marker that encodes data, in this case an ID number. We can detect these markers in a scene and recover not only the ID but also the marker's pose in SE(3), which gives us both the distance and the orientation of the marker. From that pose, a goal can be plotted for the robot to navigate to autonomously.
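The step from marker pose to navigation goal can be sketched roughly as below. This is a minimal illustration, not our actual node: it assumes the marker pose comes in as a translation vector and a Rodrigues rotation vector (the form OpenCV's pose estimation returns), and the 0.5 m standoff distance is a made-up placeholder.

```python
import numpy as np

def goal_from_marker(tvec, rvec, standoff=0.5):
    """Compute a navigation goal a fixed distance in front of a marker.

    tvec: marker position in the camera frame (metres).
    rvec: marker orientation as a Rodrigues rotation vector.
    standoff: how far off the marker's face to stop (metres, illustrative).
    """
    # Rodrigues formula: rotation vector -> rotation matrix.
    theta = np.linalg.norm(rvec)
    if theta < 1e-9:
        R = np.eye(3)
    else:
        k = np.asarray(rvec, dtype=float) / theta
        K = np.array([[0.0, -k[2], k[1]],
                      [k[2], 0.0, -k[0]],
                      [-k[1], k[0], 0.0]])
        R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    # The marker's +z axis points out of its face; offset the goal
    # along that axis so the robot stops in front of the marker.
    normal = R @ np.array([0.0, 0.0, 1.0])
    return np.asarray(tvec, dtype=float) + standoff * normal
```

In the real system this goal would then be transformed into the map frame and handed to the navigation stack.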



Our initial design of Holly was very bare bones: the GoBilda chassis straight out of the box with our electronics strapped on. This let us show proof of concept for the system moving around under teleop and simple navigation commands, as well as verify that the pan-tilt camera mechanism moves to keep the target centered in the frame.
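The centering behaviour amounts to a simple proportional controller: nudge the servos by an amount proportional to the target's pixel offset from the image centre. A minimal sketch is below; the gain and the servo angle limits are placeholders, not our tuned values.

```python
def center_target(err_x, err_y, pan, tilt,
                  kp=0.03, pan_limits=(-90.0, 90.0), tilt_limits=(-45.0, 45.0)):
    """One step of a proportional pan-tilt controller.

    err_x, err_y: pixel offset of the target from the image centre.
    pan, tilt: current servo angles in degrees.
    Returns new (pan, tilt) commands, clamped to the servo limits.
    """
    clamp = lambda v, lo, hi: max(lo, min(hi, v))
    # Target right of centre -> pan right; target below centre -> tilt down.
    new_pan = clamp(pan + kp * err_x, *pan_limits)
    new_tilt = clamp(tilt - kp * err_y, *tilt_limits)
    return new_pan, new_tilt
```

Run every frame, this converges the target toward the image centre as the error shrinks.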
Initially, our robot had mecanum wheels, which allow holonomic drive, but we decided against them for the sake of odometry: mecanum rollers slip a lot on the floor. With rubber wheels we got much better grip and, with it, much better odometry.
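Wheel slip matters because odometry dead-reckons the pose by integrating wheel travel, so any slip accumulates as drift. A minimal differential-drive odometry update, sketched here for illustration (our actual stack gets this from the ROS2 drivers), looks like:

```python
import math

def update_odometry(x, y, theta, d_left, d_right, track_width):
    """Dead-reckon a differential-drive pose from wheel travel.

    d_left, d_right: distance travelled by each side's wheels (metres),
    typically derived from encoder ticks. track_width: wheel separation.
    Returns the updated (x, y, theta).
    """
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / track_width
    # Integrate along the heading at the midpoint of the turn.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2.0 * math.pi)
    return x, y, theta
```

If the wheels slip, d_left and d_right overstate the true travel, which is exactly why the rubber wheels improved our pose estimate.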


We then upgraded the robot and gave it more shape. Shown here is Holly Mk. III: we spray painted some pieces of wood, combined them with GoBilda parts and our new wheels, and put together the new Holly. We also routed all the wires more cleanly through the chassis and covered them up, much like a car, where all the electronics and mechanical pieces sit under the hood.
This was the most tedious part of the design process: our parts were limited, so we had to handle them carefully, while still making the robot easy to open up for debugging any electrical system failures.

Shown here is a live-tour example use case for our robot. It can switch between targets, navigate around obstacles, and record only while the target is in view. Otherwise, it pauses the recording, so you can easily edit out any segments of the video where no target is seen (e.g. while searching for one). Overall, this project taught me how to use ROS2, set up the navigation system, and integrate target tracking with live navigation.
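The record-only-when-visible behaviour boils down to a small state gate. The sketch below is illustrative rather than our exact implementation; in particular, the grace period (so a single dropped detection frame doesn't chop the video) is an assumption I've added to make the idea concrete.

```python
class RecordingGate:
    """Pause recording when the target leaves view, with a short grace
    period so brief detection dropouts don't split the recording."""

    def __init__(self, grace_s=1.0):
        self.grace_s = grace_s   # illustrative value
        self.last_seen = None    # timestamp of the last detection
        self.recording = False

    def update(self, target_in_view, now):
        """Call once per frame; returns whether to record this frame."""
        if target_in_view:
            self.last_seen = now
            self.recording = True
        elif self.last_seen is None or now - self.last_seen > self.grace_s:
            self.recording = False
        return self.recording
```

Each frame the detector's result and a timestamp go in, and the flag that comes out gates whether the frame is written to the video file.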

Here is the live navigation costmap updating from SLAM as the robot moves. Holly can be dropped into any environment and will start mapping its surroundings, marking the obstacles around it so it can navigate.
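Conceptually, the costmap marks cells as occupied wherever LiDAR beams return a hit. The toy sketch below shows just that marking step on a plain occupancy grid; the real system uses the ROS2 SLAM and costmap packages, and the grid size, resolution, and range limit here are arbitrary.

```python
import math

def mark_obstacles(grid, pose, ranges, angle_min, angle_step,
                   resolution=0.05, max_range=5.0):
    """Mark LiDAR hit points as occupied cells in an occupancy grid.

    grid: 2D list of ints (0 = free/unknown, 1 = occupied), indexed [row][col],
    with the grid origin at cell (0, 0). pose: (x, y, theta) of the robot
    in metres/radians. ranges: one distance per beam (metres).
    """
    x, y, theta = pose
    for i, r in enumerate(ranges):
        if not (0.0 < r < max_range):
            continue  # no usable return on this beam
        angle = theta + angle_min + i * angle_step
        hx = x + r * math.cos(angle)   # hit point in world coordinates
        hy = y + r * math.sin(angle)
        col, row = int(hx / resolution), int(hy / resolution)
        if 0 <= row < len(grid) and 0 <= col < len(grid[0]):
            grid[row][col] = 1
    return grid
```

A real costmap additionally clears the cells along each beam, inflates obstacles by the robot's footprint, and decays stale hits, which is what lets it track dynamic obstacles.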





Our underlying electrical system is shown here. It took a lot of organization to route the high-current paths while protecting the Pico and mini PC. At one point, a small mistake in the PCB sent the full 32A into the Pico and mini PC, frying a capacitor on the PC's board. It was an expensive mistake, but thankfully we were able to repair the board and recover the PC.
