Cambridge University Robotics Society - Rescue robot using ROS

Posted on July 1, 2021 by Zach Lambert ‐ 2 min read

During my 3rd and 4th years of university, I was part of the Cambridge University Robotics Society and worked in a team building a robot for the RoboCup Rescue competition.

My role was to lead the software development, and I managed the majority of the ROS stack (repository here). Other team members worked on the hardware, electronics, and other software components such as the web interface.

Although we never got far enough to compete in the RoboCup Rescue competition (partly due to Covid, partly because it was simply a challenging project to undertake alongside university), we made some good progress.

3rd year progress: Joint control, odometry, MoveIt, web server, Gazebo

The list below shows the progress at the end of 3rd year. The robot could do the following:

  • Command and receive feedback from the joints using the ros_control package. The arm used Dynamixel smart motors, and the tracks and flippers used Odrive brushless motors.
  • The base velocity could be commanded through a base controller, which also returned odometry. Additionally, the flippers could be controlled to help the robot move over obstacles or up stairs.
  • The robot arm could be commanded manually with velocity control, or via MoveIt to move between predefined “home” and “ready” configurations.
  • The robot ran its own web server, from which it could be controlled and its state visualised.
  • In addition to the interface to the real robot, Gazebo provided a simulated robot with an identical interface.
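To give a flavour of what the base controller does under the hood, here is a minimal sketch of the twist-to-track-velocity mapping and the odometry integration it performs. This is purely illustrative: the track separation and all names are assumptions, not values from the actual robot or repository.

```python
import math

# Assumed track separation in metres (illustrative, not the robot's real value).
TRACK_SEPARATION = 0.5

def twist_to_track_velocities(linear_x, angular_z):
    """Map a commanded base twist to left/right track velocities (m/s)."""
    left = linear_x - angular_z * TRACK_SEPARATION / 2.0
    right = linear_x + angular_z * TRACK_SEPARATION / 2.0
    return left, right

def integrate_odometry(x, y, theta, left, right, dt):
    """Dead-reckon the new pose (x, y, theta) from track velocities over dt."""
    v = (left + right) / 2.0                 # forward velocity
    w = (right - left) / TRACK_SEPARATION    # angular velocity
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += w * dt
    return x, y, theta

# Driving straight at 1 m/s for one second moves the robot 1 m forward.
left, right = twist_to_track_velocities(1.0, 0.0)
x, y, theta = integrate_odometry(0.0, 0.0, 0.0, left, right, 1.0)
```

In the real stack, this logic lives inside the ros_control base controller, which subscribes to velocity commands and publishes odometry; the sketch just shows the kinematics involved.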

4th year progress: Navigation

In 4th year, while other team members improved the robot arm and gripper, I worked on setting up a proper navigation stack. This involved:

  • Interfacing with Intel RealSense cameras, specifically the D435i and T265, to be used for SLAM.
  • Using rtabmap_ros to perform SLAM with visual odometry and features from the camera streams, as well as produce an occupancy map.
  • Using move_base to perform motion planning on the occupancy map from rtabmap_ros.
  • Setting up a simulation suitable for navigation, which provided a fake environment and simulated camera streams.
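Conceptually, move_base's global planner searches the occupancy map for a collision-free path from the robot's pose to a goal. The sketch below illustrates that idea with a breadth-first search on a toy grid; move_base's actual planners are more sophisticated (Dijkstra/A* over layered costmaps), and the grid here is invented for illustration.

```python
from collections import deque

# Toy occupancy grid: 0 = free, 1 = occupied (a vastly simplified stand-in
# for the occupancy map that rtabmap_ros publishes).
GRID = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]

def plan_path(grid, start, goal):
    """Breadth-first search for a shortest 4-connected path on the grid."""
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk back through parents to reconstruct the path.
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in parents):
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no collision-free path exists

path = plan_path(GRID, (0, 0), (3, 3))
```

The real planner additionally inflates obstacles by the robot's footprint and replans as the map updates, but the core loop is the same: search free cells of the occupancy grid for a route to the goal.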

Setting up a simulation for navigation was particularly important since, due to Covid, we had limited contact and couldn’t meet in person.

I don’t have any videos of the real robot for this. However, before I left, I created two “how-to” videos for future members on how to set up the ROS stack and get it working, which I have included below. Note: the videos do have sound, it’s just very quiet for some reason.