My experience porting FUNMAP to ROS 2

Hello Stretch Community!

I am Atharva Pusalkar, a graduate student at the Carnegie Mellon Robotics Institute (CMU RI), advised by Prof. Yonatan Bisk and Prof. Zackory Erickson. I joined Hello Robot as an intern this summer to develop our ROS 2 infrastructure and help make it production-ready. In this post, I will walk you through the packages I have ported to ROS 2 and how they have changed from ROS 1.

Parity with ROS 1

Our primary goal this summer has been to make ROS 2 on Stretch deliver the same capabilities as ROS 1, so that both new and existing users can be productive. This also means achieving dependable, robust performance through extensive testing. To that end, I developed functional and performance requirements to harden our experimental ROS 2 stack.

One package to rule them all - Stretch Core

A solid, well-functioning driver interface to the robot is key to a good user experience. In keeping with this idea, we made sure the core package matches the usage and behavior users expect from our already robust ROS 1 core package. The port also introduces new features, such as a new stretch_driver operation mode called “trajectory”.
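
For readers who want to try the new mode, here is a minimal rclpy sketch that switches the driver into trajectory mode and sends a one-waypoint lift trajectory. The service and action names below (/switch_to_trajectory_mode, /stretch_controller/follow_joint_trajectory) and the joint name joint_lift follow stretch_ros naming conventions, but treat them as assumptions and confirm them on your robot with `ros2 service list` and `ros2 action list`.

```python
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from std_srvs.srv import Trigger
from control_msgs.action import FollowJointTrajectory
from trajectory_msgs.msg import JointTrajectoryPoint
from builtin_interfaces.msg import Duration


class TrajectoryModeDemo(Node):
    def __init__(self):
        super().__init__('trajectory_mode_demo')
        # Assumed names, following stretch_ros conventions.
        self.mode_client = self.create_client(Trigger, '/switch_to_trajectory_mode')
        self.traj_client = ActionClient(
            self, FollowJointTrajectory, '/stretch_controller/follow_joint_trajectory')

    def run(self):
        # Ask stretch_driver to enter trajectory mode before sending a goal.
        self.mode_client.wait_for_service()
        future = self.mode_client.call_async(Trigger.Request())
        rclpy.spin_until_future_complete(self, future)

        # A single waypoint: raise the lift to 0.5 m over 4 seconds.
        goal = FollowJointTrajectory.Goal()
        goal.trajectory.joint_names = ['joint_lift']
        point = JointTrajectoryPoint()
        point.positions = [0.5]
        point.time_from_start = Duration(sec=4)
        goal.trajectory.points = [point]

        self.traj_client.wait_for_server()
        send_future = self.traj_client.send_goal_async(goal)
        rclpy.spin_until_future_complete(self, send_future)


def main():
    rclpy.init()
    node = TrajectoryModeDemo()
    node.run()
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```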

A robot with a plan - Stretch FUNMAP

Your Stretch can now plan and navigate with closed-loop control and obstacle detection using FUNMAP. The package also provides handy features such as aligning with cliffs, end-effector goal navigation, basic arm manipulation, and more. Being a contact-sensitive robot, Stretch also provides FUNMAP services to extend or lift its arm until it makes contact with a surface.
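
As a quick illustration, here is a sketch of calling one of these contact services from rclpy. The service name assumes the ROS 1 FUNMAP names (e.g. /funmap/trigger_reach_until_contact) carry over to the port; check `ros2 service list` with FUNMAP running to confirm.

```python
import rclpy
from rclpy.node import Node
from std_srvs.srv import Trigger


def main():
    rclpy.init()
    node = Node('contact_demo')
    # Assumed service name, carried over from the ROS 1 FUNMAP node.
    client = node.create_client(Trigger, '/funmap/trigger_reach_until_contact')
    client.wait_for_service()
    # FUNMAP extends the arm until its effort sensing registers contact.
    future = client.call_async(Trigger.Request())
    rclpy.spin_until_future_complete(node, future)
    node.get_logger().info(f'FUNMAP responded: {future.result().message}')
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```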

Seeing the bigger picture - Stretch Deep Perception

The stretch_deep_perception package provides easy endpoints for general-purpose computer vision tasks such as object detection, human body landmark detection, and nearest-face detection. It runs the deep learning models through OpenVINO so that they can execute on a CPU.
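
To consume the detections programmatically, you can subscribe to the visualization_msgs/MarkerArray messages the detection nodes publish. The sketch below shows the pattern; the topic name /faces/marker_array is only an illustrative assumption, so run `ros2 topic list` after launching the node you need to find the real one.

```python
import rclpy
from rclpy.node import Node
from visualization_msgs.msg import MarkerArray


class DetectionListener(Node):
    def __init__(self):
        super().__init__('detection_listener')
        # Hypothetical topic name for illustration; check `ros2 topic list`.
        self.create_subscription(
            MarkerArray, '/faces/marker_array', self.on_detections, 10)

    def on_detections(self, msg):
        # Each marker carries the pose of one detection in the camera frame.
        for marker in msg.markers:
            p = marker.pose.position
            self.get_logger().info(
                f'detection at ({p.x:.2f}, {p.y:.2f}, {p.z:.2f})')


def main():
    rclpy.init()
    rclpy.spin(DetectionListener())


if __name__ == '__main__':
    main()
```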

A faithful service robot - Stretch Demos

To get a quick overview of Stretch’s capabilities, users can now run the autonomy demos using ROS 2. These demos include:

  • Surface cleaning
  • Object grasping
  • Drawer opening
  • Object delivery
  • Writing “hello” on a whiteboard

Each demo comes with an accompanying workspace setup and code explanation guide (stretch_tutorials).
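
If you want to try one, each demo is started from its own launch file and then kicked off with a std_srvs/Trigger service. Assuming the ROS 1 names carry over to the port, the surface-cleaning demo would look something like `ros2 launch stretch_demos clean_surface.launch.py` followed by `ros2 service call /clean_surface/trigger_clean_surface std_srvs/srv/Trigger`; the stretch_tutorials guides list the exact commands for each demo.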

Here is a feature comparison table between stretch_ros1 and stretch_ros2 to get a better understanding of the migration: stretch_tutorials/feature_comparison.

This summer, I had a blast working at Hello Robot and living in the Bay Area. I loved Hello Robot’s supportive work culture and enjoyed building alongside their fantastic team. Reaching out to AI research groups across the world through my work and contributing toward making embodied AI more accessible means a great deal to me. I thank (in no particular order) Binit Shah, Mohamed Fazil, Charlie Kemp, Chintan Desai, Julian Mehu, and Aaron Edsinger for their support and mentorship.

At Carnegie Mellon University, I am working with Prof. Bisk and Prof. Erickson to build Alfred, a low-cost elderly assistance robot. If you wish to collaborate on related projects, you can reach out to me via my email or LinkedIn.

Happy tinkering!
Atharva Pusalkar

This post was written by a human and verified by a Stretch.
