Things to explore before the robot physically arrives

Hi,

We will be getting a Stretch soon for one of our research projects. It involves designing gestures for Stretch.

I was curious to know what we can explore before the bot arrives physically. Would it be possible to do some gesture design and visualisation in Unity or similar applications?

I tried playing around with the ‘stretch_ros’ repository. Can I execute the navigation/calibration packages to see a visualization in RViz?

Also, the documentation mentions a few Python and shell scripts used in calibration that I couldn’t find. Could you help me locate them: stretch_robot_home.py and update_uncalibrated_urdf.sh?

Hi @rishi, welcome to the forum! Stretch has a popular sensor called the Intel Realsense D435i in its head that would probably work best for interpreting hand or body gestures. The sensor returns regular color images, as well as depth images.

Before your robot arrives, you might explore gesture design in terms of these images. For color images alone, you could get started with just a webcam. I’ve seen researchers use color images as input to deep recurrent neural networks (RNNs) trained to identify gestures. For depth images, there are a number of depth cameras available: the Intel Realsense D435i (which we use), the Microsoft Kinect V1/V2, and many more. At Georgia Tech, I was part of a research project called CopyCat that interpreted American Sign Language using a Microsoft Kinect within a Unity game.
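As a minimal sketch of that pre-robot exploration (this assumes OpenCV is installed and a standard webcam at index 0; the folder layout and clip length are just placeholders), you could start by recording short labelled clips to build a small gesture dataset:

```python
# Minimal gesture-clip recorder for pre-robot exploration.
# Assumes: OpenCV (pip install opencv-python) and a webcam at index 0.
import os
import cv2

def record_gesture_clip(label, num_frames=60, out_dir="gesture_clips"):
    """Record a short clip of webcam frames for one labelled gesture."""
    os.makedirs(os.path.join(out_dir, label), exist_ok=True)
    cap = cv2.VideoCapture(0)  # open the default webcam
    frames_saved = 0
    while frames_saved < num_frames:
        ok, frame = cap.read()
        if not ok:
            break
        path = os.path.join(out_dir, label, f"{frames_saved:04d}.png")
        cv2.imwrite(path, frame)
        frames_saved += 1
        cv2.imshow("recording: " + label, frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):  # press q to stop early
            break
    cap.release()
    cv2.destroyAllWindows()
    return frames_saved

if __name__ == "__main__":
    # e.g. record a 60-frame clip of a "wave" gesture
    record_gesture_clip("wave")
```

Clips recorded this way could then feed whatever gesture classifier you want to experiment with, and the same code would work later on images coming from the robot’s camera.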

After your robot arrives, you can execute the navigation/calibration packages within stretch_ros and visualize the results in RViz, a visualization tool that comes with ROS. It’s common to use C# wrappers to visualize/use the data in Unity if you need that. These packages in stretch_ros do require a robot for execution. Also within stretch_ros, we include a few deep learning models for perception, one of which can identify body landmarks (e.g. elbow, wrist) in a depth image, which may be useful for body gestures.
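To give a feel for what working with the camera data looks like once the robot is running, here’s a rough sketch of a ROS node that subscribes to the D435i color stream. The topic name /camera/color/image_raw is the usual realsense2_camera default and is an assumption here; check `rostopic list` on your robot for the actual names.

```python
#!/usr/bin/env python
# Sketch: subscribe to the D435i color stream and hand frames to your own
# gesture code. Topic name is an assumption; verify with `rostopic list`.
import rospy
import cv2
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()

def on_color_image(msg):
    # Convert the ROS Image message into an OpenCV BGR array
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    # ...pass the frame to your gesture recognition model here...
    cv2.imshow('D435i color', frame)
    cv2.waitKey(1)

if __name__ == '__main__':
    rospy.init_node('gesture_camera_listener')
    rospy.Subscriber('/camera/color/image_raw', Image, on_color_image)
    rospy.spin()
```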

Finally, here are links to stretch_robot_home.py and update_uncalibrated_urdf.sh. Hope this answers your questions and gives some helpful info. Let me know if you have any follow up questions.


Hi Binit,

Thank you so much for your quick response! All your inputs are very helpful!

I wasn’t clear in explaining what we are trying to do. We intend to design body movement gestures for Stretch that portray emotions (e.g. fast forward-backward movement showing happiness/joy). The idea is to explore whether humans are able to understand the emotions a robot tries to portray just through its gestures.

Do you have any suggestions along these lines? Is there a model available that we can directly import into Unity to design/visualise some of these gestures and then, later when the robot arrives, program them onto it?

Thanks!
Rishi

Ah sorry Rishikesh, I misunderstood what you are trying to do. Thanks for the clarification. There’s no Unity model of Stretch at the moment, but I do think the community would find a rigged model for Unity useful. Perhaps we could work together to create this model. I can prepare the STL files and identify the joint ranges necessary to rig the moving links on Stretch.

When the robot arrives, I’d be happy to help you turn the Unity animated Stretch into something that could run on the real robot.
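To make that concrete, here’s a rough idea of what a “happy” forward-backward gesture might look like using the stretch_body Python API on the real robot. The distances, repetition count, and timing below are placeholders I made up; you’d tune them (and add safety checks and clear space around the robot) on the actual hardware.

```python
# Rough sketch of an expressive forward-backward "happy" gesture using the
# stretch_body Python API (runs on the robot itself). The distance, number of
# repetitions, and sleep times are made-up placeholders to tune on hardware.
import time
import stretch_body.robot

robot = stretch_body.robot.Robot()
robot.startup()

try:
    for _ in range(3):                 # three quick forward-backward pulses
        robot.base.translate_by(0.1)   # scoot forward 10 cm
        robot.push_command()
        time.sleep(0.8)
        robot.base.translate_by(-0.1)  # and back 10 cm
        robot.push_command()
        time.sleep(0.8)
finally:
    robot.stop()
```

The same gesture could first be animated on the rigged Unity model, then translated into a sequence of joint/base commands like the one above.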

Hi Binit, Thank you so much for offering to help! It would be awesome if we could work together and get a rigged model up and running.

How could I share what my teammates have created/modelled till now? Would communicating over email work best?

– Rishi

Glad to hear it @rishi! I’ll direct message you.