Making videos of Stretch: lenses, lights, editors, and more

Hello everyone,

In robotics, videos are an important form of communication. In this post, I describe how we made many of our YouTube videos for our recent public launch. Specifically, this post covers how we shot Stretch in action for our autonomy videos and teleoperation videos. It doesn’t cover the glamour shots, narrative elements, or branding slides.


Making Robot Videos Less Dangerous

Before I get into the equipment and software we used, I’ll touch on the dangers of robot videos and our efforts to reduce them. Robotics videos are often misinterpreted. Creators of robot videos are excited to share and can let their enthusiasm get in the way of conveying reality. On the other side, viewers of robot videos often want to believe that the future is here.

For our launch videos, Hello Robot attempted to reduce misinterpretation through the following methods.

  1. We clearly labeled videos from the start. Early in our editing, we added overlay text to indicate whether Stretch was being teleoperated or acting autonomously, and whether the video had been sped up.
  2. We posted the code that we used to control the robot, such as code for teleoperation and autonomy.
  3. We videoed Stretch performing significant tasks. Even if we only planned to use a short snippet, we tried to shoot the snippet in the context of a full task. This helped us avoid suggesting that Stretch can perform a task that it actually can’t, for example because it can’t perform one of the task’s steps. It also discouraged relentless short takes, which can more easily catch a low-probability event on video.
  4. We released full uncut videos of the significant tasks. For example, our launch video, autonomy compilation video, and teleoperation compilation video show highlights, but you can also watch all of the full underlying autonomy task videos and teleoperation task videos.

Video Equipment

In terms of equipment, let’s start with the camera. I’ve been around long enough to have worked with a variety of large cameras. Fortunately, the world has changed and high-end mobile phones take great video.

We selected the Google Pixel 2 XL, and we’ve been happy with it.

Sometimes, the phone’s field of view was too narrow, so we also purchased add-on lenses and used them in some shots.

We went with M-Series lenses from Moment. The smartphone case and the two lenses we used are listed below.

  • Moment Pixel 2 XL case that holds Moment lenses
  • Moment Wide Lens (18mm v2)
  • Moment Fisheye Lens (14mm 170deg v2)

We occasionally used handheld shots, such as with the whiteboard writing demo, but most of the time we used a tripod with a mobile phone mount.

Lighting can make a big difference in video quality. We used the LimoStudio Lighting Kit from Amazon.

Screen Recording

To convey the robot’s internal state, we captured videos of RViz using SimpleScreenRecorder in Ubuntu 18.04. The version in Ubuntu’s default repositories worked well, so we just installed it with

sudo apt install simplescreenrecorder

on the command line.
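If you want to script a capture session, SimpleScreenRecorder also accepts a few command-line options. Here is a minimal sketch; it assumes a ROS install with RViz, and stretch.rviz is just a placeholder name for a saved RViz configuration:

    # Launch RViz with a saved display configuration (stretch.rviz is a placeholder)
    rviz -d stretch.rviz &

    # Start recording immediately, reusing the settings from the last GUI session
    simplescreenrecorder --start-hidden --start-recording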

Editing

Once we had the raw video, we performed our initial edits using Kdenlive in Ubuntu 18.04. We used Kdenlive for a variety of editing jobs, including making initial cuts, speeding up some videos, and adding clarifying text about teleoperation, autonomy, and video speed.

The Kdenlive project has been making rapid progress, and we’ve had problems with older versions, so we used the recommended AppImage from the Kdenlive website.
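We did the speed-ups and overlay text in Kdenlive’s GUI, but for readers who prefer the command line, here is a rough ffmpeg sketch of the same two operations. The file names and label text are placeholders, and it assumes an ffmpeg build that includes the drawtext filter:

    # Speed a clip up 8x (dropping the audio) and burn in a clarifying label
    ffmpeg -i raw_clip.mp4 \
      -vf "setpts=PTS/8, drawtext=text='Autonomous (8x speed)':fontcolor=white:fontsize=48:x=24:y=24" \
      -an sped_up_clip.mp4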


We’re looking forward to seeing videos of Stretch from the community! Have you found something that works well for your videos of Stretch? Please post it here. I’m interested!

Best wishes,

Charlie

Charlie Kemp, PhD
co-founder & CTO
Hello Robot Inc.
http://charliekemp.com
