Base motion replay

I tried to replay a trajectory I collected with a remote controller, with something like this (assume the robot moves straight forward for now):
    for i in range(len(pos) - 1):  # pos: recorded base positions (m)
        robot.base.translate_by(pos[i + 1] - pos[i])
        robot.push_command()
For other joints (lift/arm/end_of_arm, etc.) it works pretty well. But for the base, the robot always moves closer than it should move at the end of the trajectory,
i.e. the distance covered by a single robot.base.translate_by(pos[-1]) is greater than the total distance covered by the sequence of robot.base.translate_by(pos[i+1] - pos[i]) calls.

I thought it might be because the controller ignores some small motions. Any suggestions?

Can you clarify what you mean by ‘moves closer than it should’?

In general, the arm, lift, and base motors track small motions pretty well, as they use closed-loop steppers.

When doing absolute moves (move_to) you should be able to generate small motions on the order of 2–6 mm, depending on the joint.

When doing incremental moves (move_by or translate_by), the joint will move relative to its current encoder reading. This works fine for single incremental moves where the controller has time to reach its target.

If you do a series of closely spaced incremental moves you will likely see some accumulated error. This is because each move_by will be relative to the current encoder position. If the controller hasn’t yet achieved the target of the last command, then you will start to deviate from your trajectory.

One workaround might be to compute an absolute trajectory target in your code and use that to command the base using translate_to (**see note below).
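For example, here is a minimal sketch of that idea, assuming the usual stretch_body startup pattern and that robot.base.status reports the wheel-odometry pose under 'x' (worth verifying on your robot):

    import time
    import stretch_body.robot

    robot = stretch_body.robot.Robot()
    robot.startup()

    # pos: your recorded absolute base positions (m), as in your snippet
    x0 = robot.base.status['x']  # wheel odometry at playback start
    for target in pos[1:]:
        traveled = robot.base.status['x'] - x0
        # Command the remaining distance to the absolute target instead of
        # the raw recorded increment, so per-command error doesn't accumulate
        robot.base.translate_by((target - pos[0]) - traveled)
        robot.push_command()
        time.sleep(0.1)
    robot.stop()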

Another approach might be to command velocities at a moderate rate (10 Hz), though this may introduce other types of integration error.
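A minimal sketch of that approach, assuming base.set_velocity(v_m, w_r) takes translational (m/s) and rotational (rad/s) commands and that the samples in pos are evenly spaced in time:

    import time
    import stretch_body.robot

    robot = stretch_body.robot.Robot()
    robot.startup()

    dt = 0.1  # 10 Hz command rate; assumes pos was sampled at this rate
    for i in range(len(pos) - 1):
        v = (pos[i + 1] - pos[i]) / dt   # finite-difference velocity (m/s)
        robot.base.set_velocity(v, 0.0)  # straight-line motion: no rotation
        robot.push_command()
        time.sleep(dt)
    robot.base.set_velocity(0.0, 0.0)    # stop at the end of the trajectory
    robot.push_command()
    robot.stop()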

Another comment: the base has a sentry that limits the velocity and acceleration when the arm is not stowed. This is to protect against toppling hazards. It may simply be that your trajectories are getting clipped by these limits. This can be temporarily disabled in the YAML (robot_sentry.base_max_velocity).
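For example, in your user YAML (e.g. stretch_re1_user_params.yaml; the filename is an assumption and may differ on your system), assuming the sentry flags are 0/1 toggles like the other robot_sentry entries:

    robot_sentry:
      base_max_velocity: 0  # 0 disables the sentry; restore to 1 when done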

Finally, we have an implementation of splined trajectories coming to Stretch Body. We’re still working out the final issues, but hope to release this around the end of summer. This may be what you want longer term.

** Note: The base API doesn’t yet expose absolute moves for the base (translate_to, rotate_to) as we haven’t yet needed them. If interested we can extend the API for this.

Thanks for your reply. It seems like translate_to and rotate_to are the best solution? Would it be possible for us to extend the API?

Hi @jianrenw!

Thanks for posting this. I wanted to provide additional information about two aspects of your question that might impact you and others in the community.

Timing Matters

As Aaron touched on in his response, a smoothly varying trajectory for a joint is currently achieved by sending a new motion command before the previous motion command has finished. The game controller teleoperation code provides a good example of this. Each joint’s motor controller generates smooth motion that depends on the current state of the joint and the command parameters. As such, the result of playing back a recorded trajectory depends on the timing of the executed commands. One exception would be if each command were allowed to finish prior to executing the next command, but this would result in the robot stopping between each command.
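To make the timing point concrete, here is a minimal sketch of a playback loop that reproduces the recorded command timing, assuming the trajectory was logged as (timestamp, base position) pairs (the names here are illustrative, not a stretch_body API):

    import time

    def replay(robot, traj):
        # traj: list of (timestamp, base_x) pairs from the recording
        t_start = time.time()
        t0 = traj[0][0]
        for (t_a, x_a), (t_b, x_b) in zip(traj, traj[1:]):
            # Sleep until the same relative time as in the recording, then
            # issue the next increment before the previous one completes
            time.sleep(max(0.0, (t_a - t0) - (time.time() - t_start)))
            robot.base.translate_by(x_b - x_a)
            robot.push_command()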

For reference, each motor controller uses a trapezoidal speed motion generator based on this open source code.
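For intuition, the phase timing of such a profile can be computed in a few lines. This is a simplified sketch assuming the joint starts and ends at rest; the names are illustrative, not the firmware's:

    def trapezoid_times(d, v_max, a_max):
        """Phase durations (ramp up, cruise, ramp down) to travel distance d."""
        t_ramp = v_max / a_max               # time to reach cruise speed
        d_ramp = 0.5 * a_max * t_ramp ** 2   # distance covered during one ramp
        if 2 * d_ramp >= d:                  # short move: triangular profile,
            t_ramp = (d / a_max) ** 0.5      # cruise speed is never reached
            return t_ramp, 0.0, t_ramp
        return t_ramp, (d - 2 * d_ramp) / v_max, t_ramp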

Also, as Aaron noted, Hello Robot is working to support smooth multi-joint trajectories using splines. These trajectories will support commands that specify a robot pose to be achieved at a particular time. One of the motivations for this new mode of control is to better execute trajectories produced by motion planners, such as trajectories from MoveIt 2. However, this mode of control is under active development and not ready for use, especially since it requires a firmware update.

The Limits of Wheel Odometry

The mobile base’s motion will vary based on the flooring. Hard, high-friction, flat floors will tend to result in more accurate and repeatable motion of the mobile base. Other conditions, such as carpet, floor padding, slippery surfaces, and seams between tiles, will result in less accurate and repeatable motion of the mobile base. This is significantly different from the lift and telescoping arm, which execute consistent motions given typical environmental variations.

stretch_body only uses the rotations of the wheels for feedback when executing commands. In other words, it uses wheel odometry when moving the mobile base, which only considers how much the wheels have rotated rather than the actual motion of the mobile base.
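For reference, differential-drive wheel odometry amounts to integrating wheel encoder deltas, which is why wheel slip silently corrupts the pose estimate. A minimal sketch of one integration step (illustrative, not the stretch_body implementation):

    import math

    def update_pose(x, y, theta, d_left, d_right, track_width):
        """One odometry step from left/right wheel travel (m)."""
        d_center = (d_left + d_right) / 2.0         # travel along the heading
        d_theta = (d_right - d_left) / track_width  # change in heading (rad)
        # Integrate at the midpoint heading for a better small-step estimate
        x += d_center * math.cos(theta + d_theta / 2.0)
        y += d_center * math.sin(theta + d_theta / 2.0)
        return x, y, theta + d_theta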

For your purposes, you might want to restrict your recording and playback of mobile base trajectories to the same hard, high-friction, flat floor.

Overcoming the Limits of Wheel Odometry

A common approach to overcoming the limits of wheel odometry is to alter the mobile base’s motion using additional sensing. For example, Stretch’s demo code uses a scan matcher with the laser range finder to produce odometry estimates. The laser range finder senses walls and other fixed objects in a room. When the robot moves, the scan matcher estimates how the robot has moved relative to these fixed objects. stretch_ros includes a launch file that starts this scan matcher. You can find simple examples that use scan matcher odometry to adjust the motion of the mobile base in the MoveBase class of navigate.py.
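A minimal sketch of this kind of feedback loop, where get_scan_matched_x() is a hypothetical stand-in for whatever external odometry source you subscribe to (this is not the navigate.py implementation):

    import time

    def translate_closed_loop(robot, distance, tol=0.005):
        """Drive forward `distance` meters, correcting with external odometry."""
        x_start = get_scan_matched_x()  # hypothetical scan-matcher pose reading
        error = distance
        while abs(error) > tol:
            robot.base.translate_by(error)  # command the remaining error
            robot.push_command()
            time.sleep(0.2)
            error = distance - (get_scan_matched_x() - x_start)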

Visual inertial odometry is another useful way to estimate how the mobile base has moved within a room. Internally at Hello Robot, we’ve tried out the T265 Intel RealSense Tracking Camera mounted to the top of Stretch’s head and pointed towards the ceiling. At some point, we may create 3D-printed mounts for the T265 and make them available to the community with example code.

In the context of navigation, it’s common to estimate the mobile base’s pose and plan mobile base motions using a map of the environment. This goes beyond odometry and relates to the wide world of simultaneous localization and mapping (SLAM).

Moving the robot with respect to the room can be useful in a number of circumstances, such as navigating from one part of the room to another. However, some tasks depend more directly on features that aren’t fixed to the room, such as the pose of a person or a movable object. In these circumstances, moving the base with respect to task-relevant features can be useful. For example, Stretch’s object handover demo executes actions based on visual estimates of the pose of a nearby person. Similarly, Stretch’s grasp object and clean surface demos use estimates of the pose of an object and a nearby surface. Each of these demos uses a single estimate prior to acting. Visual servoing is a powerful related approach that uses repeated observations of task-relevant features to provide feedback while moving a robot.

I hope you and others find this post helpful.

Best wishes,
Charlie

Charlie Kemp, PhD
co-founder & CTO
Hello Robot Inc.
http://charliekemp.com

Hi @jianrenw ,

I was wrapping up my response when your reply came in. I doubt that translate_to and rotate_to would solve your problem. I’ll be interested to know what you think after you read my reply.

Best wishes,
Charlie