How to use trajectory control for the mobile base on stretch_body?

Hi!

I am trying to control the mobile base using the stretch_body API, but I am pretty confused as to how to chain trajectories together.

A few questions:

  1. How do I properly use the trajectory-following API, especially for the base? Am I using it correctly? (The third waypoint is not getting executed.) A snippet is attached below.
  2. When translating the base forward or backward, the motion is very jerky. Are there any gains or other parameters that can be tuned to get a smoother trajectory?
import time

import numpy as np
import stretch_body.base

b = stretch_body.base.Base()
b.left_wheel.disable_sync_mode()
b.right_wheel.disable_sync_mode()
b.startup()

def addrot(b, prevx, prevy, prevt, theta, duration):
    # Append an in-place rotation waypoint (theta given in degrees).
    b.trajectory.add(
        time=prevt + duration,
        x=prevx,
        y=prevy,
        theta=np.deg2rad(theta),
        translational_vel=0,
        rotational_vel=0,
        translational_accel=0,
        rotational_accel=0,
    )
    return prevx, prevy, theta, prevt + duration

def addtranslate(b, prevx, prevy, prevt, deltax, deltay, prevtheta, duration):
    # Append a translation waypoint, keeping the previous heading.
    x = prevx + deltax
    y = prevy + deltay
    t = prevt + duration
    b.trajectory.add(
        time=t,
        x=x,
        y=y,
        theta=prevtheta,
        translational_vel=0,
        rotational_vel=0,
        translational_accel=0,
        rotational_accel=0,
    )
    return x, y, prevtheta, t

x = 0.0
y = 0.0
theta = 0.0
t = 0.0
b.first_step = True
b.trajectory.add(time=0, x=0, y=0, theta=0)
x, y, theta, t = addtranslate(b, x, y, t, 0.5, 0.0, theta, 5)
x, y, theta, t = addrot(b, x, y, t, -90, 5)
x, y, theta, t = addtranslate(b, x, y, t, 0.5, 0.0, theta, 2)  # not executing

b.follow_trajectory()
starttime = time.time()
while time.time() - starttime < t + 3:
    time.sleep(1)
    b.pretty_print()

b.stop()

print('done')
exit(0)

Hi Praveen, thanks for providing the code snippet. We’ll try it out and get back to you on your questions today.

–Aaron


Quick update: We were able to replicate the issue. We’ll need another day or so to push a fix / recommendation.

–Aaron


Thanks Aaron! Looking forward to it 🙂

Hi @aedsinger, is there a fix or documentation for this yet?

Hi,

We’re still sorting out a more general fix. But here’s a start:

The base trajectory controller waypoint can only specify rotation or translation. It won’t do both at the same time.

The first issue is that our DiffDriveTrajectory class was meant to check for this in is_valid() but it wasn’t - so you didn’t get a warning as expected. This will be updated shortly in a Stretch Body release.

The second issue is that you are passing theta in degrees (rather than radians) in the final addtranslate() call. This has the effect of causing the final waypoint to look like both a rotation and translation. The line should be theta=np.deg2rad(prevtheta)

The third issue is that the waypoints are to be specified in the global frame. Your final waypoint is asking it to move in the X direction yet the robot is pointed in the Y direction. This is why you aren’t seeing motion. Try specifying a Y direction waypoint instead.

I can imagine in some cases a local frame waypoint would be more intuitive. We are evaluating if we should extend the API.

Let us know if you still see motion smoothness issues once your waypoint controller is working. If so we can assist with that as well.
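To illustrate those three fixes without the robot attached, here is a sketch that computes the corrected waypoint tuples in pure Python. The fields mirror the arguments to trajectory.add(), but the forward_waypoint and rotate_waypoint helpers are hypothetical, not part of Stretch Body:

```python
import math

def forward_waypoint(x, y, theta, dist, t, duration):
    """Global-frame waypoint for driving `dist` meters along the robot's
    current heading `theta` (radians). Hypothetical helper."""
    return dict(time=t + duration,
                x=x + dist * math.cos(theta),
                y=y + dist * math.sin(theta),
                theta=theta)

def rotate_waypoint(x, y, theta_deg, t, duration):
    """Global-frame waypoint for an in-place rotation to `theta_deg`.
    Note the degrees-to-radians conversion (fix #2)."""
    return dict(time=t + duration, x=x, y=y, theta=math.radians(theta_deg))

# Chain: translate 0.5 m, rotate to -90 deg, translate 0.5 m again.
wp1 = forward_waypoint(0.0, 0.0, 0.0, 0.5, 0.0, 5.0)
wp2 = rotate_waypoint(wp1['x'], wp1['y'], -90.0, wp1['time'], 5.0)
# After the rotation the robot faces -Y, so "forward" moves in -Y in
# the global frame (fix #3): the final waypoint changes y, not x.
wp3 = forward_waypoint(wp2['x'], wp2['y'], wp2['theta'], 0.5, wp2['time'], 2.0)
print(wp3)  # x stays at 0.5, y goes to -0.5, theta stays at -pi/2
```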


Awesome! Thanks a lot for the help.

Also, how will chained translations and rotations be executed if the coordinates are in the global frame and the robot doesn’t reach the desired waypoint? For example, if I send x = 0.4, then theta = pi/2, and finally y = 0.3, will the y = 0.3 waypoint still be executed if the robot only reaches x = 0.3?

How do I keep track of the global coordinates of the robot?

Good question. I’m going to hand this off to @bshah as he can best answer this.

@bshah, would love your comments on this!


Hi all, great to see a bug was found/fixed with the Trajectory API. @flyingMango, just to confirm, you’re asking about what will happen if the translations/rotations in the base trajectory don’t complete perfectly, right?

Yes! If the robot doesn’t reach the desired location, what behavior can we expect?

Gotcha, so Stretch’s software is layered and each layer operates with the information it has. The Trajectory API in Stretch Body knows wheel odometry, so you can expect compounding error in the base’s location as the trajectory executes because wheel slip, controller error, etc. cause each translation/rotation to not execute perfectly. If you’re using the Trajectory API directly, you could say the waypoints are actually defined in a shifting odom frame, not a global frame. The next layer up is responsible for incorporating more information. The Trajectory API was originally designed for MoveIt2, which accesses the lidar and wheel odometry to fuse together into a localization estimate that is closer to the global frame. MoveIt2 can then replan the waypoints as needed to fix imperfect motions.

In short, the solution is to use the other sensors on the robot to estimate how far the robot has moved and correct for noisy movements. Could you share a little bit of what you’re trying to achieve, and I can make some suggestions on how you might proceed?
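As a toy illustration of the compounding error described above (the slip factor and numbers are made up, not Stretch data): if each commanded translation under-shoots slightly, an open-loop waypoint sequence drifts further from the intended pose with every segment:

```python
import math

def execute_open_loop(moves, slip=0.95):
    """Simulate executing (distance_m, turn_rad) segments where each
    translation under-shoots by a constant slip factor. Returns the
    intended and actually reached (x, y) positions. Toy model only."""
    ix = iy = ax = ay = 0.0
    itheta = atheta = 0.0
    for dist, turn in moves:
        itheta += turn
        atheta += turn                      # assume rotation is perfect here
        ix += dist * math.cos(itheta)       # intended pose
        iy += dist * math.sin(itheta)
        ax += slip * dist * math.cos(atheta)  # pose after slip
        ay += slip * dist * math.sin(atheta)
    return (ix, iy), (ax, ay)

intended, actual = execute_open_loop([(0.4, 0.0), (0.0, math.pi / 2), (0.3, 0.0)])
err = math.hypot(intended[0] - actual[0], intended[1] - actual[1])
print(err)  # roughly 0.025 m of drift after just two translated segments
```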

Thanks for the reply @bshah

Would it be possible for you to provide a short snippet that uses the above API and does the following while also handling the fact that the robot may not reach the desired location?

  • Translate the base by 30 cm
  • Rotate the base by 45 degrees
  • Translate by 30 cm
  • Rotate by 180 degrees
  • Translate by 30 cm
  • Rotate by -45 degrees
  • Translate by 30 cm

I’m trying to get the robot to handle complex grasping scenarios where there may be clutter, and hence need access to all of the joints through the trajectory API.
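For reference, the ideal target poses for that sequence (in a perfect odom frame, ignoring slip) can be precomputed with plain trigonometry. A sketch, not using the Stretch API:

```python
import math

# (translate_m, rotate_deg) pairs matching the list above; the rotation
# of each pair is applied after its translation.
SEQUENCE = [(0.30, 45), (0.30, 180), (0.30, -45), (0.30, 0)]

def chain_poses(sequence):
    """Accumulate ideal global poses for translate-then-rotate steps."""
    x = y = theta = 0.0
    poses = [(x, y, theta)]
    for dist, turn_deg in sequence:
        x += dist * math.cos(theta)
        y += dist * math.sin(theta)
        poses.append((x, y, theta))    # pose after the translation
        theta += math.radians(turn_deg)
        poses.append((x, y, theta))    # pose after the rotation
    return poses

for px, py, ptheta in chain_poses(SEQUENCE):
    print('x=%.3f y=%.3f theta=%.1f deg' % (px, py, math.degrees(ptheta)))
```

Interestingly, this particular sequence brings the base back to its starting position (facing backwards), which makes it easy to measure accumulated error on the real robot.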

Yup, I’ll follow up here shortly with a snippet.


Hey @flyingMango,

I apologize for the super delayed response on this thread. I was able to set up an example script for you that works in ROS and utilizes the RPlidar to improve the odometry estimate. The script resides in the feature/accurate_odom branch of the stretch_ros repo. Here’s how it works:

I used the laser_scan_matcher package in ROS which computes the pose of the base frame (base_link) in an odometric frame (odom) using an Iterative Closest Point algorithm. The pose messages are published on a topic called /pose2D.

The two methods base_translate(x) and base_rotate(theta) of the BaseMoveBy class allow you to command the robot in the local frame, where x is a signed distance in meters and theta is an angle in radians between -pi and pi. The sign of theta follows the right-hand rule about the robot’s z-axis, with zero facing the front of the robot.

Before you run the script, you would have to launch the base_move_by.launch file to initiate the stretch_driver node in navigation mode and the laser_scan_matcher node along with the RPlidar driver.

roslaunch stretch_core base_move_by.launch

While this is still a work in progress, the method alleviates the issue you are facing to some extent. However, the issue of error accumulation over successive commands persists, since I am passing commands in the local frame. As a continuation of this, I am implementing a script that lets you command the robot in an odometric frame, allowing better error correction when a preceding command is not executed faithfully. This will resolve the error accumulation over successive commands.
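The correction loop described above can be sketched in pure Python. Here a simulated_drive function stands in for the real robot plus the /pose2D feedback from laser_scan_matcher; the slip factor and tolerance are made-up illustration values:

```python
import math

def simulated_drive(command_m, slip=0.9):
    """Stand-in for the robot: achieves only part of each command.
    On the real robot the achieved travel would come from the
    laser_scan_matcher /pose2D estimate, not from this toy model."""
    return slip * command_m

def translate_closed_loop(target_m, tol=0.005, max_iters=20):
    """Re-command the residual until the measured travel is within
    tolerance of the target: a sketch of feedback error correction."""
    traveled = 0.0
    for _ in range(max_iters):
        residual = target_m - traveled
        if abs(residual) < tol:
            break
        traveled += simulated_drive(residual)
    return traveled

final = translate_closed_loop(0.30)
print(final)  # converges to within 5 mm of the 0.30 m target
```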

Feel free to play with the script while I continue to work on it. Meanwhile, let me know if you have any questions.

Best,
Chintan

Thanks a lot @chintujaguar! Will try it out 🙂