Inverse Kinematics Tutorial & Workshop Recording


A new tutorial for inverse kinematics on Stretch has been released on GitHub! You can preview the tutorial there, and instructions to run it are in the notebook itself.

Additionally, I’m hosting a workshop in 2 weeks to go through the tutorial and answer any questions. Here are the event details:

Inverse Kinematics Workshop

Date: March 29, 2023 at 11am PT
Meeting Link:
Agenda: Inverse Kinematics Workshop - Google Docs


The recording of the workshop is on YouTube at Inverse Kinematics Workshop - Part 1 - YouTube!


Hi @bshah, thanks so much for this, it's extremely helpful! I was wondering – is there something similar for forward kinematics? I tried looking for a tutorial or Python script for this, but had no luck. Thanks!

Hey @arjung, glad to hear you found it helpful! That’s a good question. The IK tutorial actually includes forward kinematics. Set everything up the same way as in the tutorial, and you can do FK using:
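As a generic illustration (this numpy sketch is not the tutorial's code; the toy chain, link length, and joint values are invented), forward kinematics amounts to composing each joint's homogeneous transform in order, which is what IKPy's `forward_kinematics` method does over the chain built in the tutorial:

```python
import numpy as np

def rot_z(theta):
    """Homogeneous transform for a rotation about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0., 0.],
                     [s,  c, 0., 0.],
                     [0., 0., 1., 0.],
                     [0., 0., 0., 1.]])

def trans(x, y, z):
    """Homogeneous transform for a pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Toy chain: a base yaw joint, a fixed 0.5 m link, then a prismatic
# extension d. Composing the transforms in order gives the end-effector pose.
def fk(theta, d):
    return rot_z(theta) @ trans(0.5 + d, 0.0, 0.0)

T_ee = fk(np.pi / 2, 0.25)
print(np.round(T_ee[:3, 3], 3))  # end-effector position in the base frame
```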


ROS has its own way of doing FK through a library called TF2. If you’re using ROS, check out this example:

Hi Binit, I was able to execute the IK on the Stretch and have it working. The mentioned goal point is with respect to which origin? Thank you

I missed the part that said the goal is defined with respect to base link. Thank you for the tutorial.

I have a question: If my goal pose is a 3D point in the optitrack world frame, how would I implement this?

Hi @Shashank_Shiva, would you open a new thread? I’d like to keep this thread focused on the IK tutorial.

Short answer: you’d need to transform the point into the robot’s frame, which requires you to also know where the robot’s frame is with respect to the optitrack world frame. You might do this by attaching fiducials to the robot, so optitrack can track where the robot is as well. Happy to elaborate in the new thread.
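To make that concrete, here is a minimal numpy sketch (the pose and point values are invented; in practice the base pose would come from optitrack tracking fiducials on the robot):

```python
import numpy as np

# Hypothetical pose of the robot's base in the optitrack world frame,
# e.g. estimated from fiducials: 90 degree yaw, base at (1, 2, 0).
T_world_base = np.array([[0., -1., 0., 1.],
                         [1.,  0., 0., 2.],
                         [0.,  0., 1., 0.],
                         [0.,  0., 0., 1.]])

# A goal point measured in the optitrack world frame (homogeneous coordinates).
p_world = np.array([2.0, 2.0, 0.5, 1.0])

# Express the goal in the robot's base frame by inverting the base pose.
p_base = np.linalg.inv(T_world_base) @ p_world
print(np.round(p_base[:3], 3))
```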


I was able to transform the point. Thank you.

@bshah I have created a new thread for IK with fixed base.

Hi, thank you for the helpful tutorial. I want to add rotation of the base to this model, which I did with the following code. The joint tree seems to work, but I want to make sure whether I need to add any offsets in joint_base_rotation in order for this to accurately represent the motion of the base.

joint_base_translation = urdfpy.Joint(name='joint_base_translation',
                                      parent='base_link',
                                      child='link_base_translation',
                                      joint_type='prismatic',
                                      axis=np.array([1.0, 0.0, 0.0]),
                                      origin=np.eye(4, dtype=np.float64),
                                      limit=urdfpy.JointLimit(effort=100.0, velocity=1.0, lower=-1.0, upper=1.0))
link_base_translation = urdfpy.Link(name='link_base_translation',
                                    inertial=None,
                                    visuals=None,
                                    collisions=None)
# re-parent the mast onto the new virtual translation link
for j in modified_urdf._joints:
    if j.name == 'joint_mast':
        j.parent = 'link_base_translation'
joint_base_rotation = urdfpy.Joint(name='joint_base_rotation',
                                   parent='base_link',
                                   child='link_base_rotation',
                                   joint_type='revolute',
                                   axis=np.array([0.0, 0.0, 1.0]),
                                   origin=np.eye(4, dtype=np.float64),
                                   limit=urdfpy.JointLimit(effort=100.0, velocity=1.0, lower=-3.14, upper=3.14))
link_base_rotation = urdfpy.Link(name='link_base_rotation',
                                 inertial=None,
                                 visuals=None,
                                 collisions=None)

# amend the chain: the rotation link becomes the parent of the translation joint
for j in modified_urdf._joints:
    if j.name == 'joint_base_translation':
        j.parent = 'link_base_rotation'
print(f"name: {modified_urdf.name}")
print(f"num links: {len(modified_urdf.links)}")
print(f"num joints: {len(modified_urdf.joints)}")

Hi @Michal, I haven’t run your code, but I believe the joint tree would cause errors since there is a loop. The tree would be:

This wouldn’t work since joint_base_translation has two parents.

The tree actually works and the inverse kinematics runs, but I cannot get the base to rotate with the command robot.base.rotate_by(q_base_rotate). I can see that the inverse kinematics solves for a base rotation, but this command doesn't result in any movement. All of the other joints run fine. Is there a way I can troubleshoot this?

I’d suggest troubleshooting this by printing out the solution that IKPy comes up with and visualizing it in Matplotlib as done in the notebook. Printing out the solution would help you determine if the rotation/translation are zero, in which case it would make sense that the robot doesn’t move. Visualizing the solution would help you determine if IKPy is actually able to converge, or if it is just timing out and returning the closest it can find. For example, the image below visualizes a solution that has converged to achieve the target point:

One thing I'll note is that I wouldn't expect a virtual base rotation joint followed by a virtual translation joint to model a differential drive base well. For example, that configuration couldn't model a solution where the base has translated forward and then rotated in place. It might work for your use case, but in my experience, throwing more DoFs into the chain makes it harder for IKPy to find a solution.


Hi @bshah,

Thanks for the great tutorial! Are you planning to upload the recordings of the other workshops soon as well? I am sure the Aruco Path Planning workshop will be quite beneficial for my project. The time shift to Europe makes it quite hard to attend the workshops live unfortunately :confused:


Hi @roboor, thank you! Unfortunately, I won’t be able to upload the recording of the Reach to Aruco workshop series, as I’ve discovered that the audio was distorted and sections of video weren’t recorded by my screen recording software. However, it may be helpful to look at the code that was made publicly available as part of the workshop, on the feature/reach_to_aruco branch of the Stretch ROS repository.

We may revisit the Reach to Aruco workshop series in the future if there's interest. In the meantime, I'd recommend starting a discussion here on the forum about Aruco and path planning related problems. The resulting discussion would likely benefit anyone working with Aruco markers!


I didn't solve the problem of extending this example, but I think I understand why it is likely not possible the way I was approaching it. In short, you can't in general use inverse kinematics to solve for joints given an end-effector position for a non-holonomic mobile manipulator like Stretch, because the measure of the traveled distance of each wheel is not sufficient to calculate the final position of the robot. The final position depends on the path taken, and the IK iterations don't take this information into account. On the bright side, you can still formulate a full-body Jacobian to map between velocities instead of positions. This is useful for task-space control if you have a given trajectory you want to follow. Here are some resources if anyone is interested.

Resource on non-holonomic constraints in mobile robotics

Resource on formulating the full body jacobian for a mobile manipulator
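The path-dependence point can also be checked numerically. In this sketch (velocities and durations are made up), two differential-drive motion profiles give each wheel the same total signed travel, yet the base ends up in different places:

```python
import numpy as np

def drive(pose, v, w, t=1.0, steps=1000):
    """Integrate differential-drive kinematics (v: forward speed, w: yaw rate)."""
    x, y, th = pose
    dt = t / steps
    for _ in range(steps):
        x += v * np.cos(th) * dt
        y += v * np.sin(th) * dt
        th += w * dt
    return (x, y, th)

start = (0.0, 0.0, 0.0)
# Profile A: drive forward 1 m, then rotate 90 degrees in place.
a = drive(drive(start, v=1.0, w=0.0), v=0.0, w=np.pi / 2)
# Profile B: rotate 90 degrees in place, then drive forward 1 m.
b = drive(drive(start, v=0.0, w=np.pi / 2), v=1.0, w=0.0)
# Each wheel travels the same total signed distance in both profiles,
# but the final base positions differ:
print(np.round(a[:2], 3), np.round(b[:2], 3))
```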


Thanks @Michal! In the past, I haven’t had a good way to explain why you can’t formulate position tracking IK for a differential drive. I found your explanation to be intuitive:

“because the measure of the traveled distance of each wheel is not sufficient to calculate the final position of the robot”

Since the manipulator Jacobian is a mapping from change in joint space to change in task space, it makes sense that it wouldn't work if you can't know the change in task space (i.e., the final position of the robot) from the change in joint space (i.e., the distance traveled by each wheel).
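To put numbers on that mapping (a generic planar 2-link arm with invented link lengths, not Stretch's geometry), the Jacobian sends joint velocities to end-effector velocity:

```python
import numpy as np

# Planar 2-link arm; link lengths are made up for illustration.
l1, l2 = 0.4, 0.3

def jacobian(q1, q2):
    """Manipulator Jacobian: maps joint velocities to end-effector velocity."""
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

qdot = np.array([0.1, -0.2])            # joint velocities in rad/s
xdot = jacobian(0.0, np.pi / 2) @ qdot  # resulting end-effector velocity
print(np.round(xdot, 4))
```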

I also appreciate the resources you've linked to. You can indeed develop velocity tracking IK for a differential drive robot, or a full mobile manipulator like Stretch, and compute the trajectory the robot will follow separately. It's also easier for a geometric planner to create an obstacle-avoiding trajectory than it is to add obstacle-avoidance constraints to the IK formulation. As far as I know, the IKPy library used in this tutorial doesn't support velocity tracking or adding non-kinematic constraints.