Suppose we have a fiducial marker (AR tag) in the world and an end effector pose defined as some offset with respect to that marker.
Is there a currently implemented method to compute (and ideally move) both the Stretch base and arm actuators to match the given end effector pose? Essentially, inverse kinematics that optimizes for the robot base pose, lift, arm reach, and wrist angles?
Is there a related method for when the target end effector pose is defined with respect to the robot’s base pose (determined via the depth camera), rather than a fiducial marker?
Hi @Zackory, welcome to the forum! For the base + arm problem, there are a number of planning and navigation packages that can help. Here are some options:
Stretch FUNMAP (Fast Unified Navigation, Mapping, and Planning) is designed with Stretch’s kinematic configuration in mind and generates sane, predictable trajectories. It can plan and move Stretch so that the Stretch Gripper reaches a given world point (it does not take orientation right now). The TF2 library can help you transform goals from the fiducial frame to the map frame in ROS. The “Publish Point” tool in RViz is a quick way to try it out; see instructions here. For your reference, the hand-over-object demo defines a gripper point with respect to a detected mouth and uses FUNMAP to achieve it.
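Here’s a minimal sketch of that TF2 step, assuming ROS 1 Python; the frame name `aruco_tag` is a placeholder for whatever frame your fiducial detector publishes, so check your TF tree first:

```python
# Minimal sketch (ROS 1 Python): transform a goal pose from a fiducial's
# frame into the map frame so FUNMAP can plan to it. 'aruco_tag' is an
# assumed frame name; inspect your TF tree for the real one.
import rospy
import tf2_ros
import tf2_geometry_msgs  # registers PoseStamped with tf2's transform()
from geometry_msgs.msg import PoseStamped

rospy.init_node('fiducial_goal_transformer')
tf_buffer = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(tf_buffer)
rospy.sleep(1.0)  # give the listener time to fill the buffer

# Goal pose defined as an offset with respect to the marker
goal = PoseStamped()
goal.header.frame_id = 'aruco_tag'  # assumed marker frame name
goal.header.stamp = rospy.Time(0)   # use the latest available transform
goal.pose.position.z = 0.1          # e.g. 10 cm out along the marker's normal
goal.pose.orientation.w = 1.0

# Express the same pose in the map frame, where FUNMAP goals live
goal_in_map = tf_buffer.transform(goal, 'map', rospy.Duration(1.0))
rospy.loginfo('Goal in map frame: %s', goal_in_map.pose.position)
```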
In ROS land, the de facto standard is MoveIt. It can solve the IK for a target gripper pose, plan in a cluttered environment, and coordinate navigation. End effectors beyond the Stretch Gripper can also be supported. We’ve partnered with PickNik Robotics (@DavidLu leads the project) to bring first-class support for Stretch RE1 to MoveIt2. It’s good to have this on your radar, but the project is still early in development, so I don’t recommend delving into it yet. Updates will be posted here on the forum.
The ROS Navigation Stack is commonly used to get Stretch’s base close enough, after which another method achieves the target gripper pose. Once Stretch is in a known position, computing the base translation, lift, arm, and wrist joints is straightforward thanks to Stretch’s Cartesian configuration. This is how the grasp object demo works.
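To illustrate that decomposition, here’s a minimal Stretch Body sketch, assuming the target is a point in the base frame (x forward, y left, z up) and the arm extends to the robot’s right; the gripper offset is an assumed placeholder, not a calibrated value:

```python
# Minimal sketch using Stretch Body: once the base is positioned so the
# telescoping arm's extension axis passes through the target, the
# remaining joints decompose directly. Frame conventions and the gripper
# offset below are assumptions, not calibrated values.
import time
import stretch_body.robot

robot = stretch_body.robot.Robot()
robot.startup()

target_x, target_y, target_z = 0.3, -0.5, 0.8  # hypothetical target (m)
GRIPPER_REACH = 0.20                           # assumed wrist-to-grasp offset (m)

# Drive the base forward/back so the arm axis lines up with the target.
robot.base.translate_by(target_x)
robot.push_command()
time.sleep(3.0)  # crude wait for the base to settle; a sketch, not production code

# Lift to the target height, extend the arm to close the lateral gap,
# and point the wrist straight out along the arm.
robot.lift.move_to(target_z)
robot.arm.move_to(-target_y - GRIPPER_REACH)
robot.end_of_arm.move_to('wrist_yaw', 0.0)
robot.push_command()
robot.stop()
```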
The previous three options use ROS packages; however, there are promising Python options as well. IKPy is one that could be used in conjunction with Stretch Body to achieve your goal.
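As a rough sketch of that combination, assuming you have Stretch’s URDF on disk (the path, the `base_elements` selection, and the index-to-joint mapping below are all placeholders you’d need to verify against `chain.links`):

```python
# Minimal sketch: solve IK with IKPy against Stretch's URDF, then execute
# the solution with Stretch Body. Everything marked "assumed" depends on
# your URDF; print chain.links to confirm the link ordering.
import numpy as np
from ikpy.chain import Chain
import stretch_body.robot

# You may need base_elements=[...] to select the gripper chain, since
# IKPy chains are single unbranched paths through the URDF.
chain = Chain.from_urdf_file('/path/to/stretch.urdf')  # assumed path

target = np.array([0.1, -0.4, 0.7])   # target grasp point in the base frame (m)
q = chain.inverse_kinematics(target)  # joint vector, ordered as chain.links

robot = stretch_body.robot.Robot()
robot.startup()
# The index-to-joint mapping below is illustrative only.
robot.lift.move_to(q[1])
robot.arm.move_to(sum(q[2:6]))        # telescoping segments sum to one extension
robot.end_of_arm.move_to('wrist_yaw', q[6])
robot.push_command()
```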
Finally, depending on what you had in mind, visual servoing may be a valid option. If you’re interested in this avenue, I can provide more information.
These options should also apply to the problem of achieving target gripper poses that are defined with respect to the base frame. Let me know if you’d like to know more about any of these options or have another approach in mind.
Hi @Zackory, since my original response, there’s been a lot of development on MoveIt2 and it ships on all robots now, so I can wholeheartedly recommend using it. The getting started tutorials are here.
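For a flavor of what a pose goal looks like with MoveIt2’s Python API (moveit_py), here’s a minimal sketch; the planning-group name `stretch_arm` and end-effector link `link_grasp_center` are assumptions, so check the names in the Stretch MoveIt config, and note the node must be launched with the robot’s MoveIt parameters loaded:

```python
# Minimal sketch of a pose goal with MoveIt2's Python API (moveit_py).
# Group and link names are assumptions; verify against your MoveIt config.
import rclpy
from geometry_msgs.msg import PoseStamped
from moveit.planning import MoveItPy

rclpy.init()
stretch = MoveItPy(node_name='moveit_py')
arm = stretch.get_planning_component('stretch_arm')  # assumed group name

goal = PoseStamped()
goal.header.frame_id = 'base_link'
goal.pose.position.x = 0.1
goal.pose.position.y = -0.4
goal.pose.position.z = 0.7
goal.pose.orientation.w = 1.0

arm.set_start_state_to_current_state()
arm.set_goal_state(pose_stamped_msg=goal, pose_link='link_grasp_center')
plan_result = arm.plan()
if plan_result:
    stretch.execute(plan_result.trajectory, controllers=[])
```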
I don’t think we have a tutorial for IK with a fixed base. Let me know if this is of interest and I can create one.
Good timing, it is available @Zackory! See the announcement post below. I’m also hosting a workshop to go through the Jupyter notebook and answer questions. Would be great to see your students there.
@bshah I came across this thread. I am working with the Stretch for a project, keeping the base fixed and moving the gripper. Is it possible to derive the IK for the joint states by giving the Cartesian points of the end effector in the world frame?
Hi @Shashank_Shiva, welcome to the forum. Yes, you can do that! I recommend using an IK library like IKPy instead of deriving it. Check out the tutorial/workshop to see how this is done. Happy to answer any questions too.
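For the fixed-base case, one pattern (a sketch with placeholder names) is to build the IKPy chain once to discover its links, then rebuild it with an `active_links_mask` that freezes anything the solver shouldn’t move:

```python
# Minimal sketch of fixed-base IK with IKPy. The URDF path and frozen link
# names are placeholders; print chain.links after the first pass to get
# the real names for your robot.
import numpy as np
from ikpy.chain import Chain

URDF = '/path/to/stretch.urdf'
chain = Chain.from_urdf_file(URDF)    # first pass: discover the links
frozen = {'Base link'}                # hypothetical names to keep fixed
mask = [link.name not in frozen for link in chain.links]
chain = Chain.from_urdf_file(URDF, active_links_mask=mask)

# Target grasp point (m). With the base fixed, a world-frame target is
# just a constant, known transform away from the base frame.
q = chain.inverse_kinematics(np.array([0.0, -0.45, 0.6]))
print(list(zip((link.name for link in chain.links), q)))
```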