Stretch Calibration: detect_aruco_markers error

Hi!

I’ve been following the Stretch calibration instructions here, and on the third step (collecting new observations) I get the following error (plus a warning, which may or may not be relevant):

[ WARN] [1696990073.768834877]: Hardware Notification:Depth stream start failure,1.69699e+12,Error,Hardware Error
[ERROR] [1696990073.857499]: bad callback: <bound method Subscriber.callback of <message_filters.Subscriber object at 0x7f98419b0430>>
Traceback (most recent call last):
  File "/opt/ros/noetic/lib/python3/dist-packages/rospy/topics.py", line 750, in _invoke_callback
    cb(msg)
  File "/opt/ros/noetic/lib/python3/dist-packages/message_filters/__init__.py", line 76, in callback
    self.signalMessage(msg)
  File "/opt/ros/noetic/lib/python3/dist-packages/message_filters/__init__.py", line 58, in signalMessage
    cb(*(msg + args))
  File "/opt/ros/noetic/lib/python3/dist-packages/message_filters/__init__.py", line 240, in add
    self.signalMessage(*msgs)
  File "/opt/ros/noetic/lib/python3/dist-packages/message_filters/__init__.py", line 58, in signalMessage
    cb(*(msg + args))
  File "/home/hello-robot/catkin_ws/src/stretch_ros/stretch_core/nodes/detect_aruco_markers", line 624, in image_callback
    self.aruco_marker_collection.update(self.rgb_image, self.camera_info, self.depth_image, self.rgb_image_timestamp)
  File "/home/hello-robot/catkin_ws/src/stretch_ros/stretch_core/nodes/detect_aruco_markers", line 551, in update
    self.aruco_corners, self.aruco_ids, aruco_rejected_image_points = aruco.detectMarkers(self.gray_image,
AttributeError: module 'cv2.aruco' has no attribute 'detectMarkers'

In an attempt to fix this, I tried replacing my existing ROS1 Noetic workspace (using the instructions here), but this did not seem to help.

Any ideas for what might be going on? Any help will be much appreciated, thanks in advance!

Hey @arjung, Stretch Calibration uses OpenCV to detect the ArUco markers on Stretch’s body, and the ArUco detection API changed in recent OpenCV releases. Sylvia Wang (at the University of Washington) wrote a fix that adopts the new API. We just merged it here.
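For context, here is a minimal sketch (not the merged patch itself) of the nature of the API change: in OpenCV 4.7 and later, the module-level `aruco.detectMarkers()` function was replaced by an `ArucoDetector` object, which is why the old call raises the `AttributeError` above. The dictionary choice (`DICT_6X6_250`) and the placeholder image are illustrative; check `detect_aruco_markers` for what Stretch actually uses.

```python
# Sketch of handling both old and new OpenCV ArUco APIs.
import cv2
import cv2.aruco as aruco

gray_image = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # placeholder input
aruco_dict = aruco.getPredefinedDictionary(aruco.DICT_6X6_250)

if hasattr(aruco, "ArucoDetector"):
    # New API (OpenCV >= 4.7): detection goes through an ArucoDetector object.
    detector = aruco.ArucoDetector(aruco_dict, aruco.DetectorParameters())
    corners, ids, rejected = detector.detectMarkers(gray_image)
else:
    # Old API (OpenCV <= 4.6): calling this on newer OpenCV raises the
    # AttributeError shown in the traceback above.
    corners, ids, rejected = aruco.detectMarkers(
        gray_image, aruco_dict, parameters=aruco.DetectorParameters_create())

print(f"detected {0 if ids is None else len(ids)} markers")
```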

Would you try replacing your Noetic workspace again? The new changes should resolve the error you’re seeing with Stretch Calibration.


Hey @bshah, thanks so much for your response! This fixed the error!

I was able to run the Stretch calibration commands, but I’m getting a calibration error of 0.056, which exceeds the expected threshold (< 0.05). Do you have any ideas for how this calibration error can be reduced?

Furthermore, what initially prompted the need for calibration was the following observation. When the camera is commanded to “look ahead” (as described here) and the robot base is aligned with the tiles (which are aligned with the cabinets), I get the following image:

As is evident, the line where the cabinets meet the ground does not appear to be horizontal, which suggests a camera calibration issue.

Now, when I rotate the yaw of the camera 8 degrees to the left, I get the following image:

Here, while the line where the cabinets meet the ground does appear horizontal, the ceiling line is noticeably tilted. It seems there may be some rotation about one of the other axes, even though only the yaw was changed after the camera was put into “look ahead” mode.

I’m hoping that reducing the calibration error and using the updated URDF may help resolve this. What do you think?

Thanks once again for all your help!

Hi @arjung,

I suspect calibration would not help here. Calibration solves the problem of matching where the robot “thinks” its body is to where it sees its body, but it doesn’t calibrate with respect to the world (e.g. leveling its view with the world).

I think there are two explanations for the skew you are seeing in those images.

  1. Likely, those images are the distorted images that the RealSense libraries return by default. You can undistort the RGB image using OpenCV and the camera model reported in the /camera/color/camera_info topic.
  2. The camera in the head sees the world with a slight “roll”. This could happen if Stretch’s mast were tilted slightly with respect to its mobile base, if the camera were tilted slightly with respect to the pan/tilt assembly, if the floor of the environment weren’t completely flat (so the mobile base is tilted with respect to the world), or some combination of those. Estimating the tilt contributed by each of these sources is tricky, so if my use case required a level view of the world, I would correct for the roll by rotating the image. You could do this via guess and check, or use the IMU in the camera: gravity points downward on the accelerometer, which lets you estimate the amount of roll the camera experiences (see the sketch after this list).
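To make those suggestions concrete, here is a rough, untested sketch (not part of stretch_ros) that combines both ideas: it undistorts the color image with the intrinsics from /camera/color/camera_info, estimates roll from the camera’s accelerometer, and rotates the image to level it. The accelerometer topic name (/camera/accel/sample, which requires the accel stream to be enabled) and the choice of axes and sign in the roll calculation are assumptions that depend on your launch files and IMU mounting.

```python
#!/usr/bin/env python3
# Sketch: undistort the RGB image and level it using roll estimated from the IMU.
import math

import cv2
import numpy as np
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import CameraInfo, Image, Imu


class LevelUndistortedView:
    def __init__(self):
        self.bridge = CvBridge()
        self.camera_info = None
        self.roll_rad = 0.0
        rospy.Subscriber('/camera/color/camera_info', CameraInfo, self.info_callback)
        rospy.Subscriber('/camera/accel/sample', Imu, self.accel_callback)
        rospy.Subscriber('/camera/color/image_raw', Image, self.image_callback)

    def info_callback(self, msg):
        self.camera_info = msg

    def accel_callback(self, msg):
        # Gravity dominates the accelerometer reading when the robot is still,
        # so the direction of the measured acceleration gives the camera roll.
        # Which two axes to use (and the sign) depends on how the IMU frame is
        # mounted; the x/y choice here is an assumption to check for your setup.
        a = msg.linear_acceleration
        self.roll_rad = math.atan2(a.x, a.y)

    def image_callback(self, msg):
        if self.camera_info is None:
            return
        rgb = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        # Undistort using the pinhole model published in camera_info.
        K = np.array(self.camera_info.K, dtype=np.float64).reshape(3, 3)
        D = np.array(self.camera_info.D, dtype=np.float64)
        undistorted = cv2.undistort(rgb, K, D)
        # Rotate about the image center by the estimated roll to level the view.
        h, w = undistorted.shape[:2]
        M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0),
                                    math.degrees(self.roll_rad), 1.0)
        leveled = cv2.warpAffine(undistorted, M, (w, h))
        cv2.imshow('leveled undistorted view', leveled)
        cv2.waitKey(1)


if __name__ == '__main__':
    rospy.init_node('level_undistorted_view')
    LevelUndistortedView()
    rospy.spin()
```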

Hey @bshah, thanks very much for the pointers! I will try out some of these ideas.

It still seems to me that I should resolve the calibration issue, whether or not that is the source of the skew in my images. Do you have any suggestions for how I can go about doing this?

Thanks for your help!

Hey @arjung, at the moment, the main suggestion I can give you is to try the calibration procedure in a controlled environment. If you have multiple Stretch robots, make sure your robot doesn’t accidentally detect the ArUco markers on another robot. Try to ensure the room is evenly lit and doesn’t have harsh overhead lighting. If your calibration error remains high in the controlled environment, we can schedule a support call to determine if there’s another cause of error. In the future, we’d like to provide better debugging tools to help Stretch self-diagnose issues during the calibration procedure.


Hey @bshah, thanks so much for your help! Unfortunately, I’m still seeing a calibration error > 0.05, even after re-doing the calibration in different rooms and lighting conditions. Would it be possible to schedule a support call sometime this week to determine if there’s another cause of error?

Yes, that sounds like the best way to get to the bottom of it. Could you reach out to support@hello-robot.com so we can schedule a meeting?


Hey @bshah, I reached out to support@hello-robot.com right after your response and have yet to hear back. Would it be possible for you to check if my email was received? Sorry to bother you once again about this.

Hi @bshah, the calibration error is finally < 0.03 (0.021)! It turned out that the rooms I had tested calibration in were indeed too brightly lit with overhead lights. Re-doing the calibration procedure in a darker room with no overhead lights resolved the issue! Thanks once again for all your help!


Really glad to hear that @arjung! Thanks for letting me know.

Also, for anyone else reading this thread in the future: another issue we discovered over a video call was that the “Checking the current calibration” instructions were being run instead of starting a new calibration, which explains why the final error was always around 0.05, even in different environments. So make sure you’re running the instructions for “Creating a new calibration” instead of just checking how good your existing one is.