RViz is breaking too often

Dear Team,

RViz is breaking too often. I am trying to run https://github.com/hello-robot/stretch_ros/blob/master/stretch_demos/README.md for a demo presentation and it’s not working. The last time I raised this issue, I was advised to restart the robot, but that is not working this time either. A stable solution to this problem would be greatly appreciated.

Dear @sitaneja, would you attach a log of the output of the command you’ve entered? I recall that this previously occurred because your RealSense camera was stuck in a weird configuration, which was resolved by restarting the robot. Since it’s pretty rare for the camera to get stuck in that configuration, I don’t expect this is the same issue. I’d be happy to help find a stable solution for this issue once I have the log information.

Please find the logs below -

WARNING: Package name "ROAM" does not follow the naming conventions. It should start with a lower case letter and only contain lower case letters, digits, underscores, and dashes.

… logging to /home/hello-robot/.ros/log/1eecf394-f0a3-11ea-8579-94e6f7b7f49e/roslaunch-stretch-re1-1012-25960.log
Checking log directory for disk usage. This may take a while.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

WARNING: Package name “ROAM” does not follow the naming conventions. It should start with a lower case letter and only contain lower case letters, digits, underscores, and dashes.
started roslaunch server http://stretch-re1-1012:36955/

SUMMARY

CLEAR PARAMETERS

  • /ekf_se/

PARAMETERS

  • /camera/realsense2_camera/accel_fps: 63
  • /camera/realsense2_camera/accel_frame_id: camera_accel_frame
  • /camera/realsense2_camera/accel_optical_frame_id: camera_accel_opti…
  • /camera/realsense2_camera/align_depth: True
  • /camera/realsense2_camera/aligned_depth_to_color_frame_id: camera_aligned_de…
  • /camera/realsense2_camera/aligned_depth_to_fisheye1_frame_id: camera_aligned_de…
  • /camera/realsense2_camera/aligned_depth_to_fisheye2_frame_id: camera_aligned_de…
  • /camera/realsense2_camera/aligned_depth_to_fisheye_frame_id: camera_aligned_de…
  • /camera/realsense2_camera/aligned_depth_to_infra1_frame_id: camera_aligned_de…
  • /camera/realsense2_camera/aligned_depth_to_infra2_frame_id: camera_aligned_de…
  • /camera/realsense2_camera/allow_no_texture_points: True
  • /camera/realsense2_camera/base_frame_id: camera_link
  • /camera/realsense2_camera/calib_odom_file:
  • /camera/realsense2_camera/clip_distance: -2.0
  • /camera/realsense2_camera/color_fps: 15
  • /camera/realsense2_camera/color_frame_id: camera_color_frame
  • /camera/realsense2_camera/color_height: 240
  • /camera/realsense2_camera/color_optical_frame_id: camera_color_opti…
  • /camera/realsense2_camera/color_width: 424
  • /camera/realsense2_camera/depth_fps: 15
  • /camera/realsense2_camera/depth_frame_id: camera_depth_frame
  • /camera/realsense2_camera/depth_height: 240
  • /camera/realsense2_camera/depth_optical_frame_id: camera_depth_opti…
  • /camera/realsense2_camera/depth_width: 424
  • /camera/realsense2_camera/device_type:
  • /camera/realsense2_camera/enable_accel: True
  • /camera/realsense2_camera/enable_color: True
  • /camera/realsense2_camera/enable_depth: True
  • /camera/realsense2_camera/enable_fisheye1: True
  • /camera/realsense2_camera/enable_fisheye2: True
  • /camera/realsense2_camera/enable_fisheye: True
  • /camera/realsense2_camera/enable_gyro: True
  • /camera/realsense2_camera/enable_infra1: False
  • /camera/realsense2_camera/enable_infra2: False
  • /camera/realsense2_camera/enable_pointcloud: False
  • /camera/realsense2_camera/enable_sync: True
  • /camera/realsense2_camera/filters: pointcloud
  • /camera/realsense2_camera/fisheye1_frame_id: camera_fisheye1_f…
  • /camera/realsense2_camera/fisheye1_optical_frame_id: camera_fisheye1_o…
  • /camera/realsense2_camera/fisheye2_frame_id: camera_fisheye2_f…
  • /camera/realsense2_camera/fisheye2_optical_frame_id: camera_fisheye2_o…
  • /camera/realsense2_camera/fisheye_fps: 30
  • /camera/realsense2_camera/fisheye_frame_id: camera_fisheye_frame
  • /camera/realsense2_camera/fisheye_height: 480
  • /camera/realsense2_camera/fisheye_optical_frame_id: camera_fisheye_op…
  • /camera/realsense2_camera/fisheye_width: 640
  • /camera/realsense2_camera/gyro_fps: 200
  • /camera/realsense2_camera/gyro_frame_id: camera_gyro_frame
  • /camera/realsense2_camera/gyro_optical_frame_id: camera_gyro_optic…
  • /camera/realsense2_camera/imu_optical_frame_id: camera_imu_optica…
  • /camera/realsense2_camera/infra1_frame_id: camera_infra1_frame
  • /camera/realsense2_camera/infra1_optical_frame_id: camera_infra1_opt…
  • /camera/realsense2_camera/infra2_frame_id: camera_infra2_frame
  • /camera/realsense2_camera/infra2_optical_frame_id: camera_infra2_opt…
  • /camera/realsense2_camera/infra_fps: 30
  • /camera/realsense2_camera/infra_height: 480
  • /camera/realsense2_camera/infra_width: 640
  • /camera/realsense2_camera/initial_reset: False
  • /camera/realsense2_camera/json_file_path:
  • /camera/realsense2_camera/linear_accel_cov: 0.01
  • /camera/realsense2_camera/odom_frame_id: camera_odom_frame
  • /camera/realsense2_camera/pointcloud_texture_index: 0
  • /camera/realsense2_camera/pointcloud_texture_stream: RS2_STREAM_COLOR
  • /camera/realsense2_camera/pose_frame_id: camera_pose_frame
  • /camera/realsense2_camera/pose_optical_frame_id: camera_pose_optic…
  • /camera/realsense2_camera/publish_odom_tf: True
  • /camera/realsense2_camera/publish_tf: True
  • /camera/realsense2_camera/rosbag_filename:
  • /camera/realsense2_camera/serial_no:
  • /camera/realsense2_camera/tf_publish_rate: 0.0
  • /camera/realsense2_camera/topic_odom_in: odom_in
  • /camera/realsense2_camera/unite_imu_method:
  • /camera/realsense2_camera/usb_port_id:
  • /d435i_configure/initial_mode: High Accuracy
  • /ekf_se/acceleration_gains: [0.8, 0.0, 0.0, 0…
  • /ekf_se/acceleration_limits: [1.3, 0.0, 0.0, 0…
  • /ekf_se/base_link_frame: base_link
  • /ekf_se/control_config: [True, False, Fal…
  • /ekf_se/control_timeout: 0.2
  • /ekf_se/debug: False
  • /ekf_se/debug_out_file: /path/to/debug/fi…
  • /ekf_se/deceleration_gains: [1.0, 0.0, 0.0, 0…
  • /ekf_se/deceleration_limits: [1.3, 0.0, 0.0, 0…
  • /ekf_se/frequency: 15
  • /ekf_se/imu0: /imu/data
  • /ekf_se/imu0_config: [False, False, Fa…
  • /ekf_se/imu0_differential: True
  • /ekf_se/imu0_nodelay: False
  • /ekf_se/imu0_queue_size: 1
  • /ekf_se/imu0_relative: True
  • /ekf_se/imu0_remove_gravitational_acceleration: True
  • /ekf_se/initial_estimate_covariance: [‘1e-9’, 0, 0, 0,…
  • /ekf_se/map_frame: map
  • /ekf_se/odom0: /odom
  • /ekf_se/odom0_config: [True, True, True…
  • /ekf_se/odom0_differential: True
  • /ekf_se/odom0_nodelay: False
  • /ekf_se/odom0_queue_size: 1
  • /ekf_se/odom0_relative: False
  • /ekf_se/odom_frame: odom
  • /ekf_se/print_diagnostics: True
  • /ekf_se/process_noise_covariance: [0.05, 0, 0, 0, 0…
  • /ekf_se/publish_acceleration: False
  • /ekf_se/publish_tf: False
  • /ekf_se/sensor_timeout: 0.2
  • /ekf_se/stamped_control: False
  • /ekf_se/transform_time_offset: 0.0
  • /ekf_se/transform_timeout: 0.0
  • /ekf_se/two_d_mode: True
  • /ekf_se/use_control: False
  • /ekf_se/world_frame: odom
  • /funmap/debug_directory: /home/hello-robot…
  • /imu_filter_node/fixed_frame: map
  • /imu_filter_node/use_mag: False
  • /joint_state_publisher/rate: 15.0
  • /joint_state_publisher/source_list: [’/stretch/joint_…
  • /laser_scan_matcher_node/fixed_frame: odom
  • /laser_scan_matcher_node/max_angular_correction_deg: 20.0
  • /laser_scan_matcher_node/max_iterations: 10
  • /laser_scan_matcher_node/max_linear_correction: 0.2
  • /laser_scan_matcher_node/use_imu: False
  • /lidar_node/angle_compensate: True
  • /lidar_node/frame_id: laser
  • /lidar_node/inverted: False
  • /lidar_node/scan_mode: Boost
  • /lidar_node/serial_baudrate: 115200
  • /lidar_node/serial_port: /dev/hello-lrf
  • /robot_description: <?xml version="1…
  • /robot_state_publisher/publish_frequency: 15.0
  • /rosdistro: melodic
  • /rosversion: 1.14.6
  • /stretch_driver/broadcast_odom_tf: False
  • /stretch_driver/controller_calibration_file: /home/hello-robot…
  • /stretch_driver/rate: 25.0
  • /stretch_driver/timeout: 0.5

NODES
/
d435i_accel_correction (stretch_core/d435i_accel_correction)
d435i_configure (stretch_core/d435i_configure)
detect_handover_position (stretch_deep_perception/detect_nearest_mouth_python3.py)
ekf_se (robot_localization/ekf_localization_node)
funmap (stretch_funmap/funmap)
handover_object (stretch_demos/handover_object)
imu_filter_node (imu_filter_madgwick/imu_filter_node)
joint_state_publisher (joint_state_publisher/joint_state_publisher)
keyboard_teleop (stretch_core/keyboard_teleop)
laser_scan_matcher_node (laser_scan_matcher/laser_scan_matcher_node)
lidar_node (rplidar_ros/rplidarNode)
robot_state_publisher (robot_state_publisher/robot_state_publisher)
rviz (rviz/rviz)
stretch_driver (stretch_core/stretch_driver)
/camera/
realsense2_camera (nodelet/nodelet)
realsense2_camera_manager (nodelet/nodelet)

auto-starting new master
process[master]: started with pid [25979]
ROS_MASTER_URI=http://localhost:11311

setting /run_id to 1eecf394-f0a3-11ea-8579-94e6f7b7f49e
WARNING: Package name “ROAM” does not follow the naming conventions. It should start with a lower case letter and only contain lower case letters, digits, underscores, and dashes.
process[rosout-1]: started with pid [25990]
started core service [/rosout]
process[d435i_accel_correction-2]: started with pid [25998]
process[camera/realsense2_camera_manager-3]: started with pid [25999]
process[camera/realsense2_camera-4]: started with pid [26000]
process[d435i_configure-5]: started with pid [26001]
process[joint_state_publisher-6]: started with pid [26002]
process[robot_state_publisher-7]: started with pid [26003]
process[stretch_driver-8]: started with pid [26009]
[ INFO] [1599439398.567242135]: Initializing nodelet with 8 worker threads.
[ WARN] [1599439398.587286879]: The root link base_link has an inertia specified in the URDF, but KDL does not support a root link with an inertia. As a workaround, you can add an extra dummy link to your URDF.
process[funmap-9]: started with pid [26010]
process[imu_filter_node-10]: started with pid [26031]
process[ekf_se-11]: started with pid [26033]
[ INFO] [1599439398.659610037]: Starting ImuFilter
[ INFO] [1599439398.667091268]: Using dt computed from message headers
[ INFO] [1599439398.667186663]: The gravity vector is kept in the IMU message.
[ INFO] [1599439398.689031495]: Imu filter gain set to 0.100000
[ INFO] [1599439398.689090137]: Gyro drift bias set to 0.000000
[ INFO] [1599439398.689112159]: Magnetometer bias values: 0.000000 0.000000 0.000000
process[lidar_node-12]: started with pid [26040]
process[laser_scan_matcher_node-13]: started with pid [26047]
[ WARN] [1599439398.777949534]: Both imu0_differential and imu0_relative were set to true. Using differential mode.
process[detect_handover_position-14]: started with pid [26058]
process[handover_object-15]: started with pid [26067]
[ INFO] [1599439398.807832497]: RPLIDAR running on ROS package rplidar_ros. SDK Version:1.7.0
process[keyboard_teleop-16]: started with pid [26069]
[ INFO] [1599439398.844056629]: Starting LaserScanMatcher
process[rviz-17]: started with pid [26076]
[INFO] [1599439399.304158]: /d435i_accel_correction started
[ INFO] [1599439399.481092461]: RealSense ROS v2.2.14
[ INFO] [1599439399.481133186]: Running with LibRealSense v2.35.2
[ INFO] [1599439399.540027800]:
[ INFO] [1599439399.556622492]: rviz version 1.13.13
[ INFO] [1599439399.556676556]: compiled against Qt version 5.9.5
[ INFO] [1599439399.556692163]: compiled against OGRE version 1.9.0 (Ghadamon)
[ INFO] [1599439399.564545623]: Forcing OpenGl version 0.
06/09 17:43:19,644 WARNING [140365195851520] (messenger-libusb.cpp:42) control_transfer returned error, index: 768, error: No data available, number: 61
06/09 17:43:19,747 WARNING [140365195851520] (messenger-libusb.cpp:42) control_transfer returned error, index: 768, error: No data available, number: 61
[ INFO] [1599439399.949102711]: Stereo is NOT SUPPORTED
[ INFO] [1599439399.949244362]: OpenGl version: 3.1 (GLSL 1.4).
[INFO] [1599439400.245889]: /d435i_configure started
[INFO] [1599439400.923413]: /keyboard_teleop started
[INFO] [1599439400.971230]: For use with S T R E T C H ™ RESEARCH EDITION from Hello Robot Inc.
[INFO] [1599439400.972138]: /stretch_driver started
[INFO] [1599439400.973643]: mode = position
[INFO] [1599439400.975592]: broadcast_odom_tf = False
[INFO] [1599439400.977476]: Loading controller calibration parameters for the head from YAML file named /home/hello-robot/catkin_ws/src/stretch_ros/stretch_core/config/controller_calibration_head.yaml
[INFO] [1599439400.980603]: controller parameters loaded = {‘tilt_looking_up_offset’: -0.008582455603193135, ‘arm_retracted_offset’: 0.003116814480937651, ‘tilt_angle_backlash_transition’: -0.4, ‘tilt_angle_offset’: 0.031408220867074815, ‘pan_angle_offset’: 0.07868705926707772, ‘pan_looked_left_offset’: -0.003570145261352967}
[INFO] [1599439400.982180]: self.head_tilt_calibrated_offset_rad in degrees = 1.7995584977
[INFO] [1599439400.983699]: self.head_pan_calibrated_offset_rad in degrees = 4.5084363983
[INFO] [1599439400.985423]: self.head_pan_calibrated_looked_left_offset_rad in degrees = -0.204554255724
[INFO] [1599439400.987007]: self.head_tilt_backlash_transition_angle_rad in degrees = -22.9183118052
[INFO] [1599439400.988329]: self.head_tilt_calibrated_looking_up_offset_rad in degrees = -0.491738483921
[INFO] [1599439400.989317]: self.wrist_extension_calibrated_retracted_offset_m in meters = 0.00311681448094
[INFO] [1599439401.008854]: /stretch_driver rate = 25.0 Hz
[INFO] [1599439401.010873]: /stretch_driver timeout = 0.5 s
[INFO] [1599439401.016450]: /stretch_driver use_fake_mechaduinos = False
[INFO] [1599439401.018283]: /stretch_driver base_frame_id = base_link
[INFO] [1599439401.019704]: /stretch_driver odom_frame_id = odom
RPLIDAR S/N: ACB39AF2C1EA9FC3BEEB9CF323603203
[ INFO] [1599439401.323847220]: Firmware Ver: 1.25
[ INFO] [1599439401.323885519]: Hardware Rev: 5
[ INFO] [1599439401.325235843]: RPLidar health status : 0
Waiting for service /camera/stereo_module/set_parameters…
/home/hello-robot/.local/lib/python3.6/site-packages/numba/core/errors.py:149: UserWarning: Insufficiently recent colorama version found. Numba requires colorama >= 0.3.9
warnings.warn(msg)
[ INFO] [1599439401.924975622]: current scan mode: Boost, max_distance: 12.0 m, Point number: 8.0K , angle_compensate: 2
MoveBase init: self.debug_directory = None
MaxHeightImage information:
image.shape = (67, 25)
image.dtype = uint8
m_per_pix = 0.006
m_per_height_unit = 0.000276679841897
voi.x_in_m = 0.15
voi.y_in_m = 0.4
voi.z_in_m = 0.07
cv2.version = 4.3.0
Python version (must be > 3.0): 3.6.9 (default, Jul 17 2020, 12:50:27)
[GCC 8.4.0]
Using the following directory for deep learning models: /home/hello-robot/stretch_user/stretch_deep_perception_models/
Not attempting to use an Intel Neural Compute Stick 2.
MaxHeightImage information:
image.shape = (434, 350)
Using the following directory to load object detector models: /home/hello-robot/stretch_user/stretch_deep_perception_models/
image.dtype = uint8
m_per_pix = 0.006
m_per_height_unit = 0.00456692913386
voi.x_in_m = 2.1
voi.y_in_m = 2.6
voi.z_in_m = 1.16
attempting to load neural network from files
prototxt file = /home/hello-robot/stretch_user/stretch_deep_perception_models//head_detection/deploy.prototxt
caffemodel file = /home/hello-robot/stretch_user/stretch_deep_perception_models//head_detection/res10_300x300_ssd_iter_140000.caffemodel
head_detection_model.getLayerNames() = [‘data_bn’, ‘data_scale’, ‘conv1_h’, ‘conv1_bn_h’, ‘conv1_scale_h’, ‘conv1_relu’, ‘conv1_pool’, ‘layer_64_1_conv1_h’, ‘layer_64_1_bn2_h’, ‘layer_64_1_scale2_h’, ‘layer_64_1_relu2’, ‘layer_64_1_conv2_h’, ‘layer_64_1_sum’, ‘layer_128_1_bn1_h’, ‘layer_128_1_scale1_h’, ‘layer_128_1_relu1’, ‘layer_128_1_conv1_h’, ‘layer_128_1_bn2’, ‘layer_128_1_scale2’, ‘layer_128_1_relu2’, ‘layer_128_1_conv2’, ‘layer_128_1_conv_expand_h’, ‘layer_128_1_sum’, ‘layer_256_1_bn1’, ‘layer_256_1_scale1’, ‘layer_256_1_relu1’, ‘layer_256_1_conv1’, ‘layer_256_1_bn2’, ‘layer_256_1_scale2’, ‘layer_256_1_relu2’, ‘layer_256_1_conv2’, ‘layer_256_1_conv_expand’, ‘layer_256_1_sum’, ‘layer_512_1_bn1’, ‘layer_512_1_scale1’, ‘layer_512_1_relu1’, ‘layer_512_1_conv1_h’, ‘layer_512_1_bn2_h’, ‘layer_512_1_scale2_h’, ‘layer_512_1_relu2’, ‘layer_512_1_conv2_h’, ‘layer_512_1_conv_expand_h’, ‘layer_512_1_sum’, ‘last_bn_h’, ‘last_scale_h’, ‘last_relu’, ‘conv6_1_h’, ‘conv6_1_relu’, ‘conv6_2_h’, ‘conv6_2_relu’, ‘conv7_1_h’, ‘conv7_1_relu’, ‘conv7_2_h’, ‘conv7_2_relu’, ‘conv8_1_h’, ‘conv8_1_relu’, ‘conv8_2_h’, ‘conv8_2_relu’, ‘conv9_1_h’, ‘conv9_1_relu’, ‘conv9_2_h’, ‘conv9_2_relu’, ‘conv4_3_norm’, ‘conv4_3_norm_mbox_loc’, ‘conv4_3_norm_mbox_loc_perm’, ‘conv4_3_norm_mbox_loc_flat’, ‘conv4_3_norm_mbox_conf’, ‘conv4_3_norm_mbox_conf_perm’, ‘conv4_3_norm_mbox_conf_flat’, ‘conv4_3_norm_mbox_priorbox’, ‘fc7_mbox_loc’, ‘fc7_mbox_loc_perm’, ‘fc7_mbox_loc_flat’, ‘fc7_mbox_conf’, ‘fc7_mbox_conf_perm’, ‘fc7_mbox_conf_flat’, ‘fc7_mbox_priorbox’, ‘conv6_2_mbox_loc’, ‘conv6_2_mbox_loc_perm’, ‘conv6_2_mbox_loc_flat’, ‘conv6_2_mbox_conf’, ‘conv6_2_mbox_conf_perm’, ‘conv6_2_mbox_conf_flat’, ‘conv6_2_mbox_priorbox’, ‘conv7_2_mbox_loc’, ‘conv7_2_mbox_loc_perm’, ‘conv7_2_mbox_loc_flat’, ‘conv7_2_mbox_conf’, ‘conv7_2_mbox_conf_perm’, ‘conv7_2_mbox_conf_flat’, ‘conv7_2_mbox_priorbox’, ‘conv8_2_mbox_loc’, ‘conv8_2_mbox_loc_perm’, ‘conv8_2_mbox_loc_flat’, ‘conv8_2_mbox_conf’, ‘conv8_2_mbox_conf_perm’, ‘conv8_2_mbox_conf_flat’, ‘conv8_2_mbox_priorbox’, ‘conv9_2_mbox_loc’, ‘conv9_2_mbox_loc_perm’, ‘conv9_2_mbox_loc_flat’, ‘conv9_2_mbox_conf’, ‘conv9_2_mbox_conf_perm’, ‘conv9_2_mbox_conf_flat’, ‘conv9_2_mbox_priorbox’, ‘mbox_loc’, ‘mbox_conf’, ‘mbox_priorbox’, ‘mbox_conf_reshape’, ‘mbox_conf_softmax’, ‘mbox_conf_flatten’, ‘detection_out’]
head_detection_model.getUnconnectedOutLayers() = [[112]]
head_detection_model output layer names = [‘detection_out’]
head_detection_model output layer names = [‘detection_out’]
head_detection_model input layer = <dnn_Layer 0x7f0e42e08c10>
head_detection_model input layer name = data_bn
head_detection_model out_layer = <dnn_Layer 0x7f0e44083cb0>
[INFO] [1599439402.172341]: /funmap started
[INFO] [1599439402.200255]: /handover_object started
head_pose_model.getLayerNames() = [‘angle_p_fc’, ‘angle_r_fc’, ‘angle_y_fc’]
head_pose_model.getUnconnectedOutLayers() = [[1]
[2]
[3]]
head_pose_model output layer names = [‘angle_p_fc’, ‘angle_r_fc’, ‘angle_y_fc’]
head_pose_model output layer names = [‘angle_p_fc’, ‘angle_r_fc’, ‘angle_y_fc’]
head_pose_model input layer = <dnn_Layer 0x7f0e42e08c10>
head_pose_model input layer name = angle_p_fc
head_pose_model out_layer = <dnn_Layer 0x7f0e19034710>
head_pose_model out_layer = <dnn_Layer 0x7f0e15242390>
head_pose_model out_layer = <dnn_Layer 0x7f0e19034710>
head_pose_model.getLayerNames() = [‘angle_p_fc’, ‘angle_r_fc’, ‘angle_y_fc’]
head_pose_model.getUnconnectedOutLayers() = [[1]
[2]
[3]]
head_pose_model output layer names = [‘angle_p_fc’, ‘angle_r_fc’, ‘angle_y_fc’]
head_pose_model output layer names = [‘angle_p_fc’, ‘angle_r_fc’, ‘angle_y_fc’]
head_pose_model input layer = <dnn_Layer 0x7f0e42e08c10>
head_pose_model input layer name = angle_p_fc
head_pose_model out_layer = <dnn_Layer 0x7f0e15242390>
head_pose_model out_layer = <dnn_Layer 0x7f0e15242410>
head_pose_model out_layer = <dnn_Layer 0x7f0e15242390>
landmarks_model.getLayerNames() = [‘align_fc3’]
landmarks_model.getUnconnectedOutLayers() = [[1]]
landmarks_model output layer names = [‘align_fc3’]
landmarks_model output layer names = [‘align_fc3’]
landmarks_model input layer = <dnn_Layer 0x7f0e42e08c10>
landmarks_model input layer name = align_fc3
landmarks_model out_layer = <dnn_Layer 0x7f0e15242390>
[INFO] [1599439402.715300]: /detect_handover_position started
Traceback (most recent call last):
  File "/home/hello-robot/catkin_ws/src/stretch_ros/stretch_core/nodes/stretch_driver", line 611, in <module>
    node.main()
  File "/home/hello-robot/catkin_ws/src/stretch_ros/stretch_core/nodes/stretch_driver", line 557, in main
    self.robot.startup()
  File "/home/hello-robot/.local/lib/python2.7/site-packages/stretch_body/robot.py", line 182, in startup
    self.devices[k].startup()
  File "/home/hello-robot/.local/lib/python2.7/site-packages/stretch_body/dynamixel_X_chain.py", line 50, in startup
    self.motors[mk].startup()
  File "/home/hello-robot/.local/lib/python2.7/site-packages/stretch_body/dynamixel_hello_XL430.py", line 51, in startup
    self.motor.set_return_delay_time(self.params['return_delay_time'])
  File "/home/hello-robot/.local/lib/python2.7/site-packages/stretch_body/dynamixel_XL430.py", line 234, in set_return_delay_time
    dxl_comm_result, dxl_error = self.packet_handler.write1ByteTxRx(self.port_handler, self.dxl_id,XL430_ADDR_RETURN_DELAY_TIME, x)
  File "/opt/ros/melodic/lib/python2.7/dist-packages/dynamixel_sdk/protocol2_packet_handler.py", line 651, in write1ByteTxRx
    return self.writeTxRx(port, dxl_id, address, 1, data_write)
  File "/opt/ros/melodic/lib/python2.7/dist-packages/dynamixel_sdk/protocol2_packet_handler.py", line 641, in writeTxRx
    rxpacket, result, error = self.txRxPacket(port, txpacket)
  File "/opt/ros/melodic/lib/python2.7/dist-packages/dynamixel_sdk/protocol2_packet_handler.py", line 347, in txRxPacket
    rxpacket, result = self.rxPacket(port)
  File "/opt/ros/melodic/lib/python2.7/dist-packages/dynamixel_sdk/protocol2_packet_handler.py", line 258, in rxPacket
    rxpacket.extend(port.readPort(wait_length - rx_length))
  File "/opt/ros/melodic/lib/python2.7/dist-packages/dynamixel_sdk/port_handler.py", line 80, in readPort
    return [ord(ch) for ch in self.ser.read(length)]
  File "/home/hello-robot/.local/lib/python2.7/site-packages/serial/serialposix.py", line 501, in read
    'device reports readiness to read but returned no data '
serial.serialutil.SerialException: device reports readiness to read but returned no data (device disconnected or multiple access on port?)
[stretch_driver-8] process has died [pid 26009, exit code 1, cmd /home/hello-robot/catkin_ws/src/stretch_ros/stretch_core/nodes/stretch_driver cmd_vel:=/stretch/cmd_vel joint_states:=/stretch/joint_states __name:=stretch_driver __log:=/home/hello-robot/.ros/log/1eecf394-f0a3-11ea-8579-94e6f7b7f49e/stretch_driver-8.log].
log file: /home/hello-robot/.ros/log/1eecf394-f0a3-11ea-8579-94e6f7b7f49e/stretch_driver-8*.log
[ WARN] [1599439408.712567268]: Still waiting for data on topic /imu/data_raw…
[ WARN] [1599439418.712497587]: Still waiting for data on topic /imu/data_raw…
[ WARN] [1599439428.712486610]: Still waiting for data on topic /imu/data_raw…
[ WARN] [1599439438.712504418]: Still waiting for data on topic /imu/data_raw…

@sitaneja would you check whether this issue persists after the outcome from this thread? Also, FYI: even if you kill the background xbox teleop process and/or remove it from the autostart programs, you can always start it manually by gaining shell access to the robot (SSH or VNC) and running:

$ stretch_xbox_controller_teleop.py

The issue still persists, even after killing the xbox process.

I took it to a bigger room, and the camera created a map of only the area just in front of it. It doesn’t rotate to create a 360-degree map, and when I clicked on a small target in that section, it didn’t move to reach that target.

It is not creating a proper map and no autonomous task is working. It is very surprising and shocking to see such instability. With this, I can’t even think about developing autonomous task code. Please help get this robot to a stable state so that I can move forward.

Hi all, @sitaneja and I resolved the issue offline a few weeks ago. For posterity’s sake, here’s what we found.


There were four independent issues at play. The first was a transient serial communication error, which produced the serial.serialutil.SerialException traceback shown in the logs above.

If you run into this error, here are a couple of things to try:

  • Ensure that no active or background process is using stretch_body. Only one stretch_body instance can communicate with the hardware at a time; for example, attempting a homing sequence with stretch_robot_home.py while the stretch_ros keyboard teleop (roslaunch stretch_core keyboard_teleop.launch) is running will result in serial transport errors. The Linux commands top, ps, and pkill will help you identify and kill background processes (see the sketch after this list).
  • Ensure that the state of your Startup Applications Preferences looks like the image below.
    [image: screenshot of the Startup Applications Preferences window]
  • A full power cycle helps reset the robot to a working state.
  • Additional information can be found in the troubleshooting guide.
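
As a supplement to the first bullet above, here is a minimal sketch of one way to find leftover processes that may still be holding the robot’s serial ports. It assumes the third-party psutil package is installed (pip3 install psutil), and the "stretch" filter string is only a heuristic I chose for illustration, so confirm what a process is before killing it.

    # Sketch: list candidate processes that may still be using stretch_body.
    # Assumes the third-party psutil package is installed (pip3 install psutil).
    import os
    import psutil

    for proc in psutil.process_iter(['pid', 'name', 'cmdline']):
        cmdline = ' '.join(proc.info['cmdline'] or [])
        # Heuristic filter: anything mentioning "stretch", excluding this script itself.
        if 'stretch' in cmdline.lower() and proc.pid != os.getpid():
            print(proc.info['pid'], cmdline)
    # Kill an offender with: kill <pid>   (or pkill -f <pattern>)

Re-running roslaunch only after this list comes back empty avoids the multiple-access serial errors described above.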

The second issue involved the autonomy demos available in the stretch_ros repo.

These demos are supposed to serve as inspirational examples, and they haven’t been hardened to work 100% of the time in every environment. However, in this case, the reason they were consistently failing was unrelated to the demos themselves. Each Stretch ships with a calibrated kinematic model, which means that Stretch knows information about itself (e.g. the length of its arm). This model is used in the demos to form maps, pick up objects, and perform other autonomous tasks. @sitaneja had changed the robot physically, such that the calibrated model no longer held true. When the model is drastically wrong, where the robot believes its movements and sensing are happening no longer matches reality, leading to consistent failure. Given the importance of a good kinematic model for developing autonomous tasks, we’ve released a set of calibration tools to help users automatically and visually check the quality of their robot’s current kinematic model, as well as perform new calibrations if necessary. More details on all of this are available in the stretch_calibration package’s README.
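
To give a rough sense of why a stale model leads to consistent failure (the numbers below are illustrative only, not measurements from this robot): even a few degrees of unmodeled joint offset becomes centimeters of Cartesian error at the end of a link, which is easily enough to miss a grasp or a click-selected target.

    import math

    # Illustrative only: a small angular error in the kinematic model becomes a
    # sizable Cartesian error at the end of a link (lever-arm effect).
    angle_error_deg = 5.0      # hypothetical uncalibrated joint offset
    reach_m = 0.5              # hypothetical distance from the joint to the target
    lateral_error_m = reach_m * math.sin(math.radians(angle_error_deg))
    print('%.1f deg of offset at %.2f m reach -> %.1f cm of error'
          % (angle_error_deg, reach_m, lateral_error_m * 100.0))   # ~4.4 cm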


The third issue was the cause of the second issue. An offset was introduced in the head pan joint, such that the head would return to an angle away from zero (facing forward) when the robot was homed. The following image contrasts what the angle should look like nominally versus what an incorrect offset from home looks like. This offset was likely introduced by back-driving the joint (physically moving it by hand) while the head was holding a position. @aedsinger wrote a post on which joints can be safely back-driven. Ultimately, this offset is why the kinematic model no longer matched the real robot: when the robot believed it was looking forward, it was actually forming a map skewed to the left, leading to consistent failure to accomplish the tasks.
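
As a rough illustration of that skew (the offset value below is hypothetical, not the measured value on this robot): an uncorrected pan-zero offset acts like an unmodeled yaw rotation applied to every scanned point, so a target directly ahead gets mapped noticeably off to one side.

    import numpy as np

    # Illustrative only: an uncorrected head-pan zero offset rotates every
    # observed point in the map frame, skewing the whole map to one side.
    pan_offset = np.radians(-10.0)                     # hypothetical offset
    R = np.array([[np.cos(pan_offset), -np.sin(pan_offset)],
                  [np.sin(pan_offset),  np.cos(pan_offset)]])
    point = np.array([2.0, 0.0])                       # 2 m directly ahead
    print(R @ point)                                   # ~[1.97, -0.35]: about 35 cm off to the side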

If you are experiencing this issue, follow the instructions outlined in the head pan calibration guide to re-zero the head pan joint.


The last issue was with the robot’s battery life. Even from a full charge, the robot’s battery voltage would drop quickly from 12.7 V to critically low voltages near 10 V. As covered in the safety guide, Stretch will beep rapidly when this happens. Consistently running the robot down to critically low voltage will impact the lifetime of the battery and its ability to hold a charge. The charger included with the robot has a Repair mode that can revive severely discharged batteries. More information on the charger and this Repair mode can be found in the troubleshooting guide.
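
If you want to keep an eye on the voltage while working, here is a minimal sketch using the stretch_body Python API. The pimu status field name ('voltage') and the 11 V warning threshold are assumptions on my part; check them against your installed stretch_body version and the troubleshooting guide rather than treating this as official tooling.

    # Sketch: read the battery voltage via stretch_body and warn when it gets low.
    # Assumption: the Pimu status dict exposes the battery voltage under 'voltage'.
    import stretch_body.robot

    robot = stretch_body.robot.Robot()
    robot.startup()
    robot.pimu.pull_status()
    voltage = robot.pimu.status['voltage']
    print('Battery voltage: %.2f V' % voltage)
    if voltage < 11.0:   # assumed threshold, above the ~10 V critical level noted above
        print('Voltage is getting low -- put the robot on the charger soon.')
    robot.stop()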