How to update the goal when there is an obstacle

Hi, I am using the navigation stack to move the robot through a map, visiting different goals to search for an object. The problem is when I find the object on a table, for example: how do I tell the robot to go to an accessible goal near the object? How do I know that a point in the map is accessible? I have tried different approaches, but they didn't work.

Hi @Humberto_Andres_Hida,

Can you please give me more details about your issue? I understand that you may be using ROS1; what are the things you have tried so far for Stretch to achieve this goal? To address what you are asking: the NavStack does provide collision avoidance to its users.

Let's take the table example: if you choose a point on the table as a goal, Stretch will move near the table but it will never touch it, as long as the local and global planners are configured to avoid obstacles. You might need to tune the planner parameters to suit your specific environment. You can follow the Basic Navigation Tuning Guide in the ROS documentation; it walks through a series of steps checking three components: the robot setup (sensors, odometry, localization), the costmap, and the local planner. This applies to the Navigation Stack.
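On your other question, "how do I know a point in the map is accessible": one option is to look up that point's cell in the global costmap grid before sending it as a goal. This isn't a stock NavStack tool, just a sketch of the index math, assuming a `nav_msgs/OccupancyGrid`-style layout (row-major `int8` data, `-1` = unknown, `0` = free, `100` = occupied) and an arbitrary free-space threshold of 50 that you would tune:

```python
# Sketch: decide whether a world-frame point lands on a free cell of an
# occupancy grid (nav_msgs/OccupancyGrid layout assumed).

def world_to_cell(x, y, origin_x, origin_y, resolution):
    """Convert world coordinates (meters) to (col, row) grid indices."""
    col = int((x - origin_x) / resolution)
    row = int((y - origin_y) / resolution)
    return col, row

def is_accessible(x, y, grid, width, height,
                  origin_x, origin_y, resolution, cost_threshold=50):
    """True if the point is inside the grid on a known, low-cost cell."""
    col, row = world_to_cell(x, y, origin_x, origin_y, resolution)
    if not (0 <= col < width and 0 <= row < height):
        return False                      # outside the map
    cost = grid[row * width + col]
    if cost < 0:
        return False                      # unknown space
    return cost < cost_threshold

# Tiny 3x3 example map, 0.5 m/cell, origin at (0, 0):
# free everywhere except the occupied centre and one unknown corner.
grid = [0,   0, 0,
        0, 100, 0,
        0,   0, -1]
print(is_accessible(0.25, 0.25, grid, 3, 3, 0.0, 0.0, 0.5))  # -> True  (free)
print(is_accessible(0.75, 0.75, grid, 3, 3, 0.0, 0.0, 0.5))  # -> False (occupied)
```

In a live system you would fill `width`, `height`, `origin_*`, `resolution`, and `grid` from the costmap grid message (for example, the one published by move_base's global costmap) rather than hard-coding them.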

For Stretch we also have the FUNMAP navigation planner, which is the one I suggest you use. The main reason is that the ROS NavStack relies on the LiDAR, while FUNMAP uses the camera to detect edges, so you get better perception of things that are higher up. Take a look at our FUNMAP documentation for more information. Also, you point a goal in FUNMAP the same way you do with the Navigation Stack.

I’ll be happy to answer any questions that may come up as you progress through this issue!


Yes, I use ROS1. I tried with the threshold of the cell values in the map's yaml file and with the costmap (maybe I made a mistake with those methods) using the NavStack, because I was using actionlib to send goals to the robot based on world coordinates. But since the object/goal is on the table, the Navigation Stack only identifies the table legs.
So, can I send goals based on world coordinates with actionlib when using FUNMAP?

Hello @Humberto_Andres_Hida ,

Short answer: yes. With FUNMAP, after you press the spacebar and it finishes scanning the room, you can send the goal and Stretch will go for it. @Mohamed_Fazil will take over from here; he can give you a more detailed explanation about this issue.

Hi @Humberto_Andres_Hida,

Yes, you can send the goal pose using the MoveBaseGoal message type to the MoveBaseAction server in the /move_base namespace. But before sending the goal, you will need to have the FUNMAP-generated MHI (Maximum Height Image) map available on the /map topic, so that you can send your goal pose in the map's frame.

Generating the Map using FUNMAP Headscan Service

  1. Launch the FUNMAP node along with the Stretch driver, the D435i, and other necessary nodes. For this you can launch stretch_driver.launch and mapping.launch:
# Terminal 1
roslaunch stretch_core stretch_driver.launch

# Terminal 2
roslaunch stretch_funmap mapping.launch 
  2. Perform a head scan by calling the service /funmap/trigger_head_scan, which will make the robot's head rotate around from two different base rotations. It will take a few seconds for the processing of the 3D MHI map to complete. The result is visualized in RViz. (rviz file)
rosservice call /funmap/trigger_head_scan

The final output will look something similar to the image below:


(Source)
The RViz visualization will look similar to the image below:

For sending a MoveBase goal, you can either use the "2D Nav Goal" feature in RViz to set a goal pose on the generated map, or programmatically send a goal pose by creating a MoveBase action client in the /move_base namespace. The robot should then be able to navigate to the goal pose.
Example of action client snippet:

import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('send_move_base_goal')

# Create a MoveBase action client and wait for the server
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

# Create a goal in the map frame and send it
goal = MoveBaseGoal()
goal.target_pose.header.frame_id = "map"
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = X  # goal x coordinate in the map frame
goal.target_pose.pose.position.y = Y  # goal y coordinate in the map frame
goal.target_pose.pose.orientation.w = 1.0
client.send_goal(goal)
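One detail worth noting: setting only orientation.w = 1.0 leaves the goal facing along the map's x axis. If you want the robot to arrive facing a particular direction (say, toward the table), you need a quaternion for that yaw. A yaw-only quaternion is simple enough to compute by hand (tf.transformations.quaternion_from_euler gives the same result):

```python
import math

def yaw_to_quaternion(yaw):
    """Quaternion (x, y, z, w) for a rotation of `yaw` radians about the z axis."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# Face the +y direction of the map frame (90 degrees).
qx, qy, qz, qw = yaw_to_quaternion(math.pi / 2)
print(round(qz, 3), round(qw, 3))  # -> 0.707 0.707
```

You would then assign these four values to goal.target_pose.pose.orientation before sending the goal.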

You can read more about accessing saved MHI maps and using them in this README.
Let us know if you have more doubts.

Maybe the method with the costmap doesn't work because of the following warning that I see when I launch navigation.launch:

roslaunch stretch_navigation navigation.launch map_yaml:=${HELLO_FLEET_PATH}/maps/map.yaml
... logging to /home/hello-robot/.ros/log/b3dfc250-983a-11ee-abec-87934603d30e/roslaunch-stretch-re2-2005-10835.log
Checking log directory for disk usage. This may take a while.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://172.16.0.205:36825/

SUMMARY
========

PARAMETERS
 * /amcl/gui_publish_rate: 10.0
 * /amcl/kld_err: 0.05
 * /amcl/kld_z: 0.99
 * /amcl/laser_lambda_short: 0.1
 * /amcl/laser_likelihood_max_dist: 2.0
 * /amcl/laser_max_beams: 30
 * /amcl/laser_model_type: likelihood_field
 * /amcl/laser_sigma_hit: 0.2
 * /amcl/laser_z_hit: 0.5
 * /amcl/laser_z_max: 0.05
 * /amcl/laser_z_rand: 0.5
 * /amcl/laser_z_short: 0.05
 * /amcl/max_particles: 5000
 * /amcl/min_particles: 500
 * /amcl/odom_alpha1: 0.2
 * /amcl/odom_alpha2: 0.2
 * /amcl/odom_alpha3: 0.8
 * /amcl/odom_alpha4: 0.2
 * /amcl/odom_alpha5: 0.1
 * /amcl/odom_frame_id: odom
 * /amcl/odom_model_type: diff
 * /amcl/recovery_alpha_fast: 0.0
 * /amcl/recovery_alpha_slow: 0.0
 * /amcl/resample_interval: 1
 * /amcl/transform_tolerance: 0.1
 * /amcl/update_min_a: 0.5
 * /amcl/update_min_d: 0.2
 * /joint_state_publisher/rate: 30.0
 * /joint_state_publisher/source_list: ['/stretch/joint_...
 * /laser_filter/scan_filter_chain: [{'name': 'shadow...
 * /lidar_node/angle_compensate: True
 * /lidar_node/frame_id: laser
 * /lidar_node/inverted: False
 * /lidar_node/scan_mode: Boost
 * /lidar_node/serial_baudrate: 115200
 * /lidar_node/serial_port: /dev/hello-lrf
 * /move_base/TrajectoryPlannerROS/acc_lim_theta: 1.0
 * /move_base/TrajectoryPlannerROS/acc_lim_x: 1.0
 * /move_base/TrajectoryPlannerROS/acc_lim_y: 0.0
 * /move_base/TrajectoryPlannerROS/angular_sim_granularity: 0.025
 * /move_base/TrajectoryPlannerROS/dwa: False
 * /move_base/TrajectoryPlannerROS/gdist_scale: 5.0
 * /move_base/TrajectoryPlannerROS/goal_distance_bias: 5.0
 * /move_base/TrajectoryPlannerROS/heading_lookahead: 0.325
 * /move_base/TrajectoryPlannerROS/heading_scoring: True
 * /move_base/TrajectoryPlannerROS/heading_scoring_timestep: 0.8
 * /move_base/TrajectoryPlannerROS/holonomic_robot: False
 * /move_base/TrajectoryPlannerROS/latch_xy_goal_tolerance: False
 * /move_base/TrajectoryPlannerROS/max_vel_theta: 1.0
 * /move_base/TrajectoryPlannerROS/max_vel_x: 0.3
 * /move_base/TrajectoryPlannerROS/max_vel_y: 0.0
 * /move_base/TrajectoryPlannerROS/meter_scoring: True
 * /move_base/TrajectoryPlannerROS/min_in_place_vel_theta: 0
 * /move_base/TrajectoryPlannerROS/min_vel_theta: -1.0
 * /move_base/TrajectoryPlannerROS/min_vel_x: 0.0
 * /move_base/TrajectoryPlannerROS/min_vel_y: 0.0
 * /move_base/TrajectoryPlannerROS/occdist_scale: 0.01
 * /move_base/TrajectoryPlannerROS/path_distance_bias: 5.0
 * /move_base/TrajectoryPlannerROS/pdist_scale: 5.0
 * /move_base/TrajectoryPlannerROS/sim_granularity: 0.025
 * /move_base/TrajectoryPlannerROS/sim_time: 1.7
 * /move_base/TrajectoryPlannerROS/vtheta_samples: 20
 * /move_base/TrajectoryPlannerROS/vx_samples: 5
 * /move_base/TrajectoryPlannerROS/xy_goal_tolerance: 0.1
 * /move_base/TrajectoryPlannerROS/yaw_goal_tolerance: 0.09
 * /move_base/global_costmap/cost_scaling_factor: 2.58
 * /move_base/global_costmap/footprint: [[0.05, 0.17], [0...
 * /move_base/global_costmap/footprint_padding: 0.025
 * /move_base/global_costmap/global_frame: map
 * /move_base/global_costmap/inflation_radius: 1.75
 * /move_base/global_costmap/laser_scan_sensor/clearing: True
 * /move_base/global_costmap/laser_scan_sensor/data_type: LaserScan
 * /move_base/global_costmap/laser_scan_sensor/marking: True
 * /move_base/global_costmap/laser_scan_sensor/sensor_frame: laser
 * /move_base/global_costmap/laser_scan_sensor/topic: scan_filtered
 * /move_base/global_costmap/observation_sources: laser_scan_sensor
 * /move_base/global_costmap/obstacle_range: 2.5
 * /move_base/global_costmap/raytrace_range: 3.0
 * /move_base/global_costmap/robot_base_frame: base_link
 * /move_base/global_costmap/static_map: True
 * /move_base/global_costmap/update_frequency: 5.0
 * /move_base/local_costmap/cost_scaling_factor: 1.5
 * /move_base/local_costmap/footprint: [[0.05, 0.17], [0...
 * /move_base/local_costmap/footprint_padding: 0.025
 * /move_base/local_costmap/global_frame: map
 * /move_base/local_costmap/height: 3.0
 * /move_base/local_costmap/inflation_radius: 0.15
 * /move_base/local_costmap/laser_scan_sensor/clearing: True
 * /move_base/local_costmap/laser_scan_sensor/data_type: LaserScan
 * /move_base/local_costmap/laser_scan_sensor/marking: True
 * /move_base/local_costmap/laser_scan_sensor/sensor_frame: laser
 * /move_base/local_costmap/laser_scan_sensor/topic: scan_filtered
 * /move_base/local_costmap/observation_sources: laser_scan_sensor
 * /move_base/local_costmap/obstacle_range: 2.5
 * /move_base/local_costmap/publish_frequency: 2.0
 * /move_base/local_costmap/raytrace_range: 3.0
 * /move_base/local_costmap/resolution: 0.05
 * /move_base/local_costmap/robot_base_frame: base_link
 * /move_base/local_costmap/rolling_window: True
 * /move_base/local_costmap/static_map: False
 * /move_base/local_costmap/update_frequency: 5.0
 * /move_base/local_costmap/width: 3.0
 * /robot_description: <?xml version="1....
 * /robot_state_publisher/publish_frequency: 30.0
 * /rosdistro: noetic
 * /rosversion: 1.16.0
 * /stretch_driver/broadcast_odom_tf: True
 * /stretch_driver/controller_calibration_file: /home/hello-robot...
 * /stretch_driver/mode: navigation
 * /stretch_driver/rate: 30.0
 * /stretch_driver/timeout: 0.5

NODES
  /
    amcl (amcl/amcl)
    centered_base_link_tf_publisher (tf/static_transform_publisher)
    joint_state_publisher (joint_state_publisher/joint_state_publisher)
    laser_filter (laser_filters/scan_to_scan_filter_chain)
    lidar_node (rplidar_ros/rplidarNode)
    map_server (map_server/map_server)
    move_base (move_base/move_base)
    robot_state_publisher (robot_state_publisher/robot_state_publisher)
    rviz (rviz/rviz)
    stretch_driver (stretch_core/stretch_driver)

auto-starting new master
process[master]: started with pid [10852]
ROS_MASTER_URI=http://172.16.0.205:11311/

setting /run_id to b3dfc250-983a-11ee-abec-87934603d30e
process[rosout-1]: started with pid [10869]
started core service [/rosout]
process[joint_state_publisher-2]: started with pid [10876]
process[robot_state_publisher-3]: started with pid [10877]
process[stretch_driver-4]: started with pid [10878]
process[centered_base_link_tf_publisher-5]: started with pid [10879]
process[lidar_node-6]: started with pid [10885]
[ WARN] [1702308839.666556275]: The root link base_link has an inertia specified in the URDF, but KDL does not support a root link with an inertia.  As a workaround, you can add an extra dummy link to your URDF.
process[laser_filter-7]: started with pid [10886]
process[map_server-8]: started with pid [10893]
[ INFO] [1702308839.691003444]: RPLIDAR running on ROS package rplidar_ros, SDK Version:2.0.0
process[amcl-9]: started with pid [10898]
process[move_base-10]: started with pid [10905]
process[rviz-11]: started with pid [10913]
[ INFO] [1702308839.751319047]: RPLIDAR S/N: BB89ED93C0EA98C6C2E29EF549534061
[ INFO] [1702308839.751374840]: Firmware Ver: 1.29
[ INFO] [1702308839.751402406]: Hardware Rev: 7
[ INFO] [1702308839.798457182]: Requesting the map...
[ INFO] [1702308839.801728287]: Received a 326 X 365 map @ 0.025 m/pix

[ INFO] [1702308839.802591484]: RPLidar health status : OK.
[ INFO] [1702308839.813412669]: Initializing likelihood field model; this can take some time on large maps...
[ INFO] [1702308839.847412272]: Done initializing likelihood field model.
[ INFO] [1702308840.109301548]: current scan mode: Boost, sample rate: 8 Khz, max_distance: 12.0 m, scan frequency:10.0 Hz, 
[INFO] [rospy.topics]: topicmanager initialized
[INFO] [1702308840.902118]: For use with S T R E T C H (TM) RESEARCH EDITION from Hello Robot Inc.
[INFO] [1702308840.904150]: /stretch_driver started
[INFO] [1702308843.138023]: mode = navigation
[INFO] [1702308843.138888]: /stretch_driver: Changed to mode = navigation
[INFO] [1702308843.140183]: broadcast_odom_tf = True
[INFO] [1702308843.142233]: Loading controller calibration parameters for the head from YAML file named /home/hello-robot/catkin_ws/src/stretch_ros/stretch_core/config/controller_calibration_head.yaml
[INFO] [1702308843.144022]: controller parameters loaded = {'arm_retracted_offset': 0.0025290283993012154, 'pan_angle_offset': 0.05370401639426355, 'pan_looked_left_offset': 0.0015576428725509596, 'tilt_angle_backlash_transition': -0.4, 'tilt_angle_offset': -0.04653908482546884, 'tilt_looking_up_offset': -0.021135309455911962}
[INFO] [1702308843.144916]: head_tilt_calibrated_offset_rad in degrees = -2.666493142900698
[INFO] [1702308843.145829]: head_pan_calibrated_offset_rad in degrees = 3.0770134822926827
[INFO] [1702308843.146755]: head_pan_calibrated_looked_left_offset_rad in degrees = 0.08924636258580397
[INFO] [1702308843.149070]: head_tilt_backlash_transition_angle_rad in degrees = -22.918311805232932
[INFO] [1702308843.154632]: head_tilt_calibrated_looking_up_offset_rad in degrees = -1.2109640305266958
[INFO] [1702308843.157827]: arm_calibrated_retracted_offset_m in meters = 0.0025290283993012154
[INFO] [1702308843.196631]: /stretch_driver rate = 30.0 Hz
[INFO] [1702308843.198210]: /stretch_driver timeout = 0.5 s
[INFO] [1702308843.200813]: /stretch_driver use_fake_mechaduinos = False
[INFO] [1702308843.201775]: /stretch_driver base_frame_id = base_link
[INFO] [1702308843.202876]: /stretch_driver odom_frame_id = odom
[ WARN] [1702308843.653664712]: global_costmap: Parameter "plugins" not provided, loading pre-Hydro parameters
[ INFO] [1702308843.667373541]: global_costmap: Using plugin "static_layer"
[ INFO] [1702308843.675509916]: Requesting the map...
[ INFO] [1702308843.877726917]: Resizing costmap to 326 X 365 at 0.025000 m/pix
[ INFO] [1702308843.977472996]: Received a 326 X 365 map at 0.025000 m/pix
[ INFO] [1702308843.983204298]: global_costmap: Using plugin "obstacle_layer"
[ INFO] [1702308843.988170222]:     Subscribed to Topics: laser_scan_sensor
[ INFO] [1702308844.014070456]: global_costmap: Using plugin "inflation_layer"
[ WARN] [1702308844.064706279]: local_costmap: Parameter "plugins" not provided, loading pre-Hydro parameters
[ INFO] [1702308844.074450525]: local_costmap: Using plugin "obstacle_layer"
[ INFO] [1702308844.076416465]:     Subscribed to Topics: laser_scan_sensor
[ INFO] [1702308844.087533516]: local_costmap: Using plugin "inflation_layer"
[ INFO] [1702308844.119363955]: Created local_planner base_local_planner/TrajectoryPlannerROS
[ INFO] [1702308844.125353080]: Sim period is set to 0.05
[ INFO] [1702308844.319396079]: Recovery behavior will clear layer 'obstacles'
[ INFO] [1702308844.325683397]: Recovery behavior will clear layer 'obstacles'
[ INFO] [1702308844.353036214]: odom received!

I am trying to run FUNMAP, but I have the problem I mentioned in another post: I can't see the robot in RViz, which suggests there is a problem with the robot's state information.

I already used FUNMAP, but when I try to visualize with the command:

./stretch_user/debug/merged_maps/merged_map_20231212182723_mhi_visualizatio

the terminal prints the following message:

hello-robot@stretch-re2-2005:~$ ./stretch_user/debug/merged_maps/merged_map_20231212182723_mhi_visualization.png
bash: ./stretch_user/debug/merged_maps/merged_map_20231212182723_mhi_visualization.png: Permission denied

So I don't know if the map saved properly, or how to access the map.

The navigation.launch file, which is specific to the Navigation Stack, cannot be used with FUNMAP. FUNMAP has its own position-control-based motion planner and localization technique, so the costmaps and AMCL are not necessary.

The maps are images, so you can use the eog Linux tool to visualize the maps that are saved in the path ~/stretch_user/debug/merged_maps/.

List all the different map files saved previously:

cd ~/stretch_user/debug/merged_maps/
ls

The output should look something like this:

merged_map_20231212191527_mhi_camera_depth.png   merged_map_20231212191527_mhi_visualization.png  merged_map_20231212191626_mhi_rgb.png
merged_map_20231212191527_mhi_rgb.png            merged_map_20231212191626_mhi_camera_depth.png   merged_map_20231212191626_mhi_visualization.png

Open an image using the eog tool:

eog merged_map_20231212191626_mhi_visualization.png

An even easier way: you can also use your Linux file explorer GUI to navigate to ~/stretch_user/debug/merged_maps/ from your home directory and click on the available maps to view them quickly.
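If the folder accumulates many scans, picking out the files from the latest one by eye gets tedious. Here is a small plain-Python sketch (assuming the merged_map_<timestamp>_... naming shown above) that groups the files by scan timestamp and reports the newest scan:

```python
import re
from collections import defaultdict

def group_maps_by_scan(filenames):
    """Group merged-map filenames by their embedded timestamp string."""
    groups = defaultdict(list)
    pattern = re.compile(r"merged_map_(\d+)_")
    for name in filenames:
        m = pattern.match(name)
        if m:
            groups[m.group(1)].append(name)
    return groups

files = [
    "merged_map_20231212191527_mhi_visualization.png",
    "merged_map_20231212191527_mhi_rgb.png",
    "merged_map_20231212191626_mhi_visualization.png",
]
groups = group_maps_by_scan(files)
latest = max(groups)           # fixed-width timestamps sort lexicographically
print(latest)                  # -> 20231212191626
print(sorted(groups[latest]))  # files belonging to the newest scan
```

With a real robot you would replace the hard-coded files list with os.listdir(os.path.expanduser('~/stretch_user/debug/merged_maps/')).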

I have also sent you a DM; I would be happy to schedule a support call to help you get the FUNMAP navigation working.

Thanks, I can see the map now, and I noticed an error in it. One room has a wrong orientation; it should be like the green rectangle.


Glad you are able to visualize the map now. Is the room orientation error you mention consistent across every try in that environment?

Hi, I don't know, because yesterday I was trying to run FUNMAP, but RViz kept closing all the time or didn't recognize the robot.

Maybe I should try a few more times, no?

Hi, I tried mapping the lab a few more times, and I always get this error, to a lesser or greater extent.

Hi @Humberto_Andres_Hida, if the robot's state estimate is poor when a new head scan is merged into the map, it can yield the wrong-orientation artifacts you're seeing. I'd recommend performing a localization step each time before adding to the map. The easiest way to do this is to run:

rosservice call /funmap/trigger_local_localization

I already perform a local localization each time before a head scan.

I already tried to send a goal for an object on top of the table with actionlib, but I only get the print "[INFO] [1704906730.814171]: MoveObj: FAILED in reaching the goal."
Additionally, if I use the "map" frame as you told me, the robot doesn't see the wall.
How can I send a goal based on the coordinates of the map? And how can I work with the robot's coordinates relative to the map?