Is it possible to test the stretch deep perception using the simulator only and without access to the physical robot?
Hi @iibuan, it should be possible, but to my knowledge, it's not something that's been tested before. The simulated Stretch's head camera imagery is published on the
/camera/color/image_raw topic, the same topic the real Stretch's head camera uses. So you would launch the deep perception launch file alongside the simulation launch file.
I would expect models trained on real imagery to perform worse when running inference on simulated imagery.
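For example, the two launch files could be started in separate terminals. This is a sketch based on the package and launch file names mentioned in this thread; your setup may differ:

```shell
# Terminal 1: start the simulated Stretch in Gazebo,
# which publishes head camera imagery on /camera/color/image_raw
roslaunch stretch_gazebo gazebo.launch

# Terminal 2: start the object detector, which subscribes to
# the same /camera/color/image_raw topic
roslaunch stretch_deep_perception stretch_detect_objects.launch
```

You can confirm the simulated camera is publishing with `rostopic hz /camera/color/image_raw` before starting the detector.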
Hi @bshah, when I run `roslaunch stretch_deep_perception stretch_detect_objects.launch`, I get the warning "No RealSense devices were found!", even though I saw the message "The realsense_camera plugin is attached to the model robot" after running `roslaunch stretch_gazebo gazebo.launch`. I am also attempting to use Stretch Deep Perception in the Gazebo simulator.
Is there anything else I need to do to use stretch_deep_perception with the Stretch Robot inside the Gazebo Simulator?
Ah, thanks for discovering that @Jerrico_Jr_Dela_Cruz. It looks like
stretch_detect_objects.launch tries to start the driver for the physical RealSense camera in lines 5-11, which fails when no real camera is attached.
Commenting those lines out should resolve the problem.
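Concretely, the edit to stretch_detect_objects.launch would look something like this. This is a sketch: the include path shown is hypothetical, and the actual contents of lines 5-11 may differ in your version of the file:

```xml
<launch>
  <!-- Lines 5-11 start the physical RealSense camera driver.
       Comment them out for simulation, since the Gazebo camera
       plugin already publishes /camera/color/image_raw.
       (The include below is illustrative, not the exact file.) -->
  <!--
  <include file="$(find stretch_core)/launch/some_realsense_driver.launch" />
  -->

  <!-- ...rest of stretch_detect_objects.launch unchanged... -->
</launch>
```

With the driver include disabled, the detector will consume the simulated camera topic instead of looking for real hardware.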