Hi @allen,
It looks like the depth readings for the bottle are off. The model thinks that the bottle is ~3cm wide and close to the camera, which does not appear to match reality. I tried with a transparent bottle on my end, and while YOLO did not always see the bottle, the depth readings appeared correct. Do you observe erroneous distance / size readings when looking at an opaque bottle as well?
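For context on why "~3cm wide" and "close to the camera" would go wrong together: if the metric width is recovered from the pixel width via the pinhole model (an assumption about the pipeline on my part; the focal length and pixel widths below are made-up numbers), then an underestimated depth shrinks the computed width proportionally:

```python
def apparent_width_m(pixel_width_px: float, depth_m: float, focal_length_px: float) -> float:
    """Pinhole model: metric width = pixel width * depth / focal length.

    Metric width scales linearly with depth, so a depth reading that is
    too small makes the object look both close AND narrow at once.
    """
    return pixel_width_px * depth_m / focal_length_px

# Hypothetical numbers: a bottle spanning 280 px with a 600 px focal length.
print(apparent_width_m(280, 0.15, 600))   # plausible depth  -> ~0.07 m wide
print(apparent_width_m(280, 0.064, 600))  # bad depth pixel  -> ~0.03 m wide
```

So a single bad depth sample over the bottle could explain both symptoms at once, which is why I suspect the depth readings rather than the detector.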
I see from an earlier post in the thread that you were having trouble with the tennis ball as well. Have you tested whether the visual servoing demo works for the tennis ball and / or the ArUco cube (running without YOLO)? If so, what were the results?
Also, it is odd that the grasp begins when the object's z=12.9. While running visual servoing, I see values such as 'grasp_center_xyz': array([0.00646052, 0.06986317, 0.17505255]) in the YOLO results printout in the visual servoing terminal. What value are you seeing for 'grasp_center_xyz' in your terminal output?
Example YOLO results output:
yolo_results = {
    'fingertips': {
        'right': {
            'pos':    array([0.09063274, 0.03583154, 0.1768219 ]),
            'x_axis': array([-0.68398136, 0.03269413, -0.72876649]),
            'y_axis': array([ 0.08904867, -0.98778254, -0.12789051]),
            'z_axis': array([-0.72404409, -0.15237041, 0.67271347]),
        },
        'left': {
            'pos':    array([-0.07698445, 0.03379969, 0.17573048]),
            'x_axis': array([ 0.69878856, 0.10996119, -0.70682607]),
            'y_axis': array([-0.0210888 , 0.99085159, 0.13329814]),
            'z_axis': array([ 0.71501735, -0.0782411 , 0.6947147 ]),
        },
    },
    'yolo': [
        {
            'name': 'orange',
            'confidence': 0.29331216,
            'width_m': 0.04629309170714137,
            'estimated_z_m': 0.15356746551650566,
            'grasp_center_xyz': array([0.00646052, 0.06986317, 0.17505255]),
            'left_side_xyz':  array([-0.01747896, 0.06128851, 0.15356747]),
            'right_side_xyz': array([0.02881413, 0.06128851, 0.15356747]),
        },
    ],
}
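If it helps, here is the quick sanity check I run on that printout (this is my own check, not part of the demo, and it just reuses the numbers from the example above): the reported grasp center should sit roughly between the two fingertip positions.

```python
import numpy as np

# Subset of the example yolo_results printout above.
fingertips = {
    'right': {'pos': np.array([0.09063274, 0.03583154, 0.1768219])},
    'left':  {'pos': np.array([-0.07698445, 0.03379969, 0.17573048])},
}
detection = {
    'name': 'orange',
    'estimated_z_m': 0.15356746551650566,
    'grasp_center_xyz': np.array([0.00646052, 0.06986317, 0.17505255]),
}

# The grasp center should be near the midpoint of the fingertips;
# a large offset is a hint that the depth/size estimate is off.
midpoint = (fingertips['left']['pos'] + fingertips['right']['pos']) / 2.0
offset_m = np.linalg.norm(detection['grasp_center_xyz'] - midpoint)
print(f"{detection['name']}: grasp center is {offset_m:.3f} m from the fingertip midpoint")
```

For the example values this offset is a few centimeters, mostly along y; if you see something much larger for the bottle, that would confirm the depth readings are the problem.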