Hello FS93,
After getting the circle on the depth frame, I would take all those depth points, average them out, and calculate the spatial coordinates from the centroid of the circle (with the averaged depth). That's essentially what this demo does, except it uses a square rather than a circle, but I assume it shouldn't be too complex to change this logic.
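A minimal sketch of that averaging step (the function and the pinhole intrinsics fx/fy/ppx/ppy are illustrative assumptions, not taken from the demo):

import numpy as np

def circle_centroid_xyz(depth, cx, cy, r, fx, fy, ppx, ppy):
    # depth: uint16 depth frame in millimeters (e.g. StereoDepth output)
    # cx, cy, r: circle center and radius in pixels (e.g. from cv2.HoughCircles)
    # fx, fy: focal lengths in pixels; ppx, ppy: principal point
    ys, xs = np.ogrid[:depth.shape[0], :depth.shape[1]]
    mask = (xs - cx) ** 2 + (ys - cy) ** 2 <= r ** 2
    vals = depth[mask]
    vals = vals[vals > 0]            # drop invalid (zero) depth readings
    z = float(vals.mean())           # averaged depth of the circle, mm
    x = (cx - ppx) * z / fx          # back-project the centroid (pinhole model)
    y = (cy - ppy) * z / fy
    return x, y, z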
Thanks, Erik
OAK-D Absolute Depth Precision
Thanks a lot erik,
I was able to get some results adapting the demo you proposed.
I still have a couple of doubts:
I calculated the Hough circles' pixel coordinates based on the right image output of the camera; does this mean that the coordinates are expressed in the reference frame of the right camera sensor?
Is the value of the Z coordinate the length of the orthogonal segment starting at the X,Y point and reaching the object, or is it the length of the segment starting at 0,0 and reaching the object? I guess the first option, but I am asking just to be sure.
Hello FS93,
By default, the coordinates are taken from the rectified right frame. You can also align the depth with other streams, e.g. color; example here.
It's the Z coordinate of the object, not the distance to the object. For the actual distance to the object, you would need to use x/y/z and the Pythagorean formula.
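For example (x, y, z being the spatial coordinates in millimeters; the values here are made up):

import math

x, y, z = 120.0, -45.0, 800.0             # example spatial coordinates, mm
distance = math.sqrt(x**2 + y**2 + z**2)  # actual Euclidean distance to object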
Thanks, Erik
Hello,
I found the following in the given code -
depth = 441.25 * 7.5 / disparity
where 441.25 is the focal length (f), as I understand it. I know that the effective focal length for the RGB camera is 3.37 mm and for the stereo cameras it is 1.3 mm. I was wondering how you found this 441.25 value?
Thanks,
Afsana
Hi afsana,
Please see docs here: https://docs.luxonis.com/projects/api/en/latest/components/nodes/stereo_depth/#calculate-depth-using-disparity-map
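In short, a worked sketch of the formula from that page (assuming 640x400 mono frames and the OAK-D's ~71.9 degree mono-camera HFOV):

import math

# focal length [px] = image_width [px] * 0.5 / tan(HFOV * 0.5)
hfov_deg = 71.9     # OAK-D mono camera horizontal FOV (approximate)
width_px = 640      # 400P mono resolution width
focal_px = width_px * 0.5 / math.tan(math.radians(hfov_deg / 2))
# -> ~441 px, i.e. the 441.25 in the formula is in pixels, not cm

# depth [cm] = focal [px] * baseline [cm] / disparity [px], baseline = 7.5 cm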
Thanks, Erik
Thanks Erik!
Hi @FS93, I'm working on an application similar to yours. After you implemented Erik's tips, what was the new error between the real Z coordinate and the one measured by the OAK-D camera?
AlexandreRibeiro For the stereo depth accuracy, see the report here.
erik Thanks.
I have a question about the OAK camera:
Can we capture a 7-segment display and convert it to a number using an OAK camera?
erik Can you please answer Afsana's question here? I am also trying to convert the disparity into depth and cannot find, in the link you provided, the basis for assuming 7.5 as an effective focal length.
Kindly help as soon as you are able to.
Hi MamoonIsmailKhalid,
Please see the documentation here.
Thanks, Erik
erik Thank you!!!
I just wanted to follow up on this. I am using the code given here:
(RGB, DEPTH, CONFIDENCE aligned.py)
https://github.com/luxonis/depthai-python/blob/main/examples/StereoDepth/rgb_depth_aligned.py
Essentially, would I need to take the disparity output and apply the formula
depth = 441.25 * 7.5 / disparity [pixels]
to calculate the depth?
Please answer this as I have been stuck on this for the past 3 days and my head has been noodling up every day.
THANK YOU IN ADVANCE!!!!
Regards,
Mamoon
Hi MamoonIsmailKhalid,
If you have a 7.5 cm baseline distance and a 441 [pixels] focal length, then yes, that's the correct formula. Otherwise, I would strongly suggest using the depth output instead of calculating it yourself on the host.
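If you do compute it on the host anyway, a minimal sketch of that conversion (assuming disparity is a numpy frame in pixels; with subpixel mode enabled you would first scale the raw values):

import numpy as np

# disparity: frame from the stereo "disparity" output (assumed variable name)
disp = disparity.astype(np.float32)
depth_cm = np.zeros_like(disp)
valid = disp > 0                               # disparity 0 means no data
depth_cm[valid] = 441.25 * 7.5 / disp[valid]   # f[px] * baseline[cm] / d[px]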
Thanks, Erik
How would one extract the depth data directly? Any code examples you can refer me to? I want to overlay the depth data on the output of 2D pose estimation (Google MediaPipe) to reconstruct a 3D pose from the extracted keypoints.
Thank you for your help and prompt responses!!
Hi MamoonIsmailKhalid,
You can extract the depth data directly with depthai.node.StereoDepth.depth (reference to docs). One example which might help you compares the manual depth calculation (the formula you have been using above) to directly accessing the depth through the ".depth" attribute.
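A minimal pipeline sketch along those lines (the stream name and queue settings are illustrative):

import depthai as dai

pipeline = dai.Pipeline()
mono_left = pipeline.create(dai.node.MonoCamera)
mono_right = pipeline.create(dai.node.MonoCamera)
stereo = pipeline.create(dai.node.StereoDepth)
xout = pipeline.create(dai.node.XLinkOut)

mono_left.setBoardSocket(dai.CameraBoardSocket.LEFT)
mono_right.setBoardSocket(dai.CameraBoardSocket.RIGHT)
mono_left.out.link(stereo.left)
mono_right.out.link(stereo.right)

xout.setStreamName("depth")
stereo.depth.link(xout.input)         # depth computed directly on the device

with dai.Device(pipeline) as device:
    q = device.getOutputQueue(name="depth", maxSize=4, blocking=False)
    depth_frame = q.get().getFrame()  # numpy uint16, depth in millimeters
    # e.g. look up the depth under a MediaPipe keypoint at pixel (u, v):
    # z_mm = depth_frame[v, u]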
Hope this helps.
Jaka
jakaskerl THANK YOU SO MUCH!! I will try that approach and post here if it solves my problem. But I really appreciate your guidance already.