• Depth data using OAK-D-W OV9782

Hi,

I am trying to align the depth frame with the color frame to do NN inference and get the 3D coordinates.

Using depthai/depthai_demo.py, the Z value looks realistic and works even for the closest objects.

However, when I go through depthai-python/examples, the Z value is very inaccurate, and for the closest objects the error is large (1, 2, or 3 meters).

What is the best example to start with for the OAK-D-W OV9782 camera?

Thanks!

    Hi jakaskerl

    Using depthai_demo.py, this is the result. It seems OK:

    When I try to use the example depthai-python/examples/StereoDepth/rgb_depth_aligned.py and calculate the XYZ on the host using https://github.com/luxonis/depthai-experiments/blob/master/gen2-calc-spatials-on-host/main.py, the result is very inaccurate:
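    For reference, here is a minimal host-side sketch of the idea behind gen2-calc-spatials-on-host: average the depth over a ROI and back-project through the pinhole model. The intrinsics (fx, fy, cx, cy) below are hypothetical placeholder values; in practice they come from the device calibration.

```python
import numpy as np

def roi_to_xyz(depth, roi, fx, fy, cx, cy):
    """Average depth (mm) over a ROI and back-project to XYZ via the pinhole model.
    fx, fy, cx, cy come from the device calibration (placeholder values below)."""
    xmin, ymin, xmax, ymax = roi
    patch = depth[ymin:ymax, xmin:xmax].astype(np.float64)
    valid = patch[patch > 0]           # 0 means "no depth" in DepthAI depth frames
    if valid.size == 0:
        return None                    # nothing usable in this ROI
    z = float(np.median(valid))        # median is robust to speckle noise
    u = (xmin + xmax) / 2.0            # ROI centroid in pixels
    v = (ymin + ymax) / 2.0
    x = (u - cx) * z / fx              # pinhole back-projection
    y = (v - cy) * z / fy
    return x, y, z

# Synthetic 400P depth frame: flat wall at 1000 mm
depth = np.full((400, 640), 1000, dtype=np.uint16)
print(roi_to_xyz(depth, (300, 180, 340, 220), fx=450.0, fy=450.0, cx=320.0, cy=200.0))
# → (0.0, 0.0, 1000.0)
```

The median over the ROI (rather than a plain mean) is a design choice: stereo depth frames contain zero-valued holes and speckle outliers that would otherwise skew the estimate.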

    Here it is more or less:

    And using the depthai_sdk, I got some distance errors:

    Which frame should I use to calculate the XYZ: the "depth" frame or the "disp" frame?

    Thanks!


      Hi LeonelSimo ,
      Please note the minimum depth perception: for your camera it is 40 cm, docs here:
      https://docs.luxonis.com/projects/hardware/en/latest/pages/DM9098w/#stereo-depth-perception
      So your hand is too close to the camera, and the depth there is essentially random. Workarounds are documented here:
      https://docs.luxonis.com/projects/api/en/latest/tutorials/configuring-stereo-depth/#short-range-stereo-depth
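      As a side note, the short-range workaround described in those docs is extended disparity mode, which roughly halves the minimum perceivable depth. A minimal config sketch of enabling it on the StereoDepth node (no device attached here, so this only builds the pipeline):

```python
import depthai as dai

pipeline = dai.Pipeline()
stereo = pipeline.create(dai.node.StereoDepth)
# Extended disparity doubles the disparity search range,
# roughly halving the minimum depth, at the cost of some
# accuracy at longer range.
stereo.setExtendedDisparity(True)
```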

        Hi erik,

        Ok. But using depthai/depthai_demo.py, I can confirm that 20 cm or 30 cm works perfectly. The problem only appears when I write my own script or use an example from depthai-python/examples/.

        Can you help me?

          Hi LeonelSimo
          The best way to check accuracy is to look at the depth/disparity map. If it looks random, the device is unable to correctly infer depth from the two stereo cameras (so the XYZ will be wrong), because the object is in the "blind spot" of one of the two cameras.
          Only if the depth map looks reasonably homogeneous at the object's position will you be able to correctly infer depth.

          Thanks,
          Jaka
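          One cheap host-side proxy for this homogeneity check, as a sketch (assuming a DepthAI depth frame where 0 marks pixels with no valid stereo match), is the fraction of valid pixels inside the object's ROI:

```python
import numpy as np

def roi_valid_fraction(depth, roi):
    """Fraction of ROI pixels that carry a depth value (non-zero).
    In DepthAI depth frames, 0 marks pixels where stereo matching failed;
    a low fraction means the XYZ estimate there is unreliable."""
    xmin, ymin, xmax, ymax = roi
    patch = depth[ymin:ymax, xmin:xmax]
    return float(np.count_nonzero(patch)) / patch.size

# Synthetic frame: a well-matched object at 1.5 m, the rest unmatched
depth = np.zeros((400, 640), dtype=np.uint16)
depth[100:300, 100:500] = 1500
print(roi_valid_fraction(depth, (150, 150, 250, 250)))  # → 1.0 (fully valid)
print(roi_valid_fraction(depth, (500, 50, 600, 150)))   # → 0.0 (no valid depth)
```

Any acceptance threshold (e.g. requiring more than 60% valid pixels) is an arbitrary starting point to tune per application, not a Luxonis recommendation.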

          The big question is: depthai_demo.py works perfectly (the Z value is OK), but when I try an example from depthai-python/examples/, the Z value has errors, especially at the closest distances.

          Why? Can you provide a script example?

            Hi jakaskerl

            These examples work very well. However, when I change the resolution from THE_400_P to THE_720_P or THE_800_P, the results are extremely bad.

            In the end, I want to align the depth with the color frame. I intend to use the maximum FOV, do NN inference, and get the XYZ. Maybe THE_400_P could work, but how can I decrease the color resolution to THE_400_P, since the color OV9782 only supports THE_800_P and THE_720_P?

            Thanks


              Hi LeonelSimo ,
              You can use ISP scaling:

              import depthai as dai
              pipeline = dai.Pipeline()
              camRgb = pipeline.create(dai.node.ColorCamera)
              camRgb.setBoardSocket(dai.CameraBoardSocket.CAM_A)
              camRgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_800_P)
              camRgb.setIspScale(1, 2)  # downscale 800P to 400P

                Hi erik

                It works! Thanks

                Do we have a way to detect if an object is too close to the camera?

                  Hi LeonelSimo
                  I don't think there is a built-in way. But you should be able to do it host-side by measuring how noisy the disparity frame is.

                  Thoughts?

                  Jaka
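                  As a sketch of that idea: tile the disparity frame and look at the average local standard deviation. A valid surface is locally smooth, while an object closer than the minimum stereo depth produces speckle. The 8-px tile size and any threshold you would pick are arbitrary starting points, not Luxonis recommendations.

```python
import numpy as np

def disparity_noise(disp, ksize=8):
    """Mean local standard deviation over ksize x ksize tiles.
    High values suggest speckle (e.g. an object closer than the
    minimum stereo depth); low values suggest smooth, valid disparity."""
    h, w = disp.shape
    h, w = h - h % ksize, w - w % ksize          # crop to a multiple of the tile size
    tiles = disp[:h, :w].astype(np.float64).reshape(
        h // ksize, ksize, w // ksize, ksize)    # blockwise view: (rows, k, cols, k)
    return float(tiles.std(axis=(1, 3)).mean())  # std per tile, averaged

rng = np.random.default_rng(0)
smooth = np.full((400, 640), 60.0)           # homogeneous disparity (flat surface)
noisy = rng.uniform(0, 95, size=(400, 640))  # speckle-like disparity
print(disparity_noise(smooth))               # → 0.0
print(disparity_noise(smooth) < disparity_noise(noisy))  # → True
```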

                    Hi jakaskerl ,

                    Yes, the idea is to do it on the host. How do we check the noisiness of the disparity frame?


                      Hi erik

                      I have already seen that example and I have tested it. The question is: how do we analyze the disparity frame in Python code and decide that there is a lot of noise? Is there an example?

                      Thanks


                        LeonelSimo Such non-Luxonis-specific (generic) questions can be asked to ChatGPT, which will provide a more accurate and faster response: