Why didn't DepthAI use IR projection like Orbbec or Intel RealSense?

Their depth streams appear to have low noise.

Hi @jesse31,

So the main reason was eye safety and making sure that is done correctly. It's faster to get products out safely without lasers.

We are working on a variant with IR-capable cameras and IR dot projection; see here:
https://github.com/luxonis/depthai-hardware/issues/20

Backing up a bit though, the main purpose of DepthAI is not to be yet another depth camera, but to provide spatial AI.

So as far as we understand, we make the only camera in the world that does spatial sensing and AI on the same device.

So the number one point of confusion about DepthAI is that it's yet another depth-sensing camera... that is not the purpose.

More information on that is here:
https://discuss.luxonis.com/d/48-summary-of-stereo-mapping-cameras

That said, for not using any active illumination or laser projection, the depth quality is not bad:
(example depth output screenshots)

This is with no filtering whatsoever (with Median fully disabled, for example). The reference code to produce these is here.
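As a rough, minimal sketch (using the current depthai Python API, which postdates this thread, so treat the node and enum names as illustrative), running stereo with the median filter fully disabled looks something like this:

```python
import cv2
import depthai as dai

# Stereo depth pipeline with the median filter fully disabled (illustrative sketch).
pipeline = dai.Pipeline()

mono_left = pipeline.create(dai.node.MonoCamera)
mono_right = pipeline.create(dai.node.MonoCamera)
mono_left.setBoardSocket(dai.CameraBoardSocket.LEFT)
mono_right.setBoardSocket(dai.CameraBoardSocket.RIGHT)

stereo = pipeline.create(dai.node.StereoDepth)
stereo.initialConfig.setMedianFilter(dai.MedianFilter.MEDIAN_OFF)  # no post-filtering
mono_left.out.link(stereo.left)
mono_right.out.link(stereo.right)

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("disparity")
stereo.disparity.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue("disparity", maxSize=4, blocking=False)
    while True:
        frame = q.get().getFrame()  # raw disparity, no filtering applied
        cv2.imshow("disparity", frame)
        if cv2.waitKey(1) == ord("q"):
            break
```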

Thoughts?

Thanks,
Brandon

2 months later

Does the depth stream on Intel RealSense or Occipital look smoother/more continuous because of filtering?

    Hi jesse31,

    Yes, filtering helps. So we do have a WLS filter example here:
    https://github.com/luxonis/depthai-experiments/tree/master/wls-filter

    (WLS-filtered depth screenshot)

    So this is WLS, run on the host. We will be implementing Bilateral (which usually outperforms WLS) directly in DepthAI this year as well. So the depth could look smoother than this coming out of the device.
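
    As a rough sketch of what the host-side WLS step does (this uses OpenCV's ximgproc module and is not the exact code from the repo linked above; the helper name and lambda/sigma values are just illustrative starting points), assuming you already have the rectified left frame and the raw disparity map:

    ```python
    import cv2
    import numpy as np

    def wls_filter_disparity(disparity, left_rect, lam=8000.0, sigma=1.5):
        """Smooth a raw disparity map with an edge-preserving WLS filter,
        guided by the rectified left image (hypothetical helper)."""
        wls = cv2.ximgproc.createDisparityWLSFilterGeneric(False)  # no confidence map
        wls.setLambda(lam)        # higher lambda = smoother result
        wls.setSigmaColor(sigma)  # how strongly edges in left_rect are respected
        # The WLS filter expects a 16-bit signed disparity map.
        filtered = wls.filter(disparity.astype(np.int16), left_rect)
        return cv2.normalize(filtered, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    ```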

    More details on bilateral are here: https://github.com/luxonis/depthai/issues/215
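
    And just to illustrate the idea of bilateral filtering on a disparity map (a host-side OpenCV sketch with typical starting parameters; the on-device version would live in DepthAI firmware and won't look like this):

    ```python
    import cv2
    import numpy as np

    def bilateral_smooth_disparity(disparity, d=9, sigma_color=75.0, sigma_space=75.0):
        """Edge-preserving smoothing of a disparity map via cv2.bilateralFilter
        (illustration only; tune d/sigma_color/sigma_space for your data)."""
        disp32 = disparity.astype(np.float32)  # bilateralFilter wants 8-bit or float32 input
        return cv2.bilateralFilter(disp32, d, sigma_color, sigma_space)
    ```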

    Thoughts?

    We'll probably also do IR dot projection; it will just take a while, as we don't want to blind anyone. FWIW, Vikas, co-founder of Occipital, is a buddy of mine and an advisor here.

    Thanks,
    Brandon