What’s the delay in getting distance information out of the OAK-D? We would probably output this data via UART or SPI. I’m looking into the viability of using these for closed-loop positional control.
Latency in spatial data
Hi Udonnis,
Sorry about the delay. We're normally a lot faster to respond but have been under a mountain of Brexit + pandemic shipping issues across all of Europe that we've been trying to solve/update customers about.
So the latency is heavily dependent on the neural network used, but sub-100ms is definitely achievable with object detectors, as DepthAI/OAK-D can run several object detectors at over 50FPS.
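As a quick back-of-envelope check on why sub-100ms is plausible at those framerates, here's a minimal sketch. All the numbers are illustrative assumptions (transport overhead in particular is a placeholder, not a measured DepthAI figure):

```python
# Rough latency budget for a 50 FPS object-detection pipeline.
# Every number here is an assumption for illustration, not a measurement.

fps = 50
frame_interval_ms = 1000 / fps   # 20 ms between frames (worst-case capture wait)
inference_ms = 1000 / fps        # detector keeps up with 50 FPS, so <= 20 ms per frame
transport_ms = 10                # assumed USB/SPI transfer overhead

worst_case_ms = frame_interval_ms + inference_ms + transport_ms
print(worst_case_ms)  # 50.0 -> comfortably under 100 ms
```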
In terms of the depth latency itself out over SPI, I think you'd be looking at around 60ms or so. This variant has depth and also an integrated ESP32 connected over SPI:
https://shop.luxonis.com/collections/all/products/bw1092. So you could test it full-loop with that for your application if you'd like.
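For the SPI leg specifically, the transfer itself is tiny if you only ship spatial results rather than full depth frames. A sketch, with assumed payload sizes and an assumed 4 MHz SPI clock (neither is from this thread):

```python
# SPI transfer-time estimate for a small spatial-data payload.
# Assumptions: 10 detections per frame, 3 float32 coordinates (x, y, z)
# per detection, 4 MHz SPI clock. Adjust to your actual setup.

payload_bytes = 10 * 3 * 4            # 120 bytes of spatial data
spi_clock_hz = 4_000_000
transfer_ms = payload_bytes * 8 / spi_clock_hz * 1000
print(round(transfer_ms, 3))  # 0.24 ms: negligible next to ~60 ms of pipeline latency
```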
We haven't characterized it in a while, but I suspect it's around there or slightly lower.
Thoughts?
Thanks, and sorry about the delay.
Thanks Brandon!
Our control loops are currently running at about 300 microseconds (0.3ms), so 60-100ms is an ice age in comparison.
We have two projects currently. 60-100ms will probably be too slow for one of the projects (needs to respond sub 1ms) but the other one will be fine.
The project that needs to do speedy distance-to-object measurements won't need to detect anything complex, and we're in control of what the object will look like (it can be very AI-friendly). What do you think the OAK-D could do if we made a simple, friendly object and trained it to only recognize that one object?
Do you guys have any plans for beefier processing in the future for faster closed loop applications?
BTW we just had a Kickstarter too and we get how much work there is!
Hi Udonnis,
Wow, that's crazy low latency. So if I understand correctly, the minimum framerate for such latency would be about 3,333 FPS cameras. Is that right? 1/0.3ms ≈ 3,333 FPS. But that doesn't account for any processing time, so they'd have to be at least that fast.
So the cameras may run under 65 FPS by default, but the fastest framerate possible with the current implementation of our camera driver is 118 FPS, so the frame-interval latency there is about 8.5ms. And that's with no processing: it's literally just the delay induced by the framerate. No MIPI latency, no processing latency, no USB latency, etc.
So I think trying to get to 0.3ms latency from computer vision is quite difficult. We have seen one custom application do 1,300 FPS object detection with our hardware, which was also limited by the sensor (the customer built their own custom sensor for this). But even then, the sensor latency alone is about 0.77ms, so more than double what you are looking for. And again, that doesn't account for any processing.
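The framerate arithmetic above can be sketched in a few lines (all numbers are the ones already quoted in this thread):

```python
# Sanity-check the framerate arithmetic in this thread.

loop_period_ms = 0.3
required_fps = 1000 / loop_period_ms
print(round(required_fps, 1))  # 3333.3 FPS just to get one frame per control cycle

driver_limit_fps = 118
print(round(1000 / driver_limit_fps, 2))  # 8.47 ms frame interval at the driver limit

custom_sensor_fps = 1300
print(round(1000 / custom_sensor_fps, 2))  # 0.77 ms frame interval even at 1,300 FPS
```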
So to the question of a faster processor, actually the processor is not the bottleneck here, the sensor is. I think for such reaction rates, you'd probably want a 10,000 FPS sensor, so that you could get at least 2-3 samples in that 0.3ms window. Actually that or an event-based camera, like here: https://openaccess.thecvf.com/content_CVPR_2019/papers/Wang_Event-Based_High_Dynamic_Range_Image_and_Very_High_Frame_Rate_CVPR_2019_paper.pdf
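To make the "2-3 samples in that 0.3ms window" point concrete, here's a tiny helper (the framerates are the ones discussed above; the helper itself is just illustrative arithmetic):

```python
# How many whole frames land inside one control window at a given framerate.
def samples_in_window(fps: float, window_ms: float) -> int:
    return int(fps * window_ms / 1000)

print(samples_in_window(10_000, 0.3))  # 3 samples: enough for basic filtering
print(samples_in_window(118, 0.3))     # 0 samples: the current driver limit can't keep up
```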
Thoughts?
And thanks on the Kickstarter. Which one was yours? Would love to take a look.
Best,
Brandon
Hi Brandon,
The "fast" application probably needs 1,000 samples/second with a latency of 10ms. So cameras at 125fps and a processing time for distance info of 10ms. Does that make sense? Might need to be a bit faster than that. The bottleneck, I think, will be the processing time?
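A quick check of how those relaxed numbers stack up, with the processing time as an assumed placeholder rather than a measured DepthAI figure:

```python
# Does a 125 FPS camera fit a 10 ms end-to-end latency budget?
# processing_ms is a hypothetical placeholder, not a measured value.

frame_interval_ms = 1000 / 125   # 8.0 ms between frames
processing_ms = 2.0              # assumed processing budget
total_ms = frame_interval_ms + processing_ms
print(total_ms)  # 10.0 -> exactly at the 10 ms budget, so "a bit faster" seems wise
```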
For the slow application (our camera robot on KS below), the OAK-Ds (which we backed) will be perfect!
Our axiiio Kickstarter was this one:
We didn't reach our funding goal, but we're coming back!
Also, if you're interested, our website for this is here:
https://www.axiiio.com
If anyone you know is interested, get them to sign up to our newsletter, which you can find on our website.
Ah, interesting, thanks! And sounds good, will check it out!