I want to use Extended Disparity, but its processing rate is only 60 fps. Would it be possible to capture at a higher frame rate and buffer the frames on the host? Then, when I detect that my object is in sight, I can record the frames in host memory and send them back to the OAK-D for analysis. If the internal pipeline can take frames from the host, I can still get my Extended Disparity result out. Even with some latency, that would still be acceptable for our application.
Is it possible to feed a video stream from the host instead of from the sensors on board?
Hello frankshieh, yes, you can send frames from the host to the device with the XLinkIn node. Simple code here - sending video frames to the device for inferencing. For recording/replaying, we are just refactoring this example, but for now you can use this demo.
Thanks, Erik
erik Great! Will try it later, once we can get >100 fps using C++!