Hi @franva,
Understood. I guess it is not possible to use the models from the Model Zoo simply because those models require very high resolution, which OAK-D does not have.
Thanks for the direct answer.
Yes, that's correct. Some of the models in the model zoo are meant for use in server farms where thousands of TOPS are the minimum amount of processing power.
The AI landscape in general has a huge variance in processing power. Many of the larger cloud-AI companies (Google, for example) likely have millions or billions of TOPS available for neural inference.
In the case of OAK-D, we have 1.4 TOPS for neural inference. So models that are intended for cloud deployment either will not run or will run too slowly to be useful for real-time applications.
Then the Model Zoo is not that helpful. I guess I will have to re-train models with a smaller input size in order to use them on OAK-D.
I wouldn't say it's not useful. We use it quite a bit. Here is a list of 12 models from the model zoo that we use fairly often:
https://docs.luxonis.com/en/latest/pages/tutorials/pretrained_openvino/#trying-other-models
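
For reference, running one of those embedded-friendly detection models on OAK-D only takes a few lines with the DepthAI Python API. This is just a rough sketch, assuming a MobileNet-SSD-style detection blob; the blob path and the 300x300 preview size are placeholders to swap for whatever model you pick from that list:

```python
import depthai as dai

# Minimal pipeline: color camera preview -> detection network -> host output
pipeline = dai.Pipeline()

cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(300, 300)          # must match the network's input size
cam.setInterleaved(False)             # NN expects planar (non-interleaved) frames

nn = pipeline.create(dai.node.MobileNetDetectionNetwork)
nn.setBlobPath("mobilenet-ssd.blob")  # placeholder path to a MyriadX-compiled blob
nn.setConfidenceThreshold(0.5)
cam.preview.link(nn.input)

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("detections")
nn.out.link(xout.input)

# Run on the device and print detections as they arrive
with dai.Device(pipeline) as device:
    q = device.getOutputQueue(name="detections", maxSize=4, blocking=False)
    while True:
        for det in q.get().detections:
            print(det.label, det.confidence, det.xmin, det.ymin, det.xmax, det.ymax)
```

The same structure works for the other detection models in that list; you mostly just swap the blob and the preview size.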
I think the more accurate thing to say is that the model zoo is very broad: it has models intended for cloud AI applications (where 1,000 TOPS is the minimum compute), for the telco edge (where 100s of TOPS are the minimum), and for the device edge (1 TOPS or so).
OAK-D is actually on the embedded side, so device edge or smaller. So the models in the model zoo that are intended for cloud AI or the telco edge are not as applicable to OAK-D. But the device-edge (or embedded) ones, like those linked above, are quite relevant.
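
And if it helps, getting one of those device-edge zoo models already compiled for the Myriad X in OAK-D can be done with the blobconverter Python package. A quick sketch; the model name below is just an example from the zoo, and the right SHAVE count depends on your pipeline:

```python
import blobconverter

# Download a model from the OpenVINO model zoo and compile it for the Myriad X in OAK-D.
# "face-detection-retail-0004" is just an example zoo model; swap in the one you need.
blob_path = blobconverter.from_zoo(
    name="face-detection-retail-0004",
    shaves=6,        # number of SHAVE cores to compile for; depends on your pipeline
)
print(blob_path)     # local path to the compiled .blob, ready to pass to setBlobPath()
```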
Thoughts?
Thanks,
Brandon