For my purposes, I want to perform inference using a pretrained model, made with Roboflow, running on a low-power board. There is no space for Docker, and therefore no space for the inference server. Is there any way that the model itself can be downloaded?
I have the following code, and I have been looking everywhere for a method to download the model, but cannot find anything.
from depthai_sdk import OakCamera

MODEL_ID = "stuff"
MODEL_VS = "2"
PAPI_KEY = "*****WnKpTKuX******"

with OakCamera() as oak:
    color = oak.create_camera('color')
    model_config = {
        'source': 'roboflow',
        'model': f"{MODEL_ID}/{MODEL_VS}",
        'key': PAPI_KEY
    }
    nn = oak.create_nn(model_config, color)
    oak.visualize(nn, fps=True)
    oak.start(blocking=True)
    # Code to download the model from `nn`:
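For context, one workaround I have considered (untested): if depthai_sdk converts and caches the Roboflow model as an OpenVINO `.blob` file somewhere on disk after the first run, I could copy that file to the board directly. The cache locations below are my assumption, not documented behavior:

```python
from pathlib import Path

# Hypothetical helper: after running the pipeline once, search a few
# likely cache directories for the .blob file that depthai_sdk may have
# downloaded. The locations are guesses, not a documented API.
CANDIDATE_DIRS = [Path.home() / ".cache", Path.cwd()]

def find_cached_blobs():
    blobs = []
    for root in CANDIDATE_DIRS:
        if root.is_dir():
            blobs.extend(root.rglob("*.blob"))
    return sorted(blobs)

print(find_cached_blobs())
```

If something like this works, the found blob could then be loaded on the board without the inference server, but I have not been able to confirm where (or whether) the SDK caches it.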
Does anyone have any ideas or tips? Thanks!