Touk Hello, sorry, but it is very difficult to understand the steps to deploy our own model (from https://docs.luxonis.com/en/latest/pages/tutorials/deploying-custom-model/). I believed this meant adding our model to the ones available in the depthai GUI, but the tutorial doesn't actually give any information about that. What I have done so far is to convert my ONNX model into a blob and add it in a folder inside depthai/resources/nn. Then I think I should create a JSON file with some parameters, but how come nothing is said about this? Could you explain? Thank you
jakaskerl Hi @Touk The tutorial is meant to guide you in porting your own NN to .blob and running it on an OAK device inside a custom Python/C++ script. Once you have the blob, you can use it in a depthai pipeline as explained here. Thanks, Jaka
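For reference, a minimal sketch of what "using the blob in a depthai pipeline" can look like with the depthai v2 Python API. The blob path and the 300x300 input size are assumptions here; they must match your converted model, and running this requires an attached OAK device:

```python
import depthai as dai

pipeline = dai.Pipeline()

# Color camera feeding frames at the network's expected input resolution
cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(300, 300)          # assumption: match your model's input shape
cam.setInterleaved(False)

# Generic NeuralNetwork node loading the converted blob
nn = pipeline.create(dai.node.NeuralNetwork)
nn.setBlobPath("path/to/model.blob")  # hypothetical path to your converted blob
cam.preview.link(nn.input)

# Stream the raw NN output back to the host
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("nn")
nn.out.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue("nn", maxSize=4, blocking=False)
    while True:
        data = q.get()                # NNData message from the device
        print(data.getAllLayerNames())
```

For models with a standard detection head, the dedicated `MobileNetDetectionNetwork` or `YoloDetectionNetwork` nodes can be used instead of the generic `NeuralNetwork` node, which then parses the output into detections on-device.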
Touk Hi @jakaskerl, thank you for your help. I can run my custom model on an OAK using a Python script. I was wondering if it was possible to add it to the demo GUI.
jakaskerl Hi @Touk The visualizations are done with the SDK, so you would have to add a model (along with its handler) here: luxonis/depthai/tree/main/depthai_sdk/src/depthai_sdk/nn_models Thanks, Jaka