Hello,
I'm very new to DepthAI and also not really used to Python, so bear with me.
I'm trying to get the model person-detection-action-recognition-0006 from the OpenVINO Model Zoo running on my OAK-D Lite, as it's just perfect for my use case.
I've already converted it with https://blobconverter.luxonis.com/ using the parameters -ip U8 -il nchw
but I don't know how to show detections with this model. Do I need a custom decode_fn?
This is where I'm stuck:
from depthai_sdk import OakCamera
from depthai import NNData

def decode(nnData: NNData):
    # Box coordinates in SSD format
    layer = nnData.getLayerFp16('ActionNet/out_detection_loc')
    # ???

with OakCamera() as oak:
    color = oak.create_camera('color')
    nn = oak.create_nn('person-detection-action-recognition-0006.blob', color, decode_fn=decode)
    oak.visualize([color, nn], fps=True)
    oak.start(blocking=True)
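For context, here's my rough understanding of what the decoding would have to do with that layer: the loc output holds SSD-style deltas relative to prior boxes, so they need to be converted into actual box corners. This is only a minimal NumPy sketch with made-up priors, variances, and a generic confidence threshold; the real prior boxes and variances for this specific model are something I don't have, so please treat everything below as an assumption:

```python
import numpy as np

def decode_ssd(loc, conf, priors, variances=(0.1, 0.2), conf_threshold=0.5):
    """Decode SSD localization deltas into (xmin, ymin, xmax, ymax) boxes.

    loc:    (N, 4) raw deltas from the detection loc layer
    conf:   (N,)   per-prior person confidence (already sigmoid/softmax'd)
    priors: (N, 4) prior boxes as (cx, cy, w, h), normalized 0..1
            -- placeholder values; the real priors come from the model's
               PriorBox layers, which I don't know for this model.
    """
    # Shift the prior center by the predicted deltas, scaled by the variance.
    cx = priors[:, 0] + loc[:, 0] * variances[0] * priors[:, 2]
    cy = priors[:, 1] + loc[:, 1] * variances[0] * priors[:, 3]
    # Scale the prior size by the exponentiated size deltas.
    w = priors[:, 2] * np.exp(loc[:, 2] * variances[1])
    h = priors[:, 3] * np.exp(loc[:, 3] * variances[1])
    # Convert center/size form to corner form.
    boxes = np.stack([cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2], axis=1)
    # Keep only confident detections.
    keep = conf > conf_threshold
    return boxes[keep], conf[keep]
```

If something like this is right, I'd then wrap the kept boxes in whatever detection structure decode_fn is expected to return, but I'm not sure what that is either.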
It would be great if someone could point me in the right direction or to a related example.
Thanks in advance.