erik Locally, do you mean on the VPU (see img below)? If so, you could decode and overlay the segmentation results on the VPU itself, but you would need to create a custom NN model to do that; see the docs here.
Yep that's what I meant... ok got it now!
erik You could also stream both the segmentation results and the video stream to the server and do the overlay there, which would be much less complicated.
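For the server-side approach, the overlay step itself is simple once both streams arrive on the host. A minimal sketch with numpy (the function name, colors, and alpha value are my own choices for illustration, not part of the DepthAI API):

```python
import numpy as np

def overlay_segmentation(frame, mask, color=(0, 0, 255), alpha=0.5):
    """Blend a binary segmentation mask onto a BGR frame."""
    colored = frame.copy()
    colored[mask.astype(bool)] = color  # paint masked pixels
    # Weighted blend of original frame and painted copy
    return (frame * (1 - alpha) + colored * alpha).astype(np.uint8)

# Toy example: 4x4 black frame with a 2x2 mask in the center
frame = np.zeros((4, 4, 3), dtype=np.uint8)
mask = np.zeros((4, 4), dtype=np.uint8)
mask[1:3, 1:3] = 1
out = overlay_segmentation(frame, mask)
```

With real data, `frame` would be the decoded video frame and `mask` would be derived from the NN output array (e.g. by argmax over class channels).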
Ahhh ok that makes sense. I did not see any examples of this in the luxonis repos, so it was not clear to me whether it's possible to have two concurrent streams from the device carrying (1) preview frames and (2) inference arrays.
So I guess it's as simple as setting up two streams with a structure similar to the example below, from here:
import depthai as dai

print("Creating SPI pipeline: ")
print("COLOR CAM -> ENCODER -> SPI OUT")
pipeline = dai.Pipeline()
cam_color = pipeline.create(dai.node.ColorCamera)
spiout_preview = pipeline.create(dai.node.SPIOut)
videnc = pipeline.create(dai.node.VideoEncoder)
# Set up the color camera and encoder
cam_color.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
videnc.setDefaultProfilePreset(1920, 1080, 30, dai.VideoEncoderProperties.Profile.MJPEG)
# Set up the SPI output
spiout_preview.setStreamName("spipreview")
spiout_preview.setBusId(0)
# Link nodes: CAM -> ENCODER -> SPI OUT
cam_color.video.link(videnc.input)
videnc.bitstream.link(spiout_preview.input)
Would two such streams work concurrently (bandwidth constraints notwithstanding)?