Hi all,

Has anyone had any success using an OAK device in UVC mode with the defaults in depthai_demo.py with recent generations of the depthai-python package? I get corrupt images that look like they are output in the wrong encoding format (lots of green and messed-up geometry). I remember this worked in an oooold version of the Python lib, but I guess it's no longer really supported?

Cheers,


    One more quick question: is it possible to stream a cv2.imshow output to the UVC node instead of:

    cam_rgb.video.link(uvc.input)

    ?


      erik

      I mean running some OpenCV operations and sending the output of those operations to the UVC node, so that the processed result is streamed via UVC.
      cv2 is the default Python import name for OpenCV.
      imshow displays an image stream 🙂


        stephansturges so to stream from host computer (after cv2 operations) to the OAK via UVC? I don't think that would work..

          erik Yep that's the idea 🙂

          That way I could apply some effects, use a neural network to clean up the background, blur the image using depth etc...


            stephansturges You could use XLinkIn node to stream such frames to OAK, and then UVC to stream processed frames via UVC back to host. Would that work?

            Interesting!
            The way I understand it, the XLinkIn node has to be bound to an input queue, like in the boilerplate from the docs:

            
            import depthai as dai
            
            pipeline = dai.Pipeline()
            xIn = pipeline.create(dai.node.XLinkIn)
            xIn.setStreamName("camControl")
            
            # Create the ColorCamera and link the control stream to it
            cam = pipeline.create(dai.node.ColorCamera)
            xIn.out.link(cam.inputControl)
            
            with dai.Device(pipeline) as device:
              qCamControl = device.getInputQueue("camControl")
            
              # Send a message to the ColorCamera to capture a still image
              ctrl = dai.CameraControl()
              ctrl.setCaptureStill(True)
              qCamControl.send(ctrl)

            Can I just create an XLinkIn node called "cv2Frames" and then pipe that to the UVC output node?

            
            import depthai as dai
            
            pipeline = dai.Pipeline()
            xIn = pipeline.create(dai.node.XLinkIn)
            xIn.setStreamName("cv2Frames")
            
            # Create a UVC (USB Video Class) output node. It needs 1920x1080, NV12 input
            uvc = pipeline.createUVC()
            
            # Link the XLinkIn node to the UVC input
            xIn.out.link(uvc.input)
            
            with dai.Device(pipeline) as device:
              cv2FrameStream = device.getInputQueue("cv2Frames")
            
              #### MAKE SOME cv2 frames and convert them to NV12 ######
              currentCv2Frame = X
            
              # Send the cv2 image, formatted as NV12, to the stream
              cv2FrameStream.send(currentCv2Frame)

            Something like that?
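            For the "convert them to NV12" step, a minimal host-side sketch might look like this (my assumptions: a pure-NumPy BT.601 full-range conversion with simple 2x2 chroma decimation, and even frame dimensions; the helper name `bgr_to_nv12` is mine, not a depthai API). The flat buffer it returns is what you would hand to `dai.ImgFrame.setData()` before sending it over the queue:

            ```python
            import numpy as np

            def bgr_to_nv12(frame: np.ndarray) -> np.ndarray:
                """Convert an HxWx3 uint8 BGR frame to a flat NV12 buffer."""
                h, w, _ = frame.shape
                b = frame[:, :, 0].astype(np.float32)
                g = frame[:, :, 1].astype(np.float32)
                r = frame[:, :, 2].astype(np.float32)

                # BT.601 full-range RGB -> YUV
                y = 0.299 * r + 0.587 * g + 0.114 * b
                u = -0.169 * r - 0.331 * g + 0.500 * b + 128.0
                v = 0.500 * r - 0.419 * g - 0.081 * b + 128.0

                # Subsample chroma 2x2 (simple decimation), then interleave
                # U and V into the single UV plane that NV12 expects
                u_ds = u[0::2, 0::2]
                v_ds = v[0::2, 0::2]
                uv = np.empty((h // 2, w), dtype=np.float32)
                uv[:, 0::2] = u_ds
                uv[:, 1::2] = v_ds

                # Y plane first, then the interleaved UV plane
                nv12 = np.concatenate([y.reshape(-1), uv.reshape(-1)])
                return np.clip(np.round(nv12), 0, 255).astype(np.uint8)
            ```

            On the host you could then wrap the buffer roughly like: create a `dai.ImgFrame()`, call `setType(dai.ImgFrame.Type.NV12)`, `setWidth(w)`, `setHeight(h)`, `setData(buf)`, and send it on the queue. (cv2's `COLOR_BGR2YUV_I420` plus a plane shuffle would be a faster alternative to the NumPy math.)
            
            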


              Hi stephansturges,
              Yes, that would be possible. You would probably need an ImageManip node in between, though, to convert RGB to NV12.
              Thanks, Erik
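
              If it helps, a rough sketch of that pipeline could look like the following. This is untested and depends on your depthai release actually supporting NV12 output from ImageManip; the stream name and buffer sizes are just illustrative assumptions carried over from the snippet above:

              ```python
              import depthai as dai

              pipeline = dai.Pipeline()

              # Host -> device: processed frames arrive via XLinkIn
              xIn = pipeline.create(dai.node.XLinkIn)
              xIn.setStreamName("cv2Frames")
              xIn.setMaxDataSize(1920 * 1080 * 3)  # room for one 1080p RGB frame

              # Convert incoming frames to the NV12 layout the UVC node expects
              manip = pipeline.create(dai.node.ImageManip)
              manip.initialConfig.setFrameType(dai.ImgFrame.Type.NV12)
              manip.setMaxOutputFrameSize(1920 * 1080 * 3 // 2)
              xIn.out.link(manip.inputImage)

              # Stream the converted frames out over UVC
              uvc = pipeline.createUVC()
              manip.out.link(uvc.input)
              ```

              With this layout the host would not need to do the NV12 conversion itself; it could send RGB frames and let the device handle the format change.
              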