edxian

  • Oct 2, 2023
  • Hi @erik,

    I just gave it a try: I connected my OAK-FFC-3P to the host through a USB 3.0 hub, and I found that the camera only runs in USB HS (High Speed) mode.

    Connected cameras: [{socket: CAM_A, sensorName: IMX378, width: 4056, height: 3040, orientation: AUTO, supportedTypes: [COLOR], hasAutofocus: 1, name: color}, {socket: CAM_C, sensorName: OV9282, width: 1280, height: 800, orientation: AUTO, supportedTypes: [MONO], hasAutofocus: 0, name: right}]
    Usb speed: HIGH
    Device name: OAK-FFC-3P

    However, if I connect the OAK-FFC-3P to the host directly, without the USB 3.0 hub, it runs in USB SS (SuperSpeed) mode.

    Connected cameras: [{socket: CAM_A, sensorName: IMX378, width: 4056, height: 3040, orientation: AUTO, supportedTypes: [COLOR], hasAutofocus: 1, name: color}, {socket: CAM_C, sensorName: OV9282, width: 1280, height: 800, orientation: AUTO, supportedTypes: [MONO], hasAutofocus: 0, name: right}]
    Usb speed: SUPER
    Device name: OAK-FFC-3P

    I have confirmed that both the USB cable and the USB hub support USB 3.0.
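    As an extra check on a Linux host, the negotiated link speed of every USB device can be read from sysfs (each device directory exposes a speed file in Mbit/s: 480 for High Speed, 5000 for SuperSpeed). This is a generic sketch, not DepthAI-specific:

```python
import glob
import os

def usb_link_speeds(root="/sys/bus/usb/devices"):
    """Map each USB device directory to its negotiated link speed in Mbit/s."""
    speeds = {}
    for path in glob.glob(os.path.join(root, "*", "speed")):
        dev = os.path.basename(os.path.dirname(path))
        with open(path) as f:
            speeds[dev] = f.read().strip()
    return speeds

if __name__ == "__main__":
    for dev, speed in sorted(usb_link_speeds().items()):
        label = {"480": "HIGH", "5000": "SUPER", "10000": "SUPER+"}.get(speed, "?")
        print(f"{dev}: {speed} Mbit/s ({label})")
```

    If the OAK shows 480 here while behind the hub, the link itself (hub port or cable) is what fell back to High Speed, independently of anything DepthAI does.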

    The following is the Python script:

    
    import cv2
    import depthai as dai
    
    # Create pipeline
    pipeline = dai.Pipeline()
    
    # Define source and output
    camRgb = pipeline.create(dai.node.ColorCamera)
    xoutRgb = pipeline.create(dai.node.XLinkOut)
    
    xoutRgb.setStreamName("rgb")
    
    # Properties
    camRgb.setPreviewSize(300, 300)
    camRgb.setInterleaved(False)
    camRgb.setColorOrder(dai.ColorCameraProperties.ColorOrder.RGB)
    
    # Linking
    camRgb.preview.link(xoutRgb.input)
    
    # Connect to device and start pipeline
    with dai.Device(pipeline, maxUsbSpeed=dai.UsbSpeed.SUPER_PLUS) as device:
    
        print('Connected cameras:', device.getConnectedCameraFeatures())
        # Print out usb speed
        print('Usb speed:', device.getUsbSpeed().name)
        # Bootloader version
        if device.getBootloaderVersion() is not None:
            print('Bootloader version:', device.getBootloaderVersion())
        # Device name
        print('Device name:', device.getDeviceName())
    
        # Output queue will be used to get the rgb frames from the output defined above
        qRgb = device.getOutputQueue(name="rgb", maxSize=4, blocking=False)
    
        while True:
            inRgb = qRgb.get()  # blocking call, will wait until a new data has arrived
    
            # Retrieve 'bgr' (opencv format) frame
            cv2.imshow("rgb", inRgb.getCvFrame())
    
            if cv2.waitKey(1) == ord('q'):
                break

    I open the device in the script with dai.Device(pipeline, maxUsbSpeed=dai.UsbSpeed.SUPER_PLUS) as device:, yet through the hub it still negotiates only HIGH speed.

    Any idea?

    • erik replied to this.
    • Hi @erik,

      Is it a good idea to route the OAK camera and the thermal camera as in the following diagram?

      The thermal camera (Cat-eye) is a UVC camera that is fully compliant with USB 2.0 FS.

      The OAK-FFC-3P will be powered by a 5 V DC adapter, since the USB 3.0 hub cannot supply enough power.

      Are there any side effects or risks if I route the USB cameras as above?

      Any idea?

      • Hi erik,

        I am curious whether there is a custom Yocto project or Buildroot package for the RVC4?

        I hope I can rebuild the customized image and integrate some sensors into my project.

        Any idea?

        • Hi erik,

          Thanks for your update!

          I think there is a workaround to bridge the image data between the FLIR Lepton and the OAK-SOM. I found this tutorial, which shows how to send a partial image to an ESP32 through an "spi message", so maybe I can reverse the data flow.

          STEP 1. Use an MCU to read the thermal image data from the Lepton 3.5 and pack it as image_part packets. The data will be sent to the OAK-SOM.

          STEP 2. Implement a SPIIn node on the OAK-SOM to receive the SPI data and decode the packets back into image_part.

          Since the raw image from the Lepton is 160 x 120 x 2 bytes, which is smaller than 300*300 bytes, I think it is OK for the OAK-SOM to receive the whole image. The frame rate is about 9 Hz, so in theory the minimum SPI rate required is 160 x 120 x 2 x 9 x 8 = 2,764,800 bit/s (about 2.8 Mbit/s).
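          The arithmetic above can be sanity-checked directly (pure Python, nothing hardware-specific):

```python
# Lepton 3.5 raw frame: 160 x 120 pixels, 2 bytes per pixel
WIDTH, HEIGHT, BYTES_PER_PX = 160, 120, 2
FPS = 9  # effective Lepton frame rate

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PX  # bytes per frame
min_bits_per_s = frame_bytes * FPS * 8       # payload-only SPI rate

print(f"frame size: {frame_bytes} bytes")                # 38400
print(f"minimum SPI rate: {min_bits_per_s} bit/s "
      f"({min_bits_per_s / 1e6:.2f} Mbit/s)")            # 2764800 (2.76 Mbit/s)

assert frame_bytes < 300 * 300  # fits within a 300 x 300 byte buffer
```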

          Any idea?

          • erik replied to this.
          • Hi all,

            Is it possible to use the built-in hardware of the OAK-SOM to interface with a FLIR Lepton 3.5? The Lepton 3.5 uses one SPI and one I2C interface, and its minimum SPI clock is about 12 MHz in mode 3.
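            As a rough justification for the ~12 MHz figure: the Lepton streams video over VoSPI, and, if I read the FLIR datasheet correctly, each packet is 164 bytes (telemetry off), a Lepton 3.x segment is 60 packets, a frame is 4 segments, and the host must keep up with the internal ~27 Hz output (including discard frames), not just the ~9 Hz effective rate:

```python
# VoSPI link budget for Lepton 3.x (datasheet numbers; double-check
# against your module's documentation)
PACKET_BYTES = 164        # 4-byte header + 160-byte payload, telemetry off
PACKETS_PER_SEGMENT = 60
SEGMENTS_PER_FRAME = 4
INTERNAL_FPS = 27         # ~26.97 Hz; discard frames must be clocked out too

bytes_per_frame = PACKET_BYTES * PACKETS_PER_SEGMENT * SEGMENTS_PER_FRAME
bits_per_s = bytes_per_frame * INTERNAL_FPS * 8

print(f"raw VoSPI rate: {bits_per_s / 1e6:.1f} Mbit/s")  # ~8.5 Mbit/s
# The SPI clock must comfortably exceed this raw rate, hence
# clocks in the 12-20 MHz range for Lepton 3.x modules.
```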

            I have designed several hardware solutions for the FLIR Lepton 3.5. Now I want to design a new carrier board for my OAK-FFC-3P. The new carrier board has 2 FFC connectors (mono cameras) and 1 camera socket (FLIR Lepton 3.5). My application is to measure an object's dimensions and its temperature.

            Let me know if you have any ideas!

            • erik replied to this.
            • Hi kneave,

              What temperature does your board reach when running AI applications?

              I also want to add a tiny fan to my OAK-FFC-3P board, just like the fan-equipped heatsink on the RPi 4. The fan only needs power and GND.