Is it possible to use OAK-SOM to interface with FLIR Lepton 3.5?

Hi all,

Is it possible to use the built-in hardware on the OAK-SOM to interface with a FLIR Lepton 3.5? The FLIR Lepton 3.5 camera requires one SPI and one I2C interface. The minimum SPI clock rate for the Lepton 3.5 is about 12 MHz in mode 3.

I have designed several hardware solutions for the FLIR Lepton 3.5. Now I want to design a new carrier board for my OAK-FFC-3P. The new carrier board has 2 FFC connectors (mono cameras) and 1 camera socket (FLIR Lepton 3.5). My application is to measure the dimensions of an object and its temperature.

Let me know if you have any ideas!


    Hi edxian ,
    While it might be technically possible, for RVC2 one would need to integrate support for the new sensor in the FW (it's an embedded system), and the FW is not open source.

    I believe it would be possible with RVC3 (eg the OAK-FFC-6P), as it has a Linux OS and you could add a driver for it, as we would be able to share the BSP with you. Thoughts?

      Hi erik ,

      Thanks for your update!

      I think there is a workaround to bridge the image data between the FLIR Lepton and the OAK-SOM. I found this tutorial, which talks about how to send a partial image to an ESP32 through an "SPI message". So maybe I can reverse the data flow.

      STEP 1. Use an MCU to get thermal image data from the Lepton 3.5 and pack the thermal image data as an image_part. The data will be sent to the OAK-SOM.

      STEP 2. Implement an SPIIn node on the OAK-SOM to receive the SPI data and decode the packets back into an image_part (see the sketch below).

      Since the raw image from the Lepton is 160 x 120 x 2 = 38,400 bytes, which is smaller than 300 x 300 bytes, I think it is okay for the OAK-SOM to receive the whole image. The frame rate is about 9 Hz, so in theory the minimum SPI rate required is 160 x 120 x 2 x 9 x 8 = 2,764,800 bit/s (about 2.8 Mbit/s).
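
      Something like this on the OAK side might work for STEP 2, using DepthAI's SPIIn node; the stream name and bus ID below are just placeholders and would have to match whatever the MCU sends:

      import depthai as dai

      pipeline = dai.Pipeline()

      # SPIIn receives packets pushed over SPI by the external MCU
      spiIn = pipeline.create(dai.node.SPIIn)
      spiIn.setStreamName("thermal")  # placeholder stream name
      spiIn.setBusId(0)               # placeholder SPI bus ID

      # Forward whatever arrives over SPI to the host for inspection
      xout = pipeline.create(dai.node.XLinkOut)
      xout.setStreamName("thermal_out")
      spiIn.out.link(xout.input)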

      Any idea?


        Hi edxian ,
        That might work, but I am not sure whether we support 2.8 Mbps for SPI - IIRC, it's closer to 1 Mbps. Also note that the whole SPI comms path is community support only and might not work well (we haven't tested it in more than a year).

        15 days later

        Hi @erik ,

        Is it a good idea to route the OAK camera and the thermal camera as shown in the following diagram?

        The thermal camera (Cat-eye) is a UVC camera that is fully compliant with USB 2.0 FS.

        The OAK-FFC-3P will be powered by a 5 V DC adapter because the USB 3.0 hub cannot provide enough power.

        Are there any side effects or risks if I route the USB camera as shown above?

        Any idea?

          Hi edxian ,
          I see no reason why there would be any risks, besides a potential bandwidth limitation, but at a maximum of 480 Mbps for the thermal image (USB 2.0), that shouldn't be a risk either.
          Thanks, Erik

          Hi @erik ,

          I just gave it a try. I used a USB 3.0 hub as a bridge and connected it to my OAK-FFC-3P, and I found that the camera runs in USB HS mode.

          Connected cameras: [{socket: CAM_A, sensorName: IMX378, width: 4056, height: 3040, orientation: AUTO, supportedTypes: [COLOR], hasAutofocus: 1, name: color}, {socket: CAM_C, sensorName: OV9282, width: 1280, height: 800, orientation: AUTO, supportedTypes: [MONO], hasAutofocus: 0, name: right}]
          Usb speed: HIGH
          Device name: OAK-FFC-3P

          However, if I connect to the host directly without the USB 3.0 hub, the OAK-FFC-3P runs in USB SS mode.

          Connected cameras: [{socket: CAM_A, sensorName: IMX378, width: 4056, height: 3040, orientation: AUTO, supportedTypes: [COLOR], hasAutofocus: 1, name: color}, {socket: CAM_C, sensorName: OV9282, width: 1280, height: 800, orientation: AUTO, supportedTypes: [MONO], hasAutofocus: 0, name: right}]
          Usb speed: SUPER
          Device name: OAK-FFC-3P

          I have confirmed that both the USB cable and the USB hub support USB 3.0.

          The following is the Python script:

          
          import cv2
          import depthai as dai
          
          # Create pipeline
          pipeline = dai.Pipeline()
          
          # Define source and output
          camRgb = pipeline.create(dai.node.ColorCamera)
          xoutRgb = pipeline.create(dai.node.XLinkOut)
          
          xoutRgb.setStreamName("rgb")
          
          # Properties
          camRgb.setPreviewSize(300, 300)
          camRgb.setInterleaved(False)
          camRgb.setColorOrder(dai.ColorCameraProperties.ColorOrder.RGB)
          
          # Linking
          camRgb.preview.link(xoutRgb.input)
          
          # Connect to device and start pipeline
          with dai.Device(pipeline, maxUsbSpeed=dai.UsbSpeed.SUPER_PLUS) as device:
          
              print('Connected cameras:', device.getConnectedCameraFeatures())
              # Print out usb speed
              print('Usb speed:', device.getUsbSpeed().name)
              # Bootloader version
              if device.getBootloaderVersion() is not None:
                  print('Bootloader version:', device.getBootloaderVersion())
              # Device name
              print('Device name:', device.getDeviceName())
          
              # Output queue will be used to get the rgb frames from the output defined above
              qRgb = device.getOutputQueue(name="rgb", maxSize=4, blocking=False)
          
              while True:
                  inRgb = qRgb.get()  # blocking call, will wait until new data has arrived
          
                  # Retrieve 'bgr' (opencv format) frame
                  cv2.imshow("rgb", inRgb.getCvFrame())
          
                  if cv2.waitKey(1) == ord('q'):
                      break

          I tried to open the device in the script with dai.Device(pipeline, maxUsbSpeed=dai.UsbSpeed.SUPER_PLUS) as device:.

          Any idea?


            Hi edxian ,
            SUPER_PLUS means 10 Gbps USB 3 - can your hub/computer support that? I'd try without specifying the max speed (it will default to 5 Gbps USB 3). I'd also try a different USB 3 hub, as I have tried quite a few of them and haven't had issues (so it might be due to the computer or the hub).
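
            For example, something like this (same pipeline setup as in your script), which lets the device negotiate the speed on its own:

            import depthai as dai

            pipeline = dai.Pipeline()  # same pipeline setup as in the script above

            # Without maxUsbSpeed the device negotiates up to SUPER (5 Gbps) by default
            with dai.Device(pipeline) as device:
                print('Usb speed:', device.getUsbSpeed().name)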