• Hardware
  • Hardware trigger synchronization including RGB image (OAK-D S2)

I am trying to trigger my OAK-D S2 (AF) on Ubuntu 24 with the example provided here.

https://gist.github.com/Erol444/0138af63378dc8de5b3f7d80db1ea1a5

I managed to get this script working; however, the RGB image is not shown. In the following thread, https://discuss.luxonis.com/d/2304-multiple-oak-ffc-4p-based-cameras-timestamp-sync/5, it was said that they got the trigger working including the RGB image, but unfortunately no example was shared. Could this be shared?

    toon
    You are using an external trigger, right?

    The code is meant for OAK-FFC (or PoE M8) devices where the FSIN line is exposed. This is not possible on the S2 unless you open it up and connect to an FSIN line (which is not easily accessible).

    Thanks,
    Jaka

    • toon replied to this.

      jakaskerl Sorry for the confusion; I am using an OAK-D S2 PoE with autofocus, so I already have the M8 connector wired up.

      What I would like to do is trigger multiple devices (3) with an external trigger pulse. Furthermore, I want to output an RGB image as close as possible in time to the triggered mono images, but they don't have to be fully hardware-synchronised.

        toon
        Then the code is fine; just set the color camera FPS to exactly the same FPS at which you are triggering the other two cameras. As Erik said, snapshot mode does not work.

        What does work is continuous streaming with external syncing, configured with CameraControl.setFrameSyncMode(). In this mode, the FSIN signal is expected to arrive at a continuous rate matching the configured sensor FPS, and the trigger can't arrive at arbitrary times, as that would disrupt internal sensor operations (leading to bad frames, etc.). It can only correct for very small amounts of drift over time.
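
        A minimal sketch of what that looks like (assuming DepthAI v2; 10 Hz is just an example trigger rate):

        import depthai as dai

        TRIGGER_FPS = 10  # must exactly match the external trigger rate

        pipeline = dai.Pipeline()
        camRgb = pipeline.create(dai.node.ColorCamera)
        camRgb.setFps(TRIGGER_FPS)
        # Continuous external sync: FSIN pulses must keep arriving at TRIGGER_FPS
        camRgb.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)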

        Thanks,
        Jaka

          8 days later

          Dear Jakaskerl,

          Indeed, the color camera in the example script does not work.

          Instead, I wrote the following script including a Sync node, but I still do not get an image. What am I doing wrong?

          #!/usr/bin/env python3
          import depthai as dai
          import cv2
          import time
          from datetime import timedelta
          
          # FPS_tof = 10
          FPS_rgb = 10
          with_trigger = True
          
          
          pipeline = dai.Pipeline()
          
          # RGB camera
          camRgb = pipeline.create(dai.node.ColorCamera)
          camRgb.setColorOrder(dai.ColorCameraProperties.ColorOrder.RGB)
          camRgb.setIspScale(2, 3)
          camRgb.setBoardSocket(dai.CameraBoardSocket.RGB)
          camRgb.setFps(FPS_rgb)
          if with_trigger:
              camRgb.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)
              camRgb.initialControl.setExternalTrigger(4, 3)
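              # setExternalTrigger(numFramesBurst, numFramesDiscard): each pulse captures a burst of 4 frames, the first 3 are discarded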
          
          
          xout = pipeline.create(dai.node.XLinkOut)
          xout.setStreamName("out")
          # camRgb.isp.link(xout.input)  # Link the RGB output to XLinkOut
          
          # Mono left camera
          monoLeft = pipeline.create(dai.node.MonoCamera)
          monoLeft.setResolution(dai.MonoCameraProperties.SensorResolution.THE_720_P)
          monoLeft.setBoardSocket(dai.CameraBoardSocket.LEFT)
          monoLeft.setFps(FPS_rgb)
          if with_trigger:
              monoLeft.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)
              monoLeft.initialControl.setExternalTrigger(4, 3)
          
          # Mono right camera
          monoRight = pipeline.create(dai.node.MonoCamera)
          monoRight.setResolution(dai.MonoCameraProperties.SensorResolution.THE_720_P)
          monoRight.setBoardSocket(dai.CameraBoardSocket.RIGHT)
          monoRight.setFps(FPS_rgb)
          if with_trigger:
              monoRight.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)
              monoRight.initialControl.setExternalTrigger(4, 3)
          
          xoutRight = pipeline.create(dai.node.XLinkOut)
          xoutRight.setStreamName("right")
          monoRight.out.link(xoutRight.input)
          
          # Sync node (if you want to sync all cameras)
          sync = pipeline.create(dai.node.Sync)
          # sync.setSyncThreshold(50)  # Sync threshold in milliseconds
          sync.setSyncThreshold(timedelta(milliseconds=1000))
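          # Frames whose timestamps fall within this window are grouped into one output message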
          
          # Linking RGB, left, and right to the sync node
          camRgb.isp.link(sync.inputs["color"])  # Link RGB to sync
          monoLeft.out.link(sync.inputs["left"])
          monoRight.out.link(sync.inputs["right"])
          
          # You can link the sync output to XLinkOut for all cameras
          sync.out.link(xout.input)
          # sync.out.link(xoutLeft.input)
          # sync.out.link(xoutRight.input)
          
          # Device connection
          device_id = "169.254.1.103"
          with dai.Device(pipeline, dai.DeviceInfo(device_id)) as device:
          
              queue = device.getOutputQueue("out", 8, False)
              queueright = device.getOutputQueue("right")
          
              print("Starting...", queue)
              while True:
                  # if queueright["right"].has():
                  #     test=queueright["right"].get().getCvFrame()
                  #     print("x")
                  messageGroup: dai.MessageGroup = queue.get()
                  frameMono_right: dai.ImgFrame = messageGroup["right"]
                  frameMono_left: dai.ImgFrame = messageGroup["left"]
                  # frameDepth_tof: dai.ImgFrame = messageGroup["depth_tof"]
                  frameRgb: dai.ImgFrame = messageGroup["color"]
                  # Blend when both received
                  if frameMono_right is not None:
                      if frameRgb is not None:
                          print(frameMono_right.getSequenceNum())
                          print(frameRgb.getSequenceNum())
                          # print(messageGroup["depth"].getTimestamp()-messageGroup["rgb"].getTimestamp())
                          # print(messageGroup["rgb"].getSequenceNum())
                          # print(messageGroup["depth"].getSequenceNum())
                          monoRight_img = frameMono_right.getCvFrame()
                          # # cv2.imshow("rgb", frameRgb)
                          # # cv2.imshow("depth", colorizeDepth(frameDepth.getCvFrame()))
                          test=frameRgb.getCvFrame()
                          print("It is working")
                      else:
                          print("no color frame")
                  else:
                      print("no right frame")
          
                  key = cv2.waitKey(1)
                  if key == ord("q"):
                      break
          
          
          cv2.destroyAllWindows()

            The trigger signal is just a pulse from a pushbutton (for testing and verification), the same setup as in the externally triggered FSYNC example. In the end we would like to trigger from a PLC based on an encoder value (every x cm).

              5 days later

              Ok, so if I understand correctly, there is no way to hardware-trigger the stereo frames and then find the closest RGB frame in software (even if we leave it streaming at 30 FPS)?
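
              (By "in software" I mean something like this host-side sketch; rgb_frames would be a short buffer of recent color frames, and the names here are placeholders:)

              import depthai as dai

              # Pick the free-running color frame closest in time to a triggered mono frame
              def closest_rgb(mono_frame: dai.ImgFrame, rgb_frames: list) -> dai.ImgFrame:
                  t = mono_frame.getTimestamp()
                  return min(rgb_frames, key=lambda f: abs((f.getTimestamp() - t).total_seconds()))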

                toon
                Correct, the frame triggers are expected (by the IMX378) to come at a fixed rate. The issue is that the FSIN lines are connected together, so triggering only the OV stereo cameras wouldn't work. If the lines were separate, this would be possible.

                RGB will not stream if both left and right are set to input mode.
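
                So the working split would look roughly like this (a sketch based on the above, assuming DepthAI v2; the trigger rate is an example):

                import depthai as dai

                TRIGGER_FPS = 6  # example; must match the FSIN pulse rate exactly

                pipeline = dai.Pipeline()
                monoLeft = pipeline.create(dai.node.MonoCamera)
                monoRight = pipeline.create(dai.node.MonoCamera)
                camRgb = pipeline.create(dai.node.ColorCamera)

                # Monos: external trigger only, no INPUT frame-sync mode
                monoLeft.initialControl.setExternalTrigger(4, 3)
                monoRight.initialControl.setExternalTrigger(4, 3)

                # Color: continuous streaming, synced to the shared FSIN line
                camRgb.setFps(TRIGGER_FPS)
                camRgb.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)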

                Thanks,
                Jaka

                3 months later

                Dear @jakaskerl

                We have tested the provided code with an external trigger configured at 6 Hz. The left and RGB cameras are streaming. However, after about one minute a fraction of the color image is under-exposed. It looks like that fraction of the image comes from an older frame? Could you help me out? I am using the following example script:

                https://docs.luxonis.com/hardware/platform/deploy/frame-sync/#Sensor%20FSYNC%20support

                Thanks in advance.

                  toon
                  Can you share some minimal reproducible code?

                  Thanks,
                  Jaka

                  toon
                  I'll check tomorrow. But it looks like there is some drift... Does the image get better if you wait some time, so that it gets back in sync?

                  Thanks,
                  Jaka

                  For the first minute it works; then, in steps of five, more and more of the image becomes darker, until you get something like the image above. How accurate does the trigger need to be? I am currently using a Revolution Pi to create the signal, which I believe is accurate, although of course it can be a few microseconds off.


                  Update: if I add manual exposure:

                  camRgb.initialControl.setManualExposure(1000, 300)  # exposure time [us], sensitivity ISO

                  then the output is consistently incorrect:

                  Also, it seems that sometimes a trigger is missed. Even at a low FPS of 6, I am missing a few frames, on both the infrared and colour cameras. What should the duration of the trigger pulse be?
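
                  (For context, the pulse generator is conceptually like the sketch below. It assumes a Raspberry-Pi-style GPIO API; the pin number and pulse width are placeholders, and our RevPi actually uses its own I/O library.)

                  import time
                  import RPi.GPIO as GPIO

                  TRIGGER_PIN = 17       # hypothetical BCM pin wired to the M8 FSIN line
                  FPS = 6                # must match the configured sensor FPS exactly
                  PULSE_WIDTH_S = 0.001  # 1 ms high pulse; the required width is hardware-specific

                  GPIO.setmode(GPIO.BCM)
                  GPIO.setup(TRIGGER_PIN, GPIO.OUT, initial=GPIO.LOW)
                  try:
                      while True:
                          GPIO.output(TRIGGER_PIN, GPIO.HIGH)
                          time.sleep(PULSE_WIDTH_S)
                          GPIO.output(TRIGGER_PIN, GPIO.LOW)
                          time.sleep(1.0 / FPS - PULSE_WIDTH_S)
                  finally:
                      GPIO.cleanup()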

                    Hi @jakaskerl, Bart here, a colleague of Toon.

                    We did some additional experiments today, and the issue appears when setting the exposure time below 17000 us. I am going to use my camera in bright-light conditions, so this is not acceptable for me.

                    A second observation: even with a fixed exposure, the video stream changes a lot. See the example video below, with the exposure printed in the terminal. This is caused by the line:

                    camRgb.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)  # should be enabled, otherwise it streams continuously

                    If I disable setFrameSyncMode, the "flashing" does not occur.

                    Another question: is setFrameSyncMode also required for the left and right cameras? If I enable it there, streaming of the infrared images does not work.

                    To summarize:

                    • How can we trigger the RGB and infrared cameras with an exposure time below 17000 us?
                    • How can we prevent the "flashing" issue (it also occurs without fixed exposure)?
                    • Should setFrameSyncMode be enabled for the infrared left and right cameras as well?
                    [attachment: luxonis-recording-17000us.mp4, 3 MB]

                    #!/usr/bin/env python3
                    import depthai as dai
                    import cv2
                    import time

                    fps = 15
                    focus = 126
                    expTime = 17000  # if <10000, black images will appear
                    sensIso = 100

                    pipeline = dai.Pipeline()

                    camRgb = pipeline.create(dai.node.ColorCamera)
                    camRgb.setColorOrder(dai.ColorCameraProperties.ColorOrder.RGB)
                    camRgb.initialControl.setManualFocus(focus)
                    camRgb.initialControl.setManualExposure(expTime, sensIso)  # exposure time [us], sensitivity ISO
                    camRgb.setFps(fps)
                    camRgb.setIspScale(2, 3)
                    camRgb.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)  # should be enabled, otherwise it streams continuously

                    xoutRgb = pipeline.create(dai.node.XLinkOut)
                    xoutRgb.setStreamName("color")
                    camRgb.isp.link(xoutRgb.input)

                    monoLeft = pipeline.create(dai.node.MonoCamera)
                    monoLeft.setResolution(dai.MonoCameraProperties.SensorResolution.THE_720_P)
                    monoLeft.setBoardSocket(dai.CameraBoardSocket.LEFT)
                    # monoLeft.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)  # if I enable this, at some point it will freeze
                    monoLeft.initialControl.setExternalTrigger(4, 3)

                    xoutLeft = pipeline.create(dai.node.XLinkOut)
                    xoutLeft.setStreamName("left")
                    monoLeft.out.link(xoutLeft.input)

                    monoRight = pipeline.create(dai.node.MonoCamera)
                    monoRight.setResolution(dai.MonoCameraProperties.SensorResolution.THE_720_P)
                    monoRight.setBoardSocket(dai.CameraBoardSocket.RIGHT)
                    # monoRight.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)  # if I enable this, at some point it will freeze
                    monoRight.initialControl.setExternalTrigger(4, 3)
                    # monoRight.setFps(fps)

                    xoutRight = pipeline.create(dai.node.XLinkOut)
                    xoutRight.setStreamName("right")
                    monoRight.out.link(xoutRight.input)

                    # Connect to device with pipeline
                    device_id = "169.254.1.114"
                    with dai.Device(pipeline, dai.DeviceInfo(device_id)) as device:
                        # controlQueue = device.getInputQueue('control')
                        # ctrl = dai.CameraControl()
                        # ctrl.setManualExposure(expTime, sensIso)
                        # controlQueue.send(ctrl)

                        arr = ['left', 'right', 'color']
                        queues = {}
                        frames = {}
                        for name in arr:
                            queues[name] = device.getOutputQueue(name)

                        print("Starting...")
                        counter = 0
                        msg_exp = None
                        while True:
                            for name in arr:
                                if queues[name].has():
                                    msg = queues[name].get()
                                    frames[name] = msg.getCvFrame()
                                    if name == "color":
                                        msg_exp = msg

                            for name, frame in frames.items():
                                cv2.imshow(name, frame)
                                counter += 1

                            if msg_exp is not None:
                                print("exposure rgb [us] %d" % (msg_exp.getExposureTime().total_seconds() * 1e6))

                            key = cv2.waitKey(1)
                            if key == ord('q'):
                                break

                    Forgot to mention: the shorter the exposure time, the more the line shifts upwards. Flashing only occurs in the bright part. For example at 10000 us:

                    [screenshot: 10000 us]

                    And at 5000 us:

                    [screenshot: 5000 us]