Ok, so if I understand correctly, there is no way to hardware-trigger the stereo frames and then match the closest RGB frame in software (even if we leave it streaming at 30 FPS)?
Hardware trigger synchronization including RGB image (OAK-D S2)
toon
Correct, the frame triggers are expected (by the IMX378) to come at a fixed rate. The issue is that the FSIN lines are connected together, so triggering only the OV stereo cameras wouldn't work. If the lines were separate, this should be possible.
RGB will not stream if both left and right are set to input mode.
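For reference, the working arrangement from the FSYNC docs example looks roughly like this (a minimal sketch, not a complete pipeline; the setExternalTrigger(4, 3) values are taken from that example and may need tuning):

import depthai as dai

pipeline = dai.Pipeline()

# RGB (IMX378): FSYNC pin set to INPUT, so it waits for the external FSIN pulse
camRgb = pipeline.create(dai.node.ColorCamera)
camRgb.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)

# Mono (OV) left/right: external trigger instead of FSYNC INPUT mode
monoLeft = pipeline.create(dai.node.MonoCamera)
monoLeft.setBoardSocket(dai.CameraBoardSocket.LEFT)
monoLeft.initialControl.setExternalTrigger(4, 3)  # burst of 4 frames, discard the first 3

monoRight = pipeline.create(dai.node.MonoCamera)
monoRight.setBoardSocket(dai.CameraBoardSocket.RIGHT)
monoRight.initialControl.setExternalTrigger(4, 3)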
Thanks,
Jaka
Dear @jakaskerl
We have tested the provided code with an external trigger configured at 6 Hz. The left and RGB cameras are streaming. However, after about one minute a fraction of the color image is under-exposed. It looks like that fraction of the image comes from an older frame. Could you help me out? I am using the following example script:
https://docs.luxonis.com/hardware/platform/deploy/frame-sync/#Sensor%20FSYNC%20support
Thanks in advance.
If I copy-paste the code, the indentation somehow comes out incorrect, so here is a link instead:
https://filesender.surf.nl/?s=download&token=52d56732-0b4d-445d-afae-3868a4c9d5f2
The code is similar to the example, except for line 14 (setting the FPS for the RGB camera). The trigger signal is 6 Hz.
For the first minute it works; then, in steps of five, more and more of the image becomes darker, until you get something like the image above. How accurate does the trigger need to be? I am currently using a Revolution Pi to generate the signal, which I believe is accurate, though of course it can be a few microseconds off.
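To check whether an older frame is actually being delivered, a quick diagnostic along these lines can be added to the receive loop (sketch only; 'queues' is the queue dict from the linked script, and getSequenceNum()/getTimestampDevice() are the standard ImgFrame accessors):

last_seq = {}
while True:
    for name in ['left', 'right', 'color']:
        if queues[name].has():
            msg = queues[name].get()
            seq = msg.getSequenceNum()
            ts = msg.getTimestampDevice().total_seconds()
            # A jump in sequence numbers means a trigger/frame was missed
            if name in last_seq and seq != last_seq[name] + 1:
                print("%s: sequence jumped from %d to %d" % (name, last_seq[name], seq))
            last_seq[name] = seq
            print("%s: seq %d, device ts %.6f s" % (name, seq, ts))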
Update: if I add manual exposure:
camRgb.initialControl.setManualExposure(1000, 300)  # exposure time [us], ISO sensitivity
then the output is consistently incorrect:
Also, it seems that sometimes a trigger is missed. Even at a low rate of 6 FPS I am missing a few frames, on both the infrared and colour cameras. What should be the duration of the trigger pulse?
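For illustration, a 6 Hz trigger with a configurable pulse width could be generated like this (sketch only; set_trigger_line() is a hypothetical placeholder for the Revolution Pi output call, and the 1 ms pulse width is exactly the value in question):

import time

TRIGGER_HZ = 6
PULSE_WIDTH_S = 0.001  # high time of the pulse; is this long enough?

def set_trigger_line(state):
    # Placeholder for the Revolution Pi digital output (hypothetical)
    pass

period = 1.0 / TRIGGER_HZ
next_edge = time.monotonic()
while True:
    set_trigger_line(True)   # rising edge starts the exposure
    time.sleep(PULSE_WIDTH_S)
    set_trigger_line(False)
    next_edge += period
    time.sleep(max(0.0, next_edge - time.monotonic()))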
toon
cc @Luxonis-Alex, any ideas as to what might be happening?
Thanks,
Jaka
Hi @jakaskerl, Bart here, a colleague of Toon.
We did some additional experiments today, and the issue appears when setting the exposure time below 17000 us. I am going to use the camera in bright conditions, so this is not acceptable for me.
A second observation is that even with a fixed exposure the video stream changes a lot. See the example video below, with the exposure printed in the terminal. This is caused by the line:
camRgb.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)  # should be enabled, otherwise it streams continuously
If I disable setFrameSyncMode, the "flashing" does not occur.
Another question: is setFrameSyncMode also required for the left and right cameras? If I enable it there, streaming of the infrared images does not work.
To summarize:
- How can we trigger the RGB and infrared cameras with an exposure time below 17000 us?
- How can we prevent the "flashing" issue (it also occurs without fixed exposure)?
- Should setFrameSyncMode be enabled for the infrared left and right cameras as well?
#!/usr/bin/env python3
import depthai as dai
import cv2
import time

fps = 15
focus = 126
expTime = 17000  # if < 10000, black images will appear
sensIso = 100

pipeline = dai.Pipeline()

camRgb = pipeline.create(dai.node.ColorCamera)
camRgb.setColorOrder(dai.ColorCameraProperties.ColorOrder.RGB)
camRgb.initialControl.setManualFocus(focus)
camRgb.initialControl.setManualExposure(expTime, sensIso)  # exposure time [us], ISO sensitivity
camRgb.setFps(fps)
camRgb.setIspScale(2, 3)
camRgb.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)  # should be enabled, otherwise it streams continuously
xoutRgb = pipeline.create(dai.node.XLinkOut)
xoutRgb.setStreamName("color")
camRgb.isp.link(xoutRgb.input)

monoLeft = pipeline.create(dai.node.MonoCamera)
monoLeft.setResolution(dai.MonoCameraProperties.SensorResolution.THE_720_P)
monoLeft.setBoardSocket(dai.CameraBoardSocket.LEFT)
# monoLeft.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)  # if I enable this, at some point it freezes
monoLeft.initialControl.setExternalTrigger(4, 3)
xoutLeft = pipeline.create(dai.node.XLinkOut)
xoutLeft.setStreamName("left")
monoLeft.out.link(xoutLeft.input)

monoRight = pipeline.create(dai.node.MonoCamera)
monoRight.setResolution(dai.MonoCameraProperties.SensorResolution.THE_720_P)
monoRight.setBoardSocket(dai.CameraBoardSocket.RIGHT)
# monoRight.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)  # if I enable this, at some point it freezes
monoRight.initialControl.setExternalTrigger(4, 3)
# monoRight.setFps(fps)
xoutRight = pipeline.create(dai.node.XLinkOut)
xoutRight.setStreamName("right")
monoRight.out.link(xoutRight.input)

# Connect to device with pipeline
# with dai.Device(pipeline) as device:  # Device connection
device_id = "169.254.1.114"
with dai.Device(pipeline, dai.DeviceInfo(device_id)) as device:
    # controlQueue = device.getInputQueue('control')
    # ctrl = dai.CameraControl()
    # ctrl.setManualExposure(expTime, sensIso)
    # controlQueue.send(ctrl)

    arr = ['left', 'right', 'color']
    queues = {}
    frames = {}
    for name in arr:
        queues[name] = device.getOutputQueue(name)

    print("Starting...")
    counter = 0
    msg_exp = None
    while True:
        for name in arr:
            if queues[name].has():
                msg = queues[name].get()
                frames[name] = msg.getCvFrame()
                if name == "color":
                    msg_exp = msg

        for name, frame in frames.items():
            cv2.imshow(name, frame)
            counter += 1

        if msg_exp is not None:
            print("exposure rgb [us] %d" % (msg_exp.getExposureTime().total_seconds() * 1e6))

        key = cv2.waitKey(1)
        if key == ord('q'):
            break
Forgot to mention: the shorter the exposure time, the more the line shifts upwards. The flashing only occurs in the bright part. For example, at 10000 us:
And at 5000 us:
@jakaskerl any update on this? Thanks in advance. Kind regards, Bart