Dropped Frames in High FPS Sequence Syncing

Hi, I have an OAK-FFC 4P with two OAK-FFC OV9782 M12 (22-pin) cameras attached, and I am trying to achieve sequence-number syncing at 120 fps over USB 3.0 using Python. My current issue is that I appear to be dropping frames as I increase the FPS. At 30 fps I can reliably get matching frames from my two camera queues, but as I raise the FPS, the frames grabbed from the respective queues start skipping sequence numbers. At first I thought this could be because I was accumulating frames in the queue, but q['left'].tryGetAll() only ever shows at most one frame in the queue.
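(The check I used for that is roughly the following; tryGetAll() drains the queue and returns every message that was waiting:)

backlog = q['left'].tryGetAll()  # returns all queued frames, or [] if empty
print(len(backlog))              # never prints more than 1 for me, even at 120 fps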

Below is some code that prints the sequence number of the most recent frame from each camera (left and right), and the output shows how frames get skipped.

import depthai as dai
import time

from utils.frame_pairing import PairingSystem

# Camera setup
cam_list = ['left', 'right']
cam_socket_opts = {'left': dai.CameraBoardSocket.CAM_D,
                   'right': dai.CameraBoardSocket.CAM_A}
cam_instance = {'left': 3,
                'right': 0}

# Start defining a pipeline
pipeline = dai.Pipeline()
cam = {}
xout = {}

# Fixes sync issue
script = pipeline.create(dai.node.Script)
script.setProcessor(dai.ProcessorType.LEON_CSS)
script.setScript("""
import GPIO
GPIO.setup(42, GPIO.IN)
""")

# Set up cameras
for cam_name in cam_list:
    xout[cam_name] = pipeline.create(dai.node.XLinkOut)
    xout[cam_name].setStreamName(cam_name)
    cam[cam_name] = pipeline.create(dai.node.ColorCamera)
    #cam[cam_name].setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
    #cam[cam_name].setIspScale(720, 1080)  # 1920x1080 -> 1280x720
    cam[cam_name].isp.link(xout[cam_name].input)
    cam[cam_name].setBoardSocket(cam_socket_opts[cam_name])
    cam[cam_name].setFps(120)
    #cam[cam_name].initialControl.setManualExposure(500, 800)
    cam[cam_name].initialControl.setManualExposure(5000, 100)  # expTime (100-33000 us), sensIso (100-1600)

# Pipeline defined, now the device is assigned and the pipeline is started
with dai.Device(pipeline) as device:
    tstart = time.monotonic()  # applied as an offset
    saveidx = 0

    # Sync setup
    q = {}
    for c in cam_list:
        q[c] = device.getOutputQueue(name=c, maxSize=10, blocking=False)
    ps = PairingSystem()

    # Warm-up parameters
    warming_up = True
    warm_up_time = 1  # seconds
    init_time = time.time()
    frame_num = 0
    i = 0
    last_frame_time = time.time()

    while True:
        # Let the cameras warm up
        if warming_up:
            time_now = time.time()
            if time_now - init_time > warm_up_time:
                warming_up = False

        # instead of get (blocking), tryGet (non-blocking) returns the available data or None
        if q['left'].has():
            l_data = q['left'].tryGet()
            print("L: ", l_data.getSequenceNum())
        if q['right'].has():
            r_data = q['right'].tryGet()
            print("R: ", r_data.getSequenceNum())

Hi @kcancio,
Any reason not to go with the timestamp syncing approach? DepthAI has a dedicated Sync node that abstracts this logic away.
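Roughly, an untested sketch of what that could look like with your sockets (this needs a recent depthai release that includes the Sync node, 2.24 or newer):

import depthai as dai
from datetime import timedelta

pipeline = dai.Pipeline()

# Sync node: collects one message per input and emits them as a group
# once their timestamps fall within the threshold. At 120 fps the frame
# period is ~8.3 ms, so ~4 ms is a reasonable starting tolerance.
sync = pipeline.create(dai.node.Sync)
sync.setSyncThreshold(timedelta(milliseconds=4))

for name, socket in [('left', dai.CameraBoardSocket.CAM_D),
                     ('right', dai.CameraBoardSocket.CAM_A)]:
    cam = pipeline.create(dai.node.ColorCamera)
    cam.setBoardSocket(socket)
    cam.setFps(120)
    cam.isp.link(sync.inputs[name])

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName('synced')
sync.out.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue('synced', maxSize=4, blocking=False)
    while True:
        msg_grp = q.get()  # dai.MessageGroup holding one frame per input
        for name, frame in msg_grp:
            print(name, frame.getSequenceNum())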

Thanks,
Jaka

Thanks for the response! The message syncing documentation says sub-millisecond accuracy can be attained with that approach, which is why we chose it. We're tracking a relatively fast-moving ball, so both latency and accurate frame synchronization are concerns.
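For reference, sync quality on a matched pair can be checked by comparing the device timestamps of the two frames, e.g.:

# given a matched left/right pair from the loop above; getTimestamp()
# returns a timedelta, so the skew between the two frames is:
skew_ms = abs((l_data.getTimestamp() - r_data.getTimestamp()).total_seconds()) * 1000
print(f"pair skew: {skew_ms:.3f} ms")  # want this well below the ~8.3 ms frame period at 120 fps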