Hi @erik,

Sorry, but the question in this post hasn't been resolved.
The synchronization between two AR0234 cameras on an FFC-4P is not correct at 60 fps: some of the frames don't arrive at the host.
I am testing with cam_test.py.
Is there any solution?
Thank you so much.

Hi @PedroGonzalez ,
I think you are hitting the bandwidth limit:

1920 * 1200 = 2.3M pixels
2.3M pixels * 1.5 bytes/pixel (YUV420) = 3.5 MB/frame
3.5 MB * 60 fps * 2 sensors = 415 MB/s
415 MB/s * 8 bits/byte = 3.3 Gbps

I get about 2.5 Gbps downlink from USB3:
luxonis/depthai-experiments/tree/master/random-scripts#oak-bandwidth-test
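To estimate the requirement for other configurations, the arithmetic above can be wrapped in a small helper (plain Python, no depthai needed; the function name is just for illustration):

```python
def required_gbps(width, height, fps, sensors, bytes_per_px=1.5):
    """Uncompressed bandwidth needed, in Gbps (YUV420 = 1.5 bytes/pixel)."""
    return width * height * bytes_per_px * fps * sensors * 8 / 1e9

# Two AR0234 sensors at full resolution, 60 fps:
print(round(required_gbps(1920, 1200, 60, 2), 2))  # -> 3.32 Gbps
```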

Hi @erik

Thank you so much for your quick response. What is the maximum speed I can achieve? Does it only depend on the cable?
There is only one resolution option for these cameras; how can I change the resolution? setVideoSize seems to work.

Hi @PedroGonzalez , you can use the bandwidth test (linked above) to measure the bandwidth. To reduce the resolution, you can use setVideoSize or setIspScale.

Yes, sorry, I get 1.9 Gbps with a 5-meter USB3 cable. I wanted to know the maximum speed in terms of hardware limitations, because if it is less than 3.3 Gbps, the specifications don't really fit my project requirements.

Thank you so much.

Hi @erik

I have bought this cable:

https://www.amazon.co.uk/Amazon-Basics-Certified-supports-transfer/dp/B07D7S93DD
And the speed is still the same, no more than 2000 Mbps. How can I improve it? Is there a hardware limitation on the FFC-4P?

Thank you so much.


Hi @PedroGonzalez
The FFC-4P supports up to 10 Gbps, so the device is not the bottleneck. I'd say it's very likely that the port (USB controller) on your PC is not capable of handling the requested speeds. Any chance you can upgrade that (either a PCIe extension card or a different port)?

Thanks,
Jaka

To enable 10 Gbps, it has to be explicitly requested:

with dai.Device(pipeline, maxUsbSpeed=dai.UsbSpeed.SUPER_PLUS) as device:

By default it will be SUPER (5 Gbps), as we haven't battle-tested the SUPER_PLUS firmware version and there might be some second-order issues.

Note that 10 Gbps may have issues with longer cables (we saw that was the case for a 5 m cable here).

The XLink protocol packetization does some chunking (64 KiB by default); try setting this on the pipeline object for better throughput:
pipeline.setXLinkChunkSize(0)
We haven't changed the default, as it wasn't clear whether it could be an issue for some hosts.
It's better on my side with that option set (uncommented in the script below); I can reach up to 45 fps on 2x AR0234 at full resolution, uncompressed:
python3 utilities/cam_test.py -cams camb,c camc,c -fps 45
-> 1920 * 1200 * 1.5 * 45 * 2 = 311 MB/s

If the app can work with downscaled images, you can set, for example, on the ColorCamera nodes:
.setIspScale(2, 3) -> 1920x1200 * (2/3) = 1280x800
-> can reach 60 fps
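As a sanity check of why 60 fps becomes feasible after downscaling, the same YUV420 bandwidth arithmetic at 1280x800 gives:

```python
# Bandwidth at 1280x800 (after .setIspScale(2, 3)), YUV420, 60 fps, 2 sensors
width, height = 1280, 800
bytes_per_sec = width * height * 1.5 * 60 * 2
gbps = bytes_per_sec * 8 / 1e9
print(f"{bytes_per_sec / 1e6:.0f} MB/s = {gbps:.2f} Gbps")  # -> 184 MB/s = 1.47 Gbps
```

That is well under the ~4 Gbps usable on a USB3 5 Gbps link, unlike the 3.3 Gbps needed at full resolution.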

Another option for better USB throughput is to use UVC instead of XLinkOut, but there is a current limitation that only a single UVC instance can be created (and if multiple nodes are linked to it, all frames get muxed onto the same stream). If the host app can separate the frames, it may work...

config = dai.Device.Config()
config.board.uvc = dai.BoardConfig.UVC(1920, 1200)
config.board.uvc.frameType = dai.ImgFrame.Type.YUV420p

with dai.Device(config) as device:
    ...
    uvc = pipeline.createUVC()
    cam['left'].isp.link(uvc.input)
    cam['right'].isp.link(uvc.input)
    ...
    device.startPipeline(pipeline)

It seems it can reach the combined 120 fps, but it's at the limit and there may be occasional frame drops.
USB3 5 Gbps uses 8b/10b encoding, so only 4 Gbps of usable bandwidth remains, and there is some USB protocol overhead on top.
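Spelled out, that encoding overhead leaves surprisingly little headroom for the dual 60 fps stream:

```python
line_rate_gbps = 5.0                    # USB3 SuperSpeed signaling rate
usable_gbps = line_rate_gbps * 8 / 10   # 8b/10b: 8 data bits per 10 line bits
needed_gbps = 1920 * 1200 * 1.5 * 60 * 2 * 8 / 1e9  # dual AR0234, YUV420, 60 fps
print(usable_gbps, round(needed_gbps, 2))  # -> 4.0 3.32
```

So the payload alone uses over 80% of the post-encoding bandwidth, before any USB protocol overhead.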


Hi @jakaskerl @erik @Luxonis-Alex

Thank you so much for your answers.

Thanks to these two lines of code I now average 2300 Mbps with the long cable and 2700 Mbps with just the cable I bought and mentioned before. Not enough yet.

with dai.Device(pipeline, maxUsbSpeed=dai.UsbSpeed.SUPER_PLUS) as device:
pipeline.setXLinkChunkSize(0)

The speed prints as SUPER. I am using a Lenovo Legion 5, so I have two USB 3.2 Gen 1 ports. Maybe I can try with a USB-C to USB-C cable, because I have one USB-C 3.2 Gen 2 port.
I really like the UVC option, and I think I would be able to identify which camera each frame comes from, but I have done some tests and it's impossible to identify the synchronized frames; they don't arrive in alternating order. Any ideas on how to do it?

Thank you.

    PedroGonzalez
    Yes, it's usually the case that Type-C ports on laptops are more likely to support USB3 Gen2 (10 Gbps). That option passed to dai.Device() only indicates the maximum USB speed the device should advertise as supported; in the end, the host selects the speed.

    For the muxing on UVC, yeah, I realize it could be one of these cases:

    1. Some frame drops may still happen at 60 fps on USB 5 Gbps; 415 MB/s is really at the edge of the practical maximum throughput achievable in general (not only with our devices). Can you check some FPS counters in the app to see if this could be the case, and if so, does it help to reduce the FPS a bit (say, to 50 or 55)?
    2. No frame loss, but the order in which frames are sent is not really controlled; as both cameras produce frames at the same time, it may happen that the 2nd camera ends up sending first. In that case we would need to append some metadata to the frame. A crude way is to encode a few bytes over the image pixels: camera socket/ID, frame sequence number and timestamp (I can build a modified FW with this later). Or, ideally, attach metadata in a standard way for UVC - not exactly sure if that's possible / how.
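The "few bytes over the image pixels" idea can be sketched with plain struct packing. The 13-byte field layout below (socket ID, sequence number, microsecond timestamp) is a made-up example for illustration, not the actual Luxonis firmware format:

```python
import struct

# Hypothetical header stamped over the first luma bytes of a YUV420 frame.
# Layout (13 bytes, little-endian): 1B socket ID, 4B sequence, 8B timestamp (us).
HDR = struct.Struct("<BIQ")

def stamp(frame: bytearray, socket_id: int, seq: int, ts_us: int) -> None:
    """Overwrite the first HDR.size bytes of the Y plane with metadata."""
    frame[:HDR.size] = HDR.pack(socket_id, seq, ts_us)

def read_stamp(frame: bytes) -> tuple:
    """Recover (socket_id, seq, ts_us) from the first bytes of a frame."""
    return HDR.unpack_from(frame, 0)

# Demo on a fake 1920x1200 Y plane
buf = bytearray(1920 * 1200)
stamp(buf, 2, 1234, 987654321)
print(read_stamp(buf))  # -> (2, 1234, 987654321)
```

On the host side, the receiving app would call read_stamp() on each UVC frame to pair frames by sequence number regardless of arrival order; the stamped corner pixels are sacrificed.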

      Luxonis-Alex

      Yeah, 50-55 FPS could be OK.
      With this code to create the UVC camera:

      import time
      import depthai as dai
      
      def getPipeline():
          pipeline = dai.Pipeline()
      
          # Define sources - two color cameras
          camB = pipeline.create(dai.node.ColorCamera)
          camC = pipeline.create(dai.node.ColorCamera)
      
          # Set properties for camera B
          camB.setBoardSocket(dai.CameraBoardSocket.CAM_B)
          camB.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1200_P)
      
          # Set properties for camera C
          camC.setBoardSocket(dai.CameraBoardSocket.CAM_C)
          camC.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1200_P)
          # Create an UVC (USB Video Class) output node
          uvc = pipeline.createUVC()
          camB.isp.link(uvc.input)
          camC.isp.link(uvc.input)
          # Note: if the pipeline is sent later to device (using startPipeline()),
          # it is important to pass the device config separately when creating the device
          config = dai.Device.Config()
          # config.board.uvc = dai.BoardConfig.UVC()  # enable default 1920x1080 NV12
          config.board.uvc = dai.BoardConfig.UVC(1920, 1200)
          config.board.uvc.frameType = dai.ImgFrame.Type.YUV420p
          # config.board.uvc.cameraName = "My Custom Cam"
          pipeline.setBoardConfig(config.board)
      
          return pipeline
      
      
      # Standard UVC load with depthai
      with dai.Device(getPipeline()) as device:
          print("\nDevice started, please keep this process running")
          print("and open an UVC viewer to check the camera stream.")
          print("\nTo close: Ctrl+C")
      
          # Doing nothing here, just keeping the host feeding the watchdog
          try:
              while True:
                  time.sleep(0.1)  # sleep instead of busy-waiting at 100% CPU
          except KeyboardInterrupt:
              pass

      And this one to record videos:

      import cv2
      import os
      from datetime import datetime
      
      # Set up the video capture object to capture video from the first camera device
      cap = cv2.VideoCapture(0)
      
      # Check if the camera opened successfully
      if not cap.isOpened():
          print("Error: Could not open video device.")
          exit()
      
      # Set the properties for capture
      cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)
      cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1200)
      cap.set(cv2.CAP_PROP_FPS, 60)
      
      # Create a folder to save the frames
      folder_name = 'Captured_Frames'
      os.makedirs(folder_name, exist_ok=True)
      fourcc = cv2.VideoWriter_fourcc(*'mp4v')  # Use 'mp4v' for MP4 format
      out = cv2.VideoWriter('output.mp4', fourcc, 60.0, (1920, 1200))
      # Capture frames in a loop
      frame_count = 0
      while True:
          ret, frame = cap.read()
          if not ret:
              print("Error: No frame has been captured.")
              break
      
          # Save each frame to the specified folder
          #timestamp = datetime.now().strftime('%Y%m%d_%H%M%S%f')
          frame_filename = os.path.join(folder_name, f'frame_{frame_count}.jpg')
          cv2.imwrite(frame_filename, frame)
          frame_count += 1
          out.write(frame)
          # Display the resulting frame
          cv2.imshow('frame', frame)
      
          # Press 'q' on the keyboard to exit the loop
          if cv2.waitKey(1) & 0xFF == ord('q'):
              break
      
      # When everything is done, release the capture and writer objects
      cap.release()
      out.release()
      cv2.destroyAllWindows()

      Recording a chronometer at 30 FPS, I can't find two frames with the same time; it seems that only one of the two frames arrives for each instant.
      It would be really helpful for my project if you could achieve the UVC approach with metadata to synchronize the frames.

      Thank you so much for your help.