bob

  • 6 Feb
  • Joined Jun 14, 2023
  • 0 best answers

    I am having trouble getting external sync working on an OAK-FFC 4P with socket A connected to an OAK-FFC-IMX378-W and sockets B/C connected to OAK-FFC-OV9282-Ws, all on Luxonis white flexes, connected via USB3 to a workstation. The mono cameras stop running and the color camera occasionally triggers with corrupted frames. I have read forum posts and https://docs.luxonis.com/hardware/platform/deploy/frame-sync/. I am using depthai version 2.29.0.0.dev0+27410ce1be892f0120394bd123ac2f3e103f0a80 and the Luxonis-provided "depthai\utilities\cam_test.py" script, modified to uncomment the sync option "cam[c].initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)", set the default frame rate to 10 Hz, and set GPIO 6 to output high.

    The external sync is driven through PRBS connector pins 6 (FSIN) and 7 (GND) from an HP33120A function generator, which produces a 1.8 V square wave at 10 Hz with about ±1 microsecond of error (jitter plus inaccuracy, verified on a scope), and each camera is receiving the signal at TP2 (FSIN). When no external signal is connected, the mono cameras start up and run at 10 FPS and the color camera produces no output. When external sync is commented out, all cameras run at 10 FPS as expected (or at 30 FPS with the unmodified script).

    This forum post https://discuss.luxonis.com/d/5323-hardware-trigger-synchronization-including-rgb-image-oakd-s2/20 seems very similar, with the same visual artifacts; does external sync work? If so, could Luxonis provide a minimal script with 2 mono and 1 color cameras so I can continue to debug the hardware? Are there maximum or minimum pulse width/duty cycle requirements? I have tried 20-50% duty cycles. Thank you.
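
    In the meantime, here is the trimmed-down sketch I have been testing with (my own reduction of cam_test.py: 2 mono + 1 color, all slaved to FSIN, with the FrameSyncMode and GPIO 6 usage taken from the frame-sync docs; posting it in case my use of the API is the problem):

    #!/usr/bin/env python3
    # Minimal external-sync test: OV9282s on B/C and IMX378 on A, all set to
    # FrameSyncMode.INPUT so they wait for the FSIN pulse. GPIO 6 is driven high
    # per the frame-sync documentation for the OAK-FFC 4P.
    import depthai as dai

    FPS = 10
    pipeline = dai.Pipeline()

    colorCam = pipeline.create(dai.node.ColorCamera)
    colorCam.setBoardSocket(dai.CameraBoardSocket.CAM_A)
    colorCam.setFps(FPS)
    colorCam.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)
    xoutColor = pipeline.create(dai.node.XLinkOut)
    xoutColor.setStreamName('color')
    colorCam.isp.link(xoutColor.input)

    for socket, name in [(dai.CameraBoardSocket.CAM_B, 'left'),
                         (dai.CameraBoardSocket.CAM_C, 'right')]:
        mono = pipeline.create(dai.node.MonoCamera)
        mono.setBoardSocket(socket)
        mono.setFps(FPS)
        mono.initialControl.setFrameSyncMode(dai.CameraControl.FrameSyncMode.INPUT)
        xout = pipeline.create(dai.node.XLinkOut)
        xout.setStreamName(name)
        mono.out.link(xout.input)

    # Drive GPIO 6 high to enable FSIN routing to the camera sockets (per the docs)
    config = dai.Device.Config()
    config.board.gpio[6] = dai.BoardConfig.GPIO(dai.BoardConfig.GPIO.OUTPUT,
                                                dai.BoardConfig.GPIO.Level.HIGH)

    with dai.Device(config) as dev:
        dev.startPipeline(pipeline)
        queues = [dev.getOutputQueue(n, maxSize=4, blocking=False) for n in ('color', 'left', 'right')]
        while True:  # print timestamps of whatever arrives; Ctrl+C to stop
            for q in queues:
                frame = q.tryGet()
                if frame is not None:
                    print(q.getName(), frame.getTimestampDevice())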

      Thank you; very cool product! It doesn't look like it is linked from the store website yet. Unfortunately the dimensions won't fit my immediate need, but it is great to know it is available!

        Is this the S4 devkit you mentioned https://shop.luxonis.com/products/oak-4-som ?

        Are full spec sheets and a development guide available after an NDA is signed? (For example: detailed information about ports supported by the chip but not mentioned in the description, like PCIe; layout guides such as lane length-tuning requirements and power quality/decoupling; and information about which image sensors and sensor features are supported, or detailed information about how to tune them.)

          I see a few RVC4 products are now available for pre-order; any update on the release plans for RVC4 FFC products?

          • I don't think it will process that many pixels; there are a number of hardware blocks in the chain and each has its own limits - they are listed in the documentation somewhere. I remember something like 248 million pixels/s for the encoder, but you would have to check.
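            As a rough sanity check on that number (my arithmetic, using the figure as I remember it): 4K is 3840 × 2160 ≈ 8.3 Mpixel per frame, so 30 FPS is ≈ 249 Mpixel/s, right at a 248 Mpixel/s encoder limit, while 1080p at 30 FPS is only ≈ 62 Mpixel/s.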

            I am currently stuck at 10 FPS on smaller cameras than you are planning; the limit is because the auto focus/exposure/white balance (3A) algorithm, which you can set to run on every nth frame, seems to have a bug (it defaults to every frame) that pegs the core tending the Ethernet interface at 100% (dropping most frames) or causes it to crash. Luxonis runs my code and gets a different result than I get, and I haven't figured out why or had time to poke at it much since I dropped the other thread.
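
            The every-nth-frame knob I mean is setIsp3aFps(); for example, on a ColorCamera or MonoCamera node (cam here is whatever node you created):

                cam.setIsp3aFps(5)  # run the 3A loop at 5 FPS instead of on every frame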

            You won't ever get real-time performance from Windows unless "real time" means loop closure on the order of tens of seconds, or your application doesn't have to reliably close the loop every interval.

          • It depends on frame rate and settings. Preliminary testing suggested about 7-30% CPU utilization for care and feeding of the PHY, which also changes with data throughput. Unfortunately, that same core is also used for auto focus/exposure/white balance, so there might be a larger performance hit than one might expect if you are hoping to maximize frame rate. It isn't hard to keep network delays reasonable if you control the network and carefully optimize the host OS.

          • After trying the troubleshooting listed, I exchanged it.

          • The network consists of an unmanaged PoE switch which operates at 2.5 Gbit/s (faster than the 1 G connection of the OAK) and a new workstation running Windows 10/depthai; the device is essentially directly connected and only limited by the OAK. I have tried two different brands of PoE switches, TRENDnet and TP-Link, just to make sure. The code in the last post above either runs continuously at 100% CPU or crashes, so there must be a setup difference. The bootloader reports the card is running version 0.22. Is the bootloader just a bootloader which accepts code, starts it, and cleanly exits, or does it do more?
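
            For reference, I am reading the bootloader version with the standard DeviceBootloader query (essentially Luxonis' bootloader_version example, assuming I am using it correctly):

            import depthai as dai

            # Find the first device visible to the bootloader and print its version
            (found, info) = dai.DeviceBootloader.getFirstAvailableDevice()
            if found:
                with dai.DeviceBootloader(info) as bl:
                    print(f'Bootloader version: {bl.getVersion()}')
            else:
                print('No device found')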

            When I try to update the user bootloader over PoE, it always fails during verification. When I try to update it over USB, it lists the device state as unbooted (as shown) with the message: "Factory Bootloader Doesn't support User Bootloader flash Factory bootloader will be updated instead." Before I take the risk of bricking the device and losing even more time on an exchange, please confirm that:

            1) the bootloader contains firmware or performs some initialization and setup for the next stage beyond just accepting an Ethernet connection, so updating the bootloader might be a valid troubleshooting step;

            2) the error message I am getting makes sense for an OAK-FFC 4P PoE shipped a few months ago and running 0.22, and pushing the proceed button is not likely to damage it.

            • erik replied to this.
            • The 3A algorithm calculation runs even when Autofocus/WB are set to off?

              Adding setIsp3aFps() did not change CPU utilization in the following MRE (just the mono cameras, 30 FPS, 100% CPU). Any additional ideas?

              #!/usr/bin/env python3

              import cv2
              import depthai as dai

              FPS = 30

              # Create pipeline
              pipeline = dai.Pipeline()

              # Define sources and outputs
              monoLeftNode = pipeline.create(dai.node.MonoCamera)
              monoRightNode = pipeline.create(dai.node.MonoCamera)
              veMonoL = pipeline.create(dai.node.VideoEncoder)
              veMonoR = pipeline.create(dai.node.VideoEncoder)
              veMonoLOut = pipeline.create(dai.node.XLinkOut)
              veMonoLOut.setStreamName('veMonoLOut')
              veMonoROut = pipeline.create(dai.node.XLinkOut)
              veMonoROut.setStreamName('veMonoROut')

              # Properties
              monoLeftNode.setBoardSocket(dai.CameraBoardSocket.CAM_C)
              monoLeftNode.setFps(FPS)
              monoLeftNode.setIsp3aFps(FPS)
              monoRightNode.setBoardSocket(dai.CameraBoardSocket.CAM_B)
              monoRightNode.setFps(FPS)
              monoRightNode.setIsp3aFps(FPS)

              # Configure encoders (lossless MJPEG)
              veMonoL.setDefaultProfilePreset(FPS, dai.VideoEncoderProperties.Profile.MJPEG)
              veMonoR.setDefaultProfilePreset(FPS, dai.VideoEncoderProperties.Profile.MJPEG)
              veMonoL.setLossless(True)
              veMonoR.setLossless(True)

              # Linking
              monoLeftNode.out.link(veMonoL.input)
              monoRightNode.out.link(veMonoR.input)
              veMonoL.bitstream.link(veMonoLOut.input)
              veMonoR.bitstream.link(veMonoROut.input)

              # Connect to device and start pipeline
              with dai.Device(pipeline) as dev:
                  # Output queues will be used to get the encoded data from the outputs defined above
                  outQ2 = dev.getOutputQueue(name='veMonoLOut', maxSize=30, blocking=True)
                  outQ3 = dev.getOutputQueue(name='veMonoROut', maxSize=30, blocking=True)

                  dev.setLogLevel(dai.LogLevel.DEBUG)
                  dev.setLogOutputLevel(dai.LogLevel.DEBUG)

                  # The output files hold the raw encoded bitstreams (MJPEG here, despite the names); not directly playable
                  with open('monoL.h264', 'wb') as fileMonoL, open('monoR.h264', 'wb') as fileMonoR:
                      print("Press Ctrl+C to stop encoding...")
                      while True:
                          try:
                              while outQ2.has():
                                  outQ2.get().getData().tofile(fileMonoL)
                              while outQ3.has():
                                  outQ3.get().getData().tofile(fileMonoR)
                          except KeyboardInterrupt:
                              # Keyboard interrupt (Ctrl + C) detected
                              break
              • I tried disabling autofocus and white balance and decreasing the frame rate to 2 Hz. I am skeptical that the auto exposure calculation at a 2 Hz frame rate is the problem. Using the Aug 31 release build.

              • I noticed frames were being dropped and discovered that the LeonOS load increases to 100% in about 2 s and stays there. I am using an OAK-FFC 4P PoE with two OV9282s and an AR0234, into an unmanaged switch with only the host and device on the network. I tried decreasing the frame rate down to 2 Hz, using jumbo frames, and turning off autofocus and white balance. This sounds somewhat similar to a recent post by John194 from 12 days ago. Any idea why this is happening?

                MRE

                #!/usr/bin/env python3

                import cv2
                import depthai as dai
                from itertools import cycle

                def printSystemInformation(info):
                    m = 1024 * 1024  # MiB
                    print(f"Ddr used / total - {info.ddrMemoryUsage.used / m:.2f} / {info.ddrMemoryUsage.total / m:.2f} MiB")
                    print(f"Cmx used / total - {info.cmxMemoryUsage.used / m:.2f} / {info.cmxMemoryUsage.total / m:.2f} MiB")
                    print(f"LeonCss heap used / total - {info.leonCssMemoryUsage.used / m:.2f} / {info.leonCssMemoryUsage.total / m:.2f} MiB")
                    print(f"LeonMss heap used / total - {info.leonMssMemoryUsage.used / m:.2f} / {info.leonMssMemoryUsage.total / m:.2f} MiB")
                    t = info.chipTemperature
                    print(f"Chip temperature - average: {t.average:.2f}, css: {t.css:.2f}, mss: {t.mss:.2f}, upa: {t.upa:.2f}, dss: {t.dss:.2f}")
                    print(f"Cpu usage - Leon CSS: {info.leonCssCpuUsage.average * 100:.2f}%, Leon MSS: {info.leonMssCpuUsage.average * 100:.2f} %")
                    print("----------------------------------------")

                # Create pipeline
                pipeline = dai.Pipeline()

                # Jumbo frames and lower delays
                config = dai.Device.Config()
                config.board.network.mtu = 9000  # Jumbo frames. Default 1500
                #config.board.network.xlinkTcpNoDelay = False  # Default True
                config.board.sysctl.append("net.inet.tcp.delayed_ack=1")  # configure sysctl settings. 0 by default.

                # Define sources and outputs
                camRgb = pipeline.create(dai.node.ColorCamera)
                monoLeft = pipeline.create(dai.node.MonoCamera)
                monoRight = pipeline.create(dai.node.MonoCamera)
                ve1 = pipeline.create(dai.node.VideoEncoder)
                ve2 = pipeline.create(dai.node.VideoEncoder)
                ve3 = pipeline.create(dai.node.VideoEncoder)
                ve1Out = pipeline.create(dai.node.XLinkOut)
                ve2Out = pipeline.create(dai.node.XLinkOut)
                ve3Out = pipeline.create(dai.node.XLinkOut)
                ve1Out.setStreamName('ve1Out')
                ve2Out.setStreamName('ve2Out')
                ve3Out.setStreamName('ve3Out')
                sysLog = pipeline.create(dai.node.SystemLogger)
                linkOut = pipeline.create(dai.node.XLinkOut)
                linkOut.setStreamName("sysinfo")
                controlIn = pipeline.create(dai.node.XLinkIn)
                controlIn.setStreamName('control')

                # Properties
                camRgb.setBoardSocket(dai.CameraBoardSocket.CAM_A)
                camRgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1200_P)  # using 1200_P camera
                monoLeft.setCamera("left")
                monoRight.setCamera("right")
                sysLog.setRate(1)  # 1 Hz

                # Create encoders, one for each camera, consuming the frames and encoding them (lossless MJPEG)
                ve1.setDefaultProfilePreset(10, dai.VideoEncoderProperties.Profile.MJPEG)
                ve2.setDefaultProfilePreset(10, dai.VideoEncoderProperties.Profile.MJPEG)
                ve3.setDefaultProfilePreset(10, dai.VideoEncoderProperties.Profile.MJPEG)
                ve1.setFrameRate(10)
                ve2.setFrameRate(10)
                ve3.setFrameRate(10)
                ve1.setLossless(True)
                ve3.setLossless(True)

                # Linking
                monoLeft.out.link(ve1.input)
                camRgb.video.link(ve2.input)
                monoRight.out.link(ve3.input)
                ve1.bitstream.link(ve1Out.input)
                ve2.bitstream.link(ve2Out.input)
                ve3.bitstream.link(ve3Out.input)
                sysLog.out.link(linkOut.input)
                controlIn.out.link(camRgb.inputControl)
                controlIn.out.link(monoLeft.inputControl)
                controlIn.out.link(monoRight.inputControl)

                # Connect to device (passing the config so the MTU/sysctl settings apply) and start pipeline
                with dai.Device(config) as dev:
                    dev.startPipeline(pipeline)

                    # Output queues will be used to get the encoded data from the outputs defined above
                    outQ1 = dev.getOutputQueue(name='ve1Out', maxSize=30, blocking=True)
                    outQ2 = dev.getOutputQueue(name='ve2Out', maxSize=30, blocking=True)
                    outQ3 = dev.getOutputQueue(name='ve3Out', maxSize=30, blocking=True)
                    qSysInfo = dev.getOutputQueue(name="sysinfo", maxSize=4, blocking=False)

                    dev.setLogLevel(dai.LogLevel.DEBUG)
                    dev.setLogOutputLevel(dai.LogLevel.DEBUG)

                    # Disable autofocus and auto white balance on all cameras
                    controlQueue = dev.getInputQueue('control')
                    ctrl = dai.CameraControl()
                    ctrl.setAutoFocusMode(dai.CameraControl.AutoFocusMode.OFF)
                    ctrl.setAutoWhiteBalanceMode(dai.CameraControl.AutoWhiteBalanceMode.OFF)
                    controlQueue.send(ctrl)

                    # The output files hold the raw encoded bitstreams (not playable yet)
                    with open('mono1.h264', 'wb') as fileMono1H264, open('color.h265', 'wb') as fileColorH265, open('mono2.h264', 'wb') as fileMono2H264:
                        print("Press Ctrl+C to stop encoding...")
                        while True:
                            try:
                                # Empty each queue
                                while outQ1.has():
                                    outQ1.get().getData().tofile(fileMono1H264)
                                while outQ2.has():
                                    outQ2.get().getData().tofile(fileColorH265)
                                while outQ3.has():
                                    outQ3.get().getData().tofile(fileMono2H264)
                                while qSysInfo.has():
                                    sysInfo = qSysInfo.get()
                                    printSystemInformation(sysInfo)
                            except KeyboardInterrupt:
                                # Keyboard interrupt (Ctrl + C) detected
                                break
                • erik replied to this.
                • Here is an example of the increasing timestamps (trace=10). ts is sometimes >0.5 s; decreasing the time between captures sometimes makes it worse, but it can still occur with captures spaced minutes apart.

                  py: Saved image as: dataset\right\right_p2_6.png

                  Status of right is True

                  Start capturing...

                  new minimum: {'ts': 0.19998200000000033, 'indicies': {'rgb': 0, 'left': 0, 'right': 0}} min required: 0.1

                  new minimum: {'ts': 0.20850499999999705, 'indicies': {'rgb': 0, 'left': 0, 'right': 0}} min required: 0.1

                  new minimum: {'ts': 0.19215599999999888, 'indicies': {'rgb': 0, 'left': 0, 'right': 0}} min required: 0.1

                  new minimum: {'ts': 0.10017499999999657, 'indicies': {'rgb': 3, 'left': 3, 'right': 3}} min required: 0.1

                  new minimum: {'ts': 0.199955000000001, 'indicies': {'rgb': 0, 'left': 0, 'right': 0}} min required: 0.1

                  new minimum: {'ts': 0.10017499999999657, 'indicies': {'rgb': 2, 'left': 2, 'right': 2}} min required: 0.1

                  new minimum: {'ts': 0.1999569999999995, 'indicies': {'rgb': 0, 'left': 0, 'right': 0}} min required: 0.1

                  new minimum: {'ts': 0.10017499999999657, 'indicies': {'rgb': 1, 'left': 1, 'right': 1}} min required: 0.1

                  new minimum: {'ts': 0.10017499999999657, 'indicies': {'rgb': 0, 'left': 0, 'right': 0}} min required: 0.1

                  new minimum: {'ts': 0.30305099999999996, 'indicies': {'rgb': 0, 'left': 0, 'right': 0}} min required: 0.1

                  new minimum: {'ts': 0.3059590000000014, 'indicies': {'rgb': 0, 'left': 0, 'right': 0}} min required: 0.1

                  new minimum: {'ts': 0.30886900000000495, 'indicies': {'rgb': 0, 'left': 0, 'right': 0}} min required: 0.1

                  new minimum: {'ts': 0.21761300000000006, 'indicies': {'rgb': 3, 'left': 3, 'right': 3}} min required: 0.1

                  new minimum: {'ts': 0.3117789999999978, 'indicies': {'rgb': 0, 'left': 0, 'right': 0}} min required: 0.1

                  new minimum: {'ts': 0.21761300000000006, 'indicies': {'rgb': 2, 'left': 2, 'right': 2}} min required: 0.1

                  new minimum: {'ts': 0.3146889999999978, 'indicies': {'rgb': 0, 'left': 0, 'right': 0}} min required: 0.1

                  new minimum: {'ts': 0.21761300000000006, 'indicies': {'rgb': 1, 'left': 1, 'right': 1}} min required: 0.1

                  new minimum: {'ts': 0.21761300000000006, 'indicies': {'rgb': 0, 'left': 0, 'right': 0}} min required: 0.1

                  py: Capture failed, unable to sync images! Fix the argument minSyncTimestamp or (-mts). Set to: 0.1

                  Images were unable to sync, threshold to high. Device closing with exception.

                  • erik replied to this.
                  • Thank you for the responses. The switch adds about 2 microseconds of latency and runs at 2.5 Gbit/s, so the path is only limited by the speed of the OAK-D 4 FFC card. Typically it just reports an error communicating with the rgb camera, and the timing errors are also from the rgb camera, with delays increasing up to 0.5 s. I will paste in a detailed error log when I run it tomorrow; is there a way to run the calibration in debug mode, or to get more information when it fails than the (likely not terribly helpful) increasing timestamps and "XLink communication with rgb camera failed" message?
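
                    In the meantime I will try raising the library log level before running the script, assuming the DEPTHAI_LEVEL environment variable (which controls depthai's own logging) also covers calibrate.py:

                        set DEPTHAI_LEVEL=debug   (Windows cmd; use "export DEPTHAI_LEVEL=debug" on Linux)
                        python3 calibrate.py -s 14.81 -brd bdcfg.txt -db -drgb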

                  • I am struggling with calibration and had a few more questions.

                    1. Some of the polygons for the calibration appear impossible; the script does not redraw the polygon based on data from the board config file, which provides the camera locations. It seems like both polygon shape and size would need to change based on baseline, and certainly based on camera order (color, left, right ought to be different than left, color, right). Does location really matter, or are you just trying to get enough images in very different orientations to let the algorithms sort it out? If it does matter, do you have figures of the poses/locations? Do even more images lower the epipolar error?
                    2. I am running the latest released code and get quite a few -mts errors (up to 0.5 s). I am connected to the OAK-D 4 FFC PoE via a TRENDnet 2.5 G PoE switch. Nothing else is on the network, everything is running at 1 G, and there are no firewalls. The switch adds around 2 us of latency during traffic tests. What would cause this? If this parameter only ensures samples are closely grouped to eliminate motion errors, it would be helpful to just toss the bad frames and report why (like the checkerboard-not-seen error) instead of halting the calibration and forcing a restart. (I am using an OV9282 pair with an AR0234, no external sync yet.)
                    3. There are a number of XLink communication errors with the rgb camera. Power to the board looks clean. The SW displays a warning that using the AR0234 with the OV9282 is not fully supported; should this be expected?

                    Thank you.

                    • Thank you; much better! It would be helpful if the calibration files didn't require an SSH key and if Chocolatey pointed to a newer code base.

                    • Version 2.20.2.0 and tag v3.4.0.

                      I am currently using a 63 cm baseline and am potentially trying to increase it to 100 cm, making full use of the 50 cm flexes you offer.

                      • Calibration terminates with error KeyError: 'left_to_right_distance_cm'

                        File "C:\Users\gstern\AppData\Local\Programs\DepthAI\depthai\calibrate.py", line 570, in calibrate

                        - self.board_config['board_config']['left_to_right_distance_cm'], 0.0, 0.0]

                        What are some likely causes?

                        Is there a way to change the board rectangle to a different color than black? It is often difficult to see against the black and white image preview.

                        Thank you.

                        Using two OV9282-M12s on left/right (sockets B/C) and an AR0234-M12 on CAM_A.

                        python3 calibrate.py -s 14.81 -brd bdcfg.txt -db -drgb

                        bdcfg.txt:

                        {
                            "board_config":
                            {
                                "cameras": {
                                    "CAM_A": {
                                        "name": "rgb",
                                        "hfov": 90,
                                        "type": "color"
                                    },
                                    "CAM_B": {
                                        "name": "left",
                                        "hfov": 75,
                                        "type": "mono",
                                        "extrinsics": {
                                            "to_cam": "CAM_A",
                                            "specTranslation": {
                                                "x": -14.0,
                                                "y": 0,
                                                "z": 0
                                            },
                                            "rotation": {
                                                "r": 0,
                                                "p": 0,
                                                "y": 0
                                            }
                                        }
                                    },
                                    "CAM_C": {
                                        "name": "right",
                                        "hfov": 75,
                                        "type": "mono",
                                        "extrinsics": {
                                            "to_cam": "CAM_A",
                                            "specTranslation": {
                                                "x": 49,
                                                "y": 0,
                                                "z": 0
                                            },
                                            "rotation": {
                                                "r": 0,
                                                "p": 0,
                                                "y": 0
                                            }
                                        }
                                    }
                                },
                                "stereo_config": {
                                    "left_cam": "CAM_B",
                                    "right_cam": "CAM_C"
                                }
                            }
                        }
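
                        In case it is relevant: older board configs I have seen (e.g. the BW1098OBC json shipped with earlier depthai releases) used a flat layout with the baseline given directly, which is where a 'left_to_right_distance_cm' key would come from. Purely a guess, but is line 570 perhaps still expecting something like this (legacy field names, values filled in from my setup above)?

                        {
                            "board_config":
                            {
                                "name": "custom",
                                "swap_left_and_right_cameras": false,
                                "left_fov_deg": 75,
                                "rgb_fov_deg": 90,
                                "left_to_right_distance_cm": 63.0,
                                "left_to_rgb_distance_cm": 14.0
                            }
                        }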

                        • Many chips specify a current at which you can safely reverse-drive the internal ESD diodes without breaking them or causing latch-up; that number helps choose appropriate series resistors for passive signal-level translation. I will test; thank you for the help.
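
                          For example, with made-up but representative numbers: driving a 1.8 V-rail input from a 3.3 V source, with the protection diode clamping roughly 0.7 V above the rail and a 1 mA safe reverse-drive current, the series resistor needs to satisfy R ≥ (3.3 − 1.8 − 0.7) V / 1 mA = 800 Ω.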

                          • erik replied to this.