Hi,

I've recently purchased an OAK-FFC-3P along with three Arducam AR0234 camera modules, and I'm having difficulty with pretty much all the examples. I'm entirely expecting this to be user error and figure there's something fundamental I've missed.

If I write my own simple scripts using bits from various examples, I can get a list of the three camera modules and they are recognised fine; I can even run one of the RGB examples that shows the source and cropped images. However, I can't get IMU data, and I can't seem to get images from the left or right cameras.
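Something along these lines is all I'm doing to list the modules (a minimal sketch; it assumes a recent depthai release that has `getConnectedCameraFeatures`, and needs the device connected):

```python
import depthai as dai

# Enumerate the camera modules the device can see on each socket.
with dai.Device() as device:
    for cam in device.getConnectedCameraFeatures():
        print(cam.socket, cam.sensorName, cam.width, "x", cam.height)
```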

Is there an idiot's guide to getting started with the 3P board, please? I'm a software engineer with a lot of experience, but I'm still relatively new to computer vision. If anyone could give me pointers to get started, I'd appreciate it.

Photo for attention: this is the bot I'll be putting the 3P into once I've got it working 🙂

  • erik replied to this.

    Hi kneave ,
    I think most demos don't work because you have 3x ColorCamera, and many demos use MonoCamera depthai nodes. After OAK-FFC calibration you can also do stereo depth, demo here, which also shows how to use multiple ColorCamera nodes (instead of MonoCamera nodes). Thoughts?
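    To illustrate, swapping the MonoCamera nodes for ColorCamera nodes looks roughly like this (a sketch, not tested on your exact setup; the CAM_B/CAM_C socket mapping and the THE_1200_P resolution are assumptions for AR0234 modules on an OAK-FFC-3P):

```python
import depthai as dai

# Preview two of the three colour modules by creating ColorCamera
# nodes on CAM_B/CAM_C instead of the MonoCamera nodes most demos
# assume. Adjust sockets/resolution to match your wiring and sensors.
pipeline = dai.Pipeline()
for name, socket in [("left", dai.CameraBoardSocket.CAM_B),
                     ("right", dai.CameraBoardSocket.CAM_C)]:
    cam = pipeline.create(dai.node.ColorCamera)
    cam.setBoardSocket(socket)
    cam.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1200_P)
    xout = pipeline.create(dai.node.XLinkOut)
    xout.setStreamName(name)
    cam.isp.link(xout.input)

with dai.Device(pipeline) as device:
    queues = [device.getOutputQueue(n, maxSize=4, blocking=False)
              for n in ("left", "right")]
    while True:
        for q in queues:
            msg = q.tryGet()
            if msg is not None:
                print(q.getName(), msg.getWidth(), "x", msg.getHeight())
```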
    And that robot looks really cool!! Would love to see picture with the OAK FFC on there🙂
    Thanks, Erik

    Ah! I thought that mono would just return a mono image; I didn't realise it would also require or expect a mono camera. This all looks ideal, I'll give it a go today. Thanks for the tips!

    The plan is for wide-angle colour cameras for the stereo pair and a narrower lens on the centre camera. I figure that'll give a good combination for SLAM, plus a more zoomed-in camera for distance viewing. I'm hoping to find a small camera for the centre so that I can hide it behind the panel between his eyes and not spoil the aesthetic 🙂

    I've switched to and built the develop branch of depthai-python (I saw that the multi_cam_support branch was merged in a few weeks ago), and I've gone through calibrate.py from the depthai library and modified it for three colour cameras.

    I'm really struggling to get more than a few calibration images accepted by the calibration script, and none at all unless I disable the RGB sensor. Could this be because I'm using fisheye lenses? The left and right cameras have 160-degree lenses and the RGB has a 90-degree lens. Do I have to do anything different to enable a fisheye mode? I know that's something I've needed to do with ROS before.

    Here is the calibration file I created in case it helps.

    {
        "board_config":
        {
            "name": "NE3P",
            "revision": "V1.0",
            "swap_left_and_right_cameras": false,
            "left_fov_deg": 160,
            "rgb_fov_deg": 90,
            "left_to_right_distance_cm": 8.0,
            "left_to_rgb_distance_cm": 4.0
        }
    }

    Any tips would be welcome, I'm happy I'm at least getting images now but it's still a little frustrating.

    Hi @kneave,

    Try to calibrate using this branch (https://github.com/luxonis/depthai/tree/ar0234_x3). You have to add a custom board config for it to work: try adding something like this in depthai/resources/boards/FFC-3P.json, and then be sure to select the board while doing the calibration, for example python3 calibrate.py -brd FFC-3P. Be sure that you use the HFOV, not the DFOV or something else, and check that the left/right/rgb cameras are in the right place (change the CAM_A/B/C ports as needed).

    {
        "board_config":
        {
            "name": "NE3P",
            "revision": "R1M0E1",
            "cameras":{
                "CAM_B": {
                    "name": "right",
                    "hfov": 160,
                    "type": "color",
                    "extrinsics": {
                        "to_cam": "CAM_C",
                        "specTranslation": {
                            "x": 8,
                            "y": 0,
                            "z": 0
                        },
                        "rotation":{
                            "r": 0,
                            "p": 0,
                            "y": 0
                        }
                    }
                },
                "CAM_C": {
                    "name": "left",
                    "hfov": 160,
                    "type": "color",
                    "extrinsics": {
                        "to_cam": "CAM_A",
                        "specTranslation": {
                            "x": 4,
                            "y": 0,
                            "z": 0
                        },
                        "rotation":{
                            "r": 0,
                            "p": 0,
                            "y": 0
                        }
                    }
                },
                "CAM_A": {
                    "name": "rgb",
                    "hfov": 90,
                    "type": "color"
                }
    
            },
            "stereo_config":{
                "left_cam": "CAM_B",
                "right_cam": "CAM_C"
            }
        }
    }

    Thanks, I'll have a look at this today hopefully

    I've gone through this a few times today and got much further, so thanks for the tips! I'm pretty sure one of the issues I was facing was due to a poor-quality calibration image: the printer put out something more grey than black, so I reprinted on photo paper and it's much happier.

    The first few times I got XLink errors (pretty sure I knocked a cable), and then, when I was almost done, the battery in my laptop died as it was taking the last image!

    I just got through them all for the first time after a recharge; it collected them all, then hit an error in OpenCV. I've not had a chance to debug it yet, so I'll have another look tomorrow.
    OpenCV(4.5.1) C:\Users\appveyor\AppData\Local\Temp\1\pip-req-build-5rb_9df3\opencv_contrib\modules\aruco\src\charuco.cpp:448: error: (-215:Assertion failed) _markerCorners.total() == _markerIds.getMat().total() && _markerIds.getMat().total() > 0 in function 'cv::aruco::_interpolateCornersCharucoLocalHom'

    Though it now looks fairly obvious: "appveyor" isn't my username, so I suspect there is a hard-coded path somewhere in the calibration script. I'll have a look tomorrow; hopefully it's an easy win. 🙂

    Hmm, no, AppVeyor isn't mentioned in the repo at all that I can find. How very odd!

    Edit: just realised AppVeyor is a CI tool, so that's likely the path for the file on the build server and I have the resulting build on my machine. I'll dig into the code and figure out what it's doing, or not, as the case may be.

    Just for those who hit a similar issue: with the branch above, I needed to manually downgrade OpenCV to the following version:
    opencv-contrib-python==4.5.5.64

    I'm now hitting an issue where it seems to get through everything but then complains there's too big a difference between the stereo cameras and RGB, I'm assuming because the stereo lenses are 160 degrees and the RGB is 90 degrees. I'm going to try the -drgb flag to see if it can calibrate the stereo cameras at least.

    • erik replied to this.

      Hi kneave ,
      We apologize for the inconvenience - could you copy the full error that you get?
      Thanks, Erik

      Hi Erik, will do. I'll go through again and grab it shortly. Are the images saved to disk too, by the way? It'd be handy if I could run calibration again without having to recapture the images each time; it'd speed up debugging no end!

      For others who come across this, when you run the calibration it saves the captured images to "<workingdir>/dataset" and you can rerun the process using the following but with the values for your board/setup:
      python calibrate.py -s 2.5 -ms 1.8 -brd FFC-3P -db -m process

      Today's Troubleshooting

      To provide all the context: in the left and right cameras I'm using the lens from a Waveshare Pi camera (G). These have a diagonal field of view of 160 degrees; using an online calculator with the specs of the AR0234 sensor and the details of the lens, I calculated an HFOV of 157.77.
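      As a cross-check on those online calculators, the DFOV-to-HFOV conversion can be done directly, assuming a rectilinear (pinhole) lens model; that model does not hold for strongly distorting fisheye lenses, which may be why recalculating later gives a different answer. The 5.76 × 3.60 mm figures below are the AR0234's 1920 × 1200 active array at a 3.0 µm pixel pitch:

```python
import math

def hfov_from_dfov(dfov_deg, sensor_w_mm, sensor_h_mm):
    """Horizontal FOV from diagonal FOV, assuming a rectilinear lens.

    Pinhole model: tan(hfov/2) = (w/d) * tan(dfov/2), where d is the
    sensor diagonal. Not valid for strongly distorting fisheye lenses.
    """
    d = math.hypot(sensor_w_mm, sensor_h_mm)
    half_d = math.radians(dfov_deg) / 2.0
    return math.degrees(2.0 * math.atan((sensor_w_mm / d) * math.tan(half_d)))

# AR0234 active array: 1920 x 1200 px at 3.0 um pitch -> 5.76 x 3.60 mm
print(round(hfov_from_dfov(160, 5.76, 3.60), 1))  # → 156.5 under this model
```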

      This is my setup for prototyping, each camera is spaced 4cm apart:

      One thing that is worth noting: the "Arducam" lettering is upside down on all three cameras, but the preview is shown the right way up on screen, so I assume the sensor is upside down with respect to the text? It's either an odd design choice or I've messed up my config...

      Speaking of which, this is the config I put together using the example in a reply above and looking at the other examples in the repo:

      {
          "board_config":
          {
              "name": "FFC-3P",
              "revision": "R1M0E1",
              "cameras":{
                  "CAM_A": {
                      "name": "rgb",
                      "hfov": 90,
                      "type": "color"
                  },
                  "CAM_B": {
                      "name": "left",
                      "hfov": 157.77,
                      "type": "color",
                      "extrinsics": {
                          "to_cam": "CAM_C",
                          "specTranslation": {
                              "x": -8,
                              "y": 0,
                              "z": 0
                          },
                          "rotation":{
                              "r": 0,
                              "p": 0,
                              "y": 0
                          }
                      }
                  },
                  "CAM_C": {
                      "name": "right",
                      "hfov": 157.77,
                      "type": "color",
                      "extrinsics": {
                          "to_cam": "CAM_A",
                          "specTranslation": {
                              "x": 4,
                              "y": 0,
                              "z": 0
                          },
                          "rotation":{
                              "r": 0,
                              "p": 0,
                              "y": 0
                          }
                      }
                  }
              },
              "stereo_config":{
                  "left_cam": "CAM_B",
                  "right_cam": "CAM_C"
              }
          }
      }

      To capture the images I'm using the ar0234_x3 branch of DepthAI and the following command:
      python calibrate.py -s 2.5 -ms 1.8 -brd FFC-3P -db

      It runs through the calibration and saves all the images it needs, shows one set of images (the originals, I think?), then a second (rectified, maybe?), and then on the third set a red window with errors relating to "high epipolar error" between the various cameras. Console output as follows:

      Displaying Stereo Pair for visual inspection. Press the [ESC] key to exit.
      1200
      Reprojection error threshold -> 3
      rgb Reprojection Error: 0.587818
      1200
      Reprojection error threshold -> 1.6666666666666667
      left Reprojection Error: 12.285764
      1200
      Reprojection error threshold -> 1.6666666666666667
      right Reprojection Error: 0.390285
      ['high Reprojection Error', 'high epipolar error between left and right', 'high epipolar error between right and rgb']
      py: DONE.

      DONE appears if I press any key in the red window, it closes and the script exits.

      As I understand it, I've done everything right, but I also know that my understanding is very much lacking! I've uploaded the captured images, along with the conf file, to the following location:
      WeTransfer

      They will be auto deleted after one week as per WeTransfer's policies.

      If anyone from Luxonis could assist I would greatly appreciate it please. 🙂

      Hmm, well, epipolar error just means the depth performance is not that good with the capture you made; it might be because of the wide lenses in general (it's harder to map the right points). I'd try to make more captures (increase the -c / --count parameter) and use a bigger charuco board, maybe A2 or even A1; you can generate PDFs for bigger charuco boards here: https://calib.io/pages/camera-calibration-pattern-generator. For example, we use an A1 board with 16 rows and 22 columns and a square size of about 4 cm in factory calibration, which works for most cameras (wide lenses as well).

      Thanks for the tips, I'll try this with my TV in that case. If it says "done" does that mean the calibration data has been applied and saved to the board or is there another command I need to run to do that?

      I've found the line that prevents calibration data being written and nullified it by adding error_text = "" before the check, and the calibration data is now written to the device. It won't let me add a screenshot for some reason, but it needs to be added in calibrate.py around line 936, just before the if statement that checks its length, basically.

      I created a new calibration target suited for my screen, went through and captured the images again, and it still threw the epipolar errors for both rgb-to-right and left-to-right. The calibration data was written to the board at least.

      I've tried running code that previously worked to preview the cameras and it now throws an error; the IMU script also still can't recognise that the board has an IMU.

      I know that with these modules there's more work needed to get them working, but I honestly didn't expect this level of pain and lack of documentation. I've had this for a few weeks now and I'm still not able to get even the basic functions working. There's no mention of the various branches I've been told to use anywhere in the docs that I can see, and I've no idea what each of these actually relates to or what impact they'll have on using the release versions of DepthAI in the future.

      I need actual help, can I get contact details for someone so I can get in touch directly please?

      I switched to the original lenses to test with and calibration worked first time; however, once calibrated, when I try to run the demo I get the error below.

      The lenses that came with the cameras have a known HFOV of 90, so I'm wondering if I messed up the calculation for the 160-degree lenses I have. That could be the issue, or is a 160-degree pair of stereo cameras just too far from the FOV of a 90-degree RGB camera? I'm really keen on having wide-angle stereo for SLAM along with a narrower camera in the middle for more detail over a longer distance.

      Error mentioned above on loading the depthai_demo.py file; this is from the ar0234_x3 branch, however, and I'm not sure if that branch was purely for calibration or not:
      RuntimeError: Fatal error. Please report to developers. Log: 'Fatal error on MSS CPU: trap: 07, address: 80089554' '0'

      I'll try running calibration with all 160deg lenses on there but I'll also see if I can find some 120deg lenses to try alongside the 90deg centre.

      I still can't access IMU data though, and as I understand it there should be one on board. Is there some known-good code for the 3P board, please? The model number on the underside is DM1090, R3MOE3.

      I've had another go at calculating the HFOV for my 160-degree lenses and it's come out at 122.91 degrees, which seems a bit more sensible. Still no luck with calibration.

      Something else I've tried is running calibration with -drgb and removing the RGB camera from the board definition, to see if ignoring it would at least calibrate the stereo cameras. Sadly it ran through the calibration script and simply exited: no error, no calibration file uploaded, it just exited.

      I'll try tomorrow with all the cameras having 160deg lenses in them.

      Hi @kneave, sorry for the trouble.

      Thanks for the tips, I'll try this with my TV in that case. If it says "done" does that mean the calibration data has been applied and saved to the board or is there another command I need to run to do that?

      Yes, unless you only executed calibrate.py with -m 'capture', it should write the data to the EEPROM; you can see what was written with the calibration_dump.py example.
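      If it's quicker than running the full example, reading the EEPROM back can be sketched in a few lines (requires a connected device; `readCalibration` and `eepromToJson` are part of the depthai Python API):

```python
import json
import depthai as dai

# Connect to the first available OAK device and dump the calibration
# data currently stored on its EEPROM as JSON.
with dai.Device() as device:
    calib = device.readCalibration()  # CalibrationHandler
    print(json.dumps(calib.eepromToJson(), indent=2))
```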

      I've found the line that prevents calibration data being written and nullified it by adding error_text = "" before the check, and the calibration data is now written to the device. It won't let me add a screenshot for some reason, but it needs to be added in calibrate.py around line 936, just before the if statement that checks its length, basically.

      If error_text is not empty here, you should see something red on the calibration GUI (like an epipolar error or similar). Did you notice any errors when the writing failed?

      I still can't access IMU data though, and as I understand it there should be one on board. Is there some known-good code for the 3P board, please? The model number on the underside is DM1090, R3MOE3.

      For IMU, you can find some examples in the depthai-python repo, and some info here. If imu_gyroscope_accelerometer.py and imu_rotation_vector.py don't work, try to update the firmware with imu_firmware_update.py. Be very careful NOT to disconnect the device while the IMU update is in progress; after the IMU update is at 100% and you start to see the gyroscope values on screen, you can stop the script.
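      As a minimal starting point, an IMU readout pipeline looks roughly like this (a sketch along the lines of the depthai-python IMU examples; it assumes your board revision actually has the IMU fitted):

```python
import depthai as dai

# Minimal IMU readout: stream raw accelerometer + gyroscope at 100 Hz.
pipeline = dai.Pipeline()
imu = pipeline.create(dai.node.IMU)
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("imu")

imu.enableIMUSensor([dai.IMUSensor.ACCELEROMETER_RAW,
                     dai.IMUSensor.GYROSCOPE_RAW], 100)
imu.setBatchReportThreshold(1)
imu.setMaxBatchReports(10)
imu.out.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue("imu", maxSize=50, blocking=False)
    while True:
        for packet in q.get().packets:
            a = packet.acceleroMeter
            g = packet.gyroscope
            print(f"accel [m/s^2]: {a.x:.3f} {a.y:.3f} {a.z:.3f}  "
                  f"gyro [rad/s]: {g.x:.3f} {g.y:.3f} {g.z:.3f}")
```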

      Sadly it ran through the calibration script and simply exited, no error or calibration file uploaded, just exited.

      Did you see any errors in the terminal where you ran the calibration script?

      Error mentioned above on loading the depthai_demo.py file; this is from the ar0234_x3 branch, however, and I'm not sure if that branch was purely for calibration or not:

      RuntimeError: Fatal error. Please report to developers. Log: 'Fatal error on MSS CPU: trap: 07, address: 80089554' '0'

      I'm not sure what this error means, cc @Luxonis-Martin @Luxonis-Alex .

      @kneave can you try to run spatial_calculator_multi_roi.py or spatial_location_calculator.py to test the depth?