I am struggling with calibration and had a few more questions.
- Some of the polygons for calibration appear impossible to reach; the script does not redraw the polygons based on data from the boardconfig file, which provides the camera locations. It seems like both the polygon shape and size would need to change with the baseline, and certainly with camera order (color, left, right ought to be different from left, color, right). Does location really matter, or are you just trying to get enough images in sufficiently different orientations to let the algorithms sort it out? If it does matter, do you have figures of the recommended poses/locations? Does capturing even more images lower the epipolar error? (A rough sketch of the overlap geometry I mean is below, after this list.)
- I am running the latest released code and get quite a few -mst errors (spreads of up to 0.5 s). I am connected to the OAK-D 4 FFC PoE via a TRENDnet 2.5G PoE switch. Nothing else is on the network, everything links at 1 Gb/s, and there are no firewalls. The switch adds around 2 µs of latency during traffic tests. What would cause this? If this parameter only ensures that samples are closely grouped in time to eliminate motion errors, it would be helpful to simply toss the bad frames and report why, like the checkerboard-not-seen error, rather than halting the calibration and forcing a restart (see the second sketch below for the behaviour I am suggesting). I am using an OV9282 pair with an AR0234, with no external sync yet.
- There are a number of XLink communication errors with the RGB camera. Power to the board looks clean. The software also displays a warning that using the AR0234 with the OV9282 is not fully supported; should this be expected? (The third sketch below is the kind of link check I can run to rule out network drops.)
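For the first point, this is roughly what I mean by the polygons needing to depend on the board config: the usable target region is bounded by the overlapping fields of view of the camera pair, and that overlap shrinks as the baseline grows. The HFOV, baselines, and distances below are assumptions for illustration only, not values read from my boardconfig file.

```python
import math

def shared_fov_width(baseline_m: float, hfov_deg: float, distance_m: float) -> float:
    """Width of the region visible to BOTH cameras of a stereo pair at a
    given distance, assuming parallel optical axes and identical HFOV.
    Each camera sees a strip 2*Z*tan(HFOV/2) wide; the shared strip is
    that width minus the baseline."""
    full_width = 2.0 * distance_m * math.tan(math.radians(hfov_deg) / 2.0)
    return max(0.0, full_width - baseline_m)

# Assumed values, purely for illustration (not from my boardconfig file):
hfov_deg = 72.0                           # placeholder HFOV for the pair
for baseline in (0.075, 0.15, 0.30):      # 7.5 cm vs 15 cm vs 30 cm baselines
    for distance in (0.5, 1.0, 2.0):      # board held at these distances (m)
        w = shared_fov_width(baseline, hfov_deg, distance)
        print(f"baseline={baseline:.3f} m, Z={distance:.1f} m -> "
              f"shared width ~ {w:.2f} m")
```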
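For the -mst point, this is the behaviour I am suggesting instead of halting: measure the timestamp spread within each captured frame group and simply skip and report that group, the same way a frame without a visible checkerboard is reported. The function and threshold below are my own sketch, not the calibration script's actual code.

```python
from typing import Dict, Optional

MAX_SYNC_SPREAD_S = 0.03   # assumed threshold, analogous to the -mst argument

def check_frame_group(timestamps_s: Dict[str, float],
                      max_spread_s: float = MAX_SYNC_SPREAD_S) -> Optional[str]:
    """Return None if the frame group is usable, otherwise a human-readable
    reason so the group can be skipped and re-captured instead of aborting
    the whole calibration run."""
    spread = max(timestamps_s.values()) - min(timestamps_s.values())
    if spread > max_spread_s:
        latest = max(timestamps_s, key=timestamps_s.get)
        return (f"skipped: timestamp spread {spread*1000:.1f} ms exceeds "
                f"{max_spread_s*1000:.1f} ms (latest camera: {latest})")
    return None

# Example: left/right OV9282 arrive together, AR0234 is ~0.5 s late
group = {"left": 10.000, "right": 10.002, "color": 10.498}
reason = check_frame_group(group)
print(reason or "group accepted")
```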
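For the XLink errors, this is the kind of watchdog I can run alongside the stream to see whether the PoE link itself drops while the errors occur. The device IP is a placeholder and this only exercises the network path (Linux ping flags), not XLink itself.

```python
import subprocess
import time

DEVICE_IP = "192.168.1.222"   # placeholder; substitute the camera's actual IP

def ping_once(ip: str, timeout_s: float = 1.0) -> bool:
    """Single ICMP ping (Linux 'ping' flags); True if the host answered."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", str(int(timeout_s)), ip],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return result.returncode == 0

# Log reachability once per second while calibration/streaming runs,
# so XLink errors can be correlated with actual link drops.
while True:
    ok = ping_once(DEVICE_IP)
    print(f"{time.strftime('%H:%M:%S')}  link {'up' if ok else 'DOWN'}")
    time.sleep(1.0)
```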
Thank you.