Hi jakaskerl,
It's a lot clearer to me now, but there is still one problem I don't understand. My idea is to detect an ArUco marker using OpenCV, but the detection gives me the marker position with an offset:
DAI: [ 5.04137533e-02 -5.04137533e-02 7.06905365e+01]
CV: [ 1.86120775e+00 -3.23760908e-02 7.06905365e+01]
Diff: [-1.810794 -0.01803766 0. ]
In this example I'm using OpenCV's depth to calculate the x & y in DAI's data, and oddly the closest result I can get to (0, 0) is still 36 pixels away along the x-axis.
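For reference, this is roughly how I'm computing x & y from the depth (a minimal pinhole back-projection sketch using the intrinsics from the calibration below; the function name and depth units are just illustrative):

```python
import numpy as np

# Intrinsics taken from the calibration dump below (4K sensor).
fx = fy = 2978.385009765625
cx, cy = 1843.81689453125, 1079.8861083984375

def back_project(u, v, z):
    """Pinhole back-projection: pixel (u, v) plus depth z -> camera-frame (X, Y, Z)."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Sanity check: a detection exactly at the principal point
# should come out at X = Y = 0.
print(back_project(cx, cy, 70.69))  # → [ 0.    0.   70.69]
```

With this model, a constant ~1.8-unit offset in X at fixed depth would correspond to a fixed pixel shift of roughly `offset * fx / z`, which is why the discrepancy looks like a principal-point or alignment issue to me.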
This is the calibration I'm using for this test:
[
0,
{
"cameraType": 0,
"distortionCoeff": [
-4.282538414001465,
15.699326515197754,
0.0014291888801380992,
0.0007043222431093454,
-26.29835319519043,
-4.37771463394165,
15.943224906921387,
-26.34567642211914,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0
],
"extrinsics": {
"rotationMatrix": [
[
0.0,
0.0,
0.0
],
[
0.0,
0.0,
0.0
],
[
0.0,
0.0,
0.0
]
],
"specTranslation": {
"x": -0.0,
"y": -0.0,
"z": -0.0
},
"toCameraSocket": -1,
"translation": {
"x": 0.0,
"y": 0.0,
"z": 0.0
}
},
"height": 2160,
"intrinsicMatrix": [
[
2978.385009765625,
0.0,
1843.81689453125
],
[
0.0,
2978.385009765625,
1079.8861083984375
],
[
0.0,
0.0,
1.0
]
],
"lensPosition": 0,
"specHfovDeg": 68.7938003540039,
"width": 3840
}
]
Could you provide some insight or ideas on why I'm experiencing this offset?