Hello,
We are currently writing and testing a program that calculates X and Y coordinates from the depth information obtained with the infrared dot projector (1200 mA), based on the RGB-Depth alignment C++ example, using an OAK-D Pro.
As shown in the figure below, an actual distance of 100 mm is reported as 130 mm.
How can I calculate the correct value?
Settings are:
Width = 1280
Height = 720
fx, fy, cx, cy = 807.091, 807.091, 652.369, 358.778
static std::atomic<bool> extended_disparity{ true };
static std::atomic<bool> subpixel{ true };
static std::atomic<bool> lr_check{ true };
The calculation is:
float fxb = fx * baseline;                      // focal length [px] * baseline [mm]
if (subpixel) {
    fxb *= 8;                                   // subpixel disparity has 3 fractional bits
    for (int j = 0; j < height; j++) {
        cv::Vec3b* rgb  = frame[rgbWindowName].ptr<cv::Vec3b>(j);
        ushort*    dpth = frame[depthWindowName].ptr<ushort>(j);
        for (int i = 0; i < width; i++) {
            if (dpth[i] == 0) continue;         // skip invalid pixels
            cv::Vec3f xyz;
            xyz[2] = fxb / (float)dpth[i];      // Z from disparity [mm]
            xyz[0] = ((float)i - cx) * xyz[2] / fx; // back-project X
            xyz[1] = ((float)j - cy) * xyz[2] / fy; // back-project Y
            vxyz.push_back(xyz);
            vrgb.push_back(rgb[i]);
        }
    }
}
---
Thanks in advance.