Hi billyhorgan
The matrices you have sent look fine to me.
billyhorgan I am using the same images for testing as were used for calibration. However, when I check the distance between the same point in the camera images, the real-world distance appears completely incorrect (a difference of >1300, for example).
Looking at the cam_A-to-world and cam_B-to-world matrices:
It looks like the cameras are approximately 80 cm away from the board and 28 cm apart from each other, with the board centered in between. Is that not how your setup was IRL? The measurements look pretty normal to me.
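In case you want to verify this yourself, here is a minimal sketch of how those numbers fall out of the extrinsics (the matrix names are placeholders for your actual calibration output):

import numpy as np

# Placeholder 4x4 cam-to-world matrices; replace with your calibration output
cam_A_to_world = np.eye(4)
cam_B_to_world = np.eye(4)

# The last column of a cam-to-world matrix is the camera center in world coordinates
center_A = cam_A_to_world[:3, 3]
center_B = cam_B_to_world[:3, 3]

# Distance of each camera from the world origin (the board) and the baseline between them
print("cam_A to board:", np.linalg.norm(center_A))
print("cam_B to board:", np.linalg.norm(center_B))
print("baseline:", np.linalg.norm(center_A - center_B))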
Most likely, you have made an error when projecting the image points back to the board. Keep in mind that a single pixel only defines a ray, so you also need the point's depth (its Z in camera coordinates) to recover a 3D position. I haven't tested this, but in my mind it should work:
import numpy as np

# Given point in image coordinates (in pixels)
image_point = np.array([x_pixel, y_pixel, 1.0])  # replace x_pixel, y_pixel with your actual pixel coordinates

# Given intrinsic matrix K
# (you can also get it using device.readCalibration().getCameraIntrinsics())
K = np.array([
    [fx, 0, cx],
    [0, fy, cy],
    [0, 0, 1]
])  # replace fx, fy, cx, cy with your actual intrinsic parameters

# Invert the intrinsic matrix and back-project the pixel into the camera frame
K_inv = np.linalg.inv(K)
normalized_camera_point = K_inv @ image_point  # point on the viewing ray at depth Z = 1

# Scale the ray by the point's actual depth (its Z in camera coordinates,
# e.g. the known distance to the board); without this the result assumes Z = 1
camera_point = normalized_camera_point * depth  # replace depth with your actual depth

# Given CAM_to_WORLD transformation matrix (4x4, homogeneous)
CAM_to_WORLD = np.array([
    [a, b, c, d],
    [e, f, g, h],
    [i, j, k, l],
    [m, n, o, p]
])  # replace with your actual transformation matrix

# Transform to world coordinates
homogeneous_world_point = CAM_to_WORLD @ np.append(camera_point, 1.0)

# Convert from homogeneous back to Cartesian coordinates
world_point = homogeneous_world_point[:3] / homogeneous_world_point[3]
print("World point:", world_point)
Hope this helps,
Jaka