In an environment like the picture below, I want to convert the target's camera coordinates Pc, measured with the OAK-D camera, into world coordinates Pw and transmit them to the robot.

To convert to world coordinates, I need to know the rotation R and translation t. The allowable error in world coordinates is about 5 to 10 cm.

Therefore, the accuracy of R and t is important.
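
For reference, the conversion itself is just the rigid-body transform below (a minimal numpy sketch; the values of R, t and Pc are placeholders):

```python
import numpy as np

# Placeholder values: in practice R comes from the IMU / extrinsic calibration
# and t is the camera position expressed in world coordinates.
R = np.eye(3)                      # world <- camera rotation (3x3)
t = np.array([1.0, 2.0, 1.5])      # camera origin in world coordinates [m]

Pc = np.array([0.3, -0.1, 2.0])    # target measured in camera coordinates [m]
Pw = R @ Pc + t                    # target in world coordinates
print(Pw)
```

The error budget is tight: a rotation error of 1 degree in R already produces roughly 1.7 cm of position error per metre of range, so at a few metres of range that quickly eats into the 5 to 10 cm tolerance.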

With that in mind, I would like answers to the following questions.

1. What is the reference frame (rotation axes) of the rotation vector (quaternion) that can be obtained from the IMU?

2. Is it possible to change that reference frame so that it matches my world coordinate frame as closely as possible?

3. What is the accuracy or tolerance of the quaternion values for a camera held in a fixed position?

4. Is calibration of the IMU required? If so, what is the correction procedure?

5. To measure t accurately, I need to measure the camera's position relative to the origin I specify. Which point on the camera (or which reference mark on the housing) should be taken as the camera's position?

I own the following OAK-D cameras:

1. OAK-D Pro PoE

2. OAK-D Pro

3. OAK-D SR

Please let me know if there is any other information necessary for the success of the project.

  • OseongKwon What is the reference frame (rotation axes) of the rotation vector (quaternion) that can be obtained from the IMU?

    From the datasheet: The rotation vector provides an orientation output that is expressed as a quaternion referenced to magnetic north
    and gravity.
    [source]
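
    For completeness, reading that output looks roughly like the standard DepthAI IMU rotation-vector example below (a sketch; double-check the attribute names against your depthai version, and note that ROTATION_VECTOR requires the BNO086 IMU variant as far as I know, while BMI270-equipped units only provide raw accelerometer/gyroscope data).

```python
import depthai as dai

# Minimal pipeline that streams the IMU rotation vector (quaternion) to the host.
pipeline = dai.Pipeline()
imu = pipeline.create(dai.node.IMU)
imu.enableIMUSensor(dai.IMUSensor.ROTATION_VECTOR, 100)   # 100 Hz
imu.setBatchReportThreshold(1)
imu.setMaxBatchReports(10)

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("imu")
imu.out.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue(name="imu", maxSize=50, blocking=False)
    while True:
        for packet in q.get().packets:
            rv = packet.rotationVector                    # quaternion i, j, k, real
            print(rv.i, rv.j, rv.k, rv.real, rv.rotationVectorAccuracy)
```

    The rotationVectorAccuracy field is the sensor's own heading-accuracy estimate in radians, which is also relevant to question 3.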

    OseongKwon Is it possible to change that reference frame so that it matches my world coordinate frame as closely as possible?

    Only through post transformations.
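
    In practice that can be as simple as capturing the quaternion once while the camera is held in a known reference orientation and using it as a fixed correction. A rough sketch with scipy (not Luxonis code; the (x, y, z, w) ordering is an assumption you should verify against the i/j/k/real fields you actually receive):

```python
import numpy as np
from scipy.spatial.transform import Rotation as Rot

def imu_to_world(q_now_xyzw, q_ref_xyzw):
    """Re-express an IMU rotation-vector reading relative to a chosen world frame.

    q_ref_xyzw is the quaternion the IMU reported while the camera was held in
    the pose you define as the world reference orientation; the result is the
    orientation relative to that reference (a fixed post-transformation).
    """
    r_now = Rot.from_quat(q_now_xyzw)   # scipy expects (x, y, z, w)
    r_ref = Rot.from_quat(q_ref_xyzw)
    return r_ref.inv() * r_now

# Placeholder readings:
q_ref = [0.0, 0.0, 0.0, 1.0]            # captured at the reference pose
q_now = [0.0, 0.0, 0.7071, 0.7071]      # some later reading
R_world_cam = imu_to_world(q_now, q_ref).as_matrix()
print(R_world_cam)
```

    Note this treats the IMU frame as the camera frame; for sub-degree accuracy you also have to account for the fixed IMU-to-camera mounting rotation.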

    OseongKwon What is the accuracy or tolerance of the quaternion values for a camera held in a fixed position?

    Best to check the datasheet. It has the info.

    OseongKwon To measure t accurately, I need to measure the camera's position relative to the origin I specify. Which point on the camera (or which reference mark on the housing) should be taken as the camera's position?

    Why not just calibrate it with a chessboard? It should solve all your problems. luxonis/depthai-experiments/tree/master/gen2-multiple-devices/multi-cam-calibration <-- has a calibration script.
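
    The underlying idea, in case it helps (a hedged sketch, not the actual script from that repo): detect the board with OpenCV, run solvePnP against its known geometry, and invert the result to get the camera pose in the board/world frame. K and dist below are placeholder intrinsics; the real ones can be read from the device calibration (device.readCalibration()).

```python
import cv2
import numpy as np

# Placeholder intrinsics -- in practice read them from the device calibration
# or from your own intrinsic calibration.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Chessboard geometry: inner-corner count and square size in metres.
pattern = (9, 6)
square = 0.10
obj_pts = np.zeros((pattern[0] * pattern[1], 3), np.float32)
obj_pts[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

gray = cv2.cvtColor(cv2.imread("board.png"), cv2.COLOR_BGR2GRAY)
found, corners = cv2.findChessboardCorners(gray, pattern)
if found:
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners, K, dist)
    R_cam_board, _ = cv2.Rodrigues(rvec)       # board -> camera rotation
    R_board_cam = R_cam_board.T                # camera -> board (world) rotation
    t_board_cam = -R_board_cam @ tvec          # camera origin in board/world frame
    print("camera position in the board frame [m]:", t_board_cam.ravel())
```

    A side benefit: the "camera position" this produces is the optical centre of the pinhole model, so there is no need to pick a physical point on the housing to measure, which also answers question 5.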

    Thanks,
    Jaka

      jakaskerl
      The indoor space is 7 m x 7 m x 7 m.

      Would it be appropriate to estimate the camera's pose (R, t) with a chessboard over those distances? If so, how big would the chessboard need to be?

      If it's difficult, should I hire a surveyor?

    Hi @OseongKwon,
    Yep, a ChArUco board would probably be best for getting the translation/rotation to the camera. You'd want it large enough that the markers are clearly visible.
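
    For example, something along these lines (a rough sketch using the classic cv2.aruco API from opencv-contrib-python; OpenCV 4.7+ moved this functionality to CharucoDetector, and the intrinsics, dictionary and board dimensions below are placeholders):

```python
import cv2
import numpy as np

# Placeholder intrinsics and board geometry -- substitute your own values.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

aruco = cv2.aruco
dictionary = aruco.Dictionary_get(aruco.DICT_5X5_100)
# 7x5 squares, 12 cm squares with 9 cm markers -- scale the board so the
# markers stay clearly visible at your working distance.
board = aruco.CharucoBoard_create(7, 5, 0.12, 0.09, dictionary)

gray = cv2.cvtColor(cv2.imread("charuco.png"), cv2.COLOR_BGR2GRAY)
corners, ids, _ = aruco.detectMarkers(gray, dictionary)
if ids is not None and len(ids) > 0:
    n, ch_corners, ch_ids = aruco.interpolateCornersCharuco(corners, ids, gray, board)
    if n is not None and n > 3:
        rvec = np.zeros((3, 1))
        tvec = np.zeros((3, 1))
        ok, rvec, tvec = aruco.estimatePoseCharucoBoard(
            ch_corners, ch_ids, board, K, dist, rvec, tvec)
        if ok:
            R_cb, _ = cv2.Rodrigues(rvec)          # board -> camera rotation
            cam_pos = -R_cb.T @ tvec               # camera origin in board frame
            print("camera position in the board frame [m]:", cam_pos.ravel())
```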