jakaskerl
Hi @jakaskerl
Thanks for your answer. It may happen to one or two (sometimes even three) IMUs at the same time. Looking for the most similar measurements won't help in all cases. Do you have an idea what might cause this? And are there any settings we can try to avoid this?

    WouterOddBot
    Thanks. How are you using the IMU? This particular one can return a rotation vector directly - are you using the rotation vector, or the raw measurements for gyro, accelerometer and magnetometer? If the yaw is wrong, then perhaps you are not using the magnetometer (or there is something in the vicinity that is skewing your measurements).

    Thanks,
    Jaka

      jakaskerl

      Hi Jaka,

      This is how we use the IMU:

      import depthai as dai

      pipeline = dai.Pipeline()
      imu = pipeline.createIMU()
      imu.enableIMUSensor(dai.IMUSensor.ROTATION_VECTOR, 50)
      imu.enableIMUSensor(dai.IMUSensor.LINEAR_ACCELERATION, 50)
      imu.setBatchReportThreshold(1)
      imu.setMaxBatchReports(10)

      We also tried without the magnetometer:

      imu = pipeline.createIMU()
      imu.enableIMUSensor(dai.IMUSensor.GAME_ROTATION_VECTOR, 50)
      imu.enableIMUSensor(dai.IMUSensor.LINEAR_ACCELERATION, 50)
      imu.setBatchReportThreshold(1)
      imu.setMaxBatchReports(10)

      Hope this gives more info?

        WouterOddBot
        I have forwarded this to the FW team to check. It seems to me like it's either a FW issue or an environmental cause.

        Thanks,
        Jaka

          jakaskerl

          Hi Jaka,

          My colleague did some experiments with the IMU as well, and that gave some interesting results. If we run your script from https://docs.luxonis.com/software/depthai/examples/imu_rotation_vector/ with a static camera, the rotation vector accuracy stays at 3.141602 indefinitely. I can't find in the documentation whether an accuracy of 3.14 is a good or a bad sign.

          Rotation vector timestamp: 318426.625 ms
          Quaternion: i: -0.018188 j: 0.009338 k: -0.570679 real: 0.820923
          Accuracy (rad): 3.141602

          If we move or shake the camera, the accuracy decreases to 0.80. If we then hold the camera still, the accuracy first drops further, down to 0.06, but eventually it slowly increases again.

          Finally, if we stop the script and restart it, the IMU starts out again with the 0.06 accuracy. But if we stop the script, unplug the camera for a while, and start the script again, then the accuracy starts out somewhere between 1.3 and 3.14. I think the longer we leave the camera unplugged, the higher the starting accuracy value.

          We see this happening with IMUSensor.ROTATION_VECTOR and IMUSensor.GEOMAGNETIC_ROTATION_VECTOR.
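
          For reference, the relevant part of that example looks roughly like this (reconstructed, so details may differ slightly from the version on the docs page); the "accuracy" we quote above is the rotationVectorAccuracy field of each rotation vector report:

          import depthai as dai

          pipeline = dai.Pipeline()
          imu = pipeline.create(dai.node.IMU)
          xout = pipeline.create(dai.node.XLinkOut)
          xout.setStreamName("imu")

          # Rotation vector at 50 Hz, same settings as in our snippet above
          imu.enableIMUSensor(dai.IMUSensor.ROTATION_VECTOR, 50)
          imu.setBatchReportThreshold(1)
          imu.setMaxBatchReports(10)
          imu.out.link(xout.input)

          with dai.Device(pipeline) as device:
              imuQueue = device.getOutputQueue(name="imu", maxSize=50, blocking=False)
              while True:
                  imuData = imuQueue.get()
                  for packet in imuData.packets:
                      rv = packet.rotationVector
                      tsMs = rv.getTimestampDevice().total_seconds() * 1000
                      print(f"Rotation vector timestamp: {tsMs:.3f} ms")
                      print(f"Quaternion: i: {rv.i:.6f} j: {rv.j:.6f} k: {rv.k:.6f} real: {rv.real:.6f}")
                      print(f"Accuracy (rad): {rv.rotationVectorAccuracy:.6f}")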

            jakaskerl
            Ok cool. We look forward to a solution. One more detail: it looks like the IMU needs a rotation around all 3 axes before the accuracy decreases.

              WouterOddBot

              WouterOddBot: "One more detail: it looks like the IMU needs a rotation around all 3 axes before the accuracy decreases."

              Actually, I think it's the other way around. The beginning accuracy (actually error) is PI, which makes sense because the maximum possible heading error is 180 deg. When you start moving it around, the onboard FW calibrates the IMU, so the error decreases. When left still, it slowly starts increasing again - likely accounting for the expected drift.

              For the other issue, we'll likely need to contact the manufacturer since we don't have access to the IMU's FW.

              Thanks,
              Jaka

                jakaskerl

                Yes, I think we mean the same thing. It starts with the maximum error of PI. When you move it around, the error quickly decreases.

                But we noticed that the error only decreases if we rotate the camera around all 3 axes. Our robots move slowly and rotate about the yaw axis only; the pitch and roll are pretty stable. So it can take a very long time before the error decreases, until the robot perhaps hits a bump or something else shakes up the IMU. Then the error suddenly drops from PI to around 0.04.

                We suspect (but are not sure) that this sudden error drop affects the computation of the output rotation vector, so that the rotation vector suddenly changes too.

                As the robot keeps moving and still rotates about the yaw axis only, the error slowly builds up again. Until the robot hits a bump again… Etc.
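
                A simple way to check this suspicion would be to log the yaw together with the reported accuracy, for example like below (the yaw extraction is the standard quaternion-to-Euler formula, and the DepthAI calls follow the rotation vector example, so treat this as a sketch rather than exactly what runs on our robots):

                import math
                import depthai as dai

                pipeline = dai.Pipeline()
                imu = pipeline.create(dai.node.IMU)
                xout = pipeline.create(dai.node.XLinkOut)
                xout.setStreamName("imu")
                imu.enableIMUSensor(dai.IMUSensor.ROTATION_VECTOR, 50)
                imu.setBatchReportThreshold(1)
                imu.setMaxBatchReports(10)
                imu.out.link(xout.input)

                def yaw_deg(rv):
                    # Standard quaternion -> yaw (Z axis) extraction; i/j/k/real map to x/y/z/w
                    return math.degrees(math.atan2(2.0 * (rv.real * rv.k + rv.i * rv.j),
                                                   1.0 - 2.0 * (rv.j * rv.j + rv.k * rv.k)))

                with dai.Device(pipeline) as device:
                    q = device.getOutputQueue(name="imu", maxSize=50, blocking=False)
                    while True:
                        for packet in q.get().packets:
                            rv = packet.rotationVector
                            ts = rv.getTimestampDevice().total_seconds()
                            # If the suspicion is right, the yaw should jump at the same
                            # moment the accuracy value suddenly drops.
                            print(f"{ts:10.3f}s  yaw: {yaw_deg(rv):8.2f} deg  accuracy: {rv.rotationVectorAccuracy:.3f} rad")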

                  WouterOddBot
                  I guess it would be worth checking out which part of the rotation vector causes the issues. I would assume the gyro is to blame due to slow rotations + drift. Since we can't really access the firmware of the IMU itself, perhaps there is something we can do to work around the issue.
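
                  One way to narrow it down would be to report the calibrated gyro and magnetometer directly and watch the gyro bias and the magnetic field magnitude. A rough sketch (field names and units are from memory, so double-check them against the IMU node docs):

                  import math
                  import depthai as dai

                  pipeline = dai.Pipeline()
                  imu = pipeline.create(dai.node.IMU)
                  xout = pipeline.create(dai.node.XLinkOut)
                  xout.setStreamName("imu")

                  imu.enableIMUSensor(dai.IMUSensor.GYROSCOPE_CALIBRATED, 100)
                  imu.enableIMUSensor(dai.IMUSensor.MAGNETOMETER_CALIBRATED, 100)
                  imu.setBatchReportThreshold(1)
                  imu.setMaxBatchReports(10)
                  imu.out.link(xout.input)

                  with dai.Device(pipeline) as device:
                      q = device.getOutputQueue(name="imu", maxSize=50, blocking=False)
                      while True:
                          for packet in q.get().packets:
                              gyro = packet.gyroscope      # rad/s; with the robot static this should stay close to zero
                              mag = packet.magneticField   # magnitude should stay roughly constant if there is no disturbance
                              magNorm = math.sqrt(mag.x ** 2 + mag.y ** 2 + mag.z ** 2)
                              print(f"gyro z: {gyro.z:+.4f} rad/s   |B|: {magNorm:6.1f} uT")

                  If the gyro Z reading sits at a visible offset while the robot is static, ZRO mis-calibration (see the gyro note below) is the likely culprit; if the field magnitude jumps around, it points at a magnetic disturbance instead.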

                  I'll just paste some possible ideas from the IMU datasheet below.

                  ACC - If the device moves in a planar fashion:
                  In addition, as the BNO08X has applications in devices such as Robot Vacuum Cleaners it is necessary to calibrate the accelerometer in a planar fashion (i.e. around the axis that the vacuum cleaner revolves). Selection of 3D or planar accelerometer calibration is via command.

                  GYRO - We can possibly disable ZRO calibration:
                  Note that though the BNO08X can continuously attempt to calibrate the gyroscope for ZRO, it is possible to fool the calibration algorithms through slow horizontal rotations (around the gravity vector). Placing the device on a stable surface will force the ZRO calibration to converge rapidly. The gyroscope calibration algorithm attempts to remove ZRO while the device is not on a very stable surface. For this to be successful the device to which the BNO08X is mounted must have sufficient tremor such as the human hand. If there is insufficient tremor it is advisable to disable the gyroscope calibration to prevent drift (i.e. by mis-calibrating ZRO).

                  I doubt there is a significant magnetometer disturbance present, but it might be worth sanity checking.

                  Source:
                  https://www.ceva-dsp.com/wp-content/uploads/2019/10/BNO080_085-Datasheet.pdf pages 37-38.

                  Thanks,
                  Jaka

                    jakaskerl

                    Thanks for your answer. How does this work in practice? Do you have some documentation on how to do this with an OAK-D camera?