• DepthAI-v2Hardware
  • Transforming IMU quaternion rotation to degrees yields unexpected results.

I've attached an OAK-D Pro to a drone's gimbal and would like to take pictures annotated with the camera's rotation. However, I cannot make sense of the IMU's measurements.

The software doing the image processing requires the rotations to be given in degrees. The drone can yaw 360°, tilting goes from 0° (camera looks straight down) to 90° (camera is aimed horizontally), and banking is cancelled out by the gimbal, so it should always be around 0° (give or take 30°). I assume the easiest way to describe this is an extrinsic x-y-z rotation.
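For reference, here is a minimal sketch of how scipy expresses that convention (this is an assumption about the axis mapping, not the camera code): a lowercase axis string like 'xyz' means extrinsic (fixed) axes, while uppercase 'XYZ' would mean intrinsic (body) axes.

```python
from scipy.spatial.transform import Rotation

# 90 degree tilt about x: from "camera looks straight down" to
# "camera aimed horizontally" in the convention described above
# (assuming x is the tilt axis)
r = Rotation.from_euler('xyz', [90, 0, 0], degrees=True)
print(r.as_quat())  # scipy quaternions are in [x, y, z, w] order
```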

I'm using the following Python script for testing:

#!/usr/bin/env python3

import cv2
import depthai as dai
import time
import math
from scipy.spatial.transform import Rotation

def fix_coord_system(rot):
    # map the (-180, 180] range to [0, 360)
    return rot if rot >= 0 else 360 + rot

def degrees_from_quaternion2(i, j, k, w):
    # Build a rotation from the quaternion ([x, y, z, w] order, as scipy
    # expects) and convert it to extrinsic x-y-z Euler angles
    rot = Rotation.from_quat([i, j, k, w])
    rot_e = rot.as_euler('xyz', degrees=True)
    return fix_coord_system(rot_e[0]), fix_coord_system(rot_e[1]), fix_coord_system(rot_e[2])

def degrees_from_quaternion(x, y, z, w):
    """
    Convert a quaternion into Euler angles (roll, pitch, yaw).
    roll is rotation around x (counterclockwise)
    pitch is rotation around y (counterclockwise)
    yaw is rotation around z (counterclockwise)
    All angles are returned in degrees.
    """
    t0 = +2.0 * (w * x + y * z)
    t1 = +1.0 - 2.0 * (x * x + y * y)
    roll_x = math.atan2(t0, t1)

    # clamp to [-1, 1] to guard against numerical noise before asin
    t2 = +2.0 * (w * y - z * x)
    t2 = +1.0 if t2 > +1.0 else t2
    t2 = -1.0 if t2 < -1.0 else t2
    pitch_y = math.asin(t2)

    t3 = +2.0 * (w * z + x * y)
    t4 = +1.0 - 2.0 * (y * y + z * z)
    yaw_z = math.atan2(t3, t4)

    return math.degrees(roll_x), math.degrees(pitch_y), math.degrees(yaw_z)

if __name__ == "__main__":
    device = dai.Device()
    
    imuType = device.getConnectedIMU()
    imuFirmwareVersion = device.getIMUFirmwareVersion()
    print(f"IMU type: {imuType}, firmware version: {imuFirmwareVersion}")
    
    if imuType != "BNO086":
        print("Rotation vector output is supported only by BNO086!")
        exit(1)
    
    # Create pipeline
    pipeline = dai.Pipeline()
    
    # Define sources and outputs
    imu = pipeline.create(dai.node.IMU)
    xlinkOut = pipeline.create(dai.node.XLinkOut)
    
    xlinkOut.setStreamName("imu")
    
    # enable ROTATION_VECTOR at 100 Hz rate
    imu.enableIMUSensor(dai.IMUSensor.ROTATION_VECTOR, 100)
    # it's recommended to set both setBatchReportThreshold and setMaxBatchReports to 20 when integrating in a pipeline with a lot of input/output connections
    # above this threshold packets will be sent in batch of X, if the host is not blocked and USB bandwidth is available
    imu.setBatchReportThreshold(1)
    # maximum number of IMU packets in a batch, if it's reached device will block sending until host can receive it
    # if lower or equal to batchReportThreshold then the sending is always blocking on device
    # useful to reduce device's CPU load  and number of lost packets, if CPU load is high on device side due to multiple nodes
    imu.setMaxBatchReports(10)
    
    # Link plugins IMU -> XLINK
    imu.out.link(xlinkOut.input)
    
    # Pipeline is defined, now we can connect to the device
    with device:
        device.startPipeline(pipeline)
    
        def timeDeltaToMilliS(delta) -> float:
            return delta.total_seconds()*1000
    
        # Output queue for imu bulk packets
        imuQueue = device.getOutputQueue(name="imu", maxSize=50, blocking=False)
        baseTs = None
        while True:
            imuData = imuQueue.get()  # blocking call, will wait until a new data has arrived
    
            imuPackets = imuData.packets
            for imuPacket in imuPackets:
                rVvalues = imuPacket.rotationVector
    
                rvTs = rVvalues.getTimestampDevice()
                if baseTs is None:
                    baseTs = rvTs
                rvTs = rvTs - baseTs
    
                imuF = "{:.06f}"
                tsF  = "{:.03f}"
    
                print(f"Rotation vector timestamp: {tsF.format(timeDeltaToMilliS(rvTs))} ms")
                print(f"Quaternion: i: {imuF.format(rVvalues.i)} j: {imuF.format(rVvalues.j)} "
                    f"k: {imuF.format(rVvalues.k)} real: {imuF.format(rVvalues.real)}")
                #print(f"Accuracy (rad): {imuF.format(rVvalues.rotationVectorAccuracy)}")
                print (degrees_from_quaternion(rVvalues.i, rVvalues.j, rVvalues.k, rVvalues.real))
    
            # NOTE: cv2.waitKey() only registers key presses while an OpenCV
            # window has focus; without one, stop the loop with Ctrl+C instead
            if cv2.waitKey(1) == ord('q'):
                break
            time.sleep(.2)

The functions degrees_from_quaternion and degrees_from_quaternion2 are two different attempts to get a proper output.
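As far as I can tell, both variants implement the same convention (extrinsic x-y-z, equivalent to intrinsic Z-Y-X yaw-pitch-roll); they only differ in output range. A quick round-trip sanity check, using a condensed copy of degrees_from_quaternion against scipy:

```python
import math
from scipy.spatial.transform import Rotation

def degrees_from_quaternion(x, y, z, w):
    # condensed copy of the function above
    roll_x = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    pitch_y = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    yaw_z = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return math.degrees(roll_x), math.degrees(pitch_y), math.degrees(yaw_z)

# Pick known Euler angles, build the quaternion with scipy, convert back
# with the manual formula; both should agree away from gimbal lock
q = Rotation.from_euler('xyz', [30, 20, 10], degrees=True).as_quat()
print(degrees_from_quaternion(*q))  # approximately (30, 20, 10)
```

If this round trip holds, the conversion math itself is consistent, which would point the finger at the sensor data (or the axis conventions) rather than the formulas.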

As long as the camera is lying flat on its back, looking up, the measurements for roll and tilt are stable. Rotating the camera on its back (modifying the yaw) leads to a change of the yaw angle, as expected. However, the yaw measurements are somewhat erratic and don't seem to correctly indicate north.

Once I start rolling or tilting the camera, all angles start to change. I'd like to tell you more about the exact measurements, but from what I can tell, the changes are almost random. It's probably best to try this example and see for yourself. Maybe my area is simply riddled with magnetic anomalies which confuse the magnetometer.

To sum up what I'm trying to achieve: When the camera is level, looking at the horizon and facing magnetic north, let's define this rotation as the origin (0, 0, 0). Now, I'd like to be able to get the camera's 3-dimensional rotation relative to that origin in degrees (or Euler angles, doesn't matter) in the form of an extrinsic x-y-z rotation. Am I on the right track with my prior code, or am I missing something?
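One way to express "relative to that origin" is to capture a reference quaternion once and compose it with each new sample, rather than subtracting Euler angles (which is not valid in general). A hedged sketch, with a placeholder identity quaternion standing in for the captured reference:

```python
from scipy.spatial.transform import Rotation

# Reference orientation captured while level and facing magnetic north
# (placeholder value here; in practice, store one rotationVector sample)
rot_ref = Rotation.from_quat([0.0, 0.0, 0.0, 1.0])

def relative_degrees(i, j, k, w):
    rot_now = Rotation.from_quat([i, j, k, w])
    rel = rot_ref.inv() * rot_now  # rotation taking the reference to the current pose
    return rel.as_euler('xyz', degrees=True)

# 90 degree rotation about z relative to the reference
print(relative_degrees(0.0, 0.0, 0.7071068, 0.7071068))
```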

    Looking at this again after a good night's sleep, I can say that different rotation vector outputs (GEOMAGNETIC_ROTATION_VECTOR, ARVR_STABILIZED_ROTATION_VECTOR) show similar problems. This leads to the conclusion that, assuming the IMU is OK, there is probably a mathematical problem in converting the quaternions to angles.

    For now, I take the roll and tilt measurements from the IMU, as they are relatively stable while the gimbal holds its position, and the yaw measurement from the drone (it's a 2-axis gimbal, so drone yaw = gimbal yaw).
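    A minimal sketch of that workaround, with placeholder names (the drone-yaw source is whatever your flight controller telemetry provides): roll and tilt come from the IMU quaternion, yaw is substituted from the drone.

```python
from scipy.spatial.transform import Rotation

def combined_rotation(imu_quat, drone_yaw_deg):
    # imu_quat is [i, j, k, w]; keep roll/pitch from it, discard its yaw
    roll, pitch, _ = Rotation.from_quat(imu_quat).as_euler('xyz', degrees=True)
    return float(roll), float(pitch), float(drone_yaw_deg)

print(combined_rotation([0.0, 0.0, 0.0, 1.0], 45.0))  # (0.0, 0.0, 45.0)
```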