Hi,
I am using the OAK-D Lite like a dashcam and trying to measure the distance to vehicles and pedestrians in front of my vehicle. I am using the pedestrian-and-vehicle-detector-adas-0001 model. I'm getting around 10 fps, which I am OK with. The object detection works well and is fast enough; the tracking works somewhat well, but I think that can be improved by using filters, etc.

However, the thing I'm not happy with is the depth sensing. I've set the minimum distance to 100 cm and the maximum to 15 meters. The readings are wildly off, especially the Z coordinate: they oscillate anywhere from 1 meter to 12 meters.

Am I doing something wrong in the code? Is this expected because the OAK-D Lite uses passive stereo depth sensing and vehicles are usually somewhat featureless from some angles?

What are your suggestions? Do you think the OAK-D Lite or any other OAK-D is good enough for this application?

My setup:

OAK-D Lite
Raspberry Pi 4B 4GB RAM
USB 3 cable

import depthai as dai
import blobconverter
import cv2
import numpy as np
import time
import psutil
import datetime

label_map = ['unknown','vehicle', 'pedestrian']


class KalmanFilter(object):
    def __init__(self, acc_std, meas_std, z, time):
        self.dim_z = len(z)
        self.time = time
        self.acc_std = acc_std
        self.meas_std = meas_std

        # the observation matrix 
        self.H = np.eye(self.dim_z, 3 * self.dim_z) 

        self.x = np.vstack((z, np.zeros((2 * self.dim_z, 1))))
        self.P = np.zeros((3 * self.dim_z, 3 * self.dim_z))
        i, j = np.indices((3 * self.dim_z, 3 * self.dim_z))
        self.P[(i - j) % self.dim_z == 0] = 1e5 # initial vector is a guess -> high estimate uncertainty


    def predict(self, dt):
        # the state transition matrix -> assuming acceleration is constant
        F = np.eye(3 * self.dim_z)
        np.fill_diagonal(F[:2*self.dim_z, self.dim_z:], dt)
        np.fill_diagonal(F[:self.dim_z, 2*self.dim_z:], dt ** 2 / 2)

        # the process noise matrix
        A = np.zeros((3 * self.dim_z, 3 * self.dim_z))
        np.fill_diagonal(A[2*self.dim_z:, 2*self.dim_z:], 1)
        Q = self.acc_std ** 2 * F @ A @ F.T 

        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    
    def update(self, z):
        if z is None: return

        # the measurement uncertainty
        R = self.meas_std ** 2 * np.eye(self.dim_z)

        # the Kalman Gain
        K = self.P @ self.H.T @ np.linalg.inv(self.H @ self.P @ self.H.T + R)

        self.x = self.x + K @ (z - self.H @ self.x)
        I = np.eye(3 * self.dim_z)
        self.P = (I - K @ self.H) @ self.P @ (I - K @ self.H).T + K @ R @ K.T


# Create pipeline
pipeline = dai.Pipeline()

#imu = pipeline.create(dai.node.IMU)
# enable ACCELEROMETER_RAW and GYROSCOPE_RAW at 100 hz rate
#imu.enableIMUSensor([dai.IMUSensor.ACCELEROMETER_RAW], 100)
# above this threshold packets will be sent in batch of X, if the host is not blocked and USB bandwidth is available
#imu.setBatchReportThreshold(1)

# maximum number of IMU packets in a batch, if it's reached device will block sending until host can receive it
# if lower or equal to batchReportThreshold then the sending is always blocking on device
# useful to reduce device's CPU load  and number of lost packets, if CPU load is high on device side due to multiple nodes
#imu.setMaxBatchReports(10)

cam_rgb = pipeline.create(dai.node.ColorCamera)
detection_network = pipeline.create(dai.node.MobileNetSpatialDetectionNetwork)
mono_left = pipeline.create(dai.node.MonoCamera)
mono_right = pipeline.create(dai.node.MonoCamera)
stereo = pipeline.create(dai.node.StereoDepth)
object_tracker = pipeline.create(dai.node.ObjectTracker)

xout_rgb = pipeline.create(dai.node.XLinkOut)
tracker_out = pipeline.create(dai.node.XLinkOut)

xout_rgb.setStreamName('rgb')
tracker_out.setStreamName('tracklets')

cam_rgb.setImageOrientation(dai.CameraImageOrientation.ROTATE_180_DEG)
cam_rgb.setPreviewSize(672, 384)
cam_rgb.setBoardSocket(dai.CameraBoardSocket.RGB)
cam_rgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
cam_rgb.setInterleaved(False)
cam_rgb.setPreviewNumFramesPool(15)

mono_left.setResolution(dai.MonoCameraProperties.SensorResolution.THE_400_P)
mono_left.setBoardSocket(dai.CameraBoardSocket.LEFT)
mono_right.setResolution(dai.MonoCameraProperties.SensorResolution.THE_400_P)
mono_right.setBoardSocket(dai.CameraBoardSocket.RIGHT)

mono_left.setFps(10)
mono_right.setFps(10)
cam_rgb.setFps(10)

stereo.setDefaultProfilePreset(dai.node.StereoDepth.PresetMode.HIGH_DENSITY)
stereo.setDepthAlign(dai.CameraBoardSocket.RGB)

detection_network.setBlobPath('./pedestrian-and-vehicle-detector-adas-0001.blob')
detection_network.setConfidenceThreshold(0.5)
detection_network.input.setBlocking(False)
detection_network.setBoundingBoxScaleFactor(0.7)
detection_network.setNumInferenceThreads(2)
#detection_network.setNumClasses(2)
detection_network.setDepthLowerThreshold(100)
detection_network.setDepthUpperThreshold(15000)

object_tracker.setDetectionLabelsToTrack([1,2])  # track only vehicles and pedestrians
object_tracker.setTrackerType(dai.TrackerType.ZERO_TERM_COLOR_HISTOGRAM)
object_tracker.setTrackerIdAssignmentPolicy(dai.TrackerIdAssignmentPolicy.UNIQUE_ID)

mono_left.out.link(stereo.left)
mono_right.out.link(stereo.right)

cam_rgb.preview.link(detection_network.input)
object_tracker.passthroughTrackerFrame.link(xout_rgb.input)
object_tracker.out.link(tracker_out.input)

detection_network.passthrough.link(object_tracker.inputTrackerFrame)
detection_network.passthrough.link(object_tracker.inputDetectionFrame)
detection_network.out.link(object_tracker.inputDetections)
stereo.depth.link(detection_network.inputDepth)


# Connect to device and start pipeline
with dai.Device(pipeline) as device:

    calibration_handler = device.readCalibration()
    baseline = calibration_handler.getBaselineDistance() * 10  # getBaselineDistance() returns cm; convert to mm
    focal_length = calibration_handler.getCameraIntrinsics(dai.CameraBoardSocket.RIGHT, 640, 400)[0][0]

    q_rgb  = device.getOutputQueue(name='rgb', maxSize=10, blocking=False)
    q_tracklets = device.getOutputQueue(name='tracklets', maxSize=10, blocking=False)

    kalman_filters = {}
    
    startTime = time.monotonic()
    counter = 0
    fps = 0
    color = (255, 255, 255)
    
    # Get the current date as the timestamp to generate unique file names.
    date = datetime.datetime.now().strftime('%m-%d-%Y_%H.%M.%S')
    output_file_name = '/home/pi/data/road_footage/output_' + date + '.avi'
    # Define the codec and create VideoWriter object
    fourcc = cv2.VideoWriter_fourcc(*'XVID')
    out = cv2.VideoWriter(output_file_name, fourcc, 10.0, (672, 384))
    
    while(True):
        frame = q_rgb.get().getCvFrame()
        tracklets = q_tracklets.get()
        current_time = tracklets.getTimestamp()
        
        h, w = frame.shape[:2]

        for t in tracklets.tracklets:
            roi = t.roi.denormalize(frame.shape[1], frame.shape[0])
            x1 = int(roi.topLeft().x)
            y1 = int(roi.topLeft().y)
            x2 = int(roi.bottomRight().x)
            y2 = int(roi.bottomRight().y)

            x_space = t.spatialCoordinates.x
            y_space = t.spatialCoordinates.y
            z_space = t.spatialCoordinates.z

            meas_vec_bbox = np.array([[(x1 + x2) / 2], [(y1 + y2) / 2], [x2 - x1], [y2 - y1]])
            meas_vec_space = np.array([[x_space], [y_space], [z_space]])
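            # Stereo depth error grows roughly quadratically with distance:
            # sigma_Z ~ Z^2 * sigma_disparity / (baseline * focal_length), with sigma_disparity taken as 1 px here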
            meas_std_space =  z_space ** 2 / (baseline * focal_length)

            if t.status.name == 'NEW':
                # Adjust these parameters
                acc_std_space = 10
                acc_std_bbox = 0.1 #0.1
                meas_std_bbox = 0.05

                # KalmanFilter takes (acc_std, meas_std, z, time)
                kalman_filters[t.id] = {'bbox': KalmanFilter(acc_std_bbox, meas_std_bbox, meas_vec_bbox, current_time),
                                        'space': KalmanFilter(acc_std_space, meas_std_space, meas_vec_space, current_time)}
                print("Created new tracklet: ",t.id)
            else:
                try:
                    dt = current_time - kalman_filters[t.id]['bbox'].time
                except KeyError:
                    print("Missed id: ", t.id)
                    continue
                dt = dt.total_seconds()
                kalman_filters[t.id]['space'].meas_std = meas_std_space

                if t.status.name != 'TRACKED':
                    meas_vec_bbox = None
                    meas_vec_space = None

                if z_space == 0:
                    meas_vec_space = None


                kalman_filters[t.id]['bbox'].predict(dt)
                kalman_filters[t.id]['bbox'].update(meas_vec_bbox)

                kalman_filters[t.id]['space'].predict(dt)
                kalman_filters[t.id]['space'].update(meas_vec_space)

                kalman_filters[t.id]['bbox'].time = current_time
                kalman_filters[t.id]['space'].time = current_time

                vec_bbox = kalman_filters[t.id]['bbox'].x
                vec_space = kalman_filters[t.id]['space'].x

                x1_filter = int(vec_bbox[0] - vec_bbox[2] / 2)
                x2_filter = int(vec_bbox[0] + vec_bbox[2] / 2)
                y1_filter = int(vec_bbox[1] - vec_bbox[3] / 2)
                y2_filter = int(vec_bbox[1] + vec_bbox[3] / 2)

                #cv2.rectangle(frame, (x1_filter, y1_filter), (x2_filter, y2_filter), (0, 0, 255), 2)
                #cv2.putText(frame, f'X: {int(vec_space[0])} mm', (x1 + 10, y1 + 110), cv2.FONT_HERSHEY_TRIPLEX, 0.5, (0, 0, 255))
                #cv2.putText(frame, f'Y: {int(vec_space[1])} mm', (x1 + 10, y1 + 125), cv2.FONT_HERSHEY_TRIPLEX, 0.5, (0, 0, 255))
                #cv2.putText(frame, f'Z: {int(vec_space[2])} mm', (x1 + 10, y1 + 140), cv2.FONT_HERSHEY_TRIPLEX, 0.5, (0, 0, 255))
                #print(t.label)

            try:
                label = label_map[t.label]
            except:
                label = t.label

            cv2.putText(frame, str(label), (x1 + 10, y1 + 20), cv2.FONT_HERSHEY_TRIPLEX, 0.5, 255)
            cv2.putText(frame, f'ID: {[t.id]}', (x1 + 10, y1 + 35), cv2.FONT_HERSHEY_TRIPLEX, 0.5, 255)
            cv2.putText(frame, t.status.name, (x1 + 10, y1 + 50), cv2.FONT_HERSHEY_TRIPLEX, 0.5, 255)
            cv2.rectangle(frame, (x1, y1), (x2, y2), (255, 0, 0))

            cv2.putText(frame, f'X: {int(x_space)} mm', (x1 + 10, y1 + 65), cv2.FONT_HERSHEY_TRIPLEX, 0.5, 255)
            cv2.putText(frame, f'Y: {int(y_space)} mm', (x1 + 10, y1 + 80), cv2.FONT_HERSHEY_TRIPLEX, 0.5, 255)
            cv2.putText(frame, f'Z: {int(z_space)} mm', (x1 + 10, y1 + 95), cv2.FONT_HERSHEY_TRIPLEX, 0.5, 255)
            
        counter+=1
        current_time = time.monotonic()
        if (current_time - startTime) > 1 :
            fps = counter / (current_time - startTime)
            counter = 0
            startTime = current_time
        cv2.putText(frame, "NN fps: {:.2f}".format(fps), (2, frame.shape[0] - 4), cv2.FONT_HERSHEY_TRIPLEX, 0.4, color)
        #print("fps: {:.2f}".format(fps))
        cv2.imshow('tracker', frame)
        bytes_avail = psutil.disk_usage('/').free
        #gigabytes_avail = bytes_avail / 1024 / 1024 / 1024
        #print(gigabytes_avail)
        if bytes_avail > 1073741824:  # keep recording only while more than 1 GiB is free
            out.write(frame)
        else:
            break
        if cv2.waitKey(1) == ord('q'):
            break
        
    cv2.destroyAllWindows()

    Hi AnirbanRaha ,
    Could you also share a video of rgb/depth together with the bounding boxes? That would help us spot the issue.
    I would also suggest using the latest develop of the depthai library (on depthai-python, check out develop and run python examples/install_requirements.py), as we have changed the depth averaging algo to median (see PR here), which, in general, improves spatial detections. Thoughts?
    Thanks, Erik

      Hey erik. Thanks for the quick response!
      Here is the link to the RGB video with bounding boxes:
      [

      Please let me know if you need anything else. I will also try this with the latest develop of depthai library tomorrow during daytime and let you know here.

      [edit] Another important thing I forgot to mention is that the OAK-D is mounted upside down. I am using a 180-degree rotation on the RGB frame at the beginning of my code to correct the orientation. I don't know if this causes the issue.

      Thanks!


        erik It's in the first post. The code is based on the Kalman filter example in the depthai-experiments repo, although I have disabled the Kalman filter output as it's even worse than the object tracker output.


          Hi AnirbanRaha ,
          Sorry, my bad; I wanted to ask if you could provide an MRE (much shorter code), so our development team can spot the issue and debug it.
          Thanks, Erik

            erik Here is the minimum amount of code to produce the video preview similar to the video I had posted in the previous message. Please let me know if this is what you wanted.

            import depthai as dai
            import blobconverter
            import cv2
            import numpy as np
            import time
            
            label_map = ['unknown','vehicle', 'pedestrian']
            
            # Create pipeline
            pipeline = dai.Pipeline()
            
            cam_rgb = pipeline.create(dai.node.ColorCamera)
            detection_network = pipeline.create(dai.node.MobileNetSpatialDetectionNetwork)
            mono_left = pipeline.create(dai.node.MonoCamera)
            mono_right = pipeline.create(dai.node.MonoCamera)
            stereo = pipeline.create(dai.node.StereoDepth)
            object_tracker = pipeline.create(dai.node.ObjectTracker)
            
            xout_rgb = pipeline.create(dai.node.XLinkOut)
            tracker_out = pipeline.create(dai.node.XLinkOut)
            
            xout_rgb.setStreamName('rgb')
            tracker_out.setStreamName('tracklets')
            
            #rotating the camera because it's mounted upside down
            cam_rgb.setImageOrientation(dai.CameraImageOrientation.ROTATE_180_DEG)
            cam_rgb.setPreviewSize(672, 384)
            cam_rgb.setBoardSocket(dai.CameraBoardSocket.RGB)
            cam_rgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
            cam_rgb.setInterleaved(False)
            cam_rgb.setPreviewNumFramesPool(15)
            
            mono_left.setResolution(dai.MonoCameraProperties.SensorResolution.THE_400_P)
            mono_left.setBoardSocket(dai.CameraBoardSocket.LEFT)
            mono_right.setResolution(dai.MonoCameraProperties.SensorResolution.THE_400_P)
            mono_right.setBoardSocket(dai.CameraBoardSocket.RIGHT)
            
            mono_left.setFps(10)
            mono_right.setFps(10)
            cam_rgb.setFps(10)
            
            stereo.setDefaultProfilePreset(dai.node.StereoDepth.PresetMode.HIGH_DENSITY)
            stereo.setDepthAlign(dai.CameraBoardSocket.RGB)
            
            detection_network.setBlobPath('./pedestrian-and-vehicle-detector-adas-0001.blob')
            detection_network.setConfidenceThreshold(0.5)
            detection_network.input.setBlocking(False)
            detection_network.setBoundingBoxScaleFactor(0.7)
            detection_network.setNumInferenceThreads(2)
            detection_network.setDepthLowerThreshold(100)
            detection_network.setDepthUpperThreshold(15000)
            
            object_tracker.setDetectionLabelsToTrack([1,2])  # track only vehicles and pedestrians
            object_tracker.setTrackerType(dai.TrackerType.ZERO_TERM_COLOR_HISTOGRAM)
            object_tracker.setTrackerIdAssignmentPolicy(dai.TrackerIdAssignmentPolicy.UNIQUE_ID)
            
            mono_left.out.link(stereo.left)
            mono_right.out.link(stereo.right)
            
            cam_rgb.preview.link(detection_network.input)
            object_tracker.passthroughTrackerFrame.link(xout_rgb.input)
            object_tracker.out.link(tracker_out.input)
            
            detection_network.passthrough.link(object_tracker.inputTrackerFrame)
            detection_network.passthrough.link(object_tracker.inputDetectionFrame)
            detection_network.out.link(object_tracker.inputDetections)
            stereo.depth.link(detection_network.inputDepth)
            
            
            # Connect to device and start pipeline
            with dai.Device(pipeline) as device:
            
                calibration_handler = device.readCalibration()
                baseline = calibration_handler.getBaselineDistance() * 10
                focal_length = calibration_handler.getCameraIntrinsics(dai.CameraBoardSocket.RIGHT, 640, 400)[0][0]
            
                q_rgb  = device.getOutputQueue(name='rgb', maxSize=10, blocking=False)
                q_tracklets = device.getOutputQueue(name='tracklets', maxSize=10, blocking=False)
                
                startTime = time.monotonic()
                counter = 0
                fps = 0
                color = (255, 255, 255)
               
                while(True):
                    frame = q_rgb.get().getCvFrame()
                    tracklets = q_tracklets.get()
                    current_time = tracklets.getTimestamp()
                    
                    h, w = frame.shape[:2]
            
                    for t in tracklets.tracklets:
                        roi = t.roi.denormalize(frame.shape[1], frame.shape[0])
                        x1 = int(roi.topLeft().x)
                        y1 = int(roi.topLeft().y)
                        x2 = int(roi.bottomRight().x)
                        y2 = int(roi.bottomRight().y)
            
                        x_space = t.spatialCoordinates.x
                        y_space = t.spatialCoordinates.y
                        z_space = t.spatialCoordinates.z
            
                        try:
                            label = label_map[t.label]
                        except:
                            label = t.label
            
                        cv2.putText(frame, str(label), (x1 + 10, y1 + 20), cv2.FONT_HERSHEY_TRIPLEX, 0.5, 255)
                        cv2.putText(frame, f'ID: {[t.id]}', (x1 + 10, y1 + 35), cv2.FONT_HERSHEY_TRIPLEX, 0.5, 255)
                        cv2.putText(frame, t.status.name, (x1 + 10, y1 + 50), cv2.FONT_HERSHEY_TRIPLEX, 0.5, 255)
                        cv2.rectangle(frame, (x1, y1), (x2, y2), (255, 0, 0))
            
                        cv2.putText(frame, f'X: {int(x_space)} mm', (x1 + 10, y1 + 65), cv2.FONT_HERSHEY_TRIPLEX, 0.5, 255)
                        cv2.putText(frame, f'Y: {int(y_space)} mm', (x1 + 10, y1 + 80), cv2.FONT_HERSHEY_TRIPLEX, 0.5, 255)
                        cv2.putText(frame, f'Z: {int(z_space)} mm', (x1 + 10, y1 + 95), cv2.FONT_HERSHEY_TRIPLEX, 0.5, 255)
                        
                    counter+=1
                    current_time = time.monotonic()
                    if (current_time - startTime) > 1 :
                        fps = counter / (current_time - startTime)
                        counter = 0
                        startTime = current_time
                    cv2.putText(frame, "NN fps: {:.2f}".format(fps), (2, frame.shape[0] - 4), cv2.FONT_HERSHEY_TRIPLEX, 0.4, color)
                    cv2.imshow('tracker', frame)
            
                    if cv2.waitKey(1) == ord('q'):
                        break
                    
                cv2.destroyAllWindows()

            180-degree rotation is not supported by SpatialDetectionNetwork unless all cameras are rotated the same way (i.e. the depth map is also rotated 180 degrees). For that you need to rotate the LEFT and RIGHT cameras 180 degrees as well, then link the left camera to the right stereo input and the right camera to the left stereo input, using the latest develop branch of the library. @AnirbanRaha
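
            A minimal sketch of that wiring (untested; it assumes the node names from the MRE above and a recent develop build of the library):

            # Rotate all three cameras 180 degrees so the depth map matches the rotated RGB frame
            cam_rgb.setImageOrientation(dai.CameraImageOrientation.ROTATE_180_DEG)
            mono_left.setImageOrientation(dai.CameraImageOrientation.ROTATE_180_DEG)
            mono_right.setImageOrientation(dai.CameraImageOrientation.ROTATE_180_DEG)

            # After the flip, the physical left camera becomes the optical right one, so the stereo inputs are crossed
            mono_left.out.link(stereo.right)
            mono_right.out.link(stereo.left)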

            5 days later

            @GergelySzabolcs @erik Hi. I don't want to go into the complications of reversing the image at this stage, so I found a way to mount the camera in the upright position. I also did some extensive testing both indoors and outdoors (on a bright sunny day). I did this with both the latest "develop" branch of depthai-python and the latest stable release 2.20.2.0.

            What I found was that indoors, the spatial detection works very well; I am getting accurate results that are good enough for me. However, outdoors, it still won't work properly. The only differences between the indoor and outdoor setups are as follows:

            1. Indoor - Raspberry Pi is being powered by the official 3A power supply
              Outdoor - Raspberry Pi is being powered by a Quick charge 3.0 (3A) USB port that is wired to my car's battery
            2. Indoor - Lighting is low
              Outdoor - It's very bright and somehow objects don't appear as clear as in indoor lighting conditions

            Another difference between the two conditions is that the OAK-D Lite is placed behind my car's windshield. However, I also tried putting a thick piece of transparent glass in front of the camera indoors, and it still performs just as well, so I don't think this is the cause, but who knows!

            I'm attaching links to both the outdoor and indoor videos. The MRE for this setup is almost the same as before, but I will still post it:

            import depthai as dai
            import blobconverter
            import cv2
            import numpy as np
            import time
            
            labelMap = ['unknown','vehicle', 'pedestrian']
            
            # Create pipeline
            pipeline = dai.Pipeline()
            
            # Define sources and outputs
            camRgb = pipeline.create(dai.node.ColorCamera)
            spatialDetectionNetwork = pipeline.create(dai.node.MobileNetSpatialDetectionNetwork)
            monoLeft = pipeline.create(dai.node.MonoCamera)
            monoRight = pipeline.create(dai.node.MonoCamera)
            stereo = pipeline.create(dai.node.StereoDepth)
            objectTracker = pipeline.create(dai.node.ObjectTracker)
            
            xoutRgb = pipeline.create(dai.node.XLinkOut)
            trackerOut = pipeline.create(dai.node.XLinkOut)
            
            xoutRgb.setStreamName("preview")
            trackerOut.setStreamName("tracklets")
            
            # Properties
            #camRgb.setImageOrientation(dai.CameraImageOrientation.ROTATE_180_DEG)
            camRgb.setPreviewSize(672, 384)
            camRgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
            camRgb.setInterleaved(False)
            camRgb.setColorOrder(dai.ColorCameraProperties.ColorOrder.BGR)
            
            #monoLeft.setImageOrientation(dai.CameraImageOrientation.ROTATE_180_DEG)
            monoLeft.setResolution(dai.MonoCameraProperties.SensorResolution.THE_400_P)
            monoLeft.setBoardSocket(dai.CameraBoardSocket.LEFT)
            
            #monoRight.setImageOrientation(dai.CameraImageOrientation.ROTATE_180_DEG)
            monoRight.setResolution(dai.MonoCameraProperties.SensorResolution.THE_400_P)
            monoRight.setBoardSocket(dai.CameraBoardSocket.RIGHT)
            
            monoLeft.setFps(10)
            monoRight.setFps(10)
            camRgb.setFps(10)
            
            # setting node configs
            stereo.setDefaultProfilePreset(dai.node.StereoDepth.PresetMode.HIGH_DENSITY)
            # Align depth map to the perspective of RGB camera, on which inference is done
            stereo.setDepthAlign(dai.CameraBoardSocket.RGB)
            stereo.setOutputSize(monoLeft.getResolutionWidth(), monoLeft.getResolutionHeight())
            
            spatialDetectionNetwork.setBlobPath('./pedestrian-and-vehicle-detector-adas-0001.blob')
            spatialDetectionNetwork.setConfidenceThreshold(0.5)
            spatialDetectionNetwork.input.setBlocking(True)
            spatialDetectionNetwork.setBoundingBoxScaleFactor(0.5)
            spatialDetectionNetwork.setDepthLowerThreshold(100)
            spatialDetectionNetwork.setDepthUpperThreshold(15000)
            
            objectTracker.setDetectionLabelsToTrack([1,2])  # track only vehicles and pedestrians
            # possible tracking types: ZERO_TERM_COLOR_HISTOGRAM, ZERO_TERM_IMAGELESS, SHORT_TERM_IMAGELESS, SHORT_TERM_KCF
            objectTracker.setTrackerType(dai.TrackerType.ZERO_TERM_COLOR_HISTOGRAM)
            # take the smallest ID when new object is tracked, possible options: SMALLEST_ID, UNIQUE_ID
            objectTracker.setTrackerIdAssignmentPolicy(dai.TrackerIdAssignmentPolicy.SMALLEST_ID)
            
            # Linking
            monoLeft.out.link(stereo.left)
            monoRight.out.link(stereo.right)
            
            camRgb.preview.link(spatialDetectionNetwork.input)
            objectTracker.passthroughTrackerFrame.link(xoutRgb.input)
            objectTracker.out.link(trackerOut.input)
            
            # if fullFrameTracking:
            #     camRgb.setPreviewKeepAspectRatio(False)
            #     camRgb.video.link(objectTracker.inputTrackerFrame)
            #     objectTracker.inputTrackerFrame.setBlocking(False)
            #     # do not block the pipeline if it's too slow on full frame
            #     objectTracker.inputTrackerFrame.setQueueSize(2)
            # else:
            spatialDetectionNetwork.passthrough.link(objectTracker.inputTrackerFrame)
            
            spatialDetectionNetwork.passthrough.link(objectTracker.inputDetectionFrame)
            spatialDetectionNetwork.out.link(objectTracker.inputDetections)
            stereo.depth.link(spatialDetectionNetwork.inputDepth)
                
            # Connect to device and start pipeline
            with dai.Device(pipeline) as device:
            
                preview = device.getOutputQueue("preview", 4, False)
                tracklets = device.getOutputQueue("tracklets", 4, False)
            
                startTime = time.monotonic()
                counter = 0
                fps = 0
                color = (255, 255, 255)
            
                while(True):
                    imgFrame = preview.get()
                    track = tracklets.get()
            
                    counter+=1
                    current_time = time.monotonic()
                    if (current_time - startTime) > 1 :
                        fps = counter / (current_time - startTime)
                        counter = 0
                        startTime = current_time
            
                    frame = imgFrame.getCvFrame()
                    trackletsData = track.tracklets
                    for t in trackletsData:
                        roi = t.roi.denormalize(frame.shape[1], frame.shape[0])
                        x1 = int(roi.topLeft().x)
                        y1 = int(roi.topLeft().y)
                        x2 = int(roi.bottomRight().x)
                        y2 = int(roi.bottomRight().y)
            
                        try:
                            label = labelMap[t.label]
                        except:
                            label = t.label
            
                        cv2.putText(frame, str(label), (x1 + 10, y1 + 20), cv2.FONT_HERSHEY_TRIPLEX, 0.4, 255)
                        cv2.putText(frame, f"ID: {[t.id]}", (x1 + 10, y1 + 30), cv2.FONT_HERSHEY_TRIPLEX, 0.2, 255)
                        cv2.putText(frame, t.status.name, (x1 + 10, y1 + 40), cv2.FONT_HERSHEY_TRIPLEX, 0.2, 255)
                        cv2.rectangle(frame, (x1, y1), (x2, y2), color, cv2.FONT_HERSHEY_SIMPLEX)
            
                        cv2.putText(frame, f"X: {int(t.spatialCoordinates.x)} mm", (x1 + 10, y1 + 60), cv2.FONT_HERSHEY_TRIPLEX, 0.4, 255)
                        #cv2.putText(frame, f"Y: {int(t.spatialCoordinates.y)} mm", (x1 + 10, y1 + 80), cv2.FONT_HERSHEY_TRIPLEX, 0.5, 255)
                        cv2.putText(frame, f"Z: {int(t.spatialCoordinates.z)} mm", (x1 + 10, y1 + 80), cv2.FONT_HERSHEY_TRIPLEX, 0.4, 255)
            
                    cv2.putText(frame, "NN fps: {:.2f}".format(fps), (2, frame.shape[0] - 4), cv2.FONT_HERSHEY_TRIPLEX, 0.4, color)
            
                    cv2.imshow("tracker", frame)
                    
                  
                    if cv2.waitKey(1) == ord('q'):
                        break
                    
                cv2.destroyAllWindows()

            Outdoor video: [
            Indoor video: [

            Can you please give any hints as to how I can reproduce the indoor results outside in daylight?

              AnirbanRaha

              1. Increase the depth upper threshold for outside; currently it's capped at 15 meters in your pipeline:
                setDepthUpperThreshold(15000)

              2. Set the mono cameras to 720p, which is more suitable for long range

              3. Enable stereo subpixel mode, better for long range

              4. Use latest develop, spatial locations are more stable
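
              Putting those suggestions into code, a minimal sketch (assuming the node names from the MRE above; note that the OAK-D Lite mono sensors top out at 480p, so the 720p setting applies to a regular OAK-D):

                # 1. Allow spatial detections farther than 15 m
                spatialDetectionNetwork.setDepthUpperThreshold(30000)  # 30 m

                # 2. Higher mono resolution (regular OAK-D; the Lite's mono cameras are 480p max)
                monoLeft.setResolution(dai.MonoCameraProperties.SensorResolution.THE_720_P)
                monoRight.setResolution(dai.MonoCameraProperties.SensorResolution.THE_720_P)

                # 3. Subpixel disparity, better for long range
                stereo.setSubpixel(True)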

                GergelySzabolcs

                GergelySzabolcs Increase depth threshold for outside, currently it's set to max 15 meters in your pipeline:
                setDepthUpperThreshold(15000)

                I've set the depth upper threshold to 30000 now.

                GergelySzabolcs Set mono cameras to 720p, which is more suitable for lang range

                I've got the OAK-D Lite version. I don't think it has 720p resolution on the mono cameras (it throws an error if I set the mono resolution to 720p and defaults to 480p).

                GergelySzabolcs Enable stereo subpixel mode, better for long range

                Enabled it with: stereo.setSubpixel(True)

                GergelySzabolcs Use latest develop, spatial locations are more stable

                Yes, I'm using the latest "develop" branch of depthai-python. As I've mentioned before, the readings are quite stable for me indoors.

                [Edit]: I have measured distances with a measuring tape. I'm somehow getting twice the actual distance on the Z coordinate?!

                Please have a look at this video: I'm standing at the end of the 5-meter-long tape and getting around 10 meters. The reading is always roughly 2x what I should actually be getting:

                [
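
                As a sanity check (just a sketch, not something from this thread; the 10 px disparity is a made-up example value), one could read back the calibration the device is actually using and see whether the focal length and baseline look plausible, since Z = focal_length * baseline / disparity, and a consistent ~2x error in Z could also come from wrong calibration values:

                import depthai as dai

                with dai.Device() as device:
                    calib = device.readCalibration()
                    # focal length (px) of the right mono camera, scaled to 640x480
                    fx = calib.getCameraIntrinsics(dai.CameraBoardSocket.RIGHT, 640, 480)[0][0]
                    baseline_cm = calib.getBaselineDistance()  # centimeters
                    print(f'fx = {fx:.1f} px, baseline = {baseline_cm:.2f} cm')

                    # expected depth (mm) for a hypothetical disparity of 10 px
                    disparity_px = 10.0
                    z_mm = fx * (baseline_cm * 10.0) / disparity_px
                    print(f'disparity {disparity_px} px -> Z = {z_mm:.0f} mm')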

                  Here it is. Says something about "No factory calibration".
                  Is EEPROM available: True

                  User calibration: {
                    "batchName": "",
                    "batchTime": 0,
                    "boardConf": "",
                    "boardCustom": "",
                    "boardName": "OAK-D-LITE",
                    "boardOptions": 0,
                    "boardRev": "R1M1E3",
                    "cameraData": [
                      [
                        2,
                        {
                          "cameraType": 0,
                          "distortionCoeff": [
                            -3.5793373584747314,
                            -27.937454223632812,
                            -0.0013723287265747786,
                            -0.00048459265963174403,
                            181.11610412597656,
                            -3.565078020095825,
                            -28.067947387695312,
                            181.3641357421875,
                            0.0,
                            0.0,
                            0.0,
                            0.0,
                            0.0,
                            0.0
                          ],
                          "extrinsics": {
                            "rotationMatrix": [
                              [
                                0.9998652338981628,
                                0.016339778900146484,
                                0.0015999042661860585
                              ],
                              [
                                -0.016334136947989464,
                                0.9998605251312256,
                                -0.0034779615234583616
                              ],
                              [
                                -0.0016565101686865091,
                                0.0034513596910983324,
                                0.9999926686286926
                              ]
                            ],
                            "specTranslation": {
                              "x": 3.75,
                              "y": 0.0,
                              "z": 0.0
                            },
                            "toCameraSocket": 0,
                            "translation": {
                              "x": 3.729553699493408,
                              "y": 0.03537023440003395,
                              "z": -0.4103076159954071
                            }
                          },
                          "height": 480,
                          "intrinsicMatrix": [
                            [
                              452.56829833984375,
                              0.0,
                              314.34759521484375
                            ],
                            [
                              0.0,
                              452.56829833984375,
                              237.3134765625
                            ],
                            [
                              0.0,
                              0.0,
                              1.0
                            ]
                          ],
                          "lensPosition": 0,
                          "specHfovDeg": 72.9000015258789,
                          "width": 640
                        }
                      ],
                      [
                        1,
                        {
                          "cameraType": 0,
                          "distortionCoeff": [
                            -19.380821228027344,
                            145.0615997314453,
                            -0.0009900459554046392,
                            0.0008708707173354924,
                            -169.3114013671875,
                            -19.37105941772461,
                            144.98361206054688,
                            -169.1844940185547,
                            0.0,
                            0.0,
                            0.0,
                            0.0,
                            0.0,
                            0.0
                          ],
                          "extrinsics": {
                            "rotationMatrix": [
                              [
                                0.9994993805885315,
                                -0.015042693354189396,
                                -0.027834275737404823
                              ],
                              [
                                0.015119130723178387,
                                0.9998824596405029,
                                0.002537747146561742
                              ],
                              [
                                0.027792830020189285,
                                -0.002957306569442153,
                                0.9996093511581421
                              ]
                            ],
                            "specTranslation": {
                              "x": -7.5,
                              "y": 0.0,
                              "z": 0.0
                            },
                            "toCameraSocket": 2,
                            "translation": {
                              "x": -7.491415500640869,
                              "y": -0.11638329923152924,
                              "z": -0.08934982120990753
                            }
                          },
                          "height": 480,
                          "intrinsicMatrix": [
                            [
                              452.8463439941406,
                              0.0,
                              298.63897705078125
                            ],
                            [
                              0.0,
                              452.8463439941406,
                              216.44290161132812
                            ],
                            [
                              0.0,
                              0.0,
                              1.0
                            ]
                          ],
                          "lensPosition": 0,
                          "specHfovDeg": 72.9000015258789,
                          "width": 640
                        }
                      ],
                      [
                        0,
                        {
                          "cameraType": 0,
                          "distortionCoeff": [
                            -2.4185500144958496,
                            -2.238746404647827,
                            0.0005566917243413627,
                            0.0017487785080447793,
                            9.495307922363281,
                            -2.5591840744018555,
                            -1.5969901084899902,
                            8.690184593200684,
                            0.0,
                            0.0,
                            0.0,
                            0.0,
                            0.0,
                            0.0
                          ],
                          "extrinsics": {
                            "rotationMatrix": [
                              [
                                0.0,
                                0.0,
                                0.0
                              ],
                              [
                                0.0,
                                0.0,
                                0.0
                              ],
                              [
                                0.0,
                                0.0,
                                0.0
                              ]
                            ],
                            "specTranslation": {
                              "x": -0.0,
                              "y": -0.0,
                              "z": -0.0
                            },
                            "toCameraSocket": -1,
                            "translation": {
                              "x": 0.0,
                              "y": 0.0,
                              "z": 0.0
                            }
                          },
                          "height": 2160,
                          "intrinsicMatrix": [
                            [
                              2958.43115234375,
                              0.0,
                              1867.47021484375
                            ],
                            [
                              0.0,
                              2958.43115234375,
                              1093.8828125
                            ],
                            [
                              0.0,
                              0.0,
                              1.0
                            ]
                          ],
                          "lensPosition": 0,
                          "specHfovDeg": 68.7938003540039,
                          "width": 3840
                        }
                      ]
                    ],
                    "hardwareConf": "",
                    "imuExtrinsics": {
                      "rotationMatrix": [
                        [
                          0.0,
                          0.0,
                          0.0
                        ],
                        [
                          0.0,
                          0.0,
                          0.0
                        ],
                        [
                          0.0,
                          0.0,
                          0.0
                        ]
                      ],
                      "specTranslation": {
                        "x": 0.0,
                        "y": 0.0,
                        "z": 0.0
                      },
                      "toCameraSocket": -1,
                      "translation": {
                        "x": 0.0,
                        "y": 0.0,
                        "z": 0.0
                      }
                    },
                    "miscellaneousData": [],
                    "productName": "",
                    "stereoRectificationData": {
                      "leftCameraSocket": 1,
                      "rectifiedRotationLeft": [
                        [
                          0.9998739957809448,
                          0.00045567681081593037,
                          -0.01586950570344925
                        ],
                        [
                          -0.0004328725044615567,
                          0.9999988675117493,
                          0.0014403954846784472
                        ],
                        [
                          0.015870144590735435,
                          -0.0014333445578813553,
                          0.9998730421066284
                        ]
                      ],
                      "rectifiedRotationRight": [
                        [
                          0.9998082518577576,
                          0.01553257554769516,
                          0.011924673803150654
                        ],
                        [
                          -0.015515424311161041,
                          0.999878466129303,
                          -0.001529501285403967
                        ],
                        [
                          -0.011946981772780418,
                          0.0013441916089504957,
                          0.9999276995658875
                        ]
                      ],
                      "rightCameraSocket": 2
                    },
                    "version": 6
                  }
                  No factory calibration: No or invalid EEPROM configuration flashed, error: EEPROM_INVALID_DATA
                  User calibration raw: [6, 0, 170, 85, 3, 0, 0, 0, 0, 79, 65, 75, 45, 68, 45, 76, 73, 84, 69, 0, 17, 128, 128, 0, 0, 82, 49, 77, 49, 69, 51, 0, 0, 0, 6, 190, 247, 127, 63, 232, 231, 238, 57, 196, 0, 130, 188, 42, 243, 226, 185, 237, 255, 127, 63, 167, 203, 188, 58, 27, 2, 130, 60, 16, 223, 187, 186, 174, 247, 127, 63, 111, 243, 127, 63, 88, 124, 126, 60, 181, 95, 67, 60, 104, 52, 126, 188, 9, 248, 127, 63, 140, 121, 200, 186, 70, 189, 67, 188, 150, 47, 176, 58, 67, 251, 127, 63, 1, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 224, 1, 128, 2, 0, 85, 108, 226, 67, 0, 0, 0, 0, 202, 81, 149, 67, 0, 0, 0, 0, 85, 108, 226, 67, 98, 113, 88, 67, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 128, 63, 236, 11, 155, 193, 197, 15, 17, 67, 110, 196, 129, 186, 37, 75, 100, 58, 184, 79, 41, 195, 238, 247, 154, 193, 206, 251, 16, 67, 59, 47, 41, 195, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 205, 204, 145, 66, 0, 1, 49, 223, 127, 63, 161, 117, 118, 188, 181, 4, 228, 188, 59, 182, 119, 60, 76, 248, 127, 63, 85, 80, 38, 59, 202, 173, 227, 60, 95, 207, 65, 187, 102, 230, 127, 63, 173, 185, 239, 192, 94, 90, 238, 189, 10, 253, 182, 189, 0, 0, 240, 192, 0, 0, 0, 0, 0, 0, 0, 0, 2, 224, 1, 128, 2, 0, 190, 72, 226, 67, 0, 0, 0, 0, 126, 44, 157, 67, 0, 0, 0, 0, 190, 72, 226, 67, 64, 80, 109, 67, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 128, 63, 221, 19, 101, 192, 232, 127, 223, 193, 182, 223, 179, 186, 237, 16, 254, 185, 185, 29, 53, 67, 61, 42, 100, 192, 40, 139, 224, 193, 56, 93, 53, 67, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 205, 204, 145, 66, 0, 2, 43, 247, 127, 63, 0, 219, 133, 60, 225, 179, 209, 58, 43, 207, 133, 188, 220, 246, 127, 63, 131, 238, 99, 187, 66, 31, 217, 186, 53, 48, 98, 59, 133, 255, 127, 63, 2, 177, 110, 64, 97, 224, 16, 61, 215, 19, 210, 190, 0, 0, 112, 64, 0, 0, 0, 0, 0, 0, 0, 0, 0, 112, 8, 0, 15, 0, 230, 230, 56, 69, 0, 0, 0, 0, 12, 111, 233, 68, 0, 0, 0, 0, 230, 230, 56, 69, 64, 188, 136, 68, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 128, 63, 134, 201, 26, 192, 159, 71, 15, 192, 243, 238, 17, 58, 69, 55, 229, 58, 200, 236, 23, 65, 172, 201, 35, 192, 44, 106, 204, 191, 255, 10, 11, 65, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 109, 150, 137, 66, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 
255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 
255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255]
                  Factory calibration raw: [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 
[Output truncated: the dump continues for several hundred more values, all of them 255.]

                  GergelySzabolcs

                  6 days later

                  @erik @GergelySzabolcs I tried to calibrate the camera. I followed the steps in this thread: https://github.com/luxonis/depthai/issues/725#:~:text=git%20checkout%20lite_calibration,python3%20calibrate.py%20...

                  I used the command:
                  python3 calibrate.py -s 2.4 -ms 1.7 -brd OAK-D-LITE
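                  # (assumed flag meanings, per Luxonis' calibrate.py: -s = square size of the calibration pattern in cm, -ms = ChArUco marker size in cm, -brd = board name)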

                  I oriented the ChArUco board and took 13 images. The calibration process ended with a message along the lines of "Calibration successful. Error = 0.19..."

                  After this I ran calibration_dump.py again and I still get the same calibration / invalid-configuration error.

                  On top of that, now I can't run any of the scripts at all. I'm getting an "Xlink error: possible device misconfiguration".

                  How can I properly calibrate the OAK D Lite? Please help me through this.

                  [Edit]: I tried the calibration process again. This time I also got a successful calibration message, but with a lower error value of 0.15...
                  Now the scripts are working again and the indoor readings look good. I have yet to check it outdoors in my car.

                  However, what I still don't get is why the calibration dump script still shows a configuration error.

                  After the earlier recalibration it's now running outdoors with acceptable accuracy as well.

                  Should I be concerned that calibration_dump.py still shows a configuration error?

                  @erik @GergelySzabolcs
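
                  For reference, calibration_dump.py itself isn't posted in this thread, so the following is only a rough sketch of what such a dump script typically does with depthai's CalibrationHandler API (the file name and resolution below are arbitrary):

import depthai as dai

# Rough sketch of a calibration dump (not the actual calibration_dump.py from this thread).
# Connect to the first available device and read the user calibration, which is what
# depthai uses for depth computation.
with dai.Device() as device:
    calib = device.readCalibration()

    # Write the full EEPROM/calibration contents to JSON for inspection
    calib.eepromToJsonFile("calib_dump.json")

    # Print the right mono camera intrinsics (resized to 640x480 here) and the stereo baseline
    print("RIGHT intrinsics @ 640x480:", calib.getCameraIntrinsics(dai.CameraBoardSocket.RIGHT, 640, 480))
    print("Stereo baseline [cm]:", calib.getBaselineDistance())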


                    Hi AnirbanRaha,
                    Great to hear that the issue has been resolved! Which error from the calibration dump are you referring to?
                    Thanks, Erik

                      erik Yes, it seems so, but I'm still getting this error when I run calibration_dump.py:
                      No factory calibration: No or invalid EEPROM configuration flashed, error: EEPROM_INVALID_DATA


                        Hi AnirbanRaha,
                        That's because your device has the older calibration layout - newer versions store both a factory calibration and a user calibration, so users can't override their factory calib 🙂
                        But that's not a problem; depthai will simply use the user calibration.
                        Thanks, Erik
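
                        To illustrate this, here is a minimal sketch (assuming a recent depthai release that exposes Device.readFactoryCalibration(); it is not from any script in this thread): the user calibration is always readable and is what depth uses, while reading the factory slot on an older device can fail with the EEPROM error quoted above.

import depthai as dai

# Minimal sketch: user vs. factory calibration slots (assumes a recent depthai release).
with dai.Device() as device:
    # User calibration - this is what depthai actually uses for depth
    user_calib = device.readCalibration()
    print("User calibration baseline [cm]:", user_calib.getBaselineDistance())

    # Factory calibration - may not be flashed on older devices like this OAK-D Lite,
    # in which case reading it raises an EEPROM error (harmless for depth).
    try:
        device.readFactoryCalibration()
        print("Factory calibration present as well.")
    except RuntimeError as e:
        print("No factory calibration:", e)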

                        I see. Thanks for the explanation @erik! I really don't mind that error since the distances now look pretty good.