• Hardware
  • The timestamp difference between IMU and video frames (~3.5 sec)

Hello,
I have set up three outputs from a DepthAI pipeline: video, IMU, and NN (based on the video). I got a weird result when trying to check their origin times:
    # (q_cam, imuQueue, q_detector, baseTs and counter are set up earlier in the script)
    while True:
        in_frame = q_cam.get()
        time_fr = in_frame.getTimestampDevice()

        imuData = imuQueue.get()  # blocking call, waits until new data has arrived

        in_detector = q_detector.get()

        if in_frame is not None:
            frame = in_frame.getCvFrame()

        if in_detector is not None:
            detections = in_detector.detections
            counter += 1

        imuPackets = imuData.packets
        for imuPacket in imuPackets:
            rVvalues = imuPacket.rotationVector
            rvTs = rVvalues.getTimestampDevice()
            if baseTs is None:
                baseTs = rvTs
            rvTs = rvTs - baseTs

        print(colored(str(time_fr - rvTs), 'green'), end='\r')

0:00:03.49417969

What's wrong?

thanks!

Yury
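
For reference, here is a minimal diagnostic sketch (using the same q_cam and imuQueue handles as in the code above) that prints both device timestamps as-is, so the frame and rotation-vector times are compared on the same clock without any baseTs offset:

    # Hedged sketch: print the raw device timestamps of the frame and the IMU
    # rotation vector side by side, plus their direct difference.
    in_frame = q_cam.get()
    imuData = imuQueue.get()

    frame_ts = in_frame.getTimestampDevice()
    for imuPacket in imuData.packets:
        imu_ts = imuPacket.rotationVector.getTimestampDevice()
        print(f"frame: {frame_ts}  imu: {imu_ts}  diff: {frame_ts - imu_ts}")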


    jakaskerl

    I had just added NN processing on top of the video stream that was already synchronized with the IMU:

    detector = pipeline.create(dai.node.YoloDetectionNetwork)

    ………………
    cam.preview.link(detector.input)

    detector.passthrough.link(sync.inputs["video"])

    detector.out.link(xout_nn.input)

    And I got delays. There were none before syncing video and IMU (but the IMU data was heavily outdated then).
    Did I link the NN to the synced data in the wrong manner?

    I used this example as a base for the sync:
    https://docs.luxonis.com/projects/api/en/latest/samples/Sync/imu_video_sync/#imu-and-video-sync

    Thank You
    Yury

      Hi yuph
      What do you mean by delays? Could you add some timestamping? Usually the NN node will introduce some latency, as it runs inference on camera frames.

      Thanks,
      Jaka
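
      As one rough way to add that timestamping, a minimal sketch is to compare each message's timestamp against dai.Clock.now(), which is on the same clock domain as the message timestamps; the queue names below (q_cam, q_detector) are the ones used earlier in this thread and may differ in your script:

          import depthai as dai

          # Hedged sketch: per-stream latency as seen by the host, measured as the
          # difference between "now" and the message timestamp.
          frame = q_cam.get()          # message from the "video" queue
          det = q_detector.get()       # message from the "nn" queue

          frame_latency_ms = (dai.Clock.now() - frame.getTimestamp()).total_seconds() * 1000
          nn_latency_ms = (dai.Clock.now() - det.getTimestamp()).total_seconds() * 1000
          print(f"video latency: {frame_latency_ms:.1f} ms, nn latency: {nn_latency_ms:.1f} ms")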


        jakaskerl

        Hi

        I meant the delay in drawing the NN detection box over the camera stream: when the camera was moved, the box stayed stuck at the place where the detected object had been before. The delay before it reacted was seemingly 0.3-0.5 s. Previously (before I synced IMU and video) the video -> nn -> cv2 behavior was different.

        # Create pipeline
        pipeline = dai.Pipeline()

        cam = pipeline.create(dai.node.ColorCamera)
        cam.setPreviewSize(VIDEO_WIDTH, VIDEO_HEIGHT)
        cam.setInterleaved(False)
        fps = 25
        cam.setFps(fps)
        cam.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
        cam.setColorOrder(dai.ColorCameraProperties.ColorOrder.RGB)
        xout_cam = pipeline.create(dai.node.XLinkOut)
        xout_cam.setStreamName("video")

        detector = pipeline.create(dai.node.YoloDetectionNetwork)
        detector.setConfidenceThreshold(prob_lim)
        detector.setNumClasses(2)
        detector.setCoordinateSize(4)
        detector.setAnchors([])
        detector.setAnchorMasks({})
        detector.setIouThreshold(0.7)
        detector.setBlobPath(nnPathDefault)
        detector.setNumInferenceThreads(1)
        detector.input.setBlocking(False)
        xout_nn = pipeline.create(dai.node.XLinkOut)
        xout_nn.setStreamName("nn")

        imu = pipeline.create(dai.node.IMU)
        xoutImu = pipeline.create(dai.node.XLinkOut)
        xoutImu.setStreamName("imu")
        xoutGrp = pipeline.create(dai.node.XLinkOut)
        xoutGrp.setStreamName("xout")

        imu.enableIMUSensor(dai.IMUSensor.ROTATION_VECTOR, 400)
        imu.setBatchReportThreshold(1)
        imu.setMaxBatchReports(10)

        sync = pipeline.create(dai.node.Sync)
        sync.setSyncThreshold(timedelta(milliseconds=20))
        sync.setSyncAttempts(-1)

        cam.preview.link(sync.inputs["video"])
        imu.out.link(sync.inputs["imu"])

        sync.out.link(xoutGrp.input)

        cam.preview.link(detector.input)
        detector.passthrough.link(sync.inputs["video"])
        detector.out.link(xout_nn.input)

        thanks
        Yury

          yuph detector.passthrough.link(sync.inputs["video"])

          yuph cam.preview.link(sync.inputs["video"])

          Both of these link to the same input.

          Moreover, link detector.out to the sync node to sync RGB video, IMU and NN results together. You don't need detector.passthrough if you are using a sync node.

          Thanks,
          Jaka


            jakaskerl
            I removed "detector.passthrough.link(sync.inputs["video"])"

            Clarify " link detector.out to the sync node to sync RGB video, IMU and NN results together", please!

            Thank You
            Yury

              Hi yuph

              detector = pipeline.create(dai.node.YoloDetectionNetwork)
              
              ………………
              cam.preview.link(detector.input)
              
              cam.preview.link(sync.inputs["rgb"])
              detector.out.link(sync.inputs["nn"])
              imu.out.link(sync.inputs["imu"])
              
              sync.out.link(xoutGrp.input)

              Like this.

              Thanks,
              Jaka
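
              On the host side, the synced group from the "xout" stream then arrives as a single MessageGroup that can be unpacked by input name. A minimal sketch, assuming the pipeline and the input names "rgb", "nn" and "imu" from the snippet above:

                  with dai.Device(pipeline) as device:
                      groupQueue = device.getOutputQueue("xout", maxSize=4, blocking=False)

                      while True:
                          groupMessage = groupQueue.get()   # dai.MessageGroup

                          frame = groupMessage["rgb"].getCvFrame()        # ImgFrame
                          detections = groupMessage["nn"].detections      # ImgDetections
                          for imuPacket in groupMessage["imu"].packets:   # IMUData
                              rv = imuPacket.rotationVector

                          # frame, detections and rv come from the same synced group,
                          # so boxes drawn here correspond to this frame (within the
                          # configured 20 ms sync threshold).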