AlbinM

  • jakaskerl

    This is the code we use for displaying the timestamp on the screen (datetime_stream.py in the repo):

    import time

    # Print the host's monotonic clock in milliseconds; this is the timestamp
    # shown on the monitor that the camera films.
    while True:
        print(time.monotonic_ns() / 1000000)

    We have tried varying the FPS, and the discrepancy persisted at the same level of 90-100 ms. We have also checked whether rolling shutter could be a factor by placing our timestamp in different parts of the camera FOV, but the discrepancy persisted there as well.

    We certainly accept that the error might be in our own experiment/measurement as well. We have tried to eliminate the possible causes we could think of and made our repo for reproducing the error as simple as possible. If you have any further ideas about what in the experiment/measurement might be wrong, we are happy to try them to get to the bottom of this.

    Do you know what experiments were done during the camera's development to check latency? Perhaps we could replicate one of those to confirm whether the issue really does lie in our experimental setup.

  • jakaskerl

    We have now updated to time.monotonic (repo also updated), and it did not change anything; the discrepancy persists.
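
    For reference, the updated display loop is essentially unchanged; a minimal sketch, assuming it otherwise mirrors datetime_stream.py:

    import time

    # Same printout as before, now via time.monotonic (seconds converted to milliseconds)
    while True:
        print(time.monotonic() * 1000)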

    Any other pointers?

    • We have measured the latency of an image based on its content and compared that to the latency obtained through the method suggested by Luxonis, and the two latencies do not match.

      Based on this post: Latency Measurement, we expect that the latency of a given frame can be found as

      latencyMs = (dai.Clock.now() - imgFrame.getTimestamp()).total_seconds() * 1000
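
      Applied in a host-side receive loop, that calculation looks roughly like this (a minimal sketch for illustration; the pipeline below is not our actual repo code):

      import depthai as dai

      # Minimal pipeline: colour camera streamed to the host over XLink
      pipeline = dai.Pipeline()
      cam = pipeline.create(dai.node.ColorCamera)
      xout = pipeline.create(dai.node.XLinkOut)
      xout.setStreamName("rgb")
      cam.video.link(xout.input)

      with dai.Device(pipeline) as device:
          q = device.getOutputQueue(name="rgb", maxSize=4, blocking=False)
          while True:
              imgFrame = q.get()  # blocks until the next frame arrives
              # Host clock minus the device-stamped capture time, in milliseconds
              latencyMs = (dai.Clock.now() - imgFrame.getTimestamp()).total_seconds() * 1000
              print(f"Reported latency: {latencyMs:.1f} ms")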

      However, we have found that this latency does not match reality. We constructed an experiment where we point the camera at a monitor on which we print out the current time. By comparing the time visible in the captured image to the time in our program when the frame arrives, we can determine the true latency of an image. We compare this true latency to the latency reported by the Luxonis camera per the above calculation, and the two do not match.
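
      Per frame, the comparison we make boils down to the following (a sketch; the function and variable names are illustrative, not the repo's actual code):

      import time
      import depthai as dai

      def latency_discrepancy_ms(imgFrame, t_screen_ms):
          # t_screen_ms: monotonic time (ms) that was shown on the monitor in the
          # captured frame, read out of the image (manually or via OCR).
          t_receive_ms = time.monotonic_ns() / 1000000  # when the frame reached our program
          true_latency_ms = t_receive_ms - t_screen_ms  # latency based on image content
          reported_latency_ms = (dai.Clock.now() - imgFrame.getTimestamp()).total_seconds() * 1000
          return true_latency_ms - reported_latency_ms  # roughly 100 ms in our measurements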

      {ExperimentSetup}

      Here is an example of our setup, with the Luxonis camera pointing towards the monitor showing the current-time printouts.

      We find that the latency derived from image content and program time differs from the latency derived from the Luxonis timestamp and Luxonis clock by about 100 ms.

      {LatencyDifference}

      We have put together a minimal code example in this repo; the link opens the primary notebook for the procedure: Minimal Example Repo

      The current discrepancy hinders our use case, which requires accurate latency measurements.

      Any pointers on how we can get more accurate latency measurements, or on whether we are missing something?