Hi,
I'd like to stream H.264-encoded video for real-time monitoring. According to this page, I can expect roughly 30 ms of latency, and that is indeed what I measure with the code below:
import depthai as dai
import numpy as np
FPS = 60
# Create pipeline
pipeline = dai.Pipeline()
# This might help reduce latency on some systems
pipeline.setXLinkChunkSize(0)
# Define source and output
camRgb = pipeline.create(dai.node.ColorCamera)
camRgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
camRgb.setFps(FPS)
encoder = pipeline.create(dai.node.VideoEncoder)
profile = dai.VideoEncoderProperties.Profile.H264_MAIN
encoder.setDefaultProfilePreset(FPS, profile)
camRgb.video.link(encoder.input)
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("out")
encoder.bitstream.link(xout.input)
# Connect to device and start pipeline
with dai.Device(pipeline) as device:
    print(device.getUsbSpeed())
    q = device.getOutputQueue(name="out", maxSize=30, blocking=True)
    diffs = np.array([])
    while True:
        imgFrame = q.get()
        # Latency in milliseconds
        latencyMs = (dai.Clock.now() - imgFrame.getTimestamp()).total_seconds() * 1000
        diffs = np.append(diffs, latencyMs)
        print(
            "Latency: {:.2f} ms, Average latency: {:.2f} ms, Std: {:.2f}".format(
                latencyMs, np.average(diffs), np.std(diffs)
            )
        )
Is there a way to reduce the latency further? The page mentioned above points to a zero-copy branch of depthai. Is it still usable? It looks to be outdated.
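One thing I was considering is shrinking the host-side output queue so the host always reads the newest encoded frame instead of buffering up to 30 of them. Is something like the sketch below the right direction? (maxSize=1 and blocking=False are just my guesses at the relevant knobs, I haven't verified that they actually help.)

    # Sketch only: shallow, non-blocking queue so stale frames are dropped
    # on the host instead of piling up before I read them.
    q = device.getOutputQueue(name="out", maxSize=1, blocking=False)
    while True:
        imgFrame = q.tryGet()  # returns None if no frame is available yet
        if imgFrame is None:
            continue
        latencyMs = (dai.Clock.now() - imgFrame.getTimestamp()).total_seconds() * 1000
        print("Latency: {:.2f} ms".format(latencyMs))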
Thanks a lot