Hello! I've been looking through and tweaking the samples, but so far I can't get below about 1 s of streaming video latency when streaming video from the camera to an RTP receiver.

I'm using the Python RTSP streamer as a starting point, but then I take the encoded data and send a raw UDP video stream without using GStreamer. I initialized the encoder with the following arguments:

    videnc.setDefaultProfilePreset(FPS, dai.VideoEncoderProperties.Profile.H264_MAIN)
    videnc.setNumBFrames(0)
    videnc.setBitrateKbps(4000)
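For context, a quick back-of-the-envelope check on packet sizing (the 30 fps figure is an assumption; the snippet just does the arithmetic): at 4000 kbps each frame averages roughly 16.7 kB, i.e. about a dozen 1400-byte datagrams, so the UDP fragmentation itself shouldn't add meaningful delay.

```python
import math

# Assumed values: bitrate from setBitrateKbps above, FPS assumed to be 30.
BITRATE_KBPS = 4000
FPS = 30
MTU_PAYLOAD = 1400  # bytes per UDP datagram, as in the send loop below

avg_frame_bytes = BITRATE_KBPS * 1000 / 8 / FPS
packets_per_frame = math.ceil(avg_frame_bytes / MTU_PAYLOAD)

print(f"~{avg_frame_bytes:.0f} bytes/frame, ~{packets_per_frame} UDP packets/frame")
```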

And send the data with:

    with dai.Device(pipeline, device_info) as device:
        sock = socket.socket(socket.AF_INET,   # Internet
                             socket.SOCK_DGRAM)  # UDP

        encoded = device.getOutputQueue("encoded", maxSize=30, blocking=True)
        print("Setup finished")

        while True:
            data = encoded.get().getData()
            dataToSend = len(data)
            startPos = 0
            while startPos < dataToSend:
                dataLeft = dataToSend - startPos
                toSend = min(1400, dataLeft)
                endPos = startPos + toSend

                buffer = bytearray()
                buffer.extend(data[startPos:endPos])
                print(f"Sending {len(buffer)} / {dataLeft} {startPos} {endPos}")
                startPos = endPos
                sock.sendto(buffer, ("192.168.0.73", 9922))
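The inner loop above can also be written without allocating a new `bytearray` per packet, since `sock.sendto()` accepts any bytes-like object. A sketch (the function name is mine, not from the original code):

```python
def iter_udp_chunks(data, mtu=1400):
    """Yield successive mtu-sized views of an encoded frame.

    A memoryview slice shares the underlying buffer, so no per-packet
    copy is made; sendto() can transmit it directly.
    """
    view = memoryview(data)
    for start in range(0, len(view), mtu):
        yield view[start:start + mtu]

# Example: a 3000-byte frame splits into 1400 + 1400 + 200 bytes.
sizes = [len(chunk) for chunk in iter_udp_chunks(bytes(3000))]
print(sizes)  # [1400, 1400, 200]
```

In the send loop this would become `for chunk in iter_udp_chunks(data): sock.sendto(chunk, addr)`.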

I've then played it using ffplay on the receiving computer:

ffplay -fflags nobuffer -i udp://0.0.0.0:9922

The video looks fine, but the 1s delay is there.

Are there any other items to tweak here? I have also tried the H265 encoder, with no better results.

Thanks in advance! I'd love to see any other samples of direct RTP/UDP streaming if anyone has managed to get low-latency video.

Just a quick mention that I also ran this code directly on the OAK camera using the on-device script interpreter, and did not see any better performance.

  • aviator_01 replied to this.
  • @IvanW could you try these flags? They should give you better latency:

    ffplay -fflags nobuffer -flags low_delay -strict experimental [...]

    Perhaps also try -fflags discardcorrupt and -framedrop. GPT-4 has a good explanation of these flags.

    Hi @IvanW
    Does reducing the bitrate affect the stream latency?
    The loop could be done without creating a new bytearray every time, I think.
    It might also be worth checking whether the latency comes from DepthAI itself. When queue sizes are large, there can be a noticeable delay; setting the queue size to 1 should fix that.

    Thanks,
    Jaka
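Jaka's queue-size point can be quantified: with `maxSize=30` and a blocking queue, a full output queue at 30 fps buffers up to a second of video by itself, which matches the latency reported above. A quick check (queue depth taken from the `getOutputQueue` call in the original code; the frame rate is an assumption):

```python
QUEUE_DEPTH = 30   # maxSize used in getOutputQueue("encoded", ...)
FPS = 30           # assumed frame rate

# Worst case: every slot holds one not-yet-sent frame.
worst_case_queue_latency_s = QUEUE_DEPTH / FPS
print(worst_case_queue_latency_s)  # 1.0
```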

    Reducing the bitrate did not help. MJPEG, which is a ton more data, has no latency, so I don't suspect the bytearray either (MJPEG uses the exact same code path).

    I can check the queue size and see if that has any impact when using h264 or h265.


    Thank you all. The -flags low_delay flag made the largest difference from my previous command line. It's not quite as instantaneous as MJPEG, but latency is much lower at 30 Hz (200-300 ms). So the delay was mostly in the client application (ffplay) playing the stream, not the stream generator (OAK PoE), i.e.:

    ffplay -flags low_delay -fflags nobuffer -strict experimental udp://0.0.0.0:9922

    Much appreciated.

    8 months later

    Hi IvanW,
    I am also working on a similar task: setting up an RTSP connection from a Jetson Nano with an OAK-1 camera to a Mac (in my case) or another machine (Linux/Windows). I am following the article at luxonis/depthai-experiments/tree/master/gen2-rtsp-streaming.
    The connection is set up, but I am not able to see the stream properly: high latency, corrupted frames, etc.
    Could you help me carry out this task? A step-by-step guide would be appreciated.

    Thanks