IvanW

  • Feb 22, 2024
  • Joined Feb 21, 2024
  • 0 best answers
  • Thank you all. It looks like `-flags low_delay` made the largest difference versus my previous command line. It's not quite as instantaneous as MJPEG, but latency is much lower at 30 Hz (200-300 ms). So the delay was mostly in the client application (ffplay) playing the stream, not in the stream generator (OAK PoE), i.e.:

    ffplay -flags low_delay -fflags nobuffer -strict experimental udp://0.0.0.0:9922

    Much appreciated.

  • Reducing the bitrate did not help. MJPEG, which is far more data, has no latency, so I don't suspect the bytearray either (MJPEG uses the exact same code path).

    I can check the queue size and see if that has any impact when using h264 or h265.
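
    One thing worth trying along those lines: request the queue with maxSize=1 and blocking=False (e.g. device.getOutputQueue("encoded", maxSize=1, blocking=False)), so the sender always works on the newest frame instead of draining a backlog. The effect of a bounded, drop-oldest queue can be sketched in plain Python (no depthai needed; the names here are illustrative):

    ```python
    from collections import deque

    # A bounded queue that discards the oldest item when full -- the same idea
    # as a non-blocking DepthAI output queue created with maxSize=1.
    latest = deque(maxlen=1)

    # The producer runs ahead of the consumer: frames 0..9 arrive before a read.
    for frame_id in range(10):
        latest.append(frame_id)

    # The consumer sees only the newest frame, not a 10-frame backlog.
    print(latest[0])  # -> 9
    ```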

  • Did you have any luck getting the latency down? I'm running into the same issue, even when running a basic UDP stream on the camera directly.

  • Hello! I've been looking through and tweaking the samples, but so far I can't get below about 1 s of latency when streaming video from the camera to an RTP receiver.

    I'm using the Python RTSP streamer as a starting point, but then I take the data and do the following to send a UDP video stream without GStreamer. I initialized the encoder with the following args:

        videnc.setDefaultProfilePreset(FPS, dai.VideoEncoderProperties.Profile.H264_MAIN)
        videnc.setNumBFrames(0)
        videnc.setBitrateKbps(4000)

    And send the data with:

        import socket

        import depthai as dai

        with dai.Device(pipeline, device_info) as device:
            # Plain UDP socket; each datagram carries at most 1400 bytes
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

            encoded = device.getOutputQueue("encoded", maxSize=30, blocking=True)
            print("Setup finished")

            while True:
                data = encoded.get().getData()
                dataToSend = len(data)
                startPos = 0
                while startPos < dataToSend:
                    dataLeft = dataToSend - startPos
                    toSend = min(1400, dataLeft)  # stay below a typical MTU
                    endPos = startPos + toSend

                    buffer = bytearray(data[startPos:endPos])
                    print(f"Sending {len(buffer)} / {dataLeft} {startPos} {endPos}")
                    startPos = endPos
                    sock.sendto(buffer, ("192.168.0.73", 9922))
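
    As a side note, the inner chunking loop can be factored into a small helper; a memoryview avoids copying each slice into a fresh bytearray (the helper name and the 1400-byte MTU are just illustrative):

    ```python
    import socket

    def send_in_chunks(sock, data, addr, mtu=1400):
        """Send `data` over UDP in datagrams of at most `mtu` bytes."""
        view = memoryview(data)  # zero-copy slicing
        sent = 0
        while sent < len(view):
            chunk = view[sent:sent + mtu]
            sock.sendto(chunk, addr)
            sent += len(chunk)
        return sent
    ```

    The send loop then reduces to send_in_chunks(sock, encoded.get().getData(), ("192.168.0.73", 9922)).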

    I've then played it with ffplay on the receiving computer:

    ffplay -fflags nobuffer -i udp://0.0.0.0:9922

    The video looks fine, but the 1s delay is there.

    Are there any other items to tweak here? I have also tried the H265 encoder, with no better results.

    Thanks in advance! I'd love to see any other samples of direct RTP/UDP streaming if anyone has managed to get low-latency video.

    A quick note: I also ran this code directly on the OAK camera using the on-device script interpreter, and it did not perform any better.

    • aviator_01 replied to this.
    • @IvanW could you try these flags? They should give lower latency:

      ffplay -fflags nobuffer -flags low_delay -strict experimental [...]

      Perhaps also try -fflags discardcorrupt and -framedrop. GPT-4 has a good explanation of these flags.
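
      Putting those together, the full invocation would look something like this (combining -fflags values with + is standard ffmpeg flag syntax; the port matches the earlier posts):

      ```shell
      ffplay -fflags nobuffer+discardcorrupt -flags low_delay -framedrop \
             -strict experimental udp://0.0.0.0:9922
      ```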