Hi all,

I am working on this tutorial: depthai-experiments/gen2-poe-mqtt at master · luxonis/depthai-experiments · GitHub

In this code, instead of sending the detections, I want to send the image from the camRgb node. I have tried all the outputs of the camRgb node, but I always get an error. Can you help me with this?

import depthai as dai
import blobconverter
import time

# Change the IP to your MQTT broker!
MQTT_BROKER = "test.mosquitto.org"
MQTT_BROKER_PORT = 1883
MQTT_TOPIC = "test_topic/detections"

pipeline = dai.Pipeline()

camRgb = pipeline.create(dai.node.ColorCamera)
camRgb.setPreviewSize(300, 300)
camRgb.setFps(10)
camRgb.setInterleaved(False)

# Define a neural network that will make predictions based on the source frames
nn = pipeline.create(dai.node.MobileNetDetectionNetwork)
nn.setConfidenceThreshold(0.5)
nn.setBlobPath(blobconverter.from_zoo(name="mobilenet-ssd", shaves=6))
camRgb.preview.link(nn.input)

script = pipeline.create(dai.node.Script)
script.setProcessor(dai.ProcessorType.LEON_CSS)
nn.out.link(script.inputs['detections'])

script_text = f"""
import time

mqttc = Client()
node.warn('Connecting to MQTT broker...')
mqttc.connect("{MQTT_BROKER}", {MQTT_BROKER_PORT}, 60)
node.warn('Successfully connected to MQTT broker!')
mqttc.loop_start()

cnt = 0
total = 0
while True:
    dets = node.io['detections'].get()
    total += len(dets.detections)
    cnt += 1
    if cnt > 20:
        avrg = str(total / 20)
        cnt = 0
        total = 0
        node.warn(avrg)
        (ok, id) = mqttc.publish("{MQTT_TOPIC}", avrg, qos=2)
"""

# The paho-mqtt library source is prepended to the script so that Client() is available on-device
with open("paho-mqtt.py", "r") as f:
    paho_script = f.read()
script.setScript(f"{paho_script}\n{script_text}")

with dai.Device(pipeline) as device:
    print('Connected to OAK')
    while not device.isClosed():
        time.sleep(1)

Thank you very much

Hi @martin181818
What error are you getting?
I'd imagine it would be a problem sending a stream of images (10 FPS in your case) over MQTT. Try lowering the FPS and image size first (though the 300x300 preview may be fine).
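For a rough sense of the bandwidth involved, here is some simple arithmetic, assuming the 300x300 BGR preview from the pipeline above (3 bytes per pixel):

```python
# Rough payload estimate for streaming raw (unencoded) preview frames.
def raw_stream_bytes_per_sec(width: int, height: int, channels: int, fps: int) -> int:
    return width * height * channels * fps

# 300x300 BGR at 10 FPS is 2.7 MB/s before any MQTT overhead,
# which is a lot to push through a public broker; at 1 FPS it
# drops to 270 KB/s, and MJPEG encoding would shrink it further.
print(raw_stream_bytes_per_sec(300, 300, 3, 10))  # 2700000
print(raw_stream_bytes_per_sec(300, 300, 3, 1))   # 270000
```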

Thanks,
Jaka

    jakaskerl

    I am only sending the image every 5 seconds using a timer. The only problem is that I still can't figure out how to extract the image from the node and send it. Normally we would use OpenCV to encode the image before sending it, but inside the Script node that isn't possible.

    Thank you

      Hi martin181818
      Would it help to first encode the frame to MJPEG using the VideoEncoder node, then link the bitstream to a Script node, where you extract it with node.io['frame'].get().getData()?
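      For reference, a rough sketch of that wiring is below. It is untested here and the Script-node part can only run on-device (the `node` object exists only there), so treat it as pseudocode; the stream name 'frame' and the exact setDefaultProfilePreset signature (it has changed between depthai versions) are assumptions:

```
# Pipeline side: encode the camera stream to MJPEG on-device.
videoEnc = pipeline.create(dai.node.VideoEncoder)
# Recent gen2 versions take (fps, profile); older ones also took width/height.
videoEnc.setDefaultProfilePreset(camRgb.getFps(), dai.VideoEncoderProperties.Profile.MJPEG)
videoEnc.setQuality(95)  # raise MJPEG quality if image quality is the priority
camRgb.video.link(videoEnc.input)
videoEnc.bitstream.link(script.inputs['frame'])

# Script node side (runs on-device): the bitstream is already JPEG,
# so no OpenCV encoding step is needed before publishing.
frame = node.io['frame'].get()
jpeg_bytes = bytes(frame.getData())
mqttc.publish("test_topic/frames", jpeg_bytes, qos=1)
```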

      Thoughts?
      Jaka

        Another thing: since I am only sending an image every 5-10 seconds, I don't need every frame the VideoEncoder produces. What I care about is the quality of the images I send.

        Thanks

        Hi @jakaskerl

        Any news on this thread? How can I maintain good image quality while sending the data via MQTT or sockets from the Script node?

        Thank you

        Hi @martin181818
        Not sure about MQTT since I haven't tested it, but the TCP streaming example using sockets should preserve image quality.
        Is that not the case for you?

        Thanks,
        Jaka

          jakaskerl

          Thank you for your reply. I checked the TCP streaming example using the VideoEncoder node and it is working fine now. I appreciate it. Thank you