Hey,

I want to use a camera tuning blob with an exposure limit on an OAK-FFC-3P with an IMX462 sensor from Arducam.

But I get:

RuntimeError: Communication exception - possible device error/misconfiguration. Original message 'Couldn't read data from stream: 'video' (X_LINK_ERROR)'

My pipeline, which works great with the OAK-D Pro:

#!/usr/bin/env python3

import cv2
import depthai as dai

# Create pipeline
pipeline = dai.Pipeline()
pipeline.setCameraTuningBlobPath("/path/tuning/depthai-python/examples/ColorCamera/tuning/tuning_exp_limit_8300us.bin")

print(str(pipeline.getDeviceConfig()))

# Define source and output
camRgb = pipeline.create(dai.node.ColorCamera)
xoutVideo = pipeline.create(dai.node.XLinkOut)
xoutVideo.setStreamName("video")

# Properties
camRgb.setBoardSocket(dai.CameraBoardSocket.CAM_A)
camRgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
camRgb.setVideoSize(1920, 1080)
xoutVideo.input.setBlocking(False)
xoutVideo.input.setQueueSize(1)

# Linking
camRgb.video.link(xoutVideo.input)

# Connect to device and start pipeline
with dai.Device(pipeline, True) as device:
    video = device.getOutputQueue(name="video", maxSize=1, blocking=False)
    while True:
        videoIn = video.get()
        cv2.imshow("video", videoIn.getCvFrame())
        if cv2.waitKey(1) == ord('q'):
            break

The OAK-FFC-3P with IMX462 is running on the newest develop branch.

Maybe the team can help me cap the exposure time at 500 us or lower. I can't set a static ISO and exposure time because of the dynamic lighting environment; higher noise isn't a problem 🙂

Thanks a lot

Andrzej

Hi @AndrzejFijalo
You can't set a cap for the exposure time in the API. You can, however, write a function that periodically updates the exposure time depending on your environment.

Then send the updated exposure control back to the device, as done here.
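
Roughly, the idea could look like this. A minimal sketch, assuming a simple mean-brightness heuristic and the placeholder values EXPOSURE_CAP_US, ISO and TARGET_BRIGHTNESS; only the depthai calls themselves come from the actual API:

#!/usr/bin/env python3
# Sketch: host-side exposure cap via CameraControl messages.
# EXPOSURE_CAP_US, ISO, TARGET_BRIGHTNESS and the brightness
# heuristic are placeholders, not part of any official example.

import cv2
import depthai as dai

EXPOSURE_CAP_US = 500      # hypothetical cap from the thread
ISO = 800                  # fixed sensitivity, placeholder value
TARGET_BRIGHTNESS = 110    # desired mean pixel value, placeholder

pipeline = dai.Pipeline()

camRgb = pipeline.create(dai.node.ColorCamera)
camRgb.setBoardSocket(dai.CameraBoardSocket.CAM_A)
camRgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)

xoutVideo = pipeline.create(dai.node.XLinkOut)
xoutVideo.setStreamName("video")
camRgb.video.link(xoutVideo.input)

# Control input so the host can push CameraControl messages
controlIn = pipeline.create(dai.node.XLinkIn)
controlIn.setStreamName("control")
controlIn.out.link(camRgb.inputControl)

with dai.Device(pipeline) as device:
    videoQ = device.getOutputQueue("video", maxSize=1, blocking=False)
    controlQ = device.getInputQueue("control")

    expUs = EXPOSURE_CAP_US
    while True:
        frame = videoQ.get().getCvFrame()

        # Crude brightness feedback: nudge exposure toward the target,
        # but never above the cap
        brightness = frame.mean()
        if brightness < TARGET_BRIGHTNESS:
            expUs = min(int(expUs * 1.1) + 1, EXPOSURE_CAP_US)
        else:
            expUs = max(int(expUs * 0.9), 20)

        ctrl = dai.CameraControl()
        ctrl.setManualExposure(expUs, ISO)
        controlQ.send(ctrl)

        cv2.imshow("video", frame)
        if cv2.waitKey(1) == ord('q'):
            break

Note that this only approximates a tuning-blob exposure limit, since each update round-trips through the host once per frame.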

Thoughts?
Jaka

I found


AND IT'S AWESOME
but I can't find this implementation in depthai-core 🙁 and my project is in C++, so I need this very badly.
Andrzej