I see. So it's better to use useSpec=False.
Thanks
Hi Jaka. Thank you for the answer, but unfortunately I do not understand what you mean by that.
Thanks, Jan
Yes, that's it. I did not know how the 720p image is produced. The conversion from 12 MP -> 4K was the important part.
And should I use the MAX_FOV returned by getFov with useSpec True / False (or as shown here)?
I'm not sure why the FOV saved in the EEPROM is 108°, while the one calculated from the intrinsics is 79.47562085721047.
Thanks, Jan
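To illustrate where the 79.48° could come from, here is a rough sanity check with the pinhole model (the focal length below is only assumed for illustration, not the exact value from my calibration):

import math

fx = 2439.0    # assumed focal length in pixels, roughly what the 12 MP intrinsics would imply
width = 4056   # full 12 MP sensor width in pixels
HFOV = 2 * math.degrees(math.atan(width / (2 * fx)))
print(HFOV)    # ~79.5°, i.e. the value getFov(..., useSpec=False) reports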
Hi. Thank you for your reply. That would be true if the sub-image (i.e. the 720p image) were cropped from the source (12 MP). But if I'm not mistaken, judging by how the image looks, that is not the case:
Thank you, best wishes Jan
Hi. I'd like to get the real HFOV of the camera when setting up different resolutions (THE_12_MP, THE_720_P).
But calling
calibData = device.readCalibration()
HFOV = calibData.getFov(dai.CameraBoardSocket.CAM_A)
always returns the same value (108°, or 79.475 with useSpec=False).
That is most probably not correct, since the image is cropped for 720p.
Is there any built-in function to get the real HFOV?
Thank you, best wishes, Jan
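To show what I mean, this is roughly what I'm after (just a sketch, assuming getCameraIntrinsics can be asked for a target resolution; whether it actually accounts for the 720p crop is exactly what I'm unsure about):

import math
import depthai as dai

with dai.Device() as device:
    calibData = device.readCalibration()
    width, height = 1280, 720
    M = calibData.getCameraIntrinsics(dai.CameraBoardSocket.CAM_A, width, height)
    fx = M[0][0]
    # pinhole model: HFOV = 2 * atan(width / (2 * fx))
    print(2 * math.degrees(math.atan(width / (2 * fx))))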
Hi. I'm having an issue with depthai_sdk. My task is to detect objects using YOLO, but all I need is the detections and one frame per second. I use the code below to print the YOLO outputs. With the code as it is, I achieve 15 FPS, but when commenting out lines 14 and 15, the FPS drops to 2 and no detections are provided. Also, setting blocking=False on line 25 leads to failure.
Thank you, Jan
def callback(packet: DetectionPacket):
    visualizer = packet.visualizer
    try:
        print()
        for inx in range(len(packet.detections)):
            print('Detected: ' + packet.detections[inx].label + ' at ' + str(packet.detections[inx].centroid()))
            print('Spatial: x ' + str(visualizer.objects[0].detections[inx].spatialCoordinates.x) + ' mm, y '
                  + str(visualizer.objects[0].detections[inx].spatialCoordinates.y) + ' mm, z '
                  + str(visualizer.objects[0].detections[inx].spatialCoordinates.z) + ' mm.')
            print('Confidence: ' + str(visualizer.objects[0].detections[inx].confidence))
        print(visualizer.objects[1].text)
    except:
        pass
    frame = visualizer.draw(packet.frame)
    cv2.imshow('Visualizer', frame)
with OakCamera(args=args) as oak:
    color = oak.create_camera('color')
    color.config_color_camera(isp_scale=(1, 2))
    nn = oak.create_nn(args['config'], color, nn_type='yolo', spatial=True)
    for inx in range(len(nn._labels)):
        nn._labels[inx] = true_labels[inx]
    oak.visualize(nn.out.passthrough, fps=True, callback=callback)
    oak.start(blocking=True)
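For completeness, the once-per-second part I was planning to handle with a simple time gate inside the callback, something like this (just a sketch, not tested):

import time

last_report = 0.0

def callback(packet):
    global last_report
    now = time.monotonic()
    if now - last_report < 1.0:
        return  # keep draining packets at full FPS, only report once per second
    last_report = now
    # ... print the detections and show the frame as in the callback above ...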
After rebooting the camera, everything works. Thank you
Let me get back to this thread once more. I'm trying to crop images on the camera, but when cropping both the depth and the RGB image, dai raises: Fatal error. Please report to developers. Log: 'PipelineBuilder' '2221'
I followed these two tutorials: depth crop, color crop
Thank you
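For reference, the color-side crop I'm attempting looks roughly like this (a minimal sketch with illustrative crop values, not the tutorials' exact code):

import depthai as dai

pipeline = dai.Pipeline()

cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(1280, 720)

# Crop the central region of the preview with an ImageManip node
manip = pipeline.create(dai.node.ImageManip)
manip.initialConfig.setCropRect(0.25, 0.25, 0.75, 0.75)  # normalized xmin, ymin, xmax, ymax
cam.preview.link(manip.inputImage)

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName('cropped')
manip.out.link(xout.input)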
You are right, I do downscale the ISP output by a factor of 2/3, my bad.
So what would be the best solution here? To downscale the images, or to somehow crop the images themselves?
Thank you
Without synchronization I tried RGB @ 1080p at up to 40 FPS without a problem. By default I use RGB & depth @ 30 FPS.
Hi.
I'm implementing synchronization between IMU/RGB/depth to be able to reconstruct the true position of an object in time. I'm following this demo, but after running it, the shown FPS drops to 1-2 FPS. Each modality on its own works just fine at 30 FPS, and so do unsynchronized RGB and depth.
Is there any way to speed it all up?
Thank you, best wishes, Jan
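What I'd like to end up with is essentially pairing messages by timestamp on the host without blocking on every stream, along these lines (just a sketch for RGB/depth, assuming a running device with XLinkOut streams named 'rgb' and 'depth'; IMU packets would be matched the same way):

rgb_q = device.getOutputQueue('rgb', maxSize=4, blocking=False)
depth_q = device.getOutputQueue('depth', maxSize=4, blocking=False)

latest_depth = None
while True:
    msg = depth_q.tryGet()
    if msg is not None:
        latest_depth = msg
    rgb = rgb_q.tryGet()
    if rgb is not None and latest_depth is not None:
        dt = abs((rgb.getTimestamp() - latest_depth.getTimestamp()).total_seconds())
        if dt < 0.02:  # accept pairs closer than 20 ms
            pass  # process the synced RGB/depth pair here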
Thank you for the code snippet.
Now it is showing 30 ms even when the camera is facing a dark object and the Auto Exposure overbrightens the scene. So there must also be some other kind of compensation.
Anyhow, thank you again!
J.
Hello. I need to get the camera frame exposure time for the purposes of my experiment, and when using:
camControl = dai.CameraControl()
ET = camControl.getExposureTime().total_seconds() * 1000.
The ET is always 0.0 no matter how dim the illumination of the sample is.
I'm most probably doing something wrong, but I cannot figure out what it is.
How do I read it correctly?
thanks, Jan
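My guess is that the exposure has to be read from the frame metadata rather than from a freshly constructed CameraControl; a minimal sketch of what I mean (assuming ImgFrame.getExposureTime() is available in the installed depthai version, with an illustrative stream name):

import depthai as dai

pipeline = dai.Pipeline()
cam = pipeline.create(dai.node.ColorCamera)
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName('rgb')
cam.video.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue('rgb', maxSize=4, blocking=False)
    while True:
        frame = q.get()
        ET = frame.getExposureTime().total_seconds() * 1000.
        print('Exposure time:', ET, 'ms')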