Hi Erik,
Thank you very much again for your reply.
ImageManip is working fine now using the preview
node.
Since I upgraded the depthai version so I could stop using startPipeline(),
I haven't run into many issues. At least nothing I think is worth bothering you with at this stage.
I'm now trying a different approach to what I want to achieve.
Essentially, I want to take a picture in a pitch-dark room, have the camera identify the people in it, and give me the bounding boxes so that I can crop in on them.
My initial thought was to light up the room with a powerful LED for say 1 second, to give the camera enough time to identify people, and then save the image.
That is the code I sent before. It works to a degree, but there are multiple issues with the light source that aren't related to depthai.
And I think that, from a design point of view, exposing the picture and identifying the people should be two separate, independent processes.
That's why I thought about following the "Video & MobilenetSSD" example to feed a frame back into the device and, that way, use depthai only as the processing unit.
Here is a gist of that code.
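In case it helps, this is roughly the shape of it (simplified here; the blob path, file name and sizes are placeholders, and the real gist also has the ColorCamera / VideoEncoder part for saving the still):

import cv2
import depthai as dai

pipeline = dai.Pipeline()

# Frame comes in from the host instead of from the camera
frame_in = pipeline.createXLinkIn()
frame_in.setStreamName("frameIn")

nn = pipeline.createMobileNetDetectionNetwork()
nn.setConfidenceThreshold(0.5)
nn.setBlobPath("person-detection.blob")  # placeholder blob path

nn_out = pipeline.createXLinkOut()
nn_out.setStreamName("nn")

frame_in.out.link(nn.input)
nn.out.link(nn_out.input)

with dai.Device(pipeline) as device:
    q_in = device.getInputQueue("frameIn")
    q_nn = device.getOutputQueue("nn", maxSize=4, blocking=False)

    # Load the previously captured still and resize it to the NN input size
    frame = cv2.imread("capture.jpg")  # placeholder file
    nn_w, nn_h = 300, 300
    resized = cv2.resize(frame, (nn_w, nn_h))

    img = dai.ImgFrame()
    img.setType(dai.ImgFrame.Type.BGR888p)
    img.setSize(nn_w, nn_h)
    img.setData(resized.transpose(2, 0, 1).flatten())  # interleaved BGR -> planar
    q_in.send(img)

    for det in q_nn.get().detections:
        # Bounding boxes are normalized [0..1]; scale back to the full image
        h, w = frame.shape[:2]
        x1, y1 = int(det.xmin * w), int(det.ymin * h)
        x2, y2 = int(det.xmax * w), int(det.ymax * h)
        print(det.label, det.confidence, (x1, y1, x2, y2))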
I've had a moderate degree of success with it.
First of all, the 12MP resolution fails when I run that code from my Ubuntu machine: it says the camera ran out of memory, and I couldn't get around that. It seemed to work on a Raspberry Pi for a while, but now I'm getting the following errors:
[2021-07-26 16:52:14.424] [debug] Python bindings - version: 2.8.0.0.dev+a4c841d73e5bd0eee688c90b9c5352d187767645 from 2021-07-23 15:44:42 +0300 build: 2021-07-25 03:21:47 +0000
[2021-07-26 16:52:14.424] [debug] Library information - version: 2.8.0, commit: 7d76a830ffc51512adae455ec28b1150eabec513 from 2021-07-23 15:42:59 +0300, build: 2021-07-25 03:21:33 +0000
[2021-07-26 16:52:14.424] [debug] Initialize - finished
[2021-07-26 16:52:14.731] [debug] Resources - Archive 'depthai-bootloader-fwp-0.0.12.tar.xz' open: 6ms, archive read: 299ms
[2021-07-26 16:52:15.170] [debug] Device - pipeline serialized, OpenVINO version: 2021.2
[2021-07-26 16:52:15.387] [debug] Resources - Archive 'depthai-device-fwp-c0a7810c9c1e7678ae65035b8f23d4cac6beb568.tar.xz' open: 6ms, archive read: 956ms
[2021-07-26 16:52:15.435] [debug] Patching OpenVINO FW version from 2021.4 to 2021.2
[2021-07-26 16:52:16.938] [trace] RPC: [1,1,9503979954606424982,null]
[2021-07-26 16:52:16.941] [trace] RPC: [1,1,10182484315255513117,[0]]
[2021-07-26 16:52:16.943] [trace] RPC: [1,1,5804304869041345055,[1.0]]
[2021-07-26 16:52:16.945] [trace] RPC: [1,1,17425566508637143278,null]
[2021-07-26 16:52:16.945] [trace] Log vector decoded, size: 3
[14442C10411D4BD000] [143.330] [system] [info] Memory Usage - DDR: 0.12 / 358.60 MiB, CMX: 2.09 / 2.50 MiB, LeonOS Heap: 6.29 / 82.70 MiB, LeonRT Heap: 2.83 / 26.74 MiB
[14442C10411D4BD000] [143.330] [system] [info] Temperatures - Average: 28.86 °C, CSS: 29.83 °C, MSS 27.88 °C, UPA: 28.13 °C, DSS: 29.59 °C
[14442C10411D4BD000] [143.330] [system] [info] Cpu Usage - LeonOS 9.46%, LeonRT: 1.76%
[2021-07-26 16:52:17.003] [trace] RPC: [1,1,16527326580805871264,[{"connections":[{"node1Id":4,"node1Output":"bitstream","node2Id":5,"node2Input":"in"},{"node1Id":0,"node1Output":"still","node2Id":4,"node2Input":"in"},{"node1Id":1,"node1Output":"out","node2Id":3,"node2Input":"in"},{"node1Id":6,"node1Output":"out","node2Id":0,"node2Input":"inputControl"},{"node1Id":2,"node1Output":"out","node2Id":1,"node2Input":"in"}],"globalProperties":{"calibData":null,"cameraTuningBlobSize":null,"cameraTuningBlobUri":"","leonCssFrequencyHz":700000000.0,"leonMssFrequencyHz":700000000.0,"pipelineName":null,"pipelineVersion":null},"nodes":[[5,{"id":5,"ioInfo":{"in":{"blocking":true,"name":"in","queueSize":8,"type":3}},"name":"XLinkOut","properties":{"maxFpsLimit":-1.0,"metadataOnly":false,"streamName":"jpegEncoderXLinkOut"}}],[1,{"id":1,"ioInfo":{"in":{"blocking":false,"name":"in","queueSize":5,"type":3},"out":{"blocking":false,"name":"out","queueSize":8,"type":0},"passthrough":{"blocking":false,"name":"passthrough","queueSize":8,"type":0}},"name":"DetectionNetwork","properties":{"anchorMasks":{},"anchors":[],"blobSize":14505024,"blobUri":"asset:1/blob","classes":0,"confidenceThreshold":0.5,"coordinates":0,"iouThreshold":0.0,"nnFamily":1,"numFrames":8,"numNCEPerThread":0,"numThreads":2}}],[6,{"id":6,"ioInfo":{"out":{"blocking":false,"name":"out","queueSize":8,"type":0}},"name":"XLinkIn","properties":{"maxDataSize":5242880,"numFrames":8,"streamName":"controller"}}],[0,{"id":0,"ioInfo":{"inputConfig":{"blocking":false,"name":"inputConfig","queueSize":8,"type":3},"inputControl":{"blocking":true,"name":"inputControl","queueSize":8,"type":3},"isp":{"blocking":false,"name":"isp","queueSize":8,"type":0},"preview":{"blocking":false,"name":"preview","queueSize":8,"type":0},"raw":{"blocking":false,"name":"raw","queueSize":8,"type":0},"still":{"blocking":false,"name":"still","queueSize":8,"type":0},"video":{"blocking":false,"name":"video","queueSize":8,"type":0}},"name":"ColorCamera","properties":{"boardSocket":-1,"colorOrder":0,"fp16":false,"fps":30.0,"imageOrientation":-1,"initialControl":{"aeLockMode":false,"aeRegion":{"height":0,"priority":0,"width":0,"x":0,"y":0},"afRegion":{"height":0,"priority":0,"width":0,"x":0,"y":0},"antiBandingMode":0,"autoFocusMode":3,"awbLockMode":false,"awbMode":0,"brightness":0,"chromaDenoise":0,"cmdMask":0,"contrast":0,"effectMode":0,"expCompensation":0,"expManual":{"exposureTimeUs":0,"frameDurationUs":0,"sensitivityIso":0},"lensPosition":0,"lumaDenoise":0,"saturation":0,"sceneMode":0,"sharpness":0},"inputConfigSync":false,"interleaved":false,"ispScale":{"horizDenominator":0,"horizNumerator":0,"vertDenominator":0,"vertNumerator":0},"previewHeight":300,"previewKeepAspectRatio":false,"previewWidth":300,"resolution":2,"sensorCropX":-1.0,"sensorCropY":-1.0,"stillHeight":-1,"stillWidth":-1,"videoHeight":-1,"videoWidth":-1}}],[2,{"id":2,"ioInfo":{"out":{"blocking":false,"name":"out","queueSize":8,"type":0}},"name":"XLinkIn","properties":{"maxDataSize":5242880,"numFrames":8,"streamName":"frameIn"}}],[3,{"id":3,"ioInfo":{"in":{"blocking":true,"name":"in","queueSize":8,"type":3}},"name":"XLinkOut","properties":{"maxFpsLimit":-1.0,"metadataOnly":false,"streamName":"nnXLinkOut"}}],[4,{"id":4,"ioInfo":{"bitstream":{"blocking":false,"name":"bitstream","queueSize":8,"type":0},"in":{"blocking":true,"name":"in","queueSize":4,"type":3}},"name":"VideoEncoder","properties":{"bitrate":8000,"frameRate":1.0,"height":3040,"keyframeFrequency":30,"lossless":false,"maxBitrate":8000,"numBFrames":0,"numFramesPool"
:4,"profile":4,"quality":95,"rateCtrlMode":0,"width":4032}}]]}]]
[2021-07-26 16:52:17.019] [trace] RPC: [1,1,14157578424912043072,[{"map":{"1/blob":{"alignment":64,"offset":0,"size":14505024}}}]]
[2021-07-26 16:52:17.021] [trace] RPC: [1,1,13547933642676024645,[14505024]]
[2021-07-26 16:52:17.022] [trace] RPC: [1,1,7790432089545852493,["__stream_asset_storage",2308464256,14505024]]
[2021-07-26 16:52:17.195] [trace] RPC: [1,1,11967769899726792808,[2308464256,14505024]]
[2021-07-26 16:52:17.201] [trace] RPC: [1,1,10180360702496156555,null]
[2021-07-26 16:52:17.201] [trace] RPC: [1,1,14047900442330284907,null]
[2021-07-26 16:52:17.211] [trace] Log vector decoded, size: 1
[14442C10411D4BD000] [143.595] [system] [info] ImageManip internal buffer size '171136'B, shave buffer size '19456'B
[2021-07-26 16:52:19.483] [debug] Timesync thread exception caught: Couldn't read data from stream: '__timesync' (X_LINK_ERROR)
[2021-07-26 16:52:19.484] [trace] RPC: [1,1,9503979954606424982,null]
[2021-07-26 16:52:19.489] [debug] Device about to be closed...
[2021-07-26 16:52:19.491] [debug] Log thread exception caught: Couldn't read data from stream: '__log' (X_LINK_ERROR)
[2021-07-26 16:52:19.910] [debug] XLinkResetRemote of linkId: (0)
[2021-07-26 16:52:19.914] [debug] DataInputQueue (frameIn) closed
[2021-07-26 16:52:19.914] [debug] DataInputQueue (controller) closed
[2021-07-26 16:52:19.915] [debug] Device closed, 426
Traceback (most recent call last):
File "test.py", line 46, in <module>
with dai.Device(pipeline) as device:
RuntimeError: Couldn't read data from stream: '__rpc_main' (X_LINK_ERROR)
Should I be asking this in this follow-up message, or should I open a new post?
Because I'd also like to ask whether it's possible to synchronize a flash with the OAK.
I'm triggering the flash from a GPIO pin on the Raspberry Pi, and what I'm testing at the moment is whether I can get some kind of indication from depthai of when the exposure starts, so that I can trigger the flash at the right time.
I'm using the longest exposure time I can to try to catch the flash, but it's not that long: I can only go up to 33000 μs.
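What I'm experimenting with is roughly this (very much a sketch: the pin number, ISO value and sleep times are guesses, and it assumes the device from the gist is already open with a "controller" XLinkIn linked to the camera's inputControl):

import time
import depthai as dai
import RPi.GPIO as GPIO

FLASH_PIN = 18  # whichever GPIO pin drives the flash
GPIO.setmode(GPIO.BCM)
GPIO.setup(FLASH_PIN, GPIO.OUT)

# 'device' is the already-opened dai.Device from the gist
control_queue = device.getInputQueue("controller")

# Force the longest exposure I can get (33 ms) so the flash has a chance to land inside it
ctrl = dai.CameraControl()
ctrl.setManualExposure(33000, 800)  # 33000 us exposure, ISO 800 (guess)
control_queue.send(ctrl)

# Trigger a still capture, then fire the flash shortly after and hope it overlaps the exposure
capture = dai.CameraControl()
capture.setCaptureStill(True)
control_queue.send(capture)

time.sleep(0.03)  # trial-and-error delay, since I have no exposure-start signal to key off
GPIO.output(FLASH_PIN, GPIO.HIGH)
time.sleep(0.033)
GPIO.output(FLASH_PIN, GPIO.LOW)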
I'm sorry for such a huge message, and I really appreciate all your help.
Thank you very much once again.
Cheers!