Hello all,

I am new to building OAK-D pipelines. I am running the examples and creating my own pipelines to learn. My understanding is that a pipeline can only use 13 shaves when the ColorCamera is set to 1080p.

However, when I was looking over the code in depthai-experiments/gen2-face-recognition/main.py, it appears that the pipeline uses 18 shaves: 6 used by face_det_nn, 6 by headpose_nn, and 6 by face_rec_nn. Am I missing some trick the code uses so that more shaves can be used, or did I misunderstand the code?

Any help in understanding this is much appreciated.

    Hi FrancisTse ,
    There are 16 shave cores in total, see the docs on it here. Other nodes also consume shave cores (e.g. neural networks, ImageManips, etc.). Note that you can have multiple NNs (each consuming 6 shaves) running in parallel, as they will share the resources. Thoughts?
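    For example, a minimal sketch of compiling a model blob for a chosen number of shave cores with the blobconverter package (the model name below is just for illustration and assumes it is available in the blobconverter zoo):

    import blobconverter
    import depthai as dai

    # Compile (or fetch a cached) blob built for 6 shave cores
    blob_path = blobconverter.from_zoo(name="face-detection-retail-0004", shaves=6)

    pipeline = dai.Pipeline()
    face_det_nn = pipeline.create(dai.node.MobileNetDetectionNetwork)
    face_det_nn.setBlobPath(blob_path)
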
    Thanks, Erik

    Hello Erik, thank you for your explanation and link. There is so much to learn. I have some follow-up questions:

    1. Is it true that as long as the NNs consume the same number of shaves, they can share the same resources?
    2. Is there a limit to how many NNs can share the same resources, and is there performance degradation if I add more NNs?
    3. How many shaves do other nodes such as ImageManip consume?
    4. Is there a way I can estimate how many shaves have been consumed by a pipeline?

    Thanks, Francis.

      Hi FrancisTse ,

      1. If you run in debug mode, you will see at the start how resources are allocated (docs here); see the sketch below this list. For example, you can have a model compiled for 6 shave cores that consumes 12 in total, because it runs 2 inference threads in parallel.
      2. Yes, there's performance degradation. If a single model can run at 30 FPS, then two such models running in parallel will each run at about 14-15 FPS (there is some overhead).
      3. Docs above
      4. Docs above
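
      As a rough sketch of point 1, limiting the number of inference threads also limits the total shave consumption (the blob path below is hypothetical; setNumInferenceThreads() is the NeuralNetwork node setting that controls this):

      import depthai as dai

      pipeline = dai.Pipeline()
      nn = pipeline.create(dai.node.NeuralNetwork)
      nn.setBlobPath("model_6shave.blob")  # hypothetical blob compiled for 6 shaves
      # By default the node runs 2 inference threads, so a 6-shave blob consumes
      # 12 shaves in total; limiting it to 1 thread keeps it at 6 (at the cost of throughput)
      nn.setNumInferenceThreads(1)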

      I hope this helps!
      Thanks, Erik

        Hello erik ,
        Following the information in the docs on Debugging DepthAI pipeline, I added the # Set debugging level block of code to the main.py program, as shown here:

        with dai.Device(pipeline) as device:
        
            # Set debugging level
            device.setLogLevel(dai.LogLevel.DEBUG)
            device.setLogOutputLevel(dai.LogLevel.DEBUG)
        
            facerec = FaceRecognition(databases, args.name)
            sync = TwoStageHostSeqSync()
            text = TextHelper()

        With that, I got some info about Memory Usage, etc., but there was no info about shave usage:

        francis@raspberrypi:~/Desktop/learningOAK-D-Lite/gen2-face-recognition $ /bin/python /home/francis/Desktop/learningOAK-D-Lite/gen2-face-recognition/main.py
        Creating pipeline...
        Creating Color Camera...
        Creating Face Detection Neural Network...
        Creating Head pose estimation NN
        Creating face recognition ImageManip/NN
        [18443010F1AECE1200] [1.1.1] [2.157] [system] [info] Memory Usage - DDR: 132.59 / 340.42 MiB, CMX: 2.49 / 2.50 MiB, LeonOS Heap: 27.42 / 77.32 MiB, LeonRT Heap: 8.18 / 41.23 MiB
        [18443010F1AECE1200] [1.1.1] [2.157] [system] [info] Temperatures - Average: 37.36 °C, CSS: 38.18 °C, MSS 37.71 °C, UPA: 36.53 °C, DSS: 37.01 °C
        [18443010F1AECE1200] [1.1.1] [2.157] [system] [info] Cpu Usage - LeonOS 58.59%, LeonRT: 31.94%
        [18443010F1AECE1200] [1.1.1] [3.158] [system] [info] Memory Usage - DDR: 132.59 / 340.42 MiB, CMX: 2.49 / 2.50 MiB, LeonOS Heap: 27.49 / 77.32 MiB, LeonRT Heap: 8.22 / 41.23 MiB
        [18443010F1AECE1200] [1.1.1] [3.158] [system] [info] Temperatures - Average: 38.18 °C, CSS: 38.65 °C, MSS 37.71 °C, UPA: 38.18 °C, DSS: 38.18 °C
        [18443010F1AECE1200] [1.1.1] [3.158] [system] [info] Cpu Usage - LeonOS 44.81%, LeonRT: 35.28%
        [18443010F1AECE1200] [1.1.1] [4.159] [system] [info] Memory Usage - DDR: 132.59 / 340.42 MiB, CMX: 2.49 / 2.50 MiB, LeonOS Heap: 27.49 / 77.32 MiB, LeonRT Heap: 8.22 / 41.23 MiB
        [18443010F1AECE1200] [1.1.1] [4.159] [system] [info] Temperatures - Average: 38.65 °C, CSS: 38.65 °C, MSS 39.11 °C, UPA: 38.41 °C, DSS: 38.41 °C
        [18443010F1AECE1200] [1.1.1] [4.159] [system] [info] Cpu Usage - LeonOS 32.53%, LeonRT: 26.04%
        [18443010F1AECE1200] [1.1.1] [5.160] [system] [info] Memory Usage - DDR: 132.59 / 340.42 MiB, CMX: 2.49 / 2.50 MiB, LeonOS Heap: 27.50 / 77.32 MiB, LeonRT Heap: 8.22 / 41.23 MiB
        [18443010F1AECE1200] [1.1.1] [5.160] [system] [info] Temperatures - Average: 38.76 °C, CSS: 39.58 °C, MSS 38.88 °C, UPA: 38.18 °C, DSS: 38.41 °C
        [18443010F1AECE1200] [1.1.1] [5.160] [system] [info] Cpu Usage - LeonOS 27.66%, LeonRT: 23.91%
        [18443010F1AECE1200] [1.1.1] [6.161] [system] [info] Memory Usage - DDR: 132.59 / 340.42 MiB, CMX: 2.49 / 2.50 MiB, LeonOS Heap: 27.50 / 77.32 MiB, LeonRT Heap: 8.22 / 41.23 MiB
        [18443010F1AECE1200] [1.1.1] [6.161] [system] [info] Temperatures - Average: 39.29 °C, CSS: 39.58 °C, MSS 39.11 °C, UPA: 39.35 °C, DSS: 39.11 °C

        Is there something else I have to do?

        Thanks,
        Francis.

          Hi FrancisTse ,
          Could you try running the script with the DEPTHAI_LEVEL environment variable set:
          DEPTHAI_LEVEL=debug /bin/python /home/francis/Desktop/learningOAK-D-Lite/gen2-face-recognition/main.py
          The resource allocation info is printed while the pipeline is starting, so it can be missed if the log level is only raised afterwards with setLogLevel(); the environment variable applies the debug level from the very start.
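          Alternatively, setting it from within the script should also work, as long as it happens before depthai is imported; a minimal sketch:

          import os
          os.environ["DEPTHAI_LEVEL"] = "debug"  # must be set before importing depthai

          import depthai as dai
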
          Thanks, Erik

            Hello erik ,
            I tried that and got a lot more info in the terminal output:

            francis@raspberrypi:~/Desktop/learningOAK-D-Lite/gen2-face-recognition $ DEPTHAI_LEVEL=debug /bin/python /home/francis/Desktop/learningOAK-D-Lite/gen2-face-recognition/main.py 
            [2023-02-04 15:49:52.694] [debug] Python bindings - version: 2.20.2.0 from  build: 2023-02-01 00:22:21 +0000
            [2023-02-04 15:49:52.694] [debug] Library information - version: 2.20.2, commit:  from , build: 2023-01-31 23:22:01 +0000
            [2023-02-04 15:49:52.698] [debug] Initialize - finished
            Creating pipeline...
            Creating Color Camera...
            Creating Face Detection Neural Network...
            Creating Head pose estimation NN
            Creating face recognition ImageManip/NN
            [2023-02-04 15:49:52.977] [debug] Resources - Archive 'depthai-bootloader-fwp-0.0.24.tar.xz' open: 4ms, archive read: 277ms
            [2023-02-04 15:49:53.169] [debug] Device - OpenVINO version: 2021.4
            [2023-02-04 15:49:53.170] [debug] Device - BoardConfig: {"camera":[],"emmc":null,"gpio":[],"logDevicePrints":null,"logPath":null,"logSizeMax":null,"logVerbosity":null,"network":{"mtu":0,"xlinkTcpNoDelay":true},"nonExclusiveMode":false,"pcieInternalClock":null,"sysctl":[],"uart":[],"usb":{"flashBootedPid":63037,"flashBootedVid":999,"maxSpeed":4,"pid":63035,"vid":999},"usb3PhyInternalClock":null,"watchdogInitialDelayMs":null,"watchdogTimeoutMs":null} 
            libnop:
            0000: b9 10 b9 05 81 e7 03 81 3b f6 81 e7 03 81 3d f6 04 b9 02 00 01 ba 00 be be bb 00 bb 00 be be be
            0020: be be be be 00 bb 00
            [2023-02-04 15:49:53.883] [debug] Resources - Archive 'depthai-device-fwp-8c3d6ac1c77b0bf7f9ea6fd4d962af37663d2fbd.tar.xz' open: 4ms, archive read: 1183ms
            [2023-02-04 15:49:54.747] [debug] Searching for booted device: DeviceInfo(name=1.1.1, mxid=18443010F1AECE1200, X_LINK_BOOTED, X_LINK_USB_VSC, X_LINK_MYRIAD_X, X_LINK_SUCCESS), name used as hint only
            [18443010F1AECE1200] [1.1.1] [1.190] [system] [info] Memory Usage - DDR: 0.12 / 340.42 MiB, CMX: 2.05 / 2.50 MiB, LeonOS Heap: 7.24 / 77.32 MiB, LeonRT Heap: 2.89 / 41.23 MiB
            [18443010F1AECE1200] [1.1.1] [1.190] [system] [info] Temperatures - Average: 34.28 °C, CSS: 34.64 °C, MSS 34.64 °C, UPA: 33.45 °C, DSS: 34.40 °C
            [18443010F1AECE1200] [1.1.1] [1.190] [system] [info] Cpu Usage - LeonOS 27.49%, LeonRT: 1.48%
            [2023-02-04 15:49:56.024] [debug] Schema dump: {"connections":[{"node1Id":10,"node1Output":"out","node1OutputGroup":"","node2Id":11,"node2Input":"in","node2InputGroup":""},{"node1Id":9,"node1Output":"out","node1OutputGroup":"","node2Id":10,"node2Input":"in","node2InputGroup":""},{"node1Id":6,"node1Output":"manip2_img","node1OutputGroup":"io","node2Id":9,"node2Input":"inputImage","node2InputGroup":""},{"node1Id":6,"node1Output":"manip2_cfg","node1OutputGroup":"io","node2Id":9,"node2Input":"inputConfig","node2InputGroup":""},{"node1Id":7,"node1Output":"out","node1OutputGroup":"","node2Id":8,"node2Input":"in","node2InputGroup":""},{"node1Id":6,"node1Output":"manip_img","node1OutputGroup":"io","node2Id":7,"node2Input":"inputImage","node2InputGroup":""},{"node1Id":6,"node1Output":"manip_cfg","node1OutputGroup":"io","node2Id":7,"node2Input":"inputConfig","node2InputGroup":""},{"node1Id":8,"node1Output":"passthrough","node1OutputGroup":"","node2Id":6,"node2Input":"headpose_pass","node2InputGroup":"io"},{"node1Id":8,"node1Output":"out","node1OutputGroup":"","node2Id":6,"node2Input":"headpose_in","node2InputGroup":"io"},{"node1Id":2,"node1Output":"out","node1OutputGroup":"","node2Id":6,"node2Input":"preview","node2InputGroup":"io"},{"node1Id":4,"node1Output":"passthrough","node1OutputGroup":"","node2Id":6,"node2Input":"face_pass","node2InputGroup":"io"},{"node1Id":4,"node1Output":"out","node1OutputGroup":"","node2Id":6,"node2Input":"face_det_in","node2InputGroup":"io"},{"node1Id":4,"node1Output":"out","node1OutputGroup":"","node2Id":5,"node2Input":"in","node2InputGroup":""},{"node1Id":3,"node1Output":"out","node1OutputGroup":"","node2Id":4,"node2Input":"in","node2InputGroup":""},{"node1Id":2,"node1Output":"out","node1OutputGroup":"","node2Id":3,"node2Input":"inputImage","node2InputGroup":""},{"node1Id":0,"node1Output":"preview","node1OutputGroup":"","node2Id":2,"node2Input":"inputImage","node2InputGroup":""},{"node1Id":0,"node1Output":"video","node1OutputGroup":"","node2Id":1,"node2Input":"in","node2InputGroup":""}],"globalProperties":{"calibData":null,"cameraTuningBlobSize":null,"cameraTuningBlobUri":"","leonCssFrequencyHz":700000000.0,"leonMssFrequencyHz":700000000.0,"pipelineName":null,"pipelineVersion":null,"xlinkChunkSize":-1},"nodes":[[0,{"id":0,"ioInfo":[[["","video"],{"blocking":false,"group":"","id":41,"name":"video","queueSize":8,"type":0,"waitForMessage":false}],[["","still"],{"blocking":false,"group":"","id":39,"name":"still","queueSize":8,"type":0,"waitForMessage":false}],[["","isp"],{"blocking":false,"group":"","id":38,"name":"isp","queueSize":8,"type":0,"waitForMessage":false}],[["","preview"],{"blocking":false,"group":"","id":40,"name":"preview","queueSize":8,"type":0,"waitForMessage":false}],[["","raw"],{"blocking":false,"group":"","id":37,"name":"raw","queueSize":8,"type":0,"waitForMessage":false}],[["","frameEvent"],{"blocking":false,"group":"","id":36,"name":"frameEvent","queueSize":8,"type":0,"waitForMessage":false}],[["","inputConfig"],{"blocking":false,"group":"","id":35,"name":"inputConfig","queueSize":8,"type":3,"waitForMessage":false}],[["","inputControl"],{"blocking":true,"group":"","id":34,"name":"inputControl","queueSize":8,"type":3,"waitForMessage":false}]],"name":"ColorCamera","properties":[185,24,185,27,0,3,0,0,0,185,3,0,0,0,185,5,0,0,0,0,0,185,5,0,0,0,0,0,0,0,0,0,0,0,0,185,3,0,0,0,185,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,189,0,255,0,0,0,129,48,4,129,48,4,133,48,4,133,48,4,255,255,0,136,0,0,240,65,136,0,0,128,191,136,0,0,128,191,1,185,4,0,0,0,0,3,3,4,4,4]}],[1,{"
id":1,"ioInfo":[[["","in"],{"blocking":true,"group":"","id":33,"name":"in","queueSize":8,"type":3,"waitForMessage":true}]],"name":"XLinkOut","properties":[185,3,136,0,0,128,191,189,5,99,111,108,111,114,0]}],[2,{"id":2,"ioInfo":[[["","out"],{"blocking":false,"group":"","id":32,"name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","inputConfig"],{"blocking":true,"group":"","id":31,"name":"inputConfig","queueSize":8,"type":3,"waitForMessage":false}],[["","inputImage"],{"blocking":true,"group":"","id":30,"name":"inputImage","queueSize":8,"type":3,"waitForMessage":true}]],"name":"ImageManip","properties":[185,6,185,8,185,7,185,4,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,185,3,185,2,136,0,0,0,0,136,0,0,0,0,185,2,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,0,136,0,0,128,63,136,0,0,128,63,0,1,185,15,0,0,0,0,0,0,186,0,1,0,186,0,0,0,136,0,0,0,0,0,1,185,6,32,0,0,0,0,133,255,0,0,0,0,0,0,134,0,155,52,0,20,0,0,189,0]}],[3,{"id":3,"ioInfo":[[["","out"],{"blocking":false,"group":"","id":29,"name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","inputConfig"],{"blocking":true,"group":"","id":28,"name":"inputConfig","queueSize":8,"type":3,"waitForMessage":false}],[["","inputImage"],{"blocking":true,"group":"","id":27,"name":"inputImage","queueSize":8,"type":3,"waitForMessage":true}]],"name":"ImageManip","properties":[185,6,185,8,185,7,185,4,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,185,3,185,2,136,0,0,0,0,136,0,0,0,0,185,2,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,0,136,0,0,128,63,136,0,0,128,63,0,1,185,15,133,44,1,133,44,1,0,0,0,0,186,0,1,0,186,0,0,0,136,0,0,0,0,0,1,185,6,32,0,0,0,0,133,255,0,0,1,0,0,0,134,0,0,16,0,4,0,0,189,0]}],[4,{"id":4,"ioInfo":[[["","out"],{"blocking":false,"group":"","id":26,"name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","passthrough"],{"blocking":false,"group":"","id":25,"name":"passthrough","queueSize":8,"type":0,"waitForMessage":false}],[["","in"],{"blocking":true,"group":"","id":24,"name":"in","queueSize":5,"type":3,"waitForMessage":true}]],"name":"DetectionNetwork","properties":[185,6,130,0,106,20,0,189,12,97,115,115,101,116,58,95,95,98,108,111,98,8,0,0,185,7,1,136,0,0,0,63,0,0,186,0,187,0,136,0,0,0,0]}],[5,{"id":5,"ioInfo":[[["","in"],{"blocking":true,"group":"","id":23,"name":"in","queueSize":8,"type":3,"waitForMessage":true}]],"name":"XLinkOut","properties":[185,3,136,0,0,128,191,189,9,100,101,116,101,99,116,105,111,110,0]}],[6,{"id":6,"ioInfo":[[["io","manip_img"],{"blocking":false,"group":"io","id":21,"name":"manip_img","queueSize":8,"type":0,"waitForMessage":false}],[["io","manip2_cfg"],{"blocking":false,"group":"io","id":20,"name":"manip2_cfg","queueSize":8,"type":0,"waitForMessage":false}],[["io","face_det_in"],{"blocking":true,"group":"io","id":18,"name":"face_det_in","queueSize":8,"type":3,"waitForMessage":false}],[["io","manip2_img"],{"blocking":false,"group":"io","id":19,"name":"manip2_img","queueSize":8,"type":0,"waitForMessage":false}],[["io","face_pass"],{"blocking":true,"group":"io","id":17,"name":"face_pass","queueSize":8,"type":3,"waitForMessage":false}],[["io","manip_cfg"],{"blocking":false,"group":"io","id":22,"name":"manip_cfg","queueSize":8,"type":0,"waitForMessage":false}],[["io","preview"],{"blocking":true,"group":"io","id":16,"name":"preview","queueSize":8,"type":3,"waitForMessage":false}],[["io","headpose_in"],{"blocking":true,"group":"io","id":15,"name":"headpose_in","queueSize":8,"type":3,"waitForMessage":false}],[["io","headpose_pass"],{"blocking":true,"group":"io","id":14,"name":"headpose_pass","queueSize":8,"
type":3,"waitForMessage":false}]],"name":"Script","properties":[185,3,189,14,97,115,115,101,116,58,95,95,115,99,114,105,112,116,189,8,60,115,99,114,105,112,116,62,0]}],[7,{"id":7,"ioInfo":[[["","out"],{"blocking":false,"group":"","id":13,"name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","inputConfig"],{"blocking":true,"group":"","id":12,"name":"inputConfig","queueSize":8,"type":3,"waitForMessage":true}],[["","inputImage"],{"blocking":true,"group":"","id":11,"name":"inputImage","queueSize":8,"type":3,"waitForMessage":true}]],"name":"ImageManip","properties":[185,6,185,8,185,7,185,4,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,185,3,185,2,136,0,0,0,0,136,0,0,0,0,185,2,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,0,136,0,0,128,63,136,0,0,128,63,0,1,185,15,60,60,0,0,0,0,186,0,1,0,186,0,0,0,136,0,0,0,0,0,1,185,6,32,0,0,0,0,133,255,0,0,1,0,0,0,134,0,0,16,0,4,0,0,189,0]}],[8,{"id":8,"ioInfo":[[["","out"],{"blocking":false,"group":"","id":10,"name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","passthrough"],{"blocking":false,"group":"","id":9,"name":"passthrough","queueSize":8,"type":0,"waitForMessage":false}],[["","in"],{"blocking":true,"group":"","id":8,"name":"in","queueSize":5,"type":3,"waitForMessage":true}]],"name":"NeuralNetwork","properties":[185,5,130,64,151,58,0,189,12,97,115,115,101,116,58,95,95,98,108,111,98,8,0,0]}],[9,{"id":9,"ioInfo":[[["","out"],{"blocking":false,"group":"","id":7,"name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","inputConfig"],{"blocking":true,"group":"","id":6,"name":"inputConfig","queueSize":8,"type":3,"waitForMessage":true}],[["","inputImage"],{"blocking":true,"group":"","id":5,"name":"inputImage","queueSize":8,"type":3,"waitForMessage":true}]],"name":"ImageManip","properties":[185,6,185,8,185,7,185,4,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,185,3,185,2,136,0,0,0,0,136,0,0,0,0,185,2,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,0,136,0,0,128,63,136,0,0,128,63,0,1,185,15,112,112,0,0,0,0,186,0,1,0,186,0,0,0,136,0,0,0,0,0,1,185,6,32,0,0,0,0,133,255,0,0,1,0,0,0,134,0,0,16,0,4,0,0,189,0]}],[10,{"id":10,"ioInfo":[[["","out"],{"blocking":false,"group":"","id":4,"name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","passthrough"],{"blocking":false,"group":"","id":3,"name":"passthrough","queueSize":8,"type":0,"waitForMessage":false}],[["","in"],{"blocking":true,"group":"","id":2,"name":"in","queueSize":5,"type":3,"waitForMessage":true}]],"name":"NeuralNetwork","properties":[185,5,130,0,49,73,0,189,12,97,115,115,101,116,58,95,95,98,108,111,98,8,0,0]}],[11,{"id":11,"ioInfo":[[["","in"],{"blocking":true,"group":"","id":1,"name":"in","queueSize":8,"type":3,"waitForMessage":true}]],"name":"XLinkOut","properties":[185,3,136,0,0,128,191,189,11,114,101,99,111,103,110,105,116,105,111,110,0]}]]}
            [2023-02-04 15:49:56.024] [debug] Asset map dump: {"map":{"/node/10/__blob":{"alignment":64,"offset":0,"size":4796672},"/node/4/__blob":{"alignment":64,"offset":8640256,"size":1337856},"/node/6/__script":{"alignment":64,"offset":8636480,"size":3776},"/node/8/__blob":{"alignment":64,"offset":4796672,"size":3839808}}}
            [18443010F1AECE1200] [1.1.1] [1.689] [system] [info] SIPP (Signal Image Processing Pipeline) internal buffer size '16384'B
            [18443010F1AECE1200] [1.1.1] [1.724] [system] [info] ImageManip internal buffer size '393408'B, shave buffer size '35840'B
            [18443010F1AECE1200] [1.1.1] [1.724] [system] [info] NeuralNetwork allocated resources: shaves: [0-12] cmx slices: [0-12] 
            ColorCamera allocated resources: no shaves; cmx slices: [13-15] 
            ImageManip allocated resources: shaves: [15-15] no cmx slices. 
            
            [18443010F1AECE1200] [1.1.1] [1.948] [NeuralNetwork(10)] [info] Needed resources: shaves: 6, ddr: 2007040 
            [18443010F1AECE1200] [1.1.1] [1.951] [DetectionNetwork(4)] [info] Needed resources: shaves: 6, ddr: 2728832 
            [18443010F1AECE1200] [1.1.1] [1.952] [NeuralNetwork(8)] [info] Needed resources: shaves: 6, ddr: 21632 
            [18443010F1AECE1200] [1.1.1] [1.970] [NeuralNetwork(10)] [info] Inference thread count: 2, number of shaves allocated per thread: 6, number of Neural Compute Engines (NCE) allocated per thread: 1
            [18443010F1AECE1200] [1.1.1] [1.970] [DetectionNetwork(4)] [info] Inference thread count: 2, number of shaves allocated per thread: 6, number of Neural Compute Engines (NCE) allocated per thread: 1
            [18443010F1AECE1200] [1.1.1] [1.970] [NeuralNetwork(8)] [info] Inference thread count: 2, number of shaves allocated per thread: 6, number of Neural Compute Engines (NCE) allocated per thread: 1
            [18443010F1AECE1200] [1.1.1] [2.192] [system] [info] Memory Usage - DDR: 132.59 / 340.42 MiB, CMX: 2.49 / 2.50 MiB, LeonOS Heap: 26.99 / 77.32 MiB, LeonRT Heap: 8.17 / 41.23 MiB
            [18443010F1AECE1200] [1.1.1] [2.192] [system] [info] Temperatures - Average: 36.00 °C, CSS: 36.77 °C, MSS 35.59 °C, UPA: 35.83 °C, DSS: 35.83 °C
            [18443010F1AECE1200] [1.1.1] [2.192] [system] [info] Cpu Usage - LeonOS 53.33%, LeonRT: 30.49%
            [18443010F1AECE1200] [1.1.1] [3.194] [system] [info] Memory Usage - DDR: 132.59 / 340.42 MiB, CMX: 2.49 / 2.50 MiB, LeonOS Heap: 27.48 / 77.32 MiB, LeonRT Heap: 8.22 / 41.23 MiB
            [18443010F1AECE1200] [1.1.1] [3.194] [system] [info] Temperatures - Average: 36.71 °C, CSS: 37.24 °C, MSS 36.30 °C, UPA: 36.77 °C, DSS: 36.53 °C
            [18443010F1AECE1200] [1.1.1] [3.194] [system] [info] Cpu Usage - LeonOS 45.36%, LeonRT: 29.64%
            [18443010F1AECE1200] [1.1.1] [4.195] [system] [info] Memory Usage - DDR: 132.59 / 340.42 MiB, CMX: 2.49 / 2.50 MiB, LeonOS Heap: 27.49 / 77.32 MiB, LeonRT Heap: 8.22 / 41.23 MiB
            [18443010F1AECE1200] [1.1.1] [4.195] [system] [info] Temperatures - Average: 37.30 °C, CSS: 38.41 °C, MSS 36.77 °C, UPA: 37.01 °C, DSS: 37.01 °C
            [18443010F1AECE1200] [1.1.1] [4.195] [system] [info] Cpu Usage - LeonOS 37.73%, LeonRT: 29.79%
            [2023-02-04 15:49:59.865] [debug] Device about to be closed...
            [2023-02-04 15:49:59.884] [debug] Timesync thread exception caught: Couldn't read data from stream: '__timesync' (X_LINK_ERROR)
            [2023-02-04 15:49:59.884] [debug] DataOutputQueue (detection) closed
            [2023-02-04 15:49:59.884] [debug] Log thread exception caught: Couldn't read data from stream: '__log' (X_LINK_ERROR)
            [2023-02-04 15:49:59.884] [debug] DataOutputQueue (recognition) closed
            [2023-02-04 15:49:59.884] [debug] DataOutputQueue (color) closed
            [2023-02-04 15:50:00.579] [debug] Watchdog thread exception caught: Couldn't write data to stream: '__watchdog' (X_LINK_ERROR)
            [2023-02-04 15:50:01.954] [debug] XLinkResetRemote of linkId: (0)
            [2023-02-04 15:50:01.954] [debug] Device closed, 2089

            I hit the "q" key at the end. That is probably when the Device closed. I will have to look over the output to study what it says.

            Thanks,
            Francis.

              Hi FrancisTse , you did get this info:

              [18443010F1AECE1200] [1.1.1] [1.724] [system] [info] NeuralNetwork allocated resources: shaves: [0-12] cmx slices: [0-12] 
              ColorCamera allocated resources: no shaves; cmx slices: [13-15] 
              ImageManip allocated resources: shaves: [15-15] no cmx slices. 

              Thanks, Erik

                Hello erik ,
                Looks like the system dynamically shares the allocated 13 shaves [0-12] among the three NNs and the one shave [15-15] among the four ImageManips. Very nice!
                Thanks again for all your help and links.
                Best regards,
                Francis.