Hi Jungduri,
If you enable trace debugging you can see the latency of each operation. Example log output:
[184430102127631200] [1.3] [3.884] [DetectionNetwork(1)] [trace] NeuralNetwork inference took '56.251972' ms.
[184430102127631200] [1.3] [3.887] [system] [trace] EV:0,S:0,IDS:10,IDD:13,TSS:3,TSN:887495335
[184430102127631200] [1.3] [3.887] [system] [trace] EV:0,S:1,IDS:10,IDD:13,TSS:3,TSN:887539441
[184430102127631200] [1.3] [3.884] [system] [trace] EV:1,S:0,IDS:9,IDD:0,TSS:3,TSN:884797408
[184430102127631200] [1.3] [3.884] [system] [trace] EV:1,S:1,IDS:9,IDD:0,TSS:3,TSN:884834724
[184430102127631200] [1.3] [3.886] [DetectionNetwork(1)] [trace] DetectionParser took '0.027416' ms.
[184430102127631200] [1.3] [3.887] [system] [trace] EV:1,S:1,IDS:13,IDD:0,TSS:3,TSN:887861777
[184430102127631200] [1.3] [3.889] [system] [trace] EV:0,S:0,IDS:10,IDD:13,TSS:3,TSN:889062871
[184430102127631200] [1.3] [3.889] [system] [trace] EV:0,S:1,IDS:10,IDD:13,TSS:3,TSN:889118768
[184430102127631200] [1.3] [3.891] [system] [trace] EV:1,S:1,IDS:14,IDD:0,TSS:3,TSN:891560231
[184430102127631200] [1.3] [3.886] [DetectionNetwork(1)] [trace] NeuralNetwork inference took '57.269817' ms.
[184430102127631200] [1.3] [3.886] [system] [trace] EV:1,S:0,IDS:9,IDD:0,TSS:3,TSN:886867631
[184430102127631200] [1.3] [3.886] [system] [trace] EV:1,S:1,IDS:9,IDD:0,TSS:3,TSN:886903565
[184430102127631200] [1.3] [3.894] [system] [trace] EV:0,S:0,IDS:5,IDD:12,TSS:3,TSN:894595561
[184430102127631200] [1.3] [3.894] [system] [trace] EV:0,S:1,IDS:5,IDD:12,TSS:3,TSN:894662758
[184430102127631200] [1.3] [3.894] [system] [trace] EV:1,S:1,IDS:12,IDD:0,TSS:3,TSN:894774412
[184430102127631200] [1.3] [3.894] [system] [trace] EV:1,S:0,IDS:13,IDD:0,TSS:3,TSN:894880010
[184430102127631200] [1.3] [3.888] [DetectionNetwork(1)] [trace] DetectionParser took '0.026914' ms.
So here we can see the inference itself takes about 57 ms, and parsing of the results (as I'm using MobileNetDetectionNetwork) takes about 27 µs. I hope this helps!
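If you want to collect those timings programmatically rather than read them by eye, a quick parse of the trace lines works; this is just a sketch, and the regex assumes the exact message format shown above:

```python
import re

# Matches the per-inference trace message, e.g.
# "... [DetectionNetwork(1)] [trace] NeuralNetwork inference took '56.251972' ms."
PATTERN = re.compile(r"NeuralNetwork inference took '([\d.]+)' ms")

def inference_times_ms(log_lines):
    """Extract NeuralNetwork inference durations (in ms) from trace log lines."""
    times = []
    for line in log_lines:
        match = PATTERN.search(line)
        if match:
            times.append(float(match.group(1)))
    return times

log = [
    "[184430102127631200] [1.3] [3.884] [DetectionNetwork(1)] [trace] NeuralNetwork inference took '56.251972' ms.",
    "[184430102127631200] [1.3] [3.886] [DetectionNetwork(1)] [trace] NeuralNetwork inference took '57.269817' ms.",
]
print(inference_times_ms(log))  # -> [56.251972, 57.269817]
```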
Thanks, Erik