Hey Erik,
I am having a similar problem, and ran your script with the following output:
Inputs
Name: images, Type: DataType.U8F, Shape: [448, 224, 3, 1] (StorageOrder.NCHW)
Outputs
Name: output1_yolov7, Type: DataType.FP16, Shape: [56, 28, 18, 1] (StorageOrder.NCHW)
Name: output2_yolov7, Type: DataType.FP16, Shape: [28, 14, 18, 1] (StorageOrder.NCHW)
Name: output3_yolov7, Type: DataType.FP16, Shape: [14, 7, 18, 1] (StorageOrder.NCHW)
The model seems to expect a 448x224 image with 3 channels (RGB), but when I pass an image that I believe is the right size using this script, I get the following error:
[14442C10314230D700] [10.53.168.72] [10.584] [NeuralNetwork(2)] [error] Input tensor 'images' (0) exceeds available data range. Data size (150528B), tensor offset (0), size (301056B) - skipping inference.
Any help would be appreciated 🙂
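For what it's worth, the byte counts in the error message seem to line up exactly with a 224x224 frame being sent instead of a 448x224 one. A quick sanity check (plain arithmetic, assuming 1 byte per pixel per channel for U8 input):

```python
# Expected input size from the blob: 448 x 224, 3 channels, 1 byte each (U8)
expected = 448 * 224 * 3
print(expected)  # 301056 -- matches "size (301056B)" in the error

# The data actually delivered was 150528 bytes, which is exactly 224 x 224 x 3
sent = 224 * 224 * 3
print(sent)  # 150528 -- matches "Data size (150528B)"
```

So the frame reaching the NN appears to be 224x224 rather than 448x224, which suggests the resize step is not producing the dimensions I think it is.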
Edit: Using the same code but resizing to 640x640 in line 18 instead, I get the following warning:
[14442C10314230D700] [10.53.168.72] [11.671] [NeuralNetwork(2)] [warning] Input image (640x640) does not match NN (448x224)
Interestingly, the error is gone in that case. Let me know your thoughts.