I am having an issue converting an ONNX model to .blob. I was trying to achieve this using the online blobconverter. The conversion fails with "Accessing out-of-range dimension in Dimension[]".
Conversion error
Error message

Command failed with exit code 1, command: /opt/intel/openvino2022_1/tools/compile_tool/compile_tool -m /tmp/blobconverter/3344b3d21bfd4e25b88b8dcc1f7c4b3f/model-2022-08-11-nate-resnet50-simplified/FP16/model-2022-08-11-nate-resnet50-simplified.xml -o /tmp/blobconverter/3344b3d21bfd4e25b88b8dcc1f7c4b3f/model-2022-08-11-nate-resnet50-simplified/FP16/model-2022-08-11-nate-resnet50-simplified.blob -c /tmp/blobconverter/3344b3d21bfd4e25b88b8dcc1f7c4b3f/myriad_compile_config.txt -d MYRIAD -ip U8

Console output (stdout)

OpenVINO Runtime version ......... 2022.1.0
Build ........... 2022.1.0-7019-cdb9bec7210-releases/2022/1
Network inputs:
input : u8 / [...]
Network outputs:
boxes/sink_port_0 : f16 / [...]
scores/sink_port_0 : f16 / [...]
labels/sink_port_0 : i64 / [...]

Error output (stderr)

Accessing out-of-range dimension in Dimension[]

@wendbb
Looks like a problem in the model architecture. What did you use to create the model?

Thanks,
Jaka

@jakaskerl
The model was created and trained in PyTorch. It is a detection model (retinanet_resnet50_fpn) that was then converted to ONNX.

    wendbb

    Can you try exporting it with a batch dimension and specifying the input shape to the model converter?

    There seem to be some issues with the dimensions. Either the compiler can't recognize them properly, or there is an issue with the architecture.
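For illustration, a hedged sketch of such a Model Optimizer call (the model path and input shape are placeholders, assuming the OpenVINO 2022.x `mo` entry point):

```shell
# Placeholder path and shape, adjust to your model.
# --input_shape pins a static batch dimension so the compiler
# does not encounter dynamic dimensions.
mo --input_model model.onnx \
   --input_shape "[1,3,800,800]" \
   --data_type FP16
```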

    7 days later

    @Matja @jakaskerl

    I was able to convert my ONNX model to OpenVINO (.xml & .bin), but I am still unable to convert the OpenVINO model to .blob with the online blobconverter. I get the error:
    `Conversion error
    Error message

    Command failed with exit code 1, command: /opt/intel/openvino2022_1/tools/compile_tool/compile_tool -m /tmp/blobconverter/eee906938ab54ad8b958fa86c51bbd16/model_final_epoch_39-step_145680/FP16/model_final_epoch_39-step_145680.xml -o /tmp/blobconverter/eee906938ab54ad8b958fa86c51bbd16/model_final_epoch_39-step_145680/FP16/model_final_epoch_39-step_145680.blob -c /tmp/blobconverter/eee906938ab54ad8b958fa86c51bbd16/myriad_compile_config.txt -d MYRIAD -ip FP16

    Console output (stdout)

    OpenVINO Runtime version ......... 2022.1.0
    Build ........... 2022.1.0-7019-cdb9bec7210-releases/2022/1

    Error output (stderr)

    Cannot create NonMaxSuppression layer NonMaxSuppression_2382 id:25 from unsupported opset: opset9
    `
    Do you have any suggestions on how to handle this?

      wendbb

      Yes, non-maximum suppression (NMS) would have to be handled on the host. You can inspect the ONNX in netron.app, find the layer before the NMS, and pass it as an argument in the Model Optimizer parameters as --output layer_name. You can then use the OpenCV NMS function on the host. Alternatively, you can find a more intuitive explanation of NMS here. Would this work for you?

      For Yolo models, we perform NMS on the device itself, but we don't expose it as a node yet.
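A minimal pure-Python sketch of greedy NMS for the host side (the [x1, y1, x2, y2] box format and the threshold are assumptions; OpenCV's cv2.dnn.NMSBoxes provides the same functionality):

```python
def iou(a, b):
    # Intersection-over-union of two [x1, y1, x2, y2] boxes.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(boxes, scores, iou_thresh=0.5):
    # Greedy NMS: keep the highest-scoring box, drop boxes that
    # overlap an already-kept box above the IoU threshold.
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_thresh for j in keep):
            keep.append(i)
    return keep
```

The returned list holds indices into the original boxes/scores, so the filtered detections can be gathered on the host after inference.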

      a month later

        I am having the same problem as wendbb: I have built a PyTorch Lightning custom object detector on the basis of ssdlite320_mobilenet_v3_large and converted the model to ONNX. The Luxonis blob converter throws the same error: "Accessing out-of-range dimension in Dimension[]". It has to be pointed out that the Model Optimizer did not throw any error; the error occurs during the blob conversion.

        I also ran a local Model Optimizer (openvino-dev 2022.3) on the ONNX file. Blob conversion then gave the same error as wendbb's second one: NonMaxSuppression from unsupported opset9. This is due to the version of the Model Optimizer used (with Python 3.10 one can only install openvino-dev 2022.3 and higher), and that version includes opset9. I edited the .xml file and replaced opset9 with opset5, yielding the same error as before: out-of-range dimension. Any suggestion to solve this issue is more than welcome.

        Andrevancalster

        Can you please share the .xml file or the ONNX? As mentioned, you would have to cut away the NMS node and perform NMS on the host.

        An alternative option is to try downgrading the openvino-dev version (or using an older version in blobconverter).

        8 days later

        Some extra info: I downloaded resnet34-ssd1200.onnx from the open_model_zoo and converted the file to .xml using openvino-dev 2022.3. To avoid the unsupported NMS opset9, I replaced opset9 with opset8 in the .xml file, and it converted flawlessly to the corresponding blob file. So I guess that the blob conversion error "out-of-range dimension in Dimension[]" for my "ssdlite320_mobilenet_v3_large.onnx" or "ssdlite320_mobilenet_v3_large.xml" is related to another layer/node.

          Andrevancalster

          Thanks, this makes more sense to me now. The generated ONNX is quite complex; you would have to use the ONNX simplifier to simplify it.

          Furthermore, MobileNetV3 uses the HardSigmoid operation, which is not supported out of the box. You can replace it with lower-level operations that OpenVINO won't merge back into HardSigmoid, which will thus allow the export. Below is a PyTorch implementation; you would have to replace all HardSigmoid use cases in the architecture before exporting to ONNX.

          import torch
          import torch.nn.functional as F

          # Decomposed hard sigmoid: relu6(x + 3) / 6, built from
          # primitives that OpenVINO will not fuse back into HardSigmoid.
          def hard_sigmoid(x, inplace=False):
              return F.relu6(x + 3, inplace) / 6

          class HS(torch.nn.Module):
              def forward(self, x):
                  return hard_sigmoid(x)
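To apply this across the whole network, a hedged sketch of a recursive module swap before ONNX export (the helper name replace_hardsigmoid is mine, not part of torchvision):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HS(nn.Module):
    # Decomposed hard sigmoid (relu6(x + 3) / 6) that OpenVINO
    # will not fuse back into a HardSigmoid node.
    def forward(self, x):
        return F.relu6(x + 3) / 6

def replace_hardsigmoid(module: nn.Module) -> None:
    # Recursively swap every nn.Hardsigmoid child for the decomposed HS.
    for name, child in module.named_children():
        if isinstance(child, nn.Hardsigmoid):
            setattr(module, name, HS())
        else:
            replace_hardsigmoid(child)
```

Call replace_hardsigmoid(model) once on the loaded model, then run torch.onnx.export as usual.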

          Then, the head contains a GatherND operation and an "if" statement, which you would have to prune away. You would likely have to specify the output flag --output concat_name,softmaxname in the Model Optimizer parameters field in blobconverter. Post-processing of the results would then have to be performed on the host (NMS and filtering of the top-k predictions).
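As an illustration, the corresponding Model Optimizer invocation could look like this sketch (the layer names and shape are placeholders you would read off the graph in netron.app):

```shell
# Placeholder layer names, find the real ones in netron.app.
# --output truncates the graph just before the unsupported nodes.
mo --input_model model.onnx \
   --input_shape "[1,3,320,320]" \
   --output concat_name,softmaxname \
   --data_type FP16
```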


          As you can see, you would have to do quite some work to be able to deploy this. Instead, it would probably be easier to take our YoloV8 training notebook and adapt the initial sections to use your dataset. The export should be rather straightforward afterwards, and you can use DepthAI's YoloDetectionNetwork on the device later. Let me know if this helps.