Hi,
I'm having trouble converting my custom ONNX model to blob format with the provided blobconverter tool. The error log is copied below:
```
Error message
Command failed with exit code 1, command: /app/venvs/venv2022_1/bin/python /app/model_compiler/openvino_2022.1/converter.py --precisions FP16 --output_dir /tmp/blobconverter/4b6ada9693f34e528a0fe2e8f3f17cc4 --download_dir /tmp/blobconverter/4b6ada9693f34e528a0fe2e8f3f17cc4 --name coho_full_08252023 --model_root /tmp/blobconverter/4b6ada9693f34e528a0fe2e8f3f17cc4
Console output (stdout)
========== Converting coho_full_08252023 to IR (FP16)
Conversion command: /app/venvs/venv2022_1/bin/python -- /app/venvs/venv2022_1/bin/mo --framework=onnx --data_type=FP16 --output_dir=/tmp/blobconverter/4b6ada9693f34e528a0fe2e8f3f17cc4/coho_full_08252023/FP16 --model_name=coho_full_08252023 --input= --data_type=FP16 '--mean_values=[127.5,127.5,127.5]' '--scale_values=[255,255,255]' --input_model=/tmp/blobconverter/4b6ada9693f34e528a0fe2e8f3f17cc4/coho_full_08252023/FP16/coho_full_08252023.onnx
Model Optimizer arguments:
Common parameters:
- Path to the Input Model: /tmp/blobconverter/4b6ada9693f34e528a0fe2e8f3f17cc4/coho_full_08252023/FP16/coho_full_08252023.onnx
- Path for generated IR: /tmp/blobconverter/4b6ada9693f34e528a0fe2e8f3f17cc4/coho_full_08252023/FP16
- IR output name: coho_full_08252023
- Log level: ERROR
- Batch: Not specified, inherited from the model
- Input layers: Not specified, inherited from the model
- Output layers: Not specified, inherited from the model
- Input shapes: Not specified, inherited from the model
- Source layout: Not specified
- Target layout: Not specified
- Layout: Not specified
- Mean values: [127.5,127.5,127.5]
- Scale values: [255,255,255]
- Scale factor: Not specified
- Precision of IR: FP16
- Enable fusing: True
- User transformations: Not specified
- Reverse input channels: False
- Enable IR generation for fixed input shape: False
- Use the transformations config file: None
Advanced parameters:
- Force the usage of legacy Frontend of Model Optimizer for model conversion into IR: False
- Force the usage of new Frontend of Model Optimizer for model conversion into IR: False
OpenVINO runtime found in: /opt/intel/openvino2022_1/python/python3.8/openvino
OpenVINO runtime version: 2022.1.0-7019-cdb9bec7210-releases/2022/1
Model Optimizer version: 2022.1.0-7019-cdb9bec7210-releases/2022/1
FAILED:
coho_full_08252023
Error output (stderr)
[ ERROR ] -------------------------------------------------
[ ERROR ] ----------------- INTERNAL ERROR ----------------
[ ERROR ] Unexpected exception happened.
[ ERROR ] Please contact Model Optimizer developers and forward the following information:
[ ERROR ] Check 'unknown_operators.empty()' failed at frontends/onnx/frontend/src/core/graph.cpp:131:
OpenVINO does not support the following ONNX operations: TRT.EfficientNMS_TRT
[ ERROR ] Traceback (most recent call last):
File "/app/venvs/venv2022_1/lib/python3.8/site-packages/openvino/tools/mo/main.py", line 533, in main
ret_code = driver(argv)
File "/app/venvs/venv2022_1/lib/python3.8/site-packages/openvino/tools/mo/main.py", line 489, in driver
graph, ngraph_function = prepare_ir(argv)
File "/app/venvs/venv2022_1/lib/python3.8/site-packages/openvino/tools/mo/main.py", line 394, in prepare_ir
ngraph_function = moc_pipeline(argv, moc_front_end)
File "/app/venvs/venv2022_1/lib/python3.8/site-packages/openvino/tools/mo/moc_frontend/pipeline.py", line 147, in moc_pipeline
ngraph_function = moc_front_end.convert(input_model)
RuntimeError: Check 'unknown_operators.empty()' failed at frontends/onnx/frontend/src/core/graph.cpp:131:
OpenVINO does not support the following ONNX operations: TRT.EfficientNMS_TRT
[ ERROR ] ---------------- END OF BUG REPORT --------------
[ ERROR ] -------------------------------------------------
```
From the stderr, the failure appears to come from the TRT.EfficientNMS_TRT node, which the OpenVINO ONNX frontend doesn't recognize. Is there anything I can do to fix this?
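For reference, I double-checked that the node really is in the exported graph with a quick script using the `onnx` Python package (the filename is just my local copy of the model):

```python
import onnx

# Load the exported model and list any nodes that come from a non-standard
# (non-ai.onnx) domain -- these are the ops OpenVINO's ONNX frontend rejects.
model = onnx.load("coho_full_08252023.onnx")  # local copy of the model
for node in model.graph.node:
    if node.domain not in ("", "ai.onnx"):
        print(node.domain, node.op_type)
# Expected to print something like: TRT EfficientNMS_TRT
```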