  • Creating .blob from Google Vertex Edge Model

I am trying to use a model created in Vertex on an OAK. However, I am having major issues using the blob converter web app. I have also tried the openvino toolkits (versions 2021, 2022, and 2023) with no luck. Can you please advise if this is possible?

Here are the possible outputs from Vertex:
https://cloud.google.com/vertex-ai/docs/export/export-edge-model#tf-lite:~:text=Tensorflow.js-,The%20OUTPUT_BUCKET,-you%20specified%20in

Thank you!


    Hi MikieReed,
    I would suggest exporting as TFLite, and then using the tflite2tensorflow package to convert to OpenVINO. Let us know if you have any issues with that flow.
    Thanks, Erik
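    That flow could look roughly like the sketch below. This is only an illustration: it assumes tflite2tensorflow and its flatc / schema.fbs prerequisites are installed as described in the package README, and model.tflite is a placeholder file name.

```shell
# Sketch of the suggested TFLite -> OpenVINO -> Myriad .blob flow
# (assumes tflite2tensorflow is installed; model.tflite is a placeholder).

# 1. TFLite -> TensorFlow saved_model / frozen .pb, with Myriad-friendly
#    optimizations enabled
tflite2tensorflow \
  --model_path model.tflite \
  --flatc_path ../flatc \
  --schema_path ../schema.fbs \
  --output_pb \
  --optimizing_for_openvino_and_myriad

# 2. saved_model -> OpenVINO IR and Myriad blob in one step
tflite2tensorflow \
  --model_path model.tflite \
  --flatc_path ../flatc \
  --schema_path ../schema.fbs \
  --output_openvino_and_myriad
```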

    Thank you, Erik.
    I had played around with that a bit, but I will try it again and let you know.
    Have a wonderful day!
    mikie

    Good morning, Erik,
    I have been trying the tflite2tensorflow package. Per the package README, I have:

    Created a frozen .pb file:

    tflite2tensorflow \
      --model_path 021023_model.tflite \
      --flatc_path ../flatc \
      --schema_path ../schema.fbs \
      --output_pb \
      --optimizing_for_openvino_and_myriad \
      --rigorous_optimization_for_myriad
    
    TensorFlow/Keras model building process complete!
    saved_model / .pb output started ====================================================
    saved_model / .pb output complete!

    However, on the .blob creation step:

    tflite2tensorflow \
      --model_path 021023_model.tflite \
      --flatc_path ../flatc \
      --schema_path ../schema.fbs \
      --output_no_quant_float32_tflite \
      --output_dynamic_range_quant_tflite \
      --output_weight_quant_tflite \
      --output_float16_quant_tflite \
      --output_integer_quant_tflite \
      --string_formulas_for_normalization 'data / 255.0' \
      --output_tfjs \
      --output_coreml \
      --output_tftrt_float32 \
      --output_tftrt_float16 \
      --output_onnx \
      --onnx_opset 11 \
      --output_openvino_and_myriad
    
    Myriad Inference Engine blob convertion started ============================================
    Traceback (most recent call last):
      File "/usr/local/bin/tflite2tensorflow", line 6608, in <module>
        main()
      File "/usr/local/bin/tflite2tensorflow", line 6578, in main
        shutil.copy(f'{model_output_path}/openvino/FP16/saved_model.xml', f'{model_output_path}/openvino/FP16/saved_model_vino.xml')
      File "/usr/lib/python3.8/shutil.py", line 418, in copy
        copyfile(src, dst, follow_symlinks=follow_symlinks)
      File "/usr/lib/python3.8/shutil.py", line 264, in copyfile
        with open(src, 'rb') as fsrc, open(dst, 'wb') as fdst:
    FileNotFoundError: [Errno 2] No such file or directory: 'saved_model/openvino/FP16/saved_model.xml'

    I played with it for a few hours, but couldn't find where that file was supposed to be created.
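    One way to narrow this down is to check whether the OpenVINO IR that the blob step expects was ever written. A small sketch, with the expected path taken from the traceback above and MODEL_OUT as a placeholder for the tool's output directory:

```shell
# Check for the IR file the blob-conversion step tries to copy (path from
# the traceback). If it is missing, the OpenVINO conversion itself failed
# earlier, so list any XML files the tool did manage to write.
MODEL_OUT=saved_model
if [ -f "$MODEL_OUT/openvino/FP16/saved_model.xml" ]; then
  echo "IR found: the blob conversion step has its input"
else
  echo "IR missing: the OpenVINO step did not produce it. XML files found:"
  find "$MODEL_OUT" -name '*.xml' -print
fi
```

    If the `.xml` is missing, the failure happened in the OpenVINO conversion stage, not in the blob step itself.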

    Please advise. Thank you!

    mikie


      Hi MikieReed,
      I am not very familiar with the tool, so it would be best to submit the issue on their repo. In general, Luxonis officially supports models whose training notebooks can be found at depthai-ml-training. That said, other models can likely still work, but they usually need a bit more effort to get converted.
      Thanks, Erik