• DepthAI
  • Model download timeout: download.01.org

Oh interesting. We will investigate and see about hosting these directly instead.

How do I make depthai_demo use the downloaded blob file?
I tried placing face-detection-retail-0004.blob in the resources/nn/face-detection-retail-0004 folder with the model.yml file, but it still tried to download from 01.org. Is there a way to download the xml and bin files?

The models I can download are:
yolo-v3, OK
deeplabv3p_person, OK
mobileNetV2-PoseEstimation
vehicle-license-plate-detection-barrier-0106
tiny-yolo-v3, OK
mobilenet-ssd, OK

I think the problem is that downloads.01.org does not support the latest download protocol, and recent changes have outlawed the use of the older insecure protocols. I don't know whether the restriction is enforced locally on my computer or globally in the web infrastructure. If it were just my laptop there might be a way to re-enable the old protocols. (I tried and failed, but it is not my area of expertise.)
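
A quick way to test that theory is to check which TLS version the host negotiates from this machine. A minimal sketch using only the standard library (host name taken from the download URL):

import socket
import ssl

# Check which TLS version download.01.org negotiates from this machine,
# to see whether the old-protocols theory holds up.
host = "download.01.org"
ctx = ssl.create_default_context()
with socket.create_connection((host, 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        print("Negotiated:", tls.version())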

I found when testing with mobilenet-ssd that the scripts cache files in multiple places, all of which need to be cleared to force a fresh download.
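
For anyone else who needs to do this, a rough sketch of the locations I cleared (the paths are what I found on my install and may differ between releases):

import shutil
from pathlib import Path

# Rough sketch: clear cached model files so the demo downloads them again.
# These paths are what I found on my install; they may differ between releases.
model = "mobilenet-ssd"
repo = Path(".")  # root of the depthai checkout

# xml/bin files cached by the model compiler
shutil.rmtree(repo / "model_compiler" / "downloads" / model, ignore_errors=True)

# any previously compiled blob for the same model under resources/nn
for blob in (repo / "resources" / "nn").rglob(f"{model}.blob*"):
    blob.unlink()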

    julia76 Sorry about the delay. Yes, I think we recently wrote a tutorial on how to do this; I'm checking with the team and will circle back.

    Hi julia76,

    There are certain naming conventions for the blobs copied into resources/nn/. The file name should match [MODEL_NAME].blob.shXcmxXNCE1, where X is filled in with the number of SHAVEs and CMX slices used. For example, you could move it to resources/nn/face-detection-retail-0004.blob.sh14cmx14NCE1. Let me know if that works!
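
    Something like this should do it; a minimal, untested sketch assuming the default 14 SHAVEs, 14 CMX slices and 1 NCE that the demo compiles for:

    import shutil
    from pathlib import Path

    # Untested sketch: copy a manually downloaded blob to the file name the
    # demo expects. Assumes the default 14 SHAVEs, 14 CMX slices and 1 NCE.
    model = "face-detection-retail-0004"
    src = Path(f"{model}.blob")                        # your downloaded blob
    dst = Path("resources/nn") / f"{model}.blob.sh14cmx14NCE1"
    shutil.copy(src, dst)
    print("Copied", src, "->", dst)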

    Best,
    Philip

    Thanks Philip. Getting the correct suffix got it working.

    Correction: I incorrectly stated that vehicle-license-plate-detection-barrier-0106.xml can be downloaded.

    Hi julia76,

    We only formally support up to 2021.1, but we should be adding support for 2021.2 soon. Until then you can still try to use those models since they may be backwards compatible.

    Best,
    Philip

    I have just got something working using openVINO_2021.2

    I used apt to install intel-openvino-runtime-ubuntu20-2021.2.200 and intel-openvino-dev-ubuntu20-2021.2.200.
    If I try installing anything older than openVINO_2021.1, the installation scripts fall over because they don't recognise the latest version of Ubuntu. I can install openVINO_2021.1, but that falls over as it tries to download from downloads.01.org (which is my original problem).

    The demo script './demo_benchmark_app.sh -d MYRIAD' runs on my OAK-1, as do the other demos in the OpenVINO distribution.

    5 days later

    I'm having the same issue with downloading:
    requests.exceptions.HTTPError: 504 Server Error: Gateway Time-out for url: https://download.01.org/opencv/2020/openvinotoolkit/2020.1/open_model_zoo/models_bin/1/face-detection-retail-0004/FP16/face-detection-retail-0004.xml

    Aaaand I should follow up by saying it's working now. Perhaps the host is just unreliable?
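
    In case it flakes out again, here is a quick sketch that retries the download with a backoff before giving up (URL copied from the error above; the timeout and retry counts are arbitrary):

    import time
    from pathlib import Path

    import requests

    # Quick sketch: retry the flaky download a few times with backoff.
    url = ("https://download.01.org/opencv/2020/openvinotoolkit/2020.1/"
           "open_model_zoo/models_bin/1/face-detection-retail-0004/FP16/"
           "face-detection-retail-0004.xml")

    for attempt in range(5):
        try:
            resp = requests.get(url, timeout=30)
            resp.raise_for_status()
            Path("face-detection-retail-0004.xml").write_bytes(resp.content)
            break
        except requests.RequestException as exc:
            print(f"Attempt {attempt + 1} failed: {exc}")
            time.sleep(2 ** attempt)  # exponential backoff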

    6 days later

    Same issue; do I need to resort to using a manually created blob file? Unlike julia76, I'm not able to access http://69.164.214.171:8083/ from the browser. If I remove the port from the URL (http://69.164.214.171), it does load in the browser.

    (depthai_env) PS C:\Users\dn****\.conda\envs\depthai_env\depthai> python depthai_demo.py
    No calibration file. Using Calibration Defaults.
    Using depthai module from:  C:\Users\dn****\AppData\Roaming\Python\Python39\site-packages\depthai.cp39-win_amd64.pyd
    Depthai version installed:  0.4.1.1
    Compiling model for 14 shaves, 14 cmx_slices and 1 NN_engines
    model_compilation_target: cloud
    ################|| Downloading models ||################
    
    ========== Retrieving C:\Users\dn****\.conda\envs\depthai_env\depthai\model_compiler\downloads\mobilenet-ssd\FP16\mobilenet-ssd.xml from the cache
    
    ========== Retrieving C:\Users\dn****\.conda\envs\depthai_env\depthai\model_compiler\downloads\mobilenet-ssd\FP16\mobilenet-ssd.bin from the cache
    
    ################|| Post-processing ||################
    
    Unknown error occured: HTTPConnectionPool(host='69.164.214.171', port=8083): Max retries exceeded with url: /compile?version=2020.1 (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x00000213E7A76940>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it'))
    Traceback (most recent call last):
      File "C:\Users\dn****\.conda\envs\depthai_env\depthai\depthai_demo.py", line 415, in <module>
        dai.startLoop()
      File "C:\Users\dn****\.conda\envs\depthai_env\depthai\depthai_demo.py", line 45, in startLoop
        configMan = DepthConfigManager(args)
      File "C:\Users\dn****\.conda\envs\depthai_env\depthai\depthai_helpers\config_manager.py", line 26, in __init__
        self.generateJsonConfig()
      File "C:\Users\dn****\.conda\envs\depthai_env\depthai\depthai_helpers\config_manager.py", line 192, in generateJsonConfig
        blobMan = BlobManager(self.args, self.calc_dist_to_bb, shave_nr, cmx_slices, NCE_nr)
      File "C:\Users\dn****\.conda\envs\depthai_env\depthai\depthai_helpers\config_manager.py", line 377, in __init__
        self.blob_file = self.compileBlob(self.args['cnn_model'], self.args['model_compilation_target'])
      File "C:\Users\dn****\.conda\envs\depthai_env\depthai\depthai_helpers\config_manager.py", line 431, in compileBlob
        ret = download_and_compile_NN_model(nn_model, model_zoo_folder, shave_nr_opt, cmx_slices_opt, NCE_nr, outblob_file, model_compilation_target)
      File "C:\Users\dn****\.conda\envs\depthai_env\depthai\model_compiler\model_compiler.py", line 182, in download_and_compile_NN_model
        raise RuntimeError("Model compiler failed! Not connected to the internet?")
    RuntimeError: Model compiler failed! Not connected to the internet?
    (depthai_env) PS C:\Users\dn****\.conda\envs\depthai_env\depthai>
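
    A minimal sketch I used to check whether the port itself is reachable from this network (host and port taken from the error above), to separate a blocked port from a server-side problem:

    import socket

    # Minimal check: can we even open a TCP connection to the compile server?
    host, port = "69.164.214.171", 8083
    try:
        with socket.create_connection((host, port), timeout=10):
            print(f"TCP connection to {host}:{port} succeeded")
    except OSError as exc:
        print(f"TCP connection to {host}:{port} failed: {exc}")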

    Noticing that luxonis.com has SSL, I changed the URL used in model_compiler.py to https and added the luxonis SSL cert to my Python environment's cacert.pem file.

    This produced a different error:

    Unknown error occured: HTTPSConnectionPool(host='69.164.214.171', port=8083): Max retries exceeded with url: /compile?version=2020.1 (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x000002EFF1939E20>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it'))
    Traceback (most recent call last):
      File "C:\Users\dn\.conda\envs\depthai_env\depthai\depthai_demo.py", line 415, in <module>
        dai.startLoop()
      File "C:\Users\dn****\.conda\envs\depthai_env\depthai\depthai_demo.py", line 45, in startLoop
        configMan = DepthConfigManager(args)
      File "C:\Users\dn****\.conda\envs\depthai_env\depthai\depthai_helpers\config_manager.py", line 26, in __init__
        self.generateJsonConfig()
      File "C:\Users\dn****\.conda\envs\depthai_env\depthai\depthai_helpers\config_manager.py", line 192, in generateJsonConfig
        blobMan = BlobManager(self.args, self.calc_dist_to_bb, shave_nr, cmx_slices, NCE_nr)
      File "C:\Users\dn****\.conda\envs\depthai_env\depthai\depthai_helpers\config_manager.py", line 377, in __init__
        self.blob_file = self.compileBlob(self.args['cnn_model'], self.args['model_compilation_target'])
      File "C:\Users\dn****\.conda\envs\depthai_env\depthai\depthai_helpers\config_manager.py", line 431, in compileBlob
        ret = download_and_compile_NN_model(nn_model, model_zoo_folder, shave_nr_opt, cmx_slices_opt, NCE_nr, outblob_file, model_compilation_target)
      File "C:\Users\dn****\.conda\envs\depthai_env\depthai\model_compiler\model_compiler.py", line 182, in download_and_compile_NN_model
        raise RuntimeError("Model compiler failed! Not connected to the internet?")
    RuntimeError: Model compiler failed! Not connected to the internet?
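
    For completeness, roughly what the certificate change amounts to; an equivalent way is to point requests at a custom CA bundle instead of editing cacert.pem in place (the bundle file name below is mine, made by appending the server certificate to a copy of certifi's default bundle):

    import os

    # Point requests at a custom CA bundle rather than editing cacert.pem.
    # "luxonis_ca_bundle.pem" is a hypothetical file built by appending the
    # server certificate to a copy of certifi's default bundle.
    os.environ["REQUESTS_CA_BUNDLE"] = r"C:\path\to\luxonis_ca_bundle.pem"

    import requests

    # This only checks that the TLS handshake succeeds; it is not the actual
    # compile request that model_compiler.py sends.
    resp = requests.get("https://69.164.214.171:8083/", timeout=30)
    print(resp.status_code)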

    Still not sure how to remedy this. Any insight would be helpful.

      dnavs Update: Success! I had been attempting this from within our organization's network. I tried running depthai_demo.py while connected to wifi outside my network (using the original http://... URL) and everything downloaded as expected. I suspect my company is blocking non-encrypted traffic.

      5 days later

      Thanks @dnavs! That makes way more sense now. I should have thought of that. And thanks for circling back with the resolution. (And sorry about the delay; European Brexit/pandemic logistics issues have been consuming so many hours.)