dhruvsheth-ai, thanks for your reply.
I use MobileNet v2. I should have mentioned this. I will amend my first post if it's still possible.
http://69.164.214.171:8084/ has been saying "Converting..." for the last 30 minutes, and I have tried restarting the conversion several times.
I even tried the example model from one of the Colab notebooks, and it's getting stuck as well.
I'm not sure why; maybe the server is a bit too busy.
In any case, I tried reading the OpenVINO docs, and they say I need a frozen inference graph, whilst my .pb file is not one (I think...).
It's been created by running
model.save('model\modeltf2', save_format='tf')
I tried googling how to create a frozen inference model from an "unfrozen" one, and all I can find is some sort of black magic...
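For reference, the recipe I keep running into boils down to something like this (just my sketch of what I found, assuming a TF2 Keras SavedModel; the 'model/modeltf2' path and the 'frozen_graph.pb' name are placeholders, so please correct me if this is the wrong approach):

import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2

# Load the SavedModel written by model.save(..., save_format='tf')
model = tf.keras.models.load_model('model/modeltf2')

# Wrap the forward pass in a concrete function with an explicit input signature
full_model = tf.function(lambda x: model(x))
concrete_func = full_model.get_concrete_function(
    tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype))

# Fold the trained variables into graph constants ("freezing")
frozen_func = convert_variables_to_constants_v2(concrete_func)

# Write the frozen GraphDef out as a binary .pb
tf.io.write_graph(frozen_func.graph.as_graph_def(),
                  logdir='model', name='frozen_graph.pb', as_text=False)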
Am I headed in the right direction?
Thanks,
Eugene.