Hi tode09,
If you changed the input shape in the cfg before training and provided `--size 608` when calling `convert_weights_pb.py`, it should work and convert as expected. A good indicator is that the correct input shape also appears in the XML. You can confirm this by looking at the `<meta_data>` tags in the last few lines of the XML; there should be a line similar to `<input_shape value="[1,3,608,608]"/>`.
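If you want to check that line programmatically, a small sketch like this would do it (the function name and regex are mine, not part of any tool; it just scans the XML text for that `meta_data` line):

```python
import re

def parsed_input_shape(xml_text):
    """Extract the shape from an <input_shape value="[...]"/> line, if present."""
    m = re.search(r'<input_shape value="\[([0-9,\s]+)\]"', xml_text)
    return [int(v) for v in m.group(1).split(",")] if m else None

# Example against the line you should see near the end of the XML:
print(parsed_input_shape('<input_shape value="[1,3,608,608]"/>'))  # [1, 3, 608, 608]
```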
If all of this is correct, then maybe you are setting up the pipeline wrong. When calling `depthai_demo.py`, you should provide the correct JSON configuration for your model, with values that match those in your CFG. As a starting point, I'd suggest you take this template and edit it. By default, the anchors should be the same. You would have to change `input_size`, `classes`, and `labels`.
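Roughly, the JSON has this shape (the values below are placeholders for illustration, not defaults; keep the anchors, masks, class count, and labels that match your CFG, and check the field names against the template itself):

```json
{
    "nn_config": {
        "output_format": "detection",
        "NN_family": "YOLO",
        "input_size": "608x608",
        "NN_specific_metadata": {
            "classes": 2,
            "coordinates": 4,
            "anchors": [10, 14, 23, 27, 37, 58, 81, 82, 135, 169, 344, 319],
            "anchor_masks": {
                "side38": [3, 4, 5],
                "side19": [0, 1, 2]
            },
            "iou_threshold": 0.5,
            "confidence_threshold": 0.5
        }
    },
    "mappings": {
        "labels": ["class_a", "class_b"]
    }
}
```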
Also, be careful when setting anchor masks. The anchor masks in the JSON should match the anchor masks in the config. Since you changed the input shape, you will also have to change `"side26"` to `"side38"` and `"side13"` to `"side19"`. You get these numbers by dividing the input shape by 16 and 32, respectively.
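In other words, the mask-key names are just the output grid sizes, which you can compute directly (helper name is mine, for illustration):

```python
def yolo_side_keys(input_size):
    """Mask-key names come from the grid sizes: input size divided by the stride."""
    return "side%d" % (input_size // 16), "side%d" % (input_size // 32)

print(yolo_side_keys(416))  # ('side26', 'side13') -- the template's values
print(yolo_side_keys(608))  # ('side38', 'side19') -- what a 608x608 model needs
```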
You can also try adapting this example from our docs, where you would set these values directly in Python code.
One last note on custom objects. I would kindly refer you to this page, where it's described how to generate anchors for your own dataset, which might improve performance in your case. If you generate your own anchors, you will then have to add them to the CFG, to the JSON used during conversion, and to the JSON used in `depthai_demo.py`.
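To give an idea of what anchor generation does: it clusters the width/height of the boxes in your dataset and uses the cluster centers as anchors. A minimal sketch of that idea (the function is mine, not the linked tool, and it uses plain Euclidean k-means for brevity, whereas the usual YOLO tooling clusters by IoU distance):

```python
import random

def kmeans_anchors(boxes, k=6, iters=50, seed=0):
    """boxes: list of (w, h) pairs; returns k (w, h) anchors via plain k-means."""
    random.seed(seed)
    centers = random.sample(boxes, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for w, h in boxes:
            # assign each box to its nearest center (Euclidean, for brevity)
            i = min(range(k), key=lambda c: (w - centers[c][0]) ** 2 + (h - centers[c][1]) ** 2)
            clusters[i].append((w, h))
        # move each center to the mean of its cluster; keep empty clusters in place
        centers = [
            (sum(b[0] for b in cl) / len(cl), sum(b[1] for b in cl) / len(cl)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return sorted(centers, key=lambda a: a[0] * a[1])
```

For example, a dataset with many small and many large boxes would yield one small and one large anchor with `k=2`.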