joanneChung When I try to convert my custom model to .blob on DepthAI Tools (https://tools.luxonis.com/), it keeps failing with the error: "An error occurred: AxiosError: Request failed with status code 500." I have tried several models, but the error keeps occurring. Can anyone help? Thanks!
JanCuhel Hi @joanneChung, could you please provide us with more details about your model? Is it based on a YOLO architecture? Kind regards, Jan
joanneChung JanCuhel Actually, I tried different models, including yolov5s and yolov6n (the pre-trained models themselves), but they all hit the same error. The conversion to .onnx succeeded, but as soon as the tool started converting to .blob, it ran into the error I mentioned.
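For anyone hitting the same thing: when the ONNX export succeeds but the web tool's blob step fails, one rough workaround is to compile the exported ONNX yourself with Luxonis's `blobconverter` Python package. This is only a sketch under assumptions: the file name, shave count, and preprocessing flags below are illustrative and should be checked against your own model.

```python
import blobconverter  # pip install blobconverter

# Compile an already-exported ONNX model into a .blob for the OAK VPU.
# "yolov6n.onnx", the shave count, and the scale values are assumptions;
# adjust them to match your model and pipeline.
blob_path = blobconverter.from_onnx(
    model="yolov6n.onnx",   # the ONNX file that converted successfully
    data_type="FP16",       # MyriadX runs FP16
    shaves=6,               # typical value for a single-network pipeline
    optimizer_params=[
        "--scale_values=[255,255,255]",  # common YOLO preprocessing; verify for your variant
    ],
)
print(blob_path)
```

Note that this still calls an online conversion backend under the hood, so it is a workaround for the web UI rather than a fully offline path.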
joanneChung JanCuhel Hey, I just tried again today and it works now! I think it was just a temporary system error. Anyway, I appreciate your attention and help!
JanCuhel I am glad to hear that! If it happens again, please do not hesitate to contact us! Kind regards, Jan
JanCuhel Hi @GurdeepakSidhu, I am sorry about that! I'm going to take a look and get back to you as soon as possible! Kind regards, Jan
GurdeepakSidhu @JanCuhel Thank you for looking into it. I look forward to hearing back from you; I appreciate it! Best regards,
JanCuhel GurdeepakSidhu I found the source of the issue: when we updated one of our dependency libraries, one of its arguments became outdated and can no longer be used. The error happens whenever a conversion step fails, because we then pass along an error message indicating which stage failed and what to try next. While building that error message we create an object using the now-outdated argument, and that is what raises the exception. I'm working on a fix as we speak. Unfortunately, this also means your model is failing to convert for a different underlying reason; it could be some other issue or a layer that we don't currently support. Once the fix is deployed, we will get to the bottom of that as well. Kind regards, Jan
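To illustrate the kind of failure mode described above (purely hypothetical names, not the actual Tools backend code): if the error-reporting path itself still passes a keyword argument that a dependency update removed, the reporting path crashes, and the user sees a generic 500 instead of the real conversion error.

```python
# Hypothetical sketch of the described bug; all names are invented for illustration.

class ConversionError(Exception):
    """Raised when a model fails to convert; carries the failed stage."""
    def __init__(self, message, stage):   # newer version: the old `hint=` argument was removed
        super().__init__(message)
        self.stage = stage

def report_failure(stage, detail):
    # This call still passes `hint=`, which the updated class no longer accepts,
    # so it raises TypeError instead of delivering the intended error message.
    raise ConversionError(detail, stage=stage, hint="try a different YOLO version")

def convert(model_path):
    try:
        ...  # ONNX -> blob compilation would happen here
        raise RuntimeError("unsupported layer")   # the *real* failure
    except RuntimeError as exc:
        report_failure("blob-compilation", str(exc))  # TypeError here -> generic HTTP 500
```

The net effect matches what is described in the post: every failed conversion surfaces as the same opaque server error, masking the actual reason the model did not convert.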
GurdeepakSidhu @JanCuhel Thank you for the update on this issue, and I appreciate you working on the fix. I look forward to hearing from you once this is resolved. I appreciate it; have a great day!
JanCuhel GurdeepakSidhu No problem at all! Just wanted to update you that the fix is already deployed. Merry Christmas! Jan
GurdeepakSidhu Hi @JanCuhel Thank you for the update, I appreciate it. I tried converting a YOLOv6 model custom-trained on the latest YOLOv6 release, but now I am getting this error. Thank you, and looking forward to hearing back from you. Merry Christmas!
jakaskerl GurdeepakSidhu Did you also try converting using the other two YOLOv6 options (as suggested in the error log)? Thanks, Jaka
GurdeepakSidhu @jakaskerl Thank you for your response. Yes, I also tried converting with the other YOLOv6 options available in the tool, but I got the same error. Thank you, and looking forward to hearing back from you. Merry Christmas and Happy New Year!
JanCuhel GurdeepakSidhu Could you please share your model weights with me? I'd like to take a closer look. You can send them to my email: jan.cuhel@luxonis.com. I’ve downloaded the latest release of the Nano version (v4), and its conversion worked successfully. Could you also provide us with more details about what exact model variant you used for training? Wishing you a Merry Christmas and Happy Holidays! Kind regards, Jan
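As a side note, one quick way to check which variant a checkpoint actually contains before sharing it is to inspect the saved PyTorch checkpoint. This is a rough sketch under assumptions: the file name and key names ('model', 'ema') follow typical YOLOv6 training checkpoints and may differ between releases, and the script needs to run with the YOLOv6 repo on the Python path so the pickled model classes can be imported.

```python
import torch

# Inspect a YOLOv6 training checkpoint to see which architecture it holds.
# "best_ckpt.pt" and the dict keys are assumptions; run this inside the
# YOLOv6 repo so the pickled model classes resolve.
ckpt = torch.load("best_ckpt.pt", map_location="cpu", weights_only=False)

print(type(ckpt))                 # training checkpoints are usually a dict
if isinstance(ckpt, dict):
    print(list(ckpt.keys()))      # e.g. ['model', 'ema', 'optimizer', 'epoch']
    model = ckpt.get("ema") or ckpt.get("model")
    if model is not None:
        print(model.__class__.__name__)  # the model class hints at the variant/release
```

Knowing the exact release and variant (n/s/m/l) up front usually makes it much easier to reproduce a conversion failure.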