Hello.

I have trained a yolov5n model with Ultralytics, exported the best.pt to .onnx, and then converted it to a .blob file using the BlobConverter website.
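In case it matters, my understanding is that the website conversion does roughly the same thing as the blobconverter Python package, something like this (a sketch only; the file name and shave count are just example values, not necessarily what the website used):

```python
# Sketch of the .onnx -> .blob conversion I did via the BlobConverter website,
# expressed with the blobconverter package instead (example values only).
import blobconverter

blob_path = blobconverter.from_onnx(
    model="best.onnx",   # the ONNX file exported from best.pt
    data_type="FP16",
    shaves=6,
)
print(blob_path)
```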

Now if I try to use the model in my preview I constantly get the following error:

"[XLinkOut(3)] [error] Message has too much metadata (511377B) to serialize. Maximum is 51200B. Dropping message"

I don't believe this has anything to do with the model itself since I got the same error with a yolov5s and yolov5m.
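For context, the preview pipeline I'm running looks roughly like this (a simplified sketch; the blob path, preview size, class count and thresholds are placeholders, and for yolov5 the anchors and anchor masks would also need to be set via setAnchors/setAnchorMasks):

```python
import depthai as dai

pipeline = dai.Pipeline()

# Color camera feeding the network with a square preview
cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(416, 416)
cam.setInterleaved(False)

# YOLO decoding on the device
nn = pipeline.create(dai.node.YoloDetectionNetwork)
nn.setBlobPath("best.blob")
nn.setNumClasses(1)
nn.setCoordinateSize(4)
nn.setConfidenceThreshold(0.5)
nn.setIouThreshold(0.5)

# Outputs back to the host
xoutRgb = pipeline.create(dai.node.XLinkOut)
xoutRgb.setStreamName("rgb")
xoutNn = pipeline.create(dai.node.XLinkOut)
xoutNn.setStreamName("nn")

cam.preview.link(nn.input)
cam.preview.link(xoutRgb.input)
nn.out.link(xoutNn.input)
```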

A previous question got the recommendation to convert the .pt file to a .blob file with this website: https://tools.luxonis.com/ (Automatic YOLO export for OAKs)

But every time I try to convert a .pt file there, I get a different error that just says "Error while loading model", even though I entered the correct image size.

For additional context: I am using an OAK-1 Lite, and the .pt file is 3.72 MB.

Any help would be appreciated, since I have been trying to get a custom YOLO model working on an OAK for an entire day at this point.

    Over the weekend I trained a yolov8 model instead, and that one converted successfully on the website. However, I am still getting the same metadata error.

      Got it, thanks. I'll look into it and let you know.

      I have now trained the model with yolov8 and realized that I hadn't converted it from the normal Ultralytics model to a model compatible with an OAK device, as described in this notebook: luxonis/depthai-ml-training/blob/master/colab-notebooks/YoloV8_training.ipynb

      However, every time I try to run main.py with my .json config, it tells me that the OAK needs to be connected via USB 3.0. Unfortunately the only USB 3.0 port I have is on my laptop, and if I connect the OAK there, I get an error that the device crashed with an I/O error.

      I have also tried forcing a USB 2.0 connection, but then the program can't find the OAK on the socket anymore.
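      For reference, this is roughly how I tried to force USB 2.0 (a sketch; as far as I understand it, maxUsbSpeed caps the negotiated link speed):

```python
import depthai as dai

pipeline = dai.Pipeline()  # abbreviated here, same pipeline as in my first post

# Cap the link at USB 2.0 (HIGH speed) instead of USB 3.0 (SUPER speed)
with dai.Device(pipeline, maxUsbSpeed=dai.UsbSpeed.HIGH) as device:
    print(device.getUsbSpeed())
```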

        Johnny-McGee
        Can you give me more info?
        What script are you using? What does the I/O error say?

        Thanks,
        Jaka

        Okay, the problem is most likely that my laptop can't supply enough power to the OAK device.

        Also, the last step of the GitHub repo for custom-training a yolov8 model is written very confusingly. It does not convert the .blob file into something directly usable; instead, you have to use the code in main.py or main_api.py to run your model.

        It would be very helpful to have a proper tutorial on how to run a custom-trained yolov8 model on an OAK, which inputs and outputs to create, and how to link everything, because after trying to recreate the steps in main_api.py I get a stream but no detections. Clearer documentation would be appreciated.
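        The host side of my attempt, recreated from main_api.py, looks roughly like this (a sketch; build_pipeline() is just a placeholder for the pipeline construction from my earlier post, and the stream names match the XLinkOut nodes I created there):

```python
import cv2
import depthai as dai

pipeline = build_pipeline()  # placeholder for the pipeline sketched earlier

with dai.Device(pipeline) as device:
    qRgb = device.getOutputQueue(name="rgb", maxSize=4, blocking=False)
    qDet = device.getOutputQueue(name="nn", maxSize=4, blocking=False)

    while True:
        frame = qRgb.get().getCvFrame()
        detections = qDet.get().detections  # ImgDetections from YoloDetectionNetwork

        for det in detections:
            # Bounding boxes are normalized to [0..1]; scale to the frame size
            x1 = int(det.xmin * frame.shape[1])
            y1 = int(det.ymin * frame.shape[0])
            x2 = int(det.xmax * frame.shape[1])
            y2 = int(det.ymax * frame.shape[0])
            cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
            cv2.putText(frame, f"{det.label} {det.confidence:.2f}",
                        (x1, y1 - 5), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0))

        cv2.imshow("preview", frame)
        if cv2.waitKey(1) == ord("q"):
            break
```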