Thank you, Jaka.
I am already using letterboxing. The thing is, I collected images in 4K and trained the neural network with a 1024×1024 input width and height, so I don't have a problem with the input to the network.
The problem is at inference time with 4K frames: I can only load one model because of the memory limitation, but I need to deploy two models and use one of them based on a trigger.
Is there any way to work around the 4K inference by converting the frames to a lower resolution while keeping the same width-to-height ratio?
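To clarify what I mean, here is a rough sketch of the geometry I have in mind (plain Python, no DepthAI; the 3840×2160 source and 1024×1024 destination sizes are just my use case):

```python
# Letterbox downscale: fit a 4K frame into a square 1024x1024 NN input
# while preserving the aspect ratio, padding the leftover space.

def letterbox_dims(src_w, src_h, dst_w, dst_h):
    # Scale so the whole frame fits inside the destination rectangle.
    scale = min(dst_w / src_w, dst_h / src_h)
    new_w = round(src_w * scale)
    new_h = round(src_h * scale)
    # Split the remaining space evenly as padding on each side.
    pad_x = (dst_w - new_w) // 2
    pad_y = (dst_h - new_h) // 2
    return new_w, new_h, pad_x, pad_y

print(letterbox_dims(3840, 2160, 1024, 1024))  # -> (1024, 576, 0, 224)
```

So the 4K frame would be resized to 1024×576 with 224 px of padding at the top and bottom; the question is how to do this resize efficiently on the device before the NN.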
I would really appreciate an example showing how to swap models on the same OAK device.