Hello Support Team,
I am currently implementing object tracking in Python. I'm following the RGB Rotate Warp example (https://docs.luxonis.com/projects/api/en/latest/samples/ImageManip/rgb_rotate_warp/#rgb-rotate-warp), but I use the center points of four ArUco markers to apply a four-point warp transform to the live camera view. However, the results of the transformation are very unstable when using the setWarpTransformFourPoints function.
Small variations in these central points result in a wrong transformation. Take the results below as an example. The attached images show both the original and the transformed image; in the original, the selected ArUco markers are highlighted and their central points are marked with red dots.
Figure 1: When the central points are: [[21.5, 70.5], [541.5, 46.0], [558.25, 375.25], [39.25, 385.0]], the transformation is as expected.

Figure 2: For a small difference in these points: [[21.75, 70.5], [541.75, 46.0], [559.0, 374.75], [70.5, 384.0]], the transformation is incorrect (or at least not as expected).

The same problem also occurs with static images.
Other than the input ordering, are there any further conditions these points must satisfy for the transform to work properly? If not, what could be causing the problem in this application?
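In case it is relevant, this is roughly how I prepare the four centers before passing them to setWarpTransformFourPoints. It is a simplified, self-contained sketch (the helper names are mine, not from the DepthAI API): the points are ordered top-left, top-right, bottom-right, bottom-left, and a convexity sanity check flags degenerate quadrilaterals.

```python
def order_points(pts):
    """Order four [x, y] points as TL, TR, BR, BL using the common
    sum/difference heuristic: x+y is smallest at top-left and largest
    at bottom-right; y-x is smallest at top-right and largest at
    bottom-left."""
    s = [x + y for x, y in pts]
    d = [y - x for x, y in pts]
    tl = pts[s.index(min(s))]
    br = pts[s.index(max(s))]
    tr = pts[d.index(min(d))]
    bl = pts[d.index(max(d))]
    return [tl, tr, br, bl]

def is_convex(quad):
    """True if the quadrilateral is convex: the cross products of all
    consecutive edge pairs must share the same sign."""
    signs = []
    n = len(quad)
    for i in range(n):
        ax, ay = quad[i]
        bx, by = quad[(i + 1) % n]
        cx, cy = quad[(i + 2) % n]
        cross = (bx - ax) * (cy - by) - (by - ay) * (cx - bx)
        signs.append(cross > 0)
    return all(signs) or not any(signs)

# The point set from Figure 1, in arbitrary detection order:
centers = [[541.5, 46.0], [21.5, 70.5], [39.25, 385.0], [558.25, 375.25]]
ordered = order_points(centers)
# → [[21.5, 70.5], [541.5, 46.0], [558.25, 375.25], [39.25, 385.0]]
print(ordered, is_convex(ordered))
```

Both point sets above pass this check on my side, which is why I suspect something beyond ordering or convexity.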
Please let me know if you need any further information!
Thank you for the help!