Hi @GergelySzabolcs
I am back with some more info.
If I call getMaxDisparity() in the C++ depthai-core API, I get 8192.
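For reference, this is roughly how I am querying it (a simplified sketch, not my exact pipeline; the real one also enables subpixel and the post-processing filters that show up in the debug log below):

```cpp
#include <iostream>
#include "depthai/depthai.hpp"

int main() {
    dai::Pipeline pipeline;

    // Create the StereoDepth node. My actual pipeline also configures the
    // median/decimation post-processing filters visible in the debug log.
    auto stereo = pipeline.create<dai::node::StereoDepth>();
    stereo->setSubpixel(true);

    // This is the call that returns 8192 for me.
    std::cout << "getMaxDisparity() = " << stereo->initialConfig.getMaxDisparity() << std::endl;
    return 0;
}
```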
In contrast, turning on debug logging gives me the following:
[18443010C10149F500] [1.1] [5.327] [StereoDepth(15)] [debug] Post-processing filters: '2'
[18443010C10149F500] [1.1] [5.327] [StereoDepth(15)] [debug] Executing median filter
[18443010C10149F500] [1.1] [5.328] [StereoDepth(15)] [debug] Scaling disparity values by a factor of '10'
[18443010C10149F500] [1.1] [5.330] [StereoDepth(15)] [debug] Executing decimation filter
[18443010C10149F500] [1.1] [5.335] [StereoDepth(15)] [info] Max disparity: 760 Disparity subpixel levels: 3 Disparity integer level multiplier: 8 Scale factor: 10
The two ways of getting the max disparity don't quite tally. The debug log shows a max disparity of 760 (which is 95 * 8, the same as pre-v2.29). With the scale factor of 10, that gives 7600, which is still less than 8192.
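To spell out the arithmetic I'm doing (just my reading of the values in the debug log, not API output):

```cpp
#include <iostream>

int main() {
    // Values taken from the debug log above.
    const int integerLevels     = 95;  // disparity search range (95 integer levels)
    const int integerMultiplier = 8;   // "Disparity integer level multiplier: 8"
    const int scaleFactor       = 10;  // "Scaling disparity values by a factor of '10'"

    const int logMaxDisparity    = integerLevels * integerMultiplier;  // 760, matches the log
    const int scaledMaxDisparity = logMaxDisparity * scaleFactor;      // 7600

    std::cout << "from log: " << logMaxDisparity
              << ", scaled: " << scaledMaxDisparity
              << ", getMaxDisparity(): 8192" << std::endl;
    return 0;
}
```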
Which one should I use?