So basically I want a slight DepthBias like 0.00001f, which works as expected … until you “zoom” out. Since the hardware depth buffer is not linear, a single bias value will yield different results depending on the actual depth of the geometry.
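To illustrate what I mean (this is just my own back-of-the-envelope sketch, assuming a standard D3D-style perspective projection that maps eye-space depth to a [0, 1] buffer range): a constant offset in buffer depth maps back to an eye-space offset that grows roughly with the square of the distance, which is exactly the “gets huge in the distance” behavior.

```python
def eye_offset_for_bias(bias, z_eye, near, far):
    """Approximate eye-space offset produced by a constant depth-buffer bias.

    Assumes a D3D-style [0, 1] depth buffer:
        z_buf = far/(far-near) - far*near / ((far-near) * z_eye)
    Differentiating gives d(z_buf)/d(z_eye) = far*near / ((far-near) * z_eye**2),
    so a fixed buffer-space bias corresponds to an eye-space offset of roughly:
    """
    return bias * (far - near) * z_eye ** 2 / (far * near)

near, far, bias = 1.0, 1000.0, 0.00001  # hypothetical camera planes and bias
for z in (2.0, 50.0, 500.0):
    print(f"z_eye = {z:6.1f}  ->  eye-space offset ~ {eye_offset_for_bias(bias, z, near, far):.5f}")
```

With these numbers the effective offset goes from a few hundred-thousandths of a unit up close to multiple world units at z = 500, i.e. the same bias pushes far geometry orders of magnitude more than near geometry.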
Is there anything I am missing in how DepthBias should be used? Is there some calculation to overcome the problem? I just want to fix some flickering (z-fighting) with it, but the effect of the bias gets abnormally big in the distance.
Not sure if this is related to how MonoGame implements it (with respect to OpenGL), as the XNA docs state “The value ranges from 0 to 16” … which is definitely not true for MonoGame. I guess XNA does some related math that MonoGame doesn’t?