DepthBias calculation

DepthBias in MonoGame does not work the same way as it does in XNA.
I’m not 100% sure, but it looks like XNA applies the DepthBias value set in RasterizerState in a way that matches the depth buffer format.

In XNA I used a value of -0.0004f (which is quite a lot) to offset a transparent area border above the landscape.

This didn’t work at all in MonoGame. After some investigation I found out that MonoGame just passes the value to DirectX unmodified, and DirectX applies it as a factor to a very small number that is calculated from the depth buffer format.
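
For reference, here is a minimal C# sketch of what that means in practice, assuming a 24-bit fixed-point depth buffer (the DirectX rule, with the slope-scaled term omitted, is finalBias = DepthBias * r, where r is the smallest representable depth increment):

```csharp
using System;

// DirectX fixed-point depth bias rule (slope-scaled term omitted):
// finalBias = DepthBias * r, where r = 2^-24 for a 24-bit depth buffer.
double r = Math.Pow(2, -24);            // smallest depth increment, ~5.96e-8
float depthBias = -0.0004f;             // the value that worked in XNA
double effectiveBias = depthBias * r;   // ~ -2.4e-11, far too small to be visible
Console.WriteLine(effectiveBias);
```

This makes it obvious why -0.0004f has no visible effect when it is passed through unmodified.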

My current theory is that XNA does some math on the DepthBias value to make it easier for the developer to work with.

In my case (24-bit depth buffer) the following formula gives exactly the same visual result as -0.0004f in XNA:
state.DepthBias = (float)(-0.0004f / (1 / Math.Pow(2, 24)));
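
Assuming that theory holds, a small helper (hypothetical name, not part of MonoGame) makes the conversion explicit instead of repeating the division everywhere:

```csharp
using System;

// Hypothetical helper: converts an XNA-style DepthBias (a normalized depth
// offset) into the raw value MonoGame hands to DirectX, by scaling with 2^n
// for an n-bit fixed-point depth buffer. Based on the theory above.
static float XnaToMonoGameDepthBias(float xnaBias, int depthBufferBits = 24)
{
    return (float)(xnaBias * Math.Pow(2, depthBufferBits));
}
```

Usage: state.DepthBias = XnaToMonoGameDepthBias(-0.0004f); which is equivalent to the formula above, since dividing by 2^-24 is the same as multiplying by 2^24.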

You should open an issue for this on the MonoGame GitHub page.