# Trying to find the pixel-perfect plane - Weird Unproject behavior

Hello everyone, I’m trying to wrap my head around Unproject in order to find the Z at which 1 unit equals 1 pixel (my coordinate system has X/Y going right/down and Z going into the screen); I’m using a perspective projection.

My idea was to compute the distance in pixels between two points 1 unit apart, on both the near and far planes, and compare the two to find where 1 unit == 1 pixel.

The strange things happen when I start using Unproject. In particular, if I do this:

```csharp
float x = this.Viewport.Bounds.Center.X;
float y = this.Viewport.Bounds.Center.Y;

Vector3 nearA = this.Viewport.Unproject(new Vector3(x, y, 0f), this.Projection, this.View, Matrix.Identity);
Vector3 nearB = this.Viewport.Unproject(new Vector3(x + 1, y, 0f), this.Projection, this.View, Matrix.Identity);
float nearDistance = (nearB - nearA).Length();

Vector3 farA = this.Viewport.Unproject(new Vector3(x, y, 1f), this.Projection, this.View, Matrix.Identity);
Vector3 farB = this.Viewport.Unproject(new Vector3(x + 1, y, 1f), this.Projection, this.View, Matrix.Identity);
float farDistance = (farB - farA).Length();

// added for debugging
Vector3 mid1 = this.Viewport.Unproject(new Vector3(x, y, .5f), this.Projection, this.View, Matrix.Identity);
Vector3 mid2 = this.Viewport.Unproject(new Vector3(x, y, .99f), this.Projection, this.View, Matrix.Identity);
```

I get the following values (camera at (0, 0, -1000), looking down +Z, with the near plane at 0.1 and the far plane at 100000):

```
nearA   (z = 0)    =  0  0  -999.9     (-1000 + 0.1... perfect)
farA    (z = 1)    =  0  0  86442.664  (ok, maybe some rounding errors... not a big deal)
mid1    (z = 0.5)  =  0  0  -999.8     (should be halfway through the frustum, definitely not -999.8)
mid2    (z = 0.99) =  0  0  -989.99744 (practically 1, and it still barely moved from the near plane)
```

What am I missing? Is there any other way to obtain the Z of the plane where 1 unit == 1 pixel?

Thanks

The problem is that the depth values are not linear.
There should be a simple trigonometric solution. I think it’s this (untested):

```
distance = (resolutionX / 2) / tan(fieldOfViewX / 2)
```
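To see just how non-linear it is, here's a quick numeric sketch (Python for convenience). It assumes the D3D-style [0, 1] depth range that XNA's perspective projection produces, plugged with your values (near = 0.1, far = 100000, camera at z = -1000):

```python
near, far = 0.1, 100000.0   # your near/far planes
camera_z = -1000.0          # your camera position on Z

def view_depth(d):
    """Invert the perspective depth mapping
    d = far * (z - near) / (z * (far - near))
    to get the view-space distance z for a depth-buffer value d in [0, 1]."""
    return near * far / (far - d * (far - near))

for d in (0.0, 0.5, 0.99, 1.0):
    print(f"d = {d} -> world z = {camera_z + view_depth(d):.5f}")
```

Depth 0.5 lands at about -999.8 and 0.99 at about -990, matching what you're seeing: almost the entire [0, 1] depth range is spent on the first few units past the near plane. Depth 1.0 recovers the far plane (z ≈ 99000 in world space), so the 86442.664 you got is most likely 32-bit float precision loss, which is at its worst near the far plane.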

Thanks for the hint. I’m just not sure I’m understanding it right… wouldn’t that mean the plane changes whenever I resize the window?
What actually happens when I resize only along X (assuming I’m doing everything else properly) is that I just see more stuff, while the textures stay the same size; they only start scaling when I resize along Y. Maybe I’ll try investigating in that direction (no pun intended)

**Edit**

Yep! The correct formula appears to be:

```
(viewport.Height / 2) / tan(fieldOfView / 2)
```
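For anyone landing here later, a quick check of why this works (Python sketch; the 1280×720 viewport and π/4 FOV are made-up numbers): at distance D = (height / 2) / tan(fov / 2), the frustum's visible vertical extent is 2 · D · tan(fov / 2) = height world units spread over height pixels, i.e. exactly 1 unit per pixel.

```python
import math

width, height = 1280, 720   # hypothetical viewport size
fov = math.pi / 4           # hypothetical vertical field of view

# Distance at which 1 world unit covers exactly 1 pixel vertically.
D = (height / 2) / math.tan(fov / 2)

def pixels_per_unit(distance):
    """Vertical pixels covered by 1 world unit at the given view distance."""
    visible_height = 2 * distance * math.tan(fov / 2)  # world units visible on Y
    return height / visible_height

print(D)                       # ~869.12 for these numbers
print(pixels_per_unit(D))      # 1.0 (up to floating-point rounding)
print(pixels_per_unit(2 * D))  # twice as far, half the pixels per unit
```

Note this only depends on the viewport height and the vertical FOV, which matches the resize behavior above: widening the window just shows more, while changing the height rescales everything.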

Thanks a lot!