Hi all,
looks like I’m a little bit lost with my math. I’m trying to project a vertex from world space to screen space, but I get strange values in some configurations. My current setup is this:
Windows 10 UWP, DirectX
World matrix = identity matrix
View matrix:
 1.0000000  0.0000000  0.0000000    1.0000000
 0.0000000  0.8660260 -0.4999989 -350.2188
 0.0000000  0.4999989  0.8660260 -793.4045
 0.0000000  0.0000000  0.0000000    1.0000000
Projection matrix:
 0.6841934  0.0000000  0.0000000  0.0000000
 0.0000000  1.3032250  0.0000000  0.0000000
 0.0000000  0.0000000 -1.0010000 -1.0010000
 0.0000000  0.0000000 -1.0000000  0.0000000
These were created from the camera position (-1, 700, 512), looking straight along the negative z-axis (i.e. at the target point (-1, 700, 0)). Then the camera was rotated 30 degrees towards the ground. The up vector is (0, 1, 0), my vertical field of view is 150 degrees, the near clipping plane is 1.0 and the far clipping plane is 10000.0.
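To double-check the view matrix, I rebuilt it from those camera parameters in a small Python sketch (just the math, not my actual C# code; this assumes a row-major, column-vector layout with the translation in the last column):

```python
import math

def make_view(eye, pitch_deg):
    """View = Rx(pitch) * T(-eye): move the eye to the origin, then pitch down."""
    c = math.cos(math.radians(pitch_deg))
    s = math.sin(math.radians(pitch_deg))
    ex, ey, ez = eye
    # Row-major, column-vector convention (translation in the last column).
    return [
        [1.0, 0.0, 0.0, -ex],
        [0.0,   c,  -s, -(c * ey - s * ez)],
        [0.0,   s,   c, -(s * ey + c * ez)],
        [0.0, 0.0, 0.0, 1.0],
    ]

view = make_view((-1, 700, 512), 30)
```

The entries agree with the matrix above to about 1e-3 (the posted matrix seems to have been built with slightly rounded sin/cos values), so the view matrix itself looks fine to me.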
My current viewport is 1920x1008 pixels.
Projecting a very large quad with the BasicEffect works as expected. I then try to project the corners of this quad to screen space to calculate the extent of the quad on screen. For some points the projection works fine (either via Viewport.Project() or manually via Vector4.Transform() with the WorldViewProjection matrix), but for others the values are really strange.
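In pseudo-Python, the manual projection I am doing boils down to this (a sketch of the math only, not my actual C# code; the viewport mapping at the end assumes a bottom-left origin, which is what reproduces my corner-1 numbers — the exact y flip depends on the API convention):

```python
def project(p, view, proj, width, height):
    """World -> clip -> NDC -> screen (column-vector convention)."""
    def mul(m, v):
        return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

    clip = mul(proj, mul(view, [p[0], p[1], p[2], 1.0]))
    w = clip[3]                       # this projection matrix sets w = -z_view
    ndc = [clip[0] / w, clip[1] / w]  # perspective divide
    sx = (ndc[0] + 1) * 0.5 * width
    sy = (ndc[1] + 1) * 0.5 * height  # bottom-left origin; flip for top-left
    return sx, sy, w

view = [[1, 0, 0, 1],
        [0, 0.8660260, -0.4999989, -350.2188],
        [0, 0.4999989, 0.8660260, -793.4045],
        [0, 0, 0, 1]]
proj = [[0.6841934, 0, 0, 0],
        [0, 1.303225, 0, 0],
        [0, 0, -1.001, -1.001],
        [0, 0, -1, 0]]

sx, sy, w = project((0, 0, 0), view, proj, 1920, 1008)
# Corner 1 comes out at roughly (960.83, 214.07), with w > 0.
```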
World position corner 1: (0, 0, 0) -> screen position: (960.8279, 214.0689) -> seems reasonable
World position corner 2: (65536, 0, 0) -> screen position: (55215.27, 214.0689) -> seems reasonable
World position corner 3: (0, 0, 65536) -> screen position: (959.9883, 115.2958) -> this point is to the left of corner 1 on my screen?! It is definitely not drawn there.
World position corner 4: (65536, 0, 65536) -> screen position: (190.7992, 115.2958) -> this point is really strange, too.
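While writing this up, I also computed the clip-space w (= -z in view space) for the four corners with the matrices above — for corners 3 and 4 it comes out negative, which would put them behind the camera before the perspective divide. Not sure if that is the cause, but it matches exactly the corners that look wrong:

```python
# Clip-space w for each corner, using the view matrix from above
# (its third row gives z_view; the projection's last row sets w = -z_view).
corners = [(0, 0, 0), (65536, 0, 0), (0, 0, 65536), (65536, 0, 65536)]
ws = []
for x, y, z in corners:
    z_view = 0.4999989 * y + 0.8660260 * z - 793.4045
    ws.append(-z_view)
print(ws)  # corners 3 and 4 get a negative w
```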
Since the BasicEffect shader draws everything right, I guess I have some misconception about projecting the vertices to screen space myself. Or is this some form of overflow? Maybe someone with deeper math/graphics knowledge can give me a hint what is going on here? Thanks a lot!