GLSL vertex shader calculation issue

Hi all,

I’m ironing out the final problem with my shaders for an Android app. Right now I’m having trouble understanding a problem which only appears in OpenGL.

I want to apply a detail map to my terrain depending on the camera’s distance from each pixel: the closer the camera is to a pixel, the more it should blend with the detail map. I’m currently calculating the camera distance in the vertex shader, like this:

```
float3 positionWorld = mul(input.Position, WorldMatrix).xyz;
output.CameraDistance = length(CameraPosition - positionWorld);
```

…then, in the pixel shader, I do a check like this:

```
if (input.CameraDistance < 500)
{
    // Sample detail map and blend it with regular texture
}
```
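(To be concrete, the blend inside that branch is along these lines — names like `DetailSampler` and `DetailScale` are simplified, not my exact code:)

```
// Fade the detail map out between 400 and 500 units instead of a hard cutoff,
// so there's no visible pop at the boundary.
float detailAmount = 1.0 - smoothstep(400.0, 500.0, input.CameraDistance);
float4 detailColor = tex2D(DetailSampler, input.TexCoord * DetailScale);
float4 finalColor = lerp(baseColor, baseColor * detailColor * 2.0, detailAmount);
```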

The blending works fine, but only for certain camera positions! For example, from this direction it works fine:

(screenshot)

…but if I move the camera slightly, it doesn’t work at all:

(screenshot)

I think it’s something to do with the CameraPosition variable, which represents the coordinates of the camera in world space and is sent into my shader from the main program using SetValue(). If I remove any dependency on positionWorld and just do something like this in the vertex shader:

```
float cameraDistance = length(CameraPosition);
```

…I still get unexpected results which respond very strangely to small movements of the camera.

I tried it on Android and desktop GL, and the problem appears in both. Can anyone who knows more about GLSL see an obvious reason why there could be a problem passing a value which depends on a global variable like CameraPosition into a pixel shader?

I should also say that if I use the same logic but calculate the cameraDistance in the pixel shader, it works without any bugs.

Cheers!

what you’re doing here is basically an inverted fog shader, but with a texture instead of just a colour :slight_smile: It can be solved faster (and easier) by just using the view-space/screen-space coordinate of the vertex, where z is basically the depth value (0–1 as seen from the camera, scaled to the near/far plane of your camera). You may need a division by w — not exactly sure atm. You don’t need any distance calculations in that case.
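something like this is what I mean (just a sketch — matrix and output names are examples, adjust to your setup):

```
// Vertex shader sketch: view-space z is the distance along the camera's
// forward axis - no sqrt needed, and it interpolates cleanly per pixel.
float4 viewPos = mul(mul(input.Position, WorldMatrix), ViewMatrix);
output.CameraDepth = -viewPos.z; // sign depends on handedness; negate for right-handed
```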

normally when rendering goes wrong as you rotate the viewport it’s related to normals, or to mixing up world/screen/view space — but I guess that’s not the case here. You’re using TEXCOORD as the semantic, so the distance value should be lerped between the vertices.

remember your other post: the length calc you’re doing may also overflow your float, because the squared distance inside the sqrt can reach 250,000, which doesn’t fit in a mediump float on GL ES. Don’t get trapped there again :slight_smile:
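if you do want to keep the radial distance, scaling before the length keeps the squared intermediate in range — roughly like this (sketch, using your variable names):

```
// Scale down before squaring so the intermediate dot product stays far
// below the mediump float limit (~65504 on many GL ES devices).
float3 toCamera = (CameraPosition - positionWorld) * 0.01;
output.CameraDistance = length(toCamera) * 100.0; // back to world units
```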


Ooh, good idea with the screen-space depth. That works. I still can’t work out why the first approach has problems, but ah well, it’s simpler now. I’m not sure it was an overflow issue, as I was doing it in the vertex shader, where I think the precision is higher (GLSL ES defaults to highp there). Also, the original radial clip calculation worked in the VS without being scaled! Still, I’ve re-scaled everything again, as it can’t hurt.

Funnily enough, the z component of my position in screen space (after WorldViewProjection) being sent to the pixel shader is not normalised 0–1; it’s still on the scale of the original world coordinates. This happens in both GL and DX. Even funnier, I have a shadow-map shader where I do the same thing (pass depth to the pixel shader), and there it IS normalised 0–1. That one uses an orthographic matrix, though…

Finally, I ran into the most annoying problem yet when trying to run my shaders after they’d been through the compilation process. It’s documented here, and the only way round it for me was quite hacky: if I don’t pass certain otherwise-unused components of a matrix multiplication result through to the pixel shader, they get optimised out, and the shader compiles but crashes at runtime.
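Concretely, the workaround was along these lines (simplified, not my exact shader):

```
// Passing the full float4 through keeps the compiler from stripping the
// components the pixel shader never reads (stripping them crashed at runtime).
output.WorldPosition = mul(input.Position, WorldMatrix); // all four components
```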

I think everything’s quite stable now, but man it really makes me feel like I’m building my foundations on toothpaste when all these bugs keep showing up. Glad I learned graphics with DirectX, or I’d probably have given up on day one :joy: This is all a bit over my head and winds me up when there’s no easy way to debug.

Could it be that the texture coordinates that are sampled from the map are somehow getting screwed up?

Ya, but what’s the w value of that position?

The GPU has to receive coordinates within clip space or it will automatically clip them.
That’s testable by sending in identity matrices and a quad ranging -1 to 1 in x and y: anything out of range is clipped. DX and GL have different z clip ranges — GL uses -1 to 1 and DX uses 0 to 1.
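And if you want the normalised depth yourself, pass z and w through and do the divide in the pixel shader — sketch (`clipPosition` is whatever you got from the WorldViewProjection multiply):

```
// Vertex shader: keep clip-space z and w in a TEXCOORD output.
output.Depth = clipPosition.zw;

// Pixel shader: the divide gives normalised device depth -
// roughly 0..1 on DX, -1..1 on desktop GL.
float depth = input.Depth.x / input.Depth.y;
```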

Thanks for the reply - I’ll check the w value when I get round to it!