Shader parameter order bug

I’m using 2MGFX to compile a shader with the OpenGL profile, and I noticed a strange bug. I haven’t heard of anything like this, so I’m assuming it’s not a known issue.

At the very top of my shader I have a bunch of constants to be taken as parameters:

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 InverseViewProjection;
float3 CameraPosition;
float4x4 LightViewProjection;
float3 LightPosition;
float4 LightColor;
float LightIntensity;
...

and I found that some of the parameters are effectively receiving each other's values. LightPosition was actually receiving CameraPosition's value, while CameraPosition was being set to (0, 0, 0), and so on. I noticed that if I merely change the order of some of these declarations at the top of the shader, the values get swapped differently. I'm still struggling to find an order that works correctly. Is there supposed to be a required order?

Do you use the name or the index when you set the value:
eff.Parameters["World"].SetValue…
or
eff.Parameters[0].SetValue… ?
I’ve also never heard of this problem before.

No order is required as long as you set values by name (or through an EffectParameter object).

When loading the shader, I used named params to get handles to all the EffectParameters for my shader, and then set them each frame with those.

I can try compiling with /Debug; maybe over-aggressive optimization is breaking something. I'm not very familiar with debugging shaders, though. Is there any other way the debug flag (or anything else) can help identify the issue? I can try to put together a minimal broken example, but it's hard to check every single parameter for correct values. So far I've been using crude multipliers so that the lighting only shows up if whichever parameter I'm testing equals a test value hardcoded in the shader.

I'd say post the draw code and shader.

Here’s the draw code and shader. It’s for a spot light, but the shader is in shambles at the moment because I had to tear it apart to isolate this issue. I can’t imagine the problem is in this code, unless some weird bug causes the shader to compile incorrectly rather than fail to compile outright, because, as I said, the behavior changes when I simply reorder the variable declarations at the top. The shader could certainly be simplified from its current state, but since this looks like a potential compiler bug, I haven’t touched it since I first saw the issue.

Draw code:

private void DrawSpotLights(IDictionary<Entity, IComponent> spotLights, IDictionary<Entity, IComponent> positions, RenderComponent renderComponent, CameraComponent camera, Matrix inverseViewProjection)
{
    spotLightEffect.View = camera.View;
    spotLightEffect.Projection = camera.Projection;
    spotLightEffect.InverseViewProjection = inverseViewProjection;
    spotLightEffect.CameraPosition = camera.Position;
    //spotLightEffect.GBufferTexture0 = renderComponent.GBufferTargets[0].RenderTarget;
    spotLightEffect.GBufferTexture1 = renderComponent.GBufferTargets[1].RenderTarget;
    spotLightEffect.GBufferTexture2 = renderComponent.GBufferTargets[2].RenderTarget;
    spotLightEffect.GBufferTextureSize = renderComponent.GBufferTextureSize;

    renderComponent.GraphicsDevice.SetVertexBuffer(spotLightGeometry.VertexBuffer, spotLightGeometry.VertexOffset);
    renderComponent.GraphicsDevice.Indices = spotLightGeometry.IndexBuffer;

    foreach (var entityLight in spotLights)
    {
        SpotLightComponent light = (SpotLightComponent)entityLight.Value;
        PositionComponent position = (PositionComponent)positions[entityLight.Key];
        float lightAngleCos = light.GetLightAngleCos();

        spotLightEffect.World = light.World;
        spotLightEffect.LightViewProjection = light.View * light.Projection;
        spotLightEffect.LightPosition = position.World.Translation;
        spotLightEffect.LightColor = light.Color;
        spotLightEffect.LightIntensity = light.Intensity;
        spotLightEffect.LightDirection = Vector3.Down;//position.World.Forward;
        spotLightEffect.LightAngleCos = lightAngleCos;
        spotLightEffect.LightHeight = light.FarPlane;
        spotLightEffect.Shadows = light.IsWithShadows;
        spotLightEffect.ShadowMapSize = light.ShadowMapResoloution;
        spotLightEffect.DepthPrecision = light.FarPlane;
        spotLightEffect.DepthBias = light.DepthBias;
        spotLightEffect.AttenuationTexture = light.AttenuationTexture;
        spotLightEffect.ShadowMap = light.ShadowMap;

        // Calculate cull mode
        Vector3 L = camera.Position - position.World.Translation;
        float SL = System.Math.Abs(Vector3.Dot(L, position.World.Forward));

        // If SL is within the LightAngle, draw the back faces, otherwise draw the front faces
        if (SL < lightAngleCos)
        {
            renderComponent.GraphicsDevice.RasterizerState = RasterizerState.CullCounterClockwise;
        }
        else
        {
            renderComponent.GraphicsDevice.RasterizerState = RasterizerState.CullClockwise;
        }

        spotLightEffect.Apply();

        // Draw
        renderComponent.GraphicsDevice.DrawIndexedPrimitives(PrimitiveType.TriangleList, 0, spotLightGeometry.StartIndex, spotLightGeometry.PrimitiveCount);
    }

    // Restore cull mode
    renderComponent.GraphicsDevice.RasterizerState = RasterizerState.CullCounterClockwise;
}

Shader code

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 InverseViewProjection;
float3 CameraPosition;
float4x4 LightViewProjection;
float3 LightPosition;
float4 LightColor;
float LightIntensity;
float3 S;
float LightAngleCos;
float LightHeight;
float2 GBufferTextureSize;
bool Shadows;
float ShadowMapSize;
float DepthPrecision;
// DepthBias for the Shadowing... (1.0f / 2000.0f)
float DepthBias;

// GBuffer Texture0
texture GBufferTexture0;
sampler GBuffer0 = sampler_state
{
	texture = <GBufferTexture0>;
	MINFILTER = LINEAR;
	MAGFILTER = LINEAR;
	MIPFILTER = LINEAR;
	ADDRESSU = CLAMP;
	ADDRESSV = CLAMP;
};

// GBuffer Texture1
texture GBufferTexture1;
sampler GBuffer1 = sampler_state
{
	texture = <GBufferTexture1>;
	MINFILTER = LINEAR;
	MAGFILTER = LINEAR;
	MIPFILTER = LINEAR;
	ADDRESSU = CLAMP;
	ADDRESSV = CLAMP;
};

// GBuffer Texture2
texture GBufferTexture2;
sampler GBuffer2 = sampler_state
{
	texture = <GBufferTexture2>;
	MINFILTER = POINT;
	MAGFILTER = POINT;
	MIPFILTER = POINT;
	ADDRESSU = CLAMP;
	ADDRESSV = CLAMP;
};

// Attenuation Cookie
texture AttenuationTexture;
sampler Cookie = sampler_state
{
	texture = <AttenuationTexture>;
	MINFILTER = LINEAR;
	MAGFILTER = LINEAR;
	MIPFILTER = LINEAR;
	ADDRESSU = CLAMP;
	ADDRESSV = CLAMP;
};

// ShadowMap
texture ShadowMapTexture;
sampler ShadowMap = sampler_state
{
	texture = <ShadowMapTexture>;
	MINFILTER = POINT;
	MAGFILTER = POINT;
	MIPFILTER = POINT;
	ADDRESSU = CLAMP;
	ADDRESSV = CLAMP;
};

struct VSI
{
	float4 Position : POSITION0;
};

struct VSO
{
	float4 Position : POSITION0;
	float4 ScreenPosition : TEXCOORD0;
};

VSO VS(VSI input)
{
	VSO output;

	// Transform Position
	float4 worldPosition = mul(input.Position, World);
	float4 viewPosition = mul(worldPosition, View);
	output.Position = mul(viewPosition, Projection);

	// Pass to ScreenPosition
	output.ScreenPosition = output.Position;

	return output;
}

// Manually Linear Sample
float4 manualSample(sampler Sampler, float2 UV, float2 textureSize)
{
	float2 texelpos = textureSize * UV;
	float2 lerps = frac(texelpos);
	float2 texelSize = 1.0 / textureSize;
	float4 sourcevals[4];
	sourcevals[0] = tex2D(Sampler, UV);
	sourcevals[1] = tex2D(Sampler, UV + float2(texelSize.x, 0));
	sourcevals[2] = tex2D(Sampler, UV + float2(0, texelSize.y));
	sourcevals[3] = tex2D(Sampler, UV + texelSize);

	float4 interpolated = lerp(lerp(sourcevals[0], sourcevals[1], lerps.x), lerp(sourcevals[2], sourcevals[3], lerps.x), lerps.y);
	return interpolated;
}

// Phong Shader
float4 Phong(float3 Position, float3 N, float radialAttenuation, float SpecularIntensity, float SpecularPower)
{
	// Calculate Light vector
	float3 L = LightPosition.xyz - Position.xyz;

	// Calculate height Attenuation
	float heightAttenuation = saturate(2.0f - length(L) / (LightHeight / 2));

	// Calculate total Attenuation
	float Attenuation = min(radialAttenuation, heightAttenuation) + 1;

	// Now Normalize the Light
	L = normalize(L);

	// Calculate L.S
	float SL = dot(L, S);

	// No asymmetric returns in HLSL, so work around with this
	float4 Shading = 0;

	// If this pixel is in the SpotLights Cone
	//if(SL <= LightAngleCos)
	//{
		// Calculate Reflection Vector
		float3 R = normalize(reflect(-L, N));

		// Calculate Eye Vector
		float3 E = normalize(CameraPosition - Position.xyz);

		// Calculate N.L
		float NL = dot(N, L);

		// Calculate Diffuse
		float3 Diffuse = NL * LightColor.xyz;

		// Calculate Specular
		float Specular = SpecularIntensity * pow(saturate(dot(R, E)), SpecularPower);

		// Calculate Final Product
		Shading = Attenuation * LightIntensity * float4(Diffuse.rgb, Specular);
	//}

	//Return Shading Value
	//return Shading;
		return Shading * saturate(sign(50 - LightPosition.y)) + (SL + LightAngleCos) * 0.001;// *saturate(sign(LightAngleCos - SL) + 1);
}

// Decoding of GBuffer Normals
float3 decode(float3 enc)
{
	return (2.0f * enc.xyz - 1.0f);
}

// Decode Color Vector to Float Value for shadowMap
float RGBADecode(float4 value)
{
	const float4 bits = float4(1.0 / (256.0 * 256.0 * 256.0), 1.0 / (256.0 * 256.0), 1.0 / 256.0, 1);
	return dot(value.xyzw , bits);
}

float4 PS(VSO input) : COLOR0
{
	// Get Screen Position
	input.ScreenPosition.xy /= input.ScreenPosition.w;

	// Calculate UV from ScreenPosition
	float2 UV = 0.5f * (float2(input.ScreenPosition.x, -input.ScreenPosition.y) + 1);// -float2(1.0f / GBufferTextureSize.xy);

	// Get All Data from Normal part of the GBuffer
	half4 encodedNormal = tex2D(GBuffer1, UV);

	// Decode Normal
	half3 Normal = decode(encodedNormal.xyz);

	// Get Specular Intensity from GBuffer
	float SpecularIntensity = encodedNormal.w;

	// Get Specular Power from GBuffer
	float SpecularPower = 128;// encodedNormal.w * 255;

	// Get Depth from GBuffer
	float Depth = tex2D(GBuffer2, UV).x;// manualSample(GBuffer2, UV, GBufferTextureSize).x;

	// Make Position in Homogenous Space using current ScreenSpace coordinates and the Depth from the GBuffer
	float4 Position = 1.0f;
	Position.xy = input.ScreenPosition.xy;
	Position.z = Depth;

	// Transform Position from Homogenous Space to World Space
	Position = mul(Position, InverseViewProjection);
	Position /= Position.w;

	// Calculate Homogenous Position with respect to light
	float4 LightScreenPos = mul(Position, LightViewProjection);
	LightScreenPos /= LightScreenPos.w;

	// Calculate Projected UV from Light POV
	float2 LUV = 0.5f * (float2(LightScreenPos.x, -LightScreenPos.y) + 1);

	// Load the Projected Depth from the Shadow Map, do manual linear filtering
	float lZ = manualSample(ShadowMap, LUV, float2(ShadowMapSize, ShadowMapSize)).r;

	// Get Attenuation factor from cookie
	float Attenuation = tex2D(Cookie, LUV).r + 1;

	// Assymetric Workaround...
	float ShadowFactor = 1;

	//// If Shadowing is on then get the Shadow Factor
	//if(Shadows)
	//{
	//	// Calculate distance to the light
	//	float len = max(0.01f, length(LightPosition - Position.xyz)) / DepthPrecision;

	//	// Calculate the Shadow Factor
	//	ShadowFactor = (lZ * exp(-(DepthPrecision * 0.5f) * (len - DepthBias)));
	//}

	// Return Phong Shaded Value Modulated by Shadows if Shadowing is on
	return ShadowFactor * Phong(Position.xyz, Normal, Attenuation, SpecularIntensity, SpecularPower);
}

technique Default
{
	pass p0
	{
		VertexShader = compile vs_3_0 VS();
		PixelShader = compile ps_3_0 PS();
	}
}
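(For anyone reading along: the RGBADecode helper above just reconstructs a float from four 8-bit channels via that dot product. Here's a quick numeric model of the weighting in Python, purely for illustration:)

```python
# Model of the shader's RGBADecode: dot(value, bits), where
# bits = (1/256^3, 1/256^2, 1/256, 1) and channels are floats in [0, 1].
def rgba_decode(r, g, b, a):
    return (r / (256.0 ** 3)) + (g / (256.0 ** 2)) + (b / 256.0) + a

# The alpha channel carries the most significant part of the depth:
assert rgba_decode(0, 0, 0, 0.5) == 0.5
# Each lower channel contributes at 1/256 of the previous channel's weight:
assert rgba_decode(0, 0, 0.5, 0) == 0.5 / 256.0
```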

The simplest and most obvious oddity I’ve found so far: if I move float3 CameraPosition; down three lines so it follows float3 LightPosition;, the light color (which is supposed to be red but is showing up green for some reason) changes to blue.

I’ve never tried or looked into intermediate files, so I don’t know if this is possible: do you have access to some intermediate file to see what’s going on with the parameters while they’re being built?
Some kind of *.obj file, but for shaders.

Not that I’m aware of. There may be more advanced options for 2MGFX that I don’t know about though.

In other news, I found another strange manifestation of this bug: when I change the line float multiplier = 1; to float multiplier = 1 - LightScreenPos.w;, MonoGame is unable to find any parameters except World, View, and Projection.

Maybe it’s gotten to the point where I start removing one line at a time until it starts working.

So I found the first case of a break in the shader code below. When I add LightPosition, LightColor changes from the desired (1, 0, 0, 1) to (0, 0, 0, 0). If I move LightPosition below LightColor, then the value of LightColor is correct. When I check the PS GLSL that’s being run by this pass, the two versions look identical except that the indices of the variables are switched. I can also see that their order in the Parameters array has been switched, so I don’t know why one order works and the other doesn’t. After the pass is applied, the LightColor vector appears to be in the correct location in the buffer in both cases; maybe it’s getting corrupted, or the GPU is reading from the wrong location despite the buffer and GLSL being synchronized.

The little if-statement at the bottom is how I’m testing LightColor, and multiplying all those other variables by 0.00001 just keeps them from being optimized away.
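(The dot-product trick works because pure red scores 1.0 while any leakage into green or blue subtracts from the score; a quick model in Python, for illustration:)

```python
# Model of the shader's red test: dot(LightColor, (1, -1, -1, 0)) > 0.9
def looks_red(color):
    r, g, b, a = color
    return (r * 1 + g * -1 + b * -1 + a * 0) > 0.9

assert looks_red((1, 0, 0, 1))      # the intended LightColor passes
assert not looks_red((0, 0, 0, 0))  # the corrupted value fails
assert not looks_red((0, 1, 0, 0))  # a shifted value fails too
```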

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 InverseViewProjection;
float4x4 LightViewProjection;
float3 LightPosition;
float4 LightColor;

// GBuffer Texture1
texture GBufferTexture1;
sampler GBuffer1 = sampler_state
{
	texture = <GBufferTexture1>;
	MINFILTER = LINEAR;
	MAGFILTER = LINEAR;
	MIPFILTER = LINEAR;
	ADDRESSU = CLAMP;
	ADDRESSV = CLAMP;
};

// GBuffer Texture2
texture GBufferTexture2;
sampler GBuffer2 = sampler_state
{
	texture = <GBufferTexture2>;
	MINFILTER = POINT;
	MAGFILTER = POINT;
	MIPFILTER = POINT;
	ADDRESSU = CLAMP;
	ADDRESSV = CLAMP;
};

// ShadowMap
texture ShadowMapTexture;
sampler ShadowMap = sampler_state
{
	texture = <ShadowMapTexture>;
	MINFILTER = POINT;
	MAGFILTER = POINT;
	MIPFILTER = POINT;
	ADDRESSU = CLAMP;
	ADDRESSV = CLAMP;
};

// Attenuation Cookie
texture AttenuationTexture;
sampler Cookie = sampler_state
{
	texture = <AttenuationTexture>;
	MINFILTER = LINEAR;
	MAGFILTER = LINEAR;
	MIPFILTER = LINEAR;
	ADDRESSU = CLAMP;
	ADDRESSV = CLAMP;
};

struct VSI
{
	float4 Position : POSITION0;
};

struct VSO
{
	float4 Position : POSITION0;
	float4 ScreenPosition : TEXCOORD0;
};

VSO VS(VSI input)
{
	VSO output;

	// Transform Position
	float4 worldPosition = mul(input.Position, World);
	float4 viewPosition = mul(worldPosition, View);
	output.Position = mul(viewPosition, Projection);

	// Pass to ScreenPosition
	output.ScreenPosition = output.Position;

	return output;
}

// Decoding of GBuffer Normals
float3 decode(float3 enc)
{
	return (2.0f * enc.xyz - 1.0f);
}

float4 PS(VSO input) : COLOR0
{
	// Get Screen Position
	input.ScreenPosition.xy /= input.ScreenPosition.w;

	// Calculate UV from ScreenPosition
	float2 UV = 0.5f * (float2(input.ScreenPosition.x, -input.ScreenPosition.y) + 1);

	// Get All Data from Normal part of the GBuffer
	half4 encodedNormal = tex2D(GBuffer1, UV);

	// Decode Normal
	half3 Normal = decode(encodedNormal.xyz);

	// Get Specular Intensity from GBuffer
	float SpecularIntensity = encodedNormal.w;

	// Get Specular Power from GBuffer
	float SpecularPower = 128;

	// Get Depth from GBuffer
	float Depth = tex2D(GBuffer2, UV).x;

	// Make Position in Homogenous Space using current ScreenSpace coordinates and the Depth from the GBuffer
	float4 Position = float4(input.ScreenPosition.xy, Depth, 1);

	// Transform Position from Homogenous Space to World Space
	Position = mul(Position, InverseViewProjection);
	Position /= Position.w;

	// Calculate Homogenous Position with respect to light
	float4 LightScreenPos = mul(Position, LightViewProjection);
	LightScreenPos /= LightScreenPos.w;

	// Calculate Projected UV from Light POV
	float2 LUV = 0.5f * (float2(LightScreenPos.x, -LightScreenPos.y) + 1);

	// Load the Projected Depth from the Shadow Map, do manual linear filtering
	float lZ = tex2D(ShadowMap, UV).r;

	// Get Attenuation factor from cookie
	float Attenuation = tex2D(Cookie, LUV).r + 1;




	float4 output = float4(0, 1, 0, 0);

	if (dot(LightColor, float4(1, -1, -1, 0)) > 0.9)
	{
		output = float4(1, 0, 0, 0);
	}

	return output + (SpecularIntensity + lZ + Attenuation + dot(float3(1,1,1), LightPosition)) * 0.00001;
}

technique Default
{
	pass p0
	{
		VertexShader = compile vs_3_0 VS();
		PixelShader = compile ps_3_0 PS();
	}
}

Could someone help me sanity check by testing this shader?

  1. Set proper World, View, and Projection matrices according to your camera and the geometry in the vertex buffer.
  2. Set LightColor to red (1, 0, 0, 1).
  3. Pass arbitrary textures for the samplers. The remaining parameters can default to 0.

Then apply and draw; the geometry should appear red if it’s working (LightColor is red) and green if it’s not (LightColor is not red). In my case it doesn’t work in this state, but simply moving the line float3 LightPosition; down one line, after float4 LightColor;, makes it work.

I can try to get a very simple project into a git repo later when I’m at my computer, if anyone would be more willing to test when it’s all preassembled.
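(A toy Python model of what I suspect could be happening, if the runtime were matching parameters to uniform locations by index instead of by name. This is pure speculation about the failure mode, not MonoGame's actual code:)

```python
# Hypothetical model: the effect file lists parameters in declaration
# order, but the compiled GLSL ends up with a different uniform order.
# If values were bound by index rather than by name, they would land
# on the wrong uniforms.
mgfx_order = ["CameraPosition", "LightPosition"]   # order in the .fx file
glsl_order = ["LightPosition", "CameraPosition"]   # order after compilation

values = {"CameraPosition": (5, 5, 5), "LightPosition": (0, 10, 0)}

# Binding by index silently swaps the two vectors:
bound = {glsl_order[i]: values[name] for i, name in enumerate(mgfx_order)}
assert bound["LightPosition"] == (5, 5, 5)    # got the camera's value
assert bound["CameraPosition"] == (0, 10, 0)  # and vice versa
```

That would match the original symptom of LightPosition receiving CameraPosition's value when the declaration order changes.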

Are you certain that your C# side code that sets those parameters is correct? You haven’t swapped a name or member somewhere by mistake or copy-pasta?


Change LightPosition to a float4. W = 1 is the equivalent of a 3D point (W = 0 is a 3D direction vector).

If that works, then it’s memory alignment. float3s don’t really exist in anything Khronos-related anymore (they’re all float4-sized; float3 is just an alias/cast), so if a UBO is used behind the scenes, everything is probably garbled depending on how MG maps things.
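(To illustrate what that padding does to offsets, here's a rough std140-style layout calculator in Python. It's simplified — real std140 has more rules for arrays and matrices — but it shows how a float3 aligns like a float4:)

```python
# Simplified std140-style offsets: a float3 aligns to 16 bytes like a
# float4, so any 16-byte type declared after it starts a new slot,
# while a lone scalar can still tuck into the float3's padding.
ALIGN_SIZE = {"float": (4, 4), "float3": (16, 12),
              "float4": (16, 16), "float4x4": (16, 64)}

def offsets(decls):
    out, cursor = {}, 0
    for name, ty in decls:
        align, size = ALIGN_SIZE[ty]
        cursor = (cursor + align - 1) // align * align  # round up to alignment
        out[name] = cursor
        cursor += size
    return out

layout = offsets([("CameraPosition", "float3"), ("LightColor", "float4")])
# LightColor cannot start at byte 12; it gets pushed out to 16:
assert layout["LightColor"] == 16
```

If a runtime instead packed the float3 tightly (12 bytes), every uniform after it would be read 4 bytes off.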

Direct uniform access without UBOs should still work as it used to unless your drivers are bugged.

Light color becoming green from a value-location shift would seem to indicate that’s the case, though the shift is in the wrong direction: shifting from (1,0,0,1) to (0,1,0,0) instead of (0,0,1,0) would point the finger at MonoGame instead of the drivers.

I checked several times and I’m fairly certain. Even if there was an issue with the parameter names being swapped or something, the output shouldn’t change when I just swap the variable declaration order.


Even with LightPosition as a float4/Vector4, everything is the same.

The green color was my own invention just to validate the test. I believe the LightColor is actually (0,0,0,0) when I’m seeing the issue, but I just force anything other than red to show up as green at the bottom of the pixel shader.


When I compile the effect with /Debug, which I think stops it from optimizing, everything is still the same.

UPDATE: It looks like the issue is related to the fact that I’m not using LightScreenPos.z. If I do use it (e.g. + LightScreenPos.z * 0.0001 anywhere), then it works regardless of the variable declaration order. Could this be some compiler optimization or buffer alignment bug?

MonoGame sets the DX compiler to the highest optimization level, and it’s known to be very aggressive about removing unused variables, so this is plausible. I’ve never heard of it cutting out part of a vector, though.

I have seen cases of that before in the GLSL that’s visible while debugging. I just find it hard to believe there’s a bug that somehow ties this to the order of variable declarations.

EDIT: I guess I’ve seen things like the use of part of a vector being skipped if you’re dotting it with 0, for example. I haven’t seen it actually removed in a way that screws up the alignment.

I doubt this has anything to do with alignment. It’s probably optimization or some scary glitch.

As I noted before, alignment issues should shift left ((1,0,0,1) -> (0,0,1,0)), not right ((1,0,0,1) -> (0,1,0,0)) as you were seeing. If something like touching a component of a vector fixes it, then it’s definitely not an alignment issue.
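(For concreteness, here's the left-vs-right shift argument as a tiny Python model of a uniform buffer read, illustrative only:)

```python
# A buffer holding LightColor = (1, 0, 0, 1). If the reader's offset is
# too LARGE by one float, components appear shifted left; if it is too
# SMALL (extra padding inserted before the value), they shift right.
buffer = [1.0, 0.0, 0.0, 1.0, 0.0, 0.0]  # color followed by zeroed space

def read_vec4(buf, offset):
    return tuple(buf[offset:offset + 4])

assert read_vec4(buffer, 0) == (1.0, 0.0, 0.0, 1.0)  # correct offset
assert read_vec4(buffer, 1) == (0.0, 0.0, 1.0, 0.0)  # reads too far: left shift

padded = [0.0] + buffer                               # value pushed one float later
assert read_vec4(padded, 0) == (0.0, 1.0, 0.0, 0.0)  # stale offset: right shift
```

Under-padding (the classic float3-vs-float4 mistake) produces the left shift; the right shift observed here would need the opposite error.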

I’m not too good with shader syntax, but…

Could it have something to do with him defining Position in his pixel shader?
It looks like it’s defined before input.Position is even used?

struct VSO
{
	float4 Position : POSITION0;
	float4 ScreenPosition : TEXCOORD0;
};

float4 PS(VSO input) : COLOR0
{

float4 Position = float4(input.ScreenPosition.xy, Depth, 1);

Open the xnb with Notepad when it’s working and when it’s not.
Save each and take a look at what’s different.

Position and input.Position are separate fields, and should work fine together.

As for the xnb, I’m not sure how to decode and understand it. The GLSL that’s loaded into the effect looks correct in both cases, but I’m not sure if there’s some other info in the xnb that could be screwing things up.

And AcidFaucent, I don’t think I’m actually seeing the right shift that you expect (you may have missed my note about that after you first mentioned it). The green color was my own artificial addition to make it obvious when something’s wrong.

My point is that there is some sort of redundancy going on.

VSO VS(VSI input)
{
	VSO output;

	...
	// ?
	output.Position = mul(viewPosition, Projection);
	// Pass to ScreenPosition
	// ?
	output.ScreenPosition = output.Position;

	return output;
}

struct VSO
{
	float4 Position : POSITION0;
	// ^^^ Is this used at all in the pixel shader? Why pass it?
	float4 ScreenPosition : TEXCOORD0;
};

float4 PS(VSO input) : COLOR0
{
	...
	// input.Position isn't used at all!
	float4 Position = float4(input.ScreenPosition.xy, Depth, 1);
	...

I would clean that up and test again.

Oh, I see the confusion. The vertex shader is required to output a field with the POSITION0 semantic so the rasterizer knows where the vertex goes, but in vs_3_0/ps_3_0 that output isn’t readable from the pixel shader, which is why it gets copied into a TEXCOORD as ScreenPosition. The Position variable in the pixel shader is a different, unrelated local. So you’re right that output.Position and output.ScreenPosition hold identical values; that duplication is how the pixel shader gets access to the clip-space position, and it’s the pattern the guide (and a few other places I’ve seen) use.