Problem in the shader compiler

Hi Guys,

Came across something you should be aware of.

I am working on a terrain system and the first time I ran it, the terrain was white.

After a bit of digging about I found the problem in the pixel shader.

It’s not finished, but should have worked.

My vertex shader outputs a structure

```
struct VS_OUTPUT
{
    float4 position : POSITION;
    float4 uv : TEXCOORD0;
    float4 worldPos : TEXCOORD1;
    float4 textureWeights : TEXCOORD2;
};
```

My pixel shader initially was …

```
float4 PixelShader(in float4 uv : TEXCOORD0, in float4 weights : TEXCOORD2) : COLOR
```

And I had white terrain, indicating the weights were wrong.

I changed it to …

```
float4 Pixel(VS_OUTPUT input) : COLOR
```

And all of a sudden my terrain was textured.

Do you use a bespoke shader compiler?

Parameters are passed from the vertex to the pixel shader stage by position, so you should always use the same input structure for the pixel shader as the output of the vertex shader. XNA uses fxc, which supports the Effect framework features like techniques, sampler states and semantics.
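To make the positional binding concrete, here's an illustrative sketch (based on Jjagg's explanation above, not on MG internals I've verified): if the pixel shader declares only a subset of the vertex shader's outputs, the inputs are matched slot-for-slot, not by semantic name.

```
struct VS_OUTPUT
{
    float4 position       : POSITION;   // slot 0
    float4 uv             : TEXCOORD0;  // slot 1
    float4 worldPos       : TEXCOORD1;  // slot 2
    float4 textureWeights : TEXCOORD2;  // slot 3
};

// Broken under positional binding: only two inputs are declared, so
// 'weights' occupies the slot where worldPos arrives, regardless of
// what the TEXCOORD2 semantic says.
float4 PixelShaderBroken(in float4 uv : TEXCOORD0,
                         in float4 weights : TEXCOORD2) : COLOR
{
    return weights; // receives worldPos data, not the weights
}

// Safe: the input struct mirrors the VS output slot-for-slot.
float4 PixelShaderSafe(VS_OUTPUT input) : COLOR
{
    return input.textureWeights; // the value you expected
}
```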

MG uses the DX11 compiler and a custom parser for the Effect framework features. The custom parser is pretty limited, so MG does not support all Effect framework functionality. The DX11 compiler binds parameters by position rather than by semantics. We’d need a better (more complete) parsing solution to be able to rewrite the parameters so they’re properly bound.

When you have a shader that compiles and looks like it should work, but doesn’t, the first step is to turn on verbose logging in the DirectX Control Panel (in VS that’s Debug -> Graphics -> DirectX Control Panel; you can also get to it via the DirectX SDK install).

It’ll scream at you for everything that you’re doing wrong (which will probably be a ton more than you think). Generally you’ll be able to figure out what’s wrong there right away.

If no dice, then it’s RenderDoc time to see what’s wrong with your render state / vertex|pixel outputs.


Hi Jjagg,

Is the source for the parser available?

I have two big problems with shaders in Monogame.

  1. Lack of semantics which you just pointed out.
  2. Lack of a debug mode

The second one is the really big one for me.

The problem is that when I am developing shaders I often comment out blocks of code and just output one particular part of the computation. For example, turn off the specular term and just output the diffuse term. Or just output the generated normal. You get the idea.

Unfortunately you cannot do this in MonoGame, as the optimiser removes everything that isn’t used. So the C# code blows up when it tries to set parameters on the shader. You can work around this, but it slows down development a lot, and it is dangerous because you can easily forget to tidy up the C# code after you are happy with the shader.
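Concretely (the parameter name here is hypothetical), the failure looks like a null lookup, assuming MonoGame’s parameter indexer returns null for names the optimiser stripped:

```
// The shader reads SpecularPower... until you comment that block out.
// The optimiser then strips the parameter from the compiled effect,
// the indexer returns null, and this throws a NullReferenceException:
effect.Parameters["SpecularPower"].SetValue(32.0f);
```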

I would be happy to take a look at this for you.

Cheers
Paul

Hi AcidFaucent,

It’s nothing to do with getting things wrong in the shader code, it is simply the problem Jjagg pointed out.

I use RenderDoc and Pixwin for complex tasks like foveated rendering, but for a simple shader that worked perfectly in XNA they are over the top.

Now I know what the problem is I can keep an eye out for it.

I also find that RenderDoc can’t always help. Getting it to actually work with a particular runtime/shader driver combination is like knitting fog.

Pixwin is even worse. It either works and is brilliant (apart from the fact that it simulates shader computations, and sometimes the simulation is wrong), or it won’t work at all.

The workaround I have used in the past is to make a helper extension method for ‘SetValue’ on the parameter, which first checks for a null parameter and, if so, skips the assignment.

If you use that everywhere in place of setting parameters directly, you can edit shaders to diagnose issues more freely.
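A minimal sketch of that kind of helper, assuming MonoGame’s `EffectParameterCollection` indexer returns null for parameters the optimiser stripped (the name in the usage line is illustrative):

```
using Microsoft.Xna.Framework.Graphics;

public static class EffectParameterExtensions
{
    // Skips the assignment when the parameter was optimised away,
    // instead of throwing a NullReferenceException.
    public static void TrySetValue(this EffectParameterCollection parameters,
                                   string name, float value)
    {
        EffectParameter p = parameters[name];
        if (p != null)
            p.SetValue(value);
    }
}

// Usage: safe even while the block that reads the parameter is commented out.
// effect.Parameters.TrySetValue("SpecularPower", 32.0f);
```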

Hi Dismiss,

Yes I am going to write a shader manager which has that sort of functionality for the actual game.

I am one of those guys that will throw together a new project to experiment with something rather than using existing code, so for me it would be better if I can get to the parser and add a setting that disables shader optimisation in debug mode.

I also find it is good practice to have this functionality for finding the really hard-to-track-down bugs caused by problems in the shader compiler.

If you’re using the pipeline/2MGFX then optimization is already disabled when set for debug.

If you mean unused symbol stripping, you can’t tell the HLSL/GLSL compiler to disable that - that’d be a train-wreck.