I'm having a lot of trouble passing a texture's resolution to a shader.

I’m porting a shader from WindowsDX to DesktopGL. In the WindowsDX version, I was able to get the width and height of the texture, but in the DesktopGL version, I apparently need to pass them in.

Alright, so I pass them in… and it's a mess. There are double images all over the place, indicating that it's probably sampling from the wrong spots and that the width and height are wrong.

I mostly use this information for applying blurs behind menus, so the texture size is the same about 80% of the time. If I simply hard-code those dimensions into the shader instead of passing them in with SetValue(…), everything works identically to the WindowsDX version.
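(For reference, the non-hardcoded path is just something like the following on the CPU side; blurEffect and sourceTexture are placeholder names here, and the float2 parameter the shader reads is what I'll call Resolution below.)

blurEffect.Parameters["Resolution"].SetValue(
    new Vector2(sourceTexture.Width, sourceTexture.Height));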

So the values I'm passing in are obviously not making it into the shader unaltered.

Is there any way to see what value is actually making it into the shader and why? Is there any way to debug this? Most of the shader debugging tools I've found don't seem to work with MonoGame.

If you know the values, then just add a compare:

if (inputvalue == 0.5)
    return float4(1, 1, 1, 1);
else
    return float4(1, 0, 0, 1);

The values passed should be the same in both; you could do what Stainless said to test the value that actually arrives. I would suspect something else is going on. I know WinDX and DesktopGL have slight differences in what you can get away with when specifying certain things, e.g. shader in/out params.
Certain things may seem completely unrelated to the problem, but in my experience, if the planets aren't aligned at the stroke of midnight, strange things can occur. If I could see the code, I might be able to spot something else that's causing the strange behavior.

I’m so confused. If I do this (512 is the expected value):

<code using Resolution.y>
if (Resolution.y == 512)
{
    result += float4(0, 0.1, 0, 0);
}
return result;

It’s broken, but tints the output green, indicating that the value is 512 but that the code is incorrect. But if I do this:

<exact same code hard coded to 512 instead of Resolution.y>
if (Resolution.y == 512)
{
    result += float4(0, 0.1, 0, 0);
}
return result;

It works and tints the output green, indicating that the value is 512 and that the code is correct, but it only works when the value is hardcoded?

I’ve also rewritten the code several times and it still happens.

Edit:

I think I figured it out, although I don’t know why it manifested in such a weird way.

Although DrawUserIndexedPrimitives(…) copies the effect data buffer to OpenGL, calling SetValue(…) on an effect parameter doesn't actually copy the value into that effect data buffer. As far as I can tell, the only way to get it there is to call Apply(), which I had already done.
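To make the mechanics concrete, here's a minimal sketch of how the pieces fit together as I understand them; Resolution is the float2 parameter from the snippets above, while blurEffect, vertices, and indices are placeholder names for this post:

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

void DrawWithResolution(GraphicsDevice device, Effect blurEffect, Texture2D source,
                        VertexPositionTexture[] vertices, short[] indices)
{
    // SetValue(...) only stores the value on the managed side of the effect.
    blurEffect.Parameters["Resolution"].SetValue(
        new Vector2(source.Width, source.Height));

    foreach (EffectPass pass in blurEffect.CurrentTechnique.Passes)
    {
        // Apply() is what gets the parameter data to the shader, so it has to
        // run after SetValue(...) and before the draw call that uses it.
        pass.Apply();

        device.DrawUserIndexedPrimitives(
            PrimitiveType.TriangleList,
            vertices, 0, vertices.Length,
            indices, 0, indices.Length / 3);
    }
}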