I’m porting a shader from WindowsDX to DesktopGL. In the WindowsDX version, I was able to get the width and height of the texture, but in the DesktopGL version, I apparently need to pass them in.
Alright, so I pass them in… and it’s a mess. There are double images all over the place, which suggests the shader is sampling from the wrong spots because the width and height values it’s using are wrong.
I mostly use this information for applying blurs behind menus, so the texture size is the same about 80% of the time. As a test, I hard-coded those dimensions into the shader instead of passing them in with SetValue(…), and everything works identically to the WindowsDX version.
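For context, my setup looks roughly like this (the parameter and variable names here are simplified placeholders, not my exact code):

```csharp
// HLSL side (.fx file): a float2 uniform for the texture dimensions,
// used to compute the size of one texel for the blur taps:
//
//   float2 TextureSize;
//   ...
//   float2 texel = 1.0 / TextureSize;
//
// C# side, before drawing with the effect:
Vector2 texSize = new Vector2(sourceTexture.Width, sourceTexture.Height);
blurEffect.Parameters["TextureSize"].SetValue(texSize);
```

With the SetValue call, DesktopGL produces the doubled images; with `TextureSize` replaced by hard-coded constants in the .fx file, the output is correct.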
So the values I’m passing in are evidently being altered somewhere before they reach the shader.
Is there any way to see what value is actually making it into the shader, and why? Is there any way to debug this? Most of the shader debugging tools I’ve found don’t seem to work with MonoGame.