Working on first shader, but getting weird effects

I’ve been running through shader tutorials over the last few days, and I’ve hit an issue I can’t explain. I’m sure I’m just missing something as a newbie to this stuff. This is a 2D game, so there’s no projection going on.

The goal of this shader is to take each section of a tile (which I’ve divided up mathematically into a top/bottom border, a left/right border, a corner, and a main tile section) and gradually blend it away (alpha-wise) towards the edges. The “style” integer controls which blending configuration is used. Style 4 makes no change to the tile’s alpha and is used for the middle section, so I figured it would be a good test value, since it should just draw every section at full alpha.

I’m not sure if I’m calling the effect incorrectly or what, but when I use the shader I wrote, it draws a very strange result, and if I use the default .fx file (what you get if you create a new effect in the MonoGame Content Pipeline and run it unmodified), I get a black screen.

Here’s the code for my shader (hopefully the formatting is readable):

int style;
sampler s0;
sampler Sampler0 : register(s0) { };

float4 PixelShaderFunction(float2 coords: TEXCOORD0) : COLOR0
{
	float4 color = tex2D(s0, coords);
	
	if (style == 0)
	{
		if (coords.x < coords.y) color.a = coords.x;
		else color.a = coords.y;
	}
	else if (style == 1) color.a = coords.y;
	else if (style == 2)
	{
		if (coords.y < 1 - coords.x) color.a = coords.y;
		else color.a = 1 - coords.x;
	}
	else if (style == 3) color.a = coords.x;
	else if (style == 5) color.a = 1 - coords.x;
	else if (style == 6)
	{
		if (coords.x < 1 - coords.y) color.a = coords.x;
		else color.a = 1 - coords.y;
	}
	else if (style == 7) color.a = 1 - coords.y;
	else if (style == 8)
	{
		if (1 - coords.x < 1 - coords.y) color.a = 1 - coords.x;
		else color.a = 1 - coords.y;
	}
			
	return color;
}

technique Technique1
{
	pass Pass1
	{
		PixelShader = compile ps_4_0_level_9_1 PixelShaderFunction();
	}
}

And here’s how I’m calling it:

GameDataManager.BeginNormalEffectBatch(testBlend);
testBlend.Parameters["style"].SetValue(4);
testBlend.CurrentTechnique.Passes[0].Apply();

Which hits:

public static void BeginNormalEffectBatch(Effect effect)
{
    if (debugMode)
    {
        NormalBatchesThisFrame++;
        EffectBatchesThisFrame++;
    }
    Batch.Begin(sortMode: SpriteSortMode.Immediate, blendState: BlendState.AlphaBlend, samplerState: SamplerState.PointClamp, effect: effect);
}

…and then (after some math) the drawing happens:

int variant = screenTiles.Tiles[x, y].Variant;
GameDataManager.Batch.Draw(testTiles2.Texture, new Rectangle(point1, size1), testTiles2.GetTileRectBorderCorner(id, variant), Color.White);
GameDataManager.Batch.Draw(testTiles2.Texture, new Rectangle(point2, size2), testTiles2.GetTileRectBorderHorizontal(id, variant), Color.White);

When I run this, I get something that looks like it’s drawing the tiles as requested, but they aren’t textured; rather, they seem to use only the single pixel at the top left of the texture section I’ve provided, or something similar, because the output is reminiscent of an Intellivision or Atari game:

The tiles are drawn with my shader, but the “Character” test image (a placeholder for where the player’s animated character will appear) is drawn without. Here’s what the tiles look like when drawn without the shader, using a SpriteBatch with no “effect”:

Any help would be greatly appreciated…I don’t know anyone who is knowledgeable on this subject :)

That sounds like the texture coordinates are all zero. You can visualize the texture coordinates by returning them from the pixel shader.

float4 PixelShaderFunction(float2 coords: TEXCOORD0) : COLOR0
{
    return float4(coords, 0, 1);
}

You should see a gradient. If the output is all black, the coordinates are zero.

I think your issue is that you’re using s0 when calling tex2D instead of Sampler0, which is bound to a resource via the register keyword. Also, s0 is a default resource binding in HLSL if I’m not mistaken, so there might be some weirdness with declaring your own sampler named s0.

I believe you should get rid of your s0 definition and just use:

sampler Sampler0 : register(s0);
float4 color = tex2D(Sampler0, coords);

I’ve made some progress after reading some more tutorials. The problem definitely seems to be the texture coordinates.

If I run this basic debug shader:

#if OPENGL
	#define SV_POSITION POSITION
	#define VS_SHADERMODEL vs_3_0
	#define PS_SHADERMODEL ps_3_0
#else
	#define VS_SHADERMODEL vs_4_0_level_9_1
	#define PS_SHADERMODEL ps_4_0_level_9_1
#endif

Texture2D SpriteTexture;

sampler2D SpriteTextureSampler = sampler_state
{
	Texture = <SpriteTexture>;
};

struct VertexShaderOutput
{
	float4 Position : SV_POSITION;
	float4 Color : COLOR0;
	float2 TextureCoordinates : TEXCOORD0;
};

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR
{
	float4 color = tex2D(SpriteTextureSampler, input.TextureCoordinates); // * input.Color;
	color.rg = input.TextureCoordinates;
	color.b = 0;
	return color;
}

technique Technique1
{
	pass Pass1
	{
		PixelShader = compile PS_SHADERMODEL PixelShaderFunction();
	}
}

I get this:

If I delete the lines altering the color channels from the test shader above, it draws the textured tiles as normal. It’s just the alpha calculation that isn’t working. :(

I’ve tried a bunch of stuff and not sure what I’m doing wrong here.

Here’s my Batch.Begin:

I can’t really understand why the TextureCoordinates are coming in as a single value across the whole area being drawn…

I’m so close! Just need to figure out this alpha issue :) And yes, the texture PNG in use definitely has an alpha channel (I checked).

Since you guys are nice enough to help me out, as a thank you here’s a link to a song from the game…enjoy :)
https://soundcloud.com/taien/lepidoptera-1-final

Try using the COLOR0 semantic instead of COLOR:

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0

I don’t think you should apply the effect manually. SpriteBatch handles that internally. You are passing the effect as a parameter to SpriteBatch.Begin after all.

Yeah sorry, that part in the original post is older :) I removed that line since I now pass the effect into the Begin call. I will try COLOR0 and see if that helps.

Edit: Tried changing the semantics to match (COLOR0) and there was no change. Same with using COLOR in both places. I’m scratching my head…

Edit 2: Basically, all I’m doing is starting the batch and passing in the test effect from my previous post, drawing tiles, and ending the batch, but the TextureCoordinates seem to be wonky. Does it possibly have something to do with one of the other Batch options I chose?

Am I just not understanding how TextureCoordinates work? I am under the impression they are supposed to be between 0 and 1 as a float, similar to the colors…

Edit: I’m having a lot of trouble locating good, current resources on how to do this sort of thing. If any of you have some relevant links you can share, I’d really appreciate it. Most of the tutorials I’m finding are either extremely outdated or simply don’t work in practice. Even the default shader created by MonoGame’s Content Pipeline Tool doesn’t work. Because of this, I’m having a lot of trouble trying to follow another person’s example…there seems to be a lot of disagreement as to how the signature of the shader function should look, and whether it takes a VertexShaderOutput or just a color and texture coordinates as arguments. I’m also not sure what variables I actually have the option of accessing through the VertexShaderOutput, because I can’t seem to find a list anywhere. So yeah, any relevant links would be great…Google is failing me :(

Out of curiosity, if you hardcode color.a = 0.5f (or whatever number) do you see what you would expect? That should help identify if it’s a problem with the actual alpha calculation in the shader or a more global setting in the graphics device.

It’s always possible it’s an issue with premultiplied alpha. I had quite a few issues with that when I was starting my current project. You can try setting the blend state to BlendState.NonPremultiplied instead of AlphaBlend and see if anything changes.
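
For example, in your BeginNormalEffectBatch, just swap the blend state (everything else as you already have it):

Batch.Begin(sortMode: SpriteSortMode.Immediate, blendState: BlendState.NonPremultiplied, samplerState: SamplerState.PointClamp, effect: effect);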

The song is quite catchy, by the way.

I tried that as well, and it doesn’t seem to have any appreciable impact on the drawing. I’ll show you the output from color.a = 0.5, so you may be right.

One sec.

Glad you liked the song :) There are a bunch of other ones posted on my SoundCloud as well if you get bored.

So if I use color.a = 0.5, I get the following:

If I just return color without any changes, I get this:

They both seem to be drawn with full alpha, except the first one almost seems to be using additive math for the colors that are overlapping, even though I specified AlphaBlend. I’m baffled. I’ll try setting NonPremultiplied and see what happens.

The input of the pixel shader must be the same as the output of the vertex shader. I’m not 100% certain on this, but I believe MonoGame’s default vertex shader outputs what you currently have, so you should be good with the inputs. I have shaders set up identically that work with no issues.

I recommend testing an empty game state with just your texture on the screen. If that looks good, then apply a very simple shader like @kgambill suggested. There may be something off with your graphics device states.
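
Something like this is all I mean (a minimal sketch, using the batch, texture, and effect names from your earlier posts, drawing the whole texture with no source rect):

GameDataManager.Batch.Begin(sortMode: SpriteSortMode.Immediate, blendState: BlendState.NonPremultiplied, samplerState: SamplerState.PointClamp, effect: testBlend);
GameDataManager.Batch.Draw(testTiles2.Texture, Vector2.Zero, Color.White);
GameDataManager.Batch.End();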

Ok, so I tried using color.a = 0.5 with NonPremultiplied, and that seemed to draw it correctly (half alpha):

Then I tried using color.a = input.TextureCoordinates.x and I got this:

The problem has GOT to be the TextureCoordinates. Is it because of the type of Draw call I’m using?

Here’s the method called in that Draw line (it’s inside the TileSheet class I wrote, which stores the texture and info on the tile sizes):

Variations is an int defining how many variations each tile on the tilesheet has. So if it’s 4, every tile on the tilesheet gets 4 cells with a slightly different graphic (to keep the textures from looking repetitive). I’m reasonably certain my math there is correct (it’s supposed to be getting the lower-right sixteenth of a tile, depending on its size, which by default is 64x64, so the sixteenth in this case is 16x16).

TileWidth is the width of one tile, so in this case, 64. Same with TileHeight.

Width is the width of the entire texture. Same with Height. In this case they are both 1024.
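
From memory, the math in that method is roughly along these lines (simplified, not the exact code from the screenshot, and the sheet layout shown here is an approximation):

public Rectangle GetTileRectBorderCorner(int id, int variant)
{
    // Assumed layout: each tile id occupies 'Variations' consecutive 64x64 cells, row by row.
    int cellsPerRow = Width / TileWidth;              // 1024 / 64 = 16
    int cell = id * Variations + variant;
    int cellX = (cell % cellsPerRow) * TileWidth;
    int cellY = (cell / cellsPerRow) * TileHeight;

    // The corner piece is the lower-right sixteenth of the cell (16x16 out of 64x64).
    int pieceW = TileWidth / 4;
    int pieceH = TileHeight / 4;
    return new Rectangle(cellX + TileWidth - pieceW, cellY + TileHeight - pieceH, pieceW, pieceH);
}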

When you say there may be something off with my GraphicsDevice states, what exactly do you mean? I’m using the GraphicsDevice that’s provided by MonoGame at program start, but I’m not super familiar with how to mess with its state.

SpriteBatch internally changes the graphics device’s state when you call Begin. This includes the BlendState and the shader used. It needs to do this, so that’s not an issue. What I suspected when I mentioned that is that the state is unintentionally changing at some point in between your drawing calls.

What happens if you multiply the pixel’s color by 0.5 instead of setting color.a to 0.5 with AlphaBlend? The first output you linked looks like premultiplied alpha at work. If that looks good, then try multiplying the output color by input.TextureCoordinates.x.
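
In shader terms, something like this in your pixel shader (with premultiplied alpha you fade by scaling the whole color, not just the alpha channel):

// Instead of: color.a = 0.5;
color *= 0.5;

// And for the coordinate test:
color *= input.TextureCoordinates.x;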

Multiplying the entire color by 0.5 with AlphaBlend produces the same output as setting color.a to 0.5 with NonPremultiplied.

Multiplying the entire color by input.TextureCoordinates.x produces the same output as setting color.a to input.TextureCoordinates.x.

The texture coordinates still seem to be off :( I’m so lost, lol.

What are you expecting the texture coordinates to be? For instance, if all of the tiles are in the same texture and one tile is 1/4 the entire texture vertically and horizontally, the first tile will start at (0,0) and end at (.25,.25). If your tiles are in different textures then you will need to either account for the size of each texture or render tiles in different textures in separate batches. I’d suggest the latter due to improved performance and fewer complications with handling different texture sizes in the shader.

I think the next step would be to omit the tile drawing for now and keep it as simple as possible so it’s easier to diagnose the issue. Simply draw the entire texture at once and see what happens.
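
To make the coordinate issue concrete, here are some rough numbers (the source rect position is made up; the texture size is from your posts):

// Hypothetical 16x16 source rect at (48, 48) inside the 1024x1024 sheet:
Rectangle src = new Rectangle(48, 48, 16, 16);
float uMin = src.Left   / 1024f; // 0.046875
float uMax = src.Right  / 1024f; // 0.0625
float vMin = src.Top    / 1024f; // 0.046875
float vMax = src.Bottom / 1024f; // 0.0625
// Across that piece, coords.x only varies from about 0.047 to 0.063, so expressions
// like 1 - coords.x stay close to 1 and the fades in your shader barely show up.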

Give me a little time. You’ve revealed something here I wasn’t aware of. I assumed the texture coordinates were related to the section of the texture being drawn, not the entire texture. So it looks like I need to do some more math. Haha.

Edit: In other words, I thought that when I provided a sourceRect to the draw function, the texture coordinates were how far across that sourceRect we were, not how far across the entire texture :P
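
My plan is to pass the source rect’s bounds into the effect with something like testBlend.Parameters["sourceUV"].SetValue(new Vector4(...)) and remap inside the shader. An untested sketch (sourceUV is just a placeholder name):

float4 sourceUV; // x,y = top-left of the source rect in UV space, z,w = its size in UV space

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
	float4 color = tex2D(SpriteTextureSampler, input.TextureCoordinates);
	// Remap whole-texture coordinates to 0..1 across the piece being drawn
	float2 local = (input.TextureCoordinates - sourceUV.xy) / sourceUV.zw;
	color *= local.x; // e.g. fade towards the left edge of the piece
	return color;
}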

I’m pretty close to solving it now thanks to your help with that texture coordinate info. Here’s my latest test :)