Obtaining accurate single-pixel texture coordinates for shaders

I’m creating a simple outline shader to render on sprites in a spritesheet. I’m sampling the color of the current pixel and all neighboring pixels to decide whether to apply the outline color or not.

Here’s my shader:

#if OPENGL
    #define SV_POSITION POSITION
    #define VS_SHADERMODEL vs_3_0
    #define PS_SHADERMODEL ps_3_0
#else
    #define VS_SHADERMODEL vs_4_0_level_9_3
    #define PS_SHADERMODEL ps_4_0_level_9_3
#endif

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR0;
    float2 TextureCoordinates : TEXCOORD0;
};

sampler s0 = sampler_state { AddressU = Clamp; AddressV = Clamp; };
float2 texelSize;
float4 outlineColor;

float4 Outline(VertexShaderOutput input) : COLOR
{
    float4 color = tex2D(s0, input.TextureCoordinates);

    if (color.a == 0)
    {
        float4 colorUp = tex2D(s0, input.TextureCoordinates - float2(0, texelSize.y));
        float4 colorDown = tex2D(s0, input.TextureCoordinates + float2(0, texelSize.y));
        float4 colorLeft = tex2D(s0, input.TextureCoordinates - float2(texelSize.x, 0));
        float4 colorRight = tex2D(s0, input.TextureCoordinates + float2(texelSize.x, 0));

        if (colorUp.a != 0 || colorDown.a != 0 || colorLeft.a != 0 || colorRight.a != 0)
        {
            color.rgba = outlineColor;
        }
    }

    return color;
}

technique BasicColorDrawing
{
    pass P0
    {
        PixelShader = compile PS_SHADERMODEL Outline();
    }
};

I stepped through the logic manually with pencil and paper, but I couldn’t find any errors. I suspect that the texelSize parameter is incorrect; this is what I pass into the shader:

Vector2 texelSize = new Vector2((float)(1 / (double)playerSprite.Tex.Width), (float)(1 / (double)playerSprite.Tex.Height));

My spritesheet is 764x1358, so the values come out to 0.00130890052356020942408376963351 and 0.00073637702503681885125184094256259 before they’re converted to floats. After conversion, they come out to 0.001308901 and 0.000736377 respectively, which is a big loss in precision. I tried enlarging my spritesheet with empty space to make it 1024x2056, and while this produced overall better results after the float conversion, there were still some inaccuracies. I suspect that these inaccuracies are causing my outline to be offset up and to the left by a pixel.

I also tried passing in the texture dimensions and calculating the single pixel texture coordinate from within the shader, but that produced the same results.

Is there a way to get texture coordinate values that correspond exactly to one pixel for larger textures?

Hi!
Have you tried PS_SHADERMODEL ps_5_0 instead of PS_SHADERMODEL ps_4_0_level_9_3?
That should make you use DX11, which, if I remember correctly, has better accuracy on texture sampling (Diablo 3 in DX9 mode is less accurate than in DX11, for example).

You may also get better results with a Sobel effect to determine the outlines, instead of reinventing the wheel.

Do I need to use this method with DX11 for more accurate sampling? From what I gathered, if you pass 1 to the optional offset parameter, it samples at a one-pixel offset in texture coordinates?

If that is the case, how can I get the current texture being sampled from the sampler? MonoGame automatically puts in the texture to sample, but I don’t know how to get it from the sampler state.
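For reference, this is the sort of thing I had in mind from the HLSL docs (just a sketch; I haven’t verified that SpriteBatch actually binds to these registers):

// Declare the texture and sampler explicitly instead of relying on the
// implicit sampler. SpriteBatch should bind its texture/sampler to slot 0.
Texture2D SpriteTexture : register(t0);
SamplerState SpriteSampler : register(s0);

// ps_4_0 and up: Sample takes an optional compile-time integer offset,
// measured in whole texels, so no texelSize math is needed.
float4 colorUp = SpriteTexture.Sample(SpriteSampler, input.TextureCoordinates, int2(0, -1));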

You mention that the outline is offset by a pixel, but I don’t think this is the cause. The difference between 0.00130890052356020942408376963351 and 0.001308901 is -4.764397905759162e-10, in other words, the tiniest fraction of a texel.

What about the code that renders the sprite? Are you able to show some screenshots of the output?

You could try to offset by a half pixel/texel, or round/floor/ceil the values.
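For example, something like this at the top of the pixel shader would snap the incoming coordinate to the center of its texel before any sampling (untested sketch, assuming texelSize is 1 / texture dimensions):

// Snap the UV to the nearest texel center, so the neighbor lookups land
// exactly one texel away instead of straddling a texel boundary.
float2 uv = (floor(input.TextureCoordinates / texelSize) + 0.5) * texelSize;
float4 color = tex2D(s0, uv);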

It would also help to downscale/upscale the texture dimensions to a power of two to avoid rounding errors in the float values. Scale it up to 1152x2048 and then pad the rest with transparent color to get 2048x2048.

Here’s the spritesheet I’m using as a test (I changed it so the background is transparent). Here is the result of the outline. I confirmed that the pixels around the sprite were transparent.

Here is what I’m setting in the shader:

Effect outline = AssetManager.Instance.LoadAsset<Effect>($"{ContentGlobals.ShaderRoot}Outline");

Vector2 texelSize = new Vector2((float)(1 / (double)playerSprite.Tex.Width), (float)(1 / (double)playerSprite.Tex.Height));

outline.Parameters["outlineColor"].SetValue(new Vector4(1f, 1f, 1f, 1f));
outline.Parameters["texelSize"].SetValue(texelSize);

spriteRenderer = new SpriteRenderer(transform, playerSprite);
spriteRenderer.Shader = outline;
renderer = spriteRenderer;

And my code to render:

public override void Render()
{
    if (TransformData == null || SpriteToRender == null)
        return;
            
    RenderingManager.Instance.DrawSprite(SpriteToRender.Tex, TransformData.Position, SpriteToRender.SourceRect, TintColor, TransformData.Rotation, SpriteToRender.GetOrigin(), TransformData.Scale, FlipData, Depth);
}

Which does:

public void DrawSprite(Texture2D tex, Vector2 position, Rectangle? sourceRect, Color color, float rotation, Vector2 origin, Vector2 scale, SpriteEffects spriteEffects, float depth)
{
    CurrentBatch.Draw(tex, position, sourceRect, color, rotation, origin, scale, spriteEffects, depth);
}

@nkast I will try your suggestions!

I should also mention that I tried this with a box texture and it worked fine. I’m not sure what it is about more complex sprites or spritesheets that causes it not to work.

I don’t know much about shaders, but when using pixel art, it’s always better to use POT sizes for the textures.

The reason is that when you’re using 764 as the width and you’re addressing a pixel, you’re using 1/764, which is probably not representable in IEEE 754 without data loss.

However, using 1024, addressing a pixel is 1/1024, which in IEEE 754 translates into mantissa = 1 and exponent = -10 (no data loss).

The data loss of 1/X may or may not be significant (this probably depends on the shader, the project, or the number X), but you can be sure you have no data loss at all with 1/1024.
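To make that concrete:

1/1024 = 2^-10, which a float stores exactly (mantissa 1, exponent -10)
1/764 = 0.0013089005235602..., whose binary expansion never terminates, so it rounds to the nearest float, about 0.001308901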


Update: I tried with a 2048x2048 texture and no luck. I tried rounding and offsetting as well, but I kept getting the same results as in the screenshot I linked above.

I think the sampled pixel gets blended with the surrounding pixels.
In the shader, try comparing with (color.a < 1.0) instead of (color.a == 0); note that sampled alpha is normalized to 0..1 in the shader, not 0..255.
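In your shader that would look something like this (sketch):

float4 color = tex2D(s0, input.TextureCoordinates);

// Filtering can blend edge texels, leaving alpha strictly between 0 and 1,
// so test against "fully opaque" instead of "fully transparent".
if (color.a < 1.0)
{
    // ...neighbor sampling as before...
}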

I wrote an outline shader a while back for the project I’m working on, but ended up deciding not to use it. The outline just didn’t look as good as I wanted, and it was easier to create a highlighted version of the sprite in Paint.NET/Photoshop and use that instead. My sprites are simple though, and not animated, so it was easy for me to do this.

My shader code looks very similar to yours, but I didn’t notice this issue. I went the route of passing in the texture size and calculating the texel size in the shader routine. I also have everything as floats, and I notice that you’ve got a couple of double-to-float conversions in your code.

Feel free to give mine a shot; I suspect you know how to pass in the parameters, so it should be easy plug-and-play for you. Oh, another thing to keep an eye out for is the SamplerState you pass into your sprite batch. If you’re seeing blurry results, make sure you have it set to PointClamp… at least for the pass that draws the outlined texture.

Anyway…

#if OPENGL
	#define SV_POSITION POSITION
	#define VS_SHADERMODEL vs_3_0
	#define PS_SHADERMODEL ps_3_0
#else
	#define VS_SHADERMODEL vs_4_0_level_9_1
	#define PS_SHADERMODEL ps_4_0_level_9_1
#endif

Texture2D SpriteTexture;
float2 xTextureSize : VPOS;
float4 xOutlineColour = float4(128.0 / 255.0, 0, 0, 1); // colour channels are 0..1 in the shader, not 0..255

sampler2D InputSampler = sampler_state
{
	Texture = <SpriteTexture>;
};

struct VertexShaderOutput
{
	float4 Position : SV_POSITION;
	float4 Color : COLOR0;
	float2 UV : TEXCOORD0;
};

float4 MainPS(VertexShaderOutput input) : COLOR
{
	float4 currentPixel = tex2D(InputSampler, input.UV) * input.Color;
	float4 output = currentPixel;

	if (currentPixel.a == 0.0f)
	{
		float2 uvPix = float2(1 / xTextureSize.x, 1 / xTextureSize.y);

		if (false
			|| tex2D(InputSampler, float2((1 * uvPix.x) + input.UV.x, (0 * uvPix.y) + input.UV.y)).a > 0
			|| tex2D(InputSampler, float2((0 * uvPix.x) + input.UV.x, (1 * uvPix.y) + input.UV.y)).a > 0
			|| tex2D(InputSampler, float2((-1 * uvPix.x) + input.UV.x, (0 * uvPix.y) + input.UV.y)).a > 0
			|| tex2D(InputSampler, float2((0 * uvPix.x) + input.UV.x, (-1 * uvPix.y) + input.UV.y)).a > 0
		)
		{
			output = xOutlineColour;
		}
	}

	return output;
}

technique SpriteOutline
{
	pass P0
	{
		PixelShader = compile PS_SHADERMODEL MainPS();
	}
};
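One more sampler note: if you want the point filtering guaranteed regardless of what the sprite batch sets, I believe MGFX also lets you declare it on the sampler state itself (writing this from memory, so double-check the exact property names):

sampler2D InputSampler = sampler_state
{
	Texture = <SpriteTexture>;
	MinFilter = Point;   // no blending between neighbouring texels
	MagFilter = Point;
	MipFilter = Point;
	AddressU = Clamp;    // don't wrap samples past the sheet edges
	AddressV = Clamp;
};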

I tried yours and nkast’s suggestions, and I even copied and pasted your code and it still looks incorrect. I’m sending the texture size into the shader like so:

outline.Parameters["xTextureSize"].SetValue(new Vector2(playerSprite.Tex.Width, playerSprite.Tex.Height));

I even tried another spritesheet and the other form of sampling and still nothing:

float4 color = SpriteTexture.Sample(s0, input.TextureCoordinates);

Would the GPU matter at all? I have an NVIDIA GeForce GTX 1070 for reference.

I’m curious. If you changed

if (colorUp.a != 0 || colorDown.a != 0 || colorLeft.a != 0 || colorRight.a != 0)

to

if (colorDown.a != 0 || colorUp.a != 0 || colorRight.a != 0 || colorLeft.a != 0)

would the outline be offset to the lower right instead?

Also, is that the actual texture you are using for the spritesheet? Are you using a mask colour? That PNG has no alpha. If you generated the alpha yourself, make sure the alpha values around the character are actually zero and not one or two.

I just tried it now, and the outline was still the same. I used that texture, but I changed the background to transparent before importing it. I confirmed in Photoshop that it was transparent and even deleted the empty pixels around just in case and re-exported it, but still nothing.

Is anyone else getting the same results with this shader? I’m not sure what’s going on. :confused:

Can you send me the modified sprite sheet with alpha? I have a healthy distrust of Photoshop born from years of dealing with its quirks, especially when it comes to alpha and how it loves to blend or smooth edges.


It could, but I have pretty much the same card.

I’m going to do a test application and see if I can reproduce your results. I’ll just extract your sprite from the image you posted above. Gimme a bit here.

Ok, so I’ve created a test program and I’m actually not seeing any issues at all. I just pulled your sprite out of your screenshot and erased the appropriate pixels, making sure it had at least two pixels of space on every side. This is also using the shader I posted above. I even added code to move it around the screen, since I’ve sometimes had problems with rasterization generating wonky pixels. Everything seems ok in this case.

Here’s the stuff…

The test sprite: https://imgur.com/d7VvIBz
The code for the game: https://pastebin.com/Hdwz1Xgx
The shader: see above :smiley:
The result: https://imgur.com/TUDrkOq

So at the very least, build a new project with the code I posted and see if you get the issue when I do not.


Thanks! I tested that same code and it worked. I did another test with my spritesheet and it also worked. After playing around with it more, it appears that the combination of PointClamp and calculating the texel size inside the shader causes it to work correctly. I tried this out in my own project and it worked. Thanks a lot!
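In case it helps anyone later: on ps_4_0 and up I believe you can even skip the size parameter and ask the texture directly (untested sketch):

float2 texSize;
SpriteTexture.GetDimensions(texSize.x, texSize.y);   // query the bound texture's size in-shader
float2 uvPix = 1.0 / texSize;                        // one texel in UV space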


Oh, awesome, I’m glad it worked! I’ve encountered weird pixel rasterization issues quite often over the years and I definitely understand how frustrating things can be. Good luck in the future with your project :slight_smile: