Palette swap shader not working, interpolates between colors

Hi, I have a shader set up to swap out colors in a sprite based on the R and G values of each color. The original sprite's colors have an R value equal to the X coordinate of the replacement color on the palette; G is there to be used as well if I ever need to go over 256 colors. However, it seems to be grabbing colors interpolated between two colors on the palette sprite I have, like so:

How can I fix this? This is my shader:

[code]texture2D Texture;
texture1D PaletteTexture;
int PaletteTextureWidth;

sampler TextureSampler = sampler_state
{
    Texture = <Texture>;
    MagFilter = NONE;
    MinFilter = NONE;
    MipFilter = NONE;
    AddressU = Clamp;
    AddressV = Clamp;
};

sampler PaletteTextureSampler = sampler_state
{
    Texture = <PaletteTexture>;
    MagFilter = POINT;
    MinFilter = POINT;
    MipFilter = POINT;
    AddressU = Clamp;
    AddressV = Clamp;
};

// Pixel Shader
float4 PixelShaderFunction(float4 position : SV_Position, float4 color : COLOR0, float2 texCoord : TEXCOORD0) : SV_Target
{
    float4 texColor = Texture.Sample(TextureSampler, texCoord);
    float4 outputColor = PaletteTexture.Sample(PaletteTextureSampler, float2(((texColor.r * 256) + (texColor.g * 256)) / PaletteTextureWidth, 0));
    return outputColor;
}

// Compile
technique Technique1
{
    pass Pass1
    {
        PixelShader = compile ps_3_0 PixelShaderFunction();
    }
}[/code]
e: I thought I mentioned this when posting but apparently I forgot. This shader originally worked when compiling for ps_4_0_level_9_1, but since switching to MonoGame+OpenGL it has stopped working.

I think the unwanted interpolation is not happening in the palette lookup, but in the sampling of the source texture, which isn't using point filtering. What happens if you sample the texture with a Point filter?

float4 texColor = Texture.Sample(PaletteTextureSampler, texCoord);

Nothing changes if I do that.

I can point out one issue that might be affecting things. In your pixel shader function, you are multiplying the red and green values by 256, but you should be multiplying them by 255. The potential values for byte-style colors are not 1-256, they are 0-255. Converting them to a decimal format will work more accurately if you multiply them by 255. Also, taking the average of two colors may not be the best way to go about things. At best, it will allow you to double your potential colors from 256 to 512. But by using the green value as a map to the V axis of the UV coordinates, you could multiply the potential colors instead. Just saying.
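To illustrate that last suggestion, here is a rough sketch of a 2D-palette variant. This is hypothetical code, not the OP's shader: `PaletteTextureHeight` is an assumed extra parameter, and the rest reuses the declarations from the original post.

```hlsl
// Hypothetical sketch: red selects the palette column, green selects the
// row, so a 256x256 palette addresses 65536 entries instead of 512.
// PaletteTextureHeight is an assumed parameter, not in the original shader.
float4 texColor = tex2D(TextureSampler, texCoord);
float2 paletteUV = float2(texColor.r * 255 / PaletteTextureWidth,
                          texColor.g * 255 / PaletteTextureHeight);
float4 outputColor = tex2D(PaletteTextureSampler, paletteUV);
```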

Also, I think you may be calculating your UV coordinates incorrectly, as well as unnecessarily. The value you get back from sampling the UV coordinates from the original texture should theoretically be a float value between 0 and 1, so it shouldn’t be necessary to alter it any further. I could be wrong about that, but I know it’s the way that sampling works in most other shaders I’ve worked with.

Just a bit of an example. If you were to change your code as little as possible, it would look like this.

float4 outputColor = PaletteTexture.Sample(PaletteTextureSampler, float2((texColor.r + texColor.g) / 2.0, 0.0));

This would add the Red and Green decimal values together, and then divide them by 2 to get the average. The output color float4 is supposed to be four decimal values between 0 and 1, not decimal values between 0 and 255. If you feed it a value above 1, it’s going to try to wrap around the texture, and it will be a complete crapshoot as to which part of the texture it will end up sampling. CPUs deal with integer values for colors, so getting a 0 to 255 value makes sense. But GPUs and shaders are all about those floats, and treat colors as 0 to 1 decimals, just like UV coordinates are 0 to 1.

re: the coordinates, it’s necessary to multiply them and divide them because the palette’s width will not always be 255.
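For what it's worth, one common way to stop a lookup like this from landing between two palette entries is to aim at the centre of the texel rather than its left edge. A sketch, assuming the red channel stores the palette index as in the OP:

```hlsl
// Map an integer palette index (0..255, stored in the red channel) to the
// centre of its texel; the +0.5 keeps the sample away from texel borders,
// where filtering can blend in the neighbouring entry.
float index = texColor.r * 255;
float u = (index + 0.5) / PaletteTextureWidth;
float4 outputColor = tex2D(PaletteTextureSampler, float2(u, 0));
```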

re: using 255 instead of 256, I changed it, but it didn't fix the issue I'm having, sadly. Are there any other reasonable ways to do palette swapping? At this point I'm wondering if I should just load in temporary textures using GetData + SetData for sprites with alternate palettes, which sounds expensive and annoying to set up, but at least might work.

I think you might be misunderstanding how the texture sampling works in the shader language.

A palette texture can be 256 pixels wide. It can be 17 pixels wide. It can be a million pixels wide. The shader’s sampling functions don’t care. As far as the shader is concerned, the left-hand side of the texture is zero, and the right-hand side of the texture is one. This is the same no matter what size texture image you’re feeding in. Likewise, the color information that you get from the original texture is also a 0-1 value. Zero is black or no color. One is the maximum value of the given color. (red, green, or blue)

If your source image is a grey-scale image, all the color values will be the same, and any of them can be used to point to a UV value on a palette definition texture. Zero will be the extreme left, and One will be the extreme right. But at no point will the sample function care about the actual pixel width. Defining where on the palette texture you're pointing is entirely up to you and your calculations, and doesn't actually involve the pixels at all. The color you originally sampled will translate into a decimal position. So you have to carefully craft your source image's colors accordingly.
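In other words, with a greyscale source the lookup reduces to using the channel value directly as the U coordinate. A minimal sketch of that approach, reusing the sampler names from the OP's shader:

```hlsl
// Greyscale variant: any channel of a grey pixel is already a 0-1 value,
// so it can serve directly as the horizontal position on the palette strip.
float4 grey = tex2D(TextureSampler, texCoord);
float4 outputColor = tex2D(PaletteTextureSampler, float2(grey.r, 0));
```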

This is a sample sprite sheet from the Scott Pilgrim Vs. The World game. I re-formatted it to be an 8-bit palette-based PNG.

This is a greyscale image that I generated for a palette shader using a small python program that I wrote.

And here is its associated palette-sampling color strip texture. Each grey value in the original image corresponds to a horizontal decimal position on the color strip. Change the sample palette strip with a different texture, and you change all the colors assigned in the source image.

…but multiplying and dividing is how the shader calculates where on the palette to point. As such, if the code is replaced with the code you posted, the colors come out completely wrong instead of being "almost right" but between one color and the next.

E: to be clear, the R values correspond to the X of the color in the palette, and it's the shader's job to figure out the corresponding 0-1 value by multiplying and dividing. Apparently I somehow forgot to mention this in the OP (I thought I did, oops), but I should point out that this /has/ worked fine for me when compiling for ps_4_0_level_9_1, and it's only since switching to MonoGame+OpenGL that the shader stopped working.

okay well I replaced

float4 outputColor = PaletteTexture.Sample(PaletteTextureSampler, float2(((texColor.r * 256) + (texColor.g * 256)) / PaletteTextureWidth, 0));

with

float4 outputColor = tex2D(PaletteTextureSampler, float2(((texColor.r * 256) + (texColor.g * 256)) / PaletteTextureWidth, 0));

and everything works fine now, so this is solved I guess
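For anyone landing here later: `Texture.Sample` is DX10-style syntax, and as far as I know MonoGame's OpenGL backend translates effects through MojoShader, which only understands the DX9-style intrinsics like `tex2D` — which would explain why the shader broke on the switch. A sketch of the full pixel shader in the form that works, applying the same change to both samples:

```hlsl
// Full pixel shader using DX9-style tex2D, which the MonoGame OpenGL
// (MojoShader) pipeline can translate; Texture.Sample is DX10 syntax.
float4 PixelShaderFunction(float4 position : SV_Position, float4 color : COLOR0,
                           float2 texCoord : TEXCOORD0) : SV_Target
{
    float4 texColor = tex2D(TextureSampler, texCoord);
    float4 outputColor = tex2D(PaletteTextureSampler,
        float2(((texColor.r * 256) + (texColor.g * 256)) / PaletteTextureWidth, 0));
    return outputColor;
}
```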