HLSL TEXCOORD0 value changed based on Texture size

I have this strange problem that even after spending something like 14 hours in a row, I can't find a way to solve.

I am trying to write a few pixel shader functions, and the results always look off for some reason. After a lot of testing I found that the TEXCOORD0 value range changes based on the original texture size. I read that the TEXCOORD0 semantic is deprecated, and that we should use SV_POSITION instead of POSITION, etc. etc., but anyhow…

This is the shader code I am using:

#if OPENGL
    #define SV_POSITION POSITION
    #define VS_SHADERMODEL vs_3_0
    #define PS_SHADERMODEL ps_3_0
#else
    #define VS_SHADERMODEL vs_4_0_level_9_1
    #define PS_SHADERMODEL ps_4_0_level_9_1
#endif

Texture2D SpriteTexture;

float TimeE;
float CD; // used in ColdDown1 below
sampler2D SpriteTextureSampler = sampler_state {
    Texture = <SpriteTexture>;
};

struct VertexShaderOutput  {
    float4 Position : SV_POSITION;
    float4 Color : COLOR0;
    float2 uv : TEXCOORD0;
};

float4 ColdDown1(VertexShaderOutput input) : COLOR {
    float4 color = tex2D(SpriteTextureSampler, input.uv) * input.Color;
    if(input.uv.y >= CD) color.b = color.b + 0.4f;
    return color;
}

technique ColorShiftTec { 
    pass Pass0 { PixelShader = compile PS_SHADERMODEL ColdDown1(); }
};

No matter what I do in the function, the result differs from online simulators like Shadertoy (after I manually translate the code from GLSL to HLSL).

And when I use fixed values like

if(input.TextureCoordinates.x < 0.5) color.g = color.g + 0.2f;

I get a different result than the simulated one.


(image attachment)

After a lot of attempts and tinkering I found out that the values differ based on the texture I use: when I edit the image and change its size, I get different results. So I ran a few tests, and here they are:

for a texture with width and height of 4000, the uv value goes from 0 to ~0.14438
for a texture of size 2000, the uv goes from 0 to ~0.2896, which is double the value
for a size of 1 pixel, the uv goes from 0 to 579.2…

I copied the project and tested it on a different computer, and the results are the same. Which is kind of good news, because I could just use magic numbers, pray to God that they won't break, never change the atlas sizes, and rework my effects manager to do the calculation after finding the exact magic number. But no, screw all of that <_<; shouldn't TEXCOORD0 have values from 0 to 1?

If anyone has an idea what I am doing wrong, I would appreciate it. I tried changing the pixel shader but didn't notice any changes.

I’m not a shader pro but I think you have to replace

input.TextureCoordinates

With

input.uv

Because your struct defines the TEXCOORD0 field as uv, not as TextureCoordinates. Instead of changing your code to uv, you could rename uv to TextureCoordinates in your struct:

struct VertexShaderOutput  {
    float4 Position : Position;
    float4 Color : COLOR0;
    float2 TextureCoordinates : TEXCOORD0;
};

That was a typo I left in by mistake, but thanks for the reply. I changed TextureCoordinates into uv to make my testing easier, but forgot to replace all the occurrences in this example.

Is this texture in the atlas? How do you pass this texture to the shader?

If you have a texture in an atlas, then when you draw it (draw a part of the atlas), you pass the UVs of that part of the image. For example, if you have a 1000x1000 px atlas (UV from 0 to 1) with images, and say one of them has coordinates Rect(x=50, y=80, w=100, h=100) in the atlas, then the UV in the shader will go from 50 * 1/1000 to 150 * 1/1000 in U, and from 80 * 1/1000 to 180 * 1/1000 in V. Here 1/1000 is not a magic number; it is the texel size of your texture (i.e. 1/TextureWidth and 1/TextureHeight).
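To make that arithmetic concrete, here is a tiny standalone sketch using the same example numbers (the 1000x1000 atlas and the Rect(50, 80, 100, 100) are just the values from this example, not anything from the project):

```csharp
// Example numbers from above: a 100x100 region at (50, 80) in a 1000x1000 atlas.
float atlasW = 1000f, atlasH = 1000f;
float texelW = 1f / atlasW;           // texel size = 1 / TextureWidth
float texelH = 1f / atlasH;           // texel size = 1 / TextureHeight

float uMin = 50f * texelW;            // left edge   -> 0.05
float uMax = (50f + 100f) * texelW;   // right edge  -> 0.15
float vMin = 80f * texelH;            // top edge    -> 0.08
float vMax = (80f + 100f) * texelH;   // bottom edge -> 0.18
// These are the UVs SpriteBatch generates for that quad — not 0..1.
```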

Really? Because when I draw different portions of the atlas, the uv coord, namely TEXCOORD0, never changes its value, but if I use a different-sized texture it does. In my case, at least.

UV always changes if you draw another part of an atlas. If you pass UVs from 0 to 1 to a shader, then you are drawing the entire atlas; the UVs depend on the part that you draw.

I am not sure what you are saying. I know UV is used in many places; I am talking about the sampler TEXCOORD in the HLSL pixel shader, and from what I understood it should be from 0 to 1 all the time, for whatever part of the SpriteBatch you draw. Maybe you draw in a different way; let me get some code samples.

// In Map class

for (int i = 0; i < PLObjects.Count; i++) {
    foreach (KeyValuePair<string, IAnimated> ObjK in PLObjects[i]) {
        if (!ObjK.Value.SetVisible /*|| !new Rectangle((int)CTarget.SetCamX - ObjK.Value.GetLocationRF.Width, (int)CTarget.SetCamY - ObjK.Value.GetLocationRF.Height, GameCore.CGC.GetScreenWidth, GameCore.CGC.GetScreenHeight).Contains(ObjK.Value.GetLocationRF)*/) continue;
        ObjK.Value.Draw(SB, CTarget, ZoomFactor, PZoomList[i].Z);
    }
}
// in draw
public void Draw(SpriteBatch SB, Camera CTarget, float ZF, float LZF, Vector2 OP = default(Vector2), float OPA = 0) {
    if (SFX != "") { GameCore.CGC.EffectsAr[SFX].Apply(); }

    if (ADResult.CurAnimation != null)
        SB.Draw(GameCore.CGC.TextureDictionary[AClass.MainSP], (Pos + OP + GetCenter) * ZF * LZF - CTarget.Pos, AClass.SourceRF[CSide], C, (SetAngle + OPA) * -0.01745f, GetCenter, size * ZF * LZF, (SpriteEffects)(ADResult.SetCB & 3), 1f);

    // draw layers, accessory and others are omitted

In short, I always send SourceRF, or use the GameFile (GF) from my editor's GUINameSource, which references the source rectangle (the position in the atlas), whenever I draw entities. Another example of direct Effect use in GUI menus:

public override void Draw(SpriteBatch SB, Vector2 Pos) {
	int P = (int)(Pos.Y-120)/54; 

	if (Castle2.Castle.SetJob.SkillDictionary[ID].CD > 0) { SB.End(); SB.Begin(SpriteSortMode.Immediate, BlendState.AlphaBlend);
		float F = 1- (Castle2.Castle.SetJob.SkillDictionary[ID].CD/(float)GameCore.CGC.SkillAr[ID].GetCD);
		F*=0.0325f;
		GameCore.CGC.EffectsAr["ColdDown"+P].SFX.Parameters["CD"].SetValue(F);
		GameCore.CGC.EffectsAr["ColdDown" + P]?.Apply();
		System.Diagnostics.Debug.WriteLine(Castle2.Castle.SetJob.SkillDictionary[ID].CD + ": " + F);
	}

	SB.Draw(GameCore.CGC.TextureDictionary["GUI"], Pos , MenuComponent.GUINameSource("GUI", GUIName), Color.White);
	if (Castle2.Castle.SetJob.SkillDictionary[ID].CD > 0 ) { SB.End(); SB.Begin(SpriteSortMode.Immediate, BlendState.AlphaBlend); }

	if (HudMC.CurTargetSkill != "" ) {
		MenuComponent.DrawDialog(SB, HudMC.ResMC, GameCore.CGC.DialogDictionary["SkillSelectTarget"], new Vector2(70, 50));
		MenuComponent.DrawDialog(SB, HudMC.ResMC, GameCore.CGC.DialogDictionary[GameCore.CGC.SkillAr[CurTargetSkill].NameKey], new Vector2(265 +(GameCore.SetSelectedLanguage ==1?-470:10), 50), 24);
	}
}

Skills have different locations, use the same shader code, and give the same results.

In this example, if TEXCOORD0 went from 0 to 1, I wouldn't need to multiply by a magic value, because when I test the shader on simulators it works fine. This makes me feel I should make a new project and do a direct test; it might be something I am not aware of.

F*=0.0325f;

Seems F is an offset into the texture atlas. When you call spriteBatch.Draw(…, sourceRectangle, …), behind the scenes SpriteBatch creates a quad of that size and sets the UVs per vertex. A quad contains 4 vertices, with UVs for each vertex (pseudocode):

Vector2 texelSize = 1f / texture.Size; // pseudocode: (1/width, 1/height)
Vector2 pos = sourceRectangle.position;
Vector2 size = sourceRectangle.size;

// For v1, v2, v3, v4
Vector2 uv_leftUp = pos * texelSize; // v1
Vector2 uv_rightUp = new Vector2(pos.x + size.x, pos.y) * texelSize; // v2
Vector2 uv_leftBottom = new Vector2(pos.x, pos.y + size.y) * texelSize; // v3
Vector2 uv_rightBottom = new Vector2(pos.x + size.x, pos.y + size.y) * texelSize; // v4

As you can see, if you pass a different sourceRectangle to the Draw method, it will calculate different UVs for the quad (sprite).

So you are saying that TEXCOORD0 in shaders is not a fixed value from 0 to 1? I am talking about this one:

struct VertexShaderOutput {
	float4 Position : SV_POSITION;
	float4 Color : COLOR0;
	float2 UV : TEXCOORD0;
};

If that's the case, I could make my shader manager calculate the value based on the texture size, but from what I read it is always from 0 to 1:

texcoord - ps - Win32 apps | Microsoft Learn

But I also read that the semantic is deprecated, and that HLSL is a bit loose and optimizes things, which can remove code and variables if they are not used. So I am not sure why or when TEXCOORD0 can be from 0 to 1, or whether there is a way to force it to that range if it is optional, so I can avoid errors from miscalculations.

Yep, if you draw part of an atlas using SpriteBatch, then behind the scenes it sets different UVs, and they are not fixed from 0 to 1.

Thanks for the help! I appreciate it.

I can suggest 2 ways to work around this problem.

  1. Draw sprites between separate Begin and End calls and pass a special Vector4 with the min and max UVs to the shader. Then in the shader you can recalculate the UVs back to 0..1:

float4 uvBounds; // Passed from script, xy - min values, zw - max values.
float2 normalizedUv = (uv - uvBounds.xy) / (uvBounds.zw - uvBounds.xy);

But this way will break batching, because you will call a separate Begin and End on the SpriteBatch.

  2. You can pack the sourceRect parameters into Color and unpack them in the shader. Color has 4 float fields (R, G, B, A), each of them 32 bits: you can use the first 16 bits of the R channel to store positionX, the next 16 bits of the R channel to store positionY, and the same for sizeX and sizeY in the G channel. You should also pass the texture width and height to the shader. In channel A you can pack a color, 8 bits per sub-channel, and unpack them in the shader.
    In the shader you can unpack all the values from Color and recalculate the UVs from 0 to 1 as you need.
    This way is more complicated but does not break batching.

Or a third way: you can write your own SpriteBatch and pass a second set of UVs, from 0 to 1, for each sprite. :slight_smile:
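As a rough sketch of how way 1 might look on the C# side (the numbers and the `uvBounds` array are purely illustrative, not real project code — the array maps onto the shader's `float4 uvBounds` parameter):

```csharp
// Hypothetical sketch: compute the UV bounds of a source rectangle so the
// shader can remap TEXCOORD0 back into 0..1 with
//   float2 normalizedUv = (uv - uvBounds.xy) / (uvBounds.zw - uvBounds.xy);
int srcX = 50, srcY = 80, srcW = 100, srcH = 100;  // example source rect
float texW = 1000f, texH = 1000f;                   // example atlas size

float[] uvBounds =
{
    srcX / texW,           // x: min U
    srcY / texH,           // y: min V
    (srcX + srcW) / texW,  // z: max U
    (srcY + srcH) / texH,  // w: max V
};
```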

I will let my effect manager do the calculation and adjust the variable automatically.

Today I came back to this problem and started working on it. I made my own calculation from scratch to see if I understood what's going on, and ended up with a formula similar to the one you gave me, which, to be honest, I didn't understand when I first read it.

So in the shader manager I made a function to update the shader's UV normalization factor:

public void NormFacCal (string TextureName, Rectangle SourceRC) {
	Vector2 BSize = new Vector2(GameCore.CGC.TextureDictionary[TextureName].Width, GameCore.CGC.TextureDictionary[TextureName].Height);
	SFX.Parameters["NormFac"].SetValue(new Vector4 (SourceRC.X/BSize.X, SourceRC.Y/BSize.Y, SourceRC.Width/BSize.X, SourceRC.Height/BSize.Y));
}
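The same math, stripped of the GameCore lookups, can be checked standalone (the helper name and the numbers below are only illustrative, not from the project):

```csharp
// Standalone version of NormFacCal's math: the first two components are the
// UV offset (source position / texture size), the last two the UV scale
// (source size / texture size).
static float[] NormFac(int srcX, int srcY, int srcW, int srcH, float texW, float texH)
{
    return new[] { srcX / texW, srcY / texH, srcW / texW, srcH / texH };
}

// Example: a 578-px-wide region in a 4000-px-wide texture gives a scale factor
// of 578 / 4000 = 0.1445, close to the ~0.14438 UV range observed earlier.
```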

And in the shader code I make a new UV and use it instead of the original UV:

#if OPENGL
	#define SV_POSITION POSITION
	#define VS_SHADERMODEL vs_3_0
	#define PS_SHADERMODEL ps_3_0
#else
	#define VS_SHADERMODEL vs_4_0_level_9_1
	#define PS_SHADERMODEL ps_4_0_level_9_1
#endif

Texture2D SpriteTexture;

float4 NormFac;
float CD; 

sampler2D SpriteTextureSampler = sampler_state {
    Texture = <SpriteTexture>;
};

struct VertexShaderOutput {
    float4 Position : SV_POSITION;
    float4 Color : COLOR0;
    float2 TextureCoordinates : TEXCOORD0;
};

float4 ColdDown1(VertexShaderOutput input) : COLOR {
    float4 color = tex2D(SpriteTextureSampler, input.TextureCoordinates);
    float2 NewUV = (input.TextureCoordinates - NormFac.xy) / NormFac.zw;
    
    if (NewUV.y >= CD) color.b = color.b + 0.4f;
    if (NewUV.x >= CD) color.g = color.g + 0.4f;
    
    //if((input.TextureCoordinates.y -0.0125)/0.0125 >= CD) color.b = color.b + 0.4f;
    //if((input.TextureCoordinates.x -0.49025)/0.0125 >= CD) color.g = color.g + 0.4f;
    
    return color;
}

technique ColorShiftTec { 
    pass Pass0 { PixelShader = compile PS_SHADERMODEL ColdDown1(); }
};

And the result is amazing! Yes, I need to recalculate the norm factor every time I change the source RC, but that's why I can make new shader-manager instances for each calculation if needed, and I need to create a new UV in every function. I wish things could be simpler or more efficient without the extra calculations, but hey, I don't need to start a new SB (even though I do need to End it when drawing is done and Begin a new one).

These are the results (note: yes, I haven't fixed the left bars in the game. Patience!):


What I understood from all this fiasco, after spending around 35 hours: TEXCOORD0 is always 0-1 (unless you assign it yourself in SB?), but over the whole atlas. When you draw a portion of the atlas, the 0-1 range still applies to the entire atlas, so by the time the pixel shader (PS) calculates a pixel, the UV is already advanced to where you are drawing. If you are drawing from the middle of the atlas, the UV starts at 0.5 and ends where your drawn portion ends, so you need to shift the UV (by making a new UV) and divide it by the size difference.

It was a wild ride, but I finally feel free to play with new shaders and implement them in my games!

Great job! UVs are passed per vertex, and they can be any number. You can pass 0 to 2 and the result will be tiled (if you set SamplerState to Repeat for the texture). If a triangle has vertices with UVs from 0.2 to 0.5, then in the shader you will get interpolated UVs from 0.2 to 0.5. :slight_smile: