[SOLVED] NormalMap with Spritebatch

Hi guys,

I have a problem with a shader which should apply a BumpMap-effect to a sprite.
I tried various tutorials, but I could not get it working.

My last try was this one:

From what I read, it should do exactly what I want :slight_smile:
But: after I pass my shader to spriteBatch.Begin, the Draw call won’t draw anything. It is as if the shader never receives the base texture at all.

My Shader:

// Effect applies normalmapped lighting to a 2D sprite.

float3 LightDirection;
float3 LightColor = 1.0;
float3 AmbientColor = 0.35;

sampler TextureSampler : register(s0);
sampler NormalSampler : register(s1)
{
	Texture = (NormalTexture);
};

float4 main(float4 color : COLOR0, float2 texCoord : TEXCOORD0) : COLOR0
{
	//Look up the texture value
	float4 tex = tex2D(TextureSampler, texCoord);

	//Look up the normalmap value
	float4 normal = 2 * tex2D(NormalSampler, texCoord) - 1;

	// Compute lighting.
	float lightAmount = dot(normal.xyz, LightDirection);
	color.rgb *= AmbientColor + (lightAmount * LightColor);

	return tex * color;
}

technique Normalmap
{
    pass Pass1
    {
        PixelShader = compile ps_4_0_level_9_1 main();
    }
}

My Draw-Call:

    protected override void Draw(GameTime gameTime)
    {
        base.Draw(gameTime);

        //This is the light direction to use to light any normal maps.
        Vector2 dir = MoveInCircle(gameTime, 1.0f);
        Vector3 lightDirection = new Vector3(dir.X, dir.Y, 0f);
        lightDirection.Normalize();

        //Clear the device to XNA blue.
        GraphicsDevice.Clear(Color.CornflowerBlue);

        // Draw without shader
        spriteBatch.Begin(SpriteSortMode.Immediate, null, null, null, null, null);
        spriteBatch.Draw(this.baseTexture, Vector2.Zero, Color.White);
        spriteBatch.End();

        //Set the light directions.
        this.newEffect.Parameters["LightDirection"].SetValue(lightDirection);
        this.newEffect.Parameters["NormalTexture"].SetValue(this.normal);
        this.newEffect.Parameters["LightColor"].SetValue(new Vector3(1f, 1f, 1f));
        this.newEffect.Parameters["AmbientColor"].SetValue(new Vector3(.25f, 0.25f, 0.25f));

        //Draw with shader
        spriteBatch.Begin(SpriteSortMode.Immediate, null, null, null, null, this.newEffect);

        spriteBatch.Draw(this.baseTexture, new Vector2(600, 0), Color.White);
        spriteBatch.End();

    }

I am convinced that I’m overlooking something obvious (and probably easy, too).
Maybe someone has the clue I need.

Thanks in advance

is this the complete shader?

If so, then you only have samplers but pass no textures. Samplers just define how a texture is read, not the texture itself.

Add this up front:

Texture2D ScreenTexture;
Texture2D NormalTexture;

You then want to give your TextureSampler a reference to the ScreenTexture, just like you did with the normal sampler.

If that doesn’t work report back.

If you want, you can specify : register(t0) / register(t1) behind those declarations. It’s not needed nowadays, but it keeps the code consistent.

Hey,
thanks for the quick response

I tried your suggestions, but it is still not working. Like before, nothing is rendered…

This is my shader now:

// Effect applies normalmapped lighting to a 2D sprite.

float3 LightDirection;
float3 LightColor = 1.0;
float3 AmbientColor = 0.35;

Texture2D ScreenTexture : register(t0);
Texture2D NormalTexture : register(t1);

sampler TextureSampler : register(s0)
{
	Texture = (ScreenTexture);
};

sampler NormalSampler : register(s1)
{
	Texture = (NormalTexture);
};

float4 main(float4 color : COLOR0, float2 texCoord : TEXCOORD0) : COLOR0
{
	//Look up the texture value
	float4 tex = tex2D(TextureSampler, texCoord);

	//Look up the normalmap value
	float4 normal = 2 * tex2D(NormalSampler, texCoord) - 1;

	// Compute lighting.
	float lightAmount = dot(normal.xyz, LightDirection);
	color.rgb *= AmbientColor + (lightAmount * LightColor);

	return tex * color;
}

technique Normalmap
{
    pass Pass1
    {
        PixelShader = compile ps_4_0_level_9_1 main();
    }
}

If I am not mistaken this means, I have to set my BaseTexture as one of the effect’s parameters, right?

this.newEffect.Parameters["ScreenTexture"].SetValue(this.baseTexture);

These shaders are quite cryptic to me. They are going to drive me crazy :smiley:

Found the mistake!
Instead of this declaration:
float4 main(float4 color : COLOR0, float2 texCoord : TEXCOORD0) : COLOR0

try this:
float4 main(float4 pos : SV_POSITION, float4 color : COLOR0, float2 texCoord : TEXCOORD0) : SV_TARGET0

It seems not to work without the position semantic; it’s strange that the block is even rendered.

Here are my results


It should also be noted that for light calculations dot(normal, lightDir) alone is not correct; it should be saturate(dot(normal, lightDir)) so the light doesn’t go “negative” when the normals face the opposite direction.


You can also have multiple lights, it would work like this then

color.rgb *= AmbientColor + (lightAmount * LightColor) + (lightAmount2 * LightColor2);
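
The second light’s amount is computed just like the first one; a minimal sketch (LightDirection2 and LightColor2 are assumed extra parameters, not part of the shader above):

// Assumed additional parameters for the second light
float3 LightDirection2;
float3 LightColor2;

// Inside main(), next to the first light:
float lightAmount2 = saturate(dot(normal.xyz, LightDirection2));
color.rgb *= AmbientColor + (lightAmount * LightColor) + (lightAmount2 * LightColor2);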

Here, for example, I have a red light shining from the bottom:


My whole file
            Vector2 dir = new Vector2((float) Math.Sin(GameSettings.g_Angle), (float) Math.Cos(GameSettings.g_Angle));
            dir.Normalize();
            Vector3 lightDirection = new Vector3(dir.X, dir.Y, 0.05f);
            lightDirection.Normalize();

            //Clear the device to XNA blue.
            _graphicsDevice.Clear(Color.CornflowerBlue);
            _graphicsDevice.BlendState = BlendState.Opaque;
            // Draw without shader
            //_spriteBatch.Begin();
            //_spriteBatch.Draw(_assets.TruckMaterial.NormalMap, new Vector2(0, 0), Color.White);
            //_spriteBatch.End();

            //Set the light directions.
            Shaders.NormalMappingEffect.Parameters["LightDirection"].SetValue(lightDirection);
            Shaders.NormalMappingEffect.Parameters["NormalTexture"].SetValue(_assets.TruckMaterial.NormalMap);
            //Shaders.NormalMappingEffect.Parameters["ScreenTexture"].SetValue(_assets.TruckMaterial.NormalMap);
            Shaders.NormalMappingEffect.Parameters["LightColor"].SetValue(new Vector3(1f, 1f, 1f));
            Shaders.NormalMappingEffect.Parameters["AmbientColor"].SetValue(new Vector3(.25f, 0.25f, 0.25f)*0.0001f);

            Shaders.NormalMappingEffect.CurrentTechnique.Passes[0].Apply();

            _spriteBatch.Begin(SpriteSortMode.Immediate, null, null, null, null, Shaders.NormalMappingEffect);
            _spriteBatch.Draw(_assets.TruckMaterial.AlbedoMap, new Vector2(0, 0), Color.White);
            _spriteBatch.End();

and

// Effect applies normalmapped lighting to a 2D sprite.

float3 LightDirection;
float3 LightColor = 1.0;
float3 AmbientColor = 0.35; 

Texture2D ScreenTexture;
Texture2D NormalTexture;

SamplerState TextureSampler = sampler_state
{
	Texture = <ScreenTexture>;
};

SamplerState NormalSampler = sampler_state
{
	Texture = <NormalTexture>;
};

float4 main(float4 pos : SV_POSITION, float4 color : COLOR0, float2 texCoord : TEXCOORD0) : SV_TARGET0
{
	//Look up the texture value
	float4 tex = ScreenTexture.Sample(TextureSampler, texCoord);

	//Look up the normalmap value
	float4 normal = 2 * NormalTexture.Sample(NormalSampler, texCoord) - 1;

	// Compute lighting.
	float lightAmount = saturate( dot(normal.xyz, LightDirection));
	color.rgb *= AmbientColor + (lightAmount * LightColor);

	return color*tex;
}

technique Normalmap
{
	pass Pass1
	{
		PixelShader = compile ps_4_0_level_9_1 main();
	}
}

Hi,

This really did the trick!
Thanks, I think I would have never found it :wink:

It looks like I’ll have to dive into shaders soon to fully understand what’s going on here…

I can try to explain

Normally if you draw geometry you have your vertices (edge points of triangles that make up your model). In the vertex shader their position is transformed to where they would appear from the camera’s perspective.

For example, let’s say our camera sits at 0,0 and looks toward 1,0, and our model is at 1, 0.5. Then it would appear on the right side of the final image.

So in the vertex shader we transform the positions from world space to view projection space; basically, we calculate which pixels the mesh covers.
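
As a sketch, a typical vertex shader doing that transform could look like this (WorldViewProjection and VSMain are illustrative names, not taken from the shader above):

float4x4 WorldViewProjection;

struct VSOutput
{
	float4 Position : SV_POSITION;
	float2 TexCoord : TEXCOORD0;
};

VSOutput VSMain(float4 position : POSITION0, float2 texCoord : TEXCOORD0)
{
	VSOutput output;
	// Transform from world space into view projection space
	output.Position = mul(position, WorldViewProjection);
	output.TexCoord = texCoord;
	return output;
}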

Then, in the pixel shader, we fill in the pixels between the vertices / inside the model. Obviously, for each pixel we need its position on the screen, plus possibly other input values like color, normals, textures, etc.
Then we calculate whatever we want the pixel to do and store the final color value in our output render target.

SpriteBatch sets up the vertex shader for us. It’s just 4 vertices with the corner points of the sprite, which need no real transformation, since we already give them in view projection coordinates (unless we want to rotate them, for example).

Either way, we have to let the GPU know which exact pixel on the screen it should work on, and for that we need to pass the position. That was missing originally, so I added
float4 pos : SV_POSITION,
as an input semantic, and now our pixel output works as expected.


OK, this makes it all a bit clearer.

Thank you