Hi,
I am trying to get some lighting working and I am running out of ideas. Currently, I am using a shader that takes in a light position, color, range and intensity, and calculates how dark to shade the sprite.
I made a quick demo by passing in two float2s: one with the light's position, the other with the position of the sprite being shaded. When I do this manually, I can get the lighting to "work", but it's choppy because the lighting for the entire sprite is calculated from a single position.
I'd like to adapt this to calculate the lighting for each pixel of the sprite, but I'm having an awful lot of trouble translating the vertex position back into world space.
I should mention that I'm not using stock SpriteBatch; I'm using a fork of SpriteBatch that lets me use my own vertex shader.
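To make "translating back into world space" concrete, this is the inverse mapping I'm trying to express (a sketch only; InverseMatrixTransform is a hypothetical parameter I'd fill from C# with Matrix.Invert, it isn't in my actual code below):
// Hypothetical parameter: inverse of the matrix applied in the vertex shader,
// i.e. Matrix.Invert(TransformMatrix * ProjectionMatrix), set from the C# side.
float4x4 InverseMatrixTransform;

float2 ClipToWorld(float4 clipPos)
{
    // With a 2D orthographic projection, w stays 1, so there is no
    // perspective divide; multiplying by the inverse undoes the transform.
    return mul(clipPos, InverseMatrixTransform).xy;
}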
Here’s my method right now:
1. Calculate the ViewProjection matrix and pass it into the vertex shader.
2. Store POSITION0 in a "WorldPos" float4.
3. Transform POSITION0 by the ViewProjection matrix and send that to the pixel shader as SV_POSITION. (Everything works fine up to here.)
4. Pass the world position of the light to the shader as "LightPosition" (world space, the same position the light is drawn at).
5. In the shader, calculate the distance between WorldPos and LightPosition.
6. Use LightRange to calculate the attenuation of the lighting.
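For reference, this is my understanding of the coordinate spaces involved (assuming my batcher fills POSITION0 with raw world-space positions the way stock SpriteBatch does):
// input.Position (POSITION0)           : world space, the coordinates sprites are drawn at
// mul(input.Position, MatrixTransform) : clip space, roughly [-1, 1] across the screen
// LightPos                             : world space
// So taking length() between a clip-space position and the world-space
// LightPos would mix two different spaces.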
Here is some code:
Setup Draw and MatrixTransform
Graphics.Instance.Batcher.Begin(BlendState.AlphaBlend, SamplerState.PointClamp, DepthStencilState.None, RasterizerState.CullNone, null, cam.TransformationMatrix, true);
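// Per-sprite override from my manual test: works, but lights the whole sprite from one point.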
//Effect.Parameters["WorldPosition"].SetValue(renderable.Position);
Effect.Parameters["LightPos"].SetValue(light.Position);
Effect.Parameters["MatrixTransform"].SetValue(Matrix.Multiply(Graphics.Instance.Batcher.TransformMatrix, Graphics.Instance.Batcher.ProjectionMatrix));
Effect.CurrentTechnique.Passes[0].Apply();
// Draw here
Vertex Shader:
float2 LightPos;
float2 WorldPosition; // Works when I use this.
float4x4 MatrixTransform;

// Declarations for the parameters the pixel shader references below:
sampler s0;
float LightRange;
float4 LightColor;
float4 ambientColor;
float ambientIntensity;
struct VertexShaderInput
{
float4 Position : POSITION0;
float2 TexCoord : TEXCOORD0;
float4 Color: COLOR0;
};
struct VertexShaderOutput
{
float4 Position : POSITION0;
float2 TexCoord : TEXCOORD0;
float4 Color: COLOR0;
float4 WorldPos : TEXCOORD2;
};
VertexShaderOutput VertexLightShader(VertexShaderInput input)
{
VertexShaderOutput output;
output.Position = mul(input.Position, MatrixTransform);
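// NOTE: this copies the clip-space result, not the raw POSITION0 from step 2.
// I suspect this is where WorldPos goes wrong.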
output.WorldPos = output.Position;
output.TexCoord = input.TexCoord;
output.Color = input.Color;
return output;
}
Pixel Shader:
float4 PixelLightShader(VertexShaderOutput input) : COLOR0
{
float4 color = tex2D(s0, input.TexCoord);
float2 pos = input.WorldPos.xy;
// Override for testing. Works when I do this.
// pos = WorldPosition;
float dist = length(pos - LightPos);
float attenuation = saturate(1.0 - dist / LightRange);
// I know this ambient color math is a little funny, but it works for my application
float3 factAmb = ambientColor.rgb + ambientColor.a;
float4 factoredAmbient = saturate(float4(factAmb * ambientIntensity,1));
float4 diffuse = saturate(attenuation * LightColor);
diffuse.a = 1;
diffuse = saturate(diffuse + factoredAmbient);
return (diffuse * color * input.Color);
}
I certainly wouldn't call my approach to lighting conventional; I'm doing some weird math all around to force the lighting to look right in my particular application. But I believe the problem lies in the distance calculation, which stems from WorldPos.xy being computed incorrectly. Again, when I artificially pass in a float2 to replace WorldPos.xy, everything appears to work, but I'd like the lighting to be based on the actual screen pixel. Any ideas or guidance on how I can make this work?
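In case it clarifies what I'm after, here's the vertex shader variant I think step 2 of my method describes (a sketch, assuming the batcher really does submit vertices in world space the way stock SpriteBatch does; I haven't been able to verify that assumption):
VertexShaderOutput VertexLightShader(VertexShaderInput input)
{
    VertexShaderOutput output;
    output.Position = mul(input.Position, MatrixTransform);
    // Pass the untransformed position through so the pixel shader
    // receives world-space coordinates instead of clip-space ones.
    output.WorldPos = input.Position;
    output.TexCoord = input.TexCoord;
    output.Color = input.Color;
    return output;
}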
Thanks!