Creating a Shader That Shows Normals Like This

Hey,

I’m currently building software to render out .fbx animations using MonoGame and WPF. Since I want to use dynamic lighting with my sprites, I need to generate normal maps corresponding to the spritesheets. Does anyone know how I’d achieve a result similar to the images below? What sort of shader, and what type of normal is being illustrated? Thanks!

https://puu.sh/z4R9Q/caf57269d1.png

http://img11.hostingpics.net/pics/824051normal.png

Edit: You can’t do that in a shader other than as a preprocess; it takes too many samples. And since the image is mostly mono-color, a 3x3 convolution filter keyed on value wouldn’t work either. You have to do it ahead of time, either at load time or as additional texture assets.

It’s a tangent-space normal map (in Y-down orientation). If an image is always a billboard, the tangent-space vector doesn’t require a basis and can just be multiplied by the transform (or taken as-is for pure 2D).
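For the pure-2D case, decoding such a map is just a remap and a dot product. Here’s a minimal load-time sketch in Python/NumPy, assuming a Y-down map; the file name and light direction are invented for illustration:

```python
# Decode a Y-down tangent-space normal map and compute per-pixel N.L for a
# 2D sprite. No tangent basis is needed since the sprite is a billboard.
import numpy as np
from PIL import Image

def decode_normals(path):
    # Map RGB from [0, 255] to [-1, 1].
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0
    n = rgb * 2.0 - 1.0
    n[..., 1] *= -1.0  # Y-down map: green grows downward, so flip to Y-up.
    # Renormalize to undo 8-bit quantization error.
    return n / np.linalg.norm(n, axis=-1, keepdims=True)

def lambert(normals, light_dir):
    # Clamped per-pixel N.L term.
    l = np.asarray(light_dir, dtype=np.float32)
    return np.clip(normals @ (l / np.linalg.norm(l)), 0.0, 1.0)

# Hypothetical usage: light from the upper-left, toward the viewer.
# shading = lambert(decode_normals("sprite_normal.png"), [-0.5, 0.5, 0.7])
```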

The head in that image is messed up, unless the goal there is some sort of backlighting, or it’s just gobbledygook to force a zero/garbage value out of the lighting.


There are several ways to generate them. Ravendarke has a tool for it, and there are several others out there.

You can calculate an SDF and treat it as a heightfield, taking its derivatives to determine the normals. If your art is all solid colors like that, you can apply different curve functions to the SDF based on the pixel color code. Taking the B from HSB color can be useful as an adjustable factor. There are several convolution filters, LCSM, and Markov methods that can be used as well.
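A hedged sketch of that SDF-as-heightfield approach; SciPy’s Euclidean distance transform, the single sqrt curve, and the `strength` factor are my own assumptions, and the per-color curve selection is left out:

```python
# Build an SDF from the sprite's alpha, treat it as a height field, and take
# central-difference derivatives to get a tangent-space normal map.
import numpy as np
from PIL import Image
from scipy.ndimage import distance_transform_edt

def sdf_normals(path, strength=4.0):
    alpha = np.asarray(Image.open(path).convert("RGBA"))[..., 3] > 0
    # Signed distance: positive inside the silhouette, negative outside.
    sdf = distance_transform_edt(alpha) - distance_transform_edt(~alpha)
    # Shape the profile; sqrt gives a rounded falloff. Other curves (or a
    # per-color choice, as described above) would slot in here.
    height = np.sqrt(np.maximum(sdf, 0.0))
    dy, dx = np.gradient(height)  # derivatives of the height field
    # Normal in the image's Y-down frame, matching the maps linked above;
    # flip the sign on the green term for a Y-up convention.
    n = np.dstack([-dx * strength, -dy * strength, np.ones_like(height)])
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    # Encode [-1, 1] back to [0, 255] RGB.
    return ((n * 0.5 + 0.5) * 255.0).astype(np.uint8)

# Image.fromarray(sdf_normals("sprite.png")).save("sprite_normal.png")
```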

You can place primitive shapes and evaluate them in different ways (you can do that with Photoshop primitives and a script - I’ve done it before… it wasn’t too bad, aside from the whole PS scripting thing).

Normals are not surface direction; they’re integrated metric tensors, so you can use just about anything in differential geometry, from Laplacian methods to probabilistic splatting, to determine them, if you aren’t scared of renormalizing from a different basis. “The normal is the surface direction” is just a white lie made up to avoid scaring the daylights out of artists, and it has run rampant to the point of being mistaken for fact. Treating normals as what they really are can make things easier - and faster.


The cloth dropping down between the legs looks like it was done by hand.

Surface normals are vectors perpendicular to the tangent plane on a surface, by definition. Sure, you can associate any other vectors with your vertices and do some custom lighting calculations on them, but then those vectors are not the normals.
I don’t see what your point is with this statement. Of course you can use any method to generate 3D normals for an image, because you’re making up data anyway.
EDIT: If you have any links to resources that explain what you said, I’d be interested to read them.

I made the mistake of neglecting the specifics of tangent-space normals.

I don’t see what your point is with this statement. Of course you can use any method to generate 3D normals for an image, because you’re making up data anyway.

Relevant for filtering: histogram → basis lattice for adjusting levelness while preserving the local roundness/hardness of edges. Adjust via a basis change instead of just tweaking +Z, which is wrong if you want to preserve those slopes (the vector gets renormalized, so Z tweaking will distort them).
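A small numeric illustration of that point, with made-up vectors: biasing +Z and renormalizing flattens shallow and steep slopes by different amounts, while scaling the tangent-plane (xy) part and rebuilding Z from unit length keeps the slopes in proportion:

```python
import numpy as np

def flatten_naive(n, bias):
    # Push Z up and renormalize: distorts the ratio between slopes.
    out = n + np.array([0.0, 0.0, bias])
    return out / np.linalg.norm(out)

def flatten_rescale(n, scale):
    # Scale the xy part, rebuild Z from unit length: slope ratios preserved.
    xy = n[:2] * scale
    z = np.sqrt(max(1.0 - float(xy @ xy), 0.0))
    return np.array([xy[0], xy[1], z])

shallow = np.array([0.2, 0.0, np.sqrt(1.0 - 0.04)])  # gentle slope
steep = np.array([0.8, 0.0, 0.6])                    # hard edge
for n in (shallow, steep):
    print(flatten_naive(n, 0.5), flatten_rescale(n, 0.5))
# Naive: the shallow x shrinks to ~0.67 of itself but the steep x only to
# ~0.74, so the relationship between the two slopes changes. The rescale
# halves both xy components exactly, keeping them in proportion.
```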

EDIT: If you have any links to resources that explain what you said, I’d be interested to read them.

You mean like the historical origin of tangent-space normals, “Fitting Smooth Surfaces to Dense Polygon Meshes”, which was for tensor-spline surfaces? Though, being an application of the subject, it obviously expects you to already know it.