Morph Target Setup - SetVertexBuffers

What I’m doing:
I’m blending 2 frames to get a resulting tween… I do that for 2 different animations and then blend those together to get a final blended animation tween. Works great in software. :slight_smile:
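
Conceptually, per vertex, it's a two-level blend - something like this (names are just illustrative):

    // tween the two frames within each animation, then blend the two animations:
    Vector3 p1 = Vector3.Lerp(anim1_frame1, anim1_frame2, frame_tween);
    Vector3 p2 = Vector3.Lerp(anim2_frame1, anim2_frame2, frame_tween);
    Vector3 result = Vector3.Lerp(p1, p2, anim_tween);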
Question:
I’m wondering how to set it up so that I can have each frame in a different vertex buffer and then blend them together in the shader. I tried it, but the shader only receives data from the first vertex buffer. Here’s what I did:

            // S E T   B U F F E R S         
            int a1 = playerDesc.animation1, a2 = playerDesc.animation2;
            int f1 = playerDesc.frame_1,    f2 = playerDesc.frame_2;            
            vertbuffs[0] = sys[a1].obj[o].MeshList[f1].vbuf_bind;     // works
            vertbuffs[1] = sys[a1].obj[o].MeshList[f2].vbuf_bind;     // not working
            vertbuffs[2] = sys[a2].obj[o].MeshList[f1].vbuf_bind;     // not working
            vertbuffs[3] = sys[a2].obj[o].MeshList[f2].vbuf_bind;     // not working
            gpu.SetVertexBuffers(vertbuffs);            
            gpu.Indices = sys[0].obj[o].MeshList[0].ibuf;
           
            // applies textures, lighting, etc: 
            light.SetDrawParams(world, cam, mat, playerDesc.frame_tween, playerDesc.anim_tween);         
            // D R A W             
            gpu.DrawIndexedPrimitives(PrimitiveType.TriangleList, 0, start_index, triangle_count);

I’ve seen it mentioned that I may need to add channels to a vertex declaration, but then would I need to fill those elements with different data for each frame? Hoping to avoid something like that ;p

Shader takes these:

struct VS_In
{
	float4 Position1: POSITION0;
	float4 Position2: POSITION1;
	float4 Position3: POSITION2;
	float4 Position4: POSITION3;
	float4 Normal1 :  NORMAL0;
	float4 Normal2 :  NORMAL1;
	float4 Normal3 :  NORMAL2;
	float4 Normal4 :  NORMAL3;
	float2 UV:       TEXCOORD0;
};

Pretty sure everything else works fine - it’s just that Position2-4 seem to have nothing in them by the time they reach the shader.

Probably something about the vertex declaration where I need more channels?
Or maybe something needs to be enabled?
Btw - I’ve never used lerp in a shader before; I’m using 0-1 for the tweens. This should be correct, right?

	// tween frames and then blend the two animations (for positions):
	float4 p1 = lerp(input.Position1, input.Position2, frame_tween);
	float4 p2 = lerp(input.Position3, input.Position4, frame_tween);
	float4 position = lerp(p1, p2, anim_tween);
	// tween frames and then blend the two animations (for normals):
	float4 n1 = lerp(input.Normal1, input.Normal2, frame_tween);
	float4 n2 = lerp(input.Normal3, input.Normal4, frame_tween);
	float4 normal = lerp(n1, n2, anim_tween);
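
(For reference, lerp(a, b, t) just computes a + t * (b - a), so t = 0 gives a and t = 1 gives b - feeding it 0-1 tween values is exactly what it expects.)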

Not clear what your VertexBuffer definition looks like, but I think you have to use a custom vertex definition, something like:

public struct VertexFourPositionNormalTexture : IVertexType
{
    public Vector4 Position1;
    public Vector4 Position2;
    public Vector4 Position3;
    public Vector4 Position4;
    public Vector4 Normal1;
    public Vector4 Normal2;
    public Vector4 Normal3;
    public Vector4 Normal4;
    public Vector2 TextureCoordinate;

    public static int SizeInBytes = (4+4+4+4 + 4+4+4+4 + 2) * sizeof(float);
    public readonly static VertexDeclaration VertexDeclaration = new VertexDeclaration
      (
          new VertexElement(0, VertexElementFormat.Vector4, VertexElementUsage.Position, 0),
          new VertexElement(sizeof(float) * 4, VertexElementFormat.Vector4, VertexElementUsage.Position, 1),
          new VertexElement(sizeof(float) * 8, VertexElementFormat.Vector4, VertexElementUsage.Position, 2),
          new VertexElement(sizeof(float) * 12, VertexElementFormat.Vector4, VertexElementUsage.Position, 3),
          new VertexElement(sizeof(float) * 16, VertexElementFormat.Vector4, VertexElementUsage.Normal, 0),
          new VertexElement(sizeof(float) * 20, VertexElementFormat.Vector4, VertexElementUsage.Normal, 1),
          new VertexElement(sizeof(float) * 24, VertexElementFormat.Vector4, VertexElementUsage.Normal, 2),
          new VertexElement(sizeof(float) * 28, VertexElementFormat.Vector4, VertexElementUsage.Normal, 3),
          new VertexElement(sizeof(float) * 32, VertexElementFormat.Vector2, VertexElementUsage.TextureCoordinate, 0)
      );
    VertexDeclaration IVertexType.VertexDeclaration
    {
        get { return VertexDeclaration; }
    }
}


Ah yes, that makes sense. I was kinda hoping I could get away with the standard one. I’m guessing I’ll have to somehow load each vertex position & normal with the desired frame and animation to tween together, which could be tricky. Unless I can just assign the vertex buffers and somehow it loads into the appropriate part of each vertex in order based on stream - if not, then it seems SetVertexBuffers doesn’t help me. It confuses me because I saw a morph target example in XNA where they did it this way, but with regular VertexPositionNormalTexture, and used SetVertexBuffers to assign buffers to streams which somehow got into the correct POSITION0-3 and NORMAL0-3 in the shader.

I’m now wondering if there is even any way to set a vertex buffer to a stream, i.e.: VertexPositionNormalTexture to stream 2 would put its data into POSITION2, NORMAL2, etc…
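
Something like this is what I had in mind - an untested sketch, assuming the input layout gets built from every bound buffer's declaration, with the usage index (the last VertexElement argument) steering each stream to POSITION1/NORMAL1 and so on:

    // Untested sketch: a declaration for a second stream, identical in layout to
    // VertexPositionNormalTexture but with usage index 1, so it would feed POSITION1/NORMAL1.
    VertexDeclaration stream1Decl = new VertexDeclaration
    (
        new VertexElement(0,  VertexElementFormat.Vector3, VertexElementUsage.Position, 1),
        new VertexElement(12, VertexElementFormat.Vector3, VertexElementUsage.Normal, 1),
        new VertexElement(24, VertexElementFormat.Vector2, VertexElementUsage.TextureCoordinate, 1)
    );

    // Bind one buffer per stream; the usage indices in each buffer's declaration
    // (0 for the first, 1 for the second) would decide which shader inputs they hit.
    gpu.SetVertexBuffers(new VertexBufferBinding(frame1Buffer),
                         new VertexBufferBinding(frame2Buffer));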

Anyway, thanks! :slight_smile:

You assign the values to your vertices, which should be of your custom type (VertexFourPositionNormalTexture).
You then create the VertexBuffer for that type and size and set those vertices to it.
E.g.:

            // Assign your vertices
            VertexFourPositionNormalTexture[] vertices = new VertexFourPositionNormalTexture[4];
            vertices[0] = new VertexFourPositionNormalTexture()
            {
                Position1 = new Vector4(0, 0, 0, 0),
                Position2 = new Vector4(0, 0, 0, 0),
                Position3 = new Vector4(0, 0, 0, 0),
                Position4 = new Vector4(0, 0, 0, 0),
                Normal1 = new Vector4(0, 0, 0, 0),
                Normal2 = new Vector4(0, 0, 0, 0),
                Normal3 = new Vector4(0, 0, 0, 0),
                Normal4 = new Vector4(0, 0, 0, 0),
                TextureCoordinate = new Vector2(0, 0)
            };
            // Create VB to hold the vertices and set the vertex data to VB
            VertexBuffer vb = new VertexBuffer(graphicsDevice, VertexFourPositionNormalTexture.VertexDeclaration, vertices.Length, BufferUsage.None);
            vb.SetData(vertices);

As for the structure of the VertexFourPositionNormalTexture data type: its fields have to be declared in the same order as they appear in its VertexDeclaration. You should also add the [StructLayout(LayoutKind.Sequential, Pack = 1)] attribute (using System.Runtime.InteropServices) above the struct to preserve its layout in managed memory.
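
A minimal sketch of where that attribute goes (everything else stays as in the definition above):

    using System.Runtime.InteropServices;

    [StructLayout(LayoutKind.Sequential, Pack = 1)]
    public struct VertexFourPositionNormalTexture : IVertexType
    {
        public Vector4 Position1;
        // ...the remaining fields and the VertexDeclaration, in the same order as above
    }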

Hi AlienScribble, creator of the 2D shader water! ^_^y

I’m now wondering if there is even any way to set a vertex buffer to a stream. ie: VertexPositionNormalTexture to stream 2 would put them into POSITION2, NORMAL2, etc…

I think using vertices is slow when the morph changes: updating the morph targets in the vertices eats up CPU, or, if you bake them per frame/morph, it eats up memory.

I haven’t tested or tried StructuredBuffer yet, since I’m not into morphing yet, but I think HLSL in DirectX 11 Shader Model 5 has a StructuredBuffer that can be used as a lookup table; it works just like passing a texture to the shader.

I think it can be used something like this:

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float3 Normal   : NORMAL0;
    float2 TexCoord : TEXCOORD0;
};

StructuredBuffer<float3> MorphTargetPos  : register(t0);
StructuredBuffer<float3> MorphTargetNorm : register(t1);

// Position.w can be used as the vertex index:

float3 m_Pos  = lerp( input.Position.xyz, MorphTargetPos[(uint)input.Position.w],  frame_tween );
float3 m_Norm = lerp( input.Normal,       MorphTargetNorm[(uint)input.Position.w], frame_tween );

Hope you find a good solution for morphing ^_^Y

Thanks for all the tips guys! :slight_smile:

StructuredBuffer - that’s a new one for me. Been busy with other things, but hopefully tonight I can finish the hardware morphing.
Thanks :slight_smile:

Yay - I got it working in hardware! The ears twitch and flop around very smoothly. I just have it blend a maximum of 4 targets for now, which works fine because that’s as many as I’ll need to map between most animations. I’m using the OpenGL build, so that may be why I couldn’t do it the XNA way; I think it limits you to 1 VBO or something. I found that SSBOs exist for that, which I think are like StructuredBuffers in DirectX. Right now I’m just mapping neighboring animation blends together in the same vertex, as InanZen showed. I can see how using multiple targets with a high vertex count could end up eating a lot of memory this way; in the future I may shoot for SSBOs and StructuredBuffers for the OpenGL and DirectX builds. I suppose technically the vertex counts are low enough that I could even have gotten away with doing it on the CPU and updating a dynamic vertex buffer. ;p
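
For the record, that CPU fallback would look something like this - a rough sketch with illustrative names (untested):

    // Blend the four frames on the CPU each update and push the result to the GPU.
    VertexPositionNormalTexture[] blended = new VertexPositionNormalTexture[vertexCount];
    for (int i = 0; i < vertexCount; i++)
    {
        Vector3 p1 = Vector3.Lerp(anim1Frame1[i].Position, anim1Frame2[i].Position, frame_tween);
        Vector3 p2 = Vector3.Lerp(anim2Frame1[i].Position, anim2Frame2[i].Position, frame_tween);
        blended[i].Position = Vector3.Lerp(p1, p2, anim_tween);

        Vector3 n1 = Vector3.Lerp(anim1Frame1[i].Normal, anim1Frame2[i].Normal, frame_tween);
        Vector3 n2 = Vector3.Lerp(anim2Frame1[i].Normal, anim2Frame2[i].Normal, frame_tween);
        blended[i].Normal = Vector3.Normalize(Vector3.Lerp(n1, n2, anim_tween));

        blended[i].TextureCoordinate = anim1Frame1[i].TextureCoordinate;
    }
    // Discard orphans the old contents so the GPU doesn't stall waiting on the buffer.
    dynamicVB.SetData(blended, 0, vertexCount, SetDataOptions.Discard);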
This works great - thanks again! :slight_smile:

Cool! Yo bro, how do you go about the tweening in this situation:

The current position is at 6 o’clock and your target position is at 12 o’clock, for example. My question is: would you interpolate the tween clockwise or counterclockwise?

Guessing that would be for things like bone rotations and blending animations? I suppose it gets tricky if you try to slerp across an exactly 180-degree angle difference (you can’t just pick the shortest angular distance). I’m guessing you’d need a mechanism to pick a halfway point in the desired direction and use two 90-degree motions if using lerp (sketched below). Angular constraints could be needed too (so the arms don’t bend backwards, for example) - I’m not sure how to set those up offhand - and I do recall someone saying it’s best to use slerp for angle rotations.
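
The halfway-point idea could be forced with two slerps - a rough sketch with hypothetical angles (untested):

    // start and end are 180 degrees apart, so plain slerp can't pick a direction;
    // routing through a chosen midpoint forces clockwise or counterclockwise.
    Quaternion start = Quaternion.CreateFromAxisAngle(Vector3.UnitZ, MathHelper.ToRadians(270f));
    Quaternion end   = Quaternion.CreateFromAxisAngle(Vector3.UnitZ, MathHelper.ToRadians(90f));
    Quaternion mid   = Quaternion.CreateFromAxisAngle(Vector3.UnitZ, MathHelper.ToRadians(180f)); // picks the path

    // t runs 0-1 over the whole motion; each half is its own 90-degree slerp:
    Quaternion rotation = (t < 0.5f)
        ? Quaternion.Slerp(start, mid, t * 2f)
        : Quaternion.Slerp(mid, end, (t - 0.5f) * 2f);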

I’m not sure how much I know about doing it in 3D, but I think I can give a 2D version to start.
For a simple single-axis clockwise tween you could even use lerp on the z-axis angle - like:

// I N I T
const float ToRadians = (float)Math.PI / 180f;
float a0 = 270f * ToRadians, a1 = 90f * ToRadians; // 180 degrees between them
float bigger = a0, smaller = a1;
if (bigger < smaller) { bigger = a1; smaller = a0; }
float angle_distance = bigger - smaller;
float reverse_distance = 360f * ToRadians - angle_distance;

// L O O P  (time goes 0-1 over the tween)
float angle;
if (clockwise_restrained) {
  angle = bigger - angle_distance * time;
} else {
  angle = bigger + reverse_distance * time;
}
// (could clip-wrap angle so it stays within 0 to +2PI if you wanted)
elbow_x = shoulder_x + length * (float)Math.Cos(angle);
elbow_y = shoulder_y - length * (float)Math.Sin(angle);   // + or - depending on y direction

Of course that’s just for 2D - not sure if that’s what you’re looking for. :slight_smile:

Upon further research, your “pick a halfway point” is one of the normal solutions: follow the path or direction from the previous key frame, OR, if that doesn’t give a desirable result, add a new key frame at 7 o’clock for clockwise (or at 5 o’clock for counterclockwise) so the tween from 6 to 12 rotates the correct way.

I [Copy Save]d your angle snippet for rotation tweening. Thanks bru!

Right on - it turned out I ended up using a similar technique for turning around the character I was morphing in this :slight_smile:

Very nice bro! People learned a lot from your YouTube channel! You’re the man! ^_^y
