SkinnedModel Animations seem to be in the wrong space?

I’m currently working to get animations working for our game’s 3D levels. I’m basically using the SkinnedModel sample for XNA, and have been able to get everything hooked up so that I’m exporting the model and animations from Blender into FBX, and then importing it into my game.

The issue I’m having is that the animation looks to be wrong. It’s almost as if the rotation of the wheel is being applied in world space instead of local space. Here’s a picture of the animation in Blender (you can see the one bone in the middle of the wheel) and the result I’m seeing in-game:
[Images: the wheel and its single bone in Blender, and the offset rotation seen in-game]

My code for applying the animation and rendering the model is pretty much exactly the same as the SkinnedModelSample aside from the camera:

private void RenderSkinnedModel()
{
    GraphicsDevice device = graphics.GraphicsDevice;

    Matrix[] bones = animationPlayer.GetBoneTransforms();

    // Render the skinned mesh.
    foreach ( ModelMesh mesh in _LevelModel.Meshes )
    {
        foreach ( SkinnedEffect effect in mesh.Effects )
        {
            effect.SetBoneTransforms( bones );

            effect.View = _GameCamera.View;
            effect.Projection = _GameCamera.Projection;
            effect.World = Matrix.Identity;

            effect.EnableDefaultLighting();

            effect.SpecularColor = new Vector3( 0.25f );
            effect.SpecularPower = 16;
        }

        mesh.Draw();
    }
}

Has anyone run into this or have an idea where my process might be breaking down? I’ve been able to re-import my FBX into Blender and it looks correct there, and I’m using an exact copy of the SkinnedModelProcessor right now, so I suspect I’m doing something incorrect with the animation player or the way I’m transforming the bones.

Alien Scribble has a video on how he did it.

Honestly I'd just convert your model to glTF and use this.

Nkast has one, Aether.Extras.

Kit kat has one as well somewhere.

You can look at my super messy loader here as well, but it's in no way in even a semi-released state; it's more like a ton of half-finished concept test work. But it does work.

Maybe the coordinate systems are different? Maybe there is an option in Blender when doing the export, or when you import the model. Not sure if this is the reason, just an idea.

Thanks @willmotil, I'll take a look through these. I'm certainly not married to the current FBX path we're using, since the built-in stuff has been nothing but trouble and I've basically been writing custom code the whole way, so if any of these options work I'm more than happy to switch over and use them. Do you know offhand if any work well on consoles? Our game is confirmed for PC (DX), Xbox One, and Switch releases, so I need to ensure nothing in the runtime portion of the libraries requires anything external that is PC only.

@Kwyrky I'll be looking into it today. I know that by default the coordinate systems are different (Blender uses Z-up, my game is Y-up), but I believe I have it fixed with a rotation applied in the content processor settings. But I definitely appreciate the idea and I'll be double-checking that portion of the code to see if it's related.

After poking around in both libraries I think I may just end up having to do my own custom processor. I was pretty excited to use SharpGLTF as FBX has been nothing but a pain for me, but unfortunately I need to be able to use a custom shader and it looks like it doesn’t really support that. Aether unfortunately suffers from the same issues I’m having with the SkinnedModel samples, where any non-skinned meshes are stripped out of the Model. Since our game uses 3D levels and 2D characters, our level Models have a mixture of skinned / animated meshes and static meshes within one model and I need to be able to support both. (This is actually why you’re only seeing the wheel and not the rest of the level in the pictures I posted originally).

So I think the best option is to just create my own classes to handle my game's specific needs. I actually started messing around with this earlier in the project when I was trying to read lighting information from the FBX. I ended up abandoning it because it was taking up too much time, but I might need to revisit it. The code I was working on actually had a few similarities to the stuff you were working on @willmotil, and looking through your stuff I actually noticed a few things I was doing wrong.

Oh and @Kwyrky, you were completely correct about the coordinate system mismatch. At some point during testing I had actually removed my 90 degree rotation on the X axis. Adding that back in has the wheel somewhat close to what I'd expect. It still isn't working correctly, but it's rotating on the expected axis now. So thanks for mentioning that and making me double-check!
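
In case it helps anyone else hitting the same mismatch: the rotation in question is just the stock RotationX parameter on the model processor, which you can either set as a processor parameter in the content project or bake into a derived processor as a default. A minimal sketch (the class name is made up, and the sign of the angle depends on your Blender export settings):

using Microsoft.Xna.Framework.Content.Pipeline;

[ContentProcessor( DisplayName = "Level Model - Z-up to Y-up" )]
public class RotatedLevelProcessor : SkinnedModelProcessor
{
    public RotatedLevelProcessor()
    {
        // Blender is Z-up and the game is Y-up, so rotate 90 degrees around X
        // when the content is processed.
        RotationX = 90.0f;
    }
}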

We're working on getting vpenades' loader and a default PBR shader into MonoGame.
Nades is doing most everything, as it's his runtime and he knows the glTF spec inside and out.

When it's said and done it should be able to take a regular effect, I think.
But glTF has a lot of stuff in its spec for loading all types of models, and what that translates to for running any pixel shader, let alone a vertex shader, is a lot of work. glTF demands a lot of situations be covered, ranging from morph targets to texture transforms and a host of other things.

I'm hoping (but not too sure) we can devise a system towards the end to tie on a pixel shader after all the default stuff has been handled that you or I would really not want to tackle in a simple shader.

The nice thing about glTF is that you can just convert FBX files right into it.

I can't say how long it will take; the majority of the tests are passing now, but we sort of left the hardest stuff for last.

@willmotil I tried downloading SharpGLTF just to test it out, but when attempting to compile I get a huge number of compiler errors due to trying to assign values to read only Accessor fields. Example:


Do you know if the GitHub repo is broken at the moment or if there is something special I need to link against to prevent these errors? From looking at the errors they look legitimate; the _Accessor field is indeed marked as readonly in most instances.

On the FBX front, I’m honestly not sure how anything based off the Skinned Model example from XNA works. I simplified my Blender model down to just a cube that moves up and down via 3 keyframes, and still nothing works. The original SkinnedModel code, Aether Extras, and my in-progress code all fail due to the fact that Blender uses Z-up but MG is of course Y-up. From looking at the code for all of these FBX processors, none of them ever do anything to translate the bones or animations to correct this. MeshHelper.TransformScene does this, but it happens after the animation data has already been read. I've fixed this in my in-progress code by ensuring the TransformScene happens before the animation data is read, but I honestly have no idea how any of these examples have ever worked.
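
For reference, the reordering fix in my in-progress code boils down to roughly this (not my exact code; it assumes a processor derived from the sample's SkinnedModelProcessor, and the class name is made up):

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Content.Pipeline;
using Microsoft.Xna.Framework.Content.Pipeline.Graphics;
using Microsoft.Xna.Framework.Content.Pipeline.Processors;

[ContentProcessor( DisplayName = "Skinned Model - Rotate First" )]
public class RotateFirstSkinnedProcessor : SkinnedModelProcessor
{
    public override ModelContent Process( NodeContent input, ContentProcessorContext context )
    {
        // Bake the Z-up -> Y-up rotation into the whole node tree *before* the
        // base processor reads the bind pose and the animation keyframes, so
        // both end up in the same space. Leave the RotationX/Y/Z parameters at
        // zero so the base processor doesn't rotate everything a second time.
        Matrix rotation = Matrix.CreateRotationX( MathHelper.ToRadians( -90.0f ) );
        MeshHelper.TransformScene( input, rotation );

        return base.Process( input, context );
    }
}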

He might have bumped the NuGet package to rely on VS2019; I don't think he did, though.

The way I do it is…
Download his example project: GitHub - vpenades/SharpGLTF: glTF reader and writer for .NET Standard
Then make my own new project.
Open my project folder.
Then cherry-pick… Drop just his runtime project next to mine.
(that's in the examples folder) https://github.com/vpenades/SharpGLTF/tree/master/examples
Go to VS and add just that project to the solution.
Then I add a reference to his runtime project to my project directly in the reference drop-down.
Then go to the NuGet package manager and grab the SharpGLTF and Core NuGets via the NuGet manager. His runtime obviously needs his core files, so add them to both projects.
You might have to change the framework target; I can't remember if I had to do that.

It should be similar to the below image, though this is an early test version of a different branch.

If you still can't get it working, post back and I'll make a sample project later tonight when I'm stuck on my laptop and can't do any real work.

On the FBX front, I’m honestly not sure how anything based off the Skinned Model example from XNA works. I simplified my Blender model down to just a cube that moves up and down via 3 keyframes, and still nothing works.

The mesh animation part from Blender is actually very tricky to implement and requires a non-standard change to allow bone zero to be used for the mesh animation when no bone animations are present.

The skinned model example is based off an FBX format version that is 10+ years old.
It's probably conceptually missing stuff.

To break it down for a modern FBX:
There is one set of animations global to the model.
Each mesh or mesh part has its own unique set of transformation nodes that are part of the entire set for the whole model; they just keep the portion that applies to them. So these are kind of global too, just split up per mesh.
Each mesh has its own set of inverse transforms (aka bones) that correspond to the transformation nodes. These are used to de-transform the transformation nodes into model space so the corresponding animation transforms can be applied locally.
Since meshes in an FBX file can have unique local model-space coordinates…
these inverseBindPoseTransforms can be different between meshes even though they apply to the same named transform node, which, as said before, is basically just a portion of the global transforms.

Yeah, the up thing is sort of a pain, but that is typically from chaining the transforms incorrectly.
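
To make the chaining concrete, here is roughly how the old sample's AnimationPlayer builds the final skin matrices (paraphrased from the XNA SkinnedModel sample, not anyone's project code; Matrix is the XNA type):

// Given the local matrices sampled from the current keyframes, build the
// model-space bone transforms and then the skin matrices for SkinnedEffect.
static void UpdateTransforms(
    Matrix[] boneTransforms,    // local, straight from the keyframes
    Matrix rootTransform,
    int[] skeletonHierarchy,    // parent index per bone; parents come first
    Matrix[] inverseBindPose,
    Matrix[] worldTransforms,   // output: model-space transforms
    Matrix[] skinTransforms )   // output: what SkinnedEffect consumes
{
    // Chain every bone onto its parent to go from local space to model space.
    worldTransforms[0] = boneTransforms[0] * rootTransform;

    for ( int bone = 1; bone < worldTransforms.Length; bone++ )
        worldTransforms[bone] = boneTransforms[bone] * worldTransforms[skeletonHierarchy[bone]];

    // The inverse bind pose takes the vertices out of the bind pose so the
    // animated transform can place them. If the two sides are in different
    // spaces (Z-up vs Y-up, or different mesh-local origins), the rotation
    // pivots around the wrong point, just like the wheel screenshots above.
    for ( int bone = 0; bone < skinTransforms.Length; bone++ )
        skinTransforms[bone] = inverseBindPose[bone] * worldTransforms[bone];
}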

Thanks for taking a look. It may just be that I should try using NuGet to grab it. I generally just download the zip and hook up everything manually to ensure everything references all external dependencies properly. To ensure all programmers on the team are using the same version of everything, no one has local installations of anything like MonoGame; it all just comes down from our Perforce depot. I did notice that there was some behind-the-scenes reference finding going on, so there may just be a mismatch somewhere. I'll look into it further tomorrow.

As for the FBX, I think I finally found out what is going on, even though I'm not 100% sure why yet. I actually got my custom code and Aether Extras both working with some simple animations on my wheel model. Basically, everything MUST be at origin for things to work properly.

If either the mesh or the bone is offset from origin, you see the offset as shown in the original post. This doesn't surprise me that much, but what does surprise me is that if both are set to the same offset (for example, 10 on the Z axis) you'll still see the offset, even if all transforms are applied so that the translation values for both objects are 0. I would expect that no matter where the objects are in the scene, the wheel would rotate in place if the animation is just the bone rotating in place. However, from looking at the animation data at runtime, the transforms definitely have a translation of 10 on the Y axis (assuming you corrected for the Z / Y mismatch).
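
For anyone wanting to check the same thing, dumping the keyframe translations at runtime is straightforward. A quick sketch, assuming the sample's SkinningData / AnimationClip / Keyframe types; the clip name "Take 001" is just an example, use whatever your exporter produced:

static void DumpKeyframes( SkinningData skinningData )
{
    AnimationClip clip = skinningData.AnimationClips["Take 001"];

    foreach ( Keyframe key in clip.Keyframes )
    {
        Vector3 scale, translation;
        Quaternion rotation;
        key.Transform.Decompose( out scale, out rotation, out translation );

        System.Diagnostics.Debug.WriteLine(
            "bone " + key.Bone + " at " + key.Time + ": translation = " + translation );
    }
}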

The good news is that I should be able to work with this; I just need to make sure that when setting up animated models, they're created around origin as a separate model instead of as a mesh part of the larger level model. Plus I'm now fairly familiar with the content processor code, so I should be able to plug in my own custom shaders without too much additional effort.

That said, I am definitely thankful for you and Nades working on the glTF avenue, as I think that's the way to go for the future. FBX seems to be barely supported in Blender as it is, considering the amount of TODO or EXPERIMENTAL comments throughout the documentation and all the weirdness you see even before getting to our content pipeline.

I’ll post up any additional findings depending on how long I poke into the origin requirements over the next couple of days.

I tried to get some animation working today and it is working with vpenades' SharpGLTF: https://github.com/vpenades/SharpGLTF

That looks great Kwyrky!

I definitely think vpenades' SharpGLTF stuff is generally the way to go for most people. It didn't work out for me because I wasn't able to get custom shaders plugged in very easily, which is something required in my case, but it's definitely the path with the least amount of aggravation compared to FBX. It sounds like all of this is being actively worked on though, so more than likely it will remain the best choice going forward.

For me personally, I ended up using Aether Extras for the actual animation data, but writing my own custom model and animated-model processors. I'm not sure if SharpGLTF does the same, but I know the SkinnedModel example and Aether Extras both strip out non-skinned meshes from the model, expecting the whole thing to be skinned (like a character). Since we're using 2D characters in 3D worlds, our level models are a mixture of skinned and static meshes, so by writing all custom code I could override ConvertMaterial and ProcessGeometryUsingMaterial to set everything up with my custom shaders based on whether a particular mesh was static or skinned.
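
As a rough illustration of that override approach (the class name, shader paths, and the "Skinned" flag below are made up for the example, not my actual processor):

using System.Collections.Generic;
using System.Linq;
using Microsoft.Xna.Framework.Content.Pipeline;
using Microsoft.Xna.Framework.Content.Pipeline.Graphics;
using Microsoft.Xna.Framework.Content.Pipeline.Processors;

[ContentProcessor( DisplayName = "Level Model Processor" )]
public class LevelModelProcessor : ModelProcessor
{
    protected override void ProcessGeometryUsingMaterial(
        MaterialContent material,
        IEnumerable<GeometryContent> geometryCollection,
        ContentProcessorContext context )
    {
        // Note whether this batch of geometry carries blend weights, so that
        // ConvertMaterial (called further down by the base) can pick a shader.
        if ( material != null )
        {
            material.OpaqueData["Skinned"] = geometryCollection.Any(
                g => g.Vertices.Channels.Contains( VertexChannelNames.Weights( 0 ) ) );
        }

        base.ProcessGeometryUsingMaterial( material, geometryCollection, context );
    }

    protected override MaterialContent ConvertMaterial(
        MaterialContent material, ContentProcessorContext context )
    {
        bool skinned = material.OpaqueData.ContainsKey( "Skinned" )
            && (bool)material.OpaqueData["Skinned"];

        // Swap in one of two custom effects depending on the flag.
        EffectMaterialContent custom = new EffectMaterialContent();
        custom.Effect = new ExternalReference<EffectContent>(
            skinned ? "Shaders/LevelSkinned.fx" : "Shaders/LevelStatic.fx" );

        // Carry the textures across from the imported material.
        foreach ( KeyValuePair<string, ExternalReference<TextureContent>> texture in material.Textures )
            custom.Textures.Add( texture.Key, texture.Value );

        return base.ConvertMaterial( custom, context );
    }
}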

I ended up finding out that a large majority of my problems stemmed from Blender -> FBX -> MonoGame conversions and everything having its own expected way of things working. Most of the issues were from Blender and the FBX format and not necessarily any broken code or anything MonoGame was doing.

For reference, the way I got everything working going the FBX route was:

  1. When making the armature in Blender, ensure nothing is skinned to the root bone; instead, everything should be skinned to a child of the root bone. Basically, I make sure there's a Root bone at origin with no scale or rotation, and nothing is ever skinned to it. For some reason, attempting to skin anything to it, even if it never moved, caused some weird offset issues in my animations. I'm still not sure where it was falling apart, but this was really the key to the offset issues I was seeing in the original post.
  2. I often ended up with broken weights when using automatic weights in Blender, even when doing something as simple as skinning a cube to a single bone. This is possibly just something I wasn't understanding due to my inexperience with rigging, but everything worked fine if I skinned the mesh with empty weights and then manually assigned the vertex group.
  3. For some reason, Blender seems to export FBX at 100x scale. There seems to be no reason for this, but apparently it's a known thing. It's not too much of an issue with static models aside from everything being really big, but it seems like while the mesh / vertex information has this scale handled in our pipeline, the animation data does not. So there was a mismatch where the raw animation data and the raw vertex data were using different scales. If you export your FBX with a scale of 0.01 then everything is scaled uniformly and the raw data makes sense.

So yeah… my honest advice is to try to go with SharpGLTF, because FBX is a headache to work with. I don't think it's even particularly well supported in Blender, because if you look at the documentation for the exporter it's full of TODO and TBD comments. I think we'll be sticking with FBX for this project since it's already working and I don't want to go back to this, but the first thing we'll be doing for any future projects is switching to glTF.

Here’s a link to a Twitter post to see the final 3D level with the animated wheel and normal maps:

I think FBX is maybe not that bad, but I agree that it is really not pleasant to work with in this workflow with the Blender FBX exporter and everything. I never really worked with animation stuff though; I would say I just got something rendered to screen. Also, the video above is just something which worked out of the box. It is the example solution from vpenades; I just added a model I exported from Mixamo to FBX together with animations (all as separate FBX files). Then I found an online tool which combines everything into one glTF (gltf or glb) file. I added the file to the sample next to where I found the sample glb files, added a few lines to the sample solution, basically just loading the model like the others are loaded, and it worked.

If I had a better understanding of the topic I could maybe figure out how to use this with custom shaders. Right now, as far as I know, work is being done on getting SharpGLTF integrated into MonoGame. Maybe it will be easier to get custom shaders working when that is done.

I've heard glTF may be the future favorite of 3D file formats for many devs due to being more open and straightforward - it seems to be gaining popularity fast. I noticed an MG example - I'm guessing that one uses SkinnedEffect? I did make a custom effect before based on that one - where I boosted the number of bones supported and added extra eye-shine behavior - so it seems doable. It'd be exciting to see this in a future Content.mgcb.

I came across this the other day and it exports to glTF; I may look at it in the future for importing animations into my engine.

At the moment (as I am not an animator), I am importing my animations via FBX, then using them on the skinned models in my scene.

Nice - an animation combiner with glTF export - quite handy - bookmarked (cool demo character they put up).
Sounds like a good way to do it. I spent many days doing skins and basic animations for some of my characters, and then watched someone else just upload his character, auto-skin it, and grab an online animation and apply it (you can always tweak it after)… and I'm all like… what have I been doing all this time… XD

Yea, I quite like Mixamo.com. When using Unity I use it for prototyping and character stuff.

Yah, great for prototype setups and testing. I'm wondering how many other time-saving tricks are out there which could greatly speed up development for smaller teams or solo developers. Probably something I'll actively look for from now on. I'll be interested to watch your engine development too.
It seems like FBX is currently popular and glTF is the future, so support for both will probably be handy. I wish I had more time to make a unique content writer and animation player and tutorials (for current FBX and glTF) - having to work 2 jobs eats up all my free time, so getting a game done takes forever. ;p
Good luck with the engine btw!

Yea, time as ever is the killer. For my FBX animations I am pretty much using the old code from the XNA skinning sample. I have modified it a bit so I can export and read my own content type now, and I can do animation blending between clips.
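
The blending itself is conceptually just interpolating between the two clips' bone matrices before they go to the effect. A rough sketch using the sample's AnimationPlayer type rather than my actual code:

// Crude blend between two clips: a straight Matrix.Lerp of the skin matrices.
// Decomposing and slerping the rotations gives nicer results, but this shows
// the idea. blendWeight runs from 0 (all "from") to 1 (all "to").
static void DrawBlended( SkinnedEffect effect, AnimationPlayer fromPlayer,
                         AnimationPlayer toPlayer, float blendWeight )
{
    Matrix[] fromBones = fromPlayer.GetSkinTransforms();
    Matrix[] toBones = toPlayer.GetSkinTransforms();
    Matrix[] blended = new Matrix[fromBones.Length];

    for ( int i = 0; i < blended.Length; i++ )
        blended[i] = Matrix.Lerp( fromBones[i], toBones[i], blendWeight );

    effect.SetBoneTransforms( blended );
}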

I think, once I have my basic 2D stuff in place, I am going to look at writing a glTF importer that formats into my current pipeline classes. I had a quick look at it and it seems to just be JSON, so it should be easy to read in and parse :smiley:
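
Just as a taste of the "it's just JSON" part, something like this lists the node names in a .gltf file (a quick sketch with System.Text.Json and a placeholder filename; the binary buffers and the .glb container need real parsing work beyond this):

using System;
using System.IO;
using System.Text.Json;

class GltfPeek
{
    static void Main()
    {
        using ( JsonDocument doc = JsonDocument.Parse( File.ReadAllText( "level.gltf" ) ) )
        {
            // The top-level "nodes" array is the scene graph; each node may have
            // a "name", a "mesh" index, children, and a local transform.
            foreach ( JsonElement node in doc.RootElement.GetProperty( "nodes" ).EnumerateArray() )
            {
                if ( node.TryGetProperty( "name", out JsonElement name ) )
                    Console.WriteLine( name.GetString() );
            }
        }
    }
}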

This is exactly the one I used to combine the exported Mixamo FBX model and animations into one gltf / glb file for use in the SharpGLTF demo / example by vpenades.