Deferred Engine Playground - download

I agree, maybe I should reconsider my stance.

Oh, that’s great to hear, especially after some others mentioned problems (which I can’t fix for the newest development builds of MonoGame)

Thanks, I added the tech demo aspect to the readme

IMHO you really should add that repo to the awesome-list under ‘Effects’.
Despite your commendable humility, it really belongs there.
:slight_smile:

The mesh rendering is actually pretty “advanced” in comparison to the simple myModel.Draw()
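For reference, the “simple” path is the built-in helper, which just draws every mesh of the model with its BasicEffect - no batching, sorting or culling (world/view/projection here are the usual transform and camera matrices):

    // Draws all ModelMeshes with their default effects, in whatever order they come.
    myModel.Draw(worldMatrix, viewMatrix, projectionMatrix);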

It’s fairly messy and probably easy to get lost in, but here is what I do for all entities:

(excuse the naming)

  • when an entity is created I register it in my MeshMaterialLibrary
  • this library has an array of MaterialLibraries
  • each MaterialLibrary has an array of MeshLibraries (with all the different meshes for each material)
  • these MeshLibraries have an array of all the instances of this mesh, plus obviously the buffer data for the mesh.

The idea is that many objects can use the same material - so let’s make sure we batch our rendering by material. If we sort all meshes by material we have the fewest possible render state changes!
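A minimal sketch of that hierarchy and the registration step (the names here are illustrative, not the actual classes in the repo):

    using System.Collections.Generic;
    using Microsoft.Xna.Framework;
    using Microsoft.Xna.Framework.Graphics;

    // One mesh (shared buffers) plus every instance that uses it.
    public class MeshLibrary
    {
        public ModelMeshPart MeshPart;
        public List<Matrix> InstanceWorldMatrices = new List<Matrix>();
    }

    // One material plus every mesh that is drawn with it.
    public class MaterialLibrary
    {
        public object Material;                 // stand-in for the engine's material type
        public List<MeshLibrary> Meshes = new List<MeshLibrary>();
    }

    public class MeshMaterialLibrary
    {
        public List<MaterialLibrary> Materials = new List<MaterialLibrary>();

        // Called when an entity is created: find (or add) the material bucket,
        // find (or add) the mesh bucket inside it, then register the instance.
        public void Register(object material, ModelMeshPart meshPart, Matrix world)
        {
            var matLib = Materials.Find(m => m.Material == material);
            if (matLib == null) { matLib = new MaterialLibrary { Material = material }; Materials.Add(matLib); }

            var meshLib = matLib.Meshes.Find(m => m.MeshPart == meshPart);
            if (meshLib == null) { meshLib = new MeshLibrary { MeshPart = meshPart }; matLib.Meshes.Add(meshLib); }

            meshLib.InstanceWorldMatrices.Add(world);
        }
    }

Drawing by material then means: set the material’s render state once, then draw every MeshLibrary stored under it.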

In the gameSettings file you can activate this “batchByMaterial” mode with this setting, which is “false” by default:
public static bool g_BatchByMaterial = true;

You can see how I only draw 39 materials in one mode, and 209 materials (the same number as meshes) in the other one.

However, it turns out that fewer state changes are irrelevant for the performance of my renderer - in fact, the batched mode is roughly 10% slower!

How is that?

Well, even though we draw efficiently in terms of material changes, it matters more that in the other mode, by some luck, the object order is closer to optimal (front-to-back drawing), and therefore more pixels can be rejected by the early depth test.

Note: At this point I don’t control in which order meshes are drawn - the order is basically determined by the order they were registered.

So my idea was: get the distance of each material from the camera (squared distance is cheaper) by averaging the distances of all meshes that use that material, and then build a list of references to the materials, ordered from near to far.

So I did that.

But it doesn’t help when you have two objects with the same material - one in the front and one in the back - the averaged distance might end up behind objects that are actually behind the front instance, so that instance is drawn too late to benefit.

So now I have reverted to not drawing all meshes with the same material at once. I treat each mesh as if it had a unique material and then sort them front to back. That is simply more efficient in this case.
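Roughly, that sort looks like this (a sketch only - it assumes each mesh entry exposes a world-space center, which is not necessarily how the repo stores it):

    using System.Collections.Generic;
    using Microsoft.Xna.Framework;

    // Hypothetical per-mesh entry with a precomputed world-space center.
    class MeshEntry
    {
        public Vector3 WorldCenter;
    }

    // Sort front to back by squared distance to the camera;
    // squared distance skips the square root but preserves the ordering.
    static void SortFrontToBack(List<MeshEntry> meshes, Vector3 cameraPosition)
    {
        meshes.Sort((a, b) =>
            Vector3.DistanceSquared(a.WorldCenter, cameraPosition)
                .CompareTo(Vector3.DistanceSquared(b.WorldCenter, cameraPosition)));
    }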

Apart from that, a pretty neat feature of the MonoGame content compiler is that the ModelMeshes have bounding spheres created by default. These have to be translated and scaled correctly for each submesh, and I also do that in this whole library. It’s pretty complicated how everything is interwoven - not great, maybe.

With these bounding spheres I can easily do frustum culling, which is great. It makes rendering much more efficient and works per submesh.
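The two steps in a nutshell - move the pipeline-generated sphere into world space, then test it against the camera frustum (a sketch, not the repo’s actual code):

    using Microsoft.Xna.Framework;
    using Microsoft.Xna.Framework.Graphics;

    static bool IsVisible(ModelMesh mesh, Matrix world, Matrix view, Matrix projection)
    {
        // The content pipeline stores the sphere in mesh space, so translate/scale
        // it into world space to match where the submesh actually sits.
        BoundingSphere worldSphere = mesh.BoundingSphere.Transform(world);

        // Skip the submesh entirely if the sphere lies outside the view frustum.
        var frustum = new BoundingFrustum(view * projection);
        return frustum.Intersects(worldSphere);
    }

In practice the frustum would be built once per frame rather than once per mesh.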

I think I can’t manually add it to the repo - I would have to ask the guy

Maybe I will try this distance-sorting technique in my renderer. I did not think about the possible performance improvement from front-to-back sorting. However, I should perhaps first find out where the bottlenecks are. I think I am also doing frustum culling, but I’m not sure at the moment - it has been a while since I worked on the renderer.

Next thing for me will be to get all the models from the XNA 4.0 version into a format that works with MonoGame :slight_smile:

Thanks for the detailed explanation of the used techniques!

Sorting meshes is OK with a few of them (under 500 maybe, I think), but if you have large areas like in GTA or Just Cause, it is faster to use a clustered shading technique than to sort 10,000 meshes, IMHO.

I created a pull request on the awesome-list.

Temporal Reprojection Antialiasing. Blending is not yet weighted, but it’s really interesting and the results are pretty good.

Some issues are still present, but it’s a good start.

Disable with
g_TemporalAntiAliasing = false

If the models/meshes are all or mostly static, then it might be OK to sort them once. But I have some tasks to do first anyway. I did the mapping from another model with a lot of meshes - it was a bit tedious :slight_smile:

What is clustered shading?

“Splitting” the view frustum into clusters, determining which lights belong in which cluster, and then only shading the needed lighting per cluster. This is especially interesting for forward renderers, where you really want to have as few lights/light tests per object as possible.

Think of dividing the view into a table with rows and columns, but add a 3rd dimension to it.

This needs some setup but makes light shading a tad cheaper.
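A very rough sketch of the first part in C# (names and grid sizes are purely illustrative; real implementations usually build the cluster grid and assign lights on the GPU) - given a view-space position, find its cluster:

    using System;
    using Microsoft.Xna.Framework;

    static int GetClusterIndex(Vector3 viewPos, Matrix projection, float near, float far)
    {
        // Illustrative grid: 16 x 9 screen tiles, 24 logarithmic depth slices.
        const int clustersX = 16, clustersY = 9, clustersZ = 24;

        // Project into clip space to find the screen tile.
        Vector4 clip = Vector4.Transform(new Vector4(viewPos, 1f), projection);
        float u = clip.X / clip.W * 0.5f + 0.5f;
        float v = clip.Y / clip.W * 0.5f + 0.5f;
        int x = Math.Min((int)(u * clustersX), clustersX - 1);
        int y = Math.Min((int)(v * clustersY), clustersY - 1);

        // Depth slice: -viewPos.Z is the distance in front of a right-handed camera,
        // distributed logarithmically between the near and far plane.
        float depth = -viewPos.Z;
        float slice = (float)(Math.Log(depth / near) / Math.Log(far / near)) * clustersZ;
        int z = Math.Min(Math.Max((int)slice, 0), clustersZ - 1);

        return x + clustersX * (y + clustersY * z);
    }

Each cluster then keeps a small list of the lights whose bounding volumes touch it, and the shading pass loops only over that list instead of over every light in the scene.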

However, I don’t think it has any effect on geometry culling. AFAIK, the people who use clustered shading usually do a hierarchical z-pass and then cull lights/objects in a compute shader, usually relying on heavily optimized models for the prepass. But that technique is independent of the shading method used.

This is done entirely on the GPU, but some CPU culling beforehand goes a long way toward making sure the GPU has an easier time with it.

Have a look at the Avalanche engine PDF or the Frostbite ones; they use this clustered shading with deferred rendering, and it seems more efficient than tiled shading. But there are some problems with depth discontinuities, especially with foliage in the clusters nearest to the near plane. Some call it 2.5D culling.
I’ll give it a try.
Maybe I misunderstood when both of you said that sorting models beforehand was faster.

I now render the emissive materials in a bigger FOV and reproject for the diffuse/specular calculations.

That means that the light does not disappear immediately when the object is out of view. This costs basically nothing extra (apart from one matrix multiplication for the reprojection)
EDIT: It actually runs much better now - I need to find out why

A new console variable has been introduced: g_EmissiveDrawFOVFactor, which determines how much wider the field of view used for the emissive pass is.
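A small sketch of what the widened projection and the extra reprojection matrix could look like (assuming the factor simply scales the camera’s vertical FOV - the actual implementation may do it differently):

    using Microsoft.Xna.Framework;

    static Matrix CreateEmissiveProjection(float cameraFov, float aspect, float near, float far, float fovFactor)
    {
        // Scale the vertical FOV by g_EmissiveDrawFOVFactor, clamped below Pi (the maximum valid FOV).
        float widenedFov = MathHelper.Min(cameraFov * fovFactor, MathHelper.Pi - 0.01f);
        return Matrix.CreatePerspectiveFieldOfView(widenedFov, aspect, near, far);
    }

    static Matrix CreateReprojection(Matrix cameraProjection, Matrix emissiveProjection)
    {
        // The single extra matrix multiplication: takes a position from the normal
        // camera's clip space into the widened emissive target's clip space
        // (row-vector convention, as used by XNA/MonoGame).
        return Matrix.Invert(cameraProjection) * emissiveProjection;
    }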

Wrote a blog post about it


Nice read. Now wondering what a textured version would look like :smile:

Anybody else having issues with the placeholder images not loading? They work when clicking on the image, but not the inline images inside the posts… I just get the missing-image icon… Only noticed it on this thread recently…

EDIT - This is only an issue on some of them…

Same problem here. Mostly with GIFs.


Thanks for the explanation. I’m not familiar with hierarchical z-pass. No need for such an advanced technique for me at the moment.

I think I was a bit confused. You’re right. The sorting has to be done basically each time the camera moves.

I would need the MeshBuilder class. There is already an open issue on GitHub :slight_smile:

I’ve added directional lights (sun light) and a new type of shadow.

For this light I write the shadow map to a different render target with some Poisson disk filtering.
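For illustration only (not the repo’s code): the CPU side can generate a small set of Poisson-disk sample offsets - points on the unit disk that are never closer than a minimum distance - and hand them to the shadow shader, which averages the shadow-map comparisons at those offsets.

    using System;
    using System.Collections.Generic;
    using Microsoft.Xna.Framework;

    // Simple rejection sampling; fine for a small kernel that is built once.
    // Pick minDistance small enough for the requested count, or this loops forever.
    static Vector2[] GeneratePoissonDiskKernel(int count, float minDistance, int seed = 1234)
    {
        var random = new Random(seed);
        var samples = new List<Vector2>();
        while (samples.Count < count)
        {
            // Uniformly distributed candidate inside the unit disk.
            float angle = (float)(random.NextDouble() * Math.PI * 2);
            float radius = (float)Math.Sqrt(random.NextDouble());
            var candidate = new Vector2((float)Math.Cos(angle), (float)Math.Sin(angle)) * radius;

            // Accept it only if it keeps the minimum distance to all accepted samples.
            bool tooClose = false;
            foreach (var s in samples)
                if (Vector2.Distance(s, candidate) < minDistance) { tooClose = true; break; }

            if (!tooClose) samples.Add(candidate);
        }
        return samples.ToArray();
    }

The offsets, scaled by a filter radius in shadow-map texels, would then be uploaded once as an effect parameter.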

Then I combine it with the SSAO pass and use a bilateral blur filter to make the shadows soft (in screen space).

I think I’ll go ahead and make it possible to use different shadow types for each light (VSM, PCF, Poisson, and possibly combined with SSBlur).