I’m trying to use a geometry shader to emulate water, but the programmable pipeline is ordered like so:
Which means that the geometry stage comes after the vertex stage. I wanted to ask: if I use the vertex stage to do the transformations, and the geometry stage to do the normal calculations, where do I do the lighting?
PS. I will try to use the underlying SharpDX.Direct3D11.Device to set the geometry shader and its variables.
The pixel shader just expects to receive vertex coordinates in clip space - it doesn't care who put them there. I usually do the transforms in both the VS and the GS (passing the unmodified position down to the GS) so that when things break there's at least something on the screen.
You'd do the lighting either in the geometry shader or the pixel shader - either compute the color in the GS and pass it to the PS, or pass the normal to the PS and let it do the lighting.
For lighting from the triangle's face normal you really don't care what space your coordinates are in at the GS. So a regular VS with an almost pass-through GS makes sense, unless I'm being a dolt and clip space would mean distorted normals.
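For per-face lighting, the normal is just the normalized cross product of two triangle edges. Here's a minimal sketch of that math in C# with System.Numerics (names are mine; the HLSL version in the GS is the same cross/normalize on float3s):

```csharp
using System;
using System.Numerics;

class FaceNormal
{
    // Face normal of triangle (a, b, c): cross product of two edges, normalized.
    // Winding order determines which way the normal points.
    static Vector3 Compute(Vector3 a, Vector3 b, Vector3 c)
        => Vector3.Normalize(Vector3.Cross(b - a, c - a));

    static void Main()
    {
        // A triangle lying flat in the XZ plane gets a straight-up normal.
        var n = Compute(new Vector3(0, 0, 0), new Vector3(0, 0, 1), new Vector3(1, 0, 0));
        Console.WriteLine($"{n.X} {n.Y} {n.Z}"); // 0 1 0
    }
}
```

Since the cross product of two edges isn't length- or angle-preserving under a perspective transform, doing this on pre-projection (world- or view-space) positions is the safe choice.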
I have a bunch of commits for exposing the rest of the pipeline on DX11. Totally untested (haven't had time) aside from the 2MGFX changes, but it should be super close, and might at least be some reference for what you'll have to do - or at least what you'll have to watch out for as far as render state goes on the MG side.
@AcidFaucent I've been poking around at your fork, and I decided to try to implement it myself in my game. I'm having a hard time instantiating the geometry shader. When I use the handle in the GraphicsDevice that I inherit from Game, I get this error:
Using the HiDef profile? The most likely cause is a shader profile error (i.e. you actually compiled a PS instead of a GS), an unspecified entry point, or the shader didn't actually compile OK.
Turn on the DX debug layer - it's basically not an option to have it off when you're dealing with the other stages of the pipeline. Especially tessellation, which can explode in spectacular ways (an incomplete bind will nuke the graphics driver and take the OS with it - pretty much everywhere).
Has to be HiDef - Reach is not an option. You have to set HiDef for MonoGame to request 10, 10.1, and 11 feature levels during device creation. Reach only requests the DX9 compatibility levels, which do not support a GS (if you provide a list at device creation it can only give you the best it can provide from that list) - shader compilation is its own little universe and doesn't reflect much of anything.
Your other shaders also need to target vs_4_0 and ps_4_0 at the minimum in order to be able to cope with the presence of a GS at all. You can’t stick a GS between SM3 shaders (barring driver witchcraft). SM4 and SM5 can coexist in the pipe (within reason, SM5’s few semantic enhancements notwithstanding).
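For reference, requesting HiDef is a one-liner in your Game constructor before the device is created (a sketch; the field name `graphics` is whatever yours is):

```csharp
// In your Game subclass constructor - must be set before device creation
// so MonoGame requests the 10/10.1/11 feature levels.
graphics = new GraphicsDeviceManager(this);
graphics.GraphicsProfile = GraphicsProfile.HiDef;
```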
FYI: it was not fun for me the first time I dove into these shader stages. They slap you in the face with everything you don’t know. Expect a rough ride.
I’ll feed your shader through something this evening and see if I spot anything odd.
Don't get why it'd be greyed out - are you on a machine with an OS and card that support DX10+? The debug layer might not work with a compatibility GPU.
Edit: skimmed your stuff really quick, looks like you’re basically there - I’d expect what I see to pretty much work.
So I updated the gist to include the creation of the buffer. I get an
"HRESULT: [0x80070057], Module: [General], ApiCode: [E_INVALIDARG/Invalid Arguments], Message: The parameter is incorrect" exception when I create the buffer on line 99.
I looked into the DXCPL grey-out, added my program to the debug list, and now it's not greyed out.
Your CBuffer is an illegal size. It has to be a multiple of 16 bytes (yours is 268; the closest legal size is 272).
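If you're sizing the buffer from a computed byte count, rounding up to the next multiple of 16 is one bit-twiddle (a sketch; `Align16` is my name, not a D3D API):

```csharp
using System;

class CBufferAlign
{
    // Round a byte count up to the next multiple of 16, since D3D11
    // rejects constant buffers whose size isn't 16-byte aligned.
    static int Align16(int sizeInBytes) => (sizeInBytes + 15) & ~15;

    static void Main()
    {
        Console.WriteLine(Align16(268)); // 272 - the case from this thread
        Console.WriteLine(Align16(272)); // 272 - already aligned, unchanged
    }
}
```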
Also, you’re guessing at the location of elements in the CBuffer - they’re not tightly packed. You really need to use the shader reflection APIs to query where things actually are in the CBuffer.
If the CBuffer is not entirely made of float4s then nothing will be where you think it is (the packing rules are documented, but you won't have them in your head - and the reflection APIs eliminate the need to).
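To see why "not tightly packed" bites, here's a sketch of the core documented rule: a variable can't straddle a 16-byte register boundary. (Names are mine, and this ignores arrays and matrices, which have extra rules - real code should query the reflection API instead of computing this by hand.)

```csharp
using System;

class HlslPacking
{
    // If a variable of `size` bytes placed at `offset` would cross into the
    // next 16-byte register, it gets bumped to the start of that register.
    static int Place(int offset, int size) =>
        (offset / 16 != (offset + size - 1) / 16) ? (offset + 15) & ~15 : offset;

    static void Main()
    {
        // float (4 bytes) then float3 (12): the float3 still fits in register 0.
        int off = Place(0, 4);                  // float at offset 0
        Console.WriteLine(Place(off + 4, 12));  // 4 - packed tight

        // float3 then float3: the second would straddle, so it jumps to 16.
        off = Place(0, 12);                     // float3 at offset 0
        Console.WriteLine(Place(off + 12, 12)); // 16 - leaves a 4-byte hole at 12
    }
}
```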
The core of what you're doing there is workable - you're just botching the CBuffer setup, is all (explicitly writing 272 instead of params.sum will get you past the error, but won't make your buffer correct).
The absolute easiest way to plug a GS in is to just use MGFX as usual but inject the geometry shader after the EffectPass is applied but before you draw anything.
That only works on DX11 (GL uses whole program linking so you have to roll an entire raw shader library).
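A sketch of that injection, assuming a SharpDX-backed DX11 MonoGame build inside your Game's Draw method, and that you already have compiled GS bytecode in `gsBytecode` (both `gsBytecode` and `effect` are assumed names):

```csharp
// Untested sketch - bind the GS between EffectPass.Apply() and the draw call.
var device = (SharpDX.Direct3D11.Device)GraphicsDevice.Handle;
var gs = new SharpDX.Direct3D11.GeometryShader(device, gsBytecode);

foreach (var pass in effect.CurrentTechnique.Passes)
{
    pass.Apply();                                    // MGFX binds VS/PS and state
    device.ImmediateContext.GeometryShader.Set(gs);  // inject the GS afterwards
    // ... issue your draw call here ...
}
device.ImmediateContext.GeometryShader.Set(null);    // unbind so later draws are unaffected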
Grab all of them with the SharpDX API VertexShader.GetConstantBuffers, iterate the array, and set each one with GeometryShader.SetConstantBuffer(currentIdx, buffers[currentIdx]).
Using the style you've been using:

```csharp
// assume low-spec DX10: 8 constant buffer slots
// Requires that the GS uses the same CBuffers at the same registers as the VS;
// CBuffers the GS doesn't use are A-okay - the shader doesn't care.
var context = ((Device)GraphicsDevice.Handle).ImmediateContext;
Buffer[] vsBuffers = context.VertexShader.GetConstantBuffers(0, 8);
if (vsBuffers != null)
{
    for (int i = 0; i < vsBuffers.Length; ++i)
        context.GeometryShader.SetConstantBuffer(i, vsBuffers[i]);
}
```
I managed to get the debug layer slightly working, but now I’m getting a C# exception…
I’ve updated my gist to include the changes to the code. GardeningGame is the game I’m aiming to eventually use this for, and Terrain.Water is basically just a flat primitive mesh with water-like colours. I don’t use the primitive effect in GardeningGame, instead I use the one in the gist. @AcidFaucent
I figured it out! First, I was using the wrong settings in the SharpDX debug panel. Second, I needed to expose the internal shaders in MonoGame and set them manually in my program. Third, I needed to look at the Visual Studio output to find my errors (all of which were semantic mismatches). Now I get no errors or warnings - just a blank screen. The gist is now updated to include the most recent code.
RenderDoc time! If you've never used it before: start RenderDoc, set up the launch info in the last tab, then hit Launch. That runs your program; press F12 to capture a frame, then inspect it.
That’ll tell you the full state of all of your draw calls so you’ll be able to see what’s up (you can do similar with PIX/VS-GFX-debugger if you prefer it). If everything looks okay in there then you’ve probably got a view-projection transform issue.
So I tried RenderDoc but had no luck, so I used GPA. GPA tells me about a few problems but fixing those still doesn’t change the output.
I wanted to load the output from each of the shaders into a buffer that I can then save in my program to a file. This would allow me to debug the specific parts of the shader. Is there any way to create a writeable buffer in the shader that can then be saved in C#?
Hey, I looked at your code. I debugged and eventually found out that it was a problem with the distortion algorithm I was using. I copied it from a ThinMatrix video. Anyway, I found a noise function that fit my needs and was off! The end result is in my repo and here is an example of it working!