How to incorporate an algorithm into a geometry shader.

RenderDoc time! If you’ve never used it before: start RenderDoc, set up your executable’s info in the last tab, and hit Launch. That runs your program; press F12 to capture a frame, then inspect it.

That’ll tell you the full state of all of your draw calls, so you’ll be able to see what’s up (you can do similar with PIX or the VS graphics debugger if you prefer them). If everything looks okay in there, then you’ve probably got a view-projection transform issue.

Edit: are you sure you updated it? You’ve still got a lot of odd S_POSITION stuff in that shader.

Yeah, I’m sure I updated it. I had to use S_Position because of “duplicate sv_position” errors. I’ll look into RenderDoc tomorrow. Thanks!

So I tried RenderDoc but had no luck, so I used GPA. GPA tells me about a few problems, but fixing those still doesn’t change the output.
I wanted to load the output from each of the shaders into a buffer that I can then save to a file from my program. This would let me debug the specific parts of the shader. Is there any way to create a writeable buffer in the shader that can then be saved in C#?

Is there any way to create a writeable buffer in the shader that can then be saved in C#?

Nope. Technically there’s a printf for HLSL, but I doubt MonoGame/SharpDX has the plumbing set up for it; it needs both that plumbing and a GPU that supports it to work.

Rigging up a tiny example. Almost done … the problem is the immediate-context stuff. I derped out on you: you have to grab the right device context with reflection, or tweak MonoGame to expose it.

For shorthand, my example shows it with the handle exposed, but it could be looked up once and stored.

Created a simple example with lighting done in the GS:

The GraphicsDevice.ContextHandle is shorthand for access to the internal _d3dContext of GraphicsDevice.
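If you’d rather not patch MonoGame to expose a handle, here’s a rough sketch of the reflection route instead (the _d3dContext field name is the stock DirectX backend’s, as noted above; the extension method name is just illustrative, and the lookup is cached as suggested):

    // Reflection fallback: fish MonoGame's internal D3D11 device context out once and cache it.
    using System.Reflection;
    using Microsoft.Xna.Framework.Graphics;
    using SharpDX.Direct3D11;

    static class GraphicsDeviceContextLookup
    {
        private static FieldInfo _d3dContextField;

        public static DeviceContext GetD3DContext(this GraphicsDevice device)
        {
            // Look the private field up once; reflection is the slow part.
            if (_d3dContextField == null)
            {
                _d3dContextField = typeof(GraphicsDevice)
                    .GetField("_d3dContext", BindingFlags.NonPublic | BindingFlags.Instance);
            }

            return (DeviceContext)_d3dContextField.GetValue(device);
        }
    }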

Also, note that GitHub doesn’t allow / in paths there, so \\ is used for the content files, i.e. Effects/WaterShader.fx and Effects/WaterGS.hlsl.

Pretty sure I bumbled my math somewhere (it spends too much time in black), but hey … it’s a 40-minute quickie.

Hopefully this helps you out. Deliberately kept as light as possible so the shader isn’t doing anything fancy, just the basics in the VS/PS and then dot-prod’ing a light-vector in the GS.


Hey, I looked at your code. I debugged and eventually found out that it was a problem with the distortion algorithm I was using. I copied it from a ThinMatrix video. Anyway, I found a noise function that fit my needs and was off! The end result is in my repo and here is an example of it working!

Looks good!

o/t: ThinMatrix tuts are probably some of the best out there, riding that fine line of exposition, code, and explanation just right.


Hey there, another quick question: somewhere I heard that MonoGame doesn’t compile cbuffer parameters that aren’t used. Is this true?

Sort of. CBuffers are all or nothing: if you touch one piece of one, the compiler has to take the whole buffer. Curiously, a parallel universe in which OpenGL is not a psychotic mess leaks into this one, and GL Uniform Buffer Objects behave exactly the same.

You’ll have to deal with CBuffers the hard way (i.e. the shader reflection API) if you have disparities between shader stages from unused/new cbuffers.
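For the hard way, a rough sketch with SharpDX.D3DCompiler’s reflection API (assuming you have the raw bytecode for the stage in question; the class and method names here are just illustrative):

    // Enumerate the cbuffers a compiled shader stage actually ended up with,
    // so the C# side can match names, sizes, and offsets.
    using System;
    using SharpDX.D3DCompiler;

    static class ShaderCBufferDump
    {
        public static void DumpConstantBuffers(byte[] shaderBytecode)
        {
            using (var reflection = new ShaderReflection(shaderBytecode))
            {
                for (int i = 0; i < reflection.Description.ConstantBuffers; i++)
                {
                    var cb = reflection.GetConstantBuffer(i);
                    Console.WriteLine($"cbuffer {cb.Description.Name} ({cb.Description.Size} bytes)");

                    for (int v = 0; v < cb.Description.VariableCount; v++)
                    {
                        var variable = cb.GetVariable(v);
                        Console.WriteLine($"  {variable.Description.Name} @ offset {variable.Description.StartOffset}, size {variable.Description.Size}");
                    }
                }
            }
        }
    }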

You can’t disable variable elimination; it would be the stuff of horror stories if you could (even a small uber-shader would blow the stack without even executing).

Why would this be?

And, I’m going to write a small interface and release it.
By any chance do you have any knowledge about using the MonoGame processors? I’m doing this:

1. Use the pipeline to build a dummy version of the effect at runtime (using the default pipeline, it’s kind of tricky to get the directories right).
2. Then expose the constant buffers in the effect and use the effect to generate the cbuffers for your parameters.
3. Then inject the right shaders and cbuffers into the GPU.

The problem I’m having is that I have an effect containing both the real version and a dumbed-down version, using #ifdef and #ifndef to check whether it’s being compiled by the pipeline or by the DirectX compiler. But since I want to use the pipeline at runtime, I have to add these defines manually. How would I do this? Also, I’m getting an error stating that Access to the path 'C:\Effect.xnb' is denied. Any ideas as to why?
Here is my snippet that compiles the effect at runtime:

    public void LoadEffect(string fileName, GraphicsDevice GD)
    {
        // Point the pipeline's intermediate/output directories at a temp folder.
        Pipeline = new PipelineManager(
            Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location),
            Path.Combine(Assembly.GetExecutingAssembly().Location, "\\tempBin"),
            Path.Combine(Assembly.GetExecutingAssembly().Location, "\\tempBin"));
        Pipeline.Profile = GraphicsProfile.HiDef;

        // Processor parameters: the define checked by the #ifdefs, plus the source location.
        OpaqueDataDictionary ODD = new OpaqueDataDictionary();
        ODD.Add("Defines", "USINGMONOGAME");
        ODD.Add("Location", Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location));

        // Build and process the .fx, then hand the compiled bytecode to a runtime Effect.
        var BuiltContent = Pipeline.BuildContent(fileName, processorParameters: ODD);
        var ProcessedContent = Pipeline.ProcessContent(BuiltContent);
        var eff = new Effect(GD, ((CompiledEffectContent)ProcessedContent).GetEffectCode());
        this.InternalEffect = eff;

        // Delete the intermediate .xnb afterwards.
        File.Delete(".\\tempBin\\" + fileName.TrimEnd(".fx".ToArray()) + ".xnb");
    }

Thanks!

Actually I think I got it! Here is the code:

public void LoadEffect(string fileName, GraphicsDevice GD, string Defines)
{
    this.Pipeline = new PipelineManager(
        Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location),
        Path.Combine(Assembly.GetExecutingAssembly().Location, "\\tempBin"),
        Path.Combine(Assembly.GetExecutingAssembly().Location, "\\tempBin"));
    Pipeline.Profile = GraphicsProfile.HiDef;

    // Drive the importer and processor directly so the defines can be injected before compiling.
    EffectProcessor EProcessor = new EffectProcessor();
    EffectContent EContent = new EffectContent();
    EffectImporter EImporter = new EffectImporter();

    PipelineImporterContext ImporterContext = new PipelineImporterContext(Pipeline);

    var ImportedData = EImporter.Import(fileName, ImporterContext);

    EContent.EffectCode = ImportedData.EffectCode;

    string importerName = null, processorName = null;
    Pipeline.ResolveImporterAndProcessor(fileName, ref importerName, ref processorName);

    // A build event is still required to construct the processor context.
    var contentEvent = new PipelineBuildEvent
    {
        SourceFile = fileName,
        DestFile = fileName + ".built", // Never actually used
        Importer = importerName,
        Processor = processorName,
        Parameters = new OpaqueDataDictionary(),
    };

    ContentProcessorContext ProcessorContext = new PipelineProcessorContext(Pipeline, contentEvent);

    // Inject the defines, process the effect, and hand the bytecode to a runtime Effect.
    EProcessor.Defines = Defines;
    EContent.Identity = ImportedData.Identity;

    CompiledEffectContent ProcessedContent = EProcessor.Process(EContent, ProcessorContext);
    Effect EndResult = new Effect(GD, ProcessedContent.GetEffectCode());
    CommonEffect.InternalEffect = EndResult;
}

Hi, I just tried to compile the project from your [repo](https://github.com/OptimisticPeach/GardeningGame) and it seems like there are some assets missing. Can you help?

Ah! I’m sorry for the late reply. I uploaded the project mostly for safekeeping and to use in other examples. In theory you could run it, but it is in fact missing some assets (which I bought and am therefore not allowed to release, hence this entry in my .gitignore). For this, I can make a new branch that removes the parts which depend on those assets. This would basically limit it to the water example.

On a side note, if you have questions about geometry shaders, compute shaders and the like (or really most things that aren’t exposed publicly in the MonoGame API), you should contact @AcidFaucent, who was an excellent help throughout my endeavor. I’ve since moved on to other things (Rust, to be specific. It’s most definitely a great language!) so I don’t quite recall most of the details :sweat_smile:

@11110 I’ve updated my repo to include a cleaned branch! That should compile and is quite a bit cleaner. This will remain a branch, though, as it removes most of the “game” code and results in the following:

  • A circular patch of water
  • With alternating triangles like so (With colour for effect):
  • And moving surface
  • The shader which controls it is located here.
    • This shader updates on changes so the changes can be viewed live!
    • This shader includes a geometry shader stage

With some other random algorithms and things scattered around:


Hi @AcidFaucent. Thanks for the great example.

Unfortunately in my case the returned buffers are all null. Could you think of a reason why this might be?

Thank you : )

Edit: Nevermind. They are empty during the first frame only.

@Tom @harry-cpp @mrhelmut @Jjagg etc.

This is seemingly pretty amazing work on getting other shaders into MonoGame, at least for DX, as well as steps towards using them in concert with the pipeline, which would be a great start even if GL couldn’t do it yet. Personally I wish we could get mesh shaders in MG even more than this, but this is pretty cool.

What are the challenges of mimicking this process to add it to the official version of MG in some suitable way? Even if SpriteBatch can’t touch it (and honestly you probably wouldn’t use it with SpriteBatch even if you could), it would be nice to be able to use it out of the box for regular primitive drawing.

Just looking at the view count shows there are a lot of people interested in this topic.

From what I recall, the major changes would involve adding conditional compilation to only allow geometry shaders on DirectX (it’s been a while since I’ve done C#, but #if symbols should cover that), and then loading them into the SharpDX device.

Since geometry shaders are just another stage in the pipeline, the method for using them is pretty much the same as for any other shader: bind them to the device, set the constant buffers, and then invoke a draw call.
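As a rough sketch of both points (the DIRECTX symbol and the helper below are illustrative, not MonoGame API; SharpDX.Direct3D11 and the device context discussed earlier in the thread are assumed):

    #if DIRECTX
    using Microsoft.Xna.Framework.Graphics;
    using SharpDX.Direct3D11;
    using Buffer = SharpDX.Direct3D11.Buffer;

    static class GeometryShaderDraw
    {
        // Bind a standalone GS around a normal MonoGame draw; the VS/PS come from the
        // Effect the caller has already applied.
        public static void Draw<T>(
            GraphicsDevice device,
            DeviceContext context,
            GeometryShader geometryShader,
            Buffer gsConstants,
            T[] vertices,
            int triangleCount) where T : struct, IVertexType
        {
            context.GeometryShader.Set(geometryShader);
            context.GeometryShader.SetConstantBuffer(0, gsConstants);

            device.DrawUserPrimitives(PrimitiveType.TriangleList, vertices, 0, triangleCount);

            // Unbind so later draws that don't expect a GS aren't affected.
            context.GeometryShader.Set(null);
        }
    }
    #endif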

Additionally, for reference, here’s how I compiled them, since the content processor doesn’t (didn’t?) allow me to add the GS as part of a pass in a technique.
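In outline, the approach with SharpDX’s D3DCompiler looks roughly like this (a sketch rather than the exact code; the entry point and the gs_5_0 profile are placeholders):

    // Compile a standalone geometry shader outside the content pipeline.
    using SharpDX.D3DCompiler;
    using SharpDX.Direct3D11;

    static class GeometryShaderCompiler
    {
        public static GeometryShader Compile(Device d3dDevice, string path, string entryPoint)
        {
            // gs_5_0 targets shader model 5; drop to gs_4_0 for feature level 10 hardware.
            using (var bytecode = ShaderBytecode.CompileFromFile(path, entryPoint, "gs_5_0"))
            {
                return new GeometryShader(d3dDevice, bytecode);
            }
        }
    }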


Since it is not possible to get MojoShader to translate geometry shaders, it might be interesting to look into a different translation API, such as using SPIR-V as the backend. This would be beneficial since glslangValidator can already take HLSL as input, and there are SPIR-V tools to emit GLSL, HLSL, MSL, etc. Also, this would enable not only geometry shaders, but also tessellation shaders and (I can’t recall if this is already enabled in MonoGame) compute shaders.

In any case, this is just my opinion on how you could move forward with geometry shaders in MonoGame, and I apologize for not being of more use since I haven’t worked with MonoGame in a while.

Yeah, this looks really good. I’ll see if I can turn this into a pull request.


I posted some details on the pull request progress, in case you are interested:
