My googling hasn’t given me a clear answer for 2018, so… is it possible to do a Texture Lookup in the Vertex Shader using the OpenGL version of Monogame?
If so, how? My XNA project uses tex2Dlod() in the Vertex Shader and feeds the textures in as shader parameters, but this fails at the compile stage with the MGCB tool.
If it makes any difference, I use the HiDef profile.
I recognise it’s not a simple matter to support something across so many different hardware platforms, but that one specifically is going to cause me grief: do you know if there’s a branch of the codebase that has attempted to address this issue anywhere? My goal is to support Mac and Linux: is OpenGL the only solution for that?
I’m trying to render grass billboards atop a height map: my solution with vertex textures was to render a static buffer of billboards in the vicinity of the camera and translate them onto the terrain using its heightmap. Without that capability, the only solution I can think of is to translate the grass billboards on the CPU and feed them to the shader with a DynamicVertexBuffer, which adds an extra CPU cost to an already heavily CPU-bound game.
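To make that fallback concrete, here’s a rough sketch of the per-frame CPU work it implies, with Python standing in for the C# side (the bilinear sampler and all names here are illustrative, not MonoGame API): sample the heightmap for each billboard, then rebuild the vertex data before re-uploading it.

```python
def sample_height(heightmap, x, z):
    """Bilinearly sample a 2D heightmap (list of rows) at fractional (x, z)."""
    h = len(heightmap) - 1
    w = len(heightmap[0]) - 1
    x = min(max(x, 0.0), w)
    z = min(max(z, 0.0), h)
    x0, z0 = int(x), int(z)
    x1, z1 = min(x0 + 1, w), min(z0 + 1, h)
    fx, fz = x - x0, z - z0
    top = heightmap[z0][x0] * (1 - fx) + heightmap[z0][x1] * fx
    bot = heightmap[z1][x0] * (1 - fx) + heightmap[z1][x1] * fx
    return top * (1 - fz) + bot * fz

def place_billboards(heightmap, positions):
    """For each (x, z) billboard near the camera, look up the terrain
    height on the CPU; this is the work VTF would move to the GPU."""
    return [(x, sample_height(heightmap, x, z), z) for x, z in positions]
```

The result of `place_billboards` is what would get copied into the DynamicVertexBuffer every frame, which is exactly the cost I’m trying to avoid.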
VTF is supported for DirectX platforms, but not OpenGL. And yes, if you want to support Mac and Linux you’ll need OpenGL VTF support.
I’m not sure how hard this is to implement, I’ll try to look into it this weekend. There’s a closed issue regarding VTF (I guess it got closed when VTF was implemented for DX): https://github.com/MonoGame/MonoGame/issues/2602
Thanks, I’d sincerely appreciate it. I’m using the source code, but unfortunately I don’t have the OpenGL expertise to know where to start to add a feature like this to my game.
Just a prod to ask if anyone has had a chance to look into implementing Vertex Texture Fetch on the OpenGL platform? I can’t see anything about it in the commit list.
Would it be helpful for me to open an issue on the github development branch?
I’m not familiar with VTF or what problems it is supposed to solve. Unless I’m missing the point, which I very well could be, as I have no idea what VTF means.
What you see in that video is already possible using hardware instancing with OpenGL.
I have a basic GL example i could post if you like.
The idea is that the particles are created as vertices with a start position and color. Then current-position and velocity textures are maintained, where the velocity depends on outside variables (such as the mouse, or it could be totally random). Each frame the velocities are calculated and added to the positions. Then the particles are drawn using the original vertices: the position texture is sampled in the vertex shader (that’s the VTF) to get the current position, and the vertex is moved there.
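Stated in CPU terms, the per-frame pass those textures implement is tiny; here’s a minimal Python sketch of the logic (on the GPU both arrays would be textures written by a render pass, and the names are mine, not any real API):

```python
import random

def step_particles(positions, velocities, dt, jitter=0.0):
    """One frame of the GPU particle system, emulated on the CPU:
    optionally perturb each velocity, then integrate the positions.
    On the GPU this runs as a pass writing new velocity/position
    textures; the draw pass then samples positions per vertex (VTF)."""
    for i in range(len(positions)):
        vx, vy = velocities[i]
        if jitter:
            vx += random.uniform(-jitter, jitter)
            vy += random.uniform(-jitter, jitter)
        velocities[i] = (vx, vy)
        px, py = positions[i]
        positions[i] = (px + vx * dt, py + vy * dt)
```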
This can also be done much more easily with the CPU, but with this technique the particle system is maintained with the GPU which is obviously very desirable if you have millions of particles.
If the path of your particles won’t change, then you can define a function that returns a position given the time passed; then VTF, and maintaining these textures, won’t be necessary. I might consider doing this instead and simply create some variety in the functions by passing a random number with each vertex.
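For example, such a closed-form function might look like this (a Python sketch; the particular motion formula and the use of the seed are just one possible choice, and the same expression could live unchanged in a vertex shader):

```python
import math

def particle_position(start, t, seed):
    """Stateless particle path: position depends only on the start
    position, elapsed time t, and a per-vertex random seed, so no
    position/velocity textures need to be maintained."""
    sx, sy = start
    speed = 1.0 + 0.5 * math.sin(seed)   # per-particle variety from the seed
    angle = seed                          # per-particle direction from the seed
    gravity = -9.8
    x = sx + math.cos(angle) * speed * t
    y = sy + math.sin(angle) * speed * t + 0.5 * gravity * t * t
    return (x, y)
```

Because the function is pure, any frame can be evaluated directly without iterating previous frames.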
This is just one of the applications of VTF.
The solution should be possible with both DirectX and OpenGL.
There’s a link for the source code in the description of the video which uses XNA 3.1.
If your solution doesn’t involve iterating a list of all the particles at each Update or Draw call on the CPU and works with MonoGame with DirectX and OpenGL then I’d love to hear it.
I am also still watching this issue. In my case VTF would be of great assistance in implementing a variety of possible features, most notably dynamic terrain heightmaps and vegetation. Plus it would be great for cleaning up the ugly workarounds I had to implement for my grass billboards.
Currently, my only strategy for rendering dynamic terrain geometry involves passing new vertex positions to the GPU, which is severely limiting when I’m dealing with even a small 512x512 heightmap. Vertex Texture Fetch would unlock many new possibilities for me.
The instancing sample is at the very bottom of the post below, from when I was getting the bug, or rather the typo, out of the GL version.
I dunno if it will be sufficient, but I could push a lot of particles, more than I think I would ever actually try to use in a game; maybe it will help.
It’s a single quad with four vertices that is put on the shader; then an instance buffer is also bound to it, which is basically a list of Vector3 positions that is updated to draw the same quad at each position. The quad is only sent to the GPU once and just sits on it. You could make it a full orientation etc., I suppose, but it was for particles so I kept it small. The shader has three parts: a pixel shader, a vertex shader, and an instance shader.
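Conceptually, what the instance stage does with that data can be sketched like this (Python standing in for what the GPU does per instance; not actual shader or MonoGame code):

```python
# One quad as 4 corner offsets, uploaded once and reused per instance.
QUAD = [(-0.5, -0.5, 0.0), (0.5, -0.5, 0.0),
        (0.5, 0.5, 0.0), (-0.5, 0.5, 0.0)]

def expand_instances(instance_positions):
    """What the instance + vertex stages do conceptually: for each
    Vector3 in the instance buffer, emit the quad's four corners
    offset by that position."""
    out = []
    for ix, iy, iz in instance_positions:
        out.append([(vx + ix, vy + iy, vz + iz) for vx, vy, vz in QUAD])
    return out
```

The point is that only the short instance list changes; the quad itself never crosses the bus again.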
It looks something like the alternative I described. You have the initial positions of the particles and then you give that + current time to a function that calculates current position. Correct me if I’m wrong, but you can’t change each individual particle’s direction randomly at each frame (without iterating the vertices on the CPU) with your method.
Well, I never thought much about trying to do anything other than the above and models.
The instance list is also on the gpu and can just sit there after loading it. There is a vertex and pixel shader as well.
So technically you could use a texture or something to hold the motion data on the GPU, instead of iterating the list.
I suppose you could devise some scheme whereby you render to a target and then read/write with it, to track or calculate motion against a mouse click or something, maybe with two of them.
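With two targets, that scheme is usually called ping-ponging: read from one while writing the other, then swap. A minimal Python sketch of just the bookkeeping, with plain lists standing in for render targets (names and the update rule are illustrative):

```python
def ping_pong_step(read_buf, write_buf, update):
    """One GPGPU-style pass: each output texel is computed from the
    read buffer, never from the buffer currently being written;
    the returned pair has the roles swapped for the next frame."""
    for i in range(len(read_buf)):
        write_buf[i] = update(read_buf[i])
    return write_buf, read_buf

# Example: positions advance by a fixed step each frame.
buf_a = [0.0, 10.0]
buf_b = [0.0, 0.0]
for _ in range(3):
    buf_a, buf_b = ping_pong_step(buf_a, buf_b, lambda p: p + 1.0)
```

On the GPU the same swap happens between two render targets, and reading the result back in the vertex shader is exactly where VTF comes in.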
Though iterating an array of a million Vector3s isn’t really a big deal: if you create it all at once it will be a contiguous block, so it shouldn’t cache-miss much and will be pretty fast in a straight-up loop.
This scheme you speak of is possible, but only through VTF. VTF just means reading a texture from the vertex shader. If you have the positions in a texture, you need to set the vertices’ positions from it, and that can only be done in the vertex shader.
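In other words, each vertex maps its ID to a texel and reads its position there. Emulated on the CPU, the lookup the vertex shader would perform is just (a Python sketch; the ID-to-texel mapping shown is one common convention, not the only one):

```python
def vtf_positions(position_texture, tex_width, vertex_count):
    """Emulate the vertex shader's VTF lookup: vertex i maps its ID
    to a (u, v) texel coordinate and reads its position there."""
    out = []
    for i in range(vertex_count):
        u, v = i % tex_width, i // tex_width
        out.append(position_texture[v][u])
    return out
```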
Iterating an array of just 100,000 Vector3s, adding a random velocity to each, and drawing that many particles slows my game down to around 30 FPS. This could probably be optimized, but I’d say 60 FPS at 1 million and above is impossible, at least while running any game logic on the side and on any reasonable CPU.
I’ll admit I’m too lazy to conduct a proper scientific experiment that would hold up in court, so if you want to benchmark the performance properly be my guest.