For rendering, SprueKit uses MonoGame in WPF via MarcStan's WPFInterop.

SprueKit uses skeletal voxel modeling with CSG and aggregation to compile complete models. As an alternative to voxel methods, B-mesh is available.

UV charts are automatically generated. Texture maps are also generated using volumetric shapes and projectors, including brush-based and volumetric projectors via polygon strips. The program is usable as a projection painter.

Bone weights and LODs are also calculated.

Not depicted are the animation features, which are based on those of Spore and integrate animations from shapes that are merged in. It's a generalized animation system that can remap to almost anything.

Also not depicted are the permutation features of SprueKit. Most properties of any object can have permutations which can be named, weighted, or tagged. At runtime SDK users can select a variant model based on criteria such as “sharp” or “poison”, at random, based on name, etc.
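The permutation selection described above can be sketched roughly as follows. This is a minimal Python illustration, not SprueKit's actual C# SDK; the names `Permutation`, `pick_by_tag`, and `pick_weighted` are hypothetical.

```python
import random
from dataclasses import dataclass

@dataclass
class Permutation:
    value: object                  # the property value this variant supplies
    name: str = ""
    weight: float = 1.0            # relative likelihood for random selection
    tags: frozenset = frozenset()  # criteria tags, e.g. "sharp", "poison"

def pick_by_tag(perms, tag):
    """Select the permutations matching a criterion tag."""
    return [p for p in perms if tag in p.tags]

def pick_weighted(perms, rng=random):
    """Weighted random choice across the available permutations."""
    total = sum(p.weight for p in perms)
    r = rng.uniform(0.0, total)
    for p in perms:
        r -= p.weight
        if r <= 0:
            return p
    return perms[-1]
```

Selection by name would be a similar filter on `p.name`; the point is that each property carries its own variant list, and the runtime picks one by tag, name, or weighted chance.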

Texture painting volumes and strips are available for FBX and OBJ models, including defining clip-planes.

The procedural texture graph portions will be open-sourced at soft-launch and available for C# and C++. Those graphs are admittedly slow, due to Allegorithmic's patents on using hardware features as they were meant to be used.


Wow, this looks awesome! The UI looks very clean and user-friendly :slight_smile:

The UI looks very clean and user-friendly

Modern UI for WPF (MUI) helps a lot with that. Although a little awkward to get used to since it’s page based, I’ve found it to be really nice for forcing you to rethink your GUI. It has been admittedly a chore at times to rectify all of my custom controls to match.

MonoGame itself has been pretty dreamy to use.


Finally ported over my procedural texture graphing tools to C#/MonoGame. Rough-Metal PBR only and missing mesh map baking nodes at the moment.


Very nice tool!
Is the graphnode tool used to only create textures or does it also allow to create effects like the effect tool of UE4?

Is the graph-node tool only for creating textures, or does it also allow creating effects like UE4's effect tool?

It used to. The material stuff is definitely broken ATM, patching that back up is my project for this ProcJam.

Internally the graphs support upstream, downstream, and hybrid execution order. Upstream applies to anything with a fixed endpoint (texture, shader, sound graphs), downstream to behavior trees and most dialog systems, and hybrid is pretty much UE4 blueprint flowgraphs.
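The upstream case can be sketched as a pull-based evaluator: start at the fixed endpoint and recursively resolve dependencies, caching each node's result so shared subgraphs run once. A minimal Python illustration, with `Node` and `evaluate_upstream` as hypothetical names, not the internal API:

```python
class Node:
    def __init__(self, op, *inputs):
        self.op = op          # function combining the resolved input values
        self.inputs = inputs  # upstream dependency nodes

def evaluate_upstream(node, cache=None):
    """Pull-based ('upstream') evaluation from a fixed endpoint.
    Each node is computed once and memoized, so a node feeding
    several downstream consumers is never re-evaluated."""
    if cache is None:
        cache = {}
    if id(node) in cache:
        return cache[id(node)]
    args = [evaluate_upstream(n, cache) for n in node.inputs]
    result = node.op(*args)
    cache[id(node)] = result
    return result
```

Downstream (push) order would instead walk from roots toward consumers, and the hybrid case mixes a pushed execution pulse with pulled data inputs.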

Whew, finally nearly done with the grind to hit soft-launch before Thanksgiving for this splinter program SprueTex that just does procedural texture graphs. Three more documentation pages and a couple more example textures and I’m finally shippable.

Graphs, big huge graphs. With warp keys, sections, and caching choke-points.

Finally made the 3D viewport easier to use, with quick expansion to fill the screen and support for displacement maps, as well as the other (bad) workflows (Rough-Metal, Rough-Specular, Glossy-Metal, and Glossy-Specular are supported).

Still a lot to do there but it can come later. Have to deal with AO, subsurface scattering and clear-coat in the near future … running out of texture samplers :(. Would be nice to add user configurable IBL environments … but I’m not writing a damn cubemap convolver.

Writing help has been miserable (I invalidate my own help in minutes), but between the program being geared toward natural discovery to begin with and the context-sensitive help, I think I've gotten as close as I can to shipping myself with it.

MonoGame has continued to be dreamy.



This made my day!

This made my day!

That is the goal of all help, really: try to make it feel like you shipped the developer with it. I learned about the image-map approach when I was in the chemical safety industry. People hate reading manuals, but if they think they read the manual by choice then they love it.

Woot! Soft-launch: https://spruekit.itch.io/spruetex

So much more to do, but now is the perfect time for a silent soft-launch. I was freaking out yesterday as the performance was shit under heavy use and I had to overhaul a lot of things to bring it up to par.

Added some caching to the graphs and perf improved by 4x, then optimized bilinear sampling and perf improved by another 4x. At this point, despite the fragment-centered approach I took to upstream graph evaluation, I'm actually faster than the other guy(s) in most cases aside from normals.

I'm really bloody proud to say that bilinear filtering is the slowest thing in the generation of textures: bilinear sampling is 30% of all time spent in the texture graphs. I did my job so well that clamps and modulos are the performance issues.
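To show why the wrap/clamp math ends up dominating, here is what a single tiling bilinear fetch looks like, sketched in Python rather than the actual C# (`bilinear_sample` is an illustrative name). Once the per-node math is cheap, the floor, the modulos, and the four fetches are most of the work:

```python
def bilinear_sample(pixels, w, h, u, v):
    """Bilinearly sample a tiling grayscale texture stored row-major.
    pixels: flat list of floats, w*h entries; u, v in [0, 1)."""
    x = u * w - 0.5
    y = v * h - 0.5
    x0 = int(x // 1)          # floor, handles negatives correctly
    y0 = int(y // 1)
    fx, fy = x - x0, y - y0   # fractional position inside the texel quad
    x1, y1 = x0 + 1, y0 + 1
    # wrap all four coordinates for seamless tiling -- the modulo cost
    x0 %= w; x1 %= w; y0 %= h; y1 %= h
    c00 = pixels[y0 * w + x0]
    c10 = pixels[y0 * w + x1]
    c01 = pixels[y1 * w + x0]
    c11 = pixels[y1 * w + x1]
    top = c00 + (c10 - c00) * fx
    bot = c01 + (c11 - c01) * fx
    return top + (bot - top) * fy
```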

Most importantly, I'm ready to move on to plugin support, as my fragment-centric approach of evaluating each pixel of the result makes that trivial.
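The reason a fragment-centric design makes plugins easy: every node reduces to "value at (u, v)", so a plugin is just a registered per-pixel callable. A minimal Python sketch of that idea (the registry and names here are hypothetical, not SprueKit's plugin API):

```python
# A plugin node is nothing more than a named function of (u, v) -> value.
PLUGIN_NODES = {}

def register_node(name):
    """Decorator registering a per-pixel function as a graph node."""
    def wrap(fn):
        PLUGIN_NODES[name] = fn
        return fn
    return wrap

@register_node("checker")
def checker(u, v, squares=8):
    """Classic checkerboard: 0.0 or 1.0 per cell."""
    return float((int(u * squares) + int(v * squares)) % 2)

def render(name, size, **params):
    """Evaluate a registered node at every pixel center of a size x size image."""
    fn = PLUGIN_NODES[name]
    return [[fn((x + 0.5) / size, (y + 0.5) / size, **params)
             for x in range(size)] for y in range(size)]
```

Because the host only ever asks for one pixel at a time, a plugin never needs to know about image buffers, tiling, or evaluation order.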


Fun factoid: I’m unemployed and looking for work.


Look up Fiverr…

I don’t want to link to it here…

But you can make some basic income through it…

Planning to hop onto it myself in the coming months, if I can…

Ahh, thanks for reminding me. I’ve been grinding tools for so long that I forgot that the whole point of grinding on tools was to make things faster for me. Had the blinders on for too long.

Stuff like Fiverr grinding was the whole point IIRC.


Finally got software rendering hooked up and not outpaced by a slug. There's actually a surprising dearth of information on conservative rasterization in a software rasterizer, and most of the GPU material doesn't quite line up (or is a headache).
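For reference, the core trick in conservative rasterization is to mark every pixel whose square touches the triangle at all, which can be done by slackening each edge function by the pixel's half-extent projected onto the edge normal. This is a generic illustration in Python, not the rasterizer described above:

```python
def conservative_raster(tri, width, height):
    """Conservatively rasterize a 2D triangle: cover every pixel whose
    1x1 square overlaps the triangle, by relaxing each edge test by the
    maximum the edge function can change across half a pixel."""
    (x0, y0), (x1, y1), (x2, y2) = tri
    # ensure consistent winding so all edge functions share a sign
    area = (x1 - x0) * (y2 - y0) - (y1 - y0) * (x2 - x0)
    if area < 0:
        (x1, y1), (x2, y2) = (x2, y2), (x1, y1)
    edges = [((x0, y0), (x1, y1)), ((x1, y1), (x2, y2)), ((x2, y2), (x0, y0))]
    covered = set()
    for py in range(height):
        for px in range(width):
            cx, cy = px + 0.5, py + 0.5  # pixel center
            inside = True
            for (ax, ay), (bx, by) in edges:
                # signed edge function at the pixel center
                e = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
                # conservative slack: half-pixel extent along the edge normal
                slack = 0.5 * (abs(bx - ax) + abs(by - ay))
                if e < -slack:
                    inside = False
                    break
            if inside:
                covered.add((px, py))
    return covered
```

Dropping `slack` to zero turns this back into an ordinary center-sample rasterizer, which is exactly the difference most GPU-oriented write-ups gloss over.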

Rendering depth maps of arbitrary 3D meshes is pretty cool for a texture pipeline. Can use the depth maps for everything from embossing/carving to generating random wine-bottle label stickers based on meshes of actual in-game fauna.

That last one makes me realize I need to add some nodes for LUT translations, especially randomly selected gradient ramp LUTs.
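A gradient ramp LUT node boils down to baking the ramp's color stops into a table and indexing it with each grayscale value. A hedged Python sketch of that translation (`ramp_lut` and `apply_lut` are illustrative names; randomly selecting among several baked ramps would just be a choice of which table to index):

```python
def ramp_lut(stops, size=256):
    """Bake a gradient ramp into a lookup table.
    stops: sorted list of (position, (r, g, b)) pairs spanning 0..1."""
    lut = []
    for i in range(size):
        t = i / (size - 1)
        # surrounding pair of stops for this table entry
        lo = max(s for s in stops if s[0] <= t)
        hi = min(s for s in stops if s[0] >= t)
        if lo[0] == hi[0]:
            lut.append(lo[1])
        else:
            f = (t - lo[0]) / (hi[0] - lo[0])
            lut.append(tuple(a + (b - a) * f for a, b in zip(lo[1], hi[1])))
    return lut

def apply_lut(gray, lut):
    """Translate grayscale pixels (0..1) through the baked ramp."""
    n = len(lut) - 1
    return [lut[int(g * n + 0.5)] for g in gray]
```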