How do I use the Protogame UI?

The UI docs say they’re under construction.

Ah right, I didn’t realise those docs don’t list the available classes.

So there are two main ways of using the UI system: you can either build the UI in code, or you can use XML-based assets to describe it. The XML-based assets have the advantage that they hot reload in debug builds, which makes it much easier to lay things out on screen.

Load the module + set up rendering

First you need to make sure that in your GameConfiguration, you’re loading the user interface module:

kernel.Load<ProtogameUserInterfaceModule>();

Also make sure you have the events module, which handles input:

kernel.Load<ProtogameEventsModule>();

Then in your render pipeline configuration (inside your game class), make sure you add a render pass for the UI canvas:

pipeline.AddFixedRenderPass(kernel.Get<ICanvasRenderPass>());
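Putting those steps together, a minimal setup might look something like this. The module loads and the render pass call are exactly as above; the surrounding class and method shapes are a sketch from memory, so check them against your version of Protogame:

```csharp
// Sketch of wiring the UI system into a game. The Load<> and
// AddFixedRenderPass calls are the required steps; the rest is illustrative.
public class MyGameConfiguration : IGameConfiguration
{
    public void ConfigureKernel(IKernel kernel)
    {
        kernel.Load<ProtogameEventsModule>();          // input handling
        kernel.Load<ProtogameUserInterfaceModule>();   // the UI system
    }

    public ICoreGame ConstructGame(IKernel kernel)
    {
        return kernel.Get<MyGame>();
    }
}

// Then, inside your game class's render pipeline configuration:
protected override void ConfigureRenderPipeline(IRenderPipeline pipeline, IKernel kernel)
{
    // ... your normal 2D/3D render passes first ...
    pipeline.AddFixedRenderPass(kernel.Get<ICanvasRenderPass>());
}
```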

XML-based Assets

If you’re using XML-based assets, you’ll want to create an entity like this:

    public class UserInterfaceEntity : ComponentizedEntity
    {
        public UserInterfaceEntity(
            UserInterfaceComponent userInterfaceComponent)
        {
            // Tell the component which UI asset to load and manage.
            userInterfaceComponent.UserInterfaceAssetName = "ui.gameplay";

            RegisterComponent(userInterfaceComponent);
        }
    }

This uses the UserInterfaceComponent to load and manage a UI described by an asset called “ui.gameplay”.

With this done, place a file at the path “.Content\assets\ui\gameplay.ui2” with the following content:

<?xml version="1.0" ?>
<ui version="2">
  <canvas>
    <container type="vertical">
      <container type="empty" height="2%" />
      <container type="horizontal" height="25%">
        <container type="empty" width="2%" />
        <container type="relative" width="*">
          <texture texture="texture.Hud.healthBarBody" fit="ratio" width="642" height="66" />
          <texture texture="texture.Hud.healthBarBody" fit="ratio" />
          <texture texture="texture.Hud.healthBarBody" fit="ratio" />
        </container>
        <container type="empty" width="20%" />
        <container type="single" width="*">
          <texture texture="texture.Hud.bossHealthBody" fit="ratio" />
        </container>
        <container type="empty" width="2%" />
        <container type="single" width="5%">
          <texture texture="texture.Hud.buttonPause" fit="ratio" />
        </container>
        <container type="empty" width="2%" />
      </container>
      <container type="empty" height="*" />
      <container type="horizontal" height="25%">
        <container type="empty" width="2%" />
        <container type="single" width="*">
          <texture texture="texture.Hud.buttonSwapLeft" fit="ratio" />
        </container>
        <container type="empty" width="2%" />
        <container type="single" width="*">
          <texture texture="texture.Hud.buttonFlip" fit="ratio" />
        </container>
        <container type="empty" width="2%" />
        <container type="single" width="33%">
          <texture texture="texture.Hud.levelBarBody" fit="ratio" />
        </container>
        <container type="empty" width="2%" />
        <container type="single" width="*">
          <texture texture="texture.Hud.buttonItems" fit="ratio" />
        </container>
        <container type="empty" width="2%" />
        <container type="single" width="*">
          <texture texture="texture.Hud.buttonSwapRight" fit="ratio" />
        </container>
        <container type="empty" width="2%" />
      </container>
      <container type="empty" height="2%" />
    </container>
  </canvas>
</ui>

As you can see, this example is built entirely from containers and textures: the empty containers act as spacers, while the percentage and “*” widths control how the remaining space is divided.

Each of these XML elements is mapped to an actual UI class by the node processors listed in the NodeProcessor folder. Each processor handles one XML element and returns the UI object that is used at runtime; for example, texture elements are handled by TextureUserInterfaceNodeProcessor. These classes are bound in dependency injection when you load the user interface module, like so:

            // binding for <canvas> elements
            kernel.Bind<IUserInterfaceNodeProcessor>()
                .To<CanvasUserInterfaceNodeProcessor>()
                .Named("canvas")
                .InSingletonScope();

So if you want to use a UI control that doesn’t yet have a node processor, you can implement IUserInterfaceNodeProcessor and bind it yourself, and then you’ll be able to use that XML element in your .ui2 files.
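For example, if you wrote a processor for a hypothetical &lt;minimap&gt; element (both the element and the class name here are made up for illustration), you’d bind it in your own game module using the same pattern:

```csharp
// Hypothetical: MinimapUserInterfaceNodeProcessor is your own
// IUserInterfaceNodeProcessor implementation for a made-up <minimap> element.
// The Named() value must match the XML element name in your .ui2 file.
kernel.Bind<IUserInterfaceNodeProcessor>()
    .To<MinimapUserInterfaceNodeProcessor>()
    .Named("minimap")
    .InSingletonScope();
```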

Code-based

The code-based approach is probably the fastest way to get started, but you lose the ability to hot reload UIs (so you have to relaunch the game every time you want to adjust the layout).

All of the UI classes are in the Control folder.

The user interface in Protogame is a tree. At the root you always have a Canvas, which can have a single child control. To lay things out, you use the various ...Container controls, such as HorizontalContainer (which lays out its children horizontally).
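As a rough sketch of building that tree in code (the exact AddChild overloads and control property names may vary between Protogame versions, so treat this as illustrative rather than authoritative):

```csharp
// Build a canvas with a horizontal row of two buttons. The size strings use
// the same "%" / "*" convention as the XML assets; check the Control classes
// for the exact AddChild signatures in your version of Protogame.
var canvas = new Canvas();

var row = new HorizontalContainer();
row.AddChild(new Button { Text = "Left" }, "50%");
row.AddChild(new Button { Text = "Right" }, "*");

canvas.SetChild(row);
```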

Once you have built up the hierarchy, you set the Canvas element onto a CanvasEntity and add that entity to the world.

Off the top of my head, this would be something like:

var canvas = new Canvas();
canvas.SetChild(...whatever the child control is ...);

// Create the canvas entity assuming you have injected IKernel and IHierarchy into the constructor
// of whatever is making the canvas.
var canvasEntity = _kernel.Get<CanvasEntity>();
canvasEntity.Canvas = canvas;

// If you are constructing the canvas entity inside your IWorld, then inject INode as worldNode and
// IHierarchy as hierarchy, and:
hierarchy.MoveNode(worldNode, hierarchy.Lookup(canvasEntity));
// If you are constructing the canvas entity as part of another entity, you'll need to wait until Update
// and then do:
hierarchy.MoveNode(hierarchy.Lookup(gameContext.World), hierarchy.Lookup(canvasEntity));

Thank you. Is there any UI designer for Protogame (think form editor, as if you were doing a WinForms app)? While by no means necessary, it would greatly improve iteration time.

Another thing: is it possible to move controls freely around the window by specifying an explicit x/y position, so as to e.g. show GUI over a certain object?

//edit: Also, how does skinning work?

//edit #2: Also sorry for being such a pain in the butt.

There isn’t… yet. We intend to have the Protogame Editor available at some point (source code is here: https://github.com/RedpointGames/Protogame.Editor, demo video here: https://www.youtube.com/watch?v=RXZv6n4wb5U). It’s not at all ready yet, but it will include a visual UI editor.

I believe RelativeContainer is what you’re looking for if you want to specify X/Y positions. Just create one of those, then use RemoveChild/AddChild to replace a child control.
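Something along these lines, assuming RelativeContainer takes a pixel-space Rectangle per child (that overload is an assumption from memory, so verify it against the RelativeContainer class in your version):

```csharp
// Position a control at an explicit x/y, e.g. over an object on screen.
// The Rectangle-based AddChild overload is an assumption; check the
// RelativeContainer source for the exact signature.
var overlay = new RelativeContainer();

var marker = new Button { Text = "!" };
overlay.AddChild(marker, new Rectangle(120, 80, 64, 32));

// To move it later, remove and re-add it at the new position:
overlay.RemoveChild(marker);
overlay.AddChild(marker, new Rectangle(200, 96, 64, 32));
```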

Skinning is a little complex, but basically what you do is this:

  • Create a class that implements ISkinRenderer<WhateverTheControlIsThatYouWantToSkin>
  • In your game module, do kernel.Rebind<ISkinRenderer<WhateverTheControlIsThatYouWantToSkin>>().To<YourImplementation>().InSingletonScope();

Then your class will get called to render the control. Take a look at the basic skin implementation for a control first, as that’ll give you a good idea of how it’s rendered by default: https://github.com/RedpointGames/Protogame/tree/master/Protogame/UserInterface/Skin.
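A rough skeleton of those steps (the ISkinRenderer&lt;T&gt; method signatures here are from memory, so check the interface in the Protogame source before relying on them):

```csharp
// Sketch of a custom skin renderer for Button. Method signatures are
// approximate; compare against ISkinRenderer<T> in the Protogame source.
public class MyButtonSkinRenderer : ISkinRenderer<Button>
{
    public void Render(IRenderContext renderContext, Rectangle layout, Button button)
    {
        // Draw your own background texture stretched to `layout`, then the
        // button's text on top of it.
    }

    public Vector2 MeasureText(IRenderContext renderContext, string text, Button button)
    {
        // Return the size the text would occupy with your chosen font.
        return Vector2.Zero;
    }
}

// Then, in your game module:
kernel.Rebind<ISkinRenderer<Button>>()
    .To<MyButtonSkinRenderer>()
    .InSingletonScope();
```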

And no worries, you’re not being a pain. Keep asking questions as you have them; that’s what this forum is for. This is the most usage of Protogame I’ve seen in a while (except for my own), so I’m glad someone else is exploring and using it.

Heh, skinning seems a bit too complex for my liking. I’d prefer it if you could just specify a 9-slice image for it, like you can in Unity, even with the old OnGUI/IMGUI system.

This is a real dealbreaker for me. Do you know of any MonoGame UI libs that play nice with Protogame?

I don’t know of any MonoGame UI libraries other than the one we ship in Protogame, let alone whether or not they support 9-slice.

9-slice is the kind of thing you’d get in a higher-level engine like UE4 or Unity; pretty much anything MonoGame-based is going to involve programming of some kind, since there are no visual editors.