Mac Retina Displays & Crispy Pixels

These are my findings for achieving maximum crispiness on a Retina display. This is the only way I could find to get around the default filtering that macOS applies to non-Retina-aware applications.

The main issue here is that the CurrentDisplayMode resolution is halved when using macOS's default Retina scaled resolutions. So if the true resolution of your monitor is 2880x1800, you get 1440x900 from CurrentDisplayMode. For Windows users, I believe this is the equivalent of the UI scaling option in Settings. This is fine (honestly, it's fine)… UNTIL you notice that the pixels are not at the maximum level of crispness this display could give you.
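
For example, on a 2880x1800 panel at the default scaled resolution, a quick check like this shows the halved report (a minimal sketch; Debug.WriteLine is just one way to see the values):

// Somewhere after the GraphicsDevice exists, e.g. in Initialize():
var mode = GraphicsDevice.Adapter.CurrentDisplayMode;
System.Diagnostics.Debug.WriteLine($"Reported: {mode.Width}x{mode.Height}");
// On a 2880x1800 Retina panel at the default scaled resolution,
// this prints "Reported: 1440x900", not the panel's native size.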

To be specific, what I'm doing is for the situation where you want your game resolution to match the user's monitor resolution (or, on very high-resolution displays, a reasonable fraction of it). For very low-res pixel art games this isn't such a big deal, but if you are doing higher-resolution 2D pixel art where you don't want anti-aliasing and just want pure integer scaling, this might be of interest.

My solution only works with HardwareModeSwitch enabled; without it, macOS takes over the fullscreening. It might work without HardwareModeSwitch if the application could report to macOS that it is Retina-compatible, but I haven't gone as far as packaging a .app to test this.

When not using HardwareModeSwitch, for maximum crispness the user would have to manually change their display resolution with a third-party app (like QuickRes), so that isn't a real solution…

What this does is use the CurrentDisplayMode width and height for all game rendering, except for the back buffer resolution, which is set to double that. This results in ideal rendering when the user is on the "Best for Retina" setting, and it looks good at the other scaled resolutions as well.

If the back buffer resolution isn't doubled, the game is scaled up by the operating system using a filter rather than by simply doubling the pixels (linear vs. nearest-neighbor). On Windows this isn't an issue, as the OS scaling appears to be nearest-neighbor…

The remaining issue, of course, is that I don't have a way of detecting whether the user is on a Retina display. (None of this would be a problem if I could change the scale filtering that macOS uses for non-Retina-aware applications.)
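
One possible heuristic, and this is purely an assumption on my part that I haven't verified on real hardware: check whether the adapter also exposes a display mode at exactly double the current one, which would suggest the reported resolution is a scaled Retina one.

using System.Linq;
using Microsoft.Xna.Framework.Graphics;

// Unverified heuristic: a Retina display running a scaled resolution might
// still list its native (double-size) mode among the supported modes.
static bool LooksLikeRetina(GraphicsAdapter adapter) {
    var current = adapter.CurrentDisplayMode;
    return adapter.SupportedDisplayModes.Any(
        m => m.Width == current.Width * 2 && m.Height == current.Height * 2);
}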

(I'm using MonoGame.Extended for the BoxingViewportAdapter; there's definitely a way to do it without it, see the sketch after the code below.)


GraphicsDeviceManager graphics;   // assumed created in the Game constructor
SpriteBatch spriteBatch;          // assumed created in LoadContent
OrthographicCamera gameplayCamera;

public int windowWidth;
public int windowHeight;

protected override void Initialize() {
    graphics.HardwareModeSwitch = true; // if disabled, the double-size back buffer is scaled back down, resulting in a blurrier image
    Window.AllowUserResizing = false;

    // CurrentDisplayMode reports the scaled (halved) resolution on a Retina Mac
    windowWidth = GraphicsDevice.Adapter.CurrentDisplayMode.Width;
    windowHeight = GraphicsDevice.Adapter.CurrentDisplayMode.Height;

    graphics.IsFullScreen = true;
    graphics.PreferredBackBufferWidth = windowWidth * 2;  // double it to get the true display resolution
    graphics.PreferredBackBufferHeight = windowHeight * 2;
    graphics.ApplyChanges(); // the preferred back buffer size doesn't take effect until this is called

    var viewportAdapter = new BoxingViewportAdapter(Window, GraphicsDevice, windowWidth, windowHeight); // use the non-doubled resolution
    gameplayCamera = new OrthographicCamera(viewportAdapter);

    base.Initialize();
}

protected override void Draw(GameTime gameTime) {
    spriteBatch.Begin(
        transformMatrix: gameplayCamera.GetViewMatrix(),
        samplerState: SamplerState.PointClamp); // PointClamp for crispness
    ...
    spriteBatch.End();
}
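
If you'd rather skip MonoGame.Extended, here's a rough sketch of the same boxing idea done by hand (my own sketch, not the library's code): compute an integer scale from the back buffer down to the virtual resolution, center a letterboxed viewport, and pass a plain scale matrix to SpriteBatch instead of the camera's view matrix.

// Virtual resolution is windowWidth x windowHeight; the back buffer is double that.
int scale = System.Math.Min(
    graphics.PreferredBackBufferWidth / windowWidth,
    graphics.PreferredBackBufferHeight / windowHeight); // integer scaling only

// Center the scaled image, letterboxing any leftover space.
int destWidth = windowWidth * scale;
int destHeight = windowHeight * scale;
GraphicsDevice.Viewport = new Viewport(
    (graphics.PreferredBackBufferWidth - destWidth) / 2,
    (graphics.PreferredBackBufferHeight - destHeight) / 2,
    destWidth, destHeight);

spriteBatch.Begin(
    transformMatrix: Matrix.CreateScale(scale, scale, 1f),
    samplerState: SamplerState.PointClamp);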
