SOLVED - Problems changing resolution with render target

Hi, I am trying to work out how to change resolutions during a game. I draw to a render target, which I then stretch to whatever screen resolution I choose. However, if I change resolution during gameplay I get problems: the resolution changes, but the render target is drawn away from where it should be, leaving a big border at the top and left of the screen.

I run an InitializeViewport() function (below) during startup, and call it again when I have chosen a new resolution. Every resolution works fine if I start the game with it; it's swapping on the fly that goes wrong, and I can't spot the problem. Can anyone help?

public void InitializeViewport()
{
    IsFixedTimeStep = GameInfo.info.fixedTimeStep;

    // Set screen resolution and initial VSync.
    graphics.PreferredBackBufferWidth = GameInfo.info.resolutionWidth;
    graphics.PreferredBackBufferHeight = GameInfo.info.resolutionHeight;
    graphics.SynchronizeWithVerticalRetrace = GameInfo.info.vSync;

    // Create render target for the implementation of viewports.
    renderTarget = new RenderTarget2D(GraphicsDevice,
        GameInfo.info.gameplayWindowWidth, GameInfo.info.gameplayWindowHeight,
        false, GraphicsDevice.PresentationParameters.BackBufferFormat, DepthFormat.Depth24);

    // Define default viewport (always the whole of the game window).
    viewport0 = GraphicsDevice.Viewport;
    viewport0.Width = GameInfo.info.gameplayWindowWidth;
    viewport0.Height = GameInfo.info.gameplayWindowHeight;

    // Define viewport 1.
    viewport1 = viewport0;
    viewport1.X = GameInfo.info.gameplayWindowX;
    viewport1.Y = GameInfo.info.gameplayWindowY;
    viewport1.Width = GameInfo.info.gameplayWindowWidth;
    viewport1.Height = GameInfo.info.gameplayWindowHeight;

    // Set full-screen mode.
    graphics.HardwareModeSwitch = GameInfo.info.hardwareModeSwitch;
    graphics.IsFullScreen = GameInfo.info.fullScreen;

    // Centre the window on the desktop when not in full screen.
    if (!graphics.IsFullScreen)
        Window.Position = new Point(
            (GraphicsAdapter.DefaultAdapter.CurrentDisplayMode.Width / 2) - (graphics.PreferredBackBufferWidth / 2),
            (GraphicsAdapter.DefaultAdapter.CurrentDisplayMode.Height / 2) - (graphics.PreferredBackBufferHeight / 2));

    // Set frame rate.
    SetFrameRate(GameInfo.info.frameRate);

    graphics.ApplyChanges();
}
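
For reference, the render target is then stretched over the whole back buffer in my Draw() method. Simplified, it looks something like this (a sketch; the clear colours are placeholders, and the real code also uses the viewports and the camera matrix shown further down):

    // Render the game into the fixed-size render target, then stretch
    // that target over the whole back buffer at the current resolution.
    GraphicsDevice.SetRenderTarget(renderTarget);
    GraphicsDevice.Clear(Color.Gray);
    // ... draw the game world here ...
    GraphicsDevice.SetRenderTarget(null);
    GraphicsDevice.Clear(Color.Black);

    spriteBatch.Begin();
    spriteBatch.Draw(renderTarget,
        new Rectangle(0, 0, graphics.PreferredBackBufferWidth, graphics.PreferredBackBufferHeight),
        Color.White);
    spriteBatch.End();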

Why do you do this:

    viewport1.X = GameInfo.info.gameplayWindowX;
    viewport1.Y = GameInfo.info.gameplayWindowY;

Is this your way of implementing a camera offset?
Anyway, I would try clearing the render target to an opaque colour that is different from your default back-buffer clear colour, to see whether the border is inside the render target (i.e. your rendering isn't covering the whole render target as it should) or whether the render target itself is being drawn at the wrong position on screen.
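
Something like this as a quick check (a sketch; the colours and member names are just examples):

    // Diagnostic sketch: use two clear colours you can tell apart, so
    // you can see whether the border comes from inside the render
    // target or from the back buffer around it.
    GraphicsDevice.SetRenderTarget(renderTarget);
    GraphicsDevice.Clear(Color.Magenta);            // inside the render target
    // ... draw the game ...
    GraphicsDevice.SetRenderTarget(null);
    GraphicsDevice.Clear(Color.CornflowerBlue);     // everything outside it
    // ... draw the render target to the screen ...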

Another thing: I'm pretty sure you need to call Dispose() on the render target manually before you replace it with a new one, to prevent a resource leak. IIRC it doesn't dispose itself. Check it, though; I'm not 100% sure and can't verify right now.

Thanks very much for the response. Firstly, I have done as you suggested and now call Dispose() at the top of the InitializeViewport() function (if the render target isn't null), so hopefully that will avoid any further problems.
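
In other words, the top of the function now starts with something like:

    // Release the previous render target before a new one is created
    // further down in InitializeViewport().
    if (renderTarget != null)
        renderTarget.Dispose();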

Secondly, I define the position of viewport1 like that for no other reason than to have it defined somewhere. The location is only ever set to (0, 0), so it can't be causing the problem.

Now, I already do what you have suggested with the clear colours: I clear my viewport0 with a grey colour and my viewport1 with a blue colour, and the border I get when changing resolution is blue. I also notice that my game graphics aren't just in the wrong place; they're scaled up too. All of this leads me to believe that the problem might be with the matrix I'm using for my camera.

In my main Draw() method I use this:

spriteBatch.Begin(SpriteSortMode.Deferred, null, SamplerState.LinearClamp, null, null, null, Camera2D.camera1.GetTransformation(graphics));

which calls the GetTransformation() method in my Camera class:

public Matrix GetTransformation(GraphicsDeviceManager graphics)
{
    // Shift the world so the camera position sits at the origin.
    Vector3 newVector = new Vector3(-position.X, -position.Y, 0);

    // Rotate and zoom about that origin, then translate so the camera
    // ends up centred on the screen (half the current resolution).
    cameraTransformMatrix = Matrix.CreateTranslation(newVector) *
                            Matrix.CreateRotationZ(rotation) *
                            Matrix.CreateScale(new Vector3(zoom, zoom, 1)) *
                            Matrix.CreateTranslation(new Vector3(GameInfo.info.resolutionWidth * 0.5f, GameInfo.info.resolutionHeight * 0.5f, 0));

    return cameraTransformMatrix;
}

I must admit I haven't really got my head around how the matrix works, but perhaps I'm doing something wrong here? It's one I cobbled together for my previous game, based on something I found online ages ago.

I think I've fixed it! My camera position is based on the game's resolution, and I was only updating it at startup. If I update it every frame, the issue is resolved! :slight_smile:
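
For anyone who finds this later, the change boils down to something like the following (a sketch; ComputeCameraPosition is a hypothetical stand-in for however your camera position is derived from the resolution):

    // Recompute the resolution-dependent camera position every frame,
    // instead of only once at startup, so a resolution change is
    // picked up immediately. ComputeCameraPosition is hypothetical.
    protected override void Update(GameTime gameTime)
    {
        Camera2D.camera1.position = ComputeCameraPosition(
            GameInfo.info.resolutionWidth, GameInfo.info.resolutionHeight);

        base.Update(gameTime);
    }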
