Weird RenderTarget resize problem

This problem seems simple, but I've spent more than a few attempts searching for a solution with no luck and kept delaying it for later. This time I'm asking for help in case someone has an idea.

So my game has a target resolution of 1024x768. When I resize the render target to anything bigger, everything looks fine, but when I resize to anything smaller I get this result: if the width or the height is smaller than the target size, the SpriteBatch crops part of the render target on screen based on the scale-down percentage. Watch the video to see the result.

I checked the actual window size, the preferred back buffer size, the back buffer size, and the bounds; they all show the right values. This is the code for resizing:
public void SetScreen (bool FullScreen, GraphicsDeviceManager g, int Width, int Height)
{
    g.IsFullScreen = FullScreen;
    g.PreferredBackBufferWidth = Width;
    g.PreferredBackBufferHeight = Height;
    GameCore.CGC.RTScale = new Vector2(
        (float)WisG.WisGGF.GetScreenWidth / (float)WisG.WisGGF.GetBaseScreenWidth,
        (float)WisG.WisGGF.GetScreenHeight / (float)WisG.WisGGF.GetBaseScreenHeight);
    MenuComponent.UpdateUniversalScale(GameCore.CGC.RTScale);
}

This is the code for the SpriteBatch:
SB.Begin(SpriteSortMode.Immediate, BlendState.AlphaBlend, SamplerState.LinearClamp, DepthStencilState.Default, RasterizerState.CullNone);
SB.Draw(WisGGF.RTSB, WisGGF.RTRF, null, Color.White, 0.0f, Vector2.Zero, WisGGF.RTScale, SpriteEffects.None, 0f);

For some reason the scale “WisGGF.RTScale”, which is (target resolution / selected resolution), is applied twice: once on the window size and again on the inner screen. The problem persists with or without full screen. It's as if the SpriteBatch is applying the scale twice, but if that were true the problem should also show when I pick a higher resolution, with the scaled image extending outside the window. It doesn't; it only happens when I pick a smaller resolution.

I'm sorry that I can't write this topic any more cleanly or make it easier to read. If you have any thoughts or have experienced such a problem, please share them with me. Thanks!

Maybe the GraphicsDeviceManager hasn’t had time to finish applying the changes when you use GetScreenWidth and GetScreenHeight. I’d try changing these to the Width and Height passed into SetScreen, in case ApplyChanges hasn’t happened or finished yet.
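A sketch of that change, reusing the SetScreen from the original post (the ApplyChanges call is an addition, and the casts assume the getters return integers):

```csharp
public void SetScreen(bool FullScreen, GraphicsDeviceManager g, int Width, int Height)
{
    g.IsFullScreen = FullScreen;
    g.PreferredBackBufferWidth = Width;
    g.PreferredBackBufferHeight = Height;
    g.ApplyChanges();

    // Derive the scale from the requested size rather than querying the
    // device, which may not have finished applying the change yet.
    GameCore.CGC.RTScale = new Vector2(
        (float)Width / WisG.WisGGF.GetBaseScreenWidth,
        (float)Height / WisG.WisGGF.GetBaseScreenHeight);
    MenuComponent.UpdateUniversalScale(GameCore.CGC.RTScale);
}
```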

From within the constructor of my Game class I tend to wire up the Window.ClientSizeChanged event, and then use that to rescale any elements that are reliant on screen size.

Something like:

Window.ClientSizeChanged += OnResize;

    public void OnResize(object sender, EventArgs e)
    {
        int cnt = GameComponentHelper.Cameras.Count;

        ICameraService currentCamera = null;

        // Rescale all the cameras.
        for (int c = 0; c < cnt; c++)
        {
            currentCamera = GameComponentHelper.Cameras[c];
            // ... reconfigure each camera's viewport here ...
        }
    }

A side note from your main question, but if at all possible I’d recommend that you apply your scaling using the matrix transform overload of the SpriteBatch. It’s the last parameter of SpriteBatch.Begin().

Matrix scaler = Matrix.CreateScale(SCALE_FACTOR);
Pass the scaler into the batcher. I’d avoid scaling anything in your code manually (in each draw method). Draw everything straight to your render target as-is, targeting your virtual resolution of 1024x768. Apply all scaling only at the very end when you’re drawing the render target (you may already be doing this; I can’t tell for sure).
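For illustration, a sketch of what that looks like with the Begin overload that takes a transform matrix (SCALE_FACTOR stands in for whatever uniform scale you computed; SB and the WisGGF names are from the original post):

```csharp
// Build the scale matrix once, whenever the resolution changes.
Matrix scaler = Matrix.CreateScale(SCALE_FACTOR, SCALE_FACTOR, 1f);

// The matrix is the last parameter of this Begin overload; every draw
// in the batch is scaled by it, so nothing else needs manual scaling.
SB.Begin(SpriteSortMode.Immediate, BlendState.AlphaBlend,
         SamplerState.LinearClamp, DepthStencilState.Default,
         RasterizerState.CullNone, null, scaler);

// Draw the render target at its unscaled 1024x768 coordinates.
SB.Draw(WisGGF.RTSB, Vector2.Zero, Color.White);
SB.End();
```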

As for your main issue, it’s hard to be certain what’s causing it, but here are a few ideas:

In your example, you aren’t calling g.ApplyChanges() after changing your resolution. This may be causing some weird errors as well, with the graphics device not updating things correctly.

Another thing I noticed is that when you switched to the 1176x664 resolution, your scaling was off: the aspect ratio wasn’t maintained. This makes sense, potentially, depending on how you’re handling resolution. The height-to-width ratio of that resolution is about 0.56, but for 1024x768 it’s 0.75. If you calculate the scaling factor for each dimension separately, you’ll get warping. That would be my guess for the weird stretching effect at about 1:15 in your video. It would also explain why the warping never occurred on any of the resolutions you tested which had the same aspect ratio as your target resolution.

The fix for that, if it’s actually a problem in your code: if the width of the screen becomes 1176, the scaling factor for your width is about 1.148, but the scaling factor for the height is about 0.86. Scaling height and width by these different factors causes the warping. In this case, both height and width should have been scaled by 0.86 to maintain the aspect ratio. Doing so causes pillarboxing, but that’s necessary. You’ll need to calculate all this dynamically and adjust your viewport as needed.
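For illustration, computing a single uniform factor from the 1176x664 example (the variable names are hypothetical):

```csharp
// Per-axis factors for a 1024x768 virtual resolution in a 1176x664 window.
float scaleX = 1176f / 1024f;   // ≈ 1.148
float scaleY = 664f / 768f;     // ≈ 0.865

// Use the smaller factor on BOTH axes so the whole image fits without
// warping; the leftover horizontal space becomes pillarbox bars.
float uniformScale = Math.Min(scaleX, scaleY);
```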

As for the last issue with the scaling applying twice…

I don’t see the code where you define the size of the render target. You mention ‘resizing the render target’. Did you mean scaling it, or actually changing the size of your render target?

If you’re resizing your window to 640x480 AND also resizing your render target to that same size, it would explain half your problem: if you change your render target to 640x480 and then scale it by the scaling factor (0.625 in this case), you’re further shrinking what will be rendered. That would probably explain the double-shrinking effect you saw at the 640x480 resolution.

As for why you don’t see that when you increase size… That part I’m not sure about, because if the above statement is true, I’d expect to see the scale increase the render size beyond the boundary at a larger resolution, as you noted. But this could still be the issue of applying changes.

Thanks all for the replies, and I apologize for my late reply too.

@AlienScribble Well, in that case, wouldn’t applying the changes after changing the resolution solve the problem?

@Charles_Humphrey Well, I’m not sure. I doubt it’s a camera issue, but I’ll give it another try when I clean up some mess.

@Rei Well, my code… how to put it: I didn’t want to post it all so it wouldn’t cause confusion, which I failed at badly, I guess, because of my poor word choices.

Because the resolution resize is tied to how the game engine deals with the 2D render target, it’s spread across a few functions, and at the end it always calls g.ApplyChanges().

public void SetScreen (bool FullScreen, GraphicsDeviceManager g)
{
    g.IsFullScreen = FullScreen;
    g.PreferredBackBufferWidth  = ScreenWidth;
    g.PreferredBackBufferHeight = ScreenHeight;
    SetScreenSize(ScreenWidth, ScreenHeight);
    EventManger.ScaleUpdate(ScreenWidth, ScreenHeight);
    System.Diagnostics.Debug.WriteLine("W= " + g.PreferredBackBufferWidth + " H= " + g.PreferredBackBufferHeight);
}

And because my old code didn’t resize the GUI, it only resized the game resolution, so the GUI stayed the same size even when the resolution changed; that’s why I didn’t use the matrix. But in the end, because I got a few bugs with GUI locations related to mouse clicks and joysticks, I decided to scale everything when setting the resolution. I did try the matrix after you suggested it, and unfortunately it gave the same buggy results.

And about the confusion between resizing and scaling… that’s the actual problem: I am only scaling once, and yet the code behaves as if I scaled AND resized whenever I select a resolution with a smaller edge than the target resolution. And the resized image is cropped too! In one of my attempts I changed the clamp type to see if it would wrap or not; no effect, it just prints a clear area at the edges.

I wonder if it’s a video card effect or something. I mean, my screen supports smaller resolutions, so why it does this I have no clue. I even measured my app’s physical window to see whether it’s the right size. What makes it even more annoying is that the same problem occurs in full screen too.

I am not saying it is a camera issue; you can just use that event to detect when the screen size changes and then reset whatever you need when it occurs. My example just shows me reconfiguring my camera viewports… :slight_smile:

It seems your code allows the user to resize the window at will by dragging the window edge with the mouse; I never thought about that. I know it’s not related, but… sigh… the thing is, my earliest code in my game engine four years ago had the map size and the physical window size in the Controller class (it represents the actual human who plays), which owns the Camera class. I removed all the related code from there long ago, but it was worth double-checking… and no, nothing is left in there that messes with the resolution XD

This is one of the weirdest issues I have faced in programming…

What does this line do?:


I thought maybe it gets the current BackBufferWidth?

Even after you use ApplyChanges(), it does not always finish updating immediately, as I’ve noticed in the past, and it will sometimes return the closest supported resolution. That should only happen for full screen (you can check PresentationParameters in that case); for windowed mode it should be OK for any setting.

This is why I’d just use the width and height instead (for example):

RTScale = new Vector2((float)Width / OriginalWidth, (float)Height / OriginalHeight);

Another approach:
If you used another render target first (at some ideal resolution), you’d draw everything onto it first, and then draw the render target (like a texture) onto the back buffer (target null) using a destination rectangle of (0, 0, newWidth, newHeight).
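A rough sketch of that approach (the render-target size, the DrawScene helper, and the field names are assumptions):

```csharp
// Draw the whole scene onto the ideal-resolution target first.
GraphicsDevice.SetRenderTarget(sceneTarget);   // e.g. a 1024x768 RenderTarget2D
GraphicsDevice.Clear(Color.Black);
DrawScene();                                   // all game/GUI draw calls, unscaled

// Then draw the target onto the back buffer; the destination rectangle
// does the scaling instead of a Vector2 scale parameter on Draw.
GraphicsDevice.SetRenderTarget(null);
SB.Begin();
SB.Draw(sceneTarget,
        new Rectangle(0, 0, newWidth, newHeight),   // destination rectangle
        Color.White);
SB.End();
```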

Keep in mind too that if you do this, you’ll need to convert your mouse coordinates:

public Input(PresentationParameters pp, RenderTarget2D target)
{
    screenScaleX = 1.0f / ((float)pp.BackBufferWidth  / (float)target.Width);
    screenScaleY = 1.0f / ((float)pp.BackBufferHeight / (float)target.Height);
}

public void Update()
{
    mosV = new Vector2(ms.Position.X * screenScaleX, ms.Position.Y * screenScaleY);
}

Yes, yes, I am doing that: I have a base resolution in the game and a current resolution value. If they are the same, the scale becomes 1, and I adjust a variable for the mouse location.

GameCore.CGC.RTScale = new Vector2(
    (float)WisG.WisGGF.GetScreenWidth / (float)WisG.WisGGF.GetBaseScreenWidth,
    (float)WisG.WisGGF.GetScreenHeight / (float)WisG.WisGGF.GetBaseScreenHeight);
MenuComponent.UpdateUniversalScale(GameCore.CGC.RTScale);

The UpdateUniversalScale function takes the result and tells the GUI system (the MenuComponent class) to adjust to it.

And about WisGGF: WisG is the game project name and GF is the game file, a class that holds all the game metadata. I store two integers for the screen width and height in the game file and assign them every time I change the screen size, so the whole project can see these numbers even without access to the GraphicsDevice. As you can see in the SetScreen function in my last reply, I use those variables to set the preferred back buffer size, and I test the final result by printing the window bounds after changing the resolution; the numbers are correct. Something fishy is going on, and the trouble is I can’t tell whether it’s my code or not. I might need to make a new project, remove all the shaders and GUI and everything, just draw a base image, and see whether a smaller resolution causes the same behavior <_<;

Hmm, well, it’s virtually impossible to be sure what the issue is, at least from my point of view. Your project already has many moving parts, any of which could be the problem. Back when I was fiddling with different rendering solutions, I too ran into a lot of weird, buggy behavior. About all I can offer at this point is to describe how I’ve been handling resolution in my own project; perhaps you can glean something useful from that. So far I’ve had zero problems since I started doing things this way:

I have a fairly simple ‘Resolution’ class. It really only does two things. I set the desired target resolution for my game (the virtual resolution) on the Resolution class (it’s a persistent service class), in my case 1280x720 at the moment. It then has a method for applying a new window resolution.

When the resolution is applied, it goes through a similar process to your ‘SetScreen’ method. First, check that the new desired width/height don’t exceed the actual dimensions of the monitor; you can do this with GraphicsAdapter.DefaultAdapter.CurrentDisplayMode.Width/Height.

As long as the new desired size fits, the graphics manager sets PreferredBackBufferWidth/Height to the new dimensions and applies the changes. If setting full screen, check whether the new full-screen resolution is supported via GraphicsAdapter.DefaultAdapter.SupportedDisplayModes: iterate through the modes and check whether the new desired width/height match one of them. Set the back buffers as before and apply changes.

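That full-screen check might be sketched like this (the variable names are illustrative; only the GraphicsAdapter calls come from the description above):

```csharp
// Only switch to full screen if the adapter actually supports the mode.
bool supported = false;
foreach (DisplayMode mode in GraphicsAdapter.DefaultAdapter.SupportedDisplayModes)
{
    if (mode.Width == desiredWidth && mode.Height == desiredHeight)
    {
        supported = true;
        break;
    }
}

if (supported)
{
    graphicsManager.PreferredBackBufferWidth = desiredWidth;
    graphicsManager.PreferredBackBufferHeight = desiredHeight;
    graphicsManager.IsFullScreen = true;
    graphicsManager.ApplyChanges();
}
```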
Next the viewport is set. This is where letterboxing and pillarboxing are calculated, if needed. I’ll just post the code directly:

        // The preferredWidth is the desired user window size. Dividing the width of the user window
        // by the aspect ratio we want for our game screen gives us the required height to maintain
        // that aspect ratio. This is needed to ensure that the game screen isn't flattened if the
        // user's window is shorter than needed.

        int width = _preferredWidth;
        int height = (int)(width / GetVirtualAspectRatio() + 0.5F);

        if (height > _preferredHeight)
        {
            // If true, this means that the aspect-ratio-determined height of the game world screen
            // is taller than the user's actual window size for the game. The height of
            // the user window becomes the limiting factor for the game screen, and pillarboxing
            // will allow us to maintain the correct aspect ratio for the game screen.

            height = _preferredHeight;
            width = (int)(height * GetVirtualAspectRatio() + 0.5F);
        }

        Viewport view = new Viewport();

        view.X = (_preferredWidth / 2) - (width / 2);
        view.Y = (_preferredHeight / 2) - (height / 2);
        view.Width = width;
        view.Height = height;
        view.MinDepth = 0;
        view.MaxDepth = 1;

        graphicsManager.GraphicsDevice.Viewport = view;

In short, this will add colored bars (in whatever color your GraphicsDevice.Clear() fill is) to the top and bottom, or the left and right, of the screen to fill whatever space is unused due to maintaining the aspect ratio. This is what prevents the stretch/warp of using window sizes with aspect ratios different from your target resolution.

The next thing to do is set some state data on the Resolution class which can be used by other classes. Just basic data about the current state of the resolution. The 2 important ones I’ll list here:

        GameScreenScale = new Vector2(graphicsManager.GraphicsDevice.Viewport.Width / VirtualResolution.X,
                                      graphicsManager.GraphicsDevice.Viewport.Height / VirtualResolution.Y);

        scalingFactor = new Vector3(GameScreenScale.X, GameScreenScale.Y, 1);

        RenderingScale = Matrix.CreateScale(scalingFactor);

GameScreenScale is similar to your scale. However, note that it’s based on the viewport and the virtual resolution, not the back buffer; this accounts for any letter/pillar boxing from our setup. RenderingScale is a matrix, like the one I mentioned before, built from the game screen scale we calculated. That’s about it.

Moving on… The last thing my Resolution class does is it fires off an ‘OnResolutionChanged’ event which other classes can subscribe to if needed, so they can deal with the new changes.

After that, it’s fairly simple.

Each draw phase I create a new render target, which is the same size as my target resolution. The render target does not get resized when the window changes size. So if you shrink your window to 200x100, the render target is still 1280x720. The only things that actually change when the window is resized are the viewport and my state data (such as the current scale, and the scaling matrix used by my final batcher). AFAIK, there is no need to set your render target to anything but your intended target resolution. Maybe it works differently in 3D games? I haven’t messed with 3D stuff yet. Anyway…

So my 1280x720 render target gets drawn onto by all the various draw calls from my GUI and whatnot. Also important to note: you generally do not scale your GUI and other drawables here. That happens later. Right now you want to draw everything relative to your virtual resolution, which is the same size as the render target. So if you have a GUI window you want at x:200, y:200, with a width/height of 500x500 relative to your target resolution, then you draw that rectangle at exactly that position. No scaling or anything like that.

After all your batches End() and everything is drawn onto the render target, you finally draw the actual render target as normal. The last piece of the puzzle is that it’s at this point, and only this point, that you apply scaling. So in this last batch, include that Matrix scaler from earlier, from the Resolution class. This will scale up your render target, and by extension everything you drew onto it, in one fell swoop at the very end.
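That last step might look roughly like this (a sketch; spriteBatch, renderTarget, and resolution are illustrative field names, with RenderingScale coming from the Resolution class described above):

```csharp
// Everything has already been drawn onto the virtual-resolution target.
graphicsManager.GraphicsDevice.SetRenderTarget(null);
graphicsManager.GraphicsDevice.Clear(Color.Black);   // letterbox fill color

// Apply the scaling matrix once, on the final batch only.
spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.Opaque,
                  SamplerState.LinearClamp, DepthStencilState.None,
                  RasterizerState.CullNone, null,
                  resolution.RenderingScale);
spriteBatch.Draw(renderTarget, Vector2.Zero, Color.White);
spriteBatch.End();
```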

So if my window is 640x480 (ratio of 0.75), then my viewport has been automatically resized to 640x360 (ratio of 0.5625) and centered on the screen, and letterbox bars fill the rest. Everything gets drawn onto a 1280x720 render target, and at the very end, when I draw the target, I scale it. The scale is based on the viewport and virtual resolution, which works out to 0.5 in both dimensions. Scaling my 1280x720 render target by half gets me exactly 640x360, which fits snugly into the viewport, along with everything that was drawn onto the render target.

This is how my project handles resolution; so far, it’s worked perfectly for me. All issues with aspect ratio and scaling are handled easily and without fuss: all your game code draws everything as though the resolution is 1280x720 (or whatever you set), without having to be concerned about the user’s window size, monitor size, aspect ratio, or anything. All the minutiae are handled by the Resolution class, and as long as you keep it updated whenever a window change happens, it just works.

The only thing you have to account for now is scaling the mouse position, which you already know how to do.

Anyway… best of luck.

Hey, these are always tricky to solve.

I noticed that your app appears truncated at the bottom after the resize. There could be a few things going on, so here are some ideas.

  1. It may be that you didn’t get the requested size at all. Confirm you did by drawing a pixel directly into that region without using your pipeline or camera. If you see the pixel, you know the allocation worked and you can focus on your transform code as the source of the problem.

  2. If you cannot see the test pixel, there could be a bug in the resize code of the graphics library (SharpDX or whatever). There have been such bugs in the past.

  3. Set up a simple test app for your scenario and strip it to the basics to see if you can resize.

  4. The methods for getting buffer sizes are notorious for not always returning what you expect.

  5. The viewport could be different than the buffer. Check it or set it manually. It’s a property on graphics device. Viewports can crop so I would look there.

  6. Compile monogame in debug and trace the resize. See if any conditions are different between the working one and this.

  7. SpriteBatch and transforms are a little weird. There is a lot of state going on, and the render order is interesting. Is it possible something is drawing that fills in that space so you can’t draw there? Do you manually set the SpriteBatch transform? Try disabling the depth test, stencil test, and alpha blending and see what happens.

Just some thoughts.