Android Screen Size from WindowManager.DefaultDisplay.GetRealSize depends on package name?

Hello all, first post here I hope someone can help me with!

On Android I’m getting a different screen size depending on the package name I give my app.

For example, in Activity1.OnCreate (the effective entry point to my game) I mimic what MonoGame does to get the current screen size:

Android.Graphics.Point wrongSize = new Android.Graphics.Point();
this.WindowManager.DefaultDisplay.GetRealSize(wrongSize);

It seems there is at least one other way to get this info, like this:

Display.Mode mode = WindowManager.DefaultDisplay.GetMode();
Point correctSize = new Point(mode.PhysicalWidth, mode.PhysicalHeight);

In this example, wrongSize isn’t always wrong; it only disagrees when my app’s package name has one particular value. If I change that name even by one letter, wrongSize suddenly agrees with correctSize, as if by magic.
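For anyone wanting to reproduce the comparison, here is a minimal sketch of logging both values side by side from OnCreate (assuming a Xamarin.Android activity; depending on the binding version, GetMode() may instead be exposed as a Mode property, and Display.Mode needs API 23+):

// Sketch: log both measurements at startup to compare them (Xamarin.Android).
protected override void OnCreate(Android.OS.Bundle bundle)
{
    base.OnCreate(bundle);

    // What MonoGame uses internally:
    var realSize = new Android.Graphics.Point();
    WindowManager.DefaultDisplay.GetRealSize(realSize);

    // The physical panel size from the display mode:
    var mode = WindowManager.DefaultDisplay.GetMode();

    Android.Util.Log.Debug("SizeCheck",
        string.Format("GetRealSize: {0}x{1}, Mode physical: {2}x{3}",
            realSize.X, realSize.Y, mode.PhysicalWidth, mode.PhysicalHeight));
}

On an affected device/package name, the two lines of output should disagree in exactly the way described above.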

This matters because MonoGame sets up the game window, view and graphics device based on the wrongSize version. Here is the relevant constructor decompiled with ILSpy:

public AndroidGameWindow(AndroidGameActivity activity, Game game)
{
    _game = game;
    Point size = default(Point);
    if (Build.VERSION.SdkInt < BuildVersionCodes.JellyBean)
    {
        size.X = activity.Resources.DisplayMetrics.WidthPixels;
        size.Y = activity.Resources.DisplayMetrics.HeightPixels;
    }
    else
    {
        Android.Graphics.Point point = new Android.Graphics.Point();
        activity.WindowManager.DefaultDisplay.GetRealSize(point);
        size.X = point.X;
        size.Y = point.Y;
    }
    Initialize(activity, size);
    game.Services.AddService(typeof(View), GameView);
}

My guess is that the Android phone remembers something about an app even after it’s uninstalled (cache cleared, data deleted), so simply reusing a certain package name triggers this weird behaviour.

Can anybody guide me towards solving this strange issue please?

This is still driving me nuts.

The bold sections below are from the debug output and show where the native resolution is changed from 1440 x 2960 to 1080 x 2220 (for no reason other than the Package Name, apparently).

Relayout returned: old=(0,0,1440,2960) new=(0,0,1080,2220) req=(1440,2960)0 dur=10 res=0x7 s={true 3168907264} ch=true

Why does it get changed to a lower resolution at all?

I’ve tested this on 2 different Samsung phones now with a similar result (S7 changes down to 1080x1920, S9 changes down to 1080x2220). Can no one help please?

Complete Extract output from problem package on a Samsung S9:

09-18 05:04:36.361 D/SurfaceView(27784): onWindowVisibilityChanged(0) true md5f54719fab2b5008f890ca4d350c867c1.MonoGameAndroidGameView{e293af5 VFE...... .F....I. 0,0-0,0} of ViewRootImpl@b4ad3ad[Activity1]
09-18 05:04:36.374 D/ViewRootImpl@b4ad3ad[Activity1](27784): Relayout returned: old=(0,0,1440,2960) new=(0,0,1080,2220) req=(1440,2960)0 dur=10 res=0x7 s={true 3168907264} ch=true
09-18 05:04:36.375 D/OpenGLRenderer(27784): createReliableSurface : 0xbcc80fc0(0xbce1b000)
09-18 05:04:36.383 D/SurfaceView(27784): surfaceCreated 1 #8 md5f54719fab2b5008f890ca4d350c867c1.MonoGameAndroidGameView{e293af5 VFE...... .F....ID 0,0-1080,2076}
09-18 05:04:36.384 D/SurfaceView(27784): surfaceChanged (1080,2076) 1 #8 md5f54719fab2b5008f890ca4d350c867c1.MonoGameAndroidGameView{e293af5 VFE...... .F....ID 0,0-1080,2076}
09-18 05:04:36.385 D/OpenGLRenderer(27784): makeCurrent EglSurface : 0x0 -> 0x0
09-18 05:04:36.394 I/mali_winsys(27784): new_window_surface() [1080x2220] return: 0x3000
09-18 05:04:36.398 D/OpenGLRenderer(27784): makeCurrent EglSurface : 0x0 -> 0xebd9dfc0
09-18 05:04:36.404 W/Gralloc3(27784): mapper 3.x is not supported
09-18 05:04:36.406 I/gralloc (27784): Arm Module v1.0
09-18 05:04:36.445 D/Mono (27784): DllImport searching in: '__Internal' ('(null)').
09-18 05:04:36.445 D/Mono (27784): Searching for 'java_interop_jnienv_get_static_object_field'.
09-18 05:04:36.445 D/Mono (27784): Probing 'java_interop_jnienv_get_static_object_field'.
09-18 05:04:36.445 D/Mono (27784): Found as 'java_interop_jnienv_get_static_object_field'.
09-18 05:04:36.452 D/Mono (27784): Assembly Ref addref MonoGame.Framework[0xebd9d120] -> System.Core[0xebd9d5a0]: 4
09-18 05:04:36.476 D/ViewRootImpl@b4ad3ad[Activity1](27784): **Relayout returned: old=(0,0,1080,2220) new=(0,0,1080,2220) req=(1080,2220)0 dur=9 res=0x1 s={true 3168907264} ch=false**
09-18 05:04:36.479 D/SurfaceView(27784): surfaceChanged (1080,2220) 1 #5 md5f54719fab2b5008f890ca4d350c867c1.MonoGameAndroidGameView{e293af5 VFE...... .F....ID 0,0-1080,2220}
09-18 05:04:36.482 D/ViewRootImpl@b4ad3ad[Activity1](27784): MSG_WINDOW_FOCUS_CHANGED 1 1

This is the same data from the exact same code just with one letter of the package name changed. The resolution is not changed in this case:

Relayout returned: old=(0,0,1440,2960) new=(0,0,1440,2960) req=(1440,2960)0 dur=7 res=0x7 s={true 3131969536} ch=true

09-18 05:06:04.708 D/SurfaceView(29195): onWindowVisibilityChanged(0) true md5f54719fab2b5008f890ca4d350c867c1.MonoGameAndroidGameView{e293af5 VFE...... .F....I. 0,0-0,0} of ViewRootImpl@b4ad3ad[Activity1]
09-18 05:06:04.717 D/ViewRootImpl@b4ad3ad[Activity1](29195): Relayout returned: old=(0,0,1440,2960) new=(0,0,1440,2960) req=(1440,2960)0 dur=7 res=0x7 s={true 3131969536} ch=true
09-18 05:06:04.718 D/OpenGLRenderer(29195): createReliableSurface : 0xbcd45140(0xbaae1000)
09-18 05:06:04.728 D/SurfaceView(29195): surfaceCreated 1 #8 md5f54719fab2b5008f890ca4d350c867c1.MonoGameAndroidGameView{e293af5 VFE...... .F....ID 0,0-1440,2768}
09-18 05:06:04.729 D/SurfaceView(29195): surfaceChanged (1440,2768) 1 #8 md5f54719fab2b5008f890ca4d350c867c1.MonoGameAndroidGameView{e293af5 VFE...... .F....ID 0,0-1440,2768}
09-18 05:06:04.729 D/OpenGLRenderer(29195): makeCurrent EglSurface : 0x0 -> 0x0
09-18 05:06:04.741 I/mali_winsys(29195): new_window_surface() [1440x2960] return: 0x3000
09-18 05:06:04.743 D/OpenGLRenderer(29195): makeCurrent EglSurface : 0x0 -> 0xebd9eaa0
09-18 05:06:04.753 W/Gralloc3(29195): mapper 3.x is not supported
09-18 05:06:04.756 I/gralloc (29195): Arm Module v1.0
09-18 05:06:04.790 D/ViewRootImpl@b4ad3ad[Activity1](29195): Relayout returned: old=(0,0,1440,2960) new=(0,0,1440,2960) req=(1440,2960)0 dur=10 res=0x1 s={true 3131969536} ch=false
09-18 05:06:04.792 D/SurfaceView(29195): surfaceChanged (1440,2960) 1 #5 md5f54719fab2b5008f890ca4d350c867c1.MonoGameAndroidGameView{e293af5 VFE...... .F....ID 0,0-1440,2960}
09-18 05:06:04.794 D/ViewRootImpl@b4ad3ad[Activity1](29195): MSG_WINDOW_FOCUS_CHANGED 1 1

One crazy idea I’ve had: this app is on the Google Play Store, and the Play Console’s “Device Catalog” has stats about every single phone out there. For the S9+ it shows a “normal” resolution of 1080x2220.

So is this some kind of Google lookup saying, “Ah, this package name is in the store and the store says this device has a resolution of X, therefore I will set the resolution to X, and not the Y it is ACTUALLY running at”?

From what I know, MonoGame no longer fires the resolution-changed event since v3.6.

Whenever the app changes orientation, goes fullscreen, or enables/disables the notch, you don’t get that info.
Add to that the hack to ‘fix’ the preferred size, which introduces floating-point errors and adjusts the resolution to strange values.
Could the name add some pixels to the width/height of the window on startup?

You just have to build from source and fix the issues yourself. There is no workaround you can do to fix that from your game code.

Hello, thanks for the reply!

I’m using MonoGame version 3.7.1.189 currently.

I was very careful to test both adding one letter to the end of the package name and swapping out one letter within the name (same string length), but the result was always the same: anything with that particular package name (which, as I say, is on the Google Play Store, so may be treated differently?) produced the resolution change. It simply can’t be a string-length thing.

My Android game code runs fullscreen and just sets the backbuffer to whatever size the device happens to be running at (from the adapter info). The problem is that my game cannot take advantage of the real resolution of the device; it is being reduced outside my control by a mystery mechanism, triggered, it seems, simply by the name of the package (of all things!).
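For context, the fullscreen setup described above amounts to something like the following sketch in the Game constructor (assuming the usual _graphics GraphicsDeviceManager field; CurrentDisplayMode is exactly the value that arrives already reduced):

// Sketch of the fullscreen setup described above (MonoGame).
// GraphicsAdapter.DefaultAdapter.CurrentDisplayMode reflects whatever
// resolution the system hands the app, i.e. the reduced value on
// affected devices.
var mode = GraphicsAdapter.DefaultAdapter.CurrentDisplayMode;
_graphics.PreferredBackBufferWidth = mode.Width;
_graphics.PreferredBackBufferHeight = mode.Height;
_graphics.IsFullScreen = true;
_graphics.ApplyChanges();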

I even tried a whole new project, as soon as I used that package name I would get the resolution drop.

I don’t think MonoGame is doing anything wrong (I assume D/ViewRootImpl@b4ad3ad is from the Android framework itself, not MonoGame?); it’s just working with the resolution it’s being given.

The only explanation I can come up with is that this is Google’s doing, that they don’t think these devices run at the resolutions they do so they make sure it doesn’t and downgrade it to their approved value on app startup.

Well this is nothing if not fascinating.

To test my theory I used another game of mine, also on the Google Play Store, but this one built with Unity using their IL2CPP compiler, which I believe compiles the C# (via generated C++) to native code without needing Mono at all.

Can you guess what happens with the screen resolution?

Testing on my Samsung S7 using the package name of the game as exists in the Google Play Store:

Oh dear, it’s 1080 x 1920

Changing the package name, even just by one letter and run the exact same test again:

Yay, its the proper 1440 x 2560

So at this point I think I can safely say: it’s not MonoGame, and it’s not even Mono. It’s either my actual devices (BOTH of them, perhaps because I’d previously installed my app and some last-run setting survives), or it’s Google, and it is something to do with package names that exist on the Google Play Store.

So to test this theory I took these exact steps, reproducible by anyone:

  1. Opened Visual Studio 2019.
  2. Created a new Android MonoGame project.
  3. Added this in the constructor of Game1.cs:

_graphics.IsFullScreen = true;

  4. Added a font to the content (why doesn’t the Content.mgcb double click open in the default tool like it’s supposed to, and like it does in VS 2017?).

  5. In LoadContent, loaded my font file into a member variable “_font”.

  6. Modified the Draw method with this code:

     protected override void Draw(GameTime gameTime)
     {
         GraphicsDevice.Clear(Color.CornflowerBlue);

         DisplayMode mode = GraphicsDevice.Adapter.CurrentDisplayMode;
         Point size = new Point(mode.Width, mode.Height);

         // TODO: Add your drawing code here
         _spriteBatch.Begin();
         _spriteBatch.DrawString(_font, string.Format("Resolution: {0}", size), new Vector2(50, 50), Color.White);
         _spriteBatch.End();

         base.Draw(gameTime);
     }

  7. Opened up the Play Store and grabbed the package name of the first game I saw (in “new and updated apps”; yeah, right, as if these represent new apps in any way and are not cherry-picked for promotion, they’re all so new in fact they each have thousands of reviews already):

com.everywear.game5

  8. Gave that as my package name.
  9. Connected my developer-enabled Samsung S7 phone and pressed F5.

Sure enough the output is this (1080 x 1920):

I then change the package name to:

dom.everywear.game5

'Cos no one would have “DOM”, no way this exists on the play store! :wink:

Pressed F5 and…

You can even see the size difference in the backbuffer. If you can’t read it because it’s so small, it says 1440 x 2560.

So there you have it, I’d love to be proved wrong but isn’t this proof positive? This is Google themselves!

And yet…I can’t find anything about this on the whole internet. The best I could find was this:

https://developer.android.com/guide/practices/screens_support

Where they say:

By default, Android resizes your app layout to fit the current screen.

I think I would beg to differ with them on that. Like I say, nothing if not interesting. 3 days wasted on this too :roll_eyes:

Fascinating! The app seems to start at a lower resolution and simply doesn’t get updated, keeping the initial resolution. I suppose Google Play Services are somehow involved in that.

Just for clarity, I’m not using any Google Play Services or anything here, the steps I took were exactly and only as I described above.

I’m shocked; I’ve never seen anything quite like this before, but I have come up with one rationale for why they might do this:

They might do this so that the final result resembles what a developer would expect had they tested this exact device on a Google Android emulator (as if they would bother, but in theory they might).

If the Android emulator has a resolution preset of X x Y for that particular device, then when a Store app runs on the real device it had better actually have that resolution, or the result might surprise the developer.

On the flip side, though, this is draconian. It would be like buying a “DELL Model 1” computer with a default screen resolution of 640 x 480 written on the box specs. After unboxing it, the user immediately increases the resolution to 4K because that is their preference (and it is an option; yes, it uses more power and heats the device, but the user is fine with that trade-off), and then downloads a game from a store. The store says, “Ah, a DELL Model 1, 640 x 480 it is then” and automatically changes the screen resolution to that, forcing the game into that size, regardless of what the machine can actually do or what the user has set it to.

“But if the developer expects the game to run at 640 x 480 on DELL Model 1 and it does so, then what’s the problem?” I hear you ask.

The problem is that the game can run on many, many systems, “Dell model 1” is just one. So the game is designed to scale and take advantage of the best each system can give. The game can look better on the “Dell Model 1”, so long as it is not the version downloaded from this particular store and so has a different unique ID or “Package Name”.

Given all of this, only one question remains in my mind, and not being experienced with Android I don’t know the answer: does anyone know of a setting or configuration change that might turn off this “feature” of forcing the screen resolution to the one Google has on record as “normal” for that device, allowing no other choice?

Given I can’t find any information about this issue/feature anywhere on the internet (so effectively it exists only on this page; has no one else even noticed?), my hopes of a remedy are not high, but fingers crossed someone reads this and it rings a bell with them, even just an idea of where I might look.

Perhaps I should ask on an Android forum. Can anyone recommend a good one?

Just a follow up confirmation for any interested parties:

I’ve recently created a new game title on Google Play and saw the resolution change before and after submission: I began debugging at one size, then later the same day the same build was running at a lower resolution.

There is no doubt at all that this is a Google bug/feature.

The resolution change could be temporary; perhaps Google Play or an app manager starts the app at a predefined resolution during initialization and then changes it back to the native resolution.
You will never know, because from v3.6 MonoGame will keep reporting the first resolution. The app simply doesn’t react to resolution changes.
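For what it’s worth, the usual MonoGame pattern for reacting to size changes at runtime is sketched below (assuming the standard _graphics GraphicsDeviceManager field; whether this event actually fires on Android in a given MonoGame version is exactly what is in question here):

// Sketch: resize the backbuffer whenever the window client size changes.
// Window.ClientSizeChanged is a standard MonoGame GameWindow event.
Window.ClientSizeChanged += (sender, args) =>
{
    _graphics.PreferredBackBufferWidth = Window.ClientBounds.Width;
    _graphics.PreferredBackBufferHeight = Window.ClientBounds.Height;
    _graphics.ApplyChanges();
};

If the event never fires after the system’s late relayout, the game keeps the initial size, which matches the behaviour described in this thread.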

You can test this on your own: download this project and repeat the test you did with the package name. You will see that this version handles resolution and orientation changes at runtime without strange side effects.

Hello, I tried your sample and it behaved as expected: when running under your package name (OrientationSample.OrientationSample) the resolution is 1440x2560, and unlocking rotation and rotating the phone didn’t have any effect.

When I changed the package name to “com.everywear.game5” then the resolution was 1080x1920 and unlocking the rotation and rotating the phone didn’t have any effect.

Screenshot of your sample running with the “com.everywear.game5” package name (note 1080x1920 resolution).

Well, that’s strange!

I found this story. It seems that, for whatever reason, when you start an app-store game it uses the default resolution and ignores the settings.

Definitely a bug in android, but realistically you can’t do anything about it or expect to get fixed.

If the generated backbuffer is also 1080, which I can see from the screenshots it is, and the touch events also report values in the range 0-1080, then MonoGame is correct to report the virtual value as the window bounds.
On Android the GL backbuffer is created by the system and you can’t change it. Even if you could get the physical resolution, it wouldn’t be of any use.

Thank you, I read the story. It was about the Samsung S7, so perhaps only a very few phones are affected by this; I just happen to have tested on two of them!

It was interesting what they said about not really being able to notice the difference between 1080 and 1440 on a phone screen, and it is true; the only reason I noticed was that my game has scanline and dot-matrix effects that rely very much on exactly how many pixels the screen has. I was seeing “banding” and “artifacts” in published builds that were not there in other test builds, and it was that puzzle that sent me down this whole rabbit hole.

Presumably, when an app is launched, “the system” makes some kind of callback to Google with the package name and gets back an answer about what resolution to set, if any. One test would be to launch the app while the phone has no internet connection (with Wi-Fi deactivated, say).
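That offline test could be scripted with adb along these lines (a sketch; the apk path and package name are placeholders, and the svc commands require a connected, developer-enabled device):

# Sketch of the offline-install test (package name and apk are placeholders).
adb uninstall com.example.mygame     # remove any previous install
adb shell svc wifi disable           # take the device offline
adb shell svc data disable
adb install mygame.apk               # install with no connectivity
# launch the app and note the reported resolution, then:
adb shell svc wifi enable            # go back online, relaunch, compare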

But it’s all academic at this point given this is just a bug/feature of Google and that’s the way it is.

I must say though, it remains one of the weirdest bugs I’ve ever seen in all programming!

Hi!

Same here: Samsung Galaxy Tab A 10.1, native resolution 1920x1200. Even an empty app created with Android Studio shows 1920x1200 if I use some new package name, and 1440x900 if I use the package name of my app from Google Play!

Awesome!

“Smart” Samsung feature?

UPD.

  1. Uninstall app
  2. Turn wifi off
  3. Install app
  4. Resolution is fine!
  5. Turn internet on
  6. Restart app
  7. Resolution is LOW

WTF ???

UPD.

Found detailed explanation here:

It’s almost spooky isn’t it?

Thanks for the StackOverflow link, the answer by “JJ” seems plausible and fits the observed behaviour well.

JJ says:

What is causing this problem is an online check to https://service.game-mode.net (parameter: package name) during installation on Samsung devices. This may cause a change of the default resolution, e.g. if the app is known as a ‘game’ (my app is a board game). This can be changed by the user using a game performance tuning service such as Game Launcher. This is available for all Samsung phones and most Samsung tablets. But as it happens not for my tablet, a Samsung Galaxy Tab A 10.1 2019 (SM-T510). After contacting Samsung Developers support they changed the settings for this device type, so that now my app runs in native resolution.

By the way, I notice “Game Launcher” does not appear to be installed on my Samsung devices, but it is available for download from the “Samsung Galaxy Store”. From its description, though, it appears to be a totally useless piece of bloatware that I can’t imagine anyone bothering to install:

Game Launcher

Game Launcher is the ultimate control centre for games. All your games in one convenient place with new downloads added automatically. Before playing, you can mute and hide alerts and adjust game settings to save power or increase user experience in gameplay.