If you go to Options you can change the quality level. High and Very High are quite demanding and will tank performance fast, I think even on your powerful OnePlus 6.
1) Yep, not by FPS - actually most often my bottleneck is the CPU, not the GPU. The idea is to measure the number of milliseconds a Draw() call takes. Since 1 s = 1000 ms, 30 FPS means 33.3 ms per frame. Hence, if your frame takes longer than 33.3 ms to draw, your performance suffers. Now, how to calculate the preferred render size. I use an algorithm that goes like this:
var delta = ResolutionChangeFactor * currentResolutionScaleRatio * ((DesiredTimeFrame - _drawStopwatch.ElapsedMilliseconds) / DesiredTimeFrame);
currentResolutionScaleRatio += delta;
ResolutionChangeFactor is a constant controlling how fast the resolution changes. I played a bit with values and 0.01f works well for me.
currentResolutionScaleRatio is quite self-explanatory; the starting value is of course 1f.
DesiredTimeFrame is the target number of milliseconds per frame, so our 33.3f for 30 FPS.
_drawStopwatch.ElapsedMilliseconds is the number of milliseconds the last Draw() call took.
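Putting those pieces together, a minimal sketch of the adjustment step could look like this. The formula and constant values are the ones above; the class name, the clamping bounds, and passing the elapsed time in as a parameter are my own assumptions:

```csharp
using System;

public class DynamicResolutionScaler
{
    // Constants from the formula above; the clamp bounds are my own addition.
    const float ResolutionChangeFactor = 0.01f;
    const float DesiredTimeFrame = 33.3f;   // 30 FPS target
    const float MinScale = 0.5f;            // assumed lower bound
    const float MaxScale = 1f;

    float currentResolutionScaleRatio = 1f;

    // Call once per frame with _drawStopwatch.ElapsedMilliseconds
    // measured around the last Draw() call.
    public float UpdateScale(long elapsedMilliseconds)
    {
        // Positive delta when the frame finished early (room to raise resolution),
        // negative when it ran long (lower the resolution).
        var delta = ResolutionChangeFactor * currentResolutionScaleRatio *
                    ((DesiredTimeFrame - elapsedMilliseconds) / DesiredTimeFrame);
        currentResolutionScaleRatio =
            Math.Clamp(currentResolutionScaleRatio + delta, MinScale, MaxScale);
        return currentResolutionScaleRatio;
    }
}
```

Because delta is proportional to the current ratio, the scale converges smoothly instead of oscillating wildly around the target frame time.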
2) Indeed, that is achieved using render targets. The game is rendered into one RenderTarget and the UI into a separate one. The game RenderTarget is scaled with dynamic resolution, while the UI RenderTarget keeps a constant resolution.
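In MonoGame terms, the two-target setup could be sketched roughly like this; the variable names and the composite pass are illustrative, not the game's actual code:

```csharp
// Illustrative sketch: game world at a dynamic scale, UI at fixed resolution.
var pp = GraphicsDevice.PresentationParameters;
int gameW = (int)(pp.BackBufferWidth * currentResolutionScaleRatio);
int gameH = (int)(pp.BackBufferHeight * currentResolutionScaleRatio);

var gameTarget = new RenderTarget2D(GraphicsDevice, gameW, gameH);
var uiTarget = new RenderTarget2D(GraphicsDevice, pp.BackBufferWidth, pp.BackBufferHeight);

// Draw the world into the scaled target...
GraphicsDevice.SetRenderTarget(gameTarget);
// ... world drawing here ...

// ...and the UI into the full-resolution target.
GraphicsDevice.SetRenderTarget(uiTarget);
// ... UI drawing here ...

// Finally composite both onto the back buffer; the game target gets stretched up.
GraphicsDevice.SetRenderTarget(null);
spriteBatch.Begin(samplerState: SamplerState.LinearClamp);
spriteBatch.Draw(gameTarget,
    new Rectangle(0, 0, pp.BackBufferWidth, pp.BackBufferHeight), Color.White);
spriteBatch.Draw(uiTarget, Vector2.Zero, Color.White);
spriteBatch.End();
```

In practice you would recreate gameTarget only when the scale changes meaningfully, not every frame, since allocating render targets is expensive.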
3) YES! I managed to pull off 4K resolution in certain scenes on my Sony Xperia XA1. However, to keep battery usage low and heat output reasonable, my game scales up to at most 1080p, or the device's maximum supported resolution if that is lower than 1080p.
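That cap is just a clamp on the target size; a one-liner sketch, assuming height is the deciding dimension (the helper name is hypothetical):

```csharp
using System;

static class ResolutionCap
{
    // Cap the render height at 1080p, or at the display's native height if lower.
    public static int MaxRenderHeight(int deviceHeight) => Math.Min(1080, deviceHeight);
}
```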
If you need any additional explanation, please feel free to ask. I can only ask for a nice review on Google Play in exchange.