scale the local display
Allow the client to "lie" about its real display dimensions (while perhaps still reporting the actual values somewhere) so that the server can render at a different resolution and save CPU, memory and bandwidth.
This will allow us to handle 4k screens a lot better: we can render at 2k or even just 1080p and upscale client side.
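A minimal sketch of the idea, with entirely hypothetical names (this is not the actual implementation): the client could derive the smaller "virtual" dimensions it advertises to the server from its real screen size and a scaling factor.

```python
# Hypothetical helper: derive the advertised (virtual) screen size from the
# real one and a desktop scaling factor - not part of the xpra codebase.
SCALING_FACTOR = 2  # render at half the size in each dimension

def scaled_dimensions(real_w, real_h, factor=SCALING_FACTOR):
    # keep the values even, which video encoders generally prefer
    return (real_w // factor) & ~1, (real_h // factor) & ~1

# a 3840x2160 ("4k") display advertised to the server as 1920x1080:
print(scaled_dimensions(3840, 2160))    # -> (1920, 1080)
```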
For a lot of content, especially video-like pixels, this is a lot more efficient than upscaling the video (in the client application: browser or video player), downscaling it before compressing it, only to upscale it again at the other end. The compression can introduce a lot of blurring, which then gets magnified. Compressing 4k is just too expensive in most cases, even with hardware-assisted encoding.
With a smaller virtual screen size, we are more likely to be able to compress losslessly or at a better quality and higher framerate.
The client-side upscaling may also be able to do the anti-aliasing for us.
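To illustrate that point, a rough sketch of client-side upscaling with a smooth resampling filter, assuming Pillow is available (this is not how the client actually decodes or paints frames):

```python
from PIL import Image

def upscale_frame(rgb_data, src_size, dst_size):
    """Upscale a decoded RGB frame from the server's (smaller) render size
    to the real display size. LANCZOS resampling smooths the result,
    effectively providing the anti-aliasing; NEAREST would just give
    blocky pixels."""
    frame = Image.frombytes("RGB", src_size, rgb_data)
    return frame.resize(dst_size, Image.LANCZOS)
```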
This is what MS Windows already does for applications which are not DPI-aware.
Maybe worth doing if dummy + randr does not get fixed in time: since we are scaling things anyway, we could also use the full virtual screen size (which may be bigger than the size the client requested) and let the client scale it appropriately, as long as the difference is minimal (say 10%?). This would avoid bugs with applications which use the display size without checking the desktop / workarea / available size options.
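A rough sketch of the kind of tolerance check described above (the 10% threshold and the function name are just placeholders):

```python
TOLERANCE = 0.10  # accept up to ~10% difference in each dimension

def close_enough(requested, virtual, tolerance=TOLERANCE):
    req_w, req_h = requested
    virt_w, virt_h = virtual
    return (abs(virt_w - req_w) <= req_w * tolerance and
            abs(virt_h - req_h) <= req_h * tolerance)

# e.g. the client asked for 1920x1080 but the virtual screen is 2048x1152:
print(close_enough((1920, 1080), (2048, 1152)))   # True (~6.7% bigger)
```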
Related / impacted tickets: