The state of high-resolution scaling on Linux has always been a sore point for me. I go through cycles where I switch to whatever the hot distro of the moment is, enjoy the heck out of developing with the native tools, then ultimately switch back to Windows because of the unusable scaling for HiDPI displays right now.
Before I dive in, I’d like to preface with this: the problem is exclusively a first-world one. Running 1080p on any Linux distribution is a dream, and while multitaskers may want a few extra monitors, that’s no sweat for the OS. The trouble starts when you reach for higher resolutions: 1440p is usable if a little buggy, and 2160p (4K) is outright unsupported in any meaningful way as of this writing.
The problem is not that the resolutions themselves are unsupported; you can get whatever resolution you fancy on most distros. The problem is scaling: the UI elements of most Linux distributions, even the most popular ones, simply fail to scale properly on high-resolution displays.
Ubuntu 17.10 “works”, if by works you mean making sure you’re running the bleeding-edge version of GNOME and enabling some beta features via the terminal (eck). The default scaling only lets you pick integer factors of 1, 2, or 3, which is only useful if your display is larger than 32″, a scenario that makes little sense to me since the ideal 4K monitor size is 27″ IMO. Enabling the beta feature gets you fractional scaling, which subsequently breaks certain UI elements. I mean, it’s 2018; Apple and Microsoft have a few nicks here and there on HiDPI displays, but at least their implementations appear to work for the most part.
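For reference, the “beta features via the terminal” dance looks roughly like this. This is a sketch assuming GNOME on a Wayland session; the `scale-monitor-framebuffer` flag is the experimental toggle that unlocks fractional scaling in Mutter, while `scaling-factor` is the ordinary integer-only knob.

```shell
# Integer-only scaling (the default): 1, 2, or 3 — nothing in between.
gsettings set org.gnome.desktop.interface scaling-factor 2

# Experimental fractional scaling in Mutter (Wayland sessions).
# This is the beta feature that can break certain UI elements.
gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
```

After setting the experimental flag, per-monitor fractional factors (e.g. 125%, 150%) show up in GNOME’s display settings, though with the rendering glitches mentioned above.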
Of course, if GNOME isn’t your environment of choice, there’s always KDE Plasma, which supports fractional scaling out of the box. Except that the fonts break and get blurry 😑.
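A workaround some users reach for on X11 is pinning the font DPI directly rather than relying on Plasma’s scale factor, so that text at least renders sharply. This is a sketch, assuming an X11 session and a ~144 DPI display (e.g. 27″ at 1440p-ish effective scaling); the exact value depends on your monitor.

```shell
# Force the font DPI via Xresources (X11 only).
# 144 = 1.5x the X11 baseline of 96 DPI — adjust for your display.
echo "Xft.dpi: 144" >> ~/.Xresources

# Reload the resource database so running and new apps pick it up.
xrdb -merge ~/.Xresources
```

It’s a blunt instrument, since it scales fonts but not the rest of the UI, but it does cure the blurry-text symptom for toolkit apps that honor Xft settings.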
Perhaps I’m just whining about an edge case. Steam’s hardware survey recently showed that about 75% of the market is dominated by 1080p monitors. If that’s where most of the user base is, then why waste precious development time on features that won’t be widely used for a while yet? My guess is that until the price of 4K monitors becomes comparable to 1080p ones, I may as well forget about running Linux at that resolution.
Until then, I sit comfortably on macOS for development and Windows for gaming.