Nearly every time we post a news piece or update on Twitter about a new display, we get a flurry of replies saying “but it’s not 4K”. Where did this obsession with 4K come from, and is it really justified? Does it make sense to demand 4K from a new monitor, and declare a new screen dead on arrival if it doesn’t have it? Or are people getting caught up in thinking they want 4K, without considering the context? There’s definitely a place in the market for 4K, it’s just important to consider the variables before jumping to conclusions.
Where does the obsession stem from?
Looking at the wider consumer display market, the interest in and obsession with 4K resolution (and by 4K we are talking here about the Ultra HD 3840 x 2160 resolution) seems to stem from three key places as far as we can tell:
- The TV market
- Modern games consoles
- TV, streaming, movie and video content
If you have a think about each of those in turn you can see that they’re all linked, but the context of why someone might want 4K is really important here. This is what makes it wrong to assume a monitor should fall into the same requirements.
Timeline of average TV screen size in the US, and key content and device launches
In the TV space the key driver for a 4K resolution over the older 1080p standard is screen size. Back in 2006, when the TVs people bought averaged 33” in size, a 1080p resolution was perfectly adequate. When you add in a typical and sensible viewing distance, there was no real need for anything higher than 1080p. Top-end content like TV shows and movies was produced and broadcast in 1080p as that’s all people really needed. Blu-ray launched in 2006 as the modern disc format, arriving after lower-resolution DVDs, and it too aligned with a 1080p resolution. The three slotted together nicely.
The driver for 4K has primarily been the increase in screen size. Larger TVs became more affordable and more widespread, and 6 years later, by 2012, the average screen size had increased to 38”. Suddenly 1080p was no longer ideal for these larger (and continuing to grow) screen sizes. Pixels were too large and the image lacked detail; from the same viewing distances as before, you lost picture quality and sharpness. 4K resolutions started to appear on the larger TV options, but in the early days were reserved for high-end and expensive TVs, the first being released by LG in 2012 at 84” in size for around $20,000 USD (I’ll take three!).
By 2014 the average TV size had crept up to 41”, more and more 4K screens had arrived, and costs had come down. Netflix saw the need for higher-resolution content and started streaming in 4K that year, and 2 years later, in 2016, Ultra HD discs launched with 4K content. By then the average screen size was 43”, and the UHD Alliance defined its standards for Ultra HD displays. 4K-capable games consoles started to appear the same year, so you had content and devices that offered 4K output, and TVs that supported 4K and needed that resolution because of their size.
Nowadays of course mainstream TVs really start at around 42” as a minimum, with the average size in 2022 being 50”. You can easily find affordable 65” and larger TVs too. At these sizes you need 4K more than ever, as 1080p really wouldn’t look good. For the larger screen sizes like 65” and above, which are far more common now, you might even want to consider 8K – but that’s a different topic for another day. The latest-generation games consoles like the PS5 and Xbox Series X were also launched a couple of years ago, in 2020, and there’s good, widespread support from content and devices for 4K to meet the requirements of 4K TVs.
The important thing here is that with the average TV size being 50” in 2022, 4K is really the norm. It’s very sensible to demand 4K from a screen that size. People know of it, and expect it from their TV and the content they consume, whether that’s console gaming, movies, disc, TV or streaming content. This is where we believe the obsession now stems from. People see other displays like monitors being launched and announced, and express their disappointment when they aren’t 4K. Why aren’t they matching their TV? They’ve got all the content and devices that operate in 4K, why is a new monitor not supporting this?
There are certainly some use-cases for 4K on a monitor, but the problem is people are not always considering the wider context of whether you need or even want 4K in this situation. 4K is not by default better than lower resolutions, it all depends on other factors.
Is 4K resolution needed for desktop monitors?
It’s not as clear cut as just saying “4K is better than lower resolutions like 1080p (1920 x 1080) or 1440p (2560 x 1440)”. In the monitor space we are talking about completely differently sized screens than TVs, and the requirements for resolution are therefore different.
According to market insight information we have from panel manufacturer AU Optronics, the average monitor screen size in 2022 was only 24.7”, and it is creeping up very slowly year on year; by this year it’s expected to grow slightly to a 25” average. Obviously there are plenty of larger screen sizes available nowadays, with a lot of new announcements falling into the 27” space and above, but there’s still high demand for smaller screens and they make up the volume of shipments. It would be impractical and inappropriate for most people to have a 4K resolution on something that size, when 1080p (and sometimes 1440p) is perfectly adequate.
But let’s think about new monitor announcements, and the kind of screen we would cover here on TFTCentral for news and reviews – after all, that’s how this conversation started.
Let’s first consider the very common 27” screen size, still very popular and still used for many new and emerging screens. For instance the recently announced 240Hz OLED monitors from LG, Asus, Acer and Dough are all 27″ in size. What resolution do you need for these?
Why you probably don’t need 4K for 27” sized monitors
This is one of the most common topics we’re seeing at the moment, and the reason for writing this article. Going back to the history of TVs, you didn’t need a 4K resolution for the smaller 30 – 35”-ish TVs of the mid-2000s. The lower 1080p resolution was perfectly adequate for the typical viewing position and content type. Only when screen sizes became larger did 4K start to become really important and relevant. It’s a similar story with desktop monitors, but the sizes and the usage considerations are different.
Obviously with a desktop monitor you’re going to be sitting much closer to the screen and using it for different tasks like office work, photo editing, design work etc. You almost certainly want and need a higher resolution than you would on a TV of the same size due to your usage type and viewing position. When determining which resolution you need for a particular screen the key thing to consider here is the balance between screen size, viewing distance, content and visual acuity.
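That balance between screen size, resolution and viewing distance can be put into rough numbers. As an illustrative sketch (the 1-arcminute figure is a common rule of thumb for 20/20 vision, not a value from this article), this Python snippet computes pixel density and the viewing distance beyond which individual pixels become too small to resolve:

```python
import math

def ppi(diagonal_in, width_px, height_px):
    """Pixels per inch for a given diagonal size and resolution."""
    return math.hypot(width_px, height_px) / diagonal_in

def acuity_distance_in(ppi_value, arcmin=1.0):
    """Viewing distance (inches) at which one pixel subtends `arcmin`
    arcminutes - roughly the resolving limit of 20/20 vision at 1 arcmin.
    Beyond this distance individual pixels blend together."""
    pixel_pitch_in = 1.0 / ppi_value
    return pixel_pitch_in / math.tan(math.radians(arcmin / 60.0))

for name, w, h in [("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    density = ppi(27, w, h)
    print(f'27" {name}: {density:.0f} PPI, pixels resolvable '
          f'within ~{acuity_distance_in(density):.0f} in')
# -> 27" 1440p: ~109 PPI, resolvable within ~32 in
# -> 27" 4K:    ~163 PPI, resolvable within ~21 in
```

On these assumed figures, a typical desk viewing distance of 24 – 32 inches sits right around the point where the extra density of 27” 4K stops being resolvable – which is consistent with why some people see the difference and others don’t.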
We have a more detailed article talking about visual acuity in a lot more detail so we won’t go in to it all here, but the gist of it is, there’s a point at which having a higher resolution is actually unnecessary, and in some cases may even be detrimental. If you have a 4K resolution on a screen that is 27″ in size, you need to consider the following:
- The higher resolution may not look any different or could create challenges with scaling – On a 27” monitor with a 4K resolution you will need to use operating system and application scaling (typically 150%) to make text a sensible and comfortable size. This reduces your desktop space from 4K back to the equivalent of 1440p, meaning in reality you get no benefit in terms of real estate – something that’s not always understood or realised. Fans of 4K would then argue that you still get a higher pixel density and a sharper image, which is true, but this may or may not be relevant or useful to you. Some people barely notice any difference between a native 1440p 27” panel and a higher-density 4K 27” panel; this all comes back to viewing distance and visual acuity again. So it’s certainly not as simple as saying “4K is better” – many people may not even see the difference. Why then spend more money on a 4K screen, typically more expensive than a 1440p equivalent, when it won’t be of any benefit to you? We will talk about scenarios where this increased pixel density is desirable in a moment though.
Then come the challenges with scaling. Not all operating systems and applications handle scaling well. Modern Windows and macOS generally do a decent job, but many applications can struggle. If you’re running a multi-monitor setup alongside a non-4K screen, mixed scaling factors can complicate things further, and games can be tricky too. Basically, relying on anything other than standard 100% scaling adds complication for many people. Why have 4K, if you’re someone who can’t see much difference at this screen size, when it’s only going to cause you other issues?
- The higher resolution strains your system more for games and multimedia – whether you’d actually see the improvement in clarity and sharpness of 4K vs 1440p on a 27” screen becomes even more debatable for gaming and multimedia. Differences are always harder to see in moving and dynamic content, especially when you consider that your viewing distance may well increase for gaming and videos. The higher 4K resolution is also a lot more demanding on your graphics card and system than 1440p (or 1080p) – in fact 4K has 2.25x the pixels of 1440p. The alternative is a lower-resolution screen like 1440p, focusing instead on driving higher frame rates or quality settings. That’s likely to have a far more beneficial impact for the majority of users than pushing from 1440p to 4K, with questionable visible benefits at this screen size. Some might argue that a 4K screen at least gives you the option – you could play your games at a lower resolution like 1440p and “get the best of both worlds”. That’s true, you could, but the image quality of a monitor interpolating a 1440p input up to a 4K panel is never as clear and crisp as a native 1440p panel.
- 4K resolution invariably leads to higher costs for the display – The higher pixel density is harder to produce, and costs for these panels are higher. Add on top of that other knock-on costs, like the likely need for HDMI 2.1 capabilities instead of older HDMI 2.0 for instance, and you’ve got a display that ends up retailing at a higher price point than a 1440p equivalent might. A good example would be the recently launched and reviewed Cooler Master Tempest displays. The two screens are very similar in specs, features and capabilities, but the GP27Q is 1440p and retails for an MSRP of $499 USD, whereas the GP27U is 4K with an MSRP of $799 USD. That’s 60% more expensive for the privilege of 4K resolution and not much else!
Reasons why you would still want 4K anyway on a 27” screen
On the flip side there are some scenarios where you may still want 4K resolution on a screen even of this size. They won’t apply to everyone, but they’re worth noting:
- The higher pixel density can lead to a sharper image for some people – depending on their viewing distance and visual acuity, some people will see the benefits of the higher pixel density, resulting in a sharper and crisper image than a 1440p equivalent. This is useful for office and general use, and becomes even more relevant for highly detailed work such as CAD/CAM, photo editing or imaging work. Personally I can certainly see the difference, but that is not to say a 27″ 1440p display is not sharp, clear and crisp – it is.
- The higher pixel density and resolution can provide better clarity for gaming – depending on your eyesight, visual acuity and viewing distance, you may find that games look sharper and clearer at 4K than at 1440p. It can help bring out finer details, particularly in the distance.
- You have the option for more desktop space – you could in theory run the screen at the higher 4K resolution and 100% scaling, and get a much larger desktop area to work with. This is going to be impractical for most people, as text and icons end up being very small, but some people use the screens like this and get used to it. For most people this is unlikely to be a sensible option on a screen that’s 27″ in size though.
- The Xbox Series X only supports HDR mode at 4K currently – this is a fairly niche scenario admittedly, but the Xbox Series X currently only supports HDR mode at 4K resolutions, so with a 1440p screen you wouldn’t be able to use HDR from the console. However, this only matters if the display actually supports HDR in any meaningful way (most so-called HDR screens don’t!); if the screen’s HDR is poor, enabling it from the console is hardly a reason to need 4K. The other consideration is that many 1440p screens also support a “Virtual 4K” mode, allowing you to overcome this issue and accept a 4K input from the console if you want. So all in all, this is a limited argument for needing 4K on a 27” screen.
We would argue that for a 27” screen size, a 1440p resolution is more suitable for the majority of people. Not everybody, but the majority. Just because 4K is everywhere in the TV space now, and many modern games, consoles, videos and movies support 4K, doesn’t mean you necessarily need 4K from your monitor. This content and these devices are capable of outputting at 1440p too – even the PS5, which lacked 1440p support at launch but offers it nowadays. In some cases, like the latest-generation consoles, outputting at a lower resolution lets you instead prioritise frame rates and push for a 120Hz refresh rate. There’s no need to force 4K when you may not see any benefit in real use.
Not everyone will agree, and we are not saying that 4K doesn’t have its place and its benefits – it does. Our main point is that buyers should try and look past the knee-jerk reaction of “it’s not 4K” and think about the context fully.
Ultrawide monitors – “but it’s not 4K!”
This is another one we see quite a lot. A new ultrawide screen gets announced with a 3440 x 1440 resolution for example and people immediately moan that it’s not 4K. No, it’s not 4K, it’s not supposed to be 4K. It’s an ultrawide panel and we’re talking about a completely different target audience and use. The same arguments stand as above but we would consider 3440 x 1440 to be the appropriate and sensible resolution for a 34” screen size (the most common ultrawide), with no real reason or need to offer 4K for the majority of users.
At what point do you need 4K on a desktop monitor?
This of course changes as screen size increases, and the argument for 4K certainly becomes stronger once we start thinking about 32”+ screens. For a 32” screen with 4K resolution you would typically need 125% scaling to get a comfortable text size, which ends up very similar to the text size of a 27” 1440p screen – again a pretty good sweet spot. The benefit of having 4K on a 32” screen is that with 125% scaling you end up with a larger working area, equivalent to a 3072 x 1728 resolution – a nice jump for multi-tasking and productivity. You could even run at 100% scaling and make use of the full 4K resolution, as long as you can live with the smaller text and icon size. There’s a clear benefit in having 4K instead of 1440p once the screen reaches this size.
A 32” screen with a lower 1440p resolution would be considered too low to provide a sharp and crisp image for many users: the text size would be larger and you wouldn’t have the crispness you’d ideally like for general, office and professional uses. This is less true for gaming and multimedia, where many users are less sensitive to differences in clarity and pixel density, and may prefer to focus on frame rate and graphical detail over pure resolution – which is why quite a few 32” gaming screens still have a 1440p resolution. Our view is that, all things considered, a 4K resolution is desirable for a 32” screen.
Then there’s a point at which you’d be able to use a 4K resolution on a desktop monitor without scaling (at 100%). That is at around 40” in our view, which gives you a text size very similar to 27” 1440p at 100% scaling, and 32” 4K at 125% scaling. So for anything this size and above, you definitely want 4K for a desktop monitor. For larger screen sizes like 48” or above, 4K might even be considered too low for a close-up monitor, although at that point the screen is arguably getting too big for desktop use anyway, so that’s a different conversation.
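Those three combinations line up as “sweet spots” because their effective (scaled) pixel density – and therefore on-screen text size – is almost identical. A quick Python check, using the sizes and scale factors above:

```python
import math

def scaled_ppi(diagonal_in, width_px, height_px, scale_pct):
    """Physical PPI divided by the OS scale factor - a rough proxy for
    on-screen text size (lower scaled PPI = larger text)."""
    physical_ppi = math.hypot(width_px, height_px) / diagonal_in
    return physical_ppi / (scale_pct / 100.0)

# 27" 1440p @ 100%, 32" 4K @ 125%, 40" 4K @ 100%
for size, w, h, scale in [(27, 2560, 1440, 100),
                          (32, 3840, 2160, 125),
                          (40, 3840, 2160, 100)]:
    print(f'{size}" {w}x{h} @ {scale}%: '
          f'{scaled_ppi(size, w, h, scale):.1f} effective PPI')
# -> all three land at roughly 109 - 110 effective PPI
```

All three work out to roughly 109 – 110 effective PPI, which is why text looks so similar in size across them.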