This news piece was inspired by a conversation we had today with @TotallydubbedHD about the usage of the term “4K” in the monitor market. We wanted to comment on a couple of recent trends that need a bit more clarity, and that are worth highlighting so consumers aren’t confused or misled. One is the use of the term “4K” on certain displays, and the other is “Virtual 4K” support.
When a 4K monitor isn’t really a 4K monitor
The monitor which kicked off this conversation was the HP S430c, a 43.4″ monitor with an ultrawide format. This screen has a 3840 x 1200 resolution, the equivalent of two 24″ monitors side by side, each with a 1920 x 1200 resolution. It’s aimed at office work and productivity, and fits a niche for consumers who like the extra vertical space (relative to a common 1080p screen) and perhaps use dual screens today.
On HP’s main spec page the screen is listed with a “WUXGA” resolution, but in some initial press material HP talked about this as being “4K” as well. Thankfully the main HP product page doesn’t refer to it in this way any more, but it still remains on their support pages in the specs:
This has then made its way into numerous press news pieces and reviews across the internet. Not only do HP refer to the screen as 4K above, they also label it “UHD”, meaning UltraHD – it really isn’t UltraHD!
Is 3840 x 1200 4K? In our opinion it is not a resolution any sensible person would really consider “4K”. The term is vague and general, and if we were being pedantic it could technically describe, as Wikipedia puts it, any display with “a horizontal display resolution of approximately 4,000 pixels”. However, we think everyone can accept that in the industry the term 4K has really become associated with two key things:
- The most common usage of 4K is for resolutions that are officially “UltraHD”, i.e. 3840 x 2160. This has 4x the total pixel count of a normal Full HD 1920 x 1080 resolution, and the 4K term fits nicely for that reason. UltraHD has the same horizontal resolution as the display in question, but far more pixels vertically, in the common 16:9 aspect ratio. This is how the term 4K is used in the TV market, the mainstream monitor market and for film and TV source content. 4K has pretty much come to mean UltraHD.
- The resolution that truly meets the “4K” name is 4096 x 2160, which has long been used in the film projection industry (DCI 4K). There are some monitors and panels with this “true 4K” resolution, slightly wider than 4,000 pixels.
What we don’t really think reasonably qualifies as 4K are screens like the HP S430c where they have the required horizontal pixels but lack the vertical pixels.
What if someone released a 3840 x 100 resolution monitor (please don’t!) – would you still call that 4K? Perhaps by the broad and vague definition you could, but most people would consider it misleading. Neither that imaginary resolution nor the 3840 x 1200 of the HP S430c has enough vertical pixels to display common 4K content – that being 3840 x 2160 source material from 4K Blu-ray players, games consoles etc – so we would not consider them 4K. We certainly wouldn’t consider them UltraHD as HP list in their spec; unlike 4K, UltraHD has no ambiguous, vague meaning – it is a very specific, defined resolution.
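To make the distinction concrete, here is a minimal sketch of the logic we’re describing. This is purely our own illustration – the helper name is invented and nothing here comes from any standards body:

```python
# Illustrative check of a resolution against the definitions discussed above.
UHD = (3840, 2160)     # "4K UHD" - what consumers usually mean by 4K
DCI_4K = (4096, 2160)  # "true 4K" used in film projection

# UHD really is exactly 4x the pixel count of Full HD (1920 x 1080)
assert 3840 * 2160 == 4 * (1920 * 1080)

def describes_4k(width: int, height: int) -> str:
    """Rough, hypothetical check of whether a resolution reasonably qualifies as 4K."""
    if (width, height) == DCI_4K:
        return "DCI 4K"
    if (width, height) == UHD:
        return "UltraHD (commonly called 4K)"
    if width >= 3840 and height < 2160:
        # Enough horizontal pixels, but cannot show 3840 x 2160 content 1:1
        return "not 4K - insufficient vertical resolution"
    return "not 4K"

print(describes_4k(3840, 2160))  # UltraHD (commonly called 4K)
print(describes_4k(3840, 1200))  # HP S430c -> not 4K - insufficient vertical resolution
```

By this reasoning the HP S430c fails on the vertical axis, which is exactly why calling it 4K misleads buyers expecting to view 3840 x 2160 content pixel-for-pixel.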
What about 3440 x 1440 as “Semi-4K”, or “3.5K” perhaps?
No. Just more confusing nonsense.
Another thing you might occasionally see advertised on some displays is “virtual 4K” support, or a screen being labelled “4K ready”. If a display has a native resolution of, let’s say, 2560 x 1440, it only has 2560 x 1440 pixels to display the image. That’s not a native 4K capable screen, and it lacks the pixels required to display a 4K source as it was intended.
Some manufacturers add a “virtual 4K” mode which allows you to input a 4K source signal (3840 x 2160) that the screen then downscales to the native resolution of the panel, in this case 1440p (2560 x 1440). The source might be a 4K games console or Blu-ray player, or you might even be able to select 4K in Windows from a PC. At the end of the day the screen is still limited to 1440p. The feature was originally introduced to cope with interim resolutions like 1440p that some external devices and games consoles don’t support natively. You often have a choice between 1080p (1920 x 1080, too low to make the most of the screen resolution) or UltraHD/4K (too high in this example), with no 1440p output available. By adding Virtual 4K support the screen can accept the higher 4K signal and downscale it at the display level to the native 1440p. That’s better than inputting 1080p and upscaling it to 1440p from there.
From a PC input there is no benefit in sending a 4K resolution to the screen and letting it downscale it to 1440p. You lose clarity, add unnecessary burden to your system and graphics card and you end up having to use OS scaling as well to make fonts a sensible size. It’s always best to stick to the native screen resolution wherever you can, so from a PC stick to 1440p output and for devices where you can’t set 1440p, using 4K output is likely a better option than having to drop down to 1080p.
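Some rough back-of-the-envelope arithmetic shows the trade-off, assuming the 2560 x 1440 panel from the example above (the numbers are our own illustration):

```python
# Pixel-count arithmetic for the "virtual 4K on a 1440p panel" scenario.
native = 2560 * 1440   # what the panel can physically show (3,686,400 px)
uhd    = 3840 * 2160   # what the GPU renders for a "virtual 4K" input
fhd    = 1920 * 1080   # the 1080p fallback many devices offer instead

# From a PC: rendering 4K makes the GPU draw 2.25x the pixels the panel
# can display, only for the excess to be discarded by downscaling.
print(uhd / native)    # 2.25

# From a console with no 1440p output: a 1080p signal forces the screen
# to interpolate roughly 78% more pixels when upscaling to 1440p.
print(native / fhd)    # ~1.78
```

That 2.25x wasted rendering load is why 4K input from a PC makes no sense here, while the ~78% interpolation penalty of 1080p is why 4K input can still be the better choice for consoles.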
This is perhaps useful for games consoles and external devices as we’ve described but this in no way makes the screen a 4K display. It’s fine as long as the manufacturer explains clearly what this feature is designed to do, but the problem is some people may see the term “4K” and make assumptions.
What we are getting at here is that we would advise caution when you see the term “4K” used in relation to a screen you’re considering buying. Check the actual screen resolution – that should give you a better idea of whether it can really be considered 4K, and whether it supports 4K content and devices as they were intended to be seen.
We would like manufacturers to avoid further confusion and stick where possible to standards and common practice. Don’t label your screen as 4K if it’s not really 4K, and be clear with what you mean by things like “virtual 4K” so consumers aren’t misled.