Gaming monitors lie to us, and they have been for years. Savvy buyers know the tricks brands use to sell the best gaming monitors and have learned to navigate deceptive marketing. But those tricks continue, and 2023 is the year monitor makers need to be a little more transparent.
Some of the key areas where gaming monitors mislead buyers have been around for years, while others are fairly new. As we head into the new year and look to next-gen displays, consider this both advice for buying your next gaming display and a call to demand improvements from manufacturers.
HDR
Perhaps the biggest area of misinformation about gaming monitors is HDR and all the specs related to good HDR performance. The problem boils down to a list of different standards that are haphazardly slapped on product listings with little regard for what they actually mean.
The current HDR standard is VESA’s DisplayHDR certification. Adopted by over 100 displays over the last five years, it’s a popular standard that covers several key factors for solid HDR performance: peak brightness, local dimming capability, color depth, and color gamut. It also ties these metrics to specific use cases.
VESA explained that at a tier like DisplayHDR 1,000, it measures not only the peak brightness of a portion of the screen, but also the peak brightness of the full screen. A VESA representative said this is important for games with effects like flashbangs that require a rapid jump in brightness.
However, multiple brands have piggybacked on the VESA standard with misleading HDR certifications. The most egregious case came in 2021, when Chinese retailer Taobao listed Samsung and Acer monitors with fake DisplayHDR 2,000 badges. There is no DisplayHDR 2,000 tier.
There are more common and pressing examples, though, and Samsung is the main offender. Some Samsung displays are DisplayHDR certified, but most use “HDR 1,000” or “HDR 2,000” branding to denote their HDR performance, with the number usually referring to the display’s peak brightness. Even then, some monitors are misleading about brightness. For example, the Samsung Odyssey Neo G8 carries Quantum HDR 2,000 branding, but according to Samsung’s own product listing, it only supports 1,000 nits of peak brightness.
Similarly, Asus’ recently introduced Nebula HDR standard for laptop displays has vague specifications. The standard calls for “up to 1,100 nits peak brightness” and “hundreds, if not thousands, of individual dimming zones on a single panel,” but the same Nebula HDR branding is applied to very different panels. For example, the 2023 Zephyrus G14 has 504 local dimming zones and 600 nits of peak brightness, while the 2023 Zephyrus M16 has 1,024 dimming zones and 1,100 nits. Both carry the same ROG Nebula HDR branding. What’s the point of having standards if there are none?
There’s nothing wrong with companies developing standards for their products, but those standards become misleading when they’re designed to look like established industry certifications. If you create a standard, you should also put it through a certification process with a third party such as VESA.

This is all the more important given how loose HDR specs already are. Peak brightness, for example, doesn’t account for how long the screen can sustain that brightness. Is it one pixel for a split second, or 10% of the screen for 30 minutes? No one checks, and no brand wants to be the one selling monitors with lower numbers on paper.
Contrast ratio has the same problem. Both the Samsung Odyssey Neo G9 (2022) and the Alienware 34 QD-OLED list a contrast ratio of 1,000,000:1. However, the OLED panel in the Alienware 34 QD-OLED, with its self-emissive pixels, has an effectively infinite contrast ratio, while third-party reviews put the Samsung monitor at around 15,000:1. Again, I don’t blame Samsung. The company wants to represent its products in the best possible light, but it’s hard to trust these numbers when the conditions behind them go unsaid.
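The math behind the spec is simple, which is exactly why the unstated conditions matter so much: static contrast is just full-white luminance divided by black luminance. A minimal sketch, using hypothetical nit values for illustration only (they aren’t from any official spec sheet):

```python
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Static contrast ratio: full-white luminance divided by black luminance."""
    if black_nits == 0:
        # Self-emissive pixels (as in OLED) can switch fully off.
        return float("inf")
    return white_nits / black_nits

# Hypothetical measurements for illustration:
va_panel = contrast_ratio(white_nits=450.0, black_nits=0.03)   # 15,000:1
oled_panel = contrast_ratio(white_nits=250.0, black_nits=0.0)  # infinite
```

A panel that can dim its backlight to nearly zero in one test scenario can post a headline-grabbing number, even if its everyday contrast is far lower, which is why the measurement conditions need to be disclosed.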
Response time

Response time has long been a confusing and misleading area of gaming monitor specs. There are multiple ways to test response time, with widely varying results. And, of course, companies that want to sell gaming monitors use numbers that make their products look good.
It’s hard to find a gaming monitor these days that doesn’t claim a 1 ms response time, which renders the spec nearly moot. Most monitors list only gray-to-gray (GtG) response time, which is how quickly a pixel transitions from one shade of gray to another. What you aren’t told is which shades were tested, at what brightness, or with what settings, and all of these affect the display’s actual response time.
A more telling specification is Moving Picture Response Time (MPRT), which measures how long a pixel stays visible on screen. This number approximates the motion blur you will actually see, and motion clarity is the key quality that response time is supposed to capture.
Ideally, manufacturers would list both. A monitor can have a GtG response time of 1 ms, but at a 60Hz refresh rate, its MPRT is about 16.6 ms, enough that blur is visible on most moving objects.
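That 16.6 ms figure falls straight out of the refresh rate: on a sample-and-hold display with no backlight strobing, each frame stays on screen for the full refresh interval, which puts a hard floor under MPRT no matter how fast GtG is. A rough sketch of the relationship:

```python
def persistence_ms(refresh_hz: float) -> float:
    """Lower bound on MPRT for a sample-and-hold display with no backlight
    strobing: each frame persists for the full refresh interval."""
    return 1000.0 / refresh_hz

# Even a 1 ms GtG panel holds each frame for this long:
for hz in (60, 144, 240):
    print(f"{hz}Hz -> minimum MPRT of about {persistence_ms(hz):.1f} ms")
```

This is why a higher refresh rate (or strobing, at the cost of brightness) does more for motion clarity than shaving fractions off an already tiny GtG number.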
On top of that, monitor brands usually measure GtG response time at high overdrive levels. Pixel overdrive reduces response time and, in theory, should produce sharper motion. In practice, aggressive overdrive often produces ghosting and inverse ghosting (coronas), artifacts that look much like motion blur on screen. And because brands rarely specify the overdrive level behind their response time figures, the spec becomes even murkier.

VESA is looking to improve the situation with ClearMR. It produces a Clear Motion Ratio (CMR), a ratio of sharp to blurry pixels across a series of tests. That’s more comprehensive than even listing GtG and MPRT together: it evaluates the final image rather than just a test pattern, accounting for the sharpening, overdrive, and other motion clarity techniques gaming monitors use.
ClearMR only launched last year, though, and just 33 displays are currently certified. Response time is arguably one of the most important metrics for gaming, yet for years product listings have done little to clarify how products stack up. Listing GtG and MPRT together is a good first step, but standards like ClearMR capture much more.
Resolution

Gaming monitors don’t lie about resolution, so I don’t want to mislead you here. If you see a monitor advertising 4K, its resolution is 4K. At least, I’ve never come across a monitor that outright lies about its resolution.
The concern here is that we may see misleading branding in future gaming monitors. For example, AMD revealed Samsung’s Odyssey Neo G9 (2023) last November as “the first 8K ultrawide,” and you can find six articles from its first hands-on coverage in the press calling it an 8K monitor.
But it’s not an 8K monitor. The Rec.2020 standard defines 8K as 7,680 x 4,320 pixels, and groups like the 8K Association oversee 8K displays and the content ecosystem that powers them. The new Odyssey Neo G9 has a resolution of 7,680 x 2,160. To get true 8K, you would need to stack two of them.
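Counting pixels makes the gap concrete. A quick sketch using the resolutions from the standard and the product listing above:

```python
def megapixels(width: int, height: int) -> float:
    """Total pixel count, in millions."""
    return width * height / 1_000_000

true_8k = megapixels(7680, 4320)  # Rec.2020's 8K: ~33.2 million pixels
neo_g9 = megapixels(7680, 2160)   # the "8K ultrawide": ~16.6 million
uhd_4k = megapixels(3840, 2160)   # standard 4K UHD: ~8.3 million

# The Neo G9 has the width of 8K but the height of 4K, so it carries
# exactly half the pixels of a true 8K display.
```

By pixel count, the monitor sits much closer to "dual 4K" than to 8K, which is arguably the more honest way to brand it.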
Samsung itself has never claimed the monitor is 8K, but this is an area where misleading branding could take hold in next-generation displays. There’s no doubt “8K” is about to get tossed around loosely. We’ve already seen it happen, and with a single monitor from one of the biggest brands in the world.
What gaming monitor brands can do

Gaming monitor brands need to do better, but that’s easy for me to say. Brands are, after all, in the business of selling gaming monitors, and they want those monitors to look good on spec sheets. If you could list a 1,000,000:1 contrast ratio measured under certain circumstances, would you list 15,000:1 instead? Nobody wants to be in that position.
That’s why third-party industry standards are important. DisplayHDR has set a clear baseline for HDR over the past few years, and ClearMR may clear up response times in similar fashion, if you’ll allow the pun. These certifications are the only way to set clear standards for gaming monitor quality, and for years they have been lacking.
There is no clear path forward, but the status quo isn’t working. In a market where spec sheets say little about how gaming monitors actually perform and misleading branding runs rampant, key factors like HDR performance, brightness, and response time need defined standards that consumers can trust. Otherwise, you might as well ignore the spec sheet entirely when buying your next gaming monitor.
This article is part of ReSpec, an ongoing bi-weekly column with discussions, advice, and in-depth reports on the technology behind PC gaming.