Welcome back to Hardware Unboxed. Today's video is not the usual sort of content I make on the channel; it's going to be more of a discussion addressing some of the common misconceptions about HDR technology that I hear in comments, on forums, in person, all that sort of thing. As more HDR monitors become available, and as I get into reviewing the ASUS PG27UQ, it's important to know exactly what you need to look for when purchasing an HDR monitor, so I'm going to talk about that today. Don't expect too much B-roll or anything like that, just a nice juicy talk about HDR.
The things I'll be talking about in this discussion come from a number of sources: firstly my personal experience with HDR monitors and TVs of all kinds, but also key people in the industry involved with developing HDR displays, working on HDR support, and validating HDR monitors, not just for consumers but also for movie studios in Hollywood. So a lot of information from a variety of people is going to go into this video. As far as I'm concerned there are three main pillars to HDR technology: color space, brightness and contrast, and it's important to note how these pillars of HDR improve the viewing experience relative to SDR tech, which has been used for a long, long time now.
I'm going to start with brightness, because this is the area where I hear the most misconceptions. Common things that people say include: HDR monitors need to support a thousand nits of brightness, or 400 nits of sustained full white brightness isn't high enough, or even it's not real HDR unless it's 4,000 nits. None of those statements are really true, so it's important to know exactly how brightness affects HDR content and what to look for in a monitor's brightness specifications to ensure your HDR-capable display is well suited to HDR content.
Most of the discussion around HDR brightness comes from two misconceptions of sorts: the first is around what people believe to be a high level of brightness and what isn't, and the second is around what brightness levels companies are mastering their content for, and the two are kind of intertwined. It is true that HDR standards like HDR10 and Dolby Vision have brightness targets and maximum levels ranging anywhere from a thousand nits up to 10,000 nits. It's true that many studios creating HDR content are mastering that content for high levels of brightness, like 4,000 nits. And it's also true that plenty of high-end TVs can currently produce well above a thousand nits at their peak. So how relevant is this stuff for monitors in particular?
Well, firstly, on HDR standards and mastering: it's not necessary for your display to support the full range of the standard, or the exact specs used when the content was mastered, so long as your display is able to correctly map from a wider brightness range down to whatever the display supports. There are differences in how HDR displays handle this mapping, but as long as it's done to a reasonable degree and everything is scaled or clipped properly, the content will look pretty good on a display that can't hit those super high levels of brightness. Of course, if the display does a bad job, you might run into issues like standard brightness areas being too dark because everything gets scaled down inappropriately, or you might run into severe clipping at the top end of the brightness scale with significant detail loss. Luckily for gamers on HDR monitors this is less of an issue compared to video content, as most games include scales and sliders to adjust how the HDR image looks on your display and allow you to get the most out of your display's capabilities.
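To give a rough feel for what that mapping involves, here's a toy Python sketch of a simple soft-knee roll-off that squeezes content mastered for a 4,000-nit display onto a 600-nit panel. The knee position, the mastering peak and the curve shape here are made-up illustration values, not any particular monitor's or game's actual tone mapping.

```python
def tonemap_nits(content_nits, display_peak=600.0, mastering_peak=4000.0, knee=0.75):
    """Toy highlight roll-off: pass midtones through unchanged and compress
    everything above a 'knee' point into the display's remaining headroom
    instead of hard-clipping it."""
    knee_nits = knee * display_peak              # e.g. 450 nits on a 600-nit panel
    if content_nits <= knee_nits:
        return content_nits                      # darker areas stay as mastered
    headroom = display_peak - knee_nits          # room left above the knee
    excess = content_nits - knee_nits
    max_excess = mastering_peak - knee_nits      # brightest highlight we expect
    return knee_nits + headroom * (excess / max_excess) ** 0.5

for nits in (100, 450, 1000, 4000):
    print(f"{nits:>5} nits in content -> {tonemap_nits(nits):6.1f} nits on screen")
```

The point is simply that highlights get compressed rather than thrown away, which is why content mastered well beyond a monitor's peak can still look right on it.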
Now, if your display can show high levels of brightness like 4,000 or even 10,000 nits, that's great, higher is better, but lower levels of brightness aren't nearly as bad as you might think. The difference between a thousand nits and 4,000 nits sounds massive, and the difference from 600 nits to 4,000 nits even larger, but the human eye is nonlinear in the way it perceives brightness. In other words, while the jump from a thousand nits to 4,000 nits sounds like a four times increase, the eye does not perceive 4,000 nits as four times brighter. It will perceive a four times difference between, say, 50 nits and 200 nits, but as brightness increases your eye becomes less sensitive to large changes in light output. The result is that 4,000 nits only appears to most viewers as a small increase over a thousand, despite the large change in the number. Crucially for HDR monitors, the effect also applies below a thousand nits. According to some experts I spoke to who are involved with professionally testing HDR displays, the difference between 600 nits and a thousand nits in a typical viewing environment for a monitor isn't all that large. In fact, many viewers will only notice a small difference, or even no difference, between those two brightnesses in an indoor, artificially lit room, and that's where most monitors are viewed. The difference gets even harder to spot in dark environments, like with the lights off.
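You can see that nonlinearity baked into the HDR10 signal itself: the PQ transfer function (SMPTE ST 2084) it uses is built on a model of human contrast sensitivity, and it spends most of the signal range on the lower brightness levels. Here's a quick Python sketch of that curve; the numbers illustrate the encoding, not how any specific viewer or monitor behaves.

```python
def pq_signal(nits):
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits to the
    0-1 signal value HDR10 uses. The curve is heavily nonlinear because it
    follows a model of human contrast sensitivity."""
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (100, 600, 1000, 4000, 10000):
    print(f"{nits:>5} nits -> PQ signal {pq_signal(nits):.3f}")
# 1,000 nits already sits around 0.75 of the full signal range, while the
# entire 1,000 to 10,000 nit region shares only the top quarter or so.
```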
Another important factor in this is that monitors are viewed at closer distances than TVs. A relatively large TV viewed from a typical couch distance in a brightly lit living room during the day requires a much higher brightness output to deliver the same HDR effect as a monitor viewed more closely and in a more dimly lit room. So while the TV may benefit from, say, 4,000 nits of peak brightness, a monitor could give an equivalent viewing experience with a thousand nits or less. One expert I spoke to suggested that for monitors, the difference between 600 nits and a thousand nits of peak brightness is far smaller than most people realize in typical viewing environments, and 600 nits should be perfectly adequate for a good HDR experience, provided the monitor hits several other key metrics we'll talk about later. Even 400 nits of peak brightness could be fine in some scenarios, although for most people this 600 nit mark, corresponding to the DisplayHDR 600 specification, is the safe spot for great HDR, with a thousand nits or more providing a small improvement on that.
The other very, very important thing to note is the difference between peak brightness and sustained brightness. I hear a ridiculous number of complaints about HDR monitors rated for a thousand nits or 600 nits of peak brightness being incapable of displaying that level of brightness across the entire display, in a full white image, for a sustained period of time. The complaints usually paint those brightness figures as false advertising when the monitor might only be able to push 300 nits across a full white image. The truth is that for almost all content you'll actually view on the display, the full white brightness figure is irrelevant, and for HDR in particular it's the peak brightness figure that matters. If you think of the content you typically view, like a game, a TV show or a movie, how often is the screen entirely white? Very rarely. And on top of that, how often do you think a content creator working in HDR wants to display a full white image at an eye-scorching thousand nits for a long time? The answer to that is pretty much never. The reason why HDR needs high brightness levels is for flashes of brightness: high brightness enables the intensity of the sun to burn through, the flicker of fire to illuminate a scene, or an explosion to rock your eyes, more closely matching how those things are experienced in real life. To show those elements in their full glory you only need a display capable of pushing high levels of brightness in a small area, or for a short period. A thousand nits in a ten percent white window, along with the ability to show a thousand nits across the entire display for a split second, are typical metrics that are fine for 99% of HDR content, and it's these metrics that the peak brightness figure you see in HDR monitor specs refers to. And like I said earlier, you don't even need that figure to be a thousand nits; 600 nits is fine instead.
If you want to look further into why high levels of sustained brightness are not required, you should look into a metric called average picture level, or APL, which describes the average brightness of a complex image. For a lot of video content and games the APL is actually pretty low relative to a full white image. It's only really in desktop usage, like browsing the web or editing documents, that you run into high APL content, and for that you really don't want a high level of brightness; trust me, viewing web pages at even just 400 nits on a monitor in indoor conditions is pretty painful.
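If you want a rough feel for how APL works out, here's a small Python sketch that simply averages a frame's luminance relative to full white. Formal APL measurements are usually defined on video signal levels and the exact definition varies between tools, so treat this, and the made-up frame in it, purely as an approximation.

```python
import numpy as np

def average_picture_level(frame_rgb):
    """Mean luminance of a frame as a fraction of full white.
    frame_rgb: (H, W, 3) array of linear RGB values in 0-1."""
    luminance = frame_rgb @ np.array([0.2126, 0.7152, 0.0722])  # Rec.709 weights
    return float(luminance.mean())

# a made-up 'HDR-ish' frame: dim background with one small bright highlight
frame = np.full((1080, 1920, 3), 0.05)   # mostly dark scene
frame[400:500, 900:1100, :] = 1.0        # small full-white patch (the highlight)
print(f"APL = {average_picture_level(frame):.1%}")   # roughly 6% of full white
```

A frame like that only asks the display for a few percent of full white on average, which is why full-screen sustained brightness rarely matters for real content.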
So for brightness there are a couple of things you need to look for. Don't worry if the monitor can't sustain high levels of brightness across the full screen; 300 nits or so is fine there. But to be on the safe side I'd recommend looking for 600 nits of peak brightness as a minimum, with a thousand nits providing a small improvement. If you are planning to use the monitor in a brighter, more well-lit environment, having higher levels of peak brightness like a thousand nits will benefit you more, but for most users 600 nits is fine. You might be wondering whether you'll get a good experience with a 400 nit monitor, meaning the DisplayHDR 400 spec. The answer there is maybe, but there are a few other issues with the DisplayHDR 400 spec that we'll get into in a moment that lead me towards recommending the higher DisplayHDR 600 spec as a minimum. At least in my experience, 600 nits is a noticeable jump from 400 nits and worth upgrading to, and I've found that monitors that can hit 600 nits are more likely to support other key HDR features that 400 nit monitors often do not.
the second of the three pillars contrast
I think most people on the same page
with this one one of the key benefits to
HDR is its ability to display bright
areas and dark areas on the screen at
the same time giving you that you know
stunning difference between the dark
shadows of a street with the bright
streetlights for example this range of
brightness is or dynamic range
higher than with SDI imagery and HDR
hence why it's called high dynamic range
as with most things higher contrast is
always better but the key thing to note
with HDR is that you really want a
contrast ratio that exceeds five
thousand to one for the best results
even ten thousand to one is really sort
of around that mark so there is a
significant difference between the
brightest brights and the darkest blacks
basically every monitor uses an LCD
panel and even the best LCD panels out
there using VA technology can't really
push that much above 3000 to 1 contrast
ratio so there abouts therefore to
produce a good HDR a display need to
support a technology called local
dimming what local dimming provides is
instead of getting the crystal from the
LCD panel to attenuate the amount of
light passing through as with standard
LCDs local dimming augments this by also
allowing sections of the backlight
itself to dim when you combine a
backlight they can dim in certain
sections with the inherent design of
LCDs you can achieve very high contrast
ratios ideally you'd want the backlight
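As a rough illustration of the arithmetic, with example numbers rather than measurements of any real panel: the black level with a fixed backlight is simply the peak divided by the panel's native contrast, and dimming the backlight behind a dark zone drops that black level further.

```python
# example numbers only: a 600-nit panel with 3,000:1 native LCD contrast
peak_nits = 600.0
native_contrast = 3000.0
zone_dim_level = 0.10                                      # dark zone drops backlight to 10%

black_no_dimming = peak_nits / native_contrast             # ~0.20 nits
black_with_dimming = black_no_dimming * zone_dim_level     # ~0.02 nits

print(peak_nits / black_no_dimming)      # 3000:1  -> backlight always at 100%
print(peak_nits / black_with_dimming)    # 30000:1 -> bright zone next to a dimmed zone
```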
Ideally you'd want the backlight to be dimmable on a per-pixel level; however, that's not currently possible outside of expensive industry reference monitors, so the next best thing is to have as many dimming zones as possible. You've probably heard of the new G-Sync HDR monitors supporting 384 FALD zones, or full array local dimming zones, and that's a good number for a monitor-sized display; you'll get a great HDR experience with that amount of backlight control. There's no exact number for how many zones you need for good HDR, aside from more always being better, but it's pretty clear that if you only have a few zones, say a single-digit amount like six, the HDR experience won't be great. Most displays that support local dimming will advertise it as a feature, so my advice is to look for local dimming support and then try to research the number of zones the backlight has. If that number is reasonable, at least upper double digits, you'll be on your way towards good HDR. It's also important to avoid monitors that only support edge-lit local dimming rather than full array, though most high zone count displays will already be full array dimmed. Unfortunately the DisplayHDR tiers don't have rigid specifications for the number of local dimming zones: while the 600 and 1,000 tiers must support local dimming in some form to meet their contrast ratio metrics, low zone count edge-lit dimming does qualify. Meanwhile, local dimming is not a requirement for DisplayHDR 400 at all, so be wary of that tier.
Color gamut is the third pillar of HDR. Again, I think most people understand what is required here: true HDR monitors should be able to display more colors than SDR monitors. HDR standards are future-proofed in this regard, in that they support way more colors than current display technologies can show, for example the ridiculously massive BT.2020 color space. But realistically, so long as the display can show a decent amount more colors, you'll notice the difference in color depth and vibrance compared to an SDR display. With today's monitor technologies in mind you should be looking for monitors that support at least 125 percent of the sRGB gamut used for SDR; usually that will mean upwards of 90 percent DCI-P3 coverage. Of course, larger is always better, so long as the display properly maps colors to what it can produce. Monitors that are validated to the DisplayHDR 600 tiers and above are guaranteed to support 90% of the DCI-P3 gamut or more, so you should again be looking for those badges. However, DisplayHDR 400 is a bit dodgy in that it doesn't stipulate a higher-than-sRGB color gamut, so be wary of that. Most monitors that use quantum dot technology will hit those higher gamuts too, so that's a good thing to look for.
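That correspondence between 125 percent of sRGB and roughly 90 percent of DCI-P3 falls out of the relative sizes of the two gamut triangles. Here's a quick Python sketch comparing their areas from the published primaries; note that different tools compute coverage in different color spaces (this one uses CIE 1976 u'v', a common choice), so the exact percentages you see quoted will vary a little.

```python
def uv_prime(x, y):
    """CIE 1931 xy chromaticity to CIE 1976 u'v' coordinates."""
    d = -2 * x + 12 * y + 3
    return 4 * x / d, 9 * y / d

def gamut_area(primaries_xy):
    """Area of the triangle spanned by R/G/B primaries in u'v' space."""
    (u1, v1), (u2, v2), (u3, v3) = [uv_prime(x, y) for x, y in primaries_xy]
    return abs(u1 * (v2 - v3) + u2 * (v3 - v1) + u3 * (v1 - v2)) / 2

srgb   = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B primaries
dci_p3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

print(f"DCI-P3 area is about {gamut_area(dci_p3) / gamut_area(srgb):.0%} of sRGB")  # ~126%
```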
One misconception I do hear quite often, though, relates to 10-bit color. 10-bit color is a requirement for HDR, so all monitors that support HDR will at least be able to accept 10-bit data input for processing. Essentially this gives the monitor greater color depth and a wider range of colors compared to 8-bit processing, which again improves image quality. However, what isn't required is a true 10-bit panel. That is a professional-grade feature only needed for people mastering content and doing other color-critical work. It's also an expensive technology to include, with most true 10-bit panels starting at well over $1,000 without any fancy features like high refresh rates or local dimming backlights; some don't even support HDR in the true sense that we've been talking about today. Most HDR monitor panels supporting 10-bit color will display those colors using a technique called frame rate control, or FRC, on top of an 8-bit panel. You'll find it very difficult to tell the difference in any real-world scenario between true 10-bit and 8-bit plus FRC; it's much harder to notice than on panels that use 6-bit plus FRC to achieve 8-bit color, for example. For gaming and viewing video content, 8-bit plus FRC is basically a non-issue, so don't worry about it.
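Conceptually, FRC is temporal dithering: the panel alternates between the two nearest 8-bit levels over successive frames so the time average lands on the intended 10-bit value. Real panels use more sophisticated spatial and temporal patterns, so treat this little Python sketch purely as an illustration of the idea.

```python
def frc_pattern(level_10bit, n_frames=4):
    """Approximate one 10-bit level on an 8-bit panel by alternating
    between the two nearest 8-bit values over n frames."""
    low = level_10bit // 4        # nearest 8-bit level at or below the target
    frac = level_10bit % 4        # how many quarter-steps above it we need
    return [low + 1 if i < frac else low for i in range(n_frames)]

frames = frc_pattern(514)         # a 10-bit value halfway between 8-bit 128 and 129
print(frames)                     # [129, 129, 128, 128] -> averages to 128.5
```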
A couple of other things before I close this one off. I often get asked about OLED monitors, as OLED is one of the best technologies for HDR TVs due to its outstanding contrast ratio and effective per-pixel dimming ability. While it'd be nice to have an OLED monitor for gaming and watching HDR videos, the truth is OLED is not really well suited to a monitor. OLED struggles with image retention, otherwise known as burn-in, so in a desktop environment items like the taskbar on Windows could quite easily burn in over time. This isn't as much of a problem with TVs, where most of the content is just video or games. There are other issues as well, like full white power consumption and different subpixel sizes, but the main one for monitors is that image retention issue.
I also want to make a brief mention of chroma subsampling, which is an issue with the current top-end 4K 144 Hz G-Sync HDR monitors. Chroma subsampling does reduce image quality, and it's especially noticeable during desktop usage, although during games or video playback it should be barely noticeable in most situations. However, chroma subsampling really isn't a feature that display manufacturers are including out of choice or to save money; rather, current DisplayPort and HDMI standards simply don't have enough bandwidth for non-subsampled 4K 144 Hz 10-bit content. So if you want that frame rate and resolution with HDR, it's unfortunately a necessary evil at the moment. I expect future monitors with newer DisplayPort standards won't need to use chroma subsampling, but for now it is what it is, and I recommend disabling subsampling and running at a lower refresh rate for desktop content. I would only consider using it if you really want to hit the top end of that refresh rate window.
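The bandwidth math behind that is easy to sanity-check with a back-of-envelope calculation. This sketch ignores blanking intervals and other protocol overhead, so real requirements are somewhat higher, and it compares against DisplayPort 1.4's roughly 25.92 Gbps of usable payload.

```python
def pixel_data_rate_gbps(h, v, hz, bits_per_channel, subsampling="4:4:4"):
    """Raw pixel data rate, ignoring blanking and protocol overhead.
    4:2:2 halves the two chroma channels, 4:2:0 quarters them."""
    channels = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    return h * v * hz * channels * bits_per_channel / 1e9

dp_1_4_payload = 25.92   # Gbps usable on DisplayPort 1.4 HBR3 after 8b/10b coding

print(pixel_data_rate_gbps(3840, 2160, 144, 10, "4:4:4"))   # ~35.8 Gbps: doesn't fit
print(pixel_data_rate_gbps(3840, 2160, 144, 10, "4:2:2"))   # ~23.9 Gbps: fits
print(pixel_data_rate_gbps(3840, 2160, 98, 10, "4:4:4"))    # ~24.4 Gbps: a lower refresh rate also fits
```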
So, to summarize, I've made a checklist for HDR monitors that I'll be using for future HDR monitor reviews, and I think buyers should also look at it to ensure they're getting a respectable HDR-capable display. It can be pretty easy to get fooled by HDR branding on monitors that don't truly deliver a good HDR experience, so this checklist will help you sort the crap from the gems. The first thing is that it needs to support at least 600 nits of peak brightness and 300 nits of sustained brightness; higher brightness is marginally better, and high sustained brightness is not required. It needs to support full array local dimming with at least a high double-digit number of zones; I don't think edge-lit dimming is good enough, and of course this does not apply to OLEDs. It needs the ability to reproduce 125 percent or more of the sRGB color space, i.e. 90 percent or more of the DCI-P3 color space. And of course it needs to support 10-bit processing and at least an 8-bit plus FRC panel; true 10-bit is not required. In addition to this, considering HDR is a high-end feature, you really want typical high-end monitor performance: good pixel response times, low input latency, which can be an issue with the extra steps needed for HDR processing, and good uniformity. A bonus would be things like a high resolution and refresh rate. If you're wondering how many monitors at the moment hit every key criterion in my checklist, it's not many; I think it could be just the latest G-Sync HDR monitors and that's it, and those have several other issues I'll talk about in the full review when I get around to doing it. HDR is definitely still in an early adopter stage, particularly for PC monitors, and I really hope people don't go out buying early models thinking they're getting features that actually aren't included. That's it for this one, a bit of a long one, but I hope those of you interested in HDR tech stuck with me the whole way through. You'll see my review of the ASUS PG27UQ shortly; that's currently being tested, so stay tuned for that. Subscribe for more in-depth monitor testing, give this video a like if you enjoyed it, and I'll catch you in the next one.