Is 4K gaming actually dumb? Full disclosure: I don't play my PC games in 4K; I play them in 1440p. We'll talk about why here in a second, but I wanted to make this video in direct response to one of Linus Tech Tips' videos that was uploaded back in November of 2018 and received a lot of mixed feedback, due in large part to its conclusion that 4K gaming was, in fact, dumb. I mean, the title kind of says it all. So here is our own testing and our own response.
Privacy.com is the easy way to shop securely online by creating virtual debit cards tied securely to your bank account. You can even download the browser extension and let it autofill card information with a single click. Get started for free and earn a five-dollar credit by clicking the link below today.
So, does 4K gaming make sense? Linus starts out by mentioning the level of detail the human eye can discern, and that's roughly one arcminute, or one sixtieth of a degree; at least, that's the general consensus. And there's an equation for this: just plug and chug a bunch of simple variables and, boom, out comes your result. So the closest you could sit, in theory, to a 24-inch 1080p panel without seeing individual pixels is about 3 feet, or 91 centimeters. You can test this yourself; that metric's pretty spot-on for me. I've linked the website down below, by the way, in case you want to play around with it a bit.
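If you'd rather not click through, the underlying math is simple enough to sketch in a few lines of Python. The helper name here is my own, and it assumes the usual one-arcminute acuity figure and a 16:9 panel:

```python
import math

def min_viewing_distance_in(diag_in, h_px, v_px, acuity_arcmin=1.0):
    """Closest distance (in inches) at which one pixel still subtends
    `acuity_arcmin` arcminutes -- i.e., where individual pixels just
    stop being resolvable to a typical eye."""
    aspect = h_px / v_px
    # Physical panel width from its diagonal and aspect ratio.
    width_in = diag_in * aspect / math.sqrt(1 + aspect ** 2)
    pixel_pitch_in = width_in / h_px
    # Distance at which one pixel pitch subtends the given angle.
    return pixel_pitch_in / math.tan(math.radians(acuity_arcmin / 60))

# 24-inch 1080p vs. 24-inch 4K:
print(min_viewing_distance_in(24, 1920, 1080) / 12)  # ~3.1 feet
print(min_viewing_distance_in(24, 3840, 2160) / 12)  # ~1.6 feet
```

That puts the 24-inch 1080p figure right around the 3-foot mark; the 4K number comes out a touch over the quoted 1.4 feet, which just reflects whichever exact acuity value the calculator site plugs in.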
Now let's bump the resolution to 4K and keep screen size the same. (By the way, 4K is 3840 by 2160 in this case.) The closest you can now sit is roughly 1.4 feet, again pretty spot-on for me. But there's a problem with these figures: you see, they don't take into account other perceived aspects of a quality image, things like sharpness and contrast, and those do play a huge role.
So at this point, Linus jumps into anti-aliasing (AA). This technology takes neighboring pixels and kind of blends them together, and it can make a display seem like it's higher resolution than it actually is.
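To make "blending neighboring pixels" concrete, here's a toy supersampling example in Python. `downsample_2x` is my own helper, and this is not how any particular in-game AA mode is implemented internally; it just shows the core averaging idea:

```python
def downsample_2x(hi_res):
    """Average each 2x2 block of a high-res image down to one output
    pixel. Hard on/off edges become intermediate shades, which is the
    blending that makes jagged edges look smoother."""
    return [[(hi_res[y][x] + hi_res[y][x + 1] +
              hi_res[y + 1][x] + hi_res[y + 1][x + 1]) / 4
             for x in range(0, len(hi_res[0]), 2)]
            for y in range(0, len(hi_res), 2)]

# A diagonal edge rendered at 4x4 (1 = lit, 0 = dark), shrunk to 2x2:
edge = [[1, 0, 0, 0],
        [1, 1, 0, 0],
        [1, 1, 1, 0],
        [1, 1, 1, 1]]
print(downsample_2x(edge))  # [[0.75, 0.0], [1.0, 0.75]]
```

The 0.75 values are the partial shades along the edge: instead of a hard stair-step between 1 and 0, the output ramps through in-between brightness levels.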
We've got a dedicated video on what that is and how it works right here, but in a nutshell, AA works by blurring edges and smoothing out rough, angled, or curved surfaces depicted by square pixels. Check out these side-by-sides of the GTA benchmark in 1080p: the left has absolutely no AA at all, while the right has AA maxed out at 8x. Both runs are locked to 60 Hertz, by the way. See a difference? So AA really matters when pixels are noticeable, and in theory, if we had an infinite number of them packed into a display, AA would be redundant, since all images would appear perfectly shaped. The problem is we use square pixels, and when those pixels are perceived as very large chunks of light, that's when you start noticing rough edges. To test this, we grabbed three gaming monitors with the major sixteen-by-nine resolutions. With each of them locked to 60 Hertz and with anti-aliasing turned on in our games, we want to see if our test subjects can tell the difference.

Linus's argument is that anti-aliasing makes a big enough difference in games to justify a lower resolution, and he proves this by sampling the opinions of three other members of his team. They could, for the most part, discern the 4K panel from its lower-res counterparts even from three or so feet away, which technically violates the equation we referenced earlier; that's because perceived sharpness is still a thing. But Linus at this point shifts the argument entirely to refresh rate, stating that all three panels were around $500 and that the 1080p and 1440p units packed 240 and 144 Hertz refresh rates, respectively: "I've been a longtime believer that refresh rate makes a way bigger difference to the gaming experience than resolution, so we're gonna get Doom running on all three of these and see what people prefer to play on in a fast-paced shooter."

So there was sort of a bait-and-switch here: the visual-acuity metric he mentioned at the beginning of the video, the basis of his argument, is kind of out the door once we're talking about refresh rates and not resolution. Those two have nothing really to do with each other, although they are nice to have both in the same package, and I think that was Linus's intention all along. I think we can agree that people have different levels of acceptable visual clarity, and just because some equation says things have to be a certain way, that doesn't actually mean they work out that way in real life. Ask anyone who's used a 4K panel, and they will tell you that going back down to a 1080p panel is no bueno. I mean, even if that person sits 3 feet away, at which point you technically couldn't even tell the difference between 1080p and 1440p,
that same person will notice when the panel's resolution changes, especially when the image displayed is static. But Linus is right about gaming: the fast-paced nature of most scenes is enough to draw the human eye more to the smoothness aspect than to the resolution. That was the point of his video, after all: 4K and gaming, and whether or not those two were viable. Now, refresh rates above 60 Hertz aren't possible to depict on YouTube. I could, in theory, slow down clips to the point where a 240 FPS clip still looks great at 25% playback speed (that's 60 FPS), compared to a 60 FPS clip at 25% speed (15 FPS), but this would be an unrealistic comparison, so this isn't an argument
I'll defend on this channel; high-refresh-rate gamers already know exactly what I'm talking about. So instead, I'll stick with the visual argument and try to defend it as best I can.

Here's what I did: I stuck with GTA 5 to keep things consistent and ran two different passes. First up was 4K at medium settings for the most part, with no anti-aliasing and no advanced graphics; it's what you're seeing here. Things look pretty great, but the in-game detail had to be dropped in order to maintain an acceptable frame rate. By the way, the computer running these tests is sporting an RTX 2070 and a Core i9-9900K, so it's definitely no lightweight. My second test, however, was run in 1440p. At this point I could turn on just a bit of AA and bump the remaining settings across the board. In-game detail was definitely improved, but again at a lower resolution overall. Side by side, you've got more pixels on the right, and thus more visual detail (ignoring the fact that this entire video was exported in 4K), but you've got more in-game detail on the left. So my thinking is this: why bump your gaming resolution to 4K if all you're gonna be doing in the long run is either (a) significantly lowering your in-game frame rate or (b) lowering in-game settings to compensate? At which point, why increase the resolution at all, if all you're gonna be doing is adding more detail to a game that suddenly looks crappier?

Now, keep in mind that if you're viewing this video at a resolution much lower than 1080p, let's say 360p, you probably can't play along here. But if you're viewing this in 1080p, and especially in 4K, I'd now like you to pick the side of the screen that was running the benchmark natively in 4K. Can you tell which is which? Even if your screen only supports 1080p, bump the video to 4K just for this test, to remove any issues with image quality and bitrate. Can you tell a difference? This is the 4K screenshot, and this is the 1440p screenshot. Different angle here; actually, all in-game settings are identical.
Everything's maxed out across the board except anti-aliasing: on the 4K side, no AA is on, but on the 1440p side, 4x anti-aliasing is on. So back to Linus's point: anti-aliasing does make a noticeable difference, even in game. It doesn't make up the difference entirely, but it definitely narrows the gap, almost to the point where a 4K panel, presumably locked at 60 Hertz, is, to use his word, a bit dumb.

Now, I'd like to close with a bit of talk on just how taxing 4K can be on your hardware. Using a pixels-per-inch calculator, we can determine just how many pixels fit into an inch of screen real estate.
So, if our 4K panel is 27 diagonal inches with a 16-by-9 aspect ratio, then the screen has a PPI of 163. But in all honesty, that means nothing unless we compare it with other resolutions, so let's do 1080p.
Now our PPI has dropped to 81.6, and that makes sense: 4K is literally four 1080p panels stacked together in a two-by-two grid. And by the way, for the sticklers out there, we're talking about 3840 by 2160 again when we reference 4K, not cinematic 4K at 4096 pixels across. Just thought I'd clear that up, because there are always a few wise guys in the comments ready to ruin everything.
So when we double the horizontal and vertical pixel counts and condense them into the same space, PPI doubles. That makes sense, right?
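The PPI math itself is a one-liner if you want to check these numbers yourself (the function name is my own):

```python
import math

def ppi(diag_in, h_px, v_px):
    """Pixels per inch: the pixel-grid diagonal divided by the
    physical diagonal of the screen."""
    return math.hypot(h_px, v_px) / diag_in

print(round(ppi(27, 3840, 2160), 1))  # 163.2 -- the 4K figure above
print(round(ppi(27, 1920, 1080), 1))  # 81.6  -- the 1080p figure
print(round(ppi(27, 2560, 1440), 1))  # 108.8 -- 1440p lands in between
```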
But even 1440p is a large step down from 4K. If you zoomed in on individual pixels, this is what 1440p would look like, and this is what 4K would look like: two very different pixel densities, and two very different loads for your graphics card to process. 4K definitely looks great in person, but when gaming, I'd take the higher refresh rate any day of the week and still bump in-game settings for a better overall experience.
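To put numbers on those "very different loads": raw pixel throughput is just width times height times refresh rate. A quick sketch (the resolution/refresh pairings match the monitors discussed earlier; the comparison itself is mine):

```python
# Standard 16:9 resolutions by their common names.
RES = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

def pixels_per_second(name, hz):
    """Raw pixels the GPU and display link must move each second."""
    w, h = RES[name]
    return w * h * hz

print(f"{pixels_per_second('4K', 60):,}")      # 497,664,000
print(f"{pixels_per_second('1440p', 144):,}")  # 530,841,600
print(f"{pixels_per_second('1080p', 240):,}")  # 497,664,000
```

Notice that 4K at 60 Hz moves exactly as many pixels per second as 1080p at 240 Hz (four times the pixels at a quarter of the rate), while 1440p at 144 Hz actually pushes slightly more. The difference is where those pixels go: spatial detail versus motion smoothness.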
Subjective? Yeah, to an extent. But no one can deny the fact that in order to maintain the same frame rate, you have to lower in-game settings when stepping up to 4K. Why add more pixels to an image that suddenly looks worse? I mean, you could probably make the argument that if you were running, like, a Ryzen 3 1200 with an RTX 2080 Ti, maybe at that point bumping the resolution would actually give you a better frame rate; I'm not too sure, not having tested that. But in most cases, if you have a balanced or semi-balanced system, any time you up the resolution, whether from 1080p to 1440p or from 1440p to 4K, your frame rate will drop as a result. And the problem is that you can't have both: every display interface has a limit to how many pixels it can push per second, which is why, up until very recently, every 4K monitor has been capped at 60 Hertz, making for an inferior gaming experience outside of some edge cases. So in my view, it doesn't make sense to add more pixels to an image that suddenly looks worse, assuming you're gonna drop in-game settings. When it gets to the point that you really need to sacrifice detail for the sake of resolution, I think you've gone too far. Case in point: here's a 4K screenshot with tessellation at medium (that's right, medium settings). It's okay.
Here's the same screenshot, though, in 1440p, with tessellation now set to high. It just seems a bit ironic and counterintuitive to increase screen detail only to see worse quality in-game, a bit like adding more gross food to your plate at a buffet.
Wouldn't you rather cut back a bit and enjoy something tasty? That's 1440p gaming for me: I may not have the same screen detail as a 4K gamer, but I can still enjoy higher in-game detail and a higher refresh rate for a more balanced workload. What do you think? Let me know what resolution and refresh rate you game at, and be sure to recommend future topics as well in the comment section below. If you liked this video, give it a thumbs up, or hit dislike if that's how you're feeling, or if you hate everything about life. Click that red subscribe button if you haven't already, and even become a member to be fancy with it. Stay tuned for the next video. This is Science Studio. Thanks for learning with us.