Why Benchmarking CPUs with the GTX 1080 Ti is NOT STUPID!
2018-05-30
Welcome back to Hardware Unboxed. Today we're going to discuss a topic that's often raised when we do our big benchmark videos. As many of you know, we do a lot of CPU-related gaming benchmarks on our channel, and we often try to work out which CPU at a given price point will offer you the most bang for your buck, now and hopefully in the future.

Not that long ago I compared the very evenly matched Core i5-8400 and Ryzen 5 2600. Overall the 2600 was faster once fine-tuned, but it ended up costing more per frame, making the 8400 the cheaper and more practical option for most gamers.
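The cost-per-frame metric itself is just platform price divided by average frame rate. Here's a minimal Python sketch of the idea; the prices and frame rates below are hypothetical placeholders, not the numbers from that comparison:

```python
def cost_per_frame(platform_cost_usd: float, avg_fps: float) -> float:
    """Dollars spent per frame of average performance (lower is better)."""
    return platform_cost_usd / avg_fps

# Hypothetical example values only, not the review's actual data.
print(f"Core i5-8400: ${cost_per_frame(180, 95):.2f} per frame")
print(f"Ryzen 5 2600: ${cost_per_frame(200, 100):.2f} per frame")
```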
For that particular matchup I compared the two CPUs in 36 games at three resolutions, and because we want to use the maximum in-game quality settings to apply as much load as possible, the GeForce GTX 1080 Ti is always our weapon of choice. This helps to minimize GPU bottlenecks that can hide potential weaknesses when analyzing CPU performance.

The problem we've found is that quite a few viewers seem to get confused about why we do this and, I suspect without thinking it through fully, take to the comment section to bash the content for being misleading and unrealistic. Since this is something we're starting to see a bit of, I thought I'd try and address the topic and hopefully explain a little better why we test all CPUs with the most powerful gaming GPU available at the time.

When testing a new CPU we have two main goals in mind: firstly, to work out how it performs right now, and secondly, how future-proof it is. Will it still be serving you well in a year or so's time, for example? To do this accurately, it all really comes down to removing the GPU bottleneck. We don't want the graphics card to be the performance-limiting component when we're measuring CPU performance, and there are a number of reasons why this is important that I'll touch on in this video. Let's start by talking about why testing with a high-end GPU isn't misleading and unrealistic.
Yes, it's true that right now in 2018 it's very unlikely anyone is going to pair a GeForce GTX 1080 Ti with a sub-$200 US processor. However, when we pour dozens and dozens of hours into benchmarking these components, we aim to cover as many bases as we possibly can to give you the best buying advice possible. Obviously we can only test with the games and hardware that are available at the time, and this makes it a little more difficult to predict how components like the CPU will behave in yet-to-be-released games using more modern graphics cards, say a year or so down the track. Assuming you don't upgrade your CPU every time you buy a new graphics card, it's important to determine how the CPU really performs and compares with competing products when we're not GPU limited.

This is particularly important because while today you might pair a low-end CPU such as the Pentium G5400 with a GeForce GTX 750 Ti, in a year's time you might have a graphics card packing twice as much processing power, and in two to three years, who knows. As another example, if we compared the Pentium G5400 to the Core i5-8400 using a GeForce GTX 750 Ti, we would determine that in today's latest and greatest games the Core i5 provides no real performance benefit. That means in a year or two, when you upgrade to something offering performance equivalent to, say, a GeForce GTX 1080, you're going to wonder why your GPU utilization is only hovering around 60% and why you're not seeing anywhere near the performance you should be.
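That low GPU utilization figure is the classic tell-tale. As a rough illustration (the log format and thresholds here are my own assumptions, not a tool we actually use), you could flag a likely CPU bottleneck from per-second GPU utilization samples like this:

```python
def looks_cpu_bound(gpu_util_samples, busy_threshold=0.90):
    """Heuristic: if the GPU rarely reaches near-full utilization
    during gameplay, something upstream (usually the CPU) is
    failing to feed it work fast enough."""
    busy = sum(1 for u in gpu_util_samples if u >= busy_threshold)
    return busy / len(gpu_util_samples) < 0.5

# Hypothetical per-second utilization log (fraction of full load),
# e.g. exported from a monitoring tool during a benchmark run.
samples = [0.62, 0.58, 0.65, 0.61, 0.59, 0.63]
print("Likely CPU bound:", looks_cpu_bound(samples))  # True
```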
Here's another example we can use from a previous video. For the Pentium G4560 release I created a GPU scaling test back in early 2017. There we saw that with a GTX 1050 Ti, the Core i7-6700K is no faster than the Pentium processor. Then, when using a GTX 1060, the Core i7 was on average 26% faster, so the G4560 had already created a system bottleneck, but we only knew this because we tested with a higher-end GPU. With the GTX 1080 Ti, the Core i7-6700K is almost 90% faster than the G4560, and we're talking about a GPU here that by this time next year could be delivering mid-range-like performance, much like what we see when comparing the GTX 980 and GTX 1060, for example.

Now, in the example just given you might say, well, the G4560 was priced at just $64 US while the 6700K cost $340 US, so of course the Core i7 is going to be miles faster. I don't disagree, but this is an 18-month-old example, and we can see that the 6700K had significantly more headroom back then, something we wouldn't have known had we tested with only a GTX 750 Ti, or even the GTX 1060.
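As an aside, the "percent faster" figures quoted here follow the usual relative-performance arithmetic. A quick sketch with made-up frame rates standing in for the chart data:

```python
def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster result A is relative to result B, in percent."""
    return (fps_a / fps_b - 1) * 100

# Hypothetical frame rates only, not the 2017 chart figures.
print(f"{percent_faster(88, 70):.0f}% faster")   # ~26%, a GTX 1060-sized gap
print(f"{percent_faster(133, 70):.0f}% faster")  # ~90%, a GTX 1080 Ti-sized gap
```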
You could also argue that even today, at an extreme resolution like 4K, there would be little to no difference between the G4560 and the 6700K. That might be true for some titles, but it won't be for others, Battlefield 1 multiplayer as an example, and it certainly won't be true in a year or two as games become even more CPU demanding.
Additionally, don't fall into the trap of assuming everyone uses ultra quality settings or targets just 60 fps. There are plenty of gamers using a mid-range GPU who opt for medium to high, and sometimes even low settings, to push frame rates well over 100 fps, and these aren't just gamers with high refresh rate 144 Hz displays. Despite popular belief, there is a serious advantage to be had in fast-paced shooters by going well beyond 60 fps, even on a 60 Hz display, but that's a discussion for another video; maybe you guys can get Tim to tackle that one.
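One bit of supporting arithmetic for that claim (my own illustration, not from the video): higher frame rates shrink the frame time, so even a 60 Hz panel ends up displaying a more recently rendered frame, cutting input-to-photon latency.

```python
def frame_time_ms(fps: float) -> float:
    """Time each rendered frame represents, in milliseconds."""
    return 1000.0 / fps

# Even on a 60 Hz display, rendering at 240 fps means the frame
# shown is at most ~4 ms old instead of ~17 ms old.
for fps in (60, 144, 240):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
```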
Anyway, getting back to the Kaby Lake dual-core for a moment: swapping out the $64 US processor for something higher-end isn't really a big deal, which is why we gave the ultra-affordable G4560 a rave review in early 2017. But if we're comparing much more expensive processors, such as the Core i5-7600K and Ryzen 5 1600X for example, it's very important to test without GPU limitations.
Jumping back to the recent Core i5-8400 vs. Ryzen 5 2600 benchmark comparison featuring three tested resolutions, let's take a quick look at the Mass Effect Andromeda results. I have to say, those performance trends look quite similar to the previous graph, don't they? You could almost rename 720p to GTX 1080, 1080p to GTX 1060, and 1440p to GTX 1050 Ti.

Since so many of you suggested that those two sub-$200 US CPUs should have been tested with a GPU packing a sub-$300 US MSRP, let's see what that would have looked like at our three tested resolutions. We know that the GTX 1060 has 64 percent fewer cores than the GTX 1080 Ti, and in Mass Effect Andromeda that leads to around 55% fewer frames at 1080p and 1440p using a Core i7-7700K clocked at 5 GHz.
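For reference, that core deficit comes straight from the published CUDA core counts, 3,584 on the GTX 1080 Ti versus 1,280 on the GTX 1060 6GB:

```python
gtx_1080_ti_cores = 3584  # published CUDA core count
gtx_1060_cores = 1280     # GTX 1060 6GB

deficit = (1 - gtx_1060_cores / gtx_1080_ti_cores) * 100
print(f"The GTX 1060 has {deficit:.0f}% fewer cores")  # ~64%
```

Clock speeds and memory bandwidth differ between the two cards as well, which is why the frame rate deficit (around 55%) doesn't track the core deficit exactly.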
We see exactly that in these graphs from my 35-game benchmark comparing the Vega 56 and GeForce GTX 1070 Ti graphics cards, which was conducted last year. The GTX 1060 spat out 61 fps on average at 1080p and just 40 fps at 1440p. So here's where the GTX 1060 is situated on our graph in relation to the GTX 1080 Ti: the first red line indicates the 1% low result and the second red line the average frame rate.
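For anyone unfamiliar with the 1% low metric, it summarizes worst-case smoothness rather than raw throughput. Here's a minimal sketch of one common way to derive it from captured frame times; exact methodology varies between capture tools:

```python
def one_percent_low(frame_times_ms):
    """Average frame rate of the slowest 1% of frames.
    One common definition; capture tools differ in the details."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)       # slowest 1% of frames
    return 1000.0 * n / sum(worst[:n])  # ms -> fps

def average_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# Hypothetical capture: mostly smooth 16.7 ms frames with a few
# 33.3 ms stutters mixed in.
times = [16.7] * 990 + [33.3] * 10
print(f"Average: {average_fps(times):.0f} fps")      # ~59 fps
print(f"1% low:  {one_percent_low(times):.0f} fps")  # ~30 fps
```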
As you can see, even at 720p we are massively GPU bound here. So had I only tested with the GTX 1060, or possibly even the 1070, all the results would have shown us is that both CPUs can max out those particular GPUs in these modern titles, even at extremely low resolutions. In fact, you could throw a Core i3-8100 and Ryzen 3 2200G into the mix, and the results would have led us to believe that neither CPU is inferior to the Core i5-8400 and Ryzen 5 2600 when it comes to modern gaming. Of course, there will be the odd, extremely CPU-intensive title that shows a small dip in performance for those lower-end CPUs, but the true difference would be masked by the weaker GPU performance.
I know some people believe that reviewers test with extreme high-end GPUs in an effort to make the results more entertaining, but come on guys, that one's actually a bit too silly to entertain. As I've said, the true intention is to determine which product will serve you best in the long run, not to keep you on the edge of your seat for an extreme benchmark battle to the death.

As for providing more real-world results by testing with a lower-end GPU, I'd say that unless we tested with a range of GPUs at a range of resolutions and quality settings, you're not really going to see the kind of real-world results many of you claim testing with a mid-range GPU would deliver. Given the enormous and unrealistic undertaking that kind of testing would be for any more than a few select games, the best option is to test with a high-end GPU, and if you can, do so at two or three resolutions, as this mimics GPU scaling performance anyway.

Don't get me wrong, testing with lower-end graphics cards is not a dumb suggestion; it's just a different kind of test. I also feel that those suggesting it are doing so from somewhat of a narrower viewpoint. For example, playing Mass Effect Andromeda on the GTX 1060 but using medium quality settings will see the same kind of frame rate you'll get with the GTX 1080 Ti using ultra quality settings. So don't make the mistake of assuming that everyone is going to play under the same conditions as you. Some people use ultra quality settings and are happy with lower frame rates, while others will go all the way down to medium because they want to see high frame rates. Given that gamers have a wide and varying range of requirements, we do our best to use a method that covers as many bases as possible, whereas GPU-limited testing tells us little to nothing, and that's something we try to avoid when testing CPU performance.
And that is going to do it for this one. If you did enjoy the video, be sure to hit the like button and subscribe for more content. If you appreciate the work we do here at Hardware Unboxed, then consider supporting us on Patreon. Thanks for watching, I'm your host Steve, and I'll see you next time.