Let's talk a little more about the Nvidia launch. Every product launch sparks controversy, and I've found myself in the crosshairs of more than one faction in the past. I'm not afraid to put myself in the middle of some huge debate; everyone remembers this video right here, right? So in short, I'm not afraid to speak my mind, and I'm not afraid to call BS when I see it. When something smells fishy, I'll tell you it smells fishy, even if it hurts your feelings. I'm not in the business of keeping people happy. It's how I've approached every product review on the channel to date, and nothing about that will ever change here. But what I find the most interesting about the Nvidia launch is not how the company actually presented or launched the product (I thought it was pretty weird, and I definitely called it out right when I saw it), but rather how flip-flopped the defense was this time around. I was attacked by a totally different fanbase last year and defended by many of those who were actually pretty quick to attack me this time around over my thoughts regarding Nvidia. Funny how things work.
But a vast majority of you actually agreed the Nvidia launch was fishy, and this is actually pretty good news, at least from my perspective. Not because you agreed with me; I mean, my opinion could be totally wrong here, and then I'll eat crow, and that's fine, I'll be happy to do that. But you guys recognized a change. You recognized a shift in the forwardness of the Pascal and Maxwell presentations and how it wasn't so this time around. I was amped about the Pascal launch; I made so many videos regarding architecture and leaked performance evaluations, and it was pretty easy: design changes on a core level were pretty predictable. But this time it's totally different, and that's what I want to clear up today.
If you're still looking for an all-around excellent headset, the Sennheiser PC 37X has you covered: an excellent microphone, incredible sound as always from Sennheiser products, and a price that won't break the bank. Check the link in the video description for more details.
Turing graphics cards are packed dense with features, most of which we've never seen before. Terms like "ray tracing" and "tensor core" were thrown out quite a bit during the Gamescom keynote to a group of mostly gamers (albeit a lot of reviewers and, you know, editors were there as well), but the layman has no idea what any of these terms mean, and that's why these videos right here exist, because I had to learn them just like you did. I mean, a lot of these terms Nvidia is either recycling or completely making up, and we kind of just have to roll with it. Like "78 RTX-OPS": that tells us absolutely nothing about the product, because that metric was literally made up by Nvidia for the launch. So what about other terms? Can we throw those around?
How about floating-point performance? TFLOPS represent the number of floating-point operations, essentially basic math problems, a given GPU can solve per second. GPUs are pretty good at these because they can parallelize like a boss; they have thousands of cores, typically. It's a raw number and doesn't translate seamlessly to FPS or any layman metric, but it gives us an idea of theoretical processing limits. Think of it like a car's horsepower figure: it doesn't always translate linearly to zero-to-sixty times, but people understand zero-to-sixty times, and FPS is kind of the same way; you can visualize those things. So that's what I'm trying to do here. Sure, having a 700-horsepower beast is pretty awesome, but several other factors should be considered if you want to accurately determine acceleration from zero: weight, tires, drivetrain. These correspond figuratively to memory bandwidth, clock speed, API optimization, drivers, and whatever else you can throw into this mix. These aren't cut-and-dry measurements, and that's why a lot of us steer clear of floating-point performance when gauging actual FPS performance, or whatever you want to call it. However, to paint a broad picture here, we'll compare the 1080 Ti's potential with that of the 2080 Ti, the card with which we'll spend the most time in this video. Nvidia specs the 2080 Ti at roughly 13.4 TFLOPS; the 1080 Ti's was just over 11. Now, to prove an earlier point, Vega 64 sported a TFLOP value of roughly 12.7, placing it just over the 1080 Ti on a raw scale, though we all know how frame rates played out for the red team.
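If you're curious where those TFLOP numbers come from, here's a minimal sketch. The formula itself (two floating-point operations per core per clock, thanks to fused multiply-add) is standard; the reference boost clocks I've plugged in are my assumption, so check your own card's specs.

```python
# Theoretical FP32 throughput: cores x boost clock (GHz) x 2 ops per clock (FMA).
def tflops(cores: int, boost_ghz: float) -> float:
    return 2 * cores * boost_ghz / 1000  # 2 * cores * GHz = GFLOPS; /1000 -> TFLOPS

# Core counts with assumed reference boost clocks:
print(f"RTX 2080 Ti: {tflops(4352, 1.545):.1f} TFLOPS")  # ~13.4
print(f"GTX 1080 Ti: {tflops(3584, 1.582):.1f} TFLOPS")  # ~11.3
print(f"Vega 64:     {tflops(4096, 1.546):.1f} TFLOPS")  # ~12.7
```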
In this regard, several other variables played a huge role, including optimization (that was a huge one), driver support, and other things that were still being worked on. The 1080 Ti was everything that the 4K gamer wanted in a top-tier graphics card. At roughly 600 to 700 USD, it absolutely crushed everything in its path and became, in my opinion, the first true 4K-capable graphics card, handling a blistering 8.3 million pixels every 17 or so milliseconds in modern triple-A titles with liberal presets. What more could we ask for in a single card at that price?
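That pixel math, by the way, is just 4K resolution at roughly 60 FPS; nothing assumed beyond that:

```python
pixels = 3840 * 2160       # 4K UHD: 8,294,400 pixels, i.e. roughly 8.3 million
frame_time_ms = 1000 / 60  # ~16.7 ms per frame at 60 FPS, "every 17 or so ms"
print(pixels, round(frame_time_ms, 1))
```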
Many still question whether an upgrade from a 1080 Ti is justified given the vagueness of the latest keynote, and I can't blame you one bit. In short, if you're already satisfied with your current performance, game settings, and gaming resolution, upgrading makes zero sense: more money for likely marginal gains and cool-looking shadows, right? Yeah, doesn't make sense in my book. But there's more to this that many of you haven't thought about, and that honestly I haven't either.
The 2080 Ti has packed 4,352 CUDA cores, 18.6 billion transistors, and discrete tensor and RT cores into an enormous die. Memory bandwidth is blazing fast, and TDP over its 1080 Ti predecessor jumped by a mere 10 watts. These margins aren't incredible when seen in the context of Pascal's improvements, but the inclusion of tensor and RT cores, of which we know basically nothing at this point since Nvidia's kept them on the DL, leaves the end result kind of unknown. But sources similar to those responsible for accurate RTX leaks have apparently released the Turing TU102 block diagram, of which the 2080 Ti will be a derivative. Comparing this to the GP102's, we can see the vast disparity in SMs.
This time around, we're seeing more streaming multiprocessors (SMs) with fewer CUDA cores in each. The 2080 Ti, as the cut-down sibling, will boast 68 SMs, with an additional 544 tensor cores and 68 RT cores. Nvidia has already publicized the CUDA core count of the 2080 Ti, so this math is pretty straightforward.
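Spelling that math out (a quick sketch; the per-SM breakdown of 64 CUDA cores, 8 tensor cores, and 1 RT core comes from the leaked TU102 diagram, so treat it as an assumption until Nvidia publishes a whitepaper):

```python
sms = 68                 # SM count for the cut-down TU102 in the 2080 Ti
cuda_cores   = sms * 64  # 4352, matching Nvidia's published count
tensor_cores = sms * 8   # 544
rt_cores     = sms * 1   # 68
print(cuda_cores, tensor_cores, rt_cores)
```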
Another leaked slide reveals relative shader performance, this time with an actually labeled y-axis, and that's what I want to discuss next. Nvidia's ridiculous slides released earlier this week: can someone translate these for me? I mean, no y-axis label, no in-game specs detailed, and a title that seems to double down on this claim. Take all of this with a barrel full of salt, folks. Ignoring the deep learning super sampling (DLSS) benchmark, which the 1080 Ti clearly can't handle well thanks to its lack of tensor and RT cores (it would be an unfair comparison), we see performance hone in at around 1.5x. But we have absolutely no idea what 1.5x means. Shader performance, as with the leaked slide shown earlier? I mean, even this metric is pretty vague, with Nvidia steering clear of FPS here, save this one published slide. Now, I have my own issues with the way in which this one was presented, but I'll take what I can at face value and elaborate as much as I can given what we have here. Also note this graphics card is the RTX 2080, not the 2080 Ti.
4K HDR in general cripples a 1080 Ti across the board by roughly 10%, so expect a 10% FPS cut with HDR enabled versus disabled. Seriously, any benefit Nvidia can give these new cards, they'll give. So I decided to compare these with my own 1080 Ti benchmarks. I used an i5-8600K from Intel running at 4.8 GHz (nothing overkill at all here), 16 GB of DDR4, and a 2 TB hard drive. This is a typical gamer setup, so page filing, mild overclocking, and modest memory timings will all play a role. I'm also assuming these titles were all tested and averaged at their ultra presets, since I achieved higher FPS than these here when 4K settings were dropped, and I seriously doubt that these new cards perform worse in modern games, even given premature drivers. So: F1 2017 at the ultra preset, Battlefield 1 at ultra, Hitman at ultra, you get the point.
I tested four titles to get the point across, and I got these results here. If we factor in an HDR cut of roughly 10% (which I'm sure you could argue against given the circumstances, but I'm giving the benefit of the doubt here), you can see that things aren't really that far apart. In fact, given last year's trends, we'd expect our 1080 Ti to perform on par with a 2070, not a 2080, and this may still be the case.
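For transparency, here's the back-of-the-napkin adjustment I'm applying (a sketch; the FPS values below are placeholders, not my recorded results, so swap in your own numbers):

```python
# Apply the assumed ~10% HDR penalty to SDR 1080 Ti results so they can be
# compared against Nvidia's 4K HDR figures on roughly equal footing.
HDR_PENALTY = 0.10

def hdr_adjusted(sdr_fps: float) -> float:
    return sdr_fps * (1 - HDR_PENALTY)

sdr_results = {"F1 2017": 80.0, "Battlefield 1": 75.0, "Hitman": 70.0}  # placeholders
for title, fps in sdr_results.items():
    print(f"{title}: {fps:.0f} FPS SDR -> ~{hdr_adjusted(fps):.0f} FPS HDR")
```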
But then there's the pricing argument: how do you justify the same performance jump as the last generation for more money? I mean, some countries have 2080 Ti pre-orders for upwards of 1,200 equivalent USD. That's, like, Titan territory. Still think it's worth pre-ordering? So look, you can argue for these cards any which way you want. It's going to be pretty obvious if you're on Nvidia's side at this point, and if you can keep an open mind, that's fine, but I suggest that you look at this as a skeptic and not as somebody who's going to be super positive and looking forward to such a successful and epic increase in performance. To me, this thing just looks like a powder keg, honestly, and I don't expect that people are going to be that thrilled with the way these cards perform in most of the games they're currently playing. You can call my opinions clickbait; I don't care. You can insult me personally, call me a joker for calling you out on your pettiness. But know this: people are pissed for a reason.
Remember the AMD Ryzen 5 video I pointed to earlier? Turns out I was right: the i5 still outperforms R5 CPUs in almost every modern game in existence, years after software updates, Windows patches crippling the blue team (even so, that's more of a plus for the red team, right?), and BIOS stability releases for higher-frequency memory support. The argument for premature drivers as the cause is, by the way, a weak one at best. The 980 Ti and the 1070 were neck and neck when Pascal launched; I tested both of those cards face to face, and I've noticed a tiny spread since then. Nvidia wants these cards prepped for game day. I mean, why would they cripple their own cards on a software level or, you know, launch them prematurely instead of pushing back the launch date? To me, pushing the date back seems like the much more plausible option if that were the case. And more food for thought: why would they avoid any frame rate comparison of literally any kind? We saw a frame rate of one game during the keynote, but no direct comparison at all. So sure, I could be completely wrong about all of this (a lot of us agree on this, by the way; I'm not the only one saying it). We could see 50, 60, 70 percent frame rate increases over the last generation, and if that's the case, I'll gladly admit that I was wrong. But do I think that I am? No. And do I think you should pre-order? No. So I think that's where the story ends, but if you have something else to add, you can do so by leaving a comment in the comments section below.
If you guys liked this video, give it a thumbs up; I appreciate it. Thumbs down for the opposite, or if you hate everything about life. Click the red subscribe button if you haven't already and come join us; we want to be really special about it. And we'll catch you in the next video. This is Science Studio. Thanks for learning!