Well guys, here we are on what is probably one of the most important days for gamers in the last few years, at least that's what NVIDIA would like you to think. After nearly two years since the launch of the GTX 1080 and the Pascal architecture, we finally have a new round of gaming GPUs that offer a lot more performance, but they cost a ton of money. So this is the RTX 2080 Ti and the RTX 2080, and unlike many previous generations of NVIDIA graphics cards, their new Turing core architecture both changes and optimizes the way games are rendered. There is a lot of exciting stuff going on below these awesome looking heatsinks and I'll be covering some of it here, but if you're looking for more technical details I'll leave a link to our review article posted on our official website down in the description. Now this video will be focusing on just one item that you all wanted to know about, and that is: how do these cards perform, and are they really worth the huge amounts of money that NVIDIA is asking for? So sit back with your cup of coffee and let's dive in. But first, a message from our sponsor. The new T-Force Xcalibur DDR4 memory is a unique RGB kit with awesome light spill from each module, and the special edition has a cool totem design on the light bar. You get a lifetime warranty, up to 4,000MHz speeds, and full lighting control through the software. Check out the T-Force Xcalibur RAM down below.
Alright, so I want to start off here with a closer look at what I think is one of the best-looking graphics cards ever created. The RTX 2080 Ti and the RTX 2080 share an identical streamlined exterior design that uses forged and milled aluminum pieces in either a brushed look or a flat black finish. Every single part of this card looks like it was put together with an eye towards detail. Unlike previous Founders Edition cards that featured blower-style coolers, NVIDIA decided to go a different route with the RTX 2080 series: they use a pretty low profile downdraft heatsink with a pair of axial fans to draw in cool air. This does keep most of the GPU's heat within your case, but the design is supposed to lower noise and offer cooler core temperatures. Honestly, the looks of these cards actually make me wonder why anyone would want one of the triple slot monsters from board partners; some of those designs look way too bulky.
Both of these cards require quite a bit of power, so the RTX 2080 Ti needs two 8-pin connectors while the 2080 needs a 6-pin and an 8-pin. That means I would recommend at least a 600W PSU for the RTX 2080 Ti and a 500W unit for the 2080, but if you're rocking a Threadripper or an overclocked Skylake-X system, add 200W to each of those numbers.
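Just to put that rule of thumb into something concrete, here's a quick sketch that encodes nothing more than the wattage guidance above; the function name and labels are my own, not anything official from NVIDIA.

```python
# Rough PSU sizing based only on the guidance above: 600W for the RTX 2080 Ti,
# 500W for the RTX 2080, plus 200W of headroom for Threadripper or an
# overclocked Skylake-X system.
def recommended_psu_watts(card: str, hedt_or_overclocked_cpu: bool = False) -> int:
    base = {"RTX 2080 Ti": 600, "RTX 2080": 500}[card]
    return base + (200 if hedt_or_overclocked_cpu else 0)

print(recommended_psu_watts("RTX 2080 Ti", hedt_or_overclocked_cpu=True))  # 800
```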
The backplates on the 2000 series are actually pretty similar to the ones on the 1080 Ti, but the all-silver colouring really makes them stand out. There's also a little secret hidden here too: this is the NVLink connector, which carries an SLI signal over a new interface that's capable of transferring up to 100GB/s of data between two cards. Now there are a few things that you need to know about NVLink. The most important is that NVIDIA is officially discontinuing three and four card setups; in its current form the RTX series only supports dual card configurations. The real benefit of NVLink is that it can customize workloads towards the most suitable GPU, and it can also share memory capacity and other key resources across multiple cards. That means two 8GB cards in SLI would have access to a true 16GB memory pool. It can also scale in a linear fashion when moving to higher resolution displays; NVIDIA gave an example of nearly 100 percent performance scaling when moving from 4K to 8K on current RTX series GPUs. That's pretty impressive guys, and I can't wait to test it out.
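If you want to sanity-check those two claims, the arithmetic behind them is simple; this is just a restatement of what NVIDIA is promising, with the frame rates below left as hypothetical placeholders rather than anything we've measured.

```python
# NVLink memory pooling: two cards expose one combined pool instead of
# mirroring the same data on each GPU.
def pooled_memory_gb(per_card_gb: float, cards: int = 2) -> float:
    return per_card_gb * cards                      # e.g. 2 x 8GB -> 16GB

# Dual-card scaling efficiency: 100% means the second card adds a full
# card's worth of performance.
def scaling_efficiency(fps_one_card: float, fps_two_cards: float) -> float:
    return (fps_two_cards / fps_one_card - 1.0) * 100.0

print(pooled_memory_gb(8))                          # 16
print(round(scaling_efficiency(30.0, 59.0), 1))     # 96.7 (placeholder values)
```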
The I/O area of these cards is actually quite unique. There are three DisplayPort 1.4a outputs that can support 8K at 60Hz or 4K at 120Hz, alongside a single HDMI 2.0b port with HDCP 2.2 compatibility that can support 4K 60Hz HDR content. The cool addition here is the USB-C port, which is a first for any GPU. This isn't actually your typical USB-C data connector, but rather a VirtualLink interface that's supposed to provide data and up to 27 watts of power to next generation VR headsets.
Alright, so let's end the tour right there and get on to what makes these cards tick. This is the Turing streaming multiprocessor, which is basically a building block for the core, and the reason I wanted to show it is to explain how NVIDIA has changed things up to benefit today's games and future technologies. Don't worry, I'm not gonna get too technical here. The first thing that you will see is there are now dedicated integer and floating-point units instead of just CUDA cores. This was done because NVIDIA realized the CUDA cores were often running integer operations while the rest of the cores sat idle, so they broke the two pipelines apart so both types of operations can now be processed in parallel. NVIDIA says that this could lead to a 25 to 50 percent speed-up in standard shading tasks.
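As a back-of-envelope way to see where a 25 to 50 percent figure could come from: if the integer work that used to tie up the pipeline can now run alongside the floating-point work, shading time is set by the longer of the two instruction streams instead of their sum. The instruction mix below is an assumed, illustrative ratio, not a measured one.

```python
# Toy model: previously INT and FP ops shared one pipeline (time ~ fp + int);
# with Turing's split pipelines they can overlap (time ~ max(fp, int)).
def overlap_speedup(fp_ops: float, int_ops: float) -> float:
    serial_time = fp_ops + int_ops
    parallel_time = max(fp_ops, int_ops)
    return serial_time / parallel_time

# Assume roughly 35 integer ops for every 100 FP ops in a shader-heavy
# workload (illustrative mix only): the model lands at about a 35% speed-up,
# which sits inside NVIDIA's quoted 25-50% range.
print(f"{(overlap_speedup(100, 35) - 1) * 100:.0f}% faster")  # 35% faster
```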
The biggest things added into the main core blocks are the Tensor cores, and each SM has eight of them. These are units specifically designed to efficiently process deep learning algorithms for machine learning and artificial intelligence. Now I know that's a lot to wrap your mind around, but NVIDIA is actually putting them to use with their Deep Learning Super Sampling anti-aliasing. Instead of the shaders being put to use here, the Tensor cores analyze the scene and apply anti-aliasing where needed in real time; the end result is much higher frame rates when AA is enabled. Within each streaming multiprocessor there are also four texture units and 96 kilobytes of shared memory. Finally, there's a single RT core which is meant to be hyper efficient at processing ray tracing tasks, something that previous architectures just couldn't do. NVIDIA is hoping to put these cores to use in future games for real-time lighting and shadow effects.
So now that you know what each streaming multiprocessor is, let's take a bigger look at the TU102 core that's in the RTX 2080 Ti. This chip is made up of 18.6 billion transistors and 72 streaming multiprocessors clustered into six graphics processing clusters, or GPCs. Separate from those sections are 96 ROPs that are broken into groups of eight, six megabytes of shared L2 cache, and twelve 32-bit GDDR6 memory controllers. But that is the fullest configuration, and in order to minimize power consumption and increase yields NVIDIA shaved off four SMs, eliminating their associated shader cores, texture units, RT cores, and Tensor cores. Meanwhile, one block of eight ROPs, a single memory controller, and some L2 cache were also removed. All of this still leads to the RTX 2080 Ti Founders Edition having some extremely high specs, which are way better than what NVIDIA offered with the GTX 1080 Ti.
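To tie those numbers together, here's a quick back-of-envelope pass at what the cut-down TU102 in the RTX 2080 Ti works out to, assuming the usual 64 CUDA cores per Turing SM and 14Gbps GDDR6; treat it as a sanity check on the configuration described above rather than an official spec sheet.

```python
# Full TU102 versus the cut-down version used in the RTX 2080 Ti,
# based on the configuration described above.
FULL_SMS, CUT_SMS = 72, 72 - 4                     # four SMs disabled
CORES_PER_SM = 64                                  # assumed CUDA cores per Turing SM
CONTROLLERS = 12 - 1                               # one 32-bit controller removed
GDDR6_GBPS = 14                                    # per-pin data rate

cuda_cores = CUT_SMS * CORES_PER_SM                # 68 * 64 = 4352
bus_width_bits = CONTROLLERS * 32                  # 11 * 32 = 352-bit bus
bandwidth_gbs = bus_width_bits / 8 * GDDR6_GBPS    # 352/8 bytes * 14Gbps = 616 GB/s

print(cuda_cores, bus_width_bits, bandwidth_gbs)   # 4352 352 616.0
```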
It also happens to consume a lot of power, but that was pretty much expected given the performance NVIDIA is advertising. Another big addition is the new GDDR6 memory, which allows the RTX series to have impressive bandwidth, something that's required for 4K. The big deal here is the price, with the Founders Edition costing a crazy $1,200, which is $200 more than the base card and $500 more than the GTX 1080 Ti. Sure, it comes with an awesome looking cooler, upgraded components, and a slight overclock to the boost speeds, but man, it's still expensive. The RTX 2080 on the other hand uses a different core than the Ti called the TU104, which means its specs are lower, but they are still supposed to be more than enough to overcome the GTX 1080 Ti. It should also be noted that it's much more efficient and comes with 8GB of GDDR6 rather than 11GB. The RTX 2080 Founders Edition costs around $800, which is still pretty expensive, especially when compared to the GTX 1080's original price of $550. However, if it can outperform the GTX 1080 Ti while offering the advanced RTX features, it may end up being competitive.
Speaking of RTX features, the RT and Tensor cores make these new cards part of NVIDIA's new RTX platform. This is an ecosystem made up of hardware, software, and APIs to add advanced AI, deep learning, and ray tracing into a more traditional graphics pipeline. Honestly guys, NVIDIA is betting big on ray tracing and deep learning methods becoming a huge part of next generation gaming, and it seems to be working too, since there will be at least 26 games that will come with DLSS support in the next few months. When you combine that with the ones that support ray tracing, there will be more games with RTX features than the total number of DX12 titles that are available today. But right now, other than Rise of the Tomb Raider, there aren't any other games that support RTX technologies. I'll be covering those in the future, but what we care about is how well these two cards perform in games that can be bought today.
For this review we'll be using a Skylake-X system with an i9-7900X overclocked to 4.8GHz, 32GB of memory at 3,600MHz, and all the other components listed here. One thing to note is that I didn't enable the new Game Mode within Windows 10 because it ended up messing with some key benchmark results. You should also know that every one of the numbers you'll see in the charts is an average of three separate benchmark runs; we have more information about the benchmarks in the website article if you're interested.
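For what it's worth, the averaging itself is nothing fancy; here's a minimal sketch of the idea, with the run values shown as placeholders rather than our actual results. Reporting the spread alongside the mean also makes it easy to spot a run that needs redoing.

```python
from statistics import mean

# Each charted result is the mean of three benchmark runs; the spread
# (max - min) is a quick flag for an inconsistent run.
def summarize_runs(fps_runs: list[float]) -> tuple[float, float]:
    return round(mean(fps_runs), 1), round(max(fps_runs) - min(fps_runs), 1)

print(summarize_runs([143.2, 141.8, 144.5]))   # placeholder numbers -> (143.2, 2.7)
```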
OK, so let's start off with Battlefield 1 in DX12. The first taste of what these RTX cards have to offer is pretty impressive: the RTX 2080 Ti is way, way out in front, and even the standard RTX 2080 is able to beat the GTX 1080 Ti by about 13%. The 4K results continue that trend too, but the RTX 2080 Ti stretches its legs even more, offering a nearly 50% improvement over the GTX 1080 Ti. Call of Duty continues the trend we saw with Battlefield 1; both RTX cards are well ahead at 1440p and the Ti offers some impressive numbers. However, turn up the stress at 4K and the RTX 2080 Ti's lead widens even more, while the gap between the RTX 2080 and the GTX 1080 Ti narrows a bit; this is likely due to the RTX 2080's smaller memory footprint. Destiny is next, and the performance seen here is just blazing fast from every one of the cards. Something I do want to mention right now is that the RTX 2080's improvement over the original GTX 1080 typically hovers around 50% or so, which is why I've been comparing it to the GTX 1080 Ti up to this point. These really are some fast cards guys, but remember, they also cost a small fortune.
My results in Far Cry 5 at 1440p are interesting, since according to the system's resource monitor two cores on the overclocked 7900X were maxed out during the benchmark run with the RTX 2080 Ti. This could mean that its performance was slightly limited by the CPU. And again at 4K the Ti surges further ahead, while the RTX 2080 and GTX 1080 Ti are neck and neck. Forza was another weird one, and it seemed to be CPU limited again at 1440p, but that didn't stop the RTX cards from dominating. They did however pull ahead at 4K, and here even the RTX 2080 maintained a pretty big lead over the GTX 1080 Ti; that's pretty impressive. You can actually tell that NVIDIA seems to have their optimizations done right for the Unreal Engine 4, and that's pretty important since there are dozens of games which use it. Here the RTX cards absolutely dominate everything else, especially at 4K where the 2080 Ti jumps out to a nearly 60 percent lead. What you will likely see as these benchmarks go by is that NVIDIA's core optimizations have allowed the RTX cards to excel in DX12, much like Battlefield, and Hitman proves there have been some major improvements in this field when compared to Pascal cards, particularly at 4K where the new shading horsepower really comes into play. Moving on to a much more popular game like Overwatch, there is nothing to complain about at either 1440p or 4K, but there's something I wanted to mention here: while the performance gains are 50% for the RTX 2080 Ti and about the same for the 2080 versus the 1080, we can't forget how much more these new graphics cards cost. They aren't inexpensive, but they do give you some amazing performance in today's games. Shadow of War is an extremely taxing game on the entire system, most of all the GPU's texture pipeline. Neither RTX card had any issue pushing completely playable framerates at 1440p and 4K while improving upon their predecessors by about 50% once again. Now, by this time I know I'm starting to repeat myself, so let's just roll the Rainbow Six results, since they're pretty much in line with the other games tested in this review.
Alright, so this is going to be a tough one, since while Total War: Warhammer II seems to be an extremely challenging game to run, it actually isn't, provided you use the DX11 mode. Creative Assembly's DX12 implementation has been in beta for years now and it is still far from optimized; it actually runs worse than DX11, and we've included it here to show that even with a lack of in-engine optimizations the RTX series has enough horsepower to deliver playable in-battle frame rates. Believe it or not, the latest Wolfenstein game actually provides one of the most surprising, yet not completely unexpected, results of this entire review. At 1440p things seem to be going really well for the RTX 2080 Ti and the RTX 2080, since they're able to chew through frames despite every detail being maxed out. But when the resolution is increased to 4K, the RTX 2080's performance simply falls to the floor while the GTX 1080 Ti and the RTX 2080 Ti surge ahead. The reason for this is actually pretty simple: the 8GB frame buffer and lower memory bandwidth caused a severe bottleneck on the Manhattan mission we chose. Even Vega 64's higher bandwidth numbers allow it to finally become sort of competitive. The final game here in this 13 title test is The Witcher 3, and the RTX cards go back to their normal leadership positions at both 1440p and eventually 4K as well.
So there you have the performance results of the RTX 2080 Ti and the RTX 2080, but testing can't stop at this point, since there were a few more things that we wanted to check out. First and foremost, let's actually see how these new cards compare, or how they perform, when you put them under pressure for an extended period of time. Because remember, modern GPUs start off with higher clock speeds as soon as the benchmarks begin, but when temperature rises, NVIDIA's boost algorithm kicks in and starts balancing the performance, the temperatures, and the power consumption, which usually leads to lower frequencies. That's why before benchmarking each game we typically warm up all the GPUs with at least two minutes of load, to better simulate how they will actually perform during real gaming sessions.
Starting off with the actual temperatures, there are some interesting things going on. As you can see, the RTX 2080 doesn't go above 72 degrees Celsius, which seems to be NVIDIA's default peak for this card; the boost algorithm won't allow it to go beyond that. This means you could probably get some additional clock speed headroom by simply increasing the temperature limit in overclocking software. The RTX 2080 Ti on the other hand is interesting too, since its temperatures are allowed to peak at a point that's 5 degrees higher, and that's where it remains. Basically, both of these are pretty cool running cards, but something did strike me as weird: NVIDIA could have easily squeezed some more performance out of the RTX 2080 by simply allowing it to hit a higher temperature like the RTX 2080 Ti. Maybe this was done to ensure a bit more performance separation between the two cards, or maybe for another reason altogether.
Anyways, onto clock speeds now, and here you can see what kind of effect NVIDIA's GPU Boost 4.0 technology has on clock speeds as temperature increases. Rather than pushing fan speeds up, the RTX series gradually fluctuates core frequencies to balance temperature and power consumption. This also goes to show why it is very important to either perform longer benchmark runs or ensure the cards have a warm-up period before testing each game. While the RTX 2080 remains around 1900MHz for the first 30 seconds or so, by the end of the 5-minute test that gets reduced to 1845MHz; this is literally right in line with NVIDIA's boost specification. The RTX 2080 Ti gets a bit more shaved off by going from 1785MHz to 1680MHz after 5 minutes, but it too remains right near its 1635MHz Founders Edition spec, so it's not bad at all.
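If you want to watch that boost behaviour on your own card, something like the sketch below does the trick on any system with NVIDIA's driver installed; it simply polls nvidia-smi once per second, and the exact query fields are the only assumption worth double-checking against your driver version.

```python
import subprocess
import time

# Poll graphics clock, temperature and board power once per second so you
# can watch GPU Boost settle as the card warms up under load.
QUERY = "clocks.gr,temperature.gpu,power.draw"

def sample() -> str:
    result = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True)
    return result.stdout.strip()

for _ in range(300):                               # roughly a five-minute window
    print(time.strftime("%H:%M:%S"), sample())
    time.sleep(1)
```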
But what about power consumption? Well, let's have a quick look. I want to start with the RTX 2080 Ti, because in pre-overclocked Founders Edition form it consumes quite a bit of electricity under load. The RTX 2080 isn't that bad, since it manages to outperform the GTX 1080 Ti yet it requires less juice. However, this chart doesn't tell the whole story, since as a graphics card is able to process more information, CPU and system memory load increases as well, so other components are making both of these cards look more power-hungry than they really are. Relative to slower, older GPUs, from a performance per watt standpoint the RTX 2080 and RTX 2080 Ti are far superior to anything that's been released up to now; I mean, that Vega 64 is just embarrassing.
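Performance per watt is just average frame rate divided by average board power, so it's easy to reproduce that kind of chart from your own logs; the figures plugged in below are placeholders, not our measurements.

```python
# Frames per second per watt, plus a relative comparison against a baseline card.
def perf_per_watt(avg_fps: float, avg_power_w: float) -> float:
    return avg_fps / avg_power_w

def relative_efficiency(card_fpw: float, baseline_fpw: float) -> float:
    return card_fpw / baseline_fpw * 100.0         # 100% = matches the baseline

card = perf_per_watt(60.0, 260.0)                  # placeholder figures
baseline = perf_per_watt(40.0, 290.0)              # placeholder figures
print(f"{relative_efficiency(card, baseline):.0f}%")   # ~167% of the baseline's efficiency
```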
Even though they are two of the most powerful GPUs ever made, the RTX Founders Edition cards are also some of the quietest I've experienced, at least that's the case when the RTX 2080 Ti isn't operating at extremely high frame rates. This is what the RTX 2080 Ti sounds like at idle, and now this is what it sounds like when playing games at well over 160 frames per second. There is a little bit of coil whine, that's for sure, and it becomes really noticeable especially if you literally have your ears close up to the card. As you can see, I am running it on a test bench, an open test bench to be more specific, but if you're wearing headphones or if it's inside a case it might not be as noticeable. However, check this out: this is what the RTX 2080 Ti sounds like when it's running at a more reasonable 80 frames per second and under; it's almost completely quiet. Meanwhile, our RTX 2080 sample didn't exhibit any of these issues and it operated as quiet as a mouse throughout testing. So these heatsinks seem to be more than ready to tackle the heat produced by the TU104 and TU102 cores.
Let's actually go into some overclocking, but before we can do that I want to mention that we haven't explored all the options when it comes to squeezing every bit of performance out of these RTX cards. For example, NVIDIA's board partners are actually including a scanner tool that NVIDIA developed alongside the GPU Boost 4.0 technology; this is what the EVGA implementation looks like. Basically, what it does is make overclocking easy for newcomers by launching a simulated load on the GPU while gradually increasing voltage and clock speeds. If you move the power and temperature limit sliders to higher values, the scanner will automatically shift its targets as well, and it will eventually settle on a final overclock that should be stable for longer periods of gaming. What I did is use NVIDIA's scanner tool to set a baseline overclock on both cards and then went ahead with manual inputs to dial in the clock speeds. I should also mention that the automatic scanner doesn't touch memory frequencies, so I needed to work on those too. Basically, my goal was to hit the highest speeds possible that were also stable for longer gaming sessions, and that means the results will show the frequencies each GPU settled on after 30 minutes of gaming. I should also mention that the fans were set between 41% and 52% for all tests, and even then neither card came close to the assigned temperature targets.
If you recall, the RTX 2080 Ti Founders Edition settled at 1680MHz in stock form, but with a bit of tuning it ended up continually running at 1980MHz. The memory didn't get all that far though, going from 14GHz to just over 15.5GHz. One thing to make note of is that GDDR6 memory has error correction routines which will cause it to throttle rather than show rendering errors most of the time, so detecting its true stable overclock is pretty challenging. The amount of headroom for the RTX 2080 is pretty reasonable, with it going from a boost speed of 1845MHz to a relatively constant 2085MHz. The memory on this particular card actually came close to hitting 16GHz, which is super impressive.
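Put as percentages, those overclocks work out like this; it's nothing more than the arithmetic on the sustained speeds quoted above.

```python
# Percentage gain of the sustained overclock over the stock settled speed.
def gain_pct(stock: float, overclocked: float) -> float:
    return (overclocked / stock - 1.0) * 100.0

print(f"2080 Ti core:   {gain_pct(1680, 1980):.1f}%")    # ~17.9%
print(f"2080 core:      {gain_pct(1845, 2085):.1f}%")    # ~13.0%
print(f"2080 Ti memory: {gain_pct(14.0, 15.5):.1f}%")    # ~10.7%
```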
So guys, I think after spending a few hours working on overclocking the RTX cards, they do have some gas left in the tank if you actually need some more performance. However, much like with Maxwell and Pascal, they are strictly limited by the amount of additional voltage you can apply in overclocking tools. The cores rarely reached the higher power limit and never even came close to the temperature limits, so it's more than obvious that a higher voltage setting would allow you to go even further. But how do these overclocked RTX cards perform? Well, let's check it out. In Battlefield 1, performance for the RTX 2080 Ti and the RTX 2080 gets a good increase of about 12% in both cases, but honestly I would likely just keep the settings at stock speeds unless you absolutely need those extra few frames per second. Far Cry 5 provides a bit more interesting results: the RTX 2080 Ti gets a big bump, but the RTX 2080 sees a smaller increase. This could likely be due to a frame buffer or core architecture limitation rather than clock speeds.
OK, so I'm going to try to wrap this up as quickly as possible, but there is a lot to cover. There's no denying the fact that the RTX 2080 Ti and the RTX 2080 are crazy fast, but having looked at the individual charts, what do they actually mean in the bigger overall picture? The first thing I wanted to take a look at is the RTX 2080 Ti Founders Edition's performance across all the games being tested. It is extremely good at 1440p, but this is obviously a card meant for 4K and higher resolutions; in some cases it was obviously bottlenecked by other system components at lower resolutions. But when gaming in UHD it's almost twice as fast as the GTX 1080 and over 40% faster than a GTX 1080 Ti. Those are massive numbers guys. Switching to the RTX 2080, I'll say that it offers huge performance at 1440p by cleanly beating the GTX 1080 Ti, along with very good frame rates at 4K; when compared against the original GTX 1080 or even Vega 64, they don't even come close. However, there will be some rare ultra high resolution situations where its 8GB frame buffer and memory bandwidth will become a limitation, like in Wolfenstein; just be aware of that and modify a setting to reduce the game's memory footprint.
The be quiet! Dark Base Pro 900 rev. 2 is here, with modern I/O for your Type-C accessories and Qi charging gadgets. The new hub is good for 8 fans and RGB strips, plus the interior is incredibly modular for airflow, water cooling, and inverted systems, and it now comes with a power supply shroud. Check out the rev. 2 down below.
Alright guys, so I think it's time for some real talk. The RTX series is really, really exciting, and you know the frame rates being put out there are huge, but don't forget the fact that both of these cards are really expensive. I mean, the RTX 2080 costs $800 and the 2080 Ti goes for around $1,200; that's expensive in my opinion. People with a GTX 1080 or GTX 1080 Ti shouldn't really look into upgrading just yet, because those cards still offer great performance across the board, even at 4K. I just think that the cost of buying into the whole RTX series is just a little bit too high, especially for people who have only recently purchased a Pascal GPU. I would probably wait until the new year to see how things shape up with the RTX features and HDR displays, but until then the 1000 series cards are more than capable enough to game on, or pretty much play any title that's currently available on the market today. But if you're looking for even more performance and bragging rights, and if you are thinking about picking up a 4K HDR monitor or a high refresh rate display, then yeah, the RTX 2080 Ti and the 2080 are the only options you have available. But what about gamers who have skipped Pascal altogether? Well, now might be a great time to upgrade, right before the Christmas season's huge lineup of new games.
If you had the money back in the day to buy the original Titan X, then you should be looking into grabbing the RTX 2080 Ti, and the same goes for the 980 Ti users: it might be time to order the RTX 2080. The performance delta between high-end Maxwell cards and their Turing equivalents is just massive. Now, NVIDIA is obviously asking gamers to pay a premium for features that may never become widespread in some games. Honestly, RTX technologies like ray tracing and DLSS anti-aliasing are pretty cool to see in tech demos and stuff, but you know, it may take months or even years till we start to see them in more than a few titles. In the end, only you can decide if this is a good buy for your needs, but if I had to make a choice, I would actually choose the RTX 2080 Founders Edition. It costs about a hundred dollars more than the custom GTX 1080 Ti cards, but it delivers reasonable frame rates in most games, and NVIDIA also made it pretty efficient, quiet, and overclockable if you need a bit more performance. Plus, it doesn't require you to put a huge $1,200 bet on upcoming tech. So there you have it: we finally have some new graphics cards, expensive graphics cards to be more precise, but we can soon expect less expensive alternatives to follow the RTX 2080 Ti and RTX 2080. Now, for those of you who do have the money to purchase the RTX 2080 or the RTX 2080 Ti, you can be confident that these GPUs will be the fastest on the planet for the foreseeable future.
I'm Eber with Hardware Canucks; thank you so much for watching. Make sure to subscribe to our new Boot Sequence channel for the latest tech news and rumors, and of course you can check out some relevant content over here. I'm signing off and I'll see you guys in the next one.