Since time immemorial, or at least the 1970s, there's been a never-ending demand for higher-quality, smoother graphics in video games. But the very earliest GPUs in game consoles weren't really GPUs at all: instead of being the general-purpose microprocessors we see in modern graphics cards, those early video controllers were more or less hard-coded to output only the specific visuals of whatever video game they were part of. It wasn't long, though, before real CPUs started to appear in home video game consoles, but for years, graphics processing in both consoles and computers was handled by the CPU itself rather than by a separate GPU. It wasn't until the mid-1980s that the modern concept of a discrete GPU started to take shape, with the Commodore Amiga's graphics subsystem offloading video tasks from the CPU, and Texas Instruments rolling out the very uncreatively named TMS34010 in 1986, which was among the first microprocessors specifically designed to render graphics on its own.

But it wasn't until graphical user interfaces on computers were popularized by new operating systems like Windows that what we think of as PC graphics accelerators on an expansion card really took off, instead of being relegated to top-end workstations out of the reach of average consumers.

One particularly popular early video card was the IBM 8514/A from 1987, which supported 256 colors and took care of common 2D rendering tasks, like drawing lines on screen, much faster than a regular CPU could. Thanks to its low cost, it spawned a number of clones and paved the way for further advances in 2D graphics.

It was also around this time that a small Canadian company named ATI started producing its own graphics cards, notably the Wonder series, one of the first consumer product lines to support multiple monitors, as well as the ability to switch between a number of different graphics modes and resolutions, which was uncommon at the time.

But these early graphics cards still relied on the main CPU for quite a few tasks, and as 2D graphics became more complex in the early-to-mid 1990s, we started seeing more and more powerful GPUs that could work more independently of the CPU, as well as the emergence of standardized application programming interfaces, or APIs, including OpenGL in 1992 and DirectX in 1995. These APIs enabled programmers to write code that would work on many different graphics adapters, really helping to push the gaming industry forward by providing something of a common software platform for game studios.

Of course, the real excitement in this area was the possibility of bringing 3D graphics to home PCs. Although the 1995 release of the original PlayStation console, one of the first to support true 3D graphics, proved wildly successful, the PC side got off to a much slower start. One of the first 3D cards designed for consumer gaming was the S3 ViRGE, also released in 1995. Unfortunately, the ViRGE was more of a 2D card with 3D support hastily added on, and it was notoriously slow, to the point where some gamers called it a "graphics decelerator." Not exactly flattering.

Other cards, like the 3dfx Voodoo from 1996, were actually 3D-only, meaning that you needed a separate card for day-to-day computing. Huh. But at least the Voodoo line was notable for pioneering multi-GPU setups with its successor, the Voodoo 2, and its Glide API helped 3dfx become a dominant force in the late 1990s. As time went on, we saw improved features and performance from cards like the ATI Rage series, which added DVD acceleration, and the Matrox Mystique, which actually allowed you to add more VRAM, something you can't even do on modern cards.

But the game really changed in 1999, when NVIDIA, previously known for cards like the RIVA TNT, released the GeForce 256. Aside from being the first ever GeForce card, it could process complex visuals that were previously left to the CPU, such as lighting effects and transformation, which maps 3D images onto a regular 2D monitor.

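To give a rough sense of what that transformation step involves, here's the textbook perspective projection; this is a simplified sketch, not the GeForce 256's exact hardware pipeline. A point at camera-space coordinates $(x, y, z)$ lands on the screen at

$$x' = f \cdot \frac{x}{z}, \qquad y' = f \cdot \frac{y}{z}$$

where $f$ is a scale factor derived from the field of view. Repeating that divide-by-depth for every vertex of every frame is exactly the kind of repetitive arithmetic that's better offloaded from the CPU to dedicated hardware.
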
Although the GeForce 256 was a little ahead of its time, and many games didn't support its new features, it set the stage for the GeForce 2, which came out the next year and became very popular.

That same year, however, 3dfx disappeared from the consumer GPU market, due to risky business decisions like attempting to manufacture its own cards to go with its GPUs, and to being unable to keep up with the performance of GeForce and ATI's new Radeon line. These two product lines overwhelmed a once-crowded GPU market, and by 2001, NVIDIA and ATI were the only two real players remaining, unless of course you count Intel's integrated graphics. Although a few smaller companies remained, they gradually exited the consumer market over the next several years.

Things continued to heat up in 2001 with the GeForce 3, which included a pixel shader that allowed for much more granular detail, since it could produce effects on a per-pixel basis. Not to be outdone, ATI quickly added this feature to its second generation of Radeon cards.

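To see what working on a per-pixel basis buys you, take basic diffuse lighting as an example; this is a standard textbook formula, not anything specific to the GeForce 3. The brightness at a point on a surface is

$$I = k_d \max(0, \mathbf{n} \cdot \mathbf{l})$$

where $\mathbf{n}$ is the surface normal, $\mathbf{l}$ is the direction toward the light, and $k_d$ is the surface's diffuse color. Earlier hardware evaluated lighting like this only at each triangle's corners and blended the results across the face; a pixel shader can evaluate the formula at every individual pixel, which is why effects like bump mapping look crisp instead of smeared.
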
For a while after that, subsequent cards offered mostly incremental performance improvements, though we did see a transition from the old AGP interface to the faster PCI Express interface in 2004, as well as NVIDIA's SLI and ATI's CrossFire in 2004 and 2005, respectively.

But 2006 brought us a couple of huge developments: ATI was bought out by AMD, and NVIDIA rolled out its famous 8800 GTX, an incredibly powerful and power-hungry card. Not only did it have a massive number of transistors, it had unified shaders that could handle a large number of effects at once and run at a faster clock than the processing core, as well as a number of stream processors that allowed graphical tasks to be parallelized to improve efficiency. The switch to stream processing allowed not only for greater performance in games, but also for general-purpose, or GPGPU, computing on graphics cards, for things like scientific research and, hey, Bitcoin mining. AMD incorporated similar technology into its Radeon HD 2000 series a short while later.

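To make the stream processing idea concrete, here's a minimal CUDA sketch; CUDA itself debuted alongside this same 8800-era hardware, though this particular kernel is a generic illustration rather than code from any real application. Thousands of lightweight threads each handle one element of the data, which is why the same silicon works for pixels, physics, and scientific number-crunching alike:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Generic illustration of stream processing: each GPU thread
// computes a single element of y = a*x + y, so the work is split
// into thousands of tiny independent tasks that run in parallel.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;  // about a million elements
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // memory visible to CPU and GPU
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // expect 5.0, since 3*1 + 2 = 5
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

On a CPU, that loop would run one element at a time; on a GPU, the hardware spreads it across all available stream processors, which is the same trick that lets it shade millions of pixels per frame.
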
AMD was also the first to bring us the concept of surround gaming, with up to six monitors at once under its Eyefinity brand in 2009, with NVIDIA following suit in 2010. Of course, 4K came along as well, with both the red and the green team featuring support for it in 2012. We've certainly come a long way since the days of playing Pong in black and white, and who knows, maybe in 40 years we'll have something so advanced that Crysis 3 won't be noticeably harder to render than a ball bouncing around a black screen.

TunnelBear VPN lets you tunnel to 20 different countries, allowing you to browse the internet and use online services as if you're in a different country. They have easy-to-use apps for iOS, Android, PC, and Mac, and they even have a Chrome extension. Just choose a country in the app or extension, turn TunnelBear on, and watch as your bear tunnels your internet connection to your new location. Your connection is encrypted with AES-256 encryption, and your public IP address gets switched, so you can show that you're in a different location. With TunnelBear, there are no weird port configurations or DNS settings or anything like that; TunnelBear handles all of that kind of stuff in the background. They also have a top-rated privacy policy and do not log user activity. You can try out TunnelBear VPN with 500 megabytes of free data, no credit card required, and if you like it and want to upgrade to their unlimited plan, you can save 10% by going to tunnelbear.com/Linus.

Alright guys, thank you for watching. I know this was a rather long video, but it was a history of graphics cards, so it was important. Like the video if you liked it, dislike it if you disliked it, and check out Channel Super Fun; we have a video coming soon, but I can't really tell you guys what it is. It's gonna be cool, so go check it out. Comment down below with video suggestions, and don't forget to subscribe and follow.