Gadgetory


All Cool Mind-blowing Gadgets You Love in One Place

GTX 960 2GB vs. 4GB in 2019 – Did It End Up Mattering?

2019-04-29
One of our oldest and most popular videos talks about the GTX 960 4GB vs. GTX 960 2GB cards and the value of choosing one over the other. The discussion continues today, but it's more focused on 3GB vs. 6GB or 4GB vs. 8GB comparisons. Looking back at 2015's GTX 960, we are revisiting it with locked frequencies to compare memory capacities and how it has aged. The goal is to look at both frame rate and image quality to determine how well the 2GB card has aged versus how well the 4GB 960 has aged.

Before that, this video is brought to you by MSI's RTX 2070 Gaming Z 8GB card. The RTX 2070 Gaming Z uses MSI's dual-fan design with large blades, which we've previously tested to have among the best noise-normalized thermal results in its class. MSI's 2070 Gaming Z has a fat heatsink, furthering the focus on reduced noise levels by allowing the fans to spin slower. RGB LEDs are naturally abundant on the card, but can be blacked out to match the carbon and blackout shroud. Learn more at the link in the description below.

First of all, a lot of things have changed since that content. Number one, we've changed a lot: our testing methodology has improved in the obvious four years since then, so that's a topic on its own where results today will be different from results four years ago. Further, the environment has changed. Windows 10 has changed significantly; I think we were using whatever was out at the time, and I don't know if it was 7, 8, or 10 at that point (it might have been 8), but either way the OS has changed significantly, and drivers have changed too, things like that. We haven't normalized for those variables, so we're not doing a now vs.
then comparison; what we're doing is a now-2GB vs. now-4GB comparison. We've done a few more steps here than we might have in years past to further improve the accuracy of the data, and one of those is to equalize the cards in frequency. Fortunately, because the 900 series uses Boost 2.0, this is pretty easy to do: it's from before NVIDIA really started pushing the thermally dependent frequency behavior that we see in modern architectures, so it's easy to lock the two cards to the same exact frequency, down to the megahertz, and to the same memory frequency as well, so everything's identical here. The cooling was maxed out on both cards, but that's only relevant because Boost 2.0 doesn't behave the way boost does today, and this allows us to get a strict A/B comparison between 4GB and 2GB. Even though they're two different cards, they're clocked the same, so it actually doesn't matter that they have a different heatsink and fan. Locked frequency is a big step for ensuring the data is accurate.

Another thing we need to talk about is memory used versus memory allocated or requested. A lot of people talk about how much memory is being used by an application when they open GPU-Z, Task Manager, or an on-screen display, but what that is showing you is how much memory the application has requested, and that doesn't mean it's actually engaged or actually needed to run the program. Some applications might see 11GB on a card and request all 11GB, but in reality they might only be using 4, so you really can't rely on that number to mean anything; it doesn't practically tell us anything. We have to do image quality tests and frame rate or frame time tests to see if there's any difference from the memory capacity. It's not enough to just look at how much is used by a game, because a lot of them will request more than they actively need.

Finally, as we get into the results here, an important thing to remember is that as we do things like increase the
resolution and increase the texture quality, we will definitely exceed the 2GB limit of this card, especially when we're using modern titles. Because of that, we are going to see differences in performance as the card becomes memory limited, again especially with modern titles. But when we're looking at a game running at 26 FPS versus 30, although that is percentage-wise a significant change, in terms of playability it doesn't matter: you're not really going to be happy with either experience. So that's something to think about as well. Even though we can see academically what the differences are between the cards if we force it, you still have to question whether it practically matters to you in real life.

Anyway, let's get into the testing. We'll start with Sniper Elite 4. Sniper Elite 4 is the absolute best example to start with, because the FPS numbers completely betray what's happening on the screen. If we look at our GPU bench, the chart starts at 1080p for this one. In Sniper Elite 4, you'll see that the GTX 960 Strix 2GB card actually posts a 57 FPS average, with lows well spaced at 41 FPS and 39 FPS. Comparatively, the 960 SSC 4GB card is within margin of error at 58 FPS average; lows are also within the wider error margins of the smaller data sets for 1% lows. As a frame rate, it looks like there's no difference, and that the 4GB card might actually be, quote, "overkill," as you could postulate that the rest of the GPU might not be keeping up enough for the memory capacity to ever matter. In reality, we needed an image quality comparison, and this is a newer game that treats GPUs a bit differently than games did in 2015. All games will handle this their own way, but Sniper Elite 4 handles VRAM limitations by silently, though obviously, downgrading texture resolution and quality to compensate for over-extension of VRAM consumption. Let's start by looking at 4K just to really exaggerate this effect. It's immediately visible, even without
the side-by-side comparison, that there are big differences in image quality, again especially at 4K, where we're stressing the card so much that it prioritizes running at that resolution rather than rendering the image the way we want it. We have issues with shadows, we have issues with mesh quality, with texture quality; everything is worse, and that is something that doesn't show up in FPS data. As such, FPS data is invalid for this comparison. This happens at 1080p too, to a lesser degree, and it does take a few minutes to really start taking away that texture quality, because it takes a little bit of time for the memory to reach capacity and for the game to start exceeding the memory's capacity. So we see issues with image quality at both 1080p and 4K, with 1080p being more relevant; it's just that 4K is more obvious because it happens immediately. Either way, this does invalidate the frame rate numbers, because they don't mean anything if the image being rendered changes: it's no longer a controlled scenario. But it does illustrate the issue with running 2GB on a 960 in the modern era.

Apex Legends at 1080p positions the 960 SSC 4GB at a 42 FPS average when clock-matched with the Strix 2GB card, with lows in the range of 30 FPS. Note that we could clearly improve performance with lowered settings, but the goal is to focus on the head-to-head comparison, not to make it playable. The 4GB card leads outside of margin of error, claiming a 10% gain over the 2GB card.

GTA V was released just a few months after the 960s launched, so it's the closest-to-launch comparison. At 1080p, the 960 2GB card ends up at 50 FPS average and is well within error margins when compared to the 960 SSC; these are about as close as you can get. At 1440p we see the same results: the two cards are within margin of error of each other, and so we can declare that they are functionally the same in this test. At least, that's true with regard to
performance. Image quality also doesn't show much of a change, so unlike Sniper Elite 4, we see here that the texture quality is the same, the mesh quality is about the same; everything is the same. This is all logical: if we saw massive changes in image quality, we would also expect the frame rate to go up on the card that is reducing the image quality.

F1 2018 at 1080p positions the 960 SSC 4GB card at 45 FPS average when frequency-locked, with lows at 28 FPS and 16 FPS for 1% and 0.1% lows. The 960 Strix 2GB card ended up at about 42 FPS average, giving the 4GB card a lead of 8.2%. This difference is outside of our run-to-run error. We did have one test pass with an excursion from the mean, but even if we eliminate that single test pass, the lead of the 4GB card remains at 5.5% and is outside of error margins. At 1440p, the existence of both the stock 960 SSC and the clock-locked one illustrates that the test results are outside of error once again, as these two devices represent different test settings run at different times and are still advantaged by the extra memory. The test range is plus or minus 0.5 FPS average in this test for these cards. The Strix 960 2GB card hits 29.6 FPS average, permitting the clock-locked 960 SSC a lead of about 16% at 1440p. This is a substantial lead, and although 1440p isn't a particularly good experience with either of these cards, the important part is that it stresses VRAM and shows us the limitations. These are further illustrated by the significantly lower 1% and 0.1% lows on the 2GB card, demonstrating frame time variance and inconsistency. Even at 1440p, F1 2018 image quality remains the same between the 2GB and 4GB devices: we drop performance from swapping in and out of video memory more aggressively, but we don't seem to have a reduction in image quality.

Far Cry 5 at 1080p doesn't illustrate differences of any major margin.
We are at 44 FPS average for the 960 4GB variant, with the Strix 960 2GB card at 42.1 FPS average. These are nearing error for this title, although the range technically does exceed it, just barely. At 1440p the difference is emergent: with greater resolution, the 960 4GB card maintains a lead of about 11%, which matters since each frame at this frame rate is meaningful, and the 0.1% lows also show some gains on the 4GB card. Far Cry 5 also doesn't suffer an image quality reduction, which is largely demonstrated by the frame rate difference: if the game were cutting image quality, we'd likely see that reduction accompanied by a frame rate increase, which is not observed here.

Shadow of the Tomb Raider at 1080p places the frequency-locked 960 4GB at 36.7 FPS average, allowing it a lead of about 14%. Every single test pass for the 960 4GB was between 36.6 and 36.8 FPS average, marking this benchmark as highly reliable and with minimal variance. The 960 Strix demonstrated more variance, with results between 31.7 and 33.1 FPS average. 1440p again shows the same scaling that we've seen before, with the 4GB card pulling ahead in a measurable way, but again, we're at a point where you wouldn't be playing on either card anyway. It's still important to show the difference, just not particularly meaningful to the user experience. Shadow of the Tomb Raider's image quality also looks the same between the two. Some characters are randomly generated, but looking at floor tiles, we see the same texture quality and resolution. The same is true for Lara Croft, who has the same texture and mesh quality in each scene, and the tree and the pole behind her also have the same image quality.

So, as always with this type of thing, the answer is: well, it sort of depends. On average, the 4GB card is definitely better, and we do have a serious issue in Sniper Elite 4, where the game treats exceeding the memory capacity by lowering the quality of the visuals
you've set, so your quality settings actually mean nothing at that point, and clearly the 4GB card is doing better today. Now, for the 4GB card in 2015 versus the 2GB card in 2015: we tested that at the time, and again, it really depended on the title. The differences were not always present, but they were sometimes present, which is annoying as a conclusion, because it really just means you would have had to look at the types of games you were playing at the time versus what was tested online, and see if it mattered to you. But just like today, where we look at 4GB versus 8GB and say, well, yeah, 8GB is better on a 2080 than 4GB would be, it's going to come down to the GPU that accompanies the memory. To some extent, when you actually play with these things at reasonable settings, they're really not all that different. The differences start to emerge once you start beating the cards up so hard with texture resolution, screen resolution, mesh quality, things like that, that the memory finally becomes a limitation; but by the time you've done that, the GPU is also a limitation, and it's not really a fun experience anyway. So there's a bit of a balance between academic exercise and real-world experience.

However, we can clearly see that the 2GB card is doing worse with modern games. Not really a huge surprise, but if you're still running a 960, you might be able to get a little bit further with a 4GB model today than you would with a 2GB model, and stretch it out just that much longer through games as they launch without needing an upgrade pathway. So anyway, that's 2GB versus 4GB in 2019. There are differences; the image quality is the most interesting one, and it also tells us a story about how, in testing, we need to pay attention to things like image quality and not just the numbers, because clearly some games resolve this differently than others, and that can invalidate test
data all together if not accounted for so that's it for this one thank you for watching as always you can subscribe for more or go to stored I Cameron's access net to help us out directly by picking up one of these shirts which we've just restocked it's the blue print shirt I'll see you all next time
We are a participant in the Amazon Services LLC Associates Program, an affiliate advertising program designed to provide a means for us to earn fees by linking to Amazon.com and affiliated sites.