
Stop Confusing TDP and Total Power Draw!

2017-05-06
I get it: it's misleading. No, TDP doesn't stand for "total power draw," and many manufacturers insist on reading it differently. Let's discuss thermal design power and clear up the misinformation.

Consider an i7-7700K. Intel rates its TDP at 91 watts. What this means is that a cooler placed atop this CPU has to dissipate 91 watts of heat to keep the CPU running as it should. The cooler shown above is much smaller than required and thus has a much lower thermal design; it might dissipate only half the heat given off by the CPU, for example, resulting in skyrocketing core and package temperatures, thermal throttling, and possible system failure.

Another thing to note: a processing unit's TDP will also change if its frequency and voltage change. We discussed this in our crash course playlist. Every computer enthusiast knows that overclocking a CPU or GPU always, "always," results in higher thermal output. And here's the catch: voltage doesn't have to change. On paper, that doesn't seem to make sense. If voltage ultimately drives current through resistance (we're talking Ohm's law here), and that resistance converts electrical energy into heat, then a voltage change of zero should yield a thermal change of zero. But thanks to how transistors work, clock speed directly affects TDP.

Here we go: as transistors switch from an active state to a passive state and back to an active one, they release heat thanks to partial resistance (remember?). So as clock speed increases, so too does the switching rate. It isn't directly proportional; a frequency of 4 gigahertz doesn't mean your transistors are switching 4 billion times a second. They're actually switching at higher rates than that. And if you overclock all the cores in your chip equally, which you have no choice but to do when it comes to GPUs (most of them have several thousand cores), thermal output increases faster than the clock bump alone would suggest, because holding a higher frequency stable usually demands extra voltage, and power scales with the square of voltage. In other words, for every factor of 1 of overclock, say 100 megahertz, heat is generated at a factor greater than 1. That value varies from chip to chip, but it explains why, past a certain frequency, chips become very stubborn and get very, "very," hot regardless of voltage input. You're essentially asking the transistors within your processor to switch at rates much higher than they were thermally designed for. So if you wish to venture any higher than that threshold, typically around 5 gigahertz or so, special cooling systems are required.
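To put rough numbers on that, here's a minimal sketch using the first-order CMOS dynamic-power model, P ≈ C_eff × V² × f. The effective capacitance and the clock/voltage pairs are illustrative assumptions I picked so the stock case lands near the 7700K's 91-watt TDP; they're not measured figures.

```python
# First-order CMOS dynamic-power model: P = C_eff * V^2 * f.
# C_EFF is an assumed effective switched capacitance chosen so the stock
# case lands near the i7-7700K's 91 W TDP; it is not a measured value.

def dynamic_power_watts(c_eff: float, volts: float, hertz: float) -> float:
    """Switching power of a CMOS chip under the first-order model."""
    return c_eff * volts ** 2 * hertz

C_EFF = 1.5e-8  # farads (illustrative assumption)

print(f"stock 4.2 GHz @ 1.20 V: {dynamic_power_watts(C_EFF, 1.20, 4.2e9):6.1f} W")
print(f"OC    5.0 GHz @ 1.20 V: {dynamic_power_watts(C_EFF, 1.20, 5.0e9):6.1f} W")
print(f"OC    5.0 GHz @ 1.35 V: {dynamic_power_watts(C_EFF, 1.35, 5.0e9):6.1f} W")
```

Even with the voltage pinned at 1.20 V, going from 4.2 to 5.0 GHz adds roughly 19 percent more heat, and bumping the voltage to 1.35 V makes the V² term pile on top of that.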
And remember, this is all thanks to the second law of thermodynamics: no system is perfect. Everything loses energy in the form of heat. Your own body does, our Sun does, the Earth does; in fact, the entropy of the entire universe is increasing. From the vastness of galaxies down to the very transistors powering the computer or phone you're using to watch this video at this very moment, everything is giving off heat. Be sure to check out the full video, "Why Do Processors Get So Hot?", via the link below.

So: when frequency increases, losses to partial resistance increase, resulting in a higher operating temperature. The same goes for voltage, and you can figure that one out with Ohm's law. As a general rule of thumb, overclocking a processor often requires a beefy air cooler or an AIO.

All this to say: TDP is usually an underestimate of total power draw under load. Even a 7700K at stock frequencies will draw more power under full load than its TDP. Steve from Gamers Nexus helped me clear this one up. You see, TDP isn't a set-in-stone measurement standard. That should be easy to see with Intel CPUs by this point, but the trend carries over to components from other companies in dicey ways.

Consider this PNY GTX 1080 Ti. It's an insane graphics card, one of the most powerful single-GPU cards you can currently buy, running on the Founders Edition reference PCB, and its "Graphics Card Power," quote unquote, is vaguely listed on NVIDIA's website as 250 watts. I'm nitpicking here because "Graphics Card Power" isn't necessarily TDP, and on top of that, NVIDIA doesn't specify whether this is GPU power alone or GPU + VRM + VRAM power, in other words, total board power. It gets confusing. Now, with this card, power draw can reach upwards of 300 watts, which still isn't bad, mind you, considering how powerful the card is, but I should note that this is not a consistent trend; it happens very infrequently. I don't have the equipment for individual component power testing (it's actually more difficult than it sounds), but websites I trust, including Tom's Hardware, do this on a daily basis. So you can see here an absolute peak power draw, occurring for a fraction of a second, of 295 watts. That's well over the 250-watt TDP the card is rated for. If you had chosen a power supply based on summed-up component TDPs and then decided to play any sort of intensive game, you might find yourself tripping your power supply and crashing your system, or thermal throttling. I'm not saying it's guaranteed, but you'd be awfully close. For the vast majority of their testing, though, the Ti, when tortured, averaged right around its TDP.

Through these examples you can clearly see why you cannot simply take a TDP figure and derive power draw from it directly. It depends on the unit, the manufacturer, and what general practices were being followed at the time. If by this point you're quite confused, don't worry; I was in the exact same boat. I had many questions to ask the tech community, and I'm glad that Steve and others were able to help me out, so thank you for that. But if there's one thing you can and should take away from this video, it's clarified perfectly by VSG from Thermal Bench. He says that no matter what, your power input will always be higher than your heat output. And if you think about it, it has to be: if power consumption matched heat output, the unit would literally just be a space heater and do no electrical work.

So, in closing, here's a summed-up, simplified explanation of the difference between TDP and power draw: the unit in question draws X watts of power from the wall and dumps Y watts of heat shortly thereafter, where Y must be less than X. Now, when it comes to deciding how much power your PC will draw under load, it can help to start with the TDPs and multiply by some sort of safety factor. I typically use 1.5, just to be on the safe side. For example, if my CPU's TDP is 100 watts and my graphics card is rated at 300 watts, I multiply 400 watts by 1.5 and end up with around a 600-watt PSU to be comfortable. If I plan on doing some serious overclocking, I'll adjust the multiplier accordingly. If, however, I'm supporting several fans, a large pump, many hard drives, LED strips, et cetera, then I might want to tack on an extra 100 watts just to be safe; these components can drain quite a bit of power from your power supply.
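To make that arithmetic concrete, here's a minimal sketch of the rule of thumb above. The function name and its defaults just mirror this article's 1.5 safety factor and 100-watt peripheral allowance; they're not any published standard.

```python
# PSU sizing rule of thumb: sum the component TDPs, multiply by a safety
# factor (TDP usually understates real draw under load), then add headroom
# for fans, pumps, drives, LED strips, and the like.

def recommended_psu_watts(cpu_tdp: float, gpu_tdp: float,
                          safety_factor: float = 1.5,
                          peripheral_headroom: float = 0.0) -> float:
    """Rough PSU recommendation in watts, per the rule of thumb above."""
    return (cpu_tdp + gpu_tdp) * safety_factor + peripheral_headroom

# The example above: 100 W CPU + 300 W GPU -> about a 600 W PSU.
print(recommended_psu_watts(100, 300))                           # 600.0

# Same build loaded up with fans, a pump, drives, and LED strips:
print(recommended_psu_watts(100, 300, peripheral_headroom=100))  # 700.0
```

If you're planning a serious overclock, bump the safety factor rather than the headroom, since the CPU and GPU are where the extra draw actually shows up.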
While the safe power zone is usually subjective, under no circumstance would you want to run this kind of PC (it's a beautiful PC, by the way) with a 500-watt power supply. That would just be insane. You'd almost certainly be outside your supply's peak-efficiency curve, discussed in more detail right here, and that's assuming you're delivering enough power to your components to keep them running in the first place.

If you have any questions or concerns, leave those in the comments below. Check out the links below to the products showcased in this video, and also check out Gamers Nexus's channel for more in-depth stuff like this; the guys over there cover quite a bit of material that is honestly beyond me. If you liked this video, be sure to give it a thumbs up (thumbs down for the opposite), click the subscribe button if you haven't already, and I will catch you in the next video. This is Science Studio. Thanks for learning with us.