What is TDP and why should you care about it?

CPU (Image credit: Rich Edmonds / Windows Central)

Should you be looking at various parts to build a PC, or to upgrade a specific component, you may have come across a term on your travels: TDP. But what exactly is TDP, and why should you care about the value provided by a manufacturer? We break it all down for you.

So, what is TDP?

TDP stands for Thermal Design Power, and it measures the amount of heat a component is expected to output when under load. For example, a CPU with a TDP of 90W is expected to output up to 90W worth of heat in use. This can cause confusion when shopping for new hardware, as some people take the TDP value and design a PC build around it as a wattage figure. That approach isn't entirely accurate, but nor is it completely wrong.

Our 90W TDP CPU example doesn't mean the processor will draw 90W from the power supply, even though thermal design power is measured in watts. Rather than stating the component's raw power requirement, manufacturers use TDP as a nominal value for cooling systems to be designed around. It's also rare to hit the full TDP of a CPU or GPU unless you run intensive applications and processes.

The higher the TDP, the more cooling is required, whether that's passive cooling, fan-based coolers, or liquid setups. You won't be able to keep a 220W AMD FX-9590 cool with a laptop-class CPU cooler, for example.

TDP ≠ power draw?

AMD Radeon GPU


Not quite, no. TDP doesn't equate to how much power the component in question will draw, but that doesn't mean you can't use the value as an estimate. The rating is based on power, so it can prove useful when working out how much juice you'll need to provide. Generally, a component with a lower TDP will draw less electricity from your power supply.

TDP figures listed by manufacturers can vary as well, depending on their own testing. So while a TDP value may not exactly reflect how much power a part will draw in a system, it provides solid grounds to design a cooling setup around, as well as a rough idea of how much output a power supply (PSU) will need. To be safe, we usually recommend a quality 500W PSU from a reputable brand for a PC with a single GPU.
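As a rough sketch of that sizing approach, you might total the TDP figures of your parts and add headroom. All component names and wattages below are illustrative assumptions, not figures from this article:

```python
# Rough PSU sizing from TDP figures.
# TDP approximates heat output, not exact power draw, so treat
# the total only as a ballpark estimate. All values are assumptions.

component_tdps = {
    "CPU": 95,                  # watts
    "GPU": 180,
    "motherboard_and_RAM": 50,
    "drives_and_fans": 25,
}

total_tdp = sum(component_tdps.values())

# Add ~40% headroom so the PSU runs well below full load.
recommended_psu = total_tdp * 1.4

print(f"Estimated load: {total_tdp} W")      # Estimated load: 350 W
print(f"Suggested PSU:  {recommended_psu:.0f} W")  # Suggested PSU:  490 W
```

A real build would round up to the nearest common PSU rating, which for this sketch lands on the 500W recommendation above.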


If TDP still has you flabbergasted, it's essentially a rating that helps you gauge the power efficiency and performance of a component. Using a CPU as an example, one with a higher TDP will usually offer more performance, but will also draw more electricity from the PSU. TDP is not, however, a direct measure of how much power a component will draw, but it can be a good indicator.

Make sure the cooling you have at hand is more than enough to keep components in check, especially when it comes to the GPU and CPU.

Rich Edmonds
Senior Editor, PC Build

Rich Edmonds was formerly a Senior Editor of PC hardware at Windows Central, covering everything related to PC components and NAS. He's been involved in technology for more than a decade and knows a thing or two about the magic inside a PC chassis. You can follow him on Twitter at @RichEdmonds.

  • Gotta love when one unit of measure can mean two or more things.
  • That is what I was wondering. From what I know from physics, the watt is the unit of power. Power is energy over time, and heat is a form of energy, so the unit of heat cannot be the watt. The unit of heat, or of any energy, is the joule.
  • If someone says 50 kJ of energy, you would not know how much electricity is required per unit time. Even AA batteries can produce 50 kJ of energy (or as much as you want), but you would need a lot of batteries or a lot of time to reach that target. That is why the watt is the better unit for these devices: 50 W means a constant (or close to constant) rate of energy delivery. Think of it like frames per second (FPS). Watts are like FPS: 60 frames could be one second of smooth gameplay or 60 seconds of slideshow in-game. A video might have 5,000 frames, but you don't know how smooth it will be without the frames-per-second value.
  • The old faucet/water analogy still seems to be the best way to describe wattage. Voltage is how much pressure the water is under, amperage is how fast the water is flowing out of the faucet, and wattage is how many buckets per hour you will fill with water. Wattage equals volts x amps. Strictly speaking, that's how they come up with TDP, PSU ratings, and many other specs.
  • Can I put a processor of TDP 90W on a mobo which states max TDP 65W? What problems will happen if I do so?
  • Yes, you can do that, because the processor gets its power from the power supply connection. I think a max TDP of 65W means that your motherboard will use or produce 65W of heat. The motherboard needs power to run too.
  • CPUs get their power through the socket pins; that power flows through the motherboard to get there.
  • The reply you previously got should be ignored. The TDP rating you see on a motherboard is with respect to the CPUs that are compatible with it. The VRMs on the motherboard are designed to handle a certain load current, so cheaper motherboards have cheaper, less capable VRMs. If you put a 90W TDP-rated CPU on your board, you could overcurrent and blow the VRMs. At the very least, you'll substantially shorten their lifespan. In some cases you might get away with it, such as when a CPU is rated at 90W TDP but actually only dissipates, say, 68-70W. It's above the 65W level, so the manufacturer may rate it as a 90W part. In general, however, it's not something you want to risk unless you have spare boards, or the cash to buy another one.
  • TDP is a measure of performance as well?
  • Not really. A higher TDP will bring better performance on similar technology. E.g., the GTX 1080 has the highest TDP (180W) among GTX 10xx GPUs, and it has the best performance. But the GTX 690 has a TDP of 300W and lower performance than the GTX 1080. That is because the 1080 has better technology.
  • No. It's one part of the equation to help understand efficiency, and does not have anything to do with performance directly. For example, if you had a 100% efficient processor (which is impossible), your CPU would waste (give off) no heat at all. Its TDP would therefore be 0W, but the "performance" (FLOPS, IPS, etc.) of the CPU would still be unknown from that measurement alone.
  • I am no IT pro, but he does repeat the idea very well, all three times.
  • The watt is not a unit of heat, so the first explanation was incorrect. The explanation provided in the conclusion is the correct one. It seems the author had trouble understanding what TDP meant.
  • The watt is a measure of joules per second, which is energy per unit time, and that energy can be converted to heat. So he is right.
  • The watt is a measure of instantaneous power, that is, heating capability. Hence why you need to constantly move heat away from the component at a rate equal to or greater than the rate of heat generation. This is why TDP is quoted: so heatsink/fan manufacturers know how big to make the heatsink for a particular component. The heatsink accepts heat energy from the processor and expels it to the air. The larger the surface area in contact with air, the more power it can dissipate. A small fan can be used to feed the heatsink fresh, cooler air and improve its heat-dissipating potential. So TDP should only be used to size component cooling. In reality, silicon chips convert almost all of the consumed electrical energy into heat. Confusingly, Intel sometimes quotes SDP instead of TDP. SDP, or scenario design power, is the amount of heat energy that needs to be dissipated over time in a particular scenario; the TDP will often be around double the SDP. For example, the Atom x7 had an SDP of 2 watts, but in reality its TDP is closer to 4 watts.
  • TDP says nothing whatsoever about the power efficiency of a component. All it tells you is the AVERAGE MAXIMUM energy per second, expressed in watts, that needs to be dissipated under normalized full load. Say I have two GPUs to render an image on a screen. One GPU has a TDP of 90W and one 300W. This article seems to suggest the former is more efficient. But what if the 90W one can render one image per second while the 300W one can render 50 images per second? Still think the lower-TDP one is more efficient? Also note that TDP is defined in different ways by different companies. AMD and Intel both use the term TDP, but the figures are not comparable: AMD's TDP for the same chip will be higher than Intel's, because they measure the average maximum in different ways. And very importantly, TDP says nothing at all about idle power consumption. As most CPUs and GPUs spend almost all of their time idle, this is what you'll see on your power bill. If you're on Windows, you can check this in Task Manager (W10: Ctrl+Shift+Escape > Details > "System Idle Process", usually 90%+). Also, a 500W PSU is enough for any normal gaming system with a single GPU, and in most cases enough for a dual-GPU system. The 500W and 750W ratings recommended for gaming hardware assume a crappy PSU that cannot deliver power to ATX spec at anything remotely approaching full load. Interested in power consumption? Bing for "TDP and race to idle" to gain some perspective.
  • In electronic computing devices like a CPU or GPU, the power drawn is the same power that produces the heat, all of it; there is no mechanical movement or light coming from a CPU, so all the power is converted into heat.
  • All the heat does come from power drawn, but not all power drawn goes to heat.