I want to hear your opinions on thermal pads used in laptops. I've heard everything from "it's a bad idea to replace them with copper shims, because pads allow the chip to physically move with thermal expansion" to "they are absolute crap".
IMHO, thermal pads/rubber/sponges make sense on VRAM chips or on a chipset without integrated graphics that dissipates 2-3W, but using them on GPUs or IGPs with >10W power consumption and a die size under 100mm^2 is simply absurd. It's the result of manufacturer cheapness, loose mechanical tolerances in the cooling system and planned obsolescence rather than any solid technical reason. Remember that pads, besides being worse at heat transfer than even the cheapest paste, also age worse than paste, simply because there is physically more material to degrade.
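To put some rough numbers on the pad-vs-paste point, here's a back-of-the-envelope one-dimensional conduction sketch in Python. Every figure in it (power, die area, layer thicknesses, conductivities) is an assumption I picked for illustration from typical datasheet ranges, not a measurement:

```python
# Rough 1-D estimate of the temperature drop across the thermal
# interface material (TIM): dT = P * t / (k * A).
# All numbers are illustrative assumptions, not measurements.

def tim_delta_t(power_w, thickness_m, conductivity_w_mk, area_m2):
    """Temperature drop (K) across a TIM layer, 1-D conduction only."""
    return power_w * thickness_m / (conductivity_w_mk * area_m2)

P = 15.0    # W, assumed GPU/IGP power
A = 80e-6   # m^2, assumed ~80 mm^2 die

paste = tim_delta_t(P, 50e-6, 5.0, A)  # ~50 um paste layer, k ~ 5 W/mK
pad   = tim_delta_t(P, 1e-3, 3.0, A)   # ~1 mm pad, k ~ 3 W/mK

print(f"paste: {paste:.1f} K, pad: {pad:.1f} K")
# paste: ~1.9 K, pad: ~62.5 K
```

Even granting the pad a respectable bulk conductivity, its thickness dominates: roughly 60 K lost across the interface versus about 2 K for a thin paste layer, under these assumed numbers. That's the kind of gap I mean.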
The only way I could see a pad doing a reasonable job as the thermal interface material for a GPU die is if it has a large surface to spread heat into before the heatpipe. That was the case on older laptops, and it worked fine there (but then again, those also used leaded solder). Nowadays, with smaller, lower-power chips covered by just a thin, flimsy piece of copper (sometimes even aluminum) soldered to a small-diameter heatpipe, using a thermal pad instead of building the heatsink tight enough for paste seems like the worst possible idea to me.
It's not like it's hard to ensure adequate contact between the heatsink and the die. I can take an existing heatsink that used a thermal pad on the GPU or IGP and modify it for direct die contact with paste by lightly bending it a couple of times by hand, using nothing but my eyes and a thin smear of paste to check the contact pattern. No shims required.
If you're telling me the heatsink factory could not BUILD them like that from the get-go, I'm not buying it.
The use of thermal pads doesn't just *seem* like a bad concept, it IS one. I have solid proof and some cold hard numbers, some of which are downright scary.
Also, the whole "allows the chip to physically move" idea is one of the stupidest things I've heard. Last time I checked, BGAs worked best when firmly attached to the board.
What is your take on this?