Originally posted by CapPun
Like you said, there has to be a reason why manufacturers use Japanese polymers in the CPU VRM. But it may not be because they are more reliable.
Originally posted by CapPun
How far the video card overclocks at stock voltage will only tell you about the ASIC quality of the actual chip (i.e., the little silicon die). It will not tell you whether the "fusing" of the GPU core to the substrate was done properly or "late Friday shift"-bad.
Thus, you could have a really good overclocking chip fail very fast... or not.
And you can also have a really crappy, hot-running chip last a very long time... or not.
So in short, it's not possible to determine lifespan from overclocking headroom.
But the cooler you keep a graphics card (or any electronic device, for that matter) within its reasonable operating range, the longer it is likely to last.
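To put a rough number on that temperature point: for electrolytic capacitors (the usual suspects on aging video cards), the common rule of thumb is that expected life roughly doubles for every 10 °C below the rated temperature. A minimal sketch of that rule, using made-up example ratings rather than any specific datasheet:

Code:
# Rule-of-thumb lifetime estimate for an electrolytic capacitor:
# life roughly doubles for every 10 C below the rated temperature.
# The ratings below are hypothetical example values, not from a datasheet.

def estimated_life_hours(rated_hours, rated_temp_c, actual_temp_c):
    """Arrhenius-style 10-degree rule: L = L0 * 2 ** ((T_rated - T_actual) / 10)."""
    return rated_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10)

# A 2000 h @ 105 C cap running at 65 C on a well-cooled card:
print(estimated_life_hours(2000, 105, 65))   # 32000 hours
# The same cap cooking at 95 C next to a hot GPU core:
print(estimated_life_hours(2000, 105, 95))   # 4000 hours
The same general idea (cooler = slower chemical/thermal degradation) applies to the GPU die and solder joints too, just without such a tidy formula.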
Originally posted by CapPun

I think most newer nVidia video cards (GTX 700 series and above?) have a power limit feature. Possibly ATI/AMD too, though I'm not sure at all from which series onwards (probably the R7/R9 cards and later?). From what I've briefly seen/heard, the power limit feature simply clocks the card only as high as it can go before hitting the power limit. So if, for example, you have a video card with a 1 GHz core clock and a 150 W TDP, and you set the power limit to, say, 100 W, then the core will run at whatever maximum frequency it can sustain without exceeding that limit (thus, likely less than 1 GHz).
That said, if anyone knows more about the power limits on newer cards, please correct me if I'm wrong somewhere. I don't have many newer (working) video cards, so I haven't looked into the issue. My current best (working) cards are a GTX 560 Ti and a Radeon HD 6850. (Yeah, I know... I'm so outdated. Whatever.)
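To make the quoted power-limit behavior concrete, here's a minimal toy sketch of how such a boost governor might conceptually pick a clock. The clock steps and the linear power model are made-up illustration values; real nVidia/AMD firmware also scales voltage with clock (so power grows faster than linearly), and this is not how any actual driver is implemented:

Code:
# Conceptual sketch of a power-limited boost governor (not real driver code).
# Power is modeled as roughly proportional to frequency at fixed voltage;
# real cards also drop voltage with clock, so the real curve is steeper.

BASE_CLOCK_MHZ = 1000     # hypothetical 1 GHz card, as in the example above
BASE_POWER_W = 150        # hypothetical 150 W TDP at the base clock

def max_clock_under_limit(power_limit_w, step_mhz=13):
    """Walk down from the base clock until estimated power fits the limit."""
    clock = BASE_CLOCK_MHZ
    while clock > 0:
        est_power = BASE_POWER_W * clock / BASE_CLOCK_MHZ
        if est_power <= power_limit_w:
            return clock
        clock -= step_mhz
    return 0

# Capping the 150 W card at 100 W drops the sustained clock below 1 GHz:
print(max_clock_under_limit(100))   # 662 MHz with this toy linear model
So with a 100 W cap on a 150 W card, the governor settles well under the stock 1 GHz, which matches the behavior described above.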
