Badcaps.net Forum
Old 11-07-2018, 02:45 PM   #1
CapPun
New Member
 
Join Date: Nov 2018
City & State: a yurpean city
My Country: Yurp
I'm a: Knowledge Seeker
Posts: 12
Exclamation *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

so this is a general question - I've read many times that the amperage on the 12V rails should be sufficient to properly feed a graphics card

makes sense so far

what makes way less sense & sounds like total magic to me is this: if a GPU is underpowered (insufficient wattage on the 12V rail) then this causes the card to... overheat?

how thef is that possible?

basically that's like saying, if the card doesn't get enough energy then it produces more energy?

that's like saying an underfed sportsman will be able to lift more or something

sounds like this totally violates the most fundamental law of this universe lol (conservation of energy)



could an expert explain plz? (in not too complicated terms)
Old 11-07-2018, 03:09 PM   #2
budm
Badcaps Veteran
 
 
Join Date: Feb 2010
City & State: S.F. Bay area
My Country: USA
Line Voltage: 120V 60Hz
I'm a: Knowledge Seeker
Posts: 34,308
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

The switching power supply devices that step the 12V down to the lower regulated voltages that run the GPU will have more current flowing through them when the input voltage is low; those switching devices are what will overheat, because they try to supply the same regulated voltage to the GPU. But most cards should throttle down.
And when the current drawn from the 12V exceeds the PSU's capability, the PSU may go into shutdown.
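To put rough numbers on that (purely illustrative values, assuming a lossless buck converter on the card and a fixed power demand from the GPU core):

Code:
# Hypothetical illustration: input current drawn by an idealized (lossless)
# buck converter that keeps delivering the same power to the GPU core
# while the 12V rail feeding it sags. All numbers are made up.

core_voltage = 1.0     # V, regulated GPU core voltage (assumed)
core_current = 100.0   # A, core current under load (assumed)
output_power = core_voltage * core_current   # 100 W the VRM must deliver

for rail_voltage in (12.0, 11.0, 10.0, 9.0):
    input_current = output_power / rail_voltage   # P = V * I, 100% efficiency
    print(f"rail = {rail_voltage:4.1f} V -> VRM input current = {input_current:5.2f} A")

# rail = 12.0 V -> VRM input current =  8.33 A
# rail =  9.0 V -> VRM input current = 11.11 A
# Same output power at a lower input voltage means more current through the
# card's switching devices, hence more I²R loss and more heat in those parts.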
__________________
Never stop learning
Basic LCD TV and Monitor troubleshooting guides.
http://www.badcaps.net/forum/showthr...956#post305956

Voltage Regulator (LDO) testing:
http://www.badcaps.net/forum/showthr...999#post300999

Inverter testing using old CFL:
http://www.badcaps.net/forum/showthr...er+testing+cfl

Tear down pictures : Hit the ">" Show Albums and stories" on the left side
http://s807.photobucket.com/user/budm/library/

TV Factory reset codes listing:
http://www.badcaps.net/forum/showthread.php?t=24809

Last edited by budm; 11-07-2018 at 03:13 PM..
Old 11-07-2018, 03:22 PM   #3
CapPun
New Member
 
Join Date: Nov 2018
City & State: a yurpean city
My Country: Yurp
I'm a: Knowledge Seeker
Posts: 12
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

Quote:
Originally Posted by budm View Post
The switching power supply devices that step the 12V down to the lower regulated voltages that run the GPU will have more current flowing through them when the input voltage is low; those switching devices are what will overheat, because they try to supply the same regulated voltage to the GPU. But most cards should throttle down.
And when the current drawn from the 12V exceeds the PSU's capability, the PSU may go into shutdown.
so basically lower voltage means higher amperage? (so that the product of the two ie. wattage remains constant?)

and by "switching devices" you mean the VRMs on the graphics card? that is what overheats when 12v voltage is too low?
Old 11-07-2018, 03:56 PM   #4
petehall347
Badcaps Veteran
 
Join Date: Jan 2015
City & State: worcester
My Country: United Kingdom
I'm a: Knowledge Seeker
Posts: 1,571
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

i guess it's a bit like a starter motor burning out when using a bad battery.
Old 11-07-2018, 04:09 PM   #5
CapPun
New Member
 
Join Date: Nov 2018
City & State: a yurpean city
My Country: Yurp
I'm a: Knowledge Seeker
Posts: 12
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

thing is budm mentioned insufficient voltage causing overheat - but my scenario involves insufficient wattage not voltage

for instance suppose the 12V is super stable in voltage & stays at 12V precisely (as in Seasonic-level stable) BUT the wattage (power) on the 12V rails is too low for a graphics card: this will still cause the card to overheat despite the voltage stability right?

if so, why is that?? how can insufficient power (energy over time) cause the card to produce more energy (as heat)?

Last edited by CapPun; 11-07-2018 at 04:11 PM..
Old 11-07-2018, 04:27 PM   #6
sam_sam_sam
Badcaps Veteran
 
Join Date: Jul 2011
City & State: Sunny Jacksonville FL
My Country: USA
Line Voltage: 120 Volts 60 HZ
I'm a: Knowledge Seeker
Posts: 1,145
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

Quote:
Originally Posted by CapPun View Post
thing is budm mentioned insufficient voltage causing overheat - but my scenario involves insufficient wattage not voltage

for instance suppose the 12V is super stable in voltage & stays at 12V precisely (as in Seasonic-level stable) BUT the wattage (power) on the 12V rails is too low for a graphics card: this will still cause the card to overheat despite the voltage stability right?

if so, why is that?? how can insufficient power (energy over time) cause the card to produce more energy (as heat)?
Are you measuring the voltage on the graphics card, or at a hard drive connector? That could make a difference in your voltage measurement.

Also, does your power supply have clean power output, meaning very low ripple and noise? Either value being high could cause problems as well.
__________________
9 PC LCD Monitor
6 LCD Flat Screen TV
30 Desk Top Switching Power Supply
10 Desk Top Power Supply
10 Battery Charger Switching Power Supply for Power Tool
6 18v Lithium Battery Power Boards for Tool Battery Packs
1 XBox 360 Switching Power Supply and M Board
25 Servo Drives 220/460 3 Phase
6 De-soldering Station Switching Power Supply 1 Power Supply
1 Dell Mother Board
15 Computer Power Supply
1 HP Printer Supply & Control Board * lightning finished it *


These two repairs were found with an ESR meter...> Temp at 50°F then at 90°F, the ESR reading changed more than 10%

1 Over Head Crane Current Sensing Board
2 Hem Saw Computer Stack Board

All of these had CAPs POOF
All of the mosfets that were taken out by bad caps

Last edited by sam_sam_sam; 11-07-2018 at 04:33 PM..
Old 11-07-2018, 05:21 PM   #7
CapPun
New Member
 
Join Date: Nov 2018
City & State: a yurpean city
My Country: Yurp
I'm a: Knowledge Seeker
Posts: 12
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

Quote:
Originally Posted by sam_sam_sam View Post
Are you measuring the voltage on the graphics card, or at a hard drive connector? That could make a difference in your voltage measurement.

Also, does your power supply have clean power output, meaning very low ripple and noise? Either value being high could cause problems as well.
mine was a hypothetical question

but let's assume: output is PERFECTLY clean, zero ripple & noise. let's also assume the voltage fed into the graphics card stays at EXACTLY 12V with zero variation
Old 11-07-2018, 05:52 PM   #8
eccerr0r
Solder Sloth
 
 
Join Date: Nov 2012
City & State: CO
My Country: USA
Line Voltage: 120VAC 60Hz
I'm a: Hobbyist Tech
Posts: 3,593
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

Perhaps you have to think of it another way: if the GPU is drawing a lot of power - more than what the PSU is able to supply - then the voltage will go down (due to PSU ESR and, indirectly, the wattage rating) and the GPU/device would naturally be generating more heat.

Otherwise the original premise that lower voltage is causing overheating does not make any sense, assuming that the effective resistance of the GPU stays the same. The heat is power dissipation - and power=volts*current. To generate more heat, voltage needs to go up or current drawn needs to go up.

The only thing that could make sense is the current going up drastically while the voltage tries to stay the same but can't, due to ESR.
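A tiny worked example of that last point (hypothetical values, treating the GPU as a fixed effective resistance):

Code:
# If the effective resistance stays the same, a lower voltage gives LESS
# heat, not more: P = V**2 / R. The resistance value is made up.

R = 1.44   # ohms, effective load resistance (assumed constant)

for V in (12.0, 10.0):
    P = V ** 2 / R
    print(f"V = {V:4.1f} V -> dissipation = {P:5.1f} W")

# V = 12.0 V -> dissipation = 100.0 W
# V = 10.0 V -> dissipation =  69.4 W
# So "less voltage -> more heat" only makes sense if the current drawn
# rises as well, e.g. a regulator compensating as described above.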
Old 11-07-2018, 06:00 PM   #9
budm
Badcaps Veteran
 
 
Join Date: Feb 2010
City & State: S.F. Bay area
My Country: USA
Line Voltage: 120V 60Hz
I'm a: Knowledge Seeker
Posts: 34,308
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

Quote:
Originally Posted by CapPun View Post
thing is budm mentioned insufficient voltage causing overheat - but my scenario involves insufficient wattage not voltage

for instance suppose the 12V is super stable in voltage & stays at 12V precisely (as in Seasonic-level stable) BUT the wattage (power) on the 12V rails is too low for a graphics card: this will still cause the card to overheat despite the voltage stability right?

if so, why is that?? how can insufficient power (energy over time) cause the card to produce more energy (as heat)?
"12V is super stable" Yeah, up to at what load current? 10A?, 100A?
Power supply that can supply 12V with 1A max = 12 watts, 12V power supply that can supply 2A max = 24 Watts. So if the load is trying to draw more current that it can deliver then the output Voltage will drop and then may be to point that the power supply will go into shutdown.

What do you think Watt is? Voltage x Current = Watt (linear load). I think you need to know and understand the relationship between Voltage, current, resistance.
I.E. Supply Voltage is 12V and the load requires regulated 6V at 5A (30 Watts) to run, so the 12V will be fed to switching Buck converter to step it down to be constant 6 and able to supply 5A of current through the load, for the sake of simplification, lest make it zero lost conversion, 100%eff.
So to get 6V 5A (30 Watts) output, the 12V power supply will be drawing 2.5A (30 Watts), if the 12V drops down to 10V, that means the current will go up to 3A to be able to supply the load requirement.
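A quick check of that arithmetic (assuming the same lossless conversion):

Code:
out_power = 6.0 * 5.0        # 6 V at 5 A = 30 W delivered to the load
print(out_power / 12.0)      # 2.5 -> amps drawn from the 12 V supply
print(out_power / 10.0)      # 3.0 -> amps drawn if the supply sags to 10 V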

Last edited by budm; 11-07-2018 at 06:23 PM..
Old 11-07-2018, 06:07 PM   #10
petehall347
Badcaps Veteran
 
Join Date: Jan 2015
City & State: worcester
My Country: United Kingdom
I'm a: Knowledge Seeker
Posts: 1,571
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

was told when i was 11 years old volts are given and amps are taken .
Old 11-07-2018, 07:33 PM   #11
CapPun
New Member
 
Join Date: Nov 2018
City & State: a yurpean city
My Country: Yurp
I'm a: Knowledge Seeker
Posts: 12
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

Quote:
Originally Posted by budm View Post
"12V is super stable" Yeah, up to at what load current? 10A?, 100A?
Power supply that can supply 12V with 1A max = 12 watts, 12V power supply that can supply 2A max = 24 Watts. So if the load is trying to draw more current that it can deliver then the output Voltage will drop and then may be to point that the power supply will go into shutdown.
ok I didn't know that the max wattage of a given rail was related to that rail's voltage stability (that's what you're saying right?)

basically insufficient wattage is due to insufficient voltage which in turn causes a rise in amps thus more joule energy wasted - amirite?
if so that makes more sense now I guess


Quote:
What do you think a Watt is? Voltage x Current = Watts (linear load). I think you need to know and understand the relationship between voltage, current, and resistance.
been a while :/ I remember P=UI and U=RI (and P=RI²)
at least for direct currents (for alternating currents there were also sin/cos functions involved lol)





alrite so now it's been explained how low voltage causes more heat, I've another question - how is it that overvolting processors (CPU's, GPU's, RAM... something those crazy overclockers like to do) also increases heat? shouldn't the extra voltage cause the amperes to drop, thus decreasing heat output?

Quote:
Originally Posted by petehall347 View Post
was told when i was 11 years old volts are given and amps are taken .
what does that mean

Last edited by CapPun; 11-07-2018 at 07:36 PM..
Old 11-07-2018, 08:32 PM   #12
budm
Badcaps Veteran
 
 
Join Date: Feb 2010
City & State: S.F. Bay area
My Country: USA
Line Voltage: 120V 60Hz
I'm a: Knowledge Seeker
Posts: 34,308
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

"basically insufficient wattage is due to insufficient voltage which in turn causes a rise in amps thus more joule energy wasted - amirite?"
Nope. If it is just a simple non-regulated voltage source (I.E. a battery) and a load, there will be no current rise when the battery voltage goes down with a fixed load resistance.
Power supply output Z: https://community.keysight.com/commu...haracteristics
When you deal with a regulated power supply, the switching device in the regulated switching power supply will draw more current to try to maintain the output voltage to the load when the feeding voltage goes lower. A regulated power supply will draw more input current when the voltage source that feeds it drops - otherwise how else could it maintain the output power? It is about power conversion.
Well, it sounds like you still do not understand voltage, current, and power. You need to do more reading.
So let's say you have a 1 Ohm resistor connected to a 1V source - how much current will that be? How much power will that be?
Then you hook up the same 1 Ohm resistor to a 2V source - how much current will that be? How much power will that be?
Do the experiment with a power supply you have; let's say it is rated 12V 1A. Connect a load that will draw 1A and see if the 12V stays at 12V, then try to draw 4A from it and see if it stays at 12V or goes into shutdown.
You do know how to calculate what the resistance should be so it will draw the required current at a known voltage, correct?

Last edited by budm; 11-07-2018 at 08:57 PM..
Old 11-07-2018, 08:44 PM   #13
eccerr0r
Solder Sloth
 
 
Join Date: Nov 2012
City & State: CO
My Country: USA
Line Voltage: 120VAC 60Hz
I'm a: Hobbyist Tech
Posts: 3,593
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

Quote:
Originally Posted by CapPun View Post
basically insufficient wattage is due to insufficient voltage which in turn causes a rise in amps thus more joule energy wasted - amirite?
if so that makes more sense now I guess
No, no, no. Again, the current/wattage demand increase is causing the voltage to drop due to ESR of the PSU, not the voltage drop causing current consumption to rise. Big difference here, and the latter makes absolutely no sense from a conservation of energy standpoint. With a stronger PSU just as much heat will still be generated, except that the GPU will actually work now since the voltage won't sag.

This is also assuming that you're not losing power because the onboard switching regulators are wasting power by not being able to saturate their transistors... which is a different issue. In that case the transistors would be getting hot, not the GPU, which would stay cold because it can't run due to lack of voltage.
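To put numbers on "droop due to ESR" (everything here is hypothetical: the PSU is modeled as an ideal 12 V source plus a small series resistance, and the GPU as a constant-power load):

Code:
import math

# Terminal voltage of a supply modeled as an ideal source V_s in series with
# a resistance R, feeding a constant-power load P. The current through R is
# (V_s - V_t) / R, so P = V_t * (V_s - V_t) / R; solve for V_t (higher root).

def terminal_voltage(v_source, r_series, p_load):
    disc = v_source ** 2 - 4 * r_series * p_load
    if disc < 0:
        return None   # the supply simply cannot deliver this much power
    return (v_source + math.sqrt(disc)) / 2

for r in (0.01, 0.05, 0.10):   # a "stronger" PSU behaves like a lower R
    vt = terminal_voltage(12.0, r, 200.0)
    print(f"R = {r:.2f} ohm -> rail sags to {vt:.2f} V, load current = {200.0 / vt:.1f} A")

# R = 0.01 ohm -> rail sags to 11.83 V, load current = 16.9 A
# R = 0.10 ohm -> rail sags to 10.00 V, load current = 20.0 A
# The load dissipates the same 200 W either way; the weaker supply just sags more.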

Last edited by eccerr0r; 11-07-2018 at 08:48 PM..
Old 11-07-2018, 09:20 PM   #14
CapPun
New Member
 
Join Date: Nov 2018
City & State: a yurpean city
My Country: Yurp
I'm a: Knowledge Seeker
Posts: 12
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

Quote:
Originally Posted by eccerr0r View Post
No, no, no. Again, the current/wattage demand increase is causing the voltage to drop due to ESR of the PSU, not the voltage drop causing current consumption to rise.
what's the ESR? googled it but it only appears as a characteristic of capacitors


Quote:
Originally Posted by budm View Post
When you deal with a regulated power supply, the switching device in the regulated switching power supply will draw more current to try to maintain the output voltage to the load when the feeding voltage goes lower. A regulated power supply will draw more input current when the voltage source that feeds it drops - otherwise how else could it maintain the output power? It is about power conversion.
that's what I meant (I think): if voltage drops then PSU will increase current to compensate to maintain required power correct?


Quote:
Well, it sounds like you still do not understand voltage, current, and power. You need to do more reading.
So let's say you have a 1 Ohm resistor connected to a 1V source - how much current will that be? How much power will that be?
P=UI & U=RI so I=U/R so P=UČ/R so current will be 1A & power will be 1W?

Quote:
Then you hook up the same 1 Ohm resistor to a 2V source - how much current will that be? How much power will that be?
2A & 4W?

Quote:
Do the experiment with a power supply you have; let's say it is rated 12V 1A. Connect a load that will draw 1A and see if the 12V stays at 12V, then try to draw 4A from it and see if it stays at 12V or goes into shutdown.
You do know how to calculate what the resistance should be so it will draw the required current at a known voltage, correct?
resistance should be 12/4=3 ohms? (omega symbol w/e)

but resistance is a fixed amount, it can't change right?
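sanity-checking those answers (assuming ideal sources and a purely resistive load):

Code:
R = 1.0                      # ohms
for V in (1.0, 2.0):
    I = V / R                # Ohm's law: I = V / R
    P = V * I                # P = V * I (same as U**2 / R)
    print(f"{V} V across {R} ohm -> {I} A, {P} W")   # 1 A / 1 W, then 2 A / 4 W

print(12.0 / 4.0)            # 3.0 ohms needed to pull 4 A from a 12 V supply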

Last edited by CapPun; 11-07-2018 at 09:23 PM..
Old 11-07-2018, 09:41 PM   #15
budm
Badcaps Veteran
 
 
Join Date: Feb 2010
City & State: S.F. Bay area
My Country: USA
Line Voltage: 120V 60Hz
I'm a: Knowledge Seeker
Posts: 34,308
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

What is 'U'?
Old 11-07-2018, 10:50 PM   #16
CapPun
New Member
 
Join Date: Nov 2018
City & State: a yurpean city
My Country: Yurp
I'm a: Knowledge Seeker
Posts: 12
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

voltage (teachers always wrote it as U)
I is intensity & P is power
Old 11-07-2018, 11:29 PM   #17
budm
Badcaps Veteran
 
 
Join Date: Feb 2010
City & State: S.F. Bay area
My Country: USA
Line Voltage: 120V 60Hz
I'm a: Knowledge Seeker
Posts: 34,308
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

Europe uses 'U' for voltage, which I have not seen being used in the US.
Old 11-08-2018, 02:27 AM   #18
eccerr0r
Solder Sloth
 
 
Join Date: Nov 2012
City & State: CO
My Country: USA
Line Voltage: 120VAC 60Hz
I'm a: Hobbyist Tech
Posts: 3,593
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

ESR = effective series resistance. It does not have to be capacitors - anything whose ideal characteristic is zero resistance (capacitors, inductors, power supplies, batteries, op amps, etc.) has some kind of output resistance in real devices. I just call this resistance an "effective" series resistance because there's no actual resistor in these devices.

With that resistance, the behavior of the circuit is usually negatively affected - batteries and PSUs will exhibit droop, op amps can't drive anything and still maintain their gain, capacitors and inductors lose energy just by being used...
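One simple way to put a number on it (the measurements below are invented for illustration): load the supply at two different currents, and the slope of the droop is that effective/equivalent series resistance.

Code:
# Estimating a supply's effective series resistance from two load points.
# The voltages and currents here are made-up example measurements.

v_light, i_light = 12.10, 1.0    # volts, amps at a light load
v_heavy, i_heavy = 11.62, 13.0   # volts, amps at a heavy load

r_esr = (v_light - v_heavy) / (i_heavy - i_light)   # droop per extra amp
print(f"effective series resistance ~ {r_esr * 1000:.0f} milliohm")   # ~ 40 milliohm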
Old 11-08-2018, 05:38 AM   #19
petehall347
Badcaps Veteran
 
Join Date: Jan 2015
City & State: worcester
My Country: United Kingdom
I'm a: Knowledge Seeker
Posts: 1,571
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

i always thought ESR = Equivalent series resistance
Old 11-08-2018, 10:20 AM   #20
eccerr0r
Solder Sloth
 
 
Join Date: Nov 2012
City & State: CO
My Country: USA
Line Voltage: 120VAC 60Hz
I'm a: Hobbyist Tech
Posts: 3,593
Default Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

That works too. Either way there's no "real" resistor there - ideally there is no resistor - but due to the design or the real components, there's 'effectively' an 'equivalent' resistor there. It's confusing for PSUs because they have capacitors that contribute to the effective resistance too, and each of them indeed does for the high-frequency component. But there's also the low-frequency/DC component that depends on the transistors, diodes, and inductors/transformers.