*insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

  • CapPun
    Member
    • Nov 2018
    • 37
    • Yurp

    #1

    *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

    so this is a general question - I've read many times that the amperage on the 12V rails should be sufficient to properly feed a graphics card

    makes sense so far

    what makes way less sense & sounds like total magic to me is, if a GPU is underpowered (insufficient wattage on 12V rail) then this causes the card to...overheat?

    how the f is that possible?

    basically that's like saying, if the card doesn't get enough energy then it produces more energy?

    that's like saying an underfed sportsman will be able to lift more or something

    sounds like this totally violates the most fundamental law of this universe lol (conservation of energy)



    could an expert explain plz? (in not too complicated terms)
  • budm
    Badcaps Legend
    • Feb 2010
    • 40746
    • USA

    #2
    Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

    The switching power supply devices on the card that step the 12V down to the lower regulated voltages that run the GPU will have more current flowing through them when their input voltage is low; those are the devices that will be overheating, because they try to supply the same regulated voltage to the GPU. But most cards should throttle down.
    And when the current drawn from the 12V rail exceeds the PSU's capability, the PSU may go into shutdown.
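    The idea above can be put in rough numbers (a minimal sketch; the 150 W load and the 90% converter efficiency are made-up illustrative figures, not from this thread):

```python
# Sketch: a regulated buck converter holds its OUTPUT constant, so when its
# INPUT voltage sags, its input current must rise to keep the power coming.

def buck_input_current(p_out_w, v_in, efficiency=0.90):
    """Input current drawn for a fixed output power at a given input voltage."""
    p_in = p_out_w / efficiency   # converter losses raise the input power
    return p_in / v_in            # I = P / V

gpu_core_power = 150.0  # watts the GPU core needs (hypothetical figure)

for v12 in (12.0, 11.0, 10.0):
    amps = buck_input_current(gpu_core_power, v12)
    print(f"input at {v12:>4.1f} V -> converter draws {amps:.1f} A")
```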
    Last edited by budm; 11-07-2018, 03:13 PM.
    Never stop learning
    Basic LCD TV and Monitor troubleshooting guides.
    http://www.badcaps.net/forum/showthr...956#post305956

    Voltage Regulator (LDO) testing:
    http://www.badcaps.net/forum/showthr...999#post300999

    Inverter testing using old CFL:
    http://www.badcaps.net/forum/showthr...er+testing+cfl

    Tear down pictures : Hit the ">" Show Albums and stories" on the left side
    http://s807.photobucket.com/user/budm/library/

    TV Factory reset codes listing:
    http://www.badcaps.net/forum/showthread.php?t=24809

    Comment

    • CapPun
      Member
      • Nov 2018
      • 37
      • Yurp

      #3
      Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

      Originally posted by budm
      The switching power supply devices on the card that step the 12V down to the lower regulated voltages that run the GPU will have more current flowing through them when their input voltage is low; those are the devices that will be overheating, because they try to supply the same regulated voltage to the GPU. But most cards should throttle down.
      And when the current drawn from the 12V rail exceeds the PSU's capability, the PSU may go into shutdown.
      so basically lower voltage means higher amperage? (so that the product of the two, i.e. wattage, remains constant?)

      and by "switching devices" you mean the VRMs on the graphics card? that is what overheats when the 12V is too low?

      Comment

      • petehall347
        Badcaps Legend
        • Jan 2015
        • 4425
        • United Kingdom

        #4
        Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

        I guess it's a bit like a starter motor burning out when using a bad battery.

        Comment

        • CapPun
          Member
          • Nov 2018
          • 37
          • Yurp

          #5
          Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

          thing is, budm mentioned insufficient voltage causing overheating - but my scenario involves insufficient wattage, not voltage

          for instance suppose the 12V is super stable in voltage & stays at 12V precisely (as in Seasonic-level stable) BUT the wattage (power) on the 12V rails is too low for a graphics card: this will still cause the card to overheat despite the voltage stability right?

          if so, why is that?? how can insufficient power (energy over time) cause the card to produce more energy (as heat)?
          Last edited by CapPun; 11-07-2018, 04:11 PM.

          Comment

          • sam_sam_sam
            Badcaps Legend
            • Jul 2011
            • 6029
            • USA

            #6
            Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

            Originally posted by CapPun
            thing is budm mentioned insufficient voltage causing overheat - but my scenario involves insufficient wattage not voltage

            for instance suppose the 12V is super stable in voltage & stays at 12V precisely (as in Seasonic-level stable) BUT the wattage (power) on the 12V rails is too low for a graphics card: this will still cause the card to overheat despite the voltage stability right?

            if so, why is that?? how can insufficient power (energy over time) cause the card to produce more energy (as heat)?
            Are you measuring the voltage on the graphics card, or at a hard drive connector? That could make a difference in your voltage measurement.

            Also, does your power supply have clean power output, meaning very low ripple and noise? Either value being high could cause problems as well.
            Last edited by sam_sam_sam; 11-07-2018, 04:33 PM.

            Comment

            • CapPun
              Member
              • Nov 2018
              • 37
              • Yurp

              #7
              Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

              Originally posted by sam_sam_sam
              Are you measuring the voltage on the graphics card, or at a hard drive connector? That could make a difference in your voltage measurement.

              Also, does your power supply have clean power output, meaning very low ripple and noise? Either value being high could cause problems as well.
              mine was a hypothetical question

              but let's assume: output is PERFECTLY clean, zero ripple & noise. let's also assume the voltage fed into the graphics card stays at EXACTLY 12V with zero variation

              Comment

              • eccerr0r
                Solder Sloth
                • Nov 2012
                • 8682
                • USA

                #8
                Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                Perhaps you have to think about it another way: if the GPU is drawing a lot of power - more than what the PSU is able to supply - then the voltage will go down (due to the PSU's ESR, and indirectly its wattage rating), and the GPU/device would be generating more heat naturally.

                Otherwise the original premise, that lower voltage causes overheating, does not make any sense, assuming that the effective resistance of the GPU stays the same. The heat is power dissipation - and power = volts * current. To generate more heat, the voltage needs to go up or the current drawn needs to go up.

                The only thing that could make sense is the current going up drastically while the voltage tries to stay the same but can't, due to ESR.
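                That sag can be sketched in a couple of lines (the 0.05-ohm ESR is an invented illustrative value):

```python
# Sketch: model the PSU as an ideal 12 V source in series with a small
# effective series resistance (ESR).  More load current -> more sag.

def rail_voltage(v_nominal, i_load, esr_ohms):
    return v_nominal - i_load * esr_ohms   # V_out = V_nom - I * ESR

for amps in (5, 15, 30):
    print(f"{amps:>2} A load -> rail sags to {rail_voltage(12.0, amps, 0.05):.2f} V")
```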

                Comment

                • budm
                  Badcaps Legend
                  • Feb 2010
                  • 40746
                  • USA

                  #9
                  Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                  Originally posted by CapPun
                  thing is budm mentioned insufficient voltage causing overheat - but my scenario involves insufficient wattage not voltage

                  for instance suppose the 12V is super stable in voltage & stays at 12V precisely (as in Seasonic-level stable) BUT the wattage (power) on the 12V rails is too low for a graphics card: this will still cause the card to overheat despite the voltage stability right?

                  if so, why is that?? how can insufficient power (energy over time) cause the card to produce more energy (as heat)?
                  "12V is super stable" Yeah, up to at what load current? 10A?, 100A?
                  Power supply that can supply 12V with 1A max = 12 watts, 12V power supply that can supply 2A max = 24 Watts. So if the load is trying to draw more current that it can deliver then the output Voltage will drop and then may be to point that the power supply will go into shutdown.

                  What do you think Watt is? Voltage x Current = Watt (linear load). I think you need to know and understand the relationship between Voltage, current, resistance.
                  I.E. Supply Voltage is 12V and the load requires regulated 6V at 5A (30 Watts) to run, so the 12V will be fed to switching Buck converter to step it down to be constant 6 and able to supply 5A of current through the load, for the sake of simplification, lest make it zero lost conversion, 100%eff.
                  So to get 6V 5A (30 Watts) output, the 12V power supply will be drawing 2.5A (30 Watts), if the 12V drops down to 10V, that means the current will go up to 3A to be able to supply the load requirement.
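                  The worked example above can be checked in a few lines (assuming, as stated, a lossless converter):

```python
# budm's worked example: the converter must deliver 6 V x 5 A = 30 W,
# so its input current depends on what the 12 V rail actually measures.

p_load = 6.0 * 5.0            # 30 W required by the load

i_at_12v = p_load / 12.0      # healthy rail
i_at_10v = p_load / 10.0      # sagging rail

print(i_at_12v, i_at_10v)     # 2.5 3.0
```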
                  Last edited by budm; 11-07-2018, 06:23 PM.

                  Comment

                  • petehall347
                    Badcaps Legend
                    • Jan 2015
                    • 4425
                    • United Kingdom

                    #10
                    Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                    I was told when I was 11 years old: volts are given and amps are taken.

                    Comment

                    • CapPun
                      Member
                      • Nov 2018
                      • 37
                      • Yurp

                      #11
                      Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                      Originally posted by budm
                      "12V is super stable" Yeah, up to at what load current? 10A?, 100A?
                      Power supply that can supply 12V with 1A max = 12 watts, 12V power supply that can supply 2A max = 24 Watts. So if the load is trying to draw more current that it can deliver then the output Voltage will drop and then may be to point that the power supply will go into shutdown.
                      ok, I didn't know that the max wattage of a given rail was related to that rail's voltage stability (that's what you're saying, right?)

                      basically insufficient wattage is due to insufficient voltage, which in turn causes a rise in amps, thus more joule energy wasted - amirite?
                      if so, that makes more sense now I guess


                      What do you think Watt is? Voltage x Current = Watt (linear load). I think you need to know and understand the relationship between Voltage, current, resistance.
                      been a while :/ I remember P=UI and U=RI (and P=RI²)
                      at least for direct current (for alternating current there were also sin/cos functions involved lol)





                      alrite, so now that it's been explained how low voltage causes more heat, I've another question: how is it that overvolting processors (CPUs, GPUs, RAM... something them crazy overclockers like to do) also increases heat? shouldn't the extra voltage cause the amperes to drop and thus decrease heat output?

                      Originally posted by petehall347
                      I was told when I was 11 years old: volts are given and amps are taken.
                      what does that mean
                      Last edited by CapPun; 11-07-2018, 07:36 PM.

                      Comment

                      • budm
                        Badcaps Legend
                        • Feb 2010
                        • 40746
                        • USA

                        #12
                        Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                        "basically insufficient wattage is due to insufficient voltage which in turn causes a rise in amps thus more joule energy wasted - amirite?"
                        Nope. If it is just a simple non-regulated voltage source (I.E. a battery) and a load, there will be no current rise when the battery voltage goes down with a fixed load resistance.
                        Power supply output Z: https://community.keysight.com/commu...haracteristics
                        When you deal with a regulated power supply, the switching device in the regulated switching power supply will draw more current to try to maintain the output voltage to the load when the feeding voltage goes lower. A regulated power supply will draw more input current when the voltage source that feeds it drops; otherwise how else could it maintain the output power? It is about power conversion.
                        Well, it sounds like you still do not understand voltage, current, and power. You need to do more reading.
                        So let's say you have a 1 Ohm resistor connected to a 1V source: how much current will that be? How much power will that be?
                        Then you hook up the same 1 Ohm resistor to a 2V source: how much current will that be? How much power will that be?
                        Do the experiment with a power supply you have. Let's say it is rated 12V 1A: connect a load that will draw 1A and see if the 12V stays at 12V, then try to draw 4A from it and see if it stays at 12V or goes into shutdown.
                        You do know how to calculate what the resistance should be so it will draw the required current at a known voltage, correct?
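                        The exercises above can be worked with Ohm's law and the power law directly:

```python
# The resistor exercises, using I = V/R and P = V*I (= V^2/R):

def current(v, r):
    return v / r

def power(v, r):
    return v * v / r

print(current(1, 1), power(1, 1))   # 1 ohm on 1 V: 1.0 A, 1.0 W
print(current(2, 1), power(2, 1))   # same resistor on 2 V: 2.0 A, 4.0 W
print(12 / 4)                       # R needed to draw 4 A at 12 V: 3.0 ohm
```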
                        Last edited by budm; 11-07-2018, 08:57 PM.

                        Comment

                        • eccerr0r
                          Solder Sloth
                          • Nov 2012
                          • 8682
                          • USA

                          #13
                          Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                          Originally posted by CapPun
                          basically insufficient wattage is due to insufficient voltage which in turn causes a rise in amps thus more joule energy wasted - amirite?
                          if so that makes more sense now I guess
                          No, no, no. Again, the increase in current/wattage demand is causing the voltage to drop due to the ESR of the PSU, not the voltage drop causing current consumption to rise. Big difference here, and the latter makes absolutely no sense from a conservation of energy standpoint. A stronger PSU will still generate as much heat, except that the GPU will actually work now, since the voltages won't sag.

                          This is also assuming that you're not losing power to onboard switching regulators wasting power by not being able to saturate their transistors... which is a different issue. The transistors would be getting hot, not the GPU, which would stay cold because it can't run due to lack of voltage.
                          Last edited by eccerr0r; 11-07-2018, 08:48 PM.

                          Comment

                          • CapPun
                            Member
                            • Nov 2018
                            • 37
                            • Yurp

                            #14
                            Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                            Originally posted by eccerr0r
                            No, no, no. Again, the current/wattage demand increase is causing the voltage to drop due to ESR of the PSU, not the voltage drop causing current consumption to rise.
                            what's the ESR? googled it but it only appears as a characteristic of capacitors


                            Originally posted by budm
                            When you deal with a regulated power supply, the switching device in the regulated switching power supply will draw more current to try to maintain the output voltage to the load when the feeding voltage goes lower. A regulated power supply will draw more input current when the voltage source that feeds it drops; otherwise how else could it maintain the output power? It is about power conversion.
                            that's what I meant (I think): if voltage drops then PSU will increase current to compensate to maintain required power correct?


                            Well, it sounds like you still do not understand voltage, current, and power. You need to do more reading.
                            So let's say you have a 1 Ohm resistor connected to a 1V source: how much current will that be? How much power will that be?
                            P=UI & U=RI so I=U/R so P=U²/R so current will be 1A & power will be 1W?

                            Then you hook up the same 1 Ohm resistor to the 2V source, so how much current will that be? how much power will that be?
                            2A & 4W?

                            Do the experiment with power supply you have, let say it has 12V 1 A , so you connect load that will draw 1A and see if the 12v will stay at 12V, then try to draw 4A from it and see if it will stay at 12V or it may go into shutdown.
                            You do know how to calculate what the resistance should be so it will draw the required current with known Voltage, correct?
                            resistance should be 12/4 = 3 ohms? (omega symbol, w/e)

                            but resistance is a fixed amount, it can't change, right?
                            Last edited by CapPun; 11-07-2018, 09:23 PM.

                            Comment

                            • budm
                              Badcaps Legend
                              • Feb 2010
                              • 40746
                              • USA

                              #15
                              Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                              What is 'U'?

                              Comment

                              • CapPun
                                Member
                                • Nov 2018
                                • 37
                                • Yurp

                                #16
                                Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                                voltage (teachers always wrote it as U)
                                I is intensity & P is power

                                Comment

                                • budm
                                  Badcaps Legend
                                  • Feb 2010
                                  • 40746
                                  • USA

                                  #17
                                  Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                                  Europe uses 'U' for voltage, which I have not seen being used in the US.

                                  Comment

                                  • eccerr0r
                                    Solder Sloth
                                    • Nov 2012
                                    • 8682
                                    • USA

                                    #18
                                    Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                                    ESR = effective series resistance. It does not have to be capacitors - anything that ideally would have zero resistance (capacitors, inductors, power supplies, batteries, op amps, etc.) has some kind of output resistance as a real device. I just call this resistance an "effective" series resistance because there's no actual resistor in these devices.

                                    With that resistance, the behavior of the circuit is usually negatively affected - batteries and PSUs exhibit droop, op amps can't drive anything and still maintain their gain, capacitors and inductors lose energy just by being used...
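                                    The droop described above can be sketched like this (the ESR values are invented for illustration, not measured from any real supply):

```python
# Every real source has some equivalent series resistance,
# so its terminal voltage falls as the load current rises.

def terminal_voltage(v_open, i_load, esr):
    return v_open - i_load * esr

# A "stiff" supply (low ESR) vs a "weak" one (high ESR)
for name, esr in (("stiff PSU", 0.01), ("weak PSU", 0.20)):
    print(f"{name}: 20 A load -> {terminal_voltage(12.0, 20.0, esr):.1f} V")
```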

                                    Comment

                                    • petehall347
                                      Badcaps Legend
                                      • Jan 2015
                                      • 4425
                                      • United Kingdom

                                      #19
                                      Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                                      I always thought ESR = equivalent series resistance

                                      Comment

                                      • eccerr0r
                                        Solder Sloth
                                        • Nov 2012
                                        • 8682
                                        • USA

                                        #20
                                        Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                                        That works too. Either way there's no "real" resistor there, and ideally there would be no resistance, but due to the design or real components, there's 'effectively' an 'equivalent' resistor there. It's confusing for PSUs because they have capacitors that contribute to the effective resistance too, and each of them indeed does for the high-frequency component. But there's also the low-frequency/DC component that depends on the transistors, diodes, and inductors/transformers.

                                        Comment
