
    *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

    so this is a general question - I've read many times that the amperage on the 12V rails should be sufficient to properly feed a graphics card

    makes sense so far

    what makes way less sense & sounds like total magic to me is this: if a GPU is underpowered (insufficient wattage on the 12V rail), then that causes the card to... overheat?

    how the heck is that possible?

    basically that's like saying, if the card doesn't get enough energy then it produces more energy?

    that's like saying an underfed sportsman will be able to lift more or something

    sounds like this totally violates the most fundamental law of this universe lol (conservation of energy)



    could an expert explain plz? (in not too complicated terms)

    #2
    Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

    The switching power supply devices that step the 12V down to the lower regulated voltages that run the GPU will have more current flowing through them when the input voltage is low; those switching devices are what overheat, because they keep trying to supply the same regulated voltage to the GPU. But most cards should throttle down.
    And when the current drawn from 12V exceeds the PSU's capability, the PSU may go into shutdown.
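
    A minimal sketch of that idea (all numbers below - load wattage, efficiency, switch resistance - are made-up assumptions purely for illustration): for a fixed regulated output power, the input current, and with it the I^2*R heating in the switching devices, rises as the input voltage sags.

    # Hypothetical buck-converter input stage feeding a fixed GPU load.
    # Every value here is an illustrative assumption, not a measurement.
    P_OUT = 150.0       # regulated output power demanded by the GPU, watts
    EFFICIENCY = 0.95   # assumed converter efficiency (treated as constant)
    R_SWITCH = 0.01     # assumed effective resistance of the switching path, ohms

    for v_in in (12.0, 11.0, 10.0, 9.0):
        p_in = P_OUT / EFFICIENCY        # input power needed to deliver P_OUT
        i_in = p_in / v_in               # input current rises as v_in falls
        loss = i_in ** 2 * R_SWITCH      # simplified I^2*R heating in the switches
        print(f"Vin={v_in:4.1f} V  Iin={i_in:5.2f} A  switch heating={loss:4.2f} W")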
    Last edited by budm; 11-07-2018, 03:13 PM.
    Never stop learning
    Basic LCD TV and Monitor troubleshooting guides.
    http://www.badcaps.net/forum/showthr...956#post305956

    Voltage Regulator (LDO) testing:
    http://www.badcaps.net/forum/showthr...999#post300999

    Inverter testing using old CFL:
    http://www.badcaps.net/forum/showthr...er+testing+cfl

    Tear down pictures : Hit the ">" Show Albums and stories" on the left side
    http://s807.photobucket.com/user/budm/library/

    TV Factory reset codes listing:
    http://www.badcaps.net/forum/showthread.php?t=24809



      #3
      Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

      Originally posted by budm View Post
      The switching power supply devices that step the 12V down to the lower regulated voltages that run the GPU will have more current flowing through them when the input voltage is low; those switching devices are what overheat, because they keep trying to supply the same regulated voltage to the GPU. But most cards should throttle down.
      And when the current drawn from 12V exceeds the PSU's capability, the PSU may go into shutdown.
      so basically lower voltage means higher amperage? (so that the product of the two, i.e. wattage, remains constant?)

      and by "switching devices" you mean the VRMs on the graphics card? that is what overheats when the 12V voltage is too low?



        #4
        Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

        i guess it's a bit like a starter motor burning out when using a bad battery.



          #5
          Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

          thing is budm mentioned insufficient voltage causing overheat - but my scenario involves insufficient wattage not voltage

          for instance suppose the 12V is super stable in voltage & stays at 12V precisely (as in Seasonic-level stable) BUT the wattage (power) on the 12V rails is too low for a graphics card: this will still cause the card to overheat despite the voltage stability right?

          if so, why is that?? how can insufficient power (energy over time) cause the card to produce more energy (as heat)?
          Last edited by CapPun; 11-07-2018, 04:11 PM.



            #6
            Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

            Originally posted by CapPun View Post
            thing is budm mentioned insufficient voltage causing overheat - but my scenario involves insufficient wattage not voltage

            for instance suppose the 12V is super stable in voltage & stays at 12V precisely (as in Seasonic-level stable) BUT the wattage (power) on the 12V rails is too low for a graphics card: this will still cause the card to overheat despite the voltage stability right?

            if so, why is that?? how can insufficient power (energy over time) cause the card to produce more energy (as heat)?
            Are you measuring the voltage on the graphics card, or at a hard drive connector? That could make a difference in your voltage measurement.

            Also, does your power supply have clean power output, meaning very low ripple and noise?
            Either value being high could cause problems as well.
            Last edited by sam_sam_sam; 11-07-2018, 04:33 PM.


              #7
              Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

              Originally posted by sam_sam_sam View Post
              Are you measuring the voltage on the graphics card, or at a hard drive connector? That could make a difference in your voltage measurement.

              Also, does your power supply have clean power output, meaning very low ripple and noise?
              Either value being high could cause problems as well.
              mine was a hypothetical question

              but let's assume the output is PERFECTLY clean, zero ripple & noise. let's also assume the voltage fed into the graphics card stays at EXACTLY 12V with zero variation



                #8
                Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                Perhaps you have to think of it another way: if the GPU is drawing a lot of power - more than what the PSU is able to supply - then the voltage will go down (due to the PSU's ESR and, indirectly, its wattage rating) and the GPU/device would naturally be generating more heat.

                Otherwise the original premise, that lower voltage is causing the overheating, does not make any sense, assuming that the effective resistance of the GPU stays the same. The heat is power dissipation - and power = volts * current. To generate more heat, the voltage needs to go up or the current drawn needs to go up.

                The only thing that could make sense is the current going up drastically while the voltage tries to stay the same but can't due to ESR.
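
                A minimal sketch of that droop, treating the PSU as an ideal 12V source behind a small effective series resistance (both numbers below are assumptions for illustration only):

                # The delivered voltage sags as the load current climbs past what the
                # supply was sized for; the resistance value here is purely illustrative.
                V_IDEAL = 12.0   # nominal rail voltage, volts
                R_OUT = 0.05     # assumed effective series resistance of the PSU, ohms

                for i_load in (5, 10, 20, 30):              # load current, amps
                    v_delivered = V_IDEAL - i_load * R_OUT  # droop grows with current
                    print(f"I = {i_load:2d} A -> voltage at the card = {v_delivered:.2f} V")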



                  #9
                  Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                  Originally posted by CapPun View Post
                  thing is budm mentioned insufficient voltage causing overheat - but my scenario involves insufficient wattage not voltage

                  for instance suppose the 12V is super stable in voltage & stays at 12V precisely (as in Seasonic-level stable) BUT the wattage (power) on the 12V rails is too low for a graphics card: this will still cause the card to overheat despite the voltage stability right?

                  if so, why is that?? how can insufficient power (energy over time) cause the card to produce more energy (as heat)?
                  "12V is super stable" Yeah, up to at what load current? 10A?, 100A?
                  A power supply that can deliver 12V at 1A max = 12 watts; a 12V power supply that can deliver 2A max = 24 watts. So if the load tries to draw more current than the supply can deliver, the output voltage will drop, possibly to the point that the power supply goes into shutdown.

                  What do you think a watt is? Voltage x current = watts (linear load). I think you need to know and understand the relationship between voltage, current, and resistance.
                  E.g. the supply voltage is 12V and the load requires a regulated 6V at 5A (30 watts) to run, so the 12V is fed to a switching buck converter that steps it down to a constant 6V capable of supplying 5A through the load; for the sake of simplification, let's make it a zero-loss conversion, 100% efficient.
                  So to get the 6V 5A (30 watts) output, the 12V power supply will be delivering 2.5A (30 watts); if the 12V drops down to 10V, the current drawn will go up to 3A to meet the same load requirement.
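
                  A quick numeric check of that worked example (same zero-loss assumption as in the post):

                  # budm's example: a 6V/5A (30 W) load fed through a lossless buck converter.
                  P_LOAD = 6.0 * 5.0                  # 30 W demanded by the load
                  for v_in in (12.0, 10.0):
                      i_in = P_LOAD / v_in            # input current at 100% efficiency
                      print(f"Vin = {v_in:.0f} V -> input current = {i_in:.2f} A")
                  # Prints 2.50 A at 12 V and 3.00 A at 10 V, matching the figures above.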
                  Last edited by budm; 11-07-2018, 06:23 PM.


                    #10
                    Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                    was told when i was 11 years old volts are given and amps are taken .



                      #11
                      Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                      Originally posted by budm View Post
                      "12V is super stable" Yeah, up to at what load current? 10A?, 100A?
                      A power supply that can deliver 12V at 1A max = 12 watts; a 12V power supply that can deliver 2A max = 24 watts. So if the load tries to draw more current than the supply can deliver, the output voltage will drop, possibly to the point that the power supply goes into shutdown.
                      ok I didn't know that the max wattage of a given rail was related to that rail's voltage stability (that's what you're saying, right?)

                      basically insufficient wattage is due to insufficient voltage which in turn causes a rise in amps thus more joule energy wasted - amirite?
                      if so that makes more sense now I guess


                      What do you think a watt is? Voltage x current = watts (linear load). I think you need to know and understand the relationship between voltage, current, and resistance.
                      been a while :/ I remember P=UI and U=RI (and P=RI²)
                      at least for direct currents (for alternating currents there were also sin/cos functions involved lol)





                      alright, so now that it's been explained how low voltage causes more heat, I've another question - how is it that overvolting processors (CPUs, GPUs, RAM... something those crazy overclockers like to do) also increases heat? shouldn't the extra voltage cause the amperes to drop and thus decrease heat output?

                      Originally posted by petehall347 View Post
                      was told when i was 11 years old volts are given and amps are taken .
                      what does that mean
                      Last edited by CapPun; 11-07-2018, 07:36 PM.



                        #12
                        Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                        "basically insufficient wattage is due to insufficient voltage which in turn causes a rise in amps thus more joule energy wasted - amirite?"
                        Nope. If it is just a simple non-regulated voltage source (e.g. a battery) and a load, there will be no current rise when the battery voltage goes down with a fixed load resistance.
                        Power supply output Z: https://community.keysight.com/commu...haracteristics
                        When you deal with a regulated power supply, the switching devices in it will try to draw more current to maintain the output voltage to the load when the feeding voltage goes lower. A regulated power supply will draw more input current when the voltage source that feeds it sags - otherwise how else could it maintain the output power? It is about power conversion.
                        Well, it sounds like you still do not understand voltage, current, and power. You need to do more reading.
                        So let's say you have a 1 ohm resistor connected to a 1V source - how much current will that be? How much power will that be?
                        Then you hook up the same 1 ohm resistor to a 2V source - how much current will that be? How much power will that be?
                        Do the experiment with a power supply you have. Let's say it is rated 12V 1A: connect a load that draws 1A and see if the 12V stays at 12V, then try to draw 4A from it and see whether it stays at 12V or goes into shutdown.
                        You do know how to calculate what the resistance should be so that it draws the required current at a known voltage, correct?
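
                        A minimal sketch of those exercises, using nothing but Ohm's law (the numbers come straight from the post; the helper function is just for illustration):

                        # Ohm's law exercises: current and power for a fixed 1 ohm resistor at
                        # two source voltages, plus the resistance needed for 4 A at 12 V.
                        def current_and_power(v, r):
                            i = v / r           # I = V / R
                            p = v * i           # P = V * I
                            return i, p

                        for v in (1.0, 2.0):
                            i, p = current_and_power(v, r=1.0)
                            print(f"{v:.0f} V across 1 ohm -> {i:.0f} A, {p:.0f} W")

                        print(f"R for 4 A at 12 V = {12.0 / 4.0:.0f} ohms")   # 3 ohms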
                        Last edited by budm; 11-07-2018, 08:57 PM.


                          #13
                          Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                          Originally posted by CapPun View Post
                          basically insufficient wattage is due to insufficient voltage which in turn causes a rise in amps thus more joule energy wasted - amirite?
                          if so that makes more sense now I guess
                          No, no, no. Again, the current/wattage demand increase is causing the voltage to drop due to ESR of the PSU, not the voltage drop causing current consumption to rise. Big difference here, and the latter makes absolutely no sense from a conservation of energy standpoint. A stronger PSU will still generate as much heat, except that the GPU will actually work now since the voltages won't sag.

                          This also assumes you're not losing power in the onboard switching regulators because they can't saturate their transistors... which is a different issue. In that case the transistors would be getting hot, not the GPU, which would stay cold because it can't run due to lack of voltage.
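
                          A minimal sketch contrasting the two cases being discussed (the resistance and wattage values are assumptions for illustration): a plain fixed resistance dissipates less power when the voltage drops, while a regulated constant-power load pulls more current instead.

                          # Case 1: fixed resistance - lower voltage means LESS heat, not more.
                          R_FIXED = 1.2                      # assumed load resistance, ohms
                          for v in (12.0, 10.0):
                              print(f"fixed R at {v:.0f} V: P = {v * v / R_FIXED:.0f} W")

                          # Case 2: regulated constant-power load - lower input voltage means MORE current.
                          P_CONST = 120.0                    # assumed regulated load power, watts
                          for v in (12.0, 10.0):
                              print(f"constant P at {v:.0f} V: I = {P_CONST / v:.1f} A")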
                          Last edited by eccerr0r; 11-07-2018, 08:48 PM.



                            #14
                            Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                            Originally posted by eccerr0r View Post
                            No, no, no. Again, the current/wattage demand increase is causing the voltage to drop due to ESR of the PSU, not the voltage drop causing current consumption to rise.
                            what's ESR? I googled it but it only comes up as a characteristic of capacitors


                            Originally posted by budm View Post
                            When you deal with a regulated power supply, the switching devices in it will try to draw more current to maintain the output voltage to the load when the feeding voltage goes lower. A regulated power supply will draw more input current when the voltage source that feeds it sags - otherwise how else could it maintain the output power? It is about power conversion.
                            that's what I meant (I think): if the voltage drops then the PSU will increase the current to compensate and maintain the required power, correct?


                            Well, it sounds like you still do not understand voltage, current, and power. You need to do more reading.
                            So let's say you have a 1 ohm resistor connected to a 1V source - how much current will that be? How much power will that be?
                            P=UI & U=RI so I=U/R so P=U²/R so current will be 1A & power will be 1W?

                            Then you hook up the same 1 ohm resistor to a 2V source - how much current will that be? How much power will that be?
                            2A & 4W?

                            Do the experiment with a power supply you have. Let's say it is rated 12V 1A: connect a load that draws 1A and see if the 12V stays at 12V, then try to draw 4A from it and see whether it stays at 12V or goes into shutdown.
                            You do know how to calculate what the resistance should be so that it draws the required current at a known voltage, correct?
                            resistance should be 12/4=3 ohms? (omega symbol w/e)

                            but resistance is a fixed amount, it can't change, right?
                            Last edited by CapPun; 11-07-2018, 09:23 PM.



                              #15
                              Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                              What is 'U'?


                                #16
                                Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                                voltage (teachers always wrote it as U)
                                I is intensity & P is power



                                  #17
                                  Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                                   Europe uses 'U' for voltage, which I have not seen being used in the US.


                                    #18
                                    Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                                     ESR = effective series resistance. It does not have to be capacitors - anything whose ideal characteristic is zero resistance (capacitors, inductors, power supplies, batteries, op amps, etc.) has some kind of output resistance in the real device. I just call this resistance an "effective" series resistance because there's no actual resistor in these devices.

                                     With that resistance, the behavior of the circuit is usually negatively affected - batteries and PSUs exhibit droop, op amps can't drive loads and still maintain their gain, capacitors and inductors lose energy just by being used...



                                      #19
                                      Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                                      i always thought ESR = Equivalent series resistance



                                        #20
                                        Re: *insufficient* wattage for GPU causes GPU to *overheat* ? is this magic??

                                         that works too. Either way there's no "real" resistor there - ideally there is no resistor, but due to the design or the real components there's 'effectively' an 'equivalent' resistor there. It's confusing for PSUs because they also have capacitors that contribute to the effective resistance, and each of them indeed does for the high-frequency component. But there's also the low-frequency/DC component, which depends on the transistors, diodes, and inductors/transformers.

