so this is a general question - I've read many times that the amperage on the 12V rail(s) needs to be sufficient to properly feed a graphics card
makes sense so far
what makes way less sense & sounds like total magic to me is this: if a GPU is underpowered (insufficient wattage on the 12V rail), that supposedly causes the card to... overheat?
how the heck is that possible?
basically that's like saying that if the card doesn't get enough energy, it produces more energy?
that's like saying an underfed athlete will be able to lift more or something
sounds like this totally violates the most fundamental law of this universe lol (conservation of energy)
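to make my confusion concrete, here's the back-of-the-envelope math I'm running (numbers made up purely for illustration): the heat a card puts out can't exceed the power it takes in, so

P_heat <= P_in = V * I, e.g. 12 V * 20 A = 240 W

so if the rail only delivers 240 W instead of the 300 W the card wants, I'd naively expect at most 240 W of heat - i.e. less heat, not more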
could an expert please explain? (in not-too-complicated terms)