Badcaps.net Forum
Old 11-15-2016, 02:22 PM   #1
hikaruichijo
Senior Member
 
Join Date: Aug 2015
City & State: Ourense
My Country: Spain
Line Voltage: 230V 60Hz
I'm a: Knowledge Seeker
Posts: 118
Default Undervolt GeForce 8800 GTS 320

Hi, I've been looking on the net for a way to undervolt this card, with no luck. The BIOS only allows setting the V-core to 1.1 V, but I measure 1.27 V on the core and 1.89 V on V_mem (thanks to momaka for telling me how to measure them). All I can find on the net is how to overvolt, but it seems already factory-overvolted, because with the stock cooler it overheats even while idle at the desktop.
Is it possible to undervolt it without an engineering degree?
Attached Images
File Type: jpg PB110289.JPG (1.27 MB, 21 views)
File Type: jpg PB150293.JPG (1.21 MB, 17 views)
File Type: jpg vgpumod.JPG (105.9 KB, 17 views)
File Type: jpg caps.jpg (122.7 KB, 19 views)
File Type: jpg 8800gts ocp mod !!!.JPG (115.8 KB, 18 views)
hikaruichijo is offline   Reply With Quote
Old 11-15-2016, 03:17 PM   #2
Per Hansson
Super Moderator
 
Per Hansson's Avatar
 
Join Date: Jul 2005
City & State: ----
My Country: Sweden
Line Voltage: 230v 50Hz
I'm a: Knowledge Seeker
Posts: 4,169
Default Re: Undervolt GeForce 8800 GTS 320

I tried to ID the voltage regulator and find a datasheet with a pinout, but failed.
Yours is a Primarion PX3540BDSG unless I'm mistaken.
If so, it's very similar to my 8800 GTS 512 MB, which had a PX3544.
What you need to do is identify the VSense pin (also called Feedback).
Instead of leaking some of the signal to ground, as you would to increase Vcore, you need to inject some voltage so the chip will think the GPU is getting too much.
It's also possible that this chip can be controlled by VID signals, but that would require a schematic to understand properly.

I found the following picture on a Russian site.
So what you need to do, instead of connecting a VR to ground like they do, is connect it to a higher-voltage point.
Attached Images
File Type: jpg 16_8800gts_vgpu_mod.jpg (68.2 KB, 28 views)
__________________
"The one who says it cannot be done should never interrupt the one who is doing it."
Per Hansson is offline   Reply With Quote
Old 11-16-2016, 01:38 PM   #3
hikaruichijo
Senior Member
 
Join Date: Aug 2015
City & State: Ourense
My Country: Spain
Line Voltage: 230V 60Hz
I'm a: Knowledge Seeker
Posts: 118
Default Re: Undervolt GeForce 8800 GTS 320

Great, it is that controller, you're not mistaken. I don't understand a word of Russian, but that link looks very useful.
I'm going to buy a 1 kOhm trimmer; that's all I need, right?
If the resistance increases, V_gpu will increase, and if the resistance decreases, V_gpu will decrease; is that correct?

Last edited by hikaruichijo; 11-16-2016 at 01:41 PM..
hikaruichijo is offline   Reply With Quote
Old 11-17-2016, 12:32 PM   #4
Per Hansson
Super Moderator
 
Per Hansson's Avatar
 
Join Date: Jul 2005
City & State: ----
My Country: Sweden
Line Voltage: 230v 50Hz
I'm a: Knowledge Seeker
Posts: 4,169
Default Re: Undervolt GeForce 8800 GTS 320

No, that mod is only for increasing Vcore.
As I explained, that's done by leaking some of the Vsense voltage to ground via the pot.
You need to do the opposite, i.e. introduce some voltage onto the Vsense pin via a pot.
Per Hansson is offline   Reply With Quote
Old 11-18-2016, 03:56 AM   #5
hikaruichijo
Senior Member
 
Join Date: Aug 2015
City & State: Ourense
My Country: Spain
Line Voltage: 230V 60Hz
I'm a: Knowledge Seeker
Posts: 118
Default Re: Undervolt GeForce 8800 GTS 320

Yes, I understand that. I'm planning to connect the RAM positive rail to Vsense through the trimmer, because the voltage difference between V_core and V_ram is around the amount of undervolt I'm planning for V_core, and if V_ram decreases a bit, all the better.
Is that crazy?
hikaruichijo is offline   Reply With Quote
Old 11-18-2016, 11:07 AM   #6
momaka
Badcaps Veteran
 
momaka's Avatar
 
Join Date: May 2008
City & State: VA (NoVA)
My Country: U.S.A.
Line Voltage: 120 VAC, 60 Hz
I'm a: Hobbyist Tech
Posts: 8,473
Default Re: Undervolt GeForce 8800 GTS 320

Depending on the RAM chips on your board, you might want to leave V_RAM alone. Some Qimonda/Infineon chips from that era had stability issues and needed a higher V_RAM to run, while Samsung and Hynix chips were good and never had that problem. Most of the time you can find a datasheet for the RAM chips on the internet and see what they require; going outside the specs given in the datasheet is not recommended.

As for undervolting the GPU V_core, I will have a look at my 8800 GTS and post back what I find. I don't mind doing some experiments for you so that you don't fry yours. Mine is already dead as a doornail.
momaka is offline   Reply With Quote
Old 11-18-2016, 12:33 PM   #7
Per Hansson
Super Moderator
 
Per Hansson's Avatar
 
Join Date: Jul 2005
City & State: ----
My Country: Sweden
Line Voltage: 230v 50Hz
I'm a: Knowledge Seeker
Posts: 4,169
Default Re: Undervolt GeForce 8800 GTS 320

Quote:
Originally Posted by hikaruichijo View Post
Yes, I understand that. I'm planning to connect the RAM positive rail to Vsense through the trimmer, because the voltage difference between V_core and V_ram is around the amount of undervolt I'm planning for V_core, and if V_ram decreases a bit, all the better.
Is that crazy?
It might work.
But as I wrote in my first post, it might be better to find a voltage source on the same phase as the core.
Then again, that might be impossible.
It might be a good idea, then, to put a diode here for safety's sake; you don't want any backfeed going to the RAM...
Per Hansson is offline   Reply With Quote
Old 11-19-2016, 09:22 PM   #8
momaka
Badcaps Veteran
 
momaka's Avatar
 
Join Date: May 2008
City & State: VA (NoVA)
My Country: U.S.A.
Line Voltage: 120 VAC, 60 Hz
I'm a: Hobbyist Tech
Posts: 8,473
Default Re: Undervolt GeForce 8800 GTS 320

I hate starting with a rant, but I just have to say that Primarion datasheets really are a piece of shit. I was able to find one for the PX3540 PWM controller (see end of post for attachment), but just like the PX3544 one that Per linked to above, it has very, very limited information inside. There aren't even fucking pin numbers on the IC pins, let alone pin descriptions! But I will cut off my rant here.

Anyways, I had to ASSume from the sample circuit (if you can call it that) that pin 1 was the top-left corner and pin 48 the top-right corner. That makes VSENP (the pin that should supposedly allow us to tweak the V_core voltage) pin #18.

Now, I don't know if this is exactly progress, as I wasn't able to successfully undervolt the GPU (I did a few experiments that didn't seem to do anything). But I was at least able to figure out some resistor values in the voltage "feedback" circuit. It doesn't exactly match the sample circuit in the datasheet; in fact, it's not even close.

First off, here is hikaruichijo's image of the back of the card showing the PWM IC area that I annotated:
http://www.badcaps.net/forum/attachm...1&d=1479612106

I put a blue and a red dot next to two small SMD resistors, calling one of them Ra and the other Rb. Resistor Ra is connected between GPU V_core and the VSENP pin (pin 18) of the PX3540 PWM IC. Meanwhile, resistor Rb is connected between VSENP pin and ground. On my 8800 GTS 320 MB video card (branded by eVGA, but it's a reference nVidia design), resistor Ra = 33 Ohms and resistor Rb = 470 Ohms. Here is a simplified circuit that better shows the arrangement:
http://www.badcaps.net/forum/attachm...1&d=1479612106

Now, the first thing that crossed my mind is that this forms a voltage divider, and removing resistor Rb (the 470 Ohm resistor to ground) should bring the voltage up on pin VSENP, which in turn should lower the V_core voltage. So I tried that and... NOTHING. V_core was exactly the same as before (1.318 Volts before the mod and 1.318 V after it). *IF* this were a voltage divider circuit, then V_core should have fallen to about 1.23 Volts... but it did NOT.
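A quick sanity check of that divider arithmetic (a sketch using the measured values above; it assumes the PX3540 regulates VSENP to a fixed internal reference, which the datasheet does not confirm):

```python
# Voltage-divider expectation for the VSENP feedback network.
# Ra sits between V_core and VSENP, Rb between VSENP and ground.
RA, RB = 33.0, 470.0
V_CORE = 1.318                       # measured with Rb in place

# If the controller holds VSENP at a fixed reference, that reference is:
v_ref = V_CORE * RB / (RA + RB)

# With Rb removed, VSENP sees V_core directly, so V_core should fall to v_ref:
print(f"expected V_core without Rb: {v_ref:.3f} V")
```

That works out to about 1.23 V, which is why the unchanged 1.318 V reading suggests this pin is not behaving as a simple divider input.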

So next, with resistor Rb still removed, I tried injecting a higher voltage into pin VSENP (pin #18) through a resistor. Unfortunately, I think I made a miscalculation and used too high a resistance value (7.5 KOhms in series with a 20 KOhm precision multi-turn potentiometer). Thus, this was another unsuccessful attempt to lower the V_core voltage, even with the potentiometer turned all the way down to zero Ohms. But at least nothing got fried. As for the injection voltage, I tried both the 5V and the 12V rails of my PSU.

Now here is why I think this attempt didn't work: if I want a V_core of, say, 1.2 Volts, and *ASSuming* pin 18 of the PX3540 PWM IC sits at 1.32 Volts (who knows, with such a crappy datasheet), then resistor Ra will need to see a drop of:
Va = 1.32 - 1.2 = 0.12 Volts. At 33 Ohms for Ra, that's 3.636 mA passing through it. So the injection resistor feeding the voltage into pin 18 must also supply at least 3.636 mA, but with a 7.5 KOhm resistor, the best I can do is 1.42 mA (from the 12V rail). I think that's why this didn't work. Re-running my calculations, I see that the injection resistance needs to be at most 3 KOhms. (But if anyone tries this, I certainly do NOT recommend going below 470 Ohms, as the power dissipation of the resistor needs to be considered if using the 12V rail.)
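The same arithmetic can be run for both rails (a sketch; the 1.32 V figure for pin 18 is an assumption from the post above, not a datasheet value):

```python
def max_injection_resistance(v_rail, v_pin, v_core_target, r_a):
    """Largest series resistor that can still source the current Ra needs
    in order to drop (v_pin - v_core_target) across itself."""
    i_required = (v_pin - v_core_target) / r_a   # current through Ra, Amps
    return (v_rail - v_pin) / i_required         # Ohm's law on the injector

V_PIN = 1.32    # assumed voltage at VSENP (pin 18)
R_A = 33.0      # measured sense resistor, Ohms
TARGET = 1.20   # desired undervolted V_core

for rail in (5.0, 12.0):
    print(f"{rail:>4.0f} V rail: inject through at most "
          f"{max_injection_resistance(rail, V_PIN, TARGET, R_A):.0f} Ohms")
```

The 12 V rail gives roughly 2.9 kOhms, matching the "at most 3 KOhms" figure; the 7.5 kOhm resistor used in the experiment is well above both limits.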

Also, why did I choose the 5V and/or 12V rail as the injection voltage and not something else? Because the PSU voltage rails need to be up and running (and steady) before the PSU sends the "Power Good" signal to the motherboard. If you use some other voltage rail, one that stays disabled until the motherboard circuitry receives the PG signal and tells it to turn on, you can end up with very different voltages on the GPU V_core.

Quote:
Originally Posted by Per Hansson View Post
It might be a good idea, then, to put a diode here for safety's sake; you don't want any backfeed going to the RAM...
If you are injecting the voltage through a resistor of high enough resistance, you don't have to worry about backfeed at all, because any substantial current draw will cause the voltage to drop across the resistance, and nothing will be damaged.
momaka is offline   Reply With Quote
Old 11-20-2016, 05:52 AM   #9
hikaruichijo
Senior Member
 
Join Date: Aug 2015
City & State: Ourense
My Country: Spain
Line Voltage: 230V 60Hz
I'm a: Knowledge Seeker
Posts: 118
Default Re: Undervolt GeForce 8800 GTS 320

You are a really hard worker, basically doing the job for me; anyway, I can't do what you are able to. I don't have half your knowledge, so thanks a lot.
I've found another method to overvolt, a bit different, that seems to be aimed at controlling vdroop; I'll post the photos.
As far as I understand, the card has a 3-phase design; it's only an idea, but maybe it's possible to do something with those phases.
According to the schematic, VSENP is connected to the output of those phases, so what happens if it were possible to disconnect it from there and give it the voltage we want? Is that possible?
Maybe that's simpler than adding voltage.
Attached Images
File Type: jpg caps.jpg (122.7 KB, 9 views)
File Type: jpg 8800gts ocp mod !!!.JPG (115.8 KB, 12 views)

Last edited by hikaruichijo; 11-20-2016 at 06:04 AM..
hikaruichijo is offline   Reply With Quote
Old 11-20-2016, 09:53 AM   #10
Per Hansson
Super Moderator
 
Per Hansson's Avatar
 
Join Date: Jul 2005
City & State: ----
My Country: Sweden
Line Voltage: 230v 50Hz
I'm a: Knowledge Seeker
Posts: 4,169
Default Re: Undervolt GeForce 8800 GTS 320

Did you mark the wrong pin in the photo, momaka? It's one pin off from what I posted before...
Some more info here: http://www.xtremesystems.org/forums/...=1#post1870010

hikaruichijo: That mod is just to remove the over current protection.
Per Hansson is offline   Reply With Quote
Old 11-20-2016, 11:35 AM   #11
hikaruichijo
Senior Member
 
Join Date: Aug 2015
City & State: Ourense
My Country: Spain
Line Voltage: 230V 60Hz
I'm a: Knowledge Seeker
Posts: 118
Default Re: Undervolt GeForce 8800 GTS 320

Thanks for the correction; I didn't understand it well. Again, beyond my knowledge...
hikaruichijo is offline   Reply With Quote
Old 11-21-2016, 09:00 PM   #12
momaka
Badcaps Veteran
 
momaka's Avatar
 
Join Date: May 2008
City & State: VA (NoVA)
My Country: U.S.A.
Line Voltage: 120 VAC, 60 Hz
I'm a: Hobbyist Tech
Posts: 8,473
Angry Re: Undervolt GeForce 8800 GTS 320

Quote:
Originally Posted by Per Hansson View Post
Did you mark the wrong pin in the photo, momaka? It's one pin off from what I posted before...
Yeah, I noticed that too when I was doing my diagrams. I counted the pins both on my own video card and on the reference image I was using, just to verify. Initially I thought the xtremesystems image was in error, but then I decided to do another check, to see how pin VSENN is connected.

According to my ASSUMPTIONS from the above piece-of-shit datasheet (dataSHIT??), VSENP should be pin 18, so VSENN, being right before it, should be pin 17 and should be connected to a power ground on the output of the VRM.

BUT IT'S NOT!

So then I went to check the ground pin's resistance to ground. Again, using my assumptions about the pin numbering in that PX3540 datashit, the ground pin *should* have been pin 47. I put my multimeter leads on there and... nope... another "surprise, motherfucker!" moment.

So here is a conclusion for now:
ALL OF THE INFORMATION IN MY ABOVE POST IS WRONG. DO NOT USE IT.

That said, I think I'm going to step away from this video card, as I am seriously considering smashing it with a hammer. Sorry, hikaruichijo, but I'll have to leave this for now; this video card and its shitty PX3540 PWM IC really got me pissed. I'll try to work on it another day. I'll probably have to measure the resistance to ground on each pin and then try to make sense of that, to see what matches the datasheet. And then see if that agrees with the xtremesystems diagram.

Oh, and remind me to never buy a fucking chip from Primarion. With a datashit like the one for the PX3540, they really deserve to go out of business.
/rant

Last edited by momaka; 11-21-2016 at 09:05 PM..
momaka is offline   Reply With Quote
Old 11-22-2016, 04:39 PM   #13
hikaruichijo
Senior Member
 
Join Date: Aug 2015
City & State: Ourense
My Country: Spain
Line Voltage: 230V 60Hz
I'm a: Knowledge Seeker
Posts: 118
Default Re: Undervolt GeForce 8800 GTS 320

No need to apologize, momaka, you did a lot of work with no luck; we know that happens sometimes. I'm still looking on the net because, at this point, I'm amazed that so many people care about overvolting and overclocking, but nobody seems to care about efficiency or quality. So many experts in the world, and nobody tried a hardware undervolt until your attempt, or at least I didn't find it; maybe overvolting is easier.
I'm not going to use the card unless I find a solution that doesn't involve putting expensive cooling on a card that is cheaper than the new cooler.
I'm going to ask my boss if he has another of these cards in the trash, and do some crazy things with it. I want to delid it and try to change the TIM between the GPU die and the heat spreader to see if the temperature decreases a bit, or maybe put a big old CPU cooler directly on the GPU die; we have a lot of old CPU coolers lying around.
Well, if I find another card that works...
hikaruichijo is offline   Reply With Quote
Old 11-26-2016, 04:50 PM   #14
momaka
Badcaps Veteran
 
momaka's Avatar
 
Join Date: May 2008
City & State: VA (NoVA)
My Country: U.S.A.
Line Voltage: 120 VAC, 60 Hz
I'm a: Hobbyist Tech
Posts: 8,473
Default Re: Undervolt GeForce 8800 GTS 320

Quote:
Originally Posted by hikaruichijo View Post
I'm still looking on the net because, at this point, I'm amazed that so many people care about overvolting and overclocking, but nobody seems to care about efficiency or quality. So many experts in the world, and nobody tried a hardware undervolt until your attempt
Well, it just occurred to me, have you tried using NiBiTor to edit the 8800 GTS's BIOS? The GPU V_core voltage is controlled by the BIOS, so perhaps this might be a better route than a hardware under-volt mod.

I was looking yesterday at the BIOS of my HD3870 video cards, and it seems that they also have a BIOS-controlled GPU core voltage. And it looks like I can set the default voltage to a very low value, so they might not even need any hardware modding/under-voltage. Plus, the ATI BIOS editor allows me to edit the fan speeds. IIRC, NiBiTor is also capable of that for nVidia cards. This is very useful, because the fan speeds on those high-end cards should be increased to keep the cards cool. On my HD3870 cards, when I set the fan manually to 58% for 3D-mode, I can play games all day and the temperature still doesn't go over 60°C. If I don't do that, they run at 60-85°C, depending on GPU load.

Quote:
Originally Posted by hikaruichijo View Post
I'm going to ask my boss if he has another of these cards in the trash, and do some crazy things with it. I want to delid it and try to change the TIM between the GPU die and the heat spreader to see if the temperature decreases a bit
That probably won't make a difference of more than 3-7°C max, if even that.

As for de-lidding the GPU, I already did that with my 8800 GTS before I reflowed it. I used a blade from a carpet knife. It was not hard to do: you just insert the blade in a corner between the GPU PCB and the heat spreader, then start wiggling left and right to cut the glue. Then you move to the next corner and do the same. When you've done two corners, the heat spreader should pop out. Just make sure that you do NOT start at the bottom-left corner of the GPU (using the GPU text/marks as a reference), because there are some traces on the GPU PCB in that area. The other corners don't have anything, so they are safe.

Quote:
Originally Posted by hikaruichijo View Post
or maybe put a big old CPU cooler directly on the GPU die; we have a lot of old CPU coolers lying around.
That should work pretty well. I've been doing that with a lot of my video cards as well (but with Xbox 360 heat sinks, since I have a box of them).
These two are the most recent examples:
http://www.badcaps.net/forum/showpos...6&postcount=19
http://www.badcaps.net/forum/showpos...9&postcount=56

That said, for the GeForce 8800 GTS, you will need to use a very good heat sink with a copper core and/or heatpipes. A stock LGA775 heat sink for a Pentium D (most have a copper insert) should work nicely, since some Pentium D chips are rated for up to 130 Watts TDP and have a maximum dissipation of 150 Watts. Alternatively, a stock Core 2 Quad or high-end Core i7 heat sink should also work. Even a Pentium 4 Prescott heat sink might work alright.

I've been experimenting with the same stock Xbox 360 CPU heat sinks I mentioned above on a few HD4850 video cards (which are rated for 110 Watts TDP). In the summer, with an ambient room temperature of 28°C, I can keep the GPU core at around 60-65°C max, which is not bad considering the size of the Xbox 360 CPU heat sink. Of course, this is with a 60 mm fan running at 1.32 Watts (110 mA @ 12V), which is audible but not overly loud.

Quote:
Originally Posted by hikaruichijo View Post
Well, if I find another card that works...
Even if you don't, take a few dead ones too. Many nVidia high-end video cards have the same screw distances for the GPU heat sink (typically 80 mm diagonally between screws for the older generation of GeForce 6/7/8 series of cards). So you can practice making the CPU bracket on the dead video card and then transfer that to the working video card. This is useful for video cards with exposed cores (i.e. no heat spreader), where there is danger of cracking/crushing the core with the heat sink if you over-tighten it by mistake.

Last edited by momaka; 11-26-2016 at 04:58 PM..
momaka is offline   Reply With Quote
Old 01-30-2017, 10:58 AM   #15
hikaruichijo
Senior Member
 
Join Date: Aug 2015
City & State: Ourense
My Country: Spain
Line Voltage: 230V 60Hz
I'm a: Knowledge Seeker
Posts: 118
Default Re: Undervolt GeForce 8800 GTS 320

I returned to this project yesterday, only to find the damn card had died again by itself. Back in November it was working, and yesterday, nope, nothing; it isn't worth the effort to reflow it again. I'm going to throw it in the museum box, where I keep things for the future to remember how good, bad, or amazing they were.
Thanks a lot to momaka for the efforts, but I think it isn't worth it anymore, since there are more powerful cards for less.
hikaruichijo is offline   Reply With Quote
Old 01-30-2017, 09:05 PM   #16
momaka
Badcaps Veteran
 
momaka's Avatar
 
Join Date: May 2008
City & State: VA (NoVA)
My Country: U.S.A.
Line Voltage: 120 VAC, 60 Hz
I'm a: Hobbyist Tech
Posts: 8,473
Default Re: Undervolt GeForce 8800 GTS 320

Yeah, sorry I never got through to the end with this. I still have my card as well, and it's still dead. Rather than putting it in a box, I use it as wall decor; basically, I have a bunch of working and non-working video cards hung on the wall right next to my bench in my room.

Anyways, it's a shame that your 8800 died again, but I guess it was just a matter of time. Lead-free solder is not very stable (and especially early lead-free solder, like when those 8800 video cards came out).

Also, for anyone else trying to undervolt their GeForce 8800 cards: again, I think it might actually make more sense to do it through the card's BIOS (i.e. make a custom BIOS and flash it onto the card). I just did that with a Radeon HD4850 video card, and it works fine. I also tweaked the fan curve while I was there, so now the card doesn't wait until it melts to run the fans.

The only thing to be cautious about when under-volting through the BIOS is not to go too low on the voltage, because that can render the video card unstable and unable to boot again for reflashing (if that happens, a hardware voltage mod can be used to temporarily increase the GPU voltage and make the video card boot).

If extreme under-volting is desired, then the best course of action is to severely under-clock the video card as well. Then, if the video card POSTs OK with the under-volted and under-clocked BIOS, start increasing the clocks with a software utility (like RivaTuner or similar) until the limit is reached (i.e. when the card starts crashing under load). Then go back a few notches on the clock, as that will be the maximum safe clock that can be run at the extreme under-voltage that was selected. This might not make sense now, but I'll make a thread on my HD4850 video cards eventually and explain in more detail there. I've been desperately trying to make those run at a decent temperature with their single-slot coolers. Right now in the winter (with about 20-22C room temperature), this is achievable. But when it gets close to 28-29C in the summer, that might not work so well.
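The step-up search described above can be sketched as a simple loop. This is only an illustration: `passes_stress_test` is a hypothetical stand-in for running RivaTuner plus a stress test at a given clock, and the step and margin values are arbitrary.

```python
def find_max_safe_clock(start_mhz, step_mhz, margin_notches, passes_stress_test):
    """Raise the clock step by step until the card starts crashing under load,
    then back off a few notches for safety margin."""
    clock = start_mhz
    while passes_stress_test(clock):
        clock += step_mhz
    # 'clock' is now the first failing speed; step back past it, plus margin.
    return clock - step_mhz * (1 + margin_notches)

# Demo with a fake stress test that fails at 700 MHz and above:
fake_test = lambda mhz: mhz < 700
print(find_max_safe_clock(start_mhz=500, step_mhz=25, margin_notches=2,
                          passes_stress_test=fake_test))  # prints 625
```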

Last edited by momaka; 01-30-2017 at 09:10 PM..
momaka is offline   Reply With Quote
Old 01-31-2017, 03:27 PM   #17
hikaruichijo
Senior Member
 
Join Date: Aug 2015
City & State: Ourense
My Country: Spain
Line Voltage: 230V 60Hz
I'm a: Knowledge Seeker
Posts: 118
Default Re: Undervolt GeForce 8800 GTS 320

My 8800 did not respond when I modded its BIOS with NiBiTor; I set the voltage to 1.0 V, but the voltage regulator always provided 1.29 V. I did that test when you showed me where to measure V_gpu and V_mem.
I undervolted the HD 6990M of my M17x R3 by modding the BIOS. I got the laptop very cheap because the graphics card had died, but after one reflow it is still working today, and that was four years ago.
I agree with you that it's better to use software methods for that; I've modded the BIOS of more than one card, and it's not very dangerous as long as you know how to recover them from a bad flash and things like that.
But maybe the voltage regulator of the 8800 is as bad as its "datashit".
hikaruichijo is offline   Reply With Quote
Old 01-19-2018, 07:13 PM   #18
Gegga
New Member
 
Join Date: Sep 2015
City & State: Linköping
My Country: Sweden
I'm a: Knowledge Seeker
Posts: 2
Default

Wow, I'm amazed to see a topic so recent (relatively speaking). As an overclocker, the 8800 GTX, Ultra, and GTS have for sure been through my hands, and I still have plenty. I mostly mod to remove the over-current and over-voltage protection together with the feedback pin, but undervolting is something new. Tonight I tested modifying the VID table (1.30 V by default on the GTX) by soldering in a 1k resistor to pull up pin #3, which coincidentally happens to be VID3. That gives VID code 1 1 1 1 0 = 1.1 volts (see link). In reality this measures 1.16 V under load, but that can be related to the other mods. So even if the GTS 640/320 doesn't look 100% the same as the GTX/Ultra, I can assure you that VID modding, by soldering in pull-up resistors or removing existing ones to match a lower VID, is the way to go.

https://www.intel.com/content/dam/www/public/us/en/documents/design-guides/voltage-regulator-module-9.0-dc-dc-converter-guidelines.pdf
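For reference, the 5-bit VID table in the linked Intel VRM 9.0 guideline runs from 1.850 V downward in 25 mV steps, with all-ones meaning "no CPU present" (output off). A small decoder, as a sketch; whether the PX35xx controllers follow this exact table is an assumption:

```python
def vrm9_vid_to_voltage(vid_bits):
    """vid_bits: sequence of VID4..VID0, MSB first, e.g. (1, 1, 1, 1, 0)."""
    code = int("".join(str(b) for b in vid_bits), 2)
    if code == 0b11111:
        return None                # all ones = no CPU present / output off
    return 1.850 - 0.025 * code    # 00000 = 1.850 V, 25 mV per step down

print(vrm9_vid_to_voltage((1, 1, 1, 1, 0)))  # Gegga's code: 1.1 V
```

Pulling a VID pin high (or cutting its pull-down) simply moves the code to a row with a lower voltage, which is what the 1k pull-up on VID3 does here.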
Gegga is offline   Reply With Quote
Old 01-28-2018, 01:51 PM   #19
momaka
Badcaps Veteran
 
momaka's Avatar
 
Join Date: May 2008
City & State: VA (NoVA)
My Country: U.S.A.
Line Voltage: 120 VAC, 60 Hz
I'm a: Hobbyist Tech
Posts: 8,473
Default Re: Undervolt GeForce 8800 GTS 320

Thanks for the information Gegga and welcome to the forums!

Yes, a hardware VID mod would be the best way to go, but the 8800 GTS uses the Primarion PX3540 PWM chip, which only has two VID pins: VID0 and VID1. I guess it will take some experimentation to see how those pins change the VRM voltage (provided their traces to the GPU are routed on an outer layer of the PCB, so that we can easily tap into them to read and/or modify the signal).

The PX3540 also has a pin labeled "VMAX", which sets the maximum output voltage that the VRM should output. But how that pin is used is a mystery, as the PX3540 data sheet provides no information on that. The only thing one can do is experiment on their card and hope they don't mess things up.

That said, I've given up on trying to mod my 8800 GTS (even though it's dead and I kind of want to, just for learning purposes); there is simply an abundance of much better-performing GPUs out there right now that can be bought for much less money. Not to mention most use only a fraction of the power of the 8800 GTS and are not as likely to die; e.g. a GT 430 will have almost the same performance as the 8800 GTS at 1/3 the power draw. Thus, I see no point in buying an 8800 card anymore. Perhaps if I find one for dirt cheap or for free, I might return to this topic.

In any case, undervolting is something people should do more often if they want their hardware to last. I already showed in one of my laptop threads that I could push an old 90 nm, 2 GHz AMD Turion CPU down to 0.975V on the core with nearly stock clocks and all the way down to 0.9V when underclocked further. The power reduction from doing that was drastic - so much, that I no longer had to worry about overheating issues with the onboard GPU (which shares the same heatsink as the CPU and thus tends to die when the temperature goes above 60C due to being from the defective nVidia era).

Last edited by momaka; 01-28-2018 at 01:57 PM..
momaka is offline   Reply With Quote