    XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

    Yet another eBay special I couldn't resist - an XFX GeForce 8600 GT with 256 MB GDDR3 RAM for $8 total (won the auction for $1).
    Why could I not resist, you wonder? Simple answer: more busted Sacon FZ caps.


    Now, I learned my lesson long ago to be very careful with hardware that has busted Sacon FZ caps and is described as not working (more specifics on that for another thread/day). But the seller of this auction claimed the card was tested and working. He also noted that the card needed a recap. So I decided to bite and bought the card.

    Here are some pictures of how the card looked when it arrived.


    And the back:
    https://www.badcaps.net/forum/attach...1&d=1460251222

    So, usual first order of business: remove heatsink and remove all Sacon FZ caps.
    https://www.badcaps.net/forum/attach...1&d=1460251222

    Then, I reached into my bag of good caps for testing purposes. And this is what the resultant test recap looked like:


    Close-up shots:
    https://www.badcaps.net/forum/attach...1&d=1460251222
    https://www.badcaps.net/forum/attach...1&d=1460251222

    A bit crude? Yes, it is (especially that leaning Fujitsu poly cap, on which I had to add a “prosthetic” leg). But nonetheless, everything was calculated to meet or exceed specs – even though I only intended this to be a temporary recap.

    Now, if there is one thing I like about these older XFX cards, it is that they provide you with plenty of options regarding what caps you can use. This particular 8600 card has the capacitor spots silkscreened for both through-hole caps (8 mm and 10 mm diameter) and SMD. Neat! That means we have a lot of choices when it comes to picking replacement capacitors. And that's the next order of business.

    Replacement caps for Sacon FZ
    On paper, Sacon FZ series are supposedly equivalent to Nichicon HM and Rubycon MBZ (more or less). But let's be honest: Sacon / Evercon / Elcon / GSC caps are terrible. Therefore, don't let that stop you from using other brands and series of caps, even if they have slightly inferior ESR and ripple current (RC) specs. In my opinion, the following would be absolutely acceptable choices: Panasonic FM, as well as Rubycon ZLG and ZLJ.

    Of course, if you have no means to obtain equivalent caps, then even general-purpose caps from the Japanese manufacturers may be okay. I measured a few busted FZ caps with my cheapo ESR meter, and all of them read way out of spec (some even open-circuit!). If the card worked with those (at least according to the seller – I didn't test it, as I didn't want to risk killing the RAM), then just about any functional cap will work. Don't believe me? Here is working proof of this concept:
    https://www.badcaps.net/forum/showthread.php?t=43404
    On the other hand, if you want to recap this card all with polymers, then be my guest. It should be more than happy.

    Anyways, here is some useful information regarding the caps on this 8600 video card and what function they perform:

    ---- 12V rail (on the card)
    This rail feeds both the GPU core and VRAM voltage regulator modules (VRMs). It has five capacitor spots:
    CX15/CX42 (before toroid filter coil)
    CX10/CX41
    CX11/CX40
    CX12/CX43
    CX44/CX9

    Originally, these were 5x Sacon FZ 16 V, 470 uF, 8x12 mm (1150 mA ripple current and 36 mOhms ESR). For my recap, I used only three caps to replace these, but they were adequate for the task (even more than needed, perhaps).
    CX15/CX42 --> Sanyo SEPC polymer (SMD), 16 V, 150 uF, 8x7 mm (3320 mA RC, 22 mOhms ESR)
    CX11/CX40 --> Panasonic FL, 16 V, 1500 uF, 10x20 mm (similar specs to Rubycon MCZ below)
    CX44/CX9 --> Rubycon MCZ, 16 V, 1500 uF, 10x20 mm (2770 mA RC, 11 mOhms ESR)
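    To sanity-check replacing five caps with three, you can treat the bank as parallel components: the ripple current ratings roughly add, while the ESRs combine like parallel resistors. Here's a quick Python sketch of that arithmetic (a first-order approximation that ignores frequency dependence and board layout; the Panasonic FL's specs are assumed similar to the MCZ's, as noted above):

```python
def bank_specs(caps):
    """caps: list of (ripple_current_mA, esr_mOhm) tuples.

    Ripple current ratings of parallel caps roughly add; ESRs
    combine like parallel resistors.
    """
    total_rc = sum(rc for rc, _ in caps)
    total_esr = 1 / sum(1 / esr for _, esr in caps)
    return total_rc, total_esr

# Original bank: 5x Sacon FZ rated 1150 mA ripple, 36 mOhm ESR
orig = bank_specs([(1150, 36)] * 5)
# Replacements: Sanyo SEPC, Panasonic FL (specs assumed ~MCZ), Rubycon MCZ
new = bank_specs([(3320, 22), (2770, 11), (2770, 11)])

print(orig[0], round(orig[1], 2))  # 5750 7.2
print(new[0], round(new[1], 2))    # 8860 4.4
```

    By this rough math, the three-cap bank handles more total ripple (8860 mA vs. 5750 mA) at a lower effective ESR (~4.4 vs. ~7.2 mOhms) than the original five FZs.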

    ---- GPU core rail (a.k.a. GPU V_core or GPU Vcc)
    This rail provides power to the GPU chip. IIRC, stock it is at 1.3-1.4 Volts (forgot to measure mine), so for those of you who are poly-modding, you can use capacitors with a voltage rating as low as 2.5V.

    The GPU core rail has two capacitor spots that came with the following caps:
    CX20/CX18: Chemicon PXA polymer (SMD), 4 V, 1200 uF, 10x12 mm (5500 mA RC, 10 mOhms ESR)
    CX25/CX22: Sacon FZ 6.3 V, 1500 uF, 10x12 mm (1800 mA RC, 22 mOhms ESR)

    The Chemicon PXA polymer above is probably what kept the card running, despite the other busted FZ caps. Thus I only replaced the busted FZ.
    CX25/CX22 --> Nichicon HZ 6.3 V, 2200 uF, 10x20 mm (3770 mA RC, 7 mOhms ESR)

    ---- VRAM Vdd/Vddq rail
    This rail provides power primarily to the RAM chips. Normal operating voltage should be around 1.8 to 2 Volts. Because of this, replacement capacitors with a voltage rating of 4V or more are recommended (2.5V caps will likely work as well, but 4V will keep a better voltage safety margin).
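    The safety-margin point is easy to put in numbers. A trivial Python sketch (the ~1.2-1.5x minimum headroom figure is my own rule of thumb, not from any datasheet):

```python
def voltage_margin(rail_v: float, cap_rating_v: float) -> float:
    # Ratio of the cap's voltage rating to the rail voltage;
    # polymers are commonly run with at least ~1.2-1.5x headroom.
    return cap_rating_v / rail_v

# On a ~2 V VRAM rail (worst case of the 1.8-2 V range above):
print(voltage_margin(2.0, 4.0))   # 2.0  -> comfortable headroom
print(voltage_margin(2.0, 2.5))   # 1.25 -> workable, but tighter
```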

    Originally, this rail came with two FZ 6.3 V, 1500 uF, 10x12 mm caps at spots CX21/CX19 and CX46/CX45. I replaced them with the following:
    CX21/CX19 --> Nichicon HZ 6.3 V, 2200 uF, 10x20 mm (3770 mA RC, 7 mOhms ESR)
    CX46/CX45 --> Fujitsu FPCAP RE 4 V, 820 uF, 10x12.5 mm (6100 mA RC, 7 mOhms ESR)

    ---- GPU secondary / PCI-E comm. rail
    Though I'm not 100% sure, I believe this rail is used for communication between the PCI-E bus and the GPU chip. It is usually 1.2V or thereabouts on most nVidia PCI-E cards up to (and including) the 9 series. Thus, any capacitor with a voltage rating of 2.5V or more can be used here. This rail is also linearly derived from the GPU core rail, so the ESR and ripple current of the replacement cap are not that critical – any low-ESR cap will likely work fine. The capacitance, on the other hand, should be matched within ±30%.
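    That ±30% guideline is easy to check when comparing candidates. A minimal Python sketch (the helper and threshold are my own framing of the rule of thumb, not a formal spec):

```python
def capacitance_ok(original_uF: float, replacement_uF: float,
                   tolerance: float = 0.30) -> bool:
    # Rule of thumb from above: on this non-critical, linearly derived
    # rail, keep the replacement within ~30% of the original capacitance.
    return abs(replacement_uF - original_uF) / original_uF <= tolerance

print(capacitance_ok(1000, 1000))  # True  (exact match)
print(capacitance_ok(1000, 820))   # True  (-18%)
print(capacitance_ok(1000, 470))   # False (-53%)
```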

    The rail on this card had only one cap at spot CX6/CX1. Originally, it was a Sacon FZ 6.3 V, 1000 uF, 8x12 mm (1150 mA ripple current and 36 mOhms ESR). I used a Nichicon SMD cap here (not sure what series, as it came from a dead PS3 board), rated for 4V and 1000 uF. I've done this before on a GeForce 6200 (128-P2-N361) and it hasn't caused any problems.

    Results
    Unfortunately, after all of this work (looking up cap specs sheets and the recap itself), the video card did not work. And here is why:


    See that bottom-right corner? That, folks, is a cracked core! And it shouldn't have been a surprise: after all, the video card came through my mailbox in a single padded mail envelope. No hard box or any other packaging. So my guess would be it broke in the mail. Of course, I can't blame the post office for this, because it was the seller who should have known better and packed the item more appropriately. I mean, I know I got it for cheap. But that's no excuse to just stick a postage stamp on something and dump it in a mailbox.

    Oh well, at least I got a working fan out of this one. The fan (made by ARX) is the same as the one used on the XFX 6800 Xtrem and also an MSI GeForce 7600 GS of mine (CeraDyna series). So at least this wasn't a total rip-off. Anyways, I hope this recap info is still useful to someone on the internets, even though the GeForce 8 series of video cards is quite dated at the time of this post.

    #2
    Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

    Isn't using 2005 dated Nichicon HZs a tad risky? (I know, anything is better than Sacon FZ, even pre-2006 Nichicon HMs and Chemi-con KZGs... I'm guessing those HZs are Xenon Xbox 360 pulls, along with the Fujitsu RE, Rubycon MCZ, and Panasonic FL)

    ...

    Okay, I guess it really doesn't matter what you use there since the core is cracked anyway.

    I also rather like how it sports Infineon RAM. Such a delightful and perfect complement to the rest of the card, as though the cracked GPU core and blown Sacon FZs weren't insulting enough.

    As for the GeForce 8600 GT being a dated card... what I find more alarming is the fact that the 8600 and 8500 series were never even competent enough to compete with the midrange graphics cards of the timeframe in which they were released (Spring 2007). They were often outperformed (and significantly at that) by the previous generation's midrange cards from ATi/AMD and nVidia (the Radeon X1950 Pro and GeForce 7900 GS, respectively). They were only comparable in shader-intensive games which took advantage of their unified shader architecture. The 8800 GTX and GTS, on the other hand, were behemoths for their time period (late 2006 / early 2007), but they were massive and ran very hot. The 8800 GT, released about a year later, was a very good budget card for its time as well. However, the 8600 GTS, GT, and 8500 GT were simply failures.

    But that's enough from me. I'm not here to hijack your thread. Even though your repair was in vain because of the aforesaid cracked core, you did a nice repair job all the same.
    Last edited by Wester547; 04-09-2016, 08:47 PM.



      #3
      Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

      Momaka, you did a good job. When I repair VGA cards the first item I remove is the GPU heatsink, that way I can inspect the core for cracks, chips or heat damage.



        #4
        Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

        lol gotta love that pic u made in the other thread of the sacon fz with the X_X faced smilies and the purple sanyo oscon shocked at having to take up the slack from the dead sacon fz caps not doing their job!

        for the sake of venting your frustrations cuz the recap didnt work, u should have made more X_X smilies out of the busted fzs on your 8600 card just to make u feel better. i hate how the fz caps all stick out their wispy brown tongues at you! just looking at them makes you mad and your blood boil.

        anyway, u really should have taken note of the infineon ram on that card. i hope pics of the ram were visible on the listing. i wouldnt bother getting a card with infineon ram no matter how cheap and if i could avoid it!

        the ram is just terrible. i have gddr3 infineon ram on my palit xpertvision 6600 gt agp and its just terrible. the stock speed of the ram is 900mhz and it can only overclock to 938mhz before it starts to artifact.

        in contrast, an evga 6600 gt agp doom 3 edition with samsung gddr3 ram that i got off ebay recently could hit 1100mhz before it started to artifact.

        i also have infineon gddr5 ram on my powercolor 4870 512mb stalker edition. the ram is rated for 4ghz on the part number but is clocked at only 3.6ghz. i can only guess why...



          #5
          Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

          Originally posted by Wester547 View Post
          Even though your repair was in vain because of the aforesaid cracked core, you did a nice repair job all the same.
          Originally posted by Sparkey55
          Momaka, you did a good job.
          Thanks folks

          Originally posted by Sparkey55
          When I repair VGA cards the first item I remove is the GPU heatsink, that way I can inspect the core for cracks, chips or heat damage.
          Yes, I did remove the GPU heatsink before removing any of the caps.
          But because the thermal compound still looked fresh, I decided to leave it on there for the test after the recap. (I've done this many times with other CPU and GPU heatsinks. The difference in running temperatures is often negligible, unless the old compound is really baked.) Also, none of the corners looked chipped and the GPU had a metal shim, so I didn't think it would be damaged.
          I guess I will know better in the future.

          Originally posted by Wester547 View Post
          Isn't using 2005 dated Nichicon HZs a tad risky?
          Only if they have been heat-abused and then sitting on the shelf for a long time. But I've used these in a lot of other hardware I temporarily recapped, so they haven't been sitting that long - maybe a year at the most.

          I also recently reformed many of my good caps. This includes both new caps that I never used and old used/pulled caps such as these HZs.

          Originally posted by Wester547 View Post
          I'm guessing those HZs are Xenon Xbox 360 pulls, along with the Fujitsu RE, Rubycon MCZ, and Panasonic FL
          Yes, you caught me - these are Xbox 360 pulls indeed.

          A few years back, a friend of mine had a console repair shop in the area here. He had a ton of damaged 360 boards that just couldn't be fixed (bad RAM, warped boards, etc.), so I started pulling caps and other components for him to have in case he needed any. But he really had too many boards, and soon decided to scrap some. Thus I took a small stash of the boards - hence why you see so many of my recap jobs feature these Xbox caps.
          When he closed down the shop, he threw away even more of these boards. Unfortunately, I wasn't there at the time due to college and couldn't save any. But if I had been, I would have easily doubled or even tripled my cap stash.

          Originally posted by Wester547 View Post
          However, the 8600 GTS, GT, and 8500 GT were simply failures.
          I disagree there.
          First, the 8500 and 8600 are two different animals: the 8500 shares the same core as the 8400 (G86), whereas the 8600 uses a more powerful G84 core. So there is some difference in performance between 8500 and 8600 cards for sure. But what's more important is that both of these cards have fairly low power consumption: about 30 and 40 Watts for the 8500 and 8600, respectively. And this is exactly what I like about the 8600 and other mid-range cards: they run cooler and thus last longer, while also having better performance than the entry-level cards (such as the 8400 and 8500).

          Originally posted by Wester547 View Post
          As for the GeForce 8600 GT being a dated card... I think of greater alarm is the fact that the 8600 and 8500 series were never even competent enough to compete with the midrange graphics cards of the timeframe in which they were released (Spring 2007). They were often outperformed (and significantly at that) by the previous midrange cards from ATi/AMD and nVidia (the Radeon X1950 Pro and GeForce 7900 GS respectively). They were only comparable in shader-intensive games which took advantage of their unified shader architecture.
          The Radeon X1800/X1900 and GeForce 7800/7900 are high-end cards from the (previous) DX9c generation (higher-speed memory with a 256-bit bus, so higher bandwidth), whereas the 8600 is a mid-range card (slower memory with a 128-bit bus) from the DX10 generation. So yes, it shouldn't be surprising that they outperform the 8600, especially at high resolutions or with AA enabled (both of which require high memory bandwidth). But it wasn't by much. And the 8600 GTS/GT/GS series was really only meant to replace the previous generation of mid-range cards - i.e. the 7600 GT/GS... which it did and easily surpassed, almost reaching the performance of the high-end DX9c cards in some cases. Also, the G84 and up cores have built-in H.264 decoding.

          Since you mentioned the GeForce 7900, I'd just like to take a quick special moment here to express how much I dislike these cards.
          They have only slightly lower power consumption than the GF 7800 cards, but use a much smaller cooler. So they are guaranteed to fail unless you really put a high-end heatsink on them. I already have two of those paper-weights. A reflow won't bring them back because of the bumpgate issue (which seems to be the worst with the 7 and 8 series). The 7800 cards are overall better than the 7900 in terms of reliability, but still doomed. IMO, the 7600 GT is the only worthwhile card from the whole GeForce 7 series, because it balances performance with reliability. Likewise, I feel the same about the GF 8 series. As you noted, the 8800 GTS and GTX are big behemoths and waste a lot of power. A triple-slot cooling solution with a high-airflow fan might save them (or alternatively water cooling), but otherwise they are doomed to fail as well.

          Originally posted by Wester547 View Post
          But that's enough from me. I'm not here to hijack your thread.
          Heh, don't worry at all. I actually kind of expected this thread to go off-topic anyways (primarily discussing older video cards and hardware, as usual). Plus, it's not like the 8600 is a hot brand-new card, so I doubt the thread will get much attention. But nonetheless, I figured I already did the recap and took notes, so why not share my findings and recap info anyways? I think it can also be useful when it comes to recapping other older cards as well.

          Originally posted by ChaosLegionnaire
          anyway, u really should have taken note of the infineon ram on that card. i hope pics of the ram were visible on the listing. i wouldnt bother getting a card with infineon ram no matter how cheap and if i could avoid it!
          You mean Inferior RAM?
          Unfortunately, the listing didn't show what RAM it had. Then again, I don't mind it that much, just as long as it works. I never overclock my RAM anyways, regardless of whether it's on a video card or system memory.

          Originally posted by ChaosLegionnaire
          lol gotta love that pic u made in the other thread of the sacon fz with the X_X faced smilies and the purple sanyo oscon shocked at having to take up the slack from the dead sacon fz caps not doing their job!
          Thanks
          At some point I was considering cropping/scaling it down to use as my signature. But in the end, I decided not to, as I didn't want to clutter the forums.

          I wish I could remember where I found the original picture that I used as a base, so I can give credit to whoever took it. I think someone posted it here on BCN a long time ago, but I just can't remember.

          Originally posted by ChaosLegionnaire
          for the sake of venting your frustrations cuz the recap didnt work, u should have made more X_X smilies out of the busted fzs on your 8600 card just to make u feel better.
          I wouldn't say that I was frustrated. Not even mad (despite using that "mad" smiley, which was more for dramatic effect here). Just a tad disappointed that an otherwise functional card got damaged in the mail or due to the seller not being careful with it.

          As for the recap process itself, I always enjoy it. And that's actually why I got this card. Plus, I did the recap back in January when I was really sick and had shortness of breath. It helped keep my mind distracted for several hours that day. So I certainly don't regret any part of it.
          Last edited by momaka; 04-10-2016, 06:03 PM.



            #6
            Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

            Originally posted by momaka View Post
            I disagree there.
            First, the 8500 and 8600 are two different animals: the 8500 shares the same core as the 8400 (G86), whereas the 8600 uses a more powerful G84 core. So there is some difference in performance between 8500 and 8600 cards for sure. But what's more important is that both of these cards have fairly low power consumption: about 30 and 40 Watts for the 8500 and 8600, respectively. And this is exactly what I like about the 8600 and other mid-range cards: they run cooler and thus last longer, while also having better performance than the entry-level cards (such as the 8400 and 8500).
            I don't recall the 8600 GTS/GT running significantly cooler than the previous 7600 series. At least, the stock coolers on a number of OEM 8600 GTS/GT cards weren't great, and it wasn't unusual for them to near 80°C under load. And I know the 8500 GT is meant to be in an entirely different performance category. I just don't think it was good value for the time either (for gamers, anyway). But the same could be argued of the 9500 GT.

            The Radeon X1800/X1900 and GeForce 7800/7900 are high-end cards from the (previous) DX9c generation (higher-speed memory with a 256-bit bus, so higher bandwidth), whereas the 8600 is a mid-range card (slower memory with a 128-bit bus) from the DX10 generation. So yes, it shouldn't be surprising that they outperform the 8600, especially at high resolutions or with AA enabled (both of which require high memory bandwidth). But it wasn't by much. And the 8600 GTS/GT/GS series was really only meant to replace the previous generation of mid-range cards - i.e. the 7600 GT/GS... which it did and easily surpassed, almost reaching the performance of the high-end DX9c cards in some cases. Also, the G84 and up cores have built-in H.264 decoding.
            You're right.

            I worded that poorly. When I said midrange, I really meant that at the time the 8600 GTS and GT were released, their MSRPs were $200 and $150 respectively, which wasn't much different from the prices of the 7900 GS and X1950 Pro at that point - prices on those last-generation high-end cards had dropped by then. So the 8600 GTS and GT cards weren't great value at the time. And the 8600 GT only really performed better than the 7600 GT in shader-heavy games.

            A reflow won't bring them back because of the bumpgate issue (which seems to be the worst with the 7 and 8 series).
            The bumpgate issue is really a moniker for a wide range of foul-ups that nVidia made with those series, which include:

            1) Using eutectic pads with high-lead bumps (NOT a good combination).
            2) Using a low-Tg underfill when there were superior options available.
            3) Using high-lead bumps instead of eutectic bumps (ATi/AMD switched to eutectic bumps in 2005).

            These foul-ups essentially meant that the chip was never properly soldered to the substrate, and temperatures of 60°C and higher would cause the underfill to soften rather quickly (and you can imagine what long-term exposure to temperatures of 70°C and 80°C would do to the underfill). This was a bigger issue in laptops, where it's very arduous to achieve temperatures below 60°C for the GPU. Also, high-lead bumps are capable of handling more current than eutectic bumps, which means that nVidia underrated the current handling ability of those chips as well and basically overvolted them, increasing their heat output and power consumption more than necessary.
            Last edited by Wester547; 04-10-2016, 06:51 PM.



              #7
              Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

              ooops i forgot to mention that i also had infineon ram on an msi 8800 gt oc that i had and the ram was faulty right out of the box! i had to downclock the inferior ram 50-100mhz to get rid of the artifacts even in 2d mode!

              the cooler was also crap. i should have replaced it. the idle temps were 60°C and the load temps were 80°C! no prizes for guessing how quickly the card failed: 2.5 years of 12x7 use. i rma-ed it cuz it was under 3 years warranty and the replacement also failed after 2.5 years. was totally unaware of the bumpgate problems back then which i thought only affected laptops. in hindsight, i should have replaced the cooler and reapplied with better branded thermal paste. which i will do on ALL my video cards from now on.



                #8
                Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – she's ALIVE!

                SHE'S ALIVE!!!

                Yes, I got that GeForce 8600 GT working again! (For now, at least.) Stress-tested for a bit with games too. No artifacts or any other weirdness. Don't believe me? Here's the GPU-Z screenshot:


                So here's what happened...
                A few weeks back, I was playing with my XFX GeForce 6800 XT and got a little mad at it for being a slow piece of turd and running way too hot. Had some free time today for experiments. Since I have so many spare Xbox 360 GPU and CPU heatsinks, I thought about making a ghetto mod and mounting a rev. 2 Xbox CPU heatsink on it (the one with the heatpipe). As crappy as I think that 6800 XT is, I still didn't want to risk damaging it in the process (it's a working card, after all!) So I decided to make the heatsink retention mechanism on another (preferably dead) video card first. It didn't take me much time to find that the "dead" GeForce 8600 GT in this thread had the same 80 mm hole spacing (one of its hole patterns) as the 6800 XT. So I took off the stock cooler from the 8600 GT and started working.

                The result wasn't pretty. But it was functional!
                Thus, before moving my ghetto mod to the 6800 XT, I thought: why not overtighten the screws on that ghetto-modded heatsink a bit to see if that somehow makes the "cracked" core on the 8600 GT work? So I did that, put the 8600 GT in my 939Dual-SATA2 test system, and pressed the power button. But to my surprise, the LED on my LCD turned green, I got video, and the system posted! WTFROFLLOL!!!!

                But now this has me thinking: what is/was wrong with the card in the first place then? (Besides the caps.) It's not like I just recapped it, tested it once, and gave up when I saw that it didn't work. - No, I never do that.

                When I saw that this 8600 GT didn't work, I first tried using the second DVI port. But that wasn't the issue. Then tried new thermal compound, which also required me to reinstall the heatsink *and* to reinstall the video card in the motherboard again - so those couldn't have been the issue either. Pretty much all I can think of is either the GPU chip has bad BGA, the core has the bumpgate issue, or the core really was cracked and this really fixed it.
                Anyone else care to hazard a guess?

                Anyways, it is working for now. But I guess we really will see if my ghetto mod has helped it or not. That's because I forgot to apply thermal compound properly. There were some leftovers both on the GPU core and on the Xbox 360 heatsink (from previous projects I've used that heatsink on), though not much. So I'm not running the heatsink totally "dry". Just not optimal.

                That said, the idle/load temperatures I got weren't that bad: 45°C idle and 57°C under load. And here is the temperature curve I got with a quick 10-minute (or so) test:


                By the way, worth noting is that the room temperature at the time was about 23.5°C (74-75°F). Because this heatsink mod does not have a fan, I had a 120 mm fan blowing from the side of the case towards the motherboard and video card. The PC case was open.

                Those temperatures are a lot better than the 6800 XT's, which was idling at 56°C with its stock heatsink and a 21°C room temperature. But anyways, that's for the other thread.
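                Since the two tests were run at different room temperatures, the fairer comparison is the temperature rise over ambient. A tiny Python sketch of the numbers above:

```python
def delta_t(temp_c: float, ambient_c: float) -> float:
    # Temperature rise over ambient: a fairer comparison than absolute
    # readings when the room temperature differs between tests.
    return temp_c - ambient_c

# 8600 GT with the Xbox 360 heatsink mod (23.5 C room):
print(delta_t(45.0, 23.5), delta_t(57.0, 23.5))  # 21.5 33.5 (idle, load)
# 6800 XT with its stock heatsink (21 C room):
print(delta_t(56.0, 21.0))  # 35.0 at idle
```

                So even fully loaded, the passively-cooled 8600 GT stays further above ambient-neutral than the 6800 XT does just sitting idle.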

                So yeah, that 8600 GT is working and I am happy now. Pictures of the ghetto heatsink mod coming soon. I couldn't take any today, because it got dark by the time I was done testing the card.
                Last edited by momaka; 04-15-2016, 10:13 PM.



                  #9
                  Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

                  Originally posted by Wester547 View Post
                  I don't recall the 8600 GTS/GT running significantly cooler than the previous 7600 series.
                  No, the 8600 GTS/GT/GS runs about on par with the 7600 GT/GS, because both are mid-range cards with around 40 Watts TDP (43 W for the 8600 and 36 W for the 7600).

                  Originally posted by Wester547 View Post
                  When I said midrange, I meant budget, but what I really meant was... the 8600 GTS and GT cards weren't great value at the time.
                  Ah, okay, fair enough.

                  Originally posted by Wester547 View Post
                  And the 8600 GT only really performed better than the 7600 GT in shader-heavy games.
                  Well, we will see how that goes. I have not yet played any DX10 games. That said, in Colin McRae Rally 4, I got similar framerates with the GeForce 8600 GT as I did with both my XFX GeForce 6800 XT and my eVGA GeForce 6200 when I had it unlocked to 8p/3v pipes (same as a GeForce 6600 LE GPU). So yeah, in older games that don't rely heavily on shaders, it has about the same performance as other mid-range cards.

                  Originally posted by ChaosLegionnaire View Post
                  ooops i forgot to mention that i also had infineon ram on an msi 8800 gt oc that i had and the ram was faulty right out of the box! i had to downclock the inferior ram 50-100mhz to get rid of the artifacts even in 2d mode!
                  Maybe you got unlucky... or maybe I got lucky?

                  As you can see, this 8600 GT is running its RAM at 800 MHz. But the RAM chips on the board are Infineon HYB18H512321AF-14. The "-14" suffix suggests these chips are rated for 700 MHz. Only the HYB18H512321AF-12 chips are rated for 800 MHz, as per the datasheet. So it looks like I had good luck this time.
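                  For GDDR3 parts like these, the speed-grade suffix appears to encode the clock cycle time in tenths of a nanosecond - that's my inference from the -14 -> ~700 MHz and -12 -> ~800 MHz ratings above, so check the datasheet for your exact part. A small Python sketch of the decoding:

```python
def rated_clock_mhz(part_number: str) -> float:
    # Assumption: the digits after the dash are the minimum clock
    # cycle time in tenths of a nanosecond (e.g. "-14" = 1.4 ns).
    cycle_ns = int(part_number.rsplit("-", 1)[1]) / 10
    return 1000 / cycle_ns  # MHz

print(round(rated_clock_mhz("HYB18H512321AF-14")))  # 714 -> marketed as 700 MHz
print(round(rated_clock_mhz("HYB18H512321AF-12")))  # 833 -> marketed as 800 MHz
```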
                  Last edited by momaka; 04-15-2016, 10:31 PM.



                    #10
                    Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

                    This can't be true!!!



                      #11
                      Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – she's ALIVE!

                      Originally posted by momaka View Post
                      A few weeks back, I was playing with my XFX GeForce 6800 XT and got a little mad at it for being a slow piece of turd and running way too hot.

                      Those temperatures are a lot better than the 6800 XT, which was idling at just 56C with its stock heatsink and 21C room temperature. But anyways, that's for the other thread.
                      i think i already mentioned previously in the 6800 xt thread that the 6800 xt has the same core config as the radeon 9700/9800 cards. the core speed is also the same as the 9700 pro but the memory is clocked faster at 1ghz vs 620mhz on the 9700 pro. so dont expect pixel and texture pushing miracles from it with games that are several years past its time.

                      i also thought that u may have the 128-bit version of the 6800xt as i see a few of em listed on ebay but looking back at your topic i see the gpu-z screenshot saying its 256-bit... ah well... the only explanation i can come up with now is to say that maybe u got a bum chip that spits out waaay too much heat compared to others of the exact same chip.

                      btw, i found quite a number of 6800xt agp cards on ebay lately (some of em were 128-bit). even saw a working xfx 6800xt agp 256mb with the stock cooler. if its cheap enuff and within your budget, u can hoot another and compare how hot the pci-e nv41 runs vs the agp nv40 (crippled). it could be some inefficiency in the chip when nvidia redesigned their native agp chip for pci-e. your weird overheating nv41 warrants some investigation as i have mentioned before in a pm how my 6800 ultra doesnt overheat as much as yours when fully loaded in 3d mode.
                      Originally posted by momaka View Post
                      It didn't take me much time to find that the "dead" GeForce 8600 GT in this thread had the same 80 mm hole spacing (one of its hole patterns) as the 6800 XT.
                      the 6800/7800/7900/8800(g92)/9800 cards and the x1800/x1900/x1950/3800/4800/5800 cards on the ati side all have the same hole spacing. i know this cuz i often search for third party coolers and they often list the abovementioned cards together with the same hole spacing compatibility.

                      so check out mounting hole number 5 on the zalman vf1000 cu and on the scythe musashi as well.
                      Originally posted by momaka View Post
                      SHE'S ALIVE!!!

                      Yes, I got that GeForce 8600 GT working again! (For now, at least.) Stress-tested for a bit with games too. No artifacts or any other weirdness. Don't believe me?

                      So yeah, that 8600 GT is working and I am happy now
                      Anyone else care to hazard a guess?
                      i hate to poop or rain poop on your parade but faulty video cards rarely fix themselves as u have said so yourself. i think u better not disturb or remove the cooler anymore as it may break whatever non-repeatable fluke occurred when u put humpty dumpty back together again.

                      Comment


                        #12
                        Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

                        Originally posted by momaka View Post
                        No, the 8600 GTS/GT/GS runs on par with the 7600 GT/GS, because both are mid-range cards with around 40 Watts TDP (43 W for the 8600 and 36 for the 7600).
                        The 8600 GTS has a 71W TDP and the 7600 GS a 27W TDP, although the 7600 cards are built on a 90nm process (about 178 million transistors) and the 8600 cards on an 80nm process (289 million transistors), so that sounds about right, all other factors being equal. I think the 8600 GTS runs a bit hotter and the 7600 GS a bit cooler than the 7600/8600 GT and GS, though.

                        Who knows how long the Infineon RAM will last. And while I'm happy you managed to revive the card, I'd have to agree with ChaosLegionnaire. Issues like these are unlikely to resolve themselves and may well be intermittent. It's possible that you are putting just enough pressure on the solder balls for the card to work (i.e., by installing another heatsink) if the bumpgate issue isn't the problem (and if the chip isn't cracked enough to stop working altogether... although I think a cracked chip would be permanently dead).
                        Last edited by Wester547; 04-16-2016, 03:02 PM.



                          #13
                          Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

                          Originally posted by Wester547 View Post
                          The 8600 GTS has a 71W TDP and the 7600 GS a 27W TDP, although the 7600 cards are built on a 90nm process (about 178 million transistors) and the 8600 cards on an 80nm process (289 million transistors), so that sounds about right, all other factors being equal. I think the 8600 GTS runs a bit hotter and the 7600 GS a bit cooler than the 7600/8600 GT and GS, though.
                          the 8600gts has quite a bit of difference in tdp compared to the gt and gs. ima provide a link to wikipedia with the giant list of video cards. it doesnt list the tdp of the gf7 series but it does say the 8600gs and gt have a tdp of 47w. so it seems like the 8600gts is a heavily overclocked and souped up g84 chip with an equally pimped up speed on the vram.
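                          a quick watts-per-million-transistors check using just the figures quoted in this thread (these are the tdp and transistor counts from the posts above, not official specs, so treat it as a rough sketch):

```python
# watts per million transistors, computed from the TDP and
# transistor counts quoted in this thread (not official specs)
cards = {
    "7600 GS": (27, 178),   # (TDP in watts, transistors in millions)
    "8600 GT": (47, 289),
    "8600 GTS": (71, 289),
}

for name, (tdp, mtrans) in cards.items():
    print(f"{name}: {tdp / mtrans:.3f} W per million transistors")
```

                          so going by these numbers, the 8600 gt (~0.163) is roughly in line with the 7600 gs (~0.152) once u account for the transistor count, while the 8600 gts (~0.246) is clearly a g84 pushed way harder.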
                          Originally posted by ChaosLegionnaire View Post
                          i hate to poop or rain poop on your parade but faulty video cards rarely fix themselves as u have said so yourself. i think u better not disturb or remove the cooler anymore as it may break whatever non-repeatable fluke occurred when u put humpty dumpty back together again.
                          Originally posted by Wester547 View Post
                          And while I'm happy you managed to revive the card, issues like these are unlikely to resolve themselves and may as well be intermittent. It's possible that you are putting just enough pressure on the solder balls for the card to work (IE, installing another heatsink) if the bumpgate issue isn't the problem (and if the chip isn't cracked enough to stop working altogether).
                          sorry, momaka, i wasnt trying to sound mean. what i was trying to say is that u shouldnt be surprised if the card kicks the bucket again one day when u power on the system. have another spare card on hand ready!
                          Last edited by ChaosLegionnaire; 04-16-2016, 02:53 PM.



                            #14
                            Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

                            momaka i think you have a bga issue and not a cracked core. possibly that line you see is a scratch on the core, because if the core was cracked it would not be possible to bring it back to life. you also can't crack a core in that way - the easiest parts to crack are the corners of the core



                              #15
                              Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

                              As promised, here are the pictures of my ghetto heatsink mod I did on this card...

                              Ugly?
                              -Maybe (The beauty is in the eye of the beholder. )

                              Works well?
                              - 100% yes

                              But... I swapped that ghetto HS mod back with the original. I noticed, however, that the original heatsink made the board warp on one side even worse than my mod. So this time I did not tighten the screws on the original HS all the way - just enough to hold it on securely.

                              Plugged in everything, hit the power button and... she's still alive!
                              So, looks like the card may not be dead after all.

                              Originally posted by ChaosLegionnaire
                              i hate to poop or rain poop on your parade but faulty video cards rarely fix themselves as u have said so yourself. i think u better not disturb or remove the cooler anymore as it may break whatever non-repeatable fluke occurred when u put humpty dumpty back together again.
                              No worries, man .
                              Yes, you are absolutely right. I got this card for cheap, though, so I couldn't care less if it breaks (again) . It's all in the name of science, lol. Hence all the experiments. However, it now seems that whatever "fluke" made the card not work the first time, I cannot repeat it anymore. Well, maybe it is too early to say right now. So like you said...

                              Originally posted by ChaosLegionnaire
                              what i was trying to say is that u shouldnt be surprised if the card kicks the bucket again one day when u power on the system.
                              ... I'm basically not expecting anything back from this card. I had already written it off as dead once. The fact that it's alive I only consider a temporary bonus. But who knows - maybe it will turn into one of those half-working, half-broken things that refuse to fully die (like a 500 GB Seagate HDD in one of my PCs).

                              That said, I think I will put my ghetto heatsink back on there again. Even though I say I don't care much about this card, I just can't have it run like this. Look at the temperature curves I got with the stock heatsink and fan - it's torture!

                              Those spikes you see are from me firing up the game Portal and letting the card sit for about 30 seconds on the menu screen there (which is a 3D-rendered background, for those who have not played it). Then I loaded the last part of Test Chamber 18 (which seems to put a slightly higher load on the card than other levels) and again played for about 30 seconds. After this, I exited the game, which is where you see the spike jump back down.

                              If you compare this temperature curve to the one I did above with my crappy HS mod, you can see that the card runs about 10°C hotter with the stock heatsink and fan, hitting 67°C in less than 1 minute of 3D-loading. And all of this with nearly the same 23°C (73-74F) room temperature. Unacceptable!

                              What's even more humiliating is the fact that when I did my ghetto heatsink mod, I forgot to apply thermal compound. When I removed the heatsink to swap it with the original, there was thermal compound on only about 1/3 of the GPU core (yes, you read that right - one third). And yet I still managed to pull 10°C lower temperatures than with the original heatsink . So most likely, I will actually try to fix up my ghetto heatsink mod to make it look neater (well, relatively speaking here, of course ). That means making it not warp the board at all by installing a self-balancing bracket on the back, like I did with my eVGA 6200 video card. The only downside of this big heatsink is that it blocks nearly (if not fully) 3 of the PCI/PCI-E slots below it. But I guess that is the price you pay for decent cooling.

                              Originally posted by dragos2009 View Post
                              momaka i think you have a bga issue and not a cracked core possbily that line you see is a scratch on the core because if the core was cracked it is not possible to bring it back to life and you can't crack a core in that way the most easy part to crack are the corners of the core
                              Yes, I am starting to think that as well now. I've seen cracked cores before, and usually chunks will fall off. I've even slightly chipped the corners of an AMD Duron CPU myself (but luckily, they were very small chips, so the CPU remained functional).

                              The worst part is that I am starting to feel a little bad about leaving neutral feedback for the eBay seller. I thought the card broke because he didn't ship it too well. If he/she ever gets to read this message: I definitely want to say I apologize.
                              Attached Files
                              Last edited by momaka; 04-18-2016, 08:10 PM.



                                #16
                                Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – she's ALIVE!

                                Originally posted by ChaosLegionnaire View Post
                                your weird overheating nv41 warrants some investigation as i have mentioned before in a pm how my 6800 ultra doesnt overheat as much as yours when fully loaded in 3d mode.
                                I'm pretty sure it is the heatsink on mine that causes it. But I'll leave that for the 6800 XT thread, when I mount a big Xbox 360 CPU heatsink on it as well, like I did with this 8600. I'm pretty sure the temperatures will be quite different then.

                                Originally posted by ChaosLegionnaire View Post
                                the 6800/7800/7900/8800(g92)/9800 cards and the x1800/x1900/x1950/3800/4800/5800 cards on the ati side all have the same hole spacing.
                                For ATI, that may be true. But for nVidia, there is one standard (80 mm hole spacing, I think) and then each range of cards can have several different screw hole spacings. Based on that, a manufacturer may or may not choose to use the standard hole spacing, so a lot of nVidia video cards have weird heatsinks that don't always fit on anything else other than the card they were made for. One example of that would be the reference heatsinks for the 7600 and 7900 cards, which are not compatible despite looking similar (the 7900 heatsinks have legs that are 1 mm taller, so they won't touch the 7600 GPU core). That's one example. I can't remember off the top of my head right now which others were not compatible, but I do remember there were some and there wasn't much logic behind it. I'll have to dig through my junk box and check again.

                                Originally posted by ChaosLegionnaire View Post
                                i know this cuz i often search for third party coolers and they often list the abovementioned cards together with the same hole spacing compatibility.
                                Those 3rd party heatsinks probably use the standard spacing to avoid any problems.

                                Originally posted by Wester547 View Post
                                Who knows how long the Infineon RAM will last.
                                Probably longer than the GPU chip, if I don't change that stock heatsink.



                                  #17
                                  Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

                                  Originally posted by momaka View Post
                                  However, it now seems that whatever "fluke" made the card not work the first time, I cannot repeat it anymore. Well, maybe it is too early to say right now. So like you said...
                                  looking at the original pictures of the goldfingers on the card. the goldfingers seem to appear dirty or dusty. the card may have been left sitting outside collecting dust before being sold. by plugging and unplugging the card, u may have inadvertently "cleaned" the goldfingers such that it got less dirty and restored contact albeit barely enuff for the system to detect the card.

                                  did u clean the card's goldfingers when u got it? i've had video cards become unrecognisable because of a dirty/dusty pci-e slot and/or goldfingers. sometimes it may not look dirty/dusty but when u wipe it with a q-tip dipped in alcohol, the dirt on the q-tip becomes obvious. before declaring your used video card doa, u might wanna clean the pci-e slot and goldfingers first.

                                  for me, i dont clean my used video cards first. i immediately plug them in to make sure they are working but if it gives problems, i would try cleaning the goldfingers and heatsink, reapplying thermal paste to see if that brings it back before declaring it really doa.
                                  Originally posted by momaka View Post
                                  If you compare this temperature curve to the one I did above with my crappy HS mod, you can see that the card runs about 10°C hotter with the stock heatsink and fan, hitting 67°C in less than 1 minute of 3D-loading. And all of this with nearly the same 23°C (73-74F) room temperature. Unacceptable! I guess that is the price you pay for decent cooling.
                                  errr i think u have more than proven the point that stock heatsinks on video cards are almost always crappy. ever since i got the 9800se as an upgrade for my first rig in 2003, i have noticed a trend of video cards becoming as hot as or even hotter than cpus tdp wise but their heatsink size does not change in tandem with their tdp.

                                  if u want your video card to last u a long time, u almost always have to change the stock paste and heatsink. i have no idea why they always like slapping sheety heatsinks on video cards? either planned obsolescence or size constraints in the expansion slot area on motherboards and they hate complaints their beefy video card cooler blocks the adjacent pci slot when dealing with ppl who are expansion slot freaks and have all their expansion slots filled.
                                  Originally posted by momaka View Post
                                  The worst part is that I am starting to feel a little bad about leaving neutral feedback for the eBay seller. I thought the card broke because he didn't ship it too well. If he/she ever gets to read this message: I definitely want to say I apologize.
                                  u can open a case with ebay to have the feedback amended. however, do bear in mind that it might be an established fact that the item was poorly protected for shipping. u also need to weigh that against warning future buyers about a seller that does not really protect items well for shipping.

                                  if i paid over ten bucks for domestic shipping, i would expect the seller to use that money to get the necessary materials to protect the item well for shipping. or else if the packing materials are completely garbage or absent, i would be more inclined to think the seller profited off the shipping...

                                  as an international buyer, i would be really pissed to pay over 25+ bucks for international shipping through the gsp and have the item turn up trashed and then realise its too expensive to send the item back! i.e. it doesnt make sense to pay US$50 shipping to send back an item that costs less than the shipping cost itself!!!



                                    #18
                                    Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

                                    Originally posted by ChaosLegionnaire View Post
                                    looking at the original pictures of the goldfingers on the card. the goldfingers seem to appear dirty or dusty.
                                    ...
                                    did u clean the card's goldfingers when u got it? i've had video cards become unrecognisable because of a dirty/dusty pci-e slot and/or goldfingers.
                                    Eagle eyes!
                                    You're right, the gold pins on the card were indeed a bit dirtier than usual. No, I didn't clean them when I got the card. For some reason, this totally escaped my mind as a possible reason for the card not working. That said, I have brought a couple of RAM modules and HDDs back to life by doing this, so I wouldn't be surprised if that was the problem.

                                    Before wiping with alcohol, I usually use a pencil eraser to clean the contacts. Sometimes, even that is just enough. As strange as that may sound, it actually works quite well . Of course, make sure the eraser is good - that is, it needs to be soft and not smudge.

                                    I just cleaned the contacts on the 8600 GT, and they look much better now.

                                    Originally posted by ChaosLegionnaire View Post
                                    u might wanna clean the pci-e slot and goldfingers first.
                                    I'm pretty sure the PCI-E 16x slot on my 939Dual-SATA2 is clean. I use that PC almost exclusively for testing video cards. Thus the PCI-E 16x slot sees a lot of "traffic" - probably over 50 connect/disconnect cycles in the last year . I guess you can call that motherboard a slut - always "sleeping" with various video cards. In some cases, I've even had two video cards installed at once (one in the PCI-E 16x and another in the AGP). Would you say that is a threesome?

                                    Alright, I'll stop there on that topic.

                                    Originally posted by ChaosLegionnaire View Post
                                    i have noticed a trend of video cards becoming as hot as or even hotter than cpus tdp wise but their heatsink size does not change in tandem with their tdp.
                                    Indeed!

                                    Originally posted by ChaosLegionnaire View Post
                                    either planned obsolescence or size constraints in the expansion slot area on motherboards and they hate complaints their beefy video card cooler blocks the adjacent pci slot when dealing with ppl who are expansion slot freaks and have all their expansion slots filled.
                                    Probably a little bit of both.
                                    GPU heatsinks indeed have limits on how large they can be due to the slots below them. Yet, manufacturers seem to ignore this (almost on purpose) and just keep pushing the TDP of graphics cards higher and higher - at least for the high-end cards. And then there's the trend of ever-increasing demand for quiet/silent PC operation. I guess manufacturers use this as an excuse to turn down the fan speeds of a graphics card to make it quieter and satisfy buyers. I've seen some graphics cards with decent-sized heatsinks and fans, but the fans were turned down way too low. So in the end, you still get a GPU that fails faster and a buyer who has to purchase another card again.

                                    Originally posted by ChaosLegionnaire View Post
                                    u can open a case with ebay to have the feedback amended. however, do bear in mind that it might be an established fact that the item was poorly protected for shipping. u also need to weigh that against warning future buyers about a seller that does not really protect items well for shipping.
                                    I'll have a look into it, if eBay still allows me to. After all, the seller isn't someone who sells this stuff as a business. That means they probably had to go to the post office just to get the card shipped to me. Considering the time and effort to do that, and the fact that everything cost around $8, I guess I shouldn't complain.



                                      #19
                                      Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

                                      I decided to update/clean up the ghetto modded heatsink on this video card a little bit. Actually, I did that way back in August so I could test the video card when it was hottest in my room. Basically, all I did was change the ghetto-cut pieces of laminate wood for round "thumb-nuts" made from a wooden dowel rod. I also made the retention springs myself from the metal handles of Chinese food containers. The springs ended up quite stiff, but they still take most of the pressure, so the PCB doesn't warp nearly as much as it did with the old mod. And I switched the old screws to PS3 case screws.

                                      Anyways, here is how it looks now:



                                      And of course, let's not forget the accompanying SpeedFan temperature graphs:


                                      As you can see, things look much better now... well, in terms of temperature anyways (I'm sure some of you still find this heatsink mod an eyesore.) I managed to keep the GPU core at 57°C, despite the ambient room temperature being higher now than it was the last time I tested with that heatsink (and this time I also applied the thermal compound properly.) Speaking of which, the ambient room temperature was about 28°C (82-83°F) at the time of testing (and all summer long, really). Computer setup was the same with my 939Dual-SATA2 motherboard and open-case PC. The only thing I did change, however, was that I had an 80 mm fan blowing on the heatsink from about 4 cm (~1.5 inches) away instead of my usual 120 mm fan setup that I had previously with this card and the GeForce 6800 XT.
                                      Why? Because the 120 mm fan wasn't doing that great of a job cooling from far away (I got core temperatures up to 63°C), since this heatsink has very closely-spaced fins and needs more pressure for the air to pass through it. The 80 mm fan from up close ended up working much better. Speaking of which, the fan used was a Te Bao Metallic Plastic, rated for 12V and 0.14 Amps. Although I was running it on 12V, it wasn't pushing that much air and thus wasn't very loud either (it's one of those low-power fans). So this was still overall a fairly quiet setup.
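                                      To put those readings side by side, here is the temperature rise over ambient for the three cooling setups mentioned in this thread (just a quick sketch using the numbers quoted in the posts above):

```python
# GPU core temperature rise over ambient for the three cooling
# setups described in this thread (readings quoted in the posts)
runs = [
    ("stock heatsink + 50 mm fan", 67, 23),   # (setup, core C, ambient C)
    ("modded heatsink, 120 mm fan far away", 63, 28),
    ("modded heatsink, 80 mm fan up close", 57, 28),
]

for setup, core, ambient in runs:
    print(f"{setup}: {core - ambient} C above ambient")
```

                                      So the 80 mm fan up close gives roughly a 15°C lower rise over ambient than the stock cooler did, which matches what the SpeedFan graphs show.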

                                      Anyways, that is all for now again. I've been playing with the 50 mm fan from the original HS and have it installed on the modded heatsink above. Might make a fan shroud and see how that setup works. Running it "open" on the heatsink is okay for the low temperatures in my room now. But I think summer could be problematic, so this needs more testing.

                                      Also, I purchased another XFX GeForce 8600 GT last week, again with bad Sacon FZ caps and all. Should be arriving any day now. So if you like seeing busted Sacon FZ, stay tuned for more.
                                      Attached Files



                                        #20
                                        Re: XFX GeForce 8600 GT [PV-T84J-UDD3 V5.0] – recap information and shenanigans

                                        OK, next weekend I'll start the "winter session" of my hobby: repairing an nVidia 8500 video card from ZOTAC.
                                        This topic inspired me, thanks!

