Gigabyte 7300GS complete overhaul and modding


    #21
    Re: Gigabyte 7300GS complete overhaul and modding

    I undervolted various GPUs and CPUs to reduce their output power. I have an M17x R3 with an HD 6990M that failed, and it returned to life after a reflow. That was more than two years ago. I tuned the voltage as finely as I could.
    I think that manufacturers don't care about the chip's working temperature as long as the device lasts past the warranty period.
    Planned obsolescence, in my opinion.



      #22
      Re: Gigabyte 7300GS complete overhaul and modding

      Nah, it's more cost cutting than anything else. Higher default voltage pushes up the yields from a given wafer, allowing chips with imperfections to perform, at the cost of higher temperature and power consumption for the whole lot.

      Now, with the push towards higher efficiency and lower power consumption, most processors and GPUs have serial-controlled power supplies with self-tuning capability, and the degree of factory overvoltage has been greatly reduced; most modern chips run damn close to the minimum voltage required for stability. You can also control them and undervolt via software.
      Originally posted by PeteS in CA
      Remember that by the time consequences of a short-sighted decision are experienced, the idiot who made the bad decision may have already been promoted or moved on to a better job at another company.
      A working TV? How boring!



        #23
        Re: Gigabyte 7300GS complete overhaul and modding

        But what if they program those modern serial-controlled power supplies to supply a higher voltage than needed?



          #24
          Re: Gigabyte 7300GS complete overhaul and modding

          It's not in their best interest, as that would bring higher power consumption, thus lower battery life and poorer scores in reviews. The default voltage is picked for the whole wafer, like I explained above. Nowadays the chips have algorithms in them whereby each individual chip can tune its own power supply - I have noticed on an nVidia GT 555M that even a very small drop in voltage will lead to instability; the factory voltage was already optimal. A significant difference vs. the 8000 series, where the chips were fed 1.2 or 1.25 volts but would happily run full clocks on 0.95 or even lower.

          The whole "planned obsolescence" thing can be explained by "make it cheaper so we can sell more". It's all about the penny pinchers in the marketing department. They're everywhere.
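          The wafer-level binning described above can be sketched as a toy simulation: each die has its own minimum stable voltage, but the factory programs one default for the whole lot, so most chips end up overvolted. All numbers here (the Vmin distribution and the guard band) are made-up illustration values, not real factory data:

```python
import random

# Toy model of wafer-level voltage binning.
# Each die gets a random minimum stable voltage (made-up distribution).
random.seed(0)
vmin_per_die = [random.gauss(0.95, 0.05) for _ in range(100)]  # per-die Vmin, volts

guard_band = 0.05                           # hypothetical safety margin
v_default = max(vmin_per_die) + guard_band  # one setting must cover every die

# Headroom wasted on each individual die = how much it is overvolted.
overvolt = [v_default - v for v in vmin_per_die]
print(f"default: {v_default:.2f} V, "
      f"average overvolt: {sum(overvolt) / len(overvolt) * 1000:.0f} mV")
```

The weakest die on the wafer sets the default for everyone, which is why a typical (better-than-worst) chip has undervolting headroom out of the box.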
          Last edited by Th3_uN1Qu3; 10-25-2016, 10:55 PM.



            #25
            Re: Gigabyte 7300GS complete overhaul and modding

            If they are following the path of efficiency like you say, I'm really pleased about that, but those "penny pinchers" you mention will always exist, I guess.
            I've been toying with an 8800 GTS that overheats while on the desktop; imagine what happens with FurMark... I've found a lot of tutorials to overvolt it, but none to undervolt, and underclocking it has no effect; they made it harder to undervolt, I think. But with a 7900 GS, underclocking decreases the heat output a lot. So it's difficult to improve design faults... even more so if they are there on purpose.
            It's only my point of view, of course, nothing more.



              #26
              Re: Gigabyte 7300GS complete overhaul and modding

              Originally posted by Th3_uN1Qu3 View Post
              ... the factory voltage was already optimal. A significant difference vs. the 8000 series, where the chips were fed 1.2 or 1.25 volts but would happily run full clocks on 0.95 or even lower.
              Woah, is that really true?! If yes, I definitely need to try that on my XFX 8600 GT cards. Wonder how far I can go down without decreasing the clocks.

              uN1Qu3, thanks for the idea man. You are truly a genius!

              Originally posted by hikaruichijo View Post
              I've been toying with an 8800 GTS that overheats while on the desktop; imagine what happens with FurMark...
              I know! I know! Pick me! Please, pick me! *raises hand*
              - You get a "NoVidia" GPU.

              Make sure to get the cooling under control, otherwise it will be another bumpgated card. I just finished reflowing my 8800 GTS the other day, and... it's still dead as a rock. (Well, the GPU core voltage comes up. But the RAM voltage comes up only for a few seconds, then goes back to zero again. However, it did that before the reflow, too. Bad RAM or bad GPU?) Something tells me I won't be able to save that one, just like the two GeForce 7900 GS cards I got a while back. The GeForce 7 and 8 series are doomed if you don't keep them under 60°C.

              Originally posted by hikaruichijo View Post
              So it's difficult to improve design faults... even more so if they are there on purpose.
              It's only my point of view, of course, nothing more.
              Yes, I think I know what you are saying, and I kind of agree with it. Basically, the manufacturers are using these "valid" reasons to also, possibly, build in planned obsolescence. Then again, if end users expect something at low prices, they will get what they pay for.
              Last edited by momaka; 10-27-2016, 09:35 PM.



                #27
                Re: Gigabyte 7300GS complete overhaul and modding

                That estimate of 60 degrees Celsius max is the same one I have for Nvidia series from the 6000 up to at least some 500-series cards that I could test. I killed a Gigabyte GTX 550 Ti with FurMark; half an hour at 86 degrees was enough. I reflowed the 8800 GTS 4 or 5 times and it is still working. I use a hairdryer because at home I don't have a hot air station. But I really want to undervolt that card, because I'm sure that's the best solution since it's using the stock cooler. But I don't have the knowledge to do it. From my point of view, bumpgate wasn't only Nvidia's fault. It was mainly because hardware vendors didn't use adequate cooling although they knew that heat is one of the most common causes of critical flip-chip failures, and also because of the factory overvolting, which I'm sure was intentional.
                I'm running my Sapphire HD 7970 Dual-X, non-GHz edition, at 1100 MHz with 1100 mV, and the stock voltage for 925 MHz was 1200 mV; there's something interesting in that, I think.
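                As a rough sanity check of those HD 7970 numbers, the common first-order CMOS approximation P_dyn ∝ f·V² (leakage ignored, so only a ballpark) suggests the undervolted profile trades about 19% more clock for roughly the same dynamic power:

```python
# Rough dynamic-power comparison for the HD 7970 figures above,
# using the first-order CMOS approximation P_dyn ∝ f * V^2.
# Leakage is ignored, so this is only a ballpark estimate.

def relative_dynamic_power(f_mhz: float, v_volts: float) -> float:
    """Dynamic power in arbitrary units, proportional to f * V^2."""
    return f_mhz * v_volts ** 2

stock = relative_dynamic_power(925, 1.200)   # factory: 925 MHz @ 1200 mV
tuned = relative_dynamic_power(1100, 1.100)  # undervolted: 1100 MHz @ 1100 mV

print(f"tuned/stock power ratio: {tuned / stock:.3f}")
# ~0.999: about 19% more clock for roughly the same dynamic power.
```

By this simple model, the quadratic voltage term almost exactly cancels the linear clock increase, which matches the poster's hunch that "there's something interesting in that".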



                  #28
                  Re: Gigabyte 7300GS complete overhaul and modding

                  Originally posted by hikaruichijo View Post
                  I reflowed the 8800 GTS 4 or 5 times and it is still working. I use a hairdryer because at home I don't have a hot air station. But I really want to undervolt that card, because I'm sure that's the best solution since it's using the stock cooler. But I don't have the knowledge to do it.
                  Well, feel free to start a new thread and ask there. Provided we both have the same card (a 320 MB or 640 MB 8800 GTS, and NOT the 8800 GTS 512 MB version that's based on the G92 GPU), I don't mind doing some experiments with my 8800 GTS. Mine is eVGA brand, but underneath it is just a stock 8800 GTS actually built by nVidia, with eVGA stickers. So I can mess around with the GPU VRM controller and see if I can lower the voltage.

                  On the low and mid-range video cards, volt-modding is very easy, because they typically have just two simple PWM controllers: one for the GPU core and one for RAM.

                  Originally posted by hikaruichijo View Post
                  That estimate of 60 degrees Celsius max is the same one I have for Nvidia series from the 6000 up to at least some 500-series cards that I could test.
                  Yeah, you're right. The GeForce 6x00 series were also affected, though they didn't seem to fail as much due to lower heat output. In fact, if I recall correctly, even Th3_uN1Qu3 mentioned this somewhere a long time ago.

                  Also, I am starting to suspect that even early ATI Radeon video cards like the Radeon 9700 and 9800 up to the x1000 series may also have something similar to the bumpgate issue. I say this, because the older Radeon 9700 and 9800 video cards use leaded solder, yet they still fail with artifacts quite often. Moreover, no matter how many times I've reflowed some of my failed Radeon 9700 video cards, they won't come back to life. Likewise, when I briefly worked in a console repair shop, some of the reballed Xbox 360s still came back. We used leaded solder, so I doubt the solder failed. The Xbox 360 GPUs are based on the ATI R520 (Radeon x1800) core. So I think ATI may have had a problem as well, but just not as bad as nVidia's.

                  Anyways. That said, I guess I will also be staying away from the several GeForce GTX 260/275/285 "For parts or repair" video cards I have bookmarked on eBay. Now the Radeon HD4850 cards, on the other hand, are wonderful when it comes to reflowing. I've had 3.5 successful repairs out of 4 attempts so far (the 0.5 loss is because one card is missing blue or green on the analog output on both DVI connectors, so probably a GPU BGA problem). Got 3 more to go. I think there is a good chance I will get all of them working, but let me not jinx it, of course.
                  Also saw an HD4870 x2 for really cheap recently (it's listed "for parts or repair", of course). I'm really tempted to grab it, even though there are a ton of video cards with better performance/watt.

                  Originally posted by hikaruichijo View Post
                  I'm running my Sapphire HD 7970 Dual-X, non-GHz edition, at 1100 MHz with 1100 mV, and the stock voltage for 925 MHz was 1200 mV; there's something interesting in that, I think.
                  Thanks for sharing, good to know.
                  Makes me even more curious to examine ALL of my newer video cards. The single-slot HD4850 cards could probably benefit the most from that. (Seriously, was ATI/AMD smoking something funny when they thought a 110 Watt card could be cooled by a single-slot cooler?)
                  Last edited by momaka; 10-28-2016, 07:01 PM.



                    #29
                    Re: Gigabyte 7300GS complete overhaul and modding

                    Originally posted by momaka View Post
                    Also, I am starting to suspect that even early ATI Radeon video cards like the Radeon 9700 and 9800 up to the x1000 series may also have something similar to the bumpgate issue. I say this, because the older Radeon 9700 and 9800 video cards use leaded solder, yet they still fail with artifacts quite often. Moreover, no matter how many times I've reflowed some of my failed Radeon 9700 video cards, they won't come back to life. Likewise, when I briefly worked in a console repair shop, some of the reballed Xbox 360s still came back. We used leaded solder, so I doubt the solder failed. The Xbox 360 GPUs are based on the ATI R520 (Radeon x1800) core. So I think ATI may have had a problem as well, but just not as bad as nVidia's.
                    I'm certainly not an expert on this topic (I'm sure Unique knows more than me), but it's my understanding that it's a bit more complex than "all cards that use low Tg underfill are defective". The "bumpgate issue" is really two or three (or four?) issues that can summarily be described as the following:

                    1) Using high lead bumps with eutectic pads (bumpgate) in lieu of eutectic bumps and eutectic pads. This is bad because high lead bumps cannot handle thermal stress as well as eutectic bumps and will fracture with far more ease (although eutectic bumps cannot handle as much current as high lead bumps). If a power bump dies, it may not be a big deal, but if a signal bump dies, that would be certain doom for the GPU.
                    2) Using a low Tg underfill that starts to soften at 60˚C and quickly turns to mush at 80+˚C.
                    3) As a result of issue #2, the die is never properly soldered to the substrate to begin with.
                    4) As Unique said, nVidia derated the current capability of the chips by overvolting them and that caused them to produce far more heat than necessary.

                    Eutectic bumps and pads came with the move to RoHS products / the lead-free standard. ATi/AMD switched to eutectic bumps in 2005, if I recall correctly. I don't know if they used high Tg or low Tg underfill, but I'm pretty sure pre-RoHS GPUs used high lead bumps, and high Tg underfill wasn't as common back then and so was used less. I thought the issue with the Xbox 360 Xenos GPUs was lead-free BGA solder cracking from thermal cycles and an overabundance of heat. Many R300s seem to die because of crappy Infineon BGA RAM chips running way too hot for too long, but it's possible for the core BGA to give way too. It could be a shorted chip as well.

                    As a postscript, I hope the information I posted is correct and isn't just hot air or biased.
                    Last edited by Wester547; 10-28-2016, 07:47 PM.



                      #30
                      Re: Gigabyte 7300GS complete overhaul and modding

                      Originally posted by momaka View Post
                      the older Radeon 9700 and 9800 video cards use leaded solder, yet they still fail with artifacts quite often.
                      i thought the primary reason for those cards failing was a completely inadequate, puny stock cooler that ran the chip at 85-90°C at max load. the solder between the die and substrate is under more thermal stress than the bga between the substrate and pcb, being closest to the primary heat-producing part. so based on my understanding of physics, its no surprise the entire chip will "disintegrate" by desoldering itself or forming whiskers, starting first with the die-substrate interface and later the substrate-pcb interface.

                      the issue is not whether they properly designed the chip to handle the high operating temperatures now. its why they chose to use such a crappy cooling system for such a "high" tdp chip i.e. cooling the chicken first vs cooling the egg first issue.

                      to lend some circumstantial anecdotal evidence to my point, i had a radeon 9800 se softmodded to a pro and i put an arctic silencer 1 on it as i was warned by a modding site, techarp, that the stock cooler is inadequate for cooling the card. the gpu lasted me around 7 years (which is more than twice the warranty period) of 12x7 operation overclocked to 432mhz being used for heavy gaming.

                      i also briefly tested the stock cooler before replacing it to see exactly how crappy the website said it was. at idle, when i put my finger on the heatsink, i was surprised that it ran only slightly hot. around 50-60°C, i think. but at full load, it burnt my finger when i put my finger on the heatsink and i was quite shocked at how hot it ran under load.

                      i'd bet in a closed case with crappy ventilation, it may even hit 100°C when running in hot tropical countries. even 80°C was much too hot to run hardware at the time, even well-built hardware. it wont last long like that, maybe not even past the warranty period, regardless of the manufacturer saying its fine at those temperatures. it may be fine for them (the manufacturer) to get it to last up till the warranty period expires, after which u'll be upgrading to the next fastest card. but we are trying to get it to last as long as possible. that is the key word here! ALAP! which is an entirely different ball game from what the manufacturers are trying to play/pull here.

                      so my final word to fellow member hikaruichijo is to try to get better cooling for his constantly failing 8800 gts card and/or to invest some money in a better third-party cooler. it is a far better solution to improve the cooling on the card than to constantly reflow it, which could be potentially bad for the chip, as bga chips are rated for something like 5 or so reflow soldering thermal cycles. if it survives more reflow cycles than that, its purely by luck and living on borrowed time. i dont test theories like that or gamble with my priceless antique hardware. do you?
                      Originally posted by momaka View Post
                      Also saw an HD4870 x2 for really cheap recently (it's listed "for parts or repair", of course). I'm really tempted to grab it, even though there are a ton of video cards with better performance/watt.
                      be aware, the 4850x2 and 4870x2 cards are infamous for overheating cuz of the poorly designed thermal solution. so its no surprise there are so many of those failed cards floating around. there are also almost no third party coolers for the x2 cards due to the dual gpus on the pcb. most third party coolers are designed to be compatible with single gpu cards only. therefore, the x2 cards are fun to play around with for the enthusiast crowd but only while they last! after that, the fun's over awww...

                      if u are up for the challenge of ghetto modding a proper thermal solution for it to prevent it from failing again, then u may feel free to get one of those cards. beware, that it will be extremely challenging to ghetto cook (no pun intended with the term "cook") up something to cool the x2 cards properly as the 4870x2 dissipates 286w of tdp but i'd bet u'd love the challenge huh?
                      Originally posted by momaka View Post
                      The single-slot HD4850 cards could probably benefit the most from that. (Seriously, was ATI/AMD smoking something funny when they thought a 110 Watt card can be cooled by a single-slot cooler? )
                      u probably have the noob consumers to blame for that. they demand access to all of their expansion slots or they demand a high-end shoebox gaming computer so they have to make the cooler single slot. not that a cramped high-end gaming computer with terrible ventilation is a smart idea in itself...
                      Originally posted by Wester547 View Post
                      4) As Unique said, nVidia derated the current capability of the chips by overvolting them and that caused them to produce far more heat than necessary.
                      i thought that would be fine as, theoretically, it would result in zero net change in heat produced anyway, *assuming* the current decrease was equal to the voltage increase. i'm talking about the P=VI formula here. i guess they got lazy and increased the voltage far more than necessary to make up for the decrease in current. ah well, nobody's perfect, and nothing is perfect for that matter too...



                        #31
                        Re: Gigabyte 7300GS complete overhaul and modding

                        Just to provide one single point of reference.
                        I used an 8800GTS 512MB card from 2007 to 2013.
                        I had it voltmodded and ran it at a Vcore of ca. 1.3v all that time.
                        The card was watercooled with a full-cover waterblock; never any issues.
                        Since I stopped using the card I've put it in my parents' PC, because the 7800 that was in there did not have good drivers for Win10.
                        At the time I did that, I removed the voltmod and installed the card's original cooler.
                        The card still works fine like that.
                        "The one who says it cannot be done should never interrupt the one who is doing it."



                          #32
                          Re: Gigabyte 7300GS complete overhaul and modding

                          I've already thought about starting a thread to try to undervolt the 8800. It is an Asus 8800 GTS 320 with a G80 core. Under the Asus sticker is an original Nvidia sticker; is Asus a rebrander?
                          I'm not sure if it's possible, but I'm sure it's the real solution.
                          Undervolting is always interesting when you have a TDP that your cooler can't handle or you want a quieter system, and after that comes underclocking, but usually I don't underclock; nobody wants to lose horsepower, right?

                          Thanks for the advice. I already know that changing the cooler of the 8800 GTS would help and maybe solve the problem, but I really don't care about those cards. I've had another 8800 GTS that also died, and likewise an 8800 Ultra; every one of them free, because our customers usually don't want to take trash home, much less unreliable hardware.
                          I also have a motherboard with an nForce chipset because of that... and a Sony laptop with an 8400GT that I could improve, and it is lasting longer than when it was new: 3 years versus 6 going on 7, and other Nvidia-based hardware that failed and is working now.

                          From my point of view, crappily designed hardware with an intentional fault that is very difficult to solve is not priceless; it is good for learning things that I'll never do to quality hardware, or hardware with a minor fault.

                          And I don't want to spend a penny on those cards, because I have the feeling that they were made to look powerful and the rest didn't matter. Quality? Where? Manufacturing facility tests? Why bother, as long as they last past the warranty...
                          And to make things worse, they used to say "don't worry, Nvidia cards can handle more than 100 degrees Celsius", even on Nvidia official forums...
                          And the funny thing was that people bragged about their big overclocks and thought their cards were the best, when the big overclocks came from the factory overvolt...
                          My HD 7970 can go up to 1300 MHz at the factory default 1200 mV, but the current increases a lot and the VRMs reach more than 90°C; the core reaches only 65 degrees with FurMark. It is difficult to cool the VRMs because they are small, don't have a clear area around them, and the height is limited by the GPU cooler.

                          I don't want to offend; sorry if it seems like it.

                          I have old hardware because I love it. It's useless nowadays, but it's beautiful to me, I don't know why. I don't like stuff that was made to fail; I like stuff that was made to last.



                            #33
                            Re: Gigabyte 7300GS complete overhaul and modding

                            Of course you are not offending, I just wanted to confirm that the main issue is cooling.
                            With a water block the card never reached 60°C, even with heavy volt modding and overclocking; not a problem...
                            Your post did intrigue me though, so I modded the BIOS to lower the Vcore from 1.15v to 1.0v.
                            Double-checked with a DMM and it worked a treat; the card was stable with FurMark.

                            Here is a guide if you want to try it yourself, scroll down to post #3 and "voltages"
                            https://www.computerbase.de/forum/sh...36#post3932836

                            I had to use an older version of nvflash to flash the modded BIOS; the latest one would not detect my card.
                            I used this one: https://www.techpowerup.com/download...-1-for-windows
                            Last edited by Per Hansson; 10-30-2016, 10:44 AM.



                              #34
                              Re: Gigabyte 7300GS complete overhaul and modding

                              I already tried modifying the BIOS, even putting the fan always at 100%; pretty loud... But it still idles at around 45-50°C, too much for my taste and for the card's health. The card's BIOS only has three voltages to choose from, from 1100 mV to 1300 mV, and only in the 3D profile. I thought about flashing the BIOS from another vendor, but there is a good chance it won't boot. I can recover the card with another PCI VGA, but... I've flashed a Gigabyte HD 6950 with a 6970 BIOS and it wasn't stable, but modifying the original 6950 BIOS with the 6970 settings works perfectly: the same GPU/memory clocks and shader count as the 6970, and it is working great, until now at least...
                              Last edited by hikaruichijo; 10-30-2016, 01:02 PM.



                                #35
                                Re: Gigabyte 7300GS complete overhaul and modding

                                the point is to take better care of the stuff, which the manufacturer neglected to do, not to trash it even further. then whats the point in getting it in the first place if u're not going to respect/treasure it? one man's meat is another man's poison.



                                  #36
                                  Re: Gigabyte 7300GS complete overhaul and modding

                                  Well, I agree with you, but I don't know another way to test and research things, and I think it's better to do tests and research with hardware that wasn't good from the beginning; I don't care about the brand/manufacturer...
                                  Even if I were drunk or crazy, I'd never do that kind of thing to my pair of Voodoo 2 cards; they are going to survive me and live peacefully in SLI mode...



                                    #37
                                    Re: Gigabyte 7300GS complete overhaul and modding

                                    Originally posted by ChaosLegionnaire View Post
                                    i thought that would be fine as, theoretically, it would result in zero net change in heat produced anyway, *assuming* the current decrease was equal to the voltage increase. i'm talking about the P=VI formula here. i guess they got lazy and increased the voltage far more than necessary to make up for the decrease in current.
                                    No, the power consumption of CPUs/GPUs doesn't work like that.

                                    It varies according to the core load. Therefore, when calculating maximum TDP, the formula P = V*I is only applicable at maximum core load *and* the rated voltage for the chip. Let's take my Radeon HD4850 video card as an example. 110 Watts TDP max and around 1.10V core. That makes the current, at maximum GPU load, I = 110W / 1.10V = 100 Amps (approx.). So imagine I lower the core voltage from 1.10V to 1.00 V. The power consumption would then be P = 1.00V * 100 A = 100 Watts. That's a 10 Watt decrease in TDP. Thus, if you decrease the core voltage, the power consumption will also decrease.

                                    Moreover, as you decrease the core voltage, the current the core draws will also decrease slightly. As a result, you don't get just a linear decrease in power when you lower the core voltage - the decrease is steeper than linear, roughly following the square of the voltage (but only up to a point). The same applies if you increase the core voltage. That's why extreme overclocking requires extreme cooling: the power consumption really goes through the roof when the voltage is grossly increased.
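                                    A minimal sketch of the arithmetic above, plus the steeper-than-linear case under the usual first-order P ∝ V² assumption (a simplification; real chips also have leakage):

```python
# The arithmetic from the post above: Radeon HD 4850, 110 W TDP at 1.10 V core.
tdp_w = 110.0
v_stock = 1.10
i_max = tdp_w / v_stock            # ≈ 100 A at full load

# Naive linear estimate (current assumed constant while voltage drops):
v_new = 1.00
p_linear = v_new * i_max           # 100 W, a 10 W drop

# In practice the current also falls with voltage; a common first-order
# model is P ∝ V^2, giving a steeper-than-linear saving:
p_quadratic = tdp_w * (v_new / v_stock) ** 2

print(round(i_max), round(p_linear), round(p_quadratic, 1))
# prints: 100 100 90.9
```

So the same 0.10 V undervolt saves about 10 W under the constant-current view, and roughly 19 W once the current reduction is modeled too, which is the "more than linear" effect the post describes.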

                                    Originally posted by Per Hansson View Post
                                    Just to provide one single point of reference.
                                    I used an 8800GTS 512MB card from 2007 to 2013.
                                    I had it voltmodded and ran it at a Vcore of ca. 1.3v all that time.
                                    The card was watercooled with a full-cover waterblock; never any issues.
                                    Since I stopped using the card I've put it in my parents' PC, because the 7800 that was in there did not have good drivers for Win10.
                                    At the time I did that, I removed the voltmod and installed the card's original cooler.
                                    The card still works fine like that.
                                    Well, first you have to consider the fact that you have a G92 core card there.
                                    Those have much lower consumption (108 Watts TDP?) than the G80 in the original 8800 cards (GTS 320/640 MB and GTX/Ultra: 140-170 Watts TDP). Also, as you noted in your post after this one, it never ran past 60C with the water block, which is very important. And third: from 2013 to present, it has only been 3 years that your card was in your parents' PC. That's not always long enough for BGA issues to appear. And considering the way that card was used, I can probably safely assume that your parents don't play hardcore 3D games that stress the GPU to its maximum. So the card probably never ran that hot and was mostly idling at the desktop. With all of that in mind, I'm not surprised that it is still working. But under-volting it was definitely a worthwhile endeavor on your part and should help increase its lifetime considerably.

                                    Originally posted by hikaruichijo View Post
                                    I've already thought about starting a thread to try to undervolt the 8800. It is an Asus 8800 GTS 320 with a G80 core. Under the Asus sticker is an original Nvidia sticker; is Asus a rebrander?
                                    I guess so. Just like my eVGA card then.

                                    I looked through my stash of other nVidia cards, and I found that the eVGA 7600 GT cards are also just rebadged regular nVidia cards... but with Sacon FZ caps!!!

                                    Originally posted by hikaruichijo View Post
                                    From my point of view, crappily designed hardware with an intentional fault that is very difficult to solve is not priceless; it is good for learning things that I'll never do to quality hardware, or hardware with a minor fault.
                                    I agree 100% with that.

                                    That's actually the main reason why I quite often try to repair "hopeless" hardware. In the process, you learn a lot from the designer's mistakes, and that can greatly expand your knowledge about what works and what doesn't.

                                    Originally posted by hikaruichijo View Post
                                    And I don't want to spend a penny on those cards, because I have the feeling that they were made to look powerful and the rest didn't matter. Quality? Where? Manufacturing facility tests? Why bother, as long as they last past the warranty...
                                    Yes. What you are saying is sad... but true unfortunately.
                                    That said, I don't think all the blame should be put on the manufacturers. First, the whole "upgrade constantly" mentality needs to be rooted out of our society. That means advertisers, shops, and even us (the buyers). When almost everyone is upgrading in 2-5 year cycles or less, the manufacturers simply respond to that trend and see it as unnecessary to design anything to last past that mark. And then there is the human nature in all of us to always expect more for less... typically that means a better-performing product for lower cost (and lower cost has really been a major factor in the last decade of computer hardware).

                                    So, to sum it up: we get what we pay for... if we are lucky.

                                    Originally posted by hikaruichijo View Post
                                    I don't want to offend, sorry if it seems like it.

                                    I have old hardware because I love it. It's useless nowadays, but it's beautiful to me, I don't know why. But I don't like stuff that was made to fail; I like stuff that was made to last.
                                    I don't think you are offending anyone here, so no need to apologize. In fact, it's good to see there's at least a few of us here who care to have this discussion.

                                    Originally posted by ChaosLegionnaire View Post
                                    be aware, the 4850x2 and 4870x2 cards are infamous for overheating cuz of the poorly designed thermal solution.
                                    Oh, yes, I am well aware of that. 286 Watts of TDP (two 4870 GPUs at 140 Watts TDP each, plus the rest of the board) is no joke. The size of the GPU heatsink for these cards is obviously not adequate. I'll probably have to go with water cooling on those... but that's if I do get a 4870 X2. Right now, I don't feel like spending $22 on a piece of hardware that doesn't even work and needs modding to get it working right. If it dropped down to $15 or less, I might be a little more interested. Though, to be honest, I feel a bit like hikaruichijo here: I think this stuff was designed to be fast and not last long, so I'm not *that* interested in spending a lot of money on an item that really should be considered nothing more than scrap/e-junk (and priced as such).
                                    Last edited by momaka; 11-04-2016, 11:51 AM.

                                    Comment


                                      #38
                                      Re: Gigabyte 7300GS complete overhaul and modding

                                      Today my boss found a CLUB3D 4870 1GB. It boots, but with a lot of garbage on the screen. When I saw the cooler and the two 6-pin PCI-e power connectors, everything was clear: a cooler with only one heatpipe that doesn't even span the side of the board, for that TDP, something was clearly wrong there, I think.
                                      Another example of penny pinching: cheaper cooler = cheaper card = it won't last long....

                                      Comment


                                        #39
                                        Re: Gigabyte 7300GS complete overhaul and modding

                                        since we are talking about undervolted video cards, i'd like to talk about an evga 9600gt low power video card that i got a few months back. that card uses the g94b 55nm gpu core undervolted to 1 volt with only a tdp of 59 watts compared to 95w for the g94a 65nm version.

                                        when i got the card and tested it with the stock cooler, i was surprised the stock cooler was actually "acceptable" for cooling the gpu with a furmark load temp of 70°C. normally, i always end up with completely inadequate stock coolers with 75-85°C load temps, so i was a bit surprised with the stock cooling.

                                        next, i changed the stock cooler to a zalman vf700 cu and the furmark load temps dropped to 56°C! i was pleasantly satisfied with that, as it is below the bumpgate threshold, even tho the gf9 series may not be affected by bumpgate. what surprised me more was that it also ran cooler than the geforce 6600gt and radeon 9800 pro, which both have a tdp of 47-50w. based on this, i believe they overestimated the 59w tdp for the 9600gt green edition. from the load temperatures i got, i calculated using a °C/W calculator that the actual tdp was more like 40w! thats fantastic for a mid-high-end card. i can't find another gpu at that tdp with better performance that doesn't need a pci-e power connector, so i'd actually call it a "pocket rocket video card"! very nice performance-per-watt ratio there...
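                                        for anyone curious how that °C/W back-calculation works, here's a minimal sketch. note the 0.8 °C/W cooler figure and the 25°C ambient are assumed example numbers for illustration, not measured specs for the vf700 cu:

                                        ```python
                                        # Rough TDP estimate from the temperature rise over ambient:
                                        # power (W) = (T_load - T_ambient) / theta, where theta is the
                                        # cooler's thermal resistance in degC per watt.
                                        # theta = 0.8 degC/W and 25 degC ambient are assumed example values.

                                        def estimate_tdp(t_load_c, t_ambient_c, theta_c_per_w):
                                            """Estimate dissipated power from load temp, ambient temp, and degC/W."""
                                            return (t_load_c - t_ambient_c) / theta_c_per_w

                                        # 56 degC furmark load, 25 degC ambient, ~0.8 degC/W cooler:
                                        print(estimate_tdp(56, 25, 0.8))  # -> 38.75, close to the ~40w figure
                                        ```

                                        of course this is only as good as the theta figure you plug in, so treat it as a sanity check rather than a measurement.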
                                        Last edited by ChaosLegionnaire; 11-08-2016, 09:13 PM.

                                        Comment


                                          #40
                                          Re: Gigabyte 7300GS complete overhaul and modding

                                          Originally posted by hikaruichijo View Post
                                          Today my boss found a CLUB3D 4870 1GB. It boots, but with a lot of garbage on the screen. When I saw the cooler and the two 6-pin PCI-e power connectors, everything was clear: a cooler with only one heatpipe that doesn't even span the side of the board, for that TDP, something was clearly wrong there, I think.
                                          Another example of penny pinching: cheaper cooler = cheaper card = it won't last long....
                                          Still save that video card, though. The HD4850 and HD4870 video cards have much higher chances of working after a reflow. I've heard the Radeon HD2000 and HD3000 series are like that too.

                                          By the way, I have a curious question about your GeForce 8800 GTS: what resistance do you get for the GPU V_core to ground and RAM Vdd/Vddq to ground? (Also, what is the lowest resistance your multimeter can measure?)
                                          I am getting 0.2 Ohms for GPU V_Core and around 24 Ohms for the RAM Vdd/Vddq supply. Just checking, because I have reflowed my 8800 GTS twice now, and it's still dead as a rock. I can't even get the PC to POST with a PCI/AGP card installed to view the 8800 GTS' BIOS.

                                          Originally posted by ChaosLegionnaire View Post
                                          since we are talking about undervolted video cards, i'd like to talk about an evga 9600gt low power video card that i got a few months back. that card uses the g94b 55nm gpu core undervolted to 1 volt with only a tdp of 59 watts compared to 95w for the g94a 65nm version.
                                          Nice card!

                                          I like video cards under 80 Watts TDP - it's much easier to keep their temperature at reasonable levels. IMO, a 40-60 Watt TDP card with a dual-slot cooler would live for a very long time and never overheat. At some point, I was really looking into the MSI RX2600XT Diamond 512/Plus
                                          https://us.msi.com/Graphics-card/RX2...-specification
                                          ... but it's a pretty weak card performance-wise and wasn't quite at a price I thought was worthwhile, so I didn't get it.

                                          Originally posted by ChaosLegionnaire View Post
                                          "pocket rocket...
                                          I like that term!

                                          Comment
