
Forcing a video card fan to always run at full speed.


    #21
    Re: Forcing a video card fan to always run at full speed.

    Originally posted by Spork Schivago View Post
    I'll put a fresh copy of 10 on another hard drive when I get his PC back (he currently has it and it's hard getting in touch with him right now). Even in trial mode, it should at least allow us to rule out or confirm a hardware issue vs. a software issue.
    Exactly!

    Trial modes are more than fine for testing hardware.

    Originally posted by stj View Post
    what asshole allows the o.s. to control the fans?
    *Raises hand quietly*

    Having the OS control the fans is not that much worse than having the card's BIOS control them. Some manufacturers already turn the fans down in BIOS so much (to keep it quiet) that the card cooks itself in a few years.

    My old Radeon HD3870 cards are all like that. I always force the fans to run to at least 45%, which allows them to stay around 40-45°C in idle. When gaming, I turn up the speed to around 60%, which in many cases keeps the core temps under 60°C.

    Sure, the proper fix would be to mod the BIOS and flash it... and I've been meaning to do that for a long time now. But I also like to have extensive tests done first so that I know what speeds to set in the BIOS.



      #22
      Re: Forcing a video card fan to always run at full speed.

      no, the proper fix would be a microcontroller and the internal diode/thermocouple in a closed loop separate from the pc system.
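
      For what it's worth, here's a minimal sketch of what that kind of standalone loop does, written in Python purely as an illustration. The temperature thresholds and the read_diode()/set_pwm() stubs are assumptions for the example, not any real card's interface; actual firmware would run this on the microcontroller itself.

```python
# Sketch of a closed-loop fan controller that never involves the OS or the
# driver: read the GPU's thermal diode, map the temperature onto a PWM duty
# cycle, repeat. All numbers and the two hardware stubs are illustrative.
import time

T_MIN = 45.0     # deg C below which the fan sits at its floor speed
T_MAX = 85.0     # deg C at which the fan runs flat out
DUTY_MIN = 30.0  # % PWM floor so the fan never stalls
DUTY_MAX = 100.0

def read_diode() -> float:
    """Stub for the internal thermal diode (an ADC read on real hardware)."""
    return 62.0

def set_pwm(duty: float) -> None:
    """Stub for the fan PWM output (a timer compare register on real hardware)."""
    print(f"fan duty set to {duty:.0f}%")

def duty_for(temp: float) -> float:
    """Linear ramp from DUTY_MIN at T_MIN to DUTY_MAX at T_MAX, clamped."""
    frac = max(0.0, min(1.0, (temp - T_MIN) / (T_MAX - T_MIN)))
    return DUTY_MIN + frac * (DUTY_MAX - DUTY_MIN)

if __name__ == "__main__":
    for _ in range(3):              # real firmware would loop forever
        set_pwm(duty_for(read_diode()))
        time.sleep(1.0)
```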



        #23
        Re: Forcing a video card fan to always run at full speed.

        Originally posted by stj View Post
        what asshole allows the o.s. to control the fans?
        Not so much the OS as the driver (which is at least partially dependent on the OS). This has been common in video cards for years (with both AMD and Nvidia), and it's not just the fans but other aspects of the cards as well. That's why the early Radeon RX series cards had issues drawing too much power from the PCIe slot, in some cases burning the slot, until AMD released a driver update. And if you read the release notes of GPU drivers, "improved fan curve" is often in there.



          #24
          Re: Forcing a video card fan to always run at full speed.

          I don't think he's trying to have the OS control the fans. I think his goal was to be able to read the fan speeds from the OS. I think he installed these programs when the PC started locking up while he was playing games. He had a closed-loop liquid cooling system for his CPU. The pump seized. He installed temperature monitoring programs to try and see what was happening, but they're not properly reading the sensor data for one reason or another.

          I thought this was the problem. I thought the programs like SpeedFan were incorrectly reading the sensor data and, thinking the temps were much lower than they were, were limiting the speed of the fan. But they seem to read the GPU just fine, it's just the CPU they seem to think is around 3C (with a good pump) or so. With the bad pump, they were higher. Because the temps were off so badly in SpeedFan, we thought everything was within spec. But that's because we were assuming SpeedFan was reporting the temps correctly. It wasn't until we started using the BIOS to monitor the CPU temps that I started to suspect the pump was bad.

          I took my PSU and tried powering the pump directly. There was voltage going to the pump, but no current. A new pump fixed this, and then SpeedFan started reporting the CPU temps at around 3C, which was much lower than what it was reading before. The (programmable) PSU worked to power the new pump; that's how I was able to tell the pump was expecting something like a pulse.

          The BIOS reports the CPU temperature correctly. The BIOS doesn't report the GPU temps, for obvious reasons. He and his girl are up in Ithaca at something that he's calling a loke a day-trip date, whatever that is. Loke might be a typo, I dunno. He's supposed to be coming back sometime today.
          -- Law of Expanding Memory: Applications Will Also Expand Until RAM Is Full



            #25
            Re: Forcing a video card fan to always run at full speed.

            Originally posted by dmill89 View Post
            Not so much the OS as the driver (which is at least partially dependent on the OS). This has been common in video cards for years (with both AMD and Nvidia), and it's not just the fans but other aspects of the cards as well. That's why the early Radeon RX series cards had issues drawing too much power from the PCIe slot, in some cases burning the slot, until AMD released a driver update. And if you read the release notes of GPU drivers, "improved fan curve" is often in there.
            So the fans get their power from the PCIe slots, not from those PCI-E power wires from the power supply?

            Don't those driver updates sometimes include firmware updates for video cards, or no? I know sometimes wireless NICs do. I was thinking that for something like "improved fan curve", the drivers would have to update something on the actual video card... is that incorrect?

            Thanks.
            -- Law of Expanding Memory: Applications Will Also Expand Until RAM Is Full



              #26
              Re: Forcing a video card fan to always run at full speed.

              Originally posted by Spork Schivago View Post
              So the fans get their power from the PCIe slots, not from those PCI-E power wires from the power supply?

              Don't those driver updates sometimes include firmware updates for video cards, or no? I know sometimes wireless NICs do. I was thinking that for something like "improved fan curve", the drivers would have to update something on the actual video card... is that incorrect?

              Thanks.
              No, it's always driver only.
              It's too dangerous to update the card's BIOS with a driver update.
              Think of how many RMAs that could generate if something goes wrong with the update!
              I once had a 7900GT (if I remember right) where I had to update the BIOS to get decent fan control; the original had it running way too hot.

              Actually found the old e-mail, here it is:
              Question (4/5/2008 1:10:26 PM): Hi, I've got BIOS v62.92.16.00.06 and I just realized the card is not throttling the fan speed for me. It's "stuck" at ca 30% and never goes up, not even with the core temp hitting 96°C according to ATI Tool! I've set up fan speed throttling manually now in RivaTuner and this made the temps much better, plus the computer stopped crashing with a BSOD complaining about nvlddmkm.sys with bugcheck 0x116 while gaming! I'm running Vista Ultimate x64 SP1, currently with driver 174.74, before 169.25 as well, both with the same issue... Do you have a BIOS where the temperature settings are more "sensibly" set up? I set it up in RivaTuner like this: Duty Cycle min: 34, Duty Cycle max: 100, Tmin: 60, T Range: 32, T Operating: 78, T low limit: 55, T high limit: 90. It was originally: Duty Cycle min: 29, Duty Cycle max: 100, Tmin: 79, T Range: 32, T Operating: 90, T low limit: 80, T high limit: 100.
              Answered By Peter T (4/11/2008 8:53:22 AM): There is a newer BIOS available on our site that should cover this problem.
              "The one who says it cannot be done should never interrupt the one who is doing it."



                #27
                Re: Forcing a video card fan to always run at full speed.

                With some wireless NICs, though, part of the BIOS is stored in the drivers, isn't it? I seem to remember trying to get a wireless NIC working in Linux and having to extract what I believe was the BIOS from the Windows drivers in order for the WNIC to work correctly.

                I'm sure there was still some sort of BIOS on the WNIC. Maybe they used the term BIOS loosely? For example, the card would still be correctly identified with lspci. It just wouldn't work without extracting the .bin file from the Windows drivers and using that with the Linux module.

                So, what about the fans getting their power? Is it always from the PCI-E slots? A PCI-E x16 slot provides what, a maximum of 75 watts I think? And video cards that use up to 75 watts wouldn't need those extra connectors. This video card has a fancier GPU in it though, and I believe the card draws around 190 watts of power. Hence the need for those PCI-E connectors from the PSU (the two 6-pin ones). Could the video card be drawing power from those two 6-pins to power the fans? Or are the fans always getting their power from the PCI-E slots?
                -- Law of Expanding Memory: Applications Will Also Expand Until RAM Is Full
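
                For reference, here's the rough budget arithmetic, using the nominal PCI-E limits (75 W for an x16 slot, 75 W per 6-pin connector) and the ~190 W figure quoted above. Whether the fans hang off the slot rail or the connector rail is a separate question, as the next reply notes.

```python
# Back-of-the-envelope power budget for a ~190 W card with two 6-pin plugs,
# using the nominal PCI-E limits. Purely illustrative arithmetic.
SLOT_MAX_W = 75        # nominal x16 slot limit
SIX_PIN_MAX_W = 75     # nominal limit per 6-pin PCI-E connector

card_tdp_w = 190       # figure quoted in the post above

total_budget_w = SLOT_MAX_W + 2 * SIX_PIN_MAX_W
min_from_connectors_w = card_tdp_w - SLOT_MAX_W   # what the slot alone can't supply

print(f"total available: {total_budget_w} W")                           # 225 W
print(f"must come from the 6-pins: {min_from_connectors_w} W or more")  # 115 W
```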



                  #28
                  Re: Forcing a video card fan to always run at full speed.

                  Originally posted by Spork Schivago View Post
                  So the fans get their power from the PCIe slots, not from those PCI-E power wires from the power supply?
                  It depends on how the card is designed. Most GPUs that draw more than 75W pull power from both the slot and the 6/8-pin connector(s); whether there is any segregation of that power depends on the design of the card.



                    #29
                    Re: Forcing a video card fan to always run at full speed.

                    I thought there was segregation between that power, from what I've read. Even though there doesn't need to be (they all share a common ground), I was reading a post saying the video cards they checked had it segregated. They called them districts.

                    It's good to know that not all video cards have it segregated like that, though, and that the power that comes from the PCI-E bus can also feed the circuit(s) that are being powered by those PCI-E connectors from the PSU.
                    -- Law of Expanding Memory: Applications Will Also Expand Until RAM Is Full



                      #30
                      Re: Forcing a video card fan to always run at full speed.

                      Originally posted by Spork Schivago View Post
                      He installed temperature monitoring programs to try and see what was happening, but they're not properly reading the sensor data for one reason or another.

                      I thought this was the problem. I thought the programs like SpeedFan were incorrectly reading the sensor data and, thinking the temps were much lower than they were, were limiting the speed of the fan. But they seem to read the GPU just fine, it's just the CPU they seem to think is around 3C (with a good pump) or so.
                      SpeedFan is better-suited towards older hardware. I've also had it not read temperatures correctly on newer hardware.

                      You should try a different program. Also, give Aida64 a try - you can use it to stress test the CPU and read its temperature (similar to OCCT). The only thing I like about Aida64 is that it also shows you if the CPU starts throttling down due to heat.

                      Originally posted by Spork Schivago View Post
                      So the fans get their power from the PCIe slots, not from those PCI-E power wires from the power supply?
                      Yes, always.

                      At least I have not seen a video card that doesn't do that.

                      Originally posted by Spork Schivago View Post
                      I thought there was segregation between that power, from what I've read. Even though there doesn't need to be (they all share a common ground), I was reading a post saying the video cards they checked had it segregated. They called them districts.
                      If the GPU TDP is over 70 Watts, then separation of the PCI-E slot power and the 6/8-pin PCI-E power connector is guaranteed. The reason is that the PCI-E slot should not be loaded with more than 75 Watts of power, due to the load limit of the PCI-E pins. As such, if you have a 100 Watt TDP GPU, for example, and its PCI-E power is not separated from the 6/8-pin connector... then if someone forgets to plug in the 6/8-pin connector power, the GPU will try to pull all of the power from the PCI-E slot, and that could damage it.

                      So for high-power video cards, separation is guaranteed. For lower-power video cards, there may not be separation. My XFX GeForce 6800 XT PCI-E card is like that. I know, because one time I forgot to plug in the Molex connector on it. But it still worked fine, even in games (under load). Being a 40-50 Watt TDP card, it didn't damage anything in the PCI-E slot.

                      That said, some video cards may have a TDP lower than 75 Watts but still have a 6-pin PCI-E power connector. For those, separation may or may not exist, as is the case with my XFX GeForce 6800 XT.

                      Originally posted by Per Hansson View Post
                      I once had a 7900GT (if I remember right) where I had to update the BIOS to get decent fan control; the original had it running way too hot.

                      Actually found the old e-mail...
                      Ha!
                      I wonder how many GeForce 7900 GS/GT/GTO video cards died due to this.

                      I bought several of these last fall/winter, as they were super cheap on eBay (less than $10 shipped to my door, in a few cases). The ones with the small copper coolers with small 40 mm squirrel cage fans are hopeless. Even with the fan running at 100%, they still run too hot under load and manage to bumpgate themselves.

                      What I find even more amusing is that the 7900 GS and GT actually use about the same amount of power as an 8600 GT: about 40 Watts or so, under load. With my Xbox 360 CPU heatsink hacks, I couldn't get my 7900 GS to run above 55°C, even when overclocked to GT speeds and with an 80 mm fan running at 7 V to blow on the CPU heatsink. In comparison, the stock heatsink will spike to 60°C almost instantly under load, and from there it starts to rise slowly to 70-80°C (and that's with the small 40 mm fan cranking at 100%).



                        #31
                        Re: Forcing a video card fan to always run at full speed.

                        Originally posted by momaka View Post
                        SpeedFan is better-suited towards older hardware. I've also had it not read temperatures correctly on newer hardware.
                        It's not just SpeedFan, though, that's reporting the wrong temps. There was a program from the manufacturer of the board as well showing the same temps for the CPU (3C). I figured the motherboard manufacturer would know how to read their own sensors. I thought maybe it was a conflict between the three temp programs he had on there. When I say temp programs, I really mean temperature monitoring programs and/or fan speed reading/control programs.

                        Originally posted by momaka View Post
                        You should try a different program. Also, give Aida64 a try - you can use it to stress test the CPU and read its temperature (similar to OCCT). The only thing I like about Aida64 is that it also shows you if the CPU starts throttling down due to heat.
                        Yesterday he was a no-show, but he got a hold of me and said he'd try to bring the PC back today sometime around noon. So I can try Aida64 and GPU-Z. But first I'm going to try reinstalling the OS on a separate hard drive, just to rule out software issues.

                        Originally posted by momaka View Post
                        If the GPU TDP is over 70 Watts, then separation of the PCI-E slot power and the 6/8-pin PCI-E power connector is guaranteed. The reason is that the PCI-E slot should not be loaded with more than 75 Watts of power, due to the load limit of the PCI-E pins. As such, if you have a 100 Watt TDP GPU, for example, and its PCI-E power is not separated from the 6/8-pin connector... then if someone forgets to plug in the 6/8-pin connector power, the GPU will try to pull all of the power from the PCI-E slot, and that could damage it.
                        This doesn't make sense to me. I was under the impression that the PCI-E slot could only provide a maximum of 75 watts. I wasn't under the impression that a device could draw more current than that and damage the slot. Is this what you're saying? I was also under the impression that if a device required something like 10 amps but was only getting 5 amps, it simply wouldn't work, not that anything would get damaged. Is this not true?

                        Thanks
                        -- Law of Expanding Memory: Applications Will Also Expand Until RAM Is Full



                          #32
                          Re: Forcing a video card fan to always run at full speed.

                          pull too much current and you burn the connector and board.

                          i remember when video cards started getting big, some boards openly warned to only put the gfx in a specific slot.



                            #33
                            Re: Forcing a video card fan to always run at full speed.

                            Originally posted by stj View Post
                            pull too much current and you burn the connector and board.

                            i remember when video cards started getting big, some boards openly warned to only put the gfx in a specific slot.
                            Is this just for PCs? I thought it was physically impossible to pull more current than what the power source was providing. For example, if a light bulb is 60 watts, it can never draw more than 60 watts. If the power supply to the light bulb only provides 40 watts, that light bulb will only receive 40 watts, even if it's trying to consume 60 watts.

                            How can it be different with the PCI-E slots? Is the 75 watts just a safe limit that they say the slots can physically handle, but they're really capable of providing more, just not safely? This is what's confusing me.

                            We tried using two 6-pin connectors with that fancier video card I loaned Josh in one guy's computer, because he didn't have the 8-pin. That fancier video card (the MSI nVidia one) uses an 8-pin and a 6-pin connector. When we turned it on, the BIOS on the video card yelled at us and refused to let the computer start, saying it required an 8-pin connector. I believe there's a sense wire there.
                            -- Law of Expanding Memory: Applications Will Also Expand Until RAM Is Full



                              #34
                              Re: Forcing a video card fan to always run at full speed.

                              if the light bulb is rated at 5 kW and your house is 25 kW capable, your lamp socket will heat up and burn.

                              weakest point and all that!



                                #35
                                Re: Forcing a video card fan to always run at full speed.

                                Spork: the PSU can deliver what the label says (if it's an honest PSU).
                                The graphic card will draw whatever its TDP is.

                                But the weak link is the copper traces on the mainboard, and the PCIe connector.
                                They cannot handle more than 75 W, but there is no monitoring.
                                And nothing to limit the current to 75 W...
                                "The one who says it cannot be done should never interrupt the one who is doing it."



                                  #36
                                  Re: Forcing a video card fan to always run at full speed.

                                  Per Hansson,

                                  Okay, that makes sense. I thought there was something that limited the current to 75 watts.

                                  But what stj says doesn't make sense to me. If the house is rated for 25 kW and I have a bulb that's rated for 5 kW, and I plug that bulb into a socket that's rated for 5 kW or higher, I don't see how that would burn anything up, regardless of what the house is rated for. If the house is rated for under 5 kW, the bulb would just draw whatever it could. If it was rated for more than 5 kW, the bulb would just draw the 5 kW.

                                  If the bulb is rated for 60 watts and, let's say, the socket is rated for 60 watts, but the house is only providing 40 watts... I don't see how or why the socket would burn out. I would think the bulb would just draw 40 watts. Now if the house was rated at something like 120 watts, the socket at 40 watts, and the bulb at 60 watts, without a limiter, I could see how it could burn out. Here, the light fixtures we buy (for the house, i.e., the ceiling fans with lights) all have limiters that prevent us from using bulbs that draw more current than what the fixture is rated for.
                                  -- Law of Expanding Memory: Applications Will Also Expand Until RAM Is Full



                                    #37
                                    Re: Forcing a video card fan to always run at full speed.

                                    I got the computer back now, so I'm going to find another hard drive around here somewhere and install 10 on it. He said he was never able to remove the original AMD drivers. The card is made by some company named PowerColor.

                                    He originally tried the official AMD drivers and said it locked up all the time, without even having to go into a game. He started reading on the net, and everyone said they had to use the official PowerColor drivers, which hadn't been updated in a long time. That helped, he said, but it would still overheat when he went into games. From what I've seen, it overheats regardless, just quicker when he's doing graphics-intensive stuff. The fans just don't ramp up. Now that I have it back though, I can play around and try to confirm whether it's a software or hardware issue.
                                    -- Law of Expanding Memory: Applications Will Also Expand Until RAM Is Full



                                      #38
                                      Re: Forcing a video card fan to always run at full speed.

                                      Originally posted by Spork Schivago View Post
                                      This doesn't make sense to me. I was under the impression that the PCI-E slot could only provide a maximum of 75 watts. I wasn't under the impression that a device could draw more current than that and damage the slot. Is this what you're saying? I was also under the impression that if a device required something like 10 amps but was only getting 5 amps, it simply wouldn't work, not that anything would get damaged. Is this not true?

                                      Thanks
                                      It is absolutely possible for a card that is poorly designed, or that has a manufacturing/driver/firmware flaw (since so much is controlled by the driver/firmware), to overload the slot. For example, the early RX 480s were known to draw over 75W from the PCIe slot and fry the slot until AMD released a driver update to fix the issue:
                                      https://www.eteknix.com/amd-radeon-r...ng-pcie-slots/
                                      http://www.pcworld.com/article/30911...nsumption.html



                                        #39
                                        Re: Forcing a video card fan to always run at full speed.

                                        Alright, so we installed Windows 10: a verified fresh, clean, unmodified copy. We just picked the username and are configuring the options; we haven't actually logged in yet or installed the official video drivers.

                                        We're looking at the R9 380's fans. The first one kicks on for maybe 5 seconds, then turns off. The second one kicks on for maybe 5 seconds, then turns off, and the first one kicks back on for maybe 5 seconds and turns off again. It just keeps alternating like this the whole time.

                                        Maybe it hasn't gotten hot enough yet? Maybe this is a clue as to what's wrong? I dunno. What do you guys think?
                                        -- Law of Expanding Memory: Applications Will Also Expand Until RAM Is Full



                                          #40
                                          Re: Forcing a video card fan to always run at full speed.

                                          Originally posted by dmill89 View Post
                                          It is absolutely possible for a card that is poorly designed, or that has a manufacturing/driver/firmware flaw (since so much is controlled by the driver/firmware), to overload the slot. For example, the early RX 480s were known to draw over 75W from the PCIe slot and fry the slot until AMD released a driver update to fix the issue:
                                          https://www.eteknix.com/amd-radeon-r...ng-pcie-slots/
                                          http://www.pcworld.com/article/30911...nsumption.html
                                          Well, yeah, that makes sense to me now that I know there are no limiters on the current for the PCI-E slots. Originally, my thinking was that there was a limiter and the slots wouldn't provide more than 75 watts of power, not that they could provide more and burn out the traces or the connector.

                                          With a limiter though, let's say something limits the wattage to 75 watts and a device tries to draw 150 watts. Nothing will burn out, right? The device just won't operate properly, or at all, right? That's how I understood it to work.
                                          -- Law of Expanding Memory: Applications Will Also Expand Until RAM Is Full
