
lol - another Linux convert coming?


    #21
    Re: lol - another Linux convert coming?

    Originally posted by Spork Schivago View Post
    I see I can buy one of those Indigo2s for around $300 on eBay. I bet it isn't all decked out. Probably doesn't even run. I almost thought of purchasing it.
    you should be able to get them a hell of a lot cheaper than that,
    but if you go upgrading cards afterwards, then the cost starts to build up.

    Comment


      #22
      Re: lol - another Linux convert coming?

      Originally posted by stj View Post
      you should be able to get them a hell of a lot cheaper than that,
      but if you go upgrading cards afterwards, then the cost starts to build up.
      Oh, I thought because they went for sooo much back in the day that they'd still go for a lot. I'm gonna pass on it anyway. I've got someone who might be selling me some of their older equipment, and I'm saving my money for that! There might be some rack-mountable stuff they're going to sell me. That'd be great!!!
      -- Law of Expanding Memory: Applications Will Also Expand Until RAM Is Full

      Comment


        #23
        Re: lol - another Linux convert coming?

        if it's in a rack it could be Origin or Onyx.

        http://web.archive.org/web/200202142...ech_specs.html
        take a look at the power consumption!!

        Comment


          #24
          Re: lol - another Linux convert coming?

          Originally posted by stj View Post
          btw, did you know that some of the SGI engineers left and started NVIDIA?
          that's why the TNT and then the GeForce kicked the hell out of the competition out of nowhere.
          they were based on the best 3D hardware in the world at the time!
          Despite the sudden leaps and bounds Nvidia achieved with their GPUs during that time, I wasn't exactly impressed with Nvidia during the early GeForce 1, 2, 3 and 4 days. Sure, the GF2 GTS was the world's first gigatexel-fillrate video card, but unfortunately, despite all their engineering "expertise" and "intellect", they neglected to design a better memory bus or improve the memory bandwidth efficiency and utilisation of their GPUs. It somehow escaped their "intellect" that a GPU's fillrate can actually be limited by its memory bandwidth.

          AnandTech exposed the GF2's gigatexel fillrate as a marketing gimmick. In their testing they basically proved that the GF2 failed to fully utilise its available fillrate and that the GPU's fillrate was bottlenecked by its memory bandwidth, particularly with Z-buffer accesses; it only achieved about two-thirds of its maximum theoretical fillrate. The competition (ATI, 3dfx) had much better fillrate efficiency even though their theoretical maximum fillrate was much lower.
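
          A rough back-of-the-envelope sketch of that bottleneck, using the commonly quoted GF2 GTS figures (200 MHz core, 4 pixel pipelines with 2 TMUs each, 128-bit DDR memory at 166 MHz). The per-pixel traffic model is a deliberate simplification, so treat the numbers as illustrative only:
          Code:
# Rough fillrate vs. memory-bandwidth check for a GF2 GTS class card.
# All figures are approximate, commonly quoted specs, not measurements.
core_clock_hz = 200e6      # core clock
pixel_pipes   = 4          # pixel pipelines
tmus_per_pipe = 2          # texture units per pipeline
mem_bus_bits  = 128        # memory bus width
mem_clock_hz  = 166e6      # DDR memory clock (two transfers per cycle)

mem_bandwidth = mem_bus_bits / 8 * mem_clock_hz * 2          # bytes/s
texel_rate    = core_clock_hz * pixel_pipes * tmus_per_pipe  # texels/s
pixel_rate    = core_clock_hz * pixel_pipes                  # pixels/s

# Simplified framebuffer traffic per pixel at 32-bit colour with a 32-bit
# Z-buffer: one Z read + one Z write + one colour write = 12 bytes
# (texture fetches ignored, which only makes the real case worse).
bytes_per_pixel = 4 + 4 + 4
bandwidth_limited_pixel_rate = mem_bandwidth / bytes_per_pixel

print(f"theoretical fillrate  : {texel_rate / 1e9:.2f} Gtexels/s ({pixel_rate / 1e6:.0f} Mpixels/s)")
print(f"memory bandwidth      : {mem_bandwidth / 1e9:.2f} GB/s")
print(f"bandwidth-limited rate: {bandwidth_limited_pixel_rate / 1e6:.0f} Mpixels/s")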

          Nvidia tried to improve their memory bandwidth efficiency with the Lightspeed Memory Architecture (LMA) in the GF3 Ti and later with LMA2 in the GF4 Ti series, but all that was in vain when ATI came out with the 9700 Pro: it completely wiped the floor with the GF4 Tis once you used heavy AA and AF. Then Nvidia fumbled and came out with the GF5 series, which was a complete joke in DX9 shader-intensive games.

          While Nvidia's prowess at designing GPU chips is renowned, their ability to design a video card holistically is a different matter altogether. Just look at their recent fuck-up with the memory bus of the GTX 970. Their tradition of designing power-hungry GPUs continues to this day with the Titan series of video cards.
          Originally posted by stj View Post
          take a look at the power consumption!!
          Errr, WTF, lol! I thought the GTX Titan and Fury X were insane with their TDPs, but this completely takes the cake...

          A video card (more like a complete graphics rendering system) with 2 kW to 5 kW power consumption! That's waaay too much even for a home power socket. The limit in countries that follow the UK power standard is 13 A at 240 V AC, giving a maximum of about 3.12 kW per socket, so the fully loaded 5 kW configuration is too much for any single home socket. How do you even power one at home?! You'd need an industrial power connection and transformer to feed it.
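
          The socket arithmetic, spelled out (13 A is the standard UK plug-fuse rating, and the 2 kW and 5 kW loads are the ones from the spec sheet linked above):
          Code:
# What a single UK-style 13 A socket can deliver vs. the quoted draw.
volts     = 240
plug_amps = 13                        # standard UK plug fuse rating
socket_kw = volts * plug_amps / 1000  # ~3.12 kW per socket

for load_kw in (2.0, 5.0):            # quoted 2 kW and 5 kW configurations
    circuits = -(-load_kw // socket_kw)   # ceiling division
    print(f"{load_kw:.0f} kW load: socket limit {socket_kw:.2f} kW, "
          f"needs {int(circuits)} separate 13 A circuit(s)")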

          Comment


            #25
            Re: lol - another Linux convert coming?

            Originally posted by stj View Post
            if it's in a rack it could be Origin or Onyx.

            http://web.archive.org/web/200202142...ech_specs.html
            take a look at the power consumption!!
            That's insane! At least when the furnace breaks again this winter, we won't freeze to death! Did you see the BTUs those things put off?
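
            Roughly converted (1 W is about 3.412 BTU/h; the 2 kW and 5 kW figures come from the spec sheet linked above):
            Code:
# Convert the quoted electrical draw into heating terms.
BTU_PER_HOUR_PER_WATT = 3.412

for watts in (2000, 5000):    # quoted 2 kW and 5 kW configurations
    print(f"{watts / 1000:.0f} kW is about {watts * BTU_PER_HOUR_PER_WATT:,.0f} BTU/h")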
            -- Law of Expanding Memory: Applications Will Also Expand Until RAM Is Full

            Comment


              #26
              Re: lol - another Linux convert coming?

              Originally posted by ChaosLegionnaire View Post
              Despite the sudden leaps and bounds Nvidia achieved with their GPUs during that time, I wasn't exactly impressed with Nvidia during the early GeForce 1, 2, 3 and 4 days. Sure, the GF2 GTS was the world's first gigatexel-fillrate video card, but unfortunately, despite all their engineering "expertise" and "intellect", they neglected to design a better memory bus or improve the memory bandwidth efficiency and utilisation of their GPUs. It somehow escaped their "intellect" that a GPU's fillrate can actually be limited by its memory bandwidth.

              AnandTech exposed the GF2's gigatexel fillrate as a marketing gimmick. In their testing they basically proved that the GF2 failed to fully utilise its available fillrate and that the GPU's fillrate was bottlenecked by its memory bandwidth, particularly with Z-buffer accesses; it only achieved about two-thirds of its maximum theoretical fillrate. The competition (ATI, 3dfx) had much better fillrate efficiency even though their theoretical maximum fillrate was much lower.

              Nvidia tried to improve their memory bandwidth efficiency with the Lightspeed Memory Architecture (LMA) in the GF3 Ti and later with LMA2 in the GF4 Ti series, but all that was in vain when ATI came out with the 9700 Pro: it completely wiped the floor with the GF4 Tis once you used heavy AA and AF. Then Nvidia fumbled and came out with the GF5 series, which was a complete joke in DX9 shader-intensive games.

              While Nvidia's prowess at designing GPU chips is renowned, their ability to design a video card holistically is a different matter altogether. Just look at their recent fuck-up with the memory bus of the GTX 970. Their tradition of designing power-hungry GPUs continues to this day with the Titan series of video cards.

              Errr, WTF, lol! I thought the GTX Titan and Fury X were insane with their TDPs, but this completely takes the cake...

              A video card (more like a complete graphics rendering system) with 2 kW to 5 kW power consumption! That's waaay too much even for a home power socket. The limit in countries that follow the UK power standard is 13 A at 240 V AC, giving a maximum of about 3.12 kW per socket, so the fully loaded 5 kW configuration is too much for any single home socket. How do you even power one at home?! You'd need an industrial power connection and transformer to feed it.
              Some of the servers I saw at my old work used a bunch of PSUs, and they were pretty expensive too. Maybe it's something similar with these systems? More than one PSU to share the load, and maybe they hook into different power sockets? Just a thought.
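
              A quick sanity check of that idea (the 1400 W per-supply rating and 90% efficiency are just assumed example numbers, not anything from the SGI specs; the 5 kW load is the worst-case figure quoted earlier):
              Code:
# How a multi-kW box could be split across several supplies and circuits.
import math

load_watts     = 5000        # worst-case draw from the spec sheet
psu_watts      = 1400        # assumed per-supply rating (example only)
psu_efficiency = 0.90        # assumed supply efficiency (example only)
circuit_watts  = 13 * 240    # one UK-style 13 A socket

supplies = math.ceil(load_watts / psu_watts)
at_wall  = load_watts / psu_efficiency
circuits = math.ceil(at_wall / circuit_watts)

print(f"supplies needed : {supplies}")
print(f"draw at the wall: {at_wall:.0f} W")
print(f"13 A circuits   : {circuits}")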
              -- Law of Expanding Memory: Applications Will Also Expand Until RAM Is Full

              Comment


                #27
                Re: lol - another Linux convert coming?

                Originally posted by ChaosLegionnaire View Post
                Despite the sudden leaps and bounds Nvidia achieved with their GPUs during that time, I wasn't exactly impressed with Nvidia during the early GeForce 1, 2, 3 and 4 days. Sure, the GF2 GTS was the world's first gigatexel-fillrate video card, but unfortunately, despite all their engineering "expertise" and "intellect", they neglected to design a better memory bus or improve the memory bandwidth efficiency and utilisation of their GPUs. It somehow escaped their "intellect" that a GPU's fillrate can actually be limited by its memory bandwidth.
                that's more of a PC problem at the time, with the PCI and AGP 4x bus.
                the SGI machines had the GIO64 bus and an independent DMA controller with a crossbar switch that could move any amount of data from one place to another in the background.
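
                For a sense of scale, here are the standard published peak figures for the PC buses of that era (using the nominal 33/66 MHz clocks); the point above is the independent DMA engine and crossbar rather than raw bus speed:
                Code:
# Theoretical peak bandwidth of the PC graphics buses of the era.
buses = {
    "PCI (32-bit, 33 MHz)":        32 / 8 * 33e6,
    "AGP 1x (32-bit, 66 MHz)":     32 / 8 * 66e6,
    "AGP 4x (4 transfers/clock)":  32 / 8 * 66e6 * 4,
}
for name, bytes_per_sec in buses.items():
    print(f"{name:<30}: {bytes_per_sec / 1e6:.0f} MB/s peak")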


                Originally posted by ChaosLegionnaire View Post
                a video card (more like video graphics rendering system) with 2kw-5kw power consumption! thats waaay too much power consumption even for a home power socket. the current limit for countries that follow the uk power standard is 13A @ 240V ~ AC giving a max power consumption of 3.12kw. the fully loaded 5kw one is too much for even a home power socket! how do u even power one at home?! u're gonna need an industrial power socket and transformer to feed it.
                you can tap up to 32A from the ring, or run a dedicated 32A or 63A line to the panel.
                these rack units aren't for home use.
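
                In numbers, assuming the usual 240 V nominal mains:
                Code:
# Available power from UK-style domestic circuits at 240 V nominal.
volts = 240
for amps in (13, 32, 63):
    print(f"{amps:>2} A circuit: {volts * amps / 1000:.2f} kW")

                So a dedicated 32 A or 63 A feed comfortably covers even the fully loaded 5 kW configuration.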
                Last edited by stj; 05-01-2016, 05:15 AM.

                Comment
