How does a GPU get bottlenecked by the CPU?

  • SM-Piyes90
    Senior Member
    • Feb 2014
    • 120
    • Morocco

    #1

    How does a GPU get bottlenecked by the CPU?

    Hello everyone.

    It's quite obvious that every gamer's concern when building a PC is to avoid bottleneck issues.

    The problem is how to avoid that with all those CPU and GPU variables out there.

    As far as I understand, the GPU takes power from the CPU. So does that mean the CPU clock must be higher than the GPU clock? Or the memory clock?

    Also, is the RAM related to this?

    For example, will a C2D @ 2.33 GHz with 2 GB of RAM be enough to power a GTX 460 without a bottleneck?

    GTX 460 specs:

    GPU frequency: 675 MHz
    Memory frequency: 900 MHz
    Memory size: 1 GB
    Last edited by SM-Piyes90; 11-20-2016, 05:23 PM.
  • ChaosLegionnaire
    HC Overclocker
    • Jul 2012
    • 3264
    • Singapore

    #2
    Re: How does a GPU get bottlenecked by the CPU?

    No, you can't calculate it like that. How powerful a GPU is depends on the number of shaders, texture units (TMUs) and raster operator units (ROPs) it has. The clock speed determines how fast those three operational units work, which affects the GPU's fillrate and shader operations per second.

    Take, for example, a GPU with an 8-8-8 (shader-TMU-ROP) core config running at 250 MHz. On paper, it would be identical in performance to another GPU from the same generation with a 4-4-4 core config running at 500 MHz.
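    As a rough sketch of the paper math behind that comparison (using the back-of-envelope rule that pixel fillrate = ROPs x clock and texture fillrate = TMUs x clock; the GPUs and configs here are the hypothetical ones from the example, not real cards):

```python
def fillrates(shaders, tmus, rops, clock_mhz):
    """Return (pixel fillrate, texture fillrate) in Mpixels/s and Mtexels/s."""
    pixel_fillrate = rops * clock_mhz    # ROPs x core clock
    texture_fillrate = tmus * clock_mhz  # TMUs x core clock
    return pixel_fillrate, texture_fillrate

gpu_a = fillrates(8, 8, 8, 250)  # 8-8-8 config at 250 MHz
gpu_b = fillrates(4, 4, 4, 500)  # 4-4-4 config at 500 MHz

print(gpu_a)  # (2000, 2000)
print(gpu_b)  # (2000, 2000) -- identical on paper
```

    Twice the units at half the clock gives the same theoretical throughput, which is why clock speed alone tells you nothing.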

    Other factors that affect a GPU's performance are the video memory bandwidth and the GPU core design and generation. A GPU's fillrate and shader performance can be bottlenecked by its video memory bandwidth, which is why I stay away from video cards with a 64-bit or 128-bit memory bus.

    A GPU with a later design, even with the same number of cores as the previous generation, would be faster, because the GPU maker has had time to design a more efficient architecture for the later generation.

    Therefore, you can't make an apples-to-apples comparison of GPUs from clock speed like that. There is no easy way to do this except to look at the release year of the GPU and CPU you are pairing together and whether they are high-end or not. You can also look at various GPU and CPU benchmarks around the web to get a rough idea of how balanced the pairing is.

    Lucky for you, I have a 1 GB Leadtek GTX 460 OC (GPU core/shader @ 725/1450 MHz). This GPU is from mid-2010, while a 2.33 GHz C2D is from around 2007-2008 and is a mid-to-low-end CPU. Without even looking at benchmarks, I can safely say the CPU is too slow and will bottleneck the GPU.

    I attached some benchmarks I made previously with my C2D E8600, at its stock 3.33 GHz and overclocked to 4.5 GHz, with the GPU at stock and overclocked. Even with my C2D @ 3.33 GHz, the CPU was still bottlenecking the GTX 460.

    The first screenshot is both the CPU and GPU running at stock speeds; my GTX 460 is factory overclocked with core/mem at 725/1800 MHz. The second screenshot is with the CPU at stock but the GPU overclocked to 850/2100 MHz: only a puny 200-point improvement. The third screenshot is with the CPU overclocked to 4.5 GHz but the GPU running at stock: a massive 4000-point improvement from overclocking the CPU alone.

    So to conclude, even a mid-to-high-end GPU from mid-2010 is still too powerful for even a high-end CPU from mid-2008. A suitable GPU to pair with a 2.33 GHz C2D would be an Nvidia 9600 GT or AMD 3870. A suitable CPU to pair with a GTX 460 would be a 3 GHz or faster quad-core CPU.

    But it also highly depends on what type of game you want to play. In general, FPS games are more bottlenecked by the GPU, so you can get away with a slow CPU and a fast GPU. RPG and RTS games, however, are harsher on the CPU, so you can get away with a slow GPU and a fast CPU.
    Attached Files


    • SM-Piyes90
      Senior Member
      • Feb 2014
      • 120
      • Morocco

      #3
      Re: How does a GPU get bottlenecked by the CPU?

      Thanks for the time you gave, it was a long read.

      All I can say is that it's hard to upgrade one PC component without upgrading the others with it.

      It would be much easier to just make a new build.
