In the NVIDIA Control Panel there are three things you can overclock: Memory Clock, Shader Clock, and Core Clock. Which one would improve performance the most? Please don't give me the "you shouldn't overclock, blah blah" talk; I know the consequences. Thanks in advance!
Eh, I'd imagine the memory clock controls how fast files, such as large textures, are loaded into memory and onto the GPU. The shader clock most likely governs how shaders, such as 2D animations and textures with transparencies, are drawn, so it will mostly affect performance in environments with forests, jungles, heavy vegetation, and so on. The core clock will probably make the most difference, as it drives the part of the card that renders everything.
This is only an educated guess at what each part does, so you'd probably be better off trying each one yourself than listening to me. If I'm wrong, though, would someone be so kind as to correct me?
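One concrete reason the memory clock matters: it sets the card's peak memory bandwidth. Here's a back-of-the-envelope sketch, assuming 8600 GTS-style specs (128-bit bus, GDDR3 transferring on both clock edges); the bus width and DDR multiplier are my assumptions, not something from this thread.

```python
# Rough bandwidth estimate from memory clock. Assumes a 128-bit bus
# and double-data-rate memory (both are assumptions for an 8600 GTS).
def memory_bandwidth_gb_s(mem_clock_mhz, bus_width_bits, ddr_multiplier=2):
    """Theoretical peak memory bandwidth in GB/s."""
    effective_rate = mem_clock_mhz * 1e6 * ddr_multiplier  # transfers per second
    return effective_rate * (bus_width_bits / 8) / 1e9     # bytes/s -> GB/s

stock = memory_bandwidth_gb_s(1008, 128)
bumped = memory_bandwidth_gb_s(1100, 128)
print(f"1008 MHz: ~{stock:.1f} GB/s")   # ~32.3 GB/s
print(f"1100 MHz: ~{bumped:.1f} GB/s")  # ~35.2 GB/s
```

So a memory overclock buys bandwidth roughly linearly, which is why it tends to help most at high resolutions and texture settings.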
If you use RivaTuner, it links your core and shader clocks (generally a good idea). The memory clock is still configured independently. What are the standard clock settings for your 8600GTS?
Hey, I'll try out RivaTuner and tell you how it goes. The standard clock settings are: Core Clock 675 MHz, Memory Clock 1008 MHz, Shader Clock 1450 MHz. I overclocked it to: Core Clock 700 MHz, Memory Clock 1008 MHz, Shader Clock 1750 MHz. Everything seems really stable; I was playing Crysis at Med-High with playable framerates. But I saw a video on YouTube showing an 8600 GTS overclocked to CC 820 MHz, MC 1008 MHz, SC 2020 MHz on max settings, and it was smooth as butter. Also, quick question: would a Pentium Dual-Core at 2.66 GHz operate just as fast as a Core 2 Duo at 2.66 GHz?
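For a sense of scale, it's worth working out what those jumps actually are in percentage terms, using the 8600 GTS numbers quoted above (this is just arithmetic on the figures in the thread, nothing more):

```python
# Percentage overclock for the clocks mentioned above.
def pct_increase(stock, oc):
    return (oc - stock) / stock * 100

clocks = {
    "core (mine)":      (675, 700),
    "shader (mine)":    (1450, 1750),
    "core (YouTube)":   (675, 820),
    "shader (YouTube)": (1450, 2020),
}
for name, (stock, oc) in clocks.items():
    print(f"{name}: +{pct_increase(stock, oc):.1f}%")
```

The YouTube card is running its shader domain almost 40% over stock, which likely needed aftermarket cooling or a lucky chip, so don't expect every 8600 GTS to get there.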
Hmm. I've got an 8800GT which I overclocked from stock:
Core - 600 MHz, Shader - 1500 MHz, Memory - 900 MHz
up to:
Core - 730 MHz, Shader - 1800 MHz, Memory - 1000 MHz
This worked pretty well, although the driver would occasionally stop responding, mostly in Call of Duty 4 and Halo 2 Vista, which is odd because neither game is very demanding graphically.
Now I have it set to:
Core - 700 MHz, Shader - 1750 MHz, Memory - 1000 MHz
There's a definite noticeable boost in performance, and it's been stable so far.
Ahh! Why does your 8800GT overclock?! Every time I try, the system crashes and hard-locks, and that's if I increase anything by even 50 MHz. So frustrating!
Babeman wrote: "Also quick question, would a Pentium dual core 2.66ghz operate just as fast as a Core 2 Duo 2.66ghz?"
I doubt it would, since the Core 2 Duo benefits from a different (and hopefully optimized, compared with the P4's ^^ ) architecture.
Babeman wrote: "On Max settings and it was smooth as butter. Also quick question, would a Pentium dual core 2.66ghz operate just as fast as a Core 2 Duo 2.66ghz?"
You mean a Pentium D? If so, then no, not even close.
The highest I've gotten the core clock on my HD 3850 is 830 MHz. It was stable until it got too hot (the factory cooler isn't good enough to run it that fast, but the GPU core will do it). Right now it's running at 775/1900, up from 668/1680, and it's perfectly stable.
Yeah, I mean a Pentium D. I can play BioShock maxed out at 1280x1024, but Crysis bogs down a lot for some reason, and even at 800x600 it seems to run slower than at 1280x1024. I have a Socket 775 CPU, so would that mean I could upgrade my CPU to a Core 2 Duo without replacing the motherboard? Also, another quick question: I have 4 sticks of RAM in my computer, 2x1 GB and 2x512 MB, making 3 GB. The 512s are PC-4300 while the 1 GB sticks are PC-5300, so does that mean if I take out the 512s I could get better performance, since the 1 GB sticks have a faster speed? EDIT: thinking about this CPU on TigerDirect: Intel Core 2 Duo E7200 Processor BX80571E7200 - 2.53GHz, 3MB Cache, 1066MHz FSB, Wolfdale-3M, Dual-Core, Retail, Socket 775, Processor with Fan in Canada at TigerDirect.ca
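On the mixed-RAM question: memory controllers generally drop all installed modules to the slowest common speed, so with the PC-4300 sticks in, the PC-5300 pair is likely being held back too. A rough sketch of the tradeoff, using the nominal MB/s ratings implied by the PC-XXXX labels (treating the label as peak MB/s per channel is an approximation on my part):

```python
# Rough per-channel bandwidth comparison of the two RAM speeds above.
# The PC-XXXX label approximates peak MB/s; with mixed speeds, the
# memory controller usually runs everything at the slower rating.
modules = {
    "PC-4300 (DDR2-533)": 4300,   # MB/s, nominal
    "PC-5300 (DDR2-667)": 5300,
}
for name, mb_s in modules.items():
    print(f"{name}: ~{mb_s / 1000:.1f} GB/s per channel")

# With all four sticks installed, everything runs at the slower speed.
mixed = min(modules.values())
print(f"Mixed 4-stick config: ~{mixed / 1000:.1f} GB/s per channel")
```

So pulling the 512s trades 1 GB of capacity for roughly 20% more memory bandwidth; whether that's a net win depends on whether your games are hitting the 2 GB that's left.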
If your mobo supports it, I would get the cheapest quad-core, which is no more than $50 more.