Cranked it up to 3.2GHz without a change in voltage and got the same CPU temps across the board. Put it through the Prime95 overnight wringer again and it performed flawlessly. Lovin' the E6750. :3
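Side note for anyone wondering what a torture test like Prime95 is actually doing: it basically pegs every core with heavy math for hours so any instability shows up as errors or crashes. Here's a rough Python sketch of just the "load every core" part (not the real thing, and it skips the result-verification Prime95 does):

```python
# Rough illustration of a torture test: peg every core with busywork
# for a fixed duration. This is NOT Prime95, just the general idea.
import multiprocessing as mp
import time

def burn(seconds):
    """Run a tight integer-math loop (naive primality checks) for `seconds`."""
    end = time.time() + seconds
    n = 3
    while time.time() < end:
        # trial division -- deliberately wasteful, purely to load the core
        all(n % d for d in range(2, int(n ** 0.5) + 1))
        n += 2
    return n

if __name__ == "__main__":
    duration = 60  # crank this way up for an overnight-style run
    with mp.Pool(mp.cpu_count()) as pool:
        pool.map(burn, [duration] * mp.cpu_count())
    print("Survived the burn -- watch your temps with your usual monitoring tool.")
```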
@Niner... is there a reason you want it at 3.2GHz? Even if it is stable, does it really make a difference to gaming n such??
of course, there is also the e-penis factor (i.e. "mine is bigger than yours!"), which motivates all overclockers :D
the e-penis stuff gets me mainly when it comes to displays/resolutions, rather than clockspeeds :p
mtfbwya
Every clock counts when it comes to single-core games. I noticed an improvement in C&C 3 even when I just bumped it up from 2.4GHz to 2.7GHz. The only time you're not going to see a difference is when the game is so light on CPU resources that you've already got it well covered with 2.4GHz (mostly first-person shooters). Games that eat CPU such as real-time strategies and simulations are where overclocked processors get a chance to shine. :)
Well, it's my very first overclocking endeavor, so it's really about experimentation as well as improving performance. It's more of a learning experience than anything else. :3
I'm planning to throw CoD4 at it soon, maybe Bioshock too, so we'll see if that extra .2 makes a difference. xD
I'm planning to throw CoD4 at it soon, maybe Bioshock too, so we'll see if that extra .2 makes a difference. xD
well, you probably won't see much improvement with either game, since Bioshock depends almost exclusively on having a dual core and CoD4 is VRAM dependent. either way, you should be able to run both games at 1280x1024 (or higher) with all settings maxed. if you can't, i would be seriously shocked. ;)
Eh, we'll see. I'm probably gonna pick one of the two up after work and give it a go, so we'll see how my system takes it. I've been playing a lot of Hellgate London on it and there have been a fair few hiccups, but that's mostly because of HGL's subpar online client performance and not really reflective of the system as a whole.
Thriky, I want to be able to play Starcraft II when Blizzard finally does decide to release it upon the masses (thereby sending the entire nation of South Korea into a frenzy), so hopefully it will be able to take the strain.
EDIT: Picked up CoD4 and ran it at max settings without any hiccups for about three hours. My 8800GT didn't go over 45C and the CPU didn't go over 40C. I just stopped playing 5 minutes ago and the GPU is already down to 38C and the CPU is at 33C.
Ridiculous. :joy:
EDIT: Picked up CoD4 and ran it at max settings without any hiccups for about three hours...
what rez, Niner? I haven't bothered with cranking up all the bloom n glows, and have left it at whatever the game picked initially, and just changed it to 1920x1200. If I did put that other stuff at max, I doubt the 8800GTS would like me much for it at 19x12 (let alone 1600p)
mtfbwya
I almost always game at 1680x1050, which is the max res that my monitor supports. And omg widescreen in FPSes is godly. :3
...so it's really about experimentation as well as improving performance. It's more of a learning experience than anything else. :3
if only everyone was so diligent about improving their love-making overclocking prowess :D
And omg widescreen in FPSes is godly. :3
yes, once you've gone oblong, you can never go back. 1680x1050 is OK, but if you ever have the chance to upgrade your monitor, a WS native 1080p will be well worth it. Monitors are getting cheaper by the day now, and with OLED creeping in, LCD prices will continue to drop even further.
mtfbwya
Hey, after dealing with a 17" CRT monitor for almost 5 years, my 22" widescreen is like being upgraded from coach to first class. :3