Tuesday, February 28, 2006

Kermit's downfall, part 2

2 of 13

Monday, February 27, 2006

Kermit's downfall, part 1

1 of 13

Monday, February 20, 2006

Fun with overclocking

DISCLAIMER: This post contains a dangerously high geek factor. Grease up yer pocket protectors or ya might get hurt!


Soft focus just captures the moment...awwww.

As a computer hardware enthusiast, I naturally have many lofty ways of saying I'm a geek. I also like computer hardware, probably more than you do. I care about what goes into my PC. When you build a PC that has both a budget and a performance goal, you need to do your homework and make sure you're spending your money wisely. I've been building and upgrading PCs long enough to have worked with awesome hardware, shit hardware, and the stuff in between. I know how much more enjoyable it is to work with good hardware than with the cheap stuff. It's like a gearhead caring about every little part of his car, knowing what aftermarket products complement each other, while all I really care about with my car is that it gets me from point A to B, preferably with music.

Last fall I built myself a damn fine PC, the LED-adorned guts of which you can see in the really shitty photo above. Like always, I was working within a budget (albeit the largest one I've ever had for building my own PC), so I couldn't just order the most expensive variant of everything I needed. I did my homework and wound up with a rig that stayed on budget and hit my performance goal with style: gaming at 1600x1200 resolution with anti-aliasing and anisotropic filtering enabled.

[Huh? OK, anti-aliasing (AA) is a technology that reduces the jagged "stair-stepping" effect you often see on diagonal lines. Anisotropic filtering (AF) keeps textures (the details you see on objects in a 3D game) looking sharp as they extend into the background of a scene (think of a floor or a large body of water). They both make games look much better, but they require a lot of horsepower from your graphics card (GPU). You got that?!]


When I built my new PC (which I named Black Ice due to its black case and blue lighting scheme, and because I couldn't think up anything appropriate that sucked less on the spot) I decided not to overclock it...

[Huh? OK, "overclocking" means running part of your PC, usually the CPU, at a higher speed than it's rated at. It voids the warranty immediately, and like pushing anything past its intended limits, can damage or destroy the component if not done carefully. It also provides your PC with a free speed boost, hence the attraction.]

Like I was saying, I decided not to overclock my new PC. I had two logical and compelling reasons for this geek-shaming decision:
  1. Black Ice was going to be so much faster than any other PC I'd owned that I wouldn't notice the comparatively small boost from overclocking.
  2. Because overclocking can fry your PC, the first rule of overclocking is: DON'T OVERCLOCK ANYTHING YOU CAN'T AFFORD TO REPLACE. Since I couldn't afford to replace anything more costly than a fan, I decided not to push my good luck.
But I'm a geek, and the lure of overclocking holds sway and illimitable dominion over all. First, you should know what I'm working with:

motherboard: Asus A8N-SLI Premium
CPU: AMD Athlon 64 4000+
RAM: 1GB Corsair XMS Pro Series (2x512)
graphics: XFX nVidia 7800GTX (256MB)
audio: Sound Blaster X-Fi Platinum
hard drives: 2 Hitachi Deskstar T7K250 250GB in RAID-0
optical drive: Plextor 716SA
power supply: PC Power & Cooling 510-SLI
CPU cooler: Zalman CNPS9500 LED
case: Thermaltake Armor full-tower

It started off innocently enough with overclocking my video card (GPU). With nVidia-based GPUs this is done easily enough by downloading and running the "coolbits" registry tweak. Coolbits adds the ability to adjust the operating speeds of the core (the graphics chip) and the video card's memory. Heat is the killer when it comes to overclocking (You're running things faster than spec, so they generate more heat.), and with no way to monitor the temperature of my GPU I decided not to be very aggressive. Fortunately, the nVidia driver (with the coolbits modification) has a "detect optimal settings" button. I clicked on it, and after some screen flickering the GPU's core got a bump from 450MHz to 490MHz, the memory from 1.25GHz to 1.32GHz. Running Doom 3's timedemo feature (a built-in benchmarking facility that shows you how your PC performs in the game, expressed in frames per second) showed a small but measurable improvement: 58.2 fps compared to 55.1 fps before the GPU overclock. (That's at 1600x1200, high-quality graphics setting, 4xAA, 8xAF.) It's not a difference you'd likely notice while playing a game, but it was enough to put the hook in me.

I know not everyone reading this blog is a hardware geek like me, so here's a (very) brief description of how CPU overclocking works. The speed of a CPU is determined by two numbers multiplied together: the front side bus (FSB) speed, and the clock multiplier. At its stock speed my Athlon 64 4000+ has a 200MHz FSB and a clock multiplier of 12. 200x12=2400, or 2.4GHz. The clock multiplier can't be raised on Athlon 64 CPUs, so to overclock it the FSB must be increased. This should be done in small increments so you don't fry anything. The process goes something like this: you increase the FSB a small amount, reboot, and if your PC reboots successfully you run a stress test to make sure the overclock is stable, then run whatever benchmarks (performance evaluations) you want to illustrate the performance increase. Then you do it again. Eventually your PC either doesn't boot successfully or isn't entirely stable. At that point you back off to the last stable overclock. If you want to keep pushing you can "overvolt", which means giving your CPU a little extra electricity over what it was designed to run on. This is the most dangerous part of overclocking, as giving a CPU too much voltage can destroy it, but experienced overclockers know that it's also the key to getting the most out of your overclock. I can't afford to replace a fried CPU so I made a hard guideline: I would overclock by raising the FSB, but I would not overvolt and would back off at the first sign of instability. Safety first, you know.
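For the extra-geeky among you, the FSB-times-multiplier arithmetic above boils down to a one-liner. Here's a throwaway Python sketch of it (the stock values are straight from the post; nothing here is vendor-specific):

```python
# CPU clock speed = front side bus (FSB) speed x clock multiplier.
def cpu_clock_mhz(fsb_mhz, multiplier):
    """Return the resulting CPU clock speed in MHz."""
    return fsb_mhz * multiplier

# Stock Athlon 64 4000+: 200MHz FSB x 12 = 2400MHz, i.e. 2.4GHz.
print(cpu_clock_mhz(200, 12))  # 2400

# With a locked 12x multiplier, every 1MHz of FSB is worth 12MHz of
# CPU clock -- which is why small FSB bumps are the cautious approach.
for fsb in (203, 205, 215, 225, 230):
    print(fsb, cpu_clock_mhz(fsb, 12))
```

Run that loop and you get the exact clock speeds that show up later in this post (215MHz gives 2580MHz, 225MHz gives 2700MHz, and so on).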

Remember what I said about heat? Heat kills, and overclocking means making things run hotter. To monitor my system temperatures during my overclocking experiment I used Asus PC Probe, a handy little utility that came with my Asus mobo that monitors temps, fan speeds, etc. The process went something like this: I'd set my overclock, reboot, open PC Probe, let my PC sit idle for 5 minutes and record the CPU and motherboard temps. Then I'd run SETI@Home for ten minutes, which keeps the CPU at 100% capacity, and record the temps again to see how hot the PC runs when busy. Then I'd run two real-world benchmarks: the aforementioned Doom 3 timedemo, and the non-interactive demo for X3 Reunion, a graphically taxing game. (The demo can be run as a benchmark, which causes it to output a good deal of performance information to a log file.) Then, assuming everything worked, I'd increase the CPU overclock and start the process anew. (Even with my admittedly less-than-completely-thorough testing suite, each overclock setting took more than 30 minutes to test. If I went whole-hog with my testing I'd probably still be at it and wouldn't be able to bring you this riveting post.) Both the Doom 3 and X3 tests were run at the settings I use for gameplay: 1600x1200 resolution, 4xAA, 8xAF. There is a serious tactical error in what I just told you, which I'll elaborate on in a bit. If you already know what it is, pat yourself on the back for being a more experienced benchmarker than me. (Don't flatter yourself that it's a great accomplishment, though.) All testing was done in Windows XP Pro, and for each CPU overclock I tested performance with no GPU overclock and with the GPU overclock mentioned earlier.

I started off very cautiously, raising the FSB first by 3MHz and testing, then another 2MHz, so that with every two testing cycles I was raising the FSB by 5MHz. (Remember--the FSB gets multiplied by the clock multiplier value of 12, so raising the FSB by 1MHz results in a speed increase of 12MHz.) I wasn't sure how quickly heat levels would rise. The Zalman CNPS9500 cooling my CPU (the big blurry copper thing in the photo) is pretty much the Cadillac of air cooling, so I was hoping things would stay frosty for a good while, but was taking no chances with my shiny non-replaceable hardware. Well, few chances.

My baseline scores, with no overclock, were as follows:

CPU @ idle: 27 C (mobo 33 C)
CPU @ 100% load: 33 C (mobo 33 C)
Doom 3 (timedemo 1): 55.1 fps (58.2 fps w/GPU overclock)
X3 demo: 36.962 fps (38.960 w/GPU OC)

Upping the FSB by 3MHz resulted in a clock speed increase from 2.40GHz to 2.44GHz. System temps held steady. Unfortunately, game performance actually slipped very slightly--0.1 fps in Doom 3 and 0.218 fps in X3.

Bumping the FSB up another 2MHz to 205 caused CPU and system temps to rise by 1 degree each. Game performance was essentially unchanged.

Temperatures and game benchmarks continued to change only a negligible amount with each little speed bump. When I got the FSB up to 215MHz (resulting in a 2.58GHz CPU clock speed) things were scarcely different than my baseline scores:

CPU @ idle: 27 C (mobo 32 C)
CPU @ 100% load: 34 C (mobo 32 C)
Doom 3: 55.1 fps (58.25 fps w/GPU OC)
X3: 37.506 (39.337 fps w/GPU OC)

At this point I began increasing the FSB by 5MHz a whack. I got up to 225MHz (2.70GHz) with still no significant change in temps or game scores. At 230MHz (2.76GHz) the CPU temp rose 1 whole degree under load to 35 C, but Doom 3 locked up solid during the timedemo. Referring back to my no-money overclocking guideline, I rebooted and backed the FSB down to 225. Without entering the dangerous waters of overvolting I had reached my overclocking limit. (Technically I could've gone on increasing the FSB in 1MHz increments to find out exactly where it becomes unstable, but at this point I'd been at it most of the day and didn't want to spend another 30 minutes a pop doing that, especially when it was obvious there would be no significant real-world performance gain from it.)

It seemed particularly strange to me that the game benchmarks remained practically unchanged, even though I'd boosted my CPU speed by 300MHz. Then I realized my mistake: the game performance was GPU-limited. (Remember that tactical error? That's it.) You see, there are two hardware factors that affect game performance more than anything else: your processor (CPU) and your graphics card (GPU). If they aren't evenly matched one will reach its performance threshold before the other and will become the bottleneck point for system performance. Since I was increasing the CPU speed and not seeing a performance increase, the GPU was the limiting factor. I didn't expect it--my GPU is nVidia's 7800GTX, still generally thought to be the best on the market, and my Athlon 64 4000+ CPU is more of a mid-level part.
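A crude but useful way to think about this bottleneck business: your frame rate is capped by whichever part can deliver fewer frames. The numbers in this little Python sketch are illustrative only, not measurements:

```python
# Simplified bottleneck model: the slower of the two parts sets the
# frame rate. (Real games are messier, but this captures the idea.)
def effective_fps(cpu_capable_fps, gpu_capable_fps):
    return min(cpu_capable_fps, gpu_capable_fps)

# At 1600x1200 with AA/AF cranked, the GPU is the wall: making the
# CPU faster doesn't move the number at all.
print(effective_fps(cpu_capable_fps=100, gpu_capable_fps=55))  # 55
print(effective_fps(cpu_capable_fps=112, gpu_capable_fps=55))  # still 55
```

That's exactly the behavior I was seeing: a 300MHz CPU bump, and the benchmarks didn't budge.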

To test this theory I re-ran the Doom 3 benchmark with greatly reduced graphics options to remove the burden from the GPU. I reduced the resolution from 1600x1200 to 1024x768, completely disabled AA/AF, and reduced the graphics quality to medium. The results bore my theory out:

Doom 3 @ 2.40GHz: 100.45 fps (100.5 fps w/GPU OC)
Doom 3 @ 2.70GHz: 112.55 fps (112.4 fps w/GPU OC)

Notice that the tables have now turned: overclocking the CPU grants a significant performance boost, but overclocking the GPU does not (since it's not breaking a sweat anyway). With these settings, performance is CPU-limited, and if you're testing a CPU overclock, that's really the way to go.
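In fact, at these CPU-limited settings the frame rate scales almost one-for-one with clock speed. Quick sanity-check arithmetic in Python, using the numbers above:

```python
# CPU-limited scaling check, using the post's own Doom 3 numbers.
clock_gain = 2.70 / 2.40 - 1    # 12.5% more CPU clock
fps_gain = 112.55 / 100.45 - 1  # ~12.0% more frames per second
print(f"clock +{clock_gain:.1%}, fps +{fps_gain:.1%}")
# prints "clock +12.5%, fps +12.0%"
```

A 12.5% clock bump buying a 12.0% frame rate bump is about as close to linear scaling as real-world benchmarks get.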

My spirits buoyed by the knowledge that I didn't just spend an entire day tinkering with no measurable results, I decided to expand my benchmarking a bit to further illuminate the difference between the stock 2.40GHz and maximum stable 2.70GHz speeds. For this I went to synthetic benchmarking suite PC Mark 04 and the game F.E.A.R. PC Mark (now up to 05) is a commonly used benchmarking application which runs batteries of specific tests on your PC to gauge performance in several categories. F.E.A.R., a first-person shooter released last fall by Monolith, is a real system killer; it also includes a very nice benchmarking utility similar to the timedemo function in Doom 3, but with more detailed reporting. At both CPU speeds I ran these two tests only with the GPU overclocked, since the previous testing made it clear that my GPU needs to give me all it can.

PC Mark 04 has a main set of benchmark tests, along with four other sets dedicated to testing CPU, memory (RAM), GPU, and hard drive (HDD) performance. I ran all five sets.

PC Mark 04 @ 2.40GHz
PC Mark: 4771
CPU: 4554
RAM: 5395
GPU: 9688
HDD: 6467

PC Mark 04 @ 2.70GHz
PC Mark: 5277
CPU: 5121
RAM: 6046
GPU: 11805
HDD: 6480

Significant improvement in every category except the hard drives. I'm really starting to warm up to the overclock, painful pun not intended.

F.E.A.R. is a great looking game, but those looks don't come cheap. It's so demanding of your PC's hardware that I back off a notch on the anti-aliasing, down to 2x. All the in-game graphics options are turned on with the exception of "soft shadows", which I recommend everyone disable due to its disastrous effect on performance. (It doesn't look good anyway.) Here are the results of the F.E.A.R. benchmark at 1600x1200, 2xAA, 8xAF, no soft shadows. (Minimum, average, and maximum values are expressed in frames per second.)


F.E.A.R. @ 2.4GHz
min: 24 avg: 39 max: 77
below 25 fps: 1%
25-40 fps: 70%
above 40 fps: 29%

F.E.A.R. @ 2.7GHz
min: 25 avg: 42 max: 79
below 25 fps: 0%
25-40 fps: 65%
above 40 fps: 35%

In F.E.A.R. there is a small but notable performance increase from the CPU overclock, much more so than in Doom 3 or X3. It doesn't surprise me at all to see that F.E.A.R. leans a little more heavily on the CPU than other games.

What does all this mean?

Sometimes the journey is more important than the destination, and for me this is one of those times. While overclocking my CPU didn't net any significant game performance improvements (with the exception of F.E.A.R.), I got to know my hardware a lot better. For the past four months I've thought that my CPU was the performance bottleneck of the system, simply because my CPU is a mid-level part and my GPU is a high-end one. That's not the case. That calls for a serious adjustment of my upgrade priorities. I designed this PC to be SLI-capable, meaning that I can add a second video card (of the same type as my current one) for a large performance boost. SLI is sexy as all hell, but my GPU carries a price tag of $469 as of this writing, so it's not a cheap upgrade. Neither is upgrading my CPU, as the AMD CPU directly above it on the food chain, the FX-55, is going for $811 (as opposed to my Athlon 64 4000+, which is $334). When I built the PC I wished I had another $400 for an FX-series proc and considered it a significant compromise, but now I know that my money would be better spent on a second video card anyway. That's a good thing to know.

Gaming performance aside, there's one real-world benefit to my overclock: Windows is noticeably snappier. Navigating the Start menu, opening applications, pulling up right-click menus, it all seems just a little more responsive, and anything that can make Windows a little shinier is OK with me.

As long as it doesn't burn anything up.

Saturday, February 04, 2006

Friday Fiver: Weekend Update

I'm a day late, but this time it's due to an Internet service interruption. In the immortal words of Han Solo, it's not my fault!

1. Any plans to watch the Super Bowl?
Shit no! I hate football. It's like watching paint dry, but with injuries and commercials. All major sports take entirely too long to play, and they're not very rewarding to the casual observer. I don't mind watching baseball or basketball once in a while (usually for 30 minutes or less), but that's about as far as I go. Sports are overrated and take up too much time.

2. Friday or Saturday: which is a better date night?
I will answer this question as the character of Quagmire from Family Guy: "That all depends on whether she's Catholic or not! Ow! Giggidy!"

3. Do you do anything special on the weekends that you don't do during the week?
Hoo-boy. OK, short list: bathe, braid my nosehairs, breathe argon, dance with abandon to Zamphir, human sacrifices, animal sacrifices, sacrifices-in-effigy, nipple torture, promote topsoil erosion, pray to almighty Q*Bert, and read for pleasure.

4. Where do you get your news from?
From The Daily Show, stoopid! I get my local news and weather from WCAX, homepage of Vermont's own Channel 3. Marselis Parsons will devour the universe. Because old habits die hard I tend to get my national news from CNN, home of all the news unfit for proofreading. There are other websites I rely on for news of a narrower focus, but I'm not telling you what they are. You'll spoil them. I think the newspaper is one of the greatest ergonomic disasters in the history of the human race, and they're often written and edited with complete disregard for the English language. No sir, I don't like it.

5. Kevin, Norm, Colin, Jimmy, Tina, or Amy?
Ruprecht! No, seriously, who the fuck are these people anyway? From this week's title and the tagline "Goodnight and have a pleasant tomorrow" on the Friday Fiver website I'm guessing this is a Saturday Night Live reference, but that's meeting the Friday Fiver folks more than halfway. I haven't watched SNL for years, and it's been even longer since I've laughed at it. Oh God, the pain! It's like watching a koala bear get drawn and quartered for 90 minutes! (When does that get fun?) I seriously thought Will Ferrell was utterly devoid of talent until I started seeing him in movies. A few days ago I watched the SNL Best of John Belushi DVD. Man, those were the days! He's been gone for 24 years. That figure makes me feel desperately old. I miss ya, John. You too, Gilda. Chris Farley, I don't miss so much. An imitator to the bitter end.



All hail!