Holiday Budget System Buyers' Guide
by Zach Throckmorton on November 8, 2011 12:00 AM EST
Sandy Bridge Celerons
Intel released Sandy Bridge-based Celeron CPUs in early September, and they started appearing in retail channels by the middle of that month; we provided a brief overview of these parts at the time. The Celeron that stands out is the G530, a dual-core CPU clocked at 2.4GHz with 2MB of L3 cache and on-die Intel HD Graphics. It lacks Hyper-Threading and Quick Sync support and has a TDP of 65W (though it generally uses far less power than that). While Intel's suggested price is a meager $42, retail prices have held steady at $55-60 since its release. It is Intel's least powerful dual-core CPU, with only the single-core G440 available for less money.
If you've been building and using computers for years, you know there's a stigma attached to the Celeron name. For a long time, Celerons were crippled to the point of near-unusability for even the most basic tasks. That has changed: as our benchmarks below indicate, the G530 is anything but an almost-garbage CPU. The Celeron stigma is dead.
Athlon II X2s
AMD's Athlon II X2 Regor-based 45nm dual-cores have been a mainstay of budget computing since their introduction in 2009. The Athlon II X2 250, clocked at 3.0GHz with 2MB L2 cache, is essentially as capable today as it was two years ago for basic usage. For example, 1080p videos on YouTube are no more difficult to decode and Microsoft Office 2010 isn't much more CPU-hungry than Office 2007 was. Given that most computers I assemble are budget systems, I've now used the Athlon II X2 250 for more builds than any other CPU. Is that about to change?
AMD's most recent APUs (accelerated processing units) have also expanded into the budget processor range. These Fusion APUs combine both the CPU and Radeon "cores" on a single die. Anand reviewed the most capable APU back in June, and compared the A6 model to Intel's Sandy Bridge Pentium in late August. The more recently released 32nm A4-3300 chip (overviewed by Anand in September) is a dual-core part clocked at 2.5GHz with 1MB total L2 cache and featuring AMD's Radeon HD 6410 graphics—160 GPU cores clocked at 443MHz. Its nominal TDP is 65W. Priced around $70, the A4-3300 is only about $10 more than the Celeron G530 and Athlon II X2 250. It promises better graphics performance—but how does the least expensive A-series APU compare to inexpensive discrete video cards, and do you sacrifice processor performance for better graphics?
Battle of the Budget Processors: Benchmarks
While we didn't put the Celeron G530 and A4-3300 through our extensive Bench suite, here are a few benchmarks that show how they stack up against the venerable Athlon II X2 250. All benchmarks were performed using an Antec Neo Eco 400W power supply, a Western Digital Blue 500GB WD5000AAKX hard drive, and a 2x2GB kit of DDR3-1333 with a clean installation of Windows 7 Enterprise 64-bit, with only the manufacturer-supplied drivers installed.
Conversion of a PowerPoint Presentation to a PDF
For this benchmark, I converted a 100-slide, 25MB PowerPoint file to a PDF using Microsoft Office 2010's integrated "Save as PDF" option. As you can see, the Athlon II CPU performs this task slightly faster than the Celeron, though in practice you'll only notice the difference when converting extremely large PowerPoint files. The Fusion APU is substantially slower—that is a difference you will notice in real-world usage.
7-Zip compression
These values were obtained using 7-Zip's built-in benchmark function with a 32MB dictionary. AMD's Athlon II CPU has a more noticeable advantage here over the Celeron—you will notice a difference when compressing or decompressing many files or large files. The A4-3300 again performs palpably worse—no surprise given its 2.5GHz clock against the Athlon's 3.0GHz.
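7-Zip's own benchmark isn't easily scripted here, but the core idea—timing LZMA compression with a 32MB dictionary—can be sketched with Python's built-in lzma module. This is an illustrative stand-in, not 7-Zip's actual benchmark code; the payload and function name are my own.

```python
import lzma
import time

# LZMA2 filter chain with a 32MB dictionary, mirroring the article's
# 7-Zip benchmark setting (illustrative, not 7-Zip's internal code).
FILTERS = [{"id": lzma.FILTER_LZMA2, "dict_size": 32 * 1024 * 1024}]

def mini_bench(data: bytes) -> tuple[float, bytes]:
    """Time a single compression pass and return (seconds, compressed bytes)."""
    start = time.perf_counter()
    packed = lzma.compress(data, format=lzma.FORMAT_XZ, filters=FILTERS)
    return time.perf_counter() - start, packed

if __name__ == "__main__":
    payload = b"AnandTech budget build " * 100_000  # ~2.3MB of compressible text
    elapsed, packed = mini_bench(payload)
    assert lzma.decompress(packed) == payload  # round-trip sanity check
    print(f"compressed {len(payload)} -> {len(packed)} bytes in {elapsed:.2f}s")
```

A larger dictionary lets the compressor find matches farther back in the stream, which is why the benchmark stresses both memory bandwidth and raw CPU throughput.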
FastStone image resizing
For this test, I resized 50 4200p pictures down to 1080p resolution using FastStone's batch image conversion function. Again, the two CPUs perform similarly, though this time Intel takes the lead. The AMD APU once again lags significantly behind the two CPUs.
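FastStone handles the scaling internally, but the dimension arithmetic behind any aspect-preserving batch downscale is simple; here is a minimal sketch (the function name and the sample 16:9 dimensions are my own, not FastStone's API).

```python
def scale_to_height(width: int, height: int, target_height: int = 1080) -> tuple[int, int]:
    """Scale dimensions to a target height, preserving aspect ratio—
    the same arithmetic a batch resizer performs per image."""
    new_width = round(width * target_height / height)
    return new_width, target_height

# A 16:9 "4200p" source (7467x4200) lands at 1920x1080
print(scale_to_height(7467, 4200))  # → (1920, 1080)
```

The per-pixel resampling that follows (bicubic, Lanczos, etc.) is what actually loads the CPU, which is why this test separates the three chips.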
x264 HD encode test
Graysky's x264 HD test (v. 3.03) uses x264 to encode a 4Mbps 720p MPEG-2 source. The focus here is on quality rather than speed, thus the benchmark uses a 2-pass encode and reports the average frame rate in each pass. The difference between the Athlon II and Celeron CPUs is essentially nil; both offer better performance than the AMD APU.
As with the benchmarks above, all components were held equal for power consumption testing except the CPU and motherboard. For the Athlon II platform I used the ASRock 880GM-LE motherboard, for the Intel platform the ASRock H61M-VS, and the APU was tested on an ASRock A55M-HVS. Measurements were taken using a P3 International P4400 Kill A Watt monitor and reflect the entire system, not just the CPU. This is where the efficiency of the newer architectures truly outshines the older Athlon II design.
Intel's Celeron still leads for low power use, but Llano is at least within striking distance. The older Athlon II X2 uses around 50% more power than Llano in these two tests—roughly 17 to 30W more. Taking the lower number and assuming a system that's powered on eight hours per day, we end up with a difference of around 50kWh per year—or $4 to $15 depending on how much you pay for electricity. If you live somewhere power costs more, there's obviously a lot to be said for the more efficient architectures.
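The savings estimate above is easy to verify; a quick sketch, where the $0.08 and $0.30 per kWh rates are assumptions chosen to bracket typical electricity prices:

```python
def annual_cost(delta_watts: float, hours_per_day: float, rate_per_kwh: float) -> tuple[float, float]:
    """Annual energy (kWh) and dollar cost of a given extra power draw."""
    kwh = delta_watts * hours_per_day * 365 / 1000
    return kwh, kwh * rate_per_kwh

# 17W extra draw, 8 hours/day, at assumed $0.08/kWh and $0.30/kWh rates
kwh, low = annual_cost(17, 8, 0.08)
_, high = annual_cost(17, 8, 0.30)
print(f"{kwh:.0f} kWh/year -> ${low:.2f} to ${high:.2f}")  # → 50 kWh/year -> $3.97 to $14.89
```

Run the same numbers with the 30W delta or a 24/7 duty cycle and the payback for the newer architectures gets meaningfully larger.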
Next, we test how the AMD A4-3300 APU's graphics prowess stacks up against a budget GPU. The AMD Athlon II and Intel Celeron CPUs were paired with an AMD Radeon HD 5670 512MB GDDR5 discrete GPU, as neither of their integrated graphics solutions is capable of producing a tolerable gaming experience. The A4-3300 was not paired with a discrete GPU.
Left 4 Dead 2
For the Left 4 Dead 2 benchmark, we used a 1024x768 resolution with all settings at maximum (but without antialiasing). The AMD APU delivers almost 40 frames per second by itself, so no discrete graphics card is required. Subjectively, gameplay was smooth and fluid on the APU. However, bumping up the resolution to even 720p could be an issue, even with less demanding games.
DiRT 3
For the DiRT 3 benchmark, we used DirectX 11 at 1024x768 resolution, but this time with graphics options set to the low preset. Even then, the AMD APU struggled to reach the 30 frames per second threshold, and DiRT 3 clearly did not run as smoothly as Left 4 Dead 2. That said, it remained playable, and if you're tolerant of lower resolutions, it performs fine in windowed mode.
Keep in mind that we're using the bottom-rung Llano APU for these tests, and it's a pretty major cut from the A6 models--half the shader cores, but with a slightly higher clock, and only a dual-core CPU. Where the A6 and A8 can legitimately replace budget discrete GPUs, the same cannot be said for the A4 APUs. The lowest priced A6-3500 will set you back around $100, but it drops the CPU clock to 2.1GHz and only adds a third core. Meanwhile the quad-core A6-3650 will run $120 ($110 with the current promo code), but it sports a 2.6GHz clock with the HD 6530D graphics (and a higher 100W TDP). At that point, you might also be tempted to go for the A8-3850, with the full HD 6550D graphics and a 2.9GHz clock, which brings the total for the APU to $135. All of these APUs will work in the same base setup as our Llano build, but obviously the price goes up quite a bit. If you'd like added processing and graphics power, though, the quad-core parts make sense.
Conclusion
As you can see, the Athlon II and Celeron CPUs are very evenly matched across a range of basic productivity tests, while the Fusion APU typically lags behind, at least for office productivity and encoding tasks. That said, the A4-3300 is capable of delivering an acceptable gameplay experience for casual gamers without necessitating a discrete GPU. Additionally, Intel's newer Sandy Bridge architecture and AMD's newer Llano architecture result in dramatically lower total system power consumption at both idle and load compared to the aging AMD Regor architecture.
So which CPU should you buy for your budget build? In terms of upgradeability, socket AM3 is still viable. In the short term, Phenom II quad-cores are already inexpensive, starting at just over $100, so they will be even cheaper in another year or two. Bulldozer CPUs are also compatible with many AM3+ motherboards and could be a wise upgrade in a few years. Intel's LGA 1155 socket is likewise very upgrade-friendly—the Celeron G530 is, after all, the least powerful Sandy Bridge CPU (aside from the sole single-core SKU). The Core i3-2100 will likely sell for less than $100 in another year or so (at least on the secondhand market), and more powerful Core i5s and i7s could keep today's Intel budget build alive and well for as much as five more years. Similarly, AMD's socket FM1 has nowhere to go but up from the A4-3300 APU, though LGA 1155 currently offers far more powerful CPUs than the high-end A8-3850.
Comments
DanNeely - Tuesday, November 8, 2011
Unless you're more concerned about squeezing every last watt of efficiency out of your system than noise under load, you want a PSU that maxes at 200-300W more than your peak consumption level so that its fan never goes above idle speeds. For a gaming box that typically means an extra 10-20W drawn while the system is idle since you're in the <20% load low efficiency zone on most PSUs.
piroroadkill - Tuesday, November 8, 2011
That's odd, because my i5-2500k @ 4.5 with a Radeon 6950, 8GB RAM and 7 hard drives pulls around 150w idle. Seasonic X-660. Under load, we're talking north of 300w easy.
When I had a Q9550 @ 3.8 and a Radeon 4890, it pulled about 230w idle. That was with a Corsair HX520. I easily pushed 400w at the wall under CPU + GPU load, and I was actually pretty afraid to load both to the maximum.
I have a power meter permanently hooked up to my PC.
piroroadkill - Tuesday, November 8, 2011
I realise this is AC load, not DC load. However, I have been running pretty efficient PSUs.
I do completely agree people overestimate vastly.
Actually, with my old Radeon HD 2900XT, that used MORE power than my 4890.
Taft12 - Tuesday, November 8, 2011
It's the hard drives that are pushing up your idle power usage. WD Blacks or 2+ TB "green" drives use 6-8W each at idle.
erple2 - Friday, November 11, 2011
No, it's the 4890 combined with the Q9550 that's pushing that kind of draw, even at idle. Drives typically consume "only" 7-8W each at idle and about 10W under load, so expect the seven drives alone to contribute 49-56W at idle. The top two consumers in that setup are clearly the GPU, then the CPU.
My i7-950 + 6870 and one WD Black drive eats 200W at idle.
My old computer (core2duo 6750, 4890, and similar drive) used to idle at 240W give or take.
Iketh - Tuesday, November 8, 2011
That's funny, because I have a 2600k @ 4.2ghz converting to x264 as I type this and using a steady 170w. That's with 2 Seagate greens, an ssd, and a Radeon 6870. My power supply is an Antec 380w. If I game at the same time, it's at 250w. Idle is 92w. Sounds like there is a little tweaking you can do in your bios.
I also have a power meter permanently hooked up to my PC.
wifiwolf - Tuesday, November 8, 2011
Just as a side note: you're in the 50% spot, so max efficiency.
DominionSeraph - Tuesday, November 8, 2011
$420??
Inspiron Desktop 560 Mini-Tower
Processor: Intel Pentium Dual Core E6700 (3.20GHz 2MB)
Genuine Windows 7 Home Premium
500 GB SATA Hard Drive (7200 RPM)
3 GB DDR3 SDRAM 1333MHz (3 DIMMs)
16X DVD +/- RW Optical Drive
$289 at Dell outlet.
Why the heck would anybody build a budget system?
slayernine - Tuesday, November 8, 2011
For friends and family who mostly just browse the Internet but want a system that could perhaps be upgraded in the future, as most prebuilt systems don't allow.
Also building your first entry level system lets you get into system building or gaming without breaking the bank. Your Dell system will not play any games without a dedicated card and would likely need a power supply upgrade if you wanted to install a dedicated video card. Also airflow in most consumer desktops is not suitable for a gaming system unless you buy something like a Dell XPS which then puts you into a much higher price bracket. At that point you will realise why custom built is better :)
jabber - Tuesday, November 8, 2011
Indeed, I've had many a Dell Dimension or similar in for 'upgrades', and by the time you check them over, the HDD and RAM are about all that's worth upgrading. If you want to put in another HDD, you have to contend with their bafflingly overcomplicated HDD mounting systems. Why they have to use 8 parts when other cases use just a simple slot-and-screw method, I don't know.