µATX Part 1: ATI Radeon Xpress 1250 Performance Review
by Gary Key on August 28, 2007 7:00 AM EST - Posted in Motherboards
Power Consumption
We measured power consumption in three states: at idle, sitting at the Vista desktop after 10 minutes of inactivity, and under load while running our Nero Recode 2 and Company of Heroes benchmarks. In each case, all available power saving options were enabled in the BIOS and operating system to keep power consumption to a minimum, although the biggest difference obviously occurs at idle.
Our tests surprised us, as both Radeon Xpress 1250 platforms consume slightly more power than the Intel G33 platform whether power management is enabled or not. The AMD HD 2600 XT consumes 13W more power at idle with power saving enabled and 12W more with it disabled when compared to the X1250. The 2600 XT also uses 9W more power in Nero Recode 2 and 32W more under load in Company of Heroes. We will have NVIDIA 8600 GTS results in our next article. Our only problem during testing with the X1250 boards occurred during the 3DMark06 and PCMark05 benchmark runs, when the CPUs remained in a low-power idle state; this was quickly solved by switching the power plan to Performance.
1080p CPU Utilization
With the release of the Catalyst 7.8 driver, the X1250 graphics core has been touted as capable of playing back high definition content at 1080p with an appropriate CPU and CyberLink's PowerDVD player. We confirmed this claim with our E2160 processor using our Mission Impossible titles, which are encoded in the VC-1 format on HD DVD and the less demanding MPEG-2 format on Blu-ray. We then tested Casino Royale, which is encoded in H.264, and found average CPU utilization to be around 89% with the E2160. While the majority of the movie was watchable, certain action sequences drove CPU usage to 100%, resulting in the movie stopping or PowerDVD locking up.
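For readers who want to reproduce this kind of measurement, the sketch below shows one way to sample average and peak CPU utilization while a movie plays. It is a minimal illustration, assuming Python with the third-party psutil package; our own monitoring tool is not specified here, so treat the sampling duration and interval as placeholders.

```python
# Minimal sketch: sample system-wide CPU utilization while playback runs.
# Assumes Python 3 with psutil installed (pip install psutil); the
# duration and interval values below are illustrative placeholders.
import psutil

def sample_cpu(duration_s: float = 120.0, interval_s: float = 1.0):
    """Collect CPU utilization samples and return (average, peak) in percent."""
    samples = []
    elapsed = 0.0
    while elapsed < duration_s:
        # cpu_percent() blocks for interval_s, then returns utilization (%)
        samples.append(psutil.cpu_percent(interval=interval_s))
        elapsed += interval_s
    return sum(samples) / len(samples), max(samples)

if __name__ == "__main__":
    # Start playback first, then run this script alongside it.
    avg, peak = sample_cpu(duration_s=60.0)
    print(f"average CPU: {avg:.1f}%  peak CPU: {peak:.1f}%")
```

A sustained run of samples at 100% in a trace like this corresponds to the stalls we saw in the demanding action sequences.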
ASRock recommends an E6550 processor for 1080p playback, and we did not notice any issues with this processor during testing. However, on the ASRock board this means the FSB is automatically overclocked to 333MHz, as the Radeon Xpress 1250 does not officially support the 1333MHz FSB this processor requires. The abit board will boot this processor, but its FSB is locked at 266MHz, which drops the E6550 from its rated 2.33GHz (7x333MHz) to 1.86GHz (7x266MHz) and defeats the purpose of using a 1333MHz FSB capable CPU. We found that an E6420 is perfectly suited to playing back H.264 content at 1080p on either board.
The playback quality of the X1250 at 1080p was perfectly acceptable on both boards, and we did not see any real differences between the HDMI output of the abit board and the DVI output of the ASRock board. That said, the high definition output is definitely a notch below our HD 2600 XT and NVIDIA 8600 GTS cards, with a fair amount of moiré and banding on the Vatican walls after Tom Cruise drops to the ground, and again on the stairs in the next sequence inside the Vatican. Color saturation and contrast are very good, while noise reduction and deinterlacing on the X1250 are acceptable but clearly a step below our 2600 XT and 8600 GTS cards. Unfortunately, we cannot show screenshot comparisons between our graphics solutions under Vista due to the DRM police hounding us; we are working on a solution that doesn't involve a camera aimed at the display...
22 Comments
Sargo - Tuesday, August 28, 2007 - link
Nice review, but there's no X3100 on the Intel G33. The GMA 3100 (http://en.wikipedia.org/wiki/Intel_GMA#GMA_3100) is based on a much older architecture, so even the new drivers won't help that much.

ltcommanderdata - Tuesday, August 28, 2007 - link

Exactly. The G33 was never intended to replace the G965 chipset; it replaces the 945G chipset and the GMA 950. The G33's IGP is not the GMA X3100 but the GMA 3100 (no "X"), and it is virtually identical to the GMA 950 but with higher clock speeds and better video support. The GMA 950, GMA 3000, and GMA 3100 all have only SM2.0 pixel shaders, with no vertex shaders and no hardware T&L engine. The G965 and its GMA X3000 remain the top Intel IGP until the launch of the G35 and the GMA X3500. I can't believe AnandTech made such an obvious mistake, but I have to admit Intel isn't helping matters with their ever expanding portfolio of IGPs. Here's Intel's nice PR chart explaining the different IGPs:
http://download.intel.com/products/graphics/intel_...">http://download.intel.com/products/graphics/intel_...
Could you please run a review with the G965 chipset and the GMA X3000 using XP and the latest 14.31 drivers? They are now out of beta, and Intel claims full DX9.0c SM3.0 hardware acceleration. I would love to see the GMA X3000 compared with the common GMA 950 (also supported in the 14.31 drivers, although it has no VS to activate), the Xpress X1250, the GeForce 6150 or 7050, and some low-end GPUs like the X1300 or HD 2400. A comparison between the 14.31 drivers and the previous 14.29 drivers, which had no hardware acceleration, would also show how much things have improved.
JarredWalton - Tuesday, August 28, 2007 - link
I did look at gaming performance with a 965GM chipset in the PC Club ENP660 review (http://www.anandtech.com/mobile/showdoc.aspx?i=306...). However, that was tested under Vista. I would assume that with drivers working in all respects, GMA X3100 performance would probably come close to that of the Radeon Xpress 1250, but when will the drivers truly reach that point? In the end, IGP is still only sufficient for playing with all the details turned down at 1280x800 or lower resolutions, at least in recent titles. Often it can't even do that, and 800x600 might be required. Want to play games at all? Just spend the $120 on something like an 8600 GT.

IntelUser2000 - Wednesday, August 29, 2007 - link
It has the drivers in XP.
JarredWalton - Wednesday, August 29, 2007 - link
Unless the XP drivers are somehow 100% faster (or more) than the last Vista drivers I tried, it still doesn't matter. Minimum details in Battlefield 2 at 800x600 got around 20 FPS. It was sort of playable, but nothing to write home about. Half-Life 2 engine stuff is still totally messed up on the chipset; it runs in DX9 mode, but it gets <10 FPS regardless of resolution.

IntelUser2000 - Wednesday, August 29, 2007 - link
I get 35-45 fps in the single player demo for the first 5 minutes at 800x600 minimum. Didn't check more as it's limited.

E6600
DG965WH
14.31 production driver
2x1GB DDR2-800
WD360GD Raptor 36GB
WinXP SP2
IntelUser2000 - Tuesday, September 11, 2007 - link
Jarred, PLEASE PROVIDE THE DETAILS OF THE BENCHMARK/SETTINGS/PATCHES used for BF2 so I can run equal testing to what you did in the Pt. 1 article. Like:
- What version of BF2 was used
- What demos are supposed to be used
- How do I load up the demos
- etc.
R101 - Tuesday, August 28, 2007 - link
Just for the fun of it, for us to see what the X3100 can do with these new betas. I've been looking for that test since those drivers came out, and still nothing.

erwos - Tuesday, August 28, 2007 - link
I'm looking forward to seeing the benchmarks on the G35 motherboards (which I'm sure won't be in this series). The X3500 really does seem to have a promising feature set, at least on paper.

Lonyo - Tuesday, August 28, 2007 - link
Bioshock requires SM3.0.