Audio Performance

We limited audio testing to the Rightmark 3D Sound version 2.0 CPU utilization test, and we also ran several games with sound enabled to show the performance impact of onboard audio. The Rightmark 3D Sound benchmark measures the overhead, or CPU utilization, required by a codec or hardware audio chip.
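
To illustrate the idea behind this type of measurement -- not the Rightmark tool itself -- the short Python sketch below samples system-wide CPU load while a sound loops asynchronously and compares it against an idle baseline. The wav file name and the sampling window are placeholders, and psutil is a third-party module; treat it strictly as a conceptual illustration.

    # Rough sketch of a CPU-utilization audio test (conceptual only, not Rightmark)
    import winsound   # Windows-only, in the standard library
    import psutil     # third-party: pip install psutil

    def average_cpu(seconds=10):
        # Average system-wide CPU utilization over the given window
        samples = [psutil.cpu_percent(interval=1) for _ in range(seconds)]
        return sum(samples) / len(samples)

    idle = average_cpu()   # baseline with no audio playing

    # Loop a test wav asynchronously so sampling continues in this thread
    # ("test.wav" is a placeholder file name)
    winsound.PlaySound("test.wav",
                       winsound.SND_FILENAME | winsound.SND_ASYNC | winsound.SND_LOOP)
    loaded = average_cpu()                         # utilization while audio plays
    winsound.PlaySound(None, winsound.SND_PURGE)   # stop playback

    print("Audio overhead: %.1f%%" % (loaded - idle))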

Audio Performance - Empty CPU - 32 Buffers

Audio Performance - 2d Audio - 32 Buffers

Audio Performance - DirectSound 3D HW - 32 Buffers

Audio Performance - DirectSound 3D EAX2 - 32 Buffers

As you can see, none of the onboard audio solutions was quite as low in CPU utilization as the Abit AudioMAX 7.1 solution. The Gigabyte 8N SLI Quad Royal uses the "almost standard" Realtek ALC850 found on most high-end NVIDIA SLI boards for the AMD platform. The current drivers limit the 3D sound buffers to a maximum of 25.

Audio Performance - DirectSound 3D EAX - BattleField 2

Audio Performance - DirectSound 3D EAX - Splinter Cell Chaos Theory

Audio Performance - Wolfenstein - Enemy Territory - Radar Demo

The Battlefield 2 numbers are highly disappointing: the Gigabyte 8N SLI Quad Royal's implementation of the Realtek ALC850 sound solution creates a 27% loss in frame rate in this highly popular online game, where sound cues are as critical as frame rates. Both Splinter Cell Chaos Theory and Wolfenstein: Enemy Territory show a more acceptable loss of 10%. Obviously, if you are a serious gamer, a dedicated sound card is a requirement.
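
For reference, the loss figures above are simply the relative drop in average frame rate once onboard audio is enabled. The frame rates in the snippet below are hypothetical placeholders rather than our measured results; only the calculation itself is the point.

    # How a frame-rate loss percentage is derived (hypothetical numbers, not our results)
    fps_sound_disabled = 100.0   # placeholder: average fps with audio off
    fps_sound_enabled = 73.0     # placeholder: average fps with onboard audio enabled

    loss = (fps_sound_disabled - fps_sound_enabled) / fps_sound_disabled * 100
    print("Frame-rate loss with audio enabled: %.0f%%" % loss)   # -> 27%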

While the Realtek ALC850 codec offers acceptable CPU usage and sound for most office applications or internet-based Flash games, it is not competitive in audio quality with the MSI P4N Diamond or Gigabyte GA-8I955x onboard audio solutions.

We are still finalizing our expanded audio test suite and will introduce it in the near future, along with results from the MSI P4N Diamond and Gigabyte 8I955x Royal boards.

Comments

  • PrinceGaz - Friday, October 14, 2005 - link

    Great to hear you are expanding the variety of games tested.

    As for onboard audio, I did originally use the onboard Realtek ALC850 audio on my DFI LanParty nF4 SLI-DR, with the official codec from the Realtek site (at first version A375, then A376a when I noticed the problem, but it made no difference), but found that it has a rather annoying bug.

    In some games, certain sounds that I know for a fact should be played are totally missing. All the other sounds are there, but the odd one is just not played.

    The most obvious example was in 'Serious Sam: The Second Encounter' (not the new Serious Sam 2, as I haven't played that), where it did not play the quiet intro music for the few seconds when first loading the game, nor did it play the chainsaw sound that follows. Another game I play enough to know when something is wrong is 'Train Simulator', where on one particular route I like, the "Okay to proceed" sound that is played when you can leave each station was never being played, which was rather problematic to say the least. There could well be other sounds missing as well, but only those two were sufficiently obvious to be immediately noticed.

    I tried using "dxdiag" to reduce the Sound Acceleration Level, but it made no difference unless it was reduced all the way to None (when the missing sounds were then played), and that causes more problems than it fixes.

    The C-Media CMI8738 onboard sound on my older box never had any problems with missing sounds, so I was very disappointed with the ALC850, especially as it seems almost industry standard on AMD nForce4 boards. As a result I bought an Audigy 2ZS, which works perfectly and plays all sounds, but it seems a shame that the onboard sound for me is basically useless for gaming.
  • Gary Key - Wednesday, October 19, 2005 - link

    I agree with you in regard to the static and drop-out issues with the ALC850. I had nothing but issues with the a376a driver set in Call of Duty and the retail release of Fear last night. Also, the general sound effects were thin and lacking any bass in most scenes. Music was not acceptable, with the general sound coming across like a cat on a hot tin roof. ;-> If I have time on the next review, I will also be posting X-Fi results, as it is now our high-end consumer card for the test bed benchmarks. The Intel-manufactured boards with the Sigma-Tel 922x series codecs have the best overall sound of the host-based audio solutions at this point. Expect to see these results and further testing of the ALC882M in the near future.
  • Calin - Thursday, October 13, 2005 - link

    As you don't have driver support for SLI across 4 cards, and as PCIe 1x, 2x, or 4x bandwidth would probably be enough for your 3rd and 4th video cards anyway, it would be a waste of money. Go buy any other SLI board.
    However, multi-monitor support is usually needed by some programs that work faster on Intel processors, and buying the cheapest dual core from Intel would be faster than any processor AMD has to offer at that price.
  • trooper11 - Thursday, October 13, 2005 - link

    it's all about return on your investment

    yes, the bottom-of-the-barrel Intel dual core is cheaper, but just move to the mid-range, where it squares up against the X2 3800 and X2 4400, and then it swings in AMD's favor.
  • TheInvincibleMustard - Wednesday, October 12, 2005 - link

    Couple o'things:

    1) Awesome to see BF2 as a benchmark (thanks Jarred!)

    2) How nicely would a setup such as this play with Intel's new virtualization technology? Would a solution that allows multiple graphics cards like this (not necessarily this exact board) be a better approach to allowing multiple users to each have their own KVM? I'm envisioning something akin to the "dumb-terminals" of yesteryear, with a family having multiple monitors, keyboards, and mice all hooked up to 1 pc in the house.

    3) On pg2 there's a pic of the BIOS showing the settings for the PCIe lanes. Is there some specific difference between the 0-3D1-16-1 and the 0-3D1-3D1-1 setup? Or are both utilizing 16 lanes for each of the 3D1's and it's just logic on the motherboard to differentiate so it accepts the correct card?

    4) Also regarding the PCIe lanes, I see there's no 0-16-16-1 or equivalent. Is this intentional on the part of Gigabyte? Will a BIOS upgrade allow for this? The reason I'm asking is because I'm curious if there would be a difference in terms of SLI speeds w.r.t. 8-8 vs 16-16, as has been somewhat hinted at in the "SLI x16" snippets I've heard thus far, and this would seem to be the perfect motherboard to test for that.

    5) Any speculation on why the Doom3 scores show such a spread while others don't show as much of one?

    6) When are they gonna be available for purchase? :D

    Thanks guys for a very neat preview of an interesting upcoming product!

    -TIM
  • Gary Key - Thursday, October 13, 2005 - link

    Part Two,

    Tim-

    quote:

    3) On pg2 there's a pic of the BIOS showing the settings for the PCIe lanes. Is there some specific difference between the 0-3D1-16-1 and the 0-3D1-3D1-1 setup? Or are both utilizing 16 lanes for each of the 3D1's and it's just logic on the motherboard to differentiate so it accepts the correct card?


    No,
    The difference between the two setups is that with 0-3D1-16-1 the third PEG slot can utilize a card other than the 3D1 in an x16 configuration. In fact, due to the space limitations caused by the rear heatsink on the 3D1 rev1 cards, we used both a 6600GT and a 7800GTX in this slot. The board does require the separate paddle card for the 3D1 card in order to utilize both cards correctly (100%) in my testing. The bios does allow this change, but the paddle is the preferred method, at least in the pre-production bios. I typically set the bios to auto and utilized the paddle card, although both methods were tested to ensure it was possible.

    I also tried the two center slots (easier to type this way) in SLI with the two outer slots in standard mode. I could not get the two center slots to work properly in SLI mode, but this was due to the drivers and not the board.

    quote:

    4) Also regarding the PCIe lanes, I see there's no 0-16-16-1 or equivalent. Is this intentional on the part of Gigabyte? Will a BIOS upgrade allow for this? The reason I'm asking is because I'm curious if there would be a difference in terms of SLI speeds w.r.t. 8-8 vs 16-16, as has been somewhat hinted at in the "SLI x16" snippets I've heard thus far, and this would seem to be the perfect motherboard to test for that.


    I have a new bios coming from Gigabyte that hopefully will allow additional changes to the PCIe lanes in manual mode with the paddle card set for SLIx16. Under the auto mode the system will default to a 1-16-16-1 setting with the paddle card set to SLI. I did test in this mode but due to the inability of the 840EE to feed enough data to the two 7800GTX cards the benchmarks did not reflect any difference. I am also testing another "SLI x16" board but have the same issue with the GPU wait states.

    quote:

    5) Any speculation on why the Doom3 scores show such a spread while others don't show as much of one?


    The benchmarks jumped going from the D5 to the D6 bios used for all results. I re-tested the other boards with their latest shipping bios and the Gigabyte 8I-955x Royal jumped almost 20%. I am still testing with different GTX cards (it's expensive to buy 6 of these) and driver sets. I cannot match the Abit scores yet and we are still comparing notes.

    quote:

    6) When are they gonna be available for purchase? :D


    I had included this in my article but decided to pull the information as I did not want to jinx Gigabyte or have an ATI situation. The best information I can provide at this time is December. The board is in certification testing at this time and provided there are not any issues it should be out before January unless market conditions dictate otherwise. I will update the article or post a news blurb once the board enters production. We tested the revision 1.0 board and have worked extensively with Gigabyte on some bios enhancements. The current bios is at D9 and I am expecting a new spin next week. I know it is too late to change the sound solution but we are still pushing for the 1394b setup.

    I spent more than 110 hours of testing time on this board. I can honestly say without a doubt that it is ready for production.

    Thanks,
    Gary
  • TheInvincibleMustard - Friday, October 14, 2005 - link

    Awesome replies! Better late than never!

    Thanks!

    -TIM
  • Gary Key - Thursday, October 13, 2005 - link

    All,

    I apologize for not responding sooner as some serious family issues arose the past couple of days. I want to thank Wes for handling my responsibilities.

    Tim,
    quote:

    1) Awesome to see BF2 as a benchmark (thanks Jarred!)


    Jarred worked all night right before the article was published so we could include this benchmark. I wish I could have had more time with it in the overclocking and sound section but that will come in future articles.

    quote:

    2) How nicely would a setup such as this play with Intel's new virtualization technology? Would a solution that allows multiple graphics cards like this (not necessarily this exact board) be a better approach to allowing multiple users to each have their own KVM? I'm envisioning something akin to the "dumb-terminals" of yesteryear, with a family having multiple monitors, keyboards, and mice all hooked up to 1 pc in the house.


    You are certainly on the right track with this thought process. All I can say at this time is wait until next year. ;->

  • DrMrLordX - Wednesday, October 12, 2005 - link

    . . . is this paragraph:

    It is this quick thought process along with quick action that has allowed Gigabyte to introduce several innovative products over the past year that include everything from the GA-8I945P dual graphics capable motherboard to the impressive single slot SLI based GV-3D1-68GT video card. While the true commercial success of these currently niche products are open for debate, the desire of the company to introduce these types of products is not.

    Huh? Since when was their stupid single-slot SLI card innovative? They just crammed the logic from two 6600GTs onto one card, and the result was overpriced crap. No comment about the GA-8I945P, but it all sounds like Gigabyte corporate spew to me.
  • Wesley Fink - Wednesday, October 12, 2005 - link

    Please look at your quote closely. We are talking about the 3D1-68GT, which combines two 6800 GT GPUs on a single card and NOT the earlier 6600 version. Please check the benchmarks before you trash the description as innovative. On p. 6 the 3D1-68GT outperforms the 7800GTX in both 3DMark03 and 3DMark05. That's pretty decent performance from a single slot card based on dual 6800GT (not Ultra) GPUs. The 7800GTX is still likely the better buy, but the 68GT is still an interesting idea with excellent performance.
