nForce4 SLI Roundup: Painful and Rewarding
by Wesley Fink on February 28, 2005 7:00 AM EST - Posted in Motherboards
Gigabyte K8NXP-SLI: Overclocking and Stress Testing
FSB Overclocking Results
Front Side Bus Overclocking Testbed: Gigabyte K8NXP-SLI
Processor: Athlon 64 4000+ (2.4GHz, 1MB Cache)
CPU Voltage: 1.55V (default 1.50V)
Cooling: Thermaltake Silent Boost K8 Heatsink/Fan
Power Supply: OCZ PowerStream 520W
Memory: OCZ PC3200 EL Platinum Rev. 2 (Samsung TCCD Memory Chips)
Hard Drive: Seagate 120GB 7200RPM SATA 8MB Cache
Maximum OC (Standard Ratio): 230x12 = 2760MHz (4X HT, 2.5-3-2-7, 1T, 2.8V) (+15%)
Maximum FSB (Lower Ratio): 230x12 = 2760MHz (4X HT, 2.5-3-2-7, 2.8V) (1:1 Memory, 1T, 2 DIMMs in Dual-Channel mode) (+15% Bus Overclock)
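The overclock percentages in the table are simply the reference clock setting multiplied by the CPU multiplier, compared against the stock 200x12 of the Athlon 64 4000+. A minimal sketch of that arithmetic, restating only the numbers already shown above:

```c
#include <stdio.h>

int main(void)
{
    /* Stock Athlon 64 4000+: 200MHz reference clock x 12 multiplier */
    const double stock_mhz = 200.0 * 12;   /* 2400 MHz */
    /* Best stable setting reached on the production K8NXP-SLI */
    const double oc_mhz    = 230.0 * 12;   /* 2760 MHz */

    printf("Overclocked CPU speed: %.0f MHz\n", oc_mhz);
    printf("Bus overclock: +%.0f%%\n", (230.0 / 200.0 - 1.0) * 100.0);
    printf("CPU overclock: +%.0f%%\n", (oc_mhz / stock_mhz - 1.0) * 100.0);
    return 0;
}
```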
After the excellent overclocking results that we found in our pre-production review of the K8NXP-SLI, we expected the Gigabyte SLI to be near the top of our overclocking charts. However, something has happened along the way from pre-production to production: our production board could only reach a very disappointing 230 clock setting regardless of the multiplier selected. This is in stark contrast to the 284 setting that we easily reached on the pre-production board.
We recently met with Gigabyte to discuss this issue, and Gigabyte has assured us that updates are coming to bring overclocking back to the levels that we saw in our earlier review. For now, we simply cannot predict the overclocking capability of the K8NXP-SLI that you might buy. It could be stellar, like the pre-production board that we tested, or mediocre, like the production board; we have evidence to support either conclusion.
Memory Stress Test Results:
Our memory stress tests measure the ability of the K8NXP-SLI to operate at its officially supported memory frequency (400MHz DDR) at the lowest memory timings that our OCZ PC3200 Platinum Rev. 2 modules will support. All DIMMs used for stress testing were 512MB double-sided (double-bank) modules. To make sure that memory performed properly in Dual-Channel mode, memory was only tested using either one dual-channel bank (2 DIMMs) or two dual-channel banks (4 DIMMs).

Stable DDR400 Timings - One Dual-Channel (2/4 DIMMs populated)
Clock Speed: 200MHz
CAS Latency: 2.0
RAS to CAS Delay: 2T
RAS Precharge: 7T
Precharge Delay: 2T
Command Rate: 1T
Using two DIMMs in Dual-Channel 128-bit mode, the memory ran every benchmark at the fastest 2-2-2-7 timings at the default 2.6V memory voltage.
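Our stress testing runs real benchmarks and applications at these timings rather than a synthetic pattern tester, but for readers unfamiliar with the idea, the sketch below is a minimal, hypothetical write-and-verify memory soak loop in C - an illustration of the concept only, not the test suite used in this review:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Minimal illustrative memory soak test: fill a large buffer with a few
 * bit patterns and verify that each pass reads back intact. Real stress
 * testing (benchmarks, games, encoding runs) exercises the memory
 * controller far harder than this simple sketch. */
int main(void)
{
    const size_t bytes = 256UL * 1024 * 1024;          /* 256MB test buffer */
    const unsigned char patterns[] = { 0x00, 0xFF, 0xAA, 0x55 };
    unsigned char *buf = malloc(bytes);
    if (!buf) {
        fprintf(stderr, "allocation failed\n");
        return 1;
    }

    for (int pass = 0; pass < 4; pass++) {
        unsigned char p = patterns[pass];
        memset(buf, p, bytes);                         /* write pass */
        for (size_t i = 0; i < bytes; i++) {           /* verify pass */
            if (buf[i] != p) {
                printf("error at offset %zu (pass %d)\n", i, pass);
                free(buf);
                return 1;
            }
        }
        printf("pass %d (pattern 0x%02X) clean\n", pass, p);
    }

    free(buf);
    return 0;
}
```

Of course, passing a simple pattern loop says little about stability under load, which is why the actual testing runs full benchmarks at the timings listed above.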
Tests with 4 double-sided DIMMs on an Athlon 64 system are more demanding, since AMD only specifies DDR333 for this combination. However, most Athlon 64 motherboards paired with recent AMD processors (the memory controller sits on the CPU itself) have been able to handle 4 DIMMs at DDR400.
Stable DDR400 Timings - Four DIMMs (4/4 DIMMs populated)
Clock Speed: 200MHz
CAS Latency: 2.0
RAS to CAS Delay: 2T
RAS Precharge: 7T
Precharge Delay: 2T
Command Rate: 2T
With all four DIMM slots populated (two dual-channel banks), the Gigabyte required a 2T Command Rate, which is the pattern we have seen on other top-performing Socket 939 boards. There was no problem running 4 double-sided DIMMs at DDR400 at the same aggressive 2-2-2-7 timings that worked well with 2 DIMMs.
108 Comments
fitten - Tuesday, March 1, 2005 - link
quote: "I still do not understand why this argument is so popular. Why is the general assumption that purchasers of SLI capable boards will immediately want to jump into a dual-card config? The idea is flexibility. Sure, 2 6800's are expensive now, but they will inevitably get cheaper."

Well, if history serves as a measure... by the time that 2nd board becomes cheap enough to justify its cost, there will be a new board out (say, the nVidia 7800) that is as fast as, or faster than, the SLI combo.
I used to buy motherboards with two sockets for this very reason (flexibility to upgrade to two CPUs later), until, after doing this about twice, I learned that by the time I was ready for that 2nd CPU, there was a single CPU out that was faster than both of mine put together.
Computers change too fast. If you perpetually buy on the bleeding edge, you cannot plan any upgrades past ~6 months and definitely not past 12 months. By that time, you'll throw away what you have and get the NextBestThing(tm). Buying SLI is bleeding edge. Saying that you'll buy the upgrade card in a year is just a rationalization to buy the bleeding edge now.
Aquila76 - Tuesday, March 1, 2005 - link
Yeah, that's right. Some apps run slower with SLI because nVidia hasn't SLI-optimized the driver for that app (so it can only utilize one card), and the SLI setup adds some overhead, resulting in slower results. Any new game/benchmark will use SLI just fine. The results in Half-Life and Doom 3, as well as in games you add a config for, like NFS:U2, are well above one card, though.

Sunbird - Tuesday, March 1, 2005 - link
Is my brain screwed up, or are the 3DMark03 single-card scores higher than the SLI scores?

chup - Tuesday, March 1, 2005 - link
Too bad, I thought the MSI was the one to get after nForce2.

sphinx - Tuesday, March 1, 2005 - link
From this review, I have come to the conclusion that ASUS is slipping. I have always been a supporter of ASUS, but I think this review shows how much ASUS is all about the money and not about making quality products. Right now I am waiting for manufacturers to get the VIA chipset working properly; I haven't seen much news or many reviews on VIA's new chipset. One other thing: who in their right mind would spend close to $250 on nVidia's NF4 if there is really no significant performance jump from the NF3?

bigbusa - Tuesday, March 1, 2005 - link
You mentioned that the ASUS manual says to use a 500+W PSU. If you read the ASUS user's guide, the SLI 6800 Ultra system also has all PCI slots used, all memory DIMMs filled, 2 optical drives, and an assortment of other stuff - and for that they recommend a 500+W PSU, but only a 350W PSU for a dual 6600GT setup. See below.

500+W PSU for an FX-55, 2x 6800 Ultra, 4 DDR DIMMs, 4 HDs, 2 optical drives, 1 PCIe x1 card, 3 PCI cards, 1 1394 device, and 6 USB devices. (Shit, that's a lot of gear.)

350W PSU for a 3400+ (64-bit, 939), dual 6600GT, 2 DDR DIMMs, 2 HDs, 1 optical drive, no PCIe x1, 1 PCI card, no 1394, and 3 USB devices.

So the article is a bit misleading.

The review also did not cover quad displays or the problems one may encounter when setting this up.
Reflex - Tuesday, March 1, 2005 - link
#70 - None of these boards supports ECC. The reason is that such support would be implemented by the memory controller, not by the motherboard manufacturer, and in this particular case the memory controller is integrated into the CPU. AMD has a line of CPUs with ECC support; they are called Opterons and are designed for workstations and servers. In the home user market, ECC does not significantly impact stability, but it does harm performance by a small amount, which is why the feature is not generally available on consumer solutions.
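For readers wondering what ECC actually does: ECC DIMMs store extra check bits so the memory controller can correct any single-bit error in a stored word and detect two-bit errors. The sketch below is a rough illustration only - a hypothetical single-error-correcting Hamming(7,4) code over four data bits, not how the Athlon 64/Opteron controller is actually implemented (that works in hardware over 64-bit words with 8 check bits):

```c
#include <stdio.h>

#define BIT(cw, pos) (((cw) >> (pos)) & 1u)

/* Encode 4 data bits into a Hamming(7,4) codeword.
 * Codeword positions 1..7: 1, 2, 4 are parity bits; 3, 5, 6, 7 are data bits. */
static unsigned encode(unsigned nibble)
{
    unsigned d1 = (nibble >> 0) & 1u;
    unsigned d2 = (nibble >> 1) & 1u;
    unsigned d3 = (nibble >> 2) & 1u;
    unsigned d4 = (nibble >> 3) & 1u;

    unsigned p1 = d1 ^ d2 ^ d4;          /* covers positions 1,3,5,7 */
    unsigned p2 = d1 ^ d3 ^ d4;          /* covers positions 2,3,6,7 */
    unsigned p3 = d2 ^ d3 ^ d4;          /* covers positions 4,5,6,7 */

    return (p1 << 1) | (p2 << 2) | (d1 << 3)
         | (p3 << 4) | (d2 << 5) | (d3 << 6) | (d4 << 7);
}

/* The syndrome is the position of a single flipped bit (0 means no error). */
static unsigned syndrome(unsigned cw)
{
    unsigned s1 = BIT(cw, 1) ^ BIT(cw, 3) ^ BIT(cw, 5) ^ BIT(cw, 7);
    unsigned s2 = BIT(cw, 2) ^ BIT(cw, 3) ^ BIT(cw, 6) ^ BIT(cw, 7);
    unsigned s3 = BIT(cw, 4) ^ BIT(cw, 5) ^ BIT(cw, 6) ^ BIT(cw, 7);
    return s1 | (s2 << 1) | (s3 << 2);
}

int main(void)
{
    unsigned data = 0xB;                  /* 4-bit value to protect */
    unsigned cw   = encode(data);

    unsigned corrupted = cw ^ (1u << 5);  /* flip one bit "in memory" */
    unsigned pos = syndrome(corrupted);

    printf("syndrome = %u -> flipping bit %u back\n", pos, pos);
    unsigned repaired = corrupted ^ (1u << pos);
    printf("repaired codeword matches original: %s\n",
           repaired == cw ? "yes" : "no");
    return 0;
}
```

Generating check bits on every write and verifying them on every read is roughly where the small performance cost mentioned above comes from.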
1955mm - Tuesday, March 1, 2005 - link
All in all, I think that this is the best review of Socket 939 SLI boards that I have seen. I particularly liked the attention paid to storage and I/O capabilities. My one criticism is that although comments were made regarding stability, and a link was made between overclocking and stability, there was no discussion of ECC support. If system reliability is discussed, ECC should not be ignored. As far as I can tell, the only board supporting ECC is the ASUS board. Over the years I have found it difficult to get accurate information on ECC support, having been given misleading information on occasion by both MSI and ASUS.

Aquila76 - Tuesday, March 1, 2005 - link
D'oh, *SoundSTORM Savior*

That's it, I'm off to bed. It's quarter of 1:00AM and I have work tomorrow. Uh, today.
Aquila76 - Tuesday, March 1, 2005 - link
To everyone hoping the MSI upsamples analog 5.1 to Dolby Digital - I don't think so. Like any Creative card, it can either downmix DD-EX/DTS-ES 7.1 streams to 4/5.1 speakers (which is what page 5-11 of the manual is actually talking about), decode DD/DTS on-card to 5/6/7.1 speakers (via analog or 'Digital Out', Creative's proprietary digital link for their speaker sets), or simply pass the Dolby Digital/DTS 5/6/7.1 signal (via the SPDIF coax/optical cable) to an outboard decoder.

I say this because I have the exact same chip on a stand-alone card, and it does not upsample analog sound to Dolby Digital like SoundStorm did. 'Digital Out' simply lets you use a proprietary Creative Digital DIN connector to run one cable from the sound card to the Creative speaker amp (like the DTT3500 that I use).
I also find it highly unlikely that Creative would license a DD Live capable chip to only one manufacturer when they have yet to produce one of their own cards with this feature.
*Keeps waiting for a SoundStrom Saviour*