Image Quality Analysis Fall 2003: A Glance Through the Looking Glass
by Derek Wilson on December 10, 2003 11:14 PM EST
Posted in: GPUs
Final Words
Making useful sense of all this information is tricky at best. There is no real-time 3D engine or hardware in existence that does everything the “right way” for everything visible, all the time. To achieve real-time 3D rendering, trade-offs must be made, and only when those trade-offs become highly perceptible do they become a problem.
It is the GPU makers' responsibility to implement optimizations in ways that don't negatively impact the image quality of a scene, but there is no purely quantitative way to judge any given optimization. What is acceptable to one person may not be acceptable to another, and that makes it a tough call.
One stopgap is the end-user community's perspective on these issues. If the people who own (or might one day own) a particular card decide that a given optimization shouldn't be done, it is in the GPU maker's best interest to make changes.
It is in game developers' best interest to work with GPU makers to keep image quality top notch for their games' sake. In fact, rather than concentrating on delivering raw frame rate to the end user, IHVs should focus on getting powerful and easy-to-use features into developers' hands. So far, ATI has a leg up on ease of use (developers have told us that programming has gone quickly and smoothly on ATI cards, while NVIDIA code paths take longer to tweak), while NVIDIA's hardware offers more flexibility (NVIDIA allows much longer shader programs than ATI and offers functionality beyond the minimum required by current APIs). At this point, ATI is in the better position: it doesn't matter that NVIDIA offers more functionality if the only code that can take advantage of it runs incredibly slowly after taking a very long time to develop. Hopefully, next year's hardware will bring more flexibility from ATI and fewer quirks in how programs must be written for NVIDIA.
At this point, ATI uses a more visually appealing antialiasing algorithm, while NVIDIA does a better job calculating texture LOD and does more alpha blending. The question we are asking now is whether these optimizations degrade image quality in any real way. We feel that NVIDIA needs to refine its antialiasing algorithms, and that ATI needs to do a better job of rendering alpha effects. We are still looking into the real-world effects of the distance calculations ATI uses in determining LOD, but that problem manifests itself much more subtly than the other two issues we have raised.
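For readers who want a concrete sense of what "calculating texture LOD" and "alpha blending" involve, the sketch below shows the textbook versions of both operations. It is only an idealized illustration under simplifying assumptions (isotropic filtering, a straight "over" blend); the function and variable names are ours and do not reflect either vendor's actual hardware.

```cpp
#include <algorithm>
#include <cmath>

// Textbook (isotropic) mipmap LOD selection: choose a mip level from how far
// apart adjacent screen pixels land in texture space (derivatives in texels).
// Real hardware works on pixel quads and applies vendor-specific anisotropy
// and bias handling; this is only the idealized formula.
float textureLod(float dudx, float dvdx, float dudy, float dvdy)
{
    float footprintX = std::sqrt(dudx * dudx + dvdx * dvdx); // texel step along screen x
    float footprintY = std::sqrt(dudy * dudy + dvdy * dvdy); // texel step along screen y
    return std::log2(std::max(footprintX, footprintY));      // larger LOD = blurrier mip
}

// Standard "over" alpha blend: the operation a driver skips or approximates
// when it cuts corners on alpha-blended effects.
struct Color { float r, g, b; };

Color alphaBlend(Color src, Color dst, float srcAlpha)
{
    return { src.r * srcAlpha + dst.r * (1.0f - srcAlpha),
             src.g * srcAlpha + dst.g * (1.0f - srcAlpha),
             src.b * srcAlpha + dst.b * (1.0f - srcAlpha) };
}
```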
The decision on what is acceptable is out of our hands, and we can't really declare a clear winner in the area of image quality. We can say that it appears from the tests we've done that, generally, NVIDIA hardware does more work than ATI. Honestly, it is up to the reader to determine what aspects of image quality are important, and how much of what we covered is relevant.
We really don't have a good way to compare pixel shader rendering quality yet. The possible issues with different shader implementations have yet to show up in a game, and we hope they never will. It is a developer's responsibility to create a game that gives a consistent experience across the two most popular GPUs on the market, and both ATI and NVIDIA have the ability to produce very high quality shader effects. Each architecture has different limitations that require care when programming, and we will have to wait and see whether image quality differences appear as more DX9 games hit the shelves.
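The article doesn't spell out which architectural limitations it has in mind, but one widely discussed difference among DX9-era GPUs was internal floating-point precision in the pixel pipeline. As a rough, hypothetical illustration (the roundToHalf helper and the specular-style example below are our own, and this is not a model of how any particular GPU rounds), the sketch shows how carrying an intermediate value at reduced precision can visibly change a precision-sensitive result:

```cpp
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <cstring>

// Crude emulation of storing an intermediate value at half (FP16) precision.
// FP16 keeps a 10-bit mantissa versus FP32's 23 bits, so we simply truncate
// the 13 low mantissa bits. This ignores FP16's smaller exponent range and is
// purely illustrative, not a model of any specific GPU's rounding behavior.
float roundToHalf(float x)
{
    std::uint32_t bits;
    std::memcpy(&bits, &x, sizeof bits);
    bits &= 0xFFFFE000u;          // drop the extra mantissa precision
    float out;
    std::memcpy(&out, &bits, sizeof out);
    return out;
}

int main()
{
    // A specular-style power term amplifies tiny input errors, which is why
    // intermediate precision can show up on screen as banding or sparkle.
    float nDotH = 0.999f;
    float fullPrecision    = std::pow(nDotH, 64.0f);
    float reducedPrecision = std::pow(roundToHalf(nDotH), 64.0f);
    std::printf("full: %f  reduced: %f\n", fullPrecision, reducedPrecision);
    return 0;
}
```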
For now, we are committed to bringing to light as much information as possible about image quality and optimizations in graphics hardware. Armed with this information, individuals will be able to come to their own conclusions about which optimizations go too far and which serve their intended purpose. We hope that all the details that we have brought to light have served their purpose in helping our readers to make informed decisions about graphics hardware.
35 Comments
nourdmrolNMT1 - Thursday, December 11, 2003 - link
I hate flame wars, but BlackShrike... there is hardly any difference between the images, and NVIDIA used to be way behind in that area. So they have caught up, and in some instances they are actually doing more work to get the image to look a little different, or maybe they render everything that should be there while ATI doesn't (Halo 2).
MIKE
Icewind - Thursday, December 11, 2003 - link
I have no idea what you're bitching about, BlackShrike; the UT2k3 pics look exactly the same to me. Perhaps you should go work for AT and run benchmarks how you want them done instead of whining like a damn 5 year old... sheesh.
Shinei - Thursday, December 11, 2003 - link
Maybe I'm going blind at only 17, but I couldn't tell a difference between nVidia's and ATI's AF, and I even had a hard time seeing the effects of the AA. I agree with AT's conclusion: it's very hard to tell which one is better, and it's especially hard to tell the difference when you're in the middle of a firefight; yes, it's nice to have 16xAF and 6xAA, but is it NECESSARY if it looks pretty at 3 frames per second? I'm thinking "No"; performance > quality, that's why quality is called a LUXURY and not a requirement. Now I imagine that since I didn't hop up and down and screech "omg nvidia is cheeting ATI owns nvidia" like a howler monkey on LSD, I'll be called an nVidia fanboy and/or told that A) I'm blind, B) my monitor sucks, C) I'm color blind, and D) my head is up my biased ass. Did I meet all the basic insults from ATI fanboys, or are there some creative souls out there who can top that comprehensive list? ;)
nastyemu25 - Thursday, December 11, 2003 - link
cheer up emo kid
BlackShrike - Thursday, December 11, 2003 - link
For my first line I forgot to say blurry textures on the NVIDIA card. Sry, I was frustrated at the article.
BlackShrike - Thursday, December 11, 2003 - link
Argh, this article concluded suddenly and without concluding anything. Not to mention, I saw definite blurry textures in UT 2003 and TRAOD. Not to mention the use of the D3D AF Tester seemed to imply a major problem with one or the other piece of hardware, but they didn't use it at different levels of AF. I mean, I only use 4-8x AF; I'd like to see the difference. AND ANOTHER THING. OKAY, THE TWO ARE ABOUT THE SAME IN AF, BUT IN AA ATI USUALLY WINS. SO, IF YOU CAN'T CONCLUDE ANYTHING, GIVE US A PERFORMANCE CONCLUSION, like which runs better with AA or AF? Which looks best with both settings enabled?
Oh, and AT: remember back to the very first ATI 9700 Pro review, when you did tests with 6x AA and 16x AF? DO IT AGAIN. I want to see which is faster and better quality when their settings are absolutely maxed out, because I prefer playing 1024*768 at 6x AA and 16x AF to 1600*1200 at 4x AA and 8x AF because I have a small monitor.
I am VERY disappointed in this article. You say NVIDIA has cleaned up their act, but you don't prove anything conclusive as to why. You say they are similar but don't say why. The D3D AF Tester results were totally different at the different levels. WHAT DOES THIS MEAN? Come on, Anand, clean up this article; it's very poorly designed and concluded and is not at all like your other GPU articles.
retrospooty - Thursday, December 11, 2003 - link
Well, a Sony G500 is pretty good in my book =)
Hanners - Thursday, December 11, 2003 - link
Not a bad article per se - Shame about the mistakes in the filtering section and the massive jumping to conclusions regarding Halo.
gordon151 - Thursday, December 11, 2003 - link
I'm with #6 and the sucky monitor theory :P.
Icewind - Thursday, December 11, 2003 - link
Or your monitor sucks #5
ATI wins either way