View Full Version : Image quality AMD vs. NVIDIA


KAOZNAKE
11-04-2010, 05:33 PM
Hi guys, I was waiting for someone to bring this topic up, but no one seems to care...

1 thing first: I'm absolutely pro AMD and want them to succeed. I don't care for NVIDIA or INTEL as long as they leave AMD "alone". But I want AMD to succeed everywhere, not "just" in fps (it's a whole lot more actually ;)). They can't do anything about PhysX (yet), but everything else should be THE best. Drivers certainly are...

So what's the problem you might ask?

Well the "problem" (lets call it problem) went fairly unnoticed in the "english speaking world".

Anandtech said: (http://www.anandtech.com/show/3987/amds-radeon-6870-6850-renewing-competition-in-the-midrange-market/5)

"In an attempt to not make another foolish claim I’m not going to call it perfect, but from our testing we can’t find any clear game examples of where the 6870’s texture filtering is deficient compared to NVIDIA’s – they seem to be equals once again."

TechPowerUp didn't talk about it.

HOCP wrote: (http://www.hardocp.com/article/2010/10/21/amd_radeon_hd_6870_6850_video_card_review/1)

"AMD has improved filtering again, which many thought was already close to perfect. We certainly haven’t seen any issues with filtering in any games recently. At any rate, AMD did not sit still and has improved filtering between texture levels with this latest generation."

To be honest, I didn't bother looking up other sites, because these are the ones I turn to when I need some English practice :D

So over to the German sites (I'm from Germany).

PCGH was the first to criticize: (http://www.pcgameshardware.de/aid,795021/Radeon-HD-6870-und-HD-6850-im-Test-AMDs-zweite-DirectX-11-Generation/Grafikkarte/Test/?page=10)

They basically say there is no more banding, but more flickering again, and that the new default quality setting is more like AI Advanced on the HD 5000. In other words, the HD 6000 series is saving "work", so to make things fair again they are benching HQ (AMD) versus Q (NVIDIA). But even then, they say, NVIDIA delivers better IQ.

HT4U was next. (http://ht4u.net/reviews/2010/amd_radeon_hd_6800_reloaded/index10.php)

They claim the banding is still there and provide videos (you need to install Fraps to view them).

Same with flickering. (http://ht4u.net/reviews/2010/amd_radeon_hd_6800_reloaded/index11.php) Not up to par with NVIDIA, they say :eek: (more videos)

One of the biggest German IT sites, ComputerBase.de, found the IQ of AMD graphics cards to be inferior to NVIDIA's (http://www.computerbase.de/artikel/grafikkarten/2010/bericht-radeon-hd-6800/) and decided to change their benchmarking to HQ (AMD) vs. Quality (NVIDIA) as well. (It was Q vs. Q before; the change costs AMD about 5% in the final ranking.)

They also claim that AMD's new Q setting equals AI Advanced, and only AMD's HQ restores AI Standard. For the HD 5000 series they found the default IQ to be lower since Catalyst 10.10, so going forward they are going to bench the HD 5000 with AI off (with Catalyst 10.10 and newer) to restore parity.

For all of you who don't like reading German, 3DCenter made some easy-to-understand pics: (http://www.3dcenter.org/artikel/amds-radeon-hd-580059006800-mit-rueckschritten-bei-der-filterqualitaet)

http://www.3dcenter.org/dateien/abbildungen/2010-11Nov-02b.png

If you don't like percentages (they are subjective, of course), look at this:

http://www.3dcenter.org/dateien/abbildungen/2010-11Nov-02c.png


For me personally, I don't notice the differences. I own a 3870 and I'm absolutely satisfied. But there are lots of people out there who won't buy a high-end graphics card with "BAD" IQ, and worse, some of them keep spreading FUD: "AMD sucks, ************ty IQ", "AMD is cheating" and so on and on...

I'm tired of this, so AMD needs to do something. Not for my eyes, but for my peace of mind :eek:

What's your take on this?

CRoland
11-04-2010, 05:44 PM
Those are pathological cases and there's AI off for that.

eRacer
11-04-2010, 06:26 PM
TweakPC (http://www.tweakpc.de/hardware/tests/grafikkarten/amd_radeon_hd_6870_hd_6850/s09.php) also has comparison videos showing superior AF quality on the GTX 460 (and 5870) compared to the 6870 in their particular sample. It appears AMD is sacrificing image quality to gain frame rates. Some would call that an optimization, others would call it a cheat.

Queamin
11-05-2010, 06:21 AM
If this is true, AMD should be ashamed. They earned a lot of goodwill over the last few years because of what Nvidia did; the last thing we want is both camps moving the goalposts again like a few years ago. That was a mess.

Moving the goalposts like this is wrong. If I want more performance with less IQ, let me move the slider to what I want, not AMD. It just feels like a sly move on their part, and I do feel it is cheating; with all the performance these cards have nowadays, you want good IQ.

The last thing I want is for either camp to have worse IQ than the generation before it, and it looks to me as if the 6000 series is worse than the 5000 series, even if it is an easy fix in drivers.

AMD doesn't want to get into the position Nvidia was in a few years ago, where even if they make a mistake almost everybody believes they did it on purpose.

Rottis
11-05-2010, 06:26 AM
Those videos look exactly like what Catalyst A.I. is supposed to do, i.e. not anti-alias edges within the same plane.
That looks like non-antialiased edges on a plane subdivided into small triangles; the shimmering comes from triangle-edge pixels crawling with movement, the good old sub-pixel accuracy effect.

A.I. tries to optimize by not applying AA to things that normally don't show up, and edges inside solid geometry usually don't show up that way, only outline edges do. If that bothers someone, then it's time to turn A.I. off.

Personally I have stared at these crawling edges way too much, and besides a few pros, geeks and fanboys, nobody complains. Some developers even prefer having some 'noise' and intentionally oversample textures slightly.
Just check out pretty much any driving game that reviews say looks good: the road textures are shimmering/noisy, and it looks good and gives a feeling of speed.

Then again, this is mostly a matter of opinion. For most people the default settings are fine.

Nicol Bolas
11-05-2010, 06:34 AM
Here's an interesting question: why is the term "Image Quality" just used as a substitute for "Anisotropic Filtering Quality?" Shouldn't image quality mean other things as well as aniso?

Also, here's another question. If the banding is so bad, why is it only visible under very specific and quite rare circumstances and only on certain games? Aren't game-specific image issues something that should be fixed by, you know, the game developers?

DCO
11-05-2010, 06:54 AM
http://forum.beyond3d.com/showpost.php?p=1489965&postcount=118
If you bump LoD to +0.65 the shimmer disappears. On NV cards nudge LoD to -0.65 and it appears.
Caveman-jim is on this forum; he can probably explain his tests.
As we can see, there are more parameters that influence image quality, but there is too much focus on AF, probably driven by PR.

Queamin
11-05-2010, 06:58 AM
Browsing the forums, there seem to be more posts about IQ with ATI than with Nvidia, at least on the few forums I read regularly.

Here's an interesting question: why is the term "Image Quality" just used as a substitute for "Anisotropic Filtering Quality?" Shouldn't image quality mean other things as well as aniso?

I think some of the time they all get lumped together under IQ, as each can impact it.

distinctively
11-05-2010, 07:07 AM
The tests from TweakPC are an example of something I never see in games, which is what the cards are made for.

Take a look at how washed out the colours are on the nVidia cards again.

Rottis
11-05-2010, 07:12 AM
Here's an interesting question: why is the term "Image Quality" just used as a substitute for "Anisotropic Filtering Quality?" Shouldn't image quality mean other things as well as aniso?

I'm suggesting that's not a filtering quality bug/cheat, but more the result of triangle-edge AA.
At that view angle there are lots of small, thin triangles visible and almost no continuous surface for texture filtering issues to show up on.
Catalyst A.I. also tunes how AA is applied, and to my eye the above looks exactly like AA edges crawling in a steep-angle view of a highly subdivided flat plane.
There's no way to say for sure from just those video clips.

Aren't game-specific image issues something that should be fixed by, you know, the game developers?

Yes, and I'd say devs won't bother with certain issues. It's not like it's possible to go through every supported graphics card with every driver combination and fix everything to look good. The game needs to be released at some point, too.

distinctively
11-05-2010, 07:34 AM
These sites are desperately looking for anything to pick on to save face for their beloved green team these days. NVidia has nothing left to compete with, and now they're trying to claim better IQ even though it's worse. Sometimes I swear nVidia is only using 256 colours, considering how bad the washing out can get.

caveman-jim
11-05-2010, 10:55 AM
http://forum.beyond3d.com/showpost.php?p=1489965&postcount=118

Caveman-jim is on this forum; he can probably explain his tests.
As we can see, there are more parameters that influence image quality, but there is too much focus on AF, probably driven by PR.

Hello.

For me this breaks down to three things:

This problem is not new. It was seen and reported on for the HD 5000 series.
The problem is not visible in games for 99% of titles.
Why is this being highlighted now?


AMD present by default a sharper and more vibrant image to the user than NVIDIA does. A sharper image is more subject to noise. So in these AF Filter Tester cases, the images are most prone to demonstrate the issue. Which is what they are for. In game, you don't see it, because games don't use the same setup. You can find examples in games if you pause, find an area, and focus on it instead of gameplay. Is that real world? Where are the tens of millions of HD 5000 & 6000 series customers appalled by this?
The question becomes: are AMD out of specification? And the answer is no, they are within D3D API guidelines for AF quality (FilterTester is a DirectX application). So are NVIDIA. NVIDIA err towards a smoother, less vibrant image. It reduces noise, which is seen and judged as aliasing. It's all part of both AMD and NVIDIA's image quality presentation.

Looking at Filter Tester, comparing the 'Perfect' ALU representation to the AMD AF implementation ('TMU'), the shimmer described is present (to a lesser degree) in the ALU rendering. It's in the same spots, but about half as prominent - the banding of texture aliasing is thinner. NVIDIA's doesn't have any, until you nudge the LoD bias to about -0.65, where it starts to appear. Push a LoD bias of +0.65 and the AMD aliasing disappears, as it does for the ALU 'perfect' reference. Additionally, does moving the texture filtering slider on the NVIDIA side from Performance to Quality to High Quality make a difference? What if you rename the FilterTester application?
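
To make the LoD point concrete, here's a rough sketch of the footprint/LOD math a D3D-style sampler is allowed to use and where the bias slots in. This is just my own C++ illustration; the names and the exact formulation are assumptions, not AMD's or NVIDIA's actual hardware path.

#include <algorithm>
#include <cmath>

struct Footprint {
    // Texture-space derivatives of (u, v) across one pixel, already scaled
    // by the texture dimensions.
    float dudx, dvdx;
    float dudy, dvdy;
};

// Rough sketch: derive an anisotropy ratio and a mip level, then apply the
// application/driver LOD bias at the end.
float ComputeBiasedLod(const Footprint& f, float lodBias, float maxAniso)
{
    float lenX  = std::sqrt(f.dudx * f.dudx + f.dvdx * f.dvdx);
    float lenY  = std::sqrt(f.dudy * f.dudy + f.dvdy * f.dvdy);
    float major = std::max(lenX, lenY);                      // long axis of the footprint
    float minor = std::max(std::min(lenX, lenY), 1e-6f);     // short axis (avoid divide by zero)

    float ratio = std::min(major / minor, maxAniso);         // how many AF taps are "allowed"
    float lod   = std::log2(std::max(major / ratio, 1e-6f)); // mip level along the short axis

    // A positive bias (+0.65 in the Beyond3D test) pushes sampling into blurrier
    // mips and the AMD shimmer vanishes; a negative bias (-0.65) pulls in sharper
    // mips and NVIDIA starts to shimmer the same way.
    return lod + lodBias;
}

The point is simply that a fraction of a mip level either side of neutral is enough to move the output from "shimmers" to "doesn't", which is why the default LOD/underfiltering choice matters more than the Q/HQ label.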

The last issue is the one most people want to talk about. Typically, you want sites to bench like-to-like settings. With both the Surface Format Optimizations issue and the AF issue, advice is being given to disable Catalyst AI. Now, seeing as this doesn't affect the results of Filter Tester, it's an interesting decision to make. If it doesn't remove the problem, why turn it off?

Another interesting decision is to compare AMD High Quality (catalyst AI texture optimizations disabled, but performance & quality profiles remain enabled - default setting for HD 5000, not for HD 6000) to NVIDIA Quality (their default setting, texture optimizations used). NVIDIA get a performance boost compared to their HQ setting, maybe 5% best case. But now we're relying on the reviewer stating 'AMD's HQ mode is visually equivalent to NVIDIA's Q mode, for textures' for performance testing. Is this judgement accurate?

TakeANumber
11-05-2010, 11:23 AM
...
AMD present by default a sharper and more vibrant image to the user than NVIDIA does. A sharper image is more subject to noise.
...

So nVidia solves this "problem" by fuzzing it out?

caveman-jim
11-05-2010, 11:30 AM
Depends on how anal you want to get about it. If you're gonna stare at the image for 10 minutes on both pieces of hardware, you might come to that conclusion; in reality, in games and apps, I don't think you could say it's perceptibly fuzzier or blurrier than AMD's. The color defaults do more to promote the differences in perceived image quality than the AF quality does (and how many games don't have in-game AF settings, and how many consumers don't bother to override or enhance them with control panel settings?).

Lorien
11-05-2010, 11:47 AM
Nvidia wants sites to use their stop sign pattern filtering:
GF104
http://www.techreport.com/r.x/radeon-hd-6800/gf104-af3.png

Against AMD's filtering:
Barts
http://www.techreport.com/r.x/radeon-hd-6800/barts-af3.png

Gee, I wonder why. :rolleyes:

AtwaterFS
11-05-2010, 11:51 AM
fuzzy wuzzy was a bear
fuzzy wuzzy had no hair
fuzzy wuzzy wasn't very fuzzy, was he?

TakeANumber
11-05-2010, 12:34 PM
Nvidia wants sites to use their stop sign pattern filtering:
GF104
[snip]

Against AMD's filtering:
Barts
[snip]

Gee, I wonder why. :rolleyes:

The center object is actually round instead of octagonal in the Barts example. If you didn't call it "stop sign filtering" (or is that NV's name, because of that?), I'd say that Barts did the better job.

Besides, this is such an artificial test. Maybe a TWIMTBP game that resembles the old Tempest arcade game could be made with these seizure-inducing patterns, if they thought this was such a strength.

alvter
11-05-2010, 12:53 PM
The center object is actually round instead of octagonal in the Barts example. If you didn't call it "stop sign filtering" (or is that NV's name, because of that?), I'd say that Barts did the better job.

Besides, this is such an artificial test. Maybe a TWIMTBP game that resembles the old Tempest arcade game could be made with these seizure-inducing patterns, if they thought this was such a strength.

You read my mind. It looks like Barts rendered more detail. Also, why would you want to make a light glow in a "stop sign" shape? Is that because you can't make a circle? Isn't that one of the problems in older games: lights reflecting in an octagonal shape?

The checker pattern is something you would like in a Leisure Suit Larry game. Maybe that's who NVDA was trying to sell their product to.

The glow, ambiance, transparencies, detail and transitions all seem better on the Barts... but that's my opinion. It also depends on what the artist's vision was. But then again, the artist could be wrong*...

*If doing "A" gives you the result you want, it doesn't necessarily mean that doing it as "A" is the right way of doing it... For example, my bugged original Pentium added 2 and 2 together and gave me 5; to get 5, I shouldn't be using 2 + 2!

Lorien
11-05-2010, 02:38 PM
That's what I meant. Nvidia is raising a stink through proxy websites to try and get away with using their arguably inferior filtering.

Zealot
11-05-2010, 02:56 PM
Hello.

For me this breaks down to three things:

[snip]


Kudos for this explanation Jim, much appreciated.

distinctively
11-05-2010, 05:20 PM
That's what I meant. Nvidia is raising a stink through proxy websites to try and get away with using their arguably inferior filtering.

Yup, and I think we need to spread the word. :cool:

GeorgiD
07-01-2011, 01:26 PM
Hehe, I found this thread quite interesting.

I think part of it might be that ATI's and Nvidia's quality reached such a point that it was generally a non-issue. Some people just don't think much about it any more and fail to put Intel through the same grilling that the other two used to face.

Or maybe a few palms are being greased.

I'm responding to that post here because in the other thread it would be kind of off-topic.

http://semiaccurate.com/forums/showthread.php?t=5035&highlight=image+quality&page=3

So, here I have two relatively low-resolution screenshots from the same place in CS 1.6. The map is Dust_2, and all settings (both in the nVidia Control Panel and ATi CCC) are set to maximum.

ATi Control Centre

Smoothvision HD: Anti-aliasing = Use application settings
Smoothvision HD: Anisotropic filtering = Use application settings
Catalyst AI = Standard
Mipmap Detail Level = High Quality
Wait for Vertical refresh = Off, unless application specifies
Adaptive Anti-aliasing = Enable - Quality
Support DXT texture formats = Enabled
Alternate pixel center = Disabled
Triple buffering = Disabled
Force 24-bit Z-buffer depth = Disabled


nVidia Control Panel
Adjust image settings with preview

Anisotropic filtering = Application-controlled
Antialiasing - Gamma correction = On
Antialiasing - Mode = Application-controlled
Antialiasing - Setting = Application-controlled
Antialiasing - Transparency = Supersampling
Conformant texture clamp = Use hardware
Extension limit = Off
Force mipmaps = None
Maximum pre-rendered frames = 3
Multi-display/mixed-GPU acceleration = Multiple display performance mode
Texture filtering - Anisotropic sample optimization = On
Texture filtering - Negative LOD bias = Allow
Texture filtering - Quality = Quality
Texture filtering - Trilinear optimization = On
Triple buffering = Off
Vertical sync = Use the 3D application setting

http://img18.imageshack.us/img18/1223/dust2tunnelati4670.th.jpg (http://imageshack.us/photo/my-images/18/dust2tunnelati4670.jpg/) & http://img220.imageshack.us/img220/4263/dust2tunnel8500gt.th.jpg (http://imageshack.us/photo/my-images/220/dust2tunnel8500gt.jpg/)


As you can see, there are a lot of small bright dots in the Radeon 4670 screenshot, and there are fewer of those dots in the GF 8500GT screenshot. These bright dots are most likely sand particles in the air; they are an effect in that particular game and map.

If you like, you can do some new tests, because these are quite old. I can't redo them now at 1920x1080, simply because I don't own an nvidia card any more.

Here are two screenshots from NFS Carbon at identical settings for both cards (max, 4x Anti-aliasing).

http://imageshack.us/f/220/nfscarbon2zz0.jpg/ & http://imageshack.us/f/220/nfscarbon1nvidia1na8.jpg/

This is a very interesting topic, and I also remember that in a game like Far Cry a few years ago, the image quality on ATi cards was way superior.

WILLLESS
07-01-2011, 01:58 PM
It was some time ago that I noticed ATi graphics cards have better colors. And in the Counter-Strike screenshots the ATi card again has the lead in terms of color quality. Colors on NVIDIA's solution seem somewhat bleached. I consider this a much greater flaw than ATi's AF.

seinfeldx
07-01-2011, 02:07 PM
You are coming back to this thread, which is more than a year old, because Intel doesn't get tested on image quality, when this thread is AMD vs. NVIDIA?

265586888
07-01-2011, 11:01 PM
So you guys want an Image Quality showdown thread featuring Intel, AMD and NVIDIA? :rolleyes:
Be very careful not to turn that into yet another flame war...

GeorgiD
07-07-2011, 05:41 AM
The thread is about AMD and nvidia; no one has mentioned Integrated Electronics yet. :D

Well, if I understand this correctly (the quote is in Russian; a rough translation follows):

"...I don't care about speed; IMHO the graphics on the red cards simply look much nicer. For example, at home I'm running a Radeon 5670 while a friend has a Fermi 450. The performance difference in games goes to the green side, but my friend himself was shocked at how beautiful games look on the Radeon (we mostly compared Metro)."

http://www.overclockers.ru/hardnews/42611/Esche_raz_o_srokah_anonsa_28_nm_graficheskih_reshenij_NVIDIA.html

In the comments below the article. :D

So, actually, that guy was shocked at how beautiful the graphics in Metro looked in comparison with the nvidia rig. :D

Anyone fluent in Russian is very welcome to join the discussion.

GeorgiD
08-14-2011, 08:18 AM
I think this awesome explanation should stay here, otherwise the post would be lost and forgotten in that thread.

Well, the Intel vs. [everyone still standing] case is a bit different than the Matrox vs [well, everyone]. The "image quality" case with Intel is all about the postprocessing. The quality of the output is the same. Differences are all in rendering in games and the like; if you put a big JPEG up on both, they'll look the same (+/- epsilon).


What made Matrox so awesome was the quality of its DACs and the seriously awesome quality of the filtering on the VGA out. Not only did it yield a very clean signal without extraneous energy outside of the data (rather important to those of us who occasionally want to use some part of the RF spectrum for something else around computers), but it did it with way less degradation of the signal inside the passband than anybody else. And the group delay... shoot, what group delay?? The result was incredibly crisp, clean output.

Not without its drawbacks. I had friends with other cards who happily used cheap KVM's. I couldn't; even with a pretty middling monitor the loss in quality from going through a cheap mechanical KVM was horrifying to see. Couldn't tell a difference with their output though :D

But none of that matters now. Aside from luddites and nutcases, everybody is using DVI/HDMI/DP, so the exacting analog design is completely irrelevant. It's just a bit hose, and the bits come out the same as they go in.

http://semiaccurate.com/forums/showthread.php?t=5146&page=83

Yes, nvidia and ATi were not mentioned by you, but I think the post is a good basis for explaining the differences in the image quality those companies offer.
By the way, my 22-inch monitor is still running over analogue D-Sub. Of course, I can't compare that cheap TFT display with the oldie-but-goodie CRTs. :D

Voxel
08-14-2011, 10:50 AM
It was some time ago that I noticed ATi graphics cards have better colors. And in the Counter-Strike screenshots the ATi card again has the lead in terms of color quality. Colors on NVIDIA's solution seem somewhat bleached. I consider this a much greater flaw than ATi's AF.

I consider the biggest flaw in the nVidia screenshots to be the horrible sub-sampling of games' lightmaps. This practice lowers the contrast in the image and results in that slightly "washed out" look in most nVidia screenshots.

I am pretty sure this was one of the first driver "optimizations" ever used and is still ruining image quality. They did this because a relatively high resolution lightmap is accessed for nearly every pixel on the screen. The driver downsizes the lightmap to save bandwidth and also uses less intensive filtering modes to save resources that are "better used" to make the textures look smoother.
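
I obviously can't show driver code, so treat this as a purely hypothetical sketch of the kind of shortcut I mean: halving an 8-bit, single-channel lightmap with a 2x2 box filter before upload. The function name and layout are made up for illustration, not taken from any real driver.

#include <cstdint>
#include <vector>

// Hypothetical illustration only: average each 2x2 block of an 8-bit,
// single-channel lightmap into one texel, quartering its memory and bandwidth.
std::vector<uint8_t> HalveLightmap(const std::vector<uint8_t>& src, int w, int h)
{
    std::vector<uint8_t> dst((w / 2) * (h / 2));
    for (int y = 0; y < h / 2; ++y) {
        for (int x = 0; x < w / 2; ++x) {
            int sum = src[(2 * y)     * w + 2 * x]
                    + src[(2 * y)     * w + 2 * x + 1]
                    + src[(2 * y + 1) * w + 2 * x]
                    + src[(2 * y + 1) * w + 2 * x + 1];
            // Averaging neighbouring luxels pulls bright and dark values toward
            // each other, which is exactly the loss of contrast ("washed out"
            // look) described above.
            dst[y * (w / 2) + x] = static_cast<uint8_t>(sum / 4);
        }
    }
    return dst;
}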

Jon