Do ATI cards "really" have better image quality than nVidia?

This topic is locked from further discussion.

#1 ruuuj
Member since 2007 • 210 Posts
Quite a few people have said this. Is it actually true?
#2 Indestructible2
Member since 2007 • 5935 Posts
Been wondering how an 8600GT compares to my HD 2600XT image-quality-wise. Of course, I don't think I'll ever find out :(
#3 ruuuj
Member since 2007 • 210 Posts

Been wondering how an 8600GT compares to my HD 2600XT image-quality-wise. Of course, I don't think I'll ever find out :(Indestructible2

LOL, sorry, totally off-topic, but I'm creasing up at your sig + avatar (it used to be my MSN pic!) :D

#4 Indestructible2
Member since 2007 • 5935 Posts

[QUOTE="Indestructible2"]Been wondering how a 8600GT compares to my HD 2600XT image-quality wise,of course i don't think i'll ever find out :(ruuuj

LOL, sorry, totally off-topic, but I'm creasing up at your sig + avatar (it used to be my MSN pic!) :D

:lol: Yeah, I know, they're bad-ass 8)
#5 ch5richards
Member since 2005 • 2912 Posts
I own and use both, and I can tell you that for the most part it's all in people's heads.
#6 xwengstax
Member since 2004 • 8491 Posts

hocus pocus

#7 Daytona_178
Member since 2005 • 14962 Posts

I own and use both, and I can tell you that for the most part it's all in people's heads.ch5richards

It's the fanboys that say it!

#9 dayaccus007
Member since 2007 • 4349 Posts
You will not see any difference between ATI and Nvidia. Obviously you have to compare cards from the same range; I mean, you can't compare an 8500GT with an HD3870, or a 2600XT with an 8800GTS.
#10 ruuuj
Member since 2007 • 210 Posts

Also, I'm talking about high-end cards - the ATI 3870 and the 8800GT/GTS. Hmmm, just found this, check it out -

http://www.vr-zone.com/articles/AMD_Radeon_HD3850_and_HD3870_-_AMD%27s_salvation%3F/5392-14.html

wOOt so it's actually true that ATI has better image quality? I thought that this was a thing of the past O_o

#11 Fignewton50
Member since 2003 • 3748 Posts

Also, I'm talking about high-end cards - the ATI 3870 and the 8800GT/GTS. Hmmm, just found this, check it out -

http://www.vr-zone.com/articles/AMD_Radeon_HD3850_and_HD3870_-_AMD%27s_salvation%3F/5392-14.html

wOOt so it's actually true that ATI has better image quality? I thought that this was a thing of the past O_o

ruuuj

I can't see ANY difference in those examples.

#12 Whiteknight19
Member since 2003 • 1303 Posts
NVidia and ATI cards were both quality cards with great pictures. Back then ATI cards were faster than Nvidia's; now it's the other way round.
#13 deactivated-5f033ecf40fed
Member since 2004 • 2665 Posts
I think it's impossible to say. I find it unlikely that ATI's cards have any sort of magical 'image-quality' thing going on that would make them better than Nvidia. Without a concrete definition of what 'image-quality' is, I don't know how you could even compare them.
#14 ruuuj
Member since 2007 • 210 Posts
[QUOTE="ruuuj"]

Also, I'm talking about high-end cards - the ATI 3870 and the 8800GT/GTS. Hmmm, just found this, check it out -

http://www.vr-zone.com/articles/AMD_Radeon_HD3850_and_HD3870_-_AMD%27s_salvation%3F/5392-14.html

wOOt so it's actually true that ATI has better image quality? I thought that this was a thing of the past O_o

Fignewton50

I can't see ANY difference in those examples.

There is a definite difference in the first example given, the Gears of War one with the glass pane. If you look closely, the ATI image clearly has better contrast between light and dark compared to nVidia. Even in the Crysis example (third one down), there is better contrast with ATI as well; the mountain appears significantly darker and more vibrant than on nVidia. I know these differences aren't groundbreaking - far from it - and I'm really nit-picking here, but it's actually really bugging me. This question keeps coming back to my mind - why?

#15 Lilgunney612
Member since 2005 • 1878 Posts
ATI MIGHT have better image quality, but you're not really going to notice it in-game. Get a video card based on your price range and the type of games you want to play, 'cause that's the only difference you're going to notice.
#16 ch5richards
Member since 2005 • 2912 Posts
ATI might or might not have better image quality than Nvidia, but I know one thing for sure, chocolate is better than vanilla.
#17 LordEC911
Member since 2004 • 9972 Posts

ATi does better AA, with better modes, while Nvidia has the better AF algorithm, though you can hardly see the difference in-game.

There is a difference; it's just that very few people take the time to actually show it, because you sure are not going to see it without having two systems side by side or without taking the exact same screenshot.
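
If anyone wants to test this properly instead of eyeballing it, the closest thing to an objective check is a per-pixel diff of two identically framed screenshots, one from each card. Here's a rough sketch of the idea - hypothetical code, not from any review site's toolchain, assuming both shots have been converted to same-sized binary PPM (P6) files (header comment lines aren't handled):

// diff.cpp - hypothetical per-pixel comparison of two same-sized P6 PPM screenshots.
#include <cstdlib>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

// Reads a binary P6 PPM into a flat RGB byte array; width/height are out-params.
static std::vector<unsigned char> readPPM(const std::string& path, int& w, int& h) {
    std::ifstream in(path, std::ios::binary);
    std::string magic;
    int maxval = 0;
    in >> magic >> w >> h >> maxval;  // "P6", width, height, 255
    in.get();                         // skip the single whitespace before pixel data
    std::vector<unsigned char> px(static_cast<size_t>(w) * h * 3);
    in.read(reinterpret_cast<char*>(px.data()), static_cast<std::streamsize>(px.size()));
    return px;
}

int main(int argc, char** argv) {
    if (argc != 3) {
        std::cerr << "usage: diff shotA.ppm shotB.ppm\n";
        return 1;
    }
    int wa, ha, wb, hb;
    std::vector<unsigned char> a = readPPM(argv[1], wa, ha);
    std::vector<unsigned char> b = readPPM(argv[2], wb, hb);
    if (wa != wb || ha != hb) {
        std::cerr << "screenshots are not the same size\n";
        return 1;
    }

    long long changed = 0, delta = 0;
    for (size_t i = 0; i < a.size(); i += 3) {
        // Sum of absolute R/G/B differences for this pixel.
        int d = std::abs(a[i] - b[i]) + std::abs(a[i+1] - b[i+1]) + std::abs(a[i+2] - b[i+2]);
        if (d > 0) ++changed;
        delta += d;
    }
    double pixels = static_cast<double>(wa) * ha;
    std::cout << "pixels that differ: " << changed
              << " (" << 100.0 * changed / pixels << "%)\n"
              << "mean per-channel delta: " << delta / (pixels * 3.0) << " (0-255 scale)\n";
    return 0;
}

Point being: a mean delta near zero tells you the "better image quality" is in your head; a real AA/AF difference shows up as a measurable percentage of differing pixels along edges.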

#18 ruuuj
Member since 2007 • 210 Posts

ATI MIGHT have better image quality, but you're not really going to notice it in-game. Get a video card based on your price range and the type of games you want to play, 'cause that's the only difference you're going to notice.Lilgunney612

Yeah, this is very true. Man, I swear I get annoyed with myself sometimes. The more I think about buying something (I'm VERY cautious with my money, always have been), the more I ponder my decision :@

#19 ruuuj
Member since 2007 • 210 Posts

ATi does better AA, with better modes, while Nvidia has the better AF algorithm, though you can hardly see the difference in-game.

There is a difference; it's just that very few people take the time to actually show it, because you sure are not going to see it without having two systems side by side or without taking the exact same screenshot.

LordEC911

So you're saying that ATi essentially does have better image quality, even if it's ever so slight? Not trying to flame-bait or anything here, people; I just want to separate fact from opinion on this matter.

#20 deactivated-5f033ecf40fed
Member since 2004 • 2665 Posts

[QUOTE="Lilgunney612"]ATI MIGHT have better image quality, but your not really going to notice it in game. get a video card based on your price range and the type of games you want to play, cause thats the only difference your going to notice.ruuuj

Yeah, this is very true. Man, I swear I get annoyed with myself sometimes. The more I think about buying something (I'm VERY cautious with my money, always have been), the more I ponder my decision :@

I'd say that's a pretty good quality to have. A fool and his money are soon parted, so it's good you carefully consider your purchases.

#21 Indestructible2
Member since 2007 • 5935 Posts
[QUOTE="Fignewton50"][QUOTE="ruuuj"]

Also, I'm talking about high-end cards - the ATI 3870 and the 8800GT/GTS. Hmmm, just found this, check it out -

http://www.vr-zone.com/articles/AMD_Radeon_HD3850_and_HD3870_-_AMD%27s_salvation%3F/5392-14.html

wOOt so it's actually true that ATI has better image quality? I thought that this was a thing of the past O_o

ruuuj

I can't see ANY difference in those examples.

There is a definite difference in the first example given, the Gears of War one with the glass pane. If you look closely, the ATI image clearly has better contrast between light and dark compared to nVidia. Even in the Crysis example (third one down), there is better contrast with ATI as well; the mountain appears significantly darker and more vibrant than on nVidia. I know these differences aren't groundbreaking - far from it - and I'm really nit-picking here, but it's actually really bugging me. This question keeps coming back to my mind - why?

Yep, I noticed that too. Seems like ATI DOES give better image quality after all, though the difference isn't noticeable to a lot of people.
#22 codezer0
Member since 2004 • 15898 Posts
It used to be the case. Back when 3D cards were still pretty primitive, ATi did have better 2D IQ than NVIDIA, for example (Matrox still reigns king on that right now). Ironically, thanks to the DirectX 10 API specification and the requirements it puts in place (and the ditching of the "cap bits" that both vendors whored out between generations), that's not really true anymore.
#23 Daytona_178
Member since 2005 • 14962 Posts

It used to be the case. Back when 3D cards were still pretty primitive, ATi did have better 2D IQ than NVIDIA, for example (Matrox still reigns king on that right now). Ironically, thanks to the DirectX 10 API specification and the requirements it puts in place (and the ditching of the "cap bits" that both vendors whored out between generations), that's not really true anymore.codezer0

I didn't understand one word of that!

#24 codezer0
Member since 2004 • 15898 Posts

I didn't understand one word of that!

daytona_178
Regarding the early days of computing: there used to be separate graphics cards for 2D and 3D, and it took a while before we started seeing cards that could process both 2D and 3D on a single piece of silicon. Back in that era of separate 2D and 3D:

2D picture quality: Matrox > ATi > 3dfx > NVIDIA
3D picture quality: ATi > 3dfx > NVIDIA > Matrox

Regarding cap bits: think of cap bits in 3D graphics like the "carbon credits" that manufacturing companies buy and use to stay within pollution limits. In GPUs, companies like ATI and NVIDIA use "cap bits" to let the API know if they are incapable of certain functions in the DirectX spec. This was useful for cards that were only partially compliant with a version of DirectX (such as the GeForce4 MX and GeForce FX 5*** series, or the Radeon 92** and X800 lines). With DirectX 10, Microsoft got rid of the ability for there to be "cap bits" at all, meaning a card is either 100% DirectX 10 capable or not.
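
For the curious, here's roughly what cap bits looked like from the programmer's side - a minimal sketch, not real driver code, assuming the Direct3D 9 SDK headers and d3d9.lib are available to build against:

// caps.cpp - query a card's capability bits the DirectX 9 way.
#include <cstdio>
#include <d3d9.h>

int main() {
    // Create the Direct3D 9 interface object.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Ask the primary adapter's hardware (HAL) device what it can actually do.
    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        // Individual bits could differ per vendor and per card - exactly the
        // variability the DirectX 10 spec did away with.
        if (caps.TextureCaps & D3DPTEXTURECAPS_POW2)
            std::printf("Card only supports power-of-two texture sizes.\n");
        std::printf("Max pixel shader version: %u.%u\n",
                    (unsigned)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                    (unsigned)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
    }
    d3d->Release();
    return 0;
}

Under DirectX 10 that whole query-and-branch dance goes away: if the card reports as a D3D10 device, the full feature set is guaranteed.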
#25 Krall
Member since 2002 • 16463 Posts
[QUOTE="daytona_178"]

I didn't understand one word of that!

codezer0

Regarding the early days of computing: there used to be separate graphics cards for 2D and 3D, and it took a while before we started seeing cards that could process both 2D and 3D on a single piece of silicon. Back in that era of separate 2D and 3D:

2D picture quality: Matrox > ATi > 3dfx > NVIDIA
3D picture quality: ATi > 3dfx > NVIDIA > Matrox

Regarding cap bits: think of cap bits in 3D graphics like the "carbon credits" that manufacturing companies buy and use to stay within pollution limits. In GPUs, companies like ATI and NVIDIA use "cap bits" to let the API know if they are incapable of certain functions in the DirectX spec. This was useful for cards that were only partially compliant with a version of DirectX (such as the GeForce4 MX and GeForce FX 5*** series, or the Radeon 92** and X800 lines). With DirectX 10, Microsoft got rid of the ability for there to be "cap bits" at all, meaning a card is either 100% DirectX 10 capable or not.

Back in nVidia's leaf-blower days there were still image quality issues, to the point that some things just didn't get rendered on screen.

#26 Baselerd
Member since 2003 • 5104 Posts
The differences are small, but ATI seems to have better IQ. Of course, I don't think any of us would notice while playing a game.
#27 rik666
Member since 2005 • 488 Posts

I could see no difference at all, although I didn't look for long.

Seems minuscule if there is one.

#28 blackleather223
Member since 2004 • 1569 Posts

I'm not sure if it's just me, but it seems in all of those pics the ATI 3870 looks better than the nVidia one. It just looks like the ATI one is brighter and clearer than the other.

Consider this: I have never had an ATI card in my PC before; it has always been nVidia. So it isn't like I'm a fanboy or anything of that sort.