[QUOTE="Indestructible2"]Been wondering how a 8600GT compares to my HD 2600XT image-quality wise,of course i don't think i'll ever find out :(ruuuj
LOL, sorry, totally off-topic, but I'm totally creasing at your sig + avatar (used to be my MSN pic!) :D
:lol: Yeah, I know, they're bad-ass 8)
I own and use both, and I can tell you that for the most part it is all in people's heads.
ch5richards
It's fanboys that say it!
Also, I'm talking about high end cards - ATI 3870 and 8800GT/GTS. Hmmm, just found this, check it out -
http://www.vr-zone.com/articles/AMD_Radeon_HD3850_and_HD3870_-_AMD%27s_salvation%3F/5392-14.html
wOOt so it's actually true that ATI has better image quality? I thought that this was a thing of the past O_o
Fignewton50
I can't see ANY difference in those examples.
[QUOTE="ruuuj"]Also, I'm talking about high end cards - ATI 3870 and 8800GT/GTS. Hmmm, just found this, check it out -
http://www.vr-zone.com/articles/AMD_Radeon_HD3850_and_HD3870_-_AMD%27s_salvation%3F/5392-14.html
wOOt so it's actually true that ATI has better image quality? I thought that this was a thing of the past O_o
Fignewton50
I can't see ANY difference in those examples.
There is a definite difference in the first example given; the Gears of War one with the glass pane. If you look closely, the ATI image clearly has better contrast between light and dark compared to nVidia. Even in the Crysis example (third one down), there is better contrast with ATI as well; the mountain appears significantly darker and more vibrant than on nVidia. I know these differences aren't groundbreaking - far from it - and I'm really nit-picking here, but it's actually really bugging me. This question keeps coming back to my mind - why?
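(For what it's worth, eyeballing contrast like this is easy to get wrong - monitor and driver colour settings can swamp any real rendering difference. If you want to actually check, the usual trick is to capture the exact same frame on both cards and diff the pixels. Here's a minimal C++ sketch of that idea; it assumes you've already decoded both screenshots into raw RGB byte buffers of identical size, and the names CompareRgb / DiffStats are just made up for the example.)
[code]
// Compare two already-decoded RGB screenshots pixel by pixel.
// How the images get loaded into the buffers is up to you.
#include <cstdint>
#include <cstdlib>
#include <vector>

struct DiffStats
{
    std::size_t differingPixels = 0;  // pixels where any channel differs
    int         maxChannelDelta = 0;  // largest per-channel difference seen
};

DiffStats CompareRgb(const std::vector<std::uint8_t>& a,
                     const std::vector<std::uint8_t>& b)
{
    DiffStats stats;
    if (a.size() != b.size() || a.size() % 3 != 0)
        return stats;  // images must match in size; bail out otherwise

    for (std::size_t i = 0; i < a.size(); i += 3)
    {
        bool differs = false;
        for (std::size_t c = 0; c < 3; ++c)
        {
            int delta = std::abs(int(a[i + c]) - int(b[i + c]));
            if (delta > 0) differs = true;
            if (delta > stats.maxChannelDelta) stats.maxChannelDelta = delta;
        }
        if (differs) ++stats.differingPixels;
    }
    return stats;
}
[/code]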
ATi does better AA, with better modes, while Nvidia has the better AF algorithm, though you can hardly see the difference in game.
There is a difference; just very few people take the time to actually show it, because you sure are not going to see it without having two systems side by side or without taking the exact same screenshot.
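(As a rough illustration of where those AA and AF modes come from: the game just asks the API for multisampling and anisotropic filtering, and the quality of what you actually get is down to the GPU and driver - which is where the ATi-vs-NVIDIA differences come in. The Direct3D 9 sketch below shows the typical calls; EnableAAandAF and the hard-coded 4x/16x values are only illustrative, and device creation and error handling are omitted.)
[code]
#include <windows.h>
#include <d3d9.h>

// Assumes d3d and device were created elsewhere (Direct3DCreate9 / CreateDevice).
void EnableAAandAF(IDirect3D9* d3d, IDirect3DDevice9* device,
                   D3DPRESENT_PARAMETERS& pp)
{
    // Antialiasing: check that 4x multisampling is supported for the
    // back-buffer format before requesting it in the present parameters.
    DWORD qualityLevels = 0;
    if (SUCCEEDED(d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, pp.BackBufferFormat,
            pp.Windowed, D3DMULTISAMPLE_4_SAMPLES, &qualityLevels)))
    {
        pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
        pp.MultiSampleQuality = 0;  // lowest quality level of 4x
        // (the device would need to be created or reset with these parameters)
    }
    device->SetRenderState(D3DRS_MULTISAMPLEANTIALIAS, TRUE);

    // Anisotropic filtering on sampler 0: anisotropic minification,
    // with the degree clamped to whatever the hardware reports.
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);
    device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY,
                            caps.MaxAnisotropy < 16u ? caps.MaxAnisotropy : 16u);
}
[/code]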
ATI MIGHT have better image quality, but you're not really going to notice it in game. Get a video card based on your price range and the type of games you want to play, because that's the only difference you're going to notice.
Lilgunney612
Yeah this is very true. Man I swear I get annoyed with myself sometimes. The more I think about buying something (I'm VERY cautious with my money, always have been), the more I ponder my decision :@
[QUOTE="LordEC911"]ATi does better AA, with better modes, while Nvidia has the better AF algorithm, though you can hardly see the difference in game.
So you're saying that ATi essentially does have better image quality? Even if it's ever so slightly? Not trying to flame bait or anything here people, I just want to separate fact from opinion concerning this matter.
[QUOTE="Lilgunney612"]ATI MIGHT have better image quality, but your not really going to notice it in game. get a video card based on your price range and the type of games you want to play, cause thats the only difference your going to notice.ruuuj
Yeah this is very true. Man I swear I get annoyed with myself sometimes. The more I think about buying something (I'm VERY cautious with my money, always have been), the more I ponder my decision :@
I'd say that's a pretty good quality to have. A fool and his money are soon parted, so it's good you carefully consider your purchases.
[QUOTE="Fignewton50"][QUOTE="ruuuj"]Also, I'm talking about high end cards - ATI 3870 and 8800GT/GTS. Hmmm, just found this, check it out -
http://www.vr-zone.com/articles/AMD_Radeon_HD3850_and_HD3870_-_AMD%27s_salvation%3F/5392-14.html
wOOt so it's actually true that ATI has better image quality? I thought that this was a thing of the past O_o
ruuuj
I can't see ANY difference in those examples.
There is a definite difference between the first example given; the Gears of War one with the glass pane. If you look closely, the ATI image clearly has a better contrast between light and dark when compared to nVidia. Even with the Crysis example (third one down), there is a better contrast there with ATI aswell; the mountain appears significantly darker and more vibrant than nVidia. I know these differences aren't groundbreaking - far from it - and I'm really nit-picking here, but it's actually really bugging me. This question keeps coming back to my mind - Why?
Yep, I noticed that too; seems like ATI DOES give better image quality after all, though the difference isn't noticeable to a lot of people.
It used to be the case back then. Back when 3D cards were still pretty primitive, ATi did have better 2D IQ than NVIDIA, for example (Matrox still reigns king on that right now). Ironically, now, thanks to the DirectX 10 API specification and the requirements it puts in place (and the ditching of the whole "cap bits" thing that both vendors whored out between generations), that's not really true anymore.
codezer0
I didn't understand one word of that!
daytona_178
Regarding the early days of computing, there used to be separate graphics cards for 2D and 3D. It took a while before we started seeing graphics cards that could process 2D and 3D on a single piece of silicon. Back in that era of separate 2D and 3D:
2D picture quality: Matrox > ATi > 3dfx > NVIDIA
3D picture quality: ATi > 3dfx > NVIDIA > Matrox
Regarding cap bits: think of cap bits in 3D graphics like the "carbon credits" manufacturing companies buy and use to stay within pollution limits. In GPUs, companies like ATI and NVIDIA use "cap bits" to let the API know if they are incapable of certain functions in the DirectX spec. This was useful for cards that were only partially compliant with a version of DirectX (such as the GeForce4 MX and GeForce FX 5*** series, or the Radeon 92** and X800 lines). With DirectX 10, Microsoft got rid of "cap bits" altogether, meaning a card is either 100% DirectX 10 capable or it isn't.
codezer0
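(If the "cap bits" part is still abstract: under Direct3D 9 a game literally reads a structure full of capability flags from the driver and decides what it can use. Here's a rough C++ sketch of that query; PrintSomeCaps is just an example name, error handling is trimmed, and it assumes an IDirect3D9 object already exists. Under Direct3D 10 there's no equivalent per-feature querying for the core feature set - a D3D10 part has to support all of it.)
[code]
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

// Assumes the caller has already created an IDirect3D9 object,
// e.g. with Direct3DCreate9(D3D_SDK_VERSION).
void PrintSomeCaps(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        std::printf("GetDeviceCaps failed\n");
        return;
    }

    // Each of these is a capability bit or value the driver chose to expose.
    std::printf("Anisotropic min filter: %s\n",
        (caps.TextureFilterCaps & D3DPTFILTERCAPS_MINFANISOTROPIC) ? "yes" : "no");
    std::printf("Max anisotropy: %lu\n", (unsigned long)caps.MaxAnisotropy);

    // Pixel shader version is packed as 0xFFFF0000 | (major << 8) | minor.
    std::printf("Pixel shader version: %u.%u\n",
        (unsigned)((caps.PixelShaderVersion >> 8) & 0xFF),
        (unsigned)(caps.PixelShaderVersion & 0xFF));
}
[/code]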
Back in nVidia's leaf blower days (the GeForce FX era) there were still image quality issues, to the point that some things just didn't get rendered on screen.
I'm not sure if this is just me or not, but in all of those pics the ATI 3870 looks better than the nVidia one to me. It just looks like the ATI one is brighter and clearer than the other.
Consider this: I have never had an ATI card in my PC before; it has always been nVidia. So it isn't like I'm a fanboy of that sort.