1600x1200 AA vs no AA

This topic is locked from further discussion.

#1 Manly-manly-man
Member since 2006 • 3477 Posts
Does anyone have any screenshots comparing games at 1600x1200 with no AA, and then with 4xAA?
#2 LouieV13
Member since 2005 • 7604 Posts
No, but at 1920x1200 I can't see past 4x in some games. 8x is normally what I set AA to, unless I can't tell the difference between 4x and 8x.
#3 Manly-manly-man
Member since 2006 • 3477 Posts
How much of a difference does 4xAA make at 1600x1200?
#4 LouieV13
Member since 2005 • 7604 Posts
How much of a difference does 4xAA make at 1600x1200?Manly-manly-man
I'll look tomorrow, but since my native resolution is 1920x1200 it might be hard... what games are we talking about, though? I have pretty much all of the newer ones.
#5 Macolele
Member since 2006 • 534 Posts

Still jaggy at 1600x1200, but less noticeable. I play on a 19" monitor at 1600x1200 and it looks fine.

#6 Manly-manly-man
Member since 2006 • 3477 Posts
Hm... still, if you could take some comparison shots that would be great.
#7 frizzyman0292
Member since 2007 • 2855 Posts
There is a big difference between 4x and 8x, but I see no difference when I bring it up to 16x.
#8 _eDdySON_
Member since 2006 • 534 Posts

Found it on this site.

http://www.gamespot.com/features/6168650/p-2.html

#9 Gog
Member since 2002 • 16376 Posts
Increasing the resolution doesn't eliminate aliasing. It simply makes the jaggies smaller. There is a big difference up to 4x, a small difference between 4x and 8x, and almost no noticeable difference beyond 8x.
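
As a rough illustration of the point above, here is a small Python sketch (not from the thread; the edge slope, scanline, and sample counts are arbitrary values chosen purely for demonstration). With one coverage sample per pixel, an ideal diagonal edge produces a hard 0/1 step (a jaggy); averaging a 2x2 grid of sub-samples per pixel, loosely like 4x AA, produces graded intensities instead.

```python
# Minimal sketch: one coverage sample per pixel gives a hard 0/1 step across an
# ideal diagonal edge; averaging a 2x2 sub-sample grid per pixel gives graded
# grey levels. Edge slope and sample layout are made-up illustrative values.

def edge_covered(x, y):
    """True if the point (x, y) lies below the ideal edge y = 0.37 * x."""
    return y < 0.37 * x

def shade_pixel(px, py, samples_per_axis):
    """Pixel intensity in [0, 1]: fraction of sub-samples covered by the edge."""
    n = samples_per_axis
    hits = 0
    for i in range(n):
        for j in range(n):
            # sub-sample positions spread evenly inside the pixel
            sx = px + (i + 0.5) / n
            sy = py + (j + 0.5) / n
            hits += edge_covered(sx, sy)
    return hits / (n * n)

def render_scanline(width, samples_per_axis, y=3):
    return [round(shade_pixel(x, y, samples_per_axis), 2) for x in range(width)]

if __name__ == "__main__":
    print("no AA:", render_scanline(16, 1))  # abrupt 0 -> 1 jump at a single pixel
    print("4x AA:", render_scanline(16, 2))  # 2x2 sub-samples: step spread over shades
```
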
#10 el_carl
Member since 2006 • 2376 Posts
I'll try to get one. It makes a difference mostly in the distance; in Battlefield 2, for example, people far away are clearer.
#11 GamingMonkeyPC
Member since 2005 • 3576 Posts

You should be able to find screenshots out there with Google...

http://www.pcper.com/article.php?aid=319&type=expert&pid=6

#12 Manly-manly-man
Member since 2006 • 3477 Posts

I know it won't eliminate the aliasing, but if it isn't bad enough to seriously degrade the visuals, that would help me decide between the 8800GTS and the HD2900XT. From those screenshots 4xAA makes a big difference, but only over the game with no AA or CSAA. The HD2900XT, and correct me if I am wrong, has its own type of CSAA (wide and narrow tent) that rivals MSAA without taking a performance hit (or only a small one).

Edit: I worded that badly. What I meant, in short, is that at 2xAA you can enable CFAA without a performance hit and make it look as good as 4xAA, with better framerates. That's good for me, because it means the HD2900XT is the better deal: at 1600x1200 with just 2xAA, it performs better than the 8800GTS.
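
To make the 2xAA-plus-tent-filter reasoning above concrete, here is a rough 1-D Python sketch of the tent-filter idea. It is not ATI's actual CFAA hardware filter; the sample layout, radius values, and colours are made up for illustration. A plain resolve averages only the samples inside each pixel, while a tent resolve also weights in neighbouring pixels' samples, so an edge sampled at 2x ends up blended over a wider area.

```python
# Rough sketch of the "tent filter" resolve idea (illustrative only, not ATI's
# real filter). Weight falls off linearly with distance from the pixel centre,
# so samples from neighbouring pixels contribute to the final colour.

def tent_weight(distance, radius):
    """Linear falloff: weight 1 at the centre, 0 at or beyond the filter radius."""
    return max(0.0, 1.0 - distance / radius)

def resolve_pixel(samples, pixel_index, radius):
    """samples: (position, colour) pairs on a 1-D scanline, e.g. two per pixel (2x AA).
    Returns the filtered colour for the pixel centred at pixel_index + 0.5."""
    centre = pixel_index + 0.5
    total_weight = 0.0
    total_colour = 0.0
    for position, colour in samples:
        w = tent_weight(abs(position - centre), radius)
        total_weight += w
        total_colour += w * colour
    return total_colour / total_weight if total_weight else 0.0

if __name__ == "__main__":
    # two samples per pixel (2x AA) across a hard black/white edge between pixels 2 and 3
    samples = [(p + o, 1.0 if p >= 3 else 0.0) for p in range(6) for o in (0.25, 0.75)]
    narrow = [round(resolve_pixel(samples, p, radius=0.5), 2) for p in range(6)]
    wide   = [round(resolve_pixel(samples, p, radius=1.5), 2) for p in range(6)]
    print("narrow:", narrow)  # only each pixel's own samples: hard step, like a plain resolve
    print("wide  :", wide)    # neighbouring samples blended in: edge spread over two pixels
```
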

#13 LordEC911
Member since 2004 • 9972 Posts

I know it won't eliminate the aliasing, but if it isn't bad enough to seriously degrade the visuals, that would help me decide between the 8800GTS and the HD2900XT. From those screenshots 4xAA makes a big difference, but only over the game with no AA or CSAA. The HD2900XT, and correct me if I am wrong, has its own type of CSAA (wide and narrow tent) that rivals MSAA without taking a performance hit (or only a small one). Manly-manly-man

I believe the new type of AA is Custom Filtering.

#14 SDxSnOOpZ
Member since 2006 • 225 Posts

From HardOCP... if you want to see more image quality screenshots comparing the two cards, just read their reviews.

http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfM181X2wucG5n

http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfM180X2wucG5n

The difference between the lower 25% grass setting and half grass setting can clearly be seen in the screenshots above. As you zoom out farther in third person view the grass disappears quicker with less grass distance. At 50% grass the grass is all visible at full zoom out.

http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfMl80X2wuanBn

http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfMl81X2wuanBn

The above screenshots illustrate the superior image quality of 16X Transparency Supersampling in Battlefield 2142.

http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfNF82X2wucG5n

http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfNF81X2wucG5n

The above screenshots illustrate the difference between full dynamic lighting and objects dynamic rendering. Shadow detail is greatly reduced with all objects that cast shadows, including trees. In the second screenshot you can really see the difference between lightmaps and real-time dynamic lighting.

Conclusion:

Overall Performance Summary

In Oblivion we found the 320 MB and 640 MB GeForce 8800 GTS based video cards to perform faster than the ATI Radeon HD 2900 XT. Oddly enough they were able to handle higher grass distance settings in Oblivion despite the Radeon HD 2900 XT having much higher memory bandwidth.

Battlefield 2142 had a large difference in the gameplay experience between the ATI Radeon HD 2900 XT and both GeForce 8800 GTS video cards. Even with the much less expensive 320 MB GeForce 8800 GTS we were able to play the game smoothly at 16X Transparency Supersampling at 1600x1200 with no problems at all in intense gun fights with massive explosions. The more expensive ATI Radeon HD 2900 XT could not handle anything higher than 4X Performance Adaptive AA at 1600x1200.

S.T.A.L.K.E.R. also proved to separate these video cards by performance. The ATI Radeon HD 2900 XT was clearly the weaker performing video card. We had to lower the rendering quality to "Objects Dynamic Lighting" and run at 1280x1024 to receive playable performance. Unfortunately this does diminish the gameplay experience compared to the GeForce 8800 GTS based video cards. We were able to take the game up to full rendering quality and play at 1600x1200 with NVIDIA based cards. With the 320 MB version we had to drop the AF level to 4X and grass density to 50%.

Lost Planet is a fun game, plain and simple; we had a blast playing through the demo. If this is the future of gaming then we are very happy. There is no question that next generation titles will require fast hardware to keep up with the intense detail. This demo presented some interesting results for us. We found that the ATI Radeon HD 2900 XT really does take a large performance hit when enabling AA; to the point where it just isn't a viable option right now. The GeForce 8800 GTS based video cards on the other hand don't take as great a hit and some gamers may find 2X AA or more playable depending on what framerates you are comfortable with.

In Lost Planet's outdoor areas the ATI Radeon HD 2900 XT, without AA, is slightly better performing than both GeForce 8800 GTS based video cards. However, in that one indoor area of the performance test called "Cave" we saw the framerates suffer and perform slower than the GeForce 8800 GTS based video cards. We cannot wait until the full version game is released so we can test all the levels and see how the video cards really compare throughout the entire game.

#15 Wesker776
Member since 2005 • 7004 Posts
LordEC is going to rip you for linking HardOCP... XD
#16 SDxSnOOpZ
Member since 2006 • 225 Posts

LordEC is going to rip you for linking HardOCP... XDWesker776

What's wrong with HardOCP?

#17 LordEC911
Member since 2004 • 9972 Posts

[QUOTE="Wesker776"]LordEC is going to rip you for linking HardOCP... XDSDxSnOOpZ

What's wrong with HardOCP?

LoL, I posted in the other thread.
I also think I discussed this with you a few months ago.

#18 Makari
Member since 2003 • 15250 Posts

[QUOTE="Wesker776"]LordEC is going to rip you for linking HardOCP... XDSDxSnOOpZ

What's wrong with HardOCP?

Their review methodology happens to hit on exactly what the weak point of the HD2900XT is... they pretty much ignore base benchmarks (i.e. they will completely skip no AA) and go after max playable settings, which immediately puts the HD2900 into the high-AA, huge-FPS-drop area that it's been well known for. Some people like to credit that to them being nVidia fanboys, despite Brent's history of running ATi websites before he went to [H]. :P I think it's just a really nasty coincidence that these guys are set on doing things a certain way, and that certain way is exactly what the R6x-gen cards are doing badly at.
#19 Manly-manly-man
Member since 2006 • 3477 Posts
You know, they should have done more in-depth testing. At 2xAA with full CFAA the HD2900XT does perform better than the GTS 640MB in most games, and it looks just as good as 4xMSAA.
#20 domke13
Member since 2006 • 2891 Posts
[QUOTE="SDxSnOOpZ"]

[QUOTE="Wesker776"]LordEC is going to rip you for linking HardOCP... XDMakari

What's wrong with HardOCP?

Their review methodology happens to hit on exactly what the weak point of the HD2900XT is... they pretty much ignore base benchmarks (i.e. they will completely skip no AA) and go after max playable settings, which immediately puts the HD2900 into the high-AA, huge-FPS-drop area that it's been well known for. Some people like to credit that to them being nVidia fanboys, despite Brent's history of running ATi websites before he went to [H]. :P I think it's just a really nasty coincidence that these guys are set on doing things a certain way, and that certain way is exactly what the R6x-gen cards are doing badly at.

I see nothing wrong with testing a $400+ card at the maximum settings possible, with as much AA as it can handle. If you buy a $400 card you probably want the highest possible settings for quite some time, and probably as much AA as you can get too. If you don't want AA and the highest possible settings, then you can just go for a $200 card.

#21 LordEC911
Member since 2004 • 9972 Posts
[QUOTE="Makari"]Their review methodology happens to hit on exactly what the weak point of the HD2900XT is... they pretty much ignore base benchmarks (i.e. they will completely skip no AA) and go after max playable settings, which immediately putsthe HD2900 into the high-AA-huge-FPS-drop area that it's been well known for. Some people like to credit that to them being nVidia fanboys, despite Brent's history running ATi websites before he went to [H]. :P I think it's just a really nasty coincidence that these guys are set on doing things a certain way, and that certain way is exactly what the R6x-gen cards are doing badly at.domke13

I see nothing wrong to test 400$ + worth card in max settings possible whit full AA as far as it goes. If you buy 400$ card you probably want highest possible settings for quiet some time, and probably as much AA as you can get too. If you dont wantAA and highest possible settings, than you can just go for 200$ card.

It isn't even really that.

I've had a problem with the way they bench for a long time (one of the reasons I don't read their benches), plus Kyle has a terrible attitude.
The main problem with the current HardOCP is that Kyle and Brent are obviously in Nvidia's pocket. Yes, I know about Kyle's history, but that doesn't mean anything. If you followed the articles leading up to the G80 launch and then the R600 launch, you will see what I mean.

For example, when AMD was showing off the "teraflop in a box" (dual R600 sample cards running at about 850MHz) in San Francisco a month or two before launch, Kyle decided to post an article stating that the R600 was drawing 300W of power, 600W for CrossFire. What is amusing is that all the journalists in San Francisco were publishing a 200W draw per card, 400W for CrossFire. Even a video of the Q&A came out, and when someone asked about the power draw the AMD rep said about 200W per card. Did they do a retraction/correction or take it down? Nope. They just left it up, even though it was total BS.

They started calling the R600 a flop before it even came out. They used early beta drivers on a VERY early sample card, and they are still using the same card today for their benches, even though they could simply send it back to AMD and ask for a retail model. Why? I have my own opinion and you have to decide for yourself, but the facts all point at something fishy...

#22 SDxSnOOpZ
Member since 2006 • 225 Posts

Sure... I'll post some screenshots of the Lost Planet DX10 version at 1600x1200, AA vs. no AA, in a minute.

#23 domke13
Member since 2006 • 2891 Posts
[QUOTE="domke13"][QUOTE="Makari"]Their review methodology happens to hit on exactly what the weak point of the HD2900XT is... they pretty much ignore base benchmarks (i.e. they will completely skip no AA) and go after max playable settings, which immediately putsthe HD2900 into the high-AA-huge-FPS-drop area that it's been well known for. Some people like to credit that to them being nVidia fanboys, despite Brent's history running ATi websites before he went to [H]. :P I think it's just a really nasty coincidence that these guys are set on doing things a certain way, and that certain way is exactly what the R6x-gen cards are doing badly at.LordEC911

I see nothing wrong to test 400$ + worth card in max settings possible whit full AA as far as it goes. If you buy 400$ card you probably want highest possible settings for quiet some time, and probably as much AA as you can get too. If you dont wantAA and highest possible settings, than you can just go for 200$ card.

It isn't really that even.

I had a problem with the way they benched for a long time, one of the reasons I don't read their benches, plus Kyle has a terrible attitude.
The main problem with the current HardOCP is that Kyle and Brent are obviously in Nvidia's pocket, yes I know about Kyle's history but that doesn't mean anything. If you followed the articles leading up to the G80 launch and then the R600 launch you will see what I mean.

For Ex, when AMD was showing off the teraflop in a box, dual R600 sample cards running at about 850mhz, in SanFran a month or two before launch, Kyle decided to post an article stating that the R600 was using 300w of power, 600w for Crossfire. What is amusing is that all the Journos in SanFran were publishing a 200w draw per card, 400w CF. Even a video of the Q/A came out and someone asked about the power draw and the AMD Rep said about 200w per card. Did they do a retraction/correction or take it down? Nope. They just left it up, even though it was total BS.

They started calling the R600 a flop before it even came out. They used early beta drivers on a VERY early sample card, they are still using the same card today for their benches, even though they could simple send it back to AMD and ask for a retail model. Why? I have my own opinion and you have to decide for yourself but the facts are all pointing at something fishy...

:lol: Didn't know that. These guys are funny: they write that CF uses 600W when in fact it uses 400W. Haha. If that's true I'll try to avoid their reviews too. But about the G80 review: I think everyone was fascinated by the G80 when it came out. It was THE BEST card of all time, with a huge performance advantage over the previous generation, and now that the R600 has come out not many people are fascinated, because they saw the G80 before.

#24 LordEC911
Member since 2004 • 9972 Posts
I also forgot to add: in their original review of the HD2900XT they used a few slides from an NVIDIA presentation, one of which they based their entire power consumption section on... though they did take most of those slides out after a day or so.
#25 Makari
Member since 2003 • 15250 Posts
I also forgot to add: in their original review of the HD2900XT they used a few slides from an NVIDIA presentation, one of which they based their entire power consumption section on... though they did take most of those slides out after a day or so.LordEC911
The power consumption figure was originally AMD's own comment about their card. Just as the "power required" for the 8800 was vastly overestimated, so it was for the R600: AMD made those comments originally, everybody else promptly jumped on them and started quoting them, and that put the 2900 in a very bad light... the problem kicks in when people start comparing the G80's real-world power usage to the R600's theoretical power usage. Looking around, the only place I see that actually "corrected" the 300W estimate before the cards were out was the Inquirer, which I'm pretty sure most big hardware sites make a point of ignoring unless its claims can be backed up independently.

Something that bugs me is people who pretty obviously make things up to further their own agenda. You say [H] has been using the same very early pre-release card all along, which partially explains their results? http://www.hardocp.com/image.html?image=MTE4MzM2NjA5Mk5MQlRGOUYyTWRfMV8xX2wuanBn I dunno, that looks retail. And their review mentioned "retail cards." One must also wonder how they did a CrossFire setup with only one early sample card.

And for the record, I don't know about Kyle's history, though he can be an ass and is likely proud of it; I wasn't referring to him. It's Brent Justice, the video card editor, who came to [H] after a history of running ATi fansites. When you've got a choice between A) a vast conspiracy theory that assumes a small group of people are both extremely cunning and blindingly stupid at the same time, and B) an answer you don't like, it's much more likely to be the answer you don't like.