You should be able to find screenshots out there with Google...
http://www.pcper.com/article.php?aid=319&type=expert&pid=6
I know it won't eliminate the aliasing, but if it isn't bad enough to degrade the visuals too badly it would help me decide between the 8800GTS and the HD2900XT. From those screenshots 4xAA makes a big difference, but only over the game with no AA or CSAA. The HD2900XT, and correct me if I am wrong, has its own counterpart to CSAA (the wide and narrow tent filters) that rivals MSAA without taking a performance hit (or only a small one).
Edit: I worded that badly. What I meant, in short, is that at 2xAA you can enable CFAA, without a performance hit, and make it look about as good as 4xAA while keeping better framerates. That is good for me, because it means the HD2900XT is a better deal for me: at 1600x1200 with just 2xAA, it performs better than the 8800GTS.
[QUOTE="Manly-manly-man"]The HD2900XT, and correct me if I am wrong, has its own counterpart to CSAA (the wide and narrow tent filters) that rivals MSAA without taking a performance hit (or only a small one).[/QUOTE]
I believe the new type of AA is called Custom Filter AA (CFAA).
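To picture why those tent filters are supposed to be nearly free: as I understand it, CFAA doesn't render any extra samples; the resolve pass just re-weights samples that already exist, pulling in samples from neighbouring pixels with linearly decreasing (tent) weights. Here is a rough Python sketch of that idea; the sample positions, distances and filter radius are made-up numbers for illustration, not R600's actual sample pattern or resolve code:

def tent_weight(distance, radius):
    # Linearly decreasing weight: 1.0 at the pixel centre, 0.0 at `radius`.
    return max(0.0, 1.0 - distance / radius)

def resolve_pixel(samples, radius):
    # samples: list of (distance_from_pixel_centre, colour) pairs.
    # Returns the tent-filtered colour for one pixel.
    weights = [tent_weight(d, radius) for d, _ in samples]
    return sum(w * c for w, (_, c) in zip(weights, samples)) / sum(weights)

# A bright edge pixel with plain 2xAA: two samples of its own...
own_samples = [(0.25, 1.0), (0.25, 1.0)]
# ...plus the nearest samples of its darker neighbours, which a "narrow tent"
# filter would also read during resolve (no new samples are rendered).
neighbour_samples = [(0.75, 0.0), (0.75, 0.0)]

print(resolve_pixel(own_samples, radius=0.5))                      # 1.0  -> hard edge step
print(resolve_pixel(own_samples + neighbour_samples, radius=1.0))  # 0.75 -> smoother gradient

The point is just that the wider filter trades a little sharpness for smoother edge gradients, which is why 2xAA plus a tent filter can look close to 4xAA in screenshots.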
From HardOCP... if you want to see more image quality screenshots comparing the two cards, just read up on their reviews.
http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfM181X2wucG5n
http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfM180X2wucG5n
The difference between the 25% grass setting and the 50% grass setting can clearly be seen in the screenshots above. As you zoom out farther in third-person view, the grass disappears sooner with the lower grass distance setting. At 50% grass, all the grass remains visible at full zoom out.
http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfMl80X2wuanBn
http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfMl81X2wuanBn
The above screenshots illustrate the superior image quality of 16X Transparency Supersampling in Battlefield 2142.
http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfNF82X2wucG5n
http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfNF81X2wucG5n
The above screenshots illustrate the difference between full dynamic lighting and objects dynamic lighting. Shadow detail is greatly reduced for all objects that cast shadows, including trees. In the second screenshot you can really see the difference between lightmaps and real-time dynamic lighting.
Conclusion:
Overall Performance Summary
In Oblivion we found the 320 MB and 640 MB GeForce 8800 GTS based video cards to perform faster than the ATI Radeon HD 2900 XT. Oddly enough they were able to handle higher grass distance settings in Oblivion despite the Radeon HD 2900 XT having much higher memory bandwidth.
Battlefield 2142 had a large difference in the gameplay experience between the ATI Radeon HD 2900 XT and both GeForce 8800 GTS video cards. Even with the much less expensive 320 MB GeForce 8800 GTS we were able to play the game smoothly at 16X Transparency Supersampling at 1600x1200 with no problems at all in intense gun fights with massive explosions. The more expensive ATI Radeon HD 2900 XT could not handle anything higher than 4X Performance Adaptive AA at 1600x1200.
S.T.A.L.K.E.R. also proved to separate these video cards by performance. The ATI Radeon HD 2900 XT was clearly the weaker performing video card. We had to lower the rendering quality to "Objects Dynamic Lighting" and run at 1280x1024 to receive playable performance. Unfortunately this does diminish the gameplay experience compared to the GeForce 8800 GTS based video cards. We were able to take the game up to full rendering quality and play at 1600x1200 with NVIDIA based cards. With the 320 MB version we had to drop the AF level to 4X and grass density to 50%.
Lost Planet is a fun game, plain and simple; we had a blast playing through the demo. If this is the future of gaming then we are very happy. There is no question that next generation titles will require fast hardware to keep up with the intense detail. This demo presented some interesting results for us. We found that the ATI Radeon HD 2900 XT really does take a large performance hit when enabling AA; to the point where it just isn't a viable option right now. The GeForce 8800 GTS based video cards on the other hand don't take as great a hit, and some gamers may find 2X AA or more playable depending on what framerates they are comfortable with.
In Lost Planet's outdoor areas the ATI Radeon HD 2900 XT, without AA, is slightly better performing than both GeForce 8800 GTS based video cards. However, in that one indoor area of the performance test called "Cave" we saw the framerates suffer and perform slower than the GeForce 8800 GTS based video cards. We cannot wait until the full version game is released so we can test all the levels and see how the video cards really compare throughout the entire game.
[QUOTE="Wesker776"]LordEC is going to rip you for linking HardOCP... XDSDxSnOOpZ
What's wrong with HardOCP?
Their review methodology happens to hit on exactly what the weak point of the HD2900XT is... they pretty much ignore base benchmarks (i.e. they will completely skip no AA) and go after max playable settings, which immediately putsthe HD2900 into the high-AA-huge-FPS-drop area that it's been well known for. Some people like to credit that to them being nVidia fanboys, despite Brent's history running ATi websites before he went to [H]. :P I think it's just a really nasty coincidence that these guys are set on doing things a certain way, and that certain way is exactly what the R6x-gen cards are doing badly at.[QUOTE="SDxSnOOpZ"][QUOTE="Wesker776"]LordEC is going to rip you for linking HardOCP... XDMakari
What's wrong with HardOCP?
Their review methodology happens to hit on exactly what the weak point of the HD2900XT is... they pretty much ignore base benchmarks (i.e. they will completely skip no AA) and go after max playable settings, which immediately putsthe HD2900 into the high-AA-huge-FPS-drop area that it's been well known for. Some people like to credit that to them being nVidia fanboys, despite Brent's history running ATi websites before he went to [H]. :P I think it's just a really nasty coincidence that these guys are set on doing things a certain way, and that certain way is exactly what the R6x-gen cards are doing badly at.I see nothing wrong to test 400$ + worth card in max settings possible whit full AA as far as it goes. If you buy 400$ card you probably want highest possible settings for quiet some time, and probably as much AA as you can get too. If you dont wantAA and highest possible settings, than you can just go for 200$ card.
[QUOTE="Makari"]Their review methodology happens to hit on exactly what the weak point of the HD2900XT is... they pretty much ignore base benchmarks (i.e. they will completely skip no AA) and go after max playable settings, which immediately putsthe HD2900 into the high-AA-huge-FPS-drop area that it's been well known for. Some people like to credit that to them being nVidia fanboys, despite Brent's history running ATi websites before he went to [H]. :P I think it's just a really nasty coincidence that these guys are set on doing things a certain way, and that certain way is exactly what the R6x-gen cards are doing badly at.domke13
I see nothing wrong to test 400$ + worth card in max settings possible whit full AA as far as it goes. If you buy 400$ card you probably want highest possible settings for quiet some time, and probably as much AA as you can get too. If you dont wantAA and highest possible settings, than you can just go for 200$ card.
It isn't really even that.
I've had a problem with the way they bench for a long time; it's one of the reasons I don't read their benchmarks. Plus, Kyle has a terrible attitude.
The main problem with the current HardOCP is that Kyle and Brent are obviously in Nvidia's pocket. Yes, I know about Kyle's history, but that doesn't mean anything. If you followed the articles leading up to the G80 launch and then the R600 launch, you will see what I mean.
For example, when AMD was showing off the "Teraflop in a Box" demo (dual R600 sample cards running at about 850MHz) in San Francisco a month or two before launch, Kyle decided to post an article stating that the R600 was drawing 300W of power, 600W for CrossFire. What is amusing is that all the journalists in San Francisco were reporting about 200W per card, 400W for CrossFire. A video of the Q&A even surfaced in which someone asked about the power draw, and the AMD rep said about 200W per card. Did they post a retraction or correction, or take the article down? Nope. They just left it up, even though it was total BS.
They started calling the R600 a flop before it even came out. They used early beta drivers on a VERY early sample card, and they are still using that same card today for their benchmarks, even though they could simply send it back to AMD and ask for a retail model. Why? I have my own opinion and you have to decide for yourself, but the facts all point at something fishy...
[QUOTE="domke13"][QUOTE="Makari"]Their review methodology happens to hit on exactly what the weak point of the HD2900XT is... they pretty much ignore base benchmarks (i.e. they will completely skip no AA) and go after max playable settings, which immediately putsthe HD2900 into the high-AA-huge-FPS-drop area that it's been well known for. Some people like to credit that to them being nVidia fanboys, despite Brent's history running ATi websites before he went to [H]. :P I think it's just a really nasty coincidence that these guys are set on doing things a certain way, and that certain way is exactly what the R6x-gen cards are doing badly at.LordEC911
I see nothing wrong to test 400$ + worth card in max settings possible whit full AA as far as it goes. If you buy 400$ card you probably want highest possible settings for quiet some time, and probably as much AA as you can get too. If you dont wantAA and highest possible settings, than you can just go for 200$ card.
It isn't really that even.
I had a problem with the way they benched for a long time, one of the reasons I don't read their benches, plus Kyle has a terrible attitude.
The main problem with the current HardOCP is that Kyle and Brent are obviously in Nvidia's pocket, yes I know about Kyle's history but that doesn't mean anything. If you followed the articles leading up to the G80 launch and then the R600 launch you will see what I mean.
For Ex, when AMD was showing off the teraflop in a box, dual R600 sample cards running at about 850mhz, in SanFran a month or two before launch, Kyle decided to post an article stating that the R600 was using 300w of power, 600w for Crossfire. What is amusing is that all the Journos in SanFran were publishing a 200w draw per card, 400w CF. Even a video of the Q/A came out and someone asked about the power draw and the AMD Rep said about 200w per card. Did they do a retraction/correction or take it down? Nope. They just left it up, even though it was total BS.
They started calling the R600 a flop before it even came out. They used early beta drivers on a VERY early sample card, they are still using the same card today for their benches, even though they could simple send it back to AMD and ask for a retail model. Why? I have my own opinion and you have to decide for yourself but the facts are all pointing at something fishy...
:lol: I didn't know that. These guys are funny. They write that CrossFire uses 600W when in fact it uses 400W. Haha. If that's true I'll try to avoid their reviews too. But about the G80 review: I think everyone was fascinated by the G80 when it came out. It was THE BEST card of all time, with a huge performance advantage over the previous generation, and now that the R600 is out, people aren't as fascinated because they already saw the G80.
[QUOTE="LordEC911"]I also forgot to add, in their original review of the HD2900XT they used a few slides from an NVIDIA presentation, one of which they based their entire power consumption section on... though they did take most of those slides out after a day or so.[/QUOTE]
The power consumption figure was originally AMD's own comment about its card. Just like the 8800s' 'power required' was vastly overestimated, so was the R600's. AMD made those comments first, everybody else promptly jumped on and started quoting them, and that put the 2900 in a very bad light... the problem kicks in when people start comparing the G80's real-world power usage to the R600's theoretical power usage. Looking around, the only place I see that actually 'corrected' the 300W estimate before the cards were out was the Inquirer, which I'm pretty sure most big hardware sites make a point of ignoring unless its claims can be backed up independently. Something that bugs me is people who pretty obviously make things up to further their own agenda. You say [H] has been using the same very early pre-release card all along, which partially explains their results? http://www.hardocp.com/image.html?image=MTE4MzM2NjA5Mk5MQlRGOUYyTWRfMV8xX2wuanBn I dunno, that looks retail. And their review mentioned 'retail cards.' One must also wonder how they did a CrossFire setup with only one early sample card. And for the record, I don't know about Kyle's history, though he can be an ass and is likely proud of it - I wasn't referring to him. It's Brent Justice, the video card editor, who came to [H] after a history of running ATi fansites. When you've got a choice between A) a vast conspiracy theory that assumes a small group of people are both extremely cunning and blindingly stupid at the same time, and B) an answer you don't like, it's much more likely to be the answer you don't like.