What I hate about 99% of the benchmarks is that they use massive CPUs, massively overclocked. I think they should use more mainstream setups so we aren't left to compare GPUs solely on ratios.
E.g.: an E6600, 2x1GB DDR2-800, and a mainstream mobo, instead of a QX9650 @ 4 GHz, 2x1GB DDR3-1600, a massive mobo, etc.
IMO anyway.
Hard to tell. Each game is affected a little differently by changes to either of those. With that said, you'd be an idiot to use 16x AA... even 8x AA is overkill.
Your performance at that resolution will be great for any game. I wouldn't worry about it.
XaosII
What is overkill? I don't want jaggies, so why not 16x AA? And sorry, I'm new, so I may ask questions with obvious answers. Thanks.
[QUOTE="XaosII"]Hard to tell. Each game is affected a little differently by changes to either of those. With that said, you'd be an idiot to use 16x AA... even 8x AA is overkill.
Your performance at that resolution will be great for any game. I wouldn't worry about it.
bioshock180
What is overkill? I don't want jaggies, so why not 16x AA? And sorry, I'm new, so I may ask questions with obvious answers. Thanks.
It's overkill because you wouldn't even be able to see much of a difference between 4x, 8x, and 16x AA.
I have an AMD Phenom 9750 (2.4 GHz x 4). Is this good for now, or will I have to upgrade soon? bioshock180
More than enough for anything... Although others might disagree, it's not going to fail you anytime soon.
[QUOTE="bioshock180"][QUOTE="XaosII"]Hard to tell. Each game is affected a little differently by changes to either of those. With that said, you'd be an idiot to use 16x AA... even 8x AA is overkill.
Your performance at that resolution will be great for any game. I wouldn't worry about it.
getoffmylawn
What is overkill? I don't want jaggies, so why not 16x AA? And sorry, I'm new, so I may ask questions with obvious answers. Thanks.
It's overkill because you wouldn't even be able to see much of a difference between 4x, 8x, and 16x AA.
Things may actually look worse with that much AA :lol:
If you like using AA, why the HELL are you getting a GTX 260 instead of an HD4870?
Nice review that compares this season's graphics cards:
http://www.bit-tech.net/hardware/2008/07/11/summer-2008-graphics-performance-roundup/8
The HD4870 trumps the GTX 260 every time AA is enabled. It even manages to knock the GTX 280 out of the #1 spot sometimes.
Hard to tell. Each game is affected a little differently by changes to either of those. With that said, you'd be an idiot to use 16x AA... even 8x AA is overkill.
Your performance at that resolution will be great for any game. I wouldn't worry about it.
XaosII
Depends on the method of AA used. IHVs use "shortcuts" in order to increase the AA sample rates to levels like 12x, 16x, and 24x. This method of greatly increasing pixel samples is usually referred to as a "filter". Typically, Nvidia uses CSAA (Coverage Sample AA), while ATI uses CFAA (Custom Filter AA) and may also include "Edge Detect" filtering.
The main point of filters is obviously to enable high levels of AA, as if you were using a "proper" method like MSAA (Multisample AA), without the huge performance penalties that come with higher AA levels. However, as with most shortcuts in life, there are some pitfalls. Filters can, in some cases, actually blur the image (hence why ATI designed Edge Detect, which runs a pass to detect edges and applies AA to them more than to other areas). So it's best to try out filters for yourself, at different levels, to see what you're comfortable with.
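To see why coverage filters are cheaper than "real" multisampling, here's a rough back-of-the-envelope memory model. All the per-sample byte counts are assumptions for illustration, not vendor-measured figures: full MSAA stores a complete color+depth sample per sample, while a coverage scheme like CSAA stores only a few full samples plus many cheap coverage samples.

```python
def msaa_bytes_per_pixel(samples, color_bytes=4, depth_bytes=4):
    """Full multisampling: every sample carries color and depth."""
    return samples * (color_bytes + depth_bytes)

def csaa_bytes_per_pixel(color_samples, coverage_samples,
                         color_bytes=4, depth_bytes=4, coverage_bytes=1):
    """Coverage-sample AA: few stored samples, many cheap coverage samples."""
    return (color_samples * (color_bytes + depth_bytes)
            + coverage_samples * coverage_bytes)

# Hypothetical comparison at 1440x900:
width, height = 1440, 900
pixels = width * height

for label, bpp in [("8x MSAA", msaa_bytes_per_pixel(8)),
                   ("16x CSAA (4 color + 16 coverage)",
                    csaa_bytes_per_pixel(4, 16))]:
    print(f"{label}: {pixels * bpp / 2**20:.0f} MiB framebuffer")
```

Under these assumed numbers, 16x CSAA actually needs a smaller framebuffer than 8x MSAA, which is the whole appeal of such filters.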
What I hate about 99% of the benchmarks is that they use massive CPUs, massively overclocked. I think they should use more mainstream setups so we aren't left to compare GPUs solely on ratios.
E.g.: an E6600, 2x1GB DDR2-800, and a mainstream mobo, instead of a QX9650 @ 4 GHz, 2x1GB DDR3-1600, a massive mobo, etc.
IMO anyway.
Staryoshi87
The objective with such reviews is to remove any CPU bottleneck on the actual GPU.
If reviewers were reviewing entire PC setups, you would have a valid point. However, if we want to see which GPU is the fastest, we need the fastest CPUs to obtain an accurate result.
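The bottleneck argument can be sketched in a few lines. The frame rates below are made up purely for illustration: the displayed frame rate is capped by whichever component is slower, so a mainstream CPU can make two very different GPUs benchmark identically.

```python
def effective_fps(cpu_fps, gpu_fps):
    """The system runs at the rate of its slowest stage."""
    return min(cpu_fps, gpu_fps)

# With a mainstream CPU that can only prepare 60 frames/s, two very
# different GPUs look identical in a benchmark:
print(effective_fps(cpu_fps=60, gpu_fps=90))    # reads as 60
print(effective_fps(cpu_fps=60, gpu_fps=140))   # also reads as 60

# With a heavily overclocked CPU, the GPU difference becomes visible:
print(effective_fps(cpu_fps=200, gpu_fps=90))
print(effective_fps(cpu_fps=200, gpu_fps=140))
```

That's why reviewers pair GPUs with the fastest CPU they can get: it pushes the CPU line high enough that the GPU is always the limiting factor.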
I have an AMD Phenom 9750 (2.4 GHz x 4). Is this good for now, or will I have to upgrade soon? bioshock180
You won't have to upgrade soon, but the current Phenoms aren't exactly great for gaming.
Let me put it this way: you probably should've gone with an Intel Core 2 Quad, or even a Core 2 Duo.
As to your original question: if you can get 60 FPS at 2560x1600, then you could easily get 60+ FPS at 1440x900 with 16x CSAA and 16x AF.
My question, though, would be: why the hell are you spending $300+ on a GPU to game on a $250 monitor? A GPU that fast warrants a better monitor.
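A quick sanity check on the resolution claim above. This assumes a purely fill-rate-bound workload, so FPS scales roughly inversely with the number of pixels rendered; real games won't scale this cleanly, but it shows the headroom involved.

```python
def scaled_fps(fps, old_res, new_res):
    """Naive estimate: FPS scales with the inverse of the pixel count."""
    old_px = old_res[0] * old_res[1]
    new_px = new_res[0] * new_res[1]
    return fps * old_px / new_px

# If a card manages 60 FPS at 2560x1600, a naive estimate for 1440x900:
estimate = scaled_fps(60, (2560, 1600), (1440, 900))
print(round(estimate))  # roughly 3x the frame rate
```

2560x1600 pushes about 3.2x as many pixels as 1440x900, which is why there's plenty of headroom left over to spend on AA and AF at the lower resolution.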
[QUOTE="XaosII"]Hard to tell. Each game is affected a little differently by changes to either of those. With that said, you'd be an idiot to use 16x AA... even 8x AA is overkill.
Your performance at that resolution will be great for any game. I wouldn't worry about it.
bioshock180
What is overkill? I don't want jaggies, so why not 16x AA? And sorry, I'm new, so I may ask questions with obvious answers. Thanks.
You look angry xD Take it easy, he didn't mean to offend you...
If you like using AA, why the HELL are you getting a GTX 260 instead of an HD4870?
Nice review that compares this season's graphics cards:
http://www.bit-tech.net/hardware/2008/07/11/summer-2008-graphics-performance-roundup/8
The HD4870 trumps the GTX 260 every time AA is enabled. It even manages to knock the GTX 280 out of the #1 spot sometimes.
Wesker776
Is this review really accurate?
Because the benchmarks of the HD4870 are crazy O_o!!!
I mean, 50 FPS with DirectX 9 is perfect!!!
[QUOTE="Wesker776"]If you like using AA, why the HELL are you getting a GTX 260 instead of an HD4870?
Nice review that compares this season's graphics cards:
http://www.bit-tech.net/hardware/2008/07/11/summer-2008-graphics-performance-roundup/8
The HD4870 trumps the GTX 260 every time AA is enabled. It even manages to knock the GTX 280 out of the #1 spot sometimes.
gla300
Is this review really accurate?
Because the benchmarks of the HD4870 are crazy O_o!!!
I mean, 50 FPS with DirectX 9 is perfect!!!
Why wouldn't they be? :P
Those in the know are saying that RV770 is probably one of the best architectures of its time, along with R300 and G80.
At least in terms of Crysis, increasing AA basically creates clones of texture/video data in vRAM; considering the game can use up about 400 MB of vRAM before any AA is applied, this is why you quickly find that 512 MB of video RAM is a bottleneck for applying any form of AA in Crysis. You almost literally do need at least a 1GB GPU for that game, and multi-GPU as well, to have a video system fast enough to keep things performing with a reasonable amount of AA. codezer0
For the most part I would agree with you; however, tests show that frame buffer size isn't as important as how efficiently the GPU handles its usable vRAM.
I've seen tests between G200 and RV770 where RV770 manages to keep up with, or even overtake, G200 parts at 2560x1600 with AA and AF applied (the 9800 GX2 drops out at this level, as its frame buffer, bandwidth, and memory allocation algorithms are not up to par, methinks).
But I guess we should wait for those 1GB HD4870s to see if this holds true...
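The 512 MB-vs-1GB argument can be sketched with a crude vRAM budget. The ~400 MB base figure comes from the Crysis discussion above; the per-sample byte count and resolution are assumptions for illustration, not measurements.

```python
def vram_needed_mb(base_textures_mb, width, height, aa_samples,
                   bytes_per_sample=8):
    """Base texture/video data plus a multisampled framebuffer
    (assumed 8 bytes per sample for color + depth)."""
    framebuffer_mb = width * height * aa_samples * bytes_per_sample / 2**20
    return base_textures_mb + framebuffer_mb

# ~400 MB of texture data before AA, at a hypothetical 1920x1200:
for aa in (1, 4, 8):
    need = vram_needed_mb(400, 1920, 1200, aa)
    verdict = "fits" if need <= 512 else "exceeds"
    print(f"{aa}x AA: ~{need:.0f} MB ({verdict} 512 MB)")
```

Under these assumptions, no-AA and 4x AA squeeze into a 512 MB card, but 8x AA blows past it, which matches the complaint that 512 MB becomes the bottleneck as soon as heavy AA is enabled.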
The higher the AA, the better the quality of the picture... but at a price in performance. I try to avoid AA at all times because it drops my FPS drastically. If I do use AA, I never go past 4x because I can't tell a difference when I go any higher. TJ_22_TJ
Same here. I can't tell the difference past 4x AA :p