I always put it to X16 and don't notice any framerate drops. Is it a memory constraint? Why not just lower the texture quality slightly and have 16xAF? No point in having hi-res textures if they're blurry as hell 10 feet in front of you. :?
It is called Anisotropic Filtering. I always put it to X16 and don't notice any framerate drops. Is it a memory constraint? Why not just lower the texture quality slightly and have 16xAF? No point in having hi-res textures if they're blurry as hell 10 feet in front of you. :?
Peredith
[QUOTE="Peredith"]It is called Anisotropic Filtering.I always put it to X16 and don't notice any framerate drops. Is it a memory contrainst? Why not just lower the texture quality slightly and have 16xAF. No point in having hi res textures if they're blurry as hell 10 foot infront of you. :?
jtm33
Whatever :P
Because console games are so linear that they don't need it. :P ShadowDeathX
Yep the textures improve as they get closer so they'll never notice :P
Because consoles are weak or devs are lazy, I mean, AF is the least demanding graphical option, and I love it on my PC games.
16xAF FTW.
mitu123
I think they are lazy, it's barely demanding.
[QUOTE="mitu123"]
Because consoles are weak or devs are lazy, I mean, AF is the least demanding graphical option, and I love it on my PC games.
16xAF FTW.
KillerJuan77
I think they are lazy, it's barely demanding.
I think it is a RAM issue really. It is not demanding on PC because we have tons and tons of free RAM. Consoles on the other hand, yea.... Because console games are so linear that they don't need it. :P ShadowDeathX
This made me laugh a bit.
I have asked myself this same question many times. I have had it at 16x for I don't know how many years and have never seen it have any negative effect on my framerate. I know the consoles are weak, but this just seems worth it to give something up for. It does so much for so little.
probably due to the 360 hardware limitations. the 360 is holding back a lot of things and hopefully xbox dies. ConsoleCounsla_
Seriously...you're unbelievable. If anything, it's probably due to RAM constraints and guess what? PS3 has less of that stuff than the 360...So...
LOL.
[QUOTE="ConsoleCounsla_"]probably due to the 360 hardware limitations. the 360 is holding back alot things and hopefully xbox dies.balfe1990
Seriously...you're unbelievable. If anything, it's probably due to RAM constraints and guess what? PS3 has less of that stuff than the 360...So...
LOL.
the cell > 360.[QUOTE="balfe1990"][QUOTE="ConsoleCounsla_"]probably due to the 360 hardware limitations. the 360 is holding back alot things and hopefully xbox dies.ConsoleCounsla_
Seriously...you're unbelievable. If anything, it's probably due to RAM constraints and guess what? PS3 has less of that stuff than the 360...So...
LOL.
the cell > 360. The Cell > the universe.
the cell > 360.[QUOTE="ConsoleCounsla_"][QUOTE="balfe1990"]
Seriously...you're unbelievable. If anything, it's probably due to RAM constraints and guess what? PS3 has less of that stuff than the 360...So...
LOL.
balfe1990
The Cell > the universe.
Gohan>Cell
[QUOTE="mitu123"]
Because consoles are weak or devs are lazy, I mean, AF is the least demanding graphical option, and I love it on my PC games.
16xAF FTW.
KillerJuan77
I think they are lazy, it's barely demanding.
Judging by your SIG I am sure AF is barely demanding on your computer, but when you are playing on 2005-era hardware that has every ounce squeezed out of it, ANY extra processing work will cause problems. [QUOTE="balfe1990"][QUOTE="ConsoleCounsla_"]probably due to the 360 hardware limitations. the 360 is holding back a lot of things and hopefully xbox dies. ConsoleCounsla_
Seriously...you're unbelievable. If anything, it's probably due to RAM constraints and guess what? PS3 has less of that stuff than the 360...So...
LOL.
the cell > 360. IBM/AMD/MS's Fusion XCGPU > STI CELL.
CELL doesn't have dedicated hardware to handle texture filtering, i.e. it would have to emulate these functions if the PS3 didn't have the NVIDIA RSX.
[QUOTE="KillerJuan77"][QUOTE="mitu123"]
Because consoles are weak or devs are lazy, I mean, AF is the least demanding graphical option, and I love it on my PC games.
16xAF FTW.
CwlHeddwyn
I think they are lazy, it's barely demanding.
Judging by your SIG I am sure AF is barely demanding on your computer, but when you are playing on 2005-era hardware that has every ounce squeezed out of it, ANY extra processing work will cause problems. Surely they can make compromises?
I agree, good AF is more important to me than some extreme motion blur effects or the like. The funny thing is you don't even need AF if the resolution is high enough and you disable mip-mapping.
Judging by your SIG I am sure AF is barely demanding on your computer, but when you are playing on 2005-era hardware that has every ounce squeezed out of it, ANY extra processing work will cause problems. [QUOTE="CwlHeddwyn"][QUOTE="KillerJuan77"]
I think they are lazy, it's barely demanding.
Peredith
Surely they can make compromises?
Yes they can make compromises. But the current console game engines are ALL based on compromises: sub-HD, 27fps, bloom, small environments, limited physics, pseudo-AA, the list is lengthy. You add something extra like AF, then you've gotta take something away. I always put it to X16 and don't notice any framerate drops. Is it a memory constraint? Why not just lower the texture quality slightly and have 16xAF? No point in having hi-res textures if they're blurry as hell 10 feet in front of you. :?
Peredith
With AMD Xenos, each of the hardware filtered texture units has bilinear sampling capability per clock, and for trilinear or other higher-order filtering methods (e.g. anisotropic) each individual unit will loop through multiple cycles of sampling until the requested sampling and filtering level is reached.
Recent PC GPUs have better filtered texture units than the HD consoles.
The texture units in the Radeon HD 2000 series can bilinear filter 64-bit HDR textures at full speed (i.e. ~7x faster than Radeon X1000 series), while 128-bit floating point textures are filtered at half speed (i.e. ~3.5x faster than Radeon X1000 series).
The Radeon HD 5000 series has improved anisotropic filtering, i.e. it maintains full performance and angle independence.
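If it helps to picture that looping behaviour, here's a toy cost model of a Xenos-style unit that does one bilinear sample per clock and loops for higher-order modes. The cycle counts are simplified assumptions for illustration, not measured hardware figures:
[code]
# Toy cost model: a Xenos-style TMU does one bilinear sample per clock and
# loops for higher-order filtering. Cycle counts are simplified assumptions.

def filtering_cycles(mode: str, aniso_level: int = 1) -> int:
    """Approximate TMU cycles for one filtered texture fetch."""
    if mode == "bilinear":
        return 1                    # one pass through the unit
    if mode == "trilinear":
        return 2                    # two bilinear taps, one per mip level
    if mode == "aniso":
        return 2 * aniso_level      # up to N trilinear taps along the anisotropy axis
    raise ValueError(f"unknown mode: {mode}")

for mode, level in [("bilinear", 1), ("trilinear", 1), ("aniso", 4), ("aniso", 16)]:
    print(f"{mode:>9} x{level:<2}: ~{filtering_cycles(mode, level):>2} cycles per fetch")
[/code]
In practice the hardware only takes as many taps as a surface's anisotropy actually needs, so the worst case isn't hit on every pixel, but the trend is the point: each step up in filtering quality multiplies the work each texture unit has to loop through.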
the cell > 360.[QUOTE="ConsoleCounsla_"][QUOTE="balfe1990"]
Seriously...you're unbelievable. If anything, it's probably due to RAM constraints and guess what? PS3 has less of that stuff than the 360...So...
LOL.
balfe1990
The Cell > the universe.
Yeah, ok.....
Personally I think AF is an effect which has high graphics power/effect ratio, meaning for consoles to use it would be a bit of a waste when they should be aiming at resolution, framerate and AA. It's more of a premium PC effect.gamebreakerz__
Not really. AA is overrated while AF could easily be used more. I mean the blurriness of road textures in Forza 4 and GT 5 would easily be reduced with a higher AF level.
Personally I think AF is an effect which has high graphics power/effect ratio, meaning for consoles to use it would be a bit of a waste when they should be aiming at resolution, framerate and AA. It's more of a premium PC effect.gamebreakerz__
Examples:
AMD Radeon HD 5870 has 80 texture address processors and 80 FP32 texture filtering units, together with 320 texture samplers (4x 80).
AMD Radeon HD 5770 has 40 texture address processors and 40 FP32 texture filtering units, together with 160 texture samplers (4x 40).
AMD Radeon HD 5670 has 20 texture address processors and 20 FP32 texture filtering units, together with 80 texture samplers (4x 20).
AMD Xenos has 16 texture fetch units (filtered texture units) and 16 vertex fetch units (unfiltered / point sample units).
The PC has more hardware and higher clock speeds, e.g. my mobile Radeon HD 5730M is clocked at 650MHz while AMD Xenos is clocked at 500MHz.
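A quick back-of-the-envelope comparison of peak bilinear texel rates makes the gap concrete. The sketch below just multiplies filtering units by clock; the Xenos figures are from the post above, while the HD 5770 clock is an assumed ~850MHz reference value rather than something stated in this thread:
[code]
# Peak bilinear texel rate ~= filtering units * clock (one sample per unit per clock).
# Xenos figures come from the post above; the HD 5770 clock is an assumed reference value.

def peak_texel_rate_gtexels(filter_units: int, clock_mhz: int) -> float:
    return filter_units * clock_mhz * 1e6 / 1e9

xenos  = peak_texel_rate_gtexels(16, 500)   # 16 units @ 500MHz
hd5770 = peak_texel_rate_gtexels(40, 850)   # 40 units @ ~850MHz (assumed)

print(f"Xenos  : {xenos:.1f} Gtexels/s")
print(f"HD 5770: {hd5770:.1f} Gtexels/s ({hd5770 / xenos:.1f}x)")
[/code]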
For the weak console hardware every little bit of performance is important. Let's not forget that devs should consider other fundamental things like resolution, framerate, antialiasing, polygon budget etc.
Furthermore, the textures are relatively low resolution on consoles, because of the small amount of RAM (depends on the game of course), so no amount of AF will make them crisp.
For the weak console hardware every little bit of performance is important. Let's not forget that devs should consider other fundamental things like resolution, framerate, antialiasing, polygon budget etc.
Furthermore, the textures are relatively low resolution on consoles, because of the small amount of RAM (depends on the game of course), so no amount of AF will make them crisp.
KiZZo1
4x AA and 8x AF should be the standard imo. I believe MS wanted that to become the standard to run DX10 games back then, but it didn't happen.
[QUOTE="gamebreakerz__"]Personally I think AF is an effect which has high graphics power/effect ratio, meaning for consoles to use it would be a bit of a waste when they should be aiming at resolution, framerate and AA. It's more of a premium PC effect.nameless12345
Not really. AA is overrated while AF could easily be used more. I mean the blurriness of road textures in Forza 4 and GT 5 would easily be reduced with a higher AF level.
You'll be adding additional load on the texture filtering units. From the wiki, NVIDIA RSX has the following:
The problem with RSX or GF7 is the lack of a decoupled texture unit and shader unit design, i.e. the shader units stall.
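To illustrate why that coupling hurts, here's a deliberately crude timeline model comparing a coupled design, where the shader pipe sits idle during its own multi-cycle fetch, with a decoupled one that can keep shading other pixels while fetches are in flight. All cycle counts are made-up assumptions, not real hardware figures:
[code]
# Crude stall model: coupled texture/shader units (GF7/RSX-style) vs decoupled.
# All numbers are illustrative assumptions, not real hardware figures.

ALU_CYCLES_PER_PIXEL = 4    # assumed shader work per pixel
FETCH_CYCLES_16X_AF  = 32   # assumed TMU cycles for one 16x aniso fetch
PIXELS               = 8

# Coupled: each pixel's shader work waits for its own fetch to finish.
coupled = PIXELS * (FETCH_CYCLES_16X_AF + ALU_CYCLES_PER_PIXEL)

# Decoupled: fetches and ALU work for different pixels overlap, so the total
# is bounded by whichever resource is the bottleneck.
decoupled = max(PIXELS * FETCH_CYCLES_16X_AF, PIXELS * ALU_CYCLES_PER_PIXEL)

print(f"coupled  : {coupled} cycles")
print(f"decoupled: {decoupled} cycles")
[/code]
Turning up AF lengthens every fetch, so on a coupled design the stalls grow right along with it.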
[QUOTE="gamebreakerz__"]Personally I think AF is an effect which has high graphics power/effect ratio, meaning for consoles to use it would be a bit of a waste when they should be aiming at resolution, framerate and AA. It's more of a premium PC effect.nameless12345
Not really. AA is overrated while AF could easily be used more. I mean the blurriness of road textures in Forza 4 and GT 5 would easily be reduced with a higher AF level.
To me AF is very important, I always leave it on. People are forgetting that the PC DX9 cards like the nvidia 7800 and ATI 1900 series had a huge performance drop with AF as well, the consoles are essentially the same hardware.
It's basically the nvidia 8800 which revolutionised AF performance to a level where it wasn't even a question anymore whether someone would disable it because there was almost no performance gain. And hence 16xAF basically became a standard for PC gaming and now we take it for granted, but we gotta remember how piss-poor hardware the consoles have.
[QUOTE="gamebreakerz__"]Personally I think AF is an effect which has high graphics power/effect ratio, meaning for consoles to use it would be a bit of a waste when they should be aiming at resolution, framerate and AA. It's more of a premium PC effect.nameless12345
Not really. AA is overrated while AF could easily be used more. I mean the blurriness of road textures in Forza 4 and GT 5 would easily be reduced with a higher AF level.
If you want the best picture quality, AA isn't overrated.[QUOTE="mitu123"]
Because consoles are weak or devs are lazy, I mean, AF is the least demanding graphical option, and I love it on my PC games.
16xAF FTW.
KillerJuan77
I think they are lazy, it's barely demanding.
Calling devs lazy is a lazy explanation for the problem. You are just sitting in an armchair and judging. Who is lazy here? Calling devs lazy is a lazy explanation for the problem. You are just sitting in an armchair and judging. Who is lazy here? hensothor
Yeah, you're on System Wars, I think everyone is like that, including you (and considering that Shadows Of The Damned, Castlevania: Lords Of Shadow, Burnout: Paradise, Halo: Reach and even Command And Conquer 3 use a decent amount of AF, it probably isn't demanding at all), get over yourself.
And by the way, I can enable 16xAF on Half-Life 2 on a very poor rig (7800 GT, AMD Phenom II X3 @ 2.6GHz and 256MB of RAM) with more or less high settings (no reflections, no AA, medium textures and medium shadows) and still achieve 30-60 FPS at 720p, so it's not that heavy on the RAM; with another 256MB stick I got 4-6 FPS more.
This is going to make the next batch of consoles interesting. People are forgetting that the PC DX9 cards like the nvidia 7800 and ATI 1900 series had a huge performance drop with AF as well, the consoles are essentially the same hardware.
It's basically the nvidia 8800 which revolutionised AF performance to a level where it wasn't even a question anymore whether someone would disable it because there was almost no performance gain. And hence 16xAF basically became a standard for PC gaming and now we take it for granted, but we gotta remember how piss-poor hardware the consoles have.
Gambler_3
This is going to make the next batch of consoles interesting. Watch them do sub-HD and 30fps anyways. [QUOTE="Gambler_3"]
People are forgetting that the PC DX9 cards like the nvidia 7800 and ATI 1900 series had a huge performance drop with AF as well, the consoles are essentially the same hardware.
It's basically the nvidia 8800 which revolutionised AF performance to a level where it wasn't even a question anymore whether someone would disable it because there was almost no performance gain. And hence 16xAF basically became a standard for PC gaming and now we take it for granted, but we gotta remember how piss-poor hardware the consoles have.
mitu123
[QUOTE="mitu123"]This is going to make the next batch of consoles interesting. watch them do sub HD and 30fps anyways But with more AA most likely, Alan Wake is sub HD but does 4xAA and God of War 3 is 720p and does MLAA fine.[QUOTE="Gambler_3"]
People are forgetting that the PC DX9 cards like the nvidia 7800 and ATI 1900 series had a huge performance drop with AF as well, the consoles are essentially the same hardware.
It's basically the nvidia 8800 which revolutionised AF performance to a level where it wasn't even a question anymore whether someone would disable it because there was almost no performance gain. And hence 16xAF basically became a standard for PC gaming and now we take it for granted, but we gotta remember how piss-poor hardware the consoles have.
wis3boi
[QUOTE="ShadowDeathX"]Because consoles games are so linear that they don't need it. :Parto1223
This made me laugh a bit.
I have asked myself this same question many times. I have had it at 16x for I don't know how many years and have never seen it have any negative effect on my framerate. I know the consoles are weak, but this just seems worth it to give something up for. It does so much for so little.
I have also asked myself this. I can only guess it is RAM dependent.
People are forgetting that the PC DX9 cards like the nvidia 7800 and ATI 1900 series had a huge performance drop with AF as well, the consoles are essentially the same hardware.
It's basically the nvidia 8800 which revolutionised AF performance to a level where it wasn't even a question anymore whether someone would disable it because there was almost no performance gain. And hence 16xAF basically became a standard for PC gaming and now we take it for granted, but we gotta remember how piss-poor hardware the consoles have.
Gambler_3
This makes sense. I remember back in the day that AF had a hit on performance and I usually had to lower it to trilinear or bilinear filtering, but I also had crappy hardware so I always thought that was the problem.