Not likely. There will be considerably fewer, though. Several current-gen console games have such bad aliasing it's almost unbelievable. And even at higher resolutions there are still jaggies; they're just smaller. If your eyes are used to it, though, and jaggies really bug you, you'll still see them. I know people who game at 1600p and still say they see enough jaggies to get distracted with anything less than 4-8x MSAA.
Hopefully, but we already know that "next gen" KZSF will be plagued by jaggies. :(
Maybe Guerrilla just doesn't know how to utilise dat raaam.
There will still be some jaggies. I hope most devs focus on 720p/60fps anyway (most people probably don't even have 1080p TVs).
Who cares, really? If you're sitting away from the TV you won't notice jaggies unless you're playing Zelda: Twilight Princess.
[QUOTE="SuperNovaftw"]Who cares, really? If you're sitting away from the TV you won't notice jaggies unless you're playing Zelda: Twilight Princess.[/QUOTE]
In some games they're really noticeable. Gears of War 3 and Uncharted 3 *shudders*
The jump from 720p to 1080p will already lessen the jaggies a lot, but there will definitely be visible ones unless they come up with a new AA method. High levels of MSAA will not be feasible for consoles when they run games at 1080p. Most likely many will use post-processing AA like FXAA, which blurs the whole picture.
I don't know if they could implement TXAA. TXAA gets rid of most jaggies even in games that use deferred rendering.
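Rough back-of-envelope on why high MSAA levels hurt at 1080p. This is just an assumed 32-bit colour + 32-bit depth target, ignoring bandwidth and the several extra G-buffer targets a deferred renderer would multiply it by:
[code]
# Crude render-target memory estimate for MSAA (assumed 4B colour + 4B depth per sample).
def msaa_target_mb(width, height, samples, bytes_per_sample=8):
    return width * height * samples * bytes_per_sample / (1024 * 1024)

for name, (w, h) in [("720p", (1280, 720)), ("1080p", (1920, 1080))]:
    for s in (1, 2, 4, 8):
        print(f"{name} {s}x MSAA: {msaa_target_mb(w, h, s):.0f} MB")
[/code]
Multiply that across the render targets of a deferred G-buffer and it's easy to see why everyone reaches for post-process AA instead.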
The only time I've seen a completely jaggie-free game is when running older games with full-screen anti-aliasing, or using Nvidia's TXAA.

Playing through Assgrabber's Creed 3 on PC with TXAA on was a revelation.

I think you can come close with MSAA or SMAA and a post-processing filter on top, but it's the temporal issues and the texture aliasing that still harbor the dreaded jaggie. Only TXAA and FSAA, as far as I know, can truly get rid of it completely.

That being said, FXAA, or a mixture of FXAA and MSAA, does a great job; it's certainly going to be a lot better than what consoles can do now.

Welcome to 2007, consolites ;) The water is warm (and less jaggie).
Oh btw, even the best PC games at the highest resolutions suffer from jaggies. It's not a "console" problem.
The only jaggie-free game I've ever seen was Half-Life 2. They had turned off the in-game AA and were using the SweetFX injector AA and AMD's proprietary AA solution. The results were amazing.
[QUOTE="Wasdie"]Oh btw, even the best PC games at the highest resolutions suffer from jaggies. It's not a "console" problem. The only jaggie-free game I've ever seen was Half-Life 2. They had turned off the in-game AA and were using the SweetFX injector AA and AMD's proprietary AA solution. The results were amazing.[/QUOTE]
It's actually a problem with our display devices. So long as we use square pixels, we shall have jaggies.

Well, at least until the resolution is high enough to allow a smooth line to look smooth to our eyes.

I know that at 2560x1440, standing about 4 feet from my monitor, I can still notice jaggies. Especially when moving. Temporal aliasing makes the damn things DANCE.

It's the only reason I forced myself to finish Assgrabber's Creed 3: no jaggies made that game look like I was watching a play, rather than a game.
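TXAA itself is Nvidia's black box, but the temporal half of the idea is roughly this: jitter the camera a tiny bit each frame and blend the new frame into an accumulated history. A toy sketch of that accumulation step (no motion vectors, no neighbourhood clamping, so nothing like the real shader):
[code]
import numpy as np

def temporal_resolve(current, history, blend=0.1):
    # Keep most of the accumulated history, mix in a little of the new jittered frame.
    # Real TAA/TXAA also reprojects history with motion vectors and clamps it
    # against the current frame's neighbourhood to avoid ghosting.
    return blend * current + (1.0 - blend) * history

# Accumulating many jittered renders of the same edge converges towards the
# true coverage fraction, which is why edges stop crawling and shimmering.
rng = np.random.default_rng(0)
history = np.zeros((4, 4))
for _ in range(64):
    jittered = (rng.random((4, 4)) < 0.3).astype(float)  # toy "render": edge covers ~30% of each pixel
    history = temporal_resolve(jittered, history)
print(history.round(2))  # values settle around ~0.3 instead of flickering between 0 and 1
[/code]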
[QUOTE="Ben-Buja"]The jump from 720p to 1080p will already lessen the jaggies a lot, but there will definitely be visible ones unless they come up with a new AA method. High levels of MSAA will not be feasible for consoles when they run games at 1080p. Most likely many will use post-processing AA like FXAA, which blurs the whole picture. I don't know if they could implement TXAA. TXAA gets rid of most jaggies even in games that use deferred rendering.[/QUOTE]
Some FXAA algorithms are actually fantastic.
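For anyone curious what these post-process filters are actually doing: find pixels sitting on strong luma edges and soften only those. This is nowhere near the real FXAA 3.11 shader, just the gist, written over a plain numpy image (the threshold and the box blur are my stand-ins):
[code]
import numpy as np

def toy_post_aa(img, edge_threshold=0.1):
    # img: float RGB array in [0, 1], shape (H, W, 3). Not real FXAA - just the idea.
    luma = img @ np.array([0.299, 0.587, 0.114])          # per-pixel luminance
    pad = np.pad(luma, 1, mode="edge")
    contrast = np.maximum.reduce([                         # contrast vs. the 4 neighbours
        np.abs(pad[1:-1, 1:-1] - pad[:-2, 1:-1]),
        np.abs(pad[1:-1, 1:-1] - pad[2:, 1:-1]),
        np.abs(pad[1:-1, 1:-1] - pad[1:-1, :-2]),
        np.abs(pad[1:-1, 1:-1] - pad[1:-1, 2:]),
    ])
    edge = contrast > edge_threshold                       # only these pixels get softened
    padded = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = sum(padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                  for dy in range(3) for dx in range(3)) / 9.0
    out = img.copy()
    out[edge] = blurred[edge]
    return out
[/code]
The real shaders are much smarter about blur direction and sub-pixel offsets, which is why good FXAA/SMAA implementations don't just smear the whole screen.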
[QUOTE="Wasdie"]
Oh btw,, even the best PC games at the highest resolutions suffer from jaggies. It's not a "console" problem.
The only jaggie-free game I've ever seen was Half-Life 2. They had turned off the in-game AA and were using the SweetFX Injector AA and AMD's proprietary AA solution. The result were amazing.
Kinthalis
Â
It's actually a problem with our display devices. Â So long as we use square pixels, we shall have jaggies.
Square pixels are fine. We just need higher DPI. 4k solves it. Render something at native 4k, good by jaggies.
However, rendering 9+ million pixels each scene is... difficult.
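For scale, just counting pixels per frame, which is a crude proxy for fill and shading cost ("4K" here means 3840x2160, so call it 8-9 million):
[code]
# Raw pixels per frame, relative to 720p.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "4K UHD": (3840, 2160)}
base = 1280 * 720
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels ({w * h / base:.2f}x the pixels of 720p)")
[/code]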
[QUOTE="Kinthalis"]
[QUOTE="Wasdie"]
Oh btw,, even the best PC games at the highest resolutions suffer from jaggies. It's not a "console" problem.
The only jaggie-free game I've ever seen was Half-Life 2. They had turned off the in-game AA and were using the SweetFX Injector AA and AMD's proprietary AA solution. The result were amazing.
Wasdie
Â
It's actually a problem with our display devices. Â So long as we use square pixels, we shall have jaggies.
Square pixels are fine. We just need higher DPI. 4k solves it. Render something at native 4k, good by jaggies.
However, rendering 9+ million pixels each scene is... difficult.
Â
Well, it's a measure of pixel density, as you said. DPI. 4K will certianly do away with aliansing... on a certian screen size or less.
Â
But ont he 80 inch + screens that that resolution is going to be used on? Â I doubt it. you'd probably still have aliasing.
[QUOTE="Kinthalis"]
[QUOTE="Wasdie"]
Oh btw,, even the best PC games at the highest resolutions suffer from jaggies. It's not a "console" problem.
The only jaggie-free game I've ever seen was Half-Life 2. They had turned off the in-game AA and were using the SweetFX Injector AA and AMD's proprietary AA solution. The result were amazing.
Wasdie
Â
It's actually a problem with our display devices. Â So long as we use square pixels, we shall have jaggies.
Square pixels are fine. We just need higher DPI. 4k solves it. Render something at native 4k, good by jaggies.
However, rendering 9+ million pixels each scene is... difficult.
DPI is the important part, not resolution, you can have "retina" display on less than 4k resolution.I hope most devs focus on 720p/ 60fps anyway (most people probably don't even have 1080p TVs).
TheEpicGoat
NO. 720 kinda sucks and I have large 1080p TV. Would rather have the 30FS than crap resolution.
[QUOTE="Wasdie"]
[QUOTE="Kinthalis"]
Â
It's actually a problem with our display devices. Â So long as we use square pixels, we shall have jaggies.
Kinthalis
Square pixels are fine. We just need higher DPI. 4k solves it. Render something at native 4k, good by jaggies.
However, rendering 9+ million pixels each scene is... difficult.
Â
Well, it's a measure of pixel density, as you said. DPI. 4K will certianly do away with aliansing... on a certian screen size or less.
Â
But ont he 80 inch + screens that that resolution is going to be used on? Â I doubt it. you'd probably still have aliasing.
Remember that the level of detail you can appreciate depends also on the viewing distance.
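That's easy to put rough numbers on. What matters is how many pixels land in each degree of your field of view; ~60 pixels per degree is the usual hand-wavy "retina" threshold people quote. The exact cutoff is debatable and the screen widths and distances below are my own assumptions, but the trend is the point:
[code]
import math

def pixels_per_degree(horizontal_pixels, screen_width_inches, distance_inches):
    # Approximate pixels per degree of visual angle at the centre of the screen.
    pixels_per_inch = horizontal_pixels / screen_width_inches
    inches_per_degree = 2 * distance_inches * math.tan(math.radians(0.5))
    return pixels_per_inch * inches_per_degree

# Assumed setups: a 55" 16:9 TV is ~48" wide; couch at 8 ft, desk monitor at 2 ft.
print("1080p on a 55\" TV at 8 ft:   ", round(pixels_per_degree(1920, 48, 96)))
print("4K    on a 55\" TV at 8 ft:   ", round(pixels_per_degree(3840, 48, 96)))
print("1440p on a 27\" monitor at 2 ft:", round(pixels_per_degree(2560, 23.5, 24)))
[/code]
Which lines up with what was said earlier in the thread: 1440p at desk distance actually puts fewer pixels per degree in front of your eyes than 1080p on a TV across the room, so the jaggies are more obvious, not less.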
[QUOTE="Tommyjeeb"]With next gen looming upon us I was wondering: will we see an end to aliasing? PC gamers have had this luxury for some time now and it's something I'm yet to see. It is always top of my wish list with new consoles but never happens! You would think console developers would make this a priority, as it is a major factor in visual quality in my opinion. This is one aspect of graphics which is sorely overlooked.[/QUOTE]
Who can say... there may be another Big Rigs in the works for 2014.
[QUOTE="Riadon2"]Is this possible on next gen consoles? please elaborate.The only way to completely eliminate jaggies (1080p, 1440p) is SGSSAA + FXAA (with sharpening filter), and higher levels of TXAA.
Tommyjeeb
TXAA is not possible unless NVidia plays nice for once and allows AMD cards to make use of it. If this happened, it could be used in some games for a large performance hit.
SGSSAA is technically possible, but it is VERY demanding (at 4x SGSSAA). It could be used in less graphically advanced games, but those games usually settle for higher resolutions and framerates over better AA methods.
Most games will use post-processing methods like FXAA and SMAA, which are not bad modes of AA per-say, but when compared to solutions designed to completely eliminate aliasing, there is a lot to be desired.
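To put "VERY demanding" in perspective: sparse-grid supersampling shades every pixel N times, so shading cost scales roughly linearly with the sample count. A deliberately crude model that ignores bandwidth, geometry and everything else:
[code]
# Relative shading work at 1080p under a naive linear cost model (assumption, not a benchmark).
def relative_shading_cost(samples_per_pixel, width=1920, height=1080):
    baseline = 1920 * 1080                    # 1080p, one shaded sample per pixel
    return width * height * samples_per_pixel / baseline

for label, samples in [("FXAA/SMAA (1 shaded sample + cheap filter pass)", 1),
                       ("2x SGSSAA", 2),
                       ("4x SGSSAA", 4)]:
    print(f"{label}: ~{relative_shading_cost(samples):.0f}x shading work")
[/code]
Which is why the post-process filters keep winning on consoles: the filter is a single cheap full-screen pass, while 4x SGSSAA asks for roughly four times the shading work every frame.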
[QUOTE="Wasdie"]Most likely the average game will have very few jaggies. 1080p + some decent AA is now completely possible on a console.[/QUOTE]
Zzzz, 1080p was possible this generation; developers just chose to ignore it so they could have better graphics at 720p. There's no reason why they wouldn't do the same thing again - 1080p requires a lot more resources than 720p. Why would they ever focus on the 1080p niche when 720p is the most common resolution?
[QUOTE="Wasdie"]Zzzz 1080p was possible this generation, developers just chose the ignore it so they could have better graphics at 720p. There's no reason why they wouldn't do the same thing again - 1080p requires a lot more resources than 720p. Why would they ever focus on the 1080p niche when 720p is the most common resolution? What kind of logic is that? Consoles focussed on "HD" when SD TV's were the most common.Most likely the average game will have very few jaggies. 1080p + some decent AA is now completely possible on a console.
KHAndAnime
F*ck gameplay, art-style, music and sound design, or trying to advance storytelling in video games, WE NEED TO MAKE SURE JAGGIES ARE NON-EXISTENT.
/SW priorities.
[QUOTE="DJ-Lafleur"]F*ck gameplay, art-style, music and sound design, or trying to advance storytelling in video games, WE NEED TO MAKE SURE JAGGIES ARE NON-EXISTENT. /SW priorities.[/QUOTE]
>Implying that implementing AA is a difficult task that will take away from these things.
[QUOTE="DJ-Lafleur"]
F*ck gameplay, art-style, music and sound design, or trying to advance story-telling in video games, WE NEED TO MAKE SURE JAGGIES ARE NON-EXISTANT.
/SW priorities.
Riadon2
>Implying that implementing AA is a difficult task that will take away from these things.
Also implying that you cna't tackle more than one issue at the same time - specially considering that the fields of knowledge involved barely touch.
[QUOTE="KHAndAnime"][QUOTE="Wasdie"]Zzzz 1080p was possible this generation, developers just chose the ignore it so they could have better graphics at 720p. There's no reason why they wouldn't do the same thing again - 1080p requires a lot more resources than 720p. Why would they ever focus on the 1080p niche when 720p is the most common resolution? What kind of logic is that? Consoles focussed on "HD" when SD TV's were the most common. ? Where have you been the last decade, SDTVs have been on their way out forever. 720p on the other hand isn't going anywhere. Why wasn't 1080p already the main target? And once again, what advantage do game developers gain by targeting the 1080p niche? You can't think of any because there is none. Game developers aren't trying to sell 1080P TVs, they are trying to sell their games. Microsoft and Sony bragged at the possibility and likelihood of 1080P games this generation, yet developers thought it would be most effective to optimize their games for the most common display resolution so they could squeeze the most visual juice out of the hardware. This is just one of those cases where history will repeat itself. The casual gamer will be hard pressed to spot the difference between 720p and 1080p. Once developers start targeting 720p native, it will be plainly obvious that the 720p native games have the best graphics and then all developers, just like this generation, will be competing for the lowest common denominator.Most likely the average game will have very few jaggies. 1080p + some decent AA is now completely possible on a console.
SaltyMeatballs
[QUOTE="Riadon2"]
[QUOTE="DJ-Lafleur"]
F*ck gameplay, art-style, music and sound design, or trying to advance story-telling in video games, WE NEED TO MAKE SURE JAGGIES ARE NON-EXISTANT.
/SW priorities.
Kinthalis
>Implying that implementing AA is a difficult task that will take away from these things.
Also implying that you cna't tackle more than one issue at the same time - specially considering that the fields of knowledge involved barely touch.
I wasn't implying that a company can't multitask. Twas merely a sarcastic statement basically saying"who f*cking cares?"
I don't even pay attention to such little details like jaggies.
[QUOTE="Wasdie"]Zzzz 1080p was possible this generation, developers just chose the ignore it so they could have better graphics at 720p. There's no reason why they wouldn't do the same thing again - 1080p requires a lot more resources than 720p. Why would they ever focus on the 1080p niche when 720p is the most common resolution?Most likely the average game will have very few jaggies. 1080p + some decent AA is now completely possible on a console.
KHAndAnime
I don't many people that have 720p TVs. 1080p is CLEARLY the standard now. 720p would be a joke if that is even a significant minority of the games.
[QUOTE="SaltyMeatballs"][QUOTE="KHAndAnime"] Zzzz 1080p was possible this generation, developers just chose the ignore it so they could have better graphics at 720p. There's no reason why they wouldn't do the same thing again - 1080p requires a lot more resources than 720p. Why would they ever focus on the 1080p niche when 720p is the most common resolution?KHAndAnimeWhat kind of logic is that? Consoles focussed on "HD" when SD TV's were the most common. ? Where have you been the last decade, SDTVs have been on their way out forever. 720p on the other hand isn't going anywhere. Why wasn't 1080p already the main target? And once again, what advantage do game developers gain by targeting the 1080p niche? You can't think of any because there is none. Game developers aren't trying to sell 1080P TVs, they are trying to sell their games. Microsoft and Sony bragged at the possibility and likelihood of 1080P games this generation, yet developers thought it would be most effective to optimize their games for the most common display resolution so they could squeeze the most visual juice out of the hardware. This is just one of those cases where history will repeat itself. The casual gamer will be hard pressed to spot the difference between 720p and 1080p. Once developers start targeting 720p native, it will be plainly obvious that the 720p native games have the best graphics and then all developers, just like this generation, will be competing for the lowest common denominator.
PS4 will be 1080p. If you own crap like a Wii U or Nextbox you might be stuck at 720p...