I think I'd rather have simpler levels of detail and 60 fps than 30 fps and better-looking detail. I can't imagine how much smoother everything in Assassin's Creed would've been if it had been running at 60 fps.
Well, I could just play it on my PC... but I don't wanna! hehe.
We all know that... but that doesn't mean the 60FPS game is always better. NG2 was 60fps; it didn't keep a steady framerate, but for the most part it was at 60.
Yes, some of the best games this gen are 60 FPS (not counting PC, since almost all PC games can run at 60 FPS):
Mario Galaxy
Metroid Prime 3
Brawl
COD 4
But not all of them are the best games this gen either.
Crysis can't be run on Ultra High at 60 FPS unless the PC is beefy, and even then it's a workout for the PC.
Halo 3 is just as good an online experience as the console COD 4, if not superior.
Bioshock on consoles was only 30FPS.
Ninja Gaiden 2
Metal Gear Solid 4
Grand Theft Auto 4
Do I want 60 FPS as the standard? Yes.
Do I think 60 FPS guarantees the best or top quality? No.
jg4xchamp
You really didn't need all those words to say that. Yes, 60fps makes a major difference, and 80fps looks better than 60fps. All these things are facts.
noobswillbepwnd
Don't kid yourself if you think otherwise. Play Call of Duty 4 or any other game at 60fps and you'll see how much sweeter a game is at that framerate.
I used to play Half-Life 2 back when it first came out at about 25-30 fps; when I updated my graphics card and played Episode Two at 60, I was amazed at how smoothly everything flowed.
I seriously hope developers out there start optimizing games for the PS3 and 360 to run at 60fps, because it's just so damn good at that rate.
Don't get me wrong, 30fps is well and good, but I think 60fps should be the sweet spot of gaming; anything above that, the human eye can't detect as far as I've seen.
Movies run at 24fps, so maybe we're used to 30fps for games, but once you go 60fps you'll wish every game ran that nicely.
I don't have the link, but I once read that COD4 sacrificed some graphical quality to make sure the game runs at 60fps on consoles, and I for one (being the graphics whore I am) don't mind at all. If a little lighting and texture detail has to go to make games run that smooth, then so be it.
I hope next gen they learn to make 60fps the norm and not a luxury.
If anyone doesn't believe me, go pop in a 30fps shooter and then pop in COD4, and notice the difference.
[QUOTE="ModernTimes"][QUOTE="vashkey"][QUOTE="user_nat"][QUOTE="wmc540"]Yeah, the difference is pretty obvious from Halo 3 to COD 4.noobswillbepwnd
I actually don't really notice the difference in framerate between to 2..
Yea, actually, Im pretty sure Halo 3 runs at 60.
Nope, it's 30FPS (sorry no link)
It would be a shallow victory if a console boasted that it could run games at 120 FPS... our brain/vision can only perceive 60 FPS max, I believe, so if a game ran above that, we couldn't tell the difference between it and 60 FPS with the naked eye.
[QUOTE="killab2oo5"]Playing my console I can never tell the difference between 30fps and 60fps, but I can easily tell the difference between framerates on my PC. Maybe my EDTV and monitor differ?[/QUOTE]
I can sort of tell the difference between 60 FPS and 30 FPS when I play consoles, and I'm playing on an SDTV as well. In a 60 FPS game the character just moves faster, which is great in F-Zero GX and F-Zero X. Metroid Prime and Metroid Prime 2: Echoes also ran at 60 FPS, which made the difference easy to tell because your character moved a lot faster than in other games of that genre on that console.
[QUOTE="KGB32"]It would be a shallow victory if a console boasted that it could run games at 120 FPS... our brain/vision can only perceive 60 FPS max, so we couldn't tell the difference with the naked eye.[/QUOTE]
If the difference is great enough, the human eye can perceive 120fps. An Air Force experiment showed people could recognize an image flashed for a mere 1/200th of a second. From what I've noticed, the greater the difference from one frame to the next, the easier it is to perceive.
60fps is adopted as a de facto standard because of TV standards. Consoles can't run games faster than 60fps because TVs can't display faster than 60fps (and that's in 60Hz regions like North America -- other regions may have a 50Hz limit, dating back to standards that at least partially tracked the regional electric current frequency).
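To put rough numbers on the TV point: a 60Hz display shows a new frame every 16.7 ms, so a game either finishes each frame inside that budget (60fps) or ends up waiting for every other refresh (30fps). A quick sketch of the arithmetic, nothing console-specific about it:

```python
# Frame budget per framerate: just 1000 ms divided by the rate.
for fps in (60, 30, 24):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 60 fps -> 16.7 ms per frame
# 30 fps -> 33.3 ms per frame
# 24 fps -> 41.7 ms per frame
```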
[QUOTE="noobswillbepwnd"]Don't kid yourself if you think otherwise. Play Call of Duty 4 or any other game at 60fps and you'll see how much sweeter a game is at that framerate. [...][/QUOTE]
Next-generation LCD TVs are going to be 120Hz starting this fall. If developers don't hit 60fps, they're two refresh steps behind the TVs' capabilities.
[QUOTE="AnnoyedDragon"]I imagine it is easier for PC gamers to tell the difference because we can directly control our frame rate. Something as simple as changing the resolution can move you between 30 and 60, allowing you to check the motion quality differences in the same game.[/QUOTE]
It can be easy for any console owner to tell the difference too; they would just have to own a PS3 and a 360, a Wii and a 360, or even just a 360. All they have to do is rent two games: the 360 versions of Madden 08 and Madden 07, or the 360 and PS3 versions of Madden 08, because the 360 version of Madden 08 was running at 60 FPS while the PS3 version (and every other version) was running at 30 FPS.
[QUOTE="KGB32"]If the difference is great enough, the human eye can perceive 120fps. An Air Force experiment showed people could recognize an image flashed for a mere 1/200th of a second. From what I've noticed, the greater the difference from one frame to the next, the easier it is to perceive.it would be a shallow victory if a console boasted that it can run games at 120 FPS....our brain/vision can only percept 60FPS max i believe, so if a game can run above that, we couldnt tell the difference between that or 60FPS with the naked eye
HuusAsking
60fps is adopted as a de facto standard because of TV standards. Consoles can't play games faster than 60fps because TVs can't display faster than 60fps (and that's in 60Hz regions like North America--other regions may have a 50Hz limit, dating back to standards that at least partially involved the standard regional electric current pulse).
It's actually pretty easy to see the difference between 60Hz and 100Hz, especially in fast-moving games. You'll see tearing around the edges of quickly moving objects, as well as full-screen tearing if you pan quickly. Not to open the gates to 'console fps hate town', but you won't really notice low frame rates as much on consoles because 1) you are playing on a television instead of a CRT monitor, 2) analog sticks prevent rapid view panning, and 3) due to slow analog movements, you won't see games paced like Quake 3 that make your refresh rate very noticeable.
[QUOTE="KGB32"]If the difference is great enough, the human eye can perceive 120fps. An Air Force experiment showed people could recognize an image flashed for a mere 1/200th of a second. From what I've noticed, the greater the difference from one frame to the next, the easier it is to perceive.it would be a shallow victory if a console boasted that it can run games at 120 FPS....our brain/vision can only percept 60FPS max i believe, so if a game can run above that, we couldnt tell the difference between that or 60FPS with the naked eye
HuusAsking
60fps is adopted as a de facto standard because of TV standards. Consoles can't play games faster than 60fps because TVs can't display faster than 60fps (and that's in 60Hz regions like North America--other regions may have a 50Hz limit, dating back to standards that at least partially involved the standard regional electric current pulse).
Ah, OK. I have heard that when you're scared or your adrenaline is up you can perceive more than 60, but I didn't know about the TV thing.
[QUOTE="Nintendo_Ownes7"]It can be easy for any console owner to tell the difference too... the 360 version of Madden 08 was running at 60 FPS while the PS3 version (and every other version) was running at 30 FPS.[/QUOTE]
I didn't say they couldn't, only that it would be easier on PC, which you pretty much backed up with your example.
[QUOTE="HuusAsking"][QUOTE="KGB32"]If the difference is great enough, the human eye can perceive 120fps. An Air Force experiment showed people could recognize an image flashed for a mere 1/200th of a second. From what I've noticed, the greater the difference from one frame to the next, the easier it is to perceive.it would be a shallow victory if a console boasted that it can run games at 120 FPS....our brain/vision can only percept 60FPS max i believe, so if a game can run above that, we couldnt tell the difference between that or 60FPS with the naked eye
Would it be more difficult to see the difference between 60 and, say, 120Hz if VSync were turned on in your game? (Turning it on should kill the tearing problem, which comes from the screen refreshing outside the VBlank interval.)
HuusAsking
Well, lots of PC gamers got the idea that v-sync off is good because you can get a higher FPS. In truth, turning v-sync off was really only meant for benchmarking purposes. Say your screen is refreshing at 60Hz but your video card is sending out 120fps: the result is a really bad bleeding effect because of how CRTs draw frames. When frame 1 is a third of the way drawn, the 2nd frame will already start being drawn below it, and the third frame below that, starting a new frame before the first one is finished.
To answer your question, if you are at 120Hz and viewing 60 FPS, depending on your eyes, you may or may not notice (though on a CRT I'd just about promise you that you would, due to pixels being instant on/off). If you were at 60Hz and getting 120fps, you'd get the aforementioned image bleeding effect.
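A tiny sketch of that effect, assuming the simplest possible model (plain Python, hypothetical numbers, no real graphics API): it just computes how far down the scan-out a new frame lands when the card finishes frames at 120fps on a 60Hz display, with and without waiting for the vertical blank. Without v-sync, every other frame arrives mid-refresh (a tear/bleed line halfway down the screen); with v-sync, every swap waits for the blank.

```python
from fractions import Fraction
from math import ceil

REFRESH_HZ = 60      # display refresh rate (Hz)
RENDER_FPS = 120     # rate at which the video card finishes frames
SCAN_TIME = Fraction(1, REFRESH_HZ)   # time to scan out one full refresh

def swap_positions(vsync, frames=8):
    """For each rendered frame, return how far down the screen the scan-out is
    when that frame starts being shown (0.0 = top of the screen, i.e. no tear)."""
    positions = []
    for i in range(frames):
        finished = Fraction(i, RENDER_FPS)            # moment the frame is ready
        if vsync:
            # the buffer swap waits for the next vertical blank (start of a refresh)
            shown = ceil(finished / SCAN_TIME) * SCAN_TIME
        else:
            shown = finished                          # swap immediately, mid-scan
        positions.append(float((shown % SCAN_TIME) / SCAN_TIME))
    return positions

print("v-sync off:", swap_positions(False))   # alternates 0.0 / 0.5 -> tear line mid-screen
print("v-sync on: ", swap_positions(True))    # always 0.0 -> every swap lands in the blank
```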
[QUOTE="Hexagon_777"]
How does one tell how many frames per second their game runs at?Cali3350
Been a PC gamer for a few years now (still play consoles too!) and its to the point i can just tell instantly. Its pretty noticeable when you know what to look for.
here is a very old school project. Half of the screen runs at 30fps while the other runs at 60. Its a simulation of this, but pretty spot on accurate.
http://www.novicee.com/edu/fps_compare.zip
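If that link ever goes dead, you can throw together a rough equivalent yourself. Here's a minimal sketch (assuming you have the pygame package installed; the window size and speeds are arbitrary): the top square is redrawn every frame at ~60fps, the bottom one only every other frame, so it effectively animates at ~30fps while covering the same distance.

```python
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 240))
clock = pygame.time.Clock()
x60 = x30 = 0
frame = 0
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    x60 = (x60 + 4) % 640            # top square: position updated every frame (~60 fps)
    if frame % 2 == 0:
        x30 = (x30 + 8) % 640        # bottom square: updated every other frame (~30 fps)
    screen.fill((0, 0, 0))
    pygame.draw.rect(screen, (255, 255, 255), (x60, 40, 40, 40))
    pygame.draw.rect(screen, (255, 255, 255), (x30, 160, 40, 40))
    pygame.display.flip()
    clock.tick(60)                   # cap the loop at 60 frames per second
    frame += 1
pygame.quit()
```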
[QUOTE="Hexagon_777"]
How does one tell how many frames per second their game runs at?shadyd1717
If you're talking about PC you can get a program like fraps and it tells you, if the game doesnt have it's own counter.
For consoles idk.
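For what it's worth, the counter a game (or a tool like Fraps) displays is conceptually very simple: count how many frames you finish each second and report it. A sketch in plain Python, where render_frame is just a hypothetical stand-in that sleeps for roughly one 60fps frame:

```python
import time

def render_frame():
    time.sleep(1.0 / 60.0)          # stand-in for the game's real per-frame work

def game_loop():
    frames = 0
    last_report = time.perf_counter()
    while True:
        render_frame()
        frames += 1
        now = time.perf_counter()
        if now - last_report >= 1.0:     # once a second...
            print(f"{frames} fps")       # ...report how many frames were finished
            frames = 0
            last_report = now

game_loop()
```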
[QUOTE="HuusAsking"]Would it be more difficult to see the difference between 60 and say 120Hz if the VSync were turned on in your game (turning it on should kill the tearing problem, which comes from a screen refreshing outside the VBlank interval).juno84
But if you had a 120Hz monitor, would you be able to tell the difference between a scene rendered at 60fps and one rendered at 120fps? I'd say you can, but just barely, and the odds of noticing become greater the faster the scene moves.