[QUOTE="seanmcloughlin"]
[QUOTE="parkurtommo"] Have you ever seen a movie in 60 fps? I have, it's horrible. And I know what you feel like saying, "Video games aren't movies". But movies feel cinematic because of the standard 24 fps (widescreen resolution being another factor), it makes things feel.... cooler! It adds more impact to each frame, leaves more of an impression.
Â
EDIT: But only in games where you would want it to feel cinematic, so LA Noire, Max Payne 3, Tomb Raider, Hitman Absolution, those sort of titles (mainly console ports).Â
[/QUOTE]
The fact that you control the camera and character throws all of what you said out the window. You're in a 3D environment, running around and controlling the camera yourself; you don't have expertly crafted shots and transitions like a movie does.
Why wouldn't they be locked at 24 fps instead of 30, then, if 24 is where movies are at?
[/QUOTE]
No, instead you have expertly crafted environments and set pieces, and if we're talking about the games I just listed, they also have cut scenes that DO have good shots and transitions. Movies can get away with 24 frames per second because each frame has motion blur applied; games do not (unless they have it enabled, e.g. Crysis 2, whose console version did indeed run at an acceptable 24 fps).

EDIT: I did some research, and movie theater projectors run at a different refresh rate (48/72 Hz), whereas monitors are usually 60 Hz. 30 fps is an even factor of 60.
But my point about motion blur still stands.
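To make the "even factor" point concrete, here's a rough Python sketch of a sample-and-hold display; the function name and the model are just my own illustration, not anything pulled from a real engine or driver:

[code]
# Rough sketch: how many display refreshes each source frame stays on screen,
# assuming a simple sample-and-hold display (illustrative model only).

def refreshes_per_frame(fps, refresh_hz, duration_s=1):
    """Return, for each source frame shown in `duration_s` seconds, the number
    of refresh ticks it occupies. Integer math avoids floating-point drift."""
    ticks = refresh_hz * duration_s
    frame_at_tick = [(i * fps) // refresh_hz for i in range(ticks)]  # frame index shown at tick i
    return [frame_at_tick.count(f) for f in sorted(set(frame_at_tick))]

print(refreshes_per_frame(30, 60))  # [2, 2, 2, ...]  every frame held exactly 2 refreshes
print(refreshes_per_frame(24, 60))  # [3, 2, 3, 2, ...]  uneven 3:2 cadence (judder)
print(refreshes_per_frame(24, 72))  # [3, 3, 3, ...]  even again, hence 48/72 Hz projection
[/code]

So 30 fps on a 60 Hz monitor and 24 fps on a 48/72 Hz projector both give every frame the same amount of screen time, while 24 fps on a 60 Hz monitor does not.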
Also, you have to edit a lot of special effects into movies, and at 60 fps that's two and a half times as many frames to work on as at 24. 24 fps is less work and still looks good enough, though there have been 48 fps movies like The Hobbit; I've never seen it in 48 fps, so I can't say how it feels. With KB/M you really need more precision and 60 fps, but with analog sticks 30 fps is more than enough.
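And just to put numbers on the workload and responsiveness side of that, here's the plain arithmetic (nothing sourced from any studio's pipeline):

[code]
# Per-frame time budget and total frame count at common rates (simple arithmetic).
for fps in (24, 30, 48, 60):
    frame_time_ms = 1000 / fps
    frames_in_2h_movie = fps * 2 * 60 * 60
    print(f"{fps} fps: {frame_time_ms:.1f} ms per frame, "
          f"{frames_in_2h_movie:,} frames in a 2-hour film")

# 24 fps -> 41.7 ms per frame, 172,800 frames
# 60 fps -> 16.7 ms per frame, 432,000 frames (2.5x the effects work),
# but also roughly half the frame-to-frame delay you feel on KB/M.
[/code]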