[QUOTE="MrStreetFighter"]I think that has to do something with the way a TV Monitor "refreshes" its screen. You know, the hertz rating it has?snover2009
The TV has nothing to do with frames per second (fps).
Games consist of rendered images (stills, or "pics") shown one after another to create the illusion of motion. One image is rendered and displayed; then, whenever the player gives input or something in the environment changes, a new image must be rendered.
Frames per second is the number of stills that are shown in one second of gameplay.
-60 fps is the benchmark: at that rate you stop noticing the individual still images.
-30 fps is the minimum for decency; any lower and you can tell there are separate images shown one after the other.
-With a powerful enough PC, frame rates can reach 200-300 fps, except in Crysis, which stays mainly between 25 and 35 fps on MAXIMUM settings.
[/QUOTE]
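To put concrete numbers on the fps figures above, here's a quick, purely illustrative sketch in Python (frame_time_ms is a made-up helper, not from any real engine) of how a frame rate translates into how long each still sits on screen.

[CODE]
# How long each rendered still stays on screen at a steady frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (25, 30, 45, 60, 120, 300):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):6.2f} ms per frame")
# 30 fps holds each image for ~33 ms; 60 fps halves that to ~17 ms,
# which is roughly where the stills stop being individually noticeable.
[/CODE]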
Yes, but what you DON'T realize is that Hz and frame rate need to be somewhat in sync for the smoothest possible image. Hz is the refresh rate, so essentially it means the number of times the TV draws a picture per second. Standard TVs are 60 Hz, which is why 60 fps is the golden standard of video games. Halving that to 30 fps means each frame is shown for two "refreshes" of the 60 Hz cycle, so every frame is on screen for an EVEN amount of time.

At something like 45 fps, the numbers no longer divide evenly: a frame can only be shown for a whole number of refreshes, so at 45 fps on a 60 Hz screen, 30 frames will be shown 1 time and 15 will be shown 2 times. An average person won't care, but in terms of purity, this image is far from pure. This is why 120 Hz is becoming so popular: 24 fps TV/movies divide unevenly into 60 Hz, but into 120 Hz it's perfect. So, I hate to say it, but Hz and frame rate are not completely separable...
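That split can be checked with a small sketch (again illustrative Python, assuming vsync and a perfectly steady frame rate; refresh_split is a hypothetical name, not a real API).

[CODE]
# With vsync, a frame must be held for a whole number of refreshes.
def refresh_split(fps: int, hz: int) -> None:
    base = hz // fps        # minimum refreshes every frame gets
    extra = hz % fps        # frames that must be held one refresh longer
    if extra == 0:
        print(f"{fps} fps on {hz} Hz: every frame held {base} refreshes (even pacing)")
    else:
        print(f"{fps} fps on {hz} Hz: {fps - extra} frames held {base} refreshes, "
              f"{extra} held {base + 1} (uneven pacing)")

refresh_split(30, 60)   # every frame x2 refreshes -> smooth
refresh_split(45, 60)   # 30 frames x1, 15 frames x2 -> judder
refresh_split(24, 60)   # 12 x2, 12 x3 -> the classic 3:2 pulldown
refresh_split(24, 120)  # every frame x5 refreshes -> even again
[/CODE]

The 24-on-60 case is the uneven film cadence (3:2 pulldown) that 120 Hz sets eliminate, since 120 divides cleanly by 24.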