Question about framerates and monitors

#1 chocobo7000
Member since 2008 • 737 Posts

Most monitors are only 60 Hz (except for the new 120 Hz ones that recently came out). So that means if your game is running above 60 FPS you can't even tell the difference, right? I've read several comments online where people claim to be able to tell the difference when a game is running higher than 60 FPS. Is this possible, and if so, how? And why do people spend so much money trying to get over 100 FPS in games if their monitors only go to 60?

#2 gameguy6700
Member since 2004 • 12197 Posts

The idea that your eyes can't see faster than 60 Hz is a myth. Rest assured, you can.

As for why people spend money for crazy high FPS, it boils down to:

1. They're not looking for 100+ FPS in their games but rather looking to get high FPS even in the most demanding games. Case in point is Crysis. It's an outlier in that even a system that can pull 100+ FPS in most games still struggles to get 60 FPS in Crysis.

2. They want to ensure that their game never dips below 60 FPS.

3. They want to use mods that improve the graphics at the expense of greatly increased hardware demands (texture mods, for example; another prime example is the Morrowind Graphics Extender, which makes a game that wasn't even difficult to run when it came out 8 years ago give Crysis a run for its money in keeping framerates up at the highest settings).

4. They use high resolutions (high being 1920x1080+, namely multi-monitor resolutions); a rough sketch of the pixel math follows this list.

5. Ego. There's a subset of hardware enthusiasts out there who are obsessed with having the most powerful systems at any cost. They don't really care about how practical their system is as long as they can show off higher benchmark numbers than anyone else.
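A minimal sketch of the point 4 arithmetic (my illustration, not from the post; it assumes GPU load scales roughly linearly with pixel count, which is only a first approximation):

```python
# Rough sketch: GPU load scales roughly with pixels rendered per frame,
# so resolution eats directly into framerate headroom.
# The linear-scaling assumption is approximate, not a measurement.
resolutions = {
    "1280x1024": 1280 * 1024,
    "1920x1080": 1920 * 1080,
    "5760x1080 (triple 1080p)": 5760 * 1080,  # multi-monitor surround
}

baseline = resolutions["1920x1080"]
for name, pixels in resolutions.items():
    print(f"{name:26} {pixels:>9,} pixels ({pixels / baseline:.2f}x the work of 1080p)")
```

A triple-monitor setup pushes 3x the pixels of a single 1080p screen, which is why the same card that gets 100+ FPS on one monitor can struggle across three.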

#3 kraken2109
Member since 2009 • 13271 Posts

Framerate isn't just what you see, it's also about how it feels. I feel a huge difference between 60 FPS and 100 FPS in a shooter.
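For what it's worth, a quick sketch of the numbers behind that feeling (my illustration, not from the post): the gap between 60 and 100 FPS is roughly 7 ms per frame.

```python
# Rough numbers: the time between frames at common framerates.
# Going from 60 to 100 FPS shaves ~6.7 ms off every frame.
for fps in (30, 60, 100, 120):
    print(f"{fps:>3} FPS -> {1000 / fps:5.1f} ms per frame")
```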

#4 swehunt
Member since 2008 • 3637 Posts

Ah, this again... ;)


I would like to point out that it's no myth that your eyes can't detect more than 60 FPS; the misinformation is about what those 60 FPS actually are.

For example: you're running a game and recording it with Fraps, and Fraps says you get an average of 60 FPS. But if you could measure on the millisecond level, you might find that the game renders 55 frames in the first half of that second and the remaining 5 frames in the second half.

Anyone would be able to tell that the 60 FPS Fraps recorded feels unsmooth, yet we still count in frames per second.

This is where most people misinterpret the facts: the eye doesn't detect FPS, it detects on much shorter timespans than that. So you can argue all you want about frames per second, but it's a very bad way to pin down where the limit of an eye lies (and that limit is most likely not the same between any two people tested).

When frames per second was first used, it was a measure of how many frames a video could pull through; many now wrongly use it as a standard measure of modern hardware limitations, for which we need a much more exact measurement.

If anyone is unsure, just look at a movie: its frame timing is extremely steady at the millisecond level, and that steadiness makes even 24 FPS look fluid.
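A minimal sketch of that frame-pacing point (the 55/5 split and the timings are the hypothetical numbers from the example above, not measurements): two runs with the same 60 FPS average can have wildly different frame times.

```python
# Two runs with the SAME 60 FPS average but very different frame pacing.
even_run = [1000 / 60] * 60                 # 60 frames, ~16.7 ms apart
uneven_run = [500 / 55] * 55 + [500 / 5] * 5  # 55 frames in the first half
                                              # second, 5 in the second half

for name, frame_times in (("even", even_run), ("uneven", uneven_run)):
    avg_fps = len(frame_times) / (sum(frame_times) / 1000)
    worst = max(frame_times)
    print(f"{name:6}: avg {avg_fps:.0f} FPS, worst frame {worst:.1f} ms")
```

Both runs report an average of 60 FPS, but the uneven one has 100 ms gaps between frames in its second half, which reads as obvious stutter even though the per-second counter looks fine.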

#5 NamelessPlayer
Member since 2004 • 7729 Posts

- Some of us still use old aperture grille CRTs capable of driving 1600x1200 at 95 Hz, and lower resolutions at up to 160 Hz. While higher refresh rates on a CRT are mostly for minimizing perceived flicker, we do notice the added smoothness. (My experience with a Samsung 2233rz 120 Hz LCD confirms this.)

- When a game falls under 60 FPS, it's kind of jarring. Many people want some elbow room so that the MINIMUM framerate is 60 FPS or above for their preferred games and settings. (Crysis on Very High DX10 is still quite brutal, even at lower resolutions. I think we're only just starting to see single cards that can maintain 60 FPS there, and those same cards would probably pull 120 FPS or more in other games.)

- Bragging rights. Let's face it, the quest for 3DMark records and high framerates and such fuels the hardware industry, as some people will pay 100% more than the next best product for a 20% increase in performance.
#6 markop2003
Member since 2005 • 29917 Posts
FPS is not just about visible frames but also about time instances. 60 FPS means the game is checking inputs, processing them, and producing output every 60th of a second. However, a gaming mouse can manage 1000 Hz, so a game at 60 FPS is only taking notice of every 17th positional report; at 100 FPS it's accepting every 10th. Effectively, higher FPS means less lag between input and output.

Also, neurones can fire at up to 1000 Hz, which would make 1000 FPS the real smoothness boundary. The human eye can see more than 60 FPS, though it's one of those things you won't notice until you see the comparison, the same way most audio gear isn't really full range and you won't notice what's missing until you hear some top-end gear.
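A small sketch of that mouse-report arithmetic (the framerates chosen are just illustrative values):

```python
# A 1000 Hz mouse sends a positional report every 1 ms, but the game
# only consumes input once per rendered frame, so reports pile up
# between frames. Higher FPS = input sampled more often = less lag.
MOUSE_HZ = 1000

for fps in (60, 100, 250):
    frame_ms = 1000 / fps
    reports_per_frame = MOUSE_HZ / fps  # reports accumulated each frame
    print(f"{fps:>3} FPS: new frame every {frame_ms:5.1f} ms, "
          f"~{reports_per_frame:.0f} mouse reports per frame")
```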