[QUOTE="Jaysonguy"]Framerate doesn't have anything to do with the image. Framerate is how many frames per second the game is running at (if you were to freeze a game at any point in time, that would be a frame). A framerate of 60fps (frames per second) is the ideal number, but anything above 30fps is acceptable. Raising the graphical quality decreases the framerate, which would make the game choppy (if it goes below 30fps). Optimizing code can raise the fps, which would then allow you to increase the graphics quality. The reason games three years into the Wii's lifetime are going to look better than games right now is that all the code will be much better optimized (with each game release), and with better code they can squeeze more power out of the system.[/QUOTE]
No, a higher framerate means a better overall image while in motion.
Imagine a cartoon that only moves 10 times per minute compared to a cartoon that moves 50 times per minute. Which one do you think is going to look better when you watch the cartoon?
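(For what it's worth, the 30fps/60fps numbers above translate directly into a per-frame time budget. A minimal sketch of that arithmetic, not tied to any particular console:)
[code]
// Per-frame time budget at common target framerates.
// Plain arithmetic; the targets are the 30/60 figures from the post above.
#include <cstdio>

int main() {
    const double targets[] = {30.0, 60.0};
    for (double fps : targets) {
        // At N frames per second, each frame must be simulated and
        // rendered within 1000/N milliseconds.
        double budget_ms = 1000.0 / fps;
        std::printf("%2.0f fps -> %5.2f ms per frame\n", fps, budget_ms);
    }
    // Prints: 30 fps -> 33.33 ms, 60 fps -> 16.67 ms. Doubling the
    // framerate halves the time available to draw each frame.
    return 0;
}
[/code]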
[QUOTE="Jaysonguy"]Framerate doesn't have anything to do with the image.No, higher framerate means better overall image while in motion.
Imagine a cartoon that only moves 10 times per minute compared to a cartoon that moves 50 times per minute. Which one do you think is going to look better when you watch the cartoon?
XBSHX
That's why I said image in motion and there it has everything to do with it.
A perfect example is the latest Madden on the two "other" systems. One moves at 60fps and one moves at 30. So far everyone who has a set of working eyeballs says that the one moving at 60 looks better.
Same images redered the same way but the one moving at a higher fps looks better to everyone
[QUOTE="Jaysonguy"]Framerate doesn't have anything to do with the image. Framerate is how many frames (if you were to freeze a game at any point in time that would be a frame) per second its running at. A framerate of 60fps (frames per second) is the ideal number but anything above 30fps is acceptable. Raising the graphical quality decreases the framerate which would make the game choppy (if it goes below 30fps). Optimizing code can raise fps which would then allow you to increase the graphics quality. The reason games 3 years into the Wii's lifetime are going to look better than games right now is because all the code will be much better optimized (with each game release) and with better code they can squeeze more power out of the system.No, higher framerate means better overall image while in motion.
Imagine a cartoon that only moves 10 times per minute compared to a cartoon that moves 50 times per minute. Which one do you think is going to look better when you watch the cartoon?
XBSHX
Yup. It's easier to to make a better looking game running at lower framrate compared to higher.
Example: A game running at 60 fps compared to 30 fps must render everything twise as often, so it's not so strange its more demanding.
[QUOTE="XBSHX"][QUOTE="Jaysonguy"]Framerate doesn't have anything to do with the image.No, higher framerate means better overall image while in motion.
Imagine a cartoon that only moves 10 times per minute compared to a cartoon that moves 50 times per minute. Which one do you think is going to look better when you watch the cartoon?
Jaysonguy
That's why I said image in motion and there it has everything to do with it.
A perfect example is the latest Madden on the two "other" systems. One moves at 60fps and one moves at 30. So far everyone who has a set of working eyeballs says that the one moving at 60 looks better.
Same images redered the same way but the one moving at a higher fps looks better to everyone
I think you've missed something. You can in theory push twise as many polygons in a game running at 30 fps compared to a 60 fps game.
So you can have a game that is much more visually stunning, more polygons, better textures, better effects etc.
The question was if MP3 whould've have looked better if it had ran in 30 fps instead of 60 fps and yeah it probably would've.
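(Arnalion's "twice as many polygons" point is the same arithmetic from the other direction: a GPU with a fixed per-second throughput can spend twice as much of it on each frame at 30 fps. A sketch with a made-up throughput figure:)
[code]
// Hypothetical GPU throughput: the polys-per-second figure is invented
// for illustration, not a real Wii/PS3/360 spec.
#include <cstdio>

int main() {
    const double polys_per_second = 60000000.0; // made-up figure
    for (double fps : {30.0, 60.0}) {
        // A fixed per-second budget divided across fewer frames
        // leaves more polygons for each individual frame.
        std::printf("at %2.0f fps: %8.0f polygons per frame\n",
                    fps, polys_per_second / fps);
    }
    // at 30 fps: 2000000 per frame; at 60 fps: 1000000 per frame.
    return 0;
}
[/code]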
[QUOTE="Arnalion"]You can in theory push twice as many polygons in a game running at 30 fps compared to a 60 fps game.
So you can have a game that is much more visually stunning: more polygons, better textures, better effects, etc.[/QUOTE]
True, but only if you're at the ceiling. Since most games don't even come close to the ceiling, you can make the game move at 60fps with the same level of graphical detail that the 30fps version has.
Madden is a perfect example of this. Both "other" versions look the same, yet they move at different speeds, and you can tell the difference with the naked eye. Since there's no game on the Wii right now that gets close to its limit, all the games released on the Wii so far could run at 60fps with the same detail as if they were 30fps.
Arnalion pretty much hits the nail on the head here. The OP asked about the relationship between potential graphics and frame rate, and there really is a negative correlation (in terms of the base visual quality). Put it this way: a given console can render a pretty astounding still image. The more it needs to incorporate motion into that image, the less quality it can put into the image itself.
Rendering a still image requires X amount of processing power, defining X as a percentage of system resources Z. Re-rendering said image a certain number of times within a defined period (a second) also drains the overall system resources; call that Y. As Y scales up, X needs to be reduced (in this simplified example, X + Y = Z).
It's a trade-off. A faster frame rate tends to make action look smoother (too slow, and it's choppy), although stability of frame rate is likely more important than pure speed, over a certain threshold. But it's a zero-sum game when it comes to processing power, and it depends where you want to marshal your resources.
Edit: and if you don't think those sacrifices and trade-offs are already being made on the Wii, then you're beyond kidding yourself. You don't need to be maxing out a console to have to make that trade-off: you can expand your Z (the available system resources) with more work and effort, but everyone is still working within some maximum of system power, and they're still sacrificing resources in one area to free them up somewhere else.
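(Spelunker's simplified budget, written out. Take f as the frame rate and assume the re-rendering cost Y grows in proportion to it; the constant c is hypothetical:)

$$X + Y = Z, \qquad Y = c\,f \quad\Longrightarrow\quad X(f) = Z - c\,f$$

(So raising f from 30 to 60 takes an extra 30c out of the per-image quality budget X. In practice the trade-off is closer to multiplicative, with per-frame work times frames per second bounded by total throughput, but the direction is the same: pushing f up forces X down.)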
[QUOTE="Spelunker"]Arnalion pretty much hits the nail on the head here. The OP asked about the relationship between potential graphics and frame rate, and there really is a negative correlation (in terms of the base visual quality). Put it this way: a given console can render a pretty astounding still image. The more it needs to incorporate motion into that image, the less quality it can put into the image itself.
Rendering a still image requires X amount of processing power, defining X as a percentage of system resources Z. Re-rendering said image a certain number of times within a defined period (a second) also drains the overall system resources; call that Y. As Y scales up, X needs to be reduced (in this simplified example, X + Y = Z).
It's a trade-off. A faster frame rate tends to make action look smoother (too slow, and it's choppy), although stability of frame rate is likely more important than pure speed, over a certain 24fps threshold.[/QUOTE]
wow, that was confusing...
[QUOTE="Arnalion"]You can in theory push twise as many polygons in a game running at 30 fps compared to a 60 fps game.
So you can have a game that is much more visually stunning, more polygons, better textures, better effects etc.
Jaysonguy
True but only if you're at the ceiling. Since most games don't even come close to the ceiling you can make the game move at 60fps with the same level of detail to graphics that the 30fps has.
Madden is a perfect exmaple of this. Both "other" versions look the same yet they move at different speeds and you can tell the difference with the naked eye. Since there's no game on the Wii right now that gets close to it's limit all the games released on the Wii so far can run at 60fps with the same detail as if it was 30fps.
Not true. It's much easier for a developer to make a good looking game running at lower framerates.
If Madden on PS3 would've ran at 60 fps would it in theory had been twise as hardware demanding and since the developers don't know how to utilize (especially not the lazy/bad ones *cough* EA *cough*) the hardware did they leave it at 30 fps. They would've needed to downgrade the graphics on the PS3 version of Madden if it would've been running at 60 fps.
When a developer starts developing on a new platform can they never use its full capability since they don't know how to utilize the hardware. So a developer that tried to push the hardware on a new platform is always at the ceiling, the ceiling just rises as the developer learns how to utilize the hardware better over time.
EDIT: @ Spelunker: Good explanation. ^_^
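(Arnalion's "twice as hardware demanding" claim comes down to the frame cap in the main loop. Here's a minimal fixed-framerate loop sketch, a standard technique rather than actual Madden or Wii code; simulate() and render() are hypothetical placeholders:)
[code]
// Minimal capped game loop: the lower the cap, the more of each
// frame's budget is left over for simulate()/render() work.
#include <chrono>
#include <thread>
#include <cstdio>

int main() {
    using clock = std::chrono::steady_clock;
    const int target_fps = 30; // change to 60 to halve the per-frame budget
    const auto frame_budget = std::chrono::microseconds(1000000 / target_fps);

    for (int frame = 0; frame < 5; ++frame) { // a few frames for the demo
        const auto start = clock::now();

        // simulate(); render(); // hypothetical per-frame work goes here

        const auto elapsed = clock::now() - start;
        if (elapsed < frame_budget)
            std::this_thread::sleep_for(frame_budget - elapsed); // idle slack

        std::chrono::duration<double, std::milli> total = clock::now() - start;
        std::printf("frame %d took %6.2f ms (budget %.2f ms)\n",
                    frame, total.count(), 1000.0 / target_fps);
    }
    return 0;
}
[/code]
(If the same scene costs roughly 20 ms of work per frame, it fits comfortably under a 33 ms cap at 30 fps but blows through a 16.7 ms cap at 60 fps, which is exactly why a 60 fps target can force a graphics downgrade.)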
[QUOTE="han_186"]lol, is there 90fps?!!![/QUOTE]There is, but the human eye can't detect it. The highest the human eye can detect is 60fps.