[QUOTE="RKFS"]Which would you rather have... 720p at a steady 60 FPS or 1080p at an unsteady 30 FPS?[/QUOTE]Um, since TVs are limited to just under 30 frames per second, I would go for the 1080p at 30 FPS. And if your choice is 720p at 60 FPS, what you're really going to get is 720p at 30 FPS, because, like I said, your TV is limited to about 29.97 FPS. Unless of course you're talking about PCs; in that case I'd go with the 720p at 60 FPS.
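For anyone wondering where that 29.97 figure comes from: NTSC color broadcast runs at 30000/1001 frames per second, each frame made of two interlaced fields. A quick Python check (just an illustrative calculation, not from anyone's post):

[code]
# NTSC color video runs slightly slower than 30 Hz: the rate was scaled by
# 1000/1001 to keep the color subcarrier from interfering with the audio carrier.
ntsc_fps = 30000 / 1001
print(f"NTSC frame rate: {ntsc_fps:.5f} fps")          # ~29.97003
# Each interlaced frame is two fields, so the field rate is roughly 59.94 Hz.
print(f"NTSC field rate: {2 * ntsc_fps:.5f} fields/s")
[/code]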
[QUOTE="Zerostatic0"]TVs CAN'T DO MORE THAN 30 FPS, WHY THE HELL WOULD YOU WANT 60 FPS ON A HOME CONSOLE? THAT'S LIKE WANTING HI-DEF MOVIES TO PLAY ON YOUR SDTV[/QUOTE]There is a difference between refresh rate and framerate. The game will render frames at 60, but you won't see all of those frames; the frames you do see appear more connected and smooth. If you're watching a live TV broadcast, people move smoothly, and it appears that way even at a 30-frames-per-second refresh rate.
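One concrete way the render rate can matter even when the display only shows 30 frames per second is frame pacing. Here's a small Python sketch (my own toy model with made-up rates, not anything from the posts above): a 30 Hz display repeatedly grabs the newest finished frame, and a renderer running at an even multiple of the refresh rate delivers evenly spaced moments of game time, while an in-between rate delivers uneven ones, which reads as judder.

[code]
# Toy model: a display refreshing at 30 Hz shows whatever frame the renderer
# finished most recently. Compare the game-time steps the viewer actually sees.

def displayed_time_steps(render_fps, display_hz=30.0, refreshes=6):
    frame_time = 1.0 / render_fps   # how long each rendered frame takes
    refresh = 1.0 / display_hz      # how often the display grabs a frame
    shown = []
    for i in range(1, refreshes + 1):
        t = i * refresh                            # time of the i-th refresh
        frames_done = int(t / frame_time + 1e-9)   # tolerance for float rounding
        shown.append(frames_done * frame_time)     # newest completed frame's timestamp
    # gaps in game time between consecutive displayed frames
    return [round(b - a, 4) for a, b in zip(shown, shown[1:])]

print(displayed_time_steps(60))  # even steps (~0.0333 s): motion advances uniformly
print(displayed_time_steps(45))  # alternating long/short steps: visible judder
[/code]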
[QUOTE="Nintendo-Wii"]1080p really doesn't do much for me.RKFSAre you using the right TV and cables? Not physically, I mean mentally, it just doesn't make much difference to me.
[QUOTE="Zerostatic0"]TVs CAN'T DO MORE THEN 30 FPS, WHY THE HELL WOULD YOU WANT 60FPS ON A HOME CONSOLE? THAT'S LIKE WANTING HI-DEF MOVIES TO PLAY ON YOUR SDTVtoppsthere is a difference between refresh rate and framerate. the game will render frames at 60 but you won't see all those frames, but the frames you do see appear more connected and smooth. If your watching a live tv broadcast, people move infinitely smooth and apears so even on a 30 frames a second refresh rate.I don't understand why the frames would appear more connected and smooth? You're still seeing half the frames.
[QUOTE]Well, people hype a lot of stuff that is not really all that important. For example, remember when the PS1 and Saturn came out and Atari kept running those ads about how their system (the Jaguar) had twice the number of bits (64 as opposed to 32) as the competitors, when in reality the PS1 and Saturn were significantly more powerful than the Jaguar and bits were really not that important?[/QUOTE]^^^ True? Why would Nintendo have touted that they were going to run their games at 60 FPS, then?
[QUOTE="XenoNinja"]480p at 120fps.[/QUOTE]Is that even possible? Because if it is, I would choose this over the others (since I don't play on my sister's HDTV most of the time :P)
[QUOTE="nintendo-4life"][QUOTE="XenoNinja"]480p at 120fps. Jandurinis that even possible? because if it is i would choose this over the others (since i don't play with ny sister's HDTV most of the time :P) Not possible. And you can't play 480p without an HD (or the slightly rare ED) television. i have an EDTV, not an HDTV, my sister has the HDTV, not me.
[QUOTE="topps"][QUOTE="Zerostatic0"]TVs CAN'T DO MORE THEN 30 FPS, WHY THE HELL WOULD YOU WANT 60FPS ON A HOME CONSOLE? THAT'S LIKE WANTING HI-DEF MOVIES TO PLAY ON YOUR SDTVZerostatic0there is a difference between refresh rate and framerate. the game will render frames at 60 but you won't see all those frames, but the frames you do see appear more connected and smooth. If your watching a live tv broadcast, people move infinitely smooth and apears so even on a 30 frames a second refresh rate.I don't understand why the frames would appear more connected and smooth? You're still seeing half the frames. here is a good article on the difference: http://www.overclock.net/monitors-displays/149846-faq-what-refresh-rate.html "FPS vs Refresh Rate Frames per second discribes how fast a video processor can output full screen updates. Refresh Rate describes how often a monitor updates the display. That is an important distinction. Refresh rate is a set period of time. At 60Hz, a monitor pulls out whatever image is in the video processor's frame buffer and displays it every 1/60s. FPS is how fast the GPU can write out a full image to the frame buffer. This rate varies and is NOT sychronized with the frame rate. Therefore, the monitor may pull from the frame buffer but the video card has not completed updating the frame buffer. In this case, a portion of the old display maybe still present in the image you see on the screen. This is called "tearing" since the buffer is written top to bottom.... so you would see a horizontal line across screen. This often happens but is not noticable if does not occur to many times in a row. You would not notice tearing if only 3-5 frames out of 60 in a second. If it does become an issue, you may use something called V-sync. This forces the video card not to release the new updated image until is completely done. The downside to this is if the video card is too slow to update the new image. The monitor would continue to display an "old" image."
[QUOTE="achilles614"]320x240 at 1000 fps[/QUOTE]160x180 at 10,000 fps. Dude, it's so smooth that butter literally leaks out of my TV when I play at that frame rate.
[QUOTE="achilles614"]320 * 240 at 1000 fpsZerostatic0160x180 at 10,000 fps. Dude it's so smooth that butter literally leaks out of my TV when I play at that frame-rate. :lol: