Great, now people can switch back and forth and cure themselves of this silly fps obsession.
Seems pretty useless, but I'd guess they wouldn't want to do what MS did with Halo CEA, where you can press the select button and see the game as it was in 2001 on the original Xbox, since if they did that with TLOU you'd likely be disappointed by the small improvement over the original PS3 version. :P
You might be saying something of importance if TLOU weren't already one of the best-looking games of last generation, and if TLOU R in its current form didn't look better than everything in the updated remaster of the Halo MC edition.
Graphically, Halo has looked like trash compared to all the PS exclusives of the past generation, so that toggle is there to give xbot fans like you a way to show the difference. With The Last of Us there is no need, since it already looks better than an updated Xbox remaster. :P
Good, good, let it flow... eh, you know the rest all too well, I'm certain. :P
Mmm, it may actually mean that the game can't sustain 60fps locked.
Let's wait and see.
No.
This is a joke, right? Have console gamers crawled so far up their own asses with this "cinematic" bullshit that they actually believe it, and, even worse, that devs are pandering to them?
It's not a joke at all. 24 fps is the cinematic standard, and if you compare 30 to 60 fps without the whole performance and control-accuracy bias, you will see how it looks more "cinematic".
I personally decided to trade 60 fps for near-max settings in Tomb Raider with my rickety ol' GTX 560 Ti; performance hovered around 25-30 fps until I went indoors, where it would stay at 60 fps. When transitioning to the indoor 60 fps areas I noticed how unnatural her movements looked and how intrusive the smooth animations were. I will choose 30 fps over 60 fps any day in an action-adventure/console-oriented game.
Mmm, it may actually mean that the game can't sustain 60fps locked.
Let's wait and see.
Maybe this idea never crossed any of you Xbox fanboys' minds, but perhaps this is no different from Bungie putting in the toggle feature to show the difference between the graphics of the current and past games, only using framerate instead.
Like ND are saying, it is an option. It's just like the option to play the old Halo games with the original graphics instead of the updated graphics. Seriously, xbots always come to the most idiotic conclusions because of their fanboyism.
Let's use that same lem logic on the Halo remaster: by Xbox fanboy claims, I guess we should also conclude that the graphics toggle in the Halo collection is there because the game can't sustain the updated graphics locked.
Let's wait and see... DERP
This is a joke, right? Have console gamers crawled so far up their own asses with this "cinematic" bullshit that they actually believe it, and, even worse, that devs are pandering to them?
It's not a joke at all. 24 fps is the cinematic standard, and if you compare 30 to 60 fps without the whole performance and control-accuracy bias, you will see how it looks more "cinematic".
I personally decided to trade 60 fps for near-max settings in Tomb Raider with my rickety ol' GTX 560 Ti; performance hovered around 25-30 fps until I went indoors, where it would stay at 60 fps. When transitioning to the indoor 60 fps areas I noticed how unnatural her movements looked and how intrusive the smooth animations were. I will choose 30 fps over 60 fps any day in an action-adventure/console-oriented game.
What the **** am I reading?
Only uninformed, ignorant console fanboys think that a game loses its "cinematic" qualities at 60fps. That is literally one of the dumbest things I've ever heard in my life. It's a ridiculous fantasy by console fanboys to justify the fact that they are fanboys and don't want to game on a more powerful machine. The simple fact is that 60fps decreases the natural barriers between a user and their video game, giving them closer to 1:1 control. 60fps is more natural, more fluid, more responsive, controls better, and feels more polished. Ever wonder why Nintendo games are always 60 fps? Games are simply higher quality and more polished at 60fps.
The only reason I can think of for a 1080p30 toggle is that they can't actually do a stable 1080p60 throughout, and some gamers would prefer a locked framerate to an unlocked one.
30 fps is only accepted because devs haven't given gamers a choice in the past. They've picked 30 fps because they would rather crunch some more polygons or do a few more rendering passes, since that makes stills and videos of their game look better for marketing reasons. Now ignorant gamers have accepted this as "cinematic", which is utter bullshit.
It's 2014; 30fps is barely acceptable. It's not totally unacceptable, but in 2014 I expect devs to be striving for 60fps whenever possible.
http://www.videogamer.com/ps4/the_last_of_us_remastered/news/the_last_of_us_remastered_details_leak_30fps_toggle_additional_dlc_planned.html
Great idea, as a lot of people had complained that it would lose its cinematic feel at 60fps. The article also states that it will include 4x more detailed texture maps. Can't wait to see what this looks like.
This is a joke, right? Have console gamers crawled so far up their own asses with this "cinematic" bullshit that they actually believe it, and, even worse, that devs are pandering to them?
It's not a joke at all. 24 fps is the cinematic standard, and if you compare 30 to 60 fps without the whole performance and control-accuracy bias, you will see how it looks more "cinematic".
I personally decided to trade 60 fps for near-max settings in Tomb Raider with my rickety ol' GTX 560 Ti; performance hovered around 25-30 fps until I went indoors, where it would stay at 60 fps. When transitioning to the indoor 60 fps areas I noticed how unnatural her movements looked and how intrusive the smooth animations were. I will choose 30 fps over 60 fps any day in an action-adventure/console-oriented game.
What the **** am I reading?
My opinion?
Only uninformed, ignorant console fanboys think that a game loses its "cinematic" qualities at 60fps. That is literally one of the dumbest things I've ever heard in my life. It's a ridiculous fantasy by console fanboys to justify the fact that they are fanboys and don't want to game on a more powerful machine. The simple fact is that 60fps decreases the natural barriers between a user and their video game, giving them closer to 1:1 control. 60fps is more natural, more fluid, more responsive, controls better, and feels more polished. Ever wonder why Nintendo games are always 60 fps? Games are simply higher quality and more polished at 60fps.
The only reason I can think of for a 1080p30 toggle is that they can't actually do a stable 1080p60 throughout, and some gamers would prefer a locked framerate to an unlocked one.
30 fps is only accepted because devs haven't given gamers a choice in the past. They've picked 30 fps because they would rather crunch some more polygons or do a few more rendering passes, since that makes stills and videos of their game look better for marketing reasons. Now ignorant gamers have accepted this as "cinematic", which is utter bullshit.
It's 2014; 30fps is barely acceptable. It's not totally unacceptable, but in 2014 I expect devs to be striving for 60fps whenever possible.
QFT... 120 FPS on my PC is the cinematic feel. This is an "our console isn't powerful enough" excuse. They're coining "cinematic" now!
It's great stuff. Just shake your head and laugh at the marketing PR BS. :D
This is a joke, right? Have console gamers crawled so far up their own asses with this "cinematic" bullshit that they actually believe it, and, even worse, that devs are pandering to them?
It's not a joke at all. 24 fps is the cinematic standard, and if you compare 30 to 60 fps without the whole performance and control-accuracy bias, you will see how it looks more "cinematic".
I personally decided to trade 60 fps for near-max settings in Tomb Raider with my rickety ol' GTX 560 Ti; performance hovered around 25-30 fps until I went indoors, where it would stay at 60 fps. When transitioning to the indoor 60 fps areas I noticed how unnatural her movements looked and how intrusive the smooth animations were. I will choose 30 fps over 60 fps any day in an action-adventure/console-oriented game.
What the **** am I reading?
My opinion?
Frame-rates. It's not a subjective thing.
This is a joke, right? Have console gamers crawled so far up their own asses with this "cinematic" bullshit that they actually believe it, and, even worse, that devs are pandering to them?
It's not a joke at all. 24 fps is the cinematic standard, and if you compare 30 to 60 fps without the whole performance and control-accuracy bias, you will see how it looks more "cinematic".
I personally decided to trade 60 fps for near-max settings in Tomb Raider with my rickety ol' GTX 560 Ti; performance hovered around 25-30 fps until I went indoors, where it would stay at 60 fps. When transitioning to the indoor 60 fps areas I noticed how unnatural her movements looked and how intrusive the smooth animations were. I will choose 30 fps over 60 fps any day in an action-adventure/console-oriented game.
What the **** am I reading?
My opinion?
Frame-rates. It's not a subjective thing.
I just demonstrated how subjective it is, so please move on.
60FPS is more cinematic than 30FPS could ever be. Plus there's the question of input lag.
What do you mean? Movies recorded on 35mm film or IMAX are 24fps, not 60fps. 30fps is closer to 24fps than 60fps is, therefore it gives a more 'cinematic' feel. Simple stuff, really.
So the question arises: are we going to 'watch' the game or 'play' the game?
Play a game with a cinematic feel, which is why the game is 30FPS and not 24FPS.
http://www.videogamer.com/ps4/the_last_of_us_remastered/news/the_last_of_us_remastered_details_leak_30fps_toggle_additional_dlc_planned.html
Great idea, as a lot of people had complained that it would lose its cinematic feel at 60fps. The article also states that it will include 4x more detailed texture maps. Can't wait to see what this looks like.
This doesn't mean they care about cinematic feel; it means that, at the community's insistence, they provided a frame rate exceeding 30fps even though it is unstable and has judder, which is why they included the ability to turn it off. If you equate 'cinematic feel' with a low frame rate, you are drinking the Kool-Aid. There has never been a single instance of a PC gamer saying, "I think I'd like to artificially limit my frame rate on this game, because having input delay would really give it that cinematic feel I've been looking for."
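To put rough numbers on the judder point: on a vsynced 60Hz display, any frame that misses the ~16.7ms refresh deadline is held for an extra refresh, so a game running at, say, ~45fps ends up alternating between one-refresh and two-refresh frames, while a 30fps cap shows every frame for exactly two refreshes. A minimal sketch of that arithmetic (the ~45fps figure and the 60Hz vsync behaviour are illustrative assumptions, not details from the article):

```cpp
#include <cmath>
#include <cstdio>

// Illustrative sketch only: why an uncapped ~45 fps looks juddery on a
// 60 Hz vsynced display while a 30 fps cap paces evenly.
int main() {
    const double refresh = 1000.0 / 60.0;   // one 60 Hz vsync interval, ~16.7 ms
    const double render  = 1000.0 / 45.0;   // ~22.2 ms to render each frame

    std::printf("Uncapped ~45 fps on a 60 Hz display:\n");
    long prev_flip = 0;                      // vsync index of the previous flip
    for (int i = 1; i <= 8; ++i) {
        double ready = i * render;                               // frame finished rendering here
        long flip = (long)std::ceil(ready / refresh - 1e-9);     // next vsync it can be shown on
        std::printf("  frame %d flips %.1f ms after the previous one\n",
                    i, (flip - prev_flip) * refresh);
        prev_flip = flip;
    }
    // The gaps alternate between ~16.7 ms and ~33.3 ms -> visible judder.

    std::printf("Capped 30 fps: every frame flips %.1f ms apart (even pacing)\n",
                2 * refresh);
    return 0;
}
```

The uneven 16.7ms/33.3ms cadence in the first case is exactly what reads as judder, which is why some people would rather take the even pacing of a locked 30fps.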
Agreed @SolidTy, no need for me on this. I'll pop in my PS3 if I want to play it. I won't pay for the MC collection; I asked for it as a birthday present.
Not surprised to see the usual suspects complaining. lol
"Too much judder between 40 to 50 fps, they should add a toggluls to lock it at 30fps. I can't play this"
"It's got toggluls for 30fps, it means it's not 60fps, I can't play this"
Make up your mind dufus. xD
Nice to have the option there. I just hope the 60fps option is actually 60 fps and not wildly varying.
Should make for good comparison vids though, and now people can see a proper difference between the two versions of the exact same game for a more direct comparison.
This is a joke, right? Have console gamers crawled so far up their own asses with this "cinematic" bullshit that they actually believe it, and, even worse, that devs are pandering to them?
It's not a joke at all. 24 fps is the cinematic standard, and if you compare 30 to 60 fps without the whole performance and control-accuracy bias, you will see how it looks more "cinematic".
I personally decided to trade 60 fps for near-max settings in Tomb Raider with my rickety ol' GTX 560 Ti; performance hovered around 25-30 fps until I went indoors, where it would stay at 60 fps. When transitioning to the indoor 60 fps areas I noticed how unnatural her movements looked and how intrusive the smooth animations were. I will choose 30 fps over 60 fps any day in an action-adventure/console-oriented game.
What the **** am I reading?
My opinion?
The second part is your opinion. You like 30 fps more for whatever reason, sure... but the first part really is not.
There's no such thing as a "cinematic experience"... Film frames and 3D application frames are entirely different things; you can't just compare the two.
@seanmcloughlin: I can already see all the comparison gifs hehe :P Time to break the GS servers again! xD
It's gonna be fun anyway lol
Agreed @SolidTy, no need for me on this. I'll pop in my PS3 if I want to play it. I won't pay for the MC collection; I asked for it as a birthday present.
lol, there is a big difference between you and Solid... he actually owns a PS4 (and an Xbox One), so he has the option to buy TLOU R if he chooses.
You, on the other hand, are just making do with the options available for the only current-gen console (XB1) you bought, as you are still basically saying you want the Halo collection.
Mmm, it may actually mean that the game can't sustain 60fps locked.
Let's wait and see.
That's exactly what I thought. The only reason I could see for throwing a 30fps cap on the game would be if the framerate fluctuates heavily instead of holding a stable 60fps. Not being able to hit a stable 60fps on a PS3 port with some improved textures would be pretty fucking pathetic, and would be a guarantee that their whole 60fps Uncharted promise was a load of horseshit.
btw, the developer saying "we've worked hard to remove drops" and "the experience SHOULD be a locked 60fps" isn't the same as saying that the game is locked at 60fps and rarely ever experiences drops. Learn 2 PR talk, people!
This is a joke, right? Have console gamers crawled so far up their own asses with this "cinematic" bullshit that they actually believe it, and, even worse, that devs are pandering to them?
It's not a joke at all. 24 fps is the cinematic standard, and if you compare 30 to 60 fps without the whole performance and control-accuracy bias, you will see how it looks more "cinematic".
I personally decided to trade 60 fps for near-max settings in Tomb Raider with my rickety ol' GTX 560 Ti; performance hovered around 25-30 fps until I went indoors, where it would stay at 60 fps. When transitioning to the indoor 60 fps areas I noticed how unnatural her movements looked and how intrusive the smooth animations were. I will choose 30 fps over 60 fps any day in an action-adventure/console-oriented game.
What the **** am I reading?
My opinion?
The second part is your opinion. You like 30 fps more for whatever reason, sure... but the first part really is not.
There's no such thing as a "cinematic experience"... Film frames and 3D application frames are entirely different things; you can't just compare the two.
Here you guys go. /discussion on that...
It says something about the state of PS4 and X1 games when fucking remasters are dominating the hype.
A typical year one for a console. I swear some people talk like they haven't lived through a single gen cycle before.
First years always suck. ALWAYS THEY ALWAYS SUCK. They'll suck next gen too.
60FPS is more cinematic than 30FPS could ever be. Plus there's the question of input lag.
What do you mean? Movies recorded on 35mm film or IMAX are 24fps, not 60fps. 30fps is closer to 24fps than 60fps is, therefore it gives a more 'cinematic' feel. Simple stuff, really.
Movie frames are "intertwined", so to speak, with motion blur, so at 24 FPS they look like 60 fps. Game frames aren't blurred together, making 24 FPS feel very sluggish and choppy.
Does this mean it isn't a stable 60fps?
Um... what's so hard to understand?
You can play it at 60fps, or switch it to 30fps in the menu if you prefer 30fps.
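For what it's worth, a toggle like that is usually nothing more exotic than a frame limiter: in 30fps mode the game finishes its frame and then waits out the rest of a ~33.3ms budget before starting the next one. A minimal sketch of the idea (generic code under that assumption; it is not Naughty Dog's actual implementation, and a real engine would sync to vsync rather than sleeping):

```cpp
#include <chrono>
#include <thread>

// Generic sketch of a 30/60 fps toggle: render as usual, then sleep out the
// rest of the target frame budget before starting the next frame.
void run_game_loop(bool cap_at_30fps) {
    using clock = std::chrono::steady_clock;
    const auto budget = cap_at_30fps ? std::chrono::microseconds(33333)   // 30 fps budget
                                     : std::chrono::microseconds(16667);  // 60 fps budget
    bool running = true;
    while (running) {
        const auto frame_start = clock::now();

        // update_simulation();  // game logic would go here
        // render_frame();       // draw and present the frame
        running = false;         // stop after one iteration in this sketch

        const auto elapsed = clock::now() - frame_start;
        if (elapsed < budget)
            std::this_thread::sleep_for(budget - elapsed);  // burn the leftover time
    }
}

int main() {
    run_game_loop(true);   // "30 fps" menu option
    run_game_loop(false);  // "60 fps" menu option
}
```

Whether the cap is done by sleeping or by waiting for every second vsync, the effect is the same: the 30fps option throws away frames the hardware could have shown in exchange for even pacing.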
60FPS is more cinematic than 30FPS could ever be. Plus there's the question of input lag.
What do you mean? Movies recorded on 35mm film or IMAX are 24fps, not 60fps. 30fps is closer to 24fps than 60fps is, therefore it gives a more 'cinematic' feel. Simple stuff, really.
Movie frames are "intertwined", so to speak, with motion blur, so at 24 FPS they look like 60 fps. Game frames aren't blurred together, making 24 FPS feel very sluggish and choppy.
(This response isn't directed right at you but is meant to clear up any misconceptions; you may understand all this, but I feel it should be said. I know you were just trying to clear things up for people in a way they understand, but that just leads to more confusion here.)
They don't "look like 60fps". They look more natural because of the motion blur. Our eyes don't see in frames; our brains process light as it hits our retinas. Depending on the distance from your eyes to the surface, that light takes just a bit longer to reach our vision.
Our eyes (or a camera) also must change the shape of the lens so that we can focus on objects. Motion blur is natural to our eyes, as it's a natural byproduct of focusing. Content that is not in focus is blurry. Unless your eyes are tracking a moving object, the object won't be in focus and thus will appear blurry. Thus we have motion blur.
Since a computer monitor is flat, all of the light hits our eyes at the same time. You focus on the flat screen, not the objects in the world. To our eyes, distant objects on the screen are at the same distance as close objects. Games make up for this with depth of field and motion blur to simulate how our eyes or camera lenses naturally work.
Given how our eyes and cameras work, a higher framerate is better, as it gets closer to simulating how our eyes see. We are constantly being bombarded with photons, which means our eyes have an "unlimited" framerate. So the higher, the better.
Movies get away with 24 fps because at 24 fps film has just enough motion blur that our eyes don't see the individual frames, so it's not choppy. 24 fps does not look like real life. When watching a movie you can tell you are watching a movie because the perception of 24 fps in our brains is different from 60 fps or real life. We're so used to movies at 24 fps that watching The Hobbit at 48 fps was weird to a lot of people, because traditionally late-night television and soap operas were the popular media that used 60fps. Hopefully that changes, because The Hobbit at 48 fps was excellent; I wish they had gone with 60fps. I perceived the world less as an unrealistic fantasy and more as a real place. It's hard to describe.
This, of course, ignores user input entirely. Movies are just watched. With a video game your inputs are mapped and displayed. The faster the refresh rate of the screen, the faster your inputs are displayed and the smoother the game feels. 30 fps feels sluggish as hell compared to 60 fps, even on a controller.
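Some back-of-the-envelope numbers behind those last two paragraphs: film conventionally uses a 180-degree shutter, so each 24fps frame is exposed for roughly 21ms of motion, whereas a game frame at 30fps just sits unblurred on screen for 33.3ms versus 16.7ms at 60fps, and any fixed input-to-display pipeline stretches in proportion. A small sketch of that arithmetic (the 180-degree shutter and the 3-frame pipeline depth are assumptions for illustration, not figures from the thread):

```cpp
#include <cstdio>

// Back-of-the-envelope numbers (illustrative assumptions):
//  - film blur: a conventional 180-degree shutter exposes each frame for half
//    the frame interval, i.e. exposure = 1 / (2 * fps)
//  - game latency: assume a fixed pipeline of 3 frames between input and display
int main() {
    const double film_fps = 24.0;
    std::printf("24 fps film, 180-degree shutter: each frame exposed for %.1f ms\n",
                1000.0 / (2.0 * film_fps));

    const int pipeline_frames = 3;  // assumed input -> simulate -> render -> display depth
    for (double fps : {30.0, 60.0}) {
        double frame_ms = 1000.0 / fps;
        std::printf("%.0f fps game: frame shown for %.1f ms, "
                    "~%.0f ms from input to display with a %d-frame pipeline\n",
                    fps, frame_ms, pipeline_frames * frame_ms, pipeline_frames);
    }
    return 0;
}
```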