Still only manages 60 FPS in narrow corridors/cave areas when there's nothing to render
Plummets to the 30s when there's a hint of action or more open environments to render
Plausible reason
Didn't code to da metal hard enough
Didn't utilise the full potential of the supercharged PC parts within the PS4's chassis
No one pushes a console to the metal at launch; in fact, those tools aren't even ready yet.
The PS4 version doesn't sit at 30 FPS, troll; it's mostly 50 FPS and drops as low as 33 for very short instances...
I love when you hypocrites try to pretend the PC doesn't have a variable frame rate...
What's that, the 7970 dropping from 93 FPS to 56? A 37 FPS drop? But that is the 7970; let's see the lesser GPUs...
The 7870 from 64 to 40, a 24 FPS drop...
The 7850, which is basically a little below the PS4, from 51 FPS to 33 FPS? Hell, that is PS4-like...
Oh, and if you have a 7770, be prepared to fall under 30...
Come on, man, stop trolling. This is a PS4 vs Xbox One thread; we don't care about your overpriced PC that also drops 30 frames or more.
Yeah, except you cows somehow thought that console optimization is magic that would turn that slightly better 7850 into a 7970. Also, I love that a 100-pound GPU will run the game more or less in line with the PS4.
Oh, and BTW, using MSI Afterburner you can lock your frame rate to whatever you like if you don't want variable FPS (for instance, you can lock a 7970 at 60 fps, so the frame rate stays at 56-60 fps all the time).
And if you don't like dropping to 30 fps, there is a magic thing called settings: drop AA and TressFX and even a 7850 will always stay over 45 fps.
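For anyone wondering what that kind of frame cap actually does, here is a minimal sketch of the idea in Python: render, then sleep off whatever is left of the per-frame budget. The `render_frame` stub and its 12 ms cost are made-up placeholders; tools like Afterburner/RTSS enforce this at the driver level rather than in game code.

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60 fps

def render_frame():
    # Placeholder for real rendering work; heavier scenes take longer.
    time.sleep(0.012)  # pretend this frame took 12 ms to draw

def capped_loop(num_frames=10):
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        # Sleep off the remainder of the budget. If the frame took longer
        # than the budget, we simply run late: that is a drop below the cap.
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)
        print(f"frame time: {max(elapsed, FRAME_BUDGET) * 1000:.1f} ms")

capped_loop()
```

The point of the cap is the bottom half of the loop: frames that finish early get padded out to a uniform length, which is why a capped 56-60 fps feels steadier than an uncapped 56-93 fps.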
So both consoles are frame capped. The PS4's max framerate is probably higher than 60 and the XBOne's higher than 30, but it sounds like both are set with vsync (or similar) to avoid screen tearing/judder.
I'm still betting the Xbox port doesn't even touch the system's eSRAM, as that would have taken extra dev time, and the dev team was tasked with getting a 30 FPS port of the original with the extra features, which is all they provided and nothing more.
I've only seen the PS4 version, not the XB1. The frame drops in high action scenes are VERY noticeable. Sorry the XB1 version is so gimped.
All in all, I'm going to finish the game on PC.
Come on, man. If it didn't use eSRAM it would run at 10 FPS; DDR3 bandwidth on the Xbox One is not enough for the CPU and GPU to share and still move Tomb Raider at 30 FPS. It's impossible, eSRAM is in play here. It was never going to work exactly like GDDR5, that has been said for months, but even if it did, having more bandwidth on a weak GPU would yield no better results either way.
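One way to frame the eSRAM argument is as a per-frame traffic budget: peak bandwidth divided by frame rate. Here is a quick sketch using the commonly cited peak figures (68 GB/s DDR3, the ~102 GB/s originally quoted for the eSRAM, 176 GB/s GDDR5); note it says nothing about how much of a peak is actually achievable, which is the real fight here.

```python
def gb_per_frame(bandwidth_gbps: float, fps: float) -> float:
    """Peak memory traffic available per frame at a given frame rate."""
    return bandwidth_gbps / fps

pools = [
    ("Xbox One DDR3 alone", 68.0),
    ("Xbox One DDR3 + eSRAM", 68.0 + 102.0),  # eSRAM figure as first quoted
    ("PS4 GDDR5", 176.0),
]
for name, bw in pools:
    print(f"{name}: {gb_per_frame(bw, 30):.1f} GB per frame at 30 fps")
```

Whether any of that peak is reachable when the CPU and GPU contend for the same DDR3 bus is exactly the dispute in this thread.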
@Krelian-co said:
@tormentos said:
@-Unreal- said:
Tomb Raider Definitive Edition
No tessellation
Low quality ambient occlusion
Lower quality textures
Lower quality objects
Lower shadow resolution
Reduced strand count for TressFX 2.0
Advantages
Coding to da metal
Console optimisation
Resulting performance
Still only manages 60 FPS in narrow corridors/cave areas when there's nothing to render
Plummets to the 30s when there's a hint of action or more open environments to render
Plausible reason
Didn't code to da metal hard enough
Didn't utilise the full potential of the supercharged PC parts within the PS4's chassis
No one pushes a console to the metal at launch; in fact, those tools aren't even ready yet.
The PS4 version doesn't sit at 30 FPS, troll; it's mostly 50 FPS and drops as low as 33 for very short instances...
I love when you hypocrites try to pretend the PC doesn't have a variable frame rate...
What's that, the 7970 dropping from 93 FPS to 56? A 37 FPS drop? But that is the 7970; let's see the lesser GPUs...
The 7870 from 64 to 40, a 24 FPS drop...
The 7850, which is basically a little below the PS4, from 51 FPS to 33 FPS? Hell, that is PS4-like...
Oh, and if you have a 7770, be prepared to fall under 30...
Come on, man, stop trolling. This is a PS4 vs Xbox One thread; we don't care about your overpriced PC that also drops 30 frames or more.
When the PS4 does it, "it is for very short instances"; when it's the PC, you act like it runs at minimum FPS most of the time. You are indeed one of the biggest morons on this forum. My PC never drops from 60 fps, never. Those minimum fps happen maybe once or twice an hour, and only for a second, while on PS4 they happen all the time when there are big fights or a lot of stuff on screen. So try harder.
You are the fu**ing moron. Quote me saying that the PC runs at minimum frames for a long time, I dare you...
I said the PC also drops frames in the same instances the PS4 does, and as you can see from the screen I posted, by a freaking lot; from 93 to 56 is a huge drop, actually bigger than what the PS4 drops.
Let's see who we believe: a moron like you, or a site which actually recorded and tested the minimum frames. Once again, quote me, I dare you. Also post a link to where it says that PC never drops frames, idiot. I also have a PC, and I've been a PC gamer since the mid-90's; I just dropped PC because I got tired of the driver updates and constant updating. When FPS is the topic, no other platform has rates as variable as the PC's.
Don't be a crybaby. PC also drops frames; that's well tested and documented...
@NFJSupreme said:
@Krelian-co said:
@tormentos said:
@-Unreal- said:
Tomb Raider Definitive Edition
No tessellation
Low quality ambient occlusion
Lower quality textures
Lower quality objects
Lower shadow resolution
Reduced strand count for TressFX 2.0
Advantages
Coding to da metal
Console optimisation
Resulting performance
Still only manages 60 FPS in narrow corridors/cave areas when there's nothing to render
Plummets to the 30s when there's a hint of action or more open environments to render
Plausible reason
Didn't code to da metal hard enough
Didn't utilise the full potential of the supercharged PC parts within the PS4's chassis
No one pushes a console to the metal at launch; in fact, those tools aren't even ready yet.
The PS4 version doesn't sit at 30 FPS, troll; it's mostly 50 FPS and drops as low as 33 for very short instances...
I love when you hypocrites try to pretend the PC doesn't have a variable frame rate...
What's that, the 7970 dropping from 93 FPS to 56? A 37 FPS drop? But that is the 7970; let's see the lesser GPUs...
The 7870 from 64 to 40, a 24 FPS drop...
The 7850, which is basically a little below the PS4, from 51 FPS to 33 FPS? Hell, that is PS4-like...
Oh, and if you have a 7770, be prepared to fall under 30...
Come on, man, stop trolling. This is a PS4 vs Xbox One thread; we don't care about your overpriced PC that also drops 30 frames or more.
When the PS4 does it, "it is for very short instances"; when it's the PC, you act like it runs at minimum FPS most of the time. You are indeed one of the biggest morons on this forum. My PC never drops from 60 fps, never. Those minimum fps happen maybe once or twice an hour, and only for a second, while on PS4 they happen all the time when there are big fights or a lot of stuff on screen. So try harder.
Funny thing is, if you are running at a steady 60 fps you don't notice drops to anything above 50, which won't happen because you are at a steady 60 fps. To claim "60 fps" you need to average at least 58 fps. The PS4 has very noticeable drops to as low as the low 30s. Still much better than the Xbone, and the PS4 will have a stranglehold on second place when it comes to multiplats this gen. But it's not 60 fps, as I said before this even came out.
Yeah, except you cows somehow thought that console optimization is magic that would turn that slightly better 7850 into a 7970. Also, I love that a 100-pound GPU will run the game more or less in line with the PS4.
Oh, and BTW, using MSI Afterburner you can lock your frame rate to whatever you like if you don't want variable FPS (for instance, you can lock a 7970 at 60 fps, so the frame rate stays at 56-60 fps all the time).
And if you don't like dropping to 30 fps, there is a magic thing called settings: drop AA and TressFX and even a 7850 will always stay over 45 fps.
TLHBO. Suck it lems. How does it feel to pay $100 extra to get half the performance?
Well, it really isn't "half" of the performance.
I believe two different teams ported the two different versions. Originally they were both aiming for 30fps, but it turned out that Sony's console could hover around 50-60fps better. The X1 version is locked at 30fps, so it could go higher, meaning that the PS4 does not actually have double the performance.
The REAL power difference is best judged by comparing the two lowest framerates, which are 24 and 33, giving the PS4 a 9fps advantage (which is what I've been guessing since day one: PS4 = 5-10 more fps).
Lems don't pay for performance, they pay for games. I spent $100 more on my Xbox, but it has more hours logged than my PS4. Plus, the only game I play on PS4 is Assassin's Creed, while my Xbox mostly plays exclusives.
Gameplay - Xbox One
Lowest FPS: 24fps
Highest FPS: 30fps
Average FPS: 29.84fps
Gameplay - PS4
Lowest FPS: 33fps
Highest FPS: 60fps
Average FPS: 50.98fps
The average gameplay framerate on the Xbox One is 29.84 fps. The average gameplay framerate on the PlayStation 4 is 50.98 fps. The PlayStation 4's framerate is roughly 71% faster than the Xbox One's.
No amount of ridiculous hand-waving will change that.
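For the record, the percentage follows directly from the two averages; here is the arithmetic spelled out.

```python
xb1_avg = 29.84
ps4_avg = 50.98

advantage = (ps4_avg / xb1_avg - 1) * 100
print(f"PS4 average framerate is {advantage:.1f}% higher")  # ~70.8%, i.e. roughly 71%
```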
Yes, but the X1 version is LOCKED at 30 frames. If the PS4 version were locked at 30 frames as well, then it would be 29 fps vs 30 fps, lol.
It's hardly a fair comparison to say that the game runs 71% faster when one is capped and one is not. If we could compare uncapped versions, you would find it is not that much faster. It still would be faster, but like I said, you would not see that big of a difference.
If you're going to count a frame rate cap when comparing performance, you are a blatant Sony fanboy.
@Crypt_mx: LOL, epic damage control. You paid extra to play shit like Ryse and multiplats at half the res (see COD) and half the framerate of the PS4? Wow, no wonder you lems are so butthurt.
"you lems"
Uhmmm I'm a PC/PS4/X1 owner.
While you on the other hand are a cow that has likely never even played an X1.
So both consoles are frame capped. The PS4's max framerate is probably higher than 60 and the XBOne's higher than 30, but it sounds like both are set with vsync (or similar) to avoid screen tearing/judder.
I'm still betting the Xbox port doesn't even touch the system's eSRAM, as that would have taken extra dev time, and the dev team was tasked with getting a 30 FPS port of the original with the extra features, which is all they provided and nothing more.
I've only seen the PS4 version, not the XB1. The frame drops in high action scenes are VERY noticeable. Sorry the XB1 version is so gimped.
All in all, I'm going to finish the game on PC.
Come on, man. If it didn't use eSRAM it would run at 10 FPS; DDR3 bandwidth on the Xbox One is not enough for the CPU and GPU to share and still move Tomb Raider at 30 FPS. It's impossible, eSRAM is in play here. It was never going to work exactly like GDDR5, that has been said for months, but even if it did, having more bandwidth on a weak GPU would yield no better results either way.
@Krelian-co said:
@tormentos said:
@-Unreal- said:
Tomb Raider Definitive Edition
No tessellation
Low quality ambient occlusion
Lower quality textures
Lower quality objects
Lower shadow resolution
Reduced strand count for TressFX 2.0
Advantages
Coding to da metal
Console optimisation
Resulting performance
Still only manages 60 FPS in narrow corridors/cave areas when there's nothing to render
Plummets to the 30s when there's a hint of action or more open environments to render
Plausible reason
Didn't code to da metal hard enough
Didn't utilise the full potential of the supercharged PC parts within the PS4's chassis
No one pushes a console to the metal at launch; in fact, those tools aren't even ready yet.
The PS4 version doesn't sit at 30 FPS, troll; it's mostly 50 FPS and drops as low as 33 for very short instances...
I love when you hypocrites try to pretend the PC doesn't have a variable frame rate...
What's that, the 7970 dropping from 93 FPS to 56? A 37 FPS drop? But that is the 7970; let's see the lesser GPUs...
The 7870 from 64 to 40, a 24 FPS drop...
The 7850, which is basically a little below the PS4, from 51 FPS to 33 FPS? Hell, that is PS4-like...
Oh, and if you have a 7770, be prepared to fall under 30...
Come on, man, stop trolling. This is a PS4 vs Xbox One thread; we don't care about your overpriced PC that also drops 30 frames or more.
When the PS4 does it, "it is for very short instances"; when it's the PC, you act like it runs at minimum FPS most of the time. You are indeed one of the biggest morons on this forum. My PC never drops from 60 fps, never. Those minimum fps happen maybe once or twice an hour, and only for a second, while on PS4 they happen all the time when there are big fights or a lot of stuff on screen. So try harder.
You are the fu**ing moron. Quote me saying that the PC runs at minimum frames for a long time, I dare you...
I said the PC also drops frames in the same instances the PS4 does, and as you can see from the screen I posted, by a freaking lot; from 93 to 56 is a huge drop, actually bigger than what the PS4 drops.
Let's see who we believe: a moron like you, or a site which actually recorded and tested the minimum frames. Once again, quote me, I dare you. Also post a link to where it says that PC never drops frames, idiot. I also have a PC, and I've been a PC gamer since the mid-90's; I just dropped PC because I got tired of the driver updates and constant updating. When FPS is the topic, no other platform has rates as variable as the PC's.
Don't be a crybaby. PC also drops frames; that's well tested and documented...
@NFJSupreme said:
@Krelian-co said:
@tormentos said:
@-Unreal- said:
Tomb Raider Definitive Edition
No tessellation
Low quality ambient occlusion
Lower quality textures
Lower quality objects
Lower shadow resolution
Reduced strand count for TressFX 2.0
Advantages
Coding to da metal
Console optimisation
Resulting performance
Still only manages 60 FPS in narrow corridors/cave areas when there's nothing to render
Plummets to the 30s when there's a hint of action or more open environments to render
Plausible reason
Didn't code to da metal hard enough
Didn't utilise the full potential of the supercharged PC parts within the PS4's chassis
No one pushes a console to the metal at launch; in fact, those tools aren't even ready yet.
The PS4 version doesn't sit at 30 FPS, troll; it's mostly 50 FPS and drops as low as 33 for very short instances...
I love when you hypocrites try to pretend the PC doesn't have a variable frame rate...
What's that, the 7970 dropping from 93 FPS to 56? A 37 FPS drop? But that is the 7970; let's see the lesser GPUs...
The 7870 from 64 to 40, a 24 FPS drop...
The 7850, which is basically a little below the PS4, from 51 FPS to 33 FPS? Hell, that is PS4-like...
Oh, and if you have a 7770, be prepared to fall under 30...
Come on, man, stop trolling. This is a PS4 vs Xbox One thread; we don't care about your overpriced PC that also drops 30 frames or more.
When the PS4 does it, "it is for very short instances"; when it's the PC, you act like it runs at minimum FPS most of the time. You are indeed one of the biggest morons on this forum. My PC never drops from 60 fps, never. Those minimum fps happen maybe once or twice an hour, and only for a second, while on PS4 they happen all the time when there are big fights or a lot of stuff on screen. So try harder.
Funny thing is, if you are running at a steady 60 fps you don't notice drops to anything above 50, which won't happen because you are at a steady 60 fps. To claim "60 fps" you need to average at least 58 fps. The PS4 has very noticeable drops to as low as the low 30s. Still much better than the Xbone, and the PS4 will have a stranglehold on second place when it comes to multiplats this gen. But it's not 60 fps, as I said before this even came out.
Really? And you noticed that in the DF video?
Ahh, el Tormo, no need to waste time with you. You do know you are this forum's joke, right?
The average gameplay framerate on the Xbox One is 29.84 fps. The average gameplay framerate on the PlayStation 4 is 50.98 fps. The PlayStation 4's framerate is roughly 71% faster than the Xbox One's.
No amount of ridiculous hand-waving will change that.
Yes, but the X1 version is LOCKED at 30 frames. If the PS4 version were locked at 30 frames as well, then it would be 29 fps vs 30 fps, lol.
It's hardly a fair comparison to say that the game runs 71% faster when one is capped and one is not. If we could compare uncapped versions, you would find it is not that much faster. It still would be faster, but like I said, you would not see that big of a difference.
If you're going to count a frame rate cap when comparing performance, you are a blatant Sony fanboy.
When I said "no amount of ridiculous hand-waving" would change that, I was thinking of exactly this kind of ridiculous hand-waving. 50.98 is greater than 29.84. It's not difficult to understand, and it's not debatable.
Also, you do know that your "locked" Xbox One version drops to 24fps, correct? Meaning that it's not locked. Right?
The average gameplay framerate on the Xbox One is 29.84 fps. The average gameplay framerate on the PlayStation 4 is 50.98 fps. The PlayStation 4's framerate is roughly 71% faster than the Xbox One's.
No amount of ridiculous hand-waving will change that.
Yes, but the X1 version is LOCKED at 30 frames. If the PS4 version were locked at 30 frames as well, then it would be 29 fps vs 30 fps, lol.
It's hardly a fair comparison to say that the game runs 71% faster when one is capped and one is not. If we could compare uncapped versions, you would find it is not that much faster. It still would be faster, but like I said, you would not see that big of a difference.
If you're going to count a frame rate cap when comparing performance, you are a blatant Sony fanboy.
When I said "no amount of ridiculous hand-waving" would change that, I was thinking of exactly this kind of ridiculous hand-waving. 50.98 is greater than 29.84. It's not difficult to understand, and it's not debatable.
Also, you do know that your "locked" Xbox One version drops to 24fps, correct? Meaning that it's not locked. Right?
I guess if you believe it's hand-waving then it is, but it's unfortunate that you don't understand my point.
Yes, 50 is greater than 29, I get that; I never argued against that. I was arguing that the X1 version has a frame rate cap, just as the article states. Just because a game has a frame rate cap doesn't mean the frame rate can't drop, lol. Most last-gen games were capped at 30 frames but often dipped under that.
I will explain it as best as I can, so take this for what it is:
When optimizing a console game, devs must make an important choice: they can lock the frame rate at 30 (common) or 60 (rare), or leave it unlocked (I don't think that's been done).
In this instance, the PS4 version runs locked at 60 and the X1 version is locked at 30, and the reason for this is very simple. Since the PS4 could maintain a certain average (usually around 45 fps), it could look smooth enough hovering between that average and 60 fps. However, because of a 5-10 fps difference (like I said before), the X1 version could not meet that standard and would end up looking jittery if left unlocked. The X1 does have the potential to get more frames; it simply would not be as pleasing to the eye.
The PS4 is more powerful no doubt, but not as strong as most of you fanboys claim.
Graphics comparisons are skewed when you don't account for the framerate.
For example, my friends told me PS3 graphics were just as good as the 360's. What they didn't tell me when I got my PS3 was that I was going to end up playing a freaking slide show rather than a game.
I get a large frame rate boost if I go one setting lower in the following areas:
1. shadows,
2. object detail,
3. depth of field,
4. textures.
Also, you're using an old driver and game build.
That is not the point, you walking mozzarella stick. The point is the PC has a variable frame rate, and that is a fact.
@Krelian-co said:
Ahh, el Tormo, no need to waste time with you. You do know you are this forum's joke, right?
You are so butthurt that you can't even admit that the PC has a variable frame rate. Every fu**ing site knows this, and it's the reason they always show average and minimum, dude. We all know you will not spend half the game at minimum, not even half a minute, but it does drop, just like consoles do. When you man up, you can admit it...
@Crypt_mx said:
Yes, but the X1 version is LOCKED at 30 frames. If the PS4 version were locked at 30 frames as well, then it would be 29 fps vs 30 fps, lol.
It's hardly a fair comparison to say that the game runs 71% faster when one is capped and one is not. If we could compare uncapped versions, you would find it is not that much faster. It still would be faster, but like I said, you would not see that big of a difference.
If you're going to count a frame rate cap when comparing performance, you are a blatant Sony fanboy.
Both versions are capped.
The Xbox One version is capped because the game going under 30 would be a lot easier to notice when the frames aren't as high. The PS4 version never drops below 33 FPS; the Xbox One version dropped as low as 18, but mostly to 24 and 28. Also, locking allows for more stable frames at 30.
The PS4 version is also capped at 60 FPS; we don't know if it could go over it.
@ronvalencia said:
@tormentos:
At 1080p, running TR 2013 at 68 GB/s DDR3 levels didn't result in a 10 fps average, and I have already tried it on my 7950 at 950 MHz.
PC GPUs have underclocking features.
I might try running 68 GB/s memory bandwidth on my R9 290 (947 MHz).
The average gameplay framerate on the Xbox One is 29.84 fps. The average gameplay framerate on the PlayStation 4 is 50.98 fps. The PlayStation 4's framerate is roughly 71% faster than the Xbox One's.
No amount of ridiculous hand-waving will change that.
Yes, but the X1 version is LOCKED at 30 frames. If the PS4 version were locked at 30 frames as well, then it would be 29 fps vs 30 fps, lol.
It's hardly a fair comparison to say that the game runs 71% faster when one is capped and one is not. If we could compare uncapped versions, you would find it is not that much faster. It still would be faster, but like I said, you would not see that big of a difference.
If you're going to count a frame rate cap when comparing performance, you are a blatant Sony fanboy.
When I said "no amount of ridiculous hand-waving" would change that, I was thinking of exactly this kind of ridiculous hand-waving. 50.98 is greater than 29.84. It's not difficult to understand, and it's not debatable.
Also, you do know that your "locked" Xbox One version drops to 24fps, correct? Meaning that it's not locked. Right?
God, you f*cking cows are stupid as hell.
Do you not understand the concept of averages? Do you not realize that the XB1 version averages 29.84fps? This means the drops must be very infrequent. It's capped at 30fps so any drops below 30fps would drag down the average even faster. The frame rate averages show that this is a very steady 30fps game on the XB1.
A great number of games last generation were not that consistent on either the PS3 or Xbox 360. That frame rate average is about what Uncharted 2 likely has, possibly better. In other words, we're talking about a game that is a steady 30fps the vast majority of the time with only very occasional and momentary drops.
What's also not debatable is that the average framerate would be higher on the XB1 version if the frame rate were uncapped. No game engine on any hardware pumps out exactly the same number of frames from moment to moment. There is always large variability in the rendering load.
This means that in order for the XB1 version to stick so closely to 30fps, it must have 30fps as roughly its minimum sustained framerate. Its actual average, if uncapped, would likely be something like 44fps.
In any case, there wouldn't be a 71% difference if the XB1 version weren't capped at 30fps. And that's not debatable.
Also, a once-in-a-blue-moon drop down to 24fps is very different from regularly dropping down into the 40s (and sometimes 30s) as the PS4 version does. That's why the PS4 version averages around 50fps and not around 60fps.
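That "drops must be very infrequent" point can be made precise. Assuming, purely for illustration, that every dropped frame lands at the recorded 24 fps floor, the cap and the average pin down how often drops can occur:

```python
cap = 30.0    # XB1 frame cap
avg = 29.84   # measured average
floor = 24.0  # worst recorded dip; assume every drop goes this deep

# avg = cap * (1 - p) + floor * p, solved for p: the fraction of
# frames sitting at the floor needed to produce the observed average.
p = (cap - avg) / (cap - floor)
print(f"~{p * 100:.1f}% of frames at the floor explains the average")  # ~2.7%
```

Shallower drops would have to be more frequent to produce the same average, but either way the arithmetic backs the "very steady 30fps" reading.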
@Dire_Weasel: Yeah, just like no ridiculous amount of hand-waving makes an average of 50.98 fps with dips down to 33 fps a "60 fps game". Fanboys on both sides of this fence are overreaching. In reality, the PS4 is averaging much higher framerates but is also slowing down by a much larger margin, which can make for a more inconsistent experience. The Xbox One version is averaging 29.84 fps, which is essentially "30 fps", and only dips by an average of 4 fps in the most extreme scenes, while the PS4 dips by an average of 27.5 fps in the extreme cases. Certainly a more stable experience on the Xbox One, and some would argue that it is preferred.
True, the PS4 wins this one from a purely technical/"bragging rights" standpoint, but this "60fps vs 30fps", "double the performance" ownage does not apply. Both games look equal in terms of graphics and resolution; the PS4 version runs faster but isn't locked at 60 fps by any stretch of the imagination, and the Xbox One version, while slower, has more stable performance overall.
EDIT: I certainly wasn't saying that YOU said the PS4 version was a "60 fps game". I was just echoing the cow battlecry for this particular debate and was mostly speaking in general. I was replying to your post, but this wasn't directed at you in particular.
Graphics comparisons are skewed when you don't account for the framerate.
For example, my friends told me PS3 graphics were just as good as the 360's. What they didn't tell me when I got my PS3 was that I was going to end up playing a freaking slide show rather than a game.
Of course. But frame rate consistency is just as important. Actually, I'd argue, more important.
A consistent 60fps is better than a consistent 30fps, but both are better than an inconsistent frame rate even if the latter frame-rate is higher in terms of absolute numbers.
Here's what Digital Foundry said on the subject in their G-Sync review:
"We are automatically programmed to believe that higher frame-rates are better, but there's a reason why the vast majority of consoles titles are locked at 30fps - a mostly unchanging frame-rate is easier on the eye, and provides a consistency in input lag that goes out of the window when performance fluctuates with an unlocked frame-rate."
"In the era of the 60Hz monitor, the most consistent, judder-free experience we can get is either with a locked 60fps, or else the console standard 30fps."
"And if our G-Sync testing has taught us anything, it's that - within reason - consistent frame-rates are more important than the fastest possible rendering in any given situation."
That's the utterance of a moron who doesn't have anything worthwhile to add to the discussion and knows he has no argument to answer the facts and reasoning put forward by those he is lashing out against.
lol the MS nuthuggers in this thread are pulling bullshit out of their ass. "but duh cap make da framerate um more stable looks better cuz Microsoft and um duur kinect". I hope you douchebags realize that nobody is biting on your BS and spin. Time to put on your big girl pants and accept that you're gonna be getting Xbowned all gen long....
lostrib is a hermit. lol!! Cows think everyone who disagrees with them is a lem. lmao!! Just shows how stupid they really are.
Lousy developers. Lousy game.
Crystal Dynamics should have done both versions themselves. Complete mess.
yeah dem devs are at fault, not the fact that xbone is an underpowered craptastic console.
The game looks like crap on both systems. Neither developer did a good job, really. Everyone knows the XB1 version takes a 20-30% hit in FPS. I've said that before. It's irrelevant, since the game isn't worth paying for.
Graphics comparisons are skewed when you don't account for the framerate.
For example, my friends told me PS3 graphics were just as good as the 360's. What they didn't tell me when I got my PS3 was that I was going to end up playing a freaking slide show rather than a game.
Of course. But frame rate consistency is just as important. Actually, I'd argue, more important.
A consistent 60fps is better than a consistent 30fps, but both are better than an inconsistent frame rate even if the latter frame-rate is higher in terms of absolute numbers.
Here's what Digital Foundry said on the subject in their G-Sync review:
"We are automatically programmed to believe that higher frame-rates are better, but there's a reason why the vast majority of consoles titles are locked at 30fps - a mostly unchanging frame-rate is easier on the eye, and provides a consistency in input lag that goes out of the window when performance fluctuates with an unlocked frame-rate."
"In the era of the 60Hz monitor, the most consistent, judder-free experience we can get is either with a locked 60fps, or else the console standard 30fps."
"And if our G-Sync testing has taught us anything, it's that - within reason - consistent frame-rates are more important than the fastest possible rendering in any given situation."
The game looks like crap on both systems. Neither developer did a good job, really. Everyone knows the XB1 version takes a 20-30% hit in FPS. I've said that before. It's irrelevant, since the game isn't worth paying for.
No, it doesn't look like crap, not even on Xbox One. But on PS4 it has better textures, and depth of field is present where the Xbox One version appears to be missing it, either completely or in some parts.
Also, the frame difference is not 20 to 30%; once again you have 3rd-grade math skills. In many instances the difference is actually 100%, 30 FPS vs 60 FPS; on average it's 50 FPS vs 29 FPS, which again is more than 50%.
Your bias toward the Xbox brand is 100 times worse than mine for Sony...
@kalipekona said:
PS4 gets Tomb Raider: Stutter Edition.
This is not a 60fps game on PS4. It averages 50fps, which means it is running below 50fps about as much as it is above 50fps.
And that is going to be one stuttery, fluctuating mess.
No, the PS4 gets the superior edition, period. Go cry about how 30 is better than 50, or how 720p is equal to 1080p.
Lemmings this gen really are crazy, and to think these same people actually hyped the Xbox 360 for doing 3 more frames than the PS3...
Once again, did you see the video? Plenty of times the game was running at 60, and over 50 as well. Average means average, period; if 50 is the average, that is mostly what you get. The Xbox One averages 29 FPS and drops as low as 24, but somehow the Xbox One doesn't stutter...
Oh, and that's with worse textures and missing effects, as IGN's videos show...
Memory clock: 2000 MHz (500 MHz × 4), or ~128 GB/s (the lowest I can set for the R9 290).
Results for TR 2013: 63.8 fps average from the built-in benchmark.
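The ~128 GB/s figure follows from the memory clock and the bus width. A sketch of the arithmetic; the 512-bit bus is the R9 290's published spec, and GDDR5 moves four bits per pin per base clock:

```python
def gddr5_bandwidth_gbps(base_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak GDDR5 bandwidth: quad data rate times bus width."""
    transfers_per_sec = base_clock_mhz * 1e6 * 4  # GDDR5 is quad-pumped
    return transfers_per_sec * bus_width_bits / 8 / 1e9

print(gddr5_bandwidth_gbps(500, 512))   # underclocked as above: 128.0 GB/s
print(gddr5_bandwidth_gbps(1250, 512))  # R9 290 stock 5 Gbps: 320.0 GB/s
```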
And what do you usually get with those settings on your card?
What do you mean? The settings above yielded a 63.8 fps average.
No, I was asking what frames you normally get with your card running at full speed.
But nevertheless, your argument would not hold water.
First, because the argument the person I quoted was making was that developers were not using the eSRAM, which is why I said games would run at 10 FPS; you say you limited your card in an attempt to replicate what I was saying.
And there are 2 HUGE problems with your test that keep it from representing my argument.
1. The GPU you used for the test is the 7950 you own; that GPU is not a 12 CU, 1.3 TF part in any way. Even restricted to 68 GB/s of bandwidth it would still yield much better results than the Xbox One GPU.
2. If the Xbox One isn't using eSRAM, it means it is running on the DDR3 memory, which means 68 GB/s for both CPU and GPU. And while I don't think the CPU will hog more than 20 GB/s, the GPU surely will, and there would be a huge bottleneck.
In your test, even if you were able to lower your GPU memory to operate at 68 GB/s, it would not be the same, because your GPU uses a separate pool of RAM which is not shared with the CPU like it would be in the Xbox One's case. So you would have 68 GB/s on your GPU but also 68 GB/s on your DDR3 system memory, which is not quite the same.
I was just asking you for the results to know the impact on your GPU under those conditions, but it was irrelevant anyway.
Aside from color and contrast differences, I don't see much distinction in your two pics.
The PC version is still the superior version in any case.
Not that anyone cares about PC, but that's a bad comparison, because the Definitive Edition has totally different art; as seen there, the PS3 version looks closer to PC because it uses the same art.
Lems got destroyed!!! Worse graphics, lower resolution, and a crap framerate. Plus, they can't livestream gameplay, take screenshots, or use Remote Play!!
Using older drivers enables my old 7950 at 900 MHz to reach ~68 GB/s memory bandwidth, i.e. a ~350 MHz memory clock.
But it wouldn't matter, because you still have 2 bandwidths, one for the GPU and another for the DDR3 memory. If eSRAM is not part of the equation on the Xbox One, it means you have to share 68 GB/s between the CPU and GPU, which isn't quite the same.
Using older drivers enables my old 7950 at 900 MHz to reach ~68 GB/s memory bandwidth, i.e. a ~350 MHz memory clock.
But it wouldn't matter, because you still have 2 bandwidths, one for the GPU and another for the DDR3 memory. If eSRAM is not part of the equation on the Xbox One, it means you have to share 68 GB/s between the CPU and GPU, which isn't quite the same.
You are forgetting the Xbox One has direct CPU-to-GPU access, and it's claimed to be rated at 30 GB/s, i.e. the interactions between CPU and memory would be reduced with a direct connection. The PS4 has similar features with its 20 GB/s (10 GB/s + 10 GB/s) super Onion link.
Gaming PCs with Intel Ivy Bridge/Haswell or AMD Kaveri have 32 GB/s (16 GB/s + 16 GB/s) PCIe 3.0 x16 links, i.e. the PC would be limited by this connection.
The CPU's main job is to run game logic and generate GPU commands, which consumes relatively little memory bandwidth. The PS4 was designed with this in mind, i.e. most of the system memory bandwidth is allocated to the GPU.
Your 10 fps claim is worse than the Xbox One's actual results, and one would think that a high-end PC GPU with the X1's memory bandwidth would beat the X1's results, i.e. more ALU resources, more near-ALU cache/LDS, more tessellation hardware.
Aside from color and contrast differences, I don't see much distinction in your two pics.
The PC version is still the superior version in any case.
Not that anyone cares about PC, but that's a bad comparison, because the Definitive Edition has totally different art; as seen there, the PS3 version looks closer to PC because it uses the same art.
Lems got destroyed!!! Worse graphics, lower resolution, and a crap framerate. Plus, they can't livestream gameplay, take screenshots, or use Remote Play!!
You missed the reduced shadow/ambient occlusion type effects with the Definitive Edition.
You are forgetting the Xbox One has direct CPU-to-GPU access, and it's claimed to be rated at 30 GB/s, i.e. the interactions between CPU and memory would be reduced with a direct connection. The PS4 has similar features with its 20 GB/s (10 GB/s + 10 GB/s) super Onion link.
Gaming PCs with Intel Ivy Bridge/Haswell or AMD Kaveri have 32 GB/s (16 GB/s + 16 GB/s) PCIe 3.0 x16 links, i.e. the PC would be limited by this connection.
The CPU's main job is to run game logic and generate GPU commands, which consumes relatively little memory bandwidth. The PS4 was designed with this in mind, i.e. most of the system memory bandwidth is allocated to the GPU.
Your 10 fps claim is worse than the Xbox One's actual results, and one would think that a high-end PC GPU with the X1's memory bandwidth would beat the X1's results, i.e. more ALU resources, more near-ALU cache/LDS, more tessellation hardware.
Where is this so-called direct access?
One of the reasons the Xbox One isn't true HSA is that the GPU and CPU can't see the same data. They have something similar, but not quite HSA; this was confirmed already. The original diagrams didn't show any 30 GB/s connection from the CPU to the GPU...
So here we go again, you inventing crap in order to make the Xbox One look better...
The fact that you used your own GPU to try to make the comparison confirms how stupid you are when it serves you best. Your GPU is a fu**ing 7950; that GPU beats the living crap out of the Xbox One GPU, and even if you find a way to restrict its bandwidth to 68 GB/s it will still beat the Xbox One GPU silly...
Either way, 68 + 68 = 136 GB/s, which is not quite the same as the Xbox One sharing 68 GB/s between CPU and GPU. The sad part is how you claim you are a hermit, yet all you do is defend MS all the time here.
You are forgetting the Xbox One has direct CPU-to-GPU access, and it's claimed to be rated at 30 GB/s, i.e. the interactions between CPU and memory would be reduced with a direct connection. The PS4 has similar features with its 20 GB/s (10 GB/s + 10 GB/s) super Onion link.
Gaming PCs with Intel Ivy Bridge/Haswell or AMD Kaveri have 32 GB/s (16 GB/s + 16 GB/s) PCIe 3.0 x16 links, i.e. the PC would be limited by this connection.
The CPU's main job is to run game logic and generate GPU commands, which consumes relatively little memory bandwidth. The PS4 was designed with this in mind, i.e. most of the system memory bandwidth is allocated to the GPU.
Your 10 fps claim is worse than the Xbox One's actual results, and one would think that a high-end PC GPU with the X1's memory bandwidth would beat the X1's results, i.e. more ALU resources, more near-ALU cache/LDS, more tessellation hardware.
Where is this so-called direct access?
One of the reasons the Xbox One isn't true HSA is that the GPU and CPU can't see the same data. They have something similar, but not quite HSA; this was confirmed already. The original diagrams didn't show any 30 GB/s connection from the CPU to the GPU...
So here we go again, you inventing crap in order to make the Xbox One look better...
The fact that you used your own GPU to try to make the comparison confirms how stupid you are when it serves you best. Your GPU is a fu**ing 7950; that GPU beats the living crap out of the Xbox One GPU, and even if you find a way to restrict its bandwidth to 68 GB/s it will still beat the Xbox One GPU silly...
Either way, 68 + 68 = 136 GB/s, which is not quite the same as the Xbox One sharing 68 GB/s between CPU and GPU. The sad part is how you claim you are a hermit, yet all you do is defend MS all the time here.
LOL. You missed hotchip.org's presentation.
Where did you get "68GB/s + 68GB/s = 136GB/s"?
The connection between the CPU's interface and the GPU's host interface is ~30 GB/s.
The connection between the CPU's interface and the main memory controller is ~30 GB/s.
The two connections don't increase the main memory's bandwidth. LOL.
The GPU can access most of the DDR3's memory bandwidth.
"I haven't heard of hUMA until today, so I went to look it up. The way I understand it is that in addition to having unified memory access (shared memory between CPU and GPU), which allows the GPU to read CPU memory, it is also a coherent cache system. To quote Ars, "CPU and GPU will always see a consistent view of data in memory. If one processor makes a change then the other processor will see that changed data, even if the old value was being cached."
I remember reading something on this when I got my first alpha kit. I pulled up a couple of our internal white papers and it's pretty clear that this was the exact implementation in the Xbox One's memory system."
• We understand GPGPU and its importance very well. Microsoft invented Direct Compute, and have been using GPGPU in a shipping product since 2010 - it's called Kinect.
• Speaking of GPGPU - we have 3X the coherent bandwidth for GPGPU at 30gb/sec which significantly improves our ability for the CPU to efficiently read data generated by the GPU.
------------------------------
For most of the "out-of-the-box" PC products, the hermits are linked with Microsoft. At this time, Microsoft and Intel's x86 ISA have won the PC system wars.
Without Microsoft, x86's fortunes wouldn't be at their current level; notice why the PS4 sports AMD's x86-64 CPUs instead of IBM's PowerPC/POWER64, MIPS64, or Intel Itanium (IA-64).
The kingmaker for AMD's x86-64/AMD64 ISA is Microsoft.
Note that the AMD and MS alliance is tied to its ex-DEC employees.
"In August 1999, AMD released the Athlon (K7) processor. Notably, the design team was led by Dirk Meyer, who had worked as a lead engineer on multiple Alpha microprocessors during his employment at DEC. Jerry Sanders had approached many of the engineering staff to work for AMD as DEC wound down their semiconductor business, and brought in a near-complete team of engineering experts. The balance of the Athlon design team comprised AMD K5 and K6 veterans."
AMD got a near-complete DEC CPU engineering team, while Microsoft got most of DEC's OS team (from DEC VMS to Windows NT).
From http://www3.sympatico.ca/n.rieck/docs/Windows-NT_is_VMS_re-implemented.html
"Dave takes most of his DEC West team with him to Microsoft."
AMD's return to the ARM ISA is like a regeneration of DEC's StrongARM project. http://www.extremetech.com/computing/175583-amd-unveils-its-first-arm-based-cpu-the-64-bit-8-core-opteron-a1100
The AMD+MS combo is basically a regeneration of DEC.
------------
Your statement "So here we go again you inventing crap in order to make the xbox one look better.." is bull$hit.
Your "68GB/s+68GB/s = 136 GB/s" claim is LOL. Another fan-fiction from you.
Don't do that, Ron, he's gonna say you're on that MisterXMedia stuff, lol. I've been telling torms that MS and AMD are PARTNERS in hUMA and HSA. Everybody was contracted to make the X1: Intel, IBM, and MS with DATAFLOW and supercomputer arch, AMD and MS for FAB.
Don't do that, Ron, he's gonna say you're on that MisterXMedia stuff, lol. I've been telling torms that MS and AMD are PARTNERS in hUMA and HSA. Everybody was contracted to make the X1: Intel, IBM, and MS with DATAFLOW and supercomputer arch, AMD and MS for FAB.
I have been pointing out the AMD+MS via ex-DEC link since the AMD K8 era, which is before the Xbox One.
I'm not going to change my POV for any "johnny-come-lately" console releases.
The AMD K7 Athlon uses the same point-to-point bus protocol as the DEC Alpha EV6.
You even had AMD/DEC's Slot B, which could support both AMD K7 and DEC Alpha CPU slot packages on the same motherboard.
"It was based on the Digital Equipment Corporation's EV7 bus architecture".
AMD, NVIDIA, ATI, and Intel all existed before the Xbox 360, Xbox One, PS3, and PS4.
For AMD HSA and MS, read http://hsafoundation.com/f-a-q/
Q: Will HSA support Microsoft C++ AMP ?
A: HSA will support Microsoft C++ AMP
The HSA Foundation will also work with all software foundations and open initiatives for the promotion of respective language runtimes and libraries around the advancement of heterogeneous computing.
-------
Torms can f**k off. I can afford to waste my time in pursuit of this fool.
"I haven't heard of hUMA until today, so I went to look it up. The way I understand it is that in addition to having unified memory access (shared memory between CPU and GPU), which allows the GPU to read CPU memory, it is also a coherent cache system. To quote Ars, "CPU and GPU will always see a consistent view of data in memory. If one processor makes a change then the other processor will see that changed data, even if the old value was being cached."
I remember reading something on this when I got my first alpha kit. I pulled up a couple of our internal white papers and it's pretty clear that this was the exact implementation in the Xbox One's memory system."
• We understand GPGPU and its importance very well. Microsoft invented Direct Compute, and have been using GPGPU in a shipping product since 2010 - it's called Kinect.
• Speaking of GPGPU - we have 3X the coherent bandwidth for GPGPU at 30gb/sec which significantly improves our ability for the CPU to efficiently read data generated by the GPU.
------------------------------
Your statement "So here we go again you inventing crap in order to make the xbox one look better.." is bull$hit.
Your "68GB/s+68GB/s = 136 GB/s" claim is LOL. Another fan-fiction from you.
All I see is a 30 GB/s coherent connection from the DDR3 bank to a coherent memory access block; that block is connected to the host/guest IO MMU, the CPU, and the host/guest GPU MMU. I don't see a 30 GB/s connection between the GPU and CPU like you claimed. Where is it?
The only 30 GB/s line there comes from the DDR3 memory, so yeah, I was right: if eSRAM isn't used, the Xbox One has to share its 68 GB/s between the CPU and GPU, and it's clear there. Look at how the DRAM controller also has a direct connection to the GPU... lol
Funny thing is that you're even quoting old-ass articles on Reddit... hahaha
The design of the Xbox One is not true HSA, much less hUMA, and that was already confirmed. In fact, one of the rumored reasons for the bad performance is that they still have to manually flush the GPU, while on PS4 the design eliminated that.
That bold part there isn't talking about a dedicated line between CPU and GPU; it doesn't say that in any way. It talks about the 30 GB/s connection from the DRAM bank to the CPU's cache-coherent memory access... lol
See the blue line? Yeah, that is the 30 GB/s line that also connects to the GPU; the other line is the yellow one, which connects the GPU to the DRAM bank directly.
The maximum read speed of the DRAM bank is 68 GB/s, because that is the speed of the DDR3. But 30 GB/s of that is used by the blue coherent link, which means that of the 68 GB/s you only have 38 GB/s left...
I was right and you are not: there is no direct 30 GB/s aside from the 68 GB/s bandwidth; the 30 GB/s is taken from the 68 GB/s DRAM bandwidth. So yeah, the GPU would only have 38 GB/s, which is way lower than what the 7770 has, and yeah, games would perform badly.
The rest of the crap I erased was totally irrelevant to the argument; like always, you argue things no one is arguing...
I explained what I was asking for, and told you it was irrelevant, because of 2 points that were wrong with your argument.
1. You used a much more powerful GPU than the Xbox One's to do that test.
2. Your PC has 2 different bandwidths, one for the CPU's DDR3 and one for the GPU's GDDR5. So even if your card were limited to 68 GB/s, your PC would also have 68 GB/s from the DDR3 system memory. Or do you have a new kind of PC that doesn't use system RAM?
You are forgetting the Xbox One has direct CPU-to-GPU access, and it's claimed to be rated at 30 GB/s, i.e. the interactions between CPU and memory would be reduced with a direct connection. The PS4 has similar features with its 20 GB/s (10 GB/s + 10 GB/s) super Onion link.
This is true, but it's not what you think it is. The 30 GB/s the Xbox One has from CPU to GPU is not a direct line between the two; it's a line that comes from the DRAM modules and is shared between the DRAM module, the CPU-GPU path, and the host IO, which is where the move engines are. It is not exclusively between CPU and GPU, and it draws its 30 GB/s from the memory bank.
It is very clearly marked in blue where the 30 GB/s line comes from and where it goes; in fact, there is a middle man between the GPU and CPU, called the CPU cache coherent memory access.
You have posted the diagrams for the PS4 here; there is no middle man between the GPU and CPU, just a direct connection. This is why MS's solution wasn't called true HSA or hUMA; they stated they had something similar, but it wasn't quite the same.
And now you are the one with funky memory math...
The 30 GB/s connection is one link; it doesn't have 30 GB/s for every single thing in every single direction. Everything that uses that line eats from those 30 GB/s, and the whole 30 GB/s is taken from the DDR3, as shown on that diagram. So if the CPU is using 30 GB/s from the DDR3, whose max output is 68 GB/s, you can't claim the whole 68 GB/s is still intact, because that is not true; only 38 GB/s remain, because the coherent bandwidth eats from the DRAM controller. So the GPU can only directly access 38 GB/s from the DRAM bank, because the other 30 GB/s is committed to the coherent link fed from the DRAM controller.
So yeah, the max connection for the DRAM controller is 68 GB/s; 30 GB/s is reserved by the coherent BW and 38 GB/s is left. That 30 GB/s BW is mostly for the system, not for the GPU, and if eSRAM isn't used, it's 68 GB/s shared between CPU and GPU no matter how you slice it...
Please find me the middle man in this architecture... lol
The PS4 has a direct link between CPU and GPU that doesn't touch any of the bandwidth between the memory and the CPU, or the memory and the GPU; it's just a straight line direct to the GPU.
So yeah, Sony has 20 GB/s for the CPU and 10/10 on super Onion.
You either have a poor understanding of English or you are plain dense...
Oh, and if you lower your card to 68 GB/s of bandwidth, it would still be 68 GB/s plus the bandwidth of the DDR3 system memory; your PC runs DDR3 as system RAM, which has a bandwidth of 68 GB/s on top of what your GPU has, so yeah, it's 68+68, unlike the Xbox One, which is unified memory with just 68 GB/s and nothing more if you leave the eSRAM unused.
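Whatever the diagram actually shows, the arithmetic being argued here is simple to state. This sketch just encodes tormentos's assumption that coherent CPU traffic comes out of the same 68 GB/s DDR3 pool; whether the coherent path ever runs anywhere near its 30 GB/s peak is exactly what is in dispute.

```python
DDR3_TOTAL = 68.0     # GB/s, Xbox One DDR3 peak
COHERENT_PEAK = 30.0  # GB/s, quoted peak of the coherent path

def gpu_leftover(cpu_coherent_use: float) -> float:
    """DDR3 bandwidth left for the GPU if coherent traffic shares the pool
    (the post's assumption, not a measured figure)."""
    return DDR3_TOTAL - cpu_coherent_use

print(gpu_leftover(COHERENT_PEAK))  # the post's worst case: 38 GB/s for the GPU
print(gpu_leftover(10.0))           # a lighter CPU load would leave 58 GB/s
```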
Don't do that, Ron, he's gonna say you're on that MisterXMedia stuff, lol. I've been telling torms that MS and AMD are PARTNERS in hUMA and HSA. Everybody was contracted to make the X1: Intel, IBM, and MS with DATAFLOW and supercomputer arch, AMD and MS for FAB.
Oh, please STFU, moron. You claimed that there isn't any difference between the two consoles; you claimed the PS4 has a 20% GPU reservation, pulled from deep, deep down your ass... hahaha
Ron knows about hardware, he is just too biased. But you, you are just biased and know sh** about hardware, claiming the PS4 and Xbox One are the same. It's like saying the 7850 OC and the 7770 are the same, well, because you see no difference... lol
@ronvalencia said:
@Tighaman said:
@ronvalencia:
Don't do that, Ron, he's gonna say you're on that MisterXMedia stuff, lol. I've been telling torms that MS and AMD are PARTNERS in hUMA and HSA. Everybody was contracted to make the X1: Intel, IBM, and MS with DATAFLOW and supercomputer arch, AMD and MS for FAB.
I have been pointing out the AMD+MS via ex-DEC link since the AMD K8 era, which is before the Xbox One.
I'm not going to change my POV for any "johnny-come-lately" console releases.
The AMD K7 Athlon uses the same point-to-point bus protocol as the DEC Alpha EV6.
You even had AMD/DEC's Slot B, which could support both AMD K7 and DEC Alpha CPU slot packages on the same motherboard.
"It was based on the Digital Equipment Corporation's EV7 bus architecture".
AMD, NVIDIA, ATI, and Intel all existed before the Xbox 360, Xbox One, PS3, and PS4.
For AMD HSA and MS, read http://hsafoundation.com/f-a-q/
Q: Will HSA support Microsoft C++ AMP ?
A: HSA will support Microsoft C++ AMP
The HSA Foundation will also work with all software foundations and open initiatives for the promotion of respective language runtimes and libraries around the advancement of heterogeneous computing.
-------
Torms can f**k off. I can afford to waste my time in pursuit of this fool.
You are a moron who can't even grasp the damn English language; you understand English as badly as I write it. Look at the moron you just quoted: he says something and you reply with a mountain of crap...
I asked you something simple and you replied with a bunch of crap no one asked for.
Don't do that, Ron, he's gonna say you're on that MisterXMedia stuff, lol. I've been telling torms that MS and AMD are PARTNERS in hUMA and HSA. Everybody was contracted to make the X1: Intel, IBM, and MS with DATAFLOW and supercomputer arch, AMD and MS for FAB.
Oh, please STFU, moron. You claimed that there isn't any difference between the two consoles; you claimed the PS4 has a 20% GPU reservation, pulled from deep, deep down your ass... hahaha
Ron knows about hardware, he is just too biased. But you, you are just biased and know sh** about hardware, claiming the PS4 and Xbox One are the same. It's like saying the 7850 OC and the 7770 are the same, well, because you see no difference... lol
@ronvalencia said:
@Tighaman said:
@ronvalencia:
Don't do that, Ron, he's gonna say you're on that MisterXMedia stuff, lol. I've been telling torms that MS and AMD are PARTNERS in hUMA and HSA. Everybody was contracted to make the X1: Intel, IBM, and MS with DATAFLOW and supercomputer arch, AMD and MS for FAB.
I have been pointing out the AMD+MS via ex-DEC link since the AMD K8 era, which is before the Xbox One.
I'm not going to change my POV for any "johnny-come-lately" console releases.
The AMD K7 Athlon uses the same point-to-point bus protocol as the DEC Alpha EV6.
You even had AMD/DEC's Slot B, which could support both AMD K7 and DEC Alpha CPU slot packages on the same motherboard.
"It was based on the Digital Equipment Corporation's EV7 bus architecture".
AMD, NVIDIA, ATI, and Intel all existed before the Xbox 360, Xbox One, PS3, and PS4.
For AMD HSA and MS, read http://hsafoundation.com/f-a-q/
Q: Will HSA support Microsoft C++ AMP ?
A: HSA will support Microsoft C++ AMP
The HSA Foundation will also work with all software foundations and open initiatives for the promotion of respective language runtimes and libraries around the advancement of heterogeneous computing.
-------
Torms can f**k off. I can afford to waste my time in pursuit of this fool.
You are a moron who can't even grasp the damn English language; you understand English as badly as I write it. Look at the moron you just quoted: he says something and you reply with a mountain of crap...
I asked you something simple and you replied with a bunch of crap no one asked for.
I have provided information on the HSA Foundation working with Microsoft's C++ AMP, which backs Tighaman's post; i.e., a reply post doesn't automatically equal a counter-argument. The moron is you.