Poll: How many years will the PS4/XB1 be able to last this generation of gaming? (65 votes)
So what do you guys think? The last two are bonus choices for people who don't like this gen.
You can't compare flop counts as a way to measure performance, because architectural differences matter more.
The 290X is rated at 5632 GFLOPS while the GTX 980 is at 4616.
Going by that, the 290X should easily be more powerful than the 980, yet the 980 easily beats it.
AMD and Nvidia architectures are very different.
Hell, even comparing two Nvidia cards from different generations may not be accurate when going by flop count.
The 980 is rated at 4616 GFLOPS while the 780 Ti is at 5046, yet the 980 performs around 10-15% better despite having a roughly 10% lower flop count.
The 970 has an almost 30% lower flop count than the 780 Ti, yet performs within 0-5% of it and even surpasses it on rare occasions.
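For reference, here's roughly where those flop figures come from. A minimal sketch, assuming approximate reference clocks for each card (the clocks themselves aren't from this thread; only the resulting GFLOPS figures are), which also shows why peak flops alone can't predict game performance:

```python
# Theoretical peak: shaders x 2 ops/cycle (FMA) x clock.
# The clocks below are approximate reference figures and an assumption on my part.
def peak_gflops(shaders: int, clock_mhz: float) -> float:
    """Theoretical single-precision peak throughput in GFLOPS."""
    return shaders * 2 * clock_mhz / 1000

print(peak_gflops(2816, 1000))  # R9 290X    -> 5632.0
print(peak_gflops(2048, 1127))  # GTX 980    -> ~4616
print(peak_gflops(2880, 876))   # GTX 780 Ti -> ~5046
print(peak_gflops(1664, 1050))  # GTX 970    -> ~3494, ~30% below the 780 Ti
```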
True. Well then, we should compare an overclocked 7850 (since the PS4 is better than a 7850 but worse than a 7870) to a 980 to try and quantify the difference. No way in hell it's even 4x though.
The 970 is gimped, you know...
Even the 7850 seems to be performing better than the PS4 in a lot of games.
For example Alien Isolation.
@RyviusARC: Well, that's not the PS4's fault. Metro Redux is locked at 60fps on PS4. Not quite max settings, probably, but a mix of high/max at least.
Metro 2033 would not be a fair comparison, as some of the settings the PS4 version lacks compared to PC are very demanding.
Also, the video you used was recorded while playing, which drops fps by a lot unless you use Shadowplay (an Nvidia exclusive), and the PC in the video has a weak Phenom II X4 CPU from 2009, which makes recording hurt performance even more.
How about using Thief, which is already better optimized for AMD than for Nvidia?
Here is a performance analysis of Thief on PS4, which runs a few settings lower than Thief maxed out on PC.
And here is my own benchmark of Thief on PC running at max settings at 1440p with 4xSSAA, which makes it render roughly seven times the pixel samples of the PS4 version.
I used Shadowplay to record my benchmark, so the performance drop from recording is only a few frames at its worst.
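A quick back-of-envelope check on that workload ratio, assuming Thief renders at native 1920x1080 on PS4 and that 4xSSAA shades four samples per output pixel:

```python
pc_samples  = 2560 * 1440 * 4    # 1440p with 4xSSAA -> 14,745,600 samples
ps4_samples = 1920 * 1080        # native 1080p      ->  2,073,600 pixels
print(pc_samples / ps4_samples)  # ~7.1x the shading work
```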
If this gen lasts any longer than four more years, PC hardware sales are going to be staggering. Also, System Wars will be a very entertaining place.
@RyviusARC: Well, that CPU still beats the Jaguar by quite a lot, so that's not an issue. Here's a fair benchmark for Metro. Keep in mind that is an average of 53, and the PS4 is locked at 60, meaning its average would be even higher.
The Witcher 3 will be an excellent comparison, because you can count on CDPR getting the most out of all systems, I'm sure. Considering even Ground Zeroes on PC has significant improvements over the PS4 version, the PC-max difference on The Witcher might be quite large.
And again I have to say, if we're comparing chipset to chipset, we should not be comparing SLI. You can do it, but the power usage is insane, and it's just duct-taping hardware together; it doesn't scale 1:1 or even work in all games. But I won't go around in circles on that one; I'll just say I am a single-GPU user.
That Metro 2033 benchmark in your link uses 4xMSAA, which is a lot more demanding than the anti-aliasing the PS4 uses.
Also, SLI is not really demanding on power with the new GTX 900 series.
A 970 SLI setup with an Intel Core i7-3960X Extreme overclocked to 4.6GHz only draws around 439 watts at full load.
And 439 watts is very low for a high-end gaming computer.
Also, the two-card SLI scaling for the 900 series is very good:
around 90-100%.
And these benchmarks were done at the 970's release; recent drivers have improved performance for the 970 and 980 since.
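To make the scaling claim concrete, a minimal sketch (the 60fps base figure is a made-up placeholder, not a measurement from this thread):

```python
def sli_fps(single_card_fps: float, scaling: float) -> float:
    """Two-card frame rate if the second GPU contributes `scaling` of a full card."""
    return single_card_fps * (1 + scaling)

print(sli_fps(60, 0.90))  # 114.0 fps at 90% scaling
print(sli_fps(60, 1.00))  # 120.0 fps at perfect scaling
```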
Simple.
Whenever Sony can make another box that is around 10x more powerful than the PS4, consumes 150 watts, and has a manufacturing price of around $399.99.
So, 5-6 years should be plenty of time.
You can already achieve 10x more power than the PS4 on PC and the gap will continue to widen.
Yes, I know achieving this 10x more power requires a substantially higher-wattage power supply, but it just goes to show how far behind consoles are compared to last gen.
My current PC is more than 6x the power of the PS4.
"You can already achieve 10x more power than the PS4 on PC and the gap will continue to widen."
No you can't.
"My current PC is more than 6x the power of the PS4."
So you have an 11-teraflop GPU? Yeah, sorry bud, that card doesn't exist yet; not even an SLI setup can achieve 11 teraflops yet.
Please go learn more about computers before you make a post because you obviously don't know what you are talking about.
Really? Two 970s don't equal 6x more performance over the PS4. You need to get your math in check.
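For context, here's the arithmetic both sides are leaning on (the PS4's GPU is commonly rated at about 1.84 TFLOPS; the per-card peaks are approximate reference figures, and raw TFLOPS doesn't translate 1:1 into game performance anyway):

```python
ps4_tflops = 1.84
print(round(ps4_tflops * 6, 2))  # 11.04 -> the "11 Teraflop" bar set above

# Approximate peak figures for the cards under discussion:
print(2 * 3.49)  # ~7.0 TFLOPS -> two GTX 970s
print(2 * 4.62)  # ~9.2 TFLOPS -> two GTX 980s, still short of 11
```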
Lol you know nothing.
Even a weaker single Titan is 3x the power of the PS4.
The PS4 can barely handle The Evil Within at 1920x768 with lower-than-max PC settings, and yet it struggles to maintain 30fps, with dips into the 20s.
Meanwhile, I play the game at higher settings at 5.6 times the resolution (3840x2160) and still get 51fps.
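Checking that "5.6 times the resolution" figure (assuming the PS4 version's letterboxed 1920x768 output):

```python
ps4_pixels = 1920 * 768        # 1,474,560 pixels
pc_pixels  = 3840 * 2160       # 8,294,400 pixels
print(pc_pixels / ps4_pixels)  # 5.625
```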
So by performance you are just talking about a bump in resolution? You can't just say your PC is 6x more powerful than the PS4 just because of resolution.
Personally, I think this is the last generation of fixed-hardware consoles; after this gen they will more than likely become streaming platforms. There are several reasons for this: it's easier to run games centrally from servers, development is only needed on PC, exclusives are easily restricted without specific programming, and it utterly annihilates the used-games market. It won't matter if you don't like it; the games industry does, and that is all that matters to them.
How long this generation will last is hard to say, but I suspect at least 7 or 8 years. That depends on Sony and MS keeping gamers happy with seriously outdated hardware in an age where people change their phones, on average, every two years. It also depends on how long it takes the console companies to set up their infrastructure, but with Gaikai and MS's cloud they're already well on their way, even if the rest of the world isn't.
Yah I can.
If you want to go from running a game at 1920x1080 at 30fps to 3840x2160 (4x the resolution of 1080p) at 30fps with the same settings, then you need a video card that is around 4x more powerful.
4x the resolution is around 4x more demanding on the video card.
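The rough rule of thumb behind that claim: at fixed settings, GPU-bound frame rate scales approximately inversely with pixel count (an approximation, not a law):

```python
def required_gpu_factor(w1: int, h1: int, w2: int, h2: int) -> float:
    """Roughly how much more GPU throughput a resolution jump needs."""
    return (w2 * h2) / (w1 * h1)

print(required_gpu_factor(1920, 1080, 3840, 2160))  # 4.0
```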
It's true that both consoles launched at a profit, unlike last generation. One thing people don't take into account, however, is that they accomplished this with very crappy parts and were not even halfway toward cutting edge at launch. I believe by the end of this year tablets will be almost on par.
Tablets still have a few more years to go before they rival the PS4.
They haven't even surpassed the 2006 Nvidia 8800GTX.
What is your framerate if you stick to 1080p?
The game is capped at 60fps, so it won't go higher no matter what.
It would be a lot more than what the PS4 can muster.
Then measure the performance of a multiplat that doesn't have a 60fps cap. I want to see 6x better frame rates at the same resolution as the console version.
BF4 anyone? Alien Isolation?
Too bad they cap the game at 60fps.
Also, just to check my GPU usage, I capped the game at 30fps like the PS4's cap and ran it at 1920x768 (same as the PS4).
I looked at my GPU usage, but it wasn't accurate, because the game was so easy to run at those settings that my video cards were downclocking themselves from a 1530MHz core clock to 1140MHz.
Even at that roughly 25% lower core clock, my GPU usage was only around 15%.
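A rough way to combine those two numbers: utilization is reported against the current clock, so the fraction of the cards' full throughput actually in use is approximately usage times (current clock / full clock):

```python
usage, current_mhz, full_mhz = 0.15, 1140, 1530
print(usage * (current_mhz / full_mhz))  # ~0.11 -> about 11% of full throughput
```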
@RyviusARC: Try doing it with one card; it's not really a fair comparison if you are using SLI.
It is fair, because I made the statement that my PC is 6x+ the power of the PS4, so I have to include the SLI my PC uses.
Wait for @emgesp: to come back with some sort of stipulation.
The main problem is that most games don't take advantage of all my CPU threads, so the frame rate will climb into the 100s but eventually the CPU can't keep up, because the game won't utilize all cores/threads efficiently.
I don't currently have BF4 installed.
I can try Alien Isolation, but note that I forced higher settings than the game normally allows, such as double-resolution shadows and much better reflections.
Also, my CPU (i7 4770k at 4.2GHz) is bottlenecking my GPUs because the game cannot utilize all of it.
My first GPU is at 94% usage while the second is at 75%.
Even with the bottleneck I get around 240fps at max settings at 1920x1080.
Although I don't think the comparison is fair, since the PS4 is capped at 30fps when it should be able to run the game faster than that.
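The ratio being claimed at matched 1080p settings, then (with the poster's own caveat that the PS4's 30fps cap hides its true ceiling):

```python
pc_fps, ps4_capped_fps = 240, 30
print(pc_fps / ps4_capped_fps)  # 8.0 -> ~8x the capped console frame rate
```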
That is some insane fps. Then again, it had better be for a costly SLI setup. How much did your rig cost?
Well, that 7850 is also heavily overclocked, so whatever. I couldn't find a closer benchmark for some reason.
439 watts seems quite low; where did you get that from? I forgot the 9xx series is pretty efficient, but the numbers I found suggest 500 watts minimum for a whole system, not just the GPUs/CPU.
They haven't even surpassed last-gen consoles. Mobile is greatly exaggerated all the time, and Nvidia's numbers in particular are fudged on the mobile front.
6x more powerful while consuming 6x the wattage and at 6x the price. WOW, good for you. Now show me a game outside Star Citizen that takes full advantage of that rig.
Actually it consumes around 3.2x the wattage while costing 3.1x more.
And don't forget the additional cost of PS+, which I didn't even add into the comparison.
Then you have to take into account that the PS4 uses low-quality parts and that a computer is capable of doing much more than the PS4.
Also, I don't have to worry about shitty fps drops like the PS4 has in many games, with Assassin's Creed Unity and The Evil Within dropping into the low 20s and sometimes the teens.
I play Assassin's Creed Unity at 1440p max settings and never drop below 60fps.
The PS4 can see Assassin's Creed Unity drop to 16fps.
And if you play games at high resolutions, then having a high-end rig is a necessity.
At 3840x2160 I can barely handle Nvidia's "A New Dawn" Tech Demo.
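Roughly where those 3.2x / 3.1x multipliers come from (assumptions on my part: a measured PS4 gaming draw of ~137 W, plus the tax-inclusive ~$440 PS4 and ~$1,400 rig prices mentioned in this thread):

```python
print(439 / 137)   # ~3.2x the power draw at full load
print(1400 / 440)  # ~3.2x the price, in line with the ~3.1x claimed
```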
Sub-500 for Haswell + Maxwell under gaming load is normal, especially since he's got SLI.
Your rig only cost you $1300?
The CPU he suggested was not Haswell; it was a 130-watt chip from 2011. And I don't follow "especially since he's got SLI"; SLI means more power.
My rig cost me $1,400 USD, but that is with tax counted.
I counted tax on the PS4 too, just to be fair.
My rig is:
i7 4770k
Noctua NH-D14 CPU heatsink
2x4GB DDR3 2400MHz G.Skill Ares RAM
2x Gigabyte G1 Gaming GTX 970 in SLI
Asus Maximus VI Hero motherboard
Corsair HX1000 power supply
Corsair Air 540 case with 3 Corsair fans + 3 Cougar 120mm fans
2x Western Digital Blue 1TB hard drives
Got it from this site, which uses a CPU that is actually more power-hungry than my own.
They measured 436 watts in SLI for the whole system.
The reason they suggest a much higher-wattage power supply is that some no-name power supplies have high wattage ratings but low amperage on the 12V rails.
Amperage is very important when powering video cards.
http://www.guru3d.com/articles_pages/geforce_gtx_970_sli_review,4.html
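Why the 12V amperage matters more than the wattage printed on the box: graphics cards draw nearly all their power from the 12V rail(s). A rough sketch using the GTX 970's rated 145 W TDP:

```python
gtx970_tdp_watts = 145
amps_per_card = gtx970_tdp_watts / 12
print(amps_per_card)      # ~12 A per card on the 12V rail
print(2 * amps_per_card)  # ~24 A for SLI, before CPU and board power
```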
Well I must admit that is impressively low.
But what about power spikes? I'm thinking that bench is a best-case scenario. You don't have a twice-rated power supply for nothing; PC GPUs fluctuate.
Well, in any case, Maxwell is an impressive design for sure; it had to be, since they've been stuck on 28nm. Nice talking with you, I'm out.