GTAV is 1080p 30fps on both PS4 and X1

This topic is locked from further discussion.


#101  Edited By GrenadeLauncher
Member since 2004 • 6843 Posts

@evildead6789 said:

@Tighaman said:

What????? How is that possible? If the Xbox runs the game at 1080p, the PS4 should at least run the game at 1440p at 60fps.

Rockstar is a joke, the PS4 is 50% more powerful, preorder canceled.

HAHAHA

the ps4 50 percent more powerfull

BWHAAHAHAAA

How many 1440p TVs have you seen about, lemmings?


#102 commander
Member since 2010 • 16217 Posts

@scottpsfan14 said:
@gamersjustgame said:

@scottpsfan14 said:
@gamersjustgame said:

@X_CAPCOM_X said:

@ps4hasnogames said:

@Tighaman said:

What????? How is that possible? If the Xbox runs the game at 1080p, the PS4 should at least run the game at 1440p at 60fps.

Rockstar is a joke, the PS4 is 50% more powerful, preorder canceled.

lol, the problem is...it's not more powerful....if it was then games like COD: AW would look better on ps4, but look how much better the Xbox One version looks

[Embedded video]

Actually it objectively is much more powerful.

It isn't more powerful. They are very comparable. The PS4 is slightly more powerful, but nowhere near as powerful as PS4 fanboys want to believe. If the PS4 was 50% more powerful, GTA and Destiny would both reach 60fps. The truth is, it's nowhere near 50% more powerful. At most, it might be 15 to 20 percent. They are comparable low-end gaming rigs.

Source? You don't have it and don't know shiyt.

50% more powerful means that the Xbox One has about two-thirds the power of the PS4. Meaning that if the XB1 can handle 30fps on average, the PS4 can handle 45fps on average... not enough to lock at 60, is it? That's why Destiny and GTA 5 are both locked at 30fps on PS4 and XB1. I expect a few extra visual settings to be enabled on the PS4 version unless they strive for parity.
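(A quick sketch of that arithmetic, assuming performance scales linearly with the claimed power ratio, which is a simplification:)

```python
# Framerate scaling implied by "PS4 is 50% more powerful than XB1".
# Assumes performance scales linearly with raw power -- a simplification.
ps4_over_xb1 = 1.5                    # "50% more powerful"
print(f"XB1 is ~{1 / ps4_over_xb1:.0%} of the PS4's power")                 # ~67%

xb1_fps = 30
print(f"30 fps on XB1 scales to ~{xb1_fps * ps4_over_xb1:.0f} fps on PS4")  # 45, short of a locked 60
```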

And then there is the drastically better-equipped GPU compute engine in the PS4. But you have DX12, the Cloud and eSRAM, right?

No.

Hmm, it's clear that this is personally affecting you. So much so that you refuse to accept facts. You just spout the "minimal power difference" bullshit with no actual proof on the matter. But what we do know is the specs of both consoles. We know that the XB1 has an 8-core 1.75GHz CPU while the PS4's is at 1.6GHz. We know the XB1 boasts a 1.31TFLOP GPU with 12 CUs, 768 shader cores, 48 TMUs and 16 ROPs, while the PS4 boasts a 1.84TFLOP GPU with 18 CUs, 1152 shader cores, 72 TMUs and 32 ROPs. In the PC world, that amounts to roughly 50% more GPU power, while the XB1 has a 10% advantage in terms of total CPU grunt.
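(Those TFLOP figures follow from the shader counts quoted here and the 853MHz/800MHz GPU clocks cited elsewhere in the thread; a minimal sketch of the usual shaders × clock × 2 ops-per-cycle estimate:)

```python
# Theoretical single-precision throughput: shader cores x clock x 2 ops/cycle (FMA).
def tflops(shader_cores: int, clock_mhz: float) -> float:
    return shader_cores * clock_mhz * 1e6 * 2 / 1e12

xb1 = tflops(768, 853)    # ~1.31 TFLOPS
ps4 = tflops(1152, 800)   # ~1.84 TFLOPS
print(f"XB1: {xb1:.2f} TFLOPS, PS4: {ps4:.2f} TFLOPS, ratio: {ps4 / xb1:.2f}x")
```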

Factoring in the RAM, the PS4 has 8GB of GDDR5 clocked at 5,500MHz effective. The XB1 has 8GB of DDR3 clocked at 2,133MHz. The XB1 also has 32MB of eSRAM. The PS4 has 8 ACEs, each able to handle 8 compute commands at a time. The XB1 has 2 ACEs, each handling 8 compute commands just the same. That's 64 compute commands for the PS4 and 16 for the XB1. These are things we know are true. They are fact. These benchmarks are unbiased and equally optimized pieces of software on all devices to ensure accurate results. Such is the definition of a benchmark.
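(The memory bandwidth figures can be sanity-checked the same way, assuming the 256-bit memory bus both consoles are generally credited with; the eSRAM sits outside this, as discussed later in the thread:)

```python
# Peak main-memory bandwidth = bus width (bytes) x effective transfer rate.
def bandwidth_gb_s(bus_bits: int, transfer_mt_s: float) -> float:
    return bus_bits / 8 * transfer_mt_s * 1e6 / 1e9

print(f"PS4 GDDR5: {bandwidth_gb_s(256, 5500):.0f} GB/s")   # ~176 GB/s
print(f"XB1 DDR3:  {bandwidth_gb_s(256, 2133):.1f} GB/s")   # ~68.3 GB/s

# Compute-command queues: ACEs x 8 queues each.
print(f"Compute queues -- PS4: {8 * 8}, XB1: {2 * 8}")      # 64 vs 16
```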

All that said and done, it all depends on the developer and what they do with the hardware.

The only difference between the PS4 and X1 is the GPU and the memory bandwidth. You're comparing stats like CUs, shader cores and ROPs, but the fact of the matter is, and this is a consensus among hardware specialists, that the PS4 is comparable to an HD 7850 and the Xbox One to an HD 7790.

While the PS4 is more powerful, the 50 percent claims are just plain laughable. The difference between an HD 7850 and an HD 7790 is not 50 percent at all, and the memory bandwidth doesn't really do anything, because system RAM doesn't give any extra performance above DDR3 1333MHz. The extra bandwidth on the 7850 is needed, though, because it's a stronger card. The bandwidth of the eSRAM is not added into the total memory bandwidth when the two systems are compared, simply because it's a different architecture.

The only reason the PS4 seemed a lot more powerful is that the eSRAM wasn't used at all, because there weren't any dev tools in the beginning, and because 10 percent of the GPU was reserved for the Kinect. Now things are different: the Kinect reservation is disabled, freeing that 10 percent, and the dev tools are readily usable. They even overclocked the GPU in the Xbox One, and the CPU as well.

After all that's said and done, multiplats look exactly the same, simply because the difference between an overclocked 7790 and an HD 7850 is 12 percent, and that's the reason you hardly see any difference now between the PS4 and X1, and that will simply stay that way. The PS4 isn't magically better because it uses GDDR5; for system RAM that means nothing. Just like 8 cores in the PS3 meant nothing, because that system was severely bottlenecked by the GPU. Still, the difference between the PS3 and X360 is way bigger than the difference between the X1 and PS4. The Xbox One may use DDR3 for the GPU, but they doubled the bus width to communicate with the eSRAM. If you realize the bus is 128-bit on an HD 7790, well, then you can only imagine how much overkill the GDDR5 at that speed is on the HD 7850.

TFLOPS give a warped image as well; the eSRAM isn't counted in them, and super-fast on-chip RAM does give extra performance, whether you like it or not. Just look at the difference between Phenom II and Athlon II CPUs.

The developer can do what they want. An HD 7850 paired with the power of an i3 won't magically run way better than the same system with an HD 7790. The shader count may be 50 percent more, but a system is way more than shaders alone.
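(On the overclock point: going from 800MHz to the 853MHz quoted later in the thread is a modest bump on paper; a quick sketch:)

```python
# Effect of the Xbox One GPU overclock (800 MHz -> 853 MHz) on theoretical throughput.
shaders, base_mhz, oc_mhz = 768, 800, 853
to_tflops = lambda mhz: shaders * mhz * 1e6 * 2 / 1e12
print(f"Clock gain: {oc_mhz / base_mhz - 1:.1%}")                        # ~6.6%
print(f"TFLOPS: {to_tflops(base_mhz):.2f} -> {to_tflops(oc_mhz):.2f}")   # ~1.23 -> ~1.31
```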


#103 Antwan3K
Member since 2005 • 9396 Posts

@evildead6789: Well said


#104 Tighaman
Member since 2006 • 1038 Posts

@GrenadeLauncher:

Supersample then downsample? Surely the PS4 is powerful enough to do it if the Xbox One is getting 1080p?


#105  Edited By AM-Gamer
Member since 2012 • 8116 Posts

@ps4hasnogames: Lol, take the lem goggles off. All assets are identical; the PS4 version just runs at a higher res. If you love crushed blacks, adjust the gamma bar before you start the game.

Also expect GTA 5 to have lower quality assets on X1.


#107  Edited By scatteh316
Member since 2004 • 10273 Posts

@ps4hasnogames said:

@Tighaman said:

What????? How is that possible? If the Xbox runs the game at 1080p, the PS4 should at least run the game at 1440p at 60fps.

Rockstar is a joke, the PS4 is 50% more powerful, preorder canceled.

lol, the problem is...it's not more powerful....if it was then games like COD: AW would look better on ps4, but look how much better the Xbox One version looks

[Embedded video]

Yay for different contrast levels.... noob


#108 commander
Member since 2010 • 16217 Posts

@scottpsfan14 said:

A 7790 is higher specced than the XB1 GPU in every single way. It's a 1.79TFLOP GPU as opposed to the 1.31TFLOP one in the XB1. That's actually more FLOPS than the 7850 (1.76TFLOPS), but its power isn't distributed as well for graphical performance as the 7850's. The XB1 is more comparable to the 7770 with slightly higher specs. The PS4 also has slightly higher specs than the 7850.

Those benchmark pictures I posted above are the results. Take them or leave them. The GPU is almost 50% stronger in the PS4 whether you like it or not. It's not just about the total FLOP numbers, it's about what makes up those results.

                   HD 7790                     Xbox One GPU                PS4 GPU
Core clock         1000MHz                     853MHz                      800MHz
Compute units      16 CUs (896 shader cores)   12 CUs (768 shader cores)   18 CUs (1152 shader cores)
ROPs               16                          16                          32
TMUs               56                          48                          72
Compute            1.79 TFLOPS                 1.31 TFLOPS                 1.84 TFLOPS
Pixel fillrate     16 GPix/s                   13.6 GPix/s                 25.6 GPix/s
Texel fillrate     56 GTex/s                   40.9 GTex/s                 57.6 GTex/s
Memory bandwidth   96GB/s GDDR5                68GB/s DDR3                 176GB/s GDDR5

Justify it how you want, but we are talking GPU power here. 50% more compute units. 100% more ROPs. 50% more texture mapping units. Almost double the pixel fillrate. And a wider memory bandwidth. And this results in roughly 40% more TFLOPS. You could say 12% faster, but you would be wrong. In nobody's book is the PS4's GPU only 12% faster than the XB1's. You think an extra 53MHz overclock will take it to PS4 GPU performance?
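(Reading the ratios straight off the table above; this only restates the listed specs and says nothing about real-world performance:)

```python
# Spec ratios taken from the comparison table above (Xbox One GPU vs PS4 GPU).
specs = {                          # (XB1, PS4)
    "compute units":       (12, 18),
    "ROPs":                (16, 32),
    "TMUs":                (48, 72),
    "pixel fill (GPix/s)": (13.6, 25.6),
    "TFLOPS":              (1.31, 1.84),
    "bandwidth (GB/s)":    (68, 176),
}
for name, (xb1, ps4) in specs.items():
    print(f"{name}: PS4 is {ps4 / xb1 - 1:+.0%} vs XB1")
```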

You also disregard the PS4's GPU compute engine and ACEs compared to the XB1's: 8 ACEs, each able to handle 8 compute commands at once, 64 in total, compared to the XB1 GPU's 2 ACEs (16 in total). And the PS4 has more compute units to spare on top of that.

Benchmark numbers don't lie, and the PS4 GPU shows 50% better performance in all tests. The almost 10% CPU advantage in the XB1 isn't going to cut it when devs start using GPGPU compute on these consoles. The PS4 is simply more capable.

And which multiplats look exactly the same on these consoles, apart from Destiny? What about Shadow of Mordor, which is 900p on XB1 with fewer on-screen effects, less foliage and lower shadow resolution, and 1080p on PS4 with higher-res shadows and foliage while still keeping a solid framerate? I guess you will disregard that and focus on the "parity".
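(For scale, the 900p versus 1080p gap mentioned there is a sizeable difference in pixels rendered per frame; a small sketch:)

```python
# Pixel counts behind the resolutions mentioned above.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_900p = 1600 * 900     # 1,440,000
print(f"1080p renders {pixels_1080p / pixels_900p - 1:.0%} more pixels per frame than 900p")  # ~44%
```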

Yes, but FLOPS don't give the full picture, just like they didn't give the full picture with the X360. It would make sense if you were just comparing a single card, but in consoles like these it doesn't work that way.

Why are you trying to argue with every hardware specialist on the internet? The Xbox One GPU is comparable with the HD 7790 in performance; that doesn't mean it is an HD 7790. The overclock is also only a couple of percent; even without the overclock, the PS4 would still only be about 15 percent faster.

Benchmarks how? Where are these benchmarks? Did they use the eSRAM in these benchmarks? Was it with the 10 percent Kinect reservation disabled? And I'm not even counting the CPU advantage on the X1.

As for Shadow of Mordor, the foliage and shadow resolution point is correct, but it's still not even the difference between medium and high on the PC. The resolution is another matter; with these changes in detail settings the Xbox One should have performed better, and it's pretty obvious why. The engine for Shadow of Mordor doesn't support the eSRAM, but since texture tiling (the main usage of eSRAM) will be standard in DX12, Shadow of Mordor will be one of the last games that shows these differences.

If it were the other way around like you said, that the devs strive for parity, that would be ridiculous, since the consoles are like PCs, so all they need to do is adjust ini files.

You seriously think they're going to deliberately lower the detail settings on the PS4 to achieve parity, lol. If they did that, then everyone would do it, because it would have a reason; a reason that is very unlikely, because the PS4 is actually a very weak system. Every dev is going to get everything it can out of the system (as long as it doesn't involve changing their engine, because that would be just starting over, so Monolith had no choice here), because the PC versions will be vastly superior, and the further into this gen we get, the harder it will be to hide that.


#110 delta3074
Member since 2007 • 20003 Posts

@SolidTy said:

@ps4hasnogames said:

@SolidTy said:

This gen is certainly taking the crown for the absolute most cross-gen multiplat games I won't re-buy.

Would you have rather waited for the next gen version and played the "superior" one?

My only choice is to avoid buying games in the future and wait and hope they release better versions or bite the bullet and buy the game when it's still relevant.

Tough decision, that one; you could end up waiting for a superior version of a game that never appears, at which point you will buy a game everyone and their dog played last year.

This gen is lame so far; 90% of the games on 8th-gen systems can be played and bought cheaper on 7th-gen systems.


#111 GrenadeLauncher
Member since 2004 • 6843 Posts

@Tighaman said:

@GrenadeLauncher:

Supersample then downsample? Surely the PS4 is powerful enough to do it if the Xbox One is getting 1080p?

Devs probably decide it isn't worth the effort. I'd rather the existing power be spent on a better framerate and better graphical effects anyway.


#112  Edited By commander
Member since 2010 • 16217 Posts

@scottpsfan14 said:

Look, instead of babbling bullshit, back up your claims. I've seen way too many people on SW own you in arguments to take you seriously anyway. You have some stupid fucking conjured-up opinions, like 4 8800GTXs in SLI being more capable than the PS4, and DX9 being able to do everything that DX11 can. Where do you get that from? Try running Infamous SS and DriveClub on that PC and they won't even start, because it lacks the silicon. Even if the FLOPS add up with 4 8800s, that makes no difference.

Also, the consoles are nothing like PCs. They have an x86 CPU with GPUs made by AMD; that's about as far as it goes. But then the PS3 had an Nvidia chip inside. The only thing that actually makes the PS4 "like a PC" is the x86 CPU. The PS4 is vastly more streamlined in design than a PC. Its software layers are entirely console-like. The API is console-like and differs from PC programming in almost every way. Just like all consoles.

If you actually researched things instead of making up shit you think sounds good, then you would know this. A PS4 game isn't an OS application like a PC game. The PS4 runs FreeBSD, but that doesn't mean you could run PS4 games on a PC with FreeBSD installed. The PS4's API is almost at driver level. It's as close to the metal as any other console. Same with the Xbone. Not even slightly like a PC with boatloads of middleware.

Get this: the PS4 GPU has 18 CUs in use with 2 disabled. Each CU has 64 shaders. The Xbox One has 12 CUs, also with 64 shaders each. These are directly comparable GPUs, and eSRAM isn't going to take the XB1 above the PS4.

I really don't know what you're trying to achieve. The 32MB of eSRAM will give small gains in some situations. The PS4's and XB1's GPUs are based on Pitcairn and Bonaire, and Pitcairn > Bonaire. They are directly comparable to AMD's GCN GPUs on PC. There is no hiding from this. For you to make these "12/15% more perf" claims, you must have gotten that number from somewhere? You can't just be spewing dross, can you? Oh, you can, because it's all you ever do.

Both the PS4 and XB1 are miles behind modern PCs, but the Xbox One is further behind, and by a larger margin than you like to think.

Fanboys being in denial doesn't mean they own me. For starters, 4 8800 GTXs do beat or at least match a PS4. Even 3 8800 GTXs do. The HD 7850 in the PS4 is about the same performance as a GTX 480.

Here, in Crysis 1, the game gets a framerate of 30.97 with 4x AA on the highest settings, on very high detail, with a GTX 480.

Here is a benchmark with 3 8800 GTXs in SLI. It gets about 32 fps at 1920x1200, with a much weaker CPU, also on very high detail settings. I think it's pretty obvious 3 8800 GTXs in SLI beat the PS4's GPU, although no hardware specialist needs benchmarks to know this.

DX9 can do everything DX11 can. These are just software optimizations; on DX9 it would require more power, though.

As for the PS4's APU, it's not like there are no drivers for this on the PC. What you're talking about is all work for the driver; the dev has nothing to do with it, lol. Having said that, it doesn't matter what operating system the system uses; the hardware makeup is just like a PC with some tweaks, like shared RAM and an APU, things that also exist on PCs and especially laptops. This is not like the PowerPC CPUs. You say it's only the x86, well, x86 is what makes the whole PC market compatible, lol.

You say you're not taking me seriously, but it's the other way around, lol. You use concepts you know nothing about. The benchmark you posted is fake; it only exists on GameSpot, lol. As for the eSRAM, look what the on-chip RAM did for the X360. It matched the PS3 for 90 percent of the gen because of that on-chip RAM. The PS3 is twice as powerful on paper, CPU-wise. The difference between the PS4 and X1 is nowhere near that.


#113 MK-Professor
Member since 2009 • 4218 Posts

Don't worry, master race will come to save the day with 2560x1440, 60fps and mods


#115 miiiiv
Member since 2013 • 943 Posts
@evildead6789 said:

Here is a benchmark with 3 8800 GTXs in SLI. It gets about 32 fps at 1920x1200, with a much weaker CPU, also on very high detail settings. I think it's pretty obvious 3 8800 GTXs in SLI beat the PS4's GPU, although no hardware specialist needs benchmarks to know this.

Are you sure it's the 8800 GTX in that benchmark? Since Crysis 3 is DX11-only unless you're using the unofficial patch.


#116 FireEmblem_Man
Member since 2004 • 20389 Posts

@nengo_flow welcome to the world of console limitations. It takes more RAM and better processing to run games at higher resolutions and 60 FPS. If you want the best graphics experience, go PC, because sticking to consoles will just leave you disappointed if you expect games to all be 1080p and 60 FPS. As developers push for better graphics, it's harder to make games look sharp, and compromises have to be made for games to run properly.


#118 ccagracing
Member since 2006 • 845 Posts

@miiiiv said:

Are you sure it's the 8800 GTX in that benchmark? Since Crysis 3 is DX11-only unless you're using the unofficial patch.

I think this graph refers to the first Crysis game (not Crysis 3) and how multiple cards running in SLI compare to a single card. It shows one card running very high settings achieving just over 10fps, compared to three cards at just over 30fps. You will notice that the biggest jump is between one and two cards, and it's not until the quality settings are increased that the third card offers any real performance increase over the second.


#119 SolidTy
Member since 2005 • 49991 Posts

@delta3074 said:


Tough decision, that one; you could end up waiting for a superior version of a game that never appears, at which point you will buy a game everyone and their dog played last year.

This gen is lame so far; 90% of the games on 8th-gen systems can be played and bought cheaper on 7th-gen systems.

Yep, which is why I just opt to grab the first version instead of waiting and hoping... which is what makes this generation feel so lame so far.


#120  Edited By commander
Member since 2010 • 16217 Posts

@scottpsfan14 said:

You can't be for real. 3 8800GTXs do not beat the PS4's GPU. The PS4 has a shader model 5 GPU with tech that doesn't exist in the 8800s. DX9 can't do everything DX11 can, because DX9 doesn't detect shader model 5 hardware, dumb dumb. It doesn't even support multicore CPUs properly. It has less CPU efficiency and is all-around inferior.

As for the PS3, its PPE runs at 3.2GHz and it has 6 SPEs available to games. Those can even handle GPU workloads to help out the PS3's weaker RSX. The 360 has a more capable GPU than the PS3 by quite a lot, while the PS3 has a huge advantage with the Cell SPEs. That's why early multiplats performed better on 360, when they were just utilizing both GPUs and no SPEs on the PS3. Only near the end did third parties start using the Cell properly, and multiplats were performing as well or better in some cases on PS3.

Saying the PS4 is just a PC is like saying the Megadrive is just an Amiga because it uses a 68000. The Megadrive did get ports to and from the Amiga because of this, but that didn't make the Megadrive an Amiga, or a Mac, or a computer. The PS3 and 360 used Nvidia and ATI GPUs, so does that mean the only thing making them consoles was the CPU architecture? The PS4 is as much of a console as the PS3, except, like the Megadrive in the reign of Motorola CPUs, it's using a standard CPU architecture of its time. The PS4 can still access the hardware like any other console.

Let's talk GPU power. No eSRAM, just GPU grunt. What makes the PS4's GPU only 15% faster? Explain why in detail and totally defy the specs.

Shader model 5 has existed since 2009. Shader model 4 was already available on the 8800 GTX. The tech you're talking about is an API with a chip to calculate more on the GPU instead of the CPU, but it's not like you couldn't do this on DX10 with a strong CPU. The DX11 features don't give the card more power; it's more of an optimization where you need less communication with the CPU, because that can be a bottleneck. There's a reason the CPU in the PS4 is so weak: a lot of the calculations are transferred to the APU and the GPU.

In terms of processing polygons the shader models have exactly the same capabilities. With DX9 the processing would cost a lot more power, but this was just a theoretical comparison. The 8800 GTX is DX10.

The RSX is on par with the Xbox 360 GPU, only the Xbox 360 GPU has on-chip embedded RAM and support for unified shaders, which gave it an advantage. It wasn't that hard to use in games either, but in raw power the Xbox 360 GPU was not stronger than the PS3's RSX.

Comparing the Amiga with the Megadrive is just plain ridiculous; that system had 74 KB of RAM, while the Amiga 500 had 512 KB as standard. For a game like Street Fighter at the same quality you needed 1 MB of RAM on the Amiga, while the Genesis still used 74 KB. With current-gen consoles this is not the case. You can make a PC comparable to the PS4 easily. The only thing you will miss is the shared RAM and some tweaks. The hardware requirements will be very similar, only you'll need a bit more for the OS, API and drivers on Windows, and the PS4 will be a bit more efficient because of the tweaks with the APU, but that is it.

In terms of purely the GPU, the PS4 has 50 percent more shaders. I already said that, but that doesn't translate directly into performance, because the Xbox One has eSRAM. Not only that, the GPU in the Xbox One has a 256-bit bus, while stronger GPUs do just fine with a 128-bit bus in the PC market. The 256-bit bus is there to communicate with the eSRAM. The eSRAM is not some gimmick; it's pretty much the fastest RAM available and the price per MB is costly. Don't forget that the Xbox One is overclocked as well, on the CPU and the GPU, and DirectX is still built by Microsoft. I'm not saying their OS and API are better, but they sure showed us with the Xbox 360 that they can get more out of hardware than Sony. Whether that's still the case remains to be seen, but as of now it's going in the right direction.

All these things won't make the Xbox One magically better than the PS4, though. The shader power in the PS4 will keep the PS4 in the lead, but far from the 50 percent that was marketed. Sony made a good business decision with the GDDR5 (it was more luck than anything else, though; nobody knew GDDR5 prices would become so cheap), and because of that they could sell their system cheaper. The Kinect made the Xbox One even more expensive; the rest is history.

But now the Xbox One has dropped to $350, and if they can keep that price difference that would be good, because the Xbox One isn't worth as much as the PS4 in terms of hardware, but again, not 50 percent less, lol. In my eyes the Xbox One launch failed because of the mandatory Kinect, the Kinect reservation of the GPU, the unavailability of eSRAM tools, bad marketing, bad business decisions (always online, no second-hand market) and the unwillingness to take losses on the bad gamble in hardware choices (although this last one and the mandatory Kinect may not be that bad after all).

The Xbox One's price could have been $379 at launch without a Kinect. However, this gen isn't over yet, and they did sell a lot of systems with a mandatory Kinect at a staggering $500. The mandatory Kinect can still be viable, if VR ever becomes successful and they are able to tap into new power like the cloud or maybe a hardware add-on (it has been done before). There's no saying how many Xbox Ones would have been sold if they had also sold one without a Kinect, $20 cheaper, but it was certainly an option.


#121  Edited By miiiiv
Member since 2013 • 943 Posts

@ccagracing said:


I think this graph refers to the first Crysis game (not Crysis 3) and how multiple cards running in SLI compare to a single card. It shows one card running very high settings achieving just over 10fps, compared to three cards at just over 30fps. You will notice that the biggest jump is between one and two cards, and it's not until the quality settings are increased that the third card offers any real performance increase over the second.

Yes, of course, it's Crysis 1. I just skimmed through the post and got it wrong. He found a bench where 3x 8800 GTX outperforms the 7850. I have my doubts that 3x 8800 GTX is more powerful than the HD 7850 in most scenarios, however. But even if, hypothetically, a PC with a couple of DX10 GPUs had twice the power of the PS4, it still wouldn't be able to replicate everything the PS4 does, since it lacks support for newer rendering techniques. It will of course run DX10-based games better than the PS4 could, though.