PS4.5 could theoretically play PS3 games in native 4K with better visuals and superior FPS.

#51 RekonMeister
Member since 2016 • 784 Posts

GTX 580 is VRAM starved, guys.

No idea how it handles Shadow of Mordor, then, on settings that recommend 3GB of VRAM.

#52 scatteh316
Member since 2004 • 10273 Posts

One game.... genius..... Come to think of it why have you not shown or talked about modern games on your system?

Try running The Vanishing Of Ethan Carter on that GPU at PS4 settings and it'll tank...lmao

#53  Edited By RekonMeister
Member since 2016 • 784 Posts

@scatteh316 said:

One game.... genius..... Come to think of it why have you not shown or talked about modern games on your system?

Try running The Vanishing Of Ethan Carter on that GPU at PS4 settings and it'll tank...lmao

I don't buy everything that releases. shadow of Mordor is a GPU heavy game and is a XONE/ PS4 title that asks for 3GB Vram, why are you not questioning me on that?

#54 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts

I don't think even high-end PCs could emulate the PS3.

#55  Edited By scatteh316
Member since 2004 • 10273 Posts

Taken from AnandTech's review of Assassin's Creed Unity.

"Finally, I mentioned how 2GB GPUs are really going to have problems, especially at higher texture quality settings. The GeForce GTX 770 is a prime example of this; even at 1080p High, minimum frame rates are consistently dropping into the low teens and occasionally even single digits, and Medium quality still has very poor minimum frame rates. Interestingly, at 1600x900 Medium the minimum FPS basically triples compared to 1080p Medium, so if the game is using more than 2GB VRAM at 1080p Medium it's not by much."

You have less memory than 2GB too!!! You would have to run the game at lower settings than the PS4 to get it playable...... god bless that VRAM limitation....

And why are you cherry-picking one game.... I can link you to more reviews that show a VRAM-limited GTX 580 if you wish...

#56 clyde46
Member since 2005 • 49061 Posts

The 580 is an old ass card.

#57 RekonMeister
Member since 2016 • 784 Posts

[Two embedded videos]
@Heirren said:

I don't think even high-end PCs could emulate the PS3.

#58 RekonMeister
Member since 2016 • 784 Posts

@scatteh316 said:

Taken from AnandTech's review of Assassin's Creed Unity.

"Finally, I mentioned how 2GB GPUs are really going to have problems, especially at higher texture quality settings. The GeForce GTX 770 is a prime example of this; even at 1080p High, minimum frame rates are consistently dropping into the low teens and occasionally even single digits, and Medium quality still has very poor minimum frame rates. Interestingly, at 1600x900 Medium the minimum FPS basically triples compared to 1080p Medium, so if the game is using more than 2GB VRAM at 1080p Medium it's not by much."

You have less memory than 2GB too!!! You would have to run the game at lower settings than the PS4 to get it playable...... god bless that VRAM limitation....

And why are you cherry-picking one game.... I can link you to more reviews that show a VRAM-limited GTX 580 if you wish...

I would love to test loads of games, but somehow I don't buy every game; oh right, I'm not a walking money tree. Plus, all the 2GB GPUs mentioned have far less memory bandwidth lol.

#59  Edited By scatteh316
Member since 2004 • 10273 Posts

@rekonmeister said:
[Two embedded videos]
@Heirren said:

I don't think even high-end PCs could emulate the PS3.

So the only PS3 game above (Battle Fantasia) you show runs at a MASSIVE 16fps!! Wow... that's sure close to 4K and higher frame rates than the PS3...... lmao

Are you intentionally trying to make yourself look like an idiot... lmao

#60 scatteh316
Member since 2004 • 10273 Posts

@rekonmeister said:
@scatteh316 said:

Taken from AnandTech's review of Assassin's Creed Unity.

"Finally, I mentioned how 2GB GPUs are really going to have problems, especially at higher texture quality settings. The GeForce GTX 770 is a prime example of this; even at 1080p High, minimum frame rates are consistently dropping into the low teens and occasionally even single digits, and Medium quality still has very poor minimum frame rates. Interestingly, at 1600x900 Medium the minimum FPS basically triples compared to 1080p Medium, so if the game is using more than 2GB VRAM at 1080p Medium it's not by much."

You have less memory than 2GB too!!! You would have to run the game at lower settings than the PS4 to get it playable...... god bless that VRAM limitation....

And why are you cherry-picking one game.... I can link you to more reviews that show a VRAM-limited GTX 580 if you wish...

I would love to test loads of games, but somehow I don't buy every game; oh right, I'm not a walking money tree. Plus, all the 2GB GPUs mentioned have far less memory bandwidth lol.

GTX 970... heck, even the GTX 680 has less bandwidth and is faster than a GTX 580.... you literally have no clue, do you?

#61 RekonMeister
Member since 2016 • 784 Posts
@scatteh316 said:
@rekonmeister said:
[Two embedded videos]
@Heirren said:

I don't think even high-end PCs could emulate the PS3.

So the only PS3 game above (Battle Fantasia) you show runs at a MASSIVE 16fps!! Wow... that's sure close to 4K and higher frame rates than the PS3...... lmao

I know IQ is not a common thing around these parts.

Someone said he doesn't think even a high-end PC can emulate it, so I posted 2 vids. What has that got to do with the thread title? It's just me answering an off-topic post. Plus, learn what "theoretical" means.

#62  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@scatteh316:

You also have to factor in CPU power: a GTX 580 with a modern Intel quad core will still give the PS4 a run for its money. Note too that the PS4 suffers from DX11-style limitations with multiplats; if devs code beyond GNMX (the copy-paste high-level API), then we will see the PS4 pull ahead.

But take a well-coded multiplat, for example BF:

An i7 920 at 3.6GHz with a 1.5GB GTX 580 can play Battlefront at High settings at 1920x1200 with a 50 fps average, and at Ultra settings with a 45 fps average. Most of the efficiency consoles see is on the CPU side of the code, which means that if you can brute-force through the CPU overhead on PC, console efficiency does not mean much.

#63 RekonMeister
Member since 2016 • 784 Posts

@scatteh316 said:
@rekonmeister said:
@scatteh316 said:

Taken from AnandTech's review of Assassin's Creed Unity.

"Finally, I mentioned how 2GB GPUs are really going to have problems, especially at higher texture quality settings. The GeForce GTX 770 is a prime example of this; even at 1080p High, minimum frame rates are consistently dropping into the low teens and occasionally even single digits, and Medium quality still has very poor minimum frame rates. Interestingly, at 1600x900 Medium the minimum FPS basically triples compared to 1080p Medium, so if the game is using more than 2GB VRAM at 1080p Medium it's not by much."

You have less memory than 2GB too!!! You would have to run the game at lower settings than the PS4 to get it playable...... god bless that VRAM limitation....

And why are you cherry-picking one game.... I can link you to more reviews that show a VRAM-limited GTX 580 if you wish...

I would love to test loads of games, but somehow I don't buy every game; oh right, I'm not a walking money tree. Plus, all the 2GB GPUs mentioned have far less memory bandwidth lol.

GTX 970... heck, even the GTX 680 has less bandwidth and is faster than a GTX 580.... you literally have no clue, do you?

No, really, you are the one not understanding memory.

Yes, they are faster in GPU power; GPU power is not the same as memory.

The 1.5GB 580 is suitably matched, and the 3GB 580s were never able to use all that VRAM; it took them running in SLI to even utilize that much VRAM.

Higher resolutions need more GPU power, so 580s would need to be in SLI.

http://www.overclock.net/t/1209698/how-much-better-is-a-gtx580-3gb-vs-1-5gb

You totally don't understand anything, do you? Keep building your PCs and upgrading and wasting cash; it makes no difference to others.

#64 2Chalupas
Member since 2009 • 7286 Posts

It won't ever work for PS3 games because of the Cell architecture; the emulator would never be efficient enough to get PS3 games running straight up on a PS4 GPU. It might just barely manage them well enough that easy backwards compatibility is an option, but running them at 4K? LOLNO.

Perhaps some of the most optimized PC games from that era could run at 4K if somebody were willing to also port and optimize them for the PS4 (i.e. throw out the PS3 port and go back to the PC port as the basis for getting a 4K game on the PS4K... that might be possible). But that assumes "4K gaming" is going to be a thing at all on the PS4K. I'm still not convinced of that; I think it will be more about streaming and movies.

#65 clyde46
Member since 2005 • 49061 Posts

I thought the 680 was gimped compared to the 580. I loved my 580, it was a beast of a card.

#66 RekonMeister
Member since 2016 • 784 Posts

It still is, for a 6-year-old GPU. I'm waiting for Pascal and moving to a 5820K.

@clyde46 said:

I thought the 680 was gimped compared to the 580. I loved my 580, it was a beast of a card.

#67 scatteh316
Member since 2004 • 10273 Posts

@04dcarraher said:

@scatteh316:

You also have to factor in CPU power: a GTX 580 with a modern Intel quad core will still give the PS4 a run for its money. Note too that the PS4 suffers from DX11-style limitations with multiplats; if devs code beyond GNMX (the copy-paste high-level API), then we will see the PS4 pull ahead.

But take a well-coded multiplat, for example BF:

An i7 920 at 3.6GHz with a 1.5GB GTX 580 can play Battlefront at High settings at 1920x1200 with a 50 fps average, and at Ultra settings with a 45 fps average. Most of the efficiency consoles see is on the CPU side of the code, which means that if you can brute-force through the CPU overhead on PC, console efficiency does not mean much.

The PS4 doesn't have any of the DX11 API fallbacks that the PC has because it's not a PC.... we've been over this 1000x already in this forum.

#68 scatteh316
Member since 2004 • 10273 Posts

@rekonmeister said:
@scatteh316 said:
@rekonmeister said:
@scatteh316 said:

Taken from AnandTech's review of Assassin's Creed Unity.

"Finally, I mentioned how 2GB GPUs are really going to have problems, especially at higher texture quality settings. The GeForce GTX 770 is a prime example of this; even at 1080p High, minimum frame rates are consistently dropping into the low teens and occasionally even single digits, and Medium quality still has very poor minimum frame rates. Interestingly, at 1600x900 Medium the minimum FPS basically triples compared to 1080p Medium, so if the game is using more than 2GB VRAM at 1080p Medium it's not by much."

You have less memory than 2GB too!!! You would have to run the game at lower settings than the PS4 to get it playable...... god bless that VRAM limitation....

And why are you cherry-picking one game.... I can link you to more reviews that show a VRAM-limited GTX 580 if you wish...

I would love to test loads of games, but somehow I don't buy every game; oh right, I'm not a walking money tree. Plus, all the 2GB GPUs mentioned have far less memory bandwidth lol.

GTX 970... heck, even the GTX 680 has less bandwidth and is faster than a GTX 580.... you literally have no clue, do you?

No, really, you are the one not understanding memory.

Yes, they are faster in GPU power; GPU power is not the same as memory.

The 1.5GB 580 is suitably matched, and the 3GB 580s were never able to use all that VRAM; it took them running in SLI to even utilize that much VRAM.

Higher resolutions need more GPU power, so 580s would need to be in SLI.

http://www.overclock.net/t/1209698/how-much-better-is-a-gtx580-3gb-vs-1-5gb

You totally don't understand anything, do you? Keep building your PCs and upgrading and wasting cash; it makes no difference to others.

Dude.... please... you're killing yourself...

But let's get back on topic..... the PS4.5 could never play PS3 games in 4K at higher frame rates.

/thread

#69 RekonMeister
Member since 2016 • 784 Posts

@scatteh316 said:
@04dcarraher said:

@scatteh316:

You also have to factor in CPU power: a GTX 580 with a modern Intel quad core will still give the PS4 a run for its money. Note too that the PS4 suffers from DX11-style limitations with multiplats; if devs code beyond GNMX (the copy-paste high-level API), then we will see the PS4 pull ahead.

But take a well-coded multiplat, for example BF:

An i7 920 at 3.6GHz with a 1.5GB GTX 580 can play Battlefront at High settings at 1920x1200 with a 50 fps average, and at Ultra settings with a 45 fps average. Most of the efficiency consoles see is on the CPU side of the code, which means that if you can brute-force through the CPU overhead on PC, console efficiency does not mean much.

The PS4 doesn't have any of the DX11 API fallbacks that the PC has because it's not a PC.... we've been over this 1000x already in this forum.

If it uses DX11 then, sir, YES IT DOES.

#70 04dcarraher
Member since 2004 • 23858 Posts

@rekonmeister said:

No, really, you are the one not understanding memory.

Yes, they are faster in GPU power; GPU power is not the same as memory.

The 1.5GB 580 is suitably matched, and the 3GB 580s were never able to use all that VRAM; it took them running in SLI to even utilize that much VRAM.

Higher resolutions need more GPU power, so 580s would need to be in SLI.

http://www.overclock.net/t/1209698/how-much-better-is-a-gtx580-3gb-vs-1-5gb

You totally don't understand anything, do you? Keep building your PCs and upgrading and wasting cash; it makes no difference to others.

The problem with the statement that the GTX 580 3GB couldn't use the VRAM... it can, especially if the game allocates and stores as much as it can in the buffer. The question then becomes whether the GTX 580 is strong enough to handle properly optimized game assets that require a 3GB buffer..... that depends on how graphically complex the game is.
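The allocated-versus-needed distinction being argued here can be illustrated with a toy cache model (all sizes below are hypothetical, not measured from any game): an engine that opportunistically caches assets will fill VRAM toward the card's capacity, even when the per-frame working set is far smaller.

```python
# Toy model of opportunistic VRAM caching (hypothetical sizes, in MB).
# The engine keeps caching assets until the card is full, so the
# "allocated" number approaches capacity even though each frame only
# touches a much smaller working set.

def fill_vram(capacity_mb, asset_sizes_mb):
    """Cache assets until VRAM is full; return the total allocated."""
    allocated = 0
    for size in asset_sizes_mb:
        if allocated + size > capacity_mb:
            break  # card is full: stop caching, stream/evict instead
        allocated += size
    return allocated

capacity = 3072              # a 3GB card
assets = [64] * 100          # 6.4GB of streamable assets in total
working_set = 1024           # what a frame actually references

allocated = fill_vram(capacity, assets)
print(f"allocated: {allocated} MB")           # fills to capacity
print(f"needed per frame: {working_set} MB")  # much smaller
```

So a tool reporting near-3GB "usage" does not by itself show the game needs 3GB; the engine may just be caching aggressively.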

#71 jhonMalcovich
Member since 2010 • 7090 Posts

@clyde46 said:
@jhonMalcovich said:

Theoretically speaking, pigs could fly.

They can, when you fire them from a cannon.

I was thinking of attaching them to a rocket. It's more humane :P

#72 RekonMeister
Member since 2016 • 784 Posts

@04dcarraher said:
@rekonmeister said:

No, really, you are the one not understanding memory.

Yes, they are faster in GPU power; GPU power is not the same as memory.

The 1.5GB 580 is suitably matched, and the 3GB 580s were never able to use all that VRAM; it took them running in SLI to even utilize that much VRAM.

Higher resolutions need more GPU power, so 580s would need to be in SLI.

http://www.overclock.net/t/1209698/how-much-better-is-a-gtx580-3gb-vs-1-5gb

You totally don't understand anything, do you? Keep building your PCs and upgrading and wasting cash; it makes no difference to others.

The problem with the statement that the GTX 580 3GB couldn't use the VRAM... it can, especially if the game allocates and stores as much as it can in the buffer. The question then becomes whether the GTX 580 is strong enough to handle properly optimized game assets that require a 3GB buffer..... that depends on how graphically complex the game is.

That's different: BF3 would store everything it could in VRAM while only really needing 1GB, even maxed at 1080p.

#73  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@rekonmeister said:
@scatteh316 said:
@04dcarraher said:

@scatteh316:

You also have to factor in CPU power: a GTX 580 with a modern Intel quad core will still give the PS4 a run for its money. Note too that the PS4 suffers from DX11-style limitations with multiplats; if devs code beyond GNMX (the copy-paste high-level API), then we will see the PS4 pull ahead.

But take a well-coded multiplat, for example BF:

An i7 920 at 3.6GHz with a 1.5GB GTX 580 can play Battlefront at High settings at 1920x1200 with a 50 fps average, and at Ultra settings with a 45 fps average. Most of the efficiency consoles see is on the CPU side of the code, which means that if you can brute-force through the CPU overhead on PC, console efficiency does not mean much.

The PS4 doesn't have any of the DX11 API fallbacks that the PC has because it's not a PC.... we've been over this 1000x already in this forum.

If it uses DX11 then, sir, YES IT DOES.

lol, with multiplats most devs do a copy-paste job of DX11 coding standards, i.e. deferred multithreading, on both the X1 & PS4. This means that the consoles suffer from the same CPU limitations as a PC with DX11....

#74 04dcarraher
Member since 2004 • 23858 Posts

@rekonmeister said:

That's different: BF3 would store everything it could in VRAM while only really needing 1GB, even maxed at 1080p.

Like I said, it depends on how graphically complex and optimized the game is.

#75 scatteh316
Member since 2004 • 10273 Posts

@rekonmeister said:
@scatteh316 said:
@04dcarraher said:

@scatteh316:

You also have to factor in CPU power: a GTX 580 with a modern Intel quad core will still give the PS4 a run for its money. Note too that the PS4 suffers from DX11-style limitations with multiplats; if devs code beyond GNMX (the copy-paste high-level API), then we will see the PS4 pull ahead.

But take a well-coded multiplat, for example BF:

An i7 920 at 3.6GHz with a 1.5GB GTX 580 can play Battlefront at High settings at 1920x1200 with a 50 fps average, and at Ultra settings with a 45 fps average. Most of the efficiency consoles see is on the CPU side of the code, which means that if you can brute-force through the CPU overhead on PC, console efficiency does not mean much.

The PS4 doesn't have any of the DX11 API fallbacks that the PC has because it's not a PC.... we've been over this 1000x already in this forum.

If it uses DX11 then, sir, YES IT DOES.

But the PS4 doesn't use DirectX 11..... DirectX 11 requires a licensing agreement, which Sony doesn't have... it uses Sony's own API...... Again, this topic on APIs has been talked about 1000x already on this forum...... Once again proving you know nothing.

#76  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@scatteh316 said:

But the PS4 doesn't use DirectX 11..... DirectX 11 requires a licensing agreement, which Sony doesn't have... it uses Sony's own API...... Again, this topic on APIs has been talked about 1000x already on this forum...... Once again proving you know nothing.

With multiplats, most devs do a copy-paste job of DX11 coding standards, i.e. deferred multithreading, on both the X1 & PS4. The X1's older API was a modified version of DX11, i.e. DX11.X, and it suffered from the same CPU limits as DX11.... Devs then use Sony's GNMX and its ability to copy and paste other API usage without having to code from the ground up in GNM.

This means that the consoles suffer from the same CPU limitations as a PC with DX11.... This is why we see a GTX 750 Ti with a strong dual-core CPU able to compete with the PS4 in multiplats.

With the release of DX12 on the X1, multiplats will no longer be written to DX11 standards, which will free the PS4 from those limits as well.
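The CPU-overhead argument in this exchange can be put into toy numbers (all per-call costs below are hypothetical, purely for illustration): if the CPU must submit every draw call each frame, the API's per-call cost caps the frame rate, and either a lower-overhead API or a faster CPU raises that cap.

```python
# Toy draw-call arithmetic (hypothetical costs): the frame rate when
# CPU-side draw-call submission is the bottleneck.

def cpu_capped_fps(draw_calls, us_per_call):
    """Max FPS if draw-call submission alone fills the frame time."""
    frame_time_s = draw_calls * us_per_call / 1_000_000
    return 1.0 / frame_time_s

calls = 5000  # draw calls per frame

high_overhead = cpu_capped_fps(calls, 10.0)  # high-overhead path: 10 us/call
low_overhead  = cpu_capped_fps(calls, 2.0)   # low-level API:       2 us/call
brute_force   = cpu_capped_fps(calls, 5.0)   # same API, 2x faster CPU

print(f"high-overhead API: {high_overhead:.0f} fps")  # 20 fps
print(f"low-overhead API:  {low_overhead:.0f} fps")   # 100 fps
print(f"brute-forced CPU:  {brute_force:.0f} fps")    # 40 fps
```

This is the sense in which a strong desktop CPU can "brute force" through a high-overhead API, while a low-overhead API matters most on weak console CPU cores.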

#77 Shawty_Beatz
Member since 2014 • 1269 Posts

@Heil68 said:

Good god, think of the graphics threads then...lol

I dunno, those 4K shots in the OP don't look anything special.

#78  Edited By gago-gago
Member since 2009 • 12138 Posts

The PS4 can already play PS3 games through a paid subscription, PS Now, so the technology is already there. As for discs, Sony could, but they won't, because they know they can get away with milking their consumers through remasters and more subscriptions. I think they're going to get pressured into offering BC, but it's going to be fun seeing their users flip-flop on their anti-BC stance.

#80 miiiiv
Member since 2013 • 943 Posts
@scatteh316 said:

One game.... genius..... Come to think of it why have you not shown or talked about modern games on your system?

Try running The Vanishing Of Ethan Carter on that GPU at PS4 settings and it'll tank...lmao

I wouldn't be too surprised if the old 580 kept up pretty well with the PS4 in The Vanishing Of Ethan Carter.

Here's a benchmark at 1080p. The 580 could very well be able to hold a pretty stable locked 30 fps. Sure, the PS4 frame rate is a bit higher, but is the PS4 version even comparable to "very high" quality?

#81  Edited By ronvalencia
Member since 2008 • 29612 Posts

@rekonmeister said:
@scatteh316 said:
@rekonmeister said:
@scatteh316 said:

Your 580 will never keep up with the PS4's GPU, you fool...

The PS4's GPU is considerably weaker.

1080p maxed on a GTX 580.

It's not 'considerably weaker' at all... the GTX 580 is a little bit faster than a 7850, but the PS4 isn't a PC now, is it?

Toss in much more efficient coding and the PS4's GPU slams that old-as-shit GTX 580 around, and if you have the version with the piddly 1.5GB of VRAM the PS4 will smack it around even more.

I just fucking love how people compare 2 graphics cards on PC as a way of comparing how a GPU is used in a console vs a PC.....

Different platforms, numb nuts, with different APIs and coding.

The 580 is around 5-10% faster than the HD 7870; factor in that its stock clock of 775MHz can be pushed to almost 1000MHz with an OC, and the PS4 is nowhere near it. And memory does not work how you think it does.

The 580 has a 384-bit bus with a massive memory bandwidth of over 200GB/s, which means data in VRAM is moved much faster; with a memory subsystem that rapid, it can handle more memory-intensive work than a lower-bandwidth card with the same amount of VRAM. A slower memory system would make the 1.5GB a very limiting factor; however, that's not the case with the 580.

I mean, you must think the 8GB of GDDR5 in your PS4 makes it superior to a GTX 970 with only 4GB of RAM lol.

My 580 validation:

http://www.techpowerup.com/gpuz/details.php?id=des9h

WTF? NVIDIA Fermi was smashed in the recent Far Cry Primal PC benchmark.

The PS4 has CPU issues.
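A side note on the "over 200GB/s" figure quoted above: peak memory bandwidth is just bus width times effective data rate, so it is easy to sanity-check. At reference clocks (384-bit bus, 4008 MT/s effective GDDR5) a GTX 580 lands at roughly 192GB/s, so breaking 200GB/s implies a memory overclock (the OC figure below is a hypothetical example).

```python
# Peak memory bandwidth = (bus width in bytes) * (effective data rate).
# Reference GTX 580: 384-bit bus, GDDR5 at 4008 MT/s effective.

def peak_bandwidth_gbps(bus_width_bits, effective_mtps):
    """Peak bandwidth in GB/s from bus width (bits) and data rate (MT/s)."""
    return bus_width_bits / 8 * effective_mtps * 1e6 / 1e9

print(f"GTX 580 stock: {peak_bandwidth_gbps(384, 4008):.1f} GB/s")  # 192.4
print(f"GTX 580 OC'd:  {peak_bandwidth_gbps(384, 4400):.1f} GB/s")  # hypothetical ~10% memory OC
```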

#82  Edited By ronvalencia
Member since 2008 • 29612 Posts

@04dcarraher said:
@scatteh316 said:

But the PS4 doesn't use DirectX 11..... DirectX 11 requires a licensing agreement, which Sony doesn't have... it uses Sony's own API...... Again, this topic on APIs has been talked about 1000x already on this forum...... Once again proving you know nothing.

With multiplats, most devs do a copy-paste job of DX11 coding standards, i.e. deferred multithreading, on both the X1 & PS4. The X1's older API was a modified version of DX11, i.e. DX11.X, and it suffered from the same CPU limits as DX11.... Devs then use Sony's GNMX and its ability to copy and paste other API usage without having to code from the ground up in GNM.

This means that the consoles suffer from the same CPU limitations as a PC with DX11.... This is why we see a GTX 750 Ti with a strong dual-core CPU able to compete with the PS4 in multiplats.

With the release of DX12 on the X1, multiplats will no longer be written to DX11 standards, which will free the PS4 from those limits as well.

Just to add: the XBO's support for DX12 SM6 (AMD's wavefront programming model exposed to DX programmers) will further improve PS4 ports.

On the PS4's "DirectX 11"-style support: it exists to make game ports easy. Sony can rename Microsoft's APIs and change parameter order, but the programming model remains similar.

The PS4 also has APIs similar to Mantle or DirectX 12.

The PS4 is effectively an "Xbox" with an 18-CU GPU, minus the Microsoft logo and Xbox Live services.

#83  Edited By Ten_Pints
Member since 2014 • 4072 Posts

In theory you can make pigs fly.