DX12 oversimplified... Yes, it applies to Xbox One as well.


This topic is locked from further discussion.


#51  Edited By GrenadeLauncher
Member since 2004 • 6843 Posts

@StormyJoe said:

@ttboy: Microsoft software engineers > Sony software engineers.

So, to say "Sony will have an answer" is baseless speculation.

Except they do, StormyJoke. Look up the ICE team.

Incremental improvements in software can't make up for the hardware gulf. Deal with it, for your own good.

@FastRobby said:

Lol, no they haven't. The only ones saying that are the ones that haven't used it.

Xbox One at least has games at 60fps, because gameplay trumps graphics. Forza 5 runs at 1080p/60fps; sure, they had to cut back on things like the crowd, but that doesn't matter, because racing games need 60fps and you don't see the crowd when you're racing at 100mph. Naughty Dog is already saying they probably can't get to 60fps for UC4, but yeah, the Xbox One is the one lagging behind ;)

Yes they have, Slow Robert.

Must break your heart that the PS4 also has 1080p 60fps games, and they don't need to look like PS2 ports to get there.

ND is saying if they can't get it to 60FPS without big dips and screen tearing, they'll lock it at 30fps (which is what they'll probably do). They'll continue to optimise as necessary. Reading is important.


#52 deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

@GrenadeLauncher said:

@FastRobby said:

Lol, no they haven't. The only ones saying that are the ones that haven't used it.

Xbox One at least has games at 60fps, because gameplay trumps graphics. Forza 5 runs at 1080p/60fps; sure, they had to cut back on things like the crowd, but that doesn't matter, because racing games need 60fps and you don't see the crowd when you're racing at 100mph. Naughty Dog is already saying they probably can't get to 60fps for UC4, but yeah, the Xbox One is the one lagging behind ;)

Yes they have, Slow Robert.

Must break your heart that the PS4 also has 1080p 60fps games, and they don't need to look like PS2 ports to get there.

ND is saying if they can't get it to 60FPS without big dips and screen tearing, they'll lock it at 30fps (which is what they'll probably do). They'll continue to optimise as necessary. Reading is important.

Yeah, The Last of Us Remastered looked like a last-gen game, and yeah, I know there are games running at 1080p/60fps on PS4, like Resogun and Flower.

And ND already knows they can't get it to 60fps, because otherwise they wouldn't make this statement.


#53 slimdogmilionar
Member since 2014 • 1345 Posts

@ttboy: There has been a lot of speculation about heat in the PS4's design and the fact that the power supply is inside. But the fact remains they designed the console to be geared more towards GPGPU; this is something Cerny himself has been pushing in his interviews. Like I said before, devs have already said the PS4's CPU is a bottleneck, so I'm guessing GPGPU will be the savior. But I also wonder how much heat the PS4 can take before it starts to have an effect on the system. I've seen complaints on this forum and others about how loud the PS4 gets when it's hot, and we all saw what happened to the 360 when the GPU got too hot.


#54 GrenadeLauncher
Member since 2004 • 6843 Posts

@FastRobby said:

Yeah, The Last of Us Remastered looked like a last-gen game, and yeah, I know there are games running at 1080p/60fps on PS4, like Resogun and Flower.

And ND already knows they can't get it to 60fps, because otherwise they wouldn't make this statement.

Last I checked, PS2 wasn't last gen, Robert.

Probably, they'd have to make some more cuts otherwise, and we all know what lemming losers are like with the CSI Paint skills.


#55  Edited By deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

@GrenadeLauncher said:

@FastRobby said:

Yeah, The Last of Us Remastered looked like a last-gen game, and yeah, I know there are games running at 1080p/60fps on PS4, like Resogun and Flower.

And ND already knows they can't get it to 60fps, because otherwise they wouldn't make this statement.

Last I checked, PS2 wasn't last gen, Robert.

Probably, they'd have to make some more cuts otherwise, and we all know what lemming losers are like with the CSI Paint skills.

Sure, Forza 5 looks like a PS2 game... real mature... And the PS4 has the CPU power of an N64, lololol, wow, how funny...

Of course they have to make cuts, it's ND... They always show you stuff that's amazing, and then the downgrading starts.


#56  Edited By StormyJoe
Member since 2011 • 7806 Posts

@GrenadeLauncher said:

@StormyJoe said:

@ttboy: Microsoft software engineers > Sony software engineers.

So, to say "Sony will have an answer" is baseless speculation.

Except they do, StormyJoke. Look up the ICE team.

Incremental improvements in software can't make up for the hardware gulf. Deal with it, for your own good.

@FastRobby said:

Lol, no they haven't. The only ones saying that are the ones that haven't used it.

Xbox One at least has games at 60fps, because gameplay trumps graphics. Forza 5 runs at 1080p/60fps; sure, they had to cut back on things like the crowd, but that doesn't matter, because racing games need 60fps and you don't see the crowd when you're racing at 100mph. Naughty Dog is already saying they probably can't get to 60fps for UC4, but yeah, the Xbox One is the one lagging behind ;)

Yes they have, Slow Robert.

Must break your heart that the PS4 also has 1080p 60fps games, and they don't need to look like PS2 ports to get there.

ND is saying if they can't get it to 60FPS without big dips and screen tearing, they'll lock it at 30fps (which is what they'll probably do). They'll continue to optimise as necessary. Reading is important.

Look, stupid,

Microsoft engineers built the .NET Framework, they built Azure; development studios ask them for help. They built Windows, SQL Server, IE, and Office. You have no idea what the hell you are talking about.


#57  Edited By super600  Moderator
Member since 2007 • 33160 Posts

You need 500 posts to create a thread on this board.


#58 Midnightshade29
Member since 2008 • 6003 Posts

@Wasdie: Problem is, will this be relegated to Windows 8+, or will Windows 7 be able to have it?

I am not upgrading to a crap OS for this. I hate Win8. I hated when they made DX10 Vista-only (or forced some DX9 games to be Vista-only), so I can see $MS pulling that stunt again... raises fist.

Also, will this require a new GPU, or will existing DX11 cards be compatible with it?


#59  Edited By Wasdie  Moderator
Member since 2003 • 53622 Posts

@Midnightshade29 said:

@Wasdie: Problem is, will this be relegated to Windows 8+, or will Windows 7 be able to have it?

I am not upgrading to a crap OS for this. I hate Win8. I hated when they made DX10 Vista-only (or forced some DX9 games to be Vista-only), so I can see $MS pulling that stunt again... raises fist.

Also, will this require a new GPU, or will existing DX11 cards be compatible with it?

Knowing Microsoft, it will go with Windows 10, since DX12 isn't slated to launch until late this year.

Microsoft needs to force people to stop using outdated software. Windows 7 is already getting old in terms of software standards (it launched in 2009), and the back end of Windows 8 is far superior to Windows 7's. With Windows 10 the back end should continue to improve while they fix up the front-end experience.

The GTX 900 series and maybe the 700 series are compatible with DX12, but nothing older. Don't quote me on that, because I'm not 100% sure. With a new API you have to expect new hardware requirements: to be a worthwhile upgrade, an API needs to fully utilize newer hardware, not just support it slightly on top of legacy code.


#60 deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

@Midnightshade29 said:

@Wasdie: Problem is, will this be relegated to Windows 8+, or will Windows 7 be able to have it?

I am not upgrading to a crap OS for this. I hate Win8. I hated when they made DX10 Vista-only (or forced some DX9 games to be Vista-only), so I can see $MS pulling that stunt again... raises fist.

Also, will this require a new GPU, or will existing DX11 cards be compatible with it?

Well, tough luck for you then; DX12 will ship with Windows 10. MAYBE it will come to Windows 8, but probably not. Also, Windows 8 is a pretty great OS once you are on the desktop.


#61  Edited By CrownKingArthur
Member since 2013 • 5262 Posts
@davillain- said:

@CrownKingArthur: Okay, when this thread is locked, feel free to repost this.

when's that gonna be?


#62  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@Wasdie said:


The GTX 900 series and maybe the 700 series are compatible with DX12, but nothing older. Don't quote me on that, because I'm not 100% sure. With a new API you have to expect new hardware requirements: to be a worthwhile upgrade, an API needs to fully utilize newer hardware, not just support it slightly on top of legacy code.

Fermi (the 400 series) and newer will be supported. Now, will they be able to use all the features and gains from DX12? Most likely not.


#63  Edited By misterpmedia
Member since 2013 • 6209 Posts

@FastRobby said:

You realise posting this practically exposes you for being a fucking idiot, right? lol

gg


#64 tormentos
Member since 2003 • 33795 Posts

@slimdogmilionar said:

@GrenadeLauncher: I think Sony's answer is already known: it's GPGPU. I'm not saying they can't take advantage of a multi-threaded CPU, but I think their vision for the PS4 was always to have the GPU do most of the work. It's been shown already that, with one CPU core alone working, the bandwidth of the PS4 drops significantly, not to mention that the Planetside 2 devs and many others have already said the PS4's CPU is the main bottleneck they're facing. I think that's the reason Cerny said balance was at 14 CUs; having the GPU do most of the work will be the best option for the PS4.

So much bullshit is not even funny...

No it hasn't. Sony talked about what developers should avoid, and idiots like you took it as gospel to downplay the PS4's bandwidth. Like the Xbox One, the PS4 has its do's and don'ts as well.

Yep they have, and if they did the game on XBO they would say the same, since it is a CPU-intensive game. Trying to make it seem like the Xbox One has a damn i7 is a joke; it is the same weak-ass CPU as the PS4's, just clocked a few hertz higher.

That second bold part? Yeah, debunked... lol

Digital Foundry: Going back to GPU compute for a moment, I wouldn't call it a rumour - it was more than that. There was a recommendation - a suggestion? - for 14 cores [GPU compute units] allocated to visuals and four to GPU compute...

Mark Cerny: That comes from a leak and is not any form of formal evangelisation.

The point is the hardware is intentionally not 100 per cent round.

It has a little bit more ALU in it than it would if you were thinking strictly about graphics.

3 simple sentences:

1. Debunking 14+4 being a rule.

2. Explaining that the hardware is not 100% round, unlike a normal GCN part.

3. It has a little more ALU in it than it would if you were thinking strictly about graphics, which is why Ubisoft's test shows such a huge gap between the two GPUs in compute.

But even if it was split 14+4, what the hell makes you think the Xbox One would be able to keep up anyway?

If the PS4 uses 400 GFLOPS for compute, that is more than the entire 8-core Xbox One CPU has to offer, and the PS4 would still have 14 CUs left, 2 more than the Xbox One.

No matter what, 14+4 or all 18, the point is the PS4 has 18 CUs and the Xbox One only 12. Anything done on 4 CUs on the PS4 needs to be countered on the Xbox One with 4 CUs, or else the game will be severely gimped, and that would leave the XBO with only 8 CUs for graphics.

I already debunked your other theory about the Xbox One pulling GPGPU from thin air, remember?
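As a sanity check on the numbers being thrown around above, here is the usual back-of-the-envelope math, assuming the commonly cited GCN figures (64 shader lanes per CU, 2 FLOPs per cycle via fused multiply-add, 800MHz PS4 / 853MHz Xbox One GPU clocks, and 8 Jaguar cores at 1.75GHz doing roughly 8 single-precision FLOPs per cycle). The function and variable names are illustrative only:

```python
# GCN compute unit: 64 shader lanes, 2 FLOPs/cycle (fused multiply-add).
def gpu_gflops(cus, clock_ghz, lanes=64, flops_per_cycle=2):
    return cus * lanes * flops_per_cycle * clock_ghz

ps4_full = gpu_gflops(18, 0.8)    # ~1843 GFLOPS (the familiar 1.84 TFLOPS)
xbo_full = gpu_gflops(12, 0.853)  # ~1310 GFLOPS (1.31 TFLOPS)
ps4_4cu = gpu_gflops(4, 0.8)      # ~410 GFLOPS -- the "400 GFLOPS" above

# 8 Jaguar cores at 1.75 GHz, ~8 single-precision FLOPs/cycle per core.
xbo_cpu = 8 * 1.75 * 8            # 112 GFLOPS

print(ps4_full, xbo_full, ps4_4cu, xbo_cpu)
```

Which is why "400 GFLOPS for compute" lines up with a 4-CU slice of the PS4's GPU, and why that slice alone exceeds the whole 8-core Jaguar CPU's theoretical throughput.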


#65 slimdogmilionar
Member since 2014 • 1345 Posts

@tormentos:

No one said anything about the 14+4 being a rule; it was just Cerny's recommended balance. Developers are free to use all 18 CUs if they want. But I think with GDDR5 it would be a better option to use a few of those CUs for GPGPU so the bandwidth is not compromised by the CPU, rather than make the CPU the bottleneck like many devs have described.

No one is trying to make the Xbox One's CPU look like an i7; it's just able to do more because of the DDR3 and the overclock. No matter what you say, a CPU like the ones found in these consoles is already weak and underpowered, so matching it to the RAM it's better suited for is a good idea. DDR3 > GDDR5 for CPU RAM.

Here's another recent article with a dev talking about the differences between the systems: "If all you're doing is rendering polygons, yes, the PS4 is a little faster at that. If all you're doing is gameplay and simulation, the Xbox One is a little faster."

Secondly, it was already documented that the PS4's bandwidth can drop to as low as 135GB/s because of the CPU.

That's pretty much where parallel processing comes into play on the XB1. It has 8 graphics contexts to process compute and rendering, and it doesn't have to wait for the next graphics command to be processed, as it will already have been done and queued, waiting to be pushed to the GPU. That's why in my parallel processing thread I said it would most likely be a feature that would not take effect until DX12; I don't really think M$ can take full advantage of this until a better multi-threading solution hits the market.
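The "already done and queued" idea above is just a producer/consumer queue: batches of commands get recorded ahead of time, so the consumer (the GPU side) never stalls waiting for the next batch to be prepared. A minimal toy sketch, with purely illustrative names (this is not any real D3D12 API):

```python
from collections import deque

class CommandQueue:
    def __init__(self):
        self.pending = deque()

    def record(self, batch):
        """CPU side: push a finished command batch onto the queue."""
        self.pending.append(batch)

    def execute_next(self):
        """GPU side: pop whatever batch is ready, or None if starved."""
        return self.pending.popleft() if self.pending else None

q = CommandQueue()
for frame in range(3):  # the CPU runs ahead, queuing three frames of work
    q.record(f"frame-{frame}-commands")

executed = [q.execute_next() for _ in range(3)]
print(executed)  # every batch was ready before the consumer asked for it
```

The point of multiple graphics contexts is the same: several producers can record in parallel, and the consumer drains whatever is ready instead of serializing on one recorder.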

Dude, I fully understand that 400 GFLOPS > 112 GFLOPS; you are trying too hard to defend the PS4. I don't see how you took this as a Sony dis. Everything I said is true, but you still have to come and white-knight for Sony; GPGPU will do nothing but help the PS4. Every time someone says something you don't like to hear, you come and try to debunk it even if it's true. That's why everyone on these forums knows you're full of BS; even cows are embarrassed by you.

You never debunked anything, because I never said anything about the XB1 pulling GPGPU from thin air. Classic tormentos. In every thread you try to debunk something, all you do is make people come and tell you how full of BS you are until you literally have no argument and eventually stop posting, and that's not just lems; it's cows, hermits, and sheep also.

Your whole response to my post was unnecessary, considering that it was all true. Do you need something to argue about so you can feel better about yourself by thinking you proved something to someone? One day you'll get to own somebody, just wait for it.

Not knowing everything is good, you know. When you actually realize you don't know everything and you're not always right, it opens the door for you to learn new things. But I guess you can't teach an old dog new tricks.


#66  Edited By tormentos
Member since 2003 • 33795 Posts

@slimdogmilionar said:

@tormentos:

No one said anything about the 14+4 being a rule; it was just Cerny's recommended balance. Developers are free to use all 18 CUs if they want. But I think with GDDR5 it would be a better option to use a few of those CUs for GPGPU so the bandwidth is not compromised by the CPU, rather than make the CPU the bottleneck like many devs have described.

No one is trying to make the Xbox One's CPU look like an i7; it's just able to do more because of the DDR3 and the overclock. No matter what you say, a CPU like the ones found in these consoles is already weak and underpowered, so matching it to the RAM it's better suited for is a good idea. DDR3 > GDDR5 for CPU RAM.

Here's another recent article with a dev talking about the differences between the systems: "If all you're doing is rendering polygons, yes, the PS4 is a little faster at that. If all you're doing is gameplay and simulation, the Xbox One is a little faster."

Secondly, it was already documented that the PS4's bandwidth can drop to as low as 135GB/s because of the CPU.

That's pretty much where parallel processing comes into play on the XB1. It has 8 graphics contexts to process compute and rendering, and it doesn't have to wait for the next graphics command to be processed, as it will already have been done and queued, waiting to be pushed to the GPU. That's why in my parallel processing thread I said it would most likely be a feature that would not take effect until DX12; I don't really think M$ can take full advantage of this until a better multi-threading solution hits the market.

Dude, I fully understand that 400 GFLOPS > 112 GFLOPS; you are trying too hard to defend the PS4. I don't see how you took this as a Sony dis. Everything I said is true, but you still have to come and white-knight for Sony; GPGPU will do nothing but help the PS4. Every time someone says something you don't like to hear, you come and try to debunk it even if it's true. That's why everyone on these forums knows you're full of BS; even cows are embarrassed by you.

You never debunked anything, because I never said anything about the XB1 pulling GPGPU from thin air. Classic tormentos. In every thread you try to debunk something, all you do is make people come and tell you how full of BS you are until you literally have no argument and eventually stop posting, and that's not just lems; it's cows, hermits, and sheep also.

Your whole response to my post was unnecessary, considering that it was all true. Do you need something to argue about so you can feel better about yourself by thinking you proved something to someone? One day you'll get to own somebody, just wait for it.

Not knowing everything is good, you know. When you actually realize you don't know everything and you're not always right, it opens the door for you to learn new things. But I guess you can't teach an old dog new tricks.

Oh, for god's sake, STFU and stop inventing crap.

You just fu**ing quoted a developer freaking kissing MS's ass. "Simulation and gameplay are faster on Xbox One"? What the fu** does that even mean? How can you speed up gameplay with a different GPU? If your gameplay sucks because the game is bad, you are stuck with it no matter how powerful your GPU is.

The PS4 is faster because, well, it is stronger.

it's been shown already that working with one CPU core alone the bandwidth of the PS4 drops significantly

Now I want a link to this ^^ and please don't link me to that 2013 GDC slide, because that says nothing; that was a chart made by Sony telling developers how to avoid pitfalls. Not only does the PS4 have 176GB/s of bandwidth shared with the GPU, it has another 20GB/s bus (10GB/s each way) that bypasses that 176GB/s one. So don't tell me that one CPU core alone drops it significantly; that is total bullsh*t pulled from your buns to make the Xbox One look better.

Yes, that is what you insinuated when you claimed that the Xbox One could also do GPGPU compute without touching any of the 12 CUs it has. You have a freaking bad short-term memory; I basically ridiculed you for it.

No it wasn't...

it's been shown already that working with one CPU core alone the bandwidth of the PS4 drops significantly

This is false ^^.

I think that's the reason Cerny said balance was at 14 cu's

This ^^ is false.

having the GPU do most of the work will be the best option for the PS4.

This ^^ is just pure stupidity, considering the Xbox One's CPU is 9% faster than the PS4's, which still reserves 2 full cores for the OS, and which I am sure will loosen as time moves on, just like the PS3's memory footprint was reduced from 125MB to 46MB...

Sony always delivers improvements, and they have a team called ICE whose whole job is exactly that. I remember how the PS3 started and how it ended up owning the Xbox 360; this time they started out better, and as the years pass Sony will push graphics way past the Xbox One. They already are with The Order and DC.

The Xbox One has 2 ACEs and 16 command queues, as already stated by the Xbox One's architects, which I already quoted... lol

Again, so you have in perspective how huge the gap really is, and how even a small part of the GPU helping the CPU will beat the Xbox One's CPU silly.


#67  Edited By ShadowDeathX
Member since 2006 • 11699 Posts

@ttboy said:

Most games, even today, are still written so that only 1 core is dedicated to interacting with the GPU.

*cough*

Assassin's Creed.

And doing it poorly at that.


#68 slimdogmilionar
Member since 2014 • 1345 Posts

@tormentos:

BOOOOOOO, you are starting to bore me. Even when I say something good about the PS4, you take it to heart. You are not proving anything besides your lack of reading comprehension and your ignorance at this point. I never invent anything; you just don't like to accept the truth.

Please explain to me how having the PS4's GPU do most of the work is not good. The GPU makes better use of the GDDR5 than the CPU will, and by doing so the bandwidth is not compromised.

I said it once and I'll say it again: Cerny said the "recommended" balance is 14; no one said it was definitive.

My point was: the PS4's GPU is faster, the Xbox One's CPU is faster. And those Onion and Garlic buses you keep touting on the PS4? The XB1 has those also, FYI.

Yep, the dev was "kissing M$'s ass" because he said the PS4's GPU was more powerful than the XB1's, but that the Xbox One's CPU was better than the PS4's. That sounds like a pretty unbiased and true statement coming from a developer making a game across 3 platforms.

Yeah, Sony delivers improvements, but the fact remains Cerny is no dummy. He had to know that with GDDR5 the weak CPU would struggle, which is why Sony built the PS4 with GPGPU in mind to help free up the CPU.

Please show me where I said the XB1 could do GPGPU without using the GPU. You are just lying now to make yourself look good; how can you do GPGPU without a GPU?

Dude, I think you are butthurt because of the improvements and good news coming from M$ lately. How come you aren't in the threads about Uncharted 4 not being 60fps, like ND said they were aiming for with all their games this gen?

You tout all the improvements the ICE team can make, but it's still not happening as fast as the improvements on the XB1. Xbox went from 720p to 1080p, got a 45% reduction in RAM used, and the OS can run fully functional off of one core. Come back when the PS4 can do all of that: play online while having Netflix, Hulu, or Amazon Prime snapped, or leave a game for days while constantly using the PS4 for TV and internet browsing, then suddenly decide to start playing again and pick up right where you left off 4 or 5 days later, all on one core.

The slide comes from a technical presentation of the Sony PS4 system and as such, it must be treated with the utmost caution. However, from what I can see, if there is one thing you cannot deny, it's that the effective/usable/actual (whatever you want to call it) bandwidth available to the Pitcairn GPU of the PS4 is 120GB/s to 140GB/s. By this I mean the actual bandwidth available to developers to play with...

So option 1 is that the graph is incorrect or skewed due to an unknown factor; however, because there are actually 4 samples present, I don't think that's the case. The second option is that the memory clock is wrong, and the last option is the bus being wrong. Needless to say, the bus being wrong is quite improbable. Reversing the previously used equation we get [(140x1000x8)/256 = 4375MHz] and [(120x1000x8)/256 = 3750MHz]. This is now beginning to make some semblance of logic. The memory appears to be downclocking as the CPU usage increases, and this is where the "disproportionate" part comes into play. Theoretically, as the APU varies between CPU-intensive and GPU-intensive tasks, the bandwidth should remain the same, 140 in this case. But it doesn't. The total bandwidth of the system actually drops to as much as 120GB/s, and more if the trend continues. If you take a look at the graph, you will notice that the CPU bandwidth is actually null in the bar where the GPU bandwidth is highest. Since this is a realistic impossibility, even with very low usage you are looking at about 135GB/s of bandwidth at max. This is a very interesting development and is probably the result of the fact that at the heart of the PS4 is an APU.

Here's another link if you don't believe that one: http://techgadgetnews.com/2014/09/21/sony-ps4-effective-gpu-bandwidth-is-140-gbs-not-176-gbs-disproportionate-cpu-and-gpu-scaling/

176 is the theoretical bandwidth, just like 218 is the theoretical bandwidth for the ESRAM.
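The arithmetic in the quote above is just "bandwidth = effective memory clock x bus width". A quick check of those figures, assuming the commonly cited 256-bit bus and 5500MHz effective GDDR5 clock (function names are illustrative only):

```python
# GDDR5 bandwidth in GB/s = effective clock (MHz) x bus width (bits) / 8 / 1000.
def bandwidth_gbs(clock_mhz, bus_bits=256):
    return clock_mhz * bus_bits / 8 / 1000

print(bandwidth_gbs(5500))  # 176.0 -- the theoretical peak
print(bandwidth_gbs(4375))  # 140.0 -- the article's observed ceiling
print(bandwidth_gbs(3750))  # 120.0 -- under heavy CPU contention

# Reversing, as the quote does: what effective clock would a measured
# bandwidth correspond to?
def effective_clock_mhz(gbs, bus_bits=256):
    return gbs * 1000 * 8 / bus_bits

print(effective_clock_mhz(140))  # 4375.0, matching [(140x1000x8)/256]
```

So the quote's [(140x1000x8)/256 = 4375MHz] step is the same formula run backwards from the measured bandwidth.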

And regarding parallel processing, this is what my article said about Sony:

It must be noted that we are not sure whether the PlayStation 4 follows a similar implementation as the Xbox One. It will be rather intriguing to know how the PS4 reduces latency during CPU-GPU exchange.


#69 psx_warrior
Member since 2006 • 1757 Posts

Wish people would just give this a rest. PCs have always outclassed consoles not long after launch. This is just getting old. Glad I could post before the thread lock :)


#70 deactivated-5d6bb9cb2ee20
Member since 2006 • 82724 Posts

The fact that this thread survived for over a day is sort of sad, but I'm locking it now. Not only do you not have the 500 posts needed to make a thread, your entire OP is cherry-picked and misleading as well. Very poor form.