Will DirectX 12 Help the Xbox 1?


#201 scatteh316
Member since 2004 • 10273 Posts

@Crypt_mx: Developers would optimise as much as possible without DX12 anyway.

That's just how consoles work.

Most developers that have their own engines and development tools won't even use DX12.


#202  Edited By ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:

@ronvalencia said:

@scatteh316 said:

This thread is full of fail and full of idiots who have no grasp of hardware and software.

Apply it to yourself.

I'm talking about you.... 90% of the fail in this thread is from you.... You are literally the most clueless person I know when it comes to hardware and software.

That's all you can do? I expected more from you.

You have NOT addressed Microsoft's fill-rate calculations vs memory bandwidth for 16 ROPs at 853MHz.

Take the 7950 as an example: 32 ROPs at 240 GB/s works out to 7.5 GB/s per ROP. PS4's raw 176 GB/s of memory bandwidth divided by 7.5 GB/s would support about 24 ROPs. X1's 16 ROPs at 853MHz are roughly equal to ~17 ROPs at 800MHz.

With Battlefield 4's Mantle or DirectX 11.1 versions, my old 7950 (from an 800MHz underclock to a 950MHz overclock) has already been shown to be superior to the PS4.

Eurogamer already stated X1's 16 ROPs at 853MHz are not a big issue for 1920x1080p.

Fill rate is a function of both ROPs and memory operations, i.e. a ROP count without memory bandwidth behind it is nothing. Low-level APIs can't manufacture additional memory bandwidth.
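
To make that bandwidth-per-ROP argument concrete, here is a minimal back-of-the-envelope sketch using only the figures quoted above; the 7.5 GB/s-per-ROP baseline is taken from the 7950 example in this post and is an illustrative assumption, not an official figure.

```python
# Bandwidth-per-ROP sketch using the figures quoted above.
# The 7.5 GB/s-per-ROP baseline is an assumption taken from the 7950 example.

GBPS_PER_ROP = 240.0 / 32                 # 7950: 240 GB/s across 32 ROPs = 7.5 GB/s per ROP

ps4_bandwidth_backed_rops = 176.0 / GBPS_PER_ROP   # ~23.5 ROPs' worth of bandwidth
x1_rop_equiv_at_800mhz = 16 * 853.0 / 800.0        # ~17.1 ROP-equivalents at 800 MHz

print(f"PS4 @ 176 GB/s: ~{ps4_bandwidth_backed_rops:.1f} bandwidth-backed ROPs")
print(f"X1 16 ROPs @ 853 MHz: ~{x1_rop_equiv_at_800mhz:.1f} ROP-equivalents at 800 MHz")
```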

Rebellion already explained why most X1 games are less than 1920x1080p resolution, i.e. it's not the 16 ROP count. Rebellion > you.

You have nothing.


#203 scatteh316
Member since 2004 • 10273 Posts

@ronvalencia said:

@scatteh316 said:

@ronvalencia said:

@scatteh316 said:

This thread is full of fail and full of idiots who have no grasp of hardware and software.

Apply it to yourself.

I'm talking about you.... 90% of the fail in this thread is from you.... You are literally the most clueless person I know when it comes to hardware and software.

That's all you can do? I expected more from you. AMD already stated PC's pure pixel pushing is already comparable to console efficiency.

Oh look, more rubbish... PC's pure pixel pushing is leaps and bounds above any of the consoles.


#204 ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:

@ronvalencia said:

@scatteh316 said:

@ronvalencia said:

@scatteh316 said:

This thread is full of fail and full of idiots who have no grasp of hardware and software.

Apply it to yourself.

I'm talking about you.... 90% of the fail in this thread is from you.... You are literally the most clueless person I know when it comes to hardware and software.

That's all you can do? I expected more from you. AMD already stated PC's pure pixel pushing is already comparable to console efficiency.

Oh look, more rubbish... PC's pure pixel pushing is leaps and bounds above any of the consoles.

Again, you're not addressing the issue at hand.

"Don't be silly... an API will not magically give Xbone the fill rate required for 1080p."

Rebellion already explained why most X1 games are less than 1920x1080p resolution: it's not the 16 ROP count, it's that the 32MB ESRAM is too small for 1920x1080p and needs "tiling tricks".

Rebellion > you.

I use the 7950 to show that PS4's 32 ROPs are not a real 32 ROPs (which would need the backing of higher memory bandwidth). My old 7950 doesn't have major issues with 1920x1080p even with its memory bandwidth underclocked to 127 GB/s.


#205  Edited By scatteh316
Member since 2004 • 10273 Posts

And here you go again doing direct comparisons with PC hardware.

Maybe if I say it again it might sink in...

YOU CAN NOT COMPARE PC AND CONSOLE HARDWARE LIKE THAT.

And Xbone doesn't have the fill rate for 1080p... Any idiot and his dog can see that.


#206  Edited By ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:

And here you go again doing direct comparisons with PC hardware.

Maybe if I say it again it might sink in...

YOU CAN NOT COMPARE PC AND CONSOLE HARDWARE LIKE THAT.

And Xbone doesn't have the fill rate for 1080p... Any idiot and his dog can see that.

1. GCN-based consoles use PC hardware.

2. AMD Mantle and DirectX 12 are comparable to Xbox One's DirectX 11.X.

3. Battlefield 4 runs on PC DirectX 11, Mantle, Xbox One and PS4.

4. Rebellion > you. Rebellion already explained Xbox One's issues with 1920x1080p, i.e. the 32 MB ESRAM is too small for 1920x1080p and it's not the 16 ROPs.

5. I have underclocked the memory on my 7950 and R9-290 to 127 GB/s, and 1920x1080p is not a major issue. You could have 32 ROPs with 240 GB/s of memory bandwidth, but if that 240 GB/s is attached to only a 32 MB VRAM pool, it's gimped. It will need the "tiling tricks" to make better use of it, as sketched below.
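
As a rough illustration of point 5, here is a minimal sketch of how quickly 1920x1080 render targets outgrow a 32 MB pool. The buffer layout (an HDR colour target plus a couple of G-buffer targets and depth) is an assumed example, not taken from any particular Xbox One title.

```python
# How quickly 1920x1080 render targets outgrow 32 MB of ESRAM.
# The target layout below is an assumed example, not from a specific game.

WIDTH, HEIGHT = 1920, 1080
MB = 1024 * 1024

targets_bpp = {
    "HDR colour (RGBA16F)":   8,   # bytes per pixel
    "normals (RGBA8)":        4,
    "albedo (RGBA8)":         4,
    "depth/stencil (D24S8)":  4,
}

total = 0
for name, bpp in targets_bpp.items():
    size = WIDTH * HEIGHT * bpp
    total += size
    print(f"{name:24s} {size / MB:5.1f} MB")
print(f"{'total':24s} {total / MB:5.1f} MB vs 32 MB of ESRAM")
```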


#207 nicecall
Member since 2013 • 528 Posts

How would DX12 work on Xbox One? Someone linked me a site that showed DX12 won't even fully work on Nvidia's next-gen 8xx line of cards, and that it'll be 2015 before hardware is fully equipped with DX12 capability.

How is a system that can barely do 720p going to magically turn into a 1080p DX12 machine?


#209 scatteh316
Member since 2004 • 10273 Posts

@ronvalencia said:

@scatteh316 said:

And here you go again doing direct comparisons with PC hardware.

Maybe if I say it again it might sink in...

YOU CAN NOT COMPARE PC AND CONSOLE HARDWARE LIKE THAT.

And Xbone doesn't have the fill rate for 1080p... Any idiot and his dog can see that.

1. GCN based consoles uses PC hardware.

2. AMD Mantle and DirectX 12 are comparable to Xbox One's DirectX 11.X.

3. Battlefield 4 runs on PC DirectX 11, Mantle, Xbox One and PS4.

4. Rebellion > you. Rebellion already explained Xbox one's issues with 1920x1080p i.e. 32 MB ESRAM is too small for 1920x1080p and it's not the 16 ROPS.

5. I have memory underclock my 7950 and R9-290 to 127 GB/s and 1920x1080p is not a major issue. You could have 32 ROPS at 240 Gb/s memory bandwidth, but if the 240 GB/s bandwidth is 32 MB VRAM size, it's gimped. It will need the "tiling tricks" to make better use of it.


#210 ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:

@ronvalencia said:

@scatteh316 said:

And here you go again doing direct comparisons with PC hardware.

Maybe if I say it again it might sink in...

YOU CAN NOT COMPARE PC AND CONSOLE HARDWARE LIKE THAT.

And Xbone doesn't have the fill rate for 1080p... Any idiot and his dog can see that.

1. GCN based consoles uses PC hardware.

2. AMD Mantle and DirectX 12 are comparable to Xbox One's DirectX 11.X.

3. Battlefield 4 runs on PC DirectX 11, Mantle, Xbox One and PS4.

4. Rebellion > you. Rebellion already explained Xbox one's issues with 1920x1080p i.e. 32 MB ESRAM is too small for 1920x1080p and it's not the 16 ROPS.

5. I have memory underclock my 7950 and R9-290 to 127 GB/s and 1920x1080p is not a major issue. You could have 32 ROPS at 240 Gb/s memory bandwidth, but if the 240 GB/s bandwidth is 32 MB VRAM size, it's gimped. It will need the "tiling tricks" to make better use of it.

Your concession is accepted.


#211 scatteh316
Member since 2004 • 10273 Posts

@ronvalencia said:

@scatteh316 said:

@ronvalencia said:

@scatteh316 said:

And here you go again doing direct comparisons with PC hardware.

Maybe if I say it again it might sink in...

YOU CAN NOT COMPARE PC AND CONSOLE HARDWARE LIKE THAT.

And Xbone doesn't have the fill rate for 1080p... Any idiot and his dog can see that.

1. GCN based consoles uses PC hardware.

2. AMD Mantle and DirectX 12 are comparable to Xbox One's DirectX 11.X.

3. Battlefield 4 runs on PC DirectX 11, Mantle, Xbox One and PS4.

4. Rebellion > you. Rebellion already explained Xbox one's issues with 1920x1080p i.e. 32 MB ESRAM is too small for 1920x1080p and it's not the 16 ROPS.

5. I have memory underclock my 7950 and R9-290 to 127 GB/s and 1920x1080p is not a major issue. You could have 32 ROPS at 240 Gb/s memory bandwidth, but if the 240 GB/s bandwidth is 32 MB VRAM size, it's gimped. It will need the "tiling tricks" to make better use of it.

Your concession is accepted.

No concession. If you want to repeat the same shit over and over again like a parrot, then I will brand you as such.

But..but....I ran my PC GPUs at lower clocks and ran tests and did not factor in Windows sucking up resources or differences in driver efficiency.......

But but the Xbone can do 1080p perfectly fine because a low-rate developer said it can......

But but it's because developers aren't using tiled resources to make the frame buffers fit into ESRAM......

But but I'm too retarded to figure out that frame buffers don't have to be stored in ESRAM so that's no excuse......

But but scatteh316 talks with developers from high budget studios on Beyond3D forums who confirm all what scatteh316 says........

But but if I keep on repeating the same shit someone will believe me.....

Seriously, you're a moron.....


#212  Edited By ronvalencia
Member since 2008 • 29612 Posts

@nicecall said:

how would dx12 work on xbox one? someone linked me a site that showed dx12 won't even fully work on nvidias next gen 8xx line of cards, and that it'll be 2015 before hardware will be fully equip with dx12 capability.

How is a system that can barely do 720p going to magically turn into a 1080p dx12 machine?

Read http://www.gamespot.com/profile/blog/nvidia-tamasi-s-directx-12-vs-amd-s-full-directx-1/26055511/ ; it's my blog on DirectX 12, Intel and Techreport's sloppiness. One of Techreport's "new DX12 APIs" is not new, i.e. it's already available on Intel's Haswell IGP.

It's a combination of driver/API runtime/middleware (i.e. "new SDK") and programming approach improvements.

Read http://gamingbolt.com/xbox-ones-esram-too-small-to-output-games-at-1080p-but-will-catch-up-to-ps4-rebellion-games

Rebellion explains why most Xbox One games are less than 1920x1080p. The closest example of a near-full-potential 12 CU (1.32 TFLOPS) GCN is the prototype 7850 with 153.6 GB/s of 2GB GDDR5. Read http://www.tomshardware.com/reviews/768-shader-pitcairn-review,3196.html for what a 1.32 TFLOPS GCN is capable of when it's not gimped by memory issues.

It covers the following points:

  • Improved programming approach, e.g. tile, tile... tile focus. For modern workloads, ESRAM is too small for 1920x1080p.
  • New SDK, i.e. improved driver/API runtime/middleware.
  • The "tiling tricks" are an attempt to work around Xbox One's very small/very fast 32 MB ESRAM, i.e. to fake a large/fast GDDR5-type memory via programming complexity (a sketch follows below).
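
A toy illustration of that tiling idea, assuming a combined G-buffer cost of roughly 20 bytes per pixel (an illustrative figure, not a measurement): split the frame into horizontal slices so that one slice at a time fits inside the 32 MB ESRAM.

```python
# Toy model of slice/tile-based rendering against a small, fast memory pool.
# The 20 bytes-per-pixel G-buffer cost is an illustrative assumption.

import math

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 20
ESRAM_BYTES = 32 * 1024 * 1024

bytes_per_row = WIDTH * BYTES_PER_PIXEL
rows_per_slice = ESRAM_BYTES // bytes_per_row      # full rows that fit in ESRAM at once
slices_per_frame = math.ceil(HEIGHT / rows_per_slice)

print(f"{rows_per_slice} rows fit per slice -> {slices_per_frame} slices (passes) per frame")
```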

PS:

The prototype 7850 is still inferior to the retail Radeon HD 7850 and R7-265 (the closest to PS4).

The prototype 7850 is superior to the Radeon HD 7770.

R7-265 is superior to the retail Radeon HD 7850 (1.76 TFLOPS).

Increased program complexity has its own overheads, i.e. more instruction issue slots being consumed by the extra management code. PS4 already has a large/fast GDDR5 memory and it doesn't need Xbox One's complex tricks.

Sony's PS4 and AMD Mantle don't need DirectX 12, i.e. Sony/AMD already did the hard work of designing their APIs from the ground up: dump the existing APIs and design new ones from scratch.

NVIDIA and their fanboys (e.g. Techreport's article writer) should leave non-NVIDIA hardware alone since NVIDIA has zero authority on the matter, e.g. Intel shows the world how it gimps its competitors.

--------------------------

For Conservative Rasterization, Read http://www.google.com/patents/WO2013101167A1?cl=en

Intel's "Five-dimensional rasterization with conservative bounds" patent

"We present a method for computing conservative depth over a motion blurred triangle over a tile of pixels. This is useful for occlusion culling (zmin/zma¾-culling) and conservative rasterization, as well as a number of other techniques, such as dynamic collision detection on the graphics processor, and caustics rendering"

Another Intel win.


#213  Edited By steamistrash
Member since 2014 • 431 Posts

DirectX 12 won't even help Windows, much less Xbox One.

Which actual, guaranteed (legendary) game do you know of that is going to be released for DirectX 12?

NONE!!!!!!!! EXACTLY!


#214 steamistrash
Member since 2014 • 431 Posts

There were no truly badass DX10 or DX11 FPS games, much less DX12.

The last truly decent FPS games were from the DX9 era on PC.


#215 ronvalencia
Member since 2008 • 29612 Posts

@steamistrash:

Battlefield 4 on PC has a DirectX 11.1 code path.


#216 btk2k2
Member since 2003 • 440 Posts

No, DX12 will not really help the Xbox One. MS already has a low-level API for use on the Xbox One which is likely to be similar to Mantle and the PS4's low-level API. DX12 is about bringing those benefits to the PC in a vendor-agnostic format, as there is no way Nvidia will get on board with Mantle; they are just too proud a company to admit that a competitor got one over on them. Intel might have done it, as they did with using AMD's 64-bit x86 extensions, but with DX12 on the horizon they probably will not unless Mantle really takes off.

Ron, I agree with scatteh316, you are a moron. I kept telling you what the performance difference would be. I even gave the benefit of the doubt to the Xbox One with regards to ESRAM, and the upper end of my estimate (45%) is about on par with the actual performance difference between the consoles in the majority of multiplats shown so far.

I do not see this gap closing at all, though, and once HSA and GPGPU compute become a more integral part of engine/game design I can see the gap widening slightly.


#217  Edited By ronvalencia
Member since 2008 • 29612 Posts

@btk2k2 said:

No, DX 12 will not really help the Xbox One. MS already have a low level API for use on the Xbox One which is likely to be similar to Mantle and the PS4 low level API. DX12 is about bringing those benefits to the PC in a vendor agnostic format as there is no way Nvidia will get on board with Mantle, they are just too proud a company to admit that the competitor got one over on them. Intel might have done it as they did with using AMDs 64bit X86 extensions but with DX12 on the horizon they probably will not unless Mantle really takes off.

Ron, I agree with scatteh316, you are a moron. I kept telling you what the performance difference would be, I even gave the benefit of the doubt to the the Xbox One with regards to ESRAM and the upper end of my estimate (45%) is about on par with the actual performance difference. between the consoles in the majority of multiplats shown so far.

I do not see this gap closing at all though and once HSA and GPGPU compute becomes a more integral part to engine/game design I can see the gap widening slightly.

1. Xbox One's DirectX 11.X is missing a few features from DirectX 12, e.g.

  • Resource binding model makes render setup nearly free
  • Pipeline state objects do the same for shaders

These will also be available on Xbox One.

2. 16 ROPs at 853MHz weren't the major issue for 1920x1080p. A machine could have a fill rate of ~150 GB/s, but if that fill rate is limited to 32 MB of memory then it would be gimped (without tiling). Rebellion > you. The moron is you. If you want to turn System Wars into a personality war, so be it. I have stated tiling would be important for Xbox One, and Microsoft knows this issue, hence the focus on tiled resources. I have stated AMD PRT/Tiled Resources have a larger gain on X1 since it starts from a lower baseline, i.e. the 68 GB/s main memory limitation.


#218 btk2k2
Member since 2003 • 440 Posts

@ronvalencia said:

@btk2k2 said:

No, DX 12 will not really help the Xbox One. MS already have a low level API for use on the Xbox One which is likely to be similar to Mantle and the PS4 low level API. DX12 is about bringing those benefits to the PC in a vendor agnostic format as there is no way Nvidia will get on board with Mantle, they are just too proud a company to admit that the competitor got one over on them. Intel might have done it as they did with using AMDs 64bit X86 extensions but with DX12 on the horizon they probably will not unless Mantle really takes off.

Ron, I agree with scatteh316, you are a moron. I kept telling you what the performance difference would be, I even gave the benefit of the doubt to the the Xbox One with regards to ESRAM and the upper end of my estimate (45%) is about on par with the actual performance difference. between the consoles in the majority of multiplats shown so far.

I do not see this gap closing at all though and once HSA and GPGPU compute becomes a more integral part to engine/game design I can see the gap widening slightly.

1. Xbox One's DirectX 11.X a few features missing from DirectX 12 e.g.

  • Resource binding model makes render setup nearly free
  • Pipeline state objects do the same shaders

Theses will also be available on Xbox One.

2. 16 ROPS at 853Mhz wasn't the major issue for 1920x1080p. A machine could have a fill rate of ~150 GB/s, but if the that fill rate is limited to 32 GB of memory then it would be gimped (without tiling). Rebellion > You. The moron is you. IF you want to convert system wars into personality war so be it. I have stated tiling would be important for Xbox One and Microsoft knows this issue hence the focus on tiled resource. I have stated AMD PRT/Tiled Resource has a larger gain on X1 since it's starting from lower baseline i.e. 68 GB/s main memory limitation.

1) Jonah has heavily implied that the low level API of DX12 is basically a multi vendor version of Mantle. He has also stated that Mantle and the PS4 low level API are very similar. Given those two statements it stands to reason that Mantle is already capable of the above features and I would expect the PS4 API to be capable as well. It would surprise me greatly if the Xbox One did not have those features as part of its low level API.

2) It depends on the game, as always. In the general case, though, the fill rate with 16 ROPs at 853 MHz is too low for 1080p with modern games. I am sure some games will achieve it, as we do have 1080p games on the Xbox One that run at a stable frame rate, but we are seeing a lot more choose 900p or 720p instead to achieve the balance of frame rate, on-screen effects and resolution. Some of that is down to the ESRAM and some of that is down to the fill rate.
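
For reference, a quick sketch of the raw pixel-rate arithmetic both sides keep invoking; the overdraw factors are assumptions for illustration, and this deliberately ignores bandwidth and blending costs, which is where the actual disagreement lies.

```python
# Theoretical pixel throughput of 16 ROPs at 853 MHz vs. what 1080p60 consumes.
# Overdraw factors are illustrative assumptions; bandwidth limits are ignored here.

peak_pixels_per_sec = 16 * 853e6            # ~13.6 Gpixels/s
pixels_1080p60 = 1920 * 1080 * 60           # ~124 Mpixels/s for a single layer

for overdraw in (1, 4, 8):
    headroom = peak_pixels_per_sec / (pixels_1080p60 * overdraw)
    print(f"overdraw x{overdraw}: {headroom:.0f}x theoretical headroom")
```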


#219 Tighaman
Member since 2006 • 1038 Posts

DX12 is eliminating the need for z-buffers, and there's plenty more stuff to come. It will help the X1. Just sit back again, grab your popcorn, because this year is gonna be a beautiful one (beta test in the future).


#220 TheShensolidus
Member since 2013 • 224 Posts

I love it when people bring up 'tiled resourcing' as if streaming textures through the 32 MB pool is a free thing we can just throw in, and as if there aren't any other critical things a developer would want that space for, like your framebuffer or something. Nah, let's just use the majority of our high-speed memory bandwidth for textures, when we could simply load the textures in all at once behind a longer load screen...

Kinda how and why a few X1 games have longer load times than their PS4 counterparts? Maybe?

Like I've said, DX12 won't be fully utilized by the X1, but there are multiple optimizations and features from DX12 proper that the DirectX-like API in the X1 will benefit from.


#221  Edited By ronvalencia
Member since 2008 • 29612 Posts

@btk2k2 said:

@ronvalencia said:

@btk2k2 said:

No, DX 12 will not really help the Xbox One. MS already have a low level API for use on the Xbox One which is likely to be similar to Mantle and the PS4 low level API. DX12 is about bringing those benefits to the PC in a vendor agnostic format as there is no way Nvidia will get on board with Mantle, they are just too proud a company to admit that the competitor got one over on them. Intel might have done it as they did with using AMDs 64bit X86 extensions but with DX12 on the horizon they probably will not unless Mantle really takes off.

Ron, I agree with scatteh316, you are a moron. I kept telling you what the performance difference would be, I even gave the benefit of the doubt to the the Xbox One with regards to ESRAM and the upper end of my estimate (45%) is about on par with the actual performance difference. between the consoles in the majority of multiplats shown so far.

I do not see this gap closing at all though and once HSA and GPGPU compute becomes a more integral part to engine/game design I can see the gap widening slightly.

1. Xbox One's DirectX 11.X a few features missing from DirectX 12 e.g.

  • Resource binding model makes render setup nearly free
  • Pipeline state objects do the same shaders

Theses will also be available on Xbox One.

2. 16 ROPS at 853Mhz wasn't the major issue for 1920x1080p. A machine could have a fill rate of ~150 GB/s, but if the that fill rate is limited to 32 GB of memory then it would be gimped (without tiling). Rebellion > You. The moron is you. IF you want to convert system wars into personality war so be it. I have stated tiling would be important for Xbox One and Microsoft knows this issue hence the focus on tiled resource. I have stated AMD PRT/Tiled Resource has a larger gain on X1 since it's starting from lower baseline i.e. 68 GB/s main memory limitation.

1) Jonah has heavily implied that the low level API of DX12 is basically a multi vendor version of Mantle. He has also stated that Mantle and the PS4 low level API are very similar. Given those two statements it stands to reason that Mantle is already capable of the above features and I would expect the PS4 API to be capable as well. It would surprise me greatly if the Xbox One did not have those features as part of its low level API.

2) It depends on the game, as always. In the general case though the fill rate with 16 ROPS at 853 Mhz is too low for 1080p with modern games. I am sure some games will achieve it as we do have 1080p games on the Xbox One that can run at a stable frame rate but we are seeing a lot more choose 900p or 720p instead to achieve the balance of frame rate, on screen effects and resolution. Some of that is down to the ESRAM and some of that is down to the fill rate.

1. PS4 doesn't have any major API issues. PS4 didn't limit itself to the OpenGL or Direct3D standards, but that doesn't solve the memory bandwidth issues behind The Order's 1920x800 with 4X MSAA, i.e. the 32 ROPs are capable of more, but they're bound by memory bandwidth (as stated by the devs).

AMD's fastest 256-bit memory solution is the 7870 XT's 192 GB/s.

2. Microsoft has already shown that their 16 ROPs at 853MHz (= 17 ROPs at 800MHz) can saturate their 150 GB/s memory bandwidth. Eurogamer already stated X1's 16 ROPs are not a major issue for 1920x1080p. The 7950's 32 ROPs deliver better results than the PS4's.
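
A rough sketch of why 4X MSAA at 1920x800 leans so hard on memory bandwidth: the buffer formats below are assumptions for illustration, and hardware colour/Z compression (which reduces real traffic) is ignored.

```python
# Approximate size of 1920x800 4X MSAA colour + depth targets.
# Formats are assumed (RGBA8 + D24S8); hardware compression is ignored.

WIDTH, HEIGHT, SAMPLES = 1920, 800, 4
MB = 1024 * 1024

colour_bytes = WIDTH * HEIGHT * SAMPLES * 4   # 4 bytes per colour sample
depth_bytes = WIDTH * HEIGHT * SAMPLES * 4    # 4 bytes per depth/stencil sample

print(f"4X MSAA colour: {colour_bytes / MB:.1f} MB, depth: {depth_bytes / MB:.1f} MB, "
      f"total: {(colour_bytes + depth_bytes) / MB:.1f} MB")
print("Each of those buffers is written (and later resolved/read) every frame, "
      "so the traffic multiplies quickly.")
```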


#222 blackace
Member since 2002 • 23576 Posts

I wouldn't even waste my breath. Microsoft will let the games do the talking at the E3.

Greatness is coming!!


#223 Martin_G_N
Member since 2006 • 2124 Posts

Yes, it will help it, just like new drivers and tools will help the PS4 through the gen. But don't expect multiplats to suddenly look better on the X1, because the difference in specs is still the same.

Why can't people just realize that MS was cheap with the X1's hardware because of Kinect and it being a media box? They wanted 8GB of unified RAM, and at the time DDR3 was the only option. Sony wanted speed and went for 4GB of GDDR5, and because of price drops later on they went for 8GB instead.

But what I can't understand is why MS went through all this trouble to get the bandwidth back up, instead of just giving the GPU its own 2GB or more of GDDR5 for VRAM, just like a PC. It would have taken less space on the board, and they could have fitted a bigger GPU. The ESRAM takes up too much space on the board, and it's too small to be useful enough.
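
For context, the bandwidth arithmetic behind that DDR3-vs-GDDR5 trade-off, using the widely reported bus widths and data rates (treated here as assumptions): peak bandwidth is just bus width in bytes times the effective transfer rate.

```python
# Peak bandwidth = bus width (bytes) x effective transfer rate.
# Bus widths and data rates are the widely reported figures, used as assumptions.

def peak_bandwidth_gbps(bus_bits, megatransfers_per_sec):
    return (bus_bits / 8) * megatransfers_per_sec * 1e6 / 1e9

print(f"256-bit DDR3-2133 (X1 main RAM): {peak_bandwidth_gbps(256, 2133):.1f} GB/s")
print(f"256-bit GDDR5-5500 (PS4):        {peak_bandwidth_gbps(256, 5500):.1f} GB/s")
```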


#224 ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:

@Crypt_mx: Developers would optimise as much as possible without DX12 anyway.

That's just how consoles work.

Most developers that have they're own engines and development tools won't even use DX12.

Xbox One's optimisations are bound by runtime limitations, e.g.

1. Xbox One's DirectX 11.X is missing a few features from DirectX 12, e.g.

  • Resource binding model makes render setup nearly free.
  • Pipeline state objects do the same for shaders.


#225  Edited By ronvalencia
Member since 2008 • 29612 Posts
@scatteh316 said:

@ronvalencia said:

@scatteh316 said:

@ronvalencia said:

@scatteh316 said:

And here you go again doing direct comparisons with PC hardware.

Maybe if I say it again it might sink in...

YOU CAN NOT COMPARE PC AND CONSOLE HARDWARE LIKE THAT.

And Xbone doesn't have the fill rate for 1080p... Any idiot and his dog can see that.

1. GCN based consoles uses PC hardware.

2. AMD Mantle and DirectX 12 are comparable to Xbox One's DirectX 11.X.

3. Battlefield 4 runs on PC DirectX 11, Mantle, Xbox One and PS4.

4. Rebellion > you. Rebellion already explained Xbox one's issues with 1920x1080p i.e. 32 MB ESRAM is too small for 1920x1080p and it's not the 16 ROPS.

5. I have memory underclock my 7950 and R9-290 to 127 GB/s and 1920x1080p is not a major issue. You could have 32 ROPS at 240 Gb/s memory bandwidth, but if the 240 GB/s bandwidth is 32 MB VRAM size, it's gimped. It will need the "tiling tricks" to make better use of it.

Your concession is accepted.

No concession, if you want to repeat the same shit over and over again like a Parrot then I will brand you as such.

But..but....I ran my PC GPU's at lower clocks and run tests and did not factor in Windows sucking up resources or differences in driver efficiency.......

But but the Xbone can do 1080p perfectly fine because a low rate developer said it can......

But but it's because developers aren't using titles resources to make the frame buffers fit into ESRAM......

But but I'm to retarded to figure out that frame buffers don't have to be stored in ESRAM so that's no excuse......

But but scatteh316 talks with developers from high budget studios on Beyond3D forums who confirm all what scatteh316 says........

But but if I keep on repeating the same shit someone will believe me.....

Seriously, you're a moron.....

1. Rebellion already stated the main reason why most games for Xbox One are less than 1920x1080p. Rebellion > you. Rebellion is an AMD Gaming Evolved developer, which includes AMD's Mantle. Doing "tiling tricks" hardly equals "Xbone can do 1080p perfectly fine", i.e. MS added extra development complexity. You're nothing.

2. Eurogamer already stated Xbox One's 16 ROPs at 853MHz (or ~17 ROPs at 800MHz effective) are not a major limit for 1920x1080p.

3. Microsoft has already shown X1's 16 ROPs at 853MHz (or ~17 ROPs at 800MHz effectively) can saturate ~150 GB/s of memory bandwidth, hence the fill rate is not a major issue for 1920x1080p. 16 ROPs at 853MHz are capable of reading from and writing to the fastest memory pool (32 MB ESRAM). Did you expect X1's 16 ROPs to stay at a 68-to-72 GB/s fill rate? Provide the raw bandwidth requirement calculation for 16 ROPs at 853MHz.

You're not addressing this point. You're nothing. Note that Microsoft carefully avoided (non-tiled) render target frame-buffers that exceed the 32 MB ESRAM, i.e. overspill can reduce memory-bandwidth-bound fill rates (memory reads and writes).
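
A sketch of the kind of calculation being asked for here, with assumed per-pixel byte costs (an RGBA8 colour write, an extra colour read for blending, and a share of depth traffic); these are illustrative figures, not Microsoft's published methodology.

```python
# Raw bandwidth 16 ROPs at 853 MHz could demand if every ROP exports every cycle.
# Per-pixel byte costs are illustrative assumptions (RGBA8 colour, D24S8 depth).

ROPS, CLOCK_HZ = 16, 853e6
pixels_per_sec = ROPS * CLOCK_HZ                       # ~13.6 Gpixels/s

colour_writes = pixels_per_sec * 4 / 1e9               # ~55 GB/s
plus_blend_reads = pixels_per_sec * (4 + 4) / 1e9      # ~109 GB/s
plus_depth = pixels_per_sec * (4 + 4 + 4) / 1e9        # ~164 GB/s

print(f"colour writes only:       {colour_writes:.0f} GB/s")
print(f"+ blending reads:         {plus_blend_reads:.0f} GB/s")
print(f"+ depth read/write share: {plus_depth:.0f} GB/s")
```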

With Tomb Raider (2013), underclocking the memory bandwidth on my R9-290 to 69 GB/s gimped its 64 ROPs and 40 CUs (i.e. TMUs) to near Xbox One levels, i.e. ~34 fps at 1920x1080.

4. GCN's 32 ROPs weren't specifically designed for the PS4, i.e. they scale to 79x0/89x0-OEM/R9-280/280X products with greater memory bandwidth, e.g. 240 GB/s and higher.

5. I'm aware of the Beyond3D posts that calculated render target sizes for 1920x1080p. You're not the only one who lurks in Beyond3D's forum.

6. Windows' overheads mostly involve the CPU side, which can be largely negated by a very fast CPU, e.g. this is why my overclocked Intel Core i7-4770K gaming PC sees a smaller percentage gain with Mantle than a gaming PC with a lesser CPU. As mentioned in GDC 2014 lectures, AMD loves DirectX 12 since it helps their CPU department.

A low-level runtime is not a silver bullet for GPU-bound workloads.

My 8870M with 2GB of GDDR5 at 72 GB/s (based on the desktop 7770) runs like an Xbox One in most multi-platform games (e.g. Battlefield 4), which is expected for a low ~1 TFLOPS GCN with 68 GB/s. My laptop's quad-core Intel Core i7 Ivy Bridge takes care of Windows' CPU-side overheads.

You're the moron.


#226 xhawk27
Member since 2010 • 12194 Posts

Yes it's going to help the Xbox One hardware.


#227 silversix_
Member since 2010 • 26347 Posts

Yes it will help. From a minimum of 16fps the Xbox One will be able to achieve a mind blowing 18fps and 19fps when DX12.1 comes out in 2019.


#228 GravityX
Member since 2013 • 865 Posts

@silversix_ said:

Yes it will help. From a minimum of 16fps the Xbox One will be able to achieve a mind blowing 18fps and 19fps when DX12.1 comes out in 2019.

Well that's 3 to 4 more fps than Cerny's Knack. So Xbox wins.


#229 Cloud_imperium
Member since 2013 • 15146 Posts

Yes it will, but not by much. DX12 will be more beneficial for PC since it will allow low-level access, and the latest video cards will use its full potential.


#230 GrenadeLauncher
Member since 2004 • 6843 Posts

@silversix_ said:

Yes it will help. From a minimum of 16fps the Xbox One will be able to achieve a mind blowing 18fps and 19fps when DX12.1 comes out in 2019.

@Cloud_imperium said:

Yes it will , but not much . DX 12 will be more beneficial for PC since it will allow low level access and latest video cards will use its full potential .

These two get it.

@blackace said:

I wouldn't even waste my breath. Microsoft will let the games do the talking at the E3.

Greatness is coming!!

To dreaaaaaaaaaaaaaam the impossible dreaaaaaaaaaaaaaaam...


#231 Solid_Max13
Member since 2006 • 3596 Posts

@GrenadeLauncher said:

@silversix_ said:

Yes it will help. From a minimum of 16fps the Xbox One will be able to achieve a mind blowing 18fps and 19fps when DX12.1 comes out in 2019.

@Cloud_imperium said:

Yes it will , but not much . DX 12 will be more beneficial for PC since it will allow low level access and latest video cards will use its full potential .

These two get it.

@blackace said:

I wouldn't even waste my breath. Microsoft will let the games do the talking at the E3.

Greatness is coming!!

To dreaaaaaaaaaaaaaam the impossible dreaaaaaaaaaaaaaaam...

I think it will help, but not as much as people hyped it up to be. After reading and analyzing, it seems it's more of a PC feature than anything. Of course @ronvalencia is up there trying to spin some BS that makes absolutely no sense about how the gimped hardware will somehow be fine and do better than the PS4, which is just utterly ridiculous.


#232 silversix_
Member since 2010 • 26347 Posts

@blackace said:

I wouldn't even waste my breath. Microsoft will let the games do the talking at the E3.

Greatness is coming!!

And what do you expect Sony to do? They will do the same, but have many more exclusives to offer, and those exclusives won't look like some last-gen crap at 50% higher rez (900p). So no matter what, Sony will win E3 and you'll be talking about E3 2015 as the Xbone's savior...


#233 GrenadeLauncher
Member since 2004 • 6843 Posts

@Solid_Max13 said:

@GrenadeLauncher said:

@silversix_ said:

Yes it will help. From a minimum of 16fps the Xbox One will be able to achieve a mind blowing 18fps and 19fps when DX12.1 comes out in 2019.

@Cloud_imperium said:

Yes it will , but not much . DX 12 will be more beneficial for PC since it will allow low level access and latest video cards will use its full potential .

These two get it.

@blackace said:

I wouldn't even waste my breath. Microsoft will let the games do the talking at the E3.

Greatness is coming!!

To dreaaaaaaaaaaaaaam the impossible dreaaaaaaaaaaaaaaam...

I think it will help but not as muc as peopley hyped it up to be, after reading, and analyzing it seems it's more a PC feature thna anything, of course @ronvalencia is up there trying to spin some bs that makes absolutely no sense, about how the gimped hardware will somehow be fine and do more better than the PS4 is just utter ridiculous.

Yeah, PC and mobile. PCs get the low-level API capabilities that consoles enjoy, and mobile batteries last longer. The Xbone will get quicker porting between it and PC.


#234  Edited By Solid_Max13
Member since 2006 • 3596 Posts

@GrenadeLauncher said:

@Solid_Max13 said:

@GrenadeLauncher said:

@silversix_ said:

Yes it will help. From a minimum of 16fps the Xbox One will be able to achieve a mind blowing 18fps and 19fps when DX12.1 comes out in 2019.

@Cloud_imperium said:

Yes it will , but not much . DX 12 will be more beneficial for PC since it will allow low level access and latest video cards will use its full potential .

These two get it.

@blackace said:

I wouldn't even waste my breath. Microsoft will let the games do the talking at the E3.

Greatness is coming!!

To dreaaaaaaaaaaaaaam the impossible dreaaaaaaaaaaaaaaam...

I think it will help but not as muc as peopley hyped it up to be, after reading, and analyzing it seems it's more a PC feature thna anything, of course @ronvalencia is up there trying to spin some bs that makes absolutely no sense, about how the gimped hardware will somehow be fine and do more better than the PS4 is just utter ridiculous.

Yeah, PC and mobile. PC's get the low level API capabilities that consoles enjoy and mobile batteries last longer. Xbone will get quicker porting between it and PC.

I just noticed my writing above; I seem to have been drunk while I was writing that, lol. But yeah, I see it as more of a tool for PC and quicker porting between Windows platforms, i.e. Windows, Windows Mobile, etc. As for the gaming elements, it seems that if we want to see anything Xbox-related we have to wait till around 2015, so right now it is speculation, but I don't see anything big happening to it in the foreseeable future.


#235 NFJSupreme
Member since 2005 • 6605 Posts

I see people with no real background in programming or computer engineering still want to argue with Ron. All he is going to do is talk over your head while you try to argue your opinion against his numbers. If you don't know the calculations to disprove him, then you are only wasting your time.


#236  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Solid_Max13 said:

@GrenadeLauncher said:

@silversix_ said:

Yes it will help. From a minimum of 16fps the Xbox One will be able to achieve a mind blowing 18fps and 19fps when DX12.1 comes out in 2019.

@Cloud_imperium said:

Yes it will , but not much . DX 12 will be more beneficial for PC since it will allow low level access and latest video cards will use its full potential .

These two get it.

@blackace said:

I wouldn't even waste my breath. Microsoft will let the games do the talking at the E3.

Greatness is coming!!

To dreaaaaaaaaaaaaaam the impossible dreaaaaaaaaaaaaaaam...

I think it will help but not as muc as peopley hyped it up to be, after reading, and analyzing it seems it's more a PC feature thna anything, of course @ronvalencia is up there trying to spin some bs that makes absolutely no sense, about how the gimped hardware will somehow be fine and do more better than the PS4 is just utter ridiculous.

Your post is BS. Rebellion > you.

I'd rather side with Rebellion, an AMD Gaming Evolved + Mantle developer, than with you.

You haven't done any raw memory bandwidth calculations for 16 ROPs at 853MHz (i.e. 17.06 ROPs at 800MHz effective).

I have always said X1 will be inferior to any GCN with a higher CU (ALU/TMU) count. I do not support the noobie "X1's 16 ROPs vs PS4's 32 ROPs" BS, since fill rates are a combined function of ROPs and memory operations.


#237 silversix_
Member since 2010 • 26347 Posts

@ronvalencia said:

@Solid_Max13 said:

@GrenadeLauncher said:

@silversix_ said:

Yes it will help. From a minimum of 16fps the Xbox One will be able to achieve a mind blowing 18fps and 19fps when DX12.1 comes out in 2019.

@Cloud_imperium said:

Yes it will , but not much . DX 12 will be more beneficial for PC since it will allow low level access and latest video cards will use its full potential .

These two get it.

@blackace said:

I wouldn't even waste my breath. Microsoft will let the games do the talking at the E3.

Greatness is coming!!

To dreaaaaaaaaaaaaaam the impossible dreaaaaaaaaaaaaaaam...

I think it will help but not as muc as peopley hyped it up to be, after reading, and analyzing it seems it's more a PC feature thna anything, of course @ronvalencia is up there trying to spin some bs that makes absolutely no sense, about how the gimped hardware will somehow be fine and do more better than the PS4 is just utter ridiculous.

Your is post BS. Rebellion > you. Your nothing.

When will X1's secret sauce be available to its customers?


#238  Edited By blackace
Member since 2002 • 23576 Posts

@silversix_ said:

Yes it will help. From a minimum of 16fps the Xbox One will be able to achieve a mind blowing 18fps and 19fps when DX12.1 comes out in 2019.

We'll see. LMAO!!

************************************************************

@silversix_ said:

@blackace said:

I wouldn't even waste my breath. Microsoft will let the games do the talking at the E3.

Greatness is coming!!

and what do you expect sony to do? They will do the same but have much more exclusives to offer and those exclusives won't be looking like some last gen crap at 50% higher rez (900p). So no matter what Sony will win E3 and you'll be talking about 2015 e3 as an Xbone savior...

We'll see!! LOL!!!! We'll see!!


#239  Edited By blackace
Member since 2002 • 23576 Posts

@NFJSupreme said:

I see people with no real background programming or computer engineering still want to argue with ron. All he is going to talk over your head while you try to argue your opinion against his numbers. If you don't know the calculations to disprove then you are only wasting your time

That's why I told Ron not to even waste his breath. Most of the biased trolls on here are clueless and just spout BS they know absolutely nothing about. By this time next year, every one of them will be eating their words and damage controlling like their lives depended on it. lol!!


#240  Edited By ronvalencia
Member since 2008 • 29612 Posts

@silversix_ said:

@ronvalencia said:

@Solid_Max13 said:

@GrenadeLauncher said:

@silversix_ said:

Yes it will help. From a minimum of 16fps the Xbox One will be able to achieve a mind blowing 18fps and 19fps when DX12.1 comes out in 2019.

@Cloud_imperium said:

Yes it will , but not much . DX 12 will be more beneficial for PC since it will allow low level access and latest video cards will use its full potential .

These two get it.

@blackace said:

I wouldn't even waste my breath. Microsoft will let the games do the talking at the E3.

Greatness is coming!!

To dreaaaaaaaaaaaaaam the impossible dreaaaaaaaaaaaaaaam...

I think it will help but not as muc as peopley hyped it up to be, after reading, and analyzing it seems it's more a PC feature thna anything, of course @ronvalencia is up there trying to spin some bs that makes absolutely no sense, about how the gimped hardware will somehow be fine and do more better than the PS4 is just utter ridiculous.

Your is post BS. Rebellion > you. Your nothing.

When will be X1's secret sauce available to their costumers?

There's nothing secret about X1's runtime software, e.g. I already have AMD Mantle on my gaming PCs. I don't have any laptop or desktop PC with an AMD CPU.

Rebellion's Sniper Elite 3 will be available for consoles/PCs around June 27th 2014 to July 1st 2014. http://www.rebellion.co.uk/games/sniper-elite-3

Both the X1 and PS4 versions have a 1920x1080p native resolution. http://gamingbolt.com/xbox-ones-esram-too-small-to-output-games-at-1080p-but-will-catch-up-to-ps4-rebellion-games

Since this is System Wars, not console wars, Rebellion's Sniper Elite 3 on PC supports AMD/EA DICE's Mantle. http://techreport.com/news/25682/sniper-elite-3-to-tap-amd-mantle-api

At this time, Rebellion's Sniper Elite 3 running at 1920x1080p on Xbox One is not the average case. If you want 1920x1080p to be the average result on a console, get a PS4.


#241  Edited By ronvalencia
Member since 2008 • 29612 Posts

@blackace said:

@NFJSupreme said:

I see people with no real background programming or computer engineering still want to argue with ron. All he is going to talk over your head while you try to argue your opinion against his numbers. If you don't know the calculations to disprove then you are only wasting your time

That's why I told Ron not to even waste his breath. Most of the biased trolls on here are clueless and just spout BS they know absolutely nothing about. By this time next year, every one of them will be eating their words and damage controling like their life depended on it. lol!!

The main problem with X1 is the extra complexity of its memory setup, and the industry has inertia with its current 3D engines.

Neither gaming PCs nor the PS4 need X1's "tiling tricks". DirectX 11.2 (with Tiled Resources) is practically dead on arrival on the PC, i.e. there are more games on AMD Mantle (hence its synergy with PS4) than on PC's DirectX 11.2.

PS: AMD Mantle exposes AMD PRT, but fast 2GB GDDR5 reduces the need for it.
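
A toy sketch of the partially-resident-texture / tiled-resource idea referenced above: a large virtual texture is split into fixed-size tiles and only the tiles actually sampled get backed by physical memory. The 64 KB tile size mirrors the common hardware granularity, but the class and its methods are purely illustrative, not AMD's PRT or the D3D API.

```python
# Toy model of partially resident textures (PRT) / tiled resources.
# 64 KB tiles mirror common hardware granularity; the class below is purely
# illustrative and not a real graphics API.

TILE_BYTES = 64 * 1024

class VirtualTexture:
    def __init__(self, width, height, bytes_per_pixel=4, budget_bytes=32 * 1024 * 1024):
        self.tile_px = int((TILE_BYTES // bytes_per_pixel) ** 0.5)  # square tile edge (128 px)
        self.resident = {}                                          # (tx, ty) -> backing data
        self.budget_tiles = budget_bytes // TILE_BYTES

    def sample(self, x, y):
        key = (x // self.tile_px, y // self.tile_px)
        if key not in self.resident:
            if len(self.resident) >= self.budget_tiles:
                self.resident.pop(next(iter(self.resident)))        # crude eviction policy
            self.resident[key] = f"tile {key} streamed in"          # stand-in for a real upload
        return self.resident[key]

vt = VirtualTexture(16384, 16384)                                   # 16K x 16K virtual texture
vt.sample(12000, 9000)
print(f"resident tiles: {len(vt.resident)} of a {vt.budget_tiles}-tile budget")
```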


#242 GravityX
Member since 2013 • 865 Posts

@ronvalencia said:
@blackace said:

@NFJSupreme said:

I see people with no real background programming or computer engineering still want to argue with ron. All he is going to talk over your head while you try to argue your opinion against his numbers. If you don't know the calculations to disprove then you are only wasting your time

That's why I told Ron not to even waste his breath. Most of the biased trolls on here are clueless and just spout BS they know absolutely nothing about. By this time next year, every one of them will be eating their words and damage controling like their life depended on it. lol!!

The main problem with X1 is the extra complexities with it's memory setup and industry has the inertia with the current 3D engines.

Both gaming PC and PS4 doesn't need X1's "tiling tricks". DirectX 11.2 (with Tiled Resource) is practically dead on arrival on the PC i.e. there are more games on AMD Mantle (hence it's synergy with PS4) than on PC's DirectX 11.2.

PS; AMD Mantle has AMD PRT, but fast 2GB GDDR5 reduce the need for it.

Apparently the Xbox One can perform 780 ops and the PS4 can perform 288 ops. So the API is irrelevant.


#243  Edited By ronvalencia
Member since 2008 • 29612 Posts
@GravityX said:

@ronvalencia said:
@blackace said:

@NFJSupreme said:

I see people with no real background programming or computer engineering still want to argue with ron. All he is going to talk over your head while you try to argue your opinion against his numbers. If you don't know the calculations to disprove then you are only wasting your time

That's why I told Ron not to even waste his breath. Most of the biased trolls on here are clueless and just spout BS they know absolutely nothing about. By this time next year, every one of them will be eating their words and damage controling like their life depended on it. lol!!

The main problem with X1 is the extra complexities with it's memory setup and industry has the inertia with the current 3D engines.

Both gaming PC and PS4 doesn't need X1's "tiling tricks". DirectX 11.2 (with Tiled Resource) is practically dead on arrival on the PC i.e. there are more games on AMD Mantle (hence it's synergy with PS4) than on PC's DirectX 11.2.

PS; AMD Mantle has AMD PRT, but fast 2GB GDDR5 reduce the need for it.

Apparently Xbox One can perform 780 ops and the PS4 can perform 288 ops. So api is irrelevant.

Flawed numbers for PS4 and X1.

Assuming PS4's GCN is based on AMD's Pitcairn GCN, its 18 CUs have 1152 ops/cycle.

X1's 768 ops/cycle = 768 stream processors. Add another 12 scalar ops/cycle from its 12 CUs. 768 stream processors at 853MHz are effectively similar to 819 stream processors at 800MHz.

PS4's 1152 ops/cycle = 1152 stream processors. Add another 18 scalar ops/cycle from its 18 CUs.

The CU gap is about 29 percent, i.e. X1 has about 71 percent of PS4's CU power. The gap would be larger if Microsoft f**ked up their runtime software environment, e.g. read http://www.eurogamer.net/articles/digitalfoundry-microsoft-to-unlock-more-gpu-power-for-xbox-one-developers
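
The arithmetic behind those stream-processor and CU-gap figures, as a minimal sketch (peak FLOPS here counts a fused multiply-add as two operations, the usual convention):

```python
# Stream-processor counts and peak throughput behind the CU comparison above.
# 64 stream processors per GCN CU; a fused multiply-add counts as 2 FLOPs.

def gcn_peak(cus, clock_ghz):
    stream_processors = cus * 64
    tflops = stream_processors * 2 * clock_ghz / 1000.0
    return stream_processors, tflops

x1_sp, x1_tf = gcn_peak(12, 0.853)     # 768 SPs, ~1.31 TFLOPS
ps4_sp, ps4_tf = gcn_peak(18, 0.800)   # 1152 SPs, ~1.84 TFLOPS

print(f"X1:  {x1_sp} SPs, {x1_tf:.2f} TFLOPS")
print(f"PS4: {ps4_sp} SPs, {ps4_tf:.2f} TFLOPS")
print(f"X1 ~= {100 * x1_tf / ps4_tf:.0f}% of PS4's CU throughput")
```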

When Rebellion states that X1's new SDK will improve things, it is basically stating that Microsoft f**ked up Xbox One's runtime software environment. We don't know if AMD also played a role in X1's driver performance issues.

-------

On AMD GCN specifics, read slide 21 from http://www.slideshare.net/DevCentralAMD/gs4106-the-amd-gcn-architecture-a-crash-course-by-layla-mah#btnNext

Each Compute Unit (CU) contains 4 SIMD; each SIMD has

- A 16-lane IEEE-754 vector ALU (vALU)

If you do the math, 16 vector lanes x 4 SIMDs = 64 vector/stream processors per CU.

From http://en.wikipedia.org/wiki/Xenos_(graphics_chip)

"divided in three dynamically scheduled SIMD groups of 16 processors each"

Xbox 360's GPU has 48 pipelines/lanes and they're divided into 3 groups of 16 pipelines/lanes. Each pipeline/lane contains 1 scalar + vector4 unit.

Xbox 360's GPU is the closest relative to AMD GCN i.e. the constant "16 vector lane per SIMD group" design. For GCN, AMD separated the scalar component from the vector "lanes".

Summary

GCN CU = 4 groups of 16 vector lanes = 64 vec4.

Xbox 360 = 3 groups of 16 vector lanes = 48 vec4.

AMD already has a "handheld Xbox 360" i.e. AMD Temash APU tablets.

-------

The AMD Jaguar CPU doesn't have a fused multiply-add (FMA) unit, i.e. it has one 128-bit FADD/SSE unit and one 128-bit FMUL/SSE unit. In terms of SSE/SIMD hardware, it's more or less similar to Intel Core 2-era x86-64. AMD GCN supports FMA.

Sony "out-Xboxed" Microsoft's Xbox One with a larger-scale Xbox, i.e. the PS4.

Rebellion already explained why most Xbox One games are less than 1920x1080p. An AMD loyalist would be neutral in the X1 vs PS4 wars, since both boxes have AMD GPUs.


#244  Edited By GravityX
Member since 2013 • 865 Posts

@ronvalencia said:

@GravityX said:

@ronvalencia said:
@blackace said:

@NFJSupreme said:

I see people with no real background programming or computer engineering still want to argue with ron. All he is going to talk over your head while you try to argue your opinion against his numbers. If you don't know the calculations to disprove then you are only wasting your time

That's why I told Ron not to even waste his breath. Most of the biased trolls on here are clueless and just spout BS they know absolutely nothing about. By this time next year, every one of them will be eating their words and damage controling like their life depended on it. lol!!

The main problem with X1 is the extra complexities with it's memory setup and industry has the inertia with the current 3D engines.

Both gaming PC and PS4 doesn't need X1's "tiling tricks". DirectX 11.2 (with Tiled Resource) is practically dead on arrival on the PC i.e. there are more games on AMD Mantle (hence it's synergy with PS4) than on PC's DirectX 11.2.

PS; AMD Mantle has AMD PRT, but fast 2GB GDDR5 reduce the need for it.

Apparently Xbox One can perform 780 ops and the PS4 can perform 288 ops. So api is irrelevant.

Flawed numbers for PS4 and X1.

Assuming PS4's GCN is based from AMD Pitcairn GCN i.e. 18 CU has 1152 op/cycle.

X1's 768 op/cycle = 768 stream processors.

PS4's 1152 op/cycle = 1152 stream processors.

This is like counting ops/cycle like VLIW5/VLIW4 Radeon HDs i.e. data element processing operation per cycle.

-------

Specifics on AMD GCN which is different from the older VLIW5/VLIW4 Radeon HDs.

AMD GCN's CU has 64 stream processors. 64 stream processors are divided into 4 lines i.e. 16 stream processors per line. Each line has 4 SIMD4 operations.

PS4 = There are 16 SIMD operations per CU x 18 = 288 SIMD4 + 18 scalar + 18 branch operations = 324 operations/cycle.

X1 = There are 16 SIMD operations per CU x 12 = 192 SIMD4 + 12 scalar + 12 branch operations = 216 operations/cycle.

This is like counting GCN CU's ops/cycle like Intel SSE.

AMD Jaguar CPU doesn't have Fused Add/Multiply (FMA) unit i.e. it has 128bit 1 FADD/SSE and 128bit 1 FMUL/SSE units. In terms of SSE/SIMD hardware, it's more or less similar to Intel Core 2 era X86-64. AMD GCN supports FMA.

I knew it was BS, needed your expertise to break it down. Appreciate it Ron.


#245 Tighaman
Member since 2006 • 1038 Posts

@GravityX:

He's not going to believe you. I knew that was bogus when he started spewing that on his site and mrweinstein and mrc tried to correct him. Please send them to Ron; I always believe him more than anyone on these sites. NNL: numbers never lie.


#246 lbjkurono23
Member since 2007 • 12544 Posts

It doesn't hurt to dream.


#247  Edited By ronvalencia
Member since 2008 • 29612 Posts

@GravityX said:

I knew it was BS, needed you expertise to break down. Appreciate it Ron.

I have revised my post, i.e. I changed it to count vector lanes/vector pipelines.

On AMD GCN specifics, read slide 21 from http://www.slideshare.net/DevCentralAMD/gs4106-the-amd-gcn-architecture-a-crash-course-by-layla-mah#btnNext

Each Compute Unit (CU) contains 4 SIMD; each SIMD has

- A 16-lane IEEE-754 vector ALU (vALU)

If you do the math, 16 vector lanes x 4 SIMD groups = 64 vector/stream processors per CU.

From http://en.wikipedia.org/wiki/Xenos_(graphics_chip)

"divided in three dynamically scheduled SIMD groups of 16 processors each"

Xbox 360's GPU has 48 pipelines/lanes and they're divided into 3 groups of 16 pipelines/lanes. Each pipeline/lane contains 1 scalar + vector4 unit.

Xbox 360's GPU is the closest relative to AMD GCN i.e. the constant "16 vector lane per SIMD group" design. For GCN, AMD separated the scalar component from the vector "lanes".

Summary

GCN CU = 4 groups of 16 vector lanes = 64 vec4. PS4 has 18 CUs at 800Mhz and X1 has 12 CU at 853Mhz (effectively near 13 CU at 800Mhz).

Xbox 360 = 3 groups of 16 vector lanes = 48 vec4.

In terms of raw vector lanes, X1 is about 17 times over Xbox 360's GPU. PS4 is about 24 times over Xbox 360's GPU.

AMD already has a "handheld Xbox 360" i.e. AMD Temash APU tablets.

Xbox 360's Xenos wins since it has a direct successor, i.e. AMD GCN. The dead-end architectures are IBM's Cell and NVIDIA's RSX.


#248 pjbasta76
Member since 2014 • 25 Posts

I have no clue about the techno talk that everyone is using on this thread, but I have a thought. Many developers have praised the PS4 for its power, BUT no one has criticized the X1 for its perceived weakness. It makes me think that there is more to the X1 than we know and that MS and the developers are still working out the kinks.


#249  Edited By tormentos
Member since 2003 • 33793 Posts

@blackace said:

@NFJSupreme said:

I see people with no real background programming or computer engineering still want to argue with ron. All he is going to talk over your head while you try to argue your opinion against his numbers. If you don't know the calculations to disprove then you are only wasting your time

That's why I told Ron not to even waste his breath. Most of the biased trolls on here are clueless and just spout BS they know absolutely nothing about. By this time next year, every one of them will be eating their words and damage controling like their life depended on it. lol!!

Ron is too biased and he loves to invent too many theories, which oddly all revolve around the Xbox One performing better than it should. In fact, the latest includes saying the PS4 has 8 ACEs because it is Temash-based, a theory that I completely shut down with just one post with links.

Just like he claimed the GPU in the Xbox One was Pitcairn when even sites like DF say it is Bonaire.

Ron knows about tech; he is just too biased for his own good and a little too much of an ass kisser for MS.

You of all people should talk about being biased here; you invent sh** about Sony and lie all the damn time, claim to be a manticore, but all you do all day and all night is defend MS. You didn't even know PSN allows you to share games and started asking me and other posters for proof. Did you even own a PS3, fake manticore? How could you miss game sharing on PS3.. lol

So yeah, by this time next year Sony will be kicking MS's ass just as badly as now. Look at Infamous: it's a damn game 3 months from launch and is ultra impressive, much better looking and at a higher resolution than the best-looking game on Xbox One.


#250  Edited By Solid_Max13
Member since 2006 • 3596 Posts

@tormentos: all his posts are doing is making claims, more than half of which have been debunked, and of course he's taking into consideration that "apparently" there is no definitive answer, but he's just going to come back and post that Rebellion stated this and that and Rebellion > you, lol.

Edit: also, I'd love this answered. Watch Dogs was delayed for whatever reason, but in that span of it being delayed, how come Ubisoft was not able to get it up to 1080p? They had more than ample time and still it's being output at 900p.