what's this, "DX12 will have a substantial effect on XB1"?

#402  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos:

Apparently, DirectX 12 is not low-level enough for some programmers.

http://gamingbolt.com/mantle-exposes-more-low-level-features-than-dx12-shares-many-rendering-codes-with-ps4xbox-one#vJ08Eor5oRo8c3U8.99

AMD Mantle Exposes More Low Level Features Than DX12, Shares Many Rendering Codes With PS4 Xbox One.

#403  Edited By tormentos
Member since 2003 • 33793 Posts

@StormyJoe said:

@tormentos: Software HAS closed the gap already! The June SDK update helped move the XB1 from mostly 720p for multiplats to 900p.

As for your second comment. Seriously, stop replying as if you know anything about software development or optimization - it just makes you look stupid.

This ^^ is total bullshit..

First of all, games on Xbox One were never mostly 720p; in fact, only 3 launch games were 720p.

COD: Ghosts, BF4, and Killer Instinct. The rest were either 900p or 1080p, like Forza, Need for Speed, and several other games.

Just 2 months ago, when you people pretended 720p was a thing of the past, a game shipped at that resolution. You believed parity had been achieved because of one game that aimed for parity from the start, only to see games like DA and SOM come out inferior again.

All I know about software is that changing the drivers on a Pentium 166 will not make it work like a 1GHz Pentium III.

@04dcarraher said:

el tormentos...... denial confirmed lol

Saying DX11.X is better than DX12 on PC? lol... DX11.X does not have the multithreading capabilities that DX12 introduces. Every DX11 version uses only one primary core to feed the GPU data, which limits how well each thread/core can be allocated to different tasks. And this effect is seen in multiple multiplat games (see the sketch after this quote).

And that is one desperate attempt at arguing OpenGL is actually better today... that whole article is over 5 years old, good going lol.

Fact is that DX is easier to use and DX11 is overall better than what OpenGL provides today. And with DX12 around the corner providing a low-level API with true multithreading support, Apple dumping OpenGL, and AMD focusing on Mantle, OpenGL has an uphill battle to get people to use it. Hell, John Carmack, an OpenGL-only type of developer, admitted DX is better than OpenGL all the way back in 2011.
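For readers unfamiliar with the distinction being argued in the quote above, here is a rough conceptual sketch of single-threaded versus multithreaded command submission. This is plain Python rather than actual Direct3D API code, the draw-call and worker counts are illustrative, and Python's GIL means the parallelism is only structural; it shows where the recording work moves, nothing more.

```python
import threading

DRAWS_PER_SCENE = 8000   # illustrative draw-call count
WORKERS = 4              # illustrative worker-thread count

def record_commands(draw_calls, out_list):
    # Stand-in for CPU-side work: encode state changes + draw calls
    # into a command list that will later be handed to the GPU.
    for call in draw_calls:
        out_list.append(("draw", call))

# DX11-style model: one thread records every draw call.
single_list = []
record_commands(range(DRAWS_PER_SCENE), single_list)

# DX12-style model: each worker records its own command list,
# and only the final submission is serialized on one thread.
chunks = [range(i, DRAWS_PER_SCENE, WORKERS) for i in range(WORKERS)]
command_lists = [[] for _ in range(WORKERS)]
threads = [threading.Thread(target=record_commands, args=(c, l))
           for c, l in zip(chunks, command_lists)]
for t in threads:
    t.start()
for t in threads:
    t.join()
submitted = [cmd for cl in command_lists for cmd in cl]  # single submission point

print(len(single_list), len(submitted))  # same total work either way
```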

DX11.X is only on Xbox One, and I am sure it has those too. Stop with your sorry-ass damage control. Want to argue about something real? Tell StormyJoe that the difference between 720p/900p and 1080p can be seen; there you have something you can really argue about.

The gains DX12 brings to PC have been on Xbox One since day one, and they were born on consoles, not the other way around. And since you're such an idiot that you don't even know the history between OpenGL and DX, your opinion is basically useless.

Being easier doesn't mean being better. And 2011? When AMD claimed that DX11 was something developers wanted out of the way? When AMD claimed DX was full of latency and was the reason PC graphics were basically held back?

Oh please, most of the features DX12 brings to the table are already in OpenGL 4.5, and no one has to wait for late 2015 to access them..lol

At the Game Developers Conference 2014, in a panel including NVIDIA's Cass Everitt and John McDonald, AMD's Graham Sellers, and Intel's Tim Foley, explanations and demonstrations were given suggesting OpenGL could unlock as much as a 7X to 15X improvement in performance. Even without fine tuning, they note that in general OpenGL code is around 1.3X faster than DirectX. It almost makes you wonder why we ever settled for DirectX in the first place—particularly considering many developers felt DirectX code was always a bit more complex than OpenGL code.

http://www.anandtech.com/show/7890/return-of-the-directx-vs-opengl-debates

Last year.

http://blogs.nvidia.com/blog/2014/03/20/opengl-gdc2014/

Bindless, multi-draw: basically everything in DX12 is already in OpenGL and available now; you don't have to wait. Like Windows, DX is more widely used because of MS's muscle, not because DX is better or more efficient than OpenGL.

http://www.extremetech.com/gaming/133824-valve-opengl-is-faster-than-directx-even-on-windows

Remember this little test done by Valve? 20% faster than DX using the same hardware, but but but Carmack says..lol

@delta3074 said:

Completely agree with this, I have a Google Nexus 7 2012 and there's a ton of Android games that will not run on it.

'You say all apps are developed for Android and Apple at the same time, go try and download games like Bioshock and Infinity Blade 3 for your android device tell me how that works out for you.'

Completely agree, tormentos is talking utter bollocks. There are a TON of iOS games I want that I am still waiting to come to Android, and a ton of games I want to play that will never see the light of day on Android. In my experience the iOS version of a game appears weeks, sometimes months, before the Android version. They only just released the Android version of Lego Star Wars: The Complete Saga and I cannot play it on my Nexus 7, because it's only available for Kindle Fire, which means you need a PowerVR GPU to run the game unless you have a rooted tablet with GL Tools.

It's clear to me that when it comes to iOS and Android, tormentos doesn't have a fucking clue.

Piracy is also rife on Android: virtually every paid game can be easily downloaded for free from the internet and sideloaded, due to the very open nature of Android OS. It is extremely hard to get paid iOS games for free because of Apple's quite draconian DRM policies, so it stands to reason and makes logical sense that Apple would make more revenue on iOS games than Google makes from Android game sales.

Question: What iOS devices will BioShock Mobile not support?

Answer: The game will not support iPhone 4s and below at launch. Likewise, it will not support iPad 3 and below at launch. If you are playing on an iPhone 5, you may encounter some issues like low framerates and hitching. Please note that your device needs 1 GB of RAM in order to play BioShock.

http://support.2k.com/hc/en-us/articles/203408763--iOS-BioShock-FAQ

If you don't know how to run it, that is your thing. The same shit was said about the Galaxy S1 and the Tegra 2 games, which supposedly would not work on the S1, and all you needed to run them was Chainfire tools, which is basically a software driver.

If you can't get it to work, you are out of luck.

http://androidcommunity.com/nexus-7-quad-core-overclocked-to-2-0-ghz-shatters-benchmarks-20120825/

That Nexus 7 flies if you know how to make it fly.

And I already showed how Android has more apps.

@delta3074 said:

@sts106mat said:

@delta3074: unsurprisingly, tormentos is a fanboy of android...who'd have thought it?

I am actually a huge Android fan, but I can acknowledge that iOS has more games, better games, and makes more revenue on games because piracy is rife on Android. I just don't like Apple products out of principle; their DRM is draconian and they charge way too much for their hardware.

I don't know about more games now, but I am sure Android has more apps. Revenue, probably; most iPhone fans are idiots who love to pay more for less.

Piracy is rife on iOS too, where the fu** have you been? It has been for years; I have several friends who don't pay a single cent for apps on iOS.

@ronvalencia said:

While there are runtime improvements,

1600x900 is 69.4 percent of 1920x1080. The resolution difference is more apparent when 30 fps is the target.

If floating 60 fps x 1920x1080 is the target, Xbox One approximately has 42 fps and PS4 has 60 fps i.e. there's no resolution difference with this setup. The next Xbox One refresh should have FreeSync + DisplayPort 1.2a so non-30 or non-60 fps gaming would be tear free.

1.31 TFLOPS is 71.2 percent of 1.84 TFLOPS. 1600x900 is the closest 16:9 resolution to Xbox One's 71.2 percent of PS4.

For 1600x900 vs 1920x1080, 1.27696 TFLOPS is 69.4 percent of 1.84 TFLOPS.

These are programmable-shader limits influencing resolution outcomes. DirectX 12 wouldn't solve this issue.

Alternative resolutions close to 71.2 percent of PS4's shader budget:

1440x1080 i.e. 75 percent of 1920x1080.

1400x1050 i.e. 70.89 percent of 1920x1080.

Using these resolutions on a 16:9 display would need scaling and aspect-ratio correction (a short numeric sketch of these ratios follows).
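As a quick illustration of the arithmetic being quoted here, a minimal sketch comparing the two consoles' FLOP ratio with the pixel-count ratio of the candidate render resolutions; the TFLOPS figures are the ones used above, nothing else is implied.

```python
# Compare shader (FLOP) ratio of the two consoles against the pixel-count
# ratio of candidate render resolutions relative to 1080p.
XB1_TFLOPS, PS4_TFLOPS = 1.31, 1.84

def pixel_ratio(w, h, ref_w=1920, ref_h=1080):
    """Pixel count of (w, h) as a fraction of the reference resolution."""
    return (w * h) / (ref_w * ref_h)

flop_ratio = XB1_TFLOPS / PS4_TFLOPS          # ~0.712 (71.2 percent)
for w, h in [(1600, 900), (1440, 1080), (1400, 1050)]:
    print(f"{w}x{h}: {pixel_ratio(w, h):.1%} of 1080p "
          f"(shader ratio {flop_ratio:.1%})")
```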

At this time, DirectX 12 publically known to improve textures/material handling for Xbox One,

PS4 has a weakness i.e. the more the CPU is used, it would result in disproportionately less bandwidth for the GPU. On Xbox One, ESRAM frame buffer bandwidth remains the same. Xbox One's ESRAM's effective bandwidth is already exceeding PS4's effective GDDR5 bandwidth, but using that advantage has it's own issues.

PC's 256bit GDDR5 2GB and 128bit DDR3 8GB model still beats PS4's gaming results i.e. it's between X1 and PS4 extremes. With X1's silicon budget, the alternative game console would have 7870 XT/7950 class GPU. HBM upgrade will make this setup cheaper i.e. move motherboard GDDR5 on to the GPU chip package. 7870 XT/7950 class GPU translates to R9-285 with improved frame buffer compression when using updated GCN.

There's a chance for Nintendo to deliver the PC like gaming if they configure their next console correctly.

No, that was a dos-and-don'ts slide at GDC, not an actual confirmation that the PS4 can only use a certain amount of CPU time. Also, the PS4 has an extra bus which passes 20GB/s up and down between CPU and GPU.

I have told you 100 times: having 300GB/s with a weak-ass GPU means total shit. Just because you give the Xbox One 200GB/s doesn't mean it will beat a GPU with less bandwidth; that idea is a total joke and proven wrong by the barrage of games which are superior on PS4. Hell, UFC uses MSAAx2 on Xbox One while on PS4 it uses MSAAx4, which consumes more bandwidth. Bandwidth without GPU power means nothing; the 660 Ti has 142GB/s and can go toe to toe with the stock 7950, which has 240GB/s.

@04dcarraher said:

@magicalclick said:

I have a feeling that XboxOne current DX11.x is still DX11 based. It would be crazy to do another copy of DX11 with new multi-threading model while they already knew they can just update the console with DX12 later.

Not according to el tormentos

I just want DX12 out to see it fail to deliver and see what your excuses will be...

#404 tormentos
Member since 2003 • 33793 Posts

@ronvalencia said:

@tormentos:

Apparently, DirectX 12 is not low-level enough for some programmers.

http://gamingbolt.com/mantle-exposes-more-low-level-features-than-dx12-shares-many-rendering-codes-with-ps4xbox-one#vJ08Eor5oRo8c3U8.99

AMD Mantle Exposes More Low Level Features Than DX12, Shares Many Rendering Codes With PS4 Xbox One.

Probably for the same reason you will get a few more cycles out of Mantle than out of DX12, at least on AMD hardware: AMD isn't targeting 50 GPUs like DX12 is, which leads me to believe that DX12 is still not as efficient as the Xbox One's DX11.X, which is a custom API for the Xbox.

#405  Edited By ronvalencia
Member since 2008 • 29612 Posts
@tormentos said:

@ronvalencia said:

While there are runtime improvements,

1600x900 is 69.4 percent of 1920x1080. The resolution difference is more apparent when 30 fps is the target.

If floating 60 fps x 1920x1080 is the target, Xbox One approximately has 42 fps and PS4 has 60 fps i.e. there's no resolution difference with this setup. The next Xbox One refresh should have FreeSync + DisplayPort 1.2a so non-30 or non-60 fps gaming would be tear free.

1.31 TFLOPS is 71.2 percent of 1.84 TFLOPS. 1600x900 is the closest 16:9 resolution to Xbox One's 71.2 percent of PS4.

For 1600x900 vs 1920x1080, 1.27696 TFLOPS is 69.4 percent of 1.84 TFLOPS.

These are programmable-shader limits influencing resolution outcomes. DirectX 12 wouldn't solve this issue.

Alternative resolutions close to 71.2 percent of PS4's shader budget:

1440x1080 i.e. 75 percent of 1920x1080.

1400x1050 i.e. 70.89 percent of 1920x1080.

Using these resolutions on a 16:9 display would need scaling and aspect-ratio correction.

At this time, DirectX 12 publically known to improve textures/material handling for Xbox One,

PS4 has a weakness i.e. the more the CPU is used, it would result in disproportionately less bandwidth for the GPU. On Xbox One, ESRAM frame buffer bandwidth remains the same. Xbox One's ESRAM's effective bandwidth is already exceeding PS4's effective GDDR5 bandwidth, but using that advantage has it's own issues.

PC's 256bit GDDR5 2GB and 128bit DDR3 8GB model still beats PS4's gaming results i.e. it's between X1 and PS4 extremes. With X1's silicon budget, the alternative game console would have 7870 XT/7950 class GPU. HBM upgrade will make this setup cheaper i.e. move motherboard GDDR5 on to the GPU chip package. 7870 XT/7950 class GPU translates to R9-285 with improved frame buffer compression when using updated GCN.

There's a chance for Nintendo to deliver the PC like gaming if they configure their next console correctly.

No that was a does and don't in a GDC not an actual confirmation that the PS4 can't only use certain amount of CPU time,also the PS4 has an extra connection which passes 20GB/s up and down between CPU and GPU.

I have told you 100 time having 300GB's with a weak ass GPU mean total shit,just because you give the xbox one 200GB/s doesn't mean you will be a GPU with less bandwidth is a total joke and proven wrong by the barrage of games which are superior on PS4,hell UFC uses MSAAx2 on xbox one on PS4 is uses MSAAx4 which consumes more bandwidth,bandwidth without GPU power mean nothings,the 660ti has 142GB/s and can go toe to toe with the stock 7950 which has 240Gb/s.

NVIDIA Kepler is not AMD GCN 1.0/1.1, hence it has different ROPS and memory controller behaviours, i.e. it's not an apples-to-apples comparison.

The 7950 (28 CU) was already rivalled, or very nearly so, by the FirePro W8000 (28 CU) with a 256-bit bus, and both use AMD's Tahiti Pro chip. The 7950's 384-bit memory is useful for overclocked editions, e.g. my 7950 was a 900MHz factory-overclocked edition, which is close to a normal 7970.

The FirePro W8000's consumer counterpart is the 7870 XT, which uses the Tahiti LE chip (24 CU) with higher clock speeds, i.e. a 925MHz base clock, which rivals the non-BE 7950.

7870 XT = 925 Mhz x 24 CU = 2.84 TFLOPS.

7950 = 800 Mhz x 28 CU = 2.87 TFLOPS

7950 BE = 850 Mhz(or 925mhz Turbo) x 28 CU = 3.05 TFLOPS to 3.315 TFLOPS

7950 900Mhz OC ed = 3.23 TFLOPS

7970 = 3.79 TFLOPS.
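For readers wondering where these figures come from, here is a minimal sketch of the usual GCN arithmetic, assuming the standard formula of clock (GHz) x compute units x 64 stream processors per CU x 2 FLOPs per clock; the clocks and CU counts are the ones listed above.

```python
# Peak single-precision throughput for a GCN GPU:
# clock (GHz) x CUs x 64 stream processors per CU x 2 FLOPs per clock (FMA).
def gcn_tflops(clock_ghz, compute_units):
    return clock_ghz * compute_units * 64 * 2 / 1000.0

cards = [
    ("7870 XT", 0.925, 24),
    ("7950", 0.800, 28),
    ("7950 BE (base)", 0.850, 28),
    ("7950 900MHz OC", 0.900, 28),
    ("7970", 0.925, 32),
]
for name, clk, cu in cards:
    print(f"{name}: {gcn_tflops(clk, cu):.2f} TFLOPS")
```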

The Radeon R9-285 (28 CU) with 256-bit memory beats the R9-280 with its 384-bit bus, and it comes close to / rivals the R9-280X, i.e. the R9-285 has frame buffer compression improvements.

Xbox One's effective ESRAM bandwidth is lower than 200 GB/s, hence it's a pointless line of argument. Microsoft has already shown their effective ROPS and ESRAM memory bandwidth to be around 150 GB/s. If frame buffer tiling is used, Xbox One's remaining issue would be its 1.31 TFLOPS of shader power, which wouldn't be a big issue for proper first-party games, i.e. they would have budgeted their shader workload against the hardware.

Sony hasn't removed PS4's CPU access to GDDR5 memory i.e. AI and physics calculations would be done with memory interactions prior to GPU command buffer creation.

Ever since Commodore Amiga 500/1200's IGP and unified memory architecture, I have disliked unified memory architecture.

PS3's implementation is wrong i.e. you don't split GPU work between two sets of memory pools with two separate asymmetric graphics/dsp processors.

The gaming PC setup with fat dGPU + VRAM is better i.e. CPU can do its AI + Physics calculations with DDR3 memory without slowing down GPU's VRAM. It's only when the CPU side has created the command buffer that a transfer is required to the GPU side. The current game console hasn't replicated the gaming PC setup.

The ideal games console would be

1. The CPU has its own memory pool.

2. The GPU has its own memory pool, with sufficient size for modern frame buffer workloads and in-flight texture storage.

3. High-speed links between CPU and GPU. These are used for workloads that need unified memory access, i.e. the CPU can write to the GPU's memory pool or the GPU can write to the CPU's memory pool.

The above setup has the best of both worlds.

From http://wccftech.com/evidence-amd-apus-featuring-highbandwidth-stacked-memory-surfaces/

AMD's next gen APU has both external main memory and on-chip HBM (VRAM) pools i.e. Xbox One done right.

#406 tymeservesfate
Member since 2003 • 2230 Posts
@prawephet said:

@tymeservesfate: Do you actually expect Microsoft to say that DX12 won't do anything for the Bone?

This thread is a complete waste of time. Of course Microsoft is going to praise their hardware and software.

You don't have to believe anything they say... despite multiple people inside and outside of MS having said there will be an effect, some even giving detailed walkthroughs of how and why DX12 will make a difference. It's entirely within your rights not to believe them. Ignore this thread and say it's bullshit nonsense and self-serving PR. Go ahead... do that.

I just hope you're not a cow that blindly believed Sony was going to bring greatness, then waited a year and got almost nothing from them, and is now still blindly keeping the faith in Sony to deliver "teh greatness" while saying MS is feeding people bullshit lol. That would be funny.

#407  Edited By GrenadeLauncher
Member since 2004 • 6843 Posts

@StormyJoe said:

I don't give a **** if technical difference is 1,000,000,000,000,000,000,000,000 pixels. No one can really tell the difference when looking at a TV, Stupid.

So far, the best looking next gen game is Ryse - it's 900p. Suck on that for awhile.

HAHAHAHAH! Jesus, StormyJoke, you're pathetic. Why not go one further, you can't tell the difference when your eyes are closed. Xbox Won!!!!!!

Ryse got kerbstormped long ago. Let me guess, gonna pull that SIGGRAPH shit out? The same people who gave The Crew an award? Jog on. Good job on officially becoming the joke of SW.

@tormentos said:

StormyJoke thinks 1080p and 900p look the same, what did you expect?

#408  Edited By delta3074
Member since 2007 • 20003 Posts

@tormentos said:


@delta3074 said:

Completely agree with this, i have a google nexus 7 2012 and theres a ton of Android games that will not run on it

'You say all apps are developed for Android and Apple at the same time, go try and download games like Bioshock and Infinity Blade 3 for your android device tell me how that works out for you.'

completely agree, tormentos is talking utter bollocks, there are a TON of IOS games that i want that i am still waiting to come to android and a ton of games i want to play that will never see the light of day on android and in my experience the IOS version of games appear weeks sometimes months before the android version, they only just released the android version of lego star wars complete Saga and i cannot play it on my nexus 7 because it's only available for kindle fire which means you need a PowerVR gpu to run the game unless you have a rooted tablet with GL tools.

It's clear to me that when it comes to IOS and android tormentos doesn't have a fucking clue.

Piracy is also rife on android, virtually every paid game can be easily downloaded for free from the internet and side loaded due to the very open nature of Android OS, it is extremely hard to get IOS paid games for free because of apples , quite draconian, DRM policies so it stands to reason and makes logical sense that Apple would make more revenue on IOS games than google makes from Android games sales.

http://support.2k.com/hc/en-us/articles/203408763--iOS-BioShock-FAQ

If you don't know how to run it that is your thing,the same shit was say about the galaxy s 1 and the tegra 2 games,which was say would not work on the S1,and all you need to run them was a chainfire tools which is basically a software driver.

If you can't get it to you are out of luck.

http://androidcommunity.com/nexus-7-quad-core-overclocked-to-2-0-ghz-shatters-benchmarks-20120825/

That nexus 7 fly if you know how to make it fly.

And i already showed how androids has more apps.

@delta3074 said:

@sts106mat said:

@delta3074: unsurprisingly, tormentos is a fanboy of android...who'd have thought it?

I am actually a huge android fan but i can acknowledge that IOS has more games, better games and makes more revenue on games because Piracy is rife on android i just don't like apple products out of principle, there DRM is draconian and they charge way too much for there hardware.

I don't know about more games now but i am sure Android has more apps,revenues probably most iphone fans are idiots who love to pay more for less.

Piracy is a rife on Ios to were the fu** has you been.? For years it has been i have several friends who don't pay a single cent for apps on ios.


'If you don't know how to run it that is your thing,the same shit was say about the galaxy s 1 and the tegra 2 games,which was say would not work on the S1,and all you need to run them was a chainfire tools which is basically a software driver.'

I know what Chainfire tools is, tormentos, and I also have a Galaxy S1 with a custom CyanogenMod 4.3 Android OS running on it (official Android for the S1 only goes up to 2.6).

Nobody uses Chainfire anymore anyway; people use GL Tools these days.

https://play.google.com/store/apps/details?id=com.n0n3m4.gltools&hl=en

You have to have a rooted phone or tablet to use Chainfire 3D, because it's a software layer that sits between the game and the GPU and changes the instructions sent from the game to the GPU.

'Piracy is a rife on Ios to were the fu** has you been.? For years it has been i have several friends who don't pay a single cent for apps on ios'

You need to jailbreak an iPhone or iPad to play pirated games, and certain games are deliberately gimped if your phone is jailbroken.

You don't need to root an Android phone to play pirated games; that's the difference.

Only people with jailbroken Apple products can play pirated games, while anyone with an Android tab or phone can play pirated Android games.

Piracy is a way bigger issue on Android.

No offence dude, but you should leave this alone. I have a ton of experience messing around with iPhones and Android phones and tablets; it's been my hobby since I got my S1 4 years ago.

I am a tablet gamer these days, so I know my shit when it comes to Android and iOS. When you have actually rooted one and put a custom OS on it, then come and talk to me.

Seriously Tormentos, this is something I know more about from personal experience; all you can do is post articles you find on the internet. Give it up, because your bare-bones description of Chainfire 3D was pathetic and showed that you don't understand how Chainfire actually works.

You need to know which GPU the game is made for before you can even think about using it, and rooting Android phones and tablets is not child's play, dude. One wrong move and you can brick the bootloader and destroy your phone or tablet, and you need a specific custom OS for each different model of phone or tablet; choose the wrong one and you have a dead phone/tablet.

I would like to point out now that I do not play pirated games, and I do not use Chainfire tools, because there are very few games that don't run on the Tegra 3 which my Google/Asus Nexus 7 2012 has, and CyanogenMod custom OSes are endorsed by Google and supported by Microsoft.

http://www.stuff.tv/android/microsoft-investing-aftermarket-android-operating-system-cyanogenmod/news

Also, if you overclock the Nexus 7 to 2.0 GHz it won't last that long. I know, I have overclocked Android devices, and it's the same deal as with PCs.

Also, overclocking the CPU won't benefit you that much, as very few games need a processor over 1.4 GHz, which the Nexus 7 can easily manage.

It already flies, I don't need to overclock it, and I WILL NOT ROOT MY NEXUS 7 to overclock it, because it is far too risky to root a Nexus 7.

Word to the wise: if you have to mess around with the bootloader to root your device, DON'T DO IT unless you are completely sure what you are doing, because once you change the bootloader you CANNOT reverse it.

#409 delta3074
Member since 2007 • 20003 Posts
@GrenadeLauncher said:


StormyJoke thinks 1080p and 900p look the same, what did you expect?

They do look the same, if you sit further away when you are playing the Xbone, or if you are wearing glasses that are too strong for you when playing the PS4 version, lol

#410 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@GrenadeLauncher said:

@StormyJoe said:

I don't give a **** if technical difference is 1,000,000,000,000,000,000,000,000 pixels. No one can really tell the difference when looking at a TV, Stupid.

So far, the best looking next gen game is Ryse - it's 900p. Suck on that for awhile.

HAHAHAHAH! Jesus, StormyJoke, you're pathetic. Why not go one further, you can't tell the difference when your eyes are closed. Xbox Won!!!!!!

Ryse got kerbstormped long ago. Let me guess, gonna pull that SIGGRAPH shit out? The same people who gave The Crew an award? Jog on. Good job on officially becoming the joke of SW.

@tormentos said:

StormyJoke thinks 1080p and 900p look the same, what did you expect?

Ryse is still the best-looking console game, which does not fit your analysis. Best-looking is always subjective; however, only in this forum is it debated without evidence to the contrary.

#411 tormentos
Member since 2003 • 33793 Posts

@ronvalencia said:

NVIDIA Kepler is not an AMD GCN 1.0//1.1 hence it has different ROPS and memory controller behaviours i.e. it's not apples to apples comparison.

7950 (28 CU) was already rivalled or slightly less than by FirePro W8000 (28 CU) with 256bit bus and both are using AMD's Tahiti Pro chip. 7950's 384bit memory is useful for overclock editions e.g. my 7950 was 900Mhz factory overclock edition, which is close to normal 7970.

FirePro W8000's consumer part is 7870 XT which uses Tahiti LE chip (24 CU) with higher clock speeds i.e. 925Mhz base clock which rivals non-BE 7950.

7870 XT = 925 Mhz x 24 CU = 2.84 TFLOPS.

7950 = 800 Mhz x 28 CU = 2.87 TFLOPS

7950 BE = 850 Mhz(or 925mhz Turbo) x 28 CU = 3.05 TFLOPS to 3.315 TFLOPS

7950 900Mhz OC ed = 3.23 TFLOPS

7970 = 3.79 TFLOPS.

Radaon HD R9-285 (28 CU) with 256bit memory beats R9-280 with 384bit and it's close/rivals to R9-280X i.e. R9-285 has frame buffer compression improvements.

Xbox One's effective ESRAM bandwidth is lower than 200 GB/s, hence it's pointless line. Microsoft already shown their effective ROPS and ESRAM memory bandwidth to be around 150 GB/s. If the frame buffer tiling is used, Xbox One's other issue would be 1.3 TFLOPS shader power which wouldn't be a big issue with proper first party games i.e. they would have budgeted it's shader workload against the hardware.

Sony hasn't removed PS4's CPU access to GDDR5 memory i.e. AI and physics calculations would be done with memory interactions prior to GPU command buffer creation.

Ever since Commodore Amiga 500/1200's IGP and unified memory architecture, I have disliked unified memory architecture.

PS3's implementation is wrong i.e. you don't split GPU work between two sets of memory pools with two separate asymmetric graphics/dsp processors.

The gaming PC setup with fat dGPU + VRAM is better i.e. CPU can do its AI + Physics calculations with DDR3 memory without slowing down GPU's VRAM. It's only when the CPU side has created the command buffer that a transfer is required to the GPU side. The current game console hasn't replicated the gaming PC setup.

The ideal games console would be

1. CPU has it's own memory pool.

2. GPU has it's own memory pool with sufficient size for the modern frame buffer workloads and in-flight texture storage.

3. High speed links between CPU and GPU. This is use for workloads that needs unified memory access i.e. CPU can write to GPU's memory pool or GPU can write to CPU's memory pool.

The above setup has the best of both worlds.

From http://wccftech.com/evidence-amd-apus-featuring-highbandwidth-stacked-memory-surfaces/

AMD's next gen APU has both external main memory and on-chip HBM (VRAM) pools i.e. Xbox One done right.

It doesn't matter, bandwidth is bandwidth. And GCN on PC with hardware stronger than the Xbox One's, like the 7790, has 96GB/s, and that is enough to outperform the Xbox One and sometimes even the PS4, as I have seen in some games.

So having more bandwidth is irrelevant; the Xbox One has underpowered hardware. It's like having a 5-lane race track, but your car is a damn Hyundai Accent running against a stronger Elantra.

All that crap about the FirePro is totally irrelevant to the argument. Fact is, having more bandwidth on Xbox One means nothing, and that has been proven beyond doubt; in fact it went from 140GB/s to 150GB/s, which is not even higher than the PS4.

Physics doesn't even need to run on the CPU on PS4; GPU compute will handle it better and faster and save CPU time.

Yeah, the PS3 was so wrong that it beat the Xbox 360 graphics-wise; I guess they knew something you didn't. But again, that is totally irrelevant.

@tymeservesfate said:

you dont have to believe anything they say...despite multiple people inside and outside of MS have said there will be an effect. Some even giving detailed walkthroughs on how and why DX12 will make an effect...Its entirely in your right to not believe them. Ignore this thread n say its bullshit nonsense n self serving PR. Go ahead...do that.

I just hope you're not a cow that blindly believed Sony was going to bring greatness, then waited a year n got almost nothing from them. And are now still blindly keeping the faith in Sony to deliver "teh greatness" while saying MS is feeding people bullshit lol. That would be funny.

Despite MS's own Phil Spencer saying it will not drastically change graphics... hahahahaa

That is funny, because the PS4 has like 70+ more games than the Xbox One, so greatness came even less to the Xbox One, and it has way fewer games to play. It's so bad that your biggest release in the next 8 months is an indie named Ori, because Evolve is also on PS4.

@delta3074 said:

'If you don't know how to run it that is your thing,the same shit was say about the galaxy s 1 and the tegra 2 games,which was say would not work on the S1,and all you need to run them was a chainfire tools which is basically a software driver.'

i know what chainfire tools is tormentos and i also have a Galaxy S1 with a custom Cyanogenmod 4.3 android Os running on it (official android for S1 only goes up to 2.6)

You have to have a Rooted phone or tablet to use Chainfire 3D because it's software layer that sits between the Game and the GPU and changes the instructions sent from the Game to the GPU.

'Piracy is a rife on Ios to were the fu** has you been.? For years it has been i have several friends who don't pay a single cent for apps on ios'

You need to jailbreak an Iphone, Ipad to play pirated games and certain game are deliberatley gimped if your phone is jailbroken.

You don't need to root and android phone to play pirated games, thats the difference.

Only people with Jailbroken Apple products can play pirated games, Anyone with an android tab or phone can play pirated android games.

Piracy is a way bigger issue on android.

No offence dude but you should leave this alone, i have a ton of experience of messing around with Iphones and android phones and tablets, it's been my Hobby since i got my S1 4 years ago/

I am a tablet gamer these days so i know my shit when it comes to Android and IOS, when you have actually rooted and put a custom OS on one then come and talk to me.

Seriously Tormentos, this is something i know more about from personal experience, all you can do is post articles you find on the internet, Give it up because your bare bone description of chainfire 3D was pathetic and showed that you don't understand how Chainfire actually works.

You need to know which GPU the game is made for before you can even thionk about using it and rooting Android phones and Tablets is not childs play dude, one wrong move and you can Brick the bootloader and destroy your phone or tablet and you need a specific custom OS for each different model of phone or tablet, chosse the wrong one and you have a dead phone/tablet.

I would like to point out now that i do not play pirated games, i do not use chainfire tools because there are very few games that don't run on Tegra 3 which my Google Asus Nexus 7 2012 has and Cyanogenmod custom OS are endorsed by Google and is supported by Microsoft.

http://www.stuff.tv/android/microsoft-investing-aftermarket-android-operating-system-cyanogenmod/news

Also, if you overclock the Nexus 7 to 2.0 Ghz it won't last that long, i know i have overclocked android devices and it's the same deal as with PC's.

Also, overclocking the CPUU won't benefit you that much as very few games need a processor over 1.4 Ghz which the Nexus 7 can easily run.

It already flys, i don't need to overclock it, i WILL NOT ROOT MY NEXUS 7 to overclock it because it is far too Risky to Root a nexus 7.

Word to the wise, if you have to mess around with the Bootloader to Root your device DON'T DO IT unless you are completely sure what you are doing because once you change the Bootloader you CANNOT Reverse it.

Basically, rooting your phone is easy here; they even do it for you. Hell, most places that sell used phones here sell them unlocked and rooted.

Oh please dude, millions upon millions of iPhone users jailbroke their iPhones; my friend has a 6 and it's already jailbroken. Where does this notion come from that only a few people do this? iOS makes more money because they tend to charge for things, while Android tends to rely more on freemium.

Just like Angry Birds used to cost $1 on iOS and was free on Android.

WTF....hahahahaaa

Dude, I used to have Chainfire on my SG1; I had it rooted and overclocked, just like my S2 from Sprint, the Epic 4G Touch, which was the fastest of all the S2s. On my S2 I used to run mijjz, but I ran several more, just like on the S1.

Now I own an LG G3, but I haven't rooted it because I don't game on my phone anymore, other than strategy games like Clash of Clans and other very simple touch games; controls are always a mess.

I have been a member of XDA since 2010 and have rooted phones for friends as well, so yeah, I know.

Oh please, you need to be royally stupid or mess up badly to actually brick your phone, and even then there are ways to save it; not in all cases, but there are. My older stepson bricked a Galaxy S and I revived it with Odin; it got stuck in a boot loop.

And basically what the link you posted says is that MS, as always, is trying to fu** the competition by investing in Cyanogen and trying to segregate Android, which will ultimately fail. Regardless of Cyanogen being a good custom OS, it is small and irrelevant vs Google; most phones come with Android, which is free for makers, and Cyanogen is a ROM inside that Android bubble. Making an OS of its own will yield no results. This is MS; they don't help you for nothing, so the real reason here is to try to fragment Android and see if the pathetic Windows Phones get a breath of air...lol

Hell, that is another market where Sony is beating MS: in the quarter ending September 30, MS sold 9.3 million phones and Sony sold 9.9 million phones, and the Xperia Z is one of the lesser-selling phones..lol

No dude, you don't need to go that high; I was just showing how it can fly. My Sprint S2 is still working and is overclocked; my daughter has it.

It is far too risky for a newbie like you...

Rooting may sound like a tricky procedure, but it's really not. Thanks to an awesome root-kit made specifically for all Nexus devices (including both the 2012 and 2013 Nexus 7), the process for rooting is virtually painless.

Update: There's a new root method available, and it's the easiest one yet. No need for USB cables, computers, drivers, or toolkits. Simply download an app, run it, and you're root 30 seconds later. Check out our guide here, it's worked flawlessly with 2013 Nexus 7's, but currently doesn't work for 2012 editions as well. Worst comes to worst, you can always come back to this guide, as that process does not erase any data.

http://nexus7.wonderhowto.com/how-to/root-your-nexus-7-tablet-running-android-4-4-kitkat-windows-guide-0150849/

Newbie...

I don't play on tablets, I don't like touch controls; although I could use a controller, I don't really feel like doing that.

#412 tormentos
Member since 2003 • 33793 Posts

@ttboy said:

Ryse is still the best looking console game which does not fit into your analysis. Best looking is always subjective however only in this forum it is a debate without evidence to the contrary.

No it isn't..

#413  Edited By deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@tormentos said:

@ttboy said:

Ryse is still the best looking console game which does not fit into your analysis. Best looking is always subjective however only in this forum it is a debate without evidence to the contrary.

No it isn't..

Every System Wars poll has said the opposite, as has it winning at SIGGRAPH. We don't need mindless prattling with no evidence.

#414 tormentos
Member since 2003 • 33793 Posts

@ttboy said:

Every system wars poll has said the opposite as well as winning at Siggraph. We don't need mindless prattlings with no evidence.

System Wars polls mean total shit..

And we don't know which games it went up against at SIGGRAPH; Infamous came after Ryse, in a different year, and so did Driveclub.

Not to mention The Order chews up and spits out Ryse, and it comes out next month.

#415  Edited By Krelian-co
Member since 2006 • 13274 Posts

@ttboy said:

@tormentos said:

@ttboy said:

Ryse is still the best looking console game which does not fit into your analysis. Best looking is always subjective however only in this forum it is a debate without evidence to the contrary.

No it isn't..

Every system wars poll has said the opposite as well as winning at Siggraph. We don't need mindless prattlings with no evidence.

Would you mind posting these "polls" where Ryse has won? And since we are going by polls, I am glad that you acknowledge that the PS4 and Wii U are the best consoles of the gen, while the Xbone is just a poor man's PC.

#416 Prawephet
Member since 2014 • 385 Posts

@tymeservesfate: Actually, developers have been saying it won't make a difference on the Xbox while it will have a substantial effect on the PC.

You want to make fun of cows? They have a couple AA games and multiplats to keep them busy. Microsoft has been promising "DaT SeKrEt SauSsssE!!111!" for 2 years now and hasn't delivered.

The WiiU is the only console delivering and that thing didn't bring the games until a year in as well.

#417  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos:

1. PC benchmarks with the 7790 usually have a stronger CPU driving the GPU. A PC with a 7790 also has more than 96 GB/s of total bandwidth, i.e. did you forget the CPU's memory bandwidth?

2. The 7790 has higher TFLOPS than the Xbox One. Keep in mind X1's tablet-class CPU vs desktop PCs. The closest PC relative to X1's GCN GPU is the R7-260, i.e. 12 CU at 1000MHz with 96 GB/s of VRAM bandwidth. The R7-260 would need to be downclocked to 853MHz and its memory bandwidth increased to 150 GB/s + 68 GB/s.

3. I have stated shader limits will influence rendering resolution i.e. DirectX 12 would not solve this shader bound issue.

Each pixel would have shader resources allocated to it, e.g.

1.31 TFLOPS / 30 fps = 43.67 GFLOPS per frame, and this 43.67 GFLOPS shader resource is spread over a given screen resolution.

1.84 TFLOPS / 30 fps = 61.3 GFLOPS per frame, and this 61.3 GFLOPS shader resource is spread over a given screen resolution.

43.67 GFLOPS per frame is 71.2 percent of 61.3 GFLOPS per frame; this translates to roughly 1600x900 vs 1920x1080 when both consoles are running the same shader program. With 43.67 GFLOPS per frame, a programmer can use it for 1920x1080 or 1600x900 or any screen resolution supported by the system (see the sketch below).

4. Memory bandwidth can also influence frame buffer read and write duration, which impacts rendering frame times. With frame buffer tiling, X1's memory bandwidth is a lesser issue than its shader limits.
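A minimal sketch of this per-frame budget arithmetic, assuming peak-FLOP numbers only; real frames are also bound by bandwidth, ROPs, and CPU, so treat these as rough ceilings rather than predictions.

```python
# Per-frame shader budget at a target frame rate, and the resulting
# per-pixel budget once that frame is spread over a render resolution.
def gflops_per_frame(tflops, fps):
    return tflops * 1000.0 / fps

def flops_per_pixel(gf_frame, width, height):
    return gf_frame * 1e9 / (width * height)

xb1_frame = gflops_per_frame(1.31, 30)   # ~43.7 GFLOPS per frame
ps4_frame = gflops_per_frame(1.84, 30)   # ~61.3 GFLOPS per frame

print(f"XB1 at 1600x900:  {flops_per_pixel(xb1_frame, 1600, 900):,.0f} FLOPs/pixel")
print(f"PS4 at 1920x1080: {flops_per_pixel(ps4_frame, 1920, 1080):,.0f} FLOPs/pixel")
# Both land around 30k FLOPs per pixel, which is the point being made:
# 900p on X1 and 1080p on PS4 leave a similar per-pixel shader budget.
```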

#418  Edited By WitIsWisdom
Member since 2007 • 10376 Posts

I wasn't aware people still placed so much importance on nothingness.

#419 TigerSuperman
Member since 2013 • 4331 Posts
@ronvalencia said:

@tormentos:

1. PC benchmarks with 7790 usually has stronger CPU to drive the GPU. PC's 7790 has more than 96 GB/s bandwidth i.e. did you forget CPU's memory bandwidth?

2. 7790 has higher TFLOPS than Xbox One. Keep in mind X1's tablet class CPU vs desktop PCs. The closest PC relative to X1's GCN GPU is R7-260 i.e. 12 CU at 1000Mhz with 96 GB VRAM. R7-260 needs to be down clocked to 853Mhz and increase memory bandwidth to 150 GB/s + 68 GB/s.

3. I have stated shader limits will influence rendering resolution i.e. DirectX 12 would not solve this shader bound issue.

Each pixel would have shader resource allocate to it e.g.

1.31 TFLOPS / 30 fps = 43.67 GFLOPS per frame and this 43.67 GLOPS shader resource is spread over a given screen resolution.

1.84 TFLOPS / 30 fps = 61.3 GFLOPS per frame and this 61.3 GLOPS shader resource is spread over a given screen resolution.

43.67 GFLOPS per frame is 71.2 percent of 61.3 GFLOPS per frame, this translates to roughly 1600x900 vs 1920x1080 when both consoles are running the same shader program. With 43.67 GFLOPS per frame, a programmer can use it for 1920x1080 or 1600x900 or any screen resolution supported by the system.

4 .Memory bandwidth can also influence frame buffer read and write duration which impacts rendering frame times. With frame buffer tiling, X1's memory bandwidth is a lesser issue than shader limit issues.

You're saying things that sound interesting on paper but have no guarantee in execution.

#420 EG101
Member since 2007 • 2091 Posts

@ronvalencia said:
@StormyJoe said:

@tormentos: Software HAS closed the gap already! The June SDK update helped move the XB1 from mostly 720p for multiplats to 900p.

As for your second comment. Seriously, stop replying as if you know anything about software development or optimization - it just makes you look stupid.

While there are runtime improvements,

1600x900 is 69.4 percent of 1920x1080. The resolution difference is more apparent when 30 fps is the target.

If floating 60 fps x 1920x1080 is the target, Xbox One approximately has 42 fps and PS4 has 60 fps i.e. there's no resolution difference with this setup. The next Xbox One refresh should have FreeSync + DisplayPort 1.2a so non-30 or non-60 fps gaming would be tear free.

1.31 TFLOPS is 71.2 percent of 1.84 TFLOPS. 1600x900 is the closest 16:9 resolution to Xbox One's 71.2 percent of PS4.

For 1600x900 vs 1920x1080, 1.27696 TFLOPS is 69.4 percent of 1.84 TFLOPS.

These are programmable-shader limits influencing resolution outcomes. DirectX 12 wouldn't solve this issue.

Alternative resolutions close to 71.2 percent of PS4's shader budget:

1440x1080 i.e. 75 percent of 1920x1080.

1400x1050 i.e. 70.89 percent of 1920x1080.

Using these resolutions on a 16:9 display would need scaling and aspect-ratio correction.

At this time, DirectX 12 publically known to improve textures/material handling for Xbox One,

PS4 has a weakness i.e. the more the CPU is used, it would result in disproportionately less bandwidth for the GPU. On Xbox One, ESRAM frame buffer bandwidth remains the same. Xbox One's ESRAM's effective bandwidth is already exceeding PS4's effective GDDR5 bandwidth, but using that advantage has it's own issues.

PC's 256bit GDDR5 2GB and 128bit DDR3 8GB model still beats PS4's gaming results i.e. it's between X1 and PS4 extremes. With X1's silicon budget, the alternative game console would have 7870 XT/7950 class GPU. HBM upgrade will make this setup cheaper i.e. move motherboard GDDR5 on to the GPU chip package. 7870 XT/7950 class GPU translates to R9-285 with improved frame buffer compression when using updated GCN.

There's a chance for Nintendo to deliver the PC like gaming if they configure their next console correctly.

Bottom line: this gen is a refinement of last gen, an evolution instead of a revolution in technology. I think the success of the Wii and PS2 (the weakest hardware of their generations) proved to Sony and MS that they don't need to pack mid-level graphics (compared to PC) when they can go low-level.

MS and Sony should have delivered 2.5 - 3 teraflop consoles after the 8-year wait. I would have gladly waited another year if it meant we could have had 2.5 - 3 teraflops for $500 - $550. Sure, new-gen adoption would have been slower, but by the end of year 2 the price could have dropped to near $400 anyway, and at $400 adoption of the new consoles should have been on par with the way the consoles are selling now.

#421 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@Krelian-co said:

@ttboy said:

@tormentos said:

@ttboy said:

Ryse is still the best looking console game which does not fit into your analysis. Best looking is always subjective however only in this forum it is a debate without evidence to the contrary.

No it isn't..

Every system wars poll has said the opposite as well as winning at Siggraph. We don't need mindless prattlings with no evidence.

would you mind posting these "polls" where ryse has won? and since we are going by polls i am glad then that you acknowledge that ps4 and wii u are the best consoles of the gen while xbone is just a poor man pc.

If you had been in System Wars long enough, this wouldn't be a question. This has been debated many times on here, with polls.

#422 GarGx1
Member since 2011 • 10934 Posts

@EG101:

MS and Sony wanted only one thing from their respective consoles: profitability on every unit sold from day one, and an increasing profit margin as the console generation matures.

Low-end hardware makes for cheaper production and larger profits. MS should have ditched the Kinect, though, and spent the extra cash on more power.

#423  Edited By delta3074
Member since 2007 • 20003 Posts

@tormentos said:


@delta3074 said:

'If you don't know how to run it that is your thing,the same shit was say about the galaxy s 1 and the tegra 2 games,which was say would not work on the S1,and all you need to run them was a chainfire tools which is basically a software driver.'

i know what chainfire tools is tormentos and i also have a Galaxy S1 with a custom Cyanogenmod 4.3 android Os running on it (official android for S1 only goes up to 2.6)

You have to have a Rooted phone or tablet to use Chainfire 3D because it's software layer that sits between the Game and the GPU and changes the instructions sent from the Game to the GPU.

'Piracy is a rife on Ios to were the fu** has you been.? For years it has been i have several friends who don't pay a single cent for apps on ios'

You need to jailbreak an Iphone, Ipad to play pirated games and certain game are deliberatley gimped if your phone is jailbroken.

You don't need to root and android phone to play pirated games, thats the difference.

Only people with Jailbroken Apple products can play pirated games, Anyone with an android tab or phone can play pirated android games.

Piracy is a way bigger issue on android.

No offence dude but you should leave this alone, i have a ton of experience of messing around with Iphones and android phones and tablets, it's been my Hobby since i got my S1 4 years ago/

I am a tablet gamer these days so i know my shit when it comes to Android and IOS, when you have actually rooted and put a custom OS on one then come and talk to me.

Seriously Tormentos, this is something i know more about from personal experience, all you can do is post articles you find on the internet, Give it up because your bare bone description of chainfire 3D was pathetic and showed that you don't understand how Chainfire actually works.

You need to know which GPU the game is made for before you can even thionk about using it and rooting Android phones and Tablets is not childs play dude, one wrong move and you can Brick the bootloader and destroy your phone or tablet and you need a specific custom OS for each different model of phone or tablet, chosse the wrong one and you have a dead phone/tablet.

I would like to point out now that i do not play pirated games, i do not use chainfire tools because there are very few games that don't run on Tegra 3 which my Google Asus Nexus 7 2012 has and Cyanogenmod custom OS are endorsed by Google and is supported by Microsoft.

http://www.stuff.tv/android/microsoft-investing-aftermarket-android-operating-system-cyanogenmod/news

Also, if you overclock the Nexus 7 to 2.0 Ghz it won't last that long, i know i have overclocked android devices and it's the same deal as with PC's.

Also, overclocking the CPUU won't benefit you that much as very few games need a processor over 1.4 Ghz which the Nexus 7 can easily run.

It already flys, i don't need to overclock it, i WILL NOT ROOT MY NEXUS 7 to overclock it because it is far too Risky to Root a nexus 7.

Word to the wise, if you have to mess around with the Bootloader to Root your device DON'T DO IT unless you are completely sure what you are doing because once you change the Bootloader you CANNOT Reverse it.

Basically rooting your phone is the easy here they do it even for your,hell most places were they sell used phones here sell it unlock and rooted.

Oh please dude millions upon millions of iphone users jail broke their iphone,my friend has a 6 and already is jailbroken,is this notion that only few people do this,ios make more money because they tent to charge for things,while android tents to rely more on freemium.

Just like angry birds use to cost $1 on ios and was free on Android.

WTF....hahahahaaa

Dude i use to had chain fire on my SG1,i had it rooted and over cloked,just like my S2 from sprint the Epic 4G touch,which was the fastest of all S2,on my S2 i use to ran mijjz but i ran several more,juts like S1.

Now i own an Lg3 but i haven't root it because i don't game on my phone any mores,other than for strategy games like clash of clans and other very simple touch games controls are always a mess.

I have been a member of XDA since 2010,and have rooted phone for friends as well,so yeah i know.

Oh please you need to be royally stupid or mess up good to actually brick your phone,and even then there is ways to save it,not in all cases but there is,my older step son bricked a galaxys and i revived it with Odin,it got stock on a boot loop.

And basically the link you posted there what is say is that MS like always is trying to fu** the competition,by investing in cyanogen and trying to segregate android which will ultimately fail,because irrelevant of cyanogen running a good custom OS is small and irrelevant vs Google,Most phones come with Android which is free for makers,cyanogen is a rom inside that android bobble,making an OS of its own will yield no results this is MS they don't help you for nothing,so the real reason here is try to fragment android and see if the pathetic windows phones have a breath of air...lol

Hell that is another market sony is beating MS,in the quarter ending in September 30 MS sold 9..3 million phones sony sold 9.9 million phones,and the Xperia z is one of the lesser selling phones..lol

No dude you don't need to go that high is was just showing how it can fly,my sprint S2 still working and is over clock my daughter has it.

It is far to risky for a newbie like you...

Rooting may sound like a tricky procedure, but it's really not. Thanks to an awesome root-kit made specifically for all Nexus devices (including both the 2012 and 2013 Nexus 7), the process for rooting is virtually painless.

Update: There's a new root method available, and it's the easiest one yet. No need for USB cables, computers, drivers, or toolkits. Simply download an app, run it, and you're root 30 seconds later. Check out our guide here, it's worked flawlessly with 2013 Nexus 7's, but currently doesn't work for 2012 editions as well. Worst comes to worst, you can always come back to this guide, as that process does not erase any data.

http://nexus7.wonderhowto.com/how-to/root-your-nexus-7-tablet-running-android-4-4-kitkat-windows-guide-0150849/

Newbie...

I don't play on tables i don't like touch controls,alto i could use a controller i don't really feel like doing that.

Do you actually read your own articles?

'it's worked flawlessly with 2013 Nexus 7's, but currently doesn't work for 2012 editions as well'

As I already stated, my tablet is a Nexus 7 2012; reading comprehension for the win.

The Galaxy S1 is one of the EASIEST phones to root on the planet, dude. I just booted up Odin, put my phone into download mode, used Odin to install CWM, then booted the phone into recovery mode and selected the CyanogenMod zip on my SD card, and it was job done; you don't have to mess with the bootloader. I have also recovered an S1 using Odin when it got stuck in a boot loop thanks to a 'nightly' version of CyanogenMod.

My bootloader is still 2.6.

The easiest was the Samsung Galaxy Mini: all I had to do was put a zip file on my SD card, put the phone in recovery mode, and select the option to sideload from the SD card, and it did everything for me; it rooted the phone and installed CWM as well as upgrading my Android version beyond stock.

Wugtools is hit and miss, mate, believe me. I am well aware of how it works, and I am sorry, but anyone worth their salt knows that unlocking and messing around with bootloaders and kernels is always a risky proposition, so I won't go near it. The truth is, if I cannot use Odin then I probably won't bother; I trust Odin, it's always worked well for me, and I would root my Nexus 7 using it if I could find the correct PDA package to install CWM on it so I don't have to mess with the bootloader. If you can point me in the direction of one I would be most grateful.

I apologise for thinking you knew nothing about the subject, it's actually nice to talk to someone else who does and someone else who respects Samsung galaxy S1's as much as i do , i refuse to buy another phone until my S1 is truly dead, i just bought a new longer lasting Lion battery for it.

Windows phones are shit, end of, in my opinion and thats all i have to say about that.

LG? i would have thought you would have bought an Xperia; i want one of the old Xperias with the flip-down controller so i can play MediEvil with a controller.

'And basically what the link you posted there says is that MS, like always, is trying to fu** the competition by investing in Cyanogen and trying to segregate Android'

thats not what i got from the article; seems to me MS want to corner a foreign market that Google wants nothing to do with. It wouldn't segregate Android anyway, like you said, it's a free OS so it cannot be segregated from a business point of view, and CyanogenMod is still Android, and you would still need to install GAPPS to use the Google Play Store and Google Play services, so Cyanogen cannot get away from Google no matter how much money MS throw at them. Cyanogen is a better OS than stock Android though, it runs faster and comes with a proper audio DSP plus a ton of developer options including a command prompt terminal.

You have to get over the idea that everything MS does is designed to screw everyone else man, your hatred of MS is unhealthy, you need to check that dude, it will drive you insane, especially on here, lol

And SONY may be beating MS in several markets, but we both know which is the more successful company overall dude, so there's no need to keep pointing that out because it's a moot point when you look at the bigger picture; there are plenty of markets MS is beating SONY in, like cloud computing and operating systems for example.

there's just no need for it, it's not a game of brinkmanship, let's forget all the 'my dad's bigger than your dad' shit.

I could not care less about MS or SONY anymore, i moved on to tablet gaming and i'm getting my Wii-U soon; i probably won't be on here much after that, there's no need for me to get involved with the SONY vs MS battle anymore because i am no longer a part of it.

I am not laid back enough to be a sheep, lol

Avatar image for Krelian-co
Krelian-co

13274

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#424 Krelian-co
Member since 2006 • 13274 Posts

@ttboy said:

@Krelian-co said:

@ttboy said:

@tormentos said:

@ttboy said:

Ryse is still the best looking console game which does not fit into your analysis. Best looking is always subjective however only in this forum it is a debate without evidence to the contrary.

No it isn't..

Every system wars poll has said the opposite as well as winning at Siggraph. We don't need mindless prattlings with no evidence.

would you mind posting these "polls" where ryse has won? and since we are going by polls i am glad then that you acknowledge that ps4 and wii u are the best consoles of the gen while xbone is just a poor man pc.

If you were in system wars long enough this wouldn't be a question. This has been debated many times on here with polls.

oh i see, so basically they exist in your imagination? you want us to take the word of a delusional lem as proof, riiiiight.

Avatar image for deactivated-62825bb2ccdb4
deactivated-62825bb2ccdb4

666

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#425 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@Krelian-co said:

@ttboy said:

@Krelian-co said:

@ttboy said:

@tormentos said:

@ttboy said:

Ryse is still the best looking console game which does not fit into your analysis. Best looking is always subjective however only in this forum it is a debate without evidence to the contrary.

No it isn't..

Every system wars poll has said the opposite as well as winning at Siggraph. We don't need mindless prattlings with no evidence.

would you mind posting these "polls" where ryse has won? and since we are going by polls i am glad then that you acknowledge that ps4 and wii u are the best consoles of the gen while xbone is just a poor man pc.

If you were in system wars long enough this wouldn't be a question. This has been debated many times on here with polls.

oh i see, so basically they exist in your imagination? you want us to take the word of a delusional lem as proof, riiiiight.

Do you notice no one else questioned it not even torm. You're a troll...

Avatar image for Krelian-co
Krelian-co

13274

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#426  Edited By Krelian-co
Member since 2006 • 13274 Posts

@ttboy said:

@Krelian-co said:

@ttboy said:

@Krelian-co said:

@ttboy said:

@tormentos said:

@ttboy said:

Ryse is still the best looking console game which does not fit into your analysis. Best looking is always subjective however only in this forum it is a debate without evidence to the contrary.

No it isn't..

Every system wars poll has said the opposite as well as winning at Siggraph. We don't need mindless prattlings with no evidence.

would you mind posting these "polls" where ryse has won? and since we are going by polls i am glad then that you acknowledge that ps4 and wii u are the best consoles of the gen while xbone is just a poor man pc.

If you were in system wars long enough this wouldn't be a question. This has been debated many times on here with polls.

oh i see, so basically they exist in your imagination? you want us to take the word of a delusional lem as proof, riiiiight.

Do you notice no one else questioned it not even torm. You're a troll...

still waiting for these polls, so yeah keep talking your fanboy drivel.

Avatar image for deactivated-62825bb2ccdb4
deactivated-62825bb2ccdb4

666

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#427 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@Krelian-co said:

@ttboy said:

@Krelian-co said:

@ttboy said:

@Krelian-co said:

@ttboy said:

@tormentos said:

@ttboy said:

Ryse is still the best looking console game which does not fit into your analysis. Best looking is always subjective however only in this forum it is a debate without evidence to the contrary.

No it isn't..

Every system wars poll has said the opposite as well as winning at Siggraph. We don't need mindless prattlings with no evidence.

would you mind posting these "polls" where ryse has won? and since we are going by polls i am glad then that you acknowledge that ps4 and wii u are the best consoles of the gen while xbone is just a poor man pc.

If you were in system wars long enough this wouldn't be a question. This has been debated many times on here with polls.

oh i see, so basically they exist in your imagination? you want us to take the word of a delusional lem as proof, riiiiight.

Do you notice no one else questioned it not even torm. You're a troll...

still waiting for these polls, so yeah keep talking your fanboy drivel.

Nextgen console graphics King?

You really don't offer anything of value in here.

Avatar image for tymeservesfate
tymeservesfate

2230

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#428 tymeservesfate
Member since 2003 • 2230 Posts
@prawephet said:

@tymeservesfate: Actually, developers have been saying it won't make a difference on the Xbox while it will have a substantial effect on the PC.

You want to make fun of cows? They have a couple AA games and multiplats to keep them busy. Microsoft has been promising "DaT SeKrEt SauSsssE!!111!" for 2 years now and hasn't delivered.

The WiiU is the only console delivering and that thing didn't bring the games until a year in as well.

so what, i have links of devs saying it will help, so ur not really saying much. We, like the devs, have to wait and see what can be done once dx12 is released, because right now conflicting things are being said. the only thing we do know is DX12 WILL have an effect...besides that we just don't know much. with how close the consoles are already i'd be fine with a moderate boost to the XB1. A substantial one would be even better though.

Also, the console hasn't even been out for 2 years. So your second statement makes no sense at all and is just wrong. If you can't even keep track of how much time has passed since launch, stay out of my notifications please.

Avatar image for prawephet
Prawephet

385

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#429 Prawephet
Member since 2014 • 385 Posts

@tymeservesfate: Microsoft has been promising secret sauce since long before the thing launched.

And no. Right now all we know is that Microsoft is building dx12. We have no idea what effect it will have on the performance good or bad.

Avatar image for 04dcarraher
04dcarraher

23857

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#430  Edited By 04dcarraher
Member since 2004 • 23857 Posts

@prawephet said:

@tymeservesfate: Microsoft has been promising secret sauce since long before the thing launch.

And no. Right now all we know is that Microsoft is building dx12. We have no idea what effect it will have on the performance good or bad.

They won't be bad, I can tell you that much: it gives devs more tools and options, DX12 is more efficient than DX11 at its base, and the new API allows more control over how the CPU and GPU communicate and share resources.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#431 ronvalencia
Member since 2008 • 29612 Posts

@EG101 said:

@ronvalencia said:
@StormyJoe said:

@tormentos: Software HAS closed the gap already! The June SDK update helped move the XB1 from mostly 720p for multiplats to 900p.

As for your second comment. Seriously, stop replying as if you know anything about software development or optimization - it just makes you look stupid.

While there are runtime improvements, the shader-bound gap remains:

1600x900 is 69.4 percent of 1920x1080. The resolution difference is more apparent when 30 fps is the target.

If an uncapped 60 fps at 1920x1080 is the target, Xbox One lands at roughly 42 fps while PS4 reaches 60 fps, i.e. there's no resolution difference with this setup. The next Xbox One refresh should have FreeSync + DisplayPort 1.2a so non-30/non-60 fps gaming would be tear free.

1.31 TFLOPS is 71.2 percent of 1.84 TFLOPS. 1600x900 is the closest 16:9 resolution to Xbox One's 71.2 percent of PS4.

For 1600x900 vs 1920x1080, 1.27696 TFLOPS is 69.4 percent of 1.84 TFLOPS.

These programmable shader limits are what drive the resolution outcomes. DirectX 12 wouldn't solve this issue.

Alternative resolutions with close to 71.2 percent of PS4's shader budget:

1440x1080 i.e. 75 percent of 1920x1080.

1400x1050 i.e. 70.89 percent of 1920x1080.

Using these resolutions on a 16:9 display would need scaling and aspect-ratio correction.

At this time, DirectX 12 is publicly known to improve texture/material handling for Xbox One.

PS4 has a weakness i.e. the more the CPU uses memory, the disproportionately less bandwidth is left for the GPU. On Xbox One, ESRAM frame buffer bandwidth remains the same. Xbox One's ESRAM effective bandwidth already exceeds PS4's effective GDDR5 bandwidth, but using that advantage has its own issues.

A PC with 256bit 2GB GDDR5 (GPU) and 128bit 8GB DDR3 (system) still beats PS4's gaming results, i.e. its memory setup sits between the X1 and PS4 extremes. With X1's silicon budget, the alternative game console would have a 7870 XT/7950 class GPU. An HBM upgrade would make this setup cheaper, i.e. move the motherboard GDDR5 onto the GPU chip package. A 7870 XT/7950 class GPU translates to an R9-285 with improved frame buffer compression when using updated GCN.

There's a chance for Nintendo to deliver the PC like gaming if they configure their next console correctly.

Bottom line: This Gen is a refinement on last gen. An evolution instead of a revolution in technology. I think the success of the Wii and PS2 (weakest hardware of the gen) Proved to Sony and MS that they don't need to pack Mid level Graphics (compared to PC) when they can go Low Level.

MS and Sony should have delivered to us 2.5 - 3 TeraFlop consoles after waiting 8 years. I would have gladly waited another year if it meant we could have had 2.5 - 3 TeraFlops for $500 - $550. Sure the new-gen adoption would have been slower, but by the end of year 2 the price could have been dropped to near $400 anyway, and at $400 adoption of the new consoles should be on par with the way the consoles are selling now.

Xbox One's 32 MB ESRAM could have yielded another 14 CU (with 12 CU active) GCN.

With X1's silicon budget box, I would select the following config

  • 1.5GB 192 bit high speed GDDR5-6000 VRAM to cover several 1920x1080p frame buffer targets and in-flight texture storage. I'm swapping 32MB ESRAM out with non-tiling solution.
      • I view 192 bit as a compromise between cheap 128bit and 256bit PCB. This is similar to Geforce GTX 660 Ti approach.
      • The external 320bit bus trace lines on the PCB is slightly higher than X1's 256bit external bus trace lines.
      • Future hardware revisions would use HBM for cost reduction.
  • 24 active CU GCN at 853Mhz which yields 2.62 TFLOPS. This would cover complex shader performance for 1920x1080p. This is the important part.
      • 28 CU potential with 24 active CU. There's a straight R7-260 2X scaling.
  • 128bit DDR3-2133 main memory for non-in-flight texture storage and program code.
  • A high speed interconnect between GPU and CPU which should be faster than PCI-E version 3 16X (32 GB bandwidth).
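Side note for anyone who wants to sanity-check the FLOPS and resolution percentages in this thread: they all fall out of the standard GCN throughput formula (CUs × 64 lanes × 2 ops × clock) and simple pixel-count ratios. A quick sketch using the commonly quoted CU counts and clocks (assumed figures, not official spec sheets):

```python
# Sanity-check the TFLOPS and resolution ratios quoted above.
# GCN single-precision throughput: CUs * 64 lanes * 2 ops (FMA) * clock (GHz) = GFLOPS.
def gcn_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

xb1      = gcn_tflops(12, 0.853)   # ~1.31 TFLOPS
ps4      = gcn_tflops(18, 0.800)   # ~1.84 TFLOPS
proposed = gcn_tflops(24, 0.853)   # ~2.62 TFLOPS (the 24-CU config suggested above)

print(f"XB1 {xb1:.2f} TF, PS4 {ps4:.2f} TF, ratio {xb1 / ps4:.0%}")   # ~71%
print(f"24 CU @ 853MHz: {proposed:.2f} TF")

def pixel_ratio(w, h, base_w=1920, base_h=1080):
    return (w * h) / (base_w * base_h)

for res in [(1600, 900), (1440, 1080), (1400, 1050)]:
    print(res, f"{pixel_ratio(*res):.1%} of 1080p")   # 69.4%, 75.0%, 70.9%
```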

Avatar image for spitfire-six
Spitfire-Six

1378

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#432 Spitfire-Six
Member since 2014 • 1378 Posts

@tymeservesfate: Some of the devs that are commenting on DX12 are in the DX12 preview program.

Avatar image for tymeservesfate
tymeservesfate

2230

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#433  Edited By tymeservesfate
Member since 2003 • 2230 Posts
@prawephet said:

@tymeservesfate: Microsoft has been promising secret sauce since long before the thing launch.

And no. Right now all we know is that Microsoft is building dx12. We have no idea what effect it will have on the performance good or bad.

MS has not been promising any "secret sauce" for 2 YEARS like you said. Before the XB1 announcement event we weren't even sure when, or if, the console was coming. They said nothing...even during the event they didn't say much on specs lol. So you saying that they were hyping some kind of secret sauce for almost a YEAR before the console was even announced is complete bullshit...and a lie.

And no what? The rest of your post is basically repeating what I said...that we'll have to wait n see when it releases. You should have just said you agree lol. Because you basically said everything i said to you, back to me lol...well, except the part where you said "good or bad". There will be no bad...DX12 will only add positive things to the XB1. Only positive things. You can be sure about that. Since you seem to like pretending that you're sure about things that you don't know anything about, pretend you're sure about this. DX12 will only do good things for XB1. There's no doubt there.

Avatar image for prawephet
Prawephet

385

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#434 Prawephet
Member since 2014 • 385 Posts

@tymeservesfate: THE CloUdZz PowAa!!!1!!

Secret sauce.

Avatar image for Articuno76
Articuno76

19799

Forum Posts

0

Wiki Points

0

Followers

Reviews: 16

User Lists: 0

#435 Articuno76
Member since 2004 • 19799 Posts

Sorry guys but DX12 isn't going to have a big effect on anything, Xbox One or PC. Direct X never has. Remember when DX11 was supposed to usher in the new age and was hyped to high heaven? Probably not, because as soon as it came out no one, and I mean NO ONE, ever talked about its incredibly tangible (read: what turned out to be negligible in reality) benefits outside of flappy cloth in tech demos when you zoomed in 200%... yeah, big whoop, right?

So I'm gonna just stand here, arms crossed, until I see something worth getting excited about.

Avatar image for EG101
EG101

2091

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#436 EG101
Member since 2007 • 2091 Posts

@ronvalencia said:

@EG101 said:

@ronvalencia said:
@StormyJoe said:

@tormentos: Software HAS closed the gap already! The June SDK update helped move the XB1 from mostly 720p for multiplats to 900p.

As for your second comment. Seriously, stop replying as if you know anything about software development or optimization - it just makes you look stupid.

While there are runtime improvements, the shader-bound gap remains:

1600x900 is 69.4 percent of 1920x1080. The resolution difference is more apparent when 30 fps is the target.

If an uncapped 60 fps at 1920x1080 is the target, Xbox One lands at roughly 42 fps while PS4 reaches 60 fps, i.e. there's no resolution difference with this setup. The next Xbox One refresh should have FreeSync + DisplayPort 1.2a so non-30/non-60 fps gaming would be tear free.

1.31 TFLOPS is 71.2 percent of 1.84 TFLOPS. 1600x900 is the closest 16:9 resolution to Xbox One's 71.2 percent of PS4.

For 1600x900 vs 1920x1080, 1.27696 TFLOPS is 69.4 percent of 1.84 TFLOPS.

These programmable shader limits are what drive the resolution outcomes. DirectX 12 wouldn't solve this issue.

Alternative resolutions with close to 71.2 percent of PS4's shader budget:

1440x1080 i.e. 75 percent of 1920x1080.

1400x1050 i.e. 70.89 percent of 1920x1080.

Using these resolutions on a 16:9 display would need scaling and aspect-ratio correction.

At this time, DirectX 12 is publicly known to improve texture/material handling for Xbox One.

PS4 has a weakness i.e. the more the CPU uses memory, the disproportionately less bandwidth is left for the GPU. On Xbox One, ESRAM frame buffer bandwidth remains the same. Xbox One's ESRAM effective bandwidth already exceeds PS4's effective GDDR5 bandwidth, but using that advantage has its own issues.

A PC with 256bit 2GB GDDR5 (GPU) and 128bit 8GB DDR3 (system) still beats PS4's gaming results, i.e. its memory setup sits between the X1 and PS4 extremes. With X1's silicon budget, the alternative game console would have a 7870 XT/7950 class GPU. An HBM upgrade would make this setup cheaper, i.e. move the motherboard GDDR5 onto the GPU chip package. A 7870 XT/7950 class GPU translates to an R9-285 with improved frame buffer compression when using updated GCN.

There's a chance for Nintendo to deliver the PC like gaming if they configure their next console correctly.

Bottom line: This Gen is a refinement on last gen. An evolution instead of a revolution in technology. I think the success of the Wii and PS2 (weakest hardware of the gen) Proved to Sony and MS that they don't need to pack Mid level Graphics (compared to PC) when they can go Low Level.

MS and Sony should have delivered to us 2.5 - 3 TeraFlop consoles after waiting 8 years. I would have gladly waited another year if it meant we could have had 2.5 - 3 TeraFlops for $500 - $550. Sure the new-gen adoption would have been slower, but by the end of year 2 the price could have been dropped to near $400 anyway, and at $400 adoption of the new consoles should be on par with the way the consoles are selling now.

Xbox One's 32 MB ESRAM could have yielded another 14 CU (with 12 CU active) GCN.

With X1's silicon budget box, I would select the following config

  • 1.5GB 192 bit high speed GDDR5-6000 VRAM to cover several 1920x1080p frame buffer targets and in-flight texture storage. I'm swapping 32MB ESRAM out with non-tiling solution.
      • I view 192 bit as a compromise between cheap 128bit and 256bit PCB. This is similar to Geforce GTX 660 Ti approach.
      • The external 320bit bus trace lines on the PCB is slightly higher than X1's 256bit external bus trace lines.
      • Future hardware revisions would use HBM for cost reduction.
  • 24 active CU GCN at 853Mhz which yields 2.62 TFLOPS. This would cover complex shader performance for 1920x1080p. This is the important part.
      • 28 CU potential with 24 active CU. There's a straight R7-260 2X scaling.
  • 128bit DDR3-2133 main memory for non-in-flight texture storage and program code.
  • A high speed interconnect between GPU and CPU which should be faster than PCI-E version 3 16X (32 GB bandwidth).

Those specs with 12 Jaguar cores instead and no Kinect packed in for $400 would have crippled Sony, but good thing for the industry MS doesn't have a Killer Instinct. I mean MS got beat in hardware design by a company on the cusp of bankruptcy. Sony had NO choice but to sell hardware at a profit. Yet MS, who threw away over $8 billion on Skype and threw away $2 billion on Minecraft, wouldn't give us decent hardware for our money. The only good that came from this mess is that the brains behind it all are gone and the Xbox division is in the hands of someone with vision. MS really dropped the ball and imo are very lucky to have the mild success they are having right now.

Avatar image for deactivated-62825bb2ccdb4
deactivated-62825bb2ccdb4

666

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#437 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@ronvalencia said:

@EG101 said:

@ronvalencia said:
@StormyJoe said:

@tormentos: Software HAS closed the gap already! The June SDK update helped move the XB1 from mostly 720p for multiplats to 900p.

As for your second comment. Seriously, stop replying as if you know anything about software development or optimization - it just makes you look stupid.

While there are runtime improvements, the shader-bound gap remains:

1600x900 is 69.4 percent of 1920x1080. The resolution difference is more apparent when 30 fps is the target.

If an uncapped 60 fps at 1920x1080 is the target, Xbox One lands at roughly 42 fps while PS4 reaches 60 fps, i.e. there's no resolution difference with this setup. The next Xbox One refresh should have FreeSync + DisplayPort 1.2a so non-30/non-60 fps gaming would be tear free.

1.31 TFLOPS is 71.2 percent of 1.84 TFLOPS. 1600x900 is the closest 16:9 resolution to Xbox One's 71.2 percent of PS4.

For 1600x900 vs 1920x1080, 1.27696 TFLOPS is 69.4 percent of 1.84 TFLOPS.

These programmable shader limits are what drive the resolution outcomes. DirectX 12 wouldn't solve this issue.

Alternative resolutions with close to 71.2 percent of PS4's shader budget:

1440x1080 i.e. 75 percent of 1920x1080.

1400x1050 i.e. 70.89 percent of 1920x1080.

Using these resolutions on a 16:9 display would need scaling and aspect-ratio correction.

At this time, DirectX 12 is publicly known to improve texture/material handling for Xbox One.

PS4 has a weakness i.e. the more the CPU uses memory, the disproportionately less bandwidth is left for the GPU. On Xbox One, ESRAM frame buffer bandwidth remains the same. Xbox One's ESRAM effective bandwidth already exceeds PS4's effective GDDR5 bandwidth, but using that advantage has its own issues.

A PC with 256bit 2GB GDDR5 (GPU) and 128bit 8GB DDR3 (system) still beats PS4's gaming results, i.e. its memory setup sits between the X1 and PS4 extremes. With X1's silicon budget, the alternative game console would have a 7870 XT/7950 class GPU. An HBM upgrade would make this setup cheaper, i.e. move the motherboard GDDR5 onto the GPU chip package. A 7870 XT/7950 class GPU translates to an R9-285 with improved frame buffer compression when using updated GCN.

There's a chance for Nintendo to deliver the PC like gaming if they configure their next console correctly.

Bottom line: This Gen is a refinement on last gen. An evolution instead of a revolution in technology. I think the success of the Wii and PS2 (weakest hardware of the gen) Proved to Sony and MS that they don't need to pack Mid level Graphics (compared to PC) when they can go Low Level.

MS and Sony should have delivered to us 2.5 - 3 TeraFlop consoles after waiting 8 years. I would have gladly waited another year if it meant we could have had 2.5 - 3 TeraFlops for $500 - $550. Sure the new-gen adoption would have been slower, but by the end of year 2 the price could have been dropped to near $400 anyway, and at $400 adoption of the new consoles should be on par with the way the consoles are selling now.

Xbox One's 32 MB ESRAM could have yielded another 14 CU (with 12 CU active) GCN.

With X1's silicon budget box, I would select the following config

  • 1.5GB 192 bit high speed GDDR5-6000 VRAM to cover several 1920x1080p frame buffer targets and in-flight texture storage. I'm swapping 32MB ESRAM out with non-tiling solution.
      • I view 192 bit as a compromise between cheap 128bit and 256bit PCB. This is similar to Geforce GTX 660 Ti approach.
      • The external 320bit bus trace lines on the PCB is slightly higher than X1's 256bit external bus trace lines.
      • Future hardware revisions would use HBM for cost reduction.
  • 24 active CU GCN at 853Mhz which yields 2.62 TFLOPS. This would cover complex shader performance for 1920x1080p. This is the important part.
      • 28 CU potential with 24 active CU. There's a straight R7-260 2X scaling.
  • 128bit DDR3-2133 main memory for non-in-flight texture storage and program code.
  • A high speed interconnect between GPU and CPU which should be faster than PCI-E version 3 16X (32 GB bandwidth).

Are you implying that you're a better Engineer than both MS and AMD in designing the Xbox One? You can't just add up specs and put a system together.

Avatar image for Krelian-co
Krelian-co

13274

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#438 Krelian-co
Member since 2006 • 13274 Posts

@Articuno76 said:

Sorry guys but DX12 isn't going to have a big effect on anything, Xbox One or PC. Direct X never has. Remember when DX11 was supposed to usher in the new age and was hyped to high heaven? Probably not, because as soon as it came out no one, and I mean NO ONE, ever talked about its incredibly tangible (read: what turned out to be negligible in reality) benefits outside of flappy cloth in tech demos when you zoomed in 200%... yeah, big whoop, right?

So I'm gonna just stand here, arms crossed, until I see something worth getting excited about.

i have been through enough directx updates to know what you say is true but the multithreaded optimization and lower cpu overhead sound promising and can actually give tangible results ON PC, i don't really see how consoles would benefit from this but let them dream a bit longer.
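To make the "multithreaded optimization and lower cpu overhead" point concrete: the structural change in DX12/Mantle-style APIs is that the expensive per-draw recording can happen on worker threads into separate command lists, with only the cheap submission serialized on one queue. The sketch below is just a Python mock of that pattern, not the real D3D12 API (the real thing is C++ with ID3D12GraphicsCommandList and ExecuteCommandLists):

```python
# Mock of DX12-style command recording: worker threads each record their own
# command list, then the main thread submits them in order on a single queue.
from concurrent.futures import ThreadPoolExecutor

class CommandList:
    def __init__(self, name):
        self.name = name
        self.commands = []

    def draw(self, mesh):
        # In a real engine this is where most per-draw CPU cost goes (state setup,
        # constants, validation), so spreading it across threads is the win.
        self.commands.append(f"draw {mesh}")

def record_chunk(chunk_id, meshes):
    cl = CommandList(f"worker-{chunk_id}")
    for m in meshes:
        cl.draw(m)
    return cl

scene = [f"mesh_{i}" for i in range(16)]
chunks = [scene[i::4] for i in range(4)]          # split the scene across 4 workers

with ThreadPoolExecutor(max_workers=4) as pool:
    command_lists = list(pool.map(record_chunk, range(4), chunks))

# Submission stays on one thread/queue, mirroring ExecuteCommandLists.
for cl in command_lists:
    print(f"submit {cl.name}: {len(cl.commands)} draws")
```

Whether a console that already talks to the GPU through a thin API gains much from this is exactly the open question in this thread.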

Avatar image for EG101
EG101

2091

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#439 EG101
Member since 2007 • 2091 Posts

To TTboy

You have to admit that the division leaders forcing their TV and Kinect ideas hampered the Xbox project. I'm sure the accountants insisting the hardware come in at $500 including Kinect also played some part. If MS had simply built a successor to the 360 with a similar philosophy of best graphics per dollar spent, we probably wouldn't even have this thread, as the Xbox was always the most powerful console hardware each gen. They could have made a $500 console without Kinect and a $600 SKU that included Kinect. A 2.5 - 3TF console with the right memory setup and a better CPU.

Avatar image for tymeservesfate
tymeservesfate

2230

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#440  Edited By tymeservesfate
Member since 2003 • 2230 Posts
@prawephet said:

@tymeservesfate: THE CloUdZz PowAa!!!1!!

Secret sauce.

damn right

Avatar image for deactivated-62825bb2ccdb4
deactivated-62825bb2ccdb4

666

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#441  Edited By deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@EG101 said:

To TTboy

You have to admit that the Division leaders forcing their TV and Kinect idea's hampered the Xbox Project. I'm sure the accountants insisting the hardware come in at $500 including Kinect also played some part. If M.S. would have simply built a Successor to the 360 with a similar philosophy of best Graphics per dollar spent we probably wouldn't even have this thread as the Xbox was always the most powerful Console Hardware each Gen. They could have made a $500 console without Kinect and a $600 skew that included Kinect. A 2.5 - 3TF Console with the right memory set up and a better CPU.

It's fair to criticize the strategy for sure, however let's wait and see what DX12 brings. If it's nothing then they deserve all of the criticism in the world. What's not cool is to play armchair engineer and just think you can slap parts together and build a system. If it were so easy then many of you could work for MS/Sony/IBM/AMD as an architect and make north of $200,000 a year.

It's the same thing I see in the Beyond3D forums. They act like they know the architecture, then are surprised when Phil said full DX12 is coming to it. They also make fun of, yet quote, Brad Wardell on DX12 information. It's poor form, to be honest.

Avatar image for StormyJoe
StormyJoe

7806

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#442 StormyJoe
Member since 2011 • 7806 Posts

@GrenadeLauncher: Check the AVI sites, stupid. It is not some great difference. Mountains out of molehills.

This is the reason you cows are considered the absolute worst fanboys.

Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#443 tormentos
Member since 2003 • 33793 Posts

@ttboy said:

Are you implying that you're a better Engineer than both MS and AMD in designing the Xbox One? You can't just add up specs and put a system together.

My PC is a better design than the xbox one, and has less ram..hahahahaa

Sure you can, i did it with my PC...lol

Beating sony's spec on the PS4 isn't particularly hard, i can understand why Sony went with a Jaguar and a 7850-like GPU, its financials weren't that good.

But MS going with the same Jaguar + DDR3 shows graphics wasn't the main goal of the xbox one. MS has endless billions, it could have had much better hardware, but they have tried twice with powerful hardware and twice have lost, so they tried to imitate the Nintendo wiimote's success by building in Kinect, since that eats from the console BOM. They also cheaped out with DDR3, and since the GPU power in the console wasn't even that of a 7850, doing what they did with the xbox 360 seemed like a good move for them: ESRAM to help with the bandwidth limitations.

The xbox one design is what it is because graphics wasn't MS's target; this is the same company that sported a $550 GPU more advanced than PC ones in 2006, as well as a 3-core CPU with 6 threads.

@ttboy said:

Its fair to criticize the strategy for sure however lets wait and see what DX12 brings. If its nothing then they deserve all of the criticism in the world. What's not cool is to play arm chair engineer and just think you can slap parts together and build a system. If it were so easy then many of you can work for MS/Sony/IBM/AMD as an Architect and make north of 200 000 a year.

Its the same thing I see in the Beyond 3D forums. They act like they know the architecture then are surprised when Phil said full DX12 is coming to it. They also make fun of, yet quote, Brad Wardell on DX12 information. Its poor form to be honest.

I am sure Dx12 will bring something to the xbox one: broken dreams and shattered hearts, alongside waves and waves of damage control when it fails to deliver on xbox one.

Avatar image for deactivated-62825bb2ccdb4
deactivated-62825bb2ccdb4

666

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#444  Edited By deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@tormentos: If DX12 does nothing then it's fair game to criticize the engineers...

Avatar image for deactivated-5a30e101a977c
deactivated-5a30e101a977c

5970

Forum Posts

0

Wiki Points

0

Followers

Reviews: 5

User Lists: 0

#445  Edited By deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

@ttboy said:

@tormentos: If DX12 does nothing then it's fair game to criticize the engineers...

He thinks his PC is a better build than the Xbox One... Even if DX12 makes world peace happen and fixes global warming he'll criticize the engineers...

Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#447 tormentos
Member since 2003 • 33793 Posts

@ttboy said:

@tormentos: If DX12 does nothing then it's fair game to criticize the engineers...

I am sure it will not...

@FastRobby said:

@ttboy said:

@tormentos: If DX12 does nothing then it's fair game to criticize the engineers...

He thinks his PC is a better build than the Xbox One... Even if DX12 makes world peace happen and fixes global warming he'll criticize the engineers...

It is a 6-core FX-6350 at 3.8 GHz, 4 GB of memory, and a 2GB R270; my PC walks the xbox one home, and the PS4 too for that matter.

Where DX12 will have a bigger impact is on PC, so basically it would make my R270 work even better and pull further away from the xbox one..

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#448  Edited By ronvalencia
Member since 2008 • 29612 Posts
@ttboy said:

@ronvalencia said:

@EG101 said:

@ronvalencia said:
@StormyJoe said:

@tormentos: Software HAS closed the gap already! The June SDK update helped move the XB1 from mostly 720p for multiplats to 900p.

As for your second comment. Seriously, stop replying as if you know anything about software development or optimization - it just makes you look stupid.

While there are runtime improvements, the shader-bound gap remains:

1600x900 is 69.4 percent of 1920x1080. The resolution difference is more apparent when 30 fps is the target.

If an uncapped 60 fps at 1920x1080 is the target, Xbox One lands at roughly 42 fps while PS4 reaches 60 fps, i.e. there's no resolution difference with this setup. The next Xbox One refresh should have FreeSync + DisplayPort 1.2a so non-30/non-60 fps gaming would be tear free.

1.31 TFLOPS is 71.2 percent of 1.84 TFLOPS. 1600x900 is the closest 16:9 resolution to Xbox One's 71.2 percent of PS4.

For 1600x900 vs 1920x1080, 1.27696 TFLOPS is 69.4 percent of 1.84 TFLOPS.

These programmable shader limits are what drive the resolution outcomes. DirectX 12 wouldn't solve this issue.

Alternative resolutions with close to 71.2 percent of PS4's shader budget:

1440x1080 i.e. 75 percent of 1920x1080.

1400x1050 i.e. 70.89 percent of 1920x1080.

Using these resolutions on a 16:9 display would need scaling and aspect-ratio correction.

At this time, DirectX 12 is publicly known to improve texture/material handling for Xbox One.

PS4 has a weakness i.e. the more the CPU uses memory, the disproportionately less bandwidth is left for the GPU. On Xbox One, ESRAM frame buffer bandwidth remains the same. Xbox One's ESRAM effective bandwidth already exceeds PS4's effective GDDR5 bandwidth, but using that advantage has its own issues.

A PC with 256bit 2GB GDDR5 (GPU) and 128bit 8GB DDR3 (system) still beats PS4's gaming results, i.e. its memory setup sits between the X1 and PS4 extremes. With X1's silicon budget, the alternative game console would have a 7870 XT/7950 class GPU. An HBM upgrade would make this setup cheaper, i.e. move the motherboard GDDR5 onto the GPU chip package. A 7870 XT/7950 class GPU translates to an R9-285 with improved frame buffer compression when using updated GCN.

There's a chance for Nintendo to deliver the PC like gaming if they configure their next console correctly.

Bottom line: This Gen is a refinement on last gen. An evolution instead of a revolution in technology. I think the success of the Wii and PS2 (weakest hardware of the gen) Proved to Sony and MS that they don't need to pack Mid level Graphics (compared to PC) when they can go Low Level.

MS and Sony should have delivered to us 2.5 - 3 TeraFlop consoles after waiting 8 years. I would have gladly waited another year if it meant we could have had 2.5 - 3 TeraFlops for $500 - $550. Sure the new-gen adoption would have been slower, but by the end of year 2 the price could have been dropped to near $400 anyway, and at $400 adoption of the new consoles should be on par with the way the consoles are selling now.

Xbox One's 32 MB ESRAM could have yielded another 14 CU (with 12 CU active) GCN.

With X1's silicon budget box, I would select the following config

  • 1.5GB 192 bit high speed GDDR5-6000 VRAM to cover several 1920x1080p frame buffer targets and in-flight texture storage. I'm swapping 32MB ESRAM out with non-tiling solution.
      • I view 192 bit as a compromise between cheap 128bit and 256bit PCB. This is similar to Geforce GTX 660 Ti approach.
      • The external 320bit bus trace lines on the PCB is slightly higher than X1's 256bit external bus trace lines.
      • Future hardware revisions would use HBM for cost reduction.
  • 24 active CU GCN at 853Mhz which yields 2.62 TFLOPS. This would cover complex shader performance for 1920x1080p. This is the important part.
      • 28 CU potential with 24 active CU. There's a straight R7-260 2X scaling.
  • 128bit DDR3-2133 main memory for non-in-flight texture storage and program code.
  • A high speed interconnect between GPU and CPU which should be faster than PCI-E version 3 16X (32 GB bandwidth).

Are you implying that you're a better Engineer than both MS and AMD in designing the Xbox One? You can't just add up specs and put a system together.

The specification I mentioned is similar to a desktop PC with an AMD Radeon HD 7870 XT (24 CU), with a slightly reduced video memory bus width and VRAM reduced to 1.5GB.

A PC with AMD Radeon HD 7870 XT would murder Xbox One and PS4.

I would recycle Xbox One's CPU to GPU link since it's faster than PCI-E version 3.0 16X.

Xbox One's SoC silicon budget for 32MB ESRAM is enough for another 14 CU GCN, hence the total CU count would be 28 with 24 active CUs (4 CU for yield issues).

AMD's own embedded solutions such as the Radeon E8860 (37 watts) include on-package GDDR5 memory modules, NOT 32MB ESRAM.

MS spent their dollars on 32 MB ESRAM with reduced CU potential, i.e. MS wanted an upgraded Xbox 360 type box.

With a smaller SoC silicon budget, Sony's selection is better and the results speak for themselves, i.e. better sustained 1920x1080p performance than Xbox One. HBM would enable Sony to cut costs in the future.

AMD provides a menu of IP blocks and it's up to the customer to select the right IP blocks.

DirectX 12 wouldn't solve shader limitations i.e. 12 CU will remain 12 CU GCN.
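The resolution-matching exercise behind numbers like 1600x900 is mechanical: scale 1080p's pixel count by the shader-budget ratio and look for a nearby 16:9 resolution. A small illustrative sketch, using the rounded TFLOPS figures quoted in this thread:

```python
# Given a shader-budget ratio, find 16:9 render resolutions whose pixel count
# stays close to that fraction of 1920x1080 (the same trick used to arrive at
# 1600x900 for Xbox One).
BASE_W, BASE_H = 1920, 1080

def budget_resolutions(ratio, tolerance=0.03):
    base_pixels = BASE_W * BASE_H
    matches = []
    for h in range(720, BASE_H + 1, 4):          # step through 16:9 heights
        w = h * 16 // 9
        frac = (w * h) / base_pixels
        if abs(frac - ratio) <= tolerance:
            matches.append((w, h, frac))
    return matches

xb1_ratio = 1.31 / 1.84                          # ~0.71 shader budget vs PS4
for w, h, frac in budget_resolutions(xb1_ratio):
    print(f"{w}x{h}: {frac:.1%} of 1080p")       # 1600x900 (69.4%) falls in this band
```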

AMD Radeon HD 7870 XT with AMD A10 5800k, resolution 2560x1440p 50 fps i.e. better than 1280x720p (X1) and 1600x900p (PS4).


Xbox One's SoC. Note the 32MB SRAM which could have supported another 14 CU GCN.


Avatar image for deactivated-5a30e101a977c
deactivated-5a30e101a977c

5970

Forum Posts

0

Wiki Points

0

Followers

Reviews: 5

User Lists: 0

#449  Edited By deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

@tormentos

Avatar image for GrenadeLauncher
GrenadeLauncher

6843

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#450 GrenadeLauncher
Member since 2004 • 6843 Posts

@StormyJoe said:

@GrenadeLauncher: Check the AVI sites, stupid. It is not some great difference. Mountains out of molehills.

This is the reason you cows are considered the absolute worst fanboys.

This'll be those prescription lenses that lems are given when they get an Xbone so they can feel better about their duff purchase.

Funny, cows aren't the ones gloating about the sell-off of an F2P MMO developer, after spending years shitting on the same genre and business model.

You want absolute worst, go look in the mirror.