Xbox One = 7790, confirmed by Xbox One architect.

This topic is locked from further discussion.


#1 tormentos
Member since 2003 • 33793 Posts

 

 

Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing, but we could go and do the experiment - if we were actually at 14 CUs what kind of performance benefit would we get versus 12? And if we raised the GPU clock what sort of performance advantage would we get? And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did."
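For reference, a back-of-envelope sketch of the raw numbers behind that trade-off. The 800MHz base clock, 853MHz shipping clock, and GCN's 64 shaders per CU at 2 FLOPs per cycle are commonly reported figures, not something stated in the quote itself:

```python
# Back-of-envelope ALU comparison behind the "14 CUs vs. clock bump" experiment.
# Assumed figures (commonly reported, not stated in the quote): 800MHz base
# clock, 853MHz after the 6.6 per cent bump, 64 shaders per GCN CU, 2 FLOPs
# per shader per cycle (fused multiply-add).

def peak_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

print(f"12 CUs @ 853MHz: {peak_tflops(12, 0.853):.2f} TFLOPS")  # ~1.31 (shipping config)
print(f"14 CUs @ 800MHz: {peak_tflops(14, 0.800):.2f} TFLOPS")  # ~1.43 (hypothetical)
# On paper the 14-CU option has ~9% more raw ALU throughput, so Microsoft's
# claim is really that its launch titles were not purely ALU-bound.
```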

 

http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects

 

I just destroyed Ronvalencia's argument; like I've claimed many times, the Xbox One is a 7790 with 2 CUs disabled for yields... :lol:

 

Total bullsh**, so MS is telling people that 12 CUs are better than 14 on GCN... :lol:

 

The same discussion with ESRAM as well - the 204GB/s number that was presented at Hot Chips is taking known limitations of the logic around the ESRAM into account. You can't sustain writes for absolutely every single cycle. The writes is known to insert a bubble [a dead cycle] occasionally... one out of every eight cycles is a bubble so that's how you get the combined 204GB/s as the raw peak that we can really achieve over the ESRAM. And then if you say what can you achieve out of an application - we've measured about 140-150GB/s for ESRAM.
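For what it's worth, the 204GB/s figure in that quote falls out of simple arithmetic. This is a sketch assuming the widely reported 853MHz ESRAM clock and 128-byte read and write paths per cycle, which are not stated in this thread:

```python
# Where the 204GB/s ESRAM peak comes from, per the quote above.
# Assumed figures (from the same Digital Foundry article, not this post):
# ESRAM runs at the 853MHz GPU clock and can read 128 bytes and write
# 128 bytes per cycle, but writes lose one cycle in eight (the "bubble").

ESRAM_CLOCK_HZ = 853e6
BYTES_PER_CYCLE_EACH_WAY = 128

read_gbps = ESRAM_CLOCK_HZ * BYTES_PER_CYCLE_EACH_WAY / 1e9   # ~109 GB/s
write_gbps = read_gbps * 7 / 8                                # ~95 GB/s after bubbles

print(f"raw combined peak: {read_gbps + write_gbps:.0f} GB/s")  # ~204 GB/s
print("measured in real workloads (quoted): 140-150 GB/s")
```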

 

 

Yeah, 140GB/s to 150GB/s, not 204GB/s; it can't be sustained, so in practice it's 140 to 150GB/s.

 

Xbox One

68GB/s + 140GB/s = 208GB/s

68GB/s + 150GB/s = 218GB/s

PS4

176GB/s + 40GB/s from the CPU and Onion buses = 216GB/s
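The sums above, spelled out. This only reproduces the post's arithmetic; whether peak figures from separate buses can simply be added like this is exactly what gets argued later in the thread:

```python
# The post's bandwidth arithmetic, nothing more. Whether peaks from separate
# buses can be added like this is disputed further down the thread.

xb1_ddr3 = 68                      # GB/s, DDR3 peak
xb1_esram_practical = (140, 150)   # GB/s, measured ESRAM range quoted above
ps4_gddr5 = 176                    # GB/s, GDDR5 peak
ps4_cpu_onion = 40                 # GB/s, the post's figure for the CPU/Onion paths

print([xb1_ddr3 + e for e in xb1_esram_practical])  # [208, 218]
print(ps4_gddr5 + ps4_cpu_onion)                    # 216
```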

 

Also it's very funny how Digital Foundry is doing MS PR... :lol:

And it's always Leadbetter...

 

Goossen also believes that leaked Sony documents on VGLeaks bear out Microsoft's argument:

"Sony was actually agreeing with us. They said that their system was balanced for 14 CUs. They used that term: balance. Balance is so important in terms of your actual efficient design.

 

MS architect riding on something that was proven to be false; there never was a 14+4 split... :lol:

 

 

We think we have very good balance, very good performance, we have a product which can handle things other than just raw ALU [GPU compute power].

 

This is pure bullsh** at its best, MS trying to act as if the GCN inside the Xbox One was from Nvidia... :lol:

 

"On the SoC, there are many parallel engines - some of those are more like CPU cores or DSP cores. How we count to fifteen: [we have] eight inside the audio block, four move engines, one video encode, one video decode and one video compositor/resizer," says Nick Baker.

 

MS's 15 different processors..

8 are audio related.

4 are move engines.

1 video encode.

1 video decode.

And 1 video compositor/resizer..

 

:lol::lol::lol:

This keeps getting better and better, so the 15 processors, like it was predicted, are sh** that is also inside the PS4, all except the move engines, which we know are there to move data, not to process sh**..

MS is counting even the batteries in the Xbox One controller to claim better specs... :lol:

 

 


#2 Suppaman100
Member since 2013 • 5250 Posts
PS4 much stronger than Xbone confirmed. TLHBO

#3 -Snooze-
Member since 2009 • 7304 Posts

The 7790 is a poor, poor card. I don't understand why Sony and MS went with such poor GPUs. I understand the CPU being weak as shit, but why the GPU?


#4 GuNsbl4ziN
Member since 2010 • 285 Posts
Who cares about consoles when you can have a PC?

#5 marklarmer
Member since 2004 • 3883 Posts

lol will you be able to sleep now?


#6 deactivated-5d7fb49ded561
Member since 2010 • 4019 Posts

Well done Tormentos, the tech king of SW


#7 Suppaman100
Member since 2013 • 5250 Posts
Who cares about consoles when you can have a PC?GuNsbl4ziN
Because there's nothing good to play on PC now.

#8 ramonnl
Member since 2010 • 769 Posts

The 7790 is a poor poor card. I don't understand why Sony and MS went in with such poor gpu's. I understand the CPU being weak as shit, but why the GPU?

-Snooze-

Because otherwise the PS4 would cost 600-700 euros again, and that didn't work out so well with the PS3.


#9 Stinger78
Member since 2003 • 5846 Posts
Ok. It's nice to know. At least if I ever get an Xbox One, I'll be sure to keep that in mind.... oh wait, just like any other generation, it doesn't matter as long as the games are entertaining.

#10 Stinger78
Member since 2003 • 5846 Posts
[QUOTE="GuNsbl4ziN"]Who cares about consoles when you can have a PC?Suppaman100
Because there's nothing good to play on PC now.

Ahh.. Guess that Humble Origin Bundle, Humble Indie 9 Bundle, and "X" collection I got were all fake, as well as all the other games I already own on Steam and Origin.

#11 GuNsbl4ziN
Member since 2010 • 285 Posts
[QUOTE="GuNsbl4ziN"]Who cares about consoles when you can have a PC?Suppaman100
Because there's nothing good to play on PC now.

Yeah, I agree. There is almost too much good to play.

#12 campzor
Member since 2004 • 34932 Posts
Lems getting the subpar console yet again.

#13 tormentos
Member since 2003 • 33793 Posts

Ok. It's nice to know. At least if I ever get an Xbox One, I'll be sure to keep that in mind....oh, wait, just like any other generation, it doesn't matter if the games are entertaining. Stinger78

 

 

Well, that is right; the highest rated games this past gen are from the Wii. So yeah, the point of my thread is to prove how silly some posters here are, who refuse to see what MS did and what it is doing now. The Xbox One will have some great games, I am sure, but my point wasn't about that.


#14 EG101
Member since 2007 • 2091 Posts

[QUOTE="tormentos"] (full opening post quoted) [/QUOTE]

This is definitely not how it works :lol:

The GDDR5 inside the PS4 will never exceed its PEAK, which is 176GB/s. There is only one RAM pool inside the PS4, and that has a PEAK of 176GB/s, but peak is rarely maintained. The reason you can add BW inside the XB1 is because there are two pools of RAM with separate buses that can BOTH run in parallel, something the PS4 does NOT have.


#15 wiiutroll
Member since 2013 • 543 Posts

[QUOTE="EG101"] (full opening post quoted via EG101's reply above)

This is definitely not how it works :lol:

The GDDR5 inside the PS4 will never exceed its PEAK, which is 176GB/s. There is only one RAM pool inside the PS4, and that has a PEAK of 176GB/s, but peak is rarely maintained. The reason you can add BW inside the XB1 is because there are two pools of RAM with separate buses that can BOTH run in parallel, something the PS4 does NOT have.

that's what i was thinking


#16 SKaREO
Member since 2006 • 3161 Posts
[QUOTE="GuNsbl4ziN"]Who cares about consoles when you can have a PC?Suppaman100
Because there's nothing good to play on PC now.

Exactly this. PC gaming is dead now, get over it.

#17 RoslindaleOne
Member since 2006 • 7566 Posts
Looked like Ron owned you big time. Let it go.

#18 Elitro
Member since 2009 • 578 Posts

[QUOTE="GuNsbl4ziN"]Who cares about consoles when you can have a PC?Suppaman100
Because there's nothing good to play on PC now.

 

You're right... Dragon Age, Dark Souls 2 and Witcher 3 are coming next year.


#19 silversix_
Member since 2010 • 26347 Posts
Stuck for 10 years with a 7790 :lol: Good luck to you Lems rofl

#20 tormentos
Member since 2003 • 33793 Posts

 

This is definitely not how it works :lol:

The GDDR5 inside the PS4 will never exceed its PEAK, which is 176GB/s. There is only one RAM pool inside the PS4, and that has a PEAK of 176GB/s, but peak is rarely maintained. The reason you can add BW inside the XB1 is because there are two pools of RAM with separate buses that can BOTH run in parallel, something the PS4 does NOT have.

EG101

 

Per the Oddworld developer, the PS4 was already getting 172GB/s out of the 176GB/s... :lol:

No, the 176GB/s is a direct link between the GPU and the memory pool; there is another link from the CPU to memory that is 20GB/s, and another one that is 20GB/s from the CPU to the GPU... :lol:

Get your facts right.


#21 metal_zombie
Member since 2004 • 2288 Posts

[QUOTE="-Snooze-"]

The 7790 is a poor poor card. I don't understand why Sony and MS went in with such poor gpu's. I understand the CPU being weak as shit, but why the GPU?

ramonnl

Because otherwise the ps4 would cost 600-700 euro's again, that didn't work out so well with the ps3.

I wouldn't have minded paying more for better consoles, but I suppose it makes sense to launch less risky hardware in today's economy.

#22 clone01
Member since 2003 • 29843 Posts

lol will you be able to sleep now?

marklarmer
Nope.

#23 tormentos
Member since 2003 • 33793 Posts

 that's what i was thinking

wiiutroll

 

And you are wrong as well.

 

[image: 13z4vms.png]


#24 metal_zombie
Member since 2004 • 2288 Posts
Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflsilversix_
It's not that bad.... On PC, yeah, having a shitty card is terrible, but on consoles it will be fine.

#25 silversix_
Member since 2010 • 26347 Posts

[QUOTE="silversix_"]Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflmetal_zombie
It's not that bad.... on pc yeah having a shitty card is terrible but on consoles it will be fine.

It's here for TEN YEARS lol, this isn't just bad, it's atrocious


#26 tormentos
Member since 2003 • 33793 Posts

Looked like Ron owned you big time. Let it go.RoslindaleOne

 

Yeah, he sure did, especially the part where he desperately tried to imply that the Xbox One GPU wasn't Bonaire but Pitcairn, or like one of his last arguments, a cross between a 7790 and a 7850... :lol:


#27 Zophar87
Member since 2008 • 4344 Posts

I called this months ago. lol


#28 TheKingIAm
Member since 2013 • 1531 Posts
I thought we knew the x1 was weak as hell

#29 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="tormentos"] (full opening post quoted) [/QUOTE]

Your memory bandwidth assumptions don't add up.

http://www.edge-online.com/news/gaijin-games-on-why-war-thunder-isnt-coming-to-xbox-one/

How much more powerful?

AY: It depends what you're doing. GPU, like 40 per cent more powerful. DDR5 is basically 50 per cent more powerful than DDR3, but the memory write [performance] is bigger on Xbox One, so it depends on what you're doing.

How is that going to translate to on-screen results for the kinds of games you want to make? So to optimise War Thunder on both consoles you could hypothetically make a better, prettier version on PS4?

AY: Yep.

KY: Probably yes. But again, that's not a very big deal.
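A quick sanity check on that "like 40 per cent" figure, using the publicly reported CU counts and clocks. The 18 CUs @ 800MHz and 12 CUs @ 853MHz numbers are assumptions taken from the spec sheets, not from the interview:

```python
# Sanity check on the "GPU, like 40 per cent more powerful" remark.
# Assumed figures (public spec numbers, not from the interview): PS4 = 18 GCN
# CUs @ 800MHz, Xbox One = 12 GCN CUs @ 853MHz, 64 shaders/CU, 2 FLOPs/cycle.

def peak_gflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1000.0

ps4 = peak_gflops(18, 800)   # ~1843 GFLOPS
xb1 = peak_gflops(12, 853)   # ~1310 GFLOPS
print(f"PS4 / XB1 ALU ratio: {ps4 / xb1:.2f}")  # ~1.41, i.e. roughly 40 per cent more
```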

---------------

7790 doesn't have the following memory controllers.

"You can think of the ESRAM and the DDR3 as making up eight total memory controllers, so there are four external memory controllers (which are 64-bit) which go to the DDR3 and then there are four internal memory controllers that are 256-bit that go to the ESRAM. These are all connected via a crossbar and so in fact it will be true that you can go directly, simultaneously to DRAM and ESRAM," he explains

---------------------------

7790's memory controllers and its L2 cache bottleneck..

[image: radeonhd7790-slide-2-new.PNG]


#30 metal_zombie
Member since 2004 • 2288 Posts

[QUOTE="metal_zombie"][QUOTE="silversix_"]Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflsilversix_

It's not that bad.... on pc yeah having a shitty card is terrible but on consoles it will be fine.

Its here for TEN YEARS lol this isn't just bad, its atrocious

It will run games at a decent frame rate; console players won't care that it can't put out good visuals compared to the PC.

#31 ZoomZoom2490
Member since 2008 • 3943 Posts

The worst thing about X1 is the fact that memory is controlled by the same old Xbox 360 128-bit-bus memory.

Don't let MS fool you into thinking that the console has 256-bit memory width because of that crap ESRAM that's also used in the Wii U.


#32 moistsandwich
Member since 2009 • 25 Posts

[QUOTE="metal_zombie"][QUOTE="silversix_"]Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflsilversix_

It's not that bad.... on pc yeah having a shitty card is terrible but on consoles it will be fine.

Its here for TEN YEARS lol this isn't just bad, its atrocious

6-8 years.... there has never been a 10 year gap between console launches within any of the big 3. But I get your point.


#33 savagetwinkie
Member since 2008 • 7981 Posts

[QUOTE="wiiutroll"]

that's what i was thinking

tormentos

And you are wrong as well.

13z4vms.png

This doesn't actually refute anything they just said... 176GB/s peak for the GDDR5, and it has one memory type.


#34 WilliamRLBaker
Member since 2006 • 28915 Posts

Awww, el tormo trying to understand hardware and trying to explain it, but getting it wrong.

El tormo's limit of knowledge = derp derp... one number is bigger than another number.


#35 metal_zombie
Member since 2004 • 2288 Posts

[QUOTE="RoslindaleOne"]Looked like Ron owned you big time. Let it go.tormentos

 

Yeah, he sure did, especially the part where he desperately tried to imply that the Xbox One GPU wasn't Bonaire but Pitcairn, or like one of his last arguments, a cross between a 7790 and a 7850... :lol:

[image: RbWsWvU.png]


#36 clone01
Member since 2003 • 29843 Posts

[QUOTE="silversix_"]

[QUOTE="metal_zombie"] It's not that bad.... on pc yeah having a shitty card is terrible but on consoles it will be fine.moistsandwich

Its here for TEN YEARS lol this isn't just bad, its atrocious

6-8 years.... there has never been a 10 year gap between console launches within any of the big 3. But I get your point.

I find it truly silly that fanboys debate about console power. It's a console. You want performance, get a PC. And this is coming from essentially a console-only gamer.

#37 wiiutroll
Member since 2013 • 543 Posts

[QUOTE="tormentos"]

[QUOTE="wiiutroll"]

that's what i was thinking

savagetwinkie

 

And you are wrong as well.

 

13z4vms.png

this doesn't actually refute anything they just said... 176GB/s peak of gddr5 and it has 1 memory type.

lol


#38 savagetwinkie
Member since 2008 • 7981 Posts
[QUOTE="moistsandwich"]

[QUOTE="silversix_"]Its here for TEN YEARS lol this isn't just bad, its atrocious

clone01

6-8 years.... there has never been a 10 year gap between console launches within any of the big 3. But I get your point.

I find it truly silly that fanboys debate about console power. Its a console. You want performance, get a PC. And this is coming from essentially a console-only gamer.

A lot of people don't like PC though: too many options, more things that can go wrong, and a console's online is a unified experience.

#39 LA-Nighthawk
Member since 2013 • 335 Posts
Graphics look fine to me. Looking at the launch games and knowing how much better games will look a few years down the line, I'm totally okay with that. Enjoy obsessing over meaningless numbers for the next few years though.

#40 EG101
Member since 2007 • 2091 Posts

[QUOTE="EG101"]

This is definitely not how it works :lol:

The GDDR5 inside the PS4 will never exceed its PEAK, which is 176GB/s. There is only one RAM pool inside the PS4, and that has a PEAK of 176GB/s, but peak is rarely maintained. The reason you can add BW inside the XB1 is because there are two pools of RAM with separate buses that can BOTH run in parallel, something the PS4 does NOT have.

tormentos

Per the Oddworld developer, the PS4 was already getting 172GB/s out of the 176GB/s... :lol:

No, the 176GB/s is a direct link between the GPU and the memory pool; there is another link from the CPU to memory that is 20GB/s, and another one that is 20GB/s from the CPU to the GPU... :lol:

Get your facts right.

You get your facts straight.

That GDDR5 will NEVER have 200+ GB/s of bandwidth, NEVER. You tried to make it seem like the PS4 RAM had that much BW. It doesn't matter how many highways go to that RAM; there is a maximum bandwidth of 176GB/s coming from that RAM. You can NOT add bandwidth on the PS4 the way you CAN on the XB1.
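A toy illustration of the disagreement here, nothing more. This is not a model of how either console's memory system actually behaves; the numbers are just the peaks quoted earlier in the thread:

```python
# Toy illustration only: NOT a model of real memory-system behaviour. It just
# shows why one side says "you can't add the numbers" and the other says "you can".

def single_pool(total_demand_gbps, pool_peak_gbps):
    # One shared bus: however the traffic is split, it can't exceed the bus peak.
    return min(total_demand_gbps, pool_peak_gbps)

def two_pools(demand_a, peak_a, demand_b, peak_b):
    # Two independent buses running in parallel: each is capped separately,
    # so their peaks can be used at the same time.
    return min(demand_a, peak_a) + min(demand_b, peak_b)

print(single_pool(220, 176))        # 176 - PS4-style single GDDR5 pool
print(two_pools(150, 150, 68, 68))  # 218 - XB1-style ESRAM + DDR3 in parallel
```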


#41 Evo_nine
Member since 2012 • 2224 Posts

Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflsilversix_
errr....cows are in the same boat re: stuck with a crappy GPU

I'm betting this generation won't last 10 years, more like 5. Well.... I HOPE.


#42 Lumpy311
Member since 2013 • 2009 Posts

[QUOTE="silversix_"]Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflEvo_nine

errr....cows are in the same boat re: stuck with a crappy GPU

 

. Im betting this generation wont last 10 years, more like 5. Well....I HOPE.

I hope it does last 10 years so PC gaming can grow even more.


#43 clone01
Member since 2003 • 29843 Posts
[QUOTE="clone01"][QUOTE="moistsandwich"]

6-8 years.... there has never been a 10 year gap between console launches within any of the big 3. But I get your point.

savagetwinkie
I find it truly silly that fanboys debate about console power. Its a console. You want performance, get a PC. And this is coming from essentially a console-only gamer.

A lot of people don't like PC though, too many options, more things that can go wrong, its online is a unified experience

Certainly. That's why I like consoles. Although the line gets blurrier every day.

#45 SKaREO
Member since 2006 • 3161 Posts

[QUOTE="Evo_nine"]

[QUOTE="silversix_"]Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflLumpy311

errr....cows are in the same boat re: stuck with a crappy GPU

 

. Im betting this generation wont last 10 years, more like 5. Well....I HOPE.

I hope it does last 10 years to PC gaming can grow even more.

Mobile gaming is growing exponentially faster than PC gaming ever was. PC gaming is a niche and it will always be a niche. Thinking that a PC has any influence on games is a joke these days. Developers don't have much intention of making their games available on the PC; take Rockstar Games for example. I don't buy a $2500 system to brag about it. I buy a system to play games, and if there aren't any good games on the PC then what's the point of owning one?

#46 Lumpy311
Member since 2013 • 2009 Posts

Another Tormentos thread where he pretends to be a hardware architect.

 

And can someone explain to me why Sony fans continue to obsess over X1's hardware specs?

kuu2

That's how factions on System Wars work.


#47 ronvalencia
Member since 2008 • 29612 Posts

The 7790 is a poor poor card. I don't understand why Sony and MS went in with such poor gpu's. I understand the CPU being weak as shit, but why the GPU?

-Snooze-
The 7790 has higher FLOPS than the 7850, but the 7790 has a gimped L2 cache. Unlike the X1, the 7790 can NOT do memory writes @ 150GB/s (assuming this claim is true). There's a reason why I used a prototype 7850 with 12 CUs for the X1.
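Checking that FLOPS comparison against the desktop cards' reference specs. The 14 CUs @ 1000MHz and 16 CUs @ 860MHz figures are the reference clocks and are assumed here, not taken from the post:

```python
# FLOPS check using reference desktop specs. Assumed figures (not from the
# post): HD 7790 = 14 CUs @ 1000MHz, HD 7850 = 16 CUs @ 860MHz,
# 64 shaders/CU, 2 FLOPs/cycle.

def gflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1000.0

print(f"HD 7790: {gflops(14, 1000):.0f} GFLOPS")  # ~1792
print(f"HD 7850: {gflops(16, 860):.0f} GFLOPS")   # ~1761
```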

#48 ronvalencia
Member since 2008 • 29612 Posts

The worst thing about X1 is the fact that memory is controlled by the same old xbox360 128bit-bus memory.

dont let MS fool you into thinking that the console has 256bit memory width because of that crap ESRAM that's also used in WiiU.

ZoomZoom2490
Btw, GDDR3 is based on DDR2.

#49 Evo_nine
Member since 2012 • 2224 Posts

[QUOTE="Evo_nine"]

[QUOTE="silversix_"]Stuck for 10 years with a 7790 :lol: Good luck to you Lems roflLumpy311

errr....cows are in the same boat re: stuck with a crappy GPU

. Im betting this generation wont last 10 years, more like 5. Well....I HOPE.

I hope it does last 10 years to PC gaming can grow even more.

I haven't built a gaming PC for years, but I'll definitely be thinking about it if this console gen looks to last longer than 5 years. Also I really want to play Star Citizen.

Anyways, PC gaming always benefits from improving console tech.

http://techreport.com/news/25378/pc-gaming-will-benefit-from-next-gen-consoles-says-amd


#50 Basinboy
Member since 2003 • 14558 Posts

And now in addition to everyone being a critic, everyone's an engineer.

Just gimme a damn controller and let me play already.