New Leak/Rumor: Xbox 720 beta kit contains two APUs

This topic is locked from further discussion.

#201 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="StrongBlackVine"]2 APUs is Microsoft's version of The Cell![/QUOTE]

Xbox 360's Xenos includes two asymmetric graphics chips.
#202 NFJSupreme
Member since 2005 • 6605 Posts

Ron is like SW's fact checker haha

#203 HaRmLeSS_RaGe
Member since 2012 • 1330 Posts

tormentos is still damage controlling the PS4's weak ass hardware? :lol:

#204 CwlHeddwyn
Member since 2005 • 5314 Posts

Still don't understand MS's decision to use DDR3 RAM as opposed to GDDR5 RAM. MS, after all, used a unified pool of GDDR3 RAM back in 2005 with the Xbox 360; the next logical step would have been to opt for GDDR5 RAM for the Xbox 720. Look at what Sony has done: Sony has learned from their mistakes, while Microsoft seems to be creating new ones.

#205 Phazevariance
Member since 2003 • 12356 Posts

[QUOTE="CwlHeddwyn"]Still don't understand MS's decision to use DDR3 RAM as opposed to GDDR5 RAM. [...][/QUOTE]

You say that, with zero evidence of any hardware specs on the next console....
#206 savagetwinkie
Member since 2008 • 7981 Posts
[QUOTE="StrongBlackVine"]

2 APUs is Microsoft's version on The Cell!

ronvalencia
Xbox 360's Xenos includes two asymmetric graphics chips.

when did this happen? It wasn't originally like that.
#207 savagetwinkie
Member since 2008 • 7981 Posts

[QUOTE="tormentos"][QUOTE="delta3074"]which is based on rumours, nbody knows what RAM the 720 will have yet, RAM is soething you can change at the last minute,I Expect MS to counter what SONY is doing with the Ps4 on that level. I still expect the Ps4 to be more powerful, and i have already made up my mind to get a Ps4,just saying that we should avoid making concrete statements about the 360 until we see an official announceronvalencia

Yeah just like the PS4 was base on rumors and they were basically spot on,MS using DDr3 is basically confirmed and inside people has state this,is well know.. Ram is not something you can change at last minute you have little knowledge on this apparently,adding 256MB like on 360 is not the same as changing from DDR3 to GDDR5,they are not compatible,that involves and change in hardware,in the 720 case greatly since it has extra hardware to try to solve the bandwidth problems. If you use GDDR5 the design most change and DME,ESRAM and several other things should be gone,there is no point in having 102Gb/s ESRAM when GDDR5 would probably run at 153GB/s,that would basically stall the data flow and create a bottle neck. MS can't do nothing to counter sony now,without delaying the 720 9 months to 1 year,you don't test hardware in 1 months and mass market it,remember RROD.?

AMD Cape Verde's memory controller supports both DDR3 (e.g. Radeon HD 7730M) and GDDR5 (e.g. Radeon HD 7750) memory types. AMD Kaveri APU also similar memory type support.

Note that GDDR5 is based on DDR3.

you've never brought up a board before have you. Memory is a large change in the settings and he's completely right about the ESRAM. GDDR5 and DRR3 are not drop in replaceable.
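As an aside, the two bandwidth figures quoted above are internally consistent. A minimal sketch of the arithmetic in Python, assuming the leaked values (a 1024-bit ESRAM path at the rumored 800MHz GPU clock, and 256-bit GDDR5 at 4800MT/s; none of this is official):

[code]
# Peak bandwidth = bus width in bytes * transfers per second.
esram_gbs = 800e6 * (1024 / 8) / 1e9   # 102.4 GB/s: rumored 1024-bit ESRAM at 800 MHz
gddr5_gbs = 4800e6 * (256 / 8) / 1e9   # 153.6 GB/s: 256-bit GDDR5 at 4800 MT/s
print(esram_gbs, gddr5_gbs)
[/code]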
#208 tormentos
Member since 2003 • 33793 Posts

[QUOTE="tormentos"][QUOTE="ronvalencia"]Xbox 720's DDR3 is based on rumours.

"Malta" 7990 requires two 8 pin PCI-E power connectors and there's no possibility for 1st generation Xbox 360's PSU to power it (i.e. 245 watts of continuous power with 280 watts peak power).

ronvalencia

A rumor not something assume by you which is different. You quoted a page which is nothing related to the 720.

You assumed that GCN has a split DDR3 and GDDR5 memory controller designs i.e. 7730M and 7750M uses the same ASIC design.

The designs for both DDR3 and GDDR5 for GCNs has been completed since last year.

No i did not, you did, i never mention even that, you are the king of replaying to things never argue. The GPU supporting DDR3 and GDDR5 in nothing change the fact that to use GDDR5 on 720 the console board has to be redesign,better cooling has to be implemented,and probably even an exterior design,is not about the memory controller DDR3 and GDDR5 are not compatible,and you can't just slap GDDR5 into DDR3 board.
#209 tormentos
Member since 2003 • 33793 Posts
[QUOTE="tormentos"][QUOTE="04dcarraher"] :lol: never said anything about the xbox 720, all we know is rumors,  totally flew over your headabout the possibility.04dcarraher
The your and idiot because that is what i was talking about,if you quoted 512 bit in an attend to counter GDDR5 speed you failed,GDDR5 still way faster,once again is the reason every card maker move even that GDDR5 is way more expensive..

butthurt that I showed you 512bit GDDR3 being as fast as 256bit GDDR5 aww how sad.

Is not as fast and what i was talking about was the 720,the PS4 Bandwidth by the way is 176GB/s,but there are far higher on GDDR5.
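For reference, the 176GB/s figure follows directly from the rumored 256-bit GDDR5 bus at 5500MT/s. A minimal sketch (the bus widths and transfer rates are the widely reported rumor figures, not official specs):

[code]
def peak_bandwidth_gbs(bus_width_bits, transfer_rate_mtps):
    """Peak theoretical bandwidth in GB/s: bytes per transfer * transfers per second."""
    return (bus_width_bits / 8) * transfer_rate_mtps / 1000

print(peak_bandwidth_gbs(256, 5500))   # 176.0 GB/s: rumored PS4, 256-bit GDDR5 at 5500 MT/s
print(peak_bandwidth_gbs(256, 2133))   # ~68.3 GB/s: rumored Durango, 256-bit DDR3-2133
[/code]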
#210 tormentos
Member since 2003 • 33793 Posts

[QUOTE="tormentos"][QUOTE="ronvalencia"] AMD Cape Verde supports both DDR3 (e.g. Radeon HD 7730M) and GDDR5 (e.g. Radeon HD 7750) memory types. Note that GDDR5 is based on DDR3. ronvalencia

And that mean sh** since is not a problem of the card or GPU supporting GDDR5 and DDR3,the problem lies on the board the 720 has inside which already has a design,changing from DDR3 to GDDR5 would imply.. A design change of the board. Better cooling. Elimination of components like ESRAM since they would be a bottle neck with GDDR5 in place. And even probably an exterior design.

The board wouldn't be the problem since the designs for both DDR3 and GDDR5 traces already exist with existing products.

If you look at Wii U's PCB, you can see AMD's typical GPU memory trace design.

DSC_9007sm.jpg

This is "copy and paste" level engineering i.e. "semi-custom" job.

GDDR5 will be further mass produced as part of PC's SODIMM standard.

Xbox 360 Slim with AMD's typical GPU memory trace design.

xbox-360-slim-disassembly-16.jpg

Radeon HD 6670 with AMD's typical GPU memory trace design layout.AMD is pretty consistent when it comes to "cut-and-paste" memory trace designs.

You don't get it do you.? Once the xbox 720 design as in lock changing memory means changing the board again,doing further testing and a complete redesign,contracts are sign for this parts to be manufacture,you cat change components like that,adding more ram is not the same as changing it,you know this so stop your butthurt denial.. You look like a fool trying to prove your random quote theories,once DDR3 with ESRAM was in lock,changing that means delaying the unit and redesigning it period.
#211 tormentos
Member since 2003 • 33793 Posts

[QUOTE="NFJSupreme"]Ron is like SW's fact checker haha[/QUOTE]

He is not; he is just a huge AMD suck-up who likes to argue points never made. You say, "Ice cream is good, don't you think?" and Ron responds, "I like Apple because they make great products." Then come 10 charts proving how great Apple is. :lol:
#212 Caseytappy
Member since 2005 • 2199 Posts

Yeah, well, keep babbling Tormentos, but once I have my 720 I am going to release one of those APUs and send it after you, and you will be running even faster than if it was a girl, straight back to deep Nerdistan.

#213 Bebi_vegeta
Member since 2003 • 13558 Posts

[QUOTE="NFJSupreme"]

Ron is like SW's fact checker haha

tormentos

He is not he is just and ADM huge suck up,who like to argue point never make. You say.. Ice scream is good don't you think.? And Ron respond. I like Apple because they make great products. 'Then come 10 charts proving how great apple is.. :lol:

 

Got to be the only time I agree with you.

#214 Shielder7
Member since 2006 • 5191 Posts

[QUOTE="ChubbyGuy40"]

Two APUs for twice the suckage.

Wickerman777

 

It could just be a gag or even an inaccurate rumor but IF 2 of those 1.2 tflop GPUs does end up in Nextbox then that changes everything. Durango would instantly go from quite a bit weaker than PS4 to quite a bit more powerful than it is.

Depends What the 2ed APUs function is or how it's used. If the 2ed APU is identical and is used as a conjunction than maybe, if it's used for the kinetic and social media crap it's irrelevant to game play outside of kinetic.
#215 Shielder7
Member since 2006 • 5191 Posts
[QUOTE="hexashadow13"]That doesn't actually make any sense. The point of APUs is to have everything connected. If you have two APUs... how would that even work? Would one be for low power and one for high power? Would they both work at the same time? If so... which APU would do what? That sounds annoyingly complicated to develop for, as well as a gigantic waste of money as opposed to just having a more powerful single APU. :/ShadowriverUB
If they really used in retail and they both really used for game, they could communicate via shered memory, which would be slower. For programmer it would be seamless as everything would be deal via OS as OS is responsible for process scheduling and assailing jobs fro each core, software would see it as sperate cores... or should i say sperate CPUs as thats how core are seen by software. There bigger propability that those CPU will have sperate roles, one for game one for system, or as mention above one for Kinect. But i still think it's just some kind of debuging machine. Also this set up would be also perfect for DRM, system CPU would only supervise securied game CPU which would only takes orders from system CP, ofcorse in this set up it would be like 2 seperate devices integrated to eachother.

That actually makes sense, but how much you want to bet they'll market it as 2 APUs TWICE THE POWER!
#216 Shielder7
Member since 2006 • 5191 Posts

[QUOTE="Wickerman777"]

Lol, so do you think it would just sit in there and keep the other one company? Not saying there is going to be a second APU but if there was it would certainly make a difference. I remember a couple months ago digital foundry saying it wasn't possible to put 8 gigs of GDDR5 RAM in a console but tada, that's gonna happen. What can or can't work in a PC doesn't necassarily apply to console design because you can customize to no end.

Wasdie

You should look into the 2 CPU solution the Sega Saturn had. Devs hated that thing as it was practically impossible to use correctly.

You cannot just break up your game into hundreds of little pieces and process it at once. Too many issues with latency and communicating between the two processors. It's hard to explain to somebody who has obviously no knowledge of processing on a hardware level. 

Processing hardware and the limitations of the hardware exist in all configurations. By changing the architecture you can eliminate bottlenecks, but you cannot rewrite the laws of physics. 

New Rumor Next Box to defy reality. Lemmings go Ape ****
#217 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="StrongBlackVine"]

2 APUs is Microsoft's version on The Cell!

savagetwinkie

Xbox 360's Xenos includes two asymmetric graphics chips.

when did this happen? It wasn't originally like that.

Xbox 360's eDRAM chip includes GPU's backend fix function units e.g. ROP units, MSAA and 192 pixel processors. http://www.beyond3d.com/content/articles/4/4

The primary chip contains GPU's frontend fix function units and the main shader/stream processors. The Xenos is not based on PC's Radeon X1800/X1900.

#218 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="tormentos"] And that mean sh** since is not a problem of the card or GPU supporting GDDR5 and DDR3,the problem lies on the board the 720 has inside which already has a design,changing from DDR3 to GDDR5 would imply.. A design change of the board. Better cooling. Elimination of components like ESRAM since they would be a bottle neck with GDDR5 in place. And even probably an exterior design.tormentos

The board wouldn't be the problem since the designs for both DDR3 and GDDR5 traces already exist with existing products.

If you look at Wii U's PCB, you can see AMD's typical GPU memory trace design.

DSC_9007sm.jpg

This is "copy and paste" level engineering i.e. "semi-custom" job.

GDDR5 will be further mass produced as part of PC's SODIMM standard.

Xbox 360 Slim with AMD's typical GPU memory trace design.

xbox-360-slim-disassembly-16.jpg

Radeon HD 6670 with AMD's typical GPU memory trace design layout.AMD is pretty consistent when it comes to "cut-and-paste" memory trace designs.

You don't get it do you.? Once the xbox 720 design as in lock changing memory means changing the board again,doing further testing and a complete redesign,contracts are sign for this parts to be manufacture,you cat change components like that,adding more ram is not the same as changing it,you know this so stop your butthurt denial.. You look like a fool trying to prove your random quote theories,once DDR3 with ESRAM was in lock,changing that means delaying the unit and redesigning it period.

Please provide the confirmation. Microsoft has multiple Xbox Next plans e.g Yukon is known to be a dual IGP setup i.e. one at 1Ghz and the other at 500 MHz.

#219 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ShadowriverUB"][QUOTE="hexashadow13"]That doesn't actually make any sense. The point of APUs is to have everything connected. If you have two APUs... how would that even work? Would one be for low power and one for high power? Would they both work at the same time? If so... which APU would do what? That sounds annoyingly complicated to develop for, as well as a gigantic waste of money as opposed to just having a more powerful single APU. :/Shielder7
If they really used in retail and they both really used for game, they could communicate via shered memory, which would be slower. For programmer it would be seamless as everything would be deal via OS as OS is responsible for process scheduling and assailing jobs fro each core, software would see it as sperate cores... or should i say sperate CPUs as thats how core are seen by software. There bigger propability that those CPU will have sperate roles, one for game one for system, or as mention above one for Kinect. But i still think it's just some kind of debuging machine. Also this set up would be also perfect for DRM, system CPU would only supervise securied game CPU which would only takes orders from system CP, ofcorse in this set up it would be like 2 seperate devices integrated to eachother.

That actually makes sense, but how much you want to bet they'll market it as 2 APUs TWICE THE POWER!

Two APUs on a chip package with quad-core Jaguar + Radeon HD 7750 (8 CU) type GCN would be similar to PS4's compute power. Having two APUs doesn't automatically mean it would beat PS4.

PS4's APU glues two sets quad core AMD Jaguars and linked to GCN with 18 CUs.
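A quick sanity check of that comparison: GCN peak single-precision throughput is CUs × 64 lanes × 2 ops (FMA) × clock. A sketch assuming the commonly rumored 800MHz clock for both machines (an assumption, not a confirmed spec):

[code]
def gcn_peak_tflops(cu_count, clock_ghz):
    """Peak FP32 TFLOPS for a GCN GPU: CUs * 64 lanes * 2 ops per FMA * clock."""
    return cu_count * 64 * 2 * clock_ghz / 1000

print(2 * gcn_peak_tflops(8, 0.8))   # ~1.64 TFLOPS: two hypothetical 8-CU APUs combined
print(gcn_peak_tflops(18, 0.8))      # ~1.84 TFLOPS: PS4's single 18-CU GPU
[/code]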

#220 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="NFJSupreme"]

Ron is like SW's fact checker haha

tormentos

He is not he is just and ADM huge suck up,who like to argue point never make. You say.. Ice scream is good don't you think.? And Ron respond. I like Apple because they make great products. 'Then come 10 charts proving how great apple is.. :lol:

Again, there nothing of substance with your post. You are converting this topic into fanboys wars.

My views on PS3 haven't changed since 2006's NVIDIA G80.

#221 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="tormentos"][QUOTE="ronvalencia"]

The board wouldn't be the problem since the designs for both DDR3 and GDDR5 traces already exist with existing products.

If you look at Wii U's PCB, you can see AMD's typical GPU memory trace design.

DSC_9007sm.jpg

This is "copy and paste" level engineering i.e. "semi-custom" job.

GDDR5 will be further mass produced as part of PC's SODIMM standard.

Xbox 360 Slim with AMD's typical GPU memory trace design.

xbox-360-slim-disassembly-16.jpg

Radeon HD 6670 with AMD's typical GPU memory trace design layout.AMD is pretty consistent when it comes to "cut-and-paste" memory trace designs.

ShadowriverUB

You don't get it do you.? Once the xbox 720 design as in lock changing memory means changing the board again,doing further testing and a complete redesign,contracts are sign for this parts to be manufacture,you cat change components like that,adding more ram is not the same as changing it,you know this so stop your butthurt denial.. You look like a fool trying to prove your random quote theories,once DDR3 with ESRAM was in lock,changing that means delaying the unit and redesigning it period.

Actually memory application circuitry is something that is designed together with memory controller in APU and memory chips it self.... and pracicly applies to any integrated circuit, so it's just matter of layouting and testing

AMD provides reference PCB designs. http://www10.edacafe.com/nbc/articles/view_article.php?articleid=235634&section=CorpNews

Microprocessors, and their associated chipsets, continue to increase in density, speed and pin count. This makes it increasingly difficult for PCB designers to meet interconnect timing and signal integrity constraint challenges while placing and routing these complex microprocessor components on the board. To facilitate design-in of AMD64 microprocessors, AMD provides its customers with core layout designs for each processor. These reference designs include not only the processor but also memory interfaces, HyperTransport(TM) technology, power distribution and terminations.


The above pictures shows similar PCB design patterns for AMD GPU based devices.

#222 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="CwlHeddwyn"]Still don't understand MS's decision to use DDR3 RAM as opposed to GDDR5 RAM. [...][/QUOTE]

There's no confirmation of the GDDR3/DDR3 plan.
#223 ShadowriverUB
Member since 2009 • 5515 Posts
[QUOTE="ronvalencia"]

[QUOTE="tormentos"] And that mean sh** since is not a problem of the card or GPU supporting GDDR5 and DDR3,the problem lies on the board the 720 has inside which already has a design,changing from DDR3 to GDDR5 would imply.. A design change of the board. Better cooling. Elimination of components like ESRAM since they would be a bottle neck with GDDR5 in place. And even probably an exterior design.tormentos

The board wouldn't be the problem since the designs for both DDR3 and GDDR5 traces already exist with existing products.

If you look at Wii U's PCB, you can see AMD's typical GPU memory trace design.

DSC_9007sm.jpg

This is "copy and paste" level engineering i.e. "semi-custom" job.

GDDR5 will be further mass produced as part of PC's SODIMM standard.

Xbox 360 Slim with AMD's typical GPU memory trace design.

xbox-360-slim-disassembly-16.jpg

Radeon HD 6670 with AMD's typical GPU memory trace design layout.AMD is pretty consistent when it comes to "cut-and-paste" memory trace designs.

You don't get it do you.? Once the xbox 720 design as in lock changing memory means changing the board again,doing further testing and a complete redesign,contracts are sign for this parts to be manufacture,you cat change components like that,adding more ram is not the same as changing it,you know this so stop your butthurt denial.. You look like a fool trying to prove your random quote theories,once DDR3 with ESRAM was in lock,changing that means delaying the unit and redesigning it period.

Actually memory application circuitry is something that is designed together with memory controller in APU and memory chips it self.... and pracicly applies to any integrated circuit, so it's just matter of layouting and testing
#224 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"] There's no confirmation with the GDDR3/DDR3 plan. tormentos
It was heavily rumor,just like the PS4 one as,every thing has fit nicely. You can continue your eternal quest for wishful thinking fact is it changes nothing. DDR3 with ESRAM is what will be on on the unit,there is not single reason to use DDR3 with ESRAM if MS will use GDDR5 on final kits,GDDR5 is some out of this world tech,is available for some years now and the PS4 will have 8GB,so if MS wanted GDDR5 they would have go with that..

Again, please provide the confirmation.

I guessed PS4's 78x0 level GCN based on RSX's TDP and combining trace lines from XDR's 72 bit and GDDR3's 128 bit (i.e. PCB cost remains mostly constant).

#225 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"] Again, please provide the confirmation.

I guessed PS4's 78x0 level GCN based on RSX's TDP and combining trace lines from XDR's 72 bit and GDDR3's 128 bit (i.e. PCB cost remains mostly constant).

tormentos

I don't even know what you claimed there..:lol: Please some one translate,what does the 78XX has to do with the RSX.? Stop making stupid claims.. Changing memory is not as simple as adding more.

Q: "what does the 78XX has to do with the RSX?"

A: TDP levels, die size

"Stop making stupid claims"

Don't participate in this speculative topic.

Changing memory is not as simple as adding more

It's simple if one follows AMD's memory trace reference designs for both DDR3 and GDDR5. Between memory controllers and memory modules, AMD already has" copy-and-paste" reference designs for DDR3 and GDDR5 memory types. Again, you are too stupid to notice that the above pictures follows AMD's memory trace pattern designs.

Switching between DDR3 and GDDR5 for GPUs is not a big issue in the PC world i.e. DELL/HP has launched 7730M(DDR3 version)/7750M(GDDR5 version) at similar time period.

You can have small companies like Xi3 pumping out custom motherboards (which uses the AMD's reference design blocks) that matches any other motherboards' performance.

PS; NVIDIA also has PCB reference design offerings.

#226 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ShadowriverUB"] Actually memory application circuitry is something that is designed together with memory controller in APU and memory chips it self.... and pracicly applies to any integrated circuit, so it's just matter of layouting and testingtormentos
NO is not as simple. The memory controller is on the APU,but GDDR5 is on the main board not on the APU it self,there are way higher density modules on DDR3 than on GDDR5,which mean more memory modules,which mean more heat also GDDR5 also runs hotter than DDR3,so new cooling as well and more expensive one. If MS decided to go DDR3 is because the GPU inside the 720 is basically a 7770 either any way,the 7790 alines better,but the 7790 runs at a higher clock than the 720 rumored clock and the 1.2TF specs confirm that. There is a big change that MS will push for Kinect and features rather than graphics.

Stop making stupid generalisations.

"GDDR5M" has lower power consumption than DDR3 (not referring to DDR3L) or desktop GDDR5.

As with any chips, memory chip's power consumption depandant on it's process tech and it's manufacturing yield quality(i.e. this influences the electrical leakage characteristics).

#227 tormentos
Member since 2003 • 33793 Posts
[QUOTE="ronvalencia"] There's no confirmation with the GDDR3/DDR3 plan.

It was heavily rumor,just like the PS4 one as,every thing has fit nicely. You can continue your eternal quest for wishful thinking fact is it changes nothing. DDR3 with ESRAM is what will be on on the unit,there is not single reason to use DDR3 with ESRAM if MS will use GDDR5 on final kits,GDDR5 is some out of this world tech,is available for some years now and the PS4 will have 8GB,so if MS wanted GDDR5 they would have go with that..
#228 tormentos
Member since 2003 • 33793 Posts
[QUOTE="ShadowriverUB"] Actually memory application circuitry is something that is designed together with memory controller in APU and memory chips it self.... and pracicly applies to any integrated circuit, so it's just matter of layouting and testing

NO is not as simple. The memory controller is on the APU,but GDDR5 is on the main board not on the APU it self,there are way higher density modules on DDR3 than on GDDR5,which mean more memory modules,which mean more heat also GDDR5 also runs hotter than DDR3,so new cooling as well and more expensive one. If MS decided to go DDR3 is because the GPU inside the 720 is basically a 7770 either any way,the 7790 alines better,but the 7790 runs at a higher clock than the 720 rumored clock and the 1.2TF specs confirm that. There is a big change that MS will push for Kinect and features rather than graphics.
#229 tormentos
Member since 2003 • 33793 Posts
[QUOTE="ronvalencia"] Again, please provide the confirmation.

I guessed PS4's 78x0 level GCN based on RSX's TDP and combining trace lines from XDR's 72 bit and GDDR3's 128 bit (i.e. PCB cost remains mostly constant).

I don't even know what you claimed there..:lol: Please some one translate,what does the 78XX has to do with the RSX.? Stop making stupid claims.. Changing memory is not as simple as adding more.
#230 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="ronvalencia"]

[QUOTE="tormentos"] A rumor not something assume by you which is different. You quoted a page which is nothing related to the 720.tormentos

You assumed that GCN has a split DDR3 and GDDR5 memory controller designs i.e. 7730M and 7750M uses the same ASIC design.

The designs for both DDR3 and GDDR5 for GCNs has been completed since last year.

No i did not, you did, i never mention even that, you are the king of replaying to things never argue. The GPU supporting DDR3 and GDDR5 in nothing change the fact that to use GDDR5 on 720 the console board has to be redesign,better cooling has to be implemented,and probably even an exterior design,is not about the memory controller DDR3 and GDDR5 are not compatible,and you can't just slap GDDR5 into DDR3 board.

Please provide the confirmation for Xbox Next's DDR3 plan.
#231 kickingcarpet
Member since 2011 • 570 Posts

TL;DR: wasn't PS4 news, didn't bother! The PS4 will have 1 APU and still run laps around Microsoft.

#232 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="tormentos"] No i did not, you did, i never mention even that, you are the king of replaying to things never argue. The GPU supporting DDR3 and GDDR5 in nothing change the fact that to use GDDR5 on 720 the console board has to be redesign,better cooling has to be implemented,and probably even an exterior design,is not about the memory controller DDR3 and GDDR5 are not compatible,and you can't just slap GDDR5 into DDR3 board.tormentos
Please provide the confirmation for Xbox Next's DDR3 plan.

http://www.vgleaks.com/world-exclusive-durango-unveiled-2/ Now where is yours.? I i want to see the diagram of Durago using 8GB of GDDR5..

Notice vgleaks.com

vgleaks.com's "768 threads" for GCN is LOL. Radeon X1900 (R580) has 512 threads over it's 48 pixel shaders. These stupid console clowns/"johnny come lately" don't know anything about PC hardware. I could nick pick and show how stupid they(vgleaks.com) are.

#233 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="tormentos"] No i did not, you did, i never mention even that, you are the king of replaying to things never argue. The GPU supporting DDR3 and GDDR5 in nothing change the fact that to use GDDR5 on 720 the console board has to be redesign,better cooling has to be implemented,and probably even an exterior design,is not about the memory controller DDR3 and GDDR5 are not compatible,and you can't just slap GDDR5 into DDR3 board.tormentos
Please provide the confirmation for Xbox Next's DDR3 plan.

http://www.vgleaks.com/world-exclusive-durango-unveiled-2/ Now where is yours.? I i want to see the diagram of Durago using 8GB of GDDR5..

From http://www.vgleaks.com/playstation-4-architecture-evolution-over-time/

Their CPU bus to memory with <20 GB/s is a LOL material. GDDR5 didn't boost CPU's bandwidth access beyond their DDR3 counterparts. Note that Intel Core i7-3770 was benchmarked past 20 GB/s.

Vgleaks' Onion links is no better than AMD Llano's 10 GB/s Onion links.


#234 Consolessuck187
Member since 2013 • 199 Posts
Not exactly a new idea; desktop APUs and graphics cards can do CrossFire.
#235 tormentos
Member since 2003 • 33793 Posts
[QUOTE="RR360DD"] And what if the 720 gets announced in May, and they show the specs, and you're completely wrong? What then?

Simple i was wrong...Period i got owned... As simple as that.. But i don't think MS is after power this time,and leaks until now have been spot on.
#236 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]Notice vgleaks.com

vgleaks.com's "768 threads" for GCN is LOL. Radeon X1900 (R580) has 512 threads over it's 48 pixel shaders. These stupid console clowns/"johnny come lately" don't know anything about PC hardware. I could nick pick and show how stupid they(vgleaks.com) are.

tormentos

Where is your link i posted mine,i want to see the diagram of the 720 with GDDR5...

From http://www.vgleaks.com/playstation-4-architecture-evolution-over-time/

GDDR5 didn't boost CPU's memory bandwidth i.e. < 20 GB/s.

Most of the GDDR5 bandwidth is for the GPU i.e. not much different to the current AMD CPU + 7850 OC/7870 combo.

#237 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="Consolessuck187"]When has bandwidth ever been an issue, really? It's overhyped, just like "omg that CPU will bottleneck that card".[/QUOTE]

The 7750 with DDR3-1600 is 35% slower than the 7750 with GDDR5: http://ht4u.net/reviews/2012/msi_radeon_hd_7750_2gb_ddr3_test/index36.php

#238 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="Consolessuck187"]Then the difference will be a 1% improvement lol[/QUOTE]

From http://www.hardware.fr/focus/76/amd-radeon-hd-7750-ddr3-test-cape-verde-etouffe.html

GDDR5 at 1125 MHz (4500 MHz effective) vs DDR3 at 800 MHz (1600 MHz effective):

[chart: hardware.fr 7750 GDDR5 vs DDR3 benchmark results]
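The performance gap tracks the raw bandwidth gap. Both cards use the 7750's 128-bit bus; only the transfer rate differs (a sketch of the arithmetic, using the clocks listed above):

[code]
bus_bytes = 128 / 8                    # 16 bytes per transfer on a 128-bit bus
gddr5_gbs = bus_bytes * 4500 / 1000    # 72.0 GB/s at 4500 MT/s effective
ddr3_gbs = bus_bytes * 1600 / 1000     # 25.6 GB/s at 1600 MT/s effective
print(gddr5_gbs / ddr3_gbs)            # ~2.8x the raw bandwidth
[/code]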

#239 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="Consolessuck187"]craptastic comparisonConsolessuck187
Intel Haswell's L4 cache for the GPU indicates otherwise.

What's this gotta do with anything? lol the gddr3 has better timings....gddr5 at same clock speed has more bandwidth though.

GDDR5's latency example https://www.skhynix.com/products/graphics/view.jsp?info.ramKind=26&info.serialNo=H5GQ2H24AFR

Programmable CAS latency: 5 to 20 tCK

Programmable WRITE latency: 1 to 7 tCK

DDR3's latency example http://download.micron.com/pdf/datasheets/dram/ddr3/1Gb_DDR3_SDRAM.pdf

CAS READ latency (CL): 5, 6, 7, 8, 9, 10, or 11

CAS WRITE latency (CWL): 5, 6, 7, 8, based on tCK

------------------

http://forums.anandtech.com/showpost.php?p=34710477&postcount=28

Memory latency is most appropriately measured in nanoseconds. In terms of nanosecond latency, DDR3 and GDDR5 are very similar. Someone already posted a link that says that GDDR5 has programmable CAS latency (from 5 to 20 cycles), which if you take the increased clockspeed of GDDR5 into account, means that the latency in terms of nanoseconds is very similar to DDR3.

GDDR5's architecture is different from typical DDR3 in only a few important performance-oriented ways: * More banks (16 in GDDR5 versus 8 [typical] in DDR3, this is actually a latency-*reducing* feature). * More data pins per GDDR5 device (32 in GDDR5 versus [typicall] 8 or [very rarely] 16 in DDR3). This makes it so you can get all of the data for a cache line (or whatever granularity of access you're talking about), in a reasonable number of cycles from a single chip. In DDR3, all 8 chips in a rank work together to provide 64 bits per transfer of a cache line (8 bits per transfer each). This width, plus the very high clockspeed of GDDR5 has the net effect of data transfer taking *less time* with GDDR5 compared to DDR3, but data transfer is never the latency bottleneck in a DRAM system, so this part isn't very important. Suffice it to say, this does not have any negative impact on GDDR5 latency. * Obviously also very high clockspeeds and data transfer speeds.

As a source, other than the link provided earlier in the page, I myself researched DRAM for a couple years, and my lab mate just finished the GDDR5 version of his DRAM simulator, with all the industry-correct timing parameters involved, and according to him they have the almost identical latency.
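To make the cycles-versus-nanoseconds point concrete: latency in nanoseconds is CAS cycles divided by the command clock frequency. A sketch with illustrative values picked from the ranges quoted above (not the timings of any particular module):

[code]
def cas_latency_ns(cas_cycles, command_clock_mhz):
    """CAS latency in nanoseconds = cycles / clock frequency."""
    return cas_cycles * 1000 / command_clock_mhz

print(cas_latency_ns(11, 800))    # 13.75 ns: DDR3-1600 at CL11 (800 MHz command clock)
print(cas_latency_ns(15, 1250))   # 12.0 ns: 5000 MT/s GDDR5 at CL15 (1250 MHz command clock)
[/code]

Despite the higher cycle counts, the much faster clock puts GDDR5 in the same nanosecond range as DDR3, which is the anandtech poster's point.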

#240 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="Consolessuck187"]Then the difference will be 1% improvement lol Consolessuck187

From http://www.hardware.fr/focus/76/amd-radeon-hd-7750-ddr3-test-cape-verde-etouffe.html

GDDR5 1125 MHz (4500Mhz effective) vs DDR3 800 MHz (1600Mhz effective)

lol the only thing giving gddr5 real performance edge is higher clock. Clock them both to same spec and see difference aint shit

No GPU vendor made 7970 or 680 with DDR3. GDDR5's other real performance edge is multiple read and write request from many core processors.

PS; I overclock my 5730M's DDR3 (1Ghz from 800Mhz) to 5770M or 6570M's GDDR5 clockspeed and it wasn't able to match it.


#241 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="Consolessuck187"] lol the only thing giving gddr5 real performance edge is higher clock. Clock them both to same spec and see difference aint shit Consolessuck187

No GPU vendor made 7970 or 680 with DDR3. GDDR5's other real performance edge is multiple read and write request from many core processors.

That's because gddr5 is newer. duh

PS; I overclocked my 5730M's DDR3 (to 1Ghz) to 5770M or 6570M's GDDR5 clock speed and it wasn't able to match it.

#242 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="Consolessuck187"] That's because gddr5 is newer. duh Consolessuck187
PS; I overclock my 5730M's DDR3 (1Ghz from 800Mhz) to 5770M or 6570M's GDDR5 clock speed and it wasn't able to match it.

gtx 280 with gddr3 is faster card than the 7750 with gddr5 ZOMG

There are other bottlenecks besides memory i.e. ROP and TMUs.

GTX 280 has 512-bit memory bus with 141.7GB/s . 512bit PCB is expensive.
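That 141.7GB/s figure checks out with the usual bandwidth arithmetic (the GTX 280's GDDR3 runs at 1107MHz, 2214MT/s effective):

[code]
print((512 / 8) * 2214 / 1000)   # ~141.7 GB/s on the GTX 280's 512-bit bus
[/code]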

#243 tormentos
Member since 2003 • 33793 Posts

[QUOTE="tormentos"][QUOTE="ronvalencia"] Again, please provide the confirmation.

I guessed PS4's 78x0 level GCN based on RSX's TDP and combining trace lines from XDR's 72 bit and GDDR3's 128 bit (i.e. PCB cost remains mostly constant).

ronvalencia

I don't even know what you claimed there..:lol: Please some one translate,what does the 78XX has to do with the RSX.? Stop making stupid claims.. Changing memory is not as simple as adding more.

Q: "what does the 78XX has to do with the RSX?"

A: TDP levels, die size

"Stop making stupid claims"

Don't participate in this speculative topic.

Changing memory is not as simple as adding more

It's simple if one follows AMD's memory trace reference designs for both DDR3 and GDDR5. Between memory controllers and memory modules, AMD already has" copy-and-paste" reference designs for DDR3 and GDDR5 memory types. Again, you are too stupid to notice that the above pictures follows AMD's memory trace pattern designs.

Switching between DDR3 and GDDR5 for GPUs is not a big issue in the PC world i.e. DELL/HP has launched 7730M(DDR3 version)/7750M(GDDR5 version) at similar time period.

You can have small companies like Xi3 pumping out custom motherboards (which uses the AMD's reference design blocks) that matches any other motherboards' performance.

PS; NVIDIA also has PCB reference design offerings.

No is not period and not only on forums like Beyond3D that has been state,neogaf and basically every where,there is a reason why MS chose DDR3 with ESRAM there is no point in using DDR3 with ESRAM if they intended to use GDDR5 all alone period,and changing memory on 720 means change the design period.
#244 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="Consolessuck187"]People assume that because the Xbox 720 might have GDDR3 it's automatically not as powerful as the PS4, because "zomg no GDDR5". lol, afraid there's more to it than that.[/QUOTE]

I discounted 512-bit DDR3 due to PCB cost, but 128-bit DDR3 on the chip package + 256-bit on the PCB could be feasible.

Wide I/O / stacked DDR3 has the potential to rival PCB GDDR5.

#245 tormentos
Member since 2003 • 33793 Posts
[QUOTE="ronvalencia"]Stop making stupid generalisations.

"GDDR5M" has lower power consumption than DDR3 (not referring to DDR3L) or desktop GDDR5.

As with any chip, a memory chip's power consumption depends on its process tech and its manufacturing yield quality (i.e. this influences the electrical leakage characteristics).[/QUOTE]

See, and this is your biggest problem: just because something is available doesn't mean MS will use it. You think anything that gets made will be in the 720 because it is somehow more efficient. Where has it been rumored that the 720 will use GDDR5M? Hell, not even the PS4 has been rumored to have GDDR5M, and it will have GDDR5. Stop arguing stupid things; just because something exists doesn't mean MS will choose it. You are like a one-man army robot that throws out nothing but self-made rumors and incoherent arguments. The 720 will not use GDDR5M, and you can quote that in your signature for future reference if you want. DDR3 has higher density modules than GDDR5; that is a fact, and changing 8GB of DDR3 to 8GB of GDDR5 would not only increase your BOM, it would increase the size of the console, change the design, and require more testing and a better cooling solution. Cell phones have memory that produces very little heat; does that mean MS will use phone memory in the 720?
#246 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="Consolessuck187"]people assume because xbox720 might have gddr3 it's automatically not as powerful as ps4 because zomg no gddr5. lol afraid it's more to it than that.Consolessuck187
I discounted 512bit DDR3 due to PCB cost, but 128bit DDR3 on-board the chip package + 256bit on PCB could be feasible.

720 could have more texture memory units or rasterizing operations so doesnt make a f*ck if ps4 ram is faster lol.

With AMD GCNs, ROP unit count is usually related to memory controller unit count i.e.

128bit bus = 4 32bit GDDR5 memory controllers = 16 ROPs,

256 bit bus = 8 32bit GDDR5 memory controllers = 32 ROPs,

7790 changed it's TMU + CU count hence the increase in performance.

Only AMD Tahiti has decoupled it's memory controller count from the ROP unit count.
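That pairing rule is simple to express. A sketch (the 4-ROPs-per-32-bit-controller ratio is the pre-Tahiti GCN pattern described above, not a universal law):

[code]
def gcn_rop_estimate(bus_width_bits):
    """Pre-Tahiti GCN rule of thumb: one 32-bit memory controller per
    32 bits of bus width, and 4 ROPs per controller."""
    controllers = bus_width_bits // 32
    return controllers * 4

print(gcn_rop_estimate(128))   # 16 ROPs (e.g. Cape Verde)
print(gcn_rop_estimate(256))   # 32 ROPs (e.g. Pitcairn)
[/code]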

#247 tormentos
Member since 2003 • 33793 Posts
[QUOTE="tormentos"][QUOTE="ronvalencia"]

You assumed that GCN has a split DDR3 and GDDR5 memory controller designs i.e. 7730M and 7750M uses the same ASIC design.

The designs for both DDR3 and GDDR5 for GCNs has been completed since last year.

ronvalencia
No i did not, you did, i never mention even that, you are the king of replaying to things never argue. The GPU supporting DDR3 and GDDR5 in nothing change the fact that to use GDDR5 on 720 the console board has to be redesign,better cooling has to be implemented,and probably even an exterior design,is not about the memory controller DDR3 and GDDR5 are not compatible,and you can't just slap GDDR5 into DDR3 board.

Please provide the confirmation for Xbox Next's DDR3 plan.

http://www.vgleaks.com/world-exclusive-durango-unveiled-2/ Now where is yours.? I i want to see the diagram of Durago using 8GB of GDDR5..
#248 clyde46
Member since 2005 • 49061 Posts
Tormentos has got some major damage control going on.
#249 Bruce_Benzing
Member since 2012 • 1731 Posts

[QUOTE="clyde46"]Tormentos has got some major damage control going on.[/QUOTE]
This is what happens when you live your life with your head so far up Kaz's butt, you can tell everyone what he had for dinner....

#250 clyde46
Member since 2005 • 49061 Posts

[QUOTE="clyde46"]Tormentos has got some major damage control going on.Bruce_Benzing

This is what happens when you live your life with your head so far up Kaz's butt, you can tell everyone what he had for dinner....

He's in everything thread. Anything to do with hardware he always finds a way to worm is way in.