New Xbox 720 chips now in production! POWERHOUSE Incoming?

This topic is locked from further discussion.


#201 arto1223
Member since 2005 • 4412 Posts

You are such an ignorant fanboy. It will not have those GPUs. Too much power needed and too much heat generated. Also, if you think we will have Avatar-like graphics in real time in 2013, you are just an ignorant human being.

Even if that is true about the GPUs, it is funny because those are not even high-end GPUs (they would probably be equivalent to the 7850s).


#202 dramaybaz
Member since 2005 • 6020 Posts

I am predicting 720 will have 2 8990s in crossfire.

fr3ddiemercury
Stopped reading there.

#203 HaloPimp978
Member since 2005 • 7329 Posts

Enough with the Avatar graphics; next-gen consoles won't have that at all.


#204 Evo_nine
Member since 2012 • 2224 Posts

Enough with the Avatar graphics; next-gen consoles won't have that at all.

HaloPimp978

:idea:

Are you saying they will have better graphics than Avatar?


#205 wis3boi
Member since 2005 • 32507 Posts

Enough with the Avatar graphics; next-gen consoles won't have that at all.

HaloPimp978

Anyone who believes that myth, I can't take seriously. Poe's Law.


#206 hartsickdiscipl
Member since 2003 • 14787 Posts

http://www.pcmag.com/article2/0,2817,2413873,00.asp

Cannot wait!

I am predicting 720 will have 2 8990s in crossfire. Dunno about the CPU, too many rumors on that, but it will probably be strong. When you add optimization on top of these specs... WOW. I bet AMD's Avatar graphics prediction will come true.

And for those who are going to say "ya right, and it will cost $1000"... the 720 is rumored to have a subscription model, so they can easily price it low for the initial purchase for the masses (just like how $1000 phones/tablets can be had for $400)... Who's excited!?

fr3ddiemercury

I predict that you will be moved to the head of the waiting list for a lobotomy.


#207 superclocked
Member since 2009 • 5864 Posts

[QUOTE="HaloPimp978"]

Enough with the Avatar graphics; next-gen consoles won't have that at all.

wis3boi

Anyone who believes that myth, I can't take seriously. Poe's Law.

Yeah, it reminds me of the Toy Story graphics claim for the first Xbox...

#208 superclocked
Member since 2009 • 5864 Posts

[QUOTE="fr3ddiemercury"]

http://www.pcmag.com/article2/0,2817,2413873,00.asp

Cannot wait!

I am predicting 720 will have 2 8990s in crossfire.Dunno about the cpu, too many rumors on that but it will probably be strong. When you add optimization on top of these specs...WOW. I bet AMD's avatar graphics prediction will come true.

And for those who are going to say "ya right and it will cost $1000"....720 is rumored to have subscription model so can easily price it low for intial purchase for the masses (just like how $1000 phones/tablets can be had for $400)... Whos excited!?

hartsickdiscipl

I predict that you will be moved to the head of the waiting list for a lobotomy.

lol...

#209 Bebi_vegeta
Member since 2003 • 13558 Posts

[QUOTE="Bebi_vegeta"]

[QUOTE="Evo_nine"]

Looks like the choice for gamers next gen will be

Buy an Xbox 720 for max $600

or

spend $7000 on a PC gaming rig to get similar graphics.

tough choice.

Vaasman

$10,000 on PC, and I also upgrade every day.

Plebs.

If you aren't having multiple packaged upgrades sent to your house every hour by your private mailing company, then what's the point?

Damn, I'm such a noob.


#210 1080pOnly
Member since 2009 • 2216 Posts

How the fvck can it be that the TC edited the OP 16 TIMES and still got it so wrong!


#211 Timstuff
Member since 2002 • 26840 Posts

[QUOTE="fr3ddiemercury"]

http://www.pcmag.com/article2/0,2817,2413873,00.asp

Cannot wait!

I am predicting 720 will have 2 8990s in crossfire.Dunno about the cpu, too many rumors on that but it will probably be strong. When you add optimization on top of these specs...WOW. I bet AMD's avatar graphics prediction will come true.

And for those who are going to say "ya right and it will cost $1000"....720 is rumored to have subscription model so can easily price it low for intial purchase for the masses (just like how $1000 phones/tablets can be had for $400)... Whos excited!?

hartsickdiscipl

I predict that you will be moved to the head of the waiting list for a lobotomy.

[emoticon]


#212 savagetwinkie
Member since 2008 • 7981 Posts

*facepalms* (yes, I really did that)

Freddie, before you float off to your own dreamworld again, let's just restate that MS can NOT alter the laws of physics. Two of the highest-end GPUs of the new AMD series would have an idle power consumption of at least 400 watts, and a peak at 700-800 watts, and that would be for the GPUs alone. That would make it so expensive in power consumption that it would limit the potential buyer base. Does that sound logical? No? Good, you are catching on now.

Next, unless you see watercooling, or a case as big as a PC tower, two of those in X-fire would likely develop so much heat that the console would burn.

Given that the next step in AMD's GPU line has been suggested to be an incremental update, you can take the 7970 card and look at how it works, with what specs, heat and so on.

And what you suggest is not plausible.

What you will likely get is one GPU made for gaming, which might work if MS fixes cooling, noise and whatnot, as well as one much smaller GPU for media and other non-gaming services, so the biggest GPU wouldn't need to draw power 24/7.

Likewise, you would not see an 8-core AMD CPU at 3.x GHz; the power needed for that is simply too expensive, and the heat is too high for such a form factor. What you might get (and still way more powerful than what you are used to) is 6-8 cores at around 2.0 GHz (maybe slightly lower), which would also be more realistic.

That is, of course, unless your next Xbox is made the way a fridge is, and if it is, good lord, then that box would cost more to turn on than the monthly subscription it is rumored to have.

Maddie_Larkin
You're forgetting about ZeroCore; some of AMD's top-of-the-line cards idle at 15 W. And you're also missing the point that it takes a lot more power to run an entire card than just the chip.
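A rough back-of-the-envelope version of the power argument above, as a minimal Python sketch. Every wattage here is an assumption pulled from figures quoted in this thread (the 700-800 W dual-GPU claim, the ~200 W console envelope cited later), not a measurement:

```python
# Back-of-the-envelope power budget for the "two high-end GPUs in a console"
# claim. All wattages are assumed figures quoted in this thread, not
# measured values.

CONSOLE_ENVELOPE_W = 200  # rough next-gen console power budget cited later

claimed_build = {
    "two high-end GPUs under load": 750,  # midpoint of the 700-800 W claim
    "8-core ~3 GHz CPU": 125,             # assumed desktop-class CPU draw
    "board, RAM, storage, fans": 75,      # assumed platform overhead
}

total = sum(claimed_build.values())
print(f"Claimed build draws ~{total} W against a ~{CONSOLE_ENVELOPE_W} W envelope")
print(f"Over budget by roughly {total / CONSOLE_ENVELOPE_W:.1f}x")
```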

#213 clyde46
Member since 2005 • 49061 Posts
[QUOTE="Maddie_Larkin"]

*facepalms* (yes I really did that)

Freddie before you float off to your own dreamworld again, lets just restate that MS can NOT alter the laws of physics, 2 of the highest end of the new amd series of GPU would have an idle power consumption at atleast 400 Watts, and a peak at 700-800 that would be for the GPUs alone. Making it so expensive in power consumption that it would limit the potential buyerbase, does that sound logical? No good, you are catching on now.

Next is that unless you would see watercooling, or a case as big as a pc tower, two of those in X-fire would likely develop so much heat that the console would burn.

Given that the next step from AMDs GPU line has been suggested to be an incremental update, you can take the 7970 card, and look at how it works, with what specs, heat and so on.

And what you suggest is not plausible.

What you will likely get, is 1 GPU made for gaming, which might if MS fixes cooling, noise and whatnot, aswell as one way smaller GPU for media, and other none related services. So the biggest GPU would'nt need to draw too much power 24/7

Likewise why you would not see an 8 core AMX at 3.x Ghz, the Power needed for that is simply too expensive, the heat is too high for such a form factor. What you might get (and a very logical way more powerful then what you are used to) 6-8 cores at around 2.0 ghz (maybe slightly lower) would also be more realistic.

That is ofcourse, if your next X-box are not made the way a fridge is, and if it is, good lord, then that box would cost more to turn on then the monthly subscription it is rumored to have.

savagetwinkie
Your forgetting about ZeroCore, some of the top of the line cards idle at 15w for amd, and your also missing the point that it takes a lot more power to power an entire card than just a chip,

Doesn't Nvidia have something like that now?

#214 Chiefme
Member since 2009 • 68 Posts

How the fvck can it be that the TC edited the OP 16 TIMES and still got it so wrong!

1080pOnly
That is what his parents thought, except with the number of pills.

#215 NEWMAHAY
Member since 2012 • 3824 Posts

http://www.pcmag.com/article2/0,2817,2413873,00.asp

Cannot wait!

I am predicting 720 will have 2 8990s in crossfire.

fr3ddiemercury
I lol'd

#216 jsmoke03
Member since 2004 • 13719 Posts

That's nice... hope they bring more than third-party exclusives.


#217 SPBoss
Member since 2009 • 3746 Posts
TC is definitely a troll master; never have I seen one guy p*** off so many users LOL

#218 dramaybaz
Member since 2005 • 6020 Posts
TC is definitely a troll master; never have I seen one guy p*** off so many users LOL
SPBoss
Intentional troll or not, we question his intelligence.

#219 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ShadowMoses900"]

We will see....the tech demo from Square was very impressive. Though I am not going to get the next Xbox unless MS makes some BIG changes.

Masenkoe

*cough*running on PC*cough*

Both Xbox Next and PS4 are basically x86 PCs, i.e. assimilated into the x86 collective.

#220 SPBoss
Member since 2009 • 3746 Posts
[QUOTE="SPBoss"]TC is definitely a troll master, never have I seen one guy p*** off so many users LOLdramaybaz
Intentional troll or not, we question his intelligence.

No one can actually be that bad; even extreme fanboys have a limit. I'm waiting for him to realise it's physically impossible to have an 8-core CPU and CrossFire/SLI in such a small form factor. It would literally burst into flames from the heat LOL

#221 deactivated-5ac102a4472fe
Member since 2007 • 7431 Posts

[QUOTE="Maddie_Larkin"]

*facepalms* (yes I really did that)

Freddie before you float off to your own dreamworld again, lets just restate that MS can NOT alter the laws of physics, 2 of the highest end of the new amd series of GPU would have an idle power consumption at atleast 400 Watts, and a peak at 700-800 that would be for the GPUs alone. Making it so expensive in power consumption that it would limit the potential buyerbase, does that sound logical? No good, you are catching on now.

Next is that unless you would see watercooling, or a case as big as a pc tower, two of those in X-fire would likely develop so much heat that the console would burn.

Given that the next step from AMDs GPU line has been suggested to be an incremental update, you can take the 7970 card, and look at how it works, with what specs, heat and so on.

And what you suggest is not plausible.

What you will likely get, is 1 GPU made for gaming, which might if MS fixes cooling, noise and whatnot, aswell as one way smaller GPU for media, and other none related services. So the biggest GPU would'nt need to draw too much power 24/7

Likewise why you would not see an 8 core AMX at 3.x Ghz, the Power needed for that is simply too expensive, the heat is too high for such a form factor. What you might get (and a very logical way more powerful then what you are used to) 6-8 cores at around 2.0 ghz (maybe slightly lower) would also be more realistic.

That is ofcourse, if your next X-box are not made the way a fridge is, and if it is, good lord, then that box would cost more to turn on then the monthly subscription it is rumored to have.

savagetwinkie

Your forgetting about ZeroCore, some of the top of the line cards idle at 15w for amd, and your also missing the point that it takes a lot more power to power an entire card than just a chip,

Riddle me this then: mighty fine idling temps, but what about when you play a game? A completely useless point for a console. The chips alone, as I stated, use FAR more power under load. Your idea pretty much just equates to "it will not burn if you do not play games".

Oh, and the power draw figures I wrote were calculated for the chips alone, as stated. Learn to read.


#222 GioVela2010
Member since 2008 • 5566 Posts
TC is definitely a troll master; never have I seen one guy p*** off so many users LOL
SPBoss
:|

#223 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="savagetwinkie"][QUOTE="Maddie_Larkin"]

*facepalms* (yes I really did that)

Freddie before you float off to your own dreamworld again, lets just restate that MS can NOT alter the laws of physics, 2 of the highest end of the new amd series of GPU would have an idle power consumption at atleast 400 Watts, and a peak at 700-800 that would be for the GPUs alone. Making it so expensive in power consumption that it would limit the potential buyerbase, does that sound logical? No good, you are catching on now.

Next is that unless you would see watercooling, or a case as big as a pc tower, two of those in X-fire would likely develop so much heat that the console would burn.

Given that the next step from AMDs GPU line has been suggested to be an incremental update, you can take the 7970 card, and look at how it works, with what specs, heat and so on.

And what you suggest is not plausible.

What you will likely get, is 1 GPU made for gaming, which might if MS fixes cooling, noise and whatnot, aswell as one way smaller GPU for media, and other none related services. So the biggest GPU would'nt need to draw too much power 24/7

Likewise why you would not see an 8 core AMX at 3.x Ghz, the Power needed for that is simply too expensive, the heat is too high for such a form factor. What you might get (and a very logical way more powerful then what you are used to) 6-8 cores at around 2.0 ghz (maybe slightly lower) would also be more realistic.

That is ofcourse, if your next X-box are not made the way a fridge is, and if it is, good lord, then that box would cost more to turn on then the monthly subscription it is rumored to have.

Maddie_Larkin

You're forgetting about ZeroCore; some of AMD's top-of-the-line cards idle at 15 W. And you're also missing the point that it takes a lot more power to run an entire card than just the chip.

Riddle me this then: mighty fine idling temps, but what about when you play a game? A completely useless point for a console. The chips alone, as I stated, use FAR more power under load. Your idea pretty much just equates to "it will not burn if you do not play games".

Oh, and the power draw figures I wrote were calculated for the chips alone, as stated. Learn to read.

The statement that the "new AMD series of GPUs would have an idle power consumption of at least 400 watts" is false.

[chart: idle power consumption]

Peak: Crysis 2 at 1920x1200, Extreme profile, representing a typical gaming power draw. Highest single reading during the test.

[chart: peak (gaming) power consumption]

Maximum: Furmark Stability Test at 1280x1024, 0xAA. This results in a very high non-game power consumption that can typically be reached only with stress-testing applications. The card is left running the stress test until power draw converges to a stable value. On cards with power-limiting systems, we disable the power-limiting system or configure it to the highest available setting, if possible. We also use the highest single reading from a Furmark run, which is obtained by measuring faster than the power limit can kick in.

[chart: maximum (Furmark) power consumption]

AMD PowerTune settings were changed.

The statement that there would be "a peak at 700-800 watts for the GPUs alone" is also false; my Silverstone SG07's power supply doesn't even go that high.


#224 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="savagetwinkie"][QUOTE="Maddie_Larkin"]

*facepalms* (yes I really did that)

Freddie before you float off to your own dreamworld again, lets just restate that MS can NOT alter the laws of physics, 2 of the highest end of the new amd series of GPU would have an idle power consumption at atleast 400 Watts, and a peak at 700-800 that would be for the GPUs alone. Making it so expensive in power consumption that it would limit the potential buyerbase, does that sound logical? No good, you are catching on now.

Next is that unless you would see watercooling, or a case as big as a pc tower, two of those in X-fire would likely develop so much heat that the console would burn.

Given that the next step from AMDs GPU line has been suggested to be an incremental update, you can take the 7970 card, and look at how it works, with what specs, heat and so on.

And what you suggest is not plausible.

What you will likely get, is 1 GPU made for gaming, which might if MS fixes cooling, noise and whatnot, aswell as one way smaller GPU for media, and other none related services. So the biggest GPU would'nt need to draw too much power 24/7

Likewise why you would not see an 8 core AMX at 3.x Ghz, the Power needed for that is simply too expensive, the heat is too high for such a form factor. What you might get (and a very logical way more powerful then what you are used to) 6-8 cores at around 2.0 ghz (maybe slightly lower) would also be more realistic.

That is ofcourse, if your next X-box are not made the way a fridge is, and if it is, good lord, then that box would cost more to turn on then the monthly subscription it is rumored to have.

clyde46

You're forgetting about ZeroCore; some of AMD's top-of-the-line cards idle at 15 W. And you're also missing the point that it takes a lot more power to run an entire card than just the chip.

Doesn't Nvidia have something like that now?

[chart: idle power consumption]


#225 dramaybaz
Member since 2005 • 6020 Posts

[QUOTE="dramaybaz"][QUOTE="SPBoss"]TC is definitely a troll master, never have I seen one guy p*** off so many users LOLSPBoss
Intentional troll or not, we question his intelligence.

No one can actually be that bad, even extreme fanboys have a limit.. I'm waiting for him to realise its physically impossible to have an 8core cpue and crossfire/sli in such a small form factor. It would literally burst into flames from the heat LOL

Well, no point in carrying on this silly thread, or answering the TC:

Game Over


#226 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="SPBoss"][QUOTE="dramaybaz"] Intentional troll or not, we question his intelligence.dramaybaz

No one can actually be that bad, even extreme fanboys have a limit.. I'm waiting for him to realise its physically impossible to have an 8core cpue and crossfire/sli in such a small form factor. It would literally burst into flames from the heat LOL

Well, no point in carrying on this silly thread, or answering the TC:

Game Over

The user can create another account.

#227 el3m2tigre
Member since 2007 • 4232 Posts

[QUOTE="SPBoss"][QUOTE="dramaybaz"] Intentional troll or not, we question his intelligence.dramaybaz

No one can actually be that bad, even extreme fanboys have a limit.. I'm waiting for him to realise its physically impossible to have an 8core cpue and crossfire/sli in such a small form factor. It would literally burst into flames from the heat LOL

Well, no point in carrying on this silly thread, or answering the TC:

Game Over

Alright, pack your bags boys, we're done here.


#228 deactivated-5ac102a4472fe
Member since 2007 • 7431 Posts

[QUOTE="Maddie_Larkin"]

[QUOTE="savagetwinkie"] Your forgetting about ZeroCore, some of the top of the line cards idle at 15w for amd, and your also missing the point that it takes a lot more power to power an entire card than just a chip, ronvalencia

Riddle me this then, mighty fine ideling temps, but what when you play a game? Completely useless point for a console. The chips alone as I stated use FAR more power under load. Your idea pretty much just equates to "it will not burn if you do not play games".

Oh and the power draw I wrote were calculated for the chip alone as stated, learn to read.

The statement for "new amd series of GPU would have an idle power consumption at atleast 400 Watts" would be false.

power_idle.gif

Peak: Crysis 2 at 1920x1200, Extreme profile, representing a typical gaming power draw. Highest single reading during the test.

power_peak.gif

Maximum: Furmark Stability Test at 1280x1024, 0xAA. This results in a very high non-game power consumption that can typically be reached only with stress testing applications. Card left running stress test until power draw converged to a stable value. On cards with power limiting systems we will disable the power limiting system or configure it to the highest available setting - if possible. We will also use the highest single reading from a Furmark run which is obtained by measuring faster than when the power limit can kick in

power_maximum.gif

AMD PowerTune settings was changed.

The statement for "a peak at 700-800 that would be for the GPUs alone" would be false i.e. my Silverstone SG07's power supply doesn't go this high.

Go look at the 7970 under load, since the OP talked about its big brother in X-fire; read before you let out your AMD marketing bull. In fact, just read the threads in which you post and do SW a favor. Again, idle power consumption matters little in a freakin console; common sense should apply, given what a console is required to do. Not to mention the TC was betting on dual-chip cards.

Dual-chip cards require a lot more idle power. Given that they cannot be just bare chips, the best guiding example we have is how the PCB is structured: take the power draw of each chip alone, cut the PCB power in half, and add the power needed for the cooling solution.
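One way to read the estimation method described above, written out as a formula (the chip/PCB/cooling decomposition is the post's; the algebra is just one plausible rendering of it):

\[
P_{\text{dual}} \approx 2\,P_{\text{chip}} + \tfrac{1}{2}\left(2\,P_{\text{PCB}}\right) + P_{\text{cooling}} = 2\,P_{\text{chip}} + P_{\text{PCB}} + P_{\text{cooling}}
\]

That is, a dual-chip card keeps both chips' full draw, saves roughly one card's worth of PCB power by sharing a single board, and adds the draw of its own cooling solution.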


#229 Shielder7
Member since 2006 • 5191 Posts

Two 8990s? Do you know what you're saying?

Bebi_vegeta
I predict we will have a month or two of silence before hermits start spouting again. Newest PCs are more powerful, bla bla bla.

#230 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="Maddie_Larkin"]

Riddle me this then, mighty fine ideling temps, but what when you play a game? Completely useless point for a console. The chips alone as I stated use FAR more power under load. Your idea pretty much just equates to "it will not burn if you do not play games".

Oh and the power draw I wrote were calculated for the chip alone as stated, learn to read.

Maddie_Larkin

The statement for "new amd series of GPU would have an idle power consumption at atleast 400 Watts" would be false.

Peak: Crysis 2 at 1920x1200, Extreme profile, representing a typical gaming power draw. Highest single reading during the test.

Maximum: Furmark Stability Test at 1280x1024, 0xAA. This results in a very high non-game power consumption that can typically be reached only with stress testing applications. Card left running stress test until power draw converged to a stable value. On cards with power limiting systems we will disable the power limiting system or configure it to the highest available setting - if possible. We will also use the highest single reading from a Furmark run which is obtained by measuring faster than when the power limit can kick in

AMD PowerTune settings was changed.

The statement for "a peak at 700-800 that would be for the GPUs alone" would be false i.e. my Silverstone SG07's power supply doesn't go this high.

Go to the 7970 under load, since OP talked abou its big brother in Xfire, read before you let out your AMD marketing bull. Infact jest read the threads in which you post and do SW a favor. Again, idle power consumption matters little in a freakin console, common sense should be requirted given what a console is required to do. Not to mention TC were betting on dual chip cards.

Dual chip cards requires alot more in idle power usage. Given that they can not just be chips, thus the best govering example we can have is to use on how the PCB were structured, each of the power draw of the chips alone, and cut PCB power in pcb in half. add the power needed for a cooling solution.

Again, your statement that the "new AMD series of GPUs would have an idle power consumption of at least 400 watts" is false. The BS is yours.

In fact, research before you post BS and do SW a favor.

From http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-12.html

[chart: gaming power consumption]

On gaming workloads, the Radeon HD 7990 (single-board XFire mode) has a ~356 watt TDP, i.e. the TDP scaling is not quite 2X when it comes to single-board XFire designs.

Building a PCB with a 1024-bit external bus is a big problem since PCB cost has remained pretty static, e.g. the Xbox 1, Xbox 360 and PS3 all have 128-bit wired VRAM.
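A quick worked check of that "not quite 2X" claim, taking ~250 W as a single HD 7970's rated board power (a spec-sheet figure, not from this thread):

\[
\frac{356\ \text{W}}{2 \times 250\ \text{W}} \approx 0.71
\]

So the single-board XFire design is rated at roughly 70% of two separate full cards, presumably from sharing one board and running the chips at lower clocks and voltages.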


#231 clyde46
Member since 2005 • 49061 Posts
[QUOTE="Bebi_vegeta"]

Two 8990s ? Do you know what you're saying ?

Shielder7
I predict we will have a month or two of silence before hermits start spouting again. Newest PCs are more powerful bla bla bla.

That wont happen this time around.

#232 faizan_faizan
Member since 2009 • 7869 Posts
[QUOTE="Shielder7"][QUOTE="Bebi_vegeta"]

Two 8990s ? Do you know what you're saying ?

clyde46
I predict we will have a month or two of silence before hermits start spouting again. Newest PCs are more powerful bla bla bla.

That wont happen this time around.

Yep, Every dev is saying that consoles will be weaker this time compared to the previous gens.

#233 clyde46
Member since 2005 • 49061 Posts
[QUOTE="faizan_faizan"][QUOTE="clyde46"][QUOTE="Shielder7"] I predict we will have a month or two of silence before hermits start spouting again. Newest PCs are more powerful bla bla bla.

That wont happen this time around.

Yep, Every dev is saying that consoles will be weaker this time compared to the previous gens.

This time around, the consoles don't have any zany new tech like the Unified Shaders from before.

#234 superclocked
Member since 2009 • 5864 Posts
[QUOTE="clyde46"][QUOTE="faizan_faizan"][QUOTE="clyde46"] That wont happen this time around.

Yep, Every dev is saying that consoles will be weaker this time compared to the previous gens.

This time around, the consoles don't have any zany new tech like the Unified Shaders from before.

I'm hoping that Microsoft is throwing money at AMD's engineering team again, but this time to go crazy improving APU design. You never know...

#235 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="clyde46"][QUOTE="faizan_faizan"] Yep, Every dev is saying that consoles will be weaker this time compared to the previous gens.superclocked
This time around, the consoles don't have any zany new tech like the Unified Shaders from before.

I'm hoping that Microsoft is throwing money at AMD's engineering team again, but this time to go crazy improving APU design. You never know...

Rory Read already stated "semi-custom", and it's not a full-custom GPU like in the Xbox 360. Unlike ATI, AMD has injected x86-64 CPU IP into Radeon HD GCN, i.e. the focus is x86 PC first.

PC AMD APUs have a limit around 125 watts, while next-gen consoles have about 170 to 200 watts, i.e. info from https://twitter.com/aegies/status/288485398785699840

just throwing this out there: the xi3 runs at 20 watts of power consumption. the wii u is 50. durango and orbis are ~170-200.

If the AMD Kaveri APU has a quad-core Steamroller CPU (it has Intel Sandy Bridge-level quad instructions per issue per CPU core) with a Radeon HD 7750-level GPU (i.e. 8 CUs = 512 SPs) and a 100 watt TDP, a semi-custom AMD Kaveri APU could have the IGP expanded to 16 CUs (1024 SPs), which is about the same level as a Radeon HD 7850. This config is very close to the PS4's AMD Liverpool APU spec.

From the PC's ~100 watt CPU socket: ~50 watts for the Radeon HD 7750-level IGP and ~50 watts for the quad-core Steamroller. If you add another Radeon HD 7750-level IGP, that would add another ~50 watts, which results in a ~150 watt APU. Two Radeon HD 7750 IGPs ~= one Radeon HD 7850.

Microsoft could select an 8-core AMD Jaguar CPU with a Radeon HD 7850-level GPU. Note that all future AMD APUs have a quad-core ARM Cortex-A5 for TrustZone DRM functions.

To solve the PCB's 128-bit wired VRAM cost issue, AMD could use the interposer method to stack VRAM chips on the chip package.

http://semiaccurate.com/2011/10/27/amd-far-future-prototype-gpu-pictured/

[photo: AMD prototype GPU on an interposer, via SemiAccurate]

AMD has been working on interposer GPUs since 2011.

---------------------

My chip-size calculation for a semi-custom APU using current processor design blocks:

Semi-custom AMD Trinity (quad-core Piledriver) with Radeon HD 7850 = ~335 mm^2

Quad-core Piledriver = ~50 watts.

Radeon HD 7850 = ~130 watts (16 CUs operational, with 4 CUs disabled for yield issues).

Total: ~180 watts.

Chip sizes for Xbox 360 http://www.anandtech.com/show/2682/4

Xenon/Zephyr's total chip size is 358 mm^2 (not including the eDRAM chip, which contains the GPU's back-end units).

---------------------

From http://www.anandtech.com/show/6567/amd-ces-2013-press-event-live-blog

[slide: AMD CES 2013 APU roadmap]

The AMD Kaveri APU is slated for a 2013 release.

------------------

Notice AMD GCN (Graphics Core Next) for the console category.

[slide: AMD GCN (Graphics Core Next) in the console category]

AMD has designs for a Radeon HD 7850-level APU with the cost structure of a 1st-gen/2nd-gen Xbox 360.
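The APU power arithmetic above as a minimal Python sketch; the per-block wattages are the rough figures from this post, and the 170-200 W envelope is the rumored console budget cited earlier (all estimates, not measurements):

```python
# Power-budget arithmetic for the hypothetical semi-custom APU described
# above, using the rough per-block wattages from this post.

ENVELOPE_W = (170, 200)  # rumored next-gen console power budget

blocks = {
    "quad-core Steamroller CPU": 50,         # ~50 W, per the post
    "HD 7750-level IGP (8 CUs)": 50,         # ~50 W
    "second HD 7750-level IGP (8 CUs)": 50,  # doubling to ~HD 7850 level
}

total = sum(blocks.values())
low, high = ENVELOPE_W
print(f"Estimated APU draw: ~{total} W")
print(f"Fits the {low}-{high} W console envelope: {total <= high}")
```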


#236 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="faizan_faizan"][QUOTE="clyde46"][QUOTE="Shielder7"] I predict we will have a month or two of silence before hermits start spouting again. Newest PCs are more powerful bla bla bla.

That wont happen this time around.

Yep, Every dev is saying that consoles will be weaker this time compared to the previous gens.

During 2005, you don't have 200 watt GPU option.

#237 ronvalencia
Member since 2008 • 29612 Posts

But you are ignoring AMD. THEY ARE MAKING THE CHIPS. They said it will have near-Avatar graphics. This will require no less than 2 GTX 8990s in crossfire + optimization. So a larger case will be required. A small case is already ruled out based on AMD speaking about the Avatar graphics thing.

fr3ddiemercury

AMD is thinking of Cinema 2.0-type compute tricks (running on two Radeon HD 4870s in XFire with a total 512-bit wide memory bus) to speed up raytracing. Even with Cinema 2.0-type tricks, it would not be Avatar CGI-level raytracing, i.e. real-time raytracing would have capped the light bounces, and it would be significantly less than non-real-time raytracers.

[image: on-package stacked VRAM]

AMD/GloFo's on-chip package with 512-bit wide VRAM for AMD GPUs, i.e. stacking the VRAM chips on the chip packaging. This avoids the PC's costly PCB approach.

As part of Cinema 2.0, Blu-ray storage would be a reasonable storage device for massive amounts of voxel data for lights.


#238 tormentos
Member since 2003 • 33793 Posts
"""Originally Posted by Hecatoncheires View Post Hey guys, I have a few questions concerning the rumour of an APU and GPU combination concerning the new PlayStation, if that is agreeable. Maybe someone can bring a little light into my darkness. I stumbled upon this slide from the Fusion Developer Summit which took place in June 2012. The slide deals with GPGPU algorithms in video games. There are a couple of details that are probably somewhat interesting when speculating about a next generation gaming console. As far as I understand, AMD argues that today GPGPU algorithms are used for visual effects only, for example physics computations of fluids or particles. That is because developers are facing an insurmountable bottleneck on systems that use homogeneous processor architectures. AMD calls it the copy overhead. This copy overhead originates from the copy work between the CPU and the GPU that can easily take longer than the processing itself. Due to this problem game developers only use GPGPU algorithms for visual effects that don't need to be sent back to the CPU. AMD's solution for this bottleneck is a unified adress space for CPU and GPU and other features that have been announced for the upcoming 2013 APUs Kabini (and Kaveri). But these features alone are only good for eliminating the copy overhead. Developers still have to deal with another bottleneck, namely the saturated GPU. This problem is critical for GPGPU in video games since the GPU has to deal with both, game code and GPGPU algorithms at once. I'm not sure whether this bottleneck only exists for thick APIs like DirectX or if it also limits an APU that is coded directly to the metal. Anyway, AMD claims that a saturated GPU makes it hard for developers to write efficient GPGPU code. To eliminate this bottleneck AMD mentions two solutions: Either you can wait for a 2014 HSA feature that is called Graphics Pre-Emption, or you can just use an APU for the GPGPU algorithms and a dedicated GPU for graphics rendering. The latter is what AMD recommends explicitly for video gaming and they even bring up the similarities to the PlayStation 3, which renownedly uses SIMD co-processors for all kinds of tasks. I would like to know what you guys think about these slides. What if AMD was building an 28nm APU for Sony that is focused solely on GPGPU, for example four big Steamroller cores with very fast threads in conjunction with a couple of MIMD engines? Combine it with a dedicated GPU and a high bandwidth memory solution and you have a pretty decent next gen console. I would also like to know if an APU + GPU + RAM system in package is possible with 2.5D stacking, which was forecasted by Yole Development for the Sony PlayStation 4, for IBM Power8 and Intel Haswell. And since Microsoft is rumoured to have a heavily customized chip with a "special sauce", could that mean they paid AMD to integrate the 2014 feature Graphics Pre-Emption in the XBox processor, so they can go with one single ultra-low latency chip instead of a FLOP-heavy system in package?""" http://www.neogaf.com/forum/showpost.php?p=46338793&postcount=2163 Find this on GA very interesting read,it was re posted on GA from Beyond3D. There rumors pointing at the 720 GPU been customize could very well mean HSA customization,while the PS4 would do it the other way around,using the APU for algorithms and the GPU for graphics rendering this is a very intriguing argument for several reason. 
1)It means the 720 (if is HSA) while been 7XXX based will not have the same bottle necks those GPU face on PC,so it will have more usable power those equivalent GPU on PC which are not HSA. 2)It also show that AMD was taking notes from how Cell and the RSX worked,which mean that Cell wasn't so useless after all and was going in the right direction,CPU helping the GPU,true is Cell helped the RSX with basically all his bottle necks,else the PS3 would not have that surpass or rival the 360 best offerings. 3)Is HSA an impact heat wise on the GPU.? Because we all know what happen last time when the Xenos included tech ahead of its time unified shaders,the heat was to much of a problem on 360 even when the console has never include as build in power supply.
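A minimal sketch of the copy-overhead idea, using a second NumPy buffer as a stand-in for GPU memory (no real GPU is involved, so the ratio is illustrative only): when the per-element work is cheap, the two copies dominate, which is exactly the bottleneck a unified address space is meant to eliminate.

```python
# Illustrating "copy overhead": when per-element work is cheap, moving the
# data can cost more than processing it. A second NumPy buffer stands in
# for GPU memory here, so this shows the ratio, not real PCIe costs.
import time
import numpy as np

host = np.random.rand(50_000_000)  # ~400 MB of float64

t0 = time.perf_counter()
device = host.copy()               # "upload": host -> device
t1 = time.perf_counter()
device *= 2.0                      # cheap kernel: one multiply per element
t2 = time.perf_counter()
result = device.copy()             # "download": device -> host
t3 = time.perf_counter()

copy_ms = ((t1 - t0) + (t3 - t2)) * 1e3
compute_ms = (t2 - t1) * 1e3
print(f"copy: {copy_ms:.0f} ms, compute: {compute_ms:.0f} ms")
# With a unified address space (HSA), the two copies disappear and only
# the compute step remains.
```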

#239 ronvalencia
Member since 2008 • 29612 Posts

From http://www.tomshardware.com/reviews/fusion-hsa-opencl-history,3262-11.html

[slide: AMD Fusion/HSA roadmap]


#240 caseypayne69
Member since 2002 • 5396 Posts
From what I understand, high-end PCs couldn't run Avatar as a game.

#241 Cranler
Member since 2005 • 8809 Posts

[QUOTE="faizan_faizan"][QUOTE="clyde46"] That wont happen this time around. clyde46
Yep, Every dev is saying that consoles will be weaker this time compared to the previous gens.

This time around, the consoles don't have any zany new tech like the Unified Shaders from before.

What zany new tech did the original xbox have that made it rival pc in 2001? Only thing holding console back is size and power restraints.


#242 Cranler
Member since 2005 • 8809 Posts

[QUOTE="clyde46"][QUOTE="faizan_faizan"] This time around, the consoles don't have any zany new tech like the Unified Shaders from before.superclocked
I'm hoping that Microsoft is throwing money at AMD's engineering team again, but this time to go crazy improving APU design. You never know...

From what I understand high-in pcs couldn't run Avatar as a game.caseypayne69
High end pc's can run unreal engine 4 level graphics but pc hardware is so ahead of the software that no games are near that besides modded games. Its very likely that console games could easily match pc graphics at the beginning of next gen when you consider how far behind the software is right now. Albeit at a lower res.


#243 superclocked
Member since 2009 • 5864 Posts

High-end PCs can run Unreal Engine 4-level graphics, but PC hardware is so far ahead of the software that no games are near that besides modded ones. *snip*

Cranler
But the original Xbox proved that old PC hardware could run games MUCH better than expected due to the optimization for its closed-box design. Good examples are Halo, Doom 3, Halo 2, Republic Commando, Battlefront, Call of Duty, etc. There is no way in hell that you could play those games on a PC with a 733 MHz Pentium 3 + MX440/FX5500, yet the Xbox ran them with no problems...

#244 04dcarraher
Member since 2004 • 23858 Posts
[QUOTE="Cranler"]

High end pc's can run unreal engine 4 level graphics but pc hardware is so ahead of the software that no games are near that besides modded games. Its very likely that console games could easily match pc graphics at the beginning of next gen when you consider how far behind the software is right now. Albeit at a lower res.

superclocked
But the original XBox proved that old PC hardware could run games MUCH better than expected due to the optimization for it's closed box design. Good examples are Halo, Doom 3, Halo 2, Republic Commando, Battlefront, Call of Duty, etc.. There is no way in hell that you could play those games on a PC with a 733MHz Pentium 3 + MX440/FX5500, yet the XBox ran them with no problems...

Its called compromises.

#245 superclocked
Member since 2009 • 5864 Posts
[QUOTE="superclocked"][QUOTE="Cranler"]

High end pc's can run unreal engine 4 level graphics but pc hardware is so ahead of the software that no games are near that besides modded games. Its very likely that console games could easily match pc graphics at the beginning of next gen when you consider how far behind the software is right now. Albeit at a lower res.

04dcarraher
But the original XBox proved that old PC hardware could run games MUCH better than expected due to the optimization for it's closed box design. Good examples are Halo, Doom 3, Halo 2, Republic Commando, Battlefront, Call of Duty, etc.. There is no way in hell that you could play those games on a PC with a 733MHz Pentium 3 + MX440/FX5500, yet the XBox ran them with no problems...

Its called compromises.

A PC with XBox specs couldn't even run those games on low. Doom 3 looked almost as good as it did on PC, just at a lower resolution. Same with Halo. Optimizations for one particular hardware set can lead to some amazing results...

#246 04dcarraher
Member since 2004 • 23858 Posts
[QUOTE="04dcarraher"][QUOTE="superclocked"]But the original XBox proved that old PC hardware could run games MUCH better than expected due to the optimization for it's closed box design. Good examples are Halo, Doom 3, Halo 2, Republic Commando, Battlefront, Call of Duty, etc.. There is no way in hell that you could play those games on a PC with a 733MHz Pentium 3 + MX440/FX5500, yet the XBox ran them with no problems...superclocked
Its called compromises.

A PC with XBox specs couldn't even run those games on low. Doom 3 looked just as good as it did on PC, just at a lower resolution. Same with Halo. Optimizations for one particular hardware set can lead to some amazing results...

Wrong,optimization isnt magic, after the coding is to a point where is using the resources(processing/memory) to the smallest degree while being able to do what it needs to do. Then they have to fight a juggling act of what to keep and what to get rid of or downgrade to allow the system to run in a resource limited environment. Doom 3 for example on xbox ran at 320x240 and had total sections of levels cut out, let alone major cuts in lighting, With a Duron 1ghz cpu and a FX 5200 I ran Halo, Doom 3, Halo 2, Republic Commando, Battlefront, and Call of Duty much higher resolution and higher settings then xbox could ever do.

#247 superclocked
Member since 2009 • 5864 Posts
[QUOTE="04dcarraher"][QUOTE="superclocked"][QUOTE="04dcarraher"] Its called compromises.

A PC with XBox specs couldn't even run those games on low. Doom 3 looked just as good as it did on PC, just at a lower resolution. Same with Halo. Optimizations for one particular hardware set can lead to some amazing results...

Wrong,optimization isnt magic, after the coding is to a point where is using the resources(processing/memory) to the smallest degree while being able to do what it needs to do. Then they have to fight a juggling act of what to keep and what to get rid of or downgrade to allow the system to run in a resource limited environment. Doom 3 for example on xbox ran at 320x240 and had total sections of levels cut out, let alone major cuts in lighting, With a Duron 1ghz cpu and a FX 5200 I ran Halo, Doom 3, Halo 2, Republic Commando, Battlefront, and Call of Duty much higher resolution and higher settings then xbox could ever do.

I hope that you're joking.. For one, Doom 3 for the XBox ran at a native resolution of 800x480 (480p), and looked nearly identical to the PC version on high.. And a 1GHz Duron and FX 5200 could not run those games at anything even approaching a playable framerate. My Athlon XP 2600+ and FX 5600 could barely run Halo and Doom 3. At 1024x768 and medium settings, both games were like a slideshow with my hardware...

#248 04dcarraher
Member since 2004 • 23858 Posts

[QUOTE="04dcarraher"][QUOTE="superclocked"]A PC with XBox specs couldn't even run those games on low. Doom 3 looked just as good as it did on PC, just at a lower resolution. Same with Halo. Optimizations for one particular hardware set can lead to some amazing results...superclocked
Wrong,optimization isnt magic, after the coding is to a point where is using the resources(processing/memory) to the smallest degree while being able to do what it needs to do. Then they have to fight a juggling act of what to keep and what to get rid of or downgrade to allow the system to run in a resource limited environment. Doom 3 for example on xbox ran at 320x240 and had total sections of levels cut out, let alone major cuts in lighting, With a Duron 1ghz cpu and a FX 5200 I ran Halo, Doom 3, Halo 2, Republic Commando, Battlefront, and Call of Duty much higher resolution and higher settings then xbox could ever do.

I hope that you're joking.. For one, Doom 3 for the XBox ran at a native resolution of 800x480 (480p), and looked nearly identical to the PC version on high.. And a 1GHz Duron and FX 5200 could not run those games at anything even approaching a playable framerate. My Athlon XP 2600+ and FX 5600 could barely run Halo and Doom 3. At 1024x768 and medium settings, both games were like a slideshow with my hardware...

Texture resolution of the Doom 3 on xbox was 320x240, Doom3 did not run at 800x480 on xbox the game was design to run at 640x480, and its funny that you think Doom3 on xbox looked nearly identical to the PC version on high..

Aside from map differences, the in-game sacrifices that the Xbox makes are as follows:
Resolution - the Xbox version of the game only goes up to 480, and most folks played on a standard TV.
Lighting - Below, you'll see a direct comparison of the lighting effects used on the Xbox versus the PC.
Textures - Some of the Xbox textures look very "washed out", pixelated, and just plain bad next to the PC version.
Bump Mapping - the Xbox version simply doesn't "pop" like the PC version, even at the same resolution.

[image: comparison inset]

[image: Xbox vs. PC direct comparison]


#249 superclocked
Member since 2009 • 5864 Posts

[QUOTE="superclocked"][QUOTE="04dcarraher"] Wrong,optimization isnt magic, after the coding is to a point where is using the resources(processing/memory) to the smallest degree while being able to do what it needs to do. Then they have to fight a juggling act of what to keep and what to get rid of or downgrade to allow the system to run in a resource limited environment. Doom 3 for example on xbox ran at 320x240 and had total sections of levels cut out, let alone major cuts in lighting, With a Duron 1ghz cpu and a FX 5200 I ran Halo, Doom 3, Halo 2, Republic Commando, Battlefront, and Call of Duty much higher resolution and higher settings then xbox could ever do. 04dcarraher

I hope that you're joking.. For one, Doom 3 for the XBox ran at a native resolution of 800x480 (480p), and looked nearly identical to the PC version on high.. And a 1GHz Duron and FX 5200 could not run those games at anything even approaching a playable framerate. My Athlon XP 2600+ and FX 5600 could barely run Halo and Doom 3. At 1024x768 and medium settings, both games were like a slideshow with my hardware...

Texture resolution of the Doom 3 on xbox was 320x240, Doom3 did not run at 800x480 on xbox the game was design to run at 640x480, and its funny that you think Doom3 on xbox looked nearly identical to the PC version on high..

Aside from map differences, the in-game sacrifices that the Xbox makes are as follows:Resolution - the Xbox version of the game only goes up to 480, and most folks played on standard TV .Lighting - Below, you'll see a direct comparison of the lighting effects used on the Xbox versus the PC.Textures - Some of the Xbox textures look very "washed out", pixelated, and just plain bad next to the PC version.Bump Mapping - the Xbox version simply doesn't "pop" like the PC version, even at the same resolution.

Once again, Doom 3 on the Xbox ran at a maximum native resolution of 480p / 16:9. That's 800x480, not 640x480 or 320x240. And you did not play Halo and Doom 3 at higher settings and resolution using a 1 GHz Duron and FX 5200. For one, the Xbox had a better GPU than the FX 5200. Second, Doom 3 will not even run on that hardware at its lowest settings. And if Halo was a slideshow using my Athlon XP 2600+ and FX 5600, I can only imagine how horribly it ran on that system...

#250 wis3boi
Member since 2005 • 32507 Posts

[QUOTE="SPBoss"][QUOTE="dramaybaz"] Intentional troll or not, we question his intelligence.dramaybaz

No one can actually be that bad, even extreme fanboys have a limit.. I'm waiting for him to realise its physically impossible to have an 8core cpue and crossfire/sli in such a small form factor. It would literally burst into flames from the heat LOL

Well, no point in carrying on this silly thread, or answering the TC:

Game Over

Thank you based god