It is actually not possible that the Xbox 720 uses a 6670 GPU because


This topic is locked from further discussion.


#1 Xplode_games
Member since 2011 • 2540 Posts

that is an off the shelf PC card. That means that ATI makes a profit selling them. MS will not purchase an off the shelf part because it burned them in the past. When manufacturing costs go down years down the road, MS doesn't benefit, only ATI does. This is what killed them with the original Xbox and why they had to discontinue it to stop bleeding money.

MS actually tried to stay with Nvidia and purchase the architecture for the 360 GPU but Nvidia refused and said they don't do business that way. ATI stepped up and actually sold MS the architecture for the Xenos that is in the 360. MS takes care of the manufacturing and ATI just designed it and oversaw the manufacturing for the first year. After that it's been all MS. And yes they have benefited from lowered production costs through the years. ATI hasn't made a dime on the 360 for years.

Ironically, Nvidia later made a deal with Sony to sell them the architecture for the RSX in the PS3. I guess they changed their minds.

Why, all of a sudden, after the GREAT SUCCESS that the 360 has been for Microsoft, would they out of the blue make the same mistake they made with the original Xbox, which was a catastrophic failure?

Next point, why would they pay ATI to design an all-new GPU specifically for their next-gen console, the Xbox 720, and ask them to please make sure it is a piece of shat? Even if they ask for a low-cost part, this is something that will be developed specifically for the 720 and will take into account the launch date of late 2013/2014. This means even if they don't try to best PC tech at launch like they did with the Xenos and 360, it will still be eons ahead of the 360 in power.

You have to literally be a complete and total imbecile to believe that the 720 will use the 6670 or anything even remotely similar. Either that or you're clueless about PC tech. Remember, MS is not buying the chips per unit, they are buying the architecture. They are paying ATI to design them an all new chip and help them get started with the manufacturing. How in the hell can this turn out to be a 6670? NO, and LMFAO to anyone who thought that would happen.

I know, you didn't read cause it's too long and you like waffle fries.


#2 Riki101
Member since 2004 • 2339 Posts
.... MS does not manufacture anything, Foxconn does. ATI and Nvidia, I'm sure, do not manufacture anything either; they design it and get a company like Foxconn to make it.

#3 Dark_man123
Member since 2005 • 4012 Posts

Let's not believe any rumors that are floating around; let's wait for E3 for something to be officially announced before we make a valid conclusion.


#4 deactivated-5cf4b2c19c4ab
Member since 2008 • 17476 Posts
You might have wanted to figure out what the rumor was before making a thread about it. It was claimed that the nextbox would use a gpu with similar performance to a 6670, not an actual 6670.

#5 Xplode_games
Member since 2011 • 2540 Posts

.... MS does not manufacture anything, Foxconn does. ATI and Nvidia, I'm sure, do not manufacture anything either; they design it and get a company like Foxconn to make it.Riki101

I didn't say Microsoft manufactures it, I said they handle the manufacturing. Obviously they subcontract it out to one of many companies in China and India. But the point is they own the architecture, and as a result it is much cheaper for them in the long run.


#6 BrunoBRS
Member since 2005 • 74156 Posts
ATI developed the wii's architecture, and i don't see nintendo whining about it.

#7 ronvalencia
Member since 2008 • 29612 Posts

that is an off the shelf PC card. That means that ATI makes a profit selling them. MS will not purchase an off the shelf part because it burned them in the past. When manufacturing costs go down years down the road, MS doesn't benefit, only ATI does. This is what killed them with the original Xbox and why they had to discontinue it to stop bleeding money.

MS actually tried to stay with Nvidia and purchase the architecture for the 360 GPU but Nvidia refused and said they don't do business that way. ATI stepped up and actually sold MS the architecture for the Xenos that is in the 360. MS takes care of the manufacturing and ATI just designed it and oversaw the manufacturing for the first year. After that it's been all MS. And yes they have benefited from lowered production costs through the years. ATI hasn't made a dime on the 360 for years.

Xplode_games

ATI's IP licensing model is similar to IBM's PPE IP licensing model.

AMD Xenos IP was also licensed to Qualcomm and STMicroelectronics. AMD Xenos IP was rendered obsolete with Radeon HD/Terascale 1 IP.

AMD Xenos related patents remains with AMD e.g. S3TC** and 3DC patents was licensed to Microsoft. **AMD owns S3TC patents.


#8 Xplode_games
Member since 2011 • 2540 Posts

You might have wanted to figure out what the rumor was before making a thread about it. It was claimed that the nextbox would use a gpu with similar performance to a 6670, not an actual 6670.ferret-gamer

That makes even less sense. The benefit of having ATI design the console from scratch is they can cut costs AND at the same time have optimal performance. The Xenos in the 360 cost MS around $120 per unit at launch to manufacture. That chip bested the best PC tech of the day. Today you can save some money but at some point it would be redundant to decrease performance because it wouldn't necessarily decrease production costs.

I can't imagine a scenario where ATI comes up with a card similar in performance to the 6670.


#9 Ninja-Hippo
Member since 2008 • 23434 Posts
ATI developed the wii's architecture, and i don't see nintendo whining about it.BrunoBRS
The Wii's architecture was already nigh on five years old the day it launched. They're not incurring any costs at all in making that toaster of a console. Microsoft, on the other hand, was paying the same costs for its ATI chips five years later as it was at launch, even though they cost a fraction of the price to produce by then. Hence why they had to discontinue the Xbox: it was becoming more popular, and they'd cut the price, but they lost money on every one they sold because they were still paying out the ass to ATI for its chipset.

#10 GTSaiyanjin2
Member since 2005 • 6018 Posts

[QUOTE="BrunoBRS"]ATI developed the wii's architecture, and i don't see nintendo whining about it.Ninja-Hippo
The Wii's architecture was already nigh on five years old the day it launched. They're not incurring any costs at all in making that toaster of a console. Microsoft, on the other hand, was paying the same costs for its ATI chips five years later as it was at launch, even though they cost a fraction of the price to produce by then. Hence why they had to discontinue the Xbox: it was becoming more popular, and they'd cut the price, but they lost money on every one they sold because they were still paying out the ass to ATI for its chipset.

You mean Nvidia. That's what the first Xbox used. And yeah, the cost never really went down on those chips.


#11 Ninja-Hippo
Member since 2008 • 23434 Posts

[QUOTE="Ninja-Hippo"][QUOTE="BrunoBRS"]ATI developed the wii's architecture, and i don't see nintendo whining about it.GTSaiyanjin2

The Wii's architecture was already nigh on five years old the day it launched. They're not incurring any costs at all in making that toaster of a console. Microsoft, on the other hand, was paying the same costs for its ATI chips five years later as it was at launch, even though they cost a fraction of the price to produce by then. Hence why they had to discontinue the Xbox: it was becoming more popular, and they'd cut the price, but they lost money on every one they sold because they were still paying out the ass to ATI for its chipset.

You mean Nvidia. That's what the first Xbox used. And yeah, the cost never really went down on those chips.

Indeed, my mistake.

#12 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ferret-gamer"]You might have wanted to figure out what the rumor was before making a thread about it. It was claimed that the nextbox would use a gpu with similar performance to a 6670, not an actual 6670.Xplode_games

That makes even less sense. The benefit of having ATI design the console from scratch is they can cut costs AND at the same time have optimal performance. The Xenos in the 360 cost MS around $120 per unit at launch to manufacture. That chip bested the best PC tech of the day. Today you can save some money but at some point it would be redundant to decrease performance because it wouldn't necessarily decrease production costs.

I can't imagine a scenario where ATI comes up with a card similar in performance to the 6670.

AMD Radeon X1800 XT still has 128 (pixel) threads, 16 ROPs, and a memory bandwidth advantage.

The 6670 is just a rumor. TDP-wise, the 7750 (entry-level GCN) can easily displace the 6670.

AMD Radeon HD 7750 with single slot cooler.

AMD Radeon HD 6670 with single slot cooler.


#13 xhawk27
Member since 2010 • 12194 Posts

The GPU in the 360 was a custom design. State of the art for its time.


#14 PinnacleGamingP
Member since 2012 • 5120 Posts
Who cares? There's no games. All this talk for Kinect games? Even with that, they couldn't deliver Milo.

#15 xhawk27
Member since 2010 • 12194 Posts

Who cares? There's no games. All this talk for Kinect games? Even with that, they couldn't deliver Milo.PinnacleGamingP

Yeah and the PS4 is going to use the vita GPU to make Move games only!


#16 ronvalencia
Member since 2008 • 29612 Posts

The GPU in the 360 was a custom design. State of the art for it's time.

xhawk27

Xenos only has 8 ROPs and 64 threads.


#17 babycakin
Member since 2012 • 1597 Posts

Both y'all fighting like lil kids up there.

Anyways, the 6670 is a rumor, like almost every single piece of news regarding Nintendo, Sony, and MS we've heard. Let's just wait and see, mmmk?


#18 xhawk27
Member since 2010 • 12194 Posts

[QUOTE="xhawk27"]

The GPU in the 360 was a custom design. State of the art for it's time.

ronvalencia

Xenos only has 8 ROPs and 64 threads.

Back in 2004 it was state of the art.


#19 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="xhawk27"]

The GPU in the 360 was a custom design. State of the art for it's time.

xhawk27

Xenos only has 8 ROPs and 64 threads.

Back in 2004 it was state of the art.

No. November 22, 2005.


#20 brofists
Member since 2011 • 2120 Posts
You might have wanted to figure out what the rumor was before making a thread about it. It was claimed that the nextbox would use a gpu with similar performance to a 6670, not an actual 6670.ferret-gamer
LMAO /thread

#21 brofists
Member since 2011 • 2120 Posts

[QUOTE="xhawk27"]

[QUOTE="ronvalencia"] Xenos only has 8 ROPs and 64 threads.

ronvalencia

Back in 2004 it was state of the art.

No. November 22, 2005.

OWNED.


#22 Chutebox
Member since 2007 • 51601 Posts

I learned a very long time ago never to argue with ronvalencia about computer tech because you will lose.


#23 M3boarder23
Member since 2012 • 207 Posts
[QUOTE="ronvalencia"][QUOTE="xhawk27"]Back in 2004 it was state of the art.brofists
No. November 22, 2005. (AnandTech Radeon X1900 launch benchmark graphs: http://images.anandtech.com/graphs/ati%20radeon%20x1900%20launch_012406100152/10664.png and http://images.anandtech.com/graphs/ati%20radeon%20x1900%20launch_012406100152/10666.png)

OWNED.

An Xbox game and an Xbox 360 launch-release port? Really? Try playing DiRT or Metro 2033 on an X1900 XT :lol:

#24 PC_Otter
Member since 2010 • 1623 Posts

On a code-to-metal basis and above 720p, an X1900 would beat Xenos, but in a console, of course, you get the benefits of optimization and, in Xenos' case, unified shader architecture. The X1900, however, can render scenes above 720p in real time that would bring Xenos to its knees; Xenos doesn't have a chance there. Xenos was deliberately designed with 720p in mind, as it is a sweet spot for output resolution and performance, and that is much of the reason Xenos has "aged" so well: it doesn't have to go for those higher resolutions.


#25 savagetwinkie
Member since 2008 • 7981 Posts

[QUOTE="xhawk27"]

The GPU in the 360 was a custom design. State of the art for it's time.

ronvalencia

Xenos only has 8 ROPs and 64 threads.

Video cards don't have threads, lol; they have processors. The Xenos has 48 processors broken up into three groups of 16 that all run the same instruction, so 3 instruction threads can be run, and each processor can execute 2 operations serially per cycle.

It also has 16 texture filter units and texture address units...

Are you really stupid enough that you added 16 + 48 and thought it was threads?
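Setting the insults aside, the arithmetic in that description is easy to sanity-check. A back-of-the-envelope sketch, using only the figures from this post plus Xenos' widely reported 500 MHz core clock, and treating each "operation" as a scalar op (a simplification of the real vector ALUs, so this is an illustration, not an official spec):

```python
# Back-of-the-envelope peak-throughput estimate for Xenos,
# using the figures from the post above (assumptions, not official specs).
ALUS = 48            # shader processors, in SIMD groups of 16
OPS_PER_CYCLE = 2    # serial ops per processor per cycle, per the post
CLOCK_HZ = 500e6     # Xenos core clock, 500 MHz

groups = ALUS // 16
peak_ops_per_sec = ALUS * OPS_PER_CYCLE * CLOCK_HZ

print(groups)                     # 3 SIMD groups
print(peak_ops_per_sec / 1e9)     # 48.0 billion ops/s under these assumptions
```

The point of the exercise: 48 and 16 are counts of different kinds of units (ALUs vs. texture units), so adding them to get "64 threads" would indeed be mixing apples and oranges.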


#26 savagetwinkie
Member since 2008 • 7981 Posts

[QUOTE="xhawk27"]

[QUOTE="ronvalencia"] Xenos only has 8 ROPs and 64 threads.

ronvalencia

Back in 2004 it was state of the art.

No. November 22, 2005.

What has this got to do with the Xenos being state of the art? ALL video cards nowadays follow its unified shader architecture. It may not have the same power, but it's still state-of-the-art technology that proved that unified shaders are far more efficient and paved the way for the future.


#27 wewantdoom4now
Member since 2012 • 1792 Posts

The original Xbox GPU wasn't an off-the-shelf GeForce 3; it had some functions of the GeForce 4.


#28 savagetwinkie
Member since 2008 • 7981 Posts

I learned a very long time ago never to argue with ronvalencia about computer tech because you will lose.

Chutebox

All he does is post random crap that has words similar to those used in the argument and post it in threads. He's got barely any understanding of what he's talking about and can't put it into context or relate it to anything that is being said... lol, he's not arguing, so you can't lose.


#29 ShadowMoses900
Member since 2010 • 17081 Posts

That whole thing is a rumor that's just going around; some rumors are saying it's the opposite. I say don't take any of them seriously; wait until E3, where (most likely) MS will show off the next Xbox system.

I'm betting that the reason the 360 has very few exclusives is that MS is making them for their next platform; the 360 is a dead system now and MS is retiring it. I expect Halo 4 to be the last exclusive for the 360.

See, MS doesn't support their older systems; they did the same thing with the last Xbox when they were working on the 360. So the whole argument "360 haz no gamez" is a pretty stupid one, as the next Xbox is going to have them. It doesn't take a genius to figure this out; it's basic common sense, folks!

But I'm not going to buy the next Xbox system. I had a 360, and I just don't like the brand, and I think MS tries to nickel-and-dime its consumers too much.


#30 DeX2010
Member since 2010 • 3989 Posts
I like waffle fries.

#31 loosingENDS
Member since 2011 • 11793 Posts

that is an off the shelf PC card. That means that ATI makes a profit selling them. MS will not purchase an off the shelf part because it burned them in the past. When manufacturing costs go down years down the road, MS doesn't benefit, only ATI does. This is what killed them with the original Xbox and why they had to discontinue it to stop bleeding money.

MS actually tried to stay with Nvidia and purchase the architecture for the 360 GPU but Nvidia refused and said they don't do business that way. ATI stepped up and actually sold MS the architecture for the Xenos that is in the 360. MS takes care of the manufacturing and ATI just designed it and oversaw the manufacturing for the first year. After that it's been all MS. And yes they have benefited from lowered production costs through the years. ATI hasn't made a dime on the 360 for years.

Ironically, Nvidia later made a deal with Sony to sell them the architecture for the RSX in the PS3. I guess they changed their minds.

Why, all of a sudden, after the GREAT SUCCESS that the 360 has been for Microsoft, would they out of the blue make the same mistake they made with the original Xbox, which was a catastrophic failure?

Next point, why would they pay ATI to design an all-new GPU specifically for their next-gen console, the Xbox 720, and ask them to please make sure it is a piece of shat? Even if they ask for a low-cost part, this is something that will be developed specifically for the 720 and will take into account the launch date of late 2013/2014. This means even if they don't try to best PC tech at launch like they did with the Xenos and 360, it will still be eons ahead of the 360 in power.

You have to literally be a complete and total imbecile to believe that the 720 will use the 6670 or anything even remotely similar. Either that or you're clueless about PC tech. Remember, MS is not buying the chips per unit, they are buying the architecture. They are paying ATI to design them an all new chip and help them get started with the manufacturing. How in the hell can this turn out to be a 6670? NO, and LMFAO to anyone who thought that would happen.

I know, you didn't read cause it's too long and you like waffle fries.

Xplode_games

I agree with everything, but it wasn't needed.

It is just a stupid IGN rumor, which means it's 100% fake. Consoles don't use PC parts in the first place, and IGN is a joke for trying to assume things.


#32 loosingENDS
Member since 2011 • 11793 Posts

That whole thing is a rumor that's just going around; some rumors are saying it's the opposite. I say don't take any of them seriously; wait until E3, where (most likely) MS will show off the next Xbox system.

I'm betting that the reason the 360 has very few exclusives is that MS is making them for their next platform; the 360 is a dead system now and MS is retiring it. I expect Halo 4 to be the last exclusive for the 360.

See, MS doesn't support their older systems; they did the same thing with the last Xbox when they were working on the 360. So the whole argument "360 haz no gamez" is a pretty stupid one, as the next Xbox is going to have them. It doesn't take a genius to figure this out; it's basic common sense, folks!

But I'm not going to buy the next Xbox system. I had a 360, and I just don't like the brand, and I think MS tries to nickel-and-dime its consumers too much.

ShadowMoses900

They have 10x more exclusive games than the PS3 and Wii in 2012, so how are they retiring the 360?


#33 TheXFiles88
Member since 2008 • 1040 Posts

Wow, people still believe in some BS random rumors? Have we learned anything from the past? Even when the devs were working on the Xbox 2 SDK kits based on Mac G5 systems, those were proven not to be the final specs. But people still keep talking about speculation? You guys must be bored to death.


Xbox 360 Specs Put Power Mac G5 to Shame

Dan Knight - 2005.05.13 - Tip Jar

Microsoft unveiled the Xbox 360 last night. And just what does that have to do with Macs?

A lot! The Xbox 360, due on the market in time for the holiday retail frenzy, has more horsepower than Apple's Dual 2.7 GHz Power Mac G5.

A gaming system!

At the heart of the Xbox 360 is a three-core IBM PowerPC processor running at 3.2 GHz. The clock speed alone is nearly 20% higher than the fastest Power Mac, and a three-core CPU is bound to outperform a pair of single-core CPUs by up to 50%.

That's just wrong. IBM can't produce 3.0 GHz G5 processors for Apple - but for Microsoft they can reach 3.2 GHz? It just doesn't make sense.

Or maybe we'll finally see single- and dual-core G5 CPUs reach and surpass the 3.0 GHz mark as IBM supplies the entire gaming industry (Sony, Microsoft, and Nintendo) with PowerPC brains for their consoles.

Nobody seems to know how much Microsoft will sell the Xbox 360 for, but it's bound to be a whole lot less than the $2,999 top-end Power Mac G5.

How does it compare? The Xbox 360 will be at least 50% more powerful due to a three-core CPU. 512 MB of RAM will be standard, shared by the CPU and a 500 MHz ATI graphics processor. The Xbox 360 will be liquid cooled, has a removable 20 GB hard drive, and includes a 12x DVD-ROM drive.

Like the Power Mac G5, it has USB 2.0 ports and is WiFi ready. Additionally, it can take four game controllers at once. And it's bound to cost a lot less than any Power Mac G5, even the 1.8 GHz single-CPU one.

I can imagine a huge after-market for hard drives, SuperDrives, and memory upgrades, which could turn the Xbox 360 into a killer digital video recorder. And you just know that the Linux community is going to port their favorite OS to this hardware as fast as they can.

Where does that leave Apple?

Good question. Apple has sold a lot of Power Mac G5 systems to Microsoft, and Microsoft has customized them as Xbox 360 development systems. That means it shouldn't be too hard to port Macintosh software - and possibly even OS X - to the new hardware.


#34 WadeKuun
Member since 2012 • 161 Posts
You might have wanted to figure out what the rumor was before making a thread about it. It was claimed that the nextbox would use a gpu with similar performance to a 6670, not an actual 6670.ferret-gamer
My thoughts right here. I have no idea what's going on, but OP's post is straight fact; if only it held an argument to talk about, lol.

#35 glez13
Member since 2006 • 10314 Posts

The thing that would actually surprise me is if anyone actually believed that the 6670 rumor meant they would literally stick a 6670 GPU into the Nextbox case.


#36 GTSaiyanjin2
Member since 2005 • 6018 Posts

The thing that would actually surprise me is if anyone actually believed that the 6670 rumor meant they would literally stick a 6670 GPU into the Nextbox case.

glez13

I think that would be more sad than surprising.


#37 Dave_NBF
Member since 2005 • 1974 Posts

On a code-to-metal basis and above 720p, an X1900 would beat Xenos, but in a console, of course, you get the benefits of optimization and, in Xenos' case, unified shader architecture. The X1900, however, can render scenes above 720p in real time that would bring Xenos to its knees; Xenos doesn't have a chance there. Xenos was deliberately designed with 720p in mind, as it is a sweet spot for output resolution and performance, and that is much of the reason Xenos has "aged" so well: it doesn't have to go for those higher resolutions.

PC_Otter

I completely agree. Your knowledge of CPU/GPU logic processes is very intuitive and fact-based. I imagine you are some sort of engineering student/grad/worker?

A big reason any GPU similar to a 6670 wouldn't surprise me is that MS and Sony can't afford the risk of selling high-end hardware at a front-end loss anymore. Why not do something similar to Ninty and focus on innovative software to turn profits on the front end (hardware price offset by a sufficient MSRP to profit) and also on the back end (sales/distribution of software and online features/licensing fees)?

With that, a 6670 is, on paper, '6x' more powerful than the Xenos. However, that != 6x improvement in visual fidelity; it's just a theoretical improvement. Moving to 1080p, with roughly double the pixels to power, may require a GPU that is 6 times more powerful, especially when you couple the higher resolution with DX11 effects, which mid-range GPUs really struggle with. Ironically, AMD cards are currently known to handle tessellation much more poorly (at least on the mobile front).

I'm no engineer, just a software/hardware enthusiast, but I do enjoy reading the reasoning behind your posts.
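The resolution arithmetic in the post above is worth pinning down: 1080p is not 2x 720p in pixel count but 2.25x, which is part of why the "6x on paper" headroom gets eaten so quickly. A quick check:

```python
# Pixel-count ratio between 1080p and 720p: "2x the resolution"
# is really a 2.25x increase in pixels pushed per frame.
pixels_1080p = 1920 * 1080
pixels_720p = 1280 * 720

ratio = pixels_1080p / pixels_720p
print(pixels_720p)    # 921600
print(pixels_1080p)   # 2073600
print(ratio)          # 2.25
```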


#38 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="xhawk27"]

The GPU in the 360 was a custom design. State of the art for it's time.

savagetwinkie

Xenos only has 8 ROPs and 64 threads.

Video cards don't have threads, lol; they have processors. The Xenos has 48 processors broken up into three groups of 16 that all run the same instruction, so 3 instruction threads can be run, and each processor can execute 2 operations serially per cycle.

It also has 16 texture filter units and texture address units...

Are you really stupid enough that you added 16 + 48 and thought it was threads?

LOL, modern video cards have hyper-threading/simultaneous multithreading (SMT) type designs, e.g. NVIDIA's Giga-Threads and AMD's Ultra-Threads.

Link

XENOS is capable of processing 64 threads simultaneously; this is to make sure that all elements are being utilized and there is minimal or no stalling of the graphics architecture. So even if an ALU is waiting for a texture sample, that thread would not stall the ALU, as it would be working on something else from another thread. This effectively hides tasks that would normally have a large latency penalty attached to them. ATI suggests that their testing achieves an average of 95% efficiency of the shader array in general-purpose graphics usage conditions.

Please update your GPU information to current GPU design, i.e. heavily hyper-threaded/SMT type designs. The stupid one is you.

What is simultaneous multithreading?

Simultaneous multithreading, often abbreviated as SMT, is a technique for improving the overall efficiency of superscalar CPUs with hardware multithreading. SMT permits multiple independent threads of execution to better utilize the resources provided by modern processor architectures.

AMD Xenos includes its own version of a hardware SMT-type front-end design, and it has 64 of them. AMD Radeon X1800 has 128 SMT-type threads for its pixel shaders.

NVIDIA GeForce 8X GPUs have "Giga-Threads".

Beyond the ALUs/processing cores, both NVIDIA and AMD have invested significant resources in uncore sections.

Your "video cards don't have threads" statement is a lol.
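The latency-hiding mechanism being argued about here can be shown with a toy model (the numbers below are made up for illustration, not Xenos specs): one ALU issues one instruction per cycle, each thread then waits a fixed number of cycles on memory, and with enough resident threads the ALU never idles.

```python
# Toy model of GPU latency hiding: each thread does 1 cycle of ALU work,
# then waits MEM_LATENCY cycles on memory. With enough threads resident,
# the ALU always finds a ready thread and never stalls.
MEM_LATENCY = 10  # illustrative only, not a real Xenos figure

def alu_utilization(num_threads: int, total_cycles: int = 10_000) -> float:
    ready_at = [0] * num_threads   # cycle at which each thread can issue again
    busy = 0
    for cycle in range(total_cycles):
        for t in range(num_threads):
            if ready_at[t] <= cycle:                   # found a ready thread
                busy += 1                              # ALU works 1 cycle for it
                ready_at[t] = cycle + 1 + MEM_LATENCY  # then it waits on memory
                break                                  # one issue per cycle
    return busy / total_cycles

print(alu_utilization(1))    # ~0.09: one thread leaves the ALU idle most cycles
print(alu_utilization(16))   # 1.0: 16 threads fully hide the 10-cycle latency
```

This is the whole point of keeping many threads in flight: the 95% shader-array efficiency claim quoted above is exactly this effect at real-hardware scale.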


#39 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="Chutebox"]

I learned a very long time ago never to argue with ronvalencia about computer tech because you will lose.

savagetwinkie

All he does is post random crap that has words similar to those used in the argument and post it in threads. He's got barely any understanding of what he's talking about and can't put it into context or relate it to anything that is being said... lol, he's not arguing, so you can't lose.

Random crap is you.


#40 tjricardo089
Member since 2010 • 7429 Posts

E3 this year is gonna be BEAST!


#41 lundy86_4
Member since 2003 • 62037 Posts

You might have wanted to figure out what the rumor was before making a thread about it. It was claimed that the nextbox would use a gpu with similar performance to a 6670, not an actual 6670.ferret-gamer

Yep, this.


#42 jakarai
Member since 2008 • 4289 Posts
The article didn't say the Xbox 720 uses a 6670; it said the Xbox 720 is about as powerful as a 6670. Pretty pathetic that a multibillion-dollar corporation would go this route.

#43 ronvalencia
Member since 2008 • 29612 Posts

Wow, people still believe in some BS random rumors? Have we learned anything from the past? Even when the devs were working on the Xbox 2 SDK kits based on Mac G5 systems, those were proven not to be the final specs. But people still keep talking about speculation? You guys must be bored to death.


Xbox 360 Specs Put Power Mac G5 to Shame

TheXFiles88

Clock-speed madness, i.e. 3.2 GHz for the PPE. It didn't factor in IPC and L2 cache speed.

#44 xhawk27
Member since 2010 • 12194 Posts

[QUOTE="savagetwinkie"]

[QUOTE="Chutebox"]

I learned a very long time ago never to argue with ronvalencia about computer tech because you will lose.

ronvalencia

all he does is post random crap that has similar words he's used in the argument and posts it in threads; he's got barely any understanding of what he's talking about and can't put it into context or relate it to anything being said... lol, he's not arguing, so you can't lose.

Random crap is you.

No, you are posting random crap trying to sound like you know everything about video cards. Fact is, I can post a link (once I find it) from ATI back in 2005 listing the specs for the Xenos. Unified shaders started with Xenos.

#45 Devil-Itachi
Member since 2005 • 4387 Posts
When people say next-generation console X will have GPU Y, it doesn't mean it will literally have that GPU. It usually means the console's GPU is either based on it or within that ballpark in terms of power.
#46 muscleserge
Member since 2005 • 3307 Posts
Everybody should chill out for a second. Sure, compared to a PC a 6670 is weak, and even for a traditional "next-gen" home console it seems weak. However, a 6670 or its equivalent in a tablet form factor would be extraordinary. Just food for thought, especially with recent market trends.
#47 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="savagetwinkie"] all he does is post random crap that has similar words he's used in the argument and posts it in threads; he's got barely any understanding of what he's talking about and can't put it into context or relate it to anything being said... lol, he's not arguing, so you can't lose.

xhawk27

Random crap is you.

No, you are posting random crap trying to sound like you know everything about video cards. Fact is, I can post a link (once I find it) from ATI back in 2005 listing the specs for the Xenos. Unified shaders started with Xenos.

Sorry, savagetwinkie's "video cards don't have threads" assertion is wrong. Modern video cards have threads.

From amd.com/us/products/notebook/graphics/ati-mobility-hd-x1000/x1800/Pages/x1800-specs.aspx

Ultra-Threaded Shader Engine

  • Up to 128 simultaneous pixel threads.

That's 8 threads per pixel shader processor.
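As a quick sanity check, that figure follows from simple division (a minimal sketch; the 16-pixel-shader-processor count is my assumption about the X1800, not something stated on the AMD spec page quoted above):

```python
# Sanity check on the threads-per-processor figure quoted above.
# Assumption (not from the quoted AMD spec page): the X1800 has
# 16 pixel shader processors.
simultaneous_pixel_threads = 128
pixel_shader_processors = 16

threads_per_processor = simultaneous_pixel_threads // pixel_shader_processors
print(threads_per_processor)  # → 8
```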

What is an Ultra-Threaded Processor?

An Ultra-Threaded processor is a component on a GPU, or graphics processing unit, that allocates the resources of that GPU so none are idle. It enables an ATI graphics card to boost its efficiency by distributing processor information, such as tasks, to each of the GPU's cores using proprietary protocols.

Are you referring to this 2005 presentation?

For raster-based games, Xenos's rasterization is inferior, i.e. only 8 ROPs.

My point: one must not generalise.

#48 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"] No. November 22, 2005
http://images.anandtech.com/graphs/ati%20radeon%20x1900%20launch_012406100152/10664.png
http://images.anandtech.com/graphs/ati%20radeon%20x1900%20launch_012406100152/10666.png
[QUOTE="brofists"] OWNED.
M3boarder23

An Xbox game and an Xbox 360 launch port? Really? Try playing DiRT or Metro 2033 on an X1900 XT :lol:

Metro 2033 PC's texture resolution is 2048x2048, while the Xbox 360 version's texture resolution is 1024x1024.

On a Radeon X1900 XT 256MB, Crysis 2 PC plays like the Xbox 360 version.

Xenos's ROPs weren't "state of the art".
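For scale, the raw memory cost of that texture-resolution gap works out like this (a sketch assuming uncompressed 32-bit RGBA and no mipmaps; real games use block-compressed formats, so the absolute sizes are illustrative, but the 4x ratio holds):

```python
def texture_bytes(width, height, bytes_per_pixel=4):
    """Raw size of one uncompressed RGBA8 texture, no mipmaps."""
    return width * height * bytes_per_pixel

pc_texture = texture_bytes(2048, 2048)    # 16 MiB per texture
x360_texture = texture_bytes(1024, 1024)  # 4 MiB per texture
print(pc_texture // x360_texture)  # → 4
```

Doubling each texture dimension quadruples the footprint, which is why the PC version's assets demand far more video memory than the 360 port's.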

#49 xhawk27
Member since 2010 • 12194 Posts

[QUOTE="M3boarder23"][QUOTE="brofists"] OWNED.
ronvalencia

An Xbox game and an Xbox 360 launch port? Really? Try playing DiRT or Metro 2033 on an X1900 XT :lol:

Metro 2033 PC's texture resolution is 2048x2048, while the Xbox 360 version's texture resolution is 1024x1024.

On a Radeon X1900 XT 256MB, Crysis 2 PC plays like the Xbox 360 version.

Xenos's ROPs weren't "state of the art".

Still doesn't change the fact that Xenos was state of the art among GPUs in 2005. Also, if you think an X1900 XT can run Crysis 2 at the 360's resolution, it cannot.

#50 BlbecekBobecek
Member since 2006 • 2949 Posts

I can't imagine a scenario where ATI comes up with a card similar in performance to the 6670.

Xplode_games

That's purely because of your lack of imagination. MS is pulling off a Wii; deal with it.