It is actually not possible that the Xbox 720 uses a 6670 GPU because

This topic is locked from further discussion.


#51 topsemag55
Member since 2007 • 19063 Posts

Still doesn't change the fact that Xenos was state of the art in GPUs in 2005.

xhawk27

Hardly. Nvidia released the 7800 GTX in 2005.


#52 WiiCubeM1
Member since 2009 • 4735 Posts

Specs or STFU.


#53 True_Gamer_
Member since 2006 • 6750 Posts
Only a couple of months after the Xbox 360 launched, the X1900 XTX was released, so it was outdated two months into its lifetime :D

#54 LustForSoul
Member since 2011 • 6404 Posts
The Xbox uses cards differently. Not sure how it all works, but I'm sure they could make something out of it. If it's true, of course.

#55 iamrob7
Member since 2007 • 2138 Posts

I would say it's unlikely the final Xbox 720 will use something based on the 6670, because it is so ludicrously underpowered. Even a 6850 would be ludicrously underpowered for a 2013 console. They are going to need far better technology than that to make the 720 anything more than a bit of a joke power-wise.


#56 kraken2109
Member since 2009 • 13271 Posts

It's never been said it will use the 6670.

It's said it will use a card based on the 6670.


#57 razgriz_101
Member since 2007 • 16875 Posts

[QUOTE="ronvalencia"]

[QUOTE="M3boarder23"] An Xbox Game, and a Xbox 360 launch release port? Really? Try playing DiRT or Metro 2033 on a X1900 XT :lol: xhawk27

Metro 2033 PC's texture resolution is 2048x2048 while XBox 360's version texture resolution is 1024x1024.

Radeon X1900 XT 256MB, Crysis 2 PC plays like Xbox 360's version.

Xenos's ROPs wasn't "state of the art".

Still doesn't change the fact that Xenos was state of the art in GPUs in 2005. Also if you think X1900 XT can run Crysis 2 at the 360 res it can not.

the only thing that was state of the art was the 10mb EDRAM for the Anti Aliasing and post procession and on top of that the core die its actually based off is the XT1800.

it only had 1 thing going for it so stop kidding yourself it was essentially a modifed pc gpu much like the geforce 3 in the original xbox.
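
To put those texture sizes in memory terms, here is a minimal back-of-the-envelope sketch, assuming uncompressed 32-bit RGBA and ignoring mipmaps and block compression (which shipping games actually use); the 4x ratio holds either way:

```c
#include <stdio.h>

/* Rough texture-memory arithmetic for the 2048x2048 (PC) vs
   1024x1024 (360) figures quoted above.  Assumes uncompressed
   32-bit RGBA, no mipmaps. */
int main(void) {
    const double MiB = 1024.0 * 1024.0;
    double pc   = 2048.0 * 2048.0 * 4 / MiB;  /* bytes per PC texture  */
    double x360 = 1024.0 * 1024.0 * 4 / MiB;  /* bytes per 360 texture */
    printf("PC  texture: %.0f MiB\n", pc);    /* 16 MiB */
    printf("360 texture: %.0f MiB\n", x360);  /*  4 MiB */
    printf("ratio: %.0fx\n", pc / x360);      /*  4x    */
    return 0;
}
```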


#58 muscleserge
Member since 2005 • 3307 Posts

[QUOTE="xhawk27"]

[QUOTE="ronvalencia"]

Metro 2033 PC's texture resolution is 2048x2048 while XBox 360's version texture resolution is 1024x1024.

Radeon X1900 XT 256MB, Crysis 2 PC plays like Xbox 360's version.

Xenos's ROPs wasn't "state of the art".

razgriz_101

Still doesn't change the fact that Xenos was state of the art in GPUs in 2005. Also if you think X1900 XT can run Crysis 2 at the 360 res it can not.

the only thing that was state of the art was the 10mb EDRAM for the Anti Aliasing and post procession and on top of that the core die its actually based off is the XT1800.

it only had 1 thing going for it so stop kidding yourself it was essentially a modifed pc gpu much like the geforce 3 in the original xbox.

Except that it is also the thing that limits the 360 to 720p or lower, the actual amount of 10mb that is, proly why PC cards don't go that route.
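
The arithmetic behind that 720p ceiling is easy to check. A minimal sketch, assuming 4 bytes of color plus 4 bytes of depth/stencil per sample and ignoring the 360's real tiling and format details:

```c
#include <stdio.h>

/* Why 10 MiB of eDRAM caps the 360 near 720p: a render target needs
   color + depth/stencil per sample, and MSAA multiplies the sample
   count.  (The real console tiles larger targets rather than failing
   outright.) */
static double fb_mib(int w, int h, int msaa) {
    return (double)w * h * msaa * (4 + 4) / (1024.0 * 1024.0);
}

int main(void) {
    const double edram = 10.0;  /* MiB of eDRAM */
    struct { int w, h, msaa; } cases[] = {
        {1280, 720, 1},   /*  7.03 MiB -> fits         */
        {1280, 720, 2},   /* 14.06 MiB -> needs tiling */
        {1920, 1080, 1},  /* 15.82 MiB -> needs tiling */
    };
    for (int i = 0; i < 3; i++) {
        double need = fb_mib(cases[i].w, cases[i].h, cases[i].msaa);
        printf("%dx%d %dxMSAA: %5.2f MiB -> %s\n",
               cases[i].w, cases[i].h, cases[i].msaa, need,
               need <= edram ? "fits" : "needs tiling");
    }
    return 0;
}
```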

#59 Jankarcop
Member since 2011 • 11058 Posts

Man, all you console people care about is gfx and hardware, always such elitists, I swear.


#60 savagetwinkie
Member since 2008 • 7981 Posts

[QUOTE="savagetwinkie"]

[QUOTE="ronvalencia"] Xenos only has 8 ROPs and 64 threads.

ronvalencia

video cards don't have threads lol, they have processors, the xenos has 48 processors broken up into three groups of 16 that all run the same instruction, 3 instructions threads can be run, and each processors can execute 2 operations serially per cycle.

It also has 16 texture filter units and texture address units...

are you really stupid enough that you added 16 + 48 and thought it was threads?

LOL, modern video cards have hyper-threading/simultaneous multithreading (SMT) type designs, e.g. NVIDIA's GigaThread and AMD's Ultra-Threading.

Link

XENOS is capable of processing 64 threads simultaneously; this is to make sure that all elements are being utilized and there is minimal or no stalling of the graphics architecture. So even if an ALU may be waiting for a texture sample to be fetched, that thread would not stall the ALU, as it would be working on something else from another thread. This effectively hides tasks that would normally have a large latency penalty attached to them. ATI suggests that their testing achieves an average of 95% efficiency of the shader array in general-purpose graphics usage conditions.

Please update your GPU information to current GPU design, i.e. heavily hyper-threaded/SMT-type designs. The stupid one is you.

What is simultaneous multithreading?

Simultaneous multithreading, often abbreviated as SMT, is a technique for improving the overall efficiency of superscalar CPUs with hardware multithreading. SMT permits multiple independent threads of execution to better utilize the resources provided by modern processor architectures.

AMD's Xenos includes its own version of a hardware SMT-type front-end design, and it has 64 of those thread contexts. The AMD Radeon X1800 has 128 SMT-type threads for its pixel shaders.

NVIDIA GeForce 8-series GPUs have "GigaThread".

Beyond the ALUs/processing cores, both NVIDIA and AMD have invested significant resources in the uncore sections.

Your "video cards don't have threads" statement is a lol.

You must think the G80 has 12,000+ threads! You're an idiot; hardware doesn't have threads, software does, and hardware is designed with features for running threads. Xenos can manage up to, "up to" being the key phrase here, 64 threads. It's only got 48 processors, though, compared to the X1800's 512 threads, but that's more than the X1800's 16 pixel shader processors and 8 vertex shader processors.

Secondly, you're still spouting nonsense and don't understand anything. You're comparing threads on an X1800 to Xenos and trying to say Xenos wasn't state of the art by pointing out thread engines? Why don't you actually dig up relevant information on Xenos's unified processors ditching the fixed pixel/vertex setup? That actually makes the threads more useful, because each of the 64 threads can run on any of the 48 processors, whereas the X1800's 512 threads have to be split between vertex shaders and pixel shaders.

You never bother putting anything into useful context; you just spout random facts about the technology people are talking about.
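
The latency-hiding idea this argument keeps circling is easy to demonstrate with a toy model. A minimal sketch, with all numbers invented (one shared ALU, a 100-cycle texture fetch after every 10 ALU ops, one op issued per cycle), showing why a GPU wants many hardware thread contexts, whatever you call them:

```c
#include <stdio.h>

#define NUM_CYCLES  100000L
#define TEX_LATENCY 100   /* cycles a texture fetch takes (made up) */
#define ALU_BURST   10    /* ALU ops a thread runs between fetches (made up) */
#define MAX_THREADS 64

/* One ALU shared by `nthreads` hardware thread contexts.  Each thread
   runs ALU_BURST ops, then stalls TEX_LATENCY cycles on a texture
   fetch.  Every cycle the scheduler issues one op from the first ready
   thread, if any.  Returns the fraction of cycles the ALU did work. */
static double utilization(int nthreads) {
    int alu_left[MAX_THREADS], wait_left[MAX_THREADS];
    long busy = 0;

    for (int t = 0; t < nthreads; t++) {
        alu_left[t] = ALU_BURST;
        wait_left[t] = 0;
    }
    for (long c = 0; c < NUM_CYCLES; c++) {
        for (int t = 0; t < nthreads; t++)    /* fetches in flight tick down */
            if (wait_left[t] > 0)
                wait_left[t]--;
        for (int t = 0; t < nthreads; t++) {  /* issue from first ready thread */
            if (wait_left[t] == 0) {
                if (--alu_left[t] == 0) {     /* burst done: start a fetch */
                    alu_left[t] = ALU_BURST;
                    wait_left[t] = TEX_LATENCY;
                }
                busy++;
                break;
            }
        }
    }
    return (double)busy / NUM_CYCLES;
}

int main(void) {
    const int counts[] = { 1, 4, 16, 64 };
    for (int i = 0; i < 4; i++)
        printf("%2d thread contexts -> ALU busy %3.0f%% of cycles\n",
               counts[i], 100.0 * utilization(counts[i]));
    return 0;
}
```

With one context the ALU idles roughly 90% of the time waiting on fetches; with enough contexts to cover the latency, it stays near 100% busy, which is the same effect the 95% shader-array efficiency figure quoted above is describing.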


#61 savagetwinkie
Member since 2008 • 7981 Posts

[QUOTE="xhawk27"]

[QUOTE="ronvalencia"]

Metro 2033 PC's texture resolution is 2048x2048 while XBox 360's version texture resolution is 1024x1024.

Radeon X1900 XT 256MB, Crysis 2 PC plays like Xbox 360's version.

Xenos's ROPs wasn't "state of the art".

razgriz_101

Still doesn't change the fact that Xenos was state of the art in GPUs in 2005. Also if you think X1900 XT can run Crysis 2 at the 360 res it can not.

the only thing that was state of the art was the 10mb EDRAM for the Anti Aliasing and post procession and on top of that the core die its actually based off is the XT1800.

it only had 1 thing going for it so stop kidding yourself it was essentially a modifed pc gpu much like the geforce 3 in the original xbox.

i'm pretty sure the ps2 had edram, edram is not a new thing

#62 muscleserge
Member since 2005 • 3307 Posts
[QUOTE="razgriz_101"]

[QUOTE="xhawk27"]

Still doesn't change the fact that Xenos was state of the art in GPUs in 2005. Also if you think X1900 XT can run Crysis 2 at the 360 res it can not.

savagetwinkie

the only thing that was state of the art was the 10mb EDRAM for the Anti Aliasing and post procession and on top of that the core die its actually based off is the XT1800.

it only had 1 thing going for it so stop kidding yourself it was essentially a modifed pc gpu much like the geforce 3 in the original xbox.

i'm pretty sure the ps2 had edram, edram is not a new thing

same with the PSP, I think.


#64 WiiCubeM1
Member since 2009 • 4735 Posts

Man, all you console people care about is gfx and hardware, always such elitists, I swear.

Jankarcop

A hermit hating on console gamers for caring about hardware is like Hitler hating on someone for being racist. It just doesn't work.


#65 ShadowDragon78
Member since 2011 • 371 Posts
I like waffle fries.

DeX2010

Yes... yes, they are quite good.

#66 Stalkerfieldsis
Member since 2011 • 659 Posts

6670 equivalent, b*tch please!! How can that impress me when I'm rocking this GTX 580:

It even looks cool, like it was built by Tony Stark or something XD


#67 lpjazzman220
Member since 2008 • 2249 Posts

I have no clue what you're talking about... the 360 and PS3 were already behind in graphics at release... the 8800 blew them out of the water day one...


#68 M3boarder23
Member since 2012 • 207 Posts

I have no clue what you're talking about... the 360 and PS3 were already behind in graphics at release... the 8800 blew them out of the water day one...

lpjazzman220
:|

#69 wewantdoom4now
Member since 2012 • 1792 Posts

The GTX 580 is a powerful card, but it's a big son of a *****.

This card's a lot more sexy.


#70 wewantdoom4now
Member since 2012 • 1792 Posts

No more buying high-end cards for me, lol. I bought an X1800 XT for like $400 after trading in my 6800 GT, which I might have paid like $400 for, lmao. The card died and I couldn't get it replaced.

And I bought a 7800 GTX for $520 in 2005, and that died, but I got a replacement, lol: a freaking 8600 GTS. Oh well, better than the 7800 GTX.

Midrange is where it's at, like $200 8800 GT type cards.


#71 wewantdoom4now
Member since 2012 • 1792 Posts

:lol: I have a $520 8600 GTS, nice huh?


#72 M3boarder23
Member since 2012 • 207 Posts

:lol: I have a $520 8600 GTS, nice huh?

wewantdoom4now
lol PEEC Gaming

#73 Xplode_games
Member since 2011 • 2540 Posts

I have no clue what you're talking about... the 360 and PS3 were already behind in graphics at release... the 8800 blew them out of the water day one...

lpjazzman220

The 8800 GTX was released ONE FULL YEAR after the Xbox 360, in late 2006. It's true that it blows away the 360 and PS3, and it makes sense for it to blow away the 360, as the 360 is a year older. However, there is no excuse for it to outperform the PS3, especially with the PS3's launch price of $599!!! I have been arguing that point on this forum for years, and it's mostly fallen on deaf ears.


#74 markop2003
Member since 2005 • 29917 Posts
.... MS does not manufacture anything, Foxconn does. ATI and Nvidia, I'm sure, do not manufacture anything; they design it and get a company like Foxconn to make it.

Riki101

Actually they do: AMD makes chips through their GlobalFoundries subsidiary, and I think Nvidia uses TSMC.

#75 kraken2109
Member since 2009 • 13271 Posts
[QUOTE="lpjazzman220"]

I have no clue what you're talking about... the 360 and PS3 were already behind in graphics at release... the 8800 blew them out of the water day one...

M3boarder23
:|

What's hard to understand about that?

#76 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="Riki101"].... ms does not manufactur anything, foxxconn does, ATI and nvidia im sure does not manufacture anything, they design it, and get a company like foxxconn to make itmarkop2003
Actually they do, AMD makes chips through their Global Foudaries subsidiary i think Nvidia use TSMC.

AMD uses both GoFlo and TSMC. AMD GCN is fab on TSMC LP-28nm process.

#77 nameless12345
Member since 2010 • 15125 Posts

Actually a 6670-based GPU wouldn't be all that bad:

http://www.youtube.com/watch?v=BzWTGumL-ME

With some optimization they should pull off a stable 30 fps. Just forget about "Avatar graphix".


#78 ionusX
Member since 2009 • 25778 Posts

I still could kick a console's ---- with my old machine, bwahahaha. In about a month, expect to be able to build a PC for $500 or so that could beat TWO 720s glued together, hahaha. It's actually pretty disgraceful: by the time a 720 launches, its titles won't even be challenging the majority of users on Steam, and I imagine the world that plays games on PC. In fact, it's actually possible to match the next consoles using laptops with AMD INTEGRATED VIDEO. That's so disgraceful I can't imagine why they would bother making a next-gen console.. it's just futile bottlenecking of the market.

Dirty primate console developers.

PC is definitely about to strike up the band and bury you kids.

http://www.youtube.com/watch?v=oKPzVV2DExk&feature=related


#79 nameless12345
Member since 2010 • 15125 Posts

Dirty primate console developers.

ionusX


#80 Cranler
Member since 2005 • 8809 Posts

[QUOTE="lpjazzman220"]

I have no clue what you're talking about... the 360 and PS3 were already behind in graphics at release... the 8800 blew them out of the water day one...

Xplode_games

The 8800 GTX was released ONE FULL YEAR after the Xbox 360, in late 2006. It's true that it blows away the 360 and PS3, and it makes sense for it to blow away the 360, as the 360 is a year older. However, there is no excuse for it to outperform the PS3, especially with the PS3's launch price of $599!!! I have been arguing that point on this forum for years, and it's mostly fallen on deaf ears.

Blu-ray players were extremely expensive back then.

#81 M3boarder23
Member since 2012 • 207 Posts
[QUOTE="M3boarder23"][QUOTE="lpjazzman220"]

I have no clue what you're talking about... the 360 and PS3 were already behind in graphics at release... the 8800 blew them out of the water day one...

kraken2109
:|

What's hard to understand about that?

U dumbass :lol:

#82 GotNugz
Member since 2010 • 681 Posts

Come on, enough of the 6670 rumors. They are not going to use that; if anything, it is disinformation aimed at Sony and Nintendo. I'm not saying they will use the most powerful 2013 card, but when we finally hear about it, I'm certain it will be very comparable to an HD 7970 and perhaps share some features of the HD 8000 series.


#83 GotNugz
Member since 2010 • 681 Posts

[QUOTE="kraken2109"][QUOTE="M3boarder23"] :|M3boarder23
What's hard to understand about that?

U dumbass :lol:

yep didn't know the gtx 8800 was available in late 2005.


#84 ronvalencia
Member since 2008 • 29612 Posts

You have to literally be a complete and total imbecile to believe that the 720 will use the 6670 or anything even remotely similar. Either that or you're clueless about PC tech. Remember, MS is not buying the chips per unit, they are buying the architecture. They are paying ATI to design them an all new chip and help them get started with the manufacturing. How in the hell can this turn out to be a 6670? NO, and LMFAO to anyone who thought that would happen.

Xplode_games

AMD's POV on PS4 and Xbox 720

PCH: "Neal Robison, Head of Software at AMD Developer Relations Department stated in an interview with the Xbit Labs on record that he believes a merger-APU as a basis for a powerful next-generation consoles to be useful, will be more powerful than you can imagine."

Refer to AMD's "fat" APU road map between 2013 to 2015.

Year 2012 desktop AMD Trinity APU (VLIW4) is about similar level as 6670 (VLIW5). Console AMD APUs may not be restricted to desktop PC's DDR3 memory types i.e. GDDR5 can be an option.

Year 2013 APU would have AMD GCN type GPU and which can clock +1Ghz on current TSMC LP-28nm tech.

7770 (640 SPs, 16 ROPs) is just under 6850 (960 SPs, 32 ROPs) in performance i.e. GCN shows improved IPC over Cypress VLIW5 designs.
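
Putting rough numbers on that IPC claim, a minimal sketch assuming the commonly listed reference specs (HD 7770: 640 SPs at 1.0GHz; HD 6850: 960 SPs at 775MHz; 2 FLOPs per SP per clock for a multiply-add):

```c
#include <stdio.h>

/* Theoretical single-precision throughput: SPs x 2 FLOPs (MAD) x clock.
   Reference clocks assumed: HD 7770 @ 1.0 GHz, HD 6850 @ 775 MHz. */
int main(void) {
    double hd7770 = 640 * 2 * 1.000;  /* GFLOPS: 1280 */
    double hd6850 = 960 * 2 * 0.775;  /* GFLOPS: 1488 */
    printf("HD 7770 (GCN):   %.0f GFLOPS\n", hd7770);
    printf("HD 6850 (VLIW5): %.0f GFLOPS\n", hd6850);
    /* The 6850 has ~16% more paper FLOPS, yet the 7770 lands just
       under it in real games, i.e. GCN turns more of its theoretical
       throughput into delivered performance. */
    printf("paper advantage of 6850: %.0f%%\n",
           100 * (hd6850 / hd7770 - 1));
    return 0;
}
```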


#85 wewantdoom4now
Member since 2012 • 1792 Posts

The 6670 has a slightly higher pixel fill rate than my 8600 GTS; no way they're gonna use this GPU, lol.
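
That fill-rate comparison is easy to sanity-check on paper; a minimal sketch, assuming the commonly listed reference specs (HD 6670: 8 ROPs at 800MHz; 8600 GTS: 8 ROPs at 675MHz):

```c
#include <stdio.h>

/* Peak pixel fill rate = ROPs x core clock.  Reference specs assumed:
   HD 6670 = 8 ROPs @ 800 MHz, 8600 GTS = 8 ROPs @ 675 MHz. */
int main(void) {
    double hd6670  = 8 * 0.800;  /* Gpixels/s: 6.4 */
    double gts8600 = 8 * 0.675;  /* Gpixels/s: 5.4 */
    printf("HD 6670:  %.1f Gpixels/s\n", hd6670);
    printf("8600 GTS: %.1f Gpixels/s\n", gts8600);
    printf("-> about %.0f%% higher on paper\n",
           100 * (hd6670 / gts8600 - 1));
    return 0;
}
```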


#86 ronvalencia
Member since 2008 • 29612 Posts

Come on, enough of the 6670 rumors. They are not going to use that; if anything, it is disinformation aimed at Sony and Nintendo. I'm not saying they will use the most powerful 2013 card, but when we finally hear about it, I'm certain it will be very comparable to an HD 7970 and perhaps share some features of the HD 8000 series.

GotNugz

AMD's Cypress VLIW5 design is dead in future desktop PC AMD APU designs.

The 2012 AMD Trinity APU uses a Cayman VLIW4-type GPU design, with its IGP clocked at around 800MHz.

The 2013 AMD ???? APU uses a GCN-type GPU design. Refer to the Radeon HD 7770 @ 1GHz, which can be clocked to 1.3GHz via overclocking.

"The Club 3D Radeon™ HD 7770 GHz Edition graphics card comes standard at a 1GHz GPU clock speed with overclocking headroom to 1.3 GHz and beyond."


#87 ronvalencia
Member since 2008 • 29612 Posts

The 6670 has a slightly higher pixel fill rate than my 8600 GTS; no way they're gonna use this GPU, lol.

wewantdoom4now

It depends on the 6670, i.e. whether it comes with DDR3 or GDDR5.

The Radeon HD 6670 DDR3 kills the GeForce 8600 GTS; e.g. refer to the Radeon HD 4670 vs the GeForce 8600 GTS.


#88 lpjazzman220
Member since 2008 • 2249 Posts

[QUOTE="lpjazzman220"]

I have no clue what you're talking about... the 360 and PS3 were already behind in graphics at release... the 8800 blew them out of the water day one...

Xplode_games

The 8800 GTX was released ONE FULL YEAR after the Xbox 360, in late 2006. It's true that it blows away the 360 and PS3, and it makes sense for it to blow away the 360, as the 360 is a year older. However, there is no excuse for it to outperform the PS3, especially with the PS3's launch price of $599!!! I have been arguing that point on this forum for years, and it's mostly fallen on deaf ears.

Yeah, I know the 360 came out first... but even at that, the 7900 had better performance... same games, higher res... while it couldn't do DX10 stuff, it still was better... then the PS3 came out around the same time as the 8800... which blew everything out of the water... tho to be fair, the 8800 was ~$1000 USD at release.


#89 Aidenfury19
Member since 2007 • 2488 Posts

The 2013 AMD ???? APU uses a GCN-type GPU design. Refer to the Radeon HD 7770 @ 1GHz, which can be clocked to 1.3GHz via overclocking.

"The Club 3D Radeon™ HD 7770 GHz Edition graphics card comes standard at a 1GHz GPU clock speed with overclocking headroom to 1.3 GHz and beyond."

ronvalencia

It remains to be seen how well the 2013 APUs will overclock. After all, they have to be able to maintain a reasonable TDP with both the CPU and GPU on-die, so I would guess they won't overclock as highly as the HD 7770s, but it should be pretty amazing performance for an IGP despite that.

I'm not really convinced that they will have any place in a future console, though. Unless the Big 3 have decided to try to cut costs, I suspect we'll see much the same as we have the last several generations: the next-gen consoles will have derivatives of commodity GPUs tweaked to fit their needs.