Still doesn't change the fact that Xenos was state of the art in GPUs in 2005.
xhawk27
Hardly. Nvidia released the 7800 GTX in 2005.
I would say it's unlikely the final Xbox 720 will use something based on the 6670, because it is so ludicrously underpowered. Even a 6850 would be badly underpowered for a 2013 console. They are going to need far better technology than that to make the 720 anything more than a bit of a joke power-wise.
[QUOTE="ronvalencia"]
[QUOTE="M3boarder23"] An Xbox Game, and a Xbox 360 launch release port? Really? Try playing DiRT or Metro 2033 on a X1900 XT :lol: xhawk27
Metro 2033's PC texture resolution is 2048x2048, while the Xbox 360 version's texture resolution is 1024x1024.
On a Radeon X1900 XT 256MB, Crysis 2 on PC plays like the Xbox 360 version.
Xenos's ROPs weren't "state of the art".
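For scale, a quick back-of-the-envelope sketch of what that texture gap means for memory (assuming uncompressed 32-bit RGBA texels; real games use block compression, which shrinks both sides of the comparison equally):

# Memory cost of one square texture at each resolution, assuming
# uncompressed 32-bit RGBA (4 bytes per texel, no mipmaps). DXT-style
# compression scales both down equally, so the 4x gap still holds.
BYTES_PER_TEXEL = 4

def texture_bytes(side):
    return side * side * BYTES_PER_TEXEL

pc_tex = texture_bytes(2048)    # PC version of Metro 2033
x360_tex = texture_bytes(1024)  # Xbox 360 version

print(f"2048x2048: {pc_tex / 2**20:.0f} MiB per texture")    # 16 MiB
print(f"1024x1024: {x360_tex / 2**20:.0f} MiB per texture")  # 4 MiB
print(f"ratio: {pc_tex / x360_tex:.0f}x")                    # 4x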
Still doesn't change the fact that Xenos was state of the art in GPUs in 2005. Also, if you think the X1900 XT can run Crysis 2 at the 360's resolution, it cannot.
The only thing that was state of the art was the 10MB eDRAM for anti-aliasing and post-processing, and on top of that, the core die it's actually based on is the X1800 XT.
It only had one thing going for it, so stop kidding yourself: it was essentially a modified PC GPU, much like the GeForce 3 in the original Xbox.
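For what it's worth, the 10MB figure is easy to sanity-check; a rough sketch (assuming the usual 32-bit color plus 32-bit depth/stencil framebuffer layout) of why anything past plain 720p spills out of the eDRAM:

# Why 10MB of eDRAM pins the 360 near 720p: color + depth buffers
# (multiplied by the MSAA sample count) must fit in eDRAM, or the
# frame has to be split into tiles (the 360's "predicated tiling").
EDRAM = 10 * 1024 * 1024   # 10 MiB of eDRAM
BYTES_PER_SAMPLE = 4 + 4   # 32-bit color + 32-bit depth/stencil

def framebuffer_bytes(width, height, msaa=1):
    return width * height * msaa * BYTES_PER_SAMPLE

for w, h, msaa in [(1280, 720, 1), (1280, 720, 2), (1280, 720, 4), (1920, 1080, 1)]:
    size = framebuffer_bytes(w, h, msaa)
    fits = "fits" if size <= EDRAM else "needs tiling"
    print(f"{w}x{h} {msaa}xMSAA: {size / 2**20:.1f} MiB -> {fits}")

# 1280x720 1x:  7.0 MiB -> fits
# 1280x720 2x: 14.1 MiB -> needs tiling
# 1920x1080 1x: 15.8 MiB -> needs tiling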
[QUOTE="xhawk27"]
[QUOTE="ronvalencia"]
Metro 2033's PC texture resolution is 2048x2048, while the Xbox 360 version's texture resolution is 1024x1024.
On a Radeon X1900 XT 256MB, Crysis 2 on PC plays like the Xbox 360 version.
Xenos's ROPs weren't "state of the art".
razgriz_101
Still doesn't change the fact that Xenos was state of the art in GPUs in 2005. Also, if you think the X1900 XT can run Crysis 2 at the 360's resolution, it cannot.
The only thing that was state of the art was the 10MB eDRAM for anti-aliasing and post-processing, and on top of that, the core die it's actually based on is the X1800 XT.
It only had one thing going for it, so stop kidding yourself: it was essentially a modified PC GPU, much like the GeForce 3 in the original Xbox.
Except that it is also the thing that limits the 360 to 720p or lower (the actual amount, 10MB, that is), which is probably why PC cards don't go that route.
[QUOTE="savagetwinkie"]
[QUOTE="ronvalencia"] Xenos only has 8 ROPs and 64 threads.
ronvalencia
Video cards don't have threads, lol; they have processors. The Xenos has 48 processors broken up into three groups of 16 that all run the same instruction, so 3 instruction threads can be run, and each processor can execute 2 operations serially per cycle.
It also has 16 texture filter units and texture address units...
Are you really stupid enough that you added 16 + 48 and thought that was threads?
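For reference, that processor layout is where the commonly quoted ~240 GFLOPS peak for Xenos comes from; a rough sketch, assuming each ALU co-issues a 4-wide vector plus a scalar op per cycle, with a multiply-add counted as two FLOPs:

# Peak shader throughput for Xenos from the usual published numbers:
# 48 unified ALUs, each a vec4 + scalar unit, MADD = 2 FLOPs per lane.
ALUS = 48          # unified shader ALUs (3 SIMD groups of 16)
LANES = 4 + 1      # 4-wide vector + 1 scalar co-issue per ALU
FLOPS_MADD = 2     # a multiply-add counts as two FLOPs
CLOCK_HZ = 500e6   # 500 MHz core clock

peak = ALUS * LANES * FLOPS_MADD * CLOCK_HZ
print(f"Xenos peak: {peak / 1e9:.0f} GFLOPS")  # 240 GFLOPS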
LOL. Modern video cards have hyper-threading/simultaneous multithreading (SMT) type designs, e.g. NVIDIA's Giga-Threads and AMD's Ultra-Threads.
Link
XENOS is capable of processing 64 threads simultaneously; this is to make sure that all elements are being utilized and that there is minimal or no stalling of the graphics architecture. So even if an ALU is waiting for a texture sample, that thread will not stall the ALU, as it will be working on something else from another thread. This effectively hides tasks that would normally have a large latency penalty attached to them. ATI suggests that their testing achieves an average of 95% efficiency of the shader array in general-purpose graphics usage conditions.
Please update your GPU information to current GPU designs, i.e. heavily hyper-threaded/SMT-type designs. The stupid one is you.
What is Simultaneous multithreading?
Simultaneous multithreading, often abbreviated as SMT, is a technique for improving the overall efficiency of superscalar CPUs with hardware multithreading. SMT permits multiple independent threads of execution to better utilize the resources provided by modern processor architectures.
AMD Xenos includes its own version of a hardware SMT-type front-end design, and it has 64 of them. The AMD Radeon X1800 has 128 SMT-type threads for its pixel shaders.
NVIDIA's GeForce 8 series GPUs have "Giga-Threads".
Beyond the ALUs/processing cores, both NVIDIA and AMD have invested significant resources in the uncore sections.
Your "video cards don't have threads" statement is a lol.
You must think the GeForce 8 has 12,000+ threads! You're an idiot; hardware doesn't have threads, software does, and hardware is designed with features for running threads. The Xenos can manage up to... "up to" being the key words here... 64 threads. It's only got 48 processors though, compared to the X1800's 512 threads, but that's more than 16 pixel shader processors and 8 vertex shader processors.
Secondly, you're still spouting nonsense and don't understand anything. You're comparing threads on an X1800 to the Xenos and trying to say the Xenos wasn't state of the art by pointing out thread engines? Why don't you actually dig up relevant information on the Xenos's unified processors ditching the fixed pixel/vertex setup? That actually makes threads more useful, because each of the 64 threads can run on any of the 48 processors, whereas the X1800's 512 threads have to be split between vertex shaders and pixel shaders.
You never bother putting anything into useful context; you just spout out random facts about the technology people are talking about.
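The unified-shader point can be put in numbers too; a toy sketch (the workload mixes are made up for illustration) comparing a 48-unit unified pool against a fixed 16 pixel + 8 vertex split:

# Toy comparison: a unified pool vs a fixed pixel/vertex split.
# Work is in arbitrary "processor-cycles"; the mixes are invented.
def time_unified(pixel_work, vertex_work, total_units=48):
    # Any unit can take any work, so only the sum matters.
    return (pixel_work + vertex_work) / total_units

def time_split(pixel_work, vertex_work, pixel_units=16, vertex_units=8):
    # Fixed pools finish when the slower pool finishes.
    return max(pixel_work / pixel_units, vertex_work / vertex_units)

for px, vx in [(900, 100), (500, 500), (100, 900)]:
    u, s = time_unified(px, vx), time_split(px, vx)
    print(f"pixel={px} vertex={vx}: unified={u:.1f} split={s:.1f} cycles")

# A vertex-heavy frame swamps the 8 fixed vertex units, while the
# unified design just shifts more of its 48 processors to vertex work.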
[QUOTE="xhawk27"]
[QUOTE="ronvalencia"]
Metro 2033's PC texture resolution is 2048x2048, while the Xbox 360 version's texture resolution is 1024x1024.
On a Radeon X1900 XT 256MB, Crysis 2 on PC plays like the Xbox 360 version.
Xenos's ROPs weren't "state of the art".
razgriz_101
Still doesn't change the fact that Xenos was state of the art in GPUs in 2005. Also, if you think the X1900 XT can run Crysis 2 at the 360's resolution, it cannot.
The only thing that was state of the art was the 10MB eDRAM for anti-aliasing and post-processing, and on top of that, the core die it's actually based on is the X1800 XT.
It only had one thing going for it, so stop kidding yourself: it was essentially a modified PC GPU, much like the GeForce 3 in the original Xbox.
I'm pretty sure the PS2 had eDRAM; eDRAM is not a new thing.
[QUOTE="razgriz_101"][QUOTE="xhawk27"]
Still doesn't change the fact that Xenos was state of the art in GPUs in 2005. Also, if you think the X1900 XT can run Crysis 2 at the 360's resolution, it cannot.
savagetwinkie
The only thing that was state of the art was the 10MB eDRAM for anti-aliasing and post-processing, and on top of that, the core die it's actually based on is the X1800 XT.
It only had one thing going for it, so stop kidding yourself: it was essentially a modified PC GPU, much like the GeForce 3 in the original Xbox.
I'm pretty sure the PS2 had eDRAM; eDRAM is not a new thing.
Same with the PSP, I think.
6670 equivalent? B*tch please!! How can that impress me when I'm rocking this GTX 580:
It even looks cool, like it was built by Tony Stark or something XD
I have no clue what you're talking about... the 360 and PS3 were already behind in graphics at release... the 8800 blew them out of the water day one...
:|
I have no clue what you're talking about... the 360 and PS3 were already behind in graphics at release... the 8800 blew them out of the water day one...
lpjazzman220
No more buying high-end cards for me, lol. I bought an X1800 XT for like $400 after trading in my 6800 GT, which I might have paid like $400 for, lmao. That card died and I couldn't get it replaced.
And I bought a 7800 GTX for $520 in 2005, and that died too, but I got a replacement, lol: a freaking 8600 GTS. Oh well, better than the 7800 GTX.
Midrange is where it's at, like $200 8800 GT type cards.
I have no clue what you're talking about... the 360 and PS3 were already behind in graphics at release... the 8800 blew them out of the water day one...
lpjazzman220
The 8800 GTX was released ONE FULL YEAR after the Xbox 360, in late 2006. It's true that it blows away the 360 and PS3. It makes sense for it to blow away the 360, since the 360's hardware is a year older. However, there is no excuse for it to outperform the PS3, especially with the PS3's launch price of $599!!! I have been arguing that point on this forum for years, and it's mostly fallen on deaf ears.
.... MS does not manufacture anything; Foxconn does. ATI and Nvidia, I'm sure, do not manufacture anything either; they design it and get a company like Foxconn to make it.
Riki101
Actually they do: AMD makes chips through their GlobalFoundries subsidiary, and I think Nvidia uses TSMC.
[QUOTE="lpjazzman220"]:| What's hard to understand about that?i have no clue what ur talking about...360 and ps3 were already behind in graphics at release...8800 blew them out of the water day one...
M3boarder23
[QUOTE="Riki101"].... ms does not manufactur anything, foxxconn does, ATI and nvidia im sure does not manufacture anything, they design it, and get a company like foxxconn to make itmarkop2003Actually they do, AMD makes chips through their Global Foudaries subsidiary i think Nvidia use TSMC. AMD uses both GoFlo and TSMC. AMD GCN is fab on TSMC LP-28nm process.
Actually a 6670-based GPU wouldn't be all that bad:
http://www.youtube.com/watch?v=BzWTGumL-ME
With some optimization they should pull off a stable 30 fps. Just forget about "Avatar graphix".
I still could kick a console's ---- with my old machine, bwahahaha. In about a month, expect to be able to build a PC for $500 or so that could beat TWO 720s glued together, hahaha. It's actually pretty disgraceful. By the time a 720 launches, its titles won't even be challenging the majority of users on Steam, and I imagine the rest of the world that plays games on PC. In fact, it's actually possible to match the next consoles using laptops with AMD INTEGRATED VIDEO. That's so disgraceful I can't imagine why they would bother making a next-gen console.. it's just futile bottlenecking of the market.
Dirty primate console developers.
PC is definitely about to strike up the band and bury you kids.
http://www.youtube.com/watch?v=oKPzVV2DExk&feature=related
[QUOTE="lpjazzman220"]
I have no clue what you're talking about... the 360 and PS3 were already behind in graphics at release... the 8800 blew them out of the water day one...
Xplode_games
The 8800 GTX was released ONE FULL YEAR after the Xbox 360, in late 2006. It's true that it blows away the 360 and PS3. It makes sense for it to blow away the 360, since the 360's hardware is a year older. However, there is no excuse for it to outperform the PS3, especially with the PS3's launch price of $599!!! I have been arguing that point on this forum for years, and it's mostly fallen on deaf ears.
Blu-ray players were extremely expensive back then.
[QUOTE="M3boarder23"][QUOTE="lpjazzman220"] :| What's hard to understand about that? U dumbass :lol:
I have no clue what you're talking about... the 360 and PS3 were already behind in graphics at release... the 8800 blew them out of the water day one...
kraken2109
Come on, enough of the 6670 rumors. They are not going to use that; if anything, it is disinformation aimed at Sony and Nintendo. I'm not saying they will use the most powerful 2013 card, but when we finally hear about it, I'm certain it will be very comparable to an HD 7970 and perhaps share some features of the HD 8000 series.
You would literally have to be a complete and total imbecile to believe that the 720 will use the 6670 or anything even remotely similar. Either that or you're clueless about PC tech. Remember, MS is not buying the chips per unit; they are buying the architecture. They are paying ATI to design them an all-new chip and help them get started with the manufacturing. How in the hell could this turn out to be a 6670? NO, and LMFAO at anyone who thought that would happen.
Xplode_games
AMD's POV on PS4 and Xbox 720
PCH: "Neal Robison, Head of Software at AMD Developer Relations Department stated in an interview with the Xbit Labs on record that he believes a merger-APU as a basis for a powerful next-generation consoles to be useful, will be more powerful than you can imagine."
Refer to AMD's "fat" APU road map between 2013 and 2015.
The 2012 desktop AMD Trinity APU (VLIW4) is at about a similar level to the 6670 (VLIW5). Console AMD APUs may not be restricted to desktop PCs' DDR3 memory types, i.e. GDDR5 can be an option.
A 2013 APU would have an AMD GCN-type GPU, which can clock above 1GHz on current TSMC LP-28nm tech.
The 7770 (640 SPs, 16 ROPs) is just under the 6850 (960 SPs, 32 ROPs) in performance, i.e. GCN shows improved IPC over the older VLIW5 designs.
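That IPC claim is easy to illustrate from the paper specs; a small sketch using the reference clocks (real game performance obviously varies by title):

# Theoretical peak FLOPS from reference specs: if the HD 7770 matches
# a card with more raw FLOPS, the newer architecture is extracting
# more real work per theoretical FLOP.
def peak_gflops(shaders, clock_ghz, flops_per_shader=2):  # 2 = MADD/FMA
    return shaders * flops_per_shader * clock_ghz

hd7770 = peak_gflops(640, 1.000)  # GCN,   640 SPs @ 1000 MHz
hd6850 = peak_gflops(960, 0.775)  # VLIW5, 960 SPs @ 775 MHz

print(f"HD 7770: {hd7770:.0f} GFLOPS")  # 1280
print(f"HD 6850: {hd6850:.0f} GFLOPS")  # 1488

# The 7770 lands just under the 6850 in games despite ~14% fewer peak
# FLOPS (and half the ROPs), which is the improved-utilization point.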
Come on, enough of the 6670 rumors. They are not going to use that; if anything, it is disinformation aimed at Sony and Nintendo. I'm not saying they will use the most powerful 2013 card, but when we finally hear about it, I'm certain it will be very comparable to an HD 7970 and perhaps share some features of the HD 8000 series.
GotNugz
AMD Cypress's VLIW5 design is dead in future desktop PC AMD APU designs.
The 2012 AMD Trinity APU uses Cayman's VLIW4-type GPU design; its IGP is clocked at around 800MHz.
The 2013 AMD ???? APU uses GCN-type GPU designs. Refer to the Radeon HD 7770 @ 1GHz, which can be clocked to 1.3GHz via overclocking.
"The Club 3D Radeon™ HD 7770 GHz Edition graphics card comes standard at a 1GHz GPU clock speed with overclocking headroom to 1.3GHz and beyond."
The 6670 has a slightly higher pixel fill rate than my 8600 GTS; no way they're gonna use this GPU, lol.
wewantdoom4now
It depends on the 6670, i.e. whether it comes with DDR3 or GDDR5.
The Radeon HD 6670 DDR3 kills the GeForce 8600 GTS; e.g. refer to Radeon HD 4670 vs GeForce 8600 GTS comparisons.
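The fill-rate comparison is simple to check from the reference specs; a quick sketch, with stock clocks quoted from memory, so treat them as approximate:

# Peak pixel fill rate = ROPs * core clock. Both are 8-ROP parts;
# the clocks below are the stock reference ones, approximately.
def pixel_fill_gpix(rops, clock_mhz):
    return rops * clock_mhz / 1000.0

hd6670 = pixel_fill_gpix(8, 800)     # Radeon HD 6670
gf8600gts = pixel_fill_gpix(8, 675)  # GeForce 8600 GTS

print(f"HD 6670:  {hd6670:.1f} Gpixels/s")     # 6.4
print(f"8600 GTS: {gf8600gts:.1f} Gpixels/s")  # 5.4

# "Slightly higher" checks out, but fill rate alone ignores shader
# throughput and memory bandwidth, where the 6670 is far ahead.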
[QUOTE="lpjazzman220"]
I have no clue what you're talking about... the 360 and PS3 were already behind in graphics at release... the 8800 blew them out of the water day one...
Xplode_games
The 8800 GTX was released ONE FULL YEAR after the Xbox 360, in late 2006. It's true that it blows away the 360 and PS3. It makes sense for it to blow away the 360, since the 360's hardware is a year older. However, there is no excuse for it to outperform the PS3, especially with the PS3's launch price of $599!!! I have been arguing that point on this forum for years, and it's mostly fallen on deaf ears.
Yeah, I know the 360 came out first... but even at that, the 7900 had better performance... same games, higher res... while it couldn't do DX10 stuff, it was still better... then the PS3 came out around the same time as the 8800... which blew everything out of the water... though to be fair... the 8800 was ~$1000 USD at release.
The 2013 AMD ???? APU uses GCN-type GPU designs. Refer to the Radeon HD 7770 @ 1GHz, which can be clocked to 1.3GHz via overclocking.
"The Club 3D Radeon™ HD 7770 GHz Edition graphics card comes standard at a 1GHz GPU clock speed with overclocking headroom to 1.3GHz and beyond."
ronvalencia
It remains to be seen how well the 2013 APUs will overclock. After all, they have to be able to maintain a reasonable TDP with both the CPU and GPU on-die, so I would guess they won't overclock as high as the HD 7770s, but it should be pretty amazing performance for an IGP despite that.
I'm not really convinced that they will have any place in a future console, though. Unless the Big 3 have decided to try and cut costs, I suspect we'll see much the same as we have the last several generations, which is that the next-gen consoles will have derivatives of commodity GPUs tweaked to fit their needs.
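The TDP intuition can be sketched with the usual first-order scaling rule, dynamic power proportional to frequency times voltage squared; the voltage numbers below are made up purely for illustration:

# First-order CMOS dynamic power scaling: P ~ f * V^2. Overclocking
# usually needs a voltage bump too, which is why power grows much
# faster than clock speed. Numbers below are illustrative, not specs.
def relative_power(f_new, f_base, v_new, v_base):
    return (f_new / f_base) * (v_new / v_base) ** 2

# e.g. pushing a 1.0 GHz GPU to 1.3 GHz with a 1.05 V -> 1.20 V bump:
scale = relative_power(1.3, 1.0, 1.20, 1.05)
print(f"~{scale:.2f}x dynamic power for a 1.3x clock")  # ~1.70x

# On a discrete 7770 that headroom is fine; inside an APU the GPU
# shares one TDP budget with the CPU, so the same bump may not fit.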