The GameCube was the most powerful console of its gen.



#51 Jag85
Member since 2005 • 20640 Posts

[QUOTE="Jag85"]

[QUOTE="nameless12345"]

Carmack said that GC was actually less-powerful than the PS2. (smaller peak poly-counts)

But he also said that PS2 had a backwards GPU in comparison to GC and Xbox. (much less features)

Only console in that generation that was powerful enough to run Doom 3 was the Xbox.

Also, Xbox's Pentium 3-based CPU had it's strenghts over the PowerPC-based GC CPU.

Like for example it was better at complex physics calculations (Half-Life 2) and complex AI calculations. (Operation Flashpoint: Elite)

It was also PC-like X86 architecture so ports from the PC were easy on the Xbox.

Altho there weren't exactly many physics and AI intensive games on the GC so it's hard to say how it would handle them.

But Xbox also had more RAM in it's favour.

nameless12345

Are you sure Carmack said that? I just Googled it and haven't found him saying anything like that. As for peak polygon counts, benchmark tests showed the GameCube having the highest that generation, especially with the Rogue Squadron games which had a polygon count well beyond any Xbox game.

As for Doom 3, the reason why it wasn't brought over to the GameCube was because, like I said above, it lacked the Xbox's more advanced shading capabilities that the game heavily relied on. It was technically possible to recreate those shaders on the GameCube, but it would have been too difficult.

As for the Pentium III architecture, I'm not really sure if it really did have better physics & AI calculations, but the fact that Microsoft abandoned it in favour of the PowerPC architecture for the Xbox 360 suggests that, overall, the GameCube's PowerPC architecture had the edge over the Xbox's Pentium III architecture.

And finally, while the Xbox did have more overall RAM, the GameCube had faster 1T-SRAM.

Overall, the Xbox and GameCube were closely matched, with the GameCube offering more raw power while the Xbox offered a more advanced feature set... almost like the Mega Drive vs SNES, with the Mega Drive offering more raw power and the SNES offering a more advanced feature set.

Well that's what I heard from some "sources". (that he said that PS2 had better poly-count than GC did)

MS picked Power architecture because it was cheaper, not because it would be better than X86 architecture in all aspects.

The PS4 and X1 are both using X86 now.

IBM G3 vs Intel P3 was a interesting match but we haven't really seen much CPU-oriented games on GC.

The P3 in Xbox was clocked higher, if anything.

I don't quite agree that GC had more power than Xbox did.

GC's video chip was about on-par with a AMD/ATi Radeon 7200 while the Xbox's was a GeForce 3-based chip with improved feature set.

So basically their video chips were almost a gen appart.

GC was just a very efficient design, that I give it right.

I'd say "MD vs SNES" is more comparable to "PS3 vs 360".

PS3 had the CPU edge (like MD) but 360 had better GPU. (SNES)

I looked online and didn't find Carmack saying anything of the sort. And if he did, it must have been from before any benchmark tests were done, since Sony (as usual) initially claimed the PS2 had a theoretical peak polygon count of 60 million (in reality, the PS2's practical polygon count wasn't much higher than the Dreamcast's).

The performance (in terms of MIPS and FLOPS) of a PowerPC G3 clocked at 485 MHz is higher than that of a Pentium III clocked at 733 MHz. The PowerPC series was always known for providing higher performance at a lower clock rate than the Pentium series.
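To put "performance per clock" in concrete terms, here's a rough sketch (the MIPS figures are placeholders for illustration, not measured benchmarks; only the clock speeds come from the comparison above):

```python
# Toy sketch: normalize a raw performance claim by clock rate.
# The MIPS values below are illustrative placeholders, not real benchmarks.

def perf_per_mhz(mips: float, clock_mhz: float) -> float:
    """Instructions per second delivered per MHz of clock."""
    return mips / clock_mhz

gekko_ppm = perf_per_mhz(mips=1125.0, clock_mhz=485.0)  # PowerPC G3 @ 485 MHz
p3_ppm = perf_per_mhz(mips=1000.0, clock_mhz=733.0)     # Pentium III @ 733 MHz

print(f"PowerPC G3:  {gekko_ppm:.2f} MIPS/MHz")  # ~2.32 with these placeholders
print(f"Pentium III: {p3_ppm:.2f} MIPS/MHz")     # ~1.36 with these placeholders
```

The point is only that a lower-clocked chip can come out ahead once you normalize by clock rate, which is the claim being made for the G3 here.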

I'd say it wasn't until the Intel Core processors came along that x86 architecture overtook PowerPC architecture in terms of raw performance per clock rate, so it shouldn't be too surprising that the PS4 and X1 are going x86 now.

The GameCube did not use a Radeon-like GPU; its GPU was a custom chip designed by ArtX (later bought by ATI), so you cannot compare it directly to any Radeon GPU. Like I already said, the Xbox did have a more advanced feature set, but benchmark tests have shown that the GameCube was capable of performing at higher polygon counts than the Xbox:

The Old Xbox vs GameCube Graphics War

Also, it was actually possible to re-create the Xbox's more advanced feature set on the GameCube, but it was just very difficult to do. Only a handful of GameCube games were programmed well enough to be able to pull off those more advanced graphical features.

Like the MD vs SNES battle, what makes the GC vs Xbox comparison similar is that one had more raw performance while the other had more advanced graphical features. What also makes it similar is how often gamers, by default, simply assume the SNES/Xbox was technically superior, overlooking that the MD/GC offered more raw performance than their rivals.


#52 Jag85
Member since 2005 • 20640 Posts

[QUOTE="nameless12345"]

Anyway, it's the games that should do the talking:

[screenshot: Ninja Gaiden Black]

[screenshot: Doom 3]

[screenshot: Conker: Live & Reloaded]

[screenshot: Resident Evil 4]

[screenshot: Rogue Squadron III: Rebel Strike]

[screenshot: Star Fox Adventures]

nameless12345

Fair enough...

Star Fox Adventures (GameCube) [screenshot]

Resident Evil 4 (GameCube) [screenshot]

Rogue Squadron II: Rogue Leader (GameCube) [screenshots]

Rogue Squadron III: Rebel Strike (GameCube) [screenshots]


#53 nameless12345
Member since 2010 • 15125 Posts

[QUOTE="nameless12345"]

[QUOTE="Jag85"]

Are you sure Carmack said that? I just Googled it and haven't found him saying anything like that. As for peak polygon counts, benchmark tests showed the GameCube having the highest that generation, especially with the Rogue Squadron games which had a polygon count well beyond any Xbox game.

As for Doom 3, the reason why it wasn't brought over to the GameCube was because, like I said above, it lacked the Xbox's more advanced shading capabilities that the game heavily relied on. It was technically possible to recreate those shaders on the GameCube, but it would have been too difficult.

As for the Pentium III architecture, I'm not really sure if it really did have better physics & AI calculations, but the fact that Microsoft abandoned it in favour of the PowerPC architecture for the Xbox 360 suggests that, overall, the GameCube's PowerPC architecture had the edge over the Xbox's Pentium III architecture.

And finally, while the Xbox did have more overall RAM, the GameCube had faster 1T-SRAM.

Overall, the Xbox and GameCube were closely matched, with the GameCube offering more raw power while the Xbox offered a more advanced feature set... almost like the Mega Drive vs SNES, with the Mega Drive offering more raw power and the SNES offering a more advanced feature set.

Jag85

Well that's what I heard from some "sources". (that he said that PS2 had better poly-count than GC did)

MS picked Power architecture because it was cheaper, not because it would be better than X86 architecture in all aspects.

The PS4 and X1 are both using X86 now.

IBM G3 vs Intel P3 was a interesting match but we haven't really seen much CPU-oriented games on GC.

The P3 in Xbox was clocked higher, if anything.

I don't quite agree that GC had more power than Xbox did.

GC's video chip was about on-par with a AMD/ATi Radeon 7200 while the Xbox's was a GeForce 3-based chip with improved feature set.

So basically their video chips were almost a gen appart.

GC was just a very efficient design, that I give it right.

I'd say "MD vs SNES" is more comparable to "PS3 vs 360".

PS3 had the CPU edge (like MD) but 360 had better GPU. (SNES)

I looked online and didn't find Carmack saying anything of the sort. And if he did, then it must have been from before any benchmark tests were done, since Sony (as usual) initially claimed the PS2 had a theotical peak polygon count of 60 million (in reality, the PS2's practical polygon count wasn't much higher than the Dreamcast).

The performance (in terms of MIPS and FLOPS) of a PowerPC G3 clocked at 485 MHz is higher than that of a Pentium III clocked at 733 MHz. The PowerPC series was always known for providing a higher performance at a lower clock rate than the Pentium series.

I'd say it wasn't until the Intel Core processors came along that x86 architecture overtook PowerPC architecture in terms of raw performance per clock rate, so it shouldn't be too surprising that the PS4 and X1 are going x86 now.

The GameCube did not use a Radeon-like GPU, but its GPU was a custom chip designed by ArtX (later brought by ATI), so you cannot compare it directly to any Radeon GPU. Like I already said, the Xbox did have a more advanced feature set, but benchmark tests have shown that the GameCube was capable of performing at higher polygon counts than the Xbox:

The Old Xbox vs GameCube Graphics War

Also, it was actually possible to re-create the Xbox's more advanced feature set on the GameCube, but it was just very difficult to do. Only a handful of GameCube games were programmed well enough to be able to pull off those more advanced graphical features.

Like the MD vs SNES battle, what makes the GC vs Xbox comparison similar is that one had more raw peformance while the other had more advanced graphical features. What also makes it similar is how often many gamers, by default, simply assume the SNES/Xbox was technically superior, overlooking how the MD/GC offered more raw performance than their rivals.

 

Yes, Sony's official numbers were inflated, or rather, they were "raw" polygons without any effects.

Real performance with effects is more in line with the GC's numbers.

However, the PS2 did have the vector co-processors (VU0 & VU1) alongside the main CPU and GPU, so it had some "hidden power", in a way.

An old but still interesting vid on the matter is located here:

http://www.youtube.com/watch?v=0-A-NjRwmgs

 

I've read some articles comparing IBM's processors to Intel's (& AMD's), and it turned out the Power CPUs did have impressive FLOPS performance, but they were still not the best at all tasks.

The console makers picked them because they offered good performance for the price, but the raw performance of, for example, the 360's CPU is still well below even that of an AMD Athlon 64 X2.

The PS3's CPU was more exotic in that regard, in that it was a hybrid design aimed at graphics and media acceleration, and it also had some impressive FLOPS performance.

But still, a better GPU will always outdo a CPU in graphics-related tasks, so the PC didn't have much trouble overcoming the PS3. (The main CPU core in the PS3 is about on par with a Pentium 4 Extreme Edition, or so I heard, and the SPEs have the compute power of a GeForce 7800 GTX - no match for newer PC GPUs like the GeForce 8800 series, especially when combined with a quad-core Intel Q6600.)

 

I know the GC's graphics chip was designed by ArtX and isn't completely the same as the PC range of Radeon cards.

But still, the GC's chip shares many similarities with the Radeon family line.

Back then the "hot" new thing was pixel shaders, and I've heard the GC used only fixed-function ones while the Xbox had fully programmable ones.
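Roughly, the contrast can be sketched like this (a loose illustration, not real hardware code; the GameCube's TEV combiner is commonly described as evaluating a fixed per-stage equation of the form d + lerp(a, b, c), which is assumed here):

```python
# Loose illustration of fixed-function vs programmable pixel shading.

def lerp(a: float, b: float, c: float) -> float:
    return (1.0 - c) * a + c * b

# Fixed-function stage: the equation is hardwired; only its inputs
# (a, b, c, d) are selectable per stage.
def fixed_stage(a, b, c, d):
    return tuple(di + lerp(ai, bi, ci) for ai, bi, ci, di in zip(a, b, c, d))

# Programmable shader: arbitrary per-pixel code, e.g. a specular-style
# power term that a single fixed combine equation can't express directly.
def programmable_shader(tex_rgb, normal, light_dir):
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(t * n_dot_l ** 8 for t in tex_rgb)

# Blend halfway between a texture colour and black, combiner-style:
print(fixed_stage((0.8, 0.2, 0.1), (0.0, 0.0, 0.0), (0.5, 0.5, 0.5), (0.0, 0.0, 0.0)))
```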

The Xbox's graphics chip was pretty "cutting-edge" when it came out, even surpassing PC GPUs in its feature set.

It was definitely better than the GC GPU in every way; the GC's was just flexible enough to be competitive if taken advantage of (which games like Rebel Strike demonstrated).

The problem is, most devs didn't really care to invest much into GC's advanced tech, resulting in many watered-down PS2 ports.

But it must also be noted that the "15-20 million polygons" mark for the GC is also unrealistic.

Games like Rebel Strike had about 300k polys per scene, which is still impressive but nowhere near the millions range of those theoretical peak figures.

 

I still don't think comparing the GC to the MD is a good idea, simply because the tech is so different.

Even if you wanted to make a point, it would still be hard to compare it to a different console with a completely different architecture (save for the Wii, which is based directly on GC tech).


#54 Jag85
Member since 2005 • 20640 Posts

[QUOTE="Jag85"]

[QUOTE="nameless12345"]

Well that's what I heard from some "sources". (that he said that PS2 had better poly-count than GC did)

MS picked Power architecture because it was cheaper, not because it would be better than X86 architecture in all aspects.

The PS4 and X1 are both using X86 now.

IBM G3 vs Intel P3 was a interesting match but we haven't really seen much CPU-oriented games on GC.

The P3 in Xbox was clocked higher, if anything.

I don't quite agree that GC had more power than Xbox did.

GC's video chip was about on-par with a AMD/ATi Radeon 7200 while the Xbox's was a GeForce 3-based chip with improved feature set.

So basically their video chips were almost a gen appart.

GC was just a very efficient design, that I give it right.

I'd say "MD vs SNES" is more comparable to "PS3 vs 360".

PS3 had the CPU edge (like MD) but 360 had better GPU. (SNES)

nameless12345

I looked online and didn't find Carmack saying anything of the sort. And if he did, then it must have been from before any benchmark tests were done, since Sony (as usual) initially claimed the PS2 had a theotical peak polygon count of 60 million (in reality, the PS2's practical polygon count wasn't much higher than the Dreamcast).

The performance (in terms of MIPS and FLOPS) of a PowerPC G3 clocked at 485 MHz is higher than that of a Pentium III clocked at 733 MHz. The PowerPC series was always known for providing a higher performance at a lower clock rate than the Pentium series.

I'd say it wasn't until the Intel Core processors came along that x86 architecture overtook PowerPC architecture in terms of raw performance per clock rate, so it shouldn't be too surprising that the PS4 and X1 are going x86 now.

The GameCube did not use a Radeon-like GPU, but its GPU was a custom chip designed by ArtX (later brought by ATI), so you cannot compare it directly to any Radeon GPU. Like I already said, the Xbox did have a more advanced feature set, but benchmark tests have shown that the GameCube was capable of performing at higher polygon counts than the Xbox:

The Old Xbox vs GameCube Graphics War

Also, it was actually possible to re-create the Xbox's more advanced feature set on the GameCube, but it was just very difficult to do. Only a handful of GameCube games were programmed well enough to be able to pull off those more advanced graphical features.

Like the MD vs SNES battle, what makes the GC vs Xbox comparison similar is that one had more raw peformance while the other had more advanced graphical features. What also makes it similar is how often many gamers, by default, simply assume the SNES/Xbox was technically superior, overlooking how the MD/GC offered more raw performance than their rivals.

 

Yes, Sony's official numbers were inflated, or rather, they were "raw" polygons without any effects.

Real performance with effects is more in-line with GC's numbers.

However, PS2 did have the vector co-processors (VU1 & VU0) besides the main CPU and GPU so it had some "hidden power", in a way.

An old but still interesting vid on the matter is located here:

http://www.youtube.com/watch?v=0-A-NjRwmgs

 

I've red some articles comparing IBM's processors to Intel's (& AMD's) and it turned out that the Power CPUs did have some cool FLOPS performance and stuff, but still they are not best for all tasks.

The console makers picked them because they had a good perfomance for the price but the raw peformance of, for example, 360's CPU is still well bellow even that of a AMD Athlon 64 X2.

PS3's CPU was more exotic in that regard that is was a hybrid design and aimed at graphics and media acceleration and also had some impressive FLOPS performance.

But still, a better GPU will always outdo a CPU in graphics-related tasks so PC didn't have much problem overcomming the PS3. (the main CPU core in PS3 is about on-par with a Pentium 4 extreme edition or so I heard and the SPEs have the compute power of a GeForce GTX 7800 - no match for newer PC GPUs like the GF 8800 series, especially when combined with a Quad core Intel 6600)

 

I know GC's graphics chip was designed by ArtX and isn't completely the same as the PC range of Radeon cards.

But still, GC's chip shares many similarities with AMD's Radeon family line.

Back then the "hot" new thing were "pixel-shaders" and I've heard GC used only fixed-function ones while Xbox had fully programmable ones.

Xbox's graphics chip was pretty "cutting-edge" when it came out, even surpassing PC GPUs in it's feature set.

It was deff. better than the GC GPU in every way, GC's was just flexible enough to be competitive if taken advantage of. (which games like Rebel Strike demonstrated)

The problem is, most devs didn't really care to invest much into GC's advanced tech, resulting in many watered-down PS2 ports.

But it must also be noted that that "15-20 million polygons" mark for GC is also unrealistic.

Games like Rebel Strike had about 300k polys per scene, which is still impressive but not nearly in the millions range of those theoretical peak performances.

 

I still don't think comparing GC to MD is a good idea simply because the tech is so different.

Even if you wanted to make a point, it would still be hard to compare to a different console because of completely different architecture. (sans for the Wii, which is based directly on the GC tech)

Actually, Nintendo later stated the GC's theoretical peak to be 90 million polygons/sec with effects (1 texture, 1 lighting). In comparison, the PS2 and Xbox both gave theoretical peaks with no effects at all, i.e. 60 million for PS2 and 120 million for Xbox.

In terms of actual practical performance, the GameCube was capable of sustaining 6-12 million polygons/sec, with some games such as Rogue Leader even hitting peaks of 15 million polygons/sec. In comparison, the Xbox's peak practical performance was barely over 10 million polygons/sec.

Like I said before, the GameCube had more raw power, while the Xbox had more advanced graphical features. The GameCube could push more polygons, whereas the Xbox had superior shaders. The MD vs SNES comparison is just meant to be an analogy, since the MD was capable of pushing more 3D polygons with its stock hardware (e.g. Star Cruiser and Hard Drivin'), whereas the SNES had superior 2D graphical capabilities (more sprites & colours, and Mode 7).


#55 tehMoerz
Member since 2012 • 54 Posts
Newp. PS2 had bad graphics. Xbox had good graphics. DC had good graphics. Nintendo had good graphics. Xbox was the most powerful, though.

#56 silent_bomber
Member since 2009 • 767 Posts

[QUOTE="Jag85"] whereas the SNES had superior 2D graphical capabilities (more sprites & colours and Mode 7). Jag85

The SNES has inferior sprite hardware to the Mega Drive.

Its maximum sprite count is higher, but the SNES's sprite-pixels-per-scanline limit brings it down to Mega Drive levels.

Add the fact that the Mega Drive's sprite engine is more efficient (a huge range of sprite sizes available with complete freedom, as opposed to the SNES being limited to two sprite sizes at any one time) and you have the Mega Drive sizeably ahead in the sprite area.

And that's not going into other SNES bottlenecks, like the low DMA bandwidth available for updating sprites, or the fact that the more sprites you have, the more collision detection you need to work out, which is handled by the CPU...
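The commonly cited per-scanline figures (treat them as approximate) put numbers on that:

```python
# Commonly cited sprite limits per scanline (approximate figures):
SNES_SPRITES_PER_LINE = 32             # out of 128 sprites total
SNES_SPRITE_PIXELS_PER_LINE = 34 * 8   # 34 eight-pixel slivers = 272 px
MD_SPRITES_PER_LINE = 20               # out of 80 sprites total (H40 mode)
MD_SPRITE_PIXELS_PER_LINE = 320        # enough to cover a full 320 px line

print(f"SNES: {SNES_SPRITE_PIXELS_PER_LINE} sprite pixels per line")  # 272
print(f"MD:   {MD_SPRITE_PIXELS_PER_LINE} sprite pixels per line")    # 320
```

So even though the SNES can display more sprites overall, the Mega Drive can cover more sprite pixels on any single scanline.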


#57 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

Jag, it wasn't that PowerPC was much better than x86 at the time (no widely used CPU architecture is completely better than another), it's that, specifically, the PowerPC 750 that Gekko was based on was much better than the Pentium 3. The fact that Nintendo has stuck with that general architecture for 3 generations is proof of its architectural superiority.

Gekko beat the Xbox's CPU by quite a bit; Broadway decimated it.

---

Also, those screenshots prove that the GameCube had superior raw output results.

The screenshots show that the Xbox could do more with shaders, but look at the detail, just the detail, and you can see the deficit in polygons and textures in comparison to GameCube games quite obviously. Like I said, Conker is the best it got, and the detail still pales in comparison (low-res textures), though it also shows what the GameCube couldn't do with shaders.

Bottom line, Nintendo's chosen graphics solution didn't have programmable shaders because, at the time it was incorporated into the GameCube's design, ATI didn't have them on their cards. They did by the time the GameCube came out, but it was obviously too late at that point.

Had the GameCube had those later additions, we wouldn't be having this discussion, I'm sure.


#58 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

Well, Conker may have been the best, but for realism, I have to give it to Butcher Bay.

[screenshot: The Chronicles of Riddick: Escape from Butcher Bay]

Though in this game as well, you can see how its textures and polygon counts suffered. The effects were great, though.

My last piece on GameCube vs. Xbox graphics is that the extra effects on top of lower-res detail didn't look right; games should have gotten the detail down first and worked on effects later. Though that's what was great about the 6th generation: every console was completely different in every way.


#59 WhySoLimp
Member since 2009 • 135 Posts

Good lord, I forgot how good the Rogue Squadron games looked on the GameCube! They can easily pass for Xbox games. Factor 5 certainly knew their stuff.


#60 naju890_963
Member since 2008 • 8954 Posts
[QUOTE="tehMoerz"]Newp PS2 had bad graphics Xbox had good graphics DC had good graphics Nintendo had good graphics Xbox was the most powerful though.

What are you doing m8?

#61 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts

Call me crazy, but the GameCube often had the muddiest multiplatform games. Madden was always the worst on the GameCube, visually. Never really understood why.


#62 Darkman2007
Member since 2007 • 17926 Posts

That could be down to a bad port. If a system isn't selling as well as the others, the teams assigned to port to it often lack resources and manpower (which can translate to a bad port). EA's Saturn ports were also often crap (not just worse than the others, but also worse than what they could have been), for the same reason.

#63 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts
[QUOTE="Heirren"]

Call me crazy but Gamecube often had the muddiest multiplatform games.  Madden was always the worst on the Gamecube, visually.  Never really understood why.

Darkman2007
that could be down to a bad port, if a system isn't selling as well as the others, the teams assigned to port them often lack resources and manpower (which can translate to a bad port) EA's Saturn ports were also often crap (not just worse than the others , but also worse than what they could have been) , for the same reason.

That's probably the case, but I recall reading (I don't remember where specifically) that the GameCube had something holding it back from the other two in certain areas. GC games were often softer, image-wise.

#64 Kaszilla
Member since 2011 • 1841 Posts
MechAssault 2 looks great on the Xbox.

#65 MAILER_DAEMON
Member since 2003 • 45906 Posts
[QUOTE="Darkman2007"][QUOTE="Heirren"]

Call me crazy but Gamecube often had the muddiest multiplatform games.  Madden was always the worst on the Gamecube, visually.  Never really understood why.

Heirren
that could be down to a bad port, if a system isn't selling as well as the others, the teams assigned to port them often lack resources and manpower (which can translate to a bad port) EA's Saturn ports were also often crap (not just worse than the others , but also worse than what they could have been) , for the same reason.

That's probably the case, but I recall reading(don't remember specifically) that the GameCube had something holding it back from the other two in certain areas. Gc games were often softer, image wise.

The GC had less base RAM than the other systems, despite its RAM being the fastest, though it had more VRAM than the PS2, and the Xbox's architecture was based on the familiar PC. With the PS2, companies had a year's head start to figure out how exactly it worked, and since the Xbox and GC dev kits came out around the same time, devs never really took the time to optimize for the GC unless they were making an exclusive or a game that started on GC (compare games like Viewtiful Joe, Killer7, Resident Evil 4, and Tales of Symphonia for examples of games that started on GC). Madden was generally designed with the PS2 as the lead platform, then ported to the other systems. When going in that direction, it was generally easier to port to Xbox than to GC.

#66 nameless12345
Member since 2010 • 15125 Posts

Jag, it wasn't that PowerPC was much better than x86 at the time (no widely used CPU architecture is completely better than another), it's that, specifically, the PowerPC 750 that Gekko was based on was much better than the Pentium 3. The fact that Nintendo has stuck with that general architecture for 3 generations is proof of its architectural superiority.

Gekko beat the Xbox's CPU by quite a bit; Broadway decimated it.

Chozofication

 

I'm still willing to dispute that.

Show me a GC game that exhibits better physics and AI than Half-Life 2 on Xbox does.

Also, the PowerPC 750CX had a 64-bit FPU, whereas Coppermine (P3) had 128-bit SIMD.

The only thing the 750CX was better at was handling graphics (i.e. higher poly counts).
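As a toy model of why that 128-bit SIMD matters (assuming one SIMD issue per cycle and single precision throughout - illustrative assumptions, not validated figures for either chip):

```python
# Toy model: peak single-precision FLOPS = clock * lanes * issues per cycle.
# Assumes one SIMD operation issued per cycle - an illustration only.

def peak_flops(clock_hz: float, lanes: int, issues_per_cycle: int = 1) -> float:
    return clock_hz * lanes * issues_per_cycle

p3_sse = peak_flops(733e6, lanes=4)    # 128-bit SSE: 4 x 32-bit floats
gekko_ps = peak_flops(485e6, lanes=2)  # Gekko's paired singles: 2 x 32-bit floats

print(f"P3 (SSE):       ~{p3_sse / 1e9:.2f} GFLOPS")   # ~2.93 under these assumptions
print(f"Gekko (paired): ~{gekko_ps / 1e9:.2f} GFLOPS") # ~0.97 under these assumptions
```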

Intel's line of Core CPUs, and the newer i3/i5/i7, are based upon the Pentium M architecture, which was in turn based on the Pentium 3.

The Pentium 4 was actually a design they dropped and didn't bother with afterwards.

Nintendo stuck with that architecture because they were already used to it and because it was cheaper than going for a fully new architecture (it also let them keep backwards compatibility).

There was actually quite some confusion about whether or not the Wii U uses POWER7 CPU tech (which would surely be better than the enhanced PowerPC 750 tech they picked instead).


#67 nameless12345
Member since 2010 • 15125 Posts

[QUOTE="nameless12345"]

[QUOTE="Jag85"]

I looked online and didn't find Carmack saying anything of the sort. And if he did, then it must have been from before any benchmark tests were done, since Sony (as usual) initially claimed the PS2 had a theotical peak polygon count of 60 million (in reality, the PS2's practical polygon count wasn't much higher than the Dreamcast).

The performance (in terms of MIPS and FLOPS) of a PowerPC G3 clocked at 485 MHz is higher than that of a Pentium III clocked at 733 MHz. The PowerPC series was always known for providing a higher performance at a lower clock rate than the Pentium series.

I'd say it wasn't until the Intel Core processors came along that x86 architecture overtook PowerPC architecture in terms of raw performance per clock rate, so it shouldn't be too surprising that the PS4 and X1 are going x86 now.

The GameCube did not use a Radeon-like GPU, but its GPU was a custom chip designed by ArtX (later brought by ATI), so you cannot compare it directly to any Radeon GPU. Like I already said, the Xbox did have a more advanced feature set, but benchmark tests have shown that the GameCube was capable of performing at higher polygon counts than the Xbox:

The Old Xbox vs GameCube Graphics War

Also, it was actually possible to re-create the Xbox's more advanced feature set on the GameCube, but it was just very difficult to do. Only a handful of GameCube games were programmed well enough to be able to pull off those more advanced graphical features.

Like the MD vs SNES battle, what makes the GC vs Xbox comparison similar is that one had more raw peformance while the other had more advanced graphical features. What also makes it similar is how often many gamers, by default, simply assume the SNES/Xbox was technically superior, overlooking how the MD/GC offered more raw performance than their rivals.

Jag85

 

Yes, Sony's official numbers were inflated, or rather, they were "raw" polygons without any effects.

Real performance with effects is more in-line with GC's numbers.

However, PS2 did have the vector co-processors (VU1 & VU0) besides the main CPU and GPU so it had some "hidden power", in a way.

An old but still interesting vid on the matter is located here:

http://www.youtube.com/watch?v=0-A-NjRwmgs

 

I've red some articles comparing IBM's processors to Intel's (& AMD's) and it turned out that the Power CPUs did have some cool FLOPS performance and stuff, but still they are not best for all tasks.

The console makers picked them because they had a good perfomance for the price but the raw peformance of, for example, 360's CPU is still well bellow even that of a AMD Athlon 64 X2.

PS3's CPU was more exotic in that regard that is was a hybrid design and aimed at graphics and media acceleration and also had some impressive FLOPS performance.

But still, a better GPU will always outdo a CPU in graphics-related tasks so PC didn't have much problem overcomming the PS3. (the main CPU core in PS3 is about on-par with a Pentium 4 extreme edition or so I heard and the SPEs have the compute power of a GeForce GTX 7800 - no match for newer PC GPUs like the GF 8800 series, especially when combined with a Quad core Intel 6600)

 

I know GC's graphics chip was designed by ArtX and isn't completely the same as the PC range of Radeon cards.

But still, GC's chip shares many similarities with AMD's Radeon family line.

Back then the "hot" new thing were "pixel-shaders" and I've heard GC used only fixed-function ones while Xbox had fully programmable ones.

Xbox's graphics chip was pretty "cutting-edge" when it came out, even surpassing PC GPUs in it's feature set.

It was deff. better than the GC GPU in every way, GC's was just flexible enough to be competitive if taken advantage of. (which games like Rebel Strike demonstrated)

The problem is, most devs didn't really care to invest much into GC's advanced tech, resulting in many watered-down PS2 ports.

But it must also be noted that that "15-20 million polygons" mark for GC is also unrealistic.

Games like Rebel Strike had about 300k polys per scene, which is still impressive but not nearly in the millions range of those theoretical peak performances.

 

I still don't think comparing GC to MD is a good idea simply because the tech is so different.

Even if you wanted to make a point, it would still be hard to compare to a different console because of completely different architecture. (sans for the Wii, which is based directly on the GC tech)

In terms of actual practical performance, the GameCube was capable of sustaining 6-12 million polygons/sec, with some games such as Rogue Leader even hitting peaks of 15 million polygons/sec. In comparison, the Xbox's peak practical performance was barely over 10 million polygons/sec.

 

The actual poly count per scene in Rogue Leader is 300k-something polygons, or a little less.

15 million is, like you noted, per second.

On its own that's pretty meaningless for real-time graphics, as games run at 30/60 fps, not at a single frame per second.
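Dividing the per-second figure by the frame rate gives the per-frame budget, which lines up with those scene counts:

```python
# Convert a polygons-per-second figure into a per-frame polygon budget.
def polys_per_frame(polys_per_sec: int, fps: int) -> int:
    return polys_per_sec // fps

print(polys_per_frame(15_000_000, 60))  # 250_000 -> roughly the ~300k/scene ballpark
print(polys_per_frame(15_000_000, 30))  # 500_000 at 30 fps
```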

Several Xbox games (like, for example, Doom 3 and Riddick) traded high poly counts for more advanced shader effects.

This is used in modern games too, as they usually use 3D models made of only a few thousand polygons, but with lots of bump maps, normal maps and parallax maps applied to them.
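A minimal sketch of that idea (hypothetical values throughout; diffuse lighting with a per-pixel normal read from a normal map instead of from extra geometry):

```python
# Sketch: fake surface detail by lighting with a per-pixel normal
# decoded from a normal map, instead of adding real polygons.

def lambert(normal, light_dir):
    """Diffuse (Lambert) term: clamped dot product of normal and light."""
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

light = (0.0, 0.0, 1.0)             # light pointing straight at the surface
flat_normal = (0.0, 0.0, 1.0)       # the low-poly triangle's true normal
mapped_normal = (0.30, 0.10, 0.95)  # hypothetical decoded normal-map texel

print(lambert(flat_normal, light))    # 1.0 everywhere -> reads as flat
print(lambert(mapped_normal, light))  # 0.95 here, varying per pixel -> fake relief
```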

While the concept of hardware tessellation is quite old (it dates back to ATi's "TruForm" tech), it's only in the last few years that it has become a big deal again, with its (re)introduction in DX11-compatible hardware.
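The core mechanic is just recursive subdivision; one midpoint split turns each triangle into four, so the polygon count grows 4x per level (a sketch of the idea, not any particular API's tessellator):

```python
# One level of midpoint subdivision: each triangle becomes four smaller ones.

def midpoint(p, q):
    return tuple((a + b) / 2.0 for a, b in zip(p, q))

def subdivide(tri):
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    # Three corner triangles plus the middle one.
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

tri = ((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))
print(len(subdivide(tri)))  # 4 triangles after one level
print(4 ** 3)               # 64x the polygons after three levels
```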

But still, to think that hardware tessellation will just magically make all next-gen games free of "polygon edges" is naive.

We will have to wait for things like real-time ray-tracing and 1-million-polygon models to become the norm before games actually look like CGI, and that will still take a while (likely not this upcoming gen, but next-next gen).

Anyway, here is a pic of interest:

 

[image: QDPolygons.jpg]