Indeed, the people here claiming ownage are owned themselves. The OP is completely right: the X360 was a monster of a system and could only be matched by ridiculous PCs at release.
Of course, PC hardware keeps innovating like it always does, and the 8800 GTX did surpass this beast by far by simply doubling GPU performance, but by then we were already a year later and there was still that triple-core CPU running at 3.2 GHz in the X360. Sure, it wasn't x86, but nevertheless you needed something to feed that 8800 GTX. You could buy an Intel Core 2 Duo Extreme clocked at 2.67 GHz for a whopping $999, released just a month before the 8800 GTX at the end of 2006. You were looking at $2000. Congrats, you beat an X360 at the end of 2006.
It was quite simple: if you wanted to match the X360 in its early years, you were paying $1000-1500 depending on how early in that gen you bought your system. Pretty much a no-brainer for any gamer at that time. It was the coup of the consoles that gen and the reason consoles are so successful right now.
The 360 was ahead of the curve when it came out in 2005 with its triple-core CPU and unified-shader GPU. However, the actual processing power of that triple-core CPU from 2006 onward was weak sauce. You could actually get an Intel quad-core QX6700 in late 2006 for $1k. The point is that you didn't need more than an Athlon X2 @ 2.8 GHz or any C2D to outperform the 360's CPU.
Oh yeah, the 2.67 GHz CPU was a quad; it was the 2.93 GHz that was a duo, but it was the same price ($999).
The graph you're showing isn't the right measure when you're talking about game development. The X360 and PS3 were made for games, and in games they performed a lot better relative to their x86 counterparts than a MIPS-only comparison suggests.
That's kinda obvious when this graph shows the PS3 at half the power of the X360. When properly used, it slaughtered the X360, but that was the problem with the PS3: it wasn't easy to code for, and it performed worse than the X360 in the beginning.
You were not going to beat the X360's CPU with any C2D either, at least not in games. An Athlon X2 @ 2.8 GHz was a CPU from the end of 2009.
When the 360 launched, no game looked as good as Perfect Dark Zero, Gears of War, or Project Gotham Racing on PC.
I'd know, I was PC gaming in that time period (had a top of the line rig) and it wasn't until 2007 (Crysis and World In Conflict) that PC started outdoing the console graphically.
PCs were using dual cores when 360 launched, the 360 and PS3 were powerhouses sold at a loss and it probably won't happen again.
Gears of War was the best looking of the bunch and it was on PC.
Didn't F.E.A.R. look better than Perfect Dark Zero?
No way did FEAR look better than PDZ. FEAR looked awesome, but PDZ looked truly next gen.
F.E.A.R. was a FAR, FAR more demanding and advanced game than PDZ. And it ran at 60 fps, not 30 fps.
That comparison is laughable. Had you bothered to actually play the games you would know that.
@undefined: how did Oblivion "demolish" pcs when it released?
HDR and AA at same time? Nope?
http://www.anandtech.com/show/1996/4
The 8800GTX released 6 months later and ruined Oblivion... Heck, the link you posted shows the PC running at higher frame rates and quality settings than the console, so are you trying to own yourself?
The 8800GTX with unified shaders came out less than 12 months after the 360 did and wrecked it... So no idea what planet you were on at the time, as PCs surpassed its power quickly and never looked back.
I have to defend the 360 here..
12 months on PC is a hell of a lot of time. In fact, the 360 came out when the 7800GTX was released in 2005, and by fall 2006 we had already seen the 7900GTX come and go and the 8800GTX arrive.
Not to mention the GPU alone was $600. Pair it with a dual core, which in 2006 was also ultra expensive, and you probably had close to $900 on just two components. And yes, the cheapest Core 2 Duo was close to $200 and that was the 1.8 GHz model; a 2.4 GHz version went for $300, the 2.6 for $500+, and the 2.9 for a whopping $999...
The Xbox 360 had a triple-core 3.2 GHz CPU with 6 threads; the Core 2 Duo didn't even have Hyper-Threading in 2006, so it was 2 cores, 2 threads.
And the first quad cores in late 2006 were $1,000+ in quantities of 1,000, meaning that's the price Intel gave you if you bought 1,000 units; street price was more like $1,200 or higher.
The 360 was as low as $299. You simply could not match or even come close to that price on PC in 2005 or 2006; you could find something more powerful, but at a great cost.
Most people bought an HDTV to go with their console, which was a massive cost... And a very large number of 360s ended up with the RROD, so owners went through multiple 360s... My little brother went through 3 of them!
That $1300 PC from 2006 would more than likely still be working fine 10 years later and still able to play some games...
If Apple entered the console ring, we could definitely see a beast of a system.
A premium console at a premium price with a strong ecosystem is just the competition the console industry needs.
I don't want to play on weak hardware. I hope Apple enters the ring with a premium system; there's a market out there for it, and it's not PC, it's a premium console that I'm looking for.
Power was never a selling point for consoles to begin with!!!
It's more important for PC fans, so whether or not the next console will be more powerful than a PC (not gonna happen!!!) is completely irrelevant to the console crowd.
@Juub1990: worthless for gaming, for now, due to legacy APIs, old code and so on running throughout the industry. With Vulkan and DX12, extra cores on the CPU (and maybe the iGPU as well) are going to become more useful on PC. It's just going to take a few years before we really see the results; at the moment even the likes of Doom are just developers dipping their toes in the water.
As for consoles... it doesn't make sense anymore. Both the 360 and the PS3 (to a lesser extent) launched with big problems, and they cost both companies a boatload of money in the short to medium term. In the meantime, the Wii mopped the floor with both of them in the sales and money-making departments.
The reality is that most people don't really care what's in the box. They just want to play games. A console doesn't need to be on the bleeding edge to attract sales; it just needs to be good enough, backed by plenty of quality games that get people's attention, and sent out with a clear message. Easier said than done, mind.
I actually think the hardware in the X1 and PS4 is very good and sensible overall (I could nitpick, of course). Consoles aren't breaking easily, the price is good, and they produce very nice visuals too. I mean, we are well past the point where a game looks bad due to technical limitations; the PS2 gen was arguably the last one that had that problem (well, the Wii last gen too, sure). They could always look better, of course, but they don't look bad. Sony and MS learned well from the mistakes of last gen. The biggest bottleneck in the industry at the moment is the wallet, not the hardware, so my gast is still flabbered as to why the Pro and Scorpio exist.
So when my laptop has an SSD and a better CPU, or my phone has a higher resolution than the PS4 and better WiFi, you don't think that's detrimental to people?
The OS runs slower than on cheap laptops because consoles use slow hard drives.
The WiFi they have is complete trash compared to cheaper phones on the market.
Most phones have stronger CPUs than both the XB1 and PS4.
I don't need a PS4/XB1 for Netflix since my smart TV has all that covered, and it's not as laggy and slow and has better speed.
Consoles took a massive hit this gen by going so weak; they are getting outdone by nearly every other device on the market, and they've had some of the worst and least diverse gaming lineups I've ever seen in a generation.
Now let's look at the PS3. When it came out it had a Cell processor stronger than nearly every PC gamer's CPU at the time, and it had a Blu-ray player, which almost nobody else had; whether for PC, TV or movie watching, hardly anyone had a Blu-ray drive until this. Its game lineup was probably on par with the PS4's, not much there at all compared to past gens, but the hardware at least made up for it.
You got so much with a PS3. Now with the PS4 you get pretty much nothing: the system fan makes a crap ton of room noise, the system itself heats up way too much, and all this other crap.
I don't remember my SNES making any noise at all: pop a game in and play, no noise, no heat issues, thousands of exclusives that all rocked, multiple different games in every genre on lock. Dude, it's time to wake up.
The 360 didn't even max out Oblivion; it was comparable to a PC with a GeForce 6800GT, and it ran at 1024x600 resolution on the 360... And most people back in 2006 were still using old CRT tube TVs, so the 7800GTX at 1280x1024 with max detail looked better than the 360's version...
It was comparable to max settings on PC, but unlike the 360, on a PC with a 7800GTX you could not have HDR+AA; it was one or the other.
Most people on PC were running 800x600 and 1024x768 back then. Don't try to group all PC gamers in with a damn 7800GTX; those who had a card like that were a minority, just like most people now don't own a 1080 or a Fury X.
You can do 4K on PC, yet the most used resolution is 1080p, which is a quarter of 4K. The same was true for PC back then; in fact you could run 1600x1200 on a 7900GTX and it would look better than the Xbox 360, but not by much: a cleaner picture and a little more foliage at max settings, while the 360 had better AA since you could not turn on AA and HDR together on the 7900GTX.
This is low, medium, max...
The Xbox 360 version was above medium but a little below max. Most people were playing this on low on PC in 2006, many were playing on medium, and few were playing on max, just like it is now; back then most people didn't rock a high-end GPU or CPU. Hell, a ton of people were still on Windows Me and 98 Second Edition, man.
Oh, and most people didn't even have a 1600x1200 monitor back then.
No, they only would have if consoles were 500 to 650 dollars. The 360 was weak: 512 MB of RAM was not enough, the eDRAM was too small for 720p with 2xMSAA, and lots of games couldn't go crazy on graphical effects because the GPU was being maxed out.
Yet when properly used, the PS3 shat on anything graphically on the 360, because of a monster CPU, faster RAM (512 MB total = 48 GB/s bandwidth), and a GPU that was on par power-wise with what the 360 had.
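To put the eDRAM point in numbers, here's a rough back-of-the-envelope sketch. It assumes the 4 bytes of colour plus 4 bytes of depth/stencil per sample that is usually quoted for Xenos, so treat it as an illustration rather than an official figure:

```python
# Rough check: does a 720p render target fit in the 360's 10 MB of eDRAM?
# Assumes 4 bytes colour + 4 bytes depth/stencil per sample (commonly quoted
# for Xenos); purely illustrative numbers.
EDRAM_MB = 10

def framebuffer_mb(width, height, msaa_samples, bytes_per_sample=8):
    return width * height * msaa_samples * bytes_per_sample / (1024 * 1024)

for samples in (1, 2, 4):
    size = framebuffer_mb(1280, 720, samples)
    tiles = -(-size // EDRAM_MB)  # ceiling division = predicated-tiling passes needed
    print(f"720p @ {samples}x MSAA: {size:5.1f} MB -> {int(tiles)} tile(s)")
```

That's roughly 7 MB at 720p with no MSAA, ~14 MB at 2x and ~28 MB at 4x, which is exactly why MSAA at 720p needed predicated tiling and why the "eDRAM too small" complaint above comes up.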
HDR+AA had been done before Oblivion in HL2:Lost Coast...... Moving on....
It was not really comparable to max settings... lower draw distances and lower LOD and texture detail vs. PC max.
The bug in Oblivion not allowing HDR and AA is a bug in the game engine, which you could bypass in the Nvidia control panel... But when you're running 1280x1024 or 1600x1200 there's no need for AA anyway.
"A medium spec PC with a decent processor and a mid-range DirectX 9 video card like a GeForce 6600 GT allows you to enable a few more graphics settings like view distance and some shadows. The Xbox 360 version of the game looks slightly better"
"Oblivion looks better on a high-end PC than on the Xbox 360. Note the additional foliage visible in the background. We matched up resolutions for screenshot comparison purposes here, but a high-end PC with an AMD Athlon FX-60 CPU and GeForce 7900 GTX graphics card can enable all the settings and take resolutions up to 1600x1200 or more and still maintain smooth frame rates."
"As you can see from the video and screenshots, the draw distance in our tests was not as far on 360. Mountains and trees will appear on the console but smaller details like rocks and architecture will pop-in later on 360."
Comparing people back then with 7800GTXs vs. 1080s or Furys today doesn't reflect the same range of GPUs from then to now. The difference between the 7900GTX, 7900GT and 7800GTX was very narrow in Oblivion at 1600x1200 with HDR. And at 1280x1024 even a 7600GS (aka 6800GT) can handle Oblivion on high with HDR at around 26 fps average.
Exactly.
And I'm sorry, it has to be noted: if someone were to walk into the room and say "hey, I've got this sick console, it runs all the eye candy at 1024x600 and 2x AA, so it must be more powerful than your rig that can't run all the eye candy at 1680x1050," nobody would give them the time of day, because the comparison is unusable. But somehow we're happy to draw those comparisons for the 360 and run with them?
The 360 was really powerful. Like, high-end-PC-at-launch powerful. Cost effective and extremely powerful. I don't see the reason to bend over backwards to use worthless comparisons. Does someone want to go run Oblivion on a 7600GS at 1024x600 with only 2xAA instead of 4x, but mid-high eye candy and HDR, and report back on the frame rates for us? The 360 should win for sure, but it would be interesting to have a worthwhile comparison point if this is going to be a serious point of contention. Then on a 7800GTX?
For us to see, someone would need a decade-old GPU... Oh wait, I do have such an old PC. It has an Athlon X2, 2 GB of RAM and a 6600GT. Later today I'll whip it out and install Oblivion, just for the hell of it.
Will Microsoft or Sony be willing to sell at a big loss at launch to challenge the PC's power?
If so, wouldn't you be concerned with another Xbox 360 Red Ring of Death like fiasco happening again, with all that power packed in a console? Unless they come with some high tech cooling solutions.
You did not just use the EIB ring and EDRAM in the bandwidth results.
It would be silly to exclude EIB and eDRAM bandwidth numbers from a Bandwidth list. GDDR3 bandwidth is not the only kind of bandwidth that matters.
One could argue that GDDR3 bandwidth is the most practical and important. But the lists above are raw theoretical numbers, not important practical numbers.
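For reference, these are the sort of theoretical peak figures that usually get thrown around in these arguments. A rough sketch: the numbers are the commonly cited marketing peaks, listed from memory, so treat them as ballpark rather than measured results:

```python
# Commonly cited theoretical peak bandwidths (GB/s) for the parts being argued about.
# These are peak/marketing figures from memory, not measured numbers.
bandwidth_gbs = {
    "X360 GDDR3 (unified)":     22.4,
    "X360 GPU<->eDRAM link":    32.0,
    "X360 eDRAM<->ROPs":       256.0,
    "PS3 XDR (Cell)":           25.6,
    "PS3 GDDR3 (RSX)":          22.4,
    "PS3 EIB (on-chip ring)":  204.8,
    "PC 8800 GTX GDDR3":        86.4,
    "PC dual-channel DDR2-800": 12.8,
}
for part, bw in sorted(bandwidth_gbs.items(), key=lambda kv: -kv[1]):
    print(f"{part:26s} {bw:6.1f} GB/s")
```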
So let's include the bandwidth from all the internal caches in G80 and the CPU while we're at it, since VRAM bandwidth is not the only kind of bandwidth that matters. It's even funny how you compare the CPU+GPU in the consoles but only use numbers from the G80.
The "red ring of death" was a major put off for consumers and costed Microsoft $1 billion. People think Microsoft made a loss because their console was so powerful. Which is total BS. They made a loss because they kept on breaking. Sony were the ones who made a loss on putting out expensive hardware.
The Cell is not just a CPU; its SPEs also function like an additional GPU, which is what the EIB is for. In comparison, the PC and Xbox 360 had traditional CPUs, whereas the Cell had an unusual CPU-GPU hybrid design.
We will never see another 360 ever again. MS didn't learn a goddamn thing about what made them a superstar in the first damn place! At the Xbox One reveal, MS showed they had NOT learned anything from the 360's successful life, and it shows. Everyone expected the One to be the one, but nope; that was proof that we will never see another 360, ever!
And as for PC, it was at its worst state last gen, but now PC is growing and becoming more popular than ever before thanks to this sluggish gen we're in.
Don Mattrick thought the Xbox 360's success was down to multiplats and Kinect, so it's clear why the Xbone launched the way it did...
Also, I disagree with your last sentence. Just because AAA publishers abandoned PC and PC specific genres, didn't mean PC was in a bad state.
It's like AAA publishers (Konami) declaring console gaming dead and saying mobile is the future... AAA publishers declaring horror dead, AAA publishers thinking every FPS needed to be like COD last gen. All of these decisions were based on fear.
Last gen they thought nobody was interested in PC gaming. At the start of this gen they thought nobody was interested in console gaming (hence the many remasters, hence the move to PC and mobile).
And meanwhile, PC and console gamers are like: "Huh? Whaaaaaaa?" Unsure whether to talk to that man on the street, who predicts the future by reading it in his own vomit.
I'm well aware of how the systems work, thank you very much... The caches in a PC GPU are still used for GPU tasks... And CPUs were also used for GPU tasks back then...
So my point still stands: stop picking and choosing things to make one look better than the other.
The total bandwidth in a 2006 PC is faster than in any of the consoles from the same period.
After some fighting to get Oblivion to start a new game: the beginning dungeon section leading to the outside averaged around 18 fps on ultra at 1024x768 with 2x AA, 22 fps on high (same AA and resolution), and around 26 fps on medium. With AA turned off it averaged 23 on ultra, 26 on high and 30 on medium. Once outside, it got around 25 on ultra, 32 on high and 40 on medium, with no AA at 1024x768.
PS3's RSX for RGBA16F: 550 MHz x 8 ROPs x 8 bytes = 35.2 GB/s. RSX only has 22.4 GB/s of raw memory bandwidth. Bandwidth bound.
7800 GTX 256 MB for RGBA16F: 430 MHz x 16 ROPs x 8 bytes = 55 GB/s. The 7800 GTX 256 MB only has 38.4 GB/s of raw memory bandwidth. Bandwidth bound.
8800 GTX for RGBA16F: 575 MHz x 24 ROPs x 8 bytes = 110 GB/s. The 8800 GTX only has 86.4 GB/s of raw memory bandwidth. Bandwidth bound.
Xbox 360 for RGBA16F: 500 MHz x 8 ROPs x 8 bytes = 32 GB/s. May need tiling. ROP bound.
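A quick sketch of the arithmetic behind those RGBA16F numbers (clock x ROPs x 8 bytes per FP16 pixel versus raw memory bandwidth), using the same figures quoted above; the 360 entry is compared against the eDRAM bandwidth since Xenos' ROPs write into the eDRAM rather than the GDDR3:

```python
# RGBA16F fill-rate demand vs. raw memory bandwidth, using the figures quoted above.
# demand = core clock (Hz) * ROP count * 8 bytes per RGBA16F pixel
gpus = {
    # name: (clock_mhz, rops, mem_bandwidth_gbs)
    "PS3 RSX":        (550,  8,  22.4),
    "7800 GTX 256MB": (430, 16,  38.4),
    "8800 GTX":       (575, 24,  86.4),
    "Xbox 360 Xenos": (500,  8, 256.0),  # ROPs write into eDRAM, not GDDR3
}
for name, (mhz, rops, mem_bw) in gpus.items():
    demand = mhz * 1e6 * rops * 8 / 1e9   # GB/s the ROPs could consume
    bound = "bandwidth bound" if demand > mem_bw else "ROP bound"
    print(f"{name:15s} needs {demand:6.1f} GB/s, has {mem_bw:6.1f} GB/s -> {bound}")
```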
The principle is from
RSX's GFLOPS number is useless, i.e. the bottleneck is the entry point. Without GigaThreads (G80) or Ultra-Threading, a stall in the FP texture processor also stalls the FP32 units.
Xenos was the first unified shader GPU and has 64 threads (Ultra-Threading-like) decoupled from the ALUs.
In unified shader GPUs, the structure has been flattened and exposed with wider entry points and generalized ALUs.
Refer to http://forum.beyond3d.com/showthread.php?p=552774
Read Jawed's post
For example texture fetches in RSX will always be painfully slow in comparison - but how slow depends on the format of the textures.
Also, control flow operations in RSX will be out of bounds because they are impractically slow - whereas in Xenos they'll be the bread and butter of good code because there'll be no performance penalty.
Dependent texture fetches in Xenos (I presume that's what the third point means), will work without interrupting shader code - again RSX simply can't do this, dependent texturing blocks one ALU per pipe
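To make the "decoupled threads hide stalls" argument concrete, here is a deliberately abstract toy model; the latency and thread counts are made up for illustration and are nothing like real Xenos or RSX parameters. The point is simply that a core which can switch between many resident threads keeps its ALU busy while some threads wait on texture fetches, whereas a core with nothing to switch to just stalls:

```python
# Toy model of latency hiding: N resident threads, each alternating one ALU op
# and one texture fetch that takes FETCH_LATENCY cycles to return.
# Reports ALU utilisation over a fixed window. Parameters are illustrative only.
def alu_utilisation(num_threads, fetch_latency=20, cycles=10_000):
    ready_at = [0] * num_threads   # cycle at which each thread can issue again
    busy = 0
    for cycle in range(cycles):
        for t in range(num_threads):
            if ready_at[t] <= cycle:
                busy += 1                                 # issue one ALU op this cycle
                ready_at[t] = cycle + 1 + fetch_latency   # then wait on a texture fetch
                break                                     # only one ALU issue per cycle
    return busy / cycles

for n in (1, 4, 16, 64):
    print(f"{n:3d} resident threads -> {alu_utilisation(n):.0%} ALU utilisation")
```

With these made-up numbers, one thread keeps the ALU busy about 5% of the time and 64 threads keep it at 100%, which is the general effect the decoupled-thread designs are after.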
From https://forum.beyond3d.com/posts/1460125/
------------------------
"I could go on for pages listing the types of things the spu's are used for to make up for the machines aging gpu, which may be 7 series NVidia but that's basically a tweaked 6 series NVidia for the most part. But I'll just type a few off the top of my head:"
1) Two ppu/vmx units
There are three ppu/vmx units on the 360, and just one on the PS3. So any load on the 360's remaining two ppu/vmx units must be moved to spu.
2) Vertex culling
You can look back a few years at my first post talking about this, but it's common knowledge now that you need to move as much vertex load as possible to spu otherwise it won't keep pace with the 360.
3) Vertex texture sampling
You can texture sample in vertex shaders on 360 just fine, but it's unusably slow on PS3. Most multi platform games simply won't use this feature on 360 to make keeping parity easier, but if a dev does make use of it then you will have no choice but to move all such functionality to spu.
4) Shader patching
Changing variables in shader programs is cake on the 360. Not so on the PS3 because they are embedded into the shader programs. So you have to use spu's to patch your shader programs.
5) Branching
You never want a lot of branching in general, but when you do really need it the 360 handles it fine, PS3 does not. If you are stuck needing branching in shaders then you will want to move all such functionality to spu.
6) Shader inputs
You can pass plenty of inputs to shaders on 360, but do it on PS3 and your game will grind to a halt. You will want to move all such functionality to spu to minimize the amount of inputs needed on the shader programs.
7) MSAA alternatives
Msaa runs full speed on 360 gpu needing just cpu tiling calculations. Msaa on PS3 gpu is very slow. You will want to move msaa to spu as soon as you can.
8) Post processing
360 is unified architecture meaning post process steps can often be slotted into gpu idle time. This is not as easily doable on PS3, so you will want to move as much post process to spu as possible.
9) Load balancing
360 gpu load balances itself just fine since it's unified. If the load on a given frame shifts to heavy vertex or heavy pixel load then you don't care. Not so on PS3 where such load shifts will cause frame drops. You will want to shift as much load as possible to spu to minimize your peak load on the gpu.
10) Half floats
You can use full floats just fine on the 360 gpu. On the PS3 gpu they cause performance slowdowns. If you really need/have to use shaders with many full floats then you will want to move such functionality over to the spu's.
11) Shader array indexing
You can index into arrays in shaders on the 360 gpu no problem. You can't do that on PS3. If you absolutely need this functionality then you will have to either rework your shaders or move it all to spu.
Etc, etc, etc...
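To give a flavour of what item 2 in that list (moving vertex culling to the SPUs) boils down to, here is a minimal conceptual sketch. The real thing ran as hand-tuned SPU jobs on SIMD-friendly vertex data, so treat this purely as an illustration of the idea; the function names and layout are mine:

```python
# Minimal sketch of pre-GPU triangle culling (backface + off-screen rejection),
# the kind of work PS3 engines pushed onto SPUs so RSX only saw surviving triangles.
# Vertices are assumed to already be 4-component clip-space positions (x, y, z, w)
# with w > 0; a real implementation also handles triangles crossing the near plane.

def outside_same_plane(v0, v1, v2):
    """True if all three clip-space vertices are outside the same frustum plane."""
    for axis in range(3):                 # x, y, z
        for sign in (1, -1):
            if all(sign * v[axis] > v[3] for v in (v0, v1, v2)):
                return True
    return False

def backfacing(v0, v1, v2):
    """Signed area of the projected triangle; <= 0 means backfacing (CCW convention)."""
    x0, y0 = v0[0] / v0[3], v0[1] / v0[3]
    x1, y1 = v1[0] / v1[3], v1[1] / v1[3]
    x2, y2 = v2[0] / v2[3], v2[1] / v2[3]
    return (x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0) <= 0.0

def cull_triangles(clip_verts, indices):
    """Return a compacted index list containing only potentially visible triangles."""
    out = []
    for i in range(0, len(indices), 3):
        v0, v1, v2 = (clip_verts[indices[i + k]] for k in range(3))
        if outside_same_plane(v0, v1, v2) or backfacing(v0, v1, v2):
            continue                      # rejected: this triangle never reaches the GPU
        out.extend(indices[i:i + 3])
    return out
```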
Sony (SCEA)'s study paper on "Deferred Pixel Shading on the Playstation 3" and its comparative performance against the GeForce 7800 GTX can be found at http://research.scea.com/ps3_deferred_shading.pdf
Quote:
"D. Comparison to GeForce 7800 GTX GPU
We implemented the same algorithm on a high end state of the art GPU, the NVIDIA GeForce 7800 GTX running in a Linux workstation. This GPU has 24 fragment shader pipelines running at 430 Mhz and processes 24 fragments in parallel. By comparison the 5 SPEs that we used process 20 pixels in parallel in quad-SIMD form.
The GeForce required 11.1 ms to complete the shading operation. In comparison the Cell/B.E. required 11.65 ms including the DMA waiting time."
Due to inefficiencies in the PS3, the overall result ends up at roughly Xbox 360 level.
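Spelling out what those quoted timings mean in relative terms:

```python
# Relative cost of the shading pass from the quoted SCEA figures.
geforce_ms, cell_ms = 11.1, 11.65
ratio = cell_ms / geforce_ms
print(f"Cell/B.E. took {ratio:.2f}x the 7800 GTX's time (~{(ratio - 1) * 100:.0f}% slower on that workload).")
```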
With this console generation, AMD nuked the PC's separate VLIW-based GPU design and unified its separate GPU designs into one, i.e. a SIMD/MIMD-based design. GCN is the spiritual successor to the Xbox 360's Xenos, and it's key IP for Qualcomm's Adreno unified shader GPUs.
From http://news.priorsmart.com/advanced-micro-devices-v-lg-electronics-la2U/
AMD sues LG on GPU patents and LG uses ARM based SoC chipsets.
Filed on March 5, 2014.
From http://www.law360.com/articles/515848/amd-picks-patent-fight-with-lg-over-graphics-technology
AMD is using its multithreaded shader GPU patents to sue LG.
"The allegedly infringing products include LG televisions, smartphones, tablets, Blu-ray players, projectors and appliances that embody or practice the patented inventions"
LG should follow Sony, Microsoft and Nintendo in licensing AMD's IP blocks.
AMD's multithreaded unified shader GPU patents
Multi-thread graphics processing system
http://www.law360.com/patents/7742053
Graphics processing architecture employing a unified shader
http://www.law360.com/patents/7327369
Graphics processing architecture employing a unified shader
GeForce 7 can't combine HDR (with FP surfaces) and MSAA.
Qn: "Does NVIDIA support HDR and Anti Aliasing at the same time?"
Ans: "Two key hardware features present in GeForce 6 and GeForce 7 GPUs that accelerate and enable great HDR effects include FP16 texture filtering and FP16 frame-buffer blending. There are many ways to accomplish AA and HDR simultaneously in applications. Some games like Half Life 2 Lost Coast use an integer-based HDR technique, and in such games, the GeForce 6 and GeForce 7 chips can render HDR with MSAA simultaneously using that method. The GeForce 6 and GeForce 7 series GPUs do not support simultaneous multisampled AA (MSAA) and FP16 blending in hardware (the method used in 3dMark 2006)."
Oblivion's issue with GeForce 7 and FP HDR + MSAA is NOT a bug but a limitation of the hardware, i.e. the same problem as in 3DMark 2006.
You seem to be confusing the EIB and eDRAM with L1/L2 cache. The PS3 and 360 have their own separate L1/L2 cache inside both their CPU and GPU. On top of that, they have the EIB and eDRAM, which are separate from the L1/L2 cache. If you want to include the L1/L2 cache for PC, then you'd have to do the same for the PS3 and 360, nullifying whatever advantage you think a 2006 PC would gain from including L1/L2 cache.
The point still stands that the consoles had a higher overall bandwidth than a 2006 PC. But the advantage that a 2006 PC had was a higher GDDR3 bandwidth, which is the one that mattered the most, especially for video games. In comparison, most of the Cell's EIB bandwidth was great for supercomputing applications, but poorly optimized for video games.
You still can't run HDR and AA through the game even with modern Nvidia GPUs; it's the game's coding... However you can force AA through the control panel and/or Nvidia Inspector.
Nope. The 360 was in a class of its own. A truly amazing, no-frills, remarkable, well-rounded system. It delivered on what it was designed to do: PLAY GAMES, PROVIDE ONLINE GAMING and MULTIMEDIA capabilities.
You're delusional. PCs never did. Only in your imagination. 1st I've ever heard of this. Must be a hardcore Xfan. They are delusional and wear Xgoggles.
Well, the price-to-performance ratio was certainly amazing at launch, but like others have already said, PC was and remains to this day (with respect to current-gen consoles) the superior technology. So, yeah, that's the last time we are gonna see a massive jump over the previous generation of consoles. Consoles going from predominantly SD to HD was a huge boost as well.
A PC from 2006 has L1/L2 in the CPU and GPU... just like the PS3 and 360.
PC cache is faster, which would ultimately bump the total PC bandwidth above the consoles'... regardless of the hardware... especially since most of what you posted is THEORETICAL bandwidth...
But let's get to it: your system is only as fast as your slowest part, and the slowest part in a console (the RAM) is much, much slower than in a PC.
That's my point, that the PS3 and 360 also had L1/L2 cache in their CPU and GPU, just like PC. But the difference is that, on top of the L1/L2 cache, the PS3 and 360 also have the EIB and eDRAM.
Do you have any numbers to back up your claim that a 2006 PC had faster L1/L2? Even if it did, I doubt it would be faster than the PS3's combined L1/L2+EIB or the 360's combined L1/L2+eDRAM.
The slowest part in a 2006 PC, the main DDR2 RAM, had a maximum bandwidth of 8.5 GB/s, much slower than the slowest parts in the PS3 and 360 (the 22.4 GB/s GDDR3).
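For what it's worth, the 8.5 GB/s figure is just the standard transfer-rate arithmetic; this little sketch assumes dual-channel DDR2-533, which is the configuration that number corresponds to:

```python
# Peak DDR2 bandwidth = transfer rate (MT/s) x 8 bytes per transfer x channels.
def ddr2_peak_gbs(mt_per_s, channels=2):
    return mt_per_s * 1e6 * 8 * channels / 1e9

print(f"Dual-channel DDR2-533: {ddr2_peak_gbs(533):.1f} GB/s")   # ~8.5 GB/s
```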
For the GeForce FX/6/7 series, the control panel's forced MSAA has no effect on HDR that uses floating-point surfaces. This hardware issue is not limited to Oblivion, since 3DMark 2006 has the same problem. The PS3's RSX has the same limitation as the PC's GeForce 7.
NAO32 is a known workaround, i.e. convert FP into integer: HDR in an INT format allows MSAA on G70. The workaround costs several extra shader instructions per pixel.
Any driver-intercept method is still bound by the hardware limitations.
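For the curious, the NAO32 trick is essentially "store HDR as integers so the ROPs can still multisample". Below is a very rough illustration of that general idea only: it packs log-encoded luminance plus chromaticity into an 8-bit-per-channel target, with made-up range constants, and is not the real NAO32/LogLuv maths:

```python
import math

# Rough illustration of integer HDR encoding: pack a high-dynamic-range colour into
# an ordinary 8-bit RGBA target by storing chromaticity plus log2(luminance).
# Not the real NAO32/LogLuv coefficients, just the general idea.
LUM_MIN, LUM_MAX = 2.0 ** -8, 2.0 ** 8        # representable luminance range (assumption)
LOG_SPAN = math.log2(LUM_MAX) - math.log2(LUM_MIN)

def encode(r, g, b):
    lum = max(0.2126 * r + 0.7152 * g + 0.0722 * b, LUM_MIN)
    log_lum = (math.log2(lum) - math.log2(LUM_MIN)) / LOG_SPAN
    total = (r + g + b) or 1e-6
    cx, cy = r / total, g / total             # simple chromaticity coordinates
    to8 = lambda v: max(0, min(255, round(v * 255)))
    return to8(cx), to8(cy), to8(log_lum), 255

def decode(px):
    cx, cy, log_lum = px[0] / 255, px[1] / 255, px[2] / 255
    lum = 2.0 ** (log_lum * LOG_SPAN + math.log2(LUM_MIN))
    r, g = cx, cy
    b = max(1.0 - r - g, 0.0)
    scale = lum / max(0.2126 * r + 0.7152 * g + 0.0722 * b, 1e-6)
    return r * scale, g * scale, b * scale

print(decode(encode(4.0, 2.0, 0.5)))  # roughly recovers the HDR colour, within 8-bit precision
```

Because the render target is a plain integer format, the hardware MSAA resolve works on it; the extra encode/decode instructions in the shaders are the cost mentioned above.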
G80 can do MSAA and HDR FP16.
My GPUs during the DX9 era were an FX 5200 (work laptop), FX 5950, Mobility Radeon 9600M and Mobility X1600 (work laptop), followed in the DX10 era by an 8600 GTS and an 8600M GT (bump gated).
Huh? The PS3 had superior hardware and consistently produced the better-looking games. The 360 did have a solid lineup of games though, and GoW 2, Halo 3 and Forza were a lot of fun.