Sony discovers Cell+RSX rendering technique


This topic is locked from further discussion.


#1 Redfingers
Member since 2005 • 4510 Posts

According to this report, which is linked in this article on PSU:

http://www.psu.com/Cell-and-RSX-work-together-for-best-results-News--a0001332-p0.php

And if you're PSU-agnostic:

http://research.scea.com/ps3_deferred_shading.pdf

PSU gives a good summary though so I'll post it here:

"The idea is that Synergistic Processing Elements (SPEs) of the Cell/BE can use the chip's DMA list feature to gather "irregular fine-grained fragments of texture data" generated by the RSX graphics chip, returning shading textures in the same way.

This means that the many small SPEs can apply shading on top of the work that the GPU does to texture the game world, shunting some of the workload to the (potentially underused) Cell/BE. The end result is that more work can be done while maintaining a high framerate.

The example code used for the research achieved an 85 Hz frame rate at 720p resolution. The report states that although numbers like that are unlikely to show up in actual games, progress has been made to at least utilise this method to improve graphics and frame rates in game."

What's truly interesting about this is that it proves that the Playstation 3's Cell processor can, in fact, help with graphics processing....to such a degree that it's helping shade segments of textures! That, and the benchmarks are impressive, even if we know hardly anything about them.

Also, this from the PDF: "The shading computation ran at 85Hz (FPS) at HDTV 720p resolution on 5 SPEs and generated 30.72 gigaops [sic?] of performance. This is comparable to the performance of the algorithm running on a state of the art high end GPU."
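
To make that summary a little more concrete, here's a rough sketch (plain C, written by me, not code from the paper) of the kind of loop each SPE would run: grab a small tile of fragment data out of the buffer the RSX produced, light each fragment, and send the shaded results back. On the real hardware the two memcpy calls would be DMA transfers (mfc_get/mfc_put) into the SPE's 256 KB local store, and you'd double-buffer tiles so the next transfer overlaps the current tile's shading; the struct layout, tile size and single point light here are all made up for illustration.

/* Hypothetical sketch of per-SPE deferred shading over screen tiles.
 * On the PS3 the memcpy calls below would be MFC DMA transfers
 * (mfc_get / mfc_put) between main memory and the SPE local store. */
#include <math.h>
#include <string.h>

#define TILE_PIXELS 1024            /* e.g. a 32x32 tile fits easily in local store */

typedef struct {                    /* illustrative G-buffer layout, not the paper's */
    float nx, ny, nz;               /* view-space normal   */
    float px, py, pz;               /* view-space position */
    float r, g, b;                  /* albedo              */
} Fragment;

typedef struct { float r, g, b; } Color;

/* Shade one tile with a single hard-coded point light (Lambert only). */
static void shade_tile(const Fragment *in, Color *out, int count)
{
    const float lx = 0.0f, ly = 10.0f, lz = 0.0f;   /* light position */
    for (int i = 0; i < count; ++i) {
        float dx = lx - in[i].px, dy = ly - in[i].py, dz = lz - in[i].pz;
        float len = sqrtf(dx*dx + dy*dy + dz*dz) + 1e-6f;
        float ndotl = (in[i].nx*dx + in[i].ny*dy + in[i].nz*dz) / len;
        if (ndotl < 0.0f) ndotl = 0.0f;
        out[i].r = in[i].r * ndotl;
        out[i].g = in[i].g * ndotl;
        out[i].b = in[i].b * ndotl;
    }
}

/* gbuffer/result point at buffers the RSX wrote / will read in shared memory. */
void shade_frame(const Fragment *gbuffer, Color *result, int total_fragments)
{
    Fragment tile_in[TILE_PIXELS];  /* stands in for SPE local-store buffers */
    Color    tile_out[TILE_PIXELS];

    for (int base = 0; base < total_fragments; base += TILE_PIXELS) {
        int n = total_fragments - base;
        if (n > TILE_PIXELS) n = TILE_PIXELS;

        /* "DMA in": gather this tile's fragments from the G-buffer. */
        memcpy(tile_in, gbuffer + base, n * sizeof(Fragment));

        shade_tile(tile_in, tile_out, n);

        /* "DMA out": scatter the shaded colors back for the RSX to use. */
        memcpy(result + base, tile_out, n * sizeof(Color));
    }
}

The point isn't the lighting math, it's the shape of the loop: the SPE only ever touches one small tile at a time, which is exactly what the DMA-list gathering in the paper is for.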


#2 bad82man82
Member since 2006 • 1059 Posts
Good news, but we wanna see it in real time.

#3 Redfingers
Member since 2005 • 4510 Posts

Perhaps more significant: "We have explored the potential of Cell/B.E. for accelerating graphical operations in the Playstation 3 computer entertainment system. This system combined the Cell/B.E. with a state of the art GPU in a unified memory architecture. In this architecture both devices share access to system memory and to graphics memory. As a result they can share data and processing tasks."

Let it be known that in this particular experiment, these tasks were performed....admirably. Carmack be damned. Apparently we were all wrong.


#4 muscleserge
Member since 2005 • 3307 Posts

*snip*

Redfingers
So it takes Cell+RSX to equal about an 8800GTS. Good news for Cows; maybe some of their 2008 games will start to look great. I kind of wanted to get a PS3, but the lack of games put me off for a while.

#5 Redfingers
Member since 2005 • 4510 Posts

Good news, but we wanna see it in real time.
bad82man82

Okay, buck-a-roo! Saddle up! You're going to Sony's first party development studios to witness this experimental technique in action! In fact, because this demonstration has only been performed behind closed doors, implementing it in a game will essentially be the breakthrough application of the entire operation!

Congratulations, chief!


#6 Redfingers
Member since 2005 • 4510 Posts
[QUOTE="Redfingers"]

According to this report which is linked in this article in PSU:

http://www.psu.com/Cell-and-RSX-work-together-for-best-results-News--a0001332-p0.php

And if you're PSU-agnostic:

http://research.scea.com/ps3_deferred_shading.pdf

PSU gives a good summary though so I'll post it here:

"The idea is that Synergistic Processing Elements (SPEs) of the Cell/BE can use the chip's DMA list feature to gather "irregular fine-grained fragments of texture data" generated by the RSX graphics chip, returning shading textures in the same way.

This means that the many small SPEs can apply shading on top of the work that the GPU does to texture the game world, shunting some of the workload to the (potentially underused) Cell/BE. The end result is that more work can be done while maintaining a high framerate.

The example code used for the research achieved an 85 Hz frame rate at 720p resolution. The report states that although numbers like that are unlikely to show up in actual games, progress has been made to at least utilise this method to improve graphics and frame rates in game."

What's truly interesting about this is that it proves that the Playstation 3's Cell processor can, in fact, help with graphics processing....to such a degree that it's helping shade segments of textures! That, and the benchmarks are impressive, even if we know hardly anything about them.

Also, this from the PDF: "The shading computation ran at 85Hz (FPS) at HDTV 720p resolution on 5 SPEs and generated 30.72 gigaops [sic?] of performance. This is comparable to the performance of the algorithm running on a state of the art high end GPU."

muscleserge

So it takes cell+RSX to equal to about an 8800GTS. Good news for Cows, maybe some of their 2008 games will start to look great. I kind of wanted to get a PS3 but the lack of games put me off for a while.

Reading comprehension FTW. Lack of reading comprehension FTL. The report says that the shading computation (which is performed entirely by those 5 SPEs on the Cell processor) runs at 85Hz with 30.72 gigaops of performance.


#7 DivergeUnify
Member since 2007 • 15150 Posts
[QUOTE="Redfingers"]

According to this report which is linked in this article in PSU:

http://www.psu.com/Cell-and-RSX-work-together-for-best-results-News--a0001332-p0.php

And if you're PSU-agnostic:

http://research.scea.com/ps3_deferred_shading.pdf

PSU gives a good summary though so I'll post it here:

"The idea is that Synergistic Processing Elements (SPEs) of the Cell/BE can use the chip's DMA list feature to gather "irregular fine-grained fragments of texture data" generated by the RSX graphics chip, returning shading textures in the same way.

This means that the many small SPEs can apply shading on top of the work that the GPU does to texture the game world, shunting some of the workload to the (potentially underused) Cell/BE. The end result is that more work can be done while maintaining a high framerate.

The example code used for the research achieved an 85 Hz frame rate at 720p resolution. The report states that although numbers like that are unlikely to show up in actual games, progress has been made to at least utilise this method to improve graphics and frame rates in game."

What's truly interesting about this is that it proves that the Playstation 3's Cell processor can, in fact, help with graphics processing....to such a degree that it's helping shade segments of textures! That, and the benchmarks are impressive, even if we know hardly anything about them.

Also, this from the PDF: "The shading computation ran at 85Hz (FPS) at HDTV 720p resolution on 5 SPEs and generated 30.72 gigaops [sic?] of performance. This is comparable to the performance of the algorithm running on a state of the art high end GPU."

muscleserge
So it takes cell+RSX to equal to about an 8800GTS. Good news for Cows, maybe some of their 2008 games will start to look great. I kind of wanted to get a PS3 but the lack of games put me off for a while.

Wow, wtf, how do you come up with some dumb figure like that?

#8 Burnout_Player0
Member since 2007 • 702 Posts
PSU, the site that faked the interview with Valve :lol:

#9 Dreams-Visions
Member since 2006 • 26578 Posts
Okay. My enthusiasm is limited. Will look forward to seeing how this affects actual games...and when.

#10 lagreendown
Member since 2006 • 1612 Posts
nice so you can say sony unlocked teh powah

#11 -wii60-
Member since 2007 • 3287 Posts
Interesting, but it will never beat my PC.

#12 Whiteknight19
Member since 2003 • 1303 Posts
[QUOTE="Redfingers"]

According to this report which is linked in this article in PSU:

http://www.psu.com/Cell-and-RSX-work-together-for-best-results-News--a0001332-p0.php

And if you're PSU-agnostic:

http://research.scea.com/ps3_deferred_shading.pdf

PSU gives a good summary though so I'll post it here:

"The idea is that Synergistic Processing Elements (SPEs) of the Cell/BE can use the chip's DMA list feature to gather "irregular fine-grained fragments of texture data" generated by the RSX graphics chip, returning shading textures in the same way.

This means that the many small SPEs can apply shading on top of the work that the GPU does to texture the game world, shunting some of the workload to the (potentially underused) Cell/BE. The end result is that more work can be done while maintaining a high framerate.

The example code used for the research achieved an 85 Hz frame rate at 720p resolution. The report states that although numbers like that are unlikely to show up in actual games, progress has been made to at least utilise this method to improve graphics and frame rates in game."

What's truly interesting about this is that it proves that the Playstation 3's Cell processor can, in fact, help with graphics processing....to such a degree that it's helping shade segments of textures! That, and the benchmarks are impressive, even if we know hardly anything about them.

Also, this from the PDF: "The shading computation ran at 85Hz (FPS) at HDTV 720p resolution on 5 SPEs and generated 30.72 gigaops [sic?] of performance. This is comparable to the performance of the algorithm running on a state of the art high end GPU."

muscleserge
So it takes cell+RSX to equal to about an 8800GTS. Good news for Cows, maybe some of their 2008 games will start to look great. I kind of wanted to get a PS3 but the lack of games put me off for a while.

lol, I don't think the PS3 can reach an 8800GTS; the PS3 doesn't have DirectX 10. But if they bring out the power, then you're looking at roughly a high-end 7900GTX.

#13 Redfingers
Member since 2005 • 4510 Posts

The idea that the Cell processor is as capable as a GPU at running GPU-optimized code is not new, nor is it unsupported.

http://gametomorrow.com/blog/index.php/2005/11/30/gpus-vs-cell/

"I thought it would be interesting to port this Nvidia CG code to the Cell processor, using the public SDK, and see how it performs given that it was ideal for a GPU. First we directly translated the CG code line for line to C + SPE intrinsics. All the CG code structures and data types were maintained. Then we wrote a CG framework to execute this shader for Cell that included a backend image compression and network delivery layer for the finished images. To our surprise, well not really, we found that using only 7 SPEs for rendering a 3.2 GHz Cell chip could out run an Nvidia 7800 GT OC card at this task by about 30%. We reserved one SPE for the image compression and delivery task. Furthermore the way CG structures it SIMD computation is inefficient as it causes large percentages of the code to execute in scalar mode. This is due to the way they structure their vector data, AOS vs SOA. By converting this CG shader from AOS to SOA form, SIMD utilization was much higher which resulted in Cell out performing the Nvidia 7800 by a factor of 5 - 6x using only 7 SPEs for rendering."

And the Nvidia 7900 GTX comparison? Kay. Per that post, the Cell outperformed the 7800 GT by about 30% on the GPU-optimized code, and by a factor of 5-6x once the shader was converted to SOA form. I would assume that's feasibly over 2x, possibly 3-4x, the raw graphics processing capability of the 7900 series. Also, the Sony report penned today says that 5 SPEs (there are 7 SPEs available in the PlayStation 3, 1 reserved for the OS, plus a PPE; in the Cell processor itself there are 8 SPEs and one PPE) ran comparably to a state-of-the-art GPU... meaning an 8800/2900, presumably.
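
For anyone wondering what that AOS vs SOA bit in the quote actually means, here's a minimal illustration (generic C, nothing Cell-specific, names made up by me): the same normal-dot-light loop in both layouts. The structure-of-arrays version keeps all the x components together, all the y components together, and so on, so a 4-wide SIMD unit like an SPE's can process four fragments per instruction instead of wasting lanes on one fragment's x, y, z.

#include <stddef.h>

typedef struct { float x, y, z; } Vec3;    /* array-of-structures element */

typedef struct {                           /* structure-of-arrays layout  */
    const float *x, *y, *z;
} Vec3SOA;

/* AOS: one fragment per iteration; the x/y/z of a single fragment end up
 * packed together, so most SIMD lanes sit idle. */
void dot_aos(const Vec3 *normals, Vec3 light, float *out, size_t n)
{
    for (size_t i = 0; i < n; ++i)
        out[i] = normals[i].x * light.x
               + normals[i].y * light.y
               + normals[i].z * light.z;
}

/* SOA: each iteration reads contiguous runs of x, y and z, so a compiler
 * (or hand-written SPE intrinsics) can do four fragments per instruction. */
void dot_soa(Vec3SOA normals, Vec3 light, float *out, size_t n)
{
    for (size_t i = 0; i < n; ++i)
        out[i] = normals.x[i] * light.x
               + normals.y[i] * light.y
               + normals.z[i] * light.z;
}

Same math, same results; the only difference is how the data is laid out in memory, and that layout change is where the 5-6x in that blog post reportedly came from.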


#14 Redfingers
Member since 2005 • 4510 Posts

psu, the site that faked the interview with Valve :lol:
Burnout_Player0

I gave you the report itself, no nonsense. Read it, if you like.


#15 muscleserge
Member since 2005 • 3307 Posts
[QUOTE="muscleserge"][QUOTE="Redfingers"]

According to this report which is linked in this article in PSU:

http://www.psu.com/Cell-and-RSX-work-together-for-best-results-News--a0001332-p0.php

And if you're PSU-agnostic:

http://research.scea.com/ps3_deferred_shading.pdf

PSU gives a good summary though so I'll post it here:

"The idea is that Synergistic Processing Elements (SPEs) of the Cell/BE can use the chip's DMA list feature to gather "irregular fine-grained fragments of texture data" generated by the RSX graphics chip, returning shading textures in the same way.

This means that the many small SPEs can apply shading on top of the work that the GPU does to texture the game world, shunting some of the workload to the (potentially underused) Cell/BE. The end result is that more work can be done while maintaining a high framerate.

The example code used for the research achieved an 85 Hz frame rate at 720p resolution. The report states that although numbers like that are unlikely to show up in actual games, progress has been made to at least utilise this method to improve graphics and frame rates in game."

What's truly interesting about this is that it proves that the Playstation 3's Cell processor can, in fact, help with graphics processing....to such a degree that it's helping shade segments of textures! That, and the benchmarks are impressive, even if we know hardly anything about them.

Also, this from the PDF: "The shading computation ran at 85Hz (FPS) at HDTV 720p resolution on 5 SPEs and generated 30.72 gigaops [sic?] of performance. This is comparable to the performance of the algorithm running on a state of the art high end GPU."

DivergeUnify
So it takes cell+RSX to equal to about an 8800GTS. Good news for Cows, maybe some of their 2008 games will start to look great. I kind of wanted to get a PS3 but the lack of games put me off for a while.

Wow, wtf how do you come up with some dumb figure like that.

Read the last statement. IMO, the G80s are the state of the art right now.

#16 TheSystemLord1
Member since 2006 • 7786 Posts

Okay. My enthusiasm is limited. Will look forward to seeing how this affects actual games...and when.
Dreams-Visions

Just wait...for 2010


#17 thinicer
Member since 2006 • 3704 Posts
Redfingers, that's all well and good, but nobody is disputing the power of the PS3. PS3 games look good and will continue to look fantastic in future titles. But as we have seen in previous generations, raw power and graphics are not going to win you a console war. Price and content, two things the PS3 is having real trouble with right now, will do that for you.

#18 DivergeUnify
Member since 2007 • 15150 Posts
[QUOTE="DivergeUnify"][QUOTE="muscleserge"][QUOTE="Redfingers"]

According to this report which is linked in this article in PSU:

http://www.psu.com/Cell-and-RSX-work-together-for-best-results-News--a0001332-p0.php

And if you're PSU-agnostic:

http://research.scea.com/ps3_deferred_shading.pdf

PSU gives a good summary though so I'll post it here:

"The idea is that Synergistic Processing Elements (SPEs) of the Cell/BE can use the chip's DMA list feature to gather "irregular fine-grained fragments of texture data" generated by the RSX graphics chip, returning shading textures in the same way.

This means that the many small SPEs can apply shading on top of the work that the GPU does to texture the game world, shunting some of the workload to the (potentially underused) Cell/BE. The end result is that more work can be done while maintaining a high framerate.

The example code used for the research achieved an 85 Hz frame rate at 720p resolution. The report states that although numbers like that are unlikely to show up in actual games, progress has been made to at least utilise this method to improve graphics and frame rates in game."

What's truly interesting about this is that it proves that the Playstation 3's Cell processor can, in fact, help with graphics processing....to such a degree that it's helping shade segments of textures! That, and the benchmarks are impressive, even if we know hardly anything about them.

Also, this from the PDF: "The shading computation ran at 85Hz (FPS) at HDTV 720p resolution on 5 SPEs and generated 30.72 gigaops [sic?] of performance. This is comparable to the performance of the algorithm running on a state of the art high end GPU."

muscleserge
So it takes cell+RSX to equal to about an 8800GTS. Good news for Cows, maybe some of their 2008 games will start to look great. I kind of wanted to get a PS3 but the lack of games put me off for a while.

Wow, wtf how do you come up with some dumb figure like that.

Read the last statement. IMO state of the art right now are G80s.

G80 was the pending name of an 8800gtx like two years ago...

#19 speedjunkie1992
Member since 2006 • 1395 Posts
Erm, I would read it, but I was expecting some kind of real-time footage or an image of it being used in real time.

#20 Teuf_
Member since 2004 • 30805 Posts
Yes, I read that paper a while ago; it's very cool stuff. Killzone 2 probably uses some similar ideas for its deferred renderer, but probably only uses a portion of the Cell for graphics stuff.

If anyone likes, I can try to explain what's going on in that paper in less technical language.

#21 Whiteknight19
Member since 2003 • 1303 Posts

Sony knows what they're doing when they're making consoles, and the PS3 will show more of its power; that power will come into use, which will make the PlayStation 3 the fastest console to date.


#22 Teuf_
Member since 2004 • 30805 Posts
G80 was the pending name of an 8800gtx like two years ago...
DivergeUnify


No, G80 is the name of the GPU core in the 8800 video cards. G70 was in 7800, G71 in 7900, etc.

#23 lhughey
Member since 2006 • 4890 Posts

More hidden-power-of-the-Cell claims. The proof is in the pudding! All this "in theory" or "one day" stuff is getting really old. Until you can show me a game that is obviously much better than what can be done on the 360, I'll chalk this up to a marketing ploy.

Don't get me wrong, I believe the article, to an extent. You can find tweaks or advances for every software application after it's released. There have been countless times that I've released software and said, "Man, I should have done it like this. This would have made performance better."


#24 Pro_wrestler
Member since 2002 • 7880 Posts
[QUOTE="Redfingers"]


The example code used for the research achieved an 85 Hz frame rate at 720p resolution. The report states that although numbers like that are unlikely to show up in actual games, progress has been made to at least utilise this method to improve graphics and frame rates in game."

So they're still figuring out the Cell/RSX, but if it doesn't benefit the gamer, then why should we care?


#25 muscleserge
Member since 2005 • 3307 Posts
DivergeUnify, G80 = 8800 GTS/GTX. And my statement was a compliment to the 8800s, because it takes Cell + RSX + a special technique just to reach similar performance. But games won't be seeing this kind of performance, because games also need AI and physics, plus all the other things that make up games that need CPU power.

#26 DivergeUnify
Member since 2007 • 15150 Posts
DivergeUnify G80= 8800 GTS/GTX. adn my statement was a compliment to the 8800s, because it takes Cell+RSX+ speciall technique just to reach similar performance. But Games won't be seeing this kind of performance because games also need AI and Physics, plus all the other things that make up games that need CPU power.
muscleserge
I'm pretty sure two years ago, Sony was still claiming that the RSX was equal to a G80 and we have yet to see that.

#27 Teuf_
Member since 2004 • 30805 Posts
[QUOTE="Redfingers"]


The example code used for the research achieved an 85 Hz frame rate at 720p resolution. The report states that although numbers like that are unlikely to show up in actual games, progress has been made to at least utilise this method to improve graphics and frame rates in game."

Pro_wrestler

So their still figuring out the Cell/RSX but if it doesn't benefit the gamer then why should we care.



They didn't say these techniques wouldn't benefit games; they're saying that a game wouldn't be able to achieve the numbers they're getting, since games need to handle more than just graphics.

#28 Redfingers
Member since 2005 • 4510 Posts

More hidden power of the cell claims. The proof is in the pudding! All this "in theory" or "one day" stuff is getting really old. Until you can show me a game that obviously is much better than what can be done on the 360 i'll chalk this up to a marketing ploy.

Dont get me wrong, i believe it thearticle, to an extent. You can find tweaks or advances to every software application after its released. There have been countless amounts of time that i've released software and said "man i should have done it like this. This would have made performance better".

lhughey

They're benchmarks. You don't have to believe in benchmarks. The only real limit on their relevance is that they were presumably run under controlled conditions, but they're meaningful nonetheless.

What you should really draw from this article is this: the CPU can render textures, or perhaps shade textures the GPU renders concurrently. They also have "unified memory" that allows them to do this, according to the Sony report.


#29 DivergeUnify
Member since 2007 • 15150 Posts
[QUOTE="DivergeUnify"] G80 was the pending name of an 8800gtx like two years ago...
Teufelhuhn


No, G80 is the name of the GPU core in the 8800 video cards. G70 was in 7800, G71 in 7900, etc.

Well w/e :P

#30 Teuf_
Member since 2004 • 30805 Posts
[QUOTE="muscleserge"]DivergeUnify G80= 8800 GTS/GTX. adn my statement was a compliment to the 8800s, because it takes Cell+RSX+ speciall technique just to reach similar performance. But Games won't be seeing this kind of performance because games also need AI and Physics, plus all the other things that make up games that need CPU power.DivergeUnify
I'm pretty sure two years ago, Sony was still claiming that the RSX was equal to a G80 and we have yet to see that.



Nope. RSX always has been G70-based.

#31 -wii60-
Member since 2007 • 3287 Posts
The Xbox 360's processor isn't a monster like the Cell, but I have heard that the GPU inside the 360 hasn't been used to its fullest capabilities yet.

#32 Redfingers
Member since 2005 • 4510 Posts

DivergeUnify G80= 8800 GTS/GTX. adn my statement was a compliment to the 8800s, because it takes Cell+RSX+ speciall technique just to reach similar performance. But Games won't be seeing this kind of performance because games also need AI and Physics, plus all the other things that make up games that need CPU power.
muscleserge

So... all hail the G80 because it can perform comparably to 5 SPEs at this particular app? How... remarkable.

Garsh!

But the CPU has to handle AI and physics, thus in the real world it will be hobbled... okay, well, friend. If the G80 attempted a CPU-intensive app (the kind the Cell is good at), i.e. AI and physics, it would poop its pants.


#33 kevy619
Member since 2004 • 5617 Posts
PSU is a garbage website /thread

#34 Teuf_
Member since 2004 • 30805 Posts

What you should really draw from this article is this: the CPU can render textures, or perhaps shade textures the GPU renders concurrently. They also have "unified memory" that allows them to do this, according to the Sony report.

Redfingers


It's a bit more complicated than that; there are reasons why the Cell is good at the particular type of shading they're doing in the paper. But yes, this is definitely a good demonstration of how "teh chip made for video decoding" is useful for graphics.

And the unified memory they're referring to is the Cell's XDR memory, which is accessible to both the CPU and the GPU.

#35 Redfingers
Member since 2005 • 4510 Posts

[QUOTE="DivergeUnify"][QUOTE="muscleserge"]DivergeUnify G80= 8800 GTS/GTX. adn my statement was a compliment to the 8800s, because it takes Cell+RSX+ speciall technique just to reach similar performance. But Games won't be seeing this kind of performance because games also need AI and Physics, plus all the other things that make up games that need CPU power.Teufelhuhn
I'm pretty sure two years ago, Sony was still claiming that the RSX was equal to a G80 and we have yet to see that.



Nope. RSX always has been G70-based.

The only claim I've ever seen for the RSX in terms of benchmarking is when they said it would be faster than two 6800s in SLI. The 6800 is G60, or something, right? That hardly sounds like comparing it to a G80 to me...

Plus, this benchmark refers to the capabilities of the Cell, not the capabilities of the RSX. If anything, it refers to the GPU in terms of how it relates to the system itself via the memory "link" they say exists and the concurrent rendering technique it allows.


#36 muscleserge
Member since 2005 • 3307 Posts
The xbox360 processor isn't a monster as the cell, but i have heard that the GPU inside the 360 hasn't been use to its fullest capabilities yet.
-wii60-
Actually it has been already. GeOW made that pretty clear. Maybe Xenos' tessellation unit hasn't been utilized.

#37 Redfingers
Member since 2005 • 4510 Posts
[QUOTE="Redfingers"]

What you should really draw from this article is this: the CPU can render textures, or perhaps shade textures the GPU renders concurrently. They also have "unified memory" that allows them to do this, according to the Sony report.

Teufelhuhn



It's a bit more complicated than that, there's reasons why Cell is good at the particular type of shading they're doing in the paper. But yes, this is definitely a good demonstration of how "teh chip made for video decoding" is useful for graphics.

And the unified memory they're referring to is the Cell's XDR memory, which is accessible to both the CPU and the GPU.

Ah, I gotcha. Thanks. The entirety? The full 256 MB of XDR? Is there a performance hit?

And I assume the 256 MB of GDDR3 is entirely locked to the GPU?


#38 muscleserge
Member since 2005 • 3307 Posts

[QUOTE="Teufelhuhn"][QUOTE="DivergeUnify"][QUOTE="muscleserge"]DivergeUnify G80= 8800 GTS/GTX. adn my statement was a compliment to the 8800s, because it takes Cell+RSX+ speciall technique just to reach similar performance. But Games won't be seeing this kind of performance because games also need AI and Physics, plus all the other things that make up games that need CPU power.Redfingers

I'm pretty sure two years ago, Sony was still claiming that the RSX was equal to a G80 and we have yet to see that.



Nope. RSX always has been G70-based.

The only claim I've ever seen for the RSX in terms of benchmarking is when they said it would be faster than two 6800s in SLI. The 6800 is G60, or something, right? That hardly sounds like comparing it to a G80 to me...

Plus, this benchmark refers to the capabilities of the Cell, not the capabilities of the RSX. If anything, it refers to the GPU in terms of how it relates to the system itself via the memory "link" they say exists and the concurrent rendering technique it allows.

6800=NV43

#39 LibertySaint
Member since 2007 • 6500 Posts

*snip*

Redfingers

Old, they used this on GRAW 2...


#40 Redfingers
Member since 2005 • 4510 Posts

[QUOTE="-wii60-"]The xbox360 processor isn't a monster as the cell, but i have heard that the GPU inside the 360 hasn't been use to its fullest capabilities yet.muscleserge
Actually it has been already. GeOW made that pretty clear. Maybe Xenos' tessellation unit hasn't been utilized.

UT3>Gears of War according to the president of Epic Games.


#41 Teuf_
Member since 2004 • 30805 Posts
The xbox360 processor isn't a monster as the cell, but i have heard that the GPU inside the 360 hasn't been use to its fullest capabilities yet.
-wii60-


Of course it hasn't; no processor ever gets used to its fullest potential. I still think there's a lot of room to improve on the 360; lately the MS first-party games haven't been all that impressive with the graphics. I'm sure some dev will break the glass ceiling once again.

#42 xDonRobx
Member since 2007 • 1586 Posts
Until we see it in games this is useless.

#43 Redfingers
Member since 2005 • 4510 Posts
[QUOTE="Redfingers"]

According to this report which is linked in this article in PSU:

http://www.psu.com/Cell-and-RSX-work-together-for-best-results-News--a0001332-p0.php

And if you're PSU-agnostic:

http://research.scea.com/ps3_deferred_shading.pdf

PSU gives a good summary though so I'll post it here:

"The idea is that Synergistic Processing Elements (SPEs) of the Cell/BE can use the chip's DMA list feature to gather "irregular fine-grained fragments of texture data" generated by the RSX graphics chip, returning shading textures in the same way.

This means that the many small SPEs can apply shading on top of the work that the GPU does to texture the game world, shunting some of the workload to the (potentially underused) Cell/BE. The end result is that more work can be done while maintaining a high framerate.

The example code used for the research achieved an 85 Hz frame rate at 720p resolution. The report states that although numbers like that are unlikely to show up in actual games, progress has been made to at least utilise this method to improve graphics and frame rates in game."

What's truly interesting about this is that it proves that the Playstation 3's Cell processor can, in fact, help with graphics processing....to such a degree that it's helping shade segments of textures! That, and the benchmarks are impressive, even if we know hardly anything about them.

Also, this from the PDF: "The shading computation ran at 85Hz (FPS) at HDTV 720p resolution on 5 SPEs and generated 30.72 gigaops [sic?] of performance. This is comparable to the performance of the algorithm running on a state of the art high end GPU."

LibertySaint

old they used this on Graw2.....

Link me.


#44 -wii60-
Member since 2007 • 3287 Posts
[QUOTE="-wii60-"]The xbox360 processor isn't a monster as the cell, but i have heard that the GPU inside the 360 hasn't been use to its fullest capabilities yet.muscleserge
Actually it has been already. GeOW made that pretty clear. Maybe Xenos' tessellation unit hasn't been utilized.



I was talking about xenos :/.

#45 muscleserge
Member since 2005 • 3307 Posts
[QUOTE="muscleserge"][QUOTE="-wii60-"]The xbox360 processor isn't a monster as the cell, but i have heard that the GPU inside the 360 hasn't been use to its fullest capabilities yet.-wii60-
Actually it has been already. GeOW made that pretty clear. Maybe Xenos' tessellation unit hasn't been utilized.



I was talking about xenos :/.

Yeah me too.

#46 Redfingers
Member since 2005 • 4510 Posts

Until we see it in games this is useless.
xDonRobx

Don't be ridiculous.

First of all, it's relevant in terms of arguments. People have theoretical debates on System Wars all the time. I have people telling me the Cell is hobbled, limited, not designed with games in mind, and completely incapable of helping the GPU render concurrently. I've pretty much dispelled all of those myths by posting that particular article (though, no credit to me, obviously).

The technique, useless? Well, that depends. I just saw someone tell me "OLD! GRAW 2 USES IT!" As for the technique, we won't see tangible results until they apply it, but, like they said, progress is being made (and it's significant) in that particular field.


#47 Teuf_
Member since 2004 • 30805 Posts
[QUOTE="Teufelhuhn"][QUOTE="Redfingers"]

What you should really draw from this article is this: the CPU can render textures, or perhaps shade textures the GPU renders concurrently. They also have "unified memory" that allows them to do this, according to the Sony report.

Redfingers



It's a bit more complicated than that, there's reasons why Cell is good at the particular type of shading they're doing in the paper. But yes, this is definitely a good demonstration of how "teh chip made for video decoding" is useful for graphics.

And the unified memory they're referring to is the Cell's XDR memory, which is accessible to both the CPU and the GPU.

Ah, I gotcha. Thanks. The entirety? The 256 XDR? Is there a performance hit?

And I assume the 256 GDDR3 is entirely locked to the GPU?



The GPU has some extra latency when reading or writing to the XDR memory, but the whole 256 MB is accessible. Latency isn't a big issue for a GPU, though: GPU memory is always high-latency, so GPUs are designed to be good at handling it without a performance hit. Lots of devs already use the XDR for storing textures and geometry.

And yeah, the GDDR3 is pretty much for the GPU only. The Cell can read and write to it, but at terribly slow speeds, so it's not worth it.

#48 Redfingers
Member since 2005 • 4510 Posts

Yes I read that paper a while ago, its very cool stuff. Killzone 2 probably uses some similar ideas for its deferred renderer, but probably only uses a portion of the Cell for graphics stuff.

If anyone likes I can try to explain whats going on in that paper in less technical language.
Teufelhuhn

I know I for one would appreciate it. Perhaps not these goons that come in and just say "TEH HIDDEN POWAH!"

Go right ahead, please.


#49 ragek1ll589
Member since 2007 • 8650 Posts
PSU is a garbage website /thread
kevy619

He gave another website if you're not partial to PSU.

#50 LibertySaint
Member since 2007 • 6500 Posts
Not to mention, this just proves what I've been saying all along: the PS3's GPU is equal to the 7800GTX. Now for someone to do a 360 test so I can prove the 360's ATI GPU is equal to an X1900 Pro from ATI or a 7900GTX from Nvidia...