RSX is not a G70

This topic is locked from further discussion.

#101 faultline187
Member since 2008 • 64 Posts

1. It's a new architecture... (developers are lazy to learn it).

2. Multiplatform games suffer on PS3... again, it's because it's a new system.

3. RSX is something special, something unique and sophisticated! It needs its API to be able to display its power! OPENGL ES VECTORED GRAPHICS!!! I have been screaming it and no one seems to understand it... Let me give you an example: you have Vista with DX10 and an 8800GTX or whatever, so you can run a DX10 GAME. But if you had an 8800 and no Vista or DX10, then your 8800 would only use features of the DX9 API, so your card isn't really being used to its full potential. Understand?

4. Umm, I'm tired, can't be bothered anymore!

Night :o
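
To make point 3 concrete: an application can only use what its API exposes, so renderers typically query the reported version at runtime and pick a feature path. A minimal sketch in C++, assuming a current OpenGL context and a loader such as GLEW; the function names are illustrative, not from any post in this thread.

#include <GL/glew.h>
#include <cstdio>

// Returns true if the driver reports at least the requested GL version.
// Assumes a GL context is already current; glGetString returns NULL otherwise.
static bool hasGLVersion(int wantMajor, int wantMinor) {
    const char* ver = reinterpret_cast<const char*>(glGetString(GL_VERSION));
    int major = 0, minor = 0;
    if (!ver || std::sscanf(ver, "%d.%d", &major, &minor) != 2) return false;
    return (major > wantMajor) || (major == wantMajor && minor >= wantMinor);
}

static void chooseRenderPath() {
    if (hasGLVersion(3, 0)) {
        std::printf("Using the SM4-class feature path (DX10/GL3-level features).\n");
    } else {
        std::printf("Falling back to the SM3-class feature path.\n");
    }
}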

#102 Martin_G_N
Member since 2006 • 2124 Posts
Developers should start making exclusive engines for the PS3, even if it takes a while. If you take a look at multiplats, which all use the same engine as the X360 version, you can see games having problems on the PS3, even games that don't really look good. Fallout 3 could have looked and run a lot better on the PS3 if it had used an engine developed for the PS3 hardware. All multiplats have these problems, especially games from EA. When the difference between multiplats and exclusives is as big as it is on the PS3, you can start to wonder why.
#103 DOF_power
Member since 2008 • 804 Posts

The numbers for the Cell are ~204.8 gigaflops FP32, 14.8 gigaflops FP64 and 2.2 gigaflops/watt, vs.

Radeon 4870 X2: 2.4 teraflops FP32 and 480 gflops FP64, 8.8 gigaflops/watt.

But these are just numbers, in theory.

This is IBM's analysis of the Cell, take a look:

http://www-128.ibm.com/developerworks/power/library/pa-cellperf/

a]
Look at Table 2. Okay, while performing 'single precision matrix multiplication' the Cell achieved ~25 gigaflops/SPU; * 8 SPUs gives us Sony's 200 gigaflops spec for the Cell. But wait, Sony is disabling one SPU, so the PS3 Cell only starts with 175 gigaflops, not 200.

Oh, and everybody knows that all game code is made up primarily of simple 'single precision matrix multiplication' and the CPU rarely has to execute complex mathematics for physics, AI, wireframes, particle effects etc.

b]
Take a peek at Table 9. While performing 'double precision floating point Linpack' the Cell achieved 9.46 gigaflops versus 7.2 gigaflops for the 3.6GHz P4 processor. Let's look a little closer: according to Table 8, the 9.46 gigaflops is for all eight SPUs, and we all know the PS3 is only using 7 SPUs, so the comparison is really 8.51 gigaflops for the PS3 Cell vs 7.2 gigaflops for the 3.6GHz P4, or about 18% faster. Okay performance, I guess... Hey, wait a second, didn't Intel release the 3.6GHz P4 a couple of years ago?

How would it stack up against today's quad-core processors from Intel and AMD...

Folding@Home

A GeForce GTX 280 can fold at 590 ns/day, compared to 100 ns/day on the PlayStation 3.

http://benchmarkreviews.com/index.php?option=com_content&task=view&id=187&Itemid=38&limit=1&limitstart=4

So no, the PS3 ain't the 8th wonder.
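
For anyone checking the arithmetic behind those figures, a small sketch (assuming throughput scales linearly with SPU count, which real workloads will not do perfectly):

#include <cstdio>

int main() {
    const double gflopsPerSpu = 25.0;  // Table 2: single-precision matrix multiply, per SPU
    const double ps3Spus      = 7.0;   // one of the eight SPUs is not available to games
    const double cellFp32     = gflopsPerSpu * ps3Spus;  // ~175 GFLOPS, as stated above
    const double hd4870x2Fp32 = 2400.0;                  // ~2.4 TFLOPS peak FP32
    std::printf("PS3 Cell FP32 (7 SPUs): %.0f GFLOPS\n", cellFp32);
    std::printf("4870 X2 peak / Cell:    %.1fx\n", hd4870x2Fp32 / cellFp32);
    std::printf("GTX 280 / PS3 folding:  %.1fx (590 vs 100 ns/day)\n", 590.0 / 100.0);
    return 0;
}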

#104 faultline187
Member since 2008 • 64 Posts

The numbers for the Cell are ~204.8 gigaflops FP32 ... So no, the PS3 ain't the 8th wonder.

DOF_power

But did you know that the PS3 is only using Cell for its Folding@Home? That's a CPU vs a GPU... The PS3 is not, I repeat NOT, using RSX for Folding@Home because of the hypervisor restricting access to it! So the PS3 is and will be the 8th wonder... because I'm still wondering when its full power will be unleashed! :D

#105 Martin_G_N
Member since 2006 • 2124 Posts

The numbers for the Cell are ~204.8 gigaflops FP32 ... So no, the PS3 ain't the 8th wonder.

DOF_power
Take a look at the newest processors then. The Intel i7 is around 51 GFLOPS. Still far away.
#106 AnnoyedDragon
Member since 2006 • 9948 Posts

Take a look at the newest processors then. The Intel i7 is around 51Gflops. Still far away. Martin_G_N

When are people going to get over Cell's synthetic benchmark figures and start realizing that the real-world results don't add up?

The Intel i7 is the most powerful consumer processor you can buy today; it was developed years after the consoles' CPU designs were finalized and, at the upper end, costs more than the entire PS3.

This unlocked-potential hype is getting old. It amazes me that people are still quoting these old figures when even the simplest real-world demonstration proves otherwise.

#107 Teuf_
Member since 2004 • 30805 Posts


How much do you really think they changed in the year leading up to the PS3's release? It would have been way too late in the game to make radical changes... they had already been developing tools and APIs for developers, and getting ready to manufacture the chips. You can't do either of those things if you're still designing the thing.

In case you're wondering what changes were made in that time period, I'll tell you: the clock speed was dropped from 550 to 500MHz.

[QUOTE="faultline187"]
So how the hell do they know so much about this supposed weak GPU that's in the PS3? You do also know that a lot was changed with the GPU in that time! And no additional information has been available since E3 2005...
faultline187


They know because it's common knowledge. Certain PS3 developers have had lots to say about RSX if you peruse the Beyond3D forums.

If you have any other useful information, other than "it's a G70, it's a G70" (which in fact is not true), that would be much appreciated.

faultline187


Look, let's assume for a minute that you're right, and the PS3 has some super-secret unheard-of Nvidia chip jam-packed with secret powers. How is it then that it routinely gets outperformed by the Xenos? How is it that RSX has the exact same performance pitfalls as the 7800 (like low vertex throughput)? Why is it that Sony hasn't released the official specs? There's only one reason to hold back official specs... if you don't know what that is, then you should ask why Nintendo never released the official Wii specs.


Oh and here is the link where I got the 4x anti-aliasing (second to last paragraph I think) 8)

4X

thank you

faultline187


You should try reading the actual paper Guerrilla released. Page 18.
#108 Teuf_
Member since 2004 • 30805 Posts

Read the link I posted at the start of this thread carefully... MS has kept some of DX10's APIs under closed doors for too long, almost like it is proprietary. So much of the industry moved towards a new OpenGL years ago, and the new VECTOR GRAPHICS BASED standard is almost upon us. Games can now look as good from $10,000 PCs down to PDAs since the graphics are vector based and scalable.

"Because MT Evans completely replaces DX10 features and adds features the manufacturers have already built (RSX) into their cards and are also contributing members of OpenGL. Microsoft locked those features out forcing their own standards!"

faultline187


This is so wrong I don't even know where to begin.

First of all, DirectX is proprietary. It belongs completely to Microsoft. However, I have no idea what you mean by "under closed doors"... there's absolutely no point to an API that isn't fully accessible to developers. It's not like game developers use one DirectX, and Microsoft has some other super-secret DirectX that only they use.

Second, I hope you're joking about the OpenGL thing. OpenGL is dead. 3.0 was its last chance, and the ARB blew it big time. You might have heard all kinds of hype about what 3.0 was supposed to be, but I assure you that what was released was none of it. The only (and I do mean only) reason to use OpenGL is if you're targeting Mac or Linux.

Third, vector graphics have been here a long time. The 3D graphics that consoles and PCs have been doing for over 10 years now... that's all vector graphics. Using vector graphics doesn't mean you can scale it to any hardware (go ahead and try to get a PDA to render a scene with 1M+ polys), it means that the graphics can scale to any resolution.

#109 faultline187
Member since 2008 • 64 Posts
[QUOTE="faultline187"]

Read the link i posted at the start of this thread carefully...MS has kept some of DX10's APis' under closed doors for too long almost like it is proprietary. So much of the industry has moved towards a new OpenGL years ago and the new VECTOR GRAPHICS BASED standard is almost upon us. Games can now look as good from $10,000 PCs down to PDAs since the graphics a vector based and scalable.

"Because MT Evans completely replaces DX10 features and adds features the Manufacturers have already built (RSX) into their cards and are also contributing members of OpenGL. Microsoft locked those features out forcing their own standards!"

Teufelhuhn



This is so wrong I don't even know where to begin.

First of all, DirectX is proprietary. It belongs completely to Microsoft. However I have no idea what you mean by "under closed doors"...there's absolutely no point to an API that isn't fully accessible to developers. It's not like game developers use one DirectX, and Microsoft has some other super-secret DirectX that only they use.

Second, I hope you're joking about the OpenGL thing. OpenGL is dead. 3.0 was it's last chance, and ARB blew it big time. You might have heard all kinds of hype about what 3.0 was supposed to be, but I assure tht what was released was none of them. The only (and I do mean only) reason to use OpenGL is if you're targeting Mac or Linux.

Third, vector graphics has been here a long time. The 3D graphics that consoles and PC's have been doing for over 10 years now...that's all vector graphics. Using vector graphics doesn't mean you can scale it to any hardware (go ahead and try to get a PDA to render a scene with 1M+ polys), it means that the graphics can scale to any resolution.

Most people are not aware of the difference between DirectX, OpenGL and OpenGL ES. The versions and IDs are different for sure. OpenGL (full) is on ver. 3.0/3.2, while OpenGL ES is at version 2.0/2.1! The difference is night and day, with OGL ES being only for games (no applications, but APIs like OpenVG, OpenMAX, OpenSL and EGL certainly are). The desktop on these mobile devices (including the Sony PS3) is running OpenGL ES 2.0 and OpenVG in combination with EGL, which is fully vectored (meaning resolution-free and alias-free) desktop special-FX graphics. Resolution-free doesn't mean there is no resolution; it means it is scaled dynamically in real time, so you never see pixelation.

Along with this, OpenGL ES 2.0/2.1 NO longer uses POLYGONS! But rather triangles only. It is also based on cell rendering, so it's a composite of cells or squares/rectangles which are much easier to calculate and render than polygons, mainly because their dimensions aren't fixed and you only have four sides to calculate, along with being purely 2D rather than the 3D graphics of full DX9 or OpenGL.

These are only some of the reasons the future is indeed pointed at OpenGL ES 2.0/2.1 with DX10 features, along with a new object model based on microkernel OS designs (as is Windows 7 with its 25MB microkernel). Beyond that, the future of desktops on both mobile devices and full PCs is back to the future in object-oriented operating systems (especially on mobiles). For OpenGL ES this will come with "Halti" in time, with no going back to the past ever again!

LONGS PEAK > MT EVANS > HALTI!

All I'm saying is that 128-bit HDR is a DX10 FEATURE and is Shader Model 4.0/4.1* compliant! So therefore, if the PS3 is OpenGL ES 2.0, then it simply cannot be a 7800 card.

Shader Model 4.0 is a feature of DirectX 10, which has been released with Windows Vista. Shader Model 4.0 will allow for 128-bit HDR rendering, as opposed to 64-bit HDR in Shader Model 3.0 ;)

I don't see how hard that is to understand...
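
To pin down what "64-bit HDR" vs "128-bit HDR" actually means: it is just the size of the floating-point render target, 16 or 32 bits per channel across RGBA. A minimal sketch, assuming a GL 3.0-class context and a loader such as GLEW; whether a given format is renderable or blendable is a hardware and driver question, not something the API label alone decides:

#include <GL/glew.h>

// Creates a floating-point color target: RGBA16F is "64-bit HDR",
// RGBA32F is "128-bit HDR". Assumes a framebuffer object is already bound.
GLuint makeHdrTarget(int width, int height, bool fp32) {
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0,
                 fp32 ? GL_RGBA32F : GL_RGBA16F,  // 128 vs 64 bits per pixel
                 width, height, 0, GL_RGBA, GL_FLOAT, nullptr);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);
    // glCheckFramebufferStatus(GL_FRAMEBUFFER) reports whether the hardware
    // can actually render to this format.
    return tex;
}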

#110 trasherhead
Member since 2005 • 3058 Posts
I would like to point out that KZ2, which from what I got from these posts uses 128-bit HDR, does not use OpenGL of any kind. They have programmed directly to the CPU and GPU. That is from the Ask the Dev forum on the official PS forum.
#111 faultline187
Member since 2008 • 64 Posts

I would like to point out that KZ2, which from what I got from these posts uses 128-bit HDR, does not use OpenGL of any kind. They have programmed directly to the CPU and GPU. That is from the Ask the Dev forum on the official PS forum. trasherhead

So it uses 128-bit HDR? Can you please send me the link to the page if possible? Sounds very interesting to me... thanks

#112 ronvalencia
Member since 2008 • 29612 Posts

1. It's a new architecture... (developers are lazy to learn it). ... 4. Umm, I'm tired, can't be bothered anymore!

Night :o

faultline187

One should review OpenGL ES 2.0 relative to OpenGL 2.x.

For programmable hardware: OpenGL ES 2.0 is defined relative to the OpenGL 2.0 specification.

For Geforce 8/9/GT1x0/GT2x0 and Windows XP or Linux, you have alternative APIs such as OpenGL + NV extensions, OpenGL 3.0, CUDA 2.0 and soon OpenCL 1.0 (CUDA-like). NVIDIA ForceWare 177.89 for Windows XP includes the OpenGL extensions and OpenGL 3.0.

Remember, Geforce 8/9/GT1x0/GT2x0 are only "DX10" GPUs, but games like Far Cry 2 use DX10 NV extensions to enable some DX10.1 functions, i.e. refer to NVIDIA's "The Way It's Meant To Be Played" developer initiative.


OpenGL 3.0
- Provides rough feature parity with Direct3D 10
- Shader Model 4.0 features
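
For reference, "defined relative to the OpenGL 2.0 specification" means an ES 2.0 program is the same programmable vertex/fragment pair as desktop GL 2.0, minus the fixed-function pipeline and plus precision qualifiers. A minimal, hypothetical shader pair (the names are illustrative):

// OpenGL ES 2.0-style shaders, written here as C string literals.
const char* kVertexSrc =
    "attribute vec4 a_position;\n"
    "uniform mat4 u_mvp;\n"
    "void main() { gl_Position = u_mvp * a_position; }\n";

const char* kFragmentSrc =
    "precision mediump float;\n"   // ES-specific precision qualifier
    "uniform vec4 u_color;\n"
    "void main() { gl_FragColor = u_color; }\n";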

#113 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="Teufelhuhn"][QUOTE="faultline187"]

Read the link i posted at the start of this thread carefully...MS has kept some of DX10's APis' under closed doors for too long almost like it is proprietary. So much of the industry has moved towards a new OpenGL years ago and the new VECTOR GRAPHICS BASED standard is almost upon us. Games can now look as good from $10,000 PCs down to PDAs since the graphics a vector based and scalable.

"Because MT Evans completely replaces DX10 features and adds features the Manufacturers have already built (RSX) into their cards and are also contributing members of OpenGL. Microsoft locked those features out forcing their own standards!"

faultline187



This is so wrong I don't even know where to begin.

First of all, DirectX is proprietary. It belongs completely to Microsoft. However I have no idea what you mean by "under closed doors"...there's absolutely no point to an API that isn't fully accessible to developers. It's not like game developers use one DirectX, and Microsoft has some other super-secret DirectX that only they use.

Second, I hope you're joking about the OpenGL thing. OpenGL is dead. 3.0 was it's last chance, and ARB blew it big time. You might have heard all kinds of hype about what 3.0 was supposed to be, but I assure tht what was released was none of them. The only (and I do mean only) reason to use OpenGL is if you're targeting Mac or Linux.

Third, vector graphics has been here a long time. The 3D graphics that consoles and PC's have been doing for over 10 years now...that's all vector graphics. Using vector graphics doesn't mean you can scale it to any hardware (go ahead and try to get a PDA to render a scene with 1M+ polys), it means that the graphics can scale to any resolution.

Most people are not aware of difference between DirectX, OpenGL and OpenGL ES. Version and ID are different for sure. OpenGL (full) is on ver.3.0/3.2! While OpenGL ES is at version 2.0/2.1! The difference is night n day with OGL ES being only for Games (no applications, but API's like OpenVG, OpenMAX, OpenSL, EGL certainly are). The Desktop on these Mobile Devices (including Sony PS3) is running OpenGL ES 2.0, OpenVG in combination with EGL. Which is fully Vectored (meaning Resolution-Free and Alias-Free) Desktop Special FX Graphics. Resolution Free doesn't mean there is No Resolution. It means it scaled dynamically in real time, so you never see pixelation.

Along with this OpenGL ES 2.0/2.1 NO longer uses POLYGONS!. But rather triangles only. It is also based on Cell Rendering. So it's a composite of Cells or squares/rectangles which are much easier to calculate and render than polygons. Mainly because their dimensions aren't fixed and you only have four sides to calculate. Along with with being purely 2D rather than 3D Graphics of full DX9 or OpenGL.

These are only some of the reasons the Future is indeed pointed at OpenGL ES 2.0/2.1 with DX10 features along with a new Object Model based on Microkernel OS designs (as is Windows 7 w/25MG Microkernel). Beyond that, the Future of Desktops on both Mobile Devices and full PC's is Back to the Future in Object Oriented Operating Systems (especially on Mobiles). For OpenGL ES this will come with "Halti" in time, with no going back to the past ever again!

LONG PEAKS> MT EVANS > HALTI!

All im saying is that 128bit HDR is a DX10 FEATURE and is a Shader Model 4.0/4.1* Compliant! so there for if ps3 is opengl es 2.0 then its simply cannot be a 7800 card.

Shader Model 4.0 is a feature of DirectX 10, which has been released with Windows Vista. Shader Model 4.0 will allow for 128-bit HDR rendering, as opposed to 64-bit HDR in Shader Model 3.0 ;)

I dont see how hard that is to understand..

Watch and learn. Note that NVIDIA's own Geforce 7800 marketing already claims 128-bit HDR; the Geforce 7800 GTX claims to have 128-bit HDR.

#114 Tasman_basic
Member since 2002 • 3255 Posts
[QUOTE="Bebi_vegeta"][QUOTE="faultline187"]

Its the truth..

During 2009 you will see things that even your latest pc will not be able to produce until DX15 is out.

Remember i said it..:-)

faultline187

I'm still wating for a game to surpass Crysis... that's been released last year.

Crysis? Well wait no longer my friend KIllzone2 comes out in Feb! first game on ps3 to use the beta version of Mt evans (DX10) comparable!

After next year all games on ps3 will be like that :_)

hey a cow saying wait till next year! Januarys not even over yet!
#115 faultline187
Member since 2008 • 64 Posts
[QUOTE="faultline187"][QUOTE="Teufelhuhn"]

This is so wrong I don't even know where to begin.

First of all, DirectX is proprietary. It belongs completely to Microsoft. However I have no idea what you mean by "under closed doors"...there's absolutely no point to an API that isn't fully accessible to developers. It's not like game developers use one DirectX, and Microsoft has some other super-secret DirectX that only they use.

Second, I hope you're joking about the OpenGL thing. OpenGL is dead. 3.0 was it's last chance, and ARB blew it big time. You might have heard all kinds of hype about what 3.0 was supposed to be, but I assure tht what was released was none of them. The only (and I do mean only) reason to use OpenGL is if you're targeting Mac or Linux.

Third, vector graphics has been here a long time. The 3D graphics that consoles and PC's have been doing for over 10 years now...that's all vector graphics. Using vector graphics doesn't mean you can scale it to any hardware (go ahead and try to get a PDA to render a scene with 1M+ polys), it means that the graphics can scale to any resolution.

ronvalencia

Most people are not aware of difference between DirectX, OpenGL and OpenGL ES. Version and ID are different for sure. OpenGL (full) is on ver.3.0/3.2! While OpenGL ES is at version 2.0/2.1! The difference is night n day with OGL ES being only for Games (no applications, but API's like OpenVG, OpenMAX, OpenSL, EGL certainly are). The Desktop on these Mobile Devices (including Sony PS3) is running OpenGL ES 2.0, OpenVG in combination with EGL. Which is fully Vectored (meaning Resolution-Free and Alias-Free) Desktop Special FX Graphics. Resolution Free doesn't mean there is No Resolution. It means it scaled dynamically in real time, so you never see pixelation.

Along with this OpenGL ES 2.0/2.1 NO longer uses POLYGONS!. But rather triangles only. It is also based on Cell Rendering. So it's a composite of Cells or squares/rectangles which are much easier to calculate and render than polygons. Mainly because their dimensions aren't fixed and you only have four sides to calculate. Along with with being purely 2D rather than 3D Graphics of full DX9 or OpenGL.

These are only some of the reasons the Future is indeed pointed at OpenGL ES 2.0/2.1 with DX10 features along with a new Object Model based on Microkernel OS designs (as is Windows 7 w/25MG Microkernel). Beyond that, the Future of Desktops on both Mobile Devices and full PC's is Back to the Future in Object Oriented Operating Systems (especially on Mobiles). For OpenGL ES this will come with "Halti" in time, with no going back to the past ever again!

LONG PEAKS> MT EVANS > HALTI!

All im saying is that 128bit HDR is a DX10 FEATURE and is a Shader Model 4.0/4.1* Compliant! so there for if ps3 is opengl es 2.0 then its simply cannot be a 7800 card.

Shader Model 4.0 is a feature of DirectX 10, which has been released with Windows Vista. Shader Model 4.0 will allow for 128-bit HDR rendering, as opposed to 64-bit HDR in Shader Model 3.0 ;)

I dont see how hard that is to understand..

Watch and learn. Note Geforce 7800's NVIDIA marketing already has 128bit HDR.

Shader Model 4.0 is a feature of DirectX 10.

Shader Model 4.0 will allow for 128-bit HDR rendering.

The 7800GTX is Shader Model 3.0 compliant and only supports 64-bit HDR. That's the way it is! Ask anyone you like.

I have two 7900s in SLI in my PC and there is no game that I have played that uses 128-bit HDR! Apart from that, G70s are not able to run HDR + AA at the same time.

Crysis HDR is 128-bit because of DX10, and the first card to support true HDR was the 8800.

#116 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="ronvalencia"][QUOTE="faultline187"]

Most people are not aware of difference between DirectX, OpenGL and OpenGL ES. Version and ID are different for sure. OpenGL (full) is on ver.3.0/3.2! While OpenGL ES is at version 2.0/2.1! The difference is night n day with OGL ES being only for Games (no applications, but API's like OpenVG, OpenMAX, OpenSL, EGL certainly are). The Desktop on these Mobile Devices (including Sony PS3) is running OpenGL ES 2.0, OpenVG in combination with EGL. Which is fully Vectored (meaning Resolution-Free and Alias-Free) Desktop Special FX Graphics. Resolution Free doesn't mean there is No Resolution. It means it scaled dynamically in real time, so you never see pixelation.

Along with this OpenGL ES 2.0/2.1 NO longer uses POLYGONS!. But rather triangles only. It is also based on Cell Rendering. So it's a composite of Cells or squares/rectangles which are much easier to calculate and render than polygons. Mainly because their dimensions aren't fixed and you only have four sides to calculate. Along with with being purely 2D rather than 3D Graphics of full DX9 or OpenGL.

These are only some of the reasons the Future is indeed pointed at OpenGL ES 2.0/2.1 with DX10 features along with a new Object Model based on Microkernel OS designs (as is Windows 7 w/25MG Microkernel). Beyond that, the Future of Desktops on both Mobile Devices and full PC's is Back to the Future in Object Oriented Operating Systems (especially on Mobiles). For OpenGL ES this will come with "Halti" in time, with no going back to the past ever again!

LONG PEAKS> MT EVANS > HALTI!

All im saying is that 128bit HDR is a DX10 FEATURE and is a Shader Model 4.0/4.1* Compliant! so there for if ps3 is opengl es 2.0 then its simply cannot be a 7800 card.

Shader Model 4.0 is a feature of DirectX 10, which has been released with Windows Vista. Shader Model 4.0 will allow for 128-bit HDR rendering, as opposed to 64-bit HDR in Shader Model 3.0 ;)

I dont see how hard that is to understand..

faultline187

Watch and learn. Note Geforce 7800's NVIDIA marketing already has 128bit HDR.

Shader Model 4.0 is a feature of DirectX 10

Shader Model 4.0 will allow for 128-bit HDR rendering

7800gtx is a shader model 3.0 Compliant and only supports 64bit Hdr..Thats the way it is! ask anyone you like

I have a 2 7900 in sli on my pc and there is no game that i have played that uses 128bit hdr! Apart from g70's not being able to run HDR + AA at the same time

Crysis HDR is 128bit because of DX10 and the first card to support true hdr was the 8800

Are you blind? NVIDIA's PowerPoint slides define the actual hardware specs, i.e. the Geforce 7800 GTX. It's not NVIDIA's fault if D3D didn't expose all of G70's hardware features.

Are you blind again? Note OpenGL 3.0's Shader Model 4.0.

One of the key features of G80 is its ability to process geometry shaders, vertex shaders and pixel shaders on the same unified shader units.

Again, please run geometry shaders on RSX.
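
For concreteness, a geometry shader is the SM4/G80-class stage in question. The sketch below is a minimal pass-through shader, assuming a GL 3.2+ context and a loader such as GLEW; a G70-class part has no equivalent stage, which is why PS3 titles push that kind of geometry work onto the SPEs instead:

#include <GL/glew.h>

// GLSL 1.50 pass-through geometry shader: takes a triangle, re-emits it unchanged.
const char* kGeomSrc =
    "#version 150\n"
    "layout(triangles) in;\n"
    "layout(triangle_strip, max_vertices = 3) out;\n"
    "void main() {\n"
    "  for (int i = 0; i < 3; ++i) {\n"
    "    gl_Position = gl_in[i].gl_Position;\n"
    "    EmitVertex();\n"
    "  }\n"
    "  EndPrimitive();\n"
    "}\n";

GLuint compileGeometryShader() {
    // GL_GEOMETRY_SHADER is only available on SM4-class hardware and APIs.
    GLuint gs = glCreateShader(GL_GEOMETRY_SHADER);
    glShaderSource(gs, 1, &kGeomSrc, nullptr);
    glCompileShader(gs);
    return gs;
}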

#117 obamanian
Member since 2008 • 3351 Posts
[QUOTE="karasill"][QUOTE="faultline187"]

http://talkback.zdnet.com/5208-12558-0.html?forumID=1&threadID=48909&messageID=915943&start=-9978

#1. PS3 RSX alone = 1.8 TFLOPS
Cell and RSX = equal over 2TFLOPS

GTX 280 = 933 GFLOPS

ATI 4800 = 1.2 teraFLOPS


#2. I will explain:
RSX = "Multi-way Programmable Parallel Floating Point Shader Pipelines". Now no other device uses this description, but if you were a student of Graphics Cards, you would realize that this was basically describing multiple pathways through a grid or array of Shader processors.

#3. So it's not a "Fixed Function Pipeline" of G70 with 24-Vertex Shaders and 8-Pixel Shaders. This describes a Unified Shader Architecture!

#4. Supports full 128bit HDR Lighting. Same as G80's and G70 only supports 64bit HDR like all last gen GPU's. Dead give away!

#5. From as early as Feb 2005 at GDC Sony has announced that PS3 (RSX) would use OpenGL ES 2.0. At the time 2.0 wasn't out, so they wrote feature supports of RSX into OGL ES 1.1 including Fully Programmable Shader Pipeline and call it PSGL. Khronos Group took anything that could be done in a Shader out of the Fixed Function Pipeline (including Transform and Lighting) streamlining it for future Embedded Hardware. What does that mean?

#6. At least for OpenGL ES 2.0 devices (not PC version) it becomes a unified shader model and remember RSX is fully compliant. OpenGL ES 2.0 will only run on advanced hardware just like DX10.

faultline187

It's based off the G70 architecture, nVidia even states this. I highly doubt a 2 year old GPU is more powerful, let alone on par with the current offerings of nVidia and ATI in the PC market.

Dont forget Uncharted, why do you think the animation and movment are so responsive? Remember RSX can use XDR memory (extreamly fast) 3.2GZ.

Now you tell me what PC has ram that runs at that speed? anyone?

How does a 500Mhz GPU use a 3.2Ghz ram exactly ? Won't the data be used at 500Mhz anyway ? It may gain something small, but i would not expect anything substantial

Also how about the lost cycles to address that extra CPU ram ? That is far worst than a unified approach

And can you explain to us if PS3 ram is so amazing, why Uncharted levels are always in a tiny path and small ?
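
A rough way to see why a "3.2GHz" memory figure doesn't make the GPU itself faster: a memory clock is a per-pin data rate, and bandwidth is that rate times the bus width, independent of the 500MHz core clock. A back-of-the-envelope sketch using the commonly quoted PS3 figures (treated here as assumptions):

#include <cstdio>

int main() {
    // Effective data rate per pin (Gbit/s) and bus width (bits).
    const double xdrRate  = 3.2, xdrBusBits  = 64.0;   // XDR attached to the Cell
    const double gddrRate = 1.4, gddrBusBits = 128.0;  // 700MHz GDDR3, double data rate
    std::printf("XDR bandwidth:   %.1f GB/s\n", xdrRate  * xdrBusBits  / 8.0);  // ~25.6
    std::printf("GDDR3 bandwidth: %.1f GB/s\n", gddrRate * gddrBusBits / 8.0);  // ~22.4
    return 0;
}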

#118 50u1r34v3r
Member since 2006 • 1560 Posts

[QUOTE="danjammer69"] I try to be a proud cow....but this guy and others like him make it so hard. I appreciate his enthusiasm, but to call anything the PS3 can do "comparable" or better than Crysis...well, it is outright rubbish. I can openly admit that Crysis is not going to be touched for a long time, and that is fine with me.IgGy621985

Noone can't deny Killzone 2 looks great, but better than Crysis... come on...

But regarding other games - I'll try to be a proud hermit who appreciates other platforms - I'm considering getting PS3 only for Heavy Rain. It's from the dudes that made Fahrenheit, one of the best games I've ever played.

Yep, screw MGS, if there's one game that I would buy a PS3 for it has got to be Heavy Rain. (God of War coming isn't a bad thing either :P )
#119 ronvalencia
Member since 2008 • 29612 Posts

But did you know that the PS3 is only using Cell for its Folding@Home? That's a CPU vs a GPU... The PS3 is not, I repeat NOT, using RSX for Folding@Home because of the hypervisor restricting access to it! So the PS3 is and will be the 8th wonder... because I'm still wondering when its full power will be unleashed! :D

faultline187
A Geforce 8600GT GDDR3 routinely beats the PS3 version of UT3 at 720p and 30FPS.
#120 faultline187
Member since 2008 • 64 Posts
[QUOTE="faultline187"]

1. its a new arch

itechure...(developers are lazy to learn)

2. multiplatform games suffer on ps3...again its because its a new system

3. RSX is somthing special, somthing unique and sophisticated! It needs its API to be able to display its power! OPENGL ES VECTORED GRAPHICS!!! i have been screaming it and noone seems to understand it.....Let me give you an example? You have Vista with DX10 and a 8800gtx or whatever..you can run a DX10 GAME....But if you had a 8800 and no vista or DX10 then your 8800 would only use features of the DX9 API so therefor your card isnt really being used to its full potential? Understand?

4. Umm im tired cant be bothered anymore!

Night:o

ronvalencia

One should review OpenGL ES 2.0 relative OpenGL 2.x.

For programmable hardware: OpenGL ES 2.0 is defined relative to the OpenGL 2.0 specification

For Geforce 8/9/GT1x0/GT2x0 and Windows XP or Lintel, you have the alternative APIs such as OpenGL+NV extensions, OpenGL3.0, CUDA 2.0 and soon OpenCL 1.0 (CUDA like). NVIDIA ForceWare 177.89 for Windows XP includes OpenGL Extensions and OpenGL 3.0.

DX10 NV extensions Remember, Geforce 8/9/GT1x0/GT2x0 are only "DX10" GPUs, but with games like Far Cry 2 that uses DX10 NV extensions to enable some DX10.1 functions i.e. refer to NVIDIA's "The Way It's Meant To Be Played" developers initiative.

See full size image

OpenGL 3.0
- Provides rough feature parity with Direct3D 10
- Shader Model 4.0 features

OK, so you're telling me that a 7800GTX can run DX10? I don't think so.

#121 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="ronvalencia"][QUOTE="faultline187"]

1. its a new arch

itechure...(developers are lazy to learn)

2. multiplatform games suffer on ps3...again its because its a new system

3. RSX is somthing special, somthing unique and sophisticated! It needs its API to be able to display its power! OPENGL ES VECTORED GRAPHICS!!! i have been screaming it and noone seems to understand it.....Let me give you an example? You have Vista with DX10 and a 8800gtx or whatever..you can run a DX10 GAME....But if you had a 8800 and no vista or DX10 then your 8800 would only use features of the DX9 API so therefor your card isnt really being used to its full potential? Understand?

4. Umm im tired cant be bothered anymore!

Night:o

faultline187

One should review OpenGL ES 2.0 relative OpenGL 2.x.

For programmable hardware: OpenGL ES 2.0 is defined relative to the OpenGL 2.0 specification

For Geforce 8/9/GT1x0/GT2x0 and Windows XP or Lintel, you have the alternative APIs such as OpenGL+NV extensions, OpenGL3.0, CUDA 2.0 and soon OpenCL 1.0 (CUDA like). NVIDIA ForceWare 177.89 for Windows XP includes OpenGL Extensions and OpenGL 3.0.

DX10 NV extensions Remember, Geforce 8/9/GT1x0/GT2x0 are only "DX10" GPUs, but with games like Far Cry 2 that uses DX10 NV extensions to enable some DX10.1 functions i.e. refer to NVIDIA's "The Way It's Meant To Be Played" developers initiative.

See full size image

OpenGL 3.0
- Provides rough feature parity with Direct3D 10
- Shader Model 4.0 features

OK so your telling me that a 7800gtx can run dx10 i dont think so

To quote NVIDIA: "The OpenGL 3.0 and GLSL 1.30 feature set is only available on G80 and later hardware. This means the following GPUs: Desktop: GeForce 8000 series or higher; GeForce GTX 260, 280; Quadro FX 370, 570, 1700, 3700, 4600, 4700x2, 5600. Notebook: GeForce 8000 series or higher; Quadro FX 360M, 370M, 570M, 770M, 1600M, 1700M, 2700M, 3600M, 3700M."

In Windows 7 and the November 2008 DirectX SDK, you can run D3D10 programs on D3D 9c hardware via D3D10 Level 9, i.e. missing hardware features run on the CPU while supported hardware features run on the GPU. This trick should sound familiar to certain PS3 fanboys.
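
A minimal sketch of what "D3D10 Level 9" looks like in code, assuming a Windows 7-era DirectX SDK: you ask Direct3D 10.1 for the highest feature level the installed hardware supports, and the 9_x levels let D3D10-style code run on D3D9-class parts with a reduced feature set.

#include <d3d10_1.h>

// Try feature levels from highest to lowest; return the first device that succeeds.
ID3D10Device1* createBestEffortDevice() {
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1, D3D10_FEATURE_LEVEL_10_0,
        D3D10_FEATURE_LEVEL_9_3,  D3D10_FEATURE_LEVEL_9_1,
    };
    for (D3D10_FEATURE_LEVEL1 level : levels) {
        ID3D10Device1* device = nullptr;
        if (SUCCEEDED(D3D10CreateDevice1(nullptr, D3D10_DRIVER_TYPE_HARDWARE, nullptr,
                                         0, level, D3D10_1_SDK_VERSION, &device))) {
            return device;  // e.g. a 9_x level on a DX9-class GPU such as G70/RSX
        }
    }
    return nullptr;
}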

#122 faultline187
Member since 2008 • 64 Posts
n[QUOTE="faultline187"][QUOTE="ronvalencia"]

One should review OpenGL ES 2.0 relative OpenGL 2.x.

For programmable hardware: OpenGL ES 2.0 is defined relative to the OpenGL 2.0 specification

For Geforce 8/9/GT1x0/GT2x0 and Windows XP or Lintel, you have the alternative APIs such as OpenGL+NV extensions, OpenGL3.0, CUDA 2.0 and soon OpenCL 1.0 (CUDA like). NVIDIA ForceWare 177.89 for Windows XP includes OpenGL Extensions and OpenGL 3.0.

DX10 NV extensions Remember, Geforce 8/9/GT1x0/GT2x0 are only "DX10" GPUs, but with games like Far Cry 2 that uses DX10 NV extensions to enable some DX10.1 functions i.e. refer to NVIDIA's "The Way It's Meant To Be Played" developers initiative.

See full size image

OpenGL 3.0
- Provides rough feature parity with Direct3D 10
- Shader Model 4.0 features

ronvalencia

OK so your telling me that a 7800gtx can run dx10 i dont think so

To quote NVIDIA The OpenGL 3.0 and GLSL 1.30 feature set is only available on G80 and later hardware. This means the following GPUs: Desktop: GeForce 8000 series or higher; GeForce GTX 260, 280; Quadro FX 370, 570, 1700, 3700, 4600, 4700x2, 5600 Notebook: GeForce 8000 series or higher; Quadro FX 360M, 370M, 570M, 770M, 1600M, 1700M, 2700M, 3600M, 3700M

In Windows 7 and November 2008 DirectX SDK, you can run D3D10 programs on D3D 9c hardware via D3D10 Level 9 i.e. missing hardware features runs on CPU while supported hardware features runs on the GPU.

OK, so let me ask you a question... What API does the PS3 use?

#123 Frozzik
Member since 2006 • 3914 Posts

I have read this thread with both amusement and disbelief. I'm still not 100% convinced the OP is for real. I mean, surely no one can believe this nonsense he is ranting on about. If anyone actually bought all that hype, especially now after all this time, well, I really feel sorry for you.

I have a PS3, and I have a PC with a G8-series card. The difference is huge. Massive. I'm not going to spout numbers or try to convince this guy and others they are wrong. If believing this crap makes them feel better about their purchase then that's fine by me.

#124 ronvalencia
Member since 2008 • 29612 Posts

OK, so let me ask you a question... What API does the PS3 use?

faultline187

One of them is libGCM. Similar to ATI CTM 1.x (Close To Metal) for the Radeon X1x00.

#125 ronvalencia
Member since 2008 • 29612 Posts

NV47 aka G70

Slides from Sony's GDC briefings.

http://game.watch.impress.co.jp/docs/20060329/3dps309.htm

#126 DAZZER7
Member since 2004 • 2422 Posts
Uncharted's animation has nothing to do with the supposed power of the RSX... it's animation, for crying out loud. It's more dependent on the motion-capture techniques used, lol. Also, as for Drake's clothes getting wet, any game could have an effect like that, even last-gen games, lol.
#127 trasherhead
Member since 2005 • 3058 Posts

[QUOTE="trasherhead"]I would like to point out that KZ2, which from what I got from these posts uses 128bit HDR, do not use Open GL of any kind. They have programed directly to the CPU and GPU. That is out of the Ask the dev forum from the official PS forum. faultline187

So it uses 128bit hdr? can you please send me the link to the page if possible..Sounds very interesting to me.. thanks

sorry, no link. Just quoting the guys rapid posting this thread. But the part about them programing directly to the GPU and CPU can be found here in this thread. http://boardsus.playstation.com/playstation/board/message?board.id=killzone2&thread.id=90044
#128 Jade_Monkey
Member since 2004 • 4830 Posts
All I have to say is, if the PS3 is as powerful as claimed, then why can it still not keep up with the PC on games that are available on both platforms? You could blame coders, but that will only get you so far and really isn't a great excuse.
#129 trasherhead
Member since 2005 • 3058 Posts
Ok, so to sum this up, I'll be using arguments from both sides of the case. RSX is based on NV47, aka G70. It supports OpenGL 3.0 and OpenGL ES 2.0. Some programmers choose to program directly for the GPU and the CPU without using an API. OpenGL 3.0 is only supported by the G80 chipset and up. This conflicts with the idea that the RSX is only a G70. From this we can conclude that they may have taken some of the features they had already planned for the G80 and put them into the RSX. We can then think of the RSX as the missing link between the G70 and the G80.
#130 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="trasherhead"] Ok so to sum this up. I'll be using arguments from both sides of the case. RSX is based NV47 aka G70. It supports Open GL 3.0 and Open GL ES 2.0. Some programmers choose to program directly for the GPU and the CPU without using an API. Open GL 3.0 is only supported by G80 chipset and up. This conflicts with the idea that the RSX is only a G70. From this we can conclude that they may have taken some of the features that they already had planed for the G80 and put it into the RSX. We can then think of the RSX as the missing link between the G70 and the G80.

OpenGL 3.0 comments are for the alternative APIs and D3D 10 like functionality on Windows XP. It has nothing to do with RSX, therefore there are no conficts for RSX is only NV47/G70 type GPU. Also, note my request on running geometry shaders on RSX. If RSX is the same level as G80 GPU, why would you execute geometry workloads on SPEs? We could compare libGCM vs CUDA 2.0. It seems to be me that certain PS3 sheep doesn't know what G80's basic functionality.
#131 sh0vet
Member since 2006 • 362 Posts
Well, it's possible a few features went into the RSX that the G8x-based GPUs have. This is likely because the G8x roadmap was known well before the RSX went into development. One simple fact though: the PS3's RSX uses shader pipelines and a G8x-based GPU uses shader processors. A few features doesn't mean it's a G8x, since the biggest advantage a G8x-based GPU has is the shader processors.
#132 ronvalencia
Member since 2008 • 29612 Posts

Well it's possible a few features went into the rsx that the g8x based gpus have. ... the biggest advantage a g8x based gpu has is the shader processors. sh0vet

One simple fact: the G80 also has pipelines; G80's marketing material just focuses on the number of stream processor units. ALU comparisons between the Geforce 7900 and Geforce 8800: both G7X and G8X are pipelined. A second issue in G7X vs G8X is branch granularity, i.e. G7X works on ~256 pixels and G8X on 16 vertices/32 pixels. Smaller branch granularity is better for GPGPU (via CUDA) applications and complex shader programs. The G8X also has GigaThreads, i.e. think of SMT but thousands of them. Other G8X features are Early-Z and decoupled texture units.

Care to bring on Geforce 8600GT GDDR3 vs RSX gaming screen comparisons? Shall we start with Fallout 3 (smooth FPS with high settings and 720p)?
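
To illustrate the branch-granularity point with a hypothetical shader (given here as a C string, not taken from any game): pixels are shaded in batches, so if one pixel in a batch takes the expensive side of a branch, the whole batch pays for it, and ~256-pixel batches on G7x hurt far more than 32-pixel batches on G8x.

// Divergent fragment shader: neighbouring pixels may take different branches.
const char* kDivergentFrag =
    "uniform sampler2D scene;\n"
    "varying vec2 uv;\n"
    "void main() {\n"
    "  vec4 c = texture2D(scene, uv);\n"
    "  if (c.a > 0.5) {\n"
    "    // expensive path: many extra texture taps\n"
    "    for (int i = 0; i < 64; ++i)\n"
    "      c.rgb += texture2D(scene, uv + vec2(float(i) * 0.001, 0.0)).rgb * 0.01;\n"
    "  }\n"
    "  gl_FragColor = c;  // cheap path: just the original sample\n"
    "}\n";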

#133 sh0vet
Member since 2006 • 362 Posts

Dude, basically what you just said there is a bunch of stuff like what this PDF document lists; you can find it here: http://www.nvidia.com/object/IO_30459.html .

There are some similarities, since I'm sure the RSX was developed when the G8x roadmap was known, so a few good ideas went into it. However, that's likely about it.

#134 ZoomZoom2490
Member since 2008 • 3943 Posts

Multiplat games on PS3 are based on GPU rendering only, but exclusives are rendered with both the GPU and the Cell. I'm sure that if it wasn't for the Cell, games like KZ2, Uncharted, MGS4, etc. wouldn't have a chance of looking like that.

So in reality the PS3 is capable of rendering graphics far beyond what its GPU specs show. This is the part where everyone in this thread has failed to understand how the PS3 design works.

#135 ZoomZoom2490
Member since 2008 • 3943 Posts

vs[QUOTE="sh0vet"]Well it's possible a few features went into the rsx that the g8x based gpus have. This is likely because the g8x roadmap was known well before the rsx went into development. One simple fact though ps3 rsx uses shader pipelines and a g8x based gpu uses shader processors. A few features doesn't mean its a g8x since the biggest advantage a g8x based gpu has is the shader processors. ronvalencia

One simple fact that the G80 also has pipelines. G80's marketing material focuses on the number of stream processor units. ALU comparsions between Geforce 7900 and Geforce 8800 Both G7X and G8X are pipelined. Second issue in G7X vs G8X is the branch granularity i.e. G7X has ~256 pixels and G8X has 16 vertex/32 pixels. Smaller branch granularity is better for GpGPU(via CUDA) applications and complex shaders programs. The G8X also has GigaThreads i.e think of SMT but thousands of them. Other G8X features are Early-Z and decoupled texture units.

Care to bring on Geforce 8600GT GDDR3 vs RSX gaming screen comparisons? Shall we start with Fallout3 (smooth FPS with high settings and 720p)?

Then explain how KZ2 is able to destroy FO3 on a Geforce 8600GT? Look, once devs combine the Cell + RSX, the results are far greater than what multiplats show, because those are only using GPU rendering.
#136 McdonaIdsGuy
Member since 2008 • 3046 Posts

Multiplat games on PS3 are based on GPU rendering only, but exclusives are rendered with both the GPU and the Cell. ... This is the part where everyone in this thread has failed to understand how the PS3 design works.

ZoomZoom2490
Shows that you don't even know what you're talking about; even the crappy multiplats are using the Cell. You can spin it all you want, but there isn't any Crysis-like game on the PS3 and there won't ever be one.
#137 ZoomZoom2490
Member since 2008 • 3943 Posts
[QUOTE="ZoomZoom2490"]

multiplat games on ps3 are based on gpu rendering only but exclusives are rendered with both the gpu and the cell. Im sure if it wasnt for the cell, games like kz2, uncharted, mgs4, etc, wouldnt have a chance to look like that.

so in reality ps3 is capable of rendering graphics far beyond than what its gpu specs show. This is the part where everyone in this thread failed to understand how the ps3 design works.

McdonaIdsGuy

Shows that u don't even know what u're talking about even the crappy Multiplats are using the cell,and u can spin it all you want there isn't any Crysis like game on ps3 and won't ever be 1.

Lol, they are only using the Cell to process information and such, but they are not using it alongside the GPU to render graphics like in Uncharted 1-2, MGS4, KZ2, LBP, etc.

Lol at you telling me that I don't know what I'm talking about. I go way back with these things, before you were even born, so watch and learn.

#138 ronvalencia
Member since 2008 • 29612 Posts

Dude, basically what you just said there is a bunch of stuff like what this PDF document lists... However, that's likely about it.

sh0vet

During RSX's development, NVIDIA was developing NV47, later known as G70. The next chip after G70 is G71.

G8X has unified shaders.

http://www.bit-tech.net/bits/2005/07/11/nvidia_rsx_interview/2

"The two products share the same heritage, the same technology. But RSX is faster," -David Kirk of NVIDIA

#139 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="ronvalencia"]

vs[QUOTE="sh0vet"]Well it's possible a few features went into the rsx that the g8x based gpus have. This is likely because the g8x roadmap was known well before the rsx went into development. One simple fact though ps3 rsx uses shader pipelines and a g8x based gpu uses shader processors. A few features doesn't mean its a g8x since the biggest advantage a g8x based gpu has is the shader processors. ZoomZoom2490

One simple fact that the G80 also has pipelines. G80's marketing material focuses on the number of stream processor units. ALU comparsions between Geforce 7900 and Geforce 8800 Both G7X and G8X are pipelined. Second issue in G7X vs G8X is the branch granularity i.e. G7X has ~256 pixels and G8X has 16 vertex/32 pixels. Smaller branch granularity is better for GpGPU(via CUDA) applications and complex shaders programs. The G8X also has GigaThreads i.e think of SMT but thousands of them. Other G8X features are Early-Z and decoupled texture units.

Care to bring on Geforce 8600GT GDDR3 vs RSX gaming screen comparisons? Shall we start with Fallout3 (smooth FPS with high settings and 720p)?

then explain how kz2 is capable to destroy fo3 on geforice 8600gt? look, once devs combine the cell + rsx the results are far greater than what multiplats show because they are only using the gpu rendering.

The SPEs make a crap GPU, i.e. their instruction set is not geared towards being a GPU. How's Cell's OpenGL performance? Would it run Quake 3 at 1024x768 at 500 FPS?

Sony's own paper shows us that 5 SPEs match a 20-pipeline Geforce 7800GTX in just shaders. A graphics pipeline includes more than just shaders, i.e. texturing, Early-Z cull, Z-cull, AA, ROPs, filtering and so on. http://research.scea.com/ps3_deferred_shading.pdf

Should I bring in Crysis Warhead? Any KZ2-vs-others comparison is apples vs oranges. With any benchmark, e.g. SPECint or SPECfp, the code base must be the same. Secondly, the topic is about NVIDIA's RSX GPU.

A Geforce 9650M GT (55nm version of G84) with 1GB VRAM can run Crysis Warhead at 720p mostly in gamer (high) mode (shadows and shaders are in mainstream mode).

NVIDIA's Geforce 8600 GT GDDR3 can beat Geforce 7950 GT SLI in complex shading workloads: the Geforce 8600GT (128-bit wide bus, 256MB GDDR3 VRAM) beats Geforce 7950GT SLI (2x 256-bit wide bus, 2x 512MB VRAM) in Crysis 1.21.

I don't care if you bring in "SPE+RSX" as a GPU.

#140 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="McdonaIdsGuy"][QUOTE="ZoomZoom2490"]

multiplat games on ps3 are based on gpu rendering only but exclusives are rendered with both the gpu and the cell. Im sure if it wasnt for the cell, games like kz2, uncharted, mgs4, etc, wouldnt have a chance to look like that.

so in reality ps3 is capable of rendering graphics far beyond than what its gpu specs show. This is the part where everyone in this thread failed to understand how the ps3 design works.

ZoomZoom2490

Shows that u don't even know what u're talking about even the crappy Multiplats are using the cell,and u can spin it all you want there isn't any Crysis like game on ps3 and won't ever be 1.

lol, they are only using the cell to process the information and stuff, but they are not using it alongside with the gpu to render graphics like in Uncharted1-2, mgs4, kz2, lbp, etc.

lol at you telling me that i dont know what im talking about, i go way back with these things before you were even born, so watch and learn.

Compared to G8X, the G7X needs a lot of *fixing*, e.g. geometry, vertex, occlusion culling and textures. Displacement mapping that does not alter pixel information increases the vertex/geometry load, i.e. vertex/geometry amplification.

Some bad MGS4 screenshots from

http://au.gamespot.com/pages/forums/show_msgs.php?topic_id=26736617&tag=topics;title

#141 McdonaIdsGuy
Member since 2008 • 3046 Posts
[QUOTE="McdonaIdsGuy"][QUOTE="ZoomZoom2490"]

multiplat games on ps3 are based on gpu rendering only but exclusives are rendered with both the gpu and the cell. Im sure if it wasnt for the cell, games like kz2, uncharted, mgs4, etc, wouldnt have a chance to look like that.

so in reality ps3 is capable of rendering graphics far beyond than what its gpu specs show. This is the part where everyone in this thread failed to understand how the ps3 design works.

ZoomZoom2490

Shows that u don't even know what u're talking about even the crappy Multiplats are using the cell,and u can spin it all you want there isn't any Crysis like game on ps3 and won't ever be 1.

lol, they are only using the cell to process the information and stuff, but they are not using it alongside with the gpu to render graphics like in Uncharted1-2, mgs4, kz2, lbp, etc.

lol at you telling me that i dont know what im talking about, i go way back with these things before you were even born, so watch and learn.

Lol, again posting nonsense. Are you saying that RSX is doing physics and AI as well? Please... :lol: They aren't using the Cell as well as Sony's first-party developers, but devs are already using the SPEs.
#142 mamkem6
Member since 2007 • 1457 Posts

I did not know that everyone in here was a computer expert! :roll:

#143 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="mamkem6"]

I did not know that everyone in here are computer experts! :roll:

Waiting for RSX's G80 level results.
#144 faultline187
Member since 2008 • 64 Posts

Ok, so to sum this up... We can then think of the RSX as the missing link between the G70 and the G80. trasherhead

So if RSX is the missing link between G70 and G80... then we haven't seen RSX at its full potential as of yet.

Do you remember The Getaway demo they showed? Well, that was definitely 128-bit HDR...

#145 faultline187
Member since 2008 • 64 Posts

Well, it's possible a few features went into the RSX that the G8x-based GPUs have. ... the biggest advantage a G8x-based GPU has is the shader processors. sh0vet

RSX, E3 2005!

Quote: NVIDIA
Jen-Hsun Huang quote: "We've incorporated a farm of programmable shading processors, so that each and every single pixel will be computed in real time!"
#146 Frozzik
Member since 2006 • 3914 Posts

I love how the OP, like many other deluded console fanboys, is still, despite all the proof, going on with this. Well done, my friend.

I will say you are in for a major letdown over the next few years though, really. I bet if any devs or hardware experts are reading this thread they will be laughing so hard.

#147 faultline187
Member since 2008 • 64 Posts

Multiplat games on PS3 are based on GPU rendering only, but exclusives are rendered with both the GPU and the Cell. ... This is the part where everyone in this thread has failed to understand how the PS3 design works.

ZoomZoom2490

True!

Well, it's like one big unified graphics system... but we know so much about Cell... and nothing on RSX.

RSX

#148 Frozzik
Member since 2006 • 3914 Posts

Hey, I bet in 2005 or whenever that was, the PS3's theoretical power was the best ever. It's now 2009, things have moved on; if you honestly think Nvidia still hasn't surpassed the RSX years later, you are more stupid than I thought.

Really, they claimed the PS3 was more powerful than 2x 6800s in SLI; back then this was "WOW, OMG, that's amazing". Now even budget cards beat that, lol.

#149 faultline187
Member since 2008 • 64 Posts
[QUOTE="ZoomZoom2490"]

multiplat games on ps3 are based on gpu rendering only but exclusives are rendered with both the gpu and the cell. Im sure if it wasnt for the cell, games like kz2, uncharted, mgs4, etc, wouldnt have a chance to look like that.

so in reality ps3 is capable of rendering graphics far beyond than what its gpu specs show. This is the part where everyone in this thread failed to understand how the ps3 design works.

McdonaIdsGuy

Shows that u don't even know what u're talking about even the crappy Multiplats are using the cell,and u can spin it all you want there isn't any Crysis like game on ps3 and won't ever be 1.

Want to MAKE A bet?

All games from Sony will be real-time procedurally rendered.

Oh, and what's the difference between ASYMMETRICAL AND SYMMETRICAL systems? Hehehehehe.

You've got no idea, it's gonna smack people in the face LOL