PS3 going beyond PC's capabilities?

  • 107 results

This topic is locked from further discussion.


#51 Teuf_
Member since 2004 • 30805 Posts

Not this crap again. I don't feel like explaining how many flaws and limitations MLAA has again, so just leave it at: no.
ferret-gamer

It's not a miracle cure, but it's damn good at removing edge aliasing. And it does it in a way that actually scales, unlike MSAA. 4x and 8xMSAA just starts to get stupidly wasteful in terms of memory, bandwidth, and performance when deferred rendering is used.


Also, Metro 2033's analytical anti-aliasing is extremely similar to MLAA and accomplishes about the same, yet you don't see anyone saying that it is some almighty anti-aliasing solution.ferret-gamer


That's because they're not the same at all.
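The memory point above is easy to check with back-of-the-envelope arithmetic. This is a toy sketch, not data from any shipped engine: it assumes a hypothetical 720p deferred G-buffer with four 32-bit render targets, where each target stores one value per MSAA sample.

```python
# Rough memory cost of multisampled render targets in a deferred renderer.
# Illustrative only: hypothetical 4-target, 32-bit-per-pixel G-buffer at 720p.

def gbuffer_bytes(width, height, targets=4, bytes_per_pixel=4, samples=1):
    """Total G-buffer memory when every target stores `samples` samples per pixel."""
    return width * height * targets * bytes_per_pixel * samples

MB = 1024 * 1024
for s in (1, 4, 8):
    print(f"{s}x MSAA: {gbuffer_bytes(1280, 720, samples=s) / MB:.1f} MiB")
```

Under these assumptions the G-buffer grows from about 14 MiB at 1x to over 110 MiB at 8x, which is why multisampling a full deferred G-buffer gets called wasteful on a 256 MB console.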


#52 Eltormo
Member since 2010 • 990 Posts

BHAHAHAHAHAAHA OH GOD i seriously banged my head on my desk laughing after reading that, so much PR nonsenseGTR2addict

That is not PR, and it was not said by Sony.


#53 clone01
Member since 2003 • 29843 Posts

[QUOTE="GTR2addict"]BHAHAHAHAHAAHA OH GOD i seriously banged my head on my desk laughing after reading that, so much PR nonsenseEltormo

That is not PR, and it was not said by Sony.

you have proof?

#54 clyde46
Member since 2005 • 49061 Posts

I fail to see how a console with only 256 MB of RAM can beat a PC.


#56 erglesmergle
Member since 2009 • 1769 Posts

My i7 > Cell Processorsz

My 5870 > RSXXX

1 GB VRAM > 256 MB VRAM

6 GB RAM > 256 MB RAM

1000 year old PC > PS3


#57 deactivated-5cf4b2c19c4ab
Member since 2008 • 17476 Posts


That's because they're not the same at all.

Teufelhuhn

From 4A Games: It doesn't have explicit edge-detection. The closest explanation of the technique I can imagine would be that the shader internally doubles the resolution of the picture using pattern/shape detection (similar to morphological AA).


#58 clyde46
Member since 2005 • 49061 Posts

I agree, plus Move owns anything on the PC :lol: the PC is the only one without motion controls.

Dolohov27
Seriously, dude.

#59 Iantheone
Member since 2007 • 8242 Posts

I agree, plus Move owns anything on the PC :lol: the PC is the only one without motion controls.

Dolohov27
I think I showed a while ago that the PC can have motion controls. Just as many as the consoles, and with more variety.

#60 Eltormo
Member since 2010 • 990 Posts

you have proof?clone01

The PS3 is truly an amazing piece of hardware that is surprising everyone as time passes. Digital Foundry recently conducted an interesting analysis on the tech used in the game Saboteur, which revealed that the PS3 is capable of producing something even a high-end PC graphics card has difficulty with.

http://gamer.blorge.com/2010/01/05/ps3-smoothing-beyond-that-of-high-end-pc-graphics-card/

If you had read the first page before jumping in to ask for proof, you would have known it is not an article from Sony; it is an analysis by Digital Foundry of a technique used in The Saboteur.


#61 markop2003
Member since 2005 • 29917 Posts
That's only talking about an improvement over MSAA; personally I've never heard anyone complain about 16x, and less is often satisfactory, so I don't really see this as something of importance. Higher-res textures would be nice though; you can really see the difference that 4K textures make when using Quarl's texture pack.

#62 Eltormo
Member since 2010 • 990 Posts

My i7 > Cell Processorsz

My 5870 > RSXXX

1 GB VRAM > 256 MB VRAM

6 GB RAM > 256 MB RAM

1000 year old PC > PS3

erglesmergle

A 1,000-year-old PC? I don't think you mean one with those specs, right?

Because the i7 was released at the end of November 2008, if I am not mistaken, and the 5870 in September 2009; hardly what I'd consider a 1,000-year-old PC. Now, I have a 7600GT card and a 2.2 GHz AMD dual core, and my PC can't output PS3 graphics; the 7600GT OC was released in 2006, so my PC is barely 4 years old and can't match my PS3.


#63 crimsonsabre
Member since 2006 • 746 Posts

All of you should learn how to read and think:

"effectively delivering an effect better than the capabilities of high-end GPUs without crippling performance"

Yes, PCs can do MLAA, but according to the dev it comes at a heftier cost, even on high-end GPUs.

Edit: it just means the PS3 is more efficient at running MLAA.


#64 crimsonsabre
Member since 2006 • 746 Posts

According to this article, the PS3 can work marvels:

"The PS3 rendition of Pandemic's The Saboteur is different though. It's special. It's trying something new that's never been seen before on console, or indeed PC, and its results are terrific. In a best-case scenario you get edge-smoothing that is beyond the effect of 16x multi-sampling anti-aliasing, effectively delivering an effect better than the capabilities of high-end GPUs without crippling performance. Compare and contrast with Xbox 360 hardware, which tops out at 4x MSAA."

This is either BS, because if it were true first-party devs should be able to make games that outperform anything on PC, or it is the truth, making people scream "what a shame!" because no one has really shown what the Cell is capable of against even high-end gaming rigs.

What's your take? I tend to believe it's simply the skill of great devs such as Naughty Dog and Santa Monica, who find the right compromises to make their titles look so good; still not good enough to beat games like Crysis.

Another interesting article about the use of the SPUs.

Metroid_Other_M

The title is incredibly misleading and thoughtless.


#65 ronvalencia
Member since 2008 • 29612 Posts

According to this article, the PS3 can work marvels:

"The PS3 rendition of Pandemic's The Saboteur is different though. It's special. It's trying something new that's never been seen before on console, or indeed PC, and its results are terrific. In a best-case scenario you get edge-smoothing that is beyond the effect of 16x multi-sampling anti-aliasing, effectively delivering an effect better than the capabilities of high-end GPUs without crippling performance. Compare and contrast with Xbox 360 hardware, which tops out at 4x MSAA."

This is either BS, because if it were true first-party devs should be able to make games that outperform anything on PC, or it is the truth, making people scream "what a shame!" because no one has really shown what the Cell is capable of against even high-end gaming rigs.

What's your take? I tend to believe it's simply the skill of great devs such as Naughty Dog and Santa Monica, who find the right compromises to make their titles look so good; still not good enough to beat games like Crysis.

Another interesting article about the use of the SPUs.

Metroid_Other_M

In reference to http://www.eurogamer.net/articles/digitalfoundry-saboteur-aa-blog-entry, and I quote:

"In the meantime, what we have is something that's new and genuinely exciting from a technical standpoint. We're seeing PS3 attacking a visual problem using a method that not even the most high-end GPUs are using."

Eurogamer didn't factor in AMD's http://developer.amd.com/gpu_assets/AA-HPG09.pdf

It was later corrected by Christer Ericson, director of tools and technology at Sony Santa Monica, and I quote:

"The screenshots may not be showing MLAA, and it's almost certainly not a technique as experimental as we thought it was, but it's certainly the case that this is the most impressive form of this type of anti-aliasing we've seen to date in a console game. Certainly, as we alluded to originally, the concept of using an edge-filter/blur combination isn't new, and continues to be refined. This document by Isshiki and Kunieda published in 1999 suggested a similar technique, and, more recently, AMD's Iourcha, Yang and Pomianowski suggested a more advanced version of the same basic idea".

AMD's Iourcha, Yang and Pomianowski's papers refers to http://developer.amd.com/gpu_assets/AA-HPG09.pdf

To quote AMD's paper "This filter is the basis for the Edge-Detect Custom Filter AA driver feature on ATI Radeon HD GPUs".

(Insert yet-another-fanboy PS3 website.) The "not even the most high-end GPUs are using" assertion is wrong: from top to bottom, current ATI GPUs support Direct3D 10.1 and the methods mentioned in AMD's AA paper.
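For the curious, the edge-filter/blur combination this family of techniques shares can be sketched in a few lines. This is a toy illustration, not MLAA or ATI's actual edge-detect filter: the threshold and the uniform blend weights are made-up values chosen only to show the idea of "find strong luminance discontinuities, then smooth just those pixels".

```python
# Toy edge-filter/blur AA on a grayscale image (lists of 0-255 ints).
# Assumption: a hard-coded threshold and a uniform 5-tap average, purely
# for illustration; real filters use direction-aware weights.

def edge_blur_aa(img, threshold=64):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Edge test: large difference against any 4-neighbour.
            neighbours = [img[y-1][x], img[y+1][x], img[y][x-1], img[y][x+1]]
            if any(abs(img[y][x] - n) > threshold for n in neighbours):
                # Blur only the detected edge pixel.
                out[y][x] = (img[y][x] + sum(neighbours)) // 5
    return out

# A hard vertical edge (0 | 255) gets intermediate values after filtering.
img = [[0, 0, 255, 255] for _ in range(4)]
print(edge_blur_aa(img)[1])  # interior row is smoothed across the edge
```

Pixels far from any edge are untouched, which is the whole appeal: the cost is concentrated on edge pixels instead of every sample, unlike MSAA.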


#66 ronvalencia
Member since 2008 • 29612 Posts

So teh Cell had hidden powa to be tapped all along. Who knew 2006 technology would be better than 2010 technology?

Mystic-G
Nov 2006 tech GeForce 8800 GTX says hi... the GeForce 8800 GTX was released a few days ahead of Sony's PS3.

#67 ronvalencia
Member since 2008 • 29612 Posts

All of you should learn how to read and think:

"effectively delivering an effect better than the capabilities of high-end GPUs without crippling performance"

Yes, PCs can do MLAA, but according to the dev it comes at a heftier cost, even on high-end GPUs.

Edit: it just means the PS3 is more efficient at running MLAA.

crimsonsabre

It's just a noise statement. As with any working system, one must pool the entire system's computation power together.

Doing image-based AA on the SPEs will consume computation resources.


#68 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="Eltormo"]

[QUOTE="erglesmergle"]

My i7 > Cell Processorsz

My 5870 > RSXXX

1 GB VRAM > 256 MB VRAM

6 GB RAM > 256 MB RAM

1000 year old PC > PS3

A 1,000-year-old PC? I don't think you mean one with those specs, right?

Because the i7 was released at the end of November 2008, if I am not mistaken, and the 5870 in September 2009; hardly what I'd consider a 1,000-year-old PC. Now, I have a 7600GT card and a 2.2 GHz AMD dual core, and my PC can't output PS3 graphics; the 7600GT OC was released in 2006, so my PC is barely 4 years old and can't match my PS3.

Use an AMD Athlon 64 X2 + NVIDIA GeForce 8800 GTX combo.

#69 Eltormo
Member since 2010 • 990 Posts

[QUOTE="Mystic-G"]

So teh Cell had hidden powa to be tapped all along. Who knew 2006 technology would be better than 2010 technology?

ronvalencia

Nov 2006 tech GeForce 8800 GTX says hi... the GeForce 8800 GTX was released a few days ahead of Sony's PS3.

And it was as expensive as a PS3, without online play, a Blu-ray drive, an Ethernet port, Wi-Fi, an HDD, or a controller; and you still needed a great PC to put it in, since in 2006 you would not put that card in a low- or mid-spec PC and expect to get the most out of it.


#70 Captain__Tripps
Member since 2006 • 4523 Posts

[QUOTE="erglesmergle"]

My i7 > Cell Processorsz

My 5870 > RSXXX

1 GB VRAM > 256 MB VRAM

6 GB RAM > 256 MB RAM

1000 year old PC > PS3

Eltormo

A 1,000-year-old PC? I don't think you mean one with those specs, right?

Because the i7 was released at the end of November 2008, if I am not mistaken, and the 5870 in September 2009; hardly what I'd consider a 1,000-year-old PC. Now, I have a 7600GT card and a 2.2 GHz AMD dual core, and my PC can't output PS3 graphics; the 7600GT OC was released in 2006, so my PC is barely 4 years old and can't match my PS3.

I am sure a 7600 GT is older than the PS3, and even then it comes close... I could play Stalker or BioShock back then on high at 1400x900 at good fps, 30-ish... The 8800 GTX is the same age as the PS3 but is way better.

#71 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ferret-gamer"]Not this crap again. I don't feel like explaining how many flaws and limitations MLAA has again, so just leave it at: no.
Teufelhuhn

It's not a miracle cure, but it's damn good at removing edge aliasing. And it does it in a way that actually scales, unlike MSAA. 4x and 8xMSAA just starts to get stupidly wasteful in terms of memory, bandwidth, and performance when deferred rendering is used.


Also, Metro 2033's analytical anti-aliasing is extremely similar to MLAA and accomplishes about the same, yet you don't see anyone saying that it is some almighty anti-aliasing solution.ferret-gamer


That's because they're not the same at all.

On the opposite end of the scale, DX11 introduces the bandwidth-saving BC6H and BC7 texture compression formats. Both ATI and NVIDIA DX11 GPUs support BC6H and BC7 at the hardware level.


#72 Eltormo
Member since 2010 • 990 Posts

[QUOTE="Eltormo"]

[QUOTE="erglesmergle"]

My i7 > Cell Processorsz

My 5870 > RSXXX

1 GB VRAM > 256 MB VRAM

6 GB RAM > 256 MB RAM

1000 year old PC > PS3

ronvalencia

A 1,000-year-old PC? I don't think you mean one with those specs, right?

Because the i7 was released at the end of November 2008, if I am not mistaken, and the 5870 in September 2009; hardly what I'd consider a 1,000-year-old PC. Now, I have a 7600GT card and a 2.2 GHz AMD dual core, and my PC can't output PS3 graphics; the 7600GT OC was released in 2006, so my PC is barely 4 years old and can't match my PS3.

Use an AMD Athlon 64 X2 + NVIDIA GeForce 8800 GTX combo.

Now, could you do that in 2006? The card alone was $500+, without the PC to put it in, dude. Some people forget that the PS3 is 4 years old.


#73 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="Eltormo"]

[QUOTE="ronvalencia"][QUOTE="Mystic-G"]

So teh Cell had hidden powa to be tapped all along. Who knew 2006 technology would be better than 2010 technology?

Nov 2006 tech GeForce 8800 GTX says hi... the GeForce 8800 GTX was released a few days ahead of Sony's PS3.

And it was as expensive as a PS3, without online play, a Blu-ray drive, an Ethernet port, Wi-Fi, an HDD, or a controller; and you still needed a great PC to put it in, since in 2006 you would not put that card in a low- or mid-spec PC and expect to get the most out of it.

Depends on the entire total cost of ownership; i.e., PC games are usually cheaper, and PC-related expenses can sometimes be claimed against personal income tax (depending on the country's tax laws).

#74 ronvalencia
Member since 2008 • 29612 Posts

You can do MLAA on a PC CPU, if you don't mind using a few spare cores. In fact, it was originally developed by Intel for CPUs. It just happens to be something the SPUs are really good at, and GPUs aren't.

Teufelhuhn

According to the same Intel MLAA paper (image-based AA), GPUs are also suitable devices.

To quote Intel, page 4/8 of http://visual-computing.intel-research.net/publications/papers/2009/mlaa/mlaa.pdf:

"We did not try to implement other MLAA steps using SSE® instructions (though it might be possible to do it using SSE4 operations), opting instead for preserving the universal nature of the algorithm. The upcoming Larrabee chip [Seiler et al. 2008], as well as modern GPU cards, are capable of handling 8-bit data extremely efficiently, so our algorithm will benefit from porting to these architectures."


#75 ronvalencia
Member since 2008 • 29612 Posts

does anyone have the link of beyond3D where they do the GPU comparisons and explain the importance of the SPE'S?

Metroid_Other_M

I usually store my repeated postings on my GS blog.

For today's raster workloads, the RSX/GeForce 7 is an aging GPU.

From http://forum.beyond3d.com/showthread.php?t=57736&page=5

------------------------

"I could go on for pages listing the types of things the spu's are used for to make up for the machines aging gpu, which may be 7 series NVidia but that's basically a tweaked 6 series NVidia for the most part. But I'll just type a few off the top of my head:"


1) Two ppu/vmx units
There are three ppu/vmx units on the 360, and just one on the PS3. So any load on the 360's remaining two ppu/vmx units must be moved to spu.

2) Vertex culling
You can look back a few years at my first post talking about this, but it's common knowledge now that you need to move as much vertex load as possible to spu otherwise it won't keep pace with the 360.

3) Vertex texture sampling
You can texture sample in vertex shaders on 360 just fine, but it's unusably slow on PS3. Most multi platform games simply won't use this feature on 360 to make keeping parity easier, but if a dev does make use of it then you will have no choice but to move all such functionality to spu.

4) Shader patching
Changing variables in shader programs is cake on the 360. Not so on the PS3 because they are embedded into the shader programs. So you have to use spu's to patch your shader programs.

5) Branching
You never want a lot of branching in general, but when you do really need it the 360 handles it fine, PS3 does not. If you are stuck needing branching in shaders then you will want to move all such functionality to spu.

6) Shader inputs
You can pass plenty of inputs to shaders on 360, but do it on PS3 and your game will grind to a halt. You will want to move all such functionality to spu to minimize the amount of inputs needed on the shader programs.

7) MSAA alternatives
Msaa runs full speed on 360 gpu needing just cpu tiling calculations. Msaa on PS3 gpu is very slow. You will want to move msaa to spu as soon as you can.

8) Post processing
360 is unified architecture meaning post process steps can often be slotted into gpu idle time. This is not as easily doable on PS3, so you will want to move as much post process to spu as possible.

9) Load balancing
360 gpu load balances itself just fine since it's unified. If the load on a given frame shifts to heavy vertex or heavy pixel load then you don't care. Not so on PS3 where such load shifts will cause frame drops. You will want to shift as much load as possible to spu to minimize your peak load on the gpu.

10) Half floats
You can use full floats just fine on the 360 gpu. On the PS3 gpu they cause performance slowdowns. If you really need/have to use shaders with many full floats then you will want to move such functionality over to the spu's.

11) Shader array indexing
You can index into arrays in shaders on the 360 gpu no problem. You can't do that on PS3. If you absolutely need this functionality then you will have to either rework your shaders or move it all to spu.

Etc, etc, etc...
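Item 2 in the list above (moving vertex culling off the GPU) can be sketched in a few lines. This is a toy example under assumed conventions, not engine code: triangles are given as 2D screen-space vertex tuples, and counter-clockwise winding is taken to mean front-facing.

```python
# Toy CPU-side backface culling: reject triangles before GPU submission.
# The sign of the 2D cross product of two triangle edges gives the winding.

def signed_area(a, b, c):
    """Twice the signed area of triangle abc; positive when a,b,c are CCW."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def cull_backfaces(triangles):
    """Keep only counter-clockwise (front-facing, by our convention) triangles."""
    return [t for t in triangles if signed_area(*t) > 0]

tris = [
    ((0, 0), (10, 0), (0, 10)),   # CCW: kept
    ((0, 0), (0, 10), (10, 0)),   # CW: culled
]
print(cull_backfaces(tris))  # only the first triangle survives
```

The point of doing this on SPUs is simply that every triangle rejected here is a triangle the RSX never has to transform.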


#76 mitu123
Member since 2006 • 155290 Posts

Not this crap again, i dont feel like explaining how many flaws and limitations MLAA has again, so just leave it as: No. Also metro 2033's Analytical Anti-aliasing is extremely similar to MLAA and accomplishes about same yet you dont see anyone saying that it is some almighty anit aliasing solution.ferret-gamer
Nobody knows what AAA is, and they think MLAA is better than AAA. :lol:


#77 mitu123
Member since 2006 • 155290 Posts

I fail to see how a console with only 256 MB of RAM can beat a PC.

clyde46

It has teh Cell!


#78 topsemag55
Member since 2007 • 19063 Posts

I don't buy it... you cannot compare a reduced-performance 7800 GTX (DX 9.0c) against the NVIDIA GT 400 series, which can run all DX versions.

Can't compare the PS3's CPU against an Intel Core CPU either.

Console technology is 5 years old.


#79 foxhound_fox
Member since 2005 • 98532 Posts

Where does it say "only the PS3 can do this"? All I'm seeing is they tried it on the PS3 and haven't put it on the PC yet... and didn't Pandemic shut down after The Saboteur?


#80 Teuf_
Member since 2004 • 30805 Posts

From 4A Games: It doesn't have explicit edge-detection. The closest explanation of the technique I can imagine would be that the shader internally doubles the resolution of the picture using pattern/shape detection (similar to morphological AA).

ferret-gamer



MLAA doesn't happen while the geometry is being shaded; it happens afterwards, as a full-screen post-process. He's just saying the pattern detection is similar.


#81 Teuf_
Member since 2004 • 30805 Posts

On the opposite end of the scale, DX11 introduces the bandwidth-saving BC6H and BC7 texture compression formats. Both ATI and NVIDIA DX11 GPUs support BC6H and BC7 at the hardware level.

ronvalencia



Sure, but you can't use those formats for render targets since they can't be encoded in real time. It only helps for static textures, which make up a small proportion of total bandwidth usage.


#82 Teuf_
Member since 2004 • 30805 Posts

[QUOTE="Teufelhuhn"]

You can do MLAA on a PC CPU, if you don't mind using a few spare cores. In fact, it was originally developed by Intel for CPUs. It just happens to be something the SPUs are really good at, and GPUs aren't.

ronvalencia

According to the same Intel MLAA paper (image-based AA), GPUs are also suitable devices.

To quote Intel, page 4/8 of http://visual-computing.intel-research.net/publications/papers/2009/mlaa/mlaa.pdf:

"We did not try to implement other MLAA steps using SSE® instructions (though it might be possible to do it using SSE4 operations), opting instead for preserving the universal nature of the algorithm. The upcoming Larrabee chip [Seiler et al. 2008], as well as modern GPU cards, are capable of handling 8-bit data extremely efficiently, so our algorithm will benefit from porting to these architectures."



I didn't say you couldn't do it; it just doesn't map as well to GPUs, because the edge detection has a large search space (equal to whatever your tile size is), which means a lot of texture samples if you do it in a pixel shader. A powerful GPU could probably still blow through it no problem, since they have such massive bandwidth and shader ALU counts, as long as the implementation didn't completely suck. The recent GPU MLAA paper seemed to get decent results even on an 8600, although I don't know if the quality matches the original Intel implementation.
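The search-space point can be made concrete with rough arithmetic. The counts below assume a hypothetical pixel-shader implementation that takes one centre fetch plus a walk of up to a fixed distance in each of four directions along an edge; real implementations differ, so treat the numbers as order-of-magnitude only.

```python
# Back-of-the-envelope texture fetch count for a pixel-shader edge search.
# Assumption (illustrative): 1 centre fetch + a walk of up to `max_dist`
# texels in each of 4 directions, per pixel, every frame.

def fetches_per_pixel(max_dist):
    return 1 + 4 * max_dist

def fetches_per_frame(width, height, max_dist):
    return width * height * fetches_per_pixel(max_dist)

# At 720p with a 16-pixel search: roughly 60 million worst-case fetches/frame.
print(fetches_per_frame(1280, 720, 16))
```

That per-frame fetch volume is why this pass leans on the GPU's texture bandwidth rather than its ALUs, and why a wide-bandwidth card can still "blow through it".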


#83 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

On the opposite end of the scale, DX11 introduces the bandwidth-saving BC6H and BC7 texture compression formats. Both ATI and NVIDIA DX11 GPUs support BC6H and BC7 at the hardware level.

Teufelhuhn



Sure, but you can't use those formats for render targets since they can't be encoded in real time. It only helps for static textures, which make up a small proportion of total bandwidth usage.

Refer http://msdn.microsoft.com/en-us/library/ee416559(VS.85).aspx

Compute Shader 4.0 (on ATI/NV DX10.X hardware) accelerated BC6H/BC7 encoding/decoding.

Refer http://www.microsoftgamefest.com/london2010.htm

Block Compression Smorgasbord

DXT block compression has become a de facto standard for most textures in games. We will cover several topics of recent interest in this area:

1. Fast block compression. With some optimization effort, real-time block compression is feasible, on either the CPU or the GPU. Real-time compression opens up a host of possibilities for disk space reduction and dynamic texture updates.

2. Normal map compression. Block compression was designed for color data but adapted for use with normal maps. The results are not always pretty. What usage patterns should be favored, or avoided, in this context?

3. New block compression formats. DirectX 11 introduced two brand new formats: BC6H for HDR textures and BC7 for high-quality LDR textures. We discuss how and when these formats should be employed, and how to handle the enormous search space for compression.

http://www.waybeta.com/news/8292/true-structure-of-the-native-dx11-graphics-asl-gtx480-graphics-card-evaluation-evaluation-_-asl/7/

However, older block compression formats do not support HDR (High Dynamic Range) images, which is why uncompressed HDR textures occupy so much memory. To solve this problem, DirectX 11 adds two new compression formats, BC6H and BC7. BC6H is designed specifically for compressing HDR images, with a compression ratio of 6:1, while BC7 is designed for high-quality RGB[A] texture compression, with a compression ratio of 3:1.

The NVIDIA GeForce 7/RSX and the official Direct3D 9 don't support 3Dc+ texture compression formats.

In Bad Company 2, an AMD Radeon HD 5570 (GDDR3) beats the Xbox 360 when it comes to MSAA and graphics detail.
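The quoted ratios follow directly from the block sizes: every BC6H and BC7 block encodes a 4x4 texel tile in 16 bytes. A quick arithmetic check, assuming FP16 RGB (6 bytes/texel) as the uncompressed HDR baseline and RGB8 (3 bytes/texel) as the LDR baseline:

```python
# Compression ratio of a 16-byte 4x4 block against an uncompressed baseline.

BLOCK_TEXELS = 4 * 4
BLOCK_BYTES = 16

def compression_ratio(bytes_per_texel):
    return BLOCK_TEXELS * bytes_per_texel / BLOCK_BYTES

print(compression_ratio(6))  # BC6H vs FP16 RGB -> 6.0 (the quoted 6:1)
print(compression_ratio(3))  # BC7 vs RGB8     -> 3.0 (the quoted 3:1)
```

Against RGBA8 (4 bytes/texel) the same block would give 4:1, which is why quoted BC7 ratios vary with the assumed source format.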


#84 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="Teufelhuhn"]

You can do MLAA on a PC CPU, if you don't mind using a few spare cores. In fact, it was originally developed by Intel for CPUs. It just happens to be something the SPUs are really good at, and GPUs aren't.

Teufelhuhn

According to the same Intel MLAA paper (image-based AA), GPUs are also suitable devices.

To quote Intel, page 4/8 of http://visual-computing.intel-research.net/publications/papers/2009/mlaa/mlaa.pdf:

"We did not try to implement other MLAA steps using SSE® instructions (though it might be possible to do it using SSE4 operations), opting instead for preserving the universal nature of the algorithm. The upcoming Larrabee chip [Seiler et al. 2008], as well as modern GPU cards, are capable of handling 8-bit data extremely efficiently, so our algorithm will benefit from porting to these architectures."



I didn't say you couldn't do it; it just doesn't map as well to GPUs, because the edge detection has a large search space (equal to whatever your tile size is), which means a lot of texture samples if you do it in a pixel shader. A powerful GPU could probably still blow through it no problem, since they have such massive bandwidth and shader ALU counts, as long as the implementation didn't completely suck. The recent GPU MLAA paper seemed to get decent results even on an 8600, although I don't know if the quality matches the original Intel implementation.

My comment was about suitability, not ability.


#85 Teuf_
Member since 2004 • 30805 Posts

Refer http://msdn.microsoft.com/en-us/library/ee416559(VS.85).aspx

Compute Shader 4.0 (on ATI/NV DX10.X hardware) accelerated BC6H/BC7 encoding/decoding.

ronvalencia



It's an offline conversion tool. The performance still isn't anywhere near fast enough for realtime use.


#86 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

Refer http://msdn.microsoft.com/en-us/library/ee416559(VS.85).aspx

Compute Shader 4.0 (on ATI/NV DX10.X hardware) accelerated BC6H/BC7 encoding/decoding.

Teufelhuhn



It's an offline conversion tool. The performance still isn't anywhere near fast enough for realtime use.

For the AMD Radeon HD 5570: http://en.hardspell.com/doc/enshowcont.asp?id=7550&pageid=6673

ATI's render back-ends are like ROP units, responsible for final image output and post-processing. The HD 5700's render back-ends add a new read-back path that lets the texture units read the compressed AA colour buffers, improving CFAA anti-aliasing performance. The render back-ends also provide super-sampling anti-aliasing with a significantly higher sampling rate, thus speeding up anti-aliasing performance.

The AMD Radeon HD 2900 (R600) introduced real-time hardware compression and decompression at its I/O interfaces; later AMD Radeon HDs improved this concept.


#87 Lable1985
Member since 2008 • 1046 Posts

According to this article, the PS3 can work marvels:

"The PS3 rendition of Pandemic's The Saboteur is different though. It's special. It's trying something new that's never been seen before on console, or indeed PC, and its results are terrific. In a best-case scenario you get edge-smoothing that is beyond the effect of 16x multi-sampling anti-aliasing, effectively delivering an effect better than the capabilities of high-end GPUs without crippling performance. Compare and contrast with Xbox 360 hardware, which tops out at 4x MSAA."

This is either BS, because if it were true first-party devs should be able to make games that outperform anything on PC, or it is the truth, making people scream "what a shame!" because no one has really shown what the Cell is capable of against even high-end gaming rigs.

What's your take? I tend to believe it's simply the skill of great devs such as Naughty Dog and Santa Monica, who find the right compromises to make their titles look so good; still not good enough to beat games like Crysis.

Another interesting article about the use of the SPUs.

Metroid_Other_M
A game's performance and graphics are somewhat dependent on how you build the game to begin with, which is why some 2nd-gen games like Black looked every bit as good as some high-performing 3rd-gen games. If designed right, you could make a game on any platform and have it look as good as on the highest-performing platform out there. That's why Crysis 2 looks better than Crysis despite being built around consoles this time.
Avatar image for roflcopter317
roflcopter317

709

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#88 roflcopter317
Member since 2010 • 709 Posts

LOL!! THIS IS HILARIOUS!!!

Avatar image for roflcopter317
roflcopter317

709

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#89 roflcopter317
Member since 2010 • 709 Posts

http://au.xbox360.ign.com/articles/617/617951p1.html

Avatar image for delta3074
delta3074

20003

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#90 delta3074
Member since 2007 • 20003 Posts

According to

this article, the Ps3 can work marvels.

"The PS3 rendition of Pandemic's The Saboteur is different though. It's special. It's trying something new that's never been seen before on console, or indeed PC, and its results are terrific. In a best-case scenario you get edge-smoothing that is beyond the effect of 16x multi-sampling anti-aliasing, effectively delivering an effect better than the capabilities of high-end GPUs without crippling performance. Compare and contrast with Xbox 360 hardware, which tops out at 4x MSAA."

This is either BS because if that was true, 1st party devs should be able to make games that outperform anything on Pc, or it is the truth, making people scream "what a shame!" due to the fact that no one has really showed what the Cell is really capable of in outperforming even high-end gaming rigs.

what's your take? I tend to believe that it's just the ability of great devs such as Naughty Dog and Santa Monica that are able to find the right compromises in order to make their titles look so good, still not good enough to beat out games like Crysis.

another interesting article about the use of the SPUs

Metroid_Other_M
Article's old and completely wrong. The 360 uses a CPU-based AA solution as well; it's called AAA (analytical anti-aliasing) and emulates 8x MSAA. It's used in the 360 version of Metro 2033, so all that crap about the 360 topping out at 4x MSAA is just BS really. And no offence, but a top-end PC can handle REAL 16x MSAA; it doesn't have to 'emulate' it. So no, the PS3 does not outperform top-end GPUs at all.
Avatar image for KlepticGrooves
KlepticGrooves

2448

Forum Posts

0

Wiki Points

0

Followers

Reviews: 6

User Lists: 0

#91 KlepticGrooves
Member since 2010 • 2448 Posts

Sounds like some people need to put away the magnifying glass and play some games.

Avatar image for GTSaiyanjin2
GTSaiyanjin2

6018

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#92 GTSaiyanjin2
Member since 2005 • 6018 Posts

[QUOTE="Metroid_Other_M"]

According to

this article, the Ps3 can work marvels.

"The PS3 rendition of Pandemic's The Saboteur is different though. It's special. It's trying something new that's never been seen before on console, or indeed PC, and its results are terrific. In a best-case scenario you get edge-smoothing that is beyond the effect of 16x multi-sampling anti-aliasing, effectively delivering an effect better than the capabilities of high-end GPUs without crippling performance. Compare and contrast with Xbox 360 hardware, which tops out at 4x MSAA."

This is either BS because if that was true, 1st party devs should be able to make games that outperform anything on Pc, or it is the truth, making people scream "what a shame!" due to the fact that no one has really showed what the Cell is really capable of in outperforming even high-end gaming rigs.

what's your take? I tend to believe that it's just the ability of great devs such as Naughty Dog and Santa Monica that are able to find the right compromises in order to make their titles look so good, still not good enough to beat out games like Crysis.

another interesting article about the use of the SPUs

delta3074

Article's old and completely wrong. The 360 uses a CPU-based AA solution as well; it's called AAA (analytical anti-aliasing) and emulates 8x MSAA. It's used in the 360 version of Metro 2033, so all that crap about the 360 topping out at 4x MSAA is just BS really. And no offence, but a top-end PC can handle REAL 16x MSAA; it doesn't have to 'emulate' it. So no, the PS3 does not outperform top-end GPUs at all.

Graphics cards still take a massive hit on AA performance, whether it be 15-25% with 16x AA. It doesn't really matter what card you use; if you want to use that much AA it will come at a price. It has improved in recent years, but it's still not as efficient as it should be. It would be cool doing something like 16x MSAA on the CPU, since most games don't even take advantage of all the cores.
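The CPU-side approaches being argued about here (MLAA on Cell, AAA on Xenon) work roughly by scanning the finished frame for edges and blending along them, instead of taking extra samples per pixel. A heavily simplified one-scanline sketch of that idea in Python (real MLAA classifies edge shapes into L/Z/U patterns and computes per-pixel coverage; this just blends across detected discontinuities, and the 0.5 threshold is an arbitrary illustrative value):

```python
def smooth_edges(row, threshold=0.5):
    """Post-process one scanline: blend pixels across hard discontinuities.

    Mimics the spirit of morphological AA: no extra samples are rendered;
    edges are found in the finished image and softened after the fact.
    """
    out = list(row)
    for i in range(len(row) - 1):
        # A large step between neighbours marks a likely geometric edge.
        if abs(row[i] - row[i + 1]) > threshold:
            avg = (row[i] + row[i + 1]) / 2.0
            # Blend both sides toward the edge midpoint.
            out[i] = (row[i] + avg) / 2.0
            out[i + 1] = (row[i + 1] + avg) / 2.0
    return out

print(smooth_edges([0.0, 0.0, 1.0, 1.0]))  # [0.0, 0.25, 0.75, 1.0]
```

Because this runs over the final framebuffer, its cost is independent of scene complexity, which is why it can be offloaded to SPUs or CPU cores rather than eating GPU bandwidth like MSAA does.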

Avatar image for Lable1985
Lable1985

1046

Forum Posts

0

Wiki Points

0

Followers

Reviews: 394

User Lists: 0

#93 Lable1985
Member since 2008 • 1046 Posts

Sounds like some people need to put away the magnifying glass and play some games.

KlepticGrooves
People just like bragging rights. I doubt a lot of these people care about graphics outside of such things. Remember, these people care more about fanboy pride than anything relating to the games they play; well, most of them anyway.
Avatar image for stereointegrity
stereointegrity

12151

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#94 stereointegrity
Member since 2007 • 12151 Posts
God of War 3 used MLAA... this is such old news.
Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#95 ronvalencia
Member since 2008 • 29612 Posts

http://au.xbox360.ign.com/articles/617/617951p1.html

roflcopter317

The shader-operations graph is not quite right, i.e. any real game would also have texture-fetch operations.

The RSX is based on the GeForce 7 family.

The RSX has 24 of these shader blocks.

As you can see, the front end limits the RSX/G70 design, and you also have dependencies: e.g. FPU2 is dependent on FPU1, so a stall in FPU1 will stall the rest of the dependent units.
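That dependency point can be illustrated with a toy in-order scheduler: if FPU2's input depends on FPU1's output, a stall in FPU1 delays everything queued behind it. A hypothetical sketch (the unit names, latencies, and one-issue-per-cycle model are made up for illustration, not actual RSX behaviour):

```python
def schedule(ops):
    """Toy in-order issue model: each op is (name, latency, depends_on).

    Returns {name: finish_cycle}. An op cannot start before the op it
    depends on has finished, so one slow op stalls the whole chain.
    """
    finish = {}
    clock = 0
    for name, latency, dep in ops:
        start = max(clock, finish.get(dep, 0))  # wait for the dependency
        finish[name] = start + latency
        clock = start + 1  # in-order: the next op issues a cycle later
    return finish

# FPU2 reads FPU1's result; a 10-cycle stall in FPU1 (e.g. waiting on a
# texture fetch) pushes FPU2 and everything after it back accordingly.
print(schedule([("FPU1", 10, None),
                ("FPU2", 1, "FPU1"),
                ("MINI_ALU", 1, "FPU2")]))
# {'FPU1': 10, 'FPU2': 11, 'MINI_ALU': 12}
```

With independent ops the same model issues back-to-back, which is the contrast being drawn: dependent chains expose front-end and latency limits that raw ALU counts hide.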

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#96 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="delta3074"][QUOTE="Metroid_Other_M"]

According to

this article, the Ps3 can work marvels.

"The PS3 rendition of Pandemic's The Saboteur is different though. It's special. It's trying something new that's never been seen before on console, or indeed PC, and its results are terrific. In a best-case scenario you get edge-smoothing that is beyond the effect of 16x multi-sampling anti-aliasing, effectively delivering an effect better than the capabilities of high-end GPUs without crippling performance. Compare and contrast with Xbox 360 hardware, which tops out at 4x MSAA."

This is either BS because if that was true, 1st party devs should be able to make games that outperform anything on Pc, or it is the truth, making people scream "what a shame!" due to the fact that no one has really showed what the Cell is really capable of in outperforming even high-end gaming rigs.

what's your take? I tend to believe that it's just the ability of great devs such as Naughty Dog and Santa Monica that are able to find the right compromises in order to make their titles look so good, still not good enough to beat out games like Crysis.

another interesting article about the use of the SPUs

GTSaiyanjin2

Article's old and completely wrong. The 360 uses a CPU-based AA solution as well; it's called AAA (analytical anti-aliasing) and emulates 8x MSAA. It's used in the 360 version of Metro 2033, so all that crap about the 360 topping out at 4x MSAA is just BS really. And no offence, but a top-end PC can handle REAL 16x MSAA; it doesn't have to 'emulate' it. So no, the PS3 does not outperform top-end GPUs at all.

Graphics cards still take a massive hit on AA performance, whether it be 15-25% with 16x AA. It doesn't really matter what card you use; if you want to use that much AA it will come at a price. It has improved in recent years, but it's still not as efficient as it should be. It would be cool doing something like 16x MSAA on the CPU, since most games don't even take advantage of all the cores.

The "hit" would just be transferred to the CPU. On "fat" PC GPGPUs, the hit is irrelevant if the target is console-level 720p resolution at 30 FPS.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#97 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="Metroid_Other_M"]

According to

this article, the Ps3 can work marvels.

"The PS3 rendition of Pandemic's The Saboteur is different though. It's special. It's trying something new that's never been seen before on console, or indeed PC, and its results are terrific. In a best-case scenario you get edge-smoothing that is beyond the effect of 16x multi-sampling anti-aliasing, effectively delivering an effect better than the capabilities of high-end GPUs without crippling performance. Compare and contrast with Xbox 360 hardware, which tops out at 4x MSAA."

This is either BS because if that was true, 1st party devs should be able to make games that outperform anything on Pc, or it is the truth, making people scream "what a shame!" due to the fact that no one has really showed what the Cell is really capable of in outperforming even high-end gaming rigs.

what's your take? I tend to believe that it's just the ability of great devs such as Naughty Dog and Santa Monica that are able to find the right compromises in order to make their titles look so good, still not good enough to beat out games like Crysis.

another interesting article about the use of the SPUs

Lable1985

A game's performance and graphics are somewhat dependent on how you build the game to begin with, which is why some 2nd-gen games like Black looked every bit as good as some high-performing 3rd-gen games. So if designed right, you could make a game on any platform and make it look as good as the highest-performing platform out there. That's why Crysis 2 looks better than Crysis despite being built around consoles this time.

The runtime dataset on consoles would be smaller, i.e. less data to process, due to the consoles' memory limitations.

Avatar image for imprezawrx500
imprezawrx500

19187

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#98 imprezawrx500
Member since 2004 • 19187 Posts
Mid-range 2005 hardware will never outperform high-end 2010 hardware. They can put in effects that aren't in PC games yet, but the PC will always do the same effects better.
Avatar image for clone01
clone01

29843

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#100 clone01
Member since 2003 • 29843 Posts

[QUOTE="clone01"] you have proof?Eltormo

The PS3 is truly an amazing piece of hardware that is surprising everyone as time passes. Digital Foundry recently conducted an interesting analysis on the tech used in the game Saboteur, which revealed that the PS3 is capable of producing something even a high-end PC graphics card has difficulty with.

http://gamer.blorge.com/2010/01/05/ps3-smoothing-beyond-that-of-high-end-pc-graphics-card/

If you had read the first page before jumping in to ask for proof, you would have known it's not an article from Sony; it's a Digital Foundry analysis of a technique used in The Saboteur.

But that's just cherry-picking an answer. I could just as easily link to someone touting the power of the 360.