Intel is entering the GPGPU market - how will it affect consoles?


#1 rimnet00
Member since 2003 • 11003 Posts

NOTE: Integrated GPU != GPGPU

http://www.news.com/8301-13579_3-9895892-37.html?tag=nefd.blgs

Intel is entering the GPGPU market in '09-'10.

Sony
As far as I see it, this renders the CELL even more useless. Now, of course, Sony doesn't have to support CELL for their PS4 - but knowing them, I wouldn't be surprised if they did once again. That would mean using technology that will likely be rendered inferior to GPGPU concepts, all for the sake of proprietary hardware.

However, let's assume that they do take the route of supporting GPGPUs. That means they will no longer be able to convince their closest developers, and especially developers working on the CELL/PS3 portions of multiplats, that CELL will be a lucrative area to work in. This could hinder the potential of the PS3 going into the future once PS4's design architecture starts moving forward. Note that they should know what route they will be taking two years prior to release - which is a significant amount of time.

Microsoft
The next iteration of the Xbox will probably be released in 2010/2011, 4 to 5 years after the release of the 360. This means that if they want to adopt this new architecture, their system will likely be very expensive, while at the same time the technology may not be mature enough -- but nonetheless, it's going to be a risk. If they do not adopt this technology and it matures quickly enough, their system will be affected by platforms that take advantage of it.

Nintendo
They do what they want. I don't think it's even possible to figure out what's going on in their mind.

PCs
Upgrade when it's just right :) Oh man, I can't wait to get me one of these.... Ray Tracing is so close, I can taste it :D

#2 MrGrimFandango
Member since 2005 • 5286 Posts
Man that's sweet. Ray tracing, that's the new gameplay of the 21st century, right?
#3 rimnet00
Member since 2003 • 11003 Posts
Man thats sweet, Ray-tracing, thats the new gameplay of the 21st century right?MrGrimFandango

Gameplay? You mean graphics, eh?
#4 Meu2k7
Member since 2007 • 11809 Posts
For those of us that don't know... explain this GPGPU thing. :P
#5 darkmagician06
Member since 2003 • 6060 Posts
For those of us that dont know ... explain this GPGPU thing. :PMeu2k7
read the article?
#6 kage_53
Member since 2006 • 12671 Posts
Sony sold their Cell chips to Toshiba and aren't going to use them for the PS4. Also, this stuff is old. AMD and ATI have been working for a while to create a GPGPU.
#7 WildTurkey00
Member since 2005 • 4067 Posts
Actually, the title of this thread is misleading. Intel is not entering the GPU market. Intel is already in the market. In fact, Intel is actually the largest producer of graphics chips in the world. The high-end gaming chip market (which is what we all think of with regard to gaming) is a small sliver of the whole, and we all know Nvidia and ATI dominate that sliver.
#8 rimnet00
Member since 2003 • 11003 Posts

Sony sold their Cell chips to Toshiba and aren't going to use them for PS4. Also this stuff is old. AMD and ATI have been working on a while to create a GPGPU.kage_53

They sold their CELL chips to Toshiba? Think about what you are saying in that sentence.

The concept of GPGPU architecture is old, but it has not been implemented yet. This is an actual official roadmap. If you don't see the value in that -- fail.

#9 rexoverbey
Member since 2002 • 7622 Posts

Intel has been making onboard graphics for years and they have always sucked. I think this is just marketing hype much like the 128-teracore processor.

#10 yellowandmushy
Member since 2006 • 2095 Posts
Nintendo?
#11 rimnet00
Member since 2003 • 11003 Posts

Intel has been making onboard graphics for years and they have always sucked. I think this is just marketing hype much like the 128-teracore processor.

rexoverbey

Actually, the title of this thread is misleading. Intel is not entering the GPU market. Intel is already in the market. In fact, Intel is actually the largest producer of graphics chips in the world. The high end gaming chip market (which we all think of with regard to gaming) is a small sliver of the whole, which we all know Nvidia and ATI dominate that sliver.WildTurkey00

It clearly says "GPGPU", not GPU. So I don't see how you two are confused.

#12 MrGrimFandango
Member since 2005 • 5286 Posts

[QUOTE="MrGrimFandango"]Man thats sweet, Ray-tracing, thats the new gameplay of the 21st century right?rimnet00

Gameplay? You mean, graphics eh?

There ya go SKIPPER!

#13 PullTheTricker
Member since 2006 • 4749 Posts

Actually, the title of this thread is misleading. Intel is not entering the GPU market. Intel is already in the market. In fact, Intel is actually the largest producer of graphics chips in the world. The high end gaming chip market (which we all think of with regard to gaming) is a small sliver of the whole, which we all know Nvidia and ATI dominate that sliver.WildTurkey00

Integrated graphics card = fail

#14 mjarantilla
Member since 2002 • 15721 Posts

[QUOTE="MrGrimFandango"]Man thats sweet, Ray-tracing, thats the new gameplay of the 21st century right?rimnet00

Gameplay? You mean, graphics eh?

To most PS3/360 gamers, graphics = gameplay. :)

#15 MrGrimFandango
Member since 2005 • 5286 Posts

[QUOTE="rimnet00"][QUOTE="MrGrimFandango"]Man thats sweet, Ray-tracing, thats the new gameplay of the 21st century right?mjarantilla


Gameplay? You mean, graphics eh?

To most PS3/360 gamers, graphics = gameplay. :)

Yeah, that's what I was totally getting at!

#17 Bgrngod
Member since 2002 • 5766 Posts

GPGPUs will never catch on for the gaming industry. Trying to re-integrate graphics processing back onto the same chip as the CPU is a step backwards. It kicks the whole idea of an upgrade path in the nuts and requires a CPU replacement if you want to upgrade your GPU.

The only thing that needs to be taken out of that article is that Intel chips will have a new instruction set, SSE5 or something, in the future. Nothing big really.

#18 MrGrimFandango
Member since 2005 • 5286 Posts
Intel's graphics solutions are the disease of PC gaming.
#19 kage_53
Member since 2006 • 12671 Posts

[QUOTE="kage_53"]Sony sold their Cell chips to Toshiba and aren't going to use them for PS4. Also this stuff is old. AMD and ATI have been working on a while to create a GPGPU.rimnet00

They sold thier CELL chips to Toshiba? Think about what you are saying in that sentence.

The concepts of GPGPU architecture is old, but it has not been implimented yet. This is a actual official roadmap. If you don't see the value in that -- fail.

Why should I think about something that's true...

http://en.wikipedia.org/wiki/AMD_Fusion

#20 osan0
Member since 2004 • 18265 Posts

For those of us that dont know ... explain this GPGPU thing. :PMeu2k7

General-purpose graphics processing unit, I believe. At the moment a GeForce card or a Radeon card is designed for graphics. They can do other stuff (we have seen Radeons working on Folding@home, for example), and what they can do, they can do very fast, but they're not very flexible.

This is very interesting and seems to be going along the path of "a crap load of CPUs". Instead of having discrete dedicated hardware, there are just CPUs, and devs can use them as they see fit. Now, it would take a stonkingly huge amount of processing oomph to match a GPU in terms of graphics rendering performance... but plonk enough CPUs in there and you'll get it. It's just a question of cost and logistics. PCs have been going this way for a while now anyway (going from discrete pixel and vertex shaders to unified shaders, for example), so this just seems like the next step forward here.

I say "seems" because Intel are still a bit vague on what they're planning, and they will need a crap load of processing oomph to keep up with high-end graphics chips. Ray tracing, especially, needs a disgusting amount of horsepower to run in real time (some devs are now even arguing that using it would be a waste of time, really... the same effect can be achieved using more traditional methods).

Still, the plot thickens and the balance of power shifts as a great power rises in the west (oo epic :P). This could be the next big leap in computer graphics, the biggest one since the introduction of hardware transform and lighting. Or it could be Intel talking nonsense again and failing to deliver anything of interest.

And TC... it probably won't have any effect on the console market. MS might be interested in using it (costs permitting) and Sony will do their own thing. Sony no longer actively support the Cell and chances are it won't be used in the PS4... Sony will come up with some new thing, hype it big and have it utterly fail to deliver. Nintendo... will be unconcerned.
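To make the "crap load of general-purpose cores" idea concrete, below is a minimal sketch of the programming model such a chip implies, written as a CUDA kernel purely for illustration; the kernel, its names and the image size are made up for this example and have nothing to do with what Intel has announced. The point is that the same grid of simple cores runs a graphics-style per-pixel job here and could just as easily run folding, physics, or any other data-parallel work.

// Illustrative only: a per-pixel job expressed as a data-parallel kernel.
// The same cores could run any other embarrassingly parallel workload.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void shadePixels(unsigned char* image, int width, int height) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per pixel
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;
    // Stand-in "shader": a simple gradient. A real renderer would evaluate
    // lighting here; a physics job would do something else entirely.
    image[y * width + x] = (unsigned char)((x + y) % 256);
}

int main() {
    const int W = 640, H = 480;
    unsigned char* d_image = nullptr;
    cudaMalloc((void**)&d_image, W * H);             // buffer lives on the device
    dim3 block(16, 16);
    dim3 grid((W + block.x - 1) / block.x, (H + block.y - 1) / block.y);
    shadePixels<<<grid, block>>>(d_image, W, H);     // launch across all cores
    cudaDeviceSynchronize();
    cudaFree(d_image);
    printf("kernel finished\n");
    return 0;
}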

#21 Lazy_Boy88
Member since 2003 • 7418 Posts
Intel is already in the GPU market. They produce most of the terrible integrated stuff you find in prebuilts. They're probably not going to make any kind of impact on the high-end market with ATI/Nvidia having such a huge lead.
#22 rimnet00
Member since 2003 • 11003 Posts

GPGPU's will never catch on for the gaming industry. Trying to re-integrate graphics processing back onto the same chip as the CPU is a step backwards. It kicks the whole idea of an upgrade path in the nuts and requires a CPU replacement if you want to upgrade your GPU.

The only thing that needs to be taken out of that article is that Intel chips will have a new instructions set, SSE5 or something, in the future. Nothing big really.

Bgrngod

No way, dude, how is it a step backwards? A step backwards would suggest that CPUs already had dedicated components for calculating vectors - the lack of which is why GPUs were created in the first place. Why do you think AMD is working on the same thing? Why is nVidia working on adding CPU-style capability onto their GPUs, which they suggest will be on their roadmap soon?

This evolution of the way CPUs and GPUs are thought of is a step forward towards removing the latency between calculations that depend on both the CPU and the GPU. Lastly, once we step over the 'ray tracing constant' problem with fast enough CPUs, GPUs as we think of them today will likely not exist.

#23 Teuf_
Member since 2004 • 30805 Posts

The concepts of GPGPU architecture is old, but it has not been implimented yet. This is a actual official roadmap. If you don't see the value in that -- fail.

rimnet00


It most definitely has been implemented -- look up CUDA by Nvidia.
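For anyone who hasn't seen it, this is roughly what GPGPU code looks like with CUDA today: an ordinary numeric routine (SAXPY below, chosen only as a familiar example, not taken from any particular project) is written as a kernel and launched across the GPU's cores with the standard CUDA runtime calls. A minimal sketch, not production code:

#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

// y = a*x + y, one array element per GPU thread.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> hx(n, 1.0f), hy(n, 2.0f);
    float *dx = nullptr, *dy = nullptr;
    cudaMalloc((void**)&dx, n * sizeof(float));
    cudaMalloc((void**)&dy, n * sizeof(float));
    cudaMemcpy(dx, hx.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);   // 256 threads per block
    cudaMemcpy(hy.data(), dy, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("y[0] = %f (expect 4.0)\n", hy[0]);
    cudaFree(dx);
    cudaFree(dy);
    return 0;
}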
#24 dream431ca
Member since 2003 • 10165 Posts

Man thats sweet, Ray-tracing, thats the new gameplay of the 21st century right?MrGrimFandango

Ray tracing is actually an old way to do graphics; it's just that the hardware requirements are fairly high to get it to run smoothly.

The PS3 can do some ray tracing already with just the Cell chip. I think IBM and Sony are gonna be partners for a long time with this Cell chip business.

#25 Teuf_
Member since 2004 • 30805 Posts
Intel is already in the GPU market. They produce most all the terrible integrated stuff you find in prebuilts. They're probably not going to make any kind of impact into the high end market with ATI/Nvidia having such a huge lead.Lazy_Boy88


That's why Larrabee isn't going after the extreme high-end market.
#26 EnergyAbsorber
Member since 2005 • 5116 Posts

Sony
As from as I see it, this renders the CELL even more useless. Now, of course, Sony doesn't have to support CELL with their PS4 - but knowing them, I wouldn't be surprised if the did once again. This means, using technology that will likely be rendered inferior to the GPGPU concepts, all for sake of proprietary hardware.

rimnet00

Why would Sony have the same Cell in the PS4? It would be extremely outdated by then regardless. They aren't going to just keep the exact same technology for their next console. If that were the case, Sony would have just stuck with the Emotion Engine for the PS3.

#27 Teuf_
Member since 2004 • 30805 Posts
Lastly, once we step over the 'ray tracing constant' problem with fast enough CPUs, GPUs as we think of them today will likely not exist. rimnet00


There's nothing that says ray tracing is the holy grail of rendering (despite what Intel might claim). It's just a technique that happens to have certain benefits and certain (big) drawbacks. Many pro-ray-tracing arguments like to point out how new techniques will make ray tracing faster, but they don't mention that those new techniques almost always make traditional rasterization faster as well.

Most signs point to ray-tracing and rasterization being combined in certain scenarios, since they can very happily co-exist.
#28 Teuf_
Member since 2004 • 30805 Posts
[QUOTE="rimnet00"]

Sony
As from as I see it, this renders the CELL even more useless. Now, of course, Sony doesn't have to support CELL with their PS4 - but knowing them, I wouldn't be surprised if the did once again. This means, using technology that will likely be rendered inferior to the GPGPU concepts, all for sake of proprietary hardware.

EnergyAbsorber

Why would Sony have the same cell in PS4? It would be extremely outdated by then regardless. They aren't going to just keep the same exact technology for their next console. If that was the case then Sony would have just stuck with the Emotion Engine for PS3.



They wouldn't use the same chip; they're not Nintendo (har har har). They would use a "Cell 2", which would have the same architecture but probably with more SPEs and another PPE, as well as a higher clock speed. They would do this because Cell was designed to be scalable: you can double your theoretical power just by adding more SPEs. This isn't the case for the Emotion Engine, where tacking on more cores or vector units would significantly change the architecture. And you can't just bump up the clock speed, because you can only get the clock so high before your chip melts itself.
#29 MrGrimFandango
Member since 2005 • 5286 Posts

[QUOTE="MrGrimFandango"]Man thats sweet, Ray-tracing, thats the new gameplay of the 21st century right?dream431ca

Ray-tracing is actually an old way to do graphics, it's just the hardware requirements are fairly high to get it to run smooth.

The PS3 can do some ray tracing already with just the Cell chip. I think IBM and Sony are gonna be partners for a long time with this Cell chip business.

Yeah... you all just missed the sarcasm, didn't you? I mean, it wasn't that hard to detect; the sentence itself is pretty absurd.

#30 rimnet00
Member since 2003 • 11003 Posts
[QUOTE="rimnet00"]

The concepts of GPGPU architecture is old, but it has not been implimented yet. This is a actual official roadmap. If you don't see the value in that -- fail.

Teufelhuhn



It most definitely has been implemented -- look up CUDA by Nvidia.

I never realized how vaguely defined GPGPUs really were. More precisely, I was referring to GPGPUs which have dedicated components for doing both linear and vector calculations, as opposed to mapping across paradigms, which is very inefficient -- i.e. CPUs doing vector calculations and GPUs doing hardcore linear computation.

#33 rimnet00
Member since 2003 • 11003 Posts

[QUOTE="rimnet00"]Lastly, once we step over the 'ray tracing constant' problem with fast enough CPUs, GPUs as we think of them today will likely not exist. Teufelhuhn


There's nothing that says ray tracing is the holy grail of rendering (depsite what Intel might claim). It's just a technique that happens to have certain benefits and certain (big) drawbacks. Many arguements that are pro ray-tracing like to say how new techniques will make ray-tracing faster, but they don't mention that those new techniques almost always make traditional rasterization faster as well.

Most signs point to ray-tracing and rasterization being combined in certain scenarios, since they can very happily co-exist.

Ray tracing is exponentially faster than rasterization, barring the computational constant that plagues it. I'm not using the term 'exponentially' in any exaggerating fashion either -- I mean literally, the algorithm is able to be exponentially faster than rasterization at any given resolution.

I do, however, agree that rasterization won't disappear. That said, I'm definitely one to hype ray tracing, seeing as every leading computer science department in the country with a computer graphics group is doing just about the same. Theoretically, it's the only thing that makes sense in the future - barring any new discoveries.

#34 dream431ca
Member since 2003 • 10165 Posts

[QUOTE="Teufelhuhn"][QUOTE="rimnet00"]Lastly, once we step over the 'ray tracing constant' problem with fast enough CPUs, GPUs as we think of them today will likely not exist. rimnet00



There's nothing that says ray tracing is the holy grail of rendering (depsite what Intel might claim). It's just a technique that happens to have certain benefits and certain (big) drawbacks. Many arguements that are pro ray-tracing like to say how new techniques will make ray-tracing faster, but they don't mention that those new techniques almost always make traditional rasterization faster as well.

Most signs point to ray-tracing and rasterization being combined in certain scenarios, since they can very happily co-exist.

Ray tracing is exponentially faster then rasterization, baring the computational constant that plagues it. I'm not using the term 'exponentially' in any exaggerating fashion either -- I mean literally, the algorithm is able to expoentially faster then rasterization at any given resolution.

I do however agree that rasterization won't disappear. However, I'm definitly one to hype ray tracing seeing as every leading computer science department in the country, with a computer graphics department is doing just about the same. Theoretically, it the only things that makes sense in the future - barring any new discoveries.

I believe the first thing to actually do ray tracing really well was the PS3. Unfortunately, you need three of them hooked together to accomplish it.

#35 dream431ca
Member since 2003 • 10165 Posts
[QUOTE="rimnet00"]

[QUOTE="Teufelhuhn"][QUOTE="rimnet00"]Lastly, once we step over the 'ray tracing constant' problem with fast enough CPUs, GPUs as we think of them today will likely not exist. dream431ca



There's nothing that says ray tracing is the holy grail of rendering (depsite what Intel might claim). It's just a technique that happens to have certain benefits and certain (big) drawbacks. Many arguements that are pro ray-tracing like to say how new techniques will make ray-tracing faster, but they don't mention that those new techniques almost always make traditional rasterization faster as well.

Most signs point to ray-tracing and rasterization being combined in certain scenarios, since they can very happily co-exist.

Ray tracing is exponentially faster then rasterization, baring the computational constant that plagues it. I'm not using the term 'exponentially' in any exaggerating fashion either -- I mean literally, the algorithm is able to expoentially faster then rasterization at any given resolution.

I do however agree that rasterization won't disappear. However, I'm definitly one to hype ray tracing seeing as every leading computer science department in the country, with a computer graphics department is doing just about the same. Theoretically, it the only things that makes sense in the future - barring any new discoveries.

I believe the first thing to actually do ray-tracing really well was the PS3. Unfortunatly, you need 3 of them hooked together to accomplish it well.

EDIT: Sorry, GT5 Prologue has ray tracing in it already, so I guess the PS3 is ahead of its time.

#36 rimnet00
Member since 2003 • 11003 Posts

EDIT: Sorry, GT5 prologue has ray tracing in it already, so I guess the PS3 is ahead of it's time.

dream431ca

:| Link? I imagine at most it's some dev being clever and referring to something like object selection through 'mouse clicking' in 3D space as 'ray tracing'.

#37 dream431ca
Member since 2003 • 10165 Posts
[QUOTE="dream431ca"]

EDIT: Sorry, GT5 prologue has ray tracing in it already, so I guess the PS3 is ahead of it's time.

rimnet00

:| Link? I imagine at most it's some dev being clever, and refering to something a-like 'mouse clicking' in 3D space, as 'ray tracing'.

Visuals
Just as past games in the series have pushed the PlayStation and PlayStation 2 to their limits, Gran Turismo 5 should be a system showpiece for the PlayStation 3. The pre-race screen shows your car being worked on by your pit crew in a garage, and these scenes will feature full HDR, ray-traced lighting. These scenes are stunning and easily rival anything pre-rendered footage could throw at the screen.

http://ps3.ign.com/articles/813/813424p2.html

It's true. The PS3 is the first console to use ray tracing in games. I guess the Cell chip isn't that bad after all.

#39 kage_53
Member since 2006 • 12671 Posts
[QUOTE="rimnet00"][QUOTE="dream431ca"] EDIT: Sorry, GT5 prologue has ray tracing in it already, so I guess the PS3 is ahead of it's time. dream431ca
:| Link? I imagine at most it's some dev being clever, and refering to something a-like 'mouse clicking' in 3D space, as 'ray tracing'.

Visuals Just as past games in the series have pushed the PlayStation and PlayStation 2 to their limits, Gran Turismo 5 should be a system showpiece for the PlayStation 3. The pre-race screen shows your car being worked on by your pit crew in a garage, and these scenes will feature full HDR, ray-traced lighting. These scenes are stunning and easily rival anything pre-rendered footage could throw at the screen. http://ps3.ign.com/articles/813/813424p2.html It's true. PS3 is the first to use ray tracing in games for a console. I Guess the Cell Chip isn't that bad after all.

It's not actual gameplay, though.
#40 dream431ca
Member since 2003 • 10165 Posts

[QUOTE="dream431ca"][QUOTE="rimnet00"][QUOTE="dream431ca"] EDIT: Sorry, GT5 prologue has ray tracing in it already, so I guess the PS3 is ahead of it's time. kage_53
:| Link? I imagine at most it's some dev being clever, and refering to something a-like 'mouse clicking' in 3D space, as 'ray tracing'.

Visuals Just as past games in the series have pushed the PlayStation and PlayStation 2 to their limits, Gran Turismo 5 should be a system showpiece for the PlayStation 3. The pre-race screen shows your car being worked on by your pit crew in a garage, and these scenes will feature full HDR, ray-traced lighting. These scenes are stunning and easily rival anything pre-rendered footage could throw at the screen. http://ps3.ign.com/articles/813/813424p2.html It's true. PS3 is the first to use ray tracing in games for a console. I Guess the Cell Chip isn't that bad after all.

Its not actual gameplay though

But it's also not pre-rendered. It's real time, so even though you're not driving the car while the ray tracing is going on, it's still the first console to use ray tracing in an actual game. That is quite a feat.

#41 rimnet00
Member since 2003 • 11003 Posts
[QUOTE="rimnet00"][QUOTE="dream431ca"]

EDIT: Sorry, GT5 prologue has ray tracing in it already, so I guess the PS3 is ahead of it's time.

dream431ca

:| Link? I imagine at most it's some dev being clever, and refering to something a-like 'mouse clicking' in 3D space, as 'ray tracing'.

Visuals
Just as past games in the series have pushed the PlayStation and PlayStation 2 to their limits, Gran Turismo 5 should be a system showpiece for the PlayStation 3. The pre-race screen shows your car being worked on by your pit crew in a garage, and these scenes will feature full HDR, ray-traced lighting. These scenes are stunning and easily rival anything pre-rendered footage could throw at the screen.

http://ps3.ign.com/articles/813/813424p2.html

It's true. PS3 is the first to use ray tracing in games for a console. I Guess the Cell Chip isn't that bad after all.

The author of that article must have drunk too much, or there is serious spin on it, especially considering that IGN appears to be the only source of this material. The most I get out of a search is that it's probably a partially ray-traced scene that only happens in the garage. At the same time, I imagine much of the computation would be cached, as even real-time ray tracing that dynamically calculates the light off a single 3D model is computationally ridiculous.

edit:

But it's also not pre-rendered. It's real time, so even though your not driving the car while ray tracing is going on, it's still the first console to use ray tracing in an actual game. That is quite a feat.

dream431ca

That is true, though I still question how much is actually being done. If it is just the HDR bloom effect in the garage, like you suggested, then I can sorta see that happening. But refracted lighting... no way.

#42 crucifine
Member since 2003 • 4726 Posts
I will wait for a Raytracing/Rasterizing hybrid renderer, thank you very much.
#43 dream431ca
Member since 2003 • 10165 Posts
[QUOTE="dream431ca"][QUOTE="rimnet00"][QUOTE="dream431ca"]

EDIT: Sorry, GT5 prologue has ray tracing in it already, so I guess the PS3 is ahead of it's time.

rimnet00

:| Link? I imagine at most it's some dev being clever, and refering to something a-like 'mouse clicking' in 3D space, as 'ray tracing'.

Visuals
Just as past games in the series have pushed the PlayStation and PlayStation 2 to their limits, Gran Turismo 5 should be a system showpiece for the PlayStation 3. The pre-race screen shows your car being worked on by your pit crew in a garage, and these scenes will feature full HDR, ray-traced lighting. These scenes are stunning and easily rival anything pre-rendered footage could throw at the screen.

http://ps3.ign.com/articles/813/813424p2.html

It's true. PS3 is the first to use ray tracing in games for a console. I Guess the Cell Chip isn't that bad after all.

The author of that article must have drank too much, or there is serious spin on that article. Especially, considering it appears that IGN is the only source of this material. The most I get out a search is, it's probably partially ray traced scene, that only happened in the 'garage'. At the same time, I imagine much of the computation would be cached, as even real time ray tracing, that is dynamically calculating the light off a single 3D model is computationally ridiculous.

edit:

But it's also not pre-rendered. It's real time, so even though your not driving the car while ray tracing is going on, it's still the first console to use ray tracing in an actual game. That is quite a feat.

dream431ca

That is true, though I still question how much is actually being done.

But we can't deny it. It's a first, and that's only the Cell chip doing the work. I imagine we will see more ray tracing on the PS3 in the future, once the Cell is fully optimized and developers can take full advantage of the processor.

#44 rimnet00
Member since 2003 • 11003 Posts

But we can't deny it. It's a first, and that's only the Cell chip doing the work. I imagine we will see more ray tracing on the PS3 in the future, once the Cell is fully optimized and developers can take the full advantage of the processor.

dream431ca

If it is in fact using ray tracing techniques to calculate the HDR bloom effect in the garage, then yes, it's neat and impressive. However, there is still no evidence outside of this IGN article that really tells anyone what is going on. Developers love to throw things out there, and frankly "ray tracing" gets thrown around as much as "4D graphics", which is why I am highly skeptical.

#45 Senor_Kami
Member since 2008 • 8529 Posts

[QUOTE="rimnet00"][QUOTE="MrGrimFandango"]Man thats sweet, Ray-tracing, thats the new gameplay of the 21st century right?mjarantilla


Gameplay? You mean, graphics eh?

To most PS3/360 gamers, graphics = gameplay. :)

Whoa, take the 360 out of that and add PC. Go in a thread about PC games and all you'll see is people mentioning framerate and polygons per second.

#46 dream431ca
Member since 2003 • 10165 Posts

Hey Rimnet, I see you're interested in ray tracing a lot. So am I. I'll give you a link to a document about the Cell chip vs. other processors when it comes to ray tracing. It's a pretty long article. Go to page 7 to see a chart comparing other processors on various forms of ray tracing:

http://graphics.cs.uni-sb.de/~benthin/cellrt06.pdf

#47 rimnet00
Member since 2003 • 11003 Posts

Hey Rimnet, I see your interested in ray tracing a lot. So am I. I'll give you a link to a document about the Cell chip vs. other processors when it comes to ray tracing. It's a pretty long article. Go to page 7 to see a chart comparing other processors with various forms of ray tracing:

http://graphics.cs.uni-sb.de/~benthin/cellrt06.pdf

dream431ca

Yah, I've actually written my own ray tracers ;) See http://graphics.cs.umass.edu/publications.php if you want to see some of the work I did while working under Rui Wang. I'll take a look at that article when I get a chance.

#48 Teuf_
Member since 2004 • 30805 Posts
I never realized how vaguely defined GPGPUs really were. More precisely, I was refering to GPGPUs which have dedicated components for doing both linear and vector calculations, as opposed to mapping to across paradigms which is very ineffecient -- ie CPUs doing vector calculations, and GPUs doing hardcore linear computation. rimnet00


GPGPU isn't a kind of hardware (although I guess you could have hardware designed for it); it's just a branch of programming where GPUs are used for general-purpose calculations. It's been going on since before Nvidia came up with an API for it.
#49 Teuf_
Member since 2004 • 30805 Posts


Ray tracing is exponentially faster then rasterization, baring the computational constant that plagues it. I'm not using the term 'exponentially' in any exaggerating fashion either -- I mean literally, the algorithm is able to expoentially faster then rasterization at any given resolution.

rimnet00


You can't say one is faster than the other without giving some sort of basis. How many objects, and are they dynamic? What kind of acceleration structures are you using? (I assume your "exponentially faster" algorithm is using some sort of acceleration structure, because without them ray tracing is basically rasterization performed for every single pixel, and I'd love for you to explain how that is somehow faster.) And I hope you're not going to argue that ray tracing is somehow faster in practical scenarios, given the huge gap between traditional HW-assisted rasterization and any real-time ray-tracing implementation in existence.
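To put rough numbers on the acceleration-structure point (a back-of-envelope sketch only; the pixel and triangle counts below are arbitrary assumptions, not measurements from any game or paper): without one, every ray tests every triangle, which is the same pixels-times-triangles amount of work as rasterization turned inside out, and it is the BVH or kd-tree that brings the per-ray cost down to roughly the logarithm of the scene size. This is host-side CUDA C++ (no kernel needed), so it compiles with nvcc or any C++ compiler.

#include <cstdio>
#include <cmath>

int main() {
    // Arbitrary example scene: a 720p frame and one million triangles.
    const double pixels = 1280.0 * 720.0;
    const double triangles = 1.0e6;

    // Rasterization-style cost model: each triangle is processed once
    // (ignoring overdraw, shading cost, and all constant factors).
    const double raster = triangles;

    // Brute-force ray tracing: every primary ray against every triangle.
    const double brute_rt = pixels * triangles;

    // Ray tracing with an acceleration structure: roughly log2(triangles)
    // node/leaf tests per primary ray for a reasonably built BVH/kd-tree.
    const double accel_rt = pixels * std::log2(triangles);

    printf("relative work (arbitrary units):\n");
    printf("  rasterization        ~ %.2e\n", raster);
    printf("  brute-force RT       ~ %.2e\n", brute_rt);
    printf("  RT with accel. tree  ~ %.2e\n", accel_rt);
    return 0;
}

Even with that asymptotic advantage, the per-ray constants are what keep real-time ray tracing well behind hardware rasterization in practice, which is the gap Teuf_ refers to above.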
#50 rexoverbey
Member since 2002 • 7622 Posts
[QUOTE="rexoverbey"]

Intel has been making onboard graphics for years and they have always sucked. I think this is just marketing hype much like the 128-teracore processor.

rimnet00

Actually, the title of this thread is misleading. Intel is not entering the GPU market. Intel is already in the market. In fact, Intel is actually the largest producer of graphics chips in the world. The high end gaming chip market (which we all think of with regard to gaming) is a small sliver of the whole, which we all know Nvidia and ATI dominate that sliver.WildTurkey00

It clearly says "GPGPU", not GPU. So I don't see how you two are confused.

After a title edit. On top of that, it's a VPU since it is not a real general-purpose GPU.