Larrabee (Intel's new GPGPU)

This topic is locked from further discussion.


#1 gamer7890
Member since 2009 • 114 Posts

Larrabee is Intel's new graphics chip, and it appears to be very... promising...

Larrabee, faster graphics, and real-time ray tracing (RTRT)

Intel employee Daniel Pohl was on hand at last week's IDF to give demonstrations of a version of Quake 4 that uses real-time ray tracing running on Intel hardware. Charlie at the Inquirer managed to catch the demo, and he published an account of it this morning that attempts to get at where the GPU is eventually headed as a product.

Pohl is the German computer science student behind the ray-traced versions of Quake 3 and 4 that have been featured on Digg and Slashdot. For his master's thesis, he built a version of Quake 4 that uses real-time ray tracing to achieve some pretty remarkable effects: shadows are correctly cast and rendered in real time, water has the proper reflections, indirect lighting looks like it's supposed to, etc. He was later hired by Intel, and now he's working within their graphics unit on real-time ray tracing for games.

We've covered Intel's ray tracing research in the past, and there's no doubt that Intel is serious about bringing this technology to real-time 3D games. The real questions, however, concern what kind of ray tracing will be used, to what extent, and to what effect. I'll do my best to untangle these questions briefly by relying on email feedback and exchanges with folks who know much more than I do about this issue but who've asked not to be named.

When non-graphics people hear "ray tracing," they think of the kind that takes hours per frame to render. This kind of ray tracing is called "global illumination," and it involves computing the paths of rays that come directly from the light sources in a scene (direct rays) and rays that reach the viewer as a result of reflections (indirect rays). Doing ray tracing for the former type of ray can be hard, but it's not nearly as difficult as doing both types at once. Because of the difficulty of solving the global illumination problem for a scene, most ray tracing involves various tricks and approximations for simulating the effects of indirect rays.
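The direct/indirect split the article describes maps naturally onto a recursive tracer: primary rays are shaded from the light directly, and reflective surfaces spawn secondary (indirect) rays up to a bounce limit. Here's a deliberately tiny Python sketch of that structure; the scene, reflectivities, and light direction are all invented for illustration, and this is nothing like Pohl's actual engine:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def hit_sphere(origin, direction, center, radius):
    """Distance to the nearest intersection along a unit ray, or None."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

# Invented scene: (center, radius, reflectivity)
SPHERES = [((0.0, 0.0, -3.0), 1.0, 0.8),
           ((2.0, 0.0, -4.0), 1.0, 0.0)]
LIGHT_DIR = normalize((1.0, 1.0, 1.0))

def trace(origin, direction, depth=0):
    """Return a grayscale intensity: direct lighting at the hit point,
    plus a recursive 'indirect' contribution for reflective surfaces."""
    hits = [(hit_sphere(origin, direction, c, r), (c, r, refl))
            for c, r, refl in SPHERES]
    hits = [(t, s) for t, s in hits if t is not None]
    if not hits:
        return 0.1                                  # background
    t, (center, radius, refl) = min(hits)
    point = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(sub(point, center))
    direct = max(0.0, dot(normal, LIGHT_DIR))       # direct-ray shading
    if refl > 0 and depth < 3:                      # bounded indirect bounces
        bounce = sub(direction,
                     tuple(2.0 * dot(direction, normal) * n for n in normal))
        return (1 - refl) * direct + refl * trace(point, normalize(bounce), depth + 1)
    return direct
```

The `depth` cutoff is what keeps the indirect-ray cost bounded; full global illumination is expensive precisely because realistic scenes need many such bounces per pixel.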

The problem with the tricks and approximations is that the indirect rays are what make the scene look realistic in ways that standard raster graphics can't accomplish. If you're only doing direct rays, then ray tracing has no visual advantage over rasterization.

The limited types of non-diffuse "indirect rays" traced in the Quake 4 demo (multibounce specular reflections and glass refraction) have computational demands that are similar to calculating simple eye rays (i.e., they have a high degree of coherence, so they can be calculated using wide SIMD bundles). These effects do indeed look better than rasterized knockoffs, but the tradeoff is that they're still very computationally intense, and the overall look of the game engine isn't really that much more photorealistic than what you can do with shaders.
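The "wide SIMD bundles" remark refers to packet tracing: coherent rays (such as neighboring eye rays) take near-identical paths through the intersection math, so a whole bundle can be tested against geometry in one vectorized pass. A hedged NumPy sketch of the idea follows; the array shapes and single-sphere test are illustrative, not Larrabee's actual vector-unit code:

```python
import numpy as np

def hit_sphere_packet(origins, directions, center, radius):
    """Intersect a whole bundle of rays against one sphere at once.
    origins/directions: (N, 3) arrays of ray origins and unit directions;
    returns an (N,) boolean hit mask computed in a single vectorized pass."""
    oc = origins - center
    b = 2.0 * np.einsum('ij,ij->i', oc, directions)   # per-ray dot products
    c = np.einsum('ij,ij->i', oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    return disc >= 0.0

# 16 coherent primary rays (same origin, identical direction here for
# simplicity) processed together, SIMD-style:
origins = np.zeros((16, 3))
dirs = np.tile([0.0, 0.0, -1.0], (16, 1))
mask = hit_sphere_packet(origins, dirs,
                         center=np.array([0.0, 0.0, -3.0]), radius=1.0)
```

Incoherent rays (e.g. after several bounces) diverge, the mask fragments, and the SIMD lanes go idle; that is why multi-bounce global illumination vectorizes so poorly compared to eye rays.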

Therein lies the drawback of the Quake 4 demo: because of the demands of the ray-tracing engine, Pohl had to swap out many of the game's detailed textures in favor of reflective surfaces that exploit ray tracing, because there was no horsepower left over for texturing. It's also the case that the demo required four quad-core machines ganged together, and even then it didn't run at a playable framerate. So those reflective and refractive effects are nice, but they're not worth that kind of horsepower requirement, especially when you compare the overall look of the resulting game to what a single G80 can do with the right coding.

Comparison with the Cell Broadband Engine

Larrabee's philosophy of using many small, simple cores is similar to the ideas behind the Cell processor. There are some further commonalities, such as the use of a high-bandwidth ring bus to communicate between cores.[7] However, there are many significant differences in implementation which should make programming Larrabee simpler.

  • The Cell processor includes one main processor which controls many smaller processors; additionally, the main processor can run an operating system. In contrast, all of Larrabee's cores are identical, and Larrabee is not expected to run an OS.
  • Each compute core in the Cell (an SPE) has a local store, and explicit DMA operations are required for all accesses to DRAM; ordinary reads/writes to DRAM are not allowed. In Larrabee, all on-chip and off-chip memory sits under an automatically managed, coherent cache hierarchy, so its cores effectively share a uniform memory space through standard load/store instructions.[7]
  • Because of the cache coherency noted above, each program running on Larrabee sees a large, virtually linear memory, just as on a traditional general-purpose CPU. An application for Cell, by contrast, must be programmed around the limited memory footprint of the local store associated with each SPE (for details see this article), though with theoretically higher bandwidth.
  • Cell uses DMA for data transfer to/from its on-chip local memories, which offers flexibility and throughput; Larrabee instead uses special instructions for cache manipulation (notably cache eviction hints and prefetch instructions), which lets it maintain cache coherence (and hence a standard memory hierarchy) while boosting performance for rendering pipelines and other stream-like computation.[7]
  • Each compute core in the Cell runs only one thread at a time, in order. A Larrabee core runs up to four threads; this hyperthreading helps hide latencies and compensates for the lack of out-of-order execution.
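The memory-model difference in the list above can be caricatured in a few lines: on Cell, software must explicitly stage DRAM data into each SPE's small local store before computing on it, while on Larrabee a core simply issues loads and the coherent caches do the staging. A toy Python sketch follows; the `chunk` size and the DMA-as-slice-copy are stand-ins, not real Cell or Larrabee APIs:

```python
# Cell-style: software stages data into a small local store via explicit "DMA"
def cell_style_sum(dram, chunk=256):
    total = 0
    for start in range(0, len(dram), chunk):
        local_store = dram[start:start + chunk]  # explicit copy-in (DMA)
        total += sum(local_store)                # compute only on local data
    return total

# Larrabee-style: cores just load through a coherent cache hierarchy,
# so the program addresses all of "DRAM" directly
def larrabee_style_sum(dram):
    return sum(dram)  # hardware caching is transparent to the program
```

Both produce the same result; the point is who manages data movement, the programmer (Cell) or the cache hardware (Larrabee).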

(Image: Larrabee block diagram slide)

(Image: core-count scaling slide)

Intel has purchased Project Offset and plans to use it as a showcase game for Larrabee

(old screens of project offset)


#2 mayforcebeyou
Member since 2007 • 2703 Posts
i hope it's good and does it need a special motherboard?

#3 gamer7890
Member since 2009 • 114 Posts
i hope it's good and does it need a special motherboard? mayforcebeyou
to be honest i have no idea :P im just posting some info i found, but: "Larrabee is the codename for a graphics processing unit (GPU) chip that Intel is developing separately from its current line of integrated graphics accelerators. Larrabee is expected to compete with GeForce and Radeon products from NVIDIA and ATI respectively. Larrabee will also compete in the GPGPU and high-performance computing markets. Intel planned to have engineering samples of Larrabee ready by the end of 2008, with a video card featuring Larrabee hitting shelves in late 2009 or early 2010.[1]"

#4 osan0
Member since 2004 • 18244 Posts
the GPU market is going to get very interesting soon. GPUs are basically changing from a highly specialised bit of kit used for a very specific task to just a crapload of processors on a die (my card has 128, for example). thats a gross simplification of course, but thats basically whats happening. it also opens the door for more competition: as long as a card is DirectX and OpenGL compliant, games will work on it. larrabee looks pretty cool, but im going to reserve judgement until i see some hard benchmarking figures. for it to succeed, it needs to work well with existing games as well as future games. as long as larrabee complies with DirectX and OpenGL (which i think it does), intel should be fine on the support front, so its a case of performance and, more importantly, performance vs price. another company that may enter the fray is Creative. i was just reading about their new Zii processor. at the mo they seem to be focusing it at embedded devices, but you never know... they may decide to give the GPU market a shot. the processor certainly has the potential to get attention.

#5 110million
Member since 2008 • 14910 Posts
Awesome. It's good to see they're making all of this a reality. I studied similar upcoming technologies in my CPU architecture class in college, but I didn't think I'd see anything so soon. From some of the things I researched, they will probably need new kinds of motherboards, though I don't know how this specific tech works other than what I've read here, so I'm not sure. :P

#6 z4twenny
Member since 2006 • 4898 Posts
sounds terrible honestly. a gpu that can't play games..... isn't that kinda, well whats the word im looking for here?..... oh, yes, thats right.... USELESS

#7 gamer7890
Member since 2009 • 114 Posts
the GPU market is going to get very interesting soon. GPUs are basically changing from a highly specialised bit of kit used for a very specific task to just a crapload of processors on a die (my card has 128, for example). thats a gross simplification of course, but thats basically whats happening. it also opens the door for more competition: as long as a card is DirectX and OpenGL compliant, games will work on it. larrabee looks pretty cool, but im going to reserve judgement until i see some hard benchmarking figures. for it to succeed, it needs to work well with existing games as well as future games. as long as larrabee complies with DirectX and OpenGL (which i think it does), intel should be fine on the support front, so its a case of performance and, more importantly, performance vs price. another company that may enter the fray is Creative. i was just reading about their new Zii processor. at the mo they seem to be focusing it at embedded devices, but you never know... they may decide to give the GPU market a shot. the processor certainly has the potential to get attention.osan0
well i need to find the link to it, but i read larrabee can perform realtime raytracing at like 70FPS on a 32-core version, in a complex room with a single light source

#8 gamer7890
Member since 2009 • 114 Posts
sounds terrible honestly. a gpu that can't play games..... isn't that kinda, well whats the word im looking for here?..... oh, yes, thats right.... USELESS z4twenny
cant play games !!???

#9 Master-Thief-09
Member since 2009 • 2534 Posts
Heh, and people call Alan Wake vaporware. Good to know this is still in development though.

#10 z4twenny
Member since 2006 • 4898 Posts

[QUOTE="z4twenny"]sounds terrible honestly. a gpu that can't play games..... isn't that kinda, well whats the word im looking for here?..... oh, yes, thats right.... USELESS gamer7890
cant play games !!???

yes, the article stated that it ran quake 4 with 4 quad cores hooked up just to run at an unplayable frame rate. that to me says it can't be used for gaming, which i think is kinda the whole point of getting a gpu.


#11 gamer7890
Member since 2009 • 114 Posts

[QUOTE="gamer7890"][QUOTE="z4twenny"]sounds terrible honestly. a gpu that can't play games..... isn't that kinda, well whats the word im looking for here?..... oh, yes, thats right.... USELESS z4twenny

cant play games !!???

yes, in the article it stated that it ran quake 4 with 4 quad cores hooked up just to run it at an unplayable frame rate. that to me says it can't be used for gaming which i think is kinda the whole point of getting a gpu.

it was doing realtime raytracing...

#12 z4twenny
Member since 2006 • 4898 Posts
^ regardless of what it was doing, you couldn't play a game with it. while ray tracing does look nice, they're going to need to do something that is backwards compatible, and it will also need to support textures of some form (or an alternative that continues to allow for photorealism), not just "shiny" objects, and it will obviously need to run at a decent framerate with a minimum number of processors. i don't know anyone who would spend $4k+ just on the cpu for a rig to play a couple of games. i like gaming and all, but im not spending that kind of money to play a photorealistic game.

#13 AnnoyedDragon
Member since 2006 • 9948 Posts

^ regardless of what it was doing, you couldn't play a game with it. z4twenny

The chart with games on it escaped your attention how?


#14 Teuf_
Member since 2004 • 30805 Posts
i hope it's good and does it need a special motherboard? mayforcebeyou


I'm sure they'll just have it use whatever variant of PCI-e is popular at the time when it comes out

#16 gamer7890
Member since 2009 • 114 Posts
[QUOTE="mayforcebeyou"]i hope it's good and does it need a special motherboard? Teufelhuhn


I'm sure they'll just have it use whatever variant of PCI-e is popular at the time when it comes out

from what i read, its set for release at the end of this year or early 2010.

#17 z4twenny
Member since 2006 • 4898 Posts

[QUOTE="z4twenny"]^ regardless of what it was doing, you couldn't play a game with it. AnnoyedDragon

The chart with games on it escaped your attention how?

you seem to miss the point. ok, it can render stuff to look shiny and reflective; don't get me wrong, raytracing looks good. but if you want to continue to have stuff looking like polished aluminum, glass and plastic people, then thats cool. me personally, im not all for that; i'd rather have people that look like people, not photographed toys under light sources. as for the "chart with games on it", all i see are "number of larrabee cores" and a 1-5 rating. 1-5 what? fps? the whole 1-5 rating doesn't mean much to me.


#18 gamer7890
Member since 2009 • 114 Posts
[QUOTE="z4twenny"][QUOTE="AnnoyedDragon"]

^ regardless of what it was doing, you couldn't play a game with it. z4twenny

The chart with games on it escaped your attention how?

you seem to miss the point. ok, it can render stuff to look shiny and reflective; don't get me wrong, raytracing looks good. but if you want to continue to have stuff looking like polished aluminum, glass and plastic people, then thats cool. me personally, im not all for that; i'd rather have people that look like people, not photographed toys under light sources. as for the "chart with games on it", all i see are "number of larrabee cores" and a 1-5 rating. 1-5 what? fps? the whole 1-5 rating doesn't mean much to me.

"Benchmarking results from the recent SIGGRAPH paper, showing performance as an approximate linear function of the number of processing cores."

#19 z4twenny
Member since 2006 • 4898 Posts
^ thats still fairly irrelevant. is 5 playable? is 5 barely passable? is 1 a slideshow? im not asking for much if i ask for fps in these games instead of a "1-5 scale". i rate myself a 10 on a 1-10 scale, but then the question becomes: is 1 good? is 10 good? if 10 is good, what exactly validates that rating?

#20 lowe0
Member since 2004 • 13692 Posts
Larrabee is interesting in that it gets us where Nvidia is clearly heading - instead of HLSL, you just get a C compiler for the GPU. At that point, anything that makes sense to run on the GPU can be moved there - especially since PCIe is fast in both directions, unlike AGP. You could offload something like sound occlusion to the GPU, then send the results back to the CPU for the final audio routines.

#21 EXLINK
Member since 2003 • 5719 Posts
^ thats still fairly irrelevant. is 5 playable? is 5 barely passable? is 1 a slideshow? im not asking for much if i ask for fps in these games instead of a "1-5 scale". i rate myself a 10 on a 1-10 scale, but then the question becomes: is 1 good? is 10 good? if 10 is good, what exactly validates that rating? z4twenny
That graph is showing the scale of performance. If I read it correctly, Larrabee will be 5 times faster when it uses all 48 of its cores compared to just 8 of them. That's not measuring FPS.

#22 AnnoyedDragon
Member since 2006 • 9948 Posts
[QUOTE="AnnoyedDragon"]

[QUOTE="z4twenny"]^ regardless of what it was doing, you couldn't play a game with it. z4twenny

The chart with games on it escaped your attention how?

you seem to miss the point. ok, it can render stuff to look shiny and reflective; don't get me wrong, raytracing looks good. but if you want to continue to have stuff looking like polished aluminum, glass and plastic people, then thats cool. me personally, im not all for that; i'd rather have people that look like people, not photographed toys under light sources. as for the "chart with games on it", all i see are "number of larrabee cores" and a 1-5 rating. 1-5 what? fps? the whole 1-5 rating doesn't mean much to me.

You're focusing far too much on the ray tracing aspect; ray tracing is something this most likely won't touch in its product life cycle. People just like ray tracing demos to show off hardware power.

Read under "Preliminary performance data" from the wiki.


#23 Baranga
Member since 2005 • 14217 Posts

^ regardless of what it was doing, you couldn't play a game with it.z4twenny

Project Offset?


#24 z4twenny
Member since 2006 • 4898 Posts

[QUOTE="z4twenny"]^ thats still fairly irrelevant. is 5 playable? is 5 barely passable? is 1 a slideshow? im not asking for much if i ask for fps in these games instead of a "1-5 scale". i rate myself a 10 on a 1-10 scale, but then the question becomes: is 1 good? is 10 good? if 10 is good, what exactly validates that rating? EXLINK
That graph is showing the scale of performance. If I read it correctly, Larrabee will be 5 times faster when it uses all 48 of its cores compared to just 8 of them. That's not measuring FPS.

ok, but still, after all this nobody has stated much of anything other than "48 cores will perform better than 8 cores", which i got from reading the graph; i'm not that slow. my question still hasn't been answered on what the 1-5 scale represents. obviously 5 is better than 1, but in the original statement 4 quads were hooked up and it wasn't rendering at playable rates... so once again, no real numbers are given to validate how much better its going to be. all i really read out of the article was "our GPU can do awesome graphics; unfortunately, because the graphics are so awesome, the 4 year old test game we have for it is unplayable"


#25 Arsuz
Member since 2003 • 2318 Posts
Some of you guys should re-read the article. The Quake 4 demo was the thesis of the student who now works on Larrabee; his demo did not run on Larrabee! Besides, you're really missing the point. Near-linear scalability is a dream for game developers. It basically means that you can have a Larrabee with 100 cores and it will run 10 times better than one with only 10 WITHOUT any special programming! THAT is the beauty of Larrabee.

#26 anshul89
Member since 2006 • 5705 Posts
Some of you guys should re-read the article. The Quake 4 demo was the thesis of the student who now works on Larrabee; his demo did not run on Larrabee! Besides, you're really missing the point. Near-linear scalability is a dream for game developers. It basically means that you can have a Larrabee with 100 cores and it will run 10 times better than one with only 10 WITHOUT any special programming! THAT is the beauty of Larrabee.Arsuz
really ? :o

That's amazing.

#27 AnimalStak
Member since 2008 • 63 Posts

for the guy who keeps talking about crappy performance

i remember reading that the 16-core larrabee running with just 2 threads (i think) and half clock speed was running gears of war pc maxed out at 1980x1200 at 234fps

it will make crysis its ****


#28 forza1989
Member since 2007 • 55 Posts

"Intel's SIGGRAPH 2008 paper describes simulations of Larrabee's projected performance. Graphs show how many 1 GHz Larrabee cores are required to maintain 60 FPS at 1600x1200 resolution in several popular games. Roughly 25 cores are required for Gears of War with no antialiasing, 25 cores for F.E.A.R. with 4x antialiasing, and 10 cores for Half-Life 2: Episode 2 with 4x antialiasing. It is likely that Larrabee will run faster than 1 GHz, so these numbers are conservative. Another graph shows that performance on these games scales nearly linearly with the number of cores up to 32 cores. At 48 cores the performance scaling is roughly 90% of linear."

Taken from Wikipedia
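Assuming the near-linear scaling the quoted paper describes, those figures extrapolate with simple arithmetic. A quick sanity check in Python; this is an idealized model of the published graphs, not measured data:

```python
def cores_needed(cores_at_1ghz, clock_ghz, efficiency=1.0):
    """Cores needed to hold the same 60 FPS target at a higher clock,
    under (near-)linear scaling -- an idealization of the SIGGRAPH graphs."""
    return cores_at_1ghz / (clock_ghz * efficiency)

# Gears of War needed ~25 one-GHz cores for 60 FPS, so a hypothetical
# 2 GHz part would need about half as many:
print(cores_needed(25, 2.0))        # 12.5

# At 48 cores the paper reports ~90% of linear scaling, so 48 vs. 8 cores
# gives about 5.4x rather than the ideal 6x:
speedup_48_vs_8 = (48 / 8) * 0.90
```

This is also what resolves the "1-5 scale" confusion earlier in the thread: the axis is relative speedup versus core count, not an absolute frame rate.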


#29 z4twenny
Member since 2006 • 4898 Posts

for the guy who keeps talking about crappy performance

i remember reading that the 16-core larrabee running with just 2 threads (i think) and half clock speed was running gears of war pc maxed out at 1980x1200 at 234fps

it will make crysis its ****

AnimalStak

i didn't see that in the article. if thats true then yes, its friggin awesome.


#30 iam2green
Member since 2007 • 13991 Posts
sounds cool; it would be great if it came out now. i think it will be the next next-gen graphics: xbox 720/3, ps4, and nintendo will have theirs out, and this will be PC's new graphics.

#31 forza1989
Member since 2007 • 55 Posts
[QUOTE="AnimalStak"]

for the guy who keeps talking about crappy performance

i remember reading that the 16core larrabee runing with just 2 threads (i think) and half clock speed was running gears of war pc maxed out 1980x1200 at 234fps

it will make crysis its ****

Can you find a link for that?

#32 kemar7856
Member since 2004 • 11789 Posts
if it can beat nvidia and ati cards I'll look into it

#33 LordMontezuma
Member since 2007 • 87 Posts
z4t, I think you misread the article. It is not Larrabee that was unplayable even with 4 quad cores running in parallel; rather, it was that student's Quake 4 ray-tracing thesis. The processors he was using to demonstrate the possibility of ray tracing were not designed for it, hence the tremendous lag. Larrabee, however, is being designed with ray tracing as a major selling point and therefore won't suffer the same lag. The guy's thesis got him hired by Intel; that was the point of that segment of the article. They were not discussing Larrabee at that point.

#34 horrowhip
Member since 2005 • 5002 Posts

Can you find a link for that?forza1989

He was wrong...

It was the 32-core 2 GHz version planned for the launch configuration that would get 200+ FPS (theoretically, assuming good scaling at higher clocks and higher core counts; Intel had the graph showing it, which is in the TC's post. The scaled-performance graph demonstrates that as you increase the cores, performance increases roughly linearly, which is a good thing).

The Beyond3D post about it, where they calculated the theoretical FPS based on the rather odd figures Intel gave, basically said that if the clock is doubled (as planned), it could hit 60 FPS with 12.5 cores. With 32 cores, the theoretical FPS in Gears of War could be up to 150.

And the ray tracing was theoretically 70 FPS on a 24-core 1 GHz version in a scene with 4 million rays, 240K polygons, and 2 reflection levels (indirect rays). It only got 12 FPS on a Core 2 Quad. So, much better performance.


#35 SteezyZ
Member since 2008 • 209 Posts
Larrabee is interesting in that it gets us where Nvidia is clearly heading - instead of HLSL, you just get a C compiler for the GPU. At that point, anything that makes sense to run on the GPU can be moved there - especially since PCIe is fast in both directions, unlike AGP. You could offload something like sound occlusion to the GPU, then send the results back to the CPU for the final audio routines.lowe0
This is perhaps the most important aspect of Larrabee, which seems to get glossed over: a GPU that uses the x86 instruction set, not some proprietary compiler/language from Nvidia or AMD. Based on Intel's past in GPUs I'm still a little skeptical, but here's hoping it works out.

#36 TOAO_Cyrus1
Member since 2004 • 2895 Posts
i hope it's good and does it need a special motherboard? mayforcebeyou
Yes, it probably will. It will be absolutely amazing as a stream processor, but for games, whatever dedicated GPU AMD and Nvidia have at the time, combined with a regular quad-core-or-better CPU, will probably be faster. There are a lot of pitfalls that can affect performance; for instance, GPUs have a ton of dedicated logic for much of the rendering pipeline that will have to be emulated in software on Larrabee's stream processors. To compete in traditional games it will need a lot more compute power than contemporary GPUs, and Intel's driver team will have to work wonders. The upside is that if a dev figures out a new and improved rendering technique, it's easier to implement on Larrabee because it has no fixed-function hardware; that's why Intel is pushing ray tracing, which can be done purely programmatically on Larrabee. Intel will also have to double its power every 12 to 18 months to keep up with AMD and Nvidia.