Larrabee is an x86-capable part for PCI Express, meant to compete against ATI and Nvidia, but it isn't just a graphics card. Unlike Nvidia and ATI cards, which carry rigid, fixed-function graphics logic baked into the core plus a narrow space for programming (the foundation of all GPUs), Larrabee can run custom drivers for the whole render pipeline (the order in which things are processed before display).
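To make "programming the whole pipeline" concrete, here is a minimal sketch of one rasterization stage written as ordinary x86 C++ rather than fixed-function silicon. It is purely illustrative: the function names and framebuffer layout are my own invention, not Larrabee's actual driver interface.

```cpp
// Minimal sketch of "the render pipeline as software": a rasterization
// stage written as plain x86 C++ instead of fixed-function hardware.
// Purely illustrative -- not Larrabee's real API or driver model.
#include <array>
#include <cstdint>
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

// Signed area term for edge ab versus point c; its sign tells us
// which side of the edge the point falls on.
static float edge(const Vec2& a, const Vec2& b, const Vec2& c) {
    return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
}

// Rasterize one triangle into an 8-bit framebuffer. On a fixed-function
// GPU this step is wired into silicon; here it's just a loop any x86
// core can run, which is the whole point of the argument above.
void rasterize(const std::array<Vec2, 3>& tri,
               std::vector<uint8_t>& fb, int w, int h) {
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            Vec2 p{x + 0.5f, y + 0.5f};
            float w0 = edge(tri[1], tri[2], p);
            float w1 = edge(tri[2], tri[0], p);
            float w2 = edge(tri[0], tri[1], p);
            if (w0 >= 0 && w1 >= 0 && w2 >= 0)  // inside all three edges
                fb[y * w + x] = 255;            // "shade" the pixel
        }
    }
}

int main() {
    const int w = 32, h = 16;
    std::vector<uint8_t> fb(w * h, 0);
    rasterize({{{2, 2}, {30, 3}, {16, 14}}}, fb, w, h);

    // Dump the framebuffer as ASCII so the example is self-contained.
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x)
            std::putchar(fb[y * w + x] ? '#' : '.');
        std::putchar('\n');
    }
    return 0;
}
```

If that inner loop turns out to be the bottleneck, a vendor (or a game developer) can rewrite it; on a fixed-function card you take whatever the silicon does.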
The reason some SLI or CrossFire games don't scale well, or why some games run better on Nvidia or ATI parts, is that the developer can't really decide what gets rendered first on that fixed pipeline. At best, Nvidia and ATI have to wait until a game is near release and have their scientific monkeys profile it, so most game profiles appear months after release.
Think of Larrabee as one of those blank-logic chips handed to college students: it has the best qualities of a software renderer, but it doesn't run inside Windows; the software layer runs on the chip itself and exposes OpenGL/DX10 to show results. With a feature like that you could, for example, update a PS3 or Xbox 360 to Shader Model 5.0 and beyond. It can be an Adobe accelerator or a Folding@home for the masses. Why? It runs actual x86 code, the same code we use to write apps, so no CUDA learning is needed.
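To illustrate that "no CUDA needed" point, here is a minimal sketch of a data-parallel, Folding@home-style workload written with nothing but standard C++ threads on x86. Nothing in it is Larrabee-specific; the kernel and names are hypothetical, the point is only that ordinary multithreaded x86 code is the claimed programming model.

```cpp
// The "no CUDA needed" point: a data-parallel workload using only
// standard C++ threads on x86. Illustrative only -- no Larrabee API,
// just the ordinary code the post claims would carry over to its cores.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

// Toy Folding@home-style kernel: accumulate pairwise 1/r^2 "forces"
// for the particles in [begin, end).
static void forces_slice(const std::vector<float>& pos,
                         std::vector<float>& out,
                         size_t begin, size_t end) {
    for (size_t i = begin; i < end; ++i) {
        float f = 0.0f;
        for (size_t j = 0; j < pos.size(); ++j) {
            if (j == i) continue;
            float d = pos[i] - pos[j];
            f += 1.0f / (d * d + 1e-6f);
        }
        out[i] = f;
    }
}

int main() {
    const size_t n = 4096;
    std::vector<float> pos(n), out(n);
    for (size_t i = 0; i < n; ++i) pos[i] = 0.01f * static_cast<float>(i);

    // Split the work across however many x86 cores the machine exposes.
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    size_t chunk = (n + cores - 1) / cores;
    for (unsigned c = 0; c < cores; ++c) {
        size_t b = c * chunk, e = std::min(n, b + chunk);
        if (b < e) pool.emplace_back(forces_slice, std::cref(pos),
                                     std::ref(out), b, e);
    }
    for (auto& t : pool) t.join();

    std::printf("force on particle 0: %f (using %u threads)\n", out[0], cores);
    return 0;
}
```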
If you had been reading tech sites like Ars or The Inquirer for the past three years, you would have expected this move. Nvidia is no longer targeting ATI; they're going for Intel, not for its market share, but for the relevance the market gives to CPUs. Say you sell a GPU (the next 55nm GTX 280) at the same price as an upcoming eight-core midrange CPU, and concentrate on heavily accelerating a few key applications like Photoshop or HD video encoders. Then the consumer of a not-so-distant future, driven by real-life earning needs, would think "why a $400 32-core and a GTX 240*, when I mostly encode and post 30-minute design tutorials on YouTube and make money with Adobe at my uncle's print shop?" Instead, he buys a $200 16-core and a $400* GTX 280.
So the mainstream, conservative market no longer sees Nvidia as just a company for games or watching Blu-ray movies; the commodity becomes a mainstream necessity (Intel's playground). Nvidia's worst nightmare isn't AMD, which sports a market cap of $3.8 billion with $5 billion in debt after buying ATI. Nvidia already enjoys a $10 billion market cap and only needs discounts, rebranded parts, and broader "The Way It's Meant to Be Played" support to slow down ATI's earnings.
And yes, Nvidia will win Larrabee's first round, since Larrabee is a proof of concept. On the other hand, there's Nehalem (8- to 84-core parts with a Larrabee inside), and Intel, with a $180 billion market cap and nine times AMD's net income. This "crappy VGA maker", worth 18 times Nvidia, not only sells cheap P4s united by an awesome architecture (Core Duo) at higher prices than AMD; it has also been the world's biggest GPU maker for a long time. Yes, they make crap GPUs, but those are for everyone, not for folks like us.
Now, what if you give those same chained GPU-tech monkeys ten times the funding and permission to use eight or more cores from the next CPU to build a GPU? With access to 32MB+ of cache at 45nm and up to (84) 3.8GHz cores, then after five years of paper and wafer cooking (the Nehalem dev cycle) they'll give you a reason to think: "Hey! Nvidia never had alien technology in its cards and drivers, it was just about size, design, and gigahertz. Why would I buy an Nvidia or ATI card when I can spend $400 on a 32-core cGPU that can later "SLI-cross" in my new multi-socket $130 mobo, with a few cores left over for Havok physics?" That, my friend, is THE WORST NIGHTMARE for Nvidia's CEO and co-founder Jen-Hsun Huang.
I have almost no consumer innocence toward this or that brand, thanks to my marketing career and geeky habits. My point: Nvidia is doing the only thing left for it to do, play hard and show no sweat, and with their outstanding PR and marketing department, everyone thinks they're in a safe harbor. AMD made some investors suicidal and went into the red, but knowing Intel's plan, they bought a $6 billion "get out of jail" ticket with ATI. Now they can make the same play as Intel and extend the fight ten more years, the most brilliant hard decision.
Nvidia, the respected Nvidia, needs a rigid, zero-error roadmap and strategy to survive. They had been executing it with near perfection for the last two years, until the $1,200 million notebook-chip fiasco, where they blamed their provider (TSMC) for a design flaw that was their own fault and got HP to pay 50% for their own mistake. Interesting times ahead; nevertheless, my 8800 still deserves a lifetime achievement award.