"ive seen benchmarks and from what i can tell these cards are less powerful than the 9 seeries. there crap dont buy em... " - hofuldig
First off, you must not have read the articles that went with the pictures you were looking at. Those benchmarks were made with beta drivers on Nvidia's first consumer card that is not specifically meant for gaming. This GPU is a great gamer's card, and in time the drivers will improve its performance by a mile, the same way the rest of Nvidia's lineup has improved in the past. But the real difference here is that this card is meant to showcase Nvidia's CUDA initiative to the consumer and developer markets. It's Nvidia's chance to come out swinging at Intel. They have been claiming that they can do a better job of replacing the CPU in mainstream computers with their GPUs than Intel and AMD can do by continuing to make their standard, general-purpose CPUs. Of course this model can't replace a Core 2 or a Phenom; it's not trying to. It's meant to show off what can be done with it. Apps like Folding@home that are made to run on the 280 itself, set against the top-of-the-line Intel Extreme CPU, will show that Nvidia has the number-crunching edge by a mile.
That's what Nvidia wants people to see. Their plan is to have consumers put demand on developers to use their chips. They can't do this by sticking only with the niche gaming market. However, it is that very same niche gaming market that will pave the way for this. In the consumer marketplace, we are the ones who spend the money on more "esoteric" hardware. The average user is quite happy with low-end integrated graphics and sound. Why shouldn't they be? They don't do anything that needs the power of our hardware. So what will happen here is this: we gamers will "subsidize" the CUDA platform for Nvidia. Then, over the next several years, the average consumer will begin to demand this power and apps that use it. At least, that's the theory anyway. Intel, of course, is fighting back with "Larrabee".
AMD is simply trying to stay afloat for the next few years. They do not have anything on the drawing board that can compete with Nvidia or Intel in this latest power struggle. Instead, they are refining Phenom and trying to take the mid-range graphics market. They also do fairly well in the integrated market. So they should be fine as long as nothing goes really wrong for them.
So the real question will be this: will consumers want the easier-to-code-for power of Larrabee, or the much harder-to-code-for CUDA platform from Nvidia? Nvidia's system looks to have great performance, but is the trouble worth it? Intel's idea seems to really speed up development by using the tried-and-true x86 instruction set directly in the GPU itself. Forget that extra year or so of development for a game with this method. But it may end up with lackluster performance. In the end, devs will always tend to want the faster turnover in product, so Larrabee will be of some very real interest to them. But it's we the consumers who will tell them with our cash whether it's good enough. Time will tell.
So in the end, I really don't see how you can say the 280 is "crap" when you A) don't have one, B) have not even seen a performance test with finished drivers, or tests of the other abilities it was developed with, and C) make recommendations based on a complete lack of knowledge of the subject at hand. Then again, that's just my opinion based on your very limited statement.