Official - Project Offset PC exclusive - will be Intel Larrabee launch title

This topic is locked from further discussion.


#101 IgGy621985
Member since 2004 • 5922 Posts

Yes, thanks to games like Nancy Drew, which will probably be the best-selling PC game this year. Note that I said high end, which is also what Carmack was talking about. When veteran PC dev Carmack calls the PC their "junior partner," I think the time for denial is over.

dc337

When we're talking about Nancy Drew, we're also talking about the least important gaming market in the world: yeah, the North American market.

I'm still playing the highest-end games on my PC, because, thankfully, I have a pretty vast selection of exclusive and multiplatform titles. Every platform now shares more titles with other platforms, but PC probably has one of the strongest lineups of exclusive titles ever for 2009.


#102 horrowhip
Member since 2005 • 5002 Posts
[QUOTE="Teufelhuhn"][QUOTE="devious742"][QUOTE="Teufelhuhn"][QUOTE="Bebi_vegeta"]

OpenGL doesn't use DirectX... so, yeah.

Bebi_vegeta



What does OpenGL have to do with it?

That you can bypass DirectX and just use OpenGL...



That's not what I meant when I said "bypass DirectX"... I meant not using a 3D graphics API at all.

On PCs, OpenGL and DirectX do the same thing: they serve as a sort of translator that you use to talk to a video card. You see, each piece of graphics hardware has its own "language" that it speaks. So when you make a PC game and you want to draw a triangle on the screen, you don't program it to say that in all the different hardware "languages"; you use DirectX or OpenGL. You tell the API to draw the triangle, and the API translates that into the "language" the hardware understands. This is how things have been for a very long time... it's nice in that you don't need to worry about individual kinds of video cards, but it's limiting since you can only do the things that DirectX/OpenGL knows how to translate. You're also limited because video cards aren't very flexible to begin with: they are only designed to do things like drawing triangles or using textures.

Now with Larrabee things are a little different. It does support DirectX or OpenGL if you want to go through the whole translator bit... but if you want, you can skip that. This is because Larrabee uses x86 cores: the same kind of cores that are in a Core 2 Duo or any other Intel/AMD CPU. So this means, if you wanted, you could talk to the hardware directly to do your graphics work. It would be harder to do, and any code you wrote for it would only work on Larrabee and not Nvidia/ATI, but you would have a whole lot fewer restrictions.
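The "translator" analogy above can be sketched in a few lines of Python. Everything here (the backend names, the command strings) is invented purely to illustrate the dispatch idea; a real DirectX/OpenGL driver is vastly more complex:

```python
# Toy sketch of the "translator" analogy: one generic API call gets
# translated into a vendor-specific "command", so the game never has to
# speak a card's native "language" directly. All names are hypothetical.

# Each vendor backend "speaks" its own hardware language.
BACKENDS = {
    "nvidia": lambda verts: f"NV_CMD_TRI({verts})",
    "ati":    lambda verts: f"ATI_DRAW3({verts})",
}

class GraphicsAPI:
    """Plays the DirectX/OpenGL role: one call, translated per card."""
    def __init__(self, vendor):
        self.translate = BACKENDS[vendor]   # pick the hardware "language"

    def draw_triangle(self, verts):
        # The game only ever calls this; the API does the translating.
        return self.translate(verts)

# The same game code runs unchanged on either card:
for vendor in ("nvidia", "ati"):
    api = GraphicsAPI(vendor)
    print(api.draw_triangle([(0, 0), (1, 0), (0, 1)]))
```

On Larrabee, per the post, you could skip the `GraphicsAPI` layer entirely and emit the "commands" yourself, at the cost of portability.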

I still have my doubts about Larrabee... GPUs are just that much more powerful in terms of calculation.

Intel will just scale the number of cores to match the competition...

Ultimately, if they need more than 32 cores, they can just go up to 48 cores.

Keep in mind that even the 32-core version still has about 1/3 the die size of an nVidia card, and is slightly smaller than ATi cards.

And that is on the 45nm process.

That is why Intel has such a huge advantage. Their design scales almost perfectly.

1:1 scaling is relatively simple with it.

The problem with massive amounts of cores is the power demands. It is likely, if they REALLY need to, they will launch with a 48-core version. That increases the theoretical power of the card by 50% in a single swipe...

And Intel has proven that they can do the 32nm process REALLY easily and handle the 22nm process very well.

So, if they were to keep scaling down, they would reduce power demands, increase the number of cores, and scale very well.

Intel will release with whatever they ultimately need to release with, power-wise.

Thing is, the 32-core version shows absolutely no signs of being less powerful than whatever nVidia or ATi can possibly cook up (unless nVidia has some ridiculous secret GPU that increases performance compared to the GTX 280 by like 100%).

What we can expect in the next year from nVidia and ATi is probably somewhere in the 50-60% range in terms of power increase and somewhere between 40-50% in performance increase.

Based upon what Intel has already proven, Larrabee should (realistically) be anywhere from 40-50% faster, performance-wise, than the GTX 280.
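The core-count arithmetic in the post above is easy to check, assuming (as the post itself does) that Larrabee's theoretical throughput scales roughly 1:1 with core count:

```python
# Quick check of the scaling claim: going from 32 to 48 cores is a 50%
# jump in theoretical power, under the post's assumption of near-perfect
# 1:1 scaling with core count.
base_cores, bumped_cores = 32, 48
increase = (bumped_cores - base_cores) / base_cores
print(f"{increase:.0%}")  # 50%
```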


#103 Puckhog04
Member since 2003 • 22814 Posts
Awesome. Can't wait to see what the game is made of when it's a PC exclusive. :)

#104 Bebi_vegeta
Member since 2003 • 13558 Posts
[QUOTE="Bebi_vegeta"][QUOTE="Teufelhuhn"]



That's not what I meant when I said "bypass DirectX"...I mean not using a 3D graphics API at all.

On PC's OpenGL and DirectX do the same thing: they serve as a sort of translator that you use to talk to a video card. You see graphics hardware each has it's own "language" that it speaks. So when you make a PC game and you want to draw a triangle on the screen you don't program it to say that in all the different hardware "languages", you use DirectX or OpenGL. You tell that API to draw the triangle, then the API translates it to the "language" that the hardware understands. This is how things have been for a very long time...it's nice in that you don't need to worry about individual kinds of video cards, but it was limiting since you could only do the things that DirectX/OpenGL knows how to translate. You're also limited because video cards aren't very flexible to begin with: they are only designed to do things like drawing triangles or use textures.

Now with Larrabee things are a little different. It does support using DirectX or OpenGL if you want to do that and go through the whole translator bit...but if you want you can skip that. This is because Larrabee uses x86 cores: the same kind of cores that are in a Core 2 Duo or any other Intel/AMD CPU. So this means, if you wanted, you could talk to the hardware directly to do your graphics stuff. It would be harder to do and any code you wrote for this would only work on Larrabee and not Nvidia/ATI, but you would have a whole lot less restrictions. horrowhip

I still have my doubts on larrabee... GPU are that more powerfull in terms of calculation.

Intel will just scale the number of cores to match the competition...

Ultimately, if they need more than 32 cores, they can just go up to 48 cores.

Keep in mind that even the 32 core version still have about 1/3 the die size of an nVidia card, and is slightly smaller than ATi cards.

And that is in the 45nm process.

That is why Intel has such a huge advantage. Their design scales almost perfectly.

1:1 scaling is relatively simple with it.

The problem with massive amounts of cores is the power demands. It is likely, if they REALLY need to, they will launch with a 48 core version. That increase the theoretical power of the card by 50% in a single swipe...

And Intel has proven that they can do 32nm process REALLY easily and 22nm process with very well.

So, if they were to keep scaling down they would reduce power demands, increase number of cores and scale very well.

Intel will release with whatever they ultimately need to release with power wise.

Thing is, the 32 core version shows absolutely no signs of being less powerful than whatever nVidia or ATi can possibly cook up(unless nVidia has some rediculous secret GPU that increase performance compared to the GTX 280 by like 100%.)

What we can expect in the next year from nVidia and ATi is probably somewhere in the 50-60% range in terms of power incease and somewhere between 40-50% in performance increase.

Based upon what Intel has already proven, Larrabee should (realistically) be anywhere from 40-50% faster performance wise, than the GTX 280.

The HD 4870 X2 has 2 teraflops of calculation power. No Intel/AMD CPU comes even close to that power... Folding@home will show you perfectly what I'm talking about.

We won't see a 32-core Intel chip very soon... an 8-core is supposed to be out by 2010.

Nvidia and ATI are also scaling down to 45nm or 40nm very soon.

Seems to me that the new road for GPUs is also multiple GPUs on the same card.


#105 AdrianWerner
Member since 2003 • 28441 Posts

Awesome. Can't wait to see what the game is made of when it's a PC exclusive. :)Puckhog04
This and Shattered Horizon will be very interesting to watch in terms of how they turn out graphics-wise.


#106 Jamex1987
Member since 2008 • 2187 Posts
[QUOTE="dc337"]

Yes thanks to games like Nancy Drew which will probably be the best selling pc game this year. Note that I said high end, which is also what Carmack was talking about. When veteran pc dev Carmack calls the pc their "junior partner" I think the time for denial is over.

IgGy621985

When we're talking about Nancy Drew, we're also talking about the least important gaming market in the world. Yeah, North American market.

I'm still playing highest-end games on my PC, because, thankfully I have a pretty vast option of exclusive and multiplatform titles. Every platform now shares more titles with other platforms, but PC has probably one of the strongest lineups of exclusive titles ever for 2009.

How is North America the least important when it has the second-highest number of WoW subscribers? Subscription gaming is the backbone of the PC game industry, with Asia bringing in the most revenue.


#107 JetB1ackNewYear
Member since 2007 • 2931 Posts
Can someone please tell me what Larrabee actually is? Is it a CPU or something? I was gonna get a quad core; should I just wait now?

#108 HuusAsking
Member since 2006 • 15270 Posts
[QUOTE="horrowhip"][QUOTE="Bebi_vegeta"][QUOTE="Teufelhuhn"]



That's not what I meant when I said "bypass DirectX"...I mean not using a 3D graphics API at all.

On PC's OpenGL and DirectX do the same thing: they serve as a sort of translator that you use to talk to a video card. You see graphics hardware each has it's own "language" that it speaks. So when you make a PC game and you want to draw a triangle on the screen you don't program it to say that in all the different hardware "languages", you use DirectX or OpenGL. You tell that API to draw the triangle, then the API translates it to the "language" that the hardware understands. This is how things have been for a very long time...it's nice in that you don't need to worry about individual kinds of video cards, but it was limiting since you could only do the things that DirectX/OpenGL knows how to translate. You're also limited because video cards aren't very flexible to begin with: they are only designed to do things like drawing triangles or use textures.

Now with Larrabee things are a little different. It does support using DirectX or OpenGL if you want to do that and go through the whole translator bit...but if you want you can skip that. This is because Larrabee uses x86 cores: the same kind of cores that are in a Core 2 Duo or any other Intel/AMD CPU. So this means, if you wanted, you could talk to the hardware directly to do your graphics stuff. It would be harder to do and any code you wrote for this would only work on Larrabee and not Nvidia/ATI, but you would have a whole lot less restrictions. Bebi_vegeta

I still have my doubts on larrabee... GPU are that more powerfull in terms of calculation.

Intel will just scale the number of cores to match the competition...

Ultimately, if they need more than 32 cores, they can just go up to 48 cores.

Keep in mind that even the 32 core version still have about 1/3 the die size of an nVidia card, and is slightly smaller than ATi cards.

And that is in the 45nm process.

That is why Intel has such a huge advantage. Their design scales almost perfectly.

1:1 scaling is relatively simple with it.

The problem with massive amounts of cores is the power demands. It is likely, if they REALLY need to, they will launch with a 48 core version. That increase the theoretical power of the card by 50% in a single swipe...

And Intel has proven that they can do 32nm process REALLY easily and 22nm process with very well.

So, if they were to keep scaling down they would reduce power demands, increase number of cores and scale very well.

Intel will release with whatever they ultimately need to release with power wise.

Thing is, the 32 core version shows absolutely no signs of being less powerful than whatever nVidia or ATi can possibly cook up(unless nVidia has some rediculous secret GPU that increase performance compared to the GTX 280 by like 100%.)

What we can expect in the next year from nVidia and ATi is probably somewhere in the 50-60% range in terms of power incease and somewhere between 40-50% in performance increase.

Based upon what Intel has already proven, Larrabee should (realistically) be anywhere from 40-50% faster performance wise, than the GTX 280.

Hd4870x2 has 2tera flops of calculation power. No intel/AMD CPU comes even close to that power.... folding at home will show you perfectly what i'm talking about.

We won't see a 32 core intel very soon... 8core is supposed to be out by 2010.

Nvidia and Ati are also scalling down on to 45nm or 40nm very soon.

Seems to me that the new road for GPU is also going mutliple GPU on the same card.

Plus ATI's new GPUs are supposed to be multicore-based, too. What if nVidia goes the same route? Not only that, their GPUs are becoming more generalized while still being beasts at their specialties. And there's a new initiative already out, OpenCL, that could put things like CAL and CUDA under one banner.

#109 cobrax75
Member since 2007 • 8389 Posts
[QUOTE="IgGy621985"][QUOTE="dc337"]

Yes thanks to games like Nancy Drew which will probably be the best selling pc game this year. Note that I said high end, which is also what Carmack was talking about. When veteran pc dev Carmack calls the pc their "junior partner" I think the time for denial is over.

Jamex1987

When we're talking about Nancy Drew, we're also talking about the least important gaming market in the world. Yeah, North American market.

I'm still playing highest-end games on my PC, because, thankfully I have a pretty vast option of exclusive and multiplatform titles. Every platform now shares more titles with other platforms, but PC has probably one of the strongest lineups of exclusive titles ever for 2009.

How is North America the least when they have the second highest amount of WoW subscribers. Subscription gaming is the backbone of the PC game industry with Asia bringing in the most revenue.

They are second highest by country, not by region.


#110 Jamex1987
Member since 2008 • 2187 Posts
[QUOTE="Jamex1987"][QUOTE="IgGy621985"][QUOTE="dc337"]

Yes thanks to games like Nancy Drew which will probably be the best selling pc game this year. Note that I said high end, which is also what Carmack was talking about. When veteran pc dev Carmack calls the pc their "junior partner" I think the time for denial is over.

cobrax75

When we're talking about Nancy Drew, we're also talking about the least important gaming market in the world. Yeah, North American market.

I'm still playing highest-end games on my PC, because, thankfully I have a pretty vast option of exclusive and multiplatform titles. Every platform now shares more titles with other platforms, but PC has probably one of the strongest lineups of exclusive titles ever for 2009.

How is North America the least when they have the second highest amount of WoW subscribers. Subscription gaming is the backbone of the PC game industry with Asia bringing in the most revenue.

they are second highest by country, not by region.

Since when is North America a country?


#111 Teuf_
Member since 2004 • 30805 Posts

We won't see a 32 core intel very soon... 8core is supposed to be out by 2010.

Bebi_vegeta


That's the whole point of Larrabee: to have many simple cores with powerful vector math units. 32 cores sounds very likely for the first iteration.

#112 Bebi_vegeta
Member since 2003 • 13558 Posts
[QUOTE="Bebi_vegeta"]

We won't see a 32 core intel very soon... 8core is supposed to be out by 2010.

Teufelhuhn



That's the whole point of Larrabee: to have many simple core with powerful vector math units. 32 cores sounds very likely for the first iteration.

Physical cores?

10 years later, Intel is developing a new-generation architecture to head back into the visualization market. Visual Computing is what they term it, and they are really serious this time round. They plan to tackle life-like rendering, HD audio/video processing and physics model processing by utilizing a programmable and readily available architecture such as several simpler Intel Architecture (IA) cores. Intel plans to add a vector computational unit to each of the cores as well as introduce a vector handling instruction set. They believe their leadership in the total computing architecture of the various platforms and a vast software engineering department will help them achieve their goal of creating Larrabee. Based on a flexible computing architecture (similar to Nehalem's various building blocks), it can be scaled up or down for various market needs. Here's a slide from Intel showing how the Larrabee processing architecture would look:

Also expected in the 2010 timeframe, Larrabee is likely to be a discrete 'GPU'-like offering from Intel. However, it may or may not be a 'graphics card'. For all we know, it may even use Intel's QuickPath Interconnects as a drop-in to an auxiliary socket to communicate with the processors, and may even utilize the system's main memory. Remember, Nehalem's memory controller can handle up to 64GB/s - that's a fair bit of memory bandwidth for several high-end graphics cards these days. However, the question of latencies will be one of the toughest to tackle, and we've not yet seen how QPI handles itself in real life. But this idea could be ideal for a lower-end offering from Intel, whereas higher-end versions can contain dedicated local frame buffers. Still, all of this is just speculation for now, as there's a long way to go.


#113 Teuf_
Member since 2004 • 30805 Posts
[QUOTE="Teufelhuhn"][QUOTE="Bebi_vegeta"]

We won't see a 32 core intel very soon... 8core is supposed to be out by 2010.

Bebi_vegeta



That's the whole point of Larrabee: to have many simple core with powerful vector math units. 32 cores sounds very likely for the first iteration.

Physical cores?



Yes, physical cores. A Larrabee core is only ~30M transistors...a GTX 280 is ~1.7B transistors.

BTW that material there is pretty old, Intel has already spilled the beans on the memory architecture and other details.

http://softwarecommunity.intel.com/UserFiles/en-us/File/larrabee_manycore.pdf
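A back-of-the-envelope check of those transistor figures, counting core logic only (caches, the on-chip interconnect, and fixed-function texture units would add a substantial extra share on top):

```python
# From the figures quoted above: 32 Larrabee cores at ~30M transistors
# each, versus a GTX 280 at ~1.7B transistors. Core logic only.
larrabee_core = 30e6     # ~transistors per Larrabee core
gtx280 = 1.7e9           # ~transistors in a GTX 280
cores = 32
core_logic = cores * larrabee_core
print(f"{core_logic / 1e9:.2f}B transistors, "
      f"{core_logic / gtx280:.0%} of a GTX 280")
```

So even a 32-core part's core logic comes to roughly 0.96B transistors, a bit over half a GTX 280, which is broadly consistent with the die-size claims earlier in the thread.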

#114 Bebi_vegeta
Member since 2003 • 13558 Posts
[QUOTE="Bebi_vegeta"][QUOTE="Teufelhuhn"][QUOTE="Bebi_vegeta"]

We won't see a 32 core intel very soon... 8core is supposed to be out by 2010.

Teufelhuhn



That's the whole point of Larrabee: to have many simple core with powerful vector math units. 32 cores sounds very likely for the first iteration.

Physical cores?



Yes, physical cores. A Larrabee core is only ~30M transistors...a GTX 280 is ~1.7B transistors.

BTW that material there is pretty old, Intel has already spilled the beans on the memory architecture and other details.

http://softwarecommunity.intel.com/UserFiles/en-us/File/larrabee_manycore.pdf

In 2010... there will be 8 cores; 2 years from now we will see 10-12 cores.

Like I said... 32 cores won't be coming soon.

http://www.anandtech.com/showdoc.aspx?i=3367&p=14


#115 Teuf_
Member since 2004 • 30805 Posts

Inn 2010.. there will be 8 cores... 2 years from now we will see 10-12 cores.

Like I said... 32 cores won't been soon.

http://www.anandtech.com/showdoc.aspx?i=3367&p=14

Bebi_vegeta


I'm not talking about Nehalem or other desktop CPUs from Intel, I'm talking about Larrabee. Larrabee cores are significantly stripped down compared to what you have in a desktop CPU, hence you can have many more of them.

If you want more details, read through the paper I linked. It's all in there.

#116 Bebi_vegeta
Member since 2003 • 13558 Posts
[QUOTE="Bebi_vegeta"]

Inn 2010.. there will be 8 cores... 2 years from now we will see 10-12 cores.

Like I said... 32 cores won't been soon.

http://www.anandtech.com/showdoc.aspx?i=3367&p=14

Teufelhuhn



I'm not talking about Nahelem or other desktop CPU's from Intel, I'm talking about Larrabee. Larrabee cores are significantly stripped down compared to what you have in a desktop CPU, hence you can have many more of them.

If you want more details, read through the paper I linked. It's all in here.

The problem is... you have many cores at low speeds. It looks good on paper... but we have nothing in real life. Those benchmarks in your link show an estimate of what they think will happen.

http://www.anandtech.com/showdoc.aspx?i=3367&p=15

There's a lot that could go wrong, indeed.


#117 Teuf_
Member since 2004 • 30805 Posts

The probleme is... you have many cores at low speeds. It looks good on paper... but we have nothing in real life. Thoses benchmark in your link show an estimate of what they think will happen.

Bebi_vegeta


Of course...I haven't made any claims about performance, since there are no real-world performance figures yet.

#118 FFXIII360
Member since 2008 • 988 Posts
Any PC exclusive is most likely going to be a flop, considering the amount of piracy going on in PC gaming.

#119 PC360Wii
Member since 2007 • 4658 Posts

Any PC exclusive is most likely going to be a flop considering the amount of piracy going on with PC gamingFFXIII360

Flop depends on quality, not sales.


#120 Bebi_vegeta
Member since 2003 • 13558 Posts

Any PC exclusive is most likely going to be a flop considering the amount of piracy going on with PC gamingFFXIII360

Yeah, WoW is a flop!


#121 M337ING
Member since 2008 • 299 Posts

Any PC exclusive is most likely going to be a flop considering the amount of piracy going on with PC gamingFFXIII360

Even if it does, Intel won't care since they'll be funding development and will use it as a hardware-seller.


#122 horrowhip
Member since 2005 • 5002 Posts

Hd4870x2 has 2tera flops of calculation power. No intel/AMD CPU comes even close to that power.... folding at home will show you perfectly what i'm talking about.

We won't see a 32 core intel very soon... 8core is supposed to be out by 2010.

Nvidia and Ati are also scalling down on to 45nm or 40nm very soon.

Seems to me that the new road for GPU is also going mutliple GPU on the same card.

Bebi_vegeta

And now you show just how ignorant you are about Larrabee...

Larrabee is Intel's GPU.

They are going to launch it in late 2009.

Plans right NOW have it with 32 cores at 2GHz...

The cores are based off a heavily updated P4 architecture with 4 simultaneous threads per core.

What that means is that the 32-core Larrabee will have 128 theoretical cores (32 cores * 4 simultaneous threads that are interpreted as individual CPUs by the OS).

Don't start talking like you know what you are talking about when you don't know what Larrabee is.

In terms of FLOPS, the 32-core version gets 2 TFLOPS at 2GHz.
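For what it's worth, that 2 TFLOPS figure is consistent with the 16-wide vector unit Intel describes in the paper linked earlier in the thread, assuming one single-precision fused multiply-add (2 FLOPs) per lane per cycle:

```python
# Where "2 TFLOPS at 2GHz" plausibly comes from, assuming a 16-wide
# (512-bit) vector unit and one fused multiply-add per lane per cycle.
cores = 32
simd_lanes = 16          # 16 single-precision lanes per core's VPU
flops_per_lane = 2       # multiply + add, fused into one instruction
clock_hz = 2e9           # 2GHz
peak = cores * simd_lanes * flops_per_lane * clock_hz
print(f"{peak / 1e12:.3f} TFLOPS")  # 2.048 TFLOPS
```

Note this is peak theoretical throughput; sustained performance depends entirely on keeping those vector lanes fed.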


#123 horrowhip
Member since 2005 • 5002 Posts

The probleme is... you have many cores at low speeds. It looks good on paper... but we have nothing in real life. Thoses benchmark in your link show an estimate of what they think will happen.

http://www.anandtech.com/showdoc.aspx?i=3367&p=15

There's alot that could go wrong indeed.

Bebi_vegeta

Many cores at low speeds? It is 2GHz...

That is perfectly fine.

Also, there are no benchmarks in that "link", just them saying that Larrabee's success largely rides on how well they pull off the software renderer.

And honest to god, they could not have hired a better man for the job. Michael Abrash is doing the software renderer. Michael Abrash's experience with coding software renderers is second to none.

You can keep turning up your nose and going "oh it will fail", but Intel has hired the people who can make this happen.

They have so many absolutely brilliant people on this project that at the very least it will MATCH whatever nVidia or ATi can churn out.


#124 Lidve
Member since 2007 • 2415 Posts

Until PCs become turnkey simple, there will always be a place for consoles.HuusAsking

No, it's more like "as long as there are lazy people and their kids, there will be a place for consoles."

Because, imagine yourself as a parent: why would you buy an expensive PC that your kid doesn't understand, when you can buy him a cheap console and make him happy?

But the world and humanity are advancing. I know some 4-7 year old kids playing difficult PC games, so in time we will have people who understand the PC, and consoles will be needed no more.


#125 FFXIII360
Member since 2008 • 988 Posts

[QUOTE="FFXIII360"]Any PC exclusive is most likely going to be a flop considering the amount of piracy going on with PC gamingBebi_vegeta

Yeah WOW is a flop!

WoW is pay-to-play, and after I tried the demo I wouldn't play it even for free.


#126 cobrax75
Member since 2007 • 8389 Posts
[QUOTE="Bebi_vegeta"]

[QUOTE="FFXIII360"]Any PC exclusive is most likely going to be a flop considering the amount of piracy going on with PC gamingFFXIII360

Yeah WOW is a flop!

WOW is pay to play and after I tried the demo I wouldn't play it for free

Yes, 10 million paying subscribers, a 93 average on Metacritic, and GS's GOTY of 2004 clearly mean it sucks a lot.


#127 Bebi_vegeta
Member since 2003 • 13558 Posts
[QUOTE="Bebi_vegeta"]

Hd4870x2 has 2tera flops of calculation power. No intel/AMD CPU comes even close to that power.... folding at home will show you perfectly what i'm talking about.

We won't see a 32 core intel very soon... 8core is supposed to be out by 2010.

Nvidia and Ati are also scalling down on to 45nm or 40nm very soon.

Seems to me that the new road for GPU is also going mutliple GPU on the same card.

horrowhip

And now you show just how ignorant you are about Larrabee..

Larrabee is Intel's GPU.

They are going to launch it in Late 2009.

Plans right NOW, have it with 32 Cores at 2GHz....

The cores are based off an extremely updated P4 architecture with 4 simultaneous threads per core.

What that means is that the 32 Core Larrabee will have 128 theoretical cores(32 Cores * 4 Simultaneous Threads that are interpreted as individual CPU's by the OS)

Don't start talking like you know what you are talking about when you don't know what Larrabee is.

In terms of FLOPS, the 32 Core version gets 2TFLOPS at 2GHz

Intel is keeping two important details of Larrabee very quiet: the details of the instruction set and the configuration of the finished product. Remember that Larrabee won't ship until sometime in 2009 or 2010, the first chips aren't even back from the fab yet, so not wanting to discuss how many cores Intel will be able to fit on a single Larrabee GPU makes sense.

Link ???

So on another note... they're way behind ATI/Nvidia in terms of FLOPS.


#128 Bebi_vegeta
Member since 2003 • 13558 Posts
[QUOTE="Bebi_vegeta"]

The probleme is... you have many cores at low speeds. It looks good on paper... but we have nothing in real life. Thoses benchmark in your link show an estimate of what they think will happen.

http://www.anandtech.com/showdoc.aspx?i=3367&p=15

There's alot that could go wrong indeed.

horrowhip

many cores at low speeds? it is 2GHz...

That is perfectly fine.

Also, there are no benchmarks in that "link" just them saying that Larrabee is largely based off how well they pull of the software renderer.

And honest to god, they could not have hired a BETTER man for the job. Micheal Abrash is doing the software renderer. Micheal Abrash's experience with coding software renderers is second to nobody in the world.

You can keep turning up your nose and going "oh it will fail" but Intel has hired the people who can make this happen.

They have so many absolutely brilliant people on this project that at the very least it will MATCH whatever nVidia or ATi can churn out.

Oh, I never said it would fail... this looks similar to the Cell from the PS3, but better. It's good on paper, but how good is it really? Hell, I'd be happy if this succeeded; more competition in the market means lower prices for me.


#129 AdrianWerner
Member since 2003 • 28441 Posts
Any PC exclusive is most likely going to be a flop considering the amount of piracy going on with PC gamingFFXIII360
The Offset team wants great graphics, which is unfortunately impossible on consoles. If they really want to wow people, they need to make it a PC exclusive.

#130 Tactis
Member since 2006 • 1568 Posts
Ehhh... I don't know. I don't really want this game turning into a graphical showcase, but it's hard to judge this early on. Kinda interesting how Intel is approaching it, though.
Avatar image for Kantroce
Kantroce

533

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#131 Kantroce
Member since 2006 • 533 Posts

I hope these companies are aware that high-end PC gaming is on the decline

Carmack on pc game sales

http://blogs.pcworld.com/gameon/archives/007422.html

dc337

In other news, stupid is on the rise.

Avatar image for horrowhip
horrowhip

5002

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#132 horrowhip
Member since 2005 • 5002 Posts
[QUOTE="horrowhip"][QUOTE="Bebi_vegeta"]

The HD 4870 X2 has 2 teraflops of compute power. No Intel/AMD CPU comes even close to that... Folding@home will show you exactly what I'm talking about.

We won't see a 32-core Intel CPU very soon... an 8-core is supposed to be out by 2010.

Nvidia and ATI are also scaling down to 45nm or 40nm very soon.

Seems to me that the new road for GPUs is also multiple GPUs on the same card.

Bebi_vegeta

And now you show just how ignorant you are about Larrabee..

Larrabee is Intel's GPU.

They are going to launch it in Late 2009.

The plans right NOW have it at 32 cores and 2GHz...

The cores are based on an extensively updated version of the original Pentium (P54C) architecture, with 4 simultaneous threads per core.

What that means is that the 32-core Larrabee will present 128 theoretical cores (32 cores × 4 simultaneous threads, each interpreted as an individual CPU by the OS).

Don't start talking like you know what you are talking about when you don't know what Larrabee is.

In terms of FLOPS, the 32-core version gets 2 TFLOPS at 2GHz.

Intel is keeping two important details of Larrabee very quiet: the details of the instruction set and the configuration of the finished product. Remember that Larrabee won't ship until sometime in 2009 or 2010, the first chips aren't even back from the fab yet, so not wanting to discuss how many cores Intel will be able to fit on a single Larrabee GPU makes sense.

Link ???

So on another note... they're way behind ATI/Nvidia in terms of FLOPS.

Intel has said numerous times now (since SIGGRAPH) that they plan to sell 8-core, 16-core, 24-core and 32-core versions of the card. Those are their plans.

They have also revealed that it will have a 16-wide single-precision (8-wide double-precision) vector configuration.

Technically, they may or may not have honestly ridiculous numbers, because the way Anandtech explained it, it seemed like each simulated core (thread) had individual access to the vector units, which would mean the card would have 4 TFLOPS single precision (16 single-precision ops per core × 128 theoretical cores × 2GHz = 4096 GFLOPS SP) and 2 TFLOPS double precision (8 double-precision ops per core × 128 theoretical cores × 2GHz = 2048 GFLOPS DP). That doesn't quite make perfect sense, but Anandtech's article made it SEEM like that with its description of how those vector units work.

Just to explain how I got 128 theoretical cores one more time... 32 cores, but each one has 4-way simultaneous multithreading. The OS interprets that as 4 separate CPUs. And if I were to get into the technicalities of Intel's complete architecture, then you would see that compared to nVidia/ATI, in terms of "threads" (by the more traditional graphics definition), Intel's architecture is actually several thousand "threads" if I were to go by nVidia's definition.
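The arithmetic in the post above can be written out explicitly. All the numbers here are the thread's own assumptions about Larrabee (32 cores, 4 threads per core, 16-wide SP vectors, 2GHz), not confirmed specs:

```python
# Theoretical peak FLOPS under the configuration assumed in this thread.
cores = 32              # physical cores (rumored top configuration)
threads_per_core = 4    # hardware threads per core
ghz = 2                 # clock speed in GHz

sp_lanes = 16           # single-precision vector lanes per core
dp_lanes = 8            # double-precision vector lanes per core

logical_cores = cores * threads_per_core
print(logical_cores)            # 128 "theoretical cores"

# The post's per-thread accounting (likely double-counts, since the 4
# threads on a core share one vector unit):
sp_gflops_per_thread = sp_lanes * logical_cores * ghz
print(sp_gflops_per_thread)     # 4096 GFLOPS SP

# Per-physical-core accounting: one 16-wide unit per core, with a
# multiply-add counted as 2 ops; this matches the 2 TFLOPS figure
# quoted earlier in the thread:
sp_gflops_per_core = sp_lanes * cores * ghz * 2
print(sp_gflops_per_core)       # 2048 GFLOPS SP
```

The gap between the two results (4096 vs 2048) is exactly the "doesn't quite make perfect sense" the post flags: multiplying by hardware threads overcounts when those threads share a single vector unit.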

Avatar image for blackdreamhunk
blackdreamhunk

3880

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#133 blackdreamhunk
Member since 2007 • 3880 Posts

Hahahaha, I bet a lot of companies are feeling the pressure

http://www.youtube.com/watch?v=FpLsp6KNRoQ&feature=related

Yeah, I can't wait for Project Offset to come to the PC. This thread has sweat written all over it, lol. Man, can you picture the types of games PC gamers will get with this new tech? It's good to be a PC gamer.

Anyways, I know you feel the pressure :)

Avatar image for malikmmm
malikmmm

2235

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#134 malikmmm
Member since 2003 • 2235 Posts

Any PC exclusive is most likely going to be a flop considering the amount of piracy going on with PC gamingFFXIII360

Wow. First of all, do you have any idea how much piracy there is on consoles? And second, was Crysis a flop?

You tell me......

Avatar image for horrowhip
horrowhip

5002

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#135 horrowhip
Member since 2005 • 5002 Posts

http://arstechnica.com/articles/paedia/gpu-sweeney-interview.ars

That interview with Tim Sweeney basically echoes everything that I have said about why Larrabee should be great.