Official - Project Offset PC exclusive - will be Intel Larrabee launch title

This topic is locked from further discussion.


#51 Daytona_178
Member since 2005 • 14962 Posts
Larrabee is going to be a HUGE step forward in gaming graphics!

#52 horrowhip
Member since 2005 • 5002 Posts

Larrabee is going to be a HUGE step forward in gaming graphics!
Daytona_178

Or they could fail, and we will have to wait another 10 years for the return of fully programmable graphics.


#53 osan0
Member since 2004 • 18263 Posts
[QUOTE="PS3_3DO"]

Intel is all hot air. Let's see them make a good GPU, since they never have.

horrowhip

.... have you seen the thought put into the architecture? It is very well planned out.

And ultimately, anyone dissing Larrabee doesn't fully understand what it could mean for the industry as a whole.

Larrabee could be the single biggest tech shift in the last 10 years... It could mean a LOT to the industry if it succeeds.

And by a LOT, I mean technological advancement and experimentation not seen since the early '90s....

Great programmers could theoretically redefine the way we look at graphics. None of that will happen in the current state, with hardware rendering and everything tied to the current version of DX or the current shader model....

Larrabee could do away with all those technicalities and just let programmers make fantastic engines with whatever they want.... It could mean more improvement in a single generation of GPUs than we have seen in 3-4 generations under the current system.

Philosophically, any true gamer should want Intel to succeed, because if they do, the industry will change for the better.

Someone's excited, aren't they?

Although I'm not quite as excited, I certainly find Intel's proposition very interesting.

I fondly remember a game called Outcast, which was released in 1999. It used a voxel engine (which meant software rendering only), and at the time it was a visually stunning game. The world looked very organic and was a pleasure to explore. And it ran on a 500MHz Pentium III (where, of course, the CPU had to do everything from graphics rendering to AI).

I've often wondered what voxels could do today, given the power of modern CPUs and multi-core tech. Larrabee would certainly be very interesting for that too. The design itself, on paper at least, is very flexible and would support voxels.

I won't be going crazy for it until I know more, though. It may be all good on paper, but how will it stack up in the real world? It may be flexible, but how powerful will it be? At the moment it's all just rumours. I've heard it'll have 48 cores for the high-end model, but that doesn't tell us much at all. How will those cores stack up against the shaders in an 8000-series GeForce or a 2000-series ATI card? The cores are in-order execution units based on the Pentium MMX architecture with added cache and vector processing units, which is cause for pause, IMHO.

Larrabee may be the biggest advancement in GPUs since the GeForce 256, or it could just end up being a damp squib. I hope it succeeds, as competition is always good and it could give the GPU industry a kick in the bum. But I'll wait for benchmarks and more info before I pass judgement on it.
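The Outcast-style "voxel" terrain described above is usually implemented as a heightfield raycaster: pure CPU code, no GPU pipeline involved. A toy sketch of the idea in Python (the field of view and scaling constants are made up for illustration; this is nothing from Outcast's actual engine):

```python
import math

def render_heightfield(heightmap, colormap, cam_x, cam_y, cam_h,
                       screen_w=320, screen_h=200, max_dist=100.0):
    """Tiny Outcast/Comanche-style heightfield raycaster (toy example).

    For each screen column, march one ray away from the camera and
    draw the terrain as vertical spans, nearest first.  Returns a
    screen buffer of color indices (0 = sky).
    """
    size = len(heightmap)
    screen = [[0] * screen_w for _ in range(screen_h)]
    for col in range(screen_w):
        # One ray per column, fanned across a ~60 degree field of view.
        angle = (col / screen_w - 0.5) * math.radians(60)
        dx, dy = math.sin(angle), math.cos(angle)
        horizon = screen_h // 2
        y_min = screen_h  # lowest pixel not yet filled in this column
        dist = 1.0
        while dist < max_dist:
            mx = int(cam_x + dx * dist) % size
            my = int(cam_y + dy * dist) % size
            h = heightmap[my][mx]
            # Perspective-project the terrain height at this distance
            # (120 is an arbitrary vertical scale factor).
            top = int(horizon - (h - cam_h) / dist * 120)
            if top < y_min:
                for y in range(max(top, 0), y_min):
                    screen[y][col] = colormap[my][mx]
                y_min = max(top, 0)
            dist += 1.0
    return screen
```

Each column marches one ray outward and paints the terrain bottom-up, skipping spans already covered by nearer terrain, which is part of why these renderers ran acceptably on a late-90s CPU.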


#54 Teuf_
Member since 2004 • 30805 Posts

So my understanding is that Larrabee will require devs to learn a new architecture?

mephisto_11



No. It will support DX11, which means PC devs will be able to use it just like any other regular GPU. However, they also have the option of bypassing DirectX (like the Project Offset guys are doing) and writing a custom rasterizer, which would require understanding how the chip works at a low level.
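For a sense of what "writing a custom rasterizer" involves, here is a toy software rasterizer in Python (illustrative only, not Offset's or Intel's code): the same half-space edge test that fixed-function hardware applies per pixel, just done in an ordinary loop.

```python
def edge(ax, ay, bx, by, px, py):
    """Signed-area test: which side of the edge (a -> b) is point p on?"""
    return (px - ax) * (by - ay) - (py - ay) * (bx - ax)

def rasterize_triangle(buf, v0, v1, v2, color):
    """Fill a 2D triangle into buf (a list of rows) using edge
    functions -- the half-space test a hardware rasterizer applies
    per pixel, here done in plain software."""
    xs = [v0[0], v1[0], v2[0]]
    ys = [v0[1], v1[1], v2[1]]
    area = edge(*v0, *v1, *v2)
    if area == 0:
        return  # degenerate triangle, nothing to draw
    for y in range(max(min(ys), 0), min(max(ys), len(buf) - 1) + 1):
        for x in range(max(min(xs), 0), min(max(xs), len(buf[0]) - 1) + 1):
            w0 = edge(*v1, *v2, x, y)
            w1 = edge(*v2, *v0, x, y)
            w2 = edge(*v0, *v1, x, y)
            # Inside if all three edge tests agree with the winding.
            if (w0 >= 0 and w1 >= 0 and w2 >= 0 and area > 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0 and area < 0):
                buf[y][x] = color
```

A real renderer adds depth buffering, interpolated attributes, and texture sampling on top, but this per-pixel test is the core that a team doing its own rasterizer would own and be free to change.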

MS isn't interested in incorporating it into their next console, and Sony already has the Cell...

mephisto_11


Yeah, it's nearly a given at this point that Sony will go with another Cell variant for the PS4, but MS is still up in the air. We haven't heard anything definitive from MS or Intel in that regard.

#55 Teuf_
Member since 2004 • 30805 Posts

Someone's excited, aren't they?

Although I'm not quite as excited, I certainly find Intel's proposition very interesting.

I fondly remember a game called Outcast, which was released in 1999. It used a voxel engine (which meant software rendering only), and at the time it was a visually stunning game. The world looked very organic and was a pleasure to explore. And it ran on a 500MHz Pentium III (where, of course, the CPU had to do everything from graphics rendering to AI).

I've often wondered what voxels could do today, given the power of modern CPUs and multi-core tech. Larrabee would certainly be very interesting for that too. The design itself, on paper at least, is very flexible and would support voxels.

I won't be going crazy for it until I know more, though. It may be all good on paper, but how will it stack up in the real world? It may be flexible, but how powerful will it be? At the moment it's all just rumours. I've heard it'll have 48 cores for the high-end model, but that doesn't tell us much at all. How will those cores stack up against the shaders in an 8000-series GeForce or a 2000-series ATI card? The cores are in-order execution units based on the Pentium MMX architecture with added cache and vector processing units, which is cause for pause, IMHO.

Larrabee may be the biggest advancement in GPUs since the GeForce 256, or it could just end up being a damp squib. I hope it succeeds, as competition is always good and it could give the GPU industry a kick in the bum. But I'll wait for benchmarks and more info before I pass judgement on it.

osan0


Voxels are indeed really cool, and it would be awesome if someone got a nice voxel engine running on Larrabee. Carmack's been talking about integrating a few voxel-based techniques for his next-gen engine... maybe he could pull it off.

BTW, if you're interested in voxels, you should definitely check out Voxelstein 3D.

#56 Daytona_178
Member since 2005 • 14962 Posts

[QUOTE="Daytona_178"]Larrabee is going to be a HUGE step forward in gaming graphics!
horrowhip

Or they could fail, and we will have to wait another 10 years for the return of fully programmable graphics.

My expectations are high... I am guessing the price point will also be high :D


#57 osan0
Member since 2004 • 18263 Posts

[QUOTE="osan0"]
Teufelhuhn

Voxels are indeed really cool, and it would be awesome if someone got a nice voxel engine running on Larrabee. Carmack's been talking about integrating a few voxel-based techniques for his next-gen engine... maybe he could pull it off.

BTW, if you're interested in voxels, you should definitely check out Voxelstein 3D.

Cool, thanks for the link.


#58 Teuf_
Member since 2004 • 30805 Posts
[QUOTE="horrowhip"]

[QUOTE="Daytona_178"]Larrabee is going to be a HUGE step forward in gaming graphics!
Daytona_178

Or they could fail, and we will have to wait another 10 years for the return of fully programmable graphics.

My expectations are high... I am guessing the price point will also be high :D



I think Intel will want to be competitive. They seem like they're out for blood, especially when it comes to Nvidia.

#59 killzowned24
Member since 2007 • 7345 Posts
Nvidia says that it will have the power of a 2006 Nvidia card, so...

#60 blackdreamhunk
Member since 2007 • 3880 Posts
Nvidia says that it will have the power of a 2006 Nvidia card, so...
killzowned24

Do you have a link to back those words up?

#61 Teuf_
Member since 2004 • 30805 Posts
Nvidia says that it will have the power of a 2006 Nvidia card, so...
killzowned24


Nvidia is the master of FUD. :)

#62 killzowned24
Member since 2007 • 7345 Posts

[QUOTE="killzowned24"]Nvidia says that it will have the power of a 2006 Nvidia card, so...
Teufelhuhn

Nvidia is the master of FUD. :)

We shall see, but I still believe ATI and Nvidia will have much better hardware when it arrives.

About the link: Google "Nvidia throws another punch".


#63 blackdreamhunk
Member since 2007 • 3880 Posts

[QUOTE="Teufelhuhn"][QUOTE="killzowned24"]Nvidia says that it will have the power of a 2006 Nvidia card, so...
killzowned24

Nvidia is the master of FUD. :)

We shall see, but I still believe ATI and Nvidia will have much better hardware when it arrives.

About the link: Google "Nvidia throws another punch".

May I see a link or the source for your info?

#64 Teuf_
Member since 2004 • 30805 Posts

[QUOTE="Teufelhuhn"][QUOTE="killzowned24"]Nvidia says that it will have the power of a 2006 Nvidia card, so...
killzowned24

Nvidia is the master of FUD. :)

We shall see, but I still believe ATI and Nvidia will have much better hardware when it arrives.

About the link: Google "Nvidia throws another punch".

I wouldn't doubt it if Nvidia and ATI outperform Larrabee in 99% of PC games. What will be interesting is what games like Project Offset pull off, or what would happen if Larrabee gets put into a console.

And I've read all of Nvidia's PR briefs; as usual they give off a lot of hot air. Larrabee being outperformed by 10% is one thing, but saying it will be on par with a 7900 GTX is utterly ridiculous. Like I said before, Intel has a whole lot of talented engineers and the best fabs money can buy.

#65 blackdreamhunk
Member since 2007 • 3880 Posts
I know I am going to be getting Project Offset. When their forums open up, I'm going to be helping them out with some of my ideas. I have a reputation for helping big game companies with ideas. :)

#66 Arronaxxx
Member since 2003 • 306 Posts
Looks like Intel got their own personal MGS4 :)

#67 Daytona_178
Member since 2005 • 14962 Posts

[QUOTE="Teufelhuhn"][QUOTE="killzowned24"]Nvidia says that it will have the power of a 2006 Nvidia card, so...
killzowned24

Nvidia is the master of FUD. :)

We shall see, but I still believe ATI and Nvidia will have much better hardware when it arrives.

About the link: Google "Nvidia throws another punch".

Why do you think that? Nvidia will be using standard video card technology; Larrabee is completely different, so if it's good, it's going to be VERY good!


#68 blackdreamhunk
Member since 2007 • 3880 Posts
Looks like Intel got their own personal MGS4 :)
Arronaxxx

MGS4, or Diablo on steroids, hehe, more like it.

#70 Eddie-Vedder
Member since 2003 • 7810 Posts
Isn't this game on the 360?

#71 killzowned24
Member since 2007 • 7345 Posts
[QUOTE="killzowned24"]

[QUOTE="Teufelhuhn"][QUOTE="killzowned24"]Nvidia says that it will have the power of a 2006 Nvidia card, so...
Teufelhuhn

Nvidia is the master of FUD. :)

We shall see, but I still believe ATI and Nvidia will have much better hardware when it arrives.

About the link: Google "Nvidia throws another punch".

I wouldn't doubt it if Nvidia and ATI outperform Larrabee in 99% of PC games. What will be interesting is what games like Project Offset pull off, or what would happen if Larrabee gets put into a console.

And I've read all of Nvidia's PR briefs; as usual they give off a lot of hot air. Larrabee being outperformed by 10% is one thing, but saying it will be on par with a 7900 GTX is utterly ridiculous. Like I said before, Intel has a whole lot of talented engineers and the best fabs money can buy.

You're right; it seems he was quoted wrong, and if you read around the fifth comment there is a link where he says it's more like a GTX 280.

But I'm still not sure until they show more info; I still think they will be better.


#72 blackdreamhunk
Member since 2007 • 3880 Posts

Isn't this game on the 360?
Eddie-Vedder

No, it's not going to be on consoles. Intel bought the rights to the game. It's a PC-only game, by the way. The consoles are just too old for such a high-tech game.

Yeah, when Project Offset open their forums I am so going to ask if they can use this tech in their game :) hehehehe

http://technology.timesonline.co.uk/tol/news/tech_and_web/article4557935.ece

Now that would be insane.


#73 blackdreamhunk
Member since 2007 • 3880 Posts

Well, this is how I feel about this thread, blackdreamhunk style, lol. Reasons it's good to be a PC gamer: paying the extra money is so worth it. To be the best, you first have to play the best! Anyway, this song says how I feel about this thread, lol.

http://www.youtube.com/watch?v=Vla11PrcuEk&feature=related

And this video is for people who are caught up to speed with Intel:

http://www.youtube.com/watch?v=uBPegjNG2qw


#74 malikmmm
Member since 2003 • 2235 Posts

Isn't this game on the 360?
Eddie-Vedder

Hehe, not anymore.... :lol:


#75 Giancar
Member since 2006 • 19160 Posts

Isn't this game on the 360?
Eddie-Vedder

Not anymore.


#76 Jamex1987
Member since 2008 • 2187 Posts

Isn't this game on the 360?
Eddie-Vedder

Yes it is. No one has cancelled anything.


#77 blackdreamhunk
Member since 2007 • 3880 Posts

[QUOTE="Eddie-Vedder"]Isn't this game on the 360?
Jamex1987

Yes it is. No one has cancelled anything.

This game is going to be maxed out for the PC. You really think a console can handle it? A console can't handle Crysis, let alone Project Offset. The amount of tech going into this game is beyond anything to date.

http://en.wikipedia.org/wiki/Project_Offset

The last statement was that the consoles are too old! Yeah, and there is talk that the devs for this game were not happy with the limitations consoles have to offer. The game couldn't reach its full potential on console.


#78 Jamex1987
Member since 2008 • 2187 Posts
[QUOTE="Jamex1987"]

[QUOTE="Eddie-Vedder"]Isn't this game on the 360?
blackdreamhunk

Yes it is. No one has cancelled anything.

This game is going to be maxed out for the PC. You really think a console can handle it? A console can't handle Crysis, let alone Project Offset. The amount of tech going into this game is beyond anything to date.

http://en.wikipedia.org/wiki/Project_Offset

The last statement was that the consoles are too old!

All you are doing is speculating. Come back when you have facts.

This game was announced over two years ago.


#79 blackdreamhunk
Member since 2007 • 3880 Posts
[QUOTE="blackdreamhunk"][QUOTE="Jamex1987"]

[QUOTE="Eddie-Vedder"]Isn't this game on the 360?
Jamex1987

Yes it is. No one has cancelled anything.

This game is going to be maxed out for the PC. You really think a console can handle it? A console can't handle Crysis, let alone Project Offset. The amount of tech going into this game is beyond anything to date.

http://en.wikipedia.org/wiki/Project_Offset

The last statement was that the consoles are too old!

All you are doing is speculating. Come back when you have facts.

This game was announced over two years ago.

Check out the link I gave you on the wiki!!! PC-only game!!! Sorry, I bring facts to the table; speculation is for weak people.

The game is bought and owned by Intel.

http://www.youtube.com/watch?v=Vla11PrcuEk&feature=related


#80 dc337
Member since 2008 • 2603 Posts

I hope these companies are aware that high-end PC gaming is on the decline.

Carmack on PC game sales:

http://blogs.pcworld.com/gameon/archives/007422.html


#81 PC360Wii
Member since 2007 • 4658 Posts

I hope these companies are aware that high-end PC gaming is on the decline.

Carmack on PC game sales:

http://blogs.pcworld.com/gameon/archives/007422.html

dc337

I wouldn't be surprised if all 96 of your posts have been anti-PC.


#82 AdrianWerner
Member since 2003 • 28441 Posts

I hope these companies are aware that high-end PC gaming is on the decline.

Carmack on PC game sales:

http://blogs.pcworld.com/gameon/archives/007422.html

dc337

And yet D3 sold more than Quake 1-3.

Anyway, with id and Crytek going multiplatform, there's a big opportunity for new companies to take the "graphics champ" spot. That's what Offset and Futuremark are aiming at, and it will bring them nice sales.


#83 dc337
Member since 2008 • 2603 Posts

I wouldn't be surprised if all 96 of your posts have been anti-PC.

PC360Wii

90/96


#84 devious742
Member since 2003 • 3924 Posts

No. It will support DX11, which means PC devs will be able to use it just like any other regular GPU. However, they also have the option of bypassing DirectX (like the Project Offset guys are doing) and writing a custom rasterizer, which would require understanding how the chip works at a low level.
Teufelhuhn

Wait a sec: if devs are allowed to bypass DirectX, wouldn't MS be unhappy with that?? If that's true, I don't think MS would allow a Larrabee in their console...


#85 HuusAsking
Member since 2006 • 15270 Posts

Someone's excited, aren't they?

Although I'm not quite as excited, I certainly find Intel's proposition very interesting.

I fondly remember a game called Outcast, which was released in 1999. It used a voxel engine (which meant software rendering only), and at the time it was a visually stunning game. The world looked very organic and was a pleasure to explore. And it ran on a 500MHz Pentium III (where, of course, the CPU had to do everything from graphics rendering to AI).

I've often wondered what voxels could do today, given the power of modern CPUs and multi-core tech. Larrabee would certainly be very interesting for that too. The design itself, on paper at least, is very flexible and would support voxels.

I won't be going crazy for it until I know more, though. It may be all good on paper, but how will it stack up in the real world? It may be flexible, but how powerful will it be? At the moment it's all just rumours. I've heard it'll have 48 cores for the high-end model, but that doesn't tell us much at all. How will those cores stack up against the shaders in an 8000-series GeForce or a 2000-series ATI card? The cores are in-order execution units based on the Pentium MMX architecture with added cache and vector processing units, which is cause for pause, IMHO.

Larrabee may be the biggest advancement in GPUs since the GeForce 256, or it could just end up being a damp squib. I hope it succeeds, as competition is always good and it could give the GPU industry a kick in the bum. But I'll wait for benchmarks and more info before I pass judgement on it.

osan0
Interesting thought on the voxels, but I think volumetric rendering runs into problems when you have to deal with more realistic physics, particularly fluids. Fluids are curvaceous in nature, and their physics are closer to spheres than to cubes (most physics engines simulate fluid dynamics by means of spheres).
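That sphere-based approach is simple to sketch. A toy particle step in Python (the constants are made up; this is not any particular engine's solver), where each pair of overlapping spheres pushes apart:

```python
import math

def step_fluid(particles, dt=0.016, radius=1.0, stiffness=50.0, gravity=-9.8):
    """One integration step of a toy sphere-based fluid (illustrative only).

    particles: list of [x, y, vx, vy].  Each pair of overlapping
    spheres gets a symmetric repulsion impulse along the line between
    their centres -- the basic trick behind SPH-style game solvers.
    """
    n = len(particles)
    for i in range(n):
        for j in range(i + 1, n):
            a, b = particles[i], particles[j]
            dx, dy = b[0] - a[0], b[1] - a[1]
            d = math.hypot(dx, dy)
            if 0.0 < d < 2.0 * radius:
                # Overlapping spheres: push them apart.
                push = stiffness * (2.0 * radius - d) * dt
                nx, ny = dx / d, dy / d
                a[2] -= nx * push; a[3] -= ny * push
                b[2] += nx * push; b[3] += ny * push
    for p in particles:
        p[3] += gravity * dt   # gravity on the vertical velocity
        p[0] += p[2] * dt      # integrate position
        p[1] += p[3] * dt
        if p[1] < 0.0:         # crude floor collision
            p[1], p[3] = 0.0, 0.0
    return particles
```

Real solvers replace the hard overlap test with smooth density kernels and use spatial hashing instead of the O(n²) pair loop, but the "fluid as interacting spheres" idea is the same.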

#86 IgGy621985
Member since 2004 • 5922 Posts

I hope these companies are aware that high-end PC gaming is on the decline.

Carmack on PC game sales:

http://blogs.pcworld.com/gameon/archives/007422.html

dc337

And yet it was the biggest player in 2007.


#87 blackdreamhunk
Member since 2007 • 3880 Posts
[QUOTE="PC360Wii"]

I wouldn't be surprised if all 96 of your posts have been anti-PC.

dc337

90/96

That is because they know the PC is the best gaming machine ever. Sony and M$ can feel it, CEOs can feel it, Epic Games and Crytek can feel it.

It's like me around a lot of hot chicks! They can feel the pressure.

http://www.youtube.com/watch?v=FpLsp6KNRoQ&feature=related

That is what happens when people know you're the best, or one of the best. competaion my friend ;)


#88 Teuf_
Member since 2004 • 30805 Posts

[QUOTE="Teufelhuhn"]No. It will support DX11, which means PC devs will be able to use it just like any other regular GPU. However, they also have the option of bypassing DirectX (like the Project Offset guys are doing) and writing a custom rasterizer, which would require understanding how the chip works at a low level.
devious742

Wait a sec: if devs are allowed to bypass DirectX, wouldn't MS be unhappy with that?? If that's true, I don't think MS would allow a Larrabee in their console...



They're probably not thrilled at the idea, but I don't think they expect that to really take off in the PC space. You have to remember that even if Larrabee is a huge hit, most users are going to have Nvidia and ATI GPUs for a very long time. And if a developer wants to target ATI, Nvidia, and Intel with the same code, then they'll still have to go through DX.

I don't think that would affect their decision whether or not to use it in a console. If the Xbox 720 (or whatever it's called) does have a Larrabee in it, then MS would develop new tools and APIs for it, and those would likely cross over to the PC space. So even if PC devs didn't use DX, they'd still be using Microsoft tools.

#89 Bebi_vegeta
Member since 2003 • 13558 Posts

[QUOTE="Teufelhuhn"]No. It will support DX11, which means PC devs will be able to use it just like any other regular GPU. However, they also have the option of bypassing DirectX (like the Project Offset guys are doing) and writing a custom rasterizer, which would require understanding how the chip works at a low level.
devious742

Wait a sec: if devs are allowed to bypass DirectX, wouldn't MS be unhappy with that?? If that's true, I don't think MS would allow a Larrabee in their console...

OpenGL doesn't use DirectX... so, yeah.


#90 devious742
Member since 2003 • 3924 Posts
[QUOTE="devious742"]

[QUOTE="Teufelhuhn"]No. It will support DX11, which means PC devs will be able to use it just like any other regular GPU. However, they also have the option of bypassing DirectX (like the Project Offset guys are doing) and writing a custom rasterizer, which would require understanding how the chip works at a low level.
Bebi_vegeta

Wait a sec: if devs are allowed to bypass DirectX, wouldn't MS be unhappy with that?? If that's true, I don't think MS would allow a Larrabee in their console...

OpenGL doesn't use DirectX... so, yeah.

True... forgot about that :D


#91 Teuf_
Member since 2004 • 30805 Posts

OpenGL doesn't use DirectX... so, yeah.

Bebi_vegeta


What does OpenGL have to do with it?

#92 devious742
Member since 2003 • 3924 Posts
[QUOTE="Bebi_vegeta"]

OpenGL doesn't use DirectX... so, yeah.

Teufelhuhn



What does OpenGL have to do with it?

That you can bypass DirectX and just use OpenGL...


#93 dc337
Member since 2008 • 2603 Posts
[QUOTE="dc337"]

I hope these companies are aware that high-end PC gaming is on the decline.

Carmack on PC game sales:

http://blogs.pcworld.com/gameon/archives/007422.html

IgGy621985

And yet it was the biggest player in 2007.

Yes, thanks to games like Nancy Drew, which will probably be the best-selling PC game this year. Note that I said high-end, which is also what Carmack was talking about. When a veteran PC dev like Carmack calls the PC their "junior partner", I think the time for denial is over.


#94 Bebi_vegeta
Member since 2003 • 13558 Posts
[QUOTE="Bebi_vegeta"]

OpenGL doesn't use DirectX... so, yeah.

Teufelhuhn



What does OpenGL have to do with it?

Well, it's quite obvious... if you read what was written.


#95 dc337
Member since 2008 • 2603 Posts
That is because they know the PC is the best gaming machine ever. Sony and M$ can feel it, CEOs can feel it, Epic Games and Crytek can feel it.

It's like me around a lot of hot chicks! They can feel the pressure.

http://www.youtube.com/watch?v=FpLsp6KNRoQ&feature=related

That is what happens when people know you're the best, or one of the best. competaion my friend ;)

blackdreamhunk

Competation? So the best don't know about spell check?

I have a gaming PC, and I don't think it is the end-all of gaming. I'm actually only keeping it for Dawn of War II.

Some of us manage to keep a gaming PC without developing the dorky, arrogant attitude that seems to be common around here. A lot of PC gamers really seem to be using their video cards for penile compensation.


#96 Teuf_
Member since 2004 • 30805 Posts
[QUOTE="devious742"][QUOTE="Teufelhuhn"][QUOTE="Bebi_vegeta"]

OpenGL doesn't use DirectX... so, yeah.

devious742

What does OpenGL have to do with it?

That you can bypass DirectX and just use OpenGL...

That's not what I meant when I said "bypass DirectX"... I meant not using a 3D graphics API at all.

On PCs, OpenGL and DirectX do the same thing: they serve as a sort of translator that you use to talk to a video card. You see, each piece of graphics hardware has its own "language" that it speaks. So when you make a PC game and you want to draw a triangle on the screen, you don't program it to say that in all the different hardware "languages"; you use DirectX or OpenGL. You tell the API to draw the triangle, and the API translates that into the "language" the hardware understands. This is how things have been for a very long time. It's nice in that you don't need to worry about individual kinds of video cards, but it's limiting since you can only do the things that DirectX/OpenGL knows how to translate. You're also limited because video cards aren't very flexible to begin with: they're only designed to do things like draw triangles and sample textures.

Now with Larrabee things are a little different. It does support using DirectX or OpenGL, if you want to go through the whole translator bit... but if you want, you can skip that. This is because Larrabee uses x86 cores: the same kind of cores that are in a Core 2 Duo or any other Intel/AMD CPU. So this means, if you wanted, you could talk to the hardware directly to do your graphics work, and the hardware is flexible enough to do all kinds of things that wouldn't be possible on "normal" video cards. It would be harder to do, and any code you wrote for this would only work on Larrabee and not Nvidia/ATI, but you would have far fewer restrictions.
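To make the contrast concrete: on general-purpose cores the "pipeline" is just a loop you own, so arbitrary per-pixel logic can be folded straight in. A toy sketch in plain Python (standing in for the vectorised x86 code Larrabee would actually run; all names here are made up for illustration):

```python
def shade_scanline(framebuffer, y, x0, x1, base_color, effect):
    """Shade pixels x0..x1 of row y with a caller-supplied effect.

    On a fixed-function GPU you can only combine the operations the
    API exposes; on a general-purpose core the "pipeline" is just a
    loop, so `effect` can be any function at all.
    """
    row = framebuffer[y]
    for x in range(x0, x1):
        row[x] = effect(base_color, x, y)

# Example: a custom checkerboard dither no API stage has to know about.
checker = lambda c, x, y: c if (x + y) % 2 == 0 else 0
```

Here `effect` could be anything: a custom dither, an exotic lighting model, a voxel lookup. Nothing has to fit a predefined API stage, which is exactly the freedom the "bypass DirectX" option is about.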

#97 fazares
Member since 2004 • 6913 Posts
Just as I thought... Larrabee is the herald of the approach that will be taken for next-gen graphics pipelines... but they will first use it to run slightly enhanced current-gen engines for a few years, until the time is ripe for jumping on the next-gen engine bandwagon...

#98 blackdreamhunk
Member since 2007 • 3880 Posts
[QUOTE="blackdreamhunk"]That is because they know the PC is the best gaming machine ever. Sony and M$ can feel it, CEOs can feel it, Epic Games and Crytek can feel it.

It's like me around a lot of hot chicks! They can feel the pressure.

http://www.youtube.com/watch?v=FpLsp6KNRoQ&feature=related

That is what happens when people know you're the best, or one of the best. competaion my friend ;)

dc337

Competation? So the best don't know about spell check?

I have a gaming PC, and I don't think it is the end-all of gaming. I'm actually only keeping it for Dawn of War II.

Some of us manage to keep a gaming PC without developing the dorky, arrogant attitude that seems to be common around here. A lot of PC gamers really seem to be using their video cards for penile compensation.

Yeah, well, some people don't like lowering their standards for anything. So good luck with your old console machines.

#99 Bebi_vegeta
Member since 2003 • 13558 Posts
[QUOTE="devious742"][QUOTE="Teufelhuhn"][QUOTE="Bebi_vegeta"]

OpenGL doesn't use DirectX... so, yeah.

Teufelhuhn

What does OpenGL have to do with it?

That you can bypass DirectX and just use OpenGL...

That's not what I meant when I said "bypass DirectX"... I meant not using a 3D graphics API at all.

On PCs, OpenGL and DirectX do the same thing: they serve as a sort of translator that you use to talk to a video card. You see, each piece of graphics hardware has its own "language" that it speaks. So when you make a PC game and you want to draw a triangle on the screen, you don't program it to say that in all the different hardware "languages"; you use DirectX or OpenGL. You tell the API to draw the triangle, and the API translates that into the "language" the hardware understands. This is how things have been for a very long time. It's nice in that you don't need to worry about individual kinds of video cards, but it's limiting since you can only do the things that DirectX/OpenGL knows how to translate. You're also limited because video cards aren't very flexible to begin with: they're only designed to do things like draw triangles and sample textures.

Now with Larrabee things are a little different. It does support using DirectX or OpenGL, if you want to go through the whole translator bit... but if you want, you can skip that. This is because Larrabee uses x86 cores: the same kind of cores that are in a Core 2 Duo or any other Intel/AMD CPU. So this means, if you wanted, you could talk to the hardware directly to do your graphics work. It would be harder to do, and any code you wrote for this would only work on Larrabee and not Nvidia/ATI, but you would have far fewer restrictions.

I still have my doubts on Larrabee... GPUs are that much more powerful in terms of raw calculation.


#100 Teuf_
Member since 2004 • 30805 Posts

I still have my doubts on Larrabee... GPUs are that much more powerful in terms of raw calculation.

Bebi_vegeta


Yup, they most likely will be. Intel is betting that the extra flexibility will make up for that, but who knows.