You need to buy a video card to play Unreal Tournament 3 on the PC; with the original Unreal in 1998 you didn't even need to buy a video card to play the game, dude. — doom1marine

So we should go back to Doom 1 graphics? — DeckardLee2010
[QUOTE="AbleFa3"]Stalker 2 is confirmed for consoles actually too[/QUOTE]
Psst... look up the meaning of "confirmed". You really should before continuing to make that sort of statement. — ferret-gamer

http://www.systemwars.com/forums/showthread.php?78857-Stalker-2-in-development-console-bound
http://www.neoseeker.com/news/14618-stalker-2-confirmed-going-multi-platform/
Confirmed, like Witcher 2 and Diablo 3. These games are made for consoles; they are not officially announced, but that doesn't mean anything. — AbleFa3

Again, look up the meaning of "confirmed" and read what was actually said: "A completely new multi-platform technology developed by GSC will make the core of the game."
Nowhere was it ever said that the game is confirmed for consoles. It says it uses a multiplatform engine. That doesn't mean it is coming to multiple platforms, just that it is capable of doing so. — ferret-gamer
[QUOTE="doom1marine"]You need to buy a video card to play Unreal Tournament 3 on the PC; with the original Unreal in 1998 you didn't even need a video card.[/QUOTE]
So we should go back to Doom 1 graphics? — DeckardLee2010

UT1999, CPU software rendered: http://i53.tinypic.com/2vdsm68.jpg
Do Intel integrated chipsets even support DirectX these days? I doubt it; the Intel integrated chipset on my Pavilion 6535 didn't support DirectX. I had to run the original Unreal and Unreal Tournament in software mode until I got a 3dfx Voodoo3.
[QUOTE="DeckardLee2010"]So we should go back to Doom 1 graphics?[/QUOTE]
You need a graphics card to be able to produce an image on the screen; without it you wouldn't have a PC. :lol: — SPBoss

BS. Quake 1, 2, and 3, Unreal (1998), and Unreal Tournament (1999) all supported software/CPU rendering. — yodogpollywog
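For context, "software/CPU rendering" just means the game computes every pixel itself and writes it into a plain framebuffer in memory, with no 3D hardware involved. A minimal, hypothetical sketch of the idea (not Epic's or id's actual code):

```python
# Minimal CPU "software renderer": fill a triangle into a framebuffer
# using edge functions -- no GPU and no graphics API involved.

WIDTH, HEIGHT = 80, 60
framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]  # 0 = background

def edge(ax, ay, bx, by, px, py):
    """Signed-area test: which side of edge (a -> b) point p lies on."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def draw_triangle(fb, v0, v1, v2, color):
    # A bounding box keeps the per-pixel loop small.
    xs = [v0[0], v1[0], v2[0]]
    ys = [v0[1], v1[1], v2[1]]
    for y in range(max(min(ys), 0), min(max(ys), HEIGHT - 1) + 1):
        for x in range(max(min(xs), 0), min(max(xs), WIDTH - 1) + 1):
            # Inside if the point is on the same side of all three edges.
            w0 = edge(v1[0], v1[1], v2[0], v2[1], x, y)
            w1 = edge(v2[0], v2[1], v0[0], v0[1], x, y)
            w2 = edge(v0[0], v0[1], v1[0], v1[1], x, y)
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                fb[y][x] = color

draw_triangle(framebuffer, (10, 5), (70, 20), (30, 55), color=1)
filled = sum(row.count(1) for row in framebuffer)
print(filled > 0)  # True: the CPU alone produced the image
```

Real software renderers of the era did exactly this kind of per-pixel loop (plus texturing and lighting) in hand-tuned assembly, which is why they ran acceptably on late-90s CPUs.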
Then he should shut the f up and make his games PC-only, and he'll be able to make them as powerful as he wants. I never asked for a console version of Crysis 2. — jack00

Exactly, lol. Console gamers don't give a **** about Crysis... — lazerface216
When Quake 1 was released, consumer 3D graphics cards weren't even out yet; a patch to support OpenGL was released later.
What OS would those games run on? I'm pretty sure the OS would require a GPU to produce a visual image. — SPBoss

Umm, you must be new to gaming.
http://www.youtube.com/watch?v=J2En3SCxlJM
Quake 2 on PS3 Linux, running only off the CPU with software rendering; you can't access the RSX from Linux on the PS3. — yodogpollywog
No, I'm not new to PC gaming; I've been doing it for years. I just haven't looked at outdated technology, lol! Quite impressive tech skills, but pretty lame at the same time; if someone wants to play Quake they should just get a normal PC. It's great for proving a point, though. Maybe one day someone will hack the RSX and run PC games on the PS3; now that would be epic! I wonder why Sony stopped OS support... maybe because of things like this? — SPBoss

Intel integrated chipsets don't support DirectX as far as I know, so you need to buy a video card, since games don't support software rendering anymore. — yodogpollywog
You must be new, because you couldn't even buy a 3D graphics card when Quake 1 shipped in 1996. — yodogpollywog
OK, I'm going to break this down for you very simply:
1. Intel chipsets X3100 and higher fully support DX10 and hardware T&L.
2. There is no plausible reason to bring back software rendering. Graphics processing chips were invented because they are far more efficient at handling graphical rendering than an x86 CPU; going back to software rendering is a step backwards.
3. You keep using examples from a decade ago. Just because software rendering may have been better then doesn't mean it is better now. See point 2.
4. If you really want to see how software rendering would work out, take a look at the Wii emulator, where it takes an i7 to run Resident Evil 4 at 30 fps at 1080p.
5. If you go on a tirade like this, learn what you are talking about. Software rendering was made obsolete for a reason; maybe you should have thought of that.
— ferret-gamer
Software rendering is better. The CPU could render Unreal Tournament 1999's graphics as well as a video card on max settings, while integrated chipsets can't run Crysis anywhere close to max settings compared to a high-end card. Software rendering was more comparable to max. — yodogpollywog
In other words, you have no idea whatsoever what you are talking about, so you will just continue to repeat the same stuff that I debunked in the very post you quoted. — ferret-gamer

Nope, I've been PC gaming since before you were born. — yodogpollywog
No, I'm not new, and no, software rendering is stupid. Try running a new DX11 game on max settings with just the CPU; there is a benchmark that does exactly that, and even the highest-end CPUs can't manage more than a few frames per second. — SPBoss
The CPU could do it. Tim Sweeney talked about bringing back software rendering when Intel was planning the 32-core Larrabee (spelling?) chip.
UT1999, video card/hardware rendered: http://i53.tinypic.com/2ykny45.jpg
UT1999, CPU software rendered: http://i53.tinypic.com/2vdsm68.jpg
Max settings on both; lol, the video card doesn't look much better. Software rendering in UT1999 is closer to a video card at max than integrated graphics running Crysis is to a high-end card at max. Games actually look crisper software rendered than video card rendered; if you look at the UT1999 screens you'll see software mode looks crisper. — yodogpollywog
Yet I know more about the topic. :D
So let's end this once and for all: you keep claiming that software rendering is superior on the strength of a comparison in Unreal. Let me explain exactly why that is no longer a valid argument. The very first GPUs used a fixed-function pixel pipeline, whereas the software rendering of the time was not limited to that. But graphics chips are no longer limited to fixed function: they gained something called "shaders". Perhaps you have heard of them? Graphics processors now offer superior programmability along with far superior performance, making them the much better choice for rendering. — ferret-gamer
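To make the "shaders" point concrete: a pixel shader is just a small function evaluated once per pixel. A software renderer runs it serially on the CPU, while a GPU runs the same function across thousands of pixels in parallel; that, plus the programmability, is what made fixed-function pipelines obsolete. A toy illustration (the shader itself is made up for this example):

```python
# A "pixel shader" as a plain function evaluated once per pixel.
# The CPU version below is just a serial loop; a GPU would run the
# same function for every pixel at once.

WIDTH, HEIGHT = 64, 48

def pixel_shader(u, v):
    """Toy shader: radial falloff, like a simple per-pixel light."""
    dx, dy = u - 0.5, v - 0.5
    dist2 = dx * dx + dy * dy
    return max(0.0, 1.0 - 4.0 * dist2)  # brightness in [0, 1]

# CPU "software rendering": shade one pixel at a time.
image = [
    [pixel_shader(x / WIDTH, y / HEIGHT) for x in range(WIDTH)]
    for y in range(HEIGHT)
]

center = image[HEIGHT // 2][WIDTH // 2]
corner = image[0][0]
print(center > corner)  # True: brightest at the centre of the "light"
```

The GPU advantage is that this per-pixel function is embarrassingly parallel, so hundreds of shader cores can each run it independently, which a general-purpose CPU of any era cannot match per transistor.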
The particles in UT1999 were rendered by the CPU; those were basically shaders before "shaders". — yodogpollywog
No duh. Did you read my post at all, apart from the giant green word? — ferret-gamer

That blows that Intel cancelled that 32-core chip; it might have made devs support software rendering again. Software rendering support these days would be much better than it was in the early '90s. — yodogpollywog
1. Larrabee was a GPGPU, not a CPU. 2. You have given no plausible reason why software rendering should be brought back over hardware rendering. — ferret-gamer

I can run Unreal Tournament 1999 in software mode at 1600x1200 at 30-45ish fps on a 2.1 GHz dual-core Athlon with 1 MB cache. UT1999 only runs off one core, because the game is so old it doesn't support dual-core, so a single 2.1 GHz core is running it at 30-45 fps. — yodogpollywog
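As a rough sanity check on those numbers (taking the quoted resolution, frame rate, and clock speed as given, and ignoring overdraw and everything but pixel shading), the per-pixel budget works out like this:

```python
# Back-of-envelope fill rate for the setup described above:
# 1600x1200 at roughly 40 fps on a single ~2.1 GHz core.

width, height, fps = 1600, 1200, 40
clock_hz = 2.1e9

pixels_per_second = width * height * fps          # pixels to shade each second
cycles_per_pixel = clock_hz / pixels_per_second   # CPU cycle budget per pixel

print(f"{pixels_per_second / 1e6:.1f} Mpixels/s")   # 76.8 Mpixels/s
print(f"~{cycles_per_pixel:.0f} cycles per pixel")  # ~27 cycles per pixel
```

Roughly 27 cycles per pixel is enough for 1999-era texturing and lighting, but nowhere near enough for a modern shader that may run hundreds of arithmetic operations per pixel, which is the crux of the performance argument in this thread.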
Maybe I'm thinking of the wrong chip name, then. In a FiringSquad interview, Tim Sweeney talked about possibly bringing software rendering back. — yodogpollywog
That is honestly sad performance; my motherboard's integrated graphics chip could run Quake 3 at over 100 fps at the same resolution. — ferret-gamer
Is Quake 3... UT1999? My integrated Intel chipset with a 466 MHz Celeron could barely run it at 800x600 max, and that was a new PC back in 1999; I got it for Quake 3.
http://i54.tinypic.com/120gqxd.jpg
Lol, UT1999 software rendering at 1600x1200 is only using 50% of my CPU power while running 30-40 fps at max settings; the game being so old is probably the only reason it's even using 50%. Software rendering support would be much better in 2010 than it was in the '90s. I don't see why people are against it; the more options for gamers, the better. — yodogpollywog
Intel's Larrabee includes some GPU functions (e.g. texture-sampling hardware) and 512-bit-wide SIMD. Against Intel's 512-bit SIMD part, AMD Fusion says hi, e.g. Radeon HD stream processors.
Sure, you can have your graphics decelerator: Quake 3 running on a pure x86 CPU, e.g. a Core 2 Duo P8700 at 2.5 GHz.
If you download the MS DirectX SDK, you can get a JIT SSE2/SSE4.1-optimised Direct3D 10 software CPU renderer, i.e. MS's WARP10. On transistor count versus performance, WARP10 and SwiftShader 2 are not competitive against custom processors, i.e. GPUs/GPGPUs.
http://www.pcstats.com/articleview.cfm?articleid=2343&page=9
Foxconn A7DA-S (AMD 790GX 200/800, AM2 X2 5000+, onboard HD 3300 video)
Crysis 1.2 (no AA), DirectX 10, integrated VGA, 1024x768, Low Quality: 24 fps
LOL — yodogpollywog
Why is that "LOL"? Running off the 5000+ alone would give much worse performance. — ferret-gamer

Performance-wise it's not worth it. I already told you I ran a benchmark with a high-end CPU and it gave me 4 fps at max, compared to 200+ frames on a GPU. What you're saying is pointless; it would make more sense if you said they should combine the CPU and GPU, which is something current manufacturers are looking into at the moment. — SPBoss
Nope, I doubt it. — yodogpollywog
SwiftShader 2.01 with its JIT x86/SSSE3 engine runs Quake 3 at 41 fps, normal settings, 640x480, on my Intel Core 2 Duo P8700.
It's worse; i.e., run it on SwiftShader or MS WARP10.

http://techreport.com/articles.x/11931/9
That's my integrated chipset, a GeForce 6150SE; it can barely run F.E.A.R. at 1024x768 low detail, LOL, at 16 fps. — yodogpollywog
[QUOTE="lazerface216"]Exactly, lol. Console gamers don't give a **** about Crysis...[/QUOTE]
I wish all console gamers were like you, and that you'd all put your opinion in a letter addressed to Crytek.