Integrated chips still suck too much for gaming; devs definitely need to start supporting software rendering again.
http://techreport.com/articles.x/11931/9
That's my integrated chipset, the GeForce 6150SE.
It can barely run F.E.A.R. 1 at 1024x768, low detail. LOL @ 16 fps.
My SwiftShader (Direct3D 9b) + Intel Core 2 Duo P8700 runs at about GeForce FX 5200 level. NVIDIA's single-chip GeForce 320 IGP says hi.
Depends on the IGP, e.g. an AMD netbook-level APU's 80 stream processors** kills the GeForce 6150SE. **Same SPU count as a Radeon HD 54x0.
yodogpollywog
[QUOTE="yodogpollywog"]My SwiftShader (Direct3D 9b) + Intel Core 2 Duo P8700 runs at about GeForce FX 5200 level. NVIDIA's single-chip GeForce 320 IGP says hi.
http://techreport.com/articles.x/11931/9
That's my integrated chipset, the GeForce 6150SE.
It can barely run F.E.A.R. 1 at 1024x768, low detail. LOL @ 16 fps.
ronvalencia
benchmarks?
Why is that LOL? Just running off the 5000+ alone would equate to much worse performance.
[QUOTE="ferret-gamer"][QUOTE="yodogpollywog"]
http://www.pcstats.com/articleview.cfm?articleid=2343&page=9
Foxconn A7DA-S (AMD 790GX 200/800 AM2 X2 5000+ onboard HD 3300 video)
Crysis 1.2 (no AA) - DirectX10 Integrated VGA 1024x768 Low Quality 24 fps
LOL
yodogpollywog
nope doubt it.
http://www.beyond3d.com/content/news/618
An E8400 runs Crysis through software rendering with performance akin to an FX 5700, a graphics card below the system requirements to run Crysis smoothly even on low.
So... you were saying?
[QUOTE="yodogpollywog"][QUOTE="ferret-gamer"] Why is that LOL? Just running off the 5000+ alone would equate to much worse performance. ronvalencia
nope doubt it.
It's worse, i.e. run it on SwiftShader or MS WARP10.
Different than native game support for software mode.
[QUOTE="yodogpollywog"]
Why is that LOL? Just running off the 5000+ alone would equate to much worse performance. ferret-gamer
nope doubt it.
http://www.beyond3d.com/content/news/618
An E8400 runs Crysis through software rendering with performance akin to an FX 5700, a graphics card below the system requirements to run Crysis smoothly even on low.
So... you were saying?
I was about to post the same link. Thanks.
Nicolas Capens (aka Nick on our forums) is the creator and lead programmer behind SwiftShader.
LOL, software mode would run Crysis faster if the Crysis devs themselves actually made it.
It's worse, i.e. run it on SwiftShader or MS WARP10.
[QUOTE="ronvalencia"][QUOTE="yodogpollywog"]
nope doubt it.
yodogpollywog
different than native game support for software mode.
Notice "JIT", i.e. the rendering engine includes a "Just In Time" recompiler. Prior to execution, SwiftShader recompiles the code stream into native x86.
Intel and AMD haven't open-sourced their "JIT recompiler" GPU software technology.
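To illustrate what "recompiles the code stream into native x86 prior to execution" means, here is a minimal sketch of the core JIT idea in C: emit machine code into executable memory at runtime, then call it like a normal function. This is not SwiftShader's code; the emitted bytes and the tiny add function are made up for illustration, and x86-64 Linux with the SysV calling convention is assumed.
[code]
/* Minimal sketch of the JIT idea: generate native x86-64 machine code at
 * runtime, then jump to it, instead of interpreting each instruction.
 * Not SwiftShader code; assumes x86-64 Linux and the SysV calling convention. */
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

typedef int (*add_fn)(int, int);

int main(void)
{
    /* mov eax, edi ; add eax, esi ; ret   => return a + b */
    unsigned char code[] = { 0x89, 0xF8, 0x01, 0xF0, 0xC3 };

    /* Ask the OS for a page we are allowed to write to and execute. */
    void *buf = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED)
        return 1;

    memcpy(buf, code, sizeof(code));   /* the "recompile" step */
    add_fn add = (add_fn)buf;          /* treat the bytes as a function */
    printf("2 + 3 = %d\n", add(2, 3)); /* runs as native code */

    munmap(buf, 4096);
    return 0;
}
[/code]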
[QUOTE="yodogpollywog"]
[QUOTE="ronvalencia"] It's worse, i.e. run it on SwiftShader or MS WARP10. ronvalencia
different than native game support for software mode.
Notice "JIT", i.e. the rendering engine includes a "Just In Time" recompiler. Prior to execution, SwiftShader recompiles the code stream into native x86.
Intel and AMD haven't open-sourced their "JIT recompiler" GPU software technology.
I'm talking about devs actually programming the software support themselves.
SwiftShader is just some 3rd-party emulation for games with non-native software support.
http://www.tomshardware.com/reviews/g45-geforce-9400,2263-7.html
The GeForce 9400 mGPU
LOL, 28 fps in Unreal Tournament at low settings, 1280x1024.
[QUOTE="ronvalencia"]
[QUOTE="yodogpollywog"]
different than native game support for software mode.
Notice "JIT", i.e. the rendering engine includes a "Just In Time" recompiler. Prior to execution, SwiftShader recompiles the code stream into native x86.
Intel and AMD haven't open-sourced their "JIT recompiler" GPU software technology.
I'm talking about devs actually programming the software support themselves.
SwiftShader is just some 3rd-party emulation for games with non-native software support.
Again, prior to execution, SwiftShader recompiles the incoming code stream into native x86. Note that ATI's driver optimisation tricks include a "JIT" recompiler.
[QUOTE="yodogpollywog"][QUOTE="ronvalencia"]
Notice "JIT", i.e. the rendering engine includes a "Just In Time" recompiler. Prior to execution, SwiftShader recompiles the code stream into native x86.
Intel and AMD haven't open-sourced their "JIT recompiler" GPU software technology.
ronvalencia
I'm talking about devs actually programming the software support themselves.
SwiftShader is just some 3rd-party emulation for games with non-native software support.
Again, prior to execution, SwiftShader recompiles the incoming code stream into native x86. Note that ATI's driver optimisation tricks include a "JIT" recompiler.
You don't even understand what I'm talking about.
The goddamn developers didn't make SwiftShader, dude.
It's a 3rd-party app.
If Crysis had software support programmed by Crytek themselves, there'd be no need to even run SwiftShader and the performance would have been better.
[QUOTE="ronvalencia"]
[QUOTE="yodogpollywog"]
different than native game support for software mode.
yodogpollywog
Notice "JIT", i.e. the rendering engine includes a "Just In Time" recompiler. Prior to execution, SwiftShader recompiles the code stream into native x86.
Intel and AMD haven't open-sourced their "JIT recompiler" GPU software technology.
I'm talking about devs actually programming the software support themselves.
SwiftShader is just some 3rd-party emulation for games with non-native software support.
A CPU doesn't have (see the depth-test sketch after this list):
1. large register storage spaces compared to ATI/NV GpGPUs.
2. Independent texture units.
3. MSAA hardware.
4. large thread/SMT pool, e.g. to hide latency.
5. native 3DC+ hardware support.
6. native BC6H/BC7 hardware support.
7. Z-buffer instructions.
8. Z-Cull hardware.
9. Early Z-Cull hardware.
10. Fast Log instructions.
11. Fast geometry instructions.
12. Tessellation hardware.
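For points 7-9, here is a rough sketch of the per-fragment work a software renderer has to do in plain CPU code where a GPU has dedicated Z/Z-cull hardware. The function, buffer names and layout are made up for illustration, not taken from any real renderer.
[code]
/* Sketch of per-fragment depth testing done in software (points 7-9 above).
 * A GPU does this in fixed-function Z hardware and can reject whole tiles
 * early; a CPU renderer pays instructions for every fragment, even rejected
 * ones. Buffer layout and names are made up for illustration. */
#include <stdint.h>
#include <stdbool.h>

static bool shade_fragment(float *zbuffer, uint32_t *framebuffer, int width,
                           int x, int y, float depth, uint32_t color)
{
    int idx = y * width + x;
    if (depth >= zbuffer[idx])   /* depth test: fragment is behind what's there */
        return false;            /* rejected, but the CPU still did the work   */
    zbuffer[idx] = depth;        /* depth write */
    framebuffer[idx] = color;    /* color write (no blending/MSAA here)        */
    return true;
}

int main(void)
{
    enum { W = 4, H = 4 };
    float z[W * H];
    uint32_t fb[W * H];
    for (int i = 0; i < W * H; i++) { z[i] = 1.0f; fb[i] = 0; } /* clear */

    shade_fragment(z, fb, W, 1, 2, 0.5f, 0xFFFF0000u);  /* passes the test  */
    shade_fragment(z, fb, W, 1, 2, 0.9f, 0xFF00FF00u);  /* fails, discarded */
    return 0;
}
[/code]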
[QUOTE="yodogpollywog"]
[QUOTE="ronvalencia"]
Notice "JIT", i.e. the rendering engine includes a "Just In Time" recompiler. Prior to execution, SwiftShader recompiles the code stream into native x86.
Intel and AMD haven't open-sourced their "JIT recompiler" GPU software technology.
ronvalencia
I'm talking about devs actually programming the software support themselves.
SwiftShader is just some 3rd-party emulation for games with non-native software support.
CPU doesn't have:
1. large register storage spaces compared to ATI/NV GpGPUs.
2. Independent texture units.
3. MSAA hardware.
4. large thread/SMT pool
5. native 3DC+ hardware support.
6. native BC6H/BC7 hardware support.
7. Z-buffer instructions.
8. Z-Cull hardware.
9. Early Z-Cull hardware.
10. Fast Log instructions.
GT5 uses tessellation, dude, LOL, on the cars for damage.
The PS3's GPU doesn't support tessellation, but it's being done on the CPU.
CPU doesn't have:
[QUOTE="ronvalencia"]
[QUOTE="yodogpollywog"]
I'm talking about devs actually programming the software support themselves.
SwiftShader is just some 3rd-party emulation for games with non-native software support.
yodogpollywog
1. large register storage spaces compared to ATI/NV GpGPUs.
2. Independent texture units.
3. MSAA hardware.
4. large thread/SMT pool
5. native 3DC+ hardware support.
6. native BC6H/BC7 hardware support.
7. Z-buffer instructions.
8. Z-Cull hardware.
9. Early Z-Cull hardware.
10. Fast Log instructions.
GT5 uses tessellation, dude, LOL, on the cars for damage.
The PS3's GPU doesn't support tessellation, but it's being done on the CPU.
And the Cell isn't a true CPU. It works very differently from an OOO x86 processor.
CPU doesn't have:
[QUOTE="ronvalencia"]
[QUOTE="yodogpollywog"]
I'm talking about devs actually programming the software support themselves.
SwiftShader is just some 3rd-party emulation for games with non-native software support.
yodogpollywog
1. large register storage spaces compared to ATI/NV GpGPUs.
2. Independent texture units.
3. MSAA hardware.
4. large thread/SMT pool
5. native 3DC+ hardware support.
6. native BC6H/BC7 hardware support.
7. Z-buffer instructions.
8. Z-Cull hardware.
9. Early Z-Cull hardware.
10. Fast Log instructions.
GT5 uses tessellation, dude, LOL, on the cars for damage.
The PS3's GPU doesn't support tessellation, but it's being done on the CPU.
GT5 uses SPE resources, and you have 6 of them. One wonders why PS3 games still have relatively flat polygon surfaces...
ATI's hardware tessellation doesn't burden ATI's stream processor resources. NVIDIA Fermi has specialised ALUs to handle tessellation/geometry workloads.
Since RSX has vertex shader performance issues, you have to move the vertex workload onto the SPEs. LOL at GT5's 2D trees.
Want to start PS3 vs Xbox 360 vs PC + GeForce 9800 MLAA benchmarks? Hint: PS3 (with 5 SPEs) MLAA comes last.
[QUOTE="locopatho"]How? Let everyone make PC games if they are so awesome in every way? doom1marine
Developers are stupid and buying into the hype that console games sell better, LOL.
You guys seem offended by this. Consoles are just not as good as PCs. It's just a fact. The #1 reason why people choose consoles over PC is because they are simpler. So either they are too stupid to understand or too lazy to put forth the effort to figure it out. Either way, don't make the PC community suffer for someone else's non-efforts.
[QUOTE="yodogpollywog"]
[QUOTE="ronvalencia"]
Notice "JIT", i.e. the rendering engine includes a "Just In Time" recompiler. Prior to execution, SwiftShader recompiles the code stream into native x86.
Intel and AMD haven't open-sourced their "JIT recompiler" GPU software technology.
I'm talking about devs actually programming the software support themselves.
SwiftShader is just some 3rd-party emulation for games with non-native software support.
CPU doesn't have
1. large register storage spaces compared to ATI/NV GpGPUs.
2. Independent texture units.
3. MSAA hardware.
4. large thread/SMT pool, e.g. to hide latency.
5. native 3DC+ hardware support.
6. native BC6H/BC7 hardware support.
7. Z-buffer instructions.
8. Z-Cull hardware.
9. Early Z-Cull hardware.
10. Fast Log instructions.
11. Fast geometry instructions.
12. Tessellation hardware.
A PC doesn't need all that. I remember quite a few people saying that the PS3 was a pain in the ass to develop for. Sounds like a bunch of double talk to me.
Obvious Crytek boss is obvious.
Zzz. Console gaming has always held back PC gaming.
We would have been seeing Crysis-like games on a monthly basis.
IMO, meh. Can't complain that much; console gaming is just another phase in gaming. supdotcom
More and more games are being developed for consoles and then ported to PC; it used to be the other way round. You can only tell how good a PC really is by looking at its exclusives: Crysis, Shogun 2, etc. Too bad really, games in 2012 won't be much better because of consoles; if consoles didn't exist we would be at least 5 years ahead, lol.
[QUOTE="firefluff3"][QUOTE="doom1marine"] Console games selling better is BS, dude, more people have PCs. Back in the day people didn't need to buy graphics cards to play PC games; they were CPU/software driven. Stuff needs to switch back to that. TerrorRizzing
11k people on Fallout: New Vegas (it's a Steam game, yes?), very low; 600k on Black Ops at the moment on Xbox 360, 73k on PC (Steam).
Game quality may be better and mods are great, but nobody can deny consoles give them more money.
Same argument every time, sigh... How many are playing StarCraft 2, WoW and other MMOs? Do those games not count now? Counter-Strike sold 100,000 copies last month... it doesn't take a genius to figure out console gamers are stuck on the same 2-3 games at any given time. Let's face it, Crytek are in bed with Microsoft and believe their trash now... Crysis sold well enough to compete with virtually any console game other than Halo or Call of Duty, but Crytek thought if they were on console they would have sold 20 million copies just because in their minds they are the best and would outsell Call of Duty. Just like Rockstar are in bed with Sony.
[QUOTE="ronvalencia"][QUOTE="yodogpollywog"]
I'm talking about devs actually programming the software support themselves.
SwiftShader is just some 3rd-party emulation for games with non-native software support.
DudeNtheRoom
CPU doesn't have
1. large register storage spaces compared to ATI/NV GpGPUs.
2. Independent texture units.
3. MSAA hardware.
4. large thread/SMT pool, e.g. to hide latency.
5. native 3DC+ hardware support.
6. native BC6H/BC7 hardware support.
7. Z-buffer instructions.
8. Z-Cull hardware.
9. Early Z-Cull hardware.
10. Fast Log instructions.
11. Fast geometry instructions.
12. Tessellation hardware.
A PC doesn't need all that. I remember quite a few people saying that the PS3 was a pain in the ass to develop for. Sounds like a bunch of double talk to me.
A modern gaming PC needs the above points.
For point 1, e.g. the NV GeForce GTX 260/280 has 1.8 megabytes of register storage space, and the AMD Radeon HD 4850/4870 has 2.5 megabytes of register storage space. CELL only has about 14 kilobytes of register space. A processor's registers are the fastest known memory technology.
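For what it's worth, the "~14 kilobytes" figure can be sanity-checked with back-of-the-envelope arithmetic, assuming the commonly quoted CELL layout of 7 usable SPEs, each with 128 registers of 128 bits. The GPU figures above are taken from the post as-is, not recomputed.
[code]
/* Back-of-the-envelope check of the ~14 KB CELL register-space figure.
 * Assumes 7 usable SPEs, each with a 128-entry file of 128-bit registers;
 * the PPE's registers and the GPU figures quoted above are not recomputed. */
#include <stdio.h>

int main(void)
{
    const int spes          = 7;    /* assumed number of usable SPEs   */
    const int regs_per_spe  = 128;  /* architectural registers per SPE */
    const int bytes_per_reg = 16;   /* 128-bit-wide registers          */

    int total = spes * regs_per_spe * bytes_per_reg;
    printf("SPE register files: %d bytes (~%d KB)\n", total, total / 1024);
    /* Prints: SPE register files: 14336 bytes (~14 KB) */
    return 0;
}
[/code]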
Proper Early-Z hardware was added with NVIDIA's G80.
ATI/NVIDIA DX11 GpGPUs include the above points.
I'm using SwiftShader 2.0 in Crysis now, low settings, 800x600.
If Crytek actually programmed the software support, it would have run faster.
You don't even understand what I'm talking about.
The goddamn developers didn't make SwiftShader, dude.
It's a 3rd-party app.
If Crysis had software support programmed by Crytek themselves, there'd be no need to even run SwiftShader and the performance would have been better.
yodogpollywog
You don't even understand what a JIT recompiler is. The purpose of a JIT recompiler is to convert non-native code into native code prior to execution. This is different from old-school interpretive emulators.
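A toy contrast between the two approaches, with a made-up two-opcode "bytecode" (OP_ADD/OP_MUL are inventions for this sketch, not any real shader ISA): an interpreter pays a decode-and-dispatch branch per instruction every time the program runs, while a JIT pays a one-off translation cost and then runs straight-line native code.
[code]
/* Toy illustration of interpretation vs JIT-compiled execution. The bytecode
 * format (OP_ADD/OP_MUL) is invented for this sketch, not any real ISA. */
#include <stdio.h>

enum { OP_ADD, OP_MUL, OP_END };

/* Interpreter: decode and dispatch on every instruction, every run. */
static int interpret(const int *bytecode, int acc)
{
    for (;;) {
        switch (*bytecode++) {
        case OP_ADD: acc += *bytecode++; break;
        case OP_MUL: acc *= *bytecode++; break;
        case OP_END: return acc;
        }
    }
}

/* What a JIT would have emitted for the same program: straight-line native
 * code with no per-instruction dispatch overhead. */
static int compiled(int acc) { return (acc + 3) * 5; }

int main(void)
{
    int program[] = { OP_ADD, 3, OP_MUL, 5, OP_END };
    printf("interpreted: %d\n", interpret(program, 2)); /* 25 */
    printf("compiled:    %d\n", compiled(2));           /* 25 */
    return 0;
}
[/code]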
It doesn't matter.
SwiftShader 2.0 would run Unreal Tournament 1999 slower than its native software support mode.
I'm getting 2 fps at 800x600 in Crysis, low settings, software mode, using SwiftShader 2.0. That's just bad programming.
yodogpollywog
The AMD K8 Athlon has half-baked SIMD units, i.e. it doesn't have proper SSE128 hardware. It doesn't have the compute resources compared to the AMD Radeon HD 5450 (80 stream processors + GPU hardware). The low-end Radeon HD 5165 has 320 stream processors (a renamed Radeon HD 4650). Btw, GPUs include ROP (raster operations) hardware, i.e. a CPU has to emulate these functions.
SwiftShader 3.0 has been released, btw.
Anyway, both Intel and AMD are heading toward CGPUs, i.e. like the FPU and vector co-processors before it, the GpGPU will be assimilated.
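As an example of the ROP work a CPU has to emulate, here is classic src-over alpha blending done per pixel in plain C. On a GPU this happens in the raster back-end for every written pixel; in a software renderer it costs CPU instructions. The 0xAARRGGBB pixel layout is an assumption for this sketch.
[code]
/* One piece of ROP work -- src-over alpha blending -- done in software.
 * out = src*a + dst*(1 - a), per channel. 0xAARRGGBB layout is assumed. */
#include <stdint.h>
#include <stdio.h>

static uint32_t blend_over(uint32_t src, uint32_t dst)
{
    uint32_t a = src >> 24;              /* source alpha, 0..255 */
    uint32_t out = 0;
    for (int shift = 0; shift < 24; shift += 8) {
        uint32_t s = (src >> shift) & 0xFF;
        uint32_t d = (dst >> shift) & 0xFF;
        uint32_t c = (s * a + d * (255 - a)) / 255;
        out |= c << shift;
    }
    return out | (0xFFu << 24);          /* keep the destination opaque */
}

int main(void)
{
    /* 50%-transparent red over opaque blue. */
    printf("0x%08X\n", (unsigned)blend_over(0x80FF0000u, 0xFF0000FFu));
    return 0;
}
[/code]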
[QUOTE="yodogpollywog"]The AMD K8 Athlon has half-baked SIMD units, i.e. it doesn't have proper SSE128 hardware. It doesn't have the compute resources compared to the AMD Radeon HD 5450 (80 stream processors + GPU hardware). The low-end Radeon HD 5165 has 320 stream processors (a renamed Radeon HD 4650). SwiftShader 3.0 has been released, btw.
It doesn't matter.
SwiftShader 2.0 would run Unreal Tournament 1999 slower than its native software support mode.
I'm getting 2 fps at 800x600 in Crysis, low settings, software mode, using SwiftShader 2.0. That's just bad programming.
ronvalencia
You don't seem to be getting it: Unreal Tournament 1999 supports software rendering... EPIC GAMES CREATED THE SOFTWARE RENDERER.
http://i55.tinypic.com/3324f14.jpg
See "software rendering" in the picture of the UT1999 menu? I circled it for you.
If Crytek had programmed a built-in software renderer, it would have performed better than SwiftShader 2.0 trying to run Crysis in software mode.
[QUOTE="ronvalencia"][QUOTE="yodogpollywog"]
It doesn't matter.
SwiftShader 2.0 would run Unreal Tournament 1999 slower than its native software support mode.
I'm getting 2 fps at 800x600 in Crysis, low settings, software mode, using SwiftShader 2.0. That's just bad programming.
The AMD K8 Athlon has half-baked SIMD units, i.e. it doesn't have proper SSE128 hardware. It doesn't have the compute resources compared to the AMD Radeon HD 5450 (80 stream processors + GPU hardware). The low-end Radeon HD 5165 has 320 stream processors (a renamed Radeon HD 4650). SwiftShader 3.0 has been released, btw.
You don't seem to be getting it: Unreal Tournament 1999 supports software rendering... EPIC GAMES CREATED THE SOFTWARE RENDERER.
http://i55.tinypic.com/3324f14.jpg
See "software rendering" in the picture of the UT1999 menu? I circled it for you.
If Crytek had programmed a built-in software renderer, it would have performed better than SwiftShader 2.0 trying to run Crysis in software mode.
So? Both are attempting to do raster-based 3D rendering.
The AMD K8 Athlon has half-baked SIMD units, i.e. it doesn't have proper SSE128 hardware. It doesn't have the compute resources compared to the AMD Radeon HD 5450 (80 stream processors + GPU hardware). The low-end Radeon HD 5165 has 320 stream processors (a renamed Radeon HD 4650). SwiftShader 3.0 has been released, btw.
[QUOTE="ronvalencia"][QUOTE="yodogpollywog"]
It doesn't matter.
SwiftShader 2.0 would run Unreal Tournament 1999 slower than its native software support mode.
I'm getting 2 fps at 800x600 in Crysis, low settings, software mode, using SwiftShader 2.0. That's just bad programming.
yodogpollywog
You don't seem to be getting it: Unreal Tournament 1999 supports software rendering... EPIC GAMES CREATED THE SOFTWARE RENDERER.
http://i55.tinypic.com/3324f14.jpg
See "software rendering" in the picture of the UT1999 menu? I circled it for you.
If Crytek had programmed a built-in software renderer, it would have performed better than SwiftShader 2.0 trying to run Crysis in software mode.
Are you serious? Software rendering through the CPU is a dumb idea in 2010. It was fine in the mid-90s or earlier, since PCs didn't have dedicated GPUs, just video memory, and the CPU handled it all. Today's GPUs are a heck of a lot faster than any CPU that is out; this is why we're seeing a large shift into parallel processing performed by GPUs, not CPUs. And why do you keep bringing up old games that used software rendering? Yes, we get it, it can be used today, but you will see a major drop in performance and abilities since, as I stated above, GPUs process data a lot faster than CPUs. So why even bother using it?
[QUOTE="yodogpollywog"]
[QUOTE="ronvalencia"] The AMD K8 Athlon has half-baked SIMD units, i.e. it doesn't have proper SSE128 hardware. It doesn't have the compute resources compared to the AMD Radeon HD 5450 (80 stream processors + GPU hardware). The low-end Radeon HD 5165 has 320 stream processors (a renamed Radeon HD 4650). SwiftShader 3.0 has been released, btw. 04dcarraher
You don't seem to be getting it: Unreal Tournament 1999 supports software rendering... EPIC GAMES CREATED THE SOFTWARE RENDERER.
http://i55.tinypic.com/3324f14.jpg
See "software rendering" in the picture of the UT1999 menu? I circled it for you.
If Crytek had programmed a built-in software renderer, it would have performed better than SwiftShader 2.0 trying to run Crysis in software mode.
Are you serious? Software rendering through the CPU is a dumb idea in 2010. It was fine in the mid-90s or earlier, since PCs didn't have dedicated GPUs, just video memory, and the CPU handled it all. Today's GPUs are a heck of a lot faster than any CPU that is out; this is why we're seeing a large shift into parallel processing performed by GPUs, not CPUs. And why do you keep bringing up old games that used software rendering? Yes, we get it, it can be used today, but you will see a major drop in performance and abilities since, as I stated above, GPUs process data a lot faster than CPUs. So why even bother using it?
I bet if Crytek programmed a software renderer, a quad-core could run Crysis on max settings better than the best integrated graphics chip on the market.
The mods here think I'm trolling.
Dude, my Intel 466 MHz Celeron could run Unreal Tournament 1999 in software mode, low settings, 800x600, at higher than 2 fps back in 1999.
Me only getting 2 fps using SwiftShader on an Athlon 2.1 GHz dual-core at 800x600, low settings, in Crysis is total BS.
The reason the fps is so low is that it's not built into Crysis, and Crytek didn't make the software renderer.
The mods here think I'm trolling.
Dude, my Intel 466 MHz Celeron could run Unreal Tournament 1999 in software mode, low settings, 800x600, at higher than 2 fps back in 1999.
Me only getting 2 fps using SwiftShader on an Athlon 2.1 GHz dual-core at 800x600, low settings, in Crysis is total BS.
The reason the fps is so low is that it's not built into Crysis, and Crytek didn't make the software renderer.
yodogpollywog
I'm frankly not sure what you are trying to accomplish here. Intel had an idea with Larrabee. Much like the Cell, it was a hybrid CPU/GPU solution meant to replace the GPU with one chip.
That thing had as many as 24 cores running at 2.4 GHz, if I remember correctly. The project got canceled, and why was that, you ask?
It was a power-hungry behemoth that couldn't even beat a 7800 GTX in graphics-heavy tasks, that's why.
Why do you think Sony bailed out at the last second, replacing one Cell with a 7900 GTX? Because with two Cells the PS3 would have been slapped all over the place by the 360.
[QUOTE="yodogpollywog"]
The mods here think I'm trolling.
Dude, my Intel 466 MHz Celeron could run Unreal Tournament 1999 in software mode, low settings, 800x600, at higher than 2 fps back in 1999.
Me only getting 2 fps using SwiftShader on an Athlon 2.1 GHz dual-core at 800x600, low settings, in Crysis is total BS.
The reason the fps is so low is that it's not built into Crysis, and Crytek didn't make the software renderer.
fireballonfire
I'm frankly not sure what you are trying to accomplish here. Intel had an idea with Larrabee. Much like the Cell, it was a hybrid CPU/GPU solution meant to replace the GPU with one chip.
That thing had as many as 24 cores running at 2.4 GHz, if I remember correctly. The project got canceled, and why was that, you ask?
It was a power-hungry behemoth that couldn't even beat a 7800 GTX in graphics-heavy tasks, that's why.
Why do you think Sony bailed out at the last second, replacing one Cell with a 7900 GTX? Because with two Cells the PS3 would have been slapped all over the place by the 360.
Of course developers should support high-end video cards, since they're better for gaming than a CPU, but integrated graphics processors are always weak, and software rendering, if programmed right, might actually run games better than these IGPs.
[QUOTE="locopatho"]How? Let everyone make PC games if they are so awesome in every way? doom1marine
Developers are stupid and buying into the hype that console games sell better, LOL.
LOL, hype? Um, no, it's just a simple fact: they know where they make their money, and PC gaming isn't it.
Doom 3: "The game was a critical and commercial success for id Software; with more than 3.5 million copies of the game sold, it is the most successful game by the developer to date."
I wonder if they're counting Xbox sales?
"Doom was released as shareware, with people encouraged to distribute it further. They did so: in 1995, Doom was estimated to have been installed on more than 10 million computers. Although most users did not purchase the registered version, over one million copies have been sold."
Some shareware was basically a demo you had to pay like $5 for, or something.
He summarized it pretty well. Consoles may be holding back PCs technically, but consoles are also helping to fund these games' very existence. You gotta take the good with the bad.
This wouldn't normally be a problem if you assume consoles operate on a 6 year lifecycle and always bring with their next iterations vast hardware improvements. But that model of console development has been under attack by the rise of the Wii, Move and Kinect.
If consoles do not make drastic technical improvements in the next several years, then multiplat PC games will stagnate technically -- unless devs are willing to put in extra effort into making a superior PC title. Why bother, though, when the PC version is usually the worst-selling of the three versions (PC/360/PS3)?
Are you serious? Software rendering through the CPU is a dumb idea in 2010. It was fine in the mid-90s or earlier, since PCs didn't have dedicated GPUs, just video memory, and the CPU handled it all. Today's GPUs are a heck of a lot faster than any CPU that is out; this is why we're seeing a large shift into parallel processing performed by GPUs, not CPUs. And why do you keep bringing up old games that used software rendering? Yes, we get it, it can be used today, but you will see a major drop in performance and abilities since, as I stated above, GPUs process data a lot faster than CPUs. So why even bother using it?
[QUOTE="04dcarraher"]
[QUOTE="yodogpollywog"]
You don't seem to be getting it: Unreal Tournament 1999 supports software rendering... EPIC GAMES CREATED THE SOFTWARE RENDERER.
http://i55.tinypic.com/3324f14.jpg
See "software rendering" in the picture of the UT1999 menu? I circled it for you.
If Crytek had programmed a built-in software renderer, it would have performed better than SwiftShader 2.0 trying to run Crysis in software mode.
yodogpollywog
I bet if Crytek programmed a software renderer, a quad-core could run Crysis on max settings better than the best integrated graphics chip on the market.
It would not. Run a benchmark test between Quake 2's software renderer and Quake 2 in OpenGL (Radeon HD 4200 IGP).
An Intel Core i7 quad has 8 128-bit SSE ADDs and 4 128-bit SSE MULs. For 32-bit data formats, that's effectively 32 32-bit SSE ADDs and 16 32-bit SSE MULs, i.e. effectively 16 32-bit *MADs* plus 16 32-bit ADDs. The x86 instruction operand format is two operands. There's an SSE4 instruction with 3 operands, i.e. it uses the result slot as the 3rd operand.
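To make the two-operand point concrete, here is a 4-wide multiply-add with plain SSE intrinsics: without fused multiply-add it takes two instructions (mulps then addps), each overwriting one of its inputs, whereas a GPU stream processor issues the same MAD as a single three-operand operation. This is just a sketch; compile with SSE enabled on any x86 compiler.
[code]
/* 4-wide a*b + c with plain SSE: two instructions (mulps + addps) and
 * destructive two-operand encodings, vs a GPU's single 3-operand MAD. */
#include <stdio.h>
#include <xmmintrin.h>   /* SSE intrinsics */

int main(void)
{
    __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);  /* lanes: 1, 2, 3, 4 */
    __m128 b = _mm_set1_ps(2.0f);
    __m128 c = _mm_set1_ps(10.0f);

    /* One MAD's worth of work costs two x86 SSE instructions. */
    __m128 mad = _mm_add_ps(_mm_mul_ps(a, b), c);

    float out[4];
    _mm_storeu_ps(out, mad);
    printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]); /* 12 14 16 18 */
    return 0;
}
[/code]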
The AMD Radeon HD 4200 IGP has 40 32-bit MAD units (stream processors) + GPU hardware, e.g. 8 ROPs with MSAA, a command processor, Early Z test, Re-Z, Z-range optimization, Fast Z Clear, a programmable tessellation unit, DXTC and 3Dc+ texture compression, an accelerated geometry shader path, adaptive anisotropic filtering, bicubic filtering, Percentage Closer Filtering, etc. A stream processor's operand format is at least 3.
We are not even factoring in the GPU's texture units vs the CPU's load-store units.
A large chunk of the Core i7 quad's compute power will be used for "GPU" emulation features.
Crysis?? Try the Final Fantasy 14 benchmark for a performance killer. PS: the FF14 benchmark runs on SwiftShader 3.0 at 1 to 2 FPS on my Core i7 quad (Turbo mode at 1.86 GHz with 8 threads at 90 percent) and scored 61 points. My AMD Mobility Radeon HD 5730 scores 1892 points.
[QUOTE="yodogpollywog"]
The mods here think I'm trolling.
Dude, my Intel 466 MHz Celeron could run Unreal Tournament 1999 in software mode, low settings, 800x600, at higher than 2 fps back in 1999.
Me only getting 2 fps using SwiftShader on an Athlon 2.1 GHz dual-core at 800x600, low settings, in Crysis is total BS.
The reason the fps is so low is that it's not built into Crysis, and Crytek didn't make the software renderer.
I'm frankly not sure what you are trying to accomplish here. Intel had an idea with Larrabee. Much like the Cell, it was a hybrid CPU/GPU solution meant to replace the GPU with one chip.
That thing had as many as 24 cores running at 2.4 GHz, if I remember correctly. The project got canceled, and why was that, you ask?
It was a power-hungry behemoth that couldn't even beat a 7800 GTX in graphics-heavy tasks, that's why.
Why do you think Sony bailed out at the last second, replacing one Cell with a 7900 GTX? Because with two Cells the PS3 would have been slapped all over the place by the 360.
One problem: CELL doesn't have hardware "GPU" features, i.e. instant programmable compute resource consumption.
The mods here think I'm trolling.
Dude, my Intel 466 MHz Celeron could run Unreal Tournament 1999 in software mode, low settings, 800x600, at higher than 2 fps back in 1999.
Me only getting 2 fps using SwiftShader on an Athlon 2.1 GHz dual-core at 800x600, low settings, in Crysis is total BS.
The reason the fps is so low is that it's not built into Crysis, and Crytek didn't make the software renderer.
yodogpollywog
Your AMD K8 Athlon would not even beat a Radeon HD 4200 IGP in SGEMM benchmarks (a matrix math benchmark). The Radeon HD 4870 (800 SPUs) has SGEMM results reaching 1 TFLOPS (which is near its ~1.2 TFLOPS design limit). The Radeon HD 5870 (1600 SPUs) doubles that score. The Radeon HD 4200 has 40 SPUs, which is 20 times fewer than the Radeon HD 4870.
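For reference, SGEMM is just single-precision matrix multiply, roughly 2*N^3 floating-point operations for N x N matrices. The textbook triple loop below is only a sketch of what the benchmark counts, not an optimized BLAS kernel.
[code]
/* Naive single-precision matrix multiply (SGEMM): C = A * B for n x n
 * matrices, ~2*n^3 FLOPs. Textbook triple loop, not an optimized kernel;
 * it only shows the kind of work the FLOPS figures above are counting. */
#include <stddef.h>

void sgemm_naive(size_t n, const float *A, const float *B, float *C)
{
    for (size_t i = 0; i < n; i++) {
        for (size_t j = 0; j < n; j++) {
            float acc = 0.0f;
            for (size_t k = 0; k < n; k++)
                acc += A[i * n + k] * B[k * n + j];  /* one MAD per step */
            C[i * n + j] = acc;
        }
    }
}

int main(void)
{
    float A[4] = { 1, 2, 3, 4 }, B[4] = { 5, 6, 7, 8 }, C[4];
    sgemm_naive(2, A, B, C);   /* C = {19, 22, 43, 50} */
    return 0;
}
[/code]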
An ATI Stream (GpGPU) version of Crysis would be even faster on Radeon HDs.
SwiftShader 3.0 includes JIT optimizations, e.g. instruction combining, control flow simplification, dead code elimination, etc., i.e. similar tricks to the Radeon HD's driver-side JIT optimizations.
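To give a flavor of "instruction combining", here is a toy peephole pass over a made-up three-operand IR that fuses a MUL feeding the next ADD into a single MAD. Real JIT optimizers (SwiftShader's or a GPU driver's) work on real IRs and do much more; this is only a sketch of the idea.
[code]
/* Toy peephole pass over an invented IR: a MUL followed by an ADD that
 * consumes its result is fused into one MAD. Only a sketch of the
 * "instruction combining" idea, not how any real JIT is structured. */
#include <stdio.h>

typedef enum { MUL, ADD, MAD } Opcode;
typedef struct { Opcode op; int dst, a, b, c; } Instr; /* dst = a op b [+ c] */

static int combine_mul_add(Instr *code, int n)
{
    int out = 0;
    for (int i = 0; i < n; i++) {
        if (i + 1 < n && code[i].op == MUL && code[i + 1].op == ADD &&
            code[i + 1].a == code[i].dst) {
            /* rX = a*b ; rY = rX + c   ==>   rY = a*b + c */
            Instr fused = { MAD, code[i + 1].dst, code[i].a, code[i].b,
                            code[i + 1].b };
            code[out++] = fused;
            i++;                        /* skip the ADD we just folded in */
        } else {
            code[out++] = code[i];
        }
    }
    return out;                         /* new instruction count */
}

int main(void)
{
    Instr prog[] = {
        { MUL, 2, 0, 1, 0 },            /* r2 = r0 * r1 */
        { ADD, 3, 2, 4, 0 },            /* r3 = r2 + r4 */
    };
    printf("instructions after combining: %d\n", combine_mul_add(prog, 2));
    return 0;
}
[/code]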
It's true that consoles hold back the PC technically in RPGs and action games, but the constant graphics race in the past killed most PC-centric genres, and now that those games have switched to being multiplats, those more niche genres are enjoying a resurgence on PC, so for me the change has been for the better.
Static console hardware helps the PC laptop market.
It's true that consoles hold back the PC technically in RPGs and action games, but the constant graphics race in the past killed most PC-centric genres, and now that those games have switched to being multiplats, those more niche genres are enjoying a resurgence on PC, so for me the change has been for the better.
AdrianWerner