alanorman / Member


Game engine optimization

This matter has been discussed many times over the years. The notion of game engine optimization (GEO) has been around since the first generation of graphics accelerators was put on the market.

Never (or almost never) before had people discussed their computer specs, run various benchmarks, or checked Internet forums for gameplay opinions the way they do now. Over the years CPUs became far more powerful, and video cards evolved in parallel. Video game creators were given a completely new environment to create games for.

But along came indolence. The coders thought they no longer had to care about script optimization, because PCs were so powerful that the game engine code didn't need any kind of tweaking or improving. Reality shows us how wrong they were. Games like TES IV: Oblivion (high demand for top-level video cards and CPUs), Boiling Point (a disastrously bugged engine resulting in constant gameplay glitches and low performance) and DMoMM, the latest installment in the great Might&Magic series, show us just how indolent programmers can be.

Dark Messiah of Might and Magic is believed to be the greatest and most astonishing example of the Source engine's capabilities. That's probably true, but AT WHAT COST?! The hardware requirements were set way too high. I was able to play Half-Life 2 quite smoothly on my rig (AthlonXP 2600+, 1GB RAM, GF6800GT 256MB RAM) at 1280x1024 with graphics detail at almost maximum settings. But playing DM with a smooth framerate is possible only at 1024x768 with details set to medium (e.g. anisotropic filtering and textures on medium, like most other options).

Can anyone point out to me where this 'improvement' in the Source engine is? Where did they hide it? How much do I have to pay for hardware that will let me see this improvement?! This is sick!

Waiting for your opinions,
Alan