[QUOTE="ronvalencia"]
[QUOTE="MK-Professor"]
no
http://www.tomshardware.co.uk/charts/desktop-vga-charts-2006/3DMark06-v1.0.2,584.html even in 3DMark06 the 7800 GTX is still slightly faster.
Back to my initial argument: a prehistoric ATI X1950 Pro can run games like Crysis 2 ( link ) with slightly better graphics and performance than the console version (therefore the 7800 GTX shouldn't have a problem playing this game at similar settings).
MK-Professor
No,
Crysis 2 + Geforce 7800 GTX = fail http://www.youtube.com/watch?v=klgVCk178OI

In every other benchmark the GeForce 7800 GTX was ahead of the X1800 XT or X1950 Pro.
You didn't read my post from Beyond3D about modern shader programs in games.
The GeForce 7800 GTX didn't win:
1. Oblivion PC (one of many Gamebryo engine games), since it's unable to combine MSAA with FP HDR. That limitation carried over to the PS3. A next-gen game engine for the Xbox 360 era.
2. Mass Effect, i.e. a game that uses Unreal Engine 3 (deferred shading for lights). A next-gen game engine for the Xbox 360 era.
3. Assassin's Creed. A next-gen game engine for the Xbox 360 era.
4. It looks like it wouldn't win Crysis 2 (deferred renderer for lights) either.
Notice that the games mentioned have a shader bias.
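To see why deferred renderers are shader-bound, here's a rough back-of-the-envelope sketch: deferred shading computes lighting in screen space, so per-frame pixel-shader work scales with resolution times light count rather than with scene geometry. The numbers below (resolution, light count, ops per light) are illustrative assumptions, not measurements from any of the games above.

```python
# Illustrative arithmetic only -- all figures are assumed, not benchmarked.
width, height = 1280, 720   # assumed render-target resolution
lights = 16                 # assumed number of dynamic lights
ops_per_light = 40          # assumed pixel-shader ops per light per pixel

pixels = width * height
# Deferred shading runs the lighting shader once per light per screen pixel,
# regardless of how much geometry is in the scene.
lighting_ops = pixels * lights * ops_per_light
print(f"{lighting_ops / 1e9:.2f} billion shader ops per frame for lighting alone")
```

Even with these modest assumed numbers the lighting pass alone is hundreds of millions of shader operations per frame, which is why architectures with stronger sustained shader throughput pull ahead in such engines.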
There are plenty of games that use Unreal Engine 3, e.g. Mirror's Edge, Gears of War, Mass Effect 2, The Last Remnant, Unreal Tournament 3, Tom Clancy's Rainbow Six: Vegas, etc. http://en.wikipedia.org/wiki/List_of_Unreal_Engine_games#Unreal_Engine_3
The move towards the Xbox 360 era plays to ATI's PC DX9 GPU strengths, i.e. it reduces the influence of NVIDIA's "The Way It's Meant To Be Played" DX9 workarounds, which were not sustainable in the long run.
With NVIDIA's GeForce 8800 (Nov 2006), NV could abandon its DX9 "The Way It's Meant To Be Played" workarounds. The PC doesn't have a CELL to "patch" a GeForce 7 type GPU.
One can't say the GeForce 7800/7900 is a better GPU than the competition, since it clearly doesn't show sustained compute performance. The GeForce 7's 32-bit FP issues remind me of the GeForce FX's "16-bit FP compute is good enough" mentality.
A 15-percent-market-share GPU vendor (NVIDIA) wouldn't have the clout for "The Way It's Meant To Be Played" in the next-gen Wii U/Xbox 720/PS4 consoles, up against a 25-percent GPU vendor (AMD).
PS: In mobile phone GPUs, Qualcomm's Adreno 320 uses AMD's Xenos technology, and Qualcomm's Snapdragon S4 is murdering NVIDIA's Tegra 3 in unit sales.