[QUOTE="reach3"]I already proved the 360 has the best version of Oblivion. It's your turn to prove The Witcher 2 can run on a 7800.[/QUOTE]You haven't proven the 360 has the best Oblivion build, i.e. an X1800 XT can run "Chuck patch" Oblivion.
This topic is locked from further discussion.
[QUOTE="Bebi_vegeta"]It would be pretty sad if the X360 couldn't play games like Halo 4 or The Witcher 2.[/QUOTE]
[QUOTE="dakan45"]Yet they can. And a 7800 card can't. Now that is what's sad...[/QUOTE]
[QUOTE="jokeisgames2012"]Did you not notice the low-resolution texture in the top right corner? Lower res than Quake 1 sh*t.[/QUOTE]
If that looks like Quake 1, then what do FEAR and Prey look like? Because Halo 4 and even Perfect Dark Zero stomp all over them.
[QUOTE="jokeisgames2012"]Did you not notice the low-resolution texture in the top right corner? Lower res than Quake 1 sh*t.[/QUOTE]
[QUOTE="reach3"]If that looks like Quake 1, then what do FEAR and Prey look like? Because Halo 4 and even Perfect Dark Zero stomp all over them.[/QUOTE]
It's probably a bad sign when even your bullshots don't look that good.
[QUOTE="reach3"]If that looks like Quake 1, then what do FEAR and Prey look like? Because Halo 4 and even Perfect Dark Zero stomp all over them.[/QUOTE]Quake looks better than all of them; it has something called atmosphere. They're all blocky polygons.
[QUOTE="lostrib"][QUOTE="reach3"]If that looks like Quake 1, then what do FEAR and Prey look like? Because Halo 4 and even Perfect Dark Zero stomp all over them.[/QUOTE]It's probably a bad sign when even your bullshots don't look that good.[/QUOTE]
I didn't post bullshots, you just can't believe it looks that good.
[QUOTE="reach3"]I didn't post bullshots, you just can't believe it looks that good.[/QUOTE]I swear to god it's unimpressive.
[QUOTE="lostrib"]I swear to god it's unimpressive.[/QUOTE]Show me a PC game before Crysis that looks better.
[QUOTE="SPBoss"]This is a load of crap because they will not be able to sufficiently cool a 7990 in a console-sized enclosure lol[/QUOTE]That's why video cards suck and software rendering is better: CPUs don't run as hot and aren't limited by a specific version of DirectX.
[QUOTE="jokeisgames2012"]Shadow maps look like sh1t, the edges are pixelated; Doom 3 interiors look better. Hopefully they finally caught up with Doom 3's lighting.[/QUOTE]Are you an alt of reach? You seem convenient: a new account, with your silly pro-PC act in the midst of his anti-PC one.
yodogpollywog?
[QUOTE="Inconsistancy"]Are you an alt of reach? You seem convenient: a new account, with your silly pro-PC act in the midst of his anti-PC one.[/QUOTE]It looks like Halo: Combat Evolved with slightly more polygons, and the shadow maps look like sh1t.
[QUOTE="jokeisgames2012"][QUOTE="SPBoss"]This is a load of crap because they will not be able to sufficiently cool a 7990 in a console-sized enclosure lol[/QUOTE]That's why video cards suck and software rendering is better: CPUs don't run as hot and aren't limited by a specific version of DirectX.[/QUOTE]
1. SwiftShader 3.0's CPU performance says hi. Both SwiftShader 3.0 and AMD's driver stack use LLVM (Low Level Virtual Machine) JIT recompilers.
2. CPUs usually have fewer ALUs than a "fat" GPU.
3. x86 CPUs have their own extensions, e.g. the incoming AVX2 SIMD. Without OpenCL, existing applications can't use the AVX1/AVX2 extensions. AMD's OpenCL CPU driver is faster than Intel's OpenCL CPU driver, and existing applications can use AVX1 and, soon, AVX2. OpenCL has its own version targets.
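To put the "CPUs have fewer ALUs than a fat GPU" point in rough numbers, a back-of-the-envelope peak-throughput comparison can be sketched like this. The core counts, SIMD widths and clocks below are illustrative era-appropriate assumptions, not any vendor's actual specs:

```python
def peak_gflops(alus, ops_per_alu_per_cycle, clock_ghz):
    """Theoretical peak GFLOPS = ALUs x flops/ALU/cycle x clock (GHz)."""
    return alus * ops_per_alu_per_cycle * clock_ghz

# Assumed quad-core CPU: 8-wide AVX fp32 lanes per core,
# dual-issued add + multiply (2 flops per lane per cycle), 3.0 GHz
cpu = peak_gflops(alus=4 * 8, ops_per_alu_per_cycle=2, clock_ghz=3.0)

# Assumed "fat" GPU: 1536 shader ALUs, one FMA (2 flops) per cycle, 925 MHz
gpu = peak_gflops(alus=1536, ops_per_alu_per_cycle=2, clock_ghz=0.925)

print(cpu, gpu, gpu / cpu)  # the GPU's ALU count dominates the comparison
```

Even with the CPU's SIMD lanes counted generously, the sheer ALU count puts the hypothetical GPU roughly an order of magnitude ahead on raw peak throughput, which is the gap a software renderer like SwiftShader has to fight.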
[QUOTE="ronvalencia"]1. SwiftShader 3.0's CPU performance says hi. Both SwiftShader 3.0 and AMD's driver stack use LLVM JIT recompilers. 2. CPUs usually have fewer ALUs than a "fat" GPU. 3. x86 CPUs have their own extensions, e.g. the incoming AVX2 SIMD. Without OpenCL, existing applications can't use the AVX1/AVX2 extensions. AMD's OpenCL CPU driver is faster than Intel's OpenCL CPU driver, and existing applications can use AVX1 and, soon, AVX2.[/QUOTE]SwiftShader isn't a true software renderer; it emulates a DirectX API made for video cards.
SwiftShader has similar LLVM recompiler technology to that in AMD's GPU drivers.
[QUOTE="ronvalencia"]SwiftShader has similar LLVM recompiler technology to that in AMD's GPU drivers.[/QUOTE]SwiftShader is not built into Crysis; it's a third-party app that tries to emulate DirectX 9 with a CPU. THAT'S NOT TRUE SOFTWARE RENDERING STUPID FOOK
[QUOTE="Jebus213"][QUOTE="reach3"]While 2005 + 2006 PCs can't.[/QUOTE]Show me a console game that can do 1080p and 60 FPS with Halo 4's graphics. I don't even see how that's even relevant anyway.[/QUOTE]Completely relevant, really. It means the 360 was graphics king until Crysis matched it. Yet hermits refuse to admit it (except Wasdie, who admitted it; he is the only smart hermit here).
[QUOTE="reach3"]Completely relevant, really. The 360 was graphics king until Crysis matched it. Yet hermits refuse to admit it (except Wasdie, who admitted it; he is the only smart hermit here).[/QUOTE]Look at any multiplat game from 2005-2007 :roll: the console versions looked worse.
[QUOTE="jokeisgames2012"][QUOTE="ronvalencia"]SwiftShader has similar LLVM recompiler technology to that in AMD's GPU drivers.[/QUOTE]SwiftShader is not built into Crysis; it's a third-party app that tries to emulate DirectX 9 with a CPU. THAT'S NOT TRUE SOFTWARE RENDERING STUPID FOOK[/QUOTE]
As an example, a good modular program can have a function call that changes a color value, and that call can be targeted at a CPU backend or a GPU backend.
Swiftshader 3.0 *is* a DX9c software renderer with an x86 CPU backend.
AMD's GPU doesn't directly execute the Direct3D API, i.e. you have to hit AMD's LLVM layer before you hit the GPU's ISA. AMD's LLVM layer manages the changes across the various AMD GPU ISAs, e.g. Cypress's VLIW5, Cayman's VLIW4 and GCN's SIMD.
SwiftShader has similar LLVM software tech but targets x86 CPUs; it could be modified to target Intel's Xeon Phi.
AMD's OpenCL CPU driver also uses AMD's own LLVM software tech targeting x86 CPUs. On certain workloads it was proven to be faster than Intel's compiler technology.
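The "one function call, many backends" idea above can be sketched as a toy dispatch layer. Everything here (the class names, the command-queue stand-in for a driver's batching) is hypothetical illustration, not SwiftShader's or AMD's actual driver code:

```python
class CpuBackend:
    """Executes the color change directly on the host CPU,
    the way a software renderer's JIT-compiled x86 path would."""
    def set_color(self, framebuffer, index, rgb):
        framebuffer[index] = rgb

class GpuBackend:
    """Stand-in for a driver that would recompile the same request
    for a GPU ISA (VLIW5, VLIW4, GCN SIMD, ...)."""
    def __init__(self):
        self.command_queue = []
    def set_color(self, framebuffer, index, rgb):
        # A real driver would batch this into a command buffer for the GPU;
        # we record it here just to show the extra indirection.
        self.command_queue.append((index, rgb))
        framebuffer[index] = rgb

def draw(backend, framebuffer):
    # The application makes the same call regardless of backend.
    backend.set_color(framebuffer, 0, (255, 0, 0))

fb = [(0, 0, 0)] * 4
draw(CpuBackend(), fb)
print(fb[0])
```

The point of the sketch is that the application never knows which backend ran; the "software vs. hardware" distinction lives entirely below the API call, which is exactly where LLVM-based recompilers sit in both cases.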
[QUOTE="ronvalencia"]AMD's GPU doesn't directly execute the Direct3D API, i.e. you have to hit AMD's LLVM layer before you hit the GPU's ISA. AMD's LLVM layer manages the changes across the various AMD GPU ISAs, e.g. Cypress's VLIW5, Cayman's VLIW4 and GCN's SIMD. SwiftShader has similar LLVM software tech but targets x86 CPUs; it could be modified to target Intel's Xeon Phi. AMD's OpenCL CPU driver also uses AMD's own LLVM software tech targeting x86 CPUs. On certain workloads it was proven to be faster than Intel's compiler technology.[/QUOTE]Microsoft stopped developing Direct3D like 12+ years ago, dumbass.
[QUOTE="reach3"]Completely relevant, really. It means the 360 was graphics king until Crysis matched it. Yet hermits refuse to admit it (except Wasdie, who admitted it; he is the only smart hermit here).[/QUOTE]Crysis was way ahead of Gears, and it came out only a year later. Gears didn't match Crysis; Crysis was 1+ leap years ahead. Also, it's still technically ahead of most games.
[QUOTE="04dcarraher"]Look at any multiplat game from 2005-2007 :roll: the console versions looked worse.[/QUOTE]No, Gears had better technical graphics than any PC game at the time.
[QUOTE="reach3"]No, Gears had better technical graphics than any PC game at the time.[/QUOTE]Parallax mapping is fail.
[QUOTE="jokeisgames2012"]Microsoft stopped developing Direct3D like 12+ years ago, dumbass.[/QUOTE]Direct3D 11.1 says hi. I wonder who's dumb.
[QUOTE="ronvalencia"]Direct3D 11.1 says hi. I wonder who's dumb.[/QUOTE]It's just a troll with an alt. Just ignore him; he did this "software vs. hardware" spam in a LoosingENDS thread a long time ago.
[QUOTE="jokeisgames2012"]The most advanced engine right now in 2012 is software rendering, proving hardware sucks.[/QUOTE]AMD's GCN can raytrace just fine.
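For what raytracing actually is at the math level (and why either a CPU software renderer or a GCN-class GPU can run it), here is a minimal ray-sphere intersection test. It's a generic textbook sketch, not code from any shipping engine:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest positive hit distance t along the ray, or None.
    Solves |o + t*d - c|^2 = r^2, which is a quadratic in t."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearer of the two roots
    return t if t > 0 else None

# A ray marching down +z from the origin hits a sphere at z=5, r=1 at t=4
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

Since the kernel is just a few multiplies, adds and a square root per ray, it maps equally well onto x86 SIMD lanes and GPU SIMD units; the argument in the thread is about throughput, not capability.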
[QUOTE="Jebus213"][QUOTE="ronvalencia"]Direct3D 11.1 says hi. I wonder who's dumb.[/QUOTE]It's just a troll with an alt.[/QUOTE]Then why aren't games shipping with it? Unreal Tournament 1999 could be run in Direct3D; it shipped with DirectX 7 / Direct3D support.
[QUOTE="reach3"][QUOTE="lostrib"]It's probably a bad sign when even your bullshots don't look that good.[/QUOTE]I didn't post bullshots, you just can't believe it looks that good.[/QUOTE]It doesn't look that good. The only good part of all the Halo 4 shots is Master Chief; all the other parts look rather low-res and blurry.