Any Visual Differences between the two Batman Demos?

This topic is locked from further discussion.

#51 Hanass
Member since 2008 • 2204 Posts

[QUOTE="HenriH-42"]

[QUOTE="BoloTheGreat"]:shock: Wow, that's pretty good looking. Where is the PC demo available?! Must test it on my 4870XX (no PhysX :( )

BoloTheGreat

Here ya go ;)

http://www.nzone.com/object/nzone_batmanaa_downloads.html

I dunno about the performance on ATI cards, I guess if you turn off PhysX it'll be alright.

Meh, it's 2GB and I got a WiC patch a-cookin' tonight. Maybe tomorrow....

The game runs like ass on ATI cards. Stupid nVidia paid those devs like they did with Crysis. Expect around 20 FPS with 4850s in Crossfire (with PhysX enabled).

#52 AnnoyedDragon
Member since 2006 • 9948 Posts

To the complaining ATI users.

It's not Nvidia's fault that ATI won't get off their backside and support their own GPU computing API like Close to Metal. They make OpenCL tech demos while Nvidia funds the implementation of PhysX into actual games.

ATI could just as easily sponsor a game and implement their own GPU-accelerated physics technology. Instead of getting mad at Nvidia, why not ask what the hell ATI is doing right now?

#53 DragonfireXZ95
Member since 2005 • 26716 Posts

[QUOTE="XboximusPrime"]

Found this on N4g. http://www.n4g.com/NewsCom-373909.aspx?CT=1#Comments

Go to the link for more pics and the ability to flip back and forth.

AnnoyedDragon

Bad comparison. Why on Earth did they grab clips from a "pre-rendered" part of the demo?

That's what I was wondering.

#54 Hanass
Member since 2008 • 2204 Posts

To the complaining ATI users.

It's not Nvidia's fault that ATI won't get off their backside and support their own GPU computing API like Close to Metal. They make OpenCL tech demos while Nvidia funds the implementation of PhysX into actual games.

ATI could just as easily sponsor a game and implement their own GPU-accelerated physics technology. Instead of getting mad at Nvidia, why not ask what the hell ATI is doing right now?

AnnoyedDragon

Or they could stop pretending to be badasses and run the physics through the CPU. Everyone would be happy then. Right now I'm getting a lower FPS than in Crysis, and I'm pretty sure Crysis's physics are much more advanced than Arkham Asylum's.

#55 DragonfireXZ95
Member since 2005 • 26716 Posts

[QUOTE="BoloTheGreat"][QUOTE="HenriH-42"]

Here ya go ;)

http://www.nzone.com/object/nzone_batmanaa_downloads.html

I dunno about the performance on ATI cards, I guess if you turn off PhysX it'll be alright.

Hanass

Meh, it's 2GB and I got a WiC patch a-cookin' tonight. Maybe tomorrow....

The game runs like ass on ATI cards. Stupid nVidia paid those devs like they did with Crysis. Expect around 20 FPS with 4850s in Crossfire (with PhysX enabled).

Good thing I got Nvidia. Runs like butter with PhysX on high and everything else on max (except AA, which is at 8xQ). ;)

#56 AnnoyedDragon
Member since 2006 • 9948 Posts

Or they could stop pretending to be badasses and run the physics through the CPU. Everyone would be happy then. Right now I'm getting a lower FPS than in Crysis, and I'm pretty sure Crysis's physics are much more advanced than Arkham Asylum's.

Hanass

In other words, you don't fully understand the sort of workload being done here, assuming that it can all be done equally well on the CPU.

There is a reason everyone is moving to GPU computing, whether it's the GPU companies like Nvidia and ATI or even Intel with Larrabee. The future is a highly parallel specialised processing unit teamed up with a general-purpose multicore processor, with tasks being offloaded to wherever they run best.

Crysis uses advanced CPU physics, but it does not compare to GPU computing.
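To make the parallelism point concrete: a cloth simulation advances thousands of particles every frame, and each particle's position update depends only on that particle's own state, which is exactly the shape of workload a GPU's many cores chew through. A minimal sketch of one such step (illustrative only, not actual PhysX code; all names are made up):

```python
# Minimal Verlet cloth step. Each particle's update reads only its own
# current and previous position, so all N updates are independent and
# could run as N parallel GPU threads instead of one serial CPU loop.

GRAVITY = -9.81
DT = 1.0 / 60.0  # one frame at 60 FPS

def verlet_step(pos, prev_pos):
    """Advance every particle one time step with Verlet integration."""
    new_pos = []
    for (x, y), (px, py) in zip(pos, prev_pos):
        # new = current + (current - previous) + acceleration * dt^2
        new_pos.append((2 * x - px, 2 * y - py + GRAVITY * DT * DT))
    return new_pos, pos  # new state; current becomes the new "previous"

# A 100-particle sheet starting at rest: after one step every particle
# has fallen by the same small amount, and no update depended on another.
sheet = [(i % 10, i // 10) for i in range(100)]
new_sheet, old_sheet = verlet_step(sheet, sheet)
```

A real cloth solver would also run spring-constraint passes between neighbouring particles, but each of those passes is likewise data-parallel, which is the point being argued above.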

#57 Hanass
Member since 2008 • 2204 Posts

[QUOTE="Hanass"]

Or they could stop pretending to be badasses and run the physics through the CPU. Everyone would be happy then. Right now I'm getting a lower FPS than in Crysis, and I'm pretty sure Crysis's physics are much more advanced than Arkham Asylum's.

AnnoyedDragon

In other words, you don't fully understand the sort of workload being done here, assuming that it can all be done equally well on the CPU.

There is a reason everyone is moving to GPU computing, whether it's the GPU companies like Nvidia and ATI or even Intel with Larrabee. The future is a highly parallel processing unit teamed up with a multicore processor, with tasks being offloaded to wherever they run best.

Crysis uses advanced CPU physics, but it does not compare to GPU computing.

Nope, I do fully understand that a split workload gives better performance. That doesn't justify not optimizing the physics for CPU support. You know, for people who don't get ripped off by nVidia, because their products are crap compared to ATI lately.

#58 AnnoyedDragon
Member since 2006 • 9948 Posts

Nope, I do fully understand that a split workload gives better performance. That doesn't justify not optimizing the physics for CPU support. You know, for people who don't get ripped off by nVidia, because their products are crap compared to ATI lately.

Hanass

Stop being a fanboy for a moment and recognise that ATI is doing research into "the exact same area" that PhysX currently occupies. You're just miffed because it was Nvidia's solution and not ATI's that was implemented into this game; if it were the other way round you would probably think differently. Well, tough. That is what I meant when I said you should take this up with ATI instead of blaming Nvidia for getting their solution to market first. Nvidia has been pushing GPU computing a lot harder than ATI; ATI is concerning itself with preparing for DX11 rather than the now.

Now you can make up stuff to rationalize your dislike for Nvidia, or you can recognise that these physics run badly on the CPU because they are too advanced for it.

#59 halo_wars86
Member since 2009 • 1505 Posts

It's the usual chain of command. PC > Xbox 360 > PS3

Birdy09

I'm guessing the PS3 = 360 on this one, since they have been working extra hard on the PS3 version. So expect the multiplat to be the same and not inferior. Actually, you even get more content on the PS3.

#60 charomid
Member since 2005 • 901 Posts

OMG, the Joker is only playable on PS3? I wanted to get this game on 360; now I might have to get it for PS3! Does anyone know if it's gonna be worth it to buy on PS3 just for the Joker? Is it DLC?

#61 aaronmullan
Member since 2004 • 33426 Posts

OMG, the Joker is only playable on PS3? I wanted to get this game on 360; now I might have to get it for PS3! Does anyone know if it's gonna be worth it to buy on PS3 just for the Joker? Is it DLC?

charomid
Exclusive content for the PS3. You get to play as him in these combat levels.

#62 deactivated-5b19214ec908b
Member since 2007 • 25072 Posts

OMG, the Joker is only playable on PS3? I wanted to get this game on 360; now I might have to get it for PS3! Does anyone know if it's gonna be worth it to buy on PS3 just for the Joker? Is it DLC?

charomid

The two versions are the same, except the PS3 version has more content. So I think it's worth getting it on the PS3.

#63 Hanass
Member since 2008 • 2204 Posts

[QUOTE="Hanass"]

Nope, I do fully understand that a split workload gives better performance. That doesn't justify not optimizing the physics for CPU support. You know, for people who don't get ripped off by nVidia, because their products are crap compared to ATI lately.

AnnoyedDragon

Stop being a fanboy for a moment and recognise that ATI is doing research into "the exact same area" that PhysX currently occupies. You're just miffed because it was Nvidia's solution and not ATI's that was implemented into this game; if it were the other way round you would probably think differently. Well, tough. That is what I meant when I said you should take this up with ATI instead of blaming Nvidia for getting their solution to market first. Nvidia has been pushing GPU computing a lot harder than ATI; ATI is concerning itself with preparing for DX11 rather than the now.

Now you can make up stuff to rationalize your dislike for Nvidia, or you can recognise that these physics run badly on the CPU because they are too advanced for it.

That still doesn't explain why the horrible optimization for CPU-bound physics is a good thing. Tell you what: it isn't. You can say that ATI is behind the competition all day; that will never justify devs dropping support for the customers who have ATI cards.

#64 mclovin401
Member since 2007 • 899 Posts

[QUOTE="Birdy09"]It's the usual chain of command. PC > Xbox 360 > PS3

XboximusPrime

based on what?

Based on fanboyism.

#65 AnnoyedDragon
Member since 2006 • 9948 Posts

That still doesn't explain why the horrible optimization for CPU-bound physics is a good thing.

Hanass

It is not horribly optimized; physics simply runs better on a highly parallel processor, which is why GPU computing is being used to enable physics that is too resource-intensive for the common CPU. The PhysX API is highly optimized for multicore CPUs; it has to be, because at the end of the day it is a cross-platform API that has to work on consoles as well.

What part of that do you not understand? This is the equivalent of you declaring that every game in existence is badly optimized because you cannot run the graphics on the CPU.

Your beloved ATI is working on the exact same technology: physics acceleration on the GPU. The only reason you're angry is that this game happens to use Nvidia's solution; you won't accept that, so you are demanding that they magically make GPU-level physics run just as well on your CPU.

DirectX 11 comes with the compute shader, a unified form of GPU computing that is standardized across brands. This brand exclusivity of high-end physics isn't going to last forever. Put up with it, or go yell at ATI for not putting it in games right now like Nvidia is.

#66 druggyjoe3000
Member since 2006 • 1523 Posts

Probably this: PC > 360 = PS3.

Legendaryscmt

This :o

#67 Hanass
Member since 2008 • 2204 Posts

[QUOTE="Hanass"]

That still doesn't explain why the horrible optimization for CPU-bound physics is a good thing.

AnnoyedDragon

It is not horribly optimized; physics simply runs better on a highly parallel processor, which is why GPU computing is being used to enable physics that is too resource-intensive for the common CPU. The PhysX API is highly optimized for multicore CPUs; it has to be, because at the end of the day it is a cross-platform API that has to work on consoles as well.

What part of that do you not understand? This is the equivalent of you declaring that every game in existence is badly optimized because you cannot run the graphics on the CPU.

Your beloved ATI is working on the exact same technology: physics acceleration on the GPU. The only reason you're angry is that this game happens to use Nvidia's solution; you won't accept that, so you are demanding that they magically make GPU-level physics run just as well on your CPU.

DirectX 11 comes with the compute shader, a unified form of GPU computing that is standardized across brands. This brand exclusivity of high-end physics isn't going to last forever. Put up with it, or go yell at ATI for not putting it in games right now like Nvidia is.

Nope, it still makes zero sense that Crysis on Very High is fine on my computer (I don't recall Crysis using the GPU for physics), but suddenly my CPU becomes a piece of garbage when I start playing Arkham Asylum.

And I won't believe that some sparks and steam > the destructible environments in Crysis, which is the only possible explanation for the game performing worse than Crysis.

#68 The__Havoc
Member since 2009 • 2350 Posts

[QUOTE="XboximusPrime"]

[QUOTE="Birdy09"]It's the usual chain of command. PC > Xbox 360 > PS3

mclovin401

based on what?

Based on fanboyism.

So what happens the majority of the time is a concept based on fanboyism? News to me. Regardless, whether or not the 360 performs better than the PS3 in the majority of multiplats, if I decide to get this it's going to be on PS3, just in case of the DLC option. Nowadays the PS3 multiplats pretty much match the 360's, though.

#69 AnnoyedDragon
Member since 2006 • 9948 Posts

Nope, it still makes zero sense that Crysis on Very High is fine on my computer (I don't recall Crysis using the GPU for physics), but suddenly my CPU becomes a piece of garbage when I start playing Arkham Asylum.

And I won't believe that some sparks and steam > the destructible environments in Crysis, which is the only possible explanation for the game performing worse than Crysis.

Hanass

Crysis doesn't do a fraction of these physics you [censored]

Here's a video I made a while back showing the CPU usage impact of just cloth physics on its own. This is from before Nvidia ever touched the PhysX API, so you cannot claim the CPU usage is artificially inflated. You can see it clearly requires a vast amount of CPU power to run these simulations, and you have to run an entire game alongside that; the Batman game has multiple of these all over the floor as physics-simulated paper.

Brand fanboys are always a nuisance to talk to; believe whatever you want. But I know that as soon as ATI's solution gets used in a game, your perspective on this is going to change radically. People like you always do.


#70 Hanass
Member since 2008 • 2204 Posts

Crysis doesn't do a fraction of these physics you [censored]

Here's a video I made a while back showing the CPU usage impact of just cloth physics on its own. This is from before Nvidia ever touched the PhysX API, so you cannot claim the CPU usage is artificially inflated. You can see it clearly requires a vast amount of CPU power to run these simulations, and you have to run an entire game alongside that; the Batman game has multiple of these all over the floor as physics-simulated paper.

Brand fanboys are always a nuisance to talk to; believe whatever you want. But I know that as soon as ATI's solution gets used in a game, your perspective on this is going to change radically. People like you always do.

Yeah, and I totally said that PhysX sucked. People assume WAY too much these days. :roll:

#71 FamiBox
Member since 2007 • 5481 Posts

Those are the most identical-looking screenshots I've seen. Maybe the 360 version looks 20% sharper.

I'd say playing as the Joker makes up for that for the PS3.

#72 AnnoyedDragon
Member since 2006 • 9948 Posts

Yeah, and I totally said that PhysX sucked. People assume WAY too much these days. :roll:

Hanass

Assumption? You are not making any sense.

You have spent the last couple of posts insulting Nvidia's GPU computing physics solution, accusing it of being inefficient on CPU usage. Now you are telling me you didn't say PhysX sucked.

PhysX IS Nvidia's GPU physics solution; it became their solution when Intel robbed Havok FX from them and they bought Ageia as a replacement.

So what is there to assume? You attacked Nvidia's GPU computing physics solution, PhysX is their solution.

#73 Hanass
Member since 2008 • 2204 Posts

[QUOTE="Hanass"]

Yeah, and I totally said that PhysX sucked. People assume WAY too much these days. :roll:

AnnoyedDragon

Assumption? You are not making any sense.

You have spent the last couple of posts insulting Nvidia's GPU computing physics solution, accusing it of being inefficient on CPU usage. Now you are telling me you didn't say PhysX sucked.

PhysX IS Nvidia's GPU physics solution; it became their solution when Intel robbed Havok FX from them and they bought Ageia as a replacement.

So what is there to assume? You attacked Nvidia's GPU computing physics solution, PhysX is their solution.

No, I just said that their products in general were inferior to ATI's, because they are usually 10-15% more expensive (at least where I live) than the ATI equivalent. Other than that, I said that they could've done a better job of optimizing the physics for those who DON'T have an nVidia card.

#74 AnnoyedDragon
Member since 2006 • 9948 Posts

Other than that, I said that they could've done a better job of optimizing the physics for those who DON'T have an nVidia card.

Hanass

And I'm telling you that saying that is as silly as saying game graphics should be optimized to run on the CPU. These are GPU-level physics; the capabilities of CPUs are not suited to running them. Everyone is going down this path, everyone, including the only CPU-orientated company left, Intel. All companies are going down the parallel-processor route for physics, whether it be Nvidia's CUDA, ATI's CTM or Intel's programmable Larrabee. Even Microsoft is working on a GPU computing standard in DirectX 11, as I said earlier.

Why you won't recognise this when it is universally being supported, I don't know; even Cell in the PS3 uses parallel-processor acceleration with its SPEs. You said they should optimize the physics more for CPUs; well, there is an 'off' setting that turns off the GPU-level stuff and runs software-based physics. Batman's cape is physics cloth even on the lowest physics setting, so there is your CPU-optimized setting.
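The 'off' setting described above amounts to a backend fallback decision: run the full GPU-accelerated effects when the hardware supports them, otherwise keep only the gameplay-critical simulations on the CPU. A hypothetical sketch of that logic (all names invented for illustration, not from any real game or driver):

```python
# Hypothetical physics-backend fallback, in the spirit of the in-game
# "off" setting described above. Names are made up for illustration.

def pick_physics_backend(supports_gpu_physics):
    """Choose where the physics workload runs.

    Full GPU-accelerated effects (cloth, particles) only when the
    hardware supports them; otherwise fall back to a reduced software
    simulation, keeping essentials like Batman's cape cloth.
    """
    if supports_gpu_physics:
        return {"backend": "gpu", "cloth": "full", "particles": "full"}
    # CPU fallback: keep gameplay-critical cloth, drop the expensive
    # decorative effects that only a GPU can simulate at playable FPS.
    return {"backend": "cpu", "cloth": "hero_only", "particles": "off"}

# A card without GPU physics support gets the software path; a capable
# card gets the hardware path with everything enabled.
low_end = pick_physics_backend(False)
high_end = pick_physics_backend(True)
```

This mirrors the thread's argument: the CPU path is not "unoptimized", it is a deliberately reduced simulation because the full effect set is sized for a parallel processor.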

You know that even with an Nvidia GPU there is no guarantee you can run these physics well? They recommend a GTX 260; even on my SLI setup I get some pretty significant drops in FPS at times.

Anyway it's late my end, night.