Wow, it was worth the wait.
[QUOTE="BlbecekBobecek"]Looks VERY nice. Finally with this and Battlefield 3 I feel like we are approaching next gen in terms of graphics. Wonder if the Wii U will be capable of this when devs squeeze the power out of it (I know it's supposed to have only a DX10-level GPU, but the PS3 has a DX9-level GPU and its best-looking games can compete with most DX10 games easily).[/QUOTE]
BlbecekBobecek
The HD4000 series of GPUs does have a tessellator; not as good or powerful as modern DX11 ones, but at least it has one. However, today's GPUs, even midrange ones, outclass a GPU like the HD4870 by a factor of 2x, not even counting the efficiency improvements DX11 offers. So in short: games will look great, but don't get your hopes up. edit: a DX9 game, aka The Witcher 2, looks far better than any PS3 game. What DX10 games are you talking about?
[QUOTE="Firebird-5"][QUOTE="i5750at4Ghz"] They are better imo just because they've made the better game. However you talk about devs going the extra mile. My point is what has value done to go the extra mile? Not saying they are bad or lazy, but value really doesn't do much. Most of there newer ips weren't even created in house.i5750at4Ghz
I had way more fun with Portal 2, both singleplayer and going at it with friends. Just because Crytek may be good at implementing a more robust RK4 integrator in their physics engine or writing a dozen new shaders to take advantage of the new graphics pipeline in their engine doesn't mean squat, except to people who get off on that kind of thing.
At the end of the day, Portal 2 got authoring tools less than a month after its release. Crysis got them on day one. Where are they for Crysis 2?
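For reference, since RK4 came up: fourth-order Runge-Kutta is a standard way to step a physics simulation more accurately than plain Euler integration. Below is a minimal sketch of one RK4 step for a 1D position/velocity state; all names here are illustrative, not anything from Source or CryEngine:

    #include <functional>

    struct State { float x;  float v; };   // position and velocity
    struct Deriv { float dx; float dv; };  // their time derivatives

    // Acceleration as a function of state and time (e.g. gravity plus drag).
    using AccelFn = std::function<float(const State&, float)>;

    // Advance the state by dt along derivative d, then sample derivatives there.
    static Deriv evaluate(const State& s, float t, float dt,
                          const Deriv& d, const AccelFn& accel) {
        State tmp{ s.x + d.dx * dt, s.v + d.dv * dt };
        return Deriv{ tmp.v, accel(tmp, t + dt) };
    }

    // One RK4 step: sample derivatives at four points and blend them 1-2-2-1.
    State rk4Step(State s, float t, float dt, const AccelFn& accel) {
        Deriv a = evaluate(s, t, 0.0f,      Deriv{0.0f, 0.0f}, accel);
        Deriv b = evaluate(s, t, dt * 0.5f, a, accel);
        Deriv c = evaluate(s, t, dt * 0.5f, b, accel);
        Deriv d = evaluate(s, t, dt,        c, accel);

        float dxdt = (a.dx + 2.0f * (b.dx + c.dx) + d.dx) / 6.0f;
        float dvdt = (a.dv + 2.0f * (b.dv + c.dv) + d.dv) / 6.0f;
        return State{ s.x + dxdt * dt, s.v + dvdt * dt };
    }

    // Usage: one 16 ms step of a unit mass falling under gravity.
    //   State s{0.0f, 0.0f};
    //   s = rk4Step(s, 0.0f, 0.016f, [](const State&, float) { return -9.81f; });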
You're comparing creating tools for an engine that's been around for 7 years to one that was just created. Look at DICE: they never released tools for BC3, not because they are lazy or whatever, but just due to not having the time. The fact that Crytek is putting forth the effort at all is huge imo. Not many devs even attempt it anymore.

Sheer 'effort' alone means nothing. I'd know; I do a lot of programming for this kind of thing. Making something new just for the sake of it is effort wasted. Valve knows and understands this, so they update their engine instead of creating a new cutting-edge one every time they release, which allows them to serve their community better.
Crytek, on the other hand, appeals to people who get their jollies off the latest and greatest graphics technology. That's fine, but that's not all there is.
[QUOTE="BlbecekBobecek"]HD4000 series of GPUs do have a tessellator, not as good or powerful as modern DX11 ones, but at least it has one. However todays GPUs even midrange ones outclass a GPU like an HD4870 by a factor 2x, not even counting improvements in efficiency that DX11 offers. So in short games will look great, but don't get your hopes up. edit: a dx9 game aka Witcher 2 looks far better than any PS3 game. What dx10 games are you talking about?Looks VERY nice. Finally with this and Battlefield 3 I feel like we are approaching next gen in terms of graphics. Wonder if Wii U will be capable of this when devs squeeze the power out of it (I know its supposed to have only DX10 level of GPU, but PS3 having DX9 level GPU and it best looking games can compete with most DX10 games easily).
muscleserge
If the rumors about it using a 4890 or even 4870 is true, its gonna dominate the HD twins in graphics.
I'm thinking Nintendo took this route because they didn't need to get the latest chip from Nvidia/AMD. If the rumors of MS and Sony using APUs for their next console is true, then Nintendo can get away with using these chips and still be more powerful than the next line of consoles.
[QUOTE="BlbecekBobecek"]HD4000 series of GPUs do have a tessellator, not as good or powerful as modern DX11 ones, but at least it has one. However todays GPUs even midrange ones outclass a GPU like an HD4870 by a factor 2x, not even counting improvements in efficiency that DX11 offers. So in short games will look great, but don't get your hopes up. edit: a dx9 game aka Witcher 2 looks far better than any PS3 game. What dx10 games are you talking about?Looks VERY nice. Finally with this and Battlefield 3 I feel like we are approaching next gen in terms of graphics. Wonder if Wii U will be capable of this when devs squeeze the power out of it (I know its supposed to have only DX10 level of GPU, but PS3 having DX9 level GPU and it best looking games can compete with most DX10 games easily).
muscleserge
Witcher 2 looks maybe better than the best PS3 games (even that could be subject of discussion if you compare it directly to Uncharted 3 or Killzone 3), but not a generation better. Crysis 2 dx11 (judging by those screenshots) and maybe Battlefield 3 are the first games I have seen that look way better than best PS3 games.
edit. 2x isnt that much if you take into consideration diminishing returns and the fact that devs will optimise their games specifically for Wii U hardware.
[QUOTE="BlbecekBobecek"]The Witcher 2 maybe looks better than the best PS3 games (even that could be a subject of discussion if you compare it directly to Uncharted 3 or Killzone 3), but not a generation better. Crysis 2 in DX11 (judging by those screenshots) and maybe Battlefield 3 are the first games I have seen that look way better than the best PS3 games.[/QUOTE]
BlbecekBobecek
The Witcher 2 looks jaw-droppingly good; the lighting and texture work alone put it above any PS3 game, and when played at proper resolutions the detail of that game is amazing. However, The Witcher 2 is a DX9 game. Crysis 1 in DX9 will blow any PS3 exclusive out of the water. Metro 2033 in DX9 on Very High will as well. Just Cause 2 is a DX10 game and also outclasses PS3 exclusives. I would argue that Crysis 1 already looks a generation ahead, followed by Metro and a few other games.

[QUOTE="BlbecekBobecek"]The Witcher 2 maybe looks better than the best PS3 games (even that could be a subject of discussion if you compare it directly to Uncharted 3 or Killzone 3), but not a generation better. Crysis 2 in DX11 (judging by those screenshots) and maybe Battlefield 3 are the first games I have seen that look way better than the best PS3 games. edit: 2x isn't that much if you take into consideration diminishing returns and the fact that devs will optimise their games specifically for Wii U hardware.[/QUOTE]
BlbecekBobecek
How about actually playing the PC games :? I know you didn't.

[QUOTE="bigboss5ak"]Can my rig handle this??[/QUOTE]
bigboss5ak
Who knows. :? Depends on the optimization, which, judging by their track record, will suck. You will be able to run it I guess, but who knows at what frames. It probably won't be playable.
Still better than the first :)

Way too late. People will just install it, play a couple of hours and forget about it. Who would've expected something like this years ago from the sequel to Crysis? This update is not going to bring back the sandbox open world of the first, either.
edidili
[QUOTE="bigboss5ak"]Can my rig handle this??millerlight89Who knows. :? Depends on the optimization, which judging by their track record, it will suck. You will be able to run it I guess, but who knows at what frames. It probably won't be playable. Well the current crysis 2 is extremely well optimized.
[QUOTE="millerlight89"][QUOTE="bigboss5ak"]Can my rig handle this??ferret-gamerWho knows. :? Depends on the optimization, which judging by their track record, it will suck. You will be able to run it I guess, but who knows at what frames. It probably won't be playable. Well the current crysis 2 is extremely well optimized. Considering it wasn't even optimized for SLI, I would have to say no. Considering how the game looked, it should've performed better.
[QUOTE="BlbecekBobecek"][QUOTE="muscleserge"] HD4000 series of GPUs do have a tessellator, not as good or powerful as modern DX11 ones, but at least it has one. However todays GPUs even midrange ones outclass a GPU like an HD4870 by a factor 2x, not even counting improvements in efficiency that DX11 offers. So in short games will look great, but don't get your hopes up. edit: a dx9 game aka Witcher 2 looks far better than any PS3 game. What dx10 games are you talking about?millerlight89
Witcher 2 looks maybe better than the best PS3 games (even that could be subject of discussion if you compare it directly to Uncharted 3 or Killzone 3), but not a generation better. Crysis 2 dx11 (judging by those screenshots) and maybe Battlefield 3 are the first games I have seen that look way better than best PS3 games.
edit. 2x isnt that much if you take into consideration diminishing returns and the fact that devs will optimise their games specifically for Wii U hardware.
How about actually playing the PC games :? I know you didn't.How about actually playing the PS3 games? I know you didnt.
Honestly the PC games have clear edge in texture quality, resolution (that was always the case when compared to consoles - now the gap is smaller than ever before) and maybe lighting (Killzone 3 lighting is excellent though), but top PS3 games have edge in overall presentation and animations (Drakes animations in UC2 are still to be matched on PC). You can notice that the gap is not as huge as you hermits think when you compare Crysis 2 360 version and PC version. PC version looks just as good as pretty much any PC game (except for BF3) around and while 360 version does have lower resolution, lower framerate and blurrier textures, the overall result is nowhere near a difference of generation.
[QUOTE="millerlight89"][QUOTE="bigboss5ak"]Can my rig handle this??ferret-gamerWho knows. :? Depends on the optimization, which judging by their track record, it will suck. You will be able to run it I guess, but who knows at what frames. It probably won't be playable. Well the current crysis 2 is extremely well optimized.
Maybe because of crappy low res textures? Or maybe because of a much smaller world to render.
It's going to take a lot more than some cosmetic tweaks to make me care about this game. Even though I hated the blur fest when I tried it out, the way the game played is ultimately what prevented a purchase. It didn't feel like Crysis at all; it might as well have been a Crysis-suit mod attempt in a console port.
The game flopped as far as I'm concerned. That the retail price here in the UK has already halved shows I'm not alone in that regard.
[QUOTE="ferret-gamer"]Well, the current Crysis 2 is extremely well optimized.[/QUOTE]
ferret-gamer
I don't think I can agree with that.
The visual difference between the lowest and highest settings was small, yet the impact on frame rate was significant. At least with Crysis 1, I could see what was hurting my frame rate when I upped the settings.
I don't feel Crysis 2 gave me anywhere near the same visual bang for my performance; it was just a crudely padded-out console port. Even by console port standards it was half-baked. Most console ports scale asset quality; Crysis 2 didn't.
[QUOTE="ferret-gamer"][QUOTE="millerlight89"] Who knows. :? Depends on the optimization, which judging by their track record, it will suck. You will be able to run it I guess, but who knows at what frames. It probably won't be playable. millerlight89Well the current crysis 2 is extremely well optimized. Considering it wasn't even optimized for SLI, I would have to say no. Considering how the game looked, it should've performed better.
What do you mean? My two 460's litterally gives me a 100% improvement over one.....
[QUOTE="Filthybastrd"]What do you mean? My two 460s literally give me a 100% improvement over one...[/QUOTE]
Filthybastrd
I had to manually get SLI working myself, as did many other Nvidia users. Maybe they fixed it, but even then, considering how the game looks, the performance is a joke.

[QUOTE="ZoomZoom2490"]The Steam version is not getting the DX11 package; sucks for people who bought the Steam version.[/QUOTE]
ZoomZoom2490
LOL. You're saying Steam won't automatically download it? Oh no, the horribleness of having to manually download something. How did I ever get all those mods for Fallout: New Vegas!
[QUOTE="nameless12345"]So will this make it a good game now? :P[/QUOTE]
nameless12345
It was a good game beforehand, just not great.
[QUOTE="Marka1700"]Is this going to be released as an official patch? The launcher currently has no new patches.[/QUOTE]
Marka1700
It's all a completely optional download.
[QUOTE="Filthybastrd"][QUOTE="millerlight89"] Considering it wasn't even optimized for SLI, I would have to say no. Considering how the game looked, it should've performed better. millerlight89
What do you mean? My two 460's litterally gives me a 100% improvement over one.....
I had to manually get SLI working myself, as did many other Nvidia users. Maybe they fixed it, but even then considering how the game looks, the performance is a joke.Really? I certainly see where the performance goes.... There are a lot more physics being calculated, there's a lot more destruction, there's a vastly longer draw distance, the AA options are much more liberal (Nvidia controll panel TSSAA works), Vanilla textures are better..... Crysis 2 has better lighting, that's it.
Did you update it to 1.2?
When I load Crysis 2, my eyes are physically assaulted by tons of absent minuscule details that add up to an overwhelming stench of console port. Furthermore, Crysis was amazing in that, apart from two distinct texture mods, the difference between '07 and '11 is literally the autoexec.cfg you use.
I hope the DX11 patch will be the same, but I shall wait and see.
[QUOTE="Filthybastrd"]It doesn't look like there is any actual way to get to the page at the moment without typing it into your address bar, as there are no links to it on the website.[/QUOTE]
Filthybastrd
I'm a little baffled right now... I'd assume you'd make the page accessible once it worked.
[QUOTE="BlbecekBobecek"]The Witcher 2 maybe looks better than the best PS3 games (even that could be a subject of discussion if you compare it directly to Uncharted 3 or Killzone 3), but not a generation better. Crysis 2 in DX11 (judging by those screenshots) and maybe Battlefield 3 are the first games I have seen that look way better than the best PS3 games. edit: 2x isn't that much if you take into consideration diminishing returns and the fact that devs will optimise their games specifically for Wii U hardware.[/QUOTE]
BlbecekBobecek
Those two aren't the only games that look better than any PS3 game. There are other games like Crysis, Crysis Warhead, and DX11 Metro 2033, for example.
[QUOTE="Xtasy26"]waiting for someone to post a video to compare. Crytek needs to make it available to download first :PTexture package upgrade requires 1GB video memory. Here's to hoping that the textures looks miles ahead when compared to consoles lame 256MB video memory.
lawlessx
[QUOTE="i5750at4Ghz"]How quickly until people start complaining about not being able to run the game and it being unoptimized?[/QUOTE]
i5750at4Ghz
It's already unoptimized. They kept the same art assets at all three graphical settings, and yet they still somehow managed to knock my frame rate from 60 at minimum down to the 30 range at the highest. And what do you get for that? The difference is barely noticeable.

[QUOTE="AnnoyedDragon"]It's already unoptimized. They kept the same art assets at all three graphical settings, and yet they still somehow managed to knock my frame rate from 60 at minimum down to the 30 range at the highest. And what do you get for that? The difference is barely noticeable.[/QUOTE]
AnnoyedDragon
Took exactly 6 minutes.

[QUOTE="i5750at4Ghz"]Took exactly 6 minutes.[/QUOTE]
i5750at4Ghz
The impression I got from your response was that the people demanding higher graphics are hypocritical, because they probably couldn't run them anyway.
Look at them.
The differences between Gamer and Extreme are barely noticeable; you could show someone a side-by-side and it is unlikely they could even pick out the minimum-settings screenshots. And yet somehow it nearly halved my frame rate, for practically nothing.
I imagine all the optimization went into the minimum settings, the console settings, and the two levels slapped on top weren't optimized at all. The game isn't optimized for PC in the least, so I cannot imagine how badly these updates are going to run.
[QUOTE="i5750at4Ghz"]Took exactly 6 minutes.AnnoyedDragon
The impression I got from your response was that the people demanding higher graphics are hypocritical, because they probably couldn't run them anyway.
Look at them.
The differences between gamer and extreme are barely noticeable, you could show someone a side by side; and it is unlikely they could pick out the minimum settings screenshots. And yet somehow it nearly halved my frame rate, for practically nothing.
I imagine all the optimization went into the minimum settings, the console settings, and the two levels slapped on top weren't optimized in the least. The game isn't optimized for PC in the least, so I cannot imagine how terrible these updates are going to run.
The biggest changes from setting to setting are the lighting. I'm no graphics artist, but any game that uses advanced real time lighting takes huge frame hits.[QUOTE="AnnoyedDragon"][QUOTE="i5750at4Ghz"]Took exactly 6 minutes.i5750at4Ghz
The impression I got from your response was that the people demanding higher graphics are hypocritical, because they probably couldn't run them anyway.
Look at them.
The differences between gamer and extreme are barely noticeable, you could show someone a side by side; and it is unlikely they could pick out the minimum settings screenshots. And yet somehow it nearly halved my frame rate, for practically nothing.
I imagine all the optimization went into the minimum settings, the console settings, and the two levels slapped on top weren't optimized in the least. The game isn't optimized for PC in the least, so I cannot imagine how terrible these updates are going to run.
The biggest changes from setting to setting are the lighting. I'm no graphics artist, but any game that uses advanced real time lighting takes huge frame hits. correct, the lowest settings use one bounce in the GI, while Extreme takes 5 bounces i believe, a quick glance you may not notice it, but there is a definite yet subtle difference in the lighting between the two settings. Like how SSAO is in some games, you don't really notice it consiously but it helps alot in grounding in the world.Extreme also has shadows being cast on most light sources, while low has barely any light sources casting lighting, the motion blur is also alot smoother, and the water propagation is much better.
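To make the bounce-count point concrete: each extra GI bounce is another full pass of light transport over the scene, so the cost grows roughly linearly with the bounce count while each successive bounce contributes less visible light. A toy radiosity-style sketch of that trade-off follows; it is not CryEngine code, and every name in it is made up for illustration:

    #include <cstddef>
    #include <vector>

    // Toy surface patch for a radiosity-style solver. Purely illustrative.
    struct Patch {
        float albedo;    // fraction of incoming light reflected
        float direct;    // radiance from direct lighting
        float radiance;  // total outgoing radiance accumulated so far
    };

    // One bounce: every patch gathers what every other patch emitted last
    // bounce, weighted by a fake uniform form factor. Each extra bounce is
    // another full O(n^2) pass, which is where the frame time goes.
    static void addBounce(std::vector<Patch>& patches,
                          std::vector<float>& lastBounce) {
        std::vector<float> next(patches.size(), 0.0f);
        const float formFactor = 1.0f / static_cast<float>(patches.size());
        for (std::size_t i = 0; i < patches.size(); ++i) {
            float gathered = 0.0f;
            for (std::size_t j = 0; j < lastBounce.size(); ++j) {
                if (i != j) gathered += lastBounce[j] * formFactor;
            }
            next[i] = gathered * patches[i].albedo;  // reflect the gathered light
            patches[i].radiance += next[i];          // each bounce adds less than the last
        }
        lastBounce = next;
    }

    // Low spec: bounces = 1. Extreme: bounces = 5, i.e. five gather passes
    // for a subtler and subtler visual change.
    void computeGI(std::vector<Patch>& patches, int bounces) {
        std::vector<float> lastBounce(patches.size());
        for (std::size_t i = 0; i < patches.size(); ++i) {
            patches[i].radiance = patches[i].direct;
            lastBounce[i] = patches[i].direct;
        }
        for (int b = 0; b < bounces; ++b) addBounce(patches, lastBounce);
    }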
A game being "optimized" is very different from a game having visuals that you think are worth the performance cost. Saying something is "unoptimized" implies that there were different ways to achieve the same exact results that would have better performance, or that they didn't care about performance at all when implementing it. Something being "worth it" is a matter of personal taste, and whether or not you even notice it in the first place. If you don't think having tessellation to fill in the tiny surface details on a brick wall is worth the perf cost I'd say that's a reasonable complaint, but it doesn't mean they didn't spend time optimizing it to be performant.
[QUOTE="ferret-gamer"][QUOTE="i5750at4Ghz"]The biggest changes from setting to setting are in the lighting. I'm no graphics artist, but any game that uses advanced real-time lighting takes huge frame hits.[/QUOTE] Correct. The lowest setting uses one bounce in the GI, while Extreme takes 5 bounces, I believe. At a quick glance you may not notice it, but there is a definite yet subtle difference in the lighting between the two settings, like SSAO in some games: you don't really notice it consciously, but it helps a lot in grounding the world. Extreme also has shadows cast by most light sources, while low has barely any light sources casting shadows; the motion blur is also a lot smoother, and the water propagation is much better.[/QUOTE]
ferret-gamer
The performance eaten does not justify the visual "gain", because the "gain" is so subtle and minute that most people cannot tell the difference. It doesn't help that the art assets remain the same at all settings, which calls the optimization into question even more, given that the increased workload comes purely from variable tweaks.
I've seen console ports put more effort into their PC versions. At least most console ports scale the textures and other art assets with their settings. Crytek didn't bother; it's pathetic.
A game being "optimized" is very different from a game having visuals that you think are worth the performance cost. Saying something is "unoptimized" implies that there were different ways to achieve the same exact results that would have better performance, or that they didn't care about performance at all when implementing it. Something being "worth it" is a matter of personal taste, and whether or not you even notice it in the first place. If you don't think having tessellation to fill in the tiny surface details on a brick wall is worth the perf cost I'd say that's a reasonable complaint, but it doesn't mean they didn't spend time optimizing it to be performance.
Teufelhuhn
I'm a gamer, not a developer, so my definition of unoptimized is going to be very different from the technically accurate one.
All I see is that this runs twice as badly as that, with barely any justifiable difference as to why. Whatever they did, it looks like a massive waste of performance, because 60fps is a more worthwhile visual gain than the higher settings.