I guess not. To me, something like parallax mapping or motion blur jumps right out, and goes a long way toward making current-gen games look good. Teufelhuhn
Agreed. One reason I think some people believe otherwise is that we've reached a point where even crappy examples of what current and next-gen hardware is capable of will stand up well over time. In most current-gen games, developers have been able to pull out enough detail that nobody has to guess at what something is.
Slowly but surely, games are moving in a direction where things that have largely been written off as minor details can be brought out. More realistic lighting and textures, larger color palettes, crowd AI, facial tics - all those little things that sound simple and unnecessary, but can make a world of difference in how immersive the experience is.
Graphics are also becoming more important in light of the spread of HD. Some people have claimed it hasn't caught on yet, but I'd have to disagree. I only paid $700 for mine - that isn't unreasonable for a lot of people, especially considering how high prices were just 2 years ago. It may not be common in every household right now, but in a year or two, when every console has hit its stride and gotten past those first-year hiccups? I expect we'll see a much greater emphasis on 1080p, 5.1/DTS, and yes, better graphics. Bad graphics look especially poor on a good HDTV - good graphics look amazing. That matters to a lot of people.
All that babbling said, it's true that people will buy the system(s) that have the games they want to play, but graphics are part of what makes people want to play a game. If I'm looking at previews/reviews and see a game that gets great scores but looks awful, and a game that gets average scores but looks amazing, I'm probably going to spend more time looking into the one that's more visually appealing. This is, after all, a visual medium.