I've decided that I hate the way game developers are pushing "cinematic presentation." I don't know why the games industry feels the need to mimic an older medium in order to provide visual appeal that is, in my opinion, ultimately unfulfilling and doesn't play to the unique strengths of games.
The entertainment business has always tended toward the unoriginal: mimicry and cashing in. As games have become big business, big-business people who don't necessarily know or even care about gaming get some degree of control over basic game design decisions, and especially over marketing decisions. Shortsighted MBAs know movies are highly visually appealing, and they know games are partly a visual experience. So they put two and two together and get three.
So what's wrong with the cinematic experience? In short, it's not the gaming experience.
The first negative is the specific namesake of our general problem: the cinematic, whether it's an extended segment of story exposition or a few seconds of your character falling through the floor at a predetermined location. When you're sitting in front of your video screen holding an unresponsive controller, waiting for the game to let you start playing again, you're losing immersion. You're no longer a part of things; you're an outside observer.
Second, we see games with set-piece situations designed with the emphasis on looking cool rather than on gameplay. You're either shooting non-stop at a giant monster or you're running/driving/boating as fast as possible through some sort of over-the-top mayhem. The Modern Warfare 3 trailer from Microsoft's 2011 E3 press conference is a prime example. (http://e3.gamespot.com/press-conference/microsoft-e3/?tag=masthead%3Be32011%3Bvideo%3Bmicrosoft) If the gameplay that follows these spectacular shows isn't first rate, the actual gaming parts (you know, the parts that really matter...) start to feel like a letdown. In light of this, we see developers making shorter and shorter games because they're pouring resources into shock-and-awe set pieces instead of good gameplay, level design, stories, and characters. Because, unfortunately, in the end, it's easier.
Last and least is the idea that we're playing the game through the lens of a camera. I would like to meet the individual who first decided to call the point of view for a game "the camera" and tell him he's a moron. I would then like to meet the individual who took this idea to its logical conclusion and decided to make "the camera" look and act like an actual camera. I would like to take that person's picture with a real-life, soul-stealing camera as fitting revenge for what he did to games. If I'm supposed to be seeing the game world through my own eyes, why the hell am I seeing lens flare, halation, lens distortion, vignetting, sepia tones, motion blur, or any number of other things that are not shortcomings of my biological eyes? If I'm playing a game set in the medieval era, hundreds of years before cameras were even an inkling of an idea, why am I seeing lens flare... you get the drift.
So what to do about this? The answer is obvious, but it's also difficult. First off, lose the jealous-little-sister-to-Hollywood mentality. Games are still maturing, and they will certainly go beyond movies. The unique strength of games is their interactivity. Put the player in the world and let him affect it, not just watch it.
I've been gaming for a long time, and the push toward cinematic visuals is a recent trend. Games were developing to their own strengths, and were plenty of fun, before the whole "cinematic" thing came along, so the first step would be to get back to basics and move forward from there. There are some good things to keep, to be sure, but not at the expense of gameplay and immersion.