
sweetbryancito Blog

Defending Guitar Hero and Rock Band: Why there's nothing wrong with being a fake musician.

Despite the enormous success and popularity of Guitar Hero and the recently released Rock Band, it seems that there is still a small but vocal group of cynics and naysayers who like to pop up in internet forums and belittle the fans of these games for "wasting" their time on "fake" instruments when they could be learning the real thing. I can see how this might make sense from the perspective of those who have never played these games. They see someone performing music on a quasi-"instrument" that mimics the shape of a guitar and wonder (reasonably it would seem) why the heck these foolish and misguided souls don't just learn how to play a real damn guitar. Well, I'm here to set the record straight. So without further ado, here are the three reasons why the just-learn-a-real-instrument argument against Guitar Hero and Rock Band is completely out of tune with reality.

1) Apples and Oranges

Comparing Guitar Hero to playing the guitar is like comparing apples to oranges. If people want to criticize others for playing Guitar Hero, then they might as well criticize them just for listening to music that they don't know how to play. Are racing game fans losers who don't want to make the effort to learn how to drive an actual racecar? Are SimCity fans idiots who should spend that time studying real-life urban planning? Are Super Mario Bros. fans fools who should just go and jump on real turtles and run through actual sewer pipes? Of course not. Guitar Hero and Rock Band expand on the common experience of listening to recorded music by simulating the feeling of performing it in the same way and with the same sound as the original bands. That's it.

2) A Quick Reality Check

Learning how to perfectly perform every song in Guitar Hero and Rock Band on real instruments would be virtually impossible. A person could spend decades practicing and thousands of dollars on equipment and still not be able to recreate the music included in these games. Not to mention that it would certainly be impossible to do it alone. Bands like Van Halen, Boston, and Metallica spent years practicing together to hone their own unique sound. A person could spend one hour playing Guitar Hero or Rock Band and have an absolute blast playing "Sweet Child o' Mine" or "(Don't Fear) The Reaper," and trying to invalidate or dismiss that experience by telling him or her to just "learn a real instrument" is patently ridiculous.

3) The Musically Un-Gifted

Not everyone is musically talented. Heck, there are a lot of people who can't even carry a tune. But that doesn't mean that these folks don't love listening to music or that they can't tap their fingers to a beat. As my previous point suggests, criticizing even musically gifted people for playing Guitar Hero is absurd to begin with. Criticizing someone who is musically challenged for playing it is doubly absurd. Not everyone can play football or basketball-not just by choice, but due to physical limitations-but that doesn't mean they should be scoffed at for playing a football or basketball videogame. The same applies to Guitar Hero and Rock Band. These games let people enjoy music in ways that would ordinarily be out of reach. Nothing wrong with that.

While the just-learn-a-real-instrument argument might seem logical enough to some people, it is an argument born of ignorance of what the Guitar Hero and Rock Band experience is about and of what it means to play videogames (or any game) in general. There is a substantial gap in understanding between those who play videogames and those who merely see images or clips of them, and this disconnect has been a great source of confusion and conflict. As an interactive medium, games must be played in order to be understood, and that goes not only for non-gamers who would impugn games as a whole, but also for gamers who would impugn games from genres with which they are unfamiliar. The truth is that the cynics who criticize players of Guitar Hero and Rock Band have probably never played either game. I should know because I used to be one of them.

Contrasting Crysis and Call of Duty 4: Why emergent gameplay is the future.

Without a doubt, 2007 has been a marquee year for first-person shooters. We've gotten BioShock, Call of Duty 4, STALKER: Shadow of Chernobyl, Crysis, Enemy Territory: Quake Wars, Half-Life 2: Episode 2, Team Fortress 2, Clive Barker's Jericho, Unreal Tournament 3, Timeshift, FEAR expansions, and more. Even the lesser games are above average, and the best rank among the finest the genre has ever produced.

Two of the most acclaimed of this whole bunch are Crysis and Call of Duty 4. Both were hotly anticipated and both have been well received by gamers. But these two games make for an interesting comparison: both are first-person shooters, yet they represent two sharply contrasting design philosophies. Crysis features large, open environments that are highly interactive and allow players to approach any given situation in a seemingly limitless number of ways - so-called "sandbox" gameplay. Call of Duty 4, on the other hand, is very tightly scripted, utilizing purely linear levels that are designed to emphasize the drama and intensity of the action.

Just listen to the developers. In an interview with IGN, Grant Collier, head of Call of Duty 4 developer Infinity Ward, had this to say about linearity and interactivity:

"[E]veryone right now is demanding sandbox gameplay and total destructibility. We personally don't think that it's that fun, I mean, 'go anywhere! Do anything!' That's just - I think it's a buzzword, it's a badge, it's a bullet-point option, but a lot of games they get in there and they try to do that and then they're like 'okay we have the sandbox, now why don't we try to make the game fun'. And total destructibility, you can really ruin the gameplay. There's so many spectacular moments that you have when you funnel the action into certain corridors.... I think right now it's a fad, and the fad will pass, we're not going to be bite on in it - we want the game to be fun first, and destructibility comes second."

By contrast, here's Crysis lead designer Cevat Yerli on Crysis' open-ended, interactive style of play:

New gameplay emerges out of these systems. I was running with speed power towards an enemy and shooting and another person came across running as well towards me, and he jumped over me, and in the air increased his strength, landed, and punched me to death. I was like, "%*&# you!" (laughs) It was like life straight out of The Matrix.

Another really cool scenario was when I was in the harbor under the water, and under a boat. I had the pistol, and then switched on speed and literally, like a dolphin, jumped in the air, pow pow pow pow - killed him. He was like, "What the #(*%!" He couldn't see underwater because of the boat, but I could see him as an enemy on the radar.

This is an aspect that I'm proud of because the systems are working everywhere, and it's not like it's a scripted moment or event or just me versus a person or enemy. It's always fresh. That's the cool part of it and that's what I mean when I say I want the player to express his intelligence in ideally the most wide range a shooter can offer.

So with these two games, we're presented with two highly contrasting approaches: scripted gameplay versus emergent gameplay. While the two are by no means mutually exclusive (both games contain elements of each), in my view the scripted design of Call of Duty 4 represents an aging and increasingly archaic philosophy; Crysis' emergent gameplay, on the other hand, is the way of the future. Here's why (warning - spoilers ahead):

Call of Duty 4 showcases some of the strengths of scripting. At times, the levels are very dramatic and intense, and storytelling in particular can benefit greatly from scripted events. However, the contrivances of scripting often hurt the immersion of the gameplay, and a few moments stuck out like sore thumbs as I played the game:

* At one point, I was under attack by a helicopter, and the goal was to make my way to a nearby farmhouse, which is heavily guarded by enemy troops. I tried to make it to the house a few times unsuccessfully, being gunned down by the helicopter gunner nearly every time. So I decided to do the most logical thing: I shot the gunner. He fell out of the chopper in dramatic fashion, and it slowly flew away. Finally I could make my way to the farmhouse and focus on killing the infantry. But wait! The helicopter re-emerged, with another gunner. I killed him, too. This sequence repeated itself numerous times, until it became obvious that the helicopter had infinite gunners - the designers had already decided how I was going to take out the helicopter. Sure enough, once I made it to the farmhouse I was greeted with an infinite supply of Stinger missiles, and my squadmates instructed me to shoot down the helicopter.

* In another sequence, I was on a mission to assassinate a key character with a sniper rifle. The game factors wind direction into the shots, which is pretty clever. What's not clever is that shooting this character is a pre-scripted event; he is maimed by the shot, losing his arm, but he survives. No matter how accurate I am - I could shoot him in the head or in the foot - the final sequence plays out the same.

* Later in that same level, during a massive enemy assault, it is possible to stand in an enemy spawn point, which prevents them from spawning and renders the whole scene a cakewalk.

* Scripting also hurts the replayability of the game. When I'm playing any given sequence, all of the enemies spawn at the same spot every time and follow the same scripted patterns. So if I round a corner and get killed by an enemy I didn't see, I know exactly where to aim my gun the next time through. I always know that the guy with the rockets is going to run over that way, and the two guys with machine guns will run this way.

Crysis, on the other hand, gave me some very memorable moments of emergent gameplay:

* On the first level, I was pinned down behind a rock, under fire from a mounted machine gun. Stray bullets caught a nearby tree, which fell on me and killed me.

* I decided to ambush a small encampment of enemies. I used the nanosuit's super-strength to jump up a steep rocky hill on one side, rather than walking up the road or creeping through the woods. I mistimed a jump though, and launched myself about fifteen feet into the air above the enemies I'd been planning to ambush. In an amusing moment, they all gasped in unison at the sight of my superhuman ability, and scrambled for cover. So much for that ambush!

* On a number of occasions, I found myself under heavy fire from a patrol boat. I did the logical thing and shot the gunner. The driver would speed off, usually docking at a nearby beach or fleeing into the distance.

* I tried to use the cloak to sneak up on a group of lazy enemies sitting around on a beach. I changed weapons as I approached and suddenly, they were alerted to my presence and started attacking me even though I was still cloaked. At first I thought it might be a glitch, then it hit me: they had seen the flashlight on my gun.

I could go on and on; these are just a few small examples. Because the gameplay is emergent rather than scripted, the possibilities seem almost endless. Peruse any Crysis forum and you're bound to hear many entertaining stories about surprising, amazing, and even humorous emergent gameplay.

I played Call of Duty 4 after I had played Crysis, and Crysis had spoiled me quite a bit. I kept wanting to destroy vehicles, shoot down trees and knock down walls, but everything was static; I wanted to find better vantage points to attack, but there were artificial barriers blocking my path - often as contrived as an impassable wooden fence; I wanted to shoot the gunner in the helicopter, but... well, you know how that went. And while scripted gameplay does allow for some dramatic moments (as exemplified by the mostly excellent Pripyat level), I feel emergent gameplay allows players to discover situations that are every bit as dramatic, but more personal and memorable, since the player's choices are the catalysts.

I believe that games that reward players for creativity and intelligence are the real wave of the future. The problem, of course, is that it's much more difficult to develop a player-centric game. Many systems can clash with each other, and a lot more things can go wrong than in a narrow, scripted environment. Given the escalating costs of videogame development, I suspect very few developers will take the risk of pursuing emergent gameplay. It's worth noting, for example, that Far Cry, Crytek's previous game released in 2004 and a very ambitious game in its own right, has yet to be imitated by other developers despite its success. Ambition is the most costly development expense of all. But every now and then, we see developers taking risks, doing something special, and gamers embracing it. In time, I think, more developers will embrace this emergent style of gameplay, and gaming will be better for it.

Doom and gloom for PC gaming.

I can't throw a rock at the internet these days without hitting someone proclaiming that PC gaming is a dying breed. Consoles have become nearly PC-like themselves, and seem to be drawing both developers and customers away from the PC.

NPD data would seem to reinforce the notion. According to a report by the NPD, PC gaming software sales are lagging well behind the rest of the industry. Last year, the NPD says, PC gaming did about $970 million, a rather small chunk of the roughly $13 billion games market.

And what about the old notion that the PC was a haven for the most innovative developers? Well, recently John Carmack announced that his next-generation game would find its way to the Xbox 360, PS3, Macintosh, and the PC. Developers seem to be making more games with a multiplatform focus, with PC-rooted titles such as BioShock (the spiritual successor to the System Shock games), Unreal Tournament III, and Call of Duty 4 making their way to consoles.

You can hear the cries of doom and gloom miles away: PC gaming is dying, dead, on the way out, yesterday's news, whatever. But is it really? Because when I look at PC gaming, I see not only a growing market, but a place that is still the premier platform for videogames.

Take games, for instance. It's true that more and more games are multiplatform, but that trend has been developing for a while, and has spilled over to consoles as well; very few games are exclusive to the Xbox 360 or PS3. This is simply a result of the fact that code can often be ported between platforms with relative ease, and doing so can increase a game's potential audience by millions - a more prudent strategy than making separate games for each platform. But the PC still has more AAA, platform-exclusive content than any console. Just this year, we've seen Crysis, World of Warcraft: Burning Crusade, World In Conflict, STALKER: Shadow of Chernobyl, The Witcher, Neverwinter Nights 2: Mask of the Betrayer, Enemy Territory: Quake Wars, Team Fortress 2, Lord of the Rings Online: Shadows of Angmar, Hellgate: London, Supreme Commander: Forged Alliance and Tabula Rasa.

Meanwhile, the PS3's most notable exclusives were Lair and Heavenly Sword, both widely regarded as duds. The 360's only notable exclusives were Mass Effect and Halo 3, the former of which has already been confirmed for the PC. And PC gamers even got a graphically updated, expanded-content version of the 360's big AAA title from last year, Gears of War. Most of the other big games this year - Orange Box, BioShock, Call of Duty 4, DiRT, Kane & Lynch, etc. - were on the PC as well. This isn't to say there aren't some great console exclusives; Guitar Hero III and all things Nintendo certainly count. But no single platform has had as many top-notch games as the PC.

But what about the hardware market? It certainly is easier to purchase a plug-and-play console than a gaming PC, where you have to think about drivers, hardware upgrades, monitor resolutions, memory compatibility, and so on. Right? Well, sure. PC gaming is, and always has been, somewhat of a niche market for precisely that reason; it's not as user-friendly as consoles. But gaming hardware is still going strong. Dell recently acquired Alienware, maker of high-end gaming PCs, and has aggressively entered the gaming PC market with its XPS systems. Hewlett-Packard acquired Canadian boutique Voodoo PC, and has also entered the gaming market with its sleek-looking Blackbird 002 desktop systems. Meanwhile, for do-it-yourselfers like me, there are more hardware choices out there than ever, many of them marketed toward enthusiasts, with motherboards boasting built-in liquid cooling and overclocking-friendly features. Would these companies be investing so much in PC gaming if the market were clearly in decline?

But what about those software sales? Well, just as the NPD doesn't take into account sales from Wal-Mart, there are two other rather large markets it ignores in PC gaming: subscriptions and digital distribution. With stores like Steam, the EA Store and Direct2Drive becoming ever more popular among PC gamers, more and more gamers are sparing themselves annoying CD checks, scratched or lost discs, lost activation codes and cumbersome DRM software by turning to digital distribution. Personally, I don't buy boxed copies of games unless I have to - I love the convenience of digital distribution. And most MMORPGs have subscription fees - World of Warcraft, for example, has a remarkable 9 million subscribers. But their dutifully paid monthly fees are not included in the NPD's data. Finally, it's worth noting that while console games have crept up to $60 apiece, new PC games still top out at $50. So are the NPD's figures really a surprise?
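
To get a rough sense of how much money that omission hides, here's a back-of-the-envelope sketch; the $15 monthly fee is my own assumption based on WoW's standard plan, not a figure from the NPD or Blizzard:

    # Back-of-the-envelope: subscription revenue the NPD's retail figure never sees.
    # Assumptions: 9 million WoW subscribers (the announced count) and roughly a
    # $15/month standard plan (my estimate; actual plans vary by region and length).
    subscribers = 9_000_000
    monthly_fee_usd = 15
    annual_subscription_revenue = subscribers * monthly_fee_usd * 12
    npd_pc_retail = 970_000_000  # the NPD's reported PC retail software sales
    print(f"WoW subscriptions alone: ~${annual_subscription_revenue / 1e9:.1f} billion/year")
    print(f"NPD's PC retail figure:  ~${npd_pc_retail / 1e9:.2f} billion/year")

By that rough math, a single MMO's subscription revenue would rival the entire retail figure the doom-and-gloom crowd keeps citing.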

Lastly, PC gaming is still unrivaled in its communities. Whether it's the thriving hardware enthusiast communities, massively multiplayer games or the ever-present modding communities, no platform can provide the unique community experience of the PC.

The fact is, PC gaming is still the premier platform for videogames. No other platform has better technology, more AAA games, more sheer variety, more tightly-knit communities, or more flexibility for your budget. Don't get me wrong, I have nothing against consoles, which often have some great exclusives and provide an excellent gaming experience for the money. In a perfect world, I'd own every console and a great gaming PC; unfortunately, I have to make a choice. But those who shout doom and gloom for the PC need to look again - PC gaming is not just alive and well, but better than ever.

Videogame length and the question of quality versus quantity.

Is it valid to judge a game by its length? I don't hear people criticizing books for being too short (except maybe Harry Potter fans) or giving a movie a thumbs down because it didn't last long enough. I don't hear people arguing that the Mona Lisa would be a better painting if only it were bigger. I do, however, hear this type of criticism thrown at videogames. Most players regard videogames as consumable products, like cheeseburgers. For them, a longer game translates to better value and (supposedly) a better game. To me, this kind of thinking seems not only narrow, but bad for videogames.

Sure, a person could eat six plates of food at a buffet. But is that better than eating a single plate of high-quality food for the same price? Maybe it's better if the goal is to eat to the point of puking. But I think most people would agree that when it comes to food, or books, or movies, quality is more important than quantity. Why the heck shouldn't this principle also apply to videogames? It didn't use to take 10 or 20 hours to beat most games. Contra can be finished easily in less than an hour with a friend, and Super Mario Bros. can be beaten even faster. It's a point of comparison that most players seem to forget.

Adjusted for inflation, most NES games originally retailed at anywhere from $90 to $100. Compare that to today, where the superb action game Call of Duty 4: Modern Warfare, which retails for $60, is criticized for not being long enough at six hours. The funny thing is that even from a consumerist bang-for-the-buck point of view, COD4 still offers better value today than Contra did 20 years ago, to say nothing of the fact that it's a much better game. I think it's time to be a bit more broadminded about this hobby. A short but high-quality experience is much more satisfying than 100 hours of mediocrity.
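
If you want to run the bang-for-the-buck numbers yourself, here's a quick sketch; the playtimes are my own rough estimates of a single campaign run, not official figures:

    # Rough cost-per-hour comparison for a single playthrough.
    # Prices are in 2007 dollars (Contra's is the inflation-adjusted NES price
    # cited above); the hour counts are my own rough estimates.
    games = {
        "Contra (NES, inflation-adjusted)": (90, 1),   # ~$90, ~1 hour to finish
        "Call of Duty 4 (2007)": (60, 6),              # $60, ~6 hour campaign
    }
    for name, (price, hours) in games.items():
        print(f"{name}: ${price / hours:.0f} per hour")
    # Contra works out to roughly $90 per hour; COD4 to roughly $10 per hour.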

I'm not saying that it's never valid to criticize a game for its length. There are many other factors to consider. A role-playing game that takes only three hours to beat would likely be deserving of criticism. But should the same criticism be leveled at a polished and densely-packed first-person shooter? Sonic and the Secret Rings would have been more enjoyable and focused without the constant level recycling. Heck, I'll even admit that the sublime Resident Evil 4 may have dragged on a little longer than it should have. There are many more examples of this, but suffice it to say that bigger is not always better.

If videogames are to continue developing as an art form, players will need to be more willing to accept experiences that don't necessarily require 10 or 20 hours to finish. Not everyone has the time to play through Zelda or Oblivion, but that doesn't mean these players can't have fantastic gaming experiences that are distinguished not by their length, but by their artistry, emotional impact, and replayability. So long as videogames can be dismissed out of hand based solely on their duration, there's little hope of games earning the type of respect accorded to music, film, and other art forms. That would be a shame.

Move over, BioShock: there's a videogame that poses an even greater moral challenge.

Like many others, I thought BioShock was an amazing game. The art direction was brilliant, the atmosphere was tremendously creepy and beautiful, and the question of whether to rescue or harvest the little sisters presented a challenging moral dilemma-at least initially. Nevertheless, I've recently been playing a game that has taxed my morality far more than BioShock ever did. I'm talking about The Elder Scrolls IV: Oblivion. Let me explain.

When I started playing Oblivion, the various discretionary quests always struck me as essentially virtuous: saving people from monsters, rescuing people from caves, finding people's lost heirlooms. These were well-intentioned acts that, for the most part, made me feel like a pretty noble guy. Everyone wanted my help, and I was happy to do whatever I could to help them. I even joined the highly reputable Fighters Guild (which requires a clean criminal record) and progressed through the ranks, honorably completing each of my contracts. Then things started to change.

WARNING: SPOILERS AHEAD

One night, a mysterious character approached me and offered me a chance to join the secretive Thieves Guild. I was hesitant to tarnish my character's reputation, but the idea intrigued me. I joined up. Several assignments later, however, it became clear that even the infamous Thieves Guild was not entirely without virtue. The missions usually involved stealing from the rich to feed the poor or foiling corrupt officials, and killing was strictly forbidden. It seemed that even the "darker" quests in Oblivion would always be imbued with at least a sense of semi-righteousness. Then things really started to change.

A man in a black cloak named Lucien Lachance woke me up in the middle of the night and said that he knew I was a killer. He had been watching my progress for some time, and was now prepared to offer me a chance to join a mysterious group known as the Dark Brotherhood. All I had to do was travel to a remote inn and secretly murder a man named Rufio. I was given no reason for why this man should die. The choice was entirely mine. I could proceed with the game and never hear from Lucien again, or I could complete this cold-blooded act and gain entrance into the Dark Brotherhood. I found Rufio. I killed him. Lucien was right about me.

Since making that fateful choice, my membership in the Dark Brotherhood and subsequent ascension through its ranks has aroused in me more moral discomfort and genuine thoughts of "this feels wrong" and "why am I doing this?" than I have ever felt while playing a videogame (and that includes such supposedly immoral fare as Grand Theft Auto). Deciding whether to help or harm a little sister in BioShock was tough, but only the first time the choice was presented. I chose to rescue the little sisters because I wanted to be good, even though I knew it might make the game more difficult. In Oblivion, the choices felt different.

The vast majority of the quests in Oblivion are completely optional. Some of these quests present small moral dilemmas (e.g., lie to protect someone vs. uphold the law), and if the quest itself seems immoral, the player has the option of just ignoring it. The player need never pursue membership in the Dark Brotherhood in order to complete the game. Initially, my main motivation for fulfilling the game's more depraved quests was a desire to get the most out of the experience-that, and a dose of curiosity. But I never expected that my curiosity would lead me to perform such heinous acts or to feel such moral ambivalence.

For one of my assignments, the Brotherhood sent me to a mansion in which five strangers had been locked. My job was to enter the mansion, talk to the trapped guests, and then find a way to kill each one in secret. As I spoke to them I got a real sense of their personalities, their backgrounds, and their thoughts and feelings about one another. They seemed like real people with real problems. Even though the Brotherhood wanted me to believe they had it coming, it didn't make it any easier. After killing my first victim, I immediately felt a distinct twinge of uneasiness. I never felt this disquieted after shooting an innocent bystander in Grand Theft Auto.

After a while, I found myself rationalizing my bloody deeds. It was just too hard for me to complete my assignments otherwise. I would tell myself that the Dark Brotherhood must have its reasons. Surely these people did something to deserve their fate. Gradually I came to rely on the other seven assassins who shared the underground sanctuary with me. I regularly conversed with them about the history of the Brotherhood and sought their advice and support. Even a ruthless assassin needs a few constants in life, and my fellow killers made me feel less lonely amidst all the wanton bloodshed. That's what made my next mission so difficult to swallow.

The assignment (entitled "Purification") came from Lucien himself. There had long been rumors of a rogue assassin who had already killed several members of the Brotherhood. In fact, some of my associates had told me about it back when I first became a member. Lucien now believed that the assassin was from my sanctuary. To eradicate the threat, I was to kill all seven of my comrades-a ritual of purification. I couldn't believe it. Just when it seemed like things couldn't go any further, the game pulled the rug out from under me. For me, this was far more morally twisted than "harvesting" a little sister. The path before me looked very dark indeed.

What did I do? I carried out my task, and I haven't looked back since. I've performed many more assassinations since then, usually without thought or hesitation. One task required me to wipe out an entire family. I started with the elderly mother, whom I found tilling the soil outside of her quaint farmhouse. She immediately mistook me for a courier she had requested to deliver gifts to her kids. She seemed like a sweet old woman. After she handed over the list of addresses, I stabbed her and then proceeded to systematically track down and kill each of her four (adult) children, one by one. Afterwards, I collected my payment from Lucien and moved on to the next assignment.

SPOILERS END HERE

What makes the violence in Oblivion so weighty and morally problematic is that the inhabitants are given lives within the game. They have homes and families and occupations. They converse with one another. They have their own petty conflicts and disagreements. But most of all, they have names. These people are more than mere "little sister" archetypes. When one of the citizens in Oblivion dies, that's it. Whatever unique interactions the player might have had with that character are gone. Nobody will come and take his or her place. Forget BioShock or Grand Theft Auto. When it comes to stoking the player's moral ambivalence, Oblivion far outdoes them all.

BioShock Review: My Second Opinion.

BioShock is a simple, straightforward first-person shooter (FPS) dressed up in next-generation trappings and superb artistic design. There are numerous distractions attempting to draw the player's eye away from the basic formula at its heart, but really, that's all it is.

Is there anything wrong with this? Well-polished, enjoyably playable games are always welcome amidst the scores of shoddy cash-ins, inspiration-free clones and unrealistically ambitious projects needing six more months of development (but not getting them) before hitting retail. In this context, it's easy to appreciate BioShock as a strong effort.

On the other hand, I can't help but feel a little disappointed that there isn't more to it. In all fairness, there's nothing really wrong with the experience it provides; it just lacks the sort of vision and hook truly memorable games possess underneath the graphics, physics and sound effects. During my time in the undersea world of Rapture, there were many small occurrences that didn't mean much on their own. But taken together, they chipped away at my ability to become engrossed in the adventure until all that was left was an underlying desire for completion. It's as good a reason to play a game as any, but perhaps not the most satisfying.

The most serious issue is that BioShock's character development and dramatic narrative never come together. The "silent protagonist" approach is rife with its own set of dangers, and these problems are only magnified by choices like relying on an absurd number of passive audio journals waiting to be picked up, or omitting any real conversations. Even worse, a huge blow was struck against believability in the opening moments, setting the tone for everything to come:

After arriving in Rapture, my character happens upon a glowing syringe. Rather than questioning its contents or waiting for a clue or verbal order from someone else, he immediately and illogically injects this mysterious substance on a whim. Not only does this action fly in the face of common sense, but when the ability to generate electricity is imparted, my character has no reaction and makes no comment. There's no sense of fear, or even of awe. As a player, how can I be expected to give events in BioShock any credence or weight when the person I'm supposed to be playing as doesn't?

Further undermining intellectual buy-in, the much-discussed inclusion of Little Sisters as a catalyst for triggering moral choices falls incredibly flat. Reduced to a repeated resource-gathering scenario, the "good" or "evil" in capturing these carbon-copy moppets seems to be solely based on whether the player has any empathy for the iconic visual representation of a small girl. It may give pause the first time, but when every subsequent encounter is revealed to be an exact duplicate with neither the action nor the individual having any unique quality, all impact is negated.

Although these can be taken as fairly high-level criticisms of things that many games wouldn't even have attempted, BioShock also suffers from other issues that hold it back in ways besides the intellectual.

Seeing the game lapse almost immediately into a series of fetch quests without the proper motivation was off-putting. Some may claim that the game accounts for this later on, but to me it was too little, too late. In addition, despite the superb realization of the period through architecture, music selection and voice acting, it was hard to escape the sensation that most of my time was spent in a series of corridors and abstract areas. Since the game failed to create a place where I believed geniuses could congregate under the sea, I was left to admire specific elements while being unconvinced of the whole.

Finally, the sheer multitude of power-ups available could have been a significant addition to the mix, but again, their handling lacked the weight and importance needed to raise these systems above the level of idle tool-gathering. Between Plasmids, Tonics, Photos, Inventions, Hacking, Weapon Upgrading, Ammunition Types and simply picking up audio logs or the millions of items strewn throughout every environment, there were too many things to be distracted by, with none of them feeling as vital or important as they should.

So what does BioShock do right? It looks great, it sounds great, and it absolutely knows which genre it's in. The difficulty is quite reasonable (and adjustable as well), so I'd actually say that it's a great place to start for beginners or people who don't have a lot of FPS exposure. Beyond this, my feeling is that it doesn't do much besides meet the usual expectations. Without the kind of adrenaline-inducing pace that carries action blockbusters or the kind of emotionally involving characterization that can sustain a slower think-piece, BioShock certainly satisfies the standard FPS criteria but falls short of carving itself a place in the top tier.

Why the Check Mii Out Channel is a complete sham.

With the recent release of the Check Mii Out Channel (or Mii Contest Channel), I thought it might be a good time to voice some of my criticisms of the new channel and of the prospect of holding Mii contests in general. To get right to the point, the Check Mii Out Channel-both as a vehicle for hosting Mii contests and as a forum for recognizing talented Mii artists-is fundamentally useless and doomed to failure. In short, this latest Wii channel is too little, too late. Here's why.

Soon after the Wii's release in November of 2006, a handful of talented users latched onto the Mii-creation system and spent countless hours perfecting Miis made to resemble friends, family, and celebrities. As photos of these celebrity Miis made their way to places like MiiPlaza.net, other Wii users naturally wanted to have these characters for themselves. This is completely understandable. Mii making can be somewhat intimidating; it's much easier to simply borrow someone else's Mii than to try to design one from scratch.

Unfortunately, Nintendo adopted a ridiculously complicated friend system. In order to send or receive Miis, players first had to exchange their 16-digit Wii console codes and go through the tedious process of adding said codes to their Wii address books. Even if a player went through the lengthy ordeal of filling the 100 available address slots, it would still only be possible to send Miis to a maximum of 100 people. Oh yeah, and users could only send 20 Miis per day.

As a result, artists (myself included) resorted to posting pictures to allow others to replicate their designs, thus initiating a process that has all but destroyed any chance of holding a fair Mii contest. Case in point: within one week of posting my original celebrity Mii designs, I received thousands of visitors. Based on its overwhelming popularity, I submitted my Jack Black Mii to the first celebrity Mii contest (hosted by kottke.org). Unfortunately, a copycat also submitted the same design, leading the organizer to nearly disqualify us both.

Since then, the situation has only continued to deteriorate. Many of my Mii designs (and those of many other artists) have been so rampantly copied and recopied that it would be impossible for the average user to trace their origins. On Sunday, I visited the Check Mii Out Channel and submitted some Miis. I noticed that a copy of my Jack Black Mii was ranked first. I pushed the "call friends" button to find other Jack Blacks. Fifty clones appeared (there may be more, but the system can only show fifty), none of which were the one I submitted.

Further exploration of the Check Mii Out Channel revealed a list ranking the top 100 "Mii Artisans," to use Nintendo's terminology. Looking over the submissions of several of these "artisans" revealed that a large proportion of their Miis were exact or nearly-exact copies of designs created by other artists, mostly from MiiPlaza.net. It's too bad that these copycats don't seem to have any problem stealing other people's work and shamelessly passing it off as their own. What's lacking here is a firm sense of Mii ethics.

The sad thing is that this could all have been avoided if Nintendo had provided some of the services contained in the Check Mii Out Channel from the beginning. The old system for sending and receiving Miis preserved the identity of the original creator in that any Miis received from another user were tagged with that user's name and could not be edited. But since Nintendo made it so absurdly difficult to send Miis, this feature turned out to be of little use. Players found it easier to just copy the Miis from pictures and tutorials.

Now that it's here, the Check Mii Out Channel provides a relatively easy way for Wii owners to distribute Miis. It's just too bad it's showing up so late. If this service had been around from the start, then artists could have posted their original Mii characters right away, leaving little question as to who is the real creator of a particular design. But now that most of the best Mii designs out there have been copied and recopied thousands of times, the damage has already been done. Not only that, but there's probably no conceivable way to fix things.

They say that imitation is the sincerest form of flattery. In that sense, I suppose I should be extremely flattered to see several copies of my work sitting in the list of the top fifty worldwide Miis on the Check Mii Out Channel. But at the end of the day, I'm mostly just irritated. Nintendo finally has a Mii contest channel that claims to rank the top "Mii Artisans." But with the rampant Mii plagiarism that has taken place over the past year, the Check Mii Out Channel was doomed to be a sham before it was even released.

Anyone who believes that Mii artists deserve to be credited for their work should visit MiiPlaza.net to confirm whether or not the Miis that he or she has favorited on the Check Mii Out Channel were actually submitted by the original artist. If Nintendo's new channel is to have any hope of holding fair contests and supporting and encouraging the creativity of talented Mii artists, then it will only be through the willingness of Wii owners to support those artists by favoriting their Miis. Here's a list of Mii codes to help people get started.

Videogame violence revisited.

The latest issue of the highly respected journal Psychiatric Quarterly contains a meta-analysis of all peer-reviewed studies published in the last twelve years concerning the relationship between violent videogames and aggressive behavior. The conclusion? While the analysis found some evidence that videogames improve visuospatial cognition, it found zero evidence linking videogames with violent behavior. The article is rather technical, but the author's conclusion is as clear as day:

"Arguably the larger part of the discussion on violent video games has focused on their effects on aggressive behavior, with some researchers suggesting that the relationship between violent games and aggressive behavior is well demonstrated. Results from the current analysis, however, suggest that such claims are unfounded."

To be clear, this is not just suggesting that videogames don't cause violence; rather, it is suggesting that videogames aren't even correlated with violence. Indeed, there is no evidence linking violent videogames to actual aggressive behavior. The article will make little sense to anyone unschooled in statistics, but essentially, the idea is that the minuscule relationship between videogames and aggressive behavior disappears once the phenomenon of "publication bias" is factored in.

Publication bias occurs when authors are more likely to submit, and editors are more likely to accept, studies with a positive result. If several studies on videogames and violence are conducted but only the studies that show a statistically significant correlation are published-while the rest are shelved-then that creates a bias. This sort of bias has to be taken into account when performing a meta-analysis, which is precisely what the author has done through the use of statistics.
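
To see how this distortion works in principle, here's a toy simulation of my own (an illustrative sketch, not the paper's actual method): assume the true effect of violent games on aggression is exactly zero, and that only "significant" positive findings make it into journals.

    # Toy demonstration of publication bias (illustrative only).
    # Assumption: the true effect of violent games on aggression is zero.
    import numpy as np

    rng = np.random.default_rng(0)
    n_studies, n_per_study = 500, 50
    se = 1.0 / np.sqrt(n_per_study)            # sampling error of each study's estimate

    # Each study measures the (nonexistent) effect with random error.
    estimates = rng.normal(0.0, se, size=n_studies)

    # "Publication": only significant positive results make it into journals.
    published = estimates[estimates / se > 1.96]

    print(f"Mean effect, all studies:       {estimates.mean():+.3f}")   # about zero
    print(f"Mean effect, published studies: {published.mean():+.3f}")   # clearly positive

Average only the published studies and a "relationship" appears out of thin air, which is why a meta-analysis that corrects for publication bias can make that apparent effect vanish.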

The author focuses only on studies published since 1995 to correspond with the so-called "third era" of videogames in which 3D first-person shooters (FPSs) became prevalent. Since the mainstream media frequently blames school shootings on FPSs, it makes sense to restrict the analysis to this timeframe. The author does not say that videogames do not cause violence, but rather that the current research simply does not support that conclusion, despite its widespread acceptance in the press.

For me, what's so important about this article is that it debunks the whole videogame violence myth that has been systematically perpetuated by the media. While there may be plenty of studies that link videogames to increased physiological arousal or aggressive thoughts, there is still absolutely no proof that playing games is associated with violent or aggressive behavior. I can only hope that this article will help decrease-if only slightly-the pervasive knee-jerk scapegoating of videogames.

More on Crysis: Facts, hacks, and shenanigans!

The Crysis demo has created a real stir in the PC gaming community. Most of it is for good reason-it's a great-looking game, and the gameplay is very well done. But there have been some issues that bring to light a lot of the marketing ballyhoo that Crytek has been spouting, and unless things change pretty dramatically with the final product, a lot of people will be calling b.s.

DirectX 10 a sham?

Perhaps the biggest news spreading throughout the community is that Crysis' much-vaunted "DirectX 10" settings - the ones only available in Windows Vista under the selection "very high" (the "very high" option is greyed out on the DX9/XP version) - can actually be enabled in Windows XP. Through a simple tweak of the game's configuration files, the settings for the "high" options can be changed to stealthily enable the "very high" features. Not only is there little, if any, noticeable visual difference between the two versions, but these supposedly Vista-exclusive DirectX 10 settings actually perform better on Windows XP. This has a lot of gamers calling shenanigans, and rightly so. We've been fed all kinds of DirectX 10 marketing, and Crytek has really pushed the DirectX 10 feature set from the beginning.

Now, this isn't the final word on the matter. For one, this is just a beta demo, not the final product. And since DirectX 10 is at least theoretically more efficient than DirectX 9, with a little engine polishing and driver revision, we could be seeing superior performance from the DirectX 10 version. Right now, however, we can at best fault Crytek for releasing a demo with subpar optimization, and at worst start wondering if there isn't some seriously unethical marketing going on.

Quad Core and 64? How 'bout it?

Speaking of subpar optimization, two other notable issues have cropped up. Crytek has always touted multi-threading support for Crysis - hell, it came straight from the big cheese of the company just a couple of weeks ago, when he went so far as to suggest that a quad-core processor would make a better upgrade than a new graphics card. Well, they apparently forgot to include multi-threading in the demo. While the demo for Unreal Tournament 3 easily stresses multiple cores, the gaming community has found that Crysis' support for this feature is conspicuously absent in the demo. We can only hold our collective breath and hope that, like the DX10 thing, it's an optimization issue and we'll see multi-threading in the final product. Based on how this demo is bringing even the highest-end systems to their knees, we need every ounce of processing power we can get.

Support for Vista 64-bit is another interesting tidbit. Supposedly, the 64-bit version of the game should perform better than the 32-bit version. This is because the game can disable "texture streaming" and instead load all of a level's textures into RAM. This is supposedly not possible in Vista 32-bit, but then again, they said DX10 quality was Vista-exclusive too. Well, I'm running Vista 32-bit, and I used a configuration tweak to disable texture streaming. As many folks running Vista 64-bit have said, there is a noticeable improvement in visual quality, mainly in distant textures. That's a good thing, and I didn't notice any hit on my frame rate... at first. What I did notice is that the game seemed to get progressively slower, going from a smooth 30+ frames per second (high settings, with some tweaks) down to an unplayable 15-20 frames over time. It appeared that the game was just taking up more and more memory, and performance was suffering accordingly.

That may have just been a quirk of playing 32-bit, but when I looked around the Intraweb, it appeared that many people using 64-bit Vista had the same problem. The issue may not be the operating system per se, but rather sheer memory capacity. Fortunately 4GB of RAM is a lot more affordable than it used to be, and Vista 64-bit is a fine counterpart to its 32-bit cousin (which could not be said of Windows XP). But there are probably a lot of folks out there with Vista 64-bit that do not have 4GB of RAM, and who knows how this will shape up in the final product.

Lowered Expectations

Just as there are plenty of gamers thrilled with Crysis, a fair share of them are disappointed that they aren't getting 60 frames per second with "very high" settings and 4x anti-aliasing. On either "high" or "very high" settings, Crysis is undoubtedly a stunner. But while a top-end rig is needed to run the game on "high", "very high" seems to be beyond all but the most outrageously high-end dual-card PCs (actually, it's out of their reach too, since the demo does not support SLI). The settings can be turned down to "medium" or "low", but then the game just ends up looking a lot like Far Cry (which, btw, is still a gorgeous game three years after its release). What a lot of gamers don't realize is that this is on purpose. Crazy as it may seem, Crysis isn't designed to be playable on the uber-high settings with today's hardware. Crytek has stated that they want Crysis not only to look great today, but to scale forward with hardware so it looks great a couple of years from now, just as Far Cry does.

This hasn't stopped a lot of gamers from complaining of course, but this is a game that pushes hardware, and it certainly looks the part. Let's just hold off our final judgment on these kinds of issues until the 14th, when Crysis becomes available.

Scariness in videogames.

With Halloween here, I've been thinking a lot about scariness in videogames. Can videogames produce scares as effectively as movies? If so, when did scariness become a viable element of game design? Is scariness in videogames relative? How is scariness defined? What are the scariest videogames I've played? What made them scary? How does scariness interact with gameplay, if at all? It's a complicated subject, and there are certainly many ways to look at it, but I'll do my best in offering some thoughts on these questions.

I have to admit that I have still never played a videogame that scared me as much as some of the movies I've seen. It just hasn't happened. The gap has definitely closed over the years, but the fact remains that, so far as scariness is concerned, movies have the upper hand. Is this due to some intrinsic limitation of the medium? Not necessarily, as evidenced by the fact that videogames have been getting creepier over the years. As realism continues to improve, so too will the medium's potential for delivering frightening experiences.

To the extent that scariness in games is tied to realism, it is abundantly clear that scariness has not always been a viable aspect of game design. It's hard to imagine an NES-era game producing anything approximating genuine fright. The games back then just didn't look real enough. So when did that start to change? CD-based games were a big step in that they enabled designers to include digital audio for a much more realistic sound environment. Realistic sound makes a big difference in creating a convincing atmosphere. But that was only part of it.

The shift to 3D started closing the visual gap between games and reality. As the technology improved, so too did the demand for more mature content. Resident Evil marks a significant turning point. While I don't think that Resident Evil is particularly scary (in the interest of full disclosure, I didn't play it when it came out and so I don't know how it might have seemed at the time), it was still a big step in the direction of scariness. Looking back, I think it really marks the beginning of when scariness became a viable goal for videogame design.

Even as I write this, it occurs to me that this whole notion of scariness might be somewhat relative. Most people don't find older movies particularly scary, or at least not as scary as they used to be. I know that's been the case for me. As with videogames, the reason for this seems to be mainly technological. The special effects used in older movies generally aren't as realistic, which makes the whole suspension of disbelief thing much more difficult. Indeed, moviemaking technology has advanced so much that old horror flicks often seem funny today.

I wonder if this relativity of scariness also applies to videogames. It's a tough issue to examine because there weren't a lot of horror-themed games being made 20 years ago, and things that I found scary as a kid wouldn't seem as scary today anyway. I suspect, however, that videogame scariness is not entirely relative. I doubt that a Resident Evil made for the NES would be as intrinsically scary as Resident Evil 4 is now. In a previous post, I argued that videogame enjoyment is largely relative. I don't believe that the same holds true for scariness.

While I've never played a game that scared me as much as any movie, there are some that have come pretty damn close-in particular, Resident Evil 4 and BioShock. I think the key word here is atmosphere. Indeed, one of the biggest improvements afforded by new technology is the quality of a game's atmosphere. It was Resident Evil 4 that really brought home for me how much atmosphere can influence the gaming experience. That I could feel creeped out simply by walking through a dilapidated virtual village struck me as a big leap forward for the medium.

The atmospheric factors in Resident Evil 4 and BioShock have little to do with gameplay, however. The creepiness that pervades these games resides primarily in the visuals and sound, design aspects that could apply just as well to a horror movie. So what can games uniquely offer to the cause of scariness? I think the fight with Verdugo in Resident Evil 4 offers a good example. Dodging and fleeing from this Alien-inspired monster adds loads of tension to a game that already drips with atmospheric freakiness. I get a bit tense just thinking about it.

It would seem then that the interactive nature of games can introduce a distinct element of frantic tension, above and beyond what would result from merely watching a scene unfold. Eternal Darkness: Sanity's Requiem goes about it a different way. The game ingeniously messes with the player's head by subverting certain gameplay conventions-even breaking the fourth wall-in order to make the player feel more creeped out and uneasy. So it would also seem that messing with the player's mind is another path to scariness in videogames.

These examples really only scratch the surface. My videogame knowledge is far from comprehensive, so I'd be curious to know what others think about this. Advances in videogame technology have greatly expanded the creative possibilities of the medium, not least its capacity to scare the crap out of players. This is a significant development, not just because of what it says about the technology of games, but also because of its connection to the larger task of creating games that can evoke a broader range of emotions.