Aspyred's forum posts

#1 Aspyred
Member since 2006 • 256 Posts

[QUOTE="mikemil828"]The biggest problem I see is that GameSpot reviews 'are aimed at anyone who would consider spending fifty bucks on Supreme Commander' rather than at the people who actually read game review sites on a regular basis. GameSpot knew back in the good ol' days that everyone who bothered to look them up to read what they thought of a particular game generally had a better machine than John Q. Public. That is obviously why GameSpot's review of Total Annihilation never brought up its, at the time, steep hardware requirements, despite it supposedly being an "absolute disservice" not to mention them. The true question is how GameSpot went from being quite alright with a beastly game to docking one because the developers had the gall to stress the computers they used.


It's hard for me to personally buy that argument seeing as a lot of time has passed since the days of Total Annihilation.

First, GameSpot has changed hands in terms of its staff many times since then, and methodologies can change with such shifts. Second, you're right that back then only enthusiasts (essentially) would visit the website on a regular basis, but that doesn't necessarily imply they were all packing the hardware. Period.

Today, GameSpot lives on the Web, where millions of people are online at any given moment. Web surfing is an accepted social practice nowadays, and GameSpot is linked from many websites on the 'net (refer to the bottom of this page for a few: GameFAQs, Metacritic, GameFly...) and is getting fed a fudgeload of advertising dollars. You have a wider audience, so it won't necessarily be just those enthusiasts reading the reviews. Such an audience may necessitate a few changes here and there.

#2 Aspyred
Member since 2006 • 256 Posts
[QUOTE="ElectricNZ"]Here's my take on things. I disagree that they should deduct points from the overall score for having high system requirements, however they should mention the system requirements in the review which is what a review should be. This seems to be what most people here think also, I'm going to be a tad arrogant but seriously, this is the most logical way, high system requirements don't make a bad game. In a way the score is inaccurate, personally I have a decent rig which means I can run the game at max settings at 70+ average fps, that would mean the system requirements are not a problem, which also means that the game I play would be a "better game" than people with not so good pcs because the high system requirements are not a negative anymore. If the requirements deducted 0.2 from the score, would I be playing a game with those points added back on? I read about Crysis requirements, it is said that it's highly scalable, scalable backwards to 2 years and forward for 1.5 years. To me, that means that 1.5 years later when there are uber rigs, they should still be able to increase the graphical settings even further. In my opinion these new games released aren't meant to be able to played at their max settings at this moment, here's an example. I recently installed Far Cry, back then it was awesome in graphical achievements, it still looks great. I run the game at about 300 fps, now think about this, if it was scaled further ahead than optimised for it's time, it would mean I could crank up the settings even more and run it at 100 fps instead, would you rather run something that looks crap at 300fps or looks great at 100fps? Its a bit complicated but what Im trying to say is that a year or so later most people will have much better computers, and would be able to push supreme commander's graphics even further. Supreme commander is a next gen game. Here's another rant, I'm getting annoying I know. Supreme Commander is a next gen game, all these new pc releases with steep requirements are next gen games, they are meant to be played at max with real "next gen hardware", all these "next gen consoles" and console games are utter bs, how can it be next gen when it's played at full settings right NOW, it doesn't make sense.



You may have a decent rig that, as you say, can run the game at 70+ FPS at high settings, but consider if that weren't the case. What if Supreme Commander ran at sub-30 FPS at medium detail?

And I disagree with your idea that if your computer can run a game that has steep system requirements, it should have the points put "back on". Quick points (I've explored them in more detail in my earlier post; look there if you wish):

- Multiplayer: you would have fewer people playing the game due to the steep system requirements, and graphical lag might become an issue when the game gets bogged down by the plethora of things going on. And that has to be less fun.
- Fewer people playing may mean a smaller community, so for your single-player experience, less user-created content, which impacts the longevity of the game. This is speculative at best.

An obvious counterargument is that neither of those points concretely contributes to GameSpot's review score. But it is something to consider.

#3 Aspyred
Member since 2006 • 256 Posts
[QUOTE="wanted_police"]im downloadin the patch.. hope it turns goodguylapierre
There are some issues with the 1.4 patch when playing SP. It was intended for MP and has some not-so-great results on the SP campaign. One example is that enemies can see and shoot you through some walls and the canvas on tents.



Holy fudge, you're freakin' kidding me, right? I recently played Far Cry for the first time and got completely frustrated at that very thing: they had insane line of sight on my rear. I vehemently questioned why people thought the game was as awesome as it was and haven't played it since.

This is a bug? ... How can it be fixed?

#4 Aspyred
Member since 2006 • 256 Posts
I don't want to sound like an apologist and leave without giving you any definitive answers, but GameSpot has changed hands in terms of its staff many times over since then, so it's hard to keep scoring consistent across the board throughout drastically different timeframes. It just wouldn't be fair.

#5 Aspyred
Member since 2006 • 256 Posts

[QUOTE="Kevin-V"]

Supreme Commander is a beast. We write reviews for everyone who would conceivably be interested in playing the game, not just those with high-powered machines and one-day-old hardware. Nowhere in the review do we act as if everyone has a "piss-poor pc," but it would be an absolute disservice if we didn't take performance into account. You may be fortunate enough to have the most recent and expensive hardware, but our review is aimed at anyone who would consider spending fifty bucks on Supreme Commander, not just those so proud of their system requirements that they place them in their forum signature.

Subcritical

First, I'd like to know how a reviewer is going to determine how many points to take off. How old a machine should be considered worthy of playing a game? Why does the reviewer think they have the power to determine what hardware a developer should design for? Wouldn't it just be better to say that the game will run poorly on older machines, and let the general public decide whether or not to buy it?

When did GameSpot come up with the idea to even take points off for games that have high system requirements? I'd like to know how many points you took off for FEAR. FEAR came out in 2005. Its minimum system requirement for RAM is 512 MB. Many average people didn't, or still don't, have 512 MB of RAM. Did the game's score suffer because Monolith and Sierra decided to make a game that requires at least 512 MB of RAM?

When GameSpot writes a review for a console game and gives it a great score for graphics after the reviewer plays it on an HDTV, what considerations do you make for the multitude of people who don't have high-end HDTVs? The game may look like crap to them. Sure, the console is capable of producing great graphics, but the TV is part of the experience and part of the hardware for console gamers. So when a reviewer is lauding the graphics of a console game, then, just as with PC reviews, they should mention that an HDTV makes the game look best, and that lower-end TVs won't look as good and may as a result affect the overall experience with the game.

I think the best way for GameSpot to handle this is to inform people how the game runs on low-, medium-, and high-end hardware, just as they should let people know what to expect from an Xbox 360 game running on an old TV. Not everyone has an HDTV for their console.

GameSpot is judging the quality of the game, not the quality of the hardware required to run it. Judging the quality of hardware is a slippery slope with many parameters open to debate. Who put GameSpot in charge of determining how a game should run on given hardware? Pentium III, Pentium II? GeForce GTS, onboard Intel video?

Doing that is bad news, my friend.



You ask a lot of questions about the methodology GameSpot should use to deduct points from games that simply run poorly on average gaming rigs. I personally think that, as with criteria like Gameplay and Tilt, it is completely up to the editor to judge whether the fault is detrimental enough to the gameplay on a wider scale.

PC gaming should not just pander to the elite, the ones who either have the funds or sacrifice much to become part of that community. Though that sounds like a normative statement, for a gamer, and especially for online play, it is more fun to play with a larger community in an online match than with a smaller one. It's difficult to say otherwise. And for single-player games (e.g. Oblivion), having a large and active community may increase the longevity of your game through user-created content, mods and other such alterations.

Dude, you're asking GameSpot to create a new level of standardization on top of their existing framework, but sometimes that takes away from a person's opinion. Sure, it's great to get some objective information and let the audience decide, but a large point of a review is to see how an industry professional sees a game, given their experience and know-how (or else they would not have been hired to do what they do).

And you asked, "Who put GameSpot in charge of determining how a game should run on given hardware? Pentium III, Pentium II? GeForce GTS, onboard Intel video?" My answer is that you do. You put GameSpot in charge when you read their reviews, visit their website and spread the word (go advertising dollars!). GameSpot is not a definitive or divine measuring tool for videogames; they are simply one source, and they become as powerful as you or I make them.

#6 Aspyred
Member since 2006 • 256 Posts
I've heard of people using their flash drives for extra RAM, but dude, you're seriously onto something here. I will have to try this. : )

#7 Aspyred
Member since 2006 • 256 Posts
[QUOTE="R-Dot-Yung"]

Do you have an obsession with making your games look as good as they are supposed to, making sure they run at 60 FPS at unthinkable resolutions?

I do, to a certain extent. How about you?

rasstarman

There are some people who will spend thousands of dollars to max games out, for what I don't know. People will buy two 8800 GTXs to play on a 20-inch monitor at max settings. When I compare the difference between medium and high, I cannot say spending so much money is worth it. I have a 7950 GT and it maxes out the few games I throw at it, but it's not a big deal to me. I guess it's all about the e-penis.



It's simply the law of increasing cost. Once you go past that prime point where you are getting the most "bang for your buck", incremental increases in specs cost increasingly more. Most people will try to pin down that sweet spot and leave it at that, but there will always be enthusiasts who have the bucks.
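
Just to put rough numbers on that "bang for your buck" idea, here's a tiny Python sketch. The GPU tiers, prices and frame rates below are made up purely for illustration, not real benchmarks:

[code]
# Hypothetical GPU tiers: (name, price in dollars, average FPS).
# The numbers are invented only to show the increasing-cost curve.
tiers = [
    ("budget card",     150, 40),
    ("sweet-spot card", 300, 70),
    ("high-end card",   550, 85),
    ("dual high-end",  1100, 95),
]

prev_price, prev_fps = 0, 0
for name, price, fps in tiers:
    # Dollars paid for each additional frame per second over the previous tier.
    cost_per_extra_fps = (price - prev_price) / (fps - prev_fps)
    print(f"{name}: ${cost_per_extra_fps:.0f} per extra FPS")
    prev_price, prev_fps = price, fps
[/code]

With numbers like these, every extra frame past the sweet spot costs several times more than the one before it.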

Personally, I research as much as possible to hit that sweet spot and try to hold on to the machine for dear life 'til I almost HAVE to replace the rig if I want to seriously game. You don't want to know how long I did in fact hold onto that P3 600 MHz machine.

I don't think it's always about the ego; some people have a personal investment in their machines, e.g. when they build them from scratch, have connections so parts are cheaper, or just looove the technology. It can become a very involving hobby.

And for proof, you need not look any further than forums such as AnandTech's.

#8 Aspyred
Member since 2006 • 256 Posts
That was definitely me a few years ago, trying to squeeze every frame per second out of my Pentium III 600 MHz, Rage 128 chipset machine while making games look as gorgeous as possible.

Now, with a beefier system in tow, I tend to prefer performance over visual splendor. I won't stand to play a game that dips under 30 FPS when the going gets tough.

#9 Aspyred
Member since 2006 • 256 Posts
I really hope it truly IS optimized for the PC.

I remember reading my fair share of articles referring to the many beneficial changes and tweaks PC gamers would get when GRAW hit the platform. Such disappointment. Jaggies (unless you're the lucky owner of a recent ATi card with Chuck's patch), slowdown, low-res textures (you couldn't select High by default unless you had 512 MB of video RAM)...

Let's hope it'll be better than...that.

#10 Aspyred
Member since 2006 • 256 Posts
[QUOTE="ncderek"]i can't believe what people vote for. this isn't an opinionated thing, it's def without a question rbow vegas. like if you haven't played all of those games, don't vote just for whatever one you own. everyone should be voting for rbow vegas, it's a no-brainer, voting for any other is lying


How is it lying? Throw R6: Vegas into Google along with "performance" or "image quality" as additional keywords.

You'll find debates over the version of the new Unreal engine used and over low-resolution textures, all alongside a central theme of "shoddy port". Even on decent PC rigs, the game chugs, hardly keeping up with the action on-screen. An anemic video options menu makes squeezing performance out of the game difficult.

If you happen to own an nVidia or older ATi video card, you can forget about AA. With as many objects on screen as R6: Vegas has, the jaggies are very pronounced, definitely to the point of distraction.

I've played the X360 version, and it's much more tolerable than the PC version. I'm sorry, but shoddy implementation seriously undermines whatever advances in gameplay Vegas might have brought to the table. Oftentimes, much of the talk is about high hardware requirements forcing PC gaming to take one step forward but two steps back. This is a case-in-point example.

On topic, go Source-based games. Rarely have I seen vistas as mind-blowing as those in HL2. FEAR, in my opinion, is grossly overrated in terms of graphical prowess. Any level that isn't immediately inside a warehouse/office complex looks like a peek back to 2002.