Tobold's Blog
Monday, August 03, 2009

Three eminent physicists from the 18th and 19th century meet for the first time in heaven, sitting on a cloud and chatting about their work. The first says: "I found out in 1724 that water freezes at 32 degrees". The second disagrees, and says: "I found out in 1742 that water freezes at 0 degrees". The third says: "No way, I found out in 1848 that water freezes at 273 degrees". A loud argument ensues, until St. Peter arrives and says: "Mr. Fahrenheit, Mr. Celsius, Baron Kelvin, you're never going to agree on the freezing point of water if you're all using a different scale to measure it!"

Silly? But the equivalent is happening every day in the gaming blogosphere, when people discuss review scores of games. While I was on holiday, Eurogamer published a very readable second review of Darkfall, now giving the game a score of 4/10. And of course the Darkfall community still thinks that score is unfair, and too low. Since the original 2/10 review, lots of bloggers have chimed in and given the game various scores, and none ever bothered to talk about what scale they were using. How can you rate a game X out of 10 if you can't agree what kind of a game 10 out of 10 is?

For example the original Eurogamer review of World of Warcraft in 2005, from the same guy who did the 4/10 Darkfall re-review, gave the game an 8/10 score, lower than the 9/10 given to City of Heroes. A later re-re-re-review of World of Warcraft with Wrath of the Lich King got a 10/10 score. On Metacritic there are scores from 60/100 to 100/100 for World of Warcraft, with an average of 93/100. Warhammer Online has a slightly lower average, 86/100, with a range from 70/100 to 100/100. Metacritic also has a "user score" in parallel, and in that one WAR beats WoW by 8.1 to 7.1 out of 10. So from the point of view of the reviewers WAR is nearly as good as WoW, from the point of view of the users giving scores on Metacritic it is actually better, and from the point of view of subscribers WoW beats WAR in the US and Europe by about 5 million to 0.3 million. If you consider subscription numbers of games with similar cost to be some kind of vote-with-your-feet general user review score, we end up with three extremely different scores, from three extremely different scales.

So why is there this myth that review scores make any sense at all, that there is some sort of universal scale on which all games can be ranked? Even if you don't consider the possibility of some commercial publication giving a game a higher score because of the advertising revenue from that game company, it should be obvious that for example an average gamer, a veteran gamer, and a professional game reviewer will have very different priorities and scoring criteria. Somebody who actually plays Darkfall and decided to stick with it obviously has a rating scale on which Darkfall scores higher than any other existing game, even World of Warcraft, because otherwise we would need to assume that he consciously went for a less good game. At least Fahrenheit, Celsius, and Kelvin agreed that the boiling point of water is higher than the freezing point. With game review scores you can't even find everyone agreeing whether Darkfall is a better or worse game than World of Warcraft. So how on earth can you even start discussing whether "4 out of 10" is a "fair" score?

Personally I only use two scales to rank games. One is the hyper-subjective Tobold scale, on which there are only two levels, "recommended" yes or no. The other scale is a financial one, how much money a game is making, which for games with very similar pricing models is equivalent to the number of subscribers. Out of 5.3 million people in the US and Europe who all had the free choice, 5 million preferred WoW over WAR, and 0.3 million preferred WAR over WoW. Of course the 0.3 million all think that the 5 million are misguided idiots, and vice versa, but economic theory says that decisions involving money are usually more truthful than any opinion poll. But that financial scale has the big disadvantage that in most cases we don't have the numbers. For games like Darkfall or Lord of the Rings Online no subscription numbers have been published, and for Free2Play games like Second Life or Free Realms the published numbers are usually "people who made free accounts", many of whom stopped playing, or just play the game because it is free.

But whatever scale we use to rank games, there is no guarantee that this scale will correspond to YOUR personal one. At best they can provide some sort of probability. If I take a random PC gamer who hasn't tried MMORPGs before, and let him play WoW, WAR, and Darkfall each for a month, there is a high *probability* that he'll like WoW best, WAR medium, and Darkfall least. But if I take somebody who stopped playing MMORPGs in disgust when EA "ruined" Ultima Online by introducing PvP-free Trammel, the order of preference of these three games might well be reversed.

So the best advice before looking at a review score of a game you don't know is to look at the review scores of games you *do* know, from exactly the same source and reviewer. If you agree with the review scores of the games you know, then there is a chance that the review score of the unknown game from the same source is relevant to you. If you don't agree with the previous reviews, then you and the reviewer are using very different scales, and the scores are simply irrelevant for you.
I think the best advice is to ignore the review score completely, and just read the goddamn review. I'm frightened by the prospect that at some point in the future game reviewers won't even post their opinion, just a number.

It's made worse when their number scales don't work; like you said, what is the value of such a scale? If 10 games come out they should span the scale from the best at the top to the worst at the bottom. Instead we see 90% of games in the 7-10 range, and anything below that is immediately perceived as complete trash, even though on any logical scale the majority of games would fall at 5 on a 1 to 10 scale.

Darkfall receiving a 4 when 4 means slightly below average wouldn't cause such a riot as Darkfall receiving a 4 on a scale of 7 to 10.
Where your friends play is such a huge factor in these games that the quality of the game isn't even the most important thing. I find EQ2 a far better game than WoW, but my friends are on WoW, so I play that.
Very Interesting post.
And welcome back Tobold! ^_^

As you describe, review scores aren't scaled consistently among different reviewers and websites/magazines.

But they can also suffer from subjective momentum. The same reviewer could score a game higher or lower depending on different factors, even though a good reviewer should stick to his guidelines and be impartial.

What is fundamental to me is to know how a reviewer got to that number or judgement.

So it's very important to write why you like or dislike a game. What the pros and cons are. Etc. etc.
My advice would be to ignore all reviews except those from people you know, even virtually. If Tobold or Spinks recommended a new MMO as great, I would consider getting a free trial or even a 1-month peek subscription.

The "professional" reviewers are not just biased (everyone is) but "motivated" to be biased.

The things to consider are the game mechanics. Having no death penalty, for example, tells you everything about the difficulty of WoW, and it is listed on the official site.

So simply look at the mechanics and decide if the game is for you.

PS: I looked for a game with largest playerbase since I wanted to learn about people. Well, next time I'll be more careful what I wish for.
I mostly agree.

Fixating on the exact number of a review score is pointless, and even comparing similar scores usually boils down to the personal preference of the reviewer(s).

Nevertheless I find Metacritic very useful: having the score be a sample of different reviewers smooths out the personal-preference variance a bit and allows you to quickly browse the "best" games that have been released lately without getting too invested in gaming news.
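The smoothing effect the commenter describes can be sketched with a few lines of code (the numbers are invented, purely to illustrate how averaging cancels reviewer bias):

```python
# Hypothetical: five reviewers share the same underlying opinion of a game,
# but each applies a personal offset (a stricter or kinder grading scale).
true_score = 7.0
offsets = [-1.5, -0.5, 0.0, 0.5, 1.5]  # personal biases, summing to zero

# Each published score is the shared opinion plus the reviewer's bias.
scores = [true_score + b for b in offsets]  # [5.5, 6.5, 7.0, 7.5, 8.5]

# The aggregate ("metascore") averages the biases away.
metascore = sum(scores) / len(scores)  # 7.0
```

The bet an aggregator makes is that individual biases roughly cancel across many reviewers, so the average lands closer to the shared underlying opinion than any single score does. As the rest of the thread points out, that bet fails when all the reviewers are biased in the same direction.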
Ah, the use of scales. I always shrug when I hear someone say that 20 degrees Celsius is twice as high a temperature as 10 degrees, which happens a lot more than I would expect, even amongst well-educated people. Celsius is an interval scale: it has an arbitrary starting point, so you can't use it to divide or multiply. From 0 to 20 degrees Celsius is twice *the interval* between 0 and 10 degrees. Nothing more. Kelvin is a ratio scale: the starting point of absolute zero is non-arbitrary. In other words: 20 kelvin is twice as hot as 10 kelvin.
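The interval-versus-ratio distinction is easy to check numerically (a minimal sketch, assuming the standard 273.15 offset between the two scales):

```python
def celsius_to_kelvin(c):
    """Shift from the interval scale (arbitrary zero) to the ratio scale."""
    return c + 273.15

# Dividing Celsius values is meaningless: 20 / 10 = 2.0, but that "twice
# as hot" claim evaporates once you move to the ratio scale.
ratio_kelvin = celsius_to_kelvin(20) / celsius_to_kelvin(10)  # ~1.035
```

So in absolute terms 20 °C is only about 3.5% hotter than 10 °C, not twice as hot, which is exactly the commenter's point.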

Reviews are only as good as the reviewer. I loathe the film reviews in the Metro newspaper, because the reviewer’s taste is so different from mine. If he likes a movie, chances are I hate it and vice versa. Scales in reviews are a nice tool for comparing reviews, but more importantly I am looking for two things in a reviews: does the reviewer know what he is talking about and can he tell me why he likes or dislikes something. As Xash said: read the entire review.

Knowledge is power. Does the reviewer know what he's talking about? I value people's taste in music significantly higher when they can draw parallels between 90s Britpop and 70s glam rock than when they claim that Cliff Richard was the lead singer of The Beatles. (True story.) The same goes for MMOs. If you have only played World of Warcraft, you can only compare a new game to WoW – you can't really speak for the whole MMO genre.

I always ask people why they like or dislike certain movies, games, and music. The typical response I get? "Because it's good." If someone tells me she likes special effects and therefore recommends the new Harry Potter film, I know what to expect effects-wise. If someone points out the lack of character development and obvious script flaws in the exact same movie, I have a more well-founded basis to decide whether or not I should go see that movie.
I give review scores about as much time as my horoscope. I'll read the review and try to get an idea of what the game is like based on the reviewer's words, but the scores are pointless to me.

Short reviews that rely heavily on the review scores to get the idea across are nigh on useless to me.

Unfortunately you can't even trust what the reviewers say, as it's always subjective and in some cases outright erroneous. I've found several cases where a reviewer lambasted a game for a missing feature or complained about something, only for it to turn out that they didn't know what they were doing, hadn't researched enough, or didn't have all the facts; a little bit of research later figures it out, making the game that much better.

Personally, even if a game gets bad reviews, I'll try it if it looks like something I'd like. If there's a demo available, I'll try it myself.

If I can't get my hands on a demo, well, I'll get hold of a bootleg copy to try it. If I enjoy it, I pay for an original, if I don't play it for more than 10 minutes, I delete the bootleg. No harm, no foul.

In short, there is only one reviewer I trust now, me.

Personally, I'd like to see game reviewing simply drop the concept of a scale. Review games in a narrative. Explain what you liked, didn't like and who you would recommend the game to.

Forget about the numbers.
1) The reviewer of Darkfall, Kieron Gillen, has made a long post about his Darkfall review on their brilliant gaming blog rockpapershotgun.

2) The first thing I was thinking when reading the little story was "of course, you fools, were you measuring at the same pressure?" Damn my science degree.

3) I always have to think about the movie reviewer Roger Ebert when point systems come up. What you basically want to know from reading a review is "Should I see this movie? Yes or no?".
"I think the best advice is to ignore the review score completely, and just read the goddamn review."

I would like to second Xash.

My suggestion would be to leave out the scale/numbers completely!

They could still give out honors like special awards for this or that, but the focus should be on the review, which is an opinion that you can evaluate yourself.

Books are rarely judged with such scales and rating systems in mind, as it does not make much sense.

But nearly every game (90%+) out there is rated somewhere around 80%, 8/10, or 4/5, or else it's "OMG it is a bad game".

They should have left out the number and just written that they cannot recommend the game for everyone and that it has serious flaws. And I think EVERYONE would have agreed, even Darkfall supporters.

But all get worked up about this silly verdict of giving a game a numerical value.
I think Xash's point is bang on. 4/10 from a site that gives everything not totally unplayable at least 6/10 is too low. It didn't match the opinion in the article.

If you look at Kieron's article the tone is basically "ok if you're into this kind of thing". I defy you to find any other game on Eurogamer where they wrote an "ok if you're into this kind of thing" piece while giving it under 6/10.

Kieron laid his cards on the table. As an editor he needs to stand by his writers. He explains that, then does it. At the price of a very poor and disingenuous piece of journalism.

He also makes it very clear that he is very pissed off by the whole fiasco. I doubt he will be using Ed Zitron much for future reviews.
I think the best advice is to ignore the review score completely, and just read the goddamn review.

Within limits. Because even the written text can be highly subjective. In 2004 lots of reviewers said the graphics of EQ2 were "better" than those of WoW, but I always preferred the more comic style graphics to the pseudo-photorealistic uncanny valley graphics. I also remember reviews of Final Fantasy XI complaining about the bad controls, but I found that playing FFXI with a gamepad was the best control system I've ever seen for an MMO.

Reading does help sometimes, when the review is at the same time very descriptive of the game features, something I'm trying to do when I review games (but then, I give no scores). So if you read that I hated being ganked in Darkfall or EVE, you can still decide that this is something you don't mind all that much and try the game anyway.
Despite the limitations we still need reviews. There are too many games, and we have too little time and too little money to play them all, so we need some kind of pre-purchase pointer to help us choose which ones to play.

I generally don't follow individual reviewers enough to find those whose tastes mirror my own so I tend to look for games which get broadly positive reviews across the board. I find that system works well for mass appeal games but not so well for niche games.
The problem is that there probably is no such thing as an "objective" score. There are games I like, and games I don't. I've never been fond of graphic adventures, so I'm not a good person to review them. On the other hand, if I say that I had fun with one, it might be something to look out for. (Or the game is tainted with gameplay I do like, like the Quest for Glory series.)

When you get to the realm of "professional" reviewers, things get strange. Money is a big motivation there. Consider that sites, especially smaller ones, are dependent on the good graces of publishers to send games for them to review. A few bad reviews and suddenly you're not getting games from a certain source, and that reduces your site's content. One problem with a Darkfall review is that the developer probably isn't a big spender for advertising (compared to Blizzard or EA). So, dissing Darkfall isn't as financially dangerous as dissing the latest WoW expansion. (Really? 10/10?)

Finally, MMO reviews add another level of strange. As the original review pointed out, how much time is "enough" to really review an MMO? An MMO at launch isn't the same as an MMO a few years down the line, even without expansions. WoW and EQ2 are very different games now than they were at launch. Any review from those games' launches is going to be horribly inaccurate. How do you show that in a review? This is one reason why I think blogging is a better way to go about it. But, then money to pay the blogger is another issue.

Personally, I don't pay much attention to reviews. I look at what people are talking about and take a look at those games. Since I'm a professional developer, my tastes are pretty far outside the mainstream anyway.
"the latest WoW expansion. (Really? 10/10?)"

Yeah, that is what they said, follow the link and go to page 3 of the review for the final 10/10 score. And again putting a score on that is highly subjective. On the one hand the quests and zones of WotLK are very well made, there have been lots of other improvements to WoW with patches since 2004, and you can easily justify giving WoW with WotLK a higher score than original WoW. But for somebody who played WoW all the time, you could also very well argue that WotLK is too little, too late, should have come out a year earlier, and should have contained Ulduar already on release. And then of course the challenge level of raiding changed, so whether WotLK is better or worse than vanilla WoW will also depend on how happy you were with earlier raiding.
Don't confuse "Popularity" with "Quality".

How many people subscribe to a game tells you how popular it is, but nothing about how "good" it is. Reviews attempt to tell you that.

Much older forms of art and entertainment than computer games struggle to reach consensus on critical standards. Discussion of Canons has occupied academics in various fields for decades, possibly centuries.

Without an established Canon it can be difficult to judge quality. It takes a long time, and much, much argument, for a Canon to develop. A lot longer than computers, let alone games, have been around. Consequently, for now everyone is relying heavily on conflicting subjective opinion.

Come back in about a hundred years and there may be some consensus on whether WoW was a "good" game or not.
Would you really want to know whether a game is "good" from a canonical point of view? Usually your question is "am I likely to have fun in this game", and then popularity tells you more about the game than an academic scale of canonical quality. And who decides in 100 years whether WoW is a good game, and based on what factors?
I agree with the sentiment that perhaps the best way to evaluate reviews is to look at one reviewer's past work and see if his area of focus and criticisms line up with your own.

However, not all outlets have the same philosophy regarding reviews. Gamespot and IGN are somewhat notorious for taking an "objective," consumer reports style tone in their reviews where the reviewer's own voice and personal experience of playing the game is subdued.

If you are looking for reviews that are useful, I would start by finding sites/publications that emphasize the reviewers' voices and subjective experiences.
After reading some comments I'm getting to a new point of view, different from the one I had when I began reading Tobold's article. And different from my last comment too.

We are probably focusing too much on the review and reviewer side.

We tend to think that a good review is everything.

The reality isn't just a scale that changes based on the reviewer. What's real is that we have an equation with two unknowns.
One is the reviewer's judgment, the other is the reader and eventual gamer.

A game is like a shoe or a film.
One game doesn't fit all.

So the best reviewer is the gamer himself, with his own capacity for judgment and his ability to tell what fits him best. It doesn't matter what others have chosen already.

Also, about the number of boxes sold...
There are tons of games that raise great expectations and sell a lot, but are at the same time a great disappointment, a disappointment that is only noticed afterward.
For whatever reason, I remember a PCGamer editor talking about reviews, and how they would love to do them without a score, but when they tried that people complained and asked for the score to come back. Just like people love 5-10 minute daily quests, people love glancing at a review and seeing the score. If you remove the score, they actually have to put in the effort to read the review to get an opinion, and for many that's too much.

As for the DF review itself, the only real 'issue' was that reading the review gave off a very positive vibe for the game (In that 'if you want PvP, this is the game for it' way), yet then you look at what 4/10 means and it does not add up. It's not a huge deal, and of course the 'why' behind the 4/10 is much deeper than just how much the reviewer actually liked DF, but it's just another reason why what is written is far more important than the eventual score.
"Within limits. Because even the written text can be highly subjective."

Right, but with a number you have no idea where a person is coming from, whereas from just reading a review it's easy to quickly develop a sense of where the writer is coming from. Good reviews explain why they don't like something, not simply that they don't like it. So if they feel game system x is bad because of y and z, you might say "Hey, I love y and z, I should purchase this game".

I've read game reviews where everything the review lambasted is things that I personally enjoyed. I own a rather large collection of games that sit in the 6.0 to 7.0 range on a site like IGN because they contained something I found enjoyable.

People seem to make the mistake of either believing or expecting that reviews are, or should be, anything more than an opinion. There shouldn't be a rating on opinions; this isn't a scientific scale, you either enjoy something or you don't.

But above all it's important to explain why, and that's one of the reasons I frequent this site. I'm often at odds with your opinions, Tobold, but since you explain why you feel one way or the other, I find it easy to extrapolate from your opinions whether I myself would derive enjoyment from whatever you're discussing.
"Out of 5.3 million people in the US and Europe who all had the free choice, 5 million preferred WoW over WAR, and 0.3 million preferred WAR over WoW."

That's not even true, because it assumes that all 5.3 million people tried every title, which is patently false.

I'd argue that more than half of WoW players never tried an MMORPG before or after WoW, but are happy enough with WoW to stick around. If you could somehow force all 5.3 million players to try out the other MMORPGs out there, WoW's subscription numbers would undoubtedly drop.
This is always the way it is with reviews. The context (or scale) for every review is simply the opinion of one person. Financial success, on the other hand, is not subjective at all and is easily measured.

We rationalize that because financial success lacks bias, its value as a form of measurement is much higher than mere opinion. Of course, we can rarely measure this type of success until well after a product launch so its usefulness in decision making is very limited. In fact, in most scenarios, financial success is rarely used to make the decision itself, but instead is used as proof that a previous decision was a good one.

That said, it is also a drastic mistake to consider commercial success a measure of quality. It can just as easily be a measure of good marketing or simply just a reflection of a popular (and brief) trend. For example, a popular actor might fill the box office seats for his new movie, but that doesn't make the movie itself good.

Of course, opinion is just that – opinion. And not everyone shares the same viewpoint. A 4/10 score from one reviewer might be a 9/10 for someone else. That's why I either try to find a reviewer who I already know shares my perspective, or I read as many reviews as possible to get a better "overall" view.
One of my pet peeves about reviews and review scores for PC games is that they usually don't take into account system requirements. It seems to me sometimes that the heavy duty 'computer-hog' games get higher review scores.

I don't have a super-duper computer but I'm guessing most of the reviewers do. When they give a game a great graphics score, that doesn't mean much to me if my graphics card can't handle it.

Also, a game that my computer can handle and that runs smoothly is a big plus, as a choppy game is a big immersion killer for me.

So the first thing I look at in a game is the system requirements, then game FAQs, reviews, and review scores last.
I always liked the review system on GameFAQS: they usually break it down into multiple categories, for example:

Visuals: 8/10
Gameplay: 7/10
Audio: 4/10
Difficulty: &$%*/10
Replayability: 100/10

etc, with descriptions of each score to follow. The pure numeric score in most review magazines or sites only really tells me what "grade" the reviewer would give it, and then I scour their detailed explanation to see why. If I see off the bat that a review site scored a new game with a 40/100, I know they definitely peg it below acceptable/average, and I'll want to know why it failed so badly.
I certainly would not want to argue with the marketplace, but would say that financial numbers are a lagging indicator: a game clearly superior to WoW could come out but on their launch day would have far fewer subscribers than WoW.

I have long maintained that to be useful, a movie reviewer would show their reviews of say 5 movies:
Die Hard, Blue Velvet, Star Wars/2001, Barry Lyndon, Ran .
So I can tell whether they only like Disney happy endings or only art-house ...

Probably recommended or not is about as good as you can get. Even so, it would help to know whether someone recommends 90% or 10%. And a reviewer for a publication that accepts ads is under some pressure.

Too complicated but most useful would be to rate aspects of the game: graphics, or pvp or solo-play.
So if PvP is unimportant to me and graphics are a bit less important to me than to some, then I can judge based upon my own scale. Or try to pick a few demographics (younger FPS maven, older crafting carebear, less-combat player (trying not to be sexist, but the stereotype of the female gamer, which not all females fit)) and rate it for them. In WoW-speak: if you think Arenas are too slow-paced then you will love this game; if you have 450 fishing and crafting on multiple toons you will hate it.
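The per-aspect idea the commenter describes boils down to a weighted sum. A minimal sketch, where the aspect names, scores, and weights are all invented for illustration:

```python
# Hypothetical per-aspect scores from a review, each out of 10.
aspect_scores = {"graphics": 8, "pvp": 9, "solo_play": 4, "crafting": 6}

# My personal priorities; weights sum to 1.0, and PvP doesn't matter to me.
my_weights = {"graphics": 0.2, "pvp": 0.0, "solo_play": 0.5, "crafting": 0.3}

# Re-score the game on MY scale instead of the reviewer's.
personal_score = sum(aspect_scores[a] * my_weights[a] for a in aspect_scores)
# 8*0.2 + 9*0.0 + 4*0.5 + 6*0.3 = 5.4
```

On this hypothetical scale a "9/10 PvP game" comes out mediocre for a solo-focused player, which is exactly why a single aggregate number hides the information different readers actually need.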
Where I get off the boat with Tobold is his insistence that if something is popular, it must be good.

That's an extremely shallow way to analyze something aesthetically. The immediate rejoinder when I say that is something about me being an elitist, but it's a simple fact of life: popularity is not a good indicator of beauty or quality. Otherwise we are put in the position of defending Dean Koontz or Danielle Steel as superior authors to Dostoevsky or Dickens, or whoever.

Especially in a genre with such a powerful network effect, popularity is a pretty bad indicator of quality. All popularity indicates is that's where people have nested, and it is going to take a really, really spectacular game to get these people to leave long term. Eventually people drift back to their network, whether that be LOTRO guys heading back to LOTRO when AOC croaked, or WoW guys heading back to WoW. Once you are in a community, that community outweighs things like graphics or raiding or whatever.

I'm not even trying to rag on WoW here, because you can obviously make a really good case for it being the best MMO out there at this time. I'm just saying that you need to make that case based on the game itself, and not assert that it is good because more people like it.
But I don't say that popular equals good! I say that popular equals popular! I say that if somebody asks me what book I would recommend to read on holidays, I'd recommend Dean Koontz or Danielle Steel, and not Dickens or Dostojevsky. Because chances are that the average person would totally hate having to read Dostojevsky on the beach.

And who the hell defines that Dostojevsky is better than Koontz anyway, if reading the former is a bore and reading the latter is fun? Who defines that Mozart is better than Madonna? I'm a scientist, and I'd really like to know how you scientifically measure that.
I don't think that the "boxes sold" or "active accounts" scales are valid. In order for the scale to work as a quality judgment in this instance, the player needs to make an active, informed choice among alternatives. I doubt that most of the WoW players have even considered different MMOs.

I wrote an article about the nature of opinions and their usefulness a few days ago--it may be worth a read. It's fairly short and to-the-point.
"And who the hell defines that Dostojevsky is better than Koontz anyway, if reading the former is a bore and reading the latter is fun? Who defines that Mozart is better than Madonna? I'm a scientist, and I'd really like to know how you scientifically measure that."

See, you might not say that popular equals good, but then you say things like that.

I could go on a big tear about the infinite ways in which Koontz sucks, but I'll leave it at this: a person who reads one book a year on the beach doesn't have the ability to judge quality, because they just don't know any better. And the fact that a person isn't literate enough to tackle a work is more an indication that they need to read more than a reflection on the work they can't handle.

In the end, quality is subjective. The various opinions of what constitutes quality help make the incredible diversity of literature, music, games, and everything else, and that is great! I don't want everyone listening and reading the same things all the time. But I do want them to be basing their assessments of quality on something more than "lots of teenagers like it, so it must be the best."

[Comment edited by moderator]
Actually the score is for the game producers and the content of the article for the consumers.

Gevlon identified a market failure (I'm pretty certain he still thinks 100% free markets are perfect). Without regulation the games magazines need to appeal to both the readers and the industry, which is not in the best interest of the consumers: us.

Therefore they use this schizophrenic approach: write a lot of text that only the readers really read, and give a high score that the industry can use in commercials. In the case of Darkfall, however, the producer is so small that the magazines don't need to appeal to them. Hence the low number.

The only reasonable source of information is still financially independent bloggers.
I wrote reviews for a games magazine a long time ago. This mag (it doesn't exist anymore) specialized in long, in-depth and thorough reviews. Lots of text, only a few pictures, and no scores. Each reviewer had his/her specialty and only wrote about those games [mine was turn-based strategic wargames, remember those?].

Apparently the publishers got lots of requests for scores, so after a while those were added. But it always felt artificial to me. It suggests an objective quantifiability which just isn't there, since a review is by definition totally subjective: my very own perception of a game, influenced by all sorts of variables.

Loyal readers may have gained some valued insight thanks to knowledge of my writing style, gaming preferences and other peculiarities, but otherwise the practical usefulness of reviews is pretty limited imho.
I agree with you that numbers are a poor metric for judging the value of a game, but numbers can be helpful as a lead-in to the review itself. If a game I haven't heard of scores a nine or a ten, I'm more likely to read the review to find out what made the reviewer give the game such a high number. Reviews themselves are extremely valuable, because I can learn what features appealed or repelled the reviewer and have a good idea whether or not I'd feel the same way.

As far as the popularity metric, I would side with Toxic and agree that it doesn't have anything to do with quality. Although I think the argument for video games' popularity vs. quality is a lot different from that of music, movies, books, or any other type of media. Games have such a minuscule output compared to a centuries-old media form like the novel. So whereas you can make the argument that some of the best-selling games of all time are also some of the highest quality for their respective genre, that argument falls apart when looking at something as broad as books or music. Although the popularity argument applies in some cases with games, I think overall it's a very poor idea to use it as a judgment of value.

I would also wholeheartedly disagree with the statement "Out of 5.3 million people in the US and Europe who all had the free choice, 5 million preferred WoW over WAR, and 0.3 million preferred WAR over WoW." I would argue that free choice never exists in a vacuum and there are a lot more variables to take into consideration.

In a social genre like an MMO, the number of players is going to play a huge role in the decision to stay a subscriber. People are going to choose the game all of their friends play, regardless of the quality. I'm not arguing that WAR is a higher quality game than WOW, only that 'free choice' has more factored into it than it looks. At the launch of WAR, an established WOW player might have had to convince his guild or his real life friends to follow him to the new game before making the switch himself. In the case of a player that is new to MMOs, chances are they will be swayed by the game with more word-of-mouth popularity, or the one they have heard other people talking about.
It's an excellent point. It drives me nuts to see a forum review where someone writes 3 paragraphs and then scores a game 8.5/10. What could I possibly infer from that when I don't even know the person?

I will say that in aggregate the scores do tend to tell you *something* about the product quality. I generally won't check out a game (or movie) under 7.5 on Metacritic unless it has a feature I'm really, really interested in. There are tons of things I won't enjoy with much higher scores, but games below that point are pretty much guaranteed to have multiple serious flaws. Seeing the one-liners is also helpful; add a few well-populated forums to that and you get a pretty good sense of community consensus pretty quickly. In fact, I kind of prefer community reviews on forums because I have a much better sense of each reviewer's preferences than I'd get off a site.

Since most games aren't very innovative, particularly in well-established categories like FPS or RTS, I often find myself drawing the line at roughly an 8.5 metascore, depending on how good a fit the game type is. I'm busy and I generally have time for only a couple of movies or games per genre each year. I also check the metascores to make sure I haven't missed any gems.
I have eaten hundreds of hamburgers in my life, and I think McDonald's makes a pretty good burger. If I were to blog about burgers for the common man, I'd recommend McDonald's.

I've also listened to more Madonna than Mozart. A friend of mine who was a professional music student once told me that she liked pop music (such as Madonna) because it was like candy. It doesn't really make sense to rate them both on the same 1-10 scale, but then it doesn't even really make sense to rate artists in the same genre. Who is better: Beethoven or Mozart?

Ultimately gamers are addicted to numbers (such as big crits), and fanboys like to see their fanaticism justified by big numbers. Magazines sell to fans, so they are pressured to provide scoring numbers. Tobold makes no money off anyone in the gaming industry, and is thus immune to that pressure.

Overall the best advice I've seen in this thread is to find a reviewer you agree with, and try the games he likes.
That last paragraph is extremely good advice. Right before I read it I was thinking that the saving grace that can make a scale still useful is its consistency. If it's consistent with itself, you can extrapolate what the review means to you.

Actually, I think you could use this as an example of how people think about just about anything in the world.
Personally I like the simpler system that Crispy Gamer uses: try it, buy it, or fry it.
Well... Scales are by definition personal. Darkfall could be unbelievably great, but its whole unlimited PvP aspect would still limit it to a 6/10 from me anyway.

Forza Motorsport 2, being what it is, gets a 10 right away. I know it is actually seen as good, but I happen to really like realistic racing.

So the Tobold scale can be misguided too. I tried Luminary and didn't like it...

In general, the text of the review should be more important. What is the game about? Which of the things I like does it have?
Re: "How many people subscribe to a game tells you how popular it is, but nothing about how "good" it is."

I would like to quibble with the "nothing": if I know a game has a million people spending money on it every month, I don't know whether I will like it or whether it is "good" or "art". But I do know something: it has some minimal level of art, performance, network capacity and software stability. The Next Big Thing described by some beta tester on their blog may not.

OTOH, mass market is not always a good thing:

I noticed real differences with Zagat's (NYC lawyers who turned gathering friends' restaurant reviews into a brand and a business).

I love the restaurant review guides and rely on them. Zagat did a brand extension into a movie guide that I really did not like. It was way too mass-market/common and told me little more than box office receipts would have. Whereas the restaurant guide was by and for foodies, and very useful.