Friday, November 29, 2013

The Numbers Game

For years now, critics have been paid to consume some form of media and report back on whether they think it is worth our time and/or money to experience it. Critics are trusted to deliver consistent opinions on their chosen field, and sometimes they rise to celebrity status, like Roger Ebert and his collaborators Gene Siskel and Richard Roeper. Whether the review appears in print, audio, or video, many people trust critics' opinions and form their media habits around them. I myself am partial to Bob Chipman's (aka Movie Bob) movie reviews on theescapistmagazine.com. After watching a few of his videos, I found his taste in movies similar enough to mine that I could use his reviews as a good baseline for whether or not I would enjoy a movie. I won't agree with him on every movie he reviews, but our opinions are close enough that I can usually judge for myself from his information. Video games are the latest form of multimedia to enter the mainstream, and with them have come the video game reviewers. There are, however, a few quite serious issues with video game reviews and reviewers lately.
First off, I'd like to highlight the issues with the numeric scoring system so often employed in reviews of any medium. Many reviewers use a scale of some sort, giving a score along a range of, for example, 0-5 stars, 1-10, or even 1-100. A main problem with this system is its collision with the grading system of many schools. In school, a 75/100 is considered 'average', an 80/100 is 'good', and 90-100/100 is great, while any grade below 65/100 is a failure. Applying that rubric to review scores is confusing, and it condenses most game scores into a very narrow band at the top of the scale. Ideally, an average game would score 50/100: not great, not bad, just average. This issue recently put games journalist Jim Sterling in the spotlight. In his review of the latest Batman game (Batman: Arkham Origins), he gave the game a 3.5/10. While the school-based system would mark that as a total and utter failure, Jim merely rated it as somewhat below average. Many of the comments on Reddit's Games section echoed similar feelings about the score: it initially seemed low, but after reading his written review, most agreed he had backed it up (the top three first-level comments all say essentially this).
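To make that compression concrete, here is a small, purely illustrative Python sketch. The cutoffs in both rubrics are my own made-up numbers, not any outlet's actual policy; the point is only how differently the same score reads under each.

```python
def school_label(score):
    # School-style rubric: most of the 0-100 range reads as failure
    if score >= 90: return "great"
    if score >= 80: return "good"
    if score >= 75: return "average"
    if score >= 65: return "passing"
    return "failure"

def centered_label(score):
    # Centered rubric: an unremarkable game sits around 50
    if score >= 80: return "great"
    if score >= 60: return "good"
    if score >= 40: return "average"
    if score >= 20: return "below average"
    return "bad"

for s in (35, 50, 75, 95):
    print(s, school_label(s), centered_label(s))
```

Under the school-style rubric, Sterling's 3.5/10 (a 35 here) is an outright failure; under a centered rubric it reads as a bit below average, which is exactly what he meant.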
Secondly, I'd like to take on the issue of Metacritic. Metacritic is a review score aggregator: it takes review scores from many sites across the internet, runs them through a proprietary algorithm, and hands out a Metascore, once again on a scale of 1-100. It further perpetuates the problem from the last paragraph by color-coding its scores: green is 75-100, yellow is 50-74, and anything below 50 is red. That means half of the possible scores are red. The vast majority of games released in 2012 for the PS3, Xbox 360, and Wii U/Wii landed in the yellow category. One of the most mysterious parts of the Metascore is that different sites are given different weights, and nobody outside the company knows what those weights are; some outlets move the Metascore significantly, while others likely barely affect it at all. Metacritic is also watched closely by developers and publishers. Obsidian, developers of Fallout: New Vegas, were to receive a bonus if their game scored 85 or higher on Metacritic; the game received an 84 and missed the target by a single point. Creative Assembly, developers of the Total War franchise, have gone as far as looking at Metacritic scores and cutting content they thought wouldn't earn them a better Metascore.
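Nobody outside Metacritic knows the real algorithm or the per-site weights, but conceptually a weighted aggregator looks something like the Python sketch below. The outlets, scores, and weights are entirely made up for illustration and are not Metacritic's actual values.

```python
# A minimal sketch of how a weighted score aggregator *could* work.
# Metacritic's real algorithm and per-outlet weights are proprietary;
# everything below is invented for illustration.

reviews = [
    # (outlet, score out of 100, assumed weight)
    ("Big Site A",   90, 1.5),
    ("Mid Site B",   70, 1.0),
    ("Small Site C", 60, 0.5),
]

def metascore(reviews):
    total_weight = sum(w for _, _, w in reviews)
    weighted_sum = sum(score * w for _, score, w in reviews)
    return round(weighted_sum / total_weight)

def color(score):
    # Metacritic's published color bands: green 75-100, yellow 50-74, red below 50
    if score >= 75:
        return "green"
    if score >= 50:
        return "yellow"
    return "red"

score = metascore(reviews)
print(score, color(score))  # 78 green with the weights above
```

Note what the hidden weights do: swap the 1.5 and 0.5 weights in that example and the same three reviews average out to a 68 instead of a 78, dropping the game from green to yellow. That is why the secrecy around the weighting matters so much when bonuses hang on a threshold.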
Personally, I think reviewers should use and advocate for rating systems where an average game lands in the middle of the scale rather than in the top quarter. I know some games writers/journalists are staunchly in favor of getting rid of review scores entirely, but I think scores can convey information quickly and effectively if used correctly. Metacritic should also carry far less weight than it does in many developers' eyes, but with marketing departments being so strong at some of these companies, it would take a lot of work to free them from its influence.
