Are Review Scores Pointless? 135

donniebaseball23 writes: With Eurogamer being the latest popular video games site to ditch review scores, some are discussing just how valuable assigning a score to a game actually is these days. It really depends on whom you ask. "I've always disliked the notion of scores on something as abstract and subjective as games," says Vlambeer co-founder Rami Ismail. From the press side, though, former GameSpot editor Justin Calvert still believes in scores. "I've been basing my own game-purchasing decision on reviews ever since I picked up the first issue of Zzap! 64 magazine in the UK almost 30 years ago," he says, while admitting that YouTube is certainly changing the landscape today: "There's something very appealing about watching a game being played and knowing that the footage hasn't been edited in a way that might misrepresent the experience."
This discussion has been archived. No new comments can be posted.


  • by Deffexor ( 230167 ) on Wednesday February 11, 2015 @01:14PM (#49030933)

    I've found Metacritic to be a good aggregator of scores, but more importantly, the user scores (and reviews) tend to be more reliable: they aren't overly critical of games that are generally pretty good but don't meet the expectations of "hard-core" gamers.

    • by jclarker6 ( 3402799 ) on Wednesday February 11, 2015 @01:21PM (#49031033)
      Totally agree with this. And taking it a step further, one could say any single score on its own is not that reliable, but taken in aggregate the cream definitely rises to the top.
      • by Anonymous Coward

        I mostly read negative reviews. You can quickly differentiate between drama queens and people with legitimate gripes, then evaluate what they find frustrating and compare it with your own expectations.

        • by Anubis IV ( 1279820 ) on Wednesday February 11, 2015 @02:34PM (#49031929)

          I'd go even further than that and say that it depends on the type of scale being used as well.

          When it comes to user reviews, if the reviews are thumbs up or down, I'll do the same as you and read the thumbs down reviews first, since it's easier to filter out the extreme reviewers and get a sense for the common issues. If it's a 5-point scale, I'll read through the 2s and 4s, since those reviews can give you a quick understanding of the pros and cons for the product, without nearly the level of overstatement that you'll need to filter through in the 1s and 5s. And I don't even bother reading reviews based on 10-point scales, since the way that everyday users grade on a 10-point scale is arbitrary to the point of uselessness (e.g. some people treat it like a 5-point scale with better granularity, while others treat it like an academic scale).

          • by dj245 ( 732906 )

            I'd go even further than that and say that it depends on the type of scale being used as well.

            When it comes to user reviews, if the reviews are thumbs up or down, I'll do the same as you and read the thumbs down reviews first, since it's easier to filter out the extreme reviewers and get a sense for the common issues. If it's a 5-point scale, I'll read through the 2s and 4s, since those reviews can give you a quick understanding of the pros and cons for the product, without nearly the level of overstatement that you'll need to filter through in the 1s and 5s. And I don't even bother reading reviews based on 10-point scales, since the way that everyday users grade on a 10-point scale is arbitrary to the point of uselessness (e.g. some people treat it like a 5-point scale with better granularity, while others treat it like an academic scale).

            The best scale I have seen used is a 4-part scale-
            Is it worth the money even if you don't like the genre? Yes/No
            Is it worth the money if you do like the genre? Yes/No
            Is it worth the money if you like a very specific subgenre? Yes/No
            Is it not worth your money? Yes/No

            Some Youtube reviewers use this format and I've found it much more meaningful than any pure point scale.
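As a rough sketch (the function and labels are my own invention, not any reviewer's actual format), the four questions above collapse into a single audience verdict:

```python
# Hypothetical sketch of the four-question review format described above.
# The questions are checked from broadest audience to narrowest, and a
# "no" to all of them means the game is not worth the money.

def verdict(worth_outside_genre: bool,
            worth_for_genre_fans: bool,
            worth_for_subgenre_fans: bool) -> str:
    if worth_outside_genre:
        return "worth the money for anyone"
    if worth_for_genre_fans:
        return "worth the money for genre fans"
    if worth_for_subgenre_fans:
        return "worth the money for fans of a specific subgenre"
    return "not worth your money"

print(verdict(False, True, False))  # worth the money for genre fans
```

The appeal of this format is that it answers "who is this for?" directly, instead of forcing every audience onto one numeric axis.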

          • by rtb61 ( 674572 )

            The best scaled reviews generally require at least a 10 x 10 scale: ten categories of value, each graded out of ten, with the scores totalled. It is very important that some categories be contradictory, so that a score of 100 is impossible and, say, 80 is the realistic mark for exceptional value. Realistic evaluations are never done with a single flat score; that answer is totally meaningless and of little value beyond public-relations bullshit marketing.
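A minimal sketch of that scheme (the category names and grades are invented for illustration, not taken from any real review):

```python
# Sketch of the 10 x 10 scheme described above: ten categories, each graded
# out of 10, summed to a total out of 100. Two pairs are deliberately in
# tension ("depth" vs. "accessibility", "innovation" vs. "polish"), so a
# perfect 100 is structurally implausible and a total around 80 marks
# exceptional value.

grades = {
    "graphics": 9, "sound": 8, "story": 7, "gameplay": 9, "value": 8,
    "depth": 9, "accessibility": 4,    # in tension with "depth"
    "innovation": 8, "polish": 6,      # in tension with "innovation"
    "longevity": 8,
}

total = sum(grades.values())
print(f"{total}/100")  # 76/100
```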

          • My favorite "scale" was the Penny Arcade Report format, where the game was listed, the names of folks were across the top, and it was a straight "this guy liked it, this guy didn't, this guy hasn't played it yet".

            What worked is that you could quickly get a sense of what sort of games each person liked, and that gave you a lot better information - you don't really care what *Everyone* thinks, just people who enjoy the same sorts of games you do.

        • Any review in all caps, I skip.

      • by houghi ( 78078 ) on Wednesday February 11, 2015 @02:14PM (#49031671)

        Scores by themselves are useless in many cases. I was once heavily involved in a customer service survey. It was basically "From 1 to 10, how do you like the service?" What we noticed was that Nordic countries gave completely different numbers compared to Mediterranean countries.

        First they thought it was because the service was much better in some countries compared to others. Looking into it and asking customers we found nothing.

        We then started asking a second question: "What service did you expect?" and measured the difference. If you expect a 6 and you get a 6, that is much better than expecting a 9 and getting a 7. So a 6 can be better than a 7.
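The expectation adjustment can be sketched like this (the numbers are the ones from the example above):

```python
# Sketch of expectation-adjusted scoring as described above: what matters
# is not the raw score but the gap between what the customer received and
# what they expected. Positive = better than expected.

def satisfaction(received: float, expected: float) -> float:
    return received - expected

print(satisfaction(6, 6))  # 0: a 6 that met expectations
print(satisfaction(7, 9))  # -2: a higher raw score, but a disappointment
```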

        The issue was that the first time we did not have a base to start from. In school-tests the base is pretty easy. 100% is perfect without any errors.

        Compare it to Americans and English, where one would say "Wow, this is AMAZING. It is the best I have ever seen." and the other would say "It's not bad." (I hope you get what I mean.)

        When I look at scores for movies, restaurants, books or whatever, I read the comments to know WHAT they thought about it.

        • I get what you're saying with the customer service surveys, as I've been involved in those too. First, I found it was important to keep surveys at 3-5 questions. If you exceed the 5-question mark you discourage the positive reviewers, since they don't have the motivation to finish the survey. Negative reviewers are usually far more motivated, since they are on a mission to display their dissatisfaction.

          In addition, the questions need to be easily rated. "Was the service good?" is too general. You need to narrow it down.

        • Re: (Score:2, Informative)

          by Anonymous Coward

          About Nordic countries giving different answers...
          Look up into which grading systems are used in schools.

          Norway has a scale of 1 to 6, with 6 being best(grade school only).
          Finland has a scale of 4 to 10, with 10 being the best (grade school), and 1 to 5, 5 being the highest (higher education).
          Sweden uses letter grades, A, B, C etc.
          Iceland has a scale of 0 to 10, with 5 - 5.99 being 3rd, 6.0 - 7.24 being 2nd and 7.25 - 8.99 being 1st grade. 9.00 and up is a Fine grade. This is very similar to the grading sys

      • While lots of people love to hate Metacritic, and there are some flaws for sure, particularly if people use it as a "good-o-meter", I find it to be really useful. If a game has really high scores, I find that it is usually worth my time to check out, even if it isn't the sort of game I'm usually that into. If all kinds of people loved it, it probably does something right and I may well enjoy it. Conversely, if a game that looks interesting gets really low scores, I may wish to investigate why, as usually

      • Aggregating scores makes them LESS reliable. There are many instances of large publishers pinning bonuses to metacritic scores, leading devs to improperly attempt to court reviewers to raise them.
    • As long as the score focuses on "Should you buy this?", and isn't a weighting of various factors that never answers IS IT GOOD / SHOULD I GET IT?!

      I guess they are just fine and helpful.

      If it were a board game, the number of miniatures and the amount of artwork on cardboard wouldn't help much if the game is shit.

    • by solios ( 53048 )

      Metacritic is also a great - and in some cases the only - way to get *negative* reviews. Review sites are astroturf at best and completely useless at worst. I couldn't care less how awesome a paid reviewer thinks a product is; I want to hear about the experience somebody who paid money for a thing has had with it - whether they think they got their money's worth, what pisses them off about it, etc.

      Then there's the fact that with games the product is largely subjective - for example Metacritic gives Dishonored a 9

    • by Anonymous Coward

      Except Metacritic is a cancer and doesn't update the score based on updated reviews; they only accept the first score. Meaning if a game releases completely buggy and a reviewer gives it 2/10 because of that, then once it's patched a week later and he changes his score to 8/10, Metacritic doesn't accept the new score. It's ridiculous.
      And that's just one example. There are also those devs who missed out on their bonus because the Metacritic score was 1% too low.

      • There are also those devs who missed out on their bonus because the Metacritic score was 1% too low.

        You'd have to be insane to take a job where your pay was dependent upon a metacritic score.

          • There are also those devs who missed out on their bonus because the Metacritic score was 1% too low.

          You'd have to be insane to take a job where your pay was dependent upon a metacritic score.

          He's referring to the disaster that was Depression Quest. Only professional critics gave it a high score. However the user score on metacritic set a new record for "poor", and the guy in charge of changing that was fired.

    • by solios ( 53048 )

      I've found that high ratings tend to be simple and very echo-chamber - people that praise a game tend to like it for similar reasons. The real variety is in the negative reviews, which is where any issues with gameplay or story (or both) tend to surface. If I'm interested in a game enough to want more than the upvoted reviews on the Steam Store (which tend to give a fairly concise answer to the question "Why would I buy this?") I've found that one or two positive Metacritic user reviews and then three or

    • 0/10 has day one DLC.
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Are you kidding me? The user scores are just full of people who mindlessly rate the game 0/10 or 10/10 based on the current score in order to pull it up or down.

      Or whenever there's a controversy in the game, they'll rate it 0/10 for a tiny reason to bring the score down.

      There's nothing useful in user reviews of popular games.

      Steam's system is a little better; it only allows a positive or negative experience, and the reviews just show the consensus as "mixed positive". Still, there's bandwagoning but at leas

    • by Dutch Gun ( 899105 ) on Wednesday February 11, 2015 @03:03PM (#49032309)

      the "users" scores (and reviews) tend to be more reliable in terms of not being overly critical of games that are generally pretty good

      In my experience, users are very extreme in their assigned scores. If they enjoy the game, they assign it a 100 (ZOMG, Best Game EVAR!!!). If they didn't enjoy the game for some reason, it rates a 0 (WTF?! Worst !@#$!@ game of ALL TIME!!!). There are often relatively few scores in the middle. User ratings will often pick up on issues that the press doesn't touch, though, which is a good thing. For instance, when a company introduces intrusive DRM, or if an online-only game has a very bad launch, users will flood the systems with very low scores, where professional reviews would not have touched on (or perhaps even seen) these issues.

      Generally speaking, if a game gets universal praise, there's probably something worthwhile about it, at least to many people. If it generally gets horrid scores, you know that there's something seriously wrong. No, review scores aren't pointless at all. If you want to get the details, then read the actual review, and you can find out if you agree with those specific points or not.

      Eurogamer isn't really dropping the score, incidentally. They're just moving to a "four star" system ("Avoid", "No Recommendation", "Recommended", and "Essential"). In truth, I think that's probably a more honest way of scoring, because it's sort of silly to try to rank different games based on one or two percentage points of difference, which is probably completely arbitrary. For instance, what's the difference between a game that ranks 90% on Metacritic and one that ranks 89%? Answer: one more high-profile review gave it five out of five stars instead of four out of five. This also avoids the problem of having to rank very different genres against each other, or of trying to convey what a particular score "means" (there's almost always a chart along with the score). In a sense, giving a game one of four rankings is cutting out the score as a middleman.

      Also, honestly, I sort of wonder if dropping numeric ratings is a way for gaming sites to give themselves an "out" with publishers, who may apply pressure if their review scores are too low. I've heard of bonuses and such being tied to Metacritic review scores, which is a pretty nasty thing to do to your employees, IMO. Also, I'm guessing websites don't care to have their reviews simply aggregated by Metacritic into a single, unified score.

      • by mjwx ( 966435 )

        In my experience, users are very extreme in their assigned scores.

        The problem with user scores is that most users are extremely unreliable or heavily biased: fanboys, anti-fanboys, people who like everything, and people who get off on writing negative reviews. There are so many peaks and valleys that you can't be sure what is reliable. Above all this, you have no idea of the experience, biases or preferences of the reviewer.

        With a professional review you can have some idea of the motivation of the reviewer (sometimes

      • One example of Metacritic scores being contractually tied to bonuses was with Fallout: New Vegas. Gamasutra reported on it almost 3 years ago: http://www.gamasutra.com/view/... [gamasutra.com]

        I never played Fallout:NV, but I remember hearing Jeff Gerstmann describe it as "...very well written... and kinda broken." I also remember someone posting a Windows 7 Reliability Monitor graph, with the only crashes reported coming from "FalloutNV.exe".

        One thing I really like about Eurogamer's approach is that they're simultaneousl

    • by sd4f ( 1891894 )

      I'm of a similar opinion. Score aggregation is a far more reliable method of determining the quality of a game, very quickly. It's not perfect, but it works rather nicely for me.

      The problem that has been happening is the stupid score of 7/10, which demonstrates the stupid offset that game reviewers have been applying to their scores. 7/10 is more or less the average game; anything below is usually crap (or niche) and anything above is OK. The dumbest thing about it is that scores 0-5 essentially are varying

  • by neminem ( 561346 ) <neminem&gmail,com> on Wednesday February 11, 2015 @01:15PM (#49030945) Homepage

    Betteridge's Law indicates that the answer is "no", when of course, the answer is actually "duh".

    • The correct answer is "it depends". Some sort of ranking spread across multiple attributes (e.g. gameplay, visuals, story, etc.) is more useful than a single number, and especially useful when accompanied by a sentence or two summing up the reason (e.g. "the story was a contrived and uninspired afterthought").

      In the absence of that, I find negative reviews to be vastly more helpful than positive ones. The game's marketing can handle selling me the game, and people are better at articulating why they don't like somet

    • by AmiMoJo ( 196126 ) *

      Scores are useful once you figure out what they really mean.

      10 - Wait a couple of weeks for the servers to recover from the day 1 meltdown and meanwhile try out the demo

      9 - Get it in the Steam sale if you really like this sort of thing

      8 - Don't buy unless you are really bored and see it for less than €5

      1-7 - Total crap, avoid.

      • But there is always the good old "This game had one minor point that I strongly disagree with but most people won't care about. 0/10." that makes even the most basic of inferences entirely unreliable.
  • by Anonymous Coward

    I only read Bennett Haselton reviews (& also reviews of him).

  • by Anonymous Coward

    It's not that the scoring system in these mags is bad or worthless; it's that the reviewers themselves are increasingly becoming shills for the gaming industry. When you see ads for the game right next to the "review", it's a good sign that the magazine might have some vested interest in giving enthusiastic reviews.

  • by fuzzyfuzzyfungus ( 1223518 ) on Wednesday February 11, 2015 @01:22PM (#49031045) Journal
    This seems like a rather pointless question, since 'reviews' and 'review scores' serve somewhat different purposes.

    If you want a comparatively deep examination of a game, strengths, weaknesses, what is it trying to do?, does it succeed?, who is it aimed at?, etc. an answer like "65" or "8" is practically useless. If you want to do a metacritic-style survey(or decide what long-form reviews to read when faced with 2,000 games), though, 3 pages of prose and musing, each, from two dozen sources isn't going to cut it.

    Anyone pretending that a hundred-point score is actually that precise is likely fooling themselves; but there's a much stronger argument that you can justify at least a 1-10 or so scoring system, unless you are a pure, handwaving "It's all, like, intersubjective, man..." type.
    • I disagree. The point of those x/100 is that they usually combine many other reviews.

      At heart, people really only give 3 reviews.

      1) Bad

      2) Meh

      3) Good.

      Yeah, sometimes you get "worst" and "best", but those really don't matter except in unusual circumstances: the average will deal with those situations.

      You get 65 out of a hundred only when you compile a huge number of reviews.

      Then you can get helpful stats. Simply take the average. If the average of 20 people is a scored o
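That compilation step can be sketched like this (the Bad/Meh/Good-to-number mapping and the review counts are my own illustration):

```python
# Sketch of how coarse verdicts aggregate into a finer score, per the
# comment above: individuals only really say Bad/Meh/Good, but averaging
# many of them produces something like 65/100. The mapping is illustrative.

values = {"bad": 0, "meh": 50, "good": 100}
reviews = ["good"] * 11 + ["meh"] * 4 + ["bad"] * 5  # 20 coarse verdicts

average = sum(values[r] for r in reviews) / len(reviews)
print(average)  # 65.0
```

A handful of extreme outliers barely moves this average, which is the comment's point about "worst" and "best" votes washing out.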

    • This review score inflation is a mere consequence of its inherent subjectivity. Each person can maybe establish some sort of consistent scoring methodology, but it will be of little use to other people. If you try to be objective and neutral, you pretty much always find a way to boost the score, because each poorly executed aspect of the game in question is pretty much guaranteed to be unimportant to someone and thus not to warrant a lower score.
  • by Piata ( 927858 ) on Wednesday February 11, 2015 @01:24PM (#49031079)

    Almost no games get below 40, while any game that doesn't get 80 or more is considered a failure. Then you have people giving games 3 out of 5 stars, which translates to a score of 60, which skews things even more. Plus, tent-pole games like CoD can be executed extremely well but offer nothing new, so how do you review that? There are games with low interaction (point and click) and high interaction (RTS). How do you compare one against the other? Good reviews are also often given despite massive bugs, incomplete games being released, or week-1 launch disasters (like Diablo III).

    It's issues like that which make me understand the no score review trend.

    • by Kjella ( 173770 )

      If those are your only two complaints, I fail to see the problem. As long as they rank them correctly within the genre, and you can apply your own mind to realize whether you'd like that genre or not, isn't it simple and great? I mean, Schindler's List is a great movie, but if you're looking for a romantic comedy you're way off target; likewise I won't suddenly play Portal instead of Skyrim because one got a 0.1 higher score than the other.

      The greater problem is all the reviews that are basically bought PR where

      • How the f*** do you expect people to "rank them correctly within the genre"? What does that even mean? Where does Dwarf Fortress stand in the genre of DF-likes? Where does Brutal Legend stand in the genre of Action, Comedy, Adventure, Open World RPG, Hack and Slash, RTS games?
    • I disagree with your statement, because the scores are rated by all gamers regardless of their game preferences. The rating is there to guide you towards the popular choices. Popular in most cases doesn't mean you will like it; it just means many people like it. Movie ratings are no different in that matter. The critics sometimes rate a movie horribly but the public loves it.

      If 9 out of 10 gamers don't like a game, why would it get a good score?

      • Popular in most cases doesn't mean you will like it, it just means many people like it. Movie ratings are no different in that matter.

        True, which is why they're pointless. As are movie ratings, for that matter.

          • In this case they cater to the majority, which doesn't make them pointless. After all, the point of a score is to help you find what you are most likely to like. If you are more adventurous, like I am, you'll actually go and try games that aren't in the top 50.

          As far as I'm concerned the most popular games are the games that should get the best scores because they are the ones people like.

          • If you are more adventurous like I am, you'll actually go and try games that aren't in the top 50.

            Yes, almost exclusively. Games that rate highly tend to be games that have high production values in terms of video and audio. Games with such high production values tend to be games with poor gameplay.

            As far as I'm concerned the most popular games are the games that should get the best scores because they are the ones people like.

            If the ratings were a simple measure of popularity, I wouldn't have a problem with them at all. But instead, they pretend to represent the overall quality of a game in a single number, which is impossible. Therefore, the ratings are worthless. Often worse than worthless: they are misleading.

            • Yes, almost exclusively. Games that rate highly tend to be games that have high production values in terms of video and audio. Games with such high production values tend to be games with poor gameplay.

              Examples please. Usually if they make it to the top of the popularity list they have proven gameplay to meet player expectations.

              If the ratings were a simple measure of popularity

              I guess it depends where you go for your reviews.

              • Usually if they make it to the top of the popularity list they have proven gameplay to meet player expectations.

                I'm not claiming otherwise! I'm saying that there are a lot of people who buy games and have rather low expectations in terms of gameplay. That's fair, people like what they like. My point isn't that people should have tastes similar to mine, it's that review scores are not sufficient to cover these differences, and are therefore entirely worthless for me.

                • and are therefore entirely worthless for me.

                  I can agree with that part, but scores are catered to the black sheep of gaming. You and I are black sheep, hence we just need to continue doing what we do to find games we like.

    • Almost no games get below 40, while any game that doesn't get 80 or more is considerd a failure.

      It depends on the scale that the rating uses. In most schools, anything below 60 or 70 is considered a failure; you did more than nothing (which would be a 0), but you didn't get enough correct to even be considered adequate. In game reviews, you could consider 50 to be the starting score that a game gets just for making it to the title screen. A score of 80 might be the minimum needed for a game to be considered successful (i.e. the equivalent of a C in school).

  • No. [wikipedia.org]
    • by Piata ( 927858 )
      Sorry to burst your bubble, but this is one situation where Betteridge's law might actually falter. Game review scores have been broken for some time, and removing them entirely might be a step in the right direction.
      • Double agree, and out of mod points.

        Furthermore, the awfulness of review scores has been exacerbated by some publishers relying on Metacritic scores to determine bonus payouts to developers.

        So I'm glad this is becoming a trend. I hope this extends out to movies. x stars out of y still doesn't tell me if I should go see it on opening night or wait for DVD

  • Asking if the number is useful is supposing reviews are honest and unbiased.

    Since most reviews are prohibited from coming out before the game, and one assumes most of these websites are getting paid for favorable reviews... the number they put next to their review is probably as meaningless as the entire review in a lot of cases.

    • by radish ( 98371 )

      Since most reviews are prohibited from coming out before the game

      Review embargoes are, in general, a good thing. I know you don't believe me :) If the outlet doesn't get the final version of a game until a few days before launch they don't have time to play the thing and write a decent review if they're competing with every other outlet to get that all important first post. Having an embargo takes that first past the post advantage away and lets the outlets actually spend the time they need to do the job p

  • One person's Schindler's List is another person's Bad Taste. Like reading wine scores, you'll find there are particular reviewers that you agree with and those you don't. When looking at reviews of games you've enjoyed, find the reviewers that agree with you in terms of both score and analysis. Fall back on those gamers for reviews. It's not always possible -- many gaming sites have different people review games, there is staff turnover, etc.

    The score will give you a general idea of whether the game is

  • by Holammer ( 1217422 ) on Wednesday February 11, 2015 @01:40PM (#49031267)

    The Steam user score is currently my most trusted metric for how good a game is: something rated "Overwhelmingly Positive" across a couple thousand user reviews is usually a worthy purchase.
    For non-steam users, imagine Metacritic except you can only submit your score/review if you own the actual game and it's either thumbs up or down.

    • It's a good direction, but someone should come up with an adjustment, because:
      a) pissed-off players rant more
      b) pleased customers usually don't care about feedback
      c) too many "happy" reviews that list more negative than positive points
      d) positive reviews of the sort "please give them 1-n patches to fix a quadrillion bugs before you vote negative"
      etc.
      • The Steam overlay is included in every game launched through Steam. After closing the game, or preferably after launching it once you've played more than a few hours, it could prompt you to click thumbs up or thumbs down. Might be a little annoying, I guess, but it'd probably even out the negative-versus-positive bias, and it only requires clicking a button.

    • by sd4f ( 1891894 )
      I've never used it for one simple reason; Bad Rats...
  • Actually the numbers themselves are almost meaningless but are a nice general gauge, especially when they are user reviews.

    The correct way, I think, to use them is to look for a few high scores, and then read the low- and medium-score reviews. Are the 1-star reviews from people who don't even like this type of game? That's actually a GOOD sign of a decent game.

    Are the 1 star reviews complaining about bugs and play control? Watch out.

    Overall though, after checking this stuff out, I almost always watch at least the

  • Excellent question on an important topic. 7/10
  • As far as I can see the reviewers are little more than lying shills that make up the numbers anyway.

    Here is the title of a review column I saw on a popular site

    So You Don’t Like TSW? I Don’t Care (if you want to read it google for it, I have no intention of driving traffic there)

    Turns out the "reviewer" was working hand in glove with the game's community management; they might as well have just taken out the scores and put up a dollar sign with how much they charged for the review.

    • by 0123456 ( 636235 )

      I'm sure it's just a coincidence that, when I was still reading game magazines, the games that got high review scores were usually the ones that had full-page ads in the magazine.

  • Simple numeric scores are useful in the same way that having a college degree is useful - it enables people who lack the time or motivation to perform an in-depth examination of a candidate or game to make a snap decision based on a single, over-simplified criterion.
  • by LaurenCates ( 3410445 ) on Wednesday February 11, 2015 @01:56PM (#49031413)

    The thing that I find increasingly aggravating nowadays is how much is hung on score rather than substantive view of the content of a thing.

    For instance (on a sort-of related topic): when a highly anticipated movie like "The Avengers" is released for critics and the scores start coming in, and it turns out critics found the movie overwhelmingly positive, the fans get all hopped up when someone dares to give the film a "rotten" instead of a "fresh", ruining a "perfect" score, as if there were somehow some personal investment in a movie getting 100% of critics to like it (or spoiling their enjoyment of it if a mere 1% did not).

    Except for the fact that not all critics thought the movie was perfect, and the Tomatometer merely indicates that the movie was at least good enough not to be considered bad.

    The score is the headline, sure, to draw in people to read the review in the first place. But a lot of people gloss over it and stop engaging their critical faculties, brandishing a metric over true criticism as validation of their personal tastes (like Rotten Tomatoes readers; if you don't believe that people do this, find out what happened to critic Eric D. Snider after he posted a fake negative review of "The Dark Knight Rises" before he'd actually seen it).

    I don't have any "infamous" examples of games to point to, though I'm sure examples exist; in fact I wandered into this topic curious about which games were controversial in the same way, since both media have the same kinds of fanatics attached to them.

    My thought is to get rid of scores so that people actually consume opinions, not reduce them to a single number, but that's just me.

    • by sd4f ( 1891894 )

      A big part of the problem recently is that 'consuming opinions' means that you are getting all the internal bias of the reviewer, warts and all. That means if they get all antsy about not having a playable female character, then the game is toast. If the game has been criticised by Anita Sarkeesian (or will be) then it's awful. If it doesn't cater to alphabet soup people, then it's sucking up to the privileged white males and therefore bad.

      So when you have opinions, all you end up with is the game 'media' p

      • by sd4f ( 1891894 )
        Also forgot to add that the game sites also tend to gloss over issues which plague a lot of games. Namely debilitating DRM, day one DLC and early access. If they don't stand up for the consumer, then what are they for?
  • The essential / recommended / avoid rating only works if you happen to find a reviewer whose tastes exactly match yours. Which, granted, given the enormous number of reviews on the Internet, is a lot easier now than in the dead-tree gaming magazine days. But the key thing to take from that should be that you need to somehow take advantage of the large number of reviews, not simplify the scoring system. The review site should use pattern matching where you list which games you liked, the site searches for ot
  • To see if I should bother even watching the trailer; if I like what I see in the trailer I will watch a review (usually at least two), THEN I go and buy the thing. I also use review aggregators to find out which good games are coming out; there are so many games coming out, and I don't keep up with game news, so in that regard the current system works very well for me. Although Steam user ratings are getting more useful by the day, I still use them only to then check the reviews out.

  • The review scores are like a broken pencil...
  • Sites like metacritic for games or rottentomatoes for movies tend to provide the most usable results for me. The contrasting reviewer types and the ability to see at least some negative comments really help sort out the worst choices.
  • by Yakasha ( 42321 ) on Wednesday February 11, 2015 @02:46PM (#49032105) Homepage
    They just changed it to a 3-number scale and gave the numbers names.

    games will be considered Recommended, Essential or Avoid.

    Translated to a 1-10 scale, that is 5-8, 9+, and 1-4.
    Translated to 5 stars, that is 3+, 4.5+, and 1-2.
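    That translation, expressed as a quick Python sketch (the numeric cutoffs are just my guesses above, not anything Eurogamer has published):

    ```python
    def label(score):
        """Map a 1-10 review score onto Eurogamer-style labels.
        Cutoffs are assumed: 9+ Essential, 5-8 Recommended, 1-4 Avoid."""
        if score >= 9:
            return "Essential"
        if score >= 5:
            return "Recommended"
        return "Avoid"

    print(label(9))  # Essential
    print(label(7))  # Recommended
    print(label(3))  # Avoid
    ```

    Which makes the point: whatever words you pick, you've still got a lossy bucketing of a number.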

    Better is to find a specific reviewer that favors the same types of games that you favor and read what they have to say about a particular game. Reviewers themselves should be given scores in different genres to reflect their interest, and scores in different aspects of games that don't necessarily translate between genres and are not necessarily used on every game (perhaps each reviewer chooses 3 most important factors of a set list of say, 10 different areas); then have multiple reviewers on each game.

    How should we score an excellent game with severe networking issues? A flawlessly polished game with a hackneyed design? A brilliantly tuned multiplayer experience with dreadful storytelling? If you expect the score to encompass every aspect of a game, the task becomes an exercise in futility. Add an inflated understanding of the scoring scale in many quarters - whereby 7/10 and even sometimes 8/10 are construed as disappointing scores - and you have a recipe for mixed messages.

    Excellent game with networking issues:
    "Mary the FPS guru" says:

    Polish: 9.5/10 "It's pretty!"
    Networking: 4/10 "Networking problems ruin everything."
    Replayability: 8/10 "Single player scenarios keep me coming back."

    "Matt Foley the puzzle champ" says:

    Team Balance: 8/10 "Pick your army, it's all about skill"
    Networking: 6/10 "It's ok because I live in a trailer down by the river!"
    Price: 10/10 "Freeware, freeware, freeware."

    You get the idea. Sorry for the babbling. No time to reword this.
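    To make the idea concrete, here's a rough sketch of how a reader could weight those per-aspect scores by their own interests (all names, scores, and weights are made up for illustration; this isn't any site's actual system):

    ```python
    def personalized_score(aspect_scores, reader_weights):
        """Weighted average of a reviewer's per-aspect scores, using the
        reader's own importance weights (missing aspect = don't care)."""
        total = sum(reader_weights.get(a, 0) * s for a, s in aspect_scores.items())
        weight = sum(reader_weights.get(a, 0) for a in aspect_scores)
        return total / weight if weight else None

    # "Mary the FPS guru" on the excellent-game-with-networking-issues example:
    mary = {"polish": 9.5, "networking": 4, "replayability": 8}

    # A reader who cares about networking twice as much as replayability
    # and not at all about polish:
    weights = {"networking": 2, "replayability": 1}
    print(personalized_score(mary, weights))  # (2*4 + 1*8) / 3 = 5.33...
    ```

    Same review, different readers, different bottom-line numbers, which is the whole point.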

    • by solios ( 53048 )

      Over the years I've learned that I can rely on two factors when it comes to games - word of mouth and development staff. Somebody who knows me and knows what I like probably isn't going to recommend something outside of that sphere (or if they do it's due to incomplete information, for a laugh, or for reasons unrelated to gameplay), and if I like a game or series of games it's usually a good indicator that I'm going to like whatever the people that made that game work on next - usually but not always.

      I agr

      • by Yakasha ( 42321 )
        My idea is the other side of the word of mouth coin. The reviewers can't know you, but you can know them.

        Having reviewers announce what they're primarily interested in would allow you to choose the reviewer that best matches your own preferences. Then you can easily give that reviewer greater weight when reading reviews. If you don't want to or don't need to choose a reviewer, you can instead look at individual reviews for the specific points that best line up with your interests. If good networking fun

        • I like it. I suppose it is kind of what the "curators" thing in Steam tries to do, or various Youtube channels.

          Here is a person whose opinion I trust, I will go with their recommendation.

  • Game reviews are good at generally identifying the best games, but I wouldn't rely on them for more than a very rough metric. You simply have to play a game to know if you like it.

    The best game review I've ever experienced was a live stream of a well-known GTA V speed runner playing the game on next generation hardware. Speed runners know games almost as intimately as the programmers do. The conclusions of the speed runner were dead on. The game has beautiful graphics, but playability had declined in many a

    • I realize that nobody has the attention span to read full reviews anymore, and media wants everything boiled down to a simple number they can display at the top of the article, but with games a title could rate very highly... and I would still think it was terrible.

      A more useful abbreviated scoring system would be:
      "You would like this game, if you like this sort of game."
      "This game is a poor example of its type."
      "This game is unique, try it if you like X."
      "This game sucked."

      But I suppose that would require reading.

  • If you find someone that has similar interests as you then their reviews will be good for you specifically. However, general aggregate reviews or population votes are only correct as an average and may have zero accuracy for any specific individual.
  • Actually I jest; I completely agree that rating scores for pretty much anything are useless. A number out of 10 or 5 is simply not enough to tell you whether a product is good, any more than a bare statement of "Do/Don't buy this product." That's why most five-star rating systems end up as either 1 star (don't buy) or 4 (do buy); nobody gets a 5.

  • Review scores have always been worthless, with the sole exception that very low scores are almost always indicative of crap. Higher scores don't tell you what you need to know. For instance, I've seen games with terrible gameplay get high scores simply because they're visually beautiful. That's probably legitimate for certain types of players, but it is actively misleading for others (like me) who just want the gameplay to be excellent and couldn't care less how beautiful the art is.

  • This headline violates Betteridge's Law, and therefore constitutes pro-review score propaganda. I call shenanigans!
  • They're great for knowing who has publisher dicks in their mouths. So when I know for certain a game is going to be bad and see, oh let's say, completely hypothetically, IGN give it a 9.5/10, I know IGN is sucking cock and is probably going to get a kickback. "God damn fucking IGN get that publisher cock out of your mouth! We all already know better than to buy a $60 game on your say-so already!" is what I would want to say to them when I see them sucking cock like that! Of course this is a completely hypot
    • Don't harsh on people for sucking cock, it's an important service to those of us who have them. Besides, these bullshit review scores are the opposite of sucking cock. It's like promising to suck your cock, and then not even jerking you off.

  • The audience for any review is twofold:
    the buying audience, who want advice on whether to bother downloading, signing up, or learning,
    and the developer, who gets feedback on what worked, what didn't, and what's wrong with their game.

    I am not an avid gamer, don't own a gaming PC, nor any consoles, but I do enjoy the occasional strategy game.

    Scores on reviews are good for filtering, whether I'm buying a game, downloading something, or considering any other kind of product.
    For products that are hard to find and few users, I also look at produ

  • They're dropping numbers in favor of words, so the review system is still completely in place. What was nice was when game magazines had three people review a game, usually one in depth and two shorter reviews. You could see in the mag which authors preferred what kinds of games, and then with their reviews you could think, oh, I always like Bob's reviews; his take lines up well with what I like. That personalization is gone with all the meta reviews.
