Game reviews are an interesting part of gaming journalism. They're articles with Schrödinger's importance. Reviews rarely get traffic on the level of features, guides, or news, and they're often too all-encompassing to offer the deep criticism more targeted features can. They also require a huge playtime investment, up against the clock, which means they're not only shallow and largely unread, they're also rarely your best work. And still, our industry is defined by them. What a game scores on Metacritic is used in fan circles as a talking point, or, when it differs from the User Score (which skews hard towards 0 = don't like it, 10 = do like it), as a cudgel against our integrity. They're 1,000 words, often significantly more, offering the first critical analysis of a game, and the only criticism the game will ever receive in a vacuum. They're also just a number. We do them pretty rarely, in the grand scheme of how many people write for TheGamer and how busy our jobs are, yet when you tell someone you're a games journalist they'll almost always ask, 'Oh, so you review video games for a living?'.
Even when we don't write reviews, we get emails complaining about our 'biased reviews', usually aimed at pieces like 'the lighting in the original Halo game isn't as good as I remembered'. You might be wondering what this has to do with anything. Well, join the club. I write about toys for a living; sometimes the nonsense just spills out. In this specific case, though, I've decided our review process could do with some grounding, so this will be our hub for what TheGamer's review scores mean. We have a large review team and have previously debated back and forth over what score to award any given game, so hopefully this puts that in perspective.
Previously, I have moderated reviews with two simple rules: a) your score is the right score, and b) if you're torn between two scores, pick the lower one. Those rules still basically reflect how I feel and, naturally for such a subjective subject, any two staff at the site might give the same game different scores. I've never understood the "TheGamer gave X game a six and Y game an eight, they suck!" complaints, which mostly make no sense because our scores are out of five. But also, they were probably two different reviewers, and why do you care what scores a site you hate gave to two video games anyway? Still, it's important to have consistency, so going forward, these are the metrics all reviewers for TheGamer will use when they review any given video game. If we're wrong, at least you'll know why.
0.5 (out of 5) - An offensively bad game. The world would be a better place if this did not exist. Awful.
1 - Broken beyond all repair, completely dreadful. We gave Babylon's Fall a 1/5. Very bad.
1.5 - Mostly broken but still enjoyable in places, or working fine but with no other redeeming qualities whatsoever. Bad.
2 - Free of the technical faults of lower scores, but little else positive can be said about it. We gave High on Life a 2/5. Poor.
2.5 - Boring in terms of design, gameplay, or both. No new ideas, and not enjoyable enough to get by without them. Boring.
3 - Nothing special, but without any glaring errors. Forgettable but competent. We gave Mario Strikers: Battle League a 3/5. Okay.
3.5 - Fun and competent, but not many layers to it, or layered narratively but without gameplay to back it up. Good.
4 - Extremely enjoyable but not an all-time classic. Well developed in everything it attempts but nothing unique or innovative. We gave Horizon: Forbidden West a 4/5. Very good.
4.5 - One of the best of the year with excellent execution of fresh ideas. Excellent.
5 - I'll be talking about this game for years, a rare title that will come to define the generation. Executes its vision perfectly and sets a new benchmark for its genre. We gave Elden Ring and The Forgotten City a 5/5. Perfect.