I need to get some writing done today, so I’m using this to warm up my fingers.
The article argues that review scores for “AAA” games are homogeneous, leading to a situation where too-similar games are given too-similar scores and people struggle to tell them apart.
There are a bunch of problems with the article, including its use of Metacritic averages as its sole source of review scores. Heavy Rain’s Metacritic score is 87, sure, but that’s ignoring that across different review sources it scored as high as 100 and as low as 45.
The article also focuses on AAA games, ignoring the outliers of indie development, and uses the presentations at E3 to conclude that, “We’re producing ever more of the same old stuff.”
I agree that the entire industry looks the same if you take an average of averages, look only at big budget games, and ignore anything that didn’t put on a big show at E3. From where I’m sitting though, the industry is a good deal larger and a great deal more interesting than that. But this isn’t really what I want to write about.
The first reason that most mainstream, AAA games score in the high 70s to low 90s is that most mainstream, AAA games are good. As Jim said in his Sunday Papers blurb, they are “focus-tested into oblivion”. That doesn’t necessarily mean that anything that makes those games different gets shaved away, but it does mean that the most heinous bugs and design elements tend to be fixed or replaced. Often that means swapping them for elements from other games that the developers already know will work, but it also means games are very rarely abysmal. This is a good thing! Well done the games industry, for considering the player in your design and for releasing fewer unfinished products.
The second reason requires that you consider who is reviewing the game, and who the audience for the review is.
When games reviewing started, games and gamers were split along party lines: that is, platform lines. You had your Amstrad magazines and your Spectrum mags; or your Amiga mags, Nintendo mags and Sega mags; or your Sony mags, Nintendo mags and PC mags; and so on. Within these spheres, the magazines were generalist. If you were an Amiga magazine, you reviewed everything that came out on Amiga, whether it was a platform game or a strategy game or a racing game or a fighting game or anything else.
Then the internet came along, and in pursuit of greater traffic, websites became more generalist still. Gamespot and IGN covered no single platform, but every game in every genre on every platform.
So you end up, in both cases, with reviewers at generalist outlets reviewing for a general audience of gamers. There’s a pretty good chance that within that there are specialist reviewers, mind you. Gamespot probably have a “strategy guy”, who is expert in and a lover of strategy games. That’s fine, that’s what Gamespot should be doing, and strategy gamers will recognise that.
But when a person reviews Just Cause 2 – a mainstream action game, intended as broad, pop entertainment – for one of these places, they are judging its general quality for that general audience. An audience that, taken as a whole, numbers in the millions and likes all kinds of games.
This, as an aside, is what leads to hateful lines in reviews like, “If you like this sort of thing, you’ll like this sort of thing.” It’s the reviewer breaking down, falling to their knees and saying, “I’ve no earthly idea what it is you enjoy, so how can I give you accurate buyer’s advice?”
The way to fix this is to create more review outlets that reject “generalist” principles, and which choose to divide themselves along ideological rather than platform lines. Think of it in terms of music criticism. The NME has very particular taste in music. If you’ve ever read it or even just seen a few of their covers, you have a sense for what qualifies as an “NME” band. Pitchfork are similar.
And it’s not necessarily that these places only review music within their favoured genre, but that they review all music through a single, defining prism of taste. When they score something, they know what they like, they know what their audience likes, and they rate the music in front of them not for general quality, but for how well it fits into their own tastes, and ideas and philosophies about music.
In other words, they’re elitist, and games reviewing desperately needs more elitism.
Don’t think that I’m against generalism entirely. I work for PC Gamer, and while we have definite, recognisable standards, we are also generalist. We cover everything for everyone. We have a strategy guy. I think it’s desperately important that places like PC Gamer, and Gamespot, and IGN exist. You need someone who is authoritative, who covers everything and caters to everyone, because not every gamer cares about just one or two genres, and not every gamer yet knows where their own tastes lie. But once you have that in the field, there’s no point in having ten other places doing the same thing.
So if you’re starting up a website to review games, think first about what you won’t cover, and then think about what things you’ll love and – just as importantly – what things you’ll hate. Let your hate define you. Make your website’s first post your manifesto. Have extreme ideas. Start fights. Give every game with a health bar zero out of ten. Embrace your love of Napoleonic war re-enactment and lambast any game that isn’t about Napoleonic wars. They suck.
Love the thing you love and make everything else the enemy.