Cyberpunk 2077 was a turning point for many people. CD Projekt Red’s long-awaited RPG masterpiece fell at the final hurdle: its release was marred by immersion-breaking bugs and broken console versions, and a community of gamers who had spent over $60 on this premium title was left completely disillusioned.
Veteran gamers complained that “it wasn’t like that in their day,” that the happy days of picking a cartridge off the shelf, taking it home, and just playing were long gone. The average gamer started complaining about unfinished products being sold at premium prices, and patience was starting to wear thin. I have no patience for the “lazy developer” criticism that is so casually thrown around in this industry (in 10+ years in this job I’ve never met a “lazy developer” – the people on the ground care a lot), but it was the development staff who started taking flak for release-level decisions.
Perhaps a studio would release a patch in week one, apologize, and then “fix” their game with a crawling roadmap aimed at letting it “reach its full potential” – but that’s not what the consumer paid for, is it? If you booked a day off work, spent $70 on a new entry in your favorite series, and then realized it wasn’t playable in its current state, you’d be frustrated too, right? Many players will tell you that this habit of patching games until they reach the state they should have launched in is all too common these days, and maybe they’re on to something.
But that’s not a new phenomenon – developers had to patch games even back in the Nintendo 64 days; it’s just that consumers saw less of the process, because revised pressings and quiet re-releases were far less visible back then. In the last few generations, though, the problem has gotten worse – or at least more noticeable. Cyberpunk 2077, Anthem, Battlefield 2042, The Callisto Protocol, and Pokémon Scarlet and Violet are just a few triple-A games that have hit the market in the past few years unoptimized at best and unplayable at worst.
And as publishers — 2K, Sony, EA, and now Microsoft — deign to charge more for their games, consumers are beginning to ask, “Why should I pay so much for games on day one?”
There are a few good reasons why you should. First of all, games are getting more and more expensive to produce. In 2005, a triple-A studio developing a blockbuster game would spend around $25–35 million on the project. Now, the same studio working at the same scale will invest between $75 million and $150 million to ship that game, and it needs to recoup that money somehow. Supporting your favorite developer is the most obvious thing you can do to keep the doors from closing forever, although sometimes even that isn’t enough.
Then there is inflation. Games stayed at around $60 (or £50) for the most part for 15 years, but inflation has been rising for some time – and recently it exploded. Right now it’s over 12% here in the UK and shows no signs of slowing down anytime soon. Of course your games cost more to buy, and it’s not the developer’s fault that your wages haven’t increased accordingly. Adjusting £50 in 2005 for inflation gives you £88.24 in 2022. Think about that for a minute.
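For the curious, that £88.24 figure is just compound inflation at work. A minimal sketch, assuming an average annual UK inflation rate of roughly 3.4% over the 17 years from 2005 to 2022 (an assumed figure chosen because it roughly reproduces the article’s number):

```python
# Compound-inflation sketch. The 3.4% average annual rate is an
# assumption; it approximately reproduces the article's £88.24 figure.
price_2005 = 50.0
annual_rate = 0.034   # assumed average yearly inflation, 2005-2022
years = 17

price_2022 = price_2005 * (1 + annual_rate) ** years
print(f"£{price_2022:.2f}")  # roughly £88
```

The point of compounding is that even modest yearly rates add up: 3.4% per year sounds small, but over 17 years it multiplies the price by about 1.77.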
So games are getting more expensive, especially in a world still recovering from (and affected by) Covid-19. Developers had to enable work-from-home pipelines. Publishers had to support workers in more difficult situations. Network engineers faced a massive spike in traffic during the 2020 lockdowns. The gaming landscape isn’t what it used to be; labor is getting more expensive, developers are no longer accepting crunch as the norm, and the industry is beginning to unionize – finally.
Paying more for your games is good for the industry and good for workers. But it’s totally understandable that you wouldn’t want to pay more for games that give you less, or that make you wait (at least) longer for a premium product. There is a ‘cost of living crisis’ in the UK, and the reality behind that phrase is that energy companies are hiking prices while the government sits on its hands as inflation soars. The end result is that the average citizen is significantly worse off, with the working and middle classes disproportionately affected. TL;DR: the biggest audience has less money to spend on games.
So seeing Microsoft charge $70 for new titles stings; it makes me think I can only buy maybe three or four brand new games each year… and only from developer/publisher combos I trust enough to commit to day-one purchases (I used to think I could count Game Freak/Nintendo in that group – unfortunately, no longer). That’s down from maybe six or seven brand new games only a few years ago. There’s a reason services like Game Pass and PS Plus are so popular right now; the value for money is off the charts compared to massive one-time payments for marquee games. Sony’s commitment to premium pricing could prove unpopular in the tough months ahead, a problem only underscored by how consumer-friendly Game Pass has proven over the past few years.
But despite all this, gaming is still one of the cheapest forms of entertainment in terms of entertainment value per hour. I’ve paid around £20 for The Binding Of Isaac three times (oops) and I’ve spent at least 750 hours on it. By my math, that means I get 12.5 hours of playtime for every £1 spent. That’s better than paying £12 to waste two hours of my life watching Doctor Strange in the Multiverse of Madness.
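The cost-per-hour maths above can be sketched in a few lines, using the article’s own figures (three purchases at roughly £20 each, around 750 hours played, versus a £12 ticket for a two-hour film):

```python
# Back-of-the-envelope entertainment value per pound,
# using the rough figures quoted in the article.
total_spent = 3 * 20      # £60 across three copies of The Binding Of Isaac
hours_played = 750

hours_per_pound = hours_played / total_spent
print(f"Game: {hours_per_pound} hours per £1")           # 12.5

film_hours_per_pound = 2 / 12   # two-hour film, £12 ticket
print(f"Cinema: {film_hours_per_pound:.2f} hours per £1")  # 0.17
```

On these numbers, the game delivers roughly 75 times more playtime per pound than the cinema trip.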
So a $70 game is a tough sell, and the immediate overnight resistance to the announcement isn’t surprising. But think about the average developer – non-union, overworked, probably underpaid – before you start pointing your finger in the wrong direction. We’re all in this together, and it’s not the developers you should be mad at when things get expensive. Remember that.