At about this time next year, we’ll have a pretty good idea of what the next generation of video games will look like. New consoles will likely be shown off, bold new streaming initiatives will begin to launch, and we’ll see all the wonderful kinds of games they will bring us. All these new things will come, and we’ll close the book on a generation that saw the industry that makes games come under greater scrutiny than ever before, as studios shuttered, developers burned out, and toxic work culture fostered environments hostile to marginalized people.
These problems have not been resolved, but the wheels of the games industry keep turning in spite of the strain. So how much bigger can video games get? Games are only getting more costly, in more ways than one, and it doesn't seem sustainable.
There’s the human cost, which Kotaku has chronicled extensively. Contract workers are continually undervalued and taken advantage of, as Call of Duty: Black Ops 4 developer Treyarch is reported to do. Artists who work on gory cinematics integral to games like Mortal Kombat suffer from post-traumatic stress disorder. Unrealistic demands and lofty investor expectations lead to disastrous development cycles for video games like Anthem, which in turn leads to developer crunch. Every week, news breaks about the toll video game development takes on the people who make them, and we carry on as if it’s all going to be fine.
That’s only the start of it. Adjusted for inflation, video games have never been cheaper at retail, and it’s been this way for some time. The $60 price point for a standard big-budget release has held steady for nearly 15 years, unadjusted for inflation, even as the cost of making big-budget video games has risen astronomically alongside player expectations. (Here’s some math that gives you an idea of just how absurdly expensive games are to make.)
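As a rough illustration of how much purchasing power that fixed $60 sticker has quietly shed, you can compound it forward by inflation. Note that the ~2.2 percent average annual rate used here is an assumption for the sake of the sketch, not a figure from this piece:

```python
def inflation_adjusted(price: float, annual_rate: float, years: int) -> float:
    """Compound a price forward by a constant average annual inflation rate."""
    return price * (1 + annual_rate) ** years

# $60 in 2005, carried forward 14 years at an assumed ~2.2% average rate:
equivalent = inflation_adjusted(60.0, 0.022, 14)
print(f"${equivalent:.2f}")  # roughly $81 in 2019 dollars
```

In other words, under that assumed rate, a game would need to sell for north of $80 today just to match what $60 bought at retail in 2005.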
Since changing the price point seems to be anathema, we’ve seen the industry attempt to compensate with all manner of alternatives: higher-priced collector’s editions, live service games that offer annual passes or regular expansions a la Destiny, microtransactions, and free-to-play games. Then you have loot boxes, which in many cases boil down to slot machine-style gambling inserted into retail and free-to-play games alike, and which are coming under increased legal scrutiny that could cut off what has quickly become a major source of revenue for the industry.
These aren’t all necessarily responses to thinning profit margins in the face of rising inflation. Game publishers are often publicly held companies, with investors who need to be shown endlessly increasing profits, which are then used to justify ridiculously large executive paychecks. Perhaps that’s a problem that needs solving, too.
Because of all this, $60 is often just the minimum buy-in, the ante in the pot, for some of the biggest releases. If you want every character in a game’s roster, or every map in its playlists, you’ll increasingly have to pay more. Big-budget single-player games that deliver a single-serving experience with minimal strings attached have largely disappeared from the lineups of major third-party publishers.
Let’s run down the Big Three. We’re more than halfway through 2019, and Electronic Arts has published only one single-player game, the indie Sea of Solitude. Last year was much the same, with two indies as its only single-player releases: Fe and Unravel Two. Activision’s portfolio of single-player games looks even thinner: Sekiro: Shadows Die Twice is the only exclusively single-player, non-remake game the publisher has released since 2015’s Transformers: Devastation—which itself is no longer available, thanks to an expired licensing agreement.
Ubisoft is an exception, regularly releasing entries in single-player franchises like Far Cry and Assassin’s Creed. But it buttresses them with aggressive microtransactions and extensive season pass plans. (And the occasional diversion like Trials Rising and South Park: The Fractured But Whole.) The big-budget single-player experience is now almost entirely the domain of first-party studios making marquee games for console manufacturers, which bankroll games like Spider-Man and God of War. The economics of first-party exclusives are totally different—they’re less about making money by themselves and more about drawing players into the console’s ecosystem.
This is worth considering, because as big publishers prioritize live, service-oriented games, the number of games on their schedules has dropped. If you look at the Wikipedia listings for EA, Ubisoft, and Activision games released by year, you’ll get a stark—if unscientific—picture of how each big publisher’s release slate has thinned out in the last five years, relying on recurring cash cows like sports games and annualized franchises and little else. In 2008, those three publishers released 98 games; in 2018 they released just 28, not including expansions.
In short, publishers decided the single-player game was not sustainable. So why should we think the current model is?
The smaller release slates make for a precipitous state of affairs where too much is riding on too little, a shaky foundation for big-budget game development to rest on. Granted, there are other publishers, like those in Japan, that are still very interested in single-player games. Independent games have also filled the single-player void and achieved greater visibility than ever before. But each of these alternatives faces its own challenges in a volatile market, one where just five years ago conventional wisdom held that the Japanese games industry was dead. Independent developers, meanwhile, continue to fight for the smallest slice of an impossibly crowded market. No matter where you sit on the games industry ladder, stability remains elusive.
That’s the present of video games. Let’s talk about the future. The intersecting trends of games-as-a-service and the increased emphasis on streaming mean an increased reliance on off-site computing with data centers and server farms distributed across the globe.
Microsoft’s Project xCloud wants to use the company’s data centers to provide high-end console and PC gaming to anyone with a good enough internet connection. Google Stadia pitches something similar, if even more wide-reaching, angling to deliver the big-budget video game experience in a web browser. And Sony already offers a streaming service, PlayStation Now, which is likely to expand in the next generation.
A 2016 study from the Department of Energy’s Lawrence Berkeley National Laboratory gives us an idea of the sort of things to consider in this arena. The outlook gives reasons to be both alarmed and hopeful.
The foremost takeaway is that while data centers are growing in number, their energy consumption is starting to plateau out of necessity, as the dramatic increase in cloud computing has actually forced tech companies to become more efficient. The biggest companies, according to the Berkeley Lab report, are actually remarkably efficient.
Data center efficiency is measured by a power usage effectiveness (PUE) rating: a facility’s total power draw divided by the power used by its IT equipment. Under this rating, the platonic ideal is a PUE of 1.0, meaning every watt delivered to the facility reaches the computing hardware, with nothing lost to cooling, lighting, or power distribution. Google, then, is in pretty good shape as far as this standard goes, with the average PUE across all its data centers currently at 1.11.
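The ratio itself is trivial to compute, which is part of why PUE caught on as a benchmark. A minimal sketch (the 1,110 kW / 1,000 kW figures are illustrative, chosen to match Google's reported average):

```python
def pue(total_facility_power_kw: float, it_equipment_power_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power.

    1.0 is the ideal, where all delivered power reaches the IT equipment;
    anything above it is overhead (cooling, lighting, power conversion).
    """
    if it_equipment_power_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_power_kw / it_equipment_power_kw

# A facility drawing 1,110 kW in total to run a 1,000 kW IT load
# lands exactly at Google's reported fleet average:
print(round(pue(1110, 1000), 2))  # 1.11
```

So a facility at 1.11 spends only about 11 extra watts of overhead for every 100 watts of actual computing, which is what makes the biggest operators remarkably efficient by this measure.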
Efficiency, however, can stay high even as total power consumption climbs, and consumption is going to remain a problem.
Data center energy consumption has been a concern for some time now, particularly in the United States, where data center energy consumption dwarfs that of the rest of the world, at 1.8 percent of all energy used in the country. Smaller data centers, which estimates say make up 60 percent of data center energy use, are inefficient compared to the biggest players, and with no legal standard or universal benchmark, there’s no way to ensure that efficiency gap is closed.
Making this problem even more dire is our current political climate, in which developing clean, renewable sources of energy is met with hostility by countries like the United States, which throws its weight behind fossil fuels even beyond its own borders. That doesn’t even account for the ways games contribute to the world’s electronic waste problem. E-waste is toxic, and only 40 percent of it is properly recycled.
And all that is before you even start to think about climate change, and the urgent action needed to avert a major crisis in our lifetime.
Video games cannot do this forever. If any of these things were to collapse—the people who make them, the economy they’re sold in, the ecosystem we’re all a part of—it would be catastrophic. All of them at once? That’s a disaster we need to talk about, openly. Because there are solutions to these problems.
Some of them are small, like making sure you know how to properly dispose of e-waste should you need to throw out a busted console or peripheral, and doing what you can to live sustainably, even though climate change requires the sort of large-scale action that only governments can enact. To that end, you can take more involved action, like calling your congressperson or local government representative to ask whether climate change and environmental concerns are on their agenda, and keeping apprised of any legislation up for a vote in local elections.
Other solutions are harder to parse. How do we account for the data center sprawl of tech companies and their energy consumption? Is it ethically sound to use a service like Project xCloud, Google Stadia, or PlayStation Now, knowing all this? Should we push for a global green tech agreement of some kind, so that companies contributing to server sprawl and energy consumption do so sustainably? A carbon tax seems like a good start, but this is a problem in need of many answers, not one.
Some solutions, thankfully, are already underway. Labor practices have come under scrutiny, and developers are beginning to discuss organizing in earnest. Unionization is not going to solve every problem, but it can lead to meaningful progress that trickles outward into other arenas. More equitable practices could slow the relentless pace of development, making for fewer, better games and a course correction in supply and demand. Or it might only make things marginally better.
Sony, Microsoft, and Nintendo all have stated sustainability initiatives and reports, but these programs are buried in corporate sites and paperwork. A better approach would be to make sustainability as big a talking point as load times or ray tracing: something we could compare against the previous year and use to take note of how much better off we are.
These are big, insurmountable-seeming problems, but like all incredibly big projects—like, say, game development—they’re things that can be done slowly, a little bit at a time. We just have to start.
It’s unlikely that video games will ever truly go extinct. We’ll probably always have something called “video games,” but what those games will look like is still very much in flux. There’s no guarantee that the way games are currently made will remain viable for another 10 years—games aren’t even made today the same way they were 10 years ago. They will look different. They will change because they can, and because they must. Hopefully, all the ways games change will be on our terms—otherwise disaster will change them for us.