Las Vegas, the city of overabundance and monetary waste, is an apt home for the Consumer Electronics Show (CES), which declares itself “The Most Powerful Tech Event in the World.” It’s here that Sony unveiled what it calls a “glimpse of the future” of play: a room made up of “Sony's Crystal LED panels, engaging audio, haptics, and scent” - yes, Sony wants us to smell the undead - giving us a look at a new immersive experience for The Last of Us.
In the short clip posted to YouTube, a bunch of actors exclaim and pose in a darkened room surrounded by screens, crazed undead stampeding towards them as they blast away with replica weapons, all to sell us… what exactly? A way to play games that will never be affordable to 99.99% of players? An experience that will be touted and toured, but barely visited? If the goal was to show how out of touch tech corporations are, then the mission has been accomplished.
This seems to be a trend in the current landscape of video games, though Sony is particularly guilty of it. Why show off something so lavish when it could seemingly have been replicated on the virtual reality tech they so desperately want players to buy? I’m sure they spent tens of thousands of dollars on this flash in the pan, money that could have gone towards a VR tech demo for hardware they’ve already designed, packaged, and sold to the public.
Then, of course, you’ve got the PlayStation 5 Pro, a console that upgrades the visual experience of games so incrementally that YouTube is flooded with videos dissecting minute ‘improvements’ that, beyond the obvious framerate gains, most players wouldn’t even notice. It all smacks of this idea that players want technological improvements over anything else, which is, quite frankly, wrong.
Of course, there’s a percentage of players who want nothing more than these lavish experiences and to see every pore in a character's skin, but I’d put money on the vast majority wanting substance over style. This is something of a hot topic at the moment, as companies continue to overinflate their budgets on things like graphical fidelity when so many of us would rather have better writing, intriguing art direction, innovative gameplay, and character development than the sharpest graphics.
This is something the New York Times recently touched on in their article ‘Video Games Can’t Afford to Look This Good’, highlighting the case of Marvel’s Spider-Man 2. In that article, journalist Zachary Small details how leaked documents uncovered that the game cost a reported $300 million to make while “chasing Hollywood realism.” He goes on to say that the game sold 11 million copies and, despite its success, staff from developer Insomniac Games were laid off as part of a corporate manoeuvre that saw 900 employees out of a job.
It feels like flogging a dead horse to bring up the cost of living crisis and the environmental impact of technology, but flog it I will. With wages stagnating, prices skyrocketing, and the world’s energy being wasted on trite technologies like A.I., consumers will be less and less willing to fork out for the ‘dream technologies’ companies are so happy to flaunt, or for the rising cost of games as those same companies chase cinematic realism. Particularly when it’s all so unnecessary.
I’ll show you the cards in my hand: I do not care about graphics any more. Once upon a time, maybe, when I played the T. rex tech demo on the PS1 and declared that “graphics would never get better,” but now it’s nothing more than a digital penis-measuring contest of ‘oh look, our characters have more individual hairs than yours’. There’s an argument to be made for framerates, as they offer a much smoother experience, and I’m all for making gaming feel more comfortable for the player.
This isn’t meant to be a takedown of those who admire visual fidelity. It’s not even a knock against developers who want to tell their stories through photorealism. It’s a critique of pushing a technology when it isn’t needed, and of overspending in an industry that can rarely keep its doors open or pay its staff. Can a game tell a story through 4K-textured characters? Of course. Can games tell the same story using pixel graphics or PS2-style engines? Yes, they can. Just because the industry has long used graphics as a benchmark doesn’t mean this needs to continue.
It’s clear why large game studios insist on pushing the boundaries - it attracts eyes. It’s easier to pull focus to a game through its visuals than anything else, but it’s a huge gamble nowadays. Like any trend-chasing in this industry, the pursuit of photorealism or advanced technologies like A.I. (or the passing fad of Web3) is like spinning a roulette wheel. Sure, you can create the next Call of Duty, but you might also create the next Suicide Squad: Kill the Justice League. Or something that players simply won’t interact with.
With AAA budgets constantly skyrocketing, development is becoming more and more of a risk. If we look at some recent big-budget games, courtesy of ARGENTICS, we can see just how much companies are gambling on big wins.
Immortals of Aveum was made for around $125 million - a month after launch, due to poor sales, 45% of the workforce was laid off. Undawn, published by Tencent, cost approximately $140 million and ended up making back only $287,000. Of course, there are big wins, and I want to show this with The Last of Us Part II, which was made for $220 million and bagged over 300 game of the year awards. The margin is thin, and wins aren’t guaranteed by flashy visuals.
Jason Schreier wrote on the subject at Bloomberg, arguing that “bloated budgets in the graphical arms race” are only part of the problem; the “real problem,” he says, is “rampant mismanagement.” I touched on this in my article on why 2024 ended up being a bad year for the games industry.
I won’t belabour the point, as Jason has already broken it down along with his Bluesky following, and I’ve touched on it anyway, but this rampant push for everything to look like it rolled off a Hollywood marquee trickles down from pressure at the top. It’s all marketing, a constant need to be on the tips of tongues, and, often, it doesn’t pan out for the best.
Circling back to my opening example of Sony’s LED experience, you’ll notice it focuses on blinding the senses rather than engaging the player’s emotions beyond fear. Immersion is what this industry is built on, whether through writing compelling stories or using game engines for interesting design. It’s also built on ambition, and both qualities can be achieved without driving up budgets or taking a gamble.
I don’t think I’m alone in my thinking here. In fact, a cursory glance at social media and message boards shows swathes of gamers who simply don’t care how games look now, particularly when each new wave of technology only makes incremental steps forward. To illustrate my point, I want to focus on some Reddit conversations, across multiple threads, on this topic.
A comment from Iggy_Slayer notes that, “They've had incredibly detailed games going back almost 10 years now (Uncharted 4 still looks better than many new games).” This is echoed in another comment from PetSoundsSucks, who says, “I want them to take three steps backwards on fidelity and a giant leap forward on particle/environmental effects.”
As Fact0ryOfSadness says perfectly, “Around 2015 or so, we started getting to a point where the best graphics were already photorealistic enough for the vast majority of gamers, and improved textures or more complex models started to become harder to spot. Improvements at that point became more of a gradual refinement of lighting, particles, and shadows. Also, a lot of gamers seemed to shift focus from the fidelity of the graphics to performance and framerate.”
In fact, after trawling hundreds of comments and threads across Reddit and social media, the one thing a majority wants from new games is optimisation.
With games growing in both scope and actual file size, optimisation is becoming harder as teams can’t handle and QA everything properly. Again, this comes down to mismanagement, but it also comes down to the lifecycles of games, which are taking longer to produce due to complexity and player demand, and which have to account for a wealth of textures, 3D models, and the interactions between them, all of which cost time and money.
Essentially, it can be boiled down to this: if the cost of production, and the pushing of untested technologies, is reduced, so too are budgets. That has a knock-on effect for players, who could pay less for games and less for new technology like consoles and PC graphics cards. If the industry held back on the incessant push for graphical fidelity, we could eke out a few more years from our consoles, or skip upgrading to a graphics card that costs the same as a second-hand car.
This would also please a large contingent of workers and consumers within the industry. More focus could be pulled to things like optimisation, which delivers a better experience; framerates, which become more manageable with less dependence on cutting-edge graphics; and writing and art direction, where we could see more diverse worlds that don’t simply look like an Unreal Engine 5 demo.
Cycling back to the New York Times article, it notes that graphics aren’t what’s selling the biggest games in the world. Roblox, Minecraft, and of course Fortnite are all responsible for billions of dollars and untold moments of joy, and none of them is driven by the race for visual fidelity or technology at its pinnacle. I’m not saying the push has to stop, but I am appealing for more level heads and a clearer vision of how to tell a great story, or deliver a brilliant experience.