The Xbox One X and PS4 Pro have altered the long-standing tradition of console generations. Is this mid-cycle refresh a one-off? Will we see small jumps in between traditional generational upgrades? Or will Sony and Microsoft go the PC route of gradual upgrades? More importantly, how will this affect the consumer?
I’ve enjoyed the traditional console cycle since getting my first NES. The jumps between generations were much larger then. The move from 8-bit to 16-bit was a huge leap in graphics and sound. The PlayStation and N64 introduced 3D gaming to the home audience. The PS2 and Xbox refined those concepts and introduced immersive worlds such as GTA III. The PS3 and 360 made the jump to HD, but it could be argued that gameplay mechanics started seeing diminishing returns at that point. The PS4 and XB1 offer the smallest jump yet, so perhaps it makes perfect sense for an incremental upgrade to give console players their first taste of 4K.
Technology is impossible to predict (for me, at least), but if growth continues at its current rate, we might not need a PS5 (or XB2, or whatever it’s called) for quite some time. Not until some newfangled breakthrough happens and PC gaming leads the way with killer VR, AR, or some other undiscovered development.
Consumer benefits
Adopting the PC model has many benefits for gamers, backwards compatibility being the most obvious. But this fluidity cuts both ways: the latest and greatest games will eventually demand some kind of hardware upgrade.
Another benefit is access to cutting-edge hardware. Late in a console’s life, PC gaming usually takes a commanding lead in visual power. There are two arguments against this, however. 1) Hardware is not everything, as we’ve seen monstrous leaps from games released late in a console’s life. 2) Multi-platform development dominates the market, and if a publisher is primarily aiming at consoles, the tech is often limited by that lowest common denominator.
Consumer cons
Console gaming’s greatest strength is reliability and simplicity. Sure, patches are now the norm, but they require no installation effort from the user and, most importantly, carry no potential conflicts with drivers or other software. I’m not saying that will be an issue going forward, but if you’re trying to play the latest game on a slightly older console iteration, you could start running into poor performance. We could argue that’s already the case, as multi-platform games often struggle on at least one of the platforms gamers pay hard-earned cash to play on. Introducing multiple new models with incrementally improved tech will only magnify this issue. Many PC gamers will argue this is not a problem: simply slide the settings to fit your setup and enjoy a slightly altered experience.
Money
Expense is the biggest difference between the console and PC business models. A new GPU every few years (or longer) can cost less than a console, about the same, or much more, depending on how up to date you wish to be. And every so often, more than just the GPU must be replaced. While PC gaming is not prohibitively expensive for most, the console model is still a bargain in comparison.
The console model is a one-time fee for five-plus years of gaming, guaranteed to provide the best possible experience of games designed for that platform. The Last of Us came out in 2013 and looked just as good on PS3s from 2006. The PS3/360 generation is an extreme case, of course, but the principle holds in other examples. The worst-case scenario under the new model could see a three-year-old console running new releases so poorly that the gamer feels compelled to purchase an upgrade far sooner than they would have in the traditional format.
I’m sure Sony and Microsoft have watched Apple release minuscule upgrades each year for big profits and eagerly want a piece of that action.
The bottom line is that gaming is expensive when staying on the cutting edge (and quite cheap if you’re willing to stay behind the curve). If Sony and Microsoft start pumping out console upgrades and leaving older models behind, gamers will have to shell out more than ever to play the best versions of the latest and greatest.