With the next generation of consoles finally upon us, it seems like a good time to compare console gaming to another popular gaming platform. No, not mobile games or handhelds; this article is going to be talking about consoles versus PCs.
First, what exactly is a console and what is a PC? The former is a machine dedicated to the purpose of playing video games, which connects to a television or similar device. The latter is, as the name implies, a personal computer (in most cases, a computer which is running Windows) which can be used to play games.
Console gaming is likely the best-known gaming platform, and for good reason: it’s been around since the ’70s. The NES (released as the Famicom in Japan) arguably marked the beginning of console gaming as a successful industry, and it remains the oldest widely known console, despite being predated by systems released roughly a decade earlier.
Fast forward a few decades. Not long ago, we saw the release of the Wii U, Xbox One, and PlayStation 4, thus starting the eighth generation of consoles. Before that, we had their direct predecessors, the Wii, Xbox 360, and PlayStation 3 making up the seventh generation.
Traditionally, consoles are machines built with pre-determined and static hardware, meaning that the capabilities of the individual console will not change as time progresses.
In practice, this means that consoles won’t have their parts updated to keep up with advances in technology. While new models can be released, these are almost entirely updates to the exterior cases, making new versions less bulky or more aesthetically pleasing to the masses rather than actual upgrades.
So, a fancy new shooter might look flashier or fit more objects on screen at once, but it will be because the developers learned how to make better use of the console hardware or updated the engine that the game is running on. This can be restricting, as it means that games are inherently confined to certain limitations.
On the flip side, the fact that consoles don’t change their capabilities means that consumers won’t have to repeatedly put money into upgrades to ensure that they can continue to play the latest releases. It also means that no research has to be done on which new parts are superior to previous ones or which ones are compatible with a given machine.
In other words, “it just works”. Plug it in, insert the disc, and play.
Something else to take into account is obviously cost. In general, a current generation gaming console will cost at least $200 and no more than $400, not including the cost of peripherals like extra controllers.
On the other hand, the price of most retail games will be around $60, which should be factored in as well. Assuming someone purchases two games a year, spending over a hundred dollars on games annually is not only common, but to be expected. Multiply that by the average lifespan of a console generation (seven to eight years), and the average gamer will likely spend at least $840 to $960 on games for a single console.
Even being conservative and cutting the number of games purchased per year to only one, this is still $420 to $480. Putting aside possible sales or price cuts, this means that the money spent on a gaming collection will almost always exceed the money spent on the initial console itself. In some cases, it’ll even be twice as much.
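For readers who want to check the math above, here is a minimal sketch. The figures are the article’s assumptions, not exact data: $60 per retail game, a seven-to-eight-year console generation, and one or two games purchased per year.

```python
# Assumed figures from the article, not exact market data.
GAME_PRICE = 60  # typical price of a retail game in dollars

def lifetime_game_spend(games_per_year, generation_years):
    """Total spent on games over one console generation."""
    return GAME_PRICE * games_per_year * generation_years

# Two games a year over a seven- or eight-year generation:
print(lifetime_game_spend(2, 7), lifetime_game_spend(2, 8))  # 840 960

# The conservative case of one game a year:
print(lifetime_game_spend(1, 7), lifetime_game_spend(1, 8))  # 420 480
```

Either way, the game library ends up costing more than the $200–$400 console itself.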
Pricey, is it not?
Computer games first appeared in the ’50s and ’60s, but it wasn’t until the early ’80s that PC gaming acquired widespread popularity.
Junior Kristian Vargas reveals, “While consoles cost a fraction of what a high powered computer might, the digital sales featured on online distributors vastly trumps the GameStops and EB Games of the console world.”
While both systems can be quite pricey, each still offers great value for the money. The controversy over whether PC or console gaming is best still stands; ultimately, it is up to the players to decide its fate.