Nowadays it seems that everyone’s concerned with what resolution something runs at. While just about a year ago, online conversation was dominated by the potential pricing of the next-gen consoles or which console had the better exclusives, now it’s dominated by which games push more on-screen pixels. Recently, Killzone Shadow Fall was discovered not to run natively at 1080p, and the details are more complex than just how it runs or looks. That hasn’t stopped gamers from crying out that, once again, the next gen sucks and we should all buy PCs. While it would be easy to wave off these concerns as fanboy drivel, there are plenty of reasons why ‘resolutiongate’ is such a hot topic with this generation of consoles.
Consider the overwhelming belief that the next gen has to feel ‘next-gen’: a new generation of possibilities, technology and hardware specs. Many might feel that when games fail to even harness the power of 1080p resolution, they lack that ‘next-gen’ feel. This concern seems superficial, but it may be rooted in some truth. A current-gen system comes with many restrictions, based primarily on how old its hardware is. If you’ve been hanging onto your Xbox 360 or PS3 for seven years, you’ll begin to feel that games are looking dated, especially compared to the stellar graphics of PC games. This of course isn’t a stab at console games, since plenty of 2013’s console titles had phenomenal visual style, such as Grand Theft Auto V and The Last of Us. It could just be that people feel it’s time to move on from 30fps and sub-1080p video games to games where everything looks shiny and perfect. However, as we’ve seen lately, the next gen has a while to go.
Recently, we discovered that Twitch on Xbox One outperforms Twitch on PS4, and one of this website’s authors expressed why this matters. It all comes down to opinion, of course, and it’s not for any of us here to tell you what to think of resolutions on hardware. The greater point it raises is that gamers demand a level of quality, one which game developers consistently promise and which, for some, isn’t being met. I will counter that it also reeks of severe nitpicking, and that entitled gamers seem to be the most vocal about their expectations. On the one hand, people expect ‘next-gen’ consoles to run games at full 1080p with smooth 60fps gameplay, as many PC games can, whilst on the other hand some gamers can’t sleep at the thought of missing even one pixel. Perhaps the internet flame wars have become somewhat ridiculous when we talk less and less about the subjective value of games or hardware and more about the intricacies of on-screen visuals. Maybe in the future we’ll be arguing about how much texture goes into a single blade of grass in a bush.
It’s certainly a topic of hot debate and disagreement, with gamers fully expressing their disdain at the idea of owning a next-gen game that can’t even manage a full visual output at the glorious 1080p. Whether the ‘resolutiongate’ flame war is necessary for the industry right now is another story altogether. When it comes down to it, an individual’s enjoyment of a game should have less to do with the visuals and more to do with the fun of the game itself. That isn’t to say graphics don’t matter: games like the Crysis series thrive on their phenomenal visual presentation, which Crytek CEO Cevat Yerli claims to be 60% of the game. In the end, gamers’ expectation of quality is valid, and hopefully developers can continue to push the hardware and give us exciting presentations and experiences. Remember, we are only at the start of the next gen, and it will be a while before we start to see games that are truly spectacular visually (compare early titles like Prey or Kameo to later ones like Halo 4 and The Last of Us). Until then, don’t get too bogged down by the pixels.
Sources: Craveonline, Xb1.co.uk