As we enter a fresh console cycle,
wanton joystick measuring in terms of each new machine's technical capabilities
is inevitable. Indeed, no early adopter wants to think he or she's backed the
wrong horse, and the gaming media is keen to add its own two cents to the
debate through exhaustive, detailed comparisons of multi-format releases and
flagship exclusives alike in an attempt to vindicate purchasing choices with
hard statistics.
As powerful as dedicated gaming
hardware has become, the raw grunt that such devices can offer will always be
finite, and one of console game development's biggest challenges is striking a
balance between visual fidelity and performance, the latter usually measured in frames per second: how often the onscreen image is updated each second. Simply put, this
balancing act exists because more complicated scenes are harder to render. The industry
seems to have settled on 60 frames per second as the gold standard
performance-wise, the prevailing opinion being that achieving such a lofty goal
is key to offering the most realistic and immersive experience possible. I'm
not sure I agree.
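Before getting into why, it's worth putting that balancing act in concrete terms. Here's a rough, purely illustrative sketch in Python (the function name and targets are mine, not anything from a real engine) of the time a game has to simulate and draw each frame at common performance targets:

```python
# Purely illustrative: the per-frame time budget at common performance targets.
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available to simulate and draw a single frame."""
    return 1000.0 / target_fps

for fps in (30, 50, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")

# 30 fps -> 33.33 ms per frame
# 50 fps -> 20.00 ms per frame
# 60 fps -> 16.67 ms per frame
```

Halve the target frame rate and you double the time available to spend on the scene, which is the whole trade-off in a nutshell.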
The frame rate arms race is a
relatively new phenomenon, but its roots can be traced back to mainstream
console gaming's opening skirmishes between
Sega and Nintendo in the late 80s and early 90s. Back then,
European televisions operated on a 50Hz PAL signal offering a maximum 50 frames
per second output, whereas in the US and Japan, where the vast majority of the
era's games were developed, the faster 60Hz NTSC standard was used, which
offered a maximum of 60 frames per second. Rather than invest time and money
optimising titles for what no doubt felt like a strange and distant market, American
and Japanese developers would often opt to simply slow their games down to
compensate.
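The arithmetic behind those slowdowns is straightforward. For titles whose logic advanced once per display refresh, as was typical at the time, a rough sketch (illustrative Python, not period-accurate code) shows the scale of the difference:

```python
# Illustrative only: if game logic runs once per display refresh,
# a title designed for NTSC simply runs slower on a PAL display.
NTSC_HZ = 60  # refreshes per second in the US and Japan
PAL_HZ = 50   # refreshes per second in Europe

slowdown = 1 - PAL_HZ / NTSC_HZ
print(f"An unoptimised PAL port runs roughly {slowdown:.0%} slower")  # ~17% slower
```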
Most European gamers wouldn't
have known anything was amiss at the time, having never played the original, full
speed NTSC versions of their classic favourites. But, when compared side by side
retrospectively, a stark difference between regional variations is often
apparent, especially if a speed-centric title such as Sonic The Hedgehog is
used as a point of comparison. The non-European
iteration features noticeably faster gameplay, while also sporting similarly
sped up sound effects and music. Unsurprisingly,
it's considered vastly superior as a consequence, and is the version that Sega
uses as the basis for its endless stream of re-releases. There's even a sizable
modding scene dedicated to speeding up old PAL consoles to bring them up to par with their American and Japanese counterparts.
A modded Mega Drive, yesterday
Luckily, some might say, the rise
of digital media and HD television has effectively killed the old 50Hz standard
off. The Dreamcast was the first PAL console to offer gamers the option to use
60Hz output if they had a TV that supported it, and by the time the Xbox 360
and PS3 dropped, 60Hz had become the universal norm.
So in the here and now, with
gamers the world over hypothetically able to enjoy a uniform experience, the
onus is on developers more than ever to deliver the best performance
possible. Just prior to the release of Watch Dogs, Sony briefly claimed on their
website that the PS4 version would run at the coveted 60 frames per second
"in a way that only PS4 can provide", before removing any references
to performance entirely. A bold assertion that didn't quite pan out, but one that clearly suggests Sony believes such things matter enough to gamers to have a tangible effect on their purchasing habits. The perception across the industry at large appears to be that when it comes to frames per second, more is always more.
Exhibit A
But I wouldn't say this is always
true. The concept of suspension of disbelief refers to the idea that a fictional world, or indeed, a fictional representation
of the real world, must be consistent and believable for a reader or viewer to
accept it. In gaming terms, the look
and feel of a given title is key to achieving this, but to my mind a constant
60 frames per second looks unnaturally smooth - far more so than the real world
as the naked eye perceives it - undermining any attempt at realism or immersion,
and serving as a constant reminder that I'm looking into the world presented to
me as an observer rather than a participant.
Of course, such misgivings aren't
universally applicable across the entire gaming spectrum by any means. For one thing, with titles such as Rayman Legends
or Super Mario 3D World - the gaming equivalents of Saturday morning
cartoons - fun is the modus operandi, not
a deep narrative or believable visuals. Also, at the upper echelons of competitive
gaming, realism and graphics are considered largely irrelevant. Here,
professional or semi-pro gamers are concerned only with the title they've sunk
thousands of hours into being an adequate conduit for their skill, something
that the more responsive controls offered by higher frame rates factor into
heavily. Call of Duty, for example,
whose yearly iterations feature heavily on the e-sports circuit, wears its
constant 60 frames per second in multiplayer like an enduring badge of honour.
In fact, the series' trademark rock-steady high frame rate is a key argument
made in favour of it over rival franchise, Battlefield, which generally emphasises
scale and elaborate pyrotechnics over fluidity in its console versions.
Prettier, but at what cost?
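The responsiveness argument is simple enough to put numbers on. Assuming, very simplistically, that a press is registered at the start of one frame and its result appears on screen a couple of frames later (the function and the two-frame figure below are mine, purely for illustration), the gap between 30 and 60 frames per second looks like this:

```python
# Purely illustrative: a simplistic model in which an input is sampled on one
# frame and its effect is displayed roughly two frames later.
def input_to_screen_ms(fps: float, frames_of_delay: int = 2) -> float:
    """Hypothetical delay between pressing a button and seeing the result."""
    return frames_of_delay * 1000.0 / fps

print(f"60 fps: ~{input_to_screen_ms(60):.0f} ms")  # ~33 ms
print(f"30 fps: ~{input_to_screen_ms(30):.0f} ms")  # ~67 ms
```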
If realism and immersion are the
goal, however, a steadier performance target more in line with what we see in
the outside world, one that also allows for a great deal more visual bells and
whistles, may be more appropriate. After all, the point of releasing ever more
technically accomplished hardware is to facilitate better and better looking
content, and for games purporting to offer any kind of human perspective, better
and more realistic are very often one and the same. It may be, though, that all
the technical innovations of the future will be for naught unless widely
accepted priorities for how best to utilise them change.