Ultra Settings Suck
Our standards have slowly risen over the years,
from the lowly 640×480 resolution up to today’s 1080p and beyond. We used to treat
30 FPS as the acceptable minimum, whereas now it’s almost always 60. Unless you’re
a console gamer. And of course, we also want today’s games to look better than ever before.
But I think that somewhere along the way, we’ve lost sight of what really matters.
We’ve begun reviewing things in an increasingly irrelevant way, one that can actively
mislead readers looking to buy. Now I’m not talking about bottlenecking,
frametimes and the usual things. They’re spoken about enough already. I’m looking
at the ULTRA QUALITY setting in games. This realisation started for me with the high-end,
when 4K benchmarks started being introduced a few years ago. Back then, results made it
very clear that no system was anywhere close to delivering acceptable framerates at this
resolution. It’s only been with the 980 Tis and Fury Xs that dabbling with this
resolution has become possible, and indeed only with the recent 1080 Ti that we’ve
seen 60 FPS at 4K in AAA titles. In other words: to play at 4K, you need the BEST.
But this hasn’t reflected my experiences at all. Back when I first got a 4K monitor,
I sported a GeForce GTX 670 card. By today’s standards, it’s below all but the slowest
of modern systems and is about ¼ the speed of a GeForce GTX 1080. Clearly, 4K with it is
out of the question. …Only it wasn’t, as I was able to get
very acceptable performance in many modern titles at 4K simply by turning down the settings
somewhat. And I don’t mean turning things down to lowest, but rather replacing ULTRA with
MEDIUM or HIGH, which would instantly double the framerate I’d get, with very little
impact on graphical quality. I’m not going to recommend a GeForce GTX 670
if you want to game in 4K. Of course, the more power you have the better. But the point
I want to get across is that you CAN game quite happily at 4K without needing the most
powerful card available, even if the benchmarks say otherwise.
By the way, this is a real backtrack from me. I have always loved the cutting edge,
prioritising highest quality settings above everything else, including resolution and
framerate. I played through FEAR at 640×480 just so that I could have everything turned
up to highest, complete with those parallax bullet holes. And despite Oblivion only letting
you choose HDR or AA, I found ways of enabling both just so that I could experience it that
way. And Crysis didn’t run too well on Very High, but I stuck with it because I felt it
was the one game where it would be a shame to compromise on the graphics. So when I
tell you that I’m happy with lower quality settings in modern titles, I genuinely believe
it and am not just making up excuses. The reason I no longer care is that high
graphics settings no longer represent what they used to. Go back 10 or 15 years and it’s
clear that games were designed with ‘high settings’ in mind. Then, to accommodate
lower settings and slower systems, they’d potato-ise the graphics. You wouldn’t want
to play on those unless you had to. But modern games don’t do this! They look beautiful,
even at lower settings. I have always believed that developers use current-gen consoles as
a baseline, but even modern consoles such as the PlayStation 4 and Xbox One have enough
horsepower to produce perfectly pretty graphics! If anything, it’s now the higher graphics
qualities that are the afterthoughts. Let’s face it: there’s no excuse for games
to look bad or to run slowly today. If they do then the fault lies with the developers
more than with the hardware in our machines. And I like the way that developers now target
the lowest settings, rather than the highest. It helps more people play the game the way
it should be played. I first remember this happening with Crysis.
Whilst the first game looked beautiful on high and very high, turn it down and it looked
potato. But Crysis 2 and 3 remain very nice on lowest settings and only add to it as you
go up. The third game looks nice even on medium, whilst the
framerate is far more desirable than on ultra. Would you really want that compromise, even
if you could run Ultra at 60? Unless a game is intentionally designed with
higher quality settings in mind, I don’t think that it’s worth the performance hit.
Even if I can manage 60 FPS on ULTRA everything, I’d rather have 120 FPS on high settings
and would be hard-pressed to notice any drop in quality most of the time. Hell, part of
the appeal to me is in finding a combination of lower settings that looks similar to highest
yet runs several times faster! I feel like a genius.
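That tuning process can be sketched in code. Below is a toy model with entirely invented quality scores and frametime costs (no real game exposes numbers like these): for each setting, pick the cheapest tier whose look stays within a tolerance of ultra’s.

```python
# Toy model of "optimised settings" hunting. Every number here is
# invented for illustration; real costs vary per game and per GPU.
# Each tier maps to (visual quality out of 10, relative frametime cost).
SETTINGS = {
    "shadows":  {"low": (5, 1.0), "medium": (8, 1.6), "high": (9, 2.4), "ultra": (10, 4.0)},
    "textures": {"low": (6, 1.0), "medium": (9, 1.1), "high": (10, 1.2), "ultra": (10, 1.3)},
    "post_fx":  {"low": (7, 1.0), "medium": (9, 1.3), "high": (9, 1.8), "ultra": (10, 3.0)},
}

def tune(tolerance=1):
    """For each setting, pick the cheapest tier whose quality is
    within `tolerance` points of that setting's ultra tier."""
    chosen = {}
    for name, tiers in SETTINGS.items():
        ultra_quality = tiers["ultra"][0]
        candidates = [(cost, tier) for tier, (quality, cost) in tiers.items()
                      if ultra_quality - quality <= tolerance]
        chosen[name] = min(candidates)[1]  # cheapest acceptable tier
    return chosen

def total_cost(choice):
    """Summed relative frametime cost of a settings combination."""
    return sum(SETTINGS[name][tier][1] for name, tier in choice.items())
```

With these made-up figures, the tuned combination lands on high shadows and medium everything else, at a little over half the frametime cost of all-ultra, while staying within one quality point of it across the board.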
So what’s going on? It’s simple: PCs are becoming more powerful. And there’s only
so much power that you need to make 1080p look good. For more detail or improved texture
quality you’re going to want a higher-resolution screen! And indeed, we are seeing a shift
towards 1440p and even 4K. This has the inevitable side-effect that anybody sticking to 1080p
will have absolutely no problem running the latest games, even with a budget graphics
card. A Radeon RX 460 or GeForce GTX 1050 won’t struggle to make today’s games look perfectly
acceptable, using perfectly acceptable graphics settings. It’s only when you try to ramp
everything up to max that they begin to struggle. And at the other end of the spectrum, a powerful
graphics card will easily handle 4K gaming at all but the highest settings.
But I do find that these highest settings are regularly used when testing cards. Not
all the time, granted, but still too often for my liking. I don’t think that a card
that fails to reach 60 FPS on ultra settings should be immediately written off as
unplayable on that particular game. If it can’t manage it at low or medium, then maybe. But ultra certainly
isn’t a fair representation of a game’s requirements, or the way in which most gamers
can or would even prefer to game. Especially when reviewing budget cards or unusual resolution
choices. Somebody looking to game in those situations won’t get useful information
from a benchmark that obsesses over the highest possible quality, a setting that I feel is
increasingly becoming a niche choice for those who don’t care about high framerates or
resolutions or anti-aliasing. Ultra’s a nice option to have, yes, but it certainly shouldn’t
be the standard, or even the thing that players aspire to.
So why have ultra settings, if they sap performance for such a small graphical gain? It’s simply
because game developers can. They want to make their games look as good as possible
for promotional purposes, optimisation be damned! And gamers with powerful rigs but
who are still on 1080p monitors will want to justify their lopsided system choices.
Why not lavish 50% of their system power on an ultra-demanding post-process effect that
improves the graphics by like, 1%? And that is why I don’t believe that benchmarks
should use ultra settings. The plus side may be that you create a test that challenges
every tier of graphics card. But the downside is that it’s pointless for people actually
looking to run those games. In most modern titles, medium or high settings more accurately
represent the optimum compromise between quality and performance. Beyond that, you get drastically
diminished returns. And I mean like, 5% better graphics for 50% less performance.
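To put rough numbers on that diminishing-returns curve (all figures are illustrative, not measurements from any real game), here’s the marginal quality gained per frame sacrificed at each preset step:

```python
# Hypothetical preset figures: (average FPS, subjective quality out of 100).
# Nothing here is measured; the numbers just illustrate the shape of the curve.
presets = {
    "low":    (142, 70),
    "medium": (110, 88),
    "high":   (84, 95),
    "ultra":  (46, 100),
}

def marginal(lower, upper):
    """Quality points gained per FPS sacrificed when stepping up a preset."""
    fps_lo, quality_lo = presets[lower]
    fps_hi, quality_hi = presets[upper]
    return (quality_hi - quality_lo) / (fps_lo - fps_hi)

tiers = list(presets)
for lower, upper in zip(tiers, tiers[1:]):
    print(f"{lower} -> {upper}: {marginal(lower, upper):.2f} quality per FPS lost")
```

With these made-up figures, the final step up to ultra buys roughly a quarter of the quality-per-frame that the low-to-medium step does, which is the shape of the curve I see in practice.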
I’m happy the option’s there for those who care, and for when people revisit the
game in 10 years’ time and simply want it to look as good as possible because they can.
But I’d bet that 99% of gamers, myself included, wouldn’t even notice, let alone care, that a game
isn’t on maximum settings until shown the graphics menu. Hell, it can even be hard to
see the differences in side-by-side comparisons unless it’s paused, and then flicked between
quickly. And even if differences can be seen, I still sometimes struggle to know if a change
is a genuine improvement. A lot of the time it’s not a definitive improvement at all,
merely an artistic difference. And even if it is an improvement, is it anywhere near
worth the performance drop that it adds? (Which is often the one thing that IS noticeable,
and substantially worse!) I think that reviewers should study the games, see what is needed
to make them look good, and then use those settings for benchmarking purposes.
There’s no reasonable excuse not to do this. Don’t give me this ‘future-proofing to simulate
tomorrow’s games’ crap. Or that ultra settings are justified simply by existing. Highest
settings are only tested to appeal to readers rather than gamers. Instead, I want benchmarks
that reflect how a game plays. If a person is looking to buy a budget card then they
won’t care that it can handle all modern games at 15 FPS on ultra settings, if on high
it can output 60 and look pretty much the same. And as a 4K user with a powerful system
myself, I can assure you I’d rather have playable performance at high settings than
a slideshow for the sake of an extra FPS-sapping lens-flare effect.
Highest settings used to mean something. But now, judging hardware based on them is hindering
progress more than it’s helping. Please stop, and keep the ultra settings graphics
porn to your own time, in your own room, where no one else can see you.