Ultra Settings Suck

Our standards have slowly risen over the years,
from the lowly 640×480 resolution up to today’s 1080p and beyond. I’m sure we used to treat
30 FPS as the acceptable minimum, whereas now it’s almost always 60. Unless you’re
a console gamer. And of course, we also want today’s games to look better than ever before.
But I think that somewhere along the way, we’ve lost sight of what really matters.
We’ve begun reviewing hardware in an increasingly irrelevant way, one that has gone as far as to
be misleading to readers looking to buy. Now I’m not talking about bottlenecking,
frametimes and the usual things. They’re spoken about enough already. I’m looking
at the ULTRA QUALITY setting in games.

This realisation started for me with the high-end,
when 4K benchmarks started being introduced a few years ago. Back then, results made it
very clear that no system was anywhere close to delivering acceptable framerates at this
resolution. It’s only been with the 980 Tis and Fury Xs that dabbling with this
resolution has become possible, and indeed only with the recent 1080 Ti that we’ve
seen 60 FPS at 4K in AAA titles. In other words: to play at 4K, you need the BEST.
But this hasn’t reflected my experiences at all. Back when I first got a 4K monitor,
I sported a GeForce 670. By today’s standards it’s slower than all but the lowest
of modern cards, at about ¼ the speed of a GeForce 1080. Clearly, 4K with it should be
out of the question. …Only it wasn’t, as I was able to get
very acceptable performance in many modern titles at 4K simply by turning down the settings
somewhat. And I don’t mean turning it down to lowest. But rather replacing ULTRA with
MEDIUM or HIGH, which would instantly double the framerate I’d get and with very little
impact on the graphical quality. I’m not going to recommend a GeForce 670
if you want to game at 4K. Of course, the more power you have the better. But the point
I want to get across is that you CAN game quite happily at 4K without needing the most
powerful card available, even if the benchmarks say otherwise.

By the way, this is a real backtrack from me. I have always loved the cutting edge,
prioritising highest quality settings above everything else, including resolution and
framerate. I played through FEAR at 640×480 just so that I could have everything turned
up to highest, complete with those parallax bullet holes. And despite Oblivion only letting
you choose HDR or AA, I found ways of enabling both just so that I could experience it that
way. And Crysis didn’t run too well on Very High. But I stuck with it because I felt it
was the one game where it would be a shame to compromise with the graphics. So when I
tell you that I’m happy with lower quality settings in modern titles, I genuinely believe
it and am not just making up excuses. The reason I no longer care is because high
graphics settings no longer represent what they used to. Go back 10 or 15 years and it’s
clear that games were designed with ‘high settings’ in mind. Then, to accommodate
lower settings and slower systems, they’d potato-ise the graphics. You wouldn’t want
to play on those unless you had to. But modern games don’t do this! They look beautiful,
even at lower settings. I have always believed that developers use current-gen consoles as
a baseline, but even modern consoles such as the PlayStation 4 and Xbox One have enough
horsepower to produce perfectly pretty graphics! If anything, it’s now the higher graphics
qualities that are the after-thoughts. Let’s face it: there’s no excuse for games
to look bad or to run slowly today. If they do then the fault lies with the developers
more than with the hardware in our machines. And I like the way that developers now target
the lowest settings, rather than the highest. It helps more people to play the game the way
it should be played. I first remember this happening with Crysis.
Whilst the first game looked beautiful on high and very high, turn it down and it looked
potato. But Crysis 2 and 3 remain very nice on lowest settings and only add to it as you
go up. As you can see from this, the third game looks nice even on medium, whilst the
framerate is far more desirable than on ultra. Would you really want that compromise, even
if you could run Ultra at 60? Unless a game is intentionally designed with
higher quality settings in mind, I don’t think that it’s worth the performance hit.
Even if I can manage 60 FPS on ULTRA everything, I’d rather have 120 FPS on high settings
and would be hard-pressed to notice any drop in quality most of the time. Hell, part of
the appeal to me is in finding a combination of lower settings that looks similar to highest
yet runs several times faster! I feel like a genius.
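That settings hunt can even be written down as a simple greedy loop. This is a toy sketch only: the FPS costs are made up, and measure_fps() stands in for actually running the game’s benchmark and judging the result by eye.

```python
# Toy sketch of hunting for "looks like ultra, runs much faster".
# The FPS costs below are made up for illustration; measure_fps()
# stands in for running the game's actual benchmark.

SETTINGS = ["textures", "shadows", "post_processing", "anti_aliasing"]
LEVELS = ["low", "medium", "high", "ultra"]

# Hypothetical FPS cost of each setting at each level.
COST = {
    "textures":        {"low": 2, "medium": 3, "high": 4, "ultra": 6},
    "shadows":         {"low": 2, "medium": 4, "high": 7, "ultra": 14},
    "post_processing": {"low": 1, "medium": 2, "high": 5, "ultra": 12},
    "anti_aliasing":   {"low": 1, "medium": 3, "high": 6, "ultra": 10},
}
BASE_FPS = 100  # FPS with every setting hypothetically free

def measure_fps(config):
    """Pretend benchmark: base FPS minus the cost of each chosen level."""
    return BASE_FPS - sum(COST[s][config[s]] for s in SETTINGS)

def tune(target_fps):
    """Start at all-ultra, then lower whichever setting buys the most FPS."""
    config = {s: "ultra" for s in SETTINGS}
    while measure_fps(config) < target_fps:
        candidates = [s for s in SETTINGS if config[s] != "low"]
        if not candidates:
            break  # already at minimum everywhere

        def saving(s):
            cur = LEVELS.index(config[s])
            return COST[s][LEVELS[cur]] - COST[s][LEVELS[cur - 1]]

        best = max(candidates, key=saving)
        config[best] = LEVELS[LEVELS.index(config[best]) - 1]
    return config
```

Starting from everything on ultra, the loop repeatedly drops whichever setting buys the most frames per step, and with these invented numbers it hits 75 FPS while leaving textures untouched on ultra.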
So what’s going on? It’s simple: PCs are becoming more powerful. And there’s only
so much power that you need to make 1080p look good. For more detail or improved texture
quality, you’re going to want a higher-resolution screen! And indeed, we are seeing a shift
towards 1440p and even 4K. This has the inevitable side-effect that anybody sticking to 1080p
will have absolutely no problem running the latest games, even with a budget graphics
card. A Radeon RX 460 or GeForce 1050 won’t struggle to make today’s games look perfectly
acceptable, using perfectly acceptable graphics settings. It’s only when you try to ramp
everything up to max that they begin to struggle. And at the other end of the spectrum, a powerful
graphics card will easily handle 4K gaming at all but the very highest settings.

But I do find that these highest settings are regularly used when testing cards. Not
all the time, granted, but still too often for my liking. I don’t think that a card
that fails to reach 60 FPS on ultra settings should be immediately dismissed from being
playable on that particular game. If it can’t manage that even at low or medium, then maybe. But ultra certainly
isn’t a fair representation of a game’s requirements, or the way in which most gamers
can or would even prefer to game. Especially when reviewing budget cards or unusual resolution
choices. Somebody looking to game in those situations won’t get useful information
from a benchmark that obsesses over the highest possible quality, a setting that I feel is
increasingly becoming a niche choice for those who don’t care about high framerates or
resolutions or anti-aliasing. Ultra’s a nice option to have, yes, but certainly shouldn’t
be the standard or even the thing that players aspire to run things at.

So why have ultra settings, if they sap performance for such a small graphical gain? It’s simply
because game developers can. They want to make their games look as good as possible
for promotional purposes, optimisation be damned! And gamers with powerful rigs but
who are still on 1080p monitors will want to justify their lopsided system choices.
Why not lavish 50% of their system power on an ultra-demanding post-process effect that
improves the graphics by like, 1%? And that is why I don’t believe that benchmarks
should use ultra settings. The plus-side may be that you create a test that challenges
every tier of graphics card. But the down-side is that it’s pointless for people actually
looking to run those games. In most modern titles, medium or high settings more accurately
represent the optimum compromise between quality and performance. Beyond that, you get drastically
diminished returns. And I mean like, 5% better graphics for 50% less performance.
I’m happy the option’s there for those who care, and for when people revisit the
game in 10 years’ time and simply want it to look as good as possible because they can.
But I doubt that 99% of gamers, myself included, would even notice, let alone care that a game
isn’t on maximum settings until shown the graphics menu. Hell, it can even be hard to
see the differences in side-by-side comparisons unless it’s paused, and then flicked between
quickly. And even if differences can be seen, I still sometimes struggle to know if a change
is a genuine improvement. A lot of the time it’s not even a definitive improvement graphically,
but often simply artistically. And even if it is an improvement, is it anywhere near
worth the performance drop that it adds? (That drop is often the one thing that IS noticeable,
and substantially worse!) I think that reviewers should study each game to see what is needed
to make it look good, and then use those settings for benchmarking purposes.
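To put rough numbers on that idea, here is a minimal sketch of how a reviewer might report each preset step as a performance cost. The FPS figures are invented purely for illustration; a real review would measure them in-game:

```python
# Hypothetical preset comparison: what share of its framerate does a
# card give up at each step? All FPS numbers are invented for
# illustration; a real review would measure them in-game.

presets = {           # average measured FPS per preset (illustrative)
    "low": 142,
    "medium": 118,
    "high": 96,
    "ultra": 61,
}

def relative_cost(presets):
    """Percent of FPS lost stepping up from the previous preset."""
    names = list(presets)
    return {
        cur: round((presets[prev] - presets[cur]) / presets[prev] * 100, 1)
        for prev, cur in zip(names, names[1:])
    }

print(relative_cost(presets))
# {'medium': 16.9, 'high': 18.6, 'ultra': 36.5}
```

With these made-up numbers, the jump from high to ultra costs roughly twice the framerate of any other step, which is exactly the diminishing-returns cliff a review could make visible.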
There’s no reasonable excuse not to do this. Don’t give me this ‘future-proofing to simulate
tomorrow’s games’ crap. Or that ultra settings are justified simply by existing. Highest
settings are only tested to appeal to readers rather than to serve gamers. Instead, I want benchmarks
that reflect how a game plays. If a person is looking to buy a budget card then they
won’t care that it can handle all modern games at 15 FPS on ultra settings, if on high
it can output 60 and look pretty much the same. And as a 4K user with a powerful system
myself, I can assure you I’d rather have playable performance at high settings than
I would a slideshow for an extra FPS-sapping lens flare effect.

Highest settings used to mean something. But now, judging hardware based on them is hindering
progress more than it’s helping. Please stop, and keep the ultra settings graphics
porn to your own time, in your own room, where no one else can see you.

100 thoughts on “Ultra Settings Suck”

  1. The law of diminishing returns. I have a GTX 1070 and I will turn my games right down to low if I see a single dip in frame rate in the most demanding scene in the game on any higher setting. Sadly, you will see dramatic performance dips even in well-optimized games on high-end hardware when explosions happen and effects begin to compound. 4K running smooth all the time is a pipe dream at this point. That is just me, and I won't turn the resolution down; everything else goes first for me, and I game at 1080p. We are simply not there yet unless you have a supercomputer, with some exceptions of course

  2. "nowadays the standard is 60"

    The standard was ALWAYS 60. Did you completely miss out on every generation prior to the 5th?

  3. 1080p is still used by most.
    Proof here.
    I would rather have ultra or max at 1080p at high fps than 4K medium at 60fps.

  4. I would rather play at 40 FPS on ultra than 60 on high, simply because I like the little extra things that ultra settings add

    If you're like me, turning up the post-processing and all the fancy bling-bling effects in a game to ultra while keeping the texture and shadow resolution at high would still give you high FPS while also giving you the blings of ultra

  5. In some games, like The Witcher 3, AC Odyssey, or Metro Exodus, I'd rather play at the highest settings possible at 30fps than turn down the settings for 60fps.

    But in games like Doom, the Wolfenstein series, NieR, BioShock Infinite, or the Call of Duty series, 60fps is a must

  6. One of my proudest achievements has been running The Witcher 3 at 4K on custom settings, with draw distance to the horizon, at 60 fps locked. Honestly, it looked better than ultra at 1080p or 1440p due to image sharpness, and it ran smooth as butter.

  7. Most all of graphics, and indeed art, relies on a combination of tricks to give the illusion of realism. So the greatest level of immersion, let alone enjoyment, doesn't require the most accuracy. Imagine trying to simulate worlds at the molecular level just to add 1% realism to the physics engine. There's always a point of rapidly diminishing returns.

  8. This is why PC is better than consoles. You can manipulate the setting to whatever you want. Be it high or low resolution. Don't even get me started on ultra setting mods. Because those are fucking legendary.

  9. People still think the same about vr, which really irritates me because the requirements have dropped as the rendering methods and synthetic frame injection gets better and better, while people think you'd be barely able to play vr on a 1060. I have literally seen people playing some pretty smooth vr all the way down to a GTX 660 Ti by adjusting the settings.

  10. I honestly always strive for High settings. The differences between Ultra and High settings are so negligible that I don't even notice. If going a step below gives me a good enough boost in FPS, then Ultra settings be damned.

  11. This makes a 1050Ti owner quite happy!
    Funnily enough, GTA V and The Witcher 3 shown in the video don't really run too well at all despite lowering the settings.

  12. Fun fact: I get 400fps on highest settings in csgo on my desktop and 10-30fps on my laptop on the lowest settings.

  13. As a person that upgraded my GPU and went from 30-35 FPS on low settings to 50-55 FPS on high settings in games like XCOM 2, I can say I'm OK with an average graphics card that can run average settings

  14. Running benchmarks on the highest possible setting makes sure that the bottleneck is on the GPU and not the CPU… I think…..

  15. 25-30 fps, 1080p, Ultra only on Real-time.
    If you never had this – you should be ashamed of yourself

  16. I still want a GTX 1080 Ti, mainly cause I have a 144Hz monitor and my 970 can only go to 60 to 70 at high most of the time. And I don't wanna turn down my settings…. cause shush 😛

  17. My friend has a GTX 1080 Ti and he is not happy with the framerates he gets: nearly every single game on high at 100+ fps. Meanwhile I have an Intel HD Graphics 5500 and barely get 30 fps in CS:GO at lowest settings and 720p, and that game is now 7 years old.

  18. They don't make things look better by improving the pixels anymore, they improve by increasing rendered objects and that is what you need horsepower for. Ultra becomes a setting where more and more visual objects are to be rendered instead and if done correctly that will make things look even more beautiful and less artificial, that's a completely valid trade off imho. Side by side comparisons at 3:35 to 3:55 for examples.

  19. For me there is little to no change between ultra and medium; the only thing I notice is that medium runs at 90 FPS and ultra at 45 FPS.

  20. I mean, we have a broader issue regarding graphics in modern gaming as a whole, as games increasingly dump higher portions of their budgets on graphical fidelity alone, willing to sacrifice everything else, from plot to gameplay, so they can focus as many working hours on graphics as possible. Most modern shooters become mere interactive movies with little to no player interaction, as gameplay was only an afterthought after the developers spent 99% of the budget on getting those wind physics, and the grass, into the game.

  21. I agree that developers have everything to do with how the game runs on any rig. GTA V runs buttery smooth on 1080p while the Witcher 3 struggles at 720p lowest settings. Not to mention terrible pubg optimization

  22. I'm not one of those rich gamers so I've never properly experienced what "good" graphics look like but I can say this, I have a crappy GeForce GT 730, I can run Overwatch, GTA V on really low settings and I still think that they look amazing. Ultra settings are way overrated; enjoy the game, not the graphics.

  23. I played Battlefield 1 and 3 on lowest settings. They look beautiful and at the same time run at least 40 fps at low resolutions. Still enjoy it.

  24. I don't know why, but skyboxes like in Half-Life or CS maps (for example as_oilrig) look awesome to me. Better than sky spheres. Old low-res skyboxes and images, I love them. When I'm downloading something I open CS:CZ (1.6 doesn't have bots) for 5 or 10 minutes, but end up playing for an hour (then checking download progress and opening the game again). But other games I generally play for 30 minutes and then try something else. Why? They are newer and should be better.

  25. You have a very strong point, although I disagree with the part where you mention that benchmarks should not be shown at the highest settings. The point of a benchmark is to push the system to its limits and see how cards compare under the stress that might be shown to them over the 2-4 year lifespan someone owns the card. The framerates of a 1050 and a 2080 Ti on lowest settings will be much closer, because it is not hard to run most games; there comes in the bottleneck of the CPU, which exists no matter what. Therefore benchmarks should always be on the highest settings.
    Video games make their graphics beautiful because people care about how the game looks. I play almost all games I own on lowest settings to get that sweet 144fps. Many people prefer quality over frames, not to mention streamers. Also, not everyone plays first-person shooters where you get an advantage on lowest settings. Games that are slower, like you showed in the video, The Witcher 3, they were built to be beautiful graphics and story that immerses you. Why would anyone want to be running around a low poly world if they could make it look beautiful by changing the graphical settings? (Assuming their hardware can handle medium to high settings.)
    Do I even need to mention Virtual Reality games coming in the future, where ultra won't just look good? They may be so believable that you forget you're in a game.

    These are my two cents, I do agree higher fps is king. But there are some criticisms still here! Have a nice day!

  26. Is the GTX 960 considered a "weak card"? I assume the GT 755 is horrible, right? Funny, cuz games run OK on my computer anyway

  27. What I do when I get a new game is I first set every setting to maximum, test the game and if the FPS feels too low I'll gradually decrease the settings until I find a good compromise between quality and performance.

  28. Yes, this is so true. I currently have a G4560, a very basic budget CPU with 2 cores and 4 threads; it's all I can afford right now. The reason I chose it is that it was at a very reasonable price and has been reviewed so many times on YouTube. I know that all I need now is a good graphics card; something like an RX 570 or GTX 1060 should do just fine. I do expect some bottlenecks in some games, but I don't plan to play everything. But the ULTRA preset is a big fat scam: it doesn't really improve the experience at all, it makes it worse. He's right, I'd much rather have an average of 120 frames on HIGH than 60 fps on ULTRA. Games look like movies now; it's crazy how great the graphics have gotten over the years, to the point that the ultra preset just doesn't make sense.

  29. I have a 4-year-old computer, bought a GTX 1080 for $300 on eBay, and run at 1440p @ 120Hz/HDR with ultra settings on a 4K 120Hz native Quantum, and it gets a stable 90+ FPS in Far Cry New Dawn. It doesn't take much. 4K is a different story though. 1080p @ 60fps on ultra settings doesn't really take much at all. That being said, things have changed significantly since this video was made.

  30. Hey, I'd like to bring HiAlgoBoost to your attention. It helped me run many games at good settings with a budget GPU. The method they use is really ingenious: the software lowers the resolution of the game when things on screen are actually blurry. When you move your camera in game, things are already blurry, so it lowers the resolution and you get fluid camera movement from high fps. But when you stop to look at something, everything renders beautifully at full resolution; fps will be lower, but you don't need high fps when you are not moving the camera. It really is a genius solution, with a difference you have to try hard to notice. Games should make use of their idea, but the software never got the attention it deserved.
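The idea described above, render resolution tied to camera motion, can be sketched roughly like this. A toy illustration only, not HiAlgoBoost's actual algorithm; the threshold and scale floor are arbitrary values chosen for the example:

```python
# Toy sketch of dynamic resolution scaling driven by camera motion
# (not HiAlgoBoost's actual code; threshold and floor are arbitrary).

def render_scale(camera_speed, threshold=0.5, min_scale=0.5):
    """Return a resolution multiplier in [min_scale, 1.0].

    Below `threshold` the camera is treated as still and we render at
    full resolution; above it, motion blur hides detail anyway, so the
    resolution ramps down toward `min_scale` for smoother framerates.
    """
    if camera_speed <= threshold:
        return 1.0
    excess = min(camera_speed - threshold, 1.0)  # clamp the ramp
    return max(1.0 - excess * (1.0 - min_scale), min_scale)
```

A renderer would multiply its internal resolution by this factor each frame, so a fast pan drops to half resolution while a still shot renders in full.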

  31. For hardware reviews the question is mostly "what hardware is best", and those reviewers use those settings because they can expose hardware in ways that lower settings just won't. Honestly I think the problem you're running in to is that hardware reviewers don't review these products as "gamers". They review them as "tech press", and for that role what's important is simply to show what each piece of hardware will do relatively to other available hardware. I honestly think most buyers are smart enough to consider what settings they themselves will want to play at.

  32. It is arguably true that the "best" or "optimal" approach — at least for non-enthusiast hardware — is to try and find the point of diminishing return in visual quality for each setting and stop there.

    So, for example, if we have Low, Medium, High, and Ultra — usually this is gonna be the High setting, perhaps the Medium setting if you can't manage acceptable framerates with High.

    Another example would be using 4x or 8x passes of Anisotropic Filtering instead of 16x. Now, for this, I know that people are going to say that the performance impact is negligible and generally you're right — however, in modern games, many tests have shown that there is indeed a performance hit, it's just normally very small (perhaps 1 to 3 fps jumping from 4x to 16x), but in some cases where you're doing your darndest to hit your target framerate that can make all the difference and arguably you hit diminishing returns with that setting at 4 or 8 passes.

    Same can be said of modern post-process Anti-Aliasing — MSAA is probably best left out these days unless you have a lot of performance overhead. SMAA or TAA are usually the optimal route to go.

    If I remember correctly, mouse polling rate (CPU side) no longer has a meaningful impact on performance, but I personally can't tell the difference between 500 and 1000 so arguably on older PCs, it may be worth it to stick with 500, but I'm not certain if this is even an issue anymore.

  33. It still baffles me how great Crysis looks, even by today's standards. The environment (forest, sky etc.) is just stunning. Look at how the moonlight at 6:10 reflects in the leaves of the palm tree…
    This game is 12 freaking years old now.

  34. Yeah you've just convinced me to go with an ultrawide (which I've always wanted to game on) when I get my RX 5700XT. I've been hearing so many mixed opinions on whether that card will be enough for a decent 3440×1440 experience and the benchmarks are okay but not amazing. It's so true though, if it can run the most demanding AAAs of today on ultra at 40-60 then I'll be able to achieve 60+ at high/medium in future AAAs for a few years yet.

  35. I want to be able to keep up with consoles at bare minimum, plus an extra 30fps. But tbh, I've seen the different resolutions, and 1080p is all I need… I might switch to 1440p one day

  36. You can see at 4:34 how only the low setting is obviously compromised on the shadowing; the other settings just create more detailed shadows. I noticed in XCOM 2 that I could not tell the difference, at 4K, between the highest settings. I presume the main difference is shadows. AA makes a big difference in performance, but interestingly, I think XCOM 2 sets AA to FXAA even for the highest preset, due to MSAA (I think that's what it is) having a big impact on FPS. (I mention XCOM 2 because it has one of the worst framerates of all my games.)

  37. It's true, lately you can't make out the differences between ultra settings and medium settings unless you had a side by side comparison and you went out of your way to look for the tiny little meaningless differences.

  38. You are kind of wrong. The ultra settings are actually a very good way to stress-test cards. I usually pay more attention to differences between cards at Ultra settings than at Very High/High, even if I will not play on that card on Ultra.

  39. There are more RTX cards than RTX games. Where are the THOUSANDS of games promised by the liar in leather?
    Conclusion: Turing hasn't been and will never become a valid choice for ray tracing, based on these facts:

    From launch till now, not many games support ray tracing, and the performance hit is huge, so not many people use it on a daily basis. Fail, massive fail.

  40. Stop it, it's okay to have a lower-end GPU, but don't make excuses for not having every setting maxed out at 240+ fps

  41. What you're saying is stupid:

    When comparing GPUs, running the games on ultra is used to fully load the GPUs and show their real performance.

    When just reviewing a single GPU in games, multiple resolutions and settings are often tested; if those who you trust to review GPUs don't do it, well, that's because they're bad at it and don't offer a proper review. To see a card run a particular game at multiple settings, a lot of videos exist on YouTube anyway

  42. Just maybe the testers want to produce a comparable benchmark, because comparing 5 different systems on 5 games with 5 different settings either has them benching 125 times, or you not being able to tell any difference between the cards or games when everything is around 90 fps

  43. Why not just check the Ultra benchmarks to see if they stay above 30fps, then be pleasantly surprised when you turn it down to medium?

  44. Well you're right that ultra settings largely suck, but you're completely wrong about the reasons for it. Developers aren't just targeting low settings, it's simply that modern hardware is powerful enough that having options to do things like disable dynamic lighting (that made old games look "potato") no longer make sense. So the graphical options we have left are subject to significant diminishing returns – increasing shadow resolution for instance. If anything, many developers are no longer providing real "low" options anymore – it's more like high, higher, and highest.
