WEBVTT

00:00:00.000 --> 00:00:03.920
If you've got an NVIDIA graphics card, you may have noticed that the majority of their

00:00:03.920 --> 00:00:09.920
driver updates are labeled as Game Ready drivers, with the idea being that, in addition to bringing

00:00:09.920 --> 00:00:15.280
general updates that benefit everyone, these drivers are optimized for a specific new game

00:00:15.280 --> 00:00:21.280
and are released very near or even on the game's launch day. But guess what? With a few exceptions,

00:00:21.280 --> 00:00:26.320
downloading the latest driver for your graphics card probably isn't going to give you a massive

00:00:26.320 --> 00:00:32.240
performance boost, even in that new game. Huh? So what do they actually do? Well, we'll get to that.

00:00:32.240 --> 00:00:37.280
But first, a quick history nugget. It turns out this misconception about performance goes all

00:00:37.280 --> 00:00:44.160
the way back to the days of the Riva TNT, an NVIDIA product from 1998. There was a famous driver for

00:00:44.160 --> 00:00:51.040
the Riva TNT called Detonator, which actually did boost performance significantly due to the inclusion

00:00:51.040 --> 00:00:57.840
of optimizations for AMD CPUs that previous drivers lacked. This was great press for NVIDIA,

00:00:57.840 --> 00:01:02.480
but unfortunately it ended up setting an expectation that drivers would be some kind

00:01:02.480 --> 00:01:07.680
of silver bullet for performance issues. The issue with that is they're usually not,

00:01:07.680 --> 00:01:12.720
and a big part of the reason it's rare to see big performance jumps with new drivers these days

00:01:12.720 --> 00:01:18.560
is that devs work closely with NVIDIA while the game is in development. This isn't to say you

00:01:18.560 --> 00:01:23.600
never see significant performance increases with new NVIDIA drivers, but it's not as common as you

00:01:23.600 --> 00:01:28.640
might think, especially as tweaks to the drivers are often made with specific games in mind

00:01:28.640 --> 00:01:33.360
well before launch. That means if you've updated your drivers with any regularity,

00:01:33.360 --> 00:01:38.720
there's a good chance your game will perform just fine even if you're not yet using the specific

00:01:38.720 --> 00:01:45.520
Game Ready driver for that game. NVIDIA holds over 80% of the discrete GPU market,

00:01:45.520 --> 00:01:50.080
so it probably isn't a surprise that they have plenty of contacts at major game studios.

00:01:50.080 --> 00:01:55.120
There's actually a regular cadence where NVIDIA sends out pre-release drivers to developers on

00:01:55.120 --> 00:01:59.440
Mondays, and from there, there's back and forth between the game developers and the NVIDIA folks

00:01:59.440 --> 00:02:05.440
to get things to play as nicely as possible before launch day. And while NVIDIA can't possibly test

00:02:05.440 --> 00:02:12.240
every single hardware configuration, they do test over 4,500 combinations, going all the

00:02:12.240 --> 00:02:17.520
way back to 2012 as of the time this video was shot. In case you were wondering, this is how

00:02:17.520 --> 00:02:23.360
they figure out what the optimal settings for your PC are in GeForce Experience. But enough context

00:02:23.360 --> 00:02:28.480
already, what specifically about these drivers do they tweak before release? Rather than pushing

00:02:28.480 --> 00:02:34.160
drivers to squeeze every possible frame per second from an upcoming game, the general approach with

00:02:34.160 --> 00:02:39.520
Game Ready drivers is to ensure stability. For example, NVIDIA has outright disabled

00:02:39.600 --> 00:02:44.560
Resizable BAR support for certain games, as it actually caused performance degradation with

00:02:44.560 --> 00:02:49.920
certain hardware configurations, something that notably happened with Hitman: World of Assassination,

00:02:49.920 --> 00:02:54.960
where NVIDIA turned off the feature on systems with Intel CPUs to prevent a performance hit.

00:02:54.960 --> 00:03:00.320
But if you had an AMD CPU, NVIDIA left Resizable BAR on because it wasn't causing performance

00:03:00.320 --> 00:03:06.240
issues. That's not fair. Not only that, tweaks like this tend to be for an entire product stack.

00:03:06.240 --> 00:03:10.160
So for example, if there's a problem with an optional feature that's being seen with a lot

00:03:10.160 --> 00:03:16.400
of RTX 4080s, it's often switched off for the entire Lovelace architecture instead of creating

00:03:16.400 --> 00:03:22.480
a situation where that feature has to get retested on a per-card basis for every subsequent driver

00:03:22.480 --> 00:03:28.640
release. And of course, Resizable BAR is just one of a huge number of levers NVIDIA can pull.

00:03:28.640 --> 00:03:34.800
DLSS, other AI features, and the actual CUDA cores themselves all need to be able to support a massive

00:03:34.800 --> 00:03:40.160
number of games, which is part of the reason modern GPU drivers have gotten so huge. Your new

00:03:40.160 --> 00:03:45.440
Game Ready driver contains code describing how to behave when you're playing that specific new

00:03:45.440 --> 00:03:51.120
game. NVIDIA has actually tried to strip out unnecessary components to get driver sizes down

00:03:51.120 --> 00:03:56.640
from around a gigabyte to around 600 megabytes, which is still pretty gigantic. And with all of

00:03:56.640 --> 00:04:02.720
those variables, Game Ready drivers still can and sometimes do cause issues, even after extensive

00:04:02.720 --> 00:04:08.640
testing. This is part of the reason NVIDIA also offers Studio drivers geared toward even greater

00:04:08.640 --> 00:04:14.000
stability for content creators, which aren't pushed out as frequently as Game Ready drivers.

00:04:14.000 --> 00:04:19.760
The issues that Game Ready drivers sometimes face are a possible point in favor of AMD's

00:04:19.760 --> 00:04:25.840
slower release cadence for its WHQL-certified Radeon drivers, and something that the two companies

00:04:25.840 --> 00:04:32.080
have publicly criticized each other on. Of course, AMD has also had issues with its drivers at times

00:04:32.080 --> 00:04:36.800
with their Anti-Lag+ feature causing gamers to run into problems with anti-cheat software

00:04:36.800 --> 00:04:42.800
in late 2023, standing out as an example. So slower cadences aren't necessarily a perfect

00:04:42.800 --> 00:04:48.960
solution either. And the industry's transition towards the DirectX 12 API has actually shifted

00:04:48.960 --> 00:04:54.720
more of the onus for getting things right towards the game developers and away from GPU companies.

00:04:54.720 --> 00:04:59.360
The intent behind this is to give developers more control over their own game engines,

00:04:59.360 --> 00:05:05.760
but an unintended and common consequence has been that devs can violate the DirectX 12 spec

00:05:05.760 --> 00:05:11.120
even when they believe they're following it. So I wouldn't expect those hotfix drivers that

00:05:11.120 --> 00:05:17.200
target a specific problem to go away anytime soon. It's not exactly like trillion-dollar companies

00:05:17.200 --> 00:05:21.760
can't make mistakes. Hey, thanks for watching that whole video! Like it if you liked it, dislike

00:05:21.760 --> 00:05:25.200
it if you disliked it, check out our other videos, comment below with video suggestions,

00:05:25.200 --> 00:05:30.000
and don't forget to subscribe and follow. What are you doing here if you didn't like the video?
