WEBVTT

00:00:00.000 --> 00:00:04.880
There's one big compromise that gamers have had to make for a long time.

00:00:04.880 --> 00:00:08.680
Do you want your games to look better, or do you want them to run faster?

00:00:08.680 --> 00:00:13.160
Typically, this has meant turning down your graphics settings to get more frames per second,

00:00:13.160 --> 00:00:20.800
especially if you don't have a high-end graphics card. But today, we're instead going to talk about the amount of input lag that gets introduced

00:00:20.800 --> 00:00:26.480
when you're trying to upscale a game. We consulted with our friend Amin Shabane over at Marseille to put this video together, so we'd

00:00:26.480 --> 00:00:33.880
like to thank him for his help. Now to be clear, I am not talking about your GPU rendering frames from scratch.

00:00:33.880 --> 00:00:38.480
What I'm referring to instead is what happens after your GPU finishes rendering a frame,

00:00:38.480 --> 00:00:45.760
and either the GPU or your display resizes the image to make it fit a certain resolution.

00:00:45.760 --> 00:00:49.600
You can see this if you're running a PC game below your monitor's native resolution

00:00:49.600 --> 00:00:55.080
to improve performance, or if you've hooked up an older console to a modern flat-panel TV.
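To make the idea concrete, here's a minimal sketch of the crudest form of that post-render resize step. The function name `upscale_nearest` and the nested-list "frame" representation are illustrative choices, not anything a real GPU or TV scaler uses:

```python
def upscale_nearest(frame, factor):
    """Simplest post-processing upscale: nearest-neighbour.
    Each source pixel is repeated into a factor-by-factor block
    so a low-res frame fills a higher-res screen. Fast, but it
    produces blocky edges, which is why fancier (and slower)
    scalers exist."""
    out = []
    for row in frame:
        # Stretch the row horizontally by repeating each pixel.
        wide = [px for px in row for _ in range(factor)]
        # Stretch vertically by repeating the widened row.
        out.extend(list(wide) for _ in range(factor))
    return out
```

Even this trivial method takes time proportional to the output resolution, and every more sophisticated method discussed below does strictly more work per frame.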

00:00:55.360 --> 00:00:59.600
There are different forms of upscaling, some of which look nicer than others, but they

00:00:59.600 --> 00:01:05.520
all require a certain amount of post-processing time which can introduce noticeable input lag,

00:01:05.520 --> 00:01:08.840
meaning that there's a delay between when you press a button or move a thumbstick or

00:01:08.840 --> 00:01:12.680
move the mouse, and the corresponding action appearing on the screen.

00:01:12.680 --> 00:01:18.400
This can seriously hinder gameplay for obvious reasons, especially in older titles like classic

00:01:18.400 --> 00:01:24.240
platformers where responsiveness is a huge part of making the game feel like you remember.

00:01:24.240 --> 00:01:31.360
But why does it introduce so much lag? Well, to get the image looking as nice as possible, some algorithms look at the frames that are

00:01:31.360 --> 00:01:37.280
rendered before and after the frame to be upscaled to better understand what a higher-res version

00:01:37.280 --> 00:01:40.920
of the same image is supposed to look like.

00:01:40.920 --> 00:01:45.760
Then they apply what they think are correct changes to the frame.

00:01:45.760 --> 00:01:49.760
This method of analyzing multiple frames that are held in what's called the frame buffer

00:01:50.200 --> 00:01:54.080
before they're shown to the user can definitely yield visual improvements.
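The latency cost of that buffering can be sketched in a few lines. This is a toy model of the timing, not any vendor's actual scaler; the class name `TemporalUpscaler` and the `lookahead` parameter are illustrative:

```python
from collections import deque

class TemporalUpscaler:
    """Why multi-frame ('temporal') upscaling adds input lag:
    to examine the frame *after* the one being upscaled, the
    scaler must hold frames back in a buffer, so every frame it
    outputs trails the newest rendered frame by `lookahead`
    frames -- delay the player feels as input lag."""

    def __init__(self, lookahead=1):
        self.lookahead = lookahead
        self.buffer = deque()

    def submit(self, frame):
        """Feed in a freshly rendered frame. Returns the frame that
        is now old enough to be upscaled and displayed, or None
        while the buffer is still filling."""
        self.buffer.append(frame)
        if len(self.buffer) > self.lookahead:
            return self.buffer.popleft()
        return None
```

With a lookahead of two frames at 60 fps, everything you see is already 2/60 ≈ 33 ms behind what the game rendered, before any processing time is even counted.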

00:01:54.080 --> 00:02:00.040
But not only is this a computationally time-consuming process that adds lag, it can also result

00:02:00.040 --> 00:02:04.240
in worse image quality if the frames it's examining were highly compressed.

00:02:04.240 --> 00:02:09.720
This can happen if you're watching a movie, for example. An alternative approach to reduce lag is to, instead of relying on multiple frames at

00:02:09.720 --> 00:02:15.440
one time, have the algorithm look at certain elements of a single frame that human brains

00:02:15.440 --> 00:02:24.040
are typically sensitive to. For example, Marseille's mCable Classic smart HDMI cable has a built-in

00:02:24.040 --> 00:02:28.800
library of objects like edges and textures that we naturally key in on.

00:02:28.800 --> 00:02:34.720
Think about how jaggies caused by bad anti-aliasing of edges are often really noticeable to us.

00:02:34.720 --> 00:02:40.080
Interestingly, characters' eyes are also a focus as humans are psychologically programmed

00:02:40.080 --> 00:02:43.720
to be very sensitive to what someone else's eyes are doing.

00:02:43.720 --> 00:02:47.520
This strategy of focusing mostly on key visual elements can greatly reduce lag

00:02:47.520 --> 00:02:52.680
time while improving visual quality due to its reliance on predetermined visual cues

00:02:52.680 --> 00:02:57.720
for the algorithm to focus on, as well as the fact it only examines one frame.
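As a rough illustration of a single-frame enhancement, here is a crude one-dimensional unsharp mask that boosts contrast at edges — the kind of jaggy-prone feature our eyes key in on — using only the current frame. The function `sharpen_row` and the `amount` parameter are hypothetical, far simpler than any shipping scaler:

```python
def sharpen_row(row, amount=0.5):
    """Single-frame enhancement sketch: push each pixel away from
    its local average (a 1-D unsharp mask). Because it never needs
    the frames before or after, nothing extra has to sit in a
    buffer, so the added latency stays low."""
    out = []
    for i, px in enumerate(row):
        left = row[max(i - 1, 0)]          # clamp at the borders
        right = row[min(i + 1, len(row) - 1)]
        blur = (left + px + right) / 3.0   # local average
        out.append(px + amount * (px - blur))  # emphasize deviation
    return out
```

Flat regions pass through unchanged, while pixels on either side of an edge get pushed apart, making the edge read as crisper.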

00:02:57.720 --> 00:03:02.920
But like other upsampling methods, it's not perfect, so can we do better?

00:03:02.920 --> 00:03:06.880
It turns out the answer is yes, though we might still be some years away from seeing

00:03:06.880 --> 00:03:13.640
it becoming widely available. Rather than programming a scaler to spot a few specific elements, computer scientists

00:03:13.640 --> 00:03:18.360
have been training artificial intelligences to recognize what more complex objects are

00:03:18.360 --> 00:03:26.560
supposed to look like. Accurately scaling an HD image to 4K or even 8K is a very computationally intensive problem,

00:03:26.560 --> 00:03:31.560
so large amounts of AI training will reduce the reliance on predefined features and allow

00:03:31.560 --> 00:03:36.600
a scaler to recognize anything from whether or not an object is a dog to how to handle

00:03:36.600 --> 00:03:42.520
scenes with complicated lighting. We're already seeing this to some extent with NVIDIA's Deep Learning Super Sampling

00:03:42.520 --> 00:03:47.920
or DLSS, where a supercomputer is fed with lots of frames from different games and figures

00:03:47.920 --> 00:03:52.960
out an algorithm to produce something close to an ideally anti-aliased image.
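The shape of that offline training can be shown with a deliberately tiny toy: fit a correction so that degraded frames come out matching high-quality reference frames. This is not NVIDIA's actual method — DLSS uses deep neural networks — just a two-parameter stand-in (`train_linear_correction` is an invented name) that shows the fit-against-a-reference idea:

```python
def train_linear_correction(lows, targets, steps=2000, lr=0.01):
    """Toy stand-in for DLSS-style offline training: learn a
    per-pixel correction pred = a*low + b by gradient descent on
    squared error, so corrected low-quality pixel values match
    the high-quality reference values."""
    a, b = 1.0, 0.0                      # start as the identity-ish map
    n = len(lows)
    for _ in range(steps):
        ga = gb = 0.0
        for low, t in zip(lows, targets):
            err = (a * low + b) - t      # prediction minus reference
            ga += 2 * err * low / n      # d(loss)/da
            gb += 2 * err / n            # d(loss)/db
        a -= lr * ga
        b -= lr * gb
    return a, b
```

The expensive part (the training loop) happens once, offline; what ships to users is just the cheap learned correction — which is the same economy that lets a supercomputer-trained model run on a home GPU.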

00:03:52.960 --> 00:03:56.920
These algorithms are then pushed out to individual users through software updates.

00:03:56.920 --> 00:04:00.560
Not only does this allow gamers to improve how their games look without lowering frame

00:04:00.560 --> 00:04:05.200
rates, the more efficient post-processing algorithms optimized through AI should hopefully

00:04:05.200 --> 00:04:12.040
make games feel more responsive as well. But remember that if you just suck at games like CSGO because you have straight-up terrible

00:04:12.040 --> 00:04:17.320
reflexes, AI probably won't help you, so you might want to just give turn-based games

00:04:17.320 --> 00:04:22.560
a shot. So thanks for watching, guys. If you liked this video, give it a thumbs up, subscribe, and be sure to hit us up in

00:04:22.560 --> 00:04:28.760
the comment section with your ideas for future videos that we should make, about tech topics

00:04:28.760 --> 00:04:31.640
that you want explained. We'll do it.
