WEBVTT

00:00:00.240 --> 00:00:07.359
even though arguments between console and pc gamers still happen all the time

00:00:04.880 --> 00:00:12.960
consoles are more pc-like than ever before the xbox series x and series s

00:00:10.559 --> 00:00:16.640
are based on AMD's zen 2 architecture and you can even buy off-the-shelf ssds

00:00:15.360 --> 00:00:21.840
to expand the storage of your playstation 5. but if consoles are so

00:00:19.359 --> 00:00:26.080
similar to pcs now why the heck do ports of console games still have all sorts of

00:00:24.640 --> 00:00:30.240
performance issues when you try to run them on a regular desktop to find out we

00:00:28.960 --> 00:00:36.559
reached out to our friends over at digital foundry and we'd like to thank alex battaglia and richard leadbetter for

00:00:34.480 --> 00:00:39.280
their assistance with this deeper dive as well as some of the footage you're

00:00:37.920 --> 00:00:43.120
about to see one issue might be familiar to you if

00:00:41.120 --> 00:00:48.800
you played horizon zero dawn at launch or final fantasy 7 remake problems

00:00:46.079 --> 00:00:52.800
with how pcs handle the game's shaders a shader is the code that handles how to

00:00:50.719 --> 00:00:57.440
illuminate and color each pixel on the screen on a console the shader is just a

00:00:55.600 --> 00:01:02.559
file that the console loads when the game starts up and because each model of

00:01:00.000 --> 00:01:07.040
console has exactly the same hardware in each unit programmers know how to write

00:01:05.040 --> 00:01:11.280
the shader for that specific hardware configuration

00:01:08.400 --> 00:01:16.640
but because pcs can have nearly endless combinations of processors memory

00:01:13.600 --> 00:01:19.439
graphics cards etc the shaders in pc

00:01:16.640 --> 00:01:24.159
ports are instead recompiled as you play the game so each time you encounter a

00:01:21.680 --> 00:01:28.479
new area or object there's a good chance that the game will stutter while the

00:01:25.520 --> 00:01:33.040
shader recompiles and renders the scene but the good news is that once a shader

00:01:30.479 --> 00:01:37.439
gets recompiled it is saved to your SSD or to your hard drive so your CPU won't

00:01:34.960 --> 00:01:41.360
have to do that extra work ever again however there are other problems that

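the compile-once-then-cache behavior described above can be sketched in a few lines of python — this is a hand-rolled illustration, not how any real driver or engine names things; compile_shader, load_shader, and the cache layout are all hypothetical:

```python
import hashlib
import tempfile
from pathlib import Path

def compile_shader(source):
    """Stand-in for the expensive driver compile step that causes the stutter."""
    return b"BYTECODE:" + source.encode()

def load_shader(source, cache_dir, stats):
    """Compile a shader the first time it is encountered, then reuse the disk copy."""
    key = hashlib.sha256(source.encode()).hexdigest()
    cached = cache_dir / (key + ".bin")
    if cached.exists():                  # cache hit: no recompile, no stutter
        stats["hits"] += 1
        return cached.read_bytes()
    stats["misses"] += 1                 # first encounter: pay the compile cost once
    bytecode = compile_shader(source)
    cached.write_bytes(bytecode)         # persist so later runs skip this work
    return bytecode
```

the first request pays the compile cost and writes the result to disk; every later request for the same source is a cheap read, which is why the stutter from any given shader only happens once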
00:01:39.200 --> 00:01:46.159
can appear over and over again because of differences in how memory works on a

00:01:43.520 --> 00:01:50.320
console versus on a pc modern gaming pcs tend to have more

00:01:48.159 --> 00:01:57.439
memory than consoles it's not uncommon to find 16 gigs of system RAM plus 8 or

00:01:54.240 --> 00:01:59.840
more gigs of vram in a gaming computer

00:01:57.439 --> 00:02:04.960
but modern consoles have a shared pool of typically between 10 and 16 gigs of

00:02:02.719 --> 00:02:09.200
memory in total meaning that games written for consoles often work by

00:02:06.719 --> 00:02:13.040
constantly streaming the necessary data into RAM since there just isn't as much

00:02:11.599 --> 00:02:16.640
of it to work with now pcs could give you better

00:02:14.720 --> 00:02:21.200
performance simply by loading in more assets ahead of time thanks to their

00:02:18.319 --> 00:02:25.760
extra RAM but games have to be recoded to take advantage of this and that would

00:02:23.120 --> 00:02:29.920
be a lot of extra work that publisher timetables may not allow for

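the streaming tradeoff above can be sketched with a toy memory pool — the budget sizes, asset names, and the least-recently-used eviction policy here are illustrative assumptions, not how any particular console or engine actually manages residency:

```python
from collections import OrderedDict

class StreamingPool:
    """Console-style asset residency: a small shared pool that evicts the
    least-recently-used asset when a new one has to be streamed in."""
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.resident = OrderedDict()   # asset name -> size in MB
        self.disk_reads = 0

    def request(self, name, size_mb):
        if name in self.resident:       # already in memory: free
            self.resident.move_to_end(name)
            return
        self.disk_reads += 1            # must stream from disk mid-game
        while self.resident and sum(self.resident.values()) + size_mb > self.budget:
            self.resident.popitem(last=False)  # evict the oldest asset
        self.resident[name] = size_mb
```

with a console-sized budget smaller than the working set, every pass over the level re-streams from disk; give the same loop a larger pc-sized budget and the second pass is all hits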
00:02:28.160 --> 00:02:33.760
and while this might not create graphical glitches it does mean that

00:02:32.080 --> 00:02:38.400
you're leaving performance on the table even if your GPU is more powerful than

00:02:36.080 --> 00:02:42.160
what you would find in a console god of war is a great example of a title where

00:02:40.480 --> 00:02:46.239
this limitation is visible with the game becoming dramatically slower at

00:02:44.000 --> 00:02:51.120
dedicated loading portals and the fact that pcs have separate system RAM and

00:02:49.040 --> 00:02:55.360
video memory also means that data like textures and geometry information is

00:02:53.599 --> 00:03:00.400
getting shuttled between those two RAM pools quite often which takes time

00:02:58.080 --> 00:03:04.720
generally speaking the CPU has to handle this information including decompressing

00:03:02.959 --> 00:03:09.519
all this visual data consoles on the other hand have just one

00:03:07.040 --> 00:03:13.440
pool of memory where the data goes and they usually have a dedicated

00:03:11.120 --> 00:03:16.400
decompression chip that just handles this visual data lessening the load on

00:03:15.200 --> 00:03:21.360
the CPU the fact that pc cpus have to do this

00:03:18.640 --> 00:03:25.840
extra legwork can you guessed it cost you performance though hopefully the new

00:03:23.680 --> 00:03:30.480
directstorage API for Windows will help alleviate this burden since it offloads

00:03:27.840 --> 00:03:34.400
decompression work to the GPU which wouldn't tax your performance very much

00:03:32.319 --> 00:03:37.680
due to the way that gpus are designed we're just waiting for developers to

00:03:35.840 --> 00:03:41.040
take advantage of this feature in newer titles we'll tell you more right after

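the CPU-side decompression cost being described can be illustrated with a toy run-length codec — real games use far more sophisticated formats, and this sketch only shows the shape of the per-asset work that a dedicated console chip, or GPU offload in the directstorage model, takes off the CPU:

```python
def rle_compress(data):
    """Toy run-length encoding standing in for a real asset codec."""
    runs = []
    for b in data:
        if runs and runs[-1][0] == b and runs[-1][1] < 255:
            runs[-1] = (b, runs[-1][1] + 1)   # extend the current run
        else:
            runs.append((b, 1))               # start a new run
    return runs

def rle_decompress(runs):
    # this kind of byte-expanding loop runs on the CPU for every asset load
    # in the traditional pc path; offloading it frees those CPU cycles
    return b"".join(bytes([b]) * n for b, n in runs)
```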
00:03:39.440 --> 00:03:45.440
we thank ifixit for sponsoring this video all month long ifixit is

00:03:43.200 --> 00:03:50.560
challenging our youtube community to fix or make something instead of buying new

00:03:48.080 --> 00:03:55.040
for fixit february ifixit is giving away a pro tech toolkit to one lucky fixer

00:03:52.959 --> 00:04:00.319
every week all you have to do is share photos or videos of your repairs with

00:03:57.120 --> 00:04:02.159
the hashtag fixitfeb for a chance to win

00:04:00.319 --> 00:04:06.640
so save yourself money by fixing your tech while entering for a chance to win

00:04:04.319 --> 00:04:10.239
with fixit feb get your ifixit kits today using the link in the video

00:04:08.319 --> 00:04:16.560
description now it isn't uncommon anymore to see pc processors with 12 16

00:04:13.519 --> 00:04:18.720
or even more threads but console games

00:04:16.560 --> 00:04:23.520
are written with fewer threads in mind even though the new xbox series x has 16

00:04:21.519 --> 00:04:29.440
threads the previous generation of consoles which many games are still

00:04:26.320 --> 00:04:31.520
being made with in mind didn't have

00:04:29.440 --> 00:04:35.759
nearly that many and because it would be a massive amount of effort to re-code

00:04:33.840 --> 00:04:40.400
these games to scale to higher numbers of threads most developers don't bother

00:04:38.560 --> 00:04:44.560
and the gameplay experience suffers as a result far cry 6 is a good example of a

00:04:43.040 --> 00:04:49.440
recent game that suffers from this problem although it can definitely take

00:04:46.560 --> 00:04:53.520
advantage of high-end modern gpus it can still stutter even if you're getting a high

00:04:51.199 --> 00:04:58.000
frame rate because the CPU is trying to load assets and it can't use all of its

00:04:56.400 --> 00:05:02.479
threads to do so because developers have been so used to

00:05:00.080 --> 00:05:06.479
coding mostly for a single thread until about the mid-2010s

00:05:04.160 --> 00:05:09.360
this paradigm shift is taking a long time

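the thread-count mismatch can be sketched with a fixed-size worker pool — the decode function and pool sizes are hypothetical stand-ins for real asset work:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

def load_assets(assets, worker_threads):
    """Decode assets on a fixed-size pool. Ports tuned for console-era CPUs
    often hard-code a small worker count, leaving extra pc cores idle."""
    def decode(asset):
        # stand-in for real texture/geometry decode work; records which
        # worker thread handled the asset
        return (threading.current_thread().name, asset)
    with ThreadPoolExecutor(max_workers=worker_threads) as pool:
        return list(pool.map(decode, assets))   # map preserves input order
```

no matter how many cores the pc has, a pool hard-coded for console-era thread counts never uses more than its configured number of workers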
00:05:07.520 --> 00:05:13.520
now how games use or don't use multi-threading leads us into our last

00:05:10.960 --> 00:05:18.080
topic the directx API on the surface you might think that the

00:05:15.199 --> 00:05:22.320
fact that both the xbox and Windows use directx might make it easier to port

00:05:20.080 --> 00:05:28.160
games between them but the problem is that both directx 11 and directx 12 are

00:05:25.759 --> 00:05:31.199
widely used in game development and they are

00:05:29.039 --> 00:05:36.080
very different from each other under the hood directx 12 gives developers more

00:05:34.080 --> 00:05:40.560
granular control over their games and it's become especially popular with xbox

00:05:38.160 --> 00:05:45.039
titles because as we mentioned before everybody already knows exactly what's

00:05:42.960 --> 00:05:48.880
inside of an xbox so programming specifically for that fixed set of

00:05:46.880 --> 00:05:54.160
hardware isn't too difficult but when those titles get ported over to pc

00:05:51.360 --> 00:05:57.840
many games instead use directx 11 like assassin's creed odyssey for example

00:05:55.840 --> 00:06:02.240
directx 11 doesn't provide as much granular control and instead takes care

00:05:59.840 --> 00:06:06.639
of more tasks automagically that's important because pcs can have such a

00:06:04.080 --> 00:06:11.600
wide range of hardware but it isn't always a good thing more specifically

00:06:08.960 --> 00:06:15.680
directx 11 by default uses only one thread for rendering meaning that games

00:06:13.520 --> 00:06:21.360
that get a directx 11 port released for pc often have poorer CPU performance

00:06:18.800 --> 00:06:25.840
especially on AMD with its directx 11 driver being slower on the CPU

00:06:23.759 --> 00:06:28.960
using directx 12 for pc ports could solve this issue but again because of

00:06:28.000 --> 00:06:32.880
all the different hardware configurations possible on a pc that

00:06:31.039 --> 00:06:37.039
would necessitate a great deal of coding maintenance from the developer

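the rendering-thread difference can be sketched as follows — this only simulates the idea of directx 12 style per-thread command lists versus a single directx 11 style submission thread, with made-up names and no real graphics API involved:

```python
import threading

def record_commands_single(draw_batches):
    # directx 11 style default: one thread records every draw command
    return [f"draw:{obj}" for batch in draw_batches for obj in batch]

def record_commands_parallel(draw_batches):
    """directx 12 style sketch: each worker records its own command list,
    then the lists are submitted to the queue in a fixed order."""
    lists = [None] * len(draw_batches)
    def record(i, batch):
        lists[i] = [f"draw:{obj}" for obj in batch]   # per-thread command list
    threads = [threading.Thread(target=record, args=(i, b))
               for i, b in enumerate(draw_batches)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return [cmd for cl in lists for cmd in cl]        # ordered submission
```

both paths produce the same ordered command stream; the directx 12 style version just lets the recording work spread across cores instead of bottlenecking on one thread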
00:06:34.639 --> 00:06:41.840
directx 11 also does not support a feature called asynchronous compute this

00:06:39.680 --> 00:06:47.039
feature allows one part of the GPU to render a scene's geometry while another

00:06:44.160 --> 00:06:50.960
takes care of compute tasks like physics or ambient occlusion

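the benefit of asynchronous compute can be reduced to a bit of idealized arithmetic — assuming perfect overlap and no contention for shared GPU resources, which real workloads never quite achieve:

```python
def frame_time_serial(geometry_ms, compute_ms):
    # without async compute the two workloads run back to back
    return geometry_ms + compute_ms

def frame_time_async(geometry_ms, compute_ms):
    # with async compute they overlap, so the frame only costs the longer one
    # (ideal case: separate GPU units, no resource contention)
    return max(geometry_ms, compute_ms)
```

for example a 9 ms geometry pass plus a 4 ms compute pass costs 13 ms back to back but only 9 ms overlapped, roughly 77 fps versus 111 fps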
00:06:48.880 --> 00:06:54.960
consoles rely heavily on asynchronous compute to increase performance but

00:06:52.880 --> 00:06:59.759
since so many pc ports are written with directx 11 the result is that many games

00:06:57.680 --> 00:07:04.560
released in the mid-2010s have lower than expected performance especially on

00:07:02.240 --> 00:07:08.880
AMD gpus which historically tend to be designed more with compute tasks in mind

00:07:07.440 --> 00:07:14.400
compared to their counterparts from NVIDIA in fact there were a number of

00:07:11.039 --> 00:07:16.960
games where a gtx 1060 and an rx 580

00:07:14.400 --> 00:07:21.840
performed similarly under directx 11 even though the rx 580 from AMD is a

00:07:19.199 --> 00:07:25.599
more powerful GPU on paper the lack of asynchronous compute is probably playing

00:07:23.680 --> 00:07:29.680
a big role there so yeah there's a lot that prevents our

00:07:28.000 --> 00:07:34.639
pc ports from playing the way that we expect but hopefully as time goes on developers

00:07:32.960 --> 00:07:38.880
will be able to take advantage of new tools to make coding for both platforms

00:07:37.120 --> 00:07:43.360
far less tedious i would say to just try and exercise

00:07:40.880 --> 00:07:48.960
some patience until then but i do think we used up a lot of that on cyberpunk

00:07:45.440 --> 00:07:50.319
2077 and the GPU shortage and this darn

00:07:48.960 --> 00:07:53.319
pandemic subscribe
