1
00:00:00,000 --> 00:00:04,480
The art of deception is as old as time itself and covers everything from counterintelligence

2
00:00:04,480 --> 00:00:10,000
to Halloween masks. But one incredibly convincing method of subterfuge that's rapidly becoming

3
00:00:10,000 --> 00:00:16,880
more common is the deep fake. The idea behind a deep fake is that it creates images that strongly

4
00:00:16,880 --> 00:00:22,800
resemble a certain subject, usually a person, without actually being them. But I'm not talking about

5
00:00:22,800 --> 00:00:28,320
some kind of cheap, obvious Photoshop job that involves cut-and-pasting your head onto a weightlifter

6
00:00:28,320 --> 00:00:34,400
friend's vacation photos. Instead, deep fakes can be nearly indistinguishable from the real thing

7
00:00:34,400 --> 00:00:39,920
to the average human eye thanks to the use of artificial intelligence and machine learning.

8
00:00:39,920 --> 00:00:46,080
So here's how it works. Programs that generate deep fakes use not one but two different AIs

9
00:00:46,080 --> 00:00:52,560
working together. The first AI will scan many images of the subject to be faked and then create

10
00:00:52,560 --> 00:00:59,680
new, faked images. The second AI will then examine these fakes and compare them to real images,

11
00:00:59,680 --> 00:01:04,960
and if the differences are too stark, the second AI will mark the image as an obvious fake and

12
00:01:04,960 --> 00:01:11,280
tell the first AI. So the first AI takes this information and continually adjusts the fake

13
00:01:11,280 --> 00:01:16,960
images until the second AI's error rate hits a certain target. That is to say, until the second

14
00:01:16,960 --> 00:01:22,880
AI can't tell a fake from the real thing anymore. This system is called a generative adversarial

15
00:01:22,880 --> 00:01:28,240
network and although the idea behind it is fairly simple, advances in processing power have made

16
00:01:28,240 --> 00:01:34,560
it a very powerful tool for producing convincing-looking fakes. And because it's a fairly general

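The two-AI loop just described can be sketched in a few lines of Python. To be clear, this is a toy stand-in rather than a real GAN: instead of images, the "subject" is just a target number, and REAL_MEAN, TOLERANCE, and the nudge size are made-up values purely for illustration.

```python
import random

# Toy sketch of the adversarial loop: the generator (first AI)
# produces fakes, the discriminator (second AI) flags anything
# that doesn't look like the real thing, and the generator uses
# that feedback to adjust until its fakes pass as real.

random.seed(42)

REAL_MEAN = 5.0   # what genuine samples look like (the "subject")
TOLERANCE = 0.5   # discriminator: this close to real passes as real

def discriminator(sample):
    """Second AI: return True if the sample passes as real."""
    return abs(sample - REAL_MEAN) <= TOLERANCE

def generator(mean):
    """First AI: produce a fake sample around its current best guess."""
    return mean + random.gauss(0, 0.2)

guess = 0.0       # the generator starts out producing obvious fakes
for step in range(2000):
    fake = generator(guess)
    if discriminator(fake):
        break     # the discriminator can no longer tell it's fake
    # feedback from the discriminator: nudge the guess toward "real"
    guess += 0.1 if fake < REAL_MEAN else -0.1

print(round(guess, 1))
```

In an actual generative adversarial network, both sides are neural networks trained by gradient descent on images rather than numbers, but the feedback loop has the same shape: generate, judge, adjust, repeat until the judge is fooled.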
17
00:01:34,560 --> 00:01:41,440
purpose technology, it has lots of cool applications. For example, an AI network like this can analyze

18
00:01:41,440 --> 00:01:47,840
lots of people with a similar look to create a fake model for advertising purposes. It's much

19
00:01:47,840 --> 00:01:52,240
cheaper than hiring a real model and paying them for a photo shoot, much to the chagrin, of course, of

20
00:01:52,240 --> 00:01:57,360
attractive yet unemployed people everywhere. Deep fakes could also help with age progression photos

21
00:01:57,360 --> 00:02:02,640
used by the police to help find missing people, or even to upscale old-school video games more

22
00:02:02,640 --> 00:02:08,240
realistically so they look nice on modern displays. But if you've seen stories about deep fakes in the

23
00:02:08,240 --> 00:02:13,280
news, you probably know that there's a much darker side to the technology. Unsurprisingly,

24
00:02:13,280 --> 00:02:18,400
the majority of the deep fake videos on the internet right now are pornographic in nature,

25
00:02:18,400 --> 00:02:23,040
with the idea often being to swap in a celebrity's face without their consent,

26
00:02:23,040 --> 00:02:27,760
resulting in several well-known actresses falling victim to the practice. And because

27
00:02:27,760 --> 00:02:32,480
deep fake technology has gotten good enough to create fake videos that make it appear as though

28
00:02:32,480 --> 00:02:37,120
people have been caught on camera saying things they haven't (seriously, it can accurately match

29
00:02:37,120 --> 00:02:43,120
up lip movements to convincingly fake speech), there's a huge concern that it could be used

30
00:02:43,120 --> 00:02:48,560
to spread fake videos of politicians saying incendiary or inflammatory things in order to

31
00:02:48,560 --> 00:02:52,960
push a certain agenda. Of course, this doesn't mean that everyone out there will be fooled,

32
00:02:52,960 --> 00:02:58,560
but with how realistic deep fakes can be and how polarized the political climate is in many parts

33
00:02:58,560 --> 00:03:04,880
of the world, political deep fakes could easily convince enough people. Recently, a video surfaced

34
00:03:04,880 --> 00:03:10,400
of Nancy Pelosi giving a speech while slurring her words, apparently drunk. The video wasn't even

35
00:03:10,400 --> 00:03:15,920
a deep fake; it was merely slowed down, but it still racked up millions of views before finally

36
00:03:15,920 --> 00:03:21,680
being exposed as fake. And with accusations of political manipulation becoming more common these

37
00:03:21,680 --> 00:03:26,640
days, the rise of deep fake technology has led some observers to believe that it will have a

38
00:03:26,640 --> 00:03:32,960
tangible effect on political discourse and even election results. Although it remains to be seen

39
00:03:33,040 --> 00:03:38,640
if these fears will actually come to pass, there are a number of cheap, easily accessible tools

40
00:03:38,640 --> 00:03:43,600
that allow people to create deep fakes without a ton of technical skill. So if you want to tinker

41
00:03:43,600 --> 00:03:49,600
with them yourself, I'd say just go ahead, but stay away from the politics and stick to swapping

42
00:03:49,600 --> 00:03:54,240
Linus's face into your friend's wedding video. So thanks for watching, guys. Like, dislike,

43
00:03:54,240 --> 00:03:58,080
check out our other videos, leave a comment if you have a suggestion for a future Fast As Possible,

44
00:03:58,080 --> 00:04:03,360
and don't forget to subscribe and follow. And that's not me being

45
00:04:03,360 --> 00:04:06,560
deep-faked into saying that. I really think you should subscribe.
