1
00:00:00,000 --> 00:00:03,360
What if I told you you could defeat incredibly high-tech

2
00:00:03,360 --> 00:00:06,400
surveillance with a decidedly low-tech tool?

3
00:00:06,400 --> 00:00:11,400
And no, I'm not talking about spray-painting a security camera; I'm talking about simple clothing

4
00:00:11,400 --> 00:00:15,000
that can throw off advanced AI algorithms and facial recognition.

5
00:00:15,000 --> 00:00:20,360
It turns out that certain algorithms can essentially be reverse-engineered in a way that fools them

6
00:00:20,360 --> 00:00:25,600
even if your face is in plain view. A few years ago, researchers at a university in Belgium

7
00:00:25,600 --> 00:00:28,680
took a closer look at a common image recognition algorithm

8
00:00:28,720 --> 00:00:32,040
called YOLOv2 (yes, that's literally the name),

9
00:00:32,040 --> 00:00:36,960
and worked out how to produce images that the algorithm either wouldn't recognize as human

10
00:00:36,960 --> 00:00:40,040
or wouldn't even recognize as any kind of object.

11
00:00:40,040 --> 00:00:46,000
The resulting images, called adversarial patterns, often resemble brightly colored abstract art

12
00:00:46,000 --> 00:00:49,840
that the algorithm can't make heads or tails of. But can any of us?

13
00:00:49,840 --> 00:00:53,140
Not only does this lead the algorithm to think there isn't a person in the image at all,

14
00:00:53,140 --> 00:00:57,120
but the patterns also make for some pretty cool-looking clothing that there's apparently

15
00:00:57,120 --> 00:01:00,540
already a market for. Some of the patterns effectively make the wearer

16
00:01:00,540 --> 00:01:03,820
invisible to AI, while others make the computer think

17
00:01:03,820 --> 00:01:08,340
that it's actually looking at something else, like a dog. Of course, the method isn't foolproof,

18
00:01:08,340 --> 00:01:11,620
as a more sophisticated algorithm could be released at any time

19
00:01:11,620 --> 00:01:16,580
that would still recognize what it's looking at, and it won't do much if an actual person

20
00:01:16,580 --> 00:01:22,060
is sitting on the other side of the security camera either. But why bother trying to be invisible to AI

21
00:01:22,060 --> 00:01:27,140
when you could just spam it with noise? We're starting to see clothing made to resemble

22
00:01:27,140 --> 00:01:30,300
other objects that automated cameras commonly look for,

23
00:01:30,300 --> 00:01:33,360
such as this dress that's covered with license plates,

24
00:01:33,360 --> 00:01:38,440
or this scarf with photorealistic faces. This way, you don't have to beat the algorithm,

25
00:01:38,440 --> 00:01:43,140
you just feed it with junk data, and you're also making a rather bold fashion statement.

26
00:01:43,140 --> 00:01:46,380
Of course, if you're serious about remaining hidden, you could try this approach,

27
00:01:46,380 --> 00:01:52,060
which uses these masks that have a lens-like effect to trick cameras, but keep you recognizable

28
00:01:52,060 --> 00:01:57,820
to fellow humans, or even this prototype that projects different faces onto your own.

29
00:01:57,820 --> 00:02:02,380
It's like A Scanner Darkly, assuming that you can be bothered to wear something like that around town.

30
00:02:02,380 --> 00:02:05,940
But regardless of what method anyone uses to fool an AI camera system,

31
00:02:05,940 --> 00:02:09,340
exactly how widespread are these surveillance technologies

32
00:02:09,340 --> 00:02:13,340
that researchers and designers have felt the need to find ways to beat?

33
00:02:13,340 --> 00:02:17,020
Actually, more widespread than you might think. There have been plenty of headlines

34
00:02:17,020 --> 00:02:21,900
about how the authorities in mainland China have used smart cameras to track and identify citizens,

35
00:02:21,900 --> 00:02:26,500
but dozens of other countries are using similar technologies for everything from cracking down on dissidents

36
00:02:26,500 --> 00:02:31,940
to more benign pursuits like traffic management in smart cities and crowd security at sporting events.

37
00:02:31,940 --> 00:02:37,660
And these networks largely cannot function if they can't even detect objects in the first place.

38
00:02:37,660 --> 00:02:42,340
Regardless of why you're being spied on, it's understandable if you want to try and opt out

39
00:02:42,340 --> 00:02:46,500
while still being able to walk around a city center. But fair warning, even clothing

40
00:02:46,500 --> 00:02:49,900
that utilizes adversarial patterns doesn't work all the time,

41
00:02:49,900 --> 00:02:53,940
even against the image recognition algorithms it was designed for. Like if maybe you're just standing

42
00:02:53,940 --> 00:02:56,940
kind of too sideways or something, you know?

43
00:02:56,940 --> 00:03:00,580
Suddenly setting up my own society in the woods sounds pretty appealing.

44
00:03:00,580 --> 00:03:04,300
Thanks for watching, guys! If you liked this video, hit like, hit subscribe, and hit us up in the comments section

45
00:03:04,300 --> 00:03:07,300
with your suggestions for topics that we should cover in the future.
