1
00:00:00,000 --> 00:00:03,920
You know, we don't have to do this. Give you the tech news.

2
00:00:03,920 --> 00:00:08,800
You know, like we're choosing to do this. I could stop any time.

3
00:00:08,800 --> 00:00:12,400
Shut up, you're not my dad. YouTube has quietly added the ability

4
00:00:12,400 --> 00:00:17,200
to request a takedown of AI-generated or other synthetic content that looks

5
00:00:17,200 --> 00:00:22,400
or even sounds like you, which means I can finally do something about this video.

6
00:00:22,400 --> 00:00:25,560
Oh, you're in big trouble, wise guy.

7
00:00:25,560 --> 00:00:28,560
This differs from YouTube's approach to deep fakes,

8
00:00:28,560 --> 00:00:32,840
which count as misleading information and can be reported by anybody.

9
00:00:32,840 --> 00:00:37,800
Requesting takedowns of synthetic impersonation can only be done by the affected individual

10
00:00:37,800 --> 00:00:41,080
or their guardian, is subject to a number of criteria,

11
00:00:41,080 --> 00:00:44,960
and isn't guaranteed to result in an actual takedown.

12
00:00:44,960 --> 00:00:49,400
Just like all those times I attacked my older brother after learning about the leg sweep,

13
00:00:49,400 --> 00:00:52,920
success rate: zero. I'm bigger than my older brother.

14
00:00:52,920 --> 00:00:57,520
One thing that will factor into YouTube's decision is whether the creator has marked their video

15
00:00:57,520 --> 00:01:01,800
as containing AI-generated content, something it started requiring creators

16
00:01:01,800 --> 00:01:06,560
and advertisers to do last year. In another surprisingly useful new feature,

17
00:01:06,560 --> 00:01:12,920
the platform has begun a broader rollout of an AI-powered "erase song" feature, previously available

18
00:01:12,920 --> 00:01:16,000
in beta that will allow creators to remove songs

19
00:01:16,000 --> 00:01:20,840
from a video that has a copyright claim on it, while keeping other audio like dialogue

20
00:01:20,840 --> 00:01:25,440
or sound effects intact. This will let YouTube partners keep the video up

21
00:01:25,440 --> 00:01:29,600
and continue making ad revenue. If you're not in the partner program

22
00:01:29,600 --> 00:01:34,000
and your video has over 100,000 views, according to YouTube's support page,

23
00:01:34,000 --> 00:01:39,080
you may not be able to save changes. They'll let you try though, you know.

24
00:01:39,080 --> 00:01:43,600
It's fun for YouTube to play with you regular people.

25
00:01:43,600 --> 00:01:47,560
As the generative AI hype bubble continues to stretch,

26
00:01:47,560 --> 00:01:51,120
more scrutiny is being leveled at prominent AI companies.

27
00:01:51,120 --> 00:01:54,160
This week, developer Pedro Jose Pereira discovered

28
00:01:54,200 --> 00:01:59,840
that the macOS ChatGPT app stored all of its conversations in plain text

29
00:01:59,840 --> 00:02:05,040
in an easily accessible location. This snapped OpenAI out of staring out the window,

30
00:02:05,040 --> 00:02:11,480
dreaming about omniscient AI waifus long enough to spit out an update that encrypted the conversations.

31
00:02:11,480 --> 00:02:17,080
But then the New York Times reported on an undisclosed breach from April 2023,

32
00:02:17,080 --> 00:02:21,880
in which a hacker stole sensitive info from OpenAI's internal employee forum,

33
00:02:21,880 --> 00:02:26,800
because, like, why worry about cybersecurity when AI is gonna add us all to the Worldcoin blockchain

34
00:02:26,800 --> 00:02:31,080
in a couple years. Concern is also growing about the web scrapers

35
00:02:31,080 --> 00:02:34,760
used to gather AI training data, which have been worming their way

36
00:02:34,760 --> 00:02:40,880
around existing tools meant to block them. Cloudflare just released a new free one-click tool

37
00:02:40,880 --> 00:02:47,360
that promises to block all AI bots, and Glaze, a tool that poisons artists' images

38
00:02:47,360 --> 00:02:52,560
so they can't be properly trained on by AI models, is exploding in popularity.

39
00:02:52,560 --> 00:02:58,880
Yep, the arms race is on. In a blog post detailing the most scraped websites of 2024,

40
00:02:58,880 --> 00:03:05,060
data scraping company Smartproxy advertises advanced APIs to scrape data

41
00:03:05,060 --> 00:03:08,120
even from the most protected online targets,

42
00:03:08,120 --> 00:03:12,240
AKA real human beings trying to bring something other

43
00:03:12,240 --> 00:03:15,440
than cursed AI slop into our world.

44
00:03:15,440 --> 00:03:18,520
That's actually our painting now, fun fact.

45
00:03:18,520 --> 00:03:24,080
In an ironic twist, a new research paper detailing the myriad ways in which generative AI tools

46
00:03:24,080 --> 00:03:27,480
have only just begun to, quote, distort collective understanding

47
00:03:27,480 --> 00:03:30,680
of sociopolitical reality or scientific consensus,

48
00:03:30,680 --> 00:03:36,560
was written by researchers at Google who just wanna find the guy responsible for all this.

49
00:03:36,560 --> 00:03:40,560
Look, we're all trying to find the guy who did this and give him a spanking.

50
00:03:40,560 --> 00:03:46,520
Apple has approved the Epic Games Store as an alternative app marketplace on iOS and iPadOS

51
00:03:46,520 --> 00:03:49,560
in the EU after Epic Games complained,

52
00:03:49,560 --> 00:03:52,640
both on Twitter and directly to the European Commission,

53
00:03:52,640 --> 00:03:56,320
about Apple rejecting their submission twice.

54
00:03:56,320 --> 00:03:59,440
These two companies, man, they're like kids in a preschool.

55
00:03:59,440 --> 00:04:04,320
Tim Cook's trying to build his perfect iPhone out of wooden blocks and Tim Sweeney always crashes in

56
00:04:04,320 --> 00:04:07,800
like, teacher, teacher, I wanna put my blocks there.

57
00:04:07,800 --> 00:04:11,720
Build your own iPhone, you quarrelsome rodent.

58
00:04:11,720 --> 00:04:16,760
Apple was rejecting Epic submissions over some of their buttons looking too similar

59
00:04:16,760 --> 00:04:19,840
to Apple's buttons, which is an actual restriction

60
00:04:19,840 --> 00:04:25,120
as listed in an Apple document entitled Alternative Terms Addendum for Apps in the EU.

61
00:04:25,120 --> 00:04:29,960
"Alternative app marketplaces and their apps must not infringe Apple's intellectual property

62
00:04:29,960 --> 00:04:34,320
or appear confusingly similar to the App Store or an Apple product, service, interface,

63
00:04:34,320 --> 00:04:40,280
computer software application, or advertising theme." Yet Apple eventually gave in, making me wonder

64
00:04:40,280 --> 00:04:44,520
how many other roadblocks I could make disappear by snitching to Europe.

65
00:04:44,520 --> 00:04:49,360
Sure, you can write me a speeding ticket, officer, but Europe is gonna hear about this,

66
00:04:49,360 --> 00:04:55,560
and Europe is not gonna be happy. So I'm gonna do the Quick Bits now.

67
00:04:55,560 --> 00:04:59,640
It's not that I have to. I want to do them.

68
00:04:59,640 --> 00:05:04,720
Twilio, the parent company of two-factor authentication app, Authy, has confirmed

69
00:05:04,720 --> 00:05:09,840
that they suffered a data breach following the leak of 33 million phone numbers.

70
00:05:09,840 --> 00:05:14,720
Twilio states that it hasn't found evidence that the hackers were able to access its systems

71
00:05:14,720 --> 00:05:19,560
or any sensitive information, like Colonel Sanders' 11 secret herbs and spices,

72
00:05:19,560 --> 00:05:22,880
or how Lee Harvey Oswald faked the moon landing.

73
00:05:22,880 --> 00:05:26,120
Info they have, apparently. Don't think you're safe though

74
00:05:26,120 --> 00:05:32,040
because Cybernews researchers have also discovered an unrelated compilation of nearly 10 billion

75
00:05:32,040 --> 00:05:37,760
unique passwords leaked on a popular hacking forum by a user called Obamacare.

76
00:05:37,760 --> 00:05:41,880
Why would the Affordable Care Act do such a thing? Thanks, Obama.

77
00:05:41,880 --> 00:05:45,640
Ransomware group Brain Cipher has apologized

78
00:05:45,640 --> 00:05:51,160
to the citizens of Indonesia after hacking the country's temporary national data center

79
00:05:51,160 --> 00:05:54,840
and encrypting several thousand terabytes of information.

80
00:05:54,840 --> 00:06:00,480
After the attack massively disrupted public services, it was discovered that the affected government agencies

81
00:06:00,480 --> 00:06:05,160
didn't have mandatory backup policies. Nevertheless, the government didn't pay

82
00:06:05,160 --> 00:06:08,860
the $8 million ransom Brain Cipher was asking for,

83
00:06:08,860 --> 00:06:12,880
so the group released the encryption key of their own accord.

84
00:06:12,880 --> 00:06:16,560
Though they did ask the citizens for donations.

85
00:06:16,560 --> 00:06:21,720
They can't afford to hack for free. I mean, in this economy? Toyota is on an audacious mission

86
00:06:21,720 --> 00:06:25,720
to normalize hydrogen-powered machines that don't blow up,

87
00:06:25,720 --> 00:06:29,040
including this, the world's first hydrogen-powered

88
00:06:29,040 --> 00:06:32,760
stone oven and a hydrogen-powered barbecue

89
00:06:32,760 --> 00:06:36,800
made in collaboration with Japanese appliance company, Rinnai.

90
00:06:36,800 --> 00:06:39,980
I don't know if this guy is smiling because the oven's so great at cooking pizza

91
00:06:39,980 --> 00:06:45,100
or if that's just what he does when he's anxiously thinking about causing another Hindenburg, Italian style.

92
00:06:45,100 --> 00:06:48,940
Does car-making expertise apply to hydrogen-stone ovens?

93
00:06:48,940 --> 00:06:55,600
Toyota asks. I certainly hope so. This isn't even Toyota's first go at merging cars and pizza.

94
00:06:55,600 --> 00:06:58,860
They put a pizza oven in a Tundra pickup truck in 2018

95
00:06:58,860 --> 00:07:02,940
and in the back of a Lexus GX earlier this year. And why stop there?

96
00:07:02,940 --> 00:07:06,700
I say we put this new oven in a blimp. We've learned so much since then.

97
00:07:06,700 --> 00:07:10,180
It turns out that F1 cars have IP addresses

98
00:07:10,180 --> 00:07:13,980
and you can use that IP to make their engines explode.

99
00:07:13,980 --> 00:07:18,420
Ex-F1 senior systems engineer and apparent Twitter enthusiast,

100
00:07:18,420 --> 00:07:23,420
Dan, aka "engine mode 11," explained that the danger of remote engine explosion

101
00:07:23,420 --> 00:07:29,900
apparently isn't from external manufacturers, but from the internal electronics team itself.

102
00:07:29,900 --> 00:07:35,900
Dan didn't say explicitly, but he strongly implies that one team accidentally flashed

103
00:07:36,060 --> 00:07:39,940
the electronic control unit, the thing that controls the electrical systems in the car,

104
00:07:39,940 --> 00:07:43,900
while the vehicle was running, causing some amount of kaboom.

105
00:07:43,900 --> 00:07:50,740
Yet not a single action movie I can think of, not a single one, has blown up a car by hacking it.

106
00:07:50,740 --> 00:07:55,820
And Amazon has decided to discontinue its Astro for Business security robot

107
00:07:55,820 --> 00:08:02,740
to instead focus on home robotics products, such as its Astro household robot. Very different.

108
00:08:02,740 --> 00:08:06,360
That's where the money is. Everyone is buying robots for their homes these days.

109
00:08:06,360 --> 00:08:11,600
Any businesses that purchased the security robot will receive a refund plus a $300 credit

110
00:08:11,600 --> 00:08:16,520
to their Amazon accounts to help support a replacement solution because the best security guards

111
00:08:16,520 --> 00:08:19,600
prefer to be paid in Amazon gift cards.

112
00:08:19,600 --> 00:08:25,060
You'd think customers could just convert the business Astro into a home version,

113
00:08:25,060 --> 00:08:30,480
but no, he's seen too much. He got kicked by one too many hoodlums

114
00:08:30,480 --> 00:08:34,920
trying to break into the warehouse. But between you and me, you should come back on Monday for more tech news.

115
00:08:34,920 --> 00:08:39,520
I mean, it's educational. And well, I'd just like to keep sharing tech news.

116
00:08:39,520 --> 00:08:43,840
So please support my passion. Okay, thanks, someone else wrote that.

117
00:08:43,840 --> 00:08:44,840
But it's true.
