1
00:00:00,000 --> 00:00:04,800
Oh, well, I'm sorry to say we don't have any tech news today, April Fool!

2
00:00:07,840 --> 00:00:12,000
I'm so, like, I got you so good. The look on your face!

3
00:00:12,960 --> 00:00:16,080
I love this holiday. It's good that we do this. Iran!

4
00:00:18,880 --> 00:00:24,880
Iran has announced plans to attack 18 major tech companies across the Middle East,

5
00:00:24,880 --> 00:00:30,320
either as part of its ongoing military conflict with the US, or the most ambitious April Fool's

6
00:00:30,320 --> 00:00:36,160
joke of all time. The threat was posted to Telegram on Tuesday, naming Apple, Google,

7
00:00:36,160 --> 00:00:42,400
Microsoft, Meta, NVIDIA, Boeing, and GE among the companies it accused of helping the US

8
00:00:42,400 --> 00:00:49,440
and Israeli military carry out strikes on Iran. Wait, GE? Is the US dropping shitty

9
00:00:49,440 --> 00:00:54,240
washer dryers on their enemies now? You're gonna have to run the Ayatollah or whoever's

10
00:00:54,320 --> 00:00:57,760
pants through the wash twice if he doesn't want weird detergent residue on them?

11
00:00:58,560 --> 00:00:59,840
What do you have against GE?

12
00:01:03,120 --> 00:01:07,680
These tech companies have thousands of employees across the Gulf region, who were warned to

13
00:01:07,680 --> 00:01:13,440
evacuate a 1 kilometer radius around all targeted facilities. The attacks were expected to start

14
00:01:13,440 --> 00:01:19,920
at 8pm Tehran time on April 1st, but as of recording, no strikes have been confirmed.

15
00:01:19,920 --> 00:01:26,960
Risk management firm Helix's CEO, James Henderson, told CNBC that since AI is now being used as a

16
00:01:26,960 --> 00:01:32,160
weapon in the theater of war, cloud infrastructure assets like data centers will be pulled into

17
00:01:32,160 --> 00:01:38,720
conflict more and more, and then also GE, for some reason. I mean, their microwaves suck,

18
00:01:38,720 --> 00:01:46,720
but are they, like, jihad bad? The writer who wrote that is Muslim, by the way, so I'm allowed.

19
00:01:50,320 --> 00:01:59,280
Open source software is under siege from AI, and not in the fun Steven Seagal way either.

20
00:01:59,280 --> 00:02:05,360
Two security researchers recently launched an AI tool that can clone entire open source projects

21
00:02:05,360 --> 00:02:12,480
as a tongue-in-cheek provocation to show how easily AI can be used to bypass open source licensing.

22
00:02:12,560 --> 00:02:21,520
The dubiously named MALUS, spelled M-A-L-U-S, recreates software from public documentation alone,

23
00:02:21,520 --> 00:02:27,520
producing code that's functionally identical to the original, but proprietary. The seemingly

24
00:02:27,520 --> 00:02:32,800
satirical service markets itself as targeting companies that don't want to credit the original

25
00:02:32,800 --> 00:02:38,720
developers or share their changes back to the project, promising legally distinct code with

26
00:02:38,720 --> 00:02:44,000
corporate-friendly licensing. It's still unclear whether MALUS is real or not,

27
00:02:44,000 --> 00:02:49,600
but what is definitely real is the broader threat to genuinely beloved open source tools.

28
00:02:49,600 --> 00:02:54,880
Curl was forced to shut down its bug bounty program to stem the massive tide of vibe-coded

29
00:02:54,880 --> 00:03:00,240
slop submissions. GitHub is considering a kill switch to let maintainers disable pull requests

30
00:03:00,240 --> 00:03:06,320
entirely for the same reason, and Tailwind Labs laid off 75% of its engineers after AI tools

31
00:03:06,400 --> 00:03:11,120
tanked their revenue. As for whether MALUS is satire, well, on the one hand,

32
00:03:11,120 --> 00:03:16,800
its blog includes cartoonishly villainous posts thanking maintainers for their unpaid labor,

33
00:03:16,800 --> 00:03:22,240
and testimonials from clients like Chad Stockholder at Profit First LLC.

34
00:03:23,120 --> 00:03:28,720
On the other hand, a guy named Chad Stockholder is, on paper, arguably more believable than

35
00:03:28,720 --> 00:03:33,280
about half of the weirdo tech bros in Silicon Valley. Now in case you hadn't noticed, today

36
00:03:33,280 --> 00:03:37,520
was April Fool's, aka the hardest day of the year for us here,

37
00:03:37,520 --> 00:03:43,040
because Temu TikTok Shop slop has completely flooded the internet and it's hard to tell what's

38
00:03:43,040 --> 00:03:49,920
real. These next ones are probably jokes, though. Yahoo showed off the Scroll Stopper,

39
00:03:49,920 --> 00:03:55,040
a thumb helmet inspired by what I can only imagine is Juggernaut, the Marvel bad guy,

40
00:03:55,040 --> 00:03:59,520
designed to physically stop you from doom scrolling. And I'm kinda sad that this is a

41
00:03:59,520 --> 00:04:03,520
joke because for a second there, I thought I knew one thing that Yahoo still does.

42
00:04:03,520 --> 00:04:09,360
PlayStation's Project Playmo goes the opposite direction, letting AI take over your controller,

43
00:04:09,360 --> 00:04:13,120
your gameplay, and maybe one day, your role as a father and husband.

44
00:04:13,680 --> 00:04:19,680
Trigger's Meat AI glasses promise real-time steak intelligence for those who love meat

45
00:04:19,680 --> 00:04:26,400
and don't love Meta. There's gotta be some crossover. Metro by T-Mobile dropped Kalaon,

46
00:04:26,480 --> 00:04:30,640
a phone-based fragrance so you can smell like an old Nokia,

47
00:04:30,640 --> 00:04:36,160
and Sega released a Sanic shirt collection that thankfully did not include Sonichu.

48
00:04:36,160 --> 00:04:40,880
CD Projekt Red's Project Roach, or Ride on a Controller Horse,

49
00:04:41,600 --> 00:04:46,080
is exactly what it sounds like, an experience that gaming has been sorely lacking until now.

50
00:04:46,080 --> 00:04:50,400
MSI added a cat bed to a monitor arm because of course they did, and I want it.

51
00:04:50,400 --> 00:04:55,200
And then there were these articles, one from TechSpot about AMD buying Intel,

52
00:04:55,200 --> 00:04:59,760
which is thankfully not true, or it would trigger the shitty naming scheme apocalypse,

53
00:04:59,760 --> 00:05:08,240
and another one from TechPowerUp about AMD's answer to DLSS5, called FSR5 Scarlet Cortex,

54
00:05:08,240 --> 00:05:13,920
which genuinely broke my brain because it's seven pages long, with AMD branded presentation

55
00:05:13,920 --> 00:05:19,040
slides, detailed diagrams, and even interactive comparisons, which is a lot of work for an

56
00:05:19,040 --> 00:05:24,080
April Fool's joke about a feature that, in all likelihood, AMD is probably going to release

57
00:05:24,080 --> 00:05:29,280
for real at some point soon. We confirmed with AMD that this is not real, for now.

58
00:05:29,280 --> 00:05:33,600
But like, it's like making an April Fool's joke about Apple working on the iPhone 18.

59
00:05:33,600 --> 00:05:36,000
Haha, what if they did that? They're going to!

60
00:05:37,600 --> 00:05:43,520
Oh man, look at you. You've seen so many April Fool's jokes that your entire internal model

61
00:05:43,520 --> 00:05:48,880
of reality is breaking down. You're like, straight up, been set ontologically adrift,

62
00:05:49,760 --> 00:05:56,720
get got, son! Anthropic accidentally open-sourced part of AI coding tool Claude Code's

63
00:05:56,720 --> 00:06:02,240
source code. Say that five times fast, I won't. The flub was spotted by security researcher

64
00:06:02,240 --> 00:06:09,120
Chow Fan Xu, who linked to the relevant zip archive on Twitter. Anthropic told Axios it was a

65
00:06:09,120 --> 00:06:14,880
release packaging issue caused by human error, not a security breach. I feel like those can be the

66
00:06:14,880 --> 00:06:20,560
same thing, but at least no user data was leaked. Sysadmins on Reddit point out the sweet

67
00:06:20,560 --> 00:06:27,040
irony of Anthropic building 23 separate layers of bash security, only to be undone by one

68
00:06:27,040 --> 00:06:34,240
misconfigured text file. I wouldn't have done that. Anthropic has since DMCA'd the leaks,

69
00:06:34,240 --> 00:06:39,760
because publicly accessible data is only valid when Anthropic is the one doing the scraping.

70
00:06:39,840 --> 00:06:47,920
Okay, FYI. NASA's Artemis II successfully launched today at 6:24 p.m. Eastern Daylight Time from NASA's Kennedy

71
00:06:47,920 --> 00:06:54,240
Space Center in Florida, the first crewed mission beyond low Earth orbit since Apollo 17 in the

72
00:06:54,240 --> 00:07:00,720
70s. The 10-day lunar flyby is part of a longer roadmap in the Artemis program focused on lunar

73
00:07:00,720 --> 00:07:07,200
exploration, and carries the first Black astronaut, first woman astronaut, and even the first Canadian

74
00:07:07,200 --> 00:07:12,640
astronaut to ever visit the moon's orbit, shattering the glass dome that encompasses the

75
00:07:12,640 --> 00:07:17,920
entire Earth, causing our atmosphere to vent into space and giving us all hours to live.

76
00:07:18,480 --> 00:07:22,880
But at least those guys will get to see the moon. Oracle is laying off thousands of its

77
00:07:22,880 --> 00:07:28,960
employees as the company decides AI data centers have a better value proposition than people.

78
00:07:28,960 --> 00:07:35,600
Workers across the US, India, Canada, and Mexico received a termination email from Oracle leadership

79
00:07:35,600 --> 00:07:44,160
at 6 a.m., which is an AI robot thing, with no prior warning, and access to systems was cut immediately.

80
00:07:44,160 --> 00:07:48,000
Senior operations manager Michael Shepard took to LinkedIn to write,

81
00:07:48,000 --> 00:07:53,520
To my colleagues who are impacted today, your worth is not defined by this moment.

82
00:07:53,520 --> 00:07:56,960
A post looking like it was drafted with ChatGPT.

83
00:07:56,960 --> 00:08:00,960
But I think what calms my restless heart is knowing somewhere out there,

84
00:08:00,960 --> 00:08:06,800
Larry Ellison is unaffected by all this sipping chardonnay in a bubble bath while keeping his

85
00:08:06,800 --> 00:08:11,520
perfect wax visage out of the water so it doesn't melt off his head.

86
00:08:11,520 --> 00:08:18,160
Over a hundred of Baidu's Apollo Go robotaxis turned into 4,000-pound paperweights on Wuhan

87
00:08:18,160 --> 00:08:24,160
highways Tuesday night after a total system failure, the second worst news to come out of Wuhan in the

88
00:08:24,160 --> 00:08:25,200
last six years.

89
00:08:27,680 --> 00:08:33,200
Passengers were trapped for almost two hours, some stranded in the middle of high speed lanes with

90
00:08:33,200 --> 00:08:41,040
trucks speeding past them, some of them those driverless box trucks that are just, there's no humans

91
00:08:41,040 --> 00:08:47,920
in there, okay? China uses them, you've seen them. One passenger told a journalist it took her 30

92
00:08:47,920 --> 00:08:53,440
minutes just to connect to a customer representative. Turns out AI drivers don't get sleepy,

93
00:08:53,520 --> 00:08:58,400
but they do occasionally decide to take a simultaneous unprompted coffee break in traffic.

94
00:08:59,040 --> 00:09:08,080
This'll be fine. Let's do more of this. And scientists at USC built a memory chip that works at 700 degrees Celsius,

95
00:09:08,080 --> 00:09:13,440
which could finally allow computers to be used by denizens of the Elemental Plane of Fire.

96
00:09:13,440 --> 00:09:18,960
The team used tungsten, ceramic, and a single atom thick graphene layer to block the

97
00:09:18,960 --> 00:09:25,360
short circuits that normally kill chips in extreme heat. It held data for over 50 hours at 700

98
00:09:25,360 --> 00:09:29,920
degrees and survived over a billion switching cycles at that temperature. So for all the

99
00:09:29,920 --> 00:09:35,280
supervillains out there who have been eyeing up those primo volcano locations for their evil

100
00:09:35,280 --> 00:09:41,440
data centers, now's your time to shine. Thiel, I'm looking at you. And I'm looking at you,

101
00:09:42,080 --> 00:09:48,320
viewer, hoping you'll come back on Friday. Oh, just kidding. We got you again. It's a stat holiday.

102
00:09:49,520 --> 00:09:54,480
We'll be back on Monday, April 6th with more tech news. Man, you are really gullible. You should

103
00:09:54,480 --> 00:09:58,880
do something about that, like severely doubt everything and maintain heinously high standards

104
00:09:58,880 --> 00:10:02,480
of evidence. No intuition or vibes at all. Just question everything. Try that.
