1
00:00:01,839 --> 00:00:09,040
You'll own no IP at all and you'll be

2
00:00:05,359 --> 00:00:10,719
happy. I will be sad. What? Nope. Not

3
00:00:09,040 --> 00:00:15,200
allowed. That's okay. It's the rules. The AI will

4
00:00:12,480 --> 00:00:18,960
stop you from being sad. How could you be sad when you live in a utopia where

5
00:00:17,039 --> 00:00:22,560
everything's controlled and oh, everything's generated?

6
00:00:21,199 --> 00:00:28,000
What are we talking about here, Luke? It's powered by the IP grinding

7
00:00:25,039 --> 00:00:31,439
machine. We're talking about uh IP. We're talking about AI policy. We're

8
00:00:29,840 --> 00:00:35,840
talking about how this stuff is going to possibly work because there's a huge

9
00:00:33,440 --> 00:00:41,760
battle going on between North American and European AI companies and mostly

10
00:00:39,520 --> 00:00:45,360
Chinese AI companies. DeepSeek came and took everyone's lunch, and what are we

11
00:00:43,520 --> 00:00:49,680
going to do about it? You, uh— it's like you host a stream or something.

12
00:00:46,879 --> 00:00:54,160
You've done this before. Maybe we haven't done a TalkLinked in a hot

13
00:00:51,680 --> 00:00:59,600
second, but you've never been on. No. I'm so excited. Yeah. And also you

14
00:00:56,640 --> 00:01:04,360
get to talk about AI. Wait, I just— when Linus doesn't let you? No. Only when he

15
00:01:01,840 --> 00:01:09,360
whips me. As soon as you get to an AI topic— and I know this because I prepare

16
00:01:07,040 --> 00:01:12,479
the topic sometimes. Yeah. I'm like, "Oh, they're going to like this topic."

17
00:01:10,799 --> 00:01:15,920
I stopped preparing them because he just gets there and he goes on his phone.

18
00:01:14,159 --> 00:01:18,880
Yeah. He has literally done that. And then he'll get at me for not paying

19
00:01:17,200 --> 00:01:22,880
attention during other topics. But whatever, man. Yeah. But now you get to

20
00:01:20,960 --> 00:01:30,000
talk about it. So the kind of impetus for all of this was Jack Dorsey,

21
00:01:26,080 --> 00:01:32,560
the founder, former CEO of Twitter.

22
00:01:30,000 --> 00:01:39,600
Yeah. Now X, which I still call Twitter. Yeah. Uh, tweeting "delete all IP law" on

23
00:01:36,640 --> 00:01:43,840
April 11th. And it kind of sparked some discourse. Yeah. That's part of

24
00:01:41,360 --> 00:01:49,840
what in my opinion are like the three core topics around AI policy right now,

25
00:01:47,360 --> 00:01:54,000
which I laid out as open versus closed models, copyright and IP reform, and AI

26
00:01:52,320 --> 00:01:59,040
safety versus innovation speed. Oh, heck. And I think that's like one of the

27
00:01:56,720 --> 00:02:04,399
really core things is what do we do with IP law? Because when you're looking at

28
00:02:01,280 --> 00:02:06,439
the race for AI stuff, it's becoming

29
00:02:04,399 --> 00:02:10,160
more and more like country or alliance-based instead of company based.

30
00:02:08,959 --> 00:02:15,760
So a lot of people are looking at it like, okay, the West versus the East, I

31
00:02:13,120 --> 00:02:21,360
guess, or right now basically DeepSeek versus everyone,

32
00:02:18,040 --> 00:02:26,080
and there are differences in how the

33
00:02:21,360 --> 00:02:29,599
west and China deal with IP law. Do I

34
00:02:26,080 --> 00:02:31,599
understand Chinese IP law? Not even sort

35
00:02:29,599 --> 00:02:34,560
of. Yeah. Yeah. I kind of started to Google it in preparation for this and

36
00:02:33,120 --> 00:02:40,080
I'm kind of like, oh, this is going to be too complicated. What my not-enough

37
00:02:37,599 --> 00:02:44,640
Googling has told me, uh, is that they are much more lax on IP law, especially when

38
00:02:42,879 --> 00:02:49,360
it comes to outside of their own country but also within their own country and

39
00:02:47,360 --> 00:02:53,280
especially when it has to do with anything that could benefit the

40
00:02:50,640 --> 00:02:58,000
state, right? Because over here, you know, the North American kind of

41
00:02:55,120 --> 00:03:01,840
capitalist mindset, corpo-focused, right? It's like, let people go out there,

42
00:02:59,840 --> 00:03:05,200
struggle, whoever rises to the top is good. And China also does that, I know

43
00:03:03,440 --> 00:03:11,440
they do that with, like— but it's more like the government directs these little

44
00:03:08,400 --> 00:03:13,440
experiments to happen. Um so if

45
00:03:11,440 --> 00:03:18,519
something is like oh this will be beneficial to the nation then they can

46
00:03:15,599 --> 00:03:25,200
just kind of override, yeah, the idea. Which, when you're in a race

47
00:03:20,879 --> 00:03:27,519
like this could be very beneficial when

48
00:03:25,200 --> 00:03:32,239
the other side, the West, is, like, squabbling over, are we even allowing

49
00:03:30,159 --> 00:03:37,519
these companies access to this data at all? Which I think is what a lot of the

50
00:03:34,159 --> 00:03:40,720
argument hinges on right is like the

51
00:03:37,519 --> 00:03:43,440
morality around letting these gigantic

52
00:03:40,720 --> 00:03:48,519
companies that don't care about you and want to loot you for everything you have

53
00:03:45,599 --> 00:03:52,959
take all of your IP as well or lose the whole race and have it not

54
00:03:51,200 --> 00:03:57,360
matter anyways because a company in another country that's just going to do

55
00:03:54,319 --> 00:03:59,760
that anyways ends up winning. Yeah. So,

56
00:03:57,360 --> 00:04:03,040
so for me— tough argument— for me, there's kind of two, well, there might be

57
00:04:01,280 --> 00:04:08,720
more than two but the two perspectives that are kind of popping into my head

58
00:04:04,640 --> 00:04:12,000
here are the like philosophical argument

59
00:04:08,720 --> 00:04:14,239
for AI copyright like as a society just

60
00:04:12,000 --> 00:04:21,199
regardless of like you know where China is at do we think that AI created stuff

61
00:04:18,320 --> 00:04:25,120
deserves copyright? Do we think that an AI, a model, could count as the author of

62
00:04:23,360 --> 00:04:29,040
something there's all those questions and then there's the kind of more

63
00:04:26,759 --> 00:04:35,680
utilitarian like practical question of like is this what we have to do in order

64
00:04:32,080 --> 00:04:38,000
to keep up in the nuclear AI arms race

65
00:04:35,680 --> 00:04:40,919
with China. So which of those is more interesting to you? Well, I think right

66
00:04:39,759 --> 00:04:49,880
now the can an AI have ownership over

67
00:04:44,639 --> 00:04:52,560
something that it creates thing. Um is

68
00:04:49,880 --> 00:04:58,000
technically somewhat decided in law right now, and it's no. Right. Well, can

69
00:04:55,199 --> 00:05:02,479
an AI model count as the author? Yeah, I think no. Right now, it's no.

70
00:05:00,160 --> 00:05:06,880
Yeah, the copyright office has ruled a couple times, I think, at least. Yeah.

71
00:05:05,040 --> 00:05:11,440
About— uh, as far as that goes, there's multiple precedents here. I think there's one guy in particular, we'll

72
00:05:09,759 --> 00:05:16,160
look it up later maybe. But like there's one guy in particular who has submitted

73
00:05:13,039 --> 00:05:18,000
multiple times to be like, okay, this is

74
00:05:16,160 --> 00:05:22,880
definitely like there's one for sure, guys. But I used an AI this time and

75
00:05:20,800 --> 00:05:26,039
this time this AI is sentient for sure, guys. And the copyright office is

76
00:05:24,479 --> 00:05:31,360
like, "Stop coming back here and put on some pants." So, there's

77
00:05:28,720 --> 00:05:35,600
precedent against that dude. Yeah. Yeah, that makes sense. But, I mean, I think

78
00:05:32,960 --> 00:05:40,320
that the, like— do we get to keep copyright in an era where all these

79
00:05:38,560 --> 00:05:44,560
companies are just kind of gobbling everything up anyway? Like, those

80
00:05:42,960 --> 00:05:49,160
are the interesting lawsuits that are moving through. The big one being OpenAI

81
00:05:46,479 --> 00:05:54,880
versus the New York Times. Yeah. Uh, so, like, ChatGPT has been out

82
00:05:52,720 --> 00:05:59,320
since 2022 and obviously transformer-based LLMs. They were out

83
00:05:57,520 --> 00:06:04,800
for a while. I mean, they were in research scenarios, but "Attention Is All

84
00:06:02,160 --> 00:06:09,120
You Need"— it's been— yeah, it's 2017. Yeah, I've done research. I can't

85
00:06:07,360 --> 00:06:13,199
believe that this question is still kind of as open as it is. I would have

86
00:06:11,199 --> 00:06:18,560
expected there to be some precedent set and a ruling set by now, but we're

87
00:06:16,160 --> 00:06:23,840
still just kind of, like, you know— Meta is also going through proceedings

88
00:06:20,720 --> 00:06:26,560
where they're like, you literally

89
00:06:23,840 --> 00:06:30,720
torrented, like, every book ever. I mean, not— you know— but, like, a massive

90
00:06:28,800 --> 00:06:34,160
archive of books. Yeah. Yeah. And, like, that's the— and this— I'm not

91
00:06:32,880 --> 00:06:40,720
the first person to reference this at all, but, uh, one of the co-founders, I

92
00:06:37,280 --> 00:06:42,080
think, of Reddit. Mhm. Alex— was it Alexis Ohanian?

93
00:06:40,720 --> 00:06:47,039
Yeah, I don't remember the exact details of this story, but I believe he was

94
00:06:43,840 --> 00:06:48,880
pirating um like research papers and

95
00:06:47,039 --> 00:06:53,440
then making them available to people for free, I think. Yeah. And then he was

96
00:06:51,440 --> 00:06:58,319
going to get thrown in jail forever and ended up

97
00:06:55,400 --> 00:07:01,759
unaliving himself. Oh. Over it because it like ruined his whole life. Wait,

98
00:07:00,160 --> 00:07:05,199
Aaron Swartz? Are you talking about Aaron Swartz? Oh, okay. Yeah. Yeah.

99
00:07:03,280 --> 00:07:08,960
Yeah. Yeah. Yeah. No, I'm familiar. Yeah. Yeah. And then now a company's

100
00:07:07,120 --> 00:07:13,840
doing it on a much grander scale and we're all like, "Ah, you know what?

101
00:07:11,840 --> 00:07:18,639
Seems fine." That totally reminds me. I saw an article recently like referencing

102
00:07:15,919 --> 00:07:24,800
Aaron Swartz and, like, how bonkers it is that it's, like, actually kind of similar.

103
00:07:22,160 --> 00:07:29,520
Well, it's crazy that that was the case just a few years ago. Yeah. And

104
00:07:27,840 --> 00:07:34,960
really not that long ago. His entire life was going to be ruined. Yep.

105
00:07:31,759 --> 00:07:38,720
And now we have every tech company ever

106
00:07:34,960 --> 00:07:40,160
is just gobbling up. I mean, I had as

107
00:07:38,720 --> 00:07:47,280
one of the points I wanted to talk about these like AI web crawlers that are

108
00:07:44,240 --> 00:07:48,880
taking up tons of bandwidth. Oh yeah. I

109
00:07:47,280 --> 00:07:53,520
we've referenced it a couple times on TechLinked, but there are, like, a number of

110
00:07:51,280 --> 00:07:57,120
studies now and like reports from administrators being like, "Yeah, we

111
00:07:55,120 --> 00:08:01,599
looked into it and it's like 80% of our web traffic is just AI crawlers."
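A minimal sketch of the kind of measurement those administrator reports describe: tally what share of requests in a standard web server access log come from known AI crawler user-agents. The log path and the specific user-agent tokens here are illustrative assumptions, not figures or names from the episode.

# Count the share of access-log requests whose user-agent matches
# a list of commonly cited AI crawler tokens (assumed list).
AI_CRAWLERS = ("GPTBot", "ClaudeBot", "CCBot", "Bytespider", "Amazonbot")

total = bots = 0
with open("access.log", encoding="utf-8", errors="replace") as log:  # hypothetical path
    for line in log:
        total += 1
        # In common/combined log format the user-agent is the last quoted
        # field, so a substring check on the raw line is a rough filter.
        if any(token in line for token in AI_CRAWLERS):
            bots += 1

if total:
    print(f"{bots}/{total} requests ({100 * bots / total:.1f}%) look like AI crawlers")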

112
00:07:59,759 --> 00:08:07,919
So regardless— we can— I don't want to bash on AI completely, because I feel like

113
00:08:04,479 --> 00:08:12,080
there's an argument here and like for

114
00:08:07,919 --> 00:08:15,120
AI, for embedding it in society at

115
00:08:12,080 --> 00:08:19,759
every level for even maybe deleting all

116
00:08:15,120 --> 00:08:22,160
IP law. And the argument would be: if

117
00:08:19,759 --> 00:08:28,000
it actually leads to this scenario that Jack Dorsey and

118
00:08:25,919 --> 00:08:34,479
apparently Elon Musk and whoever— because Elon Musk tweeted as well, "I agree"—

119
00:08:30,720 --> 00:08:36,719
to delete all IP law. Which is— yeah. Uh, if

120
00:08:34,479 --> 00:08:39,279
the argument being put forward by them is that, like, okay, this is

121
00:08:38,159 --> 00:08:44,560
actually going to help us it's actually going to lead to a situation where we

122
00:08:41,560 --> 00:08:46,720
benefit. Do we— like, if you could know

123
00:08:44,560 --> 00:08:51,360
deleting all IP law will lead to a kind of a utopian society where you don't

124
00:08:48,640 --> 00:08:54,160
need money. It feels like, you know, there's a bunch of stories that are like

125
00:08:52,560 --> 00:08:59,360
this where you have utopian society and then like in the basement of city hall

126
00:08:56,160 --> 00:09:01,519
there's just, like, a terrible demon

127
00:08:59,360 --> 00:09:05,920
machine that is like the reason why everything's okay. I feel like this is

128
00:09:03,279 --> 00:09:10,160
one of those scenarios. I don't think— um, and I'm decently confident this is

129
00:09:08,320 --> 00:09:15,200
the route it's going to go. I don't think deleting IP law is, like, the

130
00:09:13,120 --> 00:09:19,680
way to go. I feel like what they're trying to do is, like, gambit an

131
00:09:17,640 --> 00:09:24,720
overcorrection. So, they're trying to be like, "Let's delete everything," hoping

132
00:09:22,000 --> 00:09:28,399
that they'll end up somewhere 50% of the way, if that makes sense. Because I

133
00:09:26,560 --> 00:09:31,600
don't think they think that's going to happen either, to be completely honest.

134
00:09:30,399 --> 00:09:36,720
But I think they're trying to do the like Trumpian move of like I'm going to

135
00:09:34,800 --> 00:09:40,720
ask for this and then I'll end up somewhere here. You know, we'll delete

136
00:09:38,640 --> 00:09:45,279
half of IP law. Yeah. Well, I think it's going to be IP reform. Uh because like a

137
00:09:43,200 --> 00:09:51,680
lot of the laws that were made weren't even really made uh in the internet age,

138
00:09:48,399 --> 00:09:54,560
let alone the, like, AI

139
00:09:51,680 --> 00:09:58,959
web-crawler-crazy situation we are in now. So it's likely a lot of modernization

140
00:09:56,880 --> 00:10:03,440
and reform that needs to happen. Uh I mean we even have this problem with like

141
00:10:00,720 --> 00:10:09,760
YouTube videos. Um like there's this huge question mark of like can I have a

142
00:10:06,160 --> 00:10:12,880
clip from a movie in my video? Yeah.

143
00:10:09,760 --> 00:10:15,279
Like maybe. But also maybe not. Yeah.

144
00:10:12,880 --> 00:10:18,800
I've been scared to put in anything, yeah, that could go over the line. But then

145
00:10:16,720 --> 00:10:22,720
like all short form content seems to just be able to use whatever music they

146
00:10:20,240 --> 00:10:27,360
want. Yeah. And like that seems fine. That's completely different, Luke.

147
00:10:24,560 --> 00:10:31,839
That's totally different. One is watched by young people. And the other they

148
00:10:30,399 --> 00:10:36,560
don't have money. Why do they matter? They don't. They're too dumb. They're young and dumb. They don't know about

149
00:10:34,399 --> 00:10:41,200
copyright. Yeah. Elijah, one of the parallels that I would draw is to

150
00:10:38,800 --> 00:10:44,720
patents. And it's not perfect, but it's an interesting argument, because

151
00:10:42,959 --> 00:10:48,560
there's a huge debate on whether or not patents have driven or stifled

152
00:10:46,480 --> 00:10:54,320
innovation in the past. And I think it really depends on

153
00:10:50,720 --> 00:10:55,920
the time, the country, uh, the field

154
00:10:54,320 --> 00:10:58,880
of innovation. Like, it depends on tons of stuff. A lot of people— this is

155
00:10:57,680 --> 00:11:04,800
the Wright brothers conversation I warned you about. A lot of people look at the Wright brothers and are like, "Oh

156
00:11:02,959 --> 00:11:09,000
my god, they brought in— this is amazing, right? We didn't think we

157
00:11:06,720 --> 00:11:14,800
were going to fly for a long time." And then they,

158
00:11:11,160 --> 00:11:17,600
like, immediately turned into patent

159
00:11:14,800 --> 00:11:21,279
trolls. As far as— I could be wrong about this. A little bit of online research

160
00:11:19,279 --> 00:11:26,240
led me to find that they got their patent and then just

161
00:11:24,480 --> 00:11:30,959
wrecked everyone else as much as they could and, like, very close to

162
00:11:28,480 --> 00:11:34,320
stopped innovating. Oh, jeez. They, like, got aviation off the ground and

163
00:11:32,640 --> 00:11:40,240
then just tried to make money off of what they did. Did you get this info

164
00:11:36,399 --> 00:11:43,680
from Grok? No. Okay. But there's also

165
00:11:40,240 --> 00:11:46,000
like in my opinion the which I don't

166
00:11:43,680 --> 00:11:53,360
think is as common anymore, but being an inventor as your trade as like what you

167
00:11:49,480 --> 00:11:55,440
do probably became a thing because of

168
00:11:53,360 --> 00:11:59,360
things like patents and IP law. Oh yeah, he's an inventor. Like, no one has

169
00:11:57,120 --> 00:12:05,600
thought of, you know, an automatic egg making machine or something. But like

170
00:12:02,320 --> 00:12:08,320
it's a chicken. At one point IP law,

171
00:12:05,600 --> 00:12:12,480
copyright law, patent law, whatever, uh, spurred innovation, and if we get rid of

172
00:12:10,240 --> 00:12:15,480
it, you're worried that we would lose. Well, but there's—

173
00:12:13,600 --> 00:12:21,279
there's stories on both sides, because, like, let's come back to more IP-specific

174
00:12:18,480 --> 00:12:26,160
law, right? Um Disney kind of owned children's entertainment for a long

175
00:12:23,120 --> 00:12:29,120
time, most of our lives and a lot of

176
00:12:26,160 --> 00:12:33,839
before it. Um and now they're losing but they're not losing because you can use

177
00:12:31,279 --> 00:12:37,519
Mickey Mouse in other content now because that is a thing that happened.

178
00:12:35,519 --> 00:12:41,120
That was like a tiny splash. A bunch of games all came out all at once right

179
00:12:39,279 --> 00:12:44,800
when that got lifted that all featured Mickey Mouse and none of them were good.

180
00:12:43,120 --> 00:12:48,399
Everyone stopped caring almost immediately. No one talks about it

181
00:12:46,160 --> 00:12:52,240
anymore at all. And Bluey is just completely taking their lunch. Heck yeah.

182
00:12:50,480 --> 00:12:56,800
Have you watched Bluey? I have seen some Bluey. Bluey is freaking— It makes me

183
00:12:54,880 --> 00:13:02,480
cry. They like lost their ability to protect their IP on Mickey, right? And

184
00:12:59,360 --> 00:13:05,519
it didn't matter. And that's why the MCU

185
00:13:02,480 --> 00:13:07,519
isn't as good anymore, cuz

186
00:13:05,519 --> 00:13:13,200
they're losing their— because they didn't stop, uh, Steamboat Willie

187
00:13:10,320 --> 00:13:17,200
from escaping. He was going to join the Avengers.

188
00:13:15,360 --> 00:13:22,880
Steamboat Willie was— Yeah, honestly, I would go watch that. I would. I haven't

189
00:13:19,440 --> 00:13:24,560
watched a Marvel movie in many years. If

190
00:13:22,880 --> 00:13:29,120
Steamboat Willie joined the Avengers, I would go. I mean, I will

191
00:13:26,800 --> 00:13:35,440
say that, like, one argument for the deleting of IP law is— the few

192
00:13:33,680 --> 00:13:38,800
times— there's been a couple times where Disney puts out a project and it

193
00:13:37,360 --> 00:13:43,040
seems like the entire point of the project is just to brag about how much

194
00:13:40,399 --> 00:13:48,160
IP they have. Like, did you see the Ralph Breaks the Internet movie, like, quite a few

195
00:13:46,160 --> 00:13:52,480
years ago? Did you— you saw it? I think so. Okay. Yeah. It was just— the whole

196
00:13:50,480 --> 00:13:56,320
movie just seemed like they were like— wait, Ralph. So there's Wreck-It Ralph and

197
00:13:54,639 --> 00:13:59,680
then the sequel was Ralph Breaks the Internet. Wreck-It Ralph, they had a

198
00:13:58,240 --> 00:14:03,040
couple recognizable people. Sonic was there, and, like, Bowser and

199
00:14:01,440 --> 00:14:07,519
stuff, but then they go to the internet in the second one and they they go to

200
00:14:05,279 --> 00:14:10,959
like the Disney website, and it's just like, "Wow, what a magical place.

201
00:14:09,519 --> 00:14:14,800
Look, there's Star Wars and there's these and Lilo and Stitch and like

202
00:14:12,959 --> 00:14:18,560
they're just showing all the princesses are there." So it was basically like

203
00:14:17,279 --> 00:14:24,160
look at all these things that used to be cool. Watching that movie made me

204
00:14:21,360 --> 00:14:27,920
think, like, this is kind of— if I watched that movie and then Jack

205
00:14:25,920 --> 00:14:32,639
Dorsey called me up and was like, "We should delete all IP law," I'd be like, "Jackie,

206
00:14:30,079 --> 00:14:37,760
you're so smart, why don't we talk more often?" No, I— please. It kind of makes me

207
00:14:35,760 --> 00:14:44,480
feel that way because it's like we're heading into this AI

208
00:14:40,519 --> 00:14:46,480
future where if AI gets power like

209
00:14:44,480 --> 00:14:52,360
there's kind of two paths that I see. Yeah.

210
00:14:47,639 --> 00:14:54,880
Um, one is we lock this down. And we're

211
00:14:52,360 --> 00:14:59,199
like, AI companies, you're going all over the place. You're slurping

212
00:14:56,639 --> 00:15:03,360
up everyone's stuff and making slop. We need to stop this. If you want to

213
00:15:01,600 --> 00:15:06,800
train on material, you got to license it. You got to pay all the people you've

214
00:15:05,120 --> 00:15:13,279
already trained on their material. Like— either that— that's not going to happen. But, like, either we kind

215
00:15:10,480 --> 00:15:19,199
of move in that direction or things just kind of keep progressing as they have

216
00:15:15,040 --> 00:15:22,240
been and copyright and IP just kind of

217
00:15:19,199 --> 00:15:25,040
become this kind of nebulous, like—

218
00:15:22,240 --> 00:15:31,440
the walls get torn down a lot more. Yeah. Uh, and it's a lot more

219
00:15:28,519 --> 00:15:34,720
sketchy as to what's infringing and what's not. You think we're going there?

220
00:15:33,360 --> 00:15:38,480
I think we're going there. So Jack Dorsey is going to mostly maybe get his

221
00:15:36,800 --> 00:15:41,600
way. I think he's going to partially win. I don't think, like I said earlier,

222
00:15:40,240 --> 00:15:51,680
I don't think we're going to be deleting IP law, but I think there will be significant reform, because I don't see

223
00:15:47,279 --> 00:15:54,800
America effectively just kind of being

224
00:15:51,680 --> 00:15:57,279
willing to lose the AI race because I

225
00:15:54,800 --> 00:16:02,399
see this as— in their opinion, this is computers. This is, uh—

226
00:16:00,320 --> 00:16:07,440
they have to fight it again. So, I think I agree with you if we're talking about

227
00:16:04,399 --> 00:16:10,079
like the powers that be. Yeah. The, uh,

228
00:16:07,440 --> 00:16:14,720
you know tech corporations, government, they definitely don't want to lose to

229
00:16:11,279 --> 00:16:16,639
China. Yeah. The question then is—

230
00:16:14,720 --> 00:16:24,480
how do the people feel about it? Oh yeah. Because every time I encounter

231
00:16:21,040 --> 00:16:25,920
sentiment about AI and copyright online,

232
00:16:24,480 --> 00:16:31,600
this is why this tweet was so interesting because the vast majority of

233
00:16:29,360 --> 00:16:36,000
the discourse around AI and copyright online is like, yeah, these AI

234
00:16:34,560 --> 00:16:40,160
companies are stealing from people. Well, they are. Yeah. Yeah. And you have

235
00:16:38,560 --> 00:16:45,199
the tech bros, you know, whatever coming out every once in a while being like, we

236
00:16:42,639 --> 00:16:49,079
need to chill, guys. Like, how are we going to get exponential growth if we

237
00:16:46,880 --> 00:16:53,519
don't steal all this copyrighted stuff? Um, but they don't usually— like,

238
00:16:52,000 --> 00:16:57,680
that's not getting a ton of engagement. It's like— but this is saying the

239
00:16:55,199 --> 00:17:01,120
quiet part out loud, pretty much. It's because it doesn't feel good.

240
00:16:59,360 --> 00:17:04,640
They are stealing stuff. They are taking other people's work and they are

241
00:17:02,399 --> 00:17:09,520
directly aggressively profiting off of it. The

242
00:17:07,199 --> 00:17:12,480
well— profiting off of investment, I guess. Most of these companies are really not

243
00:17:10,959 --> 00:17:16,319
making a profit. Yeah. They're not taking the copyright machine

244
00:17:14,400 --> 00:17:19,839
and putting it into a different machine and it's like wow it's chewing up the

245
00:17:18,000 --> 00:17:23,039
copyright and spitting out profit. They're tricking people into thinking

246
00:17:21,520 --> 00:17:26,799
the copyright machine, putting a cool little bow on it, and being like, you want

247
00:17:24,559 --> 00:17:31,440
to give me money for nothing. Yeah. Sick. Um

248
00:17:29,280 --> 00:17:34,960
The bubble, man. Yeah, the bubble's pretty— sorry, you

249
00:17:33,520 --> 00:17:40,000
were saying something though. No, it's just it doesn't feel good. Yeah, it's

250
00:17:36,720 --> 00:17:41,600
morally, I think it's, like, bad. You're

251
00:17:40,000 --> 00:17:46,320
taking significant work from people, especially— and we already see

252
00:17:44,400 --> 00:17:50,720
this from Google, right? A lot of small brands are getting screwed over by these

253
00:17:47,760 --> 00:17:55,360
search engines and now also by AI systems because everything's

254
00:17:52,960 --> 00:17:59,360
intercepting. Like, even right now, if you're using Google Search— not even if

255
00:17:57,039 --> 00:18:03,600
you're in a chat program— if you're using Google Search, it will give you an AI

256
00:18:01,280 --> 00:18:07,840
summary at the top and the AI summary will often make it so that you do not

257
00:18:06,160 --> 00:18:12,960
have to go through and click onto one of those websites and that is making it so

258
00:18:10,080 --> 00:18:18,480
that independent small time research, writing, whatever, is all effectively

259
00:18:15,760 --> 00:18:23,760
being defunded by these platforms that are using them in order to succeed,

260
00:18:20,720 --> 00:18:25,760
right? And that is a crazy concept.

261
00:18:23,760 --> 00:18:31,520
Yeah, because like, okay, we might win the AI race, but at what cost? The cost

262
00:18:28,640 --> 00:18:34,880
of like literally everything else. Um, I was going to say like, okay, yeah,

263
00:18:32,799 --> 00:18:39,200
definitely at the cost of like, you know, smaller creators, smaller like

264
00:18:37,120 --> 00:18:43,160
copyright holders or whatever. uh smaller companies that are trying to

265
00:18:40,880 --> 00:18:48,320
get up in the SEO rankings and they just can't. But also at the cost of like I

266
00:18:45,520 --> 00:18:53,280
feel like our minds. There have been some, uh, studies looking at people who use

267
00:18:51,360 --> 00:18:58,160
AI a lot versus people who don't use AI a lot and it's like looking at their

268
00:18:55,120 --> 00:19:00,880
like, problem-solving ability on novel tasks.

269
00:18:58,160 --> 00:19:05,280
Yeah. They're worse at it. And I know that when I go to Google something, it

270
00:19:03,039 --> 00:19:08,960
gives me an AI Overview. I know how to Google stuff. My whole job is, like,

271
00:19:07,120 --> 00:19:12,799
researching stuff and like reporting on it. So like I've learned how to be

272
00:19:10,640 --> 00:19:17,360
really good at, like, Google-fu. Yeah. But most people are not, and

273
00:19:15,600 --> 00:19:21,600
they're going to be even worse now if they don't even have to click into like

274
00:19:18,960 --> 00:19:27,039
one site to see the answer, or what the quote-unquote answer

275
00:19:24,880 --> 00:19:31,280
is, because it might be fake. And it's crazy that we all know

276
00:19:29,600 --> 00:19:35,440
that AI hallucinates and just makes stuff up all the time, but a huge

277
00:19:33,760 --> 00:19:40,640
portion of the internet— like, genuinely, a lot of people— seem to be going down

278
00:19:37,679 --> 00:19:46,320
the route of blindly verifying things with AI. Mhm. So, like, you see

279
00:19:44,320 --> 00:19:51,520
this on the tweeters, which I'm still going to call it, um where someone will

280
00:19:48,400 --> 00:19:53,760
be like, "Grok, is this true?" Don't call

281
00:19:51,520 --> 00:20:00,000
it the tweeters. The tweeters. Just call it Twitter. Twitter. All right.

282
00:19:57,600 --> 00:20:03,360
But they'll ask, like, Grok or Perplexity— or whatever other ones will respond to

283
00:20:01,600 --> 00:20:08,080
tweets if something is true and then they will just take that fully at face

284
00:20:05,440 --> 00:20:11,600
value. Which is brutal. I made a joke about this on TechLinked the other day, cuz,

285
00:20:09,600 --> 00:20:16,720
yeah I've seen this where someone makes a claim and someone's like I don't think

286
00:20:14,000 --> 00:20:20,720
that's true and they're like well it is cuz look at this website and then they

287
00:20:18,320 --> 00:20:25,840
say, "Hey Grok, is this true?" And it's like, just click on the website. Just Google

288
00:20:23,039 --> 00:20:30,480
it. Why are you asking an AI that may or may not lie to your face? Like they've

289
00:20:28,400 --> 00:20:33,760
gotten a lot better, but they still hallucinate enough that it's like, why

290
00:20:32,240 --> 00:20:38,240
are you trusting it? They hallucinate a lot. Yeah. And then what's even worse is

291
00:20:36,080 --> 00:20:43,760
the people who are like, I collaborated with ChatGPT on this, and then they

292
00:20:40,960 --> 00:20:47,600
like post some giant freaking thing. It's like what was your participation in

293
00:20:45,919 --> 00:20:51,679
this? I was wondering whether this was the case, so I asked ChatGPT, and it

294
00:20:49,679 --> 00:20:55,919
told me that. So, I guess ChatGPT likes this, and it's like, "Guys, stop." Yeah.

295
00:20:54,159 --> 00:20:59,520
The confirmation bias is crazy, too, because a lot of the chat programs will

296
00:20:57,919 --> 00:21:04,080
try to detect what you're leaning towards and just affirm it. Yeah, cuz

297
00:21:01,760 --> 00:21:09,840
people like those responses. We've done a lot of bashing on AI so

298
00:21:06,240 --> 00:21:14,000
far. Yeah. But it deserves it. Well, it

299
00:21:09,840 --> 00:21:15,600
does, but I know that you and I Well, I

300
00:21:14,000 --> 00:21:19,840
don't know, but I have a suspicion that you and I share an appreciation for some

301
00:21:17,840 --> 00:21:25,280
of its upsides as well. And I want to talk about that after this segue to our

302
00:21:22,440 --> 00:21:31,280
sponsor, Vessi. And Luke, let me tell you, I've been wearing Vessi for years

303
00:21:27,919 --> 00:21:33,679
now. Nice. And, uh, you know, Vessi says

304
00:21:31,280 --> 00:21:38,960
they're waterproof. They're comfortable. Oh, are your feet dry? They are. And we

305
00:21:36,480 --> 00:21:43,919
live in Vancouver here. It's tough. It's a wet place. Rains all the time. It's

306
00:21:40,799 --> 00:21:45,440
wet and moist. And sometimes the

307
00:21:43,919 --> 00:21:51,760
water just gets in places you didn't even know it could. Critically. What?

308
00:21:48,080 --> 00:21:53,280
Don't worry about it. Critical hit.

309
00:21:51,760 --> 00:21:58,600
You said moist. Don't worry about it. It's fine. Oh, moist. Critical.

310
00:21:58,799 --> 00:22:04,960
Critical hit. By the way, this is completely real. So, like we had an

311
00:22:02,559 --> 00:22:09,919
atmospheric river here in Vancouver. Yeah. I don't know if it affected you.

312
00:22:07,200 --> 00:22:14,159
Seems like not, in your ivory castle. Oh, yeah. Up on the hill. You're safe

313
00:22:12,240 --> 00:22:17,919
and sound. I have a ground-floor apartment.

314
00:22:16,080 --> 00:22:22,640
Okay. And all of the pipes for my building go to this corner where my unit

315
00:22:20,720 --> 00:22:27,760
is. You know, you know what got me? What? The snow drift on the roof. Oh,

316
00:22:25,120 --> 00:22:32,400
really? Separated the, like, shingles on the roof, and it rained in my

317
00:22:30,159 --> 00:22:36,080
apartment. Oh my gosh. So, you know, the ivory tower ain't that great either. Oh

318
00:22:34,159 --> 00:22:41,679
jeez. Yeah, you need to find a different ivory tower. Yeah, maybe an ebony tower.

319
00:22:40,159 --> 00:22:46,720
So, the atmospheric river hit. It overloaded the drain system. My patio

320
00:22:43,919 --> 00:22:50,559
flooded. It's an enclosed patio, and it flooded. I've woken up in the

321
00:22:48,799 --> 00:22:56,640
morning multiple times over the past few years to find water streaming into

322
00:22:53,600 --> 00:22:59,520
my, uh, house. I've used Vessi shoes

323
00:22:56,640 --> 00:23:03,679
multiple times wading through, bailing out my patio, throwing water over the

324
00:23:02,000 --> 00:23:08,240
side of the fence. Now, all of these issues are fixed, but being out in the

325
00:23:06,320 --> 00:23:12,559
rain, in the cold— cuz they're warm, too. These Vessi shoes, these

326
00:23:10,480 --> 00:23:17,360
crazy Vessi shoes, they're warm. They keep your feet dry, and you don't have to

327
00:23:15,679 --> 00:23:21,039
do up your laces every time. They're slip-on, slip-off. Yeah. Actually, my

328
00:23:19,120 --> 00:23:23,440
girlfriend Emma went out and bought a pair just cuz she thought they sounded

329
00:23:22,400 --> 00:23:28,080
cool. She didn't even know they were sponsors of ours. Oh. Which is a bit of a

330
00:23:26,000 --> 00:23:32,440
failing on my part. What did she Yeah. What are you doing? I don't know. But

331
00:23:30,240 --> 00:23:38,240
she loves them too. Just up in that castle just imagining daydreaming.

332
00:23:36,640 --> 00:23:42,240
Yeah. Now, for me, my Vessis have been very helpful. They kept my feet dry

333
00:23:40,400 --> 00:23:46,880
during these horrible, horrible nights I had to endure. Pumping water out of the

334
00:23:44,320 --> 00:23:51,760
sump, my whole body was wet, but not my shoes. Not your— Not my feet.

335
00:23:49,120 --> 00:23:56,000
Feet dry. Everything else dry, and I looked great. Now you need a Vessi

336
00:23:53,360 --> 00:24:00,400
suit. Vessi, make it happen. These shoes work with any outfit. They're very

337
00:23:57,919 --> 00:24:04,480
comfortable. They keep your feet dry. So check out Vessi in the description

338
00:24:01,760 --> 00:24:08,480
below if you're tired of squishy socks. Okay, now AI. Essentially, the argument

339
00:24:07,039 --> 00:24:12,080
we're talking about is AI safety versus innovation speed because it's like, is

340
00:24:10,400 --> 00:24:16,000
it more worth going for the moral argument of, this sucks, we're stealing

341
00:24:13,679 --> 00:24:19,600
things from people, or is it more worth not losing to China, right? And this is why

342
00:24:18,159 --> 00:24:26,840
I think the reason why there hasn't been precedent set is these are such

343
00:24:21,799 --> 00:24:30,480
different weights that it's hard

344
00:24:26,840 --> 00:24:33,679
to— and, like, the same person can want

345
00:24:30,480 --> 00:24:37,320
both sides of it very deeply. Yeah.

346
00:24:33,679 --> 00:24:40,320
Yeah, the same person can want to not

347
00:24:37,320 --> 00:24:44,799
steal all thought creation from humanity

348
00:24:40,320 --> 00:24:47,039
ever and also not want to lose this war

349
00:24:44,799 --> 00:24:53,679
because if you lose this, then this side didn't matter anyways. So it's

350
00:24:49,919 --> 00:24:55,440
especially hard because you can like as

351
00:24:53,679 --> 00:24:59,400
you were saying that my first kind of like gut reaction is like but what about

352
00:24:57,360 --> 00:25:03,360
principles you know like there's kind of like you know the utilitarian thing of

353
00:25:02,080 --> 00:25:08,000
like this is what's necessary in order to win but I'm like oh but at what cost?

354
00:25:05,360 --> 00:25:14,080
What if we lose our soul as a society? But at the same time, it's

355
00:25:10,320 --> 00:25:16,080
like, well, if China develops a super

356
00:25:14,080 --> 00:25:19,279
intelligence that kind of hacks the entire world and makes everything, you

357
00:25:17,600 --> 00:25:23,919
know, whatever they want it to be, then we won't have a society. That's like

358
00:25:21,840 --> 00:25:29,520
very far on the end of the possibility. Yeah, but I mean, there's also the

359
00:25:26,240 --> 00:25:32,240
problem of how consumers act about

360
00:25:29,520 --> 00:25:35,520
things, which is everyone will say online, not everyone, I'm exaggerating,

361
00:25:33,840 --> 00:25:40,559
a lot of people will say online that they want the uh, you know, don't steal

362
00:25:38,480 --> 00:25:47,039
everything from everyone route, but then if a company that does do that releases

363
00:25:43,200 --> 00:25:50,159
a model that's better, they will use it.

364
00:25:47,039 --> 00:25:52,960
Well, yeah— and, like, the

365
00:25:50,159 --> 00:25:58,159
precedent of that is like carved into every stone on the planet. Like they

366
00:25:55,120 --> 00:26:01,840
when you say "they will use it"— users.

367
00:25:58,159 --> 00:26:04,320
If everyone knows that many, uh,

368
00:26:01,840 --> 00:26:09,600
things are made in sweatshops by slaves, and they keep buying them. Um, people

369
00:26:06,640 --> 00:26:15,120
won't like, uh, gambling, but they'll watch Kick. People won't like a certain thing,

370
00:26:12,480 --> 00:26:19,279
but then if the thing is better they'll buy it anyways. And I guess this kind of

371
00:26:17,120 --> 00:26:25,679
comes into a part of the conversation that I wanted to address, which was the

372
00:26:21,919 --> 00:26:28,880
usefulness of AI, the pros of AI,

373
00:26:25,679 --> 00:26:31,679
because I remember I had friends that

374
00:26:28,880 --> 00:26:38,159
are not techy people. Um, one is a teacher. And in like

375
00:26:35,240 --> 00:26:42,960
2023, like, shortly after ChatGPT was released, I go and talk to him, and he's

376
00:26:40,400 --> 00:26:46,320
like, "Yeah, Chachi, I use chatbt all the I'm like, first of all, I was like,

377
00:26:44,480 --> 00:26:50,159
you know what ChatGPT is? Because these are, like, not techy people. Yeah. And

378
00:26:48,400 --> 00:26:53,520
he's like, yeah, I use it to help me, like, make lessons. The user growth in history— it

379
00:26:52,240 --> 00:26:56,720
has to have hit a lot of people, right? Yeah. And he was using it to make

380
00:26:55,279 --> 00:27:02,720
lesson plans for his kids and, like, coming up with ideas for stuff. And I was like, wow. Cuz that was at

381
00:27:00,159 --> 00:27:07,480
a point where even I wasn't really— I hadn't found it to be super

382
00:27:04,799 --> 00:27:12,880
useful for my specific, uh, work. Yeah. Um, my dad's a plumbing

383
00:27:10,960 --> 00:27:17,679
instructor, and they use it for— Yeah. Lesson plans. Yeah. And once it gets—

384
00:27:15,360 --> 00:27:22,799
like, once these actual multimodal updates get out— like, Gemini Live is

385
00:27:20,080 --> 00:27:27,279
now available with the live camera feed. Yeah. I think ChatGPT has— I know you

386
00:27:25,600 --> 00:27:31,919
can feed it images, but I think they have a camera mode for the live

387
00:27:29,200 --> 00:27:37,279
mode too. There's um I think a lot of people dramatically misuse AI. Um and

388
00:27:35,679 --> 00:27:40,400
like, for instance— like, I say that about my dad's lesson plans

389
00:27:38,960 --> 00:27:46,559
for plumbing and people are like oh hopefully it doesn't like teach people to go outside of code and it's like well

390
00:27:44,000 --> 00:27:52,159
no he's controlling those aspects. He's using it to build the structure right to

391
00:27:49,760 --> 00:27:56,320
build the pacing to come up with the document that he can put the information

392
00:27:53,840 --> 00:28:00,799
on. He's using it to help with organizational skills, those types of things. The way

393
00:27:57,760 --> 00:28:03,760
that I'll often use it is to review my

394
00:28:00,799 --> 00:28:07,760
output and give me feedback on my output. I don't use it for output, but

395
00:28:06,159 --> 00:28:14,559
like what kind of output are you talking about? My most common use case is I will

396
00:28:10,960 --> 00:28:17,600
write an email that is like I sure hope

397
00:28:14,559 --> 00:28:19,279
this goes well. Uh, so— but, you know,

398
00:28:17,600 --> 00:28:22,240
maybe I have to be rather serious or whatever in the tone of the email, but I

399
00:28:21,120 --> 00:28:27,840
want to make sure that I get my point across but not come across as aggressive or whatever else. So, I'll ask it for a

400
00:28:25,679 --> 00:28:31,279
sentiment analysis of the email. And then I might ask it like if I'm like,

401
00:28:29,600 --> 00:28:35,520
okay, that's not what I wanted someone to get from this. Yeah. And it got that

402
00:28:33,600 --> 00:28:38,720
from this, then I'll be like, okay, how would you change it to get it there? And

403
00:28:37,039 --> 00:28:43,120
then I won't use its output, but I'll use it to help guide me in that direction.
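A rough sketch of that review-not-write loop, assuming the openai Python package with an OPENAI_API_KEY configured; the model name, prompts, and draft are placeholders, not anything specified in the episode.

# Ask a model for a sentiment analysis of a draft email instead of a
# rewrite, so the critique informs edits you make by hand.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

draft = "Hi team, I noticed the deadline slipped again. We need this fixed."

review = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "Give a sentiment analysis of the user's email: the "
                    "overall tone, and any lines likely to read as "
                    "aggressive. Do not rewrite the email."},
        {"role": "user", "content": draft},
    ],
)
print(review.choices[0].message.content)  # feedback applied manually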

404
00:28:40,880 --> 00:28:46,399
Did I mention lizards too many times? Well, like, sometimes your

405
00:28:45,120 --> 00:28:51,279
email is pretty good, but you seem weirdly focused on lizards.

406
00:28:49,919 --> 00:28:56,880
Maybe you should draw back on the lizards. I think they're not like a reptile person.

407
00:28:54,480 --> 00:29:02,480
Maybe cats. We've had— um, I think it's also a good, um, kick-starter of

408
00:28:59,919 --> 00:29:07,919
sorts, like for writer's block, for instance. If you have to write, um, a

409
00:29:05,520 --> 00:29:11,200
video on something, you could use it to give you your skeleton. I found I've

410
00:29:09,679 --> 00:29:15,279
tried to get it to write YouTube videos. Man, it sucks. I don't know if you've

411
00:29:13,279 --> 00:29:20,000
tried. I've used it a couple times when I like can't think of a word for

412
00:29:17,440 --> 00:29:23,840
something or like I'm like writing a joke and I'm like, "Oh man, this is like

413
00:29:22,000 --> 00:29:27,760
this." And I'm just like drawing a blank and I'm like, "What's this thing? What's

414
00:29:26,320 --> 00:29:31,039
something like when this guy says this and then another guy comes at it like

415
00:29:29,360 --> 00:29:37,840
that and they're like, "Maybe this." And I'm like, "Okay, thank you." Yeah, that makes sense. But writing whole

416
00:29:34,320 --> 00:29:41,520
scripts, no. No good. But you can use it

417
00:29:37,840 --> 00:29:43,919
as like a research assistant. Um, again,

418
00:29:41,520 --> 00:29:47,679
there's huge pitfalls there because you shouldn't just believe what it outputs,

419
00:29:45,919 --> 00:29:52,240
but you can use it as like a kicking off point. Mhm. So like something that I've

420
00:29:49,440 --> 00:29:56,880
done is, like, okay, here's a topic— what are the, like, key fields of interest in

421
00:29:54,640 --> 00:30:01,520
this topic and it'll give me that list and then I'll go off and learn things

422
00:29:58,799 --> 00:30:05,760
about those. All of the AI chatbots now have like deep research modes. It does a

423
00:30:03,679 --> 00:30:09,760
good job of like giving you kind of a beginner summary of stuff. But if you

424
00:30:07,840 --> 00:30:13,760
know a lot about a topic and you're like what was the finding of this research

425
00:30:11,279 --> 00:30:17,760
like, in the 19-whatever, and it's like, they found this stuff

426
00:30:15,600 --> 00:30:21,840
that I just made up because I'm an AI and you're like well I know that's not

427
00:30:19,440 --> 00:30:24,880
true. So it's like but that's why like when you mentioned earlier in the show

428
00:30:23,360 --> 00:30:29,600
that it's getting better with like hallucinations and stuff a little bit.

429
00:30:27,840 --> 00:30:35,080
It's only a little bit. I think it's getting better at convincing people that

430
00:30:31,919 --> 00:30:38,279
it's not doing it. Yeah. Um, I don't

431
00:30:35,080 --> 00:30:42,640
know. I don't personally think it's

432
00:30:38,279 --> 00:30:44,320
getting dramatically better overall. Um,

433
00:30:42,640 --> 00:30:47,840
there are measures that it's getting a bit better, but I really think it's just

434
00:30:46,159 --> 00:30:51,760
getting better at convincing people and it's getting better at using slightly

435
00:30:49,919 --> 00:30:57,120
more vague language and stuff like that to get away from um actually being

436
00:30:55,279 --> 00:31:03,200
pointed at as doing things wrong, because I catch it hallucinating stuff, like— Oh,

437
00:30:59,360 --> 00:31:05,200
yeah, all the time. I will say that I've

438
00:31:03,200 --> 00:31:10,240
just started trying to use it a little bit to help me

439
00:31:07,279 --> 00:31:14,720
research TechLinked stories. Um, it's kind of like one of the steps in the

440
00:31:12,000 --> 00:31:19,120
process, and it has not been very, uh, useful. But most of the

441
00:31:16,720 --> 00:31:25,039
time it's not hallucinating fake stuff as much as it's like I'm asking for uh

442
00:31:22,880 --> 00:31:28,240
stories that are recent and it like brings me something from a month ago.

443
00:31:26,399 --> 00:31:32,039
Here's something from last quarter. Yeah. Yeah. On balance, if you look at

444
00:31:30,880 --> 00:31:38,720
the whole field, do you feel like the majority of

445
00:31:35,600 --> 00:31:40,559
AI tools that are available are like

446
00:31:38,720 --> 00:31:46,320
helpful or just kind of taking up space and taking up resources? Whoa. Okay.

447
00:31:44,799 --> 00:31:51,840
Majority of AI tools. So, if you include every AI tool in that pool, then

448
00:31:48,880 --> 00:31:57,720
honestly, no. There's a lot of compute, a lot of resources being taken up. Yeah.

449
00:31:54,720 --> 00:32:00,960
And that they're able to do that

450
00:31:57,720 --> 00:32:02,640
because these companies vacuumed up IP,

451
00:32:00,960 --> 00:32:08,000
vacuumed up copyright, and massive investment, because that's the hotness right now. I think most of the

452
00:32:05,760 --> 00:32:12,399
tools are garbage and will lose and will die. And that is how these, like,

453
00:32:10,320 --> 00:32:15,919
very startup-focused fields work. So that's pretty natural. I don't actually

454
00:32:14,240 --> 00:32:20,399
think that's weird. I don't think that's AI-specific. I think that's just

455
00:32:18,159 --> 00:32:23,440
true of any big new fancy field with lots of investment dollars and everyone trying

456
00:32:21,600 --> 00:32:28,559
to dive in all at the same time. A lot of those people are going to make garbage. For the handful of, like,

457
00:32:26,559 --> 00:32:33,360
crypto projects that were actually kind of cool— and, like, they did

458
00:32:30,640 --> 00:32:38,000
exist. They did. Like, the idea of having some, like, records and stuff on a

459
00:32:35,679 --> 00:32:43,799
blockchain system— that was cool. Yeah. But for every one of those, there were

460
00:32:40,799 --> 00:32:43,799
250,000

461
00:32:43,840 --> 00:32:53,679
uh, rug-pull, yeah, garbage things. So it's the same for AI. But maybe that all

462
00:32:49,760 --> 00:32:56,880
won't matter if— we only need one.

463
00:32:53,679 --> 00:32:58,559
We only need one ASI. Artificial super

464
00:32:56,880 --> 00:33:05,519
intelligence. And then it'll all be worth it. ASGI. You

465
00:33:02,640 --> 00:33:09,200
can do it. Yeah. What? Yeah. ASGI. Artificial super general intelligence.

466
00:33:07,919 --> 00:33:12,960
I've never heard that together. I've heard AGI and ASI. You can't just put

467
00:33:11,600 --> 00:33:16,799
them together like that. We were talking earlier about innovators. That's true.

468
00:33:14,799 --> 00:33:21,519
I'm an inventor. Wow. Yeah. Here's a patent, sir. Thank you. That was— You

469
00:33:19,679 --> 00:33:25,840
just witnessed innovation. Jack Dorsey, this is what you're trying to kill.

470
00:33:22,880 --> 00:33:29,840
Yeah. Stop it. Think of the children. So, maybe Jack Dorsey's right and we'll

471
00:33:27,840 --> 00:33:36,279
have a utopia and none of us will own any IP and everything will be fine. And

472
00:33:32,559 --> 00:33:39,320
maybe Jack Dorsey and Elon Musk and Sam

473
00:33:36,279 --> 00:33:41,919
Altman, especially

474
00:33:39,320 --> 00:33:46,640
Sam Altman, are going to enslave all of us. Yeah, definitely. Especially Sam Altman.

475
00:33:44,799 --> 00:33:51,200
And those are the two options. Elon just wants to impregnate all of us. He can be

476
00:33:49,519 --> 00:33:55,760
persuasive. I think this has been a huge wakeup call

477
00:33:52,799 --> 00:33:59,600
for people about, uh, the Chinese position on the value chain. America has traditionally

478
00:33:58,000 --> 00:34:05,840
been positioned very high on the value chain and then I think in a lot of North

479
00:34:02,640 --> 00:34:08,159
American views, uh, you go down the

480
00:34:05,840 --> 00:34:11,440
value chain from America and end up in China and a few other countries at

481
00:34:09,839 --> 00:34:15,679
the bottom of the value chain. But I think this and uh the tariff war going

482
00:34:13,679 --> 00:34:19,200
on and a few other things have revealed that China is much higher on the value

483
00:34:17,280 --> 00:34:24,879
chain than we previously thought and has the active ability now to take rungs on

484
00:34:22,879 --> 00:34:30,240
the value chain. Yes. And I think one of those Yeah.

485
00:34:28,800 --> 00:34:36,480
No, I was just saying that because I was thinking about a recent story that just came up, where, uh, they have, uh, chips

486
00:34:34,159 --> 00:34:39,679
that they say are more performant than NVIDIA's equivalent. But anyways, go

487
00:34:38,320 --> 00:34:44,000
ahead. Yeah. I'm like, I don't know about that being true, but they're

488
00:34:41,919 --> 00:34:49,440
progressing very quickly in a lot of these fields: CPUs, GPUs, AI

489
00:34:46,399 --> 00:34:50,560
technology, movies, random one. Um, lots

490
00:34:49,440 --> 00:34:56,320
and lots and lots of fields where they're really taking major steps forward. One of those places—

491
00:34:53,760 --> 00:35:00,480
insane story that I don't understand how it doesn't get talked about more,

492
00:34:57,760 --> 00:35:06,320
but uh the quote that I have here is Ford CEO Jim Farley admitted he's been

493
00:35:02,800 --> 00:35:08,160
driving a Xiaomi SU7 for six months and

494
00:35:06,320 --> 00:35:13,280
doesn't want to give it up. Wow. The Ford CEO drives a $30,000 Chinese

495
00:35:11,119 --> 00:35:17,200
electric vehicle— sorry, Chinese-made electric vehicle— um, unavailable in the

496
00:35:15,440 --> 00:35:21,839
US due to tariffs and safety regulation issues. Is this something that he

497
00:35:18,720 --> 00:35:23,119
revealed, uh, on a podcast, like, of

498
00:35:21,839 --> 00:35:27,520
his own volition? We are now— we're basically the Ford CEO. Yes, he did. It

499
00:35:25,760 --> 00:35:31,200
was of his own volition. Why would he do that? I feel like that's— the CEO of a motor

500
00:35:29,440 --> 00:35:35,520
company mentioning that he didn't want to give up a $30,000 vehicle from a

501
00:35:33,839 --> 00:35:39,680
competing company because it was so good. That should be Is he about to cash

502
00:35:38,240 --> 00:35:43,920
out or something? Like what? Stunning wakeup call to people about the

503
00:35:41,520 --> 00:35:48,480
automotive industry outside of America. Yeah. I think in EVs in particular, oh

504
00:35:46,160 --> 00:35:53,359
yeah, China is, like, leapfrogging the auto industry— while ours are falling apart and have panel gaps

505
00:35:50,640 --> 00:35:56,560
that I can shove my fist through. Um and are like,

506
00:35:55,119 --> 00:36:01,440
you know, they're just partying on Slack, sharing people's videos and

507
00:35:59,119 --> 00:36:04,000
stuff like that. So like if you're And I think that's a big part of the problem

508
00:36:02,640 --> 00:36:08,480
too, where, like, when you think about Chinese products, people traditionally go like, "Oh, but, like,

509
00:36:06,720 --> 00:36:15,119
they're going to steal your information, right?" It's like, "Yeah, but we know

510
00:36:12,320 --> 00:36:18,800
that our companies are Yeah, for sure. Well, yeah. Tons of documented evidence.

511
00:36:17,200 --> 00:36:22,400
That's a whole other conversation, I feel like, which

512
00:36:20,400 --> 00:36:27,040
I'm still I don't know what to think about cuz that's been the main reason

513
00:36:25,040 --> 00:36:32,839
why I'm like, ah, I don't want to use that, like, Chinese phone or whatever.

514
00:36:29,520 --> 00:36:35,760
Who knows what data they're sending.

515
00:36:32,839 --> 00:36:39,920
But I know my data is going out to all these other companies. So, like, at a

516
00:36:37,359 --> 00:36:43,680
certain point it's like this is the future. This is what they're

517
00:36:41,839 --> 00:36:47,440
saying. They're like IP doesn't matter. Your personal information doesn't

518
00:36:44,880 --> 00:36:52,560
matter. You'll own nothing, including your own personal information and your

519
00:36:49,920 --> 00:36:56,720
own thoughts, if you can't sell them anymore— your own likeness— and you'll be

520
00:36:54,240 --> 00:37:01,280
happy. Yeah, but we'll have to see. I don't know if this tweet that Jack

521
00:36:58,960 --> 00:37:07,200
Dorsey reposted is going to come to pass. So he retweeted an

522
00:37:04,160 --> 00:37:10,240
interview with Microsoft AI CEO Mustafa

523
00:37:07,200 --> 00:37:12,800
Suleyman, who says the future isn't UBI,

524
00:37:10,240 --> 00:37:16,880
universal basic income— it's UBP, universal basic

525
00:37:14,359 --> 00:37:20,599
provision. Abundant intelligence as the new currency. You won't need more money

526
00:37:18,800 --> 00:37:26,079
because knowledge won't be something you buy. Not cash, capability. We won't need

527
00:37:24,079 --> 00:37:31,920
cash because we'll have AI assistants who can do everything for us. You don't

528
00:37:28,800 --> 00:37:34,640
need to go to the store and buy a pop, a

529
00:37:31,920 --> 00:37:39,760
soda. Your AI will just synthesize it for you. Okay, this feels like that

530
00:37:37,520 --> 00:37:46,240
South Park episode where Randy can't fix his oven because he can't fix the oven

531
00:37:41,440 --> 00:37:47,920
door. And then the, like, handyman

532
00:37:46,240 --> 00:37:51,520
people end up being the wealthiest people in society because no one can fix

533
00:37:50,079 --> 00:37:55,359
anything, because no one knows how to do anything. Mhm. Um, and they— I don't

534
00:37:54,000 --> 00:38:00,160
remember if this happens in the episode, but I'm going to make it up, I guess. Um

535
00:37:57,599 --> 00:38:02,880
I'm hallucinating. Um, but they, like, ask an AI to fix the door, and it just

536
00:38:01,760 --> 00:38:07,040
tries to teach him how to do it, but he's like, "No, I want you to fix the door." It keeps giving him steps, but

537
00:38:05,760 --> 00:38:12,240
he's like, "No, I want you to fix the door." Like, "We don't have things that can do our laundry right now. We don't

538
00:38:10,240 --> 00:38:14,560
have things that can build houses. We don't have things that can fix plumbing.

539
00:38:13,599 --> 00:38:21,119
We don't have things that can do any of that kind of stuff." It's like he's talking about provisions of like this

540
00:38:18,160 --> 00:38:25,599
thing that can draw Studio Ghibli for me, um, or, like, write a terrible email or a

541
00:38:23,839 --> 00:38:29,119
really really really bad YouTube video that would get one view. Like is that

542
00:38:27,440 --> 00:38:33,119
really I'm really excited about that being a universal provision. Thanks,

543
00:38:31,520 --> 00:38:36,880
bro. I think that's what makes the whole conversation just so hard to come

544
00:38:34,640 --> 00:38:41,839
down on one side of because it's so uncertain. You need your apartment to

545
00:38:38,960 --> 00:38:47,680
not leak. Is ChatGPT going to help you with that? Yeah. Like, it's hard,

546
00:38:45,040 --> 00:38:51,520
and there's droughts and— uh, like, there's peaks and valleys, too.

547
00:38:49,440 --> 00:38:54,560
Like, sometimes it seems like the whole AI thing is just hype, and sometimes

548
00:38:53,119 --> 00:38:57,680
it's like, oh, this is— okay, it's happening. Stunning. Yeah. Yeah. Well,

549
00:38:56,560 --> 00:39:04,240
we're not going to solve it in this podcast. No. But maybe in the next one.

550
00:39:01,440 --> 00:39:10,720
So subscribe, possibly. Uh, I wouldn't want to miss it. It might happen. Maybe

551
00:39:06,560 --> 00:39:14,400
we'll invent AGI right here. An ASGI.

552
00:39:10,720 --> 00:39:17,040
Watch out, OpenAI.

553
00:39:14,400 --> 00:39:23,440
We're going to delete your IP. Hey, thanks for watching. Uh, that was a

554
00:39:19,040 --> 00:39:24,800
TalkLinked. Uh, subscribe to TechLinked. Uh,

555
00:39:23,440 --> 00:39:30,240
see you later. Maybe I'll do another one of these in half a year. Nope. Never

556
00:39:26,720 --> 00:39:30,240
happening. Never again. Bye.
