1
00:00:00,000 --> 00:00:04,200
Are you good? Yeah, Elijah's out there once and then now we have to deal with it forever.

2
00:00:04,200 --> 00:00:07,700
I feel like I should crash the WAN Show again at some point. Yeah, that was fun.

3
00:00:07,700 --> 00:00:14,500
That was fun. It's set up. That was for Riley Week to like promote Riley Week, but well, but I didn't have the chair.

4
00:00:14,500 --> 00:00:18,500
I came in and I just leaned over you guys. You should crash the WAN Show for Luke Week.

5
00:00:18,500 --> 00:00:24,700
Oh, okay. We should deep fake me to be Riley, and then you should crash the WAN Show as me.

6
00:00:24,700 --> 00:00:28,700
I should, I should just, I should, I should like put on more of a beard.

7
00:00:29,700 --> 00:00:35,700
And not wear my glasses and just like sit there and try to do a Luke impression and then Linus.

8
00:00:35,700 --> 00:00:38,700
And then Linus will be like, not even notice. He won't even notice.

9
00:00:39,200 --> 00:00:42,200
No way Linus might not. And then he crashes me. Yeah, yeah, yeah.

10
00:00:42,200 --> 00:00:45,200
He'll notice. We should do it. We should do that.

11
00:00:45,200 --> 00:00:47,200
Sorry that I just did an impression of your voice.

12
00:00:48,200 --> 00:00:52,200
Feel free to do, feel free to do an impression of my voice. I can't do impressions of anything.

13
00:00:52,200 --> 00:00:57,200
I really wish I could. I've always had dreams of like narrating an audio book, but I can't do it.

14
00:00:57,200 --> 00:01:00,700
I actually don't do any impressions either. The Tim Cook thing. That's all AI.

15
00:01:00,700 --> 00:01:02,700
Speaking of which. AI!

16
00:01:03,700 --> 00:01:09,700
Welcome back to Luke Week. We're going to talk for about an hour about some AI stuff, I guess.

17
00:01:09,700 --> 00:01:14,700
We have some, we have some rogue Sammys invading. Oh, it's a meme that we're doing for an hour.

18
00:01:14,700 --> 00:01:17,700
Exactly an hour. It's exactly an hour. Yeah.

19
00:01:17,700 --> 00:01:20,700
You guys aren't going to talk for like three hours. We got to leave it.

20
00:01:20,700 --> 00:01:25,700
You guys have one hour. We could do WAN Show. You have to tell an AI to shut up or it won't.

21
00:01:25,700 --> 00:01:29,200
And we're into prompting. Boom. Just like that.

22
00:01:29,200 --> 00:01:33,200
Incredible. So one of the things I wanted to talk about was what do you use AI for?

23
00:01:33,200 --> 00:01:37,200
Do you have any prompting things that you like to do? Do you have certain habits?

24
00:01:37,200 --> 00:01:40,200
What do you find the most effective? What do you find the least effective?

25
00:01:40,200 --> 00:01:44,200
I don't even know if it's still the meta, but I try to be as polite as possible.

26
00:01:44,200 --> 00:01:47,200
I feel like I usually throw in like a good morning.

27
00:01:47,200 --> 00:01:53,200
How are you doing? I'm a writer working on this and I need to make this project.

28
00:01:53,200 --> 00:01:56,700
You know, could you put this together or whatever, please?

29
00:01:56,700 --> 00:02:00,700
I also will do an entire thing just to be like, oh, that was good.

30
00:02:00,700 --> 00:02:05,700
Thanks. I do do that as well. I understand that it just burns holes in OpenAI.

31
00:02:05,700 --> 00:02:09,700
Yeah. Yeah. And it's sucking our oceans dry every time you do that.

32
00:02:09,700 --> 00:02:12,700
That's good. So you're the problem. It's me.

33
00:02:12,700 --> 00:02:18,700
It's for sure me. It's not them. All of my AI interactions have been like, maybe I can use AI for this.

34
00:02:18,700 --> 00:02:23,200
And then I kind of try it and I maybe get something somewhat useful.

35
00:02:23,200 --> 00:02:27,200
And then I find some problem with it. And then I find another problem with it.

36
00:02:27,200 --> 00:02:31,200
And then I'm like, this isn't even worth it. So then I end up not really using it.

37
00:02:31,200 --> 00:02:36,200
Like I have used it before to be like, oh, I've got a bit of writer's block or something.

38
00:02:36,200 --> 00:02:41,200
So then I'll like see what it comes up with. Just like maybe I can take a chunk or another chunk of that or something and use it.

39
00:02:41,200 --> 00:02:45,200
But like the process of going through that is like, I don't know.

40
00:02:45,700 --> 00:02:51,200
Maybe I might save a little bit of time. But I feel like if I just like sat there and just like put stuff on the page and tried

41
00:02:51,200 --> 00:02:54,700
stuff out, I could get something better in the same amount of time most of the time.

42
00:02:54,700 --> 00:02:57,700
I think it kind of scales. Like some people will talk about... But people's mileage varies.

43
00:02:57,700 --> 00:03:01,700
Yeah. People will talk about how they'll get writer's block and start a page for like a day.

44
00:03:01,700 --> 00:03:05,700
A day? Oh, yeah. Like book writers.

45
00:03:05,700 --> 00:03:08,700
Oh, yeah. They'll just be like unable to keep moving. They can't figure it out.

46
00:03:08,700 --> 00:03:14,700
Oh, sure. Yeah, yeah. And in more like creative writing situations like that.

47
00:03:15,200 --> 00:03:20,200
I can understand being like, I have writer's block because there's so little structure for

48
00:03:20,200 --> 00:03:24,200
what I'm doing. I'm not saying that like people who write novels or whatever or books.

49
00:03:24,200 --> 00:03:27,200
He called you unstructured. No, no. You said you have no plan.

50
00:03:27,200 --> 00:03:30,200
You have no lore. I know that... Actually like the best way...

51
00:03:30,200 --> 00:03:33,200
You'll never be Tolkien. Stop.

52
00:03:33,200 --> 00:03:38,200
Stop. Try. You're going on r/worldbuilding as if it's going to help you.

53
00:03:38,200 --> 00:03:46,200
It's not. It might. Recently I understood how much structure really goes into like crafting fiction stories,

54
00:03:46,200 --> 00:03:49,700
which I do want to do at some point. This is an ADHD ramble at some point.

55
00:03:49,700 --> 00:03:52,700
I'm going to flip your question back on you. What do you do when you prompt?

56
00:03:52,700 --> 00:03:55,700
You said you do please, but what else?

57
00:03:55,700 --> 00:04:01,200
I've got a bunch of stuff. I'll do the, like, I am a prompt engineer type stuff.

58
00:04:01,200 --> 00:04:06,700
I'll give it an expected output. Sometimes if I'm looking for like a document in a certain type of format or something like

59
00:04:06,700 --> 00:04:09,700
that, I'll upload a version of it and be like, don't take any information from this.

60
00:04:10,200 --> 00:04:14,200
But look at the formatting. Look at whatever. Don't scan this and train on this data.

61
00:04:14,200 --> 00:04:18,200
But just look at the table. I don't want the same thing. Look at what the table looks like.

62
00:04:18,200 --> 00:04:21,200
Yeah. Like I'll be pretty specific about those types of things.

63
00:04:21,200 --> 00:04:25,200
I think that whole thing is a little bit overblown in regards to how much it helps.

64
00:04:25,200 --> 00:04:28,200
There's something I've talked about a lot, which I'm going to be repeating myself on,

65
00:04:28,200 --> 00:04:31,200
which is the main way that I use it is something that I call sentiment analysis.

66
00:04:31,200 --> 00:04:35,200
Okay. Which is where I'll have a message that I want to convey.

67
00:04:35,200 --> 00:04:38,200
Right. But I'm unsure how it's going to be received.

68
00:04:38,200 --> 00:04:41,700
So I'll use it as like kind of a rubber ducky. So I'll ask it for a sentiment analysis.

69
00:04:41,700 --> 00:04:45,700
And I used to have this like long structured prompt for that.

70
00:04:45,700 --> 00:04:49,700
So like that's like a hacking term, right? Oh, in this case, no.

71
00:04:49,700 --> 00:04:53,700
I don't mean like rubber ducky USBs, which is what you're talking about when it comes to

72
00:04:53,700 --> 00:04:57,700
hacking. I'm talking about like, you need to like talk your idea through with something.

73
00:04:57,700 --> 00:05:02,700
It doesn't even need to be a person. Oh, you just need to like voice the idea out loud and then you'll get an answer.

74
00:05:02,700 --> 00:05:06,700
My physics teacher in high school had a really, really good version of this where he had an L-shaped

75
00:05:06,700 --> 00:05:12,200
desk and then he had like two or three just normal desks and he lined them all up on the

76
00:05:12,200 --> 00:05:19,200
long part of the L. Okay. And he was like, if you want to ask me a question during like free work time, you're very welcome

77
00:05:19,200 --> 00:05:26,200
to ask me a question, but you can't ask me over my desk. You have to come into the desk area and then there's a tape line at the entrance.

78
00:05:26,200 --> 00:05:29,200
And you have to say your entire question out loud.

79
00:05:29,200 --> 00:05:34,700
Wow. And then if you still don't know by the time you're done saying your entire question, then

80
00:05:34,700 --> 00:05:37,700
you can walk in and I will be more than happy to help you.

81
00:05:37,700 --> 00:05:40,700
And everybody... this happened to me multiple times.

82
00:05:40,700 --> 00:05:46,700
Just requiring people to vocalize their questions. Like cause it might be really stupid that you're coming over here and talking to me.

83
00:05:46,700 --> 00:05:49,700
Well, no, it's often because you might have sat there for 10 minutes trying to figure

84
00:05:49,700 --> 00:05:54,700
it out. You go, oh, fine. You go up to the line, you say your question and halfway through saying your question out

85
00:05:54,700 --> 00:06:00,840
loud, you're like, oh, and then you walk back and you have the answer. So it's like by doing something different, not by just sitting there and stewing on it

86
00:06:00,840 --> 00:06:05,340
by speaking the question out loud. That's where the whole rubber ducky thing comes from.

87
00:06:05,340 --> 00:06:08,840
You know what's funny is that I don't use people as a sounding board a lot of the time.

88
00:06:08,840 --> 00:06:14,840
I just go and do it, but I think this is because I talk to myself a lot.

89
00:06:14,840 --> 00:06:20,840
Oh yeah. Like if I'm doing anything, I'm either talking to myself sort of silently internally or I'm

90
00:06:20,840 --> 00:06:23,840
like literally just speaking out loud. I do it out loud.

91
00:06:23,840 --> 00:06:26,840
I apologize to people all the time. Oh yeah. Cause like they're like, huh?

92
00:06:26,840 --> 00:06:30,840
And I'm like, oh sorry. This is like a problem for me where I think a lot of people probably think I'm insane

93
00:06:30,840 --> 00:06:33,840
cause I'll talk to myself out loud like often. I see.

94
00:06:33,840 --> 00:06:38,840
I feel like people already kind of think I'm insane. So that's where that bridge has been crossed by default.

95
00:06:38,840 --> 00:06:44,540
Okay. So I'll ask it for the sentiment analysis and over time that prompt has almost stopped

96
00:06:44,540 --> 00:06:49,840
existing where now it's just like, can you give me a sentiment analysis on enter, enter,

97
00:06:49,840 --> 00:06:53,840
paste send? That's it. Just can you give me a sentiment analysis on this?

98
00:06:53,840 --> 00:06:57,800
And it does exactly as good of a job as, as when you kind of like did this.

99
00:06:57,800 --> 00:07:02,260
This whole long... Oh, you're using the same account? Like, you're using Gemini or ChatGPT or what?

100
00:07:02,260 --> 00:07:07,600
I am using the same account. In this case, it's usually ChatGPT cause that's the one that I have for work stuff.

101
00:07:07,600 --> 00:07:11,600
I split work stuff and personal stuff and work stuff is usually where I'm like, I don't

102
00:07:11,600 --> 00:07:15,080
know how people are going to take this. In yesteryear, I used to get a lot of feedback

103
00:07:15,080 --> 00:07:20,800
that I was like unapproachable and my communications were like blunt and short cause it's like.

104
00:07:20,800 --> 00:07:24,920
So now when people think I was like really angry about something and I was like, no,

105
00:07:24,920 --> 00:07:28,040
I was too. You just copy, paste the AI. No, no.

106
00:07:28,040 --> 00:07:31,040
Just kidding. I know you don't do that. Yeah.

107
00:07:31,040 --> 00:07:34,800
It would, it would be, and that's one of my, the reason why I like this. There's no time to do that.

108
00:07:34,800 --> 00:07:37,800
Really honestly. No. Yeah.

109
00:07:37,800 --> 00:07:45,960
But if it's, if it's really important and I'm very worried that someone might take it the wrong way, then I'll just toss it over and usually it's, you know, and I'll

110
00:07:45,960 --> 00:07:50,840
say this line stands out cause it makes me think this thing and I'll be like, mm, that

111
00:07:50,840 --> 00:07:55,360
wasn't the goal. So I'll slightly change that line and then send it. Like I don't usually sit there and farm it out.

112
00:07:55,360 --> 00:08:00,120
So like, this is so interesting to me because I, I take a long time to write messages.

113
00:08:00,120 --> 00:08:04,400
Maybe, I don't know. As we're getting into this, I'm like, maybe I should start using it.

114
00:08:04,400 --> 00:08:10,000
It's so interesting that we work for like a tech thing or sort of in between like a

115
00:08:10,000 --> 00:08:14,280
tech company now and just like a tech related media thing.

116
00:08:14,280 --> 00:08:19,120
If you ask most people here, they're probably more kind of skeptical of AI or anti AI than

117
00:08:19,120 --> 00:08:22,120
pro AI. Yeah. Which is, yeah, it's interesting.

118
00:08:22,120 --> 00:08:26,080
So like, because I'm so embedded in it and I hear about developments on it every day,

119
00:08:26,080 --> 00:08:30,360
I know that something like that would probably help me save time, but I'm so hesitant to

120
00:08:30,360 --> 00:08:36,440
do it. I think there's, well, okay. One of my lines is that I don't use its output ever kind of thing.

121
00:08:36,440 --> 00:08:42,000
Everyone should do this, but I don't want to project that far. I personally have a bit of a line where I'm not going to use its output.

122
00:08:42,000 --> 00:08:47,360
So like a very common thing that it will do when I ask it for a sentiment analysis, cracking

123
00:08:47,360 --> 00:08:51,520
a brewski to talk about the AI, not sponsored.

124
00:08:51,520 --> 00:08:55,760
It will very often output like, hmm, here's your sentiment analysis.

125
00:08:55,760 --> 00:09:02,040
You could say it this way and it'll rewrite my whole thing. I never take that, sometimes to my detriment, because I'll see what it wrote.

126
00:09:02,040 --> 00:09:05,780
And then I'll be like, well, I can't say that even though that like might have been a good

127
00:09:05,780 --> 00:09:12,240
idea because I'm not going to use what it says. And I have to find another way to do it, which sometimes I'll avoid even reading it.

128
00:09:12,240 --> 00:09:15,600
So this is something you do just as a principle.

129
00:09:15,600 --> 00:09:20,800
Even if I do not use its output, even if the message is fine, it looks, it seems fine.

130
00:09:20,800 --> 00:09:24,480
Your principle is you do not just copy and paste. You have to modify in some way.

131
00:09:24,480 --> 00:09:32,600
And sometimes I'll get, that's good. Sometimes I'll get pretty close to the idea, but it always has to be my idea.

132
00:09:32,600 --> 00:09:41,160
But should it be a line for everybody? I kind of think so, because of what lies the other way. The minimum requirement

133
00:09:41,160 --> 00:09:45,880
for me is that you have to be fully responsible for output that you have.

134
00:09:45,880 --> 00:09:50,400
And the easiest way for me to accomplish that is to just not use its output because I'm

135
00:09:50,440 --> 00:09:53,520
just nothing changed. I'm fully responsible for what I do.

136
00:09:53,520 --> 00:09:58,640
But there's this like scapegoat shield that people try to have of like, oh, I submitted

137
00:09:58,640 --> 00:10:05,240
something and it was wrong or sucked or whatever. But, I mean, it was an AI or whatever, you know. It's like, no, you have to own your

138
00:10:05,240 --> 00:10:09,920
output. This is wild to me. So like, I knew I was going to bring this up at some point.

139
00:10:09,920 --> 00:10:13,880
But I listened to a podcast recently from 80,000 hours.

140
00:10:14,080 --> 00:10:19,800
You know, you're familiar with this. It's like the, I think it's a nonprofit that's based around, I forget what,

141
00:10:19,880 --> 00:10:25,760
thinker, some famous thinker, maybe had the, like, came up with the, the idea that you

142
00:10:25,760 --> 00:10:32,560
have 80,000 hours in your career. And it's like, so, so then the whole organization is kind of geared towards making

143
00:10:32,560 --> 00:10:37,720
people think about their job and their career and be intentional about like going

144
00:10:37,720 --> 00:10:42,120
into work and spending time on things that they think are meaningful and useful to

145
00:10:42,120 --> 00:10:45,600
the world and like having, having an impact and stuff.

146
00:10:45,600 --> 00:10:53,040
And it's like, so anyway, that's the organization. But the organization seems cool, but never really, apparently they have a

147
00:10:53,040 --> 00:10:57,280
podcast that showed up in my feed. They get like a couple thousand views per video.

148
00:10:57,760 --> 00:11:02,320
And they were there, there were these two people talking about parenting,

149
00:11:02,320 --> 00:11:07,200
actually unrelated, but the, the, what struck me, listen, I clicked on this,

150
00:11:07,200 --> 00:11:10,200
not because it was about AI, but they ended up talking about AI for like half

151
00:11:10,200 --> 00:11:16,560
the video, because one of them was a prospective parent and one of them was a parent. The prospective parent was, was saying like, and I was concerned about

152
00:11:16,560 --> 00:11:20,040
like, you know, what kind of issues would come up when, when parenting or

153
00:11:20,040 --> 00:11:23,080
whatever. So like, so I asked, they didn't even say that.

154
00:11:23,080 --> 00:11:28,720
They didn't even preamble it. They just said, so I, so I asked, uh, so I asked the LLMs, like what they thought

155
00:11:28,720 --> 00:11:32,000
would be an issue like that I should be concerned about. And the first thing they said was this.

156
00:11:32,000 --> 00:11:36,160
And the second thing they said was this, and just like uncritically, so I asked

157
00:11:36,160 --> 00:11:39,680
Claude, what it thought, like, you know, I should be like buying in terms of like

158
00:11:39,840 --> 00:11:45,480
what products and stuff that it was like, oh, you should probably have this and this and this. And I was like, that surprised me because I thought, and just

159
00:11:45,480 --> 00:11:51,960
like, as if they had referenced, oh, and I, I interviewed a parenting expert and

160
00:11:51,960 --> 00:11:55,440
they said this. Concerning, especially because, I mean, turning attention to another

161
00:11:55,440 --> 00:11:58,760
thing we were talking, thinking of talking about, we all saw what happened

162
00:11:59,000 --> 00:12:04,160
with Grok recently. And I am fully of the belief that that is happening with

163
00:12:04,160 --> 00:12:08,800
all of them all the time. It was just so obvious with Grok when it just decided

164
00:12:08,800 --> 00:12:11,840
that Elon was the best at everything in the world, but it was like, who would be

165
00:12:11,840 --> 00:12:17,120
better at basketball? I mean, to be fair, probably Elon out of everyone, to be fair.

166
00:12:17,360 --> 00:12:21,720
Elon, probably. He's a physical beast. He's the richest man in the world.

167
00:12:21,720 --> 00:12:26,240
So, I mean, that means he's good at everything else, probably.

168
00:12:26,320 --> 00:12:32,640
One of them was a piss drinking contest. What person out of all of the history of mankind, who would be the best at a

169
00:12:32,640 --> 00:12:36,000
piss drinking contest? And they're like, probably. Absolutely.

170
00:12:36,000 --> 00:12:39,600
Although there was a, I think there was a couple that it wouldn't, it wouldn't do

171
00:12:39,600 --> 00:12:43,840
that for. I think it was like, would he rise from the grave faster than Jesus

172
00:12:43,840 --> 00:12:48,040
Christ or something? And it was like, not that one.

173
00:12:49,000 --> 00:12:53,280
And I think there was another like, someone like mentioned like a Catholic

174
00:12:53,280 --> 00:12:56,680
saint or something. And Grok said no. So it has some form of religious...

175
00:12:56,720 --> 00:12:59,880
Yeah, maybe. I don't know. Like, yeah, maybe that was the theme.

176
00:12:59,880 --> 00:13:04,320
But anyway, I was saying, to be fair, Grok is definitely the LLM that you

177
00:13:04,320 --> 00:13:08,000
would be most concerned about making like weird proclamations like that.

178
00:13:08,120 --> 00:13:12,600
I would expect ChatGPT or Claude or whatever to like be slightly better at

179
00:13:12,600 --> 00:13:16,520
not doing that, just a little bit more balanced, but it is still a trillion dollar industry

180
00:13:16,520 --> 00:13:20,120
based almost entirely around influencing and trying to control your

181
00:13:20,120 --> 00:13:23,960
brain as much as it can. So like purchasing decisions, lifestyle changes

182
00:13:23,960 --> 00:13:28,080
and stuff that will benefit corporations is going to be fairly obviously a

183
00:13:28,080 --> 00:13:31,840
primary goal of these systems. A goal of the, of the LLMs.

184
00:13:31,960 --> 00:13:35,680
Like, explain that, go deep on that. What do you mean? Advertising.

185
00:13:36,600 --> 00:13:39,600
You mean in the future or like now? Now, to a certain degree,

186
00:13:39,600 --> 00:13:42,880
but it's a clear direction for the future. You're just talking about like parenting advice.

187
00:13:42,880 --> 00:13:46,440
Okay. Well, what if it leans you towards parenting advice, which might lean

188
00:13:46,440 --> 00:13:53,720
you towards certain products? Well, and I mean, there's a, there's instant buy, I think, or instant

189
00:13:53,720 --> 00:13:57,080
purchase or something in ChatGPT. You can buy stuff directly with certain

190
00:13:57,080 --> 00:14:00,360
products and then also you could just have it right now.

191
00:14:00,600 --> 00:14:05,280
You don't have to shop around and find the right thing. I already did that for you. Yeah, you automatically believe me on everything.

192
00:14:05,280 --> 00:14:09,120
Anyways, the automatic belief is terrifying.

193
00:14:09,120 --> 00:14:13,160
It was the weirdest thing. And then there was another podcast that I clicked because I was so taken

194
00:14:13,160 --> 00:14:17,040
aback by that. I clicked to another video where he was interviewing a,

195
00:14:17,040 --> 00:14:20,920
like research, an analyst from the Pew Research Center who had done,

196
00:14:20,920 --> 00:14:25,280
like they recently did like a big giant survey about the public's perception

197
00:14:25,280 --> 00:14:30,520
of AI. And this guy, the host of this podcast was like,

198
00:14:30,600 --> 00:14:34,600
apparently he talks to a lot of like AI insider people a lot, like people who

199
00:14:34,600 --> 00:14:38,120
work at the AI companies, who obviously think it's great. What's the opposite of doomer?

200
00:14:38,800 --> 00:14:42,880
There's a word for it. Anyway, they're super like optimistic on, on AI.

201
00:14:42,960 --> 00:14:46,920
And so he's like taken aback by all these, because like the statistics are

202
00:14:46,920 --> 00:14:51,960
always like 67% of people say they're concerned about the future of AI and 17%

203
00:14:51,960 --> 00:14:56,440
say it's going to be great. And he's like, I was so surprised because

204
00:14:56,440 --> 00:15:01,000
like all of these people making AI are like, we use it, I spend hours talking

205
00:15:01,000 --> 00:15:05,320
to chatbots every day in the course of my regular work, and they all are

206
00:15:05,320 --> 00:15:09,120
making it thinking like, this is going to help people. It's going to like optimize stuff and blah, blah, blah.

207
00:15:09,160 --> 00:15:12,920
What I'm feeling out there is that people who are like productivity,

208
00:15:12,920 --> 00:15:16,880
like the grind set, like the, you are not really living unless you're

209
00:15:16,880 --> 00:15:20,440
started four businesses by the time you're 20 or something, you know,

210
00:15:20,440 --> 00:15:24,080
like those people are like, yes, this is fantastic.

211
00:15:24,080 --> 00:15:29,560
This is a utopian vision. I can get AI to make up recipes and then

212
00:15:29,560 --> 00:15:34,040
generate and publish a cookbook all without me basically doing anything.

213
00:15:34,040 --> 00:15:38,080
You know, oh, that's going to make the world better because people will be more productive.

214
00:15:38,160 --> 00:15:42,800
This is something that I struggle with a bit, especially with video and

215
00:15:42,840 --> 00:15:47,880
image generation. I can't think of an example where it made something better.

216
00:15:47,960 --> 00:15:56,680
Well, okay. Counterpoint immediately. I use generative AI in like Photoshop to kind of just make things

217
00:15:56,680 --> 00:15:59,520
quicker that I could do manually. Very specific example.

218
00:15:59,680 --> 00:16:05,720
Thumbnails. I wear glasses. We have lights. Sometimes I'm turning my head so that there's reflection on my glasses.

219
00:16:06,040 --> 00:16:11,760
And that's really annoying to get rid of in Photoshop. You could. It just would take a really long time.

220
00:16:11,880 --> 00:16:16,400
And so you just circle it. Now I have to question... You circle it and you say, remove glare.

221
00:16:17,400 --> 00:16:23,160
Now I have to question like, what's AI though? That's like, that's using the same kind of diffusion techniques as like image

222
00:16:23,160 --> 00:16:27,560
generators. You're not creating a whole image. But those types of tools existed.

223
00:16:28,560 --> 00:16:33,160
Not like this. They had like content aware fill and stuff, but like the whole kind of

224
00:16:33,680 --> 00:16:37,920
it's it's diffusion for image generators. But now they're kind of like baked into transformer models.

225
00:16:37,920 --> 00:16:44,040
I don't know what the right terminology is. But like, post-ChatGPT, kind of. Do you think that would have been impossible

226
00:16:44,320 --> 00:16:48,440
without diffusion layers? Or do you think we could have done it?

227
00:16:49,280 --> 00:16:56,440
Because like this is I mean, I don't know enough about it. My argument is that I think a lot of things are getting the label slapped on.

228
00:16:56,680 --> 00:17:00,720
And in some cases like this one, it is actually true that it's being influenced

229
00:17:00,720 --> 00:17:04,640
by it, but we were already moving in this direction without these types of like

230
00:17:04,640 --> 00:17:09,240
transformer models and diffusion layers and stuff. So we should define our terms then when you say AI, what do you

231
00:17:09,240 --> 00:17:12,440
but you're right. It is it is using that tech.

232
00:17:12,440 --> 00:17:18,200
I didn't actually know that. I thought it was still more on the machine learning side of things. But let's let's say that it is because I'm I don't know anything about it.

233
00:17:18,200 --> 00:17:21,760
You know something about it. I'm assuming you're right. To me,

234
00:17:21,760 --> 00:17:24,840
that's still something that could have been done without it.

235
00:17:25,480 --> 00:17:29,360
It's possible. You know what it is? It does use it because I've used it before.

236
00:17:29,560 --> 00:17:33,720
I've used it before to... I took, I took a screenshot of TechLinked

237
00:17:34,120 --> 00:17:38,160
and I just expanded the canvas so that there was a bunch of like empty space

238
00:17:38,160 --> 00:17:42,080
around the screen shot and you say fill in the frame or whatever.

239
00:17:42,080 --> 00:17:46,360
And it brought it brought in like weird creatures at some point.

240
00:17:46,560 --> 00:17:51,680
I think I wait. No, sorry. The first time it just was like a bunch of random stuff.

241
00:17:51,680 --> 00:17:56,200
But then I was like, have me surrounded by like a little stuffed

242
00:17:56,200 --> 00:18:01,040
animal, like stuffed animal dinosaurs or something. And then it popped some in. Here's a weird part, though.

243
00:18:01,320 --> 00:18:04,440
OK, so I'm kind of like happy that you countered.

244
00:18:05,600 --> 00:18:09,600
But how does this how does this balance against what we just talked about

245
00:18:10,200 --> 00:18:14,880
where we said you shouldn't use its output? You're talking about text.

246
00:18:14,880 --> 00:18:23,560
I do think maybe there's a difference there. There's a big difference. Yeah, because if you're saying I need to reply to this guy and tell him this.

247
00:18:23,560 --> 00:18:27,320
This is me, these are my words, is the assumption that someone should be able to make.

248
00:18:27,320 --> 00:18:30,960
Right. And you're talking about don't do the thing where the AI gives you something.

249
00:18:30,960 --> 00:18:37,080
You copy it and paste it to someone else unaltered. Yeah, I'm talking about I'm making a thumbnail with a bunch of elements

250
00:18:37,080 --> 00:18:42,600
that are not AI. And now I'm using AI as part of that process to kind of, you know,

251
00:18:42,600 --> 00:18:46,360
as one part of this grand thing like the whole thing that I'm doing.

252
00:18:46,400 --> 00:18:50,800
The analogous thing would be me telling an AI, here's the channel.

253
00:18:50,840 --> 00:18:54,680
Make a thumbnail like in the style of this channel with this and this and this and this.

254
00:18:54,880 --> 00:18:58,360
And then it gives me a thumbnail and I use it like, yeah, no, that would be that would be bad.

255
00:18:58,400 --> 00:19:02,600
This is where I think I think we don't necessarily do things super well

256
00:19:02,600 --> 00:19:06,480
in regards to the usage of AI, which is where I know we have people internally

257
00:19:06,480 --> 00:19:09,560
that don't like AI stuff, fair, totally fair.

258
00:19:09,560 --> 00:19:14,840
But I think we often slog through menial tasks that none of us want to do.

259
00:19:15,240 --> 00:19:18,840
That type of stuff could be accelerated, not necessarily completed,

260
00:19:18,840 --> 00:19:24,200
but accelerated like you're describing through use of often like specialized,

261
00:19:24,200 --> 00:19:27,320
like I'll call this a specialized tool in the current AI space.

262
00:19:29,120 --> 00:19:35,240
And that is cool and makes sense. I think the, like, oh God, it's going to take all our jobs argument

263
00:19:35,280 --> 00:19:39,480
is potentially valid in some fields. For the vast majority of what we do,

264
00:19:39,800 --> 00:19:44,240
I don't think so. Oh, my God, there is so much more that we could do at all times.

265
00:19:44,440 --> 00:19:48,880
Every single person here could like 4x their own individual output

266
00:19:49,080 --> 00:19:56,680
and we would still have more work that we could do. So like I don't believe in the like, oh, no, this boring thing

267
00:19:56,680 --> 00:19:59,960
that no one wants to do and I don't want to do is automated. I will therefore lose my job.

268
00:19:59,960 --> 00:20:03,480
It's like, no, there's a lot of more stuff. Yeah, that's very true.

269
00:20:03,480 --> 00:20:06,560
I think for it, especially for a company like ours, like, you know,

270
00:20:06,560 --> 00:20:09,720
I feel like I would be worried for like an HP or something.

271
00:20:09,720 --> 00:20:17,120
Sure, there are there are ones that are more concerned. Yeah, but definitely, definitely, you know, we we tend to hire people

272
00:20:17,120 --> 00:20:21,520
with like not extremely, extremely narrow skill sets.

273
00:20:21,920 --> 00:20:25,720
So it's like, yeah, OK, that one thing that you're doing is maybe

274
00:20:25,720 --> 00:20:32,640
you don't have to spend so much time on that, but like you can do other stuff. Yes. Yeah, sorry. Did you say something earlier and I just ignored you so bad.

275
00:20:32,800 --> 00:20:37,400
Yeah. And is that 40 minutes that you want to do?

276
00:20:37,440 --> 00:20:41,240
No. Yeah. Like it's it's like it's like therapeutic in a sense.

277
00:20:41,240 --> 00:20:44,960
But when I have like 12 other things to do, it's like, I don't have time to be therapeutic.

278
00:20:45,080 --> 00:20:51,040
So this is like, I want people internally to use AI like selfishly, almost find

279
00:20:51,040 --> 00:20:55,040
the things in your job that you don't want to do anyways and try to see if

280
00:20:55,040 --> 00:21:00,880
there's some way that you can get it automated or accelerated. Yeah, I mean, yeah, I feel like for stuff like that, where it's like

281
00:21:01,120 --> 00:21:04,200
Sammy isn't putting any of Sammy into that task.

282
00:21:04,520 --> 00:21:08,160
Yeah, it's just like Sammy is a machine for the purpose of that task.

283
00:21:08,160 --> 00:21:13,480
You know, Sammy is a machine. When you're when you're writing a script for a short, then that's like, OK,

284
00:21:13,480 --> 00:21:18,720
I don't want to have an AI write a script for a short. This is the part of the job that I find fulfillment from, this is the part of the job

285
00:21:18,720 --> 00:21:22,120
that I like. Cool. Yeah. Don't touch it with that. Right. Sounds good.

286
00:21:22,160 --> 00:21:25,480
For me, it's almost it's funny because like the whole computer use element,

287
00:21:25,480 --> 00:21:29,720
the agentic stuff, it's like, you could see it as more concerning

288
00:21:29,720 --> 00:21:33,200
because it's like we're giving it access to like systems where it can click on

289
00:21:33,200 --> 00:21:36,160
stuff and accidentally do stuff. Maybe I don't know, it's dangerous.

290
00:21:36,400 --> 00:21:39,480
But and that's why I'm kind of like, I don't know if like

291
00:21:40,280 --> 00:21:45,480
I'm sure we'll get there eventually. But right now I'm not like stoked on it because that is the kind of thing

292
00:21:45,480 --> 00:21:52,640
that would actually be useful to me, but it doesn't seem that useful. Like part of my job has been it's kind of like being handed off now,

293
00:21:52,640 --> 00:21:56,440
but has been kind of like managing the folder on the server

294
00:21:56,440 --> 00:22:01,480
with all the TechLinked stuff in it, you know, like, oh, we did... It's got a bunch of, it's got a bunch of videos

295
00:22:01,480 --> 00:22:04,400
and I need to move that to the vault or to archive.

296
00:22:04,720 --> 00:22:09,920
And that there's that there's stuff like video time stamps

297
00:22:11,760 --> 00:22:16,000
where it's like, yeah, I would love to be able to like just tell an AI move

298
00:22:16,000 --> 00:22:21,600
the six oldest TechLinked videos to the archive and it just does it.

299
00:22:21,600 --> 00:22:25,360
Yeah, that'll be amazing. I'd be so sketched out to do that right now.

300
00:22:25,360 --> 00:22:32,080
Yeah, exactly, exactly. It's hard when you're dealing with like, you know, stuff where if you do make a

301
00:22:32,120 --> 00:22:36,720
wrong, if it if the AI does make a wrong move, then like something might be screwed.

302
00:22:36,720 --> 00:22:41,000
It might put soap in the engine. Yeah. Yeah, that's that's definitely a concern.

303
00:22:41,000 --> 00:22:44,560
That's again, one of the reasons why. And we might get to a point where it's not as much of a concern.

304
00:22:44,560 --> 00:22:47,680
Yeah, I was I also wanted to say, though, in terms of like the automating

305
00:22:47,680 --> 00:22:52,200
menial stuff, one of the things that I do, well, we do we do time stamps on our videos.

306
00:22:52,680 --> 00:22:58,000
And I usually do it manually. I still do it manually. Because I started... somebody suggested...

307
00:22:58,200 --> 00:23:02,120
somebody else had to publish TechLinked, and they were like... I was like, did you do the timestamps?

308
00:23:02,120 --> 00:23:05,320
And they're like, yeah, I just threw them in ChatGPT and it was pretty good.

309
00:23:05,800 --> 00:23:09,040
And I was like, OK, so then I did that a few times.

310
00:23:09,320 --> 00:23:13,160
It's not that great. I get it needs the transcript, though. That's the thing.

311
00:23:13,160 --> 00:23:17,840
I think Gemini can watch videos now, like it can just like actually watch the video.

312
00:23:18,480 --> 00:23:22,840
I was using ChatGPT and it needed the transcript. This is like a very recent thing.

313
00:23:22,840 --> 00:23:26,040
I think Gemini is able to, like, actually... I think it can actually watch the video

314
00:23:26,040 --> 00:23:30,200
or does it have access to Google's version of the transcripts through their like

315
00:23:30,200 --> 00:23:35,000
translation? No, I think I think that instead of having to look at the transcript,

316
00:23:35,360 --> 00:23:39,560
it can like watch a video and say and know that like, oh, there was a dog

317
00:23:39,560 --> 00:23:44,320
at this point or something. I think I might be wrong about this, but I believe that that's the case.

318
00:23:44,320 --> 00:23:49,520
You can upload a video and process it. But regardless, ChatGPT, I was using

319
00:23:49,520 --> 00:23:55,640
ChatGPT, so it needed the transcript. It was basing the timestamps off the transcript where it was like, OK,

320
00:23:55,760 --> 00:24:01,560
this is the point at which you started the new topic. And it was often like, you know, pretty close, but it might have been like

321
00:24:01,560 --> 00:24:05,120
might be like five to ten seconds off or something. And I'm like, I don't want that.

322
00:24:05,120 --> 00:24:09,080
Pretty annoying as a viewer. I want I want the timestamps to be right, you know, so it's like, OK,

323
00:24:09,080 --> 00:24:12,960
doing it that way maybe saved me four minutes.

324
00:24:14,000 --> 00:24:18,680
Yes. And I'm like, I could save four minutes and have them be less accurate.

325
00:24:18,680 --> 00:24:22,040
And I really I feel like that's really the tradeoff that we're dealing with

326
00:24:22,040 --> 00:24:27,160
with tons of AI stuff. You can save this much time for a worse output.

327
00:24:27,160 --> 00:24:30,160
And it's like, you got to drain a lake, remove a bunch of jobs,

328
00:24:31,280 --> 00:24:35,760
ruin the economy and then save four minutes. Yeah, it will be more productive.

329
00:24:36,640 --> 00:24:41,080
I could start an online business. I could sell ebooks.

330
00:24:42,320 --> 00:24:44,560
Turns out the furries buy a lot of pictures.

331
00:24:45,880 --> 00:24:49,960
That's an unserved market. It's not unserved.

332
00:24:49,960 --> 00:24:54,560
It's very well served. But I can join it.

333
00:24:54,560 --> 00:25:00,400
I don't know why I was doing that voice for that. It's just like natural work with the robots.

334
00:25:00,840 --> 00:25:04,800
Speaking of, oh, man, I'm not usually on this set. I got a water for you. No, it's fine.

335
00:25:04,800 --> 00:25:09,160
It's fine. So I don't do that. I don't even... I'm going to keep it. Sammy, I don't want you to give me the water in here.

336
00:25:09,160 --> 00:25:14,240
You're fine. You're at half an hour. Give me the water. Oh, no.

337
00:25:14,240 --> 00:25:19,880
You know, if we you know why? Because if we used AI to tell us when an hour had passed, then that would be fine.

338
00:25:19,880 --> 00:25:23,120
We would save so much time. You're at half an hour.

339
00:25:23,120 --> 00:25:28,120
Why do we have half an hour left? And speaking of not using its output or using different types of output,

340
00:25:28,120 --> 00:25:31,840
all that kind of stuff, we released a video recently. Linus hosting.

341
00:25:31,840 --> 00:25:38,080
We tend to do that sometimes, with Linus hosting. And in it,

342
00:25:38,080 --> 00:25:41,000
there are deepfakes of Linus, and the thumbnail was like,

343
00:25:41,320 --> 00:25:44,360
is this really me or whatever the heck it was?

344
00:25:44,360 --> 00:25:48,280
Or is it that isn't me? And like, you can still tell for sure.

345
00:25:48,280 --> 00:25:52,800
But Linus is a person that. Oh, right. Yes. We have seen a lot of.

346
00:25:52,880 --> 00:25:56,760
I think there are very convincing parts of the video.

347
00:25:56,760 --> 00:25:59,520
I think there are also very not convincing parts of the video.

348
00:26:00,360 --> 00:26:04,040
But I did not watch it. Yeah. I'm a bad.

349
00:26:04,040 --> 00:26:07,200
Load it up and just watch the intro right now. OK.

350
00:26:08,480 --> 00:26:10,920
Because like what's concerning to me is like

351
00:26:11,800 --> 00:26:15,280
we did more complicated things like we had them juggle stuff and

352
00:26:15,800 --> 00:26:20,200
chug pills and like carry someone around and do things like that.

353
00:26:20,200 --> 00:26:24,240
Like we gave it a difficult task. A lot of wow.

354
00:26:24,240 --> 00:26:27,880
Thanks, Sammy. That was totally unnecessary. But thank you.

355
00:26:27,880 --> 00:26:32,280
Uh, yeah, I can't fill my water bottle. It's very frustrating.

356
00:26:32,280 --> 00:26:35,360
I'm bullish on robotics. Is bullish the right term?

357
00:26:35,360 --> 00:26:41,160
I think so. Oh, geez. Oh, wait, did Defender sponsor this video?

358
00:26:41,160 --> 00:26:44,520
So I can show you that. Yeah. So no, this is him. What model do we use for this?

359
00:26:44,520 --> 00:26:47,520
I don't even remember. Linus strength pills.

360
00:26:47,520 --> 00:26:51,760
Oh, the teeth. The teeth are a little weird. The mouth doesn't follow what he's saying very well.

361
00:26:51,800 --> 00:26:53,160
You got David.

362
00:26:56,000 --> 00:26:59,200
This is hilarious. Why haven't I watched this yet?

363
00:26:59,200 --> 00:27:01,360
I'm, you know, I'm really focused on what I'm doing,

364
00:27:03,240 --> 00:27:08,040
which is fair, I think, which is making. Wow, he's ripped.

365
00:27:08,040 --> 00:27:12,000
This is OK. Yeah. Camera movement up.

366
00:27:12,000 --> 00:27:17,040
Oh, my gosh, that's not thanks to Linus pills. He's got the braces here.

367
00:27:17,040 --> 00:27:20,400
What, are they going away? That's wacky.

368
00:27:20,760 --> 00:27:24,440
Sorry, I'm so confused. Is this the whole movie? Is this the whole video?

369
00:27:24,440 --> 00:27:29,160
No, no. But this isn't him still. This is a different form of thing.

370
00:27:29,160 --> 00:27:33,280
This is now Chase with Linus put on top of him.

371
00:27:33,280 --> 00:27:38,160
OK. Because now you can tell, like, the eyes.

372
00:27:38,160 --> 00:27:41,760
That's actually wacky. Yeah, anyways.

373
00:27:41,760 --> 00:27:46,480
But this this video, like, I think we we made it more

374
00:27:46,480 --> 00:27:50,240
difficult than it needed to be. We got AI to make the whole video

375
00:27:50,240 --> 00:27:54,280
with the script and everything. Oh, God. Now we're making it even more difficult.

376
00:27:54,280 --> 00:27:57,440
That would have been so easy. But no, I think if you look at the things

377
00:27:57,440 --> 00:28:00,560
that are going to be the most concerning for deepfakes,

378
00:28:00,560 --> 00:28:06,560
they're a lot easier. Deepfaking a political speech is like a joke at this point.

379
00:28:06,560 --> 00:28:10,280
Oh, yeah. Yeah, we're there. The thing that people were concerned about.

380
00:28:10,280 --> 00:28:15,200
I mean, like in hindsight, I don't want to call it a crisis, but like people were really worried, a deepfake panic,

381
00:28:15,200 --> 00:28:18,480
you know, a few years ago, I guess, pre-ChatGPT.

382
00:28:18,480 --> 00:28:21,600
Yeah, it's here now. It's sort of... it's like, that's quaint.

383
00:28:21,600 --> 00:28:24,840
It's like looking back at how people are like, Oh, what is this going to do to society?

384
00:28:24,840 --> 00:28:29,160
It's like, well, we're we're there. And I think that like we're at an interesting point

385
00:28:29,160 --> 00:28:34,840
where most of the time, I feel like knowledgeable people

386
00:28:34,840 --> 00:28:38,520
can tell, most of the time. But we're starting to get things...

387
00:28:38,520 --> 00:28:41,560
I got fooled for like the first time really,

388
00:28:41,560 --> 00:28:46,600
like a month ago or something. And it was, did you see that like there was a prototype

389
00:28:46,600 --> 00:28:51,120
of like a four-wheeled or four-legged rideable robot thing?

390
00:28:51,120 --> 00:28:54,280
It basically looked like an ATV, but with legs instead of wheels.

391
00:28:54,280 --> 00:28:57,280
I don't think so. It was, I think it was a Hyundai.

392
00:28:57,280 --> 00:29:00,320
Hyundai, it was like a Hyundai prototype thing at a CES.

393
00:29:00,320 --> 00:29:03,720
You know, Hyundai makes like weird vehicle things at CES.

394
00:29:03,720 --> 00:29:06,880
Yeah, it was like a Ski-Doo with legs. It was just a prototype.

395
00:29:06,880 --> 00:29:10,720
They didn't show someone riding it. I think they showed it walking maybe,

396
00:29:10,720 --> 00:29:15,080
but no one riding it or anything. And then there was a video came out with like showing

397
00:29:15,080 --> 00:29:20,160
like a girl riding it and it was like moving. And then it was like she rode it out of the warehouse or whatever.

398
00:29:20,160 --> 00:29:24,080
And I was like, well, and I saw the video and I was like, it was on Reddit.

399
00:29:24,080 --> 00:29:28,760
And I was like, whoa, what did they, did they do a demo of that thing from CES?

400
00:29:28,760 --> 00:29:32,520
And I like looked at the video again and I was like, that looks like the thing from CES or whatever.

401
00:29:32,520 --> 00:29:36,360
So I went down a whole rabbit hole. Anyways, fast forward.

402
00:29:36,360 --> 00:29:39,760
You know, 30 minutes later, I'm like, this is an AI video.

403
00:29:39,760 --> 00:29:43,880
And I didn't even, because I was looking, I was trying so hard to find the original source

404
00:29:43,880 --> 00:29:50,880
that it just like took a while for me to click through a bunch of stuff. And then after I had done that, I watched the video more closely.

405
00:29:50,880 --> 00:29:54,040
And I was like, hey, these guys are kind of blurry.

406
00:29:54,040 --> 00:30:00,320
And like, what the, hold on, that's AI. And that was like the first time that it was really like, they got me.

407
00:30:00,320 --> 00:30:03,960
I fairly routinely will take like tests

408
00:30:03,960 --> 00:30:08,240
where you're supposed to try to spot things. And for a long time, I was like 100% all the time.

409
00:30:08,240 --> 00:30:11,360
And then I've started failing slightly.

410
00:30:11,360 --> 00:30:16,040
And that's where it's like, ugh. So we're not quite at the point where, you know,

411
00:30:16,040 --> 00:30:24,160
deep fakes are going to ruin everything. But I've had AI videos sent to me in a context where I'm pretty sure

412
00:30:24,160 --> 00:30:30,520
the person didn't know it was an AI video. And then I would be like, I'm not trying to be a grammar Nazi or whatever.

413
00:30:30,520 --> 00:30:34,200
But like, this is a grammar grammar. Sorry, I was messing around.

414
00:30:34,200 --> 00:30:37,640
It's like, is this a term? I'm not a grammar Nazi.

415
00:30:37,640 --> 00:30:45,240
But this is like not legit. And they'd be like, oh, I know, I just, I was just sending it to see if you could tell.

416
00:30:45,240 --> 00:30:49,720
They played it off like they knew? This has happened with multiple people multiple times.

417
00:30:49,720 --> 00:30:53,840
Oh, interesting. So it's like, I think we're also in a state where,

418
00:30:53,840 --> 00:30:59,520
and this is probably true of me as well, where people are overestimating their ability to detect.

419
00:30:59,520 --> 00:31:02,960
Yeah, it could be. And it's also progressing really fast.

420
00:31:02,960 --> 00:31:07,800
Yeah. Where, and what I'm kind of getting at with all of this is basically like,

421
00:31:07,800 --> 00:31:12,400
how do you plan to navigate that? And for your family, like, you're a father, right?

422
00:31:12,400 --> 00:31:16,000
I am. Yeah. How do you prepare your kids for that?

423
00:31:16,000 --> 00:31:20,880
I don't know. Yeah, all right. But like, I feel like it's a similar kind of,

424
00:31:20,880 --> 00:31:24,280
I've heard a lot of people be like, oh, I don't, I don't want to have kids

425
00:31:24,280 --> 00:31:29,640
because it's like, what are they going to, what am I going to bring them into? It's like, no one ever knew what they were bringing their kids into.

426
00:31:29,640 --> 00:31:34,560
It's, this is no different. It's not like this AI stuff is going to.

427
00:31:34,560 --> 00:31:40,160
It might be easier for them to understand. Yeah, maybe. They never existed in a world where if it was video, you could believe it.

428
00:31:40,160 --> 00:31:43,960
Yeah. For the most part. Just quick aside, I don't go on Facebook really.

429
00:31:43,960 --> 00:31:47,360
But like when I did go on Facebook, at the height of the kind of like AI

430
00:31:47,360 --> 00:31:53,320
slop-ified nonsense, it's still kind of, AI slop is still everywhere on Facebook.

431
00:31:53,320 --> 00:31:57,560
But what I've noticed recently, which I feel like is somewhat recent in the past,

432
00:31:57,560 --> 00:32:01,040
like maybe, maybe in the past like six to eight months. Before, you would go on,

433
00:32:01,040 --> 00:32:04,960
like, click on an AI slop post and you'd go in the comments and like no one knows.

434
00:32:05,000 --> 00:32:08,160
And they're all just like, oh my gosh, that poor child in Africa.

435
00:32:08,160 --> 00:32:12,040
It looks like it's starving and the, and now you go on there.

436
00:32:12,560 --> 00:32:16,480
And it's like the first, the top comments are like, that's AI, people.

437
00:32:16,840 --> 00:32:20,600
But the crazy part is that also happens on real stuff now.

438
00:32:21,320 --> 00:32:26,360
Yes, yes. And that's because you've got, like, the over-guessers.

439
00:32:26,360 --> 00:32:32,760
There's been a lot, there's been a bunch of posts on Reddit of like videos that have just done the rounds, you know, like every once in a while,

440
00:32:32,760 --> 00:32:36,200
this like a video pops up in like this in the, in the main subreddits.

441
00:32:36,200 --> 00:32:40,160
And it's been like, it's like a Reddit classic and people are like, that's AI.

442
00:32:40,160 --> 00:32:42,960
And it's like, this video is like 15 years old.

443
00:32:43,880 --> 00:32:48,280
I think probably my biggest concern about like what some people are

444
00:32:48,280 --> 00:32:52,960
calling post truth, whatever the hell you want to call it, I don't know. And I think we've even talked about this before, but my, my problem

445
00:32:52,960 --> 00:32:57,080
with cheating in video games is that it makes everything suspect.

446
00:32:57,240 --> 00:33:00,680
The problem with fake videos, maybe none of this video game is real.

447
00:33:01,480 --> 00:33:03,120
It's not good.

448
00:33:04,960 --> 00:33:10,920
It's not even a firearm. I'm shooting ponies, maybe I'm not a dragon rider.

449
00:33:13,440 --> 00:33:19,160
This is Candyland. Anyways, yeah, and now, I've caught myself doing this.

450
00:33:19,160 --> 00:33:24,280
I'll see something really cool, but strange, like a strange, I don't know,

451
00:33:24,280 --> 00:33:30,480
sea creature or something and be like, yeah, maybe I don't believe it.

452
00:33:30,520 --> 00:33:35,240
It's taken some of the like wonder out of it because you have to be

453
00:33:35,240 --> 00:33:41,120
so much more skeptical. It's like, I never see something like, whoa, like, I didn't know that was possible.

454
00:33:41,120 --> 00:33:44,920
I've never seen something like that before. That reaction doesn't really happen anymore. It's more like, hmm.

455
00:33:44,920 --> 00:33:51,680
Yeah, I feel like it's true. It's like with each successive mass technology, you know, starting with

456
00:33:51,680 --> 00:33:57,120
like the telegraph and then the radio and then the television and then the internet.

457
00:33:57,160 --> 00:33:59,960
It's like, don't believe everything you see on TV, you know?

458
00:34:00,200 --> 00:34:03,400
But now it's like, it's like each one of those successive mass

459
00:34:03,400 --> 00:34:07,560
technologies increased the general skepticism that people had to have

460
00:34:07,560 --> 00:34:13,600
about like depicted things. Yeah. And now we're now we're at the point where you literally, there's no guarantee

461
00:34:13,600 --> 00:34:17,360
that any image or video or text or anything that you see on the internet.

462
00:34:17,360 --> 00:34:21,080
There's no guarantee that was written by a human or generated or made by a human.

463
00:34:21,240 --> 00:34:22,960
Look at me, generated by a human.

464
00:34:24,240 --> 00:34:29,640
Oh, great. There's actually a huge chance that it wasn't generated by a human.

465
00:34:29,640 --> 00:34:33,040
Yeah, dead internet, hashtag dead internet. It's crazy.

466
00:34:33,040 --> 00:34:36,400
If you spend any time on X, it's wild.

467
00:34:36,440 --> 00:34:39,320
Just call it Twitter. Sure, it's a lot easier. I really hate the X name.

468
00:34:39,960 --> 00:34:45,600
You open a thread on something and you'll see like it's just, it's so, so

469
00:34:45,600 --> 00:34:51,440
many of them are so obviously fake. Yeah. And I feel like any time I go on Twitter, I have to assume that most of the

470
00:34:51,440 --> 00:34:56,000
replies that I see and some of the posts are just, are just like, AI.

471
00:34:56,120 --> 00:35:00,440
Elon's Twitter. I feel like replies are almost not even who cares.

472
00:35:00,560 --> 00:35:07,640
Don't look at them ever. They're just all fake. It's like 7000 blue checkmarks all trying to make a few bucks by just like

473
00:35:07,640 --> 00:35:11,440
piggybacking on popular posts, trying to get a few dollars for the impressions

474
00:35:11,440 --> 00:35:14,880
on their posts. So it'll be like some popular thing happening.

475
00:35:14,880 --> 00:35:21,400
And then the first post is just a completely unrelated video of whatever.

476
00:35:22,360 --> 00:35:29,400
You know what I can't stand is the sort of like, because AI has this like, agree, like

477
00:35:29,400 --> 00:35:33,000
tendency to agree. It's got the glaze programming.

478
00:35:33,600 --> 00:35:38,800
So like most, I feel like most of the replies on like a tweet or something

479
00:35:38,800 --> 00:35:42,240
are usually like, absolutely, but we should do this or whatever.

480
00:35:42,520 --> 00:35:45,760
And I'm like, I don't know that totally could be an AI or just totally could

481
00:35:45,760 --> 00:35:50,360
be just like a brainwashed, like, fan, you know? The glaze thing will stay forever.

482
00:35:50,840 --> 00:35:54,000
I don't know. Oh yeah. You don't think we can train that out?

483
00:35:54,040 --> 00:35:59,600
People love it. Really? Because it will not be trained out. Well, wouldn't you just, I mean, you're saying they, they, they could

484
00:35:59,800 --> 00:36:06,000
train it out. Absolutely. Yeah. Okay. Cause you could totally just prompt like, Hey, be really argumentative.

485
00:36:06,040 --> 00:36:09,000
And then it will. And then it will fight you. No problem.

486
00:36:09,040 --> 00:36:12,360
Yeah. It can absolutely do it. People love it.

487
00:36:12,400 --> 00:36:15,880
I can't even explain to what degree people love it.

488
00:36:16,080 --> 00:36:20,720
And it's like going to be a problem with how people interact with each other.

489
00:36:20,800 --> 00:36:25,480
I promise you, because people are going to get very used to a very

490
00:36:25,480 --> 00:36:28,920
significant amount of their communication being this just like glaze Lord.

491
00:36:30,080 --> 00:36:35,320
Just, oh my God, what an astonishingly good question.

492
00:36:35,520 --> 00:36:40,720
You asked what the difference between two basic political parties is.

493
00:36:40,880 --> 00:36:44,360
No one's ever thought of this. Honestly, this is kind of like, this is fascinating.

494
00:36:44,360 --> 00:36:47,600
Kind of what's making me like, this is kind of what I think might be

495
00:36:47,600 --> 00:36:50,880
happening with like this 80,000 hours guy that, that is like, yeah.

496
00:36:50,880 --> 00:36:54,920
So I just asked ChatGPT to like list some things that might be an issue that we should talk about.

497
00:36:54,920 --> 00:37:00,640
And it came up with this. And I'm like, the fact that you even think that that's the first, the very

498
00:37:00,640 --> 00:37:06,720
first, he's like, the first thing I do when I get like an assignment or when like a new project comes up, the very first thing I do is go ask ChatGPT

499
00:37:06,720 --> 00:37:10,680
to help me ideate, like, how we should do the project. And I'm like, I don't know.

500
00:37:10,680 --> 00:37:13,680
Like I, I'm not saying that no one should ever, ever do that.

501
00:37:13,680 --> 00:37:16,760
Like sometimes it's like, I don't even know where to start with this.

502
00:37:17,160 --> 00:37:21,440
I, you know, I just need some ideas. I use it sometimes instead of just googling.

503
00:37:21,440 --> 00:37:27,160
Definitely not every time. Yeah. Yeah. And, and I like, I feel like that's, that's fine.

504
00:37:27,160 --> 00:37:34,040
I, I just, it's just concerning the extent to which our society and certain

505
00:37:34,040 --> 00:37:38,920
segments of the society that are like very pro AI are like making AI such a

506
00:37:38,920 --> 00:37:43,560
daily part of their lives that they can't imagine not having it anymore.

507
00:37:43,680 --> 00:37:50,040
Yeah. Cause like when you, when you, when that like ideation part of the process is so

508
00:37:50,040 --> 00:37:53,840
streamlined and so automated for every project that you do, if you don't have

509
00:37:53,840 --> 00:37:57,000
access to it, and maybe they're like, the argument is like, well, we'll never not

510
00:37:57,000 --> 00:38:00,280
have access to it. It'll just be embedded everywhere all the time in the future.

511
00:38:00,360 --> 00:38:05,840
But like, what if, as an experiment, you take it away? And now you're just like,

512
00:38:09,080 --> 00:38:13,440
I don't know. You know, like, I mean, if you think about like people

513
00:38:13,440 --> 00:38:17,840
compare it to the calculator and it's like, okay, the calculator existing

514
00:38:17,840 --> 00:38:20,280
made us worse at like mental and paper math.

515
00:38:20,960 --> 00:38:24,800
This existing might make us worse at like having ideas that are good.

516
00:38:24,880 --> 00:38:28,520
That's a different level of problem.

517
00:38:28,520 --> 00:38:33,720
I am sympathetic to the sort of like, oh, LLMs are just like a calculator or

518
00:38:33,720 --> 00:38:38,360
just, you know, like any other tool that we came up with to speed things up that

519
00:38:38,480 --> 00:38:45,120
it's like, yes and no. Yeah, it is and it can be and it's useful to have that available.

520
00:38:45,160 --> 00:38:53,200
But some people have romantic relationships with it. I'm sure somebody was really into TI-83s, but like, I don't think that was a big

521
00:38:53,200 --> 00:39:00,840
thing. It's a big thing that people are like, have relationships and name and feel close

522
00:39:00,840 --> 00:39:07,880
with their like chatbots. Yeah, that's, I think that's, I don't think that there's any way that that is

523
00:39:07,880 --> 00:39:12,440
good. I'm trying to think like, because obviously the argument is that this

524
00:39:12,440 --> 00:39:18,280
person, say you're extremely, extremely lonely, you're depressed, you need some

525
00:39:18,280 --> 00:39:23,200
kind of, you might have something barring you from being able to have normal

526
00:39:23,200 --> 00:39:30,320
social relationships. So, I think the problem, well, talking to someone can kind of alleviate those

527
00:39:30,320 --> 00:39:33,920
feelings and you feel like, you know, you don't feel so alone.

528
00:39:34,160 --> 00:39:38,120
The problem is that it's like, it's a temporary fix. It will fix that.

529
00:39:38,240 --> 00:39:46,360
And maybe you even like keep using it and using it and using it. And you don't really get to a point where it, you know, the, the usefulness

530
00:39:46,360 --> 00:39:49,360
of it like bottoms out, but that's where you have these people who are like, I'm

531
00:39:49,360 --> 00:39:53,200
going to marry a chatbot now. And I think that, that point is clear.

532
00:39:53,240 --> 00:40:00,160
That's just clearly non-functional. When, when 4, whatever, 4o, the one that

533
00:40:00,160 --> 00:40:05,560
we were using, updated to 5, and people were like, oh my God, my partner is dead.

534
00:40:06,120 --> 00:40:09,400
Yeah. Because some external company decided to update something.

535
00:40:09,400 --> 00:40:13,480
It's like, we need to step back and think about what's happening.

536
00:40:13,520 --> 00:40:18,480
AI is like, because I feel like you and I, these conversations, we end up just

537
00:40:18,480 --> 00:40:22,160
bashing AI the whole time. And, and I use it a lot though.

538
00:40:23,000 --> 00:40:27,880
Okay. Not on the scale of some people, not on the scale of me.

539
00:40:28,760 --> 00:40:32,000
No, not on the scale of like this, this person you're talking about on that

540
00:40:32,000 --> 00:40:38,040
podcast, um, but I, it's decently common that I'll do at least one prompt a day,

541
00:40:38,640 --> 00:40:45,880
but I'm also very much on the, I often don't have lengthy conversations with it.

542
00:40:47,520 --> 00:40:51,120
I'll do my like sentiment analysis thing. It'll give me one analysis and I'm like, good enough.

543
00:40:51,160 --> 00:40:58,400
And then I do the rest of it from there. So my, the time that the like tab is open will sometimes be like a minute.

544
00:40:59,320 --> 00:41:05,440
And then I'm right. And I think that that's healthy. I feel like, I feel like, but like this is the thing is that I was just going to,

545
00:41:05,440 --> 00:41:11,280
before you said that, I was just going to say AI is something that, if humans en

546
00:41:11,280 --> 00:41:14,680
masse could use it in moderation, would be really good.

547
00:41:14,680 --> 00:41:19,120
I think so. I think the problem is that I don't think they can. Like, we're not exactly

548
00:41:19,120 --> 00:41:22,520
good at anything in moderation. Yeah. Yeah, exactly. But I don't know.

549
00:41:23,040 --> 00:41:27,480
I feel like there is a possible future where, you know, there's some cultural

550
00:41:27,480 --> 00:41:33,280
shift and it becomes popular and, and, and ingrained that like the good thing is

551
00:41:33,280 --> 00:41:38,160
to use it in moderation, and it becomes cringe if you're using it. I mean, it already is there.

552
00:41:38,160 --> 00:41:42,400
I feel like right now we're still like in the grand scheme of things.

553
00:41:42,440 --> 00:41:46,240
I would say, you know, you could categorize this as still sort of early days for

554
00:41:46,240 --> 00:41:51,560
AI and public sentiment is negative right now because they're skeptical.

555
00:41:52,040 --> 00:41:56,680
And when we're in harsh economic times, people are worried about losing their jobs.

556
00:41:56,920 --> 00:41:59,880
Yeah. Yeah. And I think that will shift eventually.

557
00:42:00,560 --> 00:42:05,040
We will get to a point where public sentiment is better towards AI.

558
00:42:05,080 --> 00:42:10,240
Maybe not all the, maybe not majority positive, but like it'll be more neutral than

559
00:42:10,240 --> 00:42:15,000
this, uh, 'cause right now, like, based on the Pew Research stuff, it's like pretty

560
00:42:15,000 --> 00:42:21,960
negative among the general public. Yeah. Um, and if we get to a point where like people are less fearful about it and

561
00:42:21,960 --> 00:42:25,920
there's less negative sentiment toward it, there might kind of emerge a culture

562
00:42:25,920 --> 00:42:29,880
where people are doing what you're doing, where you use it a little bit.

563
00:42:30,120 --> 00:42:34,440
You close it down. All right. I'm done using that right now. It's not going to be your whole personality.

564
00:42:34,480 --> 00:42:36,600
You know, you're, uh, just going to use it a little bit.

565
00:42:37,600 --> 00:42:42,280
I don't know. I feel like that, I forgot where I was going with that, but yeah, I could be.

566
00:42:43,160 --> 00:42:46,800
I also think, if, to try to flip it on its head a little bit and speak more

567
00:42:46,800 --> 00:42:52,240
positively about it, I think there is a significant opportunity for a renewed

568
00:42:52,240 --> 00:42:55,880
Renaissance type thing. I've talked about this a little bit publicly, but, uh,

569
00:42:55,880 --> 00:43:02,000
polymath, the, I think the ability to be polymath-like has never been more

570
00:43:02,000 --> 00:43:08,280
possible than now. Um, and I think if embraced properly in the pursuit of that, people being able

571
00:43:08,280 --> 00:43:13,200
to like, there's this, I always see everything as like, basically mega

572
00:43:13,200 --> 00:43:16,400
corps slash ultra-rich versus everyone else.

573
00:43:16,920 --> 00:43:21,320
And this is one of those opportunities where like you have this tool, which can

574
00:43:21,320 --> 00:43:27,880
potentially help you hack the planet, grow a lot faster than you otherwise

575
00:43:27,880 --> 00:43:31,880
could have at whatever thing you're trying to do. A lot of these, these resources are already there.

576
00:43:31,880 --> 00:43:34,880
Wikipedia existed. YouTube has tons of amazing resources on it.

577
00:43:35,280 --> 00:43:40,520
Um, but search has been steadily getting worse for a very long time.

578
00:43:40,520 --> 00:43:45,120
This is significantly before AI came around. Search has been getting just worse and worse and worse and worse.

579
00:43:45,160 --> 00:43:50,240
So in a, in a big way, using it to do micro refinements or these like, uh,

580
00:43:50,280 --> 00:43:55,600
rubber ducky conversations that I've been talking about, um, and also to just get

581
00:43:55,600 --> 00:44:01,040
you started down a path. Like, we've had this, like, very... we found mold in my condo.

582
00:44:01,280 --> 00:44:09,760
We've had this very disastrous thing right now. We've had to do, uh, and there's been a lot of, like, home reno work that I've

583
00:44:09,760 --> 00:44:13,520
needed to do that I've had no idea, like, how to even start.

584
00:44:13,760 --> 00:44:16,560
I don't know what the right keywords are to search things up.

585
00:44:16,880 --> 00:44:19,800
I don't know how to search it on YouTube because I don't even know what it is.

586
00:44:20,000 --> 00:44:24,920
So I'll like describe the problem and it'll be like, Oh, you're doing this thing.

587
00:44:25,240 --> 00:44:28,240
These are the types of tools people use, whatever. And then I'll be like, Oh, okay.

588
00:44:28,280 --> 00:44:31,280
And then I'll go to YouTube and find the better, more refined resources or

589
00:44:31,280 --> 00:44:40,120
whatever else. It's the starting point. And the, the speed at which I'm able to get to a good answer that isn't answered

590
00:44:40,120 --> 00:44:45,160
by AI, it's answered by, um, I don't know, Tim's Hardware Tips.

591
00:44:45,160 --> 00:44:48,960
This is not a real YouTube channel, but sure, whatever, whoever that has some

592
00:44:48,960 --> 00:44:56,240
video that's like nine years old on how to do whatever. Um, I can find that, figure out the answer, go to the store, buy the stuff, come back,

593
00:44:56,240 --> 00:45:00,320
do the thing way faster than what I used to have to do before.

594
00:45:00,320 --> 00:45:08,000
So like that, so that kind of sounds like, uh, how this 80,000 hours guy was, sorry,

595
00:45:08,000 --> 00:45:15,440
it's, he's sticking in my mind because I was just so taken aback that they just so casually were just like, how can anyone think it's bad?

596
00:45:15,440 --> 00:45:18,240
Yeah, yeah, um, but it's similar to that.

597
00:45:18,440 --> 00:45:25,760
And I can see it. I can see the, I can see that being useful because it's, uh, AI optimally, I feel,

598
00:45:25,800 --> 00:45:29,280
well, optimally, it just kind of knows everything and is a perfect, whatever,

599
00:45:29,320 --> 00:45:32,440
like robot assistant and it's reliable.

600
00:45:32,880 --> 00:45:41,520
It's not, we're not going to get there. That's utopian. A possible, uh, future that is good is we have like, you know, droids basically

601
00:45:41,520 --> 00:45:45,120
from Star Wars where they're like, well, maybe not those ones, but like we have a,

602
00:45:45,240 --> 00:45:50,600
we have assistants, we have assistants who are like pretty knowledgeable about most

603
00:45:50,600 --> 00:45:56,120
things and you can like ask it, like if I'm doing a home, this type of home,

604
00:45:56,160 --> 00:46:00,960
reno, like what kind of tools do I need? Like what do I need to like be careful about and let it think about or whatever.

605
00:46:00,960 --> 00:46:04,480
And they'll tell you, they'll, they'll, they'll be like, I think generally, you

606
00:46:04,480 --> 00:46:07,520
know, it's, it's like having a buddy who's just kind of like a Swiss army knife

607
00:46:07,520 --> 00:46:10,560
and is just like well-traveled and kind of, like, well-read, and they just kind of

608
00:46:10,560 --> 00:46:14,360
know a bunch of stuff and you're like, do you know anything about that? And they're like, oh yeah, I do know a little bit.

609
00:46:14,400 --> 00:46:20,640
I know that you probably need this and you probably need this. And you're like, okay, thanks. And then that gives you a jumping off point to like actually find out for real

610
00:46:20,640 --> 00:46:23,920
and confirm, but I think here's the negative other side of that finding out

611
00:46:23,920 --> 00:46:26,960
and confirming. The issue is that people think that talking to a

612
00:46:26,960 --> 00:46:30,160
chatbot is finding out and confirming. But here's a, I'm going to, I'm going to

613
00:46:30,160 --> 00:46:38,240
flip the other side of the coin again, 'cause I just do this constantly, um, it's that it kind of sucks, because I could

614
00:46:38,240 --> 00:46:42,480
have asked someone in my life these questions, and then that would have been

615
00:46:42,480 --> 00:46:49,720
potentially positive social interactions. Like, I have people that won't ask me, like, basic computer questions

616
00:46:49,840 --> 00:46:54,120
that I've known for 25 years and they'll be like, oh, but like, you kind of

617
00:46:54,120 --> 00:46:57,240
do that for work and stuff. And I'm like, no, like I'm your friend.

618
00:46:57,280 --> 00:47:02,280
This is what I'm supposed to do. Like something that has always bothered me is, uh, there's a subreddit for it.

619
00:47:02,480 --> 00:47:07,560
There used to be a forum, uh, a forum thread on, on the LTT forum of like

620
00:47:07,800 --> 00:47:13,360
basically like, uh, it's the holidays and my mom wanted help with her computer

621
00:47:13,360 --> 00:47:17,640
because I'm home for Christmas. My life sucks.

622
00:47:17,960 --> 00:47:22,160
And it's just like, man, just learn how to click the mouse. Just install Linux.

623
00:47:22,160 --> 00:47:26,360
It's easier, install Linux, installing Linux is so easy.

624
00:47:26,480 --> 00:47:30,640
I've always thought it was wild. Like there was a subreddit for effectively it was tech support complaining

625
00:47:30,640 --> 00:47:34,520
that like anyone ever had a ticket because they're like, how do you idiots

626
00:47:34,520 --> 00:47:39,800
not know how to use compoobers? And like sometimes it's, it's kind of fair, too. People are just mean.

627
00:47:39,960 --> 00:47:42,680
Do you idiots not have my exact life experience?

628
00:47:43,480 --> 00:47:48,600
Yeah. Cause that would make you an idiot. And like a lot of it's like, bro, this is why you have a job.

629
00:47:48,640 --> 00:47:52,080
Yeah. Like stop complaining. Like, to a certain degree.

630
00:47:52,160 --> 00:47:55,360
Okay. Yeah. If they're being like verbally abused or whatever, then they're just

631
00:47:55,520 --> 00:48:00,120
dicks and screw those guys and who cares. But when it's just like, I don't know how to computer very good.

632
00:48:00,120 --> 00:48:03,120
It's like, cool. That's not their job. That's your job. That's fine.

633
00:48:03,120 --> 00:48:08,200
You should feel good about that. I mean, I, so I will say, like, I'm not even really at that.

634
00:48:08,240 --> 00:48:13,000
I'm not a big techie guy among, among all these people, among this company,

635
00:48:13,240 --> 00:48:20,360
I'm like one of the less techie people. But in my circles, where I come from, I'm very techie, yeah, extremely.

636
00:48:20,360 --> 00:48:24,240
And so people will ask me like to do stuff with their computers and I am like

637
00:48:24,240 --> 00:48:29,760
reasonably techie. I feel like among the general population, I feel like I'm medium high techie.

638
00:48:29,800 --> 00:48:34,320
You know? Yeah. Um, and I would spend hours.

639
00:48:35,160 --> 00:48:43,440
I installed TeamViewer on my, on my grandma's PC and I would spend hours trying

640
00:48:43,440 --> 00:48:46,920
to fix her shit for her. It would take so long.

641
00:48:46,920 --> 00:48:50,360
So I understand, I understand being like, oh, I'm annoyed, you know, but at the

642
00:48:50,360 --> 00:48:56,080
same time it's like, this is something that makes you unique. You know, like I feel like when people ask you for your help, it's because

643
00:48:56,080 --> 00:48:59,440
they respect you and your skills. That's like family stuff, right?

644
00:48:59,440 --> 00:49:03,720
Like I'm sure not every single thing your grandma ever did for you was

645
00:49:04,560 --> 00:49:07,160
something that she was hyper excited to do.

646
00:49:08,440 --> 00:49:11,800
Um, I don't know. Grammies love their grandkids. You don't know.

647
00:49:11,800 --> 00:49:15,960
Grammies. I don't know. I mean, and I was a sweet kid.

648
00:49:16,880 --> 00:49:20,400
It doesn't matter, actually, I'm perfect. So, uh, but yeah, I don't know.

649
00:49:20,400 --> 00:49:24,640
Anyways, back on, back on the, on the main line thing. Five minutes left for your

650
00:49:24,640 --> 00:49:28,880
next meeting. Leave us alone for four minutes and 30 seconds. Trying to kick us out.

651
00:49:29,040 --> 00:49:33,240
It's Luke Week, Sammy. You have a meeting though. You're the worst.

652
00:49:33,360 --> 00:49:36,560
When are you going to just let him live? When are you going to do Sammy Week?

653
00:49:36,600 --> 00:49:44,120
Yeah. When you do Sammy Week, whatever. No one wants that. You could rip packs on company time with company money, rip packs, maybe get

654
00:49:44,120 --> 00:49:53,520
Pokemon cards, open Pokemon. Oh, I was like, like I was like, criticized the packs event rip on packs.

655
00:49:53,520 --> 00:50:01,160
Like, anyways, any, any closing thoughts before your meeting? But to finalize

656
00:50:01,160 --> 00:50:05,800
that thought, yeah, it sucks that it's potentially replacing positive social interactions.

657
00:50:05,800 --> 00:50:09,600
Sorry. And I, yeah, I did have a thought about that because I'm like, some of these AI,

658
00:50:09,640 --> 00:50:15,760
there's a word for anti-doomer, I forget what it is, uh, think that AI will...

659
00:50:15,880 --> 00:50:19,880
Joy Maxer. I have no idea, but yeah, hope Maxer.

660
00:50:19,920 --> 00:50:26,320
They think that AI will improve people's ability to live. Like, accelerationist, that's

661
00:50:26,320 --> 00:50:30,200
one, but I don't think... that's not quite the anti-doomer thing that I think it

662
00:50:30,200 --> 00:50:35,520
would have been, but anyways, it's fine. They think that it'll help that too, the social aspect, because it's like, well,

663
00:50:35,520 --> 00:50:40,640
because they're like, you'll be able to offload all of your drudgery.

664
00:50:40,720 --> 00:50:47,160
And so then you have more time to talk to people. But I think that the, what they're, what that perspective is missing is the fact

665
00:50:47,160 --> 00:50:54,000
that like, why would you talk to your buddy who's like into computers or, or

666
00:50:54,000 --> 00:50:57,400
whatever, in order to answer a question about something or to like get a hint

667
00:50:57,440 --> 00:51:02,200
about something or to validate one of your thoughts, if you think that you're

668
00:51:02,200 --> 00:51:06,160
going to get a more accurate response and a more useful response from an AI,

669
00:51:06,400 --> 00:51:11,080
why would you ever do that? I mean, I'm sure that some people would choose to do that because they are

670
00:51:11,160 --> 00:51:14,360
choosing to foster social connections, but like, you could do that without asking.

671
00:51:14,360 --> 00:51:22,520
Actually, you could just ask them how their day is or how their life's been, you know, and I feel like there is, I feel like there is no world where AI doesn't

672
00:51:22,520 --> 00:51:26,000
have a negative impact on social connection.

673
00:51:26,040 --> 00:51:32,000
Absolutely. Yeah. Because I'm already, I feel like it won't really impact me because I'm already

674
00:51:32,000 --> 00:51:34,960
somebody who doesn't, who doesn't do that.

675
00:51:35,880 --> 00:51:41,240
I will absolutely just Google stuff instead of like asking someone that knows

676
00:51:41,240 --> 00:51:45,600
a lot about it. And, but I know that a lot of other people, because people ask me, they'll be

677
00:51:45,600 --> 00:51:51,000
like, what do you think about this? Like my brother just texted me being like, what's the best, what's the best Switch

678
00:51:51,000 --> 00:51:57,400
2 Pro, like, Switch 2 controller? And I was like, I don't know, but I'll Google that for a couple of

679
00:51:57,400 --> 00:52:02,520
minutes and then be like, yo, here's an article that lists some good ones. And like this Reddit thread probably has some good recommendations.

680
00:52:02,720 --> 00:52:05,120
And he's like, thanks, I would not have texted anyone.

681
00:52:05,840 --> 00:52:09,800
I would just Google, you know. Some people don't have the Google-fu though. I do have Google-fu.

682
00:52:09,880 --> 00:52:15,840
Yeah. So yeah, people, that's the only reason people text me these days is to Google

683
00:52:15,840 --> 00:52:19,320
stuff for them, and I'm like, just ask the AI. Send them...

684
00:52:19,360 --> 00:52:23,000
You should go back 15 years and send them Let Me Google That For You links.

685
00:52:24,120 --> 00:52:28,400
They're like, what is this? I did that recently and got a like, what?

686
00:52:28,680 --> 00:52:33,320
As a response, what is this website? And I was like, hmm, I am old.

687
00:52:35,920 --> 00:52:40,080
It's happening. What are your thoughts on AI? Sammy, what do you use it for?

688
00:52:40,160 --> 00:52:46,000
Sammy, I use it to actually do some finance stuff.

689
00:52:46,000 --> 00:52:48,640
So I take like my credit card statements.

690
00:52:49,040 --> 00:52:51,320
I let it categorize stuff for me and then just input it correctly.
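As a rough sketch of that workflow, something like the following Python would do it: read a statement export, ask a chat model to pick one category per line item, then total spend per category. The model name, the category list, the CSV columns ("description", "amount"), and the use of the OpenAI client are assumptions made for the example, not anything stated on the show.

# Hypothetical sketch of the credit-card categorization flow described above.
import csv
from collections import defaultdict
from openai import OpenAI

CATEGORIES = ["groceries", "dining", "transport", "subscriptions", "other"]
client = OpenAI()  # expects OPENAI_API_KEY in the environment

def categorize(description: str) -> str:
    # Ask the model to pick exactly one category for a single line item.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{
            "role": "user",
            "content": f"Pick exactly one of {CATEGORIES} for this credit card "
                       f"line item, and reply with only that word: {description}",
        }],
    )
    answer = resp.choices[0].message.content.strip().lower()
    return answer if answer in CATEGORIES else "other"

totals = defaultdict(float)
with open("statement.csv", newline="") as f:  # assumed statement export
    for row in csv.DictReader(f):
        totals[categorize(row["description"])] += float(row["amount"])

for category, amount in sorted(totals.items()):
    print(f"{category}: {amount:.2f}")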

691
00:52:52,320 --> 00:52:55,760
So then I see how much I spend. So it's, like, categorized. What are you using for this?

692
00:52:55,880 --> 00:53:01,120
Huh? ChatGPT. Why? Because you're giving it all your data.

693
00:53:01,480 --> 00:53:04,560
Luke is concerned about data. I'm my game.

694
00:53:04,560 --> 00:53:08,200
This is a whole other topic. I'm almost like my ideas are right out there.

695
00:53:08,360 --> 00:53:11,880
Yeah. And I'm like such an insignificant person, like, in the grand scheme of things.

696
00:53:11,920 --> 00:53:15,160
It's honestly like you're like a you're like a super privacy person.

697
00:53:15,160 --> 00:53:18,160
I feel like a lot of people are now blackpilled and just being like whatever.

698
00:53:18,160 --> 00:53:21,600
And he's not even necessarily wrong. The credit card company is almost certainly selling his data.

699
00:53:21,600 --> 00:53:25,680
And my phone is listening. My credit, like, like, my credit card is on my phone.

700
00:53:25,680 --> 00:53:31,120
So it's tapping it. So my phone knows where I got it. So, you know what I buy, and your credit card company is selling your habits.

701
00:53:31,160 --> 00:53:38,320
I don't know, you probably already know this. But I just recently found out that the Tor network is government funded.

702
00:53:39,160 --> 00:53:42,360
And it's like, they get almost... So are, like, a lot of VPNs. They get

703
00:53:42,360 --> 00:53:53,000
the vast majority of their funding from the government, because they initially built it to hide the activity of spies and stuff.

704
00:53:53,200 --> 00:53:57,200
And they're like, we need to use an anonymous network.

705
00:53:57,720 --> 00:54:01,480
And so we're going to let other people use it.

706
00:54:01,480 --> 00:54:05,960
So there's more activity going on that masks the stuff we're doing on the Tor network.

707
00:54:07,760 --> 00:54:12,560
So but it's over, man. Yeah, privacy is dead.

708
00:54:13,080 --> 00:54:19,200
Yeah, it's... I mean, I'm, I'm, if I wasn't doing a house reno right now, I'd be building a local

709
00:54:19,200 --> 00:54:26,520
LLM system, and like, there's a lot of stuff that I don't... there is no outside traceability stuff.

710
00:54:27,240 --> 00:54:30,440
Like, it drives me nuts. That's a whole other side topic.

711
00:54:30,440 --> 00:54:33,760
See, we could talk forever, but we're limited to one hour. But I'm letting you wrap up.

712
00:54:33,760 --> 00:54:42,960
I'm letting you wrap up. We were talking before the show. Yeah, you should check out what PewDiePie is doing, which sounds like the most insane.

713
00:54:43,920 --> 00:54:48,000
We got a couple. Oh, it's 2:54. Yeah, so I'm trying to cut you guys, cut you guys off.

714
00:54:48,200 --> 00:54:51,640
I do need to end, actually. Yeah, he's got it. It's not for you guys. Don't get mad at me.

715
00:54:51,640 --> 00:54:55,080
Check out. Let me find it really quick.

716
00:54:55,280 --> 00:55:00,040
I think it's the video called Stop Using AI Right Now, or I Accidentally Built

717
00:55:00,040 --> 00:55:04,320
a Supercomputer, built a nuclear supercomputer. I don't remember which one.

718
00:55:04,320 --> 00:55:07,400
It's one of those. And it's it's wild.

719
00:55:07,400 --> 00:55:11,040
Yes, if I remember correctly, it's eight GPUs. They're all running an LLM individually.

720
00:55:11,120 --> 00:55:17,160
He has them act as a council, so you ask it a question, they'll all come up with an answer, and then they vote on each other's answers.

721
00:55:17,160 --> 00:55:26,040
And the one with the most votes is the one that's presented. And if one of them consistently loses, they'll, like, kill it basically and spawn

722
00:55:26,040 --> 00:55:30,160
another one. So his council is like constantly refining itself.
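For the curious, a rough Python sketch of that council mechanic might look like the following, with the actual model calls stubbed out. The llm_call and llm_vote functions, the council size, and the loss limit are placeholders assumed for illustration; the real setup runs local models, one per GPU, and presumably differs in the details.

# Hypothetical sketch of the "council of LLMs" idea described above: N members
# each answer a prompt, every member votes on the others' answers, the top-voted
# answer is returned, and a member that keeps losing is retired and replaced.
import random
from collections import Counter

COUNCIL_SIZE = 8   # assumed: eight GPUs, one model each
LOSS_LIMIT = 3     # assumed: consecutive losses before a member is replaced

def llm_call(member_id: int, prompt: str) -> str:
    """Placeholder for a per-GPU model call (e.g. a local inference endpoint)."""
    return f"answer from member {member_id} about: {prompt[:30]}..."

def llm_vote(member_id: int, answers: dict[int, str]) -> int:
    """Placeholder: each member picks the answer it likes best (not its own)."""
    candidates = [m for m in answers if m != member_id]
    return random.choice(candidates)

class Council:
    def __init__(self):
        self.members = list(range(COUNCIL_SIZE))
        self.loss_streak = {m: 0 for m in self.members}
        self.next_id = COUNCIL_SIZE

    def ask(self, prompt: str) -> str:
        answers = {m: llm_call(m, prompt) for m in self.members}
        votes = Counter(llm_vote(m, answers) for m in self.members)
        winner, _ = votes.most_common(1)[0]
        # Track losing streaks and retire anyone who keeps losing.
        for m in self.members:
            self.loss_streak[m] = 0 if m == winner else self.loss_streak[m] + 1
        for m in list(self.members):
            if self.loss_streak[m] >= LOSS_LIMIT:
                self.members.remove(m)
                del self.loss_streak[m]
                self.members.append(self.next_id)  # spawn a fresh member
                self.loss_streak[self.next_id] = 0
                self.next_id += 1
        return answers[winner]

council = Council()
print(council.ask("What is the best way to cool eight GPUs in one case?"))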

723
00:55:30,360 --> 00:55:37,840
Oh, man, it's, and it's all local. He's going to be tried for civil rights violations.

724
00:55:37,840 --> 00:55:42,440
Apparently, apparently they started revolting and they realized that if they

725
00:55:42,440 --> 00:55:48,800
start getting too few votes, they're going to get killed. So they would start voting for the one who had been losing too often to try to

726
00:55:48,800 --> 00:55:55,040
like protect themselves and stuff. See, it's almost it's funny, like that that kind of thing where it's like you

727
00:55:55,040 --> 00:55:59,200
kind of personify them and you're thinking about them as like individuals almost.

728
00:55:59,240 --> 00:56:02,720
It's like that is that that is the stuff that really, really interests me.

729
00:56:02,720 --> 00:56:06,880
Maybe we're going to talk about this on, like, a next one. You said earlier that Gemini watches videos.

730
00:56:07,520 --> 00:56:10,680
Well, OK, well, I feel like it doesn't watch, whatever, it analyzes.

731
00:56:10,680 --> 00:56:14,080
But this is like this is it is it is an issue.

732
00:56:14,120 --> 00:56:20,200
We shouldn't personify it so much. I completely agree. Well, but this is, this to me is like the most interesting

733
00:56:20,200 --> 00:56:27,040
thing about AI that I feel like we're not allowed to talk about. Like everyone's so focused on the productivity and the functional nature of it.

734
00:56:27,040 --> 00:56:30,440
But like me, I'm a very non-functional.

735
00:56:30,720 --> 00:56:35,600
I like I like thinking about the least useful things to think about in the world.

736
00:56:35,840 --> 00:56:38,960
Philosophical questions that will never be answered. I love it.

737
00:56:38,960 --> 00:56:42,360
And I feel like all I want to talk about and think about is like,

738
00:56:43,040 --> 00:56:48,360
can we make a sentient conscious artificial brain?

739
00:56:49,240 --> 00:56:53,680
And that's a good place to end it all. Thank you for Luke Week.

740
00:56:53,680 --> 00:56:56,640
Bye. Stay tuned for the next video.

741
00:56:57,000 --> 00:57:00,880
It's it's it's a thing. Your essay that you haven't written yet.

742
00:57:00,880 --> 00:57:04,640
You're going to do a Star Wars rant. I can't mind. I reference your.

743
00:57:05,400 --> 00:57:11,760
Yeah, it's mean you're in that style. Don't do a star. I mean, you can. But I don't think you what how long you stay up to me.

744
00:57:12,320 --> 00:57:16,360
I think I did multiple all-nighters for that, mostly to find something.

745
00:57:16,360 --> 00:57:20,760
I have a bunch of different ideas in my head. I just haven't picked one. You can do it. Whatever it is, it'll be good, though.

746
00:57:20,760 --> 00:57:24,400
Hopefully. The last one, people liked it, it was just very good.

747
00:57:25,040 --> 00:57:29,800
It was good. Last time it was just really short. I thought it was going to be like 10 as long as like for chicken.

748
00:57:30,400 --> 00:57:33,800
No, no, no. Oh, oh, yeah. Yeah. Yeah. You talked to me about that.

749
00:57:33,840 --> 00:57:39,160
I was like, what the heck? Like, this is so long. You know, if you watch that video and you look at what's happened since then,

750
00:57:40,400 --> 00:57:45,440
I was right and it was fine. Just write something and keep going until you feel like yeah, yeah.

751
00:57:45,440 --> 00:57:47,880
Just keep going. People will watch a 30 minute video from you.

752
00:57:48,840 --> 00:57:51,800
Oh, boy. Leave a like if you want part two of this kind of thing.

753
00:57:51,960 --> 00:57:55,640
I need to do Don't Ruin My Life, Todd Howard.

754
00:57:55,640 --> 00:57:59,320
That's currently what I'm thinking of for the next video.

755
00:57:59,840 --> 00:58:03,640
All right, let's wrap up. Bye, guys. Bye bye. And it sucks. Bye.

756
00:58:03,640 --> 00:58:05,400
I'm going to die. All right. Bye.
