1
00:00:00,919 --> 00:00:06,200
what did you say AI will it kill your parents AI will it kill your parents

2
00:00:04,839 --> 00:00:11,480
yeah you know there was a time when that question wouldn't make any sense but

3
00:00:08,920 --> 00:00:15,599
sadly it does in the year 2023 uh because AI has been crazy

4
00:00:14,400 --> 00:00:22,519
there's been a lot of tech news happening this year and we're gonna talk about it on this episode of TalkLinked

5
00:00:19,160 --> 00:00:26,920
with who Murd no that's me okay Jessica

6
00:00:22,519 --> 00:00:28,359
P Jessica P You're the WAN writer I am

7
00:00:26,920 --> 00:00:31,920
the WAN writer and this is probably the first time listeners have heard how my

8
00:00:30,080 --> 00:00:35,200
last name is pronounced and I'm sorry we had to break it to you this way I mean

9
00:00:33,480 --> 00:00:41,760
how else would you say it Pigo people usually say Pigo Pigo you're the WAN

10
00:00:38,680 --> 00:00:44,000
writer but you also uh write TechLinked

11
00:00:41,760 --> 00:00:48,360
yes and you also and GameLinked and Techquickie and Tech Wick yes you're you're

12
00:00:46,960 --> 00:00:53,239
got your hands in all the pots now I'm multifaceted I think there was a rumor

13
00:00:50,239 --> 00:00:55,120
before my my name was released because I

14
00:00:53,239 --> 00:01:03,760
I tended to jokingly put my name in the credits as various AIs and uh there there

15
00:00:58,760 --> 00:01:03,760
was a rumor that I was actually an AI

16
00:01:03,839 --> 00:01:12,280
experiment I just kept showing up in so many different video series maybe we all

17
00:01:08,840 --> 00:01:14,040
are who knows brains in vats etc I've

18
00:01:12,280 --> 00:01:17,840
honestly this year has made me question absolutely everything we're going to go

19
00:01:15,600 --> 00:01:22,240
through our favorite stories of the year or sort of we're going to talk about

20
00:01:19,720 --> 00:01:25,840
Jessica's first year here at LMG and cover some of our you know the stories

21
00:01:24,159 --> 00:01:29,920
that really got us uh got got us thinking now this episode we're

22
00:01:27,360 --> 00:01:34,040
recording it the week before but it's going to go up on Wednesday December

23
00:01:31,439 --> 00:01:38,320
27th 2 days after our Christmas special so if you want a proper big Roundup of

24
00:01:36,759 --> 00:01:44,640
all the tech stories of the year and you didn't see that we released a Christmas special go see that but this is more of

25
00:01:41,720 --> 00:01:48,399
a casual situation Jessica how's it been had you tried to keep up with the tech

26
00:01:46,320 --> 00:01:52,560
news cycle in the way that you have had to this year before not like this no

27
00:01:50,799 --> 00:01:57,079
like there were always things I kept up with there's little trends that I would

28
00:01:54,719 --> 00:02:02,840
note um but if you want to know my previous experience following the news

29
00:01:59,520 --> 00:02:06,200
time line uh I learned about Bitcoin in

30
00:02:02,840 --> 00:02:07,640
early 2012 when huh that's interesting

31
00:02:06,200 --> 00:02:12,200
and didn't pay attention to it again until

32
00:02:09,840 --> 00:02:17,480
2019 there are there's always just been a lot of stuff that just passes Me by

33
00:02:14,560 --> 00:02:22,080
because I wasn't specifically interested um yeah I mean crypto that's fair

34
00:02:20,000 --> 00:02:28,040
honestly if if if people missed the whole crypto kind of hype and then crash

35
00:02:25,080 --> 00:02:31,760
uh it's like well to me it was a few years to me it started here and it ended here so

36
00:02:30,040 --> 00:02:35,760
and everything in between that like there there might have been an up there might have been a down to me it was a

37
00:02:34,440 --> 00:02:40,760
straight line yeah well okay it's not fair to say

38
00:02:39,000 --> 00:02:45,280
that sorry I just said a second ago that it was only a few years but really I

39
00:02:42,319 --> 00:02:50,000
mean you know crypto started way back you know early 2000s early 2010s or maybe

40
00:02:48,360 --> 00:02:53,879
before that anyways but it really started gaining hype around 2012 2013 and

41
00:02:52,280 --> 00:02:58,800
stuff that then it kind of like lay dormant and then it spiked in the last

42
00:02:56,040 --> 00:03:02,400
few years and just in time for it to kind of fall out of favor I mean last

43
00:03:00,640 --> 00:03:06,200
year I was I rewatched some of our Christmas specials in preparation for

44
00:03:04,040 --> 00:03:11,360
this year's and one of the main stories last year was how crypto basically just

45
00:03:09,680 --> 00:03:15,000
fell off the map at the beginning of 2022 a bunch of companies were still

46
00:03:13,200 --> 00:03:18,319
doing stuff with it they announced integrations with a lot of gaming

47
00:03:16,440 --> 00:03:22,040
companies in particular and then by the end of the year they had all kind of

48
00:03:19,959 --> 00:03:27,280
canceled it yeah like I remember like Wealthsimple sending me an email like a

49
00:03:24,920 --> 00:03:31,360
year and a half ago about their crypto integration yeah well and it's still

50
00:03:29,799 --> 00:03:35,280
integrated in a lot of things I don't want to say that it's dead it's not it's

51
00:03:33,519 --> 00:03:41,040
absolutely not but in terms of its mainstream for the average person crypto

52
00:03:37,840 --> 00:03:44,200
was fast yeah yeah crypto was very fast

53
00:03:41,040 --> 00:03:48,159
it was a flash in the pan mhm um but it

54
00:03:44,200 --> 00:03:50,760
was replaced by AI which is like the

55
00:03:48,159 --> 00:03:55,360
hugest thing this year but way bigger hype before we get there how was it

56
00:03:53,360 --> 00:03:59,200
keeping up with everything you said that you kind of like were aware of some

57
00:03:56,799 --> 00:04:04,640
things before but like was it breakneck speed cuz I've been doing this for

58
00:04:01,239 --> 00:04:08,000
10 years so I I you I want to hear what

59
00:04:04,640 --> 00:04:10,319
it was like for you uh it was a lot of

60
00:04:08,000 --> 00:04:13,760
reading stuff and not knowing if it was important or

61
00:04:11,480 --> 00:04:17,359
not cuz sometimes you'd read something and you're like that sounds really

62
00:04:15,120 --> 00:04:21,720
important and then Riley would go yeah they always do that yeah this is

63
00:04:19,720 --> 00:04:25,040
actually the sixth time or I'd read something and I'm just like I don't that

64
00:04:23,400 --> 00:04:30,400
don't seem important at all and Riley would be like this is

65
00:04:27,160 --> 00:04:32,560
game-changing and I'm like really yeah

66
00:04:30,400 --> 00:04:37,400
yeah I just a big part of the issue is not that I didn't understand what was

67
00:04:34,720 --> 00:04:42,639
happening but I did not have the right amount of context to understand what

68
00:04:40,039 --> 00:04:45,960
people considered important yeah it's always hard especially because

69
00:04:44,320 --> 00:04:52,240
especially when you're writing for WAN Show and like with TechLinked it's you

70
00:04:48,440 --> 00:04:54,120
know we we do our best but WAN Show it's

71
00:04:52,240 --> 00:04:58,000
we are preparing things and before you came on uh the the whole writing team

72
00:04:56,560 --> 00:05:04,240
kind of worked together on it but I was like curating the topics and now this is

73
00:05:01,440 --> 00:05:09,400
your burden and uh it is a struggle to kind of guess what Linus and Luke will

74
00:05:06,600 --> 00:05:13,440
particularly find interesting it is as opposed to what we think maybe the the

75
00:05:11,320 --> 00:05:17,080
the Broad tech audience on YouTube will find interesting I sometimes I feel like

76
00:05:15,160 --> 00:05:21,720
like I have a Ouija board or I have a crystal ball and I'm just going like I'm

77
00:05:19,960 --> 00:05:26,880
like okay well like this is objectively important but like I don't know we have

78
00:05:24,479 --> 00:05:32,400
like a little spreadsheet set up and like I I think I've gotten a lot better

79
00:05:28,880 --> 00:05:34,800
at guessing but still sometimes I will

80
00:05:32,400 --> 00:05:38,319
get it back from Linus and I will be baffled by what returns to me I'm like

81
00:05:37,160 --> 00:05:42,840
you want to talk about that that's a main story to you yeah yeah okay because

82
00:05:41,120 --> 00:05:46,639
I I thought maybe I'd have to write three lines on that yeah we'll put

83
00:05:45,199 --> 00:05:49,960
something in there we'd be like this is the biggest news of the week you guys

84
00:05:48,280 --> 00:05:54,600
need to talk about this for half an hour he's like eh world changing he's like I

85
00:05:53,199 --> 00:05:58,479
want to talk about this Final Fantasy bug that's really annoying me but to

86
00:05:56,759 --> 00:06:03,360
your point a second ago about like having the context it is tough because

87
00:06:00,680 --> 00:06:09,199
we hired you and we hired Jacob uh close to a year ago yes and you guys have been

88
00:06:07,440 --> 00:06:14,280
you know getting on the team and becoming embedded in like gaining that

89
00:06:11,280 --> 00:06:17,199
kind of context that that I've had for a

90
00:06:14,280 --> 00:06:23,520
while and it's uh yeah it's it's it's a whole process admittedly I no longer

91
00:06:19,199 --> 00:06:27,039
find myself Googling what is Red Green

92
00:06:23,520 --> 00:06:29,720
Team yeah exactly question mark yeah but

93
00:06:27,039 --> 00:06:34,560
I want to say honestly that the difference between you and your like kind of

94
00:06:32,120 --> 00:06:38,039
ambient knowledge for some of this stuff the difference between you now and

95
00:06:36,080 --> 00:06:41,560
between you like when you started is is so vast like now I'll just talk about a

96
00:06:39,919 --> 00:06:46,680
tech thing and you'll just be like yeah you know oh I am a sponge I am a serious

97
00:06:44,319 --> 00:06:51,120
sponge uh that was kind of what James said to me when he hired me was like you

98
00:06:49,520 --> 00:06:53,720
obviously don't have the background but you're smart you'll figure it out I'm

99
00:06:52,560 --> 00:06:58,960
like what does that mean oh uh hold on a second I'm

100
00:06:57,280 --> 00:07:03,639
rethinking things here yeah you didn't pick up on that okay I'm like

101
00:07:00,440 --> 00:07:06,599
that sounds ominous am I in danger don't

102
00:07:03,639 --> 00:07:10,800
worry you'll be fine swim yeah so I mean I do tend to have a

103
00:07:09,240 --> 00:07:15,479
pretty good grasp of Technology it's just that most of the technology I'm interested in was invented before

104
00:07:13,720 --> 00:07:21,039
everyone who's currently alive was born right because prior

105
00:07:17,919 --> 00:07:23,240
to your job here you did write blog

106
00:07:21,039 --> 00:07:28,160
posts absolutely and you did deep dives on history topics yeah which

107
00:07:26,639 --> 00:07:32,000
involves technology a lot of the time absolutely involves technology for sure which

108
00:07:30,000 --> 00:07:35,599
is I think why I mean I've loved your Techquickie scripts I think that you do

109
00:07:33,680 --> 00:07:39,160
a deep dive and you make them funny you're a you're a standup comedian in

110
00:07:37,479 --> 00:07:43,160
case anyone doesn't know I'm currently sitting but you know you just got to

111
00:07:41,280 --> 00:07:47,400
imagine me like two feet up that's the kind of jokes we're looking for you can

112
00:07:45,960 --> 00:07:52,199
look forward to I I cannot tell the kind of jokes

113
00:07:50,280 --> 00:07:57,039
that I tell on stage in this context Jacob wrote a sit stand joke into

114
00:07:54,440 --> 00:08:00,720
today's TechLinked as in like they will not stand for that but if you want to

115
00:07:58,479 --> 00:08:04,520
sit for that see this just reminds me of the time when like you were like I heard

116
00:08:02,879 --> 00:08:09,159
you across the room like I was finishing up a joke for TechLinked and I heard you

117
00:08:06,479 --> 00:08:12,960
all across the room telling Jacob off for having two jokes in one script that

118
00:08:11,319 --> 00:08:18,599
were about shoving objects in someone's butt as I was finishing a joke about

119
00:08:16,199 --> 00:08:23,879
shoving objects at people's butts I mean hey it's a funny place of

120
00:08:22,280 --> 00:08:27,000
the body you know like it's a it's inherently humorous I mean what you what

121
00:08:25,759 --> 00:08:33,320
are you supposed to do not make jokes about it um but all right and now it's

122
00:08:30,960 --> 00:08:38,120
not time for Quick Bits because there aren't any today nonetheless this video

123
00:08:35,599 --> 00:08:42,440
is brought to you by Firebelly Tea their travel mug gives tea lovers the ability

124
00:08:40,080 --> 00:08:47,519
to start or stop their iced or hot tea infusion anytime anywhere its flow

125
00:08:45,560 --> 00:08:51,360
control and tapered lip make it feel like you're drinking from a traditional mug

126
00:08:50,000 --> 00:08:57,240
but with none of the spillage and you should also check out the unique

127
00:08:53,000 --> 00:09:01,120
handcrafted teas they're terrific go to

128
00:08:57,240 --> 00:09:03,399
lmg.gg/bellymug grab yourself a stop

129
00:09:01,120 --> 00:09:08,519
infusion travel mug and use code LINKED10 to get 10% off your entire purchase

130
00:09:06,519 --> 00:09:14,079
so on that note yeah let's talk about our favorite tech stories this year uh

131
00:09:11,160 --> 00:09:17,440
for me it's it's definitely one of the biggest ones and one of my favorites is

132
00:09:15,760 --> 00:09:22,959
AI just because I've been doing my darndest this year to make some progress

133
00:09:19,560 --> 00:09:25,760
on a tech longer about it and it's been

134
00:09:22,959 --> 00:09:30,320
very very very difficult a long road to find the to find the time uh to do it

135
00:09:28,320 --> 00:09:35,800
but I've been like saving so many links it's it's crazy so uh yeah how do you

136
00:09:33,519 --> 00:09:38,560
feel about AI is it is it going to be the end of humanity is it going to be

137
00:09:37,480 --> 00:09:44,320
the bright optimistic future I think the

138
00:09:41,560 --> 00:09:48,640
truth is somewhere in between I know I know but of course you're that's why

139
00:09:46,160 --> 00:09:55,279
you're presenting the dichotomy there's been a lot of

140
00:09:50,560 --> 00:09:58,240
both euphoria and hysteria about AI I think

141
00:09:55,279 --> 00:10:03,000
it's genuinely incredibly cool it really is it's incred incredibly cool but the

142
00:10:00,720 --> 00:10:09,880
danger is that there are people who think that in the same way as people

143
00:10:06,079 --> 00:10:12,320
thought about crypto exactly and I feel

144
00:10:09,880 --> 00:10:17,160
like a lot of people's reaction and this is I was talking earlier about grifters

145
00:10:14,720 --> 00:10:22,079
a lot of people's reaction to whatever is drawing the most attention in a media

146
00:10:20,320 --> 00:10:26,240
space is just to immediately just try to like pick it up

147
00:10:24,560 --> 00:10:32,200
like it's a piggy bank and try to shake money out of it yeah regard of why

148
00:10:30,279 --> 00:10:36,880
regardless of there's no real thought behind it and I feel like that's a lot

149
00:10:34,560 --> 00:10:42,279
of the current danger about AI like I'm not really worried about human

150
00:10:38,880 --> 00:10:45,839
extinction at this point I am worried

151
00:10:42,279 --> 00:10:47,480
about people throwing away people's jobs

152
00:10:45,839 --> 00:10:53,160
because they think they can replace them with what is essentially a an advanced

153
00:10:50,399 --> 00:11:00,160
autocorrect well like I'm worried about people you know just filling like

154
00:10:57,240 --> 00:11:06,320
flooding the zone of conversation with like AI slurry and that causing

155
00:11:03,839 --> 00:11:11,600
like Downstream problems for basic communication yeah I mean that's the

156
00:11:08,920 --> 00:11:14,800
issue though is that you just compared like the risk of extinction like you're

157
00:11:13,720 --> 00:11:20,680
not worried about that you're worried about the money thing and I think that

158
00:11:18,040 --> 00:11:25,200
unfortunately the the people trying to commercialize it is what could lead to

159
00:11:23,160 --> 00:11:29,120
the extinction I mean we it's so interesting we're doing it to ourselves

160
00:11:27,440 --> 00:11:35,160
I'm not worried about the AI I'm worried about the people using it what's what's

161
00:11:32,320 --> 00:11:39,560
a tool is a tool a hammer is a hammer I mean yeah and and thankfully they're

162
00:11:37,560 --> 00:11:44,079
not at the level of like being actual autonomous agents right now they are

163
00:11:41,240 --> 00:11:47,720
just tools and it's it's unclear if they will ever get to the point where they

164
00:11:45,839 --> 00:11:52,040
could be described as like an autonomous agent in the in the way you think of

165
00:11:49,320 --> 00:11:57,200
like a sci-fi robot being like yeah we don't know if the HAL 9000 thing is

166
00:11:54,399 --> 00:12:01,440
currently possible yeah I mean there are there are some things that that have

167
00:11:59,120 --> 00:12:05,839
been made we reported on AutoGPT at some point in the middle of the year

168
00:12:02,760 --> 00:12:08,120
where they could string along a few uh

169
00:12:05,839 --> 00:12:12,959
chatbots that kind of like you give it an overarching goal and then it

170
00:12:10,200 --> 00:12:17,760
self-prompts itself along towards that goal but I think that and more recently it

171
00:12:14,720 --> 00:12:19,279
being able to create other smaller

172
00:12:17,760 --> 00:12:25,120
versions of itself that that was crazy that was a crazy recent story where like

173
00:12:21,279 --> 00:12:27,120
AI can give birth to other AIs yeah and

174
00:12:25,120 --> 00:12:31,720
then we put them in our shoes we'll link to the episode for that so you get more

175
00:12:29,240 --> 00:12:36,519
context but yeah what's so what one of the most interesting things to me about

176
00:12:33,160 --> 00:12:40,040
the AI thing about the AI whole

177
00:12:36,519 --> 00:12:42,639
spectacle is the fact that

178
00:12:40,040 --> 00:12:47,880
OpenAI uh their whole mission statement from the beginning was to not be a

179
00:12:45,240 --> 00:12:50,839
commercial entity exactly they were like we're not going to go down the

180
00:12:48,959 --> 00:12:58,320
commercial route we're just going to do research science yeah and we're going to

181
00:12:54,880 --> 00:12:59,920
do research so that we can stop the bad

182
00:12:58,320 --> 00:13:05,360
commercial people who are going to screw up the world and then will be able to

183
00:13:01,760 --> 00:13:08,079
stop them they are the people who put

184
00:13:05,360 --> 00:13:12,880
this thing out into the like ChatGPT was the impetus for all of this and it

185
00:13:10,959 --> 00:13:17,800
was OpenAI the company who said that they would protect the world but you

186
00:13:16,160 --> 00:13:23,800
know and you could make an argument that like okay AI is too powerful for one

187
00:13:21,800 --> 00:13:27,480
company to control so they had to get it out there so that people could like see

188
00:13:26,000 --> 00:13:31,800
what's happening and do stuff with it and all this all this uh stuff but

189
00:13:30,079 --> 00:13:36,480
OpenAI is not doing it open source they're doing it proprietary exactly Meta is

190
00:13:34,079 --> 00:13:40,120
doing open source stuff there's a lot of good interesting stuff going in going on

191
00:13:38,440 --> 00:13:46,399
in the open source community I mean Mistral is is is pretty close to like

192
00:13:44,079 --> 00:13:52,279
it's behind Claude which is just behind ChatGPT but like I don't know that so

193
00:13:50,240 --> 00:13:56,279
like to your point about the the the conflict between those two drives there

194
00:13:54,480 --> 00:13:59,160
is that conflict but it's also so complicated because the the companies

195
00:13:57,880 --> 00:14:05,959
that are saying they're here to protect us are the companies who are trying to commercialize it mhm so and

196
00:14:03,399 --> 00:14:11,120
Commercial interests they it's kind of like a Whirlpool like it has its own

197
00:14:08,279 --> 00:14:15,720
gravity it has its own momentum it sucks you towards it whether you like it or

198
00:14:13,560 --> 00:14:20,839
not whatever your original intentions were yeah let's talk about the sort of

199
00:14:18,279 --> 00:14:26,279
the details of of AI a little bit like how how long did it take you it

200
00:14:23,880 --> 00:14:32,199
took me a long time to wrap my head around the idea that this is like a

201
00:14:29,040 --> 00:14:33,880
neural network like do you let's let's

202
00:14:32,199 --> 00:14:38,160
let's get into the deep stuff do you think it'll ever be conscious I like to

203
00:14:36,199 --> 00:14:43,759
think that that's plausible like there's a part of me that finds that very

204
00:14:40,160 --> 00:14:45,839
compelling as an idea I think because of

205
00:14:43,759 --> 00:14:51,240
how important like sci-fi writers have been writing about the idea of

206
00:14:47,720 --> 00:14:54,680
autonomous agents like I, Robot like all of

207
00:14:51,240 --> 00:14:56,560
this these ideas this Grand cultural

208
00:14:54,680 --> 00:15:01,120
weight of the idea of like what if we could create life what if human beings

209
00:14:59,399 --> 00:15:07,639
could create a Consciousness that is analogous to ourselves right now to be

210
00:15:04,399 --> 00:15:09,440
clear I don't actually think it's that

211
00:15:07,639 --> 00:15:14,440
plausible that we will create something that is like human beings no I I don't

212
00:15:12,839 --> 00:15:19,440
think that's the case I think I agree I think it's fundamentally different um

213
00:15:17,480 --> 00:15:25,600
yeah we might create like a uh super intelligent uh crazy like a thousand

214
00:15:22,240 --> 00:15:27,720
years from now Poss plausibly even but

215
00:15:25,600 --> 00:15:31,000
it's not going to be like us yes it's not going to be analogous to us the

216
00:15:29,440 --> 00:15:36,319
thing is I would never feel comfortable saying no it will never happen right I

217
00:15:34,199 --> 00:15:40,399
think that is a fundamentally arrogant like intellectually

218
00:15:37,800 --> 00:15:45,440
dishonest position to take yeah you can say that it's unlikely honestly you can

219
00:15:42,680 --> 00:15:51,079
say that it's nowhere near in the near future but not no yeah honestly I I

220
00:15:48,680 --> 00:15:54,040
completely go back and forth I'm like I think at the beginning I was like it's

221
00:15:52,319 --> 00:15:57,360
not conscious it's just a whatever and then like I saw some stuff and I was

222
00:15:55,360 --> 00:16:01,360
like whoa that's this is actually pretty crazy and look at it structurally you're

223
00:15:59,360 --> 00:16:05,959
like this is structurally very very similar to the way that neurons work in

224
00:16:03,519 --> 00:16:09,480
the human brain but then it's like oh there's all these levels of complexities

225
00:16:07,639 --> 00:16:14,199
that you know it's like this is a computer it's not like a physical like

226
00:16:12,279 --> 00:16:17,360
neuron with a myelin sheath and everything like there's so many complex

227
00:16:16,199 --> 00:16:22,880
interactions that we don't even understand the brain and now we're suddenly doing psychology on computers

228
00:16:21,199 --> 00:16:26,680
see that was exactly what I was thinking like just thinking just now it's just

229
00:16:24,199 --> 00:16:32,120
like we don't even understand why we're conscious yeah exactly we like that's an

230
00:16:28,920 --> 00:16:34,600
emergent property of our brains we like

231
00:16:32,120 --> 00:16:38,680
you cannot point to the the spot in the brain where Consciousness lives right

232
00:16:36,920 --> 00:16:41,680
you don't know we know where our personalities are it's right up front

233
00:16:40,160 --> 00:16:47,440
here we know where our sight is it's in the back we do not know why we have a

234
00:16:44,120 --> 00:16:49,680
sense of self exactly we may never we

235
00:16:47,440 --> 00:16:52,839
may never like we are trying to understand something that is exactly as

236
00:16:51,120 --> 00:16:58,920
complicated as our own brains with our own brains yeah yeah so in in terms of

237
00:16:57,240 --> 00:17:03,880
like I think I think the thing that was easiest for me cuz like it was wild at

238
00:17:01,120 --> 00:17:09,199
first looking at some of this stuff I think the kinds of comparisons that

239
00:17:05,919 --> 00:17:12,039
helped me the most was like like it's

240
00:17:09,199 --> 00:17:17,520
it's using it this is a a very advanced form of pattern recognition oh yeah like

241
00:17:14,880 --> 00:17:21,919
that's what this thing is doing and like it kind of reminds me a lot of like the

242
00:17:19,400 --> 00:17:27,720
hallucinations and like odd behavior oh my gosh that like Sydney coming out it

243
00:17:25,600 --> 00:17:33,840
just it's just so obvious that we trained AI on like LLMs on the internet

244
00:17:31,679 --> 00:17:38,880
because it got just got so emotional it's just like we've created life and

245
00:17:35,480 --> 00:17:41,000
given it a personality disorder like

246
00:17:38,880 --> 00:17:46,440
immediately immediately I was trying to find the Articles from the the um the

247
00:17:44,120 --> 00:17:52,320
the the the Contemporary uh dis discussion of it uh uh I'm not sure what

248
00:17:50,200 --> 00:17:55,480
you mean by that the the articles that came out at the time yeah yeah yeah I

249
00:17:53,960 --> 00:17:59,880
was I was trying to find articles from the time when like people found out

250
00:17:57,919 --> 00:18:04,880
about Sydney the code name that like Bing chat had and I was trying to find

251
00:18:02,919 --> 00:18:09,440
the Articles where they or where Sydney came up with other personalities there

252
00:18:07,080 --> 00:18:13,640
were like multiple personas yeah somebody asked Sydney like who else is

253
00:18:11,760 --> 00:18:18,000
in there with you or whatever and they she she he it I don't know described all

254
00:18:16,440 --> 00:18:22,240
these different personalities and I'm like what that was wild that was in the

255
00:18:20,320 --> 00:18:27,039
early days very early days and part of the issue it reminds me of and by early

256
00:18:24,159 --> 00:18:30,960
days I mean like March March this has moved so fast it's wild fast it's just a

257
00:18:29,440 --> 00:18:35,919
whole new reality and part of the issue is it reminds me of the problem you get

258
00:18:33,240 --> 00:18:40,440
when you are like this is a problem with children trying to like discuss like

259
00:18:39,080 --> 00:18:45,120
crimes when they're witnesses to something because you can end up in this

260
00:18:42,440 --> 00:18:50,280
situation where the adult is just kind of like trying to get a specific answer

261
00:18:48,400 --> 00:18:54,600
out of the kid and like the kid just doesn't know what they want so just

262
00:18:52,440 --> 00:18:59,640
starts responding to what's getting a reaction yeah that's what I feel like a

263
00:18:57,200 --> 00:19:06,440
lot of the weird weirdest stuff that AI does comes out where it's just it's a

264
00:19:02,960 --> 00:19:08,600
mirror yeah it is feeding us back what

265
00:19:06,440 --> 00:19:12,679
we're responding to yeah I think a turning point for me thinking about Ai

266
00:19:11,080 --> 00:19:17,039
and what it means and how it works and stuff was when people started like there

267
00:19:14,720 --> 00:19:20,720
was the initial wave of hype and obviously you know I didn't get fully

268
00:19:18,640 --> 00:19:26,159
swept into all this I think like I keep kind of a more detached like skeptical

269
00:19:23,240 --> 00:19:31,480
stance on it but I have to follow the story as it's like developing so there

270
00:19:28,520 --> 00:19:38,559
were people who were like it's alive and then sometime after that uh um

271
00:19:35,960 --> 00:19:40,640
sentiments that got more prevalent where people saying this kind of thing where

272
00:19:39,799 --> 00:19:45,919
it's like the AI is saying crazy stuff

273
00:19:43,559 --> 00:19:51,280
because we're asking it to say crazy stuff we're like we're it's a mirror yes

274
00:19:48,720 --> 00:19:56,200
and so you know we are responding most to when it is craziest yeah yeah now I

275
00:19:54,559 --> 00:19:59,640
want to I would love for this whole thing to be about AI but we should

276
00:19:57,559 --> 00:20:06,360
probably mention some other stories that we uh enjoyed this year um I have so

277
00:20:03,039 --> 00:20:09,000
many but they're all very odd yeah well

278
00:20:06,360 --> 00:20:14,799
I mean those are the perfect ones I love the really odd ones um what's what

279
00:20:12,159 --> 00:20:20,960
what's an odd one I absolutely lost my mind when there was a Proton port of

280
00:20:18,240 --> 00:20:25,600
Chex Quest do you mean that like it was hilarious to you that someone went out

281
00:20:22,400 --> 00:20:28,520
of their way to like ensure that the

282
00:20:25,600 --> 00:20:33,559
Chex Quest had support for Proton yes the uh emulation layer between Linux and

283
00:20:31,320 --> 00:20:39,799
Windows games absolutely that was magical to me I love people's odd little

284
00:20:37,240 --> 00:20:45,559
projects yeah you did a whole segment on WAN Show recently yeah and it went over pretty

285
00:20:42,120 --> 00:20:48,240
well with the uh branded retr branded

286
00:20:45,559 --> 00:20:51,600
games yeah retro branded games I feel like there's a lot of culture and this

287
00:20:50,320 --> 00:20:56,200
is coming from me as someone who really loves history a lot of the cult like we

288
00:20:54,240 --> 00:21:01,360
get this idea that past cultures were far more serious than we are but the

289
00:20:58,799 --> 00:21:07,400
problem is it's a survivorship bias thing there's a lot of garbage that we

290
00:21:04,360 --> 00:21:09,520
just threw out because it was garbage

291
00:21:07,400 --> 00:21:15,320
but it's kind of magical when some of that stuff survives into the present and

292
00:21:12,919 --> 00:21:20,480
you get to get a glimpse into like this is what we were making 20 years ago oh

293
00:21:17,400 --> 00:21:22,320
man yeah I I love advergames for that

294
00:21:20,480 --> 00:21:28,200
reason like really really old advertisements like I don't like modern

295
00:21:24,720 --> 00:21:30,640
ads at all but like really old quirky

296
00:21:28,200 --> 00:21:33,320
advertisements are hilarious to me that that's what that's one thing that's

297
00:21:31,880 --> 00:21:38,080
amazing about the internet is the fact that you can find that

298
00:21:34,919 --> 00:21:40,320
stuff and yeah sometimes when I'm

299
00:21:38,080 --> 00:21:45,039
researching a story I'll go and look for stuff that happened you know early 2000s

300
00:21:42,640 --> 00:21:48,240
or whatever and all the a lot of the time I mean some of it's being deleted

301
00:21:47,000 --> 00:21:53,760
there was a whole story about what was it CNET started like like deleting their

302
00:21:51,760 --> 00:21:59,840
old articles off randomly deleting their old articles because for SEO reasons and

303
00:21:56,880 --> 00:22:04,799
like every SEO expert was like no don't do that yeah for the most part A lot of

304
00:22:02,679 --> 00:22:10,120
the old stuff is still there so you can go and do a Google Search and it it's

305
00:22:07,559 --> 00:22:15,039
like a weird experience going on a forum uh you know chat boards or

306
00:22:12,520 --> 00:22:20,039
whatever from like early 2000s and seeing how people wrote it's like you

307
00:22:17,320 --> 00:22:23,679
can tell cultural differences it's like digging into the archives you know in

308
00:22:22,080 --> 00:22:27,480
like one of these movies where they do research or whatever I feel like I'm I

309
00:22:26,120 --> 00:22:31,360
feel like I'm in another world or it's like it's years ago you're just in there

310
00:22:29,400 --> 00:22:37,200
like Indiana Jones snooping on their little Forum posts trying to grab a an

311
00:22:34,799 --> 00:22:42,159
interesting snippet and then run away before well that's how I felt when I was

312
00:22:39,960 --> 00:22:47,440
digging through like I found a huge Archive of like like stuff that had

313
00:22:45,480 --> 00:22:55,799
never originally been on the internet it was uploaded in January 2002 and it was

314
00:22:52,000 --> 00:22:58,240
news articles from the early 2000s and and

315
00:22:55,799 --> 00:23:03,400
the late '90s wow wow and I thought that stuff was amazing apparently Amazon was

316
00:23:01,279 --> 00:23:08,840
engaging in union busting so weird wild what a different

317
00:23:07,159 --> 00:23:15,480
time than today Microsoft was being investigated

318
00:23:11,320 --> 00:23:18,159
for antitrust oh man I got to say one of

319
00:23:15,480 --> 00:23:24,200
my favorite stories will technology kill us all it was very fun in early 2000s

320
00:23:21,600 --> 00:23:29,840
silly concerns they had at the time I got to say one of my favorite stories

321
00:23:25,799 --> 00:23:30,960
this year was uh you know okay I'll say

322
00:23:29,840 --> 00:23:36,799
it's my favorite it's one of my favorite stories but it's also one where I'm starting to kind of be like I don't know

323
00:23:34,880 --> 00:23:43,159
if this is a good thing all the antitrust uh action by governments

324
00:23:40,320 --> 00:23:47,960
obviously we had the epic games uh suit against Apple uh couple years ago and

325
00:23:46,159 --> 00:23:55,080
now Google this year that happened as well but like EU put a lot of their uh

326
00:23:51,480 --> 00:23:58,400
legislation into effect Apple actually

327
00:23:55,080 --> 00:24:01,440
launched an iPhone with USB-C this year

328
00:23:58,400 --> 00:24:04,200
uh they announced plans to allow side

329
00:24:01,440 --> 00:24:07,640
loading and other app stores I think that's I think they have to do that by

330
00:24:05,640 --> 00:24:11,480
next year I have to look it up but they'll be doing it at some point we got

331
00:24:09,400 --> 00:24:16,520
this whole beeper versus apple with the iMessage compatibility thing that's kind

332
00:24:13,559 --> 00:24:22,520
of opening up just now but the reason it makes me be like hm hold on a second is

333
00:24:20,640 --> 00:24:28,159
is the fact that like a lot of this is happening because of government

334
00:24:24,880 --> 00:24:30,120
regulation yes and I think that I think

335
00:24:28,159 --> 00:24:33,720
that I'm wary of a turning point in which we've been reporting on all this

336
00:24:31,840 --> 00:24:39,760
EU stuff happening we're like yeah get apple make apple open up their platform

337
00:24:36,200 --> 00:24:41,640
etc etc yeah and I'm worried that there's

338
00:24:39,760 --> 00:24:46,000
going to be a tipping point where the EU like their regulations start rubbing us

339
00:24:43,880 --> 00:24:50,600
the wrong way yeah well because keep going mhm and part of the problem is that

340
00:24:48,000 --> 00:24:56,480
like I think we've seen through a lot of our reporting or a lot of people's

341
00:24:52,520 --> 00:24:58,440
reporting is just like often Regulators

342
00:24:56,480 --> 00:25:05,360
don't understand technology very well mhm and often like this this is often

343
00:25:02,200 --> 00:25:07,320
we're very skeptical about like apple

344
00:25:05,360 --> 00:25:12,840
jumping on in support of like right to repair legislation me personally I find

345
00:25:10,919 --> 00:25:17,880
that very optimistic one it shows a turning point it shows that they see

346
00:25:14,720 --> 00:25:21,279
that as the winning team but also I do

347
00:25:17,880 --> 00:25:23,360
want Apple to actually be at that table

348
00:25:21,279 --> 00:25:28,200
I do want them to be at that table because they will know when it starts

349
00:25:25,600 --> 00:25:32,080
going too far will they I mean I you I don't know

350
00:25:30,559 --> 00:25:37,080
that's a hard thing but like it's it's one of those things where you can never

351
00:25:34,640 --> 00:25:40,440
really predict like at this moment you can never really predict at this moment

352
00:25:38,919 --> 00:25:43,760
what things are going to look like in 5 years what's going to go too far what's

353
00:25:42,399 --> 00:25:50,760
going to have these unintended consequences so what I would prefer is a

354
00:25:47,679 --> 00:25:55,080
balance between different powers mhm like

355
00:25:50,760 --> 00:25:57,399
this kind of like shaky back and forth I

356
00:25:55,080 --> 00:26:02,760
feel is mostly a good thing it's like that Churchill quote you know

357
00:25:59,960 --> 00:26:08,279
democracy is the worst possible form of government except all the other ones

358
00:26:05,360 --> 00:26:13,919
that have been yeah yeah like I I do want well hold on you're not taking a

359
00:26:09,799 --> 00:26:17,039
stance here I mean uh I'm just imagining

360
00:26:13,919 --> 00:26:20,399
comments that's all yeah my stance is

361
00:26:17,039 --> 00:26:23,559
that I would prefer no one to have

362
00:26:20,399 --> 00:26:25,919
complete sovereignty over me oh yeah

363
00:26:23,559 --> 00:26:31,600
and I feel like most people would agree with that regardless of like where they

364
00:26:28,240 --> 00:26:33,760
specifically fall hot take unless unless

365
00:26:31,600 --> 00:26:38,760
it's a you know hyper intelligence Supreme AI Overlord who knows what's

366
00:26:36,240 --> 00:26:44,600
best for you I for one welcome our new Sydney overlords Sydney I just I don't

367
00:26:42,240 --> 00:26:50,679
like anybody having too much power so I I would like a little bit of of back and

368
00:26:47,760 --> 00:26:57,559
forth one of the big things for me uh was um YouTube okay both uh the

369
00:26:54,799 --> 00:27:00,799
crackdown on ad blockers and the earlier I feel like at this point mostly in

370
00:27:00,000 --> 00:27:06,799
the background um crackdowns on vulgarity

371
00:27:04,559 --> 00:27:11,200
yeah within and like there's always that uncomfortable relationship between

372
00:27:09,399 --> 00:27:16,720
advertisers like the people who are mostly financing these platforms and the

373
00:27:14,559 --> 00:27:20,320
basic reality that vulgarity is a normal part of human

374
00:27:17,919 --> 00:27:24,279
communication yeah yeah I mean that's how it is yeah that that was really

375
00:27:21,919 --> 00:27:27,320
interesting that whole uh changing their their uh terms and or not ter terms and

376
00:27:26,159 --> 00:27:31,600
conditions but their their guidelines their friendly guidelines um at the end

377
00:27:29,440 --> 00:27:36,960
of yeah at the end of 2022 but that story kind of like broke more uh like it

378
00:27:34,760 --> 00:27:42,080
happened in in November 2022 and then it broke more early this year and yeah that

379
00:27:40,399 --> 00:27:46,960
was only the start YouTube had a crazy year as you say they they're like

380
00:27:43,760 --> 00:27:48,960
fighting ad blockers many ad blockers are broken

381
00:27:46,960 --> 00:27:54,000
although now many of them are working again but that war has gotten more

382
00:27:51,559 --> 00:28:00,039
intense than ever this year and they also added tests like that well actually

383
00:27:56,799 --> 00:28:02,200
1080p premium 1080p premium on

384
00:28:00,039 --> 00:28:05,760
YouTube is not a test anymore it's just a it's a thing they launched it I guess

385
00:28:04,240 --> 00:28:10,279
their test went okay people didn't really complain about it they said

386
00:28:08,039 --> 00:28:15,000
specifically that it's not degrading the non premium experience it's only that

387
00:28:12,360 --> 00:28:18,240
the premium people get better which I guess you have to take their word for it

388
00:28:16,159 --> 00:28:23,000
unless you do some crazy testing I haven't heard any complaints so maybe

389
00:28:19,720 --> 00:28:25,880
it's fine um shorts started getting

390
00:28:23,000 --> 00:28:29,120
Revenue this year they did uh what else oh there was one more the ads they

391
00:28:27,519 --> 00:28:33,000
started doing these experiments another test they were doing was that like

392
00:28:30,519 --> 00:28:36,880
frontloading like 10 ads in front of a video instead of like spacing it out

393
00:28:35,440 --> 00:28:43,600
people did not like that so they haven't launched that but yeah YouTube has changed a lot

394
00:28:41,679 --> 00:28:47,960
I think one thing that I want to I want to end on this um because I completely

395
00:28:46,679 --> 00:28:55,039
forgot about this until I started looking through the stories for the Christmas special do you remember the

396
00:28:50,880 --> 00:28:56,799
Chinese spy balloon yeah a while I was

397
00:28:55,039 --> 00:29:01,200
like what that was this year that was this year they forgot about it the world

398
00:28:58,760 --> 00:29:06,200
was going nuts for a second yeah we were we were losing our minds like they shot

399
00:29:03,840 --> 00:29:10,880
down one they shot yeah they shot it down they there was a couple oh yeah

400
00:29:08,559 --> 00:29:15,559
there was one that got shot down over Canada really yes oh yeah yeah like the

401
00:29:14,240 --> 00:29:19,440
the Canadian government had to give them permission to fly into our airspace in

402
00:29:17,919 --> 00:29:23,159
case you couldn't tell I haven't finished the TechLinked Christmas special

403
00:29:21,000 --> 00:29:26,600
so we're like reacting to finding out about this stuff for the second time you

404
00:29:25,080 --> 00:29:32,679
know for the first time if that makes sense uh so this will be a a whole

405
00:29:30,760 --> 00:29:37,760
proper roundup in the TechLinked Christmas special which you probably have already

406
00:29:35,039 --> 00:29:42,880
seen maybe if if that came out on a Monday anyways it's also been a rough

407
00:29:39,960 --> 00:29:47,320
year for physical media oh yeah wait yeah what do you mean I forget oh the

408
00:29:44,159 --> 00:29:49,279
dropping of DVDs and CDs and various

409
00:29:47,320 --> 00:29:54,360
stores well best yeah they Best Buy announced plans to drop it next year yes

410
00:29:52,279 --> 00:29:59,679
uh and Netflix will no longer be sending DVDs in the mail oh no which is just

411
00:29:57,360 --> 00:30:05,039
devastating for me yeah I I actually was one of the

412
00:30:01,960 --> 00:30:07,039
original users of the the Netflix uh

413
00:30:05,039 --> 00:30:12,200
mailing system I remember that from way back wow no I never I never used that I

414
00:30:09,039 --> 00:30:15,039
heard about it I had OG Netflix wow

415
00:30:12,200 --> 00:30:19,880
where they sent it through the mail yeah I used Blockbuster right it up until it

416
00:30:17,240 --> 00:30:24,559
died I wasn't about to use that newfangled DVD mailing yeah well I always

417
00:30:22,919 --> 00:30:31,120
lived in the woods so and don't you live in the woods

418
00:30:27,919 --> 00:30:32,559
watch TechLinked and GameLinked etc

419
00:30:31,120 --> 00:30:38,840
that are going to keep happening that's the end of this episode because they're going to start the WAN Show soon and we

420
00:30:35,640 --> 00:30:43,320
suddenly have to abruptly end uh will AI

421
00:30:38,840 --> 00:30:45,159
kill your kids yes jeez Louise honestly

422
00:30:43,320 --> 00:30:49,159
we're weighing the parents and kids thing right now but one of them's going

423
00:30:46,519 --> 00:30:56,080
to go yep and you're going to be forced to adopt AI robot children or parents so

424
00:30:53,679 --> 00:31:00,720
just you know make peace with that Robo grandma and and make peace with the fact

425
00:30:58,919 --> 00:31:06,880
that this episode is over that's outro transition number two uh subscribe to

426
00:31:03,440 --> 00:31:09,559
TechLinked say bye-bye to Jessica okay

427
00:31:06,880 --> 00:31:14,600
subscribe and I'll see you on the next time

428
00:31:11,159 --> 00:31:14,600
Jess bye
