WEBVTT

00:00:00.000 --> 00:00:03.160
What did you say? AI, will it kill your parents?

00:00:03.160 --> 00:00:10.080
AI, will it kill your parents? Yeah. You know, there was a time when that question wouldn't make any sense, but sadly it does

00:00:10.080 --> 00:00:13.080
in the year 2023.

00:00:13.080 --> 00:00:17.680
Because AI has been crazy. There's been a lot of tech news happening this year and we're going to talk about it

00:00:17.680 --> 00:00:21.280
on this episode of TalkLinked! With... who? You? Riley Murdock.

00:00:21.280 --> 00:00:24.280
No, that's me. Okay. Jessica Peugeot.

00:00:24.280 --> 00:00:29.960
Jessica Peugeot, you're the WAN writer. I am the WAN writer and this is probably the first time listeners have heard how my

00:00:29.960 --> 00:00:33.280
last name is pronounced and I'm sorry, we had to break it to you this way.

00:00:33.280 --> 00:00:36.720
I mean, how else would you say it? Peugeot. People usually say Peugeot.

00:00:36.720 --> 00:00:41.720
Peugeot. You're the WAN writer, but you also write TechLinked.

00:00:41.720 --> 00:00:44.720
Yes. And GameLinked. And GameLinked.

00:00:44.720 --> 00:00:47.720
And Techquickie. And Techquickie. Yes.

00:00:47.720 --> 00:00:54.680
You've got your hands in all the pots now. I am multifaceted. I think there was a rumor before my name was released because I tended to jokingly put

00:00:54.680 --> 00:00:57.760
my name in the credits as "various AI."

00:00:57.760 --> 00:01:05.840
Yes. I think there was a rumor that I was actually an AI experiment because I just kept showing

00:01:05.840 --> 00:01:09.440
up in so many different video series. Maybe we all are.

00:01:09.440 --> 00:01:15.240
Who knows? Brains in vats, etc. Honestly, this year has made me question absolutely everything.

00:01:15.240 --> 00:01:19.640
We're going to go through our favorite stories of the year or sort of, we're going to talk

00:01:19.640 --> 00:01:24.320
about Jessica's first year here at LMG and cover some of, you know, the stories that

00:01:24.320 --> 00:01:31.280
really got us thinking. Now this episode, we're recording it the week before, but it's going to go up on Wednesday,

00:01:31.280 --> 00:01:38.640
December 27th, two days after our Christmas special. So if you want a proper big roundup of all the tech stories of the year and you didn't

00:01:38.640 --> 00:01:42.960
see that we released a Christmas special, go see that. But this is more of a casual situation.

00:01:42.960 --> 00:01:48.800
Jessica, how's it been? Had you tried to keep up with the tech news cycle in the way that you have had to this

00:01:48.800 --> 00:01:51.800
year before? Not like this. No.

00:01:51.800 --> 00:01:56.280
Like there were always things I kept up with. There were little trends that I would note.

00:01:56.280 --> 00:02:01.640
But if you want to know my previous experience following the news timeline, I learned about

00:02:01.640 --> 00:02:06.360
Bitcoin in early 2012, went, huh, that's interesting.

00:02:06.360 --> 00:02:11.520
I didn't pay attention to it again until 2019.

00:02:11.520 --> 00:02:16.000
There's always just been a lot of stuff that just passes me by because I wasn't specifically

00:02:16.000 --> 00:02:21.200
interested. Yeah. I mean, crypto, that's fair, honestly.

00:02:21.200 --> 00:02:28.040
For people who missed the whole crypto hype and then crash, it was only a few years.

00:02:28.040 --> 00:02:33.920
To me, it started here and it ended here. Everything in between that, there might have been an up, there might have been a down.

00:02:33.920 --> 00:02:37.760
To me, it was a straight line. Yeah. Well, okay.

00:02:37.760 --> 00:02:42.000
It's not fair to say that, sorry, I just said a second ago that it was only a few years.

00:02:42.000 --> 00:02:49.360
But really, crypto started way back in the early 2010s, or maybe before that, anyways.

00:02:49.360 --> 00:02:53.160
But they really started gaining hype around 2012, 13 and stuff.

00:02:53.160 --> 00:02:58.040
Then it kind of like lay dormant and then it spiked in the last few years and just in

00:02:58.040 --> 00:03:04.200
time for it to kind of fall out of favor. I mean, last year, I rewatched some of our Christmas specials in preparation for this

00:03:04.200 --> 00:03:10.280
year's, and one of the main stories last year was how crypto basically just fell off the

00:03:10.280 --> 00:03:14.480
map. At the beginning of 2022, a bunch of companies were still doing stuff with it.

00:03:14.480 --> 00:03:17.960
They announced integrations with it, a lot of gaming companies in particular.

00:03:17.960 --> 00:03:21.840
Then by the end of the year, they had all kind of canceled it. Yeah.

00:03:21.840 --> 00:03:26.920
I remember Wealthsimple sending me an email a year and a half ago about their crypto

00:03:26.920 --> 00:03:31.320
integration. Yeah. Well, it's still integrated in a lot of things.

00:03:31.320 --> 00:03:34.760
I don't want to say that it's dead. It's not. It's absolutely not.

00:03:34.760 --> 00:03:38.760
But in terms of its mainstream... For the average person, crypto was fast.

00:03:38.760 --> 00:03:41.760
Yeah. Yeah. Crypto was very fast.

00:03:41.760 --> 00:03:50.040
It was a flash in the pan. But it was replaced by AI, which is like the hugest thing this year.

00:03:50.040 --> 00:03:55.080
But way bigger hype. Before we get there, how was it keeping up with everything?

00:03:55.080 --> 00:03:59.960
You said that you were aware of some things before, but was it a breakneck speed?

00:03:59.960 --> 00:04:06.840
Because I've been doing this for 10 years, so I want to hear what it was like for you.

00:04:06.840 --> 00:04:13.120
It was a lot of reading stuff and not knowing if it was important or not, because sometimes

00:04:13.160 --> 00:04:16.160
you'd read something and you're like, that sounds really important.

00:04:16.160 --> 00:04:21.280
And then Riley would go, yeah, they always do that. Yeah, this is actually the sixth time.

00:04:21.280 --> 00:04:24.800
Or I'd read something and I'm just like, that doesn't seem important at all.

00:04:24.800 --> 00:04:28.360
And Riley would be like, this is game changing.

00:04:28.360 --> 00:04:31.680
And I'm like, really? Yeah. Yeah.

00:04:31.680 --> 00:04:35.600
I just, a big part of the issue is not that I didn't understand what was happening, but

00:04:35.600 --> 00:04:41.680
I did not have the right amount of context to understand what people considered important.

00:04:41.680 --> 00:04:47.200
Yeah. It was always hard, especially when you're writing for WAN Show.

00:04:47.200 --> 00:04:53.760
And with TechLinked, we do our best, but WAN Show, we are preparing things.

00:04:53.760 --> 00:04:57.960
And before you came on, the whole writing team kind of worked together on it, but I

00:04:57.960 --> 00:05:02.880
was curating the topics. And now this is your burden.

00:05:02.880 --> 00:05:09.400
And it is a struggle to kind of guess what Linus and Luke will particularly find interesting,

00:05:09.400 --> 00:05:14.400
as opposed to what we think maybe the broad tech audience on YouTube will find interesting.

00:05:14.400 --> 00:05:18.680
Sometimes I feel like I have a Ouija board, or I have a crystal ball.

00:05:18.680 --> 00:05:24.360
And I'm just going like, okay, well, this is objectively important, but I don't know.

00:05:24.360 --> 00:05:29.840
We have a little spreadsheet set up, and I think I've gotten a lot better at guessing,

00:05:29.840 --> 00:05:35.960
but it's still, sometimes I will get it back from Linus, and I will be baffled by what

00:05:35.960 --> 00:05:39.720
returns to me. I'm like, you want to talk about that? That's a main story to you.

00:05:39.720 --> 00:05:44.480
Yeah, yeah. Okay, because I thought maybe I'd have to write three lines on that.

00:05:44.480 --> 00:05:48.200
Yeah, we'll put something in there. We'll be like, this is the biggest news of the week.

00:05:48.200 --> 00:05:52.120
You guys need to talk about this for half an hour. He's like, eh. World changing.

00:05:52.120 --> 00:05:56.520
I want to talk about this Final Fantasy bug that's really annoying me.

00:05:56.520 --> 00:06:00.920
But to your point a second ago about like having the context, it is tough because we

00:06:00.920 --> 00:06:04.840
hired you and we hired Jacob close to a year ago.

00:06:04.840 --> 00:06:11.120
Yes. You guys have been, you know, getting on the team and becoming embedded and like gaining

00:06:11.120 --> 00:06:15.520
that kind of context that I've had for a while.

00:06:15.520 --> 00:06:23.280
And it's, yeah, it's a whole process. Admittedly, I no longer find myself googling, what is red green team?

00:06:23.280 --> 00:06:26.280
Yeah, exactly. Question mark?

00:06:26.280 --> 00:06:32.640
Yeah. But I want to say honestly that the difference in your, like, kind of ambient

00:06:32.640 --> 00:06:38.840
knowledge for some of this stuff, the difference between you now and you when you started, is so vast.

00:06:38.840 --> 00:06:42.120
Like now I'll just talk about a tech thing and you'll just be like, yeah, you know.

00:06:42.120 --> 00:06:45.640
Oh, I am a sponge. You're up on it. I am a serious sponge.

00:06:45.640 --> 00:06:50.160
That was kind of what James said to me when he hired me was like, you obviously don't

00:06:50.160 --> 00:06:54.720
have the background, but you're smart. You'll figure it out. I'm like, what does that mean?

00:06:54.720 --> 00:06:58.720
Oh, hold on a second. I'm rethinking things here.

00:06:58.720 --> 00:07:01.720
Yeah. You didn't pick up on that? Okay.

00:07:01.720 --> 00:07:04.720
I'm like, that sounds ominous. Am I in danger? Don't worry.

00:07:04.720 --> 00:07:07.720
You'll be fine. Swim. Yeah.

00:07:07.720 --> 00:07:10.840
So, I mean. I do tend to have a pretty good grasp of technology.

00:07:10.840 --> 00:07:15.480
It's just that most of the technology I'm interested in was invented before everyone who is currently

00:07:15.480 --> 00:07:21.720
alive was born. Right. Because prior to your job here, you did write blog posts.

00:07:21.720 --> 00:07:25.720
Absolutely. About... and you did deep dives. On history topics.

00:07:25.720 --> 00:07:29.320
Yeah. Which involves technology a lot of times. Which absolutely involves technology.

00:07:29.320 --> 00:07:33.120
Yeah, for sure. Which is I think why, I mean, I've loved your Techquickie scripts.

00:07:33.120 --> 00:07:36.240
I think that you do a deep dive and you make them funny.

00:07:36.240 --> 00:07:43.000
You're a stand-up comedian in case anyone doesn't know. I'm currently sitting, but you know, you just gotta imagine me like two feet up.

00:07:43.000 --> 00:07:48.560
That's the kind of jokes we're looking for. That you can look forward to.

00:07:48.560 --> 00:07:52.240
I cannot tell the kind of jokes that I tell on stage in this context.

00:07:52.240 --> 00:07:57.560
Jacob wrote a sit-stand joke into today's TechLinked, just like, they will not stand

00:07:57.560 --> 00:08:03.960
for that. But if you want to sit for that... This just reminds me of the time when, like,

00:08:03.960 --> 00:08:08.640
I was finishing up a joke for TechLinked and I heard you from all across the room telling Jacob

00:08:08.640 --> 00:08:14.160
off for having two jokes in one script that were about shoving objects in someone's butt.

00:08:14.160 --> 00:08:19.560
As I was finishing a joke about shoving objects in people's butts.

00:08:19.560 --> 00:08:23.000
I mean, hey, it's a funny part of the body.

00:08:23.000 --> 00:08:26.000
You know, like it's a. It's inherently humorous. I mean, what do you have?

00:08:26.000 --> 00:08:30.280
What are you supposed to do? Not make jokes about it? But simmer, right?

00:08:30.280 --> 00:08:34.920
So on that note, yeah, let's talk about our favorite tech stories this year.

00:08:34.920 --> 00:08:40.360
For me, it's definitely one of the biggest ones, and one of my favorites is AI, just because

00:08:40.360 --> 00:08:45.880
I've been doing my darndest this year to make some progress on a TechLonger about it.

00:08:45.880 --> 00:08:51.440
And it's been a very, very long road to find the time to

00:08:51.440 --> 00:08:55.400
do it. But I've been like saving so many links. It's crazy.

00:08:56.400 --> 00:09:00.480
Yeah. How do you feel about AI? Is it going to be the end of humanity?

00:09:00.480 --> 00:09:04.400
Is it going to be the bright optimistic future?

00:09:04.800 --> 00:09:08.560
I think the truth is somewhere in between. I know, I know.

00:09:08.560 --> 00:09:11.440
But of course, that's why you're presenting the dichotomy.

00:09:12.040 --> 00:09:18.040
There's been a lot of both euphoria and hysteria about AI.

00:09:18.400 --> 00:09:22.080
I think it's genuinely incredibly cool. It really is.

00:09:22.080 --> 00:09:29.920
It's incredibly cool. But the danger is that there are people who think about it in the same way as people thought

00:09:29.920 --> 00:09:37.280
about crypto. Exactly. And I feel like, and this is, I was talking earlier about

00:09:37.280 --> 00:09:43.520
grifters, a lot of people's reaction to whatever is drawing the most attention in a

00:09:43.520 --> 00:09:49.160
media space is just to immediately try to, like, pick it up like it's a piggy bank

00:09:49.160 --> 00:09:52.240
and try to shake money out of it. Yeah.

00:09:52.280 --> 00:09:56.440
Regardless of why; there's no real thought behind it.

00:09:56.840 --> 00:10:00.200
And I feel like that's a lot of the current danger about AI.

00:10:00.200 --> 00:10:04.400
Like I'm not really worried about human extinction at this point.

00:10:04.920 --> 00:10:10.280
I am worried about people throwing away people's jobs because they think they can

00:10:10.280 --> 00:10:15.000
replace them with what is essentially an advanced autocorrect.

00:10:15.280 --> 00:10:18.800
Well, I'm worried about people, you know, just

00:10:19.480 --> 00:10:27.320
flooding the zone of conversation with, like, AI slurry and that causing

00:10:27.360 --> 00:10:30.680
like downstream problems for basic communication.

00:10:30.720 --> 00:10:37.040
Yeah. I mean, that's the issue, though, is that you just compared, like, the risk of extinction.

00:10:37.080 --> 00:10:39.640
Like you're not worried about that. You're worried about the money thing.

00:10:40.160 --> 00:10:46.040
And I think that, unfortunately, it's the people trying to commercialize it that could

00:10:46.280 --> 00:10:50.440
lead to the extinction. I mean, it's so interesting, because we're doing it to ourselves.

00:10:50.440 --> 00:10:54.240
Yeah. I'm not worried about the AI. I'm worried about the people using it.

00:10:54.480 --> 00:10:58.480
A tool is a tool. A hammer is a hammer.

00:10:58.640 --> 00:11:04.120
I mean, yeah. And thankfully, they're not at the level of like being actual autonomous agents

00:11:04.120 --> 00:11:09.000
right now. They are just tools and it's unclear if they will ever get to the point where

00:11:09.000 --> 00:11:12.520
they could be described as, like, an autonomous agent in the way you

00:11:12.520 --> 00:11:17.560
think of, like, a sci-fi robot. Like, we don't know if the HAL 9000 thing

00:11:17.800 --> 00:11:23.160
is currently possible. Yeah. I mean, there are some things that have been made.

00:11:23.160 --> 00:11:27.040
We reported on Auto-GPT at some point in the middle of the year, where they could

00:11:27.040 --> 00:11:33.080
string together a few chatbots, where, like, you give it an overarching goal and

00:11:33.080 --> 00:11:36.200
then it prompts itself along towards that goal.

00:11:36.440 --> 00:11:41.840
But I think that recently it being able to create other smaller versions of

00:11:41.840 --> 00:11:47.440
itself, that was crazy. That was a crazy recent story, where, like, AIs can give birth to other AIs.

00:11:48.120 --> 00:11:53.240
Yeah. And then we put them in our shoes. We'll link to the episode for that so you have more context.

00:11:53.560 --> 00:11:57.240
But yeah, so one of the most interesting things to me about the AI

00:11:57.240 --> 00:12:05.920
thing, the whole AI spectacle, is the fact that OpenAI, their whole mission

00:12:05.920 --> 00:12:09.760
statement from the beginning was to not be a commercial entity.

00:12:09.800 --> 00:12:13.280
Exactly. They were like, we're not going to go down the commercial route.

00:12:13.440 --> 00:12:16.480
We're just going to do research. Science. Yeah.

00:12:16.760 --> 00:12:23.000
And we're going to do research so that we can stop the bad commercial people who

00:12:23.000 --> 00:12:25.960
are going to screw up the world and then we'll be able to stop them.

00:12:26.760 --> 00:12:32.440
They are the people who put this thing out into the world. Like, ChatGPT was the

00:12:32.480 --> 00:12:38.120
impetus for all of this. And it was OpenAI, the company who said that they would protect the world.

00:12:39.000 --> 00:12:44.240
But, you know, and you could make an argument that like, okay, AI is too

00:12:44.240 --> 00:12:50.280
powerful for one company to control. So they had to get it out there so that people could like see what's happening

00:12:50.280 --> 00:12:55.320
and do stuff with it, and all this stuff. But OpenAI is not doing it open source.

00:12:55.320 --> 00:12:59.200
They're doing it proprietary. Exactly. Meta is doing open source stuff.

00:12:59.360 --> 00:13:03.480
There's a lot of good, interesting stuff going on in the open source community.

00:13:03.480 --> 00:13:08.840
I mean, Mistral is pretty close, like it's behind Claude, which is

00:13:08.840 --> 00:13:13.160
just behind ChatGPT. But like, I don't know.

00:13:13.160 --> 00:13:17.000
So, like, to your point about the conflict between those two

00:13:17.000 --> 00:13:21.120
drives, there is that conflict, but it's also so complicated, because the

00:13:21.120 --> 00:13:25.200
companies that are saying that they're here to protect us are the companies who are trying to commercialize it.

00:13:26.080 --> 00:13:30.240
And commercial interests, it's kind of like a whirlpool.

00:13:30.280 --> 00:13:33.760
Like it has its own gravity. It has its own momentum.

00:13:33.760 --> 00:13:38.560
It sucks you towards it, whether you like it or not, whatever your original

00:13:38.560 --> 00:13:43.800
intentions were. Yeah. Let's talk about sort of the details of AI a little bit.

00:13:43.800 --> 00:13:49.680
Like, how long did it take you... it took me a long time to wrap my head

00:13:49.680 --> 00:13:54.120
around the idea that this is like a neural network.

00:13:54.120 --> 00:13:57.040
Like, do you... let's get into the deep stuff.

00:13:57.240 --> 00:14:01.480
Do you think it'll ever be conscious? I like to think that that's plausible.

00:14:01.480 --> 00:14:05.120
Like there's a part of me that finds that very compelling as an idea.

00:14:05.480 --> 00:14:10.200
I think because of how much, like, sci-fi writers have been writing about

00:14:10.200 --> 00:14:16.160
the idea of autonomous agents, like I, Robot, like all of this, these ideas,

00:14:16.160 --> 00:14:21.200
this grand cultural weight of the idea of like, what if we could create life?

00:14:21.760 --> 00:14:26.360
What if human beings could create a consciousness that is analogous to ourselves?

00:14:26.560 --> 00:14:32.360
Right. Now, to be clear, I don't actually think it's that plausible that we will

00:14:32.360 --> 00:14:37.640
create something that is like human beings. No, I don't think that's... I think I agree.

00:14:37.880 --> 00:14:40.160
I think it's fundamentally different.

00:14:40.920 --> 00:14:46.640
Yeah. We might create like a super intelligent, crazy, like a thousand years from now.

00:14:46.800 --> 00:14:50.360
Possible, plausible even, but it's not going to be like us.

00:14:50.680 --> 00:14:56.840
Yes. It's not going to be analogous to us. The thing is, I would never feel comfortable saying no, it will never happen.

00:14:56.880 --> 00:15:03.360
Right. I think that is a fundamentally arrogant, like intellectually dishonest position to take.

00:15:03.440 --> 00:15:10.640
Yeah. You can say that it's unlikely. Honestly, that it's nowhere near in the near future, but not no.

00:15:10.800 --> 00:15:16.680
Yeah. Honestly, I completely go back and forth. I'm like, I think at the beginning, I was like, it's not conscious.

00:15:16.680 --> 00:15:21.240
It's just the whatever. And then like, I saw some stuff and I was like, whoa, this is actually pretty crazy.

00:15:21.240 --> 00:15:25.280
And if you look at it structurally, you're like, this is structurally very, very similar

00:15:25.280 --> 00:15:31.560
to the way that neurons work in the human brain. But then it's like, oh, there's all these levels of complexities that, you know,

00:15:31.600 --> 00:15:38.280
it's like this is a computer. It's not like a physical like neuron with a myelin sheath and everything.

00:15:38.280 --> 00:15:41.640
Like there's so many complex interactions that we don't even understand the brain.

00:15:42.000 --> 00:15:46.360
And now we're suddenly doing psychology on computers. See, that was exactly what I was thinking.

00:15:46.360 --> 00:15:50.680
I would just like to say thank you, just now. It's just like, we don't even understand why we are conscious.

00:15:50.680 --> 00:15:54.600
Yeah, exactly. Like, that's an emergent property of our brains.

00:15:54.600 --> 00:16:00.040
Yeah. Like, you cannot point to the spot in the brain where consciousness lives.

00:16:00.120 --> 00:16:04.080
Right. You don't know. We know where our personalities are. It's right up front here.

00:16:04.240 --> 00:16:08.480
We know where our sight is. It's in the back. We do not know why we have a sense of self.

00:16:09.040 --> 00:16:15.600
Exactly. We may never. We may never. Like we are trying to understand something that is exactly as complicated as our own

00:16:15.600 --> 00:16:18.160
brains with our own brains. Yeah.

00:16:18.680 --> 00:16:23.680
Yeah. So, I think the thing that was easiest for me, because, like,

00:16:23.680 --> 00:16:29.440
it was wild at first, looking at some of this stuff, I think the kinds of comparisons

00:16:29.440 --> 00:16:33.640
that helped me the most was, like...

00:16:33.640 --> 00:16:37.200
This is a very advanced form of pattern recognition.

00:16:37.520 --> 00:16:45.360
Oh yeah. Like that's what this thing is doing. And like, it kind of reminds me a lot of like the hallucinations and like odd behavior.

00:16:45.360 --> 00:16:53.720
Oh my gosh. Yeah. Like, when Sydney came out, it's just so obvious that we trained, like,

00:16:53.720 --> 00:16:57.480
LLMs on the internet because it just got so emotional.

00:16:57.520 --> 00:17:00.640
It's just like, we've created life and given it a personality disorder.

00:17:02.400 --> 00:17:10.720
Immediately. I was trying to find the articles from the contemporary discussion of it.

00:17:13.320 --> 00:17:16.680
I'm not sure what you mean by that. The articles that came out at the time.

00:17:16.840 --> 00:17:21.720
Yeah. Yeah. Yeah. I was trying to find articles from the time when, like, people found out about

00:17:21.720 --> 00:17:24.640
Sydney, the code name that like Bing Chat had.

00:17:25.120 --> 00:17:29.440
And I was trying to find the articles where they, or where Sydney came up with other

00:17:29.440 --> 00:17:32.880
personalities. Yes. There were like multiple personas.

00:17:32.880 --> 00:17:36.680
Yeah. Somebody asked Sydney like, who else is in there with you or whatever.

00:17:36.680 --> 00:17:41.440
And they, she, he, it's, I don't know, described all these different personalities.

00:17:41.440 --> 00:17:44.640
And I'm like, what? That was wild. That was in the early days.

00:17:44.760 --> 00:17:49.200
Very early days. And part of the issue, it reminds me. And by early days, I mean like March.

00:17:49.200 --> 00:17:52.240
March. This has moved so fast. It's wild.

00:17:52.360 --> 00:17:58.440
It's just a whole new reality. And part of the issue is it reminds me of the problem you get,

00:17:58.480 --> 00:18:03.400
like, with children trying to discuss, like, crimes when they're

00:18:03.400 --> 00:18:08.360
witnesses to something, because you can end up in a situation where the adult is

00:18:08.360 --> 00:18:12.680
just kind of like trying to get a specific answer out of the kid.

00:18:13.120 --> 00:18:18.840
And, like, the kid just doesn't know what they want, so they just start responding to what's getting a reaction.

00:18:18.880 --> 00:18:23.680
Yeah. That's where I feel like a lot of the weirdest stuff that AI does

00:18:24.040 --> 00:18:27.080
comes from. It's just, it's a mirror.

00:18:27.240 --> 00:18:31.480
Yeah. Yeah. It is feeding us back what we're responding to.

00:18:31.520 --> 00:18:36.080
Yeah. I think a turning point for me thinking about AI and what it means and how it works

00:18:36.080 --> 00:18:39.520
and stuff was when people started, like there was the initial wave of hype.

00:18:39.800 --> 00:18:43.320
And obviously, you know, I didn't get fully swept into all this.

00:18:43.320 --> 00:18:47.720
I think like I keep kind of a more detached like skeptical stance on it.

00:18:47.960 --> 00:18:51.640
But I have to follow the story as it's like developing.

00:18:51.640 --> 00:18:54.600
So there were people who were like, it's alive.

00:18:54.880 --> 00:19:02.440
And then sometime after that, sentiments that got more prevalent were people

00:19:02.440 --> 00:19:07.440
saying this kind of thing where it's like the AI is saying crazy stuff because

00:19:07.440 --> 00:19:11.600
we're asking it to say crazy stuff. Like, it's a mirror.

00:19:11.680 --> 00:19:16.320
Yes. And so, you know, we are responding most to when it is craziest.

00:19:16.360 --> 00:19:20.880
Yeah. Yeah. Now, I would love for this whole thing to be about AI, but we

00:19:20.880 --> 00:19:25.040
should probably mention some other stories that we enjoyed this year.

00:19:26.160 --> 00:19:29.520
I have so many, but they're all very odd. Yeah.

00:19:29.600 --> 00:19:33.840
Well, I mean, those are the perfect ones. I love the really odd ones.

00:19:35.040 --> 00:19:42.720
What's an odd one? I absolutely lost my mind when there was a Proton port of Chex Quest.

00:19:43.200 --> 00:19:46.880
Do you mean that like it was hilarious to you that someone went out of their way

00:19:46.880 --> 00:19:51.320
to, like, ensure that Chex Quest had support for Proton?

00:19:51.360 --> 00:19:55.600
Yes. The compatibility layer for running Windows games on Linux.

00:19:55.680 --> 00:19:58.040
Absolutely. That was magical to me.

00:19:58.720 --> 00:20:04.480
I love people's odd little projects. Yeah, you did a whole segment on WAN recently.

00:20:04.520 --> 00:20:09.680
Yeah. And it went over pretty well, the retro branded games.

00:20:10.000 --> 00:20:14.960
Yeah, retro branded games. I feel like, and this is coming from me as someone

00:20:14.960 --> 00:20:20.920
who really loves history, we get this idea that past cultures were far more serious

00:20:20.920 --> 00:20:25.200
than we are, but the problem is it's a survivorship bias thing.

00:20:25.840 --> 00:20:30.400
There's a lot of garbage that we just threw out because it was garbage.

00:20:30.880 --> 00:20:36.000
But it's kind of magical when some of that stuff survives into the present

00:20:36.400 --> 00:20:40.720
and you get to get a glimpse into like this is what we were making 20 years ago.

00:20:40.720 --> 00:20:44.440
Oh, man. Yeah. I love ads for games for that reason.

00:20:44.440 --> 00:20:52.480
Like really, really old advertisements. Like I don't like modern ads at all, but like really old quirky advertisements

00:20:52.480 --> 00:20:56.960
are hilarious to me. That's one thing that's amazing about the Internet, the fact

00:20:56.960 --> 00:21:04.240
that you can find that stuff. And yeah, sometimes when I'm researching a story, I'll go and look for stuff

00:21:04.240 --> 00:21:10.200
that happened, you know, early 2000s or whatever. And a lot of the time... I mean, some of it's being deleted.

00:21:10.640 --> 00:21:16.680
There was a whole story about, what was it? CNET started like deleting their old articles off the Internet.

00:21:16.680 --> 00:21:19.880
Deleting their old articles for SEO reasons.

00:21:20.120 --> 00:21:23.960
And like every SEO expert was like, no, don't do that.

00:21:23.960 --> 00:21:27.800
Yeah. For the most part, a lot of the old stuff is still there.

00:21:27.800 --> 00:21:32.360
So you can go and do a Google search and it's like a weird experience

00:21:32.360 --> 00:21:38.120
going on a forum, you know, chat boards or whatever from like early 2000s

00:21:38.480 --> 00:21:43.360
and seeing how people wrote. It's like you can tell cultural differences.

00:21:43.360 --> 00:21:46.680
It's like digging into the archives, you know, in like one of these movies

00:21:46.680 --> 00:21:52.280
where they do research or whatever. I feel like I'm in another world, or like it's 20 years ago.

00:21:52.360 --> 00:21:55.960
You're just in there like Indiana Jones snooping on their little forum posts.

00:21:55.960 --> 00:22:00.120
Yeah. Trying to grab an interesting snippet and then run away.

00:22:00.120 --> 00:22:06.200
And then run away. Yeah. Well, that's how I felt when I was digging through... like, I found a huge archive

00:22:06.200 --> 00:22:10.800
of, like, stuff that had never originally been on the Internet.

00:22:10.840 --> 00:22:14.920
It was uploaded in January 2002.

00:22:14.920 --> 00:22:20.520
And it was news articles from the early aughts and the late '90s.

00:22:20.960 --> 00:22:26.680
Wow. And I thought that stuff was amazing. Apparently Amazon was engaging in union busting.

00:22:27.960 --> 00:22:31.960
So weird. Wild. What a different time than today.

00:22:33.320 --> 00:22:40.640
Microsoft was being investigated for antitrust. Oh, man, I got to say one of my favorite stories.

00:22:40.640 --> 00:22:45.160
Will technology kill us all? It was very fun in early 2000.

00:22:45.160 --> 00:22:53.200
Silly concerns they had at the time. I got to say, one of my favorite stories this year was... you know, OK.

00:22:53.200 --> 00:22:56.920
I'll say it's my favorite. It's one of my favorite stories, but it's also one where I'm starting to kind of

00:22:56.920 --> 00:23:03.520
be like, hmm, I don't know if this is a good thing. All the antitrust action by governments.

00:23:03.880 --> 00:23:09.520
Obviously, we had the Epic Games lawsuit against Apple a couple of years ago.

00:23:09.560 --> 00:23:16.560
And now Google this year, that happened as well. But, like, the EU put a lot of their legislation into effect.

00:23:17.560 --> 00:23:21.480
Apple actually launched an iPhone with USB-C this year.

00:23:22.080 --> 00:23:27.360
They announced plans to allow sideloading and other app stores.

00:23:27.400 --> 00:23:32.320
I think they have to do that by next year. I'd have to look it up, but they'll be doing it at some point.

00:23:32.680 --> 00:23:36.760
We got this whole Beeper versus Apple with the iMessage compatibility thing

00:23:36.800 --> 00:23:43.080
that's kind of opening up just now. But the reason it makes me be like, hmm, hold on a second.

00:23:43.560 --> 00:23:49.120
is the fact that, like, a lot of this is happening because of government regulation.

00:23:49.160 --> 00:23:53.920
Yes. And I think that I'm wary of a turning point in which

00:23:54.240 --> 00:24:00.480
we've been reporting on all this EU stuff happening. We're like, yeah, get Apple, make Apple open up their platform, et cetera, et cetera.

00:24:00.760 --> 00:24:04.920
Yeah. And I'm worried that there's going to be a tipping point where the EU's

00:24:05.080 --> 00:24:09.760
like, their regulations start rubbing us the wrong way. Yeah, well, because they just keep going.

00:24:10.360 --> 00:24:14.400
And part of the problem is that, like, I think we've seen through a lot of our

00:24:14.400 --> 00:24:17.720
reporting, and a lot of other people's reporting, that

00:24:18.520 --> 00:24:22.280
often regulators don't understand technology very well.

00:24:22.840 --> 00:24:28.560
And often, we're very skeptical about, like, Apple

00:24:28.920 --> 00:24:32.400
jumping on in support of, like, right to repair legislation.

00:24:33.120 --> 00:24:37.360
Me personally, I find that very optimistic. One, it shows a turning point.

00:24:37.360 --> 00:24:44.000
It shows that they see that as the winning team. But also I do want Apple to actually be at that table.

00:24:44.840 --> 00:24:50.320
I do want them to be at that table because they will know when it starts going too far.

00:24:51.280 --> 00:24:54.880
You know, will they? I mean, I don't know. That's a hard thing.

00:24:54.880 --> 00:25:00.160
But, like, it's one of those things where

00:25:00.160 --> 00:25:03.360
you can never really predict at this moment what things are going to look

00:25:03.360 --> 00:25:06.600
like in five years, what's going to go too far, what's going to have these

00:25:06.600 --> 00:25:12.800
unintended consequences. So what I would prefer is a balance between different powers.

00:25:13.640 --> 00:25:20.560
Like this kind of like shaky back and forth, I feel is mostly a good thing.

00:25:20.560 --> 00:25:23.800
It's like that Churchill quote, you know.

00:25:23.800 --> 00:25:28.920
Democracy is the worst possible form of government, except for all the other ones

00:25:28.920 --> 00:25:31.960
that have been tried. Yeah, yeah. Like I do.

00:25:31.960 --> 00:25:37.920
Well, hold on. You're not taking a stance here. I mean, I'm just imagining comments.

00:25:37.920 --> 00:25:44.520
That's all. My stance is that I would prefer no one to have complete

00:25:44.520 --> 00:25:49.920
sovereignty over me. Oh, yeah. And I feel like most people would agree with that.

00:25:49.960 --> 00:25:54.760
Regardless of like where they specifically fall, hot take, unless,

00:25:54.800 --> 00:26:00.080
unless it's a, you know, hyperintelligent supreme AI overlord who knows what's best

00:26:00.080 --> 00:26:03.400
for you. I for one, welcome our new Sydney overlords.

00:26:05.200 --> 00:26:11.800
I just, I don't like anybody having too much power. So I would like a little bit of back and forth.

00:26:11.920 --> 00:26:16.520
One of the big things for me was YouTube.

00:26:17.200 --> 00:26:22.320
Okay, both the crackdown on ad blockers and the earlier, I feel like at this

00:26:22.320 --> 00:26:29.760
point, mostly in the background, um, crackdowns on vulgarity, yeah, and like

00:26:29.760 --> 00:26:33.960
there's always that uncomfortable relationship between advertisers,

00:26:33.960 --> 00:26:38.480
like the people who are mostly financing these platforms and the basic

00:26:38.480 --> 00:26:41.600
reality that vulgarity is a normal part of human communication.

00:26:42.240 --> 00:26:46.040
Yeah, yeah. I mean, that's how it is. Yeah, that was really interesting.

00:26:46.040 --> 00:26:49.960
That whole thing of changing their, uh, not terms and conditions,

00:26:49.960 --> 00:26:53.160
but their guidelines, their ad-friendly guidelines, um,

00:26:53.480 --> 00:26:58.280
yeah, at the end of 2022. But that story kind of, like, broke more... like, it

00:26:58.280 --> 00:27:02.160
happened in November 2022, and then it broke more early this year.

00:27:02.640 --> 00:27:05.880
And yeah, that was only the start. YouTube had a crazy year.

00:27:05.880 --> 00:27:10.480
As you say, they're, like, fighting ad block. Many ad blockers are broken,

00:27:10.480 --> 00:27:15.120
although now many of them are working again, but that war has gotten more

00:27:15.160 --> 00:27:19.920
intense than ever this year. And they also added tests, like that...

00:27:19.960 --> 00:27:25.520
Well, actually, 1080p Premium on YouTube is not a test anymore.

00:27:25.520 --> 00:27:28.920
It's just a thing. They launched it. I guess their test went okay.

00:27:28.920 --> 00:27:35.440
People didn't really complain about it. They said specifically that it's not degrading the non-premium experience.

00:27:35.440 --> 00:27:39.320
It's only that the premium people get better, which I guess you have to take

00:27:39.320 --> 00:27:42.880
their word for it unless you do some crazy testing. I haven't heard any complaints.

00:27:42.880 --> 00:27:47.400
So maybe it's fine. Um, Shorts started getting revenue this year.

00:27:47.440 --> 00:27:52.240
They did. Uh, what else? Oh, there was one more. The ads, they started doing these experiments.

00:27:52.280 --> 00:27:56.400
Another test they were doing was, like, front-loading, like, 10 ads in front

00:27:56.400 --> 00:28:00.040
of a video instead of like spacing it out. People did not like that.

00:28:00.040 --> 00:28:04.760
So they haven't launched that, but yeah, YouTube has changed a lot.

00:28:05.280 --> 00:28:09.920
I think one thing, I want to end on this, um, because I'd

00:28:09.920 --> 00:28:13.480
completely forgotten about this until I started looking through the stories for the Christmas special.

00:28:13.920 --> 00:28:17.120
Do you remember the Chinese spy balloon? Yeah.

00:28:17.120 --> 00:28:20.600
Yeah, right? I was like, what? That was this year. That was this year.

00:28:20.600 --> 00:28:23.560
I completely forgot about it. The world was going nuts for a second.

00:28:23.920 --> 00:28:28.280
Yeah, we were losing our minds. Yeah. They shot down one.

00:28:28.280 --> 00:28:31.440
Yeah, they shot it down. Yeah, there was a couple.

00:28:31.840 --> 00:28:34.520
Oh yeah. Yeah. That got shot down over Canada.

00:28:35.400 --> 00:28:40.680
Really? Yeah. Oh yeah. Yeah. Like the Canadian government had to give them permission to fly into our

00:28:40.680 --> 00:28:44.640
airspace, in case you couldn't tell. I haven't finished the TechLinked Christmas special.

00:28:44.640 --> 00:28:48.760
So we're like reacting to finding out about this stuff for the second time, you

00:28:48.760 --> 00:28:56.280
know, for the first time, if that makes sense. Uh, so this will be a whole proper roundup in the TechLinked Christmas

00:28:56.280 --> 00:29:01.760
special, which you probably have already seen, maybe, if that came out on a Monday.

00:29:02.080 --> 00:29:04.760
Anyways, it's also been a rough year for physical media.

00:29:05.560 --> 00:29:11.520
Oh yeah. Wait, what do you mean? Oh, the dropping of DVDs and CDs in various stores.

00:29:11.520 --> 00:29:15.040
Well, yeah, Best Buy announced plans to drop them next year.

00:29:15.160 --> 00:29:20.480
Yes. Uh, and, uh, Netflix will no longer be sending DVDs in the mail, which is

00:29:20.480 --> 00:29:21.880
just devastating for me.

00:29:24.160 --> 00:29:29.040
I actually was one of the original users of the, uh, the Netflix, uh, mailing

00:29:29.040 --> 00:29:32.480
system. I remember that way back. Wow. No, I never, I never used that.

00:29:32.480 --> 00:29:37.240
I heard about it. OG Netflix, where they sent it in the mail.

00:29:37.440 --> 00:29:41.160
Yeah. I used Blockbuster right up until it died.

00:29:41.640 --> 00:29:45.840
I wasn't about to use that newfangled DVD-by-mail thing.

00:29:45.920 --> 00:29:47.840
Yeah. Well, I always lived in the woods. So.

00:29:49.040 --> 00:29:54.760
And if you don't live in the woods, watch TechLinked, GameLinked, etc.,

00:29:54.760 --> 00:29:58.120
which are going to keep you up on what's happening. That's the end of this episode, because they're going to start the

00:29:58.120 --> 00:30:00.760
WAN Show soon and we suddenly have to abruptly end.

00:30:01.160 --> 00:30:04.320
Uh, will AI kill your kids? Yes.

00:30:05.200 --> 00:30:09.600
Jeez Louise. Honestly, we're weighing the parents and kids thing right now, but

00:30:09.600 --> 00:30:16.520
one of them's going to go. Yep. And you're going to be forced to adopt AI robot children or parents.

00:30:17.000 --> 00:30:20.000
So just, you know, make peace with that. Robo-grandma.

00:30:20.640 --> 00:30:25.560
And make peace with the fact that this episode is over. That's outro transition number two.

00:30:25.600 --> 00:30:29.120
Uh, so subscribe to TechLinked and say bye-bye to Jessica.

00:30:29.760 --> 00:30:33.880
Okay. Subscribe, and I'll see you next time.

00:30:34.080 --> 00:30:36.120
Jessica. Bye.
