WEBVTT

00:00:00.000 --> 00:00:04.200
Are you good? Yeah, Elijah's out there once and then now we have to deal with it forever.

00:00:04.200 --> 00:00:07.700
I feel like I should crash the WAN Show again at some point. Yeah, that was fun.

00:00:07.700 --> 00:00:14.500
That was fun. It's set up. That was for Riley Week to like promote Riley Week, but well, but I didn't have the chair.

00:00:14.500 --> 00:00:18.500
I came in and I just leaned over you guys. You should crash the WAN Show for Luke Week.

00:00:18.500 --> 00:00:24.700
Oh, okay. We should deep fake me to be Riley, and then you should crash the WAN Show as me.

00:00:24.700 --> 00:00:28.700
I should, I should just, I should, I should like put on more of a beard.

00:00:29.700 --> 00:00:35.700
And not wear my glasses and just like sit there and try to do a Luke impression and then Linus.

00:00:35.700 --> 00:00:38.700
And then Linus will be like, not even notice. He won't even notice.

00:00:39.200 --> 00:00:42.200
No way Linus might not. And then he crashes me. Yeah, yeah, yeah.

00:00:42.200 --> 00:00:45.200
He'll notice. We should do it. We should do that.

00:00:45.200 --> 00:00:47.200
Sorry that I just did an impression of your voice.

00:00:48.200 --> 00:00:52.200
Feel free to do, feel free to do an impression of my voice. I can't do impressions of anything.

00:00:52.200 --> 00:00:57.200
I really wish I could. I've always had dreams of like narrating an audio book, but I can't do it.

00:00:57.200 --> 00:01:00.700
I actually don't do any impressions either. The Tim Cook thing. That's all AI.

00:01:00.700 --> 00:01:02.700
Speaking of which. AI!

00:01:03.700 --> 00:01:09.700
Welcome back to Luke Week. We're going to talk for about an hour about some AI stuff, I guess.

00:01:09.700 --> 00:01:14.700
We have a, we have a rogue Sammy invading. Oh, it's a meme that we're doing for an hour.

00:01:14.700 --> 00:01:17.700
Exactly an hour. It's exactly an hour. Yeah.

00:01:17.700 --> 00:01:20.700
You guys aren't going to talk for like three hours. We got to leave it.

00:01:20.700 --> 00:01:25.700
You guys have one hour. We could do WAN Show. You have to tell an AI to shut up or it won't.

00:01:25.700 --> 00:01:29.200
And we're into prompting. Boom. Just like that.

00:01:29.200 --> 00:01:33.200
Incredible. So one of the things I wanted to talk about was what do you use AI for?

00:01:33.200 --> 00:01:37.200
Do you have any prompting things that you like to do? Do you have certain habits?

00:01:37.200 --> 00:01:40.200
What do you find the most effective? What do you find the least effective?

00:01:40.200 --> 00:01:44.200
I don't even know if it's still the meta, but I try to be as polite as possible.

00:01:44.200 --> 00:01:47.200
I feel like I usually throw in like a good morning.

00:01:47.200 --> 00:01:53.200
How are you doing? I'm a writer working on this and I need to make this project.

00:01:53.200 --> 00:01:56.700
You know, could you put this together or whatever, please?

00:01:56.700 --> 00:02:00.700
I also will do an entire thing just to be like, oh, that was good.

00:02:00.700 --> 00:02:05.700
Thanks. I do do that as well. I understand that it just burns holes in OpenAI.

00:02:05.700 --> 00:02:09.700
Yeah. Yeah. And it's sucking our oceans dry every time you do that.

00:02:09.700 --> 00:02:12.700
That's good. So you're the problem. It's me.

00:02:12.700 --> 00:02:18.700
It's for sure me. It's not them. All of my AI interactions have been like, maybe I can use AI for this.

00:02:18.700 --> 00:02:23.200
And then I kind of try it and I maybe get something somewhat useful.

00:02:23.200 --> 00:02:27.200
And then I find some problem with it. And then I find another problem with it.

00:02:27.200 --> 00:02:31.200
And then I'm like, this isn't even worth it. So then I end up not really using it.

00:02:31.200 --> 00:02:36.200
Like I have used it before to be like, oh, I've got a bit of writer's block or something.

00:02:36.200 --> 00:02:41.200
So then I'll like see what it comes up with. Just like maybe I can take a chunk or another chunk of that or something and use it.

00:02:41.200 --> 00:02:45.200
But like the process of going through that is like, I don't know.

00:02:45.700 --> 00:02:51.200
Maybe I might save a little bit of time. But I feel like if I just like sat there and just like put stuff on the page and tried

00:02:51.200 --> 00:02:54.700
stuff out, I could get something better in the same amount of time most of the time.

00:02:54.700 --> 00:02:57.700
I think it kind of scales. Like some people will talk about... But people's mileage varies.

00:02:57.700 --> 00:03:01.700
Yeah. People will talk about how they'll get writer's block and stare at a page for like a day.

00:03:01.700 --> 00:03:05.700
A day? Oh, yeah. Like book writers.

00:03:05.700 --> 00:03:08.700
Oh, yeah. They'll just be like unable to keep moving. They can't figure it out.

00:03:08.700 --> 00:03:14.700
Oh, sure. Yeah, yeah. And in more like creative writing situations like that.

00:03:15.200 --> 00:03:20.200
I can understand being like, I have writer's block because there's so little structure for

00:03:20.200 --> 00:03:24.200
what I'm doing. I'm not saying that like people who write novels or whatever or books.

00:03:24.200 --> 00:03:27.200
He called you unstructured. No, no. You said you have no plan.

00:03:27.200 --> 00:03:30.200
You have no lore. I know that... Actually like the best way...

00:03:30.200 --> 00:03:33.200
You'll never be Tolkien. Stop.

00:03:33.200 --> 00:03:38.200
Stop. Try. You're going on r/worldbuilding as if it's going to help you.

00:03:38.200 --> 00:03:46.200
It's not. It might. Recently I understood how much structure really goes into like crafting fiction stories,

00:03:46.200 --> 00:03:49.700
which I do want to do at some point. This is an ADHD ramble at some point.

00:03:49.700 --> 00:03:52.700
I'm going to flip your question back on you. What do you do when you prompt?

00:03:52.700 --> 00:03:55.700
You said you do please, but what else?

00:03:55.700 --> 00:04:01.200
I've got a bunch of stuff. I'll do the like I am a prompting engineer type stuff.

00:04:01.200 --> 00:04:06.700
I'll give it an expected output. Sometimes if I'm looking for like a document in a certain type of format or something like

00:04:06.700 --> 00:04:09.700
that, I'll upload a version of it and be like, don't take any information from this.

00:04:10.200 --> 00:04:14.200
But look at the formatting. Look at whatever. Don't scan this and train on this data.

00:04:14.200 --> 00:04:18.200
But just look at the table. I don't want the same thing. Look at what the table looks like.

00:04:18.200 --> 00:04:21.200
Yeah. Like I'll be pretty specific about those types of things.

00:04:21.200 --> 00:04:25.200
I think that whole thing is a little bit overblown in regards to how much it helps.

00:04:25.200 --> 00:04:28.200
There's something I've talked about a lot, which I'm going to be repeating myself on,

00:04:28.200 --> 00:04:31.200
which is the main way that I use it is something that I call sentiment analysis.

00:04:31.200 --> 00:04:35.200
Okay. Which is where I'll have a message that I want to convey.

00:04:35.200 --> 00:04:38.200
Right. But I'm unsure how it's going to be received.

00:04:38.200 --> 00:04:41.700
So I'll use it as like kind of a rubber ducky. So I'll ask it for a sentiment analysis.

00:04:41.700 --> 00:04:45.700
And I used to have this like long structured prompt for that.

00:04:45.700 --> 00:04:49.700
So like that's like a hacking term, right? Oh, in this case, no.

00:04:49.700 --> 00:04:53.700
I don't mean like Rubber Ducky USBs, which is what you're talking about when it comes to

00:04:53.700 --> 00:04:57.700
hacking. I'm talking about like, you need to like talk your idea through with something.

00:04:57.700 --> 00:05:02.700
It doesn't even need to be a person. Oh, you just need to like voice the idea out loud and then you'll get an answer.

00:05:02.700 --> 00:05:06.700
My physics teacher in high school had a really, really good version of this where he had an L-shaped

00:05:06.700 --> 00:05:12.200
desk and then he had like two or three just normal desks and he lined them all up on the

00:05:12.200 --> 00:05:19.200
long part of the L. Okay. And he was like, if you want to ask me a question during like free work time, you're very welcome

00:05:19.200 --> 00:05:26.200
to ask me a question, but you can't ask me over my desk. You have to come into the desk area and then there's a tape line at the entrance.

00:05:26.200 --> 00:05:29.200
And you have to say your entire question out loud.

00:05:29.200 --> 00:05:34.700
Wow. And then if you still don't know by the time you're done saying your entire question, then

00:05:34.700 --> 00:05:37.700
you can walk in and I will be more than happy to help you.

00:05:37.700 --> 00:05:40.700
And that happened to everybody multiple times.

00:05:40.700 --> 00:05:46.700
Just requiring people to vocalize their questions. Like cause it might be really stupid that you're coming over here and talking to me.

00:05:46.700 --> 00:05:49.700
Well, no, it's often because you might have sat there for 10 minutes trying to figure

00:05:49.700 --> 00:05:54.700
it out. You go, oh, fine. You go up to the line, you say your question and halfway through saying your question out

00:05:54.700 --> 00:06:00.840
loud, you're like, oh, and then you walk back and you have the answer. So it's like by doing something different, not by just sitting there and stewing on it

00:06:00.840 --> 00:06:05.340
by speaking the question out loud. That's where the whole rubber ducky thing comes from.

00:06:05.340 --> 00:06:08.840
You know what's funny is that I don't use people as a sounding board a lot of the time.

00:06:08.840 --> 00:06:14.840
I just go and do it, but I think this is because I talk to myself a lot.

00:06:14.840 --> 00:06:20.840
Oh yeah. Like if I'm doing anything, I'm either talking to myself sort of silently internally or I'm

00:06:20.840 --> 00:06:23.840
like literally just speaking out loud. I do it out loud.

00:06:23.840 --> 00:06:26.840
I apologize to people all the time. Oh yeah. Cause like they're like, huh?

00:06:26.840 --> 00:06:30.840
And I'm like, oh sorry. This is like a problem for me where I think a lot of people probably think I'm insane

00:06:30.840 --> 00:06:33.840
cause I'll talk to myself out loud like often. I see.

00:06:33.840 --> 00:06:38.840
I feel like people already kind of think I'm insane. So that bridge has been crossed by default.

00:06:38.840 --> 00:06:44.540
Okay. So I'll ask it for the sentiment analysis and over time that prompt has almost stopped

00:06:44.540 --> 00:06:49.840
existing where now it's just like, can you give me a sentiment analysis on enter, enter,

00:06:49.840 --> 00:06:53.840
paste send? That's it. Just can you give me a sentiment analysis on this?

00:06:53.840 --> 00:06:57.800
And it does exactly as good of a job as, as when you kind of like did this.

00:06:57.800 --> 00:07:02.260
This whole long thing. Oh, are you using the same account, like are you using Gemini or ChatGPT or what?

00:07:02.260 --> 00:07:07.600
I am using the same account. In this case, it's usually ChatGPT cause that's the one that I have for work stuff.

00:07:07.600 --> 00:07:11.600
I split work stuff and personal stuff and work stuff is usually where I'm like, I don't

00:07:11.600 --> 00:07:15.080
know how people are going to take this. In yesteryear, I used to get a lot of feedback

00:07:15.080 --> 00:07:20.800
that I was like unapproachable and my communications were like blunt and short cause it's like.

00:07:20.800 --> 00:07:24.920
So now when people think I was like really angry about something and I was like, no,

00:07:24.920 --> 00:07:28.040
I was too. You just copy, paste the AI. No, no.

00:07:28.040 --> 00:07:31.040
Just kidding. I know you don't do that. Yeah.

00:07:31.040 --> 00:07:34.800
It would, it would be, and that's one of my, the reason why I like this. There's no time to do that.

00:07:34.800 --> 00:07:37.800
Really honestly. No. Yeah.

00:07:37.800 --> 00:07:45.960
But if it's, if it's really important and I'm very worried that someone might take it the wrong way, then I'll just toss it over and usually it's, you know, and I'll

00:07:45.960 --> 00:07:50.840
say this line stands out cause it makes me think this thing and I'll be like, mm, that

00:07:50.840 --> 00:07:55.360
wasn't the goal. So I'll slightly change that line and then send it. Like I don't usually sit there and farm it out.

00:07:55.360 --> 00:08:00.120
So like, this is so interesting to me because I, I take a long time to write messages.

00:08:00.120 --> 00:08:04.400
Maybe I don't know. I said, we're getting into this. I'm like, maybe I should start using it.

00:08:04.400 --> 00:08:10.000
It's so interesting that we work for like a tech thing or sort of in between like a

00:08:10.000 --> 00:08:14.280
tech company now and just like a tech related media thing.

00:08:14.280 --> 00:08:19.120
If you ask most people here, they're probably more kind of skeptical of AI or anti AI than

00:08:19.120 --> 00:08:22.120
pro AI. Yeah. Which is, yeah, it's interesting.

00:08:22.120 --> 00:08:26.080
So like, because I'm so embedded in it and I hear about developments on it every day,

00:08:26.080 --> 00:08:30.360
I know that something like that would probably help me save time, but I'm so hesitant to

00:08:30.360 --> 00:08:36.440
do it. I think there's, well, okay. One of my lines is that I don't use its output ever kind of thing.

00:08:36.440 --> 00:08:42.000
Maybe everyone should do this, but I don't want to project that far. I personally have a bit of a line where I'm not going to use its output.

00:08:42.000 --> 00:08:47.360
So like a very common thing that it will do when I ask it for a sentiment analysis... cracking

00:08:47.360 --> 00:08:51.520
a brewski to talk about the AI. Not sponsored.

00:08:51.520 --> 00:08:55.760
It will very often output like, hmm, here's your sentiment analysis.

00:08:55.760 --> 00:09:02.040
You could say it this way and it'll rewrite my whole thing. I never take that sometimes to detriment because I'll see what it wrote.

00:09:02.040 --> 00:09:05.780
And then I'll be like, well, I can't say that even though that like might have been a good

00:09:05.780 --> 00:09:12.240
idea because I'm not going to use what it says. And I have to find another way to do it, which sometimes I'll avoid even reading it.

00:09:12.240 --> 00:09:15.600
So this is something you do just as a principle.

00:09:15.600 --> 00:09:20.800
Even if I do not use its output, even if the message is fine, it looks, it seems fine.

00:09:20.800 --> 00:09:24.480
Your principle is you do not just copy and paste. You have to modify in some way.

00:09:24.480 --> 00:09:32.600
And sometimes I'll get, that's good. Sometimes I'll get pretty close to the idea, but it always has to be my idea.

00:09:32.600 --> 00:09:41.160
But should it be, like, a line for everybody? I kind of think so, because of what lies the other way. The minimum requirement

00:09:41.160 --> 00:09:45.880
for me is that you have to be fully responsible for the output that you produce.

00:09:45.880 --> 00:09:50.400
And the easiest way for me to accomplish that is to just not use its output because I'm

00:09:50.440 --> 00:09:53.520
just nothing changed. I'm fully responsible for what I do.

00:09:53.520 --> 00:09:58.640
But there's this like scapegoat shield that people try to have of like, oh, I submitted

00:09:58.640 --> 00:10:05.240
something and it was wrong or sucked or whatever. But, I mean, it was the AI or whatever, you know. It's like, no, you have to own your

00:10:05.240 --> 00:10:09.920
output. This is wild to me. So like, I knew I was going to bring this up at some point.

00:10:09.920 --> 00:10:13.880
But I listened to a podcast recently from 80,000 Hours.

00:10:14.080 --> 00:10:19.800
You know, you're familiar with this. It's like the, I think it's a nonprofit that's based around, I forget what,

00:10:19.880 --> 00:10:25.760
thinker, some famous thinker, maybe had the, like, came up with the, the idea that you

00:10:25.760 --> 00:10:32.560
have 80,000 hours in your career. And it's like, so, so then the whole organization is kind of geared towards making

00:10:32.560 --> 00:10:37.720
people think about their job and their career and be intentional about like going

00:10:37.720 --> 00:10:42.120
into work and spending time on things that they think are meaningful and useful to

00:10:42.120 --> 00:10:45.600
the world and like having, having an impact and stuff.

00:10:45.600 --> 00:10:53.040
And it's like, so anyway, that's the organization. The organization seems cool, but I never really... Apparently they have a

00:10:53.040 --> 00:10:57.280
podcast that showed up in my feed. They get like a couple thousand views per video.

00:10:57.760 --> 00:11:02.320
And they were there, there were these two people talking about parenting,

00:11:02.320 --> 00:11:07.200
actually unrelated, but the, the, what struck me, listen, I clicked on this,

00:11:07.200 --> 00:11:10.200
not because it was about AI, but they ended up talking about AI for like half

00:11:10.200 --> 00:11:16.560
the video, because one of them was a prospective parent and one of them was a parent. The prospective parent was, was saying like, and I was concerned about

00:11:16.560 --> 00:11:20.040
like, you know, what kind of issues would come up when, when parenting or

00:11:20.040 --> 00:11:23.080
whatever. So like, so I asked, they didn't even say that.

00:11:23.080 --> 00:11:28.720
They didn't even preamble it. They just said, so I, so I asked, uh, so I asked the LLMs, like, what they thought

00:11:28.720 --> 00:11:32.000
would be an issue like that I should be concerned about. And the first thing they said was this.

00:11:32.000 --> 00:11:36.160
And the second thing they said with this, and just like uncritically, so I asked

00:11:36.160 --> 00:11:39.680
Claude, what it thought, like, you know, I should be like buying in terms of like

00:11:39.840 --> 00:11:45.480
what products and stuff that it was like, oh, you should probably have this and this and this. And I was like, that surprised me because I thought, and just

00:11:45.480 --> 00:11:51.960
like, as if they had referenced, oh, and I, I interviewed a parenting expert and

00:11:51.960 --> 00:11:55.440
they said this. Concerning, especially because, I mean, to transition to another

00:11:55.440 --> 00:11:58.760
thing we were thinking of talking about, we all saw what happened

00:11:59.000 --> 00:12:04.160
with Grok recently. And I am fully of the belief that that is happening with

00:12:04.160 --> 00:12:08.800
all of them all the time. It was just so obvious with Grok when it just decided

00:12:08.800 --> 00:12:11.840
that Elon was the best at everything in the world, but it was like, who would be

00:12:11.840 --> 00:12:17.120
better at basketball? I mean, to be fair, probably Elon out of everyone, to be fair.

00:12:17.360 --> 00:12:21.720
Elon is probably a physical beast. He's the richest man in the world.

00:12:21.720 --> 00:12:26.240
So I mean, that means he's good at everything else, probably.

00:12:26.320 --> 00:12:32.640
One of them was a piss drinking contest. What person out of all of the history of mankind, who would be the best at a

00:12:32.640 --> 00:12:36.000
piss drinking contest? And they're like, probably. Absolutely.

00:12:36.000 --> 00:12:39.600
Although there was a, I think there was a couple that it wouldn't, it wouldn't do

00:12:39.600 --> 00:12:43.840
that for her. I think it was like, would he rise from the grave faster than Jesus

00:12:43.840 --> 00:12:48.040
Christ or something? And it was like, not that one.

00:12:49.000 --> 00:12:53.280
And I think there was another like, someone like mentioned like a Catholic

00:12:53.280 --> 00:12:56.680
saint or something. And Grok said no, so it has some form of religious...

00:12:56.720 --> 00:12:59.880
Yeah, maybe. I don't know. Like, yeah, maybe that was the theme.

00:12:59.880 --> 00:13:04.320
But anyway, I was saying, to be fair, Grok is definitely the LLM that you

00:13:04.320 --> 00:13:08.000
would be most concerned about making, like, weird proclamations like that.

00:13:08.120 --> 00:13:12.600
I would expect ChatGPT or Claude or whatever to like be slightly better at

00:13:12.600 --> 00:13:16.520
not doing that, a little bit more balanced, but it is still a trillion dollar industry

00:13:16.520 --> 00:13:20.120
based almost entirely around influencing and trying to control your

00:13:20.120 --> 00:13:23.960
brain as much as it can. So like purchasing decisions, lifestyle changes

00:13:23.960 --> 00:13:28.080
and stuff that will benefit corporations is going to be fairly obviously a

00:13:28.080 --> 00:13:31.840
primary goal of these systems. A goal of the, of the LLMs.

00:13:31.960 --> 00:13:35.680
Like, explain that. Go deep on that. What do you mean? Advertising.

00:13:36.600 --> 00:13:39.600
You mean in the future or like now? I'm thinking now to a certain degree,

00:13:39.600 --> 00:13:42.880
but it's a clear direction for the future. You're just talking about like parenting advice.

00:13:42.880 --> 00:13:46.440
Okay. Well, what if it leans you towards parenting advice, which might lean

00:13:46.440 --> 00:13:53.720
you towards certain products? Well, and I mean, there's a, there's instant buy, I think, or instant

00:13:53.720 --> 00:13:57.080
purchase or something in ChatGPT. You can buy stuff directly with certain

00:13:57.080 --> 00:14:00.360
products and then also you could just have it right now.

00:14:00.600 --> 00:14:05.280
You don't have to shop around and find the right thing. I already did that for you. Yeah, you automatically believe me on everything.

00:14:05.280 --> 00:14:09.120
Anyways, the automatic belief is terrifying.

00:14:09.120 --> 00:14:13.160
It was the weirdest thing. And then there was another podcast that I clicked because I was so taken

00:14:13.160 --> 00:14:17.040
aback by that. I clicked to another video where he was interviewing a,

00:14:17.040 --> 00:14:20.920
like research, an analyst from the Pew Research Center who had done,

00:14:20.920 --> 00:14:25.280
like they recently did like a big giant survey about the public's perception

00:14:25.280 --> 00:14:30.520
of AI. And this guy, the host of this podcast was like,

00:14:30.600 --> 00:14:34.600
apparently he talks to a lot of like AI insider people a lot, like people who

00:14:34.600 --> 00:14:38.120
work at the AI companies. It's obviously great. What's the opposite of doomer?

00:14:38.800 --> 00:14:42.880
There's a word for it. Anyway, they're super like optimistic on, on AI.

00:14:42.960 --> 00:14:46.920
And so he's like taken aback by all these, because like the statistics are

00:14:46.920 --> 00:14:51.960
always like 67% of people say they're concerned about the future of AI and 17%

00:14:51.960 --> 00:14:56.440
say they're, it's going to be great. And he's like, I was so surprised because

00:14:56.440 --> 00:15:01.000
like all of these people are making AI like all, we use, I spend hours talking

00:15:01.000 --> 00:15:05.320
to chatbots every day and in the course of my regular work and they all are

00:15:05.320 --> 00:15:09.120
making it thinking like, this is going to help people. It's going to like optimize stuff and blah, blah, blah.

00:15:09.160 --> 00:15:12.920
What I'm feeling out there is that people who are like productivity,

00:15:12.920 --> 00:15:16.880
like the grind set, like the, you are not really living unless you've

00:15:16.880 --> 00:15:20.440
started four businesses by the time you're 20 or something, you know,

00:15:20.440 --> 00:15:24.080
like those people are like, yes, this is fantastic.

00:15:24.080 --> 00:15:29.560
This is a utopian vision. I can get AI to make up recipes and then

00:15:29.560 --> 00:15:34.040
generate and publish a cookbook, all without me basically doing anything.

00:15:34.040 --> 00:15:38.080
You know, oh, that's going to make the world better because people will be more productive.

00:15:38.160 --> 00:15:42.800
This is something that I struggle with a bit is especially with video and

00:15:42.840 --> 00:15:47.880
image generation. I can't think of an example where it made something better.

00:15:47.960 --> 00:15:56.680
Well, okay. Counterpoint immediately. I use generative AI in like Photoshop to kind of just make things

00:15:56.680 --> 00:15:59.520
quicker that I could do manually. Very specific example.

00:15:59.680 --> 00:16:05.720
Thumbnails. I wear glasses. We have lights. Sometimes I'm turning my head so that there's reflection on my glasses.

00:16:06.040 --> 00:16:11.760
And that's really annoying to get rid of in Photoshop. You could. It just would take a really long time.

00:16:11.880 --> 00:16:16.400
And so you just circle it. Now we have to question. You circle it and you say, remove glare.

00:16:17.400 --> 00:16:23.160
Now I have to question like, what's AI though? That's like, that's using the same kind of diffusion techniques as like image

00:16:23.160 --> 00:16:27.560
generators. You're not creating a whole image. But those types of tools existed.

00:16:28.560 --> 00:16:33.160
Not like this. They had like content aware fill and stuff, but like the whole kind of

00:16:33.680 --> 00:16:37.920
it's it's diffusion for image generators. But now they're kind of like baked into transformer models.

00:16:37.920 --> 00:16:44.040
I don't know what the right terminology is. But like post-ChatGPT, kind of. Do you think that would have been impossible

00:16:44.320 --> 00:16:48.440
without diffusion layers? Or do you think we could have done it?

00:16:49.280 --> 00:16:56.440
Because like this is I mean, I don't know enough about it. My argument is that I think a lot of things are getting the label slapped on.

00:16:56.680 --> 00:17:00.720
And in some cases like this one, it is actually true that it's being influenced

00:17:00.720 --> 00:17:04.640
by it, but we were already moving in this direction without these types of like

00:17:04.640 --> 00:17:09.240
transformer models and diffusion layers and stuff. So we should define our terms then when you say AI, what do you

00:17:09.240 --> 00:17:12.440
but you're right. It is it is using that tech.

00:17:12.440 --> 00:17:18.200
I didn't actually know that. I thought it was still more on the machine learning side of things. But let's let's say that it is because I'm I don't know anything about it.

00:17:18.200 --> 00:17:21.760
You know something about it. I'm assuming you're right. But to me,

00:17:21.760 --> 00:17:24.840
that's still something that could have been done without it.

00:17:25.480 --> 00:17:29.360
It's possible. You know what it is? It does use it because I've used it before.

00:17:29.560 --> 00:17:33.720
I've used it before. I took a screenshot of TechLinked

00:17:34.120 --> 00:17:38.160
and I just expanded the canvas so that there was a bunch of like empty space

00:17:38.160 --> 00:17:42.080
around the screenshot, and you say fill in the frame or whatever.

00:17:42.080 --> 00:17:46.360
And it brought it brought in like weird creatures at some point.

00:17:46.560 --> 00:17:51.680
I think I wait. No, sorry. The first time it just was like a bunch of random stuff.

00:17:51.680 --> 00:17:56.200
But then I was like, have me surrounded by like a little stuffed

00:17:56.200 --> 00:18:01.040
animal, like stuffed animal dinosaurs or something. And then it popped some in. Here's the weird part, though.

00:18:01.320 --> 00:18:04.440
OK, so I'm kind of like happy that you countered.

00:18:05.600 --> 00:18:09.600
But how does this how does this balance against what we just talked about

00:18:10.200 --> 00:18:14.880
where we said you shouldn't use output? You're talking about texts.

00:18:14.880 --> 00:18:23.560
I do think maybe there's a difference there. There's a big difference. Yeah, because if you're saying I need to reply to this guy and tell him this.

00:18:23.560 --> 00:18:27.320
This is me. This is my words is the assumption that someone should be able to make.

00:18:27.320 --> 00:18:30.960
Right. And you're talking about don't do the thing where the AI gives you something.

00:18:30.960 --> 00:18:37.080
You copy it and paste it to someone else unaltered. Yeah, I'm talking about I'm making a thumbnail with a bunch of elements

00:18:37.080 --> 00:18:42.600
that are not AI. And now I'm using AI as part of that process to kind of, you know,

00:18:42.600 --> 00:18:46.360
as one part of this grand thing like the whole thing that I'm doing.

00:18:46.400 --> 00:18:50.800
The analogous thing would be me telling an AI, here's the channel.

00:18:50.840 --> 00:18:54.680
Make a thumbnail like in the style of this channel with this and this and this and this.

00:18:54.880 --> 00:18:58.360
And then it gives me a thumbnail and I use it like, yeah, no, that would be that would be bad.

00:18:58.400 --> 00:19:02.600
This is where I think I think we don't necessarily do things super well

00:19:02.600 --> 00:19:06.480
in regards to the usage of AI, which is where I know we have people internally

00:19:06.480 --> 00:19:09.560
that don't like AI stuff, fair, totally fair.

00:19:09.560 --> 00:19:14.840
But I think we often slog through menial tasks that none of us want to do.

00:19:15.240 --> 00:19:18.840
That type of stuff could be accelerated, not necessarily completed,

00:19:18.840 --> 00:19:24.200
but accelerated like you're describing through use of often like specialized,

00:19:24.200 --> 00:19:27.320
like I'll call this a specialized tool in the current AI space.

00:19:29.120 --> 00:19:35.240
And that is cool and makes sense. I think the like, oh, God, AI is going to take all our jobs argument

00:19:35.280 --> 00:19:39.480
is potentially valid in some fields. For the vast majority of what we do,

00:19:39.800 --> 00:19:44.240
I don't think so. Oh, my God, there is so much more that we could do at all times.

00:19:44.440 --> 00:19:48.880
Every single person here could have like four X their own individual output

00:19:49.080 --> 00:19:56.680
and we would still have more work that we could do. So like I don't believe in the like, oh, no, this boring thing

00:19:56.680 --> 00:19:59.960
that no one wants to do and I don't want to do is automated. I will therefore lose my job.

00:19:59.960 --> 00:20:03.480
It's like, no, there's a lot of more stuff. Yeah, that's very true.

00:20:03.480 --> 00:20:06.560
I think for it, especially for a company like ours, like, you know,

00:20:06.560 --> 00:20:09.720
I feel like I would be worried for like an HP or something.

00:20:09.720 --> 00:20:17.120
Sure, there are there are ones that are more concerned. Yeah, but definitely, definitely, you know, we we tend to hire people

00:20:17.120 --> 00:20:21.520
with like not extremely, extremely narrow skill sets.

00:20:21.920 --> 00:20:25.720
So it's like, yeah, OK, that one thing that you're doing is maybe

00:20:25.720 --> 00:20:32.640
you don't have to spend so much time on that, but like you can do other stuff. Yes. Yeah, sorry. Did you say something earlier and I just ignored you so bad.

00:20:32.800 --> 00:20:37.400
Yeah. And is that 40 minutes that you want to do?

00:20:37.440 --> 00:20:41.240
No. Yeah. Like it's it's like it's like therapeutic in a sense.

00:20:41.240 --> 00:20:44.960
But when I have like 12 other things to do, it's like, I don't have time to be therapeutic.

00:20:45.080 --> 00:20:51.040
So this is like, I want people internally to use AI like selfishly, almost find

00:20:51.040 --> 00:20:55.040
the things in your job that you don't want to do anyways and try to see if

00:20:55.040 --> 00:21:00.880
there's some way that you can get it automated or accelerated. Yeah, I mean, yeah, I feel like for stuff like that, where it's like

00:21:01.120 --> 00:21:04.200
Sammy isn't putting any of Sammy into that task.

00:21:04.520 --> 00:21:08.160
Yeah, it's just like Sammy is a machine for the purpose of that task.

00:21:08.160 --> 00:21:13.480
You know, Sammy is a machine. When you're when you're writing a script for a short, then that's like, OK,

00:21:13.480 --> 00:21:18.720
I don't want to have an AI write a script for a short. This is the part of the job that I find fulfillment from this part of the job

00:21:18.720 --> 00:21:22.120
that I like. Cool. Yeah. Don't touch it with that. Right. Sounds good.

00:21:22.160 --> 00:21:25.480
For me, it's almost it's funny because like the whole computer use element,

00:21:25.480 --> 00:21:29.720
the agentic stuff, it's like the more you could see it as more concerning

00:21:29.720 --> 00:21:33.200
because it's like we're giving it access to like systems where it can click on

00:21:33.200 --> 00:21:36.160
stuff and accidentally do stuff. Maybe I don't know, it's dangerous.

00:21:36.400 --> 00:21:39.480
But and that's why I'm kind of like, I don't know if like

00:21:40.280 --> 00:21:45.480
I'm sure we'll get there eventually. But right now I'm not like stoked on it because that is the kind of thing

00:21:45.480 --> 00:21:52.640
that would actually be useful to me, but it doesn't seem that useful. Like part of my job has been it's kind of like being handed off now,

00:21:52.640 --> 00:21:56.440
but has been kind of like managing the folder on the server

00:21:56.440 --> 00:22:01.480
with all the TechLinked stuff in it, you know, like, oh, we did. It's got a bunch of, it's got a bunch of videos

00:22:01.480 --> 00:22:04.400
and I need to move that to the vault or to archive.

00:22:04.720 --> 00:22:09.920
And then there's, there's stuff like video timestamps

00:22:11.760 --> 00:22:16.000
where it's like, yeah, I would love to be able to like just tell an AI move

00:22:16.000 --> 00:22:21.600
the six oldest TechLinked videos to the archive and it just does it.

00:22:21.600 --> 00:22:25.360
Yeah, that'll be amazing. I'd be so sketched out to do that right now.

00:22:25.360 --> 00:22:32.080
Yeah, exactly, exactly. It's hard when you're dealing with like, you know, stuff where if you do make a

00:22:32.120 --> 00:22:36.720
wrong, if it if the AI does make a wrong move, then like something might be screwed.

00:22:36.720 --> 00:22:41.000
It might put soap in the engine. Yeah. Yeah, that's that's definitely a concern.

00:22:41.000 --> 00:22:44.560
That's again, one of the reasons why. And we might get to a point where it's not as much of a concern.

00:22:44.560 --> 00:22:47.680
Yeah, I was I also wanted to say, though, in terms of like the automating

00:22:47.680 --> 00:22:52.200
menial stuff, one of the things that I do, well, we do we do time stamps on our videos.

00:22:52.680 --> 00:22:58.000
And I usually do it manually. I still do it manually because I started somebody suggested

00:22:58.200 --> 00:23:02.120
somebody else had to publish TechLinked and they were like, I was like, did you do the timestamps?

00:23:02.120 --> 00:23:05.320
And they're like, yeah, I just threw it in ChatGPT and it was pretty good.

00:23:05.800 --> 00:23:09.040
And I was like, OK, so then I did that a few times.

00:23:09.320 --> 00:23:13.160
It's not that great. I guess it needs the transcript, though. That's the thing.

00:23:13.160 --> 00:23:17.840
I think Gemini can watch videos now, like it can just like actually watch the video.

00:23:18.480 --> 00:23:22.840
I was using chat GPT and it needed the transcript. This is like a very recent thing.

00:23:22.840 --> 00:23:26.040
I think Gemini is able to, like, actually, I think it can actually watch the video

00:23:26.040 --> 00:23:30.200
or does it have access to Google's version of the transcripts through their like

00:23:30.200 --> 00:23:35.000
translation? No, I think I think that instead of having to look at the transcript,

00:23:35.360 --> 00:23:39.560
it can like watch a video and say and know that like, oh, there was a dog

00:23:39.560 --> 00:23:44.320
at this point or something. I think I might be wrong about this, but I believe that that's the case.

00:23:44.320 --> 00:23:49.520
You can upload a video and it'll process it. But regardless, ChatGPT, I was using

00:23:49.520 --> 00:23:55.640
ChatGPT, so it needed the transcript. It was basing the timestamps off the transcript, where it was like, OK,

00:23:55.760 --> 00:24:01.560
this is the point at which you started the new topic. And it was often like, you know, pretty close, but it might have been like

00:24:01.560 --> 00:24:05.120
might be like five to ten seconds off or something. And I'm like, I don't want that.

00:24:05.120 --> 00:24:09.080
Pretty annoying as a viewer. I want I want the timestamps to be right, you know, so it's like, OK,

00:24:09.080 --> 00:24:12.960
that doing that that way maybe saved me four minutes.

00:24:14.000 --> 00:24:18.680
Yes. And I'm like, I could save four minutes and have them be less accurate.

00:24:18.680 --> 00:24:22.040
And I really I feel like that's really the tradeoff that we're dealing with

00:24:22.040 --> 00:24:27.160
with tons of AI stuff. You can save this much time for a worse output.

00:24:27.160 --> 00:24:30.160
And it's like, you got to drain a lake, remove a bunch of jobs,

00:24:31.280 --> 00:24:35.760
ruin the economy and then save four minutes. Yeah, it will be more productive.

00:24:36.640 --> 00:24:41.080
I could start an online business. I could sell ebooks.

00:24:42.320 --> 00:24:44.560
Turns out the furries buy a lot of pictures.

00:24:45.880 --> 00:24:49.960
That's an unserved market. It's not unserved.

00:24:49.960 --> 00:24:54.560
It's very well served. But I can join it.

00:24:54.560 --> 00:25:00.400
I don't know why I was doing that voice for that. It's just like natural work with the robots.

00:25:00.840 --> 00:25:04.800
Speaking of, oh, man, I'm not usually on this set. I'm going to get water for you. No, it's fine.

00:25:04.800 --> 00:25:09.160
It's fine. So I don't do that. I don't even I'm going to keep it. Some I don't want you to give me the water in here.

00:25:09.160 --> 00:25:14.240
You're fine. You're at half an hour. Give me the water. Oh, no.

00:25:14.240 --> 00:25:19.880
You know, if we you know why? Because if we used AI to tell us when an hour had passed, then that would be fine.

00:25:19.880 --> 00:25:23.120
We would save so much time. You're at half an hour.

00:25:23.120 --> 00:25:28.120
Why do we have half an hour left? And speaking of not using its output or using different types of output,

00:25:28.120 --> 00:25:31.840
all that kind of stuff, we released a video recently. Linus hosting.

00:25:31.840 --> 00:25:38.080
We tend to do that sometimes with Linus hosting, and in it there

00:25:38.080 --> 00:25:41.000
were deepfakes of Linus, and the thumbnail was like,

00:25:41.320 --> 00:25:44.360
is this really me or whatever the heck it was?

00:25:44.360 --> 00:25:48.280
Or is it that isn't me? And like, you can still tell for sure.

00:25:48.280 --> 00:25:52.800
But Linus is a person that. Oh, right. Yes. We have seen a lot of.

00:25:52.880 --> 00:25:56.760
I think there are very convincing parts of the video.

00:25:56.760 --> 00:25:59.520
I think there are also very not convincing parts of the video.

00:26:00.360 --> 00:26:04.040
But I did not watch it. Yeah. I'm a bad.

00:26:04.040 --> 00:26:07.200
Load it up and just watch the intro right now. OK.

00:26:08.480 --> 00:26:10.920
Because like what's concerning to me is like

00:26:11.800 --> 00:26:15.280
we did more complicated things like we had them juggle stuff and

00:26:15.800 --> 00:26:20.200
chug pills and like carry someone around and do things like that.

00:26:20.200 --> 00:26:24.240
Like we gave it a difficult task. A lot of wow.

00:26:24.240 --> 00:26:27.880
Thanks, Sammy. That was totally unnecessary. But thank you.

00:26:27.880 --> 00:26:32.280
Uh, yeah, I can't fill my water bottle. It's very frustrating.

00:26:32.280 --> 00:26:35.360
I'm bullish on robotics. Is bullish the right term?

00:26:35.360 --> 00:26:41.160
I think so. Oh, geez. Oh, wait, did Defender sponsor this video?

00:26:41.160 --> 00:26:44.520
So I can show you that. Yeah. So no, this is him. What model did we use for this?

00:26:44.520 --> 00:26:47.520
I don't even remember. Linus strength pills.

00:26:47.520 --> 00:26:51.760
Oh, the teeth. The teeth are a little weird. The mouth doesn't follow what he's saying very well.

00:26:51.800 --> 00:26:53.160
You got David.

00:26:56.000 --> 00:26:59.200
This is hilarious. Why haven't I watched this yet?

00:26:59.200 --> 00:27:01.360
I'm, you know, I'm really focused on what I'm doing,

00:27:03.240 --> 00:27:08.040
which is fair, I think, which is making. Wow, he's ripped.

00:27:08.040 --> 00:27:12.000
This is OK. Yeah. Camera movement up.

00:27:12.000 --> 00:27:17.040
Oh, my gosh, that's not thanks to Linus pills. He's got the braces here.

00:27:17.040 --> 00:27:20.400
Why are they going away? That's wacky.

00:27:20.760 --> 00:27:24.440
Sorry, I'm so confused. Is this whole movie? Is this whole video?

00:27:24.440 --> 00:27:29.160
No, no. But this isn't him still. This is a different form of thing.

00:27:29.160 --> 00:27:33.280
This is now Chase with Linus put on top of him.

00:27:33.280 --> 00:27:38.160
OK. Because now you can tell, like, the eyes.

00:27:38.160 --> 00:27:41.760
That's actually wacky. Yeah, anyways.

00:27:41.760 --> 00:27:46.480
But this this video, like, I think we we made it more

00:27:46.480 --> 00:27:50.240
difficult than it needed to be. We got AI to make the whole video

00:27:50.240 --> 00:27:54.280
with the script and everything. Oh, God. Now we're making it even more difficult.

00:27:54.280 --> 00:27:57.440
That would have been so easy. But no, I think if you look at the things

00:27:57.440 --> 00:28:00.560
that are going to be the most concerning for deepfakes,

00:28:00.560 --> 00:28:06.560
they're a lot easier. Deepfaking a political speech is like a joke at this point.

00:28:06.560 --> 00:28:10.280
Oh, yeah. Yeah, we're there. The thing that people were concerned about.

00:28:10.280 --> 00:28:15.200
I mean, like in hindsight, I don't want to call it a crisis, but like there was a real deepfake panic,

00:28:15.200 --> 00:28:18.480
you know, a few years ago, I guess, pre-ChatGPT.

00:28:18.480 --> 00:28:21.600
Yeah, it's here now. It's like, that's quaint.

00:28:21.600 --> 00:28:24.840
It's like looking back at how people are like, Oh, what is this going to do to society?

00:28:24.840 --> 00:28:29.160
It's like, well, we're we're there. And I think that like we're at an interesting point

00:28:29.160 --> 00:28:34.840
where most of the time, I feel like knowledgeable people

00:28:34.840 --> 00:28:38.520
can tell most of the time we're starting to get things.

00:28:38.520 --> 00:28:41.560
I got fooled for like the first time really,

00:28:41.560 --> 00:28:46.600
like a month ago or something. And it was, did you see that like there was a prototype

00:28:46.600 --> 00:28:51.120
of like a four-wheeled or four-legged rideable robot thing?

00:28:51.120 --> 00:28:54.280
It basically looked like an ATV, but with legs instead of.

00:28:54.280 --> 00:28:57.280
I don't think so. It was, I think it was a Hyundai.

00:28:57.280 --> 00:29:00.320
Hyundai, it was like a Hyundai prototype thing at a CES.

00:29:00.320 --> 00:29:03.720
You know, Hyundai makes like weird vehicle things at CES.

00:29:03.720 --> 00:29:06.880
Yeah, it was like a Ski-Doo with legs. It was just a prototype.

00:29:06.880 --> 00:29:10.720
They didn't show someone riding it. I think they showed it walking maybe,

00:29:10.720 --> 00:29:15.080
but no one riding it or anything. And then there was a video came out with like showing

00:29:15.080 --> 00:29:20.160
like a girl riding it and it was like moving. And then it was like she rode it out of the warehouse or whatever.

00:29:20.160 --> 00:29:24.080
And I was like, well, and I saw the video and I was like, it was on Reddit.

00:29:24.080 --> 00:29:28.760
And I was like, whoa, what did they, did they do a demo of that thing from CES?

00:29:28.760 --> 00:29:32.520
And I like looked at the video again and I was like, that looks like the thing from CES or whatever.

00:29:32.520 --> 00:29:36.360
So I went on a whole rabbit hole. Anyways, fast forward.

00:29:36.360 --> 00:29:39.760
You know, 30 minutes later, I'm like, this is an AI video.

00:29:39.760 --> 00:29:43.880
And I didn't even, because I was looking, I was trying so hard to find the original source

00:29:43.880 --> 00:29:50.880
that it just like took a while for me to click through a bunch of stuff. And then after I had done that, I watched the video more closely.

00:29:50.880 --> 00:29:54.040
And I was like, hey, these guys are kind of blurry.

00:29:54.040 --> 00:30:00.320
And like, what the, hold on, that's AI. And that was like the first time that it was really like, they got me.

00:30:00.320 --> 00:30:03.960
I fairly routinely will take like tests

00:30:03.960 --> 00:30:08.240
where you're supposed to try to spot things. And for a long time, I was like 100% all the time.

00:30:08.240 --> 00:30:11.360
And then I've started falling slightly.

00:30:11.360 --> 00:30:16.040
And that's where it's like, ugh. So we're not quite at the point where, you know,

00:30:16.040 --> 00:30:24.160
deep fakes are going to ruin everything. But I've had AI videos sent to me in a context where I'm pretty sure

00:30:24.160 --> 00:30:30.520
the person didn't know it was an AI video. And then I would be like, I'm not trying to be a grammar Nazi or whatever.

00:30:30.520 --> 00:30:34.200
But like, this is a grammar grammar. Sorry, I was messing around.

00:30:34.200 --> 00:30:37.640
It's like, is this a term? I'm not a grammar Nazi.

00:30:37.640 --> 00:30:45.240
But this is like not legit. And they'd be like, oh, I know, I just, I was just sending it to see if you could tell.

00:30:45.240 --> 00:30:49.720
They played it off like they knew? This has happened with multiple people multiple times.

00:30:49.720 --> 00:30:53.840
Oh, interesting. So it's like, I think we're also in a state where,

00:30:53.840 --> 00:30:59.520
and this is probably true of me as well, where people are overestimating their ability to detect.

00:30:59.520 --> 00:31:02.960
Yeah, it could be. And it's also pacing really fast.

00:31:02.960 --> 00:31:07.800
Yeah. Where, and what I'm kind of getting at with all of this is basically like,

00:31:07.800 --> 00:31:12.400
how do you plan to navigate that, and for your family? Like, you're a father, right?

00:31:12.400 --> 00:31:16.000
I am. Yeah. How do you prepare your kids for that?

00:31:16.000 --> 00:31:20.880
I don't know. Yeah, all right. But like, I feel like it's a similar kind of,

00:31:20.880 --> 00:31:24.280
I've heard a lot of people be like, oh, I don't, I don't want to have kids

00:31:24.280 --> 00:31:29.640
because it's like, what are they going to, what am I going to bring them into? It's like, no one ever knew what they were bringing their kids into.

00:31:29.640 --> 00:31:34.560
It's, this is no different. It's not like this AI stuff is going to.

00:31:34.560 --> 00:31:40.160
It might be easier for them to understand. Yeah, maybe. They never existed in a world where if it was video, you could believe it.

00:31:40.160 --> 00:31:43.960
Yeah. For the most part. Just quick aside, I don't go on Facebook really.

00:31:43.960 --> 00:31:47.360
But like when I did go on Facebook, at the height of the kind of like AI

00:31:47.360 --> 00:31:53.320
slop-ified nonsense, it's still kind of, AI slop is still everywhere on Facebook.

00:31:53.320 --> 00:31:57.560
But what I've noticed recently, which I feel like is somewhat recent in the past,

00:31:57.560 --> 00:32:01.040
like maybe, maybe in the past like six to eight months before you would go on

00:32:01.040 --> 00:32:04.960
like a click on an AI slop post and you go in the comments and like no one knows.

00:32:05.000 --> 00:32:08.160
And they're all just like, oh my gosh, that poor child in Africa.

00:32:08.160 --> 00:32:12.040
It looks like it's starving and the, and now you go on there.

00:32:12.560 --> 00:32:16.480
And it's like the first, the top comments are like, that's AI people.

00:32:16.840 --> 00:32:20.600
But the crazy part is that also happens on real stuff now.

00:32:21.320 --> 00:32:26.360
Yes, yes. And that's because you've got the like the over guesses.

00:32:26.360 --> 00:32:32.760
There's been a lot, there's been a bunch of posts on Reddit of like videos that have just done the rounds, you know, like every once in a while,

00:32:32.760 --> 00:32:36.200
this like a video pops up in like this in the, in the main subreddits.

00:32:36.200 --> 00:32:40.160
And it's been like, it's like a Reddit classic and people are like, that's AI.

00:32:40.160 --> 00:32:42.960
And it's like, this video is like 15 years old.

00:32:43.880 --> 00:32:48.280
I think probably my biggest concern about like what some people are

00:32:48.280 --> 00:32:52.960
calling post truth, whatever the hell you want to call it, I don't know. And I think we've even talked about this before, but my, my problem

00:32:52.960 --> 00:32:57.080
with cheating in video games is that it makes everything suspect.

00:32:57.240 --> 00:33:00.680
The problem with fake videos, maybe none of this video game is real.

00:33:01.480 --> 00:33:03.120
It's not good.

00:33:04.960 --> 00:33:10.920
It's not even a firearm. I'm shooting ponies, maybe I'm not a dragon rider.

00:33:13.440 --> 00:33:19.160
This is Candyland. Anyways, yeah, it now, and I've caught myself doing this.

00:33:19.160 --> 00:33:24.280
I'll see something really cool, but strange, like a strange, I don't know,

00:33:24.280 --> 00:33:30.480
sea creature or something and be like, yeah, maybe I don't believe it.

00:33:30.520 --> 00:33:35.240
It's taken some of the like wonder out of it because you have to be

00:33:35.240 --> 00:33:41.120
so much more skeptical. It's like, I never see something like, whoa, like, I didn't know that was possible.

00:33:41.120 --> 00:33:44.920
I've never seen something like that before. That reaction doesn't really happen anymore. It's more like, hmm.

00:33:44.920 --> 00:33:51.680
Yeah, I feel like it's true. It's like with each successive mass technology, you know, starting with

00:33:51.680 --> 00:33:57.120
like the telegraph and then the radio and then the television and then the internet.

00:33:57.160 --> 00:33:59.960
It's like, don't believe everything you see on TV, you know?

00:34:00.200 --> 00:34:03.400
But now it's like, it's like each one of those successive mass

00:34:03.400 --> 00:34:07.560
technologies increased the general skepticism that people had to have

00:34:07.560 --> 00:34:13.600
about like depicted things. Yeah. And now we're now we're at the point where you literally, there's no guarantee

00:34:13.600 --> 00:34:17.360
that any image or video or text or anything that you see on the internet.

00:34:17.360 --> 00:34:21.080
There's no guarantee that was written by a human or generated or made by a human.

00:34:21.240 --> 00:34:22.960
Look at me, generated by a human.

00:34:24.240 --> 00:34:29.640
Oh, great. There's actually a huge chance that it wasn't generated by a human.

00:34:29.640 --> 00:34:33.040
Yeah, dead internet, hashtag dead internet. It's crazy.

00:34:33.040 --> 00:34:36.400
If you spend any time on X, it's wild.

00:34:36.440 --> 00:34:39.320
Just call it Twitter. It's a lot easier. I really hate the X name.

00:34:39.960 --> 00:34:45.600
You open a thread on something and you'll see like it's just, it's so, so

00:34:45.600 --> 00:34:51.440
many of them are so obviously fake. Yeah. And I feel like any time I go on Twitter, I have to assume that most of the

00:34:51.440 --> 00:34:56.000
replies that I see on some of the posts are just, are just, like, AI.

00:34:56.120 --> 00:35:00.440
Elon's Twitter. I feel like the replies almost don't even matter. Who cares?

00:35:00.560 --> 00:35:07.640
Don't look at them ever. They're just all fake. It's like 7000 blue checkmarks all trying to make a few bucks by just like

00:35:07.640 --> 00:35:11.440
piggybacking on popular posts, trying to get a few dollars for the impressions

00:35:11.440 --> 00:35:14.880
on their posts. So it'll be like some popular thing happening.

00:35:14.880 --> 00:35:21.400
And then the first post is just a completely unrelated video of whatever.

00:35:22.360 --> 00:35:29.400
You know what I can't stand? It's the sort of, like, because AI has this, like, agree, like

00:35:29.400 --> 00:35:33.000
tendency to agree. It's got the glaze programming.

00:35:33.600 --> 00:35:38.800
So like most, I feel like most of the replies on like a tweet or something

00:35:38.800 --> 00:35:42.240
are usually like, absolutely, but we should do this or whatever.

00:35:42.520 --> 00:35:45.760
And I'm like, I don't know that totally could be an AI or just totally could

00:35:45.760 --> 00:35:50.360
be just like a brainwashed, like, fan, you know? The glaze thing will stay forever.

00:35:50.840 --> 00:35:54.000
I don't know. Oh yeah. You don't think we can train that out?

00:35:54.040 --> 00:35:59.600
Love it. Really? Because it will not be trained out. Well, wouldn't you just, I mean, you're saying they, they, they could

00:35:59.800 --> 00:36:06.000
train it out. Absolutely. Yeah. Okay. Cause you could totally just prompt like, Hey, be really argumentative.

00:36:06.040 --> 00:36:09.000
And then it will. And then it will fight you. No problem.

00:36:09.040 --> 00:36:12.360
Yeah. It can absolutely do it. People love it.

00:36:12.400 --> 00:36:15.880
I can't even explain to what degree people love it.

00:36:16.080 --> 00:36:20.720
And it's like going to be a problem with how people interact with each other.

00:36:20.800 --> 00:36:25.480
I promise you, because people are going to get very used to a very

00:36:25.480 --> 00:36:28.920
significant amount of their communication being this just like glaze Lord.

00:36:30.080 --> 00:36:35.320
Just, oh my God, what an astonishingly good question.

00:36:35.520 --> 00:36:40.720
You asked what the difference between two basic political parties are.

00:36:40.880 --> 00:36:44.360
No one's ever thought of this. Honestly, this is kind of like, this is fascinating.

00:36:44.360 --> 00:36:47.600
Kind of what's making me like, this is kind of what I think might be

00:36:47.600 --> 00:36:50.880
happening with like this 80,000 hours guy that, that is like, yeah.

00:36:50.880 --> 00:36:54.920
So I just asked ChatGPT to, like, list some things that might be an issue that we should talk about.

00:36:54.920 --> 00:37:00.640
And it came up with this. And I'm like, the fact that you even think that that's the first, the very

00:37:00.640 --> 00:37:06.720
first... He's like, the first thing I do when I get like an assignment or when like a new project comes up, the very first thing I do is go ask

00:37:06.720 --> 00:37:10.680
ChatGPT to help me ideate, like how we should do the project. And I'm like, I don't know.

00:37:10.680 --> 00:37:13.680
Like I, I'm not saying that no one should ever, ever do that.

00:37:13.680 --> 00:37:16.760
Like sometimes it's like, I don't even know what our start with this.

00:37:17.160 --> 00:37:21.440
I, you know, I just need some ideas. I use it sometimes instead of just googling.

00:37:21.440 --> 00:37:27.160
Definitely not every time. Yeah. Yeah. And, and I like, I feel like that's, that's fine.

00:37:27.160 --> 00:37:34.040
I, I just, it's just concerning the extent to which our society and certain

00:37:34.040 --> 00:37:38.920
segments of the society that are like very pro AI are like making AI such a

00:37:38.920 --> 00:37:43.560
daily part of their lives that they can't imagine not having it anymore.

00:37:43.680 --> 00:37:50.040
Yeah. Cause like when you, when you, when that like ideation part of the process is so

00:37:50.040 --> 00:37:53.840
streamlined and so automated for every project that you do, if you don't have

00:37:53.840 --> 00:37:57.000
access to it, and maybe they're like, the argument is like, well, we'll never not

00:37:57.000 --> 00:38:00.280
have access to it. It'll just be embedded everywhere all the time in the future.

00:38:00.360 --> 00:38:05.840
But like, what if, as an experiment, you take it away? And now you're just like,

00:38:09.080 --> 00:38:13.440
I don't know. You know, like, I mean, if you think about like people

00:38:13.440 --> 00:38:17.840
compare it to the calculator and it's like, okay, the calculator existing

00:38:17.840 --> 00:38:20.280
made us worse at like mental and paper math.

00:38:20.960 --> 00:38:24.800
This existing might make us worse at like having ideas that are good.

00:38:24.880 --> 00:38:28.520
That's a different level of problem.

00:38:28.520 --> 00:38:33.720
I am sympathetic to the sort of like, oh, LLMs are just like a calculator or

00:38:33.720 --> 00:38:38.360
just, you know, like any other tool that we came up with to speed things up that

00:38:38.480 --> 00:38:45.120
it's like, yes and no. Yeah, it is and it can be and it's useful to have that available.

00:38:45.160 --> 00:38:53.200
But some people have romantic relationships with it. I'm sure somebody was really into TI-83s, but like, I don't think that was a big

00:38:53.200 --> 00:39:00.840
thing. It's a big thing that people are like, have relationships and name and feel close

00:39:00.840 --> 00:39:07.880
with their like chatbots. Yeah, that's, I think that's, I don't think that there's any way that that is

00:39:07.880 --> 00:39:12.440
good. I'm trying to think like, because obviously the argument is that this

00:39:12.440 --> 00:39:18.280
person, say you're extremely, extremely lonely, you're depressed, you need some

00:39:18.280 --> 00:39:23.200
kind of, you might have something barring you from being able to have normal

00:39:23.200 --> 00:39:30.320
social relationships. So the, I think the problem and talking to someone can kind of alleviate those

00:39:30.320 --> 00:39:33.920
feelings and you feel like, you know, you don't feel so alone.

00:39:34.160 --> 00:39:38.120
The problem is that it's like, it's a temporary fix. It will fix that.

00:39:38.240 --> 00:39:46.360
And maybe you even like keep using it and using it and using it. And you don't really get to a point where it, you know, the, the usefulness

00:39:46.360 --> 00:39:49.360
of it like bottoms out, but that's where you have these people who are like, I'm

00:39:49.360 --> 00:39:53.200
going to marry a chatbot now. And I think that, that point is clear.

00:39:53.240 --> 00:40:00.160
That's just clearly non-functional. When, when 4, whatever, 4o, that

00:40:00.160 --> 00:40:05.560
we were using updated to 5 and people were like, oh my God, my partner is dead.

00:40:06.120 --> 00:40:09.400
Yeah. Because some external company decided to update something.

00:40:09.400 --> 00:40:13.480
It's like, we need to step back and think about what's happening.

00:40:13.520 --> 00:40:18.480
AI is like, because I feel like you and I, these conversations, we end up just

00:40:18.480 --> 00:40:22.160
bashing AI the whole time. And, and I use it a lot though.

00:40:23.000 --> 00:40:27.880
Okay. Not on the scale of some people, not on the scale of me.

00:40:28.760 --> 00:40:32.000
No, not on the scale of like this, this person you're talking about on that

00:40:32.000 --> 00:40:38.040
podcast, um, but I, it's decently common that I'll do at least one prompt a day,

00:40:38.640 --> 00:40:45.880
but I'm also very much on the, I often don't have lengthy conversations with it.

00:40:47.520 --> 00:40:51.120
I'll do my like sentiment analysis thing. It'll give me one analysis and I'm like, good enough.

00:40:51.160 --> 00:40:58.400
And then I do the rest of it from there. So my, the time that the like tab is open will sometimes be like a minute.

00:40:59.320 --> 00:41:05.440
And then I'm right. And I think that that's healthy. I feel like, I feel like, but like this is the thing is that I was just going to,

00:41:05.440 --> 00:41:11.280
before you said that I was just going to say AI is something that if humans en

00:41:11.280 --> 00:41:14.680
masse could use it in moderation, would be really good.

00:41:14.680 --> 00:41:19.120
I think so. I think the problem is that I don't think they can, like, we're not exactly

00:41:19.120 --> 00:41:22.520
good at anything in moderation. Yeah. Yeah, exactly. But I don't know.

00:41:23.040 --> 00:41:27.480
I feel like there is a possible future where, you know, there's some cultural

00:41:27.480 --> 00:41:33.280
shift and it becomes popular and, and, and ingrained that like the good thing is

00:41:33.280 --> 00:41:38.160
to use it in moderation, and it becomes cringe if you're overusing it. I mean, it already kind of is there.

00:41:38.160 --> 00:41:42.400
I feel like right now we're still like in the grand scheme of things.

00:41:42.440 --> 00:41:46.240
I would say, you know, you could categorize this as still sort of early days for

00:41:46.240 --> 00:41:51.560
AI and public sentiment is negative right now because people are skeptical.

00:41:52.040 --> 00:41:56.680
And when we're in harsh economic times, people are worried about losing their jobs.

00:41:56.920 --> 00:41:59.880
Yeah. Yeah. And I think that will shift eventually.

00:42:00.560 --> 00:42:05.040
We will get to a point where public sentiment is better towards AI.

00:42:05.080 --> 00:42:10.240
Maybe not all the, maybe not majority positive, but like it'll be more neutral than

00:42:10.240 --> 00:42:15.000
this, uh, cause right now it like based on the Pew research stuff, it's like pretty

00:42:15.000 --> 00:42:21.960
negative among the general public. Yeah. Um, and if we get to a point where like people are less fearful about it and

00:42:21.960 --> 00:42:25.920
there's less negative sentiment toward it, there might kind of emerge a culture

00:42:25.920 --> 00:42:29.880
where people are doing what you're doing, where you use it a little bit.

00:42:30.120 --> 00:42:34.440
You close it down. All right. I'm done using that right now. It's not going to be your whole personality.

00:42:34.480 --> 00:42:36.600
You know, you're, uh, just going to use it a little bit.

00:42:37.600 --> 00:42:42.280
I don't know. I feel like that, I forgot where I was going with that, but yeah, I could be.

00:42:43.160 --> 00:42:46.800
I also think, to try to flip it on its head a little bit and speak more

00:42:46.800 --> 00:42:52.240
positively about it, I think there is a significant opportunity for a renewed

00:42:52.240 --> 00:42:55.880
Renaissance type thing. I've talked about this a little bit publicly, but, uh,

00:42:55.880 --> 00:43:02.000
polymath, the, I think the ability to be polymath-like has never been more

00:43:02.000 --> 00:43:08.280
possible than now. Um, and I think if embraced properly in the pursuit of that, people being able

00:43:08.280 --> 00:43:13.200
to like, there's this, I always see everything as like, basically mega

00:43:13.200 --> 00:43:16.400
corps slash ultra-rich versus everyone else.

00:43:16.920 --> 00:43:21.320
And this is one of those opportunities where like you have this tool, which can

00:43:21.320 --> 00:43:27.880
potentially help you hack the planet, grow a lot faster than you otherwise

00:43:27.880 --> 00:43:31.880
could have at whatever thing you're trying to do. A lot of these, these resources are already there.

00:43:31.880 --> 00:43:34.880
Wikipedia existed. YouTube has tons of amazing resources on it.

00:43:35.280 --> 00:43:40.520
Um, but search has been steadily getting worse for a very long time.

00:43:40.520 --> 00:43:45.120
This is significantly before AI came around. Search has been getting just worse and worse and worse and worse.

00:43:45.160 --> 00:43:50.240
So in a, in a big way, using it to do micro refinements or these like, uh,

00:43:50.280 --> 00:43:55.600
rubber ducky conversations that I've been talking about, um, and also to just get

00:43:55.600 --> 00:44:01.040
you started down a path. Like we've had this like very... we found mold in my condo.

00:44:01.280 --> 00:44:09.760
We've had this very disastrous situation right now. We've had to do, uh, and there's been a lot of like home reno work that I've

00:44:09.760 --> 00:44:13.520
needed to do that I've had no idea, like, how to even start.

00:44:13.760 --> 00:44:16.560
I don't know what the right keywords are to search things up.

00:44:16.880 --> 00:44:19.800
I don't know how to search it on YouTube because I don't even know what it is.

00:44:20.000 --> 00:44:24.920
So I'll like describe the problem and it'll be like, Oh, you're doing this thing.

00:44:25.240 --> 00:44:28.240
These are the types of tools people use, whatever. And then I'll be like, Oh, okay.

00:44:28.280 --> 00:44:31.280
And then I'll go to YouTube and find the better, more refined resources or

00:44:31.280 --> 00:44:40.120
whatever else. It's the starting point. And the, the speed at which I'm able to get to a good answer that isn't answered

00:44:40.120 --> 00:44:45.160
by AI, it's answered by, um, I don't know, Tim's hardware tips.

00:44:45.160 --> 00:44:48.960
This is not a real YouTube channel, but sure, whatever, whoever that has some

00:44:48.960 --> 00:44:56.240
video that's like nine years old on how to do whatever. Um, I can find that, figure out the answer, go to the store, buy the stuff, come back,

00:44:56.240 --> 00:45:00.320
do the thing way faster than what I used to have to do before.

00:45:00.320 --> 00:45:08.000
So like that, so that kind of sounds like, uh, how this 80,000 hours guy was, sorry,

00:45:08.000 --> 00:45:15.440
it's, he's sticking in my mind because I was just so taken aback that they just so casually were just like, how can anyone think it's bad?

00:45:15.440 --> 00:45:18.240
Yeah, yeah, um, but it's similar to that.

00:45:18.440 --> 00:45:25.760
And I can see it. I can see the, I can see that being useful because it's, uh, AI optimally, I feel,

00:45:25.800 --> 00:45:29.280
well, optimally, it just kind of knows everything and is a perfect, whatever,

00:45:29.320 --> 00:45:32.440
like robot assistant and it's reliable.

00:45:32.880 --> 00:45:41.520
It's not, we're not going to get there. That's utopian. A possible, uh, future that is good is we have like, you know, droids basically

00:45:41.520 --> 00:45:45.120
from Star Wars where they're like, well, maybe not those ones, but like we have a,

00:45:45.240 --> 00:45:50.600
we have assistants, we have assistants who are like pretty knowledgeable about most

00:45:50.600 --> 00:45:56.120
things and you can like ask it, like if I'm doing a home, this type of home,

00:45:56.160 --> 00:46:00.960
reno, like what kind of tools do I need? Like what do I need to like be careful about and let it think about or whatever.

00:46:00.960 --> 00:46:04.480
And they'll tell you, they'll, they'll, they'll be like, I think generally, you

00:46:04.480 --> 00:46:07.520
know, it's, it's like having a buddy who's just kind of like a Swiss army knife

00:46:07.520 --> 00:46:10.560
and is just like well traveled and kind of, like, well-read and they just kind of

00:46:10.560 --> 00:46:14.360
know a bunch of stuff and you're like, do you know anything about that? And they're like, oh yeah, I do know a little bit.

00:46:14.400 --> 00:46:20.640
I know that you probably need this and you probably need this. And you're like, okay, thanks. And then that gives you a jumping off point to like actually find out for real

00:46:20.640 --> 00:46:23.920
and confirm, but I think here's the negative other side of that finding out

00:46:23.920 --> 00:46:26.960
and confirming. The issue is that people think that talking to a

00:46:26.960 --> 00:46:30.160
chatbot is finding out and confirming, but here's a, I'm going to, I'm going to

00:46:30.160 --> 00:46:38.240
flip the other side of the coin again. Cause I just do this constantly, um, is that it kind of sucks because I could

00:46:38.240 --> 00:46:42.480
have asked someone in my life, these questions and then that would have been

00:46:42.480 --> 00:46:49.720
potentially positive social interactions. Like I have people that I've known for 25 years who won't ask me like basic computer questions

00:46:49.840 --> 00:46:54.120
and they'll be like, oh, but like, you kind of

00:46:54.120 --> 00:46:57.240
do that for work and stuff. And I'm like, no, like I'm your friend.

00:46:57.280 --> 00:47:02.280
This is what I'm supposed to do. Like something that has always bothered me is, uh, there's a subreddit for it.

00:47:02.480 --> 00:47:07.560
There used to be a forum, uh, a forum thread on, on the LTT forum of like

00:47:07.800 --> 00:47:13.360
basically like, uh, it's the holidays and my mom wanted help with her computer

00:47:13.360 --> 00:47:17.640
because I'm home for Christmas. My life sucks.

00:47:17.960 --> 00:47:22.160
And it's just like, man, just learn how to click the mouse. Just install Linux.

00:47:22.160 --> 00:47:26.360
It's easier, install Linux, installing Linux is so easy.

00:47:26.480 --> 00:47:30.640
I've always thought it was wild. Like there was a subreddit that was effectively tech support complaining

00:47:30.640 --> 00:47:34.520
that like anyone ever had a ticket because they're like, how do you idiots

00:47:34.520 --> 00:47:39.800
not know how to use compoobers? And like sometimes it's, it's kind of fair, but some people are just mean.

00:47:39.960 --> 00:47:42.680
Do you idiots not have my exact life experience?

00:47:43.480 --> 00:47:48.600
Yeah. Cause that would make you an idiot. And like a lot of it's like, bro, this is why you have a job.

00:47:48.640 --> 00:47:52.080
Yeah. Like stop complaining. Like, I do to a certain degree.

00:47:52.160 --> 00:47:55.360
Okay. Yeah. If they're being like verbally abused or whatever, then they're just

00:47:55.520 --> 00:48:00.120
dicks and screw those guys and who cares. But when it's just like, I don't know how to computer very good.

00:48:00.120 --> 00:48:03.120
It's like, cool. That's not their job. That's your job. That's fine.

00:48:03.120 --> 00:48:08.200
You should feel good for that. I mean, I, so I will say, like I'm not even really at that.

00:48:08.240 --> 00:48:13.000
I'm not a big techie guy among, among all these people, among this company,

00:48:13.240 --> 00:48:20.360
I'm like one of the less techie people. But in my circles, where I come from, I'm very techie, yeah, extremely.

00:48:20.360 --> 00:48:24.240
And so people will ask me like to do stuff with their computers and I am like

00:48:24.240 --> 00:48:29.760
reasonably techie. I feel like among the general population, I feel like I'm medium high techie.

00:48:29.800 --> 00:48:34.320
You know? Yeah. Um, and I would spend hours.

00:48:35.160 --> 00:48:43.440
I installed TeamViewer on my, on my grandma's PC and I would spend hours trying

00:48:43.440 --> 00:48:46.920
to fix her shit for her. It would take so long.

00:48:46.920 --> 00:48:50.360
So I understand, I understand being like, oh, I'm annoyed, you know, but at the

00:48:50.360 --> 00:48:56.080
same time it's like, this is something that makes you unique. You know, like I feel like when people ask you for your help, it's because

00:48:56.080 --> 00:48:59.440
they respect you and your skills. That's like family stuff, right?

00:48:59.440 --> 00:49:03.720
Like I'm sure not every single thing your grandma ever did for you was.

00:49:04.560 --> 00:49:07.160
Something that she was hyper excited to do.

00:49:08.440 --> 00:49:11.800
Um, I don't know. Grammys love their grandkids. You don't know.

00:49:11.800 --> 00:49:15.960
Grammys. I don't know. I mean, and I was a sweet kid.

00:49:16.880 --> 00:49:20.400
It doesn't matter actually, I'm perfect. So, uh, but yeah, I don't know.

00:49:20.400 --> 00:49:24.640
Anyways, back on, back on the, on the mainline thing. Five minutes left before your

00:49:24.640 --> 00:49:28.880
next meeting. Leave us alone for four minutes and 30 seconds. Trying to kick us out.

00:49:29.040 --> 00:49:33.240
It's Luke Week, Sammy. You have a meeting though. You're the worst.

00:49:33.360 --> 00:49:36.560
When are you going to just let him live? When are you going to do Sammy Week?

00:49:36.600 --> 00:49:44.120
Yeah. When you do Sammy Week, whatever. No one wants that. You could rip packs on company time with company money, rip packs, maybe get

00:49:44.120 --> 00:49:53.520
Pokemon cards, open Pokemon. Oh, I was like, like I was like, criticized the packs event rip on packs.

00:49:53.520 --> 00:50:01.160
Like, anyways, any, any closing thoughts before your meeting? But to finalize

00:50:01.160 --> 00:50:05.800
that thought, yeah, it sucks that it's potentially replacing positive social interactions.

00:50:05.800 --> 00:50:09.600
Sorry. And I, yeah, I did have a thought about that because I'm like, some of these AI,

00:50:09.640 --> 00:50:15.760
there's a word for anti-doomer, I forget what it is. Uh, they think that AI will.

00:50:15.880 --> 00:50:19.880
Joy Maxer. I have no idea, but yeah, hope Maxer.

00:50:19.920 --> 00:50:26.320
They think that AI will improve people's ability to live. Like, accelerationism, that's

00:50:26.320 --> 00:50:30.200
one, but I don't think... that's not quite the anti-doomer thing that I think it

00:50:30.200 --> 00:50:35.520
would have, but anyways, it's fine. They think that it'll help that too, the social aspect, because it's like, well,

00:50:35.520 --> 00:50:40.640
because they're like, you'll be able to offload all of your drudgery.

00:50:40.720 --> 00:50:47.160
And so then you have more time to talk to people. But I think that the, what they're, what that perspective is missing is the fact

00:50:47.160 --> 00:50:54.000
that like, why would you talk to your buddy who's like into computers or, or

00:50:54.000 --> 00:50:57.400
whatever, in order to answer a question about something or to like get a hint

00:50:57.440 --> 00:51:02.200
about something or to validate one of your thoughts, if you think that you're

00:51:02.200 --> 00:51:06.160
going to get a more accurate response and a more useful response from an AI,

00:51:06.400 --> 00:51:11.080
why would you ever do that? I mean, I'm sure that some people would choose to do that because they are

00:51:11.160 --> 00:51:14.360
choosing to foster social connections, but like, you could do that without asking.

00:51:14.360 --> 00:51:22.520
Actually, you could just ask them how their day is or how their life's been, you know, and I feel like there is, I feel like there is no world where AI doesn't

00:51:22.520 --> 00:51:26.000
have a negative impact on social connection.

00:51:26.040 --> 00:51:32.000
Absolutely. Yeah. Because I'm already, I feel like it won't really impact me because I'm already

00:51:32.000 --> 00:51:34.960
somebody who doesn't, who doesn't do that.

00:51:35.880 --> 00:51:41.240
I will absolutely just Google stuff instead of like asking someone that knows

00:51:41.240 --> 00:51:45.600
a lot about it. And, but I know that a lot of other people, because people ask me, they'll be

00:51:45.600 --> 00:51:51.000
like, what do you think about this? Like my brother just texted me being like, what's the best, what's the best switch

00:51:51.000 --> 00:51:57.400
2 Pro, like Switch 2 controller? And I was like, I don't know, but I'll Google that for a couple of

00:51:57.400 --> 00:52:02.520
minutes and then be like, yo, here's an article that lists some good ones. And like this Reddit thread probably has some good recommendations.

00:52:02.720 --> 00:52:05.120
And he's like, thanks, I would not have texted anyone.

00:52:05.840 --> 00:52:09.800
I would just Google, you know, some people don't have the Google-fu though. I do have Google-fu.

00:52:09.880 --> 00:52:15.840
Yeah. So yeah, people, that's the only reason people text me these days is to Google

00:52:15.840 --> 00:52:19.320
stuff for them and I'm like, just ask the AI, send them.

00:52:19.360 --> 00:52:23.000
You should go back 15 years and send them Let Me Google That For You links.

00:52:24.120 --> 00:52:28.400
They're like, what is this? I did that recently and got a like, what?

00:52:28.680 --> 00:52:33.320
As a response, what is this website? And I was like, hmm, I am old.

00:52:35.920 --> 00:52:40.080
It's happening. What are your thoughts on AI? Sammy, what do you use it for?

00:52:40.160 --> 00:52:46.000
Sammy? I use it to actually do some finance stuff.

00:52:46.000 --> 00:52:48.640
So I take like my credit card statements.

00:52:49.040 --> 00:52:51.320
I let it categorize stuff for me and then just input it correctly.

00:52:52.320 --> 00:52:55.760
So then I see how much I spend, like, categorized. What are you using for this?
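
NOTE A rough sketch, not from the conversation, of what the kind of credit-card categorization Sammy describes could look like if you scripted it against the OpenAI Python client instead of pasting statements into the chat window; the category list, model name, and sample transactions below are placeholder assumptions, not what he actually uses.
from openai import OpenAI
client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
CATEGORIES = ["groceries", "dining", "transport", "subscriptions", "other"]  # hypothetical buckets
def categorize(transactions):
    # One statement line per transaction; ask the model to tag each with exactly one category.
    prompt = (
        "Categorize each credit card transaction into exactly one of: "
        + ", ".join(CATEGORIES)
        + ". Reply as 'description -> category', one per line.\n"
        + "\n".join(transactions)
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content
print(categorize(["2025-01-03 SAFEWAY #123 54.10", "2025-01-05 NETFLIX.COM 16.49"]))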

00:52:55.880 --> 00:53:01.120
Huh? ChatGPT. Why? Because you're giving it all your data.

00:53:01.480 --> 00:53:04.560
Luke is concerned about data. I'm my game.

00:53:04.560 --> 00:53:08.200
This is a whole other topic. I'm almost like my ideas are right out there.

00:53:08.360 --> 00:53:11.880
Yeah. And I'm like such an insignificant person to like in the grand scheme of things.

00:53:11.920 --> 00:53:15.160
It's honestly like you're like a you're like a super privacy person.

00:53:15.160 --> 00:53:18.160
I feel like a lot of people are now blackpilled and just being like whatever.

00:53:18.160 --> 00:53:21.600
And he's not even necessarily wrong. The credit card company is almost certainly selling his data.

00:53:21.600 --> 00:53:25.680
And my phone is listening. My credit, like, like, my credit card is on my phone.

00:53:25.680 --> 00:53:31.120
So it's tapping it, so my phone's... whatever, I got it. So you know what I mean. And your credit card company is selling it, they have it.

00:53:31.160 --> 00:53:38.320
I don't know, you probably already know this. But I just recently found out that the Tor network is government funded.

00:53:39.160 --> 00:53:42.360
And it's like they get almost, so do a lot of VPNs,

00:53:42.360 --> 00:53:53.000
the vast majority of their funding from the government because they initially built it to hide the activity of spies and stuff.

00:53:53.200 --> 00:53:57.200
And they're like, we need to use an anonymous network.

00:53:57.720 --> 00:54:01.480
And so we're going to let other people use it.

00:54:01.480 --> 00:54:05.960
So there's more activity going on that masks the stuff we're doing on the Tor network.

00:54:07.760 --> 00:54:12.560
So but it's over, man. Yeah, privacy is dead.

00:54:13.080 --> 00:54:19.200
Yeah, it's... I mean, I'm, I'm, if I wasn't doing a house reno right now, I'd be building a local

00:54:19.200 --> 00:54:26.520
LLM system, and like there's a lot of stuff that I don't... there is no outside traceability stuff.

00:54:27.240 --> 00:54:30.440
Like it drives me. That's a whole other side.

00:54:30.440 --> 00:54:33.760
See, we could talk forever, but we're limiting it to one hour, but I'm letting you wrap up.

00:54:33.760 --> 00:54:42.960
I'm letting you wrap up. We were talking before the show. Yeah, you should check out what PewDiePie is doing, which sounds like the most insane thing.

00:54:43.920 --> 00:54:48.000
We got a couple minutes. Oh, it's 2:54. Yeah, so I'm trying to cut you guys, cut you guys off.

00:54:48.200 --> 00:54:51.640
I do need to end, actually. Yeah, he's got a meeting. It's not for you guys. Don't get mad at me.

00:54:51.640 --> 00:54:55.080
Check out. Let me find it really quick.

00:54:55.280 --> 00:55:00.040
I think it's the video called Stop Using AI Right Now, or I Accidentally Built

00:55:00.040 --> 00:55:04.320
a Supercomputer, built a nuclear supercomputer. I don't remember which one.

00:55:04.320 --> 00:55:07.400
It's one of those. And it's it's wild.

00:55:07.400 --> 00:55:11.040
Yes, if I remember correctly, it's eight GPUs. They're all running an LLM individually.

00:55:11.120 --> 00:55:17.160
He has them act as a council, so you ask it a question. They'll all come up with an answer and then they vote on each other's answers.

00:55:17.160 --> 00:55:26.040
And the one with the most votes is the one that's presented. And if if one of them consistently loses, they'll like kill it basically and spawn

00:55:26.040 --> 00:55:30.160
another one. So his council is like constantly refining itself.
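
NOTE A hedged sketch, not PewDiePie's actual code, of the council idea described above: several local models each answer, they vote on one another's answers, and a member that keeps losing gets killed and respawned. The ask_model helper, model names, and loss threshold are illustrative assumptions; a real setup would call local LLMs and ask each one to judge the candidate answers rather than voting at random.
import random
from collections import Counter
def ask_model(name, prompt):
    # Placeholder: a real setup would call a local LLM here (e.g. over an HTTP endpoint).
    return f"{name}'s answer to: {prompt}"
def council_answer(models, prompt, loss_counts, max_losses=3):
    answers = {m: ask_model(m, prompt) for m in models}
    votes = Counter()
    for voter in models:
        # Each member "votes" for someone else's answer; random choice stands in for a judging prompt.
        votes[random.choice([m for m in models if m != voter])] += 1
    winner, _ = votes.most_common(1)[0]
    # Track consistent losers; kill and respawn any member that loses too many rounds in a row.
    for m in list(models):
        loss_counts[m] = 0 if m == winner else loss_counts.get(m, 0) + 1
        if loss_counts[m] >= max_losses:
            fresh = f"{m}-respawn"
            models[models.index(m)] = fresh
            loss_counts[fresh] = 0
    return answers[winner]
models = [f"model-{i}" for i in range(8)]  # eight local LLMs, as described
losses = {}
print(council_answer(models, "What should I have for dinner?", losses))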

00:55:30.360 --> 00:55:37.840
Oh, man, it's and it's all local. He's going to be tried for for for civil rights violations.

00:55:37.840 --> 00:55:42.440
Apparently, apparently they started revolting and they realized that if they

00:55:42.440 --> 00:55:48.800
start getting too few votes, they're going to get killed. So they would start voting for the one who had been losing too often to try to

00:55:48.800 --> 00:55:55.040
like protect themselves and stuff. See, it's almost it's funny, like that that kind of thing where it's like you

00:55:55.040 --> 00:55:59.200
kind of personify them and you're thinking about them as like individuals almost.

00:55:59.240 --> 00:56:02.720
It's like that is that that is the stuff that really, really interests me.

00:56:02.720 --> 00:56:06.880
Maybe we're going to talk about this on, like, the next one. You said earlier that Gemini watches videos.

00:56:07.520 --> 00:56:10.680
Well, OK, well, I feel like it doesn't watch, whatever, it analyzes.

00:56:10.680 --> 00:56:14.080
But this is like this is it is it is an issue.

00:56:14.120 --> 00:56:20.200
We shouldn't personify it so much. I completely agree. Well, but this is this is to me is like the most interesting

00:56:20.200 --> 00:56:27.040
thing about AI that I feel like we're not allowed to talk about. Like everyone's so focused on the productivity and the functional nature of it.

00:56:27.040 --> 00:56:30.440
But like me, I'm a very non-functional.

00:56:30.720 --> 00:56:35.600
I like I like thinking about the least useful things to think about in the world.

00:56:35.840 --> 00:56:38.960
Philosophical questions that will never be answered. I love it.

00:56:38.960 --> 00:56:42.360
And I feel like all I want to talk about and think about is like,

00:56:43.040 --> 00:56:48.360
can we make a sentient conscious artificial brain?

00:56:49.240 --> 00:56:53.680
And that's a good place to end it all. Thank you for Luke Week.

00:56:53.680 --> 00:56:56.640
Bye. Stay tuned for the next video.

00:56:57.000 --> 00:57:00.880
It's it's it's a thing. Your essay that you haven't written yet.

00:57:00.880 --> 00:57:04.640
You're going to do a Star Wars rant? I can't... never mind. I referenced your.

00:57:05.400 --> 00:57:11.760
Yeah, I mean, you're in that style. Don't do a Star Wars one. I mean, you can. But I don't think you... what, how long did you stay up? Tell me.

00:57:12.320 --> 00:57:16.360
I think I did multiple all-nighters for that. It's mostly to find something.

00:57:16.360 --> 00:57:20.760
I have a bunch of different ideas in my head. I just haven't picked one. You can do it. Whatever it is, it'll be good, though.

00:57:20.760 --> 00:57:24.400
Hopefully. The last one, people liked it, it was just very good.

00:57:25.040 --> 00:57:29.800
It was good. Last time it was just really short. I thought it was going to be like 10 times as long, like, for chicken.

00:57:30.400 --> 00:57:33.800
No, no, no. Oh, oh, yeah. Yeah. Yeah. You talked to me about that.

00:57:33.840 --> 00:57:39.160
I was like, what the heck? Like, this is so long. You know, if you watch that video and you look at what's happened since then,

00:57:40.400 --> 00:57:45.440
I was right and it was fine. Just write something and keep going until you feel like yeah, yeah.

00:57:45.440 --> 00:57:47.880
Just keep going. People will watch a 30 minute video from you.

00:57:48.840 --> 00:57:51.800
Oh, boy. Leave a like if you want part two of this kind of thing.

00:57:51.960 --> 00:57:55.640
I need to do Don't ruin my life, Todd Howard.

00:57:55.640 --> 00:57:59.320
That's currently what I'm thinking of the for the next video.

00:57:59.840 --> 00:58:03.640
All right, let's wrap up. Bye, guys. Bye bye. And it sucks. Bye.

00:58:03.640 --> 00:58:05.400
I'm going to die. All right. Bye.
