WEBVTT

00:00:00.000 --> 00:00:09.200
You'll own no IP at all, and you'll be happy. I will be sad. What? Nope, not allowed

00:00:10.400 --> 00:00:13.800
That's against the rules. The AI will stop you from being sad

00:00:14.360 --> 00:00:19.920
How could you be sad when you live in a utopia where everything's controlled and, oh, everything's generated?

00:00:20.920 --> 00:00:26.120
What are we talking about here, Luke? Powered by the IP grinding machine. We're talking about

00:00:26.960 --> 00:00:34.240
IP, we're talking about AI policy, we're talking about how this stuff is gonna possibly work, because there's a huge battle going on between

00:00:34.680 --> 00:00:37.600
North American and European AI companies and

00:00:38.800 --> 00:00:43.800
mostly Chinese AI companies. DeepSeek came and took everyone's lunch, and what are we gonna do?

00:00:44.120 --> 00:00:48.240
You... it's like, you host a stream or something? You've done this before, maybe

00:00:49.160 --> 00:00:53.760
We haven't done a TalkLinked in a hot second, but then... and you've never been on. No

00:00:53.800 --> 00:00:57.440
I'm so excited. Yeah, and also you get to talk about AI

00:00:57.960 --> 00:01:04.480
Wait, I just... Linus doesn't let you... you know, he whips me as soon as we get to an AI topic

00:01:05.000 --> 00:01:10.760
And I know this because I prepare the topic sometimes. Yeah, I'm like, oh they're gonna like this topic

00:01:10.760 --> 00:01:13.880
I stopped preparing them because he just gets there and he goes on his phone

00:01:13.880 --> 00:01:18.320
Yeah, he has literally done that and then he'll get at me for not paying attention during other topics

00:01:18.320 --> 00:01:21.440
But whatever yeah, but now you get to talk about it

00:01:21.440 --> 00:01:26.760
So the kind of impetus for all of this was Jack Dorsey, the founder

00:01:27.400 --> 00:01:33.700
and former CEO of Twitter. Yeah, now X, which I still call Twitter. Yeah, tweeting

00:01:33.880 --> 00:01:43.240
"delete all IP law" on April 11th, and it kind of sparked some discourse. Yeah, that's part of what, in my opinion, are like the three

00:01:43.600 --> 00:01:50.200
Core topics around AI policy right now, which I laid out as open versus closed models

00:01:50.320 --> 00:01:54.240
Copyright and IP reform and AI safety versus innovation speed. Oh heck

00:01:54.240 --> 00:01:59.600
I think that's like one of the really core things is what do we do with IP law?

00:01:59.600 --> 00:02:03.160
Because when you're looking at the race for AI stuff

00:02:03.200 --> 00:02:08.560
It's becoming more and more like country or alliance based instead of company based

00:02:08.560 --> 00:02:15.560
Yeah, a lot of people are looking at it like, okay, the West versus the East, I guess, or right now basically DeepSeek versus

00:02:15.560 --> 00:02:17.560
everyone yeah, yeah, and

00:02:18.560 --> 00:02:27.920
There are differences in how the West and China deal with IP law. Do I understand Chinese IP law?

00:02:28.360 --> 00:02:35.680
Not even sort of. Yeah, I kind of started to Google it in preparation for this and I'm kind of like, oh, this is gonna be too complicated

00:02:35.880 --> 00:02:41.960
What my not enough googling has told me is that they are much more lax on IP law

00:02:42.200 --> 00:02:46.840
Especially when it comes to outside of their own country, but also within their own country and

00:02:47.280 --> 00:02:52.400
Especially when it has to do with anything that could benefit the state, right? Because over here

00:02:52.400 --> 00:02:57.760
You know, the North American kind of capitalist mindset is, like, focused, right?

00:02:57.760 --> 00:03:03.160
It's like, let people go out there, struggle, whoever rises to the top is good. And China also does that

00:03:03.160 --> 00:03:09.360
I know they do that with, like... but it's more like the government directs these little experiments to happen

00:03:10.360 --> 00:03:16.940
So if something is like oh, this will be beneficial to the nation then they can just kind of override

00:03:16.960 --> 00:03:21.120
Yeah, they just make it happen. Yeah, which, when you're in a race like this

00:03:21.800 --> 00:03:31.720
Could be very beneficial, when the other side, the West, is squabbling over: are we even allowing these companies to access this data at all?

00:03:31.800 --> 00:03:36.680
Which I think what a lot of the argument hinges on right is like the

00:03:37.600 --> 00:03:42.200
morality around letting these gigantic companies that don't care about you and

00:03:43.160 --> 00:03:47.920
Want to loot you for everything you have? Mm-hmm take all of your IP as well or

00:03:49.200 --> 00:03:53.480
Lose the whole race and have it not matter anyways because a company in another country

00:03:53.480 --> 00:03:59.080
That's just gonna do that anyways ends up winning. Yeah, so, the argument for me...

00:03:59.080 --> 00:04:05.280
There's kind of two, well, there might be more than two, but the two perspectives that are kind of popping into my head here are the

00:04:05.800 --> 00:04:16.400
like, philosophical argument for AI copyright. Like, as a society, just regardless of, you know, where China's at, do we think that AI

00:04:16.920 --> 00:04:23.720
Created stuff deserves copyright? Do we think that an AI model could count as the author of something?

00:04:23.720 --> 00:04:26.680
There's all those questions and then there's the kind of more utilitarian

00:04:27.440 --> 00:04:32.120
Like practical question of like is this what we have to do in order to

00:04:32.760 --> 00:04:40.440
Keep up in the nuclear AI arms race with China. So which of those is more interesting to you? Well, I think right now the

00:04:41.640 --> 00:04:46.480
"Can an AI have ownership over something that it creates" thing

00:04:47.960 --> 00:04:53.320
Is technically somewhat decided in law right now, and it's no

00:04:53.960 --> 00:04:59.480
Right. Well, can an AI model count as the author? Yeah. Yeah, I think no right now

00:04:59.480 --> 00:05:05.200
It's no. Yeah, the copyright office has ruled a couple times. I think at least yeah about

00:05:06.200 --> 00:05:09.280
There's multiple precedents. I think there's one guy in particular

00:05:09.280 --> 00:05:15.480
We'll look it up later Maybe but like there's one guy in particular who has submitted multiple times to be like, okay

00:05:15.480 --> 00:05:18.080
This is definitely one for sure guys

00:05:18.600 --> 00:05:25.560
But I used an AI this time and this time this AI is sentient for sure guys and the copyright office is like stop coming back here

00:05:26.440 --> 00:05:30.880
And put on some pants. So there's precedent against that dude. Yeah. Yeah

00:05:30.880 --> 00:05:37.320
Yeah, but I mean, I think that the, like, do we get to keep copyright in an era where

00:05:37.480 --> 00:05:41.240
All these companies are just kind of gobbling everything up anyway

00:05:41.720 --> 00:05:47.240
Like, those are the interesting lawsuits that are moving through, the big one being OpenAI versus The New York Times

00:05:47.320 --> 00:05:52.920
Yeah. It's so interesting, like, ChatGPT has been out since

00:05:53.440 --> 00:05:59.200
2022, and obviously transformer-based LLMs, they were out for a while. I mean, they were in research

00:06:00.280 --> 00:06:06.120
Scenarios, but "Attention Is All You Need"... it's been, yeah, it's 2017. Yeah, I've done research

00:06:06.120 --> 00:06:10.200
I can't believe that this question is still kind of as open as it is

00:06:10.200 --> 00:06:15.480
I would have expected there to be some precedent set and a ruling set by that by now

00:06:15.480 --> 00:06:21.960
But we're still just kind of like, you know, Meta is also going through proceedings where they're like, you

00:06:22.760 --> 00:06:29.680
You literally torrented like every book ever. I mean not that you know, but like a massive archive of books

00:06:29.680 --> 00:06:34.200
Yeah, and like, that's the... and this, I'm not the first person to reference this at all

00:06:34.200 --> 00:06:37.960
But one of the co-founders, I think, of Reddit

00:06:38.520 --> 00:06:44.360
Mm-hmm. I want to say Ohanian? I don't remember the exact details of this story, but I believe he was pirating

00:06:45.480 --> 00:06:49.080
Like research papers and then making them available to people for free

00:06:49.720 --> 00:06:53.800
I think yeah, and then he was gonna get thrown in jail forever and ended up

00:06:54.760 --> 00:07:02.360
Unaliving, oh, oh, over it, because it like ruined his whole life. Wait, Aaron Swartz. You know, Aaron Swartz?

00:07:02.360 --> 00:07:08.600
Oh, okay. Yeah. Yeah. Yeah. Yeah. No, I'm familiar. Yeah. Yeah, and then now a company's doing it on a much grander scale

00:07:08.840 --> 00:07:13.560
Like oh, you know what that seems fine. That totally reminds me

00:07:13.560 --> 00:07:19.240
I saw an article recently, like, referencing Aaron Swartz. Yeah, and like how bonkers it is. It's like actually

00:07:19.880 --> 00:07:26.520
kind of similar. Well, it's crazy that that was the case just a few years ago

00:07:26.520 --> 00:07:35.400
Yeah, and really not that long ago, his entire life was gonna be ruined. Yep, and now every tech company ever is just

00:07:36.360 --> 00:07:43.000
Gobbling up. I mean I had as one of the points I wanted to talk about these like AI web crawlers that are

00:07:44.120 --> 00:07:46.440
Taking up tons of bandwidth. Oh, yeah

00:07:47.240 --> 00:07:52.920
We've referenced it a couple times on TechLinked But there are like a number of studies now and like reports from

00:07:53.800 --> 00:07:58.920
Administrators being like yeah, we looked into it and it's like 80 percent of our web traffic is just AI crawlers

00:07:59.240 --> 00:08:03.160
So regardless... I don't want to bash on AI completely, because

00:08:03.880 --> 00:08:10.680
I feel like there's an argument here, like, for AI, for embedding it in

00:08:11.320 --> 00:08:16.200
society at every level for even maybe deleting all IP law

00:08:17.080 --> 00:08:22.680
And the argument would be, if it actually leads to this

00:08:23.800 --> 00:08:30.040
Scenario that Jack Dorsey, and apparently Elon Musk, and whoever, because Elon Musk tweeted "I agree" as well,

00:08:30.680 --> 00:08:35.240
To delete all IP law. Yeah, if the argument

00:08:35.800 --> 00:08:41.240
Being put forward by them is, they're like, okay, this is actually going to help us, it's actually going to lead to a situation where we benefit

00:08:42.520 --> 00:08:49.400
Do we... like, if you could know deleting all IP law will lead to a kind of utopian society where you don't need money

00:08:50.200 --> 00:08:55.960
It feels like you know, there's a bunch of stories that are like this where you have utopian society and then like in the basement of city hall

00:08:56.040 --> 00:09:05.240
there's just like a terrible demon machine that is like the reason why everything's okay. I feel like this is one of those scenarios. I don't think

00:09:06.920 --> 00:09:13.640
And I'm decently confident this is the route it's going to go. I don't think deleting IP law is like the way to go

00:09:13.800 --> 00:09:18.200
I feel like what they're trying to do is like a gambit and overcorrection

00:09:18.680 --> 00:09:24.760
So they're trying to be like, let's delete everything hoping that they'll end up somewhere 50% of the way

00:09:25.000 --> 00:09:29.960
If that makes sense because I don't think they think that's going to happen either to be completely honest

00:09:30.360 --> 00:09:33.560
But I think they're trying to do the like Trumpian move of like

00:09:34.280 --> 00:09:39.560
I'm going to ask for this and then I'll end up somewhere here, you know, we'll delete half of IP law

00:09:39.640 --> 00:09:45.960
Yeah, well, I think it's going to be IP reform, because like a lot of the laws that were made weren't even really made

00:09:47.000 --> 00:09:54.120
In the internet age, let alone the, like, AI web crawler crazy situation we are in now

00:09:54.360 --> 00:10:01.480
So there's likely a lot of modernization and reform that needs to happen. I mean, we even have this problem with, like, YouTube videos

00:10:01.960 --> 00:10:08.600
Um, like there's this huge question mark of like, can I have a clip from a movie in my video?

00:10:09.240 --> 00:10:14.360
Yeah, like maybe but also maybe not. Yeah, I've been scared to put anything

00:10:14.760 --> 00:10:20.360
Yeah, that could go over the line. But then, like, all short-form content seems to just be able to use whatever music they want

00:10:20.440 --> 00:10:23.960
Yeah, and like that seems fine. I guess that's completely different, Luke

00:10:24.520 --> 00:10:28.120
It's totally different. One is watched by young people

00:10:29.080 --> 00:10:32.760
And the other... they don't have money. Why do they matter? They don't. They're too dumb

00:10:32.840 --> 00:10:40.120
They're young and dumb. They don't know about copyright. Yeah. Elijah, one of the parallels that I would draw is to patents, and it's not perfect

00:10:40.680 --> 00:10:47.400
But it's it's an interesting argument because there's a huge debate on whether or not patents have driven or stifled innovation in the past

00:10:47.800 --> 00:10:52.680
And I think it really depends on the time the country

00:10:53.240 --> 00:10:56.280
The field of innovation, like, it depends on tons of stuff

00:10:56.760 --> 00:11:01.800
A lot of people... this is the Wright brothers conversation, I warned you, but a lot of people look at the Wright brothers and are like

00:11:02.600 --> 00:11:10.120
Oh my god, they brought in aviation. This is amazing, right? We didn't think we were gonna fly for a long time, and then the Wright brothers got us up in the air

00:11:10.680 --> 00:11:16.680
That's true. That's sick. That's cool. They immediately turned into patent trolls as far as my understanding goes

00:11:16.760 --> 00:11:20.200
I could be wrong about this a little bit of online research led me to find that

00:11:20.440 --> 00:11:22.440
They got their patent and then just

00:11:24.440 --> 00:11:29.240
Just wrecked everyone else as much as they could and like very close to stopped innovating

00:11:29.320 --> 00:11:34.680
Oh, jeez. They like got aviation off the ground and then just tried to make money off of what they did

00:11:34.920 --> 00:11:41.880
Did you get this info from Grok? No. Okay, but there's also, like, in my opinion, the

00:11:43.160 --> 00:11:49.080
Which I don't think is as common anymore, but being an inventor as your trade as like what you do

00:11:50.440 --> 00:11:59.080
Probably became a thing because of things like patents and IP law. Oh yeah, he's an inventor, like, no one has thought of, you know, uh, an automatic egg

00:11:59.240 --> 00:12:04.120
Making machine or something. It's a chicken. At one point,

00:12:04.760 --> 00:12:07.560
IP law, copyright law, patent law, whatever, uh,

00:12:08.280 --> 00:12:15.240
Spurred innovation, and if we get rid of it, you're worried that we would lose that. But there's stories on both sides, because like

00:12:16.360 --> 00:12:18.920
Let's come back to more IP-specific law, right?

00:12:19.720 --> 00:12:23.320
Disney kind of owned children's entertainment for a long time

00:12:24.120 --> 00:12:28.360
Most of our lives, and a lot before it. Yes, and now they're losing

00:12:28.840 --> 00:12:33.160
But they're not losing because you can use Mickey Mouse in other content now

00:12:33.880 --> 00:12:40.040
Because that is the thing that happened. That was like a tiny splash: a bunch of games all came out at once, right when that got lifted,

00:12:40.280 --> 00:12:44.040
That all featured Mickey Mouse. Yeah, and none of them were good. Everyone stopped caring

00:12:44.440 --> 00:12:50.040
Almost immediately, no one talks about it anymore at all, and Bluey is just completely taking their lunch. Heck yeah

00:12:50.440 --> 00:12:55.000
Have you watched Bluey? I have seen some Bluey. Bluey's frickin'... it makes me cry

00:12:55.240 --> 00:13:01.560
They like lost their ability to protect their IP on Mickey, right, and it didn't matter. And that's why the MCU

00:13:02.440 --> 00:13:08.200
Isn't as good anymore, because they're losing there, because they didn't stop

00:13:08.760 --> 00:13:11.400
Uh, Steamboat Willie, from escaping

00:13:11.960 --> 00:13:14.520
He was going to join the Avengers

00:13:15.240 --> 00:13:19.880
Steamboat Willie was, yeah. Honestly, I would go watch that. I would. I haven't watched a

00:13:20.600 --> 00:13:25.560
Marvel movie in many years. If Steamboat Willie joined the Adventures... the Avengers, I would go

00:13:25.800 --> 00:13:30.360
I mean, I will say that, like, one argument for the deleting of IP law

00:13:31.080 --> 00:13:38.520
is the few times... there's been a couple times where Disney puts out a project and it seems like the entire point of

00:13:38.520 --> 00:13:44.280
The project is just to brag about how much IP they have. Like, did you see the Ralph Wrecks the Internet movie?

00:13:45.000 --> 00:13:52.280
Like, quite a few years ago. Did you... you saw it? I think so. Okay. Yeah, the whole movie just seemed like they were like,

00:13:52.440 --> 00:13:56.600
Wait, Wreck-It Ralph. So there's Wreck-It Ralph, and then the sequel was Ralph Breaks the Internet

00:13:56.920 --> 00:13:59.960
Wreck-It Ralph, they had a couple recognizable people. There was Sonic,

00:14:00.040 --> 00:14:07.000
There was like Bowser and stuff. But then they go to the internet in the second one, and they go to, like, the Disney website,

00:14:07.080 --> 00:14:12.520
And it's just like, wow, what a magical place. Look, there's Star Wars, and there's Lilo and Stitch, and...

00:14:12.600 --> 00:14:15.160
And like they're just showing all the princesses are there

00:14:15.960 --> 00:14:20.360
So it was basically like, look at all these things that used to be cool. Watching that movie

00:14:20.840 --> 00:14:26.280
Made me think, like, this is kind of like... if I watched that movie and then Jack Dorsey

00:14:26.600 --> 00:14:30.760
Called me up and was like, we should delete all IP law, I'd be like, Jack, you're so smart.

00:14:31.720 --> 00:14:40.280
Why don't we talk more often? No. Please call me. It kind of makes me feel that way, because it's like we're heading into this AI future

00:14:41.480 --> 00:14:46.600
Where if AI gets power, like, there's kind of two paths that I see. Yeah

00:14:48.520 --> 00:14:52.680
One is, we lock this down and we're like, enough,

00:14:53.320 --> 00:14:58.600
AI companies, you're going all over the place. You're slurping up everyone's stuff and making slop.

00:14:58.840 --> 00:15:03.480
We need to stop this. If you want to train on material, you got to license it.

00:15:03.480 --> 00:15:08.200
You got to pay all the people whose material you've already trained on. Like, either that... that's not going to happen,

00:15:08.600 --> 00:15:11.720
But, like, either we kind of move in that direction, or

00:15:12.680 --> 00:15:15.160
Things just kind of keep progressing as they have been

00:15:15.720 --> 00:15:20.120
And copyright and IP just kind of become

00:15:20.120 --> 00:15:25.080
This kind of nebulous... like, the walls get torn down a lot more. Yeah

00:15:26.040 --> 00:15:28.600
And it's a lot more sketchy

00:15:29.480 --> 00:15:34.040
As to what's infringing and what's not. You think we're going there? I think we're going there.

00:15:34.200 --> 00:15:40.120
So Jack Dorsey's gonna mostly maybe get his way? I think he's going to partially win. I don't think... like I said earlier,

00:15:40.120 --> 00:15:43.960
I don't think we're gonna be deleting IP law, but I think there will be significant reform,

00:15:44.680 --> 00:15:49.240
because I don't see America effectively just

00:15:50.120 --> 00:15:53.320
kind of being willing to lose the AI race.

00:15:54.120 --> 00:15:59.720
Because I see this as, in their opinion, this is computers. This is, uh, dot-com.

00:16:00.200 --> 00:16:05.640
They have to fight again. So I think I agree with you, if we're talking about, like, the powers that be.

00:16:05.960 --> 00:16:09.320
Yeah, the uh, you know tech corporations government

00:16:09.960 --> 00:16:19.640
They definitely don't want to lose to China. Yeah. The question then is, how do the people feel about it? Oh yeah, because every time I encounter

00:16:21.000 --> 00:16:27.640
sentiment about AI and copyright online... this is why this tweet was so interesting, because the vast majority

00:16:28.520 --> 00:16:32.520
of the discourse around AI and copyright online is like,

00:16:33.240 --> 00:16:39.560
Yeah, these AI companies are stealing from people. Well, they are. Yeah. Yeah, and you have the tech bros, you know,

00:16:39.560 --> 00:16:48.440
Whatever, coming out every once in a while being like, we need to chill out, guys. Like, how are we gonna get exponential growth if we don't steal all this copyrighted stuff?

00:16:48.840 --> 00:16:56.600
Um, but they don't usually... like, that's not getting a ton of engagement. But this is saying the quiet part out loud, pretty much.

00:16:57.400 --> 00:17:00.360
It's because it doesn't feel good. They are stealing stuff.

00:17:00.440 --> 00:17:05.240
They are taking other people's work and they are directly aggressively profiting off of it the

00:17:07.160 --> 00:17:11.640
Well, profiting off of investment, I guess. Most of these companies are really not making a profit.

00:17:11.720 --> 00:17:17.000
Yeah, they're not taking the copyright machine and putting it into a different machine, and it's like, wow,

00:17:17.080 --> 00:17:22.920
It's chewing up the copyright and spitting out profit. Yeah, they're tricking people into thinking it's a value machine, putting a cool

00:17:22.920 --> 00:17:27.720
Little bow on it and be like you want to give me money? Yeah for nothing. Yeah sick. Um

00:17:29.240 --> 00:17:37.160
The bubble, man. Yeah, the bubble's pretty... sorry, you were saying something, though. No, it's just, it doesn't feel good. Yeah, it's morally,

00:17:37.240 --> 00:17:42.440
I think, it's like, bad. You're taking significant work from people, especially...

00:17:43.320 --> 00:17:48.760
Oh, and we already see this from Google, right? A lot of small brands are getting screwed over by these search engines, and now

00:17:49.320 --> 00:17:54.200
also by AI systems, because everything's intercepting.

00:17:54.200 --> 00:17:58.840
Like, even right now, if you're using Google Search, not even if you're in a chat program,

00:17:58.840 --> 00:18:01.960
If you're using Google Search, it will give you an AI summary at the top.

00:18:02.520 --> 00:18:08.280
And the AI summary will often make it so that you do not have to go through and click onto one of those websites.

00:18:08.760 --> 00:18:11.560
And that is making it so that independent small time

00:18:11.960 --> 00:18:20.120
research, writing, whatever, is all effectively being defunded by these platforms that are using them in order to succeed.

00:18:20.600 --> 00:18:26.520
Right, and that is a crazy concept. Yeah, because, like, okay, we might win the AI race,

00:18:27.080 --> 00:18:30.040
But at what cost? The cost of, like, literally everything else.

00:18:31.320 --> 00:18:33.960
I was gonna say like okay. Yeah, definitely at the cost of like

00:18:34.520 --> 00:18:38.280
You know smaller creators smaller like copyright holders or whatever

00:18:38.840 --> 00:18:43.000
Uh, smaller companies that are trying to get up in the SEO rankings and they just can't.

00:18:43.560 --> 00:18:47.640
I think, but also at the cost of, like... I feel like, our minds. There have been some

00:18:48.440 --> 00:18:54.040
studies looking at people who use AI a lot versus people who don't use AI a lot, and it's like,

00:18:54.520 --> 00:18:59.480
Looking at their, like, problem-solving ability, their ability at, like, novel tasks. Yeah, they're worse at it.

00:18:59.880 --> 00:19:04.280
And I know that when I go to Google something, it gives me an AI overview.

00:19:04.520 --> 00:19:09.000
I know how to google stuff. My whole job is like researching stuff and like reporting on it

00:19:09.320 --> 00:19:13.320
So, like, I've learned how to be really good at, like, Google-fu. Yeah, but

00:19:14.280 --> 00:19:21.720
Most people are not, and they're going to be even worse now if they don't even have to click into, like, one site to see the answer,

00:19:22.120 --> 00:19:29.080
Or what the quote-unquote answer is, because it might be fake. So it's crazy that we all know

00:19:29.560 --> 00:19:38.760
That AI hallucinates and just makes stuff up all the time, but a huge portion of the internet, like, genuinely a lot of people, seem to be going down the route of

00:19:39.320 --> 00:19:46.680
Blindly verifying things with AI. Mm-hmm. So, like, you see this on the tweeters, which I'm still going to call it,

00:19:47.000 --> 00:19:52.840
Um, where someone will be like, "@grok, is this true?" Don't call it the tweeters. The tweeters.

00:19:53.400 --> 00:19:57.400
Just call it Twitter. Twitter. All right. Um,

00:19:57.400 --> 00:20:02.760
But they'll ask, like, Grok or Perplexity or whatever other ones will respond to tweets, if something is true,

00:20:03.000 --> 00:20:06.120
And then they will just take that fully at face value. I may

00:20:06.680 --> 00:20:10.600
Which is brutal. I made a joke about this in TechLinked the other day, because, yeah, I've seen this where

00:20:11.000 --> 00:20:15.480
Someone makes a claim and someone's like, I don't think that's true and they're like, well

00:20:16.200 --> 00:20:21.720
It is, because look at this website. And then they say, "hey Grok, is this true?" And it's like, just click on the website.

00:20:22.280 --> 00:20:26.920
Just Google it. Why are you asking an AI that may or may not lie to your face?

00:20:27.640 --> 00:20:32.840
Like, they've gotten a lot better, but they still hallucinate enough that it's like, why are you trusting it?

00:20:33.800 --> 00:20:36.840
Yeah, and then it was even worse as the people who are like

00:20:37.400 --> 00:20:43.160
"I collaborated with ChatGPT on this," and then they, like, post some giant freaking thing.

00:20:44.360 --> 00:20:47.960
What was your participation? I was wondering about whether this is the case

00:20:47.960 --> 00:20:52.680
So I asked ChatGPT and it told me that. So I guess ChatGPT likes this. And it's like, guys,

00:20:53.400 --> 00:21:00.760
stop. Yeah, the confirmation bias is crazy too, because a lot of the chat programs will try to detect what you're leaning towards and just affirm it.

00:21:01.080 --> 00:21:07.640
Yeah, because people like those responses. We've done a lot of bashing on AI so far. Yeah, but I...

00:21:08.040 --> 00:21:12.600
It deserves it. Well, it does. But I know that you and I...

00:21:13.480 --> 00:21:18.840
Well, I don't know but I have a suspicion that you and I share an appreciation for some of its upsides as well

00:21:19.000 --> 00:21:23.080
Yeah, and I want to talk about that After this segue to our sponsor

00:21:23.480 --> 00:21:28.840
Okay, now, AI... essentially the argument we're talking about is AI safety versus innovation speed, because it's like,

00:21:29.240 --> 00:21:36.120
Is it more worth going for the moral argument of, this sucks, we're stealing things from people, or is it more worth not losing to China?

00:21:36.280 --> 00:21:41.080
Right, and this is why I think the reason why there hasn't been precedent set is these are such different

00:21:41.800 --> 00:21:45.000
weights that it's hard to...

00:21:46.840 --> 00:21:50.520
And like the same person can want both sides of it

00:21:51.480 --> 00:21:55.960
Very deeply the same person can want to not steal

00:21:57.320 --> 00:22:07.800
All thought creation from humanity ever, and also not want to lose this war, because if you lose this, then this side didn't matter anyways. So it's like...

00:22:08.680 --> 00:22:11.640
It's especially hard because you can like

00:22:12.520 --> 00:22:18.440
As you were saying that my first kind of like gut reaction is like but what about principles, you know like there's kind of like

00:22:19.400 --> 00:22:22.680
You know the utilitarian thing of like this is what's necessary in order to win

00:22:22.760 --> 00:22:27.480
But I'm like, oh, at what cost? What if we lose our soul as a society?

00:22:27.960 --> 00:22:32.600
But at the same time, it's like, well, if China develops a

00:22:32.760 --> 00:22:36.360
Super intelligence that kind of hacks the entire world and makes everything

00:22:36.440 --> 00:22:39.480
You know whatever they want it to be then we won't have a society

00:22:39.960 --> 00:22:46.120
That's like very far on the end of the possibilities, but I mean there's also the the problem of

00:22:46.680 --> 00:22:53.960
How consumers act about things, which is, everyone will say online... not everyone, I'm exaggerating, a lot of people will say online

00:22:54.280 --> 00:22:58.680
that they want the, you know, don't-steal-everything-from-everyone route.

00:22:59.080 --> 00:23:03.320
But then if a company that does do that releases a model that's better

00:23:04.360 --> 00:23:10.440
They will use it. Well, and yeah, that... like, the precedent of that is, like,

00:23:11.240 --> 00:23:15.960
Carved into every stone on the planet. Who's "they" when you say they will use it? Users.

00:23:17.160 --> 00:23:24.360
If everyone knows that many, uh, things are made in sweatshops by slaves, and they keep buying them...

00:23:25.240 --> 00:23:30.920
People won't like gambling, but they'll watch Kick. People won't like a certain thing,

00:23:31.320 --> 00:23:34.920
Yeah, but then if the thing is better they'll buy it anyways

00:23:35.160 --> 00:23:39.080
And I guess this kind of comes into a part of the conversation that I wanted to

00:23:39.560 --> 00:23:45.160
address, which was the usefulness of AI, the pros of AI, because

00:23:45.640 --> 00:23:49.240
I remember I had friends that are not techie people

00:23:49.800 --> 00:24:00.040
um, one is a teacher, and in like 2023, like, shortly after ChatGPT's release, I go and talk to him and he's like, yeah,

00:24:00.920 --> 00:24:06.600
I use ChatGPT all the time. Like, first of all, I was like, you know what ChatGPT is? Because these are, like, not techie people.

00:24:06.760 --> 00:24:10.760
Yeah, and he's like, yeah, I use it to help me, like, make a lesson plan in history.

00:24:11.000 --> 00:24:16.920
Yeah, I have to hit a lot of people, right? Yeah, and he was using it to make lesson plans for his kids and, like, coming up with ideas for stuff.

00:24:17.000 --> 00:24:20.440
And I was like, wow, because that was at a point where even I wasn't really...

00:24:21.080 --> 00:24:25.240
I haven't found it to be super useful for my specific

00:24:27.480 --> 00:24:32.520
Work. Yeah, um, my dad's a plumbing instructor and they use it for, yeah, lesson plans.

00:24:33.400 --> 00:24:39.240
Yeah, and once these actual, like, multimodal updates get out... like, uh, Gemini Live is now

00:24:39.720 --> 00:24:43.560
Available with the live camera feed. Yeah, I think ChatGPT has...

00:24:44.040 --> 00:24:49.240
I know you can feed it images, and I think they have a camera mode, like, there's a live mode too. There's, um...

00:24:50.440 --> 00:24:53.720
I think a lot of people dramatically misuse AI

00:24:54.520 --> 00:24:59.080
And, like, for instance, I say that about my dad's lesson plans for plumbing, and people are like, oh,

00:24:59.400 --> 00:25:04.920
Hopefully it doesn't, like, teach people to go outside of code. And it's like, well, no, he's controlling those aspects.

00:25:05.160 --> 00:25:07.800
He's using it to build the structure

00:25:08.280 --> 00:25:13.160
Right to build the pacing to come up with the document that he can put the information on

00:25:13.400 --> 00:25:18.200
He's using it to help organizational skills those types of things the way that I'll often use it is to

00:25:18.680 --> 00:25:24.520
Review my output and give me feedback on my output. I don't use it for output

00:25:24.920 --> 00:25:31.560
But like what kind of output are you talking about my most common use case is I will write an email that is like

00:25:32.440 --> 00:25:39.640
I sure hope this goes well Uh, so but you know, maybe I have to be rather serious or whatever in the tone of the email

00:25:39.800 --> 00:25:43.640
But I want to make sure that I get my point across but not come across as aggressive or whatever else

00:25:43.800 --> 00:25:48.840
So I'll ask it for a sentiment analysis of the email. Oh, and then I might ask it like if I'm like, okay

00:25:48.920 --> 00:25:54.040
That's not what I wanted someone to get from this Yeah, and it got that from this then I'll be like, okay

00:25:54.280 --> 00:25:58.600
How would you change it to get it there? And then I won't use its output, but I'll use it to help

00:25:59.160 --> 00:26:04.760
Guide me in that direction. Did I mention lizards too many times? Well, like, sometimes the email is pretty good

00:26:04.760 --> 00:26:06.840
But you seem weirdly focused on lizards

00:26:08.920 --> 00:26:12.600
Maybe you should draw back on the lizards. I think they're not like a reptile person

00:26:13.640 --> 00:26:16.760
Maybe cats. We've had, um... I think it's also a good

00:26:17.480 --> 00:26:23.320
um, kickstarter of sorts, like, for writer's block, for instance. If you have to write, um

00:26:24.360 --> 00:26:29.960
A video on something, you could use it to give you your skeleton. I've found... I've tried to get it to write YouTube videos

00:26:30.200 --> 00:26:34.680
Man, it sucks. Yeah, I don't know if you've tried. I've used it a couple times when I like

00:26:35.320 --> 00:26:37.640
Can't think of a word for something or like I'm like

00:26:38.520 --> 00:26:44.120
Writing a joke and I'm like, oh man This is like this and I'm just like drawing a blank and I'm like, what's uh

00:26:44.440 --> 00:26:49.000
Was this thing something like when this guy says this and then another guy comes at it like that and they're like

00:26:49.400 --> 00:26:54.680
Maybe this and I'm like, okay. Thank you. Yeah, that makes sense. But but writing whole scripts. No, no good

00:26:55.080 --> 00:26:58.680
But you can use it as, like, a research assistant

00:26:59.480 --> 00:27:04.440
Um, again, there's huge pitfalls there because you shouldn't just believe what it outputs

00:27:05.000 --> 00:27:09.480
But you can use it as like a kicking off point So like something that I've done is like, okay

00:27:09.640 --> 00:27:16.920
Here's a topic. What are, like, the key fields of interest in this topic? Right, and it'll give me that list, and then I'll go off and

00:27:17.000 --> 00:27:21.800
Yeah, learn things about those. All of the AI chatbots now have, like, deep research modes

00:27:22.280 --> 00:27:25.960
It does a good job of like giving you kind of a beginner's summary of stuff

00:27:26.360 --> 00:27:33.720
But if you know a lot about a topic and you're like, what was the finding of this research, like, in the 19-whatever, and it's like, they found this...

00:27:33.880 --> 00:27:38.600
They found this stuff that I just made up because I'm an AI, and you're like, well, I know that's not true

00:27:39.240 --> 00:27:44.920
So it's like, but that's why like you mentioned earlier in the show that it's getting better with like hallucinations and stuff

00:27:45.480 --> 00:27:53.720
A little bit. It's only a little bit. I think it's getting better at convincing people that it's not doing it. Yeah. Um, I don't know

00:27:55.320 --> 00:27:57.320
I don't personally think it's getting

00:27:58.360 --> 00:28:01.640
Dramatically better overall Um

00:28:01.640 --> 00:28:08.040
There are measures that it's getting a bit better. But I really think it's just getting better at convincing people, and it's getting better at using

00:28:08.520 --> 00:28:11.720
slightly more vague language and stuff like that to get away from

00:28:12.360 --> 00:28:18.440
um, actually being pointed out as doing things wrong, because I catch it hallucinating stuff. Oh, yeah

00:28:19.240 --> 00:28:21.240
All the time I will say that I've

00:28:22.280 --> 00:28:27.480
I've, I've just started trying to use it a little bit to help me research TechLinked stories

00:28:28.040 --> 00:28:31.320
Um, it's kind of like one of the steps in the process

00:28:31.960 --> 00:28:35.000
And it has not been it has not been very uh useful

00:28:35.000 --> 00:28:40.600
But most of the time it's not hallucinating fake stuff as much as it's like I'm asking for

00:28:41.400 --> 00:28:47.720
Uh stories that are recent and it like brings me something from a month ago. Here's something from last quarter. Yeah. Yeah

00:28:48.280 --> 00:28:50.680
On balance if you look at the whole field

00:28:52.040 --> 00:28:56.200
Do you feel like the majority of AI tools that are available

00:28:57.080 --> 00:29:01.640
Are like helpful or just kind of taking up space and taking up resources. Whoa, okay

00:29:03.240 --> 00:29:06.840
Because there's a majority of AI tools... so if you include every AI tool in that pool

00:29:07.400 --> 00:29:13.960
Then honestly, no, there's a lot of compute a lot of resources being taken up. Yeah, and

00:29:14.920 --> 00:29:23.640
they're able to do that because these companies vacuumed up IP, vacuumed up copyright, and massive investment, because that's the hotness right now

00:29:23.800 --> 00:29:28.920
I think most of the tools are garbage and will lose and will die, and that is how these, like

00:29:29.400 --> 00:29:35.720
Very startup-focused fields work. So that's pretty natural. I don't actually think that's weird. I don't think that's AI-specific

00:29:36.040 --> 00:29:41.800
I think that's just, that's true of any big new fancy field with lots of investment dollars and everyone trying to dive in all at the same time

00:29:42.040 --> 00:29:47.880
So a lot of those people are gonna make garbage. For the handful of, like, crypto projects that were actually kind of cool...

00:29:48.120 --> 00:29:52.760
Yeah, maybe. Like, they did exist. They did. Like, the idea of having some, like

00:29:53.240 --> 00:29:58.680
Records and stuff on a blockchain system. That was cool. Yep, but for every one of those there were

00:30:00.440 --> 00:30:03.880
250 thousand

00:30:03.880 --> 00:30:07.160
Yeah, garbage things. So it's the same for AI, but

00:30:07.960 --> 00:30:11.160
Maybe that all won't matter if

00:30:11.160 --> 00:30:14.360
You know, we only need one. We only need one ASI

00:30:15.000 --> 00:30:19.640
Artificial superintelligence. And then it'll all be worth it

00:30:20.280 --> 00:30:26.360
ASGI, you can do it. Yeah, what? Yeah, ASGI, artificial super general intelligence

00:30:26.920 --> 00:30:29.480
I've never heard that together. I've heard AGI and ASI

00:30:29.960 --> 00:30:34.760
You can't just put them together like that. We were talking earlier about innovators. That's true. I'm an inventor. Wow

00:30:35.320 --> 00:30:41.960
Here's a patent, sir. That was... you just witnessed innovation. Jack Dorsey, this is what you're trying to kill. Yeah

00:30:42.600 --> 00:30:50.680
Stop it. Think of the children. So maybe Jack Dorsey's right and we'll have a utopia and none of us will own any IP and everything will be fine

00:30:51.240 --> 00:30:55.240
And maybe Jack Dorsey and Elon Musk and Sam Altman

00:30:56.200 --> 00:30:58.200
Especially Sam Altman

00:30:59.320 --> 00:31:04.840
Are gonna enslave all of us. Yeah, definitely, especially Sam Altman. And those are the two options

00:31:05.080 --> 00:31:09.080
Elon just wants to impregnate all of us. He can be persuasive

00:31:10.200 --> 00:31:13.480
I think this has been a huge wake-up call for people about, uh, China's

00:31:14.200 --> 00:31:19.560
position on the value chain. America has traditionally been positioned very high on the value chain, and then

00:31:20.360 --> 00:31:22.760
I think in a lot of north american views

00:31:23.640 --> 00:31:29.720
You go down the value chain from America and end up in China and a few other countries at the bottom of the value chain

00:31:30.040 --> 00:31:34.520
But I think this and the tariff war going on and a few other things have revealed

00:31:34.760 --> 00:31:39.720
That China is much higher on the value chain than we previously thought and has the active ability now

00:31:40.120 --> 00:31:44.120
To take rungs on the value chain. Yes, and I think one of those

00:31:45.160 --> 00:31:51.400
Yes... no, I was just saying that because I was thinking about a recent story that just came up with their...

00:31:51.960 --> 00:31:59.640
They have chips that they say are more performant than NVIDIA's equivalent. But anyways, go ahead. Yeah, I'm like, I don't know about all that being true

00:31:59.800 --> 00:32:04.600
But yeah, who knows. They're progressing very quickly in a lot of these fields: CPUs, GPUs

00:32:05.000 --> 00:32:07.400
AI technology, movies, random one

00:32:08.200 --> 00:32:11.480
Lots and lots lots of fields where they're they're really taking major steps forward

00:32:11.800 --> 00:32:16.680
One of those places insane story that I I don't understand how it doesn't get talked about more

00:32:16.760 --> 00:32:21.400
But, uh, the quote that I have here is: Ford CEO Jim Farley admitted

00:32:21.400 --> 00:32:26.280
He's been driving a Xiaomi SU7 for six months and doesn't want to give it up

00:32:27.080 --> 00:32:32.920
The Ford CEO drives a $30,000 Chinese electric made vehicle... sorry, Chinese-made electric vehicle

00:32:33.640 --> 00:32:39.880
Unavailable in the US due to tariffs and safety regulation issues. Is this something that he revealed, uh, on a podcast?

00:32:40.040 --> 00:32:43.800
Like, like, of his own volition? We are now... we're basically the Ford CEO. Yes, he did

00:32:44.520 --> 00:32:48.760
It was of his own volition. Why would he do that? I feel like that's, like, so dumb. The Ford Motor Company CEO

00:32:49.080 --> 00:32:54.840
mentioned that he didn't want to give up a $30,000 vehicle from a competing company because it was so good

00:32:55.320 --> 00:32:59.720
That should be... is he about to cash out or something? Like, what a damning wake-up call to the people

00:32:59.960 --> 00:33:08.520
Yeah, about the automotive industry outside of America. Yeah, I think in EVs in particular. Oh, yeah, China is like leapfrogging the automotive industry

00:33:08.520 --> 00:33:13.400
That start falling apart and have panel gaps that I can shove my fists through, um, and are like

00:33:14.120 --> 00:33:16.760
You know, they're just partying on Slack, sharing people's

00:33:17.320 --> 00:33:21.800
Videos and stuff like that. So like if you're and I think that's a big part of the problem too

00:33:21.880 --> 00:33:26.920
Where, like, when you think about Chinese products, people traditionally go like, oh, but, like, they're gonna steal your information

00:33:27.640 --> 00:33:30.360
Right. It's like yeah, but we we know

00:33:31.320 --> 00:33:38.520
That our companies are... yeah, for sure. Well, yeah, that's documented evidence. That's a whole other... that's a whole other conversation

00:33:38.600 --> 00:33:41.640
I feel like which I'm still I don't know what to think about

00:33:42.440 --> 00:33:48.360
Because that's been the main reason why I'm like, uh, I don't want to use that, like, Chinese phone or whatever

00:33:48.520 --> 00:33:52.840
Who knows what data they're sending but

00:33:52.840 --> 00:33:57.160
I know my data is going out to all these other companies. So, like, at a certain point it's like

00:33:57.160 --> 00:34:04.120
This is the future. This is what they're saying. They're like, IP doesn't matter, your personal information doesn't matter

00:34:04.680 --> 00:34:07.880
You'll own nothing including your own personal information

00:34:08.520 --> 00:34:15.320
And your own thoughts, if you can't sell them anymore. Neuralink. And you'll be happy. Yeah, but we'll have to see

00:34:15.480 --> 00:34:21.800
I don't know if this tweet that Jack Dorsey reposted is gonna come to pass, in which... so he tweeted...

00:34:22.200 --> 00:34:25.160
He retweeted an interview with Microsoft AI CEO

00:34:25.720 --> 00:34:31.240
Mustafa Suleyman, who says the future isn't UBI, universal basic income. It's UBP

00:34:31.960 --> 00:34:39.720
Universal basic provision. Abundant intelligence as the new currency. You won't need more money, because knowledge won't be something you buy

00:34:40.600 --> 00:34:46.440
Not cash, capability. We won't need cash because we'll have AI assistants who can do everything for us

00:34:47.480 --> 00:34:50.120
You don't need to go to the store and buy a pop

00:34:50.680 --> 00:34:54.760
A soda. Your AI will just synthesize it for you. Okay

00:34:55.720 --> 00:35:00.680
This feels like that South Park episode where Randy can't fix his oven because he can't fix the oven door

00:35:01.320 --> 00:35:07.800
and then the, like, handymen people end up being the wealthiest people in society

00:35:08.120 --> 00:35:10.840
Because no one can fix anything because no one knows how to do anything

00:35:11.960 --> 00:35:15.880
And they they I don't remember if this happens in the episode, but I'm gonna make it up. I guess

00:35:16.680 --> 00:35:21.800
I'm hallucinating. But they, like, ask an AI to fix the door, and it just tries to teach him how to do it

00:35:21.800 --> 00:35:28.920
But he's like, no, I want you to fix the door. It keeps giving him steps, but he's like, no, I want you to fix the door. Like, we don't have things that can do our laundry right now

00:35:28.920 --> 00:35:32.600
We don't have things that can build houses. We don't have things that can fix plumbing

00:35:32.600 --> 00:35:39.400
We don't have things that can do any of that kind of stuff. It's like he's talking about provisions of, like, this thing that can draw Studio Ghibli for me

00:35:40.360 --> 00:35:45.560
Or, like, write a terrible email, or a really, really, really bad YouTube video that would get one view

00:35:45.800 --> 00:35:50.840
Like is that really I'm really excited about that being a universal provision. Thanks bro

00:35:51.080 --> 00:35:56.440
I think that's what makes the whole conversation just so so hard to come down on one side of because it's so uncertain

00:35:56.760 --> 00:36:02.120
You need your apartment to not leak. Is ChatGPT gonna help you with that? Yeah

00:36:02.840 --> 00:36:08.200
Like, it's hot and there's droughts, and, uh, like, there's peaks and valleys too

00:36:08.520 --> 00:36:14.760
Like, sometimes it seems like the whole AI thing is just hype, and sometimes it's like, oh, this is... okay, it's happening. Yeah

00:36:15.480 --> 00:36:22.200
We're not gonna solve it in this podcast. But maybe in the next one, so subscribe, possibly

00:36:22.840 --> 00:36:26.600
I wouldn't want to miss it. It might happen. Maybe we'll invent AGI

00:36:27.480 --> 00:36:30.920
Right here. ASGI. Watch out, OpenAI

00:36:33.560 --> 00:36:39.800
We're gonna delete your IP. Hey, thanks for watching. Uh, that was a TalkLinked. Uh, subscribe to TechLinked

00:36:42.440 --> 00:36:46.760
See you later. Maybe I'll do another one of these in half a year. Nope, never happening, never again. Bye
