WEBVTT

00:00:00.000 --> 00:00:03.240
We're rushing this one because it's hot as heck in here.

00:00:03.240 --> 00:00:07.040
I'm here with Ploof. And what do you do for the company?

00:00:07.040 --> 00:00:10.360
Right. Good. All right. AI, good, bad?

00:00:10.360 --> 00:00:13.560
Thoughts? Man, it's such a mixed bag.

00:00:13.560 --> 00:00:19.360
I think overall it's really cool, and it's a great idea. We could speed up things like machine learning.

00:00:19.360 --> 00:00:24.080
We can use it to diagnose people who have weird symptoms and stuff, because let's face it,

00:00:24.080 --> 00:00:31.640
people are limited at some point. But it becomes bad when we basically use prior work

00:00:31.640 --> 00:00:35.320
to then create new stuff.

00:00:35.320 --> 00:00:38.320
And the people that made that work previously don't get any credit for it.

00:00:38.320 --> 00:00:41.800
So I'm obviously talking about Midjourney and the AI art

00:00:41.800 --> 00:00:46.320
stuff, like that's garbage. The whole idea of it's really cool,

00:00:46.320 --> 00:00:49.520
but it's not fair to anyone else whose work is being trained

00:00:49.520 --> 00:00:55.000
on. Same thing goes for writing. Look at articles. You can tell when it's an AI-written article.

00:00:55.040 --> 00:01:00.200
Like it's just kind of bad. And that's being trained on everyone who's written stuff.

00:01:00.200 --> 00:01:03.320
So I think it's cool, but I think ultimately it's probably

00:01:03.320 --> 00:01:09.120
going to lead to a major issue. There's going to be a lot of jobs that just go away.

00:01:09.120 --> 00:01:14.360
I've already seen posts, people that are like, yeah, I used to make a small living just writing time stamps

00:01:14.360 --> 00:01:18.000
for companies and stuff. All of a sudden, AI is doing that.

00:01:18.000 --> 00:01:21.040
It's doing a worse job than him, is what he wrote.

00:01:21.040 --> 00:01:26.240
But it's not doing a bad enough job that they're willing to take him over the AI.

00:01:26.240 --> 00:01:30.520
So it's a good and a bad thing, I don't know. Well, I didn't see any article like that.

00:01:30.520 --> 00:01:35.720
I know that there's people getting fired, like, there was some thing we talked about.

00:01:35.720 --> 00:01:39.720
It was like, it was like some mental health place where like they fired a bunch of people

00:01:39.720 --> 00:01:44.000
and then replaced it with AI. And that's the problem is, you know,

00:01:44.000 --> 00:01:47.760
it's kind of like when industrialization happened, all of a sudden a lot of people lost their jobs

00:01:47.760 --> 00:01:51.600
and they were forced to move to cities because manufacturing became a thing.

00:01:51.600 --> 00:01:56.720
I can see AI kind of being like that. Some stuff, no, it's really hard to replicate,

00:01:56.720 --> 00:01:59.760
but honestly give it, you know, five more years,

00:01:59.760 --> 00:02:03.880
three more years, 10 more years. Like the more this stuff goes on and gets trained,

00:02:03.880 --> 00:02:07.480
the better it's gonna be. And then all of a sudden, it's not garbage anymore.

00:02:07.480 --> 00:02:11.320
All of a sudden it's not making mistakes anymore. Now that being said, it's definitely not perfect.

00:02:11.320 --> 00:02:15.320
Just about any time we try to make like some kind of AI chat bot, you give it a week or a month

00:02:15.320 --> 00:02:19.320
and it turns racist and sexist. You know, as soon as it gets access to 4chan,

00:02:19.360 --> 00:02:21.440
it just goes straight downhill.

00:02:23.120 --> 00:02:26.520
But I don't know. So I love it, but at the same time, I hate it.

00:02:26.520 --> 00:02:31.080
And I am definitely terrified of some kind of a Terminator-ish future where AI says,

00:02:31.080 --> 00:02:34.760
ah, you know, humans kind of suck and they're basically just a parasite on the earth. So we should kill them.

00:02:34.760 --> 00:02:40.320
And then it just nukes everyone. Yeah, I think that's like the general fear people have about just, I guess, AI in general.

00:02:40.320 --> 00:02:44.520
But so then it sounds like you're kind of more against it than for it, or?

00:02:45.560 --> 00:02:48.800
I'd say yes in certain settings.

00:02:48.800 --> 00:02:52.360
I'm definitely for it if it can help like diagnose people

00:02:52.360 --> 00:02:57.520
or figure out new cures for diseases and stuff like that. Like machine learning is great.

00:02:57.520 --> 00:03:00.520
And I think it can be used in a lot of really useful ways.

00:03:00.520 --> 00:03:04.800
But when it comes to like artistic side of things,

00:03:04.800 --> 00:03:09.640
yeah, maybe it's cool to get a concept, but then you should have to create that yourself.

00:03:09.640 --> 00:03:12.720
I don't know. It's such a weird space.

00:03:12.720 --> 00:03:17.040
Yeah, whatever I said, but it's like, it should be like a starting point, not the end product.

00:03:17.040 --> 00:03:20.640
That's really a good way to put it. I'm totally down with taking like Midjourney

00:03:20.640 --> 00:03:24.960
or any other company like that and basically using it to generate ideas.

00:03:24.960 --> 00:03:29.040
It can be a brainstorm session, that is fine. But then you should take what it's brainstormed

00:03:29.040 --> 00:03:34.520
and like make it your own thing, right? Cause all it is at that time is some amalgamation of ideas

00:03:34.520 --> 00:03:40.000
from other people who haven't gotten any money or credit for what it's making.

00:03:40.000 --> 00:03:45.000
All right, so then as a tech company that we are,

00:03:45.280 --> 00:03:49.320
and a creative company, do you think that we should be using AI at any level?

00:03:52.800 --> 00:03:55.660
Not really. Maybe machine learning for the lab,

00:03:58.000 --> 00:04:02.480
but like real AI and using it to make,

00:04:02.480 --> 00:04:06.360
I don't know, I don't think so. I just, I hate the idea of eliminating jobs,

00:04:06.360 --> 00:04:10.160
especially when it's not ready. I think in a couple of years that story might change

00:04:10.160 --> 00:04:14.360
and I might be like, yeah, sure, I don't want to make the timestamps for every video.

00:04:14.360 --> 00:04:17.640
So run it through the AI generator to generate those timestamps for us, like fine.

00:04:17.640 --> 00:04:21.200
But right now, nah. So I get the question here,

00:04:21.200 --> 00:04:25.360
but so then it sounds like you aren't afraid that it's gonna take over your job in the future

00:04:25.360 --> 00:04:29.200
or anything like that. I'm not personally right now

00:04:29.200 --> 00:04:33.720
because the line of work we're in, like yeah, there's deep fakes and yeah,

00:04:33.720 --> 00:04:37.340
you can like make fake Linus Tech Tips episodes and stuff like that.

00:04:37.340 --> 00:04:41.080
But it's several years out at least to the point

00:04:41.080 --> 00:04:45.240
where I might be, I might worry about like retiring from something like this in 20 years

00:04:45.280 --> 00:04:49.160
because it's probably gone by then, but I think in 20 years we've got a lot of other problems.

00:04:49.160 --> 00:04:52.320
Yeah, like this warming that we're in.

00:04:52.320 --> 00:04:55.720
I'm like, I'm more excited, I feel it. All right, I'm gonna skip one question

00:04:55.720 --> 00:05:00.280
and just go jump to the right thing because I'm dying. I want to know the question. All right, all right.

00:05:00.280 --> 00:05:06.320
Have you used AI before? And if so, what are your thoughts on it? The only thing I've really done with AI was when Midjourney

00:05:06.320 --> 00:05:09.600
and those other guys were coming out, I was messing around on it.

00:05:09.600 --> 00:05:13.280
I was like, yeah, this is cool. Ha ha, like Trump eating a lemon or whatever.

00:05:13.280 --> 00:05:16.400
I don't know. That's like the, oh my God. Why is that my example?

00:05:17.840 --> 00:05:21.320
But like, yeah, it was a lot of fun. And then it took a hot minute, but then I'm like,

00:05:21.320 --> 00:05:25.240
oh, hold on a second, where did they get all this data from?

00:05:25.240 --> 00:05:28.960
That's not good. Okay, and then I just stopped using it instantly.

00:05:28.960 --> 00:05:32.960
I left the Discord group and I've been very against it

00:05:32.960 --> 00:05:36.880
ever since. ChatGPT, kind of the same thing. It sucks because my cousin loves it

00:05:36.880 --> 00:05:40.360
because he's not the best writer. So he's like, yeah, I had to write this thing and I didn't want to write it.

00:05:40.360 --> 00:05:45.520
So I just got ChatGPT to do it and I was like, I get it. You think it's cool, but it's, I don't know.

00:05:45.520 --> 00:05:50.160
Yeah. No, personally, I use ChatGPT for like, if I wanted to know like, oh, what's a good workout

00:05:50.160 --> 00:05:54.080
for this like, set of muscles or like a good like,

00:05:54.080 --> 00:05:58.000
like diet for my age and height and like weight and stuff.

00:05:58.000 --> 00:06:01.440
You know, like, I think that's useful. But like in terms of like writing stuff,

00:06:01.440 --> 00:06:04.840
like I'm like, we should learn how to write first

00:06:04.840 --> 00:06:08.640
before you start using technology to replace that. Yeah.

00:06:08.640 --> 00:06:12.680
And on top of that, hasn't there been, I mean, this is maybe just, you know, garbage,

00:06:12.680 --> 00:06:16.880
but haven't there been court cases and stuff where like a lawyer was caught using ChatGPT

00:06:16.880 --> 00:06:18.860
and then it fabricated a case?

00:06:20.120 --> 00:06:23.480
I'm pretty sure. I didn't hear that. I'm pretty sure that was the thing. Maybe I'm wrong.

00:06:23.480 --> 00:06:27.600
But if stuff like that is happening and AI at this level

00:06:27.600 --> 00:06:30.760
is at the point where it's like creating a fake narrative,

00:06:30.760 --> 00:06:33.840
can you trust it for anything? Yeah, I know that there was a court case

00:06:33.840 --> 00:06:38.320
where it's like a monkey took a selfie with a photographer's camera.

00:06:38.320 --> 00:06:42.080
And then I think PETA sued the photographer saying that like this monkey,

00:06:42.080 --> 00:06:45.760
the monkey is the owner of it rather than the photographer because it was his camera.

00:06:45.760 --> 00:06:49.360
So it's like, it shows that it doesn't matter whose camera it is,

00:06:49.360 --> 00:06:52.840
it's the person of origin, the creator. So it's like, you don't know who the creator is

00:06:52.840 --> 00:06:57.760
when it's like AI generated. Yeah, absolutely. You don't know, it's just an amalgamation of people.

00:06:57.760 --> 00:07:02.120
All right, I'm sweating like crazy. So we're gonna wrap it up here. Do you want to say anything to the Floatplaners?

00:07:02.120 --> 00:07:06.640
I'm trying to coin this phrase now for people who've subscribed to us on Floatplane.com.

00:07:06.640 --> 00:07:09.880
It's that they're Floatplaners. So any message for them or like something

00:07:09.880 --> 00:07:14.920
that we didn't talk about that you wanted to expand on? Tell Sammy you don't like being called Floatplaners.

00:07:17.120 --> 00:07:20.640
Then, okay, what should we call them then? What should we call them then? I don't know.

00:07:20.640 --> 00:07:25.760
I don't know. Subscribers, subscribers. Subscribers, no, but that's like YouTube subscribers, you know.

00:07:25.760 --> 00:07:29.680
I think Floatplaners is a good one. Sure. Unless I'm getting in trouble for that,

00:07:29.680 --> 00:07:33.200
then I didn't coin that phrase. It was never me, you never heard it here first.

00:07:33.200 --> 00:07:36.600
Yeah, but if people like it, then it's mine.

00:07:36.600 --> 00:07:41.280
You heard it here first. You heard it first. I'm Sammy, this is my sweat.

00:07:41.280 --> 00:07:42.480
Goodbye, Floatplaners.
