1
00:00:00,000 --> 00:00:03,240
We're rushing this one because it's hot as heck in here.

2
00:00:03,240 --> 00:00:07,040
I'm here with Ploof. And what do you do for the company?

3
00:00:07,040 --> 00:00:10,360
Right. Good. All right. AI, good, bad?

4
00:00:10,360 --> 00:00:13,560
Thoughts? Man, it's such a mixed bag.

5
00:00:13,560 --> 00:00:19,360
I think overall it's really cool, and it's a great idea. We could speed up things like machine learning.

6
00:00:19,360 --> 00:00:24,080
We can use it to diagnose people who have weird symptoms and stuff, because let's face it,

7
00:00:24,080 --> 00:00:31,640
people are limited at some point. But it becomes bad when we basically use prior work

8
00:00:31,640 --> 00:00:35,320
to then create new stuff.

9
00:00:35,320 --> 00:00:38,320
And the people that made that work previously don't get any credit for it.

10
00:00:38,320 --> 00:00:41,800
So I'm obviously talking about Midjourney and the AI art

11
00:00:41,800 --> 00:00:46,320
stuff, like that's garbage. The whole idea of it's really cool,

12
00:00:46,320 --> 00:00:49,520
but it's not fair to anyone else whose work is being trained

13
00:00:49,520 --> 00:00:55,000
on. Same thing goes for writing. Look at articles. You can tell when it's an AI-written article.

14
00:00:55,040 --> 00:01:00,200
Like it's just kind of bad. And that's being trained on everyone who's written stuff.

15
00:01:00,200 --> 00:01:03,320
So I think it's cool, but I think ultimately it's probably

16
00:01:03,320 --> 00:01:09,120
going to lead to a major issue. There's going to be a lot of jobs that just go away.

17
00:01:09,120 --> 00:01:14,360
I've already seen posts, people that are like, yeah, I used to make a small living just writing timestamps

18
00:01:14,360 --> 00:01:18,000
for companies and stuff. All of a sudden, AI is doing that.

19
00:01:18,000 --> 00:01:21,040
It's doing a worse job than him, is what he wrote.

20
00:01:21,040 --> 00:01:26,240
But it's not doing a bad enough job that they're willing to take him over the AI.

21
00:01:26,240 --> 00:01:30,520
So it's a good and a bad thing, I don't know. Well, I didn't see any article like that.

22
00:01:30,520 --> 00:01:35,720
I know that there's, like, people getting fired. There was, like, some thing we talked about.

23
00:01:35,720 --> 00:01:39,720
It was like, it was like some mental health place where like they fired a bunch of people

24
00:01:39,720 --> 00:01:44,000
and then replaced it with AI. And that's the problem is, you know,

25
00:01:44,000 --> 00:01:47,760
it's kind of like when industrialization happened, all of a sudden a lot of people lost their jobs

26
00:01:47,760 --> 00:01:51,600
and they were forced to move to cities because manufacturing became a thing.

27
00:01:51,600 --> 00:01:56,720
I can see AI kind of being like that. Some stuff, now, is really hard to replicate,

28
00:01:56,720 --> 00:01:59,760
but honestly give it, you know, five more years,

29
00:01:59,760 --> 00:02:03,880
three more years, 10 more years. Like the more this stuff goes on and gets trained,

30
00:02:03,880 --> 00:02:07,480
the better it's gonna be. And then all of a sudden, it's not garbage anymore.

31
00:02:07,480 --> 00:02:11,320
All of a sudden it's not making mistakes anymore. Now that being said, it's definitely not perfect.

32
00:02:11,320 --> 00:02:15,320
Just about any time we try to make like some kind of AI chat bot, you give it a week or a month

33
00:02:15,320 --> 00:02:19,320
and it turns racist and sexist. You know, as soon as it gets access to 4chan,

34
00:02:19,360 --> 00:02:21,440
it just goes straight downhill.

35
00:02:23,120 --> 00:02:26,520
But I don't know. So I love it, but at the same time, I hate it.

36
00:02:26,520 --> 00:02:31,080
And I am definitely terrified of some kind of a terminator-ish future where AI says,

37
00:02:31,080 --> 00:02:34,760
ah, you know, humans kind of suck and they're basically just a parasite on the earth. So we should kill them.

38
00:02:34,760 --> 00:02:40,320
And then it just nukes everyone. Yeah, I think that's like the general fear people have about just, I guess, AI in general.

39
00:02:40,320 --> 00:02:44,520
But so then it sounds like you're kind of more against it than for it, or?

40
00:02:45,560 --> 00:02:48,800
I'd say yes in certain settings.

41
00:02:48,800 --> 00:02:52,360
I'm definitely for it if it can help like diagnose people

42
00:02:52,360 --> 00:02:57,520
or figure out new cures for diseases and stuff like that. Like machine learning is great.

43
00:02:57,520 --> 00:03:00,520
And I think it can be used in a lot of really useful ways.

44
00:03:00,520 --> 00:03:04,800
But when it comes to like artistic side of things,

45
00:03:04,800 --> 00:03:09,640
yeah, maybe it's cool to get a concept, but then you should have to create that yourself.

46
00:03:09,640 --> 00:03:12,720
I don't know. It's such a weird space.

47
00:03:12,720 --> 00:03:17,040
Yeah, whatever I said, but it's like, it should be like a starting point, not the end product.

48
00:03:17,040 --> 00:03:20,640
That's a really good way to put it. I'm totally down with taking, like, Midjourney

49
00:03:20,640 --> 00:03:24,960
or any other company like that and basically using it to generate ideas.

50
00:03:24,960 --> 00:03:29,040
It can be a brainstorm session, that is fine. But then you should take what it's brainstormed

51
00:03:29,040 --> 00:03:34,520
and like make it your own thing, right? Cause all it is at that time is some amalgamation of ideas

52
00:03:34,520 --> 00:03:40,000
from other people who haven't gotten any money or credit for what it's making.

53
00:03:40,000 --> 00:03:45,000
All right, so then as a tech company that we are,

54
00:03:45,280 --> 00:03:49,320
and a creative company, do you think that we should be using AI at any level?

55
00:03:52,800 --> 00:03:55,660
Not really. Maybe machine learning for the lab,

56
00:03:58,000 --> 00:04:02,480
but like real AI and using it to make,

57
00:04:02,480 --> 00:04:06,360
I don't know, I don't think so. I just, I hate the idea of eliminating jobs,

58
00:04:06,360 --> 00:04:10,160
especially when it's not ready. I think in a couple of years that story might change

59
00:04:10,160 --> 00:04:14,360
and I might be like, yeah, sure, I don't want to make the timestamps for every video.

60
00:04:14,360 --> 00:04:17,640
So run it through the AI generator to generate those timestamps for us, like fine.

61
00:04:17,640 --> 00:04:21,200
But right now, nah. So I get the question here,

62
00:04:21,200 --> 00:04:25,360
but so then it sounds like you aren't afraid that it's gonna take over your job in the future

63
00:04:25,360 --> 00:04:29,200
or anything like that. I'm not personally right now

64
00:04:29,200 --> 00:04:33,720
because the line of work we're in, like yeah, there's deep fakes and yeah,

65
00:04:33,720 --> 00:04:37,340
you can, like, make fake Linus Tech Tips episodes and stuff like that.

66
00:04:37,340 --> 00:04:41,080
But it's several years out at least to the point

67
00:04:41,080 --> 00:04:45,240
where I might be, I might worry about like retiring from something like this in 20 years

68
00:04:45,280 --> 00:04:49,160
because it's probably gone by then, but I think 20 years we've got a lot of other problems.

69
00:04:49,160 --> 00:04:52,320
Yeah, like this global warming that we're in.

70
00:04:52,320 --> 00:04:55,720
I'm like, I'm more excited, I feel it. All right, I'm gonna skip one question

71
00:04:55,720 --> 00:05:00,280
and just go jump to the right thing because I'm dying. I want to know the question. All right, all right.

72
00:05:00,280 --> 00:05:06,320
Have you used AI before? And if so, what are your thoughts on it? The only thing I've really done with AI was when Midjourney

73
00:05:06,320 --> 00:05:09,600
and those other guys were coming out, I was messing around on it.

74
00:05:09,600 --> 00:05:13,280
I was like, yeah, this is cool. Ha ha, like Trump eating a lemon or whatever.

75
00:05:13,280 --> 00:05:16,400
I don't know. That's like the, oh my God. Why is that my example?

76
00:05:17,840 --> 00:05:21,320
But like, yeah, it was a lot of fun. And then it took a hot minute, but then I'm like,

77
00:05:21,320 --> 00:05:25,240
oh, hold on a second, where did they get all this data from?

78
00:05:25,240 --> 00:05:28,960
That's not good. Okay, and then I just stopped using it instantly.

79
00:05:28,960 --> 00:05:32,960
I left the Discord group and I've been very against it

80
00:05:32,960 --> 00:05:36,880
ever since. ChatGPT, kind of the same thing. It sucks because my cousin loves it

81
00:05:36,880 --> 00:05:40,360
because he's not the best writer. So he's like, yeah, I had to write this thing and I didn't want to write it.

82
00:05:40,360 --> 00:05:45,520
So I just got ChatGPT to do it, and I was like, I get it. You think it's cool, but it's, I don't know.

83
00:05:45,520 --> 00:05:50,160
Yeah. No, personally, I use ChatGPT for like, if I wanted to know like, oh, what's a good workout

84
00:05:50,160 --> 00:05:54,080
for this like, set of muscles or like a good like,

85
00:05:54,080 --> 00:05:58,000
like diet for my age and height and like weight and stuff.

86
00:05:58,000 --> 00:06:01,440
You know, like, I think that's useful. But like in terms of like writing stuff,

87
00:06:01,440 --> 00:06:04,840
like I'm like, we should learn how to write first

88
00:06:04,840 --> 00:06:08,640
before you start using technology to replace that. Yeah.

89
00:06:08,640 --> 00:06:12,680
And on top of that, hasn't there been, I mean, this is maybe just, you know, garbage,

90
00:06:12,680 --> 00:06:16,880
but haven't there been court cases and stuff where, like, a lawyer was caught using ChatGPT

91
00:06:16,880 --> 00:06:18,860
and then it fabricated a case?

92
00:06:20,120 --> 00:06:23,480
I'm pretty sure. I didn't hear that. I'm pretty sure that was the thing. Maybe I'm wrong.

93
00:06:23,480 --> 00:06:27,600
But if stuff like that is happening and AI at this level

94
00:06:27,600 --> 00:06:30,760
is at the point where it's like creating a fake narrative,

95
00:06:30,760 --> 00:06:33,840
can you trust it for anything? Yeah, I know that there was a court case

96
00:06:33,840 --> 00:06:38,320
where it's like a monkey took a selfie from another photographer's camera.

97
00:06:38,320 --> 00:06:42,080
And then I think PETA sued the photographer, saying that, like, this monkey,

98
00:06:42,080 --> 00:06:45,760
the monkey is the owner of it rather than the photographer because it was his camera.

99
00:06:45,760 --> 00:06:49,360
So it's like, it shows that, like, it doesn't matter whose camera it is,

100
00:06:49,360 --> 00:06:52,840
it's the person, the origin, the creator. So it's like, you don't know who the creator is

101
00:06:52,840 --> 00:06:57,760
when it's like AI generated. Yeah, absolutely. You don't know, it's just an amalgamation of people.

102
00:06:57,760 --> 00:07:02,120
All right, I'm sweating like crazy. So we're gonna wrap it up here. Do you want to say anything to the Floatplaners?

103
00:07:02,120 --> 00:07:06,640
I'm trying to coin this phrase now for people who've subscribed to us on Floatplane.com.

104
00:07:06,640 --> 00:07:09,880
It's that they're Floatplaners. So any message for them, or, like, something

105
00:07:09,880 --> 00:07:14,920
that we didn't talk about that you wanted to expand on? Tell Sammy you don't like being called Floatplaners.

106
00:07:17,120 --> 00:07:20,640
Then, okay, what should we call them then? What should we call them then? I don't know.

107
00:07:20,640 --> 00:07:25,760
I don't know. Subscribers, subscribers. Subscribers? No, but that's just YouTube subscribers, you know.

108
00:07:25,760 --> 00:07:29,680
I think Floatplaners is a good one. Sure. Unless I'm getting in trouble for that,

109
00:07:29,680 --> 00:07:33,200
then I didn't coin that phrase. It was never me, you never heard it here first.

110
00:07:33,200 --> 00:07:36,600
Yeah, but if people like it, then it's mine.

111
00:07:36,600 --> 00:07:41,280
You heard it here first. You heard it first. I'm Sammy, this is my sweat.

112
00:07:41,280 --> 00:07:42,480
Goodbye, Floatplaners.
