1
00:00:00,000 --> 00:00:06,240
You're recording? Oh, yeah. I was going to ask that, and I was like, yeah, I have to double-check.

2
00:00:06,240 --> 00:00:09,800
My glasses. It's OK.

3
00:00:09,800 --> 00:00:13,840
We'll make it work. All right. I am here with... Jonathan Horst.

4
00:00:13,840 --> 00:00:20,040
And what do you do for the company? I write, host, and produce American Dress.

5
00:00:20,040 --> 00:00:23,080
The Apple-focused channel. We love that channel.

6
00:00:23,080 --> 00:00:29,520
Oh, I think... I like it, actually. I feel like I watched the... I watched, I think, Hoffman edited it.

7
00:00:29,560 --> 00:00:32,760
The most recent one. Yeah. That was really good. I liked it. I watched the whole thing.

8
00:00:32,760 --> 00:00:36,800
I was like, interesting. You'll be so lucky.

9
00:00:36,800 --> 00:00:40,320
No. I'm not. I'm not that kind of editor.

10
00:00:40,320 --> 00:00:44,280
You sure? Yeah. It's not my strength.

11
00:00:44,280 --> 00:00:47,880
All right. I mean, I was like, maybe it is. We'll find out.

12
00:00:47,880 --> 00:00:53,840
What's that? All right. So I'm just asking everyone in the company what their thoughts on AI are.

13
00:00:53,840 --> 00:00:54,800
Oh, my God.

14
00:00:57,880 --> 00:01:01,240
Over-hyped? Over-hyped? Maybe incorrectly named.

15
00:01:05,840 --> 00:01:09,560
Annoying? I don't know.

16
00:01:09,560 --> 00:01:13,200
What is there to know? I don't even feel like I know enough about it,

17
00:01:13,200 --> 00:01:16,920
but I'm annoyed by the hype around it.

18
00:01:16,920 --> 00:01:21,560
Let's wait till the door closes. It doesn't work.

19
00:01:21,560 --> 00:01:24,720
Yeah, I know, it was a bad idea, but we're committed.

20
00:01:24,720 --> 00:01:28,480
We committed to this location, and it's our fault.

21
00:01:29,480 --> 00:01:33,400
Yeah, just. Yeah, go ahead. It's OK. The bid is committed.

22
00:01:33,400 --> 00:01:37,360
It's committed. It's footprint. Yeah.

23
00:01:37,360 --> 00:01:41,720
Yeah. OK, so it sounds like you don't like it at all.

24
00:01:41,720 --> 00:01:47,120
I mean, OK, so here's the problem. Especially this past year, with the release of ChatGPT

25
00:01:47,120 --> 00:01:53,800
and the Midjourney bot and all this weird stuff, everyone started attributing all this amazing capability

26
00:01:53,800 --> 00:01:57,840
to something that really hasn't been proven.

27
00:01:57,840 --> 00:02:03,200
And so I think we got a little fooled by what it was doing and thought it could do more.

28
00:02:03,200 --> 00:02:07,200
And think it can do more than it can. And we're thinking of it as like,

29
00:02:07,200 --> 00:02:13,200
it's going to just do the thing we want it to do when it's probably just going to be yet another tool in the toolbox

30
00:02:13,200 --> 00:02:16,520
to help. You still have to do the thing, though. You know what I mean?

31
00:02:16,520 --> 00:02:20,560
So when people are like, writers are going to be out of jobs

32
00:02:20,560 --> 00:02:23,560
because you could just use ChatGPT, it's like...

33
00:02:23,560 --> 00:02:26,520
And I'm not sure about that, because it's just

34
00:02:26,560 --> 00:02:30,440
a summary of all the writing it's ever looked at.

35
00:02:30,440 --> 00:02:35,280
So how are you going to get any new ideas or new ways of thinking about things or new thinking?

36
00:02:35,280 --> 00:02:39,560
Because that's what writing is. From that, I don't know.

37
00:02:39,560 --> 00:02:44,160
OK, well, surprisingly, you're the first person I've interviewed today who is really against it.

38
00:02:44,160 --> 00:02:49,160
So against it. But what's one thing you like about AI, then?

39
00:02:52,600 --> 00:02:59,160
I don't know. I haven't really, ah, AI is such a broad term.

40
00:02:59,160 --> 00:03:04,240
I guess AI in the current state. Or in the current culture.

41
00:03:04,240 --> 00:03:04,740
Exactly.

42
00:03:16,880 --> 00:03:21,160
It's impressive, but I'm not impressed, if that makes any sense.

43
00:03:21,160 --> 00:03:24,600
It's like, I don't know, I dabbled with Midjourney.

44
00:03:24,600 --> 00:03:28,800
And I was like, if you have broad strokes,

45
00:03:28,800 --> 00:03:34,560
you'll get an interesting image. But then if you want anything specific from the real world

46
00:03:34,560 --> 00:03:37,560
or something like that, it just starts making stuff up.

47
00:03:37,560 --> 00:03:44,720
And you're like, ah, it's a bit of a fever dream still. And it's like, I don't know.

48
00:03:44,720 --> 00:03:50,000
It can only get you so far, it feels like. And then hearing about people writing emails with ChatGPT

49
00:03:50,000 --> 00:03:55,600
or something like that, it becomes this weird question of, well, what's the point?

50
00:03:55,600 --> 00:04:00,480
If you're not willing to put effort into saying

51
00:04:00,480 --> 00:04:04,840
what you want to convey, then are you saying anything?

52
00:04:04,840 --> 00:04:11,240
I don't know. So it just makes everything so much harder to trust.

53
00:04:11,240 --> 00:04:18,440
And so I find these concerns way bigger than the benefits.

54
00:04:18,440 --> 00:04:22,080
And so I'm just like, I just haven't

55
00:04:22,080 --> 00:04:25,920
bothered to really play around too much with it.

56
00:04:25,920 --> 00:04:31,040
So I'm not the most experienced, I could say. But when I have dabbled with it, I've been like, oh,

57
00:04:31,040 --> 00:04:34,280
this is kind of neat. And maybe this could be a tool,

58
00:04:34,280 --> 00:04:40,680
like I guess Adobe's Generative Fill is kind of interesting.

59
00:04:40,680 --> 00:04:46,600
But it's like, they're demonstrating it. And look, we could make this image so much wider.

60
00:04:46,720 --> 00:04:52,680
We can fill it just from this tiny little portrait. We could turn it into a whole landscape thing.

61
00:04:52,680 --> 00:04:56,720
And it's like, yeah, I guess that's cool. But it's also like, what is the picture then anymore?

62
00:04:56,720 --> 00:05:01,160
Whereas it's probably a really great tool when you're trying to do a cool effect

63
00:05:01,160 --> 00:05:04,280
or you take the subject out of the foreground or the background.

64
00:05:04,280 --> 00:05:08,360
But when you do that, the background has that thing cut out of it.

65
00:05:08,360 --> 00:05:12,360
So you could fill that little space with Generative Fill. And that would be really great.

66
00:05:12,360 --> 00:05:17,440
You know what I mean? It's great for small things. It got marketed to us as this big thing.

67
00:05:17,440 --> 00:05:23,200
And so I think then it becomes really hard to trust it or understand it within those limitations

68
00:05:23,200 --> 00:05:30,080
and then be able to take advantage of it there. Instead, we're being hyped on this new future that actually

69
00:05:30,080 --> 00:05:36,760
sounds worse. You know what I mean? Yeah, no. I always talk about AI as like, oh, I know, we've always had it.

70
00:05:36,760 --> 00:05:40,880
It's like in a small scale, like you said. But it's like, now that it's this big scale thing,

71
00:05:40,880 --> 00:05:45,920
like, oh, it can take over jobs. And I thought, oh, OK, that's not really

72
00:05:45,920 --> 00:05:50,320
what's being said. I don't buy that. Yeah, neither do I, yeah.

73
00:05:50,320 --> 00:05:54,520
Yeah, yeah. So yeah, so it's annoying.

74
00:05:54,520 --> 00:05:58,200
It's really annoying. I feel like... and then I get into arguments with people about it.

75
00:05:58,200 --> 00:06:02,880
I don't know what to say. Have you written something? It's going to be really satisfying when you figure out

76
00:06:02,880 --> 00:06:05,920
how to write. I hate writing, by the way.

77
00:06:05,920 --> 00:06:10,600
But when you figure out how to write something really cohesively, that's really impressive.

78
00:06:10,600 --> 00:06:14,800
And I just don't know how a prompt is going to lead to that.

79
00:06:14,800 --> 00:06:16,680
As a writer, you hate writing? I do.

80
00:06:20,920 --> 00:06:26,080
All right, let me look at my next few questions here. OK, so you kind of touched on it.

81
00:06:26,080 --> 00:06:29,200
But you said that there's always

82
00:06:29,200 --> 00:06:34,600
this idea that AI is going to take over jobs. So you don't believe in that, right?

83
00:06:34,600 --> 00:06:40,320
It's like, I mean, you could say any sort of technology

84
00:06:40,320 --> 00:06:44,280
takes over jobs. But in some ways, the jobs shift, perhaps.

85
00:06:44,280 --> 00:06:49,760
And so like, I don't know, like you would have trying to think of an example.

86
00:06:49,760 --> 00:06:54,320
Like you could have a really specific job for like a really menial task.

87
00:06:54,320 --> 00:06:57,400
And like, and then they get a robot to do that.

88
00:06:57,400 --> 00:07:01,440
I think like, so I'm interested in cars. So in the car industry, like they tried,

89
00:07:01,440 --> 00:07:07,120
there was an attempt to roboticize the whole production

90
00:07:07,120 --> 00:07:11,600
process in the 80s. General Motors tried it, and it was a massive failure.

91
00:07:11,600 --> 00:07:15,640
And like, I think there are certain tasks that like are really great where somebody

92
00:07:15,640 --> 00:07:19,480
could get out of a job. So like, I don't know, like welding, you know,

93
00:07:19,480 --> 00:07:23,480
stamping and welding the metal panels of a car.

94
00:07:23,480 --> 00:07:29,360
You know, instead of having someone like manually doing that, you have the system because every piece is exactly the same.

95
00:07:29,360 --> 00:07:32,720
Smush and push and do the thing. And it's like, OK, so the person who

96
00:07:32,720 --> 00:07:37,400
had to do that manual menial task, I guess it did replace their job.

97
00:07:37,440 --> 00:07:41,880
But like, they can do something else. I think when we get up to this weird, like,

98
00:07:41,880 --> 00:07:45,440
broad creative space, it's puzzling to me

99
00:07:45,440 --> 00:07:49,240
that you would say, oh, this tool is going

100
00:07:49,240 --> 00:07:55,680
to take over a creative job instead of making the creative job easier or allowing the boundaries

101
00:07:55,680 --> 00:07:59,760
of the creative job to be pushed, or something like that. Those are two very different things.

102
00:07:59,760 --> 00:08:03,080
And I just think it's more the latter. So I just want to... do you need ice? You can get ice if you want.

103
00:08:03,080 --> 00:08:07,120
That's good, that's good. OK, so this is a bad idea.

104
00:08:07,120 --> 00:08:11,200
I don't have any ice in my water, so it's a possibility.

105
00:08:11,200 --> 00:08:16,480
Yeah, so it's probably more like, oh, maybe it'll push the boundaries of what the job looks like

106
00:08:16,480 --> 00:08:19,600
or maybe how many people are involved in it.

107
00:08:19,600 --> 00:08:25,520
Maybe, perhaps. Yeah, OK, I guess that's true. But it's not like a one-to-one replacement.

108
00:08:25,520 --> 00:08:29,200
But I guess what people do shifts all the time anyway, right?

109
00:08:29,200 --> 00:08:32,360
Exactly, exactly. All right, so you kind of dabbled in it,

110
00:08:32,360 --> 00:08:37,360
but you've used AI. How do you feel about it?

111
00:08:37,360 --> 00:08:41,480
You said you were OK about it, weird about it, or?

112
00:08:41,480 --> 00:08:45,680
Oh, I was like, oh, this is pretty neat. And I tried some image prompts.

113
00:08:45,680 --> 00:08:50,960
And I was like, oh, and weirdly, I thought the broader it was, the better it was,

114
00:08:50,960 --> 00:08:58,640
which I thought was interesting. And you can sort of see the years of going through image

115
00:08:58,640 --> 00:09:03,880
search or scrolling through Pinterest or whatever. You can sort of see how it all averages out

116
00:09:03,880 --> 00:09:10,360
and how it starts to be its common thing. So I was like, oh, OK, this is like a weird kind of take

117
00:09:10,360 --> 00:09:14,160
on all these pictures that I could picture having seen

118
00:09:14,160 --> 00:09:19,120
or whatever, right? So I don't know. So in that case, it's like, that's neat.

119
00:09:19,120 --> 00:09:22,280
But also, it has no sense of the physical world.

120
00:09:22,280 --> 00:09:26,760
It has no understanding of anything.

121
00:09:26,760 --> 00:09:30,160
It's literally just like, I know that this pixel is most

122
00:09:30,160 --> 00:09:33,320
likely to be next to this pixel when these pixels clump

123
00:09:33,760 --> 00:09:38,720
together like that. And I mean, I'm probably reducing it. I'm sure some AI people are going to be so mad, and I don't care.

124
00:09:38,720 --> 00:09:42,840
But it doesn't know.

125
00:09:42,840 --> 00:09:46,280
Have you seen the AI-generated commercials?

126
00:09:46,280 --> 00:09:50,760
Yeah, I've seen a lot. They're like, whoa. Some of them are really like, some of them are like, cool,

127
00:09:50,760 --> 00:09:56,400
but you can tell that it's very AI. Well, I mean, when you get to motion,

128
00:09:56,400 --> 00:10:00,200
then even the animation, you're replicating

129
00:10:00,200 --> 00:10:04,760
the physical reality, right? So animators in Disney in the past

130
00:10:04,760 --> 00:10:08,040
would study how a bird flies. And then they would draw that out.

131
00:10:08,040 --> 00:10:11,120
And it was an acknowledgement of how something actually

132
00:10:11,120 --> 00:10:15,080
physically happens. And then you watch this. There's this really funny beer commercial.

133
00:10:15,080 --> 00:10:20,760
And no one's drinking the beer. They're holding the beer bottle, and it's near their mouth.

134
00:10:20,760 --> 00:10:24,120
And then the person's sucking on it.

135
00:10:24,120 --> 00:10:27,120
But they're making a sucking face.

136
00:10:27,120 --> 00:10:30,120
And it's like, there's no liquid. There's nothing.

137
00:10:30,560 --> 00:10:34,080
It's such an indicator of, oh, these things just

138
00:10:34,080 --> 00:10:39,040
don't get the physics. And so what are you going to do then?

139
00:10:39,040 --> 00:10:44,240
Are you going to now create an AI that gets the laws of physics calculated and now has

140
00:10:44,240 --> 00:10:48,560
to figure out how to render that? It's just going to be more and more complicated.

141
00:10:48,560 --> 00:10:52,600
OK. So then as a tech-slash-creator company that we are,

142
00:10:52,600 --> 00:10:55,640
do you think that we should be using AI in any capacity?

143
00:10:55,640 --> 00:10:59,160
Or?

144
00:10:59,160 --> 00:11:03,120
I mean, again, from what little I know of what's out there,

145
00:11:03,120 --> 00:11:07,880
I'm sure there might be some really basic things where

146
00:11:07,880 --> 00:11:12,360
it could be beneficial. Like again, like I said, Generative Fill,

147
00:11:12,360 --> 00:11:15,920
if you want to animate a photo in our space.

148
00:11:15,920 --> 00:11:19,600
Sorry, can we start again? So like, start again, yeah.

149
00:11:19,600 --> 00:11:23,080
I'll start from the top. Yeah, yeah. Oh, yeah, OK. This is a bad picture.

150
00:11:23,080 --> 00:11:32,560
I'm so sorry. Don't worry about it. So in our case here, using the tools really broadly,

151
00:11:32,560 --> 00:11:38,160
saying ChatGPT will replace a writer, that is a bit rough.

152
00:11:38,160 --> 00:11:42,280
Maybe getting a good background image for a thumbnail

153
00:11:42,280 --> 00:11:46,640
with Midjourney might work. I could see Generative Fill being really great

154
00:11:46,640 --> 00:11:49,880
if we wanted to animate a photo and give it

155
00:11:49,880 --> 00:11:56,760
this parallax effect. I don't know if I should have stopped because it's just sounds.

156
00:11:56,760 --> 00:11:59,920
They're just backgrounds. But yeah, you get this parallax effect.

157
00:11:59,920 --> 00:12:03,160
If you get the parallax effect, then you could have this really cool animation.

158
00:12:03,160 --> 00:12:09,040
You fill it with Generative Fill. That sounds great. So again, really specific use cases

159
00:12:09,040 --> 00:12:14,280
where it's going to be really amazing. But in terms of broad strokes, no.

160
00:12:14,280 --> 00:12:20,600
It's going to help just like that tiny little step, like making one vision you have a bit easier

161
00:12:21,160 --> 00:12:25,320
to execute or making it better. That sounds great.

162
00:12:25,320 --> 00:12:29,200
So basically, we use it as a starting point, but then it won't be the final product.

163
00:12:29,200 --> 00:12:33,480
We should use it as a starting point, but we shouldn't use it for the final product.

164
00:12:33,480 --> 00:12:39,920
As a tool. We should be using it as a tool. Like you'd use a computer as a tool or a camera as a tool.

165
00:12:39,920 --> 00:12:46,800
That's what it's supposed to be, and that's what it is. And thinking of it as anything but that is annoying.

166
00:12:46,800 --> 00:12:50,320
Yeah. I think with everyone I've talked to, it's kind of like, it is a tool,

167
00:12:50,320 --> 00:12:54,080
but it shouldn't be something we really rely on, basically.

168
00:12:54,080 --> 00:12:58,000
Yeah, no. No.

169
00:12:58,000 --> 00:13:01,720
What's his name? Dax Flame has been trying to do that.

170
00:13:01,720 --> 00:13:08,880
Have you ever heard of him? He's a YouTuber who's been using AI to guide his videos along.

171
00:13:08,880 --> 00:13:13,000
And sometimes I'm like, man, you have great ideas.

172
00:13:13,000 --> 00:13:16,000
You just need to be confident in yourself.

173
00:13:16,000 --> 00:13:19,000
Just do that. Yeah. Yeah.

174
00:13:19,000 --> 00:13:22,840
All right, we're going to wrap up here quickly. But is there anything that we haven't mentioned

175
00:13:22,840 --> 00:13:28,720
that you want to discuss? No, I'm so glad to get this off my chest.

176
00:13:28,720 --> 00:13:32,760
All right, well, thank you. We're saying goodbye to the Floatplaners now.

177
00:13:32,760 --> 00:13:35,760
Goodbye, Floatplaners. Yeah, that's the name I've coined now.

178
00:13:35,760 --> 00:13:38,760
I don't know if it's official. Is it one video, or is each one separate?

179
00:13:38,760 --> 00:13:42,960
It's going to be like 20 uncuts. And then it'll get cut down to like a five-minute video.

180
00:13:42,960 --> 00:13:45,000
Oh yeah, OK.
