1
00:00:00,060 --> 00:00:06,960
TalkLinked is back with the number

2
00:00:03,600 --> 00:00:08,400
one co-host uh as voted by you the

3
00:00:06,960 --> 00:00:12,420
viewers of America I feel like I haven't been on TalkLinked in like two and a half years well that's because we haven't

4
00:00:10,980 --> 00:00:17,039
done it in like two and a half years all right let's get into it but then we started doing it and guess what we're

5
00:00:14,880 --> 00:00:20,400
talking about today James uh AI generated art see this is why I love you

6
00:00:18,960 --> 00:00:27,300
on TalkLinked you just want to cut right to the point yeah don't waste the viewers' time

7
00:00:23,820 --> 00:00:30,500
information density let's go I

8
00:00:27,300 --> 00:00:33,600
just want to know how you're doing

9
00:00:30,500 --> 00:00:35,340
the reason we're here is because a news

10
00:00:33,600 --> 00:00:38,760
story just came out today about a synthetic media artist Jason Allen I

11
00:00:37,739 --> 00:00:44,100
think that actually came out yesterday but he won the Colorado State Fair Fine

12
00:00:41,879 --> 00:00:49,500
Arts competition in the Digital Arts category and good for him you know

13
00:00:46,920 --> 00:00:53,520
those judges got to catch up yeah uh wait why what do you mean

14
00:00:51,180 --> 00:00:56,820
because an AI generated his entry oh well that was the key that was the key

15
00:00:55,020 --> 00:01:04,079
detail that I hadn't gotten to yet yes the reason he won was because he isn't

16
00:01:00,059 --> 00:01:04,079
actually an artist at all arguably

17
00:01:06,260 --> 00:01:11,820
he used an AI called Midjourney and

18
00:01:09,900 --> 00:01:16,020
used a special prompt that he will be publishing quote unquote at a later date

19
00:01:13,560 --> 00:01:19,799
that's his whole process his uh 11 herbs and spices is that secret prompt

20
00:01:18,119 --> 00:01:24,659
right exactly what combination of text yielded this art piece yeah so so the

21
00:01:22,320 --> 00:01:29,820
whole story came about firstly because of this tweet that I'll click into

22
00:01:28,439 --> 00:01:33,420
um that someone was very upset about this

23
00:01:32,100 --> 00:01:39,240
but apparently um he goes by Sincarnate on Discord

24
00:01:36,060 --> 00:01:42,119
or whatever and he talked about his his

25
00:01:39,240 --> 00:01:47,579
win here and uh he's like hey that's so great look at this I used a personal

26
00:01:44,640 --> 00:01:51,060
project he made using Midjourney and uh yeah people are upset about it

27
00:01:49,680 --> 00:01:56,100
James is AI art art

28
00:01:54,119 --> 00:02:01,140
first of all is it art and then we'll talk about whether you know you can take

29
00:01:57,899 --> 00:02:02,820
credit for it oh that's let's just start

30
00:02:01,140 --> 00:02:07,979
with the most philosophical and abstract question like I guess the crux of any

31
00:02:06,240 --> 00:02:13,379
anyone who's squeamish to answer that with anything but a fast yes or no uh the

32
00:02:11,520 --> 00:02:17,099
that person was probably grappling with the question of does art need to be

33
00:02:15,300 --> 00:02:23,280
generated by humans really what it comes down to is it art uh

34
00:02:21,319 --> 00:02:27,239
in order to answer that question you have to you have to ask what is art well

35
00:02:25,620 --> 00:02:31,260
I think the thing with art is the fuzziness of it and when you're talking

36
00:02:28,860 --> 00:02:35,879
about a computer generating it it seems more like a computation this is an

37
00:02:33,660 --> 00:02:38,879
output it's deterministic the machine will create this output every single

38
00:02:37,379 --> 00:02:43,200
time because it's a computer but that's actually not true because these art

39
00:02:40,440 --> 00:02:47,700
generators they output multiple different iterations every time you hit

40
00:02:45,540 --> 00:02:51,599
enter right so even with the same text prompt you probably won't get this exact

41
00:02:49,140 --> 00:02:55,260
piece ever again so in that sense it still has that ephemeral quality of it

42
00:02:53,519 --> 00:03:00,540
and that fuzziness that makes it art see I would argue

43
00:02:58,319 --> 00:03:07,200
that's a good point thank you but I would argue that art has to have

44
00:03:04,080 --> 00:03:10,080
some sort of intentionality behind it

45
00:03:07,200 --> 00:03:15,720
because if it wasn't intended to be produced in a certain way then

46
00:03:12,959 --> 00:03:19,379
it may evoke feelings in the viewer or in the person

47
00:03:17,400 --> 00:03:24,060
experiencing it you know whether it's an image or music or

48
00:03:21,060 --> 00:03:26,879
whatever it may evoke certain feelings

49
00:03:24,060 --> 00:03:29,640
but you go and watch it you go and look at a beautiful mountain and you feel

50
00:03:28,440 --> 00:03:34,019
something that doesn't make the mountain art so you're saying that uh so there's

51
00:03:32,700 --> 00:03:39,000
an intentionality of the person creating the prompt this is a collaborative effort between the person

52
00:03:36,900 --> 00:03:42,180
at the keyboard and the AI yes and the person at the keyboard has an intention

53
00:03:40,319 --> 00:03:46,260
of what they want that's why they they create the string that they do in the

54
00:03:43,560 --> 00:03:50,940
way that they create it however there is a distance there where I just have to

55
00:03:48,959 --> 00:03:54,599
roll the dice and see what the AI is going to output right it just gives you

56
00:03:52,860 --> 00:03:58,739
a random like uh here's a few things yeah and so I don't I don't control that

57
00:03:56,940 --> 00:04:02,580
I can try to approximate it with my words but then the

58
00:04:01,140 --> 00:04:05,879
the thing actually doing the generation of the art the AI it doesn't have

59
00:04:04,200 --> 00:04:08,940
intentionality well its intention is to try to closely you know produce

60
00:04:07,799 --> 00:04:14,220
something that matches what you've said but it doesn't have a it's not trying to make a statement you know it doesn't so

61
00:04:13,019 --> 00:04:21,120
this collaborative effort the intentionality is kind of severed so exactly that's why I think it's not art

62
00:04:18,299 --> 00:04:24,900
yeah I feel like you need to have well so I I think the the thing that just

63
00:04:23,160 --> 00:04:28,680
popped into my head was like a Choose Your Own Adventure story

64
00:04:26,580 --> 00:04:34,020
or like a multiple choice test let's go let's Choose Your Own Adventure uh an

65
00:04:31,139 --> 00:04:38,160
author has written out a number of possible branching paths for a story to

66
00:04:36,479 --> 00:04:41,100
take and as you're reading this choose your own I'm thinking about the books

67
00:04:40,020 --> 00:04:47,639
but like that's not really a thing anymore maybe it is let's say it's a Netflix thing uh just keep it

68
00:04:45,360 --> 00:04:52,500
as books sure it's one of these books that used to exist

69
00:04:49,620 --> 00:04:55,919
um and if you you by reading through the book and then being like hmm these are

70
00:04:54,479 --> 00:05:01,919
the options presented to me by this person that actually generated something

71
00:04:57,479 --> 00:05:03,840
and I'm gonna choose this one and

72
00:05:01,919 --> 00:05:09,720
okay now you're going along this path but that doesn't mean that you generated

73
00:05:05,880 --> 00:05:11,699
that story you just put in an input

74
00:05:09,720 --> 00:05:16,259
you're putting in some inputs to some system the system is generating right

75
00:05:13,860 --> 00:05:22,440
but so this system just has myriad just infinitely more inputs and even

76
00:05:18,840 --> 00:05:24,300
more problematically uh the system is

77
00:05:22,440 --> 00:05:28,199
based on work that was actually originally created by real human artists

78
00:05:26,940 --> 00:05:34,080
well this is a really interesting part of it yeah so it's like

79
00:05:31,919 --> 00:05:38,820
I you know I don't want to say like I'm not trying to be here being like this

80
00:05:35,580 --> 00:05:40,139
new technology that is you know uh I'm

81
00:05:38,820 --> 00:05:43,440
not trying to hate on like new technology just because it's new but I

82
00:05:42,120 --> 00:05:47,039
think that like while it is exciting and while it is a

83
00:05:45,720 --> 00:05:51,000
very interesting technology that we should explore I think that right now

84
00:05:48,840 --> 00:05:54,060
it's kind of like the wild west where there isn't any regulation people are

85
00:05:52,979 --> 00:05:58,380
still asking these questions we're having a talk like right now talking

86
00:05:55,979 --> 00:06:01,800
about it uh so we need to have these conversations and then like break it

87
00:06:00,479 --> 00:06:07,680
down into what does this mean for copyright what does this mean for the

88
00:06:04,800 --> 00:06:10,860
future of artistry as like a career or even competitions or even competitions

89
00:06:09,479 --> 00:06:15,000
like do you have to film yourself creating the art now and submit that as

90
00:06:13,199 --> 00:06:19,979
well as your competition entry which is what one person on

91
00:06:16,979 --> 00:06:23,039
a Reddit thread suggested in response

92
00:06:19,979 --> 00:06:25,800
to this fact that this guy like won this

93
00:06:23,039 --> 00:06:29,819
art competition using AI art and the key detail is that he did not

94
00:06:27,720 --> 00:06:33,600
the details seem a little murky right now but at least

95
00:06:31,440 --> 00:06:37,979
some of the judges have said that they didn't know it was AI art and he has

96
00:06:36,060 --> 00:06:41,759
said himself that he's like oh what I needed to let them know that was in the

97
00:06:40,259 --> 00:06:45,000
Vice article I believe at the bottom

98
00:06:43,500 --> 00:06:50,360
um I mean I don't know anything about this person's intentions but it's pretty obvious that it wasn't in the spirit of

99
00:06:48,240 --> 00:06:54,960
the competition exactly exactly I mean I can understand

100
00:06:53,460 --> 00:07:01,560
this is almost like a form of performance art like it's a stunt exactly exactly I can understand him

101
00:06:58,979 --> 00:07:06,120
saying you know okay I want to bring attention to this yeah I

102
00:07:03,780 --> 00:07:10,979
want to show what this tool Midjourney can do so I'm going to stealthily enter

103
00:07:08,280 --> 00:07:16,740
uh with my AI art and then when I win I'm gonna be like haha look at this I

104
00:07:14,520 --> 00:07:19,620
this is actually AI art now I've started a conversation and now we're

105
00:07:18,000 --> 00:07:23,340
talking about this but then I think what he should have done is

106
00:07:21,840 --> 00:07:26,580
be like okay but I didn't actually win your prize money yeah yeah because

107
00:07:25,380 --> 00:07:32,460
now you have these people who actually created art losing and they're like

108
00:07:30,060 --> 00:07:36,900
I think that makes it way crappier that makes it way stinkier of a situation

109
00:07:35,160 --> 00:07:40,500
because it's a stinky situation because there are people who are outraged and

110
00:07:38,280 --> 00:07:43,380
they're and they're disappointed that the world's going this way and they're

111
00:07:41,880 --> 00:07:49,139
worried about the future and what this means for artists around the world and the volume of art that will be created

112
00:07:46,319 --> 00:07:55,319
by humans going forward and it's kind of dystopian but that dystopian

113
00:07:52,919 --> 00:07:58,380
viewpoint is greatly enhanced by people being crappy like this exactly what do

114
00:07:57,479 --> 00:08:01,919
you mean this is valid just like everyone else like yeah no dude I think

115
00:08:00,960 --> 00:08:06,120
that like they're they're it happened there's

116
00:08:04,080 --> 00:08:11,240
always this pushback this hard backlash right when something cool

117
00:08:08,280 --> 00:08:15,000
happens like say blockchain technology and people are like okay cool we're

118
00:08:13,740 --> 00:08:20,580
building this technology in the early days it's like oh this enter the scammers enter the scammers and the pump

119
00:08:18,419 --> 00:08:24,240
and dumpers and now you know there's all these scams out

120
00:08:22,440 --> 00:08:27,900
there and there's a huge backlash of being like cryptocurrency is evil and

121
00:08:26,340 --> 00:08:30,840
bad and it should never be used and it can only be used for scams and pump and dump

122
00:08:29,400 --> 00:08:34,200
stuff and then you have people being like okay wait but remember in the early

123
00:08:32,880 --> 00:08:37,860
days there was this promise and we could use it for that still but the well is

124
00:08:36,120 --> 00:08:40,440
poisoned yeah and I think that's happening a little bit with AI art right

125
00:08:39,240 --> 00:08:44,700
it's going to be a way different outcome though because the AI art is just it's

126
00:08:43,140 --> 00:08:48,420
much it's simpler it's more straightforward the future is here now

127
00:08:46,320 --> 00:08:53,399
deal with it now you know what I mean like uh all graphic artists

128
00:08:51,300 --> 00:08:58,440
you're in trouble right within the next two years uh stock imagery sites you're

129
00:08:56,640 --> 00:09:02,700
in trouble yeah this is going to upend Industries and change laws for sure you

130
00:09:00,540 --> 00:09:07,380
know and and it's and it's moving super rapidly like this technology at the

131
00:09:05,399 --> 00:09:12,600
beginning of this year was super super rudimentary and it wasn't available and

132
00:09:09,420 --> 00:09:14,279
then OpenAI released DALL-E 2 uh

133
00:09:12,600 --> 00:09:17,760
Google released their thing that I forget what it's called Imagen I

134
00:09:15,720 --> 00:09:22,560
think it's called and now you go and there's lists of like the 10 coolest uh

135
00:09:20,940 --> 00:09:27,180
AI art generators well it's getting even crazier because DALL-E had some built-in kind of

136
00:09:25,140 --> 00:09:32,459
filters where you can't do a human's face right you

137
00:09:29,640 --> 00:09:37,140
can't do pornography right uh whereas Stability AI yeah Stable Diffusion yeah

138
00:09:35,399 --> 00:09:42,779
it was released open source it has some filters

139
00:09:39,779 --> 00:09:44,459
that are enabled by default to that's

140
00:09:42,779 --> 00:09:47,459
just so you don't type something in and get a result that's pornographic and

141
00:09:46,080 --> 00:09:51,420
you're like oh I didn't want that right but you can disable this filter

142
00:09:49,740 --> 00:09:55,380
and you can create celebrity likenesses you can make like I've seen a

143
00:09:53,459 --> 00:09:58,440
bunch of pictures of Charlize Theron's face I knew it was her there was no

144
00:09:57,000 --> 00:10:04,860
label saying it was Charlize Theron yeah I could tell it's her in multiple styles and angles right uh and so it's

145
00:10:02,640 --> 00:10:10,320
what does this mean yeah exactly this is you can run this on your desktop GPU

146
00:10:08,160 --> 00:10:13,380
it only supports NVIDIA GPUs right now but I don't have to rely on a cloud

147
00:10:11,880 --> 00:10:17,100
service for this yeah I mean like when deep fakes first came out people were

148
00:10:15,600 --> 00:10:20,820
like oh my gosh people are using their GPUs to like run these computations and

149
00:10:18,899 --> 00:10:24,360
do it at home and that was one thing because if you didn't really have access

150
00:10:22,380 --> 00:10:28,800
to the hardware to like run those simulations over and over uh or run

151
00:10:26,940 --> 00:10:32,459
those computations then you wouldn't be able to make deep fakes but now all of

152
00:10:30,480 --> 00:10:36,300
these AI generators are on the web you can just go to a URL and put in a prompt

153
00:10:34,380 --> 00:10:39,899
and get images and so this is a TechCrunch article talking about what we

154
00:10:38,279 --> 00:10:45,420
were the generator we were just mentioning Stable Diffusion

155
00:10:42,300 --> 00:10:48,480
um I think it's it's in a program called

156
00:10:45,420 --> 00:10:50,940
dream AI or something I forget what it's

157
00:10:48,480 --> 00:10:56,519
called uh but you know people are using it to make porn of existing people and

158
00:10:54,660 --> 00:11:00,839
upload it to 4chan lovely site and it's just images

159
00:10:59,040 --> 00:11:04,440
for now what's the difference with deepfakes this is just images for now not

160
00:11:02,579 --> 00:11:08,579
video but it'll be video within two or three years for sure for sure I mean yeah like

161
00:11:06,240 --> 00:11:12,000
there's deepfake videos and yeah that I'm sure there will be

162
00:11:10,019 --> 00:11:16,320
generators uh quite soon like have you watched like that really low quality

163
00:11:13,880 --> 00:11:20,700
animated children's content on YouTube like not to throw Little Baby Bum

164
00:11:18,899 --> 00:11:25,500
under the bus but Little Baby Bum is just it's nursery rhyme songs with just

165
00:11:23,040 --> 00:11:29,160
like pretty bad animation a lot of 3D animation along with it and kids love it

166
00:11:27,120 --> 00:11:32,160
my kid watched it for a year and a half straight every day all day like just

167
00:11:30,959 --> 00:11:39,000
love it um in the future you could potentially just play the song and just have the

168
00:11:36,480 --> 00:11:42,000
animation generated and just put that onto your YouTube channel even less work

169
00:11:40,680 --> 00:11:45,959
and you can monetize that YouTube channel and get rich very easily yeah I

170
00:11:44,339 --> 00:11:51,000
don't doubt that that is the future I mean uh that's video stuff uh and we

171
00:11:49,440 --> 00:11:54,959
have to watch out for that coming but I mean yeah the still images are being

172
00:11:52,500 --> 00:11:59,339
used right now this is an Atlantic article from earlier this month I love

173
00:11:57,420 --> 00:12:04,260
the Atlantic I do as well you subscribe no no you should uh but when

174
00:12:02,760 --> 00:12:10,140
people send a link and it's an Atlantic link I'm like okay I'm out of taste

175
00:12:07,620 --> 00:12:14,040
anyway what's in the article um this uh you know it's a it's a regular article

176
00:12:11,760 --> 00:12:18,180
no AI was uh involved in the creation of the words but that's a

177
00:12:15,360 --> 00:12:21,959
Midjourney-created image uh it says the prompt was Alex Jones inside an American

178
00:12:20,100 --> 00:12:25,800
office under fluorescent lights that's the perfect use for that yeah and there

179
00:12:23,519 --> 00:12:29,399
goes the graphic designer's job right so that is something that would be

180
00:12:27,420 --> 00:12:34,380
traditionally I mean yeah this is like a perfect example like it's real world

181
00:12:30,899 --> 00:12:36,120
this is a lost gig I think graphic

182
00:12:34,380 --> 00:12:39,600
designers will continue to exist and I think they will use these tools you know

183
00:12:37,920 --> 00:12:42,180
when you're speccing out a job you're probably going to go all right well

184
00:12:40,920 --> 00:12:46,200
here's five different things that I created in the last 10 minutes

185
00:12:44,459 --> 00:12:49,680
um using one of these tools right uh which one do you like right you like

186
00:12:48,060 --> 00:12:52,620
this one okay now I'll go and make a better version of that exactly because a

187
00:12:51,480 --> 00:12:56,160
lot of these things when you zoom in they're actually not that nice so you

188
00:12:54,720 --> 00:13:02,040
might not use it for like your corporate logo or something like that although there are Services which use AI to

189
00:13:00,360 --> 00:13:06,000
generate logos you put in like your company name some details about you your

190
00:13:03,720 --> 00:13:11,220
industry uh the style you maybe want and they will create an AI-created logo for

191
00:13:09,660 --> 00:13:15,180
you and I'm sure like there are options and then I'm sure you can yeah depending

192
00:13:12,839 --> 00:13:18,300
on that AI like who has the copyright because with DALL-E DALL-E retains the

193
00:13:17,160 --> 00:13:24,360
copyright to the images that you generate right which is weird kind of

194
00:13:21,060 --> 00:13:25,860
DALL-E DALL-E retains yeah it does yeah

195
00:13:24,360 --> 00:13:28,680
not you see but that's the thing is that like

196
00:13:27,540 --> 00:13:33,540
should they have copyright at all because what is DALL-E trained on it's trained on all of these other artists

197
00:13:32,160 --> 00:13:38,880
that initially created the work the whole internet like it's just a general web scrape of paid content like

198
00:13:37,200 --> 00:13:42,240
Shutterstock where you need to have a subscription and then they just take

199
00:13:40,500 --> 00:13:46,139
those billions of things and monetize it while putting Shutterstock out of

200
00:13:43,740 --> 00:13:50,279
business oh my God that's totally not fair I think in the future it could be

201
00:13:47,399 --> 00:13:54,540
possible that anyone who wants to uh create one of these models using a

202
00:13:52,380 --> 00:14:00,120
corpus of imagery will have to license all that imagery to feed the model but

203
00:13:57,600 --> 00:14:04,440
like how do you enforce that yeah I mean we need regulation which is

204
00:14:02,760 --> 00:14:07,860
it which is kind of funny because this is like on the

205
00:14:06,300 --> 00:14:13,019
cutting edge of what's going on right now and to even think about

206
00:14:10,860 --> 00:14:18,740
to even think about how long it'll be before a bunch of old people in you know

207
00:14:16,200 --> 00:14:22,860
the US Congress or Senate or elsewhere uh start to like become familiar with

208
00:14:21,360 --> 00:14:26,339
this as a phenomenon and then craft legislation around it it's gonna be a

209
00:14:24,660 --> 00:14:30,360
while yeah it's not gonna happen and well it could happen it'll just be a

210
00:14:28,200 --> 00:14:36,060
couple years yeah it's gonna be a rocky road there's other legal aspects as well

211
00:14:31,860 --> 00:14:38,220
like you can create an artwork of a

212
00:14:36,060 --> 00:14:43,980
celebrity's face you could sell a painting of Morgan Freeman and could I

213
00:14:41,760 --> 00:14:47,040
yes but it's kind of like fair use it's a gray area you have to make like an argument

214
00:14:45,360 --> 00:14:52,380
for it a little bit so for example if you if you created a work where the the

215
00:14:50,399 --> 00:14:56,160
work was completely just like a photo of Morgan

216
00:14:53,820 --> 00:14:59,459
Freeman then uh you probably have to get permission to sell that but if you

217
00:14:57,600 --> 00:15:03,360
created a work in an Andy Warhol style where the work isn't just that

218
00:15:02,040 --> 00:15:09,959
it's his photo it's not just his likeness that makes the work cool it's that you stylized it right and it came

219
00:15:07,560 --> 00:15:13,740
from raw materials like paint uh then you don't even need to get this person's

220
00:15:11,760 --> 00:15:16,860
permission especially if it's just a one-off like you're you're making

221
00:15:15,300 --> 00:15:21,540
caricatures or something on the street in Vegas like rather than making this

222
00:15:19,800 --> 00:15:25,079
one art piece so I'm gonna make a bazillion and sell them all right if you

223
00:15:23,399 --> 00:15:28,500
were doing that with a picture of Morgan Freeman's face you'd probably run into

224
00:15:26,760 --> 00:15:33,060
some sort of yeah it's similar to fair use in that one of the pillars of it is

225
00:15:30,480 --> 00:15:37,199
does it materially impact that person's ability to monetize their likeness right

226
00:15:35,519 --> 00:15:40,260
right right right if you make one painting of Morgan Freeman and sell it

227
00:15:38,880 --> 00:15:44,040
to one person I mean depending on how expensive it is

228
00:15:42,180 --> 00:15:49,560
maybe Morgan Freeman would be like hey wait it's a question because

229
00:15:46,019 --> 00:15:52,019
now you've got this thing that can

230
00:15:49,560 --> 00:15:56,760
create many thousands of images of Morgan Freeman right but it's only

231
00:15:53,699 --> 00:15:58,560
creating them you made one and sold it

232
00:15:56,760 --> 00:16:01,199
and that it was inconsequential yeah because Riley just sold one so Morgan

233
00:15:59,880 --> 00:16:06,000
Freeman's not gonna sue no one's no Morgan Freeman's not gonna sue you yeah no one's paying millions of dollars of

234
00:16:04,079 --> 00:16:09,420
art for my art but what if millions of individuals each make their own Morgan

235
00:16:07,500 --> 00:16:12,420
Freeman thing so it does materially impact his ability to monetize his

236
00:16:10,800 --> 00:16:17,339
likeness right but there's no individual to attack as each individual only sold

237
00:16:14,040 --> 00:16:18,839
it once so do you attack the open source

238
00:16:17,339 --> 00:16:26,040
well honestly that's kind of the question we're faced with here because these AI generators are

239
00:16:24,000 --> 00:16:29,699
mass producing now that millions of people have access to

240
00:16:28,139 --> 00:16:33,660
these online they sign up for the wait list there's there's not even like the

241
00:16:31,980 --> 00:16:36,899
like the big ones the really good ones you have to sign up for like DALL-E you had

242
00:16:35,220 --> 00:16:41,519
to register for and wait until you get access Stable Diffusion well same thing

243
00:16:40,139 --> 00:16:46,320
but they publicly released it now stable yeah it's publicly released there are

244
00:16:43,440 --> 00:16:49,980
there's tens of these dozens of these more uh online that you can just

245
00:16:48,480 --> 00:16:54,000
go to a URL and put it in you don't need to wait at all so now there are millions

246
00:16:51,899 --> 00:16:59,519
of people potentially using these AI art generators to generate

247
00:16:56,519 --> 00:17:02,040
countless works of art based on whatever

248
00:16:59,519 --> 00:17:05,100
yeah original art pieces by human people yeah that's the other thing because

249
00:17:03,600 --> 00:17:09,959
they're not coming from raw material yeah it's not like your paint on the canvas the materials are coming from

250
00:17:07,980 --> 00:17:14,459
these other presumably copyrighted oftentimes I just

251
00:17:12,000 --> 00:17:18,600
had a conversation with David uh prior to this about

252
00:17:16,079 --> 00:17:23,339
uh you know whether this is like plagiarism or not and I'm like well okay

253
00:17:20,760 --> 00:17:28,919
so it's not plagiarism because you're using these people's art as sort of like

254
00:17:25,319 --> 00:17:31,080
a training tool for AI to generate

255
00:17:28,919 --> 00:17:34,980
something new right so if plagiarism is just copying it

256
00:17:33,660 --> 00:17:39,419
you know that's on one end of the spectrum and on the other side maybe is

257
00:17:37,260 --> 00:17:44,039
you know looking at it from the frame of when you paint you're using materials

258
00:17:42,539 --> 00:17:49,380
that you didn't make from scratch you know you're buying this existing

259
00:17:45,960 --> 00:17:51,120
pigment and uh you know making something

260
00:17:49,380 --> 00:17:54,419
completely new from it but you didn't make it completely from scratch there's

261
00:17:52,740 --> 00:17:58,380
other people's work going into that as well so like that's on the other end of

262
00:17:56,160 --> 00:18:02,580
the spectrum and AI art is kind of somewhere in the middle but

263
00:18:00,360 --> 00:18:06,480
I'm inclined to put it a little bit more towards plagiarism it's not plagiarism

264
00:18:04,559 --> 00:18:11,520
well it's all arbitrary it all comes down to the abilities of the entity you

265
00:18:09,360 --> 00:18:17,400
know it's not plagiarism if I do a perfect replica of um uh Starry Night

266
00:18:14,820 --> 00:18:21,960
because yeah obviously I was influenced by the original work and then I had to

267
00:18:19,260 --> 00:18:25,440
do all this analog very high skill work to like mix the paint correctly right

268
00:18:23,520 --> 00:18:30,780
and years of skill for me to actually render it yeah yeah but all that stuff

269
00:18:28,679 --> 00:18:35,220
is all trivial to a machine right so it's just it's only because the

270
00:18:33,660 --> 00:18:40,140
entity in question the machine is better at doing it than you are but that's but

271
00:18:36,960 --> 00:18:43,440
like I guess that's the

272
00:18:40,140 --> 00:18:46,679
that's the question is there value in a

273
00:18:43,440 --> 00:18:49,260
machine learning to do this skill should

274
00:18:46,679 --> 00:18:54,780
we like because it's not an agent it's not a conscious agent I mean according

275
00:18:51,660 --> 00:18:57,120
to not according to boys when you use

276
00:18:54,780 --> 00:19:01,260
Midjourney you're interacting with a bot it's not like you're on a Google

277
00:18:59,160 --> 00:19:05,100
local page with a search bar you're in Discord and you're interacting with the

278
00:19:03,480 --> 00:19:08,940
bot you type the prompt you want it returns an image and then you can say

279
00:19:06,720 --> 00:19:11,700
make more of them or change it in this way and it's like

280
00:19:10,679 --> 00:19:16,020
you're having a conversation with an artist in a way and you're collaborating you're kind of collaborating but that's

281
00:19:14,039 --> 00:19:21,539
just an artifice the bot is just a window into using the

282
00:19:19,500 --> 00:19:26,580
service available on yeah for sure for now soon it'll be Scarlett Johansson

283
00:19:22,980 --> 00:19:27,840
soon they'll add actual like AI chatbot

284
00:19:26,580 --> 00:19:32,640
functionality in there and then you're talking to it'll be very weird through

285
00:19:30,240 --> 00:19:38,100
the AI yes and they're like okay so when you say a bird flying over a ship do you

286
00:19:36,000 --> 00:19:41,360
mean like a big ship or like a little ship like this is the AI talking to you

287
00:19:39,960 --> 00:19:46,140
about it yeah and then make that by voice if we like

288
00:19:43,860 --> 00:19:50,280
combine these technologies yes because that's even scarier like it's one thing

289
00:19:47,760 --> 00:19:54,980
to say all right I work in an auto factory a car factory and

290
00:19:53,039 --> 00:19:59,280
they're replacing my you know job screwing in these rivets with a

291
00:19:57,539 --> 00:20:03,840
robotic arm that's one thing another thing is saying okay you have

292
00:20:01,919 --> 00:20:08,940
this creative career you thought you were safe from the AI for a while but

293
00:20:06,299 --> 00:20:15,480
now literally your entire job like consultation uh sample work for

294
00:20:12,840 --> 00:20:19,140
inspiration or whatever uh you know any other type of parameters that you would

295
00:20:16,860 --> 00:20:23,039
discuss with a person ahead of time you can do with AI because we have this

296
00:20:20,820 --> 00:20:27,600
language AIs and then we also have these art-generating AIs yeah so

297
00:20:25,500 --> 00:20:30,900
even if AI doesn't achieve this like general sentience that

298
00:20:29,580 --> 00:20:35,460
is like far in the future with the Singularity et cetera et cetera even if we

299
00:20:33,059 --> 00:20:40,140
don't achieve that it's like we're functionally going to achieve

300
00:20:37,559 --> 00:20:44,100
something very similar to it because of all these other systems that are coming

301
00:20:41,400 --> 00:20:48,600
together yeah scary well it begs the question like what happens to artists so

302
00:20:45,480 --> 00:20:50,400
does it mean that you've got it I think

303
00:20:48,600 --> 00:20:54,179
look it'll stratify it like it'll make it more disparate

304
00:20:52,380 --> 00:20:58,200
like you'll have billionaires and impoverished people uh so to speak

305
00:20:55,980 --> 00:21:01,440
you'll have the artists or graphic designers who are only using

306
00:20:59,520 --> 00:21:06,120
these tools and you'll have the like the very high level highly skilled artists

307
00:21:05,280 --> 00:21:12,660
um but I mean I guess it's good to be one

308
00:21:09,600 --> 00:21:15,059
of them but they'll be few they'll be so

309
00:21:12,660 --> 00:21:19,200
few of them right yeah well uh you know this was a reply in the

310
00:21:17,280 --> 00:21:24,780
original tweet where someone kind of like whistle-blew on this guy uh

311
00:21:21,660 --> 00:21:26,400
bragging about his achievement uh

312
00:21:24,780 --> 00:21:29,940
Omni Morpho on Twitter says we're watching the death of artistry unfold

313
00:21:27,840 --> 00:21:32,880
right before our eyes um even high skilled jobs you know

314
00:21:31,679 --> 00:21:38,640
you're saying oh there will be some high skilled artists but like even those guys I mean maybe they'll be like a handful

315
00:21:36,299 --> 00:21:44,159
you know like literally the thing you have to think about though is the

316
00:21:39,720 --> 00:21:47,400
convergence or the homogenization of of

317
00:21:44,159 --> 00:21:50,340
art because right now all of these

318
00:21:47,400 --> 00:21:54,480
generators are using a data set that is virgin that is created by humans

319
00:21:52,440 --> 00:21:58,740
all the info that's feeding them was photographs taken by humans art pieces

320
00:21:56,520 --> 00:22:03,419
drawn by humans but now that millions of pieces are coming out as output from

321
00:22:01,200 --> 00:22:09,299
these generators that means the internet at large is growing in its proportion

322
00:22:06,600 --> 00:22:12,720
of stuff that was made by robots right and so if that continues to expand to

323
00:22:11,520 --> 00:22:18,720
the point where like half the stuff on the internet was made by robots yeah then when the next AI is

324
00:22:16,320 --> 00:22:24,059
programmed and what's fed into it is outputs from robots it'll become more and more

325
00:22:21,419 --> 00:22:27,900
like it's self-referential yeah this was Horst I was talking to him

326
00:22:25,620 --> 00:22:33,299
before as well and he specifically asked me to bring this up like he likened

327
00:22:30,360 --> 00:22:37,520
it to re-encoding videos the more that you re-encode the same video file the

328
00:22:35,159 --> 00:22:42,179
more the quality degrades so it's like eventually are these AI art generators

329
00:22:40,620 --> 00:22:47,820
just going to be you know feeding off themselves again and again and again until it's just like a blurry mess uh so
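The re-encoding analogy can be sketched as a toy simulation (an illustration of generation loss invented for this transcript, not anything from the episode): treat each lossy pass as a simple neighbor-averaging blur and watch fine detail shrink generation after generation:

```python
import random

def blur(samples):
    """One 'generation': replace each value with the average of itself and
    its neighbors, a crude stand-in for any lossy re-encoding step."""
    out = []
    for i, x in enumerate(samples):
        left = samples[i - 1] if i > 0 else x
        right = samples[i + 1] if i < len(samples) - 1 else x
        out.append((left + x + right) / 3)
    return out

def detail(samples):
    """Crude measure of fine detail: mean absolute difference between neighbors."""
    return sum(abs(a - b) for a, b in zip(samples, samples[1:])) / (len(samples) - 1)

rng = random.Random(0)
signal = [rng.random() for _ in range(200)]  # a noisy 'original' signal

levels = [detail(signal)]
for _ in range(10):
    signal = blur(signal)           # re-encode the previous generation's output
    levels.append(detail(signal))   # detail never recovers, only erodes

print(f"detail: {levels[0]:.3f} -> {levels[-1]:.3f}")
```

Research on "model collapse" reports a similar effect for generative models trained largely on earlier models' outputs: the rare, distinctive parts of the original distribution get smoothed away first.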

330
00:22:46,260 --> 00:22:52,740
I guess I don't think that I think I mean like stylistically like to me I

331
00:22:49,919 --> 00:22:55,740
liken it as um this is a pet theory I guess it won't exactly sorry it

332
00:22:54,120 --> 00:23:00,240
won't like yeah it won't be the same as like the resolution is worse but maybe

333
00:22:58,500 --> 00:23:04,080
it would be like um Hollywood taking over the world and all film is just like

334
00:23:01,980 --> 00:23:07,500
every movie has to be like an MCU movie or everyone has to do their makeup like

335
00:23:05,820 --> 00:23:10,559
Kim Kardashian because we all now we all use Instagram and we all look at that

336
00:23:08,880 --> 00:23:17,039
one celebrity and we all want that face yeah so then all the data will just make

337
00:23:13,080 --> 00:23:19,020
this kind of art so therefore maybe the

338
00:23:17,039 --> 00:23:22,080
last remaining highly skilled artists will be the wackiest artists who are

339
00:23:20,580 --> 00:23:26,340
making the most original kind of stuff because it just looks so different from

340
00:23:24,480 --> 00:23:33,480
what is generated from the AI you know what might actually happen is uh

341
00:23:29,179 --> 00:23:34,860
carcinization carson-ization

342
00:23:33,480 --> 00:23:40,440
nope carcinization

343
00:23:37,500 --> 00:23:45,720
what is it it's the phenomenon wherein uh organisms

344
00:23:43,340 --> 00:23:49,510
multiple branches of organisms on evolutionary pathways uh all

345
00:23:48,059 --> 00:23:53,250
become crabs

346
00:23:53,460 --> 00:23:58,559
like we're going back to crab animals

347
00:23:56,520 --> 00:24:01,679
keep evolving into crabs and scientists well this one says scientists don't know

348
00:24:00,360 --> 00:24:05,940
why but they do know why it's because it's like advantageous that's the word

349
00:24:03,840 --> 00:24:09,720
in some way um so maybe like you know we'll have the

350
00:24:07,919 --> 00:24:14,039
equivalent of uh everything going back to crab all AIs

351
00:24:12,179 --> 00:24:17,360
eventually all AI music generators eventually create Crab Rave

352
00:24:21,780 --> 00:24:28,380
oh speaking of which we didn't even I mean real quick uh this isn't like

353
00:24:25,919 --> 00:24:33,419
entirely a new phenomenon they've been doing this with with music for some time

354
00:24:30,179 --> 00:24:35,940
now uh in that they feed in music to an

355
00:24:33,419 --> 00:24:40,620
AI it generates like Melodies and rhythms and stuff and in some cases it

356
00:24:38,400 --> 00:24:43,799
generates the audio directly but most of the time it doesn't sound very good so

357
00:24:42,419 --> 00:24:47,400
what they'll do instead is they'll feed music into like they'll say we want a

358
00:24:45,539 --> 00:24:51,600
Jimi Hendrix style song they'll feed a million Jimi Hendrix songs to an AI and

359
00:24:49,559 --> 00:24:55,679
it'll make midi data or like sheet music and then the humans will recreate it

360
00:24:53,580 --> 00:24:58,919
with like and in some cases it like writes lyrics and

361
00:24:57,299 --> 00:25:02,280
everything as well but then humans will do other people share my opinion that

362
00:25:00,780 --> 00:25:06,120
I'm less bothered by this I think this is less bothersome and the reason is

363
00:25:04,020 --> 00:25:11,580
popular music can't get any worse exactly I I think that I think that

364
00:25:08,760 --> 00:25:14,159
music is I mean that's a whole nother discussion I feel like we probably

365
00:25:12,840 --> 00:25:20,580
shouldn't get into it no it doesn't bother me that much but I will say that

366
00:25:16,020 --> 00:25:23,460
like you know it seems like with music

367
00:25:20,580 --> 00:25:27,360
human generated music will always be more important to

368
00:25:25,679 --> 00:25:30,360
people I don't think like I think because I think part of that is like

369
00:25:28,919 --> 00:25:33,779
sort of the personality the cult of personality around like an artist if you

370
00:25:32,039 --> 00:25:37,740
like like their music you're gonna be like wow I kind of want to listen to

371
00:25:35,460 --> 00:25:40,500
more of their music if it's an AI the AI's gonna have to generate some very

372
00:25:38,880 --> 00:25:45,900
personal histories in the lyrics yeah like are people gonna go to

373
00:25:43,380 --> 00:25:49,080
a concert no because live music is already differentiated from

374
00:25:47,400 --> 00:25:54,299
recorded music you go to see live music you go to see the phenomenon of

375
00:25:51,419 --> 00:25:57,960
this well but then in the age of EDM

376
00:25:55,620 --> 00:26:01,799
people that you don't go there to see you're not going there for the

377
00:25:59,159 --> 00:26:06,120
performance sure but I mean like you know more people are going to a show by

378
00:26:04,380 --> 00:26:10,559
I don't even know what the big EDM guy is now but like a few years ago maybe it

379
00:26:08,460 --> 00:26:15,179
was Deadmau5 like people go to the show because it's oh it's that guy

380
00:26:12,240 --> 00:26:19,140
that I like and not because the music is particularly really good it's because

381
00:26:17,159 --> 00:26:21,960
you liked a few of their music their tracks and so now like you have this

382
00:26:20,520 --> 00:26:25,200
idea in your head yeah we're not gonna do that for AI no no

383
00:26:23,760 --> 00:26:30,299
one's dreaming for art like no one's going or maybe it will I mean VTubers

384
00:26:27,000 --> 00:26:32,279
are a thing uh Miku Miku from uh

385
00:26:30,299 --> 00:26:36,360
Vocaloid or whatever there's these like you know virtual idols in Japan and

386
00:26:34,559 --> 00:26:43,380
whatnot they go on stage as a hologram they're dancing people are like I'm I'm

387
00:26:38,820 --> 00:26:45,120
a fan of this idol person oh and if

388
00:26:43,380 --> 00:26:48,960
there was just an AI but at the end of the day if you really

389
00:26:46,740 --> 00:26:52,080
wanted like a distinct sound there would have to be some sort of human input but

390
00:26:50,460 --> 00:26:55,679
that's not even true I take it back well actually because far in the future

391
00:26:53,880 --> 00:26:59,159
maybe there is like an AI that was like tuned with specific parameters and

392
00:26:57,120 --> 00:27:03,059
people were like I'm a fan or you just randomize the parameters

393
00:27:01,020 --> 00:27:07,559
like part of uh Stable Diffusion is you can make

394
00:27:05,700 --> 00:27:10,860
derivative products with it yeah as long as the license gets passed down you can

395
00:27:09,659 --> 00:27:15,240
make derivative products and you can tune some parameters such that the art

396
00:27:13,620 --> 00:27:18,980
that it creates is more of a certain style for sure and there's there's

397
00:27:16,799 --> 00:27:23,820
commercial services that do that like Soundful I just found this like right

398
00:27:21,419 --> 00:27:26,640
before we recorded yeah yeah there's a bunch of services like this but like

399
00:27:25,260 --> 00:27:31,799
Soundful you know you start in there you uh you know you choose a

400
00:27:29,640 --> 00:27:35,580
genre you customize it you like choose a beat you choose some instruments they're

401
00:27:33,480 --> 00:27:39,120
all pre-recorded pre-done and then you can like you know put it together and

402
00:27:37,200 --> 00:27:43,440
make something like quote unquote new you could make like a Nine Inch Nails AI

403
00:27:41,880 --> 00:27:47,640
you're like it really loves this mode and this interval use these instruments

404
00:27:46,080 --> 00:27:51,600
yeah whatever it puts out will sound Nails-y exactly exactly and the same

405
00:27:49,919 --> 00:27:56,460
thing can happen with visual art too oh man and if that's just randomized people

406
00:27:53,760 --> 00:28:01,140
could become fans of certain AIs that they really like their style right

407
00:27:58,440 --> 00:28:03,960
okay so last question and then we're done

408
00:28:01,860 --> 00:28:06,720
I know you really really hate this James you want to go I gotta get out of here

409
00:28:06,059 --> 00:28:13,740
um what do you think should be the situation what should we head

410
00:28:11,159 --> 00:28:19,260
towards as a society when it comes to like AI generated art should should we

411
00:28:16,320 --> 00:28:24,480
respect AI generated art as like you know on a somewhat equal playing

412
00:28:21,480 --> 00:28:26,279
field or should it be like it should it

413
00:28:24,480 --> 00:28:31,820
be as as um denormalized or de-specialized as you

414
00:28:29,880 --> 00:28:37,620
know I don't know some commodity that you can like mass-produce

415
00:28:34,260 --> 00:28:40,140
no problem I generally think uh like the

416
00:28:37,620 --> 00:28:43,320
technology is here so deal with it kind of approach like we shouldn't

417
00:28:41,940 --> 00:28:46,980
be trying to stop this or slow it down we have to just adapt the toothpaste is

418
00:28:45,539 --> 00:28:50,880
out of the tube for a lot of this stuff in terms of they already made this AI

419
00:28:49,380 --> 00:28:54,840
they already scraped these copyrighted images sorry Getty

420
00:28:53,220 --> 00:28:57,720
um but in terms of how we conceptualize it how we think of

421
00:28:56,700 --> 00:29:02,460
it um I think that it is similar to VFX

422
00:29:00,539 --> 00:29:04,380
you watch a movie that has great VFX you're like wow that was very enjoyable to

423
00:29:03,539 --> 00:29:10,320
watch and then you watch another movie and you go yeah that wasn't that crazy but

424
00:29:08,640 --> 00:29:13,679
you know they did all the stunts that's real that was all practical they

425
00:29:12,240 --> 00:29:18,480
actually climbed that building he actually jumped from that train when you

426
00:29:16,140 --> 00:29:22,559
know that a human did it you give it this extra level like it earns

427
00:29:20,640 --> 00:29:27,360
your respect on this other layer I think that's what we're gonna have to do with art hey that's a nice painting and wow a

428
00:29:25,799 --> 00:29:33,000
human did that from scratch that's amazing I don't see that that often oh

429
00:29:30,419 --> 00:29:36,179
you're gonna you think the the norm is going to be like AI art is ubiquitous

430
00:29:34,799 --> 00:29:40,500
and it's it's everything but then when there's something extra special yeah

431
00:29:38,760 --> 00:29:45,120
because this is like anything else this democratizes art more you

432
00:29:43,440 --> 00:29:49,440
know I suck at drawing but now I can make this it's like making

433
00:29:47,159 --> 00:29:53,039
music on your laptop so it's going to be everywhere it's going to take over I

434
00:29:51,659 --> 00:29:57,480
don't think it's like making music on your laptop because for

435
00:29:56,039 --> 00:30:01,500
that it's like okay somebody else recorded an instrument and gave you all

436
00:29:59,640 --> 00:30:05,159
this data that you can then manipulate using MIDI files everywhere you see

437
00:30:03,179 --> 00:30:08,820
stock photos today you're gonna see AI generated art when I make a Squarespace

438
00:30:06,899 --> 00:30:13,500
website I'm going to use original as a descriptive reality yes

439
00:30:11,760 --> 00:30:17,820
that's going to be true but I'm saying that uh you know from a

440
00:30:16,380 --> 00:30:22,140
normative framework it's not this is not the same as using

441
00:30:21,240 --> 00:30:27,779
like uh it's been a while since I

442
00:30:25,679 --> 00:30:30,840
meddled in uh electronic music making I forget what the actual files are called

443
00:30:28,980 --> 00:30:34,980
but like the digital sounds you're using a digital sound of a trombone and you

444
00:30:32,880 --> 00:30:39,360
press a key and it makes the sound the trombone sound for that note

445
00:30:37,559 --> 00:30:43,919
that's completely different because you're still generating the data you're

446
00:30:41,700 --> 00:30:48,419
still doing it by hand right I think that with AI art it should be

447
00:30:46,679 --> 00:30:54,899
the case that if you are an artist and you want your

448
00:30:52,200 --> 00:30:58,620
art to be included in these models you can

449
00:30:55,740 --> 00:31:03,539
submit them to the database and there should be a requirement that they

450
00:31:01,020 --> 00:31:07,500
uh you know pay you royalties and I know that the copy like our

451
00:31:05,159 --> 00:31:11,940
copyright system has lots of problems it is not perfect and that like this is

452
00:31:09,299 --> 00:31:15,840
gonna go along with a much needed you know reform of how we deal with

453
00:31:13,500 --> 00:31:19,559
copyright in general in the west we might just get rid of it just go more

454
00:31:17,760 --> 00:31:24,120
China style it's like it's irrelevant now the pace of innovation is just so

455
00:31:21,659 --> 00:31:27,000
fast that it well I think that it's going to be it's going to be that way

456
00:31:25,440 --> 00:31:31,440
for a while as we're in the wild west here but I think that in a few years as

457
00:31:29,940 --> 00:31:36,899
this becomes more and more normalized there's going to be people asking for

458
00:31:34,679 --> 00:31:41,340
you know to save artists in some way because they don't make money already if

459
00:31:39,120 --> 00:31:44,880
if this continues with no regulation

460
00:31:42,779 --> 00:31:49,320
artists are out of a job they're dead

461
00:31:46,320 --> 00:31:51,179
rip rip and rip to this episode because

462
00:31:49,320 --> 00:31:54,059
James I'd just like to give you your last word

463
00:31:52,799 --> 00:31:58,980
you're good you want the last word that's the last word it's rip

464
00:31:56,460 --> 00:32:04,260
you took it actually so that's your word my last word is thanks for watching talk

465
00:32:02,580 --> 00:32:08,880
link we'll be back we can't have these go too long rip Riley subscribe to

466
00:32:06,539 --> 00:32:12,419
techlinked subscribe to They're Just Movies

467
00:32:11,039 --> 00:32:15,380
see you later thank you for having me
