WEBVTT

00:00:00.060 --> 00:00:06.960
TalkLinked is back with the number

00:00:03.600 --> 00:00:08.400
one co-host uh as voted by you the

00:00:06.960 --> 00:00:12.420
viewers of America I feel like I haven't been on talking in like two and a half years well that's because we haven't

00:00:10.980 --> 00:00:17.039
done it in like two and a half years all right let's get into it but then we started doing it and guess what we're

00:00:14.880 --> 00:00:20.400
talking about today James uh AI generated art see this is why I love you

00:00:18.960 --> 00:00:27.300
on TalkLinked you just want to cut right to the point yeah don't waste the viewers' time

00:00:23.820 --> 00:00:30.500
information density let's go I

00:00:27.300 --> 00:00:33.600
just want to know how you're doing

00:00:30.500 --> 00:00:35.340
the reason we're here is because a news

00:00:33.600 --> 00:00:38.760
story just came out today about a synthetic media artist Jason Allen I

00:00:37.739 --> 00:00:44.100
think that actually came out yesterday but he won the Colorado State Fair Fine

00:00:41.879 --> 00:00:49.500
Arts competition in the Digital Arts category and good for him you know

00:00:46.920 --> 00:00:53.520
those judges got to catch up yeah uh wait why what do you mean

00:00:51.180 --> 00:00:56.820
because an AI generated his entry oh well that was the key that was the key

00:00:55.020 --> 00:01:04.079
detail that I hadn't gotten to yet yes the reason he won was because he isn't

00:01:00.059 --> 00:01:04.079
actually an artist at all arguably

00:01:06.260 --> 00:01:11.820
he used an AI called Midjourney and

00:01:09.900 --> 00:01:16.020
used a special prompt that he will be publishing quote unquote at a later date

00:01:13.560 --> 00:01:19.799
that's his whole process his 11 herbs and spices is that secret prompt

00:01:18.119 --> 00:01:24.659
right exactly what combination of text yielded this art piece yeah so so the

00:01:22.320 --> 00:01:29.820
whole story came about firstly because of this tweet that I'll click into

00:01:28.439 --> 00:01:33.420
um that someone was very upset about this

00:01:32.100 --> 00:01:39.240
but apparently um he goes by Sincarnate on Discord

00:01:36.060 --> 00:01:42.119
or whatever and he talked about his his

00:01:39.240 --> 00:01:47.579
win here and uh he's like hey that's so great look at this a personal

00:01:44.640 --> 00:01:51.060
project he's made using Midjourney and uh yeah people are upset about it

00:01:49.680 --> 00:01:56.100
James is AI art art

00:01:54.119 --> 00:02:01.140
first of all is it art and then we'll talk about whether you know you can take

00:01:57.899 --> 00:02:02.820
credit for it oh that's let's just start

00:02:01.140 --> 00:02:07.979
with the most philosophical and abstract question like I guess the crux of it for

00:02:06.240 --> 00:02:13.379
anyone who's squeamish to answer that with anything but a fast yes or no uh the

00:02:11.520 --> 00:02:17.099
that person was probably grappling with the question of does art need to be

00:02:15.300 --> 00:02:23.280
generated by humans really what it comes down to is it art uh

00:02:21.319 --> 00:02:27.239
in order to answer that question you have to you have to ask what is art well

00:02:25.620 --> 00:02:31.260
I think the thing with art is the fuzziness of it and when you're talking

00:02:28.860 --> 00:02:35.879
about a computer generating it it seems more like a computation this is an

00:02:33.660 --> 00:02:38.879
output it's deterministic the machine will create this output every single

00:02:37.379 --> 00:02:43.200
time because it's a computer but that's actually not true because these art

00:02:40.440 --> 00:02:47.700
generators they output multiple different iterations every time you hit

00:02:45.540 --> 00:02:51.599
enter right so even with the same text prompt you probably won't get this exact

00:02:49.140 --> 00:02:55.260
piece ever again so in that sense it still has that ephemeral quality of it

00:02:53.519 --> 00:03:00.540
and that fuzziness that makes it art see I would argue

00:02:58.319 --> 00:03:07.200
that's a good point thank you but I would argue that art has to have

00:03:04.080 --> 00:03:10.080
some sort of intentionality behind it

00:03:07.200 --> 00:03:15.720
because if it wasn't intended to be produced in a certain way then

00:03:12.959 --> 00:03:19.379
it may evoke feelings in the viewer or in the person

00:03:17.400 --> 00:03:24.060
experiencing it you know whether it's whether it's an image or music or

00:03:21.060 --> 00:03:26.879
whatever it may evoke certain feelings

00:03:24.060 --> 00:03:29.640
but you go and look at a beautiful mountain and you feel

00:03:28.440 --> 00:03:34.019
something that doesn't make the mountain art so you're saying that uh so there's

00:03:32.700 --> 00:03:39.000
an intentionality of the person creating the prompt this is a collaborative effort between the person

00:03:36.900 --> 00:03:42.180
at the keyboard and the AI yes and the person at the keyboard has an intention

00:03:40.319 --> 00:03:46.260
of what they want that's why they they create the string that they do in the

00:03:43.560 --> 00:03:50.940
way that they create it however there is a distance there where I just have to

00:03:48.959 --> 00:03:54.599
roll the dice and see what the AI is going to output right it just gives you

00:03:52.860 --> 00:03:58.739
a random like uh here's a few things yeah and so I don't I don't control that

00:03:56.940 --> 00:04:02.580
I can try to approximate it with my words but then

00:04:01.140 --> 00:04:05.879
the thing actually doing the generation of the art the AI it doesn't have

00:04:04.200 --> 00:04:08.940
intentionality well its intention is to try to closely you know produce

00:04:07.799 --> 00:04:14.220
something that matches what you've said but it doesn't have a it's not trying to make a statement you know it doesn't so

00:04:13.019 --> 00:04:21.120
this collaborative effort the intentionality is kind of severed so exactly that's why I think it's not art

00:04:18.299 --> 00:04:24.900
yeah I feel like you need to have well so I I think the the thing that just

00:04:23.160 --> 00:04:28.680
popped into my head was like a Choose Your Own Adventure story

00:04:26.580 --> 00:04:34.020
or like a multiple choice test let's go with Choose Your Own Adventure uh an

00:04:31.139 --> 00:04:38.160
author has written out a number of possible branching paths for a story to

00:04:36.479 --> 00:04:41.100
take and as you're reading this choose your own I'm thinking about the books

00:04:40.020 --> 00:04:47.639
but like that's not really a thing anymore maybe it is let's say it's a Netflix thing uh just keep it

00:04:45.360 --> 00:04:52.500
as books sure it's one of these books that used to exist

00:04:49.620 --> 00:04:55.919
um and if you you by reading through the book and then being like hmm these are

00:04:54.479 --> 00:05:01.919
the options presented to me by this person that actually generated something

00:04:57.479 --> 00:05:03.840
and I'm gonna choose this one and

00:05:01.919 --> 00:05:09.720
okay now you're going along this path but that doesn't mean that you generated

00:05:05.880 --> 00:05:11.699
that story you just put in an input

00:05:09.720 --> 00:05:16.259
you're putting in some inputs to some system the system is generating right

00:05:13.860 --> 00:05:22.440
but so this system just has myriad infinitely more inputs and even

00:05:18.840 --> 00:05:24.300
more problematically uh the system is

00:05:22.440 --> 00:05:28.199
based on work that was actually originally created by real human artists

00:05:26.940 --> 00:05:34.080
well this is a really interesting part of it yeah so it's like

00:05:31.919 --> 00:05:38.820
I you know I don't want to say like I'm not trying to be here being like this

00:05:35.580 --> 00:05:40.139
new technology that is you know uh I'm

00:05:38.820 --> 00:05:43.440
not trying to hate on like new technology just because it's new but I

00:05:42.120 --> 00:05:47.039
think that like while it is exciting and while it is a

00:05:45.720 --> 00:05:51.000
very interesting technology that we should explore I think that right now

00:05:48.840 --> 00:05:54.060
it's kind of like the wild west where there isn't any regulation people are

00:05:52.979 --> 00:05:58.380
still asking these questions we're having a TalkLinked right now talking

00:05:55.979 --> 00:06:01.800
about it uh so we need to have these conversations and then like break it

00:06:00.479 --> 00:06:07.680
down into what does this mean for copyright what does this mean for the

00:06:04.800 --> 00:06:10.860
future of artistry as like a career or even competitions or even competitions

00:06:09.479 --> 00:06:15.000
like do you have to film yourself for creating the art now and submit that as

00:06:13.199 --> 00:06:19.979
well as your competition entry which is what one person on

00:06:16.979 --> 00:06:23.039
a Reddit thread suggested in response

00:06:19.979 --> 00:06:25.800
to this fact that this guy like won this

00:06:23.039 --> 00:06:29.819
art competition using AI art and the key detail is that he did not

00:06:27.720 --> 00:06:33.600
disclose that the details seem a little murky right now but at least

00:06:31.440 --> 00:06:37.979
some of the judges have said that they didn't know it was AI art and he has

00:06:36.060 --> 00:06:41.759
said himself that he's like oh wait I needed to let them know that was in the

00:06:40.259 --> 00:06:45.000
vice article I believe at the at the bottom

00:06:43.500 --> 00:06:50.360
um I mean I don't know anything about this person's intentions but it's pretty obvious that it wasn't in the spirit of

00:06:48.240 --> 00:06:54.960
the competition exactly exactly I mean I can understand

00:06:53.460 --> 00:07:01.560
this is almost like a form of performance art like it's a stunt exactly exactly I can understand him

00:06:58.979 --> 00:07:06.120
saying you know okay I'm gonna I'm gonna I want to bring attention to this yeah I

00:07:03.780 --> 00:07:10.979
want to show what this tool mid-journey can do so I'm going to stealthily enter

00:07:08.280 --> 00:07:16.740
uh with my AI art and then when I win I'm gonna be like haha look at this I

00:07:14.520 --> 00:07:19.620
this is actually AI art now I've started a conversation and now we're

00:07:18.000 --> 00:07:23.340
talking about this but then I think what he should have done is

00:07:21.840 --> 00:07:26.580
be like okay but I won't actually take your prize money yeah yeah because

00:07:25.380 --> 00:07:32.460
now you have these people who actually created Art losing and they're like

00:07:30.060 --> 00:07:36.900
I think that makes it way crappier that makes it way stinkier of a situation

00:07:35.160 --> 00:07:40.500
because it's a stinky stinky because there are people who are outraged and

00:07:38.280 --> 00:07:43.380
they're and they're disappointed that the world's going this way and they're

00:07:41.880 --> 00:07:49.139
worried about the future and what this means for artists around the world and the volume of art that will be created

00:07:46.319 --> 00:07:55.319
by humans going forward and it's kind of dystopian but that dystopian

00:07:52.919 --> 00:07:58.380
viewpoint is greatly enhanced by people being crappy like this exactly what do

00:07:57.479 --> 00:08:01.919
you mean this viewpoint is valid just like everyone else's like yeah no dude I think

00:08:00.960 --> 00:08:06.120
that like it's happened before there's

00:08:04.080 --> 00:08:11.240
always this pushback this hard backlash right when something cool

00:08:08.280 --> 00:08:15.000
happens like say blockchain technology and people are like okay cool we're

00:08:13.740 --> 00:08:20.580
building this technology in the early days and it's like oh enter the scammers enter the scammers and the pump

00:08:18.419 --> 00:08:24.240
and dumpers and now you know there's all these scams out

00:08:22.440 --> 00:08:27.900
there and there's a huge backlash of being like cryptocurrency is evil and

00:08:26.340 --> 00:08:30.840
bad and it should never be used and it can only be used for scams and pump and dump stuff

00:08:29.400 --> 00:08:34.200
stuff and then you have people being like okay wait but remember in the early

00:08:32.880 --> 00:08:37.860
days there was this promise and we could use it for that still but the well is

00:08:36.120 --> 00:08:40.440
poisoned yeah and I think that's happening a little bit with AI right

00:08:39.240 --> 00:08:44.700
it's going to be a way different outcome though because the AI art is just it's

00:08:43.140 --> 00:08:48.420
much it's simpler it's more straightforward the future is here now

00:08:46.320 --> 00:08:53.399
deal with it now you know what I mean like uh all graphic artists

00:08:51.300 --> 00:08:58.440
you're in trouble right within the next two years uh stock imagery sites you're

00:08:56.640 --> 00:09:02.700
in trouble yeah this is going to upend industries and change laws for sure you

00:09:00.540 --> 00:09:07.380
know and and it's and it's moving super rapidly like this technology at the

00:09:05.399 --> 00:09:12.600
beginning of this year was super super rudimentary and it wasn't available and

00:09:09.420 --> 00:09:14.279
then OpenAI released DALL-E 2 uh

00:09:12.600 --> 00:09:17.760
Google released their thing that I forget what it's called Imagen I

00:09:15.720 --> 00:09:22.560
think it's called and now you go and there's lists of like the 10 coolest uh

00:09:20.940 --> 00:09:27.180
AI art generators well it's getting even crazier because DALL-E had some built-in kind of

00:09:25.140 --> 00:09:32.459
filters where you can't do a human's face right you

00:09:29.640 --> 00:09:37.140
can't do pornography right uh whereas Stability AI yeah Stable Diffusion yeah

00:09:35.399 --> 00:09:42.779
it was released open source it has some filters

00:09:39.779 --> 00:09:44.459
that are enabled by default but that's

00:09:42.779 --> 00:09:47.459
just so you don't type something in and get a result that's pornographic and

00:09:46.080 --> 00:09:51.420
you're like oh I didn't want that right but you can disable this filter

00:09:49.740 --> 00:09:55.380
and you can you can create celebrity likenesses you can make like I've seen a

00:09:53.459 --> 00:09:58.440
bunch of pictures of Charlize Theron's face I knew it was her there was no

00:09:57.000 --> 00:10:04.860
label saying it was Charlize Theron yeah I could tell it's her in multiple styles and angles right uh and so it's

00:10:02.640 --> 00:10:10.320
what does this mean yeah exactly this is you can run this on your desktop GPU

00:10:08.160 --> 00:10:13.380
it only supports NVIDIA GPUs right now but I don't have to rely on a cloud

00:10:11.880 --> 00:10:17.100
service for this yeah I mean like when deep fakes first came out people were

00:10:15.600 --> 00:10:20.820
like oh my gosh people are using their GPUs to like run these computations and

00:10:18.899 --> 00:10:24.360
do it at home and that was one thing because if you didn't really have access

00:10:22.380 --> 00:10:28.800
to the hardware to like run those simulations over and over uh or run

00:10:26.940 --> 00:10:32.459
those computations then you wouldn't be able to make deep fakes but now all of

00:10:30.480 --> 00:10:36.300
these AI generators are on the web you can just go to a URL and put in a prompt

00:10:34.380 --> 00:10:39.899
and get images and so this is a TechCrunch article talking about what we

00:10:38.279 --> 00:10:45.420
were the the generator we were just mentioning stable diffusion

00:10:42.300 --> 00:10:48.480
um I think it's it's in a program called

00:10:45.420 --> 00:10:50.940
dream AI or something I forget what it's

00:10:48.480 --> 00:10:56.519
called Uh but you know people are using it to make porn of existing people and

00:10:54.660 --> 00:11:00.839
upload it to 4chan lovely site and it's just images

00:10:59.040 --> 00:11:04.440
for now what's the difference with deepfakes this is just images for now not

00:11:02.579 --> 00:11:08.579
video but it'll be video within two or three years for sure for sure I mean yeah like

00:11:06.240 --> 00:11:12.000
there's there's deep fake videos and yeah that I'm sure there will be

00:11:10.019 --> 00:11:16.320
generators uh quite soon like have you watched like that really low quality

00:11:13.880 --> 00:11:20.700
animated children's content on YouTube like not to throw Little Baby Bum

00:11:18.899 --> 00:11:25.500
under the bus but Little Baby Bum is just nursery rhyme songs with just

00:11:23.040 --> 00:11:29.160
like pretty bad animation a lot of 3D animation along with it and kids love it

00:11:27.120 --> 00:11:32.160
my kid watched it for a year and a half straight every day all day like just

00:11:30.959 --> 00:11:39.000
love it um in the future you could potentially just play the song and just have the

00:11:36.480 --> 00:11:42.000
animation generated and just put that onto your YouTube channel even less work

00:11:40.680 --> 00:11:45.959
and you can monetize that YouTube channel and get rich very easily yeah I

00:11:44.339 --> 00:11:51.000
don't doubt that that is the future I mean uh that's video stuff and we

00:11:49.440 --> 00:11:54.959
have to watch out for that coming but I mean yeah the still images are being

00:11:52.500 --> 00:11:59.339
used right now this is an Atlantic article from earlier this month I love

00:11:57.420 --> 00:12:04.260
the Atlantic I I do as well you subscribe no no you should uh but I when

00:12:02.760 --> 00:12:10.140
people send a link and it's an Atlantic link I'm like okay a person of taste

00:12:07.620 --> 00:12:14.040
anyway what's in the article um this uh you know it's a it's a regular article

00:12:11.760 --> 00:12:18.180
no AI was uh involved in the creation of the words but that's a

00:12:15.360 --> 00:12:21.959
Midjourney-created image uh it says the prompt was Alex Jones inside an American

00:12:20.100 --> 00:12:25.800
office under fluorescent lights that's the perfect use for that yeah and there

00:12:23.519 --> 00:12:29.399
goes the graphic designer's job right so that is something that would be

00:12:27.420 --> 00:12:34.380
traditionally I mean yeah this is like a perfect example like it's real world

00:12:30.899 --> 00:12:36.120
this is a lost gig I think graphic

00:12:34.380 --> 00:12:39.600
designers will continue to exist and I think they will use these tools you know

00:12:37.920 --> 00:12:42.180
when you're specking out a job you're probably going to go all right well

00:12:40.920 --> 00:12:46.200
here's five different things that I created in the last 10 minutes

00:12:44.459 --> 00:12:49.680
um using one of these tools right uh which one do you like right you like

00:12:48.060 --> 00:12:52.620
this one okay now I'll go and make a better version of that exactly because a

00:12:51.480 --> 00:12:56.160
lot of these things when you zoom in they're actually not that nice so you

00:12:54.720 --> 00:13:02.040
might not use it for like your corporate logo or something like that although there are services which use AI to

00:13:00.360 --> 00:13:06.000
generate logos you put in like your company name some details about you your

00:13:03.720 --> 00:13:11.220
industry uh the style you maybe want and they will create an AI-generated logo for

00:13:09.660 --> 00:13:15.180
you and I'm sure like there are options and then I'm sure you can yeah depending

00:13:12.839 --> 00:13:18.300
on that AI like who has the copyright because with DALL-E DALL-E retains the

00:13:17.160 --> 00:13:24.360
copyright to the images that you generate right which is weird kind of

00:13:21.060 --> 00:13:25.860
DALL-E DALL-E retains yeah it does yeah

00:13:24.360 --> 00:13:28.680
not you see but that's the thing is that like

00:13:27.540 --> 00:13:33.540
should they have copyright at all because what is DALL-E trained on it's trained on all of these other artists

00:13:32.160 --> 00:13:38.880
that initially created the work the whole internet like it's just a general web scrape even of paid content like

00:13:37.200 --> 00:13:42.240
Shutterstock where you need to have a subscription and then they just take

00:13:40.500 --> 00:13:46.139
those billions of things and monetize it while putting Shutterstock out of

00:13:43.740 --> 00:13:50.279
business oh my God that's totally not fair I think in the future it could be

00:13:47.399 --> 00:13:54.540
possible that anyone who wants to uh create one of these models using a

00:13:52.380 --> 00:14:00.120
corpus of imagery will have to license all that imagery to feed the model but

00:13:57.600 --> 00:14:04.440
like how do you enforce that yeah I mean we need regulation which is

00:14:02.760 --> 00:14:07.860
it which is kind of funny because this is like on the

00:14:06.300 --> 00:14:13.019
cutting edge of what's going on right now and to even think about

00:14:10.860 --> 00:14:18.740
to even think about how long it'll be before a bunch of old people in you know

00:14:16.200 --> 00:14:22.860
the US Congress or senate or elsewhere uh start to like become familiar with

00:14:21.360 --> 00:14:26.339
this as a phenomenon and then craft legislation around it it's gonna be a

00:14:24.660 --> 00:14:30.360
while yeah it's not gonna happen and well it could happen it'll just be a

00:14:28.200 --> 00:14:36.060
couple years yeah it's gonna be a rocky road there's other legal aspects as well

00:14:31.860 --> 00:14:38.220
like you can create an artwork of a

00:14:36.060 --> 00:14:43.980
celebrity's face you could sell a painting of Morgan Freeman and could I

00:14:41.760 --> 00:14:47.040
yes but it's kind of like fair use it's a gray area you have to make like an argument

00:14:45.360 --> 00:14:52.380
for it a little bit so for example if you if you created a work where the the

00:14:50.399 --> 00:14:56.160
work completely just looked like a photo of Morgan

00:14:53.820 --> 00:14:59.459
Freeman then uh you probably have to get the permission to sell that but if you

00:14:57.600 --> 00:15:03.360
created a work in an Andy Warhol style where the work isn't just that

00:15:02.040 --> 00:15:09.959
it's his photo it's not just his likeness that makes the work cool it's that you stylized it right and it came

00:15:07.560 --> 00:15:13.740
from raw materials like paint uh then you don't even need to get this person's

00:15:11.760 --> 00:15:16.860
permission especially if it's just a one-off like you're you're making

00:15:15.300 --> 00:15:21.540
caricatures or something on the street in Vegas like rather than making this

00:15:19.800 --> 00:15:25.079
one art piece I'm gonna make a bazillion and sell them all right if you if you

00:15:23.399 --> 00:15:28.500
were doing that with a picture of Morgan Freeman's face you'd probably run into

00:15:26.760 --> 00:15:33.060
some sort of yeah it's similar to fair use in that one of the pillars of it is

00:15:30.480 --> 00:15:37.199
does it materially impact that person's ability to monetize their likeness right

00:15:35.519 --> 00:15:40.260
right right right if you make one painting of Morgan Freeman and sell it

00:15:38.880 --> 00:15:44.040
to one person I mean depending on how expensive it is

00:15:42.180 --> 00:15:49.560
maybe Morgan Freeman would be like hey wait it's a question because

00:15:46.019 --> 00:15:52.019
now you've got this thing that can

00:15:49.560 --> 00:15:56.760
create many thousands of images of Morgan Freeman right but it's only

00:15:53.699 --> 00:15:58.560
creating them you made one and sold it

00:15:56.760 --> 00:16:01.199
and that it was inconsequential yeah because Riley just sold one so Morgan

00:15:59.880 --> 00:16:06.000
Freeman's not gonna sue no Morgan Freeman's not gonna sue you yeah no one's paying millions of dollars

00:16:04.079 --> 00:16:09.420
for my art but what if millions of individuals each make their own Morgan

00:16:07.500 --> 00:16:12.420
Freeman thing so it does materially impact his ability to monetize his

00:16:10.800 --> 00:16:17.339
likeness right but there's no individual to attack as each individual only sold

00:16:14.040 --> 00:16:18.839
it once so do you attack the open source

00:16:17.339 --> 00:16:26.040
well honestly that's kind of the question we're faced with here because these AI generators are

00:16:24.000 --> 00:16:29.699
mass producing now that millions of people have access to

00:16:28.139 --> 00:16:33.660
these online they sign up for the waitlist there's not even like the

00:16:31.980 --> 00:16:36.899
like the big ones the really good ones you have to sign up for like DALL-E you had

00:16:35.220 --> 00:16:41.519
to register and wait until you get access Stable Diffusion well same thing

00:16:40.139 --> 00:16:46.320
but they publicly released it now stable yeah it's publicly released there are

00:16:43.440 --> 00:16:49.980
there are tens of these dozens of these more online that you can just

00:16:48.480 --> 00:16:54.000
go to a URL and put it in you don't need to wait at all so now there are millions

00:16:51.899 --> 00:16:59.519
of people potentially using these AI art generators to generate

00:16:56.519 --> 00:17:02.040
countless works of art based on whatever

00:16:59.519 --> 00:17:05.100
yeah original art pieces by human people yeah that's the other thing because

00:17:03.600 --> 00:17:09.959
they're not coming from raw material yeah it's not like your paint on the canvas the materials are coming from

00:17:07.980 --> 00:17:14.459
these other presumably copyrighted works oftentimes I just

00:17:12.000 --> 00:17:18.600
had a conversation with David uh prior to this about

00:17:16.079 --> 00:17:23.339
uh you know whether this is like plagiarism or not and I'm like well okay

00:17:20.760 --> 00:17:28.919
so it's not plagiarism because you're using these people's art as sort of like

00:17:25.319 --> 00:17:31.080
a training tool for AI to generate

00:17:28.919 --> 00:17:34.980
something new right so if plagiarism is just copying it

00:17:33.660 --> 00:17:39.419
you know that's on one end of the spectrum and on the other side maybe is

00:17:37.260 --> 00:17:44.039
you know looking at it from the frame of when you paint you're using materials

00:17:42.539 --> 00:17:49.380
that you didn't make from scratch you know you're buying these this existing

00:17:45.960 --> 00:17:51.120
pigment and uh you know making something

00:17:49.380 --> 00:17:54.419
completely new from it but you didn't make it completely from scratch there's

00:17:52.740 --> 00:17:58.380
other people's work going into that as well so like that's on the other end of

00:17:56.160 --> 00:18:02.580
the spectrum and AI art is kind of somewhere in the middle but

00:18:00.360 --> 00:18:06.480
I'm inclined to put it a little bit more towards plagiarism it's not plagiarism

00:18:04.559 --> 00:18:11.520
well it's all arbitrary it all comes down to the abilities of the entity you

00:18:09.360 --> 00:18:17.400
know it's not plagiarism if I do a perfect replica of um Starry Night

00:18:14.820 --> 00:18:21.960
because yeah obviously I was influenced by the original work and then I had to

00:18:19.260 --> 00:18:25.440
do all this analog very high skill work to like mix the paint correctly right

00:18:23.520 --> 00:18:30.780
and years of skill for me to actually render it yeah yeah but all that stuff

00:18:28.679 --> 00:18:35.220
is trivial to a machine right so it's only because the

00:18:33.660 --> 00:18:40.140
entity in question the machine is better at doing it than you are but that's but

00:18:36.960 --> 00:18:43.440
like I guess that's the

00:18:40.140 --> 00:18:46.679
that's the question is there value in a

00:18:43.440 --> 00:18:49.260
machine learning to do this skill should

00:18:46.679 --> 00:18:54.780
we like because it's not an agent it's not a conscious agent I mean according

00:18:51.660 --> 00:18:57.120
to some anyway when you use

00:18:54.780 --> 00:19:01.260
Midjourney you're interacting with a bot it's not like you're on a Google

00:18:59.160 --> 00:19:05.100
like page with a search bar you're in Discord and you're interacting with the

00:19:03.480 --> 00:19:08.940
bot you type the prompt you want it returns an image and then you can say

00:19:06.720 --> 00:19:11.700
make more of them or change it in this way and it's like

00:19:10.679 --> 00:19:16.020
you're having a conversation with an artist in a way and you're collaborating you're kind of collaborating but that's

00:19:14.039 --> 00:19:21.539
just an artifice the bot is just a window into using the

00:19:19.500 --> 00:19:26.580
service yeah for sure for now soon it'll be Scarlett Johansson

00:19:22.980 --> 00:19:27.840
soon they'll add actual like AI chat bot

00:19:26.580 --> 00:19:32.640
functionality in there and then you're talking to it it'll be very weird through

00:19:30.240 --> 00:19:38.100
the AI yes and they're like okay so when you say a bird flying over a ship do you

00:19:36.000 --> 00:19:41.360
mean like a big ship or like a little ship like this is the AI talking to you

00:19:39.960 --> 00:19:46.140
about it yeah and then doing that by voice if we like

00:19:43.860 --> 00:19:50.280
combine these technologies yes because that's even scarier like it's one thing

00:19:47.760 --> 00:19:54.980
to say all right I work in an auto factory a car factory and

00:19:53.039 --> 00:19:59.280
they're replacing my job you know screwing in these rivets with a

00:19:57.539 --> 00:20:03.840
robotic arm that's one thing another thing is saying okay you have

00:20:01.919 --> 00:20:08.940
this creative career you thought you were safe from the AI for a while but

00:20:06.299 --> 00:20:15.480
now literally your entire job like consultation uh sample work for

00:20:12.840 --> 00:20:19.140
inspiration or whatever uh you know any other type of parameters that you would

00:20:16.860 --> 00:20:23.039
discuss with a person ahead of time you can do with AI because we have these

00:20:20.820 --> 00:20:27.600
language AIs and then we also have these art-generating AIs yeah so even if

00:20:25.500 --> 00:20:30.900
even if AI doesn't achieve this like general sentience that

00:20:29.580 --> 00:20:35.460
is like far in the future with the Singularity et cetera et cetera even if we

00:20:33.059 --> 00:20:40.140
don't achieve that it's like we're functionally going to achieve

00:20:37.559 --> 00:20:44.100
something very similar to it because of all these other systems that are coming

00:20:41.400 --> 00:20:48.600
together yeah scary well it begs the question like what happens to artists so

00:20:45.480 --> 00:20:50.400
does it mean that you've got it I think

00:20:48.600 --> 00:20:54.179
look it'll stratify it like it'll make it more disparate

00:20:52.380 --> 00:20:58.200
like you'll have billionaires and impoverished people uh so to speak

00:20:55.980 --> 00:21:01.440
you'll you'll have the artists or graphic designers who are only using

00:20:59.520 --> 00:21:06.120
these tools and you'll have the like the very high level highly skilled artists

00:21:05.280 --> 00:21:12.660
um but the I mean this I guess it's good to be one

00:21:09.600 --> 00:21:15.059
of them but they'll be few they'll be so

00:21:12.660 --> 00:21:19.200
few of them right yeah well uh you know this was a reply to the

00:21:17.280 --> 00:21:24.780
original tweet where someone kind of blew the whistle on this guy uh

00:21:21.660 --> 00:21:26.400
bragging about his achievement uh

00:21:24.780 --> 00:21:29.940
Omni Morpho on Twitter says we're watching the death of artistry unfold

00:21:27.840 --> 00:21:32.880
right before our eyes um even high skilled jobs you know

00:21:31.679 --> 00:21:38.640
you're saying oh there will be some high skilled artists but like even those guys I mean maybe they'll be like a handful

00:21:36.299 --> 00:21:44.159
you know like literally the thing you have to think about though is the

00:21:39.720 --> 00:21:47.400
convergence or the homogenization of

00:21:44.159 --> 00:21:50.340
art because right now all of these

00:21:47.400 --> 00:21:54.480
generators are using a data set that is virgin that is created by humans all the

00:21:52.440 --> 00:21:58.740
all the info that's feeding them was photographs taken by humans art pieces

00:21:56.520 --> 00:22:03.419
drawn by humans but now that millions of pieces are coming out as output from

00:22:01.200 --> 00:22:09.299
these generators that means the internet at large is growing in its proportion

00:22:06.600 --> 00:22:12.720
of stuff that was made by robots right and so if that continues to expand to

00:22:11.520 --> 00:22:18.720
the point where like half the stuff of the internet was made by robots yeah then when the next AI is

00:22:16.320 --> 00:22:24.059
trained what's fed into it is outputs from robots it'll become more and more

00:22:21.419 --> 00:22:27.900
like it's self-referential yeah this was Horst I was talking to him

00:22:25.620 --> 00:22:33.299
before as well and he specifically asked me to bring this up like he he likened

00:22:30.360 --> 00:22:37.520
it to re-encoding videos the more that you re-encode the same video file the

00:22:35.159 --> 00:22:42.179
more the quality degrades so it's like eventually are these AI art generators

00:22:40.620 --> 00:22:47.820
just going to be you know feeding off themselves again and again and again until it's just like a blurry mess uh so
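The re-encoding analogy can be sketched with a toy experiment (a deliberately simplified stand-in, not any real image model): each "generation" fits a normal distribution to the previous generation's output, samples from its own fit, and keeps only the most typical samples, mimicking a generator favoring high-likelihood outputs. The spread of the data collapses, like quality loss on each re-encode:

```python
import random
import statistics

# Toy model-collapse sketch: generation N trains only on
# generation N-1's curated output, and diversity (sigma) shrinks.
random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(200)]  # human-made originals
sigmas = []
for generation in range(8):
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    sigmas.append(sigma)
    samples = [random.gauss(mu, sigma) for _ in range(200)]
    # curation step: keep the 150 samples closest to the mean,
    # i.e. the "safest" most typical outputs
    samples.sort(key=lambda x: abs(x - mu))
    data = samples[:150]
print("sigma per generation:", [round(s, 3) for s in sigmas])
```

The curation step is the assumption doing the work here; without some bias toward typical outputs the spread would drift rather than collapse.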

00:22:46.260 --> 00:22:52.740
I guess I don't think that I think I mean like stylistically like to me I

00:22:49.919 --> 00:22:55.740
liken it as um this is a pet theory I guess it won't exactly sorry it

00:22:54.120 --> 00:23:00.240
won't like yeah it won't be the same as like the resolution is worse but maybe

00:22:58.500 --> 00:23:04.080
it would be like um Hollywood taking over the world and all film is just like

00:23:01.980 --> 00:23:07.500
every movie has to be like an MCU movie or everyone has to do their makeup like

00:23:05.820 --> 00:23:10.559
Kim Kardashian because we all now we all use Instagram and we all look at that

00:23:08.880 --> 00:23:17.039
one celebrity and we all want that face yeah so then all the DALL-Es will just make

00:23:13.080 --> 00:23:19.020
this kind of art so therefore maybe the

00:23:17.039 --> 00:23:22.080
last remaining highly skilled artists will be the wackiest artists who are

00:23:20.580 --> 00:23:26.340
making the most original kind of stuff because it just looks so different from

00:23:24.480 --> 00:23:33.480
what is generated from the AI you know what might actually happen is uh

00:23:29.179 --> 00:23:34.860
carcinization car- carsonization?

00:23:33.480 --> 00:23:40.440
nope carcinization

00:23:37.500 --> 00:23:45.720
what is it it's the phenomenon wherein uh organisms

00:23:43.340 --> 00:23:49.510
multiple branches of organisms on evolutionary pathways uh all

00:23:48.059 --> 00:23:53.250
become crabs

00:23:53.460 --> 00:23:58.559
like we're going back to crab animals

00:23:56.520 --> 00:24:01.679
keep evolving into crabs and scientists well this one says scientists don't know

00:24:00.360 --> 00:24:05.940
why but they do know why it's because it's like advantageous that's the word

00:24:03.840 --> 00:24:09.720
in some way um so maybe like you know we'll have the

00:24:07.919 --> 00:24:14.039
equivalent of uh everything going back to crab all AIs

00:24:12.179 --> 00:24:17.360
eventually all AI music generators eventually create Crab Rave

00:24:21.780 --> 00:24:28.380
oh speaking of which we didn't even I mean real quick uh this isn't like

00:24:25.919 --> 00:24:33.419
entirely a new phenomenon they've been doing this with with music for some time

00:24:30.179 --> 00:24:35.940
now uh in that they feed in music to an

00:24:33.419 --> 00:24:40.620
AI it generates like melodies and rhythms and stuff and in some cases it

00:24:38.400 --> 00:24:43.799
generates the audio directly but most of the time it doesn't sound very good so

00:24:42.419 --> 00:24:47.400
what they'll do instead is they'll feed music into like they'll say we want a

00:24:45.539 --> 00:24:51.600
Jimi Hendrix-style song they'll feed a million Jimi Hendrix songs to an AI and

00:24:49.559 --> 00:24:55.679
it'll make MIDI data or like sheet music and then the humans will recreate it

00:24:53.580 --> 00:24:58.919
with like and in some cases it writes lyrics and

00:24:57.299 --> 00:25:02.280
everything as well but then humans will do other people share my opinion that
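The pipeline described here, feed songs in and get note data out, can be sketched very loosely as a first-order Markov chain over MIDI note numbers (this is a toy illustration, not how any named product works; the example melodies are made up):

```python
import random
from collections import defaultdict

# Learn note-to-note transitions from example melodies
# (MIDI note numbers), then sample a new melody in the same "style".
corpus = [
    [60, 62, 64, 65, 67, 65, 64, 62, 60],  # C-major run
    [60, 64, 67, 64, 60, 62, 64, 62, 60],  # arpeggio figure
]
transitions = defaultdict(list)
for melody in corpus:
    for a, b in zip(melody, melody[1:]):
        transitions[a].append(b)

random.seed(1)
note = 60  # start on the tonic
generated = [note]
for _ in range(8):
    # pick a next note in proportion to how often it followed this one
    note = random.choice(transitions[note])
    generated.append(note)
print(generated)
```

The output note list is the analogue of the MIDI data James mentions; a human (or a synthesizer) would still have to turn it into actual audio.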

00:25:00.780 --> 00:25:06.120
I'm less bothered by this I think this is less bothersome and the reason is

00:25:04.020 --> 00:25:11.580
popular music can't get any worse exactly I I think that I think that

00:25:08.760 --> 00:25:14.159
music is I mean that's a whole nother discussion I feel like we probably

00:25:12.840 --> 00:25:20.580
shouldn't get into it no it doesn't bother me that much but I will say that

00:25:16.020 --> 00:25:23.460
like you know it seems like with music

00:25:20.580 --> 00:25:27.360
human-generated music will always be more important to

00:25:25.679 --> 00:25:30.360
people and I think part of that is like

00:25:28.919 --> 00:25:33.779
sort of the personality the cult of personality around like an artist if you

00:25:32.039 --> 00:25:37.740
like their music you're gonna be like wow I kind of want to listen to

00:25:35.460 --> 00:25:40.500
more of their music if it's an AI the AI's gonna have to generate some very

00:25:38.880 --> 00:25:45.900
personal histories in the lyrics yeah like are people gonna go to

00:25:43.380 --> 00:25:49.080
a concert no because live music is already differentiated from

00:25:47.400 --> 00:25:54.299
recorded music you go to see live music you go to see the phenomenon of

00:25:51.419 --> 00:25:57.960
this well but then in the age of EDM

00:25:55.620 --> 00:26:01.799
people you don't go there to see the

00:25:59.159 --> 00:26:06.120
performance sure but I mean like you know more people are going to a show by

00:26:04.380 --> 00:26:10.559
I don't even know what the big EDM guy is now but like a few years ago maybe it

00:26:08.460 --> 00:26:15.179
was deadmau5 like people go to the show because it's oh it's that guy

00:26:12.240 --> 00:26:19.140
that I like and not because the music is particularly really good it's because

00:26:17.159 --> 00:26:21.960
you liked a few of their tracks and so now like you have this

00:26:20.520 --> 00:26:25.200
idea in your head yeah we're not gonna do that for AI no no

00:26:23.760 --> 00:26:30.299
one's dreaming for art like no one's going or maybe it will I mean VTubers

00:26:27.000 --> 00:26:32.279
are a thing uh Miku Miku from uh

00:26:30.299 --> 00:26:36.360
Vocaloid or whatever there's these like you know virtual idols in Japan and

00:26:34.559 --> 00:26:43.380
whatnot they go on stage as a hologram they're dancing people are like I'm

00:26:38.820 --> 00:26:45.120
a fan of this idol person oh and if

00:26:43.380 --> 00:26:48.960
there was just an AI but at the end of the day if you really

00:26:46.740 --> 00:26:52.080
wanted like a distinct sound there would have to be some sort of human input but

00:26:50.460 --> 00:26:55.679
that's not even true I take it back well actually because far in the future

00:26:53.880 --> 00:26:59.159
maybe there is like an AI that was like tuned with specific parameters and

00:26:57.120 --> 00:27:03.059
people were like I'm a fan or you just randomize the parameters

00:27:01.020 --> 00:27:07.559
like part of uh Stable Diffusion is you can make

00:27:05.700 --> 00:27:10.860
derivative products with it yeah as long as the license gets passed down you can

00:27:09.659 --> 00:27:15.240
make derivative products and you can tune some parameters such that the art

00:27:13.620 --> 00:27:18.980
that it creates is more of a certain style for sure and there's

00:27:16.799 --> 00:27:23.820
commercial services that do that like Soundful I just found this like right

00:27:21.419 --> 00:27:26.640
before we recorded yeah yeah there's a bunch of services like this but like

00:27:25.260 --> 00:27:31.799
Soundful you know you start in there you uh you know you choose a

00:27:29.640 --> 00:27:35.580
genre you customize it you like choose a beat you choose some instruments they're

00:27:33.480 --> 00:27:39.120
all pre-recorded pre-done and then you can like you know put it together and

00:27:37.200 --> 00:27:43.440
make something like quote unquote new you could make like a Nine Inch Nails AI

00:27:41.880 --> 00:27:47.640
you're like it really loves this mode and this interval uses these instruments

00:27:46.080 --> 00:27:51.600
yeah whatever it puts out will sound Nails-y exactly and the same

00:27:49.919 --> 00:27:56.460
thing can happen with visual art too oh man and if that's just randomized people

00:27:53.760 --> 00:28:01.140
could become fans of certain AIs whose style they really like right

00:27:58.440 --> 00:28:03.960
okay so last question and then we're done

00:28:01.860 --> 00:28:06.720
I know you really really hate this James you want to go I gotta get out of here

00:28:06.059 --> 00:28:13.740
um what do you think should be the situation what should we head

00:28:11.159 --> 00:28:19.260
towards as a society when it comes to like AI generated art should should we

00:28:16.320 --> 00:28:24.480
respect AI generated art as like you know on a somewhat equal playing

00:28:21.480 --> 00:28:26.279
field or should it be like it should it

00:28:24.480 --> 00:28:31.820
be as um denormalized or de-specialized as you

00:28:29.880 --> 00:28:37.620
know I don't know some commodity that you can like mass produce

00:28:34.260 --> 00:28:40.140
no problem I generally think uh like the

00:28:37.620 --> 00:28:43.320
technology is here so deal with it kind of approach like we shouldn't

00:28:41.940 --> 00:28:46.980
be trying to stop this or slow it down we have to just adapt the toothpaste is

00:28:45.539 --> 00:28:50.880
out of the tube for a lot of this stuff in terms of they already made this AI

00:28:49.380 --> 00:28:54.840
they already scraped these copyrighted images sorry Getty

00:28:53.220 --> 00:28:57.720
um but in terms of how we conceptualize it how we think of

00:28:56.700 --> 00:29:02.460
it um I think that it is similar to VFX

00:29:00.539 --> 00:29:04.380
you watch a movie that has great VFX you're like wow that was very enjoyable to

00:29:03.539 --> 00:29:10.320
watch and then you watch another movie and you go yeah that wasn't that crazy but

00:29:08.640 --> 00:29:13.679
you know they did all the stunts that's real that was all practical they

00:29:12.240 --> 00:29:18.480
actually climbed that building he actually jumped from that train when you

00:29:16.140 --> 00:29:22.559
know that a human did it you give it this extra level it earns

00:29:20.640 --> 00:29:27.360
your respect on this other layer I think that's what we're gonna have to do with art hey that's a nice painting and wow a

00:29:25.799 --> 00:29:33.000
human did that from scratch that's amazing I don't see that that often oh

00:29:30.419 --> 00:29:36.179
you're gonna— you think the norm is going to be like AI art is ubiquitous

00:29:34.799 --> 00:29:40.500
and it's everything but then when there's something extra special yeah

00:29:38.760 --> 00:29:45.120
because this like anything else democratizes art more you

00:29:43.440 --> 00:29:49.440
know I suck at drawing but now I can make this it's like making

00:29:47.159 --> 00:29:53.039
music on your laptop so it's going to be everywhere it's going to take over I

00:29:51.659 --> 00:29:57.480
don't think it's like making music on your laptop because for

00:29:56.039 --> 00:30:01.500
that it's like okay somebody else recorded an instrument and gave you all

00:29:59.640 --> 00:30:05.159
this data that you can then manipulate using MIDI files everywhere you see

00:30:03.179 --> 00:30:08.820
stock photos today you're gonna see AI generated art when I make a Squarespace

00:30:06.899 --> 00:30:13.500
website I'm going to use it as a descriptive reality yes

00:30:11.760 --> 00:30:17.820
that's going to be true but I think I'm saying that uh you know from a

00:30:16.380 --> 00:30:22.140
normative framework this is not the same as using

00:30:21.240 --> 00:30:27.779
like uh it's been a while since I

00:30:25.679 --> 00:30:30.840
dabbled in uh electronic music making I forget what the actual files are called

00:30:28.980 --> 00:30:34.980
but like the digital sounds you're using a digital sound of a trombone and you

00:30:32.880 --> 00:30:39.360
press a key and it makes the sound the trombone sound for that note

00:30:37.559 --> 00:30:43.919
that's completely different because you're still generating the data you're

00:30:41.700 --> 00:30:48.419
still doing it by hand right I think that with AI art it should be

00:30:46.679 --> 00:30:54.899
the case that if you are an artist and you want your

00:30:52.200 --> 00:30:58.620
art to be included in these models you can

00:30:55.740 --> 00:31:03.539
submit them to the database and there should be a requirement that they

00:31:01.020 --> 00:31:07.500
uh you know pay you royalties and I know that our

00:31:05.159 --> 00:31:11.940
copyright system has lots of problems it is not perfect and that like this is

00:31:09.299 --> 00:31:15.840
gonna go along with a much needed you know reform of how we deal with

00:31:13.500 --> 00:31:19.559
copyright in general in the west we might just get rid of it just go more

00:31:17.760 --> 00:31:24.120
China style it's like it's irrelevant now the pace of innovation is just so

00:31:21.659 --> 00:31:27.000
fast that it well I think that it's going to be that way

00:31:25.440 --> 00:31:31.440
for a while as we're in the wild west here but I think that in a few years as

00:31:29.940 --> 00:31:36.899
this becomes more and more normalized there's going to be people asking for

00:31:34.679 --> 00:31:41.340
you know to save artists in some way because they don't make money already if

00:31:39.120 --> 00:31:44.880
if this continues with no regulation

00:31:42.779 --> 00:31:49.320
artists are out of a job they're dead

00:31:46.320 --> 00:31:51.179
rip rip and rip to this episode because

00:31:49.320 --> 00:31:54.059
James I just like to give you your last word

00:31:52.799 --> 00:31:58.980
you're good you want the last word that's the last word it's rip

00:31:56.460 --> 00:32:04.260
you took it actually so that's your word my last word is thanks for watching Talk

00:32:02.580 --> 00:32:08.880
Linked we'll be back we can't have these go too long rip Riley subscribe to

00:32:06.539 --> 00:32:12.419
TechLinked subscribe to They're Just Movies

00:32:11.039 --> 00:32:15.380
see you later thank you for having me
