WEBVTT

00:00:00.000 --> 00:00:04.640
I'm asking our employees what they think of A.I. As a company both in the tech and the

00:00:04.640 --> 00:00:08.560
creator space, I'm sure there are takes that people want to share. Uncut versions will be

00:00:08.560 --> 00:00:14.240
available below, so let's get into it. I am here with Jonathan Horst. Emily Young. James Shrype.

00:00:14.240 --> 00:00:18.720
Do I get to hold this at any point? Do I have to keep leaning in? Yeah, I'm realizing. Jake Bell.

00:00:18.720 --> 00:00:25.360
Dan, hey, uh, Andy. Nicole. Ploof. And what do you do for the company? Right. Good. What's your

00:00:25.360 --> 00:00:30.960
thought on the current state of A.I.? I've only used ChatGPT before. I think for like day-to-day,

00:00:30.960 --> 00:00:36.880
like if I don't want to deal with like a customer representative, like, hey, I want to get a refund

00:00:36.880 --> 00:00:42.880
off this. I was like, hey, ChatGPT, write me an email about getting a refund off this. Over-hyped.

00:00:42.880 --> 00:00:47.120
We're thinking of it as like it's going to just do the thing we want it to do when it's probably

00:00:47.120 --> 00:00:52.080
just going to be yet another tool in the toolbox. It's pretty interesting. There's a lot of cool

00:00:52.080 --> 00:00:57.520
stuff going on and I'm excited to see it kind of develop over the next, uh, I guess a year or

00:00:57.520 --> 00:01:02.720
a few months even. We don't have it? No, it's not. What we refer to as A.I. right now, have you

00:01:02.720 --> 00:01:08.160
ever played Mass Effect? So in the lore of Mass Effect, A.I.s are like, you don't do that. So

00:01:08.160 --> 00:01:13.200
instead what they have is what's called a V.I., or virtual intelligence. That is a lot like what we

00:01:13.200 --> 00:01:18.080
consider to be like ChatGPT and all that kind of stuff right now. It's able to answer questions.

00:01:18.080 --> 00:01:23.440
It's able to kind of like carry a conversation. It's able to do a lot of things, but it's not

00:01:23.440 --> 00:01:27.360
able to think. So it's just learning patterns and spitting out what it thinks is the next thing in

00:01:27.360 --> 00:01:32.960
the pattern, more or less. To me, it just seems like this big mass of horrifying knowledge and it

00:01:32.960 --> 00:01:37.840
just gets kind of like randomly spit out whether it's accurate or not. So for me, it's like kind

00:01:37.840 --> 00:01:42.080
of spooky because you never really know what you're going to get. I think there's a lot of fantasy stuff

00:01:42.080 --> 00:01:47.520
going on. It's people getting very excited about what's going to be in the future while actually

00:01:47.520 --> 00:01:52.080
the capabilities are still a bit limited. However, they're crazy compared to two years ago,

00:01:52.080 --> 00:01:55.840
three years ago. Like, the transformation, especially in the A.I. art space, like that's

00:01:55.840 --> 00:02:00.880
already pretty damn good and pretty damn disruptive and scary. I think overall it's really cool and

00:02:00.880 --> 00:02:06.880
it's a great idea. We could speed up things like machine learning. We can use it to diagnose people

00:02:06.880 --> 00:02:10.960
who have weird symptoms and stuff because let's face it, people are limited at some point. It

00:02:10.960 --> 00:02:18.480
becomes bad when we basically use prior work to then create new stuff and the people that made

00:02:18.480 --> 00:02:23.120
that work previously don't get any credit for it. It's maybe not as terrifying as people think it is.

00:02:23.120 --> 00:02:27.520
Yeah, certainly, certainly for it. So since you're for it, what's one thing you're really against in

00:02:27.520 --> 00:02:34.320
terms of A.I.? Probably creating images of real people, as we've seen with QTCinderella and

00:02:34.320 --> 00:02:38.000
Pokimane and some of these other streamers that had explicit photos of them generated and shared

00:02:38.080 --> 00:02:42.640
on websites, stuff like that is awful. And now people think it's a lot of fun to create the A.I.

00:02:42.640 --> 00:02:47.040
songs of like Linus and stuff like that online, but he didn't consent to that. And no matter what,

00:02:47.040 --> 00:02:51.520
that's kind of wrong and weird to think that just because he's on the internet, he's okay with it.

00:02:51.520 --> 00:02:55.520
That's not true. And it's something that obviously there's no way to limit, but it's something I'm

00:02:55.520 --> 00:03:00.320
definitely concerned about. I think probably some of the protectionism. I mean, you're spending

00:03:00.320 --> 00:03:07.360
millions of dollars on like cloud infrastructure, GPU time, just to actually create a reasonable

00:03:07.360 --> 00:03:11.840
model. People, I think, are willing to protect that rather than just open sourcing it. Adobe's

00:03:11.840 --> 00:03:17.440
like generative fill is kind of interesting. They're demonstrating it like, look,

00:03:17.440 --> 00:03:22.320
we could make this image so much wider and we can fill it just from this tiny little portrait.

00:03:22.320 --> 00:03:26.160
We could turn it into a whole landscape and it's like, yeah, I guess that's cool, but it's also

00:03:26.160 --> 00:03:30.480
like, what is the picture then anymore? A.I. is difficult. There are things that A.I. is really

00:03:30.480 --> 00:03:35.280
good for. Say, for example, A.I. image scaling or whatever, you train it to do whatever it needs

00:03:35.280 --> 00:03:41.440
to do and it does it basically by failing as many times as is necessary to fail the correct way.

00:03:41.440 --> 00:03:46.400
And that's actually really cool because it does things in a way that is very difficult to do in

00:03:46.400 --> 00:03:52.480
traditional software. When you get into marketing A.I. as well, I mean, I keep saying A.I., but it's more

00:03:52.480 --> 00:03:56.240
like machine learning. So like, when you get into marketing it as like A.I. and you say, oh, well,

00:03:56.240 --> 00:04:01.520
ChatGPT is going to take over the world, well, that's overselling it. And that's, I think it's

00:04:01.520 --> 00:04:05.360
by design that they're doing that because they want people to invest. They want people to think

00:04:05.360 --> 00:04:10.640
that we're close enough. I think it's a really interesting tool. And I think it, depending on

00:04:10.640 --> 00:04:18.400
how it's used, it kind of broadens a lot of channels and possibilities of being able to sort

00:04:18.400 --> 00:04:23.760
through a lot of information really quickly. Like for example, you can ask A.I. something and it'll

00:04:23.760 --> 00:04:29.840
like throw out a ton of results. But then it's also good to keep in mind: this is always just

00:04:30.640 --> 00:04:36.560
from a certain pool of information. So depending on how accurate that pool of information is,

00:04:36.560 --> 00:04:41.680
you can't always 100% trust it. I hate writing, by the way, but, like, when you figure out how

00:04:41.680 --> 00:04:45.680
to write something really cohesively and like, it's like, that's really impressive. And I just

00:04:45.680 --> 00:04:50.880
don't know how a prompt is going to lead to that. As a writer, you hate writing. I do.

00:04:53.120 --> 00:04:57.520
There's a lot of worry that like A.I. would take over creative jobs. And as we are in a creative

00:04:57.520 --> 00:05:03.680
field, what do you think about that? I personally don't think so. Because like, A.I. can be very

00:05:03.680 --> 00:05:10.160
objective. But like, they probably can't really be subjective. If I trust the A.I. to grab focus for

00:05:10.160 --> 00:05:16.000
me, sometimes, if we have a face in there, they'll grab focus, like, pretty well. But there is

00:05:16.640 --> 00:05:21.680
one percent of a chance that, you know, the focus gets shifted to something else. Because

00:05:21.680 --> 00:05:27.200
the camera thinks that's the face. A human being in general is still better in that case. Like,

00:05:27.280 --> 00:05:33.120
determining which part is the important part. So that's exactly the reason why we don't use, like,

00:05:33.120 --> 00:05:38.640
autofocus on the top-down and B-cam for ShortCircuit. If no new writing is made,

00:05:38.640 --> 00:05:42.960
like if you say, for example, writing is completely no longer a thing. Basically,

00:05:42.960 --> 00:05:48.160
as you iterate on things that are already iterated on, you kind of like lose bits and pieces of it,

00:05:48.160 --> 00:05:53.760
it'll be difficult to continue indefinitely. You might get away with a little bit for a little

00:05:53.760 --> 00:05:58.400
while. People are already getting away with writing paragraphs or whatever. But it still requires

00:05:58.400 --> 00:06:02.160
like an editing pass. It still requires people who know what they're doing. Because I don't know

00:06:02.160 --> 00:06:08.720
if you or like people who are watching know. But A.I., especially like ChatGPT, has a very short

00:06:08.720 --> 00:06:14.720
memory. So if you're trying to tell a story, like in a movie or something, keeping that thing on the

00:06:14.720 --> 00:06:18.880
rails is going to be a really tough, a really tough operation. The creative industry is definitely

00:06:18.880 --> 00:06:24.800
at risk. I think it's going to be similar to other industries, like perhaps VFX, where there's

00:06:24.800 --> 00:06:28.960
humans still doing the job, but there are fewer of them, because the human that's doing it can do it

00:06:28.960 --> 00:06:33.760
so much faster now. But I think we're going to see things change and adapt. I think we're going to see

00:06:33.760 --> 00:06:40.160
a lot more like identity driven, personality driven content, like influencers, like Linus,

00:06:40.160 --> 00:06:44.720
who you love and have a parasocial relationship with, that's going to just become even more

00:06:44.720 --> 00:06:49.280
of a thing. Because in a couple of years, the majority of the content on the internet is going

00:06:49.280 --> 00:06:53.280
to be from A.I. And you're not even going to know that it's from A.I. And we're going to have to

00:06:53.280 --> 00:06:58.880
figure out identity online. And whether that's a blockchain thing or some kind of centralized,

00:06:58.880 --> 00:07:03.600
I don't know, identity system, we're going to have to figure out a way to know like, oh,

00:07:03.600 --> 00:07:08.400
actual humans said that, not just a very convincing bot with a political agenda. I don't

00:07:08.400 --> 00:07:12.560
know, oh yeah, will my job be at risk? If I was in high school right now, I would be, I'd be doing

00:07:12.560 --> 00:07:17.440
trades for sure. Something, the robots are 15 years behind the A.I.s right now, I think. So

00:07:18.480 --> 00:07:22.800
something practical that you can help with would definitely be a little more secure than a purely

00:07:22.800 --> 00:07:27.840
creative job. Yeah, I was telling Nicole, you know, A.I. can maybe edit videos, but it can't

00:07:27.840 --> 00:07:32.000
do a fart reverb like I can. No way. It can't put the Vine boom in at the perfect moment.

00:07:32.960 --> 00:07:37.280
Have you used any A.I. currently? Like, where it's like, you have ChatGPT,

00:07:37.280 --> 00:07:44.560
the image generators, music, anything like that? Have you used any of it? The only thing I've really done with A.I. was when Midjourney and those other guys were coming

00:07:44.560 --> 00:07:49.840
out, I was messing around on it. I was like, yeah, this is cool. Haha, like Trump eating a lemon or

00:07:49.840 --> 00:07:54.720
whatever. I don't know. That's like the, oh my God, why is that my example? Have you used things

00:07:54.720 --> 00:07:59.120
like A.I. Dungeon, the, like, choose-your-own-adventure Dungeons and Dragons type game online, where it

00:07:59.120 --> 00:08:02.960
creates a story for you to play through. And that's a lot of fun. But otherwise, I haven't done

00:08:02.960 --> 00:08:07.440
really any image generation. The only thing is I created an image of Kanye West at a Ring

00:08:07.440 --> 00:08:12.080
doorbell cam. And that was, that was pretty funny. Yeah, I've played with pretty much all of them.

00:08:12.080 --> 00:08:19.440
And I actually run a couple at home. So I run an LLM as well as Stable Diffusion for image generation.

00:08:19.440 --> 00:08:24.880
And I run those locally. I recently just bought a couple more 3090s and, you know, upgraded some of

00:08:24.880 --> 00:08:31.280
them with VRAM so that I could try some of the larger models. So they're like a ChatGPT,

00:08:32.160 --> 00:08:37.920
but you can only talk to them for like, I don't know, two to five minutes before the

00:08:37.920 --> 00:08:43.200
like token inference length gets too much. And it takes like a minute to respond to you.

00:08:43.200 --> 00:08:48.080
But initially, they're really quite powerful. As a tech slash creator company that we are,

00:08:48.080 --> 00:08:53.360
do you think that we should be using AI in, like, any capacity? Depends on the AI and depends on

00:08:53.360 --> 00:08:59.440
what we're using it for. If we're talking ChatGPT, I don't know that it's not, like, copyright

00:08:59.440 --> 00:09:06.000
infringing. So that's one thing. The fact that we don't know the actual

00:09:06.000 --> 00:09:12.400
data sets that they train them on is a bit of a big giant question mark in terms of like the

00:09:12.400 --> 00:09:20.000
ethics and, I guess, the legality ultimately of using AI for generative purposes. On the other

00:09:20.000 --> 00:09:26.080
hand, like say for example, I'm like 95% of the way through a script. And

00:09:26.160 --> 00:09:30.800
you don't know how to close it out. You could sit there and you could scratch your head for

00:09:30.800 --> 00:09:38.240
an hour or two, or you could ping some concepts off of ChatGPT or something and see what it looks

00:09:38.240 --> 00:09:42.160
like. I believe we do currently. I think the business team uses it for talking points and

00:09:42.160 --> 00:09:46.480
things like that. And then they just kind of clean it up. I think basically every organization

00:09:46.480 --> 00:09:52.320
should be using whatever new tools come to light. I mean, transcribing meetings, summarizing a

00:09:52.320 --> 00:09:56.320
meeting that you missed, if it can allow your team to be smaller and increase your output,

00:09:56.320 --> 00:10:02.240
yeah, you should be using that. Really specific use cases where it's going to be really amazing.

00:10:02.240 --> 00:10:07.120
But in terms of like broad strokes, no, it's going to like help just like that tiny little

00:10:07.120 --> 00:10:11.760
step, like making one vision you have a bit easier

00:10:11.760 --> 00:10:17.120
to execute, or like making it better. That sounds great. We should use it as a starting point,

00:10:17.120 --> 00:10:21.680
but we shouldn't use it for, like, the final product. We should be using it as a tool,

00:10:21.680 --> 00:10:26.080
like you'd use a computer as a tool or a camera as a tool. If I'm not like selling

00:10:26.080 --> 00:10:31.200
up other people's work, I'm okay with that. It's like, I need Linus to be in a car,

00:10:31.200 --> 00:10:36.880
where can I get the car? You know, like that car, it costs money, that costs time, that costs people's

00:10:36.880 --> 00:10:42.960
effort. Yeah, so in general, just like as long as the AI we're not like selling up other people's

00:10:42.960 --> 00:10:49.440
stuff and it's able to help us, like, increase our production quality, increase our speed,

00:10:50.240 --> 00:10:56.560
I think that's a good thing to use. Not really. Maybe machine learning for the lab, but like real

00:10:56.560 --> 00:11:01.920
AI and using it to make, I don't know, I don't think so. I just, I hate the idea of eliminating

00:11:01.920 --> 00:11:06.800
jobs, especially when it's not ready. I think in a couple of years that story might change and I

00:11:06.800 --> 00:11:11.440
might be like, yeah, sure, I don't want to make the timestamps for every video. So run it through

00:11:11.440 --> 00:11:16.080
the AI generator to generate those timestamps for us like fine. But right now, nah, I'm sweating

00:11:16.080 --> 00:11:19.120
like crazy. So we're going to wrap it up here. Do you want to say anything to the Floatplaners?

00:11:19.280 --> 00:11:25.200
I'm trying to coin this phrase now. Or, like, something we didn't talk about that you wanted to expand on. Tell Sammy you don't like being called Floatplaners.

00:11:26.640 --> 00:11:32.400
No, I'm so glad to get this off my chest. Everything, like all, all tools are inherently

00:11:32.400 --> 00:11:38.160
not good or evil. It is what you use it for. That is the most important thing. So, like, AI

00:11:38.160 --> 00:11:42.080
image scaling is amazing, for example. It's just one of those things that's just

00:11:43.600 --> 00:11:48.800
so universally useful. Try it at home. It's fun at home. It's a fun little exercise. It's

00:11:48.800 --> 00:11:54.320
really, really simple to get going. And it's a fun little playground and you don't have to like

00:11:56.000 --> 00:12:01.280
spend money on ChatGPT. I mean, ChatGPT is still the best. But if you play with it at home,

00:12:02.480 --> 00:12:07.600
yours, you can play with it as much as you want. It's still flawed. It's still very

00:12:07.600 --> 00:12:14.640
flawed. It'll know, like, how to commit the perfect crime. It's got answers for that somewhere. You

00:12:14.640 --> 00:12:19.760
just got to convince it. I would love real AI. I want a Jarvis, you know, a Jarvis that goes like,

00:12:19.760 --> 00:12:23.280
Oh, I got you, fam. I got this ready for you. Like, an actual personality,

00:12:23.280 --> 00:12:28.240
an actual artificial intelligence that's effectively a human. That'll be scary. And it means like

00:12:28.240 --> 00:12:33.120
deleting an AI is like deleting a sentient life. And the reason AI does not truly exist now is

00:12:33.120 --> 00:12:37.760
because like a sentient life is not something that's easy to do. For a lot of like the medical

00:12:37.760 --> 00:12:46.800
environment, I am pretty like pro AI because like some people are doing research on like data and

00:12:46.800 --> 00:12:54.640
stuff like for data driven stuff. Like I think AI is definitely winning and also quite like useful.

00:12:54.640 --> 00:13:00.480
Maybe, like, if the AI researches enough papers, we can cure cancer, you know, you never know. So

00:13:00.480 --> 00:13:05.440
From that perspective, I think it's great. Generally excited. Generally excited.

00:13:08.000 --> 00:13:15.440
Yeah, cut this out. No, this is the uncut version. All right, we'll cut it there. Thank you very

00:13:15.440 --> 00:13:22.480
much. Bye, Floatplaners. James left, but I'm gonna keep waving. I'm committed to the bit. Bye.
