WEBVTT

00:00:00.000 --> 00:00:04.640
I'm asking our employees what they think of AI, as a company both in the tech and the

00:00:04.640 --> 00:00:08.560
creative space. I'm sure there are takes that people want to share. Uncut versions will be

00:00:08.560 --> 00:00:14.240
available below, so let's get into it. I am here with Jonathan Horst. Emily Young. James Shrype.

00:00:14.240 --> 00:00:18.720
Do I get to hold this at any point? Do I have to keep leaning in? Yeah, I'm realizing. Jake Bell.

00:00:18.720 --> 00:00:25.360
Dan, hey, Andy. Nicole. Ploof. And what do you do for the company? Right. Good. What's your

00:00:25.360 --> 00:00:30.960
thoughts on the current state of AI? I've only used ChatGPT before. I think for, like, day-to-day,

00:00:30.960 --> 00:00:36.880
like if I don't want to deal with, like, a customer representative. Like, hey, I want to get a refund

00:00:36.880 --> 00:00:42.880
on this. I was like, hey, ChatGPT, write me an email about getting a refund on this. Overhyped.

00:00:42.880 --> 00:00:47.120
We're thinking of it as like, it's going to just do the thing we want it to do when it's probably

00:00:47.120 --> 00:00:52.080
just going to be yet another tool in the toolbox. It's pretty interesting. There's a lot of cool

00:00:52.080 --> 00:00:57.520
stuff going on, and I'm excited to see it kind of develop over the next, I guess, a year or a

00:00:57.520 --> 00:01:02.880
few months even. We don't have it. No, what we refer to as AI right now is not AI. Have you ever

00:01:02.880 --> 00:01:08.640
played Mass Effect? So in the lore of Mass Effect, AIs are like, you don't do that. So instead, what

00:01:08.640 --> 00:01:13.600
they have is what's called a VI, a virtual intelligence. That is a lot like what we consider

00:01:13.600 --> 00:01:18.320
to be, like, ChatGPT and all that kind of stuff right now. It's able to answer questions. It's

00:01:18.320 --> 00:01:23.760
able to kind of like carry a conversation. It's able to do a lot of things, but it's not able to

00:01:23.760 --> 00:01:28.000
think. So it's just learning patterns and spitting out what it thinks is the next thing in the pattern,

00:01:28.000 --> 00:01:33.520
more or less. To me, it just seems like this big mass of horrifying knowledge and it just gets kind

00:01:33.520 --> 00:01:38.640
of like randomly spit out whether it's accurate or not. So for me, it's like kind of spooky because

00:01:38.640 --> 00:01:43.280
you never really know what you're going to get. I think there's a lot of fantasy stuff going on. It's

00:01:43.280 --> 00:01:48.080
people getting very excited about what's going to be in the future, when actually the capabilities

00:01:48.080 --> 00:01:52.800
are still a bit limited. However, they're crazy compared to two years ago, three years ago. Like,

00:01:52.800 --> 00:01:57.680
the transformation, especially in the AI art space, like, that's already pretty damn good and pretty

00:01:57.680 --> 00:02:02.320
damn disruptive and scary. I think overall it's really cool and it's a great idea. We could speed

00:02:02.320 --> 00:02:08.400
up things like machine learning. We can use it to diagnose people who have weird symptoms and stuff

00:02:08.400 --> 00:02:14.640
because let's face it, people are limited at some point. It becomes bad when we basically use prior

00:02:14.720 --> 00:02:20.480
work to then create new stuff and the people that made that work previously don't get any credit for

00:02:20.480 --> 00:02:25.520
it. Maybe it's not as terrifying as people think it is. Yeah, certainly, certainly for it. So since

00:02:25.520 --> 00:02:29.760
you're for it, what's one thing you're really against in terms of AI? Probably creating

00:02:29.760 --> 00:02:35.920
images of real people. As we've seen with QTCinderella and Pokimane and some of these other streamers that

00:02:35.920 --> 00:02:40.560
had explicit photos of them generated and shared on websites, stuff like that is awful. And now

00:02:40.560 --> 00:02:44.400
people think it's a lot of fun to create AI songs of, like, Linus and stuff like that online,

00:02:44.400 --> 00:02:48.960
but he didn't consent to that. And no matter what, that's kind of wrong and weird to think that just

00:02:48.960 --> 00:02:53.600
because he's on the internet, he's okay with it. That's not true. And it's something that obviously

00:02:53.600 --> 00:02:57.680
there's no way to limit. But it's something I'm definitely concerned about. I think probably

00:02:57.680 --> 00:03:03.440
some of the protectionism, I mean, you're spending millions of dollars on like cloud infrastructure

00:03:03.440 --> 00:03:09.520
GPU time just to actually create a reasonable model. People I think are willing to protect that

00:03:09.520 --> 00:03:15.520
rather than just open sourcing it. Adobe's, like, Generative Fill is kind of interesting.

00:03:15.520 --> 00:03:20.400
They're demonstrating it like, look, we could make this image so much wider, and we can

00:03:20.400 --> 00:03:24.400
fill it just from this tiny little portrait, we could turn it into a whole landscape. And it's

00:03:24.400 --> 00:03:28.720
like, yeah, I guess that's cool. But it's also like, what is the picture then anymore? AI is

00:03:28.720 --> 00:03:33.200
difficult. There are things that AI is really good for. Say, for example, AI image scaling or

00:03:33.200 --> 00:03:38.720
whatever, you train it to do whatever it needs to do, and it does it basically by failing as many

00:03:38.800 --> 00:03:43.360
times as is necessary to fail the correct way. And that's actually really cool, because it

00:03:43.360 --> 00:03:48.640
does things in a way that is very difficult to do in traditional software. When you get into

00:03:49.200 --> 00:03:53.600
marketing AI as well, I mean, I keep saying AI, it's more like machine learning. So like, when

00:03:53.600 --> 00:03:57.840
you get into marketing it as, like, AI, and you say, oh, well, ChatGPT is going to take over the world,

00:03:57.840 --> 00:04:03.040
well, that's overselling it. And I think it's by design that they're doing that,

00:04:03.040 --> 00:04:06.640
because they want people to invest, they want people to think that we're close enough.

00:04:06.640 --> 00:04:12.720
I think it's a really interesting tool. And I think it, depending on how it's used, it kind of

00:04:12.720 --> 00:04:19.680
broadens a lot of channels and possibilities of being able to sort through a lot of information

00:04:19.680 --> 00:04:25.520
really quickly. Like for example, you can ask AI something and it'll like throw out a ton of

00:04:25.520 --> 00:04:31.760
results. But then it's also like good to keep in mind, this is always just from a certain pool

00:04:31.760 --> 00:04:38.400
of information. So depending on how accurate that pool of information is, you can't always 100%

00:04:38.400 --> 00:04:42.160
trust it. I hate writing, by the way, but when you figure out how to write something

00:04:42.160 --> 00:04:46.160
really cohesively and like, it's like, that's really impressive. And I just don't know how

00:04:46.880 --> 00:04:50.880
a prompt is going to lead to that. As a writer, you hate writing? I do.

00:04:53.120 --> 00:04:57.520
There's a lot of worry that like AI would take over creative jobs. And as we are in a creative

00:04:57.520 --> 00:05:04.400
field, what do you think about that? I personally don't think so. Cause like AI can be very objective,

00:05:04.400 --> 00:05:11.680
but, like, they probably can't really be subjective. If I trust the AI to grab focus for me sometimes,

00:05:11.680 --> 00:05:17.360
if we have a face in there, they'll grab, like, focus pretty well. But there is one percent

00:05:17.360 --> 00:05:22.880
of a chance that, you know, the focus gets shifted to something else, because the camera thinks

00:05:22.960 --> 00:05:27.840
that's the face. A human being in general is still better in that case, at determining

00:05:28.800 --> 00:05:33.120
which part is the important part. So that's exactly the reason why we don't use like

00:05:33.120 --> 00:05:38.640
autofocus on the top-down and B-cam for ShortCircuit. If no new writing is made,

00:05:38.640 --> 00:05:42.960
like if, say, for example, writing is completely no longer a thing, then basically,

00:05:42.960 --> 00:05:48.160
as you iterate on things that are already iterated on, you kind of like lose bits and pieces of it,

00:05:48.160 --> 00:05:53.600
it'll be difficult to continue indefinitely. You might get away with a little bit for a

00:05:53.600 --> 00:05:56.640
little while. People are already getting away with writing paragraphs or whatever,

00:05:56.640 --> 00:06:01.280
but it still requires like an editing pass. It still requires people who know what they're

00:06:01.280 --> 00:06:06.560
doing because I don't know if you or like people who are watching know, but AI, especially like

00:06:06.560 --> 00:06:12.320
ChatGPT has a very short memory. So if you're trying to tell a story, like in a movie or something,

00:06:13.920 --> 00:06:17.520
keeping that thing on the rails is going to be a really tough operation.

00:06:17.520 --> 00:06:22.720
Creative industries definitely are at risk. I think it's going to be similar to other industries

00:06:22.720 --> 00:06:27.600
like perhaps the Apex where there's humans still doing the job, but there's fewer of them

00:06:27.600 --> 00:06:31.680
because the human that is doing it can do it so much faster now. But I think we're going to see things

00:06:31.680 --> 00:06:36.800
change and adapt. I think we're going to see a lot more, like, identity-driven, personality-driven

00:06:36.800 --> 00:06:42.480
content like influencers like Linus who you love and have a parasocial relationship with.

00:06:42.560 --> 00:06:47.920
That's going to just become even more of a thing because in a couple of years, the majority of

00:06:47.920 --> 00:06:51.920
the content on the internet is going to be from AI and you're not even going to know that it's

00:06:51.920 --> 00:06:56.400
from AI and we're going to have to figure out identity online and whether that's a blockchain

00:06:56.400 --> 00:07:02.080
thing or some kind of centralized, I don't know, identity system, we're going to have to figure

00:07:02.080 --> 00:07:07.680
out a way to know like, oh, actual humans said that, not just a very convincing bot with a political

00:07:07.680 --> 00:07:12.880
agenda. Oh yeah, will my job be at risk? If I was in high school right now, I would be doing trades

00:07:12.880 --> 00:07:19.280
for sure. The robots are 15 years behind the AIs right now, I think. So something practical that

00:07:19.280 --> 00:07:24.080
you can help with would definitely be a little more secure than a purely creative job. Yeah,

00:07:24.080 --> 00:07:29.520
I was telling Nicole, AI can maybe edit videos, but it can't do a fart reverb like I can.

00:07:29.520 --> 00:07:32.000
No way. It can't put a Vine boom in at the perfect moment.

00:07:32.960 --> 00:07:37.920
Have you used any AI currently, like, whether it's, like, the ChatGPT, the image

00:07:37.920 --> 00:07:41.200
generators, music AI, things like that, have you used any of it? The only thing I've really done

00:07:41.200 --> 00:07:46.560
with AI was when Midjourney and those other guys were coming out, I was messing around on it.

00:07:46.560 --> 00:07:50.720
I was like, yeah, this is cool. Haha, like Trump eating a lemon or whatever. I don't know,

00:07:50.720 --> 00:07:55.760
that's like the, oh my god, why is that my example? Have you used things like AI Dungeon,

00:07:55.760 --> 00:07:59.840
the, like, choose-your-own-adventure Dungeons and Dragons type game online where it creates a story

00:07:59.840 --> 00:08:03.680
for you to play through? And that's a lot of fun. But otherwise, I haven't done really any image

00:08:03.680 --> 00:08:08.080
generation. The only thing is I created an image of Kanye West at a Ring doorbell cam,

00:08:08.080 --> 00:08:12.880
and that was pretty funny. Yeah, I've played with pretty much all of them. And I actually run a

00:08:12.880 --> 00:08:20.000
couple at home. So I run an LLM as well as stable diffusion for image generation. And I run those

00:08:20.000 --> 00:08:26.000
locally. I recently just bought a couple more 3090s and, you know, upgraded some of them with more VRAM

00:08:26.000 --> 00:08:32.960
so that I could try some of the larger models. So they're like a ChatGPT 4, but you can only

00:08:32.960 --> 00:08:39.360
talk to them for like, I don't know, two to five minutes before the like token inference length

00:08:39.360 --> 00:08:44.400
gets too much, and it takes like a minute to respond to you. But initially, they're

00:08:44.400 --> 00:08:49.040
really quite powerful. As a tech-slash-creator company that we are, do you think that we should

00:08:49.040 --> 00:08:54.080
be using AI in, like, any capacity? It depends on the AI, and depends on what we're using it for.

00:08:54.080 --> 00:09:02.320
If we're talking ChatGPT, I don't know that it's not, like, copyright-infringing. So that's one thing.

00:09:03.520 --> 00:09:08.000
The fact that we don't know, like, the actual data sets that they train them on

00:09:08.000 --> 00:09:14.320
is a bit of a giant question mark in terms of, like, the ethics and, I guess, the legality

00:09:14.320 --> 00:09:22.000
ultimately of using AI for generative purposes. On the other hand, like say, for example, I'm like

00:09:22.720 --> 00:09:27.680
95% of the way through a script. And you're like, you don't know how to close it out.

00:09:28.400 --> 00:09:32.320
You could sit there and you could, you know, scratch your head for an hour or two, or you could

00:09:33.520 --> 00:09:39.200
ping some concepts off of ChatGPT or something and see what it looks like. I believe we do

00:09:39.200 --> 00:09:42.960
currently. I think the business team uses it for talking points and things like that. And then

00:09:42.960 --> 00:09:48.320
they just kind of clean it up. I think basically every organization should be using whatever new

00:09:48.320 --> 00:09:53.360
tools come to light. I mean, transcribing meetings, summarizing a meeting that you missed,

00:09:53.360 --> 00:09:57.040
if they can allow your team to be smaller and increase your output. Yeah, you should be using

00:09:57.040 --> 00:10:03.440
that. There are really specific use cases where it's going to be really amazing. But in terms of, like, broad

00:10:03.440 --> 00:10:08.640
strokes, no, it's going to like help just like that tiny little step, like making one thing that

00:10:08.640 --> 00:10:13.840
you're trying to make, one vision you have, a bit easier to execute, or, like, making it better.

00:10:14.480 --> 00:10:18.560
That sounds great. We should use it as a starting point, but we shouldn't use it for like the final

00:10:18.560 --> 00:10:23.760
product. We should be using it as a tool, like you'd use a computer as a tool or a

00:10:23.760 --> 00:10:28.960
camera as a tool. If it's not, like, stealing off of other people's work, I'm okay with that. It's like,

00:10:28.960 --> 00:10:35.120
I need Linus to be in a car. Where can I get the car? You know, like, that car, it costs money,

00:10:35.120 --> 00:10:40.800
that costs time, that costs people's effort. Yeah. So in general, just like as long as the AI

00:10:40.800 --> 00:10:47.520
is not, like, stealing off of other people's stuff, and is able to help us, like, increase our

00:10:47.520 --> 00:10:52.960
production quality, increase our speed, I think that's a good thing to use. Not really.

00:10:53.600 --> 00:10:59.360
Maybe machine learning for the lab, but like real AI and using it to make, I don't know,

00:10:59.360 --> 00:11:04.720
I don't think so. I just, I hate the idea of eliminating jobs, especially when it's not ready.

00:11:04.720 --> 00:11:08.400
I think in a couple of years that story might change and I might be like, yeah, sure. I don't

00:11:08.400 --> 00:11:12.880
want to make the timestamps for every video. So run it through the AI generator to generate those

00:11:12.880 --> 00:11:17.120
timestamps for us, like fine. But right now, nah. I'm sweating like crazy. So we're going to wrap

00:11:17.120 --> 00:11:20.560
it up here. Do you want to say anything to the Floatplaners (I'm trying to coin this phrase now),

00:11:20.560 --> 00:11:23.600
or, like, something we didn't talk about that you want to expand on? Tell Sammy

00:11:23.600 --> 00:11:29.600
you don't like being called Floatplaners. No, I'm so glad to get this off my chest.

00:11:29.600 --> 00:11:34.880
Everything, like, all tools are inherently not good or evil. It's what you use it for,

00:11:34.880 --> 00:11:40.720
that is, that's what's most important. So, like, AI upscaling is amazing, for example.

00:11:40.720 --> 00:11:46.320
it's just one of those things that's just so universally useful. Try it at home. It's fun at

00:11:46.320 --> 00:11:52.000
home. It's a fun little exercise. It's really, really simple to get going, and it's a fun

00:11:52.000 --> 00:11:59.200
little playground, and you don't have to, like, spend money on ChatGPT. I mean, ChatGPT is still the

00:11:59.200 --> 00:12:05.280
best. But if you play with it at home, it's yours. You can play with it as much as you want.

00:12:05.280 --> 00:12:11.840
It's still flawed. It's still very flawed. But it'll know, like, how to commit the

00:12:11.840 --> 00:12:17.200
perfect crime. It's got answers for that somewhere. You just got to convince it. I would love, really,

00:12:17.200 --> 00:12:21.600
I want a Jarvis, you know, a Jarvis that goes like, oh, I got you, fam, I got this ready for

00:12:21.600 --> 00:12:26.480
you, before, like, an actual personality, an actual artificial intelligence that's effectively a human.

00:12:26.480 --> 00:12:31.440
That'll be scary. And it means like deleting an AI is like deleting a sentient life.

00:12:31.440 --> 00:12:34.800
And the reason AI does not truly exist now is because, like, a sentient life is not something

00:12:34.800 --> 00:12:43.120
that's easy to do. For a lot of, like, the medical environment, I am pretty, like, pro-AI, because

00:12:43.120 --> 00:12:48.720
like some people are doing research on like data and stuff. Like for data driven stuff,

00:12:48.880 --> 00:12:56.960
like, I think AI is definitely winning and also quite, like, useful. Maybe, like, if the AI researches

00:12:56.960 --> 00:13:02.080
enough, like, papers, we can cure cancer, you know. You never know. So from that perspective, I think

00:13:02.080 --> 00:13:11.040
it's great. Generally excited? Generally excited. Um, I don't, yeah, just cut this out. No,

00:13:11.040 --> 00:13:16.800
this is the uncut version. All right, we'll cut it there. Thank you very much. Bye, Floatplaners.

00:13:16.800 --> 00:13:22.480
James is leaving, but I'm gonna keep waving. I'm committed to the bit. Bye.
