WEBVTT

00:00:00.000 --> 00:00:06.860
All right, so I am here with James Schreib. All right, James, what do you do for the company? I'm head of writing.

00:00:07.660 --> 00:00:10.860
Do I get to hold this at any point? Do I have to keep leaning in? Yeah. Yeah, I'm

00:00:11.740 --> 00:00:14.220
just gonna... I'll come in closer. I'll come in closer. All right.

00:00:14.820 --> 00:00:20.060
All right, is it close enough? Yes, excellent. All right. Look at this bell guy

00:00:21.900 --> 00:00:25.900
Cloth. All right, so... sorry. Sorry. So, James,

00:00:25.900 --> 00:00:29.260
I'm just asking, you know, everyone in the company what their thoughts on AI are, and

00:00:29.800 --> 00:00:33.300
I'll ask you the first question, which is: what's your thought on the current state of AI?

00:00:34.980 --> 00:00:36.980
The current state of AI is... I

00:00:37.980 --> 00:00:43.520
think there's a lot of fantasy stuff going on. It's people getting very excited about what's going to be in the future,

00:00:44.900 --> 00:00:49.260
whereas actually, the capabilities, they are still a bit limited. However, they're

00:00:49.660 --> 00:00:55.740
they're crazy compared to two years ago or three years ago. Like, it's transformational, especially the AI art space. Like, that's already

00:00:56.500 --> 00:01:00.060
pretty damn good, and pretty damn disruptive and scary.

00:01:02.100 --> 00:01:07.340
But there are headlines all the time, like "ChatGPT gets 50% of engineering questions wrong," or, you know,

00:01:07.340 --> 00:01:09.980
it's not reliable yet, there are too many hallucinations.

00:01:11.060 --> 00:01:17.060
But that's not to say it's not useful. It's helping people learn to code, it's helping people code faster, write faster, send emails faster.

00:01:17.140 --> 00:01:22.620
So if it never improves beyond what we have today, it's still useful. If it does improve,

00:01:23.060 --> 00:01:26.820
it's both terrifying and very exciting.

00:01:27.260 --> 00:01:31.700
So do you think you're more for it or against it, or...?

00:01:32.780 --> 00:01:36.740
I'm not really for slowing down technology ever

00:01:37.300 --> 00:01:45.580
There need to be guardrails, regulations. There are some things that are just so truly scary, though, that I don't know how we're gonna cope as a society, as, like, just a little

00:01:45.820 --> 00:01:52.900
monkey-brain society that just loves outrage and just takes things at face value and just doesn't read beyond the headlines.

00:01:52.900 --> 00:01:57.420
I have no idea how we're gonna deal with deepfakes, generative

00:01:57.740 --> 00:02:02.460
political videos. There's no... I don't like the idea of a post-truth world.

00:02:03.140 --> 00:02:06.900
There are already countries that live in post-truth media landscapes, and

00:02:07.620 --> 00:02:15.180
since 2016 the West has been getting more and more like that, and that is just a dangerous and toxic world to live in. And I

00:02:15.580 --> 00:02:21.180
figure that a lot of it, generative AI, just sends us barreling down that canyon,

00:02:21.900 --> 00:02:25.620
which is very scary. But on the other hand,

00:02:25.620 --> 00:02:27.620
there are lots of things to be excited about.

00:02:29.620 --> 00:02:34.980
I just remember the movie Her and, like, how eventually it'll be normalized to have AI

00:02:35.740 --> 00:02:41.460
romantic relationships. It's like, some of this stuff is really scary. Society's fertility rates have already dropped; we've been dropping for 50 years.

00:02:41.980 --> 00:02:43.980
I don't know. We might just not survive

00:02:44.980 --> 00:02:47.740
Fair enough. Okay, so then, in the case we do survive,

00:02:48.260 --> 00:02:53.540
are you worried about your job? Because I feel like a lot of people who talk about AI say that the first jobs that are gonna

00:02:53.540 --> 00:02:58.940
go are writers', right? And is that something that you worry about, or do you think it's, like, all just, like,

00:02:59.020 --> 00:03:03.340
people, like, just talking, or something, or...? I think that

00:03:04.980 --> 00:03:11.740
creative industries definitely are at risk. I think it's gonna be similar to other industries, like perhaps visual effects, where

00:03:12.500 --> 00:03:19.940
you've got... there are humans still doing the job, but there are fewer of them, because the human doing it can do it so much faster now.

00:03:20.540 --> 00:03:27.100
But I think we're gonna see things change and adapt. I think we're gonna see a lot more, like, identity-driven, personality-driven

00:03:28.020 --> 00:03:37.300
content, like influencers. Like Linus, who you love and have a parasocial relationship with. That's gonna just become even more of a thing,

00:03:37.900 --> 00:03:44.100
because in a couple of years the majority of the content on the internet is going to be from AI, and you're not even going to know

00:03:44.100 --> 00:03:50.900
that it's from AI. And we're gonna have to figure out identity online, whether that's a blockchain thing or some kind of centralized,

00:03:52.060 --> 00:03:54.220
I don't know, identity system,

00:03:55.300 --> 00:04:04.580
blue checkmarks... we're gonna have to figure out a way to know, like, oh, an actual human said that, not just a very convincing bot with a political agenda.

00:04:04.980 --> 00:04:09.980
So I think we'll probably just see a shift in culture about, like, who we listen to

00:04:10.740 --> 00:04:15.820
and what we find entertaining. And by the way, we're going to listen to and find entertaining

00:04:16.140 --> 00:04:21.820
bots, VTubers that we know are artificial. We're still gonna like them even though they're artificial.

00:04:21.820 --> 00:04:25.020
I don't know... oh yeah, will my job be at risk? If I was in high school right now,

00:04:25.020 --> 00:04:29.420
I would be... I'd be doing trades for sure. I'd be like, oh yeah,

00:04:29.900 --> 00:04:33.220
a carpenter, or welding... like, well, I don't even know welding, but

00:04:34.660 --> 00:04:38.220
something... the robots are 15 years behind where the AIs are right now,

00:04:38.220 --> 00:04:44.740
I think. So something practical that you can help with would definitely be a little more secure than a purely creative job.

00:04:46.740 --> 00:04:52.820
But I'm hoping that as AI gets sufficiently advanced, then we'll have some kind of universal basic income and all work less,

00:04:52.940 --> 00:04:57.460
although I'm not optimistic that this will happen, because, you know,

00:04:57.460 --> 00:05:03.500
there's a certain political persuasion that seems to be winning for the last 10 years, and it's really...

00:05:03.980 --> 00:05:12.820
Society is getting very stratified, and billionaires are gobbling up more and more. There doesn't seem to be any kind of thrust for antitrust or breaking up

00:05:13.980 --> 00:05:17.820
oligopolies in the United States. That's why all the companies are getting bigger,

00:05:17.820 --> 00:05:23.460
all the little players are getting pushed out, the middle class is shrinking, and they're just gobbling up the money.

00:05:23.460 --> 00:05:26.180
So I don't really... the government's just sitting on its ass.

00:05:26.460 --> 00:05:33.120
So I really think that the odds are there'll be just a few more billionaires and a billion more

00:05:33.500 --> 00:05:41.860
people in poverty that have great AR. All right, but then, have you used AI before? Like, any... like, tap GPT, Midjourney?

00:05:41.860 --> 00:05:47.420
I think... have you tried, like, anything like that? Or... did you just say "cat GPT"?

00:05:48.740 --> 00:05:59.380
That's a... CatGPT? I don't know, that's gotta exist. I've dabbled less than I'd like to. I really wish I could say it's been my hobby for the last eight months, on Friday nights, to have a

00:06:00.140 --> 00:06:06.100
glass of scotch and mess around with ChatGPT. I've used it very seldom. I've used it to, like,

00:06:06.940 --> 00:06:11.140
convert things, instead of doing text-to-columns in Excel, because that's annoying.

00:06:11.140 --> 00:06:15.400
I just go, tell me all the weeks in the year, and remove this... it's sometimes faster.

00:06:15.400 --> 00:06:20.460
But actually, I've been using ChatGPT at home quite a bit. Um, it's great for, like, meal planning.

00:06:20.460 --> 00:06:25.660
It's great for when you have a very specific question for the internet. Like, you just know... sometimes you Google something,

00:06:25.660 --> 00:06:34.480
you're like, please, someone has to have asked this question online, I hope there's a forum thread about my specific case. But there often isn't, so it's really nice to just be able to ask, like,

00:06:34.540 --> 00:06:40.500
hey, in the region I live in, does the electrical code require this in an attic?

00:06:40.700 --> 00:06:46.800
You know, just something really specific. Or something more general too, like, if I'm shopping for a

00:06:47.580 --> 00:06:51.900
smoker, what are the five things to look for? And it'll tell you, like, you know,

00:06:51.900 --> 00:06:57.060
the dimensions of quality that are important for that product that you know nothing about. It's great to get, like, a quick lay of the land.

00:06:57.740 --> 00:07:00.940
Yeah, no, I use it for, like... for example,

00:07:00.940 --> 00:07:05.620
like, I want to look up a diet for someone my age, my height, and my weight,

00:07:05.620 --> 00:07:09.100
and it tells me, oh, you should do this. Or, like, if I want to do a workout,

00:07:09.100 --> 00:07:13.940
I'm like, oh, what are the best workouts for, like, this muscle group? And it'll give me, like, a good list of stuff.

00:07:13.940 --> 00:07:17.820
So yeah, I've been using it for that. But yeah, I like using it as, like, an alternative Google.

00:07:17.820 --> 00:07:20.720
I guess. This is interesting: I have a friend who works at a large

00:07:21.720 --> 00:07:25.880
company in the dating app space, and they told me they are working on,

00:07:27.800 --> 00:07:36.160
basically, a matchmaker based on an LLM. So instead of you just, like, swiping and swiping and swiping, you just tell it, like, I'm looking for someone who does this and this and this,

00:07:36.160 --> 00:07:45.080
and, you know, all these things about me. So this AI will know a lot about each user and what each user wants, and be able to match them that way.

00:07:45.080 --> 00:07:51.700
It's like... that actually sounds, like, clutch. If that was designed properly, it could make dating very efficient.

00:07:51.700 --> 00:07:56.060
Yeah, yeah, because that's what most people want in their dating: efficiency in my love life.

00:07:57.480 --> 00:07:59.480
You need this now. Oh, I need this now

00:08:01.080 --> 00:08:06.360
Because it's like, you know, people just, like, this... you know, that's the AI currently in the dating space.

00:08:07.040 --> 00:08:16.200
That was me in college. All right, so then, as a tech company... we are kind of a tech company slash entertainment company, or slash creative,

00:08:16.200 --> 00:08:22.360
I guess. Do you think that we should be using AI at any level, or do you think we should, like, just not be using it because it's, like,

00:08:22.680 --> 00:08:25.560
kind of taking a stand, protecting our jobs, anything like that, or...?

00:08:27.240 --> 00:08:32.560
Yeah, we're not in, like, a union field. So, like, that job security solidarity thing,

00:08:32.560 --> 00:08:36.280
I don't think makes sense for us. I think basically every organization should be using

00:08:36.640 --> 00:08:40.400
whatever new tools come to light. I mean, if it's for

00:08:42.000 --> 00:08:44.920
transcribing meetings, summarizing a meeting that you missed,

00:08:45.680 --> 00:08:51.240
making emails... I know that some creators are using it for community posts. Like,

00:08:52.040 --> 00:08:58.360
just, hey, can you make a community post that advertises this giveaway that we're gonna do, and make this point, and this point, and this point?

00:08:58.360 --> 00:09:03.320
And maybe that's just not their forte. They're an engineer, and they're a great maker,

00:09:03.320 --> 00:09:08.760
but they're not great at wordcrafting some things. If it can allow your team to be smaller and increase your output,

00:09:08.760 --> 00:09:12.840
yeah, you should be using that. You should be using it to make your emails more polite, or, like,

00:09:13.720 --> 00:09:20.200
If you're on the support team, you should be using it to make templates for you or reply with boilerplate

00:09:21.280 --> 00:09:25.800
Yeah, every organization should be using it. It's like asking us, like, should you guys be on the internet? You know,

00:09:25.800 --> 00:09:30.720
like, oh, you guys, you're a book publishing company, obviously you love typewriters.

00:09:31.240 --> 00:09:33.640
Should you guys get on the internet? And, like, yes!

00:09:34.840 --> 00:09:41.920
Always be adopting technology. Okay, so is there anything else you want to mention to the Floatplaners?

00:09:42.720 --> 00:09:48.440
Like... yeah, I'm coining a new term. It's people who subscribe to Floatplane: Floatplaners. They're called Floatplaners.

00:09:49.480 --> 00:09:52.520
That's not even a new term. Is it Floatplaners? I don't know, is it not?

00:09:53.400 --> 00:09:55.840
I tried to... like, you know how there's, like,

00:09:56.600 --> 00:10:00.600
Beliebers, and there's, like, Little Monsters for Lady Gaga fans, all these things?

00:10:00.600 --> 00:10:03.440
I tried to get a name like that for, like, LMG fans.

00:10:04.120 --> 00:10:08.280
No, Linus was not into it. I couldn't think of a good name. Like, Tech Tippers,

00:10:09.240 --> 00:10:15.280
Techies... or, like, even us, the staff, like Googlers... we don't have, like, a staff name either.

00:10:15.280 --> 00:10:18.920
I don't know. Let us know in the comments. Oh, you got some ideas?

00:10:19.080 --> 00:10:26.120
There's a chat with LMG partners, and it's called LMG Fs.

00:10:27.240 --> 00:10:32.800
LMG Fs, that's pretty good. That's pretty good. Now I'm holding the mic. All right, I'm getting out of here. Thank you.

00:10:32.800 --> 00:10:36.400
All right. All right. Bye. Oh, wait, wait, wait, wait, wait, sorry

00:10:36.400 --> 00:10:41.960
Is there anything you want to mention, anything topic-wise that we maybe didn't cover, or...? I

00:10:45.800 --> 00:10:50.520
mean, it's such a vast topic, like... What is it?

00:10:51.160 --> 00:10:57.640
Excited for it? Nervous for it? Things like that? Yeah. Yeah, generally excited. Generally excited. Um, I

00:10:58.960 --> 00:11:02.560
don't... yeah. Just cut this out.

00:11:03.000 --> 00:11:12.360
No, this is the uncut version. All right, we'll cut it there. Thank you very much. Bye, Floatplaners. James is leaving, but I'm gonna keep waving. I'm committed to the bit.

00:11:13.880 --> 00:11:15.880
Bye
