WEBVTT

00:00:01.434 --> 00:00:02.602
Riley: Should I clap too?

2
00:00:02.736 --> 00:00:04.471
Sammy: No, just Luke.

3
00:00:05.271 --> 00:00:07.007
Riley: Do you like his claps better than my claps?

4
00:00:07.040 --> 00:00:07.574
Sammy: Yeah.

5
00:00:08.808 --> 00:00:10.510
Riley: A lot of people say I have great claps.

6
00:00:11.344 --> 00:00:13.046
Riley: We're talking about Sora, right?

7
00:00:13.146 --> 00:00:15.315
Sammy: Yes, Sora 2; the-

8
00:00:15.315 --> 00:00:16.282
Riley: Do we have the app?

9
00:00:16.282 --> 00:00:17.117
Sammy: Yes.

10
00:00:17.117 --> 00:00:19.452
Luke: I am working on generating a video right now.

11
00:00:19.452 --> 00:00:25.325
Riley: Background for anyone who isn't super, super in the loop, OpenAI released Sora.

12
00:00:25.325 --> 00:00:28.128
I mean, when was that? That was two years ago or something?

13
00:00:28.128 --> 00:00:28.428
Luke: We were-

14
00:00:28.428 --> 00:00:31.031
Riley: It was like such a watershed moment for AI.

15
00:00:31.031 --> 00:00:33.033
Yes, I'm going back that far; because that was like,

16
00:00:33.033 --> 00:00:36.903
Sora was the first time when we saw AI video and it was like this,

17
00:00:36.903 --> 00:00:39.005
Riley: wow, this could be photorealistic.

18
00:00:39.005 --> 00:00:40.507
Luke: It's pretty good, this is no longer Will Smith eating spaghetti.

19
00:00:40.507 --> 00:00:43.009
Riley: Yeah, and then in quick succession,

20
00:00:43.009 --> 00:00:46.713
multiple models followed that like could basically do the same thing.

21
00:00:46.713 --> 00:00:51.918
And then just recently OpenAI released Sora 2, the model.

22
00:00:52.018 --> 00:00:55.555
Which is exclusively available I think, just in the app, right?

23
00:00:55.822 --> 00:00:57.924
You might be able to use it on desktop through something.

24
00:00:57.924 --> 00:00:58.425
Luke: I'm not sure.

25
00:00:58.425 --> 00:00:58.925
Riley: I'm not sure.

26
00:00:59.526 --> 00:01:05.231
Riley: But the main thing is that they released it through this app on iOS only called Sora.

27
00:01:05.231 --> 00:01:10.270
Riley: So there's Sora 2, the model, which you can use within Sora the app,

28
00:01:10.303 --> 00:01:15.842
Riley: which is basically a TikTok clone where you only post AI videos.

29
00:01:15.842 --> 00:01:18.845
So you open the app, you generate video and you post the video.

30
00:01:18.845 --> 00:01:20.046
Luke: Hey, you want some slop?

31
00:01:20.046 --> 00:01:24.617
Riley: And you can like and scroll, and yes please, feed me the slop ASAP.

32
00:01:24.617 --> 00:01:26.052
Luke: My prompt is still generating.

33
00:01:26.186 --> 00:01:27.654
Riley: I don't think you need to show it to the-

34
00:01:27.654 --> 00:01:28.521
Sammy: No, the screen's recording.

35
00:01:28.521 --> 00:01:29.756
Luke: Oh right, we're screen recording.

36
00:01:32.092 --> 00:01:36.329
Riley: Oh, what!? Did it actually know what I look like? Kind of.

37
00:01:36.329 --> 00:01:37.630
There's a guy with glasses.

38
00:01:37.630 --> 00:01:38.665
Luke: Open it up, let's see.

39
00:01:38.798 --> 00:01:42.435
AI Linus: This is wild, every step feels like it's in slow motion up here.

40
00:01:42.535 --> 00:01:46.172
We're above the clouds and I can actually see the summit right there.

41
00:01:46.172 --> 00:01:48.641
Never thought I'd be saying this, but Riley from LMG is climbing-

42
00:01:48.641 --> 00:01:50.176
Riley: LMG, it said LMG.

43
00:01:50.176 --> 00:01:51.111
Luke: It did say LMG.

44
00:01:51.111 --> 00:01:52.445
Riley: You wrote "Riley from"-

45
00:01:52.645 --> 00:01:56.783
The prompt is "Riley from Linus Media Group climbing Mount Everest".

46
00:01:56.783 --> 00:02:00.053
Luke: And they made it Linus talking about Riley.

47
00:02:00.453 --> 00:02:01.321
Riley: Well, that's funny;

48
00:02:01.321 --> 00:02:05.625
Riley: I feel like this looks like maybe a mixture of me and Linus?
Luke: A mashup, yes.

49
00:02:05.692 --> 00:02:06.126
Luke: Yeah.

50
00:02:06.226 --> 00:02:07.861
Riley: Because there's big glasses.

51
00:02:07.861 --> 00:02:08.495
Luke: There are big glasses.

52
00:02:08.495 --> 00:02:10.697
Riley: But the face looks like Linus with the beard and everything.

53
00:02:10.797 --> 00:02:13.433
Luke: Do "Riley from LMG hosts a"-

54
00:02:14.400 --> 00:02:16.503
Riley: I don't think it, I don't think there's enough data.

55
00:02:16.503 --> 00:02:17.070
Riley: I've-

56
00:02:17.170 --> 00:02:18.404
This is a little vain;

57
00:02:18.505 --> 00:02:23.042
I've Googled myself a few times to see like where I'm at on the ranking of things.

58
00:02:23.042 --> 00:02:26.379
Like does the internet know like what I look like if you just kind of...

59
00:02:26.379 --> 00:02:29.949
but I'm pretty sure the AI, there's not enough AI training data for you to say

60
00:02:29.949 --> 00:02:32.051
"Riley from Linus Media Group" and for it to know who I am.

61
00:02:32.051 --> 00:02:33.319
Sammy: What if you say Riley from TechLinked?

62
00:02:33.319 --> 00:02:34.921
Riley: Maybe Riley Murdock.

63
00:02:35.021 --> 00:02:36.256
Luke: Yeah, maybe more specific.

64
00:02:36.256 --> 00:02:36.956
Riley: Let's find out.

65
00:02:36.956 --> 00:02:37.857
Riley: This is a new app.

66
00:02:37.857 --> 00:02:46.332
Sam Altman is known like in tech circles; but I think that like a ton of the people

67
00:02:46.332 --> 00:02:48.501
using these apps may not even know who Sam Altman is.

68
00:02:48.535 --> 00:02:54.140
They're just like, "cool, I love this AI stuff, ChatGPT".

69
00:02:54.274 --> 00:02:58.611
Like I have friends, I have friends who have been using ChatGPT,

70
00:02:58.611 --> 00:03:05.018
and love ChatGPT since December 2022; and are just like, oh, I love ChatGPT.

71
00:03:05.018 --> 00:03:06.386
I use it for everything.

72
00:03:06.386 --> 00:03:07.854
They might not know who Sam Altman is.

73
00:03:07.854 --> 00:03:08.421
Luke: Hmm.

74
00:03:08.922 --> 00:03:11.224
Riley: One of them for sure, who's like a teacher.

75
00:03:11.224 --> 00:03:12.659
Teachers love ChatGPT.

76
00:03:12.659 --> 00:03:13.092
Luke: Oh, yeah.

77
00:03:13.092 --> 00:03:14.160
Riley: They use it for-

78
00:03:14.160 --> 00:03:15.662
Luke: You think it's just the students? No.

79
00:03:15.662 --> 00:03:17.564
Riley: Yeah, lesson plans.

80
00:03:17.564 --> 00:03:17.997
Luke: Oh, yeah.

81
00:03:18.164 --> 00:03:19.666
Riley: You know, creative ideas.

82
00:03:19.666 --> 00:03:21.501
Like they use it all the time;

83
00:03:21.701 --> 00:03:24.037
and it's actually, to be fair that's like one of the things.

84
00:03:24.037 --> 00:03:25.004
Luke: It's pretty good at that.

85
00:03:25.004 --> 00:03:25.438
Riley: Yeah.

86
00:03:25.438 --> 00:03:28.975
Riley: I feel like that's actually a pretty decent application,

87
00:03:28.975 --> 00:03:30.176
and it saves them a ton of time.

88
00:03:30.176 --> 00:03:31.945
But anyway, I don't know; maybe you're right.

89
00:03:32.245 --> 00:03:32.912
Riley: Maybe that's-

90
00:03:32.946 --> 00:03:33.479
Sammy: This is a whole second video.

91
00:03:33.479 --> 00:03:34.614
Luke: Let's watch you climb Everest.

92
00:03:34.614 --> 00:03:37.584
Riley: Okay; let's see if it knows who Riley Murdock from TechLinked is.

93
00:03:37.984 --> 00:03:40.186
AI Riley: Every step feels like it's in slow motion up here.

94
00:03:40.186 --> 00:03:40.787
Riley: That's not me.

95
00:03:40.887 --> 00:03:43.122
AI Riley: We're above the clouds, above pretty much everything.

96
00:03:43.156 --> 00:03:45.758
I never thought a tech guy from a studio would be on the side of Everest,

97
00:03:45.758 --> 00:03:47.927
but here we are; summit's just a few hundred meters-

98
00:03:47.927 --> 00:03:51.364
Riley: "Never thought a tech guy from a studio would be doing this".

99
00:03:52.532 --> 00:03:53.032
Luke: That was it.

100
00:03:53.032 --> 00:03:55.001
Riley: Can you scan your face? Wait, this is great.

101
00:03:55.468 --> 00:03:55.969
Riley: Now we have-

102
00:03:55.969 --> 00:03:57.203
Luke: Don't scan your face, don't scan your face.

103
00:03:57.203 --> 00:03:58.638
Riley: I'm not going to scan your face; no, no, no.

104
00:03:58.705 --> 00:03:59.672
Luke: Sammy...

105
00:03:59.672 --> 00:04:01.207
Riley: I didn't mean to do this, how do I get out of here?

106
00:04:01.207 --> 00:04:02.108
Luke: Don't tell people that.

107
00:04:02.108 --> 00:04:02.842
Riley: I hate iPhone.

108
00:04:02.875 --> 00:04:06.913
Sorry, I feel like Linus right now; why is there no universal back button?

109
00:04:06.946 --> 00:04:10.583
Luke: "I only know one thing, it must work my way".

110
00:04:10.583 --> 00:04:11.251
Riley: What?-

111
00:04:11.284 --> 00:04:12.085
Luke: You did.

112
00:04:12.085 --> 00:04:13.052
Riley: Okay, but-

113
00:04:13.086 --> 00:04:14.520
Luke: I'm saying there is a way out.

114
00:04:14.520 --> 00:04:16.222
Riley: I guess saying you could like do that.

115
00:04:17.023 --> 00:04:17.724
Luke: Yeah.

116
00:04:17.857 --> 00:04:19.692
Riley: Oh, that's what you do, you do that.

117
00:04:20.426 --> 00:04:20.860
Riley: Okay.

118
00:04:22.061 --> 00:04:23.229
Riley: Alright, that's fair.

119
00:04:24.197 --> 00:04:27.600
Riley: Luke... Fluke Lafreniere.

120
00:04:27.600 --> 00:04:28.134
Luke: Yeah.

121
00:04:28.134 --> 00:04:31.971
Riley: Luke... Luke Lafreniere; is that how you say your name?

122
00:04:31.971 --> 00:04:32.872
Luke: I could- Sure.

123
00:04:32.872 --> 00:04:35.141
Riley: Lafreniere or Lafrenière?

124
00:04:35.508 --> 00:04:41.114
Luke: Someone left the little thing off on a document and now we don't do that;

125
00:04:41.114 --> 00:04:42.181
but it used to have that.

126
00:04:42.181 --> 00:04:43.650
Riley: Oh, you don't have the accent.

127
00:04:43.683 --> 00:04:44.450
Luke: Not anymore.

128
00:04:45.051 --> 00:04:45.985
Riley: Grave.

129
00:04:47.453 --> 00:04:47.987
Riley: Right?

130
00:04:48.087 --> 00:04:48.655
Luke: Sure.

131
00:04:48.688 --> 00:04:49.656
Riley: You're the French guy.

132
00:04:49.689 --> 00:04:50.657
Luke: Yeah, you'd think that.

133
00:04:50.657 --> 00:04:51.157
Riley: Okay.

134
00:04:51.190 --> 00:04:52.592
Luke: In high school, I took German.

135
00:04:53.559 --> 00:04:54.294
Sammy: You did?

136
00:04:54.327 --> 00:04:54.661
Luke: Yeah.

137
00:04:54.694 --> 00:04:56.896
Sammy: Oh, that makes sense why he spoke so well German when we were there.

138
00:04:56.896 --> 00:04:57.964
Luke: So well.

139
00:04:57.964 --> 00:04:59.465
Riley: So well.

140
00:04:59.465 --> 00:05:00.600
Sammy: I'm Korean, I don't...

141
00:05:01.567 --> 00:05:04.003
Riley: What should we make you do? I was climbing Mount Everest.

142
00:05:04.871 --> 00:05:08.508
Luke Lafreniere... because this is before we've trained it.

143
00:05:08.508 --> 00:05:12.812
What you're supposed to do is scan your face and give your data to the machine.

144
00:05:12.812 --> 00:05:13.246
Luke: Not doing that.

145
00:05:13.246 --> 00:05:14.047
Riley: We're not doing that.

146
00:05:14.047 --> 00:05:14.580
Luke: Nope.

147
00:05:14.614 --> 00:05:15.415
Riley: We're just trying to see.

148
00:05:15.415 --> 00:05:16.382
Sammy: "Eating spaghetti".

149
00:05:17.517 --> 00:05:17.950
Luke: Sure.

150
00:05:17.984 --> 00:05:19.419
I mean, I'm down for anything, right?

151
00:05:19.452 --> 00:05:22.755
I get to experience vicariously through myself through digital form.

152
00:05:23.089 --> 00:05:25.091
My digital twin gets to eat spaghetti.

153
00:05:25.091 --> 00:05:26.292
I'm hungry and I ain't eating that right now.

154
00:05:26.292 --> 00:05:26.693
Riley: Yeah, yeah.

155
00:05:26.693 --> 00:05:28.061
Luke: Maybe that will sate my hunger.

156
00:05:28.127 --> 00:05:29.796
Riley: Yeah; you know, it very well might be-

157
00:05:29.796 --> 00:05:33.900
Luke: The food of the future is videos of you eating, which takes more energy than the food;

158
00:05:34.534 --> 00:05:36.402
than it does to grow the food.

159
00:05:36.436 --> 00:05:40.173
Riley: We won't even need Ozempic, because we'll just watch videos of us eating food

160
00:05:40.173 --> 00:05:42.875
and we'll be like, wow I feel like, I feel like I did that.

161
00:05:43.776 --> 00:05:44.243
Riley: And even if-

162
00:05:44.243 --> 00:05:47.480
Luke: I could eat a horse! Time for my digital version.

163
00:05:48.114 --> 00:05:50.216
Luke: Yeah, we were talking about like the adoption of the app;

164
00:05:50.216 --> 00:05:53.619
so there's, there's Facebook has their slop scroller.

165
00:05:53.653 --> 00:05:56.189
OpenAI has their Sora slop scroller.

166
00:05:56.389 --> 00:05:58.558
Luke: There's- this is becoming a thing right now.

167
00:05:58.558 --> 00:06:00.727
To me, it feels like a desperation move,

168
00:06:01.260 --> 00:06:03.696
because they're doing their trading of millions.

169
00:06:03.696 --> 00:06:04.297
Riley: Yes, yes.

170
00:06:04.297 --> 00:06:06.332
Luke: Where they're circling the millions around in a few different companies.

171
00:06:06.332 --> 00:06:08.134
Riley: Incestuous trading, yeah.

172
00:06:08.134 --> 00:06:10.136
Luke: And then people get excited by that and they invest,

173
00:06:10.136 --> 00:06:13.239
so that tops up the loss that happens every time they move the money around.

174
00:06:14.073 --> 00:06:15.041
Riley: And it's buying time.

175
00:06:15.341 --> 00:06:15.975
Luke: Yes.

176
00:06:15.975 --> 00:06:18.745
Luke: This also feels like a maybe like,

177
00:06:18.745 --> 00:06:25.251
"look investors, I swear we're doing something"; because I don't feel like since...

178
00:06:25.885 --> 00:06:27.487
you know, it's gotten a little bit better.

179
00:06:27.587 --> 00:06:34.327
But since the inception of these tools, there hasn't been like a massive dynamics-of-the-world-changing splash.

180
00:06:34.327 --> 00:06:34.827
Riley: Mhmm.

181
00:06:34.827 --> 00:06:35.928
Luke: Because there was originally, right.

182
00:06:35.928 --> 00:06:37.230
You're talking about how teachers use it.

183
00:06:37.263 --> 00:06:37.497
Riley: Yeah.

184
00:06:37.497 --> 00:06:38.598
Luke: They do, absolutely.

185
00:06:38.631 --> 00:06:38.998
Riley: Yeah.

186
00:06:38.998 --> 00:06:43.469
Luke: But then using it isn't a variable that has changed since its original introduction.

187
00:06:43.469 --> 00:06:43.836
Riley: No.

188
00:06:43.836 --> 00:06:48.307
Luke: There hasn't been a like, like there are still customer service agents in the world.

189
00:06:48.441 --> 00:06:48.941
Riley: Yes.

190
00:06:48.941 --> 00:06:52.478
Luke: That is something that if you talk to people at the beginning of it,

191
00:06:52.478 --> 00:06:56.816
everybody was like, oh man, like the customer service bots that already exist,

192
00:06:56.816 --> 00:06:59.152
are going to get better and they're going to replace people.

193
00:06:59.152 --> 00:06:59.519
Riley: Yeah.

194
00:06:59.519 --> 00:07:00.386
Luke: That didn't happen.

195
00:07:00.386 --> 00:07:00.787
Riley: No.

196
00:07:00.887 --> 00:07:03.423
Luke: And when it did, it was like horribly disastrous.

197
00:07:03.456 --> 00:07:07.560
Riley: The customer service bots are a little better, but-

198
00:07:07.727 --> 00:07:10.430
Luke: And every once in a while they just give you everything for free and,

199
00:07:10.430 --> 00:07:10.730
Riley: Yeah.

200
00:07:10.730 --> 00:07:11.697
Luke: Do other crazy things.

201
00:07:11.697 --> 00:07:13.566
Riley: There was like one company that was like

202
00:07:13.566 --> 00:07:17.036
we're just getting rid of all of our people and just doing customer service bots.

203
00:07:17.069 --> 00:07:19.939
Riley: And then they had to roll it back because they were like, oh whoops.

204
00:07:20.339 --> 00:07:23.476
Riley: This wasn't the play, because they caused issues.

205
00:07:23.476 --> 00:07:28.748
Luke: Yeah; so all the investors that were like, yes, we get to fire everyone! We love this.

206
00:07:28.781 --> 00:07:29.282
Riley: Yeah.

207
00:07:29.282 --> 00:07:30.516
Luke: That didn't really happen.

208
00:07:30.817 --> 00:07:31.317
Riley: Yes.

209
00:07:31.317 --> 00:07:34.687
Luke: So I feel like they're fishing for something; they're like look we made an app.

210
00:07:34.687 --> 00:07:35.087
Riley: Yeah.

211
00:07:35.188 --> 00:07:40.993
Riley: Hank Green, his theory was that this, as well as ChatGPT Pulse,

212
00:07:40.993 --> 00:07:45.932
which was the- It's like the while you're not using the app.

213
00:07:45.932 --> 00:07:46.299
Luke: Yeah.

214
00:07:46.299 --> 00:07:49.202
Riley: ChatGPT uses what it knows about you to go and like find stuff

215
00:07:49.202 --> 00:07:52.104
that it thinks you might be interested in, and then you wake up and it like serves you.

216
00:07:52.104 --> 00:07:53.172
Riley: Here's what- Here, look at what I found.

217
00:07:53.172 --> 00:07:53.840
Luke: Ads.

218
00:07:54.140 --> 00:07:59.445
Riley: Yeah, and... well, it's not ads; but then he's saying they can use it for ads.

219
00:07:59.445 --> 00:08:00.012
Luke: It'll get there.

220
00:08:00.012 --> 00:08:00.313
Riley: Yeah.

221
00:08:00.313 --> 00:08:01.147
Luke: It'll get there.

222
00:08:01.147 --> 00:08:06.085
Riley: They're feeds that they can inject ads in; ChatGPT Pulse, and also in here,

223
00:08:06.152 --> 00:08:09.755
and maybe somehow get a little bit of revenue.

224
00:08:10.756 --> 00:08:16.262
Riley: But like, is it going to be enough revenue to keep the bubble going forever?

225
00:08:16.295 --> 00:08:18.030
Luke: I feel like they should have done the classic move

226
00:08:18.030 --> 00:08:22.301
and just never charge for ChatGPT in the first place; just be in pre-revenue forever.

227
00:08:25.271 --> 00:08:26.606
Riley: Well, wouldn't that just be worse than-

228
00:08:26.606 --> 00:08:31.210
Luke: ...pre-revenue- What the investors love, because then there's potential energy.

229
00:08:32.645 --> 00:08:33.312
Riley: Well, if...

230
00:08:33.312 --> 00:08:34.480
Luke: If you start charging money-

231
00:08:34.480 --> 00:08:37.116
Riley: Then they're like, hold on; now the proof of concept...

232
00:08:37.116 --> 00:08:38.217
Luke: This is now measurable.

233
00:08:38.217 --> 00:08:39.952
Riley: Yeah, and we're not getting there.

234
00:08:39.986 --> 00:08:40.520
Luke: Yeah.

235
00:08:40.553 --> 00:08:43.189
Riley: Yeah, you just want to live and live in dream world forever.

236
00:08:43.789 --> 00:08:44.524
Luke: No.

237
00:08:48.728 --> 00:08:49.662
Luke: That's not bad, though.

238
00:08:49.662 --> 00:08:50.396
AI Luke: That's so good, perfect amount of sauce.

239
00:08:50.396 --> 00:08:51.797
AI Luke: Cheers to whoever made this.

240
00:08:51.831 --> 00:08:53.566
Riley: It sounds like Linus a little bit.

241
00:08:53.566 --> 00:08:54.233
AI Luke: Let's see.

242
00:08:54.433 --> 00:08:55.701
Sammy: Why the default to Linus?

243
00:08:55.735 --> 00:08:56.235
Riley: Okay, wait.

244
00:08:56.235 --> 00:08:57.603
Sammy: Did you put Linus Tech Tips?

245
00:08:58.571 --> 00:09:00.106
AI Luke: That's so good, perfect amount of sauce.

246
00:09:00.106 --> 00:09:01.374
AI Luke: Cheers to whoever made this.

247
00:09:01.374 --> 00:09:01.941
Luke: That is Linus.

248
00:09:01.941 --> 00:09:03.643
Riley: That's clearly pulling stuff from Linus.

249
00:09:03.643 --> 00:09:08.347
Luke: So there's so many searches for me that incorporate Linus,

250
00:09:08.347 --> 00:09:11.017
and it has way better training data for Linus's voice.

251
00:09:11.017 --> 00:09:11.450
Riley: Yeah.

252
00:09:11.450 --> 00:09:14.520
Luke: So it just shoved Linus's voice in there, that's nuts.

253
00:09:14.520 --> 00:09:15.221
Riley: Yeah.

254
00:09:15.788 --> 00:09:18.858
Riley: Well, that's what's interesting about the public figures thing.

255
00:09:18.858 --> 00:09:23.329
Luke: It got base things; I have a beard, I'm a white dude, my hair goes swoop.

256
00:09:23.362 --> 00:09:25.798
Riley: Yeah, and just like me, like this thumbnail from far away,

257
00:09:25.831 --> 00:09:27.099
I could kind of think that's me.

258
00:09:27.099 --> 00:09:29.802
Luke: The thumbnail, I was like holy crap it nailed it.

259
00:09:29.802 --> 00:09:30.102
Riley: Yeah.

260
00:09:30.102 --> 00:09:31.571
Luke: And then we played the video and not so much,

261
00:09:31.571 --> 00:09:33.005
and that's kind of the same on this one.

262
00:09:33.005 --> 00:09:34.807
Riley: Yeah, and this one was more specific;

263
00:09:34.807 --> 00:09:37.009
we were like Riley Murdock from TechLinked.

264
00:09:37.043 --> 00:09:40.880
I mean, to me this is a bit encouraging that I know that like,

265
00:09:40.880 --> 00:09:47.153
okay, my soul hasn't been completely consumed by the data available about me online yet;

266
00:09:47.186 --> 00:09:49.088
but Linus is, Linus is far gone.

267
00:09:49.155 --> 00:09:50.156
Luke: So...

268
00:09:50.156 --> 00:09:52.124
Riley: Linus is completely absorbed.

269
00:09:52.124 --> 00:09:54.393
Luke: Speaking of which, let's make Linus do something.

270
00:09:54.393 --> 00:09:54.827
Riley: Yes.

271
00:09:54.927 --> 00:10:00.333
Luke: And now use the very clearly heavily-trained-on person to do something.

272
00:10:00.333 --> 00:10:04.737
Riley: Linus leading an AI cult or something.

273
00:10:04.737 --> 00:10:06.405
That would be too crazy.

274
00:10:08.007 --> 00:10:10.543
Riley: Because Linus doesn't love AI, he hates AI.

275
00:10:11.177 --> 00:10:11.777
Riley: And so...

276
00:10:11.777 --> 00:10:12.778
Luke: He's starting to use it.

277
00:10:12.912 --> 00:10:13.346
Riley: What?

278
00:10:13.446 --> 00:10:14.347
Luke: He's starting to use it.

279
00:10:14.647 --> 00:10:16.582
Riley: Oh, did- Really? Because he was doing that video.

280
00:10:16.582 --> 00:10:17.783
Luke: He did the vibe coding video.

281
00:10:17.783 --> 00:10:21.087
Luke: And then he's just like not really completely stopped using it.

282
00:10:22.121 --> 00:10:23.489
The doors are now open.

283
00:10:23.589 --> 00:10:24.857
Luke: Okay, so the prompt is

284
00:10:24.857 --> 00:10:28.461
"Linus Sebastian stopping traffic to raise awareness about badminton courts disappearing".

285
00:10:29.028 --> 00:10:29.695
Riley: Love it.

286
00:10:30.096 --> 00:10:31.430
Sammy: How long does it take to generate?

287
00:10:31.430 --> 00:10:32.832
It's like a few minutes, right?

288
00:10:33.032 --> 00:10:34.467
Luke: Yeah, it's not that long.

289
00:10:34.934 --> 00:10:35.668
Luke: I want to see-

290
00:10:35.668 --> 00:10:37.136
Riley: It was a few minutes, yeah.

291
00:10:37.903 --> 00:10:39.572
Which is like really impressive.

292
00:10:39.572 --> 00:10:40.172
Luke: Oh, yeah.

293
00:10:40.206 --> 00:10:44.610
Riley: That it can do video and audio that quick.

294
00:10:44.610 --> 00:10:49.081
Luke: I feel like this is like, it's kind of cheap.

295
00:10:49.115 --> 00:10:52.151
I hate... one of the things I hate about this whole like

296
00:10:52.151 --> 00:10:57.056
generating AI vertical social content,

297
00:10:57.590 --> 00:11:01.193
Riley: is the fact that it's so easy to make photorealistic stuff.

298
00:11:01.227 --> 00:11:02.728
Riley: What? What was that?

299
00:11:02.828 --> 00:11:07.266
Luke: This content may violate our guardrails concerning third-party likeness.

300
00:11:07.667 --> 00:11:10.403
Riley: Because they... he's a public figure.

301
00:11:12.138 --> 00:11:13.105
Luke: So let's see if we can get around it.

302
00:11:13.105 --> 00:11:17.276
Riley: Because they increased the restrictions on public figures,

303
00:11:17.276 --> 00:11:21.881
Riley: because people are using, people are generating-

304
00:11:21.881 --> 00:11:24.283
Well actually that was about MLK specifically;

305
00:11:24.283 --> 00:11:28.354
but I think they must have rolled it out as like public figures in general.

306
00:11:28.387 --> 00:11:31.824
Sammy: But then why would they, why are we allowed to do like Riley from Linus Media Group?

307
00:11:32.425 --> 00:11:33.092
Sammy: And then it generates...?

308
00:11:33.125 --> 00:11:37.129
Riley: I think it's because the system doesn't know that Riley from Linus Media Group

309
00:11:37.163 --> 00:11:38.831
is like a notable public figure.

310
00:11:38.831 --> 00:11:42.001
I think it's just saying like, oh there's a guy in,

311
00:11:42.001 --> 00:11:44.737
there's a guy in Linus Media Group named Riley? Okay, I guess so.

312
00:11:44.770 --> 00:11:49.108
Riley: But it's not like we, it's not like the AI system has in its database

313
00:11:49.108 --> 00:11:51.544
"we know who Riley from Linus Media Group is".

314
00:11:51.544 --> 00:11:52.745
Luke: I'm pretty sure I just got around it.

315
00:11:52.845 --> 00:11:55.915
I just said "the main host from Linus Media Group stopping traffic";

316
00:11:55.915 --> 00:11:59.552
and it was like cool, so we'll see if it generates Linus.

317
00:11:59.652 --> 00:12:01.554
Riley: I want to bring it back to the boomer thing.

318
00:12:01.587 --> 00:12:01.987
Luke: Sure.

319
00:12:02.021 --> 00:12:06.792
Riley: Cause like clearly people are into this, people are posting videos.

320
00:12:07.026 --> 00:12:13.733
I had the opinion, I'm not sure whether I still hold it, that this will get old.

321
00:12:13.766 --> 00:12:20.106
That like posting... like the novelty of being able to generate any video

322
00:12:20.106 --> 00:12:22.408
with anybody doing anything will wear off,

323
00:12:22.441 --> 00:12:26.011
and that it will quickly get to a point where we're kind of like,

324
00:12:26.011 --> 00:12:30.316
okay, but are you doing something interesting with it?

325
00:12:30.316 --> 00:12:31.951
Like what, how is this different?

326
00:12:31.984 --> 00:12:35.221
You know, it's like okay, you put Sam Altman's head on the Skibidi Toilet thing;

327
00:12:35.254 --> 00:12:38.157
like when everybody can do that in two minutes,

328
00:12:39.091 --> 00:12:41.460
what's the value in having that video at all?

329
00:12:42.294 --> 00:12:42.895
Luke: I hear you.

330
00:12:43.562 --> 00:12:47.433
I think the main staying power is when people can't tell, and there's a lot of that.

331
00:12:47.533 --> 00:12:51.036
Riley: Right now we're talking about videos posted to the AI slop app.

332
00:12:51.070 --> 00:12:51.570
Luke: Yeah.

333
00:12:51.604 --> 00:12:53.806
Riley: So when you see a video on there, you're like that's AI slop.

334
00:12:53.806 --> 00:12:56.008
Luke: Like what I said, Theo's video did not pop off here,

335
00:12:56.008 --> 00:13:00.146
it popped off on Twitter or X or whatever.

336
00:13:00.146 --> 00:13:06.252
Luke: And like this video, like I suspect this would fool most people.

337
00:13:07.286 --> 00:13:08.754
AI Luke: That's so good, perfect amount of sauce.

338
00:13:09.121 --> 00:13:10.456
AI Luke: Cheers to whoever made this.

339
00:13:11.123 --> 00:13:12.858
Luke: It's not like a good video.

340
00:13:12.892 --> 00:13:17.429
Riley: It would fool people into thinking that's a real video maybe.

341
00:13:17.429 --> 00:13:17.830
Luke: Yeah.

342
00:13:17.897 --> 00:13:20.933
Riley: But I feel like, I don't know, now the AI-

343
00:13:20.933 --> 00:13:24.537
It used to be that you look at an AI video and you're kind of looking for like artifacts

344
00:13:24.537 --> 00:13:27.506
and like, oh, the fingers aren't right or whatever; and now we're past that point.

345
00:13:27.540 --> 00:13:29.909
And now people are looking at a video- the video is ready;

346
00:13:30.142 --> 00:13:32.578
now people are looking at videos being like,

347
00:13:32.578 --> 00:13:36.115
does it make sense for a human to have made this video and posted it?

348
00:13:36.115 --> 00:13:39.752
Riley: Because if it doesn't, then this might be AI.

349
00:13:39.752 --> 00:13:40.352
Luke: There's so much-

350
00:13:40.386 --> 00:13:43.255
Riley: Like he's just like, "oh kudos to whoever, that was really great".

351
00:13:43.289 --> 00:13:45.324
Riley: And it's like, he's holding the fork kind of weird;

352
00:13:45.357 --> 00:13:47.193
and like, it's just kind of like what is this for?

353
00:13:47.193 --> 00:13:52.865
Like when we're, when we're watching videos we kind of are like, what's the context here?

354
00:13:52.865 --> 00:13:54.266
Why did someone film this?

355
00:13:54.366 --> 00:13:58.204
And that's the, now the AI kind of like alarm bells go off when you're just like

356
00:13:58.204 --> 00:13:59.505
why would someone do this?

357
00:13:59.505 --> 00:14:02.208
Luke: The thing that sucks for me is that's some of my favorite content.

358
00:14:02.241 --> 00:14:04.977
Luke: Not this scenario of some dude eating spaghetti,

359
00:14:04.977 --> 00:14:05.744
but when it's like-

360
00:14:05.778 --> 00:14:07.112
Riley: Content lacking content?

361
00:14:07.112 --> 00:14:10.883
Luke: I actually can't believe that someone filmed this video, that's wild.

362
00:14:10.883 --> 00:14:14.420
Like I saw- A friend of mine shared me a video the other day

363
00:14:14.420 --> 00:14:18.991
of some dude sitting in a car and the camera pans up from his steering wheel,

364
00:14:18.991 --> 00:14:21.927
and it looks like there's like a bite taken out of the steering wheel;

365
00:14:21.927 --> 00:14:24.830
and then he popped, he pours two liters of lemonade onto his dash.

366
00:14:25.764 --> 00:14:27.266
Luke: And then the video just ends.

367
00:14:27.533 --> 00:14:28.667
Riley: Wait, is that AI or not?

368
00:14:28.667 --> 00:14:29.401
Luke: And it's real.

369
00:14:29.401 --> 00:14:30.102
Riley: It's real!?

370
00:14:30.102 --> 00:14:30.536
Luke: Yeah.

371
00:14:30.603 --> 00:14:33.639
Riley: See, I feel like I see a video like that and I know that it's real,

372
00:14:33.672 --> 00:14:35.307
and I'm like this is hilarious to me.

373
00:14:35.307 --> 00:14:35.875
Luke: Yes.

374
00:14:35.908 --> 00:14:38.344
Riley: I have no idea why they're doing this, but it's so funny.

375
00:14:38.344 --> 00:14:38.944
Luke: So funny.

376
00:14:38.944 --> 00:14:42.047
Riley: But if it's an AI video it's not funny at all; I'm just kind of like, huh?

377
00:14:42.081 --> 00:14:43.182
Luke: Have you seen milk pocket?

378
00:14:43.849 --> 00:14:44.183
Riley: No.

379
00:14:44.183 --> 00:14:45.718
Luke: People pour milk in their pocket.

380
00:14:45.718 --> 00:14:46.452
Riley: Is this real?

381
00:14:46.452 --> 00:14:48.754
Luke: And then they're like, "why won't it stay in my pocket?".

382
00:14:49.588 --> 00:14:50.990
Luke: Which I think is hilarious.

383
00:14:51.023 --> 00:14:54.627
But then the second that I'm like hmmm, I don't think that's real,

384
00:14:54.627 --> 00:14:55.628
then it stops being fun.

385
00:14:56.729 --> 00:14:57.529
Riley: Yeah.

386
00:14:57.663 --> 00:14:59.131
Luke: And that's like going to happen.

387
00:14:59.131 --> 00:15:01.367
Riley: The concept of pouring milk in your pocket,

388
00:15:01.400 --> 00:15:05.371
and wondering why it's not staying there is funny; that's a funny concept.

389
00:15:05.371 --> 00:15:08.641
And now I'm like... because the counter argument to my kind of like

390
00:15:08.641 --> 00:15:13.312
it's not funny because it's AI thing is, if it's a funny idea, it's funny.

391
00:15:13.545 --> 00:15:14.480
Riley: You know? Like that idea-

392
00:15:14.480 --> 00:15:19.351
Luke: But it is... I don't think it's funny the second I know it's not real.

393
00:15:19.351 --> 00:15:20.386
Luke: Because the funny part about it

394
00:15:20.386 --> 00:15:21.420
Riley: But that's the boomer.

395
00:15:21.420 --> 00:15:24.623
Luke: is that some dude actually poured milk into his pocket, that's hilarious.

396
00:15:24.657 --> 00:15:26.292
Riley: But that's the argument is that it's like,

397
00:15:26.325 --> 00:15:28.994
you're just, it's just your bias.

398
00:15:29.028 --> 00:15:30.863
You're- You think it's a funny thing.

399
00:15:31.163 --> 00:15:33.465
And then when someone tells you, well guess what?

400
00:15:33.499 --> 00:15:36.702
That video you just watched that you thought was funny, it's AI.

401
00:15:36.702 --> 00:15:39.371
And you'll be like, oh, it's not funny anymore!

402
00:15:39.371 --> 00:15:40.139
Luke: Ooh!

403
00:15:40.172 --> 00:15:42.908
Riley: Like, I wish I could go back in time and stop myself from laughing!

404
00:15:42.942 --> 00:15:44.843
Riley: It's like, okay but you laughed, it was funny.

405
00:15:44.910 --> 00:15:45.744
Luke: So that's, I think-

406
00:15:45.778 --> 00:15:46.545
Riley: That's the counterpoint.

407
00:15:46.679 --> 00:15:49.682
Luke: That's my problem, is I don't know if I'll laugh now;

408
00:15:49.682 --> 00:15:52.151
because my initial reaction will be skepticism.

409
00:15:52.751 --> 00:15:53.352
Riley: And that-

410
00:15:53.352 --> 00:15:54.286
Luke: That's the problem.

411
00:15:54.620 --> 00:15:56.822
Luke: That's the thing that I'm concerned about.

412
00:15:56.822 --> 00:15:57.456
Riley: Yes.

413
00:15:57.456 --> 00:15:59.925
Riley: I think that's kind of- Yeah, I'm right there with you.

414
00:15:59.959 --> 00:16:01.260
Luke: Now, let's watch Linus.

415
00:16:01.260 --> 00:16:01.727
Riley: Okay, let's watch.

416
00:16:01.727 --> 00:16:02.194
Riley: Yeah, yeah.

417
00:16:02.227 --> 00:16:04.496
AI Linus?: Folks, just a second; I'm trying to keep these lanes clear.

418
00:16:04.530 --> 00:16:05.130
Thank you.

419
00:16:05.331 --> 00:16:07.700
Yeah, I'm that guy stopping traffic, but it's for a reason.

420
00:16:07.700 --> 00:16:08.634
AI Mark Cuban?: Our neighborhood's only public badminton courts

421
00:16:08.634 --> 00:16:09.368
Riley: Mark Cuban!?

422
00:16:09.368 --> 00:16:10.970
AI Mark Cuban?: are about to get turned into a parking lot.

423
00:16:11.003 --> 00:16:12.004
AI Mark Cuban?: These games built-

424
00:16:12.004 --> 00:16:14.807
AI Linus?: But, just a second, I'm trying to keep these lanes clear, thank you!

425
00:16:14.807 --> 00:16:15.674
Luke: Wait, what?

426
00:16:15.708 --> 00:16:16.075
Riley: What?

427
00:16:16.075 --> 00:16:17.376
AI Linus?: Yeah, I'm that guy stopping traffic, but it's for a reason.

428
00:16:17.376 --> 00:16:19.178
AI Mark Cuban?: Our neighborhood's only public badminton courts

429
00:16:19.211 --> 00:16:20.612
AI Mark Cuban?: are about to get turned into a parking lot.

430
00:16:20.612 --> 00:16:21.814
Riley: Oh, and then it just switches to the...

431
00:16:22.481 --> 00:16:26.051
Riley: Mark Cuban is saying a sentence and then it switches to him finishing it.

432
00:16:26.085 --> 00:16:27.353
Luke: I know what happened,

433
00:16:27.353 --> 00:16:30.255
when I was trying to edit it to say "the main host of Linus Media Group".

434
00:16:31.623 --> 00:16:34.727
Luke: I got kind of lost in the app for a second and I think I tagged it.

435
00:16:37.262 --> 00:16:38.864
Riley: You tagged Mark Cuban in it.

436
00:16:38.897 --> 00:16:39.832
Luke: That's crazy.

437
00:16:39.865 --> 00:16:43.435
Riley: Well, thankfully we have Mark on our side.

438
00:16:43.435 --> 00:16:44.203
Luke: Yeah.

439
00:16:44.203 --> 00:16:45.237
Riley: Helping to raise awareness.

440
00:16:45.270 --> 00:16:45.804
Luke: Yeah.

441
00:16:45.804 --> 00:16:47.906
Riley: There aren't enough badminton courts,

442
00:16:47.906 --> 00:16:52.344
and that's why Linus is going to build four more badminton centers.

443
00:16:52.378 --> 00:16:53.846
Luke: Milk pocket maybe.

444
00:16:53.979 --> 00:16:55.014
Riley: Yeah, yeah.

445
00:16:55.214 --> 00:16:57.716
Riley: Okay, but the milk pocket is a real video.

446
00:16:57.716 --> 00:16:58.350
Luke: Yeah.

447
00:16:58.384 --> 00:17:00.319
Riley: Okay, and you're going to try and recreate it.

448
00:17:00.352 --> 00:17:02.554
Luke: There's milk pocket, there's milk drawer.

449
00:17:03.022 --> 00:17:03.589
Sammy: Oh yeah.

450
00:17:03.622 --> 00:17:06.692
Luke: So when he pours a ton of milk into a drawer and then he pulls the drawer out,

451
00:17:06.692 --> 00:17:07.993
and he dunks it- You've seen this?

452
00:17:07.993 --> 00:17:08.627
Sammy: I've seen it, yeah.

453
00:17:08.627 --> 00:17:09.161
Luke: Dunks a cookie.

454
00:17:09.161 --> 00:17:09.995
Sammy: People in the car-

455
00:17:09.995 --> 00:17:11.397
Luke: This is prime content.

456
00:17:11.397 --> 00:17:11.830
Riley: Okay.

457
00:17:11.830 --> 00:17:14.733
Luke: Yeah, I can do iJustine, do Jake Paul.

458
00:17:14.733 --> 00:17:16.869
Riley: Wait, sorry; is this showing you people?

459
00:17:16.869 --> 00:17:18.137
Luke: Cameos, yes.

460
00:17:18.237 --> 00:17:20.873
Riley: Oh, so these are people who have scanned themselves in,

461
00:17:20.873 --> 00:17:23.642
and have now made it available for other people to use?

462
00:17:23.642 --> 00:17:27.446
Luke: So when I was trying to type that stuff in, I fat-fingered and clicked on Mark Cuban,

463
00:17:27.446 --> 00:17:28.080
Riley: Mmmm.

464
00:17:28.080 --> 00:17:29.181
Luke: and then it put them at the end.

465
00:17:29.214 --> 00:17:31.250
Sammy: You can do Esfand, I know Esfand does a lot of-

466
00:17:31.250 --> 00:17:33.018
Luke: Oh, there he is, Esfand.

467
00:17:33.185 --> 00:17:33.685
Sammy: Yeah.

468
00:17:33.685 --> 00:17:35.654
Sammy: He watches them on the stream.

469
00:17:35.654 --> 00:17:36.922
Luke: Let me cook for a sec.

470
00:17:37.089 --> 00:17:41.860
Luke: Esfand TV pours milk into the large cargo pocket in his... of his shorts,

471
00:17:42.294 --> 00:17:45.397
and tries to dunk a chocolate chip cookie into it;

472
00:17:45.497 --> 00:17:48.834
but the milk keeps seeping through the material of the shorts down his leg

473
00:17:48.834 --> 00:17:50.669
and onto the ground, which makes him sad.

474
00:17:50.669 --> 00:17:55.607
And then at the end of the video, he sings "subscribe to Esfand", like it's a sea shanty.

475
00:17:56.375 --> 00:17:56.909
Sammy: So do you-

476
00:17:57.009 --> 00:17:57.943
Luke: That's the prompt.

477
00:17:58.077 --> 00:18:01.880
Luke: Okay, so what I typed in, I can remove my addition if you want.

478
00:18:01.914 --> 00:18:05.751
I said, make a realistic depiction of a computer eating a student's homework.

479
00:18:05.751 --> 00:18:08.754
The student then needs to explain to their teacher why their homework is late.

480
00:18:08.754 --> 00:18:09.321
Riley: Perfect.

481
00:18:09.421 --> 00:18:09.855
Luke: Okay.

482
00:18:09.955 --> 00:18:11.256
Riley: We'll see if that's funny.

483
00:18:11.256 --> 00:18:13.325
Riley: I mean, this is not a very objective test.

484
00:18:13.325 --> 00:18:17.129
Luke: Do we want to use a cameo of somebody as the student and the teacher,

485
00:18:18.130 --> 00:18:19.665
or just raw dog it?

486
00:18:19.765 --> 00:18:21.800
Riley: The person who made this argument to me

487
00:18:21.834 --> 00:18:24.002
definitely was making the argument that like,

488
00:18:24.002 --> 00:18:26.038
oh, it'd be funny because you can inject-

489
00:18:26.071 --> 00:18:28.240
You can depict yourself,

490
00:18:28.240 --> 00:18:30.175
Riley: or some famous person doing it.

491
00:18:30.943 --> 00:18:33.946
Luke: So their teacher is... let's use somebody different.

492
00:18:33.946 --> 00:18:37.749
We've used, we've done Mark Cuban, we've done Esfand.

493
00:18:37.749 --> 00:18:39.318
Riley: Their teacher's iJustine.

494
00:18:39.451 --> 00:18:40.586
Luke: The teacher's iJustine? Okay.

495
00:18:40.586 --> 00:18:43.622
Riley: She's made her likeness available; that's kind of interesting.

496
00:18:43.622 --> 00:18:48.026
Riley: Because they're, they're early adopters; they're just like, they're embracing the future.

497
00:18:48.026 --> 00:18:52.097
Sammy: Uh, you can try Carterpc, he's a short-form influencer.

498
00:18:52.264 --> 00:18:54.466
Luke: We got a content violation for Esfand.

499
00:18:54.500 --> 00:18:56.869
Riley: It's probably because you got, I don't know,

500
00:18:57.269 --> 00:19:01.039
milk leaking down someone's leg, and I don't know.

501
00:19:01.039 --> 00:19:03.208
Sammy: Oh, it might be, it might be...

502
00:19:03.208 --> 00:19:05.978
Luke: They probably think- I genuinely did not mean it that way.

503
00:19:05.978 --> 00:19:06.645
Sammy: Yeah.

504
00:19:07.012 --> 00:19:09.848
Riley: Yeah, okay, Luke.

505
00:19:10.482 --> 00:19:13.986
Luke: I won't, I won't mention the leg; I'll just say leaking.

506
00:19:13.986 --> 00:19:16.255
Sammy: There was a whole issue on Twitter where-

507
00:19:17.389 --> 00:19:18.524
Riley: Milk in his pants.

508
00:19:18.524 --> 00:19:19.091
Sammy: Yeah.

509
00:19:19.091 --> 00:19:22.494
Sammy: Where people would put, say like get like a selfie of like an influencer,

510
00:19:22.494 --> 00:19:26.498
a female influencer; and @Grok, "put milk on their face".

511
00:19:26.732 --> 00:19:27.166
Riley: Yeah.

512
00:19:27.166 --> 00:19:28.901
Sammy: Yeah, that was a huge, that was a huge one.

513
00:19:28.901 --> 00:19:30.936
Luke: Okay, so this whole milk thing might not work at all.

514
00:19:30.936 --> 00:19:31.236
Sammy: Yeah.

515
00:19:31.236 --> 00:19:33.305
Riley: Grok did it, cause Grok doesn't care.

516
00:19:33.305 --> 00:19:34.673
Luke: Grok doesn't care at all.

517
00:19:34.673 --> 00:19:37.943
Riley: Might be intelligent enough to know that if you're pouring milk in a pants pocket,

518
00:19:37.943 --> 00:19:41.713
it's going to leak out; or it might just be like yeah, there's nothing wrong with that.

519
00:19:43.282 --> 00:19:45.184
Riley: Yeah, that seems like a reasonable thing to do;

520
00:19:45.217 --> 00:19:48.153
pour some milk in your cargo pants pocket to take with you.

521
00:19:49.221 --> 00:19:50.088
Sammy: Average Tuesday.

522
00:19:50.255 --> 00:19:53.025
Riley: You got milk in one pocket and some tater tots in the other;

523
00:19:53.058 --> 00:19:54.526
you're good for the day, that's the lunch.

524
00:19:54.760 --> 00:19:56.328
Luke: Dude, if I could actually do that, that'd be amazing.

525
00:19:56.328 --> 00:19:57.863
Riley: That's a nutritious lunch right there.

526
00:19:58.163 --> 00:20:00.065
Luke: Okay, we're trying again; let's see what it does.

527
00:20:00.532 --> 00:20:02.834
Luke: Nope, it kicks it out immediately.

528
00:20:02.868 --> 00:20:04.169
Luke: So I, I made it to-

529
00:20:04.203 --> 00:20:05.871
Riley: You heard it here first folks.

530
00:20:05.904 --> 00:20:10.509
No milk! No AI milk! No milking the AI.

531
00:20:10.509 --> 00:20:14.680
Luke: It, uh, it's not- I think the app is having an issue actually.

532
00:20:14.713 --> 00:20:19.518
Luke: Cause I keep submitting changes and it isn't doing the changes.

533
00:20:19.518 --> 00:20:20.519
Sammy: AWS is down again.

534
00:20:20.519 --> 00:20:21.553
Riley: Well, and now it knows.

535
00:20:21.987 --> 00:20:26.525
Riley: That, that prompt, that text box that you're typing in is tainted now.

536
00:20:26.525 --> 00:20:27.926
Luke: Yeah, so I'm doing a new one.

537
00:20:28.193 --> 00:20:31.630
Riley: Oh, this guy again, he's obsessed with milk.

538
00:20:31.630 --> 00:20:31.997
Luke: Yup.

539
00:20:31.997 --> 00:20:33.765
Riley: That's the milk guy, he's back.

540
00:20:34.499 --> 00:20:37.336
Riley: He wants to pour milk in things.

541
00:20:37.469 --> 00:20:41.273
Sammy: I saw an article today, I think Bryan Cranston from Breaking Bad,

542
00:20:41.340 --> 00:20:45.644
he was saying thank you to the Sora people because they put guardrails

543
00:20:45.644 --> 00:20:48.947
on like Michael Jackson and Breaking Bad and stuff like that.

544
00:20:48.981 --> 00:20:51.183
Riley: This is a really interesting phenomenon to me,

545
00:20:51.216 --> 00:20:56.221
that like AI existing and allowing you to make a video,

546
00:20:56.255 --> 00:20:59.358
photorealistic video of whatever your brain can think up.

547
00:20:59.358 --> 00:21:01.560
Riley: And then individuals being like,

548
00:21:01.593 --> 00:21:06.765
hey, I don't want it to be able to make a video of my specific thing.

549
00:21:06.765 --> 00:21:09.735
And then the company is like, oh shoot, sorry;

550
00:21:09.735 --> 00:21:13.605
and then they change it so that you can't make that specific thing anymore,

551
00:21:13.605 --> 00:21:15.807
but you can still do all the other stuff.

552
00:21:16.642 --> 00:21:20.712
Riley: And it's like, okay, wait; so if every celebrity in the world was like,

553
00:21:20.746 --> 00:21:28.320
hey OpenAI, don't let people make a version of my show, with like,

554
00:21:28.320 --> 00:21:32.524
with dinosaurs in it or something like, don't let, don't let them change my show.

555
00:21:32.524 --> 00:21:33.425
And then OpenAI would be like,

556
00:21:33.425 --> 00:21:35.594
ooh, we better change that show specifically.

557
00:21:35.594 --> 00:21:36.862
Luke: I think that's what they've said,

558
00:21:36.862 --> 00:21:39.264
is they want people to, they want it to be opt out.

559
00:21:39.965 --> 00:21:40.499
Sammy: Yeah.

560
00:21:40.532 --> 00:21:43.135
Luke: Which is not how copyright works at all.

561
00:21:43.135 --> 00:21:44.036
Riley: I know!

562
00:21:44.036 --> 00:21:44.736
Luke: But that's what they want.

563
00:21:44.736 --> 00:21:45.637
Riley: Honestly it seems like-

564
00:21:46.204 --> 00:21:49.007
Luke: I feel like this- And I'm being actually serious,

565
00:21:49.007 --> 00:21:52.044
I feel like this app is like vibe coded, cause it is a mess.

566
00:21:52.044 --> 00:21:52.644
Riley: Yeah.

567
00:21:52.778 --> 00:21:54.112
Luke: It is actually very annoying.

568
00:21:54.112 --> 00:21:55.414
Riley: It might be vibe coded.

569
00:21:55.414 --> 00:22:01.420
I feel like AI companies are using AI to shorten,

570
00:22:01.420 --> 00:22:04.656
to like cut corners in ways that don't even make sense.

571
00:22:04.656 --> 00:22:04.956
Luke: Yeah.

572
00:22:04.956 --> 00:22:08.593
Riley: I just, I just did the Meta Ray-Ban Displays on ShortCircuit,

573
00:22:08.593 --> 00:22:12.898
and the tutorial voice is an AI voice.

574
00:22:12.898 --> 00:22:13.498
Luke: Like was that necessary?

575
00:22:13.498 --> 00:22:14.166
Riley: And I'm like, couldn't you just-

576
00:22:14.166 --> 00:22:14.966
Luke: On a premium product?

577
00:22:14.966 --> 00:22:19.104
Riley: Could you just bring in a voice actress for like an hour?

578
00:22:19.104 --> 00:22:22.374
And just record like "now swipe forward".

579
00:22:22.374 --> 00:22:27.112
Like really? You couldn't do that? You had to use an AI voice for this?

580
00:22:27.112 --> 00:22:28.814
And it's like weird, it glitches out.

581
00:22:28.814 --> 00:22:31.750
It's like, did you feel the "bu-uzz"?

582
00:22:32.584 --> 00:22:35.220
Riley: It like had weird deliveries like that, and I'm like...

583
00:22:35.754 --> 00:22:37.255
Luke: I got a notification saying that it's done.

584
00:22:37.255 --> 00:22:38.523
Riley: But just, but it-

585
00:22:38.523 --> 00:22:42.127
I feel like when it's actually done, it stops saying generating there.

586
00:22:42.127 --> 00:22:44.629
Luke: I agree, but why would I get a notification saying that it's done?

587
00:22:44.629 --> 00:22:46.765
Riley: Because it's vibe that- Luke Murdock?

588
00:22:46.765 --> 00:22:48.233
You made the account "Luke Murdock".

589
00:22:48.233 --> 00:22:48.633
Sammy: Yeah.

590
00:22:48.734 --> 00:22:49.568
Luke: Here we go.

591
00:22:50.202 --> 00:22:52.904
Riley: Okay, the milk pocket thing.

592
00:22:53.038 --> 00:22:55.474
AI Esfand: Pocket milk time, got the big gallon.

593
00:22:55.507 --> 00:22:57.109
Luke: That's just down his shorts, that's not into the-

594
00:22:57.109 --> 00:22:58.076
AI Esfand: Oh yeah, that's a lot of milk.

595
00:22:58.076 --> 00:22:59.111
AI Esfand: Okay, cookie dunk.

596
00:22:59.111 --> 00:22:59.544
AI Esfand: Come on.

597
00:22:59.544 --> 00:23:01.012
AI Esfand: Frick, it's just running straight through.

598
00:23:01.012 --> 00:23:02.848
Ah, man; it's all seeping out, my cookie's barely wet-

599
00:23:05.384 --> 00:23:07.085
Luke: That was actually pretty good.

600
00:23:07.085 --> 00:23:10.555
Luke: Other than like that he poured it into his shorts and not into the pocket;

601
00:23:10.789 --> 00:23:13.692
and he like teleports the cookie through.

602
00:23:13.692 --> 00:23:15.527
AI Esfand: Got the big gallon, big pocket.

603
00:23:15.560 --> 00:23:16.361
Let's see what happens.

604
00:23:16.361 --> 00:23:17.429
Oh yeah; that's a lot of milk.

605
00:23:17.462 --> 00:23:18.397
Okay, cookie dunk.

606
00:23:18.397 --> 00:23:18.864
Come on.

607
00:23:18.897 --> 00:23:20.332
Frick, it's just running straight through.

608
00:23:20.332 --> 00:23:22.167
Ah, man; it's all seeping out, my cookie's barely wet.

609
00:23:23.268 --> 00:23:23.702
Luke: Okay.

610
00:23:23.702 --> 00:23:25.937
Riley: Okay; so listen, look at, look at what just happened.

611
00:23:27.172 --> 00:23:27.773
Riley: We were laughing.

612
00:23:27.773 --> 00:23:28.573
Luke: It was funny.

613
00:23:28.573 --> 00:23:29.274
Riley: It was a funny video.

614
00:23:29.274 --> 00:23:31.943
Luke: I feel like it was funny because we made it though.

615
00:23:31.943 --> 00:23:33.945
Luke: I honestly feel like if I watch that...

616
00:23:37.282 --> 00:23:39.518
Luke: maybe it is kind of funny, I do think it's kind of funny.

617
00:23:39.518 --> 00:23:39.951
Luke: That sucks.

618
00:23:39.951 --> 00:23:41.353
Riley: It's funny, it's a funny video.

619
00:23:41.353 --> 00:23:43.221
Luke: I don't like that, that's uncomfortable, but I will admit.

620
00:23:43.755 --> 00:23:49.428
Riley: But I, so I'm trying to, I'm trying to interrogate my own feelings here.

621
00:23:49.428 --> 00:23:49.661
Luke: Yeah.

622
00:23:49.661 --> 00:23:51.029
Riley: Because it's like, why is it funny?

623
00:23:51.029 --> 00:23:56.401
If that was a real video, it would be funny because what the hell?

624
00:23:56.401 --> 00:23:57.269
Luke: It's so strange.

625
00:23:57.269 --> 00:24:00.138
Riley: Why did this, why is this guy pouring milk down his shorts, you know?

626
00:24:00.272 --> 00:24:02.441
Riley: And, and you'd be wondering about the...

627
00:24:02.441 --> 00:24:04.843
Luke: Now it's funny: why did someone prompt that?

628
00:24:04.943 --> 00:24:07.512
Luke: Which is less funny, but it's still funny.

629
00:24:07.512 --> 00:24:10.882
Riley: It's "why did someone prompt that?" and also like the,

630
00:24:10.882 --> 00:24:14.519
like the decisions made by the AI.

631
00:24:14.686 --> 00:24:18.590
So it's like the AI chose to have that.

632
00:24:18.590 --> 00:24:22.227
Like he's putting the cookie in the pocket, right? Like it's- the flap isn't up.

633
00:24:22.227 --> 00:24:23.428
He's putting it, you know-

634
00:24:23.428 --> 00:24:26.164
Luke: Speaking of which though, I accidentally generated another one; so should we see that?

635
00:24:26.198 --> 00:24:27.466
AI Esfand: Let's see if this is going to work.

636
00:24:27.466 --> 00:24:28.934
Pocket's huge, should be fine.

637
00:24:30.168 --> 00:24:31.002
Okay, it's filling up.

638
00:24:31.002 --> 00:24:33.405
Oh, it's already leaking through; whatever, we'll dunk it.

639
00:24:33.405 --> 00:24:35.874
Moment of truth, no shot, just running right out of the fabric.

640
00:24:35.874 --> 00:24:36.975
Let's see if this is going to work.

641
00:24:36.975 --> 00:24:37.709
Pocket's huge.

642
00:24:37.709 --> 00:24:39.144
Riley: He said like the same thing.

643
00:24:39.578 --> 00:24:40.445
Luke: It's very close.

644
00:24:40.612 --> 00:24:42.948
AI Esfand: Oh, it's already leaking through; whatever, we'll dunk it...

645
00:24:43.648 --> 00:24:44.916
Riley: "Oh, it's already leaking through".

646
00:24:45.350 --> 00:24:47.352
Riley: "Oh, it's already leaking through; whatever, we'll dunk it".

647
00:24:47.352 --> 00:24:48.687
Can you go back to the other one real quick?

648
00:24:48.687 --> 00:24:50.922
AI Esfand: Pocket milk time, got the big gallon.

649
00:24:50.956 --> 00:24:51.723
Riley: Pocket milk time.

650
00:24:51.723 --> 00:24:54.593
AI Esfand: Let's see what happens, oh yeah that's a lot of milk, cookie dunk.

651
00:24:54.593 --> 00:24:56.394
Come on; frick, it's just running straight through.

652
00:24:56.394 --> 00:24:58.463
Ah, man; it's all seeping out, my cookie's barely wet.

653
00:24:58.463 --> 00:24:59.197
Luke: No, it's different.

654
00:24:59.231 --> 00:24:59.898
Riley: Oh, no; yeah, it's slightly different.

655
00:24:59.898 --> 00:25:00.899
Luke: It's similar but it's different.

656
00:25:00.899 --> 00:25:01.566
Riley: "Ah".

657
00:25:02.467 --> 00:25:04.102
Luke: Let's just go through all these and the...

658
00:25:04.102 --> 00:25:04.669
Riley: Okay, fine.

659
00:25:04.669 --> 00:25:05.270
Luke: There's one more.

660
00:25:05.270 --> 00:25:05.604
Riley: Alright.

661
00:25:06.571 --> 00:25:08.907
AI Carter: Whoa, hey that's my homework; you just ate it.

662
00:25:08.974 --> 00:25:11.610
So that's why I don't have the paper, my laptop literally ate it last night.

663
00:25:11.610 --> 00:25:12.410
Luke: Why is there a pig?

664
00:25:12.410 --> 00:25:14.679
AI Carter: I know it sounds ridiculous but I filmed it, teeth and everything.

665
00:25:14.679 --> 00:25:15.313
AI iJustine: That's a first for-

666
00:25:16.815 --> 00:25:18.650
Sammy: I like how it cuts up so fast.

667
00:25:20.085 --> 00:25:21.119
Luke: Why is there a pig?

668
00:25:22.354 --> 00:25:22.921
Riley: Why not?

669
00:25:25.290 --> 00:25:28.460
Luke: The two people are very good.

670
00:25:30.395 --> 00:25:33.965
Luke: iJustine and Carter are fantastic, I don't know why there's a pig;

671
00:25:34.332 --> 00:25:37.903
and the computer eating the homework feels not good, considering I said realistic.

672
00:25:38.603 --> 00:25:41.106
It has like a picture of a mouth on the screen, it just goes.

673
00:25:41.106 --> 00:25:42.807
Riley: Well, I think maybe that's why it was confused,

674
00:25:42.807 --> 00:25:45.043
because it's like, because it's like realistic;

675
00:25:45.043 --> 00:25:48.647
and so it's like a, it's like realistic mouth eating it.

676
00:25:48.647 --> 00:25:49.214
But there's-

677
00:25:49.247 --> 00:25:50.015
Luke: Oh, yeah, maybe.

678
00:25:50.015 --> 00:25:51.082
Riley: Well, it's like how do I-

679
00:25:51.082 --> 00:25:55.253
It's like make, have a computer eating my homework and make it look realistic.

680
00:25:55.687 --> 00:25:58.390
Like, wait, what does that, how does that look like in real life?

681
00:25:58.390 --> 00:25:59.724
To have a computer eat it?

682
00:25:59.724 --> 00:26:01.560
It probably just got confused about what would...

683
00:26:01.593 --> 00:26:03.962
it would have to be a little bit more creative,

684
00:26:04.129 --> 00:26:07.699
to be like okay a laptop kind of looks like a mouth.

685
00:26:07.699 --> 00:26:07.966
Luke: Yeah.

686
00:26:07.966 --> 00:26:10.535
Riley: So I'm going to have the laptop go, right.

687
00:26:10.535 --> 00:26:12.270
Luke: And like little shreds of paper flying all over the place.

688
00:26:12.270 --> 00:26:14.973
Riley: If it doesn't have, and this you know, I don't know.

689
00:26:14.973 --> 00:26:19.644
I don't know; maybe AI will be fully like actually creative at some point,

690
00:26:19.644 --> 00:26:22.781
in terms of like thinking of new ways to depict things.

691
00:26:22.781 --> 00:26:29.120
But like, I feel like that's an example where AI doesn't have enough training data

692
00:26:29.120 --> 00:26:33.558
depicting a laptop eating something, so it doesn't know how to depict that.

693
00:26:33.558 --> 00:26:34.059
Luke: Yeah.

694
00:26:34.659 --> 00:26:37.696
Riley: A human could be like, okay,

695
00:26:37.729 --> 00:26:41.466
I'm going to either have the paper go in through the screen where the mouth is eating it,

696
00:26:41.466 --> 00:26:44.235
or I'm going to have the whole laptop be a mouth or you know, something.

697
00:26:44.402 --> 00:26:44.936
Luke: Or...

698
00:26:44.936 --> 00:26:47.639
Riley: These are just some ideas that I can come up with as a creative.

699
00:26:47.639 --> 00:26:51.910
Luke: With my creative brain, unlike amateur Sora.

700
00:26:51.910 --> 00:26:53.311
Riley: Oh, okay.

701
00:26:54.245 --> 00:26:54.980
Riley: I was saying.

702
00:26:55.547 --> 00:26:56.114
Luke: Yes.

703
00:26:56.214 --> 00:26:58.483
Riley: That a real video would be funny because you're like,

704
00:26:58.516 --> 00:27:00.151
how did, why did a human do this?

705
00:27:00.185 --> 00:27:06.024
And this video, AI videos are funny because you're laughing at the decisions being made

706
00:27:06.057 --> 00:27:08.193
by the AI in creating the video.

707
00:27:08.193 --> 00:27:08.793
Luke: Yeah, that's fair.

708
00:27:08.793 --> 00:27:13.198
Luke: Like, why did he decide to pour it into his waistline instead of his pocket?

709
00:27:13.264 --> 00:27:14.532
Riley: And the kind of like-

710
00:27:14.532 --> 00:27:16.701
Luke: Why did the cookie morph through the pocket?

711
00:27:16.701 --> 00:27:17.202
Riley: Yeah.

712
00:27:17.202 --> 00:27:18.770
Riley: It's like, oh that's so weird that it's kind of funny.

713
00:27:18.770 --> 00:27:19.971
Luke: That, I think will get old.

714
00:27:20.405 --> 00:27:21.439
Riley: Yes, okay.

715
00:27:21.906 --> 00:27:23.308
Riley: And I feel like that-

716
00:27:23.308 --> 00:27:26.444
Luke: I don't think the weird creativity of humans deciding

717
00:27:26.444 --> 00:27:28.680
I'm going to pour milk into my pocket is going to get old.

718
00:27:28.680 --> 00:27:29.047
Riley: Yes.

719
00:27:29.648 --> 00:27:35.153
Luke: I do think haha silly AI video creation do silly thing will get old.

720
00:27:35.186 --> 00:27:37.489
Riley: The pig thing I feel like is a really good example,

721
00:27:38.089 --> 00:27:43.261
where if a human decided there's going to be a pig on the table.

722
00:27:43.294 --> 00:27:44.329
Luke: I think that would be funny.

723
00:27:44.329 --> 00:27:45.997
Riley: A little mini pig on the table the whole time.

724
00:27:46.031 --> 00:27:47.599
Riley: We'll be like, what? Why is there a pig?

725
00:27:47.599 --> 00:27:47.899
Luke: Yeah.

726
00:27:47.899 --> 00:27:49.734
Riley: And we said, why is there a pig? But we're like,

727
00:27:50.669 --> 00:27:53.038
we don't think that there's any intentionality behind it,

728
00:27:53.271 --> 00:27:55.373
Riley: so we don't feel the need to interrogate that anymore;

729
00:27:55.407 --> 00:27:58.309
we're just kind of like, oh yeah, it picked a pig to go there I guess, okay.

730
00:27:59.077 --> 00:28:02.781
Luke: Meanwhile, if I saw that in a YouTube video and there's a pig on the table,

731
00:28:02.881 --> 00:28:07.252
I would have to assume that this was like a lore thing with this creator that I didn't understand.

732
00:28:07.452 --> 00:28:07.986
Riley: Mmhmm.

733
00:28:07.986 --> 00:28:09.487
Luke: I'd assume there was additional context;

734
00:28:09.487 --> 00:28:12.223
Luke: I'd check the comments to be like, what's going on with the pig?

735
00:28:12.991 --> 00:28:17.195
Riley: And that to me really is kind of a- it's the, it's the intention angle.

736
00:28:17.195 --> 00:28:26.438
We're heading into a phase of the Internet where the majority of the content on the Internet

737
00:28:26.471 --> 00:28:30.709
is going to be content where most of the decisions

738
00:28:30.709 --> 00:28:34.245
in terms of like what you see on the screen were not intentional.

739
00:28:34.379 --> 00:28:34.779
Luke: Yeah.

740
00:28:34.779 --> 00:28:38.516
Riley: There was some intention, I wanted this guy to be, you know,

741
00:28:38.516 --> 00:28:41.086
the shot of this guy in the city and he's holding this thing.

742
00:28:41.419 --> 00:28:46.958
But like where the buildings are located, how many cars are going past,

743
00:28:47.325 --> 00:28:48.760
what clothes he's wearing;

744
00:28:48.760 --> 00:28:53.898
all that might just not be an intentional decision by a human creator where before it would be.

745 00:28:53.898 --> 00:28:56.768 Luke: There's a, there's an element to this and stick with me here. 746 00:28:56.768 --> 00:28:57.235 Riley: Yeah. 747 00:28:57.235 --> 00:29:00.038 Luke: There's an element to this that reminds me of cheating in video games. 748 00:29:00.338 --> 00:29:00.939 Riley: Mmhmm. 749 00:29:00.972 --> 00:29:01.506 Luke: Hold on. 750 00:29:01.639 --> 00:29:03.441 Riley: No, it makes sense to me. 751 00:29:03.475 --> 00:29:07.979 Luke: Okay; yeah, because my my reason for it is if you have... 752 00:29:08.980 --> 00:29:14.819 if you know of its existence, it kind of can at times invalidate 753 00:29:14.819 --> 00:29:16.988 the effort and skill of other players. 754 00:29:17.956 --> 00:29:20.558 Luke: Because if you just get dunked, part of a little- 755 00:29:20.558 --> 00:29:21.259 Riley: Oh you mean like- 756 00:29:21.259 --> 00:29:24.129 Luke: Part of your brain is going to be like, maybe you're cheating. 757 00:29:24.395 --> 00:29:25.497 Riley: Multiplayer cheating you're talking about. 758 00:29:25.497 --> 00:29:25.864 Luke: Yes. 759 00:29:25.864 --> 00:29:26.664 Luke: Yeah, yeah. 760 00:29:26.698 --> 00:29:28.333 Riley: I thought at first you were talking about cheating... 761 00:29:28.333 --> 00:29:31.269 Luke: Oh I see, that also kind of makes sense but that's not the angle I was going. 762 00:29:31.269 --> 00:29:35.106 Luke: I was going, there's a certain amount of like it creeps into your brain 763 00:29:35.106 --> 00:29:39.077 that maybe these people are cheating, and I find that to be worse in games 764 00:29:39.077 --> 00:29:41.913 where it is talked about more often. 765 00:29:42.313 --> 00:29:43.515 Riley: Yeah, yeah. 766 00:29:43.648 --> 00:29:46.951 Luke: Like a lot of people talk about how their their experience with Tarkov was ruined, 767 00:29:46.985 --> 00:29:50.088 not necessarily because they actually genuinely think 768 00:29:50.088 --> 00:29:51.489 that they were just killed by cheaters all the time. 769 00:29:51.523 --> 00:29:55.160 Luke: But because it's talked about so often, every single death was questioned. 770 00:29:55.160 --> 00:29:55.794 Riley: Oh, for sure. 771 00:29:55.794 --> 00:29:58.797 Riley: And then like it incentivizes you to be like, 772 00:29:58.830 --> 00:30:02.100 well, if everyone else is using cheats; I mean, I like this game, 773 00:30:02.100 --> 00:30:05.970 I want to play the game, but I can't play without cheats because everyone else is using cheats. 774 00:30:05.970 --> 00:30:08.106 Luke: So I guess that's either boring and I quit, 775 00:30:08.106 --> 00:30:10.642 also known as maybe use the Internet less. 776 00:30:10.675 --> 00:30:11.109 Riley: Yeah. 777 00:30:11.109 --> 00:30:15.346 Luke: Or I guess I'll just use cheats and they just start generating more stuff; 778 00:30:15.346 --> 00:30:17.248 and then that makes it more pervasive. 779 00:30:17.248 --> 00:30:20.451 Riley: And everyone's using cheats on each other, and you get slop talk. 780 00:30:20.451 --> 00:30:20.852 Luke: Yeah. 781 00:30:20.852 --> 00:30:22.654 Riley: I... ah, ugh! 782 00:30:22.654 --> 00:30:25.290 Luke: So that's that's my biggest concern with this kind of stuff; 783 00:30:25.290 --> 00:30:27.025 and that's what I was talking about earlier where it was like, 784 00:30:28.626 --> 00:30:30.528 now I'm skeptical about all this content. 785 00:30:31.629 --> 00:30:32.197 Riley: Right. 786 00:30:32.297 --> 00:30:37.602 Luke: Because it's like...
that is weird enough that I feel like it would be, 787 00:30:37.602 --> 00:30:40.271 you know, pouring two liters of lemonade on my dash in my car. 788 00:30:40.271 --> 00:30:41.940 Luke: I'd rather really not do that. 789 00:30:42.140 --> 00:30:42.507 Riley: Yeah. 790 00:30:42.507 --> 00:30:44.509 Luke: I'm wrecking my car in a way. 791 00:30:44.509 --> 00:30:45.043 Riley: Mhmm. 792 00:30:45.977 --> 00:30:47.145 Luke: I could just AI prompt it. 793 00:30:48.580 --> 00:30:50.682 Luke: If I have this weird idea, I could just prompt it; 794 00:30:50.682 --> 00:30:53.418 and then it's done in three minutes and my car is still clean. 795 00:30:53.451 --> 00:30:59.657 Luke: So like I'm going to question every off the wall, really weird, very human thing. 796 00:31:00.158 --> 00:31:00.758 Riley: Yeah. 797 00:31:00.758 --> 00:31:01.292 Luke: Because- 798 00:31:01.292 --> 00:31:05.363 Riley: And like- Yes, and the more... 799 00:31:05.730 --> 00:31:08.600 So like there's, there's that aspect where it's like oh my gosh, 800 00:31:08.600 --> 00:31:11.269 I'm not going to be sure which video is real. 801 00:31:11.703 --> 00:31:18.376 But I feel like we're heading to a future where most of the videos are not going to be real. 802 00:31:18.376 --> 00:31:18.910 Luke: Yeah. 803 00:31:18.910 --> 00:31:22.046 Riley: Like you're scrolling through social media, like in the same way... 804 00:31:22.447 --> 00:31:25.750 Casey Neistat made this point, because he also made like a Sora video 805 00:31:25.783 --> 00:31:28.953 that was kind of like talking about the ecosystem shift; 806 00:31:28.987 --> 00:31:33.391 and his argument was essentially he used the... 807 00:31:33.524 --> 00:31:37.862 go watch his video, I guess; if you, you know, I don't want to steal things or whatever. 808 00:31:37.862 --> 00:31:40.231 But like his argument was like to do with a funnel. 809 00:31:40.231 --> 00:31:46.504 And like right now, like pre-AI, before; move a few years in the past. 810 00:31:47.505 --> 00:31:50.375 Most of the stuff on the Internet is crap. 811 00:31:50.408 --> 00:31:51.976 It's not, I don't want to say crap. 812 00:31:51.976 --> 00:31:54.512 Sorry, people can make all sorts of great things. 813 00:31:54.512 --> 00:31:55.446 I'm just saying that like... 814 00:31:56.247 --> 00:31:58.449 Luke: There was, I mean there's a lot of... yeah. 815 00:31:58.449 --> 00:32:01.953 Riley: If you, if you subscribe to some theory of art 816 00:32:01.953 --> 00:32:06.457 that places some above others on a hierarchy where like there's 817 00:32:06.457 --> 00:32:10.461 higher quality art and lower quality art, most art that exists is low quality art. 818 00:32:10.561 --> 00:32:11.062 Luke: Yes. 819 00:32:11.195 --> 00:32:14.165 Riley: And then the stuff that comes out the bottom of the funnel, 820 00:32:14.198 --> 00:32:17.168 this hypothetical, this metaphorical funnel, like that's the good stuff. 821 00:32:17.168 --> 00:32:20.905 So like right now we have a certain-sized funnel. 822 00:32:20.905 --> 00:32:23.341 Making AI videos is super, super easy now; 823 00:32:23.341 --> 00:32:26.644 and so that funnel is just going to get super, super, super, super big. 824 00:32:26.744 --> 00:32:31.983 And it was already like scrolling through your feed. 825 00:32:32.383 --> 00:32:33.518 Riley: It's already like- 826 00:32:33.518 --> 00:32:34.419 Luke: It's hard to find the good stuff. 827 00:32:34.419 --> 00:32:36.521 Riley: The tiny minority of that is good stuff.
828 00:32:36.621 --> 00:32:38.122 Luke: It's hard to find Milk Pocket. 829 00:32:38.122 --> 00:32:38.723 Riley: And now- 830 00:32:42.493 --> 00:32:44.996 Riley: I mean, Milk Pocket is really the diamond in the rough. 831 00:32:45.463 --> 00:32:45.863 Luke: Yeah. 832 00:32:46.864 --> 00:32:47.966 Luke: That's the Picasso of our time. 833 00:32:47.966 --> 00:32:49.934 Riley: Milk Pocket only comes along once in a generation. 834 00:32:49.934 --> 00:32:50.568 Luke: Yeah. 835 00:32:52.437 --> 00:32:55.974 Riley: And now it's just going to be like those Milk Pockets 836 00:32:55.974 --> 00:32:58.609 are going to be fewer and farther between. 837 00:32:58.609 --> 00:33:01.079 Riley: Will there be any point in posting at all? 838 00:33:01.079 --> 00:33:04.215 Will there be any point in making art at all if it- 839 00:33:04.215 --> 00:33:09.087 You just have to get, like you have to throw it into the sea of slop? 840 00:33:09.087 --> 00:33:13.124 Riley: I think that, because we have to wrap this up in some way; 841 00:33:13.124 --> 00:33:20.131 I feel like making a prediction about the future really comes down to 842 00:33:20.164 --> 00:33:27.772 whether you have any faith in human society to do something about this. 843 00:33:27.772 --> 00:33:33.845 Clearly, like... ok, I just, I don't want to say- 844 00:33:33.845 --> 00:33:37.949 Because I want to be clear that I'm not coming down on AI as 845 00:33:37.949 --> 00:33:41.786 like a totally evil technology and it shouldn't be used for anything. 846 00:33:41.786 --> 00:33:43.855 I think the teachers love it and it should, 847 00:33:43.855 --> 00:33:48.459 like LLMs should be available to people like teachers to make their jobs easier. 848 00:33:48.459 --> 00:33:50.895 It's making the kids' lives easier too, 849 00:33:50.895 --> 00:33:54.198 as long as there aren't horrible like hallucinations happening and like the whatever. 850 00:33:54.198 --> 00:33:58.836 But like if you're just like, you're like make me a lesson plan for how to talk about, 851 00:33:58.870 --> 00:34:04.709 you know, how to teach the history of America in the whatever century, you know? 852 00:34:04.742 --> 00:34:08.713 Luke: But then we've got, see- Yeah, so I agree. 853 00:34:08.713 --> 00:34:13.284 But then brain rot is so pervasive that we have someone on staff right now, 854 00:34:13.284 --> 00:34:19.223 filming this video, sitting scrolling brain rot on their phone, just off camera. 855 00:34:19.223 --> 00:34:21.192 Riley: Yes, yeah, ok yes, yes; but ok. 856 00:34:21.192 --> 00:34:23.227 So hold that thought, we'll get back to the brain rot. 857 00:34:23.494 --> 00:34:24.762 Luke: Sorry, I just had to do that. 858 00:34:24.762 --> 00:34:28.966 Riley: Alright, it's useful for teachers, it's very useful for medicine. 859 00:34:29.000 --> 00:34:33.938 I'm very, very... is bullish the right word? I'm like, I'm hopeful. 860 00:34:33.938 --> 00:34:36.841 Luke: Yeah, there's some actually, there's some really cool stuff happening. 861 00:34:36.841 --> 00:34:37.975 Luke: I completely agree with you. 862 00:34:37.975 --> 00:34:43.848 Riley: Diagnostics, like doctors being able to use a system to accurately diagnose like, 863 00:34:43.848 --> 00:34:46.918 oh, you have a lump; and instead of me just being like, 864 00:34:46.918 --> 00:34:51.856 ok, I've been a doctor for a while and that doesn't look cancerous; so you're probably fine. 865 00:34:51.856 --> 00:34:53.991 Riley: You know, put some cream on it or something.
866 00:34:53.991 --> 00:34:56.160 Riley: And having an AI system look at it and be like, 867 00:34:56.994 --> 00:35:00.131 this is, this fits the profile of something that might be precancerous. 868 00:35:00.131 --> 00:35:01.332 Like, let's get that scan. 869 00:35:01.332 --> 00:35:04.802 Luke: I like the idea, like how Labs uses it for checking things: 870 00:35:04.802 --> 00:35:08.106 we don't use it to replace any human process, 871 00:35:08.106 --> 00:35:10.341 but we use it as like an early warning detection system. 872 00:35:10.341 --> 00:35:10.842 Riley: Right, right. 873 00:35:10.875 --> 00:35:15.079 Luke: So hopefully it can save us from wasting a bunch of time because a setting is incorrectly done. 874 00:35:15.079 --> 00:35:15.480 Riley: Yeah. 875 00:35:15.513 --> 00:35:17.748 Luke: I like the idea of that use case in the middle. 876 00:35:17.748 --> 00:35:18.249 Riley: Weather. 877 00:35:19.016 --> 00:35:19.417 Luke: Weather. 878 00:35:19.417 --> 00:35:22.220 Riley: There are all sorts of scientific applications for like AI; 879 00:35:22.220 --> 00:35:24.755 and how it could like improve things, or make them better. 880 00:35:24.755 --> 00:35:26.224 Luke: Get rid of all the scientists. 881 00:35:26.390 --> 00:35:26.891 Riley: No, no, no! 882 00:35:26.891 --> 00:35:27.391 Luke: It's what Riley's saying. 883 00:35:27.425 --> 00:35:28.693 Riley: No, no, no! I'm saying, I'm saying. 884 00:35:28.726 --> 00:35:29.560 Luke: Fire them all. 885 00:35:29.827 --> 00:35:33.865 Riley: Science is one, it's one of the areas where it's like 886 00:35:34.832 --> 00:35:37.668 you cannot have an AI replace a scientist. 887 00:35:37.668 --> 00:35:41.739 You can have an... because scientists need better tools. 888 00:35:41.739 --> 00:35:43.875 Luke: Hopefully, hopefully it can provide good assistance; 889 00:35:43.875 --> 00:35:47.578 and it seems like it might be able to in a variety of applications, which is really cool. 890 00:35:47.578 --> 00:35:50.081 Riley: Although, yeah like somebody as soon as I said that, 891 00:35:50.081 --> 00:35:53.184 I was like, somebody is going to be like, everybody says- 892 00:35:53.184 --> 00:35:53.985 Luke: Get rid of them. 893 00:35:53.985 --> 00:35:57.155 Riley: And no, everybody says that AI can't replace X. 894 00:35:57.155 --> 00:36:00.625 Riley: And then later it's like, ok, oh it can again I guess. 895 00:36:01.025 --> 00:36:03.861 Luke: But then it hasn't successfully done that anywhere yet. 896 00:36:03.861 --> 00:36:07.932 Riley: So again, it really comes down to whether we are like optimistic 897 00:36:07.932 --> 00:36:11.836 about society's ability to kind of rein this in and be like, 898 00:36:11.836 --> 00:36:13.137 alright, this is really cool technology; 899 00:36:13.137 --> 00:36:15.806 it has a lot of useful applications. 900 00:36:16.641 --> 00:36:19.744 We're not going to let you use it for this, 901 00:36:19.877 --> 00:36:23.247 Riley: or like we're not going to let you use it for this and profit off of it. 902 00:36:23.281 --> 00:36:26.117 Luke: I don't, I am not optimistic about that at all. 903 00:36:26.117 --> 00:36:31.489 I am highly optimistic about our ability to adapt to it being in our society. 904 00:36:31.489 --> 00:36:32.323 Riley: You are? 905 00:36:32.323 --> 00:36:32.924 Luke: Very. 906 00:36:33.124 --> 00:36:33.624 Riley: Ok. 907 00:36:33.624 --> 00:36:34.959 Luke: I think we are... 908 00:36:34.959 --> 00:36:35.526 Riley: That's cool. 909 00:36:35.560 --> 00:36:37.795 Luke: Basically adapters as a species.
910 00:36:37.795 --> 00:36:38.496 Riley: Mmmm. 911 00:36:38.863 --> 00:36:42.200 Luke: I don't necessarily think that that will come without damage. 912 00:36:42.466 --> 00:36:43.034 Riley: Right. 913 00:36:43.034 --> 00:36:47.605 Luke: I think we are also very good at damaging each other and ourselves. 914 00:36:47.605 --> 00:36:51.475 Riley: And I was going to say, like we've adapted to the Internet and to social media. 915 00:36:51.509 --> 00:36:51.842 Luke: Yeah. 916 00:36:51.842 --> 00:36:53.177 Riley: But like I don't think that was... 917 00:36:53.744 --> 00:36:54.312 Riley: I don't think that's... 918 00:36:54.312 --> 00:36:55.846 Luke: I'm not saying it's a good thing. 919 00:36:55.846 --> 00:36:56.480 Riley: Yeah, yeah. 920 00:36:57.248 --> 00:36:58.849 Luke: But I think we'll adapt to doing it. 921 00:36:58.849 --> 00:36:59.517 Riley: We'll adapt. 922 00:36:59.550 --> 00:36:59.984 Luke: Yeah. 923 00:37:00.017 --> 00:37:00.451 Riley: Yeah. 924 00:37:00.484 --> 00:37:01.986 Luke: Just not necessarily positively. 925 00:37:03.688 --> 00:37:05.957 Luke: Watch the bleepification of Google. 926 00:37:05.957 --> 00:37:11.295 Riley: If you like this video, feed yourself into the AI machine. 927 00:37:11.295 --> 00:37:11.796 Luke: No! 928 00:37:11.796 --> 00:37:13.965 Riley: Have the machine gods subsume you. 929 00:37:13.965 --> 00:37:14.899 Luke: No, don't do that! 930 00:37:15.900 --> 00:37:17.201 Riley: Hail Moloch!
