{"video_id":"5IdRvGxqKRY","title":"The Google Rant","channel":"FP Exclusives","show":"FP Exclusives","published_at":"2026-03-05T14:53:29Z","duration_s":2466,"segments":[{"start_s":0.1,"end_s":1.968,"text":"Adam: I don't think many companies f***** up Google 2 00:00:01.968 --> 00:00:05.138 other companies could have done more f****** up of Google. 3 00:00:05.672 --> 00:00:07.841 If Google got more f***** up there'd be less problems. 4 00:00:07.841 --> 00:00:09.342 , if they just had if, you know, 5 00:00:09.342 --> 00:00:12.545 if they if Google got their legs broken a couple times as a kid. 6 00:00:12.579 --> 00:00:13.079 Adam: 7 00:00:13.079 --> 00:00:14.547 Riley: . They'd, they'd be more humble 8 00:00:14.547 --> 00:00:16.249 Sammy: [Save this for the video] 9 00:00:16.249 --> 00:00:17.717 Riley: No just use that. That's the intro. 10 00:00:17.717 --> 00:00:18.785 Sammy: No, no. we canât ! 11 00:00:18.785 --> 00:00:19.986 Riley: That's how the video starts. 12 00:00:19.986 --> 00:00:21.888 Adam: You know Google it kind of sucks now. 13 00:00:21.888 --> 00:00:24.157 And it's crazy because Google used to be really, 14 00:00:24.157 --> 00:00:26.526 really cool when they started doing stuff with Android. 15 00:00:26.526 --> 00:00:29.462 Google Maps Street View is like a revolutionary, 16 00:00:29.462 --> 00:00:33.266 but now it seems like every single update that Google pushes out is just to either 17 00:00:33.266 --> 00:00:34.267 cram more ads, 18 00:00:34.267 --> 00:00:37.303 or to take away features, or to kill something that you already like. 19 00:00:37.303 --> 00:00:40.473 Riley: , I don't know if many people remember this, but when Google started, 20 00:00:40.473 --> 00:00:43.943 they had like the whole, disclaimer or message 21 00:00:43.943 --> 00:00:48.048 on their on their site being like, hey, we're a new type of search engine, 22 00:00:48.048 --> 00:00:51.985 no ads, no weather, no stocks or whatever, just pure search. 23 00:00:52.719 --> 00:00:54.988 That's that's how Google started with it. 24 00:00:54.988 --> 00:00:56.656 And it's like, wait, what? 25 00:00:56.656 --> 00:00:57.857 That's what Google is now. 26 00:00:57.857 --> 00:00:59.192 Adam: And if you go back far enough, 27 00:00:59.192 --> 00:01:02.195 I know that there's like an interview with one of the two founders 28 00:01:02.529 --> 00:01:05.632 where they were saying that like, oh, they'll never do advertising on Google, 29 00:01:05.632 --> 00:01:07.167 because once you start doing advertising 30 00:01:07.167 --> 00:01:09.169 in a search engine, you've defeated its usefulness. 31 00:01:09.169 --> 00:01:12.405 Riley: That's like, , that's that's like a broader arc 32 00:01:12.405 --> 00:01:16.042 of like the the bad path that Google has gone down. 33 00:01:16.042 --> 00:01:19.379 But I feel like in the past, you know, three or 4 or 5 years, 34 00:01:19.746 --> 00:01:23.883 people have also just identified a like distinct ways 35 00:01:23.883 --> 00:01:29.789 in which Google has gotten worse to use because I feel like pre, I don't know, 36 00:01:29.789 --> 00:01:34.360 2018, 2019, pre-COVID, it was like it's still basically just Google. 37 00:01:34.761 --> 00:01:38.465 Like they experimented with having, you know, personalized web pages and stuff. 38 00:01:38.465 --> 00:01:39.766 I don't know if you ever like signed- 39 00:01:39.766 --> 00:01:42.802 I think they had like iGoogle they had they had a thing called, 40 00:01:42.802 --> 00:01:44.270 I think it was called iGoogle 41 00:01:44.270 --> 00:01:45.805 Adam: I had Google Plus. 42 00:01:45.972 --> 00:01:47.474 Riley: . That was the Social network. 43 00:01:47.474 --> 00:01:48.975 Adam: I was super stoked on that. 44 00:01:48.975 --> 00:01:52.045 Riley: So like if you signed into Google you could have, you know, widgets 45 00:01:52.045 --> 00:01:53.813 and stuff up on Google when you got there 46 00:01:53.813 --> 00:01:57.117 and it would tell you like breaking news and and weather and stuff. 47 00:01:57.117 --> 00:01:59.586 But then they took it away. They're just like, no, we're not doing this. 48 00:01:59.586 --> 00:02:01.187 We're going back to normal Google Pay. 49 00:02:01.187 --> 00:02:03.223 Adam: I mean, that's just what they always do, right? 50 00:02:03.223 --> 00:02:04.257 Adam: They just kill everything 51 00:02:04.257 --> 00:02:04.891 Riley: They just took it away. 52 00:02:04.891 --> 00:02:06.159 Adam: Like, oh, that's a cool idea. 53 00:02:06.159 --> 00:02:06.826 Riley: you know what would be- 54 00:02:06.826 --> 00:02:11.097 oh- you know what be much better is if we just, 55 00:02:11.097 --> 00:02:14.701 came up with a robot that will search the web for you. 56 00:02:14.701 --> 00:02:16.836 Adam: , and it'll cut down 57 00:02:16.836 --> 00:02:20.940 trees in the Amazon to throw coals that go into the furnace that powers it. 58 00:02:21.040 --> 00:02:22.542 Riley: Oh, man. 59 00:02:22.542 --> 00:02:25.178 Adam: Let's kind of go back, because I think that a lot of this can be 60 00:02:25.178 --> 00:02:30.316 tied into the overall, like centralization of the web. 61 00:02:31.050 --> 00:02:35.655 Because back in the day, we used to have, oh, you'd have a website. 62 00:02:35.989 --> 00:02:38.158 I mean, you used to have to have a destination. 63 00:02:38.158 --> 00:02:39.692 Stuff didn't go to you, right? 64 00:02:39.692 --> 00:02:44.097 Like now, before it was like, I'm going to go to my friends at web site 65 00:02:44.297 --> 00:02:48.902 or I will go to a news website, just actual specific locations. 66 00:02:49.469 --> 00:02:50.203 Adam: You had to go to. 67 00:02:50.203 --> 00:02:53.106 Riley: And they all linked to each other and it was nice. 68 00:02:53.106 --> 00:02:54.140 That was a whole like 69 00:02:54.140 --> 00:02:57.510 Riley: you could go on a hyperlink rabbit hole through different websites. 70 00:02:57.810 --> 00:02:58.711 Riley and Adam: . 71 00:02:58.711 --> 00:03:02.282 Adam: And now it all just gets centralized which is interesting because I remember 72 00:03:02.282 --> 00:03:04.918 do you ever do your using like StumbleUpon. Did you ever use that? 73 00:03:04.918 --> 00:03:05.785 Riley: I never used it. 74 00:03:05.785 --> 00:03:08.421 Adam: So for people who don't know it was a web extension. 75 00:03:08.421 --> 00:03:11.958 Basically you just click that and it would send you to a random website 76 00:03:11.958 --> 00:03:15.862 from a relatively curated list of, like, interesting websites 77 00:03:15.862 --> 00:03:17.096 that would completely not work 78 00:03:17.096 --> 00:03:18.731 Riley: Like Reddit, but different. 79 00:03:18.731 --> 00:03:21.768 Adam: It's like, Reddit, but you press the like a random button every 80 00:03:21.768 --> 00:03:23.836 Time and it would send you to a web page. 81 00:03:23.836 --> 00:03:26.372 But now there's no web pages, right? 82 00:03:26.372 --> 00:03:29.042 You just go to your social media service. 83 00:03:29.042 --> 00:03:30.243 Riley: So now we're like, , 84 00:03:30.243 --> 00:03:33.012 everything is kind of siloed within these little kingdoms. 85 00:03:33.012 --> 00:03:35.081 There's the Facebook kingdom, Twitter Kingdom, 86 00:03:36.316 --> 00:03:37.116 Adam: The Reddit 87 00:03:37.116 --> 00:03:39.419 Riley: those are the two. 88 00:03:39.419 --> 00:03:42.422 But that's kind of like a broader problem with the web in general. 89 00:03:42.956 --> 00:03:46.593 But I feel like if we look at just at Google, obviously it's made it harder 90 00:03:46.593 --> 00:03:50.763 for Google to be this useful resource where you kind of 91 00:03:50.763 --> 00:03:53.866 are finding all of these fun little things across the internet. 92 00:03:53.900 --> 00:03:56.669 You can still be that, like, I can still have that experience. 93 00:03:56.669 --> 00:03:58.304 You know, when you Google something 94 00:03:58.304 --> 00:04:02.609 a little random and you go down the results pages, you can find something. 95 00:04:02.609 --> 00:04:06.879 It's just harder because you have to sift through a bunch of stuff first. 96 00:04:07.146 --> 00:04:12.285 I feel like the the beginning of when Google really started to go 97 00:04:12.285 --> 00:04:16.022 wrong was placing such a heavy emphasis 98 00:04:16.022 --> 00:04:19.125 on, well, it wasn't even Google's fault really. 99 00:04:19.125 --> 00:04:21.494 It was just SEO, search engine optimization, 100 00:04:21.494 --> 00:04:24.497 and the kind of arms race that came up about that. 101 00:04:24.664 --> 00:04:25.765 Adam: How did that occur? 102 00:04:25.765 --> 00:04:27.767 Because my understanding was that originally 103 00:04:27.767 --> 00:04:31.471 Google's like their big trick to having a good search 104 00:04:31.471 --> 00:04:35.575 engine was by using how often other sites were linked to, 105 00:04:36.342 --> 00:04:40.680 as a way of weighting the quality, whereas I feel like that's not the case anymore. 106 00:04:40.680 --> 00:04:44.017 Riley: You sound like you maybe have done more research on this than I have because I, 107 00:04:44.050 --> 00:04:46.219 I'm not. I'm not sure, but that sounds right. 108 00:04:46.219 --> 00:04:46.586 I think. 109 00:04:46.586 --> 00:04:49.589 I think that is the case for the early days. 110 00:04:49.956 --> 00:04:54.127 And then I guess at some point they changed the system. 111 00:04:54.127 --> 00:04:56.296 We shouldn't talk about this as if we know. 112 00:04:56.296 --> 00:04:57.530 We havent done the research. 113 00:04:57.530 --> 00:04:58.231 Adam: This is not that research. 114 00:04:58.231 --> 00:05:00.099 Riley: This is an impromptu rant. 115 00:05:00.099 --> 00:05:04.270 But what I do know is that SEO at some point became the norm. 116 00:05:04.270 --> 00:05:08.941 Where you're ranking in the search results is now more to do with how 117 00:05:08.941 --> 00:05:12.545 well you have optimized keywords and presentation or whatever. 118 00:05:12.545 --> 00:05:14.247 And like order of things, 119 00:05:14.247 --> 00:05:18.451 website owners tuned their websites specifically to show up. 120 00:05:18.451 --> 00:05:20.386 Well, in the algorithm. 121 00:05:20.386 --> 00:05:23.389 And that has it's become it's screwed everything up. 122 00:05:23.389 --> 00:05:23.990 Adam: Oh, absolutely. 123 00:05:23.990 --> 00:05:25.291 Riley: Because that's that's slop. 124 00:05:25.291 --> 00:05:26.592 That's how you get slop. 125 00:05:26.592 --> 00:05:27.327 Adam: . 126 00:05:27.327 --> 00:05:29.062 Riley: And that's where we find ourselves today. 127 00:05:29.062 --> 00:05:33.666 Adam: Google must realize that their ambitions into AI is self-defeating right now. 128 00:05:33.666 --> 00:05:35.735 Riley: Here we go. Give me a take. 129 00:05:35.735 --> 00:05:38.371 Adam: Well everyone complains about Google. 130 00:05:38.371 --> 00:05:40.039 It just surfaces slop. 131 00:05:40.039 --> 00:05:43.776 And they are investing so heavily into the greatest slot production 132 00:05:43.776 --> 00:05:46.045 machine ever, ever made. Right? 133 00:05:46.045 --> 00:05:49.549 I don't think that like, AI actually 134 00:05:49.549 --> 00:05:53.353 has a substantial threat to art or what 135 00:05:53.353 --> 00:05:57.790 what it actually is going to be warping is the ability to create sellable content. 136 00:05:57.790 --> 00:06:02.628 I no longer have to write a blog post about the chili that goes 137 00:06:02.628 --> 00:06:06.766 before the chili recipe, so that I show up at the top of the Google search results. 138 00:06:06.766 --> 00:06:08.835 I can just have the AI do it. 139 00:06:08.835 --> 00:06:11.304 No one cares about those blog posts. 140 00:06:11.304 --> 00:06:15.174 No one cares about the AI slop because the product 141 00:06:15.174 --> 00:06:18.544 is really the recipe and it's just a way of finding your way at the top. 142 00:06:18.578 --> 00:06:19.245 Riley: That's like. 143 00:06:19.245 --> 00:06:23.416 Those recipes are a great example of how SEO screwed things up. 144 00:06:23.416 --> 00:06:24.484 , so what are you saying? 145 00:06:24.484 --> 00:06:30.089 You're saying that Google is, creating the circumstances of their own demise? Yes. 146 00:06:30.089 --> 00:06:33.092 Adam: They're absolutely just like, what if we just made it 147 00:06:33.126 --> 00:06:35.895 profoundly easier to generate crap? 148 00:06:35.895 --> 00:06:39.031 Like, you can just ask here, I just, I just 149 00:06:39.031 --> 00:06:43.236 instead of making this video, I just asked, Gemini, 150 00:06:43.236 --> 00:06:45.371 why does Google search suck? Now? 151 00:06:45.371 --> 00:06:46.406 Riley: Here we go. 152 00:06:46.406 --> 00:06:48.808 Adam: And it's given me tons of content. 153 00:06:48.808 --> 00:06:53.045 Riley: , we could just like put on the voice synthesizer and just play that. 154 00:06:53.045 --> 00:06:54.914 And people could like, watch that instead. 155 00:06:54.914 --> 00:06:56.749 In some ways it would be more focused. 156 00:06:56.749 --> 00:06:58.584 Adam: Probably, it would probably be 157 00:06:58.584 --> 00:07:02.822 Riley: Some ways it would probably provide more, you know, pertinent information. 158 00:07:03.289 --> 00:07:03.890 Adam: Probably. 159 00:07:03.890 --> 00:07:06.092 Riley: But people are coming here for the good stuff. 160 00:07:06.092 --> 00:07:07.126 Adam: . 161 00:07:07.126 --> 00:07:09.595 And the good stuff is not pertinent information. 162 00:07:09.595 --> 00:07:14.066 Riley: The slot problem is I don't see a solution to it right now, 163 00:07:14.066 --> 00:07:18.638 because now that these tools are out, the AI generating tools, 164 00:07:19.572 --> 00:07:20.673 are out in the public. 165 00:07:20.673 --> 00:07:23.743 Anyone can make any AI slop and post online. 166 00:07:23.743 --> 00:07:24.544 That is just 167 00:07:24.544 --> 00:07:29.549 I don't know how you stop that from totally polluting the search experience. 168 00:07:29.549 --> 00:07:32.385 Like, I think that there are a bunch of problems with Google search 169 00:07:32.385 --> 00:07:35.922 that, you know, we haven't even touched on, like the ads being a problem and, 170 00:07:36.255 --> 00:07:39.992 like the actual AI overview, like, when you, when you do a search. 171 00:07:39.992 --> 00:07:43.663 But like in terms of like the results and what the internet state of the internet 172 00:07:43.663 --> 00:07:46.833 is, the fact that it's being polluted by all this slop. 173 00:07:46.833 --> 00:07:48.100 I don't know how you can fix that 174 00:07:48.100 --> 00:07:52.472 with the other stuff, like ads where scammers are paying 175 00:07:52.805 --> 00:07:56.642 to be the first sponsored result when you Google something. 176 00:07:56.976 --> 00:08:00.179 I watched a great video on this by, Tunnel Vision, the YouTube channel. 177 00:08:00.179 --> 00:08:01.948 I think he tried to contact the 178 00:08:01.948 --> 00:08:05.451 the people who are buying those ads and ask them like what the deal is, 179 00:08:05.451 --> 00:08:07.053 and they're just like, that's what we got to do. 180 00:08:07.053 --> 00:08:08.221 I mean, like, the system is there. 181 00:08:08.221 --> 00:08:08.788 So like, 182 00:08:08.788 --> 00:08:11.791 you could take away that ability, you could take away that feature 183 00:08:11.791 --> 00:08:12.558 and make that better. 184 00:08:12.558 --> 00:08:13.793 You could take away the 185 00:08:13.793 --> 00:08:17.630 AI overviews feature and make it so that, like the web search on Google is 186 00:08:17.830 --> 00:08:18.598 is the default. 187 00:08:18.598 --> 00:08:19.565 But what do you do about the fact 188 00:08:19.565 --> 00:08:22.568 that, like, everyone is generating AI slop and putting it online now, 189 00:08:22.702 --> 00:08:25.938 you can't tell the search engine filter out AI images. 190 00:08:25.938 --> 00:08:27.573 It doesn't know what's an AI image or not. 191 00:08:27.573 --> 00:08:29.909 I mean, and if we put metadata in, maybe, 192 00:08:29.909 --> 00:08:32.245 but not everyone's going to put the metadata in the images. 193 00:08:32.245 --> 00:08:36.148 Adam: Also, if you're Google and you go, we just introduced this killer new feature. 194 00:08:36.182 --> 00:08:38.284 It gets rid of all the AI you don't want. 195 00:08:38.284 --> 00:08:41.554 You're just like telling your investors that people hate this stuff. 196 00:08:41.554 --> 00:08:43.422 Riley: , exactly. The level to which, 197 00:08:44.624 --> 00:08:47.793 using Google sucks now is only going to get worse. 198 00:08:47.927 --> 00:08:49.896 This is the lowest it's going to be. 199 00:08:49.896 --> 00:08:51.931 It's going to just be 200 00:08:51.931 --> 00:08:54.100 worse and worse and worse. I don't know. 201 00:08:54.100 --> 00:08:56.435 Adam: They're in this situation where they're at like a party. 202 00:08:56.435 --> 00:08:56.869 Riley: . 203 00:08:56.869 --> 00:08:59.872 Adam: And they realize that they currently have diarrhea. 204 00:08:59.872 --> 00:09:01.140 Riley: I'm hearing you out. Okay. 205 00:09:01.140 --> 00:09:04.644 Adam: And what would really solve the problem is if they just said, 206 00:09:05.311 --> 00:09:06.879 oh, God, I have diarrhea. 207 00:09:06.879 --> 00:09:09.048 Please excuse me for a moment. 208 00:09:09.048 --> 00:09:12.051 But if they admitted that they had diarrhea, 209 00:09:12.151 --> 00:09:14.654 everyone would be like, you're the diarrhea guy. 210 00:09:14.654 --> 00:09:16.622 Riley: . kills the vibe! 211 00:09:16.622 --> 00:09:20.226 Adam: Youâve ruined dinner, and then they would never get invited again. 212 00:09:20.259 --> 00:09:21.928 They're in this position where, like, 213 00:09:21.928 --> 00:09:25.731 they can't admit that what they're doing is actually, like, causing them problems. 214 00:09:26.132 --> 00:09:27.900 Well, and I don't think they 215 00:09:27.900 --> 00:09:30.603 even if they could do that, I don't think they want to. 216 00:09:30.603 --> 00:09:32.605 Like, I don't want to say that AI is okay. 217 00:09:32.605 --> 00:09:35.975 First of all, we're passionate fashion in on it, on on AI a lot. 218 00:09:35.975 --> 00:09:37.743 And I think we're going to continue to do that. 219 00:09:37.743 --> 00:09:38.978 But I do want to say 220 00:09:38.978 --> 00:09:42.315 there are some instances where AI is very useful and good. 221 00:09:43.115 --> 00:09:46.752 For instance, one of the ways that Google search has improved, that 222 00:09:46.752 --> 00:09:51.223 I'll say up front is with the multimodal search and like, lens. 223 00:09:51.557 --> 00:09:53.125 , Google Lens. Absolutely. 224 00:09:53.125 --> 00:09:56.462 My parents were using that to like, oh what is this? 225 00:09:56.462 --> 00:09:57.897 And they like learn how to use lens 226 00:09:57.897 --> 00:09:59.966 to like identify something and it is accurate. 227 00:09:59.966 --> 00:10:00.900 It's really good. 228 00:10:00.900 --> 00:10:04.036 We have these, animal toys by a company called Schleich. 229 00:10:04.036 --> 00:10:07.039 It's a it's a toy of an armadillo. 230 00:10:07.273 --> 00:10:09.709 And you take a picture of it and it's like, it's impressive enough 231 00:10:09.709 --> 00:10:12.478 that Google would be able to say, oh, that's an armadillo, 232 00:10:12.478 --> 00:10:17.083 but it knows it's a Schleich model of an armadillo, 233 00:10:17.283 --> 00:10:21.387 and it directs you to a bunch of different websites that sell 234 00:10:21.387 --> 00:10:24.590 that particular toy with the like, correct skew and everything. 235 00:10:24.590 --> 00:10:26.492 And it's like, okay, that's really cool. 236 00:10:26.492 --> 00:10:27.627 Adam: I like how that's like the most 237 00:10:28.861 --> 00:10:30.696 Every single, every 238 00:10:30.696 --> 00:10:34.767 single demo of AI is like, and here's how you can buy something with it. 239 00:10:34.767 --> 00:10:35.868 You like that thing. 240 00:10:35.868 --> 00:10:37.970 Wouldn't it be great if you could buy it faster? 241 00:10:37.970 --> 00:10:39.038 Riley: , . 242 00:10:39.038 --> 00:10:43.609 But I will say that like, you know, Apple had that like plant identifying app 243 00:10:43.609 --> 00:10:45.678 or whatever. I mean, what else is going to show you? 244 00:10:45.678 --> 00:10:46.412 Riley: It gives you 245 00:10:46.412 --> 00:10:47.246 Adam: Where to buy the plant 246 00:10:47.246 --> 00:10:49.682 Riley: It gives you the Wikipedia and like other things. 247 00:10:49.682 --> 00:10:53.252 But you know what else is going to be when you search this plant, I guess. 248 00:10:53.252 --> 00:10:53.853 What are you saying? 249 00:10:53.853 --> 00:10:56.122 In a perfect world, they would give you like scientific studies about the plant 250 00:10:56.122 --> 00:10:57.356 Adam: I don't know what it would do. 251 00:10:57.356 --> 00:11:00.726 Like if we it's just it's funny because like the use cases 252 00:11:00.726 --> 00:11:02.928 that are always demonstrated for AI are always like. 253 00:11:02.928 --> 00:11:07.233 And this is how you can like that's think less and buy more sometimes. 254 00:11:07.233 --> 00:11:10.403 Riley: I mean sometimes the demos are like, oh, find out 255 00:11:10.403 --> 00:11:14.140 when your mom's flight is coming in and send her a message for it. 256 00:11:14.140 --> 00:11:14.874 Adam: Except it can't do that yet! 257 00:11:14.874 --> 00:11:17.476 it still sucks at all that I want it to be. I want it to be 258 00:11:17.476 --> 00:11:19.412 I want it to be Siri demo. And it was , know. 259 00:11:19.412 --> 00:11:21.347 Adam: Like that's the part that blows my mind 260 00:11:21.347 --> 00:11:25.317 is that they keep going like oh here's like this handy new AI feature. 261 00:11:25.317 --> 00:11:29.922 Yet I still can't ask my like Google Home to do more than one thing at a time. 262 00:11:29.989 --> 00:11:33.993 , I can't be like, turn off the lights and put a reminder in. 263 00:11:33.993 --> 00:11:35.928 It just goes like, I don't, I don't know. 264 00:11:35.928 --> 00:11:38.597 Have you noticed on your Nest Hub that you used to be able to say, like, 265 00:11:38.597 --> 00:11:41.233 never mind when it kept talking to you when it you didn't hear it? 266 00:11:41.233 --> 00:11:43.569 And now if you do that, it just goes. 267 00:11:43.569 --> 00:11:47.039 Never mind is an album by Nirvana from 1995. 268 00:11:47.073 --> 00:11:48.941 Riley: I can still say never mind. 269 00:11:48.941 --> 00:11:52.878 Adam: It always just just tells me about Nirvana and it pisses me off so much. 270 00:11:54.680 --> 00:11:57.917 Riley: I don't know, I feel like I've maybe I've been desensitized to it 271 00:11:57.917 --> 00:11:59.885 because my kid always goes up to the Nest Hub 272 00:11:59.885 --> 00:12:02.888 and says, I'll just activate everyone saying, Hey Google, 273 00:12:02.888 --> 00:12:05.891 what does a Spinosaurus sound like? 274 00:12:05.958 --> 00:12:08.794 And then it's like, doesn't miss, doesn't hear him properly. 275 00:12:08.794 --> 00:12:10.463 And it just like has something else. 276 00:12:10.463 --> 00:12:12.164 And so I'm like, okay. 277 00:12:12.164 --> 00:12:14.567 When it gets it, I'm like, good job, Google. 278 00:12:14.567 --> 00:12:17.136 You know, it's like sometimes I don't know what my kid's saying. 279 00:12:17.136 --> 00:12:19.105 So like the fact that you pick that up great 280 00:12:19.105 --> 00:12:20.706 Adam: Itâs interesting how- do you find that 281 00:12:20.706 --> 00:12:23.909 you have far less patience for a machine than you do for a person? 282 00:12:23.909 --> 00:12:26.912 Like if I, if I was like, hey, could you, could you order pizza? 283 00:12:27.179 --> 00:12:28.147 And they went, what? 284 00:12:28.147 --> 00:12:30.416 I'd be like, can you order pizza? You know, 285 00:12:31.550 --> 00:12:34.386 but the moment my Nest Hub is like, 286 00:12:34.386 --> 00:12:37.757 I, you have to log in, you have to recognize your voice to do that. 287 00:12:37.757 --> 00:12:38.758 I was like you, mother- 288 00:12:38.758 --> 00:12:41.560 Riley: Why are you trying to order pizza through your Nest Hub? 289 00:12:41.560 --> 00:12:42.661 Adam: Iâm not, no, that's just an example. 290 00:12:42.661 --> 00:12:43.262 Because, like, 291 00:12:43.262 --> 00:12:46.365 you can ask a human to do normal stuff and you have, like, sympathy for them. 292 00:12:46.365 --> 00:12:50.436 But the moment my robot gets something slightly wrong, I am infuriated. 293 00:12:50.436 --> 00:12:50.970 Riley: Well, let's be good. 294 00:12:50.970 --> 00:12:51.303 . 295 00:12:51.303 --> 00:12:53.072 I mean, this is an interesting problem 296 00:12:53.072 --> 00:12:56.575 with with AI in general and with with rolling it out in. 297 00:12:56.575 --> 00:12:58.878 It's kind of a half baked hallucination 298 00:12:58.878 --> 00:13:02.882 state to everyone's devices because it is amazing. 299 00:13:02.882 --> 00:13:06.218 Like, I feel like if you rewind, ChatGPT came out December 300 00:13:06.218 --> 00:13:09.488 2022, everyone was losing their minds about it because it's like, this is crazy. 301 00:13:09.488 --> 00:13:11.090 It it seems exactly like a human. 302 00:13:11.090 --> 00:13:14.026 You can talk to it like, and it's only gotten better since then, 303 00:13:14.026 --> 00:13:19.031 but now because it's so good, it's so close to being like talking to a human. 304 00:13:19.331 --> 00:13:21.934 We see all the things that is wrong with it immediately. 305 00:13:21.934 --> 00:13:23.769 You know, we're like, oh, it's passing the Turing test. 306 00:13:23.769 --> 00:13:25.404 Like, oh, blah blah is like, oh no, wait. 307 00:13:25.404 --> 00:13:28.407 Once you get a little bit used to it, you're like, oh no, the Gulf is 308 00:13:28.808 --> 00:13:31.577 the Gulf is massive because it's like close. 309 00:13:31.577 --> 00:13:34.380 And it's so much, so noticeable. 310 00:13:34.380 --> 00:13:37.383 Adam: When you just have like how many Rs are in strawberry and it like just. 311 00:13:37.383 --> 00:13:38.617 Riley: Well there's that stuff. 312 00:13:38.617 --> 00:13:41.587 There's just like you can be having a conversation with it, 313 00:13:41.587 --> 00:13:43.923 especially with these new like voice modes. 314 00:13:43.923 --> 00:13:48.160 I experimented with it like driving home from work a couple times. 315 00:13:48.160 --> 00:13:51.263 I'm like I just put on the voice mode and like pretend that I have 316 00:13:51.263 --> 00:13:54.366 that I'm giving somebody a ride and I'm like having a car conversation. 317 00:13:54.967 --> 00:13:57.736 And there are like moments at a time 318 00:13:57.736 --> 00:14:01.273 where it's like, okay, I kind of feel like I'm talking to somebody here. 319 00:14:01.273 --> 00:14:03.442 And then like, as soon as that happens, 320 00:14:03.442 --> 00:14:07.112 another thing comes up and it's like, oh, you, oh, this is what am I doing? 321 00:14:07.179 --> 00:14:08.280 I feel stupid. 322 00:14:08.280 --> 00:14:09.615 Iâm like what am I? 323 00:14:09.615 --> 00:14:12.852 I'm pretending to talk to a person, but it's like, obviously 324 00:14:12.852 --> 00:14:15.955 not a person and it doesn't understand the nuance of what I just said. 325 00:14:15.955 --> 00:14:17.456 Adam: I will say that the live feature 326 00:14:17.456 --> 00:14:20.125 is really cool, especially cause you can, like, interrupt it. 327 00:14:20.125 --> 00:14:23.128 like that's the best when you I mean, just like, no, you're you're off. 328 00:14:23.195 --> 00:14:25.497 Like you're wrong. 329 00:14:25.497 --> 00:14:28.000 Sometimes I ask you the question just to tell to shut up immediately. 330 00:14:28.000 --> 00:14:29.802 , well, actually shut up. 331 00:14:29.802 --> 00:14:31.937 You're not real. 332 00:14:31.937 --> 00:14:33.505 You're not a person like me 333 00:14:33.505 --> 00:14:36.675 Adam: Itâs true because It has like this, like this passivity. 334 00:14:36.675 --> 00:14:38.611 Like it won't take a stance on anything ever. 335 00:14:38.611 --> 00:14:39.945 Like it won't ever have an opinion. 336 00:14:39.945 --> 00:14:42.681 And it always is way more verbose than you need it to be. 337 00:14:42.681 --> 00:14:45.184 You're like, you're like, hey, hey, what's a strawberry? 338 00:14:45.184 --> 00:14:46.552 And I get like three paragraphs. 339 00:14:46.552 --> 00:14:48.120 I was like, I just needed to know that it was a fruit. 340 00:14:48.120 --> 00:14:49.455 My dog 341 00:14:49.455 --> 00:14:53.125 Riley: My main thing when I talk to these voice modes, I wanted to have a take. 342 00:14:53.359 --> 00:14:56.695 I want to, like, have a discussion and like, push back on me. 343 00:14:56.695 --> 00:14:59.565 But the thing is, it's AI it doesn't have a perspective. 344 00:14:59.565 --> 00:15:00.799 It doesn't have an opinion. 345 00:15:00.799 --> 00:15:03.969 You can make it have an opinion, but then it's like, what's the point 346 00:15:03.969 --> 00:15:05.738 Adam: Itâs just somebody else's opinion 347 00:15:05.738 --> 00:15:08.741 Riley: One thing about the slop issue I wanted to say before I move on, maybe, 348 00:15:08.941 --> 00:15:12.011 is we both do a lot of googling for our jobs, 349 00:15:12.544 --> 00:15:16.448 a lot of research and I feel like when I Google now, 350 00:15:16.582 --> 00:15:20.152 the internet is so full of I slop that I almost 351 00:15:20.152 --> 00:15:23.222 forget about it because I've learned to, like, tune it out. 352 00:15:23.656 --> 00:15:28.260 So, like Linus sent a link for that he wanted to talk about in the WAN Show, 353 00:15:28.260 --> 00:15:32.431 I saw the the website was called Glass Almanac is an AI slop site. 354 00:15:32.631 --> 00:15:34.800 It basically like. And I'd looked into the story. 355 00:15:34.800 --> 00:15:35.467 He was talking about. 356 00:15:35.467 --> 00:15:38.537 It's like, okay, this story that he sent me was 357 00:15:38.537 --> 00:15:41.707 an AI slop article based off another slop article 358 00:15:41.707 --> 00:15:45.144 that was based off a wired article that was like, 359 00:15:45.511 --> 00:15:48.647 not about the thing that Linus wanted to talk about. 360 00:15:48.914 --> 00:15:52.051 The point of this story is that I saw Glass Almanac and I was like, 361 00:15:52.051 --> 00:15:55.621 I know that's a AI slop site, so I like just tuned it out. 362 00:15:55.621 --> 00:15:59.491 So when I Google stuff, it's like I'm only seeing the little gems 363 00:15:59.491 --> 00:16:03.162 that are like the like glimpses of the real internet. 364 00:16:03.162 --> 00:16:07.366 But for everyone else who aren't us, when they Google 365 00:16:07.700 --> 00:16:09.335 and they go down the search results, 366 00:16:09.335 --> 00:16:12.438 they don't know how to tune out the AI slop for down. 367 00:16:12.438 --> 00:16:14.940 The slop is going to be the internet. 368 00:16:14.940 --> 00:16:16.642 Adam: Yeah, and it's like scary. 369 00:16:16.642 --> 00:16:17.509 It is scary. 370 00:16:17.509 --> 00:16:20.245 It's interesting 404 media did a big investigation about 371 00:16:20.245 --> 00:16:23.482 like how and why Facebook is so full of AI slop. 372 00:16:23.482 --> 00:16:26.618 They identified that Facebook is incentivizing this slop 373 00:16:26.952 --> 00:16:28.320 and they're paying people. 374 00:16:28.320 --> 00:16:31.423 Since then they've kind of been like, oh, we're going to turn down the, 375 00:16:31.724 --> 00:16:32.691 the the dial, turn down the basically. 376 00:16:32.691 --> 00:16:33.792 Adam: Turn down the sloppening of 377 00:16:33.792 --> 00:16:37.696 Riley: But like, you know, they were like posts on Facebook can be monetized. 378 00:16:37.930 --> 00:16:41.400 And so they're like, if you can get if you can make a page on Facebook 379 00:16:41.400 --> 00:16:43.235 and get a ton of traffic on it 380 00:16:43.235 --> 00:16:46.238 from the boomers clicking all this AI slop, you're going to make a lot of money. 381 00:16:46.672 --> 00:16:47.239 Adam: To add to that 382 00:16:47.239 --> 00:16:49.141 It's like it's it's this like House of cards, right? 383 00:16:49.308 --> 00:16:53.012 It's like if you have bots to boost your engagement, 384 00:16:53.245 --> 00:16:54.913 so you're getting all these fake impressions, 385 00:16:54.913 --> 00:16:58.283 then your ad revenue is worthless because you have fake behavior. 386 00:16:58.751 --> 00:17:00.652 Riley: Right. That's that's an issue too 387 00:17:00.652 --> 00:17:02.721 Adam: So then like who are they going to sell to. 388 00:17:02.721 --> 00:17:03.756 If people go like I won't, 389 00:17:03.756 --> 00:17:05.524 I don't want to sell them I say because you inflate your numbers 390 00:17:05.524 --> 00:17:09.161 by allowing bots to rampantly run on your website. 391 00:17:09.161 --> 00:17:10.429 Adam: And that's the business model 392 00:17:10.796 --> 00:17:14.867 Riley: The web crawling a war is something that's really interesting to me 393 00:17:15.300 --> 00:17:18.871 because yes, publishers and people who own websites 394 00:17:18.871 --> 00:17:22.941 want real visitors unless they make slop. 395 00:17:22.941 --> 00:17:24.910 Slop people don't care who's coming to their website. 396 00:17:24.910 --> 00:17:28.580 But like people who, you know, journalists want real visitors. 397 00:17:28.914 --> 00:17:32.418 CEO, Cloudflare, Matthew Prince recently, 398 00:17:33.152 --> 00:17:37.990 talked about the ratio of crawlers on a on a 399 00:17:38.023 --> 00:17:41.960 like a publisher website to actual human visitors 400 00:17:42.995 --> 00:17:44.363 has gone from like being like 401 00:17:44.363 --> 00:17:48.467 2 to 1 to like dozens of crawlers per. 402 00:17:48.467 --> 00:17:52.504 And it's even worse for the, other like, 403 00:17:53.072 --> 00:17:56.141 I think that was for Google and for OpenAI and anthropic. 404 00:17:56.141 --> 00:17:59.211 It was like thousands, hundreds of thousands per one visitor. 405 00:17:59.211 --> 00:18:02.448 well, they tried to solve it recently with the pay per crawl thing. 406 00:18:03.082 --> 00:18:04.083 Did you hear about that? 407 00:18:04.083 --> 00:18:04.716 Adam: No 408 00:18:04.716 --> 00:18:05.451 Riley: I'll bring it up. 409 00:18:05.451 --> 00:18:09.121 Enabling content owners to charge AI crawlers for access. 410 00:18:09.288 --> 00:18:10.122 So like. 411 00:18:10.122 --> 00:18:11.323 Adam: Who put this initiative? 412 00:18:11.323 --> 00:18:15.627 Riley: Cloudflare is actually trying to do something about this. 413 00:18:15.627 --> 00:18:17.729 And they're incentivized to because like, you know... 414 00:18:17.729 --> 00:18:19.198 Adam: it's their it's their servers. 415 00:18:19.198 --> 00:18:21.366 Riley: They're hosting this content and they don't want it. 416 00:18:21.366 --> 00:18:23.769 All the bandwidth taken up by by crawlers. 417 00:18:23.769 --> 00:18:25.070 I don't know how well it's going to work. 418 00:18:25.070 --> 00:18:28.707 There's probably going to be an arms race where the the AI companies 419 00:18:29.108 --> 00:18:30.109 find a way around it. 420 00:18:30.109 --> 00:18:34.646 Yeah, they're allowing websites to block AI crawlers by default and then charge 421 00:18:34.947 --> 00:18:36.915 the AI companies to be like, you want to crawl the web? 422 00:18:36.915 --> 00:18:39.318 Pay up. itâs just in beta right now. 423 00:18:39.318 --> 00:18:40.419 But we'll see where that goes. 424 00:18:40.419 --> 00:18:42.387 Adam: Okay. What the hell happened to YouTube search? 425 00:18:42.387 --> 00:18:43.922 I know this is a complete left turn. 426 00:18:43.922 --> 00:18:45.424 but have you tried 427 00:18:45.424 --> 00:18:48.427 searching something on YouTube and getting more than eight results total? 428 00:18:48.427 --> 00:18:51.797 Riley: Like yeah... I have mixed feelings about YouTube 429 00:18:51.830 --> 00:18:53.999 because I feel like YouTube has done a lot of things right. 430 00:18:53.999 --> 00:18:54.933 Let's talk about this. 431 00:18:54.933 --> 00:18:57.069 When you search something on YouTube, 432 00:18:57.069 --> 00:18:58.704 you're saying that you don't get what you want. 433 00:18:58.704 --> 00:19:00.239 Adam :So if you search something on YouTube, 434 00:19:00.239 --> 00:19:03.509 it'll give you about 10 to 12 actual results. 435 00:19:03.509 --> 00:19:04.243 And then it's before. 436 00:19:04.243 --> 00:19:06.712 It switches to random stuff, especially if it's something that's more niche. 437 00:19:06.712 --> 00:19:10.849 If you go to say like a Rav4 review, we're getting tons of relevant results. 438 00:19:10.849 --> 00:19:15.754 but if I go like, let's say like TOSlink adapter 439 00:19:15.754 --> 00:19:17.289 this is terrible itâs showing up 440 00:19:17.289 --> 00:19:18.390 It's not doing the right thing. 441 00:19:20.292 --> 00:19:20.993 Sometimes it's like 442 00:19:20.993 --> 00:19:24.229 Riley: Yeah 443 00:19:24.229 --> 00:19:24.696 Adam: here's six results and there's totally more there. 444 00:19:24.696 --> 00:19:29.067 But it just goes like ah, there I mean, you watch some other stuff instead. 445 00:19:29.067 --> 00:19:31.570 Riley: I find that that's the case with Google a lot. 446 00:19:31.970 --> 00:19:35.407 And I don't know, honestly, I don't know anymore whether 447 00:19:35.841 --> 00:19:39.511 Google is not giving me, a lot of results 448 00:19:39.511 --> 00:19:44.082 because there's not many results out there to, to find or whether it's just 449 00:19:44.850 --> 00:19:47.085 there's so much slop and it has to dig through all the slop 450 00:19:47.085 --> 00:19:49.054 and it can actually find the good stuff anymore. 451 00:19:49.054 --> 00:19:52.090 Sammy: Sorry, I just want to quickly interrupt because I also looked at the Toslink. 452 00:19:52.491 --> 00:19:56.195 Adam mentioned I got a product like ads for product like, 453 00:19:56.195 --> 00:19:59.398 so you can buy it one video and then I got shorts. 454 00:19:59.498 --> 00:20:01.500 So it's like the front page is already, like useless. 455 00:20:01.500 --> 00:20:02.801 I had to scroll down to find more 456 00:20:02.801 --> 00:20:03.569 Adam: my default for Toslink 457 00:20:03.569 --> 00:20:06.338 adapters sponsored, which is weird because it's on YouTube. 458 00:20:07.472 --> 00:20:09.007 And then it's the shorts 459 00:20:09.007 --> 00:20:11.977 Sammy: and then it's another ad and then it's actually like actual stuff, 460 00:20:11.977 --> 00:20:14.346 Adam: it's more shorts and then it's another ad 461 00:20:14.346 --> 00:20:15.714 Riley: Toslink adapter? 462 00:20:15.714 --> 00:20:19.284 Riley: So , I got ad two results ad. 463 00:20:19.284 --> 00:20:23.322 Shorts and then three results shorts again. 464 00:20:23.755 --> 00:20:25.090 Is it just the same shorts? 465 00:20:25.090 --> 00:20:26.858 Riley: Some of the shorts of the same. 466 00:20:26.858 --> 00:20:27.726 Adam: Please watch YouTube shorts. 467 00:20:27.726 --> 00:20:31.029 Riley: Yeah, I don't think I don't think shorts are annoying. 468 00:20:31.496 --> 00:20:32.431 By themselves. 469 00:20:32.431 --> 00:20:36.301 I think it's just annoying to have the actual results broken up so often. 470 00:20:36.301 --> 00:20:39.304 This is a great example of what it's like to use Google 471 00:20:39.304 --> 00:20:40.405 and the internet in general. 472 00:20:40.405 --> 00:20:42.975 Right now you have to filter everything out. 473 00:20:42.975 --> 00:20:45.844 And this is what I'm saying about normies they're going through. 474 00:20:45.844 --> 00:20:49.615 And this is why all of these, search ads, 475 00:20:49.615 --> 00:20:52.050 these are like, you know, sponsored results or whatever. 476 00:20:52.050 --> 00:20:53.552 It's why they get so many clicks 477 00:20:53.552 --> 00:20:57.322 because the normies don't know to not click on them like they're. 478 00:20:57.356 --> 00:21:00.792 I think they're learning, obviously, but like our parents are going through 479 00:21:01.026 --> 00:21:03.262 and they're probably learning to be like, oh, it's sponsored. 480 00:21:03.262 --> 00:21:04.162 And then they skip it. 481 00:21:04.162 --> 00:21:07.232 But like the vast majority of people, I think while they're using these things, 482 00:21:07.232 --> 00:21:08.600 Adam: No. They can't, tell 483 00:21:08.600 --> 00:21:10.002 Riley: First result is a sponsored result. 484 00:21:10.002 --> 00:21:13.005 They're probably clicking that the vast majority of the time 485 00:21:13.238 --> 00:21:15.440 because they're like, oh, that's the top result. 486 00:21:15.440 --> 00:21:16.675 It's probably the most useful thing. 487 00:21:16.675 --> 00:21:17.843 It's dystopian. 488 00:21:17.843 --> 00:21:20.846 Adam: It's really frustrating, especially when you like, 489 00:21:20.846 --> 00:21:23.849 like for tech support stuff where you're like, how to back up my PC 490 00:21:23.882 --> 00:21:27.319 and everything is just like SEO slop, but it's SEO slop from people who sell 491 00:21:27.986 --> 00:21:29.121 backup software, right? 492 00:21:29.121 --> 00:21:30.589 So it's like it's like pseudo helpful. 493 00:21:30.589 --> 00:21:34.359 Riley: Honestly, there have been a good many of those, like, it's a company 494 00:21:34.660 --> 00:21:37.195 and they have like a blog blaming something. 495 00:21:37.195 --> 00:21:39.931 âThe five different types of blah blah blahâ 496 00:21:39.931 --> 00:21:41.300 And they're explaining it to you. 497 00:21:41.300 --> 00:21:44.636 And then at the end they're like, and to solve that problem we have a product. 498 00:21:44.636 --> 00:21:45.137 Check it out. 499 00:21:45.137 --> 00:21:45.537 You know. 500 00:21:45.537 --> 00:21:46.071 Adam: Yeah and itâs 501 00:21:46.071 --> 00:21:47.739 Riley: Like to me the like I've actually had 502 00:21:47.739 --> 00:21:49.808 I've actually found quite a few of those that are, 503 00:21:49.808 --> 00:21:51.910 that are useful that have solved my problem for me. 504 00:21:51.910 --> 00:21:54.813 Adam: And oftentimes when they do it and they put like their app 505 00:21:54.813 --> 00:21:57.649 is like the 10th, like the fifth or sixth step, I'm like, you know what? 506 00:21:57.649 --> 00:21:58.550 Riley: That's fine. 507 00:21:58.550 --> 00:21:59.184 Adam: That's okay. 508 00:21:59.184 --> 00:22:01.553 Riley: I'll accept this real quick. 509 00:22:01.620 --> 00:22:03.388 I want to talk about AI overviews. 510 00:22:03.388 --> 00:22:04.256 Adam: Okay. Sure. 511 00:22:04.256 --> 00:22:07.592 Riley: Because this to me is one of the most 512 00:22:08.026 --> 00:22:10.562 pernicious developments in search. 513 00:22:10.562 --> 00:22:13.699 The idea that instead of using Google like a tool, 514 00:22:13.699 --> 00:22:16.601 that you know how to use yourself. 515 00:22:16.601 --> 00:22:19.071 They've added this AI 516 00:22:19.071 --> 00:22:21.840 search functionality to all the major searches. 517 00:22:21.840 --> 00:22:24.743 So engines that turn it from a tool that you're using 518 00:22:24.743 --> 00:22:27.746 into an interface with a bot that's using the tool. 519 00:22:27.879 --> 00:22:28.880 Adam: Yes. Yeah. 520 00:22:28.880 --> 00:22:31.917 Riley: And I think this is going to have a horrible effect on people's 521 00:22:31.917 --> 00:22:35.354 ability to research stuff for themselves and think critically. 522 00:22:35.887 --> 00:22:39.124 And just to be able to to find out what the hell is going on out there, 523 00:22:39.658 --> 00:22:41.360 because this is how people are going to use it. 524 00:22:41.360 --> 00:22:42.728 We already had the like 525 00:22:42.728 --> 00:22:46.631 the kind of Google summaries when you like, before the a- before LMS. 526 00:22:46.665 --> 00:22:48.033 we're really a big thing. 527 00:22:48.033 --> 00:22:51.203 Adam: It took like a snippet of like a Wikipedia article that would be like, 528 00:22:51.303 --> 00:22:52.804 how much does an Eagle weigh? 529 00:22:52.804 --> 00:22:55.540 And I was honestly totally fine with that the vast majority of the time, 530 00:22:55.540 --> 00:22:59.378 because what it was showing you was an actual snippet of a website. 531 00:22:59.378 --> 00:23:02.948 And then right below it was the link with the full text of the website header 532 00:23:02.948 --> 00:23:03.648 and everything like that. 533 00:23:03.648 --> 00:23:04.683 Riley: I'll look more into that. 534 00:23:04.683 --> 00:23:07.719 I'll go to the direct source that you're citing to me, but 535 00:23:07.719 --> 00:23:12.924 we've gone from that to a chat bot searching the internet for you 536 00:23:13.158 --> 00:23:16.895 and having the very first thing you see be Google's 537 00:23:17.496 --> 00:23:20.732 AI summary of all this stuff that I've found online. 538 00:23:20.866 --> 00:23:25.170 And the thing is that, like, if it was accurate, yes, that's very useful, 539 00:23:26.204 --> 00:23:29.274 but you can't know that it's accurate, which is the whole problem. 540 00:23:29.274 --> 00:23:31.676 And so why are we rolling this out? 541 00:23:31.676 --> 00:23:35.180 Because everyone's going to learn that. 542 00:23:35.180 --> 00:23:38.283 You just need to ask this robot what's up. 543 00:23:38.283 --> 00:23:40.852 The robot will tell you. 544 00:23:40.852 --> 00:23:42.120 And then you're like, okay. 545 00:23:42.120 --> 00:23:45.223 Adam: And the fact that they hide it behind like a tiny little link that has a pop up, 546 00:23:45.524 --> 00:23:49.194 like I just googled how many are in the word strawberry? 547 00:23:49.561 --> 00:23:51.830 And it goes, there are zero W's in the word strawberry. 548 00:23:51.830 --> 00:23:55.801 The letters in the word are stra w very, 549 00:23:56.701 --> 00:24:01.807 its source is a post complaining about how LLMS keeps screwing up. 550 00:24:01.807 --> 00:24:03.442 How many R are in strawberry? 551 00:24:03.442 --> 00:24:07.045 Riley: This is I mean, this is, it's kind of a meme in the, 552 00:24:08.013 --> 00:24:09.881 in the AI community. 553 00:24:09.881 --> 00:24:11.283 There's an easy explanation for this. 554 00:24:11.283 --> 00:24:14.286 It's the fact that LLMS just the way that they work, 555 00:24:14.953 --> 00:24:18.123 they don't know what the letters are, that they're. 556 00:24:18.123 --> 00:24:20.358 that they're they're dealing with tokens. 557 00:24:20.358 --> 00:24:22.828 They're not dealing with letters that make up words. 558 00:24:22.828 --> 00:24:25.964 It can't look at the word strawberry and go line by line. 559 00:24:25.997 --> 00:24:28.099 It's like I mean, it did, as you can see, 560 00:24:28.099 --> 00:24:30.035 but it doesn't have the understanding of what it's doing. 561 00:24:30.035 --> 00:24:31.503 It's just kind of like putting stuff out. 562 00:24:31.503 --> 00:24:32.737 Adam: It's a word calculator. 563 00:24:32.737 --> 00:24:33.171 Riley: Yes 564 00:24:33.171 --> 00:24:35.974 Adam: And it's just but we don't really know how the calculator works. 565 00:24:35.974 --> 00:24:38.577 It's just that it goes like what is most likely to happen. 566 00:24:38.577 --> 00:24:39.911 Riley: And that's like, that's a whole thing. 567 00:24:39.911 --> 00:24:41.346 I mean, we let's not get into the whole... 568 00:24:41.346 --> 00:24:42.647 Adam: The octopus. 569 00:24:42.647 --> 00:24:44.282 Riley: So anyways, I have an issue with that 570 00:24:44.282 --> 00:24:49.721 AI overviews because of my job is researching and understanding things. 571 00:24:49.721 --> 00:24:53.024 And when I write a TechLinked story I'm like 572 00:24:53.024 --> 00:24:56.194 if I screw up one of these facts, people are going to call me out. 573 00:24:56.228 --> 00:24:59.531 I need to know that what I like, that I understand what's going on, 574 00:24:59.898 --> 00:25:02.701 and I just I can't I can't trust AI overviews 575 00:25:02.701 --> 00:25:05.570 and I don't think other people should be, should, should learn to trust it. 576 00:25:05.570 --> 00:25:08.874 I think it's funny because my instinct when I use the AI overviews is I 577 00:25:08.874 --> 00:25:11.977 go, I read it and I go, all right, time to find out if that's true. 578 00:25:12.077 --> 00:25:14.913 Which means that I just spent a bunch of time not determining 579 00:25:14.913 --> 00:25:16.982 whether or not something's true because I had to. 580 00:25:16.982 --> 00:25:19.117 I don't trust the AI summary. 581 00:25:19.117 --> 00:25:23.388 Riley: Now, the counterpoint to this is that normies were never good at googling. 582 00:25:23.688 --> 00:25:25.924 googling has always kind of been a skill, 583 00:25:25.924 --> 00:25:26.658 Adam: Yes, 584 00:25:26.658 --> 00:25:31.096 Riley: but I fear that whatever rudimentary skills people were forced to pick up 585 00:25:31.096 --> 00:25:34.099 if they wanted to Google something or now, you know, it's more convenient. 586 00:25:34.099 --> 00:25:37.769 I guess it's like instead of going to the library and looking something 587 00:25:37.769 --> 00:25:40.772 up, you're asking bro, 588 00:25:40.906 --> 00:25:44.242 standing outside the library, hey, you know, 589 00:25:44.809 --> 00:25:49.247 when did Hannibal first, conquer? 590 00:25:49.247 --> 00:25:51.249 I don't know, Carthage. Is that a thing? 591 00:25:51.249 --> 00:25:52.617 Adam: Sure. 592 00:25:52.617 --> 00:25:53.218 Riley: It's been a long time. 593 00:25:53.218 --> 00:25:55.253 Adam: Yeah Let me go ask ten other dudes inside. 594 00:25:55.253 --> 00:25:57.322 Riley: It's like, oh, that was totally this year. 595 00:25:57.322 --> 00:26:00.692 And you're like, okay, I could go into the library 596 00:26:00.692 --> 00:26:03.929 and look up the actual thing, but this guy seems pretty confident. 597 00:26:03.929 --> 00:26:05.564 Okay, I'll just believe him. 598 00:26:05.564 --> 00:26:07.332 That's what we're doing with Google 599 00:26:07.332 --> 00:26:08.667 Adam: Hannibal never conquered Carthage. 600 00:26:08.667 --> 00:26:10.101 Riley: All right. Sorry. 601 00:26:10.101 --> 00:26:11.670 Adam: Or did he? 602 00:26:11.670 --> 00:26:14.039 Riley: This is AI overviews! 603 00:26:14.039 --> 00:26:15.507 Adam: He commanded the forces of Carthage. 604 00:26:15.507 --> 00:26:17.275 Riley: Oh, he was on the side of Carthage. 605 00:26:17.275 --> 00:26:18.543 There we go 606 00:26:18.543 --> 00:26:19.377 Adam: I think that. 607 00:26:19.377 --> 00:26:20.679 Riley: Oh, sorry. Go ahead. 608 00:26:20.679 --> 00:26:21.713 Adam: No, you go ahead. 609 00:26:21.713 --> 00:26:25.383 Riley: I was just thinking I had I had wrote down here that like, try Reddit answers. 610 00:26:25.383 --> 00:26:28.954 Try searching on TikTok because one big thing, 611 00:26:29.220 --> 00:26:32.424 it's not about Google getting worse per se, but, 612 00:26:33.758 --> 00:26:36.328 young people, instead of searching Google, 613 00:26:36.328 --> 00:26:39.564 a lot of time they're searching on Reddit and they're searching on TikTok. 614 00:26:40.765 --> 00:26:43.602 Adam: I have never tried the Reddit answers. 615 00:26:43.602 --> 00:26:46.605 Riley: What is a Toslink adapter? 616 00:26:46.605 --> 00:26:49.274 Toslink adapter is a device that allows you connect audio equipment like. 617 00:26:49.274 --> 00:26:51.676 Okay, so it's giving me like a ChatGPT answer. 618 00:26:51.676 --> 00:26:53.011 Here's the thing. 619 00:26:53.011 --> 00:26:56.314 This is an AI summary, but right away 620 00:26:56.314 --> 00:27:01.586 it's linking to Reddit post by humans that I know are probably real. 621 00:27:01.620 --> 00:27:04.623 There's a lot of AI, there's a lot of bots on Reddit for sure, 622 00:27:04.823 --> 00:27:07.525 but I think that the vast majority of the time 623 00:27:07.525 --> 00:27:11.463 a post about something that like, like this is going to be I'm, I'm 624 00:27:11.463 --> 00:27:15.333 going to be fairly confident that, okay, this first this was written by human. 625 00:27:15.567 --> 00:27:16.501 Adam: for now, 626 00:27:16.501 --> 00:27:19.471 Riley: for now, I'm sure that Reddit may be flooded 627 00:27:19.471 --> 00:27:22.974 by bots talking about basic tech support issues. 628 00:27:22.974 --> 00:27:24.542 Adam: Right. And this is the AI issue. 629 00:27:24.542 --> 00:27:25.710 Riley: This is like the end result. 630 00:27:25.710 --> 00:27:28.680 Adam: If it's all bots, nothing's a value anymore. 631 00:27:29.180 --> 00:27:32.717 And then the data is going to be trained on become slop, and it's just going to be 632 00:27:32.717 --> 00:27:33.385 this vicious cycle. 633 00:27:33.385 --> 00:27:36.554 It's like the thing that everyone's the people were very early talking about 634 00:27:37.122 --> 00:27:39.691 how that would happen is that like, AI would just end up 635 00:27:39.691 --> 00:27:42.894 getting worse and worse and worse as it trains itself on its own slop 636 00:27:43.428 --> 00:27:46.031 over and over and over again till it's basically useless, 637 00:27:46.031 --> 00:27:47.899 Riley: It's like a jpeg getting compressed again and again. 638 00:27:47.899 --> 00:27:49.868 Adam: Now I don't trust Reddit. 639 00:27:49.868 --> 00:27:50.969 Riley: Now. You don't trust. Reddit. 640 00:27:50.969 --> 00:27:53.271 Adam: I fact checked everything now. 641 00:27:53.271 --> 00:27:54.873 I mean, it's my job right. 642 00:27:54.873 --> 00:27:57.142 So like I even if I read it on Reddit, I'm like, eh 643 00:27:57.142 --> 00:28:00.211 you know, I guess I'll have to look like I think one of the best places 644 00:28:00.211 --> 00:28:03.214 to target stuff is topics of things that are hard to review. 645 00:28:03.248 --> 00:28:04.683 Right. Like I do a lot of reasons. 646 00:28:04.683 --> 00:28:07.452 We do a lot of research on products and stuff, like a washing machine. 647 00:28:07.452 --> 00:28:10.088 No one's out there testing, washing machines 648 00:28:10.088 --> 00:28:12.023 because it'd be prohibitively expensive. 649 00:28:12.023 --> 00:28:15.026 And usually you get one you buy. 650 00:28:15.126 --> 00:28:16.928 How often do you buy a washing machine? 651 00:28:17.929 --> 00:28:18.630 Riley: I donât... 652 00:28:18.630 --> 00:28:19.698 Adam: Have you ever ? 653 00:28:19.698 --> 00:28:20.065 Riley: no 654 00:28:20.065 --> 00:28:23.101 Adam: Same! so and same thing with like, AC units, 655 00:28:23.101 --> 00:28:24.602 stuff like that where they're very difficult to test 656 00:28:24.602 --> 00:28:27.639 or they're like so commodified that nobody really tests them. 657 00:28:28.106 --> 00:28:29.974 But I want to have some sort of opinion. 658 00:28:29.974 --> 00:28:32.010 But that's like where the bots come in, right? 659 00:28:32.010 --> 00:28:35.814 That's where the like, listicles that just are like, these are like for. 660 00:28:36.014 --> 00:28:37.982 Riley: Yeah, but that's not on Reddit. 661 00:28:37.982 --> 00:28:38.783 Adam: Yeah it is! 662 00:28:38.783 --> 00:28:40.251 Riley: Reddit is is. 663 00:28:40.251 --> 00:28:42.987 Adam: 100% when I've looked like best air conditioner Reddit. 664 00:28:42.987 --> 00:28:46.458 You will find tons of bot accounts that are like. 665 00:28:46.591 --> 00:28:50.962 They're posting like I bought the Penguino D30 666 00:28:50.962 --> 00:28:54.866 from, DeLonghi and it's the best thing I've ever owned. 667 00:28:54.866 --> 00:28:58.203 It has these feature and it is the most ChatGPT stuff I've ever read. 668 00:28:58.203 --> 00:29:00.004 Riley: I will say that like the vast majority 669 00:29:00.004 --> 00:29:04.142 of like the writing that I do is not for stuff like that, I guess. 670 00:29:04.142 --> 00:29:08.613 So you've probably definitely encountered more things like, like product focused. 671 00:29:09.013 --> 00:29:10.515 Most of it, most of the time that I spend on 672 00:29:10.515 --> 00:29:13.785 Reddit is like, you know, some article was written based on, 673 00:29:14.652 --> 00:29:19.724 another article that was based on a Reddit post from someone whose GPU exploded. 674 00:29:20.058 --> 00:29:23.762 So then I'm like clicking through, finding trying to find the original Reddit post 675 00:29:23.762 --> 00:29:26.364 and like reading the comments and stuff like that. So. 676 00:29:26.364 --> 00:29:29.701 But yeah, I don't doubt that there are that the bots. 677 00:29:29.701 --> 00:29:33.404 I mean, I know that the bots, problem is a big thing on Reddit, 678 00:29:33.872 --> 00:29:36.841 and not just for that, I think what was it? 679 00:29:36.841 --> 00:29:39.110 What was the subreddit? I think it was am I the a****** 680 00:29:40.612 --> 00:29:42.113 Adam: Well, everything on that subreddit fake. 681 00:29:42.113 --> 00:29:43.114 Everyone knows that. 682 00:29:43.114 --> 00:29:44.282 Sammy: Is it? 683 00:29:44.282 --> 00:29:47.252 Adam: On am I the a****** space. So there's so many. 684 00:29:47.252 --> 00:29:47.919 Riley: And this is the thing 685 00:29:47.919 --> 00:29:48.686 with slop. 686 00:29:48.686 --> 00:29:50.455 Slop isn't new, right? 687 00:29:50.455 --> 00:29:51.022 It's not like. 688 00:29:51.022 --> 00:29:54.959 It's not like, you know, flooding the internet with slop was not a thing 689 00:29:54.959 --> 00:29:55.293 before. 690 00:29:55.293 --> 00:29:58.496 with AI, like humans are great at slop. 691 00:29:59.264 --> 00:30:00.632 And we've been doing it for a long time. 692 00:30:00.632 --> 00:30:02.066 It's a storied tradition. 693 00:30:02.066 --> 00:30:05.303 but now it's just so trivially. 694 00:30:05.303 --> 00:30:08.239 Trivially easy for anyone just to pump out slop. 695 00:30:08.239 --> 00:30:10.408 Like they're a like it's a factory. 696 00:30:10.408 --> 00:30:12.644 Adam: It was, It was change my view. 697 00:30:12.644 --> 00:30:14.612 Riley: Oh. It was change my view. 698 00:30:14.612 --> 00:30:17.982 Adam: And, basically, Reddit threatened to sue them 699 00:30:17.982 --> 00:30:20.985 because they're like, hey, we're the only ones who can put bots in there. 700 00:30:20.985 --> 00:30:22.053 Riley: We didn't actually say what it was. 701 00:30:22.053 --> 00:30:25.323 But it's so like these, these University of Zurich researchers, 702 00:30:26.057 --> 00:30:29.561 basically just posted tons and tons and tons of posts and change it 703 00:30:29.561 --> 00:30:30.695 and changed my view. 704 00:30:30.695 --> 00:30:34.632 They say that they were partially written by AI like, but they used 705 00:30:34.632 --> 00:30:39.437 AI to generate tons and tons of posts over like a year or something. 706 00:30:39.838 --> 00:30:43.208 Adam: I think the goal was to figure out if they could better change the opinions 707 00:30:43.208 --> 00:30:44.042 of folks. 708 00:30:44.042 --> 00:30:44.676 Riley: Yeah, to find out 709 00:30:44.676 --> 00:30:48.780 how good AI was at changing people's view. 710 00:30:48.780 --> 00:30:51.683 And I think they found that AI was slightly better. 711 00:30:51.683 --> 00:30:54.986 Than than people at, yeah... At convincing us. 712 00:30:54.986 --> 00:30:56.988 So that's great news. 713 00:30:56.988 --> 00:30:58.923 That's that's exciting for the future. 714 00:30:58.923 --> 00:30:59.357 Okay. 715 00:30:59.357 --> 00:31:01.826 We've gone on a lot of tangents here. 716 00:31:01.826 --> 00:31:03.461 Adam: Why does Gmail search stuff so bad? 717 00:31:03.461 --> 00:31:04.896 Riley: I... 718 00:31:04.896 --> 00:31:06.631 Sammy: Oh my god can I can I say something about this? 719 00:31:06.631 --> 00:31:07.732 Riley: Sammy has a thought about that 720 00:31:07.732 --> 00:31:10.134 Sammy: I was looking for something 721 00:31:10.134 --> 00:31:10.869 like E-transfer. 722 00:31:10.869 --> 00:31:12.871 Right. That's for for non-Canadian. 723 00:31:12.871 --> 00:31:15.540 That's basically like transferring money through to Venmo. 724 00:31:15.540 --> 00:31:16.241 Adam: Itâs like Venmo 725 00:31:16.241 --> 00:31:18.710 Venmo. yeah. All that. 726 00:31:18.710 --> 00:31:19.310 I was looking for it. 727 00:31:19.310 --> 00:31:22.981 And then like the Google, the Gmail search result gave me something 728 00:31:22.981 --> 00:31:26.818 that wasn't alphabetical, nor was it in chronological order. 729 00:31:26.918 --> 00:31:30.455 So I was like, so give me stuff from like a year ago or like a few months ago. 730 00:31:30.455 --> 00:31:31.489 Iâm just like why? 731 00:31:31.489 --> 00:31:33.157 Riley: You sent that screenshot. And I was like, what? 732 00:31:33.157 --> 00:31:35.426 I've never even I've never seen that happen to me. 733 00:31:35.426 --> 00:31:38.162 Adam: Yeah and itâll miss stuff like, I'll, I'll search like meeting 734 00:31:38.162 --> 00:31:39.230 because I know I had to have a meet. 735 00:31:39.230 --> 00:31:41.332 I there's something about a meeting in some email 736 00:31:41.332 --> 00:31:44.202 and it's like just shows me stuff on like four years ago. 737 00:31:44.202 --> 00:31:46.804 And I'm like, no, like it can't search. 738 00:31:46.804 --> 00:31:49.307 It seems like it doesn't search the contents of the emails or something. 739 00:31:49.307 --> 00:31:52.076 The vast majority of the time when I saw something in my email, 740 00:31:53.111 --> 00:31:56.547 it's still chronological, so I don't know. 741 00:31:57.382 --> 00:32:00.418 I mean, if I search meeting someone's eye for June 742 00:32:00.418 --> 00:32:03.488 30th, June 26th, June 24th, I mean, , it's chronological, so I don't. 743 00:32:03.488 --> 00:32:06.491 Adam: For... if I search James meeting. 744 00:32:06.858 --> 00:32:09.861 Why is the 10th result here 745 00:32:10.094 --> 00:32:13.965 from June 6th- er sorry June 10th, 2024. 746 00:32:14.565 --> 00:32:17.568 And then it goes to April 11th of this year? 747 00:32:17.702 --> 00:32:19.237 Sammy: Yeah, I will say it wasn't this bad. 748 00:32:19.237 --> 00:32:22.240 I don't know what changed in the last few months, 749 00:32:22.807 --> 00:32:23.341 but like- 750 00:32:23.341 --> 00:32:26.644 Riley: Never I've never seen that I don't know, like, what do you guys doing. 751 00:32:26.644 --> 00:32:28.279 Sammy: Try searching something on your email because these are the default. 752 00:32:28.279 --> 00:32:29.814 Adam: Yeah okay so the default thereâs a... 753 00:32:29.814 --> 00:32:31.816 So there is a most recent there's a most relevant view. 754 00:32:31.816 --> 00:32:34.118 Riley: Well yeah! Just don't just deselect that. 755 00:32:34.118 --> 00:32:34.585 Sammy: Wait theres.. 756 00:32:34.585 --> 00:32:36.087 They changed it? 757 00:32:36.087 --> 00:32:36.654 Riley: Oh my gosh. 758 00:32:36.654 --> 00:32:38.890 Why do you have most relevant selected I probably- 759 00:32:38.890 --> 00:32:41.292 Adam: Because maybe I want the most relevant stuff I- 760 00:32:41.292 --> 00:32:45.029 one would think then maybe recency would make things slightly more relevant. 761 00:32:45.129 --> 00:32:46.397 You know they might be related. 762 00:32:46.397 --> 00:32:48.800 Riley: How was Google going to know what the most relevant thing is? 763 00:32:48.800 --> 00:32:50.068 Sammy: Why is that even a thing? 764 00:32:50.068 --> 00:32:52.503 Adam: They don't ask me for most recent results on Google search. 765 00:32:52.503 --> 00:32:53.871 Riley: So this is okay. Here we go. 766 00:32:53.871 --> 00:32:58.576 Actually this is this is a this is a good takeaway for the for the viewers. 767 00:32:58.576 --> 00:32:59.344 Okay. 768 00:32:59.344 --> 00:33:02.981 As much as AI has screwed up the web and screwed up everything, 769 00:33:03.314 --> 00:33:08.820 there are still ways for you to make your search experience not horrible. 770 00:33:09.320 --> 00:33:09.654 Adam: yeah 771 00:33:09.654 --> 00:33:14.025 Riley: So like on Google, there still is a way to search without the AI overviews. 772 00:33:14.158 --> 00:33:17.595 Like if I search I'll say Who is Linus? 773 00:33:19.530 --> 00:33:22.467 Sebastian? 774 00:33:22.467 --> 00:33:23.868 Okay. It doesn't even give me 775 00:33:23.868 --> 00:33:25.937 An AI overview. Unbelievable. 776 00:33:25.937 --> 00:33:28.773 So anyways, pretend there's an AI overview there. 777 00:33:28.773 --> 00:33:31.542 You can click right here. It says web. 778 00:33:31.542 --> 00:33:34.946 Click there and it will give you just a regular Google list of links. 779 00:33:34.946 --> 00:33:37.015 So like that still exists. 780 00:33:37.015 --> 00:33:42.120 And in in Gmail they have the drop down most recent or most relevant. 781 00:33:42.587 --> 00:33:45.890 So there are ways to find the workaround 782 00:33:46.290 --> 00:33:49.794 and and and use it and use the tools available to you. 783 00:33:49.794 --> 00:33:51.195 Don't don't give in. 784 00:33:51.195 --> 00:33:52.930 Don't give in to the AI. 785 00:33:52.930 --> 00:33:55.833 Adam: The other like quick ways to just add a curse word. 786 00:33:55.833 --> 00:33:56.300 Riley: Oh yeah 787 00:33:56.300 --> 00:33:57.969 Adam: So instead of saying Who is Linus tech tips, you say who? 788 00:33:57.969 --> 00:33:58.870 The f*** is Linus Tech Tips 789 00:33:58.870 --> 00:34:00.171 Then the AIâs like 790 00:34:00.171 --> 00:34:01.672 Riley: Who the f*** 791 00:34:01.672 --> 00:34:03.541 Adam: Other things you can do 792 00:34:03.541 --> 00:34:04.375 Quotation marks. 793 00:34:04.375 --> 00:34:07.378 You put something in quotes, it'll search for that exact phrase. 794 00:34:07.678 --> 00:34:10.181 If you put something with, if you put a minus 795 00:34:10.181 --> 00:34:11.182 and then you put like a word. 796 00:34:11.182 --> 00:34:14.619 So if I say like I want, dogs minus spaniel, 797 00:34:16.020 --> 00:34:18.856 I will get results without the word spaniel in it. 798 00:34:18.856 --> 00:34:20.525 Okay. That's good to know. 799 00:34:20.525 --> 00:34:20.992 Adam: Yeah!. 800 00:34:20.992 --> 00:34:21.692 Riley: Wow! 801 00:34:21.692 --> 00:34:24.796 Oh, I didn't even I wanted to say this about the YouTube thing that, like, 802 00:34:25.196 --> 00:34:28.199 I feel like YouTube is just such a 803 00:34:28.533 --> 00:34:32.103 it's some things they're doing so well and other things are doing horribly like 804 00:34:32.103 --> 00:34:33.805 Adam: let's give them a compliment sandwich. 805 00:34:33.805 --> 00:34:34.839 What are they doing well? 806 00:34:34.839 --> 00:34:38.242 Riley: I feel like they've done really well in the past couple of years, at least 807 00:34:38.242 --> 00:34:43.681 in, surfacing smaller creators and and in the feed. 808 00:34:44.015 --> 00:34:47.452 So when you just go to home and the algorithmic feed it's giving me, 809 00:34:47.452 --> 00:34:49.320 you know, like the normal kind of big like, 810 00:34:49.320 --> 00:34:52.523 oh, look, this got a lot of clicks, but then, you know, you 811 00:34:52.523 --> 00:34:53.758 scroll past a few of those 812 00:34:53.758 --> 00:34:57.161 and then it's like some small creator and it's the video has like 50 views 813 00:34:57.161 --> 00:34:57.795 or something. 814 00:34:57.795 --> 00:34:59.464 And I've clicked a few of them 815 00:34:59.464 --> 00:35:02.467 and I've found some cool people that I have subscribed to. 816 00:35:02.800 --> 00:35:06.437 And so, like, I really appreciate YouTube kind of changing their algorithm 817 00:35:06.437 --> 00:35:08.072 to surface smaller people. 818 00:35:08.072 --> 00:35:10.875 The bad stuff, you know, is like kind of there, there. 819 00:35:10.875 --> 00:35:12.210 I mean, you could come up with something. 820 00:35:12.210 --> 00:35:16.214 I know that the that they have their AI summaries on a lot of YouTube videos, 821 00:35:16.214 --> 00:35:18.349 they're like rolling that out really inconsistently. 822 00:35:18.349 --> 00:35:22.019 I've seen it sometimes and it goes away and I'm like, is it just 823 00:35:22.019 --> 00:35:23.454 can you just decide what you're doing? 824 00:35:24.622 --> 00:35:27.024 Sammy: How do you feel about the AI overviews on videos? 825 00:35:27.024 --> 00:35:30.328 Adam: I've never, ever used an AI overview 826 00:35:30.761 --> 00:35:34.699 for a video or email, and they keep being like Gemini's summary of my emails. 827 00:35:34.699 --> 00:35:37.568 I'm like, I'm just going to read the email. Actually. 828 00:35:37.568 --> 00:35:38.703 Riley: We didnât even talk about that. 829 00:35:38.703 --> 00:35:41.939 Yeah , in Google, in Google Workspace. 830 00:35:41.939 --> 00:35:44.208 Doc sucks now 831 00:35:44.208 --> 00:35:45.476 I'm so annoyed. 832 00:35:45.510 --> 00:35:49.247 Adam: They moved they moved to the like right click button is a Gemini button now! 833 00:35:49.247 --> 00:35:49.780 Riley: is it? 834 00:35:49.780 --> 00:35:50.481 Adam: on docs yeah! 835 00:35:50.481 --> 00:35:52.083 Riley: the right click button? 836 00:35:52.083 --> 00:35:53.217 Adam: So like when you right click to like 837 00:35:53.217 --> 00:35:55.586 try and add a comment it pops up like a Gemini pop up. 838 00:35:55.586 --> 00:35:58.222 I don't know if they rolled it back or not because people were pissed 839 00:35:58.222 --> 00:35:59.724 If you highlighted it needs to have a pop up 840 00:35:59.724 --> 00:36:02.460 that would show you the comment button and those were fine. 841 00:36:02.460 --> 00:36:03.528 It's the only thing it's gone. 842 00:36:04.595 --> 00:36:06.197 I have to right click and go comment. 843 00:36:06.197 --> 00:36:07.198 Or press control alt N 844 00:36:07.198 --> 00:36:11.836 Riley: I don't think I maybe I disabled it, but I never had the like the add comment 845 00:36:11.836 --> 00:36:16.107 when I highlight something like but now I have the refine thing that shows up. 846 00:36:16.474 --> 00:36:20.011 every single time I highlight everything, it's the worst. 847 00:36:20.211 --> 00:36:23.281 And it's funny because when you go refine, you have to type more. 848 00:36:23.548 --> 00:36:26.083 You have to tell it what to do! 849 00:36:26.083 --> 00:36:27.151 Riley: Yeah but I just mean like, hey, 850 00:36:27.151 --> 00:36:30.588 sometimes you just highlight something because you want to copy and paste it. 851 00:36:30.988 --> 00:36:34.926 Stop asking me if I want the AI to summarize the four words 852 00:36:34.926 --> 00:36:36.394 that I just highlighted. 853 00:36:36.394 --> 00:36:37.929 How can we turn this around? 854 00:36:37.929 --> 00:36:39.096 Will Google turn around? 855 00:36:39.096 --> 00:36:42.133 Or is or should we abandon Google and try and like, say, 856 00:36:42.133 --> 00:36:43.467 everyone use something else? 857 00:36:43.467 --> 00:36:45.636 Adam: I don't know, have you tried using an alternative search engine? 858 00:36:45.636 --> 00:36:46.571 Riley: I haven't 859 00:36:46.571 --> 00:36:48.639 Adam: sucks even DuckDuckGo. 860 00:36:49.207 --> 00:36:50.508 Bing 861 00:36:50.508 --> 00:36:51.542 Riley: if I didn't, if. 862 00:36:51.542 --> 00:36:52.610 Adam: Bingâs good for ad for YouTube 863 00:36:52.610 --> 00:36:57.648 Riley: if basically my whole job wasn't googling stuff. 864 00:36:58.015 --> 00:37:01.552 Like Searching stuff and trying to find out what the news is, 865 00:37:01.552 --> 00:37:03.955 you know, what's the who's the original source for this? 866 00:37:03.955 --> 00:37:07.491 Like, if I didn't, I don't trust another 867 00:37:07.491 --> 00:37:10.661 search engine to be able to get me there consistently. 868 00:37:10.661 --> 00:37:14.298 I'm sure there's one that can I just haven't spent the time 869 00:37:14.699 --> 00:37:17.535 that is required to be able to trust another one. 870 00:37:17.535 --> 00:37:19.403 Adam: There's a good. One that starts with a K, I think, and it- 871 00:37:19.403 --> 00:37:20.371 Riley: Kagi. 872 00:37:20.371 --> 00:37:22.473 Adam: Yeah but it costs $15 a month or something. 873 00:37:22.473 --> 00:37:23.474 Riley: Yeah, that looks pretty decent. 874 00:37:23.474 --> 00:37:26.377 I've seen people say that like that's kind of the, 875 00:37:26.377 --> 00:37:28.145 that's one of the most promising. 876 00:37:28.145 --> 00:37:30.915 Adam: It's interesting because I think that like it could the, 877 00:37:30.915 --> 00:37:33.150 the quality of search could change drastically 878 00:37:33.150 --> 00:37:36.153 if Google gets broken up into its various constituent parts. 879 00:37:36.587 --> 00:37:37.888 Right. Like that's another. 880 00:37:37.888 --> 00:37:39.123 Riley: Yeah, that's something. 881 00:37:39.123 --> 00:37:43.995 Adam: Because Google has an absolutely insane level of control over the internet. 882 00:37:44.128 --> 00:37:44.829 They could just. 883 00:37:44.829 --> 00:37:47.832 Riley: Most popular browser, most popular smartphone operating system. 884 00:37:48.199 --> 00:37:49.734 Adam: Most popular search engine. 885 00:37:49.734 --> 00:37:54.672 So like all the content that is supported or surfaced basically goes through them. 886 00:37:54.672 --> 00:37:56.340 And they have 887 00:37:57.441 --> 00:37:58.276 somehow 888 00:37:58.276 --> 00:38:02.546 avoided like being very clear targets of anti-trust until very recently. 889 00:38:02.546 --> 00:38:05.750 Riley: Until like the past year, it's it's pretty exciting. 890 00:38:05.750 --> 00:38:06.784 I mean, like, I'm not saying that 891 00:38:06.784 --> 00:38:09.920 I want to break them up or whatever, but like maybe also, Yeah, maybe 892 00:38:09.954 --> 00:38:10.454 break them up. 893 00:38:10.454 --> 00:38:10.955 I don't know 894 00:38:10.955 --> 00:38:12.423 Adam: I think so, 895 00:38:12.423 --> 00:38:16.260 Riley: it seems to be one of the only solutions that will actually do anything 896 00:38:16.427 --> 00:38:18.062 Adam: I think one of the biggest issues that feels like 897 00:38:18.062 --> 00:38:20.998 Google has not made something cool or good for a long time, 898 00:38:20.998 --> 00:38:23.634 like they used to make cool things that were good, 899 00:38:23.634 --> 00:38:26.637 like look at Google Maps or Google Earth 900 00:38:26.971 --> 00:38:28.205 Very cool. 901 00:38:28.205 --> 00:38:29.240 Good products. 902 00:38:29.240 --> 00:38:32.877 they slowly have been making it worse by having it like you're navigating. 903 00:38:32.877 --> 00:38:35.179 And then you have to see like West Group insurance. 904 00:38:35.179 --> 00:38:36.380 Advertisement. 905 00:38:36.380 --> 00:38:40.051 Riley: I don't know how much of that is like, oh, Google is worse than they were. 906 00:38:40.051 --> 00:38:43.921 And how much of it is like, you know, the internet is just kind of a mature 907 00:38:43.921 --> 00:38:45.790 ecosystem now. It's like, what? 908 00:38:45.790 --> 00:38:48.826 Adam: I mean, I can't imagine how much it cost to run Google Maps. 909 00:38:48.826 --> 00:38:49.527 Riley: Oh my gosh. 910 00:38:49.527 --> 00:38:51.062 Store... 911 00:38:51.062 --> 00:38:53.464 Photos from like, every street in the world. 912 00:38:53.464 --> 00:38:57.034 Riley: This is the whole problem with all of these tech giants 913 00:38:57.368 --> 00:39:02.673 is that at the end of the day, we do rely on their services and they are useful. 914 00:39:02.673 --> 00:39:04.942 Google maps is very useful. 915 00:39:04.942 --> 00:39:05.776 Adam: Nationalize it 916 00:39:07.545 --> 00:39:10.948 like like if you think about it, if there's some if there's some service. 917 00:39:10.948 --> 00:39:12.016 I know that this sounds very... 918 00:39:12.016 --> 00:39:13.384 Riley: Yeah weâre gonna get political 919 00:39:13.384 --> 00:39:14.685 Adam: Iâm gonna get hyper commie real quick 920 00:39:14.685 --> 00:39:17.288 But if there are some services that private companies 921 00:39:17.288 --> 00:39:20.391 have inserted themselves in to the point where it's become, 922 00:39:20.624 --> 00:39:23.961 they're basically essential for modern operation. 923 00:39:23.961 --> 00:39:26.964 It makes sense to try and remove the middleman 924 00:39:27.331 --> 00:39:30.101 who is extracting dollars from the the two sides. 925 00:39:30.101 --> 00:39:30.901 Right? 926 00:39:30.901 --> 00:39:33.704 Like Google's taking money from the government and from the people. 927 00:39:33.704 --> 00:39:37.308 Riley: The problem with that real quick, we can't get into this. 928 00:39:37.308 --> 00:39:40.111 Adam: No, we've got three minutes. Let's get into that. 929 00:39:40.111 --> 00:39:43.280 Riley: A lot of the time when you nationalize something like that, 930 00:39:43.981 --> 00:39:45.616 the management is just worse. 931 00:39:45.616 --> 00:39:46.484 Adam: Absolutely. 932 00:39:46.484 --> 00:39:49.754 And so, you know, our Canadian health care system is a great example. 933 00:39:49.754 --> 00:39:52.423 Everyone loves our our health care system. 934 00:39:52.423 --> 00:39:54.091 And I love our healthcare system. 935 00:39:54.091 --> 00:39:57.061 When it works, a lot of the time it doesn't work. 936 00:39:57.495 --> 00:39:59.997 And I think, you know. 937 00:39:59.997 --> 00:40:02.433 Again, we can't get into it. 938 00:40:02.433 --> 00:40:03.401 Sammy: We've gone political 939 00:40:05.503 --> 00:40:07.037 Riley: But it's certainly a solution. 940 00:40:07.037 --> 00:40:10.541 It's certainly if there was a way to nationalize a, 941 00:40:11.409 --> 00:40:15.146 big service like that that is used by millions of people 942 00:40:15.479 --> 00:40:19.517 and have it retain its usefulness and not like, degrade due to, 943 00:40:19.950 --> 00:40:22.653 you know, a lack of resources or a lack of people who know 944 00:40:22.653 --> 00:40:25.322 what the hell they're doing, then yeah, absolutely. 945 00:40:25.322 --> 00:40:26.724 I'm on board. 946 00:40:26.724 --> 00:40:32.296 We need a philosopher king who will lead the revolution and redistribute. 947 00:40:32.296 --> 00:40:34.498 Adam: And that's gonna be me. 948 00:40:34.498 --> 00:40:36.667 That's gonna be me. 949 00:40:36.667 --> 00:40:38.536 Sammy: Vote Adam, as world leader 950 00:40:38.536 --> 00:40:39.904 Adam: no, I that's too much responsibility. 951 00:40:40.204 --> 00:40:42.840 It's scary. Why would we mess it up big time? 952 00:40:42.840 --> 00:40:46.610 Riley: What a great way to end this Google video 953 00:40:46.610 --> 00:40:48.312 Viva la revolution! 954 00:40:48.312 --> 00:40:49.146 Adam: Yeah! 955 00:40:49.146 --> 00:40:49.713 Riley: Cien. 956 00:40:49.713 --> 00:40:51.482 Adam: Just eat the rich, 957 00:40:51.482 --> 00:40:52.349 Riley: okay 958 00:40:52.349 --> 00:40:54.351 Riley:What do you mean? Okay? 959 00:40:54.351 --> 00:40:56.153 Riley: Okay. No, how do we actually end it? 960 00:40:56.153 --> 00:40:57.421 How do we actually end it? 961 00:40:57.421 --> 00:40:58.155 Sammy: Thatâs how weâre ending it 962 00:40:58.155 --> 00:40:58.956 Riley: You have to leave! 963 00:40:58.956 --> 00:40:59.790 Sammy: Nope thatâs how weâre ending it! 964 00:40:59.790 --> 00:41:01.125 Adam: Are you still on the eat the rich side 965 00:41:01.125 --> 00:41:02.593 If you have to kill people you like. 966 00:41:02.593 --> 00:41:03.394 Sammy: All right, we're ending it there!","speaker":null,"is_sponsor":0}],"full_text":"Adam: I don't think many companies f***** up Google 2 00:00:01.968 --> 00:00:05.138 other companies could have done more f****** up of Google. 3 00:00:05.672 --> 00:00:07.841 If Google got more f***** up there'd be less problems. 4 00:00:07.841 --> 00:00:09.342 , if they just had if, you know, 5 00:00:09.342 --> 00:00:12.545 if they if Google got their legs broken a couple times as a kid. 6 00:00:12.579 --> 00:00:13.079 Adam: 7 00:00:13.079 --> 00:00:14.547 Riley: . They'd, they'd be more humble 8 00:00:14.547 --> 00:00:16.249 Sammy: [Save this for the video] 9 00:00:16.249 --> 00:00:17.717 Riley: No just use that. That's the intro. 10 00:00:17.717 --> 00:00:18.785 Sammy: No, no. we canât ! 11 00:00:18.785 --> 00:00:19.986 Riley: That's how the video starts. 12 00:00:19.986 --> 00:00:21.888 Adam: You know Google it kind of sucks now. 13 00:00:21.888 --> 00:00:24.157 And it's crazy because Google used to be really, 14 00:00:24.157 --> 00:00:26.526 really cool when they started doing stuff with Android. 15 00:00:26.526 --> 00:00:29.462 Google Maps Street View is like a revolutionary, 16 00:00:29.462 --> 00:00:33.266 but now it seems like every single update that Google pushes out is just to either 17 00:00:33.266 --> 00:00:34.267 cram more ads, 18 00:00:34.267 --> 00:00:37.303 or to take away features, or to kill something that you already like. 19 00:00:37.303 --> 00:00:40.473 Riley: , I don't know if many people remember this, but when Google started, 20 00:00:40.473 --> 00:00:43.943 they had like the whole, disclaimer or message 21 00:00:43.943 --> 00:00:48.048 on their on their site being like, hey, we're a new type of search engine, 22 00:00:48.048 --> 00:00:51.985 no ads, no weather, no stocks or whatever, just pure search. 23 00:00:52.719 --> 00:00:54.988 That's that's how Google started with it. 24 00:00:54.988 --> 00:00:56.656 And it's like, wait, what? 25 00:00:56.656 --> 00:00:57.857 That's what Google is now. 26 00:00:57.857 --> 00:00:59.192 Adam: And if you go back far enough, 27 00:00:59.192 --> 00:01:02.195 I know that there's like an interview with one of the two founders 28 00:01:02.529 --> 00:01:05.632 where they were saying that like, oh, they'll never do advertising on Google, 29 00:01:05.632 --> 00:01:07.167 because once you start doing advertising 30 00:01:07.167 --> 00:01:09.169 in a search engine, you've defeated its usefulness. 31 00:01:09.169 --> 00:01:12.405 Riley: That's like, , that's that's like a broader arc 32 00:01:12.405 --> 00:01:16.042 of like the the bad path that Google has gone down. 33 00:01:16.042 --> 00:01:19.379 But I feel like in the past, you know, three or 4 or 5 years, 34 00:01:19.746 --> 00:01:23.883 people have also just identified a like distinct ways 35 00:01:23.883 --> 00:01:29.789 in which Google has gotten worse to use because I feel like pre, I don't know, 36 00:01:29.789 --> 00:01:34.360 2018, 2019, pre-COVID, it was like it's still basically just Google. 37 00:01:34.761 --> 00:01:38.465 Like they experimented with having, you know, personalized web pages and stuff. 38 00:01:38.465 --> 00:01:39.766 I don't know if you ever like signed- 39 00:01:39.766 --> 00:01:42.802 I think they had like iGoogle they had they had a thing called, 40 00:01:42.802 --> 00:01:44.270 I think it was called iGoogle 41 00:01:44.270 --> 00:01:45.805 Adam: I had Google Plus. 42 00:01:45.972 --> 00:01:47.474 Riley: . That was the Social network. 43 00:01:47.474 --> 00:01:48.975 Adam: I was super stoked on that. 44 00:01:48.975 --> 00:01:52.045 Riley: So like if you signed into Google you could have, you know, widgets 45 00:01:52.045 --> 00:01:53.813 and stuff up on Google when you got there 46 00:01:53.813 --> 00:01:57.117 and it would tell you like breaking news and and weather and stuff. 47 00:01:57.117 --> 00:01:59.586 But then they took it away. They're just like, no, we're not doing this. 48 00:01:59.586 --> 00:02:01.187 We're going back to normal Google Pay. 49 00:02:01.187 --> 00:02:03.223 Adam: I mean, that's just what they always do, right? 50 00:02:03.223 --> 00:02:04.257 Adam: They just kill everything 51 00:02:04.257 --> 00:02:04.891 Riley: They just took it away. 52 00:02:04.891 --> 00:02:06.159 Adam: Like, oh, that's a cool idea. 53 00:02:06.159 --> 00:02:06.826 Riley: you know what would be- 54 00:02:06.826 --> 00:02:11.097 oh- you know what be much better is if we just, 55 00:02:11.097 --> 00:02:14.701 came up with a robot that will search the web for you. 56 00:02:14.701 --> 00:02:16.836 Adam: , and it'll cut down 57 00:02:16.836 --> 00:02:20.940 trees in the Amazon to throw coals that go into the furnace that powers it. 58 00:02:21.040 --> 00:02:22.542 Riley: Oh, man. 59 00:02:22.542 --> 00:02:25.178 Adam: Let's kind of go back, because I think that a lot of this can be 60 00:02:25.178 --> 00:02:30.316 tied into the overall, like centralization of the web. 61 00:02:31.050 --> 00:02:35.655 Because back in the day, we used to have, oh, you'd have a website. 62 00:02:35.989 --> 00:02:38.158 I mean, you used to have to have a destination. 63 00:02:38.158 --> 00:02:39.692 Stuff didn't go to you, right? 64 00:02:39.692 --> 00:02:44.097 Like now, before it was like, I'm going to go to my friends at web site 65 00:02:44.297 --> 00:02:48.902 or I will go to a news website, just actual specific locations. 66 00:02:49.469 --> 00:02:50.203 Adam: You had to go to. 67 00:02:50.203 --> 00:02:53.106 Riley: And they all linked to each other and it was nice. 68 00:02:53.106 --> 00:02:54.140 That was a whole like 69 00:02:54.140 --> 00:02:57.510 Riley: you could go on a hyperlink rabbit hole through different websites. 70 00:02:57.810 --> 00:02:58.711 Riley and Adam: . 71 00:02:58.711 --> 00:03:02.282 Adam: And now it all just gets centralized which is interesting because I remember 72 00:03:02.282 --> 00:03:04.918 do you ever do your using like StumbleUpon. Did you ever use that? 73 00:03:04.918 --> 00:03:05.785 Riley: I never used it. 74 00:03:05.785 --> 00:03:08.421 Adam: So for people who don't know it was a web extension. 75 00:03:08.421 --> 00:03:11.958 Basically you just click that and it would send you to a random website 76 00:03:11.958 --> 00:03:15.862 from a relatively curated list of, like, interesting websites 77 00:03:15.862 --> 00:03:17.096 that would completely not work 78 00:03:17.096 --> 00:03:18.731 Riley: Like Reddit, but different. 79 00:03:18.731 --> 00:03:21.768 Adam: It's like, Reddit, but you press the like a random button every 80 00:03:21.768 --> 00:03:23.836 Time and it would send you to a web page. 81 00:03:23.836 --> 00:03:26.372 But now there's no web pages, right? 82 00:03:26.372 --> 00:03:29.042 You just go to your social media service. 83 00:03:29.042 --> 00:03:30.243 Riley: So now we're like, , 84 00:03:30.243 --> 00:03:33.012 everything is kind of siloed within these little kingdoms. 85 00:03:33.012 --> 00:03:35.081 There's the Facebook kingdom, Twitter Kingdom, 86 00:03:36.316 --> 00:03:37.116 Adam: The Reddit 87 00:03:37.116 --> 00:03:39.419 Riley: those are the two. 88 00:03:39.419 --> 00:03:42.422 But that's kind of like a broader problem with the web in general. 89 00:03:42.956 --> 00:03:46.593 But I feel like if we look at just at Google, obviously it's made it harder 90 00:03:46.593 --> 00:03:50.763 for Google to be this useful resource where you kind of 91 00:03:50.763 --> 00:03:53.866 are finding all of these fun little things across the internet. 92 00:03:53.900 --> 00:03:56.669 You can still be that, like, I can still have that experience. 93 00:03:56.669 --> 00:03:58.304 You know, when you Google something 94 00:03:58.304 --> 00:04:02.609 a little random and you go down the results pages, you can find something. 95 00:04:02.609 --> 00:04:06.879 It's just harder because you have to sift through a bunch of stuff first. 96 00:04:07.146 --> 00:04:12.285 I feel like the the beginning of when Google really started to go 97 00:04:12.285 --> 00:04:16.022 wrong was placing such a heavy emphasis 98 00:04:16.022 --> 00:04:19.125 on, well, it wasn't even Google's fault really. 99 00:04:19.125 --> 00:04:21.494 It was just SEO, search engine optimization, 100 00:04:21.494 --> 00:04:24.497 and the kind of arms race that came up about that. 101 00:04:24.664 --> 00:04:25.765 Adam: How did that occur? 102 00:04:25.765 --> 00:04:27.767 Because my understanding was that originally 103 00:04:27.767 --> 00:04:31.471 Google's like their big trick to having a good search 104 00:04:31.471 --> 00:04:35.575 engine was by using how often other sites were linked to, 105 00:04:36.342 --> 00:04:40.680 as a way of weighting the quality, whereas I feel like that's not the case anymore. 106 00:04:40.680 --> 00:04:44.017 Riley: You sound like you maybe have done more research on this than I have because I, 107 00:04:44.050 --> 00:04:46.219 I'm not. I'm not sure, but that sounds right. 108 00:04:46.219 --> 00:04:46.586 I think. 109 00:04:46.586 --> 00:04:49.589 I think that is the case for the early days. 110 00:04:49.956 --> 00:04:54.127 And then I guess at some point they changed the system. 111 00:04:54.127 --> 00:04:56.296 We shouldn't talk about this as if we know. 112 00:04:56.296 --> 00:04:57.530 We havent done the research. 113 00:04:57.530 --> 00:04:58.231 Adam: This is not that research. 114 00:04:58.231 --> 00:05:00.099 Riley: This is an impromptu rant. 115 00:05:00.099 --> 00:05:04.270 But what I do know is that SEO at some point became the norm. 116 00:05:04.270 --> 00:05:08.941 Where you're ranking in the search results is now more to do with how 117 00:05:08.941 --> 00:05:12.545 well you have optimized keywords and presentation or whatever. 118 00:05:12.545 --> 00:05:14.247 And like order of things, 119 00:05:14.247 --> 00:05:18.451 website owners tuned their websites specifically to show up. 120 00:05:18.451 --> 00:05:20.386 Well, in the algorithm. 121 00:05:20.386 --> 00:05:23.389 And that has it's become it's screwed everything up. 122 00:05:23.389 --> 00:05:23.990 Adam: Oh, absolutely. 123 00:05:23.990 --> 00:05:25.291 Riley: Because that's that's slop. 124 00:05:25.291 --> 00:05:26.592 That's how you get slop. 125 00:05:26.592 --> 00:05:27.327 Adam: . 126 00:05:27.327 --> 00:05:29.062 Riley: And that's where we find ourselves today. 127 00:05:29.062 --> 00:05:33.666 Adam: Google must realize that their ambitions into AI is self-defeating right now. 128 00:05:33.666 --> 00:05:35.735 Riley: Here we go. Give me a take. 129 00:05:35.735 --> 00:05:38.371 Adam: Well everyone complains about Google. 130 00:05:38.371 --> 00:05:40.039 It just surfaces slop. 131 00:05:40.039 --> 00:05:43.776 And they are investing so heavily into the greatest slot production 132 00:05:43.776 --> 00:05:46.045 machine ever, ever made. Right? 133 00:05:46.045 --> 00:05:49.549 I don't think that like, AI actually 134 00:05:49.549 --> 00:05:53.353 has a substantial threat to art or what 135 00:05:53.353 --> 00:05:57.790 what it actually is going to be warping is the ability to create sellable content. 136 00:05:57.790 --> 00:06:02.628 I no longer have to write a blog post about the chili that goes 137 00:06:02.628 --> 00:06:06.766 before the chili recipe, so that I show up at the top of the Google search results. 138 00:06:06.766 --> 00:06:08.835 I can just have the AI do it. 139 00:06:08.835 --> 00:06:11.304 No one cares about those blog posts. 140 00:06:11.304 --> 00:06:15.174 No one cares about the AI slop because the product 141 00:06:15.174 --> 00:06:18.544 is really the recipe and it's just a way of finding your way at the top. 142 00:06:18.578 --> 00:06:19.245 Riley: That's like. 143 00:06:19.245 --> 00:06:23.416 Those recipes are a great example of how SEO screwed things up. 144 00:06:23.416 --> 00:06:24.484 , so what are you saying? 145 00:06:24.484 --> 00:06:30.089 You're saying that Google is, creating the circumstances of their own demise? Yes. 146 00:06:30.089 --> 00:06:33.092 Adam: They're absolutely just like, what if we just made it 147 00:06:33.126 --> 00:06:35.895 profoundly easier to generate crap? 148 00:06:35.895 --> 00:06:39.031 Like, you can just ask here, I just, I just 149 00:06:39.031 --> 00:06:43.236 instead of making this video, I just asked, Gemini, 150 00:06:43.236 --> 00:06:45.371 why does Google search suck? Now? 151 00:06:45.371 --> 00:06:46.406 Riley: Here we go. 152 00:06:46.406 --> 00:06:48.808 Adam: And it's given me tons of content. 153 00:06:48.808 --> 00:06:53.045 Riley: , we could just like put on the voice synthesizer and just play that. 154 00:06:53.045 --> 00:06:54.914 And people could like, watch that instead. 155 00:06:54.914 --> 00:06:56.749 In some ways it would be more focused. 156 00:06:56.749 --> 00:06:58.584 Adam: Probably, it would probably be 157 00:06:58.584 --> 00:07:02.822 Riley: Some ways it would probably provide more, you know, pertinent information. 158 00:07:03.289 --> 00:07:03.890 Adam: Probably. 159 00:07:03.890 --> 00:07:06.092 Riley: But people are coming here for the good stuff. 160 00:07:06.092 --> 00:07:07.126 Adam: . 161 00:07:07.126 --> 00:07:09.595 And the good stuff is not pertinent information. 162 00:07:09.595 --> 00:07:14.066 Riley: The slot problem is I don't see a solution to it right now, 163 00:07:14.066 --> 00:07:18.638 because now that these tools are out, the AI generating tools, 164 00:07:19.572 --> 00:07:20.673 are out in the public. 165 00:07:20.673 --> 00:07:23.743 Anyone can make any AI slop and post online. 166 00:07:23.743 --> 00:07:24.544 That is just 167 00:07:24.544 --> 00:07:29.549 I don't know how you stop that from totally polluting the search experience. 168 00:07:29.549 --> 00:07:32.385 Like, I think that there are a bunch of problems with Google search 169 00:07:32.385 --> 00:07:35.922 that, you know, we haven't even touched on, like the ads being a problem and, 170 00:07:36.255 --> 00:07:39.992 like the actual AI overview, like, when you, when you do a search. 171 00:07:39.992 --> 00:07:43.663 But like in terms of like the results and what the internet state of the internet 172 00:07:43.663 --> 00:07:46.833 is, the fact that it's being polluted by all this slop. 173 00:07:46.833 --> 00:07:48.100 I don't know how you can fix that 174 00:07:48.100 --> 00:07:52.472 with the other stuff, like ads where scammers are paying 175 00:07:52.805 --> 00:07:56.642 to be the first sponsored result when you Google something. 176 00:07:56.976 --> 00:08:00.179 I watched a great video on this by, Tunnel Vision, the YouTube channel. 177 00:08:00.179 --> 00:08:01.948 I think he tried to contact the 178 00:08:01.948 --> 00:08:05.451 the people who are buying those ads and ask them like what the deal is, 179 00:08:05.451 --> 00:08:07.053 and they're just like, that's what we got to do. 180 00:08:07.053 --> 00:08:08.221 I mean, like, the system is there. 181 00:08:08.221 --> 00:08:08.788 So like, 182 00:08:08.788 --> 00:08:11.791 you could take away that ability, you could take away that feature 183 00:08:11.791 --> 00:08:12.558 and make that better. 184 00:08:12.558 --> 00:08:13.793 You could take away the 185 00:08:13.793 --> 00:08:17.630 AI overviews feature and make it so that, like the web search on Google is 186 00:08:17.830 --> 00:08:18.598 is the default. 187 00:08:18.598 --> 00:08:19.565 But what do you do about the fact 188 00:08:19.565 --> 00:08:22.568 that, like, everyone is generating AI slop and putting it online now, 189 00:08:22.702 --> 00:08:25.938 you can't tell the search engine filter out AI images. 190 00:08:25.938 --> 00:08:27.573 It doesn't know what's an AI image or not. 191 00:08:27.573 --> 00:08:29.909 I mean, and if we put metadata in, maybe, 192 00:08:29.909 --> 00:08:32.245 but not everyone's going to put the metadata in the images. 193 00:08:32.245 --> 00:08:36.148 Adam: Also, if you're Google and you go, we just introduced this killer new feature. 194 00:08:36.182 --> 00:08:38.284 It gets rid of all the AI you don't want. 195 00:08:38.284 --> 00:08:41.554 You're just like telling your investors that people hate this stuff. 196 00:08:41.554 --> 00:08:43.422 Riley: , exactly. The level to which, 197 00:08:44.624 --> 00:08:47.793 using Google sucks now is only going to get worse. 198 00:08:47.927 --> 00:08:49.896 This is the lowest it's going to be. 199 00:08:49.896 --> 00:08:51.931 It's going to just be 200 00:08:51.931 --> 00:08:54.100 worse and worse and worse. I don't know. 201 00:08:54.100 --> 00:08:56.435 Adam: They're in this situation where they're at like a party. 202 00:08:56.435 --> 00:08:56.869 Riley: . 203 00:08:56.869 --> 00:08:59.872 Adam: And they realize that they currently have diarrhea. 204 00:08:59.872 --> 00:09:01.140 Riley: I'm hearing you out. Okay. 205 00:09:01.140 --> 00:09:04.644 Adam: And what would really solve the problem is if they just said, 206 00:09:05.311 --> 00:09:06.879 oh, God, I have diarrhea. 207 00:09:06.879 --> 00:09:09.048 Please excuse me for a moment. 208 00:09:09.048 --> 00:09:12.051 But if they admitted that they had diarrhea, 209 00:09:12.151 --> 00:09:14.654 everyone would be like, you're the diarrhea guy. 210 00:09:14.654 --> 00:09:16.622 Riley: . kills the vibe! 211 00:09:16.622 --> 00:09:20.226 Adam: Youâve ruined dinner, and then they would never get invited again. 212 00:09:20.259 --> 00:09:21.928 They're in this position where, like, 213 00:09:21.928 --> 00:09:25.731 they can't admit that what they're doing is actually, like, causing them problems. 214 00:09:26.132 --> 00:09:27.900 Well, and I don't think they 215 00:09:27.900 --> 00:09:30.603 even if they could do that, I don't think they want to. 216 00:09:30.603 --> 00:09:32.605 Like, I don't want to say that AI is okay. 217 00:09:32.605 --> 00:09:35.975 First of all, we're passionate fashion in on it, on on AI a lot. 218 00:09:35.975 --> 00:09:37.743 And I think we're going to continue to do that. 219 00:09:37.743 --> 00:09:38.978 But I do want to say 220 00:09:38.978 --> 00:09:42.315 there are some instances where AI is very useful and good. 221 00:09:43.115 --> 00:09:46.752 For instance, one of the ways that Google search has improved, that 222 00:09:46.752 --> 00:09:51.223 I'll say up front is with the multimodal search and like, lens. 223 00:09:51.557 --> 00:09:53.125 , Google Lens. Absolutely. 224 00:09:53.125 --> 00:09:56.462 My parents were using that to like, oh what is this? 225 00:09:56.462 --> 00:09:57.897 And they like learn how to use lens 226 00:09:57.897 --> 00:09:59.966 to like identify something and it is accurate. 227 00:09:59.966 --> 00:10:00.900 It's really good. 228 00:10:00.900 --> 00:10:04.036 We have these, animal toys by a company called Schleich. 229 00:10:04.036 --> 00:10:07.039 It's a it's a toy of an armadillo. 230 00:10:07.273 --> 00:10:09.709 And you take a picture of it and it's like, it's impressive enough 231 00:10:09.709 --> 00:10:12.478 that Google would be able to say, oh, that's an armadillo, 232 00:10:12.478 --> 00:10:17.083 but it knows it's a Schleich model of an armadillo, 233 00:10:17.283 --> 00:10:21.387 and it directs you to a bunch of different websites that sell 234 00:10:21.387 --> 00:10:24.590 that particular toy with the like, correct skew and everything. 235 00:10:24.590 --> 00:10:26.492 And it's like, okay, that's really cool. 236 00:10:26.492 --> 00:10:27.627 Adam: I like how that's like the most 237 00:10:28.861 --> 00:10:30.696 Every single, every 238 00:10:30.696 --> 00:10:34.767 single demo of AI is like, and here's how you can buy something with it. 239 00:10:34.767 --> 00:10:35.868 You like that thing. 240 00:10:35.868 --> 00:10:37.970 Wouldn't it be great if you could buy it faster? 241 00:10:37.970 --> 00:10:39.038 Riley: , . 242 00:10:39.038 --> 00:10:43.609 But I will say that like, you know, Apple had that like plant identifying app 243 00:10:43.609 --> 00:10:45.678 or whatever. I mean, what else is going to show you? 244 00:10:45.678 --> 00:10:46.412 Riley: It gives you 245 00:10:46.412 --> 00:10:47.246 Adam: Where to buy the plant 246 00:10:47.246 --> 00:10:49.682 Riley: It gives you the Wikipedia and like other things. 247 00:10:49.682 --> 00:10:53.252 But you know what else is going to be when you search this plant, I guess. 248 00:10:53.252 --> 00:10:53.853 What are you saying? 249 00:10:53.853 --> 00:10:56.122 In a perfect world, they would give you like scientific studies about the plant 250 00:10:56.122 --> 00:10:57.356 Adam: I don't know what it would do. 251 00:10:57.356 --> 00:11:00.726 Like if we it's just it's funny because like the use cases 252 00:11:00.726 --> 00:11:02.928 that are always demonstrated for AI are always like. 253 00:11:02.928 --> 00:11:07.233 And this is how you can like that's think less and buy more sometimes. 254 00:11:07.233 --> 00:11:10.403 Riley: I mean sometimes the demos are like, oh, find out 255 00:11:10.403 --> 00:11:14.140 when your mom's flight is coming in and send her a message for it. 256 00:11:14.140 --> 00:11:14.874 Adam: Except it can't do that yet! 257 00:11:14.874 --> 00:11:17.476 it still sucks at all that I want it to be. I want it to be 258 00:11:17.476 --> 00:11:19.412 I want it to be Siri demo. And it was , know. 259 00:11:19.412 --> 00:11:21.347 Adam: Like that's the part that blows my mind 260 00:11:21.347 --> 00:11:25.317 is that they keep going like oh here's like this handy new AI feature. 261 00:11:25.317 --> 00:11:29.922 Yet I still can't ask my like Google Home to do more than one thing at a time. 262 00:11:29.989 --> 00:11:33.993 , I can't be like, turn off the lights and put a reminder in. 263 00:11:33.993 --> 00:11:35.928 It just goes like, I don't, I don't know. 264 00:11:35.928 --> 00:11:38.597 Have you noticed on your Nest Hub that you used to be able to say, like, 265 00:11:38.597 --> 00:11:41.233 never mind when it kept talking to you when it you didn't hear it? 266 00:11:41.233 --> 00:11:43.569 And now if you do that, it just goes. 267 00:11:43.569 --> 00:11:47.039 Never mind is an album by Nirvana from 1995. 268 00:11:47.073 --> 00:11:48.941 Riley: I can still say never mind. 269 00:11:48.941 --> 00:11:52.878 Adam: It always just just tells me about Nirvana and it pisses me off so much. 270 00:11:54.680 --> 00:11:57.917 Riley: I don't know, I feel like I've maybe I've been desensitized to it 271 00:11:57.917 --> 00:11:59.885 because my kid always goes up to the Nest Hub 272 00:11:59.885 --> 00:12:02.888 and says, I'll just activate everyone saying, Hey Google, 273 00:12:02.888 --> 00:12:05.891 what does a Spinosaurus sound like? 274 00:12:05.958 --> 00:12:08.794 And then it's like, doesn't miss, doesn't hear him properly. 275 00:12:08.794 --> 00:12:10.463 And it just like has something else. 276 00:12:10.463 --> 00:12:12.164 And so I'm like, okay. 277 00:12:12.164 --> 00:12:14.567 When it gets it, I'm like, good job, Google. 278 00:12:14.567 --> 00:12:17.136 You know, it's like sometimes I don't know what my kid's saying. 279 00:12:17.136 --> 00:12:19.105 So like the fact that you pick that up great 280 00:12:19.105 --> 00:12:20.706 Adam: Itâs interesting how- do you find that 281 00:12:20.706 --> 00:12:23.909 you have far less patience for a machine than you do for a person? 282 00:12:23.909 --> 00:12:26.912 Like if I, if I was like, hey, could you, could you order pizza? 283 00:12:27.179 --> 00:12:28.147 And they went, what? 284 00:12:28.147 --> 00:12:30.416 I'd be like, can you order pizza? You know, 285 00:12:31.550 --> 00:12:34.386 but the moment my Nest Hub is like, 286 00:12:34.386 --> 00:12:37.757 I, you have to log in, you have to recognize your voice to do that. 287 00:12:37.757 --> 00:12:38.758 I was like you, mother- 288 00:12:38.758 --> 00:12:41.560 Riley: Why are you trying to order pizza through your Nest Hub? 289 00:12:41.560 --> 00:12:42.661 Adam: Iâm not, no, that's just an example. 290 00:12:42.661 --> 00:12:43.262 Because, like, 291 00:12:43.262 --> 00:12:46.365 you can ask a human to do normal stuff and you have, like, sympathy for them. 292 00:12:46.365 --> 00:12:50.436 But the moment my robot gets something slightly wrong, I am infuriated. 293 00:12:50.436 --> 00:12:50.970 Riley: Well, let's be good. 294 00:12:50.970 --> 00:12:51.303 . 295 00:12:51.303 --> 00:12:53.072 I mean, this is an interesting problem 296 00:12:53.072 --> 00:12:56.575 with with AI in general and with with rolling it out in. 297 00:12:56.575 --> 00:12:58.878 It's kind of a half baked hallucination 298 00:12:58.878 --> 00:13:02.882 state to everyone's devices because it is amazing. 299 00:13:02.882 --> 00:13:06.218 Like, I feel like if you rewind, ChatGPT came out December 300 00:13:06.218 --> 00:13:09.488 2022, everyone was losing their minds about it because it's like, this is crazy. 301 00:13:09.488 --> 00:13:11.090 It it seems exactly like a human. 302 00:13:11.090 --> 00:13:14.026 You can talk to it like, and it's only gotten better since then, 303 00:13:14.026 --> 00:13:19.031 but now because it's so good, it's so close to being like talking to a human. 304 00:13:19.331 --> 00:13:21.934 We see all the things that is wrong with it immediately. 305 00:13:21.934 --> 00:13:23.769 You know, we're like, oh, it's passing the Turing test. 306 00:13:23.769 --> 00:13:25.404 Like, oh, blah blah is like, oh no, wait. 307 00:13:25.404 --> 00:13:28.407 Once you get a little bit used to it, you're like, oh no, the Gulf is 308 00:13:28.808 --> 00:13:31.577 the Gulf is massive because it's like close. 309 00:13:31.577 --> 00:13:34.380 And it's so much, so noticeable. 310 00:13:34.380 --> 00:13:37.383 Adam: When you just have like how many Rs are in strawberry and it like just. 311 00:13:37.383 --> 00:13:38.617 Riley: Well there's that stuff. 312 00:13:38.617 --> 00:13:41.587 There's just like you can be having a conversation with it, 313 00:13:41.587 --> 00:13:43.923 especially with these new like voice modes. 314 00:13:43.923 --> 00:13:48.160 I experimented with it like driving home from work a couple times. 315 00:13:48.160 --> 00:13:51.263 I'm like I just put on the voice mode and like pretend that I have 316 00:13:51.263 --> 00:13:54.366 that I'm giving somebody a ride and I'm like having a car conversation. 317 00:13:54.967 --> 00:13:57.736 And there are like moments at a time 318 00:13:57.736 --> 00:14:01.273 where it's like, okay, I kind of feel like I'm talking to somebody here. 319 00:14:01.273 --> 00:14:03.442 And then like, as soon as that happens, 320 00:14:03.442 --> 00:14:07.112 another thing comes up and it's like, oh, you, oh, this is what am I doing? 321 00:14:07.179 --> 00:14:08.280 I feel stupid. 322 00:14:08.280 --> 00:14:09.615 Iâm like what am I? 323 00:14:09.615 --> 00:14:12.852 I'm pretending to talk to a person, but it's like, obviously 324 00:14:12.852 --> 00:14:15.955 not a person and it doesn't understand the nuance of what I just said. 325 00:14:15.955 --> 00:14:17.456 Adam: I will say that the live feature 326 00:14:17.456 --> 00:14:20.125 is really cool, especially cause you can, like, interrupt it. 327 00:14:20.125 --> 00:14:23.128 like that's the best when you I mean, just like, no, you're you're off. 328 00:14:23.195 --> 00:14:25.497 Like you're wrong. 329 00:14:25.497 --> 00:14:28.000 Sometimes I ask you the question just to tell to shut up immediately. 330 00:14:28.000 --> 00:14:29.802 , well, actually shut up. 331 00:14:29.802 --> 00:14:31.937 You're not real. 332 00:14:31.937 --> 00:14:33.505 You're not a person like me 333 00:14:33.505 --> 00:14:36.675 Adam: Itâs true because It has like this, like this passivity. 334 00:14:36.675 --> 00:14:38.611 Like it won't take a stance on anything ever. 335 00:14:38.611 --> 00:14:39.945 Like it won't ever have an opinion. 336 00:14:39.945 --> 00:14:42.681 And it always is way more verbose than you need it to be. 337 00:14:42.681 --> 00:14:45.184 You're like, you're like, hey, hey, what's a strawberry? 338 00:14:45.184 --> 00:14:46.552 And I get like three paragraphs. 339 00:14:46.552 --> 00:14:48.120 I was like, I just needed to know that it was a fruit. 340 00:14:48.120 --> 00:14:49.455 My dog 341 00:14:49.455 --> 00:14:53.125 Riley: My main thing when I talk to these voice modes, I wanted to have a take. 342 00:14:53.359 --> 00:14:56.695 I want to, like, have a discussion and like, push back on me. 343 00:14:56.695 --> 00:14:59.565 But the thing is, it's AI it doesn't have a perspective. 344 00:14:59.565 --> 00:15:00.799 It doesn't have an opinion. 345 00:15:00.799 --> 00:15:03.969 You can make it have an opinion, but then it's like, what's the point 346 00:15:03.969 --> 00:15:05.738 Adam: Itâs just somebody else's opinion 347 00:15:05.738 --> 00:15:08.741 Riley: One thing about the slop issue I wanted to say before I move on, maybe, 348 00:15:08.941 --> 00:15:12.011 is we both do a lot of googling for our jobs, 349 00:15:12.544 --> 00:15:16.448 a lot of research and I feel like when I Google now, 350 00:15:16.582 --> 00:15:20.152 the internet is so full of I slop that I almost 351 00:15:20.152 --> 00:15:23.222 forget about it because I've learned to, like, tune it out. 352 00:15:23.656 --> 00:15:28.260 So, like Linus sent a link for that he wanted to talk about in the WAN Show, 353 00:15:28.260 --> 00:15:32.431 I saw the the website was called Glass Almanac is an AI slop site. 354 00:15:32.631 --> 00:15:34.800 It basically like. And I'd looked into the story. 355 00:15:34.800 --> 00:15:35.467 He was talking about. 356 00:15:35.467 --> 00:15:38.537 It's like, okay, this story that he sent me was 357 00:15:38.537 --> 00:15:41.707 an AI slop article based off another slop article 358 00:15:41.707 --> 00:15:45.144 that was based off a wired article that was like, 359 00:15:45.511 --> 00:15:48.647 not about the thing that Linus wanted to talk about. 360 00:15:48.914 --> 00:15:52.051 The point of this story is that I saw Glass Almanac and I was like, 361 00:15:52.051 --> 00:15:55.621 I know that's a AI slop site, so I like just tuned it out. 362 00:15:55.621 --> 00:15:59.491 So when I Google stuff, it's like I'm only seeing the little gems 363 00:15:59.491 --> 00:16:03.162 that are like the like glimpses of the real internet. 364 00:16:03.162 --> 00:16:07.366 But for everyone else who aren't us, when they Google 365 00:16:07.700 --> 00:16:09.335 and they go down the search results, 366 00:16:09.335 --> 00:16:12.438 they don't know how to tune out the AI slop for down. 367 00:16:12.438 --> 00:16:14.940 The slop is going to be the internet. 368 00:16:14.940 --> 00:16:16.642 Adam: Yeah, and it's like scary. 369 00:16:16.642 --> 00:16:17.509 It is scary. 370 00:16:17.509 --> 00:16:20.245 It's interesting 404 media did a big investigation about 371 00:16:20.245 --> 00:16:23.482 like how and why Facebook is so full of AI slop. 372 00:16:23.482 --> 00:16:26.618 They identified that Facebook is incentivizing this slop 373 00:16:26.952 --> 00:16:28.320 and they're paying people. 374 00:16:28.320 --> 00:16:31.423 Since then they've kind of been like, oh, we're going to turn down the, 375 00:16:31.724 --> 00:16:32.691 the the dial, turn down the basically. 376 00:16:32.691 --> 00:16:33.792 Adam: Turn down the sloppening of 377 00:16:33.792 --> 00:16:37.696 Riley: But like, you know, they were like posts on Facebook can be monetized. 378 00:16:37.930 --> 00:16:41.400 And so they're like, if you can get if you can make a page on Facebook 379 00:16:41.400 --> 00:16:43.235 and get a ton of traffic on it 380 00:16:43.235 --> 00:16:46.238 from the boomers clicking all this AI slop, you're going to make a lot of money. 381 00:16:46.672 --> 00:16:47.239 Adam: To add to that 382 00:16:47.239 --> 00:16:49.141 It's like it's it's this like House of cards, right? 383 00:16:49.308 --> 00:16:53.012 It's like if you have bots to boost your engagement, 384 00:16:53.245 --> 00:16:54.913 so you're getting all these fake impressions, 385 00:16:54.913 --> 00:16:58.283 then your ad revenue is worthless because you have fake behavior. 386 00:16:58.751 --> 00:17:00.652 Riley: Right. That's that's an issue too 387 00:17:00.652 --> 00:17:02.721 Adam: So then like who are they going to sell to. 388 00:17:02.721 --> 00:17:03.756 If people go like I won't, 389 00:17:03.756 --> 00:17:05.524 I don't want to sell them I say because you inflate your numbers 390 00:17:05.524 --> 00:17:09.161 by allowing bots to rampantly run on your website. 391 00:17:09.161 --> 00:17:10.429 Adam: And that's the business model 392 00:17:10.796 --> 00:17:14.867 Riley: The web crawling a war is something that's really interesting to me 393 00:17:15.300 --> 00:17:18.871 because yes, publishers and people who own websites 394 00:17:18.871 --> 00:17:22.941 want real visitors unless they make slop. 395 00:17:22.941 --> 00:17:24.910 Slop people don't care who's coming to their website. 396 00:17:24.910 --> 00:17:28.580 But like people who, you know, journalists want real visitors. 397 00:17:28.914 --> 00:17:32.418 CEO, Cloudflare, Matthew Prince recently, 398 00:17:33.152 --> 00:17:37.990 talked about the ratio of crawlers on a on a 399 00:17:38.023 --> 00:17:41.960 like a publisher website to actual human visitors 400 00:17:42.995 --> 00:17:44.363 has gone from like being like 401 00:17:44.363 --> 00:17:48.467 2 to 1 to like dozens of crawlers per. 402 00:17:48.467 --> 00:17:52.504 And it's even worse for the, other like, 403 00:17:53.072 --> 00:17:56.141 I think that was for Google and for OpenAI and anthropic. 404 00:17:56.141 --> 00:17:59.211 It was like thousands, hundreds of thousands per one visitor. 405 00:17:59.211 --> 00:18:02.448 well, they tried to solve it recently with the pay per crawl thing. 406 00:18:03.082 --> 00:18:04.083 Did you hear about that? 407 00:18:04.083 --> 00:18:04.716 Adam: No 408 00:18:04.716 --> 00:18:05.451 Riley: I'll bring it up. 409 00:18:05.451 --> 00:18:09.121 Enabling content owners to charge AI crawlers for access. 410 00:18:09.288 --> 00:18:10.122 So like. 411 00:18:10.122 --> 00:18:11.323 Adam: Who put this initiative? 412 00:18:11.323 --> 00:18:15.627 Riley: Cloudflare is actually trying to do something about this. 413 00:18:15.627 --> 00:18:17.729 And they're incentivized to because like, you know... 414 00:18:17.729 --> 00:18:19.198 Adam: it's their it's their servers. 415 00:18:19.198 --> 00:18:21.366 Riley: They're hosting this content and they don't want it. 416 00:18:21.366 --> 00:18:23.769 All the bandwidth taken up by by crawlers. 417 00:18:23.769 --> 00:18:25.070 I don't know how well it's going to work. 418 00:18:25.070 --> 00:18:28.707 There's probably going to be an arms race where the the AI companies 419 00:18:29.108 --> 00:18:30.109 find a way around it. 420 00:18:30.109 --> 00:18:34.646 Yeah, they're allowing websites to block AI crawlers by default and then charge 421 00:18:34.947 --> 00:18:36.915 the AI companies to be like, you want to crawl the web? 422 00:18:36.915 --> 00:18:39.318 Pay up. itâs just in beta right now. 423 00:18:39.318 --> 00:18:40.419 But we'll see where that goes. 424 00:18:40.419 --> 00:18:42.387 Adam: Okay. What the hell happened to YouTube search? 425 00:18:42.387 --> 00:18:43.922 I know this is a complete left turn. 426 00:18:43.922 --> 00:18:45.424 but have you tried 427 00:18:45.424 --> 00:18:48.427 searching something on YouTube and getting more than eight results total? 428 00:18:48.427 --> 00:18:51.797 Riley: Like yeah... I have mixed feelings about YouTube 429 00:18:51.830 --> 00:18:53.999 because I feel like YouTube has done a lot of things right. 430 00:18:53.999 --> 00:18:54.933 Let's talk about this. 431 00:18:54.933 --> 00:18:57.069 When you search something on YouTube, 432 00:18:57.069 --> 00:18:58.704 you're saying that you don't get what you want. 433 00:18:58.704 --> 00:19:00.239 Adam :So if you search something on YouTube, 434 00:19:00.239 --> 00:19:03.509 it'll give you about 10 to 12 actual results. 435 00:19:03.509 --> 00:19:04.243 And then it's before. 436 00:19:04.243 --> 00:19:06.712 It switches to random stuff, especially if it's something that's more niche. 437 00:19:06.712 --> 00:19:10.849 If you go to say like a Rav4 review, we're getting tons of relevant results. 438 00:19:10.849 --> 00:19:15.754 but if I go like, let's say like TOSlink adapter 439 00:19:15.754 --> 00:19:17.289 this is terrible itâs showing up 440 00:19:17.289 --> 00:19:18.390 It's not doing the right thing. 441 00:19:20.292 --> 00:19:20.993 Sometimes it's like 442 00:19:20.993 --> 00:19:24.229 Riley: Yeah 443 00:19:24.229 --> 00:19:24.696 Adam: here's six results and there's totally more there. 444 00:19:24.696 --> 00:19:29.067 But it just goes like ah, there I mean, you watch some other stuff instead. 445 00:19:29.067 --> 00:19:31.570 Riley: I find that that's the case with Google a lot. 446 00:19:31.970 --> 00:19:35.407 And I don't know, honestly, I don't know anymore whether 447 00:19:35.841 --> 00:19:39.511 Google is not giving me, a lot of results 448 00:19:39.511 --> 00:19:44.082 because there's not many results out there to, to find or whether it's just 449 00:19:44.850 --> 00:19:47.085 there's so much slop and it has to dig through all the slop 450 00:19:47.085 --> 00:19:49.054 and it can actually find the good stuff anymore. 451 00:19:49.054 --> 00:19:52.090 Sammy: Sorry, I just want to quickly interrupt because I also looked at the Toslink. 452 00:19:52.491 --> 00:19:56.195 Adam mentioned I got a product like ads for product like, 453 00:19:56.195 --> 00:19:59.398 so you can buy it one video and then I got shorts. 454 00:19:59.498 --> 00:20:01.500 So it's like the front page is already, like useless. 455 00:20:01.500 --> 00:20:02.801 I had to scroll down to find more 456 00:20:02.801 --> 00:20:03.569 Adam: my default for Toslink 457 00:20:03.569 --> 00:20:06.338 adapters sponsored, which is weird because it's on YouTube. 458 00:20:07.472 --> 00:20:09.007 And then it's the shorts 459 00:20:09.007 --> 00:20:11.977 Sammy: and then it's another ad and then it's actually like actual stuff, 460 00:20:11.977 --> 00:20:14.346 Adam: it's more shorts and then it's another ad 461 00:20:14.346 --> 00:20:15.714 Riley: Toslink adapter? 462 00:20:15.714 --> 00:20:19.284 Riley: So , I got ad two results ad. 463 00:20:19.284 --> 00:20:23.322 Shorts and then three results shorts again. 464 00:20:23.755 --> 00:20:25.090 Is it just the same shorts? 465 00:20:25.090 --> 00:20:26.858 Riley: Some of the shorts of the same. 466 00:20:26.858 --> 00:20:27.726 Adam: Please watch YouTube shorts. 467 00:20:27.726 --> 00:20:31.029 Riley: Yeah, I don't think I don't think shorts are annoying. 468 00:20:31.496 --> 00:20:32.431 By themselves. 469 00:20:32.431 --> 00:20:36.301 I think it's just annoying to have the actual results broken up so often. 470 00:20:36.301 --> 00:20:39.304 This is a great example of what it's like to use Google 471 00:20:39.304 --> 00:20:40.405 and the internet in general. 472 00:20:40.405 --> 00:20:42.975 Right now you have to filter everything out. 473 00:20:42.975 --> 00:20:45.844 And this is what I'm saying about normies they're going through. 474 00:20:45.844 --> 00:20:49.615 And this is why all of these, search ads, 475 00:20:49.615 --> 00:20:52.050 these are like, you know, sponsored results or whatever. 476 00:20:52.050 --> 00:20:53.552 It's why they get so many clicks 477 00:20:53.552 --> 00:20:57.322 because the normies don't know to not click on them like they're. 478 00:20:57.356 --> 00:21:00.792 I think they're learning, obviously, but like our parents are going through 479 00:21:01.026 --> 00:21:03.262 and they're probably learning to be like, oh, it's sponsored. 480 00:21:03.262 --> 00:21:04.162 And then they skip it. 481 00:21:04.162 --> 00:21:07.232 But like the vast majority of people, I think while they're using these things, 482 00:21:07.232 --> 00:21:08.600 Adam: No. They can't, tell 483 00:21:08.600 --> 00:21:10.002 Riley: First result is a sponsored result. 484 00:21:10.002 --> 00:21:13.005 They're probably clicking that the vast majority of the time 485 00:21:13.238 --> 00:21:15.440 because they're like, oh, that's the top result. 486 00:21:15.440 --> 00:21:16.675 It's probably the most useful thing. 487 00:21:16.675 --> 00:21:17.843 It's dystopian. 488 00:21:17.843 --> 00:21:20.846 Adam: It's really frustrating, especially when you like, 489 00:21:20.846 --> 00:21:23.849 like for tech support stuff where you're like, how to back up my PC 490 00:21:23.882 --> 00:21:27.319 and everything is just like SEO slop, but it's SEO slop from people who sell 491 00:21:27.986 --> 00:21:29.121 backup software, right? 492 00:21:29.121 --> 00:21:30.589 So it's like it's like pseudo helpful. 493 00:21:30.589 --> 00:21:34.359 Riley: Honestly, there have been a good many of those, like, it's a company 494 00:21:34.660 --> 00:21:37.195 and they have like a blog blaming something. 495 00:21:37.195 --> 00:21:39.931 âThe five different types of blah blah blahâ 496 00:21:39.931 --> 00:21:41.300 And they're explaining it to you. 497 00:21:41.300 --> 00:21:44.636 And then at the end they're like, and to solve that problem we have a product. 498 00:21:44.636 --> 00:21:45.137 Check it out. 499 00:21:45.137 --> 00:21:45.537 You know. 500 00:21:45.537 --> 00:21:46.071 Adam: Yeah and itâs 501 00:21:46.071 --> 00:21:47.739 Riley: Like to me the like I've actually had 502 00:21:47.739 --> 00:21:49.808 I've actually found quite a few of those that are, 503 00:21:49.808 --> 00:21:51.910 that are useful that have solved my problem for me. 504 00:21:51.910 --> 00:21:54.813 Adam: And oftentimes when they do it and they put like their app 505 00:21:54.813 --> 00:21:57.649 is like the 10th, like the fifth or sixth step, I'm like, you know what? 506 00:21:57.649 --> 00:21:58.550 Riley: That's fine. 507 00:21:58.550 --> 00:21:59.184 Adam: That's okay. 508 00:21:59.184 --> 00:22:01.553 Riley: I'll accept this real quick. 509 00:22:01.620 --> 00:22:03.388 I want to talk about AI overviews. 510 00:22:03.388 --> 00:22:04.256 Adam: Okay. Sure. 511 00:22:04.256 --> 00:22:07.592 Riley: Because this to me is one of the most 512 00:22:08.026 --> 00:22:10.562 pernicious developments in search. 513 00:22:10.562 --> 00:22:13.699 The idea that instead of using Google like a tool, 514 00:22:13.699 --> 00:22:16.601 that you know how to use yourself. 515 00:22:16.601 --> 00:22:19.071 They've added this AI 516 00:22:19.071 --> 00:22:21.840 search functionality to all the major searches. 517 00:22:21.840 --> 00:22:24.743 So engines that turn it from a tool that you're using 518 00:22:24.743 --> 00:22:27.746 into an interface with a bot that's using the tool. 519 00:22:27.879 --> 00:22:28.880 Adam: Yes. Yeah. 520 00:22:28.880 --> 00:22:31.917 Riley: And I think this is going to have a horrible effect on people's 521 00:22:31.917 --> 00:22:35.354 ability to research stuff for themselves and think critically. 522 00:22:35.887 --> 00:22:39.124 And just to be able to to find out what the hell is going on out there, 523 00:22:39.658 --> 00:22:41.360 because this is how people are going to use it. 524 00:22:41.360 --> 00:22:42.728 We already had the like 525 00:22:42.728 --> 00:22:46.631 the kind of Google summaries when you like, before the a- before LMS. 526 00:22:46.665 --> 00:22:48.033 we're really a big thing. 527 00:22:48.033 --> 00:22:51.203 Adam: It took like a snippet of like a Wikipedia article that would be like, 528 00:22:51.303 --> 00:22:52.804 how much does an Eagle weigh? 529 00:22:52.804 --> 00:22:55.540 And I was honestly totally fine with that the vast majority of the time, 530 00:22:55.540 --> 00:22:59.378 because what it was showing you was an actual snippet of a website. 531 00:22:59.378 --> 00:23:02.948 And then right below it was the link with the full text of the website header 532 00:23:02.948 --> 00:23:03.648 and everything like that. 533 00:23:03.648 --> 00:23:04.683 Riley: I'll look more into that. 534 00:23:04.683 --> 00:23:07.719 I'll go to the direct source that you're citing to me, but 535 00:23:07.719 --> 00:23:12.924 we've gone from that to a chat bot searching the internet for you 536 00:23:13.158 --> 00:23:16.895 and having the very first thing you see be Google's 537 00:23:17.496 --> 00:23:20.732 AI summary of all this stuff that I've found online. 538 00:23:20.866 --> 00:23:25.170 And the thing is that, like, if it was accurate, yes, that's very useful, 539 00:23:26.204 --> 00:23:29.274 but you can't know that it's accurate, which is the whole problem. 540 00:23:29.274 --> 00:23:31.676 And so why are we rolling this out? 541 00:23:31.676 --> 00:23:35.180 Because everyone's going to learn that. 542 00:23:35.180 --> 00:23:38.283 You just need to ask this robot what's up. 543 00:23:38.283 --> 00:23:40.852 The robot will tell you. 544 00:23:40.852 --> 00:23:42.120 And then you're like, okay. 545 00:23:42.120 --> 00:23:45.223 Adam: And the fact that they hide it behind like a tiny little link that has a pop up, 546 00:23:45.524 --> 00:23:49.194 like I just googled how many are in the word strawberry? 547 00:23:49.561 --> 00:23:51.830 And it goes, there are zero W's in the word strawberry. 548 00:23:51.830 --> 00:23:55.801 The letters in the word are stra w very, 549 00:23:56.701 --> 00:24:01.807 its source is a post complaining about how LLMS keeps screwing up. 550 00:24:01.807 --> 00:24:03.442 How many R are in strawberry? 551 00:24:03.442 --> 00:24:07.045 Riley: This is I mean, this is, it's kind of a meme in the, 552 00:24:08.013 --> 00:24:09.881 in the AI community. 553 00:24:09.881 --> 00:24:11.283 There's an easy explanation for this. 554 00:24:11.283 --> 00:24:14.286 It's the fact that LLMS just the way that they work, 555 00:24:14.953 --> 00:24:18.123 they don't know what the letters are, that they're. 556 00:24:18.123 --> 00:24:20.358 that they're they're dealing with tokens. 557 00:24:20.358 --> 00:24:22.828 They're not dealing with letters that make up words. 558 00:24:22.828 --> 00:24:25.964 It can't look at the word strawberry and go line by line. 559 00:24:25.997 --> 00:24:28.099 It's like I mean, it did, as you can see, 560 00:24:28.099 --> 00:24:30.035 but it doesn't have the understanding of what it's doing. 561 00:24:30.035 --> 00:24:31.503 It's just kind of like putting stuff out. 562 00:24:31.503 --> 00:24:32.737 Adam: It's a word calculator. 563 00:24:32.737 --> 00:24:33.171 Riley: Yes 564 00:24:33.171 --> 00:24:35.974 Adam: And it's just but we don't really know how the calculator works. 565 00:24:35.974 --> 00:24:38.577 It's just that it goes like what is most likely to happen. 566 00:24:38.577 --> 00:24:39.911 Riley: And that's like, that's a whole thing. 567 00:24:39.911 --> 00:24:41.346 I mean, we let's not get into the whole... 568 00:24:41.346 --> 00:24:42.647 Adam: The octopus. 569 00:24:42.647 --> 00:24:44.282 Riley: So anyways, I have an issue with that 570 00:24:44.282 --> 00:24:49.721 AI overviews because of my job is researching and understanding things. 571 00:24:49.721 --> 00:24:53.024 And when I write a TechLinked story I'm like 572 00:24:53.024 --> 00:24:56.194 if I screw up one of these facts, people are going to call me out. 573 00:24:56.228 --> 00:24:59.531 I need to know that what I like, that I understand what's going on, 574 00:24:59.898 --> 00:25:02.701 and I just I can't I can't trust AI overviews 575 00:25:02.701 --> 00:25:05.570 and I don't think other people should be, should, should learn to trust it. 576 00:25:05.570 --> 00:25:08.874 I think it's funny because my instinct when I use the AI overviews is I 577 00:25:08.874 --> 00:25:11.977 go, I read it and I go, all right, time to find out if that's true. 578 00:25:12.077 --> 00:25:14.913 Which means that I just spent a bunch of time not determining 579 00:25:14.913 --> 00:25:16.982 whether or not something's true because I had to. 580 00:25:16.982 --> 00:25:19.117 I don't trust the AI summary. 581 00:25:19.117 --> 00:25:23.388 Riley: Now, the counterpoint to this is that normies were never good at googling. 582 00:25:23.688 --> 00:25:25.924 googling has always kind of been a skill, 583 00:25:25.924 --> 00:25:26.658 Adam: Yes, 584 00:25:26.658 --> 00:25:31.096 Riley: but I fear that whatever rudimentary skills people were forced to pick up 585 00:25:31.096 --> 00:25:34.099 if they wanted to Google something or now, you know, it's more convenient. 586 00:25:34.099 --> 00:25:37.769 I guess it's like instead of going to the library and looking something 587 00:25:37.769 --> 00:25:40.772 up, you're asking bro, 588 00:25:40.906 --> 00:25:44.242 standing outside the library, hey, you know, 589 00:25:44.809 --> 00:25:49.247 when did Hannibal first, conquer? 590 00:25:49.247 --> 00:25:51.249 I don't know, Carthage. Is that a thing? 591 00:25:51.249 --> 00:25:52.617 Adam: Sure. 592 00:25:52.617 --> 00:25:53.218 Riley: It's been a long time. 593 00:25:53.218 --> 00:25:55.253 Adam: Yeah Let me go ask ten other dudes inside. 594 00:25:55.253 --> 00:25:57.322 Riley: It's like, oh, that was totally this year. 595 00:25:57.322 --> 00:26:00.692 And you're like, okay, I could go into the library 596 00:26:00.692 --> 00:26:03.929 and look up the actual thing, but this guy seems pretty confident. 597 00:26:03.929 --> 00:26:05.564 Okay, I'll just believe him. 598 00:26:05.564 --> 00:26:07.332 That's what we're doing with Google 599 00:26:07.332 --> 00:26:08.667 Adam: Hannibal never conquered Carthage. 600 00:26:08.667 --> 00:26:10.101 Riley: All right. Sorry. 601 00:26:10.101 --> 00:26:11.670 Adam: Or did he? 602 00:26:11.670 --> 00:26:14.039 Riley: This is AI overviews! 603 00:26:14.039 --> 00:26:15.507 Adam: He commanded the forces of Carthage. 604 00:26:15.507 --> 00:26:17.275 Riley: Oh, he was on the side of Carthage. 605 00:26:17.275 --> 00:26:18.543 There we go 606 00:26:18.543 --> 00:26:19.377 Adam: I think that. 607 00:26:19.377 --> 00:26:20.679 Riley: Oh, sorry. Go ahead. 608 00:26:20.679 --> 00:26:21.713 Adam: No, you go ahead. 609 00:26:21.713 --> 00:26:25.383 Riley: I was just thinking I had I had wrote down here that like, try Reddit answers. 610 00:26:25.383 --> 00:26:28.954 Try searching on TikTok because one big thing, 611 00:26:29.220 --> 00:26:32.424 it's not about Google getting worse per se, but, 612 00:26:33.758 --> 00:26:36.328 young people, instead of searching Google, 613 00:26:36.328 --> 00:26:39.564 a lot of time they're searching on Reddit and they're searching on TikTok. 614 00:26:40.765 --> 00:26:43.602 Adam: I have never tried the Reddit answers. 615 00:26:43.602 --> 00:26:46.605 Riley: What is a Toslink adapter? 616 00:26:46.605 --> 00:26:49.274 Toslink adapter is a device that allows you connect audio equipment like. 617 00:26:49.274 --> 00:26:51.676 Okay, so it's giving me like a ChatGPT answer. 618 00:26:51.676 --> 00:26:53.011 Here's the thing. 619 00:26:53.011 --> 00:26:56.314 This is an AI summary, but right away 620 00:26:56.314 --> 00:27:01.586 it's linking to Reddit post by humans that I know are probably real. 621 00:27:01.620 --> 00:27:04.623 There's a lot of AI, there's a lot of bots on Reddit for sure, 622 00:27:04.823 --> 00:27:07.525 but I think that the vast majority of the time 623 00:27:07.525 --> 00:27:11.463 a post about something that like, like this is going to be I'm, I'm 624 00:27:11.463 --> 00:27:15.333 going to be fairly confident that, okay, this first this was written by human. 625 00:27:15.567 --> 00:27:16.501 Adam: for now, 626 00:27:16.501 --> 00:27:19.471 Riley: for now, I'm sure that Reddit may be flooded 627 00:27:19.471 --> 00:27:22.974 by bots talking about basic tech support issues. 628 00:27:22.974 --> 00:27:24.542 Adam: Right. And this is the AI issue. 629 00:27:24.542 --> 00:27:25.710 Riley: This is like the end result. 630 00:27:25.710 --> 00:27:28.680 Adam: If it's all bots, nothing's a value anymore. 631 00:27:29.180 --> 00:27:32.717 And then the data is going to be trained on become slop, and it's just going to be 632 00:27:32.717 --> 00:27:33.385 this vicious cycle. 633 00:27:33.385 --> 00:27:36.554 It's like the thing that everyone's the people were very early talking about 634 00:27:37.122 --> 00:27:39.691 how that would happen is that like, AI would just end up 635 00:27:39.691 --> 00:27:42.894 getting worse and worse and worse as it trains itself on its own slop 636 00:27:43.428 --> 00:27:46.031 over and over and over again till it's basically useless, 637 00:27:46.031 --> 00:27:47.899 Riley: It's like a jpeg getting compressed again and again. 638 00:27:47.899 --> 00:27:49.868 Adam: Now I don't trust Reddit. 639 00:27:49.868 --> 00:27:50.969 Riley: Now. You don't trust. Reddit. 640 00:27:50.969 --> 00:27:53.271 Adam: I fact checked everything now. 641 00:27:53.271 --> 00:27:54.873 I mean, it's my job right. 642 00:27:54.873 --> 00:27:57.142 So like I even if I read it on Reddit, I'm like, eh 643 00:27:57.142 --> 00:28:00.211 you know, I guess I'll have to look like I think one of the best places 644 00:28:00.211 --> 00:28:03.214 to target stuff is topics of things that are hard to review. 645 00:28:03.248 --> 00:28:04.683 Right. Like I do a lot of reasons. 646 00:28:04.683 --> 00:28:07.452 We do a lot of research on products and stuff, like a washing machine. 647 00:28:07.452 --> 00:28:10.088 No one's out there testing, washing machines 648 00:28:10.088 --> 00:28:12.023 because it'd be prohibitively expensive. 649 00:28:12.023 --> 00:28:15.026 And usually you get one you buy. 650 00:28:15.126 --> 00:28:16.928 How often do you buy a washing machine? 651 00:28:17.929 --> 00:28:18.630 Riley: I donât... 652 00:28:18.630 --> 00:28:19.698 Adam: Have you ever ? 653 00:28:19.698 --> 00:28:20.065 Riley: no 654 00:28:20.065 --> 00:28:23.101 Adam: Same! so and same thing with like, AC units, 655 00:28:23.101 --> 00:28:24.602 stuff like that where they're very difficult to test 656 00:28:24.602 --> 00:28:27.639 or they're like so commodified that nobody really tests them. 657 00:28:28.106 --> 00:28:29.974 But I want to have some sort of opinion. 658 00:28:29.974 --> 00:28:32.010 But that's like where the bots come in, right? 659 00:28:32.010 --> 00:28:35.814 That's where the like, listicles that just are like, these are like for. 660 00:28:36.014 --> 00:28:37.982 Riley: Yeah, but that's not on Reddit. 661 00:28:37.982 --> 00:28:38.783 Adam: Yeah it is! 662 00:28:38.783 --> 00:28:40.251 Riley: Reddit is is. 663 00:28:40.251 --> 00:28:42.987 Adam: 100% when I've looked like best air conditioner Reddit. 664 00:28:42.987 --> 00:28:46.458 You will find tons of bot accounts that are like. 665 00:28:46.591 --> 00:28:50.962 They're posting like I bought the Penguino D30 666 00:28:50.962 --> 00:28:54.866 from, DeLonghi and it's the best thing I've ever owned. 667 00:28:54.866 --> 00:28:58.203 It has these feature and it is the most ChatGPT stuff I've ever read. 668 00:28:58.203 --> 00:29:00.004 Riley: I will say that like the vast majority 669 00:29:00.004 --> 00:29:04.142 of like the writing that I do is not for stuff like that, I guess. 670 00:29:04.142 --> 00:29:08.613 So you've probably definitely encountered more things like, like product focused. 671 00:29:09.013 --> 00:29:10.515 Most of it, most of the time that I spend on 672 00:29:10.515 --> 00:29:13.785 Reddit is like, you know, some article was written based on, 673 00:29:14.652 --> 00:29:19.724 another article that was based on a Reddit post from someone whose GPU exploded. 674 00:29:20.058 --> 00:29:23.762 So then I'm like clicking through, finding trying to find the original Reddit post 675 00:29:23.762 --> 00:29:26.364 and like reading the comments and stuff like that. So. 676 00:29:26.364 --> 00:29:29.701 But yeah, I don't doubt that there are that the bots. 677 00:29:29.701 --> 00:29:33.404 I mean, I know that the bots, problem is a big thing on Reddit, 678 00:29:33.872 --> 00:29:36.841 and not just for that, I think what was it? 679 00:29:36.841 --> 00:29:39.110 What was the subreddit? I think it was am I the a****** 680 00:29:40.612 --> 00:29:42.113 Adam: Well, everything on that subreddit fake. 681 00:29:42.113 --> 00:29:43.114 Everyone knows that. 682 00:29:43.114 --> 00:29:44.282 Sammy: Is it? 683 00:29:44.282 --> 00:29:47.252 Adam: On am I the a****** space. So there's so many. 684 00:29:47.252 --> 00:29:47.919 Riley: And this is the thing 685 00:29:47.919 --> 00:29:48.686 with slop. 686 00:29:48.686 --> 00:29:50.455 Slop isn't new, right? 687 00:29:50.455 --> 00:29:51.022 It's not like. 688 00:29:51.022 --> 00:29:54.959 It's not like, you know, flooding the internet with slop was not a thing 689 00:29:54.959 --> 00:29:55.293 before. 690 00:29:55.293 --> 00:29:58.496 with AI, like humans are great at slop. 691 00:29:59.264 --> 00:30:00.632 And we've been doing it for a long time. 692 00:30:00.632 --> 00:30:02.066 It's a storied tradition. 693 00:30:02.066 --> 00:30:05.303 but now it's just so trivially. 694 00:30:05.303 --> 00:30:08.239 Trivially easy for anyone just to pump out slop. 695 00:30:08.239 --> 00:30:10.408 Like they're a like it's a factory. 696 00:30:10.408 --> 00:30:12.644 Adam: It was, It was change my view. 697 00:30:12.644 --> 00:30:14.612 Riley: Oh. It was change my view. 698 00:30:14.612 --> 00:30:17.982 Adam: And, basically, Reddit threatened to sue them 699 00:30:17.982 --> 00:30:20.985 because they're like, hey, we're the only ones who can put bots in there. 700 00:30:20.985 --> 00:30:22.053 Riley: We didn't actually say what it was. 701 00:30:22.053 --> 00:30:25.323 But it's so like these, these University of Zurich researchers, 702 00:30:26.057 --> 00:30:29.561 basically just posted tons and tons and tons of posts and change it 703 00:30:29.561 --> 00:30:30.695 and changed my view. 704 00:30:30.695 --> 00:30:34.632 They say that they were partially written by AI like, but they used 705 00:30:34.632 --> 00:30:39.437 AI to generate tons and tons of posts over like a year or something. 706 00:30:39.838 --> 00:30:43.208 Adam: I think the goal was to figure out if they could better change the opinions 707 00:30:43.208 --> 00:30:44.042 of folks. 708 00:30:44.042 --> 00:30:44.676 Riley: Yeah, to find out 709 00:30:44.676 --> 00:30:48.780 how good AI was at changing people's view. 710 00:30:48.780 --> 00:30:51.683 And I think they found that AI was slightly better. 711 00:30:51.683 --> 00:30:54.986 Than than people at, yeah... At convincing us. 712 00:30:54.986 --> 00:30:56.988 So that's great news. 713 00:30:56.988 --> 00:30:58.923 That's that's exciting for the future. 714 00:30:58.923 --> 00:30:59.357 Okay. 715 00:30:59.357 --> 00:31:01.826 We've gone on a lot of tangents here. 716 00:31:01.826 --> 00:31:03.461 Adam: Why does Gmail search stuff so bad? 717 00:31:03.461 --> 00:31:04.896 Riley: I... 718 00:31:04.896 --> 00:31:06.631 Sammy: Oh my god can I can I say something about this? 719 00:31:06.631 --> 00:31:07.732 Riley: Sammy has a thought about that 720 00:31:07.732 --> 00:31:10.134 Sammy: I was looking for something 721 00:31:10.134 --> 00:31:10.869 like E-transfer. 722 00:31:10.869 --> 00:31:12.871 Right. That's for for non-Canadian. 723 00:31:12.871 --> 00:31:15.540 That's basically like transferring money through to Venmo. 724 00:31:15.540 --> 00:31:16.241 Adam: Itâs like Venmo 725 00:31:16.241 --> 00:31:18.710 Venmo. yeah. All that. 726 00:31:18.710 --> 00:31:19.310 I was looking for it. 727 00:31:19.310 --> 00:31:22.981 And then like the Google, the Gmail search result gave me something 728 00:31:22.981 --> 00:31:26.818 that wasn't alphabetical, nor was it in chronological order. 729 00:31:26.918 --> 00:31:30.455 So I was like, so give me stuff from like a year ago or like a few months ago. 730 00:31:30.455 --> 00:31:31.489 Iâm just like why? 731 00:31:31.489 --> 00:31:33.157 Riley: You sent that screenshot. And I was like, what? 732 00:31:33.157 --> 00:31:35.426 I've never even I've never seen that happen to me. 733 00:31:35.426 --> 00:31:38.162 Adam: Yeah and itâll miss stuff like, I'll, I'll search like meeting 734 00:31:38.162 --> 00:31:39.230 because I know I had to have a meet. 735 00:31:39.230 --> 00:31:41.332 I there's something about a meeting in some email 736 00:31:41.332 --> 00:31:44.202 and it's like just shows me stuff on like four years ago. 737 00:31:44.202 --> 00:31:46.804 And I'm like, no, like it can't search. 738 00:31:46.804 --> 00:31:49.307 It seems like it doesn't search the contents of the emails or something. 739 00:31:49.307 --> 00:31:52.076 The vast majority of the time when I saw something in my email, 740 00:31:53.111 --> 00:31:56.547 it's still chronological, so I don't know. 741 00:31:57.382 --> 00:32:00.418 I mean, if I search meeting someone's eye for June 742 00:32:00.418 --> 00:32:03.488 30th, June 26th, June 24th, I mean, , it's chronological, so I don't. 743 00:32:03.488 --> 00:32:06.491 Adam: For... if I search James meeting. 744 00:32:06.858 --> 00:32:09.861 Why is the 10th result here 745 00:32:10.094 --> 00:32:13.965 from June 6th- er sorry June 10th, 2024. 746 00:32:14.565 --> 00:32:17.568 And then it goes to April 11th of this year? 747 00:32:17.702 --> 00:32:19.237 Sammy: Yeah, I will say it wasn't this bad. 748 00:32:19.237 --> 00:32:22.240 I don't know what changed in the last few months, 749 00:32:22.807 --> 00:32:23.341 but like- 750 00:32:23.341 --> 00:32:26.644 Riley: Never I've never seen that I don't know, like, what do you guys doing. 751 00:32:26.644 --> 00:32:28.279 Sammy: Try searching something on your email because these are the default. 752 00:32:28.279 --> 00:32:29.814 Adam: Yeah okay so the default thereâs a... 753 00:32:29.814 --> 00:32:31.816 So there is a most recent there's a most relevant view. 754 00:32:31.816 --> 00:32:34.118 Riley: Well yeah! Just don't just deselect that. 755 00:32:34.118 --> 00:32:34.585 Sammy: Wait theres.. 756 00:32:34.585 --> 00:32:36.087 They changed it? 757 00:32:36.087 --> 00:32:36.654 Riley: Oh my gosh. 758 00:32:36.654 --> 00:32:38.890 Why do you have most relevant selected I probably- 759 00:32:38.890 --> 00:32:41.292 Adam: Because maybe I want the most relevant stuff I- 760 00:32:41.292 --> 00:32:45.029 one would think then maybe recency would make things slightly more relevant. 761 00:32:45.129 --> 00:32:46.397 You know they might be related. 762 00:32:46.397 --> 00:32:48.800 Riley: How was Google going to know what the most relevant thing is? 763 00:32:48.800 --> 00:32:50.068 Sammy: Why is that even a thing? 764 00:32:50.068 --> 00:32:52.503 Adam: They don't ask me for most recent results on Google search. 765 00:32:52.503 --> 00:32:53.871 Riley: So this is okay. Here we go. 766 00:32:53.871 --> 00:32:58.576 Actually this is this is a this is a good takeaway for the for the viewers. 767 00:32:58.576 --> 00:32:59.344 Okay. 768 00:32:59.344 --> 00:33:02.981 As much as AI has screwed up the web and screwed up everything, 769 00:33:03.314 --> 00:33:08.820 there are still ways for you to make your search experience not horrible. 770 00:33:09.320 --> 00:33:09.654 Adam: yeah 771 00:33:09.654 --> 00:33:14.025 Riley: So like on Google, there still is a way to search without the AI overviews. 772 00:33:14.158 --> 00:33:17.595 Like if I search I'll say Who is Linus? 773 00:33:19.530 --> 00:33:22.467 Sebastian? 774 00:33:22.467 --> 00:33:23.868 Okay. It doesn't even give me 775 00:33:23.868 --> 00:33:25.937 An AI overview. Unbelievable. 776 00:33:25.937 --> 00:33:28.773 So anyways, pretend there's an AI overview there. 777 00:33:28.773 --> 00:33:31.542 You can click right here. It says web. 778 00:33:31.542 --> 00:33:34.946 Click there and it will give you just a regular Google list of links. 779 00:33:34.946 --> 00:33:37.015 So like that still exists. 780 00:33:37.015 --> 00:33:42.120 And in in Gmail they have the drop down most recent or most relevant. 781 00:33:42.587 --> 00:33:45.890 So there are ways to find the workaround 782 00:33:46.290 --> 00:33:49.794 and and and use it and use the tools available to you. 783 00:33:49.794 --> 00:33:51.195 Don't don't give in. 784 00:33:51.195 --> 00:33:52.930 Don't give in to the AI. 785 00:33:52.930 --> 00:33:55.833 Adam: The other like quick ways to just add a curse word. 786 00:33:55.833 --> 00:33:56.300 Riley: Oh yeah 787 00:33:56.300 --> 00:33:57.969 Adam: So instead of saying Who is Linus tech tips, you say who? 788 00:33:57.969 --> 00:33:58.870 The f*** is Linus Tech Tips 789 00:33:58.870 --> 00:34:00.171 Then the AIâs like 790 00:34:00.171 --> 00:34:01.672 Riley: Who the f*** 791 00:34:01.672 --> 00:34:03.541 Adam: Other things you can do 792 00:34:03.541 --> 00:34:04.375 Quotation marks. 793 00:34:04.375 --> 00:34:07.378 You put something in quotes, it'll search for that exact phrase. 794 00:34:07.678 --> 00:34:10.181 If you put something with, if you put a minus 795 00:34:10.181 --> 00:34:11.182 and then you put like a word. 796 00:34:11.182 --> 00:34:14.619 So if I say like I want, dogs minus spaniel, 797 00:34:16.020 --> 00:34:18.856 I will get results without the word spaniel in it. 798 00:34:18.856 --> 00:34:20.525 Okay. That's good to know. 799 00:34:20.525 --> 00:34:20.992 Adam: Yeah!. 800 00:34:20.992 --> 00:34:21.692 Riley: Wow! 801 00:34:21.692 --> 00:34:24.796 Oh, I didn't even I wanted to say this about the YouTube thing that, like, 802 00:34:25.196 --> 00:34:28.199 I feel like YouTube is just such a 803 00:34:28.533 --> 00:34:32.103 it's some things they're doing so well and other things are doing horribly like 804 00:34:32.103 --> 00:34:33.805 Adam: let's give them a compliment sandwich. 805 00:34:33.805 --> 00:34:34.839 What are they doing well? 806 00:34:34.839 --> 00:34:38.242 Riley: I feel like they've done really well in the past couple of years, at least 807 00:34:38.242 --> 00:34:43.681 in, surfacing smaller creators and and in the feed. 808 00:34:44.015 --> 00:34:47.452 So when you just go to home and the algorithmic feed it's giving me, 809 00:34:47.452 --> 00:34:49.320 you know, like the normal kind of big like, 810 00:34:49.320 --> 00:34:52.523 oh, look, this got a lot of clicks, but then, you know, you 811 00:34:52.523 --> 00:34:53.758 scroll past a few of those 812 00:34:53.758 --> 00:34:57.161 and then it's like some small creator and it's the video has like 50 views 813 00:34:57.161 --> 00:34:57.795 or something. 814 00:34:57.795 --> 00:34:59.464 And I've clicked a few of them 815 00:34:59.464 --> 00:35:02.467 and I've found some cool people that I have subscribed to. 816 00:35:02.800 --> 00:35:06.437 And so, like, I really appreciate YouTube kind of changing their algorithm 817 00:35:06.437 --> 00:35:08.072 to surface smaller people. 818 00:35:08.072 --> 00:35:10.875 The bad stuff, you know, is like kind of there, there. 819 00:35:10.875 --> 00:35:12.210 I mean, you could come up with something. 820 00:35:12.210 --> 00:35:16.214 I know that the that they have their AI summaries on a lot of YouTube videos, 821 00:35:16.214 --> 00:35:18.349 they're like rolling that out really inconsistently. 822 00:35:18.349 --> 00:35:22.019 I've seen it sometimes and it goes away and I'm like, is it just 823 00:35:22.019 --> 00:35:23.454 can you just decide what you're doing? 824 00:35:24.622 --> 00:35:27.024 Sammy: How do you feel about the AI overviews on videos? 825 00:35:27.024 --> 00:35:30.328 Adam: I've never, ever used an AI overview 826 00:35:30.761 --> 00:35:34.699 for a video or email, and they keep being like Gemini's summary of my emails. 827 00:35:34.699 --> 00:35:37.568 I'm like, I'm just going to read the email. Actually. 828 00:35:37.568 --> 00:35:38.703 Riley: We didnât even talk about that. 829 00:35:38.703 --> 00:35:41.939 Yeah , in Google, in Google Workspace. 830 00:35:41.939 --> 00:35:44.208 Doc sucks now 831 00:35:44.208 --> 00:35:45.476 I'm so annoyed. 832 00:35:45.510 --> 00:35:49.247 Adam: They moved they moved to the like right click button is a Gemini button now! 833 00:35:49.247 --> 00:35:49.780 Riley: is it? 834 00:35:49.780 --> 00:35:50.481 Adam: on docs yeah! 835 00:35:50.481 --> 00:35:52.083 Riley: the right click button? 836 00:35:52.083 --> 00:35:53.217 Adam: So like when you right click to like 837 00:35:53.217 --> 00:35:55.586 try and add a comment it pops up like a Gemini pop up. 838 00:35:55.586 --> 00:35:58.222 I don't know if they rolled it back or not because people were pissed 839 00:35:58.222 --> 00:35:59.724 If you highlighted it needs to have a pop up 840 00:35:59.724 --> 00:36:02.460 that would show you the comment button and those were fine. 841 00:36:02.460 --> 00:36:03.528 It's the only thing it's gone. 842 00:36:04.595 --> 00:36:06.197 I have to right click and go comment. 843 00:36:06.197 --> 00:36:07.198 Or press control alt N 844 00:36:07.198 --> 00:36:11.836 Riley: I don't think I maybe I disabled it, but I never had the like the add comment 845 00:36:11.836 --> 00:36:16.107 when I highlight something like but now I have the refine thing that shows up. 846 00:36:16.474 --> 00:36:20.011 every single time I highlight everything, it's the worst. 847 00:36:20.211 --> 00:36:23.281 And it's funny because when you go refine, you have to type more. 848 00:36:23.548 --> 00:36:26.083 You have to tell it what to do! 849 00:36:26.083 --> 00:36:27.151 Riley: Yeah but I just mean like, hey, 850 00:36:27.151 --> 00:36:30.588 sometimes you just highlight something because you want to copy and paste it. 851 00:36:30.988 --> 00:36:34.926 Stop asking me if I want the AI to summarize the four words 852 00:36:34.926 --> 00:36:36.394 that I just highlighted. 853 00:36:36.394 --> 00:36:37.929 How can we turn this around? 854 00:36:37.929 --> 00:36:39.096 Will Google turn around? 855 00:36:39.096 --> 00:36:42.133 Or is or should we abandon Google and try and like, say, 856 00:36:42.133 --> 00:36:43.467 everyone use something else? 857 00:36:43.467 --> 00:36:45.636 Adam: I don't know, have you tried using an alternative search engine? 858 00:36:45.636 --> 00:36:46.571 Riley: I haven't 859 00:36:46.571 --> 00:36:48.639 Adam: sucks even DuckDuckGo. 860 00:36:49.207 --> 00:36:50.508 Bing 861 00:36:50.508 --> 00:36:51.542 Riley: if I didn't, if. 862 00:36:51.542 --> 00:36:52.610 Adam: Bingâs good for ad for YouTube 863 00:36:52.610 --> 00:36:57.648 Riley: if basically my whole job wasn't googling stuff. 864 00:36:58.015 --> 00:37:01.552 Like Searching stuff and trying to find out what the news is, 865 00:37:01.552 --> 00:37:03.955 you know, what's the who's the original source for this? 866 00:37:03.955 --> 00:37:07.491 Like, if I didn't, I don't trust another 867 00:37:07.491 --> 00:37:10.661 search engine to be able to get me there consistently. 868 00:37:10.661 --> 00:37:14.298 I'm sure there's one that can I just haven't spent the time 869 00:37:14.699 --> 00:37:17.535 that is required to be able to trust another one. 870 00:37:17.535 --> 00:37:19.403 Adam: There's a good. One that starts with a K, I think, and it- 871 00:37:19.403 --> 00:37:20.371 Riley: Kagi. 872 00:37:20.371 --> 00:37:22.473 Adam: Yeah but it costs $15 a month or something. 873 00:37:22.473 --> 00:37:23.474 Riley: Yeah, that looks pretty decent. 874 00:37:23.474 --> 00:37:26.377 I've seen people say that like that's kind of the, 875 00:37:26.377 --> 00:37:28.145 that's one of the most promising. 876 00:37:28.145 --> 00:37:30.915 Adam: It's interesting because I think that like it could the, 877 00:37:30.915 --> 00:37:33.150 the quality of search could change drastically 878 00:37:33.150 --> 00:37:36.153 if Google gets broken up into its various constituent parts. 879 00:37:36.587 --> 00:37:37.888 Right. Like that's another. 880 00:37:37.888 --> 00:37:39.123 Riley: Yeah, that's something. 881 00:37:39.123 --> 00:37:43.995 Adam: Because Google has an absolutely insane level of control over the internet. 882 00:37:44.128 --> 00:37:44.829 They could just. 883 00:37:44.829 --> 00:37:47.832 Riley: Most popular browser, most popular smartphone operating system. 884 00:37:48.199 --> 00:37:49.734 Adam: Most popular search engine. 885 00:37:49.734 --> 00:37:54.672 So like all the content that is supported or surfaced basically goes through them. 886 00:37:54.672 --> 00:37:56.340 And they have 887 00:37:57.441 --> 00:37:58.276 somehow 888 00:37:58.276 --> 00:38:02.546 avoided like being very clear targets of anti-trust until very recently. 889 00:38:02.546 --> 00:38:05.750 Riley: Until like the past year, it's it's pretty exciting. 890 00:38:05.750 --> 00:38:06.784 I mean, like, I'm not saying that 891 00:38:06.784 --> 00:38:09.920 I want to break them up or whatever, but like maybe also, Yeah, maybe 892 00:38:09.954 --> 00:38:10.454 break them up. 893 00:38:10.454 --> 00:38:10.955 I don't know 894 00:38:10.955 --> 00:38:12.423 Adam: I think so, 895 00:38:12.423 --> 00:38:16.260 Riley: it seems to be one of the only solutions that will actually do anything 896 00:38:16.427 --> 00:38:18.062 Adam: I think one of the biggest issues that feels like 897 00:38:18.062 --> 00:38:20.998 Google has not made something cool or good for a long time, 898 00:38:20.998 --> 00:38:23.634 like they used to make cool things that were good, 899 00:38:23.634 --> 00:38:26.637 like look at Google Maps or Google Earth 900 00:38:26.971 --> 00:38:28.205 Very cool. 901 00:38:28.205 --> 00:38:29.240 Good products. 902 00:38:29.240 --> 00:38:32.877 they slowly have been making it worse by having it like you're navigating. 903 00:38:32.877 --> 00:38:35.179 And then you have to see like West Group insurance. 904 00:38:35.179 --> 00:38:36.380 Advertisement. 905 00:38:36.380 --> 00:38:40.051 Riley: I don't know how much of that is like, oh, Google is worse than they were. 906 00:38:40.051 --> 00:38:43.921 And how much of it is like, you know, the internet is just kind of a mature 907 00:38:43.921 --> 00:38:45.790 ecosystem now. It's like, what? 908 00:38:45.790 --> 00:38:48.826 Adam: I mean, I can't imagine how much it cost to run Google Maps. 909 00:38:48.826 --> 00:38:49.527 Riley: Oh my gosh. 910 00:38:49.527 --> 00:38:51.062 Store... 911 00:38:51.062 --> 00:38:53.464 Photos from like, every street in the world. 912 00:38:53.464 --> 00:38:57.034 Riley: This is the whole problem with all of these tech giants 913 00:38:57.368 --> 00:39:02.673 is that at the end of the day, we do rely on their services and they are useful. 914 00:39:02.673 --> 00:39:04.942 Google maps is very useful. 915 00:39:04.942 --> 00:39:05.776 Adam: Nationalize it 916 00:39:07.545 --> 00:39:10.948 like like if you think about it, if there's some if there's some service. 917 00:39:10.948 --> 00:39:12.016 I know that this sounds very... 918 00:39:12.016 --> 00:39:13.384 Riley: Yeah weâre gonna get political 919 00:39:13.384 --> 00:39:14.685 Adam: Iâm gonna get hyper commie real quick 920 00:39:14.685 --> 00:39:17.288 But if there are some services that private companies 921 00:39:17.288 --> 00:39:20.391 have inserted themselves in to the point where it's become, 922 00:39:20.624 --> 00:39:23.961 they're basically essential for modern operation. 923 00:39:23.961 --> 00:39:26.964 It makes sense to try and remove the middleman 924 00:39:27.331 --> 00:39:30.101 who is extracting dollars from the the two sides. 925 00:39:30.101 --> 00:39:30.901 Right? 926 00:39:30.901 --> 00:39:33.704 Like Google's taking money from the government and from the people. 927 00:39:33.704 --> 00:39:37.308 Riley: The problem with that real quick, we can't get into this. 928 00:39:37.308 --> 00:39:40.111 Adam: No, we've got three minutes. Let's get into that. 929 00:39:40.111 --> 00:39:43.280 Riley: A lot of the time when you nationalize something like that, 930 00:39:43.981 --> 00:39:45.616 the management is just worse. 931 00:39:45.616 --> 00:39:46.484 Adam: Absolutely. 932 00:39:46.484 --> 00:39:49.754 And so, you know, our Canadian health care system is a great example. 933 00:39:49.754 --> 00:39:52.423 Everyone loves our our health care system. 934 00:39:52.423 --> 00:39:54.091 And I love our healthcare system. 935 00:39:54.091 --> 00:39:57.061 When it works, a lot of the time it doesn't work. 936 00:39:57.495 --> 00:39:59.997 And I think, you know. 937 00:39:59.997 --> 00:40:02.433 Again, we can't get into it. 938 00:40:02.433 --> 00:40:03.401 Sammy: We've gone political 939 00:40:05.503 --> 00:40:07.037 Riley: But it's certainly a solution. 940 00:40:07.037 --> 00:40:10.541 It's certainly if there was a way to nationalize a, 941 00:40:11.409 --> 00:40:15.146 big service like that that is used by millions of people 942 00:40:15.479 --> 00:40:19.517 and have it retain its usefulness and not like, degrade due to, 943 00:40:19.950 --> 00:40:22.653 you know, a lack of resources or a lack of people who know 944 00:40:22.653 --> 00:40:25.322 what the hell they're doing, then yeah, absolutely. 945 00:40:25.322 --> 00:40:26.724 I'm on board. 946 00:40:26.724 --> 00:40:32.296 We need a philosopher king who will lead the revolution and redistribute. 947 00:40:32.296 --> 00:40:34.498 Adam: And that's gonna be me. 948 00:40:34.498 --> 00:40:36.667 That's gonna be me. 949 00:40:36.667 --> 00:40:38.536 Sammy: Vote Adam, as world leader 950 00:40:38.536 --> 00:40:39.904 Adam: no, I that's too much responsibility. 951 00:40:40.204 --> 00:40:42.840 It's scary. Why would we mess it up big time? 952 00:40:42.840 --> 00:40:46.610 Riley: What a great way to end this Google video 953 00:40:46.610 --> 00:40:48.312 Viva la revolution! 954 00:40:48.312 --> 00:40:49.146 Adam: Yeah! 955 00:40:49.146 --> 00:40:49.713 Riley: Cien. 956 00:40:49.713 --> 00:40:51.482 Adam: Just eat the rich, 957 00:40:51.482 --> 00:40:52.349 Riley: okay 958 00:40:52.349 --> 00:40:54.351 Riley:What do you mean? Okay? 959 00:40:54.351 --> 00:40:56.153 Riley: Okay. No, how do we actually end it? 960 00:40:56.153 --> 00:40:57.421 How do we actually end it? 961 00:40:57.421 --> 00:40:58.155 Sammy: Thatâs how weâre ending it 962 00:40:58.155 --> 00:40:58.956 Riley: You have to leave! 963 00:40:58.956 --> 00:40:59.790 Sammy: Nope thatâs how weâre ending it! 964 00:40:59.790 --> 00:41:01.125 Adam: Are you still on the eat the rich side 965 00:41:01.125 --> 00:41:02.593 If you have to kill people you like. 966 00:41:02.593 --> 00:41:03.394 Sammy: All right, we're ending it there!"}