WEBVTT

00:00:00.000 --> 00:00:07.000
There is a bill, there is only a bill, and it's coming to ban your teens from social media.

00:00:09.000 --> 00:00:11.000
On Capitol Hill.

00:00:12.000 --> 00:00:19.000
That's right, it's TalkLinked. We're back with another episode about the Florida social media ban.

00:00:19.000 --> 00:00:24.000
This is not another episode, we've never done it before. But Jacob is here to tell you all about it.

00:00:24.000 --> 00:00:28.000
Hi. And I'll tell you about it too, we both will. What has happened?

00:00:28.000 --> 00:00:31.000
Well Ron DeSantis just signed a law.

00:00:31.000 --> 00:00:35.000
He's the Florida governor. He's the Florida governor, that's important to know. He's Florida governor.

00:00:35.000 --> 00:00:38.000
He's not just some guy. He's also very red.

00:00:39.000 --> 00:00:42.000
Both literally and metaphorically. He gets a lot of sun.

00:00:42.000 --> 00:00:46.000
He seems to, he's a passionate man.

00:00:46.000 --> 00:00:48.000
Okay. And what's-

00:00:50.000 --> 00:00:53.000
And you can see in his face, but you know.

00:00:53.000 --> 00:01:03.000
Okay, we need to actually, like- Okay, so anyway, the point is, he signed this bill that's being called one of the U.S.'s most restrictive social media bans for minors.

00:01:04.000 --> 00:01:15.000
And when it takes effect, which I believe is January 1st next year, children under 14 will just be banned from having social media accounts.

00:01:15.000 --> 00:01:18.000
Yeah, you literally just can't do it.

00:01:18.000 --> 00:01:21.000
Oh, you're 13 and a half, banned.

00:01:21.000 --> 00:01:26.000
We're back to, we're back to the early days of social media when you had to be, I think over 13.

00:01:26.000 --> 00:01:30.000
Yeah, well, you had to lie. I remember when I created my Facebook account, I lied.

00:01:30.000 --> 00:01:34.000
You either had to be over 13 or be morally unscrupulous.

00:01:34.000 --> 00:01:37.000
Yeah. Inscrupulous? Actually, I think my mom lied for me.

00:01:37.000 --> 00:01:41.000
So, mom, can you create a Facebook account for me?

00:01:41.000 --> 00:01:46.000
This is the joke you made earlier today. If you're gonna like tweets, I'd prefer you to do it at home.

00:01:46.000 --> 00:01:50.000
If you're gonna post to Facebook, I'd rather you did it at home.

00:01:50.000 --> 00:01:54.000
Mom, if you don't let me do it here, I'm just gonna go to the bar and swipe on TikTok.

00:01:54.000 --> 00:02:00.000
They're just gonna go to the cyber cafe and they're gonna get unlicensed internet.

00:02:00.000 --> 00:02:03.000
Yeah, so this is pretty wild.

00:02:03.000 --> 00:02:19.000
I remember hearing that Florida was working on this and I know that a number of other states have tried to do similar social media bans where minors are prevented from creating accounts and using these platforms.

00:02:19.000 --> 00:02:22.000
And it's worth noting as well, just keep your thought.

00:02:22.000 --> 00:02:29.000
But 14- and 15-year-olds require parental permission to create social media accounts, so it also affects them.

00:02:29.000 --> 00:02:36.000
Yeah, so once you're 16, destroy your body, melt your brain.

00:02:36.000 --> 00:02:40.000
You can drive and also go on Facebook without mom's permission.

00:02:40.000 --> 00:02:46.000
And we joke about it, but there have been many, many, many, many studies done.

00:02:46.000 --> 00:02:50.000
And I feel like it's basically... Many done by Facebook themselves.

00:02:50.000 --> 00:02:58.000
Yeah, it's basically an uncontroversial truth now that social media is not good for young kids.

00:02:58.000 --> 00:03:03.000
Or anyone really. There's an argument to be made for society.

00:03:03.000 --> 00:03:16.000
Yeah, yeah. And despite this, I feel like, you know, obviously it is a charged issue, like any other issue that requires people to make laws.

00:03:16.000 --> 00:03:24.000
But I think it's somewhat uncontroversial at this point that social media is not good for people in general.

00:03:24.000 --> 00:03:28.000
The debate is whether...

00:03:28.000 --> 00:03:34.000
What we do about that. Yeah, so let's get a few more details about this.

00:03:34.000 --> 00:03:42.000
This bill has been passed through both houses of the Florida state government and has been signed by Governor Ron DeSantis,

00:03:42.000 --> 00:03:45.000
meaning that, as you said, it is an actual law now.

00:03:45.000 --> 00:03:48.000
It is law. It just hasn't taken effect yet. Yes.

00:03:48.000 --> 00:03:56.000
We're in the fun waiting period, the bureaucracy. Right. And in other states where they've tried similar bills, they've encountered legal challenges from people who are saying,

00:03:56.000 --> 00:03:59.000
you know, this is not constitutional or whatever, so then...

00:03:59.000 --> 00:04:07.000
Like Mark Zuckerberg. As far as I'm aware, you know, we said it was the most restrictive social media ban in the country.

00:04:07.000 --> 00:04:16.000
As far as I'm aware, other states haven't successfully gone all the way through in making such bans law, although we should check that.

00:04:16.000 --> 00:04:24.000
Oh, I was wrong. Other states have already banned social media for minors without parental consent.

00:04:24.000 --> 00:04:34.000
So I believe that's why this bill is being called the most restrictive, because I believe Arkansas, Ohio, and Utah...

00:04:34.000 --> 00:04:37.000
I think California has a ban as well, although I don't know the details of theirs.

00:04:37.000 --> 00:04:50.000
They've banned minors. I think each of the states has different age limits, but for people under a certain age, those states have banned social media without parental consent,

00:04:50.000 --> 00:04:53.000
which is very interesting. So that already exists.

00:04:53.000 --> 00:04:57.000
I didn't actually know that. We're in Canada, so we don't know.

00:04:57.000 --> 00:05:00.000
Our parents let us do anything we want.

00:05:00.000 --> 00:05:04.000
As long as we say please and thank you and sorry.

00:05:04.000 --> 00:05:07.000
Especially once you're over 18 and you move out.

00:05:07.000 --> 00:05:13.000
Now, what's interesting about this is that, especially, I believe a lot of the states that you mentioned are red states.

00:05:13.000 --> 00:05:20.000
And these sorts of things- Well, I guess you can make the argument, because there's parental permission, that it's not necessarily infringing on parents' rights,

00:05:20.000 --> 00:05:26.000
though one issue that is normally championed by the Republican Party is small government, which this obviously is not.

00:05:26.000 --> 00:05:36.000
Though, I will say about parental permission, one person that had an issue with this Florida bill in its original form was none other than a signer of this bill, Ron DeSantis,

00:05:36.000 --> 00:05:46.000
who actually vetoed it initially because it initially was everyone 16 and under, no social media for you.

00:05:46.000 --> 00:05:53.000
And then I guess he settled, he was like, all right, I'll give it to you guys if you, like, make it parental permission for 14- and 15-year-olds.

00:05:53.000 --> 00:06:03.000
I did find that interesting. One of the earlier articles I was reading about it, like, mentioned that other people were opposing it, but it is interesting to see that,

00:06:03.000 --> 00:06:10.000
because I believe this bill was not put forward by Ron DeSantis, it was put forward by the Republicans.

00:06:10.000 --> 00:06:15.000
State Republicans. Yeah, yeah. And so, you know.

00:06:15.000 --> 00:06:22.000
Or I should say House Republicans, because I know that it was the top legislative priority, according to The Guardian, for Republican State House Speaker Paul Renner.

00:06:22.000 --> 00:06:33.000
Right, right, okay, yeah. So like, we're going to try our best obviously to like not get too political with this discussion, but obviously it is a political issue.

00:06:33.000 --> 00:06:44.000
And as you said, it's interesting, at the very least, to see that this is an issue where, you know, Republicans are at the front of the issue,

00:06:44.000 --> 00:06:47.000
trying to institute these bans.

00:06:47.000 --> 00:06:57.000
Well, in other scenarios, like say, changes to school curriculum, Republicans are for more parents' rights in that scenario.

00:06:57.000 --> 00:07:04.000
In this scenario, one of the things that the Democrats have said against this bill is that, oh, it doesn't respect parents' rights enough.

00:07:04.000 --> 00:07:12.000
So, you know, that's just interesting. At least one, because it's worth noting for accuracy and for, you know, nonpartisanship.

00:07:12.000 --> 00:07:20.000
Some Democrats did join the Republican majority who supported this bill.

00:07:20.000 --> 00:07:23.000
Like there were Democrats who were like, yeah, I'm for it.

00:07:23.000 --> 00:07:29.000
Right, yeah. But like, you also have people like Anna Eskamani, she's a Democratic State House representative,

00:07:29.000 --> 00:07:35.000
and she said in a news release that the alternative would be better to ensure improved parental oversight tools,

00:07:35.000 --> 00:07:42.000
improved access to data to stop bad actors, alongside major investments in Florida mental health systems and programs.

00:07:42.000 --> 00:07:48.000
And I do think that's an interesting point, just because those seem like more general supports.

00:07:48.000 --> 00:07:54.000
And so it wouldn't be so, the bill wouldn't be so one-sided.

00:07:54.000 --> 00:08:01.000
It seems weirdly, weirdly surgical, because there's probably, I mean, there are areas of gray.

00:08:01.000 --> 00:08:09.000
From what I understand from the law, what they're looking at specifically are like, they don't necessarily name any specific platforms,

00:08:09.000 --> 00:08:16.000
but they name ones that have like infinite scrolling, liking as like an ability, these sorts of things.

00:08:16.000 --> 00:08:24.000
So there are like, like with any law, there are things that maybe are similar to what they are trying to get rid of that aren't necessarily covered under the law.

00:08:24.000 --> 00:08:31.000
So having something like to play, you know, I guess devil's advocate, depending on where you land,

00:08:31.000 --> 00:08:41.000
this suggestion of just generally supporting parental oversight tools on all platforms and supporting mental health

00:08:41.000 --> 00:08:47.000
treatments for things like social media addiction, makes some logical sense.

00:08:47.000 --> 00:08:52.000
With any of these bills, there's going to be particularities that are brought up, because we haven't read the bill.

00:08:52.000 --> 00:09:01.000
So like, even if you are broadly for the idea that, you know, children under a certain age shouldn't be allowed to create accounts and use social media,

00:09:01.000 --> 00:09:06.000
you might have an issue with the way that these bills are presented and like some of the language in them.

00:09:06.000 --> 00:09:16.000
And that's- A lot of the opposition to certain bills in the process of government often comes down to, like, okay, the bill is about this,

00:09:16.000 --> 00:09:21.000
but they also have this little section that's like saying that they're going to do this and I'm against that.

00:09:21.000 --> 00:09:26.000
So then you have to oppose the bill broadly and that's, you know, that's why things take so long.

00:09:26.000 --> 00:09:32.000
In case you were wondering why nothing ever gets done, it's because, you know, there's so many safeguards in place.

00:09:32.000 --> 00:09:40.000
If you want something else, then you might have to, you know, try to get that, I don't know, communist revolution going, if that's what you're into.

00:09:40.000 --> 00:09:49.000
You alluded to something earlier that I think is important to remember when we're talking about all these, like, TikTok bans and social media bans and whatnot.

00:09:49.000 --> 00:09:54.000
Spoilers. Yeah, spoilers. We are going to talk about the TikTok ban as well in a few minutes.

00:09:55.000 --> 00:10:06.000
It's that these bans are targeting the features of the platforms themselves, the kind of addictive qualities, the constant notifications you get,

00:10:06.000 --> 00:10:13.000
the algorithmic like tendency to push certain types of content over other types of content.

00:10:13.000 --> 00:10:22.000
It's not the content itself that's the problem, it's the way that the platforms are dealing with it that ends up being just like...

00:10:22.000 --> 00:10:26.000
That are, you know, enticing you to spend more and more time on the platforms.

00:10:26.000 --> 00:10:32.000
You know, at the risk of sounding like an old man, when I was young...

00:10:32.000 --> 00:10:39.000
Well, there you go, you already did it. When I was in high school, you know, we had MSN chat or whatever.

00:10:39.000 --> 00:10:44.000
Facebook really started once I was in second year university or first year and...

00:10:44.000 --> 00:10:47.000
I remember asking some honeys for their MSN Messenger.

00:10:48.000 --> 00:10:51.000
That was a big deal when I worked up the courage to do that.

00:10:51.000 --> 00:10:55.000
See, I was too busy talking to the chatbot.

00:10:55.000 --> 00:11:00.000
There was a chatbot? They had a chatbot on MSN and you could go and talk to them.

00:11:00.000 --> 00:11:08.000
I mean, you could talk to it. It was very, very, very simple and you could kind of exhaust all the possibilities of conversation with it in like a few minutes.

00:11:08.000 --> 00:11:12.000
But it was like... Anyway. It was actually really interesting.

00:11:12.000 --> 00:11:19.000
It was before LLMs and all that obviously. But regardless, I didn't have to deal with this like toxic sludge growing up.

00:11:19.000 --> 00:11:23.000
Yeah. And there is like... And I can't imagine what it's doing.

00:11:23.000 --> 00:11:27.000
I can see why people don't want their kids on social media.

00:11:27.000 --> 00:11:32.000
Especially because like, you know, the one argument for why kids should be on social media is because their friends are on social media.

00:11:32.000 --> 00:11:38.000
And that's not a good argument for anything. Parents have been dismissing that argument for years with bridge-based arguments.

00:11:38.000 --> 00:11:41.000
Yeah. It's bridge-based arguments at the core of-

00:11:41.000 --> 00:11:50.000
Yeah, jumping off bridge arguments. This bill, as we've said, is far from actually taking effect.

00:11:50.000 --> 00:12:02.000
They're definitely going to face legal challenges both from, you know, groups that oppose them, oppose the idea just on like political merits.

00:12:02.000 --> 00:12:08.000
And they're also going to be definitely opposed by the social media companies because you can make a law as much as you want.

00:12:08.000 --> 00:12:18.000
The people that make money? You can make a law, you can make regulations as much as you want, but then it's going to be up to the companies that you're regulating to implement those regulations.

00:12:18.000 --> 00:12:27.000
We're seeing this with the EU and their attempts to wrangle Apple and Google, and Microsoft, who they're apparently being really easy on.

00:12:27.000 --> 00:12:30.000
Doing some of the greatest bureaucratic work of our time.

00:12:30.000 --> 00:12:34.000
Yeah, kind of. There are some missteps, but that's not what this video is about.

00:12:34.000 --> 00:12:40.000
Again, we're not in Europe and Europeans probably have their own thoughts about this legislation and all that.

00:12:40.000 --> 00:12:51.000
But regardless, you know, to use the EU as an example, they say, okay, you have to allow third-party app stores, Apple, and Apple's like, okay, this is how we're going to do it.

00:12:51.000 --> 00:12:54.000
Let us know. Weeks pass. In the comments.

00:12:54.000 --> 00:12:58.000
Weeks pass. The EU finally says, hey, that's not okay.

00:12:58.000 --> 00:13:01.000
And then Apple says, oh, okay, we'll come up with something else.

00:13:01.000 --> 00:13:05.000
Weeks pass. Here's our new law. You know, now I'm going into the future.

00:13:05.000 --> 00:13:09.000
This hasn't happened yet. But it's going to be a process.

00:13:09.000 --> 00:13:15.000
And in the meantime, how many kids' brains will be melted and slurped?

00:13:15.000 --> 00:13:18.000
By Mark Zuckerberg. Yes.

00:13:18.000 --> 00:13:21.000
To fuel his reptilian regime and body.

00:13:21.000 --> 00:13:24.000
To give him the fuel to BJJ Elon Musk.

00:13:24.000 --> 00:13:29.000
Another thing to consider about this, you know, the fallout, is how it affects related companies.

00:13:29.000 --> 00:13:35.000
I mean, I don't know if, uh, Prawn Hub, I think that's how you can say it without-

00:13:35.000 --> 00:13:38.000
They have the best frickin' seafood.

00:13:38.000 --> 00:13:41.000
Without the, without the algorithms catching you.

00:13:41.000 --> 00:13:44.000
Uh, you know, are they a social media site?

00:13:44.000 --> 00:13:47.000
Is YouTube? I think YouTube technically is a social media site.

00:13:47.000 --> 00:13:53.000
There are a lot of interesting things posted on Prawn Hub that are not prawn-based.

00:13:53.000 --> 00:13:56.000
They may be shrimp or crawfish based. That's true.

00:13:56.000 --> 00:13:59.000
But regardless, I mean, you could view that as a form of social media.

00:13:59.000 --> 00:14:03.000
Yeah. People are in the comments saying things that they would also say in YouTube comments.

00:14:03.000 --> 00:14:06.000
I know. I've read our YouTube comments. Actually, that's a great point.

00:14:06.000 --> 00:14:10.000
Will YouTube, will these kids be allowed to go on YouTube?

00:14:10.000 --> 00:14:14.000
Oh. YouTube has a whole kids app.

00:14:14.000 --> 00:14:17.000
No, that's, that's gone now. You sure?

00:14:17.000 --> 00:14:20.000
I'm pretty sure. I'm pretty sure they got rid of YouTube kids.

00:14:20.000 --> 00:14:28.000
You can go ahead and check that out. But they definitely do have like infinite scrolling and liking and, um, all of that.

00:14:28.000 --> 00:14:33.000
So YouTube could be one of those forms of social media that kids are not allowed on,

00:14:33.000 --> 00:14:37.000
which YouTube's definitely gonna have a problem with, because that's big business.

00:14:37.000 --> 00:14:41.000
YouTube, uh, child entertainment on YouTube is massive.

00:14:41.000 --> 00:14:45.000
Look at CoComelon as just one, as the most well-known example.

00:14:45.000 --> 00:14:49.000
And that's just one. Oh, I mean, don't even get me started on crappy kids content.

00:14:49.000 --> 00:14:52.000
Well, yeah, apparently YouTube kids, I think, is still a thing. Okay.

00:14:52.000 --> 00:14:56.000
Um, which I know because I went to use it.

00:14:56.000 --> 00:14:59.000
You have a YouTube kid? I have a YouTube kid.

00:14:59.000 --> 00:15:03.000
He's got his own channel. We're gonna support him.

00:15:03.000 --> 00:15:07.000
Uh, get him, get him sponsor brand deals. Welcome to my channel.

00:15:07.000 --> 00:15:10.000
Today we talk about dinosaurs.

00:15:10.000 --> 00:15:14.000
This, this Dimetrodon.

00:15:15.000 --> 00:15:23.000
Anyways, Pornhub, uh, said they're prepared to block Florida if, uh, the child safety law takes effect,

00:15:23.000 --> 00:15:28.000
because, um, they're gonna require IDs.

00:15:28.000 --> 00:15:34.000
Yeah. And I think that a big part of- That's sort of the early example of this, of this sort of legislation,

00:15:34.000 --> 00:15:39.000
is that even California introduced laws to,

00:15:39.000 --> 00:15:44.000
like, try and ID users to prevent them from going and looking at adult sites,

00:15:44.000 --> 00:15:47.000
like, like Pornhub. Right.

00:15:47.000 --> 00:15:52.000
And that's actually, you know, as well, a fair criticism of this type of law,

00:15:52.000 --> 00:16:01.000
is that, okay, your kids are going online and now you're requiring them to give social media companies

00:16:01.000 --> 00:16:05.000
even more sensitive data about them in the form of, like, their government ID.

00:16:05.000 --> 00:16:09.000
Yeah. So, you know, that's, that's one fair objection to the whole thing.

00:16:09.000 --> 00:16:13.000
Or their frickin' biometric data, which is something that has not been put into effect,

00:16:13.000 --> 00:16:17.000
but it was something that was suggested by, I believe, the United Kingdom,

00:16:17.000 --> 00:16:21.000
because that's something that they're looking into in terms of, uh,

00:16:21.000 --> 00:16:26.000
solutions to preventing, uh, children from, you know, accessing-

00:16:26.000 --> 00:16:31.000
Right. Uh, let's say red band, red band content.

00:16:31.000 --> 00:16:38.000
Right. Yes. So Pornhub, Pornhub is incentivized to, you know, stop this law from being enacted

00:16:38.000 --> 00:16:41.000
because a lot of their users don't want to have to put in IDs and whatnot.

00:16:41.000 --> 00:16:46.000
Uh, but at the same time, I don't want to be ID'd just to watch all of my math tutorials.

00:16:49.000 --> 00:16:54.000
Math, math-ter. There is a dude on there who just does calculus, like, tutorials.

00:16:54.000 --> 00:16:57.000
Does he call it math-terbation? No.

00:16:57.000 --> 00:17:01.000
The other argument that Pornhub gives, though, is the fact that, like, okay,

00:17:01.000 --> 00:17:04.000
you're going to stop people from going on our site,

00:17:04.000 --> 00:17:08.000
but, like, you're regulating, like, we're going to abide by this

00:17:08.000 --> 00:17:12.000
because we're a big company and, you know, we're visible.

00:17:12.000 --> 00:17:15.000
We can lose the penis of you, the United States.

00:17:15.000 --> 00:17:18.000
Sure. But there's, no, I mean, they're just more visible.

00:17:18.000 --> 00:17:26.000
But there are tons and tons and tons more, uh, websites out there

00:17:26.000 --> 00:17:29.000
that are less regulated and less visible that are still going to host porn.

00:17:29.000 --> 00:17:32.000
And then you're just going to push kids and whoever

00:17:32.000 --> 00:17:35.000
more towards these sketchier and sketchier sites. That's their argument.

00:17:35.000 --> 00:17:39.000
Mmm. I, you know, I, I'm not going to give a take on this right now

00:17:39.000 --> 00:17:42.000
because then we're just going to get into the weeds. Yeah.

00:17:42.000 --> 00:17:46.000
Um, but we're going to, uh- People make similar arguments about, like,

00:17:46.000 --> 00:17:51.000
really objectionable opinions and whether you should be able to share those on social media.

00:17:51.000 --> 00:17:56.000
Moderation? Yep. I mean, this is such a deep conversation.

00:17:56.000 --> 00:18:01.000
But the thing that I definitely wanted to mention while we're here is the TikTok ban.

00:18:01.000 --> 00:18:06.000
Mmm. Uh, you know, it's being, it's being widely called the TikTok Ban Bill.

00:18:06.000 --> 00:18:09.000
Mmm. Which is, uh, federal legislation.

00:18:09.000 --> 00:18:12.000
TikTok Ban Bill, we want Bill back.

00:18:12.000 --> 00:18:15.000
Bring him back, TikTok.

00:18:15.000 --> 00:18:18.000
It's being called the Ban Bill, but it's not actually a ban bill.

00:18:18.000 --> 00:18:22.000
It's kind of like a, if you don't do this, then you'll get banned.

00:18:22.000 --> 00:18:25.000
And the thing that they have to do is- It's an offer you can't refuse, Bill.

00:18:25.000 --> 00:18:33.000
Yes. Yeah. They're, they're trying to legislate, uh, that a platform as large and influential,

00:18:33.000 --> 00:18:36.000
uh, in American society as TikTok- And foreign-operated.

00:18:36.000 --> 00:18:40.000
As TikTok has to be owned by an American company.

00:18:40.000 --> 00:18:46.000
So the deal is, ByteDance, the owner of TikTok, has to sell TikTok, uh,

00:18:46.000 --> 00:18:51.000
divest itself of it in order for TikTok to remain operating in the U.S.

00:18:51.000 --> 00:18:54.000
Mmm. And- Without VPNs.

00:18:54.000 --> 00:18:58.000
I don't, I really don't know where I personally sit on this,

00:18:58.000 --> 00:19:02.000
but one thing that we do know is ByteDance doesn't want to sell.

00:19:02.000 --> 00:19:05.000
Well, China doesn't want ByteDance to sell.

00:19:05.000 --> 00:19:08.000
It's both. Well, I mean, yeah. I mean, but that's the whole reason.

00:19:08.000 --> 00:19:11.000
Who wants to, who wants to get rid of a literal pile of money?

00:19:11.000 --> 00:19:15.000
Yeah. No one. No one's gonna, you know, you're right, America.

00:19:15.000 --> 00:19:19.000
I'm going to take this giant pile of money and I'm going to set it on fire Joker style.

00:19:19.000 --> 00:19:22.000
Well, that's the, so this is the whole question.

00:19:22.000 --> 00:19:29.000
It really comes down to, because ostensibly the reason why the U.S. government wants to ban TikTok

00:19:29.000 --> 00:19:36.000
is because of this whole thing that China has with certain, well, with all companies

00:19:36.000 --> 00:19:43.000
technically. I'm not sure whether it's a threshold thing, but companies have to give the Chinese

00:19:43.000 --> 00:19:47.000
government data when the Chinese government comes knocking.

00:19:47.000 --> 00:19:54.000
And the U.S., knowing about this law, has, not suddenly, I mean, for years now-

00:19:54.000 --> 00:19:59.000
They've done things. They've banned it from work phones for government officials.

00:19:59.000 --> 00:20:03.000
But the reason, the reason they're doing all that is because now they're treating basically

00:20:03.000 --> 00:20:07.000
every Chinese company as if it's the CCP.

00:20:07.000 --> 00:20:12.000
Yeah. And I mean, that remains to be seen, whether that's fair, because we don't know.

00:20:12.000 --> 00:20:18.000
Not without reason, because there were reports that came out that ByteDance employees were

00:20:18.000 --> 00:20:22.000
able to access the location of journalists.

00:20:22.000 --> 00:20:30.000
Yes. Using TikTok. I think that unquestionably- Well, but the question is, are they giving that data to the

00:20:30.000 --> 00:20:38.000
government? Yeah. It's one, like obviously social media companies are tracking our every thought, you know, but

00:20:38.000 --> 00:20:45.000
the question is, are ByteDance and other big Chinese companies that have been

00:20:45.000 --> 00:20:54.000
the bad guy in the tech world for a spell, like Huawei, actually guilty of just straight

00:20:54.000 --> 00:20:57.000
feeding data from users to the Chinese government?

00:20:57.000 --> 00:21:04.000
And I don't think you're crazy if you think that that's the case. I think that it's reasonable to think that if you're dealing with a Chinese company that

00:21:04.000 --> 00:21:09.000
the CCP could easily get their hands on that data if they don't already.

00:21:09.000 --> 00:21:17.000
However, I also don't know how much we should act as if, when we're dealing with TikTok,

00:21:17.000 --> 00:21:21.000
the CCP owns all these companies, because they don't own them.

00:21:21.000 --> 00:21:29.000
Yeah, but they have incredible influence on them. But even if they're not giving location to the CCP, another question that I would raise

00:21:29.000 --> 00:21:33.000
because of the point I raised, because I double checked it and I looked at sources now, is

00:21:33.000 --> 00:21:39.000
do you want to trust a company that even if they're not giving your location to China,

00:21:39.000 --> 00:21:46.000
their employees can improperly access data on the location of journalists?

00:21:46.000 --> 00:21:50.000
Yeah, and apparently they've done things to make that better.

00:21:50.000 --> 00:21:58.000
They did, like, move their servers, so they store North American users' data in North

00:21:58.000 --> 00:22:03.000
American servers, and I believe that elsewhere data is stored in Singapore.

00:22:03.000 --> 00:22:08.000
I mean, they're located in Singapore. I think European users' data is stored on the European server.

00:22:08.000 --> 00:22:12.000
But, you know, I don't know.

00:22:12.000 --> 00:22:23.000
There's so many unknowns. The other argument, of course, is why is America just doing this for the one Chinese social media platform?

00:22:23.000 --> 00:22:29.000
What about all the American ones? The real-time location of all those 14-year-olds is the property of the U.S. government.

00:22:29.000 --> 00:22:33.000
Yeah, yeah, yeah. I mean, right.

00:22:33.000 --> 00:22:41.000
Is that a double standard? This is kind of why when it comes to Chinese phones, I have, like, I don't know how to feel.

00:22:41.000 --> 00:22:48.000
Because on the one hand, you have your North American phones, or whatever, or non-Chinese phones,

00:22:48.000 --> 00:22:53.000
that are almost certainly giving data to these big social media companies

00:22:53.000 --> 00:22:57.000
who may or may not give it to the government if they ask.

00:22:57.000 --> 00:23:01.000
On the other hand, you have these Chinese phones who are like,

00:23:01.000 --> 00:23:06.000
let's say, for the sake of argument, they're certainly giving it to the CCP.

00:23:06.000 --> 00:23:10.000
Support American-made spyware. Which one's better?

00:23:10.000 --> 00:23:16.000
It's almost like I would rather have my data being harvested by someone who barely,

00:23:16.000 --> 00:23:19.000
like an entity that barely affects my life.

00:23:19.000 --> 00:23:23.000
The CCP may be through, like, broad economic trends, like it gets around to me,

00:23:23.000 --> 00:23:30.000
but, like, if they know, you know, my preferences and where I work, it's like,

00:23:30.000 --> 00:23:34.000
do I care more about that or do I care about, like, a local company,

00:23:34.000 --> 00:23:39.000
or a local government agency that is going to exert more power over my life directly?

00:23:39.000 --> 00:23:42.000
Do I care about them having it? It's like, I don't know.

00:23:42.000 --> 00:23:50.000
I don't know. We live in a world where, if I'm not mistaken, a few years ago,

00:23:50.000 --> 00:23:56.000
there was a lovely young woman who was, I believe, the student body president

00:23:56.000 --> 00:24:00.000
of the University of Toronto, and she just happened to be from Tibet.

00:24:00.000 --> 00:24:05.000
And when she was elected, or when she made a certain post,

00:24:05.000 --> 00:24:14.000
wouldn't you know it? She was bombarded with negative sentiments from Chinese-speaking people.

00:24:14.000 --> 00:24:21.000
And one of the things that came up was whether or not that was foreign action.

00:24:21.000 --> 00:24:29.000
Like, these are possibly foreign agents trying to fight the, you know,

00:24:29.000 --> 00:24:34.000
what's the most diplomatic way for me to say this?

00:24:34.000 --> 00:24:37.000
That Tibet exists?

00:24:37.000 --> 00:24:43.000
To fight that, whether Tibet is its own thing or whether it belongs to China?

00:24:43.000 --> 00:24:46.000
For a lot of people in the world, that may be a controversial statement. Yeah, it is.

00:24:46.000 --> 00:24:52.000
It is for a lot of people in the world. So, like, that was something that straight up affected Canadians.

00:24:52.000 --> 00:25:01.000
Right. And so, on the one hand, you might say that China having your preferences is the entity

00:25:01.000 --> 00:25:06.000
that's going to least affect you on a daily basis. But on the other hand, it might be the one that most affects you on a daily basis

00:25:06.000 --> 00:25:11.000
because they literally, they may or may not have agents that are just, their whole job

00:25:11.000 --> 00:25:14.000
is to, like, just hurt people on social media. Yeah.

00:25:14.000 --> 00:25:20.000
Well, I mean, so, like, you kind of bring up an important question because it's like,

00:25:20.000 --> 00:25:28.000
you could view TikTok as a foreign agent if you're making this connection between them

00:25:28.000 --> 00:25:34.000
and the CCP. And then it's like, okay, say, for the sake of argument, that there's a connection there.

00:25:34.000 --> 00:25:42.000
We have already, like, as it regards this bill, there has been China-influenced action

00:25:42.000 --> 00:25:45.000
on American citizens. Yeah.

00:25:45.000 --> 00:25:52.000
And once news of this bill came out, TikTok pushed a notification

00:25:52.000 --> 00:25:56.000
to TikTok users saying, hey, this bill is being passed through Congress.

00:25:56.000 --> 00:25:59.000
If they pass it. No TikTok for you.

00:25:59.000 --> 00:26:04.000
TikTok could be banned. They're saying that it's not a ban, but, like, trust us, it's a ban.

00:26:04.000 --> 00:26:09.000
And then all the people who depend on TikTok for their businesses and who, like, for them,

00:26:09.000 --> 00:26:12.000
it's like a way of life. Or just like dancing.

00:26:12.000 --> 00:26:20.000
Sure. They were told by TikTok to call their representatives, and they did, en masse.

00:26:20.000 --> 00:26:23.000
Phone lines were flooded.

00:26:23.000 --> 00:26:27.000
Threats were made. Well, I mean, that's just what happens.

00:26:27.000 --> 00:26:34.000
Yeah. So, like, you have a foreign-owned company telling its American user base, which is huge,

00:26:34.000 --> 00:26:41.000
to go and take some action that affects the U.S. government, and it happens.

00:26:41.000 --> 00:26:44.000
That's a little scary. It is a little scary. It is a little scary.

00:26:44.000 --> 00:26:49.000
I mean, I feel like... Like, it's just calling, but... I feel like Meta has tried to do something similar, and they...

00:26:49.000 --> 00:26:52.000
I don't know if they succeeded or failed, but does that...

00:26:52.000 --> 00:26:58.000
What is China doing that the U.S. is failing at, that they can get this loyal following

00:26:58.000 --> 00:27:02.000
and get young people actually interested in politics?

00:27:02.000 --> 00:27:06.000
Well, apparently, the thing that TikTok is doing well is their algorithm.

00:27:06.000 --> 00:27:13.600
Yes. It's widely regarded as the best recommendation algorithm in the industry, and a big reason

00:27:13.600 --> 00:27:22.200
why... It works so fast. It's so fast, and it's so good at serving people exactly what they want to see and what

00:27:22.200 --> 00:27:30.000
will be interesting to them. And so, like you said earlier, China doesn't want to sell, ByteDance doesn't want to sell.

00:27:30.000 --> 00:27:33.000
The reason China is definitely...

00:27:33.000 --> 00:27:40.500
The reason they are invested in not selling is because in order to sell TikTok to an American owner

00:27:40.500 --> 00:27:49.000
or whatever, they would have to grant a technology export license for the TikTok algorithm, which

00:27:49.000 --> 00:27:57.000
in their mind is like, this is like a Chinese asset. This is like innovation that happened here, and now we're going to give it to someone

00:27:57.000 --> 00:28:05.000
else. They don't want to do that, obviously. It would be like Oppenheimer granting an export license on the nuke to Russia.

00:28:05.000 --> 00:28:11.000
It would be exactly like that. I haven't seen the movie, but I'm assuming.

00:28:11.000 --> 00:28:14.000
He didn't grant it. They got it anyway. They did.

00:28:14.000 --> 00:28:21.000
Via espionage. Yeah, but we've been trying. We can't build our own nuke, a.k.a. good algorithms.

00:28:21.000 --> 00:28:26.600
So, yeah, apparently the bill, the TikTok ban bill, quote unquote, is stalled in the Senate

00:28:26.600 --> 00:28:33.100
right now, after the House passed it like crazy, the federal Congress.

00:28:33.100 --> 00:28:37.760
It was a landslide, and now it's got to the Senate, and now people are kind of like waking

00:28:37.760 --> 00:28:41.120
up and realizing, wait a second, this is a bit more complicated.

00:28:41.120 --> 00:28:47.480
This is a tech company. They're regulated by a buttload of different government committees and agencies, and so

00:28:47.480 --> 00:28:55.400
that's going to be complicated. That brings up an interesting point on why this bill might be stalled.

00:28:55.400 --> 00:28:59.640
Is this the right way to go about this? Because not only was this bill happening, but now

00:28:59.640 --> 00:29:03.400
it's come out that apparently the FTC could sue them.

00:29:03.400 --> 00:29:07.440
Now, if you don't know what the Federal Trade Commission is, you might know their little

00:29:07.440 --> 00:29:13.160
known failure where they didn't stop Microsoft from acquiring Activision Blizzard.

00:29:13.160 --> 00:29:16.160
They're still trying. Even though...

00:29:16.160 --> 00:29:19.720
They're doing their darndest. Everyone else is acting like it's already happened. Yeah.

00:29:19.720 --> 00:29:23.520
Well, it did already happen. What they're trying to do now is make them divest. Right.

00:29:23.520 --> 00:29:26.640
Good luck. Reverse. Uno reverse card.

00:29:26.640 --> 00:29:33.740
Shot, shot, slide. Reverse, reverse. But now it's come out, at least according to a few sources that we've seen, one from

00:29:33.740 --> 00:29:40.840
Politico, that they are investigating TikTok over faulty privacy and data security practices,

00:29:40.840 --> 00:29:49.320
and they could sue not just TikTok, but maybe also ByteDance, because they're weighing the

00:29:49.320 --> 00:29:55.760
quote from the Politico source. They're weighing allegations that TikTok and its Beijing-based parent company ByteDance

00:29:55.760 --> 00:30:00.320
deceived its users by denying that individuals in China had access to their data and also

00:30:00.320 --> 00:30:08.040
violated the children's privacy law. So even if the TikTok ban bill doesn't go all the way through, they're still going to

00:30:08.040 --> 00:30:17.440
be in trouble in one way or another. It's hard to see how this is going to end with TikTok just chilling.

00:30:17.440 --> 00:30:23.200
It's like nothing changes, because the government clearly wants something. Something's got to give.

00:30:23.200 --> 00:30:28.680
They want their pound of flesh. So we'll have to see.

00:30:28.680 --> 00:30:33.180
There are so many questions about whether this is actually the proper

00:30:33.180 --> 00:30:41.040
role of government to try and regulate platforms in the way that it's trying to do.

00:30:41.040 --> 00:30:45.600
Like obviously state governments are doing their own thing, and the federal government

00:30:45.720 --> 00:30:48.720
has national security issues.

00:30:48.720 --> 00:30:56.200
Those are at play here. But I don't know, it comes down to whether you believe that you have a fundamental constitutional

00:30:56.200 --> 00:31:01.240
right to swipe up and down all day, and whether your kids do too.

00:31:01.240 --> 00:31:04.880
And occasionally swipe left and swipe right. So that's all from us.

00:31:04.880 --> 00:31:10.120
There's a little meandering today, but there's a lot of topics that we kind of wanted to

00:31:10.120 --> 00:31:15.720
cover. There are a lot of questions to ask. Let us know in the comments what you think about the social media bans, and whether

00:31:15.720 --> 00:31:20.040
you think TikTok is going to cause climate change.

00:31:20.040 --> 00:31:27.720
And yeah, maybe we'll see you in the next one. And then also tell us, what is your favorite animal for a little bit of positivity?

00:31:27.720 --> 00:31:31.760
Just tell us what your favorite animal is, and then tell us why we're wrong on everything.

00:31:31.760 --> 00:31:35.680
And why is it the red panda, even though there's like 1,200 left of them in the wild?

00:31:35.680 --> 00:31:39.560
I don't know why you chose the one, like one of the two Chinese animals, like they're

00:31:39.560 --> 00:31:45.320
going to think we're... I love the red panda. They're going to think we're like false flags now.

00:31:45.320 --> 00:31:49.960
We've been bought by China. Watch Three Body Problem, and then you won't think the CCP's so great.

00:31:51.960 --> 00:31:55.440
Okay, subscribe to TechLinked. See you later.

00:31:55.440 --> 00:31:58.440
Bye. I love you too.

00:31:58.440 --> 00:31:58.940
Oh, okay there.
