WEBVTT

00:00:00.000 --> 00:00:03.760
Okay, stand back. Stand back! We've got a dangerous amount of tech news

00:00:03.760 --> 00:00:08.480
coming through! You trying to get your face burned off? Get back! Back up!

00:00:08.480 --> 00:00:12.200
When tech news gets too close, it burns your face off. Ah!

00:00:12.200 --> 00:00:15.360
Andy, no. iPhone 16 will come with a pretty hefty upgrade

00:00:15.360 --> 00:00:19.040
according to a new report from Taiwanese site Economic Daily News.

00:00:19.040 --> 00:00:22.320
Their sources indicate Apple's next gen A18 chips

00:00:22.320 --> 00:00:27.720
for mobile and M4 chips for Macs will likely feature an upgraded neural engine

00:00:27.720 --> 00:00:33.160
with significantly more AI cores, which makes sense given that Tim Cook promised

00:00:33.160 --> 00:00:36.480
to give us huge AI announcements if we were good this year.

00:00:37.680 --> 00:00:42.160
Other recent reports have indicated Apple will be launching a suite of generative features

00:00:42.160 --> 00:00:47.080
as part of iOS 18, most likely at WWDC in June.

00:00:47.080 --> 00:00:54.040
Notably, Apple Silicon's standard neural engine has had 16 cores since the iPhone 12 back in 2020.

00:00:54.040 --> 00:00:59.240
What is your grandma, a neural processor? However, certain configurations of the Mac Studio

00:00:59.240 --> 00:01:03.480
and the Mac Pro have had neural engines with up to 32 cores,

00:01:03.480 --> 00:01:07.680
which my AI assistant tells me is twice as many.

00:01:07.680 --> 00:01:12.560
And you'll definitely want to get one of these new phones because Apple is currently rolling out

00:01:12.560 --> 00:01:17.080
quantum-proof encryption protection in what the company is calling

00:01:17.080 --> 00:01:20.320
the most significant cryptographic security upgrade

00:01:20.320 --> 00:01:24.680
in iMessage history, including all the other ones you remember.

00:01:24.720 --> 00:01:30.600
It's a preventative measure to help keep you safe from getting hacked by computers that don't even exist yet.

00:01:30.600 --> 00:01:36.720
Why be safe today when you can be safe tomorrow? We released a patch for a bug that wasn't even...

00:01:36.720 --> 00:01:39.960
What won't protect your phone, however, is rice,

00:01:39.960 --> 00:01:44.200
especially from hackers, according to new guidance published by Apple Support,

00:01:44.200 --> 00:01:49.320
who recommend leaving your phone to dry somewhere with decent airflow rather than shoving it

00:01:49.320 --> 00:01:54.080
in a sack full of edible grain, which might get into the device and damage it.

00:01:54.080 --> 00:02:00.200
I, however, would like additional clarification. What about oatmeal, barley, quinoa?

00:02:00.200 --> 00:02:03.880
You're just showing off about how many grains you know. Grits!

00:02:03.880 --> 00:02:07.080
What'd you say, rits? Grits! What?

00:02:07.080 --> 00:02:10.920
What if I put a fan in with the rice? Is that enough airflow? Psh!

00:02:10.920 --> 00:02:13.960
The European Commission has announced a formal investigation

00:02:13.960 --> 00:02:18.320
into whether TikTok is out of compliance with the Digital Services Act,

00:02:18.320 --> 00:02:23.800
specifically when it comes to the protection of minors and potentially addictive design features.

00:02:23.800 --> 00:02:28.360
TikTok will be the second platform investigated under the DSA following Twitter,

00:02:28.360 --> 00:02:32.280
which drew the commission's ire in part because they have a moderation team

00:02:32.280 --> 00:02:36.400
that's barely even big enough to staff and control a large Waffle House.

00:02:36.400 --> 00:02:40.160
But they try. Both TikTok and Twitter are potentially in trouble

00:02:40.160 --> 00:02:43.400
with the EU over failure to meet the DSA's standards

00:02:43.400 --> 00:02:47.040
for advertiser transparency and access to research data.

00:02:47.040 --> 00:02:52.760
But TikTok has received far more attention for the harm it might pose to children and adolescents

00:02:52.800 --> 00:02:57.520
because Twitter is for old people. I feel attacked. Guys, we have to make more TikToks.

00:02:57.520 --> 00:03:01.160
We don't make any, right? According to an EC press officer speaking to Ars

00:03:01.160 --> 00:03:06.240
Technica, the commission suspects TikTok of failing to ensure its default settings

00:03:06.240 --> 00:03:09.480
will appropriately protect the safety and privacy of minors,

00:03:09.480 --> 00:03:16.440
as well as neglecting to properly assess the risk of 13 to 17 year olds pretending to be adults on the app,

00:03:16.440 --> 00:03:19.600
which is less of a risk and more of a certainty.

00:03:19.600 --> 00:03:23.560
But TikTok is far from the only company in Europe's bad books,

00:03:23.560 --> 00:03:29.000
as the EU will reportedly be fining Apple around 500 million euros next month,

00:03:29.000 --> 00:03:34.600
following an antitrust probe into a formal complaint from Spotify that Apple was using its platform

00:03:34.600 --> 00:03:39.200
to preference its own music streaming service. This is so sad.

00:03:39.200 --> 00:03:43.920
Siri, play Despacito. Google suddenly dropped a couple

00:03:43.920 --> 00:03:49.000
of lightweight open source AI models inspired by the same tech

00:03:49.000 --> 00:03:54.080
used in their flagship model, Gemini. So naturally they called it Gemma,

00:03:54.080 --> 00:03:57.600
the sound you'd make if you gave up saying the word Gemini halfway through.

00:03:57.600 --> 00:04:01.880
Why use ChatGPT when you can use the Gemma? Actually, it's pronounced Gemma.

00:04:01.880 --> 00:04:08.600
Is it? No. Okay. Google says Gemma 2B and 7B can be run locally,

00:04:08.600 --> 00:04:11.600
do better than larger models on key benchmarks,

00:04:11.600 --> 00:04:16.160
although we've heard that before, and are part of what DeepMind CEO

00:04:16.160 --> 00:04:19.240
Demis Hassabis calls the company's long history

00:04:19.280 --> 00:04:22.360
of supporting responsible open source and science,

00:04:22.360 --> 00:04:26.080
which apparently did nothing to stop OpenAI from snagging the AI crown.

00:04:26.080 --> 00:04:30.640
But hey, sure, you know, whatever. Although if ChatGPT continues occasionally

00:04:30.640 --> 00:04:33.760
losing its damn mind as it did early this morning,

00:04:33.760 --> 00:04:37.560
Google may have a shot. Reports flooded in from ChatGPT users

00:04:37.560 --> 00:04:42.840
about getting responses that were everything from overly verbose to full on gibberish

00:04:42.840 --> 00:04:47.240
until OpenAI correctly diagnosed and fixed the bug in about 24 hours.

00:04:47.240 --> 00:04:51.840
This follows OpenAI having to step in after widespread reports of its chatbot

00:04:51.840 --> 00:04:56.040
slacking off around the holiday season. Okay, say it with me.

00:04:56.040 --> 00:04:59.200
It thinks it's people.

00:04:59.200 --> 00:05:03.760
The following five quick bits have not been approved by the Food and Drug Administration.

00:05:03.760 --> 00:05:08.360
It would be weird if they were, you can't ingest news, or can you?

00:05:08.360 --> 00:05:13.320
A federal appeals court has overturned a $1 billion penalty that was leveled

00:05:13.680 --> 00:05:17.240
against cable ISP Cox Communications in 2019.

00:05:17.240 --> 00:05:22.080
According to the judge, Sony did not prove its case that Cox received direct financial benefit

00:05:22.080 --> 00:05:26.280
from the piracy and copyright infringement committed by its users.

00:05:26.280 --> 00:05:30.360
Cox may still face penalties due to the finding that it failed to combat piracy,

00:05:30.360 --> 00:05:33.520
making it a willful contributor to the infringement,

00:05:33.520 --> 00:05:37.800
but such a penalty will likely be far lighter than the damages originally awarded.

00:05:37.800 --> 00:05:42.200
They didn't actually make money from you downloading all nine seasons of Seinfeld.

00:05:42.200 --> 00:05:45.240
They just stood there and watched it happen. Great finale.

00:05:45.240 --> 00:05:48.240
Elon Musk claims the first Neuralink patient

00:05:48.240 --> 00:05:53.240
can already move a computer mouse with their mind, a sentence that would be way more impressive

00:05:53.240 --> 00:05:56.240
if it didn't include the word computer.

00:05:56.240 --> 00:06:01.520
While Musk has stated patient A can move a mouse around the screen just by thinking,

00:06:01.520 --> 00:06:06.080
that's really all anyone knows at this point. This has caused concern among some researchers

00:06:06.080 --> 00:06:09.080
who believe the public deserves a bit more transparency

00:06:09.080 --> 00:06:12.120
because people care deeply about their brains.

00:06:13.080 --> 00:06:18.240
Using their brains. Yeah, I care deeply about how my soyboy brain

00:06:18.240 --> 00:06:21.680
is completely incapable of telekinesis. Hurry it up, Elon.

00:06:21.680 --> 00:06:24.960
I wanna be a Jedi, colon, Survivor.

00:06:24.960 --> 00:06:28.920
A coalition of AI experts, academics, and industry executives,

00:06:28.920 --> 00:06:32.360
including deep learning pioneer Yoshua Bengio,

00:06:32.360 --> 00:06:37.800
has published an open letter calling for more regulation and safeguards for deepfakes.

00:06:37.800 --> 00:06:41.280
According to the letter, which currently has over 600 signatures,

00:06:41.280 --> 00:06:45.840
the rapid advancement of deepfake technology means that they have become a common element

00:06:45.840 --> 00:06:50.720
of fraud schemes, disinformation campaigns, and non-consensual sexual imagery.

00:06:50.720 --> 00:06:54.520
Honestly, it's tragic to see people using AI to impersonate and deceive people

00:06:54.520 --> 00:06:59.960
when they could instead be like Will Smith, who commemorated last year's viral AI monstrosity

00:06:59.960 --> 00:07:05.600
by pretending to be a machine-generated version of himself eating spaghetti in the worst way possible.

00:07:05.600 --> 00:07:10.600
I call it a shallow, real deepfake.

00:07:10.640 --> 00:07:13.800
Y'all better watch yourself. Taylor Swift's gonna hunt you for sport.

00:07:13.800 --> 00:07:17.720
Last week, 13,000 owners of Wyze home security cameras

00:07:17.720 --> 00:07:20.760
were able to see images, and in some cases, video,

00:07:20.760 --> 00:07:24.520
from cameras that weren't theirs due to a bizarre glitch.

00:07:24.520 --> 00:07:27.840
Even worse, a similar issue occurred just five months ago.

00:07:27.840 --> 00:07:32.400
In an email, Wyze referred to this newest incident as a security issue that occurred

00:07:32.400 --> 00:07:35.560
due to a third-party caching client library

00:07:35.560 --> 00:07:39.200
the company had started using. Like they did five months ago, Wyze said

00:07:39.280 --> 00:07:44.640
they made changes to their system so that something like this never happens again, again.

00:07:44.640 --> 00:07:47.880
And a nearly three-decade-old European satellite

00:07:47.880 --> 00:07:52.120
made an uncontrolled re-entry into the Earth's atmosphere earlier today.

00:07:52.120 --> 00:07:55.920
The ERS-2 satellite, which CBS described as weighing

00:07:55.920 --> 00:07:59.080
about as much as an adult male rhinoceros,

00:07:59.080 --> 00:08:03.160
so we can all get a ballpark. That's about... pretty heavy.

00:08:03.160 --> 00:08:08.400
was decommissioned and had the last of its fuel used up in 2011 in order to prevent

00:08:08.400 --> 00:08:13.240
any potential catastrophic explosion, meaning its descent couldn't be controlled

00:08:13.240 --> 00:08:17.760
as it spent the next 13 years slowly falling back to Earth

00:08:17.760 --> 00:08:21.760
while it thought about what it's done. In the future, space debris might be managed

00:08:21.760 --> 00:08:26.280
by specialized junk removers like the ADRAS-J satellite

00:08:26.280 --> 00:08:32.080
that launched this past weekend. In the meantime, however, most of ERS-2's rhino-like mass

00:08:32.080 --> 00:08:36.600
has burned up in the atmosphere, and its remaining chunks are raining down

00:08:36.640 --> 00:08:39.920
across a 100-mile-wide patch of the Pacific Ocean

00:08:39.920 --> 00:08:43.240
in a decidedly un-rhino-like manner.

00:08:43.240 --> 00:08:47.920
And also, unlike a rhino, you should come back on Friday for more tech news.

00:08:47.920 --> 00:08:52.320
I mean, hey, no offense if you're a rhino. I just, I don't know if it'd be really your kind of thing.

00:08:52.320 --> 00:08:56.000
I just don't, we don't have a rhino section in the demographics of,

00:08:56.000 --> 00:08:57.320
Rhinos are cool, man.
