1
00:00:00,080 --> 00:00:07,359
Okay, right from the jump here, I want to remind you that you chose to click

2
00:00:04,000 --> 00:00:09,760
this video. Okay, so whatever happens

3
00:00:07,359 --> 00:00:13,840
after this point is at least 50% your fault. Microsoft has released yet

4
00:00:11,519 --> 00:00:18,080
another update that fixes one thing and breaks something else. After last week's

5
00:00:16,320 --> 00:00:22,480
patch for Windows Defender suffered installation issues that the tech giant

6
00:00:20,000 --> 00:00:27,119
eventually fixed, another issue causing certain storage drives to fail during

7
00:00:24,800 --> 00:00:32,079
heavy writes was discovered by some kind of Japanese Twitter cat person with an

8
00:00:29,840 --> 00:00:38,079
apparent drinking problem. Despite this, Necoru_cat was able to conduct further

9
00:00:34,960 --> 00:00:40,480
testing along with Japanese site Niche

10
00:00:38,079 --> 00:00:45,360
PC Gamer, identifying these drives from various brands that exhibit issues or

11
00:00:43,200 --> 00:00:50,239
straight up fail when attempting file transfers larger than 50 GB. This issue

12
00:00:48,480 --> 00:00:55,920
hasn't gotten a ton of coverage in the West. It may be that users in Japan just

13
00:00:52,640 --> 00:00:58,399
have larger collections of cat photos to

14
00:00:55,920 --> 00:01:03,199
deal with. Huh. On the plus side, an upcoming Windows build will reportedly

15
00:01:00,640 --> 00:01:08,560
finally allow file operation dialogs to support dark mode, so you won't

16
00:01:06,240 --> 00:01:12,880
suddenly be blinded just trying to copy and paste something. Microsoft hasn't

17
00:01:10,560 --> 00:01:17,280
added this until now because Windows is just a really long, complicated prank

18
00:01:15,040 --> 00:01:22,080
for them. One has to assume that we're all on camera being displayed on live

19
00:01:19,759 --> 00:01:26,640
feeds in Microsoft HQ like we're unknowingly part of a wacky Japanese

20
00:01:24,320 --> 00:01:30,880
game show with the staff there almost certainly making bets on when we'll all

21
00:01:28,880 --> 00:01:36,400
switch to Linux. This was part of the report. Guitar YouTuber Rhett Shull's

22
00:01:33,759 --> 00:01:41,840
latest video goes over a concerning new phenomenon, at least to me. YouTube

23
00:01:39,119 --> 00:01:48,640
appears to be automatically applying AI upscaling to shorts, making real videos

24
00:01:45,840 --> 00:01:52,880
look kind of AI generated. This is super noticeable when you compare identical

25
00:01:50,399 --> 00:01:58,079
videos uploaded both as a YouTube short and as an Instagram reel. And this seems

26
00:01:55,520 --> 00:02:02,000
to be happening on all shorts uploads, although we obviously didn't check

27
00:01:59,759 --> 00:02:05,840
every one. Don't worry, I'll do that tonight instead of sleeping. Seems like

28
00:02:03,840 --> 00:02:11,680
the first post about this was made on the YouTube subreddit in mid June by

29
00:02:08,399 --> 00:02:13,680
Redditor Ulins Cesis, who said they had

30
00:02:11,680 --> 00:02:18,879
noticed the change kick in around that time. Now, why YouTube is making every

31
00:02:16,720 --> 00:02:23,520
short look even more gross than they already tend to is anyone's guess, but

32
00:02:21,440 --> 00:02:28,640
some are theorizing it's to get people used to AI looking videos so they'll be

33
00:02:26,080 --> 00:02:34,000
ready for the AI content flood once YouTube fully adds the Veo 3 AI video

34
00:02:31,520 --> 00:02:37,440
generator into its shorts tools. But, I don't know. I feel like it's equally

35
00:02:35,440 --> 00:02:42,080
likely that the people at YouTube making these decisions simply wouldn't know

36
00:02:39,519 --> 00:02:46,959
taste if it licked them in the face. As Shull points out in his video, one major

37
00:02:44,560 --> 00:02:52,000
issue here is that viewers may think a creator is using AI when they're not.

38
00:02:50,000 --> 00:02:56,000
And even though the real Riley perished some time ago and I'm just an AI trained

39
00:02:54,319 --> 00:03:01,280
on his brain waves and tattered fragments of his mustache, we don't fake

40
00:02:58,319 --> 00:03:06,480
these episodes. We film them for real on location

41
00:03:04,000 --> 00:03:10,560
somewhere. NVIDIA held a live stream to make a bunch of announcements about

42
00:03:07,920 --> 00:03:14,560
GeForce Now and the NVIDIA app after someone apparently reminded them that

43
00:03:12,400 --> 00:03:20,800
gamers exist. NVIDIA's cloud gaming service can now run on RTX 5080 equipped

44
00:03:17,519 --> 00:03:24,800
servers with DLSS 4 enabling up to 120

45
00:03:20,800 --> 00:03:27,360
FPS at 5K on PC and Mac and up to 90 FPS

46
00:03:24,800 --> 00:03:32,239
on the Steam Deck GeForce Now app. Meanwhile, the NVIDIA app can now

47
00:03:29,519 --> 00:03:36,640
automatically enable the DLSS of your choice across all your games. And it

48
00:03:34,720 --> 00:03:41,040
adds some more settings from the NVIDIA control panel, like NVIDIA surround

49
00:03:38,879 --> 00:03:45,040
setup, so it can inch closer to being the control panel replacement I thought

50
00:03:43,040 --> 00:03:48,480
it was supposed to be already. But back to GeForce Now, Andrew is going to tell

51
00:03:47,040 --> 00:03:52,000
us about more features, >> including one that expands the GFN

52
00:03:50,239 --> 00:03:56,480
library instantly. Right, Andrew? >> That's right. >> It is right, Andrew. And it's called

53
00:03:54,480 --> 00:04:01,599
Install to Play. Basically, you can install games on your cloud PC if the

54
00:03:58,959 --> 00:04:06,080
game dev enabled Steam Cloud streaming. But what if you don't want to install to

55
00:04:03,599 --> 00:04:10,560
play and also want to play inside the Discord app?

56
00:04:08,319 --> 00:04:15,519
I know you've been waiting for that. The new Discord instant play experience lets

57
00:04:13,519 --> 00:04:20,880
you click a link from a friend in that app to instantly load into a game.

58
00:04:18,560 --> 00:04:25,600
Seems like it's just Fortnite for now. Kind of like Google's original vision of

59
00:04:23,199 --> 00:04:30,960
clicking right into a Stadia game from a Google search except inside a Discord

60
00:04:27,840 --> 00:04:33,919
window. Uh so as NVIDIA says, no more

61
00:04:30,960 --> 00:04:38,960
waiting, no more FOMO uh until the 30-minute trial runs out. Then you'll

62
00:04:36,320 --> 00:04:43,919
have to buy the free-to-play game. But don't worry, Xbox may be prepping a

63
00:04:41,120 --> 00:04:48,880
cheaper cloud-only Game Pass subscription along with a bunch of other stuff as

64
00:04:45,680 --> 00:04:51,199
discussed on the Xbox podcast. An Xbox exec

65
00:04:48,880 --> 00:04:55,600
said it's the hardest they've ever seen the team working because apparently

66
00:04:53,120 --> 00:04:58,639
there's just so much going on. Or maybe the team is desperately trying not to be

67
00:04:57,280 --> 00:05:02,880
in the next group of a thousand Microsoft employees to get canned. I

68
00:05:00,720 --> 00:05:08,800
don't know. Also, it's the sponsor spot now. The MSI Crosshair 18 HX AI is one of

69
00:05:06,400 --> 00:05:13,120
the most powerful RTX 5070 gaming laptops that provide a smooth gaming

70
00:05:10,639 --> 00:05:19,120
experience in unique chassis designs at an affordable price. Do I need to say

71
00:05:16,000 --> 00:05:21,120
more? Or, like, how it's got an 18-inch QHD

72
00:05:19,120 --> 00:05:25,280
240 Hz display so you can get clarity and smooth motion. Hm. Are you really

73
00:05:23,759 --> 00:05:31,199
going to make me point out that it comes with up to an RTX 5070 8 GB Laptop GPU with

74
00:05:28,800 --> 00:05:36,240
exclusive RTX 50 Series features like DLSS 4 and Reflex 2, and that, thanks to

75
00:05:34,320 --> 00:05:39,440
MSI OverBoost technology, this is like one of the most powerful RTX 5070

76
00:05:37,840 --> 00:05:44,080
laptops. I mean, it's got a 24-zone RGB keyboard

77
00:05:42,320 --> 00:05:47,759
with uniquely designed illuminated key caps. I think we got the idea at this

78
00:05:46,080 --> 00:05:53,360
point. So, just check out the MSI Crosshair 18 HX AI at the link below.

79
00:05:51,280 --> 00:05:57,919
So, hm, have you understood the ramifications of what you've done yet?

80
00:05:55,919 --> 00:06:02,400
Because there's still quick bits. Valve made some bold claims about its Steam

81
00:06:00,240 --> 00:06:07,759
performance overlay more accurately reporting GPU utilization than Windows'

82
00:06:05,360 --> 00:06:11,919
own Task Manager in the patch notes for their latest Steam client beta. But they

83
00:06:10,160 --> 00:06:15,759
may have gotten too big for their britches because after a couple days,

84
00:06:13,919 --> 00:06:20,080
Valve updated the patch notes to say their new GPU monitoring method needs

85
00:06:18,080 --> 00:06:24,560
more testing. Valve, it's okay. Actually, you know

86
00:06:22,319 --> 00:06:28,960
what? Take your time. Microsoft is sabotaging our SSDs at this point. PC

87
00:06:26,800 --> 00:06:33,199
gamers have nothing to lose. Seagate was so fed up over the counterfeit Seagate

88
00:06:31,360 --> 00:06:37,600
hard drives flooding the market earlier this year, they sent their security

89
00:06:35,199 --> 00:06:42,720
teams to team up with Malaysian authorities and conduct a raid on a

90
00:06:39,919 --> 00:06:47,600
warehouse just outside Kuala Lumpur that was apparently churning out the doctored

91
00:06:45,199 --> 00:06:52,560
drives, as reported by German site Heise. Seagate provided some photos of

92
00:06:50,080 --> 00:06:57,919
the raid but sadly didn't say whether they used striping, mirroring, or some

93
00:06:55,039 --> 00:07:01,440
combination of the two. Also, did they watch the movie beforehand to amp

94
00:06:59,360 --> 00:07:06,080
themselves up? We need the answers to these. I've never heard of a company

95
00:07:03,360 --> 00:07:09,520
called GlobalWafers. I'm not afraid to admit that. But now, I'm aware that

96
00:07:08,160 --> 00:07:13,520
they've announced a planned manufacturing facility in Texas, which

97
00:07:11,599 --> 00:07:19,680
will make them the first company to produce silicon wafers on US soil, which

98
00:07:16,479 --> 00:07:22,479
foundries like TSMC and Samsung need to

99
00:07:19,680 --> 00:07:26,639
make their chips. The wafers, uh, not the soil, although maybe that's why

100
00:07:24,160 --> 00:07:32,080
Arrow Lake was so bad. The idea that AI is in a bubble has been floated for

101
00:07:28,880 --> 00:07:33,680
years, even by Sam Altman now, who said

102
00:07:32,080 --> 00:07:37,840
as much to a group of journalists recently over dinner. He does this just

103
00:07:35,840 --> 00:07:42,000
so he can tell them in person that three-quarters of them will be made obsolete in

104
00:07:39,919 --> 00:07:46,240
5 minutes. Altman thinks the bubble thing is fine, though, because OpenAI

105
00:07:44,319 --> 00:07:50,639
will probably survive the bubble and create a slew of wonderful AI products

106
00:07:48,560 --> 00:07:54,240
that transform society for the better. In fact, they have a bunch of products

107
00:07:52,000 --> 00:07:58,639
just like that right now, but they can't launch them. Uh, sorry, because each one

108
00:07:56,800 --> 00:08:04,319
needs like the full power of a star to operate. What they can do is make GPT-5

109
00:08:01,680 --> 00:08:10,479
warmer and friendlier. That's what you get for now. And Dutch researchers gave

110
00:08:07,039 --> 00:08:12,800
500 AI chatbots specific personas and

111
00:08:10,479 --> 00:08:18,479
stuck them in a simulated social network with no ads or algorithmic feeds just

112
00:08:15,759 --> 00:08:23,599
like an original Facebook-style platform, and found that, in five different rounds

113
00:08:20,720 --> 00:08:28,720
consisting of 10,000 actions each, the bots organized themselves into bubbles

114
00:08:26,319 --> 00:08:32,560
based on political beliefs such that they primarily interacted with other

115
00:08:30,639 --> 00:08:36,800
bots that they agreed with. The researchers concluded that although

116
00:08:34,479 --> 00:08:40,320
toxic algorithms are often blamed as the cause of social media's problems, their

117
00:08:38,399 --> 00:08:43,919
findings suggest those problems may be rooted in the structure of emotionally

118
00:08:42,479 --> 00:08:48,480
reactive social media platforms themselves. We only thought social media

119
00:08:46,560 --> 00:08:52,720
wasn't as toxic before algorithmic feeds were introduced because they hadn't

120
00:08:49,839 --> 00:08:57,040
melted our brains that much by that point. And now it's time to go outside

121
00:08:55,279 --> 00:09:00,720
until Wednesday when we'll have more tech news to talk about. Don't stay out

122
00:08:58,880 --> 00:09:04,320
there for too long, though. There are things squirming around all over the

123
00:09:02,880 --> 00:09:10,200
place. And heck, one of those might get in your brain. And that would be way

124
00:09:06,399 --> 00:09:10,200
worse than doom scrolling.
