1
00:00:00,000 --> 00:00:05,600
Okay, right from the jump here I want to remind you that you chose to click this video, okay?

2
00:00:05,600 --> 00:00:11,760
So whatever happens after this point is at least 50% your fault. Microsoft has released yet another

3
00:00:11,760 --> 00:00:17,520
update that fixes one thing and breaks something else. After last week's patch for Windows Defender

4
00:00:17,520 --> 00:00:22,800
suffered installation issues that the tech giant eventually fixed, another issue causing certain

5
00:00:22,800 --> 00:00:28,560
storage drives to fail during heavy writes was discovered by some kind of Japanese Twitter

6
00:00:28,560 --> 00:00:34,800
cat person with an apparent drinking problem. Despite this, NekoRuCat was able to conduct further

7
00:00:34,800 --> 00:00:41,440
testing along with Japanese site NichePC Gamer, identifying these drives from various brands

8
00:00:41,440 --> 00:00:47,760
that exhibit issues or straight up fail when attempting file transfers larger than 50 gigabytes.

9
00:00:47,760 --> 00:00:52,720
This issue hasn't gotten a ton of coverage in the West. It may be that users in Japan just have

10
00:00:52,800 --> 00:00:59,840
larger collections of cat photos to deal with. On the plus side, an upcoming Windows build will

11
00:00:59,840 --> 00:01:06,880
reportedly finally allow file operation dialogs to support dark mode. So you won't suddenly be

12
00:01:06,880 --> 00:01:12,080
blinded just trying to copy and paste something. Microsoft hasn't added this until now because

13
00:01:12,080 --> 00:01:17,680
Windows is just a really long complicated prank for them. One has to assume that we're all on

14
00:01:17,680 --> 00:01:24,240
camera, being displayed on live feeds in Microsoft HQ like we're unknowingly part of a wacky Japanese

15
00:01:24,240 --> 00:01:29,520
game show. With the staff there almost certainly making bets on when we'll all switch to Linux.

16
00:01:31,280 --> 00:01:38,480
Guitar YouTuber Rhett Shull's latest video goes over a concerning new phenomenon. At least to me,

17
00:01:38,480 --> 00:01:45,760
YouTube appears to be automatically applying AI upscaling to shorts, making real videos

18
00:01:45,840 --> 00:01:51,760
look kind of AI generated. This is super noticeable when you compare identical videos uploaded both as

19
00:01:51,760 --> 00:01:58,000
a YouTube Short and as an Instagram Reel. And this seems to be happening on all Shorts uploads,

20
00:01:58,000 --> 00:02:03,280
although we obviously didn't check everyone. Don't worry, I'll do that tonight instead of sleeping.

21
00:02:03,280 --> 00:02:09,040
Seems like the first post about this was made on the YouTube subreddit in mid-June by Redditor

22
00:02:09,120 --> 00:02:15,840
Eulinsiesis who said they had noticed the change kick in around that time. Now, why YouTube is

23
00:02:15,840 --> 00:02:21,840
making every short look even more gross than they already tend to is anyone's guess. But some are

24
00:02:21,840 --> 00:02:27,680
theorizing it's to get people used to AI looking videos so they'll be ready for the AI content

25
00:02:27,680 --> 00:02:33,920
flood once YouTube fully adds the Veo 3 AI video generator into its Shorts tools. But I don't

26
00:02:33,920 --> 00:02:38,720
know, I feel like it's equally likely that the people at YouTube making these decisions simply

27
00:02:38,720 --> 00:02:44,480
wouldn't know taste if it licked them in the face. As Shull points out in his video, one major

28
00:02:44,480 --> 00:02:50,640
issue here is that viewers may think a creator is using AI when they're not. And even though the

29
00:02:50,640 --> 00:02:56,400
real Riley perished some time ago, and I'm just an AI trained on his brainwaves and tattered fragments

30
00:02:56,400 --> 00:03:04,640
of his mustache, we don't fake these episodes, we film them for real on location. Somewhere.

31
00:03:04,720 --> 00:03:09,600
NVIDIA held a live stream to make a bunch of announcements about GeForce Now and the NVIDIA

32
00:03:09,600 --> 00:03:14,880
app after someone apparently reminded them that gamers exist. NVIDIA's cloud gaming service can

33
00:03:14,880 --> 00:03:23,680
now run on RTX 5080-equipped servers with DLSS 4, enabling up to 120 FPS at 5K on PC and Mac and

34
00:03:23,680 --> 00:03:30,160
up to 90 FPS on the Steam Deck GeForce Now app. Meanwhile, the NVIDIA app can now automatically

35
00:03:30,160 --> 00:03:35,920
enable the DLSS of your choice across all your games and it adds some more settings from the

36
00:03:35,920 --> 00:03:41,920
NVIDIA Control Panel, like NVIDIA Surround setup. So it can inch closer to being the control panel

37
00:03:41,920 --> 00:03:46,720
replacement I thought it was supposed to be already. But back to GeForce Now, Andrew's gonna

38
00:03:46,720 --> 00:03:51,840
tell us about more features. Including one that expands the GFN library instantly, right Andrew?

39
00:03:51,840 --> 00:03:57,120
That's right. It is right, Andrew, and it's called Install-to-Play. Basically, you can install games

40
00:03:57,120 --> 00:04:02,880
on your cloud PC if the game dev enabled Steam cloud streaming. But what if you don't want to

41
00:04:02,880 --> 00:04:10,160
install to play and also want to play inside the Discord app? I know you've been waiting for that.

42
00:04:10,160 --> 00:04:16,720
The new Discord Instant Play Experience lets you click a link from a friend in that app to instantly

43
00:04:16,720 --> 00:04:22,880
load into a game. Seems like it's just Fortnite for now. Kind of like Google's original vision

44
00:04:22,880 --> 00:04:28,080
of clicking right into a Stadia game from a Google search, except inside the Discord window.

45
00:04:29,280 --> 00:04:35,440
As NVIDIA says, no more waiting, no more FOMO until the 30-minute trial runs out,

46
00:04:35,440 --> 00:04:41,680
then you'll have to buy the free-to-play game. But don't worry, Xbox may be prepping a cheaper

47
00:04:41,680 --> 00:04:46,800
cloud-only Game Pass subscription along with a bunch of other stuff as discussed on the Xbox

48
00:04:46,800 --> 00:04:53,040
podcast. Xbox execs said it's the hardest they've ever seen the team working. Because apparently

49
00:04:53,040 --> 00:04:57,840
there's just so much going on, or maybe the team is desperately trying not to be in the next group

50
00:04:57,840 --> 00:05:04,000
of a thousand Microsoft employees to get canned. I don't know. So, have you understood the ramifications

51
00:05:04,000 --> 00:05:09,440
of what you've done yet? Because there's still Quick Bits. Valve made some bold claims about

52
00:05:09,440 --> 00:05:15,680
its Steam performance overlay in the patch notes for their latest Steam client beta, saying it reports

53
00:05:15,680 --> 00:05:20,800
GPU utilization more accurately than Windows' own Task Manager. But they may have gotten

54
00:05:20,800 --> 00:05:26,000
too big for their britches. Because after a couple days, Valve updated the patch notes to say their

55
00:05:26,000 --> 00:05:32,720
new GPU monitoring method needs more testing. Oh, Valve. It's okay. Actually, you know what? Take

56
00:05:32,720 --> 00:05:38,160
your time. Microsoft is sabotaging our SSDs at this point. PC gamers have nothing to lose.

57
00:05:38,160 --> 00:05:43,840
Seagate was so fed up over the counterfeit Seagate hard drives flooding the market earlier this year,

58
00:05:43,840 --> 00:05:49,840
they sent their security teams to team up with Malaysian authorities and conduct a raid on a

59
00:05:49,840 --> 00:05:55,840
warehouse just outside Kuala Lumpur that was apparently churning out the doctored drives, as

60
00:05:55,840 --> 00:06:02,240
reported by German site Heise. Seagate provided some photos of the raid, but sadly didn't say

61
00:06:02,240 --> 00:06:08,080
whether they used striping, mirroring, or some combination of the two. Also, did they watch

62
00:06:08,080 --> 00:06:13,360
the movie beforehand to amp themselves up? We need the answers to these. I've never heard of a company

63
00:06:13,360 --> 00:06:19,040
called GlobalWafers. I'm not afraid to admit that. But now I'm aware that they have announced a

64
00:06:19,040 --> 00:06:23,920
planned manufacturing facility in Texas, which will be the first new plant to produce

65
00:06:23,920 --> 00:06:31,520
silicon wafers on US soil in over 20 years, which foundries like TSMC and Samsung need to make their chips. The wafers.

66
00:06:32,320 --> 00:06:37,600
Not the soil. Although maybe that's why Arrow Lake was so bad. The idea that AI is in a bubble has

67
00:06:37,600 --> 00:06:44,160
been floated for years, even by Sam Altman now, who said as much to a group of journalists recently

68
00:06:44,160 --> 00:06:48,800
over dinner. He does this just so he can tell them in person that three quarters of them will

69
00:06:48,800 --> 00:06:54,240
be made obsolete in five minutes. Altman thinks the bubble thing is fine though, because OpenAI

70
00:06:54,240 --> 00:07:00,000
will probably survive the bubble and create a slew of wonderful AI products that transform society

71
00:07:00,000 --> 00:07:04,000
for the better. In fact, they have a bunch of products just like that right now, but they

72
00:07:04,000 --> 00:07:09,280
can't launch them. Sorry, because each one needs like the full power of a star to operate.

73
00:07:09,280 --> 00:07:16,640
What they can do is make GPT-5 warmer and friendlier. That's what you get for now. And Dutch researchers

74
00:07:16,640 --> 00:07:24,160
gave 500 AI chatbots specific personas and stuck them in a simulated social network with no ads or

75
00:07:24,160 --> 00:07:30,240
algorithmic feeds, just like an original Facebook style platform, and found that in five different

76
00:07:30,320 --> 00:07:37,040
rounds consisting of 10,000 actions each, the bots organized themselves into bubbles based on

77
00:07:37,040 --> 00:07:42,320
political beliefs, such that they primarily interacted with other bots that they agreed with.

78
00:07:42,320 --> 00:07:47,360
The researchers concluded that, although toxic algorithms are often blamed as the cause of social

79
00:07:47,360 --> 00:07:52,480
media's problems, their findings suggest those problems may be rooted in the structure of emotionally

80
00:07:52,480 --> 00:07:57,280
reactive social media platforms themselves. We only thought social media wasn't as toxic

81
00:07:57,280 --> 00:08:02,720
before algorithmic feeds were introduced because they hadn't melted our brains that much by that

82
00:08:02,720 --> 00:08:07,680
point. And now it's time to go outside. Until Wednesday, when we'll have more tech news to

83
00:08:07,680 --> 00:08:12,560
talk about. Don't stay out there for too long, though. There are things squirming around all

84
00:08:12,560 --> 00:08:16,880
over the place, and heck, one of those might get in your brain, and that would be way worse than

85
00:08:16,880 --> 00:08:19,680
doom-scrolling. Sorry.
