WEBVTT

00:00:00.240 --> 00:00:08.080
This video is brought to you by our trusted graphics partner

00:00:04.120 --> 00:00:12.080
NVIDIA. Welcome to part two of our 3570K

00:00:08.080 --> 00:00:14.000
versus 8350 testing. So, just to

00:00:12.080 --> 00:00:18.720
reiterate, we've already run tests with no anti-aliasing on a number of popular

00:00:16.240 --> 00:00:22.800
games. Now, what we've decided to do is crank up the details. Now, any of the

00:00:20.720 --> 00:00:26.640
games where the maximum presets were already factoring in high anti-aliasing

00:00:25.119 --> 00:00:30.640
settings, you're going to discover those actually haven't changed.

00:00:28.640 --> 00:00:34.960
Whereas everything else, what we're doing is we're taking our GTX 660 Ti,

00:00:33.280 --> 00:00:40.960
which is the card we're using for our standard test bench, and we are taxing

00:00:37.920 --> 00:00:44.000
it. So, we're looking at sort of more of

00:00:40.960 --> 00:00:47.920
the GPU-bound scenario. What happens to

00:00:44.000 --> 00:00:51.200
the CPU in this situation? Because a lot

00:00:47.920 --> 00:00:53.440
of the hubbub right now is the 8350,

00:00:51.200 --> 00:00:58.800
which is an 8-core processor. So, we're running that on a Crosshair V Formula.

00:00:55.280 --> 00:01:01.039
The 8350 is a very legitimate choice for

00:00:58.800 --> 00:01:05.560
a gaming platform if you're running at realistic settings. Now, of course, you

00:01:02.800 --> 00:01:11.680
could run Quake 3 flybys all day and, uh, you know, time how long it

00:01:08.320 --> 00:01:13.439
takes to run them at 640x480, and that

00:01:11.680 --> 00:01:17.119
would really separate out the single-threaded CPU performance of every

00:01:15.439 --> 00:01:22.159
chip on the market. But it's not realistic. So, that's why we're taking

00:01:19.680 --> 00:01:25.680
this realistic approach where we have two identical test platforms. In

00:01:24.000 --> 00:01:31.200
fact, it's this same test platform, just over there. So, we're running, uh, the

00:01:27.920 --> 00:01:34.000
8350 overclocked to 4.6 GHz. We're

00:01:31.200 --> 00:01:38.479
running our 3570K overclocked to 4.4 GHz. The reason we chose those numbers

00:01:36.720 --> 00:01:44.799
was that we wanted something that is realistically attainable by you, like, 99%

00:01:42.880 --> 00:01:48.560
of the time. So, almost anyone running these chips can reach those frequencies

00:01:47.040 --> 00:01:53.520
with guides that have actually been published in videos that I've done in the past. So, there's really no excuse

00:01:51.280 --> 00:01:58.240
for not being able to do that. Um, so that's why we ran at those settings.

00:01:55.119 --> 00:02:00.240
We're running 8 gigs of Mushkin uh,

00:01:58.240 --> 00:02:07.840
whatever it's called, Blackline memory, 1600 MHz CL9. Nothing special about it.

00:02:03.360 --> 00:02:09.879
We're running a 128 gig Vertex 4 SSD and

00:02:07.840 --> 00:02:14.160
Republic of Gamers motherboards for both platforms. Some In Win power supply that

00:02:12.720 --> 00:02:19.360
doesn't really affect performance in any meaningful way. And we're liquid cooling

00:02:16.160 --> 00:02:21.200
the CPUs using an H100 from Corsair just

00:02:19.360 --> 00:02:26.640
so that we get good stability at those overclock settings. So let's start with

00:02:22.720 --> 00:02:29.360
Metro 2033. In Metro 2033, the 3570K

00:02:26.640 --> 00:02:34.200
really pulls ahead of the 8350. So this is on the high preset with anisotropic

00:02:31.599 --> 00:02:39.440
filtering at 16x and anti-aliasing at 4x. Our next one is Dirt 3. And in Dirt

00:02:37.360 --> 00:02:46.239
3, we see the same situation where the 3570K pulls ahead by about 10%. So, this

00:02:43.920 --> 00:02:49.599
is actually a tangible win, but not something you're

00:02:47.920 --> 00:02:55.879
actually going to feel when you're playing the game itself. In Skyrim, we

00:02:52.800 --> 00:02:58.160
see another victory for the 3570K.

00:02:55.879 --> 00:03:02.480
However, it should be noted that in Skyrim, we're running 18 aftermarket

00:03:00.400 --> 00:03:06.400
mods from the Steam Workshop, and we're not sure how those affect any

00:03:04.560 --> 00:03:11.040
optimizations that are made on a platform-by-platform basis. So, it's

00:03:08.560 --> 00:03:14.879
possible that our particular Skyrim setup inherently runs better on

00:03:13.200 --> 00:03:19.519
Intel, and unfortunately, there's not really a whole lot we can do about that.

00:03:16.959 --> 00:03:25.519
Up next, we've got Battlefield 3, where the FX-8350 actually comes out ahead.

00:03:22.720 --> 00:03:30.560
Now, Battlefield 3 was one of the first really multi-threading aware AAA gaming

00:03:28.480 --> 00:03:35.840
titles where you started to see a tangible benefit from more cores. So

00:03:33.159 --> 00:03:39.599
the FX-8350 doing well here actually makes a lot of sense because it is clocked a

00:03:37.360 --> 00:03:44.000
little bit higher. Did I say 4.4 GHz for the 3570K? Because what I meant was 4.2.

00:03:42.080 --> 00:03:48.159
Sorry about that guys. So what we see here is that with its higher frequency

00:03:46.080 --> 00:03:53.959
and the fact that it's multi-threading aware, it does really well. So we're

00:03:50.000 --> 00:03:57.439
running, um, 4x MSAA

00:03:53.959 --> 00:03:59.439
anti-aliasing. Next up, we've got Crysis

00:03:57.439 --> 00:04:04.799
3. So, this is one of the ones where there's been a lot of discussion about

00:04:01.920 --> 00:04:09.920
how Crysis 3 is heavily multi-threading aware, and I've even seen

00:04:06.959 --> 00:04:14.480
proclamations that the 8350 just, you know, wrecks the

00:04:11.239 --> 00:04:16.959
3570K, but those seem pretty isolated

00:04:14.480 --> 00:04:22.320
and, uh, most of the data out there seems to support our conclusion, which

00:04:19.280 --> 00:04:24.000
is that the 3570K ekes out a victory

00:04:22.320 --> 00:04:29.160
here. It's not by much, only about 10%. For all intents and

00:04:26.720 --> 00:04:34.080
purposes, they are very close in performance. And I mean, uh, okay, we're

00:04:32.240 --> 00:04:38.080
going to move on to our next benchmark here, which is, uh, not our last one.

00:04:36.000 --> 00:04:42.320
Um, yeah, they're very close in terms of performance. So, we're not really seeing

00:04:39.840 --> 00:04:48.160
the whole huge difference in performance um, favoring the 8350. Far Cry 3 is a

00:04:46.320 --> 00:04:52.080
different situation with the 8350 winning by 10%. Now, the weird thing is

00:04:50.400 --> 00:04:55.759
both of these games are based on CryEngine 3. So, it looks like whatever

00:04:54.000 --> 00:05:01.040
individual optimizations the game developers have put into their

00:04:58.000 --> 00:05:03.280
titles, um, for whatever reason, one of

00:05:01.040 --> 00:05:08.000
them favors AMD more than the other one does. So, it tells us a couple of

00:05:05.759 --> 00:05:13.919
things. Number one, Crysis 3 doesn't necessarily run better on AMD, but

00:05:09.680 --> 00:05:15.759
number two, CryEngine 3 can. StarCraft 2,

00:05:13.919 --> 00:05:19.600
we're including just kind of for lols, but Intel runs away with this one. And

00:05:17.520 --> 00:05:24.880
StarCraft 2 seems to be heavily dependent on single-threaded performance.

00:05:21.600 --> 00:05:27.199
And our last title is Crysis 2. So not

00:05:24.880 --> 00:05:32.199
Crysis 3, Crysis 2, where we see a very similar story to Crysis 3 with the 3570K

00:05:30.080 --> 00:05:36.479
coming out ahead by about 10%. So what does all this mean? Like

00:05:34.720 --> 00:05:41.199
what CPU should you actually buy? Because remember guys,

00:05:38.280 --> 00:05:44.240
the 3570K doesn't have Hyper-Threading. So, when you have a heavily threaded

00:05:42.720 --> 00:05:49.199
workload, say for example, you're doing video rendering, 3D rendering, um, heavy

00:05:47.280 --> 00:05:53.919
Photoshop use, where you're going to be able to take advantage of those

00:05:50.880 --> 00:05:56.080
multiple cores, the 8350 may actually be

00:05:53.919 --> 00:06:00.479
a better choice for you because the 3570K is a quad core and the 8350 is four

00:05:58.560 --> 00:06:06.080
bulldozer modules, each of which is kind of like a core and a half. So, you're

00:06:02.400 --> 00:06:08.479
going to have just more horsepower, more

00:06:06.080 --> 00:06:12.400
oomph to throw at heavily multi-threaded applications. With that in mind, I don't

00:06:10.560 --> 00:06:16.360
know what you're gesturing at. We have three minutes left in the video. Oh,

00:06:13.759 --> 00:06:21.600
okay. Uh, with that in mind, the 3570K does pull ahead in the

00:06:18.639 --> 00:06:26.560
heavily single-threaded applications and does run cooler and does consume less

00:06:24.400 --> 00:06:30.479
power. So, depending on how much power costs in your particular area, that may

00:06:29.120 --> 00:06:35.440
be a factor for you. But it should be noted that you're looking at probably a

00:06:32.400 --> 00:06:37.280
few dollars a year, um, in terms of the

00:06:35.440 --> 00:06:42.319
actual power difference depending again on how much power costs in your area.

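That "few dollars a year" claim is easy to sanity-check with quick arithmetic. A minimal sketch in Python, where the wattage delta, daily hours of use, and electricity rate are all illustrative assumptions rather than figures measured in the video:

```python
# Estimate the annual electricity-cost difference between two CPUs.
# All inputs below are illustrative assumptions, not measurements.

def annual_cost_difference(delta_watts: float, hours_per_day: float,
                           price_per_kwh: float) -> float:
    """Extra dollars per year for drawing `delta_watts` more power."""
    kwh_per_year = delta_watts * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

# e.g. ~40 W more draw, 4 hours of gaming a day, at a cheap $0.10/kWh:
print(round(annual_cost_difference(40, 4, 0.10), 2))  # → 5.84
```

Even at a California-like $0.30/kWh, that same example only comes to about $17.50 a year, which is why the power difference rarely decides the purchase on its own.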
00:06:39.680 --> 00:06:45.919
Here in BC, Canada, it's cheap. So, it's not something that we think about nearly

00:06:43.680 --> 00:06:49.520
as much as if you lived say for example in California where it can be very

00:06:47.600 --> 00:06:53.120
expensive. So, I think that pretty much wraps it up. They're both viable gaming

00:06:51.680 --> 00:06:58.880
platforms. At the end of the day, you have to look at the results and go, okay, this is the average FPS. Am I going to

00:06:56.240 --> 00:07:02.479
notice a difference of 5 or 3 or 10 FPS here and there? If you

00:07:00.800 --> 00:07:05.759
think you will, go for whichever one comes out on top. And if you think you

00:07:04.160 --> 00:07:09.520
won't, then go for whichever one is going to serve you better as an overall

00:07:07.840 --> 00:07:15.800
platform. Thanks for checking out this video on Linus Tech Tips. As always,

00:07:12.000 --> 00:07:15.800
don't forget to subscribe.
