{"video_id":"fp_fXyhgAK0mM","title":"THE ARC B580 IS ACTUALLY GREAT & AFFORDABLE -  Arc B580 Review","channel":"Linus Tech Tips","show":"Linus Tech Tips","published_at":"2024-12-12T14:01:00.025Z","duration_s":933,"segments":[{"start_s":0.0,"end_s":8.9,"text":"I can't believe it. After NVIDIA and AMD just completely abandoned the sub-$300 price point, or gave us intentionally","speaker":null,"is_sponsor":0},{"start_s":8.9,"end_s":12.64,"text":"nerfed options, Intel has finally done it.","speaker":null,"is_sponsor":0},{"start_s":12.64,"end_s":17.34,"text":"They released the first good budget GPU in over five years.","speaker":null,"is_sponsor":0},{"start_s":17.34,"end_s":24.26,"text":"It doesn't cut any corners. It's got 12 gigs of VRAM, which means that new games like Indiana Jones will run properly,","speaker":null,"is_sponsor":0},{"start_s":24.26,"end_s":29.54,"text":"it's relatively efficient, and best of all, it's got raw performance that beats the RTX","speaker":null,"is_sponsor":0},{"start_s":29.54,"end_s":33.98,"text":"4060 handily at both 1080p and 1440p.","speaker":null,"is_sponsor":0},{"start_s":33.98,"end_s":37.42,"text":"If I had scripted Intel's press conference for this thing, it probably would have gone","speaker":null,"is_sponsor":0},{"start_s":37.42,"end_s":42.86,"text":"something like this. It's been a difficult journey to get here for Team Blue.","speaker":null,"is_sponsor":0},{"start_s":42.86,"end_s":47.42,"text":"Their first generation had famously buggy drivers, with some games running poorly, or","speaker":null,"is_sponsor":0},{"start_s":47.42,"end_s":52.14,"text":"even not at all, which led to such low sales figures that the department has been the subject","speaker":null,"is_sponsor":0},{"start_s":52.14,"end_s":58.74,"text":"of near constant speculation that it would be cut. But believe it or not, they have squashed most of the bugs.","speaker":null,"is_sponsor":0},{"start_s":58.74,"end_s":62.94,"text":"Most of them. 
We'll talk about that later. First, the TLDR.","speaker":null,"is_sponsor":0},{"start_s":62.94,"end_s":67.66,"text":"Last time, we asked viewers to consider buying Intel Arc because we were desperate for them","speaker":null,"is_sponsor":0},{"start_s":67.66,"end_s":70.9,"text":"to emerge as a disruptor in the GPU duopoly.","speaker":null,"is_sponsor":0},{"start_s":70.9,"end_s":76.14,"text":"This time, we're asking you to consider Intel Arc because it's freaking awesome to the point","speaker":null,"is_sponsor":0},{"start_s":76.14,"end_s":82.7,"text":"where I would say that it is the only choice for gamers who want to spend $450 to $700 on","speaker":null,"is_sponsor":0},{"start_s":82.7,"end_s":88.24,"text":"a tower, or for those folks who have waited so patiently for an upgrade to their faithful","speaker":null,"is_sponsor":0},{"start_s":88.24,"end_s":91.58,"text":"GTX 1660 or 10 series.","speaker":null,"is_sponsor":0},{"start_s":91.58,"end_s":96.78,"text":"Let's start at 1080p, where AMD and NVIDIA are more than happy to gouge you for the luxury","speaker":null,"is_sponsor":0},{"start_s":96.78,"end_s":100.14,"text":"of being able to play. First stop, Night City.","speaker":null,"is_sponsor":0},{"start_s":100.14,"end_s":105.9,"text":"In Cyberpunk, the B580 makes an impressive debut, running neck and neck with the 4060","speaker":null,"is_sponsor":0},{"start_s":105.9,"end_s":111.02,"text":"Ti 16 gig, a card that retails for $450.","speaker":null,"is_sponsor":0},{"start_s":111.02,"end_s":115.58,"text":"You heard me, this one is 80% more expensive.","speaker":null,"is_sponsor":0},{"start_s":115.58,"end_s":123.14,"text":"And check out the B580's direct competition, the $300 4060 and the $250 RX7600.","speaker":null,"is_sponsor":0},{"start_s":123.14,"end_s":130.62,"text":"They're behind by nearly 15%. 
And Intel's lead increases in Red Dead Redemption 2, where their improvements to Vulkan support","speaker":null,"is_sponsor":0},{"start_s":130.62,"end_s":135.46,"text":"have lifted the fortunes of even their last generation cards, and brought the B580 in","speaker":null,"is_sponsor":0},{"start_s":135.46,"end_s":141.3,"text":"line with the 7700XT, a $400 GPU.","speaker":null,"is_sponsor":0},{"start_s":141.3,"end_s":149.74,"text":"This kind of utter dominance, though, isn't across the board. In The Last of Us Part 1, the B580 falls back in line, but then even this poor result has","speaker":null,"is_sponsor":0},{"start_s":149.74,"end_s":154.14,"text":"them beating NVIDIA and AMD's current generation price competitors.","speaker":null,"is_sponsor":0},{"start_s":154.22,"end_s":159.62,"text":"In F1 23, AMD takes the lead, but they've historically performed really well on the track.","speaker":null,"is_sponsor":0},{"start_s":159.62,"end_s":165.06,"text":"And in the old gold standard, Shadow of the Tomb Raider, the B580 barely inches past the RTX 4060 in","speaker":null,"is_sponsor":0},{"start_s":165.06,"end_s":170.46,"text":"average FPS and loses in 1% lows, which we classify as a loss.","speaker":null,"is_sponsor":0},{"start_s":170.46,"end_s":177.3,"text":"In Returnal, the B580 is neck and neck with the 4060 Ti and the 6700XT, only taking a","speaker":null,"is_sponsor":0},{"start_s":177.3,"end_s":183.78,"text":"clear L from the 7700XT, which, again, I remind you, is a $400 card.","speaker":null,"is_sponsor":0},{"start_s":183.78,"end_s":188.46,"text":"And then finally in Atomic Heart, the B580's sails lose some of their glorious wind, but","speaker":null,"is_sponsor":0},{"start_s":188.46,"end_s":193.38,"text":"hey, the most expensive card on the charts was bound to win at least one game, wasn't","speaker":null,"is_sponsor":0},{"start_s":193.38,"end_s":197.66,"text":"it? 
Looking at the overall picture, Intel, you've outdone yourself.","speaker":null,"is_sponsor":0},{"start_s":197.66,"end_s":203.66,"text":"You've pulled off a commanding generational uplift of 55% and beaten your old flagship","speaker":null,"is_sponsor":0},{"start_s":203.66,"end_s":211.82,"text":"even, by more than 20%. You've fallen shy of the 4060 Ti, but the bottom line is, if gamers out there are looking","speaker":null,"is_sponsor":0},{"start_s":211.82,"end_s":217.58,"text":"to upgrade their aging GPUs for high refresh rate 1080p gaming, you've given us something","speaker":null,"is_sponsor":0},{"start_s":217.58,"end_s":225.14,"text":"to finally recommend to them. Not just point at the least worst thing, but to recommend to them.","speaker":null,"is_sponsor":0},{"start_s":225.14,"end_s":229.98,"text":"This is a truly incredible achievement for Intel, and they did a lot to make it happen.","speaker":null,"is_sponsor":0},{"start_s":230.98,"end_s":236.86,"text":"Interestingly, Intel's new architecture, which they don't even call a GPU, leaves no stone unturned in the search for","speaker":null,"is_sponsor":0},{"start_s":236.86,"end_s":243.62,"text":"performance improvements. They've jumped to a new process node and made several architectural and driver overhauls","speaker":null,"is_sponsor":0},{"start_s":243.62,"end_s":250.46,"text":"that Intel claims gave them a 70% improvement in performance per core and a 50% energy improvement","speaker":null,"is_sponsor":0},{"start_s":250.46,"end_s":254.74,"text":"over last gen. 
Which makes me wonder, why did they hold back?","speaker":null,"is_sponsor":0},{"start_s":254.74,"end_s":262.34,"text":"If 20 Xe2 cores are good, then 32 cores on a B770 would be better.","speaker":null,"is_sponsor":0},{"start_s":262.34,"end_s":267.62,"text":"Speaking of bigger, nearly 37% of Steam users are gaming at a resolution that is greater","speaker":null,"is_sponsor":0},{"start_s":267.62,"end_s":273.58,"text":"than 1080p these days, and that number keeps growing as 1440p and 4K monitors continue","speaker":null,"is_sponsor":0},{"start_s":273.58,"end_s":278.06,"text":"to come down in price. So can Intel compete at 1440p as well?","speaker":null,"is_sponsor":0},{"start_s":278.06,"end_s":286.9,"text":"The short answer is yes. Across our suite of benchmarks, the B580 overtakes the RX 6700 XT and narrows the gap","speaker":null,"is_sponsor":0},{"start_s":286.9,"end_s":290.54,"text":"with the 4060 Ti to just a few percent.","speaker":null,"is_sponsor":0},{"start_s":290.54,"end_s":295.82,"text":"As for Intel's promised 10% lead over the 4060, well, it appears they were actually","speaker":null,"is_sponsor":0},{"start_s":295.82,"end_s":302.98,"text":"being modest. Across our game selection, we found a lead of 20%, though it is worth noting that this","speaker":null,"is_sponsor":0},{"start_s":302.98,"end_s":306.14,"text":"will vary depending on which games you play.","speaker":null,"is_sponsor":0},{"start_s":306.14,"end_s":310.78,"text":"And with the famously VRAM-hungry The Last of Us Part 1, the B580 holds its position","speaker":null,"is_sponsor":0},{"start_s":310.78,"end_s":317.58,"text":"on our chart, embarrassing NVIDIA again, with the B580's 1% lows besting the 4060's average","speaker":null,"is_sponsor":0},{"start_s":317.58,"end_s":322.38,"text":"FPS. Shadow of the Tomb Raider demonstrates just what an incredible generational leap Intel","speaker":null,"is_sponsor":0},{"start_s":322.38,"end_s":328.62,"text":"has pulled off here. 
And it seems like something about Returnal really likes Intel, because all of our Team","speaker":null,"is_sponsor":0},{"start_s":328.62,"end_s":336.1,"text":"Blue cards get a nice little bump in the rankings. I'm kind of running out of ways to say Intel's doing well, so in Red Dead Redemption 2, Intel","speaker":null,"is_sponsor":0},{"start_s":336.82,"end_s":344.78,"text":"did bloop-a-looby! And in Cyberpunk, they blomp-anated the competition, while making it glaringly obvious just how","speaker":null,"is_sponsor":0},{"start_s":344.78,"end_s":351.94,"text":"overdue the 1060 and the 1650 were for a valid, modern 1440p upgrade.","speaker":null,"is_sponsor":0},{"start_s":351.94,"end_s":356.74,"text":"With strong, if not mind-blowing performance in both Atomic Heart and F1 23, it's clear","speaker":null,"is_sponsor":0},{"start_s":356.74,"end_s":361.54,"text":"that Intel has achieved what they set out to do: made a killer value 1080p gaming card","speaker":null,"is_sponsor":0},{"start_s":361.54,"end_s":368.78,"text":"that's also capable of 1440p. There are some caveats. Even Intel admits that they don't win in every game.","speaker":null,"is_sponsor":0},{"start_s":368.78,"end_s":374.34,"text":"And a motherboard with support for Resizable BAR is mandatory for Arc GPUs.","speaker":null,"is_sponsor":0},{"start_s":374.34,"end_s":379.58,"text":"And the B580 isn't a top-of-the-line benchmark buster, so if you have a higher-tier older","speaker":null,"is_sponsor":0},{"start_s":379.58,"end_s":384.42,"text":"card like a 3060 Ti, you shouldn't feel compelled to upgrade here.","speaker":null,"is_sponsor":0},{"start_s":384.42,"end_s":390.14,"text":"But as far as downsides go, those are pretty minor, and there's more to like than just","speaker":null,"is_sponsor":0},{"start_s":390.14,"end_s":396.62,"text":"the raw performance. 
Let's talk about ray tracing, which is suddenly a little more relevant these days","speaker":null,"is_sponsor":0},{"start_s":396.62,"end_s":402.9,"text":"than it used to be. Especially when you consider that games are starting to list ray tracing as a minimum requirement.","speaker":null,"is_sponsor":0},{"start_s":402.9,"end_s":406.74,"text":"While we didn't have time to develop a test for our boy Indy, we did check out the RT","speaker":null,"is_sponsor":0},{"start_s":406.74,"end_s":412.38,"text":"performance in a few other games. Tracing those rays still results in a significant performance hit here, but NVIDIA's mature and","speaker":null,"is_sponsor":0},{"start_s":412.38,"end_s":417.78,"text":"well-supported RT tech allows them to beat the B580 in a big way in Atomic Heart at 1080p.","speaker":null,"is_sponsor":0},{"start_s":417.82,"end_s":422.94,"text":"In Returnal, the B580 catches right back up to the 4060 Ti, but falls back into the pack","speaker":null,"is_sponsor":0},{"start_s":422.94,"end_s":429.34,"text":"in F1 23. Overall, you can have a solid 60-plus FPS ray-traced gaming experience, assuming you're","speaker":null,"is_sponsor":0},{"start_s":429.34,"end_s":433.9,"text":"willing to fiddle with the settings just a little bit. Speaking of which, ultra settings at 1440p.","speaker":null,"is_sponsor":0},{"start_s":433.9,"end_s":438.14,"text":"That's beyond the reach of pretty much every card we've tested today, the entire lot failing","speaker":null,"is_sponsor":0},{"start_s":438.14,"end_s":447.7,"text":"to break 60 FPS average in any title. But the extra VRAM on the B580 earns it a sizable lead over the 4060 in Returnal and in F1 23.","speaker":null,"is_sponsor":0},{"start_s":447.78,"end_s":451.58,"text":"Even if it can't pull out a win in Atomic Heart. AMD's RX 7600?","speaker":null,"is_sponsor":0},{"start_s":451.58,"end_s":455.9,"text":"Oof. 
It's basically crying in the corner trying to figure out why it's even here.","speaker":null,"is_sponsor":0},{"start_s":455.9,"end_s":459.98,"text":"And if you're wondering where the flagship ray tracing title Cyberpunk is, well, we had","speaker":null,"is_sponsor":0},{"start_s":459.98,"end_s":466.5,"text":"some Intel problems. It seems to not like running at ultra ray tracing settings on CPUs with 3D V-Cache.","speaker":null,"is_sponsor":0},{"start_s":466.5,"end_s":470.54,"text":"But before you start haranguing Intel about game compatibility, they are aware, and they","speaker":null,"is_sponsor":0},{"start_s":470.54,"end_s":474.14,"text":"are working on it. Look, Mom, we made the patch notes.","speaker":null,"is_sponsor":0},{"start_s":474.14,"end_s":477.7,"text":"And since the launch of Arc, compatibility has massively improved.","speaker":null,"is_sponsor":0},{"start_s":477.7,"end_s":483.62,"text":"Hardware Unboxed recently showed in a test of 250 games that 233 were completely playable.","speaker":null,"is_sponsor":0},{"start_s":483.62,"end_s":488.46,"text":"There were still issues, and some of those were in major titles that took a lot longer","speaker":null,"is_sponsor":0},{"start_s":488.46,"end_s":494.06,"text":"to fix than we would have liked. But hopefully Intel will continue this upward trend in compatibility.","speaker":null,"is_sponsor":0},{"start_s":494.06,"end_s":497.74,"text":"Now knowing the price and the 1440p performance numbers, you wouldn't expect this to be","speaker":null,"is_sponsor":0},{"start_s":497.74,"end_s":502.1,"text":"a 4K gaming card, and you'd be right. It's not a 4K card.","speaker":null,"is_sponsor":0},{"start_s":502.1,"end_s":505.94,"text":"Even at low settings in Cyberpunk, we don't see great frame rates.","speaker":null,"is_sponsor":0},{"start_s":505.94,"end_s":510.98,"text":"At least not natively. 
But Intel's got some more tricks up their sleeves.","speaker":null,"is_sponsor":0},{"start_s":510.98,"end_s":515.38,"text":"Desperate to not completely miss the boat on the AI boom, Intel has packed some juicy","speaker":null,"is_sponsor":0},{"start_s":515.38,"end_s":518.58,"text":"AI into the B580 in order to up its performance.","speaker":null,"is_sponsor":0},{"start_s":518.58,"end_s":522.14,"text":"Oh, I guess if I'm going to talk about AI, I might as well dress the part with my tech","speaker":null,"is_sponsor":0},{"start_s":522.14,"end_s":525.74,"text":"bro vest from LTTstore.com.","speaker":null,"is_sponsor":0},{"start_s":525.74,"end_s":529.82,"text":"Intel's AI-powered render enhancement sauce comes in the form of XeSS 2.","speaker":null,"is_sponsor":0},{"start_s":529.94,"end_s":537.58,"text":"XeSS 2 has three main parts: an AI upscaler that renders the game at a lower resolution, then uses AI to upscale","speaker":null,"is_sponsor":0},{"start_s":537.58,"end_s":543.3,"text":"to display resolution, XeSS Frame Generation, which creates extra frames by interpolating","speaker":null,"is_sponsor":0},{"start_s":543.3,"end_s":548.26,"text":"visual and in-game vector data, and then taking the two frames and making a middle","speaker":null,"is_sponsor":0},{"start_s":548.26,"end_s":552.78,"text":"point to enhance animation smoothness, and a latency reduction component that helps to","speaker":null,"is_sponsor":0},{"start_s":552.78,"end_s":557.1,"text":"cancel out some of the extra latency from the aforementioned frame generation.","speaker":null,"is_sponsor":0},{"start_s":557.1,"end_s":563.02,"text":"In fewer words, they've invented NVIDIA DLSS, NVIDIA FrameGen, and NVIDIA Reflex.","speaker":null,"is_sponsor":0},{"start_s":563.02,"end_s":568.54,"text":"And what's nice is that unlike the time that AMD invented NVIDIA Reflex with Anti-Lag+,","speaker":null,"is_sponsor":0},{"start_s":568.54,"end_s":572.66,"text":"Intel has implemented this in a way that the game 
developer bakes it into the game, meaning","speaker":null,"is_sponsor":0},{"start_s":572.66,"end_s":578.98,"text":"you won't get permabanned for turning it on. We're not going to be doing a deep dive into image quality at this time, but we will be","speaker":null,"is_sponsor":0},{"start_s":578.98,"end_s":583.58,"text":"looking at performance and we will make some anecdotal remarks about image quality.","speaker":null,"is_sponsor":0},{"start_s":583.58,"end_s":589.02,"text":"XeSS has already proven itself to be a solid upscaling technology, and XeSS 2 successfully","speaker":null,"is_sponsor":0},{"start_s":589.02,"end_s":596.1,"text":"builds upon that foundation. When turning on supersampling to its highest quality level in Cyberpunk 2077 at 1440p, Intel","speaker":null,"is_sponsor":0},{"start_s":596.1,"end_s":600.78,"text":"wins in FPS thanks to the raw performance advantage they already had, but their super","speaker":null,"is_sponsor":0},{"start_s":600.78,"end_s":603.98,"text":"resolution solution doesn't scale as well.","speaker":null,"is_sponsor":0},{"start_s":603.98,"end_s":608.7,"text":"NVIDIA sees a performance uplift of 42% and 32% in the lows and averages respectively,","speaker":null,"is_sponsor":0},{"start_s":608.7,"end_s":612.4,"text":"while Intel only gets around a 20% boost.","speaker":null,"is_sponsor":0},{"start_s":612.4,"end_s":619.52,"text":"We see the same trend at 4K. The B580 doesn't gain as much performance from XeSS as NVIDIA does from DLSS, but wins","speaker":null,"is_sponsor":0},{"start_s":619.52,"end_s":625.44,"text":"by just being a more powerful card. 
The 4060, though, doesn't even hit 30 FPS here.","speaker":null,"is_sponsor":0},{"start_s":625.44,"end_s":630.92,"text":"But the story changes a bit in F1 24, the only title currently available for us to test XeSS","speaker":null,"is_sponsor":0},{"start_s":630.92,"end_s":634.12,"text":"2's Framegen, and this is where things get wacky.","speaker":null,"is_sponsor":0},{"start_s":634.12,"end_s":639.4,"text":"In this game, the upscalers are better matched, but in Framegen, it's a blowout.","speaker":null,"is_sponsor":0},{"start_s":639.4,"end_s":644.12,"text":"Intel is introducing way more generated frames, with performance skyrocketing to 70% over","speaker":null,"is_sponsor":0},{"start_s":644.12,"end_s":649.76,"text":"the upscale result, and double over native rendering, and it is almost indistinguishable","speaker":null,"is_sponsor":0},{"start_s":649.76,"end_s":652.98,"text":"from native rendering unless you really know what to look for.","speaker":null,"is_sponsor":0},{"start_s":652.98,"end_s":656.64,"text":"And the artifacts that do arise come from the upscaling, not the Framegen.","speaker":null,"is_sponsor":0},{"start_s":656.64,"end_s":660.6,"text":"It's weird. 
It has double the FPS with barely a drop in quality.","speaker":null,"is_sponsor":0},{"start_s":660.6,"end_s":665.32,"text":"Heck, you can even get over 100 FPS at 4K with XeSS turned on.","speaker":null,"is_sponsor":0},{"start_s":665.32,"end_s":669.0,"text":"Granted, Intel probably worked pretty closely with the devs to make sure that this launch","speaker":null,"is_sponsor":0},{"start_s":669.0,"end_s":674.4,"text":"title would leave a good impression, but a good impression is a good impression.","speaker":null,"is_sponsor":0},{"start_s":674.4,"end_s":678.16,"text":"While Intel pitched this as a gaming card first and foremost, they did include some","speaker":null,"is_sponsor":0},{"start_s":678.16,"end_s":682.36,"text":"other value-adds, like their enhanced media engine that now supports more codecs than","speaker":null,"is_sponsor":0},{"start_s":682.36,"end_s":689.4,"text":"NVIDIA or AMD. In modern AV1 encoding, we can see a clear generational uplift, but Intel can't quite","speaker":null,"is_sponsor":0},{"start_s":689.4,"end_s":696.42,"text":"match the speed of NVIDIA's NVENC encoder. And it's worth noting here, by the way, that while AMD does compete on this chart, it does","speaker":null,"is_sponsor":0},{"start_s":696.42,"end_s":702.94,"text":"so by outputting the wrong resolution, 1920 by 1082, and yes, that is a hardware-level","speaker":null,"is_sponsor":0},{"start_s":702.94,"end_s":710.42,"text":"problem. Yikes. In the more ubiquitous H.264, the B580 snatches the crown, although margins are pretty small","speaker":null,"is_sponsor":0},{"start_s":710.42,"end_s":716.82,"text":"across the board here. Now thanks to their OptiX rendering, Blender sees some good old-fashioned NVIDIA domination,","speaker":null,"is_sponsor":0},{"start_s":716.82,"end_s":725.3,"text":"at least on the cards that have RT cores. 
And interestingly, the B580 performed significantly worse than Intel's last-gen cards.","speaker":null,"is_sponsor":0},{"start_s":725.98,"end_s":729.02,"text":"Intel says this is expected, but didn't really go into why.","speaker":null,"is_sponsor":0},{"start_s":729.02,"end_s":733.18,"text":"Our best guess, then, is that it's due to the relatively smaller number of Xe cores on this","speaker":null,"is_sponsor":0},{"start_s":733.18,"end_s":739.18,"text":"card compared to the last gen. But that didn't slow it down in gaming, so...","speaker":null,"is_sponsor":0},{"start_s":739.18,"end_s":745.78,"text":"Yay gaming. Moving into our AI testing, NVIDIA's dominant position means that most AI software is developed","speaker":null,"is_sponsor":0},{"start_s":745.78,"end_s":750.5,"text":"with CUDA in mind, making it easy for folks at home to get up and running quickly.","speaker":null,"is_sponsor":0},{"start_s":750.5,"end_s":754.62,"text":"The good news is things are getting easier for AMD and Intel users.","speaker":null,"is_sponsor":0},{"start_s":754.62,"end_s":758.7,"text":"And if you want to run some computer vision, large language models, or image generation","speaker":null,"is_sponsor":0},{"start_s":758.7,"end_s":764.06,"text":"at home, you can probably get it working on your non-NVIDIA GPU in a few extra steps.","speaker":null,"is_sponsor":0},{"start_s":764.06,"end_s":768.26,"text":"And generationally, Intel has made a massive leap in performance, taking the lead in our","speaker":null,"is_sponsor":0},{"start_s":768.26,"end_s":772.22,"text":"Stable Diffusion image generation benchmark by a pretty big margin.","speaker":null,"is_sponsor":0},{"start_s":772.22,"end_s":776.38,"text":"In computer vision, it loses to the 4060, although just barely.","speaker":null,"is_sponsor":0},{"start_s":776.38,"end_s":782.54,"text":"So if you can get it running, it works great. 
As long as you don't mind using a bunch of power to generate a portrait of your aunt","speaker":null,"is_sponsor":0},{"start_s":782.62,"end_s":787.58,"text":"as an actual ant. Speaking of power, we haven't even talked about power yet.","speaker":null,"is_sponsor":0},{"start_s":787.58,"end_s":792.42,"text":"The physical PCB of our card is dinky, and Intel takes advantage of this by implementing","speaker":null,"is_sponsor":0},{"start_s":792.42,"end_s":799.9,"text":"a new flow-through cooler design. Thermals are well under control, with the hotspot on the B580 never passing 76 degrees in either","speaker":null,"is_sponsor":0},{"start_s":799.9,"end_s":805.86,"text":"synthetic or gaming workloads. The B580 is targeting a TDP of 190 watts.","speaker":null,"is_sponsor":0},{"start_s":805.86,"end_s":814.5,"text":"That's 25 watts more than the RX 7600 and 4060 Ti, and 75 watts more than the 4060 non-Ti.","speaker":null,"is_sponsor":0},{"start_s":814.5,"end_s":819.1,"text":"So it's clear that even with all of their improvements, Intel does have a ways to go,","speaker":null,"is_sponsor":0},{"start_s":819.1,"end_s":823.42,"text":"with NVIDIA pulling off a clean victory when it comes to performance per watt, even if","speaker":null,"is_sponsor":0},{"start_s":823.42,"end_s":826.98,"text":"NVIDIA doesn't seem to understand what performance per dollar is.","speaker":null,"is_sponsor":0},{"start_s":826.98,"end_s":832.94,"text":"The B580 uses just one PCIe 8-pin for power, so you won't need to upgrade your power supply.","speaker":null,"is_sponsor":0},{"start_s":832.94,"end_s":837.16,"text":"That is, unless you have a power supply that is less than 600 watts.","speaker":null,"is_sponsor":0},{"start_s":837.16,"end_s":842.34,"text":"Even though it should only draw a max of 225 watts with that power connector, we saw some","speaker":null,"is_sponsor":0},{"start_s":842.34,"end_s":846.62,"text":"transient spikes up to 241 watts in 
F1.","speaker":null,"is_sponsor":0},{"start_s":846.62,"end_s":851.62,"text":"That is significantly higher than Intel's advertised 190 watts.","speaker":null,"is_sponsor":0},{"start_s":851.62,"end_s":856.18,"text":"So in conclusion, maybe it's just because it's been so long since we've seen a great","speaker":null,"is_sponsor":0},{"start_s":856.18,"end_s":859.58,"text":"budget card, but man, I'm excited.","speaker":null,"is_sponsor":0},{"start_s":860.1,"end_s":865.74,"text":"How many times have I had to say, well, you could buy this shiny new thing, but it kind","speaker":null,"is_sponsor":0},{"start_s":865.74,"end_s":871.34,"text":"of sucks. And you'd be way better off going with something last gen or secondhand or just not upgrading","speaker":null,"is_sponsor":0},{"start_s":871.34,"end_s":878.94,"text":"at all. Well, not today. I mean, eBay deals obviously still exist, but this is a new GPU that doesn't suck with","speaker":null,"is_sponsor":0},{"start_s":878.94,"end_s":884.9,"text":"modern features, and it's worthy of celebration, even if Intel is a little late to the party.","speaker":null,"is_sponsor":0},{"start_s":884.9,"end_s":890.78,"text":"See, the thing is, AMD and NVIDIA are poised to release their next generation of GPUs early","speaker":null,"is_sponsor":0},{"start_s":890.78,"end_s":895.34,"text":"next year, so it isn't out of the question for a couple of companies that have had their","speaker":null,"is_sponsor":0},{"start_s":895.34,"end_s":902.62,"text":"stock prices double or ten-uple to make a move to nuke the Arc B-series from orbit.","speaker":null,"is_sponsor":0},{"start_s":902.62,"end_s":906.74,"text":"Also, the looming threat of tariffs could impact the affordability of these cards.","speaker":null,"is_sponsor":0},{"start_s":906.74,"end_s":911.46,"text":"But hey, it's going to be a pretty sweet ride up until that time.","speaker":null,"is_sponsor":0},{"start_s":911.46,"end_s":916.42,"text":"It's a great GPU that will give you great performance 
without destroying your wallet.","speaker":null,"is_sponsor":0},{"start_s":916.42,"end_s":920.22,"text":"If you guys enjoyed this video, go check out our review of the original Arc.","speaker":null,"is_sponsor":0},{"start_s":920.22,"end_s":925.06,"text":"The benchmarks and graphs aren't really relevant, other than to give you the context for how","speaker":null,"is_sponsor":0},{"start_s":925.06,"end_s":933.74,"text":"far we have come.","speaker":null,"is_sponsor":0}],"full_text":"I can't believe it. After NVIDIA and AMD just completely abandoned the sub-$300 price point, or gave us intentionally nerfed options, Intel has finally done it. They released the first good budget GPU in over five years. It doesn't cut any corners. It's got 12 gigs of VRAM, which means that new games like Indiana Jones will run properly, it's relatively efficient, and best of all, it's got raw performance that beats the RTX 4060 handily at both 1080p and 1440p. If I had scripted Intel's press conference for this thing, it probably would have gone something like this. It's been a difficult journey to get here for Team Blue. Their first generation had famously buggy drivers, with some games running poorly, or even not at all, which led to such low sales figures that the department has been the subject of near constant speculation that it would be cut. But believe it or not, they have squashed most of the bugs. Most of them. We'll talk about that later. First, the TLDR. Last time, we asked viewers to consider buying Intel Arc because we were desperate for them to emerge as a disruptor in the GPU duopoly. This time, we're asking you to consider Intel Arc because it's freaking awesome to the point where I would say that it is the only choice for gamers who want to spend $450 to $700 on a tower, or for those folks who have waited so patiently for an upgrade to their faithful GTX 1660 or 10 series. 
Let's start at 1080p, where AMD and NVIDIA are more than happy to gouge you for the luxury of being able to play. First stop, Night City. In Cyberpunk, the B580 makes an impressive debut, running neck and neck with the 4060 Ti 16 gig, a card that retails for $450. You heard me, this one is 80% more expensive. And check out the B580's direct competition, the $300 4060 and the $250 RX 7600. They're behind by nearly 15%. And Intel's lead increases in Red Dead Redemption 2, where their improvements to Vulkan support have lifted the fortunes of even their last generation cards, and brought the B580 in line with the 7700XT, a $400 GPU. This kind of utter dominance, though, isn't across the board. In The Last of Us Part 1, the B580 falls back in line, but then even this poor result has them beating NVIDIA and AMD's current generation price competitors. In F1 23, AMD takes the lead, but they've historically performed really well on the track. And in the old gold standard, Shadow of the Tomb Raider, the B580 barely inches past the RTX 4060 in average FPS and loses in 1% lows, which we classify as a loss. In Returnal, the B580 is neck and neck with the 4060 Ti and the 6700XT, only taking a clear L from the 7700XT, which, again, I remind you, is a $400 card. And then finally in Atomic Heart, the B580's sails lose some of their glorious wind, but hey, the most expensive card on the charts was bound to win at least one game, wasn't it? Looking at the overall picture, Intel, you've outdone yourself. You've pulled off a commanding generational uplift of 55% and beaten your old flagship even, by more than 20%. You've fallen shy of the 4060 Ti, but the bottom line is, if gamers out there are looking to upgrade their aging GPUs for high refresh rate 1080p gaming, you've given us something to finally recommend to them. Not just point at the least worst thing, but to recommend to them. This is a truly incredible achievement for Intel, and they did a lot to make it happen. 
Interestingly, Intel's new architecture, which they don't even call a GPU, leaves no stone unturned in the search for performance improvements. They've jumped to a new process node and made several architectural and driver overhauls that Intel claims gave them a 70% improvement in performance per core and a 50% energy improvement over last gen. Which makes me wonder, why did they hold back? If 20 Xe2 cores are good, then 32 cores on a B770 would be better. Speaking of bigger, nearly 37% of Steam users are gaming at a resolution that is greater than 1080p these days, and that number keeps growing as 1440p and 4K monitors continue to come down in price. So can Intel compete at 1440p as well? The short answer is yes. Across our suite of benchmarks, the B580 overtakes the RX 6700 XT and narrows the gap with the 4060 Ti to just a few percent. As for Intel's promised 10% lead over the 4060, well, it appears they were actually being modest. Across our game selection, we found a lead of 20%, though it is worth noting that this will vary depending on which games you play. And with the famously VRAM-hungry The Last of Us Part 1, the B580 holds its position on our chart, embarrassing NVIDIA again, with the B580's 1% lows besting the 4060's average FPS. Shadow of the Tomb Raider demonstrates just what an incredible generational leap Intel has pulled off here. And it seems like something about Returnal really likes Intel, because all of our Team Blue cards get a nice little bump in the rankings. I'm kind of running out of ways to say Intel's doing well, so in Red Dead Redemption 2, Intel did bloop-a-looby! And in Cyberpunk, they blomp-anated the competition, while making it glaringly obvious just how overdue the 1060 and the 1650 were for a valid, modern 1440p upgrade. With strong, if not mind-blowing performance in both Atomic Heart and F1 23, it's clear that Intel has achieved what they set out to do: made a killer value 1080p gaming card that's also capable of 1440p. 
There are some caveats. Even Intel admits that they don't win in every game, and a motherboard with support for Resizable BAR is mandatory for Arc GPUs. And the B580 isn't a top-of-the-line benchmark buster, so if you have a higher-tier older card like a 3060 Ti, you shouldn't feel compelled to upgrade here. But as far as downsides go, those are pretty minor, and there's more to like than just the raw performance. Let's talk about ray tracing, which is all of a sudden a little more relevant these days than it used to be, especially when you consider that games are starting to list ray tracing as a minimum requirement. While we didn't have time to develop a test for our boy Indy, we did check out the RT performance in a few other games. Tracing those rays still results in a significant performance hit here, but NVIDIA's mature and well-supported RT tech allows them to beat the B580 in a big way in Atomic Heart at 1080p. In Returnal, the B580 catches right back up to the 4060 Ti, but falls back into the pack in F1 23. Overall, you can have a solid 60+ FPS ray-traced gaming experience, assuming you're willing to fiddle with the settings just a little bit. Speaking of which, ultra settings at 1440p? That's beyond the reach of pretty much every card we've tested today, the entire lot failing to break a 60 FPS average in any title. But the extra VRAM on the B580 earns it a sizable lead over the 4060 in Returnal and in F1 23, even if it can't pull out a win in Atomic Heart. AMD's RX 7600? Oof. It's basically crying in the corner, trying to figure out why it's even here. And if you're wondering where the flagship ray tracing title Cyberpunk is, well, we had some Intel problems. It seems to not like running at ultra ray tracing settings on CPUs with 3D V-Cache. But before you start haranguing Intel about game compatibility, they are aware, and they are working on it. Look mom, we made the patch notes. And since the launch of Arc, compatibility has massively improved.
Hardware Unboxed recently showed in a test of 250 games that 233 were completely playable. There were still issues, and some of those were in major titles that took a lot longer to fix than we would have liked. But hopefully Intel will continue this upward trend in compatibility. Now, knowing the price and the 1440p performance numbers, you wouldn't expect this to be a 4K gaming card, and you'd be right. It's not a 4K card. Even at low settings in Cyberpunk, we don't see great frame rates. At least not natively. But Intel's got some more tricks up their sleeves. Desperate to not completely miss the boat on the AI boom, Intel has packed some juicy AI into the B580 in order to up its performance. Oh, I guess if I'm going to talk about AI, I might as well dress the part with my tech bro vest from LTTstore.com. Intel's AI-powered render enhancement sauce comes in the form of XeSS 2. XeSS 2 has three main parts: an AI upscaler that renders the game at a lower resolution, then uses AI to upscale to display resolution; XeSS Frame Generation, which creates extra frames by interpolating visual and in-game vector data, taking two rendered frames and generating a middle point between them to enhance animation smoothness; and a latency reduction component that helps to cancel out some of the extra latency from the aforementioned frame generation. In fewer words, they've invented NVIDIA DLSS, NVIDIA Frame Generation, and NVIDIA Reflex. And what's nice is that, unlike the time AMD invented NVIDIA Reflex with Anti-Lag+, Intel has implemented this in a way where the game developer bakes it into the game, meaning you won't get permabanned for turning it on. We're not going to be doing a deep dive into image quality at this time, but we will be looking at performance, and we will make some anecdotal remarks about image quality. XeSS has already proven itself to be a solid upscaling technology, and XeSS 2 successfully builds upon that foundation.
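The frame generation idea described above, taking two rendered frames and synthesizing a middle point between them, can be sketched in a few lines. This is only a conceptual toy, with plain Python lists standing in for pixel buffers; Intel's actual algorithm also consumes in-game motion vector data, which is what keeps fast motion from smearing:

```python
def interpolate_frame(frame_a, frame_b):
    """Naive frame generation: synthesize a frame halfway between two
    rendered frames by averaging each pixel value. A crude stand-in for
    what XeSS 2 Frame Generation does with motion vectors."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

def with_framegen(frames):
    """Insert one generated frame between every pair of real frames,
    roughly doubling the presented frame rate."""
    out = []
    for prev, nxt in zip(frames, frames[1:]):
        out.append(prev)
        out.append(interpolate_frame(prev, nxt))
    out.append(frames[-1])
    return out

# Two 4-pixel grayscale "frames":
real = [[0, 0, 0, 0], [10, 20, 30, 40]]
print(with_framegen(real))
# → [[0, 0, 0, 0], [5.0, 10.0, 15.0, 20.0], [10, 20, 30, 40]]
```

This also makes the latency trade-off obvious: the interpolated frame can't be shown until the *next* real frame exists, which is exactly the delay the third XeSS 2 component tries to claw back.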
Turning on supersampling at its highest quality level in Cyberpunk 2077 at 1440p, Intel wins in FPS thanks to the raw performance advantage they already had, but their super resolution solution doesn't scale as well. NVIDIA sees a performance uplift of 42% and 32% in the lows and averages respectively, while Intel only gets around a 20% boost. We see the same trend at 4K. The B580 doesn't gain as much performance from XeSS as NVIDIA does from DLSS, but wins by just being a more powerful card. The 4060, though, doesn't even hit 30 FPS here. But the story changes a bit in F1 24, the only title currently available for us to test XeSS 2's frame generation, and this is where things get wacky. In this game, the upscalers are better matched, but in frame generation, it's a blowout. Intel is introducing way more generated frames, with performance skyrocketing to 70% over the upscaled result, and double over native rendering. And it is almost indistinguishable from native rendering unless you really know what to look for. And if you ignore those tiny artifacts, which arise from the upscaling, not the frame gen, it's weird. It has double the FPS with barely a drop in quality. Heck, you can even get over 100 FPS at 4K with XeSS turned on. Granted, Intel probably worked pretty closely with the devs to make sure that this launch title would leave a good impression, but a good impression is a good impression. While Intel pitched this as a gaming card first and foremost, they did include some other value adds, like their enhanced media engine that now supports more codecs than NVIDIA or AMD. In modern AV1 encoding, we can see a clear generational uplift, but Intel can't quite match the speed of NVIDIA's NVENC encoder. And it's worth noting here, by the way, that while AMD does compete on this chart, it does so by outputting the wrong resolution, 1920 by 1082. And yes, that is a hardware-level problem. Yikes.
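All the uplift percentages quoted here (XeSS giving Intel roughly 20% versus DLSS's 32 to 42% for NVIDIA, frame gen landing 70% over the upscaled result) come from the same simple relative comparison of FPS before and after enabling a feature. A minimal sketch, with made-up FPS numbers purely for illustration:

```python
def uplift_pct(baseline_fps, new_fps):
    """Relative performance uplift, as a percentage over the baseline."""
    return (new_fps - baseline_fps) / baseline_fps * 100

# Hypothetical numbers, not our measured data:
native = 50
with_upscaling = 60
print(f"{uplift_pct(native, with_upscaling):.0f}%")  # → 20%
```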
In the more ubiquitous H.264, the B580 snatches the crown, although margins are pretty small across the board here. Now, thanks to their OptiX renderer, Blender sees some good old-fashioned NVIDIA domination, at least on the cards that have RT cores. And interestingly, the B580 performed significantly worse than Intel's last-gen cards. Intel says this is expected, but didn't really go into why. Our best guess, then, is that it's due to the relatively smaller number of Xe cores on this card compared to last gen. But that didn't slow it down in gaming, so... yay gaming. Moving into our AI testing, NVIDIA's dominant position means that most AI software is developed with CUDA in mind, making it easy for folks at home to get up and running quickly. The good news is things are getting easier for AMD and Intel users, and if you want to run some computer vision, large language models, or image generation at home, you can probably get it working on your non-NVIDIA GPU in a few extra steps. And generationally, Intel has made a massive leap in performance, taking the lead in our Stable Diffusion image generation benchmark by a pretty big margin. In computer vision, it loses to the 4060, although just barely. So if you can get it running, it works great. As long as you don't mind using a bunch of power to generate a portrait of your aunt as an actual ant. Speaking of power, we haven't even talked about power yet. The physical PCB of our card is dinky, and Intel takes advantage of this by implementing a new flow-through cooler design. Thermals are well under control, with the hotspot on the B580 never passing 76 degrees in either synthetic or gaming workloads. The B580 is targeting a TDP of 190 watts. That's 25 watts more than the RX 7600 and 4060 Ti, and 75 watts more than the 4060 non-Ti.
So it's clear that even with all of their improvements, Intel does have a ways to go, with NVIDIA pulling off a clean victory when it comes to performance per watt, even if NVIDIA doesn't seem to understand what performance per dollar is. The B580 uses just one PCIe 8-pin for power, so you won't need to upgrade your power supply. That is, unless you have a power supply that is less than 600 watts. Even though it should only draw a max of 225 watts with that power connector, we saw some transient spikes up to 241 watts in F1. That is significantly higher than Intel's advertised 190 watts. So, in conclusion, maybe it's just because it's been so long since we've seen a great budget card, but man, I'm excited. How many times have I had to say, "Well, you could buy this shiny new thing, but it kind of sucks, and you'd be way better off going with something last gen, or second hand, or just not upgrading at all"? Well, not today. I mean, eBay deals obviously still exist, but this is a new GPU that doesn't suck, with modern features, and it's worthy of celebration, even if Intel is a little late to the party. See, the thing is, AMD and NVIDIA are poised to release their next generation of GPUs early next year, so it isn't out of the question for a couple of companies that have had their stock prices double, or even ten-x, to make a move to nuke the Arc B-series from orbit. Also, the looming threat of tariffs could impact the affordability of these cards. But hey, it's going to be a pretty sweet ride up until that time. It's a great GPU that will give you great performance without destroying your wallet. If you guys enjoyed this video, go check out our review of the original Arc. The benchmarks and graphs aren't really relevant, other than to give you the context for how far we have come.
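As a footnote on the power numbers above: the 225-watt ceiling comes straight from the PCI Express spec, where the slot can deliver up to 75 watts and a single 8-pin auxiliary connector up to 150 watts. A quick sanity check, with our observed 241-watt spike plugged in:

```python
PCIE_SLOT_W = 75    # PCIe spec: max power delivered through the slot
EIGHT_PIN_W = 150   # PCIe spec: max for one 8-pin auxiliary connector

spec_ceiling = PCIE_SLOT_W + EIGHT_PIN_W
print(spec_ceiling)  # → 225

transient_w = 241    # the momentary spike we measured in F1
print(transient_w > spec_ceiling)  # → True: a brief excursion past spec
```

Brief transients over spec are common and rarely a problem on a quality PSU, which is why the concern is really about undersized or marginal units below that 600-watt recommendation.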