{"video_id":"fp_Q6uluOHyOg","title":"AMD Radeon RX 9070 and 9070 XT Review","channel":"Linus Tech Tips","show":"Linus Tech Tips","published_at":"2025-03-05T14:01:00.029Z","duration_s":1103,"segments":[{"start_s":0.0,"end_s":3.74,"text":"Oh, Linus Sebastian is remoting in from Texas","speaker":null,"is_sponsor":0},{"start_s":3.74,"end_s":10.6,"text":"to present our next award of the evening. And the least crap GPU of 2025 award goes to AMD","speaker":null,"is_sponsor":0},{"start_s":12.16,"end_s":16.12,"text":"for the Radeon 9070XT.","speaker":null,"is_sponsor":0},{"start_s":16.12,"end_s":20.96,"text":"That's right, folks. After fumbling for years, AMD has finally figured out","speaker":null,"is_sponsor":0},{"start_s":20.96,"end_s":25.72,"text":"how to manufacture and price a GPU that will actually sell.","speaker":null,"is_sponsor":0},{"start_s":25.72,"end_s":30.24,"text":"And all it took for this to happen was for NVIDIA to slip on a banana peel,","speaker":null,"is_sponsor":0},{"start_s":30.24,"end_s":34.2,"text":"fall down an elevator shaft, and then have a piano drop down.","speaker":null,"is_sponsor":0},{"start_s":34.2,"end_s":39.08,"text":"As for the 9070 non-XT, it's also here","speaker":null,"is_sponsor":0},{"start_s":39.08,"end_s":45.56,"text":"in all seriousness, though. 
At $599 and $550, both of these cards are well-priced,","speaker":null,"is_sponsor":0},{"start_s":45.56,"end_s":49.68,"text":"capable of both 4K gaming and actually good ray tracing,","speaker":null,"is_sponsor":0},{"start_s":49.68,"end_s":53.24,"text":"and even add AI-powered upscaling capabilities.","speaker":null,"is_sponsor":0},{"start_s":53.24,"end_s":56.32,"text":"But as per AMD's long, proud tradition,","speaker":null,"is_sponsor":0},{"start_s":56.32,"end_s":61.96,"text":"they also come with caveats, like performance in productivity and power consumption.","speaker":null,"is_sponsor":0},{"start_s":61.96,"end_s":67.72,"text":"With that said, if they can stay in stock, I think we're gonna be able to look past those little details","speaker":null,"is_sponsor":0},{"start_s":67.72,"end_s":73.6,"text":"toward a brighter future for gamers and a bright segue to our sponsor.","speaker":null,"is_sponsor":0},{"start_s":73.6,"end_s":77.6,"text":"[\"Segue of the Year\"]","speaker":null,"is_sponsor":0},{"start_s":82.52,"end_s":85.84,"text":"AMD says these cards, they're for high-resolution gaming.","speaker":null,"is_sponsor":0},{"start_s":85.84,"end_s":91.76,"text":"So let's talk 1080p later and jump right into 1440p rasterization where, wow,","speaker":null,"is_sponsor":0},{"start_s":91.76,"end_s":96.48,"text":"does the 9070XT ever make NVIDIA look like a bunch of greedy buffoons.","speaker":null,"is_sponsor":0},{"start_s":96.48,"end_s":101.36,"text":"It nearly matches the performance of the RTX 5070 Ti","speaker":null,"is_sponsor":0},{"start_s":101.36,"end_s":106.36,"text":"while coming in $150 below the MSRP of that card,","speaker":null,"is_sponsor":0},{"start_s":106.6,"end_s":111.16,"text":"which of course is an imaginary price that no gamer has ever paid.","speaker":null,"is_sponsor":0},{"start_s":111.16,"end_s":114.4,"text":"Even the lesser non-XT variant looks decent here,","speaker":null,"is_sponsor":0},{"start_s":114.4,"end_s":119.0,"text":"going 
blow-for-blow with the 5070 across all of our benchmarks.","speaker":null,"is_sponsor":0},{"start_s":119.0,"end_s":123.32,"text":"We even got the occasional clear win for AMD, like in Alan Wake 2,","speaker":null,"is_sponsor":0},{"start_s":123.32,"end_s":128.32,"text":"where the non-XT beats the non-Ti by a whopping 17%.","speaker":null,"is_sponsor":0},{"start_s":129.12,"end_s":132.16,"text":"The only definitive loss for AMD in our suite","speaker":null,"is_sponsor":0},{"start_s":132.16,"end_s":136.96,"text":"was in Black Myth Wukong, where the Ti beats the XT by around 9%.","speaker":null,"is_sponsor":0},{"start_s":136.96,"end_s":144.0,"text":"But the second you factor in the price, AMD's small L turns into a big W.","speaker":null,"is_sponsor":0},{"start_s":144.0,"end_s":149.24,"text":"In our Vulkan benchmark, Red Dead Redemption 2, we do see some worrying issues in frame-pacing","speaker":null,"is_sponsor":0},{"start_s":149.24,"end_s":153.64,"text":"where both of the new cards drop in ranking because of their poor 1% lows,","speaker":null,"is_sponsor":0},{"start_s":153.64,"end_s":157.64,"text":"despite solid average FPS, so hopefully this is something that AMD's team","speaker":null,"is_sponsor":0},{"start_s":157.64,"end_s":162.8,"text":"can work on post-launch. 
Overall though, we are off to a great start.","speaker":null,"is_sponsor":0},{"start_s":162.8,"end_s":167.12,"text":"The 9070XT is a clear winner in performance and value,","speaker":null,"is_sponsor":0},{"start_s":167.12,"end_s":173.08,"text":"and as for the 9070, well, it's about as bad a deal as the RTX 5070,","speaker":null,"is_sponsor":0},{"start_s":173.08,"end_s":177.64,"text":"at least on paper, and we'll talk a bit more about our theory as to why later.","speaker":null,"is_sponsor":0},{"start_s":177.64,"end_s":181.24,"text":"First, I wanna talk about 4K, because unlike NVIDIA,","speaker":null,"is_sponsor":0},{"start_s":181.24,"end_s":186.24,"text":"AMD gave their $550 card sufficient VRAM for Ultra HD gaming.","speaker":null,"is_sponsor":0},{"start_s":187.12,"end_s":190.96,"text":"Sure, it's GDDR6 and not GDDR7,","speaker":null,"is_sponsor":0},{"start_s":190.96,"end_s":195.84,"text":"but faster memory is gonna be irrelevant when you don't have the capacity to keep up,","speaker":null,"is_sponsor":0},{"start_s":195.84,"end_s":201.16,"text":"and there is no better illustration of this than the way that the 9070 extends its lead","speaker":null,"is_sponsor":0},{"start_s":201.16,"end_s":206.44,"text":"at higher resolutions, especially when we look at the all-important 1% lows,","speaker":null,"is_sponsor":0},{"start_s":206.44,"end_s":209.92,"text":"which is what determines the smoothness of your gameplay.","speaker":null,"is_sponsor":0},{"start_s":209.92,"end_s":216.92,"text":"Look at The Last of Us Part I here. The 9070 leads by a whopping 23% over the RTX 5070.","speaker":null,"is_sponsor":0},{"start_s":218.34,"end_s":221.56,"text":"Why? 
Well, look at the VRAM usage.","speaker":null,"is_sponsor":0},{"start_s":221.56,"end_s":227.0,"text":"On the 5070, it's full, which results in these gameplay hitches that you see here.","speaker":null,"is_sponsor":0},{"start_s":227.0,"end_s":232.44,"text":"Compare that to the 9070, or to NVIDIA's own cards that do have 16 gigs,","speaker":null,"is_sponsor":0},{"start_s":232.44,"end_s":237.16,"text":"and those hitches disappear. And I mean, sure, not everyone has a 4K monitor,","speaker":null,"is_sponsor":0},{"start_s":237.16,"end_s":240.28,"text":"and these limitations don't rear their heads in every game,","speaker":null,"is_sponsor":0},{"start_s":240.28,"end_s":244.2,"text":"or even most games, but still, it's kind of embarrassing","speaker":null,"is_sponsor":0},{"start_s":244.2,"end_s":248.0,"text":"when you're supposed to be the market leader in gaming GPUs, right?","speaker":null,"is_sponsor":0},{"start_s":248.0,"end_s":251.12,"text":"Overall, the 9000 series does fall a bit short","speaker":null,"is_sponsor":0},{"start_s":251.12,"end_s":254.68,"text":"of true 60 FPS 4K Ultra gaming,","speaker":null,"is_sponsor":0},{"start_s":254.68,"end_s":259.04,"text":"and in our tests, it did not achieve the lofty margins","speaker":null,"is_sponsor":0},{"start_s":259.08,"end_s":262.52,"text":"over the 7900 GRE that AMD promised in their slides,","speaker":null,"is_sponsor":0},{"start_s":262.52,"end_s":268.72,"text":"but we're not that disappointed yet, because AMD included ray tracing in their averages,","speaker":null,"is_sponsor":0},{"start_s":268.72,"end_s":272.68,"text":"and that is a very different story with these new cards.","speaker":null,"is_sponsor":0},{"start_s":272.68,"end_s":276.52,"text":"Before we look at those results though, let's dive into what AMD has done","speaker":null,"is_sponsor":0},{"start_s":276.52,"end_s":279.8,"text":"to bring about this generational leap in performance.","speaker":null,"is_sponsor":0},{"start_s":279.8,"end_s":284.72,"text":"To 
me, the most impressive thing is shrinkage. The new monolithic die that underpins both the 9070","speaker":null,"is_sponsor":0},{"start_s":284.72,"end_s":287.8,"text":"and 9070 XT should have been codenamed Frightened Turtle.","speaker":null,"is_sponsor":0},{"start_s":287.84,"end_s":294.24,"text":"It's built on TSMC's N4P process node, and it crams 92% as many transistors as the 7900 XTX","speaker":null,"is_sponsor":0},{"start_s":294.24,"end_s":299.4,"text":"into just two thirds of the die area. This, along with some major improvements in performance","speaker":null,"is_sponsor":0},{"start_s":299.4,"end_s":302.56,"text":"per CU, is what makes these cards such a compelling value.","speaker":null,"is_sponsor":0},{"start_s":302.56,"end_s":305.76,"text":"Diving deeper into the compute engine, we see vastly improved matrix operations","speaker":null,"is_sponsor":0},{"start_s":305.76,"end_s":308.92,"text":"with support for more data types, a new dynamic register allocator,","speaker":null,"is_sponsor":0},{"start_s":308.92,"end_s":313.12,"text":"and improvements to the scheduler. Combine this with dual SIMD32 vector units,","speaker":null,"is_sponsor":0},{"start_s":313.12,"end_s":318.44,"text":"overhauled AI accelerators, and beefed up ray tracing capabilities, and you get a card that can do a lot of calculations","speaker":null,"is_sponsor":0},{"start_s":318.44,"end_s":322.2,"text":"all at once, especially when it comes to ray tracing and AI.","speaker":null,"is_sponsor":0},{"start_s":322.2,"end_s":326.16,"text":"But wait, there is more. 
Both cards get an improved media engine,","speaker":null,"is_sponsor":0},{"start_s":326.16,"end_s":330.08,"text":"which provides a considerable improvement for low bit rate encoding during streaming.","speaker":null,"is_sponsor":0},{"start_s":330.08,"end_s":333.8,"text":"YouTube's compression might make this impossible to see, but to the human eye,","speaker":null,"is_sponsor":0},{"start_s":333.8,"end_s":339.8,"text":"in these Twitch-optimized recordings of Returnal, there's a clear improvement over the 7900 XTX.","speaker":null,"is_sponsor":0},{"start_s":339.8,"end_s":343.44,"text":"And it's even difficult to distinguish AMD from NVIDIA's NVENC encoding.","speaker":null,"is_sponsor":0},{"start_s":343.44,"end_s":347.92,"text":"We'll talk more about encoding later, but it's really great to see AMD finally catching up here.","speaker":null,"is_sponsor":0},{"start_s":347.92,"end_s":351.96,"text":"But where AMD is still behind is in 4:2:2 hardware encoding and decoding,","speaker":null,"is_sponsor":0},{"start_s":351.96,"end_s":355.52,"text":"which could make this a less desirable option for professional video creators,","speaker":null,"is_sponsor":0},{"start_s":355.52,"end_s":360.92,"text":"but it's unlikely to matter for non-pros. Another slight disappointment compared to the RTX 50 series","speaker":null,"is_sponsor":0},{"start_s":360.92,"end_s":367.4,"text":"is AMD's DisplayPort 2.1a ports, which are only UHBR 13.5 rather than UHBR 20.","speaker":null,"is_sponsor":0},{"start_s":367.4,"end_s":372.72,"text":"So these new cards can still do 4K 240 Hertz, but they will rely on display stream compression to do so.","speaker":null,"is_sponsor":0},{"start_s":372.72,"end_s":376.8,"text":"Not a huge deal to me. I find it nigh imperceptible, but your mileage may vary.","speaker":null,"is_sponsor":0},{"start_s":376.8,"end_s":382.92,"text":"Enough specs. 
Let's talk ray tracing, where both the new cards beat the 7900 XTX,","speaker":null,"is_sponsor":0},{"start_s":382.92,"end_s":388.72,"text":"a card that was well received at a thousand US dollars. Too bad they don't fare quite as well against NVIDIA.","speaker":null,"is_sponsor":0},{"start_s":388.72,"end_s":392.2,"text":"In Alan Wake 2, the 9070 keeps up with the 5070,","speaker":null,"is_sponsor":0},{"start_s":392.2,"end_s":395.72,"text":"showing just how bad the 5070 is,","speaker":null,"is_sponsor":0},{"start_s":395.72,"end_s":399.24,"text":"but the 5070 Ti provides substantially better performance","speaker":null,"is_sponsor":0},{"start_s":399.24,"end_s":403.04,"text":"than the 9070 XT, giving NVIDIA one clear win","speaker":null,"is_sponsor":0},{"start_s":403.04,"end_s":408.48,"text":"for their overpriced 50 series cards, or one win so far.","speaker":null,"is_sponsor":0},{"start_s":408.48,"end_s":415.64,"text":"In the heavily path traced Black Myth Wukong, the 9000 cards handily outclass the 7000 series,","speaker":null,"is_sponsor":0},{"start_s":415.64,"end_s":421.12,"text":"but fall substantially behind NVIDIA's latest, and as for F1 24,","speaker":null,"is_sponsor":0},{"start_s":421.12,"end_s":427.08,"text":"AMD looks great across the board here, while the 5070 gets beaten by its own predecessor,","speaker":null,"is_sponsor":0},{"start_s":427.08,"end_s":429.64,"text":"the 4070 Super. It...","speaker":null,"is_sponsor":0},{"start_s":431.0,"end_s":434.56,"text":"The clown show. Removing Black Myth Wukong from the equation,","speaker":null,"is_sponsor":0},{"start_s":434.56,"end_s":437.92,"text":"the 9070 family is neck and neck with the competition,","speaker":null,"is_sponsor":0},{"start_s":437.92,"end_s":441.64,"text":"showing that AMD is no longer two generations behind","speaker":null,"is_sponsor":0},{"start_s":441.64,"end_s":446.76,"text":"in ray tracing. 
With that said, Black Myth Wukong does in fact exist,","speaker":null,"is_sponsor":0},{"start_s":446.76,"end_s":450.76,"text":"and shows that AMD is still one generation behind,","speaker":null,"is_sponsor":0},{"start_s":450.76,"end_s":455.56,"text":"which may matter in future games. And even in AMD's own tech demo,","speaker":null,"is_sponsor":0},{"start_s":455.56,"end_s":460.68,"text":"we can see that their path tracing implementation struggles with boiling artifacts and ghosting,","speaker":null,"is_sponsor":0},{"start_s":460.68,"end_s":463.92,"text":"which reminds me more of last gen's ray reconstruction.","speaker":null,"is_sponsor":0},{"start_s":463.92,"end_s":467.6,"text":"Hey, but here's hoping that AMD can continue to catch up","speaker":null,"is_sponsor":0},{"start_s":467.6,"end_s":470.84,"text":"and maybe even surpass NVIDIA in the future.","speaker":null,"is_sponsor":0},{"start_s":470.84,"end_s":475.32,"text":"Maybe. A guy can dream. Hey, did we leave out 1080p raster?","speaker":null,"is_sponsor":0},{"start_s":475.32,"end_s":480.4,"text":"We ran those numbers, so damn it, I want them in the video. Even if the story remains largely the same,","speaker":null,"is_sponsor":0},{"start_s":480.4,"end_s":487.04,"text":"the 5070 Ti is still a pretty bad deal. 
It does win, but its margin of victory is so small","speaker":null,"is_sponsor":0},{"start_s":487.04,"end_s":491.04,"text":"compared to the pricing chasm between these cards that it's really hard to recommend.","speaker":null,"is_sponsor":0},{"start_s":491.04,"end_s":496.56,"text":"So if you are looking for an overkill 1080p upgrade, the 9070 XT looks like a great way","speaker":null,"is_sponsor":0},{"start_s":496.56,"end_s":499.8,"text":"to push huge FPS numbers in eSports titles.","speaker":null,"is_sponsor":0},{"start_s":499.8,"end_s":502.8,"text":"Or any title, if you don't mind a little AI upscaling.","speaker":null,"is_sponsor":0},{"start_s":502.8,"end_s":506.12,"text":"AMD said at one point that they don't think AI is necessary for upscaling,","speaker":null,"is_sponsor":0},{"start_s":506.12,"end_s":511.84,"text":"and they could get the same results using more traditional means. Ha, spoken like a company that did not have AI figured out","speaker":null,"is_sponsor":0},{"start_s":511.84,"end_s":517.08,"text":"yet. The good news is that AMD has used their slow start to squeeze a lot of image quality out of temporal upscaling,","speaker":null,"is_sponsor":0},{"start_s":517.08,"end_s":521.6,"text":"and now that they've added their proprietary FP8 machine learning model to improve upscaling even further,","speaker":null,"is_sponsor":0},{"start_s":521.6,"end_s":526.84,"text":"the results are very impressive. Compared to FSR 3.1, we see reductions in artifacting","speaker":null,"is_sponsor":0},{"start_s":526.84,"end_s":531.68,"text":"and improvements in the rendering of fine lines. There are a few instances where it even beats DLSS4,","speaker":null,"is_sponsor":0},{"start_s":531.68,"end_s":536.64,"text":"like with these butterflies in Horizon Forbidden West. On FSR3, there's unsightly trailing and dissolving,","speaker":null,"is_sponsor":0},{"start_s":536.64,"end_s":542.08,"text":"and on DLSS4, they're just kind of blurry. 
But with FSR4, they appear much crisper and clearer","speaker":null,"is_sponsor":0},{"start_s":542.08,"end_s":545.8,"text":"than either of the other technologies. Sure, when you look closely, the artifacts are still there,","speaker":null,"is_sponsor":0},{"start_s":545.8,"end_s":549.28,"text":"but it's a marked improvement. Same with the haloing around detailed character models,","speaker":null,"is_sponsor":0},{"start_s":549.28,"end_s":552.52,"text":"it's very heavily reduced. And motion is just sharper in general.","speaker":null,"is_sponsor":0},{"start_s":552.52,"end_s":557.24,"text":"I would say that at least when upscaling to 4K, I found the image quality of FSR4 to be no more distracting","speaker":null,"is_sponsor":0},{"start_s":557.24,"end_s":560.88,"text":"than that of DLSS4. But none of the reduced image quality tradeoffs matter","speaker":null,"is_sponsor":0},{"start_s":560.88,"end_s":566.52,"text":"if we don't get better performance. And compared to DLSS, FSR gives a less substantial performance","speaker":null,"is_sponsor":0},{"start_s":566.52,"end_s":572.24,"text":"uplift at each quality setting, in both The Last of Us Part I and in Horizon Zero Dawn.","speaker":null,"is_sponsor":0},{"start_s":572.24,"end_s":576.72,"text":"But we do see that AMD's Framegen provides a better uplift than NVIDIA's solution,","speaker":null,"is_sponsor":0},{"start_s":576.72,"end_s":579.88,"text":"which indicates that FSR 3.1 Framegen has less overhead","speaker":null,"is_sponsor":0},{"start_s":579.88,"end_s":583.6,"text":"than NVIDIA's Multiframegen. While AMD doesn't quite match NVIDIA yet,","speaker":null,"is_sponsor":0},{"start_s":583.6,"end_s":587.84,"text":"this is the most competitive their upscaler has ever been.","speaker":null,"is_sponsor":0},{"start_s":587.84,"end_s":592.36,"text":"Except support is a problem. 
Even though you can use the driver to force the updated","speaker":null,"is_sponsor":0},{"start_s":592.36,"end_s":596.76,"text":"FSR4 model in games with FSR3, AMD just has fewer games that use FSR3.","speaker":null,"is_sponsor":0},{"start_s":596.76,"end_s":600.16,"text":"And unlike DLSS4, older cards are not able to take advantage","speaker":null,"is_sponsor":0},{"start_s":600.16,"end_s":604.16,"text":"of the new machine-learning-enhanced upscaling. And AMD still doesn't have a competitor","speaker":null,"is_sponsor":0},{"start_s":604.16,"end_s":607.76,"text":"for Multiframegen yet. However, FSR4 does have Framegen,","speaker":null,"is_sponsor":0},{"start_s":607.76,"end_s":611.08,"text":"but it makes use of FSR3.1's Framegen implementation.","speaker":null,"is_sponsor":0},{"start_s":611.08,"end_s":615.32,"text":"But you can bring Framegen to any game using AMD's driver-level Fluid Motion Frames,","speaker":null,"is_sponsor":0},{"start_s":615.32,"end_s":620.52,"text":"now on version 2.1. But even with the additional decimal point, AFMF still sucks.","speaker":null,"is_sponsor":0},{"start_s":620.52,"end_s":625.4,"text":"Without game integration, you get all these weird UI issues and overlays get mangled, with motion-heavy heck","speaker":null,"is_sponsor":0},{"start_s":625.4,"end_s":629.2,"text":"even in motion-light scenes. Let's be real, it's just frame interpolation","speaker":null,"is_sponsor":0},{"start_s":629.2,"end_s":634.08,"text":"and no number of fancy names can change that. And if the use case for NVIDIA's Framegen was already weak,","speaker":null,"is_sponsor":0},{"start_s":634.08,"end_s":641.56,"text":"AFMF feels like it exists only so AMD can say, actually we support Framegen in way more games.","speaker":null,"is_sponsor":0},{"start_s":641.56,"end_s":645.8,"text":"But no, no one wants it. 
It's, I don't know, who cares?","speaker":null,"is_sponsor":0},{"start_s":645.8,"end_s":649.84,"text":"Despite the swath of shareholder-approved AI junk","speaker":null,"is_sponsor":0},{"start_s":649.84,"end_s":655.12,"text":"in their latest driver software, AMD falls pretty well short of the mark here.","speaker":null,"is_sponsor":0},{"start_s":655.12,"end_s":661.4,"text":"There are some great gen over gen improvements in computer vision, but even the 4070 Super","speaker":null,"is_sponsor":0},{"start_s":661.4,"end_s":665.2,"text":"manages a sizable lead over the 9070XT.","speaker":null,"is_sponsor":0},{"start_s":665.2,"end_s":670.96,"text":"The generational gains are even more apparent in Stable Diffusion, but the story remains the same.","speaker":null,"is_sponsor":0},{"start_s":670.96,"end_s":677.92,"text":"Finally, there's AI text generation, and I bet AMD wishes an AI could rewrite these benchmark results","speaker":null,"is_sponsor":0},{"start_s":677.92,"end_s":681.12,"text":"as the new cards fail to improve substantially","speaker":null,"is_sponsor":0},{"start_s":681.12,"end_s":684.2,"text":"over last gen and get absolutely dunked on by NVIDIA,","speaker":null,"is_sponsor":0},{"start_s":684.2,"end_s":690.52,"text":"regardless of which model you're using. The one silver lining is that either of these cards","speaker":null,"is_sponsor":0},{"start_s":690.52,"end_s":694.16,"text":"can run all the same models that the 5080 can,","speaker":null,"is_sponsor":0},{"start_s":694.16,"end_s":697.64,"text":"and can run some that the 5070 cannot.","speaker":null,"is_sponsor":0},{"start_s":697.64,"end_s":701.96,"text":"Thanks to AMD's generous 16 gigs of VRAM.","speaker":null,"is_sponsor":0},{"start_s":701.96,"end_s":706.0,"text":"I mean, it's only generous compared to NVIDIA, but thank you guys.","speaker":null,"is_sponsor":0},{"start_s":706.0,"end_s":709.92,"text":"Don't be too thankful though. 
AMD still needs to impress in content creation","speaker":null,"is_sponsor":0},{"start_s":709.92,"end_s":713.28,"text":"where they do okay for video editing,","speaker":null,"is_sponsor":0},{"start_s":713.28,"end_s":718.6,"text":"but the 9070 and 70XT provide good, but not exceptional performance in Premiere Pro","speaker":null,"is_sponsor":0},{"start_s":718.6,"end_s":723.96,"text":"and DaVinci Resolve. And in Blender, the 9070 and 70XT perform about on par","speaker":null,"is_sponsor":0},{"start_s":723.96,"end_s":727.04,"text":"with the flagships from AMD's last generation,","speaker":null,"is_sponsor":0},{"start_s":727.04,"end_s":732.08,"text":"but they still just don't really get anywhere close to NVIDIA thanks to OptiX rendering.","speaker":null,"is_sponsor":0},{"start_s":732.08,"end_s":737.08,"text":"Yeah, there was a transition written there that made more sense before we had to record this.","speaker":null,"is_sponsor":0},{"start_s":737.08,"end_s":742.68,"text":"Speaking of updates, we have some encoder testing for you. While AMD has made strides to improve their encoder,","speaker":null,"is_sponsor":0},{"start_s":742.68,"end_s":745.96,"text":"they still fall behind in quality compared to NVIDIA and Intel,","speaker":null,"is_sponsor":0},{"start_s":745.96,"end_s":751.16,"text":"especially at lower bit rates in H.264. For this encoding test, we only set the bit rate","speaker":null,"is_sponsor":0},{"start_s":751.16,"end_s":754.32,"text":"and no other parameters. 
Depending on what encoder tweaks are used,","speaker":null,"is_sponsor":0},{"start_s":754.32,"end_s":758.24,"text":"any of these cards could have improved quality compared to our tests.","speaker":null,"is_sponsor":0},{"start_s":758.24,"end_s":761.48,"text":"But the simplest way to improve quality is to use AV1.","speaker":null,"is_sponsor":0},{"start_s":761.48,"end_s":766.08,"text":"But sadly, we've excluded the 7000 series from our results due to a hardware bug.","speaker":null,"is_sponsor":0},{"start_s":766.08,"end_s":769.24,"text":"The good news is everyone outputs better quality in AV1,","speaker":null,"is_sponsor":0},{"start_s":769.24,"end_s":771.96,"text":"and AMD is sadly still in last place.","speaker":null,"is_sponsor":0},{"start_s":772.88,"end_s":775.88,"text":"We are almost there. Let's talk power and thermals.","speaker":null,"is_sponsor":0},{"start_s":775.88,"end_s":780.2,"text":"Over the past several generations, AMD has made some substantial improvements to efficiency,","speaker":null,"is_sponsor":0},{"start_s":780.2,"end_s":784.4,"text":"and in their announcement, they didn't really talk much about it. 
I guess they were being modest.","speaker":null,"is_sponsor":0},{"start_s":784.4,"end_s":790.36,"text":"In F1 24, the 9070 pulls lower power on average than the 5070, but with less stable power delivery,","speaker":null,"is_sponsor":0},{"start_s":790.36,"end_s":794.48,"text":"having transient spikes as high as 321 watts.","speaker":null,"is_sponsor":0},{"start_s":794.48,"end_s":800.44,"text":"Speaking of which, the 9070 XT has a massive spike to 426 watts, and its average is 309 watts,","speaker":null,"is_sponsor":0},{"start_s":800.44,"end_s":804.48,"text":"which is already higher than its rated TBP of 305 watts.","speaker":null,"is_sponsor":0},{"start_s":804.48,"end_s":808.0,"text":"And that propensity for pulling profuse power can be found in Kombustor,","speaker":null,"is_sponsor":0},{"start_s":808.0,"end_s":812.94,"text":"where both AMD cards also use more than their rated total board power on average.","speaker":null,"is_sponsor":0},{"start_s":812.94,"end_s":816.72,"text":"But AMD is allowing some partner cards to use a higher power budget,","speaker":null,"is_sponsor":0},{"start_s":816.72,"end_s":821.12,"text":"so I guess this isn't fully out of spec, but I would definitely recommend you stick","speaker":null,"is_sponsor":0},{"start_s":821.12,"end_s":825.96,"text":"with the manufacturer recommended power supply capacity. And I recommend you check out PSU circuit","speaker":null,"is_sponsor":0},{"start_s":825.96,"end_s":829.72,"text":"if you need to make an upgrade decision. Well, that's good info there. Thankfully, despite the power draw,","speaker":null,"is_sponsor":0},{"start_s":829.72,"end_s":833.04,"text":"thermals seem to be under control on our provided Sapphire Pulse samples.","speaker":null,"is_sponsor":0},{"start_s":833.04,"end_s":838.16,"text":"I'm sure in part thanks to their use of PTM7950, which you can buy for yourself over at LTTstore.com.","speaker":null,"is_sponsor":0},{"start_s":838.16,"end_s":843.84,"text":"That's a double plug, double whammy. 
Self promo, baby! AMD doesn't feel the need to hide GPU hotspot metrics","speaker":null,"is_sponsor":0},{"start_s":843.84,"end_s":847.6,"text":"like NVIDIA does, and in Kombustor, we see that Sapphire's coolers provide","speaker":null,"is_sponsor":0},{"start_s":847.6,"end_s":850.6,"text":"ample thermal headroom, which is good. And they're a little big.","speaker":null,"is_sponsor":0},{"start_s":850.6,"end_s":854.64,"text":"They're not obnoxious, but you know, come on. It does allow the cards to stay much cooler","speaker":null,"is_sponsor":0},{"start_s":854.64,"end_s":858.76,"text":"than say the 5070 or the 6700 XT in F1 24.","speaker":null,"is_sponsor":0},{"start_s":858.76,"end_s":862.04,"text":"So if you're coming from an older card, you can confidently upgrade to either the 9070","speaker":null,"is_sponsor":0},{"start_s":862.04,"end_s":867.2,"text":"or the 9070 XT, and know you'll get a solid improvement in raster performance and ray tracing,","speaker":null,"is_sponsor":0},{"start_s":867.2,"end_s":871.48,"text":"which is a big plus for folks who are on older ray tracing cards like the 2000 series,","speaker":null,"is_sponsor":0},{"start_s":871.48,"end_s":876.24,"text":"or if you're coming from the 6000 series on AMD. It's just, it's good, it's good, it's cool, it's nice.","speaker":null,"is_sponsor":0},{"start_s":876.24,"end_s":879.24,"text":"We like it, we're happy. They could be cheaper.","speaker":null,"is_sponsor":0},{"start_s":879.24,"end_s":882.56,"text":"They could be, they could be cheaper. There's kind of two conclusions here.","speaker":null,"is_sponsor":0},{"start_s":882.56,"end_s":888.44,"text":"A short one and a long one. 
The short one is that the 9070 XT is a winner.","speaker":null,"is_sponsor":0},{"start_s":888.44,"end_s":893.4,"text":"If AMD can keep this thing in stock and you have $600 to spend on a gaming GPU,","speaker":null,"is_sponsor":0},{"start_s":893.4,"end_s":897.44,"text":"you are gonna love this thing, and NVIDIA needs to respond now,","speaker":null,"is_sponsor":0},{"start_s":897.44,"end_s":901.02,"text":"or AMD might actually take some real market share for a change.","speaker":null,"is_sponsor":0},{"start_s":901.02,"end_s":906.14,"text":"The long conclusion is that it seems like AMD is trying to eat their cake and have it too here.","speaker":null,"is_sponsor":0},{"start_s":906.14,"end_s":909.88,"text":"See, the XT's 599 price point is giving real","speaker":null,"is_sponsor":0},{"start_s":909.88,"end_s":914.08,"text":"good guy AMD vibes, but the 9070 non-XT's price","speaker":null,"is_sponsor":0},{"start_s":914.08,"end_s":917.16,"text":"is giving maximized margins while GPUs","speaker":null,"is_sponsor":0},{"start_s":917.16,"end_s":923.24,"text":"are in short supply vibes. By matching the 5070 in both price and performance,","speaker":null,"is_sponsor":0},{"start_s":923.24,"end_s":926.72,"text":"unfortunately, you've also matched it in value,","speaker":null,"is_sponsor":0},{"start_s":926.72,"end_s":931.2,"text":"and let's be real. 
The 5070 is not a great value.","speaker":null,"is_sponsor":0},{"start_s":931.2,"end_s":934.44,"text":"Once nobody buys the non-XT because the XT","speaker":null,"is_sponsor":0},{"start_s":934.44,"end_s":940.2,"text":"is so much better for just $50 more, you're gonna end up dropping the price on this thing","speaker":null,"is_sponsor":0},{"start_s":940.2,"end_s":944.12,"text":"after the reputational damage has already been done to it.","speaker":null,"is_sponsor":0},{"start_s":944.12,"end_s":948.92,"text":"Did you guys learn nothing from the terrible initial reception to the 7900XT?","speaker":null,"is_sponsor":0},{"start_s":948.92,"end_s":952.28,"text":"And it's not like you have just reputation","speaker":null,"is_sponsor":0},{"start_s":952.28,"end_s":956.4,"text":"for days to give up. You have nothing to compete in the high end.","speaker":null,"is_sponsor":0},{"start_s":956.4,"end_s":960.84,"text":"And sure, 85% of gamers do buy cards under $700.","speaker":null,"is_sponsor":0},{"start_s":960.84,"end_s":964.28,"text":"I'm sure that's true, but without a high end card at all,","speaker":null,"is_sponsor":0},{"start_s":964.28,"end_s":969.08,"text":"you're giving up precious mind share. 
I mean, we didn't even bother to compare","speaker":null,"is_sponsor":0},{"start_s":969.08,"end_s":972.48,"text":"the 9000 series against NVIDIA's last gen flagship,","speaker":null,"is_sponsor":0},{"start_s":972.48,"end_s":975.44,"text":"the 4090, let alone their new flagship.","speaker":null,"is_sponsor":0},{"start_s":975.48,"end_s":979.56,"text":"And yes, 16 gigs of VRAM at 550 bucks is nice,","speaker":null,"is_sponsor":0},{"start_s":979.56,"end_s":983.44,"text":"but it's a bummer that the cheapest way to get more VRAM in your lineup","speaker":null,"is_sponsor":0},{"start_s":983.44,"end_s":987.52,"text":"is still a last generation 7900XT.","speaker":null,"is_sponsor":0},{"start_s":987.52,"end_s":992.92,"text":"If you have no plans to compete in the high end, we need you guys dominating the mid-range","speaker":null,"is_sponsor":0},{"start_s":992.92,"end_s":996.96,"text":"and enthusiast segments, especially because your non-gaming performance","speaker":null,"is_sponsor":0},{"start_s":996.96,"end_s":1000.94,"text":"is still kind of lacking. 
Another thing that we need to acknowledge","speaker":null,"is_sponsor":0},{"start_s":1000.94,"end_s":1004.92,"text":"in our longer conclusion is the PlayStation-shaped elephant in the room.","speaker":null,"is_sponsor":0},{"start_s":1004.92,"end_s":1009.16,"text":"As impressive as these cards are, there is still a strong argument to be made","speaker":null,"is_sponsor":0},{"start_s":1009.16,"end_s":1012.6,"text":"that PC gaming has just plain gotten too expensive","speaker":null,"is_sponsor":0},{"start_s":1012.6,"end_s":1019.96,"text":"when you can pick up an all-in-one gaming box that will already run at 4K for less than the MSRP of a 9070","speaker":null,"is_sponsor":0},{"start_s":1019.96,"end_s":1023.34,"text":"and even less if you buy it secondhand.","speaker":null,"is_sponsor":0},{"start_s":1023.34,"end_s":1026.8,"text":"Sure, you'll pay more for games and online services in the long run,","speaker":null,"is_sponsor":0},{"start_s":1026.8,"end_s":1032.08,"text":"but boy, is it ever a reasonable upfront cost. But hey, maybe the best is yet to come","speaker":null,"is_sponsor":0},{"start_s":1032.08,"end_s":1037.26,"text":"on the discrete GPU side. AMD has announced that 9060 cards will be coming in Q2","speaker":null,"is_sponsor":0},{"start_s":1037.26,"end_s":1040.26,"text":"and those could provide even better value,","speaker":null,"is_sponsor":0},{"start_s":1040.26,"end_s":1045.22,"text":"assuming of course that AMD doesn't launch them at a high price in order to maximize margin for those ones,","speaker":null,"is_sponsor":0},{"start_s":1045.22,"end_s":1049.9,"text":"which grosses me out just thinking about it. 
What I'm not grossed out by is ending this video","speaker":null,"is_sponsor":0},{"start_s":1049.9,"end_s":1053.54,"text":"saying that this is truly probably the best GPU launch","speaker":null,"is_sponsor":0},{"start_s":1053.54,"end_s":1059.78,"text":"of the year and, save for Intel's B580, perhaps the best launch of the past several years","speaker":null,"is_sponsor":0},{"start_s":1059.78,"end_s":1064.54,"text":"if they can keep it in stock. But as per usual, for you, the consumer,","speaker":null,"is_sponsor":0},{"start_s":1064.54,"end_s":1067.58,"text":"there is no single right answer, just what's right for you.","speaker":null,"is_sponsor":0},{"start_s":1067.58,"end_s":1071.38,"text":"So whether you wanna go with these new cards or pay extra for Team Green","speaker":null,"is_sponsor":0},{"start_s":1071.38,"end_s":1074.5,"text":"or go for a console, all the power to you.","speaker":null,"is_sponsor":0},{"start_s":1074.5,"end_s":1077.58,"text":"And all the power to our sponsor!","speaker":null,"is_sponsor":0},{"start_s":1077.58,"end_s":1081.38,"text":"Thanks for watching, guys. We are pretty tired from doing back to back to back","speaker":null,"is_sponsor":0},{"start_s":1081.38,"end_s":1085.62,"text":"to back GPU launches. Massive shout out to you for watching","speaker":null,"is_sponsor":0},{"start_s":1085.62,"end_s":1091.1,"text":"and of course to our Labs team, our editors, our camera team, our writers, everyone for being part of it.","speaker":null,"is_sponsor":0},{"start_s":1091.1,"end_s":1095.22,"text":"I think there's a little time to breathe. Wait, 9060 in Q2? Okay, okay, no.","speaker":null,"is_sponsor":0},{"start_s":1095.22,"end_s":1098.7,"text":"Oh, and 5060 I guess is coming. 
Well, whatever, if you liked this video,","speaker":null,"is_sponsor":0},{"start_s":1098.7,"end_s":1102.22,"text":"check out, I don't know, the 5080 review.","speaker":null,"is_sponsor":0}],"full_text":"Oh, Linus Sebastian is remoting in from Texas to present our next award of the evening. And the least crap GPU of 2025 award goes to AMD for the Radeon 9070XT. That's right, folks. After fumbling for years, AMD has finally figured out how to manufacture and price a GPU that will actually sell. And all it took for this to happen was for NVIDIA to slip on a banana peel, fall down an elevator shaft, and then have a piano drop down. As for the 9070 non-XT, it's also here. In all seriousness, though, at $599 and $550, both of these cards are well-priced, capable of both 4K gaming and actually good ray tracing, and even add AI-powered upscaling capabilities. But as per AMD's long, proud tradition, they also come with caveats, like performance and productivity and power consumption. With that said, if they can stay in stock, I think we're gonna be able to look past those little details toward a brighter future for gamers and a bright segue to our sponsor. [\"Suggestion of the Year\"] AMD says these cards, they're for high-resolution gaming. So let's talk 1080p later and jump right into 1440p rasterization where, wow, does the 9070XT ever make NVIDIA look like a bunch of greedy buffoons. It nearly matches the performance of the RTX 5070 Ti while coming in $150 below the MSRP of that card, which of course is an imaginary price that no gamer has ever paid. Even the lesser non-XT variant looks decent here, going blow-for-blow with the 5070 across all of our benchmarks. We even got the occasional clear win for AMD, like in Alan Wake 2, where the non-XT beats the non-Ti by a whopping 17%. The only definitive loss for AMD in our suite was in Black Myth Wukong, where the Ti beats the XT by around 9%. But the second you factor in the price, AMD's small L turns into a big W. 
In our Vulkan benchmark, Red Dead Redemption 2, we do see some worrying issues in frame-pacing where both of the new cards drop in ranking because of their poor 1% lows, despite solid average FPS, so hopefully this is something that AMD's team can work on post-launch. Overall though, we are off to a great start. The 9070XT is a clear winner in performance and value, and as for the 9070, well, it's about as bad a deal as the RTX 5070, at least on paper, and we'll talk a bit more about our theory as to why later. First, I wanna talk about 4K, because unlike NVIDIA, AMD gave their $550 card sufficient VRAM for Ultra HD gaming. Sure, it's GDDR6 and not GDDR7, but faster memory is gonna be irrelevant when you don't have the capacity to keep up, and there is no better illustration of this than the way that the 9070 extends its lead at higher resolutions, especially when we look at the all-important 1% lows, which is what determines the smoothness of your gameplay. Look at The Last of Us Part 1 here. The 9070 leads by a whopping 23% over the RTX 5070. Why? Well, look at the VRAM usage. On the 5070, it's full, which results in these gameplay hitches that you see here. Compare that to the 9070, or to NVIDIA's own cards that do have 16 gigs, and those hitches disappear. And I mean, sure, not everyone has a 4K monitor, and these limitations don't rear their heads in every game, or even most games, but still, it's kind of embarrassing when you're supposed to be the market leader in gaming GPUs, right? Overall, the 9000 series does fall a bit short of true 60 FPS 4K Ultra gaming, and in our tests, it did not achieve the lofty margins over the 7900 GRE that AMD promised in their slides, but we're not that disappointed yet, because AMD included ray tracing in their averages, and that is a very different story with these new cards. Before we look at those results though, let's dive into what AMD has done to bring about this generational leap in performance. 
To me, the most impressive thing is shrinkage. The new monolithic die that underpins both the 9070 and 9070 XT should have been codenamed Frightened Turtle. It's built on TSMC's N4C process node, and it crams 92% as many transistors as the 7900 XTX into just two thirds of the die area. This, along with some major improvements in performance per CU, is what makes these cards such a compelling value. Diving deeper into the compute engine, we see vastly improved matrix operations with support for more data types, a new dynamic register allocator, and improvements to the scheduler. Combine this with dual SIMD32 vector units, overhauled AI accelerators, and beefed up ray tracing capabilities, and you get a card that can do a lot of calculations all at once, especially when it comes to ray tracing and AI. But wait, there is more. Both cards get an improved media engine, which provides a considerable improvement for low bit rate encoding during streaming. YouTube's compression might make this impossible to see, but to the human eye, in these Twitch-optimized recordings of Returnal, there's a clear improvement over the 7900 XTX. And it's even difficult to distinguish AMD from NVIDIA's NVENC encoding. We'll talk more about encoding later, but it's really great to see AMD finally catching up here. But where AMD is still behind is in 4:2:2 hardware encoding and decoding, which could make this a less desirable option for professional video creators, but it's unlikely to matter for non-pros. Another slight disappointment compared to the RTX 50 series is AMD's DisplayPort 2.1a ports, which are only UHBR 13.5 rather than UHBR 20. So these new cards can still do 4K 240 Hertz, but they will rely on display stream compression to do so. Not a huge deal to me. I find it nigh imperceptible, but your mileage may vary. Enough specs. Let's talk ray tracing, where both the new cards beat the 7900 XTX, a card that was well received at a thousand US dollars. 
Too bad they don't fare quite as well against NVIDIA. In Alan Wake 2, the 9070 keeps up with the 5070, showing just how bad the 5070 is, but the 5070 Ti provides substantially better performance than the 9070 XT, giving NVIDIA one clear win for their overpriced 50 series cards, or one win so far. In the heavily path traced Black Myth Wukong, the 9000 cards handily outclass the 7000 series, but fall substantially behind NVIDIA's latest, and as for F1 24, AMD looks great across the board here, while the 5070 gets beaten by its own predecessor, the 4070 Super. It... The clown show. Removing Black Myth Wukong from the equation, the 9070 family is neck and neck with the competition, showing that AMD is no longer two generations behind in ray tracing. With that said, Black Myth Wukong does in fact exist, and shows that AMD is still one generation behind, which may matter in future games. And even in AMD's own tech demo, we can see that their path tracing implementation struggles with boiling artifacts and ghosting that remind me more of last gen's ray reconstruction. But hey, here's hoping that AMD can continue to catch up and maybe even surpass NVIDIA in the future. A guy can dream. Hey, did we leave out 1080p raster? We ran those numbers, so damn it, I want them in the video. Even if the story remains largely the same, the 5070 Ti is still a pretty bad deal. It does win, but its margin of victory is so small compared to the pricing chasm between these cards that it's really hard to recommend. So if you are looking for an overkill 1080p upgrade, the 9070 XT looks like a great way to push huge FPS numbers in eSports titles. Or any title, if you don't mind a little AI upscaling. AMD said at one point that they don't think AI is necessary for upscaling, and they could get the same results using more traditional means. Ha, spoken like a company that did not have AI figured out yet. 
The good news is that AMD has used their slow start to squeeze a lot of image quality out of temporal upscaling, and now that they've added their proprietary FP8 machine learning model to improve upscaling even further, the results are very impressive. Compared to FSR 3.1, we see reductions in artifacting and improvements in the rendering of fine lines. There are a few instances where it even beats DLSS4, like with these butterflies in Horizon Forbidden West. On FSR3, there's unsightly trailing and dissolving, and on DLSS4, they're just kind of blurry. But with FSR4, they appear much crisper and clearer than either of the other technologies. Sure, when you look closely, the artifacts are still there, but it's a marked improvement. Likewise, the haloing around detailed character models is very heavily reduced. And motion is just sharper in general. I would say that at least when upscaling to 4K, I found the image quality of FSR4 to be no more distracting than that of DLSS4. But none of the reduced image quality tradeoffs matter if we don't get better performance. And compared to DLSS, FSR gives a less substantial performance uplift at each quality setting, in both the Last of Us Part 1 and in Horizon Zero Dawn. But we do see that AMD's Framegen provides a better uplift than NVIDIA's solution, which indicates that FSR 3.1 Framegen has less overhead than NVIDIA's Multiframegen. While AMD doesn't quite match NVIDIA yet, this is the most competitive their upscaler has ever been. Except support is a problem. Even though you can use the driver to force the updated FSR4 model in games with FSR3, AMD just has fewer games that use FSR3. And unlike DLSS4, older cards are not able to take advantage of the new machine learning-enhanced upscaling. And AMD still doesn't have a competitor for Multiframegen yet. However, FSR4 does have Framegen, but it makes use of FSR3.1's Framegen implementation. 
But you can bring Framegen to any game using AMD's driver-level Fluid Motion Frames, now on version 2.1. But even with the additional decimal point, AFMF still sucks. Without game integration, you get all these weird UI issues, and overlays get mangled in motion-heavy, heck, even motion-light scenes. Let's be real, it's just frame interpolation and no number of fancy names can change that. And if the use case for NVIDIA's Framegen was already weak, AFMF feels like it exists only so AMD can say, actually we support Framegen in way more games. But no, no one wants it. It's, I don't know, who cares? Despite the swath of shareholder approved AI junk in their latest driver software, AMD falls pretty well short of the mark here. There are some great gen over gen improvements in computer vision, but even the 4070 Super manages a sizable lead over the 9070XT. The generational gains are even more apparent in Stable Diffusion, but the story remains the same. Finally, there's AI text generation, and I bet AMD wishes an AI could rewrite these benchmark results, where the new cards fail to improve substantially over last gen and get absolutely dunked on by NVIDIA, regardless of which model you're using. The one silver lining is that either of these cards can run all the same models that the 5080 can, and can run some that the 5070 cannot, thanks to AMD's generous 16 gigs of VRAM. I mean, it's only generous compared to NVIDIA, but thank you guys. Don't be too thankful though. AMD still needs to impress in content creation, where they do okay for video editing, but the 9070 and 9070 XT provide good, but not exceptional performance in Premiere Pro and DaVinci Resolve. And in Blender, the 9070 and 9070 XT perform about on par with the flagships from AMD's last generation, but they still just don't really get anywhere close to NVIDIA thanks to OptiX rendering. Yeah, there was a transition that was written that made more sense before we had to record this. 
Speaking of updates, we have some encoder testing for you. While AMD has made strides to improve their encoder, they still fall behind in quality compared to NVIDIA and Intel, especially at lower bit rates in H.264. For this encoding test, we only set the bit rate and no other parameters. Depending on what encoder tweaks are used, any of these cards could have improved quality compared to our tests. But the simplest way to improve quality is to use AV1. But sadly, we've excluded the 7000 series from our results due to a hardware bug. The good news is everyone outputs better quality in AV1, and AMD is sadly still in last place. We are almost there. Let's talk power and thermals. Over the past several generations, AMD has made some substantial improvements to efficiency, and in their announcement, they didn't really talk much about it. I guess they were being modest. In F1 24, the 9070 pulls lower power on average than the 5070, but with less stable power delivery, having transient spikes as high as 321 watts. Speaking of which, the 9070 XT has a massive spike to 426 watts, and its average is 309 watts, which is already higher than its rated TBP of 305 watts. And that propensity for pulling profuse power can be found in Kombustor, where both AMD cards also use more than their rated total board power on average. But AMD is allowing some partner cards to use a higher power budget, so I guess this isn't fully out of spec, but I would definitely recommend you stick with the manufacturer recommended power supply capacity. And I recommend you check out PSU circuit if you need to make an upgrade decision. Well, that's good info there. Thankfully, despite the power draw, thermals seem to be under control on our provided Sapphire Pulse samples. I'm sure in part thanks to their use of PTM7950, which you can buy for yourself over at LTTstore.com. That's a double plug, double whammy. Self promo, baby! 
AMD doesn't feel the need to hide GPU hotspot metrics like NVIDIA does, and in Kombustor, we see that Sapphire's coolers provide ample thermal headroom, which is good. And they're a little big. They're not obnoxious, but you know, come on. It does allow the cards to stay much cooler than, say, the 5070 or the 6700 XT in F1 24. So if you're coming from an older card, you can confidently upgrade to either the 9070 or the 9070 XT, and know you'll get a solid improvement in raster performance and ray tracing, which is a big plus for folks who are on older ray tracing cards like the 2000 series, or if you're coming from the 6000 series on AMD. It's just, it's good, it's good, it's cool, it's nice. We like it, we're happy. They could be cheaper. They could be, they could be cheaper. There's kind of two conclusions here. A short one and a long one. The short one is that the 9070 XT is a winner. If AMD can keep this thing in stock and you have $600 to spend on a gaming GPU, you are gonna love this thing, and NVIDIA needs to respond now, or AMD might actually take some real market share for a change. The long conclusion is that it seems like AMD is trying to eat their cake and have it too here. See, the XT's $599 price point is giving real good guy AMD vibes, but the 9070 non-XT's price is giving maximized margins while GPUs are in short supply vibes. By matching the 5070 in both price and performance, unfortunately, you've also matched it in value, and let's be real. The 5070 is not a great value. Once nobody buys the non-XT because the XT is so much better for just $50 more, you're gonna end up dropping the price on this thing after the reputational damage has already been done to it. Did you guys learn nothing from the terrible initial reception to the 7900XT? And it's not like you have just reputation for days to give up. You have nothing to compete in the high end. And sure, 85% of gamers do buy cards under $700. 
I'm sure that's true, but without a high end card at all, you're giving up precious mind share. I mean, we didn't even bother to compare the 9000 series against NVIDIA's last gen flagship, the 4090, let alone their new flagship. And yes, 16 gigs of VRAM at 550 bucks is nice, but it's a bummer that the cheapest way to get more VRAM in your lineup is still a last generation 7900XT. If you have no plans to compete in the high end, we need you guys dominating the mid-range and enthusiast segments, especially because your non-gaming performance is still kind of lacking. Another thing that we need to acknowledge in our longer conclusion is the PlayStation-shaped elephant in the room. As impressive as these cards are, there is still a strong argument to be made that PC gaming has just plain gotten too expensive when you can pick up an all-in-one gaming box that will already run at 4K for less than the MSRP of a 9070 and even less if you buy it secondhand. Sure, you'll pay more for games and online services in the long run, but boy, is it ever a reasonable upfront cost. But hey, maybe the best is yet to come on the discrete GPU side. AMD has announced that 9060 cards will be coming in Q2 and those could provide even better value, assuming of course that AMD doesn't launch them at a high price in order to maximize margin for those ones, which grosses me out just thinking about it. What I'm not grossed out by is ending this video saying that this is truly probably the best GPU launch of the year and, save for Intel's B580, perhaps the best launch of the past several years if they can keep it in stock. But as per usual, for you, the consumer, there is no single right answer, just what's right for you. So whether you wanna go with these new cards or pay extra for Team Green or go for a console, all the power to you. And all the power to our sponsor! Thanks for watching, guys. We are pretty tired from doing back to back to back to back GPU launches. 
Massive shout out to you for watching and of course to our Labs team, our editors, our camera team, our writers, everyone for being part of it. I think there's a little time to breathe. Wait, 9060 in Q2? Okay, okay, no. Oh, and 5060 I guess is coming. Well, whatever, if you liked this video, check out, I don't know, the 5080 review."}