{"video_id":"OibVY-q2SAw","title":"Is NVIDIA Even Trying - GeForce RTX 5050 Gaming GPU Review","channel":"Linus Tech Tips","show":"Linus Tech Tips","published_at":"2025-09-06T14:53:29Z","duration_s":640,"segments":[{"start_s":0.08,"end_s":9.36,"text":"As much as it pains me to say it, this review of NVIDIA's RTX 5050 is probably","speaker":null,"is_sponsor":0},{"start_s":6.0,"end_s":11.2,"text":"the most important GPU review I will do","speaker":null,"is_sponsor":0},{"start_s":9.36,"end_s":15.92,"text":"this year. Because the bottom line is that for so many of you out there,","speaker":null,"is_sponsor":0},{"start_s":13.599,"end_s":19.6,"text":"whether you're building your own PC or getting it in the pre-built your","speaker":null,"is_sponsor":0},{"start_s":17.44,"end_s":24.88,"text":"grandmother got you for your birthday, this is the GPU you're going to use.","speaker":null,"is_sponsor":0},{"start_s":23.279,"end_s":29.84,"text":"There's nothing inherently wrong with that. Everybody has their own budget.","speaker":null,"is_sponsor":0},{"start_s":27.359,"end_s":34.8,"text":"It's just that when a better graphics card could be had for the same price or","speaker":null,"is_sponsor":0},{"start_s":32.079,"end_s":40.16,"text":"if there's a significant upgrade for a much less significant extra cost, it","speaker":null,"is_sponsor":0},{"start_s":37.04,"end_s":43.52,"text":"hurts me to see these boxes flying off","speaker":null,"is_sponsor":0},{"start_s":40.16,"end_s":46.079,"text":"the shelves instead despite my years of","speaker":null,"is_sponsor":0},{"start_s":43.52,"end_s":51.52,"text":"tactipping to the contrary. Which is why this time I am entrusting everyone else","speaker":null,"is_sponsor":0},{"start_s":49.44,"end_s":55.44,"text":"to persuade you. 
Maybe when you hear it from the people who actually have time","speaker":null,"is_sponsor":0},{"start_s":53.28,"end_s":61.6,"text":"to play video games after work, you'll believe us when we say stay away from","speaker":null,"is_sponsor":0},{"start_s":58.96,"end_s":67.68,"text":"this launch. Maybe they can also help me warn you about our sponsor sponsors.","speaker":null,"is_sponsor":0},{"start_s":65.119,"end_s":71.28,"text":">> That was awful. Dbrand. They paid us to mention their protective Ghost cases and","speaker":null,"is_sponsor":1},{"start_s":69.76,"end_s":74.56,"text":"Prism screen protectors, which you can pick up using our link in the video","speaker":null,"is_sponsor":1},{"start_s":73.119,"end_s":81.68,"text":"description. What they didn't pay us to do is say that they're good.","speaker":null,"is_sponsor":1},{"start_s":78.64,"end_s":82.64,"text":"Okay, go ahead and continue the video by","speaker":null,"is_sponsor":1},{"start_s":81.68,"end_s":88.72,"text":"Dbrand. I guess >> let me start by pointing out that we're using a high-end test bench to evaluate","speaker":null,"is_sponsor":1},{"start_s":86.4,"end_s":92.96,"text":"what will likely be the most entry-level offering from NVIDIA this generation.","speaker":null,"is_sponsor":0},{"start_s":91.04,"end_s":96.479,"text":"This is done to alleviate any potential bottlenecks that may distort our","speaker":null,"is_sponsor":0},{"start_s":94.4,"end_s":102.32,"text":"results, but it's not exactly a realistic configuration. I mean, nobody","speaker":null,"is_sponsor":0},{"start_s":98.88,"end_s":106.24,"text":"is going to pair a $250 RTX 5050 with a","speaker":null,"is_sponsor":0},{"start_s":102.32,"end_s":109.68,"text":"$450 Ryzen 7 7800X3D or with fast,","speaker":null,"is_sponsor":0},{"start_s":106.24,"end_s":111.92,"text":"high-capacity DDR5 memory, or at least","speaker":null,"is_sponsor":0},{"start_s":109.68,"end_s":115.2,"text":"they shouldn't. 
Realistically, if you buy this card, your performance will be","speaker":null,"is_sponsor":0},{"start_s":113.36,"end_s":118.799,"text":"at least a little bit worse than what we're going to show you today. So, with","speaker":null,"is_sponsor":0},{"start_s":116.96,"end_s":123.759,"text":"that in mind, let's kick things off with gaming. This thing blows. And by that, I","speaker":null,"is_sponsor":0},{"start_s":121.52,"end_s":129.52,"text":"mean it trades blows with our RTX 4060 at 1080p. In Counterstrike 2, the RTX","speaker":null,"is_sponsor":0},{"start_s":127.28,"end_s":135.28,"text":"5050 is in the bottom four, beating out the two generation old 3060, but losing","speaker":null,"is_sponsor":0},{"start_s":132.48,"end_s":139.44,"text":"to last gen 4060. And we see that same behavior in Red Dead Redemption 2. On","speaker":null,"is_sponsor":0},{"start_s":137.599,"end_s":145.44,"text":"the other hand, The Last of Us Part One at 1080p is just kind of sad. The 12 gig","speaker":null,"is_sponsor":0},{"start_s":142.16,"end_s":147.52,"text":"RTX 3060 pushes the 5050 from the","speaker":null,"is_sponsor":0},{"start_s":145.44,"end_s":151.36,"text":"bottom. Probably because the 5050 only has 8 gigs of VRAM, whereas this game","speaker":null,"is_sponsor":0},{"start_s":149.44,"end_s":156.319,"text":"utilized over 9 gigs on cards that had the capacity. Sure, we're crushing the","speaker":null,"is_sponsor":0},{"start_s":153.76,"end_s":160.239,"text":"RTX 3050. There's no 4050. So, that's the most recent 50 class card. That's","speaker":null,"is_sponsor":0},{"start_s":158.56,"end_s":167.319,"text":"like congratulating yourself for beating your 8-year-old baby brother in a boxing","speaker":null,"is_sponsor":0},{"start_s":162.4,"end_s":167.319,"text":"match. Sorry, little buddy. 
No mercy.","speaker":null,"is_sponsor":0},{"start_s":167.599,"end_s":173.92,"text":"The 50/50 sometimes beats the 4060, but","speaker":null,"is_sponsor":0},{"start_s":170.8,"end_s":176.0,"text":"usually not by a lot. F1 24 is an extra","speaker":null,"is_sponsor":0},{"start_s":173.92,"end_s":182.319,"text":"frame in the 1% lows and 10 more in the averages. And we see a similar uplift in","speaker":null,"is_sponsor":0},{"start_s":178.319,"end_s":184.879,"text":"Cyberpunk 2077. But what's this? The","speaker":null,"is_sponsor":0},{"start_s":182.319,"end_s":191.36,"text":"4060 wins once you enable ray tracing in both these games. Yep. The 5050 may have","speaker":null,"is_sponsor":0},{"start_s":188.0,"end_s":193.92,"text":"20 4th-gen RT cores, but the whole package","speaker":null,"is_sponsor":0},{"start_s":191.36,"end_s":199.76,"text":"isn't enough to keep up with the 24 3rd-gen RT cores in the 4060.","speaker":null,"is_sponsor":0},{"start_s":197.04,"end_s":204.8,"text":"Pathetic. If you thought 1080p was bad, well, check out 1440p. The 50/50 goes","speaker":null,"is_sponsor":0},{"start_s":202.64,"end_s":209.92,"text":"from trading blows to just blowing it against the 4060. It only wins in a","speaker":null,"is_sponsor":0},{"start_s":207.76,"end_s":214.879,"text":"single game, and that's Cyberpunk. 
Adding insult to injury, check out how","speaker":null,"is_sponsor":0},{"start_s":212.0,"end_s":220.799,"text":"good the 3060 12 gig looks by comparison across every single game we've tested.","speaker":null,"is_sponsor":0},{"start_s":218.0,"end_s":226.239,"text":"Our poor RTX 5050 is barely beating out a 60-class card from two generations ago.","speaker":null,"is_sponsor":0},{"start_s":224.0,"end_s":232.64,"text":"And in VRAM-hungry games like The Last of Us Part One, the 5050 loses again.","speaker":null,"is_sponsor":0},{"start_s":230.239,"end_s":238.879,"text":"We'll talk more about pure value later, but even the 9060 XT and RTX 5060,","speaker":null,"is_sponsor":0},{"start_s":236.159,"end_s":244.08,"text":"despite only having 8 gigs of VRAM, destroy the 50/50 in our gaming tests,","speaker":null,"is_sponsor":0},{"start_s":242.4,"end_s":249.12,"text":"which is pretty bad, especially when you consider that a more future-proof card","speaker":null,"is_sponsor":0},{"start_s":246.159,"end_s":254.56,"text":"like the Intel Arc B580 can now be found easily for around $260.","speaker":null,"is_sponsor":0},{"start_s":252.319,"end_s":258.479,"text":"And that comes with 12 gigs of VRAM. That will make it a safer bet as AAA","speaker":null,"is_sponsor":0},{"start_s":256.56,"end_s":262.72,"text":"game requirements creep up over the next few years, assuming Intel's GPU division","speaker":null,"is_sponsor":0},{"start_s":261.199,"end_s":267.44,"text":"still exists over the next few years. But hey, NVIDIA's frame gen can make up","speaker":null,"is_sponsor":0},{"start_s":264.96,"end_s":270.88,"text":"for the 5050's poor performance, right? Oh, hey, didn't see you there. You know,","speaker":null,"is_sponsor":0},{"start_s":269.44,"end_s":275.84,"text":"this isn't the most scientifically accurate test we've ever done, but here's some slow motion footage of","speaker":null,"is_sponsor":0},{"start_s":273.68,"end_s":280.0,"text":"Cyberpunk running on the RTX 5050 at 1080p. 
The mouse moves, then the game","speaker":null,"is_sponsor":0},{"start_s":277.919,"end_s":284.8,"text":"moves 27 frames of footage later, which indicates overall system latency of 27","speaker":null,"is_sponsor":0},{"start_s":282.8,"end_s":290.28,"text":"milliseconds. Now, let's enable frame generation and crank it up to 4x","speaker":null,"is_sponsor":0},{"start_s":286.72,"end_s":290.28,"text":"alongside DLAA.","speaker":null,"is_sponsor":0},{"start_s":293.6,"end_s":300.479,"text":"Our game is now running at around 200 FPS. Great, but notice the increased","speaker":null,"is_sponsor":0},{"start_s":298.32,"end_s":304.4,"text":"latency. It's pretty minor, and even in twitchy first-person shooters, it could","speaker":null,"is_sponsor":0},{"start_s":302.32,"end_s":309.199,"text":"be imperceptible to many. Sounds like a decent trade-off, right? But our base","speaker":null,"is_sponsor":0},{"start_s":306.4,"end_s":313.36,"text":"frame rate was already a solid 98 frames per second. Frame generation is what we","speaker":null,"is_sponsor":0},{"start_s":311.28,"end_s":317.52,"text":"in the MTG community call a 'win more' card. It only really works well if","speaker":null,"is_sponsor":0},{"start_s":315.52,"end_s":322.479,"text":"you're already in a good situation. Here, watch what happens if we have a","speaker":null,"is_sponsor":0},{"start_s":319.6,"end_s":327.919,"text":"low starting frame rate. At 1440p, our base frame rate averages around 63 FPS.","speaker":null,"is_sponsor":0},{"start_s":325.36,"end_s":332.24,"text":"Good, but nowhere near the almost 100 we just came from. Rendering natively,","speaker":null,"is_sponsor":0},{"start_s":329.759,"end_s":336.0,"text":"we're starting with the same 36 frames of latency that we saw at 1080p with","speaker":null,"is_sponsor":0},{"start_s":334.24,"end_s":341.199,"text":"frame gen set to four in the previous clip. 
So then what happens when we add","speaker":null,"is_sponsor":0},{"start_s":338.88,"end_s":345.84,"text":"frame gen at 1440p? You're looking at MFG 3X where we're getting an additional","speaker":null,"is_sponsor":0},{"start_s":343.84,"end_s":349.6,"text":"12 frames of latency even though our total frames per second are higher. Can","speaker":null,"is_sponsor":0},{"start_s":347.68,"end_s":354.32,"text":"an average person feel it? Not necessarily. But some of you out there","speaker":null,"is_sponsor":0},{"start_s":352.0,"end_s":358.96,"text":"are going to be sensitive to it. Let's be clear, we don't hate frame","speaker":null,"is_sponsor":0},{"start_s":356.479,"end_s":362.88,"text":"generation. And to NVIDIA's credit, they continue to improve the tech with every","speaker":null,"is_sponsor":0},{"start_s":361.12,"end_s":368.639,"text":"single generation just like they did with DLSS. What we do hate is how NVIDIA","speaker":null,"is_sponsor":0},{"start_s":366.4,"end_s":372.56,"text":"uses these features to mislead consumers by posting unfair representations of","speaker":null,"is_sponsor":0},{"start_s":370.88,"end_s":378.12,"text":"video card performance, which is especially bad because not every game","speaker":null,"is_sponsor":0},{"start_s":374.479,"end_s":378.12,"text":"supports those features.","speaker":null,"is_sponsor":0},{"start_s":381.36,"end_s":387.759,"text":"Sure, you could use an inexpensive third-party tool like Lossless Scaling,","speaker":null,"is_sponsor":0},{"start_s":385.44,"end_s":391.52,"text":"but even with the bells and whistles, if you buy this card, you're getting","speaker":null,"is_sponsor":0},{"start_s":389.44,"end_s":394.88,"text":"fleeced. Unlike if you buy the new transparent screwdriver from","speaker":null,"is_sponsor":0},{"start_s":392.639,"end_s":399.28,"text":"ltstore.com, it's a clear winner when it comes to nifty tools. Let's talk price","speaker":null,"is_sponsor":1},{"start_s":397.6,"end_s":405.199,"text":"and value. 
Starting with some of our competitors, AMD's 9060 XT 8 GB, which is","speaker":null,"is_sponsor":0},{"start_s":402.72,"end_s":413.039,"text":"at the top of our 1080p geomean results, costs us $2.32 per frame. The B580, at $2.42,","speaker":null,"is_sponsor":0},{"start_s":410.08,"end_s":417.84,"text":"is very nice. Then we get to the 5050 at $2.65 per frame.","speaker":null,"is_sponsor":0},{"start_s":414.639,"end_s":420.72,"text":"But it gets worse. The RTX 5060,","speaker":null,"is_sponsor":0},{"start_s":417.84,"end_s":426.16,"text":"which costs around $299, just 50 bucks more than the 50/50, beats the 50/50 in","speaker":null,"is_sponsor":0},{"start_s":423.599,"end_s":429.599,"text":"value at $2.52. These aren't massive swings in value","speaker":null,"is_sponsor":0},{"start_s":427.68,"end_s":433.919,"text":"when you're looking at averages, but it still shows what a terrible deal the","speaker":null,"is_sponsor":0},{"start_s":431.44,"end_s":438.4,"text":"50/50 is. A huge part of why you get such better value with the more","speaker":null,"is_sponsor":0},{"start_s":435.599,"end_s":443.599,"text":"expensive card is that for $50 more, you're getting literally 50% more","speaker":null,"is_sponsor":0},{"start_s":440.639,"end_s":447.52,"text":"hardware. The 5060 has 50% more CUDA cores, 50% more texture processing","speaker":null,"is_sponsor":0},{"start_s":445.759,"end_s":451.919,"text":"clusters, 50% more streaming multiprocessors, 50% more tensor cores,","speaker":null,"is_sponsor":0},{"start_s":449.919,"end_s":457.36,"text":"50% more RT cores, 50% more texture units, and 50% more ROP units than the","speaker":null,"is_sponsor":0},{"start_s":454.319,"end_s":461.039,"text":"5050. And while this card is on the same","speaker":null,"is_sponsor":0},{"start_s":457.36,"end_s":464.16,"text":"128-bit bus, it's using GDDR6 instead of","speaker":null,"is_sponsor":0},{"start_s":461.039,"end_s":466.24,"text":"seven. Why? 
Because NVIDIA says G6 is","speaker":null,"is_sponsor":0},{"start_s":464.16,"end_s":470.8,"text":"best for desktops and the more power-efficient G7, well, that's best","speaker":null,"is_sponsor":0},{"start_s":468.24,"end_s":474.56,"text":"for laptops. Cool. So, then why not put the super-efficient stuff in the one card","speaker":null,"is_sponsor":0},{"start_s":472.88,"end_s":479.039,"text":"in the desktop lineup that might actually want an extra low TDP? I just I","speaker":null,"is_sponsor":0},{"start_s":477.28,"end_s":484.0,"text":"don't get it. I quit. I can't I just can't do it anymore. >> Productivity. If there's anything","speaker":null,"is_sponsor":0},{"start_s":481.759,"end_s":489.52,"text":"positive to say about the 50/50, it's that it blends pretty okay. Sure, it","speaker":null,"is_sponsor":0},{"start_s":486.639,"end_s":494.319,"text":"renders our scene slower than the 4060 and hardly faster than a 3060, but sub-2","speaker":null,"is_sponsor":0},{"start_s":492.479,"end_s":498.4,"text":"minutes ain't half bad. We see similar results in PugetBench's Premiere Pro","speaker":null,"is_sponsor":0},{"start_s":496.0,"end_s":503.68,"text":"benchmark, where our 50/50 is clearly pulling ahead and only overshadowed by","speaker":null,"is_sponsor":0},{"start_s":500.479,"end_s":505.68,"text":"the 5060. The Arc B580 puts up a good","speaker":null,"is_sponsor":0},{"start_s":503.68,"end_s":510.639,"text":"fight both here and in PugetBench's Photoshop benchmarks. But then that card","speaker":null,"is_sponsor":0},{"start_s":508.0,"end_s":515.44,"text":"gets a big fat DNF with Puget's DaVinci Resolve test, allowing NVIDIA to get a","speaker":null,"is_sponsor":0},{"start_s":512.719,"end_s":519.919,"text":"couple of blurry W's. Despite NVIDIA's obsession with AI, they failed to make","speaker":null,"is_sponsor":0},{"start_s":517.44,"end_s":525.12,"text":"the 5050's AI performance anything but underwhelming. 
In Procyon AI image","speaker":null,"is_sponsor":0},{"start_s":522.479,"end_s":529.68,"text":"generation, the 50/50 is barely beating the 3060 again. And in text generation,","speaker":null,"is_sponsor":0},{"start_s":527.76,"end_s":535.279,"text":"the 50/50 is nothing to write home about. See what we did there? My biggest","speaker":null,"is_sponsor":0},{"start_s":532.399,"end_s":540.64,"text":"takeaway here is how great the Arc B580 is. Look at how much it crushes even our","speaker":null,"is_sponsor":0},{"start_s":537.279,"end_s":544.24,"text":"5060. Just look at what extra VRAM","speaker":null,"is_sponsor":0},{"start_s":540.64,"end_s":546.08,"text":"and proper support can do, NVIDIA and AMD. If","speaker":null,"is_sponsor":0},{"start_s":544.24,"end_s":551.279,"text":"Intel can figure it out as the new kid showing up with OpenVINO, nobody has a","speaker":null,"is_sponsor":0},{"start_s":548.48,"end_s":555.839,"text":"good excuse. In summary, the RTX 5050 only makes sense as a product designed","speaker":null,"is_sponsor":0},{"start_s":553.279,"end_s":559.76,"text":"to make the RTX 5060 look good. And if you saw that review, you know that it","speaker":null,"is_sponsor":0},{"start_s":557.68,"end_s":563.92,"text":"does not look good. We can't in good conscience recommend the 50/50. It flat","speaker":null,"is_sponsor":0},{"start_s":562.0,"end_s":567.36,"text":"out sucks. And there are so many options that we've mentioned in this video that","speaker":null,"is_sponsor":0},{"start_s":565.519,"end_s":570.32,"text":"exist and we'll have all of those linked in the video description for you to","speaker":null,"is_sponsor":0},{"start_s":568.72,"end_s":575.279,"text":"check out. There's a saying that there are no bad products, just bad prices.","speaker":null,"is_sponsor":0},{"start_s":572.72,"end_s":580.959,"text":"And in RTX 50-series fashion, the price is bad. 
But, you know, if nobody buys it","speaker":null,"is_sponsor":0},{"start_s":579.279,"end_s":586.0,"text":"and it gets discounted to maybe like $200,","speaker":null,"is_sponsor":0},{"start_s":583.519,"end_s":588.399,"text":"maybe then I could finally segue to our sponsor,","speaker":null,"is_sponsor":1},{"start_s":586.8,"end_s":591.44,"text":">> Dbrand. We all know that to prevent accidents, you should use proper","speaker":null,"is_sponsor":1},{"start_s":590.0,"end_s":595.6,"text":"protection. So, if you keep dropping stuff like me, consider protecting what","speaker":null,"is_sponsor":1},{"start_s":593.92,"end_s":599.92,"text":"is most likely your most precious possession, your phone with Dbrand.","speaker":null,"is_sponsor":1},{"start_s":597.76,"end_s":603.92,"text":"Their Ghost cases are ultra durable, super scratch resistant, designed to","speaker":null,"is_sponsor":1},{"start_s":601.92,"end_s":607.2,"text":"never yellow, and they're grippy for a nice hand feel. Plus, if you add on one","speaker":null,"is_sponsor":1},{"start_s":605.6,"end_s":611.519,"text":"of their Prism screen protectors, you'll have 360° protection. They are stupid","speaker":null,"is_sponsor":1},{"start_s":609.68,"end_s":615.44,"text":"simple to apply, and the whole process, like some things in life, only lasts","speaker":null,"is_sponsor":1},{"start_s":613.519,"end_s":619.44,"text":"seconds. And also consider picking up a Glow Circuit skin to add a bit of style","speaker":null,"is_sponsor":1},{"start_s":617.6,"end_s":623.92,"text":"and illumination to your nighttime activities. Smash our link in the","speaker":null,"is_sponsor":1},{"start_s":621.12,"end_s":628.0,"text":"description. Oh, I get it. To learn more today, thanks for watching. If you like","speaker":null,"is_sponsor":1},{"start_s":625.76,"end_s":632.32,"text":"this video, go check out our 9060 XT review. 
The 16 GB model is really the","speaker":null,"is_sponsor":0},{"start_s":630.399,"end_s":636.88,"text":"way to go, but if it's too far out of your price range, then a B580 or even an","speaker":null,"is_sponsor":0},{"start_s":634.16,"end_s":641.36,"text":"8 gig 9060 XT would be better than garbage. Just dog.","speaker":null,"is_sponsor":0}],"full_text":"As much as it pains me to say it, this review of NVIDIA's RTX 5050 is probably the most important GPU review I will do this year. Because the bottom line is that for so many of you out there, whether you're building your own PC or getting it in the pre-built your grandmother got you for your birthday, this is the GPU you're going to use. There's nothing inherently wrong with that. Everybody has their own budget. It's just that when a better graphics card could be had for the same price or if there's a significant upgrade for a much less significant extra cost, it hurts me to see these boxes flying off the shelves instead despite my years of tactipping to the contrary. Which is why this time I am entrusting everyone else to persuade you. Maybe when you hear it from the people who actually have time to play video games after work, you'll believe us when we say stay away from this launch. Maybe they can also help me warn you about our sponsor sponsors. >> That was awful. Dbrand. They paid us to mention their protective Ghost cases and Prism screen protectors, which you can pick up using our link in the video description. What they didn't pay us to do is say that they're good. Okay, go ahead and continue the video by Dbrand. I guess >> let me start by pointing out that we're using a high-end test bench to evaluate what will likely be the most entry-level offering from NVIDIA this generation. This is done to alleviate any potential bottlenecks that may distort our results, but it's not exactly a realistic configuration. 
I mean, nobody is going to pair a $250 RTX 5050 with a $450 Ryzen 7 7800X3D or with fast, high-capacity DDR5 memory, or at least they shouldn't. Realistically, if you buy this card, your performance will be at least a little bit worse than what we're going to show you today. So, with that in mind, let's kick things off with gaming. This thing blows. And by that, I mean it trades blows with our RTX 4060 at 1080p. In Counterstrike 2, the RTX 5050 is in the bottom four, beating out the two generation old 3060, but losing to last gen 4060. And we see that same behavior in Red Dead Redemption 2. On the other hand, The Last of Us Part One at 1080p is just kind of sad. The 12 gig RTX 3060 pushes the 5050 from the bottom. Probably because the 5050 only has 8 gigs of VRAM, whereas this game utilized over 9 gigs on cards that had the capacity. Sure, we're crushing the RTX 3050. There's no 4050. So, that's the most recent 50 class card. That's like congratulating yourself for beating your 8-year-old baby brother in a boxing match. Sorry, little buddy. No mercy. The 50/50 sometimes beats the 4060, but usually not by a lot. F1 24 is an extra frame in the 1% lows and 10 more in the averages. And we see a similar uplift in Cyberpunk 2077. But what's this? The 4060 wins once you enable ray tracing in both these games. Yep. The 5050 may have 20 4th-gen RT cores, but the whole package isn't enough to keep up with the 24 3rd-gen RT cores in the 4060. Pathetic. If you thought 1080p was bad, well, check out 1440p. The 50/50 goes from trading blows to just blowing it against the 4060. It only wins in a single game, and that's Cyberpunk. Adding insult to injury, check out how good the 3060 12 gig looks by comparison across every single game we've tested. Our poor RTX 5050 is barely beating out a 60-class card from two generations ago. And in VRAM-hungry games like The Last of Us Part One, the 5050 loses again. 
We'll talk more about pure value later, but even the 9060 XT and RTX 5060, despite only having 8 gigs of VRAM, destroy the 50/50 in our gaming tests, which is pretty bad, especially when you consider that a more future-proof card like the Intel Arc B580 can now be found easily for around $260. And that comes with 12 gigs of VRAM. That will make it a safer bet as AAA game requirements creep up over the next few years, assuming Intel's GPU division still exists over the next few years. But hey, NVIDIA's frame gen can make up for the 5050's poor performance, right? Oh, hey, didn't see you there. You know, this isn't the most scientifically accurate test we've ever done, but here's some slow motion footage of Cyberpunk running on the RTX 5050 at 1080p. The mouse moves, then the game moves 27 frames of footage later, which indicates overall system latency of 27 milliseconds. Now, let's enable frame generation and crank it up to 4x alongside DLAA. Our game is now running at around 200 FPS. Great, but notice the increased latency. It's pretty minor, and even in twitchy first-person shooters, it could be imperceptible to many. Sounds like a decent trade-off, right? But our base frame rate was already a solid 98 frames per second. Frame generation is what we in the MTG community call a 'win more' card. It only really works well if you're already in a good situation. Here, watch what happens if we have a low starting frame rate. At 1440p, our base frame rate averages around 63 FPS. Good, but nowhere near the almost 100 we just came from. Rendering natively, we're starting with the same 36 frames of latency that we saw at 1080p with frame gen set to four in the previous clip. So then what happens when we add frame gen at 1440p? You're looking at MFG 3X where we're getting an additional 12 frames of latency even though our total frames per second are higher. Can an average person feel it? Not necessarily. But some of you out there are going to be sensitive to it. 
Let's be clear, we don't hate frame generation. And to NVIDIA's credit, they continue to improve the tech with every single generation just like they did with DLSS. What we do hate is how NVIDIA uses these features to mislead consumers by posting unfair representations of video card performance, which is especially bad because not every game supports those features. Sure, you could use an inexpensive third-party tool like Lossless Scaling, but even with the bells and whistles, if you buy this card, you're getting fleeced. Unlike if you buy the new transparent screwdriver from ltstore.com, it's a clear winner when it comes to nifty tools. Let's talk price and value. Starting with some of our competitors, AMD's 9060 XT 8 GB, which is at the top of our 1080p geomean results, costs us $2.32 per frame. The B580, at $2.42, is very nice. Then we get to the 5050 at $2.65 per frame. But it gets worse. The RTX 5060, which costs around $299, just 50 bucks more than the 50/50, beats the 50/50 in value at $2.52. These aren't massive swings in value when you're looking at averages, but it still shows what a terrible deal the 50/50 is. A huge part of why you get such better value with the more expensive card is that for $50 more, you're getting literally 50% more hardware. The 5060 has 50% more CUDA cores, 50% more texture processing clusters, 50% more streaming multiprocessors, 50% more tensor cores, 50% more RT cores, 50% more texture units, and 50% more ROP units than the 5050. And while this card is on the same 128-bit bus, it's using GDDR6 instead of seven. Why? Because NVIDIA says G6 is best for desktops and the more power-efficient G7, well, that's best for laptops. Cool. So, then why not put the super-efficient stuff in the one card in the desktop lineup that might actually want an extra low TDP? I just I don't get it. I quit. I can't I just can't do it anymore. >> Productivity. If there's anything positive to say about the 50/50, it's that it blends pretty okay. 
Sure, it renders our scene slower than the 4060 and hardly faster than a 3060, but sub-2 minutes ain't half bad. We see similar results in PugetBench's Premiere Pro benchmark, where our 50/50 is clearly pulling ahead and only overshadowed by the 5060. The Arc B580 puts up a good fight both here and in PugetBench's Photoshop benchmarks. But then that card gets a big fat DNF with Puget's DaVinci Resolve test, allowing NVIDIA to get a couple of blurry W's. Despite NVIDIA's obsession with AI, they failed to make the 5050's AI performance anything but underwhelming. In Procyon AI image generation, the 50/50 is barely beating the 3060 again. And in text generation, the 50/50 is nothing to write home about. See what we did there? My biggest takeaway here is how great the Arc B580 is. Look at how much it crushes even our 5060. Just look at what extra VRAM and proper support can do, NVIDIA and AMD. If Intel can figure it out as the new kid showing up with OpenVINO, nobody has a good excuse. In summary, the RTX 5050 only makes sense as a product designed to make the RTX 5060 look good. And if you saw that review, you know that it does not look good. We can't in good conscience recommend the 50/50. It flat out sucks. And there are so many options that we've mentioned in this video that exist and we'll have all of those linked in the video description for you to check out. There's a saying that there are no bad products, just bad prices. And in RTX 50-series fashion, the price is bad. But, you know, if nobody buys it and it gets discounted to maybe like $200, maybe then I could finally segue to our sponsor, >> Dbrand. We all know that to prevent accidents, you should use proper protection. So, if you keep dropping stuff like me, consider protecting what is most likely your most precious possession, your phone with Dbrand. Their Ghost cases are ultra durable, super scratch resistant, designed to never yellow, and they're grippy for a nice hand feel. 
Plus, if you add on one of their Prism screen protectors, you'll have 360° protection. They are stupid simple to apply, and the whole process, like some things in life, only lasts seconds. And also consider picking up a Glow Circuit skin to add a bit of style and illumination to your nighttime activities. Smash our link in the description. Oh, I get it. To learn more today, thanks for watching. If you like this video, go check out our 9060 XT review. The 16 GB model is really the way to go, but if it's too far out of your price range, then a B580 or even an 8 gig 9060 XT would be better than garbage. Just dog."}