{"video_id":"PUdE61PDa9c","title":"GPUs Are Getting MORE Power-Hungry","channel":"Techquickie","show":"Techquickie","published_at":"2023-05-05T14:58:16Z","duration_s":278,"segments":[{"start_s":0.0,"end_s":5.36,"text":"Pointless! That's how a lot of people these days would describe high wattage power supplies that","speaker":null,"is_sponsor":0},{"start_s":5.36,"end_s":10.16,"text":"promise you a thousand watts or more of power with an unreasonably high price tag.","speaker":null,"is_sponsor":0},{"start_s":10.16,"end_s":14.04,"text":"They used to be more common when people rocked multiple graphics cards in their rig, but","speaker":null,"is_sponsor":0},{"start_s":14.04,"end_s":19.08,"text":"as SLI and CrossFire fell out of favor due to stability and performance issues, so did","speaker":null,"is_sponsor":0},{"start_s":19.08,"end_s":24.64,"text":"those hefty power supplies. But are the days coming where they might be relevant once again?","speaker":null,"is_sponsor":0},{"start_s":24.72,"end_s":30.2,"text":"Yeah, they very well may be, if trends in graphics card power continue the way they're","speaker":null,"is_sponsor":0},{"start_s":30.2,"end_s":35.32,"text":"going. And if you haven't paid super close attention to exactly what's going on in GPU land,","speaker":null,"is_sponsor":0},{"start_s":35.32,"end_s":40.2,"text":"this might seem surprising, considering power efficiency has been a huge trend across electronics.","speaker":null,"is_sponsor":0},{"start_s":40.2,"end_s":44.28,"text":"I mean, we're getting to the point where it's disappointing if a high-end smartphone","speaker":null,"is_sponsor":0},{"start_s":44.28,"end_s":49.28,"text":"only gives you one day of battery life, and the latest Apple Silicon-equipped MacBook","speaker":null,"is_sponsor":0},{"start_s":49.28,"end_s":56.56,"text":"Pros boast over twenty hours. And although we don't run our desktop gaming rigs off of batteries, graphics cards have","speaker":null,"is_sponsor":0},{"start_s":56.56,"end_s":62.16,"text":"been getting more efficient. But the issue is that although the number of frames you get per watt of power has been","speaker":null,"is_sponsor":0},{"start_s":62.16,"end_s":66.16,"text":"increasing, the total power has also been increasing.","speaker":null,"is_sponsor":0},{"start_s":66.16,"end_s":69.92,"text":"Not to mention the fact that both chip makers and game developers keep pushing the boundaries","speaker":null,"is_sponsor":0},{"start_s":69.92,"end_s":73.6,"text":"of visual fidelity. It's just always more photorealistic.","speaker":null,"is_sponsor":0},{"start_s":73.6,"end_s":82.08,"text":"Give me some block graphics, it's okay. Just a few years ago, NVIDIA's then-top-end Titan Xp drew around 250 watts at load, with","speaker":null,"is_sponsor":0},{"start_s":82.08,"end_s":86.2,"text":"AMD's competing Radeon VII weighing in at 300 watts.","speaker":null,"is_sponsor":0},{"start_s":86.2,"end_s":93.28,"text":"But now, Team Red's current best, the Radeon RX 6950 XT, has increased to 330 watts, while","speaker":null,"is_sponsor":0},{"start_s":93.28,"end_s":98.84,"text":"NVIDIA's 3090 Ti has a TDP of a whopping 450 watts.","speaker":null,"is_sponsor":0},{"start_s":98.84,"end_s":103.8,"text":"May as well have just come out of a flame broiler. Toasted GPU.","speaker":null,"is_sponsor":0},{"start_s":103.8,"end_s":108.88,"text":"And the expectation is that the upcoming RTX 4080, featuring NVIDIA's new Ada Lovelace","speaker":null,"is_sponsor":0},{"start_s":108.88,"end_s":114.92,"text":"architecture, could clock in at around 400 or 500 watts, while the 4090 could suck down","speaker":null,"is_sponsor":0},{"start_s":114.92,"end_s":118.16,"text":"as much as 600 watts of power on its own.","speaker":null,"is_sponsor":0},{"start_s":118.16,"end_s":122.08,"text":"Now, one big reason that manufacturers might not be paying too much attention to how much","speaker":null,"is_sponsor":0},{"start_s":122.08,"end_s":126.76,"text":"power their cards are guzzling is simply because they don't particularly have to.","speaker":null,"is_sponsor":0},{"start_s":126.76,"end_s":131.24,"text":"I mean, sure, you can advertise a desktop card as being power efficient or having a","speaker":null,"is_sponsor":0},{"start_s":131.24,"end_s":135.08,"text":"good cooling solution, but at the end of the day, the thing that's going to sell cards","speaker":null,"is_sponsor":0},{"start_s":135.08,"end_s":138.92,"text":"is performance. Vroom vroom.","speaker":null,"is_sponsor":0},{"start_s":138.92,"end_s":143.44,"text":"AMD and NVIDIA would much rather compete on FPS benchmarks than trying to one-up the","speaker":null,"is_sponsor":0},{"start_s":143.44,"end_s":148.64,"text":"other company by saying, hey, our GPU uses 15% less power.","speaker":null,"is_sponsor":0},{"start_s":148.64,"end_s":154.2,"text":"And this trend might continue due to the rise of chiplets in CPUs and GPUs instead of the","speaker":null,"is_sponsor":0},{"start_s":154.2,"end_s":157.44,"text":"use of one big monolithic chip die.","speaker":null,"is_sponsor":0},{"start_s":157.44,"end_s":161.68,"text":"If you don't know, chiplets are modular chip pieces that can be combined to act as","speaker":null,"is_sponsor":0},{"start_s":161.68,"end_s":166.52,"text":"a single processor and are gaining popularity in fabs because they have better yield, meaning","speaker":null,"is_sponsor":0},{"start_s":166.52,"end_s":171.12,"text":"a defect on the wafer will only affect one small chiplet rather than a whole complete","speaker":null,"is_sponsor":0},{"start_s":171.12,"end_s":177.16,"text":"processor. It's likely that chiplets will allow companies to build bigger GPUs more profitably, something","speaker":null,"is_sponsor":0},{"start_s":177.16,"end_s":181.76,"text":"that AMD in particular seems quite interested in, and we can't rule out that NVIDIA may","speaker":null,"is_sponsor":0},{"start_s":181.76,"end_s":188.32,"text":"move in that direction someday either. Of course, this doesn't mean there isn't an upper limit to how much power a card of","speaker":null,"is_sponsor":0},{"start_s":188.32,"end_s":195.4,"text":"the future will draw. High wattage power supplies are expensive, and expecting folks to save up for both an","speaker":null,"is_sponsor":0},{"start_s":195.4,"end_s":202.08,"text":"expensive GPU as well as a $200-plus power supply might just be too much to ask.","speaker":null,"is_sponsor":0},{"start_s":202.08,"end_s":206.12,"text":"Not to mention companies that build pre-built PCs won't be too happy about having to spend","speaker":null,"is_sponsor":0},{"start_s":206.12,"end_s":211.48,"text":"extra money on nicer power supplies, which they've traditionally cheaped out on.","speaker":null,"is_sponsor":0},{"start_s":211.48,"end_s":216.44,"text":"And even though there are plenty of spacious gaming-oriented cases on the market, an insanely","speaker":null,"is_sponsor":0},{"start_s":216.44,"end_s":221.84,"text":"high wattage card means a super bulky cooling solution that would take up an unpalatable","speaker":null,"is_sponsor":0},{"start_s":221.84,"end_s":226.04,"text":"amount of space or spit out unacceptable amounts of heat.","speaker":null,"is_sponsor":0},{"start_s":226.04,"end_s":231.92,"text":"Not ideal if you're in a small, poorly ventilated room like you are right now.","speaker":null,"is_sponsor":0},{"start_s":231.92,"end_s":235.12,"text":"Hey, that's the end of this video, guys. Thanks for watching.","speaker":null,"is_sponsor":0},{"start_s":235.12,"end_s":240.4,"text":"Like it if you liked it, dislike it if you disliked it. Check out our other videos, comment below with video suggestions, and don't forget","speaker":null,"is_sponsor":0},{"start_s":240.44,"end_s":244.36,"text":"to subscribe and follow. You got all that? I'm not going to repeat myself.","speaker":null,"is_sponsor":0}],"full_text":"Pointless! That's how a lot of people these days would describe high wattage power supplies that promise you a thousand watts or more of power with an unreasonably high price tag. They used to be more common when people rocked multiple graphics cards in their rig, but as SLI and CrossFire fell out of favor due to stability and performance issues, so did those hefty power supplies. But are the days coming where they might be relevant once again? Yeah, they very well may be, if trends in graphics card power continue the way they're going. And if you haven't paid super close attention to exactly what's going on in GPU land, this might seem surprising, considering power efficiency has been a huge trend across electronics. I mean, we're getting to the point where it's disappointing if a high-end smartphone only gives you one day of battery life, and the latest Apple Silicon-equipped MacBook Pros boast over twenty hours. And although we don't run our desktop gaming rigs off of batteries, graphics cards have been getting more efficient. But the issue is that although the number of frames you get per watt of power has been increasing, the total power has also been increasing. Not to mention the fact that both chip makers and game developers keep pushing the boundaries of visual fidelity. It's just always more photorealistic. Give me some block graphics, it's okay. Just a few years ago, NVIDIA's then-top-end Titan Xp drew around 250 watts at load, with AMD's competing Radeon VII weighing in at 300 watts. But now, Team Red's current best, the Radeon RX 6950 XT, has increased to 330 watts, while NVIDIA's 3090 Ti has a TDP of a whopping 450 watts. May as well have just come out of a flame broiler. Toasted GPU. And the expectation is that the upcoming RTX 4080, featuring NVIDIA's new Ada Lovelace architecture, could clock in at around 400 or 500 watts, while the 4090 could suck down as much as 600 watts of power on its own. Now, one big reason that manufacturers might not be paying too much attention to how much power their cards are guzzling is simply because they don't particularly have to. I mean, sure, you can advertise a desktop card as being power efficient or having a good cooling solution, but at the end of the day, the thing that's going to sell cards is performance. Vroom vroom. AMD and NVIDIA would much rather compete on FPS benchmarks than trying to one-up the other company by saying, hey, our GPU uses 15% less power. And this trend might continue due to the rise of chiplets in CPUs and GPUs instead of the use of one big monolithic chip die. If you don't know, chiplets are modular chip pieces that can be combined to act as a single processor and are gaining popularity in fabs because they have better yield, meaning a defect on the wafer will only affect one small chiplet rather than a whole complete processor. It's likely that chiplets will allow companies to build bigger GPUs more profitably, something that AMD in particular seems quite interested in, and we can't rule out that NVIDIA may move in that direction someday either. Of course, this doesn't mean there isn't an upper limit to how much power a card of the future will draw. High wattage power supplies are expensive, and expecting folks to save up for both an expensive GPU as well as a $200-plus power supply might just be too much to ask. Not to mention companies that build pre-built PCs won't be too happy about having to spend extra money on nicer power supplies, which they've traditionally cheaped out on. And even though there are plenty of spacious gaming-oriented cases on the market, an insanely high wattage card means a super bulky cooling solution that would take up an unpalatable amount of space or spit out unacceptable amounts of heat. Not ideal if you're in a small, poorly ventilated room like you are right now. Hey, that's the end of this video, guys. Thanks for watching. Like it if you liked it, dislike it if you disliked it. Check out our other videos, comment below with video suggestions, and don't forget to subscribe and follow. You got all that? I'm not going to repeat myself."}