{"video_id":"yFxFLxLFCic","title":"Galaxy GeForce GTX 670 2GB Video Card Unboxing & First Look Linus Tech Tips","channel":"Linus Tech Tips","show":"Linus Tech Tips","published_at":"2013-05-07T14:53:29Z","duration_s":511,"segments":[{"start_s":7.44,"end_s":15.36,"text":"Welcome to my unboxing and first look at something that's very exciting. This is","speaker":null,"is_sponsor":0},{"start_s":11.44,"end_s":17.92,"text":"the Galaxy GeForce GTX 670. It supports","speaker":null,"is_sponsor":0},{"start_s":15.36,"end_s":23.6,"text":"a wide variety of exciting features, most of which you can actually find out","speaker":null,"is_sponsor":0},{"start_s":20.0,"end_s":24.88,"text":"about on my Galaxy GTX 680 unboxing as","speaker":null,"is_sponsor":0},{"start_s":23.6,"end_s":30.56,"text":"well as review and all the other videos I did about it, because the 670 is","speaker":null,"is_sponsor":0},{"start_s":27.76,"end_s":38.0,"text":"actually very similar to the 680 in many ways. Now, sure, it doesn't have quite","speaker":null,"is_sponsor":0},{"start_s":33.76,"end_s":39.68,"text":"as many CUDA cores. It only has 1,344.","speaker":null,"is_sponsor":0},{"start_s":38.0,"end_s":43.84,"text":"Sure, it's not clocked quite as high with a boost clock that's generally in","speaker":null,"is_sponsor":0},{"start_s":41.68,"end_s":48.719,"text":"the neighborhood of around 980 MHz, although GPU boost doesn't necessarily,","speaker":null,"is_sponsor":0},{"start_s":46.719,"end_s":54.559,"text":"you know, nail it down to an exact frequency at any given time. 
And","speaker":null,"is_sponsor":0},{"start_s":53.12,"end_s":59.52,"text":"sure, it might look a little different, and more on that in a moment, but in","speaker":null,"is_sponsor":0},{"start_s":56.879,"end_s":64.239,"text":"terms of the performance, in terms of the memory speed, in terms of the memory","speaker":null,"is_sponsor":0},{"start_s":61.84,"end_s":69.36,"text":"bus, this is the first time in a while we haven't seen a step-down card from","speaker":null,"is_sponsor":0},{"start_s":66.72,"end_s":73.439,"text":"the flagship single GPU that is actually using the same memory bus at the same","speaker":null,"is_sponsor":0},{"start_s":71.2,"end_s":80.56,"text":"frequency. So, it has six gigahertz GDDR5 memory with a 256-bit bus. Um,","speaker":null,"is_sponsor":0},{"start_s":78.64,"end_s":83.68,"text":"it's using the... Right, right, right. So, the reason that most of the features are","speaker":null,"is_sponsor":0},{"start_s":82.08,"end_s":89.6,"text":"very similar is because it is using the same GK104 core that the GTX 680 is","speaker":null,"is_sponsor":0},{"start_s":87.6,"end_s":93.68,"text":"using, although slightly cut down in terms of the functional units, the CUDA","speaker":null,"is_sponsor":0},{"start_s":91.119,"end_s":99.84,"text":"cores and the clock speed. But other than that, this is almost a GTX 680,","speaker":null,"is_sponsor":0},{"start_s":97.68,"end_s":103.68,"text":"although the key difference is that it comes at a more attractive price. So,","speaker":null,"is_sponsor":0},{"start_s":102.24,"end_s":108.0,"text":"you can see it actually looks very similar as well. More on that in a","speaker":null,"is_sponsor":0},{"start_s":106.32,"end_s":113.36,"text":"moment. So, let's just have a look at what Galaxy includes with their GTX 670.","speaker":null,"is_sponsor":0},{"start_s":111.119,"end_s":116.24,"text":"In here, we find slick all-black packaging. 
I'm a big fan of packaging,","speaker":null,"is_sponsor":0},{"start_s":114.96,"end_s":120.799,"text":"as you guys probably know, if you've watched a few of my videos. I unpackage","speaker":null,"is_sponsor":0},{"start_s":118.479,"end_s":125.36,"text":"a lot of products, so I do recognize good packaging when I see it. Got nice","speaker":null,"is_sponsor":0},{"start_s":122.799,"end_s":131.44,"text":"little individually packaged zip-locky things going on here. So, we got a DVI","speaker":null,"is_sponsor":0},{"start_s":126.88,"end_s":134.08,"text":"to VGA adapter. We have uh two dual","speaker":null,"is_sponsor":0},{"start_s":131.44,"end_s":137.84,"text":"Molex to six-pin PCIe adapters. So that probably tells us that this card uses","speaker":null,"is_sponsor":0},{"start_s":135.599,"end_s":140.8,"text":"dual six-pin PCIe power connectors, as well as a graphics card driver installation","speaker":null,"is_sponsor":0},{"start_s":139.599,"end_s":145.92,"text":"disc, although you should download the latest online. And finally a setup guide, an Easy","speaker":null,"is_sponsor":0},{"start_s":143.44,"end_s":151.52,"text":"Setup for Quick Installation for GeForce GTX 600 Series pamphlet, and","speaker":null,"is_sponsor":0},{"start_s":149.599,"end_s":154.319,"text":"then a user's manual. So that pretty much covers it. Attention customers,","speaker":null,"is_sponsor":0},{"start_s":152.959,"end_s":159.2,"text":"thank you for purchasing your blah blah blah. Be sure to register your card within 30 days of purchase to receive an","speaker":null,"is_sponsor":0},{"start_s":157.28,"end_s":165.519,"text":"extended warranty. So Galaxy does provide up to, I believe it's three","speaker":null,"is_sponsor":0},{"start_s":162.64,"end_s":171.36,"text":"years. Yes, 3 years of warranty if you uh register your card","speaker":null,"is_sponsor":0},{"start_s":167.84,"end_s":172.56,"text":"right away. 
So, let's go ahead and I","speaker":null,"is_sponsor":0},{"start_s":171.36,"end_s":177.84,"text":"don't know how this works. So, I'll figure that out later.","speaker":null,"is_sponsor":0},{"start_s":175.44,"end_s":180.56,"text":"Now, let's get to the card itself. So, first let's do a brief rundown. And","speaker":null,"is_sponsor":0},{"start_s":179.44,"end_s":186.48,"text":"yeah, I'm totally going to look over there and use my cheat sheet, because that's how I roll, of the features that","speaker":null,"is_sponsor":0},{"start_s":183.68,"end_s":191.04,"text":"the GTX 670 includes that were first introduced on the 680. So we've got uh","speaker":null,"is_sponsor":0},{"start_s":189.2,"end_s":195.68,"text":"GPU boost, which means that it dynamically, according to the TDP, that is,","speaker":null,"is_sponsor":0},{"start_s":193.28,"end_s":200.08,"text":"the thermal design power of this particular card, as well as the power","speaker":null,"is_sponsor":0},{"start_s":197.2,"end_s":204.879,"text":"envelope, is going to adjust, assuming that your GPU is running cool enough and","speaker":null,"is_sponsor":0},{"start_s":202.72,"end_s":208.0,"text":"your board is not consuming too much power. Like, if you're using something","speaker":null,"is_sponsor":0},{"start_s":206.159,"end_s":212.0,"text":"like FurMark, then it would know that, well, that's not a real game and it's","speaker":null,"is_sponsor":0},{"start_s":209.92,"end_s":216.72,"text":"drawing more power than is realistic. GPU boost will dynamically adjust the","speaker":null,"is_sponsor":0},{"start_s":214.4,"end_s":221.84,"text":"frequency of your GPU in order to make the best use of the power and the heat","speaker":null,"is_sponsor":0},{"start_s":220.0,"end_s":227.92,"text":"envelope that it has available to it. So GPU boost is definitely present on the","speaker":null,"is_sponsor":0},{"start_s":224.319,"end_s":229.12,"text":"670. 
We also have FXAA, which has been added to","speaker":null,"is_sponsor":0},{"start_s":227.92,"end_s":235.68,"text":"the NVIDIA control panel, although that's accessible for many different cards. Now adaptive VSYNC is one that's","speaker":null,"is_sponsor":0},{"start_s":233.36,"end_s":241.28,"text":"really cool. So, adaptive VSYNC allows you to either be using VSYNC when your","speaker":null,"is_sponsor":0},{"start_s":239.28,"end_s":245.28,"text":"frame rate is way way above the refresh rate of your monitor, whether it's 60 or","speaker":null,"is_sponsor":0},{"start_s":243.04,"end_s":250.959,"text":"120 or whatever the case may be. So, it will turn VSYNC on to eliminate tearing.","speaker":null,"is_sponsor":0},{"start_s":247.92,"end_s":252.239,"text":"So, that is when your GPU draws the top","speaker":null,"is_sponsor":0},{"start_s":250.959,"end_s":255.92,"text":"half of the screen and the bottom half of the screen in the middle of a monitor","speaker":null,"is_sponsor":0},{"start_s":254.08,"end_s":259.359,"text":"refresh and you see like there's a guy standing there and he's like slightly","speaker":null,"is_sponsor":0},{"start_s":257.519,"end_s":263.84,"text":"disconnected at the waist. So, that's tearing. So the VSync will turn on when","speaker":null,"is_sponsor":0},{"start_s":261.44,"end_s":268.08,"text":"you're up there and then when you are below 60, let's say 60 for our","speaker":null,"is_sponsor":0},{"start_s":265.919,"end_s":272.8,"text":"hypothetical, since most monitors, most LCDs, are 60 Hz. When you're below 60 Hz, it","speaker":null,"is_sponsor":0},{"start_s":270.72,"end_s":277.52,"text":"will no longer have to drop you all the way down to 30 if your GPU is capable of","speaker":null,"is_sponsor":0},{"start_s":275.52,"end_s":281.44,"text":"pushing 55 frames per second, which is how VSYNC works. So it turns VSync off","speaker":null,"is_sponsor":0},{"start_s":280.16,"end_s":290.56,"text":"as soon as you fall below that threshold. 
So you get the best of both worlds. So that's adaptive VSYNC. TXAA is","speaker":null,"is_sponsor":0},{"start_s":286.88,"end_s":291.919,"text":"uh basically more...","speaker":null,"is_sponsor":0},{"start_s":290.56,"end_s":297.919,"text":"What do they sort of call it? Pretty much, um, the","speaker":null,"is_sponsor":0},{"start_s":294.639,"end_s":301.199,"text":"performance of... yeah, see. In mode one,","speaker":null,"is_sponsor":0},{"start_s":297.919,"end_s":303.6,"text":"right, two times","speaker":null,"is_sponsor":0},{"start_s":301.199,"end_s":308.88,"text":"multi-sampling AA performance, and it looks like eight times AA. So that's","speaker":null,"is_sponsor":0},{"start_s":305.44,"end_s":310.639,"text":"basically that. And then mode two is... you","speaker":null,"is_sponsor":0},{"start_s":308.88,"end_s":314.56,"text":"know what, hold on, give me a sec. I couldn't read my little chart from too","speaker":null,"is_sponsor":0},{"start_s":312.24,"end_s":317.6,"text":"far away. So mode one is the performance hit of two times anti-aliasing, and it","speaker":null,"is_sponsor":0},{"start_s":316.56,"end_s":321.52,"text":"looks about like eight times anti-aliasing. And then mode two looks","speaker":null,"is_sponsor":0},{"start_s":319.759,"end_s":324.56,"text":"even better than that, and is about the same performance hit as four times","speaker":null,"is_sponsor":0},{"start_s":322.96,"end_s":330.639,"text":"anti-aliasing. So that is also present on this card. Next, we have full support","speaker":null,"is_sponsor":0},{"start_s":327.52,"end_s":333.199,"text":"for NVIDIA 3D Vision Surround off of a","speaker":null,"is_sponsor":0},{"start_s":330.639,"end_s":337.84,"text":"single card. So you can go one, two, three displays with an auxiliary","speaker":null,"is_sponsor":0},{"start_s":334.88,"end_s":340.8,"text":"display. No big deal. 
So if you want to run your three displays, then your","speaker":null,"is_sponsor":0},{"start_s":339.039,"end_s":344.72,"text":"auxiliary one can't run the game at the same time. It's not for surround gaming.","speaker":null,"is_sponsor":0},{"start_s":342.639,"end_s":348.8,"text":"But if you had like, you know, a chat window open or like system monitoring","speaker":null,"is_sponsor":0},{"start_s":347.12,"end_s":353.919,"text":"utilities or stuff open on that fourth monitor, then you can totally run that","speaker":null,"is_sponsor":0},{"start_s":350.479,"end_s":356.16,"text":"one as well. So up to 3 + 1 displays is","speaker":null,"is_sponsor":0},{"start_s":353.919,"end_s":360.639,"text":"also supported on this card. 4K monitor support as well as HDMI 1.4a that","speaker":null,"is_sponsor":0},{"start_s":358.72,"end_s":366.4,"text":"includes 3D, which is very, very cool stuff. Better PhysX performance. And","speaker":null,"is_sponsor":0},{"start_s":364.4,"end_s":370.479,"text":"we're pretty much there. Now, let's have a look at the card itself. So, it uses a","speaker":null,"is_sponsor":0},{"start_s":368.4,"end_s":376.08,"text":"similar blower-design cooler, which also includes the acoustic dampening um","speaker":null,"is_sponsor":0},{"start_s":373.44,"end_s":379.36,"text":"hoo-ha that NVIDIA included with the GTX 680. You can see it's a slightly shorter","speaker":null,"is_sponsor":0},{"start_s":377.68,"end_s":384.4,"text":"card, although it's more than slightly shorter, which I'll show you more of in a moment. Uses the same fan, which is","speaker":null,"is_sponsor":0},{"start_s":382.24,"end_s":389.6,"text":"good, because the GTX 680 fan is excellent. It is a PCI Express 3.0 card,","speaker":null,"is_sponsor":0},{"start_s":387.759,"end_s":392.8,"text":"which is good if you have the latest Ivy Bridge platform for your system. 
You can","speaker":null,"is_sponsor":0},{"start_s":391.36,"end_s":396.479,"text":"see on the back we got sort of, you know, more like plasticky shroudy thing.","speaker":null,"is_sponsor":0},{"start_s":394.72,"end_s":400.639,"text":"There's our PCIe interface. I already showed you the two DVI, HDMI, and Display","speaker":null,"is_sponsor":0},{"start_s":398.479,"end_s":405.44,"text":"Port connectors. Up here we've got cool GeForce GTX branding. And then here","speaker":null,"is_sponsor":0},{"start_s":403.039,"end_s":408.08,"text":"we've got our two six-pin PCIe connectors. And what you may have","speaker":null,"is_sponsor":0},{"start_s":406.4,"end_s":416.24,"text":"noticed about that is it's in kind of a weird location. Why would that be?","speaker":null,"is_sponsor":0},{"start_s":411.68,"end_s":418.72,"text":"Because the GTX 670 comes with a very","speaker":null,"is_sponsor":0},{"start_s":416.24,"end_s":425.199,"text":"short PCB. NVIDIA has actually rotated the GPU around to make better use of the","speaker":null,"is_sponsor":0},{"start_s":421.28,"end_s":427.039,"text":"power circuitry being on the uh back","speaker":null,"is_sponsor":0},{"start_s":425.199,"end_s":431.44,"text":"half of the card. So all your power delivery is actually here instead of","speaker":null,"is_sponsor":0},{"start_s":429.12,"end_s":437.039,"text":"here. So they no longer needed that segment of the PCB. Check this out.","speaker":null,"is_sponsor":0},{"start_s":434.72,"end_s":441.039,"text":"There you go. So you can see this 670 is a reference 670, and there's actually","speaker":null,"is_sponsor":0},{"start_s":438.96,"end_s":444.8,"text":"extra room for more graphics memory. So you can buy a 670 with up to 4 gigs of","speaker":null,"is_sponsor":0},{"start_s":443.12,"end_s":449.36,"text":"graphics memory. And where all that power delivery circuitry normally is,","speaker":null,"is_sponsor":0},{"start_s":446.96,"end_s":453.28,"text":"it's no longer there. 
So, they are still using the nice long efficient blower","speaker":null,"is_sponsor":0},{"start_s":451.68,"end_s":458.72,"text":"cooler, which is going to exhaust most of the air out the back of your case. And you can also see there's a nice big","speaker":null,"is_sponsor":0},{"start_s":456.08,"end_s":463.36,"text":"gap between the heat sink and these uh these vents back here, which actually","speaker":null,"is_sponsor":0},{"start_s":460.319,"end_s":465.759,"text":"allows for slightly better air flow. But","speaker":null,"is_sponsor":0},{"start_s":463.36,"end_s":470.96,"text":"they've used a very very efficient design to achieve that. Speaking of","speaker":null,"is_sponsor":0},{"start_s":467.36,"end_s":472.88,"text":"efficiency, it is um by far better","speaker":null,"is_sponsor":0},{"start_s":470.96,"end_s":477.599,"text":"performance per watt than anything else that NVIDIA has produced in this kind of","speaker":null,"is_sponsor":0},{"start_s":474.8,"end_s":481.84,"text":"a price range before. So, it's uh like I said using the same GPU as the GTX 680,","speaker":null,"is_sponsor":0},{"start_s":480.24,"end_s":487.039,"text":"which means it just blows away its predecessors, the 570 and the 470, in terms","speaker":null,"is_sponsor":0},{"start_s":485.44,"end_s":491.599,"text":"of the performance per watt that this card can deliver. It is compatible with","speaker":null,"is_sponsor":0},{"start_s":489.12,"end_s":498.639,"text":"multi-way SLI configurations due to its two SLI uh... its two SLI bridge...","speaker":null,"is_sponsor":0},{"start_s":495.919,"end_s":502.24,"text":"two SLI connectors. Sorry guys, I'm tired. And I think that pretty much","speaker":null,"is_sponsor":0},{"start_s":500.96,"end_s":507.28,"text":"covers it. So, thank you for checking out my unboxing and first look at the","speaker":null,"is_sponsor":0},{"start_s":504.319,"end_s":509.919,"text":"GeForce GTX 670 from Galaxy. 
And don't forget to subscribe to Linus Tech Tips","speaker":null,"is_sponsor":0},{"start_s":508.639,"end_s":512.24,"text":"for unboxings, reviews, and other computer","speaker":null,"is_sponsor":0}],"full_text":"Welcome to my unboxing and first look at something that's very exciting. This is the Galaxy GeForce GTX 670. It supports a wide variety of exciting features, most of which you can actually find out about on my Galaxy GTX 680 unboxing as well as review and all the other videos I did about it, because the 670 is actually very similar to the 680 in many ways. Now, sure, it doesn't have quite as many CUDA cores. It only has 1,344. Sure, it's not clocked quite as high with a boost clock that's generally in the neighborhood of around 980 MHz, although GPU boost doesn't necessarily, you know, nail it down to an exact frequency at any given time. And sure, it might look a little different, and more on that in a moment, but in terms of the performance, in terms of the memory speed, in terms of the memory bus, this is the first time in a while we haven't seen a step-down card from the flagship single GPU that is actually using the same memory bus at the same frequency. So, it has six gigahertz GDDR5 memory with a 256-bit bus. Um, it's using the... Right, right, right. So, the reason that most of the features are very similar is because it is using the same GK104 core that the GTX 680 is using, although slightly cut down in terms of the functional units, the CUDA cores and the clock speed. But other than that, this is almost a GTX 680, although the key difference is that it comes at a more attractive price. So, you can see it actually looks very similar as well. More on that in a moment. So, let's just have a look at what Galaxy includes with their GTX 670. In here, we find slick all-black packaging. I'm a big fan of packaging, as you guys probably know, if you've watched a few of my videos. I unpackage a lot of products, so I do recognize good packaging when I see it. 
Got nice little individually packaged zip-locky things going on here. So, we got a DVI to VGA adapter. We have uh two dual Molex to six-pin PCIe adapters. So that probably tells us that this card uses dual six-pin PCIe power connectors, as well as a graphics card driver installation disc, although you should download the latest online. And finally a setup guide, an Easy Setup for Quick Installation for GeForce GTX 600 Series pamphlet, and then a user's manual. So that pretty much covers it. Attention customers, thank you for purchasing your blah blah blah. Be sure to register your card within 30 days of purchase to receive an extended warranty. So Galaxy does provide up to, I believe it's three years. Yes, 3 years of warranty if you uh register your card right away. So, let's go ahead and I don't know how this works. So, I'll figure that out later. Now, let's get to the card itself. So, first let's do a brief rundown. And yeah, I'm totally going to look over there and use my cheat sheet, because that's how I roll, of the features that the GTX 670 includes that were first introduced on the 680. So we've got uh GPU boost, which means that it dynamically, according to the TDP, that is, the thermal design power of this particular card, as well as the power envelope, is going to adjust, assuming that your GPU is running cool enough and your board is not consuming too much power. Like, if you're using something like FurMark, then it would know that, well, that's not a real game and it's drawing more power than is realistic. GPU boost will dynamically adjust the frequency of your GPU in order to make the best use of the power and the heat envelope that it has available to it. So GPU boost is definitely present on the 670. We also have FXAA, which has been added to the NVIDIA control panel, although that's accessible for many different cards. Now adaptive VSYNC is one that's really cool. 
So, adaptive VSYNC allows you to either be using VSYNC when your frame rate is way way above the refresh rate of your monitor, whether it's 60 or 120 or whatever the case may be. So, it will turn VSYNC on to eliminate tearing. So, that is when your GPU draws the top half of the screen and the bottom half of the screen in the middle of a monitor refresh and you see like there's a guy standing there and he's like slightly disconnected at the waist. So, that's tearing. So the VSync will turn on when you're up there and then when you are below 60, let's say 60 for our hypothetical, since most monitors, most LCDs, are 60 Hz. When you're below 60 Hz, it will no longer have to drop you all the way down to 30 if your GPU is capable of pushing 55 frames per second, which is how VSYNC works. So it turns VSync off as soon as you fall below that threshold. So you get the best of both worlds. So that's adaptive VSYNC. TXAA is uh basically more... What do they sort of call it? Pretty much, um, the performance of... yeah, see. In mode one, right, two times multi-sampling AA performance, and it looks like eight times AA. So that's basically that. And then mode two is... you know what, hold on, give me a sec. I couldn't read my little chart from too far away. So mode one is the performance hit of two times anti-aliasing, and it looks about like eight times anti-aliasing. And then mode two looks even better than that, and is about the same performance hit as four times anti-aliasing. So that is also present on this card. Next, we have full support for NVIDIA 3D Vision Surround off of a single card. So you can go one, two, three displays with an auxiliary display. No big deal. So if you want to run your three displays, then your auxiliary one can't run the game at the same time. It's not for surround gaming. But if you had like, you know, a chat window open or like system monitoring utilities or stuff open on that fourth monitor, then you can totally run that one as well. 
So up to 3 + 1 displays is also supported on this card. 4K monitor support as well as HDMI 1.4a that includes 3D, which is very, very cool stuff. Better PhysX performance. And we're pretty much there. Now, let's have a look at the card itself. So, it uses a similar blower-design cooler, which also includes the acoustic dampening um hoo-ha that NVIDIA included with the GTX 680. You can see it's a slightly shorter card, although it's more than slightly shorter, which I'll show you more of in a moment. Uses the same fan, which is good, because the GTX 680 fan is excellent. It is a PCI Express 3.0 card, which is good if you have the latest Ivy Bridge platform for your system. You can see on the back we got sort of, you know, more like plasticky shroudy thing. There's our PCIe interface. I already showed you the two DVI, HDMI, and DisplayPort connectors. Up here we've got cool GeForce GTX branding. And then here we've got our two six-pin PCIe connectors. And what you may have noticed about that is it's in kind of a weird location. Why would that be? Because the GTX 670 comes with a very short PCB. NVIDIA has actually rotated the GPU around to make better use of the power circuitry being on the uh back half of the card. So all your power delivery is actually here instead of here. So they no longer needed that segment of the PCB. Check this out. There you go. So you can see this 670 is a reference 670, and there's actually extra room for more graphics memory. So you can buy a 670 with up to 4 gigs of graphics memory. And where all that power delivery circuitry normally is, it's no longer there. So, they are still using the nice long efficient blower cooler, which is going to exhaust most of the air out the back of your case. And you can also see there's a nice big gap between the heat sink and these uh these vents back here, which actually allows for slightly better air flow. But they've used a very very efficient design to achieve that. 
Speaking of efficiency, it is um by far better performance per watt than anything else that NVIDIA has produced in this kind of a price range before. So, it's uh like I said using the same GPU as the GTX 680, which means it just blows away its predecessors, the 570 and the 470, in terms of the performance per watt that this card can deliver. It is compatible with multi-way SLI configurations due to its two SLI uh... its two SLI bridge... two SLI connectors. Sorry guys, I'm tired. And I think that pretty much covers it. So, thank you for checking out my unboxing and first look at the GeForce GTX 670 from Galaxy. And don't forget to subscribe to Linus Tech Tips for unboxings, reviews, and other computer"}