{"video_id":"77zSQuvnMQc","title":"Intel's Worst Products Ever","channel":"Techquickie","show":"Techquickie","published_at":"2023-05-05T14:58:16Z","duration_s":406,"segments":[{"start_s":0.0,"end_s":5.2,"text":"Whatever your feelings about Intel, they haven't put out any CPUs lately that have been absolute","speaker":null,"is_sponsor":0},{"start_s":5.2,"end_s":11.12,"text":"duds. Now, I'm talking, like, completely bad. But this wasn't always the case.","speaker":null,"is_sponsor":0},{"start_s":11.12,"end_s":16.16,"text":"Intel has had plenty of costly mistakes in its history, so let's take a look back at","speaker":null,"is_sponsor":0},{"start_s":16.16,"end_s":22.0,"text":"Team Blue's Hall of Shame. First, let's go all the way back to 1981, when Intel wasn't even old","speaker":null,"is_sponsor":0},{"start_s":22.0,"end_s":28.32,"text":"enough to drive. They came out with a line of CPUs that used an architecture called iAPX 432,","speaker":null,"is_sponsor":0},{"start_s":28.32,"end_s":33.28,"text":"which, aside from being annoying to say, was actually supposed to be the long-term replacement","speaker":null,"is_sponsor":0},{"start_s":33.28,"end_s":40.56,"text":"for x86, which had been around by then for only about three years. You see, the iAPX 432 was meant to","speaker":null,"is_sponsor":0},{"start_s":40.56,"end_s":46.4,"text":"be used with very high-level programming languages. Now, high-level here means that the language is","speaker":null,"is_sponsor":0},{"start_s":46.4,"end_s":51.84,"text":"far removed from the raw zeros and ones that the processor's physical hardware uses. Instead,","speaker":null,"is_sponsor":0},{"start_s":51.84,"end_s":56.24,"text":"a high-level language is very user-friendly, relying on lots of natural words that have to","speaker":null,"is_sponsor":0},{"start_s":56.4,"end_s":60.8,"text":"be translated so that the CPU can make sense of them. 
High-level language support would make it","speaker":null,"is_sponsor":0},{"start_s":60.8,"end_s":67.2,"text":"easier for developers to code more complex, advanced programs, and the iAPX 432 was specifically","speaker":null,"is_sponsor":0},{"start_s":67.2,"end_s":74.64,"text":"optimized for one such language called Ada. Yes, that Ada. Intel thought Ada would end up becoming a","speaker":null,"is_sponsor":0},{"start_s":74.64,"end_s":79.04,"text":"much more important and popular language for the more powerful programs of the future,","speaker":null,"is_sponsor":0},{"start_s":79.04,"end_s":83.6,"text":"especially as it got attention from the U.S. Department of Defense for its own computer systems.","speaker":null,"is_sponsor":0},{"start_s":83.68,"end_s":88.8,"text":"However, Ada didn't quite take off as expected in the consumer space, and the processor itself","speaker":null,"is_sponsor":0},{"start_s":88.8,"end_s":93.6,"text":"simply wasn't very high performance. The reasons for this are complex, but it boils down to the","speaker":null,"is_sponsor":0},{"start_s":93.6,"end_s":98.64,"text":"fact that the physical processor designs of the day weren't advanced enough to run the complicated","speaker":null,"is_sponsor":0},{"start_s":98.64,"end_s":104.0,"text":"instructions designed for languages like Ada. It was a product that was just too ambitious and tried to","speaker":null,"is_sponsor":0},{"start_s":104.0,"end_s":108.56,"text":"include too many features at the expense of performance. 
So after roughly five years of","speaker":null,"is_sponsor":0},{"start_s":108.56,"end_s":114.64,"text":"disappointing sales, the iAPX 432 project was axed while x86 continued to dominate.","speaker":null,"is_sponsor":0},{"start_s":114.64,"end_s":119.04,"text":"But Intel had a much higher-profile chip embarrass them that you might actually remember.","speaker":null,"is_sponsor":0},{"start_s":119.04,"end_s":125.2,"text":"The Pentium 4 was released back in 2000 to great fanfare, and unlike the iAPX 432, computers were","speaker":null,"is_sponsor":0},{"start_s":125.2,"end_s":129.76,"text":"mainstream enough for the general public to notice this time, especially considering there was a","speaker":null,"is_sponsor":0},{"start_s":129.76,"end_s":136.08,"text":"$300 million ad campaign headlined by the Blue Man Group. Intel tried to achieve never-before-seen","speaker":null,"is_sponsor":0},{"start_s":136.08,"end_s":140.64,"text":"levels of performance by pushing clock speeds higher and higher. In fact, they planned to scale","speaker":null,"is_sponsor":0},{"start_s":140.64,"end_s":146.72,"text":"speeds up to a whopping 10 gigahertz as process nodes shrunk. That's roughly double the speed of","speaker":null,"is_sponsor":0},{"start_s":146.72,"end_s":153.12,"text":"today's best processors. But if you know anything about CPUs, you know that clock speed isn't","speaker":null,"is_sponsor":0},{"start_s":153.12,"end_s":158.32,"text":"everything, and anyone who bought an Intel PC in the early 2000s learned that the hard way.","speaker":null,"is_sponsor":0},{"start_s":158.32,"end_s":163.68,"text":"True, these chips weren't just souped-up Pentium 3s. Instead, they featured an all-new architecture","speaker":null,"is_sponsor":0},{"start_s":163.76,"end_s":170.72,"text":"called NetBurst, which sounds hilariously close to nut bust. Oh boy. 
But that meant programs had to","speaker":null,"is_sponsor":0},{"start_s":170.72,"end_s":175.52,"text":"be specifically optimized for the new architecture to see any real performance gains, even though","speaker":null,"is_sponsor":0},{"start_s":175.52,"end_s":180.8,"text":"later revisions introduced Hyper-Threading for the first time. A key area in which the Pentium 4","speaker":null,"is_sponsor":0},{"start_s":180.8,"end_s":185.92,"text":"struggled was branch prediction, which is simply the ability of the CPU to predict what the next","speaker":null,"is_sponsor":0},{"start_s":185.92,"end_s":190.8,"text":"instruction is going to be. Branch prediction is a crucial feature in all modern CPUs, and because","speaker":null,"is_sponsor":0},{"start_s":190.8,"end_s":196.0,"text":"the Pentium 4 was so bad at it, it kept having to go back and correct its own mistakes. The Pentium","speaker":null,"is_sponsor":0},{"start_s":196.0,"end_s":201.28,"text":"4 also had a very long pipeline, which is just what it sounds like, an electronic pipe where","speaker":null,"is_sponsor":0},{"start_s":201.28,"end_s":205.76,"text":"instructions are loaded one after the other so that multiple commands can be kept moving at the","speaker":null,"is_sponsor":0},{"start_s":205.76,"end_s":211.28,"text":"same time. The P4's lengthy pipeline would often stall out because of poor branch prediction, and","speaker":null,"is_sponsor":0},{"start_s":211.28,"end_s":217.2,"text":"to top it all off, the chip was very expensive and ran very hot as a result of increased power","speaker":null,"is_sponsor":0},{"start_s":217.2,"end_s":222.32,"text":"consumption from the high clock speeds and power leakage from the transistors. 
That said, Intel","speaker":null,"is_sponsor":0},{"start_s":222.32,"end_s":228.56,"text":"still sold a boatload of these, mostly in pre-built machines from OEMs, so it was more of a flop","speaker":null,"is_sponsor":0},{"start_s":228.56,"end_s":234.56,"text":"in terms of performance and customer satisfaction than with sales. Ultimately, the P4 ended up being","speaker":null,"is_sponsor":0},{"start_s":234.56,"end_s":239.2,"text":"the final flagship Pentium chip before the Core series of CPUs took over, giving us multiple","speaker":null,"is_sponsor":0},{"start_s":239.2,"end_s":246.08,"text":"physical cores and a whole new architecture by 2006. But not all of Intel's big flops were CPUs.","speaker":null,"is_sponsor":0},{"start_s":246.08,"end_s":251.12,"text":"Although Intel's recent Arc lineup marked Team Blue's first foray into the modern discrete GPU","speaker":null,"is_sponsor":0},{"start_s":251.12,"end_s":256.56,"text":"scene, this actually isn't the first time they tried to make a graphics card. This is the Intel","speaker":null,"is_sponsor":0},{"start_s":256.56,"end_s":261.68,"text":"i740, and it was the first real consumer gaming graphics card that they ever made. It also had","speaker":null,"is_sponsor":0},{"start_s":261.68,"end_s":267.76,"text":"some incredible DNA, as the i740 was developed from a Lockheed Martin project that originally","speaker":null,"is_sponsor":0},{"start_s":267.76,"end_s":272.56,"text":"provided a visual flight simulator for astronauts in the Apollo moon landing program. Lockheed Martin","speaker":null,"is_sponsor":0},{"start_s":272.56,"end_s":277.36,"text":"later spun off this division as the company Real3D to try and tap into the consumer market,","speaker":null,"is_sponsor":0},{"start_s":277.36,"end_s":283.44,"text":"and Intel partnered with Real3D in developing the i740. 
By the time the i740 came out, it was","speaker":null,"is_sponsor":0},{"start_s":283.44,"end_s":287.76,"text":"hugely anticipated, with some in the industry saying that it would lead Intel to dominate the","speaker":null,"is_sponsor":0},{"start_s":287.76,"end_s":294.88,"text":"discrete GPU space. But unfortunately, Intel only sold the card for about 18 months. What? Really?","speaker":null,"is_sponsor":0},{"start_s":295.76,"end_s":301.28,"text":"The main issue was how the i740 made use of memory. The card connected to the motherboard","speaker":null,"is_sponsor":0},{"start_s":301.28,"end_s":307.04,"text":"through an AGP slot, and remember, this was before PCI Express was invented. The AGP slot provided","speaker":null,"is_sponsor":0},{"start_s":307.04,"end_s":312.08,"text":"a more direct connection to system RAM than the then-standard conventional PCI. The idea was to","speaker":null,"is_sponsor":0},{"start_s":312.08,"end_s":317.04,"text":"have the game store textures in system RAM instead of the card's built-in VRAM, with the AGP interface","speaker":null,"is_sponsor":0},{"start_s":317.04,"end_s":321.52,"text":"allowing the card to access that texture data more quickly, so you could build a card that didn't","speaker":null,"is_sponsor":0},{"start_s":321.52,"end_s":328.16,"text":"need as much VRAM, and was therefore cheaper. But despite this advantage, the i740's reliance","speaker":null,"is_sponsor":0},{"start_s":328.16,"end_s":333.44,"text":"on system memory made it slower than other cards that had sufficient VRAM to store textures of their","speaker":null,"is_sponsor":0},{"start_s":333.44,"end_s":339.36,"text":"own. 
And this was a time when game textures were becoming far more detailed, meaning that","speaker":null,"is_sponsor":0},{"start_s":339.36,"end_s":343.84,"text":"even though the i740 GPU itself could have actually delivered good performance,","speaker":null,"is_sponsor":0},{"start_s":343.84,"end_s":348.48,"text":"it just couldn't load all that texture data quickly enough. In fact, some i740 models came","speaker":null,"is_sponsor":0},{"start_s":348.48,"end_s":354.72,"text":"with as little as two megabytes of VRAM, which was mainly used for buffering, not storing textures.","speaker":null,"is_sponsor":0},{"start_s":354.72,"end_s":360.24,"text":"Unsurprisingly, NVIDIA's Riva TNT and 3dfx's Voodoo 2 quickly knocked the i740 out of the","speaker":null,"is_sponsor":0},{"start_s":360.24,"end_s":365.76,"text":"market, and Intel wouldn't release another discrete graphics product for 24 years. I mean,","speaker":null,"is_sponsor":0},{"start_s":365.76,"end_s":370.64,"text":"they tried with Larrabee. I know rejection can be hard to stomach, but man, you can't wait that long","speaker":null,"is_sponsor":0},{"start_s":370.64,"end_s":374.64,"text":"to get yourself back on the horse, you know? Thanks for watching, guys. Like, dislike, check out","speaker":null,"is_sponsor":0},{"start_s":374.64,"end_s":378.96,"text":"some of our other videos, comment with video suggestions down below, and don't forget to subscribe and follow.","speaker":null,"is_sponsor":0}],"full_text":"Whatever your feelings about Intel, they haven't put out any CPUs lately that have been absolute duds. Now, I'm talking, like, completely bad. But this wasn't always the case. Intel has had plenty of costly mistakes in its history, so let's take a look back at Team Blue's Hall of Shame. First, let's go all the way back to 1981, when Intel wasn't even old enough to drive. 
They came out with a line of CPUs that used an architecture called iAPX 432, which, aside from being annoying to say, was actually supposed to be the long-term replacement for x86, which had been around by then for only about three years. You see, the iAPX 432 was meant to be used with very high-level programming languages. Now, high-level here means that the language is far removed from the raw zeros and ones that the processor's physical hardware uses. Instead, a high-level language is very user-friendly, relying on lots of natural words that have to be translated so that the CPU can make sense of them. High-level language support would make it easier for developers to code more complex, advanced programs, and the iAPX 432 was specifically optimized for one such language called Ada. Yes, that Ada. Intel thought Ada would end up becoming a much more important and popular language for the more powerful programs of the future, especially as it got attention from the U.S. Department of Defense for its own computer systems. However, Ada didn't quite take off as expected in the consumer space, and the processor itself simply wasn't very high performance. The reasons for this are complex, but it boils down to the fact that the physical processor designs of the day weren't advanced enough to run the complicated instructions designed for languages like Ada. It was a product that was just too ambitious and tried to include too many features at the expense of performance. So after roughly five years of disappointing sales, the iAPX 432 project was axed while x86 continued to dominate. But Intel had a much higher-profile chip embarrass them that you might actually remember. The Pentium 4 was released back in 2000 to great fanfare, and unlike the iAPX 432, computers were mainstream enough for the general public to notice this time, especially considering there was a $300 million ad campaign headlined by the Blue Man Group. 
Intel tried to achieve never-before-seen levels of performance by pushing clock speeds higher and higher. In fact, they planned to scale speeds up to a whopping 10 gigahertz as process nodes shrunk. That's roughly double the speed of today's best processors. But if you know anything about CPUs, you know that clock speed isn't everything, and anyone who bought an Intel PC in the early 2000s learned that the hard way. True, these chips weren't just souped-up Pentium 3s. Instead, they featured an all-new architecture called NetBurst, which sounds hilariously close to nut bust. Oh boy. But that meant programs had to be specifically optimized for the new architecture to see any real performance gains, even though later revisions introduced Hyper-Threading for the first time. A key area in which the Pentium 4 struggled was branch prediction, which is simply the ability of the CPU to predict what the next instruction is going to be. Branch prediction is a crucial feature in all modern CPUs, and because the Pentium 4 was so bad at it, it kept having to go back and correct its own mistakes. The Pentium 4 also had a very long pipeline, which is just what it sounds like, an electronic pipe where instructions are loaded one after the other so that multiple commands can be kept moving at the same time. The P4's lengthy pipeline would often stall out because of poor branch prediction, and to top it all off, the chip was very expensive and ran very hot as a result of increased power consumption from the high clock speeds and power leakage from the transistors. That said, Intel still sold a boatload of these, mostly in pre-built machines from OEMs, so it was more of a flop in terms of performance and customer satisfaction than with sales. Ultimately, the P4 ended up being the final flagship Pentium chip before the Core series of CPUs took over, giving us multiple physical cores and a whole new architecture by 2006. But not all of Intel's big flops were CPUs. 
Although Intel's recent Arc lineup marked Team Blue's first foray into the modern discrete GPU scene, this actually isn't the first time they tried to make a graphics card. This is the Intel i740, and it was the first real consumer gaming graphics card that they ever made. It also had some incredible DNA, as the i740 was developed from a Lockheed Martin project that originally provided a visual flight simulator for astronauts in the Apollo moon landing program. Lockheed Martin later spun off this division as the company Real3D to try and tap into the consumer market, and Intel partnered with Real3D in developing the i740. By the time the i740 came out, it was hugely anticipated, with some in the industry saying that it would lead Intel to dominate the discrete GPU space. But unfortunately, Intel only sold the card for about 18 months. What? Really? The main issue was how the i740 made use of memory. The card connected to the motherboard through an AGP slot, and remember, this was before PCI Express was invented. The AGP slot provided a more direct connection to system RAM than the then-standard conventional PCI. The idea was to have the game store textures in system RAM instead of the card's built-in VRAM, with the AGP interface allowing the card to access that texture data more quickly, so you could build a card that didn't need as much VRAM, and was therefore cheaper. But despite this advantage, the i740's reliance on system memory made it slower than other cards that had sufficient VRAM to store textures of their own. And this was a time when game textures were becoming far more detailed, meaning that even though the i740 GPU itself could have actually delivered good performance, it just couldn't load all that texture data quickly enough. In fact, some i740 models came with as little as two megabytes of VRAM, which was mainly used for buffering, not storing textures. 
Unsurprisingly, NVIDIA's Riva TNT and 3dfx's Voodoo 2 quickly knocked the i740 out of the market, and Intel wouldn't release another discrete graphics product for 24 years. I mean, they tried with Larrabee. I know rejection can be hard to stomach, but man, you can't wait that long to get yourself back on the horse, you know? Thanks for watching, guys. Like, dislike, check out some of our other videos, comment with video suggestions down below, and don't forget to subscribe and follow."}