In the late 2000s, the graphics card world had a clear pecking order. NVIDIA was the king. Their GeForce GTX 200 series, particularly the monstrous GTX 280 and later the refined GTX 285, were the undisputed performance champions. AMD (which had acquired ATI in 2006) had been playing catch-up with its Radeon HD 4000 series. While the HD 4870 and 4890 offered great value, they couldn't quite topple NVIDIA's single-GPU crown.
DirectX 11 was a massive deal. It introduced Tessellation (smoother surfaces), Compute Shaders (using the GPU for non-graphics tasks), and better multi-threading. NVIDIA's current cards supported only DirectX 10. AMD realized that if it could launch a full DX11 lineup before NVIDIA, it would own the future.
The engineering challenge was brutal. To be first, they had to tape out (finalize the design) on a brand-new, unproven 40nm manufacturing process from TSMC. The previous 55nm process was stable, but 40nm was plagued with high leakage and low yields. They also had to pack nearly 2.15 billion transistors onto a die not much larger than a postage stamp.
But AMD had a secret weapon: the Radeon HD 5970 (codenamed Hemlock). They simply glued two Cypress chips onto one board via a PLX bridge chip. The HD 5970 was a dual-GPU monster that became the fastest graphics card on the planet, holding the performance crown until NVIDIA's GTX 580 launched a full year later.