GPU prices remain painful. On that we can all agree. But can it really be correct that the per-transistor cost of chips hasn't budged in a decade? So said Google's Milind Shah last year, according to Semiconductor Digest.
More specifically, the claim (via Tom's Hardware) is that the cost of 100 million transistor gates within a chip is about the same today as it was back in the 28nm generation of silicon. In fact, according to Google, today's 5nm and 3nm chips are actually slightly pricier per transistor.
Uh huh, you say, isn't this all pretty obvious? Just look at graphics card prices. Yes… but also, no. Let's take Nvidia's 28nm GK104 GPU as an example. It first appeared in 2012 and was used in various GeForce graphics cards from the GTX 680 downwards.
Now, GK104 contained 3.5 billion transistors. Fast forward to today and the current Nvidia '104-class chip, AD104 as found in the RTX 4070 and 4070 Ti, clocks in at fully 35 billion transistors.
So, that's 10 times the number of transistors. Is Nvidia really paying 10 times as much for an AD104 die today as it did for a GK104 die in 2012? Again, graphics cards have gotten pricier, but not by that much.
So, let's consider the possible explanations. For starters, these figures appear to be largely centred on the cost of buying chips from Taiwanese foundry TSMC. And TSMC is notorious for having put its prices up in recent years.
Back in 2012, TSMC was generally considered to be well behind Intel for chip production technology. Now it leads the world and charges a pretty penny as a consequence.
At the same time, perhaps the non-GPU costs of graphics cards, like VRAM, the PCB, assembly and so on have come down dramatically since 2012, preventing the price of the whole card from jumping up 10x.
But still, the whole 10x thing seems pretty punchy. Consider an Intel Raptor Lake CPU, such as the Core i9 14900K. That contains around 25 billion transistors. Wind back 10 years and you come to Intel's Ivy Bridge CPUs, which topped out at a little over two billion transistors. So, that's well over 10 times as many transistors in today's 14900K as in an Ivy Bridge CPU.
A 14900K will cost you about $550. The top Ivy Bridge chip was either just under $600, or $999 if you include the Core i7 Extreme 4960X. Either way, those chips were more expensive, especially adjusted for inflation. And yet we are to believe that transistor costs have remained the same despite today's CPU containing over 10 times as many?
Just to be really clear: today's CPUs contain well over 10 times as many transistors as those from 10 years ago, per-transistor cost has supposedly remained the same, and yet prices have fallen a bit (or a lot, if you adjust for inflation)?
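For what it's worth, the back-of-envelope sums above can be sketched out in a few lines. This uses the transistor counts and launch prices quoted in this article, and treats retail price as a crude stand-in for per-transistor cost, which is obviously a big simplification (it ignores margins, packaging, binning and everything else):

```python
# Rough retail price per billion transistors, using the figures quoted
# above. Retail price is only a loose proxy for silicon cost, so treat
# these numbers as illustrative, not definitive.
chips = {
    # name: (transistors in billions, approx launch price in USD)
    "Core i7 Extreme 4960X (2013)": (2.0, 999),   # "a little over two billion"
    "Core i9 14900K (2023)": (25.0, 550),
}

for name, (billions, price) in chips.items():
    print(f"{name}: ${price / billions:.0f} per billion transistors")
```

Even on these crude numbers, the per-transistor retail price has fallen by well over an order of magnitude, which is exactly why the "transistors aren't getting cheaper" claim needs some unpacking.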
Moreover, with a CPU there's far less cost to account for beyond the CPU die itself. It's not built into a large PCB with VRAM, a cooler, video output hardware and so on. So, the cost per transistor of a CPU die should have a much more direct effect on the final retail price than a GPU die has on a graphics card. But it appears to have had less impact, if anything: CPUs are actually a bit cheaper.
So, there's at least some nuance that's being lost in the blunt claim that transistors aren't getting any cheaper. If that were strictly true, high-end CPUs for consumer desktop PCs would be several thousand dollars. But they're not. So, we probably aren't going to panic.
Yes, it's true that chip production costs have been inflating in recent years. But we still think you're going to get more bang for your buck from PC components in the coming years.