Saturday, November 01, 2025

AI Exponential: Moore's Law

Moore's law, a foundation of AI and computing progress, is usually plotted on a logarithmic scale, which is not intuitive for most people.
It is a challenge even for AI :)

Moore's Law from 1947 (when the transistor was invented) to 2030 on a linear scale, doubling every 2 years:
By 2020: 2^36.5 ≈ 97 billion transistors
By 2030: 2^41.5 ≈ 3.1 TRILLION transistors
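The numbers above can be reproduced with a minimal sketch of this doubling model, assuming the count starts at 1 transistor in 1947 and doubles every 2 years (the function name moore_projection is mine):

```python
def moore_projection(year, start_year=1947, start_count=1, doubling_years=2):
    """Transistor count under a simple 'double every 2 years' model."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

# 1947 -> 2020 is 36.5 doublings; 1947 -> 2030 is 41.5 doublings
print(f"2020: {moore_projection(2020):.3g}")  # ~9.7e10 (about 97 billion)
print(f"2030: {moore_projection(2030):.3g}")  # ~3.1e12 (about 3.1 trillion)
```

Plotted on a linear y-axis, these values hug zero for decades and then explode, which is exactly why the usual charts use a log scale.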

The correct, exponential chart:


The first AI-generated chart had a bug, an "overflow" (wrong chart):


For commercial GPUs: NVIDIA's Blackwell-based B100 accelerator has 208 billion transistors (2024).
The GB200 superchip combines two Blackwell GPUs with a Grace CPU.

For the absolute largest chip (wafer-scale): Cerebras's Wafer Scale Engine 2 (WSE-2) has 2.6 trillion transistors and 850,000 cores.

Interestingly, this means the Cerebras chip already comes close to the ~3 trillion transistors that the 2030 Moore's Law projection we charted implies. However, it's a specialized wafer-scale design rather than a conventional chip, which is why the industry still talks about reaching 1 trillion transistors for traditional multi-chiplet GPUs by 2030-2034.
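A quick sanity check of that comparison, using the same doubling model and the transistor counts quoted in this post (208 billion for the B100, 2.6 trillion for the Cerebras WSE):

```python
def projected(year, start_year=1947, doubling_years=2):
    # Simple model: 1 transistor in 1947, doubling every 2 years
    return 2 ** ((year - start_year) / doubling_years)

# Figures as quoted in the post
chips = {"NVIDIA B100": 208e9, "Cerebras WSE-2": 2.6e12}
target_2030 = projected(2030)  # roughly 3.1 trillion
for name, count in chips.items():
    print(f"{name}: {count / target_2030:.0%} of the 2030 projection")
```

The wafer-scale part lands at roughly 80-85% of the projected 2030 count, while the conventional GPU is still an order of magnitude below it.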

GB200 NVL72 | NVIDIA

The Engine Behind AI Factories | NVIDIA Blackwell Architecture

Cerebras


