WHY THIS MATTERS IN BRIEF
Many people have said Moore’s Law is dead, or at best slowing down, but Nvidia is bucking the trend.
We’ve been hearing for a long time now that Moore’s Law is breaking down – ironically, less because of physics than because of the cost of building new semiconductor fabs capable of manufacturing chips with ever smaller transistors, which can now require as much as $30 billion in investment per factory. Despite the doom-mongers, however, Nvidia CEO Jensen Huang says the performance of his company’s Artificial Intelligence (AI) chips is advancing faster than the historical rates set by Moore’s Law – the rubric that drove computing progress for decades.
“Our systems are progressing way faster than Moore’s Law,” said Huang in an interview on Tuesday, the morning after he delivered a keynote to a 10,000-person crowd at CES in Las Vegas.
Coined by Intel co-founder Gordon Moore in 1965, Moore’s Law predicted that the number of transistors on computer chips would roughly double every two years, essentially doubling the performance of those chips. The prediction mostly panned out, driving rapid advances in capability and plummeting costs for decades.
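For readers who like to see the arithmetic, here is a minimal Python sketch of the doubling rule described above; the starting transistor count (roughly the 1971 Intel 4004) and the 50-year horizon are illustrative assumptions, not figures from the article.

```python
def moores_law_transistors(initial_count: float, years: float,
                           doubling_period: float = 2.0) -> float:
    """Project a transistor count forward assuming a fixed doubling period."""
    return initial_count * 2 ** (years / doubling_period)

# Illustrative assumption: start from ~2,300 transistors (the 1971 Intel 4004)
# and project 50 years forward under a strict two-year doubling.
print(f"{moores_law_transistors(2_300, 50):,.0f} transistors")  # ~77 billion
```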
In recent years, though, Moore’s Law has slowed down. Huang, however, claims that Nvidia’s AI chips are advancing at an accelerated pace of their own: the company says its latest data center superchip is more than 30x faster at running AI inference workloads than its previous generation.
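To put that 30x figure in context, here is a rough back-of-the-envelope comparison with the classical Moore’s Law pace; the two-year gap between chip generations is an assumption for illustration, as the article does not state the cadence.

```python
MOORES_LAW_DOUBLING_YEARS = 2.0       # transistor count doubles every two years
CLAIMED_GENERATIONAL_SPEEDUP = 30.0   # Nvidia's figure for AI inference workloads
ASSUMED_GENERATION_GAP_YEARS = 2.0    # assumption, not stated in the article

# Performance factor Moore's Law would predict over one generation gap.
moore_factor = 2 ** (ASSUMED_GENERATION_GAP_YEARS / MOORES_LAW_DOUBLING_YEARS)

print(f"Moore's Law over {ASSUMED_GENERATION_GAP_YEARS:.0f} years: {moore_factor:.0f}x")
print(f"Claimed generational gain: {CLAIMED_GENERATIONAL_SPEEDUP:.0f}x "
      f"(~{CLAIMED_GENERATIONAL_SPEEDUP / moore_factor:.0f}x ahead of that pace)")
```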
“We can build the architecture, the chip, the system, the libraries, and the algorithms all at the same time,” said Huang. “If you do that, then you can move faster than Moore’s Law, because you can innovate across the entire stack.”
The bold claim from Nvidia’s CEO comes at a time when many are questioning whether AI’s progress has stalled. Leading AI labs – such as Google, OpenAI, and Anthropic – use Nvidia’s AI chips to train and run their AI models, and advancements to these chips would likely translate to further progress in AI model capabilities.
This isn’t the first time Huang has suggested Nvidia is surpassing Moore’s Law. On a podcast in November, Huang suggested the AI world is on pace for “hyper Moore’s Law.”
Huang rejects the idea that AI progress is slowing. Instead he claims there are now three active AI scaling laws: pre-training, the initial training phase where AI models learn patterns from large amounts of data; post-training, which fine-tunes an AI model’s answers using methods such as human feedback; and test-time compute, which occurs during the inference phase and gives an AI model more time to “think” after each question.
“Moore’s Law was so important in the history of computing because it drove down computing costs,” said Huang. “The same thing is going to happen with inference where we drive up the performance, and as a result, the cost of inference is going to be less.”