David Sacks Explains How AI Will Go 1,000,000x in Four Years

May 03, 2025 / 01:43

This episode discusses the rapid advancements in AI technology, focusing on algorithms, chip improvements, and data center capabilities. Key topics include the exponential growth of AI models, the evolution of reasoning models, and the increasing deployment of GPUs in data centers.

The conversation highlights how algorithms are improving at a rate of three to four times a year, transitioning from basic LLM chatbots to more complex reasoning models. The next significant leap is anticipated to be in AI agents.

Chip technology is also advancing, with each new generation being three to four times better than the last. The discussion mentions NVL72 and its innovative rack system that enhances performance at the data center level.

Furthermore, the episode notes the dramatic increase in GPU deployment, citing Elon Musk's Grok project, which has expanded from 100,000 GPUs to 300,000, with projections reaching up to a million. OpenAI's data center is also expected to scale significantly in the coming years.

The guest emphasizes the importance of understanding exponential progress in AI, suggesting that the combined advancements in algorithms, chips, and compute power could lead to unprecedented growth in the field.
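The compounding arithmetic behind the episode's headline claim can be sketched numerically. A minimal Python sketch, using the assumed rates quoted in the episode (3-4x per year, treated here as 10x every 2 years, compounded independently across three dimensions):

```python
# Sketch of the episode's compounding math; the rates are the speaker's
# rough estimates, not measurements.
yearly_rate = 10 ** 0.5   # "10x every 2 years" ~= 3.16x per year, within the quoted 3-4x range
years = 4

# One dimension (e.g. algorithms alone) over four years: 100x, not 20x.
per_dimension = yearly_rate ** years

# Three dimensions compounding together: algorithms, chips, data-center compute.
dimensions = 3
combined = per_dimension ** dimensions

print(f"per dimension over {years} years: {per_dimension:,.0f}x")
print(f"combined across {dimensions} dimensions: {combined:,.0f}x")
```

This is the point the guest stresses about exponentials: per-dimension growth of 10x every 2 years gives 100x in 4 years, and multiplying three such dimensions yields roughly 1,000,000x.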

TL;DR

AI technology is advancing rapidly through improved algorithms, chips, and data center capabilities, leading to exponential growth in performance and deployment.

Video

[00:00] I would say the rate of progress is exponential right now on at least three key dimensions. So number one is the algorithms themselves. The models are improving at a rate of, I don't know, three to four times a year. They're not just getting faster and better, but qualitatively they're different. Remember, we started with pure LLM chatbots. Then we went to reasoning models. We didn't even get to the agents part of it yet, but that's the next big leap after reasoning models. We're just starting to scratch the surface there.

[00:25] Then you've got the chips. Depending on how you measure it, each generation of chips is probably three or four times better than the last. It's not just the individual chips that are getting better. They're figuring out how to network them together, like with NVL72, which is like a rack system to create much better performance at the data center level.

[00:40] And that would be the third area where you're seeing basically exponential progress. Just look at the number of GPUs being deployed in data centers. So when Elon first started training Grok, I think they had maybe 100,000 GPUs. Now they're up to 300,000. They're on the way to a million. Same thing with OpenAI's data center, Stargate, and within a couple of years they'll be at, I don't know, 5 million GPUs, 10 million GPUs.

[00:59] The algorithms, the chips, and the data centers are all improving or scaling at a rate of, I don't know, 3 to 4x a year. That's 10x every 2 years. Where people don't understand exponential progress is that if you're getting better at 10x every 2 years, that doesn't mean you'll be at 20x in 4 years. It means you'll be at 100x. So you multiply those things together, the algorithms, the chips, and then the raw compute that's available, and you're talking about a million-x increase. Some of that will be captured in price reductions, some of it will be in the performance ceiling, and then some of it will just be in the overall amount of AI compute that's available to the economy. But the impact of this thing is going to be absolutely massive. And I think people still don't even appreciate that fact because they don't understand exponential progress.

Episode Highlights

  • Exponential Progress in AI
    Algorithms, chips, and data centers are improving at an exponential rate, leading to massive advancements.
“You're talking about a million-x increase.”
    @ 01m 24s
    May 03, 2025

Key Moments

  • Exponential Growth @ 01:11
  • Massive Impact @ 01:37

Related Episodes

E167: Google's Woke AI disaster, Nvidia smashes earnings (again), Groq's LPU breakthrough & more
E166: Mind-blowing AI Video: OpenAI launches Sora + Is Biden too old? Tucker/Putin interview & more
Winning the AI Race Part 3: Jensen Huang, Lisa Su, James Litinsky, Chase Lochmiller
Grok 4 Wows, The Bitter Lesson, Third Party, AI Browsers, SCOTUS backs POTUS on RIFs
Elon Musk on DOGE, Optimus, Starlink Smartphones, Evolving with AI, Why the West is Imploding
New SEC Chair, Bitcoin, xAI Supercomputer, UnitedHealth CEO murder, with Gavin Baker & Joe Lonsdale
Trump Brokers Gaza Peace Deal, National Guard in Chicago, OpenAI/AMD, AI Roundtripping, Gold Rally
Google DeepMind CEO Demis Hassabis on AI, Creativity, and a Golden Age of Science | All-In Summit