
Bernie Sanders: Stop All AI, China's EUV Breakthrough, Inflation Down, Golden Age in 2026?

December 19, 2025 / 01:30:27

This episode covers Bernie Sanders' proposal for a moratorium on new AI data centers, the implications of AI for jobs and society, and economic updates in the U.S. The hosts discuss Sanders' concerns about AI leading to unemployment and harming children, with David Sacks arguing against the moratorium, emphasizing the importance of AI for national security and economic growth.

The conversation includes references to recent economic data, including unemployment rates and inflation, with the hosts debating the effectiveness of the current administration's policies. They highlight the mixed economic indicators and the public's perception of the economy.

A personal story from one of the hosts about finding a dog that resembles a deceased pet adds a heartwarming touch to the episode. The discussion also touches on the potential impact of China's advancements in semiconductor technology and the importance of maintaining technological leadership.

The hosts express concerns about the narrative surrounding AI and job displacement, stressing the need for better communication about the benefits of AI technology. They conclude with reflections on the current political climate and its effects on the economy.

TL;DR

The episode discusses Bernie Sanders' AI moratorium proposal, U.S. economic updates, and the impact of AI on jobs and society.

Video

00:00:00
All right, everybody. Welcome back to
00:00:02
your favorite podcast, the number one
00:00:04
podcast. In fact, in the entire
00:00:07
universe, the All-In podcast with your
00:00:10
besties. We're all here. It's the
00:00:12
original quartet. Everybody loves when
00:00:15
the Core Four are here. And we have a
00:00:17
docket for you today, guys. We got to
00:00:19
start out with Bernie Sanders. I know
00:00:21
this is becoming a bit repetitive, but
00:00:23
AI is the topic.
00:00:26
We reached a new level of retardation
00:00:28
this week that we cannot avoid. Bernie
00:00:30
Sanders has a major decel pitch, a
00:00:34
moratorium on new AI data centers.
00:00:38
Here's his argument. Number one, the
00:00:40
billionaires are pushing AI because they
00:00:42
want more money and power. Number two,
00:00:45
there's going to be massive
00:00:46
unemployment. And he cites Gates,
00:00:48
Dario, and Elon, saying that AI would
00:00:52
replace most jobs. Number three, he uh
00:00:56
has an interesting point actually. AI is
00:00:57
harmful to kids because it decreases
00:00:59
social interaction. Actually, we kind of
00:01:00
agree with that one, I think, across the
00:01:02
board. But here's his pitch, Sacks. His
00:01:04
pitch
00:01:07
>> Are your kids using these chatbots?
00:01:09
>> No, I've talked to my kids about it.
00:01:11
>> Okay, we'll get into it.
00:01:12
>> We can get into it, but I've talked to
00:01:14
them about it. Look, if you actually
00:01:15
look at the data, if you look at what
00:01:17
kids are doing, it's so much more
00:01:20
interactive and engaging for them to
00:01:22
talk to their friends on Snap or to
00:01:24
watch videos on TikTok. Those things
00:01:26
are super engaging. Whereas doing
00:01:29
research on an AI chatbot is just it's
00:01:31
way less engaging and you see this in
00:01:33
the data. So, I'm not saying there's not
00:01:34
an issue there. You want to pay
00:01:36
attention to the way that these
00:01:38
technologies are shaping the minds of
00:01:39
young kids, but I think people get a
00:01:41
little bit confused. What they're really
00:01:43
talking about is social media and then
00:01:45
they attribute all the ills of social
00:01:47
media over to these new AI chat apps and
00:01:51
they are a little different. You know,
00:01:52
when I when I asked my kids like
00:01:55
>> how much do you use these things? Are
00:01:57
they addictive? They said no, they're
00:01:59
more just really useful. It's like
00:02:01
Google.
00:02:02
>> It's like there's two
00:02:03
>> I couldn't do school without it.
00:02:05
>> Yeah, there's two pieces here. One,
00:02:06
using AI to be smarter and to learn
00:02:09
stuff and ask questions. Absolutely
00:02:10
fantastic. Phenomenal. I think we'll all
00:02:13
agree on that. There is, and I don't
00:02:15
know if this is actually what he was
00:02:16
referencing, but there's Character AI
00:02:18
and a long tail of spicy chats where
00:02:22
people we talked about getting
00:02:23
oneshotted and these parasocial
00:02:26
relationships. That's I think what he's
00:02:27
referring to, but maybe I'm reading it
00:02:29
wrong,
00:02:30
>> but let's get into it here. His pitch is
00:02:33
that we need to slow down so that quote
00:02:36
democracy can catch up and that Congress
00:02:39
should put a moratorium on new data
00:02:41
centers. That's his solution. He got
00:02:44
roasted obviously on social media for
00:02:47
this. It's the most absurd I think
00:02:51
solution to you know what are valid
00:02:53
concerns. Friend of the pod Ro Khanna,
00:02:56
who represents Silicon Valley. He
00:02:59
I guess supported it a bit and then he
00:03:01
replied to me and says his concerns are
00:03:03
not as acute and that he just wants to
00:03:05
make sure we use renewable energy which
00:03:07
I guess is you know a fine position to
00:03:09
have. Polymarket put up a little uh
00:03:11
market here AI data center moratorium
00:03:13
passed before 2027.
00:03:16
Yeah, we'll see if that comes true or
00:03:18
not. But uh hey Sacks, you are our SGE.
00:03:22
Sacks or SGE, this is your time to
00:03:24
shine.
00:03:27
You have framed the AI debate as we
00:03:29
can't lose to China. Bernie framed it as
00:03:32
not letting the billionaire class get
00:03:34
more power and money by eliminating
00:03:35
American jobs. And last week on this
00:03:38
very program, Tucker framed it as what
00:03:40
are Americans going to get out of this?
00:03:41
Right? That that was the framing, you
00:03:43
know, as of December of 2025. So here's
00:03:47
your chance, Sacks. Why should Americans
00:03:50
who have some concerns about losing
00:03:51
their jobs care about beating China? Why
00:03:54
should they care about that, Sacks?
00:03:56
>> Why should they care about beating
00:03:58
China? Because AI is a profound
00:04:00
technology. It's going to have huge
00:04:02
economic and national security
00:04:04
implications. And the thing that Bernie
00:04:08
gets wrong is that he can't stop the
00:04:10
progress. I mean, he can't stop China
00:04:13
from making progress. We can stop
00:04:14
progress in the US, but it's not going
00:04:16
to stop China from advancing these
00:04:18
technologies. A lot of this is just
00:04:19
math. And just because we stopped
00:04:22
doesn't mean China's going to stop. So,
00:04:23
this would be the biggest own goal ever
00:04:25
if we took our leadership in the AI race
00:04:27
and just handed it to China. But look, I
00:04:30
appreciate Bernie's honesty in a way
00:04:32
because he is actually telling the truth
00:04:35
about what he wants. And I remember just
00:04:36
a week ago when President Trump signed
00:04:39
an executive order to advance a national
00:04:40
AI framework, a lot of critics and a lot
00:04:43
of Democrats were saying this was a
00:04:44
violation of states rights and we need
00:04:46
to support the states in regulating AI.
00:04:49
And here Bernie is acknowledging that
00:04:51
this is not about states rights because
00:04:53
he's saying that there needs to be a
00:04:55
moratorium on new data centers even if
00:04:56
the states want them. So he doesn't
00:04:58
support states rights either. He just
00:05:01
wants all the progress to stop. And I
00:05:03
think this is really the truth of the
00:05:05
matter is that all this talk about
00:05:08
states rights or affordability, it's all
00:05:10
a red herring. And what's really going
00:05:12
on is there's a large and growing
00:05:14
contingent of people who just want all
00:05:16
the progress to stop. But again, we
00:05:18
can't stop China from making progress.
00:05:20
So all they would be doing is ceding
00:05:22
leadership of this AI race to China.
00:05:24
What people like Bernie really want is
00:05:26
they want the US to become like Europe.
00:05:28
You know, Europe has half the share of
00:05:30
global GDP that they had 30 years ago.
00:05:33
And that's because of their hostility
00:05:36
towards innovation and technological
00:05:38
progress. And that's kind of where
00:05:40
Bernie wants us to be is he wants us to
00:05:42
go down the the path of Europe. And the
00:05:44
reason he says is because, god forbid
00:05:46
someone gets rich. Well, look,
00:05:49
capitalism sometimes results in the
00:05:50
unequal distribution of wealth, but
00:05:52
socialism always results in the equal
00:05:55
distribution of poverty and misery. And
00:05:57
if the US stops developing AI, if we
00:06:00
hand this leadership to other countries,
00:06:02
it'll make the United States poor, it'll
00:06:05
make the American people poor, and it
00:06:08
will cede our leadership globally. We
00:06:11
will not be the preeminent power in the
00:06:12
world anymore.
00:06:14
>> Okay.
00:06:14
>> So, I'll stop there.
00:06:15
>> No, no, it's great. Thank you. SGE David
00:06:18
Sacksberg,
00:06:20
there seems to be a couple of
00:06:22
conversations. Unfortunately, we weren't
00:06:23
here last week, but we had a really nice
00:06:24
uh visit from our pal Tucker, and we
00:06:27
talked about this. He seemed to think
00:06:28
there was a bit of a communication
00:06:29
problem with selling to the public the
00:06:33
value of these technologies. Do you
00:06:34
agree with that? And what do you think's
00:06:36
going on here in terms of the cross
00:06:38
conversations? One group wants to decel,
00:06:41
has some concerns, environmental,
00:06:44
you know, job displacement, etc.
00:06:46
And then, is there an issue where
00:06:49
the beating-China framing doesn't resonate
00:06:52
with the American populace? You've been
00:06:54
harping on about the rise of socialism
00:06:56
for four plus years on this very
00:06:58
podcast. So from where you sit, what's
00:07:01
the actual disconnect in these multiple
00:07:03
conversations that are going on
00:07:04
concurrently and seemingly coming to
00:07:06
a head here at the end of 2025? Well, if
00:07:08
you ask any of the politicians that are
00:07:10
making these proclamations about the
00:07:13
quote tech barons and try and actually
00:07:15
talk to them about what is the AI
00:07:19
technology delivering, what is it not
00:07:21
delivering, you kind of typically hit a
00:07:23
wall pretty quickly. It's very hard for
00:07:25
folks to articulate what is actually
00:07:26
happening. There's a tremendous amount
00:07:28
of capital being put at risk against
00:07:29
infrastructure to build out the capacity
00:07:32
for many companies, the entire economy,
00:07:34
the entire industry to deliver AI tools.
00:07:36
There isn't an aggregation
00:07:38
at this stage of value creation with the
00:07:41
exception of Nvidia which has really got
00:07:44
this $3 trillion market cap creation
00:07:46
that's happened in the last couple
00:07:47
years. Peter Thiel said this best. He's
00:07:49
like look at all the money that's going
00:07:50
into AI. There's really only one company
00:07:52
that's making any money and that's
00:07:54
Nvidia. Like at this point the jury's
00:07:55
still out. We don't even know what AI
00:07:57
is. It's sort of like when the internet
00:07:58
was happening. You know, everyone
00:08:00
thought these fiber optic switch
00:08:01
companies were going to make all the
00:08:02
money. Turns out that was wrong. It was
00:08:04
the end applications that made all the
00:08:06
money and they competed in many
00:08:07
different markets from Google to Amazon
00:08:10
to Uber. You can go down the list of all
00:08:13
the beneficiaries of the core
00:08:15
infrastructure technology of the
00:08:16
internet that was built out. So what is
00:08:18
really going on? Well, AI is the new
00:08:19
lightning rod for fear and for
00:08:23
divisiveness that ultimately breeds
00:08:26
compliance and control, which is where
00:08:28
these politicians are trying to drive
00:08:30
the populace and the voting
00:08:31
conditions in the United States. And
00:08:33
that's what's going on right now.
00:08:34
There's a lot of fear about, oh, putting
00:08:36
this data center in my town is going to
00:08:38
do X or Y or Z with no real conversation
00:08:41
about the truth of that matter. There's
00:08:43
a lot of fear about wealth creation
00:08:44
being aggregated in the hands of a few
00:08:47
when, as we saw with the internet, it
00:08:48
benefited the many. And that fear
00:08:51
mongering is a very similar tactic that
00:08:54
we've seen in the prior generations
00:08:56
where policies were misstated, fear was
00:09:00
used, and then voting control allowed
00:09:03
folks to come to power that were looking
00:09:05
for power. So I think it's just the
00:09:06
lightning rod at the moment.
00:09:07
>> Okay, Chamath, what's your take on this big
00:09:09
picture? What's going on here?
00:09:11
>> I think it's important to understand
00:09:12
that politicians have an incredible
00:09:16
sense of self-preservation.
00:09:18
So the question is why would Bernie
00:09:20
Sanders think he would not look totally
00:09:22
foolish in putting that video out? And
00:09:25
the reason it went viral is not because
00:09:28
it sounded so crazy. It was that to some
00:09:30
faction and percentage of people it
00:09:32
sounded rational and reasonable.
00:09:36
And the problem that highlights is that
00:09:38
we have a huge perception issue in AI.
00:09:41
We have a handful of companies.
00:09:43
All the PR that you see from those
00:09:46
handful of companies is a bunch of
00:09:47
circular deal making. It's a bunch of
00:09:49
capital that flows from one to the
00:09:51
other. It causes these stocks to go up
00:09:54
of which a small percentage of people
00:09:56
benefit. And at the tail end of it, it's
00:09:59
accompanied by a completely different
00:10:01
set of articles that everybody also
00:10:03
reads
00:10:05
about the sword of Damocles that's about
00:10:07
to fall on their head, whether it's
00:10:09
electricity prices or whether it's their
00:10:11
jobs or whether it's the jobs of their
00:10:13
children or the quality of their
00:10:15
education. So we have a big perception
00:10:17
problem. So the question at hand is how
00:10:20
do we fix it? How do we get back to the
00:10:22
place where a video talking about
00:10:25
stopping all progress would seem as
00:10:27
laughable as it should be? And I think
00:10:31
you can go back to the Gilded Age and
00:10:33
you can ask the question, how did the
00:10:37
industrialist leaders of that era
00:10:39
respond through that 1880 to 1920 age of
00:10:44
industrialization when you had all of
00:10:47
this technological upheaval accompanied
00:10:49
by a handful of people with incredible
00:10:52
success, right? Andrew Carnegie, John D.
00:10:55
Rockefeller, Henry Ford. What did they
00:10:58
do? And I think the lesson we can borrow
00:11:00
from them is we now need to be on the
00:11:02
forward foot as an industry. Enough of
00:11:04
the stupid haircuts, dumb watches, ugly
00:11:08
clothes, ostentatious displays of
00:11:10
wealth. We've all done it. I've been
00:11:12
guilty of it. It has to stop. Absolutely
00:11:14
stop. And instead, we need to start to
00:11:16
use a percentage of the balance sheets
00:11:18
of these companies in order to benefit
00:11:20
as many Americans as possible. That is
00:11:22
the absolute minimum. Andrew Carnegie
00:11:25
built 2500 libraries.
00:11:29
The idea was as he built the railroads,
00:11:32
you're going to scale GDP. You're going
00:11:35
to scale education and knowledge. Those
00:11:37
libraries are artifacts that allowed
00:11:39
people to feel a dividend from that
00:11:41
industrial revolution. John D.
00:11:43
Rockefeller took all of his wealth and
00:11:45
invested in institutions and
00:11:46
universities. Henry Ford specifically
00:11:50
focused on wages. We need to
00:11:52
self-organize better and we need to be
00:11:54
more on the forward foot. We need to
00:11:55
start doing things that are practically
00:11:57
measurable by tens of millions of
00:11:59
American citizens. And it starts to beat
00:12:02
back the perception problem that we have
00:12:05
because, as Sacks pointed out last week, the
00:12:07
data does not support the misperception.
00:12:11
But the problem is, and Sacks and I were
00:12:13
talking about this last week, the
00:12:14
misperception is gaining steam. And so I
00:12:17
think we need to use these tools that we
00:12:19
have to provide some more practical
00:12:22
benefits that every man, woman, and
00:12:25
child can feel. Otherwise, this thing is
00:12:27
going to build and you'll see crazier
00:12:28
and crazier versions of this Bernie
00:12:30
Sanders video.
00:12:31
>> All right. I think it's well said and
00:12:32
you know, as I said last week, I think
00:12:35
beating China does not matter to the
00:12:38
average American. It's below their line,
00:12:40
Chamath. Uh I think in many ways what
00:12:43
they care about is jobs. They care about
00:12:46
their kids getting jobs and yeah, they
00:12:48
care about costs. They care about
00:12:49
inflation. They care about energy costs
00:12:51
going up. Our industry has done a
00:12:52
terrible job of explaining how this
00:12:55
benefits the average American. And
00:12:58
they're seeing all these headlines
00:12:59
about, as you mentioned, all the cross
00:13:01
deal booking and all these stocks going
00:13:04
up. Half the country doesn't own stocks,
00:13:05
so they're not participating in it. And
00:13:08
they're not blind to seeing self-driving
00:13:10
cars or seeing their kids having a hard
00:13:12
time getting a job. And they're hearing
00:13:14
everybody talk about job displacement
00:13:16
and that's a valid concern. People are
00:13:18
scared and our industry is scaring them
00:13:21
I think by not meeting them where they
00:13:23
are and then beating China respectfully
00:13:26
is important, Sacks, super important. I
00:13:29
can intellectually agree with you on
00:13:31
that.
00:13:31
>> But if you lose your job and your kids
00:13:34
can't get a job that's that's
00:13:36
existential. Yes. It's not
00:13:39
>> My child needs a job next year. They're
00:13:41
graduating from school with $100,000 in
00:13:43
debt and my energy bills going up and my
00:13:45
grocery bills are going up and they
00:13:46
haven't come down. That's what Americans
00:13:48
are experiencing. And yeah, maybe giving
00:13:51
uh I think you also made an interesting
00:13:53
point there too about the libraries.
00:13:55
>> The war with China is existential
00:13:57
because if we can keep it on the
00:13:59
battlefield of AI and intellectual and
00:14:02
economic prowess, we have a very good
00:14:04
chance of winning. If it devolves and
00:14:06
all of a sudden becomes a different kind
00:14:08
of battle, that's bad for everybody,
00:14:10
especially our children and our
00:14:11
children's children. So, we need to win
00:14:14
the current game on the field. But in
00:14:17
order to do that, we need to change
00:14:18
these misperceptions. We need to start
00:14:20
showing tactical artifacts that show
00:14:23
that this is a dividend that can benefit
00:14:25
everybody. And we need to start now. I'm
00:14:27
just putting it out there. We have so
00:14:29
much cash on the balance sheets of these
00:14:31
companies. Wall Street values it at
00:14:33
zero. If you look at the enterprise
00:14:36
value of any of these companies and you
00:14:37
do a sum of the parts, nobody cares
00:14:39
about that cash. So, we need to start
00:14:42
using that cash more effectively.
00:14:43
>> What's the equivalent of the library
00:14:45
metaphor here for you, Jamath? Do you
00:14:46
have any ideas?
00:14:47
>> I'm not going to front-run
00:14:50
what's being worked on except to say
00:14:51
that the leaders of these companies have
00:14:54
gotten the message. They are working on
00:14:57
a whole host of solutions. And again,
00:14:59
we're fighting misperception, which is a
00:15:01
complicated battle, but it's winnable
00:15:04
and I think you're going to start to see
00:15:06
stuff in the new year.
00:15:07
>> Yeah, I would say education is at the
00:15:08
top of that list. Sacks, your thoughts
00:15:11
here as we wrap on this first topic.
00:15:12
>> I'm not sure whether to address the
00:15:14
particular hoaxes or the fact that there
00:15:16
is a larger effort here to try
00:15:19
and discredit AI and get AI development
00:15:23
to stop entirely.
00:15:25
Let's talk about some of these
00:15:26
particular hoaxes. So on the job loss
00:15:29
claim, I feel like I do this every week
00:15:31
now, but there's a new study from
00:15:33
Vanguard where they analyze job growth
00:15:36
and wage growth in occupations that are
00:15:38
highly exposed to AI automation versus
00:15:42
all occupations.
00:15:44
And they find that both job growth and
00:15:48
wage growth is higher, not lower, in the
00:15:51
occupations that are exposed to AI. So
00:15:55
if you look at job growth in occupations
00:15:58
exposed to AI, it's 1.7% compared to
00:16:01
0.8% for other. For wage growth, it's
00:16:05
3.8% versus 0.7% for other. So what you
00:16:09
see here is something maybe it's
00:16:10
counterintuitive, but it makes sense to
00:16:12
me, which is as you make workers more
00:16:14
productive, the value of their labor
00:16:16
increases, not decreases, and they end
00:16:18
up getting paid more, and you want to
00:16:20
hire more of them. So this is a huge
00:16:23
narrative violation. And again, this is
00:16:24
coming from Vanguard and it's showing
00:16:27
that AI productivity is good for
00:16:30
workers. This follows on the heels of
00:16:33
that study from Yale Budget Lab, which I
00:16:36
talked about in a previous show that it
00:16:39
said there's no discernable disruption
00:16:41
to the job market based on 33 months of
00:16:43
data after the launch of ChatGPT. Now,
00:16:46
I understand that you can make
00:16:48
predictions as to the future that this
00:16:50
state's going to change, that there will
00:16:52
be AI job loss, but what I'm saying is
00:16:55
that if you look at the data so far,
00:16:57
there is no AI job loss. Quite the
00:16:59
opposite. It's job growth and job gains.
00:17:01
By the way, we have a 2% tailwind to GDP
00:17:05
growth right now. That's coming from
00:17:07
this AI boom, this capex boom that's
00:17:10
happening. And so, this is, I think, a
00:17:13
very good thing for the US economy. Now,
00:17:15
why do people want to sabotage it?
00:17:17
There's a really interesting article in
00:17:19
Semafor recently that described how AI
00:17:22
critics were funding journalism
00:17:24
fellowships at major publications like
00:17:26
NBC News, Bloomberg, Time, The Verge, LA
00:17:30
Times, which by the way, I mean, I track
00:17:32
these things. They're relentlessly
00:17:34
negative about AI.
00:17:36
These fellowships were funded by the Future
00:17:38
of Life Institute,
00:17:40
which is a doomer group that thinks that
00:17:45
AI is going to become sentient and
00:17:47
replace humans and they were funded by
00:17:52
a donation by Vitalik Buterin from
00:17:55
Ethereum. It's an interesting story
00:17:56
actually. He donated his dog coins to
00:18:00
future of life. They ended up being
00:18:01
worth $600 million.
00:18:03
>> Dogecoins.
00:18:04
>> Doge. And then you remember how there
00:18:06
was like those other dog coins.
00:18:09
>> Shiba? Oh, he had a collection of dog
00:18:11
coins. Got it.
00:18:12
>> He had a collection of dog coins.
00:18:13
Apparently people like air dropped them
00:18:14
to him as like a promotional thing.
00:18:17
>> Got it.
00:18:17
>> I don't think he wanted them. So he's
00:18:18
like, "How do I get rid of them?" So he
00:18:20
donates them to Future of Life and they
00:18:22
end up being worth $600 million. So by
00:18:24
this
00:18:24
>> Oh my god. Almost like, by accident,
00:18:26
this Doomer think tank ends up with a
00:18:29
$600 million war chest and they've been
00:18:31
funding these journalism fellowships.
00:18:35
They've been funding grants for
00:18:36
academics to study AI. Obviously, that's
00:18:39
going to end up being very negative. And
00:18:42
they are funding a lot of these NIMBY
00:18:44
organizations that are opposing data
00:18:46
centers because their goal is just to
00:18:48
get it to stop. That's their goal. They
00:18:49
just want the development to stop. And
00:18:52
you can't, I don't think, underestimate
00:18:54
how much of an impact this has had on
00:18:55
the public discourse. But if you look at
00:18:58
their actual claims, like for example,
00:19:00
the water use claims, it's a total hoax.
00:19:02
I mean, these AI data centers do not use a lot
00:19:05
of
00:19:05
>> Yeah, that was one of the things I
00:19:06
wanted to get into because yeah,
00:19:08
compared to a golf course, they're not
00:19:10
using that much, or walnuts and, you
00:19:12
know, almonds in central California. So,
00:19:15
I think it is worth maybe just
00:19:17
>> throwing a lot of spaghetti at the wall.
00:19:19
Look, you have to remember who
00:19:21
was Ida Tarbell, because it's quite
00:19:22
interesting in the context of this. Ida
00:19:25
Tarbell was this American writer again
00:19:27
in the Gilded Age. She was part of the
00:19:29
muckrakers, which was a group of
00:19:31
journalists that were doing
00:19:32
investigative research around the abuses
00:19:36
of the industrial revolution, labor
00:19:38
abuses and the like. And she took on
00:19:39
Standard Oil. Exactly. If you look at that
00:19:42
example, I'm sure that the people in
00:19:44
this current generation who have a
00:19:45
Tarbell fellowship, what they're living
00:19:47
out is this desire to tell a story about
00:19:50
exploitation and wrongdoing. The
00:19:54
problem is that these things don't
00:19:55
factually hold together, but
00:19:57
unfortunately it adds to the perception
00:20:00
problem that we have and that has been
00:20:03
growing. As Sax said, like so many of
00:20:05
these things have been debunked, but
00:20:06
it's playing whack-a-mole, Jason, because
00:20:08
they'll throw the water thing out. It'll
00:20:11
get debunked. Next week we'll probably get
00:20:13
something about liquid cooling and all
00:20:15
the PFAS and forever chemicals. We'll
00:20:18
have to debunk that. Then we'll go to
00:20:20
like air cooling and how that's a
00:20:21
problem. Then we'll go to something
00:20:22
else. Eventually, it'll be the upland
00:20:24
grouse, but they're not going to
00:20:27
stop.
00:20:28
>> Sacks, do you have a plan to
00:20:30
reframe this to the American public?
00:20:32
You're explaining how these bad things
00:20:33
are happening and all the evil forces at
00:20:36
work behind the scenes.
00:20:37
>> Well, look, it's not evil forces, but I
00:20:39
don't think people understand the extent
00:20:40
to which the discourse has been impacted
00:20:43
by a few anti-AI tech billionaires.
00:20:46
>> I brought it up every week and on your
00:20:48
text.
00:20:48
>> I brought it up. Hold on. I brought it
00:20:50
up on this show and I tweeted about it
00:20:51
way back in November. Yeah.
00:20:53
>> And I linked to a writer named Nirit
00:20:55
Weiss-Blatt, who has analyzed this whole
00:20:57
doomer industrial complex and she has
00:21:00
shown that there are hundreds of these
00:21:02
front organizations and they're all
00:21:04
really just funded by a few big tech
00:21:06
billionaires. It's Dustin Moskovitz, Jaan
00:21:08
Tallinn, and Vitalik Buterin. Well, when I
00:21:11
first described this, people thought
00:21:13
that this sounds kind of crazy like a
00:21:14
conspiracy theory. And then sure enough,
00:21:16
Semafor comes out with the article
00:21:18
explaining that all these journalism
00:21:20
jobs are being funded by future of life,
00:21:22
which was the big donation by Vitalik
00:21:25
Buterin. So my point is just you can't
00:21:28
underestimate the extent to which a few
00:21:30
billionaires who've donated over a
00:21:32
billion dollars to this cause have
00:21:34
distorted the public debate. And you
00:21:36
really see this actually in the relative
00:21:39
popularity of AI in the United States
00:21:41
versus China. So there was recently a poll
00:21:43
on what they call AI optimism, where they
00:21:45
ask people do you believe that the
00:21:47
benefits of AI will outweigh the harms.
00:21:50
83% of people in China believe the
00:21:52
benefits will outweigh the harms, or are AI
00:21:54
optimistic. In the US it's only 39%.
00:21:58
>> So again the discourse has really been
00:22:00
affected by a few of these
00:22:04
>> institutions. Let's say that's
00:22:05
all true.
00:22:06
>> Yeah. My question to you is, what's your
00:22:08
plan to turn this around and explain to
00:22:10
the American people why they should be
00:22:12
optimistic about this? What's your plan,
00:22:14
David?
00:22:16
>> Well, look, I mean, you're right that we
00:22:17
have to flip the narrative around. And I
00:22:19
I do think that the tech companies have
00:22:20
done a really bad job explaining the
00:22:22
benefits. We talked about this with
00:22:23
Tucker. I think he brought up some
00:22:24
really good points. And you're right,
00:22:27
like the AI companies lean way too
00:22:29
much into the whole job loss narrative
00:22:31
as a way to explain their value prop.
00:22:33
And I think it's just either what Chamath
00:22:35
has said, which is this is about their
00:22:36
next fundraising round, or it's a form
00:22:39
of laziness because it's easier to
00:22:41
describe job loss or job replacement
00:22:43
than it is productivity, right? Like
00:22:45
multifactor productivity is a difficult
00:22:47
concept to explain. So I do think that
00:22:50
they've played into this narrative. But
00:22:51
I think that what we have to do here is
00:22:54
just debunk these narratives. Again, no
00:22:56
evidence of job loss yet. No evidence of
00:22:58
the water problems. I think the
00:23:01
electricity issues with data centers are
00:23:03
addressable. You have to let the AI
00:23:05
companies build their own power. They
00:23:06
don't have to connect to the grid. In
00:23:08
fact, that's what President Trump has
00:23:10
called for, for the last 6 months, is to
00:23:12
let the AI companies build their own
00:23:13
power so they're not drawing on the
00:23:15
grid. So, that problem is easily
00:23:16
addressed as well. Talking about the
00:23:18
affordability problem, but you're right,
00:23:20
we have to get the message out how this
00:23:22
technology will benefit Americans. And
00:23:24
the bottom line is, look, I've always
00:23:25
said I'm a techno-realist, not a techno-
00:23:27
accelerationist. Because at the end of
00:23:29
the day we don't have a choice. I mean
00:23:31
yeah China is gonna develop the
00:23:33
technology if we don't.
00:23:34
>> Okay.
00:23:34
>> So we don't really have a choice.
00:23:36
>> I you know as I said last week I think
00:23:38
the three most pressing issues, Chamath:
00:23:40
education, housing, and healthcare. And
00:23:43
our industry is in a unique position to
00:23:45
use AI to impact all three of those. So
00:23:48
why don't we as an industry collectively
00:23:50
explain all the great things going on
00:23:52
there. I gave a shout out to Daniel,
00:23:54
Prenovo, Prenuvo rather, all these great
00:23:57
companies analyzing our blood. There's
00:23:59
so much going on in healthcare that
00:24:00
could massively lower the cost, make
00:24:02
people's lives better.
00:24:04
>> Obviously, in construction, you could
00:24:05
have incredible advances, and certainly
00:24:07
in education, but we haven't explained
00:24:09
those three things. What's your point?
00:24:11
Is there anything you can think of to
00:24:12
add to that list?
00:24:13
>> We have to understand the president and
00:24:15
Sachs, they're responsible for
00:24:17
shepherding the GDP of America. And as
00:24:21
Sax said, half of American GDP over the
00:24:24
next few years is forecasted to come
00:24:26
from all of the capital that goes in to
00:24:28
build out our capabilities in AI. We
00:24:31
need this to happen. There's the
00:24:32
existential reason why, but there's also
00:24:34
a basic economic reason why. So Jason,
00:24:37
the social license comes when, off to
00:24:40
side, the private companies
00:24:43
start to take this perception problem
00:24:45
more seriously. So to your point, are
00:24:48
there things that they can do around
00:24:50
housing? Yes. And again, not to front-run,
00:24:52
they're actively working on
00:24:54
something that I think could be
00:24:55
transformational separately. Could they
00:24:57
do something on healthcare? TBD, but we
00:24:59
should look at it. Can you do something
00:25:01
in education? TBD, but we should look at
00:25:04
it. But the point is, if you can have
00:25:07
the same companies that are on the
00:25:08
forward foot building this revolution
00:25:10
for us, recognize that we need to bring
00:25:13
some more folks along, take some of their
00:25:15
balance sheet capital and invest it to
00:25:18
create this social license to operate.
00:25:20
We will flip this perception on its
00:25:23
head.
00:25:23
>> Yeah, I think it's well said. And you
00:25:25
know, there was a really interesting I
00:25:26
don't you guys must remember this. The
00:25:28
AT&T commercials, "You Will." Do you
00:25:30
remember those
00:25:31
>> by the way? Sorry, that's another great
00:25:32
example, Jason. AT&T at the turn of the
00:25:34
century. What did they do? They took
00:25:36
their capital and they created Bell
00:25:38
Labs. What did Bell Labs do? Bell Labs
00:25:40
drove all of modern information theory.
00:25:43
It was the precursor to the internet.
00:25:44
They did so much fundamental research. I
00:25:47
want to be clear. I don't think that
00:25:48
this is a five alarm fire. This is very
00:25:50
fixable right now. And the litmus test
00:25:53
quite honestly is in three, four, five, six
00:25:56
months. Jason, when you see the trickle
00:25:57
of progress if we can get organized,
00:26:00
does your perception change and does the
00:26:02
language that you use and the tone that
00:26:04
you've used over these last six months
00:26:06
change?
00:26:06
>> No, no, no. I'm actually thinking that
00:26:08
you've actually been quite a good early
00:26:10
warning system
00:26:12
>> for what a certain percentage of
00:26:14
Americans think. And so,
00:26:15
>> yes, I predicted it perfectly. Yes.
00:26:17
>> It's been extremely useful. But now we
00:26:19
got to act on it and you have to change
00:26:21
your mind.
00:26:22
>> Oh, no. No. My my mind has been very
00:26:23
crystal clear. I won't let you
00:26:26
you know. What I'm saying is my
00:26:28
opinion has been job displacement's
00:26:29
coming.
00:26:30
>> Jason, let me be clear. We have to earn
00:26:31
you changing your mind publicly on this
00:26:33
show. If we do that, we're back in a
00:26:35
good place.
00:26:36
>> Uh, yeah. No, I think that you're saying
00:26:37
that in a very condescending way. I have
00:26:39
brought up I have brought up Okay.
00:26:42
Unintentionally.
00:26:42
>> Here, let me let me try to say again.
00:26:43
Sorry. Let me try this again. We have
00:26:45
>> You're saying it as if I'm misguided.
00:26:46
>> No, no, no. You have a perception
00:26:49
>> which is what do you think my perception
00:26:51
is? that on the balance it's a coin flip
00:26:54
about whether AI is going to be
00:26:56
>> all good for all people or very good for
00:26:59
a small subset of people. That's what I
00:27:01
would roughly say.
00:27:03
>> I would say that's accurate, Jason.
00:27:07
So let me finish. So if we can execute
00:27:09
this plan again, meaning marry all the
00:27:12
stuff we're doing on the forward foot
00:27:14
with investing
00:27:15
>> with some of the stuff that we also need
00:27:17
to do to bring a broader swath of the
00:27:20
American population along.
00:27:22
>> Yeah.
00:27:23
>> You will have a different perception if
00:27:25
we are successful. And what I'm saying
00:27:27
is you've had this one perception and
00:27:29
our goal would be to shift you and
00:27:31
people like you.
00:27:32
>> Okay. I I I take it I think it's
00:27:35
completely fair. Yeah. I have been
00:27:36
trying to bring up that as an industry,
00:27:39
we have not recognized people's valid
00:27:42
concerns about job displacement. That's
00:27:44
all I've brought up on this program. I'm
00:27:46
not saying I think it's going to be
00:27:47
cataclysmic or we can't handle it. But I
00:27:49
think if we constantly talk about, hey,
00:27:52
winning the AI race in China, that's
00:27:54
abstract for people. And if we deny the
00:27:57
fact that robo taxis and human robotics
00:28:00
are coming and intelligent people like
00:28:03
Elon or other leaders are saying, "Yeah,
00:28:05
we're we're planning on replacing those
00:28:07
jobs. We're doing a terrible job
00:28:08
communicating this." And I do think part
00:28:10
of this is communication and part of it
00:28:12
is, "Hey, these are valid concerns." So
00:28:14
I think great resolution. Yeah, sure.
00:28:15
I'll be the litmus.
00:28:17
>> Do you guys know what um the Motte and
00:28:19
Bailey fallacy is?
00:28:20
>> No, explain it to the audience. I mean,
00:28:21
I think I've heard of it, but
00:28:22
>> the Motte and Bailey castle is an early
00:28:24
medieval fortification where there's
00:28:26
like a very protected area, the keep or
00:28:29
the Motte. And then the Bailey is kind of
00:28:31
this looser area that's harder to
00:28:34
defend. Right. So, what happens is if
00:28:36
they need to retreat, they'll go into
00:28:39
the Motte. Okay.
00:28:40
>> Yeah. Lord of the Rings. Right. Right.
00:28:42
So, this has become known as a debating
00:28:44
trick or fallacy where people will make
00:28:48
a really outrageous claim, which is
00:28:52
they'll basically run to the
00:28:53
Bailey and then when you prove that it's
00:28:56
false, they'll run back into the Motte and
00:28:58
say something very unobjectionable. So,
00:29:01
in the context of like the AI job loss
00:29:03
fallacy,
00:29:04
the Bailey is people will say this is
00:29:07
causing massive job loss, massive
00:29:09
disruption. It's already here. you can
00:29:10
see it and then when I point out well
00:29:13
actually if you look at the Yale Budget
00:29:15
Lab study or you look now at the
00:29:16
Vanguard study there is no job loss then
00:29:19
they'll retreat into the Motte and they'll
00:29:21
say no no no I'm talking about what's
00:29:22
going to happen in the future which is a
00:29:24
position that's fundamentally
00:29:26
irrefutable and then when I point that
00:29:29
out well wait you just totally change
00:29:31
what you're saying you're like no no no
00:29:32
I was only talking about the future and
00:29:34
then as soon as we sort of seem to have
00:29:36
agreement
00:29:36
>> then the people in the Motte will race out
00:29:38
to the Bailey and basically say, well,
00:29:40
look at what's happening with Uber drivers or
00:29:41
what have you. So there is this Motte and
00:29:43
Bailey thing happening all the time on
00:29:45
this job loss question
00:29:47
>> and I just want people to be straight
00:29:49
about or honest about it which is look
00:29:51
if your claim is that this will cause
00:29:53
job loss in the future. It's true. I
00:29:55
can't refute that because none of us can
00:29:58
prove what's going to happen in the
00:30:00
future. But be honest about what's
00:30:01
happening today. And in the first three
00:30:04
years after the launch of AI chatbots,
00:30:07
there's been no discernible disruption
00:30:08
to the labor market and the early
00:30:10
studies and data are showing
00:30:12
>> wage increases and actually job
00:30:14
increases and there's definitely job
00:30:17
increases in blue collars because of the
00:30:19
construction boom.
00:30:20
>> This has been our first debate club
00:30:23
corner. Every week we're going to teach
00:30:24
you how to debate better here. No, it's
00:30:26
you know and I I appreciate that and I
00:30:29
have really tried to say here like I'm
00:30:30
not a doomerist at all. you know the
00:30:33
statistics I look at you know I talk to
00:30:34
Dara. I have actually, sincerely. That's why
00:30:37
I don't use I use displacement you know
00:30:39
I really use refined language I do I do
00:30:42
and when I talked to Dara, he told me in
00:30:45
areas where robotaxis like Waymo are
00:30:48
occurring what they did specifically to
00:30:51
deal with this issue was they stopped
00:30:53
trying to hire drivers in Los
00:30:56
Angeles and in San Francisco because
00:30:59
there are so many Waymos they don't want
00:31:01
them to have a bad experience. So, you
00:31:03
know, you
00:31:05
wherever in the castle we're talking
00:31:06
about this, I I I don't have a a horse
00:31:10
in this race. I'm not part of this cabal
00:31:12
of, you know, Dustin Moskovitz's people. I
00:31:14
just know from talking to Uber and Waymo
00:31:17
and these other companies and, you know,
00:31:19
other companies that are building
00:31:21
software where they're trying to
00:31:22
eliminate jobs. And so, I'm just
00:31:24
bringing that up as a a really
00:31:26
interesting example. Uber and Waymo,
00:31:29
Elon with Tesla, they are actually
00:31:31
making plans right now to deal with this
00:31:34
displacement of drivers and they're
00:31:36
coming up with new products and services
00:31:37
for those drivers. So, one of the things
00:31:38
they're doing is data labeling at Uber.
00:31:40
They they bought a data labeling company
00:31:42
and they're taking the drivers and
00:31:43
saying, "Hey, you want to do some data
00:31:44
labeling over here?" And actually
00:31:46
creating AI jobs for them. So, this is,
00:31:48
you know, uh something that the whole
00:31:50
industry is working on.
00:31:51
>> Look, there's going to be a spectrum,
00:31:52
right? Some jobs are going to change,
00:31:55
some jobs will be eliminated. The
00:31:56
question is with respect to job
00:31:58
eliminations, will that be more than
00:32:00
offset by the net new job creations? If
00:32:02
you're looking at things like Uber
00:32:04
drivers and we go to full self-driving,
00:32:06
then obviously there's going to be some
00:32:07
category of job loss there. So, I'm not
00:32:10
claiming that there's not going to be
00:32:11
any elimination of certain types of
00:32:13
jobs, but I believe that on the whole,
00:32:16
what we're seeing is that when you
00:32:17
improve the productivity of workers,
00:32:20
their wages go up and there's more
00:32:21
demand for their labor, not less. That's
00:32:24
what the data is showing so far and I
00:32:26
just don't think you get that from the
00:32:28
media right now because they are
00:32:29
promoting this doomer narrative and a lot
00:32:31
of that is astroturfed by these
00:32:33
enormously deep-pocketed organizations that
00:32:35
have been funded by a few effective
00:32:37
altruists. All right, topic number two.
00:32:40
Economic numbers are out. It's a mixed
00:32:42
bag. Unemployment rate is up and
00:32:46
government payrolls are down. Inflation
00:32:48
somewhere in between. Let's get into the
00:32:50
numbers here. Fun with numbers.
00:32:52
unemployment rate rose to 4.6% from 4.4%
00:32:56
in September and 4% in January of 2025
00:33:00
when President Trump took over. US
00:33:03
economy added 64,000 new jobs in
00:33:06
November. October saw about 104,000 job
00:33:09
losses, but 162,000 of those job losses
00:33:13
came from the federal government. You
00:33:14
remember the Doge buyouts, friend of
00:33:17
the pod Elon Musk was involved in. Uh
00:33:19
those all took effect at the end of
00:33:21
September. So, there was a little bit of
00:33:22
a balloon payment that occurred. An
00:33:24
analyst from Moody's says, quote, "It's
00:33:26
a frozen job market. There's not much
00:33:28
hiring. There's not much firing
00:33:30
happening." That tracks with what I'm
00:33:32
seeing on the ground. Meanwhile,
00:33:33
inflation came in better than expected
00:33:35
at 2.7%, beating the 3.1%
00:33:39
expectation, but still far away from the
00:33:43
2% target for the Fed. Trump gave an
00:33:47
18-minute, very loud, emphatic address to
00:33:51
the nation last night and it was not
00:33:54
that we're invading Venezuela. It was uh
00:33:57
mostly him talking about his
00:33:59
accomplishments
00:34:00
in terms of uh prices and affordability
00:34:04
and uh a little bit of admonishing
00:34:07
Biden. Here's a 20 second clip. The last
00:34:09
administration and their allies in
00:34:11
Congress looted our treasury for
00:34:13
trillions of dollars, driving up prices
00:34:16
on everything at levels never seen
00:34:18
before. I am bringing those high prices
00:34:21
down and bringing them down very fast.
00:34:24
>> Sax, your thoughts on the economic data.
00:34:27
>> Well, you're painting
00:34:29
the economic data as
00:34:30
mixed. I don't know how you have an
00:34:31
economic report that's better than what
00:34:33
we just had today. So, first of all, we
00:34:35
saw that CPI came in at 2.7% like you
00:34:38
said, but the expectations were 3.1%.
00:34:40
That's why the market is rallying today
00:34:42
in a big way. Core inflation is down to
00:34:44
2.6%.
00:34:46
These are significant beats. And this
00:34:49
puts core CPI inflation in the US at its
00:34:52
lowest level since March of 2021. So,
00:34:55
since the whole COVID thing. And if you
00:34:57
listen to Kevin Hassett, he said that
00:34:59
over the past 3 months, core inflation's
00:35:01
been running at 1.6%. So the trend line
00:35:04
is going down even further. So it looks
00:35:06
to me like inflation's rolling over and
00:35:08
that's just about a solved problem which
00:35:10
is really good news for what it implies
00:35:12
for interest rates because it implies
00:35:13
that interest rates are coming down and
00:35:15
that's going to bring down things like
00:35:17
mortgage costs and the costs of
00:35:19
financing a car payment and things like
00:35:21
that. Now you talk about unemployment. I
00:35:24
think the unemployment news is very very
00:35:26
good. So if you look at the data from
00:35:30
September to November, the overall
00:35:33
employment is down 41,000. But why was
00:35:36
that? Private employment was up 121,000,
00:35:39
but government employment in the last 2
00:35:41
months has declined by 162,000. So this
00:35:44
is what the media was focused on is they
00:35:45
were trying to claim that unemployment
00:35:47
was up, but actually it's just that the
00:35:49
government jobs decreased. Why did the
00:35:51
government jobs decrease? Well, you
00:35:53
remember that when Doge went in and made
00:35:54
their cuts at the beginning of the year,
00:35:56
they offered people a buyout as of
00:35:58
October 1st. Now, some people took the
00:36:01
buyout right away, but a lot more waited
00:36:03
until the last possible day, which was
00:36:05
October 1st. And that's why you saw this
00:36:07
big spike in October unemployment, but
00:36:11
those are government jobs that are being
00:36:13
cut. And moreover, there are people who
00:36:15
voluntarily wanted to take the buyout
00:36:17
and they took that deal. So again, I
00:36:20
think the employment picture is looking
00:36:21
actually quite good and if you believe
00:36:24
what we believe, you don't want an
00:36:27
excessive number of government jobs. And
00:36:28
President Trump has presided over, I
00:36:31
think, the first decrease in the federal
00:36:32
workforce in decades. We've seen a 10.7%
00:36:36
decrease in federal workers in 2025.
00:36:39
Basically, the number's gone from 2.4
00:36:41
million to 2.15 million. I know that'll
00:36:44
make Freeberg very happy. But look, what
00:36:46
we had during the Biden years is we had
00:36:49
9% inflation. We had a very weak
00:36:51
economy. We did have a recession. It was
00:36:53
that two-quarter shallow recession. And
00:36:56
the media even tried to start redefining
00:36:59
what a recession was to avoid those
00:37:00
headlines. But what the Biden
00:37:02
administration did in response to that
00:37:03
is they went hog wild with government
00:37:05
hiring. And what we're seeing now is
00:37:08
inflation is now coming back down and
00:37:12
we're seeing the number of government
00:37:13
workers come down to a more reasonable
00:37:15
level and the private economy is making
00:37:17
up for it. We still have a relatively
00:37:19
historically low unemployment rate. So
00:37:23
those numbers look good. Then you look
00:37:24
at the deficit. We've reduced the
00:37:27
deficit year-over-year by 600 billion.
00:37:29
That will help bring interest rates
00:37:31
down. And then on prices, you got the
00:37:34
lowest gas prices in five years, below
00:37:36
$3 nationally. And then finally, on
00:37:38
wages, real wages are up by over $1,000
00:37:41
on average. And um it's $1,300 for
00:37:45
factory workers, $1,800 for construction
00:37:47
workers. And again, that's a big change
00:37:49
from the Biden years where you saw that
00:37:52
in real terms, wages went down by about
00:37:55
$3,000 on average per worker. So look, I
00:37:59
mean, it seems to me like we're on the
00:38:01
cusp of a golden age here. I don't see
00:38:03
how the numbers could really be better.
00:38:06
And then on top of it, again, you've got
00:38:08
this AI tailwind, which I think is a
00:38:10
huge positive, not a negative, which is
00:38:11
adding roughly 2% GDP growth every year
00:38:15
because of huge capex investment with so
00:38:18
far no job loss associated with that. So
00:38:21
I would just say sit back and enjoy
00:38:23
this. I think we're headed for a
00:38:25
gangbusters 2026. Rates are coming down.
00:38:28
Inflation's coming down. And you're also
00:38:31
getting tax cuts going into effect next
00:38:33
year because of the big beautiful bill.
00:38:35
No tax on tips, no tax on overtime, no
00:38:37
tax on social security, plus the
00:38:39
standard deductions being beefed up. So
00:38:41
people haven't even felt the benefit of
00:38:43
those tax cuts. That's coming in in
00:38:45
April. I don't see how things could be
00:38:47
much better.
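[Editor's note: Sacks's figures above fit together arithmetically. A quick check — a sketch using only the numbers quoted on the show, none independently verified: the overall employment change should equal private gains plus government losses, and a "1.6% over the past 3 months" inflation figure is an annualized rate, i.e. the implied average monthly change compounded twelve times.]

```python
# Sanity-check of the employment and inflation figures quoted above.
# All inputs are the numbers stated in the conversation, not verified data.

# Employment, September to November: private up 121k, government down 162k.
private_change = 121_000
government_change = -162_000
overall_change = private_change + government_change
print(overall_change)  # -41000, matching "overall employment is down 41,000"

# "Core inflation running at 1.6% over the past 3 months" read as an
# annualized rate: the implied average monthly change, compounded 12x.
implied_monthly = 1.016 ** (1 / 12) - 1        # roughly 0.13% per month
annualized = (1 + implied_monthly) ** 12 - 1   # compounds back to ~1.6%
print(round(annualized * 100, 1))  # 1.6
```

[The point is only that the decomposition is internally consistent; whether the underlying BLS figures are right is a separate question.]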
00:38:48
>> Okay, Freeberg, any thoughts here on the
00:38:51
data as it's come in? Or Chimath? I'll
00:38:53
I'll go to either one of you.
00:38:54
>> Directionally, I think it's really
00:38:55
positive. You want to see
00:38:59
private sector jobs growing and the
00:39:01
shrinkage of government sector jobs has
00:39:04
profound impacts not just on the budget,
00:39:06
but it starts to create obvious air
00:39:10
pockets where you get a little bit more
00:39:13
rational regulation. You can deregulate
00:39:16
in the appropriate places. You can shift
00:39:18
the burden and the responsibility to
00:39:19
private industry. So, it's more than
00:39:22
just what's in the numbers that I think
00:39:23
is positive. I've told you this story
00:39:24
before, Jason, to do some of the things
00:39:26
that I've wanted to do. I'll give you an
00:39:27
example. When we started a battery
00:39:30
business 5 years ago, when we filed with
00:39:34
the DOE, the Department of Energy, these
00:39:36
are 700-page reports. You spend millions
00:39:40
of dollars and you hire teams of 50 and
00:39:42
100 people. And what are we trying to
00:39:44
do? We're trying to build a battery
00:39:45
business in the United States so that we
00:39:46
can de-lever from the Chinese.
00:39:49
to the extent that we could do that a
00:39:51
little bit simpler and a little bit
00:39:53
easier because there's fewer folks and
00:39:55
so the burden goes to state and local
00:39:58
regulators which we all already have to
00:39:59
deal with. These are just generally good
00:40:01
things for productivity. They're
00:40:02
generally good things for GDP. It allows
00:40:05
us to invest more aggressively in the
00:40:06
things that help America. So it's
00:40:08
trending in the right direction. By the
00:40:09
way, Sachs didn't mention this
00:40:11
explicitly, but I'll say it.
00:40:12
>> The other thing in 26 is you get a bunch
00:40:14
of tax cuts that kick in. No tax on
00:40:17
tips, no tax on overtime. the
00:40:18
deductibility of the cost of interest
00:40:20
for
00:40:22
things like car loans.
00:40:24
These are big stimulative measures:
00:40:28
accelerated depreciation. You have big
00:40:29
stimulative actions for the United
00:40:30
States economy.
00:40:32
>> Yeah. I think what you guys are missing
00:40:35
>> respectfully is that uh
00:40:37
>> you know uh Trump promised something
00:40:40
completely different. He said starting
00:40:42
on day one we will end inflation and
00:40:44
make America affordable again to bring
00:40:45
down the prices of all goods. And when
00:40:47
he was running for election, he promised
00:40:49
the Americans that he would reduce
00:40:52
prices. Now, what's happened is prices
00:40:55
have not come down. They've gone up 2.5
00:40:58
to 3.1%. But what really matters, and I
00:41:00
think you guys are doing a great job,
00:41:02
you know, pro- administration, part of
00:41:05
the administration. What you're missing
00:41:06
is the American people don't believe
00:41:08
you. And the American people are
00:41:10
experiencing something different. And
00:41:11
when you look at the uh approval rating,
00:41:13
Trump's approval rating is at its
00:41:15
historic lows. and specifically on
00:41:17
inflation. Pull up the Silver Bulletin
00:41:18
meta-analysis there. This is the
00:41:20
disconnect and this is why it seemed
00:41:22
tone-deaf last night when Trump was
00:41:24
telling everybody it's great and when
00:41:25
you say it's the golden age, Americans
00:41:27
aren't experiencing that. That's just
00:41:29
not what they experience. They are
00:41:32
experiencing grocery prices that have
00:41:34
continued to go up and they are
00:41:37
experiencing unemployment that's gone up
00:41:39
15%. Whether it's the federal employees,
00:41:41
whether it's private sector, and you
00:41:43
debate the numbers, or there's a golden
00:41:44
age coming because of no tax on tips,
00:41:46
these are all talking points. What the
00:41:48
American people remember is the Trump
00:41:50
administration said prices would go
00:41:52
down. Prices have gone up. That jobs and
00:41:55
manufacturing, there'd be all these
00:41:56
incredible manufacturing things.
00:41:58
Obviously, that's going to take years.
00:41:59
That has not happened yet. So, I hope
00:42:01
for the best. I hope inflation goes down
00:42:03
to two. And I hope they turn this
00:42:05
around. But the American people don't
00:42:07
believe the Trump administration. And
00:42:08
it's a big difference between promising
00:42:11
stuff in 2024 and delivering it. And
00:42:13
here we are in the first year and the
00:42:15
delivery that's come in from the Trump
00:42:17
administration on inflation and on the
00:42:20
economy is not good if you're in the
00:42:23
bottom half of society that doesn't own
00:42:25
equities. Period. Full stop.
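[Editor's note: Jason's "unemployment that's gone up 15%" is a relative change in the rate, not a 15-point move. A worked check using the rates quoted earlier in the episode (4.0% in January 2025, 4.6% in the latest report):]

```python
# Relative change in the unemployment rate, using the rates quoted
# earlier in the episode (not independently verified).
january_rate = 4.0   # percent, January 2025
latest_rate = 4.6    # percent, latest report
relative_change = (latest_rate - january_rate) / january_rate * 100
print(round(relative_change))  # 15, i.e. "unemployment has gone up 15%"
```

[So both framings are true at once: a 0.6-point rise in the rate, and a 15% increase relative to where it started.]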
00:42:27
>> So, first of all, prices have come down.
00:42:29
You look at gas prices are the lowest in
00:42:30
five years. They're below $3 nationally.
00:42:33
Now, I don't know how you want a new
00:42:35
administration to come in and literally
00:42:37
affect every single change on day one.
00:42:39
How do you do that?
00:42:40
>> Oh, no. That's just what he said.
00:42:42
>> No. What the president said is they
00:42:43
would start working on they would start.
00:42:46
>> I'm not saying it's possible. That's
00:42:47
what he said.
00:42:48
>> No, he said he'd start working on it on
00:42:50
day one. And so he did. And look at the
00:42:52
results. Inflation's now down to 2.7%
00:42:55
way below expectations. I don't know how
00:42:57
you want a better report than this.
00:43:00
Unemployment is very low. The only
00:43:02
reason. Remember, the economic
00:43:04
target was 3-3-3: 3% GDP
00:43:07
growth, 3% deficit-to-GDP, and 3%
00:43:10
inflation. They're below 3% inflation.
00:43:13
They're above 3% GDP growth and the
00:43:16
deficit to GDP is not where it needs to
00:43:18
be. So cutting is still ahead. But I
00:43:21
looked at the budgets for 2026 for a
00:43:23
number of the departments in the cabinet
00:43:26
over the last couple days and they do
00:43:28
have big cuts that they're trying to
00:43:29
implement across the federal government.
00:43:32
We're going to be interviewing Scott
00:43:34
Bessent tomorrow morning that'll come
00:43:36
out shortly after this episode airs. But
00:43:38
this is the catchup conversation with
00:43:40
Scott on how's the 3-3-3 plan going. But
00:43:42
on two of the three metrics, Jay Cal, it
00:43:45
does appear like things are good. I'm
00:43:46
not trying to be a spokesperson for the
00:43:47
Trump administration, but I'd say if you
00:43:49
look at this just from the raw economic
00:43:51
data and the the goals of the
00:43:52
administration following that, there
00:43:54
should be a flow through in terms of job
00:43:57
growth, the flow through in terms of
00:43:59
affordability, in terms of wage growth,
00:44:00
all those other sort of economic
00:44:01
indicators that actually affect people
00:44:03
on Main Street every day. Uh that's also
00:44:05
TBD in the narrative is still to be
00:44:07
written, but those high level economic
00:44:09
goals are still, you know, are starting
00:44:10
to kind of fall within the the framework
00:44:12
of what they set out to do, which is
00:44:14
generally good, right?
00:44:15
>> Yeah. I'm just pointing out the
00:44:16
disconnect again. This is the disconnect
00:44:18
with the American people. I mean,
00:44:20
>> no, no, I'm looking at year one.
00:44:21
>> Just year one. We're here at the end.
00:44:24
>> Time to implement an agenda.
00:44:26
>> I totally agree. I'm just telling you
00:44:28
the American people don't believe the
00:44:29
Trump administration right now. There's
00:44:31
more work to do.
00:44:33
>> That's the data that came out today is
00:44:35
looking awesome. By the way, that
00:44:37
Vanguard report that I mentioned that
00:44:39
also exposes the AI job loss hoax, it
00:44:43
also has a new economic assessment
00:44:46
that is projecting 3% growth for
00:44:49
next year, which I think is
00:44:50
conservative, and an improved labor
00:44:52
market outlook for 2026, and it says the
00:44:54
most robust improvement will come in the
00:44:57
second half of the year. So, Vanguard is
00:45:00
saying that 2026 is looking very good.
00:45:03
And by the way, that's when the fight
00:45:04
for Congress will unfold is next year.
00:45:06
So, you know, the thing about polls is
00:45:08
it's obviously just a snapshot in time
00:45:10
and it doesn't tell you what things are
00:45:12
going to look like 6 months from now or
00:45:13
a year from now. I think they're going
00:45:15
to look very good.
00:45:16
>> Yeah. I mean, maybe he turns it around.
00:45:18
>> I think it's already been turned around.
00:45:20
It just takes time to kick in.
00:45:22
>> Yeah. I mean, the the public's
00:45:23
perception it hasn't turned around and
00:45:24
inflation was supposed to go down and it
00:45:27
stayed.
00:45:27
>> It has. No, it stayed at it's still way
00:45:31
above the 2%
00:45:32
>> 1.6% for the last three months.
00:45:36
>> No, it has not. It was just 2.7.
00:45:38
>> No, that's the problem I think with the
00:45:41
administration and your part of it. Like
00:45:42
you guys cherry pick numbers and it's
00:45:44
not matching the numbers on the field.
00:45:46
So when you say cherry-picking,
00:45:48
this is the CPI report.
00:45:50
>> The the the number
00:45:52
>> open the cover of the Wall Street
00:45:53
Journal. 2.7. Hold on: 2.7% CPI, 2.6%
00:45:58
core.
00:45:59
>> 1.6% for last 3 months.
00:46:02
>> Okay.
00:46:02
>> What is hard to understand about this?
00:46:04
Obviously, the last three months are
00:46:06
going to tell you what the trajectory
00:46:07
is.
00:46:10
>> Okay. The the fact is
00:46:11
>> you're looking for every excuse you can
00:46:13
to basically downplay what is
00:46:16
>> an awesome report and just look at the
00:46:18
stock market which is ripping today
00:46:20
because this is way below expectations.
00:46:22
You're the only one who's not happy
00:46:23
about this news, Jacob.
00:46:24
>> No, no. the the country is not happy
00:46:26
about it. I I'm not I don't have a horse
00:46:28
in this race. I'm doing fantastic. I own
00:46:30
equities. I'm doing better than I ever
00:46:32
have in my life. I'm talking about how
00:46:34
the bottom half of America perceives it.
00:46:36
This is the blind spot of the Trump
00:46:37
administration. You've proved my point
00:46:39
by not addressing it. Let's go to the
00:46:41
next subject.
00:46:42
>> No, I haven't. What? What?
00:46:44
>> No, I guess the fact that you guys... My
00:46:46
point is the American people
00:46:48
>> expected that inflation would go down
00:46:51
and prices would go down. Prices have
00:46:53
continued to go up. Fed has a 2% target.
00:46:56
It's a little bit unrealistic to say
00:46:57
that what you want to you want wait you
00:46:59
want to have you want to have a negative
00:47:00
inflation rate.
00:47:03
>> Trump said prices will go down. They
00:47:04
have not. That's what the American
00:47:06
people feel. Unemployment has gone up.
00:47:09
They expect it to go down. So the this
00:47:12
idea that we're in the golden age is not
00:47:14
tracking with the bottom half of
00:47:15
Americans. That's just facts, David.
00:47:18
It's just facts. You have work to do.
00:47:20
Trump has work to do. You're you're
00:47:21
saying that anything less than deflation
00:47:23
anything less than deflation in your
00:47:25
view is
00:47:26
>> I did not say that you keep telling I
00:47:27
keep telling you what I'm reporting in
00:47:29
terms of facts then you tell me I said
00:47:31
something different I didn't say that
00:47:32
the American people are very
00:47:34
disappointed in the Trump
00:47:35
administration's first year when it
00:47:37
comes to inflation and the economy
00:47:40
>> I don't understand how the numbers could
00:47:41
be any better we just blew past
00:47:43
expectations here 2.7%
00:47:47
>> I don't know whose expectations I don't
00:47:48
know whose
00:47:49
>> pulse you think you have the finger of
00:47:51
but the economists who put together
00:47:54
>> these expectations
00:47:56
>> were expect
00:47:59
one more time for me just pull up the
00:48:00
chart for me uh so I can just explain
00:48:03
that this isn't bias on my part I'm just
00:48:05
putting the facts on the field. No, it is
00:48:07
not this is
00:48:09
>> these are numbers that you can look at
00:48:11
and see that Trump's two worst
00:48:14
categories right now are the economy and
00:48:16
inflation the American people thought he
00:48:18
was
00:48:18
>> these are snapshots in time and they're
00:48:20
going to
00:48:21
The snapshot in time from February goes down.
00:48:24
That's the snapshot in time. Every month
00:48:26
you have 12 months there to look at.
00:48:27
>> Let me explain. I don't know what's so
00:48:30
hard about this. It's a really partisan
00:48:31
point that you're making. You're
00:48:33
basically saying
00:48:33
>> I'm not partisan. The economy is like a giant
00:48:36
super
00:48:36
>> tanker. I'm not partisan.
00:48:38
>> Can I finish my point?
00:48:39
>> Sure you can.
00:48:39
>> I mean you must know this and I think
00:48:41
you're pretending not to. The US economy
00:48:43
is like a giant super tanker. It takes
00:48:45
time to turn around. It takes time to
00:48:46
implement your economic agenda. It took
00:48:48
the president the first six months of
00:48:49
the administration to implement the big
00:48:51
beautiful bill which is his economic
00:48:52
program. But it doesn't even go into
00:48:54
effect until January 1st. So the tax
00:48:57
cuts have not gone into effect. The
00:48:59
inflation reductions have happened.
00:49:02
We've gone from 9% under Biden to now
00:49:04
2.6%, core 1.6% the last 3 months. I
00:49:08
don't know what more you could want.
00:49:10
Interest rates have come down. They're
00:49:11
projected to come down even more because
00:49:13
inflation's coming down. This is again a
00:49:15
huge ship that's in the process of
00:49:17
turning around and I predict that by
00:49:18
next year the numbers are going to be
00:49:20
even better. Let me remind you of what
00:49:22
happened in the early 1980s under Ronald
00:49:23
Reagan when Reagan took over. Okay, the
00:49:25
misery index was at like 20%. Reagan
00:49:28
implemented his economic agenda, but it
00:49:30
took a couple of years to actually
00:49:32
implement it. By 1983, we actually had a
00:49:35
very severe recession because Volcker had
00:49:37
raised interest rates so much to tame
00:49:39
inflation. We're lucky we don't have
00:49:41
that issue today. In any event, it took
00:49:43
like three years for Reagan to implement
00:49:45
his economic agenda and his popularity
00:49:48
was very low by 1983, but by 1984, the
00:49:52
program had worked and it was morning in
00:49:53
America again. And he won the biggest
00:49:55
landslide election, I think, in American
00:49:57
history. So, you got to give it time for
00:50:00
the president's agenda to play out and
00:50:02
work. And it certainly looks like after
00:50:06
10 months it is performing extremely
00:50:08
well, and we are just beginning to see
00:50:11
the impact of it.
00:50:12
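The misery index Sacks cites has a precise definition: the inflation rate plus the unemployment rate. A quick sketch; the 1980 inputs below are approximate figures from memory, not taken from the episode:

```python
# Misery index = inflation rate + unemployment rate, in percentage points.
def misery_index(inflation_pct, unemployment_pct):
    return inflation_pct + unemployment_pct

# Approximate 1980 U.S. figures: CPI inflation ~13.5%, unemployment ~7.1%.
# That sums to roughly 20.6, the "like 20%" level described for the
# Reagan handoff.
misery_1980 = misery_index(13.5, 7.1)
```

The same formula with today's inflation and unemployment readings gives a sense of how far the index has fallen since then.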
>> I think the core issue is Trump
00:50:14
overpromised and underdelivered in year
00:50:17
one, and it's not the golden age for the
00:50:19
bottom half of Americans and they don't
00:50:21
feel it's the golden age and I think
00:50:23
just tackling that head-on and saying
00:50:25
yeah we have more work to do and I
00:50:26
actually think that's what the address
00:50:29
last night was about. I suspect
00:50:30
somebody, maybe Susie or the chief
00:50:33
of staff or somebody said, "Hey, we got
00:50:35
to start communicating better about the
00:50:38
cost of living and our promises during
00:50:41
the election so that we turn this
00:50:43
around." That's actually what I think is
00:50:45
happening. But I don't have a horse in
00:50:46
this race. I'm doing fabulous because I
00:50:48
own equities. I'm happy. I'm concerned,
00:50:51
sure, that the bottom half of society
00:50:53
needs to do a little bit better. That's
00:50:55
all. All right. But you, you know,
00:50:57
listen, I'm never going to win with
00:50:58
you, Sacks, because you're the captain of
00:51:00
the debate club. You will win every
00:51:01
debate. You're incredible.
00:51:02
>> Well, look, I mean, the president
00:51:04
made the right point last
00:51:05
night, which is he inherited a giant
00:51:07
mess. Again, under the Biden years, we
00:51:09
had a $3,000 reduction in real wages
00:51:13
by American workers. That's already gone
00:51:15
up by $1,000 under President Trump in
00:51:17
the first year. But yeah, if you're a
00:51:19
worker and you're just looking at your
00:51:21
situation, you're underwater $2,000
00:51:23
relative to where you were 5 years ago.
00:51:25
So obviously you're going to be salty
00:51:26
about that, but that's the situation
00:51:28
that President Trump inherited. I
00:51:31
don't know how much more progress you
00:51:32
could make in 10 months, but give it
00:51:33
another year and then we'll really see.
00:51:36
>> All right, I am rooting for you guys. I
00:51:38
hope you turn it around in year two.
00:51:40
Okay, Freeberg, we got an important
00:51:43
story here we have to get to.
00:51:44
>> Do you guys want to hear a crazy
00:51:45
personal story for 2 minutes as a
00:51:47
palate cleanser?
00:51:48
>> Oh, 32.
00:51:49
>> Oh, yes. As a palate cleanser,
00:51:50
>> this is an amuse-bouche.
00:51:51
>> Okay, this is an amuse-bouche. A
00:51:54
little pamplemousse sorbet.
00:51:56
>> This is the craziest story
00:51:58
>> in my life right now.
00:51:59
>> Pull up the photo, Nick. The first
00:52:01
photo.
00:52:02
>> You guys remember my dog, Monty?
00:52:04
>> Yeah. This is the virtue signal dog.
00:52:06
>> Not the virtue signal.
00:52:07
>> Oh, this is the best dog. This is Monty.
00:52:10
>> This is Monty. Monty. Monty was my dog
00:52:12
for 10 years. I got him 10 years ago in
00:52:15
January and he died suddenly in May. It
00:52:19
turns out he had a tumor on his heart
00:52:20
and his heart burst open in front of me
00:52:22
and he died. It was the worst thing. He
00:52:23
was with me every second of every day
00:52:26
>> that I wasn't at work. He slept on my
00:52:28
bed. He was in my office all day. You
00:52:29
guys have seen him a million times in
00:52:31
the background during the all-in pod. My
00:52:33
best friend in the world. The greatest
00:52:36
thing I've ever come across in my life.
00:52:39
>> So, he died suddenly. It was very
00:52:40
brutal, very hard. So, I'm at my house
00:52:43
the other day. My next door neighbor
00:52:46
walks in and he's got this dog on his
00:52:48
leash. Pull up this photo. This is this
00:52:51
dog. So, my next door neighbor finds
00:52:54
this dog at Rocket Dog Rescue. It was
00:52:57
pulled out of a kill shelter. The vet
00:52:59
estimates this dog is like a decade old.
00:53:02
Today, I live probably 40 miles away
00:53:04
from where we got Monty. He was found
00:53:06
wandering the streets of the Mission
00:53:07
District in San Francisco. This random
00:53:09
dog here was found at some random kill
00:53:11
shelter somewhere in California, pulled
00:53:13
out of the kill shelter a few days ago,
00:53:15
put into Rocket Dog, and rescued by my
00:53:17
next door neighbor. Brings him over.
00:53:19
He's walking him in the street. And I'm
00:53:20
like, "Oh my god, that dog looks just
00:53:21
like my old dog, Monty." We ran a DNA
00:53:24
test. This dog is Monty's brother.
00:53:28
>> What?
00:53:29
>> This dog is Monty's brother. It is the
00:53:32
craziest. There's like a billion dogs on
00:53:34
planet Earth. And of the billion dogs,
00:53:36
some random dog from some random litter
00:53:38
from some street pup in the Mission
00:53:40
District from a decade ago found its way
00:53:42
into a shelter, into another rescue
00:53:43
center, into my next door neighbor's
00:53:45
arms, and brought back to my house
00:53:47
months after Monty died. How crazy is
00:53:49
that?
00:53:50
What is his personality like?
00:53:52
>> I'm going to hang out with him this
00:53:53
afternoon. I got to go see.
00:53:55
>> But how crazy is this? Just It's Monty's
00:53:57
actual brother from the same
00:53:59
>> Monty in fairness.
00:54:00
>> Monty.
00:54:01
>> Well, the angles are off.
00:54:02
>> Step brother.
00:54:03
>> Monty's more robust.
00:54:04
>> Yeah. Well, Monty was well-fed. This dog
00:54:06
lived in a freaking kill shelter. So,
00:54:08
>> Monty was a heck of a dog. Take 12
00:54:10
pounds off Monty. But
00:54:11
>> wait, where's the other tests?
00:54:13
>> Don't lie.
00:54:14
>> Where's the uh dog that was being
00:54:15
tortured by Revlon?
00:54:17
>> That dog. The dogs have gotten robust,
00:54:20
too.
00:54:20
>> Yeah.
00:54:21
>> Was it Rabon who was torturing your
00:54:23
>> She she Daisy has a thousand siblings,
00:54:26
not just one.
00:54:27
>> And uh Daisy I'm fine with. It's
00:54:30
Mitchell. It's Mitchell Freeberg. I hate
00:54:32
>> Marshall. Marshall.
00:54:33
>> Marshall Freeberg. I hate Marshall
00:54:35
Freeberg.
00:54:36
>> I got to hang out.
00:54:36
>> He ate the pistachios last Christmas.
00:54:38
Marshall's like a little
00:54:40
>> Monty's father must have really gotten
00:54:42
around San Francisco.
00:54:44
>> He had a great run on
00:54:46
>> What are the odds of this?
00:54:47
>> Same mother, same father. It's not a
00:54:48
half brother. This is the same Monty's
00:54:50
dad was a
00:54:51
>> same litter, same dog, same
00:54:54
mother, same father, same litter on the
00:54:56
streets of San Francisco a decade ago.
00:54:58
And they all find their way in different
00:54:59
parts of California and come to Andy's
00:55:02
buried in my backyard. And this dog
00:55:04
>> samples
00:55:06
love. Was it the summer of the same?
00:55:08
>> You would say no way, but you got to see
00:55:11
better shots of it. I'll put more
00:55:12
>> photo side by side.
00:55:13
>> Yeah,
00:55:14
>> this doesn't make sense.
00:55:15
>> I know. It's the craziest ever. And by
00:55:17
the way, like
00:55:18
>> yeah, it doesn't look like they're
00:55:20
related. I'll be honest with you.
00:55:21
>> Gotta find you a proper photo of
00:55:23
>> What's going on? What's up with Oki and
00:55:25
Joker? How old are these dogs now?
00:55:27
>> Oh my god. From your lips to God's
00:55:28
ears, bro. Oki is 14 and a half. She
00:55:31
turns 15 in July.
00:55:33
>> She's so precious.
00:55:34
>> And Joker's eight or nine. So, touch wood,
00:55:37
they're uh
00:55:38
>> Yeah,
00:55:38
>> Golden Retrievers don't really live to
00:55:40
15, but my god, she's been
00:55:42
>> You got a lot of good years. I was just
00:55:43
on the I was on the floor last Thursday
00:55:45
snuggling with her and it was so
00:55:47
precious.
00:55:47
>> Well, you know what we did the I think
00:55:49
the best thing that we did was we got
00:55:52
much more disciplined about diet for Oki
00:55:55
when she turned about 13.
00:55:58
Little bit calorie restricted. It's not
00:56:00
nearly as much. She lost a few kilos. It
00:56:03
helped her a lot walking and
00:56:05
>> she's a little deaf unfortunately, but
00:56:07
she can see.
00:56:08
>> Dogs are the best. My god,
00:56:09
>> dogs are the best. Here's my Here's
00:56:11
here's my other photo of the dog that
00:56:13
Chamath hates. Pull it up just so you
00:56:14
can see.
00:56:15
>> Marshall, listen. Last last Christmas at
00:56:18
Nat's birthday dinner on the 26th, which
00:56:20
is her birthday, Marshall Freeberg
00:56:22
jumped on the table, started eating all
00:56:24
of the pistachios and the nuts,
00:56:26
>> ruined them,
00:56:27
>> ruined all of them, slobbered
00:56:28
everywhere.
00:56:29
>> All David Freeberg was like, "Oh, look
00:56:32
at that." Allison Freeberg was like,
00:56:34
"Marshall Freeberg, they they have no
00:56:36
control over what's going on."
00:56:38
>> Here, pull this photo up, Nick. This is
00:56:40
a medical issue Marshall Freeberg's
00:56:41
been dealing with. He wakes up with
00:56:43
irrecoverable erections, so he can't
00:56:45
walk.
00:56:47
>> What?
00:56:47
>> Pull this up.
00:56:48
>> Irrecoverable.
00:56:51
>> So, Nick, you got to blur this out. When
00:56:53
you
00:56:54
>> Oh, dude's packing.
00:56:57
>> So, this little guy, he wait erection.
00:57:00
>> He has these erections and well, the
00:57:02
doctors say like, "You just got to wait
00:57:03
and if it ends up staying for too long,
00:57:05
there's going to be a surgery." But like
00:57:07
right now, he's okay. But he can't move
00:57:08
when he gets them and he's stuck and
00:57:10
he'll just stand there unable to move
00:57:12
for like 20 minutes like with his little
00:57:13
>> thing happened. I gave Chamath some of my
00:57:16
roast sparks and he had a similar thing
00:57:18
happen. He was frozen.
00:57:19
>> There's there's a Brian Johnson joke in
00:57:21
here. But
00:57:22
>> that is a problem.
00:57:23
>> Yeah.
00:57:23
>> How's Moose doing? He's my special
00:57:26
guy.
00:57:27
>> Here he is. Special.
00:57:29
>> Oh,
00:57:30
>> is that your dog?
00:57:31
>> This is the moose. The moose is trying
00:57:34
to keep Oh, buddy. Oh, buddy. Fish is my
00:57:37
buddy. Oh, good. What are you doing?
00:57:40
What are you chewing on?
00:57:41
>> What's he doing?
00:57:42
>> I think you got a pop. Get that out of
00:57:44
your mouth. Come on. Get that out of
00:57:45
your mouth. He is such a little love
00:57:47
buddy. Sh love.
00:57:52
>> It's like a gremlin.
00:57:54
>> He loves it.
00:57:57
>> It's great. It's great to see you so in
00:57:59
control.
00:58:00
>> Jason, you're such a natural dog guy,
00:58:02
Jason.
00:58:05
That's buddy.
00:58:06
>> A disciplinarian. This dog loves food.
00:58:09
>> You're really and he loves to wrestle.
00:58:11
>> You're like a modern believable
00:58:14
at the ranch.
00:58:14
>> Does he headbutt you?
00:58:16
>> The dog whisper
00:58:19
>> Moose. Take it easy, buddy. I know. He
00:58:21
loves to wrestle this dog.
00:58:23
>> He's heavy, right? He's
00:58:24
>> Well, he added about four or five
00:58:26
pounds cuz he wasn't eating all that
00:58:27
much when we got him. And then we
00:58:30
realized he needs a little bit of extra
00:58:31
food. We gave him a little more food and
00:58:33
he became a little less grumpy
00:58:36
>> and then he just
00:58:37
>> we take him for like maybe three one
00:58:40
mile walks a day on the ranch.
00:58:42
>> Totally fine. If he misses a walk,
00:58:44
doggy's got way too much energy. So,
00:58:46
he's a ranch dog, not a city
00:58:48
dog. This dog
00:58:49
>> I know. I know. I told you.
00:58:50
>> Yeah, he needs to run. When he doesn't
00:58:52
run, you got a problem. All right, let's
00:58:54
keep going through the docket here. Uh
00:58:56
that's dog corner. That's K9 corner.
00:58:58
What do you guys think? That's a good
00:58:59
corner. It's a good cat corner. It's the
00:59:01
best corner.
00:59:02
>> But how crazy is that? One in a
00:59:03
billion's got to be the odds that this dog lands
00:59:05
up next to me. It's just the craziest
00:59:06
thing. Anyway,
00:59:07
>> yeah. And uh yeah, there's no cat corner
00:59:09
here. Cats are horrible.
00:59:10
>> Shout out to our friends at Waymo for
00:59:12
eliminating the cat problem in San
00:59:14
Francisco.
00:59:16
>> I mean, that's just that was the
00:59:18
greatest update to their software ever.
00:59:19
If they can eliminate more cats,
00:59:22
>> what's that paradox that the AI has
00:59:25
where it's like, do I hit the trolley?
00:59:27
It's like the cat problem. Go for the
00:59:29
problem. Go for the cat. It's just go
00:59:30
for the cat.
00:59:30
>> Absolutely. You could get somebody there
00:59:32
on time or you could get them there 30
00:59:34
seconds late and take out a cat. Take
00:59:37
out the cat. Less stray cats. Cats are
00:59:40
horrible. Shout out to our cat ladies
00:59:42
listening to the pod. There's I wonder
00:59:44
how many cat ladies listen to this pod.
00:59:45
I think there are a few.
00:59:46
>> Oh, there are a few.
00:59:48
>> Are there a few? Yeah, they hear from
00:59:51
they're on Bluesky talking about Oh,
00:59:54
god. I had a joke. I'm not going to do
00:59:55
it. All right. According to a Reuters
00:59:58
report, Freeberg, China has built a
01:00:01
prototype of ASML's EUV machine. Hey, if
01:00:06
this is true, this report, it's uh it
01:00:09
could have a profound impact on the AI
01:00:11
race that we talked about. For those of
01:00:13
you who don't know, ASML is the Dutch
01:00:15
company worth $400 billion,
01:00:18
22nd largest market cap in the world,
01:00:20
and uh they are the only company that
01:00:22
makes EUV lithography machines. EUV
01:00:24
stands for extreme
01:00:27
ultraviolet. That's how you make H100s.
01:00:31
Here's a photo of one of these giant
01:00:32
machines. They cost $250 million to
01:00:35
make, and they take 6 months to make,
01:00:38
filled with all kinds of important
01:00:40
parts. Carl Zeiss lenses, etc. Since
01:00:43
2018,
01:00:45
these machines have been limited for
01:00:47
sale by the Dutch. Why? Why can't China
01:00:50
buy these machines? Well, in 2018,
01:00:52
Trump's Secretary of State, Mike Pompeo,
01:00:55
put pressure on the Dutch to impose
01:00:57
export controls on China. And uh later
01:01:00
that same year, they did. So,
01:01:03
according to Reuters in this news story,
01:01:05
Freeberg,
01:01:07
China used former ASML engineers
01:01:10
allegedly to reverse engineer the
01:01:13
machines. Reuters has pointed out that
01:01:16
the machine in China has not yet
01:01:18
produced any working chips. It's a
01:01:20
prototype. The CCP is targeting 2028 uh
01:01:24
for working chips. Some sources say
01:01:26
maybe 2030 is more likely. You've been
01:01:28
talking about China lithography on the
01:01:30
show. We can play the Chariots of Fire
01:01:33
music that Freeberg insists we play when
01:01:35
he does his victory lap. Chamath, here's
01:01:36
his victory lap. Episode 224. Last year,
01:01:40
China announced and began a $37 billion
01:01:43
investment in developing their own
01:01:46
3 nanometer chip technology. China
01:01:50
made a claim that this investment they
01:01:52
had made was starting to pay off and
01:01:53
they had developed their own EUV system
01:01:56
and their big semiconductor company
01:01:58
called Semiconductor Manufacturing
01:02:00
International Corporation, or SMIC, in
01:02:02
China. They launched a seven nanometer
01:02:05
chip with Huawei in their Mate 60 Pro,
01:02:08
which is sort of like their iPhone
01:02:09
competitor in China. And so they're
01:02:12
proclaiming that they've already got
01:02:13
this EUV technology. From what I
01:02:15
understand, and Sax would know better
01:02:17
than I, it sounds like there was a lot
01:02:18
of reverse engineering and workaround of
01:02:21
existing technology in order to deliver
01:02:24
that system.
01:02:24
>> Who do you think has the best chance of
01:02:27
challenging Nvidia? My early prediction
01:02:29
for 2026 is Huawei where I think that
01:02:32
there's lithography technology that
01:02:35
exists in China that is not publicly
01:02:37
discussed that is going to be deployed
01:02:39
in Huawei and all these fabs that
01:02:40
they're building in mainland China.
01:02:42
>> So announcements 2026 impact 27 probably
01:02:45
fair.
01:02:46
>> You heard it first on All-In,
01:02:48
Freeberg? Congratulations on your weekly
01:02:52
>> victory lap here. I don't I don't I
01:02:54
honestly don't think that the Reuters
01:02:55
story is necessarily news cuz I think
01:02:57
it's a little bit of a narrow scope in
01:02:58
trying to describe what happened which
01:02:59
is that they quote stole ASML technology
01:03:02
because scientists are working on it.
01:03:03
That's not really the full scope of
01:03:05
what's been going on in China for a
01:03:06
number of years and it really
01:03:09
understates the technological progress
01:03:11
that China's independently made in using
01:03:13
other systems to try and achieve
01:03:15
lithography parity. So if you go back a
01:03:18
number of years, the current kind of
01:03:20
investment vehicle is actually Chinese
01:03:22
banks put capital into a fund that's
01:03:25
roughly $48 billion uh US dollars which
01:03:29
is actually what's called the national
01:03:31
integrated circuit industry investment
01:03:33
fund phase 3. So there was phase 1 which
01:03:36
started in 2014 which focused on
01:03:39
manufacturing. So developing fabs with
01:03:41
groups like SMIC. Phase two was stood up
01:03:44
in 2019 which focused primarily on
01:03:47
design and materials. And then this
01:03:49
phase three fund was established in 2024
01:03:53
explicitly to pivot to what they call
01:03:55
choke points or bottlenecks in the
01:03:57
manufacturing process. And state media
01:03:59
has confirmed the existence of this
01:04:01
phase 3 fund and the intentionality of
01:04:04
the phase 3 fund to try and replicate
01:04:06
lithography technology. Couple of months
01:04:08
ago, if you pull up the link, Nick,
01:04:10
here's a publication on this paper. So
01:04:12
this is the leading research group out
01:04:14
of China that's been backing into
01:04:16
lithography technology using AI-driven
01:04:19
systems. They published this paper which
01:04:22
was a very good summary of where they
01:04:24
are in July.
01:04:27
Was this in Nature? No, it's in Light:
01:04:28
Science & Applications, and this paper
01:04:30
is called advancements and challenges in
01:04:33
inverse lithography technology a review
01:04:36
of artificial intelligence-based
01:04:37
approaches. If you read into the paper
01:04:40
and then you read the other publications
01:04:41
from this particular research team at
01:04:43
Tsinghua University, they have made a
01:04:46
number of breakthroughs in using AI to
01:04:49
try and establish alternative systems
01:04:51
to the traditional Zeiss optics and
01:04:54
other control systems that are used in
01:04:55
the ASML platform. And the way that
01:04:58
they've done it, I won't spend too
01:04:59
much time on it, but there's a bunch of
01:05:00
neural networks that they've used,
01:05:03
trained, and have made discoveries from,
01:05:05
including doing things like working with
01:05:07
suboptimal optics, meaning you can have
01:05:09
optics that have a degree of diffusion.
01:05:11
And then how do you actually recreate
01:05:13
3 nanometer printing techniques or 3 nanometer masks
01:05:16
by having shadowy effects accounted for
01:05:19
in the AI? So, you almost like assume
01:05:21
that the shadowy effects are going to
01:05:23
come out and you print differently than
01:05:24
you otherwise would. You don't need the
01:05:26
same degree of precision, and
01:05:27
you don't necessarily need to go get the
01:05:28
Zeiss optics. There was another paper
01:05:30
that was published early this year by
01:05:32
another Chinese research team that
01:05:34
actually demonstrated how they did use
01:05:36
AI to discover a novel method for doing
01:05:39
optics. So there is a very broad and
01:05:41
well-funded effort that's been underway
01:05:43
for over a decade. But just in the last
01:05:45
couple of years with artificial
01:05:46
intelligence based approaches, they've
01:05:48
made a series of breakthroughs. It is
01:05:50
very likely the case that Huawei already
01:05:52
has a lithography system that they'll be
01:05:55
putting into production and this is why
01:05:56
I kind of talked about it when we were
01:05:58
in Vegas and pointed it out earlier this
01:06:00
year.
01:06:01
>> But again, I think Reuters just sort of
01:06:03
maybe caught on to some narrow segment,
01:06:04
but this is a bigger story and a bigger
01:06:06
set of progress that's been made over
01:06:08
quite some time. Ramifications, Sacks,
01:06:11
here, and what you're thinking.
01:06:13
Obviously, you've talked a bunch about,
01:06:16
hey, let's sell our stack in there and
01:06:18
they're clearly working on their own
01:06:19
stack. What's the
01:06:21
ramification of this if it's true?
01:06:23
>> Well, I've never supported selling EUV
01:06:25
lithography to China. It is probably,
01:06:28
>> you know, selling the stack of H100s or
01:06:30
whatever the previous Nvidia stack was.
01:06:32
Yeah, EUV lithography. These machines
01:06:34
that are made by ASML are probably our
01:06:36
single biggest advantage in the AI race
01:06:39
because they're the only machines that
01:06:41
are capable of creating say 2 to 3
01:06:43
nanometer chips, semiconductors, and
01:06:46
it's sort of the perfect export control
01:06:48
because there's only one company that
01:06:49
makes these machines like you showed.
01:06:51
They're gigantic. They're expensive. So
01:06:54
if you think about like trying to put an
01:06:56
export control on a commodity, there'd
01:06:58
be a lot of ways to circumvent it
01:06:59
because there's a lot of sources for a
01:07:01
commodity. But in this case, there's
01:07:03
literally only one company that makes
01:07:05
these machines. So it's been a
01:07:07
tremendous advantage, I'd say, for the
01:07:08
West that we have EUV lithography. I'm
01:07:11
not surprised that China is trying to
01:07:14
reverse engineer it. It's kind of the
01:07:16
obvious thing to do. No advantage lasts
01:07:19
forever. It does seem likely that at
01:07:21
some point they'd be able to figure out
01:07:23
how to do it. So I think that this
01:07:26
article is not altogether a surprise in
01:07:29
that sense. I mean I'm sure that China
01:07:31
is trying really really hard to reverse
01:07:34
engineer this. In the meantime, however,
01:07:36
they've also made substantial progress
01:07:38
with DUV lithography. So the previous
01:07:40
generation of lithography before EUV was
01:07:44
DUV. E is extreme. D is deep. In any
01:07:48
event, DUV was supposed to sort of top
01:07:51
out at say 14 nanometer chips and China
01:07:54
was able to push the technology to get
01:07:56
to 7 nanometer and now to five. So
01:07:58
they've been able to get a lot further using
01:08:01
DUV lithography than anyone thought
01:08:03
possible and that is why the Huawei
01:08:06
chips, the Ascend chips. They're not
01:08:08
as good as our chips, but they're
01:08:09
serviceable and they've been able to use
01:08:12
their prowess in networking to string
01:08:14
more of them together. You have the
01:08:16
cloud matrix 384 technology where Huawei
01:08:19
will string together 384 of their Huawei
01:08:22
Ascend chips into a rack and that will
01:08:26
perform comparably to an Nvidia rack but
01:08:30
using a lot more power. So the bottom
01:08:32
line is China's been able to create a
01:08:34
lot of these workarounds to our
01:08:36
restrictions. But if they figured out
01:08:38
how to reverse engineer EUV, that would be, I'd say,
01:08:41
a blow because it is a real advantage
01:08:43
for us. But I'm not convinced that this
01:08:45
Reuters story is the full picture. I
01:08:48
think it's just a data point.
01:08:50
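Sacks's point, that DUV "topped out" around 14 nanometers but was pushed to 7 and then 5, can be made concrete with the standard Rayleigh resolution criterion, CD = k1 * wavelength / NA. The wavelengths below are the real ones (193 nm ArF immersion DUV, 13.5 nm EUV); the k1 and NA values are typical published figures chosen for illustration:

```python
# Rayleigh criterion: minimum printable half-pitch (critical dimension)
# in a single exposure. k1 has a hard physical floor around 0.25.
def min_half_pitch_nm(k1, wavelength_nm, na):
    return k1 * wavelength_nm / na

# Immersion DUV (ArF laser, 193 nm, NA ~1.35) vs. EUV (13.5 nm, NA ~0.33).
duv = min_half_pitch_nm(0.28, 193.0, 1.35)   # roughly 40 nm per exposure
euv = min_half_pitch_nm(0.30, 13.5, 0.33)    # roughly 12 nm per exposure

# Multi-patterning splits one layer across N exposures, roughly dividing
# the effective pitch by N. That is how SMIC stretched DUV to "7nm"-class
# nodes, at the cost of extra process steps and lower yield.
duv_double_patterned = duv / 2
```

Note that marketing node names like "7nm" no longer map directly to physical half-pitch; the formula just shows why the 14x shorter EUV wavelength is such a structural advantage, and why DUV needs multi-patterning workarounds to compete.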
>> Chamath, any thoughts here?
01:08:51
>> If you want a very short summary of how
01:08:53
the last generation of chips work, the
01:08:55
best way to think about this is that the
01:08:58
most profitable and valuable chips to
01:09:00
date have taken a very compute-centric
01:09:03
approach. That's why we've needed these
01:09:05
ultra advanced process nodes. But the
01:09:07
reality is that in a world of infinite
01:09:11
inference, let's say the next generation
01:09:14
of silicon will not take the same
01:09:16
compute heavy approach and will probably
01:09:19
rely on a much more memory-centric
01:09:20
architecture that uses a lot of SRAM.
01:09:23
The value of that is that you can
01:09:25
produce these things, as Sacks said, at 14
01:09:26
nanometer, at 7, at 5, all of these process
01:09:30
nodes that are much less advanced, which I
01:09:33
actually think is the future. So, in a
01:09:34
weird way,
01:09:37
if they steal this, which I think we
01:09:40
should assume they've already stolen it,
01:09:41
quite honestly and they figured it out,
01:09:44
the question is how bad is it? That's
01:09:45
the really important question. And the
01:09:47
thing about a more memory-centric
01:09:50
approach for AI inference or these next
01:09:52
generation forms of silicon, it demands
01:09:55
a lot more of compilers. It demands a
01:09:58
lot more of the software. I still think
01:09:59
that's where we are light years ahead.
01:10:01
So, I don't think it's the end of the
01:10:02
world and we're still in a very good
01:10:04
place,
01:10:06
but it is bad. Freeberg, weren't we
01:10:08
supposed to uh start onshoring with the
01:10:11
CHIPS Act and maybe get trending towards
01:10:15
having this production ability in
01:10:18
America? And obviously that's been
01:10:20
really hard to do.
01:10:22
And China and their relationship with
01:10:25
Taiwan seems to be, and the
01:10:26
possibility that they actually take over
01:10:28
Taiwan in a military fashion has been,
01:10:30
this existential concern, or even a
01:10:32
short-term concern.
01:10:35
>> Maybe that's a better way to say it. I
01:10:36
think the biggest security implication
01:10:38
to consider at this moment is that both
01:10:41
the US and mainland China are onshoring
01:10:43
manufacturing and that makes Taiwan less
01:10:46
of a strategic point of interest for the
01:10:48
United States and for United States
01:10:49
industry. Maybe one of the ways to think
01:10:51
about the positive aspect of what is
01:10:53
going on which is maybe call it the
01:10:55
balkanization of manufacturing generally,
01:10:57
but specifically semiconductor
01:10:58
manufacturing in this case is reduced
01:11:01
security concerns. So there's no longer
01:11:04
this flash point perhaps in Taiwan. If
01:11:06
China did invade Taiwan and the US had
01:11:08
fab manufacturing, we're less likely to
01:11:10
want to defend Taiwan. And frankly, if
01:11:12
China has fabs on mainland, do they
01:11:14
really need to invade Taiwan?
01:11:16
>> What's the economic interest in doing
01:11:18
so?
01:11:18
>> Yeah, that's kind of where I was going
01:11:19
with it. But if they also wanted to
01:11:21
jumpstart, Freeberg, they could just go
01:11:23
to Taiwan and take the machines. Like,
01:11:25
is that actually the plan?
01:11:27
>> Like, I don't know what the CCP's plan
01:11:29
is. I think that's a pretty simple
01:11:31
statement to make. I think just going to
01:11:33
Taiwan is probably a lot more
01:11:35
complicated than
01:11:36
>> of course it is. My point is if this
01:11:38
was super existential to get those
01:11:40
chips for China and they go to Taiwan
01:11:42
then they've reverse engineered
01:11:44
the machine by default.
01:11:46
>> We we need to recognize we keep using
01:11:47
the term reverse engineering as if ASML
01:11:49
is the only way to do this and it is
01:11:51
not. If you read their papers and you
01:11:53
read what's coming out, there are
01:11:55
alternative production manufacturing
01:11:57
methods and systems that are being
01:11:58
developed in China that not only maybe
01:12:01
provide parity to ASML, but get ahead of
01:12:03
ASML. And I think this is really
01:12:04
important to understand. China is not
01:12:06
just in a catch-up race. They're in a
01:12:08
primacy race and they are trying to
01:12:10
develop primacy in lithography
01:12:12
technology which will give them primacy
01:12:13
in manufacturing which will give them
01:12:15
primacy in AI which will give them
01:12:17
economic leverage over the planet and
01:12:19
that is very critical for us to
01:12:20
understand. This is not about stealing
01:12:22
data from ASML. Like, no one gives a
01:12:24
[ __ ]. If you're sitting in China, you're
01:12:25
thinking about I've got the world's best
01:12:27
scientists or some of the world's best
01:12:28
scientists and I'm giving them the
01:12:30
resources and I'm giving them the
01:12:31
mandate to go solve this problem. The
01:12:33
government, as I mentioned last year,
01:12:35
put $40 billion against this problem and
01:12:37
said, "Go figure it out." And these guys
01:12:39
at Tsinghua are publishing some
01:12:41
groundbreaking research on how to do it.
01:12:44
>> But it's easier to catch up or to copy
01:12:47
than to innovate.
01:12:49
So if you're China, the first thing
01:12:50
you'd want to do is just copy EUV and
01:12:52
get that technology for yourself.
01:12:54
>> Well, let me ask you this. What would
01:12:56
you do? So you're the AI czar. You're
01:12:57
sitting in the United States and China's
01:12:59
got this lithography technology. Is your
01:13:00
goal to steal their technology or is
01:13:02
your goal to get your best scientists
01:13:03
and say, "Let's get ahead of their
01:13:04
technology."
01:13:05
>> If I'm China,
01:13:06
>> or the United States, neither.
01:13:08
>> It's to solve the problem. What is the
01:13:10
actual economic commercial problem we're
01:13:12
solving?
01:13:12
>> Right. So if you can figure out a way to
01:13:14
do manufacturing better, cheaper, faster
01:13:16
than the way that China does it, you
01:13:18
would obviously take that path. You're
01:13:19
not trying to get to par with China, if
01:13:21
you view yourself to be supreme to
01:13:22
China,
01:13:23
>> But you have to understand even
01:13:24
with these EUV machines and the same is
01:13:26
true in the other side
01:13:27
>> at two or even sub 2 nanometer scale,
01:13:30
you have these incredible laws of
01:13:31
physics that you are pushing to the
01:13:33
boundaries of which cause these chips to
01:13:35
have pretty low yield. You have a lot of
01:13:38
systems that actually aren't that
01:13:40
efficient and highly utilized as a
01:13:42
result. So these systems are brittle.
01:13:44
The second thing is that when you have
01:13:46
these compute-centric architectures,
01:13:47
you end up making design decisions like
01:13:49
HBM, they'll have to go and steal all
01:13:52
the HBM which is the high bandwidth
01:13:53
memory business that SK Hynix and Samsung
01:13:56
have. But again, that's where Nvidia and
01:13:58
then Google have an effective monopoly.
01:14:00
>> Let me ask you a question, Sachs. I mean,
01:14:02
what's your strategic response if China
01:14:04
next year goes into manufacturing with
01:14:06
a sub 2 nanometer system that's got
01:14:08
higher yields, is faster, is cheaper
01:14:10
than anything that we're seeing anywhere
01:14:12
else in the world.
01:14:13
>> Well,
01:14:14
I don't think that's going to happen.
01:14:16
Let's talk about that if it actually
01:14:17
happens. I don't think it's going to.
01:14:19
But what would I be focused on right
01:14:21
now? I would say we want to get more of
01:14:23
the leading edge manufacturing in the
01:14:26
United States. TSMC has already opened a
01:14:29
big fab in Arizona and they're planning
01:14:32
on increasing the size of that and
01:14:34
that's really important and we should
01:14:37
get all these restrictions out of the
01:14:39
way. They're facing all sorts of
01:14:41
permitting restrictions. I mean all the
01:14:43
stuff that Bernie is talking about where
01:14:44
he wants to slow down. I mean he's
01:14:46
talking about data centers but not
01:14:49
>> manufacturing but the concepts are the
01:14:50
same.
01:14:51
>> Yeah. Ro Khanna, Elizabeth Warren, you've
01:14:53
got
01:14:54
>> exactly wrong. I mean,
01:14:55
>> anti-tech progress people saying, "Let's
01:14:57
stop all this stuff." Meanwhile, China
01:14:59
is not just racing to catch up, but
01:15:01
they're likely going to get ahead of the
01:15:02
United States. And if we're caught
01:15:04
flatfooted because we've got a
01:15:05
moratorium on development, a moratorium
01:15:07
on production, it's going to be
01:15:10
extraordinarily damaging,
01:15:12
>> right? Well, look, this is where I agree
01:15:13
with you is that if this Reuters report
01:15:16
is accurate, it does speed up the
01:15:17
timeline on China closing the chip gap
01:15:20
from
01:15:21
>> decades to a few years. You can go read
01:15:23
the Tsinghua papers. You'll see that these
01:15:25
guys are so advanced.
01:15:26
>> If they care about being ahead, why are
01:15:28
they publishing that paper?
01:15:29
>> So, by the way, this is a really
01:15:31
important point and I've picked up on
01:15:32
this from a number of different
01:15:34
disciplines in scientific research. So,
01:15:36
what's been going on is scientists in
01:15:39
China, they're very smart people.
01:15:40
There's very great scientists there. And
01:15:43
like all scientists, they don't think
01:15:45
necessarily in terms of nationalism
01:15:47
first. They think in terms of science.
01:15:49
They want to publish their research.
01:15:50
They want to be known. They want to be
01:15:53
the discoverers of these new frontiers
01:15:56
and so they publish. And what we've seen
01:15:58
in several instances is that the
01:16:00
scientific research groups will publish
01:16:02
and they'll publish slightly ahead. And
01:16:04
once they get ahead of the market or
01:16:06
they get ahead of the world, suddenly
01:16:08
they stop publishing. And it's almost
01:16:10
like the CCP comes in and it's like,
01:16:12
okay, you guys are now, you know, going
01:16:14
too far. We don't want to tip our hand
01:16:16
too far. And the stuff starts to recede.
01:16:18
And I think there was this one paper on
01:16:20
optics that is designed to replace the
01:16:22
Zeiss monopoly in optics that suddenly
01:16:25
disappeared and that research team
01:16:26
stopped publishing. I got to find this
01:16:27
paper. This came out last year.
01:16:30
>> There are a number of these disciplines
01:16:31
where there's publications that go to a
01:16:32
certain extent and then they stop
01:16:34
publishing. So I think that there's a
01:16:35
little bit of a
01:16:36
>> because of the funding cycle in China.
01:16:38
You have every single local government
01:16:40
that's given capital to invest in
01:16:42
businesses and so they compete and
01:16:44
compete until it gets to the state level
01:16:46
and you have one or two national
01:16:47
champions. But if this was so important,
01:16:49
Freeberg, would there also be an
01:16:51
argument to
01:16:54
publish things that um would be
01:16:57
misinformation or send people
01:16:59
down a wrong direction?
01:17:02
>> No, because you have to again the
01:17:03
funding cycle is you get as a policy
01:17:06
person in Shanghai, you get promoted
01:17:09
because you make a good bet.
01:17:10
>> Yeah. No, but by the time it
01:17:13
becomes the one national champion, to Freeberg's
01:17:15
point, when's the last time you saw BYD
01:17:16
or CATL publish anything meaningful,
01:17:20
>> but when they were not the national
01:17:22
champion, they were probably publishing
01:17:24
very aggressively.
01:17:26
But at that time, there was 50 different
01:17:28
companies competing to be the national
01:17:30
champion. So I suspect what's happening
01:17:32
here is there's 50 different semi-
01:17:34
startups all over China. Everybody's
01:17:37
trying to support their own local
01:17:38
version and eventually when there's one
01:17:40
they disappear.
01:17:41
>> Okay. Interesting.
01:17:42
>> I mean, one of the things we see is
01:17:45
China does give money to these research
01:17:47
institutions on frontier science and
01:17:50
then there's an industrialization. It's
01:17:52
something the US isn't quite as good at.
01:17:53
And I think it's again, we've talked
01:17:55
about the co-opting of our universities
01:17:57
as research institutions over the last
01:17:59
couple of decades. They've become these
01:18:01
sort of employment centers, if you will,
01:18:03
and they've become these sort of weird
01:18:05
social engineering systems. They've kind
01:18:07
of gotten away from some of the core
01:18:09
research, or that's been kind
01:18:11
of co-opted. In China, there are these
01:18:13
sort of dedicated research centers and
01:18:15
then there are universities where
01:18:16
there's very clear independent research
01:18:18
that goes on on university campuses.
01:18:20
They typically get quite a bit of
01:18:22
funding. And the way this worked, if you
01:18:24
look into the details of this China
01:18:26
National Integrated Circuit Industry
01:18:27
Investment Fund, and there were three
01:18:29
funds, one, two, and three, the funds
01:18:30
get bigger and bigger as the research
01:18:32
teams make progress. Uh, it's a very
01:18:34
smart way of doing things from a central
01:18:36
planning perspective. And then in
01:18:38
phase three, they're now targeting these
01:18:40
specific unlocks that then will enable
01:18:42
industrialization. So they're funding
01:18:44
some of this core research. The core
01:18:45
research has these breakthroughs and
01:18:47
then it gets industrialized. It's during
01:18:48
that research phase where we see a lot
01:18:50
of publications happening and this is
01:18:51
happening in life sciences as well and
01:18:53
in physics and in material science and
01:18:55
as soon as they start to get a little
01:18:56
bit ahead of the curve then it gets
01:18:58
industrialized and the and the
01:19:00
publications kind of stop. So I don't
01:19:01
know if it's necessarily the CCP
01:19:03
stepping in and saying hey stop
01:19:04
publishing you're going to let the world
01:19:05
know too much of what we're having
01:19:07
versus okay the handoff is now happening
01:19:08
and industrialization begins. Yeah,
01:19:10
look, I would just say that to me the
01:19:13
takeaway here is that the timeline on
01:19:16
China closing the gap with us is being
01:19:18
sped up. Whether it's the fact that
01:19:21
they're copying what we're doing,
01:19:22
whether EUV is being reverse engineered
01:19:24
or maybe their scientists are figuring
01:19:26
out a way to leapfrog, we are in a pretty
01:19:29
close race with China. And that's why I
01:19:31
think it's really important that we
01:19:32
don't listen to Bernie and others who
01:19:35
just want to stop the progress because
01:19:36
the progress is not going to stop. This
01:19:38
is going to happen in China and that
01:19:41
will be a big problem for the United
01:19:43
States.
01:19:43
>> All right, speaking of
01:19:44
socialism/communism,
01:19:46
uh you guys thinking about coming to
01:19:48
Austin, my neck of the woods.
01:19:49
>> I've been uh pulling up these photos in
01:19:52
Austin of these homes.
01:19:53
>> You know I'm doing it right.
01:19:55
>> Welcome to Texas, boys.
01:19:56
>> Like on the lake.
01:19:57
>> Sachs already bought a house. Sachs
01:19:59
already bought a house.
01:20:00
>> He hooked me up with the uh agent.
01:20:03
>> What does she got you looking at?
01:20:04
>> I'm not sure yet. We're just
01:20:06
>> You guys are bidding against each other,
01:20:08
[ __ ]
01:20:08
>> You can have my sloppy seconds.
01:20:10
>> You have the high ground, so you can
01:20:11
negotiate hard.
01:20:13
>> Uh there is so much inventory, but only
01:20:16
like 20% of the inventory is on the MLS.
01:20:19
80% here, because it's a non-disclosure
01:20:21
state, is off market. So, it's completely
01:20:24
different than
01:20:25
>> any other market. Certainly California.
01:20:27
>> I want to be on the lake. Like, if I'm
01:20:29
going to do this on the water,
01:20:31
>> Oh, really? I don't want the lake. I
01:20:33
mean, if you're here, here's the
01:20:35
challenge with the lake. People are
01:20:36
going to come up to your house and take
01:20:39
pictures and hang out like 2 feet off
01:20:41
your house and starting at 7 a.m.
01:20:43
>> Texas house in San Francisco.
01:20:45
>> Everybody's zipping around the lake in
01:20:47
motorboats starting at 7 a.m. It's
01:20:49
beautiful.
01:20:49
>> I love your thing. It's great, but just
01:20:52
>> understand it's going to get loud at 7
01:20:55
a.m.
01:20:56
>> There's nothing a good motorboating
01:20:58
doesn't
01:21:00
>> Who doesn't love a good motorboat?
01:21:02
J-Cal, I heard you're like an hour away.
01:21:04
You're not even in Austin.
01:21:06
>> I I'm in the Hill Country on my ranch,
01:21:08
but we're getting a place closer to the
01:21:09
city, actually. So, we're 30 minutes
01:21:11
outside the city.
01:21:13
>> 30 minutes? My god.
01:21:14
>> I heard you're
01:21:16
>> I heard you're in the sticks.
01:21:18
>> No, not that far. Not that far.
01:21:19
>> No, he's in the suburbs of the suburbs.
01:21:21
>> No, if you want to have like, you know,
01:21:23
dozens of acres, you can't do that in
01:21:25
the city.
01:21:26
>> When I was growing up in Memphis, they
01:21:27
called that Bumble Egypt.
01:21:30
BF
01:21:31
>> Bumble.
01:21:32
>> I mean, if you want a ranch and you want
01:21:34
like dozens of acres, you can't do that
01:21:37
in You can get an acre to three acres
01:21:39
inside of the city.
01:21:40
>> I found an acre in Tarrytown.
01:21:43
>> An acre is doable. Two acres is
01:21:46
possible.
01:21:46
>> You know what's so sad? Even if this
01:21:48
bill doesn't get on the books, which I
01:21:50
think
01:21:50
>> everyone's leaving,
01:21:51
>> there's a decent chance it ripped the
01:21:53
band-aid off, the revenues of California
01:21:55
are going to plummet.
01:21:58
plummet.
01:22:00
>> Man, it is um yeah, I mean we saw the
01:22:03
writing on the wall. Socialism is coming
01:22:05
and you know this one of the great
01:22:08
things about having a United States of
01:22:10
America that different states can
01:22:12
optimize for different things.
01:22:14
>> The thing here is
01:22:14
that one bill, this proposed billionaire
01:22:16
tax, has singlehandedly changed the
01:22:18
trajectory of the California economy by
01:22:22
$100 to $200 billion over
01:22:24
the next 5 to 10 years. It's an enormous
01:22:26
self-own. People here were willing to
01:22:29
pay the money and now
01:22:32
>> Yeah, people were willing to put up with
01:22:33
a lot in this state.
01:22:35
>> As soon as you tell people, "Hey, we're
01:22:36
we're doing property seizures,
01:22:38
>> property seizures or their threat of
01:22:40
property seizures or the fact that
01:22:42
people didn't come out against property
01:22:44
seizures is enough for everyone to be
01:22:45
like, "See you see you later." The fact
01:22:48
that all the politicians didn't in
01:22:50
uniform stand up and say, "This is
01:22:51
ridiculous. We're never going to let
01:22:52
this happen." They let this happen by
01:22:55
not defending the property rights that
01:22:57
are endowed in this country. They lost
01:23:00
everything.
01:23:01
>> They had to defend it.
01:23:02
>> What do you think happens to the state
01:23:04
budget over the next four or five years
01:23:06
as a result of this?
01:23:07
>> You're right. It's trash. And by the
01:23:08
way, and that that's what drives the
01:23:09
socialist spiral because then they try
01:23:11
and do more property seizures and they
01:23:13
try and get the money back.
01:23:14
>> Who are they going to tax? Everybody's
01:23:15
going to be gone now.
01:23:16
>> And here's the other crazy statistic
01:23:17
that we never talk about. I was going to
01:23:19
try and put some data together and get
01:23:20
it as a topic, but we should save it for
01:23:22
the new year. But the California pension
01:23:24
fund system is underfunded by roughly a
01:23:27
trillion dollars. So these folks in the
01:23:29
California pension fund system are
01:23:31
sitting around waiting for money to come
01:23:33
to them every year for the next 30, 40,
01:23:35
50 years and the money isn't there.
01:23:38
They're guaranteed that money from this
01:23:39
pension fund system and it's not there.
01:23:41
What are they going to do? It's a
01:23:43
trillion dollars short. That's a state
01:23:45
obligation. Why would anybody hear
01:23:47
something like that and think giving
01:23:49
politicians more money is a good idea?
01:23:53
These people are so fiscally
01:23:54
irresponsible. They're fiscally
01:23:56
illiterate.
01:23:58
There is that
01:24:00
it's pretty clear to me that in New York
01:24:03
and California it'll be 60-plus percent taxes.
01:24:08
And you know, if you were living in a
01:24:11
perfect society that appreciated it and
01:24:13
you had this incredible safety, you had
01:24:17
extraordinary schools and the roads and
01:24:20
the public transit,
01:24:21
>> everything worked.
01:24:22
>> Yeah. Like Japan,
01:24:24
>> you'd pay 60%, you would be ecstatic.
01:24:26
>> Yeah. You'd be like, "Okay, I mean, it's
01:24:28
a lot, but hey, this is a pretty great
01:24:30
place to live." If you're paying that
01:24:32
and people are breaking into your home
01:24:34
and there's nine overdoses a day, you're
01:24:38
like, "Wait a second." And then on top
01:24:39
of that, by the way,
01:24:40
>> and they're seizing your private
01:24:42
property now.
01:24:42
>> Seizing your property.
01:24:43
>> They're seizing your property. And by
01:24:45
the way, they don't care about you when
01:24:47
it when you look at the laws. I was just
01:24:49
talking to some law enforcement people.
01:24:52
>> City's using your money to give homeless
01:24:53
people vodka. It's just
01:24:56
>> Well, I mean, if it was tequila, I mean,
01:24:58
I have a good tequila brand we could
01:24:59
give to the homeless. Maybe we can make
01:25:01
a deal.
01:25:02
>> Yeah.
01:25:02
>> Shout out to all at Gaya coming to you.
01:25:04
Uh to delivery is happening right now.
01:25:08
Um, in all seriousness, I was talking to
01:25:10
some security folks um and was talking
01:25:14
to my security guy who's in
01:25:16
California, and um in California, not
01:25:20
only do they not have the castle
01:25:22
doctrine, they've replaced the castle
01:25:23
doctrine where you can like defend your
01:25:25
home if somebody comes into it with your
01:25:27
gun uh and protect your family and your
01:25:30
property.
01:25:32
Now you have to have this um duty to
01:25:36
flee. So you have to be able to run out
01:25:39
of your home with your family as opposed
01:25:41
to defend your family. So they literally
01:25:44
are telling the criminals, "Hey, listen.
01:25:46
If you break into somebody's home in
01:25:47
California, don't worry about getting
01:25:49
shot. You they're going to have to run
01:25:51
out or else they're in the wrong." Like
01:25:53
this is deranged. The the inmates have
01:25:56
literally taken over the asylum. They
01:25:58
they have taken over the asylum. I I
01:26:00
realized this two years ago and I pulled
01:26:02
the rip cord. Um, Elon realized it six
01:26:05
years ago when he tried to build a
01:26:06
factory. He pulled the rip cord. You
01:26:08
guys are realizing it now. Everybody's
01:26:10
realizing this. New York uh the hedge
01:26:12
fund guys in New York and in Jersey and
01:26:14
Connecticut. They realized it. What?
01:26:16
>> Only Hollywood guys. I know a bunch of
01:26:18
Hollywood guys leaving.
01:26:19
>> Yeah. I literally this producer, one of
01:26:21
the major producers who has produced
01:26:24
billions of dollars worth of films came
01:26:25
here last year. I met him at a football
01:26:27
game. He said, "Tell me everything." I
01:26:29
told him everything. He moved here. He
01:26:31
moved here. He's going to start making
01:26:32
movies here. And he is one of the top
01:26:35
top I won't say who he is, but you know
01:26:37
his films because his partner who is an
01:26:39
incredibly high-profile comedian, like
01:26:41
amongst the highest profile in history,
01:26:44
where they make films together, he moved
01:26:46
here. Why? He he doesn't want to live in
01:26:49
LA and risk his family's life to a home
01:26:51
invasion and and the house burning down
01:26:53
because they don't fill water in the
01:26:55
reservoir. I mean, guys,
01:26:57
>> hopefully not everyone realizes it cuz I
01:26:59
need to sell my house to someone.
01:27:02
>> Good luck. You'll be okay. You'll be
01:27:04
okay. There's a Chinese billionaire.
01:27:06
>> Actually, you know, you know who you'll
01:27:07
be able to sell your house.
01:27:08
>> There's no billionaires left in
01:27:09
California.
01:27:10
>> It's whoever uh the politicians are that
01:27:12
are able to aggregate power and money in
01:27:15
the socialist.
01:27:16
>> Gavin will buy your house and turn it
01:27:18
into a home.
01:27:18
>> Gavin's gone. You guys are going to look
01:27:21
You guys are going to look back on Gavin
01:27:23
and realize he was the most uh
01:27:26
>> upstanding moderate governor that
01:27:28
California had in the modern era and
01:27:30
suddenly we're going to be like what the
01:27:32
>> and I think that's a great place to stop
01:27:33
it, boys.
01:27:35
>> We will be doing a couple of episodes
01:27:37
between now and the end of the year and
01:27:39
uh yeah, no weeks off. Hopefully we get
01:27:42
to see each other over the holidays.
01:27:44
I'll see you Sachs uh and you as well
01:27:46
Chamath and Freeberg shortly. Oh, by the
01:27:49
way, the president just signed an
01:27:50
executive order reclassifying marijuana
01:27:52
from schedule one to schedule three.
01:27:55
>> Schedule one to schedule three. Okay. Uh
01:27:59
that's something Obama wanted to do in
01:28:01
the second term and never did. So,
01:28:02
>> I think it's a good thing. I mean,
01:28:04
schedule one is like heroin. You know,
01:28:06
it just never made sense to treat it the
01:28:07
same as heroin. You know, I know a lot
01:28:09
of people aren't thrilled about
01:28:10
marijuana. It's not great when people
01:28:12
are in the
01:28:12
>> I don't like marijuana, but I I agree
01:28:14
with this.
01:28:15
>> It's not heroin.
01:28:17
I mean, I think most
01:28:19
people intellectually would put it on
01:28:21
the same level as alcohol, right?
01:28:23
>> I just want it studied so that we can
01:28:24
have toxicity labels. That's the big
01:28:26
thing that I want because like if these
01:28:28
kids
01:28:29
>> Yeah.
01:28:30
>> That's the most important thing is put a
01:28:32
tox label on these things so you know
01:28:33
what you're taking.
01:28:35
>> It's the the strength of these things
01:28:37
over the last 50 years has changed
01:28:39
dramatically. Yeah. There
01:28:41
are these um shards they
01:28:46
create of this like incredibly intense
01:28:48
THC that will make you like go
01:28:50
psychotic. Like it's really dangerous.
01:28:52
So the smoking a joint from our youth is
01:28:55
very different than smoking these like
01:28:57
resins and stuff that they've created
01:28:59
apparently that are like 100 times the
01:29:01
potency, 200 times the potency, and they can
01:29:03
put you in a psychotic state. Really
01:29:05
dangerous folks.
01:29:05
>> Oh that's that's where like the FDA
01:29:07
should regulate it. More like alcohol or
01:29:09
something. 100%. Much better process,
01:29:12
especially since kids are getting messed
01:29:14
up. Yeah.
01:29:14
>> All right, everybody. Another amazing
01:29:16
episode of your favorite podcast in the
01:29:18
world, the All-In podcast. Love you.
01:29:19
>> Happy holidays, everybody. Merry
01:29:21
Christmas.
01:29:22
>> Happy Hanukkah. Merry Christmas.
01:29:23
>> Happy Hanukkah. Christmas, whatever
01:29:26
you're into.
01:29:28
>> Let your winners ride.
01:29:31
Rainman David
01:29:35
>> and it said we open sourced it to the
01:29:37
fans and they've just gone crazy with
01:29:39
it.
01:29:40
>> Queen of Quinoa.
01:29:48
Besties are gone.
01:29:51
That's my dog taking a notice in your driveway.
01:29:56
Oh man, my habitasher will meet up.
01:29:59
>> We should all just get a room and just
01:30:00
have one big huge orgy cuz they're all
01:30:02
just It's like this like sexual tension
01:30:04
that we just need to release somehow.
01:30:10
>> Wet your beak.
01:30:12
We need to get merch. I'm going all
01:30:23
in.

Badges

This episode stands out for the following:

  • Most heartwarming: 70
  • Most emotional: 60
  • Best overall: 60
  • Best performance: 60

Episode Highlights

  • Impact of Billionaires on AI Discourse
    A few billionaires have distorted the public debate on AI, affecting perceptions significantly.
    “You can’t underestimate the extent to which a few billionaires have distorted the public debate.”
    @ 21m 30s
    December 19, 2025
  • Flipping the Narrative on AI
    Experts discuss the need to reframe the narrative around AI and its benefits.
    “We have to flip the narrative around.”
    @ 22m 17s
    December 19, 2025
  • Concerns About Job Displacement
    The discussion highlights valid concerns about job displacement due to AI advancements.
    “We have not recognized people’s valid concerns about job displacement.”
    @ 27m 44s
    December 19, 2025
  • Government Job Cuts
    Government employment has declined significantly, affecting unemployment rates.
    “The number's gone from 2.4 million to 2.15 million.”
    @ 36m 39s
    December 19, 2025
  • Cusp of a Golden Age
    The speaker believes the economy is on the verge of a golden age, with improving metrics.
    “It seems to me like we're on the cusp of a golden age here.”
    @ 37m 59s
    December 19, 2025
  • Inflation and Prices
    Despite claims of improvement, inflation remains a concern for many Americans.
    “Prices have continued to go up.”
    @ 46m 51s
    December 19, 2025
  • A Heartwarming Reunion
    After losing his dog Monty, a man discovers a dog that is Monty's brother.
    “This dog is Monty’s brother.”
    @ 53m 29s
    December 19, 2025
  • The Power of Rescue
    A dog rescued from a kill shelter brings unexpected joy and connection.
    “How crazy is that?”
    @ 53m 49s
    December 19, 2025
  • China's Semiconductor Ambitions
    China is not merely catching up; they are striving for technological primacy.
    “China is not just in a catch-up race. They’re in a primacy race.”
    @ 01h 12m 06s
    December 19, 2025
  • The Urgency of Semiconductor Manufacturing
    The race to close the technology gap with China is accelerating, raising concerns for the U.S.
    “The timeline on China closing the gap with us is being sped up.”
    @ 01h 19m 13s
    December 19, 2025
  • Inmates Take Over Asylum
    A shocking statement about the state of safety and crime in California.
    “The inmates have literally taken over the asylum.”
    @ 01h 25m 56s
    December 19, 2025
  • Marijuana Reclassification
    The president signed an executive order reclassifying marijuana from schedule one to schedule three.
    “Schedule one is like heroin. It just never made sense to treat it the same.”
    @ 01h 28m 04s
    December 19, 2025


Key Moments

  • AI Water Use Claims @ 19:02
  • Fixable Issues @ 25:48
  • Valid Concerns @ 27:44
  • Government Cuts @ 36:31
  • Inflation Concerns @ 46:51
  • Unexpected Connection @ 53:29
  • Dogs Are the Best @ 56:09
  • Inmates Take Over @ 1:25:56
