
Iran War, Oil Shock, Off Ramps, AI's Revenue Explosion and PR Nightmare

March 13, 2026 / 01:20:23

This episode covers the State of the Union shout-out to Brad Gerstner, the economic implications of the war in Iran, and the rapid growth of AI companies like Anthropic and OpenAI.

Brad Gerstner shares his experience receiving a shout-out from President Trump during the State of the Union address, revealing that it was a surprise and discussing the significance of the event in American democracy.

The conversation shifts to the war in Iran, with discussions on the volatility of oil prices and the potential economic fallout. The guests analyze the implications of rising oil prices on inflation and consumer confidence, referencing Goldman Sachs' updated forecasts.

As the episode progresses, the hosts discuss the growth of AI companies, highlighting Anthropic's impressive revenue growth and the competitive landscape with OpenAI. They debate the future of AI and its impact on industries, emphasizing the need for responsible messaging and public perception.

The episode concludes with a discussion on the millionaire tax in Washington State, the implications for wealthy individuals like Howard Schultz, and the broader impact of such taxes on the economy.

TL;DR

Brad Gerstner discusses his State of the Union shout-out, the Iran war's economic impact, and the rapid growth of AI companies.

00:00:00
All right, everybody. Welcome back to
00:00:01
the number one podcast in the world.
00:00:03
Friedberg's out saving the world,
00:00:04
creating new potatoes or I don't know,
00:00:06
quinoa, maybe some Brussels sprouts. I'm
00:00:08
not sure what he's working on at this
00:00:10
point. In his place, his personal
00:00:12
favorite bestie. He always says that
00:00:13
when I'm not here, I want Brad Gerstner
00:00:16
in the seat. Welcome back. We haven't
00:00:18
seen you on the pod since your shout out
00:00:22
at the State of the Union. Take us
00:00:24
behind the scenes for a brief moment
00:00:25
here, Brad, of what it's like to get a
00:00:28
shout-out from POTUS at the State of the Union. Did
00:00:31
you know it was coming? Did you
00:00:32
choreograph this thing? Did
00:00:34
you choreograph that or was that more
00:00:35
spontaneous?
00:00:36
>> I honestly had no idea it was coming.
00:00:38
>> And in fact, I I found out after the
00:00:40
fact that it wasn't in the speech and
00:00:42
the president added it to the speech.
00:00:44
So, I don't even think it was known, you know,
00:00:46
even a few days before it was going to
00:00:48
happen. But we got an invite to the
00:00:50
State of the Union and you know, listen,
00:00:52
it's an institution. And this has
00:00:53
happened every year for 250 years in the
00:00:58
country. I've never been. I thought I
00:00:59
did know he was going to talk about
00:01:00
Trump accounts. So I figured if I'm ever
00:01:02
going to go, that's the time to go. And
00:01:04
I have to say, you know, I'm just a
00:01:06
sucker for democratic institutions and
00:01:08
democratic traditions. It was an
00:01:10
extraordinary night. Set aside, you
00:01:12
know, the headlines about what Democrats
00:01:14
did or Republicans did. Just the fact that, whether
00:01:15
it's a Democrat president or Republican
00:01:17
president, this happens every year.
00:01:19
You have to go report on the State of
00:01:20
the Union. So it was a special
00:01:22
night. We did dinner ahead of time. We're
00:01:24
in the chamber. The chamber, as you all
00:01:25
know, is very small. And so, you know,
00:01:28
just to your right was the first family
00:01:31
and Jared and Ivanka.
00:01:34
And so, you know, we were there to
00:01:37
observe like everybody else. And wow, it
00:01:40
was quite a moment. And
00:01:42
>> I want to just say you did a great job
00:01:44
because when you sent your heart out to
00:01:47
all of America,
00:01:48
>> I took it. I took it. I took it. I was
00:01:49
like it out,
00:01:51
>> but you kept it at the right angle.
00:01:53
>> Right. Right.
00:01:54
>> Had you just gone up a little bit extra
00:01:56
and out.
00:02:00
>> Would have been no bueno.
00:02:01
>> Those would be some super racist Trump
00:02:03
accounts.
00:02:04
>> Keep your protractor and your ruler out
00:02:07
when you send your heart out. Okay.
00:02:08
>> One final thing on it, Jason.
00:02:10
You know, we're signing up over
00:02:12
100,000 kids a day to these Trump
00:02:14
accounts. Fantastic. We have millions of
00:02:16
kids who've already claimed their
00:02:17
account. We have nearly 30 million kids
00:02:19
in America who are eligible for at least
00:02:21
$250 if they just go claim their
00:02:24
account. These things are going to go
00:02:25
live on July 4th. And what it really
00:02:27
showed, I think, the country, it
00:02:28
accelerated after the State of the Union
00:02:30
because the president, you know, really
00:02:32
believes this is a way to get everybody,
00:02:34
Main Street America, into the game of
00:02:36
capitalism and get them all directly
00:02:38
owning, you know, the great companies in
00:02:40
America. So, it meant a lot to me in
00:02:42
that regard that it highlights the
00:02:44
importance of the program. So, I was
00:02:45
deeply grateful to the president for not
00:02:48
only making sure this happens; the
00:02:49
shout-out is pretty cool, too.
00:02:50
>> Good for you, bro.
00:02:51
>> I have an interesting idea for
00:02:53
you. I'm sure it's come up already, but
00:02:56
with this whole discussion of UBI,
00:02:58
somebody said to me, "Oh, you know, I
00:02:59
really like these Trump accounts your
00:03:00
friends did, the Invest America,
00:03:02
because it's like the start of UBI." And
00:03:04
I was like, "Well, that's not exactly
00:03:05
the intention, but I get it." And with
00:03:08
wealth disparity going on in the
00:03:09
country, that has a lot of people
00:03:11
concerned. What if there was a giving
00:03:13
pledge around equities and people could
00:03:17
opt into it? They don't have to, but if
00:03:18
somebody like, I don't know, Larry and
00:03:21
Sergey or Zuckerberg said, I want to
00:03:23
pledge 5% of my shares to go into kids
00:03:26
accounts over the next 20 years. What an
00:03:30
amazing, beautiful thing that could be.
00:03:31
And it would be incredibly material to
00:03:33
get whatever it is, a tenth of a share,
00:03:35
a hundredth of a share, a thousandth of
00:03:36
a share of whatever company. Has that
00:03:38
come up yet as an idea? I'm sure it's
00:03:40
obvious, right?
00:03:41
It's come up. Stay tuned. But yes, we're
00:03:44
going to have some banger announcements
00:03:47
as we head toward July 4th.
00:03:49
>> All right, let's talk about the war in
00:03:50
Iran. Obviously, there are much more
00:03:52
important issues than financial ones.
00:03:54
Life, death, the freedom of the people
00:03:57
of Iran, but we're uniquely qualified, I
00:03:59
think, to talk about the economic
00:04:01
fallout, second order effects, first
00:04:03
order effects. And there has been
00:04:05
massive volatility over the last five
00:04:07
trading days. Just talking about Brent
00:04:10
crude oil and we'll we'll key the
00:04:12
discussion off of that type of oil. It
00:04:15
spiked to $84 on Friday. That was day
00:04:17
seven of the war. $119 on Monday, day 10,
00:04:21
dropped back down to $84. Jumped back up
00:04:23
to $100 after three commercial ships were
00:04:26
hit in the strait on Wednesday. Those
00:04:28
ships, by the way, were not oil tankers.
00:04:30
Uh they were carrying cargo. They were
00:04:32
flagged as Thai, Japanese, and Marshall
00:04:34
Islands. Brent crude currently at 99
00:04:36
when we're taping this. It'll be at
00:04:38
something different by the time you
00:04:39
listen to the pod, I'm sure. But it's uh
00:04:42
quite a spike. And here's a second
00:04:44
chart. This shows you the spikes over
00:04:46
time. I was old enough to remember the
00:04:47
oil shock of 1978. Uh we had to like get
00:04:50
in line at the gas station based on your
00:04:52
license plate number and uh you had to
00:04:54
wait an hour or two to get gas. Gulf War,
00:04:56
obviously, it hit $100 in 2026 dollars.
00:05:00
2008 we hit kind of a peak moment. $216
00:05:04
in today's dollars. That was the peak
00:05:06
oil discussion. Demand from China went
00:05:08
off the charts. When Russia invaded
00:05:10
Ukraine, we hit 115, which would be 133
00:05:14
in today's dollars. So, this is uh not
00:05:16
new, but it is significant. And breaking
00:05:21
news today, Iran's new Supreme Leader
00:05:23
Mojtaba says he's keeping the strait
00:05:26
closed as a tool to pressure the enemy.
00:05:29
Wall Street Journal on Thursday quoted a
00:05:31
senior fellow at the Middle East
00:05:33
Institute saying that reopening the
00:05:35
strait will require ground troops. Polymarket:
00:05:37
27% chance that US forces enter
00:05:41
Iran by the end of March and 57% by the
00:05:44
end of the year. So the sharps over at
00:05:45
Polymarket believe we will have boots
00:05:48
on the ground. Let me stop there. Brad,
00:05:50
your thoughts on what happens when oil
00:05:52
hits this kind of number and we have
00:05:54
this uncertainty of, hey, this could
00:05:56
last, you know, two more weeks or it
00:06:00
could last 6 months, it could last a
00:06:02
year. Nobody seems to know and how it
00:06:05
resolves. We just had a really
00:06:08
interesting talk with Graham Allison.
00:06:10
Um, how it resolves is also a major
00:06:13
unknown. Your thoughts?
00:06:14
>> Right. So first is obviously there are
00:06:17
huge direct costs as oil prices go up.
00:06:20
Right? Oil is a component of a lot of
00:06:21
consumer, you know, and enterprise
00:06:23
products, and it also hurts consumer
00:06:25
confidence, enterprise confidence. Goldman
00:06:27
Sachs is out today with some analysis
00:06:29
where they updated kind of the economic
00:06:31
knock-on effects. Right? So they raised
00:06:33
their PCE inflation forecast from 2.1 to
00:06:36
2.9 for the year. So that's a huge
00:06:39
jump in terms of their expected
00:06:43
PCE inflation. Core PCE, which
00:06:45
excludes oil, okay, they forecasted
00:06:48
up from 2.2 to 2.4. So they're saying
00:06:51
even if you excluded the direct price of
00:06:54
oil, the knock-on effects is going to
00:06:56
cause a little more inflation. They
00:06:57
lowered their GDP forecast by 30 basis
00:06:59
points for the year. And they also
00:07:02
expect higher unemployment as a result
00:07:04
of this for the year. All of that is
00:07:06
weighing on the sentiment of the market.
00:07:07
Remember, just a few months ago, the S&P
00:07:09
peaked at 24 times. Now we're at 21
00:07:11
times. But I think the market may be
00:07:13
getting it a little bit wrong. Right?
00:07:15
The Trump doctrine, you know, I tweeted
00:07:17
about this last week. I think the Trump
00:07:18
doctrine is far more pragmatic than the
00:07:22
neocon doctrine. Right? I think we I
00:07:24
think Trump has a very limited set of
00:07:26
goals. He wants to destroy and degrade
00:07:27
threats to America's national security
00:07:29
interests. He doesn't want to spread
00:07:31
democracy. So my suspicion is that these
00:07:34
impacts are shorter duration, but right
00:07:36
now the market's having a little bit of
00:07:38
post-traumatic stress flashbacks to
00:07:40
Afghanistan and Iraq and wondering if we
00:07:42
might be wandering into a quagmire.
00:07:45
>> All right. And just one note in
00:07:49
terms of the doctrine: he has said he
00:07:50
wants to see the people rise up
00:07:52
there. So it might be splitting hairs, but
00:07:54
I think uh he might actually be for
00:07:57
regime change. He says he wants the
00:07:58
regime to change
00:07:59
>> All things being equal, I don't think he minds if
00:08:01
the people bring it to themselves. The
00:08:02
question is whether the US is going to,
00:08:04
you know, put boots on the ground and
00:08:06
try to spread Wilsonian democracy like
00:08:08
like the Cheney doctrine was. And I
00:08:10
think this is very different.
00:08:11
>> Chamath, your take on the economic
00:08:13
impact and, you know, and any other uh,
00:08:16
things you'd like to add about the war
00:08:19
in Iran?
00:08:20
I think the most important thing that I
00:08:23
saw this week was I think President
00:08:26
Trump was asked about the war and he
00:08:28
said the war would be over very soon.
00:08:32
What did the market do? The market
00:08:34
literally took oil from 120 a barrel to
00:08:37
90 a barrel
00:08:40
almost in, you know, a nanosecond.
00:08:43
I think that that sort of tells you what
00:08:46
everybody thinks. To the extent that the
00:08:50
market really didn't believe it, oil
00:08:52
would not have budged and if anything,
00:08:54
it would have faded those comments and
00:08:56
you probably would have seen oil stay at
00:08:58
around 120 or even go slightly higher.
00:09:02
So the fact that there was this
00:09:03
reflexive move, I think is a belief by a
00:09:06
lot of the sharps that there is no path
00:09:09
to a sustained conflict. There's going
00:09:11
to be a lot of chest bumping from the
00:09:13
Iranians obviously because they need to
00:09:14
save face and they will want to set up
00:09:17
whoever comes next to have the most
00:09:19
successful chance of governing. So my
00:09:21
perspective is that was a trial balloon.
00:09:24
I think it validated what everybody
00:09:25
thought which is that this is going to
00:09:27
be a short-run thing. I agree with that.
00:09:31
The downstream impact is, I think,
00:09:33
correct what Brad said, which is it
00:09:35
could show up in some short-term price
00:09:38
spikes. But then on March 11th, you saw
00:09:41
what Chris Wright did, which is the
00:09:44
president activated a whole bunch of
00:09:46
member countries in the IEA, and I think
00:09:48
Chris released 172 million barrels. I
00:09:51
think there's a coordinated release of
00:09:52
about 400 million barrels of petroleum.
00:09:56
That's going to dampen the effect of any
00:09:57
price spike. On top of that, I think the
00:10:00
estimate is there's probably another
00:10:02
billion or so more barrels that one
00:10:04
could
00:10:05
release out of strategic stockpiles.
00:10:08
So, I think that both of these two
00:10:10
things together kind of paint a picture
00:10:12
that probably the worst is behind us.
00:10:14
And I think now it's about finding the
00:10:16
off-ramp.
00:10:19
Sax, your thoughts?
00:10:22
>> Well, I agree that we should try to find
00:10:23
the off-ramp. I mean, I agree with what
00:10:25
Brad and Chamath said about that. Look,
00:10:28
we've degraded Iranian capabilities
00:10:30
massively. Their army, navy, air force,
00:10:33
all been destroyed. This is a good time
00:10:36
to declare victory and get out. And that
00:10:37
is clearly what the markets would like
00:10:39
to see. You are seeing, however, a
00:10:42
faction of people, I'd say largely,
00:10:45
but not exclusively in the Republican
00:10:47
party, who want to escalate the war and
00:10:49
who are calling for things like ground
00:10:52
troops or regime change or they simply
00:10:56
want the pounding of Iran to just keep
00:10:58
going on and on. I saw an op-ed in the
00:11:00
Wall Street Journal to that effect that
00:11:02
we shouldn't try and find an off-ramp, we
00:11:03
should just keep going with this. And I
00:11:06
just want to lay out, I think, some of
00:11:08
the risks of what an escalatory approach
00:11:12
could entail. So, first of all, we're
00:11:14
all seeing that the Strait of Hormuz
00:11:16
is closed right now. We don't
00:11:18
want that to persist longer than it has
00:11:21
to, but there are actually worse
00:11:23
outcomes than that. So if the Iranians
00:11:27
get hit, if their oil and gas
00:11:28
infrastructure gets hit, they've already
00:11:31
said they're going to engage in tit-for-tat
00:11:33
retaliation against the Gulf
00:11:34
States. And we saw there was recently um
00:11:37
the Iranians blew up this giant oil
00:11:39
depot in Oman. You saw some of those
00:11:42
images. They could continue to target
00:11:44
the oil and gas infrastructure across
00:11:47
the Gulf States. And if that happens, it
00:11:50
won't really matter if the straits get
00:11:51
reopened because you won't be able to
00:11:53
restart oil and gas production in the
00:11:55
Middle East. So that would be, I think,
00:11:57
a much worse outcome that could result
00:11:59
from escalation.
00:12:01
Furthermore, there's an even worse, I
00:12:03
think, scenario from there, which is the
00:12:06
region is very dependent on desalination
00:12:08
plants. I think something like 70% of
00:12:11
Riyadh gets its water from desalination.
00:12:13
I think it's something like 100 million
00:12:14
people on the Arabian Peninsula that get
00:12:17
their water from desal. I mean, it's
00:12:19
basically a desert, right?
00:12:21
>> And those desal plants are soft targets.
00:12:23
You already saw there was I think there
00:12:25
was one desal plant in Iran that got
00:12:27
hit, and then it caused Iran, again tit
00:12:29
for tat, to hit a desal plant. I think it
00:12:32
was in Kuwait. I could be off about
00:12:33
that.
00:12:34
>> But any event, if you see that type of
00:12:36
destruction continue, you could
00:12:38
literally render the Gulf almost
00:12:41
uninhabitable. I mean, you're not going to
00:12:43
have enough water for 100 million people
00:12:45
and human beings just cannot survive
00:12:46
very long without water. So that would
00:12:49
be a truly catastrophic scenario and
00:12:52
we're talking about destroying the Gulf
00:12:55
States economically and then also from a
00:12:57
humanitarian perspective. So I think we
00:12:59
have to take things like this into
00:13:00
account when you hear people preaching
00:13:02
for or advocating for escalation. You
00:13:05
also have to I think consider the
00:13:06
impacts on Israel. I mean, it's hard
00:13:09
to know exactly how much damage Israel
00:13:11
is taking right now. There's a social
00:13:13
media blackout, but what you're starting
00:13:16
to hear trickle out is that Israel is
00:13:18
getting hit harder than they've ever
00:13:19
been hit before in their history. And
00:13:21
we're only two weeks into this. If this
00:13:22
war continues for weeks or months,
00:13:26
then large parts of Israel could just be
00:13:29
destroyed. Now, I
00:13:31
think Israel is a harder target than the
00:13:33
Gulf States. Their infrastructure is
00:13:35
more hardened. Also, they're further
00:13:36
away. The Gulf States are vulnerable to
00:13:39
drones and short-range missiles, whereas
00:13:42
Israel is mainly vulnerable to
00:13:43
long-range missiles. Nonetheless, at
00:13:46
some point, their air defenses could
00:13:48
become exhausted if it hasn't happened
00:13:50
already. And Israel could get seriously
00:13:52
destroyed. And then you have to worry
00:13:53
about Israel escalating the war by
00:13:56
contemplating using a nuclear weapon,
00:13:58
which would truly be catastrophic. So
00:14:02
there's a lot of scenarios here, a lot
00:14:04
of really
00:14:06
frightening scenarios about where
00:14:07
escalation could lead. And even though
00:14:10
the United States is a much more
00:14:11
powerful country than Iran, they
00:14:13
essentially have a dead man's switch
00:14:16
over the economic fate of the Gulf
00:14:18
States and even potentially beyond that,
00:14:21
you know, the habitability of some of
00:14:23
these these countries. So I do tend to
00:14:26
think that this is a good time to
00:14:28
declare victory. I think Brad, you're
00:14:30
right that the president has never said
00:14:32
that democracy promotion is one of his
00:14:34
objectives. Yes, J-Cal, obviously everyone
00:14:37
would welcome if the people rose up and
00:14:39
chose a new regime, but that's not
00:14:41
something that we've said we have to
00:14:43
accomplish. And this would be a really
00:14:46
good time to take stock of where we are
00:14:48
and try, I think, to seek an off-ramp.
00:14:51
And look, if escalation doesn't lead
00:14:53
anywhere good, then you have to think
00:14:55
about, well, how do you deescalate? And
00:14:57
deescalation, I think, involves reaching
00:14:59
some sort of ceasefire agreement or some
00:15:02
sort of negotiated settlement with Iran.
00:15:05
And we can get into more of what that
00:15:06
looks like. But I think that's the big
00:15:08
picture is that if escalation could lead
00:15:11
in all these horrifying directions, then
00:15:12
I think that's not the right approach.
00:15:14
You have to look at deescalation.
00:15:16
>> Jake, where are you on this?
00:15:18
>> Complicated. I have my personal feelings
00:15:19
on regime change. And since we don't
00:15:22
have the information the Mossad and the
00:15:25
CIA and Trump has, I do think Trump
00:15:28
would only do this if he had a very high
00:15:30
probability of success and an off-ramp.
00:15:33
However, it's not looking good with the
00:15:35
off-ramp right now and it could be quite
00:15:37
chaotic. I think if the neocons get
00:15:40
their way and the people on Polymarket
00:15:43
are correct, the sharps who say 57%
00:15:46
chance we'll have boots on the ground, I
00:15:47
think this is kind of the end of Trump's
00:15:50
second term. And if you were to put
00:15:52
together the series of mistakes that
00:15:55
he's made um and the administration's
00:15:58
made, they're really at the heart of why
00:16:02
people voted for him. You take starting
00:16:04
a war like this specifically with Iran.
00:16:06
That's what we were told was the reason
00:16:09
to vote for President Trump. He was not
00:16:11
going to take us down this path. He was
00:16:13
not going to
00:16:16
risk World War III. He was not going to
00:16:17
risk a nuclear possibility as as Sax
00:16:21
correctly points out. And now you have
00:16:24
all the MAGA supporters from, you know,
00:16:26
Tucker to MTG to Rogan, Matt Walsh,
00:16:29
Megyn Kelly, they are all up in arms
00:16:31
about this is the end of MAGA and this
00:16:34
is, you know, a massive betrayal.
00:16:36
There's the 1B betrayal where Trump
00:16:39
wouldn't release the Epstein files.
00:16:40
We'll put that aside because I don't
00:16:41
think that's as important as starting
00:16:43
World War III. And then obviously the
00:16:46
insane unnecessary cruelty we talked
00:16:48
about many times on this podcast of ICE
00:16:50
agents which he has corrected by getting
00:16:53
rid of Kristi Noem. So you start
00:16:54
putting these things together. If this
00:16:58
continues for another 6 months, it's
00:17:00
basically going to result in the
00:17:04
Democrats doing a clean sweep in the
00:17:07
midterms. Here's the chart that I think,
00:17:10
you know, the Republicans really need to
00:17:12
look at to see how misguided,
00:17:15
you know, uh, this all is. This is the
00:17:18
chart that should be absolutely
00:17:20
terrifying. Nobody wanted this war or
00:17:22
very few people wanted it besides the
00:17:24
neocons and the people of Iran probably
00:17:26
and the Israelis. But the chances of the
00:17:29
Democrats sweeping now is up to 45%.
00:17:32
This just happened. The Democrats are
00:17:35
going to sweep, then they're going to
00:17:36
win in 2028. and the entire agenda of uh
00:17:40
MAGA and Trump's 2.0 will be gone. And
00:17:43
then you look at just absolutely
00:17:45
ignoring the working man. Inflation
00:17:48
going up above 3% as you pointed out is
00:17:50
a likelihood, Brad. Unemployment ticking
00:17:53
up. Still very low, but it's ticked up
00:17:55
10%. Worth keeping an eye on. These
00:17:57
foreign affairs things are the least
00:18:00
important to the American people. It's
00:18:02
very, very low on the list of
00:18:04
priorities. And people are looking at
00:18:07
Trump and what they believe is the
00:18:09
enriching of his family and all these
00:18:11
business deals.
00:18:12
>> You're kitchen-sinking it.
00:18:13
>> It's literally what I was thinking.
00:18:15
>> You're bringing everything out: one, two,
00:18:17
three, four. Number one, doing the war
00:18:20
that everybody said he should not do,
00:18:22
and that was why we should
00:18:23
vote for him. Number two, the Epstein
00:18:25
files. Number three, the ICE cruelty.
00:18:26
And number four, not working for the
00:18:30
American working man who doesn't own
00:18:32
equities. Those are four: one, two, three, four.
00:18:34
It's not a kitchen sink. This is not my
00:18:36
personal feelings on this. This is my
00:18:37
assessment of the situation. If he
00:18:39
doesn't find an off-ramp quickly,
00:18:42
they're going to lose both houses in the
00:18:44
midterms. That's I think the thing Trump
00:18:47
needs to really consider and I think he
00:18:48
will consider. I think he's going to find
00:18:50
an off-ramp.
00:18:50
>> Right. That is the topic. The topic was
00:18:52
are we going to find an off-ramp or not
00:18:53
find an off-ramp in Iran? And I think Sax
00:18:56
made the argument that there's danger
00:18:59
that neocons and others are arguing that
00:19:01
we expand, put boots on the ground.
00:19:03
You're saying if he doesn't it'll be a
00:19:05
disaster. Chamath and I both say he
00:19:07
will. Right. And so the
00:19:10
>> wait will what?
00:19:11
>> He will find an off-ramp over you know
00:19:13
in the nearer term because the Trump
00:19:15
doctrine is not the neocon doctrine. As
00:19:18
much as people want to talk about Iran,
00:19:21
Iran, Iran, I think, as I explained last
00:19:23
week, this is about China, China.
00:19:26
And you have to remember at the end of
00:19:28
this month he has a pivotal 3 days with
00:19:32
Xi Jinping in China. This is going to be
00:19:34
an absolutely historic convening
00:19:38
of the two superpowers
00:19:42
that run the world. One which is us, we
00:19:45
are the established and one which is
00:19:47
China, which wants to be reascendant.
00:19:50
And I would bet dollars to donuts that
00:19:54
there is going to be an enormous
00:19:55
incentive for Xi to negotiate a grand
00:19:59
bargain in those three days
00:20:02
and do something historic for himself.
00:20:06
And I think that the president will use
00:20:08
that if he thinks that it creates
00:20:09
leverage. I
00:20:10
>> I think it's is a great insight. How
00:20:12
does the Strait of Hormuz open? If
00:20:14
this war is dragging on and Israel,
00:20:16
which seems to be the driving force in
00:20:19
this, if Israel keeps it up with uh
00:20:22
Iran, how do we ever get the strait
00:20:24
open again?
00:20:25
>> I think the off-ramp is that the United
00:20:27
States uh you know, declares victory,
00:20:29
does what Sax says and says, "Listen,
00:20:31
we degraded and we destroyed. That's
00:20:33
what we came here to do. We did not come
00:20:35
here for some experiment in
00:20:37
democracy. We wish the best to the
00:20:39
Iranian people to do the things they
00:20:41
need to do. And if Iran does not back
00:20:43
down, if after that declaration, Iran
00:20:45
continues to destroy cargo containers
00:20:48
moving through the narrow straits, I
00:20:50
think you're going to see Iran's
00:20:52
neighbors and Israel and others get very
00:20:56
involved as it pertains to Iran because
00:20:58
it's in their interest. Listen, the
00:20:59
United States produces 20 million
00:21:01
barrels of oil a day and we consume 20
00:21:03
million barrels a day. This is a this is
00:21:05
a modest problem for the United States.
00:21:07
This is a massive problem for China.
00:21:09
This is a massive problem for Asia. This
00:21:11
is a massive problem for all of our
00:21:13
friends in the Gulf who are trying to
00:21:14
dodge Iranian missiles right now. So,
00:21:16
there are a lot of people in the world
00:21:18
who will take up arms to deal with the
00:21:20
Iranians if the United States isn't
00:21:22
there because we can take care of
00:21:23
ourselves. >> Your position, Brad, just to
00:21:25
confirm it is we are going to leave the
00:21:28
war in the next 30 days and then if the
00:21:30
straits are not open then China, India
00:21:34
and all the Gulf uh countries that are
00:21:36
impacted by it they will protect it.
00:21:39
They will fight Iran.
00:21:40
>> I think they'll put a lot of pressure on
00:21:41
Iran not to continue firing missiles at
00:21:44
their ships. Right? At the end of the
00:21:45
day this is not just an American
00:21:47
problem. Right? And let's be clear we're
00:21:50
always involved in this part of the
00:21:52
world. The only question is, are we
00:21:54
going to have an active armada that's
00:21:56
engaged in active military activities
00:21:58
against Iran? And what I'm suggesting
00:22:00
again, and listen, anytime you try to
00:22:03
clean up a mess like this, there is
00:22:04
risk. This is not a risk-free, you know,
00:22:08
initiative by the United States, nor was
00:22:10
Venezuela. But let me steelman the
00:22:13
alternative: doing nothing and allowing
00:22:16
Iran to procure, you know, the
00:22:18
ingredients for a nuclear missile when
00:22:19
they are set on the destruction of the
00:22:21
United States and US interests. Doing
00:22:23
nothing in Venezuela while the Monroe
00:22:25
Doctrine is totally wrecked and we let
00:22:27
our adversaries take up, you know,
00:22:29
positions in in South America. Those
00:22:31
also have risks, right? Those carry a
00:22:34
lot of risks. And so we're
00:22:36
weighing these two risks. Again, for me,
00:22:38
I don't like the fact that we're engaged
00:22:40
in military activities here. But I will
00:22:42
tell you, I am very much on the side
00:22:44
that if we're going to go protect
00:22:46
American national security interest, you
00:22:48
go in, you do the degrading of their
00:22:51
capability and you get out. And I think,
00:22:54
you know, that's what I hear out of the
00:22:55
president.
00:22:56
>> Chamath, you had a follow-up.
00:22:58
>> All roads lead to China. I think that
00:23:00
there you're going to see Xi offer up a
00:23:03
grand bargain. And I think it's up to
00:23:04
the president to decide whether he wants
00:23:06
to take it and see what he wants to add
00:23:07
to it to get something done. But I just
00:23:09
don't see them meeting and coming out
00:23:11
with nothing. I think I see them going
00:23:13
in and coming out with something that's
00:23:15
historic.
00:23:17
And I think that all of this Venezuela
00:23:19
and Iran together is all about China.
00:23:22
>> Let me just say one thing as to that,
00:23:23
Chamath. Because I think the
00:23:25
point is absolutely spot-on, right?
00:23:28
Probably the single greatest takeaway
00:23:30
for us from an investment perspective at
00:23:32
the start of this war was that the
00:23:34
Chinese, right, didn't take up arms on
00:23:37
behalf of Iran
00:23:39
aren't defending Iran and they didn't
00:23:43
cancel the summit with the president,
00:23:45
right? The very fact that because they
00:23:47
need him, they need the oil. 20% of
00:23:51
their entire domestic consumption is oil
00:23:54
from Venezuela and Iran. 20%. But it's
00:23:57
not 20%. Because it's literally 100% of
00:24:00
anything that's feed stock, anything
00:24:02
that's transport, cars, buses, planes.
00:24:06
They are in an enormous world of hurt.
00:24:08
Now, they have a strategic petroleum
00:24:10
reserve as well, and it's quite robust,
00:24:13
but it's not robust enough to sustain
00:24:15
five or 6 months of this. It's not that
00:24:18
robust. So, at the end of the day, who
00:24:21
is going to be hurting the most? It is
00:24:23
China. And so if you play this game
00:24:25
theory out, the reason he kept it is
00:24:27
because now he needs a summit even more.
00:24:29
Could you imagine if the president
00:24:31
canceled, that would be a disaster for the
00:24:33
Chinese.
00:24:35
So the fact that it's still on the
00:24:36
books, if I was Xi, I'd be like, how do
00:24:39
I negotiate and help find the off-ramp?
00:24:42
How do I end up fixing this faster? All
00:24:46
right, remember you have 25%
00:24:48
unemployment of young men inside of
00:24:50
China. 25% today. What do you think it
00:24:53
goes to in five months with no oil?
00:24:57
>> It's in the
00:24:58
>> That's the unemployment rate you should
00:24:59
be focused on, Jason.
00:25:02
>> Oh, the China issue is a separate one
00:25:04
just to
00:25:05
>> It's not separate.
00:25:06
>> No, no, that was separate from my point
00:25:08
is my point. I I was bringing up a
00:25:09
different point.
00:25:10
>> Yeah, your kitchen sink didn't include
00:25:11
the Chinese. I get that. I'm just adding
00:25:13
to your kitchen sink.
00:25:14
>> I didn't have a kitchen sink. I have
00:25:15
four very salient points. All right,
00:25:17
Sax, I'll give you the final word here.
00:25:20
>> Well, look, that was a bit of a
00:25:21
broadside, J-Cal, where you kind of
00:25:23
did kitchen sink it. But look,
00:25:25
here's the part I'll agree with you
00:25:27
about, which is it doesn't take a
00:25:29
political genius to understand that long
00:25:31
wars are unpopular.
00:25:34
It will hurt the Republicans in the
00:25:36
midterms or '28 if this does turn into a
00:25:38
long war. Fortunately, I think the
00:25:40
president understands that. His political
00:25:42
instincts are impeccable, and he's always
00:25:44
favored
00:25:46
short, decisive, swift actions, military
00:25:49
actions, whether it was Midnight Hammer,
00:25:50
whether it was the Maduro raid. I think
00:25:53
that is his inclination and preference.
00:25:55
And I think we are pretty much at or
00:25:58
close to a point where the president's
00:26:00
going to have to decide on next steps. I
00:26:02
think he's indicated that we have
00:26:05
completed our objectives and I think
00:26:07
it's just important that we don't let
00:26:09
this neocon wing of the party try to
00:26:12
expand the objectives or aims of the war
00:26:16
because frankly they've always been
00:26:17
wrong about everything. I mean these are
00:26:19
people who never wanted to get out of
00:26:21
Iraq and Afghanistan; we would still be
00:26:23
there after 20 years if they had their
00:26:25
choice. So, I think it's just important
00:26:28
to not listen to those people. And look,
00:26:30
it's not just one op-ed in the Wall
00:26:32
Street Journal. The Wall Street Journal
00:26:34
is kind of the tip of the spear
00:26:35
representing that whole neocon
00:26:37
establishment. And I think it's just
00:26:39
important that this is the time to
00:26:41
frankly ignore those voices and let the
00:26:42
president do what I think his political
00:26:44
instincts are telling him to do, which
00:26:46
is to wrap this thing up.
00:26:48
>> I'm in strong agreement, and it is my
00:26:50
hope, too, that he wraps it up quickly
00:26:52
and that we don't have any more loss of
00:26:55
life. All right. We'll keep discussing
00:26:57
uh this ongoing breaking news story in
00:27:00
the coming weeks. But back to our zone
00:27:02
of excellence, AI and tech, OpenAI and
00:27:06
Anthropic are scaling revenue and costs
00:27:08
faster than we've ever seen in the
00:27:10
history of, well, business, the world.
00:27:13
Revenue at these two companies is growing
00:27:15
at, gosh, unprecedented levels. Here are
00:27:18
the reports, and I believe you're investors
00:28:20
in both these companies, Brad. Anthropic
00:27:22
hit a $14 billion run rate last month,
00:27:26
February. That means they have grown
00:27:28
revenue from 1 billion uh to 14 billion
00:27:32
in 14 months. They have 12x'd
00:27:34
year-over-year.
00:27:36
They're valued at a meager 380 billion
00:27:39
uh last month. This feels like a bargain
00:27:41
given the growth. OpenAI ended 2025 at a
00:27:44
20 billion annualized run rate and
00:27:47
they've grown revenue from 2 billion to
00:27:49
20 billion in 24 months. They're valued
00:27:51
at 840 billion last month. Uh, and man,
00:27:55
it looks like Sam Altman has Dario in
00:27:58
the rearview mirror. He could get lapped
00:28:02
any moment. Lots of debate.
00:28:03
>> Where did you find this? What the hell
00:28:05
is this?
00:28:05
>> That one? I made that. This is Dario
00:28:07
closing in.
00:28:09
>> What is that? A velociraptor? What is
00:28:11
that?
00:28:12
>> It's the famous scene from Jurassic
00:28:14
Park.
00:28:14
>> Oh, wow. But I mean, I don't think
00:28:16
anybody expected Dario to be coming
00:28:17
around the bend this fast, but he's
00:28:19
right behind apparently, and they're
00:28:21
obviously winning the
00:28:23
business-to-business side of the business. The J
00:28:25
curve on these companies is insane. Uh
00:28:28
250, 500 billion. Who knows what
00:28:30
gets invested before these companies
00:28:33
reach profitability, Brad, but you're
00:28:35
invested in these two companies.
00:28:37
>> Yeah.
00:28:37
>> Unless you sold when Sam Altman told
00:28:40
you he would buy his shares back on the
00:28:42
famous BG2 episode. I don't think you
00:28:44
sold it.
00:28:44
>> I bought a lot more since then, Jason. I
00:28:46
bought a lot more.
00:28:47
>> Fantastic. That's important information
00:28:48
for us to have. Quick question for you.
00:28:51
Number one, what's a better buy here?
00:28:53
>> Yeah.
00:28:53
>> Uh, Anthropic at 380, OpenAI at 840. Uh,
00:28:57
and then I think people want to know if
00:28:58
these companies are going to go public.
00:29:00
What you, if you think they should go
00:29:02
public, what is the chance of that
00:29:03
happening? Take those questions however
00:29:05
you like.
00:29:06
>> Well, I mean, listen, you know, love
00:29:08
your children equally. They're both
00:29:10
incredible companies. Anthropic
00:29:11
unquestionably has a lot of financial
00:29:13
momentum, you know, and OpenAI is seeing
00:29:16
a lot of momentum themselves, right? But
00:29:17
the single most important question this
00:29:19
year, right, was would AI revenue show
00:29:22
up? And just 60 days ago, 90 days ago,
00:29:26
there was tremendous skepticism. No way
00:29:28
all of these infrastructure investments
00:29:30
were going to pay off. There's no
00:29:32
incremental revenue coming out of AI,
00:29:34
including many of our friends. But in
00:29:37
February, we had in January and
00:29:39
February, we really had kind of a
00:29:41
nuclear moment, right? The splitting of
00:29:42
the atom moment. I mean, we had a $6
00:29:45
billion month out of Anthropic in
00:29:48
February, right? It was widely reported.
00:29:50
Okay, let that set in for a second,
00:29:53
right? $6 billion in a month, and it was
00:29:56
only a 28 day month. Okay, that's more
00:29:59
revenue than the annual revenue of
00:30:01
Databricks and Snowflake, two of the
00:30:03
greatest software companies of all time
00:30:05
after 12 years, right? They could do in
00:30:08
the first four or five months of this
00:30:09
year the total revenue of SpaceX this
00:30:12
year.
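For context, the run-rate arithmetic behind that comparison can be sketched. This is a rough illustration only: the $6 billion figure and the 28-day month come from the discussion above, the simple annualization method is our assumption (not how Anthropic reports revenue), and the function name is ours.

```python
# Annualize one month of revenue two common ways: a simple 12x
# multiplier, and a day-weighted extrapolation that corrects for
# a short (28-day) month. All figures in billions of dollars.

def annualize_monthly(revenue_billions: float, days_in_month: int = 30) -> dict:
    """Return simple and day-weighted annual run rates, in billions."""
    simple = revenue_billions * 12
    day_weighted = revenue_billions / days_in_month * 365
    return {"simple_12x": simple, "day_weighted": round(day_weighted, 1)}

# The $6B February (28 days) discussed above:
print(annualize_monthly(6.0, days_in_month=28))
# A 28-day month understates the pace: ~$78B/yr day-weighted vs $72B simple.
```

The gap between the two numbers is why a "$6 billion month" can imply a higher annual pace than multiplying by 12 suggests.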
00:30:12
>> What is driving that? Just explain to
00:30:14
the audience what's driving it. Is it
00:30:15
token use? Is it Claude subscriptions?
00:30:18
>> We crossed a threshold with Opus 4.6,
00:30:22
right? And we saw it again with ChatGPT
00:30:25
5.4 before, where the models and the
00:30:27
agents on top of them, whether it's Claude
00:30:29
Code, Codex, or ChatGPT, are no longer
00:30:32
competing with IT budgets. They're now
00:30:34
augmenting labor; they're competing with
00:30:36
labor budgets. You could not possibly
00:30:38
have a $6 billion month. It is impossible
00:30:41
to do that by displacing IT budgets.
00:30:44
Millions of other companies across
00:30:45
America say, oh my god, let's spin up
00:30:48
these agents and have them do things for
00:30:49
us, and we're willing to pay for it,
00:30:51
because the product of that effort is
00:30:53
worth the money to them. And the revenue and
00:30:56
the usage momentum, I will tell you in
00:30:58
the month of March continues and it only
00:31:00
accelerates from here. As Kevin Weil has
00:31:03
said, the models and the agents are the
00:31:05
dumbest today they will ever be. Right?
00:31:08
We're in the early innings of compute
00:31:10
and algorithmic uh capability. And so,
00:31:14
you know, like that to me is the
00:31:15
observation at this moment. Should they
00:31:17
go public? I've said yes, they should go
00:31:19
public for several reasons. There's tons
00:31:20
of institutional demand. They need cheap
00:31:23
access to money to continue to build out
00:31:25
uh the compute they need to support.
00:31:27
There is more compute
00:31:28
constraint in these businesses this very
00:31:31
day than they've had any time in the
00:31:33
last 3 years. So they need access to the
00:31:34
capital. And then finally, I think you
00:31:37
have to have the retail investor in the
00:31:38
game. These are two of the most
00:31:40
important companies in the history of
00:31:41
capitalism in the history of America.
00:31:43
It's destabilizing not to have them
00:31:45
public. You know, Jensen said last week
00:31:48
that he expected the 40 billion he
00:31:50
recently invested in these two companies
00:31:52
would be his last money in because they
00:31:54
were both going to go public. He said
00:31:56
they would both go public
00:31:58
this year. I think that, you know,
00:31:59
they're they're preparing and heading
00:32:01
down that path, you know, and and
00:32:03
listen, I want to get some of these
00:32:05
shares in the accounts of all these kids
00:32:06
that we're opening up, because they're
00:32:08
really really important companies to the
00:32:10
future of the American economy. Chamath,
00:32:12
you had some insight into the quality and
00:32:16
durability of this revenue.
>> There's not
00:32:18
a single good example that we can find
00:32:22
of sustained positive
00:32:25
margin expansion and impact of AI inside
00:32:29
of a true corporate enterprise that is
00:32:32
not right now a small test. There's not.
00:32:35
>> So where does six billion come from?
00:32:38
Because everybody has to show up to
00:32:40
their board and have an AI checkbox and
00:32:43
everybody means thousands and thousands of
00:32:46
companies. And when you have tens of
00:32:48
thousands of companies as customers
00:32:49
paying $200 plus a month, it's not that
00:32:52
hard to show up with that kind of
00:32:54
revenue. The real question is the
00:32:56
following. If you use the
00:32:59
Databricks and Snowflake example, if you
00:33:01
look at the companies that use that
00:33:04
software,
00:33:06
those companies generate enormous
00:33:10
revenues and enormous margins and these
00:33:13
products are in critical production
00:33:16
workflows that underlie those revenues
00:33:19
and profits. That is just not true with
00:33:21
AI today. We have all kinds of claims,
00:33:24
but we are still experimenting.
00:33:27
Why are we experimenting? Because we
00:33:29
know it's important, but we don't yet
00:33:32
really know what to do. You can't just
00:33:34
slot this into a critical workflow in
00:33:36
healthcare and all of a sudden show up
00:33:39
where if you make a misdiagnosis or if
00:33:41
you make a mischaracterization of a
00:33:44
procedure, you can get fined and go to
00:33:46
jail. The companies that are in
00:33:48
healthcare don't do that. If you're in
00:33:50
financial services and you make a
00:33:52
mistake about somebody's portfolio or
00:33:54
you make a misallocation and you point
00:33:55
to a model, you will get sued and you
00:33:58
will be in trouble. None of these things
00:34:01
have transitioned from it's interesting,
00:34:03
it's experimental to it's the core
00:34:06
critical operational workflow.
00:34:08
>> Just interesting.
00:34:09
>> There will be a transition in revenue
00:34:11
quality when that happens. A great
00:34:14
example of this is Amazon. Why does
00:34:16
Amazon issue an edict that says you
00:34:19
cannot use this stuff inside of AWS
00:34:22
unless a human now reviews and approves
00:34:24
it? Because what happened? They had
00:34:26
three or four sev-1 faults from a
00:34:29
bunch of code that was written by agents
00:34:31
that brought down AWS. Now look, I've
00:34:33
told you I love AWS for one reason,
00:34:36
because it's hyper reliable. I hate AWS
00:34:39
for the same reason: that hyper
00:34:40
reliability comes at enormous cost. I
00:34:42
pay it, but I pay it to never have a
00:34:45
sev-1. The reason they have 12 9s of
00:34:48
accuracy is because it's humans and
00:34:50
deterministic code that never fails. It
00:34:52
doesn't mean that two companies can't
00:34:54
get to 20, 30, 40 billion of revenue.
00:34:56
What it means is we have to be honest.
00:34:58
This is an industry that's early. We are
00:35:01
all figuring it out. There's a lot of
00:35:04
test budgets that are going at it. It
00:35:06
will slowly and methodically emerge into
00:35:09
production. But let's not oversell what
00:35:12
this moment is.
00:35:13
>> Okay, Brad, I want to give you a well-
00:35:14
constructed question here to respond and
00:35:16
then Sax, we'll go to you if you have
00:35:18
some input. Of the 20 billion, how much
00:35:20
of it do you think is experimental? What
00:35:24
percentage is experimental versus
00:35:26
production? But strip out the
00:35:29
consumer spending, because like you
00:35:31
know, that's half of it, right?
00:35:33
>> Okay, so I'll put aside the
00:35:35
consumer side.
>> Great idea. We'll put the
00:35:36
consumer subscriptions aside. They're
00:35:38
obviously getting value or they wouldn't
00:35:39
be subscribing, but it's like Netflix.
00:35:41
>> And by the way, for that where it can be
00:35:44
extremely faulty and there's no SLA that
00:35:46
you're giving,
00:35:47
>> It's phenomenal. These are phenomenal
00:35:49
products.
00:35:50
>> Yes. For 20 bucks a month, well worth
00:35:51
it. And consumers have decided it's
00:35:53
worth it. So, I think we're in agreement
00:35:54
there.
00:35:55
>> And also for the individual engineer, of
00:35:58
which I suspect there's a few million,
00:36:00
who get to pay $200 a month and have
00:36:02
their company subsidize it. The company
00:36:04
knows that these costs are being
00:36:06
incurred, but there is no tick and tying
00:36:09
at the end of it where then you review
00:36:10
the code in a different way because you
00:36:12
you you're worried that there's
00:36:13
hallucinations as Amazon just
00:36:14
demonstrated.
00:36:15
>> Yeah, there's a
00:36:17
story in the FT about Amazon having some
00:36:19
blast radius from some computer-generated
00:36:22
AI code, and they're putting controls in
00:36:24
place. We'll put that in the show notes.
00:36:25
Brad, let's get to the specific question
00:36:27
I asked. Of the tens of billions of
00:36:29
dollars in revenue between the two
00:36:30
companies that's not consumer, what
00:36:33
percentage do you think is production
00:36:35
quality versus experimental, to Chamath's
00:36:38
point?
00:36:38
>> Yeah, I mean, listen, you
00:36:40
know, I coined the phrase experimental
00:36:42
run rate revenue, right, versus annual
00:36:45
recurring revenue, right? Like I think
00:36:46
Chamath's point is a really
00:36:48
important one. As an investor, I have to
00:36:50
discern what's repeating, right? What's
00:36:52
recurring and what's not. What I would
00:36:55
suggest is of course there's a lot of
00:36:56
experimentation because these things
00:36:58
haven't been around that long. But I
00:37:00
suspect, right, that Palantir, the US
00:37:04
government, the US military, Nvidia, and
00:37:08
a lot of other major enterprises would
00:37:11
argue they've gone full production. In
00:37:14
fact, it's existential to the wartime
00:37:16
effort going on in Iran right now. That
00:37:18
doesn't sound to me like experimental as
00:37:20
much as it sounds like production
00:37:22
capability. And I will tell you what
00:37:24
will prove this one way or the other is
00:37:27
in the month of March, do revenues
00:37:29
continue and go up? Right? Does the
00:37:32
experimentation go on forever? That's
00:37:34
not true. It's not true that the
00:37:36
experimentation goes on forever. That
00:37:38
sounds like recurring.
00:37:39
>> We've scratched the surface of the
00:37:41
number of companies that even know how
00:37:42
to adopt AI. So these numbers will go to
00:37:44
the stratosphere. I'm not debating that.
00:37:46
>> Okay.
00:37:47
>> Look, I'm on the same side of the bed as
00:37:49
you are. I want these numbers to keep
00:37:50
going to the moon. I'm just being much
00:37:52
more circumspect and honest with myself
00:37:54
to say I see it on the ground. I sit on
00:37:57
top of these models. I am paying these
00:37:59
models millions of dollars a year. I am.
00:38:01
>> Yeah.
00:38:02
>> And what I'm telling you is my revenues
00:38:03
don't go up faster than their revenues.
00:38:05
I'm consuming more tokens every single
00:38:08
day. Do I get more economic output? I am
00:38:11
not. And I would say that my team is at
00:38:13
the leading edge. And so I suspect a
00:38:15
Fortune 1000 company is steps behind my
00:38:18
team. And if I am spending triple every
00:38:21
3 months and not seeing my revenues
00:38:24
tripling, I suspect these other
00:38:26
companies are in a similar situation.
00:38:28
>> I'd finalize it. But ask it on Friday
00:38:30
when you're with Michael Dell because
00:38:32
I've had this conversation recently with
00:38:33
Michael Dell and Michael said a year ago
00:38:35
companies weren't seeing ROI. Today
00:38:37
they're seeing very big ROI in their AI
00:38:39
investments. But I think that's the
00:38:40
question on the table. But
00:38:42
>> What companies? Which companies? Of
00:42:44
course the sector,
00:42:46
of course, is seeing an ROI. Sax, we're
00:38:49
agreeing. This is experimental in large
00:38:52
part and this is a new tool. So by
00:38:56
definition you have to experiment before
00:38:57
you put it in production. What's your
00:38:59
take on this grand debate? How much of
00:39:01
this revenue is experimental versus
00:39:04
real?
00:39:04
>> Well look when you're talking about
00:39:06
enterprise revenue, what you're really
00:39:08
talking about is coding assistance.
00:39:10
That's been the breakout use case. It's
00:39:12
really the first big breakout use case
00:39:14
on the enterprise side. The consumer
00:39:16
side's been more of like you know
00:39:18
research and writing that kind of stuff
00:39:20
the chat bots but enterprise has all
00:39:23
been about coding assistance. My sense
00:39:25
is that the demand for code is very
00:39:27
scalable. Software engineering has always
00:39:29
been an area of the economy where
00:39:31
companies have never been able to hire
00:39:32
enough. Even in Silicon Valley, which is
00:39:33
the most attractive place for software
00:39:36
engineers to work, we've never been able
00:39:37
to recruit and attract enough of them.
00:39:41
The rate limiting factor on the progress
00:39:43
of every startup I've ever invested in
00:39:45
is not having enough engineers to code
00:39:47
up the product roadmap. And then you
00:39:50
look at the rest of the economy, the
00:39:51
Fortune 500 and so forth and so on. They
00:39:53
have hardly been able to recruit
00:39:55
software engineers at all because
00:39:57
they've all gone to Silicon Valley. So I
00:39:59
think you're dealing with a part of the
00:40:00
economy where there's always been a
00:40:02
massive supply shortage. And I don't
00:40:04
know what the natural limit on that is,
00:40:06
but my sense is that there's a
00:40:08
tremendous latent demand for the ability
00:40:11
to generate code in large quantities,
00:40:16
create new products. You know, as the
00:40:18
cost of code goes down, as the coding
00:40:20
assistants get better, you can code up
00:40:22
new types of products. And then, of
00:40:24
course, it's going to lead to agents,
00:40:25
which is another way of basically using
00:40:28
the code that gets generated. So, my
00:40:30
sense is that this could be very
00:40:32
scalable. I don't know where it taps
00:40:34
out exactly. Where I think Chamath is
00:40:37
right is that I think there is a change
00:40:39
management aspect to this in Fortune 500
00:40:43
companies for example and they haven't
00:40:44
really wrapped their heads around how
00:40:46
exactly they're going to use this.
00:40:48
There's a McKinsey study that showed
00:40:50
that a lot of these pilot projects in
00:40:52
Fortune 500 companies were experimental.
00:40:54
A lot of them were proving not to be
00:40:55
successful. So I do think like as you go
00:40:58
beyond coding into you know company
00:41:02
transformation things like that it
00:41:03
becomes a little bit more speculative.
00:41:05
That's not to say it won't happen. I
00:41:07
think it will happen. I'm actually I'm
00:41:09
bullish. But I do think that
00:41:12
we're still waiting to see what the
00:41:13
breakout use cases beyond coding will
00:41:15
be. Probably agents will be the next big
00:41:17
one. But I think Brad's right that
00:41:18
that's big enough to see, you know, this
00:41:21
scale for a while because, you know, the
00:41:23
thing about code is
00:41:25
you're paying for code on a metered
00:41:27
basis right now. You're paying per
00:41:28
token, which is kind of an amazing deal
00:41:31
for companies, right? Because before
00:41:33
they had to go through this recruiting
00:41:34
process to find engineers, source them,
00:41:38
vet them, you know, keep them happy,
00:41:40
give them all the perks, the Kind bars,
00:41:43
all, you know, uh, and so to be able to
00:41:45
buy code on a metered basis as the cost
00:41:48
per token keeps going down.
00:41:50
>> It was kind of metered before. It was just
00:41:52
metered on people. Now, to your
00:41:54
point, it's metered. It's metered
00:41:56
in a different way, but it's still
00:41:57
metered.
00:41:58
>> Yeah. And let me just, you know, you use
00:42:00
this term labor displacement, Brad.
00:42:02
That's like the one part where I might
00:42:04
disagree with you is because there was
00:42:06
such a shortage.
00:42:08
>> Yeah.
00:42:08
>> Of software engineers. I think when
00:42:10
people hear the word labor or term labor
00:42:12
displacement, they might
00:42:14
>> start to think that 6 billion of
00:42:16
incremental revenue means 6 billion of
00:42:19
layoffs. No.
00:42:20
>> And I don't think it does. And and the
00:42:21
way to to thread that needle is the fact
00:42:23
that we were artificially constrained in
00:42:26
the number of software engineers and how
00:42:28
they could be used and how rapidly they
00:42:31
could be acquired and all that kind of
00:42:32
stuff. So to be able to now turn that on
00:42:35
like electricity and that's kind of what
00:42:36
we're talking about.
00:42:37
>> Exactly.
00:42:38
>> Is just such a huge
00:42:40
>> game changer and unlock for the whole
00:42:42
economy.
00:42:43
>> Yes.
00:42:43
>> And that's what I think is really
00:42:44
exciting about it.
00:42:45
>> It's augmenting it's augmenting human
00:42:47
labor, right? We're not at a place yet
00:42:49
where it's displacing. And this is the
00:42:51
definition of productivity gains.
00:42:53
>> I'm going to make just uh two quick
00:42:54
points here. The place to look for this
00:42:57
actually moving from experimental into
00:43:00
production is not not at big companies.
00:43:02
Big companies are actively resisting
00:43:04
this. Management in big companies will
00:43:06
resist it because it means lowering
00:43:07
headcount and it means the person who
00:43:09
implements it might actually implement
00:43:10
themselves out of a job. So that is the
00:43:13
natural resistance you'll see in big
00:43:14
companies. That's not where to look for
00:43:15
adoption of new tech.
00:43:16
>> That's not why it's happening. That's
00:43:18
not why
00:43:18
>> let me finish and then you can you can
00:43:20
counter it.
00:43:21
>> Startups are the place to look at this
00:43:24
and that's where I am on the ground. And
00:43:26
what I'm seeing there is that startups
00:43:29
are using this in production for their
00:43:31
legal work, for their marketing, for SDRs,
00:43:35
for their accounting, for reviewing legal
00:43:38
documents. This is all work that
00:43:40
they would normally hire
00:43:42
consultants for, outsource, or make
00:43:45
hires for. And what I'm seeing on the
00:43:47
ground is it's production ready in
00:43:49
startups who are using it in those
00:43:51
categories. HR as well, accounting,
00:43:54
marketing, all of that, all that
00:43:56
blocking and tackling, all those chores
00:43:57
are being done currently with these LLMs
00:44:00
and they're doing it in production and
00:44:01
they're doing it at scale. Just a quick
00:44:03
second point here. Here's the J curve.
00:44:05
And this is the question I think we'll
00:44:07
get to in our next segment. When does
00:44:08
this become a profitable business? If uh
00:44:11
and you asked this of Sam in that famous
00:44:13
clip on the BG2 podcast, RIP BG2, here
00:44:18
you go. The LLM industry J curve, I just
00:44:20
asked Claude to make this for me. Um, if
00:44:22
you have 500 billion, I think you would
00:44:24
agree it's probably going to be around
00:44:25
that number, Brad, invested in all of
00:44:27
this. And then
00:44:28
>> More, a lot more.
00:44:30
>> Okay. So, 500 billion is an
00:44:32
underestimate here. And then when do we
00:44:34
actually see these large language model
00:44:35
companies hit profitability in a
00:44:38
calendar year? It took Tesla, Uber,
00:44:42
Amazon, you know, a decade plus in each of
00:44:44
those cases
00:44:45
>> to win back their investment. 10
00:44:47
billion.
00:44:47
>> This is a really good chart. Here's the
00:44:49
precise math on this. So, I am building
00:44:51
a 1-gigawatt data center in Arizona.
00:44:53
>> Okay.
00:44:54
>> When I green lit that project, I thought
00:44:56
it was going to be a four or five
00:44:57
billion investment. I was like, okay,
00:44:59
whatever. Then it went to 10. Then it
00:45:01
went to 15. Then it went to 20. And now
00:45:03
it's upwards of $50 billion for the
00:45:06
powered shell, for all the land, for all
00:45:08
the permits, then for all the
00:45:09
infrastructure, all the people, all of
00:45:12
it. Okay. Sarah Friar said, I think it
00:45:15
was about a year ago, maybe less than a
00:45:17
year ago, that for them, every gigawatt
00:45:19
is about 10 billion of annual revenue.
00:45:21
So if you think about that J curve,
00:45:23
Jason, really the way to think about it
00:45:24
is look, energy equals intelligence. For
00:45:27
every gigawatt that they're trying to
00:45:28
spend, a five-year payback is
00:45:30
roughly what it means just to get to
00:45:32
break even. And then year six, seven,
00:45:33
and eight will be where the profit is.
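The payback math just described can be sketched as a rough calculation. This is an illustration under the figures quoted in the conversation (~$50B of capex per gigawatt, ~$10B of annual revenue per gigawatt); it ignores margins, financing costs, and depreciation, so it is the most optimistic "revenue payback" framing, and the function name is ours.

```python
# Rough data-center payback sketch using the figures discussed:
# ~$50B capex per gigawatt, ~$10B/yr of revenue per gigawatt.
# No discounting and no operating costs, so this is a floor, not a forecast.

def payback_years(capex_billions: float, annual_revenue_billions: float) -> float:
    """Years of revenue needed to recoup the upfront capex."""
    return capex_billions / annual_revenue_billions

years = payback_years(capex_billions=50.0, annual_revenue_billions=10.0)
print(f"Break-even after ~{years:.0f} years of revenue")
```

With those inputs the break-even lands at roughly five years, which is why the conversation puts the profit in years six, seven, and eight; cheaper silicon or higher revenue per gigawatt shrinks that J curve.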
00:45:35
Now, how do you shrink the Jaker? You
00:45:37
have better silicon. We're going to see
00:45:39
something from Jensen in a week or
00:45:41
two that uses a bunch of the stuff that
00:45:44
we partnered with him at Grocon on.
00:45:46
There'll be other people. There'll be
00:45:48
open source. So, all those things can
00:45:50
shrink the depth and the surface area of
00:45:52
that J curve so that you can get out of
00:45:54
it faster. But right now, that thing
00:45:56
is roughly accurate, which is it's about
00:45:58
50 billion per gigawatt and it's about a
00:46:02
five to six year payback just to get
00:46:04
into the money and then it's about 10
00:46:05
billion a year and the technology
00:46:08
industry has to do something to make
00:46:10
this better. Could I though
00:46:13
take a step back and give you just a
00:46:14
different framing of all of this
00:46:17
>> please?
00:46:18
I think the big thing that we're
00:46:19
debating is actually something we've
00:46:22
seen in every other technology trend
00:46:26
when it started to get some really
00:46:28
meaningful traction. So in the first
00:46:30
generation of the internet when you
00:46:31
started to see e-commerce and all these
00:46:33
other business models then in the second
00:46:35
big wave of the internet around the move
00:46:37
to mobile and the move to social and
00:46:39
then now we're seeing this big wave
00:46:40
around AI and I think what happens is in
00:46:45
step one
00:46:47
entrepreneurs
00:46:48
are AB testing
00:46:51
what it takes to raise money. Okay,
00:46:54
that's step one.
00:46:56
And I think what has happened is that at
00:46:59
least some parts of the AI ecosystem
00:47:02
have decided that this crazy
00:47:06
scary doomerism is the best way to raise
00:47:09
money, where every now and then they come
00:47:12
out and they say all the jobs will be
00:47:14
destroyed. Anthropic, you know, Dario says
00:47:16
that this thing is sentient, and
00:47:18
investors are like, okay, here's 10
00:47:21
billion, here's 50 billion, here's 100
00:47:22
billion. But then the second step happens:
00:47:25
they get the money, they start to do the
00:47:27
training, they start selling, and then
00:47:29
the investors are like, "Hey, where's
00:47:31
the revenue?" And so then they start
00:47:32
selling everywhere.
00:47:34
And then if you see in the Department of
00:47:37
War example, all of a sudden you
00:47:39
flip-flop, you become sort of an
00:47:40
unserious, dilettante-like partner to the
00:47:43
American government. They're like,
00:47:44
"We're going to boot you out. That's
00:47:45
billions of revenue gone." And what
00:47:47
happens? Those same investors that gave
00:47:48
billions of dollars are like, "Hey, hold
00:47:49
on a second. That's absolutely not
00:47:51
allowed. You need to conform and get
00:47:53
back on track." And so what does Dario
00:47:54
do? He flip-flops and he's like, "Oh,
00:47:56
I'm really sorry. I didn't mean to.
00:47:58
Let's sort of make good." All of that to
00:48:00
me is an industry that's still in its
00:48:02
very early phases and still figuring out
00:48:05
what its place in society is. So, what
00:48:08
is the problem? The problem is the
00:48:10
following two clips, and I'll just have
00:48:12
Nick play these, and I'd love your guys'
00:48:14
reaction.
>> The one thing, though, that I
00:48:17
think even now is underestimated by all
00:48:20
actors in industry, including in
00:48:23
Silicon Valley is how disruptive these
00:48:26
technologies are. If you are going to
00:48:28
disrupt the economic and therefore
00:48:30
political power significantly of one
00:48:33
party's base, highly educated, often
00:48:36
female voters who vote mostly Democrat,
00:48:39
and military and working-class people who
00:48:42
do not feel supported, and you believe
00:48:45
that that's going to
00:48:46
work out politically, you're in an
00:48:48
insane asylum. This
00:48:51
technology disrupts
00:48:54
humanities-trained, largely Democratic voters and
00:48:57
makes their economic power less, and
00:49:00
increases the economic power of
00:49:03
vocationally trained, working-class, often
00:49:05
male voters. And so these
00:49:09
disruptions are going to disrupt every
00:49:11
aspect of our society. And to make
00:49:15
this work, we have to come to an
00:49:17
agreement of what it is we're going to
00:49:20
do with the technology. How are we going
00:49:22
to explain to people who are likely
00:49:24
going to have less good and less
00:49:26
interesting jobs from their perspective?
00:49:29
And how is it that we are going to? By
00:49:32
the way, on the military thing, these
00:49:34
technologies are dangerous to society. The
00:49:37
only justification you could possibly
00:49:39
have would be that if we don't do it,
00:49:41
our adversaries will do it and we
00:49:44
will be subject to their rule of law. So
00:49:47
if you decouple this from the
00:49:48
support of the military, you're going to
00:49:50
have an enormous problem explaining to
00:49:52
the American people why is it that we're
00:49:55
absorbing the risk of disrupting the
00:49:58
very fabric of our society, including
00:50:00
the most powerful
00:50:02
parts of our society, if it's not
00:50:05
because it's about maintaining our
00:50:07
ability to be American in the near term
00:50:09
and long term.
00:50:10
>> Now watch Sam's reaction. Fundamentally,
00:50:13
our business and I think the business of
00:50:15
every other model provider is going to
00:50:17
look like selling tokens. But we see a
00:50:22
future where intelligence is a utility
00:50:26
like electricity or water and people buy
00:50:30
it from us um on a meter and use it for
00:50:34
whatever they want to use it for. So if
00:50:36
you take those three messaging veins on
00:50:38
a spectrum, one is we have a sentient
00:50:41
super god. We're the only ones that can
00:50:43
protect you from it, but you know your
00:50:45
days are numbered. That's Dario. Alex,
00:50:47
which is, hey, hold on a second. You
00:50:48
can't have it both ways. You can't both
00:50:51
say it on the one hand and then try to
00:50:52
run the fabric of society and flip it.
00:50:54
You need to be much more circumspect.
00:50:56
And then Sam's, which is we want to sell
00:50:58
tokens as a service.
00:50:59
I think the point is that this industry
00:51:01
right now that revenue traction if
00:51:04
anything else has distracted people from
00:51:07
actually getting on the same page and
00:51:08
being much more methodical and much more
00:51:11
reliable and trustworthy in explaining
00:51:14
all of this and managing the expansion
00:51:16
of this. And so what I would say is all
00:51:19
of this fundraising gobbledegook has
00:51:21
actually created this breathlessness
00:51:23
that is not useful and isn't helping.
00:51:27
And I would say there needs to be a lot
00:51:28
more seriousness by these folks to
00:51:30
actually run this business thoughtfully.
00:51:32
You can't be a dilettante. You can't
00:51:34
flip-flop. You can't pressure-test and A/B
00:51:37
test this kind of messaging in public.
00:51:40
But I understand why you're doing it
00:51:41
because the stakes are so high. You're
00:51:43
playing this enormous poker game. But I
00:51:45
think we need to do a better job of
00:51:47
explaining all this to people because
00:51:48
right now my end of this is look at this
00:51:50
chart. This is now the result of those
00:51:53
three messages. Here is where AI is. It
00:51:55
is slightly above the Democratic party
00:51:58
and an autocratic state. That's where AI
00:52:01
is. ICE is more popular than AI.
00:52:05
So
00:52:06
>> not very popular.
00:52:08
>> So to me, this is really at the crux of
00:52:11
this where we are not really being
00:52:13
honest. It would be much better if we
00:52:16
said soberly, there's a lot of
00:52:17
experimenting. This revenue is great,
00:52:19
but we don't really know what's real.
00:52:21
We're going to try to figure it out.
00:52:22
We're going to work methodically.
00:52:23
There's a lot of regulated industries.
00:52:25
We're going to work within those. We're
00:52:26
not going to flout the law and the
00:52:28
rules. Licensure will still mean
00:52:30
something. That's a way better,
00:52:32
thoughtful, mature message.
00:52:36
And what do you think?
00:52:37
>> End rant. Great rant. Brad, does the
00:52:41
industry have a PR problem? Obviously,
00:52:43
these recent surveys and especially
00:52:46
comparing them to China where people see
00:52:49
AI as abundance and like this incredible
00:52:51
new technology they want to embrace.
00:52:52
Here, people are scared. People are
00:52:54
scared they're going to lose their job.
00:52:56
People are scared about wealth
00:52:58
disparity. The rich get richer, the poor
00:53:00
get poorer. There's a lot of fear here
00:53:02
in the United States. What can our
00:53:05
industry do to turn this around in terms
00:53:08
of communication from the big companies?
00:53:11
They don't seem to be communicating in
00:53:13
any coordinated fashion and and they
00:53:16
obviously are scaring the [ __ ] out of
00:53:17
the public.
00:53:19
>> Yeah. No, listen, I think it's uh I
00:53:20
think it's a fair rant and a fair point.
00:53:22
At the start of the industrial
00:53:24
revolution, in
00:53:26
the late 1800s, we had similar
00:53:29
social responses to innovations that
00:53:31
were occurring. We in fact had some
00:53:33
violent clashes. We had demonstrations
00:53:36
in the street. We had the entire robber
00:53:38
baron movement, you know, so class warfare
00:53:40
and worse has come with
00:53:43
other industrial
00:53:46
changes of this magnitude. So it doesn't
00:53:48
surprise me that we have a lot of
00:53:50
anxiety among people that they may lose
00:53:52
their job. And I think there are people
00:53:54
out there who are forecasting
00:53:56
into the future in ways that are scary
00:53:59
to the average person
00:54:01
who's listening to this, and I don't
00:54:02
think that's particularly helpful. So
00:54:04
could we do a better job messaging? No
00:54:06
doubt about it. But if I just rewind to
00:54:09
kind of where we started, I actually
00:54:11
think, you know, this
00:54:15
is going to be a pivotal year for the
00:54:16
industry to demonstrate how this
00:54:19
is really beneficial for humanity. I
00:54:22
think we're going to be able to
00:54:23
demonstrate that it's very beneficial
00:54:25
from a health care perspective, from a
00:54:27
drug discovery perspective, from an
00:54:29
education perspective, etc. But we need
00:54:31
to have a coordinated effort because
00:54:32
Chamath is right, it is deeply unpopular in the
00:54:35
country. I happen to be on the
00:54:37
optimistic side of this. 70% of the jobs
00:54:40
that exist in the United States today
00:54:42
did not exist 40 years ago. Right.
00:54:44
Right. We've gone through the digital
00:54:47
disruption that put a lot of people out
00:54:49
of work. But the abundance and the and
00:54:52
the recreation of new jobs, right,
00:54:54
expanded the pie for for largely
00:54:57
everyone. I think that will be the case
00:54:58
here. If you listen to Dario, he says
00:55:00
the concern is that the disruption
00:55:02
occurs at a faster and
00:55:04
bigger rate, so that we can't keep up
00:55:06
with that replacement. I think
00:55:08
that's another fine point. But if we
00:55:10
just go back to where we started the
00:55:11
conversation, which was are these good
00:55:14
investments, right? Do
00:55:16
>> That's not the conversation. No, of
00:55:18
course they're good investments. Of
00:55:19
course you're going to make money.
00:55:20
>> No, no, no. That's not what it's about.
00:55:22
>> That's not it.
00:55:23
>> He asked the question. You made the
00:55:24
argument. Jason asked the question,
00:55:27
right? Are these companies simply
00:55:29
selling tokens at a loss? Right? And
00:55:32
we moved into this.
00:55:33
>> No, no, no. They're they're selling at a
00:55:35
profit. I'm buying them and losing
00:55:36
money.
00:55:37
>> Right.
00:55:38
>> In the 1849 gold rush,
00:55:40
>> Anthropic and OpenAI and all of these
00:55:42
model makers are selling the pick and
00:55:44
shovel in the gold rush.
00:55:46
>> I am buying it and I'm trying to pan for
00:55:48
gold. But as with the gold rush, most of
00:55:51
these companies will go out of business.
00:55:53
And all I'm saying is if we are really
00:55:55
circumspect and honest, there is still
00:55:58
way more to figure out than has been
00:56:00
figured out. This is not a solved
00:56:02
problem. And I think we it would behoove
00:56:04
everybody to just tell the truth about
00:56:05
this. It would be way better to be
00:56:08
honest. This is not figured out.
00:56:10
>> I would say I think the cards
00:56:11
that are being turned on the table move
00:56:13
me in the exact opposite direction.
00:56:15
>> Okay, let me get Sachs involved and then
00:56:17
I'll give my take. Sax, you have any
00:56:18
thoughts here? Well, I I have some
00:56:20
thoughts on the question you asked about
00:56:22
is the industry doing a good job with
00:56:25
PR. I think the answer is clearly no. I
00:56:27
think they are scaring the bejesus out
00:56:29
of the public and that's why the
00:56:30
popularity is right down there with I
00:56:33
don't know, what was it, Iran? I mean, it's
00:56:35
pretty... I think we're a little more
00:56:37
popular than Iran. But look,
00:56:40
>> Iran's had 100 years to [ __ ] it up.
00:56:42
We've only had two.
00:56:43
>> Yeah. Or 47, anyway.
00:56:45
>> Yeah. Iran's had 47 years. We've only
00:56:47
had two. But it is very much a
00:56:49
US-specific problem. If you look at
00:56:52
data, sentiment data across countries.
00:56:55
What you'll see is that other countries
00:56:56
are much more optimistic about AI than
00:56:59
the US. I think Stanford did a study
00:57:02
on AI optimism. They simply asked the
00:57:04
question, do you think AI is going to be
00:57:06
more beneficial than harmful? Something
00:57:08
like 80% of people in China said yes. In
00:57:10
the US, it was in the 30s and it might
00:57:12
be even lower now. And it's not just
00:57:14
China and the US. You see across Asian
00:57:16
countries, they tend to be pretty
00:57:17
optimistic and then the US and Western
00:57:20
Europe tend to be pretty pessimistic
00:57:21
about it. I think that's less about the
00:57:24
reality of AI and more about our media
00:57:26
environment and who influences it.
00:57:28
Obviously, you have the influence of
00:57:30
Hollywood. It's created a lot of
00:57:31
dystopian films about AI. You've got the
00:57:35
fact that like we talked about these
00:57:36
CEOs are doing a horrible job and they
00:57:39
keep talking about putting everyone out
00:57:40
of business. I mean, this has I think
00:57:42
been not accidental. Well, I would say
00:57:44
some of these CEOs are speaking this way
00:57:47
because they're not very good at comms.
00:57:49
I think others are actually doing it
00:57:51
because they see a strategy there. You
00:57:54
know, they're going for a regulatory
00:57:55
capture.
00:57:56
>> Such a good point. It's delusions of
00:57:58
grandeur plus they're positioning their
00:58:00
companies. Yeah.
00:58:01
>> It could be for financing reasons like
00:58:03
you've mentioned, Chamath. Like they want
00:58:04
to tap the stuff for fundraising, but
00:58:06
also I think that some of it is to
00:58:10
create a regulatory backlash that they
00:58:13
can then control. You know, create a
00:58:14
licensing scheme or permissioning
00:58:16
scheme. And that's a big part of it,
00:58:19
too. And then I think you just have the
00:58:21
fact that in our media environment, the
00:58:24
scare stories are the ones that get a
00:58:26
lot more attention than the heartwarming
00:58:29
stories. You know, if it bleeds,
00:58:31
it leads type thing. So you can just see
00:58:33
how unpopular it is for all of these
00:58:34
reasons.
00:58:35
>> You know, New York is about to outlaw
00:58:36
medical and legal advice from AI
00:58:39
chatbots, which by the way, that's
00:58:40
probably the the most obviously
00:58:43
>> valuable and highest ROI thing for a
00:58:47
consumer.
00:58:49
>> Hurts the poorest people
00:58:50
the worst. But do you understand, like, if
00:58:52
you're say a professional association
00:58:55
that sees it as your job to protect your
00:58:57
industry from disruption, you might
00:59:00
actually want to spread FUD about AI in
00:59:03
order to then seek those protections
00:59:05
through your state legislature.
00:59:07
>> Well, if you have companies that are
00:59:09
fanning those flames and those companies
00:59:11
are the ones in the industry, it's
00:59:12
making your job even easier.
00:59:14
>> Well, just think about the poorest
00:59:15
person. They can't afford a lawyer
00:59:17
and they want to do their own research
00:59:19
and they research the legal stuff in
00:59:22
order to fight an eviction or there are
00:59:24
poor people who don't have a primary
00:59:25
care doctor. They're not insured and
00:59:27
they find a way to deal with, you know,
00:59:30
some medical issue they're having.
00:59:31
This is the greatest thing.
00:59:33
It levels the playing field for
00:59:36
poor people or people without resources.
00:59:39
So, this is the craziest legislation
00:59:41
ever. New York City is a disgrace. I
00:59:43
give them out of the week. No, the
00:59:44
problem is very specifically that
00:59:47
these people that rely on these models
00:59:50
to make a health care diagnosis or get
00:59:52
a legal opinion to help improve their
00:59:54
lives, the makers of those tools are
00:59:57
telling everybody that they're about to
00:59:59
bring death and destruction upon the
01:00:01
economy and the world. So then the
01:00:03
lawyers and the doctors are like, "Well,
01:00:05
then maybe we should slow this down."
01:00:06
and they tell their lobbyists who then
01:00:08
go to New York and then tell the New
01:00:10
York legislators, hey, uh, these guys
01:00:13
are like trying to wreak havoc and then
01:00:15
they're like, oh yeah, well then maybe
01:00:17
we should shut it down. That is the loop
01:00:19
that's happening.
01:00:20
>> That was the Bernie Sanders moment this
01:00:22
week where he said we ought to have a
01:00:24
moratorium on all data centers being
01:00:26
built in the United States because AI is
01:00:28
dangerous. That was his message and that
01:00:30
is what he's pushing. And actually that
01:00:32
brings me to another point which is if
01:00:34
you look closely at Bernie Sanders
01:00:36
messaging, one of the talking points he
01:00:37
used was literally verbatim from Future
01:00:39
of Life Institute. Uh I think it was
01:00:41
something about how AI is less regulated
01:00:43
than a sandwich shop, which is just not
01:00:45
true. But future of life is one of these
01:00:47
EA funded doomer think tanks and they've
01:00:50
got something like a billion dollars.
01:00:52
Vitalik Buterin donated $600 million of
01:00:54
dogecoin to the... Anyway, you've got, I
01:00:58
mean this is one really weird sort of
01:01:00
quirk about our media environment is we
01:01:02
have these EA funded think tanks with
01:01:05
literally billions of dollars. You know,
01:01:07
it's guys like Dustin Moskovitz and
01:01:10
>> they're the Dels, they're the Democratic
01:01:12
Socialist kind of arm. It's so weird
01:01:13
that like New York is now taking the
01:01:15
crown of most [ __ ] state from
01:01:17
California. I don't know how this
01:01:18
happened, but they just seem to be
01:01:20
making every mistake possible. Yeah, but
01:01:21
but hold on, let me just finish my point
01:01:22
because you've got these, let's call
01:01:24
them doomer think tanks funded by these
01:01:25
EA billionaires. They have literally
01:01:27
billions of dollars. You can influence a
01:01:29
lot of public discourse with that a lot.
01:01:32
>> Sure.
01:01:32
>> And they are behind a lot of the NIMBY
01:01:35
stuff around data centers. They've been
01:01:36
spreading a lot of the FUD around data
01:01:39
centers increasing your electricity
01:01:40
prices, which again I think is a solved
01:01:42
problem now because the users, the AI
01:01:45
companies have agreed to pay for the
01:01:47
incremental costs and stand up their own
01:01:48
power generation. They've been spreading
01:01:50
a lot of stories about water usage,
01:01:52
which is just totally made up. I mean,
01:01:54
the modern data centers recirculate
01:01:56
water, so they don't use up water. But
01:01:58
again, you've kind of got these doomer
01:02:00
groups who are just trying to stop AI
01:02:02
however they can.
01:02:04
>> And they're extremely well funded and
01:02:05
they're having a big impact. And I think
01:02:06
actually this is one of the reasons why
01:02:09
you're seeing in the US, again, that AI, I
01:02:11
think, is basically the least
01:02:13
popular thing they can poll for, except
01:02:14
for the Democratic party and Iran. And
01:02:16
by the way, FLI, the Future of Life
01:02:18
Institute, they also fund journalism
01:02:20
fellowships and endowments at
01:02:22
publications. So,
01:02:24
>> who probably writes negatively about AI?
01:02:26
>> Yeah,
01:02:26
>> exactly.
01:02:27
>> Here, Sax, look at this chart. It goes
01:02:29
back now a little bit earlier than 23,
01:02:31
but I had the data accurately from 23,
01:02:34
so we're going into the fourth year.
01:02:36
About 40% of all protested data centers
01:02:39
in America get cancelled.
01:02:42
And so, in 2023, this was a non-issue.
01:02:44
There were a few data centers that were
01:02:46
protested, and of those,
01:02:49
I think it was literally like one or two
01:02:51
got cancelled but then starting in 24
01:02:54
when you had this divergence of
01:02:55
messaging or this chaotic slipshod
01:02:57
messaging and it was just a fever pitch
01:03:00
to raise money what you started to see
01:03:02
was this fomenting of negative
01:03:05
perspective by individual people on the
01:03:07
ground and so in 2024 about 40% of all
01:03:11
protested data centers were cancelled
01:03:13
still a small number, you could ignore
01:03:15
it. But last year was when the bottom
01:03:17
fell out. We had about 25 data centers
01:03:20
cancelled, about five gigawatts that got
01:03:24
cancelled. If you use Sarah Friar's
01:03:26
number, that's $50 billion a year of
01:03:29
revenue,
01:03:31
which is off the table because of what
01:03:33
happened in 25. Now, that has
01:03:35
implications to everybody. Look at the
01:03:37
amount of taxes that that would actually
01:03:39
raise for federal and local and state
01:03:41
governments. All gone. Vanished. In '26,
01:03:45
just at the end of February there are
01:03:47
about 100 data centers being protested,
01:03:50
which if you flow that through will mean
01:03:53
about 40 will get cancelled and that
01:03:57
number right now is about 7 gigawatts, so
01:03:59
another $70 billion a year of revenue so
01:04:03
just last year and this year we've taken
01:04:05
off the table $120 billion of [ __ ]
01:04:08
revenue per year this is a wakeup call
01:04:11
that this messaging is... These people are
01:04:14
not doing what is right on behalf of a
01:04:16
very nascent and critical industry for
01:04:18
America. There's only so much that Sachs
01:04:21
can do, the White House can do. All
01:04:23
these other people are kind of at the
01:04:25
periphery. But if the people that are on
01:04:26
the ground don't get their [ __ ]
01:04:28
together, this is a national disaster.
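For listeners who want to check the arithmetic, the figures quoted above imply roughly $10 billion of annual revenue per gigawatt (Sarah Friar's number: 5 GW lost means $50B a year, 7 GW means $70B). A minimal sketch, treating that per-gigawatt figure and the roughly 40% cancellation rate as the assumptions stated in the discussion:

```python
# Back-of-envelope sketch of the data-center revenue math quoted above.
# The $10B-per-gigawatt-per-year figure is implied by the numbers in the
# discussion (5 GW -> $50B/yr, 7 GW -> $70B/yr); treat it as an assumption.

REVENUE_PER_GW_PER_YEAR = 10e9  # dollars, implied by the quoted figures

def lost_annual_revenue(gigawatts_cancelled: float) -> float:
    """Annual revenue taken off the table by cancelled capacity."""
    return gigawatts_cancelled * REVENUE_PER_GW_PER_YEAR

# 2025: about 25 data centers cancelled, about 5 GW
lost_2025 = lost_annual_revenue(5)      # 50e9, i.e. $50B/year

# 2026 projection: ~100 protested sites, ~40% historical cancellation rate
protested_2026 = 100
cancel_rate = 0.40
expected_cancellations_2026 = protested_2026 * cancel_rate  # ~40 sites
lost_2026 = lost_annual_revenue(7)      # ~7 GW at stake -> $70B/year

total = lost_2025 + lost_2026           # $120B/year combined
print(f"${total / 1e9:.0f}B per year off the table")
```

The point of the sketch is just that the headline $120 billion is the sum of two simple products, each hanging entirely on the per-gigawatt revenue assumption.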
01:04:30
Just to give people a uh sense of where
01:04:33
this is happening, almost all of these
01:04:35
cancellations are in Virginia and Indiana,
01:04:37
according to some just cursory research
01:04:39
here. And there have been zero
01:04:41
cancellations due to local opposition
01:04:43
here in the great state of Texas where
01:04:45
we have over 150 gigawatts of data
01:04:48
capacity requests. So if you want to do
01:04:50
this, come to Texas, talk to Abbott,
01:04:53
talk to Ted Cruz. You just CC them on
01:04:54
your tweet and they'll have you to the
01:04:56
poker game and they will greenlight it.
01:04:58
Brad, before we move on, want to get
01:05:00
your opinion on open source and how
01:05:03
powerful it is and how powerful Apple
01:05:06
silicon is getting. I'm not sure how you
01:05:08
factored this in at Altimeter into your
01:05:11
thinking, but this seems to me to be a
01:05:13
massive headwind against the two big
01:05:16
bets you have
01:05:18
all of these open source models. We
01:05:20
started running them picking up about
01:05:21
85% of our tokens right now and every
01:05:24
startup I know is saying we are standing
01:05:27
up our local models and we only use the
01:05:30
top models, the paid ones uh when we
01:05:33
have jobs we can't do. So I I'm just
01:05:34
curious your thoughts on that and you
01:05:36
can add to that the auto research
01:05:38
project from Karpathy that came out this
01:05:41
weekend for people who don't know we now
01:05:43
have a group of tinkerers who are
01:05:45
setting up their OpenClaw and now
01:05:47
setting up large language models and now
01:05:48
trying to train them with this auto
01:05:50
research tool. This seems like a
01:05:52
parallel track that could be material.
01:05:54
I'm just curious if you're monitoring it
01:05:56
at all.
01:05:56
>> I mean first I would say that uh I am
01:06:00
very enthusiastic for open source.
01:06:02
>> Okay. We see it in widespread use
01:06:04
everywhere. But here's the interesting
01:06:06
thing, you know, for the advanced
01:06:08
companies, they're doing some like
01:06:09
planning, you know, with the frontier
01:06:12
labs and then they're kind of doing the
01:06:13
execution, if you will, with the open
01:06:15
source models. So they're running an
01:06:16
ensemble of model strategy. But here's
01:06:19
what I think's more impressive. We have
01:06:22
incredible open-source models nearly on
01:06:25
the frontier. And notwithstanding that,
01:06:28
we're seeing companies like Anthropic
01:06:31
add five or six billion dollars of
01:06:33
revenue in a single month, which is
01:06:35
extraordinary. We've never seen anything
01:06:37
like it in technology. And that's in the
01:06:40
face, Jason, of all this open-source
01:06:42
usage. So, what does it tell me? It
01:06:44
tells me that the TAM
01:06:46
>> is dramatically bigger than any of us
01:06:48
think that it is. And that, you know,
01:06:51
when we when we look back on this
01:06:53
period,
01:06:54
>> you know, that will be the big takeaway.
01:06:55
It's the takeaway with Uber, the takeaway
01:06:57
with Google, the takeaway with Amazon.
01:06:59
The TAM was way bigger. We've crossed an
01:07:01
important threshold. Open source will be
01:07:03
a part of it, but clearly the
01:07:05
frontier labs can do well even in the
01:07:07
face of it. All right, a little
01:07:09
housekeeping. The All-In Summit is
01:07:11
coming, and Liquidity, I think, is
01:07:15
sold out. We're about to sell out. We
01:07:16
might have a couple tickets left. You
01:07:18
can find both events at allin.com/events
01:07:21
and all-in summit tickets if you want to
01:07:23
get there quickly before they sell out
01:07:25
September 13th, 14th, and 15th in Los
01:07:27
Angeles. Again,
01:07:30
uh for all the all-in listeners, we're
01:07:32
launching a survey today. This is super
01:07:34
important to take the all-in survey. If
01:07:36
you made it to this point in the
01:07:37
podcast, if you are an early true
01:07:40
believer in the pod, if you're one of
01:07:42
the all-in stans, we need you to
01:07:44
fill out this survey.
01:07:45
allin.com/survey.
01:07:46
allin.com/survey.
01:07:48
All right, let's uh wrap up with this
01:07:50
final story that went viral. The
01:07:52
millionaire tax has hit Washington
01:07:54
State. Howard Schultz, CEO uh longtime
01:07:57
CEO of Starbucks
01:08:00
has bailed and he's gone to Miami.
01:08:02
>> Surfside. He's in Surfside. He He bought
01:08:05
a He bought a condo in Surfside.
01:08:07
>> He pulled a J-Cal.
01:08:08
>> He pulled a J-Cal.
01:08:10
>> I think you mean a Sacks. Well, yeah,
01:08:12
but I was never I was never a Starbucks
01:08:14
liberal before I left the state of
01:08:16
California.
01:08:16
>> Listen, I I don't know how many times I
01:08:18
have to make this correction. I am a
01:08:19
moderate. I literally voted four
01:08:21
elections in a row for Republicans.
01:08:23
People have asked me for the receipts.
01:08:24
Pataki, Giuliani, and Bloomberg. I
01:08:29
literally for almost a decade voted
01:08:32
exclusively Republican. Washington's
01:08:34
millionaire tax passed this week. Here's
01:08:36
what the tax is. People making more than
01:08:38
$1 million a year will pay an extra 9.9%
01:08:42
in tax starting in 2029.
01:08:45
The budget center estimates the tax will
01:08:47
impact 30,000 households, bring in
01:08:49
another 4 billion uh for the state's
01:08:51
general fund. Um the funds are supposed
01:08:54
to go towards public schools, higher
01:08:55
education, healthcare. In a huge
01:08:57
coincidence, on the same day the new tax
01:08:59
was passed, Howard Schultz, the
01:09:01
billionaire Starbucks founder, um
01:09:04
>> a huge coincidence, did you say?
01:09:06
>> Yeah. just
01:09:08
just unrelated stories.
01:09:10
>> Unrelated an unrelated story. Yeah.
01:09:11
Unrelated news.
01:09:12
>> He will be leaving Seattle after a 44-
01:09:14
year run because he found out about
01:09:17
these incredible Cuban.
01:09:18
>> There was there was an opportunity.
01:09:20
There was an opportunity to buy a $44
01:09:22
million condo in Surfside. He couldn't
01:09:23
pass it up. It just happened to be on
01:09:25
the same day that they passed a
01:09:26
millionaire tax.
01:09:28
>> He had the Cuban sandwich at Liss
01:09:30
Sanguich and he fell in love.
01:09:34
Schultz has been getting crushed after
01:09:36
saying when he ran for president that he
01:09:38
would be willing to pay more taxes. Uh
01:09:40
Bezos obviously left back in November of
01:09:42
2023
01:09:44
>> and people speculated maybe the 7%
01:09:46
capital gains tax would have
01:09:50
influenced that. Who knows? So I guess
01:09:53
Chamath, what is the endgame here?
01:09:56
Because for these local politicians,
01:10:00
they must have learned the lesson that
01:10:04
people of means can move. They have the
01:10:08
ability to buy new homes, put their old
01:10:11
homes on the market. They're very mobile
01:10:14
and they could even leave the United
01:10:16
States and go to Singapore or Dubai or
01:10:18
other locations in the world. Why are
01:10:21
they still enacting these? And will they
01:10:22
continue to enact these until we get to
01:10:24
60, 70% tax rates and we just lose all of
01:10:28
the creators, and this becomes an Ayn
01:10:30
Randian dystopia?
01:10:30
>> I think that state politicians on the
01:10:33
west coast are very ineffective and
01:10:36
not very smart. Nick, there's
01:10:39
a tweet that was published, I think maybe
01:10:41
it was an infographic that showed net
01:10:43
migration rates of every single state
01:10:47
for 2025. Washington is a few months
01:10:50
behind California in trying to enact
01:10:54
these stupid taxes. And the reason
01:10:58
they're stupid is these kinds of things
01:11:00
don't work at the state level. And we
01:11:02
know what it's already done in
01:11:03
California because the Hoover
01:11:04
Institution just published something
01:11:06
this morning and it's a complete
01:11:08
indictment of of what the billionaire
01:11:11
tax was trying to do. And by the way,
01:11:12
this billionaire tax is only polling
01:11:15
right now 25% of the votes it needs. So
01:11:18
maybe it'll find a way to get on the
01:11:20
ballot and then even then it'll have an
01:11:22
uphill climb to get voted in. But look
01:11:24
at the destruction that it has done in
01:11:26
California by just announcing it. The
01:11:29
Hoover Institution basically ran this
01:11:31
Monte Carlo simulation. They ran 100,000
01:11:34
runs and in 71% of those runs, it comes
01:11:37
out with a negative NPV.
01:11:40
And if you expected value it out, it's
01:11:43
about a $25 billion hole. They also
01:11:46
found that they overcounted the number
01:11:48
of billionaires in California. So that
01:11:50
number was wrong. They undercounted the
01:11:53
amount of revenue that they pay. So that
01:11:55
was wrong. And they overcounted the
01:11:57
estimate of how much money that they
01:11:58
would make. So that
01:11:59
>> they're not good at math. They're not
01:12:00
good at math.
01:12:01
>> So when you add it all up, they thought
01:12:02
they were going to make a hundred.
01:12:04
They're actually going to make 40. The
01:12:06
people that left pay, you know, $3 to $5
01:12:08
billion
01:12:10
a year of taxes. It's going to create a
01:12:13
$25 billion hole. You're going to have
01:12:15
the middle class that's now going to
01:12:16
have to foot this because this is net
01:12:18
revenue that's not going to come into
01:12:20
the budget. That's about $2,500 per middle-
01:12:24
class household. There's about 10
01:12:25
million in California. So that's what's
01:12:28
happened just by making the threat.
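For readers curious what a Monte Carlo exercise like the one described looks like, here is a hypothetical sketch. The distributions below are illustrative assumptions, not the Hoover Institution's actual model; only the $25 billion shortfall and the roughly 10 million middle-class households come from the discussion above:

```python
import random

# Hypothetical sketch of a Monte Carlo expected-value exercise of the kind
# described above. Every distribution here is an illustrative assumption.

random.seed(0)
RUNS = 100_000
HOUSEHOLDS = 10_000_000  # roughly the middle-class household count cited

def simulate_npv() -> float:
    """One run: realized tax take minus revenue lost to departures ($B)."""
    projected_take = random.gauss(100, 20)              # what the state expects
    realized_take = projected_take * random.uniform(0.3, 0.6)  # overcounting
    departures_cost = random.uniform(3, 5) * 10         # ~$3-5B/yr over a decade
    return realized_take - departures_cost

npvs = [simulate_npv() for _ in range(RUNS)]
negative_share = sum(n < 0 for n in npvs) / RUNS
expected_npv_billions = sum(npvs) / RUNS

# A $25B shortfall spread over the middle class:
per_household = 25e9 / HOUSEHOLDS  # -> $2,500

print(f"{negative_share:.0%} of runs negative, E[NPV] = {expected_npv_billions:.0f}B")
print(f"${per_household:,.0f} per household")
```

The structure is the interesting part: sample the uncertain inputs many times, count how often the net present value goes negative, and average the runs, then divide any shortfall by the household count to get the per-household burden.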
01:12:31
>> Washington had a 23-hour debate and
01:12:33
passed the law. So I suspect when you
01:12:36
look back on this in 18 or 24 months,
01:12:39
it'll be as bad or worse than
01:12:41
California. These things don't make
01:12:44
sense. The reason they don't make sense
01:12:46
is that you are putting good money after
01:12:48
bad. We all know that money that goes to
01:12:51
the state governments are wasted. We
01:12:53
just don't know how much. And so when
01:12:55
you keep asking more, eventually the
01:12:58
smart people say enough's enough. I'm
01:12:59
out of here. We might find out how much.
01:13:01
I think uh Bari Weiss is on the case. I
01:13:03
don't know if you saw her do her CBS
01:13:04
report this week. She's she's going hard
01:13:07
for fraud. And until you get fraud out
01:13:10
of the system, I don't think you have
01:13:11
the moral high ground to raise taxes. I
01:13:14
think that should be the message that
01:13:15
all Americans
01:13:16
>> send to politicians.
01:13:18
>> That should be your campaign promise
01:13:19
when you run whatever whatever you do.
01:13:21
>> Hi, I'm Jason Caliganos and I will get
01:13:22
rid of fraud and lower your taxes.
01:13:25
Look, you may have seen an even more
01:13:27
severe attack was proposed at the
01:13:28
federal level where Bernie Sanders and
01:13:30
and I think Ro Khanna came out with their
01:13:34
version of a national wealth tax where
01:13:36
it wasn't just 5% once like in
01:13:38
California, it was 5% per year,
01:13:40
>> per year.
01:13:41
>> So, in other words, in roughly 20 years,
01:13:42
the federal government's just going to
01:13:44
take all of your money. I mean, that's
01:13:45
it. Look, this is socialism. This is
01:13:47
another way to get to the same end
01:13:48
point, which is the government owns
01:13:50
everything.
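A quick arithmetic check of the 20-year claim: if the tax takes 5% of the original assets each year, everything is gone in 20 years; if it instead takes 5% of whatever remains each year, it compounds and leaves roughly a third. A minimal sketch:

```python
# Check of the "all your money in ~20 years" claim for a 5%/year wealth tax.
# Flat case: 5% of the *original* assets per year -> fully gone in 20 years.
# Compounding case: 5% of the *remaining* balance each year:

balance = 1.0
for year in range(20):
    balance *= 0.95  # 5% annual wealth tax on what's left

print(f"After 20 years: {balance:.0%} remaining")  # ~36% remaining
```

So the speaker's "take all of your money" framing matches the flat reading; even the gentler compounding reading confiscates nearly two-thirds over two decades.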
01:13:51
>> They seize. Well, I mean, the seizure
01:13:53
part of it, I think, is the nuance point
01:13:56
we have to get across, which is if you
01:13:57
earned it and paid your taxes already,
01:14:00
>> and then the state can just decide, you
01:14:02
know what, we didn't take enough 10
01:14:04
years ago, we need to go seize that.
01:14:05
Hey, when you sold Yammer, we didn't
01:14:08
take enough. We need to take it now.
01:14:09
>> Raise your hand if you believe the
01:14:11
things you own are better off being
01:14:14
owned by Bernie Sanders and Ro Khanna. Raise
01:14:16
your hand if that's what you believe. I
01:14:17
mean, you'd have to be an idiot to
01:14:19
believe that. Well, I'll tell you the
01:14:21
last time we saw these proposals of
01:14:24
asset seizures was during the Gilded
01:14:26
Age, you know, 1870 to 1920.
01:14:30
>> You know, then it was Carnegie and
01:14:31
Rockefeller. You know what's
01:14:33
interesting? I went back and looked at
01:14:34
it. There were actually like it was
01:14:36
actual warfare. We had hundreds of
01:14:38
people killed in clashes um you know
01:14:41
during the great railroad strike, the
01:14:42
Pullman strike, etc. Um and it was all
01:14:45
over this. And so I think shame on the
01:14:47
politicians that are fanning the flames
01:14:49
of class warfare. Um we all need to
01:14:52
bring the temperature down. There are
01:14:54
fair debates on whether states have
01:14:57
enough resources to fulfill their
01:14:59
obligations to the citizens. There are
01:15:00
fair debates about fraud. All has to be
01:15:03
taken on. But I think it's interesting
01:15:04
in the state of California, right? I
01:15:06
think the teachers union is against the
01:15:08
billionaire's tax because they know
01:15:10
it's going to lead to less dollars for
01:15:12
education for the state of California.
01:15:13
and Matt Mahan, who's running for
01:15:15
governor is against the tax. The current
01:15:16
sitting governor, Democrat, against the
01:15:18
tax, right? They all need to step up and
01:15:21
explain not just that they're against
01:15:23
the tax, but that you're either on the
01:15:26
side of business and entrepreneurs and
01:15:28
creativity and moving the state forward
01:15:31
and growing the economy or you're
01:15:32
against it. And that's what's at stake
01:15:34
here. And fortunately, the outcome of
01:15:36
the battle uh you know during the
01:15:38
Gilded Age was that America didn't
01:15:41
abandon entrepreneurialism. We didn't
01:15:43
abandon capitalism like Europe did and
01:15:46
now we leaned into it. Now we played it
01:15:47
out.
01:15:48
>> One point on that. So you mentioned that
01:15:50
some of the unions in California are
01:15:52
opposed to this this asset seizure tax.
01:15:55
That's only because they weren't cut in
01:15:57
on it,
01:15:57
>> right?
01:15:58
>> So there's already rumors that the
01:16:00
California Teachers Association is
01:16:02
working on their own version.
01:16:03
>> Oh boy.
01:16:05
>> Billionaire asset seizure
01:16:07
>> for not this election cycle, but for the
01:16:08
next one. Yeah.
01:16:09
>> And next time they're going to have
01:16:11
their ducks in a row and they're going
01:16:12
to have all the pigs at the trough and
01:16:14
all the unions are going to get together
01:16:16
because the SEIU
01:16:18
uh they did this on their own. So again,
01:16:21
they didn't allow all the other groups
01:16:23
to wet their beak. So I think that
01:16:25
unfortunately I think that's going to be
01:16:27
corrected. If this one doesn't pass,
01:16:29
they'll correct it for 28 and it's more
01:16:31
likely to pass because they're all going
01:16:33
to do it. And the other thing is that I
01:16:36
think that by 28 this national wealth
01:16:39
tax will just be a standard part of the
01:16:41
Democratic platform.
01:16:43
>> Table stakes. Yeah.
01:16:44
>> Yeah. It's table stakes. I think that
01:16:45
you know the Bernie Sanders and Ro Khanna
01:16:46
position will be the position of the
01:16:48
Democratic party. And you even see Gavin
01:16:50
Newsom creating wiggle room for himself
01:16:52
to embrace this position. If you look
01:16:54
closely at his statements formally
01:16:57
opposing the BTA in California, what
01:17:01
he says is that a state can't do this by
01:17:02
themselves because they got 49 other
01:17:04
states to compete with.
01:17:06
>> So what he's saying is that, you know,
01:17:08
look, California is operating in a
01:17:10
comparative environment. One state can't
01:17:12
do it. And then he's leaving the other
01:17:15
part elliptical, which is well the the
01:17:17
federal government needs to do this. And
01:17:19
I think that you can expect him to
01:17:21
embrace that position by 2028. There is
01:17:24
a way out here from this socialist
01:17:26
movement. It's a it's very simple if you
01:17:29
think from first principles. What does
01:17:30
an American want? What does an American
01:17:31
family want? What what do mothers and
01:17:33
fathers in this country want? They want
01:17:35
to educate their kids. They want to be
01:17:37
able to own a nice home. They want to
01:17:38
have decent healthcare. They want to have
01:17:40
healthy food. It's a very small subset
01:17:42
of issues. And AI is uniquely positioned
01:17:46
to solve a lot of these problems. and
01:17:48
entrepreneurs can come in and take these
01:17:50
highly regulated industries if we're
01:17:52
allowed to participate in them.
01:17:55
Education, we have to break this
01:17:57
accreditation, you know, uh, cartel. And
01:18:00
then housing, we have to break these
01:18:02
regulations like the great state of
01:18:03
Texas, Nevada, and Florida have. And
01:18:06
then you know when it comes to healthcare
01:18:08
this is where AI could have a tremendous
01:18:10
impact and entrepreneurs could have a
01:18:11
tremendous impact in lowering the cost
01:18:13
of healthcare and letting people solve
01:18:15
for that with their own um you know
01:18:18
healthcare, you know, self-led
01:18:19
healthcare. These are the problems. If
01:18:22
you solve for these problems people's
01:18:24
homes, people's health, people's
01:18:25
education of their kids, we're going to
01:18:27
solve these problems and we don't need
01:18:29
to go to socialism and seize people's
01:18:31
assets. That's what entrepreneurs should
01:18:33
be doing. That's what entrepreneurs
01:18:34
should be working on and that's what
01:18:36
where the government can help. That's
01:18:37
where Trump's uniquely qualified. He is
01:18:39
the regulatory breaker. He got nuclear
01:18:42
back on the agenda. No other president
01:18:44
had gotten nuclear back on the agenda
01:18:46
for America. He can get housing back on
01:18:48
the agenda. We have to break those
01:18:50
things and not start foreign wars, per my
01:18:52
earlier point, and start creating houses
01:18:54
for Americans. Another amazing episode
01:18:56
of the All-In podcast. Gentlemen,
01:18:59
>> you always give yourself the last word,
01:19:01
J Cal. Well, I go last. If you'd like, you
01:19:03
can moderate and I'll go first. But I
01:19:05
always go last, but you can, you know, you
01:19:08
could give me a round of applause or you
01:19:10
can throw a tomato. Go ahead, Sax. It
01:19:11
doesn't matter.
01:19:12
>> It's all good. It's all good. Go for it.
01:19:13
>> All right, everybody. Another amazing
01:19:15
episode of the All-In podcast. Thank
01:19:18
you, Bestie Brad, for joining us. Thank
01:19:20
you, Bestie Brad. Love you, boys.
01:19:22
Bye-bye.
01:19:24
>> We'll let your winners ride.
01:19:31
And it said, "We open sourced it to the
01:19:33
fans and they've just gone crazy with
01:19:34
it." Love you, Westy. Queen of Quinoa.
01:19:44
>> Besties are gone. That's
01:19:47
my dog taking a notice in your driveway.
01:19:52
>> Oh man, my habitasher will meet up.
01:19:54
>> We should all just get a room and just
01:19:56
have one big huge orgy cuz they're all
01:19:57
just useless. It's like this like sexual
01:19:59
tension that we just need to release
01:20:01
somehow.
01:20:05
>> Your feet.
01:20:08
>> We need to get merch.
01:20:09
>> I'm going all in.
01:20:17
I'm going all in.

Badges

This episode stands out for the following:

  • 70: Best concept / idea
  • 60: Most shocking
  • 60: Best overall
  • 60: Most surprising

Episode Highlights

  • Brad's State of the Union Shout Out
    Brad shares his surprise at receiving a shout out from the President during the State of the Union, highlighting the significance of the event.
    “It was an extraordinary night.”
    @ 01m 08s
    March 13, 2026
  • Economic Impacts of the Iran War
    Discussion on the economic fallout from the ongoing conflict in Iran, including oil price volatility and market reactions.
    “This is a good time to declare victory and get out.”
    @ 10m 36s
    March 13, 2026
  • Escalation Risks in Iran
    Sax outlines the potential catastrophic scenarios that could arise from escalating the conflict in Iran.
    “There are a lot of really frightening scenarios about where escalation could lead.”
    @ 14m 06s
    March 13, 2026
  • Ignoring Neocon Voices
    It's crucial to disregard the neocon wing's influence on war objectives.
    “I think it’s just important to not listen to those people.”
    @ 26m 28s
    March 13, 2026
  • AI Revenue Explosion
    AI companies are experiencing unprecedented revenue growth, signaling a transformative moment.
    “We’re in the early innings of compute and algorithmic capability.”
    @ 31m 15s
    March 13, 2026
  • AI's Revenue Debate
    The discussion centers around the balance between experimental and production revenue in AI.
    “How much of this revenue is experimental versus real?”
    @ 39m 01s
    March 13, 2026
  • Labor Displacement Concerns
    The conversation touches on fears of job loss due to AI advancements and labor displacement.
    “Labor displacement might not mean layoffs, but a shift in job roles.”
    @ 42m 12s
    March 13, 2026
  • The Future of AI as a Utility
    The vision for AI is to become a utility, similar to electricity or water.
    “We see a future where intelligence is a utility like electricity.”
    @ 50m 26s
    March 13, 2026
  • Public Sentiment on AI
    The US is significantly more pessimistic about AI compared to other countries like China.
    “In the US, it was in the 30s and it might be even lower now.”
    @ 57m 12s
    March 13, 2026
  • Washington's Millionaire Tax
    A new tax on millionaires in Washington State coincides with Howard Schultz's departure.
    “He will be leaving Seattle after a 44 year run because he found out about these incredible Cuban sandwiches.”
    @ 01h 09m 14s
    March 13, 2026
  • California's Revenue Miscalculation
    California underestimated its revenue and now faces a $25 billion budget hole.
    “They're actually going to make 40.”
    @ 01h 12m 04s
    March 13, 2026
  • Class Warfare and Political Responsibility
    Politicians are fanning the flames of class warfare, impacting education and business.
    “We all need to bring the temperature down.”
    @ 01h 14m 47s
    March 13, 2026

Key Moments

  • State of the Union @ 00:53
  • Escalation Risks @ 14:06
  • Historic Summit @ 19:32
  • Political Strategy @ 26:28
  • Job Displacement Debate @ 42:04
  • AI Optimism Gap @ 57:21
  • Millionaire Tax Fallout @ 1:08:36
  • Final Thoughts @ 1:20:17

Related Episodes

  • Epstein Files, Is SaaS Dead?, Moltbook Panic, SpaceX xAI Merger, Trump's Fed Pick
  • Scott Bessent: Fixing the Fed, Tariffs for National Security, Solving Affordability in 2026
  • “One of the greatest philanthropic gifts in the history of humanity”: Funding 25M investment accounts
  • E63: Insurrection indictments, human rights in the US and abroad, groundbreaking MS study and more
  • The Future of Everything: What CEOs of Circle, CrowdStrike & More See Coming in 2026
  • “This is Bibi’s War” - Harvard’s Graham Allison on the Influences and Endgame of the Iran War
  • E52: Trump's SPAC, peak venture liquidity, tech as an economic ladder, Dems overplaying their hand