
Inside the Iran War and the Pentagon's Feud with Anthropic with Under Secretary of War Emil Michael

March 06, 2026 / 01:22:59

This episode of All-In covers the emergency military operation involving the US and Israel against Iran, featuring guest Emil Michael, the Under Secretary of War for Research and Engineering. Key topics include the operation's objectives, military technology advancements, and geopolitical implications, particularly concerning China.

Emil Michael discusses the recent military actions, including the killing of Iran's Supreme Leader Ali Khamenei and the operational goals of disarming Iran's capabilities to support terrorist groups. He emphasizes that the operation is designed to be swift, with no intention of prolonged ground involvement.

The conversation also touches on the evolving nature of warfare, particularly the increased use of drones and AI in military operations. Michael explains how advancements in military technology have changed engagement rules and operational strategies, making modern warfare more efficient.

Additionally, the episode explores the implications of these military actions on US-China relations, with Michael and co-hosts discussing how the conflict could impact negotiations with China regarding economic and geopolitical stability.

Throughout the episode, the hosts engage in a lively discussion about the ramifications of the military operation and the future of defense technology, highlighting the importance of maintaining a competitive edge against global threats.

TL;DR

The episode discusses the US-Israel military operation against Iran, featuring insights from Emil Michael on military strategy and technology advancements.

Video

00:00:00
All right everybody, emergency podcast
00:00:02
time. Episode 263 of All-In. We have
00:00:04
Emil Michael, the under secretary of war
00:00:08
for research and engineering working
00:00:10
directly for Pete Hegseth. We had to get
00:00:13
this out to you on Thursday night
00:00:15
because it is an emergency pod. One of
00:00:18
my old besties. Emil Michael is here.
00:00:20
Emil and I uh were part of team Uber
00:00:23
back in the day. He was Travis's
00:00:25
right-hand man. Some might say fixer.
00:00:28
And Emil Michael is now the under
00:00:30
secretary for war
00:00:33
here in the United States, serving his
00:00:35
country like our bestie David Sacks.
00:00:38
Welcome to the program for the first
00:00:39
time. Emil Michael, how you doing,
00:00:41
brother?
00:00:41
>> I'm doing good. Uh I hope it was more
00:00:43
than a fixer, but you know,
00:00:46
>> Raising $20 billion fixer. I mean, you
00:00:48
got it done. You got it done. The
00:00:50
hardest. He would give you the hardest
00:00:51
things. Yeah. Just
00:00:52
>> That's right. Fair enough.
00:00:53
>> If it was hard, and that's what a fixer
00:00:55
is.
00:00:55
>> An operational axe. That's what they
00:00:58
call them.
00:00:58
>> All right. Sure. Uh, in Brooklyn, we
00:01:00
call them fixers. With us again,
00:01:01
>> a rain maker.
00:01:03
>> There's that, too. There's that, too.
00:01:05
Making it happen. With us again, Chamath
00:01:08
Palihapitiya. How are you, brother?
00:01:11
>> Great.
00:01:11
>> Yeah. Look at that smile. What do you
00:01:14
got going on? You got some pokers in the
00:01:17
fire.
00:01:18
>> Okay.
00:01:19
>> I'm not going to say in the coming
00:01:20
weeks, I think some news is going to
00:01:22
drop. That's my prediction. I don't have
00:01:23
any.
00:01:23
>> Are you loving the tweet mogging that's been
00:01:25
going on this week? So good. So good.
00:01:28
>> Oh my god.
00:01:29
>> So good.
00:01:30
>> He's looksmaxing by default, but he's
00:01:32
been mogging
00:01:33
>> the Gooners.
00:01:35
>> Yeah. So funny. What was your favorite?
00:01:38
>> The one I sent you this morning that you
00:01:39
said. What did you say? So funny.
00:01:41
>> Are you collecting your losses? Tax
00:01:45
harvesting.
00:01:45
>> What did you say? Oh my god.
00:01:47
>> Chamath said uh Oh my god. It was just
00:01:50
like Yes, I did. Yes, I did.
00:01:51
>> Yes, I did. Yes. Someone said something
00:01:53
to Chamath. He's like dropped in. Why is
00:01:55
everyone so mad at Chamath? All he did
00:01:57
was lose billions of retail investors'
00:01:58
money by promoting one-page SPACs. It's not
00:02:02
like he then told them to enjoy their
00:02:03
capital losses or anything. Give the man
00:02:05
a break. Chamath's response: Yes, I did.
00:02:08
>> All right. Piling on is your sultan of
00:02:11
science. Everybody's favorite. Had a
00:02:13
great
00:02:15
>> some great science that he brought to
00:02:17
the show last week. Friedberg, how are
00:02:18
you doing?
00:02:20
>> Oh, yeah. I've been traveling this week
00:02:21
back at home.
00:02:22
>> All right. Sacks is out today. He's very
00:02:24
busy on Capitol Hill. We'll talk about
00:02:26
what he's up to next week.
00:02:27
>> Let's go. Come on. Let's go. Let's go.
00:02:29
>> Let's go. Go, Jason.
00:02:30
>> All right. The US and Israel launched a
00:02:31
joint attack on Iran on Saturday. Today
00:02:34
is day six of Operation Epic Fury. Iran's
00:02:38
Supreme Leader Ali Khamenei
00:02:42
was killed within hours of the
00:02:44
operation. 40 senior officials have also
00:02:46
been killed. Death toll so far. About a
00:02:48
thousand people according to reports.
00:02:50
Tragically, six US Army Reserve soldiers
00:02:53
were killed following a drone strike on
00:02:55
a base in Kuwait. A US submarine sank an
00:02:58
Iranian ship off the coast of Sri Lanka.
00:03:00
This is the first torpedo kill since
00:03:02
World War II. Why we're at war has been a bit
00:03:05
of a moving target and a debate. First
00:03:07
explanation from Rubio. He said Israel
00:03:09
was going to attack and the US had no
00:03:12
choice but to participate. Later walked
00:03:14
that back. Trump made it clear this is
00:03:17
not a regime change effort, but we're
00:03:20
doing this to stop terrorism and the
00:03:22
development of ICBMs by obviously a
00:03:26
pretty crazy group of individuals and
00:03:29
obviously nuclear bombs which we blew up
00:03:31
a couple weeks ago. Trump also mentioned
00:03:33
the people of Iran should seize the
00:03:35
moment quote and take their country
00:03:38
back. Hegseth, who is your boss, Emil,
00:03:41
said quote this is not a so-called
00:03:43
regime change war but the regime sure
00:03:46
did change and the world is better off
00:03:48
for it. So here's an interesting Polymarket
00:03:50
right now: US forces enter Iran,
00:03:55
this is boots on the ground by the end
00:03:57
of March 40% chance by the end of the
00:04:00
year 59% chance so the idea that we're
00:04:02
not going to have boots on the ground
00:04:04
the sharps on Polymarket believe we
00:04:06
will. Will the Iranian regime fall by
00:04:11
June 30th? 39% chance according to Polymarket,
00:04:13
and by the end of the year, 51%
00:04:16
chance so Emil I guess there are two
00:04:19
questions people really want to know
00:04:21
I'll leave off why we're doing this I
00:04:22
think President Trump has been pretty
00:04:24
clear now but how long is this going to
00:04:27
take is the one question and are we
00:04:29
going to have to have boots on the
00:04:30
ground maybe what is success here
00:04:35
>> um I think the president talked
00:04:38
about this is a weeks not months kind of
00:04:41
operation and it's aimed at essentially
00:04:45
disarming the regime uh or the
00:04:48
country from uh in such a way that they
00:04:52
can't supply Hezbollah, Hamas, um Muslim
00:04:56
Brotherhood, all the kind of terror
00:04:58
groups that get sponsored by weapons and
00:05:00
money from Iran, not to mention the
00:05:02
nuclear bit. And that's why you see from
00:05:06
the reporting they're going after the
00:05:08
depots the the you know we went after
00:05:11
nuclear sites before they're a
00:05:14
prodigious drone maker. These like huge
00:05:16
one-way attack drones that can go, you
00:05:19
know, hundreds and hundreds of miles.
00:05:21
Um lots of ballistic missiles uh that
00:05:25
are aimed at every country in the Middle
00:05:27
East as you've seen they've attacked
00:05:28
them. So
00:05:30
I think that's one. In terms of boots on
00:05:32
the ground, there's no scenario where we
00:05:34
have some protracted boots on the ground
00:05:37
Afghanistan, Iraq 2.0-like scenario.
00:05:40
>> Friedberg, your thoughts on this war.
00:05:43
Obviously, a lot of people voted for
00:05:47
Trump in order to have the peace
00:05:49
dividend; he was, in his first term,
00:05:52
absolutely the peace president. And now
00:05:54
here we are. Eight countries have been
00:05:56
bombed and we've had two leaders deposed
00:05:59
and one of those two has been killed.
00:06:02
Your thoughts, Friedberg?
00:06:03
>> I think the president and the
00:06:05
administration have probably the biggest
00:06:07
meetings of the term coming up in China
00:06:10
in April. My
00:06:13
estimation based on the conversations
00:06:14
and the comments made by the president
00:06:17
before he came into office and since
00:06:19
he's been in office is that finding a
00:06:21
grand bargain or a deal with China is
00:06:23
probably one of his top priorities. And
00:06:25
if you think about the importance of
00:06:27
that, is the US going to wade into a
00:06:30
giant global conflict led by a US China
00:06:34
rift or is the US going to find some
00:06:36
grand bargain? I think he would probably
00:06:38
have a preference for the grand bargain.
00:06:40
And that being the case, I think you
00:06:42
could look at the in the context of
00:06:44
Maduro and the actions in Iran as
00:06:47
creating maximal leverage going into
00:06:49
those negotiations that
00:06:51
>> The reason for that, Friedberg?
00:06:53
>> 90% of the oil that comes out of Iran
00:06:54
goes to China
00:06:57
and there's been a long developing and
00:06:59
developed relationship between Maduro's
00:07:02
government and China and these are big
00:07:06
economic drivers, or support the economic
00:07:09
engine, in China. So creating leverage
00:07:12
by having significant influence or
00:07:15
damage or destruction to those supply
00:07:17
chains for China gives the United States
00:07:20
footing to be able to negotiate a better
00:07:22
deal for America. I would imagine that
00:07:24
the president's intention here isn't to
00:07:25
go and decide who should be in charge
00:07:28
and drive regime change and end in a
00:07:29
multi-year conflict with Iran. But
00:07:32
ultimately, if there's some transaction
00:07:34
with China that gets everyone out of
00:07:36
this and puts the US on a strong footing
00:07:39
where American businesses can sell into
00:07:40
China, which is very challenging as
00:07:42
everyone knows today. And there's
00:07:44
parity, regulatory parity, economic and
00:07:47
trade parity between the US and China.
00:07:49
there's a point of view on what happens
00:07:50
with Taiwan and availability of key
00:07:52
technologies like semiconductors. I
00:07:54
think it could be a win-win and I think
00:07:57
that a deal with China could be the
00:07:59
crowning achievement of this
00:08:00
administration particularly going into
00:08:02
the midterms. So the timing is right and
00:08:04
I think that's probably a core part of
00:08:06
the motivation here. Chamath, your
00:08:08
thoughts on this action and why we're
00:08:12
doing it. You've heard obviously the
00:08:14
president has his position. We're not
00:08:16
doing regime change. It's a secondary
00:08:18
effect obviously, but we want to stop
00:08:20
those ICBMs and nuclear bombs from being
00:08:22
developed and we want to stop terrorism.
00:08:25
Additionally, Freeberg says, "Hey, we're
00:08:27
framing this great, you know, discussion
00:08:32
we're going to have with Xi and China
00:08:34
and oil is part of that." Where where do
00:08:36
you think you stand on all this?
00:08:38
>> I'll build on both what Emil said and
00:08:40
what Friedberg said. I don't think this
00:08:43
is about regime change and I don't think
00:08:45
it's about a local regional conflict. I
00:08:49
think if you take a step back and zoom
00:08:51
out, the most important thing that we
00:08:54
did in the last 3 months was by taking
00:08:58
out Maduro and by taking out the Iranian
00:09:04
leadership, we have created enormous
00:09:06
leverage as Friedberg said
00:09:09
with China. Now, why is that important?
00:09:12
Because I think all of this centers
00:09:14
around that geopolitical discussion.
00:09:17
Last night, something important
00:09:19
happened, which is that the official
00:09:20
Chinese
00:09:22
bureaucracy posted what their GDP
00:09:25
targets were, and it was shocking to
00:09:27
anybody reading it because what we saw
00:09:30
was that they guided to a range of 4.5
00:09:33
to 5%.
00:09:35
which if you look at the historical
00:09:37
context of that growth is the lowest
00:09:40
that it has ever been in about 30 years.
00:09:44
So three decades. So before they entered
00:09:47
the WTO
00:09:48
and the question that one should ask
00:09:50
oneself is when a country that's
00:09:51
growing at 8, 9, and 10% starts to grow at
00:09:54
half that rate yet have double the
00:09:57
number of people and double the GDP what
00:09:59
happens? you already have incredibly
00:10:02
high domestic unemployment, especially
00:10:04
youth unemployment.
00:10:06
Does it become more or less chaotic? And
00:10:08
I think the historical artifacts of
00:10:09
every other country would show that it
00:10:12
will become more chaotic. If you have
00:10:13
that as a starting point, what is it in
00:10:16
China's best interest to do? And I think
00:10:18
it becomes obvious that the right thing
00:10:19
to do would be to invade Taiwan. Why?
00:10:23
because you start to create a sinkhole
00:10:25
that occupies your people, that occupies
00:10:27
resources, that can get domestic
00:10:29
production up and running, that can
00:10:31
start to generate a war machine. And you
00:10:33
see the economic impact of war machines
00:10:35
in any country during any conflict.
00:10:38
And if I had to guess, just to build on
00:10:40
what Emil said, the president saw that
00:10:42
and I think what they did can be
00:10:44
summarized in this chart which I sent to
00:10:46
Nick. So if your goal is to prevent war
00:10:50
with China, which is a massive global
00:10:53
conflict, which could be nuclear, which
00:10:56
could be cataclysmic,
00:10:59
how would you do it? And this chart
00:11:02
paints one way to do it. If you look at
00:11:04
the conditions inside of the Chinese
00:11:06
economy,
00:11:07
the most interesting takeaway is that
00:11:10
they are enormously
00:11:12
dependent on imported oil. So about 20%
00:11:17
of their economy, but it's not 20% of
00:11:19
their economy because it's 100% of these
00:11:22
critical things that create GDP,
00:11:24
logistics, transportation, aviation,
00:11:26
feedstock inputs.
00:11:29
And of that 19%
00:11:32
about a fifth of it comes exclusively
00:11:34
from Iran and Venezuela. And now all of
00:11:37
that is off the table.
00:11:39
So if you take that and then you see
00:11:41
what Steve Witkoff and Jared Kushner and
00:11:43
Josh Greenbond have been doing, which is
00:11:46
trying to get a deal done in Russia, and
00:11:47
you put all of these things together,
00:11:49
because by the way, if you add Russia
00:11:50
into that mix, it's about 40% of China's
00:11:53
oil, not only do you re-dollarize, not
00:11:56
only do you stop the funneling of all of
00:12:00
these illicit oil funds to creating
00:12:02
chaos all around the world, but you hem
00:12:05
in China going into a massive moment at
00:12:08
the end of March, beginning of April,
00:12:10
where as Friedberg said really astutely,
00:12:13
there is the potential for a grand
00:12:14
bargain and I think that secures global
00:12:17
safety, and that is a huge thing for
00:12:19
America.
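To put rough numbers on the exposure Chamath is describing, here is a back-of-envelope sketch in Python. All figures are the rough estimates quoted on the show, not verified economic data, and the variable names are illustrative.

```python
# Back-of-envelope version of the oil-leverage argument as quoted above.
# All figures are the hosts' rough estimates, not verified economic data.
oil_share_of_economy = 0.20   # "about 20% of their economy" tied to imported oil
iran_venezuela_share = 0.20   # "about a fifth" of that oil from Iran + Venezuela
with_russia_share    = 0.40   # "about 40%" once Russia is added to the mix

# Rough share of GDP-supporting oil supply now off the table
exposure_iran_venezuela = oil_share_of_economy * iran_venezuela_share  # ~4% of GDP
exposure_with_russia    = oil_share_of_economy * with_russia_share     # ~8% of GDP

print(f"Iran + Venezuela exposure: ~{exposure_iran_venezuela:.0%} of GDP")
print(f"Adding Russia:             ~{exposure_with_russia:.0%} of GDP")
```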
00:12:20
>> Emil, how much does this have to do with
00:12:22
China?
00:12:24
I think you know my instinct is and I'm
00:12:27
not speaking for the administration on
00:12:29
this is that's a second order benefit to
00:12:32
some of these things like the um you
00:12:35
know and you said eight conflicts
00:12:37
there have not been eight conflicts
00:12:39
there is like we inherited Gaza we
00:12:42
inherited Russia Ukraine um Venezuela
00:12:45
was its own operation and then you could
00:12:48
sort of attach to it the drug boats that
00:12:50
were coming out of that as like one big
00:12:51
operation
00:12:53
um and then the Houthis, that was just Biden
00:12:55
ignoring the Houthis. They were
00:12:56
just shooting at our ships. So that was
00:12:58
like very limited in terms of like stop
00:13:01
shooting our ships. We need freedom of
00:13:03
the seas and that wasn't sort of a, you
00:13:06
know, so that's something any president
00:13:08
should be doing generally. I think um
00:13:11
Iran being the one, you know, material
00:13:14
conflict outside of Venezuela. So it's
00:13:16
not it's not that many. And and how long
00:13:18
did Venezuela last? It was one raid.
00:13:21
one.
00:13:21
>> I guess that's a really important
00:13:23
>> few hours. This is an important
00:13:25
note I think for you Emil to sort of uh
00:13:28
explain to us. There's a new
00:13:31
approach here with regard to these
00:13:35
actions which is no boots on the ground
00:13:38
and we seem to uh and and you of course
00:13:41
have better information than anybody
00:13:42
else does. I don't think anybody would
00:13:44
have known Venezuela would have gone as
00:13:46
well as it did and so far and listen we
00:13:49
got a long way to go with Iran. This has
00:13:51
gone very well as as well. So explain to
00:13:54
us what you know and what you, the
00:13:57
president, and Hegseth know that we don't
00:14:00
that makes these two operations go so
00:14:02
smoothly. What what is it then? There's
00:14:05
obviously some new technology here in in
00:14:06
the case of what happened in Venezuela.
00:14:08
>> Yeah. Besides the discombobulator,
00:14:11
what we've got is a very well-trained
00:14:16
military. Like the global war on
00:14:18
terror was a disaster in so many respects,
00:14:23
but the people now who are fighting that
00:14:25
are generals now. And so they've learned
00:14:28
a lot of lessons. And you compare that
00:14:30
to the Chinese military. They don't have
00:14:32
a lot of experience. In fact, the
00:14:34
decapitation they did in the Chinese
00:14:36
military, the one guy they took out was
00:14:37
the one guy who had experience in
00:14:39
Vietnam. So, they don't have conflict
00:14:41
experience, and that matters because you
00:14:43
understand going in what are the things
00:14:46
that could go wrong. Um, and then you
00:14:48
you have incredible technology, space,
00:14:51
air, land, sea, cyber, um, all kinds of
00:14:56
effects that you could bring together.
00:14:58
And so you imagine uh a hundred guys
00:15:01
go into the most fortified compound in
00:15:03
Venezuela where the president is, you
00:15:06
know, take him and his wife out safely
00:15:08
and are out with no KIAs.
00:15:12
>> Incredible.
00:15:12
>> I mean, it's just incredible, right? So
00:15:14
stunning.
00:15:14
>> Yeah. So, and these things,
00:15:17
these war games, have been on the
00:15:18
shelf for a long time. Every
00:15:20
scenario has been planned for years
00:15:23
ahead of time. Midnight Hammer in Iran
00:15:25
was planned years ahead of time in terms
00:15:27
of how would you do it if you were going
00:15:29
to do it.
00:15:30
>> Um, and then you keep refreshing the
00:15:32
tactics, techniques, and
00:15:34
procedures and you're updating them. So,
00:15:36
we have a very sophisticated way of of
00:15:38
doing these things to minimize loss of
00:15:40
life and maximize success.
00:15:42
>> Can I ask a question? I don't want to I
00:15:43
don't want to derail this conversation,
00:15:44
but is the discombobulator real? Like,
00:15:47
what can you say about the discombobulator?
00:15:50
>> I completely I was like obsessed with
00:15:52
this when I saw it on X. I was like,
00:15:54
"What is this thing?" I mean, I need it
00:15:56
in my house. Like, I just want to push a
00:15:58
button of all these my kids.
00:15:59
>> And that's just for when Helm Youth
00:16:01
shows up.
00:16:01
>> Oh my god.
00:16:02
>> Not meant for your kids or if they're
00:16:04
behaving badly. No, it's can't talk
00:16:06
about it.
00:16:06
>> Emil, do you think we would have been
00:16:07
able to pull off that mission as
00:16:09
successfully as we did 5 years ago, 10
00:16:11
years ago? Has the technology improved
00:16:14
that quickly that this is not something
00:16:16
that's been possible historically? And
00:16:17
how does that change the the pacing and
00:16:20
the face of war for the next couple of
00:16:22
years? I'd say no. It wasn't only a
00:16:25
technology maturation from five years
00:16:28
ago. It's the uh rules of engagement.
00:16:31
The rules of engagement that we used to
00:16:34
have uh there was some I mean if you
00:16:36
read about them some of them were insane
00:16:37
like if in Afghanistan if the guy had a
00:16:40
small gun you had to have a small gun
00:16:43
and you know there was this parity in
00:16:45
weird ways.
00:16:46
>> And when you're like well but is the
00:16:48
objective to have like a fair fight or
00:16:50
an unfair fight? Um, well, if you're on
00:16:52
our side, you want it to be unfair. So,
00:16:54
the rules of engagement were relaxed to
00:16:57
be
00:16:57
>> Who writes those, Emil? Who sits in an
00:16:59
office and says you can't shoot back if
00:17:01
a combatant is shooting at you if you
00:17:04
aren't matched gun for gun?
00:17:06
>> Yeah. I mean, who writes that? Crazy
00:17:08
policies that are written into the military
00:17:10
department. And that's why when
00:17:12
Secretary Hegseth talks about um this kind
00:17:16
of thing and what was happening with him
00:17:18
when he was in Afghanistan and Iraq, if you ever
00:17:19
read his book, he's like the
00:17:22
rules of engagement were so punishing
00:17:24
that we were at risk all the
00:17:26
time cuz you had to have like a legal
00:17:28
understanding of what was happening in
00:17:30
every minute in the battlefield as
00:17:32
opposed to well your job is to you know
00:17:34
take out these guys and protect these
00:17:36
guys. Here's your munitions. here's like
00:17:38
the red lines and then like in
00:17:41
the middle of that go use your judgment
00:17:43
your commanding officer use your
00:17:45
judgment on how to win. And we've kind of
00:17:47
gotten back to that: use your judgment,
00:17:50
push responsibility to the field, still have
00:17:52
your red lines but other than that the
00:17:53
objective is the objective it's more of
00:17:55
a Colin Powell approach: it's like go all
00:17:58
in have a clear objective come out use
00:18:01
overwhelming force and we were not doing
00:18:03
that for the last four years
00:18:05
>> and then going back to the face of
00:18:07
war going forward. My
00:18:08
understanding is that there have been
00:18:10
more drones deployed by the United
00:18:12
States this past week than we've done in
00:18:14
the history of military activity. Is
00:18:17
that right? And like how does that
00:18:18
really change things going forward here?
00:18:20
>> It changes it big. Well, so the
00:18:23
Predator drone was the first big drone
00:18:25
program like 10, 15 years ago. It was this
00:18:27
big honken drone. Um and then if you
00:18:30
remember Obama would take out some of
00:18:32
these al-Qaeda leaders with drones on
00:18:34
their balcony and things like that. Uh I
00:18:37
think we uh President Trump took out
00:18:39
Soleimani with a drone near his car.
00:18:41
That was the beginning. And then Iraq uh
00:18:44
sorry the Russia uh Ukraine war happened
00:18:46
where it's like drone on drone. 70% of
00:18:48
the casualties are from drones,
00:18:51
because of drones. So um drone on drone
00:18:55
warfare, robot on robot warfare, those
00:18:57
things are the future for sure,
00:18:59
>> right? And that's why companies like
00:19:00
Anduril are where they are,
00:19:02
because they're making unmanned systems.
00:19:04
And this has been something you've
00:19:06
specifically been very focused on and
00:19:09
you tweeted today a little bit about a
00:19:11
competition. We'll play a little video
00:19:12
here. And this is LUCAS, low-cost unmanned
00:19:14
combat attack systems.
00:19:17
>> It used to take a lot of time. It
00:19:19
certainly wasn't startup time to get new
00:19:23
product into the channel for our
00:19:25
military to use. Explain what program
00:19:27
you're running here. Feels like the
00:19:28
DARPA self-driving challenge all over
00:19:31
again and what these drones cost. I know
00:19:33
there's a company making them for I
00:19:35
think $35,000. Am I correct?
00:19:37
>> I mean the small drones like I'm
00:19:39
holding right there are way
00:19:41
cheaper than that. The LUCAS one-way
00:19:43
attack drone which can go 5, 6, 700 miles
00:19:47
at the speed of an airplane
00:19:49
>> carry a big warhead. Those are like $50 to
00:19:52
$80,000
00:19:53
depending on what kind of equipment you
00:19:54
put on it. But we have
00:19:57
a drone dominance program, and
00:19:59
we basically have to build an
00:20:01
arsenal of drones. Now are we
00:20:04
likely to have a territorial conflict
00:20:05
like Russia Ukraine with Canada and
00:20:07
Mexico? No. But we do want to take
00:20:10
out drug drones at the border, and
00:20:13
long one-way attack drones are
00:20:16
important for you know any kind of major
00:20:19
conflict like you're seeing in Iran. but
00:20:21
also to protect military bases, for
00:20:23
America 250, the World Cup, uh the Olympics in '28,
00:20:28
like there's a lot more
00:20:29
uses of drones for surveillance not just
00:20:32
for you know for combat
00:20:34
>> there you're showing drones that are
00:20:36
sort of human operated but how much of
00:20:38
this should basically be AI so that it's
00:20:41
just some computer vision and again back
00:20:43
to what you said before a model
00:20:46
understands the rules and the red lines
00:20:48
but otherwise is to be on task and
00:20:50
accomplish your mission. How much of it
00:20:52
is one versus the other?
00:20:53
>> I mean, it I believe that a
00:20:55
sophisticated drone war is going to be
00:20:57
drone swarms controlled by AI to some
00:21:00
degree or another, right? To what degree
00:21:02
the control matters? Like for example,
00:21:05
drones have decoys. They could spit out,
00:21:07
you know, they could dazzle. They could
00:21:09
put out things. So, how do you
00:21:11
discriminate what's a drone and how to
00:21:13
hit it? You know, you could use AI for
00:21:15
that because it's learned, you know, how
00:21:17
to do automatic target recognition, for
00:21:19
example. Um, and then also could it
00:21:22
identify a person and that and and does
00:21:24
that make it safer? So, it's going after
00:21:26
actually someone you want to get and not
00:21:28
someone you don't want to get. Um, so
00:21:31
there's a lot of uses for AI at the
00:21:32
edge, if you will. Um, in the future
00:21:35
here, the Ukrainians and Russians do
00:21:37
something called like a kill box where
00:21:39
they lose comms because it's jammed for
00:21:41
this drone and then it just starts going
00:21:43
in a box and looking for, you know,
00:21:46
the person they're trying to get, and
00:21:48
they're starting to use
00:21:49
AI to do that.
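To make the kill-box behavior concrete, here is a minimal toy sketch in Python, assuming a hypothetical box, a stub sensor, and a simple lawnmower sweep; it illustrates the idea described, not any real system's logic.

```python
# Toy illustration of the "kill box" fallback described above: on comms loss,
# sweep a pre-briefed box and flag only detections matching pre-cleared
# criteria. Names, parameters, and the stub sensor are hypothetical.
from dataclasses import dataclass
from typing import Callable, Iterator, Optional, Tuple

@dataclass
class Box:
    x_min: int
    x_max: int
    y_min: int
    y_max: int

def lawnmower_waypoints(box: Box, step: int) -> Iterator[Tuple[int, int]]:
    """Back-and-forth sweep covering the box at `step` resolution."""
    for i, y in enumerate(range(box.y_min, box.y_max + 1, step)):
        xs = list(range(box.x_min, box.x_max + 1, step))
        for x in (xs if i % 2 == 0 else reversed(xs)):
            yield (x, y)

def search_after_comms_loss(
    box: Box,
    sense: Callable[[Tuple[int, int]], Optional[str]],
    is_cleared_target: Callable[[str], bool],
) -> Optional[Tuple[int, int]]:
    """Sweep the box; return the first waypoint with a pre-cleared detection."""
    for waypoint in lawnmower_waypoints(box, step=50):
        detection = sense(waypoint)
        if detection is not None and is_cleared_target(detection):
            return waypoint  # hand off to human-set engagement rules
    return None

# Toy run: a fake sensor that "sees" a target near (150, 100).
fake_sense = lambda wp: "target" if abs(wp[0] - 150) <= 50 and abs(wp[1] - 100) <= 50 else None
print(search_after_comms_loss(Box(0, 300, 0, 300), fake_sense, lambda d: d == "target"))
```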
00:21:50
>> Wow.
00:21:51
>> And China has this uh ability already
00:21:55
probably times some magnitude. Yeah.
00:21:58
They have drone swarms because they can
00:22:01
they can force the companies that make
00:22:04
them, not just DJI, to interoperate. So
00:22:07
interoperating drones called
00:22:10
heterogeneous autonomy, right? You take
00:22:11
different kinds of drones and how they
00:22:13
communicate with one another and then
00:22:15
make sure they're not going after the
00:22:16
same target is like a pretty complex
00:22:19
thing that they're definitely working
00:22:20
on.
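The deconfliction problem Emil describes, making sure two drones don't chase the same target, can be illustrated with a simple greedy assignment. A real swarm would use distributed auction or consensus protocols; this centralized toy sketch in Python, with hypothetical names, just shows the idea.

```python
# Toy deconfliction for a heterogeneous swarm: assign each drone the nearest
# unclaimed target so no two drones chase the same one. A real swarm would do
# this with distributed auction/consensus protocols; this centralized greedy
# version is illustrative only, and all names are hypothetical.
from math import dist

def deconflict(drones: dict, targets: dict) -> dict:
    """Greedy nearest-first assignment of drones to distinct targets."""
    pairs = sorted(
        (dist(drone_pos, target_pos), drone, target)
        for drone, drone_pos in drones.items()
        for target, target_pos in targets.items()
    )
    assignment, claimed = {}, set()
    for _, drone, target in pairs:
        if drone not in assignment and target not in claimed:
            assignment[drone] = target
            claimed.add(target)
    return assignment

drones = {"quad-1": (0, 0), "fixed-wing-1": (10, 0), "quad-2": (0, 10)}
targets = {"t1": (1, 1), "t2": (9, 1)}
print(deconflict(drones, targets))  # each target claimed by exactly one drone
```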
00:22:21
>> And let's talk uh about the fidelity of
00:22:25
these. Obviously, AI is a new
00:22:28
technology. It can make mistakes.
00:22:30
Anybody who uses it on a day-to-day
00:22:32
basis might experience a hallucination.
00:22:34
How confident are you in the AI Ukraine
00:22:37
and Russia conflict? They obviously are
00:22:40
not going to be as thoughtful maybe as
00:22:41
we are in putting this together. They're
00:22:43
in a hot war right now. But we as the
00:22:46
United States have to be very thoughtful
00:22:47
about this. So, how confident are you
00:22:49
that this isn't going to make a mistake?
00:22:50
I think that's the key to a lot of this
00:22:53
debate. And when will it be, you know,
00:22:55
perfect, defined as much better? And I
00:22:59
guess this dovetails with the
00:23:00
self-driving
00:23:02
you know thoughts it has to be a
00:23:04
magnitude better than a human so when
00:23:05
will this be a magnitude more accurate
00:23:07
than, you know, when we make a
00:23:10
mistake as a military and we kill a
00:23:12
civilian
00:23:12
>> yeah no it's a good question and um
00:23:16
I don't know when that moment hits that
00:23:18
FSD moment where it kind of gets
00:23:20
better. Certainly not there yet. And you
00:23:23
wouldn't want to take huge risk with
00:23:25
that in, like, you know, there's a
00:23:27
gradation of when you would use that and
00:23:28
what kind of risk you were trying to
00:23:30
take or not. If you were trying to take
00:23:32
out a drone using AI using a like a
00:23:35
laser or something you'd be pretty like
00:23:37
okay making mistakes because you just
00:23:38
missed the drone you know like whatever
00:23:41
with a laser, the laser goes
00:23:42
off it's all over. Um if you were doing
00:23:45
something more sophisticated in a
00:23:46
populated area, a densely populated
00:23:49
area you'd take less risk. So, we're
00:23:51
developing procedures, tactics for each
00:23:53
scenario. And this is part of the debate
00:23:55
I had with Anthropic, which is we need
00:23:58
AI for things like Golden Dome.
00:24:01
>> Chinese hypersonic missile comes up.
00:24:02
You've got 90 seconds before it
00:24:04
separates and all kinds of decoys and
00:24:07
you don't know where the actual payload
00:24:09
is and you want to hit it from
00:24:12
space. And a human doesn't have
00:24:14
the reaction time, may
00:24:17
not be able to discriminate with their
00:24:18
own eyes what they're going after.
00:24:21
That's a pretty, you know, low-risk thing
00:24:22
because it's in space and you're just
00:24:24
trying to hit something that's trying to
00:24:25
hit you. So, I think in the next 10
00:24:28
years, you're going to see a lot of
00:24:29
these applications develop AI to one
00:24:30
degree or another so long as we think
00:24:33
it's safe and it's not going to,
00:24:34
you know, make mistakes.
00:24:36
>> Before we get on to the Anthropic
00:24:38
discussion, and we really appreciate you
00:24:40
coming here and my lord, this has been
00:24:41
so informative. So thank you Emil for
00:24:43
coming here and explaining to the
00:24:44
American public and to us what you're
00:24:46
working on. It really makes us, uh, I
00:24:48
think I speak for everybody, really
00:24:50
confident in what you're doing and it's
00:24:51
so great that you've you know left the
00:24:53
private sector to do this.
00:24:55
>> What I would say just very quickly Emil
00:24:57
is I think that not enough people
00:24:58
understand that the American military
00:25:01
has had to fight with one hand tied
00:25:02
behind their back. Just that little
00:25:05
insight that you just gave about
00:25:06
Afghanistan
00:25:08
to me seems so scary because the men and
00:25:11
women that sign up for the American
00:25:13
military. They're doing this to fight on
00:25:16
behalf of this country. They deserve a
00:25:18
lot more than being sent there and all
00:25:20
of a sudden being given this rule book
00:25:22
and say, "Do your best." And it's like,
00:25:24
"Oh, wait. You violated 19 rules trying
00:25:26
to protect America. Do your job." That's
00:25:28
insane.
00:25:29
>> Let's just
00:25:30
>> It's really insane in some cases. My
00:25:33
belief is that's the frustration
00:25:36
those soldiers who were out there in
00:25:37
those wars had more than anything. There
00:25:40
was the broader frustration what are we
00:25:41
doing here and then the secondary
00:25:43
frustration is while I'm here why can't
00:25:45
I do my job?
00:25:47
>> Yeah. Is there is there much of a debate
00:25:49
internally and sorry Jake before we move
00:25:51
forward on this
00:25:52
>> regarding this idea of full autonomy in
00:25:56
military action? I don't want to speak
00:25:57
ahead to the anthropic point, but it was
00:26:00
something that the media seemed to say
00:26:02
was part of Dario's concern is that
00:26:05
when you press the button and hand over
00:26:06
complete autonomy and there's a kill
00:26:09
action that you're now giving to a robot
00:26:11
or to some autonomous system, do we then
00:26:13
kind of have a moral issue at hand? And
00:26:17
is that something that's kind of debated
00:26:19
or discussed? And is that the right way
00:26:20
to think about the framing of what goes
00:26:21
on?
00:26:22
>> I mean, we're not even close to there
00:26:24
yet, right? Where, like, the systems
00:26:26
are not, we wouldn't feel that a
00:26:29
system uh that would have sort of like
00:26:33
real risk for a civilian is ready to
00:26:36
launch yet. So we're not even debating
00:26:38
that. We're just trying to get basic
00:26:40
autonomy in drones, basic autonomy in
00:26:42
underwater unmanned vehicles, basic
00:26:45
autonomy that, you know, you've heard of
00:26:47
this collaborative aircraft that fly
00:26:49
along with the jet so that it has
00:26:51
more firepower, but it's still tethered
00:26:53
to what the jet does. So, we're
00:26:55
>> incredible. Yeah.
00:26:56
>> Yeah. So, we're just at the very
00:26:57
beginning of this stuff,
00:26:59
>> But Golden Dome's a good example of
00:27:01
like, yeah, who can oppose that? Like,
00:27:03
it's the only way to get at a threat
00:27:04
like that. Um, so who could oppose if
00:27:08
you have a military base and you have a
00:27:10
bunch of soldiers sleeping that you have
00:27:11
a laser that can take down drones
00:27:13
autonomously? So it's
00:27:16
pretty scenario by scenario, but
00:27:18
we're not having a lot of debate
00:27:19
because the Skynet thing is so not a
00:27:22
realistic thing at this moment, right?
00:27:25
Except, one thing I did tell the
00:27:27
Anthropic guys, I was like, you know, or
00:27:29
I'd tell any company, your models are
00:27:32
getting stolen by the Chinese. They're
00:27:34
going to un-guardrail them and use them
00:27:36
against us. And then you want our models
00:27:38
to be less capable against your models.
00:27:41
It's sort of
00:27:41
>> they're not going to be thoughtful. In
00:27:42
other words, they're going to go for it.
00:27:44
And you know, if we just benchmark this
00:27:46
against where we were at, you know,
00:27:49
10, 15 years ago, there was the WikiLeaks
00:27:53
"Collateral Murder" video, I think they
00:27:55
called it, where we tragically had an
00:27:58
Apache take out some journalists. And
00:28:00
this technology even applied today
00:28:03
probably would have avoided that in my
00:28:04
mind. Yeah. Like we have enough that
00:28:07
when you're targeting not drones but you
00:28:10
know people on the ground with an Apache
00:28:12
this would have probably avoided that.
00:28:14
>> Yeah. Or, you know, the Kuwaiti
00:28:17
aircraft hitting you know an American
00:28:19
aircraft making a mistake because it
00:28:21
doesn't have the identification. I mean
00:28:24
it's the same self-driving argument to a
00:28:26
degree. like self-driving could save
00:28:28
lives even though it's scary to look at
00:28:30
a car without a human behind the wheel
00:28:32
but there's tons of scenarios where it's
00:28:34
a way better safer option more precise
00:28:38
um than the alternative
00:28:39
>> all right before we move on to the Dario
00:28:41
thing and Anthropic and that brouhaha,
00:28:43
there was one piece that we haven't
00:28:44
addressed with this interaction, Friedberg,
00:28:46
Chamath, which is uh the Israeli
00:28:49
government and their desire to take out
00:28:54
this regime. And, according to Tucker
00:28:58
Carlson and a large contingent of the
00:29:01
MAGA base they feel that we are captured
00:29:04
by this group. Does Israel have too much
00:29:07
influence over the United States with
00:29:09
regard to these actions in the Middle
00:29:11
East? This is you know a big debate
00:29:13
within the party within the Republican
00:29:15
party, within the MAGA constituency. Hey,
00:29:17
we number one we don't want these wars.
00:29:19
Number two is Israel driving this thing
00:29:21
to the point of Rubio's quotes that hey,
00:29:23
we're doing this because Israel is going
00:29:25
anyway. I think we should address it
00:29:27
here. Not that I have a personal stake
00:29:28
in this. I'll give my personal opinion
00:29:29
at the end.
00:29:30
>> I don't think the president is captured
00:29:32
by Israel in the least. I think he
00:29:34
decides what is in the best interest of
00:29:36
the United States. And if
00:29:40
Israel can be a part of that, then
00:29:43
they're a part of it. And look, let's be
00:29:45
clear, they're incredibly capable. And
00:29:48
so in something like this, to be able to
00:29:51
incorporate the intelligence of Mossad,
00:29:54
what you're seeing today in this
00:29:55
Operation Epic Fury, we're four days in,
00:29:59
Iran has been 90% depleted of all of
00:30:01
their munitions. It looks like they're
00:30:02
just firing no more missiles out from
00:30:05
Iran to anywhere else. There's fleets of
00:30:08
drones and planes just waiting.
00:30:09
Everybody knew where the Iranians were.
00:30:12
It's great that when we make a decision
00:30:14
on something that we need to do, we can
00:30:16
rely on our allies. I think the opposite
00:30:18
question should also be asked like what
00:30:20
was the UK doing? Why is Spain
00:30:21
pontificating? Why was Europe taking the
00:30:23
weekend off before they could even issue
00:30:25
a statement? Why don't you ask that
00:30:27
question?
00:30:27
>> Yeah. No, it's an equally valid question.
00:30:30
You know, uh Freeberg, do you want to
00:30:32
get in on this or no?
00:30:34
>> No.
00:30:34
>> I'm a Jew. No one's going to care what I
00:30:35
have to say. They're either going
00:30:37
to be like totally
00:30:39
like, or they're going to say this guy's
00:30:40
a Jew. We shouldn't listen to him. So,
00:30:42
like let's move on. Go ahead.
00:30:43
>> Yeah. Yeah. Yeah.
00:30:44
>> Emil, any thoughts on this? I do want
00:30:47
to know from Emil though like you know
00:30:49
is this Iron Dome working this laser in
00:30:52
Israel system is this operational and if
00:30:54
so is there any success metrics you can
00:30:56
share around it
00:30:57
>> I mean I think that the golden, sorry,
00:31:00
the Iron Dome was the first
00:31:01
generation of the Israeli air
00:31:04
defense thing, and then they're
00:31:06
building Iron Beam, and I think it's
00:31:08
still earlyish but yeah it works they're
00:31:11
they're a technologically sophisticated
00:31:14
country that's very small that has like
00:31:17
a reason to invest in these things and
00:31:19
they have a lot of smart people to do
00:31:21
them. So I think I think it's good. But
00:31:23
>> does it primarily work on rockets? And I
00:31:25
guess I just want to understand the
00:31:26
logical evolution of this because in the
00:31:28
80s and 90s there was a lot of
00:31:30
conversation about space-based lasers
00:31:32
that could shoot ICBMs out of the sky to
00:31:34
avoid, you know, global nuclear war and
00:31:38
we could always take out every nuclear
00:31:39
warhead delivered on an ICBM. Is that
00:31:42
technology feasible? Is there a place in
00:31:44
the near future where we could see
00:31:46
basically maximal global deterrence
00:31:48
using these systems, either ground-based
00:31:50
or space-based to take out hypersonic
00:31:52
missiles?
00:31:53
>> I think the harder but more
00:31:57
valuable problem to solve would be the
00:31:58
space-based way of doing it because then
00:32:01
um you could get at any kind of almost
00:32:04
any kind of threat that hits space, but
00:32:06
you still need a ground layer because
00:32:08
there's cruise missiles that could come
00:32:10
at you, there's drones and so on. So
00:32:12
we call them multi-layers: like how do
00:32:14
you get every kind of weapon
00:32:16
at every layer
00:32:18
but you know directed energy lasers as
00:32:22
they get more powerful you could take on
00:32:24
a bigger weapon farther away right so
00:32:26
those technologies,
00:32:30
as they improve, it gets more and more
00:32:32
capable and I think all these defense
00:32:34
systems uh are going to get more and
00:32:36
more capable to get more and more of a
00:32:38
variety of weapons at farther
00:32:40
standoff, which is what you want.
00:32:42
You don't want to shoot it when it's
00:32:43
right over Tel Aviv. You want to shoot
00:32:45
it, you know, when it's still over
00:32:47
their land ideally.
00:32:48
>> Are the laser interceptors in the field
00:32:51
today? There's reports that they are.
00:32:53
>> I think there's some, I think they've
00:32:55
demonstrated some of them.
00:32:56
>> Got it. And and is this our technology
00:32:58
or Israel's technology? Because
00:32:59
President Trump said, "Hey, that's
00:33:02
actually our technology." Is there any
00:33:04
insight there?
00:33:04
>> We we have collaborations with Israel on
00:33:07
some of this stuff. They have their own.
00:33:09
We have our own. Um, so it's not,
00:33:12
uh, but they're good at tech, we're good
00:33:16
at tech. There's certain
00:33:17
ways you get part of our system and part
00:33:19
of their system because it's like,
00:33:21
it's a quickly evolving part of
00:33:24
science right now. How do you cohere
00:33:26
beams of light to like get distance? How
00:33:29
do you use high-powered microwave to like
00:33:31
just drop drones in their tracks? Um
00:33:34
there's lots of different ways to get at
00:33:35
some of these things. Um and yeah, a
00:33:38
lot of it's ours, uh, and
00:33:40
some of it's theirs.
00:33:41
>> Yeah. And uh to the earlier question,
00:33:44
you know, I I am pro-regime change if it
00:33:47
can be done thoughtfully and obviously
00:33:50
isolating a dictator, that's the best
00:33:52
thing you can do. We've done that
00:33:53
successfully with, you know, Putin, Kim
00:33:56
Jong-un, etc. Keep diplomacy up. But if
00:33:59
there is a moment in time where you
00:34:01
could free the people of Iran after 50
00:34:04
years of being subjugated by these
00:34:07
lunatics and dictators, I'm all for it.
00:34:09
And I actually trust President Trump to
00:34:11
make that decision. I know this may
00:34:13
sound crazy. People think like I'm a
00:34:15
libtard or something because of the way
00:34:16
my besties frame me on this program,
00:34:18
which is completely inaccurate. I'm an
00:34:20
independent.
00:34:20
>> You are.
00:34:21
>> I actually I'm completely independent
00:34:24
and I am just based on my voting and I'm
00:34:26
not on either one of these sides. I am
00:34:28
pro President Trump and I trust his
00:34:30
judgment. I think he has more
00:34:31
information than us. I think you have
00:34:32
more information. I actually trust you
00:34:34
guys to do it thoughtfully and there
00:34:35
obviously was a window here. Israel can
00:34:37
have their own, you know, motivation.
00:34:39
That could be the China motivation, but
00:34:40
there's also spreading democracy, which
00:34:42
might be the least of people's concerns
00:34:45
here, but that's on the top of my list.
00:34:47
I would like to see the people of Iran
00:34:49
free. Just to build on your point,
00:34:50
Jason, the thing that Emil said before,
00:34:52
which I think is important as well, is
00:34:54
we have an enormous amount of learnings
00:34:56
about what happened in Iraq. Yeah. We
00:34:58
also have a ton of learnings between the
00:35:00
Iran Iraq war and a ton of learnings in
00:35:02
'53 when the US and the British deposed
00:35:06
Mossadegh, or at least fomented that, and
00:35:08
put in the Shah, and then the Shah was booted
00:35:10
out.
00:35:11
>> Yeah.
00:35:12
>> If you take those three chapters in
00:35:13
Iranian history or that regional
00:35:15
history, there's a ton to learn. And to
00:35:19
your point, there is a way to affect
00:35:21
what we need to do without creating some
00:35:25
20-year forever war. There was an
00:35:27
incredible tweet. I don't know if you
00:35:28
guys saw this. Somebody said, "So every
00:35:31
war doesn't have to be
00:35:34
three decades and trillions of dollars
00:35:35
to your friends in Virginia, Maryland,
00:35:37
and DC. Did you guys see that tweet?"
00:35:39
It's true. These things can be one and
00:35:42
done in and out.
00:35:44
And if President Trump succeeds here, I
00:35:46
just want to also give him some flowers
00:35:48
here. The people of Venezuela and the
00:35:51
people of Iran being free represent
00:35:53
about 5% of the people in the world
00:35:54
living under an autocracy, under a
00:35:57
dictator. If those both flip back to
00:36:00
democracies, he'll have done more for
00:36:02
the spread of democracy than any
00:36:03
president for many decades, perhaps in
00:36:06
our lifetime. This would be incredibly
00:36:08
noble. Incredibly noble, incredibly
00:36:10
just. And
00:36:11
>> Would you, in the human rights set,
00:36:14
want him to get the Nobel then?
00:36:16
>> Absolutely. Give him all the Nobels.
00:36:18
Like literally if you can free people,
00:36:21
>> all of them. Give him every prize. Give
00:36:23
him an Oscar.
00:36:24
>> Physics, chemistry,
00:36:27
he can have everything.
00:36:29
>> Physics, philosophy.
00:36:30
>> Jay Cal's an independent. When's the
00:36:32
last time you voted for a Republican
00:36:33
presidential candidate? Just curious.
00:36:36
>> Um,
00:36:37
yeah.
00:36:38
>> Say it. Say it.
00:36:39
>> No, no, no, no. Um,
00:36:41
>> Mondale.
00:36:42
>> No, no, I didn't. I would have voted
00:36:44
for, if I was of age, I would have voted
00:36:46
for um I wouldn't have voted for the
00:36:47
Bushes. I voted for the moderates. Um,
00:36:52
uh, uh, obviously Clinton and Obama.
00:36:54
>> Oh, we're playing the would have, should
00:36:55
have game.
00:36:56
>> I would have voted for Reagan in
00:36:58
>> I would have bought Nvidia at $4.
00:37:00
>> Well, no. And I didn't vote for Kamala,
00:37:02
so I'll leave it at that. But I voted
00:37:04
probably
00:37:05
at that. Why don't you say that you
00:37:06
voted for President Trump?
00:37:07
>> Just say you voted for President Trump.
00:37:09
I don't want to complicate things,
00:37:10
>> but you did. So, just say it.
00:37:12
>> I didn't vote for Kamala. I'll leave it at
00:37:13
that. All right.
00:37:14
>> It's so weird that you that you'll say
00:37:16
you're a moderate, but you won't say
00:37:17
that you voted for President Trump.
00:37:18
>> I am supporting President Trump in about
00:37:20
60 70% of what he does. Uh, let's leave
00:37:22
it at that. Three, two. All right. Let's
00:37:24
talk economic
00:37:26
impact of oil and insurance. Oil
00:37:28
rose to $84 a barrel Wednesday.
00:37:30
Strait traffic is
00:37:33
basically at a standstill at this point.
00:37:35
Here's the clip. You can see the traffic
00:37:37
slowing down and then hey, some of the
00:37:39
dots are even going away. That could be
00:37:41
uh ships being taken out.
00:37:45
Unless the strait opens, 3.3 million
00:37:47
barrels of daily production would be
00:37:49
lost early next week. And then there's
00:37:53
insurance companies. They've all
00:37:54
cancelled
00:37:56
the war risk coverage of vessels in the
00:37:59
Gulf effective March 5th. Supertanker
00:38:01
traffic dropped 94% within the first 48
00:38:04
hours. Trump said the US would provide
00:38:06
political risk insurance for all
00:38:08
maritime trade through the Gulf,
00:38:10
especially energy. Friedberg, your
00:38:12
thoughts on the economic second order
00:38:14
effects that we're starting to
00:38:16
experience here and over the next four
00:38:18
weeks could be um you know intense and
00:38:21
acute.
00:38:21
>> The modern insurance market emerged
00:38:24
specifically to solve the risks of
00:38:26
maritime trade. So in the 17th century,
00:38:29
Lloyd's of London, which was a coffee
00:38:30
shop in London, where all the maritime
00:38:33
traders would get together and they talk
00:38:34
about, hey, what's the safest route so
00:38:36
pirates don't get our ship and so you
00:38:38
don't run into weather. That's where
00:38:39
they would kind of have these
00:38:40
conversations and eventually they
00:38:41
started underwriting the risks of the
00:38:43
shipping uh routes and giving each other
00:38:46
guarantees. They said, "Hey, if you make
00:38:48
this route, great. You pay me a certain
00:38:50
amount and if you don't make the route,
00:38:52
I'll pay you the lost value." And that's
00:38:54
how Lloyd's of London, which is the kind
00:38:56
of world's biggest reinsurance market,
00:38:58
started. Today, Lloyd's of London has 78
00:39:00
what are called syndicate members. And
00:39:02
these are kind of these pools of
00:39:03
reinsurance that underwrite big crazy
00:39:05
risks like maritime insurance for folks
00:39:08
that are moving oil tankers through the
00:39:10
Strait of Hormuz, which the IRGC just
00:39:13
announced they're shutting down. When
00:39:15
the IRGC announced that they were
00:39:17
shutting down the Strait of Hormuz,
00:39:19
there's a significant risk of all the
00:39:20
mines going in the straight and the
00:39:23
ships getting attacked and blown up. So
00:39:24
loss of value. The insurance premium
00:39:27
spiked initially from a quarter percent, so
00:39:29
0.25% of the value of the ship to 1.25%.
00:39:34
So it went up by like 5x and so folks
00:39:36
had to pay a lot more of the value of
00:39:37
their ship in order to continue the
00:39:39
routes and get guarantees that they'll
00:39:41
make it through. And then all of the
00:39:43
markets started to shut down. So once
00:39:45
the conflict got heavier, everyone said,
00:39:47
"Let's shut this thing down." And that's
00:39:48
obviously a massive risk to energy
00:39:50
prices globally, which drives inflation
00:39:52
and puts US economic security at risk.
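To put Friedberg's premium figures in concrete terms, here is the arithmetic on a hypothetical $100M tanker; the hull value is an illustrative assumption, while the 0.25% and 1.25% rates are the figures quoted above.

```python
# Worked arithmetic on the war-risk premium spike described above.
# The 0.25% -> 1.25% rates are the quoted figures; the $100M hull
# value is a hypothetical example for illustration.
ship_value = 100_000_000  # hypothetical tanker hull value, USD

premium_before = 0.0025 * ship_value  # 0.25% per transit -> $250,000
premium_after  = 0.0125 * ship_value  # 1.25% per transit -> $1,250,000

print(f"Before: ${premium_before:,.0f}")
print(f"After:  ${premium_after:,.0f}")
print(f"Multiple: {premium_after / premium_before:.0f}x")  # 5x, as stated
```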
00:39:55
And so this is a brilliant move. I would
00:39:57
say the US government stepped in with
00:39:59
the US International Development Finance
00:40:01
Corporation, which was actually funny
00:40:03
enough started a couple of years ago
00:40:05
like in 2019 or something like that as a
00:40:08
kind of output of one of the agencies
00:40:10
that provided credit from USAID, the much
00:40:13
talked about USAID. And so they're
00:40:15
leveraging the credit capacity
00:40:18
of this old USAID agency to go out and
00:40:21
say to all the shipping companies, hey,
00:40:23
we'll give you insurance on your routes.
00:40:26
And the reason they need it is the
00:40:27
shipping companies are levered. They
00:40:30
take on debt to buy the ships. And the
00:40:32
lenders require that they have insurance
00:40:35
or else they're not allowed to take the
00:40:37
routes because the lenders are
00:40:38
ultimately going to be out the money.
00:40:39
And so the shipping companies themselves
00:40:41
need to have insurance. And so this
00:40:44
fills in for a market that has now gone
00:40:46
away. Very smart. And ultimately a lot
00:40:48
of people are saying this could actually
00:40:50
reshore or onshore maritime insurance
00:40:53
back to the United States and create an
00:40:55
entirely new insurance industry here in
00:40:58
the US that has historically been served
00:41:00
almost exclusively by European
00:41:02
syndicates and European partners. And it
00:41:04
actually creates a big economic
00:41:06
opportunity as this war dies down for
00:41:08
American insurance companies and
00:41:10
American brokers to basically be the
00:41:12
underwriters and the guarantors of this
00:41:14
sort of insurance and create a new
00:41:15
industry. So that's super super
00:41:17
interesting kind of side story on what's
00:41:18
going on here. All right, some breaking
00:41:20
news here folks via Bloomberg. The
00:41:22
Pentagon has formally notified Anthropic
00:41:25
that it's been deemed a supply chain
00:41:27
risk. This has never happened to an
00:41:29
American company. It has happened to
00:41:31
Russian companies. uh and Chinese
00:41:34
companies Huawei and for background the
00:41:37
department of war cancelled Anthropic
00:41:40
$200 million contract on Friday and said
00:41:43
they would do this. The dispute came
00:41:44
down to two clauses according to sources
00:41:46
and we have one of the principals here.
00:41:49
So we will hear directly from him in a
00:41:51
moment. Anthropic had two concerns.
00:41:54
Number one, fully autonomous weapons aka
00:41:56
murderbots. As we previously discussed,
00:41:59
Dario didn't feel that their technology
00:42:01
was reliable yet and wanted some
00:42:04
assurances. The second thing Anthropic
00:42:06
said was uh they were concerned about
00:42:09
mass surveillance of Americans because
00:42:11
they believe this technology is uniquely
00:42:14
powerful and uh it can do things
00:42:19
beyond what a series of webcams or a
00:42:21
network of 7-Eleven cameras can do.
00:42:25
Pentagon said they wanted all lawful
00:42:28
use. Dario, you're welcome to come on
00:42:30
the program next week or any time to
00:42:32
give your side of the story, but this
00:42:33
week we have Emil. Emil, your thoughts
00:42:36
and explain to us what happened here and
00:42:38
how this broke down.
00:42:40
>> It's worth a little history, short
00:42:41
history. So,
00:42:44
if you remember the Biden executive
00:42:46
order on AI, which was this crazy
00:42:49
executive order that limited the amount
00:42:51
of compute any model company could do
00:42:54
and essentially grandfathered in a
00:42:57
small number of AI companies that
00:42:59
they were going to designate the winners
00:43:01
and everyone else was out so they could
00:43:03
have more control on what they did. Um,
00:43:06
Anthropic was one of those winners. Um,
00:43:08
and then they were smart. It was a good
00:43:11
sales strategy to sell in to the most
00:43:14
sensitive parts of the US government,
00:43:16
like all of our combatant commands:
00:43:19
Central Command, that's doing the Iran
00:43:21
fight now, the INDOPACOM command, which is
00:43:24
sort of responsible for China, several of
00:43:26
the intelligence agencies. And they did
00:43:28
forward-deployed engineers, Palantir
00:43:30
style, so they got very sticky um
00:43:33
to the workflows and all that and so I
00:43:36
came in and I got the AI portfolio for
00:43:39
the department in August and I said, I just
00:43:42
want to see the contracts. You know, the
00:43:44
old lawyer in me and I looked at
00:43:46
contracts. I was like, holy cow, they
00:43:48
say you can't use them for, you can't use
00:43:50
them to plan a kinetic strike. You
00:43:53
can't use their AI model to move a
00:43:55
satellite. You can't... There was a 20
00:43:58
page.
00:43:58
>> You can't do a war game scenario with
00:44:00
it.
00:44:00
>> You could do a scenario, but you can't
00:44:02
like let's suppose you're writing a plan
00:44:04
saying like if this happens, here's what
00:44:06
we would do. And it might involve a
00:44:08
kinetic strike which causes harm to a
00:44:10
human. So like, well, what do you think
00:44:12
these folks do? You know, we're the
00:44:14
Department of War. This is what we do.
00:44:16
And so I said, okay, well, I've got to
00:44:18
number one, have direct relationships
00:44:20
with these companies, not just through
00:44:21
Palantir, because I want to use it more
00:44:23
broadly. And then number two, I need to
00:44:26
have the terms of service be rational
00:44:28
relative to our mission set. So we
00:44:30
started these negotiations, and it took
00:44:33
three months and I had to sort of give
00:44:37
them scenarios about like this Chinese
00:44:39
hypersonic missile example. They're
00:44:40
like, "Okay, we'll give you an exception
00:44:42
for that." Well, how about this drone
00:44:44
swarm? We'll give you an exception for
00:44:45
that. And I was like, "The exceptions
00:44:47
doesn't work. I I can't predict for the
00:44:50
next 20 years what all the things we
00:44:51
might do use AI for." Um, and so so all
00:44:55
lawful use seems like a good thing. If
00:44:58
Congress wants to act, great. We have
00:44:59
our own internal policies and we'll
00:45:02
follow them. We're not knuckle-draggers
00:45:04
here. We don't want to hurt
00:45:06
people unnecessarily. So, you know, it's
00:45:08
our province to decide how we fight and
00:45:10
win wars um so long as they're lawful.
00:45:14
And I think at some point it turned into
00:45:17
a PR game for them because they were not
00:45:21
going to win this intellectual battle of
00:45:25
well, we're going to stop you. we're
00:45:26
going to use our judgment because we
00:45:28
think Congress is behind and impose it
00:45:30
on the US military. And it became this
00:45:33
like let's find the issues that are most
00:45:35
inflammatory, robot weapons and mass
00:45:39
surveillance. I mean, like we're the
00:45:40
Department of War. We're not the FBI.
00:45:41
We're not Homeland Security. We're not
00:45:43
>> You're not allowed to legally spy on
00:45:45
Americans.
00:45:46
>> Yeah, you're not. So
00:45:48
you're like, and then what it came
00:45:51
down to on that issue just as an
00:45:52
anecdote is they didn't want us to bulk
00:45:56
collect public information on people
00:45:59
using their AI system and they wrote it
00:46:02
in a way that I was like, so you're
00:46:03
telling me, before we even get to bulk collection,
00:46:05
if someone types in, you know, Chamath's
00:46:08
LinkedIn, and I'm using publicly available
00:46:12
information, that I would be violating
00:46:14
your terms of service? Like, yeah. Well,
00:46:15
okay, let's rewrite it. It was months of
00:46:17
this stuff. Um, which was
00:46:21
sort of interminable and then the
00:46:23
trigger point was after the Maduro raid,
00:46:27
one of their execs called Palantir, who
00:46:29
we buy their software through, and asked them
00:46:32
uh was our software used in that raid
00:46:35
which is by the way classified
00:46:36
information anyway. So they're trying to
00:46:38
get classified information and implying
00:46:41
that if it was used in that raid
00:46:43
that might violate their terms of
00:46:44
service. So they wanted to enforce, this
00:46:47
is very important here. Yeah.
00:46:48
>> They wanted to enforce their terms of
00:46:50
service. They went behind your back to
00:46:52
try to collect information to then
00:46:56
>> maybe pull your license for their
00:46:58
technology.
00:46:59
>> But you know, it wasn't behind my
00:47:01
back. I don't want to accuse them of
00:47:02
that. Palantir is the prime contractor,
00:47:04
Anthropic's the sub. Um, but it raised enough alarm
00:47:07
with Palantir, who's got a trusted
00:47:09
relationship with the department to tell
00:47:10
me and I'm like, "Holy, what if this
00:47:14
software went down, some guard rail
00:47:16
kicked up, some refusal happened for the
00:47:19
next fight like this one and we left our
00:47:22
people at risk. And so I went to
00:47:24
Secretary Hegseth and said this could happen.
00:47:26
And that was like a whoa moment for the
00:47:28
whole leadership at the Pentagon that
00:47:30
we're potentially so dependent on a
00:47:32
software provider without another
00:47:34
alternative that has the right or
00:47:36
ability to not only shut it off,
00:47:38
maybe it's a rogue developer who could
00:47:40
poison the model to make it not do what
00:47:42
you want uh at the time, or sort of trick
00:47:44
you, because you have to trust it. I mean,
00:47:46
all these things that we know
00:47:48
about models, hallucinating purposefully
00:47:50
or not following instructions, like
00:47:53
some insider threat stuff. So then that
00:47:56
culminated in the Tuesday kind of
00:47:58
dramatic meeting with
00:48:00
Secretary Hegseth and me and Dario
00:48:03
um with the Friday deadline that that
00:48:06
got blown and I never thought they
00:48:07
really wanted to make it. >> Emil, is the
00:48:10
model entirely hosted by Anthropic or
00:48:13
just explain to us technically does this
00:48:15
sit in a cloud that Palantir runs for
00:48:18
you guys? Um is there really technically
00:48:20
a way that employees at Anthropic could
00:48:23
kind of interfere or intervene in the use
00:48:25
of the model?
00:48:26
>> Yeah. So they put their model in AWS
00:48:30
GovCloud
00:48:32
>> GovCloud. Yeah.
00:48:32
>> And then Palantir serves it from there
00:48:35
and they refresh it. They held the
00:48:38
control plane for the model. So, so
00:48:41
yeah,
00:48:41
>> they can change the model weights if
00:48:43
they want. They can do whatever they
00:48:44
want.
00:48:45
>> Yeah.
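A note for the technically curious: "holding the control plane" means the vendor, not the customer, decides what artifact actually gets served. A minimal sketch of one customer-side safeguard, assuming file-based weights, is pinning the artifact to a known digest so a silent swap fails loudly; the path and pinned hash below are illustrative placeholders, not the actual DoW or Palantir setup.

```python
# Sketch: pin a model artifact to a known SHA-256 so a silent weight swap
# by whoever holds the control plane is at least detectable at load time.
# The pinned digest here is a placeholder (it is the empty-input hash).
import hashlib
from pathlib import Path

PINNED_SHA256 = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
            h.update(chunk)
    return h.hexdigest()

def load_weights(path: Path) -> bytes:
    digest = sha256_of(path)
    if digest != PINNED_SHA256:
        raise RuntimeError(f"model artifact changed unexpectedly: {digest}")
    return path.read_bytes()
```

This only detects a swapped artifact; it does nothing about a model that was poisoned before the hash was pinned, which is the deeper insider-threat worry Emil raises.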
00:48:45
>> The insight into this thing is
00:48:46
unbelievable. Not just governments, but
00:48:50
now if you're running a company, the
00:48:51
reality is that what Anthropic showed,
00:48:55
which by the way is their right at some
00:48:57
level, is that they are going to have a
00:49:01
political perspective and a set of terms
00:49:04
that reflect their philosophy and that
00:49:07
that philosophy can change on a dime.
00:49:09
But what the government did was also
00:49:11
completely reasonable, which is we can't
00:49:13
rely on you if you're going to be
00:49:16
completely unreliable and
00:49:20
disallow things that are reasonable.
00:49:23
I'll give you a different example to
00:49:24
make the point.
00:49:26
There's a state that wants to run some
00:49:28
healthcare program, but they're a
00:49:30
prolife state.
00:49:33
You can't conduct abortions in that
00:49:34
state. Does that mean that the anthropic
00:49:37
engineers can decide, you know what,
00:49:39
we're pro-choice, so we're going to
00:49:41
change the access model and the
00:49:42
capability of that model inside of that
00:49:44
state. Is that allowed? Should that be
00:49:47
allowed? At one level, you'd say, "This
00:49:50
is a private company. They're allowed to
00:49:51
choose." But what that really means is
00:49:53
for the government, for all the states,
00:49:55
for any city, for every company, you
00:49:57
cannot choose to only use one of these
00:50:00
things because it is just a matter of
00:50:02
time until some person inside of one of
00:50:05
these companies goes on some lunatic
00:50:07
moral tirade and then jeopardizes your
00:50:10
business against something that is
00:50:13
nothing about law, but is everything
00:50:14
about subjectivity. That is the huge
00:50:17
thing that this thing tore open this
00:50:19
weekend. So if you're not figuring out
00:50:21
how to be multi-model and agnostic across
00:50:23
these models, you're taking on enormous
00:50:25
business risk after Friday because you
00:50:27
can't tolerate that these folks will do
00:50:29
that. It's too critical of a technology.
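What "multi-model and agnostic" could look like in practice is a thin routing layer over interchangeable providers. A minimal sketch, assuming a hypothetical complete() interface rather than any vendor's real SDK:

```python
# Sketch of a model-agnostic router with failover. Provider names and the
# `complete` interface are hypothetical, not any vendor's actual SDK.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Provider:
    name: str
    complete: Callable[[str], str]  # prompt -> completion

class ModelRouter:
    """Try providers in priority order; fail over on refusal or outage."""
    def __init__(self, providers: list[Provider]):
        self.providers = providers

    def complete(self, prompt: str) -> str:
        errors = []
        for p in self.providers:
            try:
                return p.complete(prompt)
            except RuntimeError as e:  # refusal, outage, pulled license...
                errors.append(f"{p.name}: {e}")
        raise RuntimeError("all providers failed: " + "; ".join(errors))

def main_model(prompt: str) -> str:
    # Simulate a vendor guardrail refusing mid-mission.
    raise RuntimeError("request violates terms of service")

def backup_model(prompt: str) -> str:
    return f"[backup model completion for] {prompt}"

router = ModelRouter([Provider("main", main_model),
                      Provider("backup", backup_model)])
print(router.complete("summarize the logistics report"))  # served by backup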
00:50:32
By the way, this is deplatforming all
00:50:34
over again. Remember what happened when
00:50:36
you didn't like what was said? Now all
00:50:38
of a sudden you were deplatformed. This
00:50:40
is that times a thousand because this is
00:50:42
not about posting on social media. This
00:50:45
is about using fundamental technology to
00:50:47
either advantage or disadvantage your
00:50:48
business. Emil.
00:50:50
>> Yeah. I mean, I think I described it the
00:50:52
other way the other day as these c the
00:50:56
leaders of these companies say they're
00:50:58
going to cause 50% white collar
00:51:00
unemployment. This is as powerful as a
00:51:02
nuclear bomb. It's like 50,000
00:51:05
geniuses in a data center. So you could
00:51:07
have a small country coerce the world
00:51:09
into its whatever. So you're like, "Holy
00:51:10
cow." All right. So this is a general
00:51:12
substrate of intelligence of technology
00:51:15
that's applicable to a lot of things.
00:51:17
Very generalized. It's not like Workday,
00:51:20
like HR software where we could just use a
00:51:22
competitor. This is going to be part of
00:51:24
our everyday life in so many different
00:51:25
ways, and controlling
00:51:29
whether it has a moral conscience. I
00:51:30
mean Anthropic has its own constitution.
00:51:33
It has its own soul. It's not the US
00:51:36
constitution. So you're subject to that
00:51:39
plus whatever whims and how that
00:51:40
changes. And that's a scary thought for
00:51:43
Americans generally. Um, and I think
00:51:46
that did come through a little bit
00:51:47
today. And in the coming years it's
00:51:49
going to be a bigger and bigger deal.
00:51:51
>> So take us through OpenAI software,
00:51:54
Gemini software, and Grok software. Have
00:51:58
they pushed back on any use or are they
00:52:01
like Dell or Apple? They sell you a
00:52:03
computer and you have the computer and
00:52:05
you can use it as you will. Have any of
00:52:08
those given you any push back?
00:52:10
>> So Grok's all in for all lawful use cases
00:52:12
across all classified and unclassified
00:52:14
networks, as you'd expect, because, you
00:52:16
know, Elon's truth-seeking. We want truth
00:52:19
in the Department of War. We don't want
00:52:20
ideology
00:52:21
>> because ideology will mess with
00:52:23
operational decisions like you you don't
00:52:25
want anything to be fake or
00:52:27
tilted.
00:52:28
>> We're surging Google and
00:52:30
>> we have Google for all
00:52:32
lawful use cases on unclassified
00:52:34
networks and we're trying to move them
00:52:36
to classified networks. They just
00:52:38
have to build out infrastructure because
00:52:39
the stuff's complicated.
00:52:41
>> So they're in compliance in terms of
00:52:43
what you're looking for as a partner.
00:52:44
And then
00:52:45
>> I guess the last one is OpenAI and Sam
00:52:47
seems to be
00:52:48
>> just characteristically playing both
00:52:50
sides a bit trying
00:52:53
to his credit
00:52:56
um I called him and said I need a
00:52:58
solution if this thing goes sideways. I
00:53:01
need multiple solutions. I'd like you to
00:53:03
be one of them. And he's like okay well
00:53:05
what can I do for the country? I'm like,
00:53:07
I need to get you up and running as soon as
00:53:09
I can. And he was trying to
00:53:12
protect Anthropic, to his credit. He was
00:53:14
like, don't call them a supply
00:53:16
chain risk. That's bad for the industry.
00:53:18
Maybe I can negotiate terms that
00:53:20
they'll find acceptable. Uh but he's in
00:53:23
the middle because they compete
00:53:24
for the same researchers. So, a lot of
00:53:27
this comes down to this thousand
00:53:29
researchers like baseball players that
00:53:32
get traded between these companies.
00:53:33
>> Moneyball. Yeah. These are the best of
00:53:35
the best. It's a very moneyballish sort
00:53:36
of thing and there's not that many of
00:53:38
them and you lose
00:53:40
>> 20% of them and all of a sudden, you know,
00:53:42
they launched Claude Code before you
00:53:44
launched Codex or something like that
00:53:46
and then the numbers changed pretty
00:53:47
dramatically. So, he was being a real
00:53:49
patriot um to his credit and trying to
00:53:52
you know, help Anthropic while they
00:53:54
were trashing him and recruiting from
00:53:56
his company. And I am not biased. I
00:53:58
just I want all of them. I want to give
00:54:00
them all the same exact terms uh because
00:54:03
I need redundancy. I want to see if they
00:54:06
diverge or
00:54:08
converge; if they converge, maybe I only need two over time,
00:54:10
but we don't know it's too early
00:54:11
>> but why why keep them in the mix? So if
00:54:14
there's clearly like a difference of
00:54:16
operations and philosophy and how they
00:54:19
want to run their business and there's
00:54:21
other models, is their model
00:54:23
particularly good at particular
00:54:25
applications that make it important to
00:54:28
keep it in the mix given that there are
00:54:30
three or four other kind of alternatives
00:54:31
here.
00:54:32
>> Anthropic you mean?
00:54:33
>> Yeah. Oh well because the number
00:54:35
one reason we were having this
00:54:37
conversation at all was because they
00:54:38
were deeply embedded. So now I have to
00:54:40
disentangle them, and the other companies
00:54:44
have not gone as heavy on
00:54:46
enterprise sales, forward-deployed
00:54:48
engineers, government business. So
00:54:51
they have to catch up, not
00:54:52
necessarily on the capability of the model,
00:54:54
but just how do you serve the government
00:54:56
>> They're just way ahead on that
00:54:58
>> right
00:54:59
>> but the models themselves you don't
00:55:00
think are uniquely advantaged or do you
00:55:01
have a view on that at this point? I
00:55:03
don't have a view on that. I don't
00:55:04
think they're, you know, I mean,
00:55:06
certainly Claude Code was innovative
00:55:09
and ahead. That's true. Um, but do I
00:55:12
believe in 12 months Codex is not going
00:55:15
to be close? I think it will be.
00:55:17
>> I think you're right. There's an
00:55:18
asymptoting that's happening. If you just
00:55:20
look at the like the confidence interval
00:55:22
on how overperforming or
00:55:24
underperforming some of the leading
00:55:25
models are, the error bars are
00:55:27
shrinking, right? the confidence
00:55:28
intervals like these things are all kind
00:55:30
of becoming the same. Eventually they're
00:55:33
all getting access to enough power,
00:55:35
enough compute; they're generating
00:55:38
similar results, it turns out, which I
00:55:39
think you would expect. So it's even more
00:55:42
important that you have a complexion of
00:55:44
models.
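The error-bar point can be made concrete: with pass rate p on n benchmark tasks, a normal-approximation 95% interval is p plus or minus 1.96 times sqrt(p(1-p)/n), and once the leaders' intervals overlap you can't statistically call a winner. A sketch with made-up scores:

```python
# Sketch: why "shrinking error bars" suggests convergence. The scores and
# task count below are invented purely for illustration.
import math

def ci95(p: float, n: int) -> tuple[float, float]:
    half = 1.96 * math.sqrt(p * (1 - p) / n)
    return (p - half, p + half)

scores = {"model_a": 0.84, "model_b": 0.82, "model_c": 0.85}
n_tasks = 500
for name, p in scores.items():
    lo, hi = ci95(p, n_tasks)
    print(f"{name}: {p:.2f} (95% CI {lo:.3f}-{hi:.3f})")
# With n=500 the half-width is about 0.033, so all three intervals overlap:
# statistically you can't call a winner, i.e. the models look "the same".
```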
00:55:46
The other thing, Emil, I don't know if you saw this, but they posted about
00:55:49
the revenue ramp of Anthropic,
00:55:52
and, well, I have a small software company
00:55:55
called 8090, and I asked the team, let's
00:55:57
go look at our opex. I posted it because
00:56:00
I was so shocked at these numbers. Our
00:56:01
costs have more than tripled since
00:56:04
November of '25. Between the inference
00:56:07
cost that we pay AWS, which is
00:56:10
ginormous, between our cost with Cursor,
00:56:13
between Anthropic, we are just spending
00:56:15
millions.
00:56:17
>> So now more per unit and more in
00:56:19
aggregate
00:56:20
>> both. But the problem is that my costs
00:56:23
are going up 3x every three months. My
00:56:25
revenues are not
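The arithmetic behind that complaint is worth seeing: 3x per quarter compounds to 81x in a year. A back-of-the-envelope sketch with invented starting indices and an assumed 30%-per-quarter revenue ramp, not 8090's actual financials:

```python
# Back-of-the-envelope: "costs up 3x every three months" compounded for a
# year against a hypothetical revenue line growing 30% per quarter.
cost = 1.0      # opex index at the start (illustrative)
revenue = 4.0   # revenue index, assumed to start comfortably above cost
for quarter in range(1, 5):
    cost *= 3.0
    revenue *= 1.3
    print(f"Q{quarter}: cost={cost:6.1f}x  revenue={revenue:5.2f}x")
# Cost index ends at 81.0 while revenue only reaches ~11.4, and the lines
# cross during Q2: compounding 3x/quarter overruns any normal revenue ramp.
```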
00:56:28
>> token use is very addicting.
00:56:30
>> Yeah. And by the way, because everybody
00:56:32
has gotten infatuated with what we call
00:56:33
these Ralph Wiggum loops, like just like
00:56:35
send the thing off and like it'll just
00:56:37
go figure something out. A, it never
00:56:39
figures anything out, and B, you just
00:56:41
get this ginormous bill from Cursor. So,
00:56:43
one of the things we had to do was just,
00:56:44
we had to say, guys, you got to
00:56:46
deprecate Cursor because you're just
00:56:47
wrapping Claude Code and charging us way
00:56:49
too much for these tokens. But I don't
00:56:51
know if you're seeing any of this thing
00:56:53
where like the tool usage, it's so great
00:56:55
to use these tools. Let's be honest,
00:56:56
it's super fun. It's like you feel like
00:56:59
a genius,
00:57:00
>> but then the ROI of these tools are
00:57:02
really important. I'm not sure that
00:57:03
that's as much of an issue for you or
00:57:05
not.
00:57:08
>> It will be. It will be
00:57:11
>> for sure.
00:57:12
>> As people find more and more use cases,
00:57:14
the use cases get more sophisticated. So
00:57:16
the next marginal thing you have it do
00:57:19
is likely to be harder and therefore be
00:57:21
more consumptive. Right.
00:57:23
>> Right. Right.
00:57:24
>> Let me just ask Emil, the important
00:57:26
question that I think triggered a lot of
00:57:27
the news this week is why then designate
00:57:30
them a supply chain risk? Why not just
00:57:33
abandon them, move on, use the other
00:57:35
vendors? Like why take this kind of
00:57:37
punitive action?
00:57:38
>> Yeah. So I don't view it as punitive,
00:57:40
and I'll tell you why. If their
00:57:44
model has this policy bias, let's call
00:57:46
it, based on their constitution, their
00:57:49
culture, their people and so on. I don't
00:57:52
want Lockheed Martin using their model to
00:57:55
design weapons for me. I don't want the
00:57:58
people who are designing the things that
00:58:00
go into the componentry to come to
00:58:02
me, because if you believe the
00:58:05
risk of a poisoning threat, yes, it can
00:58:08
enter into any part of the defense
00:58:11
enterprise
00:58:12
>> But it's just the defense enterprise. So
00:58:15
if Boeing wants to use Anthropic to build
00:58:18
commercial jets, have at it. If Boeing wants
00:58:20
to use it to build fighter jets, I can't
00:58:23
have that, because I don't trust what the
00:58:25
outputs may be, because they're so wedded
00:58:27
to their own policy preferences.
00:58:30
>> I guess a dovetail to that is why
00:58:31
couldn't this have been handled
00:58:34
quietly? Was it Anthropic who made this
00:58:36
a public spat, or was it the
00:58:39
administration that made it a public
00:58:41
spat, or does it take two to tango? I mean, they have
00:58:44
a very good sophisticated press
00:58:46
operation, and like, really good at
00:58:50
painting us as doing mass surveillance,
00:58:53
where their issue was like
00:58:56
some commercial database thing that
00:58:58
someone else could buy. They didn't want
00:58:59
us to use it, which I'm not even
00:59:01
sure we buy them except to do recruiting
00:59:03
for soldiers. And you know, we run
00:59:05
schools, hospitals, we do a lot of
00:59:06
things at DoD. We don't just fight wars.
00:59:09
and um the way they were able to
00:59:12
characterize these two things which are
00:59:13
genuinely scary to people but were not
00:59:16
the real issues. Um, it was really
00:59:20
that I worried about them shutting
00:59:22
off our system at a moment of need, or
00:59:24
them messing with our system in a war.
00:59:27
>> What came to mind is if they are selling you
00:59:30
batteries and you need to use the
00:59:31
batteries or the laptops however you
00:59:33
need to use them lawfully okay that
00:59:36
should be enough for them unless they
00:59:38
are peaceniks and they don't want to be
00:59:40
involved in selling weapons which by the
00:59:41
way was Google's position for many years
00:59:43
they just didn't want to be involved in
00:59:44
it because to your point they want to
00:59:47
recruit talent that is also aligned with
00:59:49
that. So it just seems to be maybe
00:59:52
this isn't the right partner for the
00:59:54
Department of War.
00:59:56
>> Yeah, if you don't
00:59:57
want your stuff to be used for
00:59:59
Department of War stuff, you shouldn't be
01:00:00
selling to the Department of War.
01:00:02
>> Pretty sure it's in the name. It's in
01:00:05
the name.
01:00:06
>> Well, and then also I have to say when
01:00:09
you know you said, "Hey, we don't know
01:00:10
what how we're going to use this thing."
01:00:12
Like immediately came to mind was like
01:00:14
911. you you have to go check with them,
01:00:17
you know, if you find out there's
01:00:19
another 911 unique, you know, black swan
01:00:22
event that's going to occur and you have
01:00:24
to go clear it with them. Like you
01:00:26
That's
01:00:26
>> That was literally the comment. That was
01:00:28
literally the comment when I was Yeah.
01:00:30
So I was in a room of 20 people. So this
01:00:31
is not deniable if Dario
01:00:34
wants to deny it.
01:00:36
>> And I was giving these scenarios, these
01:00:37
Golden Dome scenarios and so on. And
01:00:39
he's like, "Just call me if you need
01:00:41
another exception." And you know, I'm
01:00:42
like, but what if the balloon's going up
01:00:45
at that moment and it's like a decisive
01:00:47
action we have to take. I'm not going to
01:00:49
call you to do something. It's like not
01:00:51
rational. And
01:00:53
>> yes,
01:00:54
>> uh so that was another holy cow
01:00:56
moment of like how they think about it.
01:00:59
>> That just means that what he wants to be
01:01:00
is the secretary of war.
01:01:02
>> That's right.
01:01:03
>> He wants to be the the god king there, I
01:01:05
guess. Yeah.
01:01:06
>> You can't do that. The thing that shocks
01:01:08
me, Emil, I don't know, maybe you
01:01:10
can't say anything but guys you can
01:01:11
comment on this. It's clear that
01:01:13
Anthropic just lost all the Republicans
01:01:17
but I think that if they think that they
01:01:19
have the Democrats that's fleeting as
01:01:21
well because I think progressive
01:01:23
Democrats fundamentally just hate
01:01:26
Silicon Valley and technology and so
01:01:27
there's no way they're going to let some
01:01:28
god king over here that they don't
01:01:31
control either. And so in both ways, I
01:01:33
think they accidentally may have pissed
01:01:35
off every constituent. The longer term
01:01:37
fallout amongst them and progressives
01:01:39
will come home to roost because as the
01:01:41
progressives want more control and these
01:01:43
guys push back on them, they're just
01:01:44
going to fall into the same situation.
01:01:49
>> Yeah. I mean, it's an interesting
01:01:51
perspective. I think if you don't want
01:01:54
to be involved in war that you're right,
01:01:56
I think you mentioned this like three
01:01:57
times, Chamath.
01:01:58
Don't sell bullets if you don't want to
01:02:00
be in the business. But you can't call Smith and
01:02:02
Wesson and say, can I... The other thing is
01:02:05
what the hell was the senior management
01:02:07
and the board talking about over these
01:02:10
last few days because to me it would
01:02:12
have sounded insane. So then the
01:02:13
question is were people just so
01:02:15
breathless to buy this revenue curve?
01:02:17
What is the board doing? What is the
01:02:19
senior management really doing? What do
01:02:21
you change guys? What do you think you
01:02:22
would tell them if you were sitting
01:02:24
inside of the board of
01:02:25
>> Anthropic. If you're an investor, you're on
01:02:27
the board, what do you say to Daario
01:02:29
when he says, "Hey, I need to dictate to
01:02:32
Emil and Hegseth how they use my tool and
01:02:34
everybody else is just saying lawful use
01:02:36
as the standard." What's your coaching
01:02:38
advice?
01:02:39
>> Well, it's also a very unusual
01:02:41
circumstance because I don't think any
01:02:43
business in history has grown as fast as
01:02:45
they have in the last 90 days. So,
01:02:47
they've added what was it? 6 billion of
01:02:50
ARR.
01:02:51
>> Yeah. in a month or something.
01:02:53
>> I mean, that's absurd. Like, I mean,
01:02:55
absurd. It's absurd. It's a great
01:02:56
product. OpenClaw has driven a lot of
01:02:58
this.
01:02:59
>> If you're on the board,
01:03:00
>> you're closing your eyes. Yeah.
01:03:01
>> You're shutting up. You're just
01:03:03
shutting up cuz something's
01:03:04
working.
01:03:05
>> You're actually
01:03:07
>> I think he's off doing his thing and
01:03:08
they're going to let him do it. And I
01:03:09
don't think that company's worth 350
01:03:11
billion anymore.
01:03:13
God knows what it's worth.
01:03:14
>> Oh. Oh, that's interesting. Where do you
01:03:17
If you get put a block of stock right
01:03:18
now, where do you put a bid in? I'll
01:03:20
tell you where I
01:03:20
>> Oh my god. I had this conversation
01:03:22
at dinner two nights ago. It's like you
01:03:23
have to pick between OpenAI at their
01:03:25
current mark,
01:03:28
anthropic at their current mark, or
01:03:29
Google. And it's either multiple from
01:03:33
here or net market value creation from
01:03:36
here because those are actually two very
01:03:37
different conversations.
01:03:38
>> Explain the difference.
01:03:39
>> I think the net market valuation because
01:03:41
Google's already worth three trillion.
01:03:43
So if they double, they've added three
01:03:45
trillion, but I think Google is the bet.
01:03:47
I think Google is the market value
01:03:49
creator bet. But I think anthropic is
01:03:51
the multiple bet. I think anthropic is a
01:03:53
trillion five market cap at the end of
01:03:55
the day.
01:03:56
Unless this blows them up,
01:03:58
>> you're still buying the 5x versus the 3x
01:04:00
kind of thing.
01:04:01
>> You'd buy the 5x instead of the 2x.
01:04:03
>> But if you get put a block of stock now,
01:04:04
do you buy it at the last post or do you
01:04:06
buy it at a discount or do you just say,
01:04:08
"Ah, I just buy it at the last post."
01:04:09
>> Anthropic is worth a lot more than 350.
01:04:12
That's for sure.
01:04:12
>> I think it's undervalued compared to
01:04:14
ChatGPT.
01:04:15
>> They just added six billion in the last
01:04:17
month. And I will tell you anecdotally,
01:04:18
anecdotally,
01:04:20
>> Everyone I talk to is on Cowork.
01:04:23
Everyone is like gone deep on this.
01:04:25
Everyone's amazed and shocked and
01:04:27
actively using it. And everyone's saying
01:04:28
the same thing, which is: Anthropic may
01:04:30
actually be fulfilling the promise of
01:04:32
AI. I will also say that it's only going
01:04:34
to take 90 days for Google to flip on a
01:04:38
virtual version of Cowork. And once
01:04:40
Google, like, has this integrated with G
01:04:42
Suite and you have a virtually hosted
01:04:44
version of Cowork, it sweeps the market
01:04:46
as the same kind of competitor. But right now
01:04:48
Cowork is such an incredible product
01:04:50
and everyone's saying the same thing.
01:04:51
It's like giving truth to
01:04:53
AI. Elon said something with respect to
01:04:57
Grok, which was that
01:04:59
he expects it to exceed all of these
01:05:01
coding models probably in the May spin
01:05:05
but for sure by June.
01:05:07
So to your point, Freedberg, I
01:05:10
guess my question to you guys is like,
01:05:12
what happens, okay, what do you do, Emil,
01:05:15
what do you do when all the models
01:05:16
asymptote? Let's just say by October of
01:05:18
this year. Let's just say, I can guarantee
01:05:20
you, just for the thought exercise, by
01:05:23
October all the models are the same. Do
01:05:26
you just take a complexion of them all
01:05:28
and say great we're going to build some
01:05:30
governance layer around it and now we're
01:05:33
indifferent or
01:05:35
>> what do you do
01:05:35
>> I would love to be indifferent because
01:05:37
then I could compete on price, right?
01:05:39
And then I have
01:05:42
one main and one redundant, or two mains,
01:05:45
and I'd need at least two.
01:05:48
>> Yeah.
01:05:49
>> Anthropic's not going to be one of them
01:05:50
if they continue sort of with their
01:05:52
sort of posture. So then it would
01:05:55
be three. And if one gets wobbly from a
01:05:57
policy scenario too because they all,
01:06:00
you know, except for Elon's is based in
01:06:01
San Francisco and has that that vibe to
01:06:04
it. So, uh, you kind of want to have two
01:06:06
or three at any given time. And yeah,
01:06:08
then you price compete them. I do
01:06:10
think Google has a long-term strategic
01:06:13
advantage, not only
01:06:15
because of their consumer thing, but
01:06:16
because they have their own cloud.
01:06:18
>> So they don't have the
01:06:20
margin on top of the cloud that
01:06:22
Anthropic would have to pass on. So, it's an
01:06:24
interesting economic
01:06:26
>> proposition from them.
01:06:30
And just to build on your point,
01:06:32
Freedberg, after you finish your uh
01:06:34
insightful comments here, pull this up,
01:06:37
Nick. Almost on cue, Freedberg. You're
01:06:40
such an oracle. Here is the announcement
01:06:42
from Google. Google Workspace is now
01:06:45
integrated for agents and 40 agent
01:06:48
skills were included today. Emil,
01:06:51
you've been great today. Super honest.
01:06:53
Dario's uh position. I'm going to give
01:06:55
you some fastballs here. Dario says,
01:06:58
"The real reason the Pentagon and Trump
01:07:03
admin do not like us is that we haven't
01:07:05
donated to Trump, while OpenAI's Greg
01:07:08
has donated a lot. Here's Claude's
01:07:12
answer to that claim." Here's nine
01:07:14
companies and their activities with the
01:07:18
administration from the inauguration to
01:07:20
attending the inauguration to the White
01:07:22
House CEO dinner to the Melania
01:07:24
documentary. If you go through and you
01:07:26
look at these nine companies, Microsoft,
01:07:30
Apple, Tim Apple, Nvidia, Amazon, they
01:07:33
have all participated.
01:07:35
There's one company that hasn't
01:07:37
participated, and that's Anthropic.
01:07:39
Is Anthropic being singled out because
01:07:42
they are not genuflecting and because
01:07:45
they're not paying the cover charge?
01:07:46
People say this administration is pay
01:07:49
for play. That's the accusation he's
01:07:51
making. I'd say maybe there's a cover
01:07:53
charge. Nobody likes to pay it, but the
01:07:55
other companies have. What do you think
01:07:57
here?
01:07:57
>> I mean, it's literally one of the
01:07:58
dumbest things I've ever heard. I
01:08:02
truly... because I'm like, I'm in the
01:08:05
Department of War. I need to win wars.
01:08:07
If you help me win wars and I don't have
01:08:09
to waste time transitioning you out,
01:08:12
that makes me thrilled. Um, and it sort
01:08:14
of is a criticism of me, because it's
01:08:17
not like President Trump dipped
01:08:19
in and said, "Hey, Emil, by the
01:08:21
way, those guys didn't give any money,
01:08:22
you can't use them anymore." I mean,
01:08:24
obviously, it's sort of like an
01:08:26
invention in his own mind. It's like, I
01:08:28
don't know how people sleep at night if
01:08:29
those thoughts get in there. Um, and
01:08:33
I was trying to work with them. Why
01:08:34
would I spend three months trying to
01:08:35
negotiate with them to get to a simple
01:08:37
standard if I would have just said,
01:08:38
"Okay, guys, you're out. Bye." So, I I
01:08:42
think it's just some internal
01:08:44
psychosis. That's the only way I can
01:08:46
explain that.
01:08:47
>> Okay. It could be on Dario that he's
01:08:49
antagonistic to the administration both
01:08:51
with respect to how he operates
01:08:53
commercially and it's also reflected in
01:08:55
the fact that he doesn't want to support
01:08:56
the administration.
01:08:57
>> I have a different theory.
01:08:59
>> I think that they have a massive
01:09:01
instance of Cowork internally that
01:09:04
helps them come up with business
01:09:06
strategy. And I bet you there's like
01:09:09
some element of AI that says, "Yeah, you
01:09:11
should do it. Do it. It just makes
01:09:12
sense.
01:09:13
>> Zig where they zag and get more press."
01:09:15
And so now there's some
01:09:17
Clawdbot telling them to basically tell
01:09:19
the Department of War to pound sand.
01:09:21
It's gonna turn out to be the stupidest
01:09:23
decision.
01:09:24
>> Listen, if I was chairman of the board
01:09:25
of that company, I'd pull Dario aside and
01:09:27
I'd say, "Listen, you're obviously a
01:09:28
genius. We obviously have the best tool
01:09:30
in town. This is not a battle you can
01:09:33
win and it makes no sense. You're going
01:09:35
to come across as not being patriotic."
01:09:38
And Tim Cook is showing up for the
01:09:40
Melania premiere. Would it kill you to
01:09:42
support the president? Would it kill you
01:09:43
to show up?
01:09:46
Look what happened when Biden
01:09:47
excluded Elon; that rankled him. Show up
01:09:50
for the president. Show up for America
01:09:53
and be a patriot. You don't have to
01:09:55
donate, but be a patriot and show up for
01:09:56
the dinners.
01:09:57
>> That's terrible advice. Here's my
01:09:58
advice.
01:09:59
>> Okay, here's your advice. Okay.
01:10:00
>> Hey Dario, call Emil back right now
01:10:03
and say, "You know what? Sorry, we messed up.
01:10:06
We're gonna own this and we're going to
01:10:08
put out a press release that says we
01:10:11
support our customers use of our models
01:10:15
to do everything and anything that's
01:10:17
lawful. Number one, and number two, that
01:10:20
our terms of service are written in
01:10:22
stone and that you can expect solidity
01:10:25
and reliability from us. And this was
01:10:28
just a misstep.
01:10:29
>> Emil, how do you respond?
01:10:31
>> I mean, I would say that's what I've
01:10:33
always wanted. I need a reliable, steady
01:10:35
partner that gives me something that'll
01:10:38
work with me on autonomy because
01:10:40
someday it'll be real and we're starting
01:10:43
to see earlier versions of that and I
01:10:45
need someone who's not going to wig out
01:10:47
in the middle and we're just at the
01:10:49
early stages and it's rational. But
01:10:52
then,
01:10:53
>> you know, you called President Trump in
01:10:54
your 5,000-word essay on Friday a wannabe
01:10:57
dictator.
01:10:59
>> You're going to have to apologize to
01:11:00
more people than just me. Yeah, maybe
01:11:02
time to rewrite the position here. Uh,
01:11:04
let's just say Kumbaya, everybody.
01:11:07
Kumbaya, we solved the problem. And look
01:11:08
who's on the line. Surprise guest.
01:11:10
Dario's here. I thought I would
01:11:11
surprise everybody. Nick, pull Dario up.
01:11:13
No, he's not here.
01:11:14
>> What's your view on how the industrial
01:11:17
supply chain for hardware components and
01:11:20
systems is coming along in the United
01:11:21
States? Because my understanding is
01:11:23
we're trying to reduce dependency on
01:11:25
Chinese manufactured components. Where
01:11:27
are we with respect to where we need to
01:11:30
get to in the US manufacturing supply
01:11:32
chain?
01:11:33
>> We are early days. Um, critical minerals,
01:11:37
you've seen the action around
01:11:39
that. Um, you'll start to see... so I have
01:11:42
the Office of Strategic Capital, which
01:11:44
has $200 billion in lending authority.
01:11:46
And what we're trying to do is, it's
01:11:49
like a treasuries plus 100 bips loan to
01:11:52
companies, show them that the department
01:11:54
needs their solid rocket motors, their
01:11:57
batteries, their fiberglass, like all
01:11:59
the things that we're heavily
01:12:01
dependent on for our defense industrial
01:12:03
base that are completely outsourced to
01:12:05
China, and domesticate them here.
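The loan pricing Emil describes is a simple spread over the Treasury benchmark; a one-line sketch, with the 4.3% benchmark yield assumed since he doesn't give one:

```python
# Sketch of "treasuries plus 100 bips" loan pricing. The benchmark yield
# is an assumption for illustration, not a quoted rate.
treasury_yield = 0.043          # assumed 10-year Treasury yield
spread_bps = 100                # "treasuries plus 100 bips"
loan_rate = treasury_yield + spread_bps / 10_000
print(f"borrower pays {loan_rate:.2%}")   # 5.30% under these assumptions
```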
01:12:09
Um, and we've got uh a bunch of great people
01:12:11
running it. But it's early days.
01:12:13
It's going to take the rest of the
01:12:15
term to get um I think we'll get
01:12:18
critical minerals done before the rest
01:12:20
of the term, where we have the
01:12:23
access to what we need to from US or
01:12:26
allied countries. Um, but batteries
01:12:30
are like the next problem I'm trying to
01:12:31
solve. For example, batteries are
01:12:32
totally outsourced both technologically
01:12:34
and from lithium to China. Um, and
01:12:37
there's like, you know, kind of call it
01:12:39
20 critical things. I'd like to get to
01:12:42
all of them at some level, but then
01:12:43
it'll take a few years for them to like
01:12:45
build plants and do that stuff. But
01:12:46
there it's very important. I
01:12:48
hope whatever administration comes next
01:12:50
continues it, because I'm all free
01:12:53
market, but we outsourced so much
01:12:56
that, um, you know, it crippled sort of
01:12:59
the kind of the assembly part of
01:13:02
putting all these things together. Do we
01:13:04
have a munitions risk right now given
01:13:06
the conflict that we're involved in?
01:13:08
>> We don't have a munitions risk, but um
01:13:11
we do need to plus up because
01:13:14
the Europeans are taking a long time to
01:13:16
contribute, like in Ukraine. Russia has
01:13:18
consumed a lot of munitions from like
01:13:21
all over the world and then uh obviously
01:13:25
these conflicts we've been in and
01:13:28
um, we need to have like the next
01:13:30
generation. To a
01:13:32
large degree we're still fighting with 1980s
01:13:34
Cold War weapons
01:13:36
>> right and not modern weapons and so we
01:13:38
need to plus up those things, to
01:13:41
regenerate them. I mean, our nuclear
01:13:43
missiles are 50 years old, some of the
01:13:46
planes are 40 years old, so it all has to be
01:13:48
renewed.
01:13:50
>> Do you think um just speak to the
01:13:52
venture capitalists in the audience?
01:13:54
>> Are we in the early stages of this kind
01:13:56
of defense tech boom? Is defense tech
01:13:59
well-funded at this point or is it kind of
01:14:01
too hypey and bubbly and that's not
01:14:03
really the issue? It's not about funding
01:14:05
the companies. It's about funding some
01:14:06
of the further upstream uh issues that
01:14:08
we're facing. What's your view on
01:14:10
where we are there?
01:14:10
>> There's more defense tech venture
01:14:12
capital than ever by you know 3x more
01:14:15
than last year. So, you know, it it's
01:14:18
growing. What I need to do and what the
01:14:20
department needs to do is have some of
01:14:22
these companies win big contracts quickly,
01:14:24
like, whether, you know, Anduril sure, um,
01:14:28
Saronic sure, like a bunch of these
01:14:30
companies, so that more money flows in,
01:14:33
more entrepreneurs do it, and I could buy
01:14:35
more because genuinely I do think
01:14:38
warfare is going from big aircraft carrier
01:14:41
ships that cost $20 billion and a
01:14:44
decade and a half to build, to mass-
01:14:47
attritable,
01:14:48
low-cost um things, and that's what these
01:14:52
new entrants can do. So we
01:14:55
need those to succeed so that the
01:14:56
flywheel goes: venture capital money,
01:14:58
entrepreneurs, capabilities.
01:15:00
>> in that sense and what I've heard as
01:15:02
kind of the explainer for this is we're
01:15:04
moving from the old primes to the new
01:15:06
primes that there's going to be a small
01:15:08
set of big winners and then obviously
01:15:10
lots of seconds and subs and
01:15:12
whatnot. Is that really how this
01:15:14
market's going to evolve? So, are we
01:15:15
going to end up with Anduril, Palantir,
01:15:17
and maybe three or four others, and
01:15:18
that's where most of the value is going
01:15:20
to accrue from a market perspective?
01:15:22
>> I mean, Anduril and Palantir want that,
01:15:24
and I joke with them all the time about
01:15:25
it, but I definitely want at
01:15:28
least a second layer that's innovative
01:15:31
and trying to disrupt the first layer
01:15:32
all the time. I met a mom and pop like
01:15:35
wholly owned company that makes
01:15:37
these missiles called Rams that we
01:15:39
actually sell and send to Ukraine, and
01:15:41
they do it with like 30 people, and they
01:15:43
can do a thousand a year because of how they've
01:15:45
designed the manufacturing, and it's
01:15:46
awesome. So I want companies like that
01:15:48
to continue innovating, and then maybe a prime
01:15:52
buys them. But one of the reasons
01:15:54
the primes are such a small number it's
01:15:57
not the only one, but it's one, is they learned
01:15:59
how to contract with the government.
01:16:00
They learned how to go through the
01:16:01
bureaucracy and that became a
01:16:04
competitive advantage. I'm trying to
01:16:05
take that competitive advantage away.
01:16:07
>> That's a really important point. How do
01:16:09
you disassemble all that bureaucracy so
01:16:12
that product innovation can actually get
01:16:14
to you?
01:16:15
>> Yeah. So part of it
01:16:20
comes down to requirements reform. What
01:16:22
used to happen is people were like, oh, we need
01:16:24
a new fighter jet. So, Army, Navy, Air
01:16:26
Force put in the requirements, and,
01:16:28
you know, it needed to be
01:16:31
stealthy, to hold a missile, to hold four
01:16:34
humans and you know it became this
01:16:37
unbuildable thing but the contractor
01:16:40
didn't care because they're getting paid
01:16:41
cost plus, so, like, sure, I'll fulfill your
01:16:44
requirements. Two years from now you're
01:16:46
like, that was never engineered properly,
01:16:48
it'll be another few years late and a
01:16:50
couple more billion dollars so we're
01:16:52
trying to change that to I tell you my
01:16:54
common operational problem. I need a
01:16:56
bunch of missiles that go 500 miles or
01:16:57
more that have this kind of blast. Come
01:16:59
to me with solutions, with as few
01:17:01
requirements as possible on that side.
01:17:03
And on the contract piece, trying to get
01:17:05
to as close to commercial contracts as
01:17:08
possible. And this is where
01:17:10
the startups are so good:
01:17:12
they'll do fixed cost pricing. They'll
01:17:14
do, you know, you don't pay me as
01:17:17
much if I deliver late. You pay me more
01:17:18
if I deliver early.
01:17:20
>> It's very disruptive to the existing
01:17:22
system. Yeah.
01:17:22
>> Super disruptive. But that's that's what
01:17:24
I'm I'm like waking up every day trying
01:17:27
to do.
01:17:28
>> So you could put out something
01:17:29
saying, "Hey, the straight of horses is
01:17:30
super important. We need to keep it
01:17:32
open. We need these types of devices to
01:17:35
keep it open." But come to us with your
01:17:36
ideas and let them be creative
01:17:39
entrepreneurs as opposed to, you know,
01:17:41
just trying to goose the profits. Yeah.
01:17:43
It's really brilliant.
01:17:44
>> Yeah.
01:17:45
>> Emil, you also oversee DARPA. Yeah.
01:17:47
>> Yeah. DARPA is the father of the modern
01:17:49
internet and it's created a lot of
01:17:52
really critical technologies. Can you
01:17:53
talk about what's going on in there? Are
01:17:55
there interesting things that you think
01:17:57
our audience should know about that
01:17:58
you're trying to push forward?
01:18:00
>> I mean, it's probably my
01:18:02
favorite part of my office,
01:18:05
because that's where, it's sort
01:18:08
of like it's still a very honored
01:18:10
profession to be part of DARPA. Like you
01:18:12
know, being in government service
01:18:14
for a long time has sort of been reduced in
01:18:16
its stature since the Manhattan Project,
01:18:19
because now if you're a great,
01:18:22
you know, uh, someone who wants to do
01:18:23
rockets and stuff, you go to SpaceX. DARPA
01:18:26
still has the best of the best and so
01:18:28
the most creative ideas happen there. One
01:18:31
of the things that they're working on
01:18:32
that's public is they're trying to use
01:18:34
biology to synthesize critical minerals
01:18:36
so how can you just
01:18:38
pull them out of the ground using biology
01:18:40
to do it, so you don't need to do all this
01:18:42
crazy messy dirty refining that would
01:18:45
like change the game big time on our
01:18:47
ability to get the critical minerals we
01:18:49
need faster and leapfrog the Chinese in
01:18:51
terms of tech. Um,
01:18:54
so they're doing a lot of that kind of
01:18:55
stuff. They're deep in cyber; cyber
01:18:58
attacks are the next huge threat
01:19:00
with AI, right? What we saw
01:19:02
with the creating of all these agents to
01:19:04
attack systems, that happened
01:19:06
to Anthropic. Um, so they're
01:19:09
working on that. There's not a ton I
01:19:11
can talk about with DARPA because
01:19:13
it's so classified, but those
01:19:15
are a couple examples for you.
01:19:18
>> All right, speaking of classified, uh,
01:19:19
just two quick questions before we wrap
01:19:21
here. Are there aliens? And what are you
01:19:22
going to tell us? And number two, uh, in
01:19:25
all seriousness, I I'm curious, what
01:19:28
have you learned about China and where
01:19:30
they're at and the threat there and our
01:19:32
ability to counter it? like give us some
01:19:35
idea of where we're at as a country cuz
01:19:38
we hear a lot of hyperbolic stuff.
01:19:41
They're building this incredible mobile
01:19:43
small navy. They've got hypersonics.
01:19:45
They're just way ahead of us. You know,
01:19:47
we hear these things. But realistically,
01:19:49
are we competitive?
01:19:51
>> Um, well, I'll answer your first
01:19:53
question: I fought for the alien
01:19:55
portfolio. I didn't get it.
01:19:58
>> More work to do, more work to do.
01:20:01
All the guys on my team were like,
01:20:02
"Dude, you got to get this for us.
01:20:04
Please talk to the secretary. We want to
01:20:06
do this."
01:20:08
>> But I was like, as long as I
01:20:10
had 100% access to everything, I would
01:20:12
do it, because it would be
01:20:14
amazing, right?
01:20:15
>> Saucers would be a game changer.
01:20:17
>> Um but on the second one, uh it is true
01:20:20
that the Chinese have had the greatest
01:20:23
military buildup in world history in the
01:20:25
last 15 years. And we were asleep at the
01:20:28
wheel to some degree because we're
01:20:30
focused on the global war on terror. So,
01:20:32
they've advanced without sort of us
01:20:34
thinking about the threat. That being said,
01:20:37
our operational expertise and our space,
01:20:40
like we have some sophisticated stuff,
01:20:44
you know, our subs, our space layer, um
01:20:47
we still have the best stuff in the
01:20:49
world, you know, but we
01:20:52
have to make sure that gap doesn't
01:20:53
narrow,
01:20:54
>> right? We can't be complacent. We should
01:20:57
sleep well at night knowing you're
01:20:58
there. Yeah.
01:20:59
>> Knowing President Trump's allocating
01:21:00
money towards this and he's decisive in
01:21:02
his actions. But we cannot be complacent.
01:21:04
>> I feel like this week was a
01:21:07
true reminder of how fortunate we are to
01:21:09
have the defense that we have for the
01:21:11
United States. When you look at what
01:21:13
happened in Dubai and in Doha and in Tel
01:21:17
Aviv and you see how people in their
01:21:19
residential homes are getting attacked
01:21:22
and bombed, you realize just how
01:21:24
fortunate we are to have all of the
01:21:25
layers of protection that we have by our
01:21:27
government. And I've actually come
01:21:29
around to this quite a lot.
01:21:30
>> I'm a true kind of arguably libertarian
01:21:33
at heart, small government, but the one
01:21:35
thing that I've realized is so critical
01:21:37
for us to have the freedom to do all the
01:21:39
things we want to do is defense. And so
01:21:42
I think it's an amazing institution,
01:21:44
very valuable to the United States.
01:21:46
Emil, thank you for what you do.
01:21:47
>> Yeah, thank you. Really appreciate you
01:21:49
coming on and being so candid and
01:21:51
thoughtful and insightful. This has been
01:21:53
a fun amazing episode. We'll see you
01:21:56
next time. Bye-bye.
01:21:57
>> Love you, boys. Bye-bye.
01:22:00
>> Let your winners ride.
01:22:08
We open sourced it to the fans and
01:22:09
they've just gone crazy with it.
01:22:11
>> Love you. Queen of
01:22:15
>> quinoa.
01:22:20
Besties are gone.
01:22:23
>> That is my dog taking a notice in your driveway.
01:22:28
>> Oh man. Myasher will eat me.
01:22:30
>> We should all just get a room and just
01:22:32
have one big huge orgy cuz they're all
01:22:34
just useless. It's like this like sexual
01:22:35
tension that you just need to release
01:22:37
somehow.
01:22:41
>> Your feet.
01:22:44
We need to get merch. I'm going all in.
01:22:53
I'm going all in.


Episode Highlights

  • Operation Epic Fury
    The US and Israel launch a joint attack on Iran, marking a significant military escalation.
    “This is boots on the ground by the end of March.”
    @ 03m 55s
    March 06, 2026
  • Geopolitical Leverage
    Discussion on how recent actions create leverage with China amidst global tensions.
    “We’re not doing regime change.”
    @ 08m 16s
    March 06, 2026
  • The Future of Drone Warfare
    A sophisticated drone war will involve AI-controlled swarms for better targeting and efficiency.
    “I believe that a sophisticated drone war is going to be drone swarms controlled by AI.”
    @ 20m 55s
    March 06, 2026
  • Military Autonomy Debate
    The discussion revolves around the moral implications of granting full autonomy to military systems.
    “We’re not even close to there yet, right?”
    @ 26m 24s
    March 06, 2026
  • Potential for Regime Change
    The conversation touches on the implications of regime change in Iran and Venezuela for democracy.
    “If those both flip back to democracies, he’ll have done more for the spread of democracy.”
    @ 35m 53s
    March 06, 2026
  • Insurance Market Evolution
    The modern insurance market began to address maritime trade risks in the 17th century.
    “The modern insurance market emerged specifically to solve the risks of maritime trade.”
    @ 38m 24s
    March 06, 2026
  • AI and Military Use
    Concerns arise over the use of AI in military applications, especially regarding autonomous weapons and surveillance.
    “Anthropic was concerned about mass surveillance of Americans.”
    @ 42m 06s
    March 06, 2026
  • The Cost of AI Tools
    Costs for AI tools are skyrocketing, with revenues not keeping pace.
    “My costs are going up 3x every three months.”
    @ 56m 25s
    March 06, 2026
  • The Dilemma of War Contracts
    The discussion highlights the ethical concerns of AI companies working with the Department of War.
    “If you don't want your stuff to be used for department war stuff, you shouldn't be selling to the Department of War.”
    @ 59m 57s
    March 06, 2026
  • Anthropic's Political Fallout
    Anthropic's stance may have alienated both Republicans and Democrats, leading to potential long-term consequences.
    “It's clear that Anthropic just lost all the Republicans.”
    @ 01h 01m 13s
    March 06, 2026
  • Modernizing Military Weapons
    We need to have the next generation weapons to replace outdated technology.
    “We’re still fighting with 1980 cold war weapons.”
    @ 01h 13m 34s
    March 06, 2026
  • Importance of Defense
    Recognizing the critical role of defense in maintaining freedom.
    “Defense is critical for us to have the freedom to do all the things we want to do.”
    @ 01h 21m 37s
    March 06, 2026
