Financial Crash Expert: In 3 months We’ll Enter A Famine! If Iran Doesn’t Surrender It's The End!

April 06, 2026 / 01:33:49

This episode discusses the potential outcomes of the ongoing conflict involving Iran, Israel, and the United States, featuring Professor Steve who shares his insights on the geopolitical landscape. Key topics include the implications of Trump's actions, the significance of the Strait of Hormuz, and the potential for global famine due to disrupted supply chains.

Professor Steve outlines five scenarios for the war's resolution, including the destruction of Gulf power infrastructure and the disabling of Israel's nuclear capabilities. He expresses concern over the fragility of global systems and the potential for famine if fertilizer supplies are cut off.

The conversation also touches on Trump's motivations, suggesting that his actions may be influenced by a desire for attention and profit, as well as the historical context of U.S.-Iran relations. Steve emphasizes the importance of understanding the motivations of all parties involved, including the deep-seated animosities between Israel and Iran.

Listeners are urged to consider the broader implications of the conflict, including the potential for a nuclear escalation and the economic consequences of energy disruptions. The episode concludes with a discussion on the need for self-sufficiency in food production and energy.

Overall, this episode provides a critical analysis of the current geopolitical situation and its potential ramifications for the global economy and food security.

TL;DR

Professor Steve discusses the potential outcomes of the Iran-Israel conflict and its global implications, emphasizing the fragility of supply chains and food security.

Video

00:00:00
So there are five scenarios in which the
00:00:02
war could end because Trump is stupid
00:00:03
enough to take on what Israel wanted to
00:00:06
do, which was destroy Iran, but they've
00:00:08
bitten off far more than they can chew.
00:00:10
So scenario one is Iran destroys the
00:00:12
Gulf power infrastructure. I think
00:00:13
that's highly likely. And if that
00:00:15
happens, then Saudi Arabia, Qatar,
00:00:18
Dubai, they'll become uninhabitable. And
00:00:20
then scenario two, Iran disables
00:00:22
Israel's nukes. I hope that happens, but
00:00:24
there's this one. And it scares the
00:00:27
out of me. Professor Steve, I have so
00:00:29
many questions. What is going on?
00:00:30
>> So, this war is threatening everybody on
00:00:32
the planet. And what Trump is doing at
00:00:33
the moment is a pump and dump scheme.
00:00:35
He's trying to drive up the oil price
00:00:36
and exploiting it for his friends and
00:00:38
for his own wealth in the process. So,
00:00:40
people are focusing upon the price of
00:00:42
this. But the really important point is
00:00:44
this, the Strait of Hormuz. So, oil,
00:00:46
fertilizer, helium all have to pass
00:00:48
through the Strait of Hormuz
00:00:49
>> and Iran have blocked that gap.
00:00:51
>> So, they can say you do or do not pass
00:00:53
depending on your country's attitude
00:00:55
towards our country. And that's quite
00:00:57
terrifying because 20 to 30% of our
00:00:59
fertilizer comes through at this point.
00:01:00
But if this is not available, the globe
00:01:02
has a famine.
00:01:03
>> Do you think he will send ground troops
00:01:04
in?
00:01:04
>> Yes, I do. But I'd hate to be one of
00:01:06
those troops because it's a suicide
00:01:08
mission. They've got underground
00:01:09
military units of weapons and troops,
00:01:11
but we have no idea of the scale.
00:01:13
>> Trump keeps saying that the war has been
00:01:14
won.
00:01:15
>> Yeah.
00:01:15
>> What's going on there in your view?
00:01:16
>> I think he's been fed propaganda to tell
00:01:19
him that he's winning the war by his
00:01:21
immediate advisers because you cannot
00:01:23
tell a person like that that they've
00:01:24
made a mistake. We'll talk about that as
00:01:26
well. But you've developed a bit of a
00:01:27
reputation because you're very good at
00:01:28
predicting things. So, which of these
00:01:30
five outcomes do you think is most
00:01:32
probable to happen?
00:01:34
>> Oh god,
00:01:37
>> this is super interesting to me. My team
00:01:39
has given me this report to show me how many
00:01:40
of you that watch this show subscribe
00:01:41
and some of you have told us according
00:01:43
to this that you've been unsubscribed from
00:01:45
the channel randomly. So, favor to ask
00:01:47
all of you, please could you check right
00:01:49
now if you've hit the subscribe button
00:01:50
if you are a regular viewer of the show
00:01:51
and you like what we do here. We're
00:01:53
approaching quite a significant landmark
00:01:54
on this show in terms of a subscriber
00:01:56
number. So, if there was one simple free
00:01:59
thing that you could do to help us, my
00:02:00
team, everyone here to keep this show
00:02:02
free, to keep it improving year over
00:02:04
year and week over week, it is just to
00:02:06
hit that subscribe button and to double
00:02:08
check if you've hit it. Only thing I'll
00:02:09
ever ask of you, do we have a deal? If
00:02:11
you do it, I'll tell you what I'll do.
00:02:13
I'll make sure every single week, every
00:02:15
single month, we fight harder and harder
00:02:16
and harder and harder to bring you the
00:02:18
guests and conversations that you want
00:02:19
to hear. I've stayed true to that
00:02:20
promise since the very beginning of the
00:02:21
Diary of a CEO, and I will not let you down.
00:02:25
Please help us. Really appreciate it.
00:02:26
Let's get on with the show.
00:02:35
>> Professor Steve, who are you? If
00:02:38
you had to sort of distill it down to
00:02:39
three areas of specialism, what would
00:02:41
those be? history of economic thought,
00:02:44
financial instability, so what
00:02:46
causes volatility in the economy,
00:02:49
and the dynamics of money and ironically
00:02:53
it makes me a minority in economics
00:02:56
because most economists ignore money
00:02:57
completely.
00:02:58
>> It's a strange thing to you.
00:03:00
>> It's ridiculous but it's true.
00:03:01
>> We'll talk about that as well today. I
00:03:02
really want to focus on what's going on
00:03:04
in the world right now because there's
00:03:05
so many questions. It's all quite
00:03:08
confusing
00:03:08
>> extremely
00:03:09
>> and understanding the layers of
00:03:10
motivation that you know Trump has, Iran
00:03:13
has, Israel has. It's a difficult
00:03:16
jigsaw puzzle to put together.
00:03:19
>> I guess the the question that I keep
00:03:21
asking myself is like what is going on?
00:03:24
>> You can't get away from the fact that
00:03:25
we've basically elected a mafia don as
00:03:28
president of the United States. You've
00:03:30
got a guy who
00:03:32
admires the mafia
00:03:34
who's running the country. So what we're
00:03:37
getting in some ways is a shakedown
00:03:38
rather than anything driven by any sense
00:03:41
of political necessity. Okay. So that's
00:03:44
that's a crazy element to begin with.
00:03:47
And the American deep state as it's
00:03:50
called has been anti-Iran for 40 or 50
00:03:53
years. Israel has wanted to defeat Iran
00:03:56
for that length of time. Trump is stupid
00:03:58
enough but also cunning enough. It's a
00:04:00
combination of the two to take on what
00:04:04
Israel wanted to do, which was destroy
00:04:06
Iran. They're now trying to do it and
00:04:08
they're finding that they've
00:04:10
bitten off far more than they can chew.
00:04:12
>> Trump is someone who cares a lot about
00:04:13
people's opinions of him and he must
00:04:17
have known that this would be
00:04:18
politically unpopular to target Iran
00:04:20
at this moment in time.
00:04:22
>> I don't think so. I had a relationship
00:04:24
with somebody with narcissistic
00:04:25
personality disorder. So that's
00:04:27
something over and above what I learned
00:04:29
academically that I when I think about
00:04:31
his behavior and somebody like that,
00:04:33
they want to be the center of attention
00:04:35
at all times. They can't stand it when
00:04:38
somebody else is being spoken about.
00:04:39
It's ridiculous, but it's a
00:04:41
pathology. So he's interested in
00:04:43
people's opinions so long as they're
00:04:45
positive and they're about him.
00:04:47
>> So are you saying that he attacked
00:04:49
Iran and started this war in part
00:04:50
because he wanted attention?
00:04:52
>> That's always something with somebody
00:04:54
who's got that disorder. Yeah.
00:04:56
>> I mean, what do you think about his
00:04:58
rationale? He's saying that he attacked
00:04:59
Iran because they had nuclear weapons
00:05:01
and they were there was an imminent
00:05:02
threat.
00:05:03
>> We still don't know whether Iran has
00:05:04
nuclear weapons. Okay. We know that
00:05:07
Israel has. If you're going to attack a
00:05:09
country with nukes, you should attack
00:05:10
Israel, not Iran.
00:05:12
>> But you can't attack Israel, can
00:05:13
you? Cuz
00:05:14
>> I cannot make sense of what politicians
00:05:16
all over the planet are doing these
00:05:17
days. There's a huge gap between what
00:05:20
politicians are saying about global
00:05:22
politics and what people in the street
00:05:24
are saying about it. So people on the
00:05:26
street have seen the Gaza genocide.
00:05:28
They've seen all the conflicts Israel
00:05:30
has started there. And I think the
00:05:32
general sentiment in most countries in
00:05:34
the world today is anti-Israel because
00:05:37
of the way it's treating the
00:05:38
Palestinians. And that's what people are
00:05:40
thinking about. The top echelons, like in
00:05:42
this country, as you know: if I say
00:05:44
'free Palestine' outside on the
00:05:47
street, I can be arrested. It's crazy.
00:05:52
there's a huge divorce between what
00:05:54
people are thinking and what the
00:05:56
politicians are saying and I can't give
00:05:59
any explanation for that divorce apart
00:06:02
from believing that Israel has something
00:06:05
over our political leaders
00:06:06
>> what do you mean you think they have
00:06:08
something over our political leaders
00:06:09
>> I think there
00:06:11
We know about the whole Epstein affair. The
00:06:12
way that the Iranians refer to what's
00:06:14
happening is they say they're fighting
00:06:15
the Epstein class and there's belief
00:06:17
that there's something where Epstein
00:06:20
has been working with the Israeli
00:06:22
intelligence service and has blackmail-worthy
00:06:25
material on a huge range of
00:06:27
politicians. And that's the only way
00:06:30
that I can explain the sort of
00:06:31
things that politicians are supporting
00:06:33
when their populace is angry about those
00:06:36
same policies. So you get demonstrations
00:06:38
here, you know, free Palestine
00:06:40
demonstrations, 80-year-old female
00:06:42
vicars being arrested for saying this
00:06:45
sort of stuff. You go back 40 years ago,
00:06:47
there was a belief in the public and
00:06:50
a belief amongst politicians that Israel
00:06:53
had a right to exist and it was all
00:06:54
pro-Israel. And now after 40 years, the
00:06:58
type of abuses that have happened in
00:07:00
Palestine have hit individual ordinary
00:07:04
people's attitudes to Israel. So
00:07:06
ordinary people are saying Israel's the
00:07:08
aggressor. Israel's making the mistakes.
00:07:10
But the politicians are all saying it's
00:07:12
antisemitic to
00:07:15
criticize Israel.
00:07:17
>> If you had to give a one-sentence
00:07:19
answer as to why this war started
00:07:23
because we sort of hypothesized a few
00:07:25
things there. What would that one
00:07:26
sentence answer be?
00:07:31
>> Again, this is trying to make sense of
00:07:32
the senseless. I just think Israel
00:07:35
wanted to destroy Iran. They thought
00:07:37
they could do it and they thought they
00:07:38
had an American president who would help
00:07:40
them do it, and they drastically
00:07:42
underestimated how prepared Iran was for
00:07:44
that conflict.
00:07:46
>> Why would Israel want to destroy Iran?
00:07:48
What's the context there?
00:07:50
>> This goes back to religious elements.
00:07:52
The Zionist state had the right to that
00:07:54
whole region and there's an expansionist
00:07:57
element to Israel's behavior for the
00:07:59
last 40 years. And the major rival they
00:08:02
saw themselves as having in that sense
00:08:04
was Iran. They can invade Jordan. They
00:08:06
could attack Lebanon. Uh they could do
00:08:10
all these things. Of course, the '67
00:08:13
war. They wiped out the
00:08:16
invading Arab armies in six days.
00:08:18
They have this past history of being
00:08:21
militarily dominant in the area and they
00:08:24
they knew that Iran was too big for them
00:08:26
to take on on their own. They thought
00:08:28
they could get the Americans in there
00:08:29
and I think they drastically
00:08:30
underestimated how prepared Iran was for
00:08:33
this situation.
00:08:34
>> When you say Iran were prepared for this
00:08:36
situation and it somewhat surprised
00:08:38
Israel and the US. What is that
00:08:40
preparedness you're speaking about?
00:08:41
Well, for a start, it's the fact that
00:08:43
Iran witnessed that there were
00:08:46
decapitation attacks on other countries
00:08:49
in the region going way way back not
00:08:51
just the last 10 years but the last 40
00:08:54
or 50 years. Decapitation.
00:08:56
>> You take off the leader, you kill the
00:08:57
leaders, and then with the leaders killed
00:08:59
the army is in disarray, and you can come
00:09:01
in and invade and take over. So getting
00:09:03
rid of Saddam Hussein that sort of thing
00:09:05
you know wipe out Saddam Hussein's power
00:09:07
base and then the whole system
00:09:09
collapses. That was the Iraq story. But
00:09:11
the Iranians observed that and they have
00:09:15
broken their military into 31 divisions.
00:09:17
There are 31 provinces like 31 states in
00:09:20
that sense inside Iran. Their military
00:09:23
has broken into those 31 units. They've
00:09:25
got their own fail safe system running
00:09:27
in the background. They've got their own
00:09:29
resources, their own missiles,
00:09:31
production systems, all that sort of
00:09:33
stuff. So you've got to take out the
00:09:35
whole 31 and then they'd have that sub
00:09:38
area. So the only way you can beat the
00:09:39
country is by literally bombing it to
00:09:42
back to the stone age
00:09:43
>> which appears to be what they've been
00:09:45
trying to do
00:09:46
>> trying to do. But the thing is it's a
00:09:47
huge country. I mean, look at
00:09:48
the scale of Iran; maps always
00:09:50
distort how large it is. So that is larger.
00:09:52
That's more than half the size of
00:09:53
Western Europe. It's got a population of
00:09:56
90 million, about a quarter or a third the
00:09:58
population of Europe, far more than Iraq.
00:10:01
>> I mean it looks like it's double the
00:10:02
size of the UK or more
00:10:04
>> or more than double. I mean you know one
00:10:06
thing about the Mercator projection.
00:10:07
>> No. What's that?
00:10:08
>> Okay. It makes the
00:10:10
northern hemisphere look twice as large as
00:10:12
the southern, and Iran's in the northern
00:10:14
hemisphere but not as far north as
00:10:16
England. So the distortion gets
00:10:17
amplified the further north you go. So
00:10:19
it's bigger than France and Germany and
00:10:21
Italy and Spain
00:10:23
>> and possibly Poland in terms of area.
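[Editor's note: the Mercator distortion Steve describes has a simple form. A minimal sketch (Python); the central latitudes below are rough values chosen purely for illustration, not figures from the episode:]

```python
import math

# Mercator stretches linear scale by sec(latitude), so apparent
# area is inflated by roughly sec(latitude) squared.
def mercator_area_inflation(lat_deg: float) -> float:
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2

# Rough central latitudes, picked for illustration only
iran, england = 32.0, 53.0
print(f"Iran    appears ~{mercator_area_inflation(iran):.2f}x its true area")
print(f"England appears ~{mercator_area_inflation(england):.2f}x its true area")
```

This is why Iran, sitting well south of England, looks smaller on a Mercator map than its true area warrants.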
00:10:26
And then if you even see just looking on
00:10:27
the map itself you can see the
00:10:28
corrugations there versus what you can
00:10:31
see.
00:10:33
>> Corrugations? The what? >> Representing mountains. Okay.
00:10:35
>> Okay. There's more mountains inside
00:10:36
there. It's a horrendous
00:10:38
terrain to fight a war on. I think what
00:10:40
Trump is doing at the moment is a pump
00:10:42
and dump scheme. He's trying to drive up
00:10:44
the oil price, tell friends beforehand
00:10:47
that he's about to make his announcement
00:10:49
which will cause the price to fall and
00:10:51
he's just oscillating this way up and
00:10:53
down and exploiting it for his friends
00:10:56
and for his own wealth in the process.
00:10:57
>> Do you actually think that's the case?
00:10:59
Because, to make sense of this
00:11:02
stuff: this must be hurting his friends
00:11:03
economically, because, you know,
00:11:05
the stock market's going to take a dip
00:11:06
if he's not careful and his friends are
00:11:08
all shareholders in different big
00:11:09
companies. So, you know, also if you
00:11:12
know, one of Keynes's great lines
00:11:14
was that there's no point in buying a
00:11:17
stock which you think is going to
00:11:18
increase in value over time if you think
00:11:20
it's going to slump in the immediate
00:11:22
future. So, he's making an announcement
00:11:24
which causes oil markets to panic. So,
00:11:26
the price goes up. We've given him
00:11:28
control of the most powerful country on
00:11:30
the planet. He knows if you make an
00:11:32
announcement, it moves markets. He has
00:11:35
no compunction whatsoever in exploiting
00:11:37
that to cause rises and falls in prices
00:11:40
and try to exploit them himself and with
00:11:42
his friends.
00:11:43
>> I did. I mean, I did see that. I've got
00:11:44
the data here on on the floor showing
00:11:46
those graphs. I I generally looked at
00:11:47
that and thought, yeah, you know, maybe,
00:11:50
but it's also conceivable that Trump is
00:11:52
quite a predictable character and he
00:11:54
tweets at the same time every day. And
00:11:55
it's also I think me and you both know
00:11:57
that before the markets open on a Monday
00:11:59
morning, he's going to want to say
00:12:01
something really positive.
00:12:02
>> He has a track record of doing that. So
00:12:04
is it conceivable that they knew he was
00:12:06
flying because it was tracked that he
00:12:08
was going to be on this plane journey.
00:12:09
There's going to be a press gaggle. We
00:12:11
know he's going to give an interview. I
00:12:12
actually think that was quite
00:12:13
predictable. If I was a betting man, I
00:12:14
would have gone Sunday night or Monday
00:12:16
morning. I would have put a bet on oil
00:12:18
prices coming down, the stock market
00:12:20
going up.
00:12:20
>> Yeah. And like for example that one of
00:12:22
the things he said most recently he
00:12:24
talked about getting a present from Iran
00:12:26
>> and then he finally let slip what the
00:12:27
present was, and it was letting eight ships
00:12:29
through the Strait of Hormuz.
00:12:31
>> Oh
00:12:32
those eight ships were not American.
00:12:35
They were other allies. I think what
00:12:37
he's thinking is these um that's going
00:12:39
to mean the oil market gets calmed down.
00:12:42
That means the price is going to fall.
00:12:44
Uh so I can then do another pump and
00:12:46
dump.
00:12:46
>> Let's explain the Strait of Hormuz.
00:12:49
>> Oh god. Yeah. If we had to explain it
00:12:51
for 16 year olds because there's been
00:12:53
lots of coverage on it and I think some
00:12:54
people have kind of skipped past the
00:12:56
importance of the region. What is the
00:12:58
Strait of Hormuz and why does it
00:12:59
matter?
00:13:00
>> Well, it's the choke point in the
00:13:02
Persian Gulf to get through. You've got
00:13:04
like 21 miles. Okay, that's an incredibly
00:13:06
narrow gap for ships to pass through and
00:13:09
that means that all the countries that
00:13:11
pump not just oil but fertilizer, uh,
00:13:14
helium, all these critical elements for
00:13:16
the production system all have to pass
00:13:18
through this point. And obviously that's
00:13:20
well within reach of any weapons from
00:13:22
Iran. So they can say you do or do not
00:13:24
pass depending on whether we approve or
00:13:26
don't approve of your
00:13:28
country's attitude towards our country.
00:13:30
>> You said fertilizer.
00:13:32
>> Yeah.
00:13:32
>> Oil and helium.
00:13:34
>> Yeah. Helium.
00:13:35
>> Where are they coming from?
00:13:36
>> They're mainly coming from, I think,
00:13:38
the Saudi Arabian side.
00:13:40
Saudi Arabia and like Iran will have the
00:13:42
same things, but Iran would keep
00:13:44
those for themselves but Saudi Arabia is
00:13:47
the main source of gases and oils which
00:13:50
are refined and as byproducts we get
00:13:52
sulfur dioxide and we get helium. This
00:13:55
is the helium.
00:13:56
>> Yeah.
00:13:56
>> Okay. That's you know a couple of kilos
00:13:59
of helium. But helium is an element
00:14:02
for which there's no substitute.
00:14:04
>> So helium is inert.
00:14:07
>> What does that mean? It means it doesn't
00:14:08
interact with other chemicals. You want
00:14:10
to give it a try?
00:14:11
>> I've never done it.
00:14:12
>> I don't think there's any in here.
00:14:13
>> Oh, what a pity. Okay. What I would have
00:14:14
done to give it a try. You got any real
00:14:16
helium?
00:14:18
>> Oh, bloody hell. Helium balloon.
00:14:20
>> Okay.
00:14:24
>> Does it change your voice?
00:14:26
>> I don't know. Has my voice changed?
00:14:27
>> It did. I'll give my voice a try. Okay.
00:14:31
>> Um Okay. I've never done this before,
00:14:33
but I've heard it at parties.
00:14:37
>> Okay. And now I think my voice has
00:14:39
changed somewhat from
00:14:42
the fact you can do it. Oh my god.
00:14:48
That is a riot. Okay.
00:14:49
>> Where is helium coming from?
00:14:51
>> It's coming from a gas field. So about
00:14:53
30% of the world's helium comes from a
00:14:55
gas field which spans both Saudi Arabia
00:14:58
and Iran. If you don't trap helium
00:15:01
physically somehow it goes to outer
00:15:03
space. That's the ultimate destination
00:15:05
of the stuff. So it's trapped in the
00:15:07
same things that trap oil. And then when
00:15:09
you drill for oil, you also get helium
00:15:11
coming out. And then helium is
00:15:13
absolutely critical for the
00:15:15
semiconductor industry. It didn't matter
00:15:17
100 years ago.
00:15:18
>> And semiconductors are important for
00:15:19
what?
00:15:20
>> Everything. I mean, you
00:15:22
take the semiconductors out of that,
00:15:23
you've got a brick. Okay. The
00:15:26
processors, the CPUs, the memory chips,
00:15:29
they're all made. Helium is an essential
00:15:31
element to make them
00:15:32
>> for our iPhones, our tablets,
00:15:34
>> everything. Everything electronic. If
00:15:35
you need semiconductors, you need
00:15:37
helium. So if you cut off 30% of the
00:15:39
world's helium supply, you cut off the
00:15:41
capacity to produce 30% of the world's
00:15:44
semiconductors.
00:15:45
>> And Iran have blocked that gap.
00:15:47
>> And that means that we've suddenly lost
00:15:49
30% of the world's helium.
00:15:50
>> I've got a quote from March 2026 from
00:15:52
leading helium expert Phil Kornbluth.
00:15:55
>> He said, "We're looking at a minimum 2
00:15:57
to 3 months shutdown of helium
00:15:59
production with up to 6 months before
00:16:01
supply gets back to normal." and he
00:16:04
explained you can't stockpile helium
00:16:06
because it leaks through containers.
00:16:07
>> Yeah.
00:16:08
>> so once supply is cut off semiconductor
00:16:10
production will stop entirely. South
00:16:13
Korea gets 65% of its helium from Qatar
00:16:16
in that region and makes 2/3 of the
00:16:19
world's memory chips. Their government
00:16:20
has launched an emergency investigation
00:16:22
into the shortage. Nobody's talking
00:16:24
about this.
00:16:24
>> I know. And this this is one reason it's
00:16:26
quite terrifying about the scale of what
00:16:28
we're going through cuz people just
00:16:30
thinking it's going to be oil's going to
00:16:31
be more expensive. That's the sort of
00:16:33
mindset we have. But in fact, critical
00:16:36
elements of the production system are
00:16:38
being terminated by this conflict. You
00:16:40
can't produce chips anymore. And you
00:16:42
can't... Well, hang on. You can't
00:16:44
produce these chips either because the
00:16:46
fertilizer is disappearing.
00:16:47
>> So, you're holding a potato in your
00:16:49
hand.
00:16:49
>> Yeah.
00:16:50
>> How are potato chips going to be
00:16:52
impacted by the war?
00:16:53
>> Because the fertilizer. If we don't have
00:16:55
the fertilizer, we can't grow the
00:16:56
potatoes. And it's not just potatoes.
00:16:58
It's a whole range of crops. We eat
00:17:00
food, okay? We eat this green stuff. It
00:17:02
actually starts as brown stuff because
00:17:05
the fertilizer is an essential part of
00:17:07
growing all the food we eat. And the
00:17:09
fertilizer is produced by a process
00:17:11
called the Haber-Bosch process which
00:17:13
takes petroleum and nitrogen and fixes
00:17:17
them in such a way that you can put this
00:17:18
on the on the field and your plants will
00:17:20
grow courtesy of the fertilizer. If we
00:17:23
didn't have fertilizer at all, guess how
00:17:25
many billion people the planet could
00:17:27
actually support?
00:17:28
>> I don't know.
00:17:29
>> Between one and two. And fertilizer
00:17:31
comes from this region.
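[Editor's note: for reference, the Haber-Bosch process Steve mentions fixes atmospheric nitrogen as ammonia; the hydrogen feedstock is usually derived from natural gas rather than petroleum directly:]

```latex
\mathrm{N_2 + 3\,H_2 \rightarrow 2\,NH_3}
```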
00:17:32
>> Again, 20 to 30% of our fertilizer comes
00:17:34
from that region.
00:17:35
>> Through the Strait of
00:17:36
>> through the Strait of Hormuz.
00:17:37
>> Where is it coming from?
00:17:38
>> It's coming again from the same gas
00:17:41
field that's producing the helium
00:17:43
and produces fertilizer as a side effect.
00:17:46
And you need... I'm not a chemist,
00:17:48
okay? So I can get these things wrong,
00:17:50
but you need sulfur. You need sulfuric
00:17:52
acid as well as part of these production
00:17:53
processes. 20% of the world's
00:17:55
fertilizer, helium, sulfuric acid, all
00:17:59
pass through that strait. And if you
00:18:01
take them away, then you can't make
00:18:03
microchips, which is what Korea is
00:18:05
suffering from. You can't make
00:18:07
fertilizer, which everybody will
00:18:10
suffer from. If we lost 20% of the
00:18:12
world's fertilizer, we'd lose roughly
00:18:14
20% of the world's food. And it would cause a
00:18:16
global famine. We've never had this
00:18:17
experience before. We've had localized
00:18:20
famines. You know, countries like India
00:16:21
have had famines, parts of Africa and
00:18:23
so on. But if this is not available, the
00:18:25
globe has a famine.
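[Editor's note: the arithmetic behind this famine claim is simple proportionality. A minimal sketch (Python); the 1:1 pass-through from fertilizer loss to food loss is the guest's simplifying assumption, not an agronomic model, and the population figures are the rough numbers quoted in the episode:]

```python
# Back-of-envelope sketch of the guest's fertilizer argument.
# All figures are the rough numbers quoted in the conversation.

WORLD_POPULATION_BN = 8.0                 # approximate world population, billions
SUPPORTABLE_WITHOUT_FERTILIZER_BN = 1.5   # "between one and two" billion (quoted)

def food_loss_from_fertilizer_loss(fertilizer_loss_frac: float) -> float:
    """Guest's assumption: food output falls proportionally with fertilizer."""
    return fertilizer_loss_frac

loss = food_loss_from_fertilizer_loss(0.20)  # ~20% of fertilizer transits the strait
print(f"Assumed food production loss: {loss:.0%}")

# Population the remaining food could feed at current per-capita consumption
fed = WORLD_POPULATION_BN * (1 - loss)
print(f"People fed at current diets: ~{fed:.1f} bn of {WORLD_POPULATION_BN:.0f} bn")
print(f"With no fertilizer at all: ~{SUPPORTABLE_WITHOUT_FERTILIZER_BN:.1f} bn (quoted)")
```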
00:18:27
>> And what's the last uh tanker you've got
00:18:28
down there? There's one more on the
00:18:29
floor.
00:18:30
>> Oh my god. Okay.
00:18:32
Hey, that's pretty good. It was
00:18:33
accidental, but that's petroleum.
00:18:35
Okay. A petroleum tank. Obviously empty.
00:18:37
20 L.
00:18:38
>> So that's oil.
00:18:39
>> That's oil.
00:18:40
>> Oil. Okay. And so that's what
00:18:42
we're losing right now, and people are
00:18:44
focusing upon the price of this. But the
00:18:46
really important point, and I can bring
00:18:47
up one of my own charts here, is the role
00:18:51
of energy in production because if we
00:18:54
don't have energy we can't produce goods
00:18:56
and services and the link is incredibly
00:18:58
tight. This is looking at change in
00:19:01
energy and change in gross world product
00:19:04
over the last 40 years. I'll throw this
00:19:06
graph on the screen for people that are
00:19:07
watching.
00:19:07
>> Okay. So, what what you've got here is
00:19:09
the annual percentage change in gross
00:19:11
world product and the annual percentage
00:19:13
change in gross energy consumption. And
00:19:15
they're virtually lock step and they're
00:19:17
the same magnitude.
00:19:18
>> So, when energy goes up, GDP goes up.
00:19:20
>> And when energy goes down, GDP goes
00:19:22
down. Now, we're losing 20% of the
00:19:25
world's liquefied natural gas, a
00:19:28
substantial proportion of its oil as
00:19:29
well. We could see a 5 or 10% fall in
00:19:32
energy. We will certainly see a 5 or 10%
00:19:35
fall in gross world
00:19:37
product.
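[Editor's note: the chart Steve describes compares two year-over-year percentage-change series. A minimal sketch of that computation (Python); the index values below are invented purely to show the method and are not the chart's actual data:]

```python
# Year-over-year percent change in energy consumption vs. in
# gross world product (GWP). The numbers here are hypothetical,
# invented only to illustrate the computation.

def pct_change(series):
    """Year-over-year percentage change of a time series."""
    return [(b - a) / a * 100 for a, b in zip(series, series[1:])]

energy = [100.0, 102.0, 104.5, 103.0, 106.0]   # hypothetical energy index
gwp    = [100.0, 102.3, 104.8, 103.2, 106.5]   # hypothetical GWP index

de, dg = pct_change(energy), pct_change(gwp)

def corr(xs, ys):
    """Pearson correlation: near 1.0 means the series move in lockstep."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(f"correlation of the two change series: {corr(de, dg):.2f}")
```

A correlation near 1.0 on the real data is what "virtually lockstep and the same magnitude" means in the transcript.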
00:19:37
>> So explain that to me. So where is the
00:19:39
oil in this region?
00:19:40
>> It's everywhere.
00:19:41
>> Okay.
00:19:42
>> I mean this is one of the accidents
00:19:43
of history that a large part of the oil
00:19:46
is concentrated here and a large part
00:19:48
over here and a bit in Russia.
00:19:49
>> So over here for people that can't see
00:19:51
you're pointing at Iran, Saudi Arabia,
00:19:54
>> Saudi Arabia,
00:19:55
>> Iraq and then there's a lot in the
00:19:57
United States and there's a lot
00:19:58
>> you've got some in Russia as well. There
00:20:00
was a small amount like the North Sea
00:20:02
had a substantial amount of oil as well
00:20:04
at one stage.
00:20:05
>> And the type of oil in this region I
00:20:07
hear is quite important.
00:20:08
>> It's very. I mean, with oil there's no such
00:20:11
thing as a homogeneous product.
00:20:12
>> What does homogeneous mean?
00:20:14
>> It means everything is the same everywhere.
00:20:15
You can if you don't get it here you can
00:20:17
substitute for something over here.
00:20:19
That's a myth that economists actually
00:20:21
unfortunately believe. They basically
00:20:23
could persuade people to think that
00:20:24
everything is homogeneous. In fact oil
00:20:26
from Venezuela is almost like tar. Oil
00:20:29
from here flows like water
00:20:31
comparatively. You need different
00:20:33
processing systems to to extract that
00:20:35
oil than you need over here. Uh if we
00:20:38
lose this, we can't replace it with
00:20:39
something from over here. So once that
00:20:42
goes then the production system of the
00:20:44
planet is damaged. The
00:20:47
shocking thing for me as a citizen has
00:20:51
been the fact that a war with one
00:20:53
country could decapitate what 20 to 30%
00:20:57
>> of global production
00:20:58
>> global production of oil.
00:21:00
>> Yeah. And food.
00:21:01
>> That's a vulnerability if I've ever
00:21:02
heard one.
00:21:03
>> I know. And this is like one reason I'm
00:21:05
a critic of mainstream economics is they
00:21:07
trivialize all this stuff. They don't
00:21:09
teach their students how critical this
00:21:11
is. So most people are like you, even
00:21:13
people who've done a PhD in economics,
00:21:15
even worse in that sense than other
00:21:17
people, they don't understand how
00:21:19
critical and how fragile our production
00:21:21
systems are. So people can talk about a
00:21:23
war in Iran and think, "Oh, that's
00:21:24
going to cut off our
00:21:26
oil supply." No, it's going to cut off
00:21:28
your food supply.
00:21:29
>> And for the average person listening
00:21:30
now, what will they start to experience
00:21:32
if this war doesn't immediately end? >> In 2
00:21:36
or 3 months India is going to run out of
00:21:38
fertilizer and so there'll be a famine
00:21:41
in India. Food production on the planet
00:21:43
could fall 10 to 25%. And therefore
00:21:47
there simply won't be enough food for
00:21:49
everyone on the planet and then it's a
00:21:51
question of who's going to starve. Now
00:21:53
you'd think the wealthy countries are
00:21:55
going to be safe there. But look at
00:21:56
Australia, my old home country has about
00:21:59
30 days oil supply. When it runs out it
00:22:02
can't get food from the farm to the city
00:22:04
anymore. So Australia is incredibly
00:22:07
vulnerable. We're all far more
00:22:09
vulnerable than we realize and this war
00:22:11
is threatening everybody on the planet.
00:22:13
>> I got in an Uber yesterday and I was
00:22:14
with a wonderful guy. Weirdly, I got
00:22:17
into the Uber at 2 a.m. and
00:22:19
I looked up on the screen and he was
00:22:20
listening to The Diary of a CEO, and then he
00:22:22
clocked me in the back of the
00:22:23
car. We had a great chat and he was
00:22:24
saying to me, listen this isn't actually
00:22:26
my main job. It's my third job. I do
00:22:28
this because of the cost of living and
00:22:30
it really stayed with me.
00:22:32
>> He's doing three jobs which I love.
00:22:33
Yeah. Three jobs and he picked me up at
00:22:35
2:00 a.m. He's got a family
00:22:37
>> and he's working his butt off to keep
00:22:39
the family alive.
00:22:40
>> Yes. And you know, I'm going to say
00:22:42
something which I probably um I don't
00:22:44
say a lot, which came to mind, which
00:22:47
is um in the position I'm in now. I
00:22:50
think it was a real reminder of my
00:22:52
own personal privilege that I think is
00:22:54
really important for someone like me
00:22:55
that has an interview show
00:22:57
because you've got to be like
00:22:59
intellectually honest with yourself or
00:23:00
just like honest with yourself generally
00:23:01
that like as a as someone in my position
00:23:04
who has been fortunate enough to be able
00:23:05
to make significant money. I can
00:23:07
understand from having that conversation
00:23:10
how
00:23:13
detached
00:23:14
>> you are
00:23:15
>> I am
00:23:15
>> from the world around you.
00:23:16
>> Yes.
00:23:18
You're a very unique soul because
00:23:21
I know I've read a bit of your
00:23:22
history of course and you've had that
00:23:24
terrible period where you were you know
00:23:25
unemployed and what the hell do I do
00:23:27
>> shoplifting food and stuff and
00:23:28
>> you were ambitious but you okay if you
00:23:31
don't experience poverty you don't know
00:23:32
what it's like
00:23:34
>> yeah but even if you have
00:23:35
>> you can forget it
00:23:36
>> you can forget it
00:23:37
>> but you haven't yet
00:23:38
>> well this is why it's so important for
00:23:39
me to have those conversations because
00:23:41
him saying I'm working three jobs and
00:23:43
and picking me up at 2 a.m. in
00:23:44
his Uber and him telling me he's doing
00:23:47
that because of cost of living because
00:23:48
he needs to pay the bills immediately
00:23:50
made me think, ahead of this
00:23:51
conversation today like oh my god if the
00:23:54
prices go up 20% for people
00:23:56
>> he's out. He can't work 24 hours a
00:23:58
day
00:23:58
>> He can't work more
00:23:59
hours in the day and it was just one of
00:24:00
those moments where you go hell
00:24:01
Steve like man you need to stay close to
00:24:04
the plight of uh of people that are
00:24:07
>> on the bread line and so many people are
00:24:09
these days in advanced countries not
00:24:11
just third world countries but certainly
00:24:14
like in America and the UK there are
00:24:16
huge numbers of people who are basically
00:24:18
living from hand to mouth in the current
00:24:21
system. So if we have a breakdown they
00:24:23
can't afford it and in that situation
00:24:26
you can no longer use money as your way
00:24:29
of deciding whether you can eat food or
00:24:31
not.
00:24:31
>> I wonder if politicians know this cuz
00:24:33
part of the reason I say this is because
00:24:34
you know Trump is a very wealthy man
00:24:36
multi-billionaire reportedly
00:24:38
>> and if the prices go up 20% at the pump
00:24:41
>> he makes a profit. I mean, actually, when
00:24:43
he spoke in favor of the
00:24:44
rising oil price, he said we'll make a lot of
00:24:46
money out of it. His immediate
00:24:48
association rising price of something
00:24:50
that I'm indirectly selling that's good.
00:24:52
He doesn't say what about people buying
00:24:54
it. The people who buy it can no longer
00:24:56
afford it.
00:24:57
>> You've got um some food there on the
00:24:59
table which shows how these conflicts
00:25:04
and the pressure they put on some of
00:25:06
these scarce resources can impact our
00:25:08
ability to go and buy food. I think
00:25:09
you've got two bowls of potatoes.
00:25:12
>> Well, let's actually make it fairer
00:25:14
right now. Let's get the actual
00:25:15
distribution correct initially. So, you
00:25:18
are talking about someone who in your
00:25:21
situation, you've got that
00:25:24
local Uber's got this, and now you're
00:25:26
taking away the oil price. That's going
00:25:28
to make Trump better off, but he's down
00:25:30
to the stage where, you know, he's not
00:25:32
too far from that happening. And that's
00:25:35
what we've pushed ourselves into with
00:25:36
this war.
00:25:38
Do wars typically make inequality
00:25:42
worse?
00:25:44
>> Very good question. I think wars are
00:25:47
created when inequality is bad. If you
00:25:50
go back to the Great Depression and see
00:25:51
what caused World War II, it was largely
00:25:54
the collapse of the German economy. Uh
00:25:57
when they had to repay their debt,
00:25:59
private and government
00:26:01
debt to America, that
00:26:03
led to the rise of Hitler. Everybody
00:26:05
thinks Hitler rose because of the Weimar
00:26:06
inflation. That's what people normally
00:26:08
think. In fact, when Hitler
00:26:11
came to power in Germany, the rate of
00:26:13
inflation was minus 10%. It was
00:26:16
deflation. Prices were falling.
00:26:18
Unemployment rose from very low to 25%
00:26:21
of the population. In that situation,
00:26:23
people supported Hitler. He revived the
00:26:26
economy. I'm happy to talk about how he
00:26:28
did that later. But inequality leads to
00:26:31
people being willing to elect demagogues
00:26:34
to say we can save you. And then you get
00:26:36
war coming out of it. What happened
00:26:38
after the World War II is
00:26:41
politicians realized that people had
00:26:43
been through the Great Depression, which
00:26:44
was horrific, and they'd been through
00:26:46
World War II, which was horrific. And in
00:26:48
that period, people in America were
00:26:51
talking about either a fascist world or
00:26:54
a communist world. So the Americans
00:26:56
realized they had to improve the living
00:26:58
standards of the average American
00:27:00
substantially to get away from that. And
00:27:02
if you look at, you know, the 1950s
00:27:04
and '60s, that's what is called the golden
00:27:07
age of capitalism because at that stage,
00:27:09
you could be a single male supporting a
00:27:12
wife and four kids and have a
00:27:15
comfortable lifestyle at the time.
00:27:18
That was where we started from. So the
00:27:19
war itself led to a focus upon
00:27:22
equality, a focus upon fairness and
00:27:24
getting as much as you can to the
00:27:26
poorest in society. And then we've
00:27:29
forgotten that over the last 80 years.
00:27:31
And we've now got back to massive
00:27:33
inequality once more. So I think
00:27:35
inequality causes wars. Wars in the
00:27:37
aftermath make people focus on equality
00:27:41
not to allow that horror to happen once
00:27:42
more. And then we forget and do the
00:27:44
whole damn thing again.
00:27:45
>> One of the um surprising things I
00:27:48
learned the other day was that the
00:27:50
country that is estimated to have the
00:27:52
biggest reserve of oil
00:27:55
is Venezuela.
00:27:56
>> Yep. The third country on this list that
00:27:59
is estimated to have the biggest reserve
00:28:00
of oil is Iran.
00:28:03
>> Yeah. Yeah.
00:28:05
>> Now, it doesn't take a genius. Funny
00:28:06
enough,
00:28:07
>> of two countries have added. Yeah. The
00:28:09
second country being America.
00:28:11
>> Well, it says Saudi Arabia.
00:28:12
>> Saudi Arabia. Well, that's already an
00:28:14
American vessel. Yeah.
00:28:15
>> Yeah. That's already basically they're
00:28:16
basically partners with America already.
00:28:18
>> And funnily enough, the fourth one is
00:28:19
Canada. And if you if you're listening a
00:28:22
lot to Trump's rhetoric, he said he was
00:28:23
going to take Canada and make it the
00:28:25
51st state or something.
00:28:27
it doesn't feel like a coincidence
00:28:29
that the countries whose leaders the US
00:28:31
are invading are the countries
00:28:33
that have the biggest supplies. And
00:28:35
Trump has already said, you know,
00:28:36
immediately he said after taking out
00:28:39
Maduro in Venezuela,
00:28:41
>> pulling him out of his bed with his wife
00:28:42
and flying him back to the US,
00:28:44
>> he already said that the oil's on the
00:28:45
way back to America.
00:28:47
>> Yeah. One might assume that much of the
00:28:49
motivation here with Iran is when they
00:28:52
were in negotiations with them, maybe
00:28:55
they weren't playing ball with the
00:28:56
oil. Maybe they were threatening
00:28:58
something with the oil and maybe it's
00:29:00
such an economic waste.
00:29:02
>> Well, maybe most of Trump's friends, if
00:29:04
he has them, are oil executives and they
00:29:07
can see the benefit for them in
00:29:08
controlling global oil and the one part
00:29:10
they can't control is Iran.
00:29:11
>> But I mean, it would it's backfired
00:29:13
pretty horrifically. I think one of the
00:29:14
great sayings in humanity is it's looked
00:29:16
like a good idea at the time, then you
00:29:19
do it and you realize you underestimated
00:29:21
your opponent. You have you don't
00:29:23
realize how difficult it is. Like you
00:29:25
mentioned, you know, talking about how
00:29:27
being wealthy can make you dissociate
00:29:29
from the problems that ordinary people
00:29:31
have. It can also make you dissociate
00:29:33
from reality in general. You don't
00:29:35
realize how difficult it is to
00:29:37
do something you want to have done. So
00:29:39
all these oil executives and people who
00:29:41
Trump socializes with could have
00:29:43
thought, take out Iran, America
00:29:45
dominates the global oil thing. We're
00:29:47
all going to be rich. Okay? But they
00:29:49
don't realize that Iran's been aware of
00:29:51
this possibility for 40 years. And
00:29:53
they're prepared. They're far better
00:29:55
prepared than the Americans and the
00:29:56
Israelis thought.
00:29:58
>> So you've got five scenarios laid out on
00:30:00
these cards in front of you here that
00:30:02
you think could happen next. I'm going
00:30:04
to ask you to explain to me what the
00:30:05
five scenarios are and then tell me
00:30:07
which one you think is most likely to
00:30:09
occur.
00:30:10
>> So, scenario one, which is the one that
00:30:12
I think Israel wants
00:30:15
this one: Iran is destroyed. Okay, if
00:30:18
that happens, we're all gone because to
00:30:20
destroy Iran, you're going to have to
00:30:21
use nuclear weapons. Okay, you can't
00:30:24
destroy it without obliterating it as
00:30:27
nuclear weapons do. And that's the
00:30:29
scariest. I don't think it's going to
00:30:30
happen. My main hope here is that Iran
00:30:32
realizes that possibility and they've
00:30:36
got a way to neutralize
00:30:39
not America's nuclear weapons, but
00:30:40
Israel's.
00:30:41
>> You think it's a possibility?
00:30:42
>> It's a possibility and it's what scares
00:30:43
the out of me because if this
00:30:45
happens, then we're all dead. Obviously,
00:30:48
a nuclear bomb doesn't just blow up an
00:30:50
individual target. It everything within
00:30:52
reach gets exploded into the atmosphere.
00:30:54
That's what led people to realize that
00:30:56
you couldn't have a nuclear war back in
00:30:58
the days when we had mutually assured
00:31:00
destruction as the as the policy. If you
00:31:03
attack a country, then you will also
00:31:05
die.
00:31:05
>> But can't they use narrow nuclear
00:31:07
weapons? Is that not a thing?
00:31:08
>> Well, um
00:31:10
>> smaller nuclear weapons.
00:31:12
>> Well, again, even if the weapons are smaller,
00:31:13
you're talking about destroying a country the size of Europe.
00:31:16
The weapons you'd need to make sure you
00:31:18
got every last potential element of Iran
00:31:21
neutralized. You're talking about
00:31:22
bombing something which is, you know,
00:31:25
virtually the size of Western Europe.
00:31:27
The amount of weapons you got to drop to
00:31:28
do that
00:31:30
and you've got to if you if you don't
00:31:32
get it right,
00:31:34
then they're going to come at you with
00:31:35
what they've got left.
00:31:36
>> The world has dropped nuclear weapons
00:31:37
before and people survive. Other
00:31:39
neighboring countries survived
00:31:41
>> only twice and only small weapons. The
00:31:43
weapons we're talking about in Hiroshima
00:31:46
and Nagasaki, they're about equivalent
00:31:48
to 20,000 tons of TNT. We're now talking
00:31:52
weapons of up to 20 million tons of TNT, the
00:31:55
biggest nuclear weapons. And if you
00:31:57
wanted to hit a country the size of Iran
00:32:00
and know you've neutralized it, so you
00:32:01
destroy the whole thing, you're talking
00:32:04
hundreds of those weapons.
00:32:05
>> If you had to give a percentage
00:32:07
probability of that outcome occurring,
00:32:10
would it be less than 1%?
00:32:13
If we didn't have a madman in
00:32:15
Washington, yes, it'd be less than 1%. Um, if
00:32:19
we didn't have a madman in Israel, less
00:32:21
than 1%, I think probably 5%.
00:32:23
>> 5% probability that
00:32:25
>> that's a possibility. I mean, again, you
00:32:27
know, this is trying to make sense of
00:32:28
the senseless.
00:32:29
>> Okay.
00:32:30
>> But I'd put it about less than 10% but
00:32:32
still scary as a possibility.
00:32:35
>> If we end up there, we're all gone.
00:32:38
I mean, you know, I know very little
00:32:40
about all these things, so that's the
00:32:41
disclaimer. Um, I'd say that I don't
00:32:46
think Israel would intentionally wipe
00:32:49
out the rest of the world or cause a
00:32:51
nuclear winter because that would
00:32:52
obviously impact them as well. But I
00:32:55
am quite scared of precedent setting.
00:32:57
And what I mean by that is if we
00:32:58
establish it being okay to drop nuclear
00:33:01
weapons on people you don't like, the
00:33:03
sort of domino effect of that for people
00:33:05
in Ukraine and other parts of the world
00:33:07
where there's conflict might then lead
00:33:09
to,
00:33:10
>> you know, mutually assured destruction.
00:33:12
>> Yeah. It's the it's the last possibility
00:33:14
you want to have happen. The fact that
00:33:15
it's even possible to contemplate it is
00:33:18
a terrifying prospect.
00:33:20
>> Let us hope.
00:33:21
>> Yeah. So scenario two
00:33:24
is Iran destroys the Gulf power
00:33:25
infrastructure. I think that's highly
00:33:27
likely.
00:33:28
>> Iran destroys Gulf power infrastructure.
00:33:30
What does that mean?
00:33:31
>> What it means is that all the Gulf
00:33:34
states have got their own power systems
00:33:36
mainly based on burning oil for obvious
00:33:38
reasons. Uh if you take out their power
00:33:40
structure systems then those countries
00:33:43
become uninhabitable.
00:33:44
>> Is that what's happening already? Cuz I
00:33:46
know Iran have attacked a few sort of
00:33:47
power facilities in the region. There
00:33:48
have been a couple. Well, there was one
00:33:50
attack on Saudi Arabia's power systems
00:33:53
and that took out two of the 14 units
00:33:55
that are critical for creating liquefied
00:33:57
natural gas and apparently it'll take 5
00:34:00
years to rebuild them and there are only
00:34:02
five companies on the planet that can
00:34:04
actually do that rebuilding. One quarter
00:34:06
of the world's liquefied natural gas comes
00:34:08
through the Strait of Hormuz. One tenth
00:34:10
of that has been destroyed. It's like 2
00:34:13
and a half percent of the world's energy supply
00:34:15
is gone for the next 5 years until those
00:34:18
are rebuilt. If Iran destroys the Gulf
00:34:20
power infrastructure, then Saudi Arabia,
00:34:22
Qatar, uh Dubai, they all become
00:34:25
uninhabitable.
00:34:26
>> And we're seeing that happen in parts. I
00:34:28
mean, it sounds like this these attacks
00:34:30
have slowed down a little bit, but it
00:34:32
was interesting that Iran's strategy was
00:34:33
to attack their neighboring sort of
00:34:35
partners and specifically targeting a
00:34:37
lot of their energy infrastructure. Is
00:34:38
that in part to apply pressure?
00:34:41
>> Yeah. If I attack Dubai, the leaders of
00:34:43
Dubai are going to call Trump and say,
00:34:44
"Listen, cut this out."
00:34:45
>> Oh, yeah. I mean, the pressure coming
00:34:46
back from the Arabian states on America,
00:34:49
I imagine, is quite immense right now,
00:34:51
saying, "Don't do it." It's quite
00:34:52
possible Israel could do it, like attack
00:34:55
Iran and then Iran does a retribution
00:34:57
attack. Trump, if you would have seen
00:35:00
his tweet this morning, I think he's put
00:35:01
it off to April the 6th before he says
00:35:03
he starts attacking power
00:35:04
infrastructure. if he attacks power
00:35:06
infrastructure in Iran. Iran has said we
00:35:08
will attack power infrastructure in the
00:35:10
Gulf States. So we've got till, you know,
00:35:13
what, 8 days. I think he's bluffing. I
00:35:16
hope he's bluffing. But if he does do
00:35:18
the attack then Iran will respond by
00:35:20
destroying either an equivalent
00:35:22
component of the Gulf States or the
00:35:25
whole infrastructure.
00:35:27
>> I don't think people quite realize how
00:35:29
costly it is for regions like Dubai when
00:35:33
Iran attack them. I was looking at some
00:35:34
of the data.
00:35:35
>> Yeah.
00:35:36
>> And according to current estimates and
00:35:38
historical risk assessments by Dubai
00:35:40
officials,
00:35:41
>> they lose a million per minute, which is
00:35:45
60 million per hour or 1.4 billion a day
00:35:50
when there's an unplanned emergency
00:35:52
shutdown just of their airport.
00:35:54
>> Their airports, let alone their power
00:35:56
systems. Yeah.
00:35:57
>> As we probably saw on the news, Iran had
00:35:59
flown what seemed like a couple of drones
00:36:01
into Dubai's airport, which meant that
00:36:02
it had to shut down. Yeah,
00:36:03
>> they're losing a billion a day because
00:36:06
that airport is closed. I think it's the
00:36:07
biggest airport in the world.
00:36:09
>> It is economic pressure
00:36:13
which will then trickle down to Trump
00:36:14
and sort of force his hand. So, they've
00:36:16
got a clear incentive to cause chaos.
00:36:18
>> Yeah.
00:36:19
>> And that partly what Iran is saying.
00:36:22
It's it's like a game of bluff. You
00:36:24
don't want to do this bluff. If that
00:36:25
bluff happens then the Arabian
00:36:28
Peninsula becomes uninhabitable
00:36:30
and therefore, I mean, if
00:36:33
people are forced out of there and
00:36:34
most of the residents in those countries
00:36:36
are not Saudis. They're third world
00:36:39
workers from India and Pakistan and the
00:36:41
Philippines and so on. They're being
00:36:43
paid lousy wages to to work on all these
00:36:45
systems. If they leave because the power
00:36:48
is not there to support them anymore,
00:36:49
then we lose the
00:36:52
entire energy contribution that that
00:36:54
region makes to the global economy
00:36:57
>> and the figure's skewed
00:36:58
>> and that figure I cited includes not
00:37:00
just lost airport revenue but then the
00:37:01
immediate impact on airlines cargo
00:37:03
logistics and the missed opportunity
00:37:04
cost of thousands of high-value
00:37:06
business travelers visiting the region.
00:37:07
Dubai's GDP is roughly 30%
00:37:10
dependent on the aviation and tourism
00:37:12
sectors so when the airport closes it
00:37:14
impacts tourism, hospitality, real estate
00:37:16
investment, global supply chains and
00:37:17
everything. So it's, um, quite
00:37:20
remarkable specifically with Dubai
00:37:22
>> because Dubai I think Dubai is a lovely
00:37:24
place. I've been multiple times. I
00:37:26
love going there.
00:37:28
>> But it felt really safe and so a lot of
00:37:30
people
00:37:30
>> It's not safe. Yeah.
00:37:31
>> It's not safe.
00:37:32
>> Yeah.
00:37:33
>> Yeah. A lot of people had chosen to
00:37:34
uproot their lives and move there and
00:37:36
you'd almost kind of forgotten you were
00:37:38
in the Middle East to some degree.
00:37:39
>> Yeah. Yeah.
00:37:40
>> But I think this is going to be a pretty
00:37:42
traumatic reminder for a lot of people
00:37:43
there
00:37:44
>> how fragile
00:37:45
>> how fragile
00:37:45
>> this area is and like that's the lesson
00:37:47
we're learning. It's the fragility of
00:37:49
the society we take for granted.
00:37:51
>> So that was scenario number two.
00:37:53
>> Okay. Scenario three. That's the one
00:37:55
that really scares me because that is
00:37:58
the Samson doctrine. You know the story
00:38:00
of Samson. Yeah. Okay. Samson is an
00:38:03
enormously strong individual who's
00:38:05
strong because of his hair. And then he
00:38:08
gets conned. This is an ancient story
00:38:10
from the Bible. And the woman who's
00:38:12
conned him shaves his hair. So he's
00:38:14
weakened. And then they put him in a
00:38:17
temple where he's standing between two
00:38:19
pillars and his hair is gone. He's bald.
00:38:21
He can't do a thing. They forget the
00:38:23
fact that his hair is starting to grow.
00:38:25
His hair gets to the stage where he's
00:38:27
now got his strength back. He pushes
00:38:29
those pillars and the whole thing
00:38:30
collapses and everybody dies. That's the
00:38:33
Samson doctrine. And that involves
00:38:35
Israel's nuclear weapons. If they
00:38:37
realize that they are going to lose this
00:38:39
war and it becomes existential for them
00:38:42
then one of the things they have claimed
00:38:44
that they do is unleash destruction on
00:38:47
the rest of the world like Samson
00:38:49
pushing the towers and the whole thing
00:38:51
comes collapsing down.
00:38:52
>> This is I mean going back to the
00:38:53
situation with Iran and Israel. One of
00:38:55
the things I was thinking a lot about
00:38:56
from some commentary that I'd seen is
00:38:59
Israel really have a motive to get rid
00:39:03
of Iran because Iran have repeatedly
00:39:05
threatened Israel. It's also because I
00:39:08
mean Israel is trying to get rid of the
00:39:09
Palestinians and in that sense Iran has
00:39:12
been probably the major bulwark
00:39:15
supporting the Palestinians, saying let the
00:39:17
Palestinians survive. Let the
00:39:19
Palestinian people continue existing.
00:39:22
And the Israelis have been pushing and
00:39:25
pushing and pushing the Palestinians
00:39:27
out. You know, it's a hornet's nest.
00:39:28
We've provoked a hornet's nest. Iran is
00:39:31
responding right now, I think, in a very
00:39:32
judicious way. But if the Israelis
00:39:35
realize they're facing an existential
00:39:37
defeat, that scenario, it would again
00:39:40
mean uh civilization potentially gets
00:39:43
destroyed. And just looking at some of
00:39:44
the things that Iran have said about
00:39:46
Israel, historically, the Supreme Leader
00:39:48
of Iran stated in 2015 that Israel would
00:39:51
not see the next 25 years. Other
00:39:54
officials said things like, "The end is
00:39:56
near." Um,
00:39:58
>> and in March 2026, Iran's tone shifted
00:40:02
from ideological to purely retaliatory
00:40:06
with the speaker of the Iranian
00:40:07
Parliament, Muhammad, stating that Iran
00:40:11
has officially declared that it
00:40:12
considers all Israel energy, water, and
00:40:14
IT infrastructure legitimate targets for
00:40:17
irreversible destruction with zero
00:40:20
restraint.
00:40:21
If we think about this from a psychology
00:40:23
perspective, you've got two neighbors.
00:40:24
They're both implying, either implicitly
00:40:26
or explicitly that they want to wipe the
00:40:28
other one out.
00:40:29
>> Yeah.
00:40:29
>> Trump is sort of this third party in in
00:40:31
the arrangement who's not in the region,
00:40:33
so he might be a little bit safer.
00:40:35
>> Those two parties that are against each
00:40:37
other, one of them has nuclear weapons
00:40:39
and the other appears to be trying to
00:40:42
make one. the neighbor that is Israel
00:40:45
presumably cannot let that happen
00:40:48
because if it gets to a point where they
00:40:49
both have nuclear weapons and they both
00:40:51
want to wipe each other out.
00:40:54
>> No, I I actually think that the old days
00:40:56
of mutually assured destruction were a
00:40:59
more stable time than what we're in now
00:41:01
because if you realize that if you
00:41:02
attack you also die, you don't attack.
00:41:05
>> But what if you think of death as being
00:41:07
a better thing than life?
00:41:10
You you have to have a society
00:41:12
continuing after you die. If you're
00:41:14
going to be a martyr, there has to be
00:41:15
people who are going to mourn your
00:41:17
death. If you believe being a martyr
00:41:18
means everybody else also dies, then you
00:41:20
don't do it.
00:41:21
>> So, do you think if Iran had nuclear
00:41:22
weapons, it would be a safer world?
00:41:24
>> I think it'd be safer because it would
00:41:26
tell the Israelis, stop attacking your
00:41:27
neighbors.
00:41:28
>> I sat with um a few nuclear experts and
00:41:32
one of the things that was shocking that
00:41:33
I learned is
00:41:35
>> if the United States wanted to launch a
00:41:37
nuclear weapon today,
00:41:38
>> Yeah. It is one person's decision.
00:41:40
>> I heard that. And that means Trump can
00:41:42
actually just make that decision.
00:41:43
>> He can make the decision on his own. He
00:41:45
doesn't need to consult Congress or
00:41:46
anybody else. He has someone who walks
00:41:48
around with a briefcase that has the
00:41:49
nuclear codes in at any moment. They
00:41:52
call it the football. And when I think about the
00:41:53
same in this region, actually, you don't
00:41:55
need a whole state to decide that they
00:41:58
don't like their neighbor. All you need
00:41:59
is one supreme leader
00:42:01
>> or Netanyahu to say, "Do you know what?
00:42:04
I'm near the end of my life and these
00:42:06
people have really pissed me off." Yeah,
00:42:08
that's right. And that's I mean I
00:42:10
thought there was at least some control.
00:42:11
I saw that segment with Annie
00:42:13
Jacobson. Yeah.
00:42:14
>> I thought there was at least some
00:42:15
control. He had to consult someone.
00:42:18
>> Or there had to be
00:42:19
circumstances that justified not
00:42:21
consulting someone.
00:42:22
>> If he's got that right, then we it comes
00:42:24
down to what's the behavior of the
00:42:26
person who carries the nuclear football.
00:42:28
Does he let Trump get hold of it? And
00:42:31
like there was another incident
00:42:33
way back, I think in the 70s or 80s,
00:42:36
that the Russian early warning system
00:42:39
reported that there was a nuclear attack
00:42:41
on the way to Russia and there was one
00:42:43
submarine commander or one element of a
00:42:46
submarine command system and they had to
00:42:48
have three people in the submarine who
00:42:49
agreed to to launch an attack and this
00:42:52
particular person refused.
00:42:55
If he'd agreed with the other two,
00:42:58
we'd have had a nuclear war. But
00:43:04
even the Russian submarine had three
00:43:06
people who had to make that decision.
00:43:08
So, we didn't have a nuclear war. Now,
00:43:10
we've got one maniac in this White House
00:43:12
who could do it. I'll play Annie
00:43:14
Jacobson's clip now where she talks
00:43:16
about the idea of sole authority which I
00:43:19
think is an important thing for people
00:43:20
to understand because when we think
00:43:22
about who we're electing to lead our
00:43:24
nuclear capable countries
00:43:27
>> you have to think about who you want to
00:43:29
give sole authority to
00:43:31
>> the United States president has sole
00:43:34
presidential authority to launch a
00:43:35
nuclear war
00:43:36
>> what does that mean
00:43:38
>> it's exactly like it sounds what's so
00:43:40
interesting is a lot of this stuff this
00:43:41
nomenclature that gets thrown at us.
00:43:43
If you just break it down: sole,
00:43:46
solo; presidential, he's the POTUS;
00:43:50
authority, he doesn't have to ask anyone
00:43:52
for permission. Not the SecDef, not
00:43:55
the chairman of the joint chiefs of
00:43:57
staff, not the Congress. I love the
00:43:59
worried look on your face in this moment
00:44:01
because it is once you know that
00:44:06
you say, well, first you might Google "is
00:44:08
it really true?" and you will get, for
00:44:11
example on Reddit, like, "that's not really
00:44:13
true." You'll get like hundreds of
00:44:15
thousands of people, you know, coming in
00:44:18
with their opinions about how that's not
00:44:20
really true. Well, it is really true. It's
00:44:23
absolutely true and in fact during the
00:44:25
former President Trump administration
00:44:29
Congress became so sort of I want to say
00:44:32
motivated or alarmed by this issue
00:44:34
meaning they were being asked questions
00:44:36
by the powers that be, "Is this actually
00:44:40
true?", that they released a report stating
00:44:43
specifically, and I quote in the book, "Yes,
00:44:45
it is true. As commander-in-chief,
00:44:48
the president has this sole authority.
00:44:50
He doesn't need to ask anyone.
00:44:53
>> So what is scenario four in your
00:44:55
envelopes there? Iran disables Israel's
00:44:58
nukes. Nobody can know. But I do believe
00:45:00
that Iran has not developed nuclear
00:45:02
weapons.
00:45:03
>> So you're hoping Iran disables Israel's
00:45:05
nuclear weapons?
00:45:06
>> I am. I hope that happens because that
00:45:08
takes out the nuclear option. Okay. We
00:45:10
won't see nuclear war as a result of
00:45:12
this. If the only nuclear weapons that
00:45:13
we know exist in the Middle East are
00:45:15
destroyed.
00:45:16
>> But if Iran starts disabling Israel's
00:45:18
nukes and attacking Israel
00:45:21
effectively, there's going to be an even
00:45:23
bigger problem. Well, not not if we're
00:45:25
talking conventional weapons. If it's
00:45:27
conventional weapons and ground troops,
00:45:29
then you don't end up with nuclear
00:45:31
winter and the death of everybody on the
00:45:33
planet.
00:45:33
>> Wait, so you're saying you hope Iran
00:45:35
invades Israel and takes out their
00:45:36
nuclear weapons?
00:45:37
>> No, that's not necessarily invasion. It
00:45:38
could be the missiles they've got left.
00:45:40
Again, we don't know how capable their
00:45:42
missiles are. The level of planning that
00:45:44
Iran has done in this war, I had no
00:45:46
idea of the fact they had those 31
00:45:49
regions, for example, until the war
00:45:50
began. My specialty is economics, not
00:45:52
global military politics. But once I
00:45:56
learned that, I thought they have really
00:45:57
thought this through. They have wargamed
00:46:00
what happens if they get attacked by
00:46:02
America. And they've wargamed it
00:46:04
comprehensively. Now, I hope they've
00:46:07
also wargamed if we start defeating
00:46:10
Israel and Israel realizes they're going
00:46:13
to be wiped out, then the possibility
00:46:15
for the Samson doctrine comes in. We
00:46:17
have to disable that before it happens.
00:46:19
How could they possibly? They don't
00:46:20
have a functioning military left in any
00:46:23
sort of typical sense. They don't have
00:46:26
ships left. They don't have planes left.
00:46:27
>> They don't have ships. They don't have
00:46:28
planes. But they have got missiles. And
00:46:30
we don't know how many missiles they've
00:46:32
got. We don't know where the missiles
00:46:33
are. Certainly the Americans would have
00:46:35
some intelligence. I think the word's
00:46:36
got to be used with inverted commas
00:46:38
these days, but some intelligence over
00:46:40
where they are located in Iran. But if
00:46:42
you listen to the Iranians talking about
00:46:44
it, they say they've got hundreds of
00:46:46
these facilities buried hundreds of
00:46:49
meters below the ground. If, with the
00:46:52
weapons they've developed, the
00:46:54
advanced rocketry they've developed,
00:46:56
they can evade Israel's Iron Dome, maybe
00:47:00
they can also get into and destroy
00:47:02
Israel's launch capabilities. And if
00:47:05
that happens, I think that's that would
00:47:06
be the best possible outcome because we
00:47:09
have a rogue state in the Middle East
00:47:12
which has nuclear weapons it will
00:47:14
neither admit to having nor put under
00:47:16
any treaty. They're not part of the nuclear
00:47:18
non-proliferation treaty. They won't
00:47:20
sign that treaty. We should never have
00:47:22
allowed that to happen. And if Iran gets
00:47:25
rid of them, I think it's the world's a
00:47:26
safer place.
00:47:26
>> Israel are just going to make more
00:47:28
nuclear weapons.
00:47:29
>> They have the resources. Uh, you need a
00:47:32
hell of a lot of technology and a hell
00:47:33
of a lot of intelligent people to do
00:47:35
that. And you've already lost the war by then.
00:47:36
>> How could Israel lose the war?
00:47:39
>> You've got a population of 90 million
00:47:41
in Iran and a population of less than 10
00:47:44
million in Israel.
00:47:45
>> But there's a sort of
00:47:46
technological
00:47:48
gulf.
00:47:49
>> It's not as big as we thought it was.
00:47:51
We're only realizing now the level of
00:47:52
technology that Iran has. I mean the
00:47:55
things which Iran is doing in this war
00:47:56
so far have surprised everybody who
00:47:59
hasn't got the background of
00:48:00
intelligence to tell them what's going
00:48:03
on. It's an educated, sophisticated
00:48:06
culture, far more so than the caricature
00:48:08
we've had of it in the West
00:48:10
in the past. So they
00:48:12
>> They don't have nearly the same
00:48:13
level of resources and technology,
00:48:19
and, I would say, maybe the sort of
00:48:20
sophisticated, advanced systems, from
00:48:23
a war perspective, that Israel does.
00:48:27
>> We think. We don't know. We're assuming
00:48:30
>> even the intelligence services, even
00:48:31
their planes and their
00:48:33
missiles and their defense systems are
00:48:35
like profoundly more advanced than
00:48:36
Iran's.
00:48:38
>> If that was the case, we wouldn't be
00:48:39
having this conversation. It's three
00:48:41
or four weeks after the war began. You
00:48:44
know, the original belief Trump had was
00:48:46
it'd be over in one day. That's proved
00:48:48
false.
00:48:49
>> I think that's in part because of what
00:48:50
you said because they've prepared for
00:48:52
decapitation. If I was the supreme
00:48:54
leader of Iran, yeah, that's the sort of
00:48:56
approach I would have taken, which is
00:48:58
you take me out and actually you've got
00:49:00
a bigger problem because now you've got
00:49:01
to negotiate with 31 different
00:49:04
sort of sub-militaries, and that's an
00:49:06
impossible task.
00:49:07
>> Yeah. Yeah. And the Iranians were
00:49:09
aware of that, and they've got a, you
00:49:11
know, a huge army. They've got they can
00:49:13
conscript far more people than Israel
00:49:15
has. To me, if it gets down to a
00:49:18
conventional military conflict, then it's possible
00:49:21
that you know Israel could lose that as
00:49:23
well.
00:49:23
>> On March 21 Trump threatened to
00:49:26
obliterate Iran's power plants if they
00:49:28
did not fully reopen the Strait of
00:49:29
Hormuz within 48 hours.
00:49:32
>> He then came out and said that he was
00:49:34
pausing that because Iran were
00:49:36
negotiating
00:49:37
>> Um, and he says he thinks he's
00:49:39
negotiating with the right person. As of
00:49:42
yesterday, Trump has announced a 10-day
00:49:44
pause until April 6th on destroying
00:49:48
energy plants, claiming that indirect
00:49:50
talks are going very well and that Iran
00:49:52
is begging to make a deal according to
00:49:54
the Guardian. So, what's going on there
00:49:57
in your view?
00:49:58
>> I think he's gaming the markets. I
00:50:01
really think he's using it to cause the
00:50:03
oil price to go up and down, gaming it
00:50:05
at either side, and somebody in his
00:50:08
circle, or several people, are making a fortune
00:50:10
playing it. I think that's the case.
00:50:12
>> Yeah, I do. I mean
00:50:13
>> Because there are lots of ways to make
00:50:15
money that don't involve crashing the
00:50:18
global economy, losing the midterms.
00:50:20
>> Yeah, you'd think that. You've got
00:50:21
ethics, you've got empathy,
00:50:23
you've got morals. Trump has none of
00:50:25
those things.
00:50:26
>> Do you not think it's just that,
00:50:27
again if we look at Trump's pattern of
00:50:29
behavior over time, even with the
00:50:30
tariffs?
00:50:31
>> Yeah. The same pattern of behavior
00:50:32
occurred there where he would come out
00:50:34
and say, "Every leader is calling me.
00:50:36
They can't stop calling me. They all
00:50:38
want to make a deal. I'm going to do a
00:50:39
tariff on you 10%. Wait, no, I'm not.
00:50:41
Pause. Call me."
00:50:42
>> Yeah.
00:50:43
>> It's the same pattern of behavior.
00:50:44
You make a threat. Yeah.
00:50:46
>> You then
00:50:48
>> blackmail the person to try and
00:50:49
negotiate with you. When they don't, you
00:50:51
hit them with the thing hard and
00:50:53
eventually, at the end of the day, you
00:50:55
don't really do any of the stuff you
00:50:56
threaten to do
00:50:57
>> because you've sort of
00:50:59
>> manipulated a person into getting your
00:51:01
way. It's the same pattern of behavior.
00:51:03
We're going to smash you if you don't
00:51:04
call me.
00:51:05
>> Yeah,
00:51:06
>> they do or don't call. He announces to
00:51:07
the world that they called. They're
00:51:09
begging. Look, it says here, "They're
00:51:10
begging me for a deal. I'm going to give
00:51:12
them 10 more days."
00:51:14
>> To me, it sounds like he's trying to
00:51:15
build his golden bridge to get the
00:51:17
out of there. What he's imagining is
00:51:19
he's dealing with somebody like himself
00:51:20
in Iran. Okay? He's projecting
00:51:24
how he would react to these things. He's
00:51:26
obviously projecting his own behavior
00:51:28
onto the system. And it's projection
00:51:30
rather than understanding. So if you
00:51:32
decapitate, you know, if you took out
00:51:34
Trump, the fear of being, you know,
00:51:36
assassinated: yes, well, let's bargain, what do
00:51:38
you want me to do? He thinks that works
00:51:40
in Iran. It doesn't.
00:51:41
>> You can look at his behavior and sort of
00:51:43
understand what he wants. He wants to
00:51:44
win this war and, you know, he
00:51:47
wants to win the war and get
00:51:48
out of there, because that's what he's
00:51:49
been saying. We've won. We've won. We've
00:51:50
won every day. We've won. More missiles
00:51:52
go in. We've won. So that's clearly what
00:51:54
he wants to happen. The problem is
00:51:55
winning here doesn't seem like a
00:51:57
straightforward thing.
00:51:57
>> No, it's not going to happen.
00:51:58
>> No pun intended with the Strait. But
00:52:00
it really doesn't seem like a
00:52:01
straightforward thing.
00:52:02
>> So I it's my opinion now that they are a
00:52:05
little bit stuck because if you leave
00:52:07
now you lose.
00:52:08
>> Yeah.
00:52:09
>> Iran start firing at Israel. Israel
00:52:11
don't stop even though you tell them to.
00:52:13
Yeah,
00:52:13
>> they start firing at each other. The
00:52:15
whole thing blows up. They keep the
00:52:17
Strait of Hormuz closed. Oil prices go
00:52:20
up. It looks terrible, terrible,
00:52:21
terrible for Trump. He
00:52:23
might find himself in a Bush situation
00:52:25
where his legacy, and I think that's
00:52:26
such an important word, a man that can't
00:52:29
be elected for a third term. His legacy
00:52:32
is tarnished in the same way that Bush's
00:52:34
legacy was tarnished by going to war in
00:52:36
the Middle East.
00:52:37
>> I think his greatest fear, Trump's
00:52:39
greatest fear, you think back through
00:52:41
all of these moments over the last
00:52:42
couple years where he talked about the
00:52:43
Nobel Prize,
00:52:44
>> I think he's trying to put himself on
00:52:45
the Mount Rushmore of presidents.
00:52:47
>> Yeah.
00:52:47
>> In history's mind.
00:52:49
>> And I think how this situation plays out
00:52:52
now, the sole thing he's thinking about
00:52:53
is his legacy. And right now, being
00:52:56
stuck in a war and contemplating putting
00:52:59
ground troops in is arguably the worst
00:53:01
thing for one's legacy. Americans dead.
00:53:03
>> Yeah. And lots of Americans dead. These
00:53:05
wars are like you think about Vietnam.
00:53:07
These wars are never really won.
00:53:08
>> No, they never are. America hasn't won a
00:53:10
war since World War II and even World
00:53:12
War II was won by the Russians more so
00:53:14
than the Americans. So, we have this
00:53:16
picture of America as being this, you
00:53:18
know, invincible military power. But it
00:53:20
lost in Vietnam. It lost in Iraq. It
00:53:23
lost in Afghanistan. America's failed in
00:53:25
all of these. This is another American
00:53:27
failure, but on a scale far beyond what
00:53:30
happened in Afghanistan and Vietnam.
00:53:32
>> Do you think he will send ground troops
00:53:33
in? Yes, I do. Uh, and like I've seen
00:53:36
people talking about where the troops
00:53:37
might land. And the only part where they
00:53:39
can land is right towards this edge
00:53:41
here, by Pakistan, where they might
00:53:43
land between 2,000 and 10,000 troops. I'd
00:53:46
hate to be one of those troops because
00:53:47
it's a suicide mission. Again, with those
00:53:49
31 provinces, the separate military
00:53:52
commands they've got, the weapons
00:53:54
they've got hidden underground, the
00:53:56
troops themselves, who, if you know
00:53:58
that there are Americans landing and
00:54:00
you're an Iranian soldier, you are
00:54:03
going to attack them like nobody's
00:54:05
business and not be afraid of your own
00:54:07
death because you do think if you get
00:54:09
martyred, it's the remaining people that
00:54:11
you're defending. There will be people
00:54:12
who recognize you as a martyr. It's
00:54:15
horrific. If you had to give a sort of
00:54:17
percentage probability of them putting
00:54:18
ground troops in,
00:54:19
>> I would say more than 50%. We're going
00:54:22
to find out in the next couple of weeks.
00:54:24
>> Much of the reason most people haven't
00:54:26
posted content or built their personal
00:54:27
brand is because it's hard and it's
00:54:30
time-consuming and we're all very, very
00:54:31
busy and if you've never posted
00:54:33
something before, there's so many
00:54:36
factors in your psychology that stop you
00:54:38
wanting to post. What people will think
00:54:40
of you, am I doing this right? Is the
00:54:41
thing I'm saying absolutely stupid? All
00:54:44
of these result in paralysis, which
00:54:46
means you don't post and your feed goes
00:54:48
bare. I'm an investor in a company
00:54:50
called Stanto, which you've probably
00:54:52
heard me talk about. And what they've
00:54:53
been building is this new tool called
00:54:55
Stanley that uses AI, looks at your
00:54:57
feed, looks at your tone of voice, looks
00:54:59
at your history, looks at your best
00:55:00
performing posts, and tells you what you
00:55:02
should post, makes those posts for you.
00:55:04
You can also just use it for
00:55:06
inspiration. And sometimes what we need
00:55:08
when we're thinking about doing a post
00:55:09
for our social media channels is
00:55:11
inspiration. Building an audience has
00:55:13
fundamentally changed my life and I
00:55:14
think it could change yours, too. So,
00:55:16
I'm inviting you to give this new tool a
00:55:18
shot and let me know what you think. All
00:55:20
you have to do is search
00:55:21
coach.stand.store
00:55:23
now to get started. This company that
00:55:25
I've just invested in has grown like
00:55:27
crazy. I want to be the one to tell you
00:55:28
about it because I think it's going to
00:55:29
create such a huge productivity
00:55:30
advantage for you. Whisperflow is an app
00:55:32
that you can get on your computer and on
00:55:34
your phone on all your devices and it
00:55:36
allows you to speak to your technology.
00:55:37
So, instead of me writing out an email,
00:55:38
I click one button on my phone and I can
00:55:41
just speak the email into existence and
00:55:43
it uses AI to clean up what I was
00:55:45
saying. And then when I'm done, I just
00:55:47
hit this one button here and the whole
00:55:48
email is written for me. And it's saving
00:55:50
me so much time in a day because Whisper
00:55:54
learns how I write. So on WhatsApp, it
00:55:55
knows how I am a little bit more casual.
00:55:57
On email, a little bit more
00:55:58
professional. And also, there's this
00:55:59
really interesting thing they've just
00:56:00
done. I can create little phrases to
00:56:02
automatically do the work for me. I can
00:56:04
just say Jack's LinkedIn and it copies
00:56:06
Jack's LinkedIn profile for me because
00:56:08
it knows who Jack is in my life. This is
00:56:10
saving me a huge amount of time. This
00:56:11
company is growing like absolute crazy.
00:56:13
And this is why I invested in the
00:56:14
business and why they're now a sponsor
00:56:16
of this show. And Whisper Flow is frankly
00:56:17
becoming the worst-kept secret in
00:56:20
business, productivity, and
00:56:21
entrepreneurship. Check it out now at
00:56:22
Whisper Flow spelled w
00:56:26
lw.ai/stephven.
00:56:29
It will be a game changer for you.
00:56:31
>> What is the best case scenario? The
00:56:34
Americans have to realize they've lost.
00:56:36
They've now got to negotiate the terms
00:56:37
of reparation. And what Iran has
00:56:39
proposed, when you look at Iran's terms,
00:56:42
they're extremely reasonable. They're
00:56:43
saying America
00:56:45
leaves the whole region. America no
00:56:47
longer comes back in this region. No
00:56:49
military bases, no agreements. This
00:56:52
becomes an Iranian protectorate. That
00:56:54
becomes an empire, not an Arabian but an
00:56:56
Iranian empire, because
00:56:58
they're not Arabs. They're Persians. Uh,
00:57:00
so this becomes like a Muslim part of
00:57:03
the world. You can actually take
00:57:06
the whole region out to here; it's all
00:57:08
Muslim. And part of
00:57:10
the weird religious
00:57:11
element here is you've got the Sunni sect
00:57:13
and the Shiite sect, which is a bit like
00:57:15
the Protestants versus the Catholics if you go
00:57:18
back 500 years. And what we're seeing
00:57:20
here is like the Hundred Years' War that
00:57:22
occurred in Europe, back in the days when
00:57:25
Protestant versus Catholic was
00:57:27
a serious thing. Um, so we're seeing a
00:57:29
religious war being fought here, and
00:57:32
the Sunni majority, about 90% of Muslims
00:57:34
are Sunni, have focused on their
00:57:36
rivalry with the Shiites. And so what
00:57:39
they've done is they've sided with this
00:57:40
mob, the United States;
00:57:44
they've sided with the Christians
00:57:46
to strengthen their own Muslim sect,
00:57:49
which is the Sunni sect, against the
00:57:51
Shiite sect. Iran is
00:57:52
predominantly Shiite. Now what's
00:57:54
happening here is this. The
00:57:56
reason the
00:57:59
Arabs agreed to American bases
00:58:02
here, military bases, is they thought it would
00:58:04
protect them from Iran. As soon as
00:58:06
the war starts, those bases are
00:58:08
obliterated, the Americans leave, and they
00:58:10
realize that hasn't worked at all. So
00:58:12
the deal the Sunnis made to side with
00:58:14
the Christians has proved to be an
00:58:16
extremely bad deal. You're going to have
00:58:19
to have a change in who rules these
00:58:21
countries to enable it to happen. But I
00:58:23
think the persuasive case coming out of
00:58:25
this within the Muslim areas is Muslims
00:58:28
stick together. Don't cooperate with the
00:58:30
Christians.
00:58:31
>> Don't cooperate with the United States.
00:58:32
>> I think that's what's going to happen.
00:58:34
>> You think that's going to happen?
00:58:35
>> I hope so because that at least gives us
00:58:37
something which is relatively stable.
00:58:38
This becomes a region that is Muslim.
00:58:41
>> When you say this, you mean the Middle
00:58:42
East?
00:58:42
>> I mean the whole Middle East, Saudi
00:58:43
Arabia, Iran, Iraq, Pakistan as well
00:58:46
because it's a Muslim country.
00:58:47
Afghanistan. This region becomes Muslim
00:58:51
dominated. Shiites and Sunnis start to reconcile. I
00:58:53
mean, the whole idea of Catholics
00:58:56
fighting Protestants has completely
00:58:58
dissipated. There's no level in
00:59:01
the West anymore of large-scale,
00:59:04
military-type animosity between Catholics and
00:59:07
Protestants. That's what's happening
00:59:09
over here. Those
00:59:12
religious conflicts
00:59:13
within Christianity disappeared,
00:59:15
largely speaking. They're still
00:59:17
happening within the Muslim religion.
00:59:20
This could persuade them that that's got
00:59:22
to end.
00:59:23
>> So, we've got one more scenario.
00:59:24
Scenario five.
00:59:25
>> Iran develops nuclear weapons. I'd
00:59:28
rather four happen than five.
00:59:29
>> Which of these five outcomes do you
00:59:31
think is most probable to happen?
00:59:35
>> I think the most likely outcome is Iran
00:59:38
disables Israel's nuclear weapons.
00:59:40
Because Iran has been so prepared for
00:59:42
this conflict in a way that America has
00:59:44
not, in a way that Israel was not. I
00:59:46
hope they're also prepared for the
00:59:48
eventuality of having to neutralize
00:59:50
Israel's nuclear weapons.
00:59:52
>> You think the highest probability is
00:59:53
Iran disabling Israel's nukes?
00:59:56
>> Yeah, I hope I'm right. I mean, if Iran
00:59:58
gets destroyed, then this leads not to
01:00:01
Iran developing nuclear weapons, but
01:00:03
every potential rival for America on the
01:00:06
planet developing nuclear weapons. We go
01:00:09
to a nuclear war dominated world. Do you
01:00:11
not think it's more likely that Trump is
01:00:14
going to find himself a golden bridge to
01:00:17
get out of this situation? He's going to
01:00:18
call Netanyahu in Israel and say, "Stand
01:00:21
down, please. I'm going to announce that
01:00:23
we've won this war. I'm going to
01:00:24
announce that we've done a deal. It's
01:00:26
all over."
01:00:28
>> Well, without doubt, whatever happens,
01:00:29
Trump is going to say he won. Okay,
01:00:32
that's again the narcissistic
01:00:33
personality disorder thing. He simply
01:00:36
couldn't bring himself to stand on a
01:00:38
stage and say, "I lost." I mean, think
01:00:40
about the biggest insult that Trump ever
01:00:42
made on his Apprentice show. You're
01:00:44
a loser. Okay? Being a loser is the
01:00:46
absolute worst possible thing that
01:00:48
anybody can be in his mind. If he has to
01:00:50
say, "I'm a loser," then his life is
01:00:52
over in that sense. His
01:00:55
self-image is over. So, whatever deal
01:00:58
comes out, he's going to say he won. For
01:01:00
the average person that's
01:01:01
listening now, when they hear all this
01:01:03
conflict going on in the world from
01:01:05
an economic perspective, is there
01:01:07
anything they can be doing to protect
01:01:09
themselves against some of these
01:01:11
downstream consequences?
01:01:12
>> Well, I think one thing is
01:01:14
we've now got to the stage where you can
01:01:16
buy your own solar systems for your
01:01:19
house. You need something which means
01:01:21
you are not dependent upon oil anymore.
01:01:23
I think we've trivialized the dangers
01:01:25
of climate change for the last half
01:01:27
century. We've done very little
01:01:29
to reverse it. This is telling people
01:01:31
that if you relied upon oil, you've got
01:01:34
a fragile existence. Even if it cost you
01:01:36
more to build solar, you've got to build
01:01:38
solar as your own alternative energy
01:01:40
system. Cuz without energy, there's no
01:01:42
civilization.
01:01:43
And that's what we're learning the hard
01:01:45
way from this conflict. So I think
01:01:47
the individual response is going to be to get
01:01:50
some way to have your own power source
01:01:52
and for most people that means having
01:01:54
solar. One man who has done a lot for
01:01:56
both solar and sustainable energy is
01:01:59
Elon Musk.
01:02:00
>> He has. He's also helped get bloody
01:02:02
Trump elected. So I think you've got to
01:02:04
score that against him as well. But
01:02:06
yeah, his work on solar and
01:02:08
power and rocketry, I've absolutely
01:02:10
admired that and I see that as a
01:02:12
critical positive contribution. But
01:02:14
getting Trump elected, he played a major
01:02:16
role in that. He should learn from that
01:02:18
mistake and get the out of
01:02:19
politics. He has backed off politics
01:02:22
now, which is
01:02:22
>> I think he's realized how poisonous it
01:02:24
is. Yeah.
01:02:25
>> Yeah. It sounds like he's realized you
01:02:26
can't really change the beast. No,
01:02:28
>> he tried.
01:02:29
>> Yeah. He should stick with the area where
01:02:30
he's mature, which is what he does with
01:02:32
energy systems and what he does with
01:02:34
rocketry. I mean, in terms
01:02:36
of legacy, uh, he's tainted his legacy
01:02:38
by getting involved in politics. Go back
01:02:40
to engineering. So you say that you
01:02:43
think people should invest in solar for
01:02:45
their homes to get their own energy
01:02:47
sources so they're a little bit
01:02:48
insulated from these sort of
01:02:49
macroeconomics. Is there anything else
01:02:51
they should be thinking about? You know,
01:02:52
the average person in the cost-of-living
01:02:54
crisis. What what happens next? What
01:02:56
should they be doing now?
01:02:57
>> The thing that I'm most worried about
01:02:58
with this is the impact upon food. I'm the
01:03:00
last person to talk about growing your
01:03:03
own food. I've never done it. I've
01:03:06
got brown thumbs, not green ones. But I
01:03:09
think if you can have any way to produce
01:03:10
your own food, you've got a bit of
01:03:13
insulation against what's happening at
01:03:14
the global level. The lesson that comes
01:03:16
out of this is self-sufficiency.
01:03:19
If we don't have self-sufficiency, then
01:03:21
these sorts of global chaotic things can
01:03:23
destroy you completely with you having
01:03:25
no recourse. If you have some degree
01:03:27
of self-sufficiency, you can survive.
01:03:29
>> And how does one create
01:03:30
self-sufficiency? Growing your own food
01:03:32
is quite expensive and slow, isn't it?
01:03:33
>> Yeah, extremely.
01:03:34
>> So, how does one develop
01:03:36
self-sufficiency in these sorts of
01:03:37
economic climates? Is it saving money or
01:03:40
is it um
01:03:41
>> I think it's having your own physical
01:03:42
resources close to you that enable you.
01:03:45
Money doesn't matter if you can't buy
01:03:47
the product in the first instance. The
01:03:49
product doesn't exist anymore. So, one
01:03:52
thing that happened during World War II
01:03:53
is a large amount of food was grown in
01:03:55
the UK by people turning their gardens
01:03:57
into market gardens.
01:03:59
>> I've heard you make a few predictions
01:04:01
about the future of the economic
01:04:03
markets. You know, you're famous for
01:04:04
predicting 2008 and the financial crash
01:04:07
that occurred then. I've heard you
01:04:08
saying that you think because of AI
01:04:10
there's going to be another financial
01:04:11
crash around the corner within one or
01:04:13
two years.
01:04:14
>> Yeah. What's happening with AI is a
01:04:16
classic economic boom and bust cycle
01:04:19
overlaid on the fact that AI can also
01:04:21
eliminate a huge amount of employment
01:04:24
and we've never seen that possibility
01:04:27
in the past on that scale. But a common
01:04:30
pattern in capitalism is that some new
01:04:33
technology will be developed like
01:04:34
railways for example. Some people see
01:04:37
the potential profitability of railways.
01:04:39
Everybody pours in creating railways.
01:04:42
You get too many railways built. Then 90%
01:04:45
of the companies that create the
01:04:47
railways go bust. But then we all have
01:04:49
these rail systems that we benefit from
01:04:51
afterwards. So that's the classic
01:04:54
pattern. Joseph Schumpeter was the
01:04:56
person who best described that,
01:04:58
the Austrian economist from the
01:05:00
early 20th century. So he said you'll
01:05:03
get the banks financing a new
01:05:04
investment area; that investment produces
01:05:07
a new technology which causes a boom
01:05:09
while you're building the technology but
01:05:11
when the technology comes online it
01:05:13
undercuts existing businesses and causes
01:05:16
a slump. So there's a boom-and-slump cycle,
01:05:18
and AI is a natural example of that and
01:05:21
what you get is massive overinvestment
01:05:23
in the first instance because everybody
01:05:25
who invests in AI has the ambition of
01:05:28
being the only AI provider on the
01:05:29
planet. Therefore you get too many
01:05:31
companies investing; there's too much
01:05:32
money going into it. That's what causes
01:05:34
a boom. But then when the technology
01:05:36
comes online because it undercuts
01:05:38
existing technologies you have a slump.
01:05:41
And when you look at the investment
01:05:42
taking place at the moment, the big tech
01:05:44
companies, Meta, Amazon, Microsoft,
01:05:46
Alphabet, which owns Google, and Oracle are on track
01:05:49
to spend 720 billion on AI
01:05:54
infrastructure in 2026 alone, while the
01:05:56
revenue they're making is less than 20%
01:05:59
of that. We are seeing a 5:1
01:06:02
ratio of money being spent versus money
01:06:04
coming in. Yeah.
01:06:05
>> Which is historically unsustainable.
01:06:07
>> Yeah. And I think that's true. There
01:06:08
there has to be a slump coming out of
01:06:10
this. And in in a sense, that's part of
01:06:12
the natural cyclical behavior of
01:06:15
capitalism because if you want to make a
01:06:16
profit, you've got to bring in
01:06:17
technology that undercuts everybody
01:06:19
you're currently rivals with. So that's
01:06:21
the railways are a classic example
01:06:23
there. You know, you used to get around
01:06:24
by carriage instead. You undermine the
01:06:27
carriage companies by bringing in the
01:06:28
railways. But the ultimate benefit,
01:06:30
society benefits because now you've got the
01:06:32
railways for transportation. So that's
01:06:34
the same sort of thing that AI is doing
01:06:36
this time around. But 90% of
01:06:38
those companies are going to fail.
01:06:40
>> I mean this is kind of what we're seeing
01:06:41
already. So the failure rate of AI
01:06:42
specific startups has hit 90% in 2026.
01:06:45
>> Wow. That's luck.
01:06:46
>> Yeah, you predicted that one
01:06:48
correctly. Significantly higher than the
01:06:49
70% average for general technology.
01:06:52
Roughly 95% of enterprise AI pilots fail
01:06:54
to move into production while they
01:06:56
incur massive cost. The other thing I
01:06:58
think a lot about is um
01:07:00
>> a lot of startups now are raising a lot
01:07:02
of money at crazy crazy valuations. I
01:07:05
can think of one particular startup I
01:07:06
know; they're making like a couple of
01:07:07
million dollars a year. They've raised
01:07:09
at a billion dollar valuation
01:07:11
>> and because they've got the word AI on
01:07:13
them. And the thing is
01:07:15
>> Because everyone's in such a frenzy
01:07:16
at the moment about AI, they're probably
01:07:18
going to raise at a 2 billion valuation
01:07:19
6 months from now.
01:07:20
>> When you think about what's going on
01:07:21
there, someone somewhere is putting
01:07:23
their money in
01:07:24
>> and they're going to lose it all.
01:07:25
>> And they're going to lose it all. And
01:07:26
when when everybody starts losing all
01:07:28
their money very very quickly, you see
01:07:29
this contraction.
01:07:30
>> Yeah. where everybody realizes that
01:07:31
their paper gains, the gains they
01:07:33
thought they had on paper because the
01:07:34
valuation went up, have just evaporated.
01:07:37
And when you see that, you have to
01:07:38
quickly count your pennies.
01:07:40
>> Yeah.
01:07:40
>> And get frugal
01:07:41
>> and pull back in again.
01:07:42
>> Pull back in again. Lay people off and
01:07:44
so on and so forth. So I I actually do
01:07:47
personally believe that we're probably
01:07:48
within 24 months of a pretty severe
01:07:51
contraction. And that won't just impact
01:07:53
these tech oligarchs, it'll impact all
01:07:55
of us in different ways.
01:07:57
>> Yeah, it's a boom-and-bust cycle. I mean
01:07:59
the only thing which we've experienced
01:08:00
in our own lives which is similar would
01:08:02
be the telecommunications bubble and
01:08:04
then the internet bubble between 1990
01:08:06
and 2001, 2002.
01:08:09
um we don't get bubbles in the internet
01:08:11
anymore because that's now a stable
01:08:13
technology in that sense. But it
01:08:15
wasn't as big a one; this is much bigger.
01:08:17
>> What do we do as entrepreneurs, as team
01:08:21
members, and as companies? What do we do at
01:08:22
this moment if what you're saying is
01:08:24
correct that there will be a
01:08:26
>> a boom and bust
01:08:27
>> a boom and bust, which I think absolutely
01:08:29
every smart person that I've spoken to
01:08:30
agrees that there will be a bust soon.
01:08:32
>> Yeah.
01:08:33
>> Their timelines vary.
01:08:35
>> Yeah.
01:08:35
>> But what does one do right now in March,
01:08:39
April 2026 to prepare for this?
01:08:42
>> Well, you put money aside if you can.
01:08:43
You buy other assets you think are going
01:08:45
to survive the boom-and-bust cycle.
01:08:47
>> Like what?
01:08:47
>> That's the trouble. I mean, gold's been
01:08:49
driven up. Gold's now been driven down.
01:08:51
Uh people are buying Bitcoin, but
01:08:52
Bitcoin is collapsing as well. In
01:08:55
some ways you really can't. It's
01:08:58
like saying, what do I do during an
01:08:59
earthquake to not fall over? In terms of
01:09:01
insulating yourself, I really can't see a
01:09:03
way of insulating yourself from the
01:09:05
downturn. But I'm not as
01:09:07
worried about that as the long-term
01:09:08
consequences of AI because this is the
01:09:12
first technology which implies you can
01:09:14
actually virtually eliminate labor as
01:09:16
necessary for producing output, because
01:09:18
you can use AI rather than clerks. And,
01:09:21
I know this is a long way from
01:09:24
being feasible, but robots could replace
01:09:26
process workers and then suddenly
01:09:29
something which employs 70% of the
01:09:31
global population is no longer
01:09:33
necessary. And then what do you do in
01:09:36
that situation? What I've seen which I
01:09:38
respect coming out of the tech bros in
01:09:41
America is they're talking in terms of
01:09:43
universal basic income.
01:09:45
>> You think a universal basic income is a
01:09:46
good idea? I we should probably explain
01:09:48
what that is.
01:09:49
>> Yeah. Well, it's it's the state provides
01:09:52
everybody with enough money to stay
01:09:54
alive. That's the basic idea. Rather
01:09:56
than having to work for a living at the
01:09:58
minimum, you get paid an amount of money
01:10:01
that means you can buy the goods and
01:10:02
services that are necessary to stay
01:10:04
alive. You don't necessarily prosper,
01:10:07
but you get enough to survive. And so
01:10:09
that's the idea of UBI. Now, at the
01:10:11
moment, to survive, you got to have a
01:10:12
job. And like that guy you mentioned is
01:10:14
working at three three jobs right now.
01:10:16
if he got a UBI, he wouldn't have to
01:10:18
work at those three jobs. He might work
01:10:20
at one or he might actually consider his
01:10:22
own business possibilities in that
01:10:25
situation. So, I think universal basic
01:10:27
income is a necessity given what
01:10:30
robotics and AI can do to employment.
01:10:34
Every time I've tried to improve
01:10:35
something in my life, like my
01:10:36
businesses, my health, my relationships,
01:10:38
I've noticed that the biggest shifts
01:10:40
have come from being better informed.
01:10:42
And when it comes to our health, most of
01:10:43
us know very, very little. So when our
01:10:45
team was approached about partnering
01:10:47
with function health, it felt very much
01:10:49
aligned. Their team has developed a way
01:10:50
of giving you a full 360-degree view of
01:10:53
your health, many of the things that are
01:10:54
going on in your body in the form of
01:10:56
different tests. You do one blood draw
01:10:58
and it gives you access to over 160 lab
01:11:01
results, hormones, heart health,
01:11:03
inflammation, stress, toxins, the whole
01:11:06
picture. I use it and so have many of my
01:11:08
team members.
01:11:08
>> You sign up and you schedule your test
01:11:10
and once you're done, you get a little
01:11:11
report like the one I have here. I can
01:11:13
see my in-range results, my out of range
01:11:15
results, and there's a little AI
01:11:17
function, too. So, if I have any
01:11:18
questions about my out of range results,
01:11:20
I can just go in there and ask it any
01:11:22
question I want. And these tests are
01:11:24
backed by doctors and thousands of hours
01:11:25
of research.
01:11:26
>> It's $365 for a yearly membership. Go to
01:11:29
functionhealth.com/doac
01:11:32
and use the code DOAC25
01:11:34
for $25 off your membership. This is
01:11:37
something that I've made for you. I
01:11:39
realized that the Diary Of A CEO audience are
01:11:41
strivers. Whether it's in business or
01:11:43
health, we all have big goals that we
01:11:45
want to accomplish. And one of the
01:11:46
things I've learned is that when you aim
01:11:48
at the big big big goal, it can feel
01:11:51
incredibly psychologically uncomfortable
01:11:54
because it's kind of like being stood at
01:11:55
the foot of Mount Everest and looking
01:11:57
upwards. The way to accomplish your
01:11:59
goals is by breaking them down into tiny
01:12:02
small steps. And we call this in our
01:12:04
team the 1%. And actually this
01:12:05
philosophy is highly responsible for
01:12:08
much of our success here. So what we've
01:12:10
done so that you at home can accomplish
01:12:12
any big goal that you have is we've made
01:12:14
these 1% diaries and we released these
01:12:17
last year and they all sold out. So I
01:12:20
asked my team over and over again to
01:12:21
bring the diaries back but also to
01:12:22
introduce some new colors and to make
01:12:24
some minor tweaks to the diary. So now
01:12:26
we have a better range for you. So, if
01:12:30
you have a big goal in mind and you need
01:12:32
a framework and a process and some
01:12:34
motivation, then I highly recommend you
01:12:36
get one of these diaries before they all
01:12:38
sell out once again. And you can get
01:12:40
yours at thediary.com.
01:12:42
And if you want the link, the link is in
01:12:44
the description below.
01:12:45
>> And you think up to 50% of working-class
01:12:48
jobs could be wiped out because of AI
01:12:50
and robotics.
01:12:51
>> Yeah.
01:12:51
>> I mean, that's um that's been a
01:12:53
prediction from the leaders of some of
01:12:54
the biggest companies in AI. I heard the
01:12:56
the leader of Anthropic uh recently say
01:12:58
the same thing. He thinks 50% of jobs could
01:13:00
be wiped out. I think the shocking thing
01:13:02
that we've talked a lot about in the
01:13:03
show is just,
01:13:04
>> you know, there's been other sort of
01:13:05
economic or industrial revolutions in
01:13:07
the past that have caused job
01:13:09
displacement.
01:13:10
>> Yeah.
01:13:10
>> But none, I would argue at this speed.
01:13:13
>> No. And none that can replace virtually
01:13:14
everything. There's a
01:13:16
classic story I read back when
01:13:18
I was uh talking about the global
01:13:20
financial crisis uh came out of the New
01:13:22
York Times article where they went to
01:13:25
interview workers in an air conditioning
01:13:27
factory. And there was one woman they
01:13:29
found there whose job it was to place a
01:13:31
thermocouple inside the air conditioning
01:13:34
units as they went past. So there's
01:13:36
3,000 of these going past her a day.
01:13:38
She's just placing one of these
01:13:40
thermocouples where it needs to go inside
01:13:41
the circuitry of the air conditioning
01:13:44
unit. And she said, "You don't have to
01:13:46
love your job as long as it pays you
01:13:48
money." It was a totally boring job.
01:13:50
That's all she's doing. The thing is the
01:13:52
reason she got that job was they couldn't
01:13:53
make a machine to replace her because
01:13:55
the air conditioning units don't
01:13:57
necessarily end up precisely at the same
01:13:59
point. To make a machine that would do
01:14:01
that is really difficult. Now if you
01:14:03
train a robot on it, the robot
01:14:05
perception can ultimately get to the
01:14:07
point where the robot can place that
01:14:09
piece inside there. That particular
01:14:10
unskilled job disappears. So people who
01:14:13
work in jobs like that no longer have a
01:14:15
possibility of getting a job. I think
01:14:17
even, you know, Anthropic released a
01:14:18
report. Anthropic are the makers of
01:14:20
Claude. They released a report saying
01:14:22
that entry- level positions, they're
01:14:24
seeing a 13% decline already in people
01:14:27
getting those entry- level jobs. And
01:14:28
actually, as an employer, someone that
01:14:30
spent literally all of last night, I was
01:14:31
looking through our inbox, our
01:14:32
recruitment inboxes at candidates and
01:14:34
talent.
01:14:35
>> I have noticed myself changing. I've
01:14:37
noticed that um
01:14:40
people that I would have given roles to
01:14:43
maybe six months ago,
01:14:44
>> yeah,
01:14:45
>> I now have to think long and hard about
01:14:47
whether there's going to be technology
01:14:49
that can do those exact roles instead.
01:14:52
And it's it was really shocking thing. I
01:14:53
was saying to the team last night at
01:14:54
like 1:00 a.m. in the office, I was
01:14:55
like, this is a prime example of a
01:14:57
candidate. I was looking at this
01:14:58
particular candidate that 6 months ago I
01:15:01
would have bitten their hand off but now
01:15:04
>> I have to pause because my innovation
01:15:06
team in the corner of the office they're
01:15:08
they're able to do that now with these
01:15:10
AI agents instead and so I am you know
01:15:13
you hear a lot about the theoretical
01:15:15
impact of AI
01:15:16
>> but you're actually making the decision
01:15:18
yourself
01:15:19
>> and then it's theory it's theory it's
01:15:21
this thing on my Twitter feed like blah
01:15:22
blah blah whatever you hear on a podcast
01:15:24
you go blah blah blah whatever and then
01:15:26
you find yourself actually behaving that
01:15:28
way.
01:15:29
>> Your behavior is changing and you're
01:15:30
going, "Oh, it's very hard to know the
01:15:33
types of people to hire into our
01:15:35
company." And I've kind of almost
01:15:37
segmented them into these two groups
01:15:38
where you've got people that have very
01:15:40
deep expertise. Yeah,
01:15:41
>> I'd say it's three groups. People that
01:15:42
have very, very deep expertise on a
01:15:44
particular thing, you know, like my CFO.
01:15:46
Group number two, I'd say, are people
01:15:48
that are AI proficient,
01:15:50
>> who can actually handle this stuff and
01:15:52
be the people who manage the agents.
01:15:53
>> Yes. And they can redesign our workflows
01:15:56
across every department in the company
01:15:57
to be agentic
01:15:59
>> um the word about AI agents that's kind
01:16:01
of like the word you use so agentic
01:16:03
workflows and then the third group of
01:16:05
people are people who have skills that
01:16:07
are highly beneficial human to human and
01:16:09
in real life, so like human-to-human
01:16:11
sales people that deal with
01:16:12
relationships
01:16:13
>> and are very good at it
01:16:14
>> and are very good at it because there
01:16:16
are still a certain type of sale where
01:16:19
people want to meet the person shake
01:16:20
their hand and say okay you're
01:16:22
responsible for this deal
01:16:23
>> we're still in a situation where people
01:16:24
don't want agents to do that. Those are
01:16:26
like the three groups. What I didn't say
01:16:28
is young people who have
01:16:31
>> just come out of university, maybe don't
01:16:32
know anything about agents. They don't
01:16:34
have the deep expertise yet.
01:16:35
>> Yeah.
01:16:36
>> And when you look at the data, we'll
01:16:37
throw some of the data up on the screen.
01:16:39
It appears that these sort of entrylevel
01:16:42
white collar jobs are the ones that are
01:16:45
right now suffering. Yeah. Some of these
01:16:47
investment companies would hire like
01:16:48
300 or 400 analysts to look at um
01:16:51
companies and make decisions. That is
01:16:53
one example of a of a role that's very
01:16:55
at risk now. We've got an investment
01:16:56
fund. We need one analyst, Molly. 6
01:16:59
months ago, we were interviewing more
01:17:00
analysts. We now realize that we just
01:17:02
need Molly, and we need to give Molly
01:17:04
AI.
01:17:04
>> Yeah.
01:17:04
>> And she can set up I think Molly said to
01:17:06
me yesterday when I left the office at
01:17:07
when she left the office at midnight,
01:17:09
she's now set up three agents, these
01:17:11
AI agents
01:17:12
>> as her team. Those would have been three
01:17:14
people.
01:17:15
>> Well, I I saw a demo of that. Like I've
01:17:16
developed a software package called
01:17:17
Ravel, uh, which I've got one programmer
01:17:20
for and I I teach an online course as
01:17:22
well and I give Ravel as part of that
01:17:24
online course and one of the members of
01:17:26
the course said he's using an AI to
01:17:29
build Ravel models and he's also using an
01:17:31
AI to write code behind Ravel and he
01:17:34
gave a demo this a couple of days ago
01:17:36
and you know I watched it happen on
01:17:39
screen as he built a model a simulation
01:17:41
system and it was messy at one stage but
01:17:45
it produced the correct mathematics.
01:17:47
So he's showing you can actually do it. He's
01:17:48
trying to tell me that we should get my
01:17:51
main programmer to learn to drive agents
01:17:53
to do the whole thing. Now my main
01:17:55
programmer has said look there's things
01:17:57
that I can do that an AI cannot do. He's
01:17:59
one of your highly gifted people. And he
01:18:02
said it wouldn't be worth my while to
01:18:03
have me telling an AI what to do because
01:18:06
what I lose in terms of my own
01:18:08
initiative I can't I just sort of
01:18:10
balance out. Okay. But if he
01:18:12
hires a junior programmer, then the
01:18:13
junior programmer would be one who's
01:18:15
trained to drive the AI.
01:18:16
>> I I do think programmers are fine.
01:18:18
Actually, there was some stats that I
01:18:19
saw the other day that showed there's
01:18:20
been this huge demand and people trying
01:18:22
to hire programmers. It's interesting
01:18:24
because you hear stats from Spotify.
01:18:26
Spotify saying, "We haven't had a human
01:18:27
write a line of code since December."
01:18:29
>> And I'm very good friends with the guys
01:18:30
at Spotify. I was actually with the CEO
01:18:31
the other day, a couple of weeks ago in
01:18:32
in Austin. And you you hear that and I
01:18:34
did check that with them. That's true.
01:18:37
>> So, you assume that that means we don't
01:18:38
need programmers anymore. But if you
01:18:40
think about like Jevons paradox, when
01:18:42
something becomes, you know, Jevons
01:18:43
paradox is the old analogy,
01:18:45
>> cheaper, you use more of it. Yeah.
01:18:46
>> Yeah. So like when coal became cheaper,
01:18:48
people were worried that maybe the coal
01:18:50
industry was out of business or trains,
01:18:52
whatever. But actually what ended up
01:18:53
happening is people just drove more
01:18:55
trains and they used them for other
01:18:56
things like transport. And the same
01:18:58
applies, I think, for AI. When
01:19:00
>> creating technology becomes easier,
01:19:02
every company starts using more
01:19:04
technology. So media companies, lawyers,
01:19:07
you name the company, executive
01:19:09
assistants, they all become coders. And
01:19:11
actually the demand for highly for
01:19:13
really anyone who knows how to code or
01:19:16
program it,
01:19:17
>> we're seeing it. It's exploding.
01:19:18
>> Yeah.
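The Jevons paradox dynamic described above can be sketched numerically. This is a toy illustration, not data from the episode: the constant-elasticity demand function and the specific numbers are illustrative assumptions.

```python
# Jevons paradox sketch (assumed numbers): when the unit cost of a
# resource falls, total consumption of it can rise, because demand
# expands more than proportionally to the price drop.

def total_consumption(unit_cost: float, elasticity: float,
                      base_demand: float = 100.0) -> float:
    """Constant-elasticity demand: consumption grows as cost falls.
    elasticity > 1 means total resource use rises when cost drops."""
    return base_demand * (1.0 / unit_cost) ** elasticity

before = total_consumption(unit_cost=1.0, elasticity=1.5)   # 100.0
after = total_consumption(unit_cost=0.5, elasticity=1.5)    # cost halves

# Demand more than doubles, so total spending on the resource rises
# even though each unit got cheaper: the paradox.
print(after > 2 * before)
```

The same shape arguably applies to coding: if AI makes each line of software cheaper to produce, total demand for software (and for people who can direct its production) can grow rather than shrink.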
01:19:20
>> But I just think the job disruption in
01:19:22
the near term for most people is going to be
01:19:23
pretty
01:19:24
>> Yeah. I mean there's ways in which AI
01:19:27
and robotics should be welcomed
01:19:29
>> because it means the possibility exists
01:19:33
and it's only a possibility that we can
01:19:35
no longer have to be exploited to get an
01:19:37
income because if you look at the
01:19:39
Marxist attitude towards capitalism,
01:19:42
capitalists exploit the
01:19:44
workers. Okay. Um the real world is
01:19:46
we've been exploiting energy
01:19:48
>> mutually both labor and capital exploits
01:19:50
energy. We could have a future where we
01:19:52
don't have to work for a living and
01:19:54
therefore you could do what you want to
01:19:56
do for a living. That it's a Star Trek
01:19:58
future. That's that's the possibility
01:20:00
that it promises. But at the same time,
01:20:03
uh it could actually eliminate the jobs
01:20:05
that people currently rely upon. And
01:20:08
what I fear is we have two
01:20:10
possibilities. We have
01:20:12
a Star Trek-style future where you have
01:20:14
replicators that make the goods and we
01:20:16
consume and and we all live a energy
01:20:18
abundant life. Uh, or the Hunger Games,
01:20:22
where there's one little elite that
01:20:24
has all the robots and lives extremely
01:20:26
well, and dominates and oppresses the
01:20:29
vast majority, and they end up as, you know,
01:20:31
Hunger Games entertainment. Those are the
01:20:34
two possibilities we face
01:20:36
>> I do think the cost of goods and
01:20:37
services will come down which is great
01:20:39
>> I think robotics you know if Elon is
01:20:41
right and I often say with Elon like his
01:20:43
timelines are not always accurate but he
01:20:45
does tend to deliver magic
01:20:47
>> he ultimately delivers but it you know
01:20:49
he always overpromises and delivers
01:20:51
later than he plans.
01:20:53
>> And if he's right about when he says
01:20:55
there's going to be more humanoid
01:20:56
robots, his Optimus robots, than humans,
01:20:59
and he says also in his predictions that
01:21:02
there's going to be no need to study to
01:21:03
be a surgeon because the robots are
01:21:04
going to be so much more uh advanced and
01:21:07
better than any living surgeon, that
01:21:09
would imply that surgery and other sort
01:21:11
of medical diagnoses and procedures are
01:21:14
going to be incredibly cheap, incredibly
01:21:15
quick. Great.
01:21:16
>> How do you pay for them? That's the next
01:21:18
question. Yeah. So, yeah. How do you pay
01:21:21
for them? And do people want to, you
01:21:22
know,
01:21:23
>> it's also it's also the physical
01:21:24
requirements. I mean, the amount of
01:21:26
copper inside a robot, you're talking,
01:21:29
you know, several kilos per robot. Um,
01:21:32
do we have enough to produce 8 billion
01:21:34
of them?
01:21:35
>> And maybe, you know, surgeons do much
01:21:36
more than just operate.
01:21:38
>> Yeah.
01:21:38
>> There's a human element to the medical
01:21:40
profession, which I think is sometimes
01:21:40
unappreciated. Like, I don't
01:21:42
know if I'm quite ready to go talk to a
01:21:44
robot about my health yet. Maybe I'll
01:21:45
adjust. I wanted to come back to
01:21:46
something you actually said earlier. You
01:21:48
talked about Bitcoin briefly. I've heard
01:21:49
you say that you think Bitcoin is going
01:21:51
to zero.
01:21:52
>> Yeah.
01:21:52
>> Okay. This is worrying. I think I have
01:21:54
some Bitcoin.
01:21:56
>> You're an economist. You're saying that
01:21:57
Bitcoin is going to zero. Why?
01:22:00
>> Because ultimately because of its
01:22:01
reliance upon energy. I mean, you know,
01:22:04
Max Kaiser and Stacy
01:22:06
Herbert. Have you met them at all? No.
01:22:07
They were sort of the original
01:22:08
proselytizers for Bitcoin and they're now
01:22:11
living in, I think, El Salvador, which
01:22:15
has adopted Bitcoin as a form of
01:22:16
currency. When they told me about
01:22:19
Bitcoin, I could have bought it for a
01:22:20
pound a bitcoin, which would have meant I
01:22:22
would be bloody
01:22:23
wealthier than you if I'd done that. The
01:22:25
reason I didn't was they explained that
01:22:27
the way that the public ledger is kept
01:22:29
safe is that it takes too much energy to
01:22:32
break it. So each transaction requires
01:22:35
10 minutes of computer processing time
01:22:38
globally by the looks of it to actually
01:22:40
create an extra bitcoin and that means
01:22:43
it's too expensive for somebody to try
01:22:45
to break the ledger. That means it's got
01:22:47
a huge requirement for energy use and I
01:22:52
believe knowing what I know from climate
01:22:54
scientists that at some point we're
01:22:56
going to realize we're using far too
01:22:58
much energy on the planet. We've got to
01:23:00
cut the energy consumption and the two
01:23:02
easiest things to cut out to reduce
01:23:04
energy consumption are cryptocurrencies
01:23:06
and international travel.
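The proof-of-work mechanism Steve is describing can be sketched in a toy form. (Strictly, Bitcoin's difficulty is tuned so the whole network needs roughly ten minutes per block of transactions, not per transaction; the difficulty value and block text below are illustrative assumptions.)

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Toy proof-of-work: find a nonce so that the SHA-256 hash of
    (block_data + nonce) starts with `difficulty` zero hex digits.
    Raising the difficulty multiplies the expected number of hash
    attempts by 16, which is where the energy cost comes from."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = mine("example block", 4)
digest = hashlib.sha256(f"example block{nonce}".encode()).hexdigest()
# Checking the answer takes one hash; finding it took thousands.
# That asymmetry is what makes rewriting the ledger prohibitively
# expensive in compute and therefore in energy.
```

This is the trade-off behind the energy critique: the ledger's security is deliberately proportional to the amount of electricity spent defending it.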
01:23:08
>> But aren't you saying that, you know,
01:23:09
nuclear energy is becoming vogue again?
01:23:12
And they're talking a lot, you know,
01:23:13
about
01:23:14
>> it's the amount of time it takes to
01:23:15
build that stuff. I mean, China is
01:23:17
building nuclear power stations at a
01:23:19
hell of a rate and much,
01:23:22
much more cheaply than America is doing.
01:23:23
>> Solar has become a big topic of
01:23:25
conversation.
01:23:26
>> Yeah. Again, there's a guy called Simon
01:23:28
Michaux, whom I recommend you get in
01:23:30
touch with as well. And Simon is an
01:23:32
engineer who claims that we simply don't
01:23:34
have the physical minerals necessary to
01:23:37
support a completely solar and
01:23:39
wind-based
01:23:41
energy system. He's got people who
01:23:43
criticize his analysis definitely, but
01:23:46
we still are using far more physical
01:23:49
resources than we're aware of at the
01:23:51
moment on the planet. And the
01:23:53
availability of various critical
01:23:55
elements that we need for the system we
01:23:58
have right now, it's much less abundant
01:24:01
than we would like it to be. So, a lot
01:24:03
of these things about, you know,
01:24:04
robotics taken over, do we have the
01:24:06
minerals for it? Solar power, do we have
01:24:09
the minerals? The answer is not
01:24:11
always yes. Okay. Sometimes the answer is
01:24:14
no. Other times it's dubious. But I
01:24:17
think that energy requirement alone is a
01:24:19
problem.
01:24:19
>> You're saying that we're going to have
01:24:20
we're going to have to cut back on our
01:24:22
energy consumption.
01:24:23
>> But I mean the direction of travel has
01:24:24
been we've been able to produce more and
01:24:26
more and more and more energy
01:24:27
>> and we're dumping it into the
01:24:28
environment. The planet the problem
01:24:30
about the use of energy is it's
01:24:32
happening on a planet. Okay. Can the
01:24:35
biosphere cope with the waste that we
01:24:37
dump into it as a result of using that
01:24:40
energy? And that is something which
01:24:41
economists are completely stupid on
01:24:44
beyond stupid. They've trivialized the
01:24:46
dangers of the amount of resources we
01:24:49
use and the amount of energy we use. So
01:24:51
I don't think that energy future is
01:24:53
possible on this biosphere at the
01:24:54
moment. It's possible in the future if
01:24:57
we get off the biosphere. So in that
01:24:59
sense I'm even more of a space cadet
01:25:00
than Elon Musk is. I think we have to
01:25:03
plan to take production off planet, but
01:25:05
while we're constrained on the
01:25:06
biosphere, the biosphere's constraints
01:25:08
will stop us using as much energy as we
01:25:11
would wish to use.
01:25:12
>> What are you what are your closing
01:25:13
statements on this whole situation with
01:25:15
the war and Iran and everything that's
01:25:16
going on from a geopolitical
01:25:18
perspective?
01:25:19
>> Basic thing is our system is far more
01:25:20
fragile than we've convinced ourselves that
01:25:23
it is. And we can make observations
01:25:26
about potential futures which presume a
01:25:28
robustness we don't have. And if that
01:25:31
robustness is destroyed either by
01:25:35
military conflict or by overextending
01:25:37
what we put into the biosphere, then we
01:25:40
can fall off what's called the Seneca
01:25:42
cliff. We can go from an abundant future
01:25:44
to a collapse.
01:25:45
>> And what would you say the people at
01:25:46
home should be doing to course correct
01:25:49
the path that you think we're on?
01:25:52
>> Stop electing fools.
01:25:54
Um, electing Trump was an enormous
01:25:56
mistake. We've got politicians who
01:25:59
follow what's called neoliberal
01:26:02
political philosophies, which have put
01:26:04
us in this problem. It hasn't worked. We
01:26:06
need to reverse back to having a
01:26:08
human-oriented and physically realistic
01:26:11
view of how the economy should
01:26:14
be managed and how the biosphere should
01:26:16
be managed. We have to take care of our
01:26:18
home, and in an essential sense we're
01:26:20
destroying our home and thinking we can
01:26:22
keep on doing that indefinitely. We
01:26:24
can't. Our home is planet earth. Planet
01:26:27
earth has got physical constraints. We
01:26:30
haven't respected them. Planet earth
01:26:32
will tell us what it thinks of that this
01:26:34
century.
01:26:34
>> And which leaders do you think we should
01:26:36
be electing? Do you think we should be
01:26:37
electing?
01:26:38
>> I don't think so. I think even electing
01:26:39
leaders itself is a mistake because what
01:26:42
we then do is end up getting we we
01:26:46
pander to narcissists. We pander to
01:26:48
people who believe they can solve all
01:26:50
our problems. We end up with
01:26:52
megalomaniacs making decisions. If you
01:26:55
look back at where Athenian democracy
01:26:58
came from, Athenian democracy didn't
01:27:00
use elections. It used a process of like
01:27:03
random number generators to select
01:27:05
intelligent people to fulfill essential
01:27:08
roles in those societies. And they they
01:27:12
weren't even people you got to know by
01:27:13
name in that sense. We know Trump here,
01:27:15
we know Starmer here, we have Albanese
01:27:18
over here. We end up getting narcissists
01:27:20
and megalomaniacs
01:27:22
directing us and they're the last people
01:27:24
you need to make decisions.
01:27:26
>> When you're thinking about your own
01:27:27
money as an economist, what are you
01:27:29
doing to protect your
01:27:30
>> I'm not doing much. I mean, I've been
01:27:33
a crusader for reforming
01:27:35
economic theory. For my whole life, I've
01:27:38
sort of neglected this side of things to
01:27:41
my detriment, I've got to say. Um, but I
01:27:46
really am focused on what's sustainable
01:27:47
for everybody rather than what I can
01:27:49
make as my own cut. And I don't think
01:27:51
we've got a sustainable economy at the
01:27:53
moment. We have a philosophy of
01:27:55
economics which leads to breakdowns.
01:27:57
>> I'm asking that cuz I've got so many
01:27:58
friends and listeners that ask me often
01:28:00
like, should I be buying a house right
01:28:01
now? Do you think I should be investing
01:28:02
in gold? Do you think I should be saving
01:28:04
my money? Should I be, I don't know,
01:28:06
investing in technology companies?
01:28:07
>> Yeah.
01:28:08
>> And I'm wondering if you had a
01:28:09
perspective for them.
01:28:10
>> Not on that. No. like I've I've really
01:28:12
left that area alone. I'm I'm actually
01:28:14
looking at the overall system and saying
01:28:16
how do we make the system sustainable so
01:28:18
that people can live within it and what
01:28:20
we've got is an unsustainable system and
01:28:22
you're asking me how do people survive
01:28:24
within an unsustainable system? Answer
01:28:26
is they don't. We always think we can do
01:28:28
something at the individual level to
01:28:29
cope with what's happening in the system
01:28:31
around us that only works if the system
01:28:33
around us is stable.
01:28:34
>> What is a better system then? Uh I think
01:28:37
I think what China has done is a better
01:28:39
in a better direction. They have a they
01:28:42
have a collective focus as well as an
01:28:44
individual focus.
01:28:45
>> What's their system called?
01:28:46
>> It's called communist.
01:28:47
>> So you think communism is better than
01:28:49
capitalism?
01:28:50
>> I think a system which reflects the need
01:28:53
for a cohesive society as well as
01:28:56
individual gain is needed and the system
01:28:59
in China is closer to that than the
01:29:01
system in America. in in China. Listen,
01:29:03
I don't know a ton about this, but they
01:29:05
have a leader who stays in power and
01:29:07
>> that's one that's the potential weakness
01:29:09
>> and suppresses the people's decision-making and
01:29:12
entrepreneurialism.
01:29:13
>> Equally, you've got a system to get into
01:29:15
the Communist Party. You've got to have
01:29:18
uh, you've got to be highly educated to
01:29:20
get in and you have to perform to some
01:29:23
extent in the region in which you begin
01:29:25
your role.
01:29:26
>> But you're not saying you think the West
01:29:27
should adopt communism, are you? No, I'm
01:29:29
saying that the West should adopt a
01:29:30
system which reflects the need for a
01:29:33
cohesive society.
01:29:34
>> Is that socialism?
01:29:35
>> Socialism is closer to it. I mean I the
01:29:37
words are all tainted. Okay. If you go
01:29:40
back, you know, do you eat Cadbury's
01:29:42
chocolate?
01:29:43
>> I try not to.
01:29:44
>> You have, haven't you? Okay. Cadbury's
01:29:46
was a socialist enterprise. Okay. It was
01:29:48
formed on the belief that workers
01:29:51
could work in the best possible
01:29:52
situation while also selling a
01:29:54
profitable product. Mondragon in Spain is
01:29:57
another cooperative started by a
01:29:58
Catholic priest. Uh of all things we
01:30:01
tend to be very binary in the west. We
01:30:03
say you either have competition or you
01:30:05
have cooperation. Okay. Well, you need
01:30:07
to be more like the east in the sense of
01:30:09
the idea of yin and yang. You have to
01:30:11
have both. Okay, cooperation and
01:30:13
competition.
01:30:14
>> And so that view is the closest thing is
01:30:16
socialism.
01:30:16
>> The closest to socialism. And what China
01:30:19
has done that better than Russia. You go
01:30:21
back to the USSR. Uh, they
01:30:24
were disastrous in terms of product
01:30:25
development. China's been extremely
01:30:27
successful on that front. They've
01:30:29
learned from the mistakes of being too
01:30:31
centralized and too top down in Russia
01:30:33
to have both the top down and the bottom
01:30:35
up dynamic going on.
01:30:37
>> What's wrong with capitalism? And
01:30:38
capitalism is what the UK and the US
01:30:40
have adopted as their sort of economic
01:30:42
model.
01:30:42
>> It's seeing competition absolutely
01:30:45
ruling and ignoring cooperation. Now the
01:30:47
real the successful society combines
01:30:50
both. You have cooperation, you also
01:30:52
have competition. And we've pushed it
01:30:54
far too far in the competitive end and
01:30:56
not enough in the cooperative. And what
01:30:58
comes out of that as well is the
01:31:01
competitive tends to be short-term
01:31:02
focused. What can I make a profit out of
01:31:04
in time, such that with the money I've
01:31:06
borrowed I'm going to be able to make
01:31:09
more of a profit than the interest I'm
01:31:11
paying on the money I've created?
01:31:13
And the longer it
01:31:14
takes to get the repayment, the less
01:31:17
likely you are to make the investment.
01:31:18
So what you get is a focus upon
01:31:20
short-term with just a market system
01:31:23
whereas with the long term you say
01:31:24
what's going to last for 100 years and
01:31:27
like and what that means is you build
01:31:28
the infrastructure for the long term
01:31:30
while you allow competition to occur in
01:31:32
the short term. It's getting the balance
01:31:34
right. We've got the balance extremely
01:31:36
wrong.
01:31:37
>> Professor Steve, thank you. I highly
01:31:40
recommend people go check out your
01:31:42
YouTube channel where you make videos
01:31:43
all the time about what's going on in
01:31:44
the world. to give your opinion on
01:31:46
economic issues, political issues, the
01:31:47
Iran war. So, if people are listening
01:31:49
and they want to learn more from
01:31:50
Professor Steve, then look down below
01:31:52
and you should see his YouTube channel
01:31:54
linked um next to our name because we're
01:31:56
going to try and collaborate on this
01:31:58
post and I'll put you the the link to
01:31:59
your channel in the description below
01:32:01
for anyone that wants to check you out
01:32:02
and subscribe. It's so fascinating,
01:32:04
especially the stuff around the
01:32:05
raw materials coming out of the
01:32:06
Strait of Hormuz, because I really had no
01:32:08
idea. I it's just it's quite staggering
01:32:10
to me that we're so dependent on one
01:32:12
region of the world and I think from
01:32:13
watching your videos over the last
01:32:14
couple of weeks,
01:32:15
>> it's really made me understand the
01:32:17
unintended consequences of war
01:32:19
generally, but specifically this war in
01:32:21
Iran.
01:32:22
>> Um, so thank you for turning the lights
01:32:23
on for me. I really, really appreciate
01:32:25
this and I hope we can meet again soon
01:32:26
and have a conversation and hopefully,
01:32:29
you know, this all resolves itself in a
01:32:31
way that's good for everybody.
01:32:32
>> I hope so. I I'm having my 73rd birthday
01:32:34
tomorrow. Oh, I might have 74th as well,
01:32:36
but I think there's a question mark over
01:32:38
that now.
01:32:39
>> Well, I did hear it was your birthday
01:32:42
tomorrow.
01:32:43
>> I think the team have gotten you a
01:32:45
little something.
01:32:46
>> Okay.
01:32:47
>> Happy birthday
01:32:48
to you.
01:32:50
>> I'm embarrassed.
01:32:51
>> Happy birthday.
01:32:53
>> Oh my god.
01:32:56
>> Happy birthday.
01:32:59
>> My god. Thank you.
01:33:01
Happy birthday to you.
01:33:05
>> Holy hell. I'm missing.
01:33:07
>> Thank you. Should I blow the candles
01:33:08
out?
01:33:08
>> Yes, you should.
01:33:10
>> Okay, you get a wish.
01:33:11
>> You blew them all out, so you get a
01:33:12
wish.
01:33:14
>> Well, I wish for peace in the Middle
01:33:16
East.
01:33:16
>> Okay,
01:33:17
>> that's probably the main thing to say
01:33:18
about right now.
01:33:18
>> That is a gorgeous cake. I have to say
01:33:20
>> it's a marvelous cake. Yeah,
01:33:21
>> that's marking our own homework, but
01:33:23
>> this better be eaten by the crew cuz I'm
01:33:25
not going to eat all this myself. Okay,
01:33:27
you want to get out a knife and start
01:33:28
slicing up? My god,
01:33:30
>> thank you. Thank you so much. We're
01:33:31
done.
01:33:31
>> Thank you.
01:33:31
>> YouTube have this new crazy algorithm
01:33:33
where they know exactly what video you
01:33:35
would like to watch next based on AI and
01:33:37
all of your viewing behavior. And the
01:33:39
algorithm says that this video is the
01:33:42
perfect video for you. It's different
01:33:44
for everybody looking right now. Check
01:33:46
this video out and I bet you might
01:33:48
love

Badges

This episode stands out for the following:

  • Most shocking — 70
  • Best concept / idea — 70
  • Most intense — 65
  • Most dramatic — 60

Episode Highlights

  • The Threat of War
    The war is threatening everybody on the planet, with dire consequences for global stability.
    “This war is threatening everybody on the planet.”
    @ 00m 30s
    April 06, 2026
  • Iran's Preparedness
    Iran has prepared for conflict, breaking its military into 31 divisions to resist invasion.
    “They’ve got their own fail safe system running in the background.”
    @ 09m 25s
    April 06, 2026
  • The Role of Fertilizer in Food Production
    Fertilizer is essential for growing food; without it, billions could starve.
    “If we lost 20% of the world's fertilizer, we'd lose roughly 20% of the world's food.”
    @ 18m 12s
    April 06, 2026
  • Consequences of Rising Costs
    Rising prices could push low-income workers into food insecurity.
    “If the prices go up 20% for people, he's out.”
    @ 23m 54s
    April 06, 2026
  • Economic Impact of Attacks
    Iran's attacks on energy infrastructure lead to massive economic losses for regions like Dubai.
    “They lose a million per minute, which is 60 million per hour or 1.4 billion a day.”
    @ 35m 45s
    April 06, 2026
  • The Samson Doctrine
    The Samson doctrine suggests Israel might unleash destruction if faced with existential threats.
    “If they realize they are going to lose this war, they unleash destruction on the rest of the world.”
    @ 38m 35s
    April 06, 2026
  • Trump's Pattern of Behavior
    Trump's negotiation tactics resemble a threat system, manipulating others to get his way.
    “It's the same pattern of behavior.”
    @ 50m 43s
    April 06, 2026
  • Self-Sufficiency in Crisis
    In times of global chaos, self-sufficiency, especially in food production, is crucial.
    “The lesson that comes out of this is self-sufficiency.”
    @ 01h 03m 16s
    April 06, 2026
  • AI Startup Failure Rates
    The failure rate of AI-specific startups has hit 90% in 2026, significantly higher than the 70% average for general technology.
    “Wow. That’s luck.”
    @ 01h 06m 45s
    April 06, 2026
  • Future Job Displacement Predictions
    Predictions suggest that up to 50% of working-class jobs could be wiped out due to AI and robotics.
    “50% of jobs could be wiped out.”
    @ 01h 12m 50s
    April 06, 2026
  • Bitcoin's Fragile Future
    An economist warns that Bitcoin's reliance on energy could lead it to collapse.
    “Bitcoin is going to zero.”
    @ 01h 21m 51s
    April 06, 2026
  • A Wish for Peace
    On his birthday, a heartfelt wish for peace in the Middle East is shared.
    “I wish for peace in the Middle East.”
    @ 01h 33m 16s
    April 06, 2026

Key Moments

  • Helium Shortage — 15:47
  • Vulnerability Awareness — 22:09
  • Cost of Living Struggles — 23:54
  • Nuclear Warfare Fears — 33:12
  • Self-Sufficiency — 1:03:27
  • AI Startup Crisis — 1:06:45
  • Future Scenarios — 1:20:36
  • Political Change — 1:25:52
