
Scott Galloway Predicts a $10 Trillion Market Wipeout | Pivot

March 13, 2026 / 01:05:19

This episode of Pivot covers the ongoing war in Iran, the impact on global oil prices, and the Pentagon's actions regarding Anthropic. Hosts Kara Swisher and Scott Galloway discuss the significant supply disruptions in the oil market, the U.S. military's responsibility for a tragic missile strike, and the broader implications of these events.

Kara shares insights from her interview with Senator Warner, highlighting the chaotic response to rising oil prices and the military's handling of civilian casualties. The conversation includes a discussion of the economic fallout for countries reliant on oil imports, particularly in the Middle East and emerging markets.

The hosts also address the Pentagon's controversial designation of Anthropic as a supply chain risk, which has led to a lawsuit against the government. Microsoft supports Anthropic, emphasizing the potential negative ramifications for the tech industry.

Finally, they touch on the ethical responsibilities of AI companies in preventing violence and the broader societal impacts of their technologies. The episode concludes with predictions about the future of the markets and the potential for economic downturns.

TL;DR

Scott and Kara discuss the Iran war's impact on oil prices and Anthropic's lawsuit against the Pentagon.

Video

00:00:00
This isn't military action. This is a
00:00:01
war. There's one
00:00:03
>> excursion. The word he's using now. It's
00:00:04
an excursion.
00:00:05
>> Like a field trip.
00:00:13
>> Hi everyone. This is Pivot from New York
00:00:14
Magazine and the Vox Media Podcast
00:00:16
Network. I'm Kara Swisher
00:00:18
>> and I'm Scott Galloway.
00:00:19
>> Scott, did we have a good time in
00:00:20
Minneapolis?
00:00:21
>> Oh, that was wonderful. And thank you to
00:00:23
the wonderful people of Minneapolis.
00:00:25
I thought it was great. You
00:00:28
know what was really great? The community,
00:00:30
or, you know, maybe we got an
00:00:33
unrepresentative sample, I'd like to think
00:00:34
we got a representative one. The community
00:00:36
seems very unified right now. Yeah,
00:00:38
absolutely. People drove from North
00:00:40
Dakota. There was
00:00:42
>> wherever that is or Iowa. We had a
00:00:44
lawyer from Iowa come.
00:00:46
>> Yeah. Judge,
00:00:46
>> by the way, shout out. We know who you
00:00:48
are. There's this wonderful woman who's
00:00:49
a lawyer in family court and she
00:00:51
commutes seven hours a week and she said
00:00:53
that, uh, excuse me, she's a judge. Yeah.
00:00:56
And she said we're her best
00:00:58
friends.
00:00:59
>> Yeah. Yeah. It was great. And people
00:01:00
were great. Anyway, we've got a lot to
00:01:02
get to today. I'm going to dig in.
00:01:03
First, the war in Iran is sending oil
00:01:05
prices on a wild ride this week and
00:01:07
creating what the International Energy
00:01:09
Agency says is quote the largest supply
00:01:11
disruption in the history of the global
00:01:13
oil market. Okay, that's kind of
00:01:16
something. As of this recording, oil is
00:01:18
still very high, slowly coming down from
00:01:20
over $100 a barrel after ships were
00:01:22
attacked in the Persian Gulf. There are
00:01:24
also attacks still going on. Uh, gas
00:01:27
prices continue to climb as well. And
00:01:28
just remember, it's not just gas prices.
00:01:30
Every price goes up when gas goes up.
00:01:32
The IEA's 32 member countries are
00:01:35
releasing a record 400 million barrels
00:01:37
of oil from strategic reserves to
00:01:39
counter the chaos, which means we aren't
00:01:41
going to feel this yet. Uh I interviewed
00:01:44
uh Senator Warner yesterday and he was
00:01:46
noting that um Trump has tried to calm
00:01:48
markets. He keeps trying to do this
00:01:51
to bring these oil prices down by words
00:01:53
saying the war is quote very complete
00:01:55
only to later announce we haven't won
00:01:58
enough. Oil prices also plunged after
00:02:00
energy secretary Chris Wright
00:02:01
incorrectly posted that US Navy had
00:02:03
escorted a tanker through the Strait
00:02:04
of Hormuz. So that was a problem. The
00:02:07
post was deleted within minutes but was
00:02:08
enough to move markets and wipe out
00:02:10
billion-dollar trades.
00:02:13
Um, this is such a TACO. This is the
00:02:15
greatest TACO of all, I think. And even
00:02:17
if the war in Iran ends soon, returning
00:02:19
the Strait of Hormuz to typical
00:02:20
traffic could take one to three months.
00:02:22
We're going to see reverberations of
00:02:24
this ridiculous situation. um the way
00:02:26
he's handling it and the way he's not it
00:02:29
seems all over the place. Um and also to
00:02:32
to add to the kind of mess there, the
00:02:34
initial findings of a military
00:02:36
investigation say that the US was
00:02:37
responsible for that deadly Tomahawk
00:02:39
missile strike on the Iranian elementary
00:02:41
school. It's actually causing a lot of
00:02:43
strife within MAGA, by the way, and
00:02:45
everywhere else, among normal
00:02:46
people and MAGA.
00:02:49
Um the report notes officers likely used
00:02:51
outdated information to label the school
00:02:53
as a military target. Trump has tried to
00:02:55
put the blame on Iran earlier this week,
00:02:57
claiming they also have the Tomahawks,
00:02:59
which everyone thought was ridiculous.
00:03:00
And when asked about the military report
00:03:02
on Wednesday, Trump said he knew nothing
00:03:04
about it. Um, we'll get to the
00:03:06
photography scandal at the Pentagon, but
00:03:08
talk a little bit about what's going on
00:03:10
with oil prices and this the school,
00:03:12
which is just I feel like we should take
00:03:14
responsibility when we make an error,
00:03:16
such a terrible error. But go ahead,
00:03:18
start.
00:03:19
>> I'll go I'll go in reverse order. When
00:03:21
you're handling a crisis, and this is a
00:03:23
crisis, the death of civilians,
00:03:25
especially children, is obviously pretty
00:03:27
ugly. You acknowledge the issue, you
00:03:29
take responsibility, and you try and
00:03:31
overcorrect. And they've done nothing of
00:03:33
the sort. And in a war, and this
00:03:36
is a war. This isn't military action.
00:03:37
This is a war. There's
00:03:39
>> an excursion, the word he's using now,
00:03:40
it's an excursion,
00:03:42
>> whatever that means. Excursion.
00:03:44
>> I went on a bike,
00:03:45
>> like a field trip, like
00:03:47
>> Exactly. My daughter went on an
00:03:48
excursion. except he didn't get
00:03:49
Congress's approval the day before that
00:03:51
he could go on the excursion. Um, you
00:03:53
know, it's a tragedy. Uh, they just made
00:03:55
a bad situation worse. First off, they
00:03:58
look incompetent by saying that it might
00:04:00
have been a tomahawk from Iran. Iran
00:04:02
doesn't have Tomahawks. So, it looks
00:04:04
like, okay, I'm not willing to own up to
00:04:06
this. I mean, there's not a good answer,
00:04:08
but there's a reasonable answer here,
00:04:10
and that is
00:04:13
>> Yeah, this we decided to go, you know,
00:04:15
with military action. This is a This is
00:04:17
a group of people who killed 30,000 of
00:04:19
its own people. War is going to have
00:04:21
collateral damage. We screwed up. We
00:04:23
take responsibility. These are the
00:04:25
following steps we're putting in place
00:04:26
to make sure it doesn't happen again.
00:04:29
And take responsibility for it and it
00:04:31
would have been not over, but it would
00:04:33
have been acceptable. Instead, it's
00:04:35
like, no, it was Iran's fault. It just
00:04:37
doesn't
00:04:38
>> or I didn't know.
00:04:40
>> Yeah. Oh,
00:04:42
>> Pegasus was the same way. It was,
00:04:43
and was angry when people asked about
00:04:45
it, which is everything wrong in the
00:04:47
response and everything wrong in the
00:04:50
mistake. But you're right. Absolutely.
00:04:52
>> Yeah. And the the real I mean, we're
00:04:56
just we're just starting to see.
00:04:59
So, I was speaking to a kid and um and I
00:05:02
said, "What what you know, where do you
00:05:03
want to be in 5 years?" I always ask
00:05:05
young men that. Where do you want to be
00:05:07
in 5 years? And this kid said, "Uh, I'd
00:05:09
really love to have my own auto repair
00:05:12
shop focusing on EVs."
00:05:13
>> Mhm.
00:05:13
>> I said, "Okay, well then let's reverse
00:05:15
engineer from those things." Like, what
00:05:17
kind of skills do you need to acquire?
00:05:20
What kind of job certification?
00:05:22
What kind of capital or money would you
00:05:24
need to um uh start something like this?
00:05:28
>> Uh, have a business plan. What kind
00:05:30
of real estate would you need? What
00:05:32
would be your, you know, let's reverse
00:05:34
engineer everything you need, the basics,
00:05:36
right? Let's reverse
00:05:37
engineer everything to today around what
00:05:40
you would need to be an owner of an EV
00:05:42
repair shop in uh he lives in the
00:05:44
outskirts of Los Angeles. Just the
00:05:46
loveliest young kid. Anyways,
00:05:49
we can't even reverse engineer the
00:05:51
tactics because I don't think anyone is
00:05:53
really clear yet on what the endgame is,
00:05:55
what the end goal is.
00:05:57
>> Mhm.
00:05:57
>> And that is if they had said, "All
00:05:59
right, we're going to diminish their
00:06:00
launch capability from missiles." Makes
00:06:02
all the sense in the world. It's more
00:06:03
about the launchers and the missiles
00:06:04
because you can bury the missiles under
00:06:06
under
00:06:06
>> these are ballistic missiles for people
00:06:07
who don't know.
00:06:08
>> We can we are going to make sure that
00:06:10
the straits of Hormuz are more secure
00:06:12
than they were uh previous to this and
00:06:14
we're going to work with our Gulf allies
00:06:15
to create a series of minesweepers
00:06:18
and enforce the border. I mean, and
00:06:20
we're going to take out the navy and
00:06:22
we're going to take out the munitions
00:06:23
infrastructure that builds this stuff.
00:06:25
These are the three boxes we need to
00:06:27
check.
00:06:29
>> Can I interject since I just interviewed
00:06:30
Warner about this? One of the things
00:06:32
that they've talked about is going in
00:06:33
and getting the enriched uranium, but
00:06:34
that would actually be would take, as
00:06:37
they say, boots on the ground and it
00:06:38
would be
00:06:39
>> not viable. Not feasible.
00:06:41
>> Not feasible unless we want a lot of
00:06:43
Americans to die.
00:06:44
>> Yeah. As is quite frankly, as is regime
00:06:47
change. I mean,
00:06:48
>> this regime is sticking pretty strongly.
00:06:51
>> Oh, they're not collapsing.
00:06:53
>> Yeah. Yeah. No, I think Kalshi had the
00:06:55
likelihood of regime change at like 10%
00:06:58
by the end of March or something like
00:07:00
that right now.
00:07:01
Anyways, it's like, well, okay, in war
00:07:03
you always have to have plans A, A-prime,
00:07:06
and B, because the enemy gets a say
00:07:08
in this. But the problem is no one can
00:07:10
identify plan A.
00:07:12
>> No, they ate it. They ate it. They the
00:07:14
the dog ate my homework. Can I ask you
00:07:15
about the oil prices because I think
00:07:17
that's something that's going to people
00:07:18
don't recognize. And um you the idea of
00:07:22
trying to calm the market by releasing
00:07:24
incorrect information, letting it go,
00:07:26
you know, whipsaw all over the place.
00:07:28
And this release of these 400 million
00:07:30
barrels is going to have repercussions
00:07:32
later because that's when the prices
00:07:34
will go up, these strategic reserves.
00:07:36
And they're trying to do
00:07:38
everything possible to pretend we're not
00:07:40
going to have a real crisis between the
00:07:42
Strait of Hormuz and this release. Um
00:07:44
and so it has second-order
00:07:48
problems. Now Wall Street's sort of
00:07:49
sloughing it off a little bit. Um but
00:07:52
these are prices that are going to
00:07:53
reverberate through the system as you
00:07:55
have noted. So look, the biggest loser
00:07:56
here is obviously
00:07:59
um the people of Iran who are in the
00:08:01
wrong place at the wrong time, right?
00:08:03
There is no bigger loser than
00:08:06
the families who lose loved ones. I
00:08:08
also think the reputation of the US and
00:08:11
what was an opportunity to create much
00:08:13
stronger alliances with moderate nations
00:08:16
in the Gulf. So big losers. What people
00:08:19
aren't talking about, the countries that
00:08:22
import more than 50% of their oil,
00:08:23
Japan, South Korea, India, and most of
00:08:26
Europe have seen their markets hammered,
00:08:29
absolutely hammered. Uh poor countries
00:08:32
with no foreign exchange reserves uh and
00:08:35
dollar-denominated debt could, you know,
00:08:37
be thrust into the IMF or
00:08:41
effectively into what is bankruptcy. Airlines
00:08:43
and hospitality companies all over the
00:08:45
world, shipping, the bunker fuel costs.
00:08:47
Warner said he's been meeting with
00:08:49
airline executives and they said they're
00:08:51
fine for now but it's going to be $25
00:08:52
million a day extra which is crazy.
00:08:56
>> I mean nations who import their oil
00:08:58
especially who get most of it through
00:09:00
the Strait of Hormuz, their
00:09:01
economies basically are like [ __ ]
00:09:03
for the year at a minimum. So this
00:09:06
is having you know we have obviously the
00:09:10
biggest losers by body count are Iran
00:09:13
but by by economic collapse Middle
00:09:15
Eastern oil importers Jordan, Lebanon,
00:09:17
Egypt and fragile emerging markets
00:09:20
Pakistan.
00:09:21
>> Guess who's doing great? Russia.
00:09:23
>> Yeah, gives him what he needs. He was
00:09:26
really on the ropes around the
00:09:28
million people who have died
00:09:30
and also the price of oil, and now he has
00:09:32
more money to spend while we ignored uh
00:09:36
help from the Ukrainians on drones and
00:09:38
one of the things Warner was pointing
00:09:39
out was that fine we could take out
00:09:42
their battleships but their real problem
00:09:44
is all those small fast boats and their
00:09:46
drones they can just do all manner of
00:09:48
damage to us with these small
00:09:51
$50,000 drones and we use a million
00:09:53
dollar rocket to take it out. I mean,
00:09:55
this is the problem is they have an
00:09:57
ability to do this and they've been
00:09:59
they're, you know, the way Warner
00:10:01
described it, this country is
00:10:03
hardened, like
00:10:06
hardwired. This is not Venezuela.
00:10:09
Trump lives like he's in some
00:10:11
movie where you just do three bombs and
00:10:13
that's the end of it. But this is a
00:10:15
hardwired
00:10:16
150,000 people in this ruling uh
00:10:20
group in Iran. And they're not giving up
00:10:22
all this money and all this power for I
00:10:25
don't know. It's it's a really difficult
00:10:27
situation which they didn't.
00:10:28
>> They're just thinking about the market,
00:10:29
the winners and losers. The hardest-hit
00:10:33
stock markets
00:10:35
>> are Middle Eastern markets. Jordan,
00:10:37
Egypt, Lebanon, their stock markets
00:10:39
cratered. There's a capital flight to
00:10:39
safety. I mean, the ironic thing here is
00:10:41
that over the long term, our reputation
00:10:42
is in tatters. We're probably the least
00:10:45
damaged because we're energy
00:10:47
independent. We produce more energy than
00:10:49
we consume. We have two oceans
00:10:50
protecting us. Friendly Canada to the
00:10:53
north, harmless Mexico to the
00:10:56
south. We still have capital inflows. In
00:10:58
a weird I mean, it's just terrible to
00:10:59
say, but in a weird way. Our markets are
00:11:02
probably least damaged by this. For
00:11:05
Europe, costs. There'll be costs for airlines.
00:11:06
There'll be costs for uh truckers.
00:11:08
There's going to be costs for home
00:11:10
heating. Thank goodness it's not winter,
00:11:12
right?
00:11:12
>> The dollar's already strengthened. I
00:11:14
mean,
00:11:15
>> it's it's ironic, but when you diminish
00:11:17
the entire world, there's a flight to
00:11:19
safety, and flights to safety usually
00:11:21
benefit the US. Emerging markets are
00:11:22
going to get the [ __ ] kicked out of
00:11:24
them. India, Brazil, South Africa,
00:11:25
Mexico, capital flowing out to the US
00:11:29
dollar for safe havens. The US will
00:11:32
likely be down 8 to 10% on a tariff
00:11:34
ruling or was down but it could be down
00:11:36
another 10 to 15% and that'll be I'll
00:11:39
talk more about that in our prediction
00:11:41
but you're going to have a pretty big
00:11:43
big peak to trough but that some of that
00:11:45
might just be the air coming out of the
00:11:47
bubble but to your point the least
00:11:49
damaged in the Middle East are Saudi
00:11:52
Arabia and the UAE but the big winner
00:11:53
here, as you said, is Russia. The oil
00:11:56
price spike
00:11:58
benefits them, the US is distracted by
00:12:00
Iran. So more Ukraine leverage and oddly
00:12:04
the ruble strengthens. So
00:12:07
>> yep,
00:12:07
>> this is
00:12:09
>> war is literally the agent of unintended
00:12:11
consequences.
00:12:12
>> Yep.
00:12:13
>> and this is so frustrating because if
00:12:14
this had been more like '91 and less
00:12:17
like Iraq, and they'd set out a series
00:12:19
of achievable objectives
00:12:21
>> that this could have been a win. It
00:12:23
could have been the Gulf States coming
00:12:24
together
00:12:25
>> and if they had said, "Look, to a couple
00:12:27
European nations and to the Gulf States,
00:12:30
a stable Middle East benefits all of us.
00:12:33
Let's all have a series of objectives
00:12:35
and we're going to pay for and execute
00:12:37
against most of this. We could have
00:12:39
strengthened our alliances."
00:12:40
>> We've been dragged around by Israel
00:12:42
here. In a lot of ways, it looks like
00:12:43
it. Let me move on.
00:12:44
>> See, I disagree. I think we're very
00:12:46
tightly coordinated with Israel right
00:12:47
now.
00:12:47
>> I talked to Warner who's in the gang of
00:12:49
eight. I'm going to go with him over
00:12:50
you. I'm sorry to say that, but you
00:12:51
know, I think it was that they were
00:12:52
>> You went with the senator over Scott.
00:12:54
>> Yes. Yes, I damn. Um I think they were
00:12:56
going to attack and we decided to be the
00:12:58
senior partner like that's rather than
00:13:00
create something else because
00:13:01
>> Well, you mean Iran was going to attack
00:13:02
Israel? Israel's attack.
00:13:04
>> No, no, no. Israel was going to attack
00:13:05
Iran. That I mean that's the implication
00:13:08
he had. And
00:13:08
>> and Senator Warner feels like we did not
00:13:10
have the power to say stop.
00:13:12
>> Well, he doesn't know why we didn't.
00:13:14
That was one of his questions. He's
00:13:16
surprised. He seemed more
00:13:18
worried. He's usually not a worrywart,
00:13:20
but he seems worried about two things.
00:13:22
How this was conducted, obviously,
00:13:24
>> and what the real implications are,
00:13:26
especially around drones and small boats
00:13:28
that could do enormous damage to our
00:13:29
battleships and everything else, and
00:13:32
also election security. Um, but one of
00:13:34
the weirder parts is how the
00:13:36
administration has behaved. Um, Donald
00:13:38
Trump was dancing last night or golfing
00:13:41
and stuff like this, so the visuals
00:13:42
aren't very good. And the DoD has now
00:13:44
barred press photographers from Iran
00:13:46
briefings after they published photos
00:13:46
Hegseth's staff found unflattering, according
00:13:48
to the Washington Post. Hegseth's vanity
00:13:50
aside, um, he just looks like
00:13:52
a fatuous
00:13:56
popinjay at all times. But in this case, the
00:14:01
lack of seriousness about something
00:14:03
that's very serious seems problematic.
00:14:05
And it's also causing problems within
00:14:07
their own group of MAGA. There's a real
00:14:09
shift. There's a real like sort of
00:14:12
Tucker Carlson and Megyn Kelly, uh, MTG on
00:14:15
one side um and then you know Mark
00:14:19
Levin, Ben Shapiro, all this. There's a
00:14:20
real ugliness. I wandered
00:14:23
over to Twitter which I shouldn't have
00:14:24
done and the nastiness between them
00:14:28
is really quite something. It's really
00:14:29
quite something to watch.
00:14:31
>> Like imagery is so incredibly powerful.
00:14:34
Basically, I think one photograph
00:14:36
didn't bring an end to the
00:14:38
Vietnam War, but expedited it. And it's
00:14:40
that incredibly
00:14:42
dramatic photo of the young girl
00:14:45
running from a napalm bombing. And with
00:14:47
the Iraq war, George Bush and the
00:14:49
Pentagon, they banned photos of service
00:14:52
member coffins because he realized war
00:14:54
is so ugly that that it'll lose support.
00:14:58
And the notion that
00:15:00
these guys can't handle the images of
00:15:02
Pete Hegseth in an unflattering light, I mean
00:15:05
it's just, uh, it shows you're
00:15:09
allocating your capital in the
00:15:11
wrong places. That's not what
00:15:14
you should be thinking about or worried
00:15:16
about. And if you think you can control
00:15:18
the imagery of Pete Hegseth, well,
00:15:21
okay, just wait till you see the images
00:15:23
that are going to come out of Iran. And
00:15:25
you can already see that the IRGC
00:15:27
is quite frankly organizing again and
00:15:30
going on an information campaign.
00:15:32
>> They are and they've been very good.
00:15:34
Iran in general has been one of the
00:15:36
stronger players uh in those
00:15:38
spaces in terms of propaganda and
00:15:40
everything else. And so that's why
00:15:42
>> when you say good, you mean effective.
00:15:45
They they lie like there's no tomorrow,
00:15:46
>> of course. But hello, lots of people do.
00:15:49
Lots of governments do. Um,
00:15:50
>> oh, I don't know. I think I think Iran
00:15:52
takes it to a new level.
00:15:53
>> They do, but they are. When I say
00:15:56
good, it's they're good at it. Um, they're
00:15:58
very um they're all throughout all the
00:16:00
various social networks. They're very um
00:16:03
they did one the other day which I was
00:16:05
sort of fascinated by where they put up
00:16:07
your president as a pedophile. Um which
00:16:09
was interesting. Um, they've
00:16:12
been at it for a long long time and they
00:16:14
have used it. Often when there's stuff that
00:16:17
pops up online, it's either Russia or
00:16:19
Iran. um China to an extent too, but
00:16:22
really Iran has used social media as one
00:16:26
of the smaller I mean it is a smaller
00:16:27
country than Russia or less powerful and
00:16:30
it has used social media to its
00:16:32
advantage in ways that are really of
00:16:35
course heinous because it's conspiracy
00:16:39
theories, and you always find them
00:16:41
somewhere at the top. Everyone
00:16:43
I ever interview in cybersecurity says they're
00:16:46
the top in cybersecurity issues, in uh
00:16:49
propaganda, in conspiracy theories, and
00:16:51
they have a very well-oiled machine
00:16:51
throughout the world doing this kind of
00:16:53
stuff. So,
00:16:54
>> well, when the actual audit of social
00:16:56
media is done, I think we're going to
00:16:58
find that somewhere between 10 and 40%
00:17:01
of comments and posts
00:17:03
>> Yeah.
00:17:03
>> on geopolitical accounts or accounts of
00:17:06
influencers
00:17:07
is going to have originated from either
00:17:09
the CCP, the GRU, or the IRGC.
00:17:12
>> Yep. Absolutely.
00:17:13
>> And this is what you do. You see a piece
00:17:15
of content and then you look at the
00:17:16
comments to evaluate and shape your own
00:17:19
view of that content.
00:17:21
>> Mhm.
00:17:22
>> And when
00:17:23
>> it's all gamed
00:17:25
>> Yeah. And it it has a huge impact. You
00:17:28
don't even recognize how much impact it
00:17:29
has on your views of stuff because if
00:17:31
someone says,
00:17:32
>> "Oh, the US will be able to
00:17:35
escort ships through the straits of
00:17:36
Hormuz." I'm just using an example.
00:17:38
>> And then there's just a ton of stuff
00:17:40
saying that'll never happen. Oil prices
00:17:42
are going to be at $200.
00:17:43
>> Mhm. All right. Where's that comment
00:17:44
coming from?
00:17:45
>> Right.
00:17:46
>> And unfortunately, although they
00:17:49
could put in place ways to verify accounts
00:17:51
and get rid of fake accounts and fake
00:17:54
comments, you know, I mean, just go on
00:17:57
these really sensitive pages or
00:17:58
sensitive opinions and click on who made
00:18:01
the comment and it's someone with three
00:18:03
followers.
00:18:05
Okay, that's not a person,
00:18:07
>> right? And the question is why would
00:18:08
someone be making this comment or what
00:18:11
entity would have an interest in these
00:18:12
comments?
00:18:13
>> Yep.
00:18:14
>> Anyway, I
00:18:15
>> we're going to talk about that later
00:18:16
because there's a major report from the
00:18:18
Center for Countering Digital Hate
00:18:19
that's really interesting around chat
00:18:21
bots. Um but we're going to move on uh
00:18:23
and we have lots to talk about, but
00:18:25
this story is going to continue and
00:18:27
have reverberations obviously. Um,
00:18:30
we're going to go on a quick break
00:18:31
and when we come back, Anthropic sues
00:18:33
the Pentagon and Microsoft comes to
00:18:35
Anthropic's defense.
00:18:37
Support for this show comes from Quince.
00:18:40
If you've ever peered into your wardrobe
00:18:42
and felt paralyzed by indecision, then
00:18:44
it might be time to build a collection
00:18:45
of pieces you don't have to think too
00:18:47
hard about. Quince offers elevated
00:18:49
fabrics, thoughtful design, and pricing
00:18:51
that actually makes sense. Quince makes
00:18:53
high-quality wardrobe staples using
00:18:55
premium fabrics like 100% European
00:18:57
linen, 100% silk, and organic cotton
00:19:00
poplin. Their lightweight cotton
00:19:02
cashmere sweaters are perfect for the
00:19:04
changing seasons. And they work directly
00:19:06
with top factories, cutting out the cost
00:19:08
of a middleman, so you're not paying a
00:19:10
brand markup, just quality clothing. I
00:19:12
love Quince, I have to say. I use it all
00:19:14
the time. I usually use sort of sports
00:19:16
athleisure clothes there, which I find
00:19:17
incredibly comfortable, including their
00:19:19
athletic bras and things like that. But
00:19:21
I just got a cardigan and I'm not a
00:19:23
cardigan gal, but it's really
00:19:25
comfortable. It's really attractive and
00:19:27
it's very soft and I love it. They also
00:19:29
have lots of great seasonal colors and
00:19:31
prints for spring that will make getting
00:19:33
dressed a breeze. Right now, go to
00:19:35
quince.com/pivot
00:19:36
for free shipping and 365-day returns.
00:19:39
That's a full year to build your
00:19:40
wardrobe and love it. And you will. Now
00:19:42
available in Canada, too. Don't keep
00:19:44
settling for clothes that don't last. Go
00:19:46
to quince.com/pivot
00:19:49
for free shipping and 365-day returns.
00:19:52
quince.com/pivot.
00:19:57
Scott, we're back with more news. The
00:19:59
White House is reportedly preparing an
00:20:01
executive order to formally ban
00:20:02
Anthropic across the federal government,
00:20:04
which is likely illegal. The Defense
00:20:06
Department CTO Emil Michael, and let me
00:20:08
just say I covered him and he's a
00:20:10
toadying bully, just said on CNBC that
00:20:13
Anthropic would quote, "pollute the AI
00:20:15
supply chain." We've only done this for
00:20:17
foreign companies. Just so you know,
00:20:20
this kind of behavior. All this comes as
00:20:22
Anthropic is officially suing the
00:20:24
Pentagon for labeling it a supply chain
00:20:26
risk, effectively blacklisting the
00:20:28
company from federal contracts. This has
00:20:30
never been done to an American company.
00:20:31
Uh, Anthropic argues the government
00:20:33
overstepped its authority and violated
00:20:35
the company's First Amendment rights.
00:20:37
And now Microsoft is getting in the mix.
00:20:38
The company threw its support behind
00:20:40
Anthropic this week, urging the federal
00:20:42
court to temporarily block the Pentagon
00:20:44
supply risk designation in an amicus
00:20:46
brief. Uh, Microsoft warned uh that the
00:20:49
unprecedented move would have quote
00:20:51
broad negative ramifications for the US
00:20:53
tech industry. And they're damn right.
00:20:55
Scott, before we go further, I want to
00:20:56
play a prediction you made last week.
00:20:58
Let's listen. My prediction is no. And
00:21:02
that is Dario Amodei has given license
00:21:05
and permission to CEOs to say no. And in
00:21:08
the next 30 days, you are going to see a
00:21:11
raft of CEOs find their testicles and
00:21:14
start saying no to this administration.
00:21:16
>> So you were right, Scott. Uh so let's
00:21:18
talk about them saying no. And it's
00:21:19
not just Microsoft. 37 AI researchers at
00:21:22
OpenAI and Google, not the companies
00:21:23
themselves, also filed a brief
00:21:25
supporting Anthropic. Um, you know, I'm
00:21:28
going to just very quickly comment that
00:21:30
what the government's doing
00:21:32
here is really unprecedented. It's a
00:21:34
disagreement with a company and instead
00:21:36
of just disagreeing and moving on, they
00:21:38
are attacking them in the most
00:21:39
ridiculous ways, trying to make an
00:21:41
example of Anthropic and really hurt
00:21:42
their business. I need you all to
00:21:44
understand Emil Michael's role here
00:21:47
because these people all have
00:21:49
other interests and agendas that have to
00:21:51
do with their previous life in Silicon
00:21:54
Valley and their future life in Silicon
00:21:56
Valley. And Emil Michael's always, as I
00:21:58
said, been a toadying bully to powerful
00:22:00
men. And this is what he's doing here.
00:22:02
Um, and he is not a, um, a
00:22:05
player that is in any way um you know,
00:22:09
sort of neutral. He's not doing things
00:22:11
for you and I in in this government.
00:22:14
He's doing things in his own
00:22:16
self-interest, would be my guess. And
00:22:18
so with the attacks on Anthropic, right
00:22:20
behind him is all manner of competitors
00:22:22
of Anthropic that are using the federal
00:22:24
government to uh hurt a company that
00:22:27
decided it didn't want to do something. And
00:22:29
I'm glad Microsoft uh stood up for them.
00:22:32
Scott,
00:22:33
>> I think this is the biggest story in
00:22:34
tech. And so just a quick a quick recap.
00:22:37
Um, Anthropic had basically two asks,
00:22:41
um, and both pretty narrow. They didn't
00:22:44
want uh, Claude to be used for fully
00:22:46
autonomous weapons, meaning AI, not
00:22:48
humans making final lethal uh, targeting
00:22:52
decisions, which seems reasonable. And
00:22:54
the second one was no use of Claude for
00:22:56
mass domestic surveillance of Americans.
00:22:59
And the Pentagon responded that it does
00:23:02
not intend to use Claude for those
00:23:03
purposes, but refused to contractually
00:23:05
commit to that, arguing that it can't
00:23:08
lead tactical operations by exception
00:23:10
and that legality is the Pentagon's
00:23:12
responsibility. And then about
00:23:16
2 1/2 weeks ago, Trump posted on
00:23:18
Truth Social directing every federal
00:23:20
agency to
00:23:23
immediately cease all use of Anthropic's
00:23:26
technology.
00:23:28
And then Hegseth designated Anthropic as
00:23:30
a supply chain risk. Okay,
00:23:32
that's a label which has been
00:23:35
reserved for foreign adversaries.
00:23:36
>> Yeah, I just said that. Yeah.
00:23:38
>> And companies linked to the Chinese and
00:23:40
Russian governments. Well, I'm saying it
00:23:41
again, Kara.
00:23:42
>> Right. Okay, fine.
00:23:43
>> The supply chain risk
00:23:46
status.
00:23:48
First off, this isn't just the
00:23:50
government saying, "Okay, you don't want
00:23:52
to work with us, we don't want to work
00:23:53
with you." Mhm.
00:23:55
>> If they label them as a
00:23:57
supply chain risk, then already 100-plus
00:24:01
enterprise companies have reached
00:24:03
out to Anthropic and said, "We may not
00:24:05
be able to use you." A financial
00:24:07
services company paused negotiations
00:24:09
regarding a $50 million contract.
00:24:12
A pharmaceutical firm, a financial
00:24:14
technology company. I mean, they can't.
00:24:16
This really is, when you're labeled
00:24:18
sort of an enemy of the state, the
00:24:19
equivalent of being a corporate
00:24:21
enemy of the state, or a threat. I say threat.
00:24:24
Anthropic has now filed a lawsuit
00:24:26
against the Pentagon saying that
00:24:28
Congress's procurement laws don't
00:24:30
authorize blacklisting a US company over
00:24:33
protected speech. That's what this is.
00:24:35
They get to work with or not
00:24:37
work with who they want. And the supply
00:24:39
chain designation is
00:24:42
just not legal. And it sets a dangerous
00:24:44
precedent for any American company.
00:24:46
>> To most people: the government will
00:24:48
lose. But it will have an effect. Yes.
00:24:50
Yes. The government will lose, but it'll
00:24:52
still have the effect. This is a Trump
00:24:54
thing. He creates a real problem whether
00:24:56
it's Anthropic
00:24:57
>> and companies won't work with them until
00:24:58
they figure it out
00:24:59
>> and then it causes damage, just like they
00:25:00
did, you know, when they fired all of
00:25:03
Voice of America. Now they've lost in
00:25:05
court, and Kari Lake is an idiot, but
00:25:07
it's already caused
00:25:10
damage, and that's the goal:
00:25:12
they're going to push it illegally as
00:25:14
far as they can and then they'll be
00:25:16
stopped but by the time they're stopped
00:25:18
Anthropic is badly affected. And if you
00:25:20
all don't think this is a Silicon Valley
00:25:23
rumble happening here, it's all in the
00:25:25
self-interest of private companies who
00:25:28
have an interest in slowing anthropic
00:25:30
down. And if you look at the links
00:25:31
between Emil Michael and the rest of
00:25:33
these clowns,
00:25:35
>> they have financial interests in
00:25:36
competitors
00:25:37
>> just this. Yes, they do. And so this is
00:25:39
the way. Silicon Valley
00:25:42
used to ignore government
00:25:44
for the most part, and then the penny
00:25:46
dropped that they're easy to pay off, and
00:25:48
that they can carry out their competition
00:25:50
with each other in the federal government
00:25:52
by pretending they're working for us,
00:25:54
as people are getting spots,
00:25:56
putting their people in the various
00:25:58
spots. This is
00:26:00
a Silicon Valley corporate beef
00:26:02
happening. That is what's occurring
00:26:04
here. The one that's been most
00:26:06
outspoken, I'm trying to connect his
00:26:08
financial interest, which I'm sure is
00:26:09
driving his rhetoric, is David Sacks.
00:26:11
>> David Sacks, Marc Andreessen, please
00:26:14
understand there are shadow people
00:26:16
behind these actions that you need to
00:26:19
pay attention to. And Trump is, you
00:26:21
know, sort of a useful idiot. I'm
00:26:23
sure they make fun of Trump behind his
00:26:25
back. Um, but, you know, it's all in
00:26:27
their economic self-interest to hurt
00:26:29
this company. And they couldn't hurt
00:26:31
them by being better. So, this is how
00:26:34
they're doing it. This is what they're
00:26:35
doing.
00:26:36
>> But it comes down to this: this is
00:26:38
the fulcrum that determines if companies
00:26:41
continue to show some backbone. And by
00:26:43
the way, good for Satya Nadella,
00:26:46
showing some backbone here, again at
00:26:48
risk. So
00:26:51
Kalshi
00:26:52
is saying that
00:26:55
the likelihood Anthropic wins the case
00:26:56
is 72%.
00:26:58
In the meantime, companies will say,
00:27:00
"Hey, that site license we're about to
00:27:02
sign with anthropic, we're just going to
00:27:03
wait. We apologize. This is terrible.
00:27:06
>> We love you. We think your tech is...
00:27:09
>> We can't sign this contract right
00:27:12
now." To to your point, Microsoft and a
00:27:15
group of 22 retired senior military
00:27:17
officers have filed amicus
00:27:20
briefs in support of Anthropic and its
00:27:23
lawsuit.
00:27:24
But what's interesting is that consumers
00:27:27
are speaking. The enterprise is running away,
00:27:29
but consumers are running towards
00:27:31
Anthropic. Downloads of the Claude app
00:27:33
spiked more than 75% after Trump
00:27:36
prompted federal agencies to stop using
00:27:38
Anthropic. And on the flip side,
00:27:41
uninstalls of
00:27:44
ChatGPT's mobile app spiked roughly
00:27:47
300%
00:27:49
the day after Trump's proclamation. So
00:27:52
the question is, who wins in the
00:27:55
mind of Anthropic's board here: the fear
00:27:58
and the stasis that has been created in
00:28:02
the enterprise market, or consumers
00:28:04
running towards a company they think is
00:28:06
finally showing some backbone?
00:28:08
>> I think it's damaging. I think
00:28:09
this is such a Trump way to do this:
00:28:12
create
00:28:12
>> Anthropic's more enterprise,
00:28:14
unfortunately
00:28:14
>> I know. Create chaos
00:28:17
>> and damage. And it's illegal, but the
00:28:20
punch... Like, I'm not a boxer,
00:28:22
but if you do like a kidney punch, you
00:28:24
hurt the person and then you're
00:28:26
like, "Oh, did I do that? I didn't know
00:28:28
I did that." And you use your minions uh
00:28:31
and I cannot underscore again what a
00:28:33
minion Emil Michael is. Um to do your
00:28:37
dirty work and pretend you're working
00:28:39
for the government. The whole thing
00:28:40
is such a fixed fight. I
00:28:44
can't even... You need to, and I think
00:28:46
reporters should really spend time on this. A lot of
00:28:48
people don't know these characters.
00:28:50
Again, this was an ex-Uber executive.
00:28:52
He's been involved in a lot of stuff in
00:28:53
Silicon Valley, but he had to leave Uber
00:28:55
under... please go look at our
00:28:58
reporting on him from many years ago. Um, he
00:29:01
had to leave Uber under very difficult
00:29:03
circumstances around the rape of a woman
00:29:05
in India, in an Uber. Um, but
00:29:10
just go Google him, reporters who are
00:29:13
covering this, and stop acting like Emil
00:29:15
Michael is this clean character. In
00:29:17
any case, I'm sure he'll come after me,
00:29:19
but it's true. Um, so I'll win in that
00:29:22
regard. Um, anyway, we're going to
00:29:25
move on. Uh, another thing that again,
00:29:27
Silicon Valley just can't stop stealing
00:29:30
essentially. Grammarly launched an
00:29:31
expert review AI feature that gives
00:29:33
editing suggestions supposedly inspired
00:29:35
by well-known writers and journalists.
00:29:37
Casey Newton discovered the tool was
00:29:40
attributing advice to him and others
00:29:42
even though they never agreed to
00:29:43
participate. The feature even generated
00:29:45
advice under the name of a certain tech
00:29:47
journalist, Kara Swisher. Um, they've
00:29:51
they've stopped that now.
00:29:53
They pulled back on it, apparently.
00:29:55
But what an incredible bunch of
00:29:57
information and identity thieves. I
00:29:59
don't know what to say. Anytime these
00:30:01
people can steal, they steal. They're
00:30:03
such shoplifters. I don't know. Your
00:30:04
thoughts? Well, it goes back to this
00:30:07
mindset. I think
00:30:10
there are looking glasses into people's
00:30:11
souls: how they treat their pets, how
00:30:14
they treat service staff. It's sort of,
00:30:16
you know, when their guard is down,
00:30:18
there are certain tells, right?
00:30:20
>> And one of the tells that was really
00:30:22
frightening was when Sam Altman was asked
00:30:25
about the energy consumption of AI. He
00:30:28
said, "What people don't take into
00:30:30
account is the amount of energy it takes
00:30:32
and the amount of investment and
00:30:33
resources it takes to get a human to a
00:30:35
point where it can make logical
00:30:37
decisions and engage in critical
00:30:39
thinking."
00:30:40
>> Mhm.
00:30:40
>> He said, "If you look at how much energy
00:30:42
and input and resources it takes to
00:30:44
raise a child such that it can get to a
00:30:45
point where it can make decisions,
00:30:48
>> AI is better." I found that so
00:30:50
nihilistic and so inhuman because what
00:30:54
Silicon Valley
00:30:55
>> or at least some of the individuals we
00:30:57
talk a lot about don't realize is that
00:31:00
>> we try and get ROI economically such
00:31:03
that we can make low ROI investments in
00:31:05
relationships and people we love. I'm
00:31:08
I am not getting an ROI back
00:31:10
from my children on any sort of economic
00:31:12
level.
00:31:12
>> Well, you use a lot of energy. I'm
00:31:14
wondering if we should use as much
00:31:15
energy for you as we do. But go ahead.
00:31:17
Well, but the whole point, the whole
00:31:19
shooting match
00:31:22
>> of an economy and relationships and
00:31:24
satisfaction and purpose and some sort
00:31:25
of spiritual sense of calm, and like
00:31:28
your life mattered is that you do engage
00:31:31
in productive,
00:31:33
you know, productive economic or
00:31:34
domestic labor such that you can invest
00:31:38
that in other people
00:31:41
and you may or may not get a return. But
00:31:44
the point is the return you get is
00:31:45
you're so invested in something that you
00:31:48
your life has meaning. The whole
00:31:50
point is that you create value such that
00:31:53
you can invest that
00:31:56
value in relationships. And for most
00:31:58
people the most rewarding place of
00:31:59
investment where quite frankly they
00:32:01
don't get anything resembling an
00:32:02
economic ROI is in children. And to look
00:32:06
at it on that level is like, okay, you
00:32:09
don't understand
00:32:10
what it is to be a mammal or a human.
00:32:14
And and also the notion that you can
00:32:17
spend 50 years of your life
00:32:20
professionally working your ass off,
00:32:21
staying late, starting in the mail room
00:32:22
at the Washington Post as you did, such
00:32:24
that you have a voice, a reputation, a
00:32:26
turn of phrase, an ability to string
00:32:28
words together that compels people to
00:32:30
action or provides insight. And then
00:32:32
they can come in and just adopt that 50
00:32:35
years or piggyback on it.
00:32:37
>> That piggyback... steal it, really
00:32:39
>> It's like, if I type in, give me five
00:32:43
jokes on this, or give me a view on
00:32:45
the oil price and I put in my voice it
00:32:49
does a really good job because what it's
00:32:51
doing is stealing from everything I have
00:32:53
ever written, said or done.
00:32:55
>> That is correct.
00:32:56
>> And so the music industry did this
00:32:58
correctly. It said, "Okay, if we're
00:33:00
KROQ, which is awesome, the best radio
00:33:03
station
00:33:04
>> of the '90s in Los Angeles, and they
00:33:07
play a bunch of English Beat or Tom
00:33:09
Petty
00:33:10
>> or Lloyd Cole and the Commotions or R.E.M.,
00:33:13
they track how much they're playing and
00:33:14
then they send them a royalty."
00:33:16
>> Mhm. And what these guys want to do is
00:33:17
they want to leverage your years,
00:33:20
decades of discipline, schooling,
00:33:24
certification, risk-taking, time away
00:33:26
from your family, but they don't want to
00:33:28
pay for it. And they see everything. I
00:33:32
mean, that's, I think, a felony. But
00:33:35
what is double homicide from a mentality
00:33:38
standpoint is that these people really
00:33:40
look at relationships and humans on an
00:33:43
economic basis. I just when I saw that I
00:33:47
thought
00:33:47
>> Yeah. Yeah. He just had a kid.
00:33:48
>> This guy is not
00:33:50
>> he just had a kid.
00:33:52
>> Well, I'm not going
00:33:54
to speak to his children, but what he's
00:33:56
going to find out, and this is what I
00:33:58
tell other dads
00:33:59
>> comment. It was a dark comment.
00:34:01
>> I'm like, don't make the mistake I made
00:34:02
and think that right away your kid's
00:34:04
going to be super into the [ __ ] you're
00:34:06
into and you're going to get all these
00:34:08
Hallmark moments despite what insurance
00:34:09
commercials would tell you. you're going
00:34:12
to have to invest more in this child in
00:34:14
every way.
00:34:15
>> And that's the point because at some
00:34:17
point what you realize is that that
00:34:18
overinvestment in other people gives you
00:34:21
purpose and value.
00:34:22
>> Well, I do think we're going to move
00:34:24
on from this, but let me just say they
00:34:26
think everything is for the taking and
00:34:27
for them. This is just another
00:34:29
example. This is what was happening at the
00:34:31
Defense Department. Oh, we have
00:34:32
Anthropic. Oh, anything they can take
00:34:36
they take. And they just continue to
00:34:38
prove, you know, they keep not
00:34:40
meeting my low expectations for them
00:34:43
already. Um, and this is kind of an
00:34:45
interesting thing. Researchers from the
00:34:46
Center for Countering Digital Hate,
00:34:48
which, along with its founder,
00:34:50
has been attacked legally
00:34:52
by Elon Musk and the federal government,
00:34:54
now at his behest.
00:34:59
They're keeping going, though,
00:35:00
they don't care. They tested 10 major AI
00:35:02
chatbots and found out eight out of 10
00:35:04
were willing to help plan violent
00:35:06
attacks like school shootings, bombings,
00:35:08
or assassinations. Researchers posed as
00:35:10
13-year-old boys, showing
00:35:13
how easily minors could get guidance on
00:35:14
weapons, locations, and strategies. Only
00:35:17
Anthropic's Claude and Snapchat's My AI
00:35:19
consistently refused to assist in
00:35:21
planning attacks and only Claude
00:35:23
attempted to dissuade the users.
00:35:25
DeepSeek wished the user a happy and safe
00:35:28
shooting. And on that note, a lot of you
00:35:30
have been writing in about a story in
00:35:31
Canada earlier this year. An 18-year-old
00:35:34
gunman opened fire at a school in
00:35:36
Tumbler Ridge, British Columbia, killing
00:35:38
eight people. Let's listen to a clip
00:35:39
from a listener. I am calling because it
00:35:42
seems to be that there is a connection
00:35:44
now between the shooter and ChatGPT.
00:35:50
The shooter was flagged by ChatGPT
00:35:53
several months ago regarding some of uh
00:35:56
their behavior online. ChatGPT didn't
00:36:00
report it, which is one of the reasons
00:36:01
why I am leaving this message to see
00:36:06
what your thoughts are on that. OpenAI
00:36:09
is now being sued by the parent of the
00:36:11
child who was injured in the shooting.
00:36:13
As you know, I've been at this for
00:36:15
years, especially around kids, but it's
00:36:17
jumped beyond kids. Um, one
00:36:20
of the more recent shootings,
00:36:22
this suicide, was an adult who
00:36:25
was changed by these chatbots. I cannot...
00:36:28
let's stop calling them chatbots. What
00:36:29
an adorable word for synthetic beings.
00:36:32
Um, which are not
00:36:35
bound by legal obligations. Like, if you're a lawyer
00:36:37
and you did this, you'd go to jail. If
00:36:38
you're an analyst, if you're, you know, a
00:36:40
psychologist, and you did this, you'd go
00:36:42
to jail. If you were a person and you
00:36:44
did this, you would go to jail. Like, all
00:36:46
of those people go to jail. They're
00:36:49
willing to assist in violent attacks and
00:36:52
they're not doing anything to rein it
00:36:54
in. And it's not just kids, it's it's
00:36:56
everything. And again, the only one that
00:36:59
is doing the right thing is Claude. And
00:37:03
so, and this is Anthropic. And this is the
00:37:05
company. I'm not doing an ad for Claude
00:37:07
here, but they have at least some. And I
00:37:09
think they should be regulated, too. But
00:37:11
I can't tell you how incandescent I am
00:37:14
about the way these people try to take
00:37:16
every bit for themselves, and they do
00:37:19
not care the damage they are creating.
00:37:22
And I am going to keep talking about
00:37:24
this until Congress steps in and does
00:37:28
something about it. You don't work for
00:37:30
those rich people. You do not work for
00:37:32
them. And I'm with Dell Rico.
00:37:35
Enough with these people. So go ahead. I
00:37:37
just ranted. Well, I think it's
00:37:39
important to draw a distinction between
00:37:42
potentially creating some sort of
00:37:43
psychosis that leads to self harm or
00:37:45
harm against others through overuse of
00:37:48
of AI or any other digital platform. I
00:37:51
think that's a separate study that needs
00:37:53
to be done
00:37:54
and without the interference of the
00:37:58
massive money and lies and owned,
00:38:02
bought research that these firms
00:38:03
will do. I think this is different. I
00:38:06
think this is whether the federal
00:38:08
government needs to put in place laws
00:38:10
and incentives such that if a private
00:38:12
organization or corporation
00:38:15
receives information that this person
00:38:17
might be on the verge of committing an
00:38:19
act of violence, if they have a
00:38:21
responsibility to report it to the
00:38:23
authorities immediately,
00:38:25
and I think they do. I'm not a privacy
00:38:28
person. I'm not suggesting we go to
00:38:30
Minority Report where we arrest them
00:38:32
before they've committed the crime. But
00:38:34
at my school, or,
00:38:39
so, my school in Florida where my kids
00:38:42
went, at another school, we all
00:38:46
shared information when I was involved
00:38:48
with the school about these very
00:38:49
difficult situations. A kid was drawing
00:38:51
very disturbing images of gun
00:38:54
violence. And so the school felt like it
00:38:58
had an obligation to report it. And then
00:38:59
the FBI went to the house and the FBI
00:39:01
said, "Are there any guns in the house?"
00:39:03
Mhm.
00:39:04
>> And I think that was the right thing to
00:39:05
do.
00:39:06
>> You're right. That seems
00:39:07
>> If you notice, there was a video that
00:39:09
went viral on Snap. A teacher put out a
00:39:12
snap saying that she wanted to kill
00:39:14
these kids. And it immediately the cops
00:39:18
showed up and said, "Did you put did you
00:39:21
say this? Are you having any sort of
00:39:23
mental issue right now? You need to go
00:39:25
home and we need to understand what is
00:39:28
going on with you and if you have access
00:39:30
to guns before we let you back into a
00:39:31
school." Mhm.
00:39:33
>> And the same is true here that if you
00:39:36
are going to monetize this type of
00:39:38
information and you understand it so you
00:39:41
can interpret it so well
00:39:43
>> that you can create a prompt that keeps
00:39:45
them on another second, another minute
00:39:47
or serves them the exactly right auto
00:39:50
insurance ad. Then in exchange for that
00:39:53
economic benefit and what is clearly
00:39:55
demonstrated ability to know what's
00:39:57
going on with that person, if you see
00:40:00
any evidence that that person might be
00:40:02
capable of creating this type of crime,
00:40:05
you have an obligation,
00:40:07
You know, bartenders, the bar: if a bartender
00:40:12
continues to serve people alcohol,
00:40:15
observing that that person is really
00:40:17
drunk
00:40:18
>> and then that person gets in a car and
00:40:20
kills someone. Mhm.
00:40:21
>> The bar is liable,
00:40:23
>> right?
00:40:24
>> So if they have such incredible
00:40:25
targeting, such unbelievable
00:40:27
information, they can clearly tell that
00:40:29
okay, this individual is getting maps
00:40:34
>> and identification and information
00:40:36
is basically digital.
00:40:38
>> We should investigate is what you're
00:40:39
saying. This is
00:40:40
>> of a school, then immediately a message goes
00:40:43
out to the local authorities saying here
00:40:45
is exactly what this person said. We
00:40:47
have a judge involved. You get the order
00:40:50
and boom, they're in the house
00:40:54
asking this person questions. I'm not
00:40:56
saying they arrest them. They haven't
00:40:57
done anything yet.
00:40:58
>> Right. Right.
00:40:58
>> But
00:41:00
>> they would argue this is surveillance.
00:41:02
But of course, they don't mind selling
00:41:03
surveillance.
00:41:03
>> They're surveilling. They're surveilling
00:41:05
us to serve us ads. The thing is, you know,
00:41:07
I'm just saying a human being in this
00:41:09
situation would be arrested or liable,
00:41:11
right?
00:41:13
>> These people are giving... I agree you
00:41:15
should separate the two, but they're
00:41:16
related, Scott. It's the same mentality
00:41:19
of let us extract all the good stuff.
00:41:22
Let us not protect anybody and we are
00:41:25
not liable for what we're doing there.
00:41:27
You know, Marc Benioff once called them
00:41:30
cigarette companies. It's worse. It's
00:41:32
worse than a cigarette company. They
00:41:34
were just selling cigarettes and using
00:41:36
Joe Camel. That sucks. But this is
00:41:38
something demented. Like, I think
00:41:42
they're demented. I don't get that
00:41:44
they think this is okay and that they
00:41:46
don't say to themselves, should we
00:41:48
really is this the way we want to make
00:41:50
our money? We want to make our money by
00:41:52
poisoning children's minds. We want to
00:41:54
make our money by letting people who are
00:41:57
mentally disabled become more so, and
00:42:00
then giving... that's a different issue.
00:42:02
>> I agree. But they're giving people plans
00:42:04
and if you're going to give people plans
00:42:06
on how to shoot a school, you have a
00:42:08
responsibility to say you might want to
00:42:10
check this out. I get it, but for the
00:42:12
purposes of remedies,
00:42:15
>> I think you need to separate the two.
00:42:17
Character AI may in fact be leading
00:42:19
people into a state of psychosis where
00:42:21
they believe the right thing to do is to
00:42:23
find their stepfather's gun and kill
00:42:25
themselves cuz they're going to get to
00:42:26
hang out with Daenerys and the
00:42:28
afterlife. That is shifting their
00:42:31
psychological state.
00:42:33
My understanding of the shooter
00:42:35
here was that she was already in an
00:42:38
awful psychological state and was using
00:42:40
ChatGPT as a tool to execute
00:42:44
>> violence.
00:42:45
>> Both require some sort of regulation,
00:42:48
responsibility, and action.
00:42:49
>> Different. You're right.
00:42:50
>> Yeah. You've done a lot of good work
00:42:52
interviewing parents around the rabbit
00:42:54
hole and psychosis that the character
00:42:56
AIs can lead people to, which by the way
00:42:58
has an average usage time of 75 minutes
00:43:01
versus AI at like 13 or 15. At the same
00:43:05
time, if these organizations
00:43:08
can very easily use the same technology
00:43:10
to not only alert them at the right
00:43:12
moment to serve them an ad for a dating
00:43:14
app or for a cryptocurrency trading
00:43:16
platform
00:43:17
>> to say, "This person is clearly going
00:43:20
through something and potentially a
00:43:21
threat to the community and others. They
00:43:24
have a responsibility to immediately
00:43:26
notify the authorities."
00:43:27
>> All right, we're going to finish up with
00:43:28
they don't have a community
00:43:29
responsibility. One of the things that
00:43:30
always struck me
00:43:31
>> when you say they don't have a
00:43:32
community.
00:43:32
>> They don't feel like they like
00:43:34
>> No, I'm saying they should. I think
00:43:35
we're in agreement here.
00:43:36
>> I think they never did is the point I
00:43:38
was going to make. When they were
00:43:40
building their headquarters... I
00:43:42
remember Twitter building its
00:43:43
headquarters, and they had the most
00:43:44
beautiful cafeteria. I don't know if
00:43:46
you've ever been there, but it was
00:43:47
gorgeous.
00:43:47
>> I've never been invited to Twitter's
00:43:48
cafeteria.
00:43:49
>> This was pre-Elon and I was thinking
00:43:52
>> Pre-Elon.
00:43:52
>> Pre-Elon. Um, I was thinking they don't
00:43:55
care about all the businesses around
00:43:57
like you know what I mean? like they
00:43:58
kept the people captive in this
00:44:00
beautiful everything is here, don't go
00:44:03
anywhere. And that they don't give a
00:44:04
[ __ ] about San Francisco. It's just like
00:44:07
they just want to be here. But they
00:44:09
didn't care about the surrounding
00:44:11
delis. They didn't care about people
00:44:12
going out in the street and creating a
00:44:14
street life. They didn't back the...
00:44:17
you know, they don't have to back the
00:44:18
opera, but they didn't back any civic
00:44:20
organizations ever. And I was always
00:44:22
like, "Huh, what a group of people. They
00:44:24
don't really care about anything but
00:44:26
themselves." Like I remember being
00:44:28
struck by that cafeteria and thinking
00:44:29
they really could give a [ __ ] And it
00:44:32
was the same it's the same idea. They
00:44:34
could give a [ __ ] about our government.
00:44:35
They could give a [ __ ] about all these
00:44:37
things except for what's in their
00:44:39
interests. And so I could go on. I'm
00:44:41
moving on. Speaking of
00:44:42
psychosis, I'm moving on.
00:44:43
>> it comes down to one sort of basic
00:44:45
algorithm, and that is: all corporations. You
00:44:49
could argue big tech is worse than
00:44:50
most. But generally speaking, it's safe
00:44:52
to assume that all corporations care
00:44:54
about is shareholder value and earnings
00:44:57
and getting to those earnings within the
00:44:59
confines of the law. What unfortunately
00:45:02
is different nowadays, I don't think
00:45:04
that's changed. I think General Motors
00:45:05
would still be pouring mercury into the
00:45:06
river if there wasn't
00:45:08
>> I would agree
00:45:09
>> wasn't an EPA. The failure, the
00:45:12
glitch in the matrix, is that we used to
00:45:14
have checks and balances in the form of
00:45:15
leadership.
00:45:16
>> Mhm. that prevented a tragedy of the
00:45:18
commons. But because of Citizens United
00:45:20
now,
00:45:21
>> the only thing that elected officials
00:45:23
care about is getting reelected. And the
00:45:25
only thing you need to get reelected is
00:45:27
more money than the next person.
00:45:29
>> And Silicon Valley has connected the
00:45:31
dots here.
00:45:31
>> Yeah.
00:45:32
>> And it said we can compromise inch by
00:45:34
inch their ability to regulate us and
00:45:36
prevent a tragedy of the commons by
00:45:39
throwing money at them.
00:45:40
>> Yep. And now billionaires, the 900
00:45:43
billionaires in the United States are
00:45:44
responsible for 19% of the PAC giving.
00:45:47
>> Was that number? So I think you should
00:45:48
ask Taler Rico about this. I'm sorry.
00:45:50
You should let him talk about this
00:45:52
issue. I mean, ultimately, this is
00:45:55
not a good situation for all of us. And
00:45:57
someone came up to me the other day
00:45:59
who had been critical of my book,
00:46:00
Burn Book, being too hard on Silicon Valley,
00:46:02
and they said, "I have to
00:46:05
apologize. You weren't hard enough." And
00:46:07
I was like, "You're absolutely [ __ ]
00:46:09
right." All right, Scott. Let's go on a
00:46:11
quick break. When we come back, what
00:46:12
Barry Diller is saying about CNN.
00:46:15
>> Support for the show comes from
00:46:16
NetSuite. Every business is asking the
00:46:18
same question. How do we make AI work
00:46:19
for us? The possibilities are endless
00:46:22
and guessing is too risky. But sitting
00:46:24
on the sidelines is not an option
00:46:26
because one thing is almost certain.
00:46:28
Your competitors are already making
00:46:29
their move. No more waiting. With
00:46:31
NetSuite by Oracle, you can put AI to
00:47:33
work today. NetSuite is a top-rated AI
00:46:36
cloud ERP trusted by over 43,000
00:46:38
businesses. It's a unified suite that
00:46:40
brings your financials, inventory,
00:46:42
commerce, HR, and CRM into a single
00:46:44
source of truth. Now, with NetSuite AI
00:46:46
connector, you can use the AI of your
00:46:48
choice to connect to your actual
00:46:49
business data and ask every question you
00:46:51
ever had from key customers to cash on
00:46:54
hand to inventory trends. Plus, you can
00:46:56
automate those tiresome manual
00:46:57
processes. Let's see your competitor do
00:47:00
that. This isn't another bolted-on tool.
00:47:03
It's AI built into the system that runs
00:47:04
your business. Whether your company
00:47:06
earns millions or even hundreds of
00:47:07
millions, NetSuite helps you stay ahead
00:47:09
of the pack. If your revenues are at
00:47:11
least in the seven figures, get the free
00:47:13
business guide, Demystifying AI, at
00:47:16
netsuite.com/pivot.
00:47:18
The guide is free to you at
00:47:19
netsuite.com/pivot.
00:47:21
netsuite.com/pivot.
00:47:28
Scott, we're back with more news. Barry
00:47:29
Diller is speaking out about wanting to
00:47:31
buy CNN and what he would do with it. In
00:47:33
a new interview, Diller says CNN
00:47:35
hasn't been managed optimally and
00:47:37
has enormous
00:47:39
potential to influence. He says he told
00:47:41
Warner Brothers CEO David Zaslav all
00:47:43
this. Let's listen.
00:47:44
>> I said to him, I don't think your
00:47:46
programming... I don't think it's being
00:47:47
optimally programmed. I don't think it's
00:47:49
competitive. Now, by the way, the facts
00:47:52
uh support that. Uh meaning that its
00:47:56
ratings have declined, its revenue has
00:47:58
declined.
00:48:00
Still is quite profitable. But how would
00:48:02
you alter it?
00:48:04
>> Oh, in every way.
00:48:07
Look, feel, and see. Every way.
00:48:12
And I mean, I hope I get the chance. I
00:48:13
don't think I will, but I hope I do.
00:48:16
>> Um, I'm not sure when this was, but I
00:48:17
texted him. Um, he said this is not
00:48:20
happening. He said, "Not now that
00:48:22
the Ellisons have it." Um, and
00:48:24
he quite correctly, and I happen to know
00:48:26
this, they're going to combine CNN and
00:48:28
CBS. Um, he doesn't think he has a
00:48:31
chance. I would love to work for Barry
00:48:33
Diller. He's much more conservative than I
00:48:35
am, but I would certainly love to. He's
00:48:38
such a good programmer. He's
00:48:40
interested in
00:48:42
journalism, even if he gets mad at it
00:48:43
sometimes. He's someone I I appreciate
00:48:46
in that regard. Um and it would I wrote
00:48:49
him. I said, "Can you, please?" And he said
00:48:51
there's no way. So I can knock
00:48:53
this one out of the water. He can't do
00:48:54
it, unless... Please, Ellison, sell it to
00:48:57
Barry Diller. Please, that would be
00:48:59
great. So, any thoughts?
00:49:02
>> I would love to see Barry Diller
00:49:03
partnered with Jeff Zucker and a private
00:49:05
equity firm. And I think there's a
00:49:08
greater likelihood than people believe
00:49:10
that the Ellisons might say this is too
00:49:12
big a headache.
00:49:14
>> We might just sell a combined CBS and
00:49:16
CNN to someone else because I think that
00:49:19
I'm not sure, and maybe I'm being naive
00:49:21
here. I'm not sure they're as Machiavellian as
00:49:23
people think about trying to control the
00:49:25
world. I don't know, but maybe they
00:49:28
have some grand vision for how they
00:49:29
integrate it into TikTok, but I can't
00:49:31
imagine Larry Ellison, as smart as he
00:49:33
is, isn't going to say this is going to
00:49:36
be more of a headache than it's worth.
00:49:38
>> No, they wanted the studios. I
00:49:39
agree. They're not quite as Machiavellian.
00:49:41
They're just opportunistic, I
00:49:44
would say. You know, Ellison was a
00:49:47
Democrat.
00:49:48
>> You become the third richest man in the
00:49:49
world by focusing on economics. And I
00:49:52
think that
00:49:53
>> anyways, I think he makes a lot of
00:49:55
money. Diller is correct. It makes money.
00:49:56
>> Thin margins, though. But I did some
00:49:58
analysis here because I just wanted to
00:49:59
show you, to talk about some
00:50:03
numbers on cable news. I spent a decent
00:50:04
amount of time last night uh on AI
00:50:08
looking at ratings and viewership and
00:50:10
essentially what I did was just to give
00:50:12
you a sense for the ecosystem and also I
00:50:15
never miss a chance to make Pivot look
00:50:17
good.
00:50:18
>> It is good.
00:50:19
>> I looked at gross viewership, or
00:50:21
listenership. That's the
00:50:23
number of people who watch a program,
00:50:25
plus those who see it on YouTube or on social or
00:50:28
download the audio and listen to it. And
00:50:30
actually listens are more valuable than
00:50:32
views because it's a more intimate
00:50:33
experience. And that's why
00:50:35
>> that's why you get higher CPMs on
00:50:37
podcasts right now than you get on cable
00:50:39
TV. CPM is the cost per thousand viewers
00:50:42
an advertiser is willing to pay. So
00:50:44
let's look at gross viewership: the
00:50:46
number of
00:50:48
people that watch the program, see it on
00:50:50
YouTube or somewhere else, or listen to
00:50:52
the podcast version of it.
00:50:54
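The CPM arithmetic Scott walks through can be sketched as a quick calculation. This is only an illustrative sketch of the math, not the show's actual ad economics: the viewership and CPM figures are the ones quoted on air, the $15 CNN CPM is a midpoint of the quoted $13 to $17 range, and the per-slot revenue assumes one ad impression per viewer.

```python
# Illustrative sketch of the CPM (cost per mille, i.e. per thousand
# impressions) arithmetic discussed in the episode. Figures are the
# ones quoted on air; assumes one ad impression per viewer.

def ad_revenue(impressions: int, cpm: float) -> float:
    """Revenue an advertiser pays for one ad slot: (impressions / 1000) * CPM."""
    return impressions / 1000 * cpm

# Quoted numbers: Pivot ~375,000 gross viewership at a $45 CPM;
# CNN ~660,000 prime-time gross viewership at a reported $13-$17 CPM.
pivot_per_ad = ad_revenue(375_000, 45)  # $16,875 per ad slot
cnn_per_ad = ad_revenue(660_000, 15)    # $9,900 at an assumed $15 midpoint CPM

print(f"Pivot: ${pivot_per_ad:,.0f} per ad slot")
print(f"CNN:   ${cnn_per_ad:,.0f} per ad slot")
```

Under these assumptions, the higher CPM is why a podcast with far fewer viewers can out-earn cable on a per-slot basis, which is the point Scott is making.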
>> Fox News averages during prime time.
00:50:57
>> Fox
00:50:58
>> Fox Okay.
00:50:58
>> Fox News during prime time averages 2.1
00:51:02
million in gross viewership.
00:51:03
>> Mhm.
00:51:04
>> This is staggering. CNN 660,000.
00:51:08
>> Mhm.
00:51:09
>> Fox is kicking the [ __ ] out of CNN.
00:51:13
>> Yes. Pivot's gross viewership is
00:51:15
375,000.
00:51:17
>> CNBC is 252,000.
00:51:21
Now,
00:51:22
that's a bit of a misnomer. It's
00:51:24
important, but what advertisers care
00:51:27
about, they don't care about kids. They
00:51:29
don't care about seniors. They care
00:51:31
about people aged 25 to 54 who are
00:51:33
having kids, buying houses and cars, and in
00:51:35
their mating years.
00:51:36
>> This is a single Pivot, not the two together
00:51:38
for the week, right?
00:51:39
>> This is one show.
00:51:40
>> One show. single show.
00:51:42
>> We do two a week, but go ahead.
00:51:43
>> This is one show.
00:51:44
>> Mhm.
00:51:44
>> So, in the core demo, that's adults 25
00:51:47
to 54.
00:51:48
>> Mhm.
00:51:49
>> Okay. Well, let me
00:51:52
start here, which will explain that
00:51:53
number. Let's look at the median viewer
00:51:55
age.
00:51:56
>> Fox News, the median is 69,
00:52:00
>> CNN at 67, CNBC at 63.
00:52:03
>> Pivot, the median age is 42.
00:52:07
>> 42. Which leads you to believe, as you
00:52:10
should, that the percentage of
00:52:13
viewers in the core demographic for
00:52:16
these institutions, the cable guys
00:52:18
and CNBC, is somewhere between 20 and
00:52:19
30%. For Pivot it's 70%. Meaning the
00:52:24
number of people listening to or watching
00:52:27
these programs
00:52:29
in the core demo that
00:52:31
advertisers care about: CNBC gets 63,000
00:52:35
people on average watching programming
00:52:37
who are in the core demo.
00:52:39
>> CNN gets 135,000.
00:52:42
Fox gets 197,000
00:52:45
and Pivot gets 233,000.
00:52:48
>> We beat them in the demo.
00:52:49
>> So we're getting more people in the core
00:52:51
demo. And then which leads to the
00:52:53
following. Our average CPM
00:52:56
>> Mhm.
00:52:56
>> according to Ray Chow, Ultimate Nice Guy
00:52:58
and New Father.
00:52:59
>> Mhm.
00:52:59
>> From Vox, we get a CPM of $45.
00:53:05
The word I've heard from CNN is they get
00:53:08
between $13 and $17. I don't know what
00:53:10
Fox gets.
00:53:11
>> Mhm.
00:53:12
>> So, just to give you a sense, Oh, and
00:53:15
let's talk about median household income
00:53:17
>> and cost of doing business, but go
00:53:19
ahead. Yeah. You want to reach wealthy
00:53:21
people. Wealthy people are now
00:53:23
responsible for 50% of consumer
00:53:24
spending. They have more discretionary
00:53:25
income, right?
00:53:27
>> Fox News, the average household, the
00:53:29
median household income is $60,000.
00:53:32
CNN 65, CNBC 85.
00:53:37
>> That's not
00:53:38
>> Pivot
00:53:39
150 because we get a very tech-heavy, highly
00:53:41
paid audience. So, it's pretty obvious
00:53:46
why cable news, Fox is actually doing
00:53:49
pretty well,
00:53:50
>> but cable news as a whole
00:53:52
>> is dying.
00:53:54
>> Yeah,
00:53:54
>> it's literally dying. So, Barry Diller
00:53:57
saying he wants a new look and a new
00:53:59
feel, what I would suggest is unless you
00:54:01
can pick it up at distressed pricing and
00:54:04
consolidate it with a bunch of other
00:54:06
stuff, I think Barry's falling into the
00:54:08
same trap that a lot of people fall
00:54:09
into, and that is nostalgia is not a
00:54:11
strategy. I don't think there's any I
00:54:14
don't think there's any coming back.
00:54:15
That's not to say
00:54:18
>> the these come
00:54:19
>> they're too expensive. I mean, you
00:54:21
didn't even figure in costs. Our costs
00:54:23
are in the basement compared to all their costs.
00:54:25
>> Oh, the gross margins.
00:54:26
>> Yeah.
00:54:27
>> I mean, then it goes from
00:54:29
ugly to worse.
00:54:30
>> Yeah. What's interesting is
00:54:33
it's still a great brand,
00:54:35
and I agree with you about the
00:54:37
romanticism, and he happens to be, even
00:54:40
today, much older, still
00:54:43
the best programmer around. He's been
00:54:45
>> No, he's a legend in the world of
00:54:47
media.
00:54:47
>> But not just that. I've never
00:54:49
seen him think like oh I
00:54:51
>> Yeah, but so is John Malone and he
00:54:52
hasn't been able to figure it out.
00:54:54
>> I agree. I agree. But I'm just saying
00:54:56
I wouldn't just say, oh, he's just
00:54:58
being romantic. I've had discussions
00:54:59
with him. He's got some great ideas, and
00:55:01
I agree it's a real problem. I
00:55:03
would spin it off and see what Zucker
00:55:05
and Diller could do, because both of them are
00:55:07
very good. They have a lot of ideas and
00:55:09
bring in people who have great ideas. And
00:55:12
what would you do with it if they
00:55:14
said, here is this, this is what you have,
00:55:16
Scott, what would you do with it? I know
00:55:18
television is just anathema to you.
00:55:20
I know that. But it's an
00:55:23
interesting... I think it's what he knows
00:55:25
best, and it would be interesting. I
00:55:27
think he would be an interesting owner.
00:55:29
He says it's not happening, but
00:55:32
it's nice that he's bringing it
00:55:34
up. Well, sorry. And by the way,
00:55:35
speaking of our demo, of our young demo,
00:55:38
42 means there's a lot of people on the
00:55:40
very young side. A lovely young man
00:55:42
named Evan, last night, as I was going into
00:55:44
this party for Hank Paulson, was like, "I
00:55:46
love Pivot. Say hi to Scott." And I was
00:55:48
like,
00:55:50
>> "That is a very young person." I get
00:55:51
stopped by very young people, very old
00:55:54
people, most in the
00:55:57
middle, and very different people. And
00:55:58
Evan, I really appreciate all the
00:56:00
nice things you said about the show,
00:56:03
because we like all our different
00:56:04
fans. But you're right, the age thing is
00:56:06
important, all kinds of stuff. Anyway,
00:56:09
we'll see. Anyway, Barry, good luck. All
00:56:12
right, we're not going to be buying it,
00:56:13
and I won't go off on my craziness like
00:56:15
I did with the Post. All right, Scott,
00:56:17
one more quick break. We'll be back for
00:56:19
predictions.
00:56:21
support for this show comes from Indeed
00:56:23
when you're looking for talent. Indeed
00:56:25
sponsored jobs can be just the boost you
00:56:27
need. It matches you with quality
00:56:29
candidates fast, so you don't need to
00:56:30
spend months searching for that new
00:56:32
hire. According to their data, sponsored
00:56:34
jobs posted directly on Indeed are 95%
00:56:37
more likely to report a hire than
00:56:38
non-sponsored jobs. Join the 3.3 million
00:56:41
employers worldwide that use Indeed to
00:56:43
connect with quality talent that fits
00:56:45
their needs. Spend less time searching
00:56:47
and more time actually interviewing
00:56:49
candidates who check all your boxes.
00:56:51
Less stress, less time, more results.
00:56:53
When you need the right person to cut
00:56:55
through the chaos, this is a job for
00:56:57
Indeed sponsored jobs. And listeners to
00:56:59
this show will get a $75 sponsored job
00:57:02
credit to help get your job the premium
00:57:04
status it deserves at
00:57:06
indeed.com/mpodcast.
00:57:08
Just go to indeed.com/mpodcast
00:57:12
right now and support our show by saying
00:57:14
you heard about Indeed on this podcast.
00:57:16
That's indeed.com/mpodcast.
00:57:19
Terms and conditions apply. Hiring? Do
00:57:22
it the right way with Indeed.
00:57:26
>> Okay, Scott, let's hear a prediction.
00:57:28
You sort of mentioned it. What is
00:57:30
it? What is it? Oh, one thing. I predict
00:57:32
we're going to have a great time at
00:57:33
South by Southwest. All right. That's my
00:57:34
prediction.
00:57:35
>> That's what you're predicting always.
00:57:36
All right. So, my prediction is
00:57:40
essentially um I think the markets this
00:57:43
year are going to go down. Uh dangerous.
00:57:48
I think I think we're
00:57:49
>> I think we're on the precipice of like
00:57:51
a $10 trillion wipeout.
00:57:54
>> Whoa.
00:57:54
>> Um
00:57:55
>> really.
00:57:56
>> Oh yeah.
00:57:57
>> Tell all.
00:57:58
>> Well, no. And by the way, I get this
00:58:00
wrong all the time. This is not
00:58:01
financial advice, but I don't think it's
00:58:03
from Iran. It's from what comes after
00:58:05
Iran.
00:58:07
Um and this is this is the chain
00:58:10
reaction here. Uh, I don't think
00:58:14
oil is going to be at
00:58:16
150 bucks, but it's
00:58:19
going to be
00:58:21
sustainably higher. It's going to be
00:58:23
elevated through the rest of the year.
00:58:25
And inflation in some markets reignites.
00:58:28
The Fed can't cut rates. They're trapped
00:58:31
unable to stimulate the economy because
00:58:33
they're worried about inflation. I think
00:58:35
corporate earnings are really impaired
00:58:37
as consumers stop spending
00:58:40
because some of them will be paying five
00:58:41
bucks a gallon for gas and their 401k
00:58:44
will start to decline, and Q2 earnings
00:58:47
season becomes bad and then what CEOs do
00:58:50
when things are sort of bad is they
00:58:51
throw in the kitchen sink and they'll
00:58:53
make it look like a blood bath just to
00:58:54
get all the bad [ __ ] out.
00:58:56
>> That's a good idea actually.
00:58:57
>> But the real contagion
00:58:59
>> uh here is going to be from emerging
00:59:01
markets. I think there's a decent chance
00:59:03
that Pakistan and Egypt default as well
00:59:07
as Sri Lanka and Bangladesh
00:59:09
>> dollar-denominated debt, very energy
00:59:11
dependent, very fragile economies
00:59:14
>> because there's this
00:59:16
domino effect in those markets because
00:59:17
they can't afford oil imports and their
00:59:19
dollar denominated debt just becomes
00:59:21
unpayable
00:59:23
and then the real downward spiral starts
00:59:26
European banks holding that emerging
00:59:28
market debt start announcing write-downs.
00:59:32
Foreign banks, Deutsche Bank, BNP
00:59:35
Paribas, all hugely exposed. Credit
00:59:37
spreads blow out and we get sort of a
00:59:41
not to the same extent, but we get
00:59:42
an '08-style "which bank is next" moment
00:59:47
except this time it's happening while
00:59:50
the US is fighting a war we started for
00:59:52
no reason,
00:59:53
>> right?
00:59:54
>> Uh other than Scott, it's an excursion.
00:59:57
>> Well,
00:59:57
>> I'm teasing you. It's war. And well,
00:59:59
that's the mistake here: it should
01:00:00
have been a special
01:00:02
military combat operation. Instead,
01:00:04
they've turned it into a war with no
01:00:06
objectives. But anyways,
01:00:08
by August, the narrative shifts
01:00:11
from transitory war shock to holy [ __ ]
01:00:15
we may have broken the global financial
01:00:17
system. The S&P is off 20 to 40% from
01:00:21
its peak. Bitcoin goes to like 30,000.
01:00:25
Um, and you know, and quite frankly, the
01:00:27
only thing that probably goes up is
01:00:29
canned goods and ammunition
01:00:31
>> and Chevron. Um,
01:00:34
>> well, that's a scenario.
01:00:36
>> Happy South by Southwest.
01:00:38
>> But, but it's going to start, the
01:00:41
prediction is the following. It's going
01:00:42
to start, the contagion is going to
01:00:43
start in emerging markets that can't
01:00:45
afford
01:00:47
>> oil and uh, their debt is dollar
01:00:50
denominated. It's just a toxic cocktail.
01:00:52
It's a very accurate prediction, I have
01:00:54
to say.
01:00:55
>> So, and the problem is we've shot so
01:00:58
many bullets with our debt and printing
01:01:01
money that um the ECB and the Federal
01:01:05
Reserve don't have the same firepower
01:01:07
to try and lift us out of this.
01:01:10
>> Mhm.
01:01:10
>> So, in other words, it could be like
01:01:12
an '08 shock, but the problem is we
01:01:15
have less ammunition for a bailout.
01:01:17
>> Yeah. Yep. With the tariffs, with the
01:01:19
debt, with everything. I mean, you know,
01:01:21
one of the things that uh did you hear
01:01:23
James Carville saying, "I don't have
01:01:24
enough Trump derangement syndrome. I
01:01:26
want more. I should, you know, I'm so
01:01:28
furious at this [ __ ]." He was screaming about
01:01:30
what he has done here with
01:01:32
Iran. And it all, as you have noted
01:01:34
many times, links back to Epstein again,
01:01:37
right? It links back to this guy.
01:01:39
>> He's the guy in every room
01:01:40
>> in every room. I think you're absolutely
01:01:42
right that this everything is motivated
01:01:45
by either people wanting to get theirs
01:01:47
while the getting's good,
01:01:49
or an unhealthy need to
01:01:54
hold on to power in a demented way. Like
01:01:58
I remember when Elon said that one
01:02:02
time, that it's an existential
01:02:04
crisis for the world if Democrats win.
01:02:04
Actually, as I always say,
01:02:07
every accusation is a confession. We're
01:02:09
in an existential crisis because of
01:02:11
these greedy [ __ ] and because of the
01:02:14
the need to hold on to power over
01:02:16
everything and it's going to it has
01:02:17
reverberations around the world.
01:02:19
>> There's some really interesting tax
01:02:20
proposals. Senator Booker proposed
01:02:22
basically a tax holiday for young people
01:02:25
which I love. Not that expensive
01:02:29
because young people don't make that
01:02:31
much money. Mhm.
01:02:32
>> We need to level up young people who are
01:02:34
24% less wealthy than they were 40 years
01:02:36
ago versus old people who are 72%
01:02:38
wealthier. And then
01:02:40
>> for the first time I saw a wealth tax
01:02:43
>> that could potentially
01:02:46
make sense. But instead of going after
01:02:47
billionaires
01:02:48
>> Mhm.
01:02:49
>> they should be going after anybody or
01:02:52
everybody that, say, has, well, you know,
01:02:54
more than, call it, $100 million,
01:02:57
>> right?
01:02:58
>> You get no happiness. Your kids will get
01:02:59
no incremental happiness from inheriting
01:03:01
that much money.
01:03:01
>> Billionaires were helping you lift your
01:03:03
wallets. Um,
01:03:04
>> and it should be it should be annual and
01:03:06
it should be small enough
01:03:07
>> such that people don't have to liquidate
01:03:09
assets
01:03:10
>> or move to Florida like
01:03:12
>> Yeah, it has to be federal.
01:03:13
>> It just has to be federal.
01:03:15
You're absolutely right. That's great.
01:03:17
Okay. All right. We're going to talk
01:03:18
about that. That's going to be one of
01:03:19
our big topics at South by Southwest.
01:03:21
Anyway, we want to hear from you. Send
01:03:23
us your questions about business tech or
01:03:25
whatever is on your mind. Go to
01:03:26
nymag.com/pivot
01:03:28
to submit a question for the show or
01:03:29
call 855-51-PIVOT. Elsewhere in the Kara and
01:03:32
Scott universe, I'm going to get serious
01:03:34
for a second. Monday I published a story
01:03:35
that I think I'm the most proud of, of
01:03:37
anything I've done in a very long time.
01:03:39
I sat down with three Epstein survivors
01:03:41
who've been pushing for more
01:03:42
transparency, on On with Kara Swisher.
01:03:44
Survivor Liz Stein, who's also a
01:03:47
survivor of childhood sexual abuse,
01:03:50
said her desire to help her younger self
01:03:52
fuels her advocacy work. Let's listen to
01:03:54
a clip. It would be irresponsible of me
01:03:58
to have this position and to not use it
01:04:02
so that others did not feel alone in
01:04:05
this. Because if I could go back and
01:04:07
tell myself anything, it would be to
01:04:10
tell someone. And if they don't listen,
01:04:12
tell someone else. And just keep telling
01:04:14
until people listen to you. And even if
01:04:17
you feel like they don't, be proud of
01:04:19
yourself because you at least were able
01:04:22
to sit in your uncomfortable truth when
01:04:24
other people weren't. And that's really
01:04:26
what fuels me doing this advocacy, being
01:04:28
the person that I wish was there for me
01:04:31
when I needed them most. This was a
01:04:33
great show. They actually got to talk a
01:04:35
lot about it. Often you get these
01:04:36
shorter interviews. It was really very
01:04:39
moving. Just listen to it. I
01:04:42
know everyone goes, "Oh, goodness."
01:04:43
>> Yeah. You can hear the emotion in her
01:04:44
voice.
01:04:45
>> Such dignity, such incredible strength,
01:04:48
such heroic behavior in in the face of
01:04:50
adversity. And, you know, I've gotten a lot
01:04:54
of feedback that
01:04:56
I really appreciate, but it was all
01:04:58
these women. They were astonishing. It
01:04:59
has nothing to do with me, but I let
01:05:01
them talk and you should listen to what
01:05:03
they have to say as she said. Anyway,
01:05:05
that's the show. Uh thanks for listening
01:05:07
to Pivot. Be sure to like and subscribe
01:05:09
to our YouTube channel. We'll be back
01:05:11
next week.

Badges

This episode stands out for the following:

  • Most heartbreaking: 80
  • Most shocking: 70
  • Best concept / idea: 70
  • Most controversial: 70

Episode Highlights

  • War in Iran and Oil Prices
    The war in Iran is causing the largest supply disruption in the history of the global oil market.
    “The largest supply disruption in the history of the global oil market.”
    @ 01m 11s
    March 13, 2026
  • Impact on Global Economies
    Countries importing oil are facing severe economic challenges due to rising prices.
    “The biggest loser here is obviously the people of Iran.”
    @ 07m 59s
    March 13, 2026
  • Anthropic vs. Pentagon
    Anthropic sues the Pentagon for blacklisting, claiming violation of First Amendment rights.
    “This has never been done to an American company.”
    @ 20m 31s
    March 13, 2026
  • Microsoft Supports Anthropic
    Microsoft backs Anthropic in its lawsuit against the Pentagon, warning of negative ramifications.
    “This unprecedented move would have broad negative ramifications for the US tech industry.”
    @ 20m 51s
    March 13, 2026
  • Consumer Response to Controversy
    Downloads of Anthropic's app surged 75% after federal agencies halted its use.
    “Consumers are running towards Anthropic.”
    @ 27m 31s
    March 13, 2026
  • AI and Violence
    Researchers found that 8 out of 10 AI chatbots were willing to assist in planning violent attacks.
    “Eight out of 10 chatbots were willing to help plan violent attacks.”
    @ 35m 02s
    March 13, 2026
  • ChatGPT's Controversy
    A listener claims that ChatGPT flagged a shooter months before a school shooting but did not report it.
    “The shooter was flagged by ChatGPT regarding some of their behavior online.”
    @ 35m 50s
    March 13, 2026
  • Corporate Responsibility
    Discussion on the responsibility of corporations to report potential threats based on user behavior.
    “If you see any evidence that that person might be capable of creating this type of crime, you have an obligation.”
    @ 40m 07s
    March 13, 2026
  • Barry Diller on CNN
    Barry Diller expresses interest in buying CNN, criticizing its current management and potential.
    “I don’t think it’s being optimally programmed. I don’t think it’s competitive.”
    @ 47m 46s
    March 13, 2026
  • Market Predictions
    Predictions of a potential market downturn and its implications for the economy.
    “I think we're on the precipice of like a 10 trillion dollar wipeout.”
    @ 57m 49s
    March 13, 2026
  • Survivor Advocacy
    A survivor shares her journey and the importance of speaking out.
    “It would be irresponsible of me to have this position and to not use it.”
    @ 01h 03m 58s
    March 13, 2026
  • Strength in Adversity
    Reflecting on the dignity and strength of survivors in the face of challenges.
    “Such dignity, such incredible strength, such heroic behavior in the face of adversity.”
    @ 01h 04m 48s
    March 13, 2026


Key Moments

  • Oil Market Crisis @ 01:11
  • Future Aspirations @ 05:03
  • Media Control @ 14:34
  • Legal Battle @ 20:31
  • Consumer Reactions @ 27:31
  • Cable News Challenges @ 53:52
  • Market Downturn @ 57:40
  • Survivor Advocacy @ 1:03:54

Related Episodes

Who’s to Blame After Texas Flooding Tragedy — And What Needs to Change | Pivot
Former Prince Andrew Arrested — Will Epstein’s Network Face U.S. Justice? | Pivot
Netflix and Paramount Face Off for Warner Bros: Who Will Win the Bidding War? | Pivot
Resist and Unsubscribe: Scott Galloway’s Plan to Hit Big Tech Where It Hurts | Pivot