
Trump's First 100 Days, Tariffs Impact Trade, AI Agents, Amazon Backs Down

May 03, 2025 / 01:35:05

This episode features discussions on tariffs, AI agents, and the impact of recent economic policies with guests Ryan Petersen, Aaron Levie, and David Sacks. The panel examines the implications of tariffs on trade and small businesses, as well as the evolving landscape of AI technology.

Ryan Petersen, CEO of Flexport, shares insights on the current state of tariffs, noting a significant decline in ocean freight bookings from China and the challenges faced by small businesses due to increased tariffs. He emphasizes the importance of adapting to the changing trade environment.

Aaron Levie, CEO of Box, discusses the potential of AI agents in automating tasks and improving efficiency in businesses. He highlights how AI can take on new roles that were previously unaffordable, thus expanding the scope of work that can be done.

David Sacks contributes to the conversation by addressing the rapid advancements in AI technology, particularly in coding and reasoning models. He points out the challenges of integrating AI into regulated industries, where accuracy and reliability are critical.

The episode concludes with a discussion on the future of AI and its potential to reshape various industries, alongside the ongoing challenges posed by economic policies and tariffs.

TL;DR

The episode discusses tariffs' impact on trade, AI agents' potential, and economic policy challenges with insights from Ryan Petersen, Aaron Levie, and David Sacks.

Video

00:00:00
I gotta wrap, guys. I got to catch a
00:00:01
flight to Miami. Let me do a closing
00:00:03
here. If you want to keep going, you're
00:00:04
welcome. Two. Three. Two. The plane.
00:00:05
Just wait. Just text the pilot and just
00:00:07
tell them you're all right. Listen. I'm
00:00:09
not burning all the All-In credits, so to
00:00:12
speak, and all of our tokens. I'm
00:00:14
kidding. I'm kidding. I'm kidding. I'm
00:00:16
not flying private to everything and
00:00:19
then putting it on the all-in budget.
00:00:21
The rest of us are flying Southwest
00:00:23
for
00:00:24
your dictator Jim Pol. It's a strange
00:00:28
concept. Yeah. David Sacks. What does that
00:00:30
mean, Dave? When's the last time you
00:00:31
flew a commercial? Clinton, I haven't
00:00:34
missed a flight in about 15 years.
00:00:37
[Music]
00:00:39
Let your winners ride.
00:00:42
[Music]
00:00:46
We open sourced it to the fans and
00:00:48
they've just gone crazy with it.
00:00:54
All right, everybody. Welcome back to
00:00:55
the number one podcast in the world.
00:00:58
We're back. We're back and what an
00:01:00
amazing panel we have today with us.
00:01:03
Ryan Petersen, friend of the pod, is
00:01:05
back on the show. He's the CEO of
00:01:07
Flexport. How are you doing, Ryan? Did
00:01:09
you get any skiing in this year? I know
00:01:11
you like to ski in the deep powder like
00:01:13
I tried, man, but it was a busy year for
00:01:16
work and I got two little kids. I I did
00:01:18
a few days. Okay. So, you're Oh, yes. We
00:01:20
all forgot you gave control of your
00:01:23
company to somebody. It got a little uh
00:01:26
shaky, got a little contentious, and
00:01:28
then you took the reins back. How's it
00:01:30
been being back in the pilot seat? Oh,
00:01:33
that was a year and a half ago. So, it's
00:01:35
a distant memory. In Flexport
00:01:37
time. That's like a decade. We've uh
00:01:39
yeah, really had an amazing run.
00:01:41
Although, these tariffs, I mean, I guess
00:01:42
that's why you guys invited me on. These
00:01:44
tariffs have kind of made a lot of
00:01:45
created a lot of new uncertainty in the
00:01:47
Flexport world. Okay, so we'll
00:01:49
definitely get into that. and of course
00:01:51
fan favorite back for his fourth
00:01:53
appearance on the pod. I I I was so
00:01:56
first of all I saw the comment I I saw
00:01:57
the comments last time I was on. I'm I'm
00:01:59
officially not a fan favorite but uh
00:02:01
glad to be back on and I will be I will
00:02:03
be representing free markets uh in uh in
00:02:06
this uh uh in this version. What do the
00:02:09
comments say about you? It was like uh
00:02:11
you know like uh loves Biden uh you know
00:02:16
to totally beta you know all the uh
00:02:19
lover, beta soy boy. You're filling in for
00:02:23
me though. Um I think so. Yes. I was
00:02:25
trying to represent uh I was trying to
00:02:27
represent libertarian values at the
00:02:28
time. But uh but I love this that the
00:02:30
leftists are embracing Milton Friedman. I
00:02:33
think it's all worth it if that's what
00:02:34
comes out of all of AOC AOC is going to
00:02:37
be a complete free market uh person
00:02:39
soon. Yeah, free market months are
00:02:41
coming soon. Embracing free market
00:02:42
values and the stock market, right? Yes.
00:02:44
Exactly. Because any decline in the
00:02:46
stock market is Trump's fault. So now
00:02:48
they're they're embracing the stock
00:02:49
market. Well,
00:02:50
unfortunately the one day that he said
00:02:52
it's Biden's market, it was a it was a
00:02:53
green day. So that uh that didn't help
00:02:55
the case. I mean gosh uh well listen,
00:02:57
it's another massive green day already.
00:03:02
All that matters to me is that Uber is
00:03:04
the anti-tariff stock. It just does
00:03:07
great. It's not impacted by tariffs. So
00:03:08
here we go. With Chamath. It started
00:03:10
already. We have TDS on both sides.
00:03:12
We've got Trump derangement syndrome
00:03:14
from Aaron. Wait, no, no, no, no. I want
00:03:16
to be defender syndrome from Sacks.
00:03:18
We've got both. Defender and
00:03:20
derangement. Here we go. Syndrome. Trump
00:03:23
bias syndrome on your part. Who? Me? Me?
00:03:25
I call balls and strikes. What are you
00:03:27
talking about? Let's get started. It's
00:03:30
starting already, folks. It's going to
00:03:31
be a great episode. Lots of excitement
00:03:34
with us again. Jason has the uh the rain
00:03:37
self-sabotage. Find every way to not get
00:03:39
rich syndrome. I do. What are you
00:03:42
talking about? You guys said you're buy
00:03:43
me out of this thing and I can get the
00:03:45
hell out of here. You know how much my
00:03:46
shares are in all in worth? For the love
00:03:49
of God, write a check.
00:03:52
Get me the hell out of here. I just may.
00:03:55
Oh god. I mean, I'm going to be a
00:03:57
terror. If Uber breaks 88, that's my
00:04:00
number. 88 is the number. You're all
00:04:03
when that happens. Uh, and we're getting
00:04:04
close. All right, let's get started
00:04:07
here. We have so many topics to get
00:04:10
through with us again. David Sacks. Hey
00:04:13
David, you're doing uh more episodes
00:04:15
now. The audience wants to know. I don't
00:04:18
know if we're allowed to make any
00:04:19
initial announcements, but people are
00:04:20
asking me on the streets, in the
00:04:22
airports, in the comment threads. Is
00:04:24
Sacks back?
00:04:26
Well, the ratings are back ever since I
00:04:28
came back to the show. That's for sure.
00:04:31
The ratings are back for the show, but is
00:04:34
it measurably back? Is Sacks back?
00:04:37
Well, I'm back as much as I can. Mhm.
00:04:40
And you are a partial employee of the
00:04:42
government. You can do 130 days a year
00:04:44
or something. Is that still the status?
00:04:46
Yeah, it's roughly half of the work days.
00:04:48
Got it. And so what do you do? You have
00:04:49
a you have a punch clock there. When you
00:04:51
get to the White House, you punch in,
00:04:52
you punch out like Fred Flintstone or
00:04:54
what? Are you keeping track of these
00:04:55
days? How do you do it? I know why you
00:04:57
don't know this because you have yet to
00:04:58
be invited to the White House. But
00:04:59
that's interesting. That's not how it
00:05:02
works. Normal people just
00:05:04
badge in and badge out like that. Badge
00:05:06
it and badge. It's a natural place,
00:05:07
Jason. I I mean, literally, it's
00:05:09
interesting. There's a new private club.
00:05:11
It's incredible that you have thoroughly
00:05:14
prepared for this week, just like
00:05:15
always. I am always prepared.
00:05:17
Interestingly, I don't know if you
00:05:18
gentlemen know this, Ryan and Aaron.
00:05:20
There's a new private club in DC that uh
00:05:23
Don Jr. is doing and Sacks is a member.
00:05:25
Chamath's a member. And I just checked my
00:05:27
Gmail. I checked all three of my Gmail
00:05:29
accounts, everything. No invite. It
00:05:31
must have gotten lost again.
00:05:34
Did you send a paper one? Was it like
00:05:35
you sent a gold card or something? Sacks,
00:05:37
how do I get invited to this private
00:05:39
club? What is this private club?
00:05:40
Everybody wants to know. Well, we'll be
00:05:42
happy to have you as a guest. Okay. Do I
00:05:45
have to wear a MAGA hat and have the
00:05:46
courtesy MAGA hats at the door? If you
00:05:48
want to be a member, obviously there are
00:05:50
dues and a membership fee, and Okay. I
00:05:53
just didn't want to waste your time with
00:05:55
an offer that I knew you wouldn't uh be
00:05:58
willing to accept. It's only $500,000 is
00:06:01
what I read. Is that true?
00:06:03
That's true for for founding members who
00:06:05
have additional benefits, but there's
00:06:06
also a lower level that's the more
00:06:08
reasonable membership level. So, I think
00:06:11
people are getting a little bit carried
00:06:12
away with that number. Got it. Okay.
00:06:13
That's why I wanted to clarify. Yeah.
00:06:15
Yeah. There's like 10 founding members
00:06:17
who have that level and then there's a
00:06:19
lower level for more average member.
00:06:22
Chamath, are you one of those 10? Yes.
00:06:25
Do you pay more if you have TDS or how
00:06:27
does that work? TDS premium. What are you
00:06:29
talking about? J Cal specifically or what
00:06:30
are we talking about? The TDS
00:06:33
surcharge. Asking for a friend. It's a TDS
00:06:35
surcharge. You put the tariff surcharge
00:06:38
we just we want a place to hang out in
00:06:41
DC. All of us have been to clubs like
00:06:43
the Battery or I don't know if you go to
00:06:45
LA like the I think places. There's
00:06:48
Malibu Beach House. There's Bird Street
00:06:50
Clubs. There are places in Palm Beach
00:06:52
that are really cool. In any event, we
00:06:54
wanted a place to hang out. And the the
00:06:56
clubs that exist in Washington today
00:07:00
have been around for decades. They're
00:07:01
kind of old and stuffy. To the extent
00:07:03
there are Republican clubs, they tend to
00:07:05
be like more Bush era Republicans as
00:07:08
opposed to Trump era Republicans. So, we
00:07:11
wanted to create something new, hipper,
00:07:13
and Trump aligned. Since I'm in the
00:07:16
government, I can't be an owner, but I
00:07:18
told him I'd be happy to be member
00:07:20
number one. And so, I, you know, said,
00:07:22
"Great, let's let's do it." And so,
00:07:24
we're creating a place for us to hang
00:07:25
out. That's basically it. We want a
00:07:27
place to go where you don't have to
00:07:28
worry that the next person over at the
00:07:30
bar is a fake news reporter or even a
00:07:33
lobbyist or something like that who we
00:07:35
don't know and we don't trust. Got it.
00:07:38
So, it's like any private club. You want
00:07:40
to go somewhere that's highly curated.
00:07:45
This private club uh movement is
00:07:46
happening all over the country, not just
00:07:48
Washington. But we're creating something
00:07:49
that didn't exist before in DC, which
00:07:52
again is younger, hip, Trump aligned,
00:07:54
Republican. We're uh I I actually
00:07:57
started a Kamala club in um in the Bay
00:07:59
Area. So um so we're Yep. I don't think
00:08:03
anyone would pay to join that though is
00:08:05
the problem, right? I mean, it's an open
00:08:08
bar, that's for sure.
00:08:10
Where where do where do you guys meet
00:08:12
up? In like Redwood City. We actually
00:08:13
meet up at the uh at the at the trade
00:08:15
ports. All right. So we're 100 days
00:08:18
into Trump 2.0. It's just a random 100
00:08:22
day thing, but everybody's talking about
00:08:24
everybody's hand-wringing. What has it
00:08:26
been like for this first 100 days? How
00:08:28
does it compare to Biden? How does it
00:08:29
compare to Trump 1.0? 143 executive
00:08:32
orders, the most ever in the first 100
00:08:35
days. And they're moving obviously at a
00:08:37
at a different pace to uh be generous.
00:08:41
Major indices are down 7 to 10%.
00:08:43
Obviously, this trade war and tariffs,
00:08:47
the yield on the
00:08:49
10-year, it's down about 40 basis
00:08:51
points. There's a lot going on. Let's go
00:08:53
around the horn. Ryan Aaron, you're our
00:08:55
guest. What's your take on the first 100
00:08:57
days? Is it what you expected, good,
00:09:00
bad, and otherwise, wins and fails,
00:09:02
everything. I'll go first. I think it's
00:09:04
a whirlwind. I mean, if you look at the
00:09:07
uh the John Boyd, the fighter pilot, has
00:09:09
this concept of the OODA loop, which is
00:09:11
observe, orient, decide, and act. And
00:09:14
the concept is that if you're in dog
00:09:15
fighting, if you're able to maneuver
00:09:18
through those OODA loops at a faster pace
00:09:20
than your than your competition, they
00:09:22
get disoriented and don't know what to
00:09:23
do. And I I think that that's got to be
00:09:26
how Democrats in Washington and maybe
00:09:28
mainstream Republicans in Washington.
00:09:30
Certainly journalists are all feeling
00:09:32
this like there's that the Trump Trump
00:09:34
administration takes action and before
00:09:37
anybody can respond to that they have
00:09:39
already done like four more things and
00:09:40
you're like wait I forgot to actually
00:09:42
follow up on the other thing that they
00:09:43
did that I didn't like. Uh, and so it's
00:09:45
yeah, it's pretty disorienting if you're
00:09:47
if you're trying to respond. They can't find
00:09:49
a line to fall back to and go, "Hey,
00:09:50
we're going to push back against this
00:09:51
policy because they're already moving on
00:09:53
to the next one, the next one." Um, so
00:09:55
that that's like my high level
00:09:56
interpretation. Obviously, I come at it
00:09:58
from a trade angle. I think everybody
00:10:00
knew that Trump was going to be he he
00:10:02
told us during the campaign that the the
00:10:04
most beautiful word in the English
00:10:05
language is
00:10:06
tariff. Don't tell them it's an Arabic
00:10:08
word, but the most beautiful word in the
00:10:11
English language. And so we knew that
00:10:12
was coming. I think that the the the
00:10:16
suddenness of it all caught people by
00:10:18
surprise. I mean, they told us April
00:10:20
1st, April 2nd would be Liberation Day.
00:10:23
They didn't tell us that it would go
00:10:25
live the next week, you know, and affect
00:10:27
goods you've already ordered. So,
00:10:28
that's one aspect that people are kind
00:10:31
of disoriented about. And we're gonna
00:10:33
unpack. Yeah, we're going to unpack
00:10:35
that. Aaron, your thoughts on the first
00:10:36
100 days? Obviously you are a Democrat
00:10:40
and uh you were pretty vocally not in
00:10:44
support of Trump. So what's your take on
00:10:46
the first 100 days? Any any bright spots
00:10:48
for you things you you know support?
00:10:50
Actually Sacks's world I'd say has
00:10:53
been a bright spot. So especially I mean
00:10:55
I think we have a very clear message on
00:10:57
AI and uh and that that that is that's
00:11:00
been I think a huge net positive is um
00:11:03
you know if you look at the the past you
00:11:05
know few months uh out of all the the AI
00:11:08
push from the administration it's
00:11:09
unmistakably you know pro open source
00:11:11
you know pro you know bring as much AI
00:11:14
innovation you know to the US obviously
00:11:16
that the tariffs you know add a little
00:11:17
bit of a headwind to that. I have some
00:11:19
very strong asks, you know, around high
00:11:21
skilled immigration because I think
00:11:22
that, you know, AI talent is going to be
00:11:25
super critical to to actually win the AI
00:11:27
war. So, so I'd say that that
00:11:28
directionally has has had some positive
00:11:30
momentum. You know, from my perspective,
00:11:32
this is kind of playing out almost
00:11:35
exactly how I thought it would 6 months
00:11:36
ago. And then 3 months ago, I I think
00:11:40
there was some signs that maybe maybe,
00:11:42
you know, it wouldn't play out this way.
00:11:44
um just based on some of the some of the
00:11:46
you know kind of early groups that were
00:11:48
coming to the White House the the the
00:11:49
the sort of deep business you know kind
00:11:52
of centricity of the White House you
00:11:54
know I think it was day one or two that
00:11:56
Stargate was announced you know at the
00:11:58
White House we're going to go build
00:11:59
massive infrastructure the case I'd like
00:12:01
to make you know once we talk about
00:12:02
tariffs is is I think there's an
00:12:03
alternative universe where you just lean
00:12:05
into acceleration as opposed to adding
00:12:07
headwinds but but so that would be that
00:12:10
would be the case of what what maybe
00:12:11
could have been you know very different
00:12:12
is we just keep doubling
00:12:14
down on what's working while fixing the
00:12:16
parts that aren't working. But uh but
00:12:19
that would be, you know, my my uh my
00:12:20
judgment so far. Chamath, I mean, you've
00:12:23
been talking about it here every week.
00:12:26
You and I have uh been talking about it
00:12:28
pretty consistently, so I don't think
00:12:29
there'll be many surprises here, but
00:12:31
take a second and maybe assess what you
00:12:33
think if you had to pick a singular
00:12:35
thing that's gone really well and a
00:12:37
singular thing you think could be
00:12:38
improved. What What do you got? Let me
00:12:40
give you my overall grade.
00:12:44
And then I'll tell you how I get to
00:12:45
that. I think the first 100 days have
00:12:49
been a B+.
00:12:54
And here's how I get to that
00:12:56
score. There have been two things where
00:12:58
I think Trump
00:13:01
has frankly hit a home run. The first is
00:13:06
all of the direct
00:13:08
investment and specifically the foreign
00:13:10
direct investment into the United
00:13:13
States. I think it's approaching if not
00:13:15
it has already exceeded a trillion
00:13:17
dollars from corporations and
00:13:20
organizations and individuals from
00:13:22
around the world who have committed to
00:13:25
bringing money into the United States.
00:13:27
And I think strategically that's a
00:13:29
legacy that will live past him. So, I
00:13:32
think that's been an
00:13:33
A+. The second is we had a very unsafe
00:13:38
border
00:13:39
situation and he ran on shutting it
00:13:43
down. I'm not talking about the
00:13:45
execution of the deportations. I'm just
00:13:47
saying getting the illegal crossings to
00:13:50
zero and he's done that. So, that's been
00:13:53
an A+.
00:13:55
I think what's going to be more
00:13:57
controversial are these next three
00:13:58
things though. But in my interpretation,
00:14:01
I think the tariffs have been an
00:14:04
A and I think that the market reaction,
00:14:07
the stock market is only down 4%. And
00:14:10
the interest rate markets are, you know,
00:14:12
4 and a quarter percent. I think those have been an
00:14:14
A. Now the reason I think tariffs have
00:14:18
been an A is because it has uncovered, in
00:14:21
my opinion, how beholden we are to a
00:14:25
brittle supply chain and specifically to
00:14:28
China who is a friend but who's also an
00:14:30
enemy and I think that that's going to
00:14:32
really severely complicate
00:14:35
our flexibility and optionality in the
00:14:38
future as they do what is in their best
00:14:40
interests. Okay. So where have they then
00:14:43
not done so well? I think the documents
00:14:47
have been frankly a D. We were supposed
00:14:50
to get the Epstein files. We haven't
00:14:52
yet. We were supposed to get the Martin
00:14:55
Luther King files. We haven't. We did
00:14:57
get the redacted JFK files. I don't
00:14:59
think there's been very good
00:15:00
communication about why it's taking so
00:15:02
long. So I think it's a very small
00:15:04
narrow thing, but I think it had a lot
00:15:06
of attention on the way in. I think
00:15:09
the communications of the
00:15:12
tariffs and the back and forth have been
00:15:14
a C. I think the markets were not
00:15:19
led in enough of a way where they could
00:15:22
absorb the
00:15:25
volatility. But if you take it all in
00:15:27
its totality, I would give it a B+. I
00:15:30
think it's been a very productive 100
00:15:32
days. And when you look back, I think
00:15:34
in, you know, 3 years, four years, 5
00:15:36
years. Okay, we've made some important
00:15:39
progress. Sacks, obviously you're part of
00:15:42
the administration, so I'm not sure
00:15:44
exactly how to ask you this, but you
00:15:45
you heard some nice compliments about AI
00:15:47
from Aaron. I I happen to agree with
00:15:49
those. I actually agree with uh a good
00:15:51
portion of the crypto stuff, too. I
00:15:52
think actually getting those uh
00:15:54
tightened up, which are your two zones
00:15:56
of excellence and your area that you're
00:15:59
focused on. I think you've done a great
00:16:00
job there. So, just bestie to bestie,
00:16:02
great job there. What's your Thank you.
00:16:04
What's your take overall? You know, it's
00:16:06
kind of hard, I guess, to ask somebody
00:16:08
in the administration to criticize the
00:16:09
administration, but hearing everybody
00:16:10
else's take, what's your response,
00:16:12
maybe? Well, I would I would highlight
00:16:14
three main areas that I think are big
00:16:17
accomplishments for the Trump
00:16:18
administration in the first 100 days.
00:16:19
So, so number one has to be the border.
00:16:21
Like Chamath said, I think you have to
00:16:23
give the administration an A+ on this.
00:16:25
They've completely stopped the border
00:16:26
crisis. I think we all knew that Trump
00:16:29
would take action on this because it's
00:16:30
one of the main issues he campaigned on.
00:16:32
I think if you had asked any of us, you
00:16:34
know, 4 months ago, would this problem
00:16:36
be completely solved? Meaning, border
00:16:39
apprehensions completely stopped, border
00:16:41
completely sealed within the first 100
00:16:43
days, I don't think we would have
00:16:45
believed necessarily that it would get
00:16:46
done so quickly, but it has. Uh, recall
00:16:50
that for 4 years during the Biden years,
00:16:52
we were told for the first 3 years that
00:16:55
the problem didn't even exist. Whenever
00:16:57
the videos were published of caravans
00:17:00
coming or throngs of people running
00:17:02
across the border, we were told that
00:17:04
these were cherrypicked videos on Fox
00:17:06
News. It wasn't real. Finally, in the
00:17:09
last year of the Biden administration,
00:17:11
they said, "Okay, we're finally going to
00:17:12
do something about it." They took some
00:17:13
limited actions and they said that doing
00:17:15
more than that would require new
00:17:18
legislation. Well, all of that was just
00:17:20
gaslighting. It turns out Trump came in,
00:17:22
he restored remain in Mexico and other
00:17:24
policies, completely stopped it. He had
00:17:26
this line at the state of the union
00:17:27
which I think is exactly right which is
00:17:28
we didn't need a new law we just needed
00:17:30
a new president. So I think that's area
00:17:32
number one. Area number two I would say
00:17:35
would be the vibe shift in the culture
00:17:38
around wokeism and DEI. You know how
00:17:41
quickly we forget about this but wokeism
00:17:44
has completely collapsed. Uh I don't
00:17:46
know that anyone is endorsing in a
00:17:49
fullthroated way. Moreover, beyond just
00:17:51
sort of the cultural aspect of it, I
00:17:54
think we've had significant policy
00:17:55
changes on DEI. Trump has basically
00:17:58
ended DEI at the government level. He
00:18:01
also signed an executive order ending
00:18:04
the use of disparate impact for
00:18:06
affirmative action. This is the policy
00:18:08
that said that even if you have a policy
00:18:11
that's applied in a completely neutral
00:18:13
and objective way, if it results in a
00:18:16
disparate impact where different groups
00:18:17
are represented in a different way in
00:18:19
the outcomes, then somehow that must be
00:18:22
racist. And that led to essentially
00:18:25
engineering the results of various
00:18:28
populations to basically fit quotas. And
00:18:32
I think all of that now has fallen by
00:18:34
the wayside. And I think that
00:18:35
meritocracy and colorblindness are back.
00:18:37
The only hold out really has been these
00:18:40
universities where Trump is now taking
00:18:42
action against Harvard and I think that
00:18:45
ultimately we will win that battle. You
00:18:47
see that even in relatively liberal
00:18:50
companies the DI departments have been
00:18:52
cancelled and they're moving back
00:18:53
towards more of a meritocracy. So I
00:18:56
would say that that's like big shift
00:18:57
number two. And I think if any of us had
00:18:59
tried to predict that 100 days ago, we
00:19:02
would have thought yes, Trump will do
00:19:03
something about it. But I don't think we
00:19:04
would have predicted the total collapse
00:19:07
of wokeism and DEI so quickly. And then
00:19:10
I'd say the third area which is still in
00:19:13
flight is the reprivatization of the
00:19:15
economy. That's a term that Scott Bessent
00:19:17
used. I think that the Trump
00:19:19
administration needs to reprivatize the
00:19:21
economy. And I like that framing of it.
00:19:24
And there's a bunch of different pieces
00:19:26
under that. I'd say number one is
00:19:28
Doge, again ending this wasteful spending. I
00:19:33
do think that Trump has come into office
00:19:36
inheriting a very weak Biden economy
00:19:38
that was being propped up by massive
00:19:40
amounts of government spending that was
00:19:42
not only stimulating the public sector,
00:19:44
but it was also goosing the employment
00:19:46
numbers as well. And we knew that that
00:19:49
spending was unsustainable. We have to
00:19:51
do something about it. So, I think for
00:19:53
the first time in decades, we've
00:19:54
actually started to make real cuts in
00:19:57
government, real cuts in the federal
00:19:59
workforce. And look, we'd like to do
00:20:01
more, but that is a huge shift in the
00:20:03
conversation. There's other pieces of it
00:20:06
as well. I mean, President Trump has
00:20:08
signed a significant number of executive
00:20:10
orders on deregulation. There's also
00:20:13
been unleashing energy. He ended Biden's
00:20:15
EV mandate and a lot of these like green
00:20:18
new scam projects, offshore wind, and
00:20:21
he's been encouraging oil and gas
00:20:23
exploration. So, I think there's that.
00:20:25
And then I appreciate what Aaron said
00:20:27
about tech innovation. We did repeal
00:20:30
Biden's exec order on AI, which was, you
00:20:32
know, 100 pages of unnecessary
00:20:34
regulation on AI. We've ended the war on
00:20:36
crypto, and I think we're trying to stop
00:20:39
the regulatory capture that benefits
00:20:40
large incumbents. So, you have all these
00:20:43
things, and there's been other things
00:20:44
that that have been done on the economy
00:20:45
as well, but I I do think that this sets
00:20:47
us up for a Trump boom in the future.
00:20:51
It's just that a lot of these changes
00:20:53
take time to play out. Okay, great.
00:20:56
Well done. And well, I think I think I
00:20:59
think we knew Sacks would be very pro.
00:21:01
Chamath uh Chamath seems really pro other
00:21:03
than he wants like the alien conspiracy
00:21:05
files released which we'll get soon.
00:21:08
What is the view from Jay
00:21:10
Cal, where you're the
00:21:13
left-leaning guy in the room? Uh
00:21:16
you know I'm kind of independent but
00:21:17
yeah social liberal. You know I I look
00:21:19
at what all Americans believe and and
00:21:22
try to build some consensus here. It's
00:21:24
one of the things I've been trying to do
00:21:25
on the pod is look for where we
00:21:27
agree. Americans universally want the
00:21:30
border secured. They don't want illegal
00:21:33
immigration and they don't want
00:21:34
fentanyl. So this is the biggest win I
00:21:36
think for Trump which I think everybody
00:21:37
on the panel pointed out and Sacks you
00:21:40
were dead right like when we were seeing
00:21:41
those videos some of them were 5 years
00:21:42
old some of them were recent. Biden
00:21:45
really covered up what was going on in
00:21:46
the border and it took years to figure
00:21:48
out what was exactly going on there. So
00:21:50
that's the biggest win possible. I I
00:21:52
give overall just to be brief a B
00:21:55
for this first 100 days and I give you
00:21:58
know Biden like a C minus. The second
00:22:01
thing that everybody agrees on is they
00:22:02
want to downsize the government. They
00:22:03
don't want waste and fraud. So I think
00:22:04
Doge is the other huge win. The things I
00:22:08
think that could be improved really just
00:22:10
three simple things. The economic
00:22:12
uncertainty is really terrible for
00:22:14
running a business. I'm seeing a lot of
00:22:15
folks in my circle on my podcasts this
00:22:18
week in Startups and here telling me,
00:22:20
"Oh, I don't know how to plan for the
00:22:22
future." And we're going to get into
00:22:23
that with this tariff stuff and the
00:22:24
trade war. And so, I think economic
00:22:27
uncertainty, we have to sort of slow
00:22:30
down and maybe make it easier for people
00:22:32
to understand what the administration is
00:22:34
trying to do. I think rule of law really
00:22:36
matters to people. People didn't like
00:22:38
Biden's pardons. They didn't like
00:22:40
covering up his mental acuity. And I
00:22:42
don't think people like the deportations
00:22:44
without due process. We talked about
00:22:45
that on a previous episode.
00:22:47
Overwhelmingly, people want Trump and
00:22:50
the administration to obey what the
00:22:52
Supreme Court says. They really want
00:22:54
rule of law, the third term talk. Like
00:22:57
eight out of 10 Americans don't like
00:22:58
that kind of talk. Um, and then
00:23:01
conflicts of interest. Obviously, people
00:23:02
hated the Hunter Biden stuff. They hate
00:23:04
the memecoin stuff. And so that's where
00:23:07
it could improve: crisper
00:23:08
communications, more thoughtful
00:23:10
execution, maybe less trolling. I don't
00:23:12
like the White House Twitter account
00:23:14
trolling. And then focus on what got
00:23:16
Trump here. You know, you all said the
00:23:17
same thing. What got Trump in here was
00:23:19
the economy. And one thing that wasn't
00:23:22
mentioned by everybody is the peace
00:23:24
dividend. And Trump is making massive
00:23:26
progress in Ukraine, apparently. I don't
00:23:29
know if it's on the docket today or not,
00:23:30
but stopping the wars and making the
00:23:33
economy boom, those are the two most
00:23:34
important things that he could do. Build
00:23:36
on that. I I totally missed that. You're
00:23:38
absolutely right. That's another one
00:23:39
where I would give Trump an A+. Nat and
00:23:41
I had dinner with POTUS two weeks ago.
00:23:45
And wait, you had dinner with Trump?
00:23:47
This is breaking news. Well, okay,
00:23:49
whatever. Yes. Well, I think it's I
00:23:50
think it's remarkable how much of a
00:23:52
Putin apologist J-Cal has become. I
00:23:54
mean, you want to end the war in Ukraine
00:23:55
now? Well, you're going to you're going
00:23:56
to give it You're going to give it to
00:23:58
Putin? You're not going to stop Putin.
00:24:01
I'm totally in favor of what Trump's
00:24:03
doing in in negotiating a deal to get
00:24:05
more money. Oh, you want to talk to
00:24:06
Putin now? I've always wanted to talk to
00:24:08
Putin. I just don't trust him. But you
00:24:10
you can trust him. Let me tell you what
00:24:12
Trump said. So, there we go. There was a
00:24:14
handful of us at dinner and then he got
00:24:18
up to say a few words at the end and he
00:24:21
reminded me why I was so inclined to
00:24:27
vote for him, which is he talked about
00:24:29
his uncle
00:24:31
and he talked about how his uncle taught
00:24:33
him about the severity of nuclear war
00:24:36
and how people don't understand how
00:24:39
intense and how destructive it is and
00:24:42
the power of these weapons and he left
00:24:47
that speech at the end saying and this
00:24:49
is why I'm so fundamentally against this
00:24:51
thing and it reminded me to your point
00:24:54
Jason it is so easy to forget that
00:24:57
there's only one existential risk
00:25:00
save like aliens coming from the
00:25:02
heavens, right? There's only one
00:25:04
existential risk where all these issues
00:25:06
become fringe issues. You know, you
00:25:07
mentioned rule of law, border security,
00:25:09
foreign direct investment, tariffs, it
00:25:12
all goes out the window in a nuclear
00:25:14
war. And I was like, I am so glad this
00:25:17
guy's in charge because this one issue,
00:25:20
he never wavers.
00:25:23
Yeah. And I think there's all kinds of
00:25:25
complicated moments that could make this
00:25:29
an issue. And this was where my biggest
00:25:31
issue with Biden was was I did not know
00:25:33
who was in control. And I think that
00:25:35
Trump in the first 100 days, to your
00:25:37
point, I think has completely reinforced
00:25:39
that there are no conditions under which
00:25:42
he'll go to war. He has time and time
00:25:43
again showed he'll find the off-ramp. And I
00:25:46
think that that's really healthy for
00:25:47
Americans to see. Yeah. And let me build
00:25:49
on that point with respect to to Ukraine
00:25:51
is we were on a glide path before the
00:25:54
Trump presidency that Biden had put us
00:25:57
on a certain path. Kamala Harris gave every
00:26:00
indication she would have continued it.
00:26:02
What was that path? It was a path of
00:26:04
continued escalation and doubling down
00:26:06
in Ukraine. Recall that it was Biden
00:26:08
himself at the beginning of the war who
00:26:10
said that if we give Ukraine Abrams
00:26:13
tanks and F-16s or ATACMS or
00:26:18
HIMARS or if we allow them to hit targets
00:26:20
inside of Russia, it would lead to World
00:26:23
War III. He actually used the word
00:26:25
Armageddon. So at the beginning of that
00:26:27
administration, they were very concerned
00:26:30
about how an escalatory path could lead
00:26:32
us into direct conflict with Russia and
00:26:35
World War III. And yet, despite that, at
00:26:38
every fork in the road where they had a
00:26:39
choice, they ended up doubling down.
00:26:41
They gave the Abrams tanks. They gave
00:26:42
the F-16s. They gave the HIMARS. They
00:26:44
gave the ATACMS. And finally, when
00:26:47
Biden was a lame duck in his last couple
00:26:48
months in office, they did the most
00:26:50
reckless and irresponsible thing, which
00:26:52
is allow American weapons to be used to
00:26:55
strike targets on Russian soil. Not just
00:26:57
fighting in Ukraine, but on Russian
00:26:59
soil. Moreover, we now know from a New
00:27:01
York Times article that just came out in
00:27:03
the last few weeks that it was American
00:27:05
generals and American intelligence who
00:27:06
are planning this war. So, when you're
00:27:08
talking about striking Russian targets
00:27:10
on Russian soil, it's not just the
00:27:12
Ukrainians using our weapons. They're
00:27:14
using our targeting, they're using our
00:27:16
guidance, they're using our satellites.
00:27:18
I mean, we are deeply integrated in the
00:27:19
kill chain. This is the United States
00:27:21
being a co-belligerent in the war,
00:27:23
hitting Russian soil. That is incredibly
00:27:25
reckless and dangerous. I have no doubt
00:27:27
that if the Democrats were still in
00:27:30
office, we would be in an escalatory
00:27:32
spiral right now with the destination
00:27:34
being World War III. And I do think that
00:27:36
Trump has pulled us back from the brink
00:27:38
there. There's obviously still more work
00:27:40
to do. But I really appreciate the
00:27:42
efforts that Steve Witkoff has
00:27:44
undertaken where for the first time in 3
00:27:46
years, we've at least had direct
00:27:48
diplomacy with the Russians. We weren't
00:27:50
even talking before. We weren't even
00:27:52
talking before. I mean, talking is a
00:27:53
great thing and apparently we're
00:27:55
going to keep supplying them with
00:27:57
weapons as long as they pay for them.
00:27:59
So, it's going to be very interesting to
00:28:00
see how this all hashes out over the
00:28:02
next 100 days or so. Let's keep moving.
00:28:05
I don't think we know that yet. Let's
00:28:07
let's wait and see on that. Okay. Yeah.
00:28:08
I mean, I think that's Yeah. Uh what was
00:28:11
reported, but you're right. We should
00:28:12
wait and see. Okay. Downstream tariff
00:28:15
impacts. We got to talk about this, and
00:28:16
this is why we have you here, Ryan,
00:28:18
since you're in the thick of it. you
00:28:20
tweeted a thread last week about the lag
00:28:23
time uh of shipments from China and when
00:28:26
you were on I guess during COVID you
00:28:28
really educated us to how the supply
00:28:30
chain how the supply chain
00:28:32
works and according to the thread that
00:28:36
you shared somewhere around early June
00:28:38
we're going to expect warehouses
00:28:40
trucking the entire supply chain maybe
00:28:43
to start to seize up or layoffs I don't
00:28:47
know how you would frame it Ryan but are
00:28:49
we past the point of no return with
00:28:52
regard to the supply chain. Is there an
00:28:55
off-ramp for this tariff conflict war
00:28:58
negotiation with China in your mind?
00:29:00
What are you seeing on the streets and
00:29:03
in the purchase orders and the invoices
00:29:06
at Flexport? Definitely not past the
00:29:08
point of no return. I think we're still
00:29:10
right in the middle of the don't judge
00:29:12
the cook while he's cooking is one, you
00:29:13
know, like we'll see
00:29:14
what it tastes like at the end is I
00:29:16
think a starting point here and we're
00:29:17
still they're still in active
00:29:18
negotiations. So I don't think today's
00:29:20
it's not static. Now the world does want
00:29:23
a lot more certainty and that's a big
00:29:25
cause of what's happened here and what
00:29:27
has happened is a 60% decline in
00:29:29
bookings of ocean freight from China to
00:29:31
the US. I mean so that's really really
00:29:33
pretty dramatic like probably exceeding
00:29:36
what was expected. I don't think, you
00:29:39
know, when they issued when they rolled
00:29:40
out the initial reciprocal tariff plans
00:29:42
on on April 2nd, it was meant to be a
00:29:44
54% tariff on China. Then, you know,
00:29:47
there's multiple cycles of escalation.
00:29:49
We ended up at what's now 145% tariff.
00:29:53
So, this is this is a lot higher than
00:29:56
anybody planned for. And so therefore, I
00:29:58
don't think anyone's planning for a 60%
00:30:00
decline in ocean freight. Ryan, let me
00:30:03
ask you a question about that. Are
00:30:05
people actually paying that 145%?
00:30:07
There's been this discussion online and
00:30:09
it's it's sort of unclear from the
00:30:11
administration and from
00:30:14
retailers stuff that's landing that
00:30:16
people ordered before April 2nd. Are
00:30:20
they actually paying the 145% on top of
00:30:22
what's landing? It's it's live now. Um
00:30:24
it is it was based on departure date. So
00:30:26
goods that departed China after midnight
00:30:30
Eastern time on April 9th are subject to
00:30:33
the tariffs upon arrival. And so now
00:30:35
enough time has passed that pretty much
00:30:36
all the ships that are arriving now left
00:30:38
China after April 9th when that started.
00:30:40
Um so yes. So what happens? People are
00:30:42
paying it or are people saying I won't
00:30:43
take delivery because it's Jason you
00:30:45
have you have to pay it Ryan correct me
00:30:47
if I'm wrong but you have to pay it at
00:30:49
the dock in order to get the goods
00:30:50
released. More or less more or less
00:30:52
that's true. They they allow you have a
00:30:55
bond in place so you can pull the goods
00:30:56
out before you pay but it the money's
00:30:58
owed at that time and then you get you
00:31:00
get like a two week time frame to
00:31:01
actually make the payment. But there are
00:31:03
strategies here. a lot of people are
00:31:04
doing that you can use what's called a
00:31:06
bonded warehouse and move cargo into
00:31:08
this warehouse uh and then you only owe
00:31:09
the duties when the cargo leaves that's
00:31:12
what I was asking like is there a hack
00:31:14
here? It's not that. That lets you
00:31:16
defer things and it's very very common
00:31:17
right now people are searching
00:31:19
everywhere for bonded warehouse capacity
00:31:20
because in a bonded warehouse not only
00:31:22
do you defer payment until the cargo
00:31:24
leaves the warehouse but you only owe
00:31:26
the duty amount based on at that date so
00:31:30
if the duties come back down which a lot
00:31:32
of people are betting they will on the
00:31:33
China specific speific duties, you'll
00:31:35
actually lower your tariff burden. And
00:31:37
then there's another hack for this,
00:31:38
which is, in effect, use a Mexican or
00:31:40
Canadian bonded warehouse. So you move
00:31:42
the goods into Mexico and then you
00:31:44
actually only technically import them
00:31:46
into the US at a future date when
00:31:48
tariffs are lower. So I understand a lot
00:31:49
of a lot of companies are doing that
00:31:51
right now, too. Um we're helping some
00:31:53
people with that type of strategy,
00:31:55
but yeah. Is that Sorry, Ryan. Do you
00:31:57
think that the government will they view
00:31:59
that okay that kind of hack and or like
00:32:03
you know like if you look at the GDP
00:32:04
numbers one of the craziest things was
00:32:06
the inventory pull forward that people
00:32:08
did to your point like trying to get as
00:32:10
much stuff into the United States before
00:32:13
April 9th as an example. Yeah. I mean
00:32:16
it's not a hack. Bonded
00:34:17
warehouses have been around for decades
00:32:19
and they're they're very commonly used.
00:32:21
I don't know that it'll be that material
00:32:22
in the scheme of things that it would,
00:32:25
you know, cause a change in the law
00:32:27
around bonded warehouse. So, you don't
00:32:28
think, for example, the Department of
00:32:31
Commerce will have an issue with the
00:32:33
strategy of sending inventory into
00:32:36
Mexico that essentially you're
00:32:37
essentially like, isn't it a workaround?
00:32:41
Like, instead of paying the China
00:32:42
tariff, now you pay a Mexico tariff,
00:32:44
which should be less. Is that the idea?
00:32:46
Well, you can move it into a bonded
00:32:47
warehouse in Mexico even and not pay
00:32:49
Mexican tariffs either. and you just
00:32:51
wait until it imports. But I mean,
00:32:52
what's the Department of Commerce or the
00:32:55
customs to do? It's sort of you just
00:32:57
delayed importing the goods. You've
00:32:58
imported them in the future and you
00:33:00
know, it doesn't I I wouldn't even call
00:33:02
it a hack. It's just sort of like people
00:33:04
people are going to get creative here.
00:33:05
You know what I mean? Like that's the
00:33:06
job. Actually, the government should set
00:33:08
the rules and the rest of us got to
00:33:10
figure out, all right, how are we going
00:33:11
to compete and make money in this
00:33:12
environment that they've created? Ryan,
00:33:14
in that thread you did, which was a
00:33:16
pretty dramatic tweet painting a very
00:33:18
like I don't know, you know, like a
00:33:21
pretty dire situation. Where are we at
00:33:24
in terms of how uh dire this will get or
00:33:28
resolvable? Paint us the best
00:33:31
case scenario and what you expect could
00:33:33
happen in that case or if this gets
00:33:35
extended are we going to see as you know
00:33:37
people are hand-wringing, empty store
00:33:40
shelves Christmas gets ruined and all
00:33:43
these layoffs start happening in the
00:33:45
supply chain take us through the two
00:33:46
scenarios that people are debating yeah
00:33:48
I mean the the the bleak scenario which
00:33:50
is I don't really think it's going to
00:33:52
happen I think that the administration
00:33:53
doesn't want this to be their legacy
00:33:54
that they like created a policy that
00:33:56
just like kind of tanked small business
00:33:58
and supply chain. So, I don't I don't
00:33:59
actually think this is going to happen,
00:34:00
but the bleak scenario is tariffs stay
00:34:03
at this level of 145% on China. The 10%
00:34:06
goes way back up to what it was
00:34:08
originally announced in reciprocal
00:34:09
tariffs. So, there's no like safe haven
00:34:10
for tariffs and trade just falls off a
00:34:13
cliff and a lot of companies go bankrupt
00:34:14
in our in our especially small companies
00:34:16
are the ones that are importing from
00:34:17
China. Reality is like tariffs have been
00:34:20
high on China for a long time. Labor
00:34:22
costs in China are not low. You're not there for
00:34:24
cheap labor. You're there for quality
00:34:25
manufacturing at this point. like
00:34:27
there's much cheaper labor in Southeast
00:34:29
Asia, other parts of the world than
00:34:30
there is in China. So you're in China
00:34:32
because of the manufacturing
00:34:33
capabilities, the ecosystem, not just
00:34:35
for cheap labor. Uh and if you could
00:34:37
have moved, you would have already with
00:34:38
the 25% tariffs from the Trump's first
00:34:41
term were high enough
00:34:43
incentive. And so that's the bleak
00:34:46
scenario is that small business starts
00:34:48
getting wiped out. The ones that are
00:34:49
buying from China and it's a lot of
00:34:51
brands like it's not just Amazon seller
00:34:53
selling stuff that you don't need. It's
00:34:56
like all the brands that you know are
00:34:58
like you know fashion brands, apparel
00:35:00
brands. I had Cuts Clothing on This
00:35:03
Week in Startups last week and he said
00:35:05
there's going to be like if this doesn't
00:35:07
get resolved in like let's say two
00:35:09
to four weeks in his group chats people
00:35:11
are going to start layoffs and they they
00:35:13
can't
00:35:15
physically restart the supply chain in
00:35:18
Vietnam or wherever to make t-shirts. So
00:35:20
Aaron what's your thought on this as
00:35:22
well just bringing you in. Sure. Well,
00:35:24
well, first of all, I mean, Ryan has
00:35:26
supplied me with a high degree of doom
00:35:27
scrolling and uh it's just like a horror
00:35:30
show reading his tweets. First of all, I
00:35:32
I like I would feel better if the
00:35:34
messages out of the administration were
00:35:36
either more kind of consistent or that
00:35:39
there was a logical connection between
00:35:42
do we either want to raise the kind of
00:35:46
you know tariff revenue stream or do we
00:35:48
want free trade like like those things
00:35:50
are are working against each other
00:35:52
because like depending on who you talk
00:35:53
to they they say this is a a mechanism
00:35:55
to bring down income tax which obviously
00:35:57
then by definition means that they
00:35:59
expect the tariffs to sort of persist.
00:36:01
um which is totally different from let's
00:36:03
go negotiate deals that just allow for
00:36:05
the you know free trade to actually
00:36:07
increase and so are we worried about the
00:36:09
reciprocity or we worried about kind of
00:36:11
revenue stream so that's a whole whole
00:36:12
issue you also have this issue which is
00:36:15
the messaging from the government and
00:36:16
this is the meta point I'll make in a
00:36:18
second is about is about how we could
00:36:19
have actually accelerated into the
00:36:21
transformation of the economy but you
00:36:23
know you have folks like Lutnick etc you
00:36:25
know going on on TV talking about the
00:36:27
the end state of our economy which are
00:36:28
are actually probably, you know, fine messages,
00:36:31
but but we haven't seen what that vision
00:36:34
looks like, you know? So, everybody is
00:36:35
kind of confused like does this mean
00:36:37
that we literally go into manufacturing
00:36:38
plants and we're like the ones literally
00:36:40
doing the screws on an iPhone or is it a
00:36:43
bunch of jobs which are next generation,
00:36:45
you know, jobs which is like we're
00:36:46
managing robots and and like shipping
00:36:48
and logistics grows as a result of this
00:36:50
and all of the surrounding kind of
00:36:52
supply chains, you know, start to grow.
00:36:53
So like like you know to like there's a
00:36:56
there's an underlying I mean you know
00:36:58
most people on this call have managed
00:37:00
teams like you do change management you
00:37:02
lead people to the end state that that
00:37:04
you want them to sort of see the
00:37:06
potential in and I think some something
00:37:08
that kind of gets missed is and the part
00:37:10
that kind of confuses me is like I don't
00:37:11
know if if to you know exactly to Ryan's
00:37:14
point like people are in China because
00:37:15
of the ecosystem of manufacturing yet
00:37:17
the messages you get out of the
00:37:18
administration are like oh you know
00:37:20
we're going to have fewer toys at
00:37:22
Christmas time. It's like, no, that's
00:37:23
not the big picture. Like, the big
00:37:25
picture is is like, like, you know, this
00:37:27
is supplying the parts that go into
00:37:29
building a manufacturing plant and
00:37:31
building a car that that allows us to
00:37:33
actually, you know, be even remotely
00:37:35
competitive in car manufacturing. So,
00:37:37
where should in your mind this all lead?
00:37:39
Because you have some thoughts on
00:37:41
American exceptionalism and maybe
00:37:43
skating to where the puck is going. So
00:37:45
for if you were to become an adviser
00:37:47
Yeah. on this as a technology expert and
00:37:50
and somebody who spent their whole
00:37:51
career in it, what would you advise them
00:37:53
to do? I'd get rid of Navarro immediately
00:37:55
and and you would basically say, you
00:37:58
know, mea culpa, like oops. And like
00:38:00
obviously you need to like land that
00:38:02
with some really cool trade deals that
00:38:03
make everybody kind of feel happy. And
00:38:05
you basically say, you know what, like
00:38:07
let's go back to the first two days of
00:38:08
Trump, which is let's announce massive
00:38:11
deals. We're bringing manufacturing here
00:38:14
with Stargate. We're doing TSMC. We're
00:38:16
building Nvidia chips. We're going to do
00:38:18
a deal which is you get like 5% tax uh
00:38:21
rate if you build in America. And so you
00:38:23
just stimulate a a manufacturing boom in
00:38:26
in the country. You know, we
00:38:28
incentivize, you know, automation across
00:38:30
the manufacturing. We use that as a
00:38:32
competitive weapon to go and compete
00:38:34
with the sort of lower-cost
00:38:36
labor that happens
00:38:37
internationally. You know, we we find
00:38:39
every incentive and tool we can. we
00:38:41
deregulate, you allow people to build
00:38:43
these plants and so you don't have to go
00:38:45
through the, you know, three-year EPA
00:38:46
process like like you you just
00:38:48
accelerate from from this position and
00:38:51
you see you see it all as upside and so
00:38:54
then business leaders, you know, if you
00:38:56
go talk to the Fortune 500 company that
00:38:58
actually has to build, you know,
00:38:59
anything right now, you give them a path
00:39:01
to say, listen, we're going to help you
00:39:03
transition away from your current supply
00:39:04
chain and we're going to make it even
00:39:06
more competitive and more compelling in
00:39:07
America to do that. You know, there's a
00:39:10
reason that Elon builds in America like
00:39:12
like he has he's actually made it be
00:39:14
more effective to to be able to to, you
00:39:16
know, bring automation to manufacturing,
00:39:18
to be able to to build locally, but he
00:39:20
wasn't forced to do that. And so so I
00:39:22
would just I would argue like you you
00:39:24
use as many carrots as possible in some
00:39:27
surgical areas. And Chamath, I've heard
00:39:28
your points about the like, you know,
00:39:30
chips, pharma, you know, AI, like in
00:39:32
those surgical areas, we get tough where
00:39:35
where necessary. Um, and and if we have
00:39:37
to do, you know, a couple sort of very
00:39:39
surgical tariffs, you know, to kind of
00:39:41
make make people move the the direction
00:39:43
that we want, that's totally fine. But I
00:39:45
I mean like like it's like the even
00:39:47
arguing the premise is hard because
00:39:48
because we act like we're like like it's
00:39:50
like countries that are screwing us, but
00:39:52
actually businesses are are
00:39:53
independently making decisions about
00:39:55
where they want their supply chain to
00:39:57
exist. They they have in a free market
00:39:59
they've made that decision. They don't
00:40:01
need the government to tell them where
00:40:03
where are they supposed to or where are
00:40:04
they allowed to to have their supply
00:40:06
chain operate. that that ends up with
00:40:08
just lots of economic distortions that
00:40:10
everybody on the the right would have
00:40:12
called, you know, the left socialists
00:40:14
for trying to kind of implement central
00:40:16
planning around supply chains. So that's
00:40:17
my piece. Well, no. What do you think,
00:40:19
Chamath, here of this sort of
00:40:21
reframing/offramp and sort of maybe the
00:40:23
positive spin on it, hey, if you want to
00:40:25
make t-shirts, you know, you want to
00:40:26
make commodity items, have at it. Free
00:40:29
trade, you know, reciprocal tariffs,
00:40:31
great checkbox there. But here is a
00:40:35
series of incentives and a path forward
00:40:37
to do the advanced stuff to do robotics
00:40:40
etc. Let me answer this in a different
00:40:42
way. Okay, a lot of those things he's
00:40:45
actually doing. I think this is where we
00:40:49
are is we're beyond
00:40:52
TDS. There's something that comes after
00:40:54
it. And I think that the mainstream
00:40:57
media has just lost their mind to a
00:41:01
degree that they hadn't even lost their
00:41:03
mind in Trump one. I'll give you a
00:41:05
couple of examples. Um well, one
00:41:08
example, by the way, just a shout out to
00:41:09
our friend, completely brazen,
00:41:12
ridiculous, shitty reporting by the Wall
00:41:14
Street Journal last night. When they
00:41:16
were told that this, you know, this
00:41:18
whole Tesla thing was a total farce,
00:41:20
they continued to publish it. Okay,
00:41:22
fine. They're they were referring to
00:41:24
Elon. The board Yeah. starting a search
00:41:28
to replace Elon and then the board said,
00:41:29
"Well, wait. We told you we weren't
00:41:30
doing that." And they didn't even
00:41:32
mention they communicated that directly
00:41:34
to the Wall Street Journal. The Wall
00:41:35
Street Journal said, "I don't care. I
00:41:37
have an axe to grind." Correct. Yeah. I
00:41:41
think that Trump has a strength, which
00:41:43
is he shapes these potholes for the
00:41:45
mainstream media to fall into. The
00:41:48
downside of that though is that that the
00:41:49
mainstream media then doesn't do the
00:41:51
other part of the job which is to tell
00:41:53
the things that are important. So for
00:41:55
example, we spent a lot of time
00:41:57
breathlessly talking about the MS-13
00:42:00
knuckles of the guy, right? Or then we
00:42:04
spent a bunch of time talking about how
00:42:06
MSNBC blurred out the names of the
00:42:09
placards on the lawn. Okay, but here's
00:42:12
the other part where then they get so
00:42:15
tilted. Here's what they don't report.
00:42:17
They didn't report that, for example,
00:42:19
when Trump took a shot at Harvard, he
00:42:23
also reinforced and strengthened
00:42:25
historically black colleges and
00:42:26
universities. Totally did not get
00:42:28
written. I'll give you another
00:42:31
example. This past
00:42:33
week, Bessent said that the tax bill
00:42:37
will allow you to fully deduct all the
00:42:41
PPE and all of the incidental costs of
00:42:45
building a factory. I heard that, I
00:42:48
immediately went to my wife. She runs a
00:42:50
pharma business. This is exactly what
00:42:52
she's trying to figure out. And we now
00:42:56
are like, how do you build a business
00:42:57
case if this actually gets effectuated?
00:43:00
The point is that
00:43:03
thing would create an absolute economic
00:43:07
bonanza if it were to get
00:43:10
passed. Other than people hearing it on
00:43:12
this pod or randomly maybe finding it on
00:43:15
a direct clip that Bessent puts out on
00:43:17
X, there has been zero coverage by the
00:43:20
mainstream media. Yeah. But but like
00:43:22
like the the uh first of all that Yeah.
00:43:25
Okay. MSM and whatever we want to call
00:43:27
it aside, like, that is still
00:43:30
on the administration for driving a a
00:43:33
change management process that that
00:43:35
causes people to build on momentum and
00:43:37
not causes boards to basically say oh
00:43:40
are we going to pivot our entire supply
00:43:41
chain this week because because Trump
00:43:43
you know didn't get a call back from
00:43:45
from Xi, like that is like
00:43:48
this is really not a TDS MSM issue
00:43:51
this if you talk to Fortune 500 CEOs did
00:43:54
you know about the PPE thing No, but
00:43:55
that's not but like I'm I don't need to
00:43:57
like the thing that I know but there are
00:43:59
many other CEOs that do they they're
00:44:01
controlling trillions of dollars of
00:44:02
capital allocation. It's an important
00:44:04
thing. If we had Mary Barra on
00:44:06
this call and we said, Mary, has
00:44:10
you know Trump increased your ability to
00:44:12
execute and operate and accelerate the
00:44:13
the transition to the US or has he had
00:44:16
headwinds that make it tougher to
00:44:18
navigate right now? Which which way do
00:44:20
you think she'd go?
00:44:22
I think that she would give you a
00:44:23
calculated answer that neither is pro or
00:44:26
con. I think I think that answer changes
00:44:27
by the day. I think that if you talked
00:44:29
to her last week, she would say this has
00:44:30
been a major headwinds and then
00:44:32
yesterday to Chamath's point, they did
00:44:34
this thing where you can depreciate or
00:44:35
fully expense in year one capital
00:44:37
improvements or building out factories.
00:44:39
But also earlier this week, they made it
00:44:42
so that auto parts are not subject
00:44:44
to the tariffs. They created a huge
00:44:46
exemption that wasn't there, everything
00:44:49
like they should have. By the way, this
00:44:51
all speaks to: if this was planned, it
00:44:53
should have been there in the beginning
00:44:54
because these auto companies were
00:44:56
saying, "Hey, this is going to bankrupt
00:44:57
us if we have to pay taxes on it." Where I
00:44:59
would take it, circling back to
00:45:01
communication and making it crisper and
00:45:03
clearer, Aaron, where you are right: the
00:45:06
expectation is it's my job to stay
00:45:08
informed." Okay. As a CEO of my company,
00:45:11
I try to stay informed and you're right.
00:45:13
It is hard because sometimes I find
00:45:15
myself hunting and pecking to find the
00:45:17
things that matter. But I do put a bunch
00:45:20
of that responsibility into the lap of
00:45:22
the people that are supposed to actually
00:45:24
report the facts. They can choose. They
00:45:27
didn't have to run that article about
00:45:29
Elon, which turned out to be total
00:45:31
and horseshit on the front page
00:45:33
of the Wall Street Journal. They could
00:45:34
have talked about what Ryan just
00:45:36
mentioned as the first article and said,
00:45:38
"Here completely changes your ro and roe
00:45:42
calculations for 90% of the S&P 500."
00:45:45
That was not the article they chose to
00:45:47
write and to publish. I but I also think
00:45:49
it comes back to my original point
00:45:50
around the OODA loops, that the
00:45:52
Trump administration is running
00:45:54
these very tight hey let's take an
00:45:56
action let's see what happens let's see
00:45:58
the reaction and then take another
00:46:00
action and Washington's used to doing
00:46:01
all these committees that plan
00:46:02
everything for 5 years or something or
00:46:04
whatever 18 months and then roll it out
00:46:06
slowly and they're going hey let's roll
00:46:08
it out oh crap we're about to cause this
00:46:10
huge problem in the auto manufacturers
00:46:12
and they're all telling us they're going
00:46:12
to go bankrupt okay 3 days later they
00:46:15
push an update and it's kind of it feels
00:46:17
chaotic But yeah, to summarize, Ryan and
00:46:20
Aaron, your position so we can keep
00:46:21
going through the docket. Hey, a little
00:46:23
less shock and awe, maybe a little more
00:46:25
predictability, a little crisper
00:46:27
communication, and Chamath, I think your
00:46:28
position is, hey, maybe the mainstream
00:46:30
media can play a better role here in
00:46:32
focusing us on what matters. That
00:46:34
wouldn't be my takeaway. So, yeah. Okay.
00:46:36
Um, my my What's your takeaway? I mean,
00:46:38
like like zero shaken like like not a
00:46:41
little less like like I like my strategy
00:46:44
would be 100% different. Actually, uh,
00:46:46
Scott Bessent has an incredible podcast
00:46:48
from like September of last year, and he
00:46:50
basically said, you know, Biden got
00:46:53
it all wrong. And I was like listening
00:46:54
to it. I was like, oh, okay, actually,
00:46:56
this is kind of cool. Like, he basically
00:46:58
says, deregulate the US, make it easier
00:47:01
to build manufacturing in the US,
00:47:03
increase the GDP, and then you'll be
00:47:05
able to take in less tax revenue, spend
00:47:06
less in the government. And it was like,
00:47:08
oh, this is actually like a glide path.
00:47:10
We could take we could take the fact
00:47:12
that we did a soft landing relative to
00:47:14
the rest of the globe. We're winning in
00:47:16
AI. We're winning in it in in you know,
00:47:19
you know, number of categories. We
00:47:20
obviously need more energy. We need we
00:47:22
need to bring in manufacturing into the
00:47:24
US. And and so you you have this great
00:47:26
momentum which is where we are the tech
00:47:28
leader, you know, in the world. Let's
00:47:29
like just pour fuel on that. And so to
00:47:32
pour fuel on that, you you do you just
00:47:34
do a series of carrots and and the and
00:47:36
the winds that build a flywheel of
00:47:38
positive energy. Like the reason why I
00:47:40
take I take a little bit of exception to
00:47:42
to Chamath's MSM point is that I think to
00:47:44
some extent Fortune 500 CEOs are not the
00:47:46
ones like oh my gosh like like Rachel
00:47:49
Maddow said this like I'm going to go and
00:47:51
and you know worry about this topic now
00:47:52
like like the information coming at them
00:47:55
is is I'm not talking about the
00:47:56
information that's presented. I'm
00:47:58
talking about the information that's
00:47:59
excluded. How do you get the information
00:48:02
that's not published and shared broadly?
00:48:03
No, no, but but come on. Like like
00:48:05
Goldman Sachs and JP Morgan are not
00:48:06
writing reports on the fact that we
00:48:08
might enter a recession because of of
00:48:10
MSNBC's reporting on this topic.
00:48:13
Like like again, but again, that's not
00:48:15
what I'm talking about. I'm saying like
00:48:17
glad-handing some high-level
00:48:19
prognostication which nobody ever gets
00:48:21
right is in my opinion worthless. What
00:48:24
I'm talking about is the details. So
00:48:26
when you talk about something as narrow
00:48:28
and specific as excluding PPE or
00:48:32
allowing you to double or triple
00:48:33
depreciate something in a given calendar
00:48:36
year, fantastic. That is narrow. It's
00:48:38
precise. It's specific. It's actionable.
00:48:40
And what I'm saying is if I surveyed the
00:48:43
500 CEOs of the S&P 500, dollars to
00:48:46
donuts, the overwhelming majority would
00:48:48
not have known. And had they brushed up
00:48:50
against that somehow in their normal
00:48:52
media consumption to then ask their
00:48:54
teams, the odds of that would have been
00:48:56
zero as well. So I guess then who do we
00:48:59
who do we blame for this, Chamath? Is it the
00:49:01
administration's job or mainstream
00:49:02
media? But I bet you everybody knows
00:49:04
about the blurring out of the stupid,
00:49:06
you know, pictures on the lawn and the
00:49:08
MS-13 knuckle tattoos. Yeah. Okay. So
00:49:10
let's uh wrap up on this just really
00:49:12
lightning round here. Amazon
00:49:14
flip-flopped on a new tariff
00:49:15
notification on their websites. Trump
00:49:19
said he had a great discussion with
00:49:20
Bezos. He solved the problem very
00:49:22
quickly. He did the right thing, good
00:49:24
guy, etc. If you haven't seen this, it's
00:49:28
something that Temu is doing. Here's
00:49:29
what Temu does uh currently today. Nick,
00:49:33
you have that image. If you could pull
00:49:34
it up of just when there is a tariff,
00:49:37
they explain the tariff coming into the
00:49:39
country. They put it as like a line
00:49:41
item. I thought this was actually kind
00:49:44
of cool. I don't know why people take
00:49:45
offense, Brian. This is pretty standard
00:49:47
stuff. So Amazon's competitor Temu
00:49:49
is putting in the import charges. They
00:49:51
don't say tariffs. They don't say taxes.
00:49:53
Import charges. This is like a standard
00:49:54
thing. This happens in other countries
00:49:56
too. What is this? What is Temu? This is
00:49:58
like a dollar store. It's basically like
00:50:00
a dollar. Is it this the last place you
00:50:01
would ever
00:50:02
buy jeans? Yeah, you can buy $12 jeans.
00:50:06
Basically, your left sock from Loro
00:50:08
Piana costs less than Temu's entire
00:50:12
inventory of jeans. The point being, um,
00:50:15
I thought this was actually a plus for I
00:50:19
think they totally misplayed this. They
00:50:21
they they did it, they rolled it out,
00:50:23
then they got criticized. I think they
00:50:24
were called a treasonous company from
00:50:25
the White House press uh, you know, by
00:50:27
the press secretary. Uh, they totally
00:50:29
misplayed this because they should have
00:50:30
gone and leaned into it and said, "Yeah,
00:50:32
we're showing you all these tariffs when
00:50:34
you buy from China. If you buy from
00:50:36
America, you don't have to pay any
00:50:37
tariff and look at all these other
00:50:39
products." Come on. Come on. That would
00:50:40
have lasted three and a half seconds.
00:50:41
This is exactly consistent with the
00:50:43
other issue which is they're playing
00:50:45
whack-a-mole. Okay, we're going to do
00:50:46
something with the automakers. We're
00:50:48
going to we're going to try and solve
00:50:49
some problem with Amazon. Like like this
00:50:52
is a sign that that it like it's not a
00:50:55
good strategy if you have to do this
00:50:56
much whack-a-mole. Like like they they're
00:50:58
not like they can't cover up what Amazon
00:51:01
is going to end up dealing with because
00:51:02
there's going to be 500 other retailers
00:51:04
that that don't get the call with Trump.
00:51:06
So, so this is like that that like to me
00:51:08
that's evidence of of clearly this they
00:51:11
didn't think through the entire
00:51:12
downstream set of of of conditions that
00:51:14
are going to change as a result of this.
00:51:16
Sure. Yeah. I I thought this was a big
00:51:19
win uh Chamath because they could then
00:51:23
have Amazon Here's um a mockup somebody
00:51:25
made. I'll pull it up here. It was
00:51:27
interesting. They could to Aaron's point
00:51:30
just show hey here's a bunch of American
00:51:32
companies by American when you do a
00:51:33
search. Ryan's point. I'm sorry, Ryan's
00:51:36
point. Hey, here's what it might look
00:51:37
like. Pull pull that um the OralB
00:51:39
toothbrush one up, Nick, if you got it
00:51:41
right there. So, somebody mocked this
00:51:43
up. I think this could be the hugest
00:51:44
win. You could have the retailers do buy
00:51:48
American, buy it once, buy a high
00:51:50
quality product from America. If you
00:51:52
look here, we don't have the products.
00:51:54
It wouldn't work. We don't have the
00:51:56
products.
00:51:58
Well, I mean, we do have for some
00:51:59
products, you know, American-made
00:52:01
products. Uh, you know, I buy my boots
00:52:03
from Danner and those are all American.
00:52:05
Yeah. So, we should just go back to
00:52:07
communism and we're all going to make
00:52:08
our our shoes. Like, it's like that's
00:52:10
like we we're in a global market. Like,
00:52:12
we buy from everywhere. Chamath, any
00:52:14
thoughts on this? I mean, obviously
00:52:16
there's the whack-a-mole angle. There's
00:52:19
buy American and be proud of it. There's
00:52:22
communication. Here's the narrow
00:52:23
question. I I got a bunch of emails from
00:52:26
people and a bunch of them were Amazon
00:52:29
sellers and I don't know Nick if you can
00:52:32
find it but I posted their comments and
00:52:34
I reshared them just to kind of
00:52:36
highlight the issues that they were
00:52:38
going through and at the core of it was
00:52:41
a feeling by them that Amazon had
00:52:45
abandoned them as American purveyors and
00:52:48
sellers of goods and that Amazon on the
00:52:52
margins had intended
00:52:54
to
00:52:56
help competitors from abroad come stand
00:53:00
themselves up and compete and
00:53:03
essentially cannibalize on price and
00:53:04
margin. This is my view completely and
00:53:06
that this is the biggest opportunity
00:53:08
that I think the Trump
00:53:10
administration's flying at 40,000 ft
00:53:11
doing macro-level negotiations and look
00:53:13
and failing to see some of these micro
00:53:15
optimizations that are really really
00:53:16
real. So you in the United States you
00:53:18
can import goods as a foreign company.
00:53:21
You do not have to create
00:53:22
an LLC or any sort of registered entity
00:53:25
in the United States to import goods.
00:53:27
Sometimes they say, "Oh, Americans pay
00:53:28
the tariff." Like that is not true. In
00:53:31
many many cases, the foreign company
00:53:32
just imports this stuff and they sell on
00:53:34
Amazon. And when they get caught
00:53:36
cheating, they can they can lie about
00:53:38
the valuation and pay a lower tariff.
00:53:40
They can change the classification and
00:53:41
pay a lower tariff. They can import
00:53:43
stuff that, you know, is harmful to
00:53:44
children, has lead paint, whatever
00:53:46
else. There's no enforcement at all. You
00:53:48
can't. So you're saying Amazon third
00:53:50
party is like a bit of a backdoor to
00:53:52
abuse the system, right? I mean, Amazon
00:53:53
is sort of just playing the game that's
00:53:55
on the field, but they're this is legal
00:53:57
in the United States is these companies
00:53:58
import stuff and I think it's 60% of all
00:54:00
of all the sellers on Amazon are these
00:54:02
Chinese registered. They're not
00:54:03
registered in the United States at all.
00:54:04
Just Chinese companies. Which sounds
00:54:06
profoundly unfair in terms of the playing
00:54:09
field. Can we just take a step back and
00:54:10
also acknowledge that we are talking
00:54:12
about a level of detailed issues that we
00:54:16
would never have talked about 6 months
00:54:17
ago. That there was
00:54:19
no interest in even bringing this up.
00:54:22
Like if if Ryan wanted to bring up the
00:54:25
sort of hollowing out of
00:54:28
American salesmanship, let's say, if you
00:54:31
will, because of like this arbitrage
00:54:33
that Amazon does for GMV. That would
00:54:36
have been a snoozefest. except today it
00:54:39
can actually get a lot of awareness and
00:54:41
Aaron mentioned this and I've mentioned
00:54:43
this before but like I think that there
00:54:45
are four things that really matter
00:54:46
batteries, AI, pharma APIs, and rare earths
00:54:50
that now is on the agenda I think the
00:54:51
the positive way to look at this is the
00:54:54
American economy is too complicated if
00:54:56
you had waited Aaron for a study for all
00:54:59
of the implications we would have been
00:55:01
waiting forever and nothing would have
00:55:03
happened and I think that we've made
00:55:05
macro-level moves, you're right, and now we
00:55:07
are finding what the implications are and
00:55:10
course correcting in real time. And I
00:55:12
hope what happens though is when we find
00:55:14
these big thorny issues, I think the
00:55:16
Amazon thing is a is a pretty
00:55:18
interesting issue actually about like
00:55:19
American competitiveness. Now the
00:55:21
question is do we follow through and get
00:55:23
to the root cause of it and and fix it.
00:55:26
And those feedback loops are there. I
00:55:27
mean the Trump administration is going
00:55:28
to act on this and if then there's an
00:55:30
act that's coming out of Congress as
00:55:31
well to shut down the foreign importer of
00:55:33
record. So those feedback loops are
00:55:34
there in ways that I don't I don't know
00:55:36
if they were there in the past. Go
00:55:37
ahead, Aaron. We can totally chaos monkey
00:55:38
the the economy and just see what
00:55:41
breaks. I think that the you know you
00:55:44
you you sort of phrase the we could, you
00:55:46
know, do the research paper and we could
00:55:48
do, you know, Aspen Institute as a bad
00:55:50
thing, but also it can be a bad thing if
00:55:53
you're the small business owner right
00:55:54
now who has 30 employees and you just
00:55:56
literally don't know what you're going
00:55:57
to do next month. And so, so that that's
00:56:00
that's sort of then the argument to not
00:56:02
counterbalance. That's like why you do
00:56:04
have some bureaucracy and and you don't
00:56:05
chaos monkey the economy and why Rand
00:56:07
Paul is literally saying we shouldn't
00:56:09
actually let you know have unilateral
00:56:11
you know control over tariffs. So you
00:56:13
know interesting that that like dynamic
00:56:16
there. All right, you want to uh wrap us
00:56:19
up here or you want to pass? Oh, am I
00:56:22
still on the pod? Yeah. Well, you turned
00:56:25
your camera off. So, look, I spent 80
00:56:28
minutes debating this topic with Larry
00:56:30
Summers three weeks ago.
00:56:32
The point I made then is that we had in
00:56:36
this city for 25 years a globalist
00:56:38
consensus on trade that distorted a lot
00:56:42
of outcomes. And I don't need to rehash
00:56:44
that debate, but I'll just recall that
00:56:47
Larry Summers's main argument for why
00:56:50
this would not work out is that the
00:56:52
market was down. Do you remember that?
00:56:53
That was his evidence. Yeah. That this
00:56:56
wasn't going to work. Okay? And it was
00:56:58
all about the market not pricing in
00:57:00
lower expectations. Well, guess what?
00:57:02
The market is actually up since
00:57:04
Liberation Day on April 2nd. So, what
00:57:07
happened 3 weeks ago was basically a
00:57:10
panic in the market over this policy and
00:57:13
the media has been trying to fuel that
00:57:15
panic. Now, what I said as well is we do
00:57:18
have to stick the landing on this. I
00:57:20
mean, President Trump shifted the
00:57:21
conversation away from this globalist
00:57:24
consensus and he's now redefined the
00:57:27
debate, but it is now up to Scott
00:57:29
Bessant, the Treasury Secretary, Howard
00:57:31
Lutnick, the commerce secretary, Jamieson
00:57:33
Greer, the US trade rep, and so on, the
00:57:35
Trump trade team, to now negotiate these
00:57:38
deals, stick the landing. And I agree
00:57:40
with you to the extent that the sooner
00:57:42
that is done, the better because it is
00:57:44
good to provide business certainty. But
00:57:47
the idea that so far this hasn't worked.
00:57:50
I think again the main argument against
00:57:51
that was the market reaction that now
00:57:54
the market's net positive. So I think my
00:57:56
point is just we need to give this time
00:57:59
to work. I think it's too soon to be
00:58:01
judging this policy as if it hasn't
00:58:02
worked yet. It needs to be executed
00:58:04
properly. And quite frankly, Ryan, I
00:58:06
mean, I remember the last time you were
00:58:07
on this pod, you were coming on about,
00:58:11
wasn't there like some union deal that
00:58:13
was supposed to shut down all the ports
00:58:14
and all the shelves would be empty? That
00:58:16
never happened either. It did happen. It
00:58:18
did. They shut down for 3 days. I don't
00:58:20
Okay. I don't remember the the shelves
00:58:21
being empty, which is now the new panic
00:58:24
the media is trying to create. So, look,
00:58:26
there's a lot of pants-wetting that's
00:58:28
occurring here that's being fueled by
00:58:29
the media. Well, well, I I do want I
00:58:31
want I do want to I just want to
00:58:32
bookmark one one thing. The only thing I
00:58:33
actually um I was more frustrated
00:58:35
listening to the Larry Summers and and
00:58:37
you conversation because I was like, why
00:58:38
Larry make this point like come on.
00:58:40
Like, what? Don't go down the WTO rat
00:58:42
hole. Like, that's that's not relevant.
00:58:43
So, here here's the other thing. Of
00:58:45
course, it's relevant. It's how we got
00:58:46
here. No, no. Like like as in like
00:58:49
that's 25 years ago. Like let's worry
00:58:50
about literally today and like what what
00:58:52
do we do going forward today? And the
00:58:55
only thing I just want to say because I
00:58:57
because I I do think that that you know
00:58:59
I appreciate your point about hey
00:59:01
there's like you know everybody's
00:59:02
freaking out whatever but to be totally
00:59:04
fair that some of that freak out whether
00:59:07
we can decide how emotional it needs to
00:59:08
be is the reason that then Trump walks
00:59:10
back the things that then cause the
00:59:12
market to correct. So, so we can't just
00:59:14
say the market's back and and like see
00:59:16
we didn't need to freak out cuz it was
00:59:18
literally I said we have to make the
00:59:19
deals. We have to stick the landing.
00:59:21
But look, China over the last 25 years
00:59:23
has been able to strategically
00:59:25
annihilate our rare earth processing
00:59:28
capability and our ability to cast rare
00:59:30
earth magnets. We just sat back and
00:59:32
watched as the market basically went to
00:59:35
the lowest bidder which was being
00:59:36
subsidized by the Chinese government
00:59:38
which the WTO allowed them to do. And
00:59:40
now we have a critical dependency. Yep.
00:59:42
In our supply chain on China for
00:59:45
basically every electric motor in every
00:59:47
product, including cars. That was crazy.
00:59:49
We should not have allowed that to
00:59:50
happen. How are you going to change
00:59:52
that? So, we needed to shift the
00:59:54
political conversation to recognize the
00:59:57
ways in which free trade led to unfair
01:00:00
trade and created unacceptable
01:00:02
dependencies on the for the American
01:00:04
economy. Wait, wait, just I just want to
01:00:07
make one one point. It's more than that.
01:00:09
point and then I'll but this is the
01:00:11
national security of the United States
01:00:13
that that's at stake. Let's go take your
01:00:16
favorite pet issue. China invades
01:00:18
Taiwan. Okay. And we have to take a
01:00:21
side. Jason, my pet is hold on. Let me
01:00:24
just finish. And the and the and the
01:00:27
Chinese say here are the implications of
01:00:30
supporting Taiwan on this A, B, C, and
01:00:33
D. You don't get any pharma APIs. You
01:00:35
don't get any rare earths. You don't get
01:00:36
any batteries.
01:00:38
Okay, it'll it'll send life back 50
01:00:41
years. Or let's say China and India get
01:00:44
into a fight and we're forced to pick a
01:00:46
side. Same situation. The point is
01:00:48
there's all these scenarios that we
01:00:50
never even
01:00:52
considered us being able to have
01:00:55
strategic optionality to make the
01:00:56
decision that's morally and ethically
01:00:58
right for the United States. And I think
01:01:00
that we have learned through this lens
01:01:03
that these are huge issues. The thing
01:01:05
that the Chinese did that was so
01:01:06
brilliant, which we still don't have an
01:01:08
answer for is they have these national
01:01:10
champions. And being a national champion
01:01:13
allows you, and we'll talk about this in
01:01:14
AI, it allows you to blur the lines
01:01:17
between the public and private
01:01:18
partnership. It allows you to blur the
01:01:20
law. It allows you to blur capital. And
01:01:22
I'm not saying we have to do that, but
01:01:25
what I am saying is we need to have our
01:01:27
own answer to it. And that was never on
01:01:30
the table until April 9th.
01:01:33
All right. So yeah, but but like 100%
01:01:36
like do that strategy and then and then
01:01:38
and then don't have a mad rush. What is
01:01:40
the strategy? No, no, no. Because
01:01:41
because that's a you if you have a, you
01:01:43
know, Ryan, what what is the number? I
01:01:44
don't know, trillion of imports or
01:01:45
whatever, like you don't need everybody
01:01:47
then than then jamming the system to
01:01:49
build their supply chain, you know, in
01:01:51
the US to solve that problem
01:01:52
immediately. If we're sitting here in 9
01:01:54
months and you're saying this and there
01:01:56
are no deals, I would say that you're
01:01:58
right. What Howard Lutnick said last week
01:02:01
and again we may have all gotten caught
01:02:03
up in the knuckle tattoos and we missed
01:02:04
this but he was very clear we have a
01:02:07
country a deal is already done we're
01:02:09
convening parliament it's going to be
01:02:11
the first of many so for all we know
01:02:14
there's like 30 deals that are waiting
01:02:16
in the wings and the first one will set
01:02:18
the tone and and I think that Sacks is
01:02:21
right here which is it's way too early
01:02:23
to declare defeat and that it was quote
01:02:25
unquote chaos I think if we're sitting
01:02:27
here in 9 months and foreign direct
01:02:29
investment has shriveled up and domestic
01:02:32
investment has shriveled up because
01:02:34
there is just no continuity. You have a
01:02:36
claim. But that's No, no, because I I I
01:02:38
don't I don't think that'll happen. I
01:02:40
don't I don't think that'll happen. I
01:02:41
think we will end up in like I'm I'm
01:02:42
with Ryan like we will end up in a good
01:02:44
spot because we'll iterate through this.
01:02:45
My my only point is there's an
01:02:46
alternative path that that could have
01:02:48
occurred. It could have been done in a
01:02:50
more thoughtful well-communicated
01:02:52
pattern instead of hey let's do barrel
01:02:53
rolls with the airplane. I don't
01:02:55
disagree with you, Aaron, and I can tell
01:02:56
you we've already started to see layoffs
01:02:58
and nobody wanted to even initiate the
01:03:00
barrel roll, guys. Yeah, listen, we get
01:03:03
it. It's like, hey, I don't want
01:03:05
anything to change. I think we agree to
01:03:07
disagree on this. We got to move on to
01:03:09
the next one. Hold on. This one last
01:03:11
point. Excuse me. Excuse me. You didn't
01:03:13
call me for 40 minutes. I just want to
01:03:14
make one final point. Aaron, here he
01:03:16
goes. Aaron, where were you with this
01:03:18
perfect plan? Yeah. Where were you with
01:03:21
this perfect plan before Liberation Day?
01:03:23
I was telling Kamala about it. You were
01:03:25
telling Kamala. Okay, great. They were
01:03:27
having nobody nobody ties and they were
01:03:31
talking about this specific issue. All
01:03:33
the people who suddenly know what the
01:03:35
perfect plan is and how to perfectly
01:03:37
execute it. No barrel rolls had nothing
01:03:39
to say about this topic for 25 years.
01:03:41
Now all of a sudden they've come forward
01:03:43
with their perfect plans. I would say
01:03:46
that's victory for Trump. Finish. It's
01:03:47
like look at this. The best thing of all
01:03:49
of this is you've got the Liberals
01:03:50
embracing Milton Friedman in their
01:03:51
backgrounds on there. I love it. Yes. Uh
01:03:54
All right. Listen, we're going to agree
01:03:55
to disagree. We're going to agree to
01:03:56
disagree on this one. You know, the
01:03:58
Liberals love the stock market. Listen,
01:04:00
Sacks, Kamala is coming on next week. We're
01:04:02
going to make some cocktails. It's going
01:04:03
to be wonderful. You we'll ask her some
01:04:06
direct questions about it. But I want to
01:04:08
talk about AI agents. 2025 shaping up to
01:04:11
be the year of AI agents. Tons to talk
01:04:14
about here. Open AI is planning to
01:04:16
charge between two and 20K a month for
01:04:18
different levels of AI agents. These would be
01:04:21
basically cron jobs they would run in
01:04:22
the background and do things for your
01:04:24
company that uh humans are doing right
01:04:26
now. You may have heard of this uh
01:04:28
agentic tool. Again agentic is just a
01:04:30
fancy word for agent, which is a fancy
01:04:32
word for like a cron job that just runs
01:04:35
uh perpetually. Yep. And uh Manus is uh
01:04:39
the company in China that started this
01:04:41
weirdly Benchmark invested in it, that's
01:04:43
created a whole bluff on the side and
01:04:46
Manus's website has a really good
01:04:47
visualization of what these agents would
01:04:49
look like. So first of all I think
01:04:50
you're giving a little too much credit
01:04:51
to Manus. They didn't come up with the
01:04:53
agents but I do think that they have a
01:04:55
very good demo and it's hard to know
01:04:57
exactly how real it is because not
01:04:59
everyone's used it and it's from China.
01:05:01
It's from China. I'll get to that in a
01:05:02
second. If you go to their website, you
01:05:05
can see a bunch of their demos. And I do
01:05:08
think that what they deserve credit for
01:05:09
is advancing the ball on the UI
01:05:13
paradigm. And it's not that other people
01:05:16
weren't doing this. I mean, I think
01:05:17
notably Anthropic was doing this with
01:05:18
its operator product, but the basic idea
01:05:21
is that you've got this two-pane view
01:05:23
and in one window, you've got the
01:05:25
standard chatbot interface and then in
01:05:27
the other view, you can see what the
01:05:29
agent is doing. And that agent has the
01:05:32
ability to toggle between it currently
01:05:34
four apps. There's search, browser,
01:05:37
code, terminal, and document editor. And
01:05:40
so when you give Manus a task, the first
01:05:42
thing it does is create a to-do list in
01:05:44
the document editor. You can kind of see
01:05:46
it there. And then it works sequentially
01:05:48
to achieve each of those tasks and then
01:05:49
puts an X on them there. And you can
01:05:52
kind of see it working. And I think
01:05:53
what's cool about the demo is just the
01:05:55
way that it seamlessly toggles between
01:05:57
those four apps. and you can see what
01:05:59
the AI agent is doing, you know, and
01:06:01
it's browsing the internet, it's
01:06:02
searching for things, it's writing
01:06:04
documents, it's crossing things off its
01:06:06
to-do list. Now, I think it's pretty
01:06:08
easy to imagine where this goes, which
01:06:10
is you'll be able to connect an agent to
01:06:13
all of your SaaS apps. So, it won't just
01:06:15
be four applications. It'll now be
01:06:17
connected to dozens of applications,
01:06:19
including ones that already have your
01:06:21
data. And it's going to know what
01:06:23
actions it's possible to take in those
01:06:24
apps. So when it creates its to-do list,
01:06:26
there's a much wider range of things
01:06:28
that it can accomplish. And in fact,
01:06:30
there's a new standard called MCP which
01:06:33
is taking off like wildfire which is
01:06:35
built specifically to enable agents to
01:06:37
connect with applications and understand
01:06:39
the data and understand the actions that
01:06:40
are possible in those SaaS applications.
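The agent pattern described above, a to-do list the agent works through sequentially while invoking discovered tools, can be sketched roughly as follows. This is a minimal illustration, assuming an MCP-like descriptor shape (name, description, JSON Schema inputs); the tool names, tasks, and dispatch loop are hypothetical, not the actual Manus internals or the MCP SDK.

```python
# Minimal sketch of a to-do-list agent with MCP-style tool discovery.
# All names here are illustrative assumptions, not a real SDK.

def search(query: str) -> str:
    # Stand-in for the agent's search app.
    return f"results for {query!r}"

def write_doc(text: str) -> str:
    # Stand-in for the agent's document editor.
    return f"saved: {text}"

# Each tool advertises a description and a JSON Schema for its inputs,
# mirroring how MCP lets agents discover what actions are possible.
TOOLS = {
    "search": {
        "description": "Search the web",
        "inputSchema": {"type": "object",
                        "properties": {"query": {"type": "string"}}},
        "fn": search,
    },
    "write_doc": {
        "description": "Write text to the document editor",
        "inputSchema": {"type": "object",
                        "properties": {"text": {"type": "string"}}},
        "fn": write_doc,
    },
}

def run_agent(todo):
    """Work through the to-do list sequentially, crossing off each task."""
    log = []
    for task, tool_name, args in todo:
        result = TOOLS[tool_name]["fn"](**args)  # invoke the chosen tool
        log.append(f"[x] {task}: {result}")      # mark the task done
    return log

log = run_agent([
    ("find flights", "search", {"query": "SFO to MIA"}),
    ("draft itinerary", "write_doc", {"text": "Day 1: arrive MIA"}),
])
```

A real agent would have a model choose the next tool and arguments at each step; here the plan is fixed so the loop structure stays visible.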
01:06:43
So look, Manus is just at the tip of the
01:06:45
iceberg here. I think this will become a
01:06:46
very standard UI paradigm. That's the
01:06:48
reason why I mention it. Not because I
01:06:50
am predeclaring them to be the winner in
01:06:52
the space, but just because I think
01:06:54
there's a lot of talk about agents and I
01:06:56
think it's hard to conceptualize what
01:06:59
that means without just seeing it
01:07:00
visually. A great great summary there,
01:07:02
Sacks. And Aaron, I want to get your
01:07:04
thoughts on it because obviously you're
01:07:06
running Box and and you have your your
01:07:08
finger on the pulse of this. We actually
01:07:10
started building one of these in our
01:07:11
venture firm. We have 20,000
01:07:14
applications a year and we have updates
01:07:16
coming in from investments. We are now
01:07:18
taking those, Sacks and Aaron, and we are
01:07:20
having an agent sort them and then look
01:07:23
for competitors and compare them to the
01:07:25
last update and we're looking into our
01:07:27
Notion, our Coda, and saying what else have
01:07:30
what other communications have we had
01:07:32
what questions should we ask about the
01:07:33
startup and about their strategy and
01:07:35
then we're presenting that in Slack to
01:07:37
our team. So this is coming fast and
01:07:39
furious and we spend I don't know
01:07:42
probably 15 minutes on each of those
01:07:43
incoming applications. You start doing
01:07:45
the math on that. Talk about 5,000 hours
01:07:47
of work. Aaron, what are you seeing on
01:07:49
the street? What are you doing at Box in
01:07:51
terms of agents landing right now in Q2
01:07:55
of 2025? Uh, yeah. I mean, I think I
01:07:58
think Sacks represented it well, which is
01:08:00
which is, you know, you have to now
01:08:02
think about AI as as effectively being
01:08:04
able to do anything on a on a computer
01:08:06
or another piece of software as as a
01:08:08
human can do. And the little distraction
01:08:11
that that I think happened two years ago
01:08:13
after the ChatGPT moment was we sort of
01:08:15
thought about that as oh we're just
01:08:17
going to now you know do like typing
01:08:18
information retrieval and that's a new
01:08:20
paradigm for user interfaces let's say
01:08:22
so you just like talk to your software
01:08:24
and you like search Zillow via chat that
01:08:27
was sort of a little bit of a
01:08:28
distraction that that's super helpful
01:08:29
like when you want basic information
01:08:31
lookup or whatnot the big breakthrough
01:08:32
was starting to think through these
01:08:34
things as as full you know effectively
01:08:37
uh uh you know agentic systems that that
01:08:40
operate on any amount of data, any
01:08:42
amount of tools for as long as you want
01:08:44
to complete any task that you want. And
01:08:46
this is sort of the big year where
01:08:48
agents are starting to, you know, enter
01:08:50
the vocabulary of enterprises, of IT
01:08:53
people, of, you know, larger and and
01:08:55
certainly small organizations. Um, and
01:08:58
and it kind of requires you to have a
01:08:59
little bit of a of a reset moment on how
01:09:01
you think about AI, which is which is
01:09:03
it's not just now a kind of a co-pilot
01:09:05
that you talk back and forth with. it's
01:09:07
actually something running behind the
01:09:09
scenes that's now actually starting to
01:09:11
deliver, you know, real automated, you
01:09:14
know, kind of work for you. And so lot
01:09:16
lots of implications like, you know,
01:09:18
massive implications to what the
01:09:19
software business model is in the
01:09:20
future. You know, I I I would argue, you
01:09:22
know, strongly that's a massive TAM
01:09:23
increase um because now software starts
01:09:26
to go after labor spend. It completely
01:09:28
changes the dynamics of then, you know,
01:09:29
how do you build a moat in a world of AI
01:09:31
agents? Um uh but but I think unpack
01:09:34
that piece there, Aaron. You said
01:09:35
something very interesting how you how
01:09:37
software then is going to go after human
01:09:40
spend. Yes. Explain that concept. Unpack
01:09:43
it for a second. David and I, you know,
01:09:46
we we we go back way back in SAS land,
01:09:48
but like you used to basically just, you
01:09:50
know, you built you built a piece of of
01:09:52
software and you sell it for the number
01:09:54
of of people in the organization. And
01:09:56
so, you know, company has 500 employees
01:09:59
and uh and you sell that thing for,
01:10:01
let's say, $10, you know, a user a
01:10:03
month, you know, 120 bucks a year and
01:10:05
you make 60,000 bucks. Um, and so so
01:10:07
that that's kind of the business model.
01:10:09
Uh, now when your software actually
01:10:11
brings the underlying workflow to the
01:10:14
customer or the underlying outcome to
01:10:16
the customer, so you know that that
01:10:18
company might have 10 lawyers and so
01:10:20
previously if you were selling software
01:10:21
for lawyers, you had a maximum amount of
01:10:23
10 seats that you could sell. Now all of
01:10:25
a sudden if your AI agents are doing the
01:10:27
equivalent of let's say paralegal work
01:10:29
or some form of professional services
01:10:31
all of a sudden you might be able to
01:10:32
sell a multiple of the initial kind of
01:10:34
10 seats that you would have sold
01:10:35
previously. So you see it as a huge
01:10:37
opportunity because now you're not
01:10:39
enabling a human to be 5% more
01:10:41
productive. You're replacing a human or
01:10:42
you're replacing one out of 10. Yeah.
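The back-of-envelope math behind this shift can be made concrete. The seat-based figures below reuse the numbers from the conversation (500 employees at $10 per user per month); the agent-side figures are purely hypothetical placeholders for pricing against labor rather than headcount.

```python
# Seat-based SaaS: revenue is capped by headcount.
# Figures from the conversation: 500 employees at $10/user/month.
seats = 500
price_per_user_month = 10
seat_revenue = seats * price_per_user_month * 12  # $60,000/year, as stated

# Outcome-based: the software prices against the labor it performs.
# Hypothetical figures: agents doing paralegal-style work a 10-lawyer
# team previously couldn't afford to staff.
agent_hours_per_year = 2_000   # assumed volume of automated work
price_per_agent_hour = 30      # assumed rate, well under human cost
agent_revenue = agent_hours_per_year * price_per_agent_hour

print(seat_revenue)   # 60000
print(agent_revenue)  # 60000, from one workflow, on top of the seats
```

The point is not the specific numbers but that the second line of revenue scales with work performed, not with the fixed seat count.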
01:10:44
And and actually I'm going to take a
01:10:45
massive I'm going to do an underscore on
01:10:47
this point though. I I don't I don't
01:10:48
like the word replace because I I think
01:10:50
actually most of the upside is actually
01:10:52
going to be for companies that now
01:10:53
deploy labor at things that they
01:10:55
wouldn't have deployed labor at before.
01:10:57
And and you know, maybe I'm biased from
01:10:58
the view we have, but most of our
01:11:00
conversations with customers are when
01:11:02
they have AI agents, it they can
01:11:04
actually now go and actually deliver
01:11:06
work in areas that would have been
01:11:08
unaffordable previously. So they
01:11:09
actually they weren't doing the work.
01:11:12
True in logistics. So, like we we're
01:11:14
making thousands of phone calls a day
01:11:15
using AI, calling truck drivers, and we
01:11:18
if we have a load, we've got 400,000
01:11:20
truck drivers using the mobile app, the
01:11:22
Flexport mobile app. I don't have enough
01:11:24
loads to keep them all checking it every
01:11:26
day to see if there's a load that
01:11:28
matches them. And if they don't check
01:11:29
it, they're useless to me now.
01:11:32
And it was too expensive to call the
01:11:33
truck driver and have a human call and
01:11:35
talk to them, even if it's a human in a
01:11:36
call center in the Philippines. Whereas
01:11:38
with AI, it's almost free. I'm calling
01:11:40
thousands of them a day going, "Hey,
01:11:42
this load looks like it's a good match
01:11:43
for you. Are you interested?" And and
01:11:45
then we activate them on the platform.
01:11:47
Um, that's new work that wasn't going to
01:11:49
happen before, not just a replacement.
01:11:51
Yeah, I I think that I think probably I
01:11:53
think like we we have, you know, in the
01:11:55
valley, unfortunately, we've been
01:11:56
co-opted a little bit somewhat with with
01:11:58
a little bit of a doomer, you know,
01:12:00
mindset in some areas and and we think
01:12:02
of then AI is, okay, like like it's all
01:12:04
fixed pie, it's going to replace things.
01:12:06
And on the ground with large
01:12:08
enterprises, the vast majority of the
01:12:10
use cases are, you
01:12:13
know, it's the ability to finally review
01:12:14
the contracts that we never got around
01:12:16
to reviewing. It's the ability to
01:12:17
finally automate an invoice process that
01:12:19
we never did. It's the ability to go in
01:12:21
and just create marketing campaigns in
01:12:23
every language that we never got around
01:12:25
to. And so so I think that'll probably
01:12:27
be actually like 90% of the usage of AI
01:12:29
in the future will be things that if we
01:12:31
look back and we snap the line right now
01:12:32
and we said this is what knowledge work
01:12:34
is today. 90% of AI usage will be things
01:12:37
that we don't do today. 10% will will
01:12:39
replace you know what we're what we're
01:12:40
doing in some areas. I think that's the
01:12:42
right take because, Chamath, I can tell you in
01:12:45
our firm we would never have associates
01:12:46
or researchers or analysts. Sacks, you also
01:12:49
were in this line of work as well
01:12:51
venture capital you would never have
01:12:52
them review legal documents. That's
01:12:54
something lawyers would do in the legal
01:12:55
department. But now because of AI, we
01:12:58
have them say, "Here's the SAFE. Here's
01:12:59
the term sheet. Here's the edited
01:13:01
version, dump it all in, find out what
01:13:04
the changes are, what are the deltas
01:13:06
here, and and then let's have a
01:13:07
discussion about what the founder
01:13:08
changed in a standard document, and we
01:13:11
don't have to bother with an attorney,
01:13:13
and maybe you wouldn't have even checked
01:13:14
those documents if you were, you know, a
01:13:16
seed fund or an angel fund. You would
01:13:18
just go along for the ride because
01:13:19
you're the 10th person signing the
01:13:21
documents." So what what do you think
01:13:22
Chamath here in terms of the premise
01:13:24
that maybe it's 10% replacing work
01:13:27
that's happening but this is blue ocean
01:13:29
and we're going to do 90% of like new
01:13:30
stuff that we just never got to. Yeah, I
01:13:33
tend to I tend to believe that's true. I
01:13:34
think the customers that we sell into at
01:13:38
8090 are largely large enterprises as
01:13:41
well. So not dissimilar to Aaron's
01:13:42
customer base. What I would say is that
01:13:45
what they are encountering is the trough
01:13:49
of disillusionment.
01:13:52
And I don't know if, Aaron, you're seeing
01:13:53
this as well, but every single, you
01:13:56
know, CIO ran
01:13:58
around signing up some sort of AI
01:14:01
product in large part because their CEO
01:14:04
would say to them, hey, what's your AI
01:14:07
strategy? And the reason the CEO asked
01:14:09
them that is that at some point somebody
01:14:10
on the board said, what are we doing
01:14:12
about AI? So that's the the
01:14:15
cascade that that that we went through
01:14:17
in the last two years. And I think what
01:14:20
has happened now is people have spent
01:14:22
billions and billions of dollars. I
01:14:24
think you can see it in the revenue
01:14:25
traction of the AI
01:14:28
companies. But I think where we are
01:14:30
today is that there are some real
01:14:33
technical complexities that have not
01:14:35
been solved. I'll give you an example.
01:14:37
We have a lot of customers in regulated
01:14:39
industries which is to say that if you
01:14:42
make a
01:14:43
mistake you will get fined or you will
01:14:46
get shut down. Life sciences,
01:14:48
healthcare, financial services are three
01:14:52
examples. People still don't seem to
01:14:54
appreciate that when you replace
01:14:56
software that is deterministic with
01:14:59
software that is probabilistic, meaning
01:15:01
software that somebody wrote for
01:15:04
you, do A then do B, then do
01:15:08
C with an LLM that can
01:15:11
hallucinate, you'll have
01:15:14
errors. So what used to be a throwaway
01:15:17
thing, which is quality assurance and
01:15:18
QA, right? unit testing, integration
01:15:21
testing is now the only thing that
01:15:22
matters. Why? Because if you're a
01:15:24
financial services institution and
01:15:26
you're supposed to do KYC and hit OFAC
01:15:28
and now all of a sudden you send a wire
01:15:30
somewhere in Syria, guess what? You're
01:15:32
in trouble. No bueno. Yeah. If you're a
01:15:34
healthcare company and you're supposed
01:15:36
to do some clinical diagnosis to send
01:15:38
out a drug on time and you don't do that
01:15:41
because the the model
01:15:42
hallucinates, that's a real problem. And
01:15:45
I am guaranteeing you, we have not seen
01:15:49
the class action lawsuits that will come
01:15:52
when those errors will eventually be
01:15:54
made. They're guaranteed to be made. We
01:15:57
just don't know the scope and the scale
01:15:58
of them. So that's why I'm sort of of
01:16:01
this posture where I think we've sold in
01:16:06
a ton of promise. I think the reality is
01:16:09
much more tactical. It's a little bit
01:16:11
more banal. I think we're sorting
01:16:13
through the exact use cases where you
01:16:15
can put guard rails around these error
01:16:18
rates where it's okay and tolerable.
01:16:19
Like Ryan will probably tell you there's
01:16:22
some number of phone calls that just
01:16:23
sound totally verkakte, but he's okay with
01:16:27
that because the broader thing is okay
01:16:29
and Aaron will so I don't know. So I
01:16:32
don't want to talk to my customers
01:16:33
though. I have it calling truck drivers
01:16:35
to offer them you know offer them loads
01:16:37
but I'm not having to talk to my
01:16:38
customer. Sorry, I I meant your truck
01:16:40
drivers, but my my my point just is that
01:16:43
I think agents are
01:16:45
real, but I think that we are far away
01:16:48
from that because we're still at the
01:16:49
phase of how do you build reliable
01:16:52
software in production for an enterprise
01:16:55
versus the toy apps that you see on the
01:16:58
internet which is like let me vibe code
01:17:00
something. I think these things are
01:17:02
worlds apart still. Okay, so let me get
01:17:03
Sacks in on here and just to inform the
01:17:05
audience you heard trough of
01:17:06
disillusionment. This comes from the
01:17:08
hype cycle. This is something Gartner
01:17:10
has talked about for a long time.
01:17:11
So in case you're taking it for granted
01:17:13
if you're watching, you have some sort
01:17:15
of technology trigger like agents. You
01:17:16
have this like peak of inflated
01:17:18
expectations. Now we're in the trough of
01:17:20
disillusionment. Hey, this stuff doesn't
01:17:21
work. It's hallucinating. But we're kind
01:17:23
of going up, that it works now. Would you
01:17:26
say we're on the slope of... I don't see
01:17:28
the disillusionment. I don't know where
01:17:29
this is coming from. I don't even think
01:17:30
we're at the peak yet. Oh, okay. So you
01:17:33
think we're still going up? Because a
01:17:35
lot of people, to Chamath's point, were
01:17:37
buying stuff and saying, "Hey, it
01:17:38
doesn't work, you know, and now we're in
01:17:40
the mess." Let me say let me say it
01:17:42
differently, Sax. I think we have not
01:17:44
yet figured out how to move the budgets
01:17:46
from experimentation to mainline
01:17:49
production. Meaning where large chunks
01:17:51
of the US economy are comfortable enough
01:17:55
with the ways in which hallucinations
01:17:57
are managed such that they will replace
01:18:00
legacy deterministic code with this new
01:18:03
probabilistic model generated code
01:18:06
meaning model enabled code. Let's just
01:18:08
put it that way.
01:18:11
Where are we on the slope here? Yeah.
01:18:12
Look, I would I would separate change
01:18:15
management issues, which are always
01:18:16
going to be important and there's always
01:18:18
going to be big ones whenever there's a
01:18:19
big disruption, especially in enterprise
01:18:21
and especially around compliance and
01:18:22
legal and all that kind of stuff. I
01:18:23
would separate that from the impact of
01:18:25
the underlying technology trend. And I
01:18:28
don't think the impact has come anywhere
01:18:30
close to peaking yet. And in fact, I
01:18:33
would say the rate of progress is
01:18:35
exponential right now on at least three
01:18:38
key dimensions. So number one is the
01:18:40
algorithms themselves. The models are
01:18:42
improving at a rate of I don't know
01:18:43
three to four times a year. They're not
01:18:45
just getting faster and and better, but
01:18:49
qualitatively they're different.
01:18:50
Remember, we started with pure LLM chat
01:18:52
bots. Then we went to reasoning models.
01:18:55
And the difference there is with a
01:18:56
chatbot, I just it's like kind of a
01:18:58
smart PhD or college student giving you
01:19:00
an answer off the top of their heads.
01:19:02
The reasoning models, it's more like the
01:19:04
PhD saying, "Okay, let me go off and
01:19:06
think about that. Let me do a project on
01:19:07
that." And it could work for 30 seconds
01:19:10
or a couple of minutes. I mean, as much
01:19:11
compute as you want to throw at it and
01:19:13
it will break down your complicated
01:19:15
question into a bunch of sub questions
01:19:16
and then it'll try different approaches
01:19:18
and it can validate some of those
01:19:19
approaches and come back to you with a
01:19:21
much more impressive answer. And if
01:19:22
you've been using like the Grok 3 deep
01:19:25
research or the new ChatGPT
01:19:29
o3 to do these types of new reasoning
01:19:32
models, it's pretty mind-blowing what
01:19:34
they're capable of. Have we even come
01:19:36
close to figuring out how to tap the
01:19:38
potential there, especially in an
01:19:39
enterprise context? No, but my point is
01:19:42
that the rate of progress on the
01:19:43
algorithms is again three or four
01:19:47
times
01:19:49
here. Okay, go finish about sex and then
01:19:52
I'll I'll take it and pass it. Go ahead.
01:19:53
Well, I was trying to lay out the
01:19:54
dimensions of which progress is
01:19:56
proceeding exponentially. Okay, so one
01:19:58
is the algorithms. Okay, which is not
01:20:00
just quantitative, it's also
01:20:01
qualitative. We didn't even get to the
01:20:03
agents part of it yet, but that's the
01:20:05
next big leap after reasoning models.
01:20:06
We're just starting to scratch the
01:20:08
surface there. Then you've got the
01:20:10
chips. I mean, the chips are getting
01:20:12
better at, I don't know, three to 4x a
01:20:14
year. We've gone from, you know, the
01:20:15
H100 to the H200. Now we're on the
01:20:18
GB200. We'll be at GB300 soon. We'll be
01:20:22
on to... three times better per year? Or
01:20:24
they get better three times per year.
01:20:26
No, no, no. They're getting the chips
01:20:27
themselves, depending on how you measure
01:20:29
it. Each generation of chips is probably
01:20:31
three or four times better than the
01:20:32
last. Okay. and Nvidia is back to
01:20:36
rolling out new chip, new generation of
01:20:38
products roughly annually and I'm just
01:20:40
using them as one example. Obviously
01:20:42
there are other companies as well. So
01:20:44
basically the lead from Hopper to
01:20:46
Blackwell to (got it) Rubin, I guess, will
01:20:48
be in next year and and then I think
01:20:50
Feynman's coming after that. I mean really
01:20:52
an astounding rate of progress. It's not
01:20:54
just the individual chips are getting
01:20:55
better. They're figuring out how to
01:20:56
network them together like with NVL72.
01:20:58
It's like a rack system to create much
01:21:01
better performance at the data center
01:21:03
level. And that would be like the the
01:21:04
third area where you're seeing basically
01:21:06
exponential progress. Just look at the
01:21:09
number of GPUs are being deployed in
01:21:11
data centers. So when Elon first started
01:21:13
training Grok, I think they had maybe
01:21:15
100,000 GPUs. Colossus was 100,000.
01:21:17
Correct. Right now they're up to
01:21:18
300,000. They're on the way to a
01:21:20
million. Same thing with OpenAI's data
01:21:23
center, Stargate. And within a couple
01:21:25
years they'll be at I don't know 5
01:21:27
million GPUs, 10 million GPUs. So and
01:21:29
you see that on the power side, right?
01:21:31
You're going from 100 megawatt data
01:21:32
centers to 300 megawatts to we're just
01:21:35
starting to now see the first gigawatt
01:21:37
power data centers. I don't even think
01:21:39
they're live yet, but this is where
01:21:40
they're trying to get to. And I don't
01:21:42
think it's beyond the real possibility
01:21:44
that we could be at 5 or 10 gigawatt data
01:21:46
centers in the next I don't know several
01:21:48
years. So, so my point is just look, the
01:21:50
algorithms, the chips, and the data
01:21:53
centers are all improving or scaling at
01:21:56
a rate of, I don't know, 3 to 4x a year.
01:21:58
That's 10x every 2 years. Okay? Where
01:22:01
people don't understand exponential
01:22:02
progress is that if you're getting
01:22:04
better at 10x every 2 years, that
01:22:06
doesn't mean you'll be at 20x in four
01:22:08
years. It means you'll be at 100x. 100x.
01:22:11
So the models, the chips, and the data
01:22:12
centers will all be 100 times more
01:22:14
powerful in four years, let's say at the
01:22:16
end of of this presidential term. So you
01:22:18
multiply those things together, the
01:22:21
algorithms, the chips, and then the raw
01:22:23
compute that's available, you're talking
01:22:25
about a millionx increase, some of which
01:22:28
will be captured in price reductions,
01:22:30
some of it will be in the performance
01:22:31
ceiling, and then some of it will just
01:22:33
be in the overall amount of of AI
01:22:37
compute that's available to the economy.
01:22:40
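The compounding arithmetic here is worth checking explicitly: 10x every two years is 100x over four years per dimension, and three independently compounding dimensions multiply to roughly a million-fold. These are the show's round numbers, not measured figures:

```python
# "10x every 2 years" compounds to 100x over 4 years, not 20x.
rate_per_two_years = 10
years = 4
per_dimension = rate_per_two_years ** (years / 2)   # 10^2 = 100x

# Three compounding dimensions: algorithms, chips, datacenter compute.
combined = per_dimension ** 3                        # 100^3 = 1,000,000x
```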
But the impact of this thing is going to
01:22:42
be absolutely massive and I think people
01:22:43
still don't even appreciate that fact
01:22:45
because they don't understand
01:22:45
exponential progress. Yeah. And I think
01:22:47
maybe just to square the circle the the
01:22:49
because everything that
01:22:52
you just said, Sacks, is what I think is
01:22:54
propelling the industry and then the
01:22:55
reality on Chamath's side, like just
01:22:57
just to connect the dots. So uh we have
01:23:00
an eval test that we do uh where we run
01:23:03
enterprise data through every model to
01:23:05
to kind of figure out its accuracy rate
01:23:07
and and you know how much it's not even
01:23:09
hallucination but just literally how
01:23:10
much data does it miss when we ask for
01:23:12
facts. The best model in the world um
01:23:14
actually, interestingly, was Grok 3
01:23:16
on on this particular test. We send it
01:23:18
500 documents and we ask for 40 data
01:23:21
fields back from the documents and so it
01:23:23
has to get every single data field
01:23:24
correct and we only do a single pass. So
01:23:27
we send the document to to the to the
01:23:29
model, we get a single pass back. Right
01:23:31
now the the best score is about 90%. Um
01:23:34
and so you can imagine a number of
01:23:35
industries where you can't have 90%
01:23:37
accuracy, you know, if you give
01:23:39
something, you know, a question on 40
01:23:41
data fields. Now there's ways to solve
01:23:42
it is you rerun it multiple times or you
01:23:45
chunk up the document into smaller parts
01:23:47
and so it doesn't get confused by the
01:23:49
large context window. But a lot of the
01:23:51
people that were deploying AI a year ago
01:23:53
or a year and a half ago weren't doing
01:23:55
that. And so they they you know they did
01:23:57
have a a kind of a a pilot run of
01:23:59
something and it it kind of worked okay.
01:24:01
And what they have to realize back to
01:24:03
your point sax is like this space is
01:24:05
literally exponentially you know
01:24:06
changing and so if you don't use the the
01:24:08
latest methods of okay you have to
01:24:10
actually like run the data through the
01:24:12
model multiple times um and you have to
01:24:14
chunk up the data into smaller parts and
01:24:16
you have to use a reasoning model and
01:24:17
you have to make sure that your your
01:24:18
prompt is like hyper tuned for the
01:24:20
particular use case. If you haven't done
01:24:21
those four things, then you probably
01:24:23
will actually end up with a project that
01:24:25
fails. And and even so, even when you do
01:24:27
all those things for even, you know,
01:24:29
harder problems, you know, you're still
01:24:30
going to run into issues. So, I think I
01:24:32
think the challenge is that everybody's
01:24:33
running a million miles an hour right
01:24:34
now and they're trying a lot of things.
01:24:36
Some work, some don't work at the same
01:24:38
time that the space is actually, you
01:24:40
know, you know, changing at a at a
01:24:41
pretty uh, you know, kind of crazy rate.
01:24:43
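A back-of-envelope model shows why the 90% single-pass score is so unforgiving, and why the re-running fix Aaron mentions helps. It assumes the 40 fields fail independently, which real document extractions may not:

```python
import math

fields = 40
doc_pass_rate = 0.90                   # best single-pass score cited above

# Implied per-field accuracy if all 40 fields must be correct at once.
per_field = doc_pass_rate ** (1 / fields)           # ~0.9974

def majority_vote(p: float, k: int) -> float:
    """Probability that a strict majority of k independent passes is correct."""
    return sum(math.comb(k, i) * p**i * (1 - p)**(k - i)
               for i in range(k // 2 + 1, k + 1))

# One of the mitigations mentioned: run multiple passes and vote per field.
boosted_field = majority_vote(per_field, 5)
boosted_doc = boosted_field ** fields               # far closer to 1.0
```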
and let's take a look at our partner
01:24:45
Polymarket and uh which company they
01:24:47
think will have the best AI model by the
01:24:49
end of 2025 and get feedback from our
01:24:51
panel on if you think this is accurate
01:24:54
and who you would pick here. Looks like
01:24:57
Google is in the lead here. 41% of
01:25:01
people believe that they will have the
01:25:05
best AI model by the end of the year.
01:25:06
Question: what is the dimension
01:25:08
like you know we use Gemini so for many
01:25:11
tasks at 8090 we use Gemini it's
01:25:13
incredible but for most of our codegen
01:25:15
we use Anthropic, and Claude kicks ass
01:25:17
it's it's exceptional this is uh based
01:25:20
on the best scores in the Chatbot Arena,
01:25:23
which just became a for-profit company, so
01:25:26
that is slightly different because
01:25:27
people have
01:25:28
gamed tests, so that is a rub there:
01:25:31
people are now building their model
01:25:33
for the eval, right? All the models
01:25:34
are way overfitted for these evals. But
01:25:36
if you had to pick who's your I mean so
01:25:38
I guess Chimath you're saying you have
01:25:39
to take a task by task it's what depends
01:25:41
on task. I agree what Sacks said is right
01:25:43
so it's kind of like what what problem
01:25:45
are you trying to solve and then you
01:25:46
have to ride this technology wave that
01:25:48
is compounding very quickly all I was
01:25:50
just trying to get across is that the
01:25:51
error rates have been diminishing but
01:25:54
not nearly as fast as you need for some
01:25:55
sectors of the economy so you can use a
01:25:58
model to generate deterministic code
01:26:00
that's great and as long as you unit
01:26:02
test it and integration test it'll be
01:26:04
fine but I'm saying if you're going to
01:26:05
use a model in production
01:26:07
in an environment where if there are
01:26:10
consequences
01:26:12
we're not there
01:26:15
yet but you could use it for writing or
01:26:18
writing jokes or
01:26:20
maybe but that's that's too binary like
01:26:23
it's already used right now in
01:26:24
healthcare but it's just the doctor's
01:26:26
meeting notes that that would normally
01:26:28
take 30 minutes to go and transcribe
01:26:30
yeah so so it's we you can't be too too
01:26:32
black and white on that one yeah what's
01:26:33
happening right now the reason why the
01:26:35
the progress is so rapid in coding
01:26:37
assistance and I I think you know you're
01:26:39
right that Anthropic with, was it Claude
01:26:43
3.7? 3.7, yeah. Yeah, I think they're the
01:26:45
leader and in fact I think the Manus
01:26:47
demo that we showed it's not entirely a
01:26:49
wrapper on Claude because they actually
01:26:51
they do a number of different things but
01:26:52
I think they are significantly using
01:26:54
Anthropic for the code assistant part of
01:26:56
it in any event the reason why the
01:26:59
progress is so rapid with coding is
01:27:01
because code compiles and you can
01:27:04
determine objectively whether it works
01:27:06
or not, you can validate it. And so that
01:27:09
makes it a perfect area for AI to get
01:27:13
better at through reinforcement learning
01:27:14
and test time compute is AI tries a
01:27:16
bunch of things. It sees what works. It
01:27:18
sees what compiles, sees what the user
01:27:20
then accepts, and then be able to learn
01:27:22
and iterate based on that. That's why
01:27:24
coding right now is really the big
01:27:26
breakthrough application and use case.
01:27:28
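The point about code being objectively checkable can be sketched in a few lines: a candidate either compiles and passes a test or it does not, which is exactly the kind of reward signal reinforcement learning needs. The `add` task and candidate strings here are invented for illustration:

```python
def score_candidate(source: str) -> bool:
    """Objective reward: does the candidate compile and pass a known test?"""
    try:
        code = compile(source, "<candidate>", "exec")  # gate 1: it compiles
        namespace = {}
        exec(code, namespace)
        return namespace["add"](2, 3) == 5             # gate 2: test passes
    except Exception:
        return False

candidates = [
    "def add(a, b): return a - b",   # compiles, but fails the test
    "def add(a, b) return a + b",    # syntax error: does not compile
    "def add(a, b): return a + b",   # compiles and passes
]
scores = [score_candidate(c) for c in candidates]      # [False, False, True]
```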
But it's not going to be the only one. I
01:27:29
mean, math is another good area where I
01:27:31
think AI is improving rapidly again
01:27:33
because math you have proofs and you can
01:27:37
look at the results and see if it
01:27:39
validates. Now, I think one of the big
01:27:41
questions in terms of AI progress is how
01:27:44
extensible is the progress to other
01:27:46
areas that don't easily validate that
01:27:47
way. So, for example, legal work is I
01:27:52
think a really good area for AI, but how
01:27:54
do you validate that it's correct? You
01:27:56
would have to go to a court, right? the
01:27:57
court is the compiler like a lawsuit is
01:27:59
the compiler or maybe the laws or you
01:28:02
could hire like a thousand lawyers or
01:28:04
experts in an area to basically do you
01:28:07
know reinforcement people are doing
01:28:09
people people are doing but it's not
01:28:10
like a compiler to your point it's not
01:28:12
like a it's not the progress isn't going
01:28:13
to be as rapid because it's harder to
01:28:15
validate but absolutely but my guess is
01:28:17
that once they figure out how to nail
01:28:20
coding math and the things that are
01:28:22
easily validated they can move to the
01:28:23
things that are harder to validate and
01:28:26
but I I think this is one of the big
01:28:28
questions is whether I think people just
01:28:30
kind of assume that AI progress will be
01:28:32
you know equally fast in all areas and I
01:28:34
think it's possible that AI gets really
01:28:36
good in some areas better than human but
01:28:38
it's sort of childlike in other areas
01:28:40
and um narrow possible outcome is make
01:28:43
your point right like with a finite
01:28:45
answer or an answer we know is the
01:28:47
definitive well this is but but this is
01:28:49
the the important thing about the agent
01:28:52
kind of you know let's just say uh
01:28:54
framework or architecture you
01:28:56
mentioned was, instead of just saying
01:28:59
okay we're going to do a single pass
01:29:00
through the model and then whatever it
01:29:02
comes back with is is we're going to be
01:29:03
satisfied um like you know the legal
01:29:06
work might be might be reviewed by
01:29:08
another agent whose job is to review
01:29:10
legal work and so we we can just throw
01:29:12
more and more compute at the problem uh
01:29:14
and we're just early in figuring out how
01:29:16
to architect those or multiple models
01:29:18
right you could
01:29:20
have Anthropic check
01:29:23
Gemini. Yeah, when you have some anomaly
01:29:25
it spits back out to the user. So human
01:29:27
in the loop still matters in this type
01:29:28
of process. In the early days of OCR,
01:29:31
you would have a computer say here's the
01:29:33
characters in this legal document. Then
01:29:34
you'd have two humans type it in and
01:29:36
then you would get a certain level of
01:29:37
certainty. You'll quickly find that when
01:29:39
you layer these models on top of each
01:29:41
other, the test time compute costs are
01:29:43
astronomical. Yeah. And Aaron's probably
01:29:46
dealt with this like it's like I get a
01:29:47
bill from AWS and it's like oh wait,
01:29:51
hold on a second. I just spent, you know,
01:29:54
$100,000 this month. What's going on? So
01:29:56
we have to get to the bottom. That that
01:29:58
by the way is another major trend line
01:30:00
which is that the new applications that
01:30:01
we talked about are all much more token
01:30:03
intensive. So we went from basic LLMs
01:30:06
totally you know which don't require
01:30:08
that many tokens to give you an answer
01:30:09
to the reasoning model where you can
01:30:11
spend a thousand times more tokens just
01:30:13
getting one answer to a question and now
01:30:15
the agents are going to be even more
01:30:17
token intensive than that. So the amount
01:30:19
of compute required to serve all these
01:30:21
new applications is going to be massive
01:30:23
which is why I think the capex buildout
01:30:25
actually makes sense when you do a deep
01:30:27
research to your point David you're
01:30:29
firing off maybe 200 queries and it's
01:30:31
asking them the AI is saying hey what
01:30:33
query should I ask on behalf of the user
01:30:35
and then you go down that rabbit hole
01:30:36
you it's basically like doing 200 of
01:30:38
them at once Ryan uh your thoughts here
01:30:40
on AI first companies and agentic
01:30:42
computing well the one I really wanted
01:30:44
to tie back to was actually our earlier
01:30:45
conversation on tariffs and there's a
01:30:47
very real use case of LLMs, which is how do
01:30:50
you classify a product and you'd like to
01:30:52
get to what we see today when we did our
01:30:55
first machine learning based natural
01:30:57
language classification of a product you
01:30:59
take a product URL listing page a
01:31:01
Shopify page or Amazon page and say hey
01:31:03
what what classification code is this
01:31:05
what duty is owed six years ago in a
01:31:08
hackathon we got to like 70% accuracy
01:31:11
we're now in high 90s accuracy versus
01:31:14
what a human trained expert will get to
01:31:16
but what you actually get to, which is not
01:31:18
good enough. You know, you're wrong 3%
01:31:20
of the time. You know, you might have
01:31:22
committed a violation of the law for
01:31:23
sure. But actually, what is truth in
01:31:26
that regard? It's there's a lot of gray
01:31:27
area in this. And truth ultimately is
01:31:29
what does customs say? What does the CBP
01:31:32
determine is correct? And those guys are
01:31:35
using software, right? That's pretty
01:31:38
that's a very simple algorithm and it's
01:31:39
a decision tree that's going, okay, is
01:31:41
it a shoe? Yes. Is the top made of
01:31:43
leather? Yes. You know, is the bottom
01:31:45
made of rubber? and they just go through
01:31:47
a very simple and that outputs it. And
01:31:49
so on some level, if you convince the
01:31:51
government to use your LLM, it becomes
01:31:53
true whether it's true or not. I think
01:31:55
there's going to be some interesting
01:31:56
cases like that that we haven't really
01:31:57
thought through of like when does the
01:31:59
government adopt these to be the source
01:32:01
of truth. Okay. All of this speaks to
01:32:03
this thing that's going to sound totally
01:32:05
esoteric, but like how we all used to treat QA,
01:32:10
right? Like the least talented engineers
01:32:12
were allocated to QA. I think in the
01:32:15
world of AI it's it'll end up being the
01:32:17
most talented. You know we internally at
01:32:19
8090 we call it improvement engineering
01:32:22
and it's a total specialty. It's similar
01:32:24
to when I kind of coined the growth team
01:32:27
at Facebook. I feel it's the same kind
01:32:29
of moment where improvement engineering
01:32:31
is really the skill that translates toy
01:32:35
apps and vibe coding into something
01:32:37
that's very practical and real. And we
01:32:40
like and my team and the leader of this
01:32:42
team, he's like steeped in things like
01:32:44
Japanese kata management from like
01:32:46
Toyota and quality systems and these are
01:32:48
all the things that matter when you're
01:32:50
trying to just shrink the error rate
01:32:52
down to zero so that you can use it in a
01:32:54
reliable way and also to document it so
01:32:57
that if people want to question what
01:33:00
happened or have you know recompense or
01:33:03
some some way to come back and say hey
01:33:05
that that really harmed me, how do you
01:33:07
even do that like these are all very
01:33:10
complicated issues that that will get
01:33:12
sorted out. Super interesting, I think.
01:33:14
Okay, four. I got to wrap guys. I got to
01:33:17
catch a flight to Miami. Let me do a
01:33:19
closing here. If you want to keep going,
01:33:20
you're welcome to. Three, two, the plane
01:33:21
just waits. Just text the pilot and just
01:33:23
tell them you're all right. Listen, I'm
01:33:25
not burning all the All-In credits, so to
01:33:29
speak, and all of our tokens. I'm
01:33:30
kidding. I'm kidding. I'm kidding.
01:33:34
You're flying private to everything and then putting
01:33:35
it on the All-In budget and the rest of
01:33:37
us are flying Southwest
01:33:39
for
01:33:41
your dictator Jim Paul. I miss a flight.
01:33:44
It's a strange concept. Yeah, David s.
01:33:46
What does that mean? David, when's the
01:33:47
last time you flew commercial? Listen,
01:33:49
I haven't missed a flight in about 15
01:33:51
years
01:33:53
For Ryan Peterson from Flexport, Aaron
01:33:56
Levie from the amazing Box, Chamath Palihapitiya,
01:33:59
David Sacks, your chairman dictator czar. I
01:34:02
am the world's greatest moderator. Love
01:34:04
you, boys.
01:34:07
We'll let your winners ride.
01:34:09
Rain Man David Sacks.
01:34:14
We open sourced it to the fans and
01:34:16
they've just gone crazy with it.
01:34:19
Queen of
01:34:21
[Music]
01:34:27
besties
01:34:27
[Music]
01:34:29
are my dog taking notice your driveways.
01:34:34
Oh man, my habitasher will meet up. We
01:34:37
should all just get a room and just have
01:34:39
one big huge orgy cuz they're all just
01:34:40
useless. It's like this like sexual
01:34:42
tension that we just need to release
01:34:43
somehow.
01:34:45
Wet your feet. Wet your feet. Your feet.
01:34:50
That's going to be good. We need to get
01:34:51
merch. I'm going all
01:34:55
[Music]
01:34:59
in. I'm going all in.

Badges

This episode stands out for the following:

  • 60
    Most shocking
  • 60
    Most creative

Episode Highlights

  • Trump's Economic Reforms
    The Trump administration is making significant cuts to government spending and deregulation, setting the stage for potential economic growth.
    “We've actually started to make real cuts in government, real cuts in the federal workforce.”
    @ 19m 53s
    May 03, 2025
  • Border Security Consensus
    Panelists agree that securing the border and addressing illegal immigration are top priorities for Americans.
    “Americans universally want the border secured. They don't want illegal immigration.”
    @ 21m 30s
  • Concerns Over Economic Uncertainty
    Economic uncertainty is causing anxiety among business owners, impacting planning and operations.
    “Economic uncertainty is really terrible for running a business.”
    @ 22m 12s
  • Manufacturing Vision
    The discussion revolves around the future of American manufacturing and the confusion surrounding it.
    “What does this mean for our economy?”
    @ 36m 35s
  • Economic Bonanza Potential
    A tax bill could create significant economic benefits for businesses, yet it remains underreported.
    “This could create an absolute economic bonanza if passed.”
    @ 43m 03s
  • Mainstream Media's Role
    Criticism of mainstream media for failing to report important economic developments.
    “They could have talked about what changes the tax bill brings.”
    @ 45m 38s
  • Amazon's Unfair Advantage
    A discussion on how Amazon's practices create an uneven playing field for American sellers.
    “This sounds profoundly unfair in terms of playing field.”
    @ 54m 04s
  • The Complexity of the American Economy
    Exploring the intricate issues affecting the economy today that weren't discussed six months ago.
    “We are finding what the implications are in course correcting in real time.”
    @ 55m 05s
  • AI Agents: The Future of Work
    AI agents are set to revolutionize how tasks are completed in the workplace.
    “2025 shaping up to be the year of AI agents.”
    @ 01h 04m 11s
  • The Future of AI Usage
    90% of AI applications will be new tasks that we haven't tackled yet, reshaping industries.
    “90% of AI usage will be things that we don't do today.”
    @ 01h 12m 27s
  • Exponential Progress in AI
    The algorithms, chips, and data centers are improving exponentially, leading to massive impacts in the economy.
    “The impact of this thing is going to be absolutely massive.”
    @ 01h 22m 43s
  • Government and AI Truth
    The adoption of LLMs by the government raises questions about the nature of truth.
    “If you convince the government to use your LLM, it becomes true.”
    @ 01h 31m 51s

Key Moments

  • Government Cuts @ 19:53
  • Trump's Dinner @ 23:45
  • Supply Chain Crisis @ 28:36
  • Tariff Negotiations @ 29:00
  • Manufacturing Confusion @ 36:35
  • Trade Debate @ 56:36
  • Trough of Disillusionment @ 1:13:45
  • Government Adoption @ 1:31:51

Related Episodes

  • Trump's Big Week: Middle East Trip, China Deal, Pharma EO, "Big, Beautiful Bill" with Ben Shapiro
  • Tucker Carlson: ICE Raids, LA Riots, Strong Economic Data, Politicized Fed, War with Iran?
  • 12 Day War, Socialism Wins in NYC, Stocks All-Time High, AI Copyright, Science Corner
  • Markets turn Trump, Long rates spike, Election home stretch, Influencer mania, Saving Starbucks
  • Trump AI Speech & Action Plan, DC Summit Recap, Hot GDP Print, Trade Deals, Altman Warns No Privacy
  • Trump Takes On the Fed, US-Intel Deal, Why Bankruptcies Are Up, OpenAI's Longevity Breakthrough
  • Does OpenAI Need a Bailout? Mamdani Wins, Socialism Rising, Filibuster Nuclear Option
  • Trump Brokers Gaza Peace Deal, National Guard in Chicago, OpenAI/AMD, AI Roundtripping, Gold Rally
  • Trump Rally or Bessent Put? Elon Back at Tesla, Google's Gemini Problem, China's Thorium Discovery
  • Fed Hesitates on Tariffs, The New Mag 7, Death of VC, Google's Value in a Post-Search World