Software Stocks Implode, Claude's Hit List, State of the Union Reactions, Trump's Tariff Pivot

February 28, 2026 / 01:21:08

This episode of the All-In podcast opens with a running joke about a "conspiracy corner" episode, covering 9/11, flat earth, and the JFK assassination, with Alex Jones as the supposed guest. The hosts then discuss recent market trends, particularly the impact of AI on various sectors, and a viral Substack post predicting an economic collapse driven by AI.

They debate the implications of AI for the job market and the economy, and how seriously to take the downturn the Substack post speculates about.

The discussion shifts to the recent performance of tech stocks and the broader market, with the hosts analyzing the reasons behind the decline in stock prices. They explore the relationship between AI developments and market confidence, emphasizing the uncertainty surrounding future cash flows.

Furthermore, the episode touches on the political landscape, particularly the State of the Union address, and the ongoing debate over tariffs and economic policies. The hosts express their opinions on the effectiveness of Trump's administration and the challenges of bipartisan cooperation.

Finally, the episode concludes with a discussion on the future of AI and its potential to reshape industries, as well as the need for companies to adapt to these changes.

TL;DR

The episode opens with a conspiracy-theory joke featuring Alex Jones, then covers AI's impact on software stocks, a market-moving Substack post, and the political landscape.

Video

00:00:00
All right, everybody. Welcome back to
00:00:02
your favorite podcast, the All-In
00:00:04
podcast. Today we have a conspiracy
00:00:08
corner episode for you. We're going to
00:00:10
go over the 9/11 inside job. We're going
00:00:12
over flat earth, JFK assassination. It's
00:00:16
going to be all conspiracy all the time
00:00:18
after our amazing blockbuster episode
00:00:21
during Ski Week. We're going all
00:00:22
conspiracy here. Our guest today, Alex
00:00:24
Jones.
00:00:25
>> How many views did it get? Nine views. I
00:00:28
mean, it's tough when you have one out
00:00:30
of four besties. It doesn't matter. Michael
00:00:32
Tracy is on standby.
00:00:34
>> Not true. I can carry an episode for at
00:00:36
least 400,000 views.
00:00:37
>> I mean, you might. You might. I like
00:00:39
your Hey, for people who don't know,
00:00:40
Chamath has his own YouTube channel. He's
00:00:42
got his escape hatch for when this train
00:00:44
wreck burns to the ground. He started
00:00:46
his own YouTube channel. He's hedging
00:00:48
his bets. Friedberg's working on his solo
00:00:50
project. Everybody's doing solo project.
00:00:52
The band's got a lot of solo projects.
00:00:54
>> The Beatles are experimenting. The
00:00:55
Beatles.
00:00:55
>> They're experimenting. We had a little
00:00:57
Yoko Ono situation going on here. You
00:00:59
know what the number one topic for this
00:01:00
show was by the All-In AI bot, Sacks?
00:01:04
>> What's up?
00:01:05
>> The number one was Dario versus Hegseth,
00:01:08
the uh Department of War versus
00:01:10
Anthropic was the number one topic
00:01:12
selected by our AI bot as a programming
00:01:14
note for folks. That decision will be
00:01:16
made end of the day Friday when this
00:01:17
podcast comes out. So, we will talk
00:01:18
about it next week. But, let's get to
00:01:20
work. We've got a full docket. The
00:01:22
Claude kill list has expanded and an AI
00:01:25
fanfiction Substack tanked your 401k on
00:01:28
Monday. Let's get into it. Anthropic's
00:01:31
generational run continues. They're now
00:01:33
three for three in tanking different
00:01:36
market sectors in February.
00:01:38
Congratulations. This is like they
00:01:41
took the mantle from Brad Gerstner
00:01:43
uh tanking the market.
00:01:44
>> The Anthropic list.
00:01:46
>> It is February 3rd. Anthropic announces,
00:01:48
hey, we got a legal plugin for Claude
00:01:50
Cowork. Thomson Reuters, LexisNexis,
00:01:53
LegalZoom, all down at least 10% since
00:01:56
February 3rd. Then on February 20th,
00:01:58
Claude Code Security is announced in a
00:02:01
limited research preview. Stocks tank
00:02:03
again. CrowdStrike, Cloudflare, Okta, all
00:02:06
down. Then February 23rd, Anthropic
00:02:10
announces Claude can modernize COBOL
00:02:12
databases. If you don't know COBOL,
00:02:14
that's the like oldest coding language
00:02:16
in the world. That's where Sacks learned
00:02:18
to code when he was in college in the 70s.
00:02:19
It's used for banking, payroll,
00:02:21
government,
00:02:22
>> healthcare.
00:02:23
>> Healthcare. It runs 95% of ATMs in the US
00:02:27
and it powers social security payments.
00:02:29
85% of all COBOL code runs on IBM
00:02:32
machines. So IBM decided they would tank
00:02:34
13% on Monday, their worst day since
00:02:38
2000. $31 billion in market cap losses.
00:02:43
So let's stop here before I get into the
00:02:45
fanfiction piece.
00:02:47
What's your take here of what's
00:02:49
happening in the market, Chamath? Is this
00:02:52
simply people are looking for an excuse
00:02:54
to trim their positions because things
00:02:55
have been top ticking all-time highs and
00:02:57
people are just looking for an excuse or
00:02:59
is this reality? Is this the go forward
00:03:01
reality that AI is going to compress
00:03:04
these kind of stocks because it solves a
00:03:06
lot of problems?
00:03:08
>> I'm going to give you two explanations.
00:03:11
I don't know what percentage I would
00:03:14
allocate
00:03:15
across the two, but I think one is
00:03:18
tactical and one is much more strategic,
00:03:20
but I think both are happening. The
00:03:23
tactical one is that we are at a moment
00:03:25
in time where a lot of the smart money
00:03:29
hedge funds are starting to massively
00:03:32
degross. And what that means is they're
00:03:36
trimming a lot of positions and they're
00:03:38
just taking on a lot less risk. Why? I
00:03:42
don't exactly know. It could be
00:03:44
motivated by the second thing that I'm
00:03:45
going to talk about. But the point is in
00:03:48
a degrossing cycle, you're cutting
00:03:51
risk and making your position sizes much
00:03:54
smaller. So the longs become less long,
00:03:56
the shorts become less short and you
00:03:58
just shrink. And so there's just general
00:04:00
downward pressure. That is a clear
00:04:02
behavior right now. But I think the
00:04:05
structural change is the more important
00:04:07
one. And this is sort of what I talked
00:04:09
about this morning. In a normal
00:04:12
functioning market,
00:04:14
what we are always debating is when a
00:04:18
set of cash flows go from becoming
00:04:21
highly confident to less highly
00:04:24
confident. It's a when conversation. So
00:04:27
when will Coca-Cola's cash flows be
00:04:30
impacted? When will Eli Lilly's cash
00:04:33
flows be impacted? When will Meta's cash
00:04:36
flows be impacted? And the answer to the
00:04:38
when
00:04:40
gets translated by the public markets
00:04:42
into three things. Your price to
00:04:44
earnings multiple where if you invert
00:04:47
that number what that is equivalent to
00:04:50
is the yield on the money that you get.
00:04:52
Okay. So if you, you know, 20 times P/E,
00:04:55
that's a 5% yield.
00:04:58
The second is a revenue multiple. And
00:05:00
the third is what's called your weighted
00:05:03
average cost of capital. Which is to
00:05:05
say, if you look at the next 20 to 30
00:05:08
years of earnings and you want to figure
00:05:10
out what that is worth today,
00:05:13
you have to discount all of these back
00:05:15
and you have to assume
00:05:18
a percentage of interest effectively
00:05:21
that it takes to get there. And the
00:05:23
basic math of this is that when you have
00:05:25
a high WACC, as it's called, you're
00:05:27
massively discounting these cash flows.
00:05:29
When you have a low WACC, you're
00:05:31
assuming that these things are very
00:05:32
durable. Okay. So, what is happening?
00:05:36
We used to debate when. This is no longer
00:05:38
a when moment. The market is very much
00:05:41
in an if mode.
00:05:43
Are these cash flows durable at all?
00:05:46
>> Could they fall off a cliff in year
00:05:48
three? Is there some AI model that's
00:05:50
going to come around the corner and
00:05:52
obliterate this business without me
00:05:54
knowing it? And because they've shifted
00:05:57
into this if mindset,
00:06:00
your risk becomes totally different. You
00:06:03
have this event risk that you don't know
00:06:06
how to price. And whenever the market
00:06:08
shifts into that mode, what you see are
00:06:12
that the holders of those equities want
00:06:14
a massive margin of safety. What does
00:06:16
that mean? They have to take P/Es way
00:06:19
down. If you used to trade at 40, you
00:06:21
should trade at 20. If you used to trade
00:06:23
at 20, you should trade at 10.
00:06:25
They take revenue multiples down. You
00:06:27
used to trade at 10 times revenue. Now
00:06:29
you're going to trade at three times.
00:06:31
You take the WACC way up. Used to be a
00:06:33
6% discounted weighted average cost of
00:06:35
capital. You know what? I'm taking you
00:06:37
to 12 or 13. That's the market's way of
00:06:40
saying, I'm now debating
00:06:43
if these things will even exist and so I
00:06:47
need to give myself a huge buffer to own
00:06:49
this stuff.
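The valuation math just described, inverting a P/E multiple into an earnings yield and discounting future cash flows at a higher WACC, can be sketched concretely. The $100M flat cash flow and 20-year horizon below are illustrative assumptions, not figures from the episode:

```python
# Sketch of the valuation mechanics described above. The cash-flow level
# and horizon are made up for illustration.

def earnings_yield(pe_ratio):
    """Inverting a P/E multiple gives the earnings yield on the price paid."""
    return 1 / pe_ratio

def present_value(annual_cash_flow, wacc, years):
    """Discount a flat stream of annual cash flows back at the given WACC."""
    return sum(annual_cash_flow / (1 + wacc) ** t for t in range(1, years + 1))

print(earnings_yield(20))               # 20x P/E -> 0.05, i.e. a 5% yield
pv_low = present_value(100, 0.06, 20)   # low WACC: cash flows assumed durable
pv_high = present_value(100, 0.12, 20)  # doubled WACC: the "margin of safety"
print(round(pv_low), round(pv_high))    # 1147 747
```

Moving the WACC from 6% to 12% cuts the present value of the identical cash-flow stream by roughly 35%, which is the repricing mechanism being described.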
00:06:50
>> That's what's happening right now. It
00:06:53
has a lot of ripple effects that we can
00:06:55
talk about. Freeberg and I have talked
00:06:56
about this a lot. The most obvious
00:06:58
impact is how these tech companies
00:07:00
recruit and retain talent because the
00:07:02
biggest thing that it starts to eat into
00:07:05
are the cash flows of a business which
00:07:07
are really directly tied to stock-based comp
00:07:09
and all this other stuff. But let me
00:07:10
just stop there. So we are we have moved
00:07:12
away from a when
00:07:14
>> to now an if and I think that that is a
00:07:17
very smart question to be asking. The
00:07:19
answer may be from any of these
00:07:21
companies that they will survive, but we
00:07:23
don't know how long. And until that
00:07:24
becomes clearer, you have to give
00:07:26
yourself room to be wrong.
00:07:28
>> You said when, then if. Did you mean if?
00:07:30
>> No, no, no.
00:07:31
>> To when?
00:07:32
>> We've always debated when. When will
00:07:34
these cash flows disappear? Now it's
00:07:35
like
00:07:36
>> will they even exist?
00:07:37
>> Got it. Okay. So the second part of the
00:07:40
story, Friedberg and Sacks, is that a
00:07:43
Substack post fanfiction
00:07:46
uh taking place in the fictional 2028 uh
00:07:51
global intelligence crisis went mega
00:07:54
viral. 28 million views on X. It was
00:07:56
posted Sunday night. It made the market
00:07:58
tank on Monday. In this fictional
00:08:02
Substack post, the author said there's
00:08:05
going to be essentially a death spiral
00:08:07
that happens because of AI. How does
00:08:09
that work? Well, first, companies
00:08:11
embrace AI. Everything goes right.
00:08:13
They're able to cut staff. Their margins
00:08:15
go up, similar to how Amazon has, you
00:08:18
know, trimmed their white-collar staff.
00:08:21
Then they're so successful at this that
00:08:23
they lose their customer base because
00:08:25
consumers don't have discretionary
00:08:26
funding to spend. Then it creates a
00:08:29
death spiral where the companies keep
00:08:30
deploying AI to try to hit the margins,
00:08:32
cutting staff, and the entire economy
00:08:34
collapses. Dr. Doom level stuff.
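The spiral described above is a feedback loop, and a minimal toy simulation shows its shape. Every coefficient here is invented purely for illustration; this is the structure of the Substack post's argument, not a forecast:

```python
# Toy model of the claimed AI "death spiral": firms cut payroll to lift
# margins, cut payroll shrinks consumer spending, shrinking spending pushes
# firms to cut again. The cut rate and spend ratio are arbitrary assumptions.

def simulate(payroll, quarters, cut_rate=0.05, spend_to_revenue=0.9):
    history = []
    for _ in range(quarters):
        payroll *= (1 - cut_rate)             # firms cut staff to hit margins
        revenue = payroll * spend_to_revenue  # less payroll -> less consumer spend
        history.append(revenue)
    return history

path = simulate(payroll=100.0, quarters=8)
print(round(path[-1], 1))  # 59.7: revenue grinds lower every quarter
```

The point of the toy model is only that once cuts feed back into demand, the decline compounds; whether the real economy has that coupling is exactly what the critics of the piece dispute.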
00:08:36
Unemployment's at 10%, the S&P down 38%
00:08:38
from its highs. After this piece came
00:08:41
out which speculated that agents would
00:08:43
get rid of all the 3% interchange fees
00:08:46
and move everybody to settle
00:08:47
transactions on stablecoins. All the
00:08:49
financial stocks got hit on Monday. Amex
00:08:51
down 8%, Capital One down 8%, Mastercard
00:08:54
6%, Visa 4%, yada yada. Finally, this
00:08:58
piece got a lot of pushback. There was
00:09:00
a silly piece in it or a section in it
00:09:02
where they said AI agents would vibe
00:09:06
code their way to displacing DoorDash.
00:09:09
And uh that's kind of silly, as anybody
00:09:11
who's run a network-based business knows. Sacks,
00:09:14
I assume you read this piece or at least
00:09:17
saw the fallout from it. What's your
00:09:19
take? And then we'll go to you, Friedberg.
00:09:21
>> Yeah. Well, I I know that this uh
00:09:23
Catrini article got passed around like a
00:09:25
joint at a Grateful Dead concert, but I'm
00:09:29
starting to question how legitimately
00:09:31
viral it really was. There's some
00:09:33
information that just came out that the
00:09:36
attribution of the article has been
00:09:38
amended, meaning the co-authors have
00:09:39
been amended to include a short fund
00:09:43
that was shorting some of the names
00:09:45
mentioned in the article. This is
00:09:47
according to another post that just came
00:09:49
out. According to this post, the
00:09:51
authorship attribution was changed after
00:09:55
publication of the report. The added
00:09:57
co-author is the
00:09:59
managing partner of a $262 million SEC
00:10:02
registered hedge fund who confirmed
00:10:04
short positions in the companies the
00:10:06
report named. So I think that's point
00:10:08
number one is I just wonder did this
00:10:11
article truly go viral or did the
00:10:13
authors do anything to kind of amplify
00:10:15
it and we just don't know the answer to
00:10:17
that question. But regardless of that,
00:10:19
let's just take the arguments on their
00:10:20
face. I think one of the best responses
00:10:23
to it was by another writer named Derek
00:10:26
Thompson, who wrote an article called
00:10:28
Nobody Knows Anything, which I think is
00:10:30
a reference to a famous take by
00:10:33
legendary Hollywood screenwriter William
00:10:35
Goldman.
00:10:36
>> Yeah.
00:10:36
>> In any event, what the article says is
00:10:39
no one really knows what's going to
00:10:40
happen with AI in two years, never mind
00:10:42
20 years. And so they resort to science
00:10:45
fiction writing masquerading as
00:10:47
analysis. And the author here, Derek
00:10:50
Thompson, says that the conversation
00:10:52
about AI is really just a marketplace of
00:10:55
competing science fiction narratives.
00:10:57
And he says, "That's not to say I think
00:10:58
the technology is a parlor trick, but
00:11:00
rather that the level of uncertainty is
00:11:02
so high and the quality and supply of
00:11:05
real world real-time information about
00:11:07
AI's macroeconomic effects so paltry
00:11:10
that very serious conversations about AI
00:11:12
are often more literary than genuinely
00:11:15
analytical." So in other words, what
00:11:17
he's saying is, look, this guy is
00:11:18
writing very compelling science fiction,
00:11:21
but there's no real analytics behind it
00:11:25
to defend it. And yes, this could
00:11:26
happen. Here's a prediction market on
00:11:29
whether people believe the Catrini
00:11:30
report's going to come true. Something
00:11:32
like 12% believe the Catrini scenario is
00:11:35
going to happen. But the truth is, no
00:11:36
one really knows. I mean, there's other
00:11:38
dueling science fiction narratives where
00:11:40
AI is going to create such a world of
00:11:42
abundance that we're not going to need
00:11:45
for anything. And just by the way, Derek
00:11:47
Thompson is one of the abundance guys
00:11:49
with Ezra Klein.
00:11:51
>> This is why the market's getting
00:11:52
whacked. I think that you're right,
00:11:53
Sacks, nobody knows. So, if you can get
00:11:56
5% for owning government bonds, why are
00:11:58
we taking excessive risk here?
00:12:00
>> Yeah. Let me just build on your point
00:12:01
about SaaS. So the reason why there's so
00:12:03
much uncertainty around SaaS is that SaaS
00:12:06
used to be such an easily modeled and
00:12:10
predictable category and you know I saw
00:12:13
as a VC we saw the same story play out
00:12:15
across many many different categories of
00:12:17
software. You'd have this initial period
00:12:19
where there'd be this experimentation
00:12:21
phase you have a bunch of different
00:12:22
products that come to market. There'd be
00:12:24
a battle and then the market would
00:12:25
eventually settle and they'd be a
00:12:27
category leader and they would capture
00:12:29
most of the market share and the vast
00:12:31
majority of the market capitalization
00:12:33
and they would have very very
00:12:34
predictable metrics. It was very easy to
00:12:37
grade a SaaS business. You look at ARR,
00:12:39
meaning annual recurring revenue. You
00:12:41
look at the net dollar retention you
00:12:43
want to see depending on the phase
00:12:45
>> RPO. RPO, remaining performance obligations.
00:12:48
>> And so you know these things began to be
00:12:50
seen as like an annuity with growth, right,
00:12:52
because
00:12:53
>> they were rock solid. Yeah.
00:12:54
>> Yeah. Because a good net dollar
00:12:57
retention would be something like 120%.
00:12:59
Which means that your sort of cohort of
00:13:01
existing customers on balance would all
00:13:03
renew the next year and actually they
00:13:05
would renew at 120% of the previous
00:13:08
year's contract values. And the reason
00:13:10
you got that extra 20% is they would buy
00:13:12
more seats or there'd be additional
00:13:15
products or features they would upsell.
00:13:17
It got to be very very predictable. And
00:13:19
so when people were buying software
00:13:21
companies at I don't know 13 times ARR,
00:13:24
they thought they were buying a growth
00:13:25
annuity. And now all of a sudden you got
00:13:28
to factor into that. Well, wait a
00:13:29
second. What if AI disrupts the whole
00:13:33
market? What if it doesn't eliminate? I
00:13:35
don't think AI is going to get rid of
00:13:36
Salesforce, but it could eat into their
00:13:38
growth opportunity. We just don't know
00:13:40
what if it changes the pricing model. I
00:13:42
mean, it just creates a whole lot of
00:13:43
unknowns. And I I actually don't believe
00:13:45
in the Catrini or the doomer take on
00:13:48
this, but I can see why the market would
00:13:52
feel this level of uncertainty
00:13:54
>> 100%.
00:13:55
>> Given how predictable a category SaaS
00:13:59
used to be just say a year ago.
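The 120% net dollar retention example above works out like this; the dollar amounts are invented for illustration:

```python
# Net dollar retention (NDR): recurring revenue this year from last year's
# customer cohort, divided by what that same cohort paid last year.
# Dollar figures are illustrative only.

def net_dollar_retention(cohort_revenue_last_year, cohort_revenue_this_year):
    return cohort_revenue_this_year / cohort_revenue_last_year

last_year = 10_000_000
# On balance the cohort renews, and seat expansion / upsells add ~20%:
this_year = 12_000_000

ndr = net_dollar_retention(last_year, this_year)
print(f"{ndr:.0%}")  # 120%
```

An NDR above 100% is what made these businesses look like growth annuities: revenue grows even before any new-customer sales.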
00:14:02
>> Yeah. Well, chaos is a ladder, Friedberg,
00:14:05
and this means opportunity. So, if we
00:14:08
look at this and SaaS has headwinds, then
00:14:11
is there a winner? Is open source the
00:14:13
winner? Or is this all deflationary in
00:14:14
your mind, Friedberg? And we just make
00:14:16
less money and the earnings of these
00:14:19
companies get compressed, the size of
00:14:21
them gets compressed. How do you think
00:14:22
about it? I think fundamentally if
00:14:25
you're driving productivity with AI,
00:14:29
you're driving leverage on human time
00:14:32
and leverage on capital.
00:14:34
The question is how quickly can you
00:14:36
drive that up? And that's a function of
00:14:38
how much consumption there is, how much
00:14:40
capacity there is for consumption. So on
00:14:43
the one hand, I'll just speak broadly. I
00:14:46
think like humans have this desire to
00:14:48
improve their livelihoods by roughly 10%
00:14:50
every year. Meaning like your income and
00:14:53
your ability to purchase stuff that's
00:14:54
new relative to where you were last year
00:14:56
has to go up by 10% for you to be happy.
00:14:58
If it's less than 10%, you're probably
00:14:59
unhappy.
00:15:00
>> Is that that's your anecdote or that's
00:15:01
like
00:15:02
>> that's just like an anecdote. Like I
00:15:03
think I think that's sort of like my
00:15:05
rubric for thinking about like why are
00:15:07
people unhappy or happy? So if your
00:15:08
earnings are the same but things are
00:15:09
getting more expensive, you're not
00:15:11
happy. If your earnings go up by 10% and
00:15:13
things stay the same price, you got 10%
00:15:16
more than you had last year. You're
00:15:17
you're going to be happy. I just think
00:15:18
like all humans are driven by this need
00:15:19
to consume more each year than they did
00:15:22
last year. So I think for me that's like
00:15:25
the lower limit on consumptive capacity
00:15:28
in the world.
00:15:31
The question that we're now facing which
00:15:33
we've never faced in human history
00:15:34
before: is there an upper limit
00:15:38
>> on consumptive capacity because AI
00:15:41
creates such a profound shift in
00:15:44
productivity and in leverage that
00:15:47
normally you would say hey when we get a
00:15:49
new tool or we get new leverage in a
00:15:50
system we build a new technology we can
00:15:52
make more with less. Therefore, everyone
00:15:55
gets access to more things for the same
00:15:57
price or the cost of things that they
00:15:59
consume come down by a certain price.
00:16:01
But there may be a situation now where
00:16:04
the ability to make stuff exceeds the
00:16:08
capacity to consume stuff. And that is
00:16:11
something that I don't think we've faced
00:16:12
before. And I think that's sort of where
00:16:14
a lot of the models start to break. Just
00:16:16
general economic models, just general
00:16:18
productivity models, and general social
00:16:20
models. And this goes to the point about
00:16:23
like what is everyone going to do? In
00:16:25
the same way that I think we've argued
00:16:26
that maybe SaaS was a transitory business
00:16:29
phenomenon that existed between the
00:16:32
foundation of the internet and the era
00:16:35
of AI. It may be the case that knowledge
00:16:38
work in general is also a transitory
00:16:41
phenomenon that only existed between the
00:16:43
foundation of the computer or computing
00:16:46
tools and the existence of AI generally
00:16:48
speaking. And if all of that goes away
00:16:50
very quickly and all of those people can
00:16:53
be redistributed and recast into doing
00:16:55
other higher level, more creative
00:16:57
things, their productivity goes up by
00:16:59
100x.
00:17:00
Is there really a consumer on the other
00:17:02
end of all of that productivity? Is
00:17:04
there really enough consumptive
00:17:06
capacity? And I think that's the
00:17:08
profound question that we all face. I
00:17:09
don't think that there's any limit.
00:17:11
>> Is that your way of saying that SaaS goes
00:17:13
to zero or that's your way of saying
00:17:16
these companies go to zero? I'm just
00:17:18
saying knowledge work in general, like,
00:17:20
what is the implication
00:17:22
>> is this just another dueling science
00:17:24
fiction take I mean or what's your
00:17:26
evidence for this
00:17:28
>> I think it's fine to have a sci-fi take
00:17:30
intuition I think is data because I can
00:17:33
show you some data that I think
00:17:35
contradicts what you're saying
00:17:36
>> and I have some firsthand sort of
00:17:38
>> in the sense that there's more leverage
00:17:40
stacks that people are able to actually
00:17:42
>> well Jake I want to hear what you have
00:17:43
to say because I know you're
00:17:44
experimenting with this but let me just
00:17:45
show you a few data points real quick um
00:17:47
because I think this is relevant. So
00:17:49
we're really talking about the
00:17:50
disruption caused by coding assistants,
00:17:53
right? This is like the first big killer
00:17:55
app of AI. I mean I guess after writing
00:17:57
and research for chat bots and we'll
00:18:00
have agents later, but really it's all
00:18:02
about coding assistants, right? And the
00:18:04
ability to more easily create code.
00:18:05
That's what's creating the disruption to
00:18:07
the SaaS category. Well, so let's just
00:18:09
focus on the data we see right now
00:18:11
around that. And there are a lot of
00:18:14
people who are pointing this out that
00:18:17
Anthropic right now has a job listing
00:18:20
for a software engineer on their website
00:18:23
right now for $570,000.
00:18:27
And a lot of people are kind of pointing
00:18:28
out, okay, so wait, so what Anthropic is
00:18:30
saying is they're still trying to hire
00:18:32
software engineers at a very high wage,
00:18:34
but somehow they think these jobs are
00:18:36
going to be eliminated.
00:18:38
>> Chamath might apply for that job.
00:18:40
>> Yeah. Austerity measures sounds pretty
00:18:42
good to me.
00:18:43
>> That's a lot of money.
00:18:44
>> Chamath might take that job and then just
00:18:46
have AI do it for him.
00:18:47
>> No, I'm worried, like, I hope my
00:18:48
8090 team doesn't see that offer. That's
00:18:50
a
00:18:51
>> that's a big number.
00:18:53
>> Our equity is way higher, but our
00:18:54
salaries are not that high. I mean, you
00:18:57
look you put these things together, it's
00:18:58
like that equity is money good. The
00:19:00
reality is like those guys are doing 5
00:19:02
to 6 billion structured secondaries every
00:19:05
year now or they're starting which means
00:19:06
that they will. That's like cash
00:19:08
compensation. So for for me to match
00:19:11
that, I need to be 3x higher than that.
00:19:13
>> Right? And I think a lot of people are
00:19:14
kind of pointing out, well, this is a
00:19:15
contradiction. Anthropic doesn't really
00:19:17
seem to be practicing what they're
00:19:18
preaching if they're paying enormous
00:19:20
amounts still for software engineers
00:19:22
even as they claim they're obsoleting
00:19:23
the entire category. Something doesn't
00:19:25
quite add up. Citadel Securities did a
00:19:28
new report that rebuts that Catrini
00:19:32
report and they show a couple of stats
00:19:34
here which I think are really
00:19:35
interesting. So, job postings for
00:19:37
software engineers are rapidly rising.
00:19:40
They're showing, I think it was roughly
00:19:42
a 10% year-over-year increase in the
00:19:45
demand for software engineers. On a
00:19:47
related note, they also show that
00:19:51
company formation is also rapidly
00:19:54
expanding and that may have something to
00:19:56
do with AI making it easier to start a
00:19:58
business or to get leverage to your
00:20:00
point, Freeberg. So look, there's a
00:20:02
couple of competing effects going on
00:20:04
here. And I think Aaron Levie had a
00:20:06
really good explanation of why you might
00:20:10
see something very counterintuitive
00:20:11
happening. And again, it all goes back
00:20:13
to Jevons paradox. But what Aaron says
00:20:16
is that when you lower the cost of
00:20:18
something that was previously supply
00:20:20
constrained, demand for that thing goes
00:20:22
up. Software engineering is just one of
00:20:24
the easiest examples to contemplate, but
00:20:26
there are going to be many other jobs
00:20:27
like that. But think about software
00:20:29
engineering. Even among startups in
00:20:31
Silicon Valley, which I think are
00:20:33
probably some of the most attractive
00:20:34
places for software engineers to work,
00:20:37
there's always been a chronic shortage
00:20:39
of them. Then you've got the Fortune 500
00:20:41
companies, non-tech companies, which
00:20:43
have always had an even harder time
00:20:45
hiring technical talent. So, you have
00:20:47
this massive unfilled need for software
00:20:51
engineers across the entire economy.
00:20:53
Now, you're going to be able to get a
00:20:55
lot more leverage out of software
00:20:56
engineers. It doesn't mean they're going
00:20:58
to get fired. It just means that now
00:21:00
maybe you can have a lot more 10x
00:21:03
software engineers, and
00:21:05
those jobs are now being spread
00:21:07
throughout the whole economy. You know,
00:21:09
I also think just to put some numbers on
00:21:11
this, I think the cost structure of the
00:21:12
average Fortune 500 business is
00:21:15
something like 5%, and that includes all of
00:21:18
their IT, not just their software. You
00:21:21
know, what should it be? What should the
00:21:23
percentage of software be in an
00:21:26
enterprise cost structure? Elon
00:21:28
describes companies as cybernetic
00:21:30
organisms that are part software, part
00:21:32
human.
00:21:34
>> If you think about the current Fortune
00:21:35
500 company being one or two percent
00:21:37
software, maybe they should be 50%
00:21:39
software. I think what Aaron is saying
00:21:41
here is the market for software and
00:21:44
software engineers was so constrained by
00:21:46
the lack of availability
00:21:48
that even if we 10x or 100x the
00:21:51
productivity of software engineers, the
00:21:52
demand will be there to absorb this new
00:21:55
supply. And so it could lead to this
00:21:57
explosion in productivity without the
00:21:59
massive job loss.
00:22:00
>> I think you're right. I think the the
00:22:02
thing that I would look at is I would
00:22:05
expect OPEX as a percentage of revenue
00:22:07
to fall off of a cliff but within that
00:22:10
OPEX, the percentage of it that you
00:22:13
allocate to technology and technology
00:22:15
related things probably goes way way up
00:22:17
than what it is today. Okay, Jason, the
00:22:19
batch of people that are applying for
00:22:21
Launch,
00:22:23
has SAS stopped? Has software stopped?
00:22:25
>> It's AI first companies obviously and
00:22:28
people
00:22:29
>> are they rebuilding traditional SaaS
00:22:31
tools just cheaper?
00:22:32
>> Basically, everybody's building the
00:22:35
great, you know, as we talked about at
00:22:36
the all-in summit like some of these
00:22:38
companies are trying to build the best
00:22:40
pilot in the world or Waymo's trying to
00:22:42
build the best driver in the world.
00:22:43
People are now trying to build the best
00:22:45
SDR in the world, the best salesperson,
00:22:47
the best executive coach. And so we have
00:22:50
been like obsessed with Claude Cowork,
00:22:53
but mainly OpenClaw. And so what we did
00:22:56
was and and I think it's not developers
00:22:59
that are going to do all this work, it's
00:23:01
knowledge workers. So we have 20
00:23:04
people in our firm. We had 15 of them
00:23:05
come in this weekend and they all got
00:23:08
trained over like six or seven hours, had
00:23:10
to have their own OpenClaw agent, and we
00:23:12
started building it. Every piece of
00:23:14
software that we wanted to buy or build
00:23:16
over the last 10 years that we never got
00:23:19
to, my people are building in the last 30
00:23:22
days. As an example, you know, when
00:23:24
you're selling ads for a podcast, you
00:23:27
want to check all the other podcasts and
00:23:28
what advertisers they have. We trained
00:23:30
an agent to go take the top 100
00:23:33
podcasts, look through the transcripts,
00:23:35
figure out who the advertisers are,
00:23:36
check those advertisers in Pipedrive,
00:23:38
tell us when the last time we contacted
00:23:40
them and put it into the sales room.
00:23:41
That was an SDR job that we wanted to
00:23:43
fill and software we wanted to build.
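The advertiser-research workflow just described could be sketched roughly as follows. Every function, brand name, and the dict standing in for the CRM below are hypothetical illustrations, not real OpenClaw or Pipedrive APIs:

```python
# Hypothetical sketch of the advertiser-research agent: scan transcripts for
# known advertiser brands, cross-check hits against a CRM's last-contacted
# dates, and surface stale leads. All names and data shapes are made up.
from datetime import date

def find_advertisers(transcript, known_brands):
    # Naive mention detection: which known brands appear in the transcript text.
    text = transcript.lower()
    return {b for b in known_brands if b.lower() in text}

def stale_leads(advertisers, crm_last_contact, today, max_age_days=90):
    # Advertisers we have never contacted, or not within max_age_days.
    stale = []
    for brand in sorted(advertisers):
        last = crm_last_contact.get(brand)
        if last is None or (today - last).days > max_age_days:
            stale.append(brand)
    return stale

transcripts = [
    "...this episode is brought to you by Acme VPN...",
    "...thanks to Globex Mattress for sponsoring this week...",
]
brands = ["Acme VPN", "Globex Mattress", "Initech"]
crm = {"Acme VPN": date(2026, 2, 1)}  # last-contacted date per brand

found = set().union(*(find_advertisers(t, brands) for t in transcripts))
print(stale_leads(found, crm, today=date(2026, 2, 28)))  # ['Globex Mattress']
```

A real agent would swap the hardcoded lists for transcript fetching and CRM API calls; the loop structure (detect, cross-check, flag) is the part the anecdote describes.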
00:23:45
Then we wanted people to have
00:23:47
>> hold on that was a that was a human that
00:23:49
you were paying money and now you've
00:23:50
replaced with software or that human
00:23:52
still exists but now they just do it in
00:23:54
a better way.
00:23:54
>> Redeploying that human. We had a human
00:23:56
doing it. We're going to redeploy them
00:23:58
to do other things and the consistency
00:24:00
of this, Chamath, and the accuracy, and
00:24:02
then it's doing it all night long. So we
00:24:05
have like seven of these agents in these
00:24:07
kind of roles. The next piece we did was
00:24:09
we gave my agent which is like the
00:24:10
Ultron, root access to Gmail, calendar,
00:24:14
Zoom, Notion, Slack. And what it's doing
00:24:18
is it's giving each person here's what
00:24:20
you got done this week with their
00:24:21
manager. Here's the emails you sent.
00:24:23
Here's the meetings you took. Here's the
00:24:24
contacts. Here are the threads you were
00:24:26
involved in and then it's helping manage
00:24:28
those people and so we are
00:24:31
>> Okay, but all that, all that to me says,
00:24:33
you, Jason, despite all your doomerism,
00:24:36
seems like you're growing and you're
00:24:38
going to be hiring more people and
00:24:39
you're more productive. Am I getting this
00:24:41
wrong
00:24:41
>> I'm not a doomer. What I think's going to
00:24:43
happen in this position is
00:24:44
>> but you're growing and you're going to
00:24:45
be hiring more people
00:24:46
>> No, no, we're not going to add more people.
00:24:48
Definitely not adding people. We are
00:24:51
becoming 10 or 20% more efficient every
00:24:54
week, because the software we would have
00:24:55
paid for or built from another vendor if
00:24:57
we had the time or we wanted to build
00:25:00
custom software if we had 10 engineers, it's
00:25:02
being built by our OpenClaw agents. As
00:25:05
an example when we make clips for this
00:25:07
podcast other podcasts we have it go and
00:25:10
look at, like, uh, a This Week in Startups
00:25:12
episode from 10 years ago, tell us the three
00:25:14
best moments and it makes the clip it
00:25:16
puts the subtitles on it and then it
00:25:17
puts the clip into the Slack room. That
00:25:19
was something that was going to be a
00:25:21
full-time job. So, we're getting 10 to 20%
00:25:24
more efficient. Then I started doing it
00:25:26
at home. So, I had it take our
00:25:28
Instacart, pull out the last 10 orders
00:25:30
we did, and then tell us what we order
00:25:32
most of the time. And then it's going to
00:25:34
automatically build a cart for us. Every
00:25:36
single knowledge work job is being
00:25:38
automated right now. And you can take it
00:25:41
and if you're a business process head,
00:25:43
where you know how to like do a business
00:25:45
process and you can structure it and
00:25:47
write it with an agent, it'll just run
00:25:49
it every day, every week. We did another
00:25:51
agent, how do you make better
00:25:53
thumbnails? And we said, every Saturday
00:25:57
in your skills, so when you build an
00:25:59
OpenClaw, it has like a soul file and
00:26:01
it has a skills file. In the skills
00:26:03
file, we told it, Sax: every week, go out
00:26:05
and look for people discussing how to
00:26:08
make better thumbnails on YouTube, how
00:26:10
to make better titles. It found this
00:26:12
week, Chamath, somebody at Mr. Beast's
00:26:15
company talking about how they're using
00:26:18
heat maps. It was an article I would
00:26:20
have never known. It added it to its
00:26:22
skill and now whenever we post a
00:26:24
thumbnail, it tells us, based on its
00:26:26
skill that it refines every week, how
00:26:28
to make that thumbnail better. And it's
00:26:30
starting to make the thumbnails. This is
00:26:32
becoming recursive. So you keep the same
00:26:34
number of people, but they get 10 or 20%
00:26:36
more efficient. I don't know what this
00:26:38
means for the larger economy. All I know
00:26:39
is it's the most exciting time I've had
00:26:42
online since the web came out, since the
00:26:44
internet came out. It is so much fun to
00:26:47
automate all this stuff. The big
00:26:48
question that I am thinking about that I
00:26:52
haven't gotten a good answer about, so I
00:26:54
don't know what you guys think,
00:26:55
is
00:26:56
all these businesses are going to need
00:26:58
to batten down the hatches and give
00:27:00
themselves room to figure this all out
00:27:02
right? Like if you take Sax's point, like
00:27:04
if you take your point, J-Cal, which is the
00:27:07
young nimble companies like yours are
00:27:09
going to be rapidly experimenting the
00:27:12
bigger larger companies are going to
00:27:14
slowly onboard themselves to start
00:27:16
experimenting. All of that means we're
00:27:18
going to get much clearer answers to all
00:27:20
of this. But what it also means is that
00:27:22
you're going to have to have time so
00:27:25
that you can figure this all out.
00:27:27
>> And
00:27:28
if you want to buy yourself time, you're
00:27:31
going to need a ton of cash. And if
00:27:33
you're going to think about saving cash,
00:27:35
the one place tech companies literally
00:27:38
incinerate cash is how they do
00:27:41
compensation.
00:27:43
And so I kind of think like at some
00:27:45
point the next shoe will drop and all of
00:27:47
these tech companies have to really look
00:27:48
at stock-based comp because they
00:27:51
literally incinerate most if not all of
00:27:53
their free cash flow fighting the
00:27:55
dilution from stock-based compensation.
00:27:57
So if you want five or six years to just
00:27:58
be in the arena on the field figuring
00:28:00
this out, you're going to want to kind
00:28:03
of be very cash flow generative and
00:28:05
really conservative in how you spend
00:28:07
your money. Yeah, Sax, the people who
00:28:09
embrace this, I think, become five or
00:28:11
ten times more valuable than the people
00:28:13
who are not. That's where I think the
00:28:14
opportunity in the economy is. So,
00:28:16
unless you think humanity is going to
00:28:18
run out of problems to solve, I think
00:28:20
it's going to be boom. It's going to be
00:28:21
boom town. And I think people are going
00:28:23
to start more companies because the
00:28:26
barrier to start a company is no longer
00:28:27
three or four million dollars. You can
00:28:29
just have two or three people and you
00:28:31
start setting up these agents and man
00:28:32
you can
00:28:34
>> make software, you can do sales, you can
00:28:36
do PR, everything is getting faster and
00:28:38
faster and faster. So the time between
00:28:41
like conceiving of a product and
00:28:43
publishing it and finding a developer,
00:28:45
you don't even need a developer, you can
00:28:46
just publish software. The wakeup
00:28:49
moment for me was we were talking to our
00:28:51
agent about, hey, we want to get this
00:28:55
functionality out of Slack. And it's
00:28:56
like, "Yeah, Slack doesn't have that,
00:28:58
but have you considered Mattermost?" I'm
00:29:00
like, "What's Matter Post?" Like, "Oh,
00:29:01
it's an open source project. I can spin
00:29:02
it up this weekend. Export your Slack
00:29:04
instance and put it there." And I was
00:29:05
like, "Oh, don't do that. We're only
00:29:07
spending 6K a year on Slack or 10K a
00:29:10
year." But the software is building CRM
00:29:13
systems for us. It's building agents for
00:29:15
us, and it wants to just build all the
00:29:18
software stack. So when you
00:29:20
renegotiate with Slack or HubSpot or
00:29:24
whatever company you're working with,
00:29:27
you're going to be able to say to them,
00:29:28
hey, we could roll our own and uh when
00:29:30
you want to upsell us on this latest
00:29:32
thing, like you talked about with SaaS,
00:29:33
upselling is such a big part of SaaS.
00:29:35
You're like, I can actually build that
00:29:37
software myself internally. I don't need
00:29:38
you to do it.
00:29:39
>> Ryan Petersen just posted on X that Claude
00:29:42
for legal seems to work just as well as
00:29:45
Harvey, by the way. Now the SaaS
00:29:48
apocalypse is going after private
00:29:50
companies now too.
00:29:51
>> Well, I think for a while now there has
00:29:53
been a question of which layer of the
00:29:54
stack is going to capture all the value.
00:29:57
>> So is it going to be the model companies
00:30:00
or could it be the applications that are
00:30:01
built on top of the models or you know
00:30:04
if there's a lot of competition at both
00:30:05
those layers of the stack did the chip
00:30:06
companies get it all? I think it's an
00:30:09
unclear question, but
00:30:10
>> totally
00:30:11
>> yeah I think you know for any given
00:30:12
vertical application you do have to
00:30:16
defend why you think your value prop
00:30:19
will be sustainable as the underlying
00:30:22
foundation models get better themselves
00:30:25
>> And it's open source. Like, this week we
00:30:27
put up Kimi 2.5. It can do about 80 to 85%
00:30:31
of the jobs, so we lowered our token
00:30:33
bills massively when we stood that up.
00:30:35
All right, listen, this is TBD. We got a
00:30:38
lot more to think about on this topic.
00:30:40
>> Just on this point: a lot of these
00:30:42
debates about AI are dueling science
00:30:46
fiction narratives. I just think that
00:30:48
the doomer narratives are inherently
00:30:50
more appealing to people. I mean, I
00:30:52
think it's partly just you look at most
00:30:54
sci-fi movies are dystopian, not
00:30:56
utopian. In addition to that, I think we
00:30:59
have a bunch of heuristic biases in favor
00:31:01
of the doomer narrative. So, one of them
00:31:03
is the seen versus the unseen. It's a
00:31:05
lot easier to see the jobs that already
00:31:08
exist that could be obsoleted than it is
00:31:11
to imagine the new jobs and the new
00:31:13
business models that haven't been
00:31:15
created yet. And that will likely take
00:31:17
some great innovator or a genius to
00:31:19
think of in order to create. So we have
00:31:22
that huge heuristic bias of not being
00:31:24
able to see the creation that's coming.
00:31:26
It takes way less creativity to think
00:31:28
about the potential destruction. And
00:31:30
then finally I think you know the other
00:31:31
heuristic is just the whole fixed pie
00:31:34
fallacy. Most people do tend to think of
00:31:37
the economy as a fixed pie. This is why
00:31:39
you see so much anger against you know
00:31:41
millionaires and billionaires is because
00:31:44
of this idea that if someone's getting
00:31:45
rich it must be at the expense of
00:31:46
someone else. That's not actually the
00:31:48
case. The economy itself could be
00:31:50
growing larger as a result of someone
00:31:52
inventing something new that increases
00:31:54
production. A really good line from
00:31:56
another article that was written just a
00:31:57
couple weeks ago was the economy is not
00:32:00
a pie, it's a garden, and technology is
00:32:03
rain. So again, you know, all of this
00:32:05
technological innovation is going to
00:32:08
increase the growth rate of the garden.
00:32:10
It's not a fixed pie. And just because
00:32:13
you see an expansion in productivity in
00:32:17
one part of the economy does not mean
00:32:18
that you're going to see job loss in
00:32:20
another part of the economy.
00:32:21
>> Yeah. I think the job people are not
00:32:23
seeing but I'm seeing right now is the
00:32:25
person who creates agents, manages them
00:32:28
and is the maestro of the agents. The
00:32:31
person who can take the business
00:32:34
process, explain it and train the agent
00:32:36
to do it. And there are certain people
00:32:37
in business who are just really good at
00:32:39
operations. You were one of them, Sax,
00:32:41
running companies and like that person
00:32:43
who can fire up an agent, train the
00:32:45
agent and figure out how to manage them
00:32:47
and figure out how to increase
00:32:49
>> With any new technology. Great job. And
00:32:51
it's not a developer.
00:32:52
>> Look, with any new technology, there's
00:32:54
always a huge change management aspect
00:32:56
with enterprises because it's hard for
00:32:59
them to adapt and change. And the people
00:33:01
in the organization who can lead that
00:33:04
change management are the ones who are
00:33:06
going to create an amazing career
00:33:07
opportunity for themselves. But it's
00:33:09
hard to do and that's going to slow down
00:33:10
the rate of change just the amount of
00:33:12
inertia in the economy. And also one
00:33:14
other constraint is going to be that at
00:33:16
some point here we may be token
00:33:18
constrained, right? I mean, we may not
00:33:19
have enough energy like we've talked
00:33:21
about, even though the chips are getting
00:33:23
so much better, that tokens per second,
00:33:26
tokens per watt, and tokens per dollar
00:33:29
are all increasing very fast, but we're
00:33:31
still going to probably be constrained
00:33:33
in the next couple of years on some
00:33:35
dimension, whether it's land, power,
00:33:36
shell, or just energy production or
00:33:38
maybe chip production. There are real
00:33:40
world constraints on just how fast we
00:33:43
can scale the infrastructure and that
00:33:45
will mean that these like hyper utopian
00:33:48
or hyperdystopian narratives will be
00:33:50
wrong. I don't think there's time in the
00:33:52
next few years for the whole economy to
00:33:54
change in the way that the extremes
00:33:56
would present.
00:33:57
>> I think you're right. I think you're
00:33:59
going to see a 10xing in
00:34:03
the demand for tokens, but I also think
00:34:05
you're going to see a 90% price
00:34:08
reduction in the cost of an output token
00:34:10
probably by the end of this year. So, I
00:34:12
think that to your point, like it's it's
00:34:15
going to just create an enormous upswell
00:34:17
of demand because we're going to be able
00:34:19
to cut the prices of an output token so
00:34:21
dramatically. And I think by the way
00:34:23
>> That discussion we had, Chamath, last week
00:34:25
when we talked about the tokens
00:34:26
outpacing the employee salary and just
00:34:29
where are these tokens all going to come
00:34:30
from that was our most viewed clip or
00:34:33
one of the most viewed clips in the
00:34:34
history of this podcast. So people are
00:34:36
actually really focused on this.
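The arithmetic behind Chamath's prediction is worth making explicit: a 10x rise in token demand combined with a 90% cut in per-token price leaves total spend unchanged. A minimal sketch with hypothetical round numbers (the $10-per-million-token baseline price is illustrative, not a figure from the show):

```python
def total_spend(million_tokens: float, price_per_million: float) -> float:
    """Total dollars spent = millions of tokens consumed * price per million."""
    return million_tokens * price_per_million

# Baseline: 1,000,000 million-token units at a hypothetical $10 per million tokens.
baseline = total_spend(million_tokens=1_000_000, price_per_million=10.0)   # $10,000,000
# Projection: 10x the demand at a 90% lower per-token price.
projected = total_spend(million_tokens=10_000_000, price_per_million=1.0)  # $10,000,000
```

Under those assumptions the price cut exactly funds the demand growth, which is why cheaper tokens can create "an enormous upswell of demand" without total spend exploding.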
00:34:38
>> I had my team at 8090, we redid our cost
00:34:40
model and now we have that as a line
00:34:43
item. When we think about fully burdened
00:34:46
cost of employees, we now factor that in
00:34:49
because we're at a place where
00:34:51
some of our engineers are just racking
00:34:53
up ginormous bills and then separately
00:34:55
just general runs that we do for general
00:34:59
purpose stuff that we need to just run
00:35:00
our product, it's so expensive. So I
00:35:03
am waiting with bated breath for what
00:35:05
Sax said, which is, like, we need an
00:35:06
explosion in the capacity that's
00:35:10
available because I do think that the
00:35:12
the silicon solutions are coming that
00:35:14
will cut the cost but we need a large
00:35:17
block of land, power, shell ready to then
00:35:20
turn all of this stuff on so that we can
00:35:21
actually take advantage of it.
00:35:23
>> Rumor is the new Mac Studio that's coming will
00:35:26
have an M5 chip in it and will be
00:35:28
language model ready. So that's the
00:35:30
rumor is that they're building it for
00:35:32
models. So that could be an incredible
00:35:34
turn of events. Everybody's desktop
00:35:36
running the local model. Sax, you want
00:35:38
to have the final word here? Uh, or
00:35:39
Friedberg, before we
00:35:40
>> Just to go back to what Chamath was
00:35:42
saying there. I mean, you've got
00:35:44
political forces that want to stop the
00:35:46
construction of all data centers in the
00:35:47
United States. So
00:35:49
>> if that gains steam, then that's going
00:35:50
to be a huge constraint on any change
00:35:52
whatsoever.
00:35:53
>> Can I tee this up for you, J-Cal? So I
00:35:55
went back this weekend
00:35:57
and I looked at the number of data
00:36:00
centers that have faced local opposition
00:36:03
and whether there were patterns. And I
00:36:05
posted it on X. So Nick, maybe you can
00:36:07
put this up, but
00:36:09
it was really a very small
00:36:14
behavior which was pushing back on data
00:36:16
centers and getting them cancelled. We
00:36:18
had about 25 projects total of which
00:36:22
20 were just in Q2 alone. There are 100
00:36:25
data center projects right now that are
00:36:27
facing some form of local opposition.
00:36:29
>> So interesting.
00:36:30
>> If you take that 40% number and you
00:36:34
apply this and then you multiply by the
00:36:37
number of megawatts that they have
00:36:38
announced. Last year we lost almost 5
00:36:42
gigawatts
00:36:44
in terms of cancel projects.
00:36:46
This year, coming in '26, we have about
00:36:49
seven that could be cancelled if you use
00:36:51
this math. If you then flow that through
00:36:55
OpenAI: Sarah Friar said that every
00:36:58
gigawatt, for OpenAI, is about
00:37:00
10 billion of revenue. So if you
00:37:03
assume that that's roughly accurate plus
00:37:04
or minus a billion here or there. What
00:37:07
that means is that in 2025 the industry as
00:37:11
a whole lost
00:37:13
50 billion of revenue
00:37:15
and this year, if 7 gigawatts gets canceled,
00:37:18
it's about 70 billion. Now you're
00:37:20
talking about 130 billion of lost
00:37:21
revenue over these two years that'll go
00:37:23
forward in time that we miss out on. I
00:37:27
think that that's really bad. We need to
00:37:28
figure out a way to nip this in the bud.
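Chamath's gigawatt-to-revenue math can be checked with a quick calculation. The $10B-per-gigawatt figure is the assumption he attributes to OpenAI's CFO; with round inputs of 5 GW and 7 GW the two years sum to $120B, in the ballpark of the roughly $130B cited on the show:

```python
# The ~$10B-per-gigawatt figure is the assumption quoted on the show,
# attributed to OpenAI's CFO; treat it as a rough planning number.
REVENUE_PER_GW = 10e9  # dollars of annual revenue per gigawatt

def revenue_at_risk(gigawatts: float) -> float:
    """Revenue foregone when a given number of gigawatts is cancelled."""
    return gigawatts * REVENUE_PER_GW

lost_2025 = revenue_at_risk(5)      # ~5 GW cancelled in 2025 -> $50B
at_risk_2026 = revenue_at_risk(7)   # ~7 GW at risk in 2026  -> $70B
total_at_risk = lost_2025 + at_risk_2026  # $120B with these round inputs
```

The spread between $120B and the quoted ~$130B comes from the "almost 5" and "about seven" roundings in the original figures.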
00:37:30
>> So confounding because we were sitting
00:37:33
here 5 years ago, 10 years ago, local
00:37:36
municipalities were fighting for them and giving
00:37:38
discounts to try to get these data
00:37:39
centers open to get the jobs and get the
00:37:41
revenue. And now we've got people trying
00:37:43
to stop them. This is a perfect
00:37:45
transition for the State of the Union.
00:37:46
Before we get there, two important
00:37:49
programming notes. All-In is going to
00:37:51
host two events in 2026. One of them,
00:37:53
Liquidity, May 31st to June 3rd, uh, in
00:37:56
Yountville, uh, up in Wine Country.
00:37:58
Chamath has taken control of the event
00:38:01
and he has set a standard for who gets
00:38:03
on stage.
00:38:04
>> None of you mids can control the
00:38:05
programming. I just
00:38:06
>> That's it. Chamath came in and he
00:38:09
dropped the hammer. Who do you got so
00:38:10
far? You want to tease a couple of
00:38:12
people who you invited to come speak?
00:38:14
>> I'll tease two. The first is an
00:38:17
incredibly dear friend of mine,
00:38:19
>> the axe of axes,
00:38:22
Dan Loeb, who founded Third Point, who is
00:38:26
an unbelievable investor in literally
00:38:30
every domain, private credit,
00:38:33
public equities, private tech.
00:38:36
He's just a, he's
00:38:39
>> a beast. So, he'll be doing a really
00:38:41
important keynote. He has not done one
00:38:43
of these public speaking slots in a very
00:38:45
long time. And then the second is the
00:38:47
CFO of OpenAI, Sarah Friar.
00:38:50
>> Oh wow.
00:38:50
>> Unbelievable start. And we're going to
00:38:54
double click into the entire business
00:38:55
model of OpenAI on stage in front of
00:38:57
everybody.
00:38:57
>> So uh go to allin.com and then for those
00:39:00
of you who plan ahead for travel,
00:39:01
>> more coming more coming every week.
00:39:04
>> If you are an All-In Summit fan, I can't
00:39:07
believe it. Friedberg, we're going to be
00:39:08
in our fifth year September 13th to
00:39:10
15th. It only gets better. Gets better
00:39:13
every year. We've got some good
00:39:14
parties, too. I mean, that Back to the
00:39:16
Future and the Blade Runner parties, those
00:39:17
were epic.
00:39:18
>> allin.com/events. Hey, can I give a plug
00:39:21
to friend of the pod,
00:39:23
>> Bill Gurley? He's got an amazing new
00:39:24
book, Running Down a Dream. Please,
00:39:26
>> wait, wait, wait, wait. Before we start,
00:39:28
>> before I start,
00:39:29
>> There it is. Running Down a Dream. I
00:39:30
just want everybody to just stop the
00:39:32
pause the podcast. I want you to buy
00:39:33
three copies, give it to two young
00:39:35
people and a parent. You know,
00:39:36
>> this book is incredible.
00:39:37
>> It's a great book. It's a great book. It
00:39:40
It really is inspiring for kids, and uh
00:39:42
Bill Gurley, friend of the pod. He always
00:39:44
shows up for us.
00:39:44
>> J-Cal, do an impression for us of what it
00:39:46
would be like if you and Bill Gurley
00:39:48
started a podcast together.
00:39:49
>> All right, everybody. Welcome to the
00:39:50
JCBG podcast. I'm your host, Jason
00:39:53
Calacanis, and I'm Bill Gurley, and we're
00:39:56
here in Texas at Terry Blacks where
00:39:58
we're getting some beef ribs and we're
00:40:00
going to discuss investing in
00:40:03
marketplaces
00:40:04
as well as my new book running down a
00:40:08
dream which will teach your kids how to
00:40:10
not be fuckups, and if your kids are fuckups,
00:40:14
you can hit them in the back of the head
00:40:16
with the book
00:40:18
Texas style.
00:40:20
One of the big topics uh and I think
00:40:22
something you're working on with
00:40:24
President Trump, Sax, is uh this energy
00:40:28
pledge. I've been seeing uh rumblings
00:40:30
about this. Explain what's going on in
00:40:31
terms of getting the country in sync
00:40:34
around these data centers and energy.
00:40:37
Well, the president announced in the
00:40:38
State of the Union last night that he
00:40:40
supports a ratepayer protection pledge
00:40:44
which requires uh the major tech
00:40:47
companies to provide for their own power
00:40:49
needs for AI data centers so that
00:40:52
residential consumers do not see their
00:40:53
rates going up. I think this makes total
00:40:55
sense. I think, Chamath, to your point,
00:40:58
this is the reason behind a lot of the
00:41:00
opposition to new data centers is that
00:41:02
the local residents fear that their
00:41:03
electricity prices are going to go up
00:41:05
and that shouldn't be the case. And so
00:41:07
the president has said that he's
00:41:09
committed to not allowing residential
00:41:11
rates to go up as a result of data
00:41:13
centers. It's pretty straightforward.
00:41:15
You get the big tech companies, the
00:41:16
hyperscalers to pay for the increase in
00:41:18
the electricity cost or you let them set
00:41:22
up their own power behind the meter. The
00:41:23
president's been talking about this for
00:41:25
over a year that our biggest AI
00:41:27
companies would also become big power
00:41:29
companies because we would let them uh
00:41:31
stand up their own power generation
00:41:32
behind the meter. So these data centers
00:41:36
don't even have to connect to the grid.
00:41:38
They could just do colocation
00:41:40
themselves. But also, I think that with
00:41:42
this ratepayer protection pledge, what
00:41:45
you're going to see is that it could
00:41:46
actually bring down consumer prices
00:41:49
because what happens is that when these
00:41:51
data centers then set up their own power
00:41:53
and connect to the grid, they can give
00:41:55
back the excess to the grid. Also, they
00:41:58
will make investments in scaling the
00:42:00
infrastructure. So, although electricity
00:42:03
is priced at a metered rate, the costs
00:42:05
to generate it are not all variable.
00:42:07
There's a lot of huge fixed costs in
00:42:09
there. So when you increase scale then
00:42:12
you can actually reduce the the metered
00:42:14
rate. So again you know this is really I
00:42:17
think the rebuttal to Bernie Sanders who
00:42:19
just wants to stop all progress
00:42:21
whatsoever.
00:42:22
I saw a funny post calling it BANANAs,
00:42:25
which is build absolutely nothing
00:42:26
anywhere near anyone. So this is the
00:42:30
>> BANANAs is the new NIMBY. So
00:42:32
you just can build absolutely nothing. I
00:42:34
think the president's approach finds a
00:42:36
very good balance here, which is look,
00:42:38
>> we can have progress. Just don't make
00:42:40
residential consumers pay for it. Let
00:42:41
the big tech companies pay for it
00:42:43
themselves. And I think you'll see more
00:42:44
coming out about this from the White
00:42:46
House next week.
00:42:47
>> Quite a deft move. Friedberg, how should
00:42:50
America be thinking about this great
00:42:53
data center buildout, energy usage, you
00:42:56
know, if you expand it out over the
00:42:58
coming decade? And how do you sell that
00:43:01
against the backdrop that you talk about, the
00:43:03
socialist movement? You got a great
00:43:04
interview coming out with Ray Dalio on
00:43:06
the All-In interview program next week.
00:43:09
How do you think about those competing
00:43:11
forces? You've got the socialists saying
00:43:13
BANANAs, NIMBY, slow down, decel, and then
00:43:18
you've got this incredible race we're in
00:43:20
for efficiency and this opportunity and
00:43:22
abundance. How would you sell it to kind
00:43:25
of bring these two sides together? Or is
00:43:27
it just
00:43:30
impossible? >> The data coming in and out
00:43:32
of data centers moves at roughly the
00:43:34
speed of light. So you could put them
00:43:37
anywhere. And I think that our policy
00:43:40
makers need to be very cognizant of that
00:43:43
fact. We do connect the
00:43:47
internet using high-speed cable,
00:43:50
high-speed
00:43:52
fiber optic throughout the world. And so
00:43:54
theoretically if we don't embrace and
00:43:58
allow the economic development of the
00:44:01
data center industry and it will
00:44:02
fundamentally be an industry because it
00:44:04
is almost like the new sort of oil.
00:44:07
Where are the oil rigs going to go?
00:44:09
Where are the railroads going to go?
00:44:11
Where are the telegraph lines going to
00:44:12
go? Where are the factories going to go?
00:44:14
If we don't put them here, someone else
00:44:16
will put them on their shores. Someone
00:44:18
else will put them in their country.
00:44:19
Someone else will put them in their
00:44:20
jurisdiction. And a lot of the economic
00:44:22
value that arises from the people that
00:44:25
will build those facilities, the energy
00:44:28
that will be installed to produce power
00:44:30
for those facilities, and then all of
00:44:33
the second and third order industries
00:44:35
that emerge as a result of those
00:44:36
installations, that value will accrue
00:44:39
elsewhere.
00:44:40
>> Such a good point.
00:44:40
>> So,
00:44:41
>> yeah,
00:44:42
>> it's not going to like just go away. The
00:44:44
the demand is there. The economy is
00:44:45
moving forward. AI is moving forward. We
00:44:49
live in a world with 196 countries and
00:44:52
data centers do not take up a lot of
00:44:54
space. They're very small relative to
00:44:56
the economic value that they produce. If
00:44:58
you zoom out on the map of the world,
00:45:00
all the data centers in the world fit
00:45:02
under the tip of a pin. And so this is a
00:45:06
very small footprint and if we're going
00:45:08
to give up hundreds of thousands of jobs
00:45:11
and many billions of dollars of economic
00:45:14
value creation, we're being pretty silly
00:45:15
and pretty obtuse in our view of the
00:45:17
world. I would just, like, encourage the
00:45:19
system that I think is the right system
00:45:21
and we talked about this last time where
00:45:23
provided data centers are producing
00:45:25
their own electricity. That means that
00:45:27
you're taking electricity consumption
00:45:29
off the grid because they otherwise are
00:45:32
not being used on the grid and that will
00:45:34
reduce the cost of electricity for other
00:45:36
residential and industrial users. So,
00:45:39
it's silly to think that we need to put
00:45:40
a moratorium on data centers. As soon as
00:45:42
you do that, the companies that use data
00:45:44
centers are not going to slow down.
00:45:45
they're going to go put them somewhere
00:45:46
else and we're going to miss out.
00:45:47
>> And it's such a good point, Chamath,
00:45:49
because you were recently in the Middle
00:45:50
East and I've been there a bunch in
00:45:53
Saudi, UAE. These are the folks who
00:45:56
built a large portion of those oil
00:45:58
refineries. And they are savvy to this.
00:46:01
And what are they doing in Saudi, UAE,
00:46:04
Qatar, all of these regions? They're
00:46:06
doubling down. They're 10xing their data
00:46:08
center builds. So to your point,
00:46:11
Freedberg, either we build them or
00:46:13
they're going to go somewhere else. And
00:46:14
there are people who are willing to
00:46:16
underwrite these and they're willing to
00:46:18
take out the red tape
00:46:21
from the process here and move quicker
00:46:23
than us. So we I think this is a pretty
00:46:24
deft move by President Trump to say,
00:46:27
"Hey, you guys should all just guarantee
00:46:29
that consumers don't get impacted." The
00:46:31
water thing is a total hoax. Like the
00:46:33
water is recirculated. That's a hoax. I
00:46:35
think this is really smart. I think that
00:46:37
what the president's doing and what Sax
00:46:39
is doing is really smart. The thing to
00:46:41
keep in mind is that there's still a
00:46:44
risk that prices go up and it has
00:46:45
nothing to do with these data centers
00:46:47
and it has everything to do with the
00:46:48
business model of being a utility
00:46:51
because what happens is in order to get
00:46:54
a license, a monopoly license, in an area
00:46:58
to provide energy, to generate energy, for
00:47:01
a community.
00:47:03
The exchange works in the following way.
00:47:05
You go and you present a capex plan to
00:47:08
the public utilities commission. That's
00:47:10
effectively your budget that says here
00:47:12
are the lines I'm going to upgrade. Here
00:47:14
are the generators I'm going to upgrade.
00:47:16
Independent of data centers. The reality
00:47:18
is the draw, the electricity consumption,
00:47:21
of individual Americans is going up
00:47:23
because we have more devices, we have
00:47:25
cars, we have all of these other things.
00:47:27
So what we also have to do is we have to
00:47:29
look at how utilities' business model
00:47:31
actually incentivizes them to increase
00:47:34
prices by making all kinds of
00:47:37
investments. So we have to do a good job
00:47:39
of making sure we hold everybody
00:47:40
accountable because otherwise what you
00:47:42
could see is that the data centers
00:47:45
taking on the burden for themselves but
00:47:47
prices still continuing to escalate
00:47:49
because a utility says I need to spend a
00:47:51
billion dollars this year to upgrade my
00:47:53
infrastructure. And what that allows
00:47:55
them to do is take that billion dollars
00:47:56
and essentially invest it for a return.
00:47:58
That's the business model of a utility.
00:48:00
And this is really happening in blue
00:48:03
states. Micron has a hundred billion
00:48:06
dollar mega fab in New York. And there's
00:48:08
a lawsuit by six citizens.
00:48:13
>> That's shameful.
00:48:13
>> And the project has taken 12
00:48:17
>> 1,200 days. 1,200 days
00:48:20
>> between their announcement and the
00:48:21
groundbreaking. And they spent 612 days
00:48:24
on the environmental impact study.
00:48:27
People wake up. Just go to Texas. Elon
00:48:30
built his factory here, the Gigafactory,
00:48:33
in under like 18 months. This is the
00:48:36
great state of Texas. Come here. We'll
00:48:38
build it for you and you'll be done.
00:48:40
>> Yeah. I don't know why anyone bothers
00:48:41
with the blue states anymore. They make
00:48:43
it too hard to build.
00:48:44
>> It's [ __ ] It's so dumb. It's such a
00:48:46
self-own, too. Like, what? Don't you
00:48:48
want to be part of the future? You're
00:48:49
literally hobbling the entire country to
00:48:53
scratch an itch.
00:48:54
>> By the way, there are a lot of people in
00:48:55
New York who want to work. This is not a
00:48:56
case actually of this new fab being
00:49:00
unpopular.
00:49:02
The majority of people in the area
00:49:04
actually want this plant to be built.
00:49:06
They want the jobs that are going to
00:49:07
come there. A lot of people say data
00:49:09
centers don't create a lot of jobs. This
00:49:10
is actually a chip fab. So, it will
00:49:12
create a lot of jobs. A lot of good high
00:49:14
paying jobs. People want it. But six
00:49:16
people can stop it with a lawsuit after
00:49:18
it's already been through a two-year
00:49:20
environmental review.
00:49:21
>> It's not blue and red states. These are
00:49:23
nonprofits that get organized to create
00:49:24
this kind of chaos. I remember looking
00:49:26
at a massive lithium investment in
00:49:31
Nevada. And the whole point was to
00:49:34
domesticate
00:49:35
lithium production. And what was
00:49:37
interesting is this enormous deposit
00:49:40
that's just sitting there ripe for
00:49:41
development right before they were about
00:49:43
to get environmental approvals or right
00:49:45
after there was a lawsuit by people who
00:49:48
wanted to protect the upland grouse.
00:49:50
It's seared in my mind that the upland
00:49:52
grouse of Nevada is the reason why
00:49:54
we do not have domestic national
00:49:56
security around lithium.
00:49:59
And you have to ask yourself why is this
00:50:01
possible? And it's possible because you
00:50:03
have these environmental nonprofits that
00:50:06
can go and create this chaos with
00:50:08
absolutely no risk to them. Zero. They
00:50:11
can fund raise around it and they can
00:50:12
create this chaos. I mean, Nick, to this
00:50:14
point, this is an example of Greenpeace.
00:50:17
And specifically here, they were pushing
00:50:20
back on an oil pipeline to such a degree
00:50:22
and they created so much chaos that they
00:50:24
were sued. And a North Dakota judge just
00:50:27
said that he's going to order Greenpeace
00:50:28
to pay damages. That should total almost
00:50:30
$350 million in connection to those
00:50:33
protests.
00:50:34
>> And it should not be the case that six
00:50:36
people can slow down a hundred billion dollar
00:50:38
investment package. That's not right.
00:50:40
Well, I think there's and I just want to
00:50:42
highlight this important point. There's
00:50:44
not a lot of logic and reason.
00:50:47
You guys are right. But I do think
00:50:49
there's a lot of emotion and there's a
00:50:52
huge aversion to big tech, a huge
00:50:54
aversion to wealth creation by select
00:50:56
individuals, select companies, a huge
00:50:59
aversion to economic growth that doesn't
00:51:01
benefit everyone. There's a fundamental
00:51:03
kind of underlying left behind emotion
00:51:06
that drives a lot of this. And I've said
00:51:08
it before, but I think unless there's
00:51:10
systems or mechanisms that get folks to
00:51:12
come along with the value creation ahead
00:51:14
and and help them connect their own
00:51:17
lives to the value creation that's being
00:51:18
realized, they're not going to be
00:51:20
supportive because there is this kind of
00:51:22
diametric opposition towards big tech,
00:51:25
towards the wealth gap, towards value
00:51:27
accrual to a select few companies or
00:51:29
select few individuals and this fuels
00:51:31
and feeds that. So I think fundamentally
00:51:33
maybe it's not just about giving the
00:51:35
data centers their own power capacity
00:51:38
but there's got to be mechanisms and
00:51:40
tools that help the broader population
00:51:42
understand or recognize or get some
00:51:44
benefit from it as well where they're an
00:51:46
owner in it or participant in it because
00:51:49
they have the power as we're seeing they
00:51:50
have the power to stop it. Therefore
00:51:52
they want to have some benefit for
00:51:54
providing authority to do it. And these
00:51:56
six people are concerned about housing
00:51:59
costs, worker exposure to toxic
00:52:02
chemicals, pollution in air and water,
00:52:04
greenhouse gas emissions, energy
00:52:06
consumption, flooding of the wetlands,
00:52:08
all these things that obviously could be
00:52:10
mitigated. All right, let's keep moving
00:52:12
here. We got a lot more docket to get
00:52:13
through. State of the Union came in at
00:52:16
108 minutes
00:52:19
and it's the longest in 60 years.
00:52:20
actually the longest since they started
00:52:22
tracking this. The theme of President
00:52:24
Trump's State of the Union this year,
00:52:26
America at 250, strong, prosperous, and
00:52:30
respected. Trump took a bunch of victory
00:52:32
laps, inflation, jobs, closing the
00:52:34
border, all those have gone really well.
00:52:36
But this comes to the backdrop of
00:52:38
Trump's approval rating being super
00:52:40
challenged. He started his first year at
00:52:44
plus 11.7%. Now he's at negative 14.3%, a
00:52:49
26 point swing. Economy started plus 3.4
00:52:52
down to 18.2 and trade started at 5.9%
00:52:56
and we'll talk about the tariff stuff
00:52:57
later and went down to 22.7.
00:53:01
So let's call balls and strikes here,
00:53:03
gentlemen. Favorite moments. What were
00:53:05
your favorite moments from the State of
00:53:07
the Union and just general impressions
00:53:09
of 1 hour and 45 minutes of Trump going
00:53:14
to town?
00:53:16
I thought it was great.
00:53:18
>> Favorite moment? Favorite moment or two?
00:53:21
Well, I had a couple.
00:53:24
One was the
00:53:26
Ilhan Omar, Rashida Tlaib death stare and
00:53:31
them just like losing their minds and
00:53:34
and screaming. I just thought it was so
00:53:38
unamerican. The second was when he was
00:53:40
calling for law and order where and
00:53:43
focusing and prioritizing on American
00:53:45
citizens and
00:53:47
>> illegal uh aliens
00:53:49
>> and none of the Democrats stood up. I
00:53:52
thought that was kind of foolish. It was
00:53:54
like obvious things and the Democrats
00:53:55
wouldn't applaud but this time they did
00:53:57
like they did for the hockey team which
00:53:58
I thought was like the right thing to
00:53:59
do. And then the fourth thing is just a
00:54:01
a shout out to our friend Brad Gerstner
00:54:03
who got a big shout out from the
00:54:04
president. I don't know Sax if you
00:54:05
engineered that or not but that was
00:54:07
>> that was fantastic.
00:54:09
He got like a double shout out. It was
00:54:10
like a double tap.
00:54:12
>> Yeah, that was really cool.
00:54:13
>> That was surreal.
00:54:14
>> Our group chat group chat went crazy. It
00:54:16
was really That was really cool.
00:54:18
>> Those are my four highlights.
00:54:20
>> Great. Here's your uh here's your clip
00:54:22
of uh yeah, Democrats not standing for
00:54:26
Americans uh over illegal aliens. 20
00:54:30
seconds.
00:54:31
>> If you agree with this statement, then
00:54:33
stand up and show your support. The
00:54:36
first duty of the American government is
00:54:39
to protect American citizens, not
00:54:42
illegal aliens.
00:54:52
>> Why wouldn't you stand for that? That's
00:54:53
an easy one to stand for. Doesn't make
00:54:56
any sense.
00:54:57
>> Would you stand for it, Chico?
00:55:00
>> Yeah. I mean, I I'm I'm pro I can be
00:55:02
anti-ICE, but I'm pro-American and I'm
00:55:05
pro reasonable immigration like 90% of
00:55:08
the country is. So, it just doesn't make
00:55:10
any sense. Uh, do you have any
00:55:12
highlight?
00:55:12
>> Do you think American citizens should be
00:55:14
prioritized over illegals?
00:55:15
>> Well, of course, of course. Yes. Of
00:55:18
course. Yes. And then I I also think
00:55:20
there should be a path to citizenship
00:55:22
for people who have been here for a
00:55:23
while. And I think that's what the
00:55:24
majority of the country thinks as well.
00:55:25
>> Your point is I can hold two thoughts in
00:55:27
my head. I would have stood if he asked
00:55:29
me.
00:55:30
>> Yes, obviously we should take care of
00:55:32
American citizens first. Yes. And we
00:55:34
should deport violent criminals. We've
00:55:36
been over this like a million times
00:55:37
here. This is like consensus.
00:55:39
>> But what do you what do you think is
00:55:40
going on in everybody else's head when
00:55:42
they're like we got to we cannot stand
00:55:43
for this?
00:55:44
>> These two sides I mean I think it's like
00:55:46
the tariff thing. It's like the ice
00:55:48
thing. These two sides cannot work
00:55:49
together. It's just the most polarized
00:55:52
it's ever been. Trump is not like the
00:55:54
kind of guy to reach across the aisle.
00:55:56
the Democrats are now digging in. So, we
00:55:58
just have a dysfunctional government
00:56:00
where, you know, in a more functional
00:56:02
time period, like under Clinton, let's
00:56:05
say, or Bush, people would have gotten
00:56:07
together, and I'm I'm jumping ahead to
00:56:09
the tariff discussion. And they would
00:56:10
have said, "Yeah, of course, tariffs are
00:56:11
done in Congress. That's the law.
00:56:13
Whatever. What are your thoughts, Mr.
00:56:15
President? How can we support your
00:56:16
tariff program?" But now it's like, "Oh,
00:56:18
well, we don't work together. We don't
00:56:20
actually have discussions anymore.
00:56:22
There's no bipartisan,
00:56:24
you know, collaboration. All these
00:56:26
politicians are disgraceful
00:56:28
across the board. They should be working
00:56:30
together for the American people. If the
00:56:31
president wants to do tariffs,
00:56:33
>> they should be reasonable about it and
00:56:35
give him the power to do reasonable
00:56:37
tariffs and he should be reasonable and
00:56:38
say, "Hey, I understand that's your
00:56:39
power. Let's get together and we'll
00:56:41
we'll chop it up and let's have dinner
00:56:42
together." But they're just too
00:56:43
polarized. It's just disgraceful where
00:56:45
this country has gotten to. I blame both
00:56:46
parties.
00:56:49
Whenever the Democrats get smoked out as
00:56:52
being radicals and extremists, you
00:56:54
always want to basically say a pox on
00:56:56
both your houses and blame the
00:56:57
Republicans and Democrats equally. The
00:56:59
fact of the matter is the president said
00:57:02
to the audience, to the members of
00:57:04
Congress, hey, if you agree with the
00:57:06
statement, stand up. And of course,
00:57:08
every single Democrat sat there
00:57:10
stonefaced and refused to applaud or
00:57:14
acknowledge what he was saying. This was
00:57:17
a very easy test for the Democrats to
00:57:19
pass.
00:57:20
>> What I just said,
00:57:21
>> in fact, it was what I said.
00:57:22
>> In fact, it was a political risk for the
00:57:24
president because it was so easy for the
00:57:27
Democrats to demonstrate that they're
00:57:29
operating in good faith and that they're
00:57:30
willing to be bipartisan and they're not
00:57:32
extremists and they're actually common
00:57:34
sensical and logical and they completely
00:57:37
failed the test. And by the way, it
00:57:39
wasn't just on that one. I mean, let me
00:57:40
just tell you some of the other ones
00:57:42
where they refused to applaud. So they
00:57:45
refused to applaud the grieving families
00:57:48
of innocent American women and children
00:57:50
murdered by criminal illegal aliens,
00:57:52
including the mother of Iryna
00:57:55
Zarutska.
00:57:56
>> That was very sad. That was sad.
00:57:58
>> That was unbelievable. They refused to
00:58:00
applaud for securing our homeland and uh
00:58:03
ending the invasion of criminal illegal
00:58:06
aliens, killers, rapists, gang members,
00:58:08
and traffickers. They refused to applaud
00:58:10
for unifying against political violence.
00:58:13
So, the president mentioned the
00:58:14
assassination of Charlie Kirk. They
00:58:16
would not even do a polite clap for
00:58:19
Erica Kirk and unifying against
00:58:20
political violence. They refused to
00:58:23
applaud for keeping violent criminals
00:58:25
locked up. They even refused to applaud
00:58:28
for lower prescription drug prices for
00:58:30
millions of Americans because it was
00:58:31
President Trump who orchestrated that
00:58:33
policy. And there were so many other
00:58:35
examples like that. And you know, I
00:58:37
think the reason why this speech was so
00:58:39
effective, and by the way, it's not just
00:58:41
me saying it. Something like two-thirds
00:58:43
of the people that CNN polled, so
00:58:45
two-thirds of CNN watchers, said it was
00:58:48
highly effective, and something like
00:58:49
three-quarters of CBS News viewers said it
00:58:52
was highly effective is because the
00:58:54
president laid out 80/20 issues one after
00:58:57
another, right? Or even 90/10 issues or
00:59:00
95/5 issues. I mean, these were all
00:59:03
issues where the overwhelming number of
00:59:05
Americans, I think, agree with the
00:59:07
policy the president laid out. And in
00:59:09
every single case, the Democrats
00:59:11
wouldn't even give it
00:59:12
polite applause. And that is different
00:59:14
than than in the past. And you can say
00:59:16
that's because of hyper-partisanship and
00:59:18
polarization, but it's also because of
00:59:20
another thing. It's because the
00:59:21
Democrats have become a party of
00:59:24
radicalism and extremism. And the
00:59:27
viewpoints that they expressed through
00:59:30
their aesthetics the other night, they
00:59:33
do express those things in policy and in
00:59:35
speeches all the time. So it's not just
00:59:38
like a one-off or you know somehow like
00:59:41
we have a misconception of who these
00:59:43
guys are. I think you know the big line
00:59:45
of the night was when Trump just sort of
00:59:47
said these people are crazy. I mean he
00:59:49
said it in like almost mournful and
00:59:51
regretful way. He doesn't want them to
00:59:52
be crazy. He wants them to be rational
00:59:54
so he can work with them. I mean,
00:59:56
>> but I think that point re I'll I'll
00:59:57
still point the other side, which is
00:59:59
this has been going on for a couple of,
01:00:01
>> you know, state of the unions here
01:00:03
across this. The Republicans didn't
01:00:04
stand for the Democrats often and uh
01:00:07
it's a bit of showmanship. But the truth
01:00:09
is Trump is the divider-in-chief.
01:00:11
He always is attacking people. He's
01:00:13
always mocking people. So, they don't
01:00:15
want to play ball with him. So, I do
01:00:17
think you have no choice in politics.
01:00:19
You got to counter punch.
01:00:20
>> Uh no, I don't think so. That's actually
01:00:22
the that's actually the problem with the
01:00:25
that philosophy, Trump's philosophy of
01:00:27
we have to counter punch, we have to
01:00:28
attack, we never have to apologize, we
01:00:30
never have to be reasonable. That's part
01:00:32
of what's broken down in our politics
01:00:33
and these two sides should work
01:00:35
together. We should go back to a
01:00:36
bipartisan.
01:00:37
>> Are you going to work with Ilhan Omar
01:00:39
when he is when the president going to
01:00:40
be easy, but she is the mirror. Hold on,
01:00:42
I'll finish my statement. You asked a
01:00:44
question. I think Trump is the mirror of
01:00:47
that. He has been hostile towards these
01:00:49
Democrats. He doesn't give them an inch.
01:00:50
they should be more collaborative. That
01:00:52
that's what the balance of power between
01:00:55
the executive branch, you know, and and
01:00:58
these uh you know, congressmen and and
01:01:00
the Congress and the Senate. Like this
01:01:02
is how it's supposed to work. And these
01:01:04
two sides need to learn how to get back
01:01:06
to listening to each other,
01:01:08
understanding each other's positions,
01:01:09
and then finding a middle ground. And
01:01:11
that's why the Democrats lost last time
01:01:13
because they didn't have the common
01:01:14
sense to say, "Hey, everybody wants the
01:01:16
border closed." To your point, it's a
01:01:18
90% issue. And Kamala Harris was too dumb
01:01:21
to just say, "Yeah, we should have
01:01:22
closed the border. It's closed now." And
01:01:24
uh we've got it. Anyway, the whole thing
01:01:26
is a mess. I understand you got to fight
01:01:28
for your team. I don't like the
01:01:29
>> count. You just actually made the key
01:01:31
point. You made the key point which is
01:01:33
underlying the optics and the
01:01:34
polarization. You have issues and on
01:01:37
those issues, President Trump is on the
01:01:40
side of the American people. the issues
01:01:42
where 80% of the American people agree.
01:01:44
Some huge percentage, I don't know
01:01:45
exactly what it is, thinks that the
01:01:47
Somali daycare fraud in Minnesota was an
01:01:51
outrage. And the president is right to
01:01:54
point that out. And what's the
01:01:55
Democrats' reaction? You've got Ilhan
01:01:57
Omar screaming from the audience at him.
01:02:00
>> Yeah, she's alone. I mean, at the end of
01:02:02
the day, the Democrats are not that different.
01:02:03
You did have you did have Elizabeth
01:02:06
Warren stand for uh stopping Nancy
01:02:10
Pelosi from trading stocks and that gave
01:02:11
Trump his best oneliner of the night.
01:02:13
That was his best oneliner clearly and
01:02:15
they stood for Iran too and stopping
01:02:18
Iran from being a nuclear power.
01:02:19
>> I give Elizabeth Warren credit for that.
01:02:21
>> There you go. You don't have to punch
01:02:22
her back.
01:02:23
>> Stop insider trading act without delay.
01:02:27
>> Yeah. See, that's something bipartisan.
01:02:29
Look at that, Sax. That's what you need
01:02:31
to get the country back to. But that
01:02:33
hold on. But this this disproves the
01:02:35
point you were making before. You said
01:02:37
that it was polarization.
01:02:38
>> You stood up for that. I can't believe
01:02:41
>> can't believe Nancy Pelosi stood up. If
01:02:44
she's standing, pal. Good job.
01:02:47
>> That's why he's so good is he's in the
01:02:49
moment and he's reacting to what's
01:02:50
happening in the chamber.
01:02:52
He's not just reading from a
01:02:53
teleprompter. And he nailed it. But
01:02:55
look, that moment disproves what you
01:02:57
were saying, J-Cal, because this is not
01:02:58
just about polarization. On that issue,
01:03:01
Elizabeth Warren was willing to stand
01:03:02
because she actually, to her credit,
01:03:04
wants to ban insider trading by members
01:03:06
of Congress. Yes. But on the rest of
01:03:08
those issues, like securing the border,
01:03:09
she did not stand. Why? Because she does
01:03:11
not agree with the president on that
01:03:13
issue.
01:03:14
>> We just have to get back to these sides
01:03:16
working together. That's my personal
01:03:18
feeling.
01:03:19
uh Freeberg, any thoughts on the uh
01:03:21
theatrics and uh Trump's first year at
01:03:24
large and you know the sort of back and
01:03:26
forth and is there any hope that these
01:03:29
two teams could collaborate at some
01:03:32
point on something like say the
01:03:34
ballooning deficit which Trump has not
01:03:36
gotten under control in his first year
01:03:38
and it's going to be $2.5 trillion
01:03:40
dollars added. What are your thoughts
01:03:42
here on them collaborating on anything
01:03:44
important? Friedberg, that's probably
01:03:45
one thing they can agree on is just keep
01:03:47
the money flowing. Got it. They'll both
01:03:49
give a They'll both give a a standing
01:03:51
ovation for burn more capital and put us
01:03:53
more in debt. Well said my guy David
01:03:57
Friedberg, sultan of science, it is
01:04:00
your time to shine. The world's greatest
01:04:03
moderator has decided we're going
01:04:05
directly to science corner. This is your
01:04:07
time to shine.
01:04:09
>> Sax's time to drop a deuce.
01:04:10
>> Sax went immediately off camera.
01:04:12
>> Yeah, he has to drop a deuce.
01:04:13
>> What's up?
01:04:16
>> But what do you need me for? This is
01:04:17
very important for you.
01:04:19
>> Freeberg's going to talk.
01:04:20
>> Lightning round for science corner. Go
01:04:21
Freeberg.
01:04:22
>> Wait, wait, wait. Are we doing any more
01:04:23
topics after this or can I just leave?
01:04:24
>> Yes, tariffs.
01:04:25
>> Yes, we're doing tariffs.
01:04:26
>> Wait, why would we do that?
01:04:27
>> Because I want to get the audience to
01:04:30
sleep. I'm not saying Science Corner
01:04:32
doesn't have its audience.
01:04:33
>> Listen, you can go.
01:04:34
>> Why wouldn't we do Yeah, exactly. You're
01:04:37
screaming.
01:04:37
>> Let him do his work.
01:04:38
>> Let him let him cook. Let him cook.
01:04:40
>> Freeberg, tell us about this Harvard
01:04:41
scene.
01:04:42
>> Speaking of science, I think that
01:04:43
there's a very important moment
01:04:44
happening right now. We've talked a
01:04:46
number of times on the show about
01:04:47
Yamanaka factors. These are these four
01:04:50
proteins that were discovered by Shinya
01:04:53
Yamanaka that we found later that when
01:04:56
applied to cells, mammalian cells can
01:04:59
actually reverse the age of those cells,
01:05:01
reset the epigenetic clock, reset the
01:05:04
epigenome, which is the little markers
01:05:05
on top of the DNA that turn genes on and
01:05:07
off back to a youthful state.
01:05:10
extraordinary groundbreaking work that
01:05:12
was done that won the Nobel Prize led to
01:05:14
the foundation of several companies.
01:05:16
There's a Harvard uh scientist named
01:05:18
David Sinclair. He's a bit of a
01:05:20
controversial character. Do you guys
01:05:21
know him? I think you guys one one or
01:05:23
two of you may have met him. Chamath, you
01:05:24
ever met him?
01:05:25
>> I I I followed him. I've seen his stuff.
01:05:28
So Sinclair is kind of bemoaned a little
01:05:31
bit by the scientific and academic
01:05:33
community for being a little too
01:05:34
overhyped snake oil salesman, as some
01:05:38
folks have claimed because years ago he
01:05:40
sold a company to GSK saying resveratrol
01:05:43
would reverse aging and you know he made
01:05:45
$720 million on that and it didn't end
01:05:47
up working and he's promoted certain
01:05:49
supplement companies and so on. So, I
01:05:51
want to preface with that before I kind
01:05:52
of underwrite what he's saying with this
01:05:54
next thing, but he's a co-founder of a
01:05:56
company called Life Biosciences, and
01:05:59
they've reached a major agreement with
01:06:00
the FDA to be the first company to treat
01:06:03
humans with Yamanaka factors.
01:06:06
Specifically, what they're doing is
01:06:07
they're going to be delivering these
01:06:09
Yamanaka factors, these are these
01:06:11
proteins
01:06:13
that rejuvenate cells and make them
01:06:15
youthful again into the eye. And so
01:06:18
their their first indication is to
01:06:20
actually inject them into the vitreous
01:06:22
fluid in the eyeball and they'll affect
01:06:24
the retina in the eye to address people
01:06:28
that have gone blind from glaucoma or
01:06:31
one of these kind of stroke like
01:06:33
diseases that happen in the eye. And the
01:06:35
expectation with this phase one clinical
01:06:37
trial is that the delivery of these
01:06:39
Yamanaka factors into the eye will
01:06:41
rejuvenate the retina, make it youthful
01:06:44
again, and restore vision. If it works,
01:06:47
which it's expected to because we see
01:06:48
this result happen in animal models, it
01:06:51
could be an extraordinary breakthrough,
01:06:53
not just in terms of blindness, but in
01:06:55
terms of the first human application of
01:06:58
Yamanaka factors to reverse aging. The
01:07:01
way they're doing it is they're actually
01:07:03
packaging up DNA that will make these
01:07:06
proteins into viruses, AAV virus that is
01:07:12
delivered into the eye. The virus will
01:07:14
then go into the retinal cells and then
01:07:17
will deliver this payload for this DNA
01:07:19
to make these proteins in the eye uh
01:07:21
cells and it can be turned on and off.
01:07:24
Amazingly they've created a switch
01:07:26
mechanism in it where the protein
01:07:28
production the production of these
01:07:29
Yamanaka factors can be turned on and
01:07:31
off by taking an antibiotic called
01:07:34
doxycycline. So the person that gets the
01:07:36
delivery of this drug takes the
01:07:38
antibiotic turns on the production of
01:07:40
these Yamanaka factors and then
01:07:41
theoretically their eye cells will deage
01:07:44
will get youthful and their vision will
01:07:46
be restored. So phase one clinical
01:07:48
trials underway. First time in human
01:07:50
history we're seeing Yamanaka factors
01:07:52
being delivered into humans. Literally
01:07:54
the tip of the iceberg. There are now
01:07:56
over a dozen startups that are trying to
01:07:59
deliver Yamanaka factors which are these
01:08:01
proteins or some other sort of protein
01:08:03
that can actually reverse aging by
01:08:05
restoring the epigenome in cells and
01:08:07
make them young again. So this is the
01:08:09
beginning of a wave of what I think will
01:08:11
be the most extraordinary revolution in
01:08:14
human therapeutics and ultimately could
01:08:16
lead to
01:08:18
you know some people would argue the
01:08:20
fountain of youth. Is this just a tox
01:08:22
study? Or is it mechanism?
01:08:25
>> They're not. Yeah, they're they're well
01:08:26
they'll see results. They'll see
01:08:27
results, but they're going to they're
01:08:28
going to keep dosing low. But you will
01:08:30
see results.
01:08:31
>> God, that's going to be incredible.
01:08:32
>> It's going to be incredible. By the way,
01:08:34
the number of other folks that are
01:08:36
gearing up for phase one using if not
01:08:39
the Yamanaka factors, other factors that
01:08:41
they've identified or designed as an
01:08:44
alternative to Yamanaka factors again to
01:08:46
rejuvenate the cells. And just to remind
01:08:47
folks, the way this works is it was
01:08:49
discovered that these proteins when they
01:08:51
go into a cell, they take all of those
01:08:53
little markers that sit on top of your
01:08:55
DNA that turn genes on and off and they
01:08:58
create a system that causes them all to
01:09:00
move to the right place. So, it resets
01:09:02
the markers so that those cells will
01:09:04
start to operate like they're supposed
01:09:05
to when they were young again.
01:09:07
>> That's going to be
01:09:08
>> it's going to be incredible. Yeah. Do
01:09:10
you think like No, in all seriousness,
01:09:11
people's knees or joints or what where
01:09:14
do you think it could flow to next in
01:09:16
arthritis?
01:09:17
>> Yep. Um and and by the way, when applied
01:09:19
and if it's distributed in the skin,
01:09:21
they've seen some results in monkeys
01:09:22
where like wrinkles go away.
01:09:24
>> It like literally makes these cells all
01:09:26
work youthful again. And so a lot of the
01:09:28
damage that happens over time is not
01:09:31
damage to DNA. It's damage to the
01:09:33
epigenome. It's the parts that sit on
01:09:36
top of the DNA that turn genes on and
01:09:38
off. And they get moved to the wrong
01:09:39
place as you get older. And by resetting
01:09:41
them and getting them back to the right
01:09:42
place, boom, the cell is young again,
01:09:44
the organ is young again, and suddenly
01:09:46
you look and act and feel young again.
01:09:48
It's an incredible technology. We're
01:09:50
just at the early stages, the early
01:09:52
innings of turning it into therapeutics.
01:09:54
Again, the discovery goes back to 2006
01:09:57
and now we're starting to see it get
01:09:58
into clinic.
01:09:59
>> All right, let's rejuvenate. Let's
01:10:00
rejuvenate some hairlines on this
01:10:02
podcast. Uh, that would be next up.
01:10:04
>> Speak for yourself.
01:10:05
>> I don't know. You got You got a little
01:10:07
uh You got a little peaks going there,
01:10:08
my brother. A little peak.
01:10:09
>> What are you talking about, bro? My
01:10:10
hairline's incredible. I'm 50.
01:10:12
>> I mean, it's not bad for 50. I give you
01:10:14
credit. You're You're holding your own.
01:10:15
All right, let's talk about our final
01:10:16
topic. SCOTUS struck down Trump's
01:10:19
emergency powers tariffs. Last Friday,
01:10:22
SCOTUS voted 6-3 against President
01:10:25
Trump's tariffs. Six judges voted
01:10:28
against: three conservatives, Roberts,
01:10:30
Barrett, Gorsuch, uh, and three liberals
01:10:33
via Bloomberg. This is the biggest
01:10:35
rebuke of existing executive policy in
01:10:39
91 years, since SCOTUS struck down
01:10:41
FDR's first New Deal in 1935.
01:10:45
A UPenn Wharton analysis says the tariffs
01:10:48
collected about 175 billion to date, 50%
01:10:51
of all tariff duties, um, might wind up
01:10:54
being refunded. This is going to take
01:10:57
some time to sort out in the courts.
01:10:59
2,000 importers have already filed for
01:11:01
refunds. We we talked about it here. I
01:11:04
think the majority of people felt like
01:11:05
this is the way the decision would go.
01:11:08
And we talked about here that there were
01:11:10
other options for President Trump to
01:11:12
pursue. He immediately said he was not
01:11:16
deterred and invoked a 15% global tariff
01:11:18
across the board via section 122 of the
01:11:21
1974
01:11:23
Trade Act. Here's your Polymarket: Will
01:11:26
the court force Trump to refund tariffs?
01:11:29
18% chance, but it spiked to 40% after the
01:11:32
SCOTUS decision. How will Congress
01:11:34
react? Polymarket says 3% chance.
01:11:37
Congress passes any tariffs by March
01:11:40
31st. So again, as I referenced earlier,
01:11:44
these two sides just can't seem to work
01:11:46
together. And that would have resolved
01:11:48
the whole thing. Sax, you want to um
01:11:51
give us your take here?
01:11:52
>> First of all, I don't think that the
01:11:53
tariffs are going away. What the court
01:11:56
basically indicated especially the
01:11:58
70-page Kavanaugh dissent is that
01:12:00
there's multiple alternative bases in
01:12:02
law for the tariffs in existing law. So
01:12:06
for example section 122 of the trade act
01:12:10
of 1974
01:12:12
enables temporary 150-day tariffs of up
01:12:16
to 15% to address balance of payments
01:12:19
issues and the president has already
01:12:21
invoked this. So we are now operating
01:12:23
under that. What the 150 days is going
01:12:25
to do is buy the administration time to
01:12:28
substantiate via studies and agency
01:12:32
reviews. What it needs to prove in order
01:12:34
to invoke more sweeping tariff authority
01:12:38
under section 301 of the trade act and
01:12:40
under section 338 of the tariff act.
01:12:45
Section 301 authorizes tariffs
01:12:46
responding to unfair foreign trade
01:12:49
practices. Section 338 of the tariff act
01:12:52
allows tariffs against countries
01:12:55
discriminating against US commerce. The
01:12:57
Kavanaugh dissent actually provided a
01:12:59
roadmap for the administration to put in
01:13:03
place tariffs using one of these
01:13:05
alternate uh bases. So, I think that one
01:13:08
way or another, the tariff policies of
01:13:09
this administration and the favorable
01:13:12
trade deals that they allow us to strike
01:13:13
with many nations, they will continue.
01:13:15
And I think the court seems to know that
01:13:18
because the majority opinion and
01:13:20
concurrences collectively said nothing
01:13:23
about how the administration should go
01:13:24
about refunding the tariff revenue
01:13:27
already collected. I think that if they
01:13:29
expected this decision to end the tariff
01:13:31
policies altogether, they probably would
01:13:33
have said something about that. And I
01:13:36
think that brings up a really important
01:13:37
point just on the merits here, which is
01:13:40
why would we want to give back hundreds
01:13:42
of billions of dollars to a bunch of
01:13:44
importers when we're trillions of
01:13:46
dollars in debt? And I'll just say that
01:13:48
the people who originally predicted that
01:13:50
somehow these tariffs would be
01:13:51
catastrophic for the economy. Those
01:13:53
predictions all proved not to be true. So
01:13:56
I think that this is ultimately, I
01:13:58
think, going to be a popular policy. The
01:14:00
administration will figure out a
01:14:01
different way to do it. And I predict
01:14:02
that future administrations, whether
01:14:04
they're Republican or Democrat, will
01:14:07
keep some version of the tariffs in
01:14:08
place, but I think that they will be
01:14:10
ultimately popular on a long-standing
01:14:12
basis. Chamath, your thoughts?
01:14:15
>> I think we've proven the experiment has
01:14:17
been successful.
01:14:20
>> What was the experiment?
01:14:22
We needed to smoke out what the right
01:14:25
balance of trade should be between the
01:14:27
United States and all of its partner
01:14:29
countries. I think that what we
01:14:31
uncovered is that for the most part they
01:14:33
were structural imbalances that were
01:14:36
made not because they made economic
01:14:37
sense for America, but it was just part
01:14:40
of a hodgepodge of globalist drivel that
01:14:44
people just bought into. And if you
01:14:45
strip all that stuff away, we had a
01:14:48
hollowed-out manufacturing class and we
01:14:50
have a hollowed-out middle class and the
01:14:52
tariffs will create more equality for
01:14:55
the American worker in the end. So now I
01:14:57
think the debate should be about how to
01:14:59
implement these in a structural and
01:15:01
permanent way.
01:15:03
I think we talked about this before
01:15:04
Jason that this was sort of expected and
01:15:08
>> there are many other mechanisms. I think
01:15:10
the president activated one of them
01:15:12
immediately.
01:15:13
>> I don't think this is going away and I
01:15:15
don't think it should go away. So I
01:15:17
think now the point is
01:15:19
Congress really should ratify these
01:15:21
things because it is clear that it was
01:15:23
the right thing to do
01:15:25
and if they don't then you know the
01:15:28
president still has a lot of room to get
01:15:29
these done but these make smart economic
01:15:31
sense in my opinion.
01:15:32
>> Freeberg any thoughts on the ruling?
01:15:34
Does it give you
01:15:37
some I don't know um respect for the
01:15:40
courts that they made a judgment not
01:15:42
along party lines for once? Uh does it
01:15:45
Yeah. Okay. Exactly.
01:15:46
>> Yeah. And I think I think that all I
01:15:48
think all Americans
01:15:50
should feel assured and comforted in the
01:15:53
fact that I think a lot of people view
01:15:55
the Supreme Court as having a high
01:15:56
degree of partisanship.
01:15:59
The fact that the president despite
01:16:01
having a majority of what others would
01:16:03
think were kind of politically aligned
01:16:05
appointees on the court had a ruling
01:16:07
that he did not want.
01:16:10
I think should give everyone good faith
01:16:11
that the system that the founders set up
01:16:14
is working that there is a judicial
01:16:15
branch that adjudicates the law against
01:16:18
the executive branch when they think
01:16:19
that um it doesn't map and I think that
01:16:22
that was very important to see. So, you
01:16:25
know, clearly the debate about
01:16:27
tariffs, the economic effect of tariffs,
01:16:29
the security structural trade
01:16:31
relationship effect of tariffs and the
01:16:33
importance of that is a separate
01:16:34
conversation, but I do think that the
01:16:35
read of the law being what I would say
01:16:37
is nonpartisan with respect to the
01:16:39
court's action is important and probably
01:16:42
very valuable. I'll reiterate that this
01:16:44
is a great moment, I think, for uh the
01:16:47
Supreme Court to make a thoughtful
01:16:49
decision. And I think we need to think
01:16:51
about executive power a whole bunch.
01:16:53
Whether it's Biden with student loans or
01:16:55
Trump with tariffs, we have this
01:16:58
beautiful system set up by the founding
01:16:59
fathers. I know it's frustrating.
01:17:01
Gridlock's frustrating. Having to work
01:17:03
together is frustrating. Trust me, we we
01:17:05
all come to this podcast every Thursday.
01:17:08
We have to work together. It's hard to
01:17:09
work together. You got to learn to work
01:17:11
together. And we don't want an executive
01:17:13
branch that can unilaterally just roll
01:17:15
over the other branches. And uh that's
01:17:18
going to end. I think Trump's going to
01:17:19
lose the midterms and we're going to get
01:17:21
to more chaos again and we might as
01:17:24
well start this reconciliation process
01:17:26
of these two sides stopping their lawfare
01:17:29
against each other and working
01:17:31
together for the American people on the
01:17:33
important issues. The tariffs there are
01:17:36
some fundamentally important things that
01:17:38
Trump was doing there and they were
01:17:40
working. They could have been chaotic.
01:17:43
uh that's a reasonable um you know
01:17:46
criticism of them because business
01:17:47
owners didn't know what to do. So Trump
01:17:49
did it in a chaotic way. That's just a
01:17:50
fact. He should have done it in a more
01:17:52
thoughtful way and the Congress should
01:17:54
have been alongside him saying, "Hey,
01:17:56
what tools do you need? How can we help
01:17:58
support this? We know that there's trade
01:18:00
imbalances. We know that people are
01:18:03
being unfair. Let's work together as one
01:18:05
America to negotiate these things." So
01:18:07
both sides start having dinner together,
01:18:10
start playing cards together, and do
01:18:11
what we do here on this podcast, which
01:18:13
is you fight it out. You argue, but then
01:18:15
you come together and try to find some
01:18:16
resolutions for this stuff. So they
01:18:18
should go and tell Trump, "Hey, we'll
01:18:21
approve all the tariffs you did. We will
01:18:23
not force you to get refunds." The
01:18:25
Congress should come out and just say
01:18:26
that. And then they should say, "Hey,
01:18:28
and when you want to do them in 2026,
01:18:30
just run them by us or ask us for some
01:18:32
parameters that you want and let's just
01:18:34
be thoughtful about it. These are our
01:18:35
concerns. That's it. Thank you for
01:18:37
coming to my TED talk. Can
01:18:38
>> I just do one?
01:18:39
>> Absolutely. I'm sure you have some
01:18:40
debate club points that you want to
01:18:42
point out.
01:18:42
>> Well, I just want to make one. I mean,
01:18:43
do you think Susan Rice is going to
01:18:46
respect your call for comity and
01:18:50
basically working together Kumbaya?
01:18:53
>> I want the esprit de corps. No, I
01:18:54
don't. I think both sides. She just had
01:18:57
a diatribe where she basically said
01:18:58
that Republicans and actually not just
01:19:02
like partisan Republicans, but even tech
01:19:04
companies that merely were working with
01:19:06
the administration should expect to get
01:19:08
prosecuted. I mean, she was basically
01:19:10
outright saying she's she and the
01:19:12
Democrats are going to pursue lawfare as
01:19:14
soon as they get back in charge.
01:19:16
>> They're they're absolutely going to do
01:19:17
that. Just like when Trump got in, he
01:19:19
went after Comey, he went after Jerome
01:19:21
Powell. The lawfare is happening on both
01:19:22
sides. Both sides need to drop the
01:19:24
lawfare. We need to get rid of these
01:19:25
pardons. They're ridiculous. And these
01:19:27
team this, we have to be a team. So,
01:19:30
let's just get some esprit de corps and
01:19:31
teamwork going in Washington DC. And
01:19:34
that's what we should vote for in the
01:19:35
midterms. We should vote for moderates
01:19:37
who want to work together. And in 2028,
01:19:40
we should have some kind of moderates
01:19:43
and tickets that want to work together.
01:19:46
That would be better for all Americans.
01:19:48
This kind of chaos is not good, folks.
01:19:50
All right. Listen, this has been another
01:19:51
amazing episode of the All-In podcast,
01:19:53
your favorite podcast. Uh, like,
01:19:55
subscribe, whatever the hell you want to
01:19:57
do on your own time for David Sachs,
01:20:00
David Freeberg, Chamath Palihapitiya. Love you,
01:20:03
boys. I am the world's greatest moderator.
01:20:05
See you next time. Bye-bye. Bye-bye.
01:20:09
>> We'll let your winners ride.
01:20:16
And it said, "We open sourced it to the
01:20:18
fans and they've just gone crazy with
01:20:20
it."
01:20:21
>> Queen of
01:20:29
>> besties are
01:20:32
my dog taking notice your driveways.
01:20:37
>> Oh man, my habitasher will meet up.
01:20:40
>> We should all just get a room and just
01:20:41
have one big huge orgy cuz they're all
01:20:43
just useless. It's like this like sexual
01:20:45
tension that we just need to release
01:20:46
somehow.
01:20:51
>> Your feet.
01:20:53
We need to get merch.
01:20:55
>> I'm going all in.
01:21:03
I'm going all in.

Badges

This episode stands out for the following:

  • Most chaotic
  • Most viral

Episode Highlights

  • The Viral Substack Post
    A fictional post about AI's impact goes viral, causing market turmoil.
    “Dr. Doom level stuff.”
    @ 08m 36s
    February 28, 2026
  • AI and Job Dynamics
    As AI evolves, the demand for software engineers is rising, despite fears of job loss.
    “Even if we 10x or 100x the productivity of software engineers, the demand will be there.”
    @ 21m 51s
    February 28, 2026
  • The Future of Work
    The role of knowledge workers is changing as AI takes over repetitive tasks.
    “Every single knowledge work job is being automated right now.”
    @ 25m 38s
    February 28, 2026
  • The New Economy
    The barrier to starting a company is lowering, enabling more innovation and entrepreneurship.
    “It's going to be boom town. People are going to start more companies.”
    @ 28m 20s
    February 28, 2026
  • Data Centers Facing Opposition
    Over 100 data center projects are currently facing local opposition, with 20 in Q2 alone.
    “There are 100 data center projects right now that are facing some form of local opposition.”
    @ 36m 25s
    February 28, 2026
  • Economic Impact of Cancellations
    If 7 gigawatts get canceled this year, it could lead to $70 billion in lost revenue.
    “Now you’re talking about 130 billion of lost revenue over these two years.”
    @ 37m 21s
    February 28, 2026
  • State of the Union Highlights
    Trump's State of the Union was the longest in 60 years, focusing on America’s strength.
    “America at 250, strong, prosperous, and respected.”
    @ 52m 26s
    February 28, 2026
  • Political Polarization
    The discussion highlights the extreme division in American politics today, with both sides failing to collaborate.
    “It's just disgraceful where this country has gotten to.”
    @ 56m 43s
    February 28, 2026
  • Yamanaka Factors Breakthrough
    A groundbreaking agreement allows the first human trials of Yamanaka factors to reverse aging.
    “If it works, it could be an extraordinary breakthrough.”
    @ 01h 06m 47s
    February 28, 2026
  • Tariff Refunds in Limbo
    Over 2,000 importers have filed for refunds, but the court's decision may delay this process.
    “This is going to take some time to sort out in the courts.”
    @ 01h 10m 57s
    February 28, 2026
  • Political Cooperation Needed
    The discussion emphasizes the need for bipartisan cooperation to address tariffs effectively.
    “Both sides need to drop the lawfare.”
    @ 01h 19m 24s
    February 28, 2026

Key Moments

  • Conspiracy Theories @ 00:04
  • Market Analysis @ 02:49
  • AI Uncertainty @ 10:40
  • Consumer Behavior @ 14:50
  • AI Job Market @ 19:42
  • Automation Revolution @ 25:38
  • Economic Growth @ 31:56
  • Data Center Opposition @ 36:25

Related Episodes

  • Trump Takes On the Fed, US-Intel Deal, Why Bankruptcies Are Up, OpenAI's Longevity Breakthrough
  • IPOs and SPACs are Back, Mag 7 Showdown, Zuck on Tilt, Apple's Fumble, GENIUS Act passes Senate
  • Trump AI Speech & Action Plan, DC Summit Recap, Hot GDP Print, Trade Deals, Altman Warns No Privacy
  • Trump Brokers Gaza Peace Deal, National Guard in Chicago, OpenAI/AMD, AI Roundtripping, Gold Rally
  • AI Psychosis, America's Broken Social Fabric, Trump Takes Over DC Police, Is VC Broken?
  • Inside the White House Tech Dinner, Weak Jobs Report, Tariffs Court Challenge, Google Wins Antitrust