
Peter Thiel: The Coming Collapse No One Is Prepared For

September 13, 2024 / 45:33

This episode features Peter Thiel discussing various topics including risk-taking, political involvement, election predictions, technology innovation, and the future of AI.

Thiel shares a quote about risk-taking, emphasizing that in a rapidly changing world, not taking risks is the biggest risk. He reflects on his political stance, expressing support for Trump and JD Vance while discussing his decision not to donate politically this cycle.

He predicts that Trump may win the upcoming election but expresses disappointment in the political landscape, suggesting that elections often lead to buyer's remorse. Thiel also discusses election integrity, advocating for reforms similar to those in other Western democracies.

On technology, Thiel critiques the current state of innovation, particularly in the U.S., and discusses the importance of AI, comparing its current state to the internet in 1999. He emphasizes the need for clarity in investing in AI and the significance of companies like Nvidia.

Thiel concludes by expressing a balanced view on optimism and pessimism, advocating for human agency in shaping the future.

TL;DR

Peter Thiel discusses risk, politics, technology innovation, and the future of AI, emphasizing human agency and the need for reform in various sectors.

Video

00:00:00
Peter was the person who told me this
00:00:02
really pithy
00:00:05
quote in a world that's changing so
00:00:08
quickly the biggest risk you can take is
00:00:10
not taking any risk this guy is a tough
00:00:13
nut to try to sort of explain changed
00:00:16
money with PayPal was the first outside
00:00:18
investor in Facebook uh backed Palantir which
00:00:21
is I believe they helped find Osama Bin
00:00:23
Laden almost certainly the most
00:00:25
successful technology investor in the
00:00:28
world I don't think the future is fixed
00:00:31
I think what matters is a question of
00:00:32
agency what I think works really well
00:00:36
are sort of one-of-a-kind companies how
00:00:39
do you get from zero to one what great
00:00:41
business is nobody building tell me
00:00:42
something that's true that nobody agrees
00:00:44
with you
00:00:45
on all right Peter welcome back it's
00:00:48
good to see
00:00:49
[Applause]
00:00:51
you
00:00:53
um you don't do this too often uh so we
00:00:56
do appreciate it uh but when you do do it
00:00:58
you're always super candid and
00:01:00
appreciate that as well you fit right in
00:01:01
here um you're sitting this year's
00:01:04
political cycle out right politics well
00:01:08
no I mean I think this is a question we
00:01:10
all have which is you were very active
00:01:13
um you bet on JD in a major way um he
00:01:17
delivered today it was a very impressive
00:01:18
uh discussion why aren't you involved
00:01:21
this this cycle where it's very
00:01:23
confounding to us because these are your
00:01:24
guys man how much time do we have
00:01:26
supposed to talk about this for two hours
00:01:28
or something um um I I don't know look I
00:01:32
have a lot of conflicted thoughts on it
00:01:34
uh I I am still very strongly pro-Trump
00:01:39
Pro uh Pro JD I've decided not to uh
00:01:43
donate any money politically but uh I'm
00:01:45
supporting them in every other way uh
00:01:48
every other way possible uh I you know
00:01:51
uh obviously uh you know I think uh I
00:01:56
think
00:01:57
there's my my pessimistic thought is
00:01:59
that uh Trump is is going to win and
00:02:02
probably will win by a big margin uh
00:02:05
he'll do better than the last time and
00:02:08
it'll still be really disappointing
00:02:09
because you know the elections are
00:02:11
always a relative Choice and then once
00:02:13
someone's president it's an absolute and
00:02:16
they you get evaluated you know do you
00:02:17
like Trump or Harris better and that
00:02:20
seems there seem to be a lot of reasons
00:02:22
you know and that one would be more
00:02:24
anti-Harris than anti-Trump let's again
00:02:26
no one's Pro any of these people it's
00:02:28
all it's all negative right um and uh
00:02:31
but then after after they win there will
00:02:33
be a lot of buyer's remorse and
00:02:35
disappointment that's sort of that's
00:02:36
sort of the arc that I I see of what's
00:02:38
going to happen and it's it's somewhat
00:02:40
under motivating um I don't know just to
00:02:42
describe describe it I I think you know
00:02:45
I think it's I think the odds are
00:02:47
slightly in favor of trump but it's
00:02:48
basically
00:02:49
50/50 um my one contrarian view on the
00:02:52
election is that it's not going to be
00:02:53
close uh you know most presidential
00:02:55
elections aren't and uh you know one
00:02:57
side just breaks you know 2016 2020
00:03:00
were super close but two-thirds of the
00:03:02
elections aren't and you know you can't
00:03:03
always line things up and figure it out
00:03:06
I think either the you know the Kamala
00:03:07
bubble will burst or you know maybe
00:03:09
maybe the Trump voters get really
00:03:11
demotivated and don't show up but I
00:03:12
think you know one side is is simply
00:03:14
going to collapse in the next uh in the
00:03:16
next two months and then you know if you
00:03:18
want to get involved you know with all
00:03:20
the headaches that come with being
00:03:21
involved if it makes a difference
00:03:23
counterfactually and if it's a really
00:03:25
close election everything makes a
00:03:26
difference if it's if it's not even
00:03:28
close I don't think it makes much of a
00:03:29
difference if if it is going to be close
00:03:31
by the way if it's if it's like going to
00:03:33
be a razor thin close election then um
00:03:35
then I'm pretty sure Kamala will win
00:03:37
because um because they will cheat they
00:03:39
will fortify it they will steal the
00:03:41
ballots and so so uh so you know if we
00:03:45
if we can if we can if we can and so
00:03:47
then in the event in the event that it's
00:03:50
close I don't want to be involved in the
00:03:52
event that it's not close I don't need
00:03:54
to be involved and so that's sort of
00:03:56
that's sort of a straightforward analysis
00:03:57
right there jumping off point how much
00:04:00
cheating on a percentage basis do you
00:04:02
think happens every year how much and do
00:04:05
you think Trump actually careful with
00:04:07
the verb so you know um cheating
00:04:10
stealing that implies something happened
00:04:12
in the dark of night I think the verb you're
00:04:14
allowed to use is fortify okay yeah we
00:04:17
don't want to get canceled on YouTube ballot
00:04:19
harvesting I mean it was you know it's
00:04:21
all sort of there were all these rule
00:04:22
changes it was sort of done in plain
00:04:24
daylight and uh um but yeah I think I
00:04:27
think our elections are not they're not
00:04:29
perfectly clean otherwise we could
00:04:31
examine it we could have vigorous debate
00:04:33
about it well what would you change then
00:04:34
what what should change cuz we all want
00:04:36
everybody's votes to count we want it to
00:04:38
be clean um I'm I'm talking about the
00:04:41
audience here I know at a minimum you
00:04:43
you'd run them you'd try to run
00:04:44
elections the same way you do it in
00:04:46
every other Western democracy you have
00:04:47
one day voting you have practically no
00:04:50
absentee ballots um you have um and um
00:04:55
and it's you know it's it's it's it's
00:04:57
it's one day where everything happens
00:04:59
it's not this uh two two-month elongated
00:05:02
process that's the way you do it in
00:05:04
every other country you you'd have you'd
00:05:05
have some somewhat stronger voter ID and
00:05:09
you know make sure that you know the
00:05:11
people who are voting have a right to
00:05:12
vote um make it a national holiday
00:05:15
that's that's basically that's basically
00:05:17
what you do in every other Western
00:05:18
democracy and it's it's it and it used
00:05:20
to be much more like that in the US I
00:05:21
mean it's it's it's meaningfully decayed
00:05:23
over the last 20 30 years you know 20 30
00:05:26
years ago 30 40 years ago you you got
00:05:28
the results on the day of of the vote
00:05:30
and that sort of stopped happening a
00:05:32
while ago what would make you not
00:05:34
disappointed so Trump gets elected how
00:05:37
do you what's your counternarrative on
00:05:40
you know where a year or two years past
00:05:43
the election Trump is president what
00:05:45
what makes you say I'm surprisingly not
00:05:47
disappointed what takes place man it's
00:05:52
um you know it's I I I think there are
00:05:55
some extremely difficult problems that
00:05:57
it's it's it's really hard to know how
00:05:59
how to how how to solve them I wouldn't
00:06:01
know what to do but uh we have you know
00:06:03
we have an incre incredibly big deficit
00:06:07
and uh and yeah if you can if you can
00:06:11
find some way to meaningfully reduce the
00:06:13
deficit with no tax hikes and without
00:06:16
without GDP contraction well you you
00:06:19
would do it if you got a lot of GDP
00:06:21
growth maybe right but um if you could
00:06:23
if you could meaningfully reduce the
00:06:24
deficit with with um with no with no tax
00:06:28
hikes that would be that would be very
00:06:30
impressive you know I I think we're sort
00:06:32
of sleepwalking into Armageddon with uh
00:06:35
you know the Ukraine and um the conflict
00:06:38
and Gaza are just sort of the warm-ups to
00:06:40
the China Taiwan war and uh and so if uh
00:06:45
if if Trump can find a way to head that
00:06:47
off that would be incredible if they
00:06:49
don't go to war in four years that that
00:06:50
would be that would be better than I
00:06:52
would expect possibly in relation to
00:06:54
Taiwan if Trump called you and asked
00:06:56
should we defend it or not in this acute
00:06:59
case
00:07:00
would you advise to let Taiwan be taken
00:07:03
by China in order to avoid a nuclear
00:07:07
Holocaust in World War III or would you
00:07:10
believe that we should defend it and
00:07:11
defend free countries like that well I I
00:07:14
I think um I think you're probably not
00:07:17
supposed to
00:07:18
say no no no if you no if you um you I I
00:07:23
I think look I think there's so many
00:07:24
ways our our policies are messed up but
00:07:26
probably you know the one the one thing
00:07:29
that's roughly correct on the Taiwan
00:07:31
policy is that uh uh we don't tell China
00:07:34
what we're going to do um and what we
00:07:36
tell them is we don't know what we'll do
00:07:38
and we'll figure it out when you do it
00:07:39
which is probably has the virtue of
00:07:41
being correct and uh and then I think if
00:07:43
you had yeah if you had a red line at
00:07:45
Quemoy and Matsu the islands you know five
00:07:48
miles off the coast of China that's
00:07:50
unbelievable if you um if you um if you
00:07:53
say we want some guard rails and we
00:07:55
won't defend Taiwan then they'd get
00:07:57
invaded right away so I think I think I
00:07:59
think the policy of um not saying what the policy is and
00:08:04
maybe not even having a policy you know
00:08:07
in some ways is relatively the best I I
00:08:09
I think anything anything anything
00:08:11
precise you say that's going to just
00:08:13
lead to war right away but what do you
00:08:15
believe is it um worth defending or not
00:08:18
worth starting the conflict democracy in
00:08:20
this tiny Island worth going to war over
00:08:23
or not according to Peter Thiel it's um
00:08:26
it's not worth uh it's not worth World
00:08:28
War III
00:08:29
um and I I still think it's it's uh
00:08:32
quite catastrophic if it uh if it gets
00:08:35
taken over by the Communists how does
00:08:36
the world those can both be true how
00:08:37
does the world
00:08:38
divide if we end up in a heightened
00:08:42
escalation is China Russia Iran friends
00:08:46
is that an alliance is that an axis that
00:08:48
forms you know think out the next decade
00:08:51
in kind of your base case and I don't
00:08:53
know what happens how the world I don't
00:08:55
know what happens militarily if there's
00:08:57
a China Taiwan Invasion I mean may maybe
00:09:00
we roll over maybe it
00:09:02
escalates you know all the way to
00:09:04
nuclear war probably it's it's you know
00:09:07
some very messy in between thing sort of
00:09:10
like what what you have in in the
00:09:11
Ukraine uh what what I think happens
00:09:13
economically is very straight forward I
00:09:15
think uh I think basically um you know
00:09:18
you have with Russia and Germany you had
00:09:20
one Nord Stream pipeline and we have the
00:09:23
equivalent of a 100 pipelines between
00:09:25
the US and China and they they all blow
00:09:27
up you I I met the TikTok CEO about
00:09:30
about a year ago and um you know I not
00:09:33
maybe I wouldn't have said this now but
00:09:35
what I told him and I felt was very
00:09:37
honest advice was you know you don't
00:09:39
need to worry about the us we're never
00:09:41
going to do anything about TikTok we're
00:09:42
too incompetent um but um but but if I
00:09:46
were in your place I would still get the
00:09:48
business out of China I would get the
00:09:50
computers out the people out I'd
00:09:51
completely decouple it from ByteDance
00:09:54
because um TikTok will be banned 24
00:09:57
hours after the Taiwan invasion
00:10:00
and if you think there's a 50/50 chance
00:10:02
this happens and that will destroy you
00:10:04
know 100% of the value of the TikTok
00:10:07
franchise what was his reaction
00:10:09
um you know uh he said that they had
00:10:12
done a lot of simulations and there were a bunch
00:10:15
of companies in World War I and World
00:10:16
War II that managed to sell things to
00:10:18
both sides he doesn't seem so bright to
00:10:20
me do you think he's um he no he donly
00:10:23
what's your take on him he he didn't
00:10:25
disagree with my frame and so I always I
00:10:27
always find that flattering if someone
00:10:28
basically accepts the framing so he seemed he seemed
00:10:31
perfectly bright to me even
00:10:35
though lot of bright people I saw you
00:10:37
give this I saw you give a talk last
00:10:39
summer with Bari Weiss and you talked
00:10:41
about this decoupling should be
00:10:43
happening you weren't saying should you
00:10:45
were recommending that every industry
00:10:47
leader consider decoupling from China I
00:10:50
think your comment was it's like picking
00:10:51
up nickels in front of a freight train
00:10:53
you remember saying that the well I I I
00:10:57
think like it's it's there are a lot of
00:10:58
different there are a lot of different ways in
00:11:01
which businesses are coupled to China
00:11:03
there were investors that tried investing
00:11:06
there are people who tried to compete
00:11:08
within China there are people who built
00:11:09
factories in China for export um and um
00:11:12
you know there different parts of that
00:11:14
that worked to um to to varying degrees
00:11:17
but uh but yeah
00:11:19
my um my I I certainly would not try to
00:11:24
um to invest in a company that competed
00:11:27
domestically inside China I think that's
00:11:30
that's virtually impossible um I think
00:11:34
it's probably quite tricky even to
00:11:37
invest in Chinese Chinese businesses um
00:11:40
uh and uh and then and then there is
00:11:43
there is sort of this model of you know
00:11:45
building factories in China uh for uh
00:11:48
for export uh to the west and um it was
00:11:51
it was a very big Arbitrage these things
00:11:53
do work you know I me I visited the
00:11:55
Foxconn factory nine years ago and it's
00:11:58
you know you have people get paid a
00:11:59
dollar and a half $2 an hour and they
00:12:01
work 12 hours a day and they live in a
00:12:03
dorm room with two bunk beds where uh
00:12:06
you know you get eight people in the
00:12:07
dorm room someone's sleeping in your bed
00:12:08
while you're working and vice versa and
00:12:11
uh and you sort of realize they're
00:12:13
really far behind us or they're really
00:12:14
far ahead of us and either way you know
00:12:16
it's it's not that straightforward to
00:12:18
just uh shift the um the iPhone
00:12:20
factories to the United States um so I I
00:12:23
sort of understand you know why a lot of
00:12:27
businesses ended up there and why why
00:12:30
this is the the arrangement that we have
00:12:32
but uh but yeah my my my intuition for
00:12:36
you know what is going to happen without
00:12:38
making any normative judgments at all is
00:12:40
it is going to decouple how inflationary
00:12:42
will that
00:12:44
be it presumably is it's presumably
00:12:48
pretty inflationary yeah that that's
00:12:50
that's probably the you know I I don't
00:12:53
know it's and you'd have to sort of look
00:12:55
at you know what the elasticities of
00:12:57
all these goods are so that's true
00:12:59
what's the policy re probably not that
00:13:02
it may not be as inflationary as people
00:13:03
think because um people always model
00:13:06
trade in terms of uh pairwise in terms
00:13:09
of two countries so if you literally
00:13:11
have to move the people back to the US
00:13:13
that's that's insanely expensive I don't
00:13:15
you know I don't know how much would
00:13:16
cost people to build an
00:13:18
iPhone you just you just well I think
00:13:20
India is sort of too messed up but you
00:13:22
shift it to like Vietnam Mexico there
00:13:24
are you know there there are 5 billion
00:13:26
people living in countries where the
00:13:28
incomes are lower than China and so um
00:13:31
and so you know probably the um the
00:13:34
negative sum trade policy we should have
00:13:36
with China is um you know we should just
00:13:39
shift it to other countries which is a
00:13:41
little bit bad for the US extremely bad
00:13:44
for China and let's say really good for
00:13:46
Vietnam that's kind of um and that's
00:13:49
kind of the the negative sum policy um
00:13:53
that uh that's going to manifest as this
00:13:55
sort of uh decoupling happens let's talk
00:13:58
about avoiding it for a second here
00:14:00
Trump seems to be extremely good with
00:14:02
dictators and authoritarians uh Kim
00:14:04
Jong-un seems like a big fan I mean that
00:14:06
in like as a compliment as a superpower
00:14:08
right like he doesn't have a problem
00:14:10
talking to them he connects with them
00:14:12
and they seem to like him so what would
00:14:16
be the path to him working with XI to
00:14:18
avoid this is there a path to avoid this
00:14:21
because we were sitting here last year
00:14:23
talking about this and it just seems
00:14:25
mindboggling that if everybody agrees
00:14:28
that this is going to happen happen that
00:14:30
we can't figure out a way to make it not
00:14:33
happen well it's it's not just up to us
00:14:37
so um yeah there's there's and so I
00:14:41
don't know it's it's obviously somewhat
00:14:42
of a black box we don't exactly we we I
00:14:46
I I feel we just have no clue what
00:14:48
people in in in China think but um but I
00:14:52
I I think it's sort of the the sense of
00:14:55
history is is strongly the sort of Thucydides
00:14:58
trap idea that you have a rising power
00:15:00
against an existing power and it tends
00:15:02
to you know it's it's Wilhelmine
00:15:05
Germany versus Britain before World War
00:15:07
I and you know it's um you know Athens
00:15:10
against Sparta the rising power against
00:15:12
the existing power you you tend to um
00:15:15
get conflict that's that's probably what
00:15:18
deep down I think is is really really
00:15:22
far in the China DNA so so I'd say maybe
00:15:24
maybe the first I don't know the meta
00:15:27
version would be the first the first
00:15:28
step avoiding the conflict would be we
00:15:30
have to we have to start by admitting
00:15:32
that China believes the conflict's
00:15:34
happening right and then if if if if
00:15:37
people like you are constantly saying um
00:15:40
well we just need to have some happy
00:15:41
talk right um that's that is a recipe
00:15:44
that's a recipe for World War III I'm not
00:15:46
advocating happy talk necessarily
00:15:49
um I'm I get accused of being a bit more
00:15:52
hawkish obviously obviously um in in
00:15:54
general you know I don't know I I'm not
00:15:57
I'm not sure Trump should have talked to the
00:15:59
North Korean dictator but yeah in
00:16:01
general um it's probably a good idea to
00:16:03
to try to um talk to people even if
00:16:06
they're they're really bad people most
00:16:08
of the time and uh and uh you know it's
00:16:10
it's certainly um a very odd Dynamic
00:16:13
with the US and and uh and Russia at
00:16:15
this point where um I I think it is
00:16:18
impossible for anybody in the Biden
00:16:21
Administration even to have a back
00:16:22
Channel communication with uh with
00:16:24
people like I I don't think Tucker
00:16:26
Carlson counts as an emissary from the
00:16:28
Biden Administration and if anybody gets
00:16:30
tuckered or I don't know what the verb
00:16:32
is who talks you know that's that's that
00:16:35
seems that seems worse than the
00:16:37
alternative can we um talk about
00:16:40
technology um you have this you you you
00:16:44
have a speech where you talk about some
00:16:46
of the misguided things we've done in
00:16:47
the past in the name of technology and
00:16:49
use like big data as an example of that
00:16:52
um what is
00:16:54
AI um oh man that's that's sort of a
00:16:59
big question I
00:17:01
um it's um yeah I I always I I always
00:17:07
had this riff where I I don't like the
00:17:09
buzzwords and um you know machine
00:17:12
learning Big Data cloud
00:17:14
computing you know I'm going to build a
00:17:16
mobile app bring the cloud to um you
00:17:19
know if you have sort of a concatenation
00:17:21
of buzzwords um you know my my first
00:17:23
instinct is just to run away as fast as
00:17:25
possible some really bad groupthink and
00:17:29
um and for many years I I my bias is
00:17:31
probably that AI was one of the worst of
00:17:33
all these buzzwords it meant you know the
00:17:36
next generation of computers the last
00:17:37
generation of computers you know
00:17:39
anything in between so it's meant all
00:17:41
these all these very different things if
00:17:43
we if we roll the clock back to the
00:17:46
2010s you know the um probably the AI to
00:17:50
the extent you concretize I would say
00:17:52
the AI debate was maybe framed by by two
00:17:56
the two books the two canonical books
00:17:57
that framed it was there was the the
00:17:58
Bostrom book Superintelligence 2014
00:18:01
where AI was going to be this super
00:18:03
human super duper intelligent um thing
00:18:07
and then um the anti- um Bostrom book was
00:18:10
the Kai-Fu Lee 2018 AI Superpowers you can
00:18:13
think of the CCP rebuttal to Bostrom
00:18:16
where basically AI was going to be
00:18:18
surveillance Tech face recognition and
00:18:20
China was going to win because they had
00:18:22
no qualms about uh applying this
00:18:24
technology and um and then um if we now
00:18:28
think about what actually happened let's
00:18:29
say with the LLMs and and ChatGPT it
00:18:32
was really neither of those two um and
00:18:35
it was this in between thing which was
00:18:37
actually what people would have defined
00:18:39
AI as for the previous 60 or 70 years
00:18:42
which is passing the Turing test which
00:18:44
is you know this the somewhat fuzzy line
00:18:46
it's a computer that can pretend to be a
00:18:48
human um or that can fool you into
00:18:51
thinking it's a human and um and uh you
00:18:54
know even with the fuzziness of that
00:18:56
line you could say that pre ChatGPT
00:18:59
wasn't passed and then ChatGPT passed
00:19:02
it and that seems that seems very very
00:19:05
significant um and um and then obviously
00:19:08
leads to all these questions what does
00:19:10
it mean you know is it going to is it
00:19:12
going to complement people is it going
00:19:14
to substitute for people you know what
00:19:16
does it do to the labor market do you
00:19:17
get paid more paid less you know so
00:19:19
there all these all these questions but
00:19:21
uh it um it seems extremely it seems
00:19:25
extremely important um and um
00:19:29
and it's probably you know certainly the
00:19:32
the big picture questions which I think
00:19:34
Silicon Valley is always very bad at
00:19:35
talking about is like you know what does
00:19:37
it mean to be a human being right um
00:19:38
sort of the I don't know the stupid 2022
00:19:42
answer would be that humans differ from
00:19:43
all the other animals because we we're
00:19:45
good at languages if you're a
00:19:46
three-year-old or an 80-year old you
00:19:48
speak you communicate we tell each other
00:19:50
stories this this this is what makes us
00:19:53
different and so um so yeah I think
00:19:55
there's something about it that's uh
00:19:57
incredibly important and and very
00:19:59
disorienting you know the question I
00:20:00
always have as a I know the narrower
00:20:02
question I have as an investor is sort
00:20:03
of how do you make money with this stuff
00:20:06
and um how do you make money I it's um
00:20:09
it's pretty confusing and I think I I
00:20:13
don't know this is always where I'm
00:20:14
anchored on the late 90s is sort of the
00:20:16
formative period for me but uh I I I I
00:20:19
keep thinking that uh AI in 2023 2024
00:20:24
is like the internet in
00:20:26
1999 um it's it's really big it's going
00:20:29
to be very important it's going to
00:20:32
transform the world not you know in 6
00:20:34
months but in 20 years and then um there
00:20:37
are probably all kinds of incredibly uh
00:20:40
catastrophic approximations where you
00:20:43
know uh what businesses are going to
00:20:45
make money you know who's going to have
00:20:47
the Monopoly who's going to have pricing
00:20:48
power is um you know is is is is super
00:20:52
unclear um probably you know one one
00:20:54
layer deeper of analysis you know if
00:20:56
attention is all you need and if you're
00:20:58
you're not post economic you need to pay
00:21:00
attention to who's making money and in
00:21:02
AI it's basically one company is making
00:21:04
Nvidia is making over 100% of the
00:21:07
profits everybody else is collectively
00:21:08
losing money and so um and so there's
00:21:11
sort of a you have to do some sort of
00:21:13
you should do you should try to do some
00:21:15
sort of analysis you do you go long
00:21:18
Nvidia do you go short you know is it um
00:21:20
you know My Monopoly question is it a is
00:21:22
it a really durable Monopoly you know
00:21:25
and and then I it's it's hard for me to
00:21:27
know because I'm in Silicon Valley and I
00:21:28
haven't done anything we haven't done
00:21:29
anything in semiconductors for a long
00:21:31
time so I have no clue do you um if you
00:21:33
let's de-buzzword the word AI and say it's
00:21:35
a bunch of process automation let's just
00:21:37
say that's version 0.1 where brains that
00:21:40
are roughly the equivalent of a teenager
00:21:42
can do a lot of manual stuff what do you
00:21:45
have you thought about what it means for
00:21:48
you know 8 billion people in the world
00:21:50
if there's an extra billion that
00:21:52
necessarily couldn't work or like
00:21:54
whether that in political or economic
00:21:56
terms
00:21:59
I don't know the the the
00:22:01
um I I I don't know if this is the same
00:22:04
but this is you know the the history of
00:22:06
250 years the Industrial Revolution
00:22:09
what was that it you know it adds to GDP
00:22:12
it frees people up to do more more
00:22:14
productive things um you know maybe
00:22:17
there's you know there was yeah there
00:22:18
was a I know there was a Luddite critique
00:22:20
in the 19th century of the factories
00:22:23
that people were going to be unemployed
00:22:24
and wouldn't have anything to do because
00:22:25
the machines would replace the people
00:22:27
you know maybe the Luddites are right this
00:22:30
time around I'm I'm I'm probably I'm
00:22:32
probably pretty pretty skeptical of it
00:22:34
but uh but yeah it's it's it's extremely
00:22:36
confusing you know uh where where the
00:22:39
gains and and losses are there there
00:22:42
probably are um you know there there's
00:22:45
always sort of a hobby you can always
00:22:47
just use it on your hobby horses so I
00:22:49
don't know the you know my
00:22:50
anti-Hollywood or anti-university hobby
00:22:52
horse is that uh it seems to me that you
00:22:55
know the um the AI is quite good at the
00:22:57
woke stuff
00:22:59
and um it'll and and so you know if you
00:23:01
want to if you want to be a successful
00:23:03
actor you should be maybe a little bit
00:23:04
racist or a little bit sexist or just
00:23:06
really funny uh and you won't have any
00:23:09
risk of the AI replacing
00:23:11
you everybody else will get everybody
00:23:14
else will get replaced and then probably
00:23:17
I don't know
00:23:18
um I don't know uh Claudine Gay the
00:23:21
plagiarizing Harvard University
00:23:23
president um you know the AI is going to
00:23:26
you know the AI will produce endless amounts
00:23:29
of um of these sort of I don't even know
00:23:32
what to call them uh woke um papers and
00:23:36
um they they were all already sort of
00:23:38
plagiarizing one another they were
00:23:40
because they were always saying the same
00:23:41
thing over and over again they were
00:23:43
using their own version of it is just going
00:23:44
to flood the Zone with with even more of
00:23:46
that and that you know I don't know
00:23:48
obviously they've been able to do it for
00:23:49
a long time and no one's noticed but uh
00:23:52
but I think I think at this point um
00:23:54
that it it doesn't seem promising from a
00:23:56
um competitive point obviously my hobby
00:24:00
horses so I'm I'm just maybe just
00:24:02
wishful thinking on my part what are the
00:24:03
areas of technology that um you're
00:24:06
curious about that your mind is like wow
00:24:07
this is really I have to learn more pay
00:24:11
attention you know I'm always I always
00:24:14
think uh you you want to instantiate it
00:24:18
more in companies than than um things or
00:24:21
you know you ask sort of like where is
00:24:23
where is innovation happening you um you
00:24:27
know it it it
00:24:29
in our society it doesn't have to be
00:24:30
this way but it's it's um it's mostly in
00:24:34
in um in a certain subset of relatively
00:24:37
small companies where we have these
00:24:39
relatively small teams of people that
00:24:41
are really pushing the envelope and
00:24:43
that's that's sort of you know that
00:24:45
that's sort of what I find you know
00:24:47
inspiring about about venture capital
00:24:50
and then you and then obviously you
00:24:52
don't just want Innovation you also want
00:24:54
it to sort it to it to um to translate
00:24:57
into into good businesses but that's
00:24:59
that's where it happens it it somehow it
00:25:01
doesn't happen in universities it
00:25:03
doesn't happen in government you know
00:25:05
there was a time it did I mean you know
00:25:07
somehow in this very very weird
00:25:09
different country that was the United
00:25:11
States in the 1940s you had you know
00:25:13
somehow the Army organized the
00:25:15
scientists and got them to produce a
00:25:16
nuclear bomb in Los Alamos in three and
00:25:18
a half years and you know the way the
00:25:20
New York Times editorialized after that
00:25:23
was you know it's it's you know it was
00:25:24
sort of an anti-libertarian write up it
00:25:26
was you know there were um you know
00:25:27
obviously maybe if you'd left the prima donna
00:25:29
scientists to their own devices it would have taken
00:25:31
them 50 years to build a bomb and uh the
00:25:33
Army could just tell them what to do and
00:25:35
this should silence anybody who
00:25:37
doesn't believe the government can do
00:25:38
things and uh they don't write
00:25:40
editorials like that in the New York
00:25:42
Times anymore but I think
00:25:45
that's
00:25:47
sort of where one
00:25:49
should look I think a
00:25:51
crazy amount of it still happens in the
00:25:52
United States you know there sort of you
00:25:55
know we've we've you know episodically
00:25:57
tried to do all this investing we probably
00:25:59
tried to do too much investing in
00:26:00
Europe over the years it's always sort
00:26:02
of a junket it's a nice place
00:26:04
to go on vacation as an investor and um
00:26:07
and it is
00:26:10
very and I don't have a great
00:26:12
explanation but it's a very strange
00:26:15
thing that uh so much of it is still in the
00:26:17
US which is somehow still the country where
00:26:19
people do new things Peter is that is
00:26:21
that a team organizational social
00:26:24
evolutionary problem in the United
00:26:26
States what is the root cause of the
00:26:28
failure to innovate in the United States
00:26:30
relative to the expectation going back
00:26:34
70 years 50 years etc from you know the
00:26:37
the rocket ships and we're all going to
00:26:39
live yeah well this is always this is
00:26:40
always this is always one of the big
00:26:42
picture claims I have that we've
00:26:44
been in an era of relative Tech
00:26:46
stagnation the last 40 or 50 years or
00:26:48
the you know the tagline um that we
00:26:51
had they promised flying cars all we got
00:26:54
was 140 characters which is not an anti-
00:26:56
Twitter anti-X commentary even though the
00:26:58
the way the way I used to always qualify
00:27:00
it was that uh at least you know at
00:27:02
least it was at least a good company you
00:27:04
had you know 10,000 people who didn't
00:27:06
have to do very much work and could just
00:27:08
smoke marijuana all day very similar to
00:27:10
Europe and so I I think that actually
00:27:11
that part actually did get corrected but
00:27:14
um so
00:27:19
like what went wrong because
00:27:21
you you point out that it's not a
00:27:22
technology Trend tracker that you think
00:27:24
about it's about people and teams that
00:27:26
innovate and drive to outcomes based on
00:27:28
their view of the world and and what's
00:27:30
gone wrong with our view of the world
00:27:32
and our ability to organize to achieve
00:27:35
the seemingly unachievable with very
00:27:37
rare exceptions obviously elon's here
00:27:38
later but yeah you know it's
00:27:40
overdetermined um the
00:27:44
the rough frame I always have and again
00:27:46
it's not that there's been no innovation
00:27:48
there's been there's been a decent
00:27:49
amount of innovation in the world of
00:27:51
bits computers internet mobile internet
00:27:54
you know crypto AI so there sort of all
00:27:57
these um world of bits uh places where
00:28:01
there was you know a significant
00:28:04
but sort of somehow narrow cone of
00:28:06
progress but it was everything having to
00:28:07
do with atoms that was slow this was
00:28:09
already the case when I was an
00:28:10
undergraduate at Stanford in the late
00:28:12
80s in retrospect any applied
00:28:14
engineering field was a bad idea it was
00:28:16
a bad idea to become a chemical engineer
00:28:18
you know a mechanical engineer aeroastro
00:28:20
was terrible nuclear engineering
00:28:22
everyone knew I mean no one did that you
00:28:25
know and um and and there's something
00:28:28
about yeah the world of atoms that um
00:28:31
you know from a Libertarian point of
00:28:32
view you'd say got regulated to death um
00:28:35
there probably uh you know
00:28:38
there's some set of
00:28:39
arguments where um the low-hanging fruit
00:28:42
got picked and got harder to find new
00:28:44
things to do although I always I always
00:28:46
think that was just a sort of baby
00:28:48
boomer excuse for covering up for
00:28:50
the the failures of that generation um
00:28:53
and then I think
00:28:57
maybe a very big
00:29:00
picture part of it was that uh at some
00:29:03
point in the 20th century the idea
00:29:06
took hold that not all forms of
00:29:09
technological progress were simply good
00:29:12
and simply for the better and there's
00:29:13
you know there's something about the two
00:29:15
world wars and the you know the
00:29:17
development of nuclear weapons that uh
00:29:19
that that gradually pushed people into
00:29:21
this more uh risk-averse society and
00:29:23
it didn't happen overnight but um you
00:29:26
know maybe a quarter century
00:29:28
you know after the nuclear bomb it's
00:29:29
like by Woodstock it happened by
00:29:31
Woodstock it happened yeah cuz that was
00:29:33
the same summer we landed on the moon
00:29:35
yeah Woodstock was three weeks after
00:29:36
that yeah that's that was the Tipping
00:29:39
Point progress stopped
00:29:41
can we shift gears just to the
00:29:43
domestic economy what what do you
00:29:44
think's happening in the domestic
00:29:45
economy and just say backdrop we've had
00:29:47
something like 14 straight months of
00:29:49
downward revisions to jobs the revisions
00:29:52
are supposed to be completely random but
00:29:53
somehow they've all been down um probably
00:29:56
doesn't mean anything um
00:29:58
there's also what's happening with with
00:30:00
the yield curve but I'll stop there what
00:30:02
what's your take on what's happening in
00:30:03
the
00:30:04
economy
00:30:07
um you know it's
00:30:09
it's man it's always hard to know
00:30:12
exactly yeah I suspect we're close
00:30:14
to a recession I've probably thought
00:30:16
this for a while uh it's it's being
00:30:20
stopped by really big government
00:30:22
spending so um you know in May of 2023
00:30:27
the projection for the deficit in
00:30:30
fiscal year 2024 which is October of 23 to
00:30:33
September 24 was something like 1.5 1.6
00:30:37
trillion um the deficit is going to come
00:30:39
in about 400 billion higher and so um
00:30:42
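As a back-of-the-envelope check on the deficit figures quoted here (the projection range and the roughly $400 billion overshoot are from the conversation; the midpoint and the nominal-GDP figure are outside assumptions):

```python
# Sketch of the FY2024 deficit arithmetic quoted above.
# Assumptions (not from the transcript): the midpoint of the
# projected range, and ~$28.6T nominal US GDP for FY2024.
projected = (1.5e12 + 1.6e12) / 2   # May 2023 projection, midpoint
overshoot = 0.4e12                  # "about 400 billion higher"
actual = projected + overshoot      # roughly 1.95 trillion

gdp = 28.6e12                       # assumed nominal GDP
share = actual / gdp                # deficit as a share of GDP

print(f"actual deficit ~ ${actual / 1e12:.2f}T, ~ {share:.1%} of GDP")
```

On those assumptions the gap lands near 7% of GDP at the top of the cycle, which is the point being made.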
which you know was a sort of a crazy
00:30:44
deficit was projected and it was way off
00:30:46
and then somehow um and so if we had
00:30:50
not found another 400 billion um to add
00:30:54
to you know this this crazy deficit at
00:30:56
the top of the economic cycle you know
00:30:59
you're supposed to
00:31:00
increase deficits in a recession not at
00:31:02
the top of the cycle um you
00:31:05
know things would be probably very
00:31:07
shaky yeah
00:31:08
there's some way where um we
00:31:11
have too much debt not
00:31:14
enough sustainable growth um you know
00:31:17
again I always think it comes back to
00:31:19
you know Tech Innovation there probably
00:31:21
are other ways to grow an economy
00:31:23
without Tech um or intensive progress um
00:31:28
but I think
00:31:31
those don't seem to be on offer and then
00:31:32
that's where it's very deeply
00:31:34
stuck if you wind back over the
00:31:35
last 50 years there's always a
00:31:37
question why did people not realize that
00:31:39
this Tech stagnation had happened sooner
00:31:41
and I think there were two one-time
00:31:43
things people could do economically that
00:31:45
had nothing to do with science or tech
00:31:47
there was a 1980s Reagan Thatcher move
00:31:51
which was to massively cut taxes
00:31:53
deregulate allow lots of companies to
00:31:55
merge and combine and and it was sort of
00:31:58
a one-time way to make the economy a lot
00:32:02
bigger even though it was not
00:32:04
something that really had the sort of
00:32:05
compounding effect so it led to one
00:32:08
great decade and then there was um you
00:32:10
know and that was sort of the right-wing
00:32:12
capitalist move and then um in the 90s
00:32:15
there was sort of a Clinton Blair um
00:32:18
center-left thing which was sort of
00:32:19
leaning into globalization and there was
00:32:21
a giant Global Arbitrage you could do
00:32:24
which also had you know a lot of
00:32:25
negative externalities that came with it
00:32:27
but um it sort of was a one-time move I
00:32:29
think both of those are are not on offer
00:32:33
you know I don't necessarily think you
00:32:35
should undo globalization I don't think
00:32:36
you should raise taxes like crazy but um
00:32:39
you can't you can't do more
00:32:41
globalization or more tax cuts here
00:32:43
that's not going to be the win and and
00:32:45
so I think you have to somehow get
00:32:46
back to the future um we have time for a
00:32:48
couple more questions you um I think saw
00:32:52
that maybe this ivy league institutions
00:32:55
maybe weren't producing the best and
00:32:56
brightest or weren't exactly um hitting
00:32:59
their mandate um and you created the
00:33:01
Thiel Fellows and you've been doing that
00:33:03
for a while and I meet them all because
00:33:04
they all have crazy ideas and they pitch
00:33:06
me for Angel investment what have you
00:33:08
learned getting people to quit school
00:33:10
giving them $100,000 and then how many
00:33:12
parents call you and get really upset
00:33:14
that their kids are quitting
00:33:16
school uh
00:33:20
well I don't know I've
00:33:23
learned a lot I mean it's um
00:33:30
I don't know I think the
00:33:32
universities are far worse than I even
00:33:33
thought when I started this thing um I
00:33:37
think um yeah you know I
00:33:41
did this debate at Yale
00:33:45
last week um you know resolved higher
00:33:46
education's a bubble and uh you
00:33:51
sort of go through all the different
00:33:53
numbers and then you know and
00:33:57
again I was careful to word it in such a
00:33:58
way that I didn't have to you know and
00:34:00
then people kept saying well what's your
00:34:01
alternative what should people do
00:34:02
instead and I said nope that was
00:34:04
not the debate I'm not you know I'm not
00:34:06
your guidance counselor I'm not your
00:34:07
career counselor I don't know how to
00:34:09
solve your problems but um if
00:34:11
something's a bubble you know the first
00:34:13
thing you should do is probably not you
00:34:15
know lean into it in too crazy a way and
00:34:19
you know the student debt was 300
00:34:20
billion in 2000 it's uh it's basically
00:34:24
uh close to two trillion at this point
00:34:25
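A quick sanity check on the growth rate those two figures imply (a sketch; the $300 billion and roughly $2 trillion endpoints are as quoted, and the 24-year window from 2000 to 2024 is an assumption):

```python
# Implied compound annual growth rate of total US student debt,
# from ~$300B in 2000 to ~$2T today (24-year window assumed).
start, end, years = 300e9, 2e12, 24
cagr = (end / start) ** (1 / years) - 1   # about 8% per year
print(f"implied growth ~ {cagr:.1%} per year")
```

Anything compounding at roughly 8% a year while the underlying economy grows far more slowly is what "runaway process" means here.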
so it's just been this
00:34:28
runaway process and um and then if
00:34:30
you look at it by cohort if you
00:34:33
graduated from college in 1997 12 years
00:34:36
later um people still had student debt
00:34:39
but most of the people had sort of paid
00:34:40
it down um but
00:34:44
we started the Thiel Fellowship in 2010 and
00:34:47
2009 was
00:34:51
the first cohort where this really
00:34:53
stopped if you take the people who graduated
00:34:54
from college in 2009 and you fast-
00:34:57
forward 12 years to
00:35:00
2021 the median person had more student
00:35:04
debt 12 years later than they graduated
00:35:07
with because it's actually
00:35:09
just compounding faster and it was you
00:35:11
know partially the global
00:35:13
financial crisis the people had less
00:35:15
well-paying jobs they stayed in college
00:35:17
longer um and the colleges it's
00:35:20
just sort of been this background thing
00:35:22
where it's decayed in
00:35:25
these really significant ways and um you
00:35:27
know again I think it's on some level
00:35:30
um there are sort of a lot of um debates
00:35:33
in our society that are probably
00:35:34
dominated by sort of a boomer narrative
00:35:37
and maybe the Baby Boomers were the last
00:35:39
generation where College really worked
00:35:41
and you know they think well you know I
00:35:43
I worked my way through college and why
00:35:45
can't you know an
00:35:48
18-year-old going to college do that
00:35:49
today and um and so I think the
00:35:56
bubble will be done once the
00:35:56
Boomers have exited stage left but
00:36:00
it would be good if we
00:36:00
figured something out before then you
00:36:02
know does the government need to
00:36:03
stop underwriting the loans because it's
00:36:06
the lending I think 90-plus percent of
00:36:09
the capital in the student loan
00:36:11
programs is funded by
00:36:14
the federal government and there's if
00:36:17
you're an accredited University you can
00:36:18
take out a loan and go to it and
00:36:20
accreditation in a rigid
00:36:23
kind of free market system you would
00:36:25
have an underwriter that says are you
00:36:27
going to be able to graduate and make
00:36:28
enough money to pay your loan off is
00:36:29
this a good school are you going to get
00:36:31
a good job and then the market would
00:36:32
figure out whether or not to give you a
00:36:33
loan would figure out what the rate
00:36:34
should be and so on but in this case the
00:36:36
government simply provides Capital to
00:36:38
support all this and as a result
00:36:39
everything's gotten more expensive and
00:36:42
the rigidity in the system that
00:36:44
basically qualifies schools and the
00:36:46
quality of those schools relative to the
00:36:47
earning potential over time is gone we
00:36:49
need the government to get out of
00:36:51
student loan business yeah but look the
00:36:53
the place where I'm I I know I'm sort of
00:36:56
some ways I'm rightwing some ways I'm
00:36:57
left wing on this so the place where I'm
00:36:58
leftwing is I do think a lot of the
00:37:01
students got ripped off and uh and so I
00:37:04
think there should be some kind of broad
00:37:06
um debt forgiveness at this point um who
00:37:09
should pick up the tab but it's not just
00:37:11
the taxpayers it's the universities and
00:37:14
it's the bondholders got it
00:37:17
the bondholders take a little bit out of those
00:37:18
endowments the universities and um and then
00:37:22
obviously if you just make it the
00:37:24
taxpayers then
00:37:26
the universities can just charge more
00:37:28
and more no incentive to reform what
00:37:30
whatsoever but uh you know it's
00:37:33
in 2005 uh it was under Bush 43 that the
00:37:36
bankruptcy laws got rewritten in the US
00:37:39
where you cannot discharge student debt
00:37:41
even if you go bankrupt and if you
00:37:43
haven't paid it off by the time you're
00:37:44
65 your Social Security checks
00:37:47
will be garnished it's crazy so um so
00:37:50
you know I do think um but should
00:37:52
we stop lending should the federal
00:37:54
government get out of the student
00:37:55
lending business well if we
00:37:58
start
00:38:01
with my place where you know a lot
00:38:03
of the student debt should be forgiven
00:38:05
and then reform the system
00:38:07
then we'll see how many people are
00:38:09
willing to lend you know how
00:38:11
many of the colleges can um pay for
00:38:13
all the students what's your sense if
00:38:15
it was a totally free market system how
00:38:17
many colleges would shut down because
00:38:20
they wouldn't be able to survive there's no
00:38:21
tuition
00:38:24
support it probably would be a lot
00:38:27
smaller you might
00:38:30
not have to shut them down because
00:38:33
there's you know a lot of them have
00:38:34
gotten extremely bloated it's like
00:38:36
Baumol's cost disease where you know I
00:38:38
don't know if I have no idea like a
00:38:40
place like UCLA it probably has you know
00:38:43
twice or three times as many bureaucrats
00:38:45
as they had 30 40 years ago so there's
00:38:48
there's all these
00:38:49
sort of um parasitic people that have
00:38:52
sort of uh gradually accrued and uh and
00:38:55
um and so there probably
00:38:57
would be a lot of rational ways to dial
00:38:59
this back but um yeah you know
00:39:01
maybe we're on to a new
00:39:03
location if the only way to
00:39:05
lose weight is to cut off your thumb
00:39:07
that's kind of a difficult way to go on
00:39:08
a diet Peter three of your
00:39:11
collaborators longtime collaborators
00:39:13
Elon Musk um Mark Zuckerberg and uh Sam
00:39:18
Altman are arguably the three leading AI
00:39:22
language model um
00:39:25
leaders which one is going to win rank
00:39:27
in order and tell us a little bit about
00:39:33
each Peter said he would answer any
00:39:35
question I said I would take any
00:40:37
question I didn't say I would answer any
00:39:39
question you said you would honestly you
00:39:41
said today you felt extremely honest and
00:39:43
candid uh yeah but I've already
00:39:46
been extremely honest and candid so I
00:39:48
think it's whoever I talked
00:39:52
to last okay they're all very
00:39:55
very convincing people so you know
00:39:58
I talked to Elon a
00:40:02
while ago and you know it
00:40:04
was just um how ridiculous it was that
00:40:08
Sam Altman was getting away with turning
00:40:10
open AI from a nonprofit into a
00:40:12
for-profit that was such a scam if
00:40:15
everybody was allowed to do this
00:40:17
everybody would do this that it has to
00:40:19
be totally illegal what Sam's doing and
00:40:22
it shouldn't be allowed at all and that
00:40:24
seemed really really convincing in the
00:40:26
moment and then then sort of half an
00:40:27
hour later I thought to myself but you
00:40:30
know actually um man it's been
00:40:33
such a horrifically mismanaged place at
00:40:37
open AI with this Preposterous nonprofit
00:40:40
board they had nobody would do this
00:40:42
again and so there actually isn't much
00:40:44
of a moral hazard from it so but yeah
00:40:46
whoever I talk to I find
00:40:48
very convincing in the
00:40:49
moment well will that space get
00:40:52
commoditized I mean do you see a path to
00:40:53
Monopoly there well again this is this
00:40:56
is again where you know you should you
00:40:58
know attention is all you need you need
00:41:00
to pay attention to who's making money
00:41:02
it's Nvidia it's it's the hardware the
00:41:04
chips layer and then
00:41:09
that's just you know
00:41:11
not what we've done in tech for
00:41:13
30 years are they making 120% of the
00:41:17
profits I
00:41:19
think everybody else is losing money
00:41:21
collectively yeah everyone else is just
00:41:23
spending money on the compute so
00:41:24
it's one company that's
00:41:25
making I mean maybe there's a few other
00:41:27
people are making some money I mean I
00:41:28
assume tsmc and asml but but uh but uh
00:41:32
yeah I think everyone else is
00:41:33
collectively losing money what do you
00:41:34
think of Zuckerberg's approach to say
00:41:36
I'm so far behind this isn't core to my
00:41:38
business I'm going to open source it um
00:41:41
is that going to be the winning strategy
00:41:44
handicap that for
00:41:46
us um again I would say um
00:41:53
my big
00:41:55
qualification is you know my model is
00:41:57
AI does feel uncomfortably
00:42:00
close to the bubble of 1999 so we
00:42:04
haven't invested that much in it um and
00:42:08
uh I'd want to have more clarity in
00:42:11
investing but the
00:42:14
simplistic question is you know
00:42:17
who who's going to make money um you
00:42:19
know I think a year ago two years ago in
00:42:21
retrospect Nvidia would have been a good
00:42:22
buy you know I think at this point
00:42:24
everybody it's sort of too
00:42:26
obvious that they're making too much
00:42:27
money everyone's going to try to copy
00:42:29
them on the chip side maybe that's
00:42:32
straightforward to do maybe it's not but
00:42:34
that's you know I'd say probably um
00:42:38
you should if you want
00:42:41
to figure out the AI thing you
00:42:41
should not be asking this question about
00:42:42
um you know Meta or OpenAI or any
00:42:46
of these things you should really be
00:42:47
focusing on the Nvidia question the
00:42:49
chips question and the the fact that
00:42:50
we're not able to focus on that
00:42:52
tells us something about how we've
00:42:53
all been trained you know I think Nvidia
00:42:55
got started in 1993 yeah I believe that
00:42:57
was the last year where anybody in their
00:43:00
right mind would have studied electrical
00:43:02
engineering over computer science right
00:43:03
94 Netscape takes off and then yeah it's
00:43:06
probably a really bad idea to start a
00:43:08
Semiconductor Company even in '93 but
00:43:10
the benefit is there was going to be no
00:43:12
one who would come after you no talented
00:43:15
people started semiconductor companies
00:43:17
after 1993 because they all went into
00:43:20
you know into software score their
00:43:22
Monopoly
00:43:23
power um it's
00:43:30
I think it's quite strong because
00:43:33
this history that I just gave
00:43:35
you where none of us know anything about
00:43:37
chips um and then I think the you know I
00:43:41
think the risk it's always you know if
00:43:44
attention is all that you need um the
00:43:47
qualifier to that is you know when you
00:43:49
get started as an you know actress as a
00:43:51
startup as a company you need
00:43:54
attention then it's desirable to get
00:43:56
more and at some point attention becomes
00:43:59
the worst thing in the world and and
00:44:01
there was the one day where Nvidia had
00:44:03
the largest market cap in the world
00:44:05
earlier this year and I do think that
00:44:08
represented a phase transition once that
00:44:10
happened they probably had um more
00:44:13
attention than was good hey Peter as we
00:44:15
wrap here um your brain works in a
00:44:18
unique way you're an incredible
00:44:20
strategist you think you know very
00:44:22
differently than uh a lot of the folks
00:44:25
um that we get to talk to um with all of
00:44:28
this are you optimistic for the
00:44:32
future uh man I always push
00:44:35
back on that question um I
00:44:40
think extreme optimism and extreme
00:44:42
pessimism are both really bad attitudes
00:44:46
and they're somehow the same thing
00:44:48
you know extreme pessimism nothing you
00:44:50
can do extreme optimism the future will
00:44:53
take care of itself so if you have to
00:44:55
have one it's probably you want to be
00:44:57
somewhere in between maybe mildly
00:44:59
optimistic mildly pessimistic but uh you
00:45:02
know I I believe in human agency and
00:45:04
that it's up to us and it's not you know
00:45:06
it's not some sort of winning a lottery
00:45:08
ticket or some astrological chart that's
00:45:11
going to decide things and I believe in
00:45:13
human agency and that's sort of an axis
00:45:16
that's very different from optimism or
00:45:18
pessimism what a great extreme optimism
00:45:19
extreme pessimism they're both excuses
00:45:21
for laziness what an amazing place to
00:45:24
end ladies and gentlemen give it up for
00:45:26
Peter Thiel thank you thank you Peter come
00:45:28
on
00:45:29
now wow all right Peter Thiel

Episode Highlights

  • The Biggest Risk
    Peter emphasizes that not taking risks is the biggest risk of all.
    “The biggest risk you can take is not taking any risk.”
    @ 00m 08s
    September 13, 2024
  • Sleepwalking into Armageddon
    Peter warns about the dangers of current global conflicts and their implications.
    “I think we're sort of sleepwalking into Armageddon.”
    @ 06m 35s
    September 13, 2024
  • Taiwan and World War III
    Peter discusses the implications of a potential conflict over Taiwan.
    “It's not worth World War III.”
    @ 08m 28s
    September 13, 2024
  • AI's Transformative Potential
    AI in 2024 could transform the world, much like the internet did in 1999.
    “AI in 2024 is like the internet in 1999.”
    @ 20m 19s
    September 13, 2024
  • The Student Debt Bubble
    Student debt has ballooned from $300 billion in 2000 to nearly $2 trillion today.
    “The universities are far worse than I even thought.”
    @ 33m 32s
    September 13, 2024
  • Peter on AI Leadership
    Peter discusses the leading figures in AI and their influence on the industry.
    “Whoever I talk to, I find very convincing in the moment.”
    @ 40m 46s
    September 13, 2024
  • The Nvidia Monopoly
    Peter analyzes Nvidia's dominance in the AI hardware market and its implications.
    “It's one company that's making money; everyone else is collectively losing money.”
    @ 41m 24s
    September 13, 2024
  • Optimism vs. Pessimism
    Peter shares his perspective on maintaining a balanced outlook for the future.
    “If you have to have one, it's probably you want to be somewhere in between.”
    @ 44m 57s
    September 13, 2024

Key Moments

  • Risk Taking @ 00:08
  • Taiwan Discussion @ 08:28
  • AI Predictions @ 20:19
  • University Critique @ 33:32
  • Student Debt Crisis @ 33:32
  • AI Leadership Debate @ 39:18
  • Nvidia Dominance @ 41:24
  • Balanced Outlook @ 44:55
