Elon Musk | All-In Summit 2024

September 10, 2024 / 01:03:53

This episode features Elon Musk discussing freedom of speech, government regulations, and advancements in technology. Key topics include the challenges of maintaining free speech globally, the inefficiencies of government regulations, and the future of AI and humanoid robots.

Musk talks about the ongoing "freedom of speech wars" and the implications of censorship in various countries. He emphasizes the importance of the First Amendment and expresses concern over the global trend of suppressing free speech.

The conversation shifts to the regulatory environment in the U.S., particularly in California versus Texas. Musk highlights the difficulties faced by companies like SpaceX due to excessive regulations and contrasts this with the more efficient processes in Texas.

Musk also discusses the potential of AI and robotics, predicting a future where humanoid robots could vastly outnumber humans. He shares insights on the development of Tesla's Optimus robot and the expected cost and impact of such technology.

The episode concludes with humorous anecdotes from Musk's experience on SNL, showcasing his lighter side amidst serious discussions.

TL;DR

Elon Musk discusses free speech, government inefficiencies, and the future of AI and humanoid robots.

Video

00:00:00
[Applause]
00:00:00
the greatest
00:00:02
entrepreneur this generation Elon
00:00:06
Musk hey
00:00:11
guys grab my
00:00:15
stuff sit there I'm going to yep I'm
00:00:17
going to take this all
00:00:21
right do this thanks for taking the
00:00:26
time how um how you doing brother
00:00:31
you keep him
00:00:32
busy yeah I
00:00:36
mean it's rarely a slow
00:00:39
week I mean in the world as well yeah I
00:00:43
mean any given week I me it just seems
00:00:44
like the things getting nuttier it it
00:00:47
it's definitely a simulation we've
00:00:48
agreed on this at this
00:00:50
point I mean well look if if we are in
00:00:54
some alien Netflix series I think the
00:00:55
ratings are high yes ratings are High um
00:01:00
how are the um uh the the freedom of
00:01:03
speech Wars going um this is a uh you've
00:01:07
been you've been at War for two years
00:01:10
now yes uh the price of freedom of
00:01:13
speech is not cheap is it I think it's
00:01:14
like 44 billion something like that just
00:01:17
numbers give or take a billion
00:01:20
yeah round numbers yeah yeah um it's uh
00:01:24
it's it's pretty nutty there there is
00:01:26
like this weird movement to uh quell
00:01:29
Free Speech uh kind of around the world
00:01:33
and um and this is something we should
00:01:34
be very concerned about uh you know you
00:01:37
have to ask like why was the first
00:01:40
amendment like a high priority was like
00:01:42
number one um One is because uh people
00:01:46
came from countries where if you spoke
00:01:49
freely you would be imprisoned or killed
00:01:52
and they were like well we would like to
00:01:54
not have that here um because that was
00:01:58
terrible and actually you know there's a
00:02:00
lot of places in the world right now if
00:02:02
you if you're critical of the government
00:02:05
you get imprisoned or killed right yeah
00:02:09
we'd like to not have that are you
00:02:12
concerned about
00:02:14
that I mean I suspect this is a receptive
00:02:16
audience to that message
00:02:19
[Applause]
00:02:23
um you I I think we always thought that
00:02:27
the West Was the exception to that that
00:02:29
we knew there were authoritarian places
00:02:31
around the world but we thought that in
00:02:32
the west we'd have freedom of speech and
00:02:34
we've seen like you said it seems like a
00:02:35
global movement in Britain you've got
00:02:39
teenagers being put in prison for memes
00:02:42
opposing it's like you like to you like
00:02:45
to Facebook post throw him in the prison
00:02:48
yeah people have got an actual you know
00:02:51
prison for for like like obscure
00:02:54
comments on social media not even [ __ ]
00:02:56
posting yet like not even yeah it's
00:02:58
crazy Pavel got thrown in prison recently
00:03:02
I'm like what was that about I was
00:03:04
like what is the massive crime that
00:03:07
right Pavel in France and then of course
00:03:09
we got Brazil with judge
00:03:12
Voldemort that that one seems like the
00:03:14
one that impacts you the most can you
00:03:16
what's the latest on
00:03:21
that well we I guess we're we are trying
00:03:24
to figure out uh is there
00:03:27
some reasonable solution in Brazil
00:03:30
uh the you know the concern uh I mean I
00:03:34
want to just make sure that this is
00:03:35
framed correctly um and uh you know
00:03:39
funny memes aside I the the the nature
00:03:43
of the concern was
00:03:44
that at least at X Corp we had the
00:03:47
perception that um we were being asked
00:03:51
to do things that violated Brazilian law
00:03:54
so obviously we cannot as an American
00:03:57
company impose American laws and values
00:04:00
on on other countries um that uh you
00:04:04
know we wouldn't get very far if we did
00:04:05
that um but but we do you know think
00:04:10
that uh if if a country's law laws are a
00:04:12
particular way and we're being asked to
00:04:15
what we we think we think we're being
00:04:16
asked to break them then and and be
00:04:19
silent about it then obviously that is
00:04:21
no good so so I just want to be clear
00:04:25
it sometimes comes across as uh
00:04:28
Elon's trying to just be a crazy
00:04:31
whatever billionaire and demand
00:04:34
outrageous things from other countries
00:04:37
and
00:04:38
um you know well that is
00:04:42
true
00:04:44
um in
00:04:46
addition there are um other things uh
00:04:51
that that I think are you know valid
00:04:53
which is like we we we obviously can't
00:04:57
uh you know I think any given thing that
00:04:59
we do at X Corp we've got to be able to
00:05:02
explain in the light of day and and not
00:05:05
feel that it was dishonorable or you
00:05:08
know we we we did the wrong thing you
00:05:10
know uh so we don't we that that that
00:05:14
that was the that that's the nature of
00:05:15
the concern so we actually are in uh
00:05:19
sort of discussions with uh the you know
00:05:23
judicial authorities in in Brazil to try
00:05:26
to you know run this to ground like
00:05:30
uh what What's actually going on like if
00:05:33
if we're being asked to break the law
00:05:36
Brazilian law then that's that that
00:05:38
obviously should not be should not sit
00:05:39
well with the Brazilian Judiciary and if
00:05:43
we're not and we're mistaken we'd like
00:05:44
to understand how we are mistaken I
00:05:47
think that's a that's a pretty
00:05:48
reasonable uh position I'm a bit
00:05:51
concerned as your friend that you're
00:05:54
going to go to one of these countries
00:05:56
and I'm going to wake up one day and
00:05:57
you're going to get arrested and like
00:05:59
I'm going to have to to go bail you out
00:06:00
or something like this is feels very
00:06:02
acute like yes I mean it's not a joke
00:06:05
now like they're literally saying like
00:06:07
you know it's not just Biden saying like
00:06:09
we have to look into that guy now it's
00:06:10
become quite literal like this I who was
00:06:13
the guy who just wrote the um was it the
00:06:16
Guardian piece about like oh yeah yeah
00:06:19
there have been three articles and I
00:06:21
think in the past three weeks Robert
00:06:22
Reich but it wasn't just him it was like
00:06:25
three different articles three different
00:06:27
articles that's a trend
00:06:30
that calling for me to be imprisoned
00:06:32
right in the in the Guardian you know
00:06:35
Guardian of what what are they protecting
00:06:38
exactly guardian of I don't know
00:06:42
authoritarianism yeah guardian of uh
00:06:45
yeah yeah censorship censorship but but
00:06:48
the premise here is that you bought this
00:06:51
thing this online Forum this
00:06:53
communication platform and you're
00:06:54
allowing people to use it to express
00:06:57
themselves therefore you have to be
00:06:59
jailed I don't understand the logic here
00:07:02
right um there's what do you think
00:07:04
they're actually afraid of at this
00:07:06
point what's the motivation here I mean
00:07:09
I think the if somebody's a if
00:07:12
somebody's sort of trying to push a
00:07:14
false premise on the world then and then
00:07:16
that that and that premise can be
00:07:18
undermined with public dialogue then
00:07:21
they will be opposed to public Dialogue
00:07:23
on that premise because they wish that
00:07:25
false premise to Prevail right um so
00:07:28
that's I think you know the the issue
00:07:31
there is uh if they don't like the truth
00:07:34
uh you know then we want to suppress it
00:07:36
so now the you know the the sort of the
00:07:41
what what we're trying to do with X Corp
00:07:43
uh is uh I distinguish that from my son
00:07:47
who's also called X yes
00:07:50
uh you have you have parental
00:07:53
goals everything's just called X
00:07:55
basically it's very difficult
00:07:57
disambiguation the son yeah everything
00:08:00
um
00:08:01
so what what we're trying to do is
00:08:04
simply adhere to the uh you know the the
00:08:08
laws in a in a country um so so if
00:08:12
something is illegal in the United
00:08:13
States or if it's illegal in you know
00:08:16
Europe or Brazil or or wherever it might
00:08:19
be uh then then we will take it down or
00:08:21
we'll suspend the account because we
00:08:24
we're not you know there to make the
00:08:26
laws we but but if speech is not legal
00:08:30
then then what are we doing okay now
00:08:32
we're injecting ourselves in as as a
00:08:36
sensor and and where does it stop and
00:08:38
who
00:08:39
decides so and where where does that
00:08:42
path lead I think it leads to a bad
00:08:45
place uh so if the people if in a
00:08:49
country want the laws to be different
00:08:51
they should make the laws different but
00:08:53
otherwise we're going to obey the law in
00:08:57
each jurisdiction right and some of
00:08:58
these Europe that's it it's it's not
00:09:00
more complicated there we're not we're
00:09:01
not trying to flout the law we're going to
00:09:01
be clear about that but we're we're
00:09:03
trying to adhere to the law and if the
00:09:05
laws change we will change and if if the
00:09:07
laws don't change we we won't we're just
00:09:09
literally trying to adhere to the law
00:09:12
it's pretty pretty straightforward there
00:09:13
are some very
00:09:14
straightforward if somebody
00:09:16
thinks we're not adhering to the law
00:09:17
well they can file a lawsuit Bingo also
00:09:20
very straightforward I mean there are
00:09:22
European countries that don't want
00:09:23
people to promote Nazi propaganda yes
00:09:25
they have some sensitivity to it well
00:09:27
it's it is illegal and it is illegal
00:09:30
in those countries if somebody puts that
00:09:32
up you take it down yes but they
00:09:34
typically file something and say take
00:09:36
this down no in some cases it is just um
00:09:39
obviously illegal like you don't need to
00:09:41
file a lawsuit for you know if something
00:09:44
is just you know unequivocally illegal
00:09:46
we can literally read the law this
00:09:48
violates the law you know anyone anyone
00:09:50
can see that like you know you don't
00:09:53
need like if if somebody is stealing you
00:09:55
don't need let me check the law on that
00:09:57
okay oh no they're they're stealing
00:09:59
Frisco let's talk so we had we had JD
00:10:02
Vance here this morning he did a great
00:10:04
job um and you know one of the things is
00:10:07
there's this image on X of like
00:10:09
basically like you
00:10:11
Bobby uh Trump and JD are like the
00:10:15
Avengers I guess and then there's
00:10:17
another meme where you're in front of a
00:10:19
desk where it says DOGE the Department of
00:10:22
Government Efficiency yes yes I posted
00:10:25
that one tell us about I I made it using
00:10:27
Grok the Grok image generator
00:10:31
and I posted it tell us about it to my
00:10:33
profile seek for
00:10:36
efficiency um how how do you do
00:10:39
it well I mean
00:10:46
I I think with great difficulty uh but
00:10:50
you know look it's been a long time
00:10:52
since there was a serious effort to
00:10:54
reduce the size of government and to um
00:10:58
remove absurd regulations yeah and you
00:11:01
know the last time there was a really
00:11:02
concerted effort on that front was Reagan
00:11:05
in the early '80s so we're 40 years
00:11:06
away from um a a a a serious effort to
00:11:10
remove um you know regulations that
00:11:14
don't serve the greater good and and
00:11:17
reduce the size of government and I
00:11:19
think it's just if we don't do that then
00:11:22
what's what what's happening is that we
00:11:23
get regulations and laws accumulating
00:11:25
every year until eventually everything's
00:11:28
illegal uh and that's why we can't get
00:11:31
uh major infrastructure projects done in
00:11:32
the United States like if you look at
00:11:34
the absurdity of the California high-speed
00:11:36
rail I think they've spent $7
00:11:38
billion and have a 600-foot segment that
00:11:40
doesn't actually have rail in
00:11:41
it I mean your tax dollars at work I
00:11:44
mean yeah what are we doing that's an
00:11:46
expensive 600 feet of concrete you know
00:11:50
um and and I mean I think it's like if
00:11:54
you know uh I realize sometimes I'm
00:11:57
perhaps a little optimistic with
00:11:58
schedules but
00:12:00
uh you
00:12:02
know I mean I wouldn't be doing the
00:12:04
things I'm doing if I was uh you know
00:12:07
not an an optimist uh so but but but but
00:12:13
at the current Trend you know California
00:12:15
high-speed rail might finish sometime
00:12:17
next
00:12:18
Century maybe probably not we're just
00:12:21
we'll have teleportation by that time so
00:12:23
yeah
00:12:24
exactly AI do everything at that point
00:12:26
so so so so
00:12:29
I think you really think of um you know
00:12:32
the the United States and many countries
00:12:34
it's it's arguably worse than the EU as
00:12:37
being like Gulliver tied down by a million
00:12:39
little strings and like any one given
00:12:42
regulation is not is not that bad but
00:12:45
you've got a million of them and um or
00:12:47
Millions actually and and and then
00:12:50
eventually you just can't get anything
00:12:51
done and and this is a this is a massive
00:12:54
tax on the on the consumer on the people
00:12:57
uh it's just they don't they don't
00:12:58
realize that there's this this massive
00:13:00
tax in the form of irrational
00:13:03
regulations um I'm going I'll give you a
00:13:05
recent uh example that that you know is
00:13:08
is just insane um is that uh like SpaceX
00:13:11
was fined by the EPA
00:13:13
$140,000 for um they claimed dumping uh
00:13:17
potable water on the ground drinking
00:13:19
water so and we're like uh this is that
00:13:22
Starbase and and we're like it's we're
00:13:24
in a tropical uh thunderstorm region
00:13:27
um that stuff comes from the Sky all the
00:13:30
time and
00:13:32
um and there was no actual harm done you
00:13:34
know it was just water to cool the the
00:13:36
the Launchpad during uh liftoff and um
00:13:40
there's zero harm done like and they're
00:13:41
like they agree yes there's zero harm
00:13:42
done we're like okay so there's no harm
00:13:44
done and um you want us to pay $140,000
00:13:48
fine it's like yes because you didn't
00:13:49
have a
00:13:50
permit okay we didn't know there was a
00:13:53
permit needed for zero-harm fresh water
00:13:56
being on the ground in a place
00:13:58
where fresh water falls from the sky all
00:13:59
the
00:14:00
time got it next to the ocean next to
00:14:04
the ocean cuz there's a little bit of
00:14:05
water there too yeah I mean sometimes it
00:14:07
rains so much the roads are flooded
00:14:09
so we're like you
00:14:10
know how does this make any sense yeah
00:14:14
and and then they're like then then they
00:14:16
were like well we're not going to
00:14:17
process any more of your any more of
00:14:18
your applications for launch for
00:14:20
Starship launch unless you pay this
00:14:21
$140,000 they just ransomed us and we're
00:14:24
like okay so we paid $140,000 but it was
00:14:27
a it's like this is no good I mean at
00:14:29
this rate we're never going to get to
00:14:30
Mars I mean that's
00:14:33
the that's the confounding part here
00:14:35
yeah is we're acting against our own
00:14:38
self-interest you know when you look at
00:14:41
we do have to make tradeoffs putting aside fresh
00:14:43
water but hey you know the rocket
00:14:46
makes a lot of noise so I'm I'm certain
00:14:49
there's some complaints about noise once
00:14:51
in a while but sometimes you want to
00:14:52
have a party or you want to make
00:14:53
progress and there's a little bit of
00:14:54
noise therefore you know we we trade off
00:14:57
a little bit of noise for massive
00:14:59
progress or even fun so like when did we
00:15:02
stop being able to make those tradeoffs
00:15:04
but talk about the difference between
00:15:06
California and Texas uh where you and I
00:15:09
now reside um Texas you were able to
00:15:12
build the gigafactory I remember when
00:15:15
you got the plot of land and then it
00:15:17
seemed like it was less than two years
00:15:20
when you had the party to open it yeah
00:15:23
from start of construction um to
00:15:26
completion uh was 14 months 14 14 months
00:15:30
is there anywhere on the planet that
00:15:32
would go faster is like China faster
00:15:33
than that uh China was 11 months got it
00:15:37
so Texas China 11 and 14 months
00:15:41
California how many months and just to
00:15:43
give you a sense of size the Tesla
00:15:46
gigafactory in China is three times the
00:15:47
size of the Pentagon which was the
00:15:49
biggest building in America uh no there
00:15:51
are bigger buildings but the pentagon's
00:15:52
pretty big one yeah where it was the
00:15:53
biggest in units in units of Pentagon
00:15:56
it's like three okay three pentagons and
00:15:59
counting yeah got it in 14 months um the
00:16:04
just the just the regulatory approvals
00:16:07
in California would have taken two
00:16:09
years yeah so that's that's the issue
00:16:11
where where do you think the regulation
00:16:14
helps like for the people that will say
00:16:16
we need some checks and balances we
00:16:17
can't have some because for every good
00:16:19
actor like you there'll be a bad actor
00:16:21
so where is that line then yeah I mean I
00:16:23
have a sort of you
00:16:26
know in in sort of doing sensible
00:16:29
deregulation and um reduction in the
00:16:33
size of government the idea is just like
00:16:35
be very public about it and say like
00:16:37
which of these rules do you if the
00:16:38
public is really excited about a rule
00:16:40
and wants to keep it we'll just keep it
00:16:43
and and here the thing about the rules
00:16:44
if if like if the rule is um you know
00:16:47
turns out to be a bad one we'll just put it
00:16:49
right back okay and then you know
00:16:51
problem solved it's like it's easy to
00:16:53
add rules but we don't actually have a
00:16:55
process for getting rid of them that's
00:16:57
the issue there's no garbage collection for
00:17:00
rules um when we were um watching you work
00:17:05
David and I and Antonio um in that first
00:17:08
month at Twitter which was all hands on
00:17:10
deck and you were doing zero-based
00:17:12
budgeting you really quickly got the
00:17:15
cost under control and then miraculously
00:17:17
everybody said this site will go down
00:17:20
and you added 50 more features so maybe
00:17:23
explain because this is the first time
00:17:25
yeah there like so many articles like
00:17:27
the that this this
00:17:29
Twitter is Dead Forever there's no way
00:17:31
it could possibly even continue at all
00:17:34
it was almost like the press was waiting for
00:17:35
you let's write the obituary
00:17:37
uh they were all saying their
00:17:39
goodbyes on Twitter remember that yeah
00:17:41
yeah yeah they were all leaving and
00:17:42
saying their goodbyes cuz the site was
00:17:43
going to melt down and totally fail and
00:17:46
and uh all the journalists left yeah and
00:17:49
which is if you ever want to like hang
00:17:51
out with a bunch of Hall monitors oh my
00:17:53
God threads is amazing every time I go
00:17:55
over there and post they're like they
00:17:57
they're really triggered but yeah I mean
00:17:59
if you like being condemned repeatedly
00:18:01
then you know for reasons that make no
00:18:04
sense then threads is the way to go yeah
00:18:06
it's really it's it's the most miserable
00:18:08
place on Earth if Disney's the happiest
00:18:11
this is the anti- Disney but if we were
00:18:13
to go into government you went into the
00:18:15
Department of Education or Pi pick the
00:18:17
department you've worked with a lot of
00:18:19
them actually sure you can't go in there
00:18:21
in zero based budget okay we get it but
00:18:24
if you could just pare two three four 5%
00:18:28
of those organizations what kind of
00:18:30
impact would that
00:18:32
have yeah I mean I think we'd need to do
00:18:35
more than that I think ideally but
00:18:37
compounding every year 2 3% a year I
00:18:40
mean it would be better than what's
00:18:41
happening
00:18:43
now yeah I look I think we we um you
00:18:49
know
00:18:50
uh if if Trump wins and obviously I
00:18:54
suspect there are people with mixed
00:18:55
feelings about whether that should
00:18:56
happen but uh but if but we do have an
00:19:00
opportunity uh to do kind of a once- in
00:19:03
a-lifetime deregulation and reduction in
00:19:05
the size of government um because
00:19:07
because the other thing besides the
00:19:08
regulations um America is also going
00:19:10
bankrupt extremely quickly um and and
00:19:14
nobody seems everyone seems to be sort
00:19:15
of whistling past the graveyard on this
00:19:17
one um but they're all they're all
00:19:20
grabbing the silverware everyone's stuffing
00:19:22
their pockets with the silverware
00:19:23
before the Titanic sinks like well you
00:19:26
know the the defense department by
00:19:28
budget is a very big budget okay it's a
00:19:31
trillion dollars a year DOD plus Intel it's
00:19:34
a trillion dollars um and interest
00:19:38
payments on the national debt just
00:19:40
exceeded the defense department budget
00:19:43
they're over a trillion dollar a year
00:19:46
just in interest and Rising we're we're
00:19:49
adding a trillion dollars to the net to
00:19:52
our debt which our you know kids and
00:19:55
grandkids are going to have to pay
00:19:56
somehow um
00:19:59
you know every every three months and
00:20:02
then soon it's going to be every two
00:20:03
months and then every month and then the
00:20:05
only thing we'll be able to pay is
00:20:07
interest yeah and and if if this it's
00:20:10
just you know it's just like a person at
00:20:12
scale that has racked up too much credit
00:20:16
card debt um and
00:20:19
uh this this is not going to have a
00:20:23
good
00:20:24
ending so we have to reduce the spending
00:20:27
let me ask one question cuz I've brought
00:20:29
this up a lot and the counterargument I
00:20:30
hear which I disagree with um but the
00:20:32
counter argument I hear from a lot of
00:20:34
politicians is if we reduce spending
00:20:37
because right now if you add up federal
00:20:38
state and local government spending it's
00:20:41
between 40 and 50% of GDP so nearly half
00:20:45
of our economy is supported by
00:20:48
government spending and nearly half of
00:20:49
people in the United States are
00:20:51
dependent directly or indirectly on
00:20:53
government checks and either through
00:20:56
contractors uh that that the government
00:20:57
pays or their employed by government um
00:20:59
entity so if you go in and you take a
00:21:02
hard axe too fast you will have
00:21:05
significant contraction job loss and
00:21:07
recession what's The Balancing Act Elon
00:21:10
just thinking realistically because I'm
00:21:12
100% on board with you the steps the
00:21:15
next set of steps however assume Trump
00:21:18
wins and you become the the chief uh doe
00:21:23
um uh dog uh like G
00:21:28
how how like yeah and I think the
00:21:30
challenge is how quickly can we yeah how
00:21:32
quickly can we go in how quickly can
00:21:34
things change and without
00:21:39
without I want that on my business card
00:21:41
yeah without all the L without all the
00:21:44
contraction and
00:21:45
job yeah so so I guess how do you really
00:21:48
address it when so much of the economy
00:21:49
and so many people's jobs and
00:21:50
livelihoods are dependent on government
00:21:52
spending well I mean I I do think it's
00:21:55
it's it's sort of um
00:21:59
you know it's it's a false dichotomy it's
00:22:00
not like no government spending is going
00:22:02
to happen um you really have to say like
00:22:04
is it the right level um and just
00:22:07
remember that that you know any any
00:22:10
given person if they are doing things in
00:22:13
a less efficient organization versus
00:22:15
more efficient organization their
00:22:17
contribution to the economy their net
00:22:19
output of goods and services will reduce
00:22:22
um I mean you've got a couple of clear
00:22:23
examples between uh East Germany and
00:22:25
West Germany North Korea and South Korea
00:22:28
um I mean North Korea they're starving
00:22:30
uh South Korea it's like amazing it's
00:22:32
the future the compounding effect of
00:22:34
productivity gains yeah yeah it's night
00:22:36
and day yeah um and so in the north
00:22:38
North Korea you've got 100% government
00:22:40
um and in South Korea you've got
00:22:42
probably I don't know 40% government
00:22:44
it's not zero yeah uh and yet you've got
00:22:46
a standard of living that is probably 10
00:22:47
times higher in South Korea at least at
00:22:50
least exactly um uh and then East and
00:22:53
West Germany um in West Germany uh you
00:22:56
had just thinking in terms of cars I
00:22:58
mean you had BMW Porsche Audi Mercedes
00:23:01
um and and East Germany which is a
00:23:04
random line on a map um you you the the
00:23:08
car only car you could get was a a
00:23:09
Trabant which is basically a lawn mower
00:23:11
with a shell on it um and it was
00:23:14
extremely unsafe and you there's a
00:23:17
20-year
00:23:19
wait so you like you know put your kid
00:23:21
on the list as soon as they're conceived
00:23:24
um and and even then
00:23:26
only I think um you know quarter of
00:23:29
people maybe got got this lousy car and
00:23:33
the same so so that's just an
00:23:34
interesting example of like basically
00:23:36
the same people different operating
00:23:38
system and and it's not like uh West
00:23:41
Germany was some you know you know a
00:23:44
capitalist uh Heaven it was it's quite
00:23:48
socialist actually so uh so when you
00:23:51
look you know probably it was half half
00:23:54
government in West Germany and 100%
00:23:56
government in East Germany and again
00:23:58
sort of a like call it at
00:24:01
least a 5 to 10x standard of living
00:24:03
difference and even qualitatively vastly
00:24:06
better and and it's obviously you know
00:24:08
sometimes people have these amazingly in
00:24:09
this modern era this debate as to which
00:24:11
system is better well I'll tell you
00:24:13
which system is better um the one that
00:24:15
doesn't need to build a wall to keep
00:24:16
people in okay that's that's how you can
00:24:19
tell
00:24:22
okay it's a dead giveaway spoiler alert
00:24:26
dead giveaway they climbing the wall to
00:24:28
get out
00:24:29
you have to build a barrier to keep
00:24:32
people in that is the bad system um
00:24:36
it wasn't West Berlin that built
00:24:37
the wall okay they were like you know
00:24:40
anyone who wants to flee West Berlin go
00:24:41
ahead um speaking of walls so it you
00:24:44
know and and and and if you look at sort
00:24:46
of the flux of boats from Cuba there's a
00:24:49
large number of boats from Cuba and
00:24:52
there's a bunch of free boats that
00:24:54
anyone can take to Cuba
00:24:58
there's like hey wow an abandoned boat I
00:25:00
could use this boat to go to Cuba where
00:25:03
they have communism awesome Yes um and
00:25:06
and and yet nobody nobody picks up those
00:25:08
boats and and does it amazing um so
00:25:12
given this a lot of thought yeah wait so
00:25:13
your point is jobs will be created if we
00:25:15
cut government spending in half jobs
00:25:17
will be created fast enough to make up
00:25:19
for right just to count yes obviously
00:25:22
you know I'm not suggesting that that
00:25:23
people you know um have like immediately
00:25:26
you know tossed out with
00:25:28
no severance and and you know can't now
00:25:31
can't pay their mortgage they need to
00:25:32
see some reasonable offramp uh where
00:25:34
yeah yeah um so reasonable offramp where
00:25:37
you know they're still um you know
00:25:39
earning they're still receiving money
00:25:41
but have like I don't know a year or two
00:25:42
to to f to find jobs in the private
00:25:45
sector which they will find and then
00:25:46
they will be in a different operating
00:25:48
system um again you can see the
00:25:50
difference East Germany was incorporated
00:25:52
into West Germany living standards in
00:25:53
East Germany uh Rose
00:25:56
dramatically um so in four years if you
00:25:59
could shrink the size of the
00:26:01
government with Trump what would be a
00:26:03
good Target just in terms of like
00:26:05
ballpark I mean are you trying to get me
00:26:06
assassinated before this even happens no
00:26:09
no pick a number I mean you know
00:26:12
there's that old phrase go postal I mean
00:26:13
it's like they might yeah on me so we'll
00:26:16
keep the post office I I'm going to need
00:26:18
a all the security details guys yes I
00:26:21
mean this year the number of disgruntled
00:26:24
workers or former government employees
00:26:26
is you know
00:26:29
quite a scary number I mean I might not
00:26:30
make it you know I was saying low single
00:26:32
digits every year for four years would
00:26:34
be palatable yeah and I like your idea
00:26:36
the thing is that if if it's not done uh
00:26:39
like if you have a once once in a
00:26:40
lifetime or once in a generation
00:26:42
opportunity and you don't take Serious
00:26:44
action and and then you have four years
00:26:47
to get it done and then and if it
00:26:50
doesn't get done then how serious is
00:26:52
Trump about this like you've talked to
00:26:53
him about it yeah yeah I think he is
00:26:56
very serious about it got it um and no I
00:26:59
I think actually the reality is that if
00:27:01
we get rid of nonsense regulations and
00:27:03
shift people from the government sector
00:27:06
to the private sector we will have
00:27:08
immense
00:27:09
Prosperity um and and I think we will
00:27:11
have a golden age in this country and
00:27:14
it'll be
00:27:15
fantastic can we uh can we talk about
00:27:20
SpaceX um you have a bunch of critical
00:27:23
Milestones coming up um yeah in fact
00:27:25
there's an important a very exciting
00:27:27
launch
00:27:28
um that may be happening tonight so
00:27:31
if if that the weather is is holding up
00:27:33
then I'm going to leave here head to
00:27:35
Cape Canaveral uh for the um the the Polaris
00:27:39
Dawn mission which is a private mission
00:27:40
so funded by um Jared Isaacman and he's
00:27:44
um awesome guy and and there this will
00:27:47
be the first time uh the first private
00:27:50
first first commercial space walk um and
00:27:52
and it'll be at the highest altitude uh
00:27:56
since Apollo so it's the furthest from
00:27:58
Earth that anyone's
00:28:00
gone um
00:28:04
yeah and you what comes after that let's
00:28:07
assume that's successful and I sure hope
00:28:09
so
00:28:12
man um no
00:28:16
pressure
00:28:17
um yeah we you know absolutely you know
00:28:20
astronaut safety is paramount man if I
00:28:24
had like all all all the wishes I could
00:28:26
save up that would be the one to to put
00:28:28
on so you know space is dangerous um so
00:28:34
the
00:28:36
the yeah the next
00:28:38
Milestone after that would be the next
00:28:41
flight of Starship um which um you know
00:28:45
Flight 5 of Starship is ready
00:28:47
to fly we are waiting for regulatory
00:28:52
approval you know yeah it it it it
00:28:56
really should not be possible to build a
00:28:58
giant rocket faster than the paper can
00:29:02
move from one desk to
00:29:04
[Applause]
00:29:09
another that stamp was really hard
00:29:14
approved yeah you ever see that movie
00:29:16
Zootopia you ever see that movie
00:29:18
Zootopia there's like a
00:29:21
sloth in charge of the approval yeah
00:29:24
they accidentally tell a joke and I was
00:29:26
like oh no this is good here we go going
00:29:28
to take a long time sorry sorry um but
00:29:30
yeah
00:29:31
Zootopia you know you know the funny
00:29:34
thing is like so I went to the
00:29:37
DMV about I don't know a year later
00:29:40
after Zootopia and to get my whatever
00:29:43
license renewal and the guy in in an
00:29:45
exercise of incredible self-awareness
00:29:47
had the sloth from Zootopia in his um in
00:29:50
his cube and he was
00:29:53
actually Swift
00:29:55
yeah with the mandate to beat the
00:29:57
sloth yeah yeah no personal agency
00:29:59
personal agency no I mean some people
00:30:03
like think the you know the government
00:30:06
is um more competent than it than it is
00:30:09
I'm not saying that there aren't
00:30:10
competent people in the government
00:30:11
they're just in an operating system that
00:30:13
is inefficient um once you move them to
00:30:15
a more efficient operating system they
00:30:17
their output is dramatically greater as
00:30:19
we've seen for example you know when East
00:30:22
Germany was reintegrated with
00:30:24
West Germany and and and the same people
00:30:27
um were vastly more prosperous uh with a
00:30:30
basically half capitalist uh operating
00:30:33
system
00:30:34
so um but I
00:30:37
mean for a lot of people their like the
00:30:40
maybe most direct experience with the
00:30:42
government is the DMV um and and then
00:30:46
the important thing to remember is the
00:30:47
the government is the DMV at
00:30:51
scale right that's the government got
00:30:54
the mental picture how much do you want
00:30:55
to scale it
00:30:59
yeah yeah sorry can you go back to
00:31:02
Chamath's um uh question on Starship so you you
00:31:04
announced just the other day Starship
00:31:06
going to Mars in two years and by the
00:31:09
way huh yeah yeah yeah yeah yeah and
00:31:12
then four years for a
00:31:14
crewed uh aspirational launch in the next
00:31:16
window and how much is the government
00:31:18
involved I'm not saying like say you
00:31:20
watch by these not you know uh but these
00:31:23
uh but based on our current progress
00:31:27
where with with Starship We were able to
00:31:29
successfully reach orbital velocity
00:31:31
twice uh we were able to achieve soft
00:31:33
Landings of the the booster and the ship
00:31:36
in water uh and that's despite the ship
00:31:39
having you know half its flaps cooked
00:31:41
off um you can see the video on the X
00:31:43
platform it's quite exciting um so you
00:31:47
know we we we think we'll be able to
00:31:49
launch reliably and repeatedly
00:31:53
and quite quickly um and the the the
00:31:57
fundamental Holy Grail breakthrough for
00:31:59
rocketry the fundamental
00:32:02
breakthrough that is needed for life to
00:32:04
become multiplanetary is a rapidly
00:32:07
reusable reliable rocket
00:32:11
Arrr for the pirate somehow um throw a
00:32:15
pirate in there um the so with Starship
00:32:21
is the first rocket design
00:32:25
where success is one of the possible outcomes
00:32:28
with full
00:32:29
reusability um so you know for any given
00:32:31
project you have to say this is the
00:32:33
diagram um there's a circle
00:32:37
and is the success dot in the
00:32:39
circle um is success in the set of
00:32:43
possible outcomes that's uh you know
00:32:46
sounds pretty obvious but there are
00:32:47
often projects where
00:32:50
success is not in the set of possible
00:32:52
outcomes um and so so
00:32:55
Starship not only is full
00:32:58
reusability in the set of possible
00:32:59
outcomes it it is being proven with each
00:33:01
launch um and and and I'm confident it
00:33:04
will succeed it's simply a matter of
00:33:06
time and you know if if we can get some
00:33:12
improvement in the speed of Regulation
00:33:14
we we could actually move a lot faster
00:33:16
um uh so that would that would be very
00:33:20
helpful and and in fact if if this if
00:33:23
not if something isn't done about um
00:33:26
reducing regulation and sort of speeding
00:33:29
up approvals and to be clear I'm not
00:33:31
talking about anything unsafe it's
00:33:33
simply the processing of the safe thing
00:33:36
can be done as fast as the
00:33:39
rocket is built not slower then uh then
00:33:43
then we could become a spacefaring
00:33:44
civilization and a multiplanet species
00:33:47
ultimately and be out there among the
00:33:48
stars in the future and
00:33:53
there's you know it's it's just very
00:33:55
like it's incredibly important that we
00:33:58
have things that that we find inspiring
00:34:01
that you look to the Future and say the
00:34:04
future is going to be better than the
00:34:05
past things to look forward to and like
00:34:09
like kids are a
00:34:11
good a good way to assess this like what
00:34:13
are kids fired up about and if you can say
00:34:17
you know you you could you know you
00:34:19
could be an astronaut on Mars you you could
00:34:21
maybe one day uh go beyond the solar
00:34:24
system um we could make Star Trek
00:34:27
Starfleet Academy real um that is an
00:34:31
exciting future that is
00:34:34
inspiring um you know just I mean you
00:34:37
need things that move your heart right
00:34:40
um
00:34:42
yeah [ __ ] yeah [ __ ] yeah let's do it I
00:34:47
mean it like like life can't just be
00:34:50
about solving one miserable problem after
00:34:52
another there's got to be things that
00:34:53
you look forward to as well yeah uh and
00:34:56
and do do you think you might have to
00:34:58
move it to a different jurisdiction and
00:35:00
to move faster I've always wondered if
00:35:02
like it's rocket technology is
00:35:04
considered an advanced weapons
00:35:05
technology so we can't just go do it you
00:35:07
know in another country yes it is yeah
00:35:10
interesting and if we don't do it other
00:35:11
countries could do it I mean they're so
00:35:14
far behind us but theoretically there's
00:35:16
a national
00:35:18
security you know justification here if
00:35:21
if somebody can put their thinking caps
00:35:23
on like do we want to have this
00:35:24
technology that you're building the
00:35:26
team's working so hard on stolen by
00:35:27
other countries and then you know maybe
00:35:30
they don't have as much red tape I I
00:35:32
wish people were trying to steal it um
00:35:35
so that no no one's trying to steal it
00:35:38
it's just
00:35:39
too it's just too crazy
00:35:42
basically um and that's for you yeah
00:35:46
it's way too crazy Elon what do you think
00:35:48
um is going on that led to
00:35:52
Boeing building the Starliner the way that
00:35:54
they did they were able to get it up
00:36:00
but not complete but can't complete they
00:36:02
can't finish can't finish and you're
00:36:05
going to have to go up and
00:36:06
finish um
00:36:10
um well I mean I think Boeing is a
00:36:14
company that is you they actually do so
00:36:17
much business with the government they
00:36:18
have sort of impedance matched to the
00:36:20
government so they're they're like
00:36:22
basically one notch away from the
00:36:24
government maybe two they're not far
00:36:26
from the government from an efficiency
00:36:28
standpoint because they derive so much
00:36:29
of the revenue from the government um
00:36:32
and a lot of people think well SpaceX is
00:36:33
super dependent on the government and
00:36:35
actually no most of our revenue is
00:36:36
commercial um
00:36:39
so
00:36:41
um and and and and
00:36:45
there's been I think at least up until
00:36:48
perhaps recently because they have a new
00:36:50
CEO who actually shows up in the factory
00:36:53
yeah um and the the the CEO before that
00:36:55
I think had a degree in accounting and
00:36:57
and never went to the factory and didn't
00:36:59
know how airplanes
00:37:00
flew um so I think if you are in charge
00:37:04
of a company that makes airplanes fly
00:37:08
and spacecraft go to orbit you need to
00:37:12
know it can't be a total mystery as to
00:37:15
how they work
00:37:18
yeah
00:37:19
so you know I'm like sure if somebody's
00:37:23
like running Coke or Pepsi and and they're
00:37:25
like great at marketing or whatever uh
00:37:27
that's that's fine because you know it's
00:37:31
it's not a sort of Technology dependent
00:37:32
business um you know or if they're
00:37:35
running a you know financial consulting
00:37:38
and their degree is in accounting that
00:37:39
makes sense um but I think uh you know
00:37:43
if you if you're the Cavalry Captain you
00:37:45
should know how to ride a horse pretty
00:37:47
basic yeah
00:37:49
yeah great it's like it's disconcerting
00:37:52
if the Cavalry Captain just falls off
00:37:53
the horse you
00:37:55
know he's scared of
00:37:58
them I'm sorry I'm scared of horses gets
00:38:00
on backwards I'm like
00:38:03
oops um sh shifting gears to AI uh Peter
00:38:06
was here earlier and he was talking
00:38:08
about how so far the only company to
00:38:09
really make money off AI is NVIDIA with
00:38:12
the chips um do you have a sense yet of
00:38:15
where you think the big applications
00:38:17
will be from AI is it going to be an
00:38:20
enabling self-driving is it going to be
00:38:21
enabling robots is it transforming
00:38:24
Industries I mean it's still I think
00:38:26
early in terms of where the big business
00:38:29
impact is going to be do you have a
00:38:30
sense
00:38:41
yet I I mean I think I think they that
00:38:45
they the spending on AI probably runs
00:38:48
ahead of I mean does run ahead of the
00:38:50
revenue right now that's there's no
00:38:52
question about that um but the rate of
00:38:56
improvement of AI is faster than any
00:38:58
technology I've ever seen by far
00:39:01
and and and it
00:39:05
it's I mean like the for example the
00:39:08
Turing test used to be a thing now you
00:39:12
know your basic uh open source random
00:39:15
LLM running on a freaking Raspberry Pi
00:39:17
probably could uh you know beat the
00:39:20
Turing test
00:39:22
um so
00:39:24
there's I I I think actually
00:39:28
like like the the good future of AI is
00:39:32
one of immense Prosperity
00:39:36
where there is an age of abundance no
00:39:40
shortage of goods and
00:39:42
services everyone can have whatever they
00:39:45
want except for things we
00:39:47
artificially Define to be scarce like
00:39:49
some special
00:39:50
artwork um but but anything that is a
00:39:53
manufactured good or provided Service uh
00:39:56
will I think with the advent of AI plus
00:39:59
robotics that the cost of goods and
00:40:02
services will be
00:40:05
will trend to zero like I'm not saying
00:40:08
it'll be actually zero but it'll
00:40:10
be it every everyone will be able to
00:40:14
have anything they want uh that that's
00:40:16
the good future of course and you know
00:40:19
in my view that's probably 80% likely so
00:40:21
look on the bright
00:40:23
side only 20% 20% probability of
00:40:26
annihilation nothing
00:40:28
um is is the 20% like what does that
00:40:31
look like I don't know man I mean
00:40:35
frankly I do have to go engage in some
00:40:37
degree of of deliberate suspension of
00:40:38
disbelief with respect to AI in order to
00:40:41
sleep well um and even then um because I
00:40:45
I I I think the actual issue the the
00:40:48
most likely issue is like well how do we
00:40:49
find meaning in a world where AI can do
00:40:51
everything we can do a bit better that
00:40:53
that is that is perhaps the bigger
00:40:55
challenge um
00:40:58
although you know at this point I know
00:41:00
more and more people who are retired and
00:41:01
they seem to enjoy that life
00:41:05
so uh but I think that that may be maybe
00:41:08
there'll be some crisis of meaning like
00:41:10
because the computer can do everything
00:41:12
you can do but better so maybe that'll
00:41:15
be a challenge um but but really uh you
00:41:21
know you need you need the sort of end
00:41:22
effectors you need the the autonomous cars
00:41:26
and
00:41:27
you need the sort of humanoid robots or
00:41:29
your general purpose robots um but once
00:41:32
you have general purpose humanoid robots
00:41:36
um and autonomous
00:41:38
vehicles uh you really you you you can
00:41:42
build anything um and and and this I
00:41:46
think that there's no actual limit to
00:41:48
the size of the economy I mean there
00:41:50
obviously you know the mass of Earth you
00:41:52
know like that's one limit um but the you
00:41:57
know the economy is is really just the
00:41:59
average productivity per person times
00:42:01
number of people that's the economy and
00:42:04
if you if you've got humanoid robots
00:42:08
that can do you know where there's no
00:42:10
real limit on the number of humanoid
00:42:12
robots um and and they they can operate
00:42:15
very intelligently then then there's no
00:42:18
actual limit to the economy there's
00:42:20
no meaningful limit to the economy you
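Musk's formula here, the economy is roughly average productivity per person times number of people, can be sketched as a toy model. Every number below (population, productivity figures, robots per human) is an illustrative assumption for the sketch, not a figure from the conversation.

```python
# Toy model of "the economy is average productivity per person times
# number of people", extended with robot workers. All numbers are
# illustrative assumptions, not data from the conversation.

def economic_output(humans, robots_per_human, human_prod, robot_prod):
    """Total output if robots count as additional productive workers."""
    robots = humans * robots_per_human
    return humans * human_prod + robots * robot_prod

baseline = economic_output(8e9, 0, 20_000, 0)           # humans only
with_robots = economic_output(8e9, 2, 20_000, 20_000)   # 2 robots per human
print(with_robots / baseline)  # → 3.0: output triples in this toy model
```

With two robots per human, each as productive as a human, total output is exactly three times the human-only baseline, which is the sense in which the limit scales with the number of workers rather than the number of people.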
00:42:22
guys just turned on Colossus which is
00:42:24
like the largest private cluster I guess
00:42:28
of gpus anywhere is that it's it's the
00:42:31
it's the most powerful supercomputer of
00:42:32
any kind um which sort of speaks to what
00:42:36
David said and kind of what Peter said
00:42:38
which is a lot of the kind of economic
00:42:41
value so far of AI has entirely gone
00:42:44
to Nvidia but there are people with
00:42:46
Alternatives and you're actually one
00:42:47
with an alternative now you have a very
00:42:49
specific case because Dojo is really
00:42:50
about images and large images huge
00:42:54
video um yeah I mean the the the Tesla
00:42:57
problem is different from the
00:43:00
um you know the sort of llm problem uh
00:43:03
the nature of the intelligence
00:43:05
is actually different and
00:43:07
and the what what matters in the AI is
00:43:10
is different um to to the point you just
00:43:13
made which is that in Tesla's
00:43:14
case the context uh length is very long
00:43:18
so we've got gigabytes of context G
00:43:19
context Windows yeah yeah you've got you
00:43:21
know sort of uh we just bringing it up
00:43:25
kind of billions of tokens of context an insane
00:43:27
amount of context because you've got um
00:43:30
seven seven cameras and if if you've got
00:43:32
several you know let's say you've got a
00:43:34
minute of video from several high-def cameras
00:43:37
then that's gigabytes so you need to
00:43:39
compress so the Tesla problem is you got
00:43:41
to compress a gigantic context um into
00:43:45
the the pixels that are that actually
00:43:47
matter um
00:43:50
and you know and and and condense that
00:43:53
over a time so you've got in both uh the
00:43:57
time Dimension the space Dimension
00:43:58
you've got to compress the pixels in
00:44:01
space and the pixels over in time
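The compression problem being described, gigabytes of multi-camera video squeezed down in both the space and time dimensions, can be made concrete with a small sketch. The seven-camera count is from the conversation; the frame rate, resolution, and the naive strided downsampling are illustrative assumptions, not Tesla's actual learned pipeline.

```python
def context_bytes(cameras, fps, seconds, height, width):
    """Raw size of a video context window, assuming 1 byte per pixel.
    The fps and resolution values used below are assumptions."""
    return cameras * fps * seconds * height * width

def downsample(video, time_stride=4, space_stride=4):
    """Crude stand-in for compression: keep every Nth frame, row, column.
    video is a nested list: frames -> rows -> pixels."""
    return [[row[::space_stride] for row in frame[::space_stride]]
            for frame in video[::time_stride]]

# One minute from 7 cameras at an assumed 36 fps and 960x1280: ~18.6 GB raw.
print(context_bytes(7, 36, 60, 960, 1280))  # → 18579456000

# Tiny stand-in clip: 64 frames of 96x128 "pixels".
clip = [[[0] * 128 for _ in range(96)] for _ in range(64)]
small = downsample(clip)
print(len(small), len(small[0]), len(small[0][0]))  # → 16 24 32
```

Striding by 4 in time and in both spatial axes cuts the data by a factor of 64; the real problem is choosing which pixels actually matter, which is what the learned model does.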
00:44:04
um and and and then and then have that
00:44:07
inference done on a tiny computer
00:44:09
relatively speaking a small you know a
00:44:12
few hundred watt uh it's a Tesla
00:44:15
designed AI inference computer uh which
00:44:17
is by far still the best there isn't a
00:44:20
better thing we could buy from suppliers
00:44:21
so the Tesla designed AI inference
00:44:24
computer that's in the cars is better
00:44:26
than anything we could buy from any
00:44:27
supplier just by the way that's kind of
00:44:29
a by the way the Tesla AI chip team is
00:44:33
extremely good you guys in the design
00:44:34
there is a technical paper and there was
00:44:36
a deck that somebody on your team from
00:44:38
Tesla published and it was stunning to
00:44:40
me you designed your own transport
00:44:42
control like layer over ethernet you're
00:44:44
like ah ethernet's not good enough for
00:44:46
us you have this TTPoE or something and
00:44:49
you're like oh we're just going to
00:44:50
reinvent ethernet and like string these
00:44:51
chips it's pretty incredible stuff
00:44:53
that's happening over there yeah um no
00:44:56
the team the the Tesla chip design team
00:44:59
is extremely extremely good um so um but
00:45:03
is there a world where for example other
00:45:05
people over time that need you know some
00:45:07
sort of like video use case or image use
00:45:10
case theoretically you know you'd say oh
00:45:12
why not you know I have some extra
00:45:13
Cycles over here so which should kind of
00:45:15
make you a competitor of Nvidia it's not
00:45:17
intentionally per se
00:45:19
but um yeah I mean
00:45:23
the you know this training and inference
00:45:25
and we do have you know those two
00:45:27
projects at Tesla we've got Dojo which is
00:45:30
the the training computer uh and then um
00:45:33
you know our inference chip which is in
00:45:37
every every car inference computer um so
00:45:42
and Dojo we've only had Dojo 1 Dojo 2
00:45:45
is um you know should be we should have
00:45:48
Dojo 2 in volume towards the end of next
00:45:49
year um and and that that that will be
00:45:53
we think sort of comparable to uh the
00:45:58
sort of a B200-type system a
00:46:00
training system um and um you know so
00:46:05
there's I guess there's some potential
00:46:06
for for that to be used as a service um
00:46:11
and but like
00:46:14
you Dojo is is just kind of like I mean
00:46:18
we're I guess I guess I have like some
00:46:21
improved confidence in Dojo
00:46:25
um but I think we won't really know how
00:46:28
good Dojo is until probably version
00:46:31
three like usually takes three major
00:46:33
iterations on a technology for it to be
00:46:35
to be excellent um and we'll only have
00:46:38
the second major iteration next year um
00:46:42
the third iteration I don't know maybe
00:46:44
late you know 26 or something like that
00:46:47
how's the uh how's the Optimus project
00:46:49
going I remember when we talked last um
00:46:51
and you said this publicly that it's in
00:46:54
doing some light testing inside the
00:46:56
factory
00:46:57
um so it's actually being useful what's
00:46:59
the build of materials and when you know
00:47:02
for something like that at scale so when
00:47:04
you start making it like you're making
00:47:05
the model three now and there's a
00:47:06
million of them coming off the factory
00:47:08
line what would the would they cost 20
00:47:10
30 $40,000 you think yeah I mean what I
00:47:13
mean I've discovered really that you
00:47:16
know anything made in sufficient volume
00:47:19
will asymptotically approach the cost of
00:47:21
its of its uh
00:47:24
materials so now there's there's I
00:47:26
should say the there's some some things
00:47:28
are constrained by the cost of
00:47:30
intellectual property and like paying
00:47:32
for patents and stuff so a lot of you
00:47:34
know what what's in a a chip is like
00:47:36
paying paying royalties um and
00:47:39
depreciation of the chip fab so but the
00:47:42
actual marginal cost of the chips is very
00:47:43
low um so so so Optimus it obviously is
00:47:48
humanoid robot it weighs much
00:47:51
less and is much smaller than a car
00:47:53
um so the you could expect that in high
00:47:57
volume uh and and I've said you also
00:48:00
probably need three three production
00:48:02
versions of Optimus so you need to
00:48:04
refine the design at least three
00:48:07
major times and and then you need scale
00:48:10
production
00:48:11
to sort of the million unit plus per
00:48:14
year
00:48:15
level and I think at that point the cost
00:48:20
the the you know the labor materials on
00:48:23
Optimus is probably not much more than
00:48:26
10,000
00:48:27
yeah and that's a decade long journey
00:48:29
maybe basically think of it like Optimus
00:48:31
will cost less than um a a small car
00:48:36
right
00:48:37
so at at scale volume with three major
00:48:41
iterations of technology and and so if a
00:48:43
small car you know costs $25,000 you
00:48:46
know it's it's probably like I don't
00:48:49
know $20,000 for for an Optimus for a
00:48:51
humanoid robot that can be your
00:48:54
buddy like a combination of R2-D2 and
00:48:57
C-3PO but
00:48:58
better
00:49:01
I think people are going to get really
00:49:02
attached to their humanoid robot because
00:49:04
I mean like you look at sort of watch
00:49:06
Star Wars and it's like R2-D2 and C-3PO I
00:49:08
love those guys um you know they're
00:49:11
awesome um and their personality and and
00:49:15
I mean and all all R2 could do is just
00:49:17
beep at you couldn't speak English
00:49:22
um C-3PO had to translate the beeps you know
00:49:25
so you're in year two of that if you did
00:49:27
two or three years per iteration or
00:49:28
something it's a decade long journey for
00:49:30
this to hit some sort of scale I I would
00:49:33
say m major iterations are less than two
00:49:36
years so okay um it's probably on the
00:49:39
order of five five years yeah
00:49:43
uh maybe six to get to a million units a
00:49:46
year and at that price point everybody
00:49:48
can afford one on planet Earth I mean
00:49:51
it's going to be that one-to-one two-to-one
00:49:53
what do you think ultimately if we're
00:49:54
sitting here in 30 years the number of
00:49:57
robots on the planet versus humans yeah
00:50:00
I think the number of robots will vastly
00:50:01
exceed the number of humans vastly yeah
00:50:03
vastly exceed I mean you have to say
00:50:04
like who who would not want their robot
00:50:08
buddy everyone wants a robot buddy
00:50:12
um you know this is like it especially
00:50:15
if it can you know you know it can take
00:50:18
care of your your take your dog for a
00:50:21
walk it could you know mow the lawn
00:50:24
it could watch your kids uh it could you
00:50:26
know like it could it could teach your
00:50:29
kids it could it could we could also
00:50:31
send it to Mars we could send a lot of
00:50:33
robots to Mars to do the work needed to
00:50:36
yeah make it a colonized planet for you
00:50:38
Mars is already the robot Planet there's
00:50:39
like a whole bunch of you know robots
00:50:41
like rovers and a robot helicopter yes only
00:50:44
robots um so yeah the no I I think the
00:50:49
the sort of useful humanoid
00:50:52
robot opportunity is the single biggest
00:50:55
opportunity
00:50:58
ever
00:51:01
um because if you assume like that I
00:51:04
mean the I think the ratio of humanoid
00:51:06
robots to humans is going to be at least
00:51:07
2 to one maybe 3 to one because
00:51:10
everybody every everybody will want one
00:51:11
and then there'll be a bunch of robots
00:51:13
that you don't see that are making goods
00:51:14
and services and you think it's a
00:51:15
general one generalized robot that then
00:51:18
learns how to do different tasks or yeah
00:51:21
hey um I mean we are a
00:51:23
generalized yeah we're a generalized
00:51:25
robot we're just made of meat you know
00:51:28
uh we're a general meat robot yeah I mean
00:51:31
operating my meat puppet you know um so
00:51:34
um yeah we are
00:51:36
actually and by the way it turns out
00:51:38
like as we're designing Optimus we sort
00:51:41
of learn more and more about why humans
00:51:43
are shaped the way they're shaped and
00:51:46
you know and why we have five fingers
00:51:48
and why your little finger is smaller
00:51:50
than you know your index finger uh you
00:51:53
know you know obviously why you have
00:51:55
opposable thumbs but also why for
00:51:57
example your the muscles the major
00:52:00
muscles that operate your hand are
00:52:02
actually in your forearm and and your
00:52:05
fingers are primarily operated like um
00:52:09
your the muscles that actuate your
00:52:12
fingers um are located the vast majority
00:52:15
of the of your finger strength is
00:52:17
actually coming from your
00:52:18
forearm um and your fingers are being
00:52:20
operated by tendons little
00:52:22
strings that that's and so the current
00:52:26
current version of the Optimus hand uh
00:52:29
has the actuators in the hand and has
00:52:32
only 11 degrees of freedom so it
00:52:33
doesn't have all the degrees
00:52:35
of freedom of a human hand which has
00:52:38
depending on how you count it roughly 25
00:52:40
degrees of freedom um and
00:52:44
uh and and and and it's also like not
00:52:47
strong enough in certain ways because
00:52:49
the actuators have to fit in the hand um
00:52:52
so the Next Generation Optimus hand uh
00:52:54
which we have in Prototype form
00:52:57
uh the the actuators have moved to the
00:52:59
forearm just like a human and they
00:53:01
operate the the fingers through cables
00:53:03
just like a human hand and uh and then
00:53:07
the next generation hand has 22 degrees
00:53:09
of
00:53:09
freedom um which we think
00:53:12
is enough to do almost anything that a
00:53:15
human can do
00:53:18
um and presumably I think it was written
00:53:21
that xAI and Tesla may work together and
00:53:25
you know provide services but my
00:53:27
immediate thought went to oh if you just
00:53:28
provide Grok to the robot then the
00:53:30
robot has a personality and can process
00:53:32
oh yeah voice and video and images and
00:53:34
all of that stuff as the uh as we wrap
00:53:36
here U I think uh you know everybody
00:53:40
talks about all the projects you're
00:53:42
working on but um people don't know you
00:53:44
have a great sense of humor that's not
00:53:46
true oh you do you do um people don't
00:53:49
see it but I would say one of I know for
00:53:51
me the funniest week of my life or one
00:53:53
of the funniest was when you did SNL and
00:53:55
we got and you you I got to tag along
00:53:58
maybe you saw it um maybe behind the
00:54:02
scenes like some of your funniest
00:54:05
Recollections of that chaotic insane
00:54:08
week when we laughed for 12 hours a day
00:54:11
it was a little terrifying on the first
00:54:12
couple of days but yeah I was I was a bit
00:54:15
worried at the beginning there because
00:54:17
frankly nothing was funny um day one was
00:54:21
rough rough um yeah so I mean
00:54:27
it's like a rule but can't you guys just
00:54:28
say it just say the stuff that got on
00:54:30
the cutting room floor some of the funniest skits were
00:54:34
the ones that they didn't let you do that's
00:54:35
what I'm saying can you just say there
00:54:36
were a couple of funny ones yeah that
00:54:37
they didn't let you you can say it so
00:54:38
that he doesn't get I mean how much time
00:54:40
do we have here well we should just give
00:54:42
one or two because it
00:54:44
was in your mind which one do we regret
00:54:47
most not getting on
00:54:50
air you really want to hear that I I
00:54:54
mean I mean it was a little I see it was
00:54:57
a little
00:54:59
funny
00:55:01
okay here we go all right here we go
00:55:06
guys all right so one of the things that
00:55:11
um I think everyone's been sort of
00:55:12
wondering this whole time is is Saturday
00:55:14
night Saturday Night Live actually live
00:55:18
like live live live live or do they have
00:55:21
like a delay or like just in case you
00:55:24
know there's a wardrobe malfunction or
00:55:26
something like that uh is it like a you
00:55:29
know a 5-second delay what's really going on
00:55:33
but there's a there's a way to test this
00:55:35
right we came out the way there's a way
00:55:37
to test this um which is we don't tell
00:55:40
them what's going on I walk on and
00:55:43
say this is the script I throw it on
00:55:45
the ground we're going to find out
00:55:48
tonight right now it's Saturday if Saturday
00:55:50
Night Live is actually
00:55:53
live and the way that we're going to do
00:55:56
this is I'm going to take my [ __ ]
00:56:01
[Applause]
00:56:04
out this is the greatest pitch ever and
00:56:07
and if if if if you see my
00:56:12
[ __ ] you know it's
00:56:15
true and if you don't it's been a lie
00:56:18
it's been a lie all these years all
00:56:20
these years now this is we're going to
00:56:22
bust them right now and this we're
00:56:24
pitching this yeah yeah so pitching this on
00:56:27
Zoom yeah pitching this on Zoom on like a
00:56:29
Monday after like yeah we're like kind
00:56:31
of hung over from the weekend and like
00:56:32
pitch this and and and it's uh it's you
00:56:36
know Jason's on um and uh Mike and you
00:56:40
yeah and Mike uh you know got like you
00:56:43
know who my friends who I think are sort
00:56:44
of you know quite funny um you know uh
00:56:48
Jason's quite funny I think like like
00:56:51
Jason's the closest thing to Cartman
00:56:52
that exists in the real in real life
00:56:56
we have a joke that he's Butters and I'm
00:56:58
Cartman yeah so um and then my friend
00:57:04
Mike's pretty funny too so so we we come
00:57:07
in like like just like guns blazing guns
00:57:10
blazing with with like ideas and we
00:57:12
didn't realize like actually you know
00:57:14
that's not how it works and and and uh
00:57:16
that that it's normally like actors and
00:57:18
and they just get told what to do and
00:57:20
like oh right well you mean we can't
00:57:21
just like do funny things that we
00:57:24
thought of what they're watching this and
00:57:27
on the Zoom they're
00:57:29
aghast yeah it's silence like so I'm
00:57:32
like and I'm like and I was like is this
00:57:34
thing working is this are we muted is is
00:57:36
our mic on they're like we hear you yeah
00:57:39
and then and then after a long silence
00:57:41
like Mike's Mike just says the word
00:57:43
crickets crickets and they're not
00:57:45
laughing they're not even going
00:57:47
to chuckle like what's going on and then
00:57:49
Elon explains the punchline yes which is
00:57:51
exactly so there's more to it okay yes
00:57:58
that's just the
00:57:59
beginning so Elon says so so then I'm so
00:58:02
I'm like so so so so I said like I'm I'm
00:58:06
I'm going to I'm going to reach
00:58:07
[Laughter]
00:58:10
down into my pants and and I
00:58:13
stick my hand on my pants and I'm going
00:58:15
and I'm and I'm going to pull my [ __ ] out and
00:58:16
I tell this to the audience and the
00:58:17
audience is going to be like
00:58:19
go right and and and and and then and
00:58:24
and then and then and then I pull out a
00:58:27
a a baby
00:58:29
rooster you know yes and it's like okay
00:58:32
this is kind of PG you know it's like
00:58:34
that not that bad it's like it's this is
00:58:37
my tiny [ __ ]
00:58:39
and
00:58:42
and and it's like what do you think uh
00:58:46
and so then and do you think it's a nice
00:58:48
[ __ ] I mean I like it I pitch it I'm like
00:58:51
and then Kate McKinnon walks out yeah
00:58:53
exactly and I'm like oh no but you
00:58:54
haven't heard half of it Kate McKinnon
00:58:56
comes out yeah and she says Elon
00:58:59
expected you would have a bigger [ __ ]
00:59:02
yeah I like I I don't mean to disappoint
00:59:05
you Kate but yeah um but I I I hope you
00:59:07
like it anyway
00:59:09
um but Kate's got to come out with with
00:59:12
with her cat okay right uh so and Kate
00:59:16
says you see where you can see where
00:59:17
this is going and I say nice wow that's
00:59:20
that's a that's a that's a nice [ __ ]
00:59:22
you've got there
00:59:24
Kate wow that's amazing um it looks a
00:59:28
little wet was it raining
00:59:31
outside and then
00:59:34
um do you mind if I stroke your [ __ ] is
00:59:36
that cool it's like oh no Elon actually
00:59:40
can I hold your [ __ ] of course of course
00:59:43
Kate you can definitely hold my [ __ ] um and
00:59:46
and then you know we exchange and I think
00:59:48
just the audio version of this is pretty
00:59:49
good right um and and and um you know so
00:59:54
it's like wow you I really like um
00:59:56
stroking your [ __ ] and I was like
01:00:01
I and then you say I'm really enjoying
01:00:04
stroking your [ __ ] yes of course and um
01:00:08
yeah so you know they're looking at us
01:00:11
like oh my god what have we done
01:00:13
inviting these lunatics on the
01:00:16
program yeah they said they said like
01:00:19
well um it is uh it is Mother's
01:00:23
Day it's Mother's Day we might not want
01:00:26
to go with this one the mom's in the
01:00:28
audience and I'm like well that's a good
01:00:30
point well fair fair it might be a bit
01:00:32
uncomfortable for all the moms in the
01:00:34
audience maybe I don't know I don't know
01:00:35
maybe they'll dig it maybe they like it
01:00:37
uh so yeah that
01:00:40
was that's the um that's the that's the
01:00:44
that's the um cold open that didn't make
01:00:47
it we didn't get that on the air um but
01:00:51
uh we did fight for Doge yes and we got
01:00:54
Doge on the air and there's a bunch of
01:00:56
things that I said that were just not on
01:00:57
the script like they have these like cue
01:00:59
cards for what you're supposed to say
01:01:00
and I just didn't say it I just went off
01:01:01
off the rails yeah they didn't see that
01:01:04
coming yeah it's live well it's
01:01:09
live and uh so the Elon wanted to do
01:01:13
Doge this is the other one and he wanted
01:01:15
to do Doge on late night and he says um
01:01:18
Hey J-Cal can you um make sure oh yeah
01:01:20
so I want yeah I wanted to do the Doge
01:01:21
father like you sort of redo the you
01:01:23
know that scene from uh the
01:01:26
Godfather I mean you kind of need the
01:01:27
music to cue things
01:01:31
up you bring me on my daughter's
01:01:35
wedding listen you ask for Doge yeah you
01:01:37
got R and I give you Bitcoin but you
01:01:40
want do exactly you really got to set
01:01:42
the mood you have to have tuxedo
01:01:45
andice you got to have like
01:01:51
Mar you come to me on this day of my
01:01:55
Doge's wedding
01:01:57
and you ask me for your private
01:02:02
keys are you even a friend you'll call
01:02:05
me the Doge
01:02:08
father so you
01:02:10
know so that has potential they had
01:02:14
great potential so they come to me and
01:02:16
I'm I'm talking to Colin Jost who's
01:02:20
got a great sense of humor and he's
01:02:21
amazing he loves Elon and he's like we
01:02:23
can't do it because of the law and stuff
01:02:25
like that and
01:02:26
the law and liability so I said it's
01:02:29
okay Elon called Comcast and he put in
01:02:33
an offer and they just accepted it we
01:02:36
just bought NBC so it's fine yeah and
01:02:41
Colin Jost looks at me I sold this so
01:02:43
good and he's like you're you're serious
01:02:46
I'm like yep we own NBC now yeah and
01:02:51
he's like okay well that kind of changes
01:02:53
things doesn't it I'm like absolutely
01:02:55
we're a go on Doge yeah and then
01:02:57
he's like you're [ __ ] with me and I'm
01:02:59
I'm [ __ ] with
01:03:00
you or are we or are
01:03:03
we it was the greatest week of and that
01:03:07
like is like two of 10 stories yeah we
01:03:10
got yeah we got we'll save the other
01:03:11
eight yeah but it was and I was just so
01:03:14
happy for you to see you have a great
01:03:18
week of just joy and fun and letting go
01:03:20
cuz you were launching Rockets you're
01:03:22
dealing with so much [ __ ] in your
01:03:23
life to have those moments yeah to share
01:03:26
them and just laugh um it was just so
01:03:28
great and more of those moments I think
01:03:30
we got to we got to get you back on SNL
01:03:33
who wants them back on SNL one more time
01:03:35
all right ladies and gentlemen our
01:03:37
bestie Elon Musk
01:03:39
[Applause]
01:03:42
[Laughter]
01:03:42
[Applause]

Badges

This episode stands out for the following:

  • Most quotable: 80
  • Most viral: 80
  • Most memeable: 75
  • Best concept / idea: 75

Episode Highlights

  • A Simulation?
    Musk humorously suggests that the world's current state feels like a simulation.
    “If we are in some alien Netflix series, I think the ratings are high.”
    @ 00m 54s
    September 10, 2024
  • Elon Musk on Free Speech
    Musk discusses the global movement to quell free speech and its implications.
    “The price of freedom of speech is not cheap.”
    @ 01m 10s
    September 10, 2024
  • Regulatory Challenges
    Musk highlights the absurdity of regulations hindering progress, using SpaceX as an example.
    “We're acting against our own self-interest.”
    @ 14m 35s
    September 10, 2024
  • Comparing Government Systems
    A stark contrast between North and South Korea illustrates the impact of government efficiency on living standards.
    “In North Korea, they're starving; in South Korea, it's like amazing.”
    @ 22m 30s
    September 10, 2024
  • The Future of Space Exploration
    Elon Musk discusses the upcoming private space mission and the potential for commercial space travel.
    “This will be the first commercial space walk at the highest altitude since Apollo.”
    @ 27m 40s
    September 10, 2024
  • AI and the Economy
    Elon Musk predicts a future of abundance driven by AI and robotics, with goods becoming nearly free.
    “The cost of goods and services will trend to zero.”
    @ 40m 02s
    September 10, 2024
  • The Future of Optimus
    Optimus, Tesla's humanoid robot, is projected to cost less than a small car at scale.
    “Optimus will cost less than a small car.”
    @ 48m 31s
    September 10, 2024
  • Humanoid Robots and Humanity
    Elon Musk believes the number of robots will vastly exceed humans in the future.
    “Everybody will want a robot buddy.”
    @ 50m 04s
    September 10, 2024
  • A New Era of Robotics
    Musk describes the humanoid robot opportunity as the biggest opportunity ever.
    “The useful humanoid robot opportunity is the single biggest opportunity ever.”
    @ 50m 55s
    September 10, 2024

Key Moments

  • Simulation Theory @ 00:47
  • Freedom of Speech @ 01:10
  • Regulatory Absurdity @ 14:35
  • Government Efficiency @ 24:15
  • Space Exploration @ 27:40
  • Optimus Cost @ 48:31
  • Robot Buddies @ 50:04
  • Biggest Opportunity @ 50:55

Related Episodes

Elon Musk: OpenAI Betrayal, His Future at Tesla, and the Next Big Thing — Grokipedia
Under Secretary of State Sarah B. Rogers on dismantling the "Censorship Industrial Complex"
E17: Big Tech bans Trump, ramifications for the First Amendment & the open Internet
E23: Radical DAs, breaking down FB/Google vs. Australia, sustained fear post-vaccine & fan questions
E150: Israel/Gaza escalating or not? EU censorship regime, Penn donors revolt, GLP-1 hype cycle
E18: Inauguration talk, breaking down the $1.9T stimulus, the case for recalling Gavin Newsom & more
E74: Market update, inverted yield curve, immigration, new SPAC rules, $FB smears TikTok and more
E44: USA's Afghanistan embarrassment, China's new algo laws, future of robots + Italy recap!