
Travis Kalanick & Michael Dell Live from Austin, Texas

March 17, 2026 / 01:15:56

This episode features Travis Kalanick discussing his latest venture, Atoms, and the challenges of operating in stealth mode. Key topics include the logistics of food delivery, automation in mining, and the future of self-driving technology.

Kalanick shares insights on his company’s branding and acquisitions, explaining how they are digitizing the physical world and transforming food production. He emphasizes the importance of infrastructure in achieving efficiency in food delivery, comparing it to Uber's impact on transportation.

The conversation also touches on the competitive landscape of self-driving technology, with Kalanick evaluating companies like Tesla and Waymo. He discusses the potential of automation to improve mining operations and the implications for the industry.

Additionally, Kalanick reflects on the migration of tech talent from California to Texas, highlighting the advantages of the business environment in Austin. He expresses optimism about the future of technology and its potential to solve societal issues.

The episode concludes with a discussion on the Invest America initiative, led by Michael Dell, aimed at providing financial support to children in low-income areas, emphasizing the importance of empowering future generations.

TL;DR

Travis Kalanick discusses his new venture Atoms, automation in mining, self-driving tech, and the Invest America initiative with Michael Dell.

Video

00:00:00
I don't know if some of you knew I was
00:00:01
an angel investor in some companies.
00:00:05
On the count of three, what's my
00:00:06
favorite angel investment of all time?
00:00:08
One, two, three.
00:00:10
>> Thank you. Give it up. Travis Kalanick.
00:00:21
>> Appreciate you. All right. Wow. Um, on a
00:00:25
big news day, Travis is here on a very
00:00:28
big news day. You spent, wow, I guess
00:00:31
like seven years just in the lab
00:00:34
building. Every year I ask you
00:00:37
hey you want to come to the summit you
00:00:39
would say no. It's like, I'm going to
00:00:41
just chill I'm building next year hey
00:00:44
you know just it's always available
00:00:45
>> you understand I'm stealth
00:00:46
>> yeah stealth I'm stealth nobody knows
00:00:49
where I am nobody knows what I'm doing
00:00:50
the employees are not allowed to put the
00:00:52
name of the company on their LinkedIn
00:00:54
>> thousands of employees that weren't
00:00:57
allowed to put the company name on
00:00:58
LinkedIn I mean, incredible. And I'm
00:01:01
like, "Okay." And
00:01:02
>> their parents thought they worked for
00:01:03
the CIA.
00:01:04
>> Yeah. And then he's like, "And by the
00:01:05
way, J-Cal, you can invest. You can't
00:01:07
announce it and you have to sign an NDA, you
00:01:09
can't mention you're an investor." It's
00:01:11
like, "Okay, no problem. I'm just happy
00:01:12
to be on the cap table."
00:01:13
>> Is he like kind of like secret saying
00:01:16
what he wasn't supposed to say,
00:01:17
>> right? Now he's doing that just
00:01:20
happened.
00:01:21
>> Well, you know, you're out now. Let's
00:01:23
go.
00:01:23
>> You're out. It's out. You came
00:01:25
out of stealth today.
00:01:26
>> Um,
00:01:27
>> it's so funny. Okay.
00:01:28
>> It's so great. You came out of stealth.
00:01:30
Well, you you talked a little bit. You
00:01:31
came to All-In last year. Is that fair?
00:01:33
You say you're coming out of stealth
00:01:34
today. Is that right?
00:01:35
>> Well, look, let let's just start with
00:01:37
what that meant for our employees
00:01:39
because again, imagine if you're at a
00:01:42
multi-thousand-person company and every
00:01:45
single employee has stealth on their
00:01:48
LinkedIn,
00:01:50
including salespeople,
00:01:53
okay? Including recruiters.
00:01:56
Like they were
00:01:58
living life on hard mode. It
00:02:00
>> was kind of fun too, right? I mean
00:02:01
>> I mean yeah it was like kind of cool
00:02:03
what is this? Why is
00:02:04
there this massive density of
00:02:07
stealth
00:02:08
>> right
00:02:09
>> startup people in Los Angeles? What is
00:02:12
happening over there?
00:02:13
>> Yeah.
00:02:14
>> Yeah.
00:02:14
>> Also technically the name of the company
00:02:17
in different countries was very generic
00:02:20
company names. I mean, everything
00:02:24
was designed to be stealth,
00:02:25
>> right?
00:02:26
>> So we operate in 30 countries. In the US
00:02:31
the kitchens product is known as
00:02:33
CloudKitchens.
00:02:35
In Korea
00:02:37
it's Kitchen Valley.
00:02:40
In the Middle East it's Nama.
00:02:44
In
00:02:46
Latin America, parts of Latin America
00:02:48
it's Cinaas.
00:02:50
I mean you get the idea. You can't even
00:02:52
remember all the names and all the code
00:02:53
words.
00:02:54
>> Think about it.
00:02:54
>> To think it through. But
00:02:56
>> we have four in China, you know, it's
00:02:57
like all over the place. Yeah.
00:02:59
>> But things have gone really well and
00:03:01
you've been a little acquisitive. So
00:03:03
tell us about the branding today that
00:03:05
you're announcing and then maybe some of
00:03:07
the acquisitions and evolution of the
00:03:10
company. You're not just renting kitchen
00:03:12
space.
00:03:13
>> Those who know how I thought
00:03:16
about things in the Uber day, a lot of
00:03:17
this stuff's not surprising. I I would
00:03:20
often talk about digitizing the physical
00:03:21
world. I think I even did it at All-In
00:03:23
Summit. The quick version of this, I'll
00:03:26
try to do it quickly, but it's like uh
00:03:28
we know the bits world, the computer
00:03:29
world, the one that Michael Dell
00:03:31
essentially invented for us. CPU,
00:03:34
storage, network. These are three core
00:03:35
computing resources. When you go to
00:03:37
computer science class your first day,
00:03:39
three core computer resources. CPU
00:03:42
manipulates the bits. Storage stores the
00:03:43
bits. Network moves bits from point A to
00:03:45
point B. But if you're digitizing the
00:03:47
physical world, you're treating atoms
00:03:49
like bits.
00:03:51
You're building an atoms-based computer.
00:03:53
And I'll explain what I mean to say. I
00:03:55
know this is a little little out there.
00:03:57
CPU manipulates bits. What manipulates
00:03:59
atoms? Manufacturing. Storage stores
00:04:01
bits. What stores atoms? Real estate.
00:04:04
Network moves bits from point A to point
00:04:06
B. What moves atoms? That's
00:04:07
transportation or logistics. So you have
00:04:10
these three core computing resources in
00:04:12
an atoms-based computer. The name of my
00:04:15
company was very obtuse and purposely
00:04:18
designed to be as boring as hell. It was
00:04:20
called City Storage Systems.
00:04:23
So that's digitized real estate in an
00:04:27
atoms-based computer. Our first computer
00:04:29
being a food computer. What does that
00:04:30
mean? Manufacturing real estate and
00:04:33
logistics for food.
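The bits-to-atoms mapping Kalanick walks through here can be jotted down as a simple lookup. This is purely an illustrative restatement of the analogy he draws, not anything from the company:

```python
# Kalanick's "atoms-based computer" analogy: the three core computing
# resources and their physical-world counterparts. Illustrative only.
BITS_TO_ATOMS = {
    "CPU (manipulates bits)": "manufacturing (manipulates atoms)",
    "storage (stores bits)": "real estate (stores atoms)",
    "network (moves bits A to B)": "logistics (moves atoms A to B)",
}

for bits_resource, atoms_resource in BITS_TO_ATOMS.items():
    print(f"{bits_resource:28s} -> {atoms_resource}")
```

A "food computer" is then the same three rows specialized to one vertical: kitchens as manufacturing, kitchen real estate as storage, delivery as the network.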
00:04:36
And so you start to get there and the
00:04:38
idea. The mission was infrastructure
00:04:40
for better food. The idea was, can you
00:04:42
get a meal that's prepared and delivered
00:04:44
to you so efficiently that it starts to
00:04:47
approach the cost of going to
00:04:49
the grocery store? If you can do that,
00:04:52
you do to the kitchen what Uber did to
00:04:54
the car. But in the Uber day, the roads
00:04:57
were there, the cars were unused. You
00:05:00
just had to put an app in the app store.
00:05:01
Wasn't that easy, but kind of that easy.
00:05:04
In this world, you can't do this on a
00:05:06
restaurant. A restaurant doesn't have it.
00:05:08
When I left Uber, 13% of all San
00:05:12
Francisco miles were Uber miles.
00:05:15
You can't get there, and that was nine, ten years
00:05:17
ago. You can't get there on food on
00:05:21
restaurants. They have like 20%
00:05:23
capacity. Uber Eats and Door Dash fill
00:05:25
it, but the infrastructure to do high
00:05:28
capacity, high-scale sort of
00:05:31
industrial production is just not there.
00:05:33
And the logistics just not there. It
00:05:35
just doesn't work. That's why on
00:05:36
e-commerce you go through Amazon: big-ass
00:05:39
warehouses with awesome logistics.
00:05:42
You've got to do the same thing when
00:05:44
food goes to e-commerce. That
00:05:46
was a lot. Okay. So bottom line is
00:05:49
>> it's awesome. We do this food
00:05:51
computation stuff. We're doing more
00:05:54
computers now. And so the name of the
00:05:56
company is called Atoms, and let's
00:05:59
say the mission is physical
00:06:05
automation to transform industries and
00:06:07
move the world. And so we have our food
00:06:10
computer I talked about. Then
00:06:11
we're doing mining.
00:06:13
>> Mining as in
00:06:15
>> data mining?
00:06:16
>> We're talking about atoms guys.
00:06:18
>> Yeah.
00:06:18
>> So well of course you do some mining
00:06:20
data mining too, but the point is
00:06:23
physical mining. So automation of mines
00:06:27
and uh the mission there is
00:06:31
uh more productive mines to power
00:06:34
earth's industries. Right? So it's got
00:06:36
this industrial atoms vibe to it. And
00:06:40
then on the transport side, it's uh
00:06:43
wheelbase for robots because if you're
00:06:45
doing specialized robots, not humanoids,
00:06:49
specialized robots,
00:06:51
you need to be able to move and act in
00:06:53
the physical world. But the minute
00:06:55
you're moving, you got to have a
00:06:57
wheelbase. So, it's just part of the
00:06:59
equation. And a lot of people go, look
00:07:02
at Tesla, it's great. Look at uh look at
00:07:05
Waymo. Awesome. They're cruising around
00:07:06
Austin, of course. But there's so many
00:07:09
things that move. It's not just a ride
00:07:11
sharing thing. Um, and so, um, obviously
00:07:15
including mining equipment that's doing
00:07:17
its thing. So you guys, that's the
00:07:18
general sort of idea. And we acquired a
00:07:21
company on the mining stuff, a company
00:07:23
called Pronto or it's about to close.
00:07:27
We're inches from closing is
00:07:29
the way to put it.
00:07:30
>> What were they doing? What was their
00:07:31
business? Pronto
00:07:32
>> automating mining equipment.
00:07:34
>> Where were they based?
00:07:35
>> Uh, they're based in San Francisco. So,
00:07:37
you and I were starting to talk about
00:07:38
this backstage, but there's some folks I
00:07:40
talked to in the mining industry who
00:07:42
mentioned, you know, like the the big
00:07:43
issue with mining number one is just
00:07:45
surveying, like finding the the
00:07:47
locations, right? Is there an advantage
00:07:49
to be created there? Cuz I know there's
00:07:50
a couple startups that are trying to be
00:07:52
really smart about selecting locations
00:07:53
to get the targets out of the ground.
00:07:55
>> Yeah.
00:07:56
>> And then the other one is like, well,
00:07:57
can you go deep? Because pretty much
00:07:59
anywhere on Earth, you can get whatever
00:08:00
you want if you're willing to go deep
00:08:02
enough, but the cost is distance
00:08:05
squared, right? So the energy cost is
00:08:07
like how deep are you going to
00:08:09
the second power. So it becomes you know
00:08:11
geometrically more expensive to go
00:08:13
deeper but the deeper you go you're the
00:08:16
more you're able to kind of not worry
00:08:18
about getting the right location. So
00:08:20
does automation unlock that capacity?
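The depth-cost claim in the question above can be sketched numerically. The cost model and the base-cost figure here are purely illustrative, just restating the stated depth-squared scaling, not real mining data:

```python
def extraction_cost(depth_m: float, base_cost: float = 1.0) -> float:
    """Illustrative model: energy cost grows with the square of depth."""
    return base_cost * depth_m ** 2

# Under this scaling, doubling the depth quadruples the cost.
ratio = extraction_cost(200.0) / extraction_cost(100.0)
print(ratio)  # 4.0
```

That geometric growth is why site selection matters so much at human-operated mines, and why cheaper (automated) operation per meter shifts the trade-off toward going deeper.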
00:08:23
>> Automation definitely does. I mean, but
00:08:24
you have I mean, also it's like, man,
00:08:27
does Boring Company have some good stuff
00:08:28
going?
00:08:30
>> Like, I hope we're doing
00:08:32
the mining thing, and Boring
00:08:34
makes, you know, some good tunnels
00:08:37
for cars to do the thing. But like
00:08:39
there's some kind of boring mechanism,
00:08:41
automated tunneling to do some of this.
00:08:44
But to be honest, there's there's, you
00:08:46
know, there's this thing, like
00:08:47
rare earths. I don't know why they put it
00:08:50
plural. Rare earths, isn't it? Rares
00:08:52
earth? I don't know. But the
00:08:55
>> but rare rare
00:08:58
>> rare earth.
00:08:59
>> Yeah. But it's not rare.
00:09:01
>> It's uncommon mineral guys. It's not
00:09:02
rare. It's what you have to do to the
00:09:04
land is aggressive. And what's rare is
00:09:08
the is is where the where are the places
00:09:10
they'll let you do it that you can also
00:09:13
sort of get people to.
00:09:15
>> When you automate, you can go to a lot
00:09:17
of places. Uh well, first is all the
00:09:20
mines that exist are way more
00:09:22
productive. Um and the second is you can
00:09:25
then sort of justify going to places you
00:09:28
wouldn't have been able to go before
00:09:29
because um you don't have as much of a
00:09:32
labor footprint or a safety issue or a
00:09:34
whole bunch of other things that then
00:09:35
>> So if it's inhospitable, if it's
00:09:38
regulated, if it's like I don't want to
00:09:40
live there, it's the end of the earth.
00:09:42
>> Yeah.
00:09:43
>> You can send robots and have people
00:09:46
monitoring them remotely. Yeah.
00:09:48
>> And this is like a a future that feels
00:09:53
like a little bit like science fiction.
00:09:54
>> Look, we're here in Austin. You got to
00:09:56
do the shout out to Tesla and all the
00:09:58
things cuz I like to sort of break down
00:10:00
the physical AI stack
00:10:04
includes not just like, oh yeah,
00:10:06
computation and I've got to have
00:10:07
physical AI models and all the
00:10:10
things you sort of think of. What about
00:10:12
land development? That should be in that
00:10:15
stack. What about chemistry? That needs
00:10:17
to be in the stack. Manufacturing needs
00:10:18
to be in the stack. When you look at the
00:10:20
stack, you're like, damn, Tesla's got
00:10:23
this. They are the Google of
00:10:27
this era. What I mean by that
00:10:29
is in the 2000s, if you were doing a
00:10:31
startup in the 2000s, the first question
00:10:33
you would get is uh why isn't
00:10:36
Google going to kill you or why isn't
00:10:38
Google just going to do it?
00:10:39
>> Why Google?
00:10:40
>> They're not going to know that they
00:10:41
killed you.
00:10:42
>> And before that, Microsoft.
00:10:43
>> And before that was Microsoft late 90s.
00:10:45
Uber had a time 2010.
00:10:47
>> What if Uber puts that in the app?
00:10:48
>> Come on. It's like, dude, this is Uber.
00:10:50
I'm like, but you know, I think in the
00:10:53
physical AI space, that's a that's sort
00:10:54
of a Tesla thing. Um, but there's so
00:10:57
many things to do.
00:10:59
You got to shoot your shot. You got to
00:11:00
do some stuff.
00:11:01
>> And rumors that um, hey, you might not
00:11:04
be done with self-driving, something
00:11:06
that you were very early on. How do you
00:11:08
think about what you're seeing in the
00:11:10
playing field of self-driving? Because
00:11:13
my lord, you know, Waymo's making great
00:11:15
progress. Tesla's making great progress.
00:11:16
>> Pick a winner. Like pick a winner
00:11:18
between Tesla, Waymo, Uber,
00:11:21
>> or or like Uber. Uber Uber seems to be
00:11:24
building a network of stuff.
00:11:26
>> Yeah. I mean, the number of
00:11:27
>> pick a winner.
00:11:28
>> The number of players is crazy now,
00:11:30
right? Like
00:11:31
>> Yeah. Look, there's I think there's more
00:11:33
noise. There's more bark than there is
00:11:35
bite right now. Um, look, I think Waymo
00:11:39
obviously is ahead. The existence proof
00:11:41
is there. Their issue is manufacturing
00:11:45
and scale um and urgency and fierceness
00:11:49
like let's
00:11:50
>> come on. Let's go.
00:11:53
>> Yeah.
00:11:53
>> You know, Uber had an autonomy project
00:11:56
back in the day. So, and they have a
00:11:57
different strategy these days. I haven't
00:11:59
been there for a while. But
00:12:01
the point is that you've got
00:12:02
Waymo, then you've got Tesla
00:12:05
fundamentals,
00:12:07
science,
00:12:10
hard mode times 100. And the question
00:12:14
is, do they get there in what time
00:12:17
scale?
00:12:18
And like honestly everybody's
00:12:20
like, could happen tomorrow, could
00:12:22
happen in 5 years.
00:12:25
And I think that it's like when does the
00:12:27
ChatGPT moment happen for vision is
00:12:30
basically the thing. Let's call it
00:12:32
vision without other sensors. So super
00:12:34
inspiring, but like what's the timeline
00:12:37
on it?
00:12:38
>> Yeah.
00:12:38
>> Um, this is basically
00:12:40
it. And then there's a lot of other
00:12:41
little guys that don't really have the
00:12:44
stuff I believe yet.
00:12:46
>> There's nobody standing out just yet of
00:12:48
of the others. Do you think we're at a
00:12:51
point now like obviously now that you're
00:12:53
getting into more of these kind of
00:12:56
autonomous systems that move around like
00:12:58
do we have these vision language action
00:13:00
models tuned and ready for prime time?
00:13:03
There's there's been a conversation like
00:13:05
who's going to have the Android the
00:13:07
operating system for vision language
00:13:09
action where I can use my voice tell it
00:13:12
to do something and it knows what I'm
00:13:13
saying and then it identifies the
00:13:15
objects and does the thing in the
00:13:16
physical world. Do those models exist
00:13:18
today, or is there still work? And is that
00:13:20
like a Google OS or like where does that
00:13:23
OS come from?
00:13:24
>> Look, I think this is an
00:13:26
area of a lot of energy, a mix of
00:13:29
research and implementation. I think
00:13:30
there's a lot of hope and interesting
00:13:32
stuff. I mean the high level is we all
00:13:35
remember what happened when you used
00:13:36
ChatGPT 3.5 and you're like, holy.
00:13:41
>> Yeah, it's legit.
00:13:43
>> Whoa. And then it went to four and
00:13:44
you're like, okay, like some stuff just
00:13:47
changed. The world just changed and I
00:13:48
can sort of connect some dots and
00:13:51
it's getting real.
00:13:54
Is it about to happen? Is it about to
00:13:57
happen for physical AI? And that's what
00:13:59
this is about. Yeah.
00:14:00
>> And the fun part about it is machine
00:14:03
learning, deep learning, this kind of
00:14:04
thing for many years, decades was like
00:14:06
inscrutable. I don't know what the thing
00:14:08
is thinking. It just spits out an answer
00:14:09
and I know it's correct. Well, now you
00:14:11
can have a conversation with it, right?
00:14:14
Like imagine if it's driving your car
00:14:17
and there's different agents and one's
00:14:20
just driving, the other's like, "Yo,
00:14:21
look out over there."
00:14:22
>> Yeah.
00:14:22
>> It's like, "Oh, just like how we roll."
00:14:26
Like somebody does that, you're like,
00:14:27
you're like, "Honey, that's like 200
00:14:30
meters away. We're going to be
00:14:32
>> okay." Yeah.
00:14:33
>> Jason and I don't call each other honey,
00:14:34
but I got you. Yeah.
00:14:36
>> Like,
00:14:37
>> sweetie,
00:14:38
>> you know. Yeah.
00:14:39
>> Okay. So anyways, so that was odd,
00:14:41
wasn't it? Okay. Okay. Okay. I didn't
00:14:43
mean it that way. I didn't mean it.
00:14:45
>> Yeah. You know, I meant it. Okay.
00:14:48
>> Language is a beautiful compression
00:14:50
mechanism that humans use 100 watts of
00:14:55
energy
00:14:57
like and you put that in the scheme of
00:15:00
things of like AI training, AI energy,
00:15:03
the power plants that are built to do
00:15:04
the thing that isn't even at human
00:15:07
strength yet. Okay. The Waymo machine
00:15:11
takes a hundred times more energy to
00:15:14
drive a Waymo than a human does to drive
00:15:18
a Waymo.
00:15:20
So, there are still
00:15:23
things that humans are great at and that
00:15:26
unbeaten, like the GOAT. We're still the
00:15:28
GOAT at certain things. Language is this
00:15:30
epic compression.
00:15:32
And um we need to find ways to compress
00:15:34
cuz like when you think about how how we
00:15:36
first started looking at the physical
00:15:38
world is we saw everything. And you know
00:15:41
what guys and this is sort of obvious
00:15:43
like it doesn't matter what the cloud is
00:15:45
doing if I'm driving.
00:15:48
But like the car doesn't know that. It's
00:15:50
pulling in every freaking data point and
00:15:52
processing everything. And it's it's you
00:15:54
know look they've been about sort of
00:15:55
carving out the things that don't matter
00:15:57
and things like this.
00:15:58
There's ultra awesome versions of this
00:16:00
and you can imagine how you can use
00:16:02
language or things that look like
00:16:04
language to communicate either amongst
00:16:06
agents or sort of safety systems with a
00:16:09
driving system to sort of get very
00:16:13
efficient answers and to identify safety
00:16:16
issues very efficiently.
00:16:18
>> People don't know that you've moved to
00:16:21
Texas. Most people don't know, but
00:16:23
it's it's out there.
00:16:25
>> Yeah. You moved here in December, so now
00:16:27
you're a resident of Austin.
00:16:29
>> Yeah, I was I Thank you.
00:16:32
>> It's very exciting for me. We've been
00:16:34
getting to play some backgammon.
00:16:35
>> Backgammon, cards. It's
00:16:37
>> cards. We're having a good time.
00:16:38
>> So, I've had a place on Lake Austin
00:16:41
since 2021.
00:16:44
And uh I go there. I'm an avid water
00:16:46
skier. Like,
00:16:48
>> you're impressive at water skiing, I
00:16:50
have to say. Like,
00:16:51
>> so I've had a place in Austin for 5
00:16:53
years.
00:16:54
Freaking love it. It's my weekend spot. I
00:16:56
would go 15 weekends a year.
00:16:58
>> What do you think's going to happen in
00:16:59
California?
00:17:00
>> It's pretty messed up. Look, I I grew up
00:17:02
in Cali. Uh like I grew up in Los
00:17:04
Angeles. Uh my parents were born and
00:17:08
bred in Los Angeles, which basically
00:17:10
makes them the founders of LA. Okay. But
00:17:14
um so I have a lot of heart like my
00:17:16
whole family, everything, you know? It's
00:17:18
pretty
00:17:19
>> it's pretty it's I don't want
00:17:21
>> a lot of us feel that way. I don't want
00:17:22
to get the violin out, but it just But
00:17:24
it's heartbreaking.
00:17:25
>> Totally. It's just a place you
00:17:27
grew up. It's your home, you know,
00:17:29
>> when you have to leave.
00:17:31
>> Uh but it's getting weird out there
00:17:34
and uh it feels like it's getting
00:17:37
weirder and at some point that's it's
00:17:40
just too weird.
00:17:41
>> It's too weird. Do you think everyone's
00:17:43
going to leave?
00:17:44
>> I mean, it started with Elon and it was
00:17:46
like
00:17:47
>> we don't want Elon here and then he's
00:17:49
like message received,
00:17:51
>> right? And then it kind of worked its
00:17:52
way down the tech industry and in the
00:17:55
kind of it, you know, world of people
00:17:58
building businesses and whatnot. And now
00:18:00
it's kind of gotten so broad in terms of
00:18:03
the
00:18:03
>> Joe Rogan comedy, music, New Yorkers,
00:18:08
restaurateurs. I mean, this place is
00:18:10
>> I'm not even talking about this. I'm
00:18:11
just talking about everyone leaving LA
00:18:13
or sorry leaving California is almost
00:18:16
like working down this path of
00:18:19
>> look my the rest of my team's like we're
00:18:21
when are we moving you know they're like
00:18:24
>> and how are you dealing with that? So
00:18:25
that was the question was like
00:18:26
>> got to buy homes on the lake.
00:18:27
>> There are literally dozens of startup
00:18:29
CEOs of call it successful or growing
00:18:33
companies that I talked to who were like
00:18:36
>> dude I want to leave but I got employees
00:18:37
here. I got an office here. I got a
00:18:39
facility here. I build stuff here. How am
00:18:41
I going to leave?
00:18:42
>> Yeah, I totally get it. It's a real
00:18:44
thing. So, look, I think like most
00:18:47
things, uh, sort of when it's time and
00:18:52
it feels painful to do something,
00:18:53
sometimes it's actually not as bad as
00:18:55
you think and you just got to make the
00:18:58
move and lead and do it. Um, and so, uh,
00:19:02
that's kind of what that's kind of the
00:19:04
process, almost like a mourning
00:19:06
process I went through and that's just
00:19:08
what it is.
>> And you're setting up a
00:19:09
team here.
00:19:10
>> Yeah, of course. Uh and I got that
00:19:12
office right on the lake.
00:19:15
>> Did you get that?
00:19:16
>> Uh it's we are negotiating. No, it's all
00:19:19
good. No, no, it's all good. We're
00:19:20
negotiating right now.
00:19:21
>> But I'm going to jet ski to work.
00:19:23
>> No, literally, it's a true story. Last
00:19:27
year we're like driving up the thing and
00:19:28
I was like, "Wow, I wonder who owns
00:19:30
that." He's like, "I will." And I was
00:19:33
like, "Did you look at it?" It's like, I
00:19:35
looked at that and I was like, "That's a
00:19:37
that would be a nice one." But the the
00:19:39
truth is, you know, and I had a couple
00:19:41
people move here a couple years ago and
00:19:43
they all had the same reaction. Oh my
00:19:45
god, I'm living in a place that's twice
00:19:47
as big for half as much. The people here
00:19:49
are dope. The food is dope. Everybody
00:19:52
here has got this sense that we're
00:19:54
building the future. And it's just fun
00:19:57
and world positive. And you know, for
00:19:59
me, I got to live New York, LA, San
00:20:01
Francisco. I did three of the great
00:20:03
cities in this country. This one feels
00:20:05
the most like home to me,
00:20:07
>> which is a very strange feeling to me,
00:20:08
but it feels like everybody here wants
00:20:10
to build the future and it's very
00:20:13
diverse, you know, like all these
00:20:15
different industries and people pursuing
00:20:16
stuff. I I think this is the future.
00:20:18
>> Yeah. Here's the thing. Like you go to
00:20:19
San Francisco and I still have a little
00:20:22
nostalgia when I go to San Francisco
00:20:24
just having built Uber there and the
00:20:26
whole thing. Um, I still get the, you
00:20:29
know, the butterflies just I, you know,
00:20:32
but it does have something magical. You,
00:20:33
you just can't take it away. And then
00:20:35
you look at all of these bike lanes and
00:20:38
these bus lanes that never have a bus or
00:20:40
a bike in them
00:20:42
>> and cost $400 million to build one mile
00:20:45
>> and it's literally it's sort of like
00:20:47
this
00:20:49
subconscious desire to choke the city
00:20:51
off. Now remember, I look at things
00:20:53
through roads. That's how I think. So
00:20:55
I'm just like obviously the city is
00:20:58
totally busted.
00:20:59
>> Yeah.
00:21:00
>> No, they they literally took Market
00:21:02
Street and they're like, "What would be
00:21:04
the optimal way to [ __ ] this up and virtue
00:21:06
signal at the same time?" And they're
00:21:08
like, "Yeah, buses." And it's like,
00:21:11
"Nobody's on the bus. Nobody takes the
00:21:13
bus. It's a beautiful small town. San
00:21:16
Francisco. That whole street is empty
00:21:18
and painted red."
00:21:19
>> So, okay. So, we all [ __ ] and complain
00:21:21
non-stop on our chat.
00:21:23
>> When are you leaving?
00:21:25
>> I'm number one. I get it. But, when are
00:21:27
you leaving?
00:21:27
>> Okay. Well,
00:21:28
>> on the chat, you're the best, by the
00:21:29
way. I'm like,
00:21:30
>> okay. So, let me just,
00:21:30
>> by the way, there's couple glasses of
00:21:32
wine in
00:21:33
>> public-facing Friedberg
00:21:35
>> and he's like, you know, I think there's
00:21:37
a better way to do. And then there's
00:21:38
like Darth Friedberg in group chat. He's
00:21:41
like, these people, these morons are
00:21:44
they're destroying society. He is like
00:21:47
Darth Friedberg in group chat. Am I
00:21:50
lying?
00:21:51
>> Am I lying? Is he the most
00:21:53
>> correct? Especially after a couple
00:21:55
glasses of wine.
00:21:56
>> Yeah. When I start drinking, he
00:21:58
takes pictures, he's like, "Fourth
00:22:00
beverage." And we're like, "Oh, it's
00:22:02
worth staying up on the group chat."
00:22:04
>> Yeah. And then I'm like, I'll go and
00:22:05
attack this congressman on Twitter,
00:22:07
which I realized.
00:22:08
>> And then you go delete the tweet.
00:22:09
>> Yeah.
00:22:10
>> Don't delete the tweet.
00:22:11
>> Yeah. I delete the tweets. Okay. You know
00:22:14
there's a group of people trying to
00:22:16
raise $500 million to create like a
00:22:18
tech/business coalition to go to
00:22:22
Sacramento, which arguably is something
00:22:24
that everyone's left and avoided doing
00:22:26
forever cuz no one wants to spend time
00:22:28
in freaking Sacramento fighting
00:22:29
politicians, but it's almost like we're
00:22:31
all falling off a cliff. It's time to do
00:22:33
something. Do you think there's a
00:22:35
realistic path back? Do you think the
00:22:37
people can actually get their [ __ ] together
00:22:38
that even if 500 million came in,
00:22:40
there's a way to kind of turn around the
00:22:41
state, fix some of the policies? You
00:22:43
think it's too late?
00:22:44
>> I don't think that. But look,
00:22:46
anybody who's doing anything to
00:22:48
fix things, I'm like, hell yeah, let's
00:22:51
do something. The issue is we all grew
00:22:53
up in the tech world, which was like a
00:22:56
libertarian place where you stay out of
00:22:59
politics and
00:23:01
>> That was kind of the vibe. It was
00:23:03
just everybody was like that.
00:23:05
>> Leave me alone when I want to make
00:23:06
stuff.
00:23:06
>> Yeah. I just I'm not I don't do that.
00:23:08
And that's obviously not a thing
00:23:11
anymore in California. I think the the
00:23:14
ballot initiatives are very powerful and
00:23:16
there's very clean ways to get something
00:23:18
on the ballot. Love that. I think that
00:23:21
your DAs who have decided we do not
00:23:24
enforce crime at all anymore. That's
00:23:28
like a sweet spot. Like I believe that I
00:23:31
sort of have this aphorism: truth and
00:23:33
justice are the immune system for
00:23:36
society.
00:23:38
When when the immune system is
00:23:42
suppressed, all the social ills flare
00:23:45
up.
00:23:47
So look for the places where truth and
00:23:50
justice are being deteriorated, are
00:23:52
being degraded, and say, how do we get
00:23:54
at that? Because if you get at that,
00:23:57
everything else downstream will be
00:23:59
better.
00:24:00
So that's kind of how I look at things
00:24:02
and how I also determine whether the
00:24:05
world's getting better or worse. When I
00:24:06
say weird, I'm talking about truth and
00:24:08
justice. That's what I mean when I say,
00:24:11
"Oh man, it's getting weird. It's
00:24:12
getting weirder," which means it's
00:24:13
weird. I'm just talking about truth and
00:24:15
justice.
00:24:15
>> Well, I mean, and you look at the
00:24:17
homeless industrial complex, you look at
00:24:19
Chesa Boudin, who the All-In pod, Sacks,
00:24:22
myself, and the pod, like we
00:24:24
literally led the recall of him. And
00:24:27
then you had the same thing going on in
00:24:28
LA where they were just like if somebody
00:24:31
>> Gascón
00:24:31
>> Yeah, Gascón. I mean, we basically lost
00:24:34
the script. You're running the city for
00:24:36
the criminals. It literally is like a
00:24:38
Batman movie. It's like Bane.
00:24:40
>> I mean, you want to arrest the
00:24:42
criminals.
00:24:44
>> Look,
00:24:45
>> I was born in the darkness. I mean,
00:24:46
these guys are lunatics.
00:24:49
>> Yeah. Look, I I know police officers in
00:24:51
Los Angeles who are no longer police
00:24:54
officers, and these are lifelong guys
00:24:56
who protect and serve. That's in their
00:24:58
blood, their DNA. They want to protect
00:25:01
people. They want the bad guys to be
00:25:03
dealt with. And they they almost have
00:25:06
PTSD from
00:25:09
what it is like to want to serve and see
00:25:12
bad things happening and not being
00:25:14
allowed to stop it.
00:25:16
>> Yeah. Nobody's got their back and
00:25:17
they're not allowed to do their job.
00:25:18
It's It's crazy. And I It's getting
00:25:20
weird.
00:25:21
>> Okay. Hey, I want to just go back to AI.
00:25:23
>> Sorry for the darkness.
00:25:24
>> No, no, no. I think it's good.
00:25:25
>> I was trying to induce I'm trying to
00:25:27
induce dark free.
00:25:29
>> Well, I brought it up. I mean, someone
00:25:30
bring me a tequila. I'll get going.
00:25:33
>> Yeah, let's do it. Can we get a couple
00:25:34
tequilas?
00:25:35
>> It was funny. I went on this podcast
00:25:36
yesterday and the guy and the f the guy
00:25:38
was like the first hour was middle of
00:25:39
the road. I was talking about tech and
00:25:40
science and then like politics came up.
00:25:42
He's like, so socialism and he said like
00:25:44
you lost it and then you were like he's
00:25:45
like the energy went 10x and
00:25:49
Yeah. So, it'll come out in a couple
00:25:50
weeks, but I was like, it it got me
00:25:52
going. Okay. I want to talk about
00:25:53
physical AI one more time.
00:25:54
>> Yeah.
00:25:54
>> So, one of now that now that you're
00:25:56
doing this, I saw a presentation the
00:25:58
other day. Someone showed like a video
00:25:59
of a squirrel jumping from one tree to
00:26:01
another tree. And they're like a tenth
00:26:03
of a watt or something. Like, the
00:26:06
biology is tuned and it's so perfect in
00:26:09
terms of its efficiency of energy
00:26:11
utilization to do physical things. And
00:26:13
we're taking these like big things of
00:26:15
metal and motors and like actuators and
00:26:19
if you add up or you compound all of the
00:26:21
inefficiencies in the system, it's like
00:26:23
1,200 watts to get the robot to walk 4
00:26:26
feet. Like, break apart not just the
00:26:30
software but the hardware layer and
00:26:32
where are we at in evolving things like
00:26:35
actuators and the materials and
00:26:38
everything else that's going to make
00:26:39
physical AI work and scale. Oh, look, a
00:26:42
lot of the questions you're asking are
00:26:45
going down humanoid lane, which is like
00:26:48
this thing and and everybody talks about
00:26:50
how do you do the hand? It's almost like
00:26:51
Terminator 2 type obsession with the
00:26:53
hand, which is fair. Like, it's a very
00:26:56
critical part of it. I mean, look at the
00:26:58
I like to look at the Achilles the quote
00:27:00
unquote Achilles tendon of any of these
00:27:01
machines and you're like, that's where
00:27:03
the action is. This this there's a
00:27:06
couple other places. Um,
00:27:09
look, I'm in the nonhumanoid space. I
00:27:12
mean, but mechanical engineers have been
00:27:14
dealing with actuators and, you know,
00:27:17
all the sort of electromechanical sort
00:27:19
of interactions that make machines do
00:27:22
certain things, but like I'm in the food
00:27:24
machine space, so I can tell you how to
00:27:27
open a paper bag and put a put a a bowl
00:27:32
in a paper bag without tearing the paper
00:27:34
bag. But I am less into the I forget the
00:27:39
name pero. They're they're the the
00:27:41
senses to understand awareness and
00:27:44
touch. Um I'm not in that game. Um so
00:27:49
when you're mining you're like you're
00:27:51
not like you know you know
00:27:53
>> you're not threading
00:27:54
>> you're not playing tennis. Um certain
00:27:58
things may be equivalent to tennis. So,
00:27:59
look, the bottom line is we're seeing
00:28:01
obviously all you have to do is go
00:28:03
online and look at where the humanoids
00:28:05
are going over time and how much better
00:28:08
they're getting. Um, it's wild and it's
00:28:12
happening so freaking fast. But any
00:28:15
humanoid demo starts with dancing and
00:28:19
martial arts.
00:28:20
>> Yeah.
00:28:21
>> And we're sort of down specialized robot
00:28:25
lane, which is gainfully employed
00:28:27
robots.
00:28:28
>> Yeah. So, I know I didn't totally answer
00:28:30
the question, the technology piece, but
00:28:33
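The wattage comparison in this exchange (a squirrel at roughly a tenth of a watt versus a humanoid drawing about 1,200 watts to walk a few feet) can be put into a quick back-of-envelope sketch. The figures are the rough ones tossed around in the conversation, not measurements:

```python
# Back-of-envelope comparison of biological vs. robotic locomotion
# efficiency, using the approximate numbers from the conversation
# (illustrative only, not measured values).
squirrel_watts = 0.1   # squirrel mid-leap: "a tenth of a watt"
robot_watts = 1200.0   # humanoid robot walking: "1,200 watts"

# How much more power the robot draws for comparable movement.
ratio = robot_watts / squirrel_watts
print(f"Robot draws roughly {ratio:,.0f}x the power of the squirrel")
```

At these numbers the gap is four orders of magnitude, which is the point being made about compounding inefficiencies in motors and actuators.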
>> I just like do you agree that there's
00:28:34
probably like a big opportunity for
00:28:36
venture money and like research to go
00:28:38
into material science extra? Yeah,
00:28:40
>> for sure.
00:28:41
>> Sure. Because if the physical AI stack
00:28:43
manipulation and all of the related
00:28:47
things around it
00:28:49
>> is massive.
00:28:50
>> And so if you get the software working,
00:28:52
it's almost like the hardware has to
00:28:53
catch up. Yes,
00:28:54
>> we got a lot of
00:28:56
>> investor.
00:28:58
>> Well, actually was it's good that you
00:28:59
bring this up. You know, one of the
00:29:00
things you pioneered um at Uber was um
00:29:05
capital as a weapon and you were very
00:29:07
thoughtful about hey if we can take this
00:29:10
capital off the table then that's going
00:29:12
to let's call it what it is. It's going
00:29:14
to be an advantage versus the
00:29:15
competitors and these other competitors
00:29:17
couldn't get that capital. That's now I
00:29:20
think people have seen that playbook and
00:29:22
they're like, hmm. Sam Altman's like, that was
00:29:24
smart let me try and it's at a different
00:29:27
scale now that you've come out of
00:29:28
stealth
00:29:29
>> now that you've got and and people are
00:29:31
starting to understand just starting
00:29:33
today how big your vision is
00:29:35
>> capital as a weapon this is I guess in
00:29:38
your plan yeah
00:29:39
>> well I mean here's the thing right so
00:29:41
capital as a strategic weapon for its
00:29:44
own sake is not a thing but when it is
00:29:47
actually a strategic weapon And then it
00:29:49
is a thing. And what I mean by that is
00:29:50
like in the Uber world early days if you
00:29:54
didn't have capital didn't matter how
00:29:56
good your app was because Masa is going
00:29:58
to put a billion dollars into your
00:29:59
competitor and you're going to lose 20%
00:30:01
market share tomorrow. So a critical
00:30:04
competency, in fact one of your world-class
00:30:06
competencies, has to be
00:30:08
raising capital and you need to do it
00:30:10
better than everybody else and if you
00:30:11
don't you are going to lose.
00:30:13
>> Let me ask one follow up to that. Sorry
00:30:14
you go ahead. Um but the Middle East
00:30:16
>> Yeah. I've heard theories the last
00:30:19
couple days that big capital seekers are
00:30:21
kind of right now because of what's
00:30:23
going on in the Middle East with the
00:30:24
Iran war. Dubai, Qatar, Saudis are kind
00:30:29
of going to close up the capital flowing
00:30:31
to the US right now. And is that real? I
00:30:34
mean, do you think that's a real threat?
00:30:35
So look, our Middle East business was
00:30:36
supposed to go public in January and the
00:30:41
Saudi market went down 20% over like
00:30:45
a two-month period and that was like a
00:30:48
massive damper on the situation. Now
00:30:51
part of it's part of that was because
00:30:53
the oil prices had gone down so
00:30:56
dramatically
00:30:57
>> and if you went into KSA, you went to
00:30:59
the kingdom, everybody's like, we need
00:31:01
we need oil prices to go up. uh you know
00:31:05
that's the other side of the equation.
00:31:07
So, I don't know. Like, I'm
00:31:09
not in the market raising money right at
00:31:12
this moment, and this is a two-week-old thing
00:31:14
that I you know look I see the news just
00:31:16
like everybody else and I'm not out
00:31:18
there calling while a war is going on
00:31:20
and saying hey guys you got some money.
00:31:23
Um, so I don't know exactly what's going
00:31:25
to happen. But if you are an optimist
00:31:27
and you're like, "Okay, this isn't this
00:31:29
is not going on forever." Just like the
00:31:31
tariffs,
00:31:33
it was the end of the world and then it
00:31:35
wasn't very quickly.
00:31:37
>> If you're an optimist about this
00:31:38
situation and it won't be the end of the
00:31:40
world, maybe even a better world, then
00:31:44
we get to a better place. And I think
00:31:46
>> progress, abundance, the golden age
00:31:49
happens. And a lot of it is about all
00:31:51
the things that are happening in AI in
00:31:53
physical AI and and and just the
00:31:57
productivity gains that are coming in
00:31:59
very massive ways. Yeah.
00:32:00
>> Yeah. And I mean, it was shock
00:32:03
and awe and then hey now we've got a
00:32:06
steady state and let's hope that's what
00:32:07
happens in Iran is that we can depose
00:32:10
these evil dictators, replace it with
00:32:13
something a little more stable.
00:32:14
>> And related to this before we wrap,
00:32:16
they're going to China. There's a big
00:32:17
trade deal being negotiated. What do you
00:32:19
hope comes out of this Chinese thing?
00:32:21
>> And what did you learn in China?
00:32:22
>> What would Yeah. What would you learn?
00:32:23
And and what do you think would be great
00:32:25
for America? Like what would you like to
00:32:26
see and be like, man, that's going to
00:32:27
set us all up. No more.
00:32:29
>> Look, here's the thing. If you go to
00:32:30
China right now and you go and just take
00:32:32
a tour of the manufacturing that's going
00:32:34
on there, just the manufacturing base,
00:32:38
the cities, especially if you've gone to
00:32:40
China for a couple decades in a row,
00:32:43
you're like, damn. Yeah. You So, let's
00:32:46
just do two things. You go to Shenzhen,
00:32:49
which before felt like Kansas City, but
00:32:52
50 years ago and really humid, which I
00:32:55
guess Kansas City sometimes, but you go
00:32:59
there now and it's like one-upping
00:33:01
Singapore,
00:33:03
right? Or so that's the city view.
00:33:06
You're just experiencing a very awesome.
00:33:09
You're like, this is advanced and you
00:33:12
just get the vibe and it's everywhere.
00:33:14
And then you go and you start seeing the
00:33:16
manufacturing base and you see what like
00:33:19
Xiai is doing or any of the other
00:33:22
there's so many scrappy guys, badass
00:33:24
guys everywhere and you're like
00:33:27
f
00:33:28
>> they're hungry.
00:33:29
>> So does anybody remember the 2008
00:33:32
Olympics in Beijing? Anybody? Does
00:33:34
anybody This is a little bit you're down
00:33:36
a rabbit hole. Does anybody remember the
00:33:38
opening ceremony?
00:33:40
>> And you're like these mofos are taking
00:33:43
over. At least that's what they want to
00:33:45
do. That shit's happening.
00:33:48
So, I don't have any issue or this is
00:33:52
not negativity for me. I'm like, these
00:33:53
guys are killing it. The best idea is
00:33:55
winning. They're
00:33:58
fiercely going after truth and progress
00:34:00
and they're making [ __ ] happen. Let's
00:34:02
step up our game. Okay. But we can also
00:34:05
have a friendly game. Like, we don't
00:34:07
have to like be like the Detroit Pistons
00:34:09
in the '90s, you know?
00:34:10
>> Yes. We can we there there's a way
00:34:12
>> go into the stands.
00:34:14
>> Yeah. Yeah. You know, there's a way to
00:34:16
do this, right? And there's a way to do
00:34:18
it like adults. I hope that's where we
00:34:20
would end up. Um, I have an employee.
00:34:24
Because for a long time
00:34:27
we were the largest kitchen builders
00:34:29
in China, I have an employee in China
00:34:32
who has an American wife. Okay. They both
00:34:35
live in China. They're both from China
00:34:38
originally. Okay. But it
00:34:42
would be great for him to work here on
00:34:43
some things I'm doing. It's very hard to
00:34:46
make that happen right now. Now, that's
00:34:48
selfish like I like maybe selfish like
00:34:51
I'm like there's a person I've been
00:34:52
working with for over a decade. I'd love
00:34:54
to continue here. Maybe there's other
00:34:57
bigger picture items that I'm not
00:34:59
dealing with. I'm not the geopolitical
00:35:01
guy, but I'd love for there to be sort
00:35:04
of
00:35:05
good
00:35:07
relations and good like like if you have
00:35:11
a significant other in who is an
00:35:13
American citizen like do we have to make
00:35:16
that hard as an example
00:35:18
>> some normalcy would
00:35:19
>> something you know I'm just saying now I
00:35:21
agree like
00:35:23
>> there are ways to do immigration
00:35:25
properly like we effed it up super bad
00:35:28
don't get me started. But there's
00:35:31
also there's good migration too. Like a
00:35:33
lot of great innovators
00:35:36
all over the place came from other
00:35:38
places for their own version of the
00:35:40
American dream. God bless
00:35:42
>> Freeberg.
00:35:43
>> And we don't have to that doesn't have
00:35:45
to be a negative thing. And so I'd like
00:35:48
to see more of that. And um yeah,
00:35:51
China's China's wild. So let's uh let's
00:35:53
keep our eye on the ball and let's let's
00:35:54
give them a run for their money, too.
00:35:56
>> Give it up for TK.
00:35:58
All right. Well done, brother. Thank
00:35:59
you. That was good. Good to see you,
00:36:02
brother. Wow. Michael Dell.
00:36:07
My lord. Texas native.
00:36:10
>> Michael.
00:36:11
>> Yes. Born in Houston.
00:36:12
>> Well, I missed the the opening. We we
00:36:14
jumped to the music, but you started
00:36:16
Dell Computer here in Austin with a
00:36:19
thousand bucks
00:36:20
>> 42 years ago in my dorm room at Dobie at
00:36:25
UT.
00:36:26
uh about 10 days before I finished my
00:36:29
freshman semester.
00:36:31
>> Amazing. And it's been working out
00:36:35
pretty good.
00:36:37
>> Yeah, it's been some bumps in the road,
00:36:39
but yeah, it's generally generally
00:36:41
worked out okay. Yeah, we'll have about
00:36:43
140 billion in revenue this year. So,
00:36:47
>> yeah, it's okay. It compounds over time,
00:36:50
doesn't it?
00:36:50
>> Yeah.
00:36:51
>> Yeah.
00:36:51
>> You know, you start small and just keep
00:36:53
adding and
00:36:54
>> there you That's how it goes. It's just
00:36:56
that easy.
00:36:59
>> But why Texas? Like, I think this is an
00:37:02
important thing. We're in Austin. Jason
00:37:04
lives here. David Sacks lives here now.
00:37:07
More people are moving from California
00:37:09
to Austin. Why Austin? Why Texas? Why is
00:37:12
it work here? And is it getting better?
00:37:14
Is it just always worked? You know, I
00:37:16
think Texas has had a
00:37:20
uh, you know, low tax, progrowth,
00:37:25
uh, environment for a long time and pro,
00:37:28
you know, sort of progressive business
00:37:31
climate. And you know, if you sort of
00:37:33
look at the the growth of the Texas
00:37:36
economy relative to the rest of the
00:37:38
United States without Texas, you know,
00:37:42
Texas just kind of looks like a better
00:37:43
version of the US economy.
00:37:46
And uh you know, you now you've got uh
00:37:49
Austin is sort of just about in the top
00:37:52
10 cities in in the United States. So
00:37:56
you've got uh when that happens, you'll
00:37:59
have four of the 10 largest cities in
00:38:01
America in Texas. One out of 10 children
00:38:05
born in the United States is born in Texas.
00:38:08
Uh more New York Stock Exchange
00:38:10
companies in Texas than in New York or
00:38:13
anywhere else. And uh you know, you've
00:38:16
got the University of Texas here in
00:38:18
Austin, which I always think of as kind
00:38:21
of the wellspring for a lot of the
00:38:24
companies that are here. Certainly ours
00:38:27
and um you know a long history of of uh
00:38:33
innovative pioneering spirit and
00:38:36
entrepreneurship and um it's been a
00:38:40
fantastic place for us
00:38:42
>> and part of this I think Freeberg and
00:38:44
Michael is what's happened in the other
00:38:47
great cities or what were once great
00:38:49
cities my hometown New York I got to
00:38:52
spend 10 years in LA and the last 12 in
00:38:54
um the Bay Area and what's happening
00:38:56
there is incredibly un-American, uh, and
00:39:01
they're decelerating when compared and I
00:39:03
think maybe the gap maybe and the
00:39:05
disparity from these um two locations
00:39:09
has gotten greater. Yeah. And you're
00:39:11
seeing a lot more people say life there
00:39:14
here in Austin seems a lot better than
00:39:16
the life I'm living in New York, LA or
00:39:20
in in the Bay Area.
00:39:21
>> Yeah. Well, I've got a lot of new
00:39:23
friends and neighbors, you know, that
00:39:24
have come. And
00:39:26
certainly, I mean, if if you look at the
00:39:27
the the migration statistics, Texas has
00:39:32
attracted an enormous number of people
00:39:35
and and look, I mean, when you when you
00:39:37
when you look at the environment here
00:39:40
and compare it to the other kind of
00:39:42
situations that are going on, uh it's
00:39:45
it's it's very attractive. But, you
00:39:47
know, it's it's kind of been great for a
00:39:50
long time. So, uh, it's not really new
00:39:54
news to us that have been here a while.
00:39:56
>> Yeah. Elon had a great experience when
00:39:58
he was building the the Gigafactory over
00:40:01
here.
00:40:01
>> They let you do stuff here basically.
00:40:04
You know,
00:40:04
>> they let him build it
00:40:06
>> which he said was like an incredible
00:40:08
experience for him because in California
00:40:10
they didn't let him build, you know,
00:40:12
these factories. And in fact, the Tesla
00:40:14
factories that in Fremont was just an
00:40:16
old ancient factory that he was able to
00:40:18
retrofit. So, uh, there's something
00:40:21
going on here as well with the data
00:40:22
centers and and that's actually, I
00:40:23
think, very close to what you're working
00:40:25
on at Dell. Maybe you could talk a
00:40:26
little bit about the data center boom
00:40:28
that's going on in Texas that maybe
00:40:29
people aren't paying attention to.
00:40:31
>> Sure. Well, there's, you know, obviously
00:40:34
been enormous buildout of AI
00:40:36
infrastructure
00:40:37
and that requires, you know, lots of new
00:40:40
data centers, lots of power. Texas, you
00:40:43
know, has an enormous uh advantage there
00:40:47
relative to other states. Lot of power,
00:40:50
a lot of land, and it's and and you can
00:40:54
build stuff, right? So, there's there's
00:40:55
been a massive buildout particularly in
00:40:58
some of the cities in and towns in West
00:41:01
Texas where there's not a lot of
00:41:02
population and so they're not really too
00:41:05
opposed to having data centers out in
00:41:08
the middle of nowhere where there's land
00:41:10
and power. And so um yeah, I mean the
00:41:14
the the
00:41:16
demand for tokens is enormous. You know,
00:41:18
we've been building these AI data
00:41:22
centers not just here in Texas, but
00:41:24
around the world. And you know, the
00:41:27
growth in that has been tremendous.
00:41:30
You know, we we introduced the the first
00:41:32
uh H100 server. It was literally a
00:41:35
couple weeks before ChatGPT was
00:41:37
announced. And you know uh the
00:41:41
progression of our business in that area
00:41:43
has sort of gone from like 2 billion to
00:41:47
10 billion to 25 billion to this year
00:41:50
it'll be like 50 billion. So so
00:41:52
tremendous growth and when when you
00:41:55
think about what these models are
00:41:57
creating
00:42:00
there's this phase change that's
00:42:01
happened in computing right we we we had
00:42:04
60 years of calculating computing. Now
00:42:06
we have machines that are thinking and
00:42:08
helping us think. And so the demand for
00:42:11
that kind of intelligence and you know
00:42:13
the models are amazing but they're also
00:42:16
the worst they'll ever be and they
00:42:17
continue to improve. And so we just see
00:42:22
uh a lot more demand than supply. And
00:42:26
it's happening not just in the
00:42:28
hyperscalers and the cloud service
00:42:31
providers. It's happening in 4,000
00:42:34
enterprises where we're building these
00:42:36
Dell AI factories. It's happening
00:42:39
in sovereign AI, you know, like with
00:42:41
Palantir and, you know, people want to
00:42:43
protect their data but also use AI on
00:42:46
it. They want to bring the AI to where
00:42:48
their data is. And you know when this
00:42:50
kind of started a few years ago, we had
00:42:52
some
00:42:54
really sophisticated
00:42:56
uh large companies think of like Fortune
00:42:59
100 and they started you know buying
00:43:02
these AI servers from us and and they
00:43:06
kind of knew what they were doing right
00:43:07
you know and and uh we said what what
00:43:09
are you doing and and they they were
00:43:12
they were kind of taking uh building
00:43:14
their own models they were taking open-
00:43:17
source models they were running them
00:43:18
some of them were were algorithmic
00:43:20
traders or you know derivatives of
00:43:23
machine learning and of course um they
00:43:27
needed a lot of help in in doing that
00:43:29
cuz it it it was sort of a complicated
00:43:32
thing. So about two years ago, we we put
00:43:34
together this product that we
00:43:39
called the Dell AI Factory, and now we've
00:43:42
got 4,000 plus of these and it's kind of
00:43:46
running rampant across enterprises.
00:43:49
>> How do you think about
00:43:51
the payback time on the investment
00:43:54
that's being made? The administration
00:43:56
put in place this accelerated
00:43:58
depreciation rule. the cont.
00:44:00
>> Yeah, that's very helpful actually.
00:44:02
>> So just for folks to understand that a
00:44:03
little bit like if you spend a hundred
00:44:06
billion dollars this year building and
00:44:08
data centers and buying infrastructure
00:44:10
for those data centers, you get to write
00:44:12
off 100% of that this year. Correct.
00:44:15
>> To deduct it. So you don't pay taxes.
00:44:16
You pay much less tax.
00:44:18
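The write-off mechanics described here can be sketched with toy numbers: a hypothetical $100B of data-center spend and an assumed 21% corporate rate, comparing 100% bonus depreciation against a plain 10-year straight-line schedule. This is a simplification; real outcomes depend on the filer's full tax picture:

```python
# Toy illustration of 100% bonus depreciation vs. 10-year straight-line
# depreciation. Figures are hypothetical, not tax advice.
capex = 100e9      # $100B of data-center spend this year
tax_rate = 0.21    # assumed corporate tax rate

# Bonus depreciation: deduct the full amount in year one.
bonus_savings_y1 = capex * tax_rate

# Straight-line over 10 years: deduct one tenth per year.
straight_savings_y1 = (capex / 10) * tax_rate

print(f"Year-1 tax savings, bonus:         ${bonus_savings_y1 / 1e9:.0f}B")
print(f"Year-1 tax savings, straight-line: ${straight_savings_y1 / 1e9:.1f}B")
```

The total deduction is the same either way; what bonus depreciation changes is the timing, pulling the cash benefit into year one and thereby accelerating the investment.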
>> and that's in place for 10 years. I
00:44:20
think
00:44:20
>> that's a 10-year deal. So it's
00:44:22
accelerating the investment. How much is
00:44:25
that helping versus how are you seeing
00:44:28
folks rationalize the investment
00:44:30
relative to the return they're going to
00:44:32
make and over what time scale? This is
00:44:34
still the big question. Is the money
00:44:35
really there? The hyperscalers maybe
00:44:37
they're starting to come up but end
00:44:39
usage end states are we kind of hey wait
00:44:43
and see we don't know yet or folks are
00:44:45
getting 20% ROIC starting in year one
00:44:49
after they've made the investment. You
00:44:51
know, I can tell you in our
00:44:53
business in in our company, we
00:44:55
definitely see plenty of use cases where
00:44:57
the ROI or the improvement in
00:45:01
productivity efficiency is 20% or or
00:45:05
greater
00:45:06
>> right away. It gets there.
00:45:07
>> I mean it, you know, it's not like you
00:45:10
just hit a button and you get 20%.
00:45:11
Right? There's there's work required in
00:45:15
thinking through the processes and maybe
00:45:17
it's worth a little bit describing that.
00:45:19
So you know when you have a a any
00:45:22
company
00:45:24
its processes and tools and technology
00:45:26
are a function of what was available at
00:45:29
the time it created those things. And so
00:45:31
what you sort of have to do is step back
00:45:33
and say all right what's the trajectory
00:45:36
of the improvement of the tools? What
00:45:39
outcome are we trying to create? And now
00:45:43
let's simplify and standardize the
00:45:45
processes. Get all the tools together.
00:45:48
get all the data together and then apply
00:45:51
the technology and this really has to be
00:45:55
done in kind of a top-down way. Uh you
00:45:58
can't sort of do it spontaneously, you know.
00:46:00
Silos are not going to
00:46:02
spontaneously improve themselves and
00:46:04
often that means that you're completely
00:46:07
changing the way the organization works.
00:46:10
>> It's like a wholesale rearchitecture.
00:46:12
>> It's a it's a reimagining of the way a
00:46:15
company works. And you know, I mean, the
00:46:17
way I described this to our team about 3
00:46:20
years ago is,
00:46:22
you know, uh, we were going to have a
00:46:24
new competitor 5 years from now, that
00:46:26
would be two years from now, you know,
00:46:28
that was in every business that we're in
00:46:30
except they were going to be faster and
00:46:32
more innovative and more successful and
00:46:35
lower cost and they were going to put us
00:46:37
out of business. And the only way we
00:46:39
were going to prevent that is is we're
00:46:40
going to become that company. And here's
00:46:42
how we're going to do it. And you know,
00:46:44
it excited some people, it scared some
00:46:47
people. And but I actually believe that
00:46:49
that's
00:46:51
what's going to happen. And and so we've
00:46:54
been dramatically changing our business.
00:46:57
I would say the biggest benefit by far
00:47:00
is speed. We're much faster at being
00:47:03
able to apply innovations. And so, you
00:47:06
know, you look at our at our
00:47:08
infrastructure business last quarter
00:47:09
grew 73%. Well, that's kind of unusual
00:47:13
for a business of this size.
00:47:17
>> And uh you know, this quarter we guided
00:47:21
that it would grow even faster, like
00:47:22
100%. So,
00:47:24
>> you've lived through a couple of
00:47:25
paradigm shifts here. The PC revolution
00:47:27
obviously you led that. Um and then you
00:47:29
of course had you know client server,
00:47:32
the network revolution, online uh
00:47:35
internet cloud, mobile.
00:47:38
>> Yeah. So each one of those we saw
00:47:40
massive disruption. We were talking in
00:47:42
in the green room about hey we used to
00:47:44
have a typing pool. There was a mail
00:47:45
room. All these things got abstracted
00:47:47
away by the PC and networked-PC revolution.
00:47:51
Um but it took a decade or two and this
00:47:54
one's happening a lot faster. Yeah.
00:47:56
>> Yeah. This one I think it's it's like um
00:47:59
you know a quarter is like a year maybe
00:48:02
it's five times faster or something like
00:48:04
that. But but back to your question, I I
00:48:06
I would say maybe 10 or 15% of large
00:48:10
companies have really figured this out
00:48:12
and the rest of them are kind of
00:48:13
fumbling around. And you know, there's a
00:48:15
tendency when when you hear about a new
00:48:18
technology to like, oh, let's just let's
00:48:21
just go do it. You know, show the boss,
00:48:22
hey, we did AI. You know,
00:48:24
>> the board said we got to do AI. We got
00:48:26
to do AI, guys.
00:48:27
>> We need AI. Are you proud of me, boss?
00:48:29
You know,
00:48:30
>> um and
00:48:31
>> look at what I made. Exactly. And I also
00:48:35
think you know there's an important
00:48:36
point about about uh this which is
00:48:40
>> you know the barrier to technology
00:48:42
adoption is is not technology it's
00:48:46
culture and leadership and courage right
00:48:50
and and so
00:48:51
>> willingness to change and
00:48:53
>> and you know if you if you're in a
00:48:55
business that you don't think is
00:48:56
changing very much or you know hard
00:48:58
change is really hard right you you have
00:49:01
to it can be very uncomfortable you're
00:49:02
like, well, we're going to stop doing
00:49:04
that. Well, we don't need this anymore.
00:49:06
>> Particularly if your bonus is dependent
00:49:08
on not messing things up. But let's go,
00:49:12
let's use the internet as an analogy,
00:49:14
which you saw up close. There were
00:49:17
businesses that were internet transition
00:49:21
successful. They made the transition.
00:49:23
Maybe Macy's.com versus Sears Roebuck,
00:49:27
right? Like maybe Macy's did a better
00:49:29
job of taking advantage of the internet
00:49:31
than Sears. But then there was internet
00:49:33
native businesses that seemed to blow
00:49:35
them all out and maybe Amazon's a good
00:49:38
example, or um CSN Stores, whatever they
00:49:41
became, Wayfair, etc. What's the right way to
00:49:44
think about this evolution in industries
00:49:47
generally? Are we going to have
00:49:50
businesses that are going to transition
00:49:52
successfully and those that aren't and
00:49:53
they're going to die? And is this really
00:49:55
going to are we going to see AI native
00:49:57
businesses in every industry come in and
00:50:00
just disrupt everything? I believe we
00:50:02
will and certainly you know when you
00:50:04
talk to the Collison brothers at Stripe
00:50:07
they'll tell you that the rate of growth
00:50:10
of the 2025 cohort companies is about
00:50:14
four times faster than the 2018
00:50:16
companies, and so every year the new
00:50:20
batch of companies are growing faster
00:50:21
and faster because they're starting with
00:50:23
all these new tools that you know
00:50:24
>> because they see all the new companies
00:50:25
on their platform.
00:50:27
>> Exactly. And so when when you think
00:50:29
about uh an incumbent company, okay,
00:50:33
that already exists, it has, let's say,
00:50:35
it's got brands, it's got balance
00:50:37
sheets, it's got, you know, customer
00:50:39
relationships, whatever stuff, right?
00:50:41
Okay. Um, but that's sort of like those
00:50:45
are expiring value assets. If it doesn't
00:50:47
change quickly and get onto the other
00:50:50
side of this, I think it will go out of
00:50:52
business. And which is exactly the
00:50:54
speech I gave to our team, you know,
00:50:56
three three years ago. And
00:50:58
uh, I think, you know,
00:51:02
you have to be uh bold and you got to go
00:51:06
make those changes to to
00:51:09
not only survive this, but but to but to
00:51:11
thrive. And you know, I think about it
00:51:14
is how do we prepare our company to be
00:51:16
ready for the 2030s,
00:51:17
>> right? Isn't it like it's much more
00:51:20
it's kind of the story line. There's
00:51:22
more to do than there ever was. It's
00:51:24
like when the internet arrive kind of
00:51:26
came around, Sears doesn't just need to
00:51:28
sell locally. They can sell to the
00:51:30
world.
00:51:31
>> Well, sure. I mean, this is the point
00:51:35
when when when we have better tools, we
00:51:37
can do way more things, right? And and
00:51:39
uh you know, when you know, when I hear
00:51:41
people say, "Oh, you know, um maybe
00:51:46
we're just going to have all these great
00:51:48
tools and we we won't do more things.
00:51:50
we'll just do the same things with fewer
00:51:52
people. It doesn't sound right to me. I
00:51:55
I mean, there'll be some of that, but I
00:51:56
think most of it will be we're just
00:51:58
going to do a whole lot more things.
00:51:59
We're going to solve a lot more
00:52:00
problems. We're going to accelerate
00:52:02
scientific discovery.
00:52:04
We're we're we're going to invent all
00:52:06
sorts of new things. We're going to
00:52:07
solve all sorts of problems that haven't
00:52:09
been solved.
00:52:11
>> And you know, that's that's super
00:52:13
exciting.
00:52:14
>> What do we have wrong on infrastructure?
00:52:16
So the original build cycle looked a lot
00:52:19
like everything's in a data center.
00:52:21
Everything's got to sit there. That's
00:52:23
where all the intelligence. It'll all be
00:52:25
in these kind of hosted proprietary
00:52:27
cloud models. Do you think that it's
00:52:29
open source? Is it distributed on the
00:52:31
edge? Where does the intelligence where
00:52:33
does the inference sit? And how does
00:52:35
that really change or kind of
00:52:37
rearchitect the industry? Do you think
00:52:39
>> It's really all of the above. I mean,
00:52:41
it's not like there's one answer. I mean
00:52:44
certainly if you go to
00:52:47
uh any industrial company or natural
00:52:50
resources company advanced manufacturing
00:52:54
retail logistics there's tons of
00:52:57
inference at the edge and that's growing
00:52:59
very very fast and uh you know we make a
00:53:03
lot of that embedded equipment certainly
00:53:06
you know telcos are doing that too. I mean
00:53:08
it's pretty much every industry.
00:53:10
think about wherever data is being
00:53:12
created, you want the AI infrastructure
00:53:14
and the inference you know close to the
00:53:17
data. Um
00:53:20
you know, there has been this
00:53:22
sort of rebalancing as companies have
00:53:25
figured out you know sort of everybody
00:53:27
loves the public cloud right until they
00:53:30
get the bill. They get the bill and
00:53:32
they're like, wait, this is supposed to
00:53:33
save us money?
00:53:34
>> Yes, costs quite a bit more. So, you know,
00:53:37
the the the lowest cost token is going
00:53:39
to be the one that's generated right
00:53:41
where the data is on the device. You're
00:53:44
going to have, you know, tokens being
00:53:46
generated on your phone, on your PC, in
00:53:49
every embedded piece of equipment. And
00:53:52
and look, we have an interesting
00:53:53
perspective on this business because we
00:53:54
have 10,000 customers where they embed
00:53:58
our product in their product. This is
00:54:00
you know think medical devices,
00:54:02
security, all sorts of things in
00:54:05
hospitals and industrial plants and you
00:54:08
know, any kind of, uh, you know,
00:54:13
data-driven activity requires some
00:54:16
kind of computing network storage
00:54:18
infrastructure.
00:54:19
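The cheapest-token point above can be made concrete with a back-of-envelope comparison of metered cloud inference against a local machine whose hardware cost is amortized. This is only an illustrative sketch; the per-token prices, hardware cost, and lifetime-token figures are invented assumptions, not numbers from the conversation.

```python
# Illustrative per-token cost comparison: hosted API vs. local inference.
# All prices below are made-up assumptions for the sketch.

def api_cost(tokens: int, price_per_million: float) -> float:
    """Cost of generating `tokens` via a metered cloud API."""
    return tokens / 1_000_000 * price_per_million

def local_cost(tokens: int, hardware_usd: float, lifetime_tokens: int,
               power_usd_per_million: float) -> float:
    """Amortized hardware cost plus electricity for the same tokens."""
    amortized = hardware_usd * tokens / lifetime_tokens
    power = tokens / 1_000_000 * power_usd_per_million
    return amortized + power

tokens = 500_000_000  # a heavy internal workload
cloud = api_cost(tokens, price_per_million=10.0)
edge = local_cost(tokens, hardware_usd=10_000,
                  lifetime_tokens=50_000_000_000,
                  power_usd_per_million=0.50)
print(f"cloud: ${cloud:,.0f}  local: ${edge:,.0f}")
```

Under these assumed numbers the locally generated token comes out far cheaper, which is the "lowest cost token is generated right where the data is" argument in miniature; with different utilization or hardware assumptions the comparison can flip.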
>> Yeah. So when you um look at the desktop
00:54:22
where you started, it's coming full
00:54:24
circle and this must be at least very
00:54:27
interesting or intriguing to you that
00:54:29
you see this OpenClaw movement.
00:54:31
Everybody trying to buy the most
00:54:33
powerful desktop they can and all these
00:54:35
hobbyists who were your customers who
00:54:37
were calling you up and ordering from,
00:54:39
you know, Dell, their bespoke PC.
00:54:43
Now they're
00:54:43
>> Dell.com.
00:54:45
>> What did I say? Dell.com. Yeah,
00:54:47
>> you said ordering from Dell. Calling us
00:54:48
up. They order online usually.
00:54:50
>> They order online now. Yes.
00:54:52
>> We have this thing called the internet.
00:54:53
>> They do. It works out pretty well. Um
00:54:56
but this is incredible that they're like
00:54:59
all stacking computers and and running,
00:55:03
you know, uh local models. I was just
00:55:06
thinking back to how much the first
00:55:07
couple of computers I owned cost $4,000
00:55:10
in 1980.
00:55:12
And then the prices came down. You can
00:55:14
buy a Dell for 500 bucks, 800 bucks,
00:55:16
like really nice laptops for that price.
00:55:19
Um, use the promo code allin. Um, it's
00:55:23
not a sponsor. It's a joke. Um, but do
00:55:26
you think there's a world where we're
00:55:27
going to start to see the desktop
00:55:30
because people want to protect that
00:55:32
data, they want to protect the skills
00:55:34
they're building. They don't want to
00:55:35
give it to Sam Altman, put it in a cloud
00:55:37
somewhere. They don't want to give it to
00:55:39
Google, whoever it happens to be. Um,
00:55:41
and that the desktop revolution comes
00:55:42
back and everybody's got a $10,000
00:55:44
desktop. Is that coming?
00:55:46
>> I don't know if everyone will have
00:55:48
a $10,000 desktop, but that would be
00:55:49
great. I mean, you know, uh
00:55:53
um you know, so we have this
00:55:56
Dell uh portal on Hugging Face and we
00:55:59
have all these open models and we
00:56:01
qualified them on every kind of machine
00:56:04
we have and you know there's been
00:56:06
enormous progress in the open source
00:56:08
models. You know, Google has these Gemma
00:56:11
models, and they work really,
00:56:13
really well on small machines. You know,
00:56:16
OpenAI has their open-source models.
00:56:18
You've got the Nvidia Nemotron models.
00:56:21
You've got, you know, enormous
00:56:24
uh ecosystem of open-source that is,
00:56:28
you know, thriving and certainly open
00:56:31
claw and, you know, there'll be some
00:56:33
good discussion about that.
00:56:35
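As a rough guide to what "work really well on small machines" means in practice: the memory needed just to hold a model's weights scales with parameter count times numeric precision. A quick sizing sketch; the 9-billion-parameter figure is illustrative, not a claim about any specific model mentioned here.

```python
# Back-of-envelope sizing for running open-weight models locally:
# weight memory ≈ parameter count × bytes per parameter. KV cache and
# activation overhead come on top and are ignored in this sketch.

def weight_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate gigabytes needed just to hold the weights."""
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1e9

# A hypothetical 9B-parameter model at three common precisions:
for bits in (16, 8, 4):
    print(f"{bits:>2}-bit: ~{weight_gb(9, bits):.1f} GB")
```

This is why quantized (4-bit or 8-bit) open models are the ones hobbyists run on a single desktop or laptop: halving the precision roughly halves the memory footprint.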
>> How many people have set up OpenClaw?
00:56:37
Raise your hand. Oh my lord. That's
00:56:39
what, about 20% of the audience here?
00:56:42
>> Yeah. So, you know, autonomous agents,
00:56:45
um, big deal. And certainly inside
00:56:47
companies,
00:56:49
there's going to be a lot more
00:56:50
autonomous agents. There are significant
00:56:53
security requirements that need to go
00:56:54
with that. We need to be able
00:56:56
to authenticate and validate who these
00:56:59
agents are and what they're doing and,
00:57:01
you know, have the right controls and
00:57:03
and and that sort of thing.
00:57:05
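The authenticate-and-validate point about agents can be sketched as a minimal signed-request check: each agent request carries a keyed MAC over its identity and action, so a control plane can verify who the agent is and what it is doing. This is a toy illustration using Python's standard hmac module; the agent IDs, key provisioning, and action strings are all hypothetical, and real deployments would add expiry, scoping, and per-agent keys.

```python
import hmac
import hashlib

# Assumption for the sketch: a secret provisioned to the agent out of band.
SECRET = b"shared-provisioning-key"

def sign(agent_id: str, action: str) -> str:
    """Produce an HMAC tag binding an agent identity to an action."""
    msg = f"{agent_id}:{action}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

def verify(agent_id: str, action: str, tag: str) -> bool:
    """Constant-time check that the tag matches this agent and action."""
    return hmac.compare_digest(sign(agent_id, action), tag)

tag = sign("agent-7", "read:inventory")
assert verify("agent-7", "read:inventory", tag)        # authentic request
assert not verify("agent-7", "delete:inventory", tag)  # tampered action rejected
```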
>> Yeah. And uh your take on uh AGI and
00:57:11
when we're going to hit it like do you
00:57:13
actually think about super intelligence
00:57:14
and AGI and the two sets of
00:57:18
problems those could solve, and do
00:57:20
you have a personal definition that you
00:57:22
like to use for those when you're
00:57:23
talking internally with your team of how
00:57:26
things are moving?
00:57:28
>> I don't really know, Jason. Um, you
00:57:31
know, I think it feels like, with
00:57:34
the latest releases, we were talking
00:57:36
about this backstage, uh, you know, the
00:57:40
Gemini 3.1,
00:57:42
the Opus 4.6, the OpenAI 5.4, it feels
00:57:46
like we sort of uh hit some kind of
00:57:49
threshold where just the quality of
00:57:52
the models is just tremendous.
00:57:54
And, you know, when I listen
00:57:56
to what our teams are able to accomplish
00:57:58
in a day or two weeks that would have
00:58:01
taken them you know a few months or nine
00:58:03
months' time. You know, it's just
00:58:06
amazing, the speed of innovation.
00:58:09
>> Yeah. And so
00:58:12
It seems to be continuing, and we get
00:58:15
all the reinforcement learning and
00:58:18
there's also tons of private dark data
00:58:22
that these models haven't been applied
00:58:24
to and that's sort of what's happening
00:58:26
with these
00:58:27
>> I think the auto research is
00:58:29
the key. With auto research, you have the
00:58:31
capacity to take a standard model and
00:58:34
then retrain it on your private data and
00:58:36
keep it private and build an advantage
00:58:39
for your organization based on the
00:58:40
history of your data that no one else
00:58:42
has. That seems to be what a lot of
00:58:44
folks are thinking about that have the
00:58:46
capacity. But if you were to start a
00:58:48
company today that was not in computing
00:58:50
and you were to build a business from
00:58:52
the ground up, how would you architect
00:58:54
your people and your organizational
00:58:56
principles as an AI kind of first
00:59:00
knowing what you know about computing
00:59:02
and where things are headed? Are you
00:59:04
hiring people? Are you hiring a bunch of
00:59:06
people to run a bunch of agents? How do
00:59:08
you think about architecting a new
00:59:09
business today?
00:59:10
>> It's a great question. I don't really
00:59:12
spend a lot of time thinking about that.
00:59:14
You know, I'm thinking about how
00:59:15
do I run
00:59:16
>> what the rest of us are thinking about.
00:59:17
>> How do I run our company? I
00:59:20
mean, that's hard enough. So,
00:59:22
>> yeah, everyone I talk to, that's the
00:59:24
question. They're like, everyone goes to
00:59:25
these off sites and they're like, I'm
00:59:28
actually doing this with my management
00:59:29
team on Monday. We're doing like a tear
00:59:30
down, like, "Hey, how would we build
00:59:32
the business differently today?" Knowing
00:59:34
>> I mean what we've been thinking a lot
00:59:36
about is
00:59:38
it's sort of this this reimagining
00:59:40
question. Yeah.
00:59:41
>> You know, sort of all right, we know the
00:59:44
trajectory of the tools. What are the
00:59:46
tools going to be in '27, '28, '29, and how do
00:59:51
we
00:59:52
accelerate you know our path to that?
00:59:55
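The earlier point about taking a standard model and retraining it on private data is often realized with low-rank adapters in the LoRA style: the shared base weights stay frozen, and only a small private update is learned and kept inside the organization. The numpy sketch below shows the shape of that idea on synthetic data; the dimensions, rank, and the "gradient step" are all toy assumptions, not a real training loop.

```python
import numpy as np

# Toy LoRA-style adapter: base weights W stay frozen and shared, while a
# small low-rank product (B @ A) carries the private, trainable update.

rng = np.random.default_rng(0)
d, r = 64, 4                        # model width, adapter rank
W = rng.standard_normal((d, d))     # frozen base weights (public/shared)
A = rng.standard_normal((r, d)) * 0.01
B = np.zeros((d, r))                # adapter starts as a no-op

def forward(x: np.ndarray) -> np.ndarray:
    """Apply the base layer plus the private low-rank correction."""
    return x @ (W + B @ A).T

x = rng.standard_normal((1, d))
baseline = forward(x)               # identical to the stock model
B += 0.1                            # stand-in for one adapter update step
adapted = forward(x)
print("adapter changed output:", not np.allclose(baseline, adapted))
```

The appeal of the scheme is exactly what the conversation describes: the organization ships around only the tiny A and B matrices, while the base model and everyone else's adapters remain untouched.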
How worried are you about
00:59:58
social issues? So AI recently ranked as
01:00:01
the most unfavorable term on a list of
01:00:04
terms including president.
01:00:06
>> It was somewhere between ISIS,
01:00:08
ICE, and the Democrats.
01:00:11
>> ICE was better than AI.
01:00:13
>> People liked ICE masked agents
01:00:16
>> more than they like AI.
01:00:17
>> Yeah. Well, I think part of the
01:00:19
problem is it's been, uh, you
01:00:23
know, maybe sold as
01:00:26
>> as, you know, it sort of presents
01:00:28
itself like a human would.
01:00:30
>> Yeah.
01:00:30
>> Right. And you know maybe if we called
01:00:32
it linear algebra
01:00:35
>> matrix multiplication
01:00:36
>> and statistics instead
01:00:38
>> matrix multiplier maybe that would be
01:00:40
more friendly. I don't know you know.
01:00:42
>> Yeah.
01:00:42
>> But do you think we're
01:00:43
going to have... No, I think you're right.
01:00:45
The positioning is wrong and then we're
01:00:47
not communicating to people, hey, this
01:00:49
could help healthcare, this could make
01:00:51
you live longer, this could help your
01:00:52
kids get educated more, that this could
01:00:55
help with housing costs, this could help
01:00:56
with food costs. Messaging aside, I mean,
01:00:59
how much do you actually worry
01:01:01
>> about disruption or dislocation in
01:01:03
employment, about acceleration of
01:01:06
earnings for some people and
01:01:08
deceleration for other people in society
01:01:10
that feel left behind? And that starts
01:01:13
to fuel more of the kind of social
01:01:15
concerns and politicians saying, "Hey,
01:01:18
we got to stop building all the data
01:01:19
centers, you know, like that kind of
01:01:21
stuff." And and how much are you really
01:01:23
>> I tend to be, you know, more
01:01:25
optimistic, and, um, you know, I
01:01:30
do think that in all technology cycles,
01:01:33
you get sort of these network effects
01:01:36
and
01:01:37
uh that's kind of inevitable. But I also
01:01:41
think, you know, uh we're going to do
01:01:44
more with the tools. You you do have
01:01:46
this acceleration of all sorts of uh
01:01:50
great things. I mean, education can
01:01:52
dramatically improve, scientific
01:01:54
discovery, healthcare, energy, you know,
01:01:57
all of the unsolved problems
01:01:59
can be accelerated. Ultimately, I
01:02:01
think it's amplification of
01:02:04
human potential and capability
01:02:06
>> and
01:02:07
>> extending the frontier, too. And,
01:02:09
by the way, we should also remember
01:02:11
that basically what we're talking about
01:02:13
here beyond sort of some of the advanced
01:02:15
semiconductors in the you know big data
01:02:18
centers, we're talking about software,
01:02:20
right?
01:02:21
>> Yeah.
01:02:22
>> Right. It's like software that runs on
01:02:24
your computer.
01:02:26
So, you know, if somebody says, well, we
01:02:28
don't want that, it's like, how
01:02:30
do you stop software? I mean
01:02:32
>> Yeah, how are you going
01:02:33
to stop someone putting an open source
01:02:35
model on their computer at home and
01:02:36
asking it for medical advice? You know,
01:02:38
New York just passed a law saying AI
01:02:39
models can no longer give
01:02:41
>> medical advice. Passed, right? It's
01:02:43
being proposed.
01:02:44
>> Oh, proposed. It's being proposed. You
01:02:46
can't give legal and health advice.
01:02:47
>> We're anti-software. It's like,
01:02:49
>> Yeah. Well, we're also anti-books and
01:02:50
advice. So, if you were going to look it
01:02:52
up in a book, but were you dropping into
01:02:54
your Bernie Sanders right there?
01:02:56
>> That was my Bernie Sanders.
01:02:57
>> Michael Dell! 1% of the 1% that you're
01:03:01
enabling with your data centers.
01:03:03
Why are you doing this to the people of
01:03:06
our great nation while you give your
01:03:08
money to children in the Invest America
01:03:11
accounts?
01:03:12
>> Yeah,
01:03:12
>> this is a good one you're doing.
01:03:14
>> Yeah. Can we talk about Invest America?
01:03:15
>> I think he might have even criticized
01:03:17
that, but you know,
01:03:18
>> well, that's the problem. The
01:03:20
billionaires are giving our children
01:03:21
money and they're not asking us
01:03:23
permission and then those kids are going
01:03:24
to buy things that their parents never
01:03:26
asked for.
01:03:29
>> Well, they don't
01:03:30
actually get the money till they're 18
01:03:32
years old. So that's
01:03:34
>> But what gives you the right to give our
01:03:36
children an education? What is this
01:03:38
philanthropy? It makes no sense. No, I
01:03:41
mean honestly.
01:03:42
>> Well, I see my great friend Brad
01:03:44
Gerstner here.
01:03:45
>> Brad's here.
01:03:47
>> There he is.
01:03:48
>> Brad, come up for this little uh segment
01:03:49
here. Sit for a second. Let's talk
01:03:51
about America. We got 5 minutes left
01:03:52
here.
01:03:53
>> So, you know, I heard
01:03:54
>> Everybody, the fifth bestie. Give him a
01:03:55
round.
01:03:56
>> I didn't know he was going to be here.
01:04:00
>> All right. I
01:04:02
>> how did this go down, Michael?
01:04:03
>> I I heard about this idea in 2021 from
01:04:07
Brad
01:04:08
>> and I thought, you know, that's
01:04:10
just a great idea. That's an awesome
01:04:12
idea. And, you know, I think there were
01:04:15
some discussions with the prior
01:04:16
administration,
01:04:18
but they didn't uh do anything about
01:04:20
that, unfortunately.
01:04:23
And um you know, here we are, you know,
01:04:26
a miracle. You know, uh the
01:04:29
Invest America Act was passed and um
01:04:33
you know, now we have thousands of
01:04:34
companies that are joining in and
01:04:38
matching the government's contribution
01:04:40
and uh, you know, Susan and I made a big
01:04:43
announcement uh giving $250 to 25
01:04:47
million children in uh zip codes where
01:04:51
the median income is
01:04:54
>> I mean, Michael, let's just let's pause
01:04:56
for a second
01:04:58
This is one of the greatest
01:05:01
>> What do you think, Bernie? Do you
01:05:02
approve?
01:05:04
>> I'm going to go with J Cal on this. I
01:05:06
just want to pause on this because it is
01:05:08
one of the greatest philanthropic gifts
01:05:10
in the history of humanity. And
01:05:13
people have just kind of glossed over
01:05:15
it because there's a lot of big numbers
01:05:16
in the world, but we're talking about
01:05:18
you. You personally, you and Susan sat
01:05:20
down and said, "We're going to give a
01:05:22
number." And that number was five, six,
01:05:25
seven billion dollars or this is
01:05:27
>> well it's $250
01:05:30
to 25 million children ages 2 to 10 in
01:05:34
zip codes where the median income is
01:05:36
$150,000 or less. It's $6.25 billion.
01:05:40
>> I mean
01:05:42
and I just want to say something. You
01:05:44
know, we live at a time where
01:05:48
>> but they have to sign up to claim the
01:05:50
accounts. Yes,
01:05:51
>> They have accounts, but they have
01:05:52
to sign up to claim the accounts. You
01:05:54
know, I think we're getting 100,000-plus
01:05:56
kids a day signing up now.
01:05:59
>> Yeah.
01:05:59
>> First, thanks for having me up. I mean,
01:06:01
what what a national hero and national
01:06:04
asset that my friend Michael Dell is,
01:06:06
but he understates this
01:06:08
>> because I've been working on this for
01:06:10
four years. We've been talking about it
01:06:12
on the All-In Pod. We had a lot of
01:06:14
momentum, but behind the scenes, Trump
01:06:18
gets elected. Um, and so it's April, and
01:06:22
we're in the middle of the tariff
01:06:24
strife, April '25. We realize there's
01:06:27
only going to be one piece of
01:06:28
legislation that gets passed during
01:06:30
Trump's first two years. It'll be the,
01:06:32
you know, this big beautiful bill, the
01:06:34
reconciliation bill. And so I call up
01:06:37
Michael and I said, "Michael, we got to
01:06:39
go. We've got five days. It's
01:06:42
drafted in the Senate. We have
01:06:44
bipartisan support, but we have a window
01:06:48
and I have to get in the Oval
01:06:50
Office. We have to get in the Oval
01:06:52
Office. And you know, Michael said, you
01:06:55
know, what should the text say? And
01:06:58
you and I had a conversation, and you,
01:07:01
you know, sent the text to DJT.
01:07:04
>> Yes. I'm not talking out of school.
01:07:06
Listen, wherever you sit on
01:07:09
the political divide, I will say I've
01:07:10
said this, Trump seeks out ideas from
01:07:14
business leaders and he has deep respect
01:07:16
for business leaders like Michael Dell.
01:07:18
Like you wherever your politics are,
01:07:20
that's just the truth. And the last
01:07:21
administration didn't, and, you know,
01:07:24
it may have done the same thing
01:07:26
>> and and this and I just have to say
01:07:28
this, Invest America, it's not a red
01:07:30
idea or a blue idea. It's a red, white,
01:07:32
and blue idea, right? It's
01:07:34
>> So, to the prior conversation: when
01:07:38
Michael and I first talked about it um
01:07:42
you know, it was: this is the right thing
01:07:45
to do, right? Like, we have to reconnect
01:07:48
the 70% of people who feel left out and
01:07:50
left behind to the American dream, right?
01:07:52
But this isn't our self-interest; this is
01:07:54
about defending
01:07:55
the ownership society and capitalism
01:07:58
that for 250 years created the greatest
01:08:01
experiment in the history of the world
01:08:03
but that's at risk.
01:08:05
less than half of people under the age
01:08:06
of 40 have a favorable view of
01:08:08
capitalism. So when I talked to you
01:08:09
about it the first time, Michael
01:08:11
understood both sides of it. It's the
01:08:13
right thing to do and it's the right
01:08:15
thing for the country. And uh so at any
01:08:17
rate, Michael Dell, tremendous American
01:08:21
I have just one punch-up on the
01:08:26
name: Invest America Trump accounts. What
01:08:29
do you think?
01:08:30
>> Were you considering this in the
01:08:32
context of other philanthropy? I mean,
01:08:34
how do you kind of put this
01:08:36
together in the spectrum of how you
01:08:37
think about giving back?
01:08:38
>> Yeah, great question. So, you know,
01:08:40
we have a foundation that's very focused
01:08:43
on children in urban poverty. That's
01:08:47
basically the central focus of the
01:08:49
foundation, although folks in uh central
01:08:52
Texas would know that we do a few other
01:08:54
things here in our local community. Um,
01:08:57
and
01:08:59
you know, when I heard about this idea,
01:09:01
one of my thoughts was, "Wow, this is
01:09:03
like a platform for directly giving to
01:09:07
the people that we're targeting, right?"
01:09:10
And, you know, we actually thought about
01:09:13
doing it just in Texas first. And, um,
01:09:17
you know, things have gone pretty well
01:09:19
with the company and all that. So, you
01:09:21
know,
01:09:23
we thought we'd just go bigger. And
01:09:26
what happens, Brad, if
01:09:28
you know 10 more Michael Dells show up
01:09:31
and there are dozens of
01:09:33
>> That's not a lot of Michael Dells, let's
01:09:35
be honest. But
01:09:36
>> There's a number of folks who
01:09:37
could make an equal size or even greater
01:09:40
gift. Um there are people who, you know,
01:09:43
many hands make light work. There
01:09:45
are a thousand people who can make a
01:09:46
gift of significance. What if this
01:09:49
actually becomes a movement and we
01:09:51
>> I think it actually is becoming
01:09:54
a movement instead of a moment, and
01:09:56
we've, you know, got a lot of
01:09:58
that queued up. Brad, why don't you
01:10:00
>> Wait, have you called anybody? Michael,
01:10:02
have you did you call?
01:10:04
>> Michael and I chair the Invest America
01:10:06
Giving Committee and we're ambitious
01:10:08
guys.
01:10:09
>> You're knocking on doors.
01:10:10
>> We we've had a few conversations.
01:10:12
>> You're texting people.
01:10:13
>> Yeah.
01:10:14
>> And so, there was a question
01:10:16
earlier. First, it's really important to
01:10:18
understand and for you guys to spread
01:10:19
the word. Every child under the age of
01:10:21
18, every child under the age of 18 is
01:10:24
eligible to claim their account. Number
01:10:27
one. Number two, you've heard this like,
01:10:30
oh, kids born between '25 and '28. No,
01:10:34
this is forevermore. The legislation
01:10:36
creates this account forevermore. Every
01:10:38
child born in America starting January
01:10:41
1st, 2027 will automatically
01:10:44
get a Trump account right at birth,
01:10:47
stapled to their social security card.
01:10:49
The $1,000 has to be reauthorized every
01:10:52
four years. Okay? But the accounts
01:10:54
don't. So every kid, this is social
01:10:57
security 2.0. This is the biggest change
01:10:59
to the social contract in America in in
01:11:03
50 years. 3.7 million kids a year will
01:11:06
get an account that can compound: a 401k
01:11:09
from birth. And yes, we're going to have
01:11:11
a lot of announcements, but it's not
01:11:14
just billionaires. It's going to be
01:11:16
companies that are donating stock on
01:11:18
their IPOs into these accounts. It's
01:11:20
going to be wealthy people. It's going
01:11:22
to be states. It's going to be moms and
01:11:24
dads. It's going to be corporations. And
01:11:26
the estimate is over 15 years, we can
01:11:29
move $5 trillion
01:11:32
into the pockets of families that would
01:11:34
have otherwise had zero. $5 trillion,
01:11:39
right? And so to me, the leadership that
01:11:41
Michael showed not only in helping me
01:11:43
get the meeting that ultimately got this
01:11:46
passed into law. And it does take people.
01:11:48
Like, those moments either happen or they
01:11:51
don't happen. And if they don't happen,
01:11:53
there's no law. And this doesn't change
01:11:55
kids lives.
01:11:56
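The compounding idea behind "a 401k from birth" is easy to sanity-check with a short calculation. The 7% annual return and the $250 yearly gift below are illustrative assumptions for the sketch, not terms of the program.

```python
# Growth of a birth account: a seed deposit plus optional yearly gifts,
# compounded annually at an assumed market return until age 18.

def balance_at_18(seed: float, yearly_add: float, rate: float = 0.07) -> float:
    """Balance after 18 years of compounding plus end-of-year additions."""
    bal = seed
    for _ in range(18):
        bal = bal * (1 + rate) + yearly_add
    return bal

print(f"seed only:       ${balance_at_18(1000, 0):>10,.2f}")
print(f"+ $250 per year: ${balance_at_18(1000, 250):>10,.2f}")
```

Even the seed-only case more than triples by 18 under the assumed rate, and because each yearly gift compounds too, the total with contributions ends up well above seed growth plus the raw sum of the gifts.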
>> By the way, two things on this.
01:11:58
>> If this $5 trillion moved through
01:12:01
government programs, it would get
01:12:03
incinerated.
01:12:03
>> Exactly.
01:12:04
>> That's what we see happen. There's just
01:12:06
a million crony structures that take it
01:12:09
away and destroy it. So to give it
01:12:10
directly into the accounts is the
01:12:12
answer. The second thing is it
01:12:14
makes a lot of sense that you guys can.
01:12:17
I'll be the lead, but can we replace
01:12:20
social security in this country with a
01:12:22
defined benefit or defined contribution
01:12:24
like this and eventually everyone has a
01:12:27
Trump account or whatever you call it
01:12:29
and we don't have to have this fake
01:12:31
Ponzi scheme that we call Social Security? We
01:12:34
can do it. Well, they have a
01:12:35
defined benefit program, but I'm saying
01:12:37
like everyone has an account and they
01:12:39
all own a piece of their future. And
01:12:40
every time you get a payroll tax
01:12:41
deduction, instead of it getting
01:12:42
eviscerated and destroyed and
01:12:45
vaporized, that money actually goes into
01:12:47
an account and you buy a piece of a
01:12:49
company and maybe you can direct it.
01:12:50
>> Freeberg's getting
01:12:51
>> on July 4th of this year.
01:12:53
>> You're getting me wound up. So for all
01:12:54
the there are 4.5 million kids who've
01:12:57
claimed their account, almost 150,000 a
01:13:00
day. We'll have, on the trajectory we're on,
01:13:02
10 million by July 4th, our 250th
01:13:05
anniversary of the country. Every one of
01:13:07
those kids accounts the parents and the
01:13:10
kids on July 4th they'll see an app on
01:13:12
their phone that looks a lot like a
01:13:14
Robinhood app that'll show them owning.
01:13:16
It'll say you've received your $1,000 or
01:13:19
your $250, and it will show a
01:13:22
little bit of Nvidia, a little bit of
01:13:23
Walmart, a little bit of Dell. We
01:13:25
decompose the S&P 500 which they own
01:13:28
into the constituent parts so they can
01:13:30
get excited about being an owner in the
01:13:34
upside of America. And when moms and
01:13:35
dads double-click and Apple Pay 5 or 10 bucks
01:13:38
into the account, right? When they send
01:13:40
their QR codes to their friends on their
01:13:43
birthday and now their friends all add
01:13:45
to the account or on Christmas or bar
01:13:46
mitzvah and they add to the accounts.
01:13:49
When companies add to the accounts, all
01:13:51
of this, they see it growing and it
01:13:53
unlocks the human potential. It's not
01:13:56
just the money. It's that I'm in the
01:13:59
game. I have a shot. Which to David's
01:14:03
point, I think the biggest crime of
01:14:05
social security. And we made very clear,
01:14:07
social security is a sacred promise. We
01:14:10
refused, and many people tried to get
01:14:12
us to take on the broader
01:14:14
struggle. And we didn't do it because we
01:14:17
knew it would kill this program. But
01:14:19
let's be clear about this. Our
01:14:21
government requires all of us to give
01:14:23
10% of what we earn into social
01:14:26
security, right? It was the social
01:14:28
contract evolution in the industrial
01:14:30
revolution that kept the country
01:14:32
together. The only problem is it goes
01:14:34
into a black hole. Nobody sees it.
01:14:36
Nobody knows what's there. But it is
01:14:38
your savings. Now imagine if that same
01:14:40
money was still required, you know, the
01:14:42
government took it away, but it was in
01:14:44
an account with your name on it. You
01:14:46
could see it grow. You knew exactly what
01:14:48
was there. You could get excited and
01:14:50
say, "Hey, I'm going to add a little bit
01:14:51
more to that, right?" And you
01:14:53
had a little bit of choice. That to me
01:14:55
is the possibility. And I think we
01:14:57
will end up there someday.
01:14:58
>> And Brad, thank you for
01:15:06
just going to say, you know, Brad
01:15:08
also adopted his home state of Indiana.
01:15:11
We have Ray and Barbara Dalio, who
01:15:14
adopted their home state of Connecticut.
01:15:16
and many, many more to come.
01:15:19
>> And look, it's going to be super
01:15:20
easy for anybody to add 100 kids in
01:15:24
your neighborhood, adopt a zip code,
01:15:26
adopt a school district, adopt a town.
01:15:28
>> It's going to be amazing. Give it up for
01:15:30
one of the great entrepreneurs of our
01:15:32
time and an incredible pledge.
01:15:35
I'm going all in.


Episode Highlights

  • Digitizing the Physical World
    Kalanick explains his vision of treating atoms like bits, transforming industries through physical automation.
    “You’re building an atoms-based computer.”
    @ 03m 53s
    March 17, 2026
  • The Future of Mining
    Kalanick shares insights on automating mining equipment and the potential for increased productivity.
    “Automation unlocks capacity.”
    @ 08m 24s
    March 17, 2026
  • California's Changing Landscape
    Reflecting on the exodus from California, Kalanick shares his emotional connection to Los Angeles.
    “It’s pretty messed up.”
    @ 17m 00s
    March 17, 2026
  • Building the Future
    Living in a place that feels like home, where everyone wants to build the future.
    “The people here are dope. The food is dope.”
    @ 19m 49s
    March 17, 2026
  • Capital as a Weapon
    Using capital strategically can be a game changer in the competitive landscape.
    “Capital as a strategic weapon for its own sake is not a thing.”
    @ 29m 44s
    March 17, 2026
  • China's Manufacturing Power
    China is advancing rapidly in manufacturing, showcasing fierce competition.
    “These guys are killing it. The best ideas are winning.”
    @ 33m 55s
    March 17, 2026
  • AI Data Center Boom
    Texas is experiencing a massive buildout of AI infrastructure, driven by demand for data centers.
    “There's been a massive buildout particularly in some of the cities in Texas.”
    @ 40m 58s
    March 17, 2026
  • The Future of Business
    Companies must adapt to survive, or risk being outpaced by faster, more innovative competitors.
    “The only way we’re going to prevent that is to become that company.”
    @ 46m 42s
    March 17, 2026
  • The Rise of Open Source Models
    Open source models are thriving, with significant advancements from companies like Google and OpenAI.
    “You know, enormous progress in the open source models.”
    @ 56m 06s
    March 17, 2026
  • The Impact of AI on Employment
    Concerns arise about AI's impact on employment and social issues, with AI ranked unfavorably.
    “How worried are you about social issues?”
    @ 59m 58s
    March 17, 2026
  • Invest America Act Announcement
    A groundbreaking philanthropic initiative aims to provide $250 to 25 million children in need.
    “This is one of the greatest philanthropic gifts in the history of humanity.”
    @ 01h 05m 10s
    March 17, 2026
  • Social Security 2.0
    The new initiative promises to create accounts for every child born in America, revolutionizing financial security.
    “This is social security 2.0.”
    @ 01h 10m 59s
    March 17, 2026

Key Moments

  • Automation in Mining @ 08:24
  • Startup Dilemmas @ 18:36
  • Leaving the Office @ 18:41
  • Building the Future @ 20:13
  • Open Source Ecosystem @ 56:24
  • AI and Social Issues @ 1:00:01
  • Invest America Initiative @ 1:04:23
  • The Possibility @ 1:14:55

Related Episodes

  • The Future of Everything: What CEOs of Circle, CrowdStrike & More See Coming in 2026
  • Grok 4 Wows, The Bitter Lesson, Third Party, AI Browsers, SCOTUS backs POTUS on RIFs
  • E125: SpaceX launch, Fox News settlement, "Zombie-corn" exodus to AI, late-stage implosion