JD Vance's AI Speech, Techno-Optimists vs Doomers, Tariffs, AI Court Cases with Naval Ravikant

February 15, 2025 / 01:50:19

This episode features discussions on podcast dynamics, parenting philosophies, and the impact of AI on jobs and technology. Guests include Naval Ravikant, David Sacks, and David Friedberg.

Naval Ravikant shares his thoughts on why the podcast format works well, emphasizing the importance of engaging conversations among intelligent participants. He praises the chemistry between the hosts and the fun atmosphere.

The conversation shifts to parenting, where Ravikant discusses his approach of granting children autonomy and treating them as equals. He references the philosophy of taking children seriously and allowing them to make their own choices.

AI's impact on jobs is a significant topic, with guests debating whether AI will create or destroy jobs. Ravikant argues that technology often creates new opportunities, while Sacks highlights the importance of maintaining a competitive edge in AI.

The episode concludes with a discussion on tariffs and the implications of copyright law in the context of AI, with guests expressing differing opinions on the future of AI and its regulation.

TL;DR

Naval Ravikant and guests discuss podcast dynamics, parenting philosophies, AI's job impact, and copyright implications in technology.

Video

00:00:00
great job Naval you rocked it maybe I
00:00:02
should have said this on air but that
00:00:03
was literally the most fun podcast I've
00:00:05
ever recorded oh that's on air cut that
00:00:07
in yeah put it in the show put it in the
00:00:09
show I had my theory on why you were
00:00:11
number one but now I have the
00:00:12
realization what's the actual reason you
00:00:14
know us for a long time yeah what was
00:00:15
your theory what's the reality my theory
00:00:17
was that my problem with going on
00:00:19
podcast is usually the person I'm
00:00:20
talking to is not that interesting
00:00:22
they're just asking the same questions
00:00:24
and they're dialing it in and they're
00:00:25
not that interested it's not like we're
00:00:26
having a peer level actual conversation
00:00:28
so that's why I wanted to do Airchat
00:00:29
and Clubhouse and things like that cuz
00:00:31
you can actually have a conversation I
00:00:33
see right and what you guys have very
00:00:35
uniquely is four people you know of whom
00:00:39
at least three are intelligent I'm
00:00:42
kidding how could you say that isn't
00:00:45
here what Sacks isn't even hearing you say
00:00:48
that that is so cold best right these
00:00:52
three are intelligent and all of you get
00:00:54
along and you can have an ongoing
00:00:56
conversation that's a very high hit rate
00:00:58
normally in a podcast you only get one
00:01:00
interesting person and now you've got
00:01:02
three maybe four right okay so that to
00:01:05
me was why and by the way who you're talking to
00:01:08
is number four we don't know she'll remain
00:01:10
mysterious forever of the four right the
00:01:13
problem is like if you get people
00:01:14
together to talk two is a good
00:01:16
conversation three possibly four is the
00:01:18
max that's why in a dinner table at a
00:01:20
restaurant four top right you don't do
00:01:22
five or six because then it splits into
00:01:24
multiple conversations so you had four
00:01:26
people who were capable of talking right
00:01:29
that I thought was the secret but
00:01:30
there's another secret the the other
00:01:32
secret is you guys are having fun you're
00:01:34
talking over each other you're making
00:01:36
fun of each other you're actually having
00:01:38
fun so that's why I'm saying this is the
00:01:40
most fun podcast I've ever been on and
00:01:42
that's that's why you'll be welcome back
00:01:44
anytime Naval welcome back yes
00:01:48
absolutely in three smart
00:01:50
[Music]
00:01:52
guys I can't even believe you'd say that
00:01:54
about Sacks he's not even here to defend
00:01:57
himself sorry David
00:02:00
let your winners
00:02:01
[Music]
00:02:06
ride we open sourced it to the fans and
00:02:09
they've just gone crazy with
00:02:11
[Music]
00:02:14
it all right everybody welcome back to
00:02:17
the number one podcast in the world
00:02:20
we're really excited today back again
00:02:24
your Sultan of Science David
00:02:26
Friedberg what do you got going on there
00:02:28
Friedberg what's in the background
00:02:31
want I used to play I used to play a lot
00:02:33
of a game called SimEarth on my
00:02:35
Macintosh LC way back in the day that
00:02:38
tracks yeah that tracks and of course
00:02:42
with us again your chairman what games
00:02:44
did you play growing up J actually I'm
00:02:46
kind of curious did you ever play video
00:02:47
games let's say Andrea Allison uh Susan
00:02:52
I mean it was like a lot of cute girls I
00:02:55
was out dating girls Friedberg yeah I was
00:02:59
not on my Apple II playing Civilization
00:03:02
let me find one of those pictures of who
00:03:04
whoa don't get me in trouble Man the 80s
00:03:06
were the '80s were good to me in Brooklyn
00:03:08
rejection the video
00:03:10
game yes you have three lives rejected
00:03:13
rejected it's a numbers game Chamath as
00:03:16
you know as you well know it is a
00:03:18
numbers game yeah Nick go ahead pull
00:03:20
up uh pull up Rico Suave here oh no what
00:03:23
is this one instead of playing video
00:03:24
here I am in the '80s that's fat that's
00:03:28
fat J you want get
00:03:30
Nick help out help out your uncle yeah
00:03:32
here he is out slaying help out your
00:03:34
uncle with the VIN Jak C you know what
00:03:37
he was slinging in there snacks yeah you
00:03:40
pre and post Ozempic right correct and
00:03:43
weightlifting beef
00:03:46
jerky go find my Leonardo DiCaprio
00:03:48
picture please and replace my fat jow
00:03:51
picture with that thank you oh God I was
00:03:54
fat man plus 40 lbs is a lot heavier
00:03:57
than I it's no joke it's no joke 40 lbs a
00:04:00
lot I there's so many great emo photos
00:04:03
of me I'm proud of you thank you my man
00:04:05
thank you my man if you want a good can
00:04:08
you get through the intros please so we
00:04:09
can start come on quick how you doing
00:04:11
brother how you doing chairman
00:04:14
dictator you
00:04:16
good all right all right we're really
00:04:19
excited today today for the first time
00:04:22
on the All-In podcast the Iron Fist of
00:04:26
AngelList the Zen mage of the early
00:04:29
stage he has such a way with words he's
00:04:32
the Socrates of nerds please welcome my
00:04:36
guy Namaste Naval how you doing the intros
00:04:38
are back that is the best intro I've
00:04:41
ever gotten I didn't think I don't I
00:04:42
didn't think you could do that that was
00:04:44
amazing that's your superpower right
00:04:46
there lock it in quit Venture Capital
00:04:47
just do that absolutely that's actually
00:04:50
you know what interestingly number one
00:04:51
podcast in the world like someone said I
00:04:54
mean that's what I'm manifesting it's
00:04:56
getting close we've been in the top 10
00:04:58
so I mean the weekends are good for all
00:05:00
in this this one will hit number one
00:05:02
this one will go viral I think it could if
00:05:03
you have some really great pithy
00:05:06
insights we might go right to the top I
00:05:09
just got to do a Sieg Heil and it'll go
00:05:10
viral oh no oh no are you gonna send us
00:05:14
your heart my heart goes out to you my
00:05:17
heart I I end here at the heart I don't
00:05:20
send it out I keep it right here I put
00:05:23
both hands on the heart and I hold it
00:05:25
nice and steady I hold it in it's sending
00:05:28
out to you but just not explicitly all
00:05:31
right for those of you who don't know
00:05:33
Naval was an entrepreneur uh he kicked a
00:05:35
bit of ass he got his ass kicked and
00:05:37
then he started Venture
00:05:40
Hacks and he started emailing folks and
00:05:43
saying you know 20 15 20 years ago maybe
00:05:46
15 here are some deals in Silicon Valley
00:05:49
he went around he started writing 50k
00:05:50
100K checks he hit a bunch of home runs
00:05:53
and he turned Venture Hacks into
00:05:56
AngelList and then he has invested in a ton
00:05:59
of great startups uh maybe give us some
00:06:01
of the greatest hits there
00:06:02
Naval yeah Twitter Uber Notion bunch of
00:06:06
others um Postmates Udemy lot of
00:06:09
unicorns a bunch up and coming it's actually
00:06:12
a lot of deals at this point but
00:06:14
honestly I'm not necessarily proud of
00:06:16
being an investor investor to me is a
00:06:17
side job it's a hobby so I do startups
00:06:21
how how do you define
00:06:23
yourself I don't I mean I guess these
00:06:26
days I would say more like building
00:06:27
things you know every
00:06:29
every so-called career is an evolution
00:06:31
right and all of you guys are
00:06:33
independent and you kind of do what
00:06:34
you're most interested in right that's
00:06:36
the point of making money so you can
00:06:38
just do what you want so these days I'm
00:06:40
really into building and crafting
00:06:42
products so I built one recently called
00:06:44
Airchat it kind of didn't work I'm still
00:06:47
proud of what I built and got to work
00:06:49
with an incredible team and now I'm
00:06:51
building a new product and this time I'm
00:06:52
going into Hardware oh and I'm just
00:06:54
building something that I really want I
00:06:57
I'm not are you funding it all yourself Naval
00:07:00
partially I bring investors along last
00:07:02
time they got their money back previous
00:07:04
times they've made money next time
00:07:05
hopefully they'll make a lot of money
00:07:07
it's good to bring your friends along
00:07:09
I'll be honest I love that you said I
00:07:11
love the product but it didn't work not
00:07:13
enough people say that yeah I know I I
00:07:15
built a product that I loved that I was
00:07:17
proud of but it didn't catch fire and it
00:07:19
was a social product so it had to catch
00:07:21
fire for it to work so I found the team
00:07:23
great homes they all got paid the
00:07:24
investors that I brought in got their
00:07:25
money back and I learned a ton which I'm
00:07:29
leveraging into the new thing but the
00:07:31
new thing is much harder the new thing
00:07:32
is hardware and software and what did
00:07:34
you what did you learn building in 2024
00:07:37
and 2025 that you didn't know maybe
00:07:39
before then the main thing was actually
00:07:42
just the craft the craft of pixel by
00:07:44
pixel designing a software product and
00:07:46
launching it I guess the main thing I
00:07:50
took away that was a learning was that I
00:07:53
really enjoyed building products and
00:07:55
that I wanted to build something even
00:07:57
harder and something even more real and
00:08:00
I think like a lot of us I'm inspired by
00:08:02
Elon and you know all the incredible
00:08:04
work he's done so I don't want to build
00:08:05
things that are easy I want to build
00:08:07
things that are hard and interesting and
00:08:09
I want to take on more technical risk
00:08:11
and less Market risk this is the classic
00:08:13
VC learning right which is you want to
00:08:15
build something that if people get it if
00:08:18
you can deliver it you know people will
00:08:20
want it and it's just hard to build as
00:08:22
opposed to you build it and you don't
00:08:24
know if they want it so that's a
00:08:26
learning Airchat was a lot of fun for
00:08:28
those of you who don't know it was kind
00:08:29
of like a social media Network where you
00:08:32
could ask a question and then people
00:08:34
could respond and it was like an
00:08:36
audio-based Twitter would you say that
00:08:38
was the the best way to describe it
00:08:40
audio Twitter asynchronous AI
00:08:43
transcripts and all kinds of AI to make
00:08:45
it easier for you
00:08:47
translation really good way for kind of
00:08:50
try to make podcasting type
00:08:52
conversations more accessible to
00:08:53
everybody cuz honestly one of the
00:08:55
reasons I don't go on podcast I don't
00:08:56
like being intermediated so to speak
00:08:59
right where you sit there and someone
00:09:00
interviews you and then you go back and
00:09:01
forth and you go through the same old
00:09:02
things I just want to talk to people I
00:09:04
want peer relationships kind of like you
00:09:06
guys have running here Naval what
00:09:08
happened when you went through that
00:09:09
phase there was a period where it just
00:09:12
seemed like something had gone on in
00:09:14
your life and you just knew the answers
00:09:16
you were just so grounded it's not to
00:09:19
say that you're not grounded now but
00:09:20
you're you're less active posting and
00:09:22
writing but there was this period where
00:09:24
I think all of us were like right what
00:09:26
does Naval think oh really oh okay that's
00:09:29
used to be I would say it would be like
00:09:31
the late teens the early 20s Jason you
00:09:34
can correct me if I'm getting the dates
00:09:35
wrong but it's been that moment where
00:09:36
like these Naval isms and this sort of
00:09:38
philosophy really started to I think
00:09:40
people had a tremendous respect for how
00:09:43
you were thinking about things I I'm
00:09:44
just curious like what were you going
00:09:46
through something in that moment or like
00:09:48
oh yeah yeah yeah yeah that's right no
00:09:50
very insightful yeah so I've been
00:09:52
on Twitter since 2007 cuz I was an early
00:09:55
investor but I never really tweeted I
00:09:57
didn't get featured I had no audience I
00:09:58
was just doing the usual techie guy
00:10:00
thing talking to each other and then I
00:10:03
started AngelList in 2010 the original
00:10:05
thing about matching investors to
00:10:07
startups didn't scale it was just an
00:10:09
email list that exploded early on but
00:10:11
then just didn't scale so we didn't have
00:10:13
a
00:10:14
business and I was trying to figure out
00:10:16
the business and at the same time I got
00:10:18
a letter from the Securities and
00:10:19
Exchange Commission saying oh you're
00:10:21
acting as an unlicensed broker-dealer and I'm
00:10:22
like what I'm not making any money I'm
00:10:24
not I'm just making intros I'm not
00:10:25
taking anything it's just a public
00:10:26
service but even then they were coming
00:10:28
after me so I wasn't and I'd raised a
00:10:31
bunch of money from investors so I was
00:10:32
in a very high stress period of my life
00:10:34
now looking back it's almost comical
00:10:36
that I was stressed over but at the time
00:10:38
it all felt very real the weight of
00:10:40
everything was on my shoulders
00:10:41
expectations people money regulators and
00:10:45
I eventually went to DC and got the law
00:10:47
changed to legalize what we do which
00:10:49
ironically enabled a whole bunch of
00:10:50
other things like ICOs and incubator
00:10:52
days and so on demo days but in that
00:10:55
process I was in a very high stress
00:10:57
period of my life and I just started
00:10:58
tweeting whatever I was going through
00:11:00
whatever realizations I was having
00:11:02
it's only in stress that you sort of are
00:11:04
forced to grow and so whatever internal
00:11:06
growth I was going through I just
00:11:07
started tweeting it not thinking much of
00:11:10
it and it was a mix of there are three
00:11:12
things that kind of are always
00:11:14
running through one is I love science
00:11:17
you know I'm an amateur love physics
00:11:20
let's just leave it at that I love
00:11:22
reading a lot of philosophy and thinking
00:11:25
deeply about it and I like making money
00:11:28
right truth Love and Money that's my
00:11:29
joke on my Twitter bio those are the
00:11:32
three things that I keep coming back to
00:11:34
and so I just started tweeting about all
00:11:35
of them and I
00:11:37
think before that the expectation was
00:11:40
that someone like me should just be
00:11:41
talking about money stay in your lane
00:11:43
and people had been playing it very safe
00:11:46
and so I think the combination of the
00:11:47
three sort of caught people's attentions
00:11:49
because every person thinks about
00:11:52
everything we don't just stay in our
00:11:53
Lane in real life we're dealing with our
00:11:55
relationships we're dealing with our
00:11:57
relationship with the universe we're
00:11:59
dealing with what we know to be true and
00:12:01
you know with science and how we make
00:12:03
decisions and how we figure things out
00:12:05
and we're also dealing with a practical
00:12:06
everyday material things of how to deal
00:12:09
with our spouses or girlfriends or wives
00:12:11
or husbands and how to make money and
00:12:13
how to deal with our children so I'm
00:12:15
just tweeting about everything I just
00:12:17
got interested in everything I'm
00:12:18
tweeting about it and a lot of it my
00:12:20
best stuff was just notes to self it's
00:12:21
like hey don't forget this how to get
00:12:23
rich remember that one how to get rich
00:12:25
that thread was super yeah bang
00:12:30
Yeah Yeah Yeah I think that is still the
00:12:33
most viral thread ever on Twitter I like
00:12:35
Timeless things I I like philosophy I
00:12:37
like things that still apply in the
00:12:39
future I like compound interest if you
00:12:42
will in
00:12:43
ideas obviously recently X has become so
00:12:46
addictive that we're all checking it
00:12:48
every day and Elon's built the perfect
00:12:51
for you he's built TikTok for nerds and
00:12:53
we're all on it but normally I try to
00:12:55
ignore the news obviously last year
00:12:57
things got real we all had to pay
00:12:59
attention to the news but I just like to
00:13:02
tweet Timeless things I I don't know I
00:13:03
mean people pay attention sometimes they
00:13:05
like what I write sometimes they they go
00:13:07
nonlinear on me but yeah the how to get
00:13:09
rich tweetstorm was a big one is it
00:13:11
problematic when people now meet you
00:13:12
because the the hype versus the reality
00:13:16
there's like it's discordant now because
00:13:18
people if they absorb this content they
00:13:19
expect to see
00:13:21
some yogi yeah floating in the air
00:13:24
you know what I mean yes yeah like many
00:13:26
of you have stopped drinking but I used
00:13:28
to like have the occasional glass of
00:13:30
wine and there was a moment there where
00:13:31
I went and met with an Information
00:13:34
reporter back when I used to meet with
00:13:35
reporters and she said where are we
00:13:38
going to meet so I said oh let's meet at
00:13:39
the the wine merchant and we'll get a
00:13:41
glass she's like what you drink like it
00:13:43
was like a big deal for I'm so
00:13:46
disappointed I was like I'm an
00:13:48
entrepreneur most of them are alcoholics
00:13:50
or and psychedelics or yeah for sure
00:13:53
takes to
00:13:54
man hot tub yeah right yeah when they
00:13:57
say I'm on therapy you know what that
00:13:59
that's code for um so yes it is highly
00:14:03
discordant yeah I'm I'm almost
00:14:05
reminded of that line in The Matrix
00:14:07
where that agent is about to like shoot
00:14:09
one of the Matrix characters and says
00:14:11
only human right so that's kind of what I want
00:14:13
to say to everybody like only human yeah
00:14:16
yeah yeah you did a recently a podcast
00:14:18
with Tim Ferriss on parenting this was
00:14:22
out there I love this and I bought the
00:14:24
book from this guy yeah just give a a
00:14:27
brief overview of this philosophy of
00:14:30
parenting oh I didn't listen to this I
00:14:31
have to write this down tell us this
00:14:33
you're going to love this I this spoke
00:14:35
to me but it was a little crazy yeah so
00:14:37
I'm a big fan of David Deutsch David
00:14:39
Deutsch I think is basically the smartest
00:14:41
living human he's a scientist who yeah
00:14:44
Quantum computation and he's written a
00:14:46
couple of great books but it's about the
00:14:48
intersection of the greatest theories
00:14:50
that we have today the theories with the
00:14:51
most reach and those are epistemology
00:14:54
the theory of knowledge Evolution
00:14:56
quantum physics and computation this is
00:14:59
the Beginning of Infinity guy that's
00:15:00
always referenced correct yes
00:15:04
The Fabric of Reality is another book
00:15:05
I've spent a fair bit of time with him
00:15:07
done some podcasts with him hired and
00:15:09
worked with people around him and I'm
00:15:11
just really impressed because it's like
00:15:12
the framework that's made me smarter
00:15:14
I feel like because we're all fighting
00:15:16
aging our brains are getting slower and
00:15:18
we're always trying to have better ideas
00:15:20
so as you age you should have wisdom
00:15:21
that's your substitute for the raw
00:15:23
horsepower of intelligence going down
00:15:25
and so scientific wisdom I take from
00:15:27
David not take but you know I learned
00:15:29
from David and one of the things that he
00:15:32
pioneered is called taking children
00:15:34
seriously and it's this idea that you
00:15:35
should take your children seriously like
00:15:37
adults you should always give them the
00:15:39
same Freedom that you would give an
00:15:40
adult if you wouldn't speak that way
00:15:42
with your spouse if you wouldn't force
00:15:44
your spouse to do something don't force
00:15:46
a child to do something and it's only
00:15:48
through the latent threat of physical
00:15:51
violence hey I can control you I can
00:15:53
make you go to your room I can take your
00:15:55
dinner away or whatever that you
00:15:57
intimidate children and it resonated
00:16:00
with me because I grew up very very free
00:16:03
my father wasn't around when I was young
00:16:05
my mother didn't have the bandwidth to
00:16:07
watch us all the time she had other
00:16:09
things to do and so I kind of was making
00:16:11
my own decisions from an extremely young
00:16:13
age from the age of five nobody was
00:16:15
telling me what to do and from the age
00:16:16
of nine I was telling everybody what to
00:16:18
do so I'm used to that and I've been
00:16:20
homeschooling my own kids so the
00:16:22
philosophy resonated and I found this
00:16:25
guy Aaron Stupple on Airchat and he was
00:16:28
an incredible Expositor of the
00:16:29
philosophy he lives his life with it 99%
00:16:32
as Extreme as one can go so his kids can
00:16:35
eat all the ice cream they want and all
00:16:36
the Snickers bars they want they can
00:16:38
play on the iPad all they want they
00:16:39
don't have to go to school if they don't
00:16:40
feel like it they dress how they want
00:16:42
they don't have to do anything they
00:16:43
don't want to do everything is a
00:16:44
discussion negotiation explanation just
00:16:47
like you would with a roommate or an
00:16:48
adult living in your house and it's kind
00:16:51
of insane and extreme but I live my own
00:16:54
home life in that Arc in that direction
00:16:59
and I'm a very free person I don't have
00:17:00
an office to go to I try really not to
00:17:02
maintain a calendar if I can't remember
00:17:04
it I don't want to do it I don't send my
00:17:07
kids to school I really try not to coerce
00:17:08
them and so obviously that's an extreme
00:17:11
model but I was sorry sorry sorry hold
00:17:14
on a second so yeah your kids if they if
00:17:18
they were like I want Häagen-Dazs and it's
00:17:21
900 p.m. you're like okay two nights ago
00:17:25
I did this I ordered the Häagen-Dazs it
00:17:27
wasn't Häagen-Dazs it was a different
00:17:28
brand I ordered it
00:17:29
I'm just going to go through a couple of
00:17:30
examples we do ice cream at 9:00 p.m.
00:17:32
and we all ate so they're like Dad I
00:17:34
want and they're happy they're happy
00:17:36
kids I want to be on my iPad I'm playing
00:17:38
fortnite leave me alone I'll go to sleep
00:17:40
when I want you're like okay my oldest
00:17:42
probably plays iPad nine hours a day
00:17:45
okay so then your other kid pees in
00:17:48
their pants because they're too lazy to
00:17:49
walk to the bathroom they don't do that
00:17:51
because they don't like peeing their
00:17:52
pants no I understand but I'm just
00:17:53
saying like there's a spectrum of all of
00:17:55
these things right yeah and your point
00:17:57
of view is 100% of it is allowed and you
00:18:00
have no judgments no no that's not where
00:18:02
I am that's where that's where Aaron is
00:18:04
my rules are a little different my rules
00:18:06
are they got to do one hour of math or
00:18:09
programming plus two hours of reading
00:18:12
every single day and the moment they've
00:18:14
done that they're free creatures and
00:18:17
everything else is a negotiation we have
00:18:19
to persuade them it's a persuasion I
00:18:20
should say not even a negotiation and
00:18:22
even the hour of math and 2 hours of
00:18:24
reading really you get 15 to 30 minutes
00:18:26
of math maybe an hour if you're lucky
00:18:28
and you get half an hour to two hours of
00:18:29
reading if and what do you think the
00:18:31
long-term consequences of that are and
00:18:34
then also what is the long-term
00:18:35
consequences let's say on health if
00:18:38
they're making decisions you know are
00:18:40
just not good like the ice cream thing
00:18:42
at 9:00 p.m. how do you how do you
00:18:43
manage that in your mind I think
00:18:47
whatever age you're at whatever part
00:18:48
you're at in life you're still always
00:18:50
struggling with your own habits I think
00:18:52
all of us for example still eat food and
00:18:54
feel guilty or want to eat something
00:18:55
that we shouldn't be eating and we're
00:18:57
still always evolving our diets and kids
00:18:58
are the same so my oldest is already he
00:19:01
passed on the ice cream last time and he
00:19:03
said I want to eat healthier because
00:19:04
finally I managed to get through to him
00:19:05
and persuade him that he should be
00:19:07
healthier my younger kids will eat it
00:19:09
but they'll eat a limited amount my
00:19:11
middle kid will sometimes eat some so if
00:19:14
they say something you'll enable it but
00:19:15
then you'll guide you'll be like Hey
00:19:17
listen like this is not the choice I
00:19:19
would make I don't think but if you want
00:19:20
it I do it yeah I'll try it but you also
00:19:23
have to be careful where you don't want
00:19:24
to intimidate them and you don't want to
00:19:25
be so overbearing that then they just
00:19:28
view dad as like controlling I find this
00:19:30
so fascinating and so what do you think happens
00:19:32
to these kids at the like I'm sure you
00:19:34
have a vision of what they'll be like
00:19:36
when they're fully formed adults like
00:19:37
what is that Vision I I try not to
00:19:40
they're going to be who they're going to
00:19:41
be I this is kind of how I grew up I
00:19:43
kind of did what I
00:19:44
wanted I would rather they
00:19:48
have
00:19:49
agency than turn out exactly the way I
00:19:52
want cuz agency is the hardest thing
00:19:54
right having control over your own life
00:19:56
making your own decisions them to be
00:19:59
happy I have a very happy household what
00:20:01
is the Plato what's Plato's goal eudaimonia
00:20:04
right like eudaimonia yeah the happy Aristotelian
00:20:07
like the fulfillment this concept is
00:20:08
that what you want for
00:20:10
them I don't really want anything for
00:20:13
them I just want them to be free and
00:20:17
their best selves God I
00:20:19
want Chamath is worrying about details
00:20:22
he's got like 17 kids now I don't know
00:20:24
if you know but Chamath has got like a
00:20:26
whole punch list of things but I love
00:20:27
this interview because the guy made a
00:20:30
really interesting point which was
00:20:32
they're going to have to make these
00:20:33
decisions at some point they're going to
00:20:35
have to learn the pros and cons the the
00:20:39
the upside the downside to all these
00:20:40
things eating iPad and the quicker you
00:20:43
get them to have agency to make these
00:20:45
decisions for themselves with knowledge
00:20:47
to ask questions the more secure they'll
00:20:50
be I found it a fascinating discussion I
00:20:52
like cause and effect especially in
00:20:54
teenagers now that I have a teenager
00:20:56
it's really good for them to learn hey
00:20:59
you know if you don't do your homework
00:21:00
you you have a problem and then you got
00:21:03
to solve that problem how are we going
00:21:04
to solve that problem so I like to
00:21:05
present it as what's your plan anytime
00:21:07
they have a problem 8-year-old kids
00:21:10
15-year-old kids I just say what's your
00:21:11
plan to solve this and then I like to
00:21:13
hear their plan and let me know if you
00:21:15
want to brainstorm it but I thought it
00:21:16
was a very interesting super interesting
00:21:19
discussion I I would say overall my kids
00:21:21
are very happy the household is very
00:21:23
happy everybody gets along everybody
00:21:26
loves each other yeah
00:21:29
some of them are way ahead of their
00:21:30
peers nobody's behind in anything that
00:21:33
matters nobody seems unhealthy in any
00:21:36
obvious way no one has aberrant eating
00:21:38
habits I haven't even found an really an
00:21:40
aberrant behavior that's out of line so
00:21:43
it's all good it self-corrects it's like
00:21:45
I worry a lot about this like
00:21:47
iPad situation I see my kids on an iPad
00:21:50
and it's almost like unless they're
00:21:53
doing an
00:21:54
interactive project if they end up
00:21:56
watching says the guy who has a
00:21:58
video game theme in the background that
00:22:00
was interactive who probably and who
00:22:02
probably grew up playing video games
00:22:04
nonstop and probably spends 9 hours a
00:22:06
day on his screen just called a phone so
00:22:08
yeah yeah it's the same thing man well I
00:22:11
mean I feel like watch is it but do they
00:22:13
watch
00:22:13
shows no there there's a
00:22:16
hypocrisy to picking up your phone and
00:22:17
then saying to your kid no you can't use
00:22:19
your iPad I grew up playing video games
00:22:21
non-stop in video games when I was older
00:22:23
and I was an avid gamer until just a few
00:22:25
years ago well I'm not I'm not
00:22:28
criticizing the iPad I was obviously on a
00:22:30
computer since I was four years old so I
00:22:32
totally get it and I think the question
00:22:34
for me is like but I didn't have the
00:22:36
ability to play a 30 minute show and
00:22:40
then play the next 30 minute show and
00:22:41
the next 30 minute show and then sit
00:22:43
there for two hours and just have a show
00:22:45
playing the whole time I was you know
00:22:48
interacting on the computer and doing
00:22:49
stuff and building stuff which was a
00:22:51
little different me from a use case
00:22:53
perspective yeah we we we did use to
00:22:55
control their YouTube access although
00:22:57
now we don't do that
00:22:59
the only thing I ask them is that they
00:23:01
put on captions when they're watching
00:23:03
YouTube so it helps their reading they
00:23:05
learn to read fast good tip yeah I like
00:23:07
that one I will say that one of my kids
00:23:09
is really into YouTube the other two are
00:23:11
not like they just got over it and to
00:23:13
the extent that they use YouTube it's
00:23:14
mostly because they're looking up videos
00:23:16
on their favorite games they want to
00:23:18
know how to be better at a game all
00:23:20
right let's keep moving through this
00:23:21
docket we have David Sacks with us here
00:23:24
so uh David give us your philosophy of
00:23:26
parenting okay next item on the docket
00:23:27
let's go
00:23:30
talk about some real
00:23:32
issues this show is not a parenting show a
00:23:35
parenting show I asked D what's your
00:23:38
parenting philosophy he said oh I set up
00:23:40
their trust four years ago so he's done
00:23:42
he's good trust is set up everything's
00:23:45
good GRAT
00:23:49
check you're all set guys let me know
00:23:52
how it works out all right speaking of
00:23:55
working out we've got a vice president
00:23:58
who isn't cuckoo for Cocoa Puffs and who
00:24:01
actually understands what AI is JD Vance
00:24:04
gave a great speech I watched it myself
00:24:06
he talked about AI in Paris this was on
00:24:09
Tuesday at the AI Action Summit whatever
00:24:12
that is and he gave a 15-minute banger of
00:24:15
a speech he talked about overregulating
00:24:17
AI and America's intention to dominate
00:24:20
this and we happen to have with us Naval
00:24:23
the czar the czar of AI so before I go
00:24:26
into all the details about the speech I
00:24:28
don't steal your thunder Sachs this
00:24:31
uh speech had a lot of uh verbiage a lot
00:24:34
of ideas that I've heard before that
00:24:36
maybe we've all talked about maybe tell
00:24:38
us a little bit about how this all came
00:24:40
together and how proud you are I mean
00:24:43
gosh having a vice president who
00:24:45
understands AI is just it's mind-blowing
00:24:48
he could speak on a
00:24:50
topic that's topical credibly this was
00:24:53
an awesome moment for America I think
00:24:55
what are you implying there JC I'm
00:24:57
implying you have workshopped it with
00:24:59
him no or that he's smart both of those
00:25:02
things the vice president wrote the
00:25:04
speech or at least directed all of it so
00:25:06
the ideas came from him I'm not going to
00:25:08
take any credit whatsoever for this okay
00:25:10
well it was on point maybe you could
00:25:12
talk about I agree it was on point I
00:25:13
think it was a very well-crafted and
00:25:15
well-delivered speech he made four main
00:25:17
points about the Trump administration's
00:25:20
approach to AI he's going to ensure this
00:25:22
is point one that American AI continues
00:25:24
to be the gold standard fantastic check
00:25:27
two he says that the administration
00:25:30
understands that excessive regulation
00:25:31
could kill AI just as it's taking off
00:25:34
and he did this in front of all the EU
00:25:35
Elites who love regulation did it on
00:25:38
their home court and then he said number
00:25:40
three AI must remain free from
00:25:42
ideological bias as we've talked about
00:25:44
here on this program and then number
00:25:47
four the White House he said will quote
00:25:50
maintain a pro-worker growth path for AI
00:25:53
so that it can be a potent tool for job
00:25:55
creation in the US so what are your
00:25:58
thoughts on the four major bullet points
00:26:00
and and his speech here in uh Paris well
00:26:03
I think that the vice president you knew
00:26:06
he was going to deliver an important
00:26:08
speech as soon as he got up there and
00:26:09
said that I'm here to talk about not AI
00:26:13
safety but AI opportunity and to
00:26:17
understand what a bracing statement that
00:26:19
was and really almost like a shot across
00:26:21
the bow you have to understand the
00:26:23
history and context of these events for
00:26:26
the last couple of years the last couple
00:26:27
of these events have been exclusively
00:26:29
focused on AI safety the last in-person
00:26:31
event was in the UK at Bletchley Park
00:26:34
and the whole conference was devoted to
00:26:36
AI safety similarly the European AI
00:26:40
regulation obviously is completely
00:26:42
preoccupied with safety and trying to
00:26:43
regulate Away safety risks before they
00:26:46
happen similarly you had the Biden EO
00:26:48
which was based around safety and then
00:26:50
you have just the whole media coverage
00:26:52
around AI which is preoccupied with all
00:26:55
the risks from AI so to have the vice
00:26:58
president get up there and say right off
00:27:00
the bat that there are other things to
00:27:02
talk about in respect to AI besides
00:27:06
safety risk that actually there are huge
00:27:08
opportunities there was a breath of
00:27:11
fresh air and like I said kind of a shot
00:27:13
across the bow and yeah you could almost
00:27:15
see some of the eurocrats they needed
00:27:19
their fainting couches after that
00:27:20
eurocrats
00:27:21
Trudeau looks like his dog just died so I
00:27:24
think that was just a really important
00:27:26
statement right off the bat to the
00:27:28
context for the speech which is AI is a
00:27:31
huge opportunity for all of us because
00:27:34
really that point just has not been made
00:27:35
enough and it's true there are risks but
00:27:38
when you look at the media coverage and
00:27:39
when you look at the dialogue that The
00:27:42
Regulators have had around this they
00:27:43
never talk about the opportunities it's
00:27:45
always just around the risks so I think
00:27:47
that was a very important corrective and
00:27:49
then like you said he went on to say
00:27:52
that the United States has to win this
00:27:54
AI race we want to be the gold standard
00:27:56
we want to dominate that was my favorite
00:27:58
part yeah and and by the way that
00:27:59
language about dominating AI and
00:28:01
winning the global race that is in
00:28:03
president Trump's executive order from
00:28:05
week one so this is very much
00:28:07
elaborating on the official policy of of
00:28:09
this Administration and the vice
00:28:12
president then went on to specify
00:28:13
how we would do that right we
00:28:15
have to win some of these key building
00:28:17
block Technologies we want to win in
00:28:18
chips we want to win in AI models we
00:28:20
want to win in applications he said we
00:28:23
need to build we need to unlock energy
00:28:25
for these companies and then most of all
00:28:27
we just need to be supportive towards
00:28:28
them as opposed to regulating them to
00:28:30
death and he had a lot to say about the
00:28:33
the risk of overregulation
00:28:35
how often it's big companies that want
00:28:38
regulation he warned about regulatory
00:28:40
capture which our friend Bill Gurley
00:28:42
would like and he said that so basically
00:28:45
having less regulation can actually be
00:28:47
more fair can create a more level
00:28:48
playing field for small companies as
00:28:51
well as big companies and then he he
00:28:53
said to the Europeans that we want you
00:28:55
to be partners with us we want to lead
00:28:57
the world we want you to be our our
00:28:59
partners and benefit from this
00:29:01
technology that we're going to take the
00:29:03
lead in creating but you also have to be
00:29:06
a good partner to us and then he
00:29:08
specifically called out the
00:29:09
overregulation that Europeans have been
00:29:11
engaged in he mentioned the Digital
00:29:13
Services act which has acted as like a
00:29:16
speed trap for American companies it's
00:29:18
American companies who've been hit by this
00:29:23
overregulation because the truth of the
00:29:25
matter is that it's American technology
00:29:27
companies that are winning the race and
00:29:29
so when Europe passes these owner
00:29:31
regulations they fall most of all on
00:29:34
American companies and he's basically
00:29:36
saying we need you to rebalance and and
00:29:39
correct this because it's not fair and
00:29:40
it's not smart policy and it's not going
00:29:43
to help us collectively win this AI race
00:29:45
and that kind of brings me just to to
00:29:47
the last point is I don't think he
00:29:48
mentioned China by name but clearly he
00:29:51
talked about adversarial countries who
00:29:54
are using AI to control their
00:29:56
populations to engage in censorship and
00:29:58
thought control and he basically painted
00:30:00
a picture where it's like yeah you could
00:30:02
go work with them or you could work with
00:30:04
us and we we have hundreds of years of
00:30:06
shared history together we believe in
00:30:08
things like free speech hopefully and we
00:30:11
want you to work with us but if you are
00:30:13
going to work with us then you have to
00:30:15
cooperate and we have to create a
00:30:17
reasonable regulatory regime Nal did you
00:30:20
see the uh Speech and your thoughts just
00:30:22
generally on JD Vance and and having
00:30:25
somebody like this you know representing
00:30:27
us and wanting to win American exceptionalism
00:30:30
very surprising very impressive I
00:30:32
thought he was polite
00:30:34
optimistic and just very forward-looking
00:30:36
just it's what you would expect an
00:30:38
entrepreneur or a smart investor to say
00:30:41
so I was very impressed I think the idea
00:30:43
that America should win great I think
00:30:45
that we should not regulate I also agree
00:30:48
with I'm not an AI Doomer I don't think
00:30:49
AI is going to end the world that's a
00:30:51
separate conversation but there's this
00:30:53
religion that comes along in many faces
00:30:55
which is that oh climate change is going
00:30:57
to end the world a going to end the
00:30:58
world asteroids going to end the world
00:31:00
covid-19 is going to end the world and
00:31:02
it just has a way of fixating your
00:31:03
attention right it captures everybody's
00:31:05
attention at once so it's a very
00:31:06
seductive thing and I think in the case
00:31:08
of AI it's really been overplayed by
00:31:11
incentive bias you know motivated
00:31:13
reasoning by the companies were ahead
00:31:15
and they want to pull up the ladder
00:31:16
behind them I I think they genuinely
00:31:18
believe it I think they genuinely
00:31:19
believe that there's safety risk but I
00:31:20
think they're motivated to believe in
00:31:22
those safety risks and then they pass
00:31:23
that along but it's kind of a weird
00:31:25
position because they have to say oh
00:31:27
it's so dangerous that you shouldn't
00:31:29
just let open source go at it and you
00:31:31
should let just a few of us work with
00:31:33
you on it but it's not so dangerous that
00:31:36
a private company can't own the whole
00:31:37
thing right because if it was truly the
00:31:39
Manhattan Project if they were building
00:31:40
nuclear weapons you wouldn't want one
00:31:42
company to own that no Sam Altman famously
00:31:45
said that AI will capture the light cone
00:31:47
of all future value in other words like
00:31:49
all value ever created at the speed of
00:31:51
light from here will be captured by AI
00:31:54
so if that's true then I think
00:31:55
open-source AI really matters and little tech
00:31:58
AI really matters the problem is that
00:32:00
the nature of training these models is
00:32:02
highly centralized they benefit from
00:32:04
supercomputer clustered compute so it's
00:32:07
not clear how any decentralized model
00:32:09
can compete so to me the real issue
00:32:11
boils down to is how do you push AI
00:32:14
forward while not having just a very
00:32:16
small number of players control the
00:32:18
entire thing and we thought we had that
00:32:20
solution with the original open AI which
00:32:22
was a nonprofit was supposed to do for
00:32:23
Humanity but now because of they want to
00:32:26
incentivize the team and they want to
00:32:27
raise money they have to privatize at
00:32:29
least a part of it although it's not
00:32:30
clear to me why they need to privatize
00:32:32
the whole thing like why they need to
00:32:33
buy out the nonprofit portion you could
00:32:35
leave a nonprofit portion and you could
00:32:36
have the private portion for the
00:32:38
incentives but I think that the the real
00:32:41
challenge is how do you keep AI from
00:32:44
naturally centralizing because all the
00:32:46
economics and the technology underneath
00:32:49
are centralizing in nature if you really
00:32:51
think you're going to create God do you
00:32:53
want to put God on a leash with one
00:32:55
entity controlling God that to me is the
00:32:57
real fear not I'm not scared of AI I'm
00:33:00
scared of what a very small number of
00:33:02
people who control AI do to the rest of
00:33:04
us for our own good because that's how
00:33:06
it always works it's well said probably
00:33:07
should go with the Greek model having
00:33:09
many gods and uh Heroes as well freeberg
00:33:12
you heard the uh JD van speech I assume
00:33:15
what are your thoughts on overregulation
00:33:17
and uh maybe to Naval's point one person
00:33:21
owning this versus open source I think
00:33:24
that there's this kind of big redefinition
00:33:27
of the social balance right now on what I
00:33:29
would call techno optimism and techno
00:33:32
pessimism generally people sort of fall
00:33:35
into one of those two camps generally
00:33:37
speaking techno
00:33:39
optimists I would say are folks that
00:33:41
believe that accelerating outcomes with
00:33:43
AI with automation with bioengineering
00:33:46
manufacturing semiconductors Quantum
00:33:48
Computing nuclear energy Etc will usher
00:33:51
in this era of abundance by making by
00:33:54
creating leverage which is what
00:33:55
technology gives us
00:33:58
technology will make things cheaper and
00:34:00
it will be deflationary and it will give
00:34:02
everyone more so it creates abundance
00:34:05
the challenge is that people who already
00:34:07
have a lot worry more about the exposure
00:34:09
to the downside than they desire the
00:34:12
upside and so you know the
00:34:15
techno-pessimists are generally like the EU and
00:34:18
large Parts frankly of the United States
00:34:21
are worried about the loss of X the loss
00:34:24
of jobs the loss of this the loss of
00:34:26
that whereas countries like China and
00:34:29
India are more excited about the
00:34:32
opportunity to create wealth the
00:34:33
opportunity to create leverage the
00:34:35
opportunity to create abundance for
00:34:36
their people you know GDP per capita
00:34:39
in the EU is $60,000 a year GDP per capita
00:34:42
in the United States is like $82,000 but GDP
00:34:45
per capita in India is $2,500 and China is
00:34:48
$12,600 there's a greater incentive in
00:34:50
those countries uh to manifest upside
00:34:54
than there is for the United States and
00:34:56
the EU who are more worried about
00:34:58
manifesting downside and so it is a very
00:35:01
difficult kind of social battle that's
00:35:03
underway I do think like over time those
00:35:06
governments and those countries and
00:35:08
those social systems that Embrace these
00:35:10
Technologies are going to become more
00:35:12
capitalist and they're going to require
00:35:15
less government control and intervention
00:35:17
in job creation the economy payments to
00:35:19
people and so on and the countries that
00:35:21
are more techno-pessimistic are
00:35:23
unfortunately going to find themselves
00:35:25
asking for greater government control
00:35:27
government intervention in markets
00:35:29
governments creating jobs government
00:35:30
making payments to people governments
00:35:32
effectively running the economy my
00:35:34
personal view obviously is that I'm a
00:35:36
very strong advocate for technology
00:35:40
acceleration because I think in nearly
00:35:42
every case in human history when a new
00:35:43
technology has emerged we've largely
00:35:45
found ourselves assuming that the
00:35:48
technology works in the framework of
00:35:50
today or of yester year the automobile
00:35:53
came along and no one envisioned that
00:35:55
everyone in the United States would own
00:35:57
an automobile and therefore you would
00:35:58
need to create all of these new
00:35:59
Industries like mechanics and car
00:36:02
dealerships roads all the people
00:36:04
servicing building roads and all the
00:36:06
other industry that emerged and it's
00:36:09
very hard for us to sit here today and
00:36:10
say okay AI is going to destroy jobs
00:36:13
what's it going to create and be right I
00:36:15
think we're very likely going to be
00:36:16
wrong whatever estimations we we give
00:36:19
the area that I think is most
00:36:20
underestimated is large technical
00:36:22
projects that seem technically
00:36:24
infeasible today that AI can unlock for
00:36:26
example habitation in the oceans like
00:36:30
it's very difficult for us to Envision
00:36:32
like creating cities underwater and
00:36:34
creating cities in the oceans or
00:36:35
creating cities on the moon or creating
00:36:37
cities on mars or finding new places to
00:36:38
live those are like technically people
00:36:40
might argue oh that sounds stupid I
00:36:42
don't want to go do that but at the end
00:36:43
of the day like human civilization will
00:36:45
drive us to want to do that but those
00:36:47
technically are very hard to pull off
00:36:49
today but AI can unlock a new set of
00:36:51
Industries to enable those transitions
00:36:53
so I think we really get it wrong when
00:36:54
we try and assume the technology as a
00:36:57
transplant for last year or last century
00:37:00
and then we kind of become
00:37:01
techno-pessimists because we're worried about
00:37:02
losing what we have are you a
00:37:04
techno-pessimist or an optimist because you
00:37:06
uh bring up the downside an awful lot
00:37:08
here on the program but you are working
00:37:10
every day in a very optimistic way to
00:37:12
breed you know better strawberries and
00:37:15
potatoes for folks so you're a little
00:37:17
bit of a no I have no techno-pessimism
00:37:19
whatsoever I try and point out why the
00:37:21
other side is acting the way they are
00:37:22
got it okay putting it in full context
00:37:24
and what I'm trying to highlight is I
00:37:25
think that that framework is wrong I
00:37:27
think that that
00:37:28
framework
00:35:30
translates to the way they operate and the way they think
00:35:34
about things and it creates this you know
00:35:35
because of this manifestation of
00:37:37
worrying about downside it creates this
00:37:39
fear that that creates regulation like
00:37:40
we see in the EU and as a result China's
00:37:43
GDP will scale while the EU's will
00:37:45
stagnate if that's where they go that's
00:37:47
my assessment or my opinion on what will
00:37:49
happen Chamath you want to wrap this up
00:37:50
for us what are your thoughts on JD I'll
00:37:52
give you two okay the first is I would
00:37:55
say this is a really interesting
00:37:58
moment where I would call this the tale
00:38:00
of two vice presidents very early in the
00:38:02
Biden Administration Kamala was
00:38:05
dispatched on an equally important topic
00:38:08
at that time which was illegal
00:38:09
immigration and she went to Mexico and
00:38:12
Guatemala and so you actually have a
00:38:14
really interesting AB test here you have
00:38:18
both vice presidents dealing with what
00:38:21
were in that moment incredibly
00:38:23
important issues and I think that JD was
00:38:26
focused he was precise he was
00:38:31
ambitious and even the part of the press
00:38:35
that was very supportive of Kamala
00:38:37
couldn't find a lot of very positive
00:38:39
things to say about her and the feedback
00:38:41
was it was meandering she was ducking
00:38:44
questions she didn't answer the
00:38:46
questions that she was asked very well
00:38:49
and it's so interesting because it's a
00:38:50
bit of a microcosm then to what happened
00:38:52
over these next four years and her
00:38:54
campaign quite honestly which you could
00:38:55
have taken that window of that feedback
00:38:58
and unfortunately for her it just
00:39:01
continued to be very consistent so that
00:39:02
was one observation I had because I
00:39:05
heard him give the speech I heard her
00:39:07
and I had this kind of moment where I
00:39:09
was like wow two totally different
00:39:12
people the second is on the substance of
00:39:14
what JD said I said this on Tucker and
00:39:17
I'll just simplify all of this into a
00:39:19
very basic framework which
00:39:21
is if you want a country to thrive it
00:39:24
needs to have economic Supremacy
00:39:28
and it needs to have military
00:39:30
Supremacy in the absence of those two
00:39:32
things societies
00:39:34
crumble and the only thing that
00:39:36
underpins those two things is
00:39:38
technological Supremacy and we see this
00:39:40
today so on Thursday what happened with
00:39:43
Microsoft they had a $24 billion
00:39:46
contract with the United States Army to
00:39:48
deliver some whizbang thing and they
00:39:52
realized that they couldn't deliver it
00:39:54
and so what did they do they went to
00:39:56
andril now why did they go to andril
00:39:58
because andril has the technological
00:40:00
Supremacy to actually execute a few
00:40:03
weeks ago we saw some attempts at
00:40:05
technological Supremacy from the Chinese
00:40:07
with respect to DeepSeek so I think
00:40:09
that this is a very simple existential
00:40:12
battle those who can harness and
00:40:16
govern the things that are
00:40:17
technologically Superior will
00:40:20
win and it will drive economic vibrancy
00:40:24
and Military Supremacy which then
00:40:26
creates safe strong societies that's it
00:40:29
so from that perspective JD nailed it he
00:40:33
saw the forest for the trees he said
00:40:35
exactly what I think needed to be said
00:40:38
and put folks on notice that you're
00:40:39
either on the ship or you're off the
00:40:41
ship and I think that that was really
00:40:42
good yeah and there was like um a little
00:40:46
secondary conversation that emerged
00:40:48
Sachs that I I would love to engage you
00:40:50
with if you're willing which is this
00:40:54
Civil War quote unquote between maybe
00:40:57
MAGA 1.0 and MAGA 2.0 the techies in the
00:41:01
MAGA party like
00:41:02
ourselves and maybe the core MAGA folks
00:41:07
we can pull up the Tweet here in JD's
00:41:09
own word and he's been engaging people
00:41:12
in his own words it's very clear that
00:41:14
he's writing these tweets a distinct
00:41:16
difference between other politicians and
00:41:18
this Administration and they just tell
00:41:20
you what they think here it is I'll try
00:41:23
and write something to address this in
00:41:25
detail this is JD Vance's tweet but I
00:41:27
think this Civil War is overstated
00:41:30
though yes there are some real
00:41:31
divergences between the populists I would
00:41:33
describe that as MAGA and the techies
00:41:36
but briefly in general I dislike
00:41:37
substituting American labor for cheap
00:41:39
labor my views on immigration and
00:41:41
offshoring flow from this I like growth
00:41:43
and productivity gains and this informs
00:41:46
my view on Tech and regulation when it
00:41:48
comes to AI specifically the risks are
00:41:51
number one overstated to your point Naval
00:42:54
or two difficult to avoid one of my only
00:41:57
real concerns for instance is about
00:41:59
Consumer Fraud that's a valid reason to
00:42:02
worry about safety but the other problem
00:42:04
is much worse if a peer nation is 6
00:42:06
months ahead of the US on AI again I'll
00:42:09
try and say more and this is JD going
00:42:11
right at I think one of the more
00:42:13
controversial topic
00:42:15
sacks that the administration is dealing
00:42:18
with and has dealt with when it comes to
00:42:20
immigration and Tech because these two
00:42:22
things are dovetailing each other if we
00:42:24
lose millions of driver jobs which we
00:42:27
will in the next 10 years just like we
00:42:29
lost millions of cashier jobs well
00:42:32
that's going to impact how our nation
00:42:34
and many of the voters look at the
00:42:37
border and immigration we might not be
00:42:39
able to let as many people immigrate
00:42:41
here if we're losing millions of jobs to
00:42:44
Ai and self-driving cars what are your
00:42:46
thoughts on him engaging this directly
00:42:48
Sachs well the first point he's making
00:42:50
there is about wage pressure right which
00:42:53
is when you throw open our Borders or
00:42:55
you throw open American markets products
00:42:57
that can be made in foreign countries by
00:43:00
much cheaper labor that's not held to
00:43:02
the same standards the same minimum wage
00:43:04
or the same Union rules or the same
00:43:06
safety standards that American labor is
00:43:08
and has a huge cost Advantage then
00:43:09
you're creating wage pressure for
00:43:11
American workers and he's opposed to
00:43:13
that and I think that is an important
00:43:15
point because I think the way that the
00:43:17
media or neoliberals like to portray
00:43:20
this argument is that somehow MAGA's
00:43:23
resistance to unlimited immigration is
00:43:25
somehow based on xenophobia or something
00:43:27
like like that no it's based on bread
00:43:29
and butter kitchen table issues which is
00:43:32
if you have this ridiculous open border
00:43:35
policy it's inevitably going to create a
00:43:37
lot of wage pressure for people at the
00:43:39
bottom of the pyramid so I think JD is
00:43:41
making that argument but and this is
00:43:44
point two he's saying I'm not against
00:43:46
productivity growth so technology is
00:43:48
good because it enables all of our
00:43:50
workers to improve their productivity
00:43:52
and that should result in better wages
00:43:55
because workers can produce more the
00:43:57
value of their labor goes up if they
00:43:59
have more tools to be productive so
00:44:01
there's no contradiction there and I
00:44:02
think he's explaining why there isn't a
00:44:04
contradiction a point I would add he
00:44:07
doesn't make this point in that tweet
00:44:08
but I would add is that one of the
00:44:11
problems that we've had over the last I
00:44:12
don't know 30 years is that we have had
00:44:15
tremendous productivity growth in the US
00:44:16
but labor has not been able to capture
00:44:18
it all that benefit has basically gone
00:44:20
to Capital or to companies and I think a
00:44:23
big part of the reason why is because
00:44:25
we've had this largely unrestricted
00:44:27
immigration policy so I think if you
00:44:29
were to Tamp down on immigration if you
00:44:32
were to stop the illegal immigration
00:44:34
then labor might be able to capture more
00:44:36
of the benefits of productivity growth
00:44:38
and that would be a good thing it'd be a
00:44:39
more equitable distribution of the gains
00:44:43
from productivity and from technology
00:44:46
and that I think would help Tamp down
00:44:50
this growing conflict that you see
00:44:52
between technologists and the rest of
00:44:55
the country or certainly the heartland
00:44:56
of the country Naval this is a okay you
00:44:59
want to add anything else David sorry
00:45:00
well I think just the final point he
00:45:02
makes in that tweet is that he talks
00:45:04
about how we live in a world in which
00:45:06
there are other countries that are
00:45:07
competitive and specifically he he
00:45:09
doesn't mention China but he says we
00:45:10
have a peer competitor and it's going to
00:45:13
be a much worse world if they end up
00:45:16
being six months ahead of us on AI
00:45:18
rather than six months behind that is a
00:45:19
really important point to keep in mind I
00:45:21
think that the whole Paris AI Summit
00:45:24
took place against the backdrop of this
00:45:26
recognition because just a few weeks ago
00:45:28
we had DeepSeek and it's really clear
00:45:30
that China is not a year behind us
00:45:32
they're hot on our heels or only maybe
00:45:35
months behind us and so if we hobble
00:45:37
ourselves with unnecessary regulations
00:45:39
if we make it more difficult for our AI
00:45:42
companies to compete that doesn't mean
00:45:44
that China's going to follow suit and
00:45:45
copy us they're going to take advantage
00:45:46
of that fact and they're going to win
00:45:48
all right Naval this seems to be one of
00:45:49
the main issues of our time four of the
00:45:52
five people on this podcast right now
00:45:54
are immigrants so we have this amazing
00:45:57
in America this is a country built by
00:45:59
immigrants for immigrants do you think
00:46:02
that should change now in the face of
00:46:04
job destruction which I know you've been
00:46:06
tracking self-driving pretty acutely we
00:46:10
both have an interest there I think over
00:46:12
the years you know what's the solution
00:46:16
here if we're going to see a bunch of
00:46:19
job displacement which will happen for
00:46:22
certain jobs we all we all kind of know
00:46:23
that should we shut the border and not
00:46:25
let the next Naval Chamath Sachs and
00:46:27
Freeberg into the country well let me
00:46:28
let me declare my biases up front I'm a
00:46:30
first generation immigrant I moved here
00:46:32
when I was 9 years old rather my parents
00:46:34
did and then I became a naturalized
00:46:36
citizen so obviously I'm in favor of
00:46:39
some level of immigration that said I'm
00:46:40
assimilated I consider myself an
00:46:42
American first and
00:46:43
foremost I bleed red white and blue I
00:46:47
believe in the Bill of Rights and the
00:46:49
Constitution first and second and fourth
00:46:52
and all the proper amendments I get up
00:46:55
there every July 4th and I deliberately
00:46:56
defend the Second Amendment on Twitter
00:46:58
at which point half my followers go
00:47:01
bananas you know because I'm
00:47:03
not supposed to I'm supposed to be a
00:47:04
good immigrant right and and carry the
00:47:06
usual set of coherent leftist policies
00:47:10
globalist policies so I think that legal
00:47:14
high-skill immigration with room and
00:47:17
time for assimilation makes sense you
00:47:20
want to have a brain drain on the best
00:47:21
and brightest coming to the freest
00:47:22
country in the world to build technology
00:47:26
and to help civilization move forward
00:47:29
and you know as chth was saying economic
00:47:32
power and military power is Downstream
00:47:34
of technology in fact even culture is
00:47:37
Downstream of Technology look at what
00:47:39
the birth control pill did for example
00:47:40
to culture or what the automobile did to
00:47:42
culture or what radio and television did
00:47:44
to culture and then the internet so
00:47:46
technology drives everything and if you
00:47:48
look at wealth wealth is a set of
00:47:50
physical Transformations that you can
00:47:51
affect and that's a combination of
00:47:53
capital and knowledge and the bigger
00:47:55
input to that is knowledge and so the US
00:47:58
has become the home of knowledge
00:47:59
creation thanks to bringing in the best
00:48:01
and brightest you could even argue
00:48:03
DeepSeek part of the reason why we lost that
00:48:04
is because a bunch of those kids they
00:48:06
studied in the US but then we sent them
00:48:08
back home right so I think you
00:48:09
absolutely is that actually accurate
00:48:11
that they were uh yeah yeah yeah some a
00:48:13
few of them really oh my God that's like
00:48:15
exhibit a wow I didn't know that so I
00:48:17
think you absolutely have to split
00:48:20
skilled assimilated immigration which is
00:48:23
a small set and it has to be both they
00:48:25
have to both be skilled and they have to
00:48:26
become Americans that oath is not
00:48:28
meaningless right it has to mean
00:48:30
something so skilled assimilated
00:48:32
immigration you have to separate that
00:48:33
from just open borders whoever can
00:48:35
wander in just come on in that that
00:48:37
latter part makes no sense if the Biden
00:48:39
administration had only been letting in
00:48:41
people with 150 IQs we wouldn't have
00:48:43
this debate right now the reason why
00:48:45
we're having this debate is because they
00:48:48
just opened the border and let millions
00:48:49
and millions of people in it was to
00:48:51
their advantage to conflate legal and
00:48:53
illegal immigration so every time You'
00:48:55
be like well we can't just open the
00:48:56
Border say what about Elon what about
00:48:59
this and they would just parade if they
00:49:00
were just letting in the Elons and the Jensens
00:49:04
and Freebergs we wouldn't be having the
00:49:06
same conversation today the correlation
00:49:08
between open borders and wage
00:49:11
suppression is irrefutable we know that
00:49:13
data and I think that the
00:49:16
Democrats for whatever logic
00:49:19
committed an incredible error in
00:49:23
basically undermining their core cohort
00:49:25
I want to go back to what you said
00:49:27
because I think it's super important
00:49:28
there is a new political calculus on the
00:49:32
field and I agree with you I think that
00:49:35
the the three cohorts of the future are
00:49:39
the asset light working in middle class
00:49:42
that's cohort number one there are
00:49:45
probably 100 to 150 million of those
00:49:48
folks then there are patriotic business
00:49:51
owners and then there's leaders in
00:49:54
Innovation those are the three and I
00:49:57
think that what Maga gets right is they
00:50:00
found the middle ground that intersects
00:50:02
those three cohorts of people and so
00:50:04
every time you see this sort of left
00:50:07
versus right dichotomy it's totally
00:50:09
miscast and it sounds discordant to so
00:50:12
many of us because that's not how any of
00:50:13
us identify right and I think that
00:50:16
that's a very important observation
00:50:17
because the policies that we adopt will
00:50:19
need to reflect those three cohorts what
00:50:21
is the common ground amongst those three
00:50:24
and on that point Naval is right there's
00:50:26
not a lot that those three would say is
00:50:28
wrong with a very targeted form of
00:50:31
extremely useful legal immigration of
00:50:34
very very very smart people who agree to
00:50:37
assimilate and be a part of America I
00:50:39
mean I'm so glad you said it the way you
00:50:41
said it like I remember growing up where
00:50:44
my parents would try to pretend that
00:50:46
they were in Sri Lanka and sometimes I
00:50:48
would get so frustrated I'm like if you
00:50:49
want to be in Sri Lanka go back to Sri
00:50:53
Lanka I want to be Canadian because it
00:50:56
was easier for me to make friends it was
00:50:58
easier for me to have a life I was
00:50:59
trying my best I wanted to be Canadian
00:51:01
and then when I moved to the United
00:51:03
States 25 years ago I wanted to be
00:51:06
American and I feel that I'm American
00:51:09
now and I'm proud to be an American and
00:51:11
I think that's what you want you want
00:51:13
people that embrace it doesn't mean that
00:51:14
we can't dress up and show our culture every
00:51:16
now and then but the point is like what
00:51:19
do you believe and where is your loyalty
00:51:22
freeberg we used to have this concept of
00:51:24
a melting pot of assimilation
00:51:28
and that was a good thing then it became
00:51:29
cultural appropriation we we kind of
00:51:31
made a right turn here where where do
00:51:32
you stand on this recruiting the best
00:51:35
and brightest and forcing them to
00:51:38
assimilate making sure that they're down
00:51:40
Jason find the people that care to be
00:51:42
here yeah let me reframe that uh I reject
00:51:46
the premise this whole
00:51:47
conversation wait wait hold on look I'm
00:51:50
look I'm a first generation American who
00:51:53
moved here when I was five and became a
00:51:55
citizen when I was 10 and yes I'm fully
00:51:57
American and that's the only country I
00:51:59
have any loyalty to but I the the
00:52:01
premise that I reject here is that
00:52:04
somehow an AI conversation leads to an
00:52:06
immigration conversation because
00:52:08
millions of jobs are going to be lost we
00:52:09
don't know that that's also true I agree
00:52:11
you're making a huge assumption buying
00:52:14
into the
00:52:16
doomerism of jobs that is not fact those are
00:52:18
not evidence I think have any jobs even
00:52:22
been lost by AI let's be real
00:52:24
we've had AI for two and a half years
00:52:25
and I think it's great but so far it's a
00:52:27
better search engine and it helps high
00:52:30
school kids cheat on their essays I mean
00:52:32
you don't believe that self driving is
00:52:35
coming hold on a second Sacks you don't
00:52:37
believe that Millions but but hold on
00:52:39
those those driver jobs weren't even
00:52:41
there 10 years ago Uber came along and
00:52:42
created all these driver jobs door Dash
00:52:44
created all these driver jobs so what
00:52:46
technology does yes technology destroys
00:52:49
jobs but it replaces them with
00:52:50
opportunities that are even better and
00:52:52
then either you can go capture that
00:52:54
opportunity yourself or an entrepreneur
00:52:56
will come along and create something
00:52:57
that allows you to capture those
00:52:59
opportunities AI is a productivity tool
00:53:02
it increases the productivity of a
00:53:03
worker it allows them to do more creative
00:53:05
work and less repetitive work as such it
00:53:07
makes them more valuable yes there is
00:53:09
some retraining involved but not a lot
00:53:11
these are natural language computers you
00:53:12
can talk to them in plain English and
00:53:13
they talk back to you in plain English
00:53:15
but I think David is absolutely right I
00:53:17
think we will see job creation by AI
00:53:20
that will be as fast or faster than job
00:53:22
destruction you saw this even with the
00:53:24
internet like YouTube came along look at
00:53:25
all these YouTube streamers and
00:53:26
influencers that didn't used to be a job
00:53:29
new jobs really opportunities cuz job is
00:53:32
the wrong word job implies someone else
00:53:33
has to give it to me and sort of like
00:53:35
they're handed out it's a zero-sum game
00:53:38
forget all that it's opportunities after
00:53:41
covid look at how many people are making
00:53:43
money by working from home in mysterious
00:53:46
little ways on the internet that you
00:53:47
can't even quite grasp here's the way I
00:53:50
categorize it okay is that whenever you
00:53:52
have a new technology you get
00:53:53
productivity gains you get some job
00:53:56
disruption meaning that part of your job
00:53:58
may go away but then you get other parts
00:54:00
that are new and hopefully more elevated
00:54:03
you know more interesting and then there
00:54:05
is some job loss I just think that the
00:54:07
third category will follow the
00:54:09
historical Trend which is that the first
00:54:11
two categories are always bigger and you
00:54:14
end up with more net productivity and
00:54:16
more net wealth creation and we've seen
00:54:18
no evidence to date that that's not
00:54:19
going to be the case now it's true that
00:54:21
AI is about to get more powerful you're
00:54:22
going to see a whole new wave of what
00:54:24
are called agents this year agentic
00:54:26
products are able to do more for you but
00:54:29
there's no evidence yet that those
00:54:31
things are going to be completely
00:54:32
unsupervised and replace people's jobs
00:54:34
so you know I think that we have to see
00:54:36
how this technology evolves and I think
00:54:38
one of the mistakes of let's call it the
00:54:41
European approach is assuming that you
00:54:43
can predict the future with perfect
00:54:45
accuracy with such good accuracy that
00:54:48
you can create regulations today that
00:54:50
are going to avoid all these risks in
00:54:52
the future and we just don't know enough
00:54:54
yet to be able to do that at that level
00:54:56
of certainty I agree with you and the
00:54:59
companies that are promulgating that
00:55:02
view is what Naval said those that have
00:55:04
an economic vested interest in at least
00:55:07
convincing the next incremental investor
00:55:09
that this could be true because they
00:55:11
want to make the claim that all the
00:55:12
money should go to them so they could
00:55:13
Hoover up all the economic gains and
00:55:16
that's that is the part of the cycle
00:55:17
we're in so if you if you actually
00:55:19
stratify these reactions there's the
00:55:22
small startup companies in AI that
00:55:24
believe there's a productivity leap to
00:55:25
be had and that there's going to be
00:55:28
Prosperity everybody on the sidelines
00:55:30
watching and then a few companies that
00:55:32
have an extremely vested interest in
00:55:34
them being a gatekeeper because they
00:55:36
need to raise the next 30 or 40 billion
00:55:38
dollars trying to convince people that
00:55:39
that's true and if you view it through
00:55:41
that lens you're right Sacks we have not
00:55:43
accomplished anything yet that proves
00:55:45
that this is going to be cataclysmically
00:55:47
bad and if anything right now history
00:55:49
would tell you it's probably going to be
00:55:51
like the past which is generally
00:55:53
productive and accretive to society yeah and
00:55:55
just to bring it back back to JD's
00:55:57
speech which is where we started I think
00:55:59
it was a quintessentially American
00:56:02
speech in the sense that he said we
00:56:04
should be optimistic about the
00:56:06
opportunities here which I think is
00:56:08
basically right and we want to lead we
00:56:11
want to take advantage of this we don't
00:56:13
want to hobble it we don't even fully
00:56:16
know what it's going to be yet we are
00:56:19
going to Center workers we want to be
00:56:21
Pro worker and I think that if there are
00:56:24
downsides for workers then we can
00:56:25
mitigate those things in the future but
00:56:27
it's too early to say that we know what
00:56:29
the program should be it's more about a
00:56:31
statement of values at this point do you
00:56:33
think it's too early freeberg given
00:56:36
Optimus and all these robots being
00:56:38
created what we're seeing in
00:56:40
self-driving you've talked about the
00:56:41
ramp up with
00:56:43
Waymo to actually say we will not see
00:56:46
millions of
00:56:48
jobs and millions of people get
00:56:50
displaced from those jobs what do you
00:56:51
think freeberg I'm curious your thoughts
00:56:53
because that is the
00:56:54
counterargument my experience in
00:56:56
the workplace is that AI
00:56:59
tools that are doing things that an
00:57:02
analyst or knowledge worker was doing
00:57:05
over many hours in the past are allowing
00:57:07
them to do something in minutes that
00:57:09
doesn't mean that they spend the rest of
00:57:11
the day doing nothing what's great for
00:57:13
our business and for other businesses
00:57:15
like ours that can leverage AI tools is
00:57:19
that those individuals can now do more
00:57:21
and so our throughput our productivity
00:57:23
as an organization has gone up and we
00:57:26
can now create more things faster so
00:57:28
whatever the product is that my company
00:57:30
makes we can now make more things more
00:57:32
quickly we can do more development
00:57:34
you're seeing it on the ground correct
00:57:35
at Ohalo and I'm seeing it on the
00:57:37
ground and I don't think that this like
00:57:39
extrapolation of how bad AI will be
00:57:42
for jobs is the right framing as much as
00:57:45
it is about an acceleration of
00:57:46
productivity and this is why I go back
00:57:48
to the point about GDP per capita and
00:57:50
GDP growth countries societies areas
00:57:54
that are interested or industries that
00:57:55
are interested in accelerating output in
00:57:58
accelerating productivity the ability to
00:58:00
make stuff and sell stuff are going to
00:58:03
rapidly Embrace these tools because it
00:58:04
allows them to do more with less and I
00:58:07
think that's what I really see on the
00:58:08
ground and then the second point I'll
00:58:10
make is the one that I mentioned earlier
00:58:11
and I'll wrap up with a third point
00:58:13
which is I think we're underestimating
00:58:15
the new industries that will emerge
00:58:17
drastically dramatically there is going
00:58:19
to be so much new stuff that we are not
00:58:21
really thinking deeply about right now
00:58:24
that we could do a whole other two-hour
00:58:26
brainstorming session on on what AI
00:58:28
unlocks in terms of large scale projects
00:58:32
that are traditionally or typically are
00:58:34
today held back because of the
00:58:36
constraints on the technical feasibility
00:58:38
of these projects and that ranges from
00:58:41
accelerating to new semiconductor
00:58:43
technology to Quantum Computing to
00:58:45
Energy Systems to transportation to
00:58:48
habitation etc etc there's all sorts of
00:58:50
transformations in every industry that's
00:58:52
possible as these tools come online and
00:58:54
that will spawn insane new industries
00:58:56
the most important point is the third
00:58:58
one which is we don't know the overlap
00:59:00
of job loss and job creation if there is
00:59:03
one and so the rate at which these new
00:59:05
technologies impact and create new
00:59:07
markets but I think Naval is right I
00:59:08
think that what happens in capitalism
00:59:10
and in free societies is that capital
00:59:13
and people rush to fill the hole of new
00:59:15
opportunities that emerge because of AI
00:59:18
and that those grow more quickly than
00:59:19
the old bubbles deflate so if there's a
00:59:22
deflationary effect in terms of job need
00:59:24
in other Industries I think the loss
00:59:26
will happen slower than the rush to take
00:59:28
advantage of creating new things will
00:59:30
happen on the other side so my bet is
00:59:32
probably on the order of I think new
00:59:34
things will be created faster than old
00:59:36
things will be lost I think actually as
00:59:39
a quick side note to that the fastest
00:59:41
way to help somebody get a job right now
00:59:43
if you know somebody in the market who's
00:59:44
looking for a job the best thing you can
00:59:47
do is say hey go download the AI tools
00:59:49
and just start talking to them just
00:59:51
start using them in any way and then you
00:59:52
can walk into any employer in almost any
00:59:55
field and say hey I understand Ai and
00:59:57
they'll hire you on the spot exactly Naval you
00:59:59
and I watched this happen we had a front
01:00:02
row seat to it back in the day when you
01:00:05
were doing Venture hacks and I was doing
01:00:07
open Angel Forum we had to like fight to
01:00:09
find five or 10 companies a month then
01:00:11
the cost of running these companies went
01:00:13
down they went down massively from five
01:00:15
million to start a company to two then
01:00:17
to 250k then to 100K I think what we're
01:00:21
seeing is like three things concurrently
01:00:23
you're going to see all these jobs go
01:00:24
away for automation self-driving cars
01:00:27
cashier Etc but we're going to also see
01:00:30
static team size at places like Google
01:00:32
they're just not hiring because they're
01:00:33
just having the existing bloated
01:00:34
employee base learn the tools but I
01:00:37
don't know if you're seeing this the
01:00:39
number of startups able to get a product
01:00:41
to Market with two or three people and
01:00:43
get to a million in revenue is booming
01:00:45
what are you seeing in the startup
01:00:47
landscape definitely seeing what you're saying
01:00:49
in that there's leverage but at the same
01:00:51
time the I think the more interesting
01:00:53
part is that new startups are enabled
01:00:55
that could not exist otherwise uh my
01:00:58
last startup Airchat could not have
01:00:59
existed without AI because we needed a
01:01:01
transcription and translation even the
01:01:03
current thing I'm working on it's not an
01:01:04
AI company but it cannot exist without
01:01:06
AI it is relying on AI even at AngelList
01:01:09
we're significantly adopting AI like
01:01:11
everywhere you turn it's more
01:01:13
opportunity more opportunity more
01:01:14
opportunity and people like to go on uh
01:01:18
Twitter or the artist formerly known as
01:01:20
Twitter and and and basically they like
01:01:22
to exaggerate like oh my God we've hit
01:01:25
AGI oh my God I just replaced all
01:01:27
my mid-level engineers oh my god I've
01:01:28
stopped hiring to me that's like moronic
01:01:31
the two valid ones are the one-person
01:01:33
entrepreneur shows where there's like
01:01:35
one guy or one gal and they're like
01:01:36
scaling up like crazy thanks to AI or
01:01:39
there are people who are embracing Ai
01:01:41
and being like I need to hire and I need
01:01:42
to hire anyone who can even spell AI
01:01:45
like anyone who's even used AI just come
01:01:47
on in come on in again I would say the
01:01:50
easiest way to see that AI is not taking
01:01:52
jobs but creating opportunities is go
01:01:54
brush up on your skills learn a little bit
01:01:57
watch a few videos use the AI Tinker
01:01:59
with it and then go reapply for that job
01:02:01
that rejected you and watch how they
01:02:03
pull you in in 2023 an economist named
01:02:05
Richard Baldwin said AI won't take your
01:02:07
job it's someone using AI that will take
01:02:10
your job because they know how to use it
01:02:11
better than you and that's kind of
01:02:13
become a meme and you see it floating
01:02:14
around X but I think there's a lot of
01:02:16
Truth in that you know as long as you
01:02:18
remain adaptive and you keep learning
01:02:20
and you learn how to take advantage of
01:02:22
these tools you should do better and if
01:02:25
you wall yourself off from the
01:02:26
technology and don't take advantage of
01:02:28
it that's when you put yourself at risk
01:02:30
another way to think about it is these
01:02:31
are natural language computers so
01:02:33
everyone who's intimidated by computers
01:02:35
before should no longer be intimidated
01:02:37
you don't need to program anymore in
01:02:39
some esoteric language or learn some
01:02:41
obscure mathematics to be able to use
01:02:43
these you can just talk to them and they
01:02:44
talk back to you that's magic the new
01:02:47
programming language is English Chamath
01:02:48
you want to wrap us up here on this
01:02:50
opportunity slash displacement slash
01:02:53
chaos I was gonna say this before but I
01:02:56
I'm pretty unconvinced anymore that
01:03:00
you should bother even learning many of
01:03:04
the hard sciences and maths that we used
01:03:07
to as underpinnings like I used to
01:03:09
believe that the right thing to do was
01:03:11
for everybody to go into
01:03:12
engineering I'm not necessarily as
01:03:15
convinced as I used to be because I used
01:03:16
to say well that's great first
01:03:18
principles thinking etc etc and you're
01:03:20
going to get trained in a toolkit that
01:03:21
will scale and I'm not sure that that's
01:03:24
true I think like you can you can use
01:03:26
these agents and you can use deep
01:03:28
research and all of a sudden they
01:03:30
replace a lot of that skill so what's
01:03:31
left over it's creativity it's judgment
01:03:34
it's history it's psychology it's all of
01:03:37
these other sort of like softer skills
01:03:39
leadership communication that allow you
01:03:41
to manipulate these models in
01:03:42
constructive ways because when you think
01:03:44
of like the prompt engineering that gets
01:03:45
you to Great answers it's actually just
01:03:47
thinking in totally different orthogonal
01:03:49
ways and
01:03:50
nonlinearly so that's my last thought
01:03:52
which is it does open up the aperture
01:03:54
meaning for every smart mathematical
01:03:56
genius there's many many many other
01:03:58
people who have high EQ and all of a
01:04:02
sudden this tool actually takes the
01:04:04
skill away from the person with just a
01:04:07
high IQ and says if you have these other
01:04:09
skills now you can compete with me
01:04:11
equally and I think that that's
01:04:13
liberating for a lot of people I'm in
01:04:15
the camp of more opportunity you know I
01:04:17
I got to watch the movie industry a
01:04:19
whole bunch when the digital cameras
01:04:21
came out and more people started making
01:04:23
documentaries more people started making
01:04:25
independent film shorts and then of
01:04:27
course the Youtube Revolution people
01:04:28
started making videos on YouTube or
01:04:31
podcasts like this and if you look at
01:04:33
what happened with like the special
01:04:34
effects industry as well we need far
01:04:38
fewer people to make a Star Wars movie
01:04:41
to make a Star Wars series to make a
01:04:42
Marvel series as we've seen now we can
01:04:45
get the Mandalorian and Ahsoka and all
01:04:47
these other series with smaller numbers
01:04:49
of people and they look better than
01:04:52
obviously the original Star Wars series
01:04:53
or even the prequels so there's there's
01:04:55
going to be so many more opportunities
01:04:57
we're now making more TV shows more
01:04:59
series everything we wanted to see of
01:05:01
every little character that's the same
01:05:03
thing that's happening in startups I
01:05:04
can't believe that there is an app now
01:05:07
Naval called Slopes just for uh skiing
01:05:11
and there are 20 really good apps for
01:05:15
just meditation and there are 10 really
01:05:17
good ones just for fasting like we're
01:05:18
going down this long tail of
01:05:20
opportunity and there'll be plenty of
01:05:21
million to $10 million businesses for us
01:05:25
you know people learn to use these tools
01:05:27
I love how that's the thing that tips
01:05:29
you over which
01:05:32
one you get an extra Marvel movie or an
01:05:35
extra Star Wars show so that tips you
01:05:37
over I think for a lot of people it's it
01:05:40
feels great that AI may take over the
01:05:42
world but I'm gonna get next to Star
01:05:44
Wars movie so I'm I'm cool yeah I mean
01:05:46
it's are you not entertained you want a
01:05:48
final point on this is look I mean given
01:05:50
the choice between the two categories of
01:05:52
techno-optimist and techno-pessimist I'm
01:05:54
definitely in The Optimist camp and I I
01:05:55
think and I think we should be but I I
01:05:57
think there's actually a third category
01:05:59
that I would submit which is techno
01:06:02
realist which is technology is going to
01:06:06
happen trying to stop it is like
01:06:07
ordering the tides to stop if we don't
01:06:10
do it somebody else will China's going
01:06:12
to do it or somebody else will do it and
01:06:14
it's better for us to be in control of
01:06:17
the technology to be the leader rather
01:06:19
than passively waiting for it to happen
01:06:21
to us and I just think that's always
01:06:24
true it's better for businesses to be
01:06:27
proactive and take the lead disrupt
01:06:29
themselves instead of waiting for
01:06:29
someone else to do it and I think it's
01:06:31
better for countries and I think you did
01:06:33
see this theme a little bit I mean these
01:06:35
are my own views I don't want to ascribe
01:06:38
them to the vice president but you did
01:06:39
see I think a hint of the Techno realism
01:06:43
idea in his speech and in his tweet
01:06:45
which is look AI is going to happen we
01:06:48
might as well be the leader if we don't
01:06:51
we could lose in a key category that has
01:06:55
implications for National Security for
01:06:57
our economy for many things so that's
01:06:59
just not a world we want to live in so I
01:07:01
think a lot of this debate is sort of
01:07:03
academic because whether you're an
01:07:06
optimist or pessimist is sort of glass
01:07:08
half empty half full the question is
01:07:10
just is it going to happen or not and I
01:07:12
think the answer is yes so then we want
01:07:14
to control it this just you know let's
01:07:16
just boil it down there's not a
01:07:17
tremendous amount of choice in this I
01:07:19
think I would agree heavily with one
01:07:20
point and I would just tweak another the
01:07:23
point I would agree with is that it's
01:07:25
going to happen anyway and that's what
01:07:26
DeepSeek proved you can turn off the
01:07:28
flow of chips to them and you can turn
01:07:30
off the flow of talent what do they do
01:07:31
they just get more efficient and they
01:07:33
exported it back to us they sent us back
01:07:35
the best open source model when our guys
01:07:37
were staying closed source for safety
01:07:39
reasons and exactly the
01:07:43
safety of their equity DeepSeek
01:07:45
exploded the fallacy that the US has a
01:07:48
monopoly in this category and that
01:07:50
somehow therefore we can slow down the
01:07:52
train and that we have total control
01:07:55
over the train and I think what
01:07:56
DeepSeek showed is no if we slow down the
01:07:57
train they're just going to win yeah the
01:07:59
part the part where I'd try to tweak a
01:08:01
little bit is the idea that we are going
01:08:03
to win by we when you say America the
01:08:06
problem is that the best way to win is
01:08:08
to be as open as distributed as
01:08:10
Innovative as possible if this all ends
01:08:12
up in the control of one company they're
01:08:14
actually going to be slower to innovate
01:08:15
than if there's a dynamic system and
01:08:18
that dynamic system by its nature will
01:08:19
be open it will leak to China it will
01:08:22
leak to India but these things have
01:08:24
powerful Network effects we know this
01:08:25
about technology almost all Technologies
01:08:27
has Network effects underneath and so
01:08:30
even if you are open you're still going
01:08:32
to win and you're still going to control
01:08:34
you look at the internet that was all
01:08:35
true for the internet right the
01:08:36
internet's an open Technologies based on
01:08:38
tons of open source who are the dominant
01:08:40
companies all the dominant companies are
01:08:42
us companies because they were in the
01:08:43
lead exactly right exactly right we we
01:08:45
embraced the open internet we embraced the
01:08:47
open internet right that was different
01:08:48
yeah so there will be benefits for all
01:08:49
of humanity and I think the vice
01:08:51
president's speech was really clear that
01:08:53
look we want you guys to be on board we
01:08:54
want to be good
01:08:55
Partners however there are definitely
01:08:57
going to be winners economically
01:09:00
militarily and in order to be one of
01:09:02
those winners you have to be a leader
01:09:04
who's going to get to AGI first Naval is
01:09:06
it going to be an open source who's
01:09:08
going to win is it going to be open
01:09:10
source or closed Source who's going to
01:09:11
win the day if we're sitting here 5 10
01:09:13
years from now and we're looking at the
01:09:15
top three language models I'll get in a lot of trouble
01:09:17
for this but I don't think we know how
01:09:18
to build AGI but that's that's a much
01:09:20
longer discussion who's going to have the
01:09:22
best model five years from hold on I I
01:09:24
100% agree with you I just think it's a
01:09:26
different thing but what we're building
01:09:27
are these incredible natural language
01:09:29
computers and actually David in a very
01:09:32
pithy way summarized the two big use
01:09:33
cases it's search and it's homework it's
01:09:36
paperwork uh it's it's really paperwork
01:09:38
and a lot of these jobs that we're
01:09:40
talking about disappearing are actually
01:09:42
paperwork jobs they're paperwork
01:09:43
shuffling these are made-up jobs like the
01:09:45
federal government as we're finding out
01:09:46
through DOGE you know a third of it is
01:09:48
like people digging holes with spoons and
01:09:50
then another third are filling them back
01:09:51
up they're filling up paperwork and then
01:09:52
burying it in a mine shaft they're burying a mine
01:09:54
shaft mountain yeah so I I I think a lot
01:09:57
of these made-up jobs they got to go down
01:09:59
the mine shaft to get the paperwork when
01:10:00
someone retires and bring it up you know
01:10:01
what I'm going to get them some thumb
01:10:03
drives we can increase the throughput of
01:10:04
the elevator with some thumb drives it
01:10:06
would be incredible what we found out is
01:10:08
the the DMV has been running the
01:10:09
government for the last 70 years it's
01:10:11
been a compounding compounding that's
01:10:13
that's really what's going on DMV is in
01:10:15
charge I mean if the world ends in
01:10:18
nuclear war God forbid the only thing
01:10:19
that'll be left will be the cockroaches
01:10:21
and then a bunch of like government work
01:10:24
documents TPS reports down in
01:10:27
the M shaft basically
01:10:30
yeah let's take a moment everybody to
01:10:33
thank our
01:10:35
czar we miss him we wish he could be here
01:10:38
for the whole show and thank you Zar
01:10:40
thank you to the czar see you guys miss
01:10:43
you we miss you little buddy I wish we
01:10:44
could talk about Ukraine but we're not
01:10:46
allowed get back to work we'll talk
01:10:48
about it another time over coffee I'll see
01:10:50
you in the commissary thanks for the
01:10:52
invite bye man I'm so excited Naval
01:10:56
Sacks invited me to go to the the
01:10:58
military mess I'm going to be in the
01:10:59
commissary no he didn't J-Cal you invited
01:11:02
yourself be honest I did yes I did I put
01:11:04
it on his calendar to keep the
01:11:05
conversation moving let me segue a point
01:11:07
that came up that was really important
01:11:09
into tariffs and the point is even
01:11:13
though the internet was open the US won
01:11:16
a lot of the internet a lot of us
01:11:18
companies won the internet and they won
01:11:20
that because we got there the firstest
01:11:22
with the mostest as they say in the
01:11:23
military and that matters because a a
01:11:26
lot of Technology businesses have scale
01:11:28
economies and network effects underneath
01:11:30
even basic brand-based Network effects
01:11:33
if you go back to the late 90s early
01:11:35
2000s very few people would have
01:11:37
predicted that we would have ended up
01:11:38
with Amazon basically owning all of
01:11:40
e-commerce you would have thought it
01:11:41
would have been a perfect competition
01:11:43
and very spread out and that applies to
01:11:45
how we end up with Uber as basically one
01:11:47
taxi service or we end up with Airbnb
01:11:50
meta Airbnb it's just Network effects
01:11:52
Network effects Network effects rule the
01:11:54
world around me but but when it comes to
01:11:55
tariffs and when it comes to trade we
01:11:58
act like Network effects don't exist the
01:12:00
classic Ricardian comparative advantage
01:12:02
Dogma says that you should produce what
01:12:04
you're best at I produce what I'm best
01:12:06
at and we trade and then even if you
01:12:08
want to charge me more for it if you
01:12:09
want to impose tariffs for me to ship to
01:12:11
you I should still keep tariffs down
01:12:13
because I'm better off you're just
01:12:14
selling me stuff cheaply great or if you
01:12:17
want to subsidize your guys great you're
01:12:18
selling me stuff cheaply the problem is
01:12:20
that is not how most modern businesses
01:12:22
work most modern businesses have Network
01:12:24
effects as a simple thought experiment
01:12:26
suppose that we have two countries right
01:12:28
I'm China you're the US I start out by
01:12:31
subsidizing all of my companies and
01:12:34
industries that have Network effects so
01:12:36
I'll subsidize TikTok I'll ban your
01:12:39
social media but I'll push mine I will
01:12:42
subsidize my semiconductors which tend
01:12:44
to have winner-take-all dynamics in
01:12:45
certain categories or I'll subsidize my
01:12:47
drones and then BYD exactly BYD
01:12:50
self-driving whatever and then when I
01:12:53
win I own the whole market and I can
01:12:55
raise prices and if you try to start up
01:12:57
a competitor then it's too late I've got
01:12:59
Network effects or if I've got scale
01:13:01
economies I can lower my price to zero
01:13:03
crash you out of business no one in
01:13:04
their right mind will invest and I'll
01:13:05
raise prices right back up so you have
01:13:08
to understand that certain industries
01:13:10
have hysteresis or they have Network effects
01:13:12
or they have economies of scale and
01:13:14
these are all the interesting ones these
01:13:15
are all the high margin businesses so in
01:13:17
those if somebody is subsidizing or
01:13:19
they're raising tariffs against you to
01:13:21
protect your industries and let them
01:13:23
develop you do have to do something you
01:13:25
can't just completely back
01:13:28
down what are your thoughts Chamath about
01:13:31
tariffs and network effects it does seem
01:13:32
like we do want to have uh redundancy in
01:13:36
supply chain so there are some
01:13:37
exceptions here any um thoughts on how
01:13:40
this might play out because yeah Trump
01:13:42
brings up tariffs every 48 hours and then it
01:13:45
doesn't seem like any of them land so I
01:13:47
don't know I'm I'm still on my 72-hour
01:13:49
Trump rule which is whatever he says
01:13:51
wait 72 hours and and then maybe see if
01:13:54
it actually comes to pass where do you
01:13:56
stand on all these tariffs and tariff
01:13:58
talk well I think the tariffs will be a
01:14:00
plug are they coming absolutely the
01:14:02
Quantum of them I don't know and I think
01:14:04
that the way that you can figure
01:14:07
out how extreme it will be it'll be
01:14:11
based on what the legislative plan is
01:14:14
for the budget so there's two paths
01:14:15
right now path one which I think is a
01:14:17
little bit more likely is that they're
01:14:19
going to pass a slim down plan in the
01:14:21
Senate just on border security and
01:14:24
Military spending and then they'll kick
01:14:27
the can down the
01:14:28
road for probably another three or four
01:14:31
months on the budget plan two is this
01:14:33
one big beautiful bill that's working its
01:14:35
way through the
01:14:37
house and there they're proposing
01:14:39
trillions of dollars of cuts in that
01:14:42
mode you're going to need to raise
01:14:43
revenue somehow and especially if you're
01:14:46
giving away tax breaks and the only way
01:14:48
to do that is probably through tariffs
01:14:49
or one way to do it is through
01:14:51
tariffs my honest opinion Jason is that
01:14:53
I think we're in a very complicated
01:14:54
moment I think the Senate plan is
01:14:56
actually on the margins more likely and
01:14:59
better and the reason is because I think
01:15:01
that Trump is better off getting the
01:15:04
next 60 to 90 days of data I mean we're
01:15:07
in a real pickle here we have persistent
01:15:11
inflation we have a broken fed they are
01:15:16
totally asleep at the switch and the
01:15:18
thing that Yellen and Biden did which in
01:15:21
hindsight now was extremely dangerous is
01:15:25
they issued so much short-term paper
01:15:27
that in totality we have $1 trillion
01:15:30
that we need to finance in the next 6
01:15:32
to 9 months so it could be the case
01:15:36
that we have rates that are like five
01:15:40
five and a quarter five and a
01:15:42
half percent I think that that's extremely bad
01:15:46
at the same time as inflation at the
01:15:48
same time as delinquencies are ticking
01:15:51
up so I think tariffs are probably going
01:15:55
to happen but I think that
01:15:58
Trump will have the most
01:16:01
flexibility if he has time to see what
01:16:05
the actual economic conditions will be
01:16:07
which will be more clear in 3 four five
01:16:10
months and so I almost think this big
01:16:12
beautiful bill is actually
01:16:13
counterproductive because I I'm not sure
01:16:14
we're gonna have all the data we need to
01:16:16
get it right Freeberg any thoughts on these
01:16:19
tariffs you've been involved in the
01:16:22
global uh Marketplace especially when it
01:16:24
comes to produce and wheat and all this
01:16:27
corn and everything what do you think
01:16:29
the dynamic here is going to be or is it
01:16:31
saber rattling and a tool for Trump the
01:16:35
biggest buyer of US ag exports is China
01:16:38
China ag exports are a major Revenue
01:16:42
Source major income source and a major
01:16:44
part of the economy for a large number
01:16:46
of states and so there will be as there
01:16:50
was in the first Trump presidency very
01:16:52
likely very large transfer payments made
01:16:55
to Farmers because China is very likely
01:16:58
going to tariff Imports or stop making
01:17:01
import purchases altogether which is what
01:17:03
happened during the first presidency
01:17:05
when they did that the federal
01:17:06
government I believe had transfer
01:17:08
payments of north of $20 billion to
01:17:10
Farmers this is a not negligible sum and
01:17:13
it's a not- negligible economic effect
01:17:15
because there's then a Rippling effect
01:17:17
throughout the ag economy so I think
01:17:18
that's one key thing that I've heard
01:17:20
folks talk about is the um the activity
01:17:23
that's going to be needed to support the
01:17:26
farm
01:17:27
economy as the US's biggest ag customer
01:17:30
disappears in the early 20th century we
01:17:33
didn't have an income tax and the
01:17:34
federal revenue was almost entirely
01:17:36
dependent on tariffs when tariffs were
01:17:39
cut there was an
01:17:40
expectation that there would be a
01:17:42
decline in federal government revenue
01:17:44
but what actually happened is volume
01:17:45
went up so lower tariffs actually
01:17:48
increased trade increased the size of the
01:17:50
economies this is where a lot of
01:17:51
economists take their basis in hey guys
01:17:54
if we do these tariffs it's actually
01:17:56
going to shrink the economy it's going
01:17:57
to cause a reduction in trade the
01:17:59
counterbalancing effect is one that has
01:18:01
not been tested in economics right which
01:18:03
is what's going to happen if
01:18:05
simultaneously we reduce the income tax
01:18:07
and reduce the corporate income tax and
01:18:10
basically increase capital flows
01:18:12
through reduced taxation while doing the
01:18:15
Tariff implementation at the same time
01:18:17
so it's a grand economic experiment and
01:18:19
I think we'll learn a lot about what's
01:18:21
going to happen here as this all moves
01:18:22
forward I do think ultimately these
01:18:25
countries are going to capitulate to
01:18:26
some degree and we're going to end up
01:18:27
with some negotiated settlement that's
01:18:29
going to hopefully not be too short-term
01:18:32
impactful on the economies and the
01:18:33
people and the jobs that are dependent
01:18:35
on trade economy feels like it's in a
01:18:37
very precarious place it does to asset
01:18:41
holders obviously and obviously they've
01:18:43
left it in a bad place in the last
01:18:44
Administration and we shut down the
01:18:46
entire country for a year over Covid the
01:18:48
bill for that has come due and that's
01:18:50
reflected in inflation I I think there are
01:18:52
a couple other points in tariffs first
01:18:54
is it's it's not just about money it's
01:18:56
also about making sure we have
01:18:57
functional middle class with good jobs
01:18:59
because if you have a non-tariff world
01:19:01
maybe all the gains go to the upper
01:19:03
class and an underclass and then you
01:19:05
can't have a functioning democracy when
01:19:06
the average person is on one of those
01:19:08
two
01:19:09
extremes so I think that's one issue
01:19:11
another is strategic Industries if you
01:19:14
look at it today probably the largest
01:19:15
defense contractor in the world is DJI
01:19:18
they got all the drones even in Ukraine
01:19:20
both sides are getting all their drone
01:19:22
parts from DJI now they're getting it
01:19:23
through different supply chain chain and
01:19:25
so on but Ukrainian drones and Russian
01:19:27
drones the vast majority of them are
01:19:28
coming through China through DJI and we
01:19:31
don't have that industry if we have a
01:19:33
kinetic conflict right now and we don't
01:19:34
have good drone supply chain internally
01:19:37
in the US we're probably going to lose
01:19:39
because those things are autonomous
01:19:40
bullets that's the feature of all
01:19:42
Warfare we're buying f-35s and the
01:19:44
Chinese are building swarms of drones
01:19:47
at scale so we do have to reshore those
01:19:50
critical Supply chains and what is a
01:19:52
drone supply chain it's not just there's
01:19:54
not a thing called drone it's like Motors
01:19:56
and
01:19:57
semiconductors and Optics and lasers and
01:20:00
just everything across the board so I
01:20:02
think there are other good Arguments for
01:20:04
at least reshoring some of these
01:20:05
industries we need them and the the
01:20:08
United States is very lucky and that
01:20:09
it's very autarkic we have all the
01:20:11
resources we have all the supplies we
01:20:13
can we can be Upstream of everybody with
01:20:14
all the energy to the extent we're
01:20:16
importing any energy that is a choice we
01:20:19
made that is not because fundamentally
01:20:21
we need the energy yeah because between
01:20:25
all the oil resources and the natural
01:20:27
gas and fracking combined with all the
01:20:30
work we've done in nuclear fission and
01:20:31
small reactors we should absolutely be
01:20:33
energy independent we should be running the
01:20:35
table on it we should we should have
01:20:37
a massive Surplus and hey you know if
01:20:39
you if you're worried about you know
01:20:41
couple of million DoorDash Uber
01:20:43
drivers losing their jobs to automation
01:20:45
like hey there's going to be factories
01:20:47
to build these parts for these drones
01:20:50
that we're going to need so there
01:20:52
there's a lot of opportunity I guess for
01:20:53
people and there is a difference between
01:20:55
different kinds of jobs those kinds of
01:20:57
jobs are better jobs building difficult
01:21:00
things at scale physically that we need
01:21:03
for both National Security and for
01:21:05
Innovation th those are better jobs than
01:21:08
you know paperwork writing essays for
01:21:10
other people to read yeah or even
01:21:12
driving cars all right listen I want to
01:21:13
get to two more stories here we have a
01:21:15
really interesting copyright story that
01:21:18
I wanted to touch on Thomson Reuters
01:21:19
just won the first major US AI copyright
01:21:22
case and fair use played a major role in
01:21:25
this decision this has huge implications
01:21:28
for AI companies here in the United
01:21:30
States obviously open Ai and the New
01:21:33
York Times Getty Images versus stability
01:21:36
we've talked about these but it's been a
01:21:37
little while because the legal system
01:21:40
takes a little bit of time and these are
01:21:41
very
01:21:42
complicated cases as we've talked about
01:21:45
Thomson Reuters owns Westlaw if you
01:21:47
don't know that it's kind of like Lexis
01:21:49
Nexis it's one of the legal
01:21:51
databases out there that lawyers use to
01:21:53
to find cases Etc uh and they have a
01:21:56
paid product with summaries and Analysis
01:21:58
of legal decisions back in
01:22:00
2020 this is two years before ChatGPT
01:22:03
Reuters sued a legal research competitor
01:22:05
called Ross for copyright infringement
01:22:07
Ross had created an AI powered legal
01:22:09
search engine sounds great but Ross had
01:22:12
asked Westlaw if they could pay for a
01:22:14
license to its content for training
01:22:15
Westlaw said no this all went back and
01:22:18
forth and then Ross signed a similar
01:22:20
deal with a company called LegalEase
01:22:22
the problem is LegalEase's database was
01:22:24
just copied and pasted from a bunch
01:22:26
of Westlaw answers so Reuters Westlaw
01:22:28
sued Ross in 2020 accusing the company
01:22:31
of being vicariously liable for
01:22:33
LegalEase's direct infringement super
01:22:35
important Point anyway the judge
01:22:37
originally favored Ross in fair use this
01:22:40
week the judge reversed this ruling and
01:22:42
found Ross liable noting that after
01:22:44
further review fair use does not apply
01:22:46
in this case this is the first major win
01:22:51
and uh we debated this so here's a clip
01:22:53
you know you heard it here first on the
01:22:55
all-in Pod what I would say is you know
01:22:57
when you look at that fair use Doctrine
01:23:00
I've got a lot of experience with it you
01:23:01
know the fourth Factor test I'm sure
01:23:03
you're well aware of this is the effect
01:23:04
of the use on the potential market and
01:23:06
the value of the work if you look at the
01:23:08
lawsuits that are starting to emerge it
01:23:10
is Getty's right to then make derivative
01:23:13
products based on their images I think
01:23:15
we would all agree stable diffusion when
01:23:17
they use these open web crawlers that is no
01:23:19
excuse to use an open web crawler to
01:23:22
avoid getting a license from the
01:23:23
original owner of that just because
01:23:25
you can technically do it doesn't mean
01:23:26
you're allowed to do it in fact the open
01:23:28
Web projects that provide these say
01:23:30
explicitly we do not give you the right
01:23:32
to use this you have to then go read the
01:23:35
copyright laws on each of those websites
01:23:37
and on top of that if somebody were to
01:23:39
steal the copyrights of other people put
01:23:41
it on the open web which is happening
01:23:42
all day long you still if you're
01:23:44
building a derivative work like this you
01:23:46
still need to go get it so it's no
01:23:48
excuse that I took some site in Russia
01:23:50
that did a bunch of copyright violation
01:23:52
and then I indexed them for my training
01:23:53
model so I think this is going to
01:23:55
result Freeberg can you shoot me in the
01:23:57
face and let me know when this
01:24:00
okay oh great I feel same way same way
01:24:05
now exactly I know me too yeah okay good
01:24:08
good segment let's move
01:24:09
on well since these guys don't give a
01:24:11
about copyright holders what do you
01:24:13
think about uh you know I'm so glad
01:24:16
you're here Naval to actually talk about
01:24:18
the topics these two other guys I'm
01:24:20
going to go on a thin limb I'm going to go
01:24:22
on an even thinner limb and say I largely
01:24:23
agree with you I think it's a bit Rich
01:24:25
to crawl the open web Hoover up all the
01:24:27
data offer direct substitution for a lot
01:24:29
of use cases because you know now you
01:24:30
start and end with the AI model it's not
01:24:32
even like you link out like Google did
01:24:34
and then you just close off the models
01:24:35
for safety reasons I think if you
01:24:37
trained on the open web your model
01:24:38
should be open source yeah absolutely
01:24:41
that would be a fine thing I have a
01:24:43
prediction here I think this is all
01:24:44
going to wind up like the Napster
01:24:46
or Spotify case for people who don't
01:24:49
know Spotify pays I think 65 cents on
01:24:52
the dollar to the original Underwriters
01:24:55
of of that content the music industry
01:24:58
and they figured out a way to make a
01:24:59
business and Napster is roadkill I think
01:25:03
that there is a nonzero chance like it
01:25:05
might be five or 10% that OpenAI is
01:25:08
going to lose the New York Times lawsuit
01:25:10
and they're going to lose it hard and
01:25:11
there going to be injunctions and I
01:25:12
think it's the settlement might be that
01:25:15
these language models especially the
01:25:16
closed ones are going to have to pay
01:25:17
some
01:25:19
percentage in a negotiated settlement of
01:25:21
their revenue half 2/3 to the content
01:25:26
holders and this could make the content
01:25:29
industry have a massive massive uplift
01:25:32
and a and a and a massive Resurgence I
01:25:35
think that the problem there's an
01:25:37
example on the other side of this which
01:25:39
is that there's a company that provides
01:25:41
technical support for Oracle third party
01:25:44
company and Oracle has tried many times to
01:25:47
sue them into oblivion using copyright
01:25:49
infringement as part of the
01:25:51
justification and it's been a pall over
01:25:54
the stock for a long time the company's
01:25:55
name is Rimini Street don't ask me why I
01:25:58
it's on my radar but I just I've been
01:26:00
looking at it and they lost this huge
01:26:03
lawsuit Oracle won and then it went to
01:26:05
appellate court and then it was all
01:26:07
vacated why am I bringing this up I
01:26:10
think that the legal Community has
01:26:12
absolutely no idea how these models work
01:26:14
because you can find one case that goes
01:26:16
one way and one case that goes the other
01:26:18
and what I would say should become
01:26:21
standard reading for anybody bringing one
01:26:24
of these lawsuits there's an Incredible
01:26:26
video that Karpathy just dropped that
01:26:28
Andrej just dropped where he does like
01:26:31
this deep dive into llms and he explains
01:26:33
chat GPT from the ground up it's on
01:26:36
YouTube it's three hours it's excellent
01:26:40
and it's very difficult to watch that
01:26:43
and not get to the same conclusion that
01:26:45
you guys did I'll just leave it at that
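The conclusion they're gesturing at, that a language model stores statistics about its training text rather than the text itself, can be sketched with a toy bigram model (illustrative Python only, not from the episode; the corpus string is made up, and a real LLM is a neural network, not a lookup table):

```python
# Toy "language model" as a lossy compressor (illustrative only):
# it stores word-transition counts, not the training text itself.
from collections import Counter, defaultdict

def train_bigram(text):
    # Count how often each word is followed by each next word.
    words = text.split()
    model = defaultdict(Counter)
    for cur, nxt in zip(words, words[1:]):
        model[cur][nxt] += 1
    return model

def generate(model, start, n_words):
    # Greedily emit the most frequent continuation -- "regurgitating"
    # statistics rather than replaying stored text.
    out = [start]
    for _ in range(n_words):
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

# Hypothetical 14-word corpus; the model retains only 10 distinct transitions.
corpus = ("network effects rule the world network effects compound "
          "the world runs on network effects")
model = train_bigram(corpus)
print(generate(model, "network", 3))        # resembles, but does not replay, the corpus
print(sum(len(c) for c in model.values()))  # 10 transitions kept vs 14 words seen
```

The same tension the guests debate, learning versus copying, shows up even here: on a tiny corpus the "model" can emit near-verbatim spans, while at scale it mostly emits statistics.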
01:26:47
H I tend to agree with this there's also
01:26:50
a good old video by Ilya Sutskever where
01:26:53
he was I believe the founding Chief
01:26:54
scientist or CTO of OpenAI and he talks
01:26:57
about how these these large language
01:26:59
models are basically extreme
01:27:01
compressors and he models them entirely
01:27:04
as their ability to compress and they're
01:27:05
lossy compression exactly lossy
01:27:08
compression exactly exactly and Google
01:27:10
got sued for fair use back in the day
01:27:13
but the way they managed to get past the
01:27:15
argument was they were always linking
01:27:16
back to you they showed tiny snippets sent you
01:27:18
the traffic they sent you the traffic
01:27:20
this is lossy compression it is
01:27:22
absolutely I'm now on your side I hate to
01:27:25
say this
01:27:28
Jason I agree with you you were you were
01:27:33
right you were right that's all I wanted
01:27:36
to hear all these years that's all I
01:27:39
wanted shaking my head when I saw those
01:27:42
videos cuz I was like oh man Jason was
01:27:43
right Jason was right oh my God no I
01:27:46
just I've been through this so many
01:27:48
times that these I think this is you
01:27:50
know Rupert Murdoch said we should hold the
01:27:54
line with Google and not allow them to
01:27:57
index our content without a license and
01:27:59
Google navigated it successfully and and
01:28:02
they were able to to not get him to stop
01:28:04
I think what's happened now is that the
01:28:08
New York Times remembers that they they
01:28:09
all remember losing their content and
01:28:13
these snippets and the OneBox to Google
01:28:16
and they couldn't get that Genie back in
01:28:18
the bottle I think the New York Times
01:28:19
realizes this is their payday I think
01:28:22
the New York Times will make more money
01:28:24
from licenses from llms than they will
01:28:27
make from advertising or subscriptions
01:28:30
eventually this will renew the model
01:28:32
almost I think the New York Times
01:28:34
content is worthless to an llm but
01:28:36
that's a different story I think the
01:28:37
actual value of the content political reason
01:28:39
whatever but I can tell you as a user I
01:28:42
loved the Wirecutter I think you knew
01:28:44
Brian and everybody over at the Wirecutter
01:28:45
we that was like fair enough yeah
01:28:47
Wirecutter what a great product I used to
01:28:50
pay for the New York Times I no longer
01:28:51
pay for the New York Times my main
01:28:52
reason was I would go to the Wirecutter
01:28:54
yeah and I would just buy whatever they
01:28:56
told me to buy now I go to ChatGPT
01:28:59
which I pay
01:29:01
for it tells me what to buy based on the
01:29:04
Wirecutter so it's it and I'm already
01:29:06
paying for it so I stop paying for it I
01:29:08
philosophically disagree with all of
01:29:10
your nonsense on this topic all three of
01:29:13
you are wrong and I'll tell you why
01:29:16
number one if information is out in the
01:29:18
open internet I I believe it's
01:29:21
accessible and it's viewable and I view
01:29:23
an llm or a WebCrawler as basically
01:29:25
being a human that's reading and can
01:29:28
store information in its brain can if
01:29:29
it's out there in the open if it's
01:29:30
behind a pay wall 100% if it's behind
01:29:32
some protected password wait wait wait
01:29:35
wait David David in that in that case
01:29:37
can a Google crawler just crawl entire
01:29:39
site and serve it on Google why can't
01:29:41
they do that there so here here's the
01:29:44
fair use the fair use is you cannot copy
01:29:46
you cannot repeat the content you cannot
01:29:48
take the content and repeat it that is
01:29:50
how the law is currently written but now
01:29:52
what I have is I have a tool that that
01:29:54
can remix it with 50 other pieces of
01:29:56
similar content and I can change the
01:29:58
words slightly and maybe even translate
01:29:59
into a different language so where does
01:30:00
it stop do you know the musical artist
01:30:02
girl talk we should have done a girl
01:30:04
talk track here today he's got musical
01:30:07
taste this guy oh good here we go he
01:30:10
basically take takes small samples of
01:30:12
popular tracks and he made and he got
01:30:14
sued for the same problem there was
01:30:16
another guy named white panda I believe
01:30:18
had the same problem Ed Sheeran got sued
01:30:20
for this yeah but but their entire sites
01:30:22
like Stack Overflow and wikis that have
01:30:24
basically disappeared now because you
01:30:25
can just swallow them all up and you can
01:30:27
just spit it all back out in ChatGPT
01:30:29
with slight changes so I I think that
01:30:31
the the the fair how much of a slight
01:30:33
change is exactly the right question how
01:30:36
much are you changing yeah so that's the
01:30:37
question are and it actually boils down
01:30:39
to the AGI question are these things
01:30:41
actually intelligent and are they
01:30:42
learning or are they compressing and
01:30:44
regurgitating that's the question I
01:30:45
wonder this about humans and that's why
01:30:47
I bring up the white panda the girl talk
01:30:49
in audio but also visual art there was
01:30:51
always artists and all even in classical
01:30:53
music I don't know if you guys are
01:30:54
classical music people but right there
01:30:56
was there's a demonstration of how you
01:30:59
know one composer learned from the next
01:31:01
and that you can actually track the
01:31:03
music as kind of being standing on the
01:31:05
shoulders of the prior and the same is
01:31:07
true in almost all art forms in almost
01:31:09
all human knowledge and Media
01:31:11
communication it's very hard to figure
01:31:13
that out well that's exactly right
01:31:14
that's the hard it's very hard to figure
01:31:15
that out which is why I come back to
01:31:17
there's only one of two stable solutions
01:31:19
to this and it's going to happen anyway
01:31:21
if we don't crawl it the Chinese will
01:31:22
crawl it right DeepSeek proved that so
01:31:24
there's only one of two stable Solutions
01:31:26
either you pay the copyright holders
01:31:28
which I actually think doesn't work and
01:31:30
the reason is because someone in China
01:31:32
will crawl it and they just dump the
01:31:33
weights right so they can just crawl and
01:31:34
compress and dump the compressed weights or
01:31:37
if you crawl make it open at least
01:31:40
contribute something back to open source
01:31:42
right you crawled open data contributed
01:31:44
back to open source and the people who
01:31:46
don't want to be crawled they're going
01:31:48
to have to go to a huge length to
01:31:49
protect their data now everybody knows
01:31:51
to protect the data yeah well
01:31:55
the same thing is happening here I have a book
01:31:57
out from Harper business on the shelf
01:32:00
behind me and uh I'm getting 2500
01:32:03
smackaroos for the next three years for
01:32:05
uh Microsoft indexing it so they're
01:32:07
going out and they're licensing this
01:32:08
stuff and um they're paying $2500 so your
01:32:12
book literally I'm getting $2,500 for
01:32:14
three years a bunch of Harper to go into
01:32:17
an llm to go into Microsoft specifically
01:32:20
and you know what I'm going to sign it I
01:32:22
decided because I just want to set the
01:32:23
precedent maybe next time it's 10,000
01:32:26
maybe next time it's 250 I don't care I
01:32:27
just want to see people have their
01:32:29
content respected and I'm just hoping
01:32:32
that Sam Altman loses this lawsuit and
01:32:33
they get an injunction against it hey uh
01:32:36
well just because he's just such a
01:32:38
weasel in terms of like making open AI
01:32:42
into a closed thing I mean I like Sam
01:32:43
personally but I think what he did was
01:32:45
like the super weasel move of all time
01:32:48
for his own personal benefit If he if he
01:32:50
and this whole lying like oh I have no
01:32:52
equity I get healthcare he does it no bro
01:32:55
he does it he does it for the love what
01:32:58
what was the statement he does it for
01:32:59
the I do it for the joy benefit out of
01:33:02
the benefits I think he got Healthcare I
01:33:04
think in OpenAI's defense they do need to
01:33:06
raise a lot of money and they got to
01:33:07
incent their employees but that doesn't
01:33:10
mean they need to take over the whole
01:33:11
thing that the nonprofit portion can
01:33:13
still stay the nonprofit portion and get
01:33:15
the lion's share of the benefits and be the
01:33:17
board and then he can have an incentive
01:33:19
package and employees can have an
01:33:20
incentive package why don't they get a
01:33:22
percentage of the revenue I just I
01:33:24
don't understand why it has to be bought out
01:33:27
right now for 40 billion and then the
01:33:28
whole thing disappears into a closed
01:33:30
system that part makes no sense to me
01:33:32
that's called a shell game and a scam
01:33:35
yeah I think Sam and his team would do
01:33:36
better to leave the nonprofit part alone
01:33:39
leave an actual independent nonprofit
01:33:41
board in charge and then have a strong
01:33:43
incentive plan and a strong fundraising
01:33:45
plan for the investors and the employees
01:33:47
so I think this is workable it's just
01:33:49
trying to grab it all just seems way off
01:33:51
especially when it was built on open
01:33:53
algorithms from Google open data from the
01:33:55
web and nonprofit funding from Elon and
01:33:57
others I mean what a great proposal like
01:34:00
we just workshopped here what if they
01:34:02
just what do they make six billion a
01:34:03
year just take 10% of it 600 million
01:34:06
every year and that goes into uh a bonus
01:34:09
they're losing money Jason so they have
01:34:11
to okay eventually they no but even
01:34:13
Equity they could they could give Equity
01:34:14
to the people building it but they could
01:34:16
still leave it in the control of the
01:34:18
nonprofit I just don't understand this
01:34:20
conversion I mean there was a there was
01:34:21
a board coup right the board tried to
01:34:23
fire him he
01:34:24
took over now it's
01:34:26
his he's self-dealing right and yeah they'll get
01:34:29
an independent valuation but we all know
01:34:31
that game you hire a valuation expert
01:34:32
who's going to say what you're going to
01:34:33
say and they'll check box if you're
01:34:35
going to capture the light cone of all
01:34:37
future value or build super intelligence
01:34:39
that's worth a lot more that's why Elon
01:34:41
just bid 100 billion exactly you're
01:34:42
saying you're saying the things that
01:34:44
actually The Regulators and and the
01:34:46
legal Community have no Insight because
01:34:48
they'll see a fairness opinion and they
01:34:50
think oh it says fairness and opinion
01:34:52
two words side by side it must be fair
01:34:54
and they don't know how all of this
01:34:56
stuff is gamed so yeah yeah I man I got
01:35:00
stories about 409a that would
01:35:03
exactly 409As are gamed these fairness
01:35:06
opinions are gamed but the the reality
01:35:09
is I don't think the legal and the
01:35:10
judicial Community has any idea I mean
01:35:13
imagine if a Founder you invested in
01:35:15
just is just a total imaginary situation
01:35:17
Naval had like a great term sheet at some
01:35:20
incredible dollar amount didn't take it
01:35:23
ran the valuation down to like under a
01:35:25
million gave themselves a bunch of
01:35:27
shares and then took it three months
01:35:29
later I don't know what would that be
01:35:32
called securities fraud yeah let's
01:35:34
wrap on your story I had an interesting
01:35:36
Nick we'll show you the photo I had an
01:35:39
interesting dinner on Monday with Brian
01:35:40
Johnson the don't die guy came over to
01:35:42
my house how's his erection doing
01:35:44
overnight what we talked about is he's
01:35:46
got three hours a night of nighttime
01:35:50
erections wow look at this by the way
01:35:52
first of all I'll tell you I think that
01:35:55
I I think that he's um wait which
01:35:57
one of those is giving him the erection
01:35:59
no no no he measures his nighttime
01:36:01
erection I think has given him the
01:36:03
erection or he but he said he said that
01:36:05
when he started so by the way he said
01:36:07
he's he was 43 when he started this
01:36:09
thing he was basically clinically obese
01:36:13
yeah and in these next four years has
01:36:15
become a specimen he now has 3 hours a
01:36:17
night of nighttime erections but that's
01:36:19
not the interesting thing at the end of
01:36:21
this dinner by the way his skin is in I
01:36:24
I was not sure because when you see the
01:36:25
pictures online but his skin in real
01:36:28
life is like a porcelain doll's both
01:36:31
my wife and I were like we've never seen
01:36:33
skin like this and it's incredibly soft
01:36:35
wait wait wait whoa whoa whoa how do you
01:36:37
know skin is soft you know you brush
01:36:39
your hand against his forearm or
01:36:40
whatever you know gives a hug at the end
01:36:42
of the night I'm telling you the guy
01:36:44
has supple skin bro it's the
01:36:46
softest skin I've ever touched in my
01:36:48
life anyways that's not the
01:36:51
point it was a really fascinating dinner he
01:36:54
walked through his whole protocol but at
01:36:56
the end of it I think it was Nikesh the
01:36:58
CEO of Palo Alto Networks he was just like
01:37:00
give me the top three things top three
01:37:03
and of the top three things what I'll
01:37:06
boil it down to is the top one thing
01:37:08
which is like 80% of the
01:37:11
80% it's all about sleep I was about to
01:37:14
guess sleep and he walked through his
01:37:17
nighttime routine and it's incredible
01:37:18
and it's straightforward it's really
01:37:20
simple it's like how you do a wind down
01:37:22
anyways I have tried to explain the wind
01:37:25
down briefly let's just say that because
01:37:27
Brian goes to bed much earlier so our
01:37:29
normal time let's just say you know 10
01:37:30
10:30 so my time I try to go to bed by
01:37:32
10:30 he's like you need to be in bed
01:37:35
you need to first of all stop eating
01:37:36
three or four hours before right and I
01:37:39
do that I eat at 6:30 so I have about 3
01:37:42
hours you're in bed by 9:30 or
01:37:45
10 you deal with the self-talk right
01:37:48
like okay here's the active mind telling
01:37:50
you all the things you have to fix in
01:37:51
the morning talk it out put it in its
01:37:54
place say I'm gonna deal with this in
01:37:55
the morning write it down in a journal
01:37:57
you're saying whatever you do so that
01:37:59
you put it away you cannot be on your
01:38:01
phone that's got to be in a different
01:38:03
room it's or you just got to be able to
01:38:05
shut it down and then read a book so
01:38:07
that you're actually just engaged in
01:38:09
something and and and he said that he
01:38:12
typically falls asleep within three to
01:38:14
four minutes of getting into bed and
01:38:16
starting this what I tried it so I've
01:38:19
been doing it since I had dinner with
01:38:20
him on Monday last night I fell asleep
01:38:22
within 50 minutes H the hardest part for
01:38:26
me is to put the phone away I can't do
01:38:28
it of course of course what about you Naval
01:38:30
tell us your one down oh yeah I know so
01:38:32
I know I know Brian pretty well actually
01:38:34
and I joke that I'm married to the
01:38:35
female Bryan Johnson because my wife has
01:38:38
some of his routines but she's like the
01:38:40
natural version no supplements and she's
01:38:43
intense and I think when Brian saw my
01:38:46
sleep score from my Eight Sleep he was
01:38:51
shocked he was he he was just like you're
01:38:52
going to die he's like you're literally
01:38:54
going to die well you got 70 80 no it's
01:38:56
ter it's terrible it's awful but it's
01:38:58
tell what's your number what's your
01:38:59
number it like in 30s 40s but you know
01:39:02
yeah but it's also because I don't sleep
01:39:04
much I only sleep a few hours a night
01:39:05
and I also move around a lot in the bed
01:39:07
and so on but it's fine I I never have
01:39:09
trouble falling asleep but I I would say
01:39:11
that Brian's yes skincare routine is
01:39:13
amazing his diet is incredible he is a
01:39:16
genuine character I do think a lot of
01:39:18
what he's saying minus the supplements
01:39:20
I'm not a big believer in supplements
01:39:22
yeah does work
01:39:23
I don't know if it's necessarily going to
01:39:25
slow down your aging but you'll look
01:39:26
good you'll feel good yeah sleep is the
01:39:28
number one thing in terms of falling
01:39:30
asleep I don't think it's really about
01:39:32
whether you look at your phone or not
01:39:34
believe it or not I think it's about
01:39:35
what you're doing on your phone if
01:39:37
you're doing anything that is
01:39:38
cognitively stressful or getting your
01:39:40
mind to spin then yes you think you can
01:39:43
scroll TikTok and fall asleep is fine
01:39:46
anything that's entertaining or that is
01:39:49
uh like you could read a book right on
01:39:51
your Kindle or on your iPad and I think
01:39:53
it'd be fine falling asleep or you can
01:39:55
listen to like some meditation video or
01:39:57
some spiritual teacher or something and
01:39:58
that'll actually help you fall asleep
01:40:00
but if you're on X or if you're checking
01:40:02
your email then heck yeah that's going
01:40:04
to keep you up so my hack for sleep is a
01:40:07
little different I normally fall asleep
01:40:09
within minutes and the way I do it is I
01:40:13
you all have a
01:40:14
meditation you have a set time no no I
01:40:16
sleep whenever I feel like usually
01:40:17
around 1 in the morning two in the
01:40:18
morning God damn I'm in bed by 10 yeah I
01:40:21
need to sleep I'm an owl but if you want
01:40:24
to fall asleep the hack I found is
01:40:26
everybody has tried some kind of a
01:40:28
meditation routine just sit in bed and
01:40:30
meditate and your mind will hate
01:40:33
meditation so much that if you force it
01:40:35
to choose between the fork of meditation
01:40:37
and sleeping you will fall asleep every
01:40:39
time well okay so after if you don't
01:40:41
fall asleep you'll end up meditating
01:40:42
which is great too so just I like the
01:40:45
meditation do the body scan the coda to
01:40:47
this story was a friend a friend of mine
01:40:49
came to see me from from the UAE and he
01:40:52
was here on Tuesday and I was telling
01:40:53
him about the dinner with Brian and he
01:40:55
told me the story cuz he's friends with
01:40:57
Khabib the UFC fighter and he says you
01:41:00
know when Khabib goes to his house he eats
01:41:02
anything and everything fried food pizza
01:41:05
whatever but he trains
01:41:07
consistently and my friend adala says
01:41:10
how are you able to do that and how does
01:41:12
it not affect your physiology goes I've
01:41:14
learned since I was a kid I sleep 3
01:41:17
hours after I train in the morning and I
01:41:19
sleep 10 hours at night and I've done it
01:41:20
since I was like 12 or 13 years old
01:41:23
that's a lot of sleep it's a lot of
01:41:25
sleep I you know the direct correlation
01:41:28
for me is if I uh do something
01:41:31
cognitively like you know big heavy duty
01:41:34
conversations or whatever so no heavy
01:41:36
conversations at the end of the night no
01:41:38
existential conversations at night and
01:41:41
then if I go rucking I have the you know
01:41:43
on the ranch I put on a 35 lb weight vest
01:41:45
I walk you do that at night before you
01:41:47
go to bed no no no if I do it anytime
01:41:48
during the day I typically do it in the
01:41:50
morning or the afternoon but the 1 to 2
01:41:52
mile Ruck with the 35 lbs whatever it is
01:41:54
it Just Tires my whole body out so that
01:41:57
when I do lay down is that why you don't
01:42:00
prepare for the
01:42:02
podcast you know I mean this pod is the top
01:42:06
10 pod in the world Chamath do you think
01:42:09
it's an accident freeberg what's your
01:42:10
what's your sleep routine can you just
01:42:12
go to bed you just like warm bath and I
01:42:15
send J Cal a picture of my feet I'll
01:42:18
wait till J Cal's done I do take a nice
01:42:21
warm bath
01:42:23
it but you do you do it every night a
01:42:25
warm bath I do yeah I do a warm bath
01:42:27
every night with candles too and do you
01:42:30
do it right before you go to bed yeah I
01:42:32
usually do it after I put the kids down
01:42:34
and I'll basically start to wind down
01:42:35
for bed I do watch TV sometimes but I do
01:42:38
have the problem and the mistake of
01:42:40
looking at my phone probably for too
01:42:41
long before I turn the lights off so do
01:42:44
you have a consistent time where you go
01:42:45
to bed or
01:42:47
no usually 11:00 to midnight and then up
01:42:51
at 6:30
01:42:54
man I I need I need eight hours
01:42:55
otherwise I'm a mess I go to I'm trying
01:42:57
to get eight I hit between 6 and 7
01:42:59
consistently I try to go to bed that 11:
01:43:01
to 1:00 a.m. window and get up the 7 to
01:43:04
8 window my problem is if I have work to
01:43:06
do I'll get on the computer or my laptop
01:43:08
and then when I start that after in my
01:43:11
evening routine I can't stop and then
01:43:13
all of a sudden it's like 3 in the
01:43:14
morning and I'm like oh no what did I
01:43:15
just do and then I still have to get up
01:43:17
at 6:30 so that does happen to me so
01:43:20
last night was unusual for me but it was
01:43:21
kind of funny anyway I thought oh I
01:43:23
should go to bed early cuz I have All-In
01:43:25
yeah but I ended up eating ice cream
01:43:28
with the kids
01:43:30
late wait what was the brand you said
01:43:32
you went for another brand I want to
01:43:33
know the brand I think it's Van Leeuwen or
01:43:35
something like
01:43:37
that New York
01:43:39
and the holiday cookies and cream oh my
01:43:42
God so good yeah it's so good so after I
01:43:44
polish that off then I was like oh I
01:43:46
probably ate too much to go to bed so I
01:43:48
better work out so I did a kettle bell
01:43:51
workout you sound like Mo what did you
01:43:54
say I have eight kettle bells right here
01:43:57
right next
01:43:58
to me Freeberg this is called working out
01:44:02
Freeberg and then while I'm doing my
01:44:04
kettlebell suitcase carry I was texting
01:44:07
with an entrepreneur friend so you can
01:44:09
tell how intense my workout was and he's
01:44:11
in Singapore so it was in the middle of
01:44:13
the night for me and early for him and
01:44:15
was time to go to bed I was like I was
01:44:17
like okay now I got to get to bed how do
01:44:19
I get to bed I'm my body is all amped up
01:44:21
I've got food in my stomach
01:44:24
Bells my brain is all my brain is all
01:44:26
amped up and all in podcast is tomorrow
01:44:29
and what time is it's 1:30 in the
01:44:30
morning I better get to bed so I I put
01:44:33
on like a little one of those spiritual
01:44:35
videos to calm me down and then I just
01:44:38
and then I got in bed and I was like
01:44:40
there's no way I'm falling asleep and I
01:44:42
started meditating and 5 minutes later I
01:44:44
was asleep you know actually the Dalai
01:44:46
Lama has these great on his YouTube
01:44:48
channel he's got these great like 2hour
01:44:50
discussions you get about 20 30 minutes
01:44:53
into that you will fall asleep well yeah
01:44:55
my but my learning is yeah watch any
01:44:57
Dharma lecture from the SS exactly and
01:45:00
my my lesson is my learning is that the
01:45:03
mind will do anything to avoid
01:45:05
meditation yeah yes by the way did you
01:45:08
guys see just before we W did you see
01:45:09
all the confirmations RFK Jr confirmed
01:45:12
Brooke Rollins confirmed by the way if
01:45:14
you look at Polymarket Polymarket had
01:45:16
it all right a couple weeks ago like I
01:45:18
was tracking the market there was a moment where
01:45:21
fell to like 56% there was a moment when
01:45:24
RFK fell to 75% but then they bounced
01:45:26
back and it was done you could have sniped
01:45:28
that man you could have made money yeah
01:45:31
and the media the media was like no way
01:45:32
he's getting confirmed this is not going
01:45:34
to happen but poly Market knows it's so
01:45:35
interesting huh well I saw a very
01:45:38
insightful tweet and I forget who wrote
01:45:40
it so I'm sorry I can't give credit but
01:45:42
the guy basically said look Trump has a
01:45:45
narrow majority in the house and the
01:45:48
Senate yeah and he can get everything he
01:45:50
wants as long as a Republicans stay in
01:45:52
line so all the pressure and all the
01:45:54
anger that all the MAGA movement is
01:45:57
doing against the left is pointless it's
01:46:00
all about keeping the rightwing in line
01:46:03
so it's all the people saying to the
01:46:05
Senators hey I'm going to primary you
01:46:07
Nicole Shanahan saying I'm going to
01:46:08
primary you it's Scott Pressler saying
01:46:10
I'm moving to your District that's the
01:46:12
stuff that's moving the needle and
01:46:14
causing the confirmations to go through
01:46:15
that's how you get Kash Patel that's how
01:46:17
you get Tulsi Gabbard the DNI that's how
01:46:19
you get RFK do you worry about any of these
01:46:22
you think any of them are are too spicy
01:46:23
for your taste or you just like the
01:46:25
whole burn it down put in the crazy like
01:46:29
Outsiders that's such a bad
01:46:31
characterization that's not a fair
01:46:32
characterization mean the out honestly
01:46:34
it's like I never thought I'd see it but
01:46:36
I think between Elon and sax and people
01:46:38
like that we actually have Builders and
01:46:40
doers and financially intelligent people
01:46:42
and economically intelligent people in
01:46:44
charge and you know despite all the
01:46:46
craziness elon's not doing this for the
01:46:47
money he's doing it because he thinks
01:46:49
it's the right thing to
01:46:50
do he moved into the Roosevelt
01:46:53
I I think like many of us I had I had
01:46:55
bought into the great forces of History
01:46:57
mindset where it's just like okay it's
01:46:59
inevitable this is what's happening
01:47:00
government always gets bigger always
01:47:02
gets slower and we just have to try and
01:47:04
get stuff built before they just shut
01:47:06
everything down and we turn into Europe
01:47:08
but the thing that happened then was you
01:47:10
know Caesar crossed the Rubicon
01:47:11
the great man theory of History played
01:47:14
out and we're living in that time and
01:47:16
it's it's an inspiration to all of us
01:47:17
despite Sam Altman and Elon's current
01:47:20
fighting I know Sam was inspired by Elon
01:47:22
at one point and I think all of us are
01:47:24
inspired by Elon I mean the guy can be
01:47:26
the Diablo player and do Doge and run
01:47:29
SpaceX and Tesla and Boring and
01:47:32
Neuralink I mean it's incredibly impressive
01:47:33
it makes us that's why I'm doing a
01:47:34
hardware company now it makes me want to
01:47:36
do something useful with my life you
01:47:37
know Elon always makes me question am I
01:47:40
doing something useful enough with my
01:47:42
life it's why I don't want to be an
01:47:43
investor you know Peter Thiel ironically
01:47:45
he's an investor but he's inspirational
01:47:47
that way too because he like yeah the
01:47:48
future just doesn't just happen you have
01:47:51
to go make it so you know we get to go
01:47:53
make the future and I'm just glad that
01:47:55
Elon and Doge and others are making the
01:47:56
future what do we got
01:47:58
going on here maybe I'll come on the All-In podcast
01:48:01
in a couple of months but it's really
01:48:03
it's really difficult I'm not sure I can
01:48:04
pull it off so let me try let me just
01:48:06
make sure it's viable is it drone
01:48:07
related is it self-driving drone drones
01:48:09
are cool but no it's not maybe podcast
01:48:12
should be an angel investor oh yes
01:48:16
absolutely no no Syndicate Jason just
01:48:18
our money what are you talking about you
01:48:20
know how I learned about syndicates was
01:48:21
Naval the first syndicate I ever did on
01:48:24
AngelList I think is still the biggest I
01:48:26
don't know 5% and Naval is my partner on
01:48:29
this for Calm.com I think you'll love what
01:48:32
I'm working on if I pull it off I think
01:48:34
you guys will love it we'd love to see
01:48:35
your demo let us know where to send the
01:48:37
check get that black cherry chip Van
01:48:39
Leeuwen I love you guys what have we
01:48:40
learned I gotta go okay big shout out to
01:48:43
Bobby and to Tulsi that's a huge huge
01:48:45
huge win for America I'm stoked about
01:48:47
both of them congratulations I love me
01:48:51
thanks for coming
01:48:55
let's get Bobby Bobby come back on the
01:48:57
Pod for the czar David Sacks your Sultan of
01:49:02
Science David Friedberg the chairman
01:49:05
dictator Chamath Palihapitiya
01:49:09
and namaste Naval I am the world's
01:49:14
greatest moderator we'll see you next
01:49:15
time on the All-In Pod say
01:49:19
bye-bye let your winners ride
01:49:23
Rainman
01:49:26
David and instead we open source it to
01:49:29
the fans and they've just gone crazy
01:49:31
with it love you queen
01:49:35
[Music]
01:49:39
of Besties
01:49:42
are that's my dog taking your
01:49:45
[Music]
01:49:47
driveways oh man myit will meet me at we
01:49:51
should all just get a room and just have
01:49:52
one big huge orgy cuz they're all this
01:49:54
useless it's like this like sexual tension
01:49:56
that they just need to release
01:49:58
[Music]
01:50:03
somehow we need to get merch
01:50:09
[Music]
01:50:13
our I'm going all in

Episode Highlights

  • Building Products
    Naval discusses his journey in product development and the lessons learned along the way.
    “I built a product that I loved that didn't catch fire.”
    @ 07m 13s
    February 15, 2025
  • Growth Through Stress
    Naval reflects on how stressful experiences can lead to personal growth and insights.
    “It's only in stress that you sort of are forced to grow.”
    @ 11m 04s
    February 15, 2025
  • JD Vance's AI Speech
    JD Vance delivered a powerful speech on AI opportunities at the AI Action Summit in Paris.
    “It was a very well crafted and well-delivered speech.”
    @ 25m 13s
    February 15, 2025
  • Techno-Optimism vs. Pessimism
    A debate on the differing views of technology's impact on society, especially regarding AI.
    “There's a greater incentive in those countries to manifest upside than there is for the United States.”
    @ 34m 50s
    February 15, 2025
  • Economic and Military Supremacy
    For a country to thrive, it needs both economic and military supremacy, which are underpinned by technological supremacy.
    “If you want a country to thrive it needs to have economic Supremacy and it needs to have military Supremacy.”
    @ 39m 21s
    February 15, 2025
  • The Importance of Skilled Immigration
    A discussion on the necessity of skilled, assimilated immigration to drive technology and economic growth in America.
    “Skilled assimilated immigration you have to separate that from just open borders.”
    @ 48m 20s
    February 15, 2025
  • AI and Job Security
    AI won't take your job, but knowing how to use it will be crucial.
    “AI won't take your job, it's someone using AI that will.”
    @ 01h 02m 05s
    February 15, 2025
  • Competing with AI
    Soft skills are becoming more valuable as AI tools level the playing field.
    “If you have these other skills, now you can compete equally.”
    @ 01h 04m 13s
    February 15, 2025
  • Tariffs and Network Effects
    Discussion on the impact of tariffs and network effects on industries.
    “Tariffs are probably going to happen.”
    @ 01h 15m 55s
    February 15, 2025
  • The Future of AI and Copyright
    Debate on how AI models should handle copyright and fair use.
    “I think this is going to result in a massive uplift for the content industry.”
    @ 01h 25m 11s
    February 15, 2025
  • The Importance of Sleep
    Bryan Johnson emphasizes that sleep is crucial for health and well-being.
    “It's all about sleep.”
    @ 01h 37m 08s
    February 15, 2025
  • Bobby's Return
    Bobby is welcomed back with excitement and enthusiasm.
    “Let's get Bobby back on the pod!”
    @ 01h 48m 55s
    February 15, 2025

Key Moments

  • Fun Podcast @ 01:40
  • Product Development @ 07:13
  • AI Opportunities @ 25:13
  • Immigration Debate @ 46:02
  • Economic Gains @ 55:13
  • Optimism @ 56:06
  • Economic Predictions @ 01:15:01
  • Supply Chain Concerns @ 01:19:52

Related Episodes

E165: Vision Pro: use or lose? Meta vs Snap, SaaS recovery, AI investing, rolling real estate crisis
E85: SBF's crypto bailout, Zendesk sells for ~$10B, buyout targets, US diplomacy, AlphaFold & more
E36: New FTC Chair, breaking up big tech, government silent spying, Jon Stewart, wildfires & more
AI Psychosis, America's Broken Social Fabric, Trump Takes Over DC Police, Is VC Broken?
E170: Tech's Vibe Shift, TikTok ban debate, Vertical AI boom, Florida bans lab-grown meat & more
E117: Did Stripe miss its window? Plus: VC market update, AI comes for SaaS, Trump's savvy move
Trump's First Week: Inauguration Recap, Executive Actions, TikTok, Stargate + Sacks is Back!
Grok 4 Wows, The Bitter Lesson, Third Party, AI Browsers, SCOTUS backs POTUS on RIFs
E168: Can Google save itself? Abolish HR, AI takes over Customer Support, Reddit IPO teardown