
E111: Microsoft to invest $10B in OpenAI, generative AI hype, America's over-classification problem

January 13, 2023 / 01:31:59

This episode covers media bias, the role of independent journalism, and the implications of AI technology. The hosts discuss the challenges faced by entrepreneurs and the impact of government policies on small businesses.

The conversation begins with the hosts addressing the media's reluctance to cover their podcast, highlighting the tension between independent voices and traditional journalism. They mention the recent profile by Slate, which they feel misrepresents their work.

They then shift to a discussion about the influence of media on public perception, particularly regarding technology entrepreneurs. The hosts argue that journalists often lack the expertise to accurately report on complex topics, leading to misinformation.

The episode also touches on the homelessness crisis in San Francisco, with the hosts debating the treatment of homeless individuals versus the struggles of small business owners. They discuss the need for a more nuanced approach to addressing addiction and mental health issues.

Finally, the hosts explore the potential of AI technologies, particularly in the context of business and legal applications. They speculate on the future of AI and its impact on various industries, emphasizing the importance of proprietary data in creating competitive advantages.

TL;DR

The episode discusses media bias, the challenges of entrepreneurship, homelessness, and the future of AI technology.

Video

00:00:00
is anybody else seeing a half a second
00:00:02
lag with J Cal or like a second line
00:00:04
test test one two one two from like the
00:00:06
way his mouth moves well that always
00:00:08
happens oh God here it comes his mouth
00:00:11
never stops moving black sex relaxed sex
00:00:15
are we going are we recording are you
00:00:18
ready to go
00:00:19
this is a plus material all right let's
00:00:22
go this is Chappelle at the punch line
00:00:24
let's go let's go
00:00:27
[Music]
00:00:39
[Music]
00:00:42
all right everybody Welcome to episode
00:00:47
111 of the world's greatest podcast
00:00:50
according to Slate the podcast that
00:00:53
shall not be mentioned by the Press
00:00:55
apparently you know what do you mean
00:00:56
they just did a profile on us well they
00:00:58
did this is the conundrum
00:01:00
it's so much of a phenomenon that we're
00:01:03
the number one business and the number
00:01:04
one Tech podcast in the world hands down
00:01:07
that the Press has a hard time giving us
00:01:12
any Oxygen because they want to hate us
00:01:16
they want to cover it you're saying they
00:01:18
take the ideas but not the they don't
00:01:20
want to cite it they don't want to cite
00:01:22
it they don't want to cite it but anyway
00:01:23
shout out to Slate yeah what I thought
00:01:25
was interesting was the guy pointed out
00:01:27
that we don't want to subject ourselves
00:01:29
to Independent journalists asking us
00:01:31
independent questions therefore we go
00:01:33
direct and that that's kind of the the
00:01:36
thing nowadays when everyone says they
00:01:38
want to go direct it's because they
00:01:39
don't want to be subject to Independent
00:01:40
journalists well one might ask
00:01:41
themselves why subjects don't want to go
00:01:44
direct yeah exactly you mean don't want
00:01:46
to go to journalists yeah because it
00:01:48
there's a there's a specific reason why
00:01:50
principles the subject of stories do not
00:01:52
want to have the Press
00:01:54
interpret what they're saying is because
00:01:56
they don't feel they're getting a fair
00:01:57
shake they feel like the challenge is
00:01:59
that then we avoid independent scrutiny
00:02:02
of our points of view and our decisions
00:02:03
they're constantly writing hit pieces
00:02:05
about us the the question is when we
00:02:08
want to present our side of it do we
00:02:11
need to go through their filter or not
00:02:12
why would you go through their filter
00:02:13
when it's always going to be a hit piece
00:02:16
right well I mean they have a class
00:02:19
hatred
00:02:20
of basically of Technology entrepreneurs
00:02:24
and investors
00:02:27
you're right J Cal they don't hate
00:02:29
you because you genuflect to their
00:02:31
political biases you see if if you do if
00:02:35
you do what SBF did which is basically
00:02:38
agree with all of their biases then yes
00:02:41
they'll treat you better that's the deal
00:02:43
that's how it works
00:02:45
a specific large media outlets right
00:02:48
she's not referring to Fox News
00:02:51
okay you can name one I'll trade you
00:02:54
I'll tell you what I'll trade you fox
00:02:55
for MSNBC and CNN and the New York Times
00:02:58
The Washington Post and the Atlantic
00:03:00
Magazine on and on and on you get a lot
00:03:02
of mileage out of being able to name Fox
00:03:04
the fact of the matter is Megyn Kelly
00:03:07
that's a podcaster she's independent now
00:03:09
that's true you can name one I mean
00:03:11
literally one Outlet
00:03:13
that is not part of this you know
00:03:16
mainstream media and they all think the
00:03:18
same way
00:03:19
there are very small differences the way
00:03:21
they think that's true it's all about
00:03:23
clicks uh at this point and it's all
00:03:25
about journalism and advocacy
00:03:28
what you're calling advocacy is bias and
00:03:32
activism it's activism it's that's what
00:03:34
I'm talking about activism journalism
00:03:35
yes I think the Draymond example also highlights
00:03:37
a really important point which is you
00:03:39
know he started his podcast it's become
00:03:41
one of the most popular forms of sports
00:03:44
media and he can speak directly without
00:03:46
the filtering and you know
00:03:48
classification that's done by the you
00:03:51
know journalist and it seems to be a
00:03:53
really powerful Trend the audience
00:03:55
really wants to hear direct and they
00:03:56
want to hear unfiltered Raw
00:03:59
point of view and there and maybe
00:04:01
there's still a role for I think the
00:04:03
journalism separate from that which is
00:04:05
to then scrutinize and analyze and
00:04:07
question and it's not journalism it's
00:04:10
activism they're just activists
00:04:12
there are also journalists out there
00:04:14
sites right so actually well it depends
00:04:17
what the topic is and what the outlet is
00:04:18
right but I I actually I would argue
00:04:21
that most of these journalists are doing
00:04:23
what they're doing for the same reason
00:04:24
that we're doing what we're doing which
00:04:26
is they want to have some kind of
00:04:28
influence because they don't get paid
00:04:30
very much right but the way they have
00:04:32
influence is to push a specific
00:04:33
political agenda I mean they're
00:04:35
activists they're basically artists
00:04:37
become advocacy journalism yes that's
00:04:39
the term I coin for it it's advocacy you
00:04:41
guys see this brouhaha
00:04:43
where Matt Yglesias wrote this article
00:04:45
about the fed and about the debt ceiling
00:04:49
and through this whole multi-hundred
00:04:51
word thousand word
00:04:54
tome he didn't understand the difference
00:04:57
between a percentage point and the basis
00:05:00
points yeah I did see that yeah wow so
00:05:04
wait a second you're saying the fed's
00:05:06
raising 25 percent
00:05:09
that's a huge difference between my
00:05:11
mortgage is coming up 25 between a
00:05:14
principal and an outside analyst right
00:05:15
like a principal has a better grasp
00:05:18
typically of the topic than the material
00:05:19
but you know the argument
00:05:23
within the journalist Circle he's
00:05:26
considered the conventional wisdom I get
00:05:27
it but the argument from a journalist is
00:05:29
that by having that direct access that
00:05:32
person is also biased because they're an
00:05:34
agent because they're a player on the
00:05:35
field they do have a point of view and
00:05:37
they do have a Direction they want to
00:05:38
take things so it is a fair commentary
00:05:40
that journalists can theoretically play
00:05:43
a role which if they're an off-field
00:05:44
analyst and that don't necessarily bring
00:05:47
I would argue they're less educated and
00:05:49
more biased than we are that may or may
00:05:51
not be true what the two of you guys are
00:05:52
debating which is a very subjective take
00:05:54
but the thing that is categorical and
00:05:56
you can't deny is that there is zero
00:05:59
checks and balances when something as
00:06:02
simple as the basis point percentage
00:06:04
Point difference isn't caught in
00:06:06
proofreading isn't caught by any editor
00:06:09
isn't caught by the people that you know
00:06:11
help them review this and so what that
00:06:13
says is all kinds of trash must get
00:06:15
through because there's no way for the
00:06:18
average person on Twitter to police all
00:06:21
of this nonsensical content this one was
00:06:22
easy because it was so numerically
00:06:24
illiterate that it just stood out but
00:06:27
can you imagine the the number of
00:06:29
unforced Errors journalists make today
00:06:32
in their search for clicks that don't
00:06:34
get caught out that may actually tip
00:06:36
somebody to think a versus B that's I
00:06:39
think the thing that's kind of
00:06:40
undeniable right yeah
00:06:44
there's a very simple test for this if
00:06:47
you read uh the journalists writing
00:06:50
about a topic you are an expert on
00:06:51
whatever the topic happens to be
00:06:53
you start to understand okay well on
00:06:56
that story I'm reading
00:06:58
that they understand about 10 or 20 or
00:07:00
30 percent of what's going on
00:07:03
but then when you read stories that
00:07:04
you're not involved in you know you read
00:07:06
a story about Hollywood or I don't know
00:07:08
pick an industry or a region you're not
00:07:10
super aware of you're like okay well
00:07:11
that must be 100 correct and the truth
00:07:14
is journalists have access to them there
00:07:17
is a name for it yeah it's called the
00:07:18
Gell-Mann Amnesia effect you just
00:07:19
plagiarize Michael Crichton who came up
00:07:21
with that yeah so you yeah no it's
00:07:25
exactly right but I think it's worse
00:07:26
than that it's because now the
00:07:28
now the mistakes aren't being driven
00:07:31
just by sloppiness or laziness or just a
00:07:33
lack of expertise I think it's being
00:07:35
driven by an agenda so just to give you
00:07:37
an example on the Slate thing the Slate
00:07:39
article actually wasn't bad it kind of
00:07:41
made us seem you know cool the sub
00:07:43
headline was a close listen to all in
00:07:46
the infuriating fascinating safe space
00:07:47
for silicon Valley's money men okay but
00:07:50
the the headline changed so I don't know
00:07:53
if you guys noticed this the headline
00:07:54
now is Elon musk's Inner Circle is
00:07:57
telling us exactly what it thinks first
00:07:59
of all like they're trying for clicks
00:08:01
yeah so they're trying way too hard to
00:08:04
like describe Us in terms of Elon which
00:08:06
you know is maybe two episodes out of
00:08:08
110 but before inner circle the word
00:08:11
they used was cronies and then somebody
00:08:13
edited it because I saw cronies in like
00:08:16
one of those tweet you know
00:08:19
summaries you know where like it does a
00:08:22
capsule or whatever yeah yeah and those
00:08:25
get Frozen in time so you know they were
00:08:27
trying to bash us even harder and then
00:08:29
somebody took another look at it and
00:08:30
turned it down I'll tell you what
00:08:32
happens in the editorial process whoever
00:08:34
writes the article the article gets
00:08:36
submitted maybe it gets edited proof
00:08:37
read whatever maybe it doesn't even in
00:08:40
some Publications they don't have the
00:08:41
time for it because they're in a race
00:08:43
then they pick there's somebody who's
00:08:45
really good at social media they pick
00:08:46
six or seven headlines they a B test
00:08:48
them and they even have software for
00:08:50
this where they will run a test
00:08:52
sometimes they'll do a pay test they put
00:08:54
five dollars in ads on social media
00:08:57
whichever one performs the best that's
00:08:59
the one they go with so it's even more
00:09:01
cynical and because people who read the
00:09:03
headlines
00:09:04
sometimes they don't read the story
00:09:06
right obviously most people just see the
00:09:07
headline they interpret that as a story
00:09:08
that's why I told you when they did that
00:09:10
new Republic piece on you with that
00:09:12
horrific monstrous monstrosity of an
00:09:15
illustration
00:09:17
don't worry about it people just read
00:09:18
the headline they know you're important
00:09:20
nobody reads the story anyway no but it
00:09:23
wasn't a bad article actually it was
00:09:24
well written actually I was in shock I
00:09:26
was like who is this writer that
00:09:28
actually took the time to write some
00:09:29
prose that was actually decent yeah he
00:09:31
had listened to a lot of episodes
00:09:32
clearly that was a really good moment
00:09:34
actually that was great advice because
00:09:35
you gave it to him and you gave it to me
00:09:37
because both of us had these things and
00:09:39
Jason said the same thing just look at
00:09:41
the picture and if you're okay with the
00:09:43
picture just move on and I thought this
00:09:45
can't be true and it turned out to
00:09:47
mostly be true yeah but my picture was
00:09:48
terrible
00:09:50
yeah but it's close to reality
00:09:55
oh wow
00:10:00
but that just shows how ridiculously
00:10:03
biased it is right
00:10:07
Hugh Grant Elon let's pull that up one
00:10:10
more time here Elon looks like Hugh
00:10:12
Grant that's just kind of he does kind
00:10:14
of not bad kind of looks like Hugh Grant
00:10:16
and like Notting Hill I knew that
00:10:18
article was going to be fine when the
00:10:20
first you know item they presented as
00:10:23
evidence of me doing something wrong was
00:10:25
basically helping to oust Chesa Boudin
00:10:27
which was something that was supported
00:10:29
by like 70% of San Francisco yeah which
00:10:32
is a 90% Democratic city so not exactly
00:10:35
evidence of some you know right-wing
00:10:38
movement look at the headline
00:10:40
the quiet political rise of David sacks
00:10:42
silicon Valley's prophet of urban Doom
00:10:44
I'm just letting you know people don't
00:10:46
get past the sixth word in the image
00:10:47
yeah that's 99% of people are like oh my
00:10:50
God congrats on the you know Republic
00:10:52
article it could have literally been
00:10:53
lorem what do they call them uh lorem
00:10:55
ipsums you know like it could have just
00:10:57
been filler words from their second
00:10:58
graph down and nobody would know yeah
00:11:01
but now apparently if you notice that
00:11:04
San Francisco streets look like you know
00:11:07
Walking Dead that apparently you're a
00:11:09
prophet of urban doom I mean people are so
00:11:11
out of touch I mean they can't even
00:11:13
acknowledge what people can see with
00:11:15
their own eyes that's the bias that's
00:11:16
gotten crazy and I don't know if you
00:11:18
guys saw this
00:11:20
really horrible dystopian video of an
00:11:23
art gallery owner who's been dealing
00:11:25
with owning a storefront in San
00:11:27
Francisco which is challenging and
00:11:29
having to clean up
00:11:31
feces and you know trash and whatever uh
00:11:34
every day and I guess the guy snapped
00:11:37
and he's hosing down a homeless person
00:11:40
who refuses to leave the front of a
00:11:42
store
00:11:43
oh I saw that I saw just like the
00:11:45
humanity in this is just insane like
00:11:49
really like you're hosing a human being
00:11:51
down
00:11:52
it's terrible who is obviously
00:11:55
not living a great life and is you know
00:11:58
I can feel for both of them I agree that
00:12:01
it's not good to hose a human being down
00:12:03
on the other hand think about the sense
00:12:04
of frustration that store owner has
00:12:06
because he's watching his business go in
00:12:09
the toilet because he's got homeless
00:12:11
people living in front of him so they're
00:12:13
both like being mistreated the homeless
00:12:15
person
00:12:18
say more the the homeless person is
00:12:20
being mistreated the store owner is being
00:12:21
mistreated by the city of San Francisco
00:12:22
yeah
00:12:25
is not in a privileged position that
00:12:27
person probably the store owner the
00:12:29
store owner he's probably fighting to
00:12:31
stay in business I'm just saying I'm not
00:12:33
saying that's right but no no I'm laying
00:12:36
the Rope
00:12:40
you're trying to do what you're trying
00:12:42
to do is oh my God look at this this
00:12:45
homeless person being horribly oppressed
00:12:46
no that store owner is a victim too
00:12:49
oh yeah there's no doubt it's horrible
00:12:51
to run a business what is that person
00:12:53
supposed to do no wait this is this is
00:12:55
symbolic of the breaking down of basic
00:12:57
Society like these both of these people
00:13:00
are obviously like it's just a horrible
00:13:03
moment to even witness it's like oh he
00:13:07
it's like something Jason do you have
00:13:09
equal empathy for the store owner and
00:13:11
the homeless person or no
00:13:13
under no circumstances should you hose a
00:13:16
person down in the face who is homeless
00:13:18
like it's just horrific to watch it's
00:13:20
just inhumane this is a human being now
00:13:22
but as a person who owns a store yeah my
00:13:24
dad grew up in the local business if
00:13:25
people were abusing the store you're
00:13:27
trying to make a living and you've got
00:13:28
to clean up you know whatever excrement
00:13:31
every day which is horrific yes and this
00:13:34
is dystopian look in that moment the
00:13:37
empathy is not equal I think you have
00:13:39
more empathy obviously for the person on
00:13:41
the receiving end of that hose okay but
00:13:43
in general our society has tons of
00:13:47
empathy for homeless people we spend
00:13:49
billions of dollars trying to solve that
00:13:51
problem you never hear a thing about the
00:13:53
store owners who are going out of
00:13:54
business so on a societal level you know
00:13:58
not in that moment but in general the
00:14:00
lack of empathy is for these middle
00:14:03
class store owners who may not be middle
00:14:05
class working class who are struggling
00:14:07
to stay afloat and you look at something
00:14:10
like what is it like a quarter or a
00:14:12
third of the storefronts in San
00:14:13
Francisco are now vacant the city yes
00:14:16
this person is running the shocking
00:14:18
thing is like this person is running an
00:14:20
art gallery storefront in San Francisco
00:14:22
like why would you even bother why would
00:14:24
you bother to have a storefront in San
00:14:25
Francisco I mean everybody's left it's
00:14:27
just what do you mean why do you bother
00:14:28
if you soap into stores so what are you
00:14:30
supposed to start to code all of a
00:14:32
sudden well no I mean you would shut it
00:14:34
down at some point and find an exit and
00:14:35
do it has large fixed costs right
00:14:40
10 years ago exactly at some point you
00:14:43
have to shut down your story in San
00:14:45
Francisco the second you can get out of
00:14:46
the solution to everything jcal isn't go
00:14:48
to coding school online and then you
00:14:49
know I didn't say it was I'm just but
00:14:52
moving to another city is a possibility
00:14:53
oh true a lot of folks in Silicon Valley
00:14:56
I think in this weirdly [ __ ] up way do
00:14:58
believe the solution to everything is
00:14:59
learn to code or it's an Uber driver
00:15:02
yeah
00:15:03
get a gig job get a gig the guy
00:15:07
spent years building his retail business
00:15:09
I mean the thing is a homeless person camps
00:15:11
in front of the store and he calls
00:15:13
the the police the police don't come and
00:15:15
move the homeless person the homeless
00:15:16
person stays there he asks nicely to
00:15:17
move customers are uncomfortable going
00:15:19
into store as a result yeah I stopped
00:15:20
going to certain stores in my
00:15:21
neighborhood because of homeless tents
00:15:23
being literally fixated in front of the
00:15:25
store and I go to the store down the
00:15:26
road to get my groceries or whatever
00:15:28
like I mean it's not a kind of uncommon
00:15:30
situation for a lot of these small
00:15:31
business owners they don't own the real
00:15:32
estate they're paying rent uh they've
00:15:35
got high labor costs you know
00:15:36
everything's inflating generally city
00:15:39
populations declining it's a brutal
00:15:40
situation
00:15:42
all around I think if everybody learns
00:15:45
to code or drives an Uber the problem is
00:15:47
that in the absence of things like local
00:15:50
stores and small businesses you hollow
00:15:52
out communities
00:15:53
you've got these random detached places
00:15:56
where you kind of live and then you sit
00:15:58
in your house which becomes a prison
00:15:59
while you order food from an app every
00:16:02
day I don't think that is the society
00:16:03
that people want so I don't know I kind
00:16:05
of want small businesses to exist and I
00:16:08
think that the homeless person should be
00:16:10
taken care of but the small business
00:16:11
person should have the best chance of
00:16:13
trying to be successful because it's
00:16:14
hard enough as it is the the mortality
00:16:16
rate of the of the small business owner
00:16:19
is already 90 percent
00:16:21
it's impossible in San Francisco let's
00:16:23
just be honest
00:16:25
I'm not trying to push people listen
00:16:27
yeah you are because here here's
00:16:29
something I'm saying the guy I'm just
00:16:31
shocked that the guy even has a
00:16:33
storefront I would have left alone
00:16:34
you're showing a tweet that's a moment
00:16:36
in time and you're not showing the 10
00:16:37
steps that led up to it oh a thousand
00:16:40
steps five times he called the police
00:16:42
station
00:16:44
customers the stuff that Friedberg
00:16:46
was just talking about or maybe there
00:16:48
was physical conflict that we didn't see
00:16:49
in that you know and he's resolving it
00:16:52
man it's really hard to look at these
00:16:54
videos and know what's going on it's
00:16:55
awful to see but man we don't know it's
00:16:57
the whole thing actually you want to
00:16:59
know another reason why we can't solve
00:17:00
this problem this is the language we use
00:17:02
around it the fundamental problem here
00:17:03
is not homelessness no it's addiction it's
00:17:06
addiction because and it's mental
00:17:08
illness Shellenberger's done the work
00:17:09
it's like he said 99 of the people he
00:17:12
talks to it's either mental illness or
00:17:13
addiction but we keep using this word
00:17:15
homeless to describe the problem right
00:17:18
but the issue here is not the lack of of
00:17:20
housing although that's a separate at
00:17:22
problem in California but it's basically
00:17:24
the lack of of treatment totally so we
00:17:26
should be calling them treatmentless and
00:17:28
mandates around this because and
00:17:29
enforcement you you cannot you can't
00:17:32
have a super drug be available for a
00:17:35
nominal price and give people
00:17:38
you know a bunch of money to come here
00:17:39
and take it and not enforce it you have
00:17:41
to draw the line that's at fentanyl I'm
00:17:43
sorry fentanyl is a super drug there's
00:17:45
three alternatives mandated rehab
00:17:48
mandated mental health uh or jail or you
00:17:53
know Housing Services if you're not
00:17:55
breaking the law you don't have mental
00:17:56
illness you don't have drug addiction
00:17:57
and then provide those are the four
00:17:58
Paths of outcome here of success and if
00:18:01
all four of those paths were both
00:18:03
mandated and available in abundance this
00:18:06
could be a tractable problem
00:18:08
unfortunately the man the Mandate I mean
00:18:11
you guys remember that Kevin Bacon movie
00:18:12
where Kevin Bacon was locked up in a
00:18:14
mental institution but he wasn't like
00:18:18
he wasn't mentally ill it's a famous
00:18:20
story
00:18:24
you guys someone's probably gonna call
00:18:26
me an idiot for for
00:18:28
messing this whole thing up but
00:18:30
I think um there's a there's a story
00:18:32
where
00:18:33
mandated
00:18:36
Mental Health Services like locking
00:18:38
people up to take care of them when they
00:18:40
have mental uh mental health issues like
00:18:42
this became kind of inhumane
00:18:45
and a lot of the institutions were shut
00:18:47
down and a lot of the laws were
00:18:48
overturned and there are many of these
00:18:50
cases that happened where they came
00:18:52
across as like torturous to what
00:18:54
happened to people that weren't mentally
00:18:55
ill and so the idea was like let's just
00:18:57
development
00:19:00
yeah well that's another one right and
00:19:02
it's unfortunate but I think that
00:19:04
there's some you know we talk a lot
00:19:05
about nuance and gray areas but there's
00:19:07
certainly some solution here that isn't
00:19:09
black or white it's not about not having
00:19:11
mandated Mental Health Services and it's
00:19:13
not about locking everyone up that has
00:19:15
some slight problem but there's some
00:19:17
solution here that needs to be crafted
00:19:18
uh where you know you don't let people
00:19:20
suffer and you don't let people suffer
00:19:22
both as the
00:19:24
the victim on the street but also
00:19:25
talking about a 5150 I think like when
00:19:28
people are held but because they're a
00:19:31
danger to themselves or others kind of
00:19:32
thing right but Jacob think about the
00:19:34
power of language here if we refer to
00:19:36
these people as untreated persons
00:19:38
instead of homeless persons and that was
00:19:41
the coverage 24/7 in the media is this
00:19:43
is an untreated person the whole policy
00:19:46
prescription would be completely
00:19:47
different we'd realize there's a
00:19:48
shortage of treatment we'd realize
00:19:50
there's a shortage of Remedies related
00:19:52
to getting people in treatment as
00:19:54
opposed to building housing but why why
00:19:58
and laws that mandate it that don't
00:20:00
enable it because if you don't mandate
00:20:01
it then you enable the free rein and
00:20:04
the free living on the street and the
00:20:06
open drug markets and all this sort of
00:20:07
stuff there's a really easy test for
00:20:09
this if it was if it was yourself
00:20:11
and you were addicted
00:20:14
or if it was a loved one it's one of
00:20:15
your immediate family members would you
00:20:17
want yourself or somebody else to be
00:20:19
picked up off the street and held with a
00:20:21
5150 or whatever code involuntarily
00:20:24
against their will because they were a
00:20:27
danger would you want them to be allowed
00:20:28
to remain on the street would you want
00:20:30
yourself if you were in that Dire
00:20:31
Straits and the answer of course is you
00:20:34
would want somebody what's interesting
00:20:36
policy perspective on this J Cal so
00:20:38
let me ask you as our uh our Die Hard
00:20:40
liberal on the show no I'm not a
00:20:41
die-hard liberal no no he's an
00:20:43
independent only votes for Democrats
00:20:44
please get it right 75% of the time I
00:20:47
voted Democrat 25% right independent of
00:20:49
us are Democrats okay 25% Republicans is
00:20:52
it not that your individual liberties
00:20:53
are infringed upon if you were to be
00:20:55
quote picked up and put away you know my
00:20:57
position on it is if you're not thinking
00:20:59
straight uh you're in you're high on
00:21:02
fentanyl you're not thinking for
00:21:03
yourself and you you could lose the
00:21:05
liberty for a small period of time 72
00:21:07
hours a week you know especially if
00:21:09
you're a danger to somebody you know
00:21:12
yourself or other people and in this
00:21:14
case if you're on fentanyl if you're on
00:21:15
meth you're you're a danger I mean I
00:21:18
think if more I think if some people
00:21:19
have that if more people have that point
00:21:20
of view and have that debate as sax is
00:21:22
saying in a more open way you could get
00:21:24
to some path to resolution on just not
00:21:26
in San Francisco it's not how it
00:21:28
happened so
00:21:29
you guys know this we won't say who it
00:21:31
is but someone in my family has some
00:21:35
pretty severe mental health issues and
00:21:37
the problem is
00:21:40
because they're an adult you can't get
00:21:42
them to get any form of treatment
00:21:43
whatsoever
00:21:45
right right
00:21:47
do you only have the nuclear option and
00:21:49
the nuclear option is you basically take
00:21:51
that person to court and try to seize
00:21:53
their power of attorney which is
00:21:54
essentially saying that you know
00:21:56
individual liberties are gone yeah and
00:21:58
by the way it is so unbelievably
00:22:00
restrictive what happens if you lose
00:22:02
that power of attorney
00:22:03
and somebody else has it over you
00:22:06
it's just a huge burden that
00:22:09
the legal system makes extremely
00:22:11
difficult and the thought that the law is a
00:22:14
backstop you know if the person's
00:22:15
committing something illegal like
00:22:16
camping out or doing fentanyl meth
00:22:19
whatever you can use the law as the
00:22:21
backstop you know
00:22:23
all that person can do is really get
00:22:25
arrested even that is not a high enough
00:22:27
bar to actually get power of attorney
00:22:29
over somebody the other thing that I
00:22:30
just wanted you guys to know I think you
00:22:32
know this but just a little historical
00:22:34
context is
00:22:35
a lot of this crisis in mental health
00:22:37
started because Reagan defunded all the
00:22:40
psychiatric hospitals he emptied them in
00:22:42
California
00:22:43
and that compounded because for whatever
00:22:46
reason his ideology was that these
00:22:48
things should be treated in a different
00:22:49
way and when he got to the presidency
00:22:52
one of the things that he did was he
00:22:54
repealed the mental health I think it's
00:22:56
called the mental health systems Act
00:22:57
mhsa
00:23:00
which completely broke down some pretty
00:23:02
Landmark legislation on mental health
00:23:03
and it's and I don't think we've ever
00:23:05
really recovered and that we're now 42
00:23:07
years onward from 1980 but or 43 years
00:23:11
onward but just something for you guys
00:23:12
to know that that's that's well that
00:23:14
Reagan had a lot of positives yeah but
00:23:16
that's one definitely negative check in
00:23:18
my book against his legacy is his stance
00:23:21
on Mental Health in general and what he
00:23:23
did to defund mental health well let me
00:23:25
let me make a couple two points there so
00:23:27
I'm I'm not defending that specific
00:23:28
decision there were a bunch of scandals
00:23:31
in the 1970s and epitomized by the movie
00:23:33
One Flew Over the Cuckoo's Nest with Jack
00:23:35
Nicholson about the conditions in these
00:23:38
mental health homes and that did create
00:23:40
a Groundswell to change laws around that
00:23:43
but I think this idea that like somehow
00:23:46
Reagan is to blame when he hasn't been
00:23:48
in office for 50 years as opposed to the
00:23:50
politicians who've been in office for
00:23:52
the last 20 years I just think it's
00:23:54
letting them off the hook I mean Gavin
00:23:56
Newsom 10 15 years ago when he was mayor
00:23:58
of San Francisco declared that he would
00:24:01
end homelessness within 10 years he just
00:24:03
made another declaration like that as
00:24:05
Governor so I just feel like not saying
00:24:08
it's Reagan's fault I'm just saying it's
00:24:10
an interesting historical moment I think
00:24:12
it's letting I think it's letting the
00:24:13
politics Society needs to start thinking
00:24:15
about changing priorities we didn't have
00:24:17
this problem of massive numbers of
00:24:20
people living on the streets 10 15 years
00:24:22
ago it was a much smaller problem and I
00:24:25
think a lot of it has to do with
00:24:26
fentanyl the power of these drugs is
00:24:28
increased yes massively there's other
00:24:30
things going on here so
00:24:32
in any event I mean you can question
00:24:34
what Reagan did in light of current
00:24:36
conditions but I think this problem
00:24:37
really started in the last 10 15 years
00:24:40
yes like in in an order of magnitude
00:24:43
bigger way these are super drugs until
00:24:44
people realize like these are a
00:24:46
different class of drugs and they start
00:24:48
treating them as such it's gonna just
00:24:50
get worse
00:24:52
as far as I know Reagan didn't hand out
00:24:55
to these addicts 800 a week to feed
00:24:58
their addiction so they can live on the
00:25:00
street San Francisco that is the current
00:25:01
policy of the city
00:25:03
all I just wanted to just provide was
00:25:06
just that color that
00:25:08
we had a system of funding for the
00:25:10
mental health infrastructure
00:25:11
particularly local mental health
00:25:12
infrastructure and we took that back and
00:25:16
then we never came forward and all I was
00:25:18
saying is I'm just telling you I think
00:25:22
that's part of the solution here is yeah
00:25:23
we're going to have to basically build
00:25:25
up shelters we're going to build up and
00:25:28
support
00:25:29
the problem now for example is Gavin
00:25:31
Newsom says a lot of these things and
00:25:34
now he's gone from a massive Surplus to
00:25:37
a 25 billion dollar deficit overnight
00:25:40
which we talked about even a year ago
00:25:42
because that was just the the law of
00:25:44
numbers catching up with the state of
00:25:45
California and he's not in a position
00:25:48
now to do any of this stuff so this one
00:25:49
this problem may get worse well they did
00:25:52
they did appropriate I forget the number
00:25:54
it's like 10 billion or something out of
00:25:55
that you know huge budget they had to
00:25:57
solve the problem of homelessness so I
00:25:58
would just argue they're not tackling it
00:26:00
in the right way because what happened
00:26:01
is there's a giant special interest that
00:26:04
formed around this problem which is the
00:26:07
the building industry who gets these
00:26:10
contracts to build the quote you know
00:26:13
affordable housing or this industrial
00:26:15
complex
00:26:16
so they end up building 10 units at a
00:26:20
time on Venice Beach like the most
00:26:22
expensive land you could possibly build
00:26:24
because you get these contracts from the
00:26:26
government so there's now a giant
00:26:28
special interest in lobby that's formed
00:26:30
around this if you really want to solve
00:26:32
the problem you wouldn't be building
00:26:34
housing on Venice Beach you'd be going
00:26:37
to cheap land just outside the city
00:26:40
totally and you'd be building scale
00:26:42
shelters I mean shelters that can house
00:26:45
10 000 people not ten and you'd be
00:26:48
having treatment services
00:26:50
but with treatment built into them right
00:26:52
you'll be solving this problem at scale
00:26:54
and that's not what they're doing
00:26:56
by the way do I do you guys want to hear
00:26:58
it this week in grift oh sure we're all
00:27:01
in that's a great example of grift I um
00:27:04
I read something today in Bloomberg that
00:27:05
was unbelievable there's about two
00:27:08
trillion dollars of debt owned by the
00:27:10
developing world
00:27:12
that has been classified by a non-profit
00:27:15
The Nature Conservancy in this case as
00:27:18
eligible for what they called nature
00:27:19
swaps so this is two trillion of the
00:27:22
umpteen trillions of debt that's about
00:27:24
to get defaulted on by countries
00:27:26
like Belize Ecuador Sri Lanka Seychelles
00:27:30
you name it and what happens now are the
00:27:34
big bulge bracket Wall Street Banks and
00:27:36
The Nature Conservancy
00:27:38
goes to these countries and says listen
00:27:40
you know you have a billion dollar
00:27:42
tranche of debt that's about to go
00:27:44
upside down
00:27:46
and you're going to be in default with
00:27:48
the IMF we'll let you off the hook
00:27:50
and you know we will negotiate with
00:27:53
those bondholders to give them 50 cents
00:27:55
on the dollar
00:27:56
but in return you have to promise to
00:27:59
take some of that savings and you know
00:28:01
protect the rain forest or protect a
00:28:04
coral reef or protect some mangrove
00:28:06
trees all sounds good
00:28:08
except then what these folks do is they
00:28:10
take that repackage debt they call it
00:28:14
ESG they mark it back up and then they
00:28:17
sell it to folks like BlackRock who have
00:28:19
decided that they must own this in the
00:28:21
portfolio so it literally just goes from
00:28:23
one sleeve of BlackRock which is now
00:28:25
marked toxic Emerging Market debt and
00:28:29
then it gets into someone's 401K as ESG
00:28:32
debt is that unbelievable so you convert
00:28:35
your signal and buy some ESG to make
00:28:37
yourself feel good yeah two trillion
00:28:39
dollars
00:28:40
ESG is that Exxon is like the number
00:28:43
seven like top ranked company according
00:28:46
to ESG and Tesla isn't even on the list
00:28:48
disastrous how crazy is that it's it's a
00:28:53
complete game yeah all of those that
00:28:56
we've said this many times but each of
00:28:57
those letters individually means so much
00:28:59
and should be worth a lot to a lot of
00:29:01
people but when you stick them together
00:29:03
it creates this toxic soup where you can
00:29:05
just hide the cheese
00:29:07
yeah I mean governance is important in
00:29:09
companies of course the environment is
00:29:11
important social change is important I
00:29:14
mean but why are these things grouped
00:29:16
together in this it just perverts it's
00:29:18
an industry Jacob it's an industry
00:29:20
absolutely all right Speaking of grifts
00:29:24
um Microsoft is going to put 10 billion
00:29:25
dollars or something into chat GPT
00:29:28
degenerate AI as I'm calling it now is
00:29:31
uh the hottest thing in Silicon Valley
00:29:34
the technology is incredible I mean you
00:29:36
you can question the business model
00:29:37
maybe but the technology is pretty well
00:29:40
I mean yeah so what I'd say is a 29
00:29:42
billion dollars for a company that's
00:29:43
losing a billion dollars on Azure
00:29:44
credits a year that's one way to look at
00:29:46
it that's also look at a lot of other
00:29:50
businesses that ended up being worth a
00:29:51
lot down the road I mean sure you can
00:29:54
model out the future of a business like
00:29:56
this and create a lot of really
00:29:57
compelling big outcomes you know
00:30:00
potentially yeah so Microsoft is close
00:30:03
to investing 10 billion in open AI in a
00:30:05
very convoluted transaction that people
00:30:07
are trying to understand it turns out
00:30:09
that they might wind up owning 59 of
00:30:13
open AI but get 75 percent of the
00:30:15
cash flows paid back over time 49 49
00:30:19
yeah of open AI but they would get paid
00:30:22
back the 10 billion dollars over some
00:30:24
amount of time
00:30:25
and this obviously includes Azure
00:30:28
credits uh and chat GPT as everybody
00:30:33
knows this um just incredible
00:30:35
demonstration of what AI can do in terms
00:30:39
of text-based creation of content and
00:30:41
answering queries
00:30:43
has taken the internet by storm people are
00:30:45
really inspired by it sax do you think
00:30:48
that this is a defensible real
00:30:51
technology do you think this is like a
00:30:53
crazy hype cycle
00:30:55
well it's definitely the next VC hype
00:30:57
cycle everyone's kind of glomming on to
00:30:59
this because VC really right now needs a
00:31:01
savior just look at the public markets
00:31:03
everything we're investing in is in the
00:31:04
toilet so we all really want to believe
00:31:06
that this is going to be the next wave
00:31:09
and just because something is a VC hype
00:31:12
cycle doesn't mean that it's not true so
00:31:15
as I think one of our friends pointed
00:31:17
out
00:31:18
you know mobile turned out to be very
00:31:20
real I think cloud turned out to be
00:31:23
I'd say very real social was sort of
00:31:27
real in the sense that did lead to a few
00:31:29
big winners on the other hand web 3 and
00:31:33
crypto was a hype cycle and it's turned
00:31:35
into a big bust VR falls into the hype
00:31:37
cycle it's probably a hype cycle so far
00:31:39
no one can even explain what web3 is
00:31:42
in terms of AI I think that if I had to
00:31:46
guess I would say the hype is real in
00:31:50
terms of its technological potential
00:31:52
however I'm not sure about how much
00:31:55
potential there is yet for VCS to
00:31:58
participate because right now it seems
00:32:01
like this is something that's going to
00:32:02
be done by really big companies so open
00:32:05
AI is basically a it looks like kind of
00:32:07
a Microsoft proxy you've got Google I'm
00:32:10
sure will develop it through their Deep
00:32:12
Mind asset
00:32:14
you know I'm sure Facebook is going to
00:32:16
do something huge in AI so what I don't
00:32:19
know is is this really a platform that
00:32:21
startups are gonna be able to benefit from I
00:32:24
will say that
00:32:25
some of the companies we've invested in
00:32:26
are starting to use these tools so
00:32:30
I guess I guess where I am is I think
00:32:33
the technology is actually exciting I
00:32:36
wouldn't go overboard on the valuations
00:32:39
so I wouldn't buy into that level of the
00:32:40
hype but you think there could be
00:32:42
hundreds of companies built around an
00:32:45
API for something like chat GPT or DALL-E
00:32:48
maybe yeah I don't I don't think
00:32:49
startups are going to be able to create
00:32:51
the AI themselves but they might be able
00:32:54
to benefit from the apis maybe that's
00:32:57
the thing that has to be proven out
00:32:59
there's a lot of really fantastic
00:33:01
machine learning
00:33:03
services available through Cloud vendors
00:33:06
today right so AWS has been one of
00:33:09
these kind of vendors and uh obviously
00:33:11
open AI is building tools a little bit
00:33:13
further down on the the stack
00:33:16
but there's a lot of tooling that can be
00:33:18
used for specific vertical applications
00:33:19
obviously the acquisition of InstaDeep
00:33:21
by BioNTech is a really solid example
00:33:24
and most of the big dollars that are
00:33:26
flowing in biotech right now are flowing
00:33:28
into machine learning applications where
00:33:30
there's some vertical application of
00:33:31
machine learning tooling and techniques
00:33:33
around some specific problem set and the
00:33:36
problem set of mimicking human
00:33:39
communication and
00:33:41
doing generative media is a consumer
00:33:45
application set that has a whole bunch
00:33:46
of really interesting product
00:33:47
opportunities but let's not kind of be
00:33:50
blind to the fact that nearly every
00:33:52
other industry and nearly every other
00:33:54
vertical is being transformed today and
00:33:56
there's active progress being made in
00:33:58
funding and getting liquidity on
00:34:01
companies and progress with actual
00:34:03
products being driven by Machine
00:34:05
learning systems and there's a lot of
00:34:06
great examples of this so you know the
00:34:08
fundamental capabilities of large data
00:34:10
sets and then using these kind of
00:34:13
learning techniques in software and
00:34:16
statistical models to make kind of
00:34:18
predictions and drive businesses forward
00:34:20
in a way that they're not able to with
00:34:22
just human knowledge and human
00:34:24
capability alone is really uh real and
00:34:27
it's here today
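The pattern Friedberg describes above — fitting a statistical model to a data set so a business can make predictions it couldn't make from human judgment alone — can be sketched in a few lines. This is only a toy illustration; the data and the churn-prediction framing are invented, and a real deployment would use a proper ML library rather than hand-rolled gradient descent.

```python
# Toy sketch of "statistical models making predictions": fit a
# single-feature logistic regression by gradient descent on an
# invented usage data set, then predict an outcome for new inputs.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, epochs=5000):
    """Single-feature logistic regression via batch gradient descent."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(w * x + b) - y  # prediction error for this row
            gw += err * x
            gb += err
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Invented history: weekly logins vs. whether the customer renewed.
logins  = [1, 2, 3, 8, 9, 10]
renewed = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(logins, renewed)
predict = lambda x: sigmoid(w * x + b)
print(predict(1), predict(9))  # low usage -> likely churn, high -> likely renew
```

The value, as the discussion notes, is less in the model than in owning a data set like `logins`/`renewed` that competitors can't reproduce.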
00:34:29
and so I think let's not get caught up
00:34:30
in the fact that there's this really
00:34:31
interesting consumer Market hype cycle
00:34:34
going on where these tools are not being
00:34:36
kind of you know validated and
00:34:38
generating real value across many other
00:34:40
verticals and segments when you look at
00:34:42
this Microsoft open AI deal and you see
00:34:45
something that's this convoluted hard to
00:34:47
understand what does that signal to you
00:34:50
as a capital allocator and Company
00:34:52
Builder I would put deals into two
00:34:54
categories one is easy and
00:34:56
straightforward and then two is you know
00:34:59
too cute by half or you know the too hard
00:35:01
bucket this is clearly in that second
00:35:03
category
00:35:04
but it doesn't mean that is it in that
00:35:06
category well it doesn't mean that it
00:35:08
won't work in our group chat with the
00:35:10
rest of the guys one person said there's
00:35:13
a lot of complex law when you go from a
00:35:17
non-profit to a for-profit there's lots
00:35:20
of complexity in Deal construction the
00:35:22
original investors have certain things
00:35:24
that they want to see
00:35:26
there may or may not be you know legal
00:35:28
issues at play here that you
00:35:30
encapsulated well in the last episode I
00:35:33
think there's a lot of stuff we don't
00:35:34
know so I think it's important to just
00:35:36
like
00:35:37
give those folks the benefit of the
00:35:39
doubt but yeah if you're asking me
00:35:41
it's in the too hard bucket for me to
00:35:43
really take seriously now that being
00:35:44
said it's not like I I got shown the
00:35:47
deal so I can't I can't comment here's
00:35:49
what I will say the first part of what
00:35:51
Sac said I think is really important for
00:35:54
entrepreneurs to internalize which is
00:35:55
where can we make money
00:35:59
the reality is that
00:36:01
well let me just take a prediction I
00:36:04
think that Google will open source their
00:36:06
models because the most important thing
00:36:10
that Google can do is reinforce the
00:36:12
value of search
00:36:14
and the best way to do that is to Scorch
00:36:17
the Earth with these models which is to
00:36:18
make them widely available and as free
00:36:21
as possible
00:36:22
that will cause Microsoft to have to
00:36:24
catch up and that will cause Facebook to
00:36:27
have to really look in the mirror and
00:36:29
decide whether they're going to cap the
00:36:32
betting that they've made on AR VR and
00:36:36
reallocate very aggressively to AI I
00:36:38
mentioned this in the I did this Lex
00:36:40
Friedman podcast but that should be what
00:36:42
Facebook does and and the reason is
00:36:45
if Facebook and Google and Microsoft
00:36:47
have roughly the same capability in the
00:36:49
same model
00:36:50
there's an element
00:36:52
of machine learning that I think is very
00:36:54
important which is called reinforcement
00:36:56
learning and specifically it's
00:36:57
reinforcement learning from Human
00:36:59
feedback right so these RLHF pipelines
00:37:01
these are the things
00:37:03
that will make your stuff unique so if
00:37:06
you're a startup you can build a
00:37:08
reinforcement learning pipeline how you
00:37:10
build a product that captures a bunch of
00:37:12
usage we talked about this before that
00:37:15
data set is unique to you as a company
00:37:17
you can feed that into these models get
00:37:20
back better answers you can make money
00:37:21
from it Facebook has an enormous amount
00:37:25
of reinforcement learning inside of
00:37:26
Facebook every click every comment every
00:37:30
like every share
00:37:31
Twitter has that data set Google inside
00:37:34
of Gmail and search
00:37:37
Microsoft inside of Minecraft and
00:37:39
Hotmail so my point is David's right the
00:37:43
huge companies
00:37:44
I think
00:37:46
will create the substrates
00:37:48
and I think they'll be forced to Scorch
00:37:51
the Earth and give it away for free
00:37:54
and then on top of that is where you can
00:37:56
make money and I would just encourage
00:37:57
entrepreneurs to think where is my Edge
00:38:01
in creating a data set that I can use
00:38:03
for reinforcement learning that I think
00:38:04
is interesting that's kind of saying I
00:38:07
buy the ingredients from the supermarket
00:38:09
but then I can still construct a dish
00:38:11
that's unique and you know the salt is
00:38:14
there the pepper is there but how I use
00:38:16
that will determine whether you like the
00:38:18
thing or not and I think that you know
00:38:20
that is the way that I think we need to
00:38:22
start thinking about it interestingly as
00:38:24
we've all pointed out here open AI was
00:38:26
started as a non-profit the stated
00:38:29
philosophy was this technology is too
00:38:31
powerful for any company to own
00:38:33
therefore we're going to make it open
00:38:35
source and then somewhere in the last
00:38:36
couple years they said well you know
00:38:38
what actually it's too powerful for it
00:38:40
to be out there in the public
00:38:41
we need to make this a private company
00:38:43
and we need to get 10 billion dollars
00:38:47
from Microsoft that is the the
00:38:49
disconnect I am trying to understand
00:38:51
that's the most interesting part of the
00:38:53
story Jason I think if you go back to
00:38:55
2014 is when Google bought DeepMind yeah
00:38:58
and immediately everyone started
00:39:01
reacting to a company as powerful as
00:39:03
Google having a toolkit and a team as
00:39:05
powerful as deepmind within them and
00:39:08
that that sort of power should not sit
00:39:09
in anyone's hands I heard people that
00:39:11
I'm close with that are close to the
00:39:12
organization and the company comment
00:39:15
that they thought this is the most kind
00:39:16
of scary threatening biggest threat to
00:39:19
humanity uh is Google's control of Deep
00:39:21
Mind and that was a naive kind of point
00:39:23
of view but it was one that was close
00:39:25
that was deeply held by a lot of people
00:39:27
so Reid Hoffman Peter Thiel Elon Musk a
00:39:30
lot of these guys funded the original
00:39:31
kind of open AI business in 2015 and
00:39:34
here's the link so I'm putting it out
00:39:36
here um you guys can pull up the
00:39:38
original blog post do all those don't
00:39:39
people who donated get stock
00:39:43
it was all in a non-profit and then the
00:39:45
non-profit owned stock in a commercial
00:39:47
business now but your point is
00:39:49
interesting because at the beginning the
00:39:51
idea was instead of having Google on all
00:39:52
of this we'll make it all available and
00:39:54
here's the the statement from the
00:39:56
original blog post in 2015 open AI is a
00:39:59
non-profit AI research company our goal
00:40:01
is to advance digital intelligence in
00:40:03
the way that is most likely to benefit
00:40:04
Humanity as a whole unconstrained by a
00:40:07
need to generate Financial return since
00:40:10
our research is free from Financial
00:40:12
Obligations we can better focus on a
00:40:14
positive human impact and they you know
00:40:16
kind of went on and the whole thing
00:40:17
about Sam Greg Elon Reed Jessica Peter
00:40:21
Thiel AWS YC are all donating to support
00:40:24
open AI including donations and
00:40:26
commitments of over a billion dollars
00:40:28
although we expect that to only be a
00:40:31
tiny fraction of what we will spend in
00:40:33
the next few years
00:40:34
which is a really interesting kind of if
00:40:37
you look back historical perspective on
00:40:39
how this thing all started seven years
00:40:40
ago and how quickly it's evolved as you
00:40:43
point out
00:40:43
into the necessity to have a real
00:40:45
commercial alignment to drive this thing
00:40:47
forward without seeing any of these
00:40:49
models open sourced and during that same
00:40:51
same period of time we've seen Google
00:40:54
share AlphaFold and share a number of
00:40:57
tensorflow predictive models and and
00:40:59
Tool kits and make them publicly
00:41:00
available and put them in Google's cloud
00:41:03
and so there's both kind of tooling and
00:41:06
models and outputs of those models that
00:41:08
Google has open sourced and made freely
00:41:10
available and meanwhile openai has kind
00:41:13
of diverged into this deeply profitable
00:41:15
profit-seeking kind of Enterprise model
00:41:17
and when you invest in open AI
00:41:20
in the round that they did before you
00:41:23
could generate a financial return capped
00:41:25
at 100x which is still a pretty amazing
00:41:27
Financial return you put a billion
00:41:28
dollars in you can make 100 billion
00:41:30
dollars that's that's funding a real
00:41:32
commercial Endeavor at that point well
00:41:34
and then to it is the most striking
00:41:37
question about this whole thing about
00:41:38
what's going on in AI and it's one
00:41:40
that elon's talked about publicly and
00:41:42
others have kind of sat on one side or
00:41:44
the other which is that AI
00:41:47
offers a glimpse into one of the biggest
00:41:50
and most kind of existential threats to
00:41:52
humanity and the question we're all
00:41:55
going to be tackling and the battle
00:41:56
that's going to be happening politically
00:41:58
and Regulatory wise and perhaps even
00:42:00
between nations in the years to come is
00:42:03
who owns the AI who owns the models what
00:42:05
can they do with it and what are we
00:42:07
legally going to be allowed to do with
00:42:08
it and this is a really important part
00:42:09
of that story yeah to build on what
00:42:11
you're saying I just put in PyTorch
00:42:13
people don't know that's another
00:42:14
framework uh p-y-t-o-r-c-h
00:42:18
this was you know largely built inside
00:42:20
of Facebook and then Facebook said hey
00:42:22
we want to democratize machine learning
00:42:24
and they made and I think they put a
00:42:26
bunch of Executives they may have even
00:42:28
funded those Executives to go work on
00:42:29
this open source project so they have a
00:42:32
huge stake in this and they went
00:42:35
very uh open source with it and then
00:42:37
tensorflow which you have an investment
00:42:40
in Chamath tensorflow was inside I don't
00:42:44
I don't have a investment in tensorflow
00:42:46
we've no tensorflow the public Source
00:42:48
came out of Google and then you invested
00:42:50
in another company but we we're building
00:42:51
silicon for machine learning that's
00:42:53
different right but it's based on
00:42:55
tensorflow
00:42:56
no no no no no it the the the founder of
00:42:59
this company was the founder of
00:43:01
tensorflow oh got it okay all right no
00:43:03
sorry not tensorflow pardon me of the
00:43:05
of TPU which was Google's internal
00:43:07
silicon that they built to accelerate
00:43:10
tensorflow right if that makes sense
00:43:13
and so that's the I you know I don't
00:43:15
mean to be cynical about the whole
00:43:18
project or not it's just the confounding
00:43:20
part of this of what is happening here
00:43:21
it reminds me I don't know if you
00:43:23
remember this the biggest opportunity
00:43:25
here the biggest opportunity here is for
00:43:27
Facebook I mean they need to get in this
00:43:29
conversation ASAP I mean to to think
00:43:32
that like look pytorch was like a pretty
00:43:35
seminal piece of technology that a lot
00:43:37
of folks in in Ai and machine learning
00:43:39
were using for a long time
00:43:42
tensorflow before that and what's so
00:43:45
funny about like Google and Facebook is
00:43:46
they're a little bit kind of like
00:43:48
they're not really making that much
00:43:50
progress I mean Facebook released this
00:43:53
kind of like rando version of AlphaFold
00:43:55
recently it's not that good
00:44:00
I think these companies really need to
00:44:02
get these products in the wild as soon
00:44:04
as possible it cannot be the case that
00:44:07
you have to email people and get on some
00:44:09
list I mean this is Google and Facebook
00:44:10
guys come on this is the I think the big
00:44:14
innovation of open AI sax to bring you
00:44:18
in the conversation they actually made
00:44:20
an interface and let the public play
00:44:22
with it to the tune of three million
00:44:24
dollars a day in Cloud credits or costs
00:44:28
which by the way just on that my my son
00:44:31
was telling me he's like hey Dad do you
00:44:33
want me to tell you when the best time
00:44:34
to use chat GPT is I'm like huh he's
00:44:36
like yeah my friends and I have tried
00:44:38
we've been using it so much we know now
00:44:40
when we can actually get resources oh
00:44:43
wow and it's such an interesting thing
00:44:45
where like a 13 year old kid knows you
00:44:47
know when it's mostly compute intensive
00:44:49
that it's unusable and when to come back
00:44:51
and use it when's the last time Sachs a
00:44:54
technology became this mainstream and
00:44:56
captured people's imagination this
00:44:58
broadly
00:45:01
maybe the iPhone or something yeah yeah
00:45:04
look it's it's powerful there's there's
00:45:06
no question it's powerful I mean I I'm
00:45:07
of two minds about it because whenever
00:45:10
something is the hype cycle I just
00:45:11
reflexively want to be skeptical of it
00:45:13
but on the other hand we have made a few
00:45:15
investments in this area and I mean I
00:45:19
think it is powerful and it's it's going
00:45:21
to be an enabler of some really cool
00:45:22
things to come there's no question about
00:45:24
it I have two pieces of more Insider
00:45:27
information uh one I have a chat GPT iOS
00:45:30
app on my phone one of the nice Folks at
00:45:32
openai included me in the test flight
00:45:35
and it's the simplest interface you've
00:45:37
ever seen but basically you type in your
00:45:39
question but it keeps your history and
00:45:41
then you can search your history so it
00:45:42
looks sacks like you're in iMessage
00:45:44
basically and it has your threads and so
00:45:47
I asked them hey what are the best
00:45:48
restaurants in Yountville yeah a town
00:45:51
near um Napa and then I said which one
00:45:54
has the best duck and it literally like
00:45:56
gave me a great answer and then I
00:45:58
thought wait a second why is this not
00:45:59
using a Siri or Alexa-like interface and
00:46:03
then why isn't it oh here's a video of
00:46:04
it I gave the video to Nick by the way
00:46:06
Jason this what you're doing right now
00:46:08
is you're creating a human feedback
00:46:10
reinforcement learning pipeline for chat
00:46:12
GPT so just the fact that you asked that
00:46:15
question and you know over time if chat
00:46:18
GPT has access to your GPS information
00:46:20
and then knows that you went to
00:46:22
restaurant A versus B they can intuit
00:46:25
and it may actually prompt you to ask
00:46:27
hey Jason we noticed you were in the
00:46:29
area did you go to Bottega if you did
00:46:32
how would you rate it one through five
00:46:33
that reinforcement learning now allows
00:46:36
the next person that asks what are the
00:46:37
top five restaurants to say well you
00:46:39
know over a thousand people that have
00:46:41
asked this question here's actually the
00:46:43
best answer versus a generic rank of the
00:46:45
open web which is what the first data
00:46:47
set is that's what's so interesting
00:46:50
about this so this is why if you're a
00:46:52
company that already owns the eyeballs
00:46:54
you have to be running to get this stuff
00:46:56
out there well and then this answer uh
00:46:59
and you know cited Yelp well this is the
00:47:01
first time I've actually seen chat GPT
00:47:03
cite and this is I think a major legal
00:47:06
breakthrough it didn't put a link in but
00:47:08
if it's going to use yelp's data I don't
00:47:11
know if they have permission for that but
00:47:12
it's quoting Yelp here it should link to
00:47:14
French Laundry Bottega and Bouchon
00:47:16
Bouchon actually has the best confit
00:47:18
for the record and I did have that duck
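The feedback loop Chamath describes above — prompt the user to rate the place they actually visited, store that rating against the question, and use the aggregate to rank answers for the next person — can be sketched roughly like this. All the class names, queries, and data below are invented for illustration; a production RLHF pipeline would feed this signal into model fine-tuning rather than a simple re-ranker.

```python
# Rough sketch of a human-feedback pipeline: collect user ratings tied
# to a query, then rank candidate answers for the next user by average
# rating instead of a generic open-web relevance score.
from collections import defaultdict

class FeedbackStore:
    def __init__(self):
        # (query, answer) -> list of 1-5 star ratings from real users
        self.ratings = defaultdict(list)

    def record(self, query, answer, stars):
        """Store one user's rating for an answer to a query."""
        assert 1 <= stars <= 5
        self.ratings[(query, answer)].append(stars)

    def rank(self, query, candidates):
        """Order candidate answers by mean user rating, unrated last."""
        def score(ans):
            r = self.ratings.get((query, ans), [])
            return sum(r) / len(r) if r else 0.0
        return sorted(candidates, key=score, reverse=True)

store = FeedbackStore()
q = "best duck near Yountville"
store.record(q, "Bouchon", 5)
store.record(q, "Bouchon", 4)
store.record(q, "Bottega", 3)
print(store.rank(q, ["Bottega", "Bouchon", "French Laundry"]))
# Bouchon rises to the top once enough users have rated it highly
```

This is the sense in which the accumulated feedback, not the base model, becomes the proprietary asset.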
00:47:20
so I asked this afterwards to see you
00:47:23
know in a scenario like this but it
00:47:25
could also if I was talking to it I
00:47:27
could say hey which one has availability
00:47:28
this afternoon or tomorrow for dinner
00:47:31
and make the phone call for me like
00:47:33
Google Assistant does or any number of I
00:47:36
was thinking about next tasks this was
00:47:38
an incredibly powerful display in a 1.0
00:47:41
product I was thinking about what you
00:47:43
said last week and I thought back to the
00:47:46
music industry in
00:47:48
in the world of Napster
00:47:51
and what happened was there was a lot of
00:47:53
musicians I think Metallica being the
00:47:56
most famous One famously suing
00:47:59
Napster because it was like Hey listen
00:48:01
like you're allowing people to take my
00:48:03
content which they would otherwise pay
00:48:05
for there's economic damage that I could
00:48:07
measure that legal argument was
00:48:09
meaningful enough that ultimately
00:48:10
Napster was shut down now there were
00:48:12
other versions of that that folks
00:48:14
created including us at Winamp we
00:48:16
created a headless version of that but
00:48:18
if you translate that problem set here
00:48:21
is there a claim that Yelp can make in
00:48:24
this example
00:48:25
that they're losing money
00:48:27
that you know if you were going through
00:48:29
Google
00:48:31
or if you're going through their app
00:48:33
there's the sponsored link revenue and
00:48:35
the advertising Revenue that they would
00:48:37
have got that they wouldn't get from
00:48:38
here now that doesn't mean that chat GPT
00:48:40
can't figure that out
00:48:42
but it's those kinds of problems that
00:48:43
are going to be a little thorny in these
00:48:45
next few years that have to really get
00:48:48
figured out
00:48:52
reading every review on Yelp about duck
00:48:56
then you could write a blog post in
00:48:58
which you say many reviewers on Yelp say
00:49:01
that Bouchon is the best duck so the
00:49:03
question is like is GPT held to that
00:49:06
standard or exactly or something
00:49:08
different
00:49:09
and as linking to it is linking to it
00:49:11
enough this is the question that I'm
00:49:13
asking I don't know it should be because
00:49:15
I'll argue it should be because if you
00:49:17
look at the four-part test for fair use
00:49:18
which I had to go through because
00:49:20
blogging had the same issue we would
00:49:22
write a blog post and we would mention
00:49:24
Walt Mossberg's review of a product in
00:49:26
somebody else's and then people would
00:49:27
say oh I don't need to read Walt
00:49:28
Mossberg's or need a Wall Street
00:49:29
Journal subscription and we say well
00:49:31
we're doing an original work we're
00:49:32
comparing two or three different
00:49:35
you know human is comparing two or three
00:49:37
different reviews and we're adding
00:49:40
something to it it's a you know it's not
00:49:42
a uh it's not interfering with Walt
00:49:44
Mossberg's ability to get subscribers in
00:49:46
the Wall Street Journal
00:49:47
but the effect on the potential Market
00:49:49
is one of the four tests and uh just
00:49:51
reading from Stanford's uh quote on fair
00:49:54
use another important fair use factor is
00:49:55
whether your use deprives the
00:49:57
copyright owner of income or undermines
00:49:59
a new or potential market for the
00:50:01
copyrighted work depriving a copyright
00:50:04
owner of income is very likely to
00:50:06
trigger a lawsuit this is true even if
00:50:08
you are not competing directly with the
00:50:09
original work
00:50:11
and we'll put the link to Stanford here
00:50:13
this is the key issue and I would not
00:50:15
use Yelp in this example I would not
00:50:18
open the Yelp app Yelp gets no commerce
00:50:20
and Yelp would lose this so chat GPT and
00:50:23
all these Services must use citations of
00:50:26
where they got the original work they
00:50:27
must link to them and they must get
00:50:29
permission that's where this is all
00:50:30
going to shake out
00:50:31
but forget about permission I mean you
00:50:34
can't get a big enough data set if you
00:50:35
have to get permission in advance right
00:50:38
yeah
00:50:39
it's it's going to be the large data
00:50:41
sets Quora Yelp
00:50:43
the App Store reviews Amazon's reviews
00:50:46
so there are large corpuses of data that
00:50:49
you would need like Craigslist has
00:50:50
famously never allowed anybody to scrape
00:50:52
Craigslist the amount of data inside
00:50:54
Craigslist as but one example of a data
00:50:56
set would be extraordinary to build chat
00:50:58
GPT on ChatGPT is not allowed to because
00:51:01
as you brought up robots.txt last week
00:51:06
there's going to need to be an ai.txt
00:51:08
are you allowed to use my data set in Ai
00:51:10
and under and how will I be compensated
00:51:13
for it I'll allow you to use Craigslist
00:51:15
but you have to link to the original
00:51:16
post
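The ai.txt idea floated here can be sketched by analogy to robots.txt. This is purely hypothetical — no such standard existed — and every directive name below is invented for illustration:

```
# Hypothetical ai.txt, modeled on robots.txt (invented syntax,
# not a real standard)
User-agent: *
AI-Training: disallow

User-agent: SomeApprovedCrawler
AI-Training: allow
Attribution: required      # generated answers must link to the source post
Compensation: negotiated   # terms for using the corpus commercially
```

Like robots.txt, this would be purely advisory: it only works if model trainers agree to check for it and honor it.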
00:51:18
and you have to note that the other gray
00:51:21
area that isn't there today but may
00:51:23
emerge is when section 230 gets
00:51:26
Rewritten because if they take the
00:51:28
protections away for the Facebook and
00:51:30
the Googles of the world for the
00:51:31
basically for being an algorithmic
00:51:33
publisher and saying an algorithm is
00:51:36
equivalent to a publisher
00:51:38
what it's essentially saying is that an
00:51:40
algorithm is kind of like doing the work
00:51:41
of a human in a certain context and I
00:51:43
wonder whether that's also an angle here
00:51:45
which now this algorithm which today
00:51:47
David you use you said the example I
00:51:50
read all these blog posts I write
00:51:51
something
00:51:52
but if an algorithm does it maybe can
00:51:54
you then say no actually there was
00:51:56
intent there that's different than if a
00:51:58
human were to do it I don't know my
00:51:59
point is
00:52:01
very complicated issues that are going
00:52:03
to get sorted out and I think the
00:52:06
problem with the hype cycle
00:52:08
is that you're going to have to marry it
00:52:09
with an economic model for VCS to really
00:52:11
make money
00:52:12
and right now there's just too much
00:52:14
betting on the come so to the extent
00:52:15
you're going to invest it makes sense
00:52:17
that you put money into open AI because
00:52:19
that's safe
00:52:21
because the economic model of how you
00:52:23
make money for everybody else is so
00:52:24
unclear right oh it's clear actually I
00:52:27
have it for business
00:52:28
I just signed up for chat GPT premium
00:52:31
they had a survey that they shared on
00:52:34
their Discord server and I filled out
00:52:35
the survey and they did a price
00:52:36
Discovery survey uh Friedberg what's the
00:52:39
least you would pay the most you would
00:52:40
pay what would be too cheap of a price
00:52:42
for chat GPT Pro and what would be too
00:52:45
high of a price I put in like 50 bucks a
00:52:47
month would be what I would pay but
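The four questions described here (too cheap, least you'd pay, most you'd pay, too expensive) are the classic Van Westendorp price sensitivity survey. A minimal sketch of how answers like these might be turned into an acceptable price band — all thresholds below are made-up illustrative responses, not OpenAI's actual data:

```python
# Sketch of a Van Westendorp-style price sensitivity analysis,
# like the chat GPT Pro survey described here. Numbers are invented.

def share(thresholds, pred):
    """Fraction of respondents whose threshold satisfies pred."""
    return sum(pred(t) for t in thresholds) / len(thresholds)

# Hypothetical per-respondent thresholds, in $/month.
too_cheap     = [5, 10, 10, 15, 20]    # suspiciously cheap below this
too_expensive = [30, 50, 50, 60, 100]  # too expensive above this

def acceptable_range(candidates):
    """Prices that fewer than half of respondents reject
    as either too cheap or too expensive."""
    ok = [c for c in candidates
          if share(too_cheap, lambda t: c < t) < 0.5
          and share(too_expensive, lambda t: c > t) < 0.5]
    return min(ok), max(ok)

lo, hi = acceptable_range(range(1, 151))
print(lo, hi)  # prints: 10 50
```

With these hypothetical thresholds the acceptable band comes out to roughly $10 to $50 a month, which happens to bracket the $50 answer given in the episode.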
00:52:49
I was just thinking imagine chat GPT
00:52:51
allowed you Friedberg to have a Slack
00:52:53
Channel called research and you could go
00:52:55
in there or anytime you're in slack you
00:52:57
do slash chat or slash chat GPT and you
00:53:00
say slash chatgpt tell me you know what
00:53:03
are the venues available in which we did
00:53:06
this actually for
00:53:08
I did this for values for on summer I
00:53:10
could say what are the venues that seat
00:53:11
over 3,000 people in Vegas and it just
00:53:14
gave us the answer okay well that was
00:53:16
the job of uh of the local event planner
00:53:20
they had that list now you can pull that
00:53:22
list from a bunch of different sources I
00:53:25
mean what would you pay for that a lot
00:53:28
well I think one of the big things
00:53:30
that's happening is all the old business
00:53:33
models don't make sense anymore
00:53:36
in a world where the software is no
00:53:39
longer just doing what it's done for the
00:53:41
last 60 years which is what is
00:53:43
historically defined as information
00:53:45
retrieval so you have this kind of
00:53:47
hierarchical storage of data that you
00:53:51
have some index against and then you go
00:53:53
and you search and you pull data out and
00:53:55
then you present that data back to the
00:53:57
customer or the user of the software and
00:53:59
that's effectively been how
00:54:01
all kind of data
00:54:04
has been utilized in all systems for the
00:54:07
past 60 years in Computing largely what
00:54:11
we've really done is kind of built an
00:54:12
evolution of application layers or
00:54:14
software tools to interface with the
00:54:16
fetching of that data the retrieval of
00:54:18
that data and the display of that data
00:54:20
but what these systems are now doing
00:54:22
what AI type systems or machine Learning
00:54:24
Systems now do is the synthesis of that
00:54:27
data and the representation of some
00:54:30
synthesis of that data to you the user
00:54:32
in a way that doesn't necessarily look
00:54:35
anything like the original data that was
00:54:37
used to make that synthesis and that's
00:54:39
where business models like a Yelp for
00:54:41
example or like a web crawler that
00:54:44
crawls the web and then presents web
00:54:45
page directories to you those sorts of
00:54:48
models no longer make sense in a world
00:54:50
where the software the signal to noise
00:54:51
is now greater the signal is greater
00:54:53
than the noise in being able to present
00:54:55
to you a synthesis of that data and
00:54:58
basically resolve what your objective is
00:55:00
with your own consumption and
00:55:02
interpretation of that data which is how
00:55:04
you historically use these systems and I
00:55:07
think that's where there's you know
00:55:08
going back to the question of the hype
00:55:09
cycle
00:55:10
I don't think it's about being a hype
00:55:12
cycle I think it's about the the
00:55:13
investment opportunity against
00:55:15
fundamentally rewriting all compute
00:55:17
tools because if all compute tools
00:55:19
ultimately can use this capability in
00:55:22
their interface and in their modeling
00:55:24
then it very much changes everything and
00:55:26
one of the advantages that I think
00:55:27
businesses are going to latch onto which
00:55:30
we talked about historically is novelty
00:55:32
in their data in being able to build new
00:55:34
systems and new models
00:55:36
that aren't generally available example
00:55:39
in biotech and Pharma for example
00:55:42
having
00:55:43
screening results from from very
00:55:46
expensive uh experiments and running
00:55:48
lots of experiments and having a lot of
00:55:50
data against those experiments gives a
00:55:52
company an advantage
00:55:55
in being able to do things like drug
00:55:56
discovery we're going to talk about
00:55:57
that in a minute versus everyone using
00:56:00
publicly known screening libraries or
00:56:02
publicly available protein modeling
00:56:04
libraries and then screening against
00:56:06
those and then everyone's got the same
00:56:07
candidates and the same targets in the
00:56:09
same you know kind of clinical
00:56:10
objectives that they're going to try and
00:56:12
resolve from from that output so um so I
00:56:14
think novelty and data
00:56:16
is is one way that Advantage uh kind of
00:56:19
arises but really you know that's just
00:56:21
kind of you know where's there an edge
00:56:23
but fundamentally every business model
00:56:25
can and and will need to be Rewritten
00:56:27
that's
00:56:28
dependent on the historical
00:56:31
on the legacy of kind of information
00:56:33
retrieval as the core of what Computing
00:56:35
is used to do Sacks on my other
00:56:38
podcast I was having a discussion with
00:56:39
Molly about the legal profession
00:56:42
what impact would it be if chat GPT took
00:56:45
every court case every argument every
00:56:47
document and somebody took all of those
00:56:51
legal cases on the legal profession uh
00:56:55
and then the filing of a lawsuit the
00:56:57
defending of a lawsuit
00:56:58
public defenders prosecutors but what
00:57:02
data could you figure out like if and
00:57:04
like just to think in recent history
00:57:06
look at Chesa Boudin you could
00:57:08
literally take every case every argument
00:57:10
he did put it through it and say you
00:57:13
know versus an outcome in another state
00:57:14
and you could uh figure out what's
00:57:17
actually going on with this technology
00:57:19
what impact could this have on the legal
00:57:21
field that you are a non-practicing
00:57:23
attorney you have a legal degree I I
00:57:25
never practiced other than one summer at
00:57:28
a law firm but no I think did you pass
00:57:30
the bar I did pass the bar yes yeah
00:57:32
yes I did try yes of course of course
00:57:35
yeah come on here
00:57:38
I went to Stanford dude
00:57:41
I may not have passed the bar but I know
00:57:44
a little [ __ ] enough to know that you
00:57:46
can't look I I I would be curious in
00:57:48
terms of a very common question that a
00:57:53
an Associated Law Firm would get asked
00:57:55
would be something like you know
00:57:56
summarize the legal precedence in favor
00:57:59
of X right and that you can imagine GPT
00:58:04
doing that like instantly now I think
00:58:06
that the question about that I think
00:58:08
there's two questions one is
00:58:10
can you prompt GPT in the right way to
00:58:12
get the answer you want and I think you
00:58:14
know Chamath you shared a really
00:58:15
interesting video showing that people
00:58:17
are developing some skills around
00:58:19
knowing how to ask GPT questions
00:58:22
engineering away exact prompt
00:58:24
engineering why because GPT is a command
00:58:27
line interface so if you ask GPT a
00:58:29
simple question about what's the best
00:58:30
restaurant in you know Napa it knows how
00:58:33
to answer that but there are much more
00:58:35
complicated questions that you kind of
00:58:37
need to know how to prompt it in the
00:58:38
right way so it's not clear to me that a
00:58:41
command line interface is the best way
00:58:42
of doing that I could imagine apps
00:58:44
developing that create more of like a
00:58:45
GUI so we're an investor for example on
00:58:47
copy AI which is doing this for
00:58:49
copywriters and marketers helping them
00:58:51
write blog posts and emails and so you
00:58:55
know imagine putting that like you know
00:58:57
GUI on top of chat GPT they've already
00:59:00
been kind of doing this so I think
00:59:03
that's part of it I think the other part
00:59:04
of it is on the answer side
00:59:07
you know how accurate is it because in
00:59:11
some professions having 90 or 95 or 99
00:59:14
accuracy is okay but in other
00:59:17
professions you need six nines accuracy
00:59:19
meaning
00:59:21
99.9999% accuracy okay so I think for a
00:59:26
lawyer going into court you know you
00:59:29
probably need
00:59:30
I don't know I mean it depends on on the
00:59:32
the ticket versus a murder trial is two
00:59:35
very different yeah exactly so is 99
00:59:38
accuracy good enough is 95 accuracy good
00:59:41
enough I would say probably for a court
00:59:44
case 95 is probably not good enough I'm
00:59:46
not sure GPT is at even 95 yet
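The six-nines point compounds fast. A rough back-of-envelope sketch (illustrative numbers only, not a claim about any actual model) of why per-answer accuracy has to be extreme when one legal document contains many independent claims:

```python
# If a brief cites n authorities and each citation is independently
# correct with probability p, the chance the whole brief is error-free
# is p**n — which collapses quickly for modest p. Illustrative only.

def flawless_prob(p, n=20):
    """Probability that all n independent items are correct."""
    return p ** n

for p in (0.95, 0.99, 0.999999):  # 95%, 99%, "six nines"
    print(f"{p}: {flawless_prob(p):.3f}")
```

At 95% per citation, a 20-citation brief is fully correct only about a third of the time — which is why the associate-drafts, human-validates workflow discussed next is the plausible near-term shape.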
00:59:50
but could it be helpful like could could
00:59:52
the associates start with chat GPT get
00:59:55
an answer and then validate it probably
00:59:58
yeah if you had a bunch of Associates
01:00:00
bang on some law model for a year
01:00:04
again that's that reinforcement learning
01:00:06
we just talked about I think you'd get
01:00:08
Precision recall off the charts and it
01:00:10
would be perfect by the way just a cute
01:00:12
thing I don't know if you guys got this
01:00:14
email it came about an hour ago from
01:00:16
Reid Hoffman and Reid said to me I
01:00:19
I created fireside chat Bots a special
01:00:23
podcast mini-series where I will be
01:00:25
having a set of conversations with chat
01:00:27
GPT so you can go to YouTube by the way
01:00:30
and see Reid having and he's a very
01:00:33
smart guy so this should be kind of cool
01:00:35
and by the way chat GPT will have an AI
01:00:37
generated voice powered by the
01:00:41
text-to-speech platform play.ht
01:00:44
go to YouTube if you want to see Reid
01:00:46
have a
01:00:47
kind of a conversation with chat GPT I
01:00:50
mean we have a conversation with the two
01:00:53
Davids every week what's the difference
01:00:54
we know how this is going to turn out
01:00:57
hey but actually so synthesizing Chamath's
01:01:00
point about reinforcement learning with
01:01:02
something you said jaycal in our chat
01:01:04
which I actually thought was pretty
01:01:06
smart well that's a first yeah so I'm
01:01:08
gonna give you credit here because I
01:01:10
don't think you've said it on this this
01:01:12
episode which is you said that these
01:01:15
open AI capabilities are eventually
01:01:17
going to become commoditized or
01:01:18
certainly much more widely available I
01:01:21
don't know if that means that they'll be
01:01:22
totally commoditized or it'll be four
01:01:23
players but there'll be multiple players
01:01:25
that offer them and you said the real
01:01:27
Advantage will come from
01:01:29
applications that are able to get a hold
01:01:32
of proprietary data sets
01:01:34
and then use those proprietary data sets
01:01:36
to generate insights and then layering
01:01:38
on what Chamath said about
01:01:39
reinforcement learning if you can be the
01:01:41
first out there in a given vertical with
01:01:44
a proprietary data set and then you get
01:01:46
the advantage the mode of reinforcement
01:01:48
learning that would be the way to create
01:01:50
I think a sustainable business just to
01:01:52
build on what you said this week is the
01:01:54
JPMorgan conference
01:01:56
Friedberg mentioned it last week I had
01:01:59
dinner on Wednesday with this really
01:02:00
interesting company
01:02:01
based in Zurich and what they have is
01:02:05
basically a library of ligands right and
01:02:07
so these ligands are used as a substrate
01:02:10
to deliver all kinds of molecules inside
01:02:12
the body and what's interesting is that
01:02:14
they have a portfolio of like a thousand
01:02:16
of these but really what they have is
01:02:19
they have all the nuclear medicine about
01:02:21
whether it works so you know they're
01:02:24
they target glioblastoma
01:02:26
and so all of a sudden they can they can
01:02:28
say well this ligand can actually cross
01:02:30
the blood-brain barrier and get to the
01:02:31
brain they have an entire data set of
01:02:34
that and a whole bunch of nuclear
01:02:35
imagery around that they have something
01:02:36
for squamous cell carcinoma so then they
01:02:39
have that data set so to your point
01:02:41
that's really valuable because that's
01:02:43
real work that Google or Microsoft or
01:02:47
openai
01:02:49
won't do right and if you have that and
01:02:52
you bring it to the problem you can
01:02:54
probably make money you know there's a
01:02:56
business there to be built just building
01:02:58
on this conversation I just realized
01:03:00
like a great prompt engineer is going to
01:03:03
become a title and an actual skill the
01:03:05
ability to interface with these here you
01:03:07
go hey guys
01:03:11
Lanier somebody who is very good at
01:03:13
talking to these you know instances and
01:03:16
maximizing the result for them and
01:03:19
refining the results for them just like
01:03:20
a detective who asks great questions
01:03:22
that person is going to be 10 or 20
01:03:25
times more valuable they could be the
01:03:28
proverbial 10x engineer in the future of
01:03:31
as a as in a company and as we talk
01:03:34
about austerity and doing more with less
01:03:36
and the 80 percent less people running Twitter
01:03:38
now or Amazon laying off 18,000 people
01:03:41
Salesforce laying off 8,000 Facebook
01:03:43
laying off 10,000 and probably another
01:03:45
10,000
01:03:46
what catalytic effect like could this
01:03:49
have we could be sitting here in three
01:03:51
or four or five years and instead of
01:03:53
running a company like Twitter with 80 percent
01:03:54
less people maybe you could run it with
01:03:56
98 percent less people look I think
01:03:58
directionally it's the the right
01:04:00
statement I mean you know I've I've made
01:04:02
the statement a number of times that I
01:04:04
think we moved from this idea of
01:04:06
creator economy to narrator economy
01:04:08
where historically was kind of Labor
01:04:10
economy where humans use their physical
01:04:12
labor to do things than we were
01:04:14
knowledge workers we used our our brains
01:04:16
to to make things
01:04:19
and then ultimately we kind of I think
01:04:21
resolved to this narrator economy where
01:04:23
the the way that you kind of can State
01:04:25
intention and better manipulate the
01:04:27
tools to drive your intentional outcome
01:04:29
the more successful you're going to be
01:04:31
and you can kind of think about this as
01:04:32
being the artist of the past Da Vinci
01:04:34
was um what made him so good was he was
01:04:38
technically incredible at trying to
01:04:41
reproduce a photographic like imagery
01:04:44
using uh using paint and there's these
01:04:47
really great kind of Museum exhibits on
01:04:48
how he did it using these really
01:04:50
interesting kind of like split mirror
01:04:53
systems and then the the better the
01:04:56
artist of the 21st century the 20th
01:04:58
century was the best user of Adobe
01:05:00
Photoshop and that person is not
01:05:02
necessarily the best painter and the
01:05:04
artist of the 22nd century isn't going
01:05:07
to look like the Photoshop
01:05:08
expert and it's not going to look like
01:05:10
the the painter it's going to look like
01:05:12
something entirely different it could be
01:05:14
who's got the most creative imagination
01:05:16
in driving the software to drive new
01:05:18
outcomes and I think that the same
01:05:20
analogy can be used across every Market
01:05:21
in every industry however one thing to
01:05:24
note jaycal it's it's not about
01:05:26
austerity because the the Luddite
01:05:28
argument is when you have new tools and
01:05:31
you get more leverage from those tools
01:05:33
you have less work for people to do and
01:05:35
therefore everyone suffers the reality
01:05:37
is new work emerges and New
01:05:39
Opportunities emerge and we level up as
01:05:41
a species and when we level up we all
01:05:43
kind of fill the gaps and expand our
01:05:45
productivity and our capability set I
01:05:47
thought what J Cal was saying was more
01:05:49
that Google will be smaller didn't mean
01:05:51
that the pie wouldn't grow it's just
01:05:52
that that individual company is run
01:05:55
differently but there will be hundreds
01:05:56
of more companies or thousands more
01:05:58
Millions more yeah that's sort of I have
01:06:00
an actual punch-up for you yeah instead
01:06:03
of narrative it's the conductor economy
01:06:05
it's you're you're conducting a symphony
01:06:07
oh a punch up
01:06:09
punch up there but I do think like we're
01:06:11
gonna there's gonna be somebody who's
01:06:12
sitting there like remember Tom Cruise in
01:06:14
Minority Report as a detective was
01:06:16
moving stuff around with an interface the
01:06:18
in yeah you know with the gloves and
01:06:20
everything this is kind of that
01:06:22
manifested you could even if you're not
01:06:24
an attorney you could say hey I want to
01:06:26
sue this company for copyright
01:06:27
infringement give me my best arguments
01:06:29
and then on the other side Say Hey I
01:06:31
want to know what the next three
01:06:32
features I should put into my product is
01:06:34
can you examine who are my top 20
01:06:36
competitors and then who have they hired
01:06:38
in the last six months and what are
01:06:39
those people talking about on Twitter
01:06:40
you could have this conductor you know
01:06:43
who becomes really good at that
01:06:46
um
01:06:47
yeah the leveling up that happens in the
01:06:50
book Ender's Game I think is a good
01:06:51
example of this where the guy goes
01:06:53
through the entire kind of ground up and
01:06:55
then ultimately he's commanding armies
01:06:57
of spaceships and space and his
01:07:00
orchestration of all of these armies is
01:07:02
actually the skill set that wins the war
01:07:04
yeah you predicted that there would be
01:07:07
like all these people that create these
01:07:08
next-gen forms of content
01:07:10
but I think this Reid Hoffman thing
01:07:12
could be pretty cool like what if he
01:07:13
wins a Grammy for his you know computer
01:07:16
created podcast mini series that's one
01:07:19
thing the thing I'm really excited about
01:07:21
is when's the first AI novel gonna get
01:07:23
published by a major publisher I think
01:07:24
it happens this year when's the first AI
01:07:26
Symphony gonna get performed by a major
01:07:28
Symphony Orchestra and when's the first
01:07:30
AI generated screenplay get turned into
01:07:32
an AI generated 3D movie that we all
01:07:34
watch and then the more exciting one I
01:07:36
think is when do we all get to make our
01:07:37
own AI video game where we instruct the
01:07:40
video game platform what world we want
01:07:41
to live in I don't think that's
01:07:42
happening for the next three or four
01:07:44
years but when it does I think
01:07:45
everyone's got these new immersive
01:07:46
environments that they can live in I
01:07:48
have a question when you know when I say
01:07:50
live in I mean video game wise yeah
01:07:51
sorry go ahead when you have when you
01:07:53
have these computer systems just like to
01:07:55
use a question of game theory for a
01:07:57
second they're these models are
01:07:59
iterating rapidly these are all
01:08:00
mathematical models so inherent in let's
01:08:03
just say this the perfect answer right
01:08:06
like if you had perfect Precision recall
01:08:09
[inaudible]
01:08:11
models get there at a system-wide level
01:08:13
everybody is is sort of like they get to
01:08:15
the game theory optimal they're all at
01:08:18
Nash equilibrium right all these systems
01:08:20
working at the same time then the real
01:08:22
question would then be what the hell do
01:08:25
you do then because if you keep getting
01:08:27
the same answer if everybody then knows
01:08:28
how to ask the exact right question and
01:08:31
you start to go through these iterations
01:08:32
where you're like maybe there is a
01:08:34
dystopian hellscape where there are no
01:08:36
jobs maybe that's the Elon World which
01:08:39
is you can you can recursively
01:08:42
find a logical argument where there is
01:08:45
no job that's possible
01:08:47
right and now I'm not saying that that
01:08:49
path is the likely path but I'm saying
01:08:51
it is important to keep in mind that
01:08:52
that path of outcomes is still very
01:08:55
important to keep in the back of our
01:08:57
mind as we figure these things out well
01:08:58
Friedberg you know you were asking
01:09:00
about this like you know will more work
01:09:03
be created of course artistic Pursuits
01:09:05
podcasting is a job now being an
01:09:07
influencer is a job yada yada new things
01:09:09
emerge in the world but here in the
01:09:11
United States in 1970 I'm looking at um
01:09:14
Fred I'm looking at the St Louis Fred
01:09:16
1970 26.4 percent of the country was
01:09:20
working in a factory was working in
01:09:22
manufacturing you want to guess what
01:09:24
that is in 2012
01:09:27
sorry what percentage it was 26 in 1970
01:09:32
and in 2015 when they stopped the
01:09:34
percentage in manufacturing United
01:09:35
States they discontinued this it was
01:09:37
10 percent so it's possible we could just see
01:09:40
you know
01:09:41
the concept of office work the concept
01:09:44
of knowledge work is going to follow
01:09:46
pretty inevitably the path of
01:09:49
manufacturing that that seems like a
01:09:51
pretty logical Theory or no
01:09:54
I think we should move on but yes okay
01:09:57
so how would we like to ruin the show
01:09:59
now should we talk about Biden and uh
01:10:01
the documents and ruin the show with
01:10:03
political talk or should we talk about
01:10:07
since it's been such a great episode so
01:10:10
far what do we want to talk about next
01:10:12
again a couple of choices don't want to
01:10:13
talk about
01:10:16
um
01:10:18
give it to him
01:10:23
hold on a second we all know jaycal that
01:10:26
according to you when a president is in
01:10:29
possession of classified documents in
01:10:33
his home yes that apparently have been
01:10:35
taken in an unauthorized manner basically
01:10:37
stolen he should have his home raided by
01:10:39
the FBI almost close close yeah if
01:10:44
so anyway
01:10:45
the Biden uh as of the taping of this
01:10:49
has now said there's a third batch of
01:10:52
classified documents this group I guess
01:10:55
there was one at an office one at a
01:10:57
library now this third group is in his
01:10:58
garage with his Corvette certainly not
01:11:01
looking good uh they say they say that
01:11:03
in his defense they say the garage was
01:11:05
locked meaning that uh you could use a
01:11:07
garage door opener to open or close it
01:11:10
so it was locked when it
01:11:11
was closed so we're pretty much as
01:11:13
secure as the documents at Mar-A-Lago
01:11:15
same equivalency no no no actually I
01:11:18
mean just to be perfectly Fair the
01:11:20
documents in Mar-A-Lago were locked in a
01:11:21
basement the FBI came checked it out
01:11:23
said we'd like you to lock those up they
01:11:25
locked them up so got it a little safer
01:11:28
than maybe a little safer then
01:11:30
but functionally the same functional
01:11:33
functionally the same uh the only
01:11:34
difference here would be what sacks when
01:11:37
you look at these two cases
01:11:38
well that in one case Merrick Garland has
01:11:42
appointed an independent counsel to
01:11:43
investigate Trump and there's no such
01:11:45
special counsel or investigator
01:11:47
appointed to investigate Biden I mean
01:11:49
yeah these things are functional put
01:11:51
somebody on it though
01:11:53
I don't think they've appointed a
01:11:55
special counsel yet no they did as of an
01:11:57
hour ago a special counsel was appointed
01:12:00
did that just happen yeah one hour ago
01:12:02
uh Robert Hur is his name okay I guess
01:12:04
there are real questions to look into
01:12:06
here the documents apparently were
01:12:08
removed twice why were they moved who
01:12:11
ordered that what was a classified
01:12:13
document doing in Biden's personal
01:12:15
Library what do the documents pertain to
01:12:19
do they touch on the Biden families
01:12:21
business dealings in Ukraine and China
01:12:23
so there are real things to look into
01:12:25
here but let me just take a step back
01:12:28
now that the last three presidential
01:12:31
candidates have been ensnared in these
01:12:33
classified document problems remember
01:12:36
it's Biden now and then Trump and
01:12:39
Hillary Clinton before Trump I think
01:12:42
it's time to step back and ask are we
01:12:45
over classifying documents I mean are we
01:12:48
fetishizing these documents are they all
01:12:50
really that sensitive it seems to me
01:12:52
that we have an over classification
01:12:54
problem meaning that ever since uh FOIA
01:12:58
was passed the Freedom of Information
01:12:59
Act
01:13:00
the government can avoid accountability
01:13:02
and and prying Eyes by simply labeling
01:13:05
any document as classified so
01:13:08
overclassification was a logical
01:13:09
response by the permanent government to
01:13:12
the Freedom of Information Act and now
01:13:14
it's gotten to the point where just
01:13:16
about everything handed to a president
01:13:18
or vice president is classified so I
01:13:21
think I can understand why they're all
01:13:23
making this mistake and I think a
01:13:26
compounding problem is that we never
01:13:27
declassify anything there's still all
01:13:29
these records from the Kennedy
01:13:31
assassination well that's crazy
01:13:33
classified and they're
01:13:35
supposed to have Declassified these the
01:13:37
CIA
01:13:38
keeps filibustering on the release of
01:13:42
the JFK assassination documents and
01:13:45
they've been told they have they have to
01:13:46
stop and they have to release them and
01:13:48
then they keep redacting stuff
01:13:53
conspiracy theorist here but what are
01:13:56
they trying to cover up I mean this is a
01:13:58
long time ago that's the only way to
01:13:59
interpret it
01:14:00
but even for more mundane documents
01:14:02
there are very few documents that need
01:14:04
to be classified after even say five
01:14:06
years you could argue that we should be
01:14:08
automatically declassifying them after
01:14:10
five years unless they go through a
01:14:11
process to get reclassified I mean I'd
01:14:14
say like just you guys in business I
01:14:17
know it's not government and business
01:14:19
how many of the documents that you deal
01:14:21
with are still sensitive are Trade
01:14:23
Secrets five years later
01:14:27
certainly 20 years later they're not
01:14:29
right like okay let's say like five
01:14:31
years I mean
01:14:32
the only documents in business that I
01:14:35
think I deal with that you could call
01:14:37
sensitive are the ones that pertain to
01:14:40
the company's future plans right because
01:14:42
you wouldn't want a competitor
01:14:44
to get those yeah there's a handful of
01:14:46
things legal issues yeah even the cap table
01:14:49
is not that sensitive because by the
01:14:50
time you go public it legally has to
01:14:52
be public yeah at some point like
01:14:54
there's a hundred people who have that I
01:14:55
mean it's exactly so like in business I
01:14:58
think our experience has been there's
01:14:59
very few documents that stay
01:15:02
sensitive that need to remain secret now
01:15:05
look if Biden or Trump or whatever
01:15:07
they're reviewing the schematics to the
01:15:10
javelin missile system or to you know
01:15:13
how we make our nuclear bombs or
01:15:15
something obviously that needs to say
01:15:16
secret forever but I don't believe our
01:15:18
politicians are reviewing those kinds of
01:15:20
documents well I mean we both I don't
01:15:23
really understand what it is that
01:15:24
they're reviewing why are they keeping
01:15:27
needs to be classified five years why
01:15:29
are they keeping them was the issue we
01:15:30
discussed previously we actually agreed
01:15:31
on that I think they're just keeping
01:15:33
mementos I think there's a simple
01:15:35
explanation for why they're keeping them
01:15:36
Jason which is that the that everything
01:15:40
is more classified and there's a zillion
01:15:42
documents and if you look like both
01:15:45
Biden and Trump these documents were
01:15:48
mixed in with a bunch of personal
01:15:49
effects and mementos my point is if you
01:15:52
work in government and handle documents
01:15:55
they're all classified
01:15:57
so I mean if the National Archive asks
01:15:59
for them back or you find them you
01:16:01
should just give them back I mean that
01:16:02
is that's going to wind up being the
01:16:04
rub Trump didn't give them back fair
01:16:06
enough I did so that's the only
01:16:08
difference here well no no hold on the
01:16:10
FBI went to Trump's basement they looked
01:16:11
around they said put a lock on this they
01:16:13
seem to be okay with it initially then
01:16:15
maybe they changed their minds I don't
01:16:16
know I'm not defending Trump it's pretty
01:16:18
clear that he wouldn't give them back
01:16:19
that was the point I'm making is that
01:16:21
now that Biden Trump and Hillary Clinton
01:16:24
have all been ensnared in this is it
01:16:26
time to rethink the fact that we're over
01:16:29
classifying so many documents I mean
01:16:32
just think about the incentives that
01:16:33
that we're creating for our politicians
01:16:36
okay just think about the incentives
01:16:37
number One never use email remember
01:16:40
Hillary Clinton the whole email server
01:16:42
yeah you got to be nuts to use email
01:16:43
number two never touch a document never
01:16:46
touch a document what never let anyone
01:16:48
hand you a document flush them down the
01:16:50
toilet
01:16:51
never let anyone hand you a document
01:16:55
if you're a politician an elected
01:16:58
official yeah the only time you should
01:17:00
ever be handling anything is going to a
01:17:01
clean room right you know make an
01:17:04
appointment go in there read something
01:17:05
don't take notes don't bring a camera
01:17:07
and then leave I mean this is no way to
01:17:10
run a government
01:17:11
it's crazy who does this benefit who
01:17:14
does this benefit it doesn't benefit our
01:17:16
elected officials it makes it almost
01:17:17
impossible for them to act like normal
01:17:19
people that's why it benefits the
01:17:22
Insiders the permanent government you're
01:17:24
missing the most important part about
01:17:25
this Sax
01:17:26
this was if you want to go into
01:17:28
conspiracy theories this was a setup
01:17:30
Biden planted the documents here we go
01:17:33
so that we could create the false
01:17:34
equivalency
01:17:35
and start up Biden versus Trump 2024
01:17:38
this ensures that now Trump has
01:17:41
something to fight with Biden about but
01:17:43
and this is going to help Trump because
01:17:46
they're both tainted equally tainted yes
01:17:48
the same Source
01:17:51
in the news cycle no I think but I think
01:17:54
it's the opposite I think Merrick Garland
01:17:55
now is going to have to drop the
01:17:58
prosecution against Trump for the stolen
01:18:00
documents or at least that part of what
01:18:02
they're investigating him for they might
01:18:04
still investigate him over January 6th
01:18:06
or something but they can't investigate
01:18:07
them
01:18:08
seems more sticky yeah I agree with that
01:18:10
actually I think it's going to be hard
01:18:11
to do but my point is like just think
01:18:13
about look but both sides are engaged in
01:18:15
hyper partisanship the way right now
01:18:18
the conservatives and the right
01:18:20
they're attacking Biden now for the same
01:18:23
thing that the left was attacking Trump
01:18:24
for my point is like just take a step
01:18:27
back and again think about the
01:18:29
incentives we're creating about how to
01:18:31
run our government you can't use email
01:18:32
and you can't touch documents
01:18:37
and by the way if you're don't ever go
01:18:40
into politics if you're a business
01:18:42
person because they'll investigate every
01:18:44
deal you ever did prior to getting into
01:18:47
politics
01:18:48
what are you going to do when you try to
01:18:50
get uh a Treasury position what's
01:18:53
going to be nuts you got to be nuts to
01:18:54
go so you're not going to take a
01:18:56
position
01:18:57
that the Washington insiders by which I
01:19:00
mean the permanent Washington
01:19:01
establishment I.E the Deep state they're
01:19:04
creating a system in which they're
01:19:06
running things and the elected officials
01:19:08
barely can operate like normal
01:19:10
functioning humans that's interesting
01:19:14
I heard a great rumor this is total
01:19:17
gossip mongering oh here we go uh
01:19:21
that you know one of Ken Griffin's best
01:19:24
outs
01:19:25
is to get DeSantis elected so that he
01:19:28
can become treasury secretary I mean Ken
01:19:30
Griffin would get that if he wanted it
01:19:33
and then he would be able to divest all
01:19:35
of Citadel tax-free so he would mark to
01:19:38
market like 30 billion dollars which is
01:19:41
a genius way to go out now then it
01:19:43
occurred to me oh my God that is me and
01:19:46
Sax's path too
01:19:52
why would it be tax-free
01:19:54
when you get appointed to those branch
01:19:57
those senior posts you're allowed to
01:19:59
either stick it in a blind trust or you
01:20:01
can sell with no capital gains
01:20:04
yeah
01:20:06
what well because they want you they
01:20:08
want you to divest
01:20:10
yes anything that presents a conflict
01:20:12
they want you to divest and so the
01:20:13
argument is if you have if you're forced
01:20:16
to divest it to enter government you
01:20:18
shouldn't be forcibly if I become mayor
01:20:20
of San Francisco or Austin
01:20:23
Secretary of Transportation Jay Cal you
01:20:26
can do that oh I'm qualified for that
01:20:27
I've taken the bus I got an electric bike
01:20:30
to answer freeburg's point I think
01:20:31
Citadel Securities there's a lot of
01:20:34
folks that would buy that because that's
01:20:35
just a Securities trading business and
01:20:36
then Citadel the hedge fund probably
01:20:39
something like a big bulge bracket bank
01:20:40
or Blackstone probably Blackstone in
01:20:43
fact because now Blackstone can plug it
01:20:44
into a trillion dollar asset machine
01:20:46
it's a I think there would be buyers out
01:20:49
the door this is this is an incredible
01:20:51
grift now I know why it's not a grift at
01:20:54
all but it's a it's an incredible come
01:20:56
on man a cabinet position for No Cap
01:20:59
gains well that's not a grift that's
01:21:01
like those are the laws they force you
01:21:03
to sell everything
01:21:04
and then you do public service I think
01:21:06
you're I think you're misusing the word
01:21:08
to continue to genuflect to the left
01:21:10
lane
01:21:11
you're being a little defensive
01:21:14
that or you're dumb
01:21:17
I'm not stupid man
01:21:19
that's when I see it you take a cabinet
01:21:21
position
01:21:24
where does that exist yes if you were
01:21:27
asked to serve look any normal person
01:21:29
who wants to serve in government you
01:21:31
can't use email and you can't touch a
01:21:33
document and every deal you've ever done
01:21:35
gets investigated yes yes why would you
01:21:39
want to do it I mean I'm saying that you
01:21:40
get to divest tax-free methinks thou doth
01:21:44
protest too much David the fact that
01:21:46
you two know this rule
01:21:49
no I know I know it it's like a
01:21:51
well-known thing among rich
01:21:53
people I looked up grift and it says it
01:21:58
means to engage in a petty or small
01:22:00
scale Swindle I don't think selling a 31
01:22:02
billion dollar engine no it's not a
01:22:04
combination of BlackRock and Blackstone
01:22:06
would be considered a petty small did
01:22:08
any of you guys watch the Madoff
01:22:10
uh series on Netflix
01:22:13
no was it good no oh my God it is so so
01:22:16
depressing I gotta say like just that
01:22:18
that Madoff series there is no glimmer
01:22:22
of light or hope or positivity or
01:22:26
recourse everyone is a victim everyone
01:22:28
suffers it is just so dark don't watch
01:22:31
it it's so depressing
01:22:34
is so depressing it's so good they all
01:22:36
kill themselves and die and like all the
01:22:39
ones one guy died of cancer Picard
01:22:42
I didn't realize all this the
01:22:43
trustee that went and got the money he
01:22:45
went and got money back from these
01:22:47
people who were 80 years old and retired
01:22:49
and had spent that money decades ago and
01:22:51
he sued them and took their homes away
01:22:52
from them and no one and and they had no
01:22:55
one they were part of the scam no one no
01:22:57
one won it was a brutal awful whole
01:23:00
thing yeah by the way that's going to be
01:23:03
really interesting as we enter this SBF
01:23:05
trial because 100
01:23:06
that that is the track that that is what
01:23:09
happens if you guys southern district of
01:23:11
new York said that this case is becoming
01:23:13
too big for them because all the places
01:23:15
that SBF sent money all those uh PACs
01:23:18
and all those political donations before
01:23:20
they have to go and investigate where
01:23:23
that money went and see if they can get
01:23:25
it back and it's going to open up an
01:23:26
investigation into each one of these
01:23:27
campaign finance and election and kind
01:23:30
of interfering right now
01:23:32
propublica sorry propublica on the other
01:23:35
end of the spectrum I did watch this
01:23:37
weekend Triangle of Sadness have you
01:23:39
guys
01:23:41
okay Triangle of Sadness is great it's so dark to
01:23:45
The Davids uh listen this is one of the
01:23:48
it I thought it was it didn't pay off
01:23:49
the way I thought but this is one of the
01:23:51
best setups you'll see in a movie so
01:23:53
basically it's a bunch of people on an
01:23:56
on a luxury yacht so you have a bunch of
01:23:58
rich people as the guests then you have
01:24:01
the staff that interacts with them
01:24:04
and this is like mostly Caucasian and
01:24:07
then under in the bowels of the ship
01:24:09
what you see are Asian and black workers
01:24:11
that support them okay so the the in
01:24:14
some ways is a little bit of a microcosm
01:24:16
of the world Oh I thought you would say
01:24:18
a microcosm of something else and then
01:24:21
and then what happens is there's like a
01:24:22
shipwreck basically right oh don't spoil
01:24:25
it come on okay and so but no but I'll
01:24:27
just say so so the plot is you have this
01:24:31
Caucasian patriarchy
01:24:33
that that gets flipped upside down because
01:24:36
after the Shipwreck the only person who
01:24:38
knows how to make a fire and catch the
01:24:39
fish is the Filipino woman who is in
01:24:41
charge of cleaning the toilets so she
01:24:43
comes in charge so now you flip
01:24:45
to this immigrant matriarchy pretty great
01:24:48
meditation on class and um survival it's
01:24:51
it's pretty well done it didn't end well
01:24:53
I thought I thought okay yeah well it's
01:24:55
hard to wrap that one up
01:24:57
well you know what they say boys steal a
01:24:59
little and they throw you in jail steal
01:25:00
a lot and they make you King
01:25:02
uh famous Bob Dylan quote uh all right
01:25:05
well this has been a great episode great
01:25:06
to see you besties uh austerity menu
01:25:09
tonight Chamath uh what's on the
01:25:10
austerity menu tonight what are we doing
01:25:12
salad uh some tuna sandwiches no I think
01:25:16
uh I think Kirsten is doing uh I think
01:25:18
dorade
01:25:20
yeah that's yeah that's a good fish
01:25:24
Jake and I once had a great dorade in
01:25:27
Venice in Venice Venice
01:25:31
[Music]
01:25:32
so good I agree when it's done well the
01:25:35
dorade kicks ass there's only one way to
01:25:36
cook dorade do you know what that is
01:25:39
you gotta you gotta it's the way they
01:25:42
did in Venice you got to cook the whole
01:25:43
fish yeah yeah okay and then after you
01:25:46
cook the fish then you de-bone it right
01:25:49
and uh right yeah absolutely that's the
01:25:51
way to do it that was back when sax and
01:25:53
I used to enjoy each other's company
01:25:56
this podcast made us into mortal enemies
01:25:59
check out I'm a little disappointed you
01:26:01
couldn't agree with my take on this
01:26:02
document Scandal instead of dunking in a
01:26:05
partisan way I tried to explain why it
01:26:07
was a problem of our whole political
01:26:09
system I like your theory I I I I think
01:26:12
you know you you keep
01:26:19
I just think your party Crystal a little
01:26:21
bit more
01:26:23
but you know compare your grift are we
01:26:25
gonna play Saturday after the wild card
01:26:26
game are you guys interested in playing
01:26:28
Saturday as well because I got the hall
01:26:29
pass I can I can do a game outside I
01:26:32
don't know I have to check with my boss
01:26:34
who's going to the are you guys walking
01:26:35
on Sax are you gonna come to play poker
01:26:38
at that live stream thing for the day in
01:26:41
a way I doubt it no
01:26:43
oh he doesn't want to interact with
01:26:45
humans that does not uh play well in
01:26:47
confirmation hearings
01:26:49
you know the last time I did one of
01:26:51
those Alan Keating destroyed me on
01:26:53
camera
01:26:55
and every time he bluffed I folded
01:26:59
every time he had the the nuts I called
01:27:01
that's true
01:27:04
a shellacking a classic Sax face-saving
01:27:08
thing is that what's going on here no no
01:27:10
no no it has to do with the cabinet
01:27:13
positions he doesn't need to be seen
01:27:14
recklessly gambling so badly you could
01:27:17
do any cabinet position sacks which one
01:27:19
would it be State
01:27:29
I don't know that those like cabinet
01:27:32
positions are that important I mean they
01:27:34
run these giant bureaucracies that again
01:27:36
are are permanent you can't fire anyone
01:27:38
so if you can't fire a person do they
01:27:41
really report to you right
01:27:44
Trump's idea was like put a bunch of
01:27:46
Hardline CEO type people in charge have
01:27:49
them blow up these things and make them
01:27:50
more efficient it didn't really work did
01:27:52
it yeah well you know why a CEO is
01:27:54
actually in charge like Elon he walks in
01:27:57
if he doesn't like what you're doing
01:27:58
he'll just fire you you can't fire
01:28:00
anyone how do you manage them when they
01:28:02
don't have to listen to anything you say
01:28:04
that's our whole government right now
01:28:07
our cabinet heads are figureheads
01:28:10
for for these departments for these
01:28:12
giant bureaucracies is that a no or is that a yes
01:28:15
you'd still take state
01:28:17
look at that well you know I think I
01:28:19
have another feeling
01:28:21
first what is the best ambassadorship
01:28:24
well you can't you can't divest
01:28:26
everything
01:28:28
you can tell which ambassadorship is the
01:28:30
best one based on how much they charge
01:28:31
for it yeah so I think I think London is
01:28:35
the most expensive I think that one is a
01:28:37
million for London 10 to 15. yeah 10 15
01:28:41
million
01:28:42
that's that's what Sax's fourth least
01:28:45
expensive home cost no no you have to
01:28:48
spend that every year to run it Jason
01:28:49
you only got you gotta pick for him you
01:28:52
could be the ambassador to Guinea or the
01:28:54
ambassador to the UK you get the same
01:28:56
budget actually what's kind of funny is
01:28:57
I know two people who serve as
01:28:59
ambassadors under Trump and it was
01:29:01
really cheap to get those because no one
01:29:02
no one should be part of the Trump
01:29:04
Administration
01:29:06
two for one they were on they were on
01:29:09
fire sale after because of trump who
01:29:12
wants to be tainted but by the way one
01:29:14
of them and you can just beep out the
01:29:16
name
01:29:17
was telling me it was the best thing
01:29:19
because he ended up selling at the
01:29:20
all-time highs to take the job
01:29:22
he was like I gotta get out of all of
01:29:23
his stuff no but listen let me tell you
01:29:25
the the ambassadorships it was it was a
01:29:27
smart trade by those guys because
01:29:29
Ambassador is a lifetime title so your
01:29:31
Ambassador whatever no one remembers who was
01:29:34
president when you were Ambassador no
01:29:35
one cares so you are going for the
01:29:37
Ambassador so stay I think it's fair I
01:29:39
think he's going to Ambassador I'm not
01:29:41
interested in ceremonial things I'm
01:29:44
interested in making an impact
01:29:45
and the problem with all these positions
01:29:48
I mean being a cabinet official is not
01:29:51
much different than being an ambassador
01:29:52
so you're gonna you're gonna enlist in
01:29:55
the navy no what what would it what has
01:29:58
a bigger
01:30:00
impact or being an ambassador who's
01:30:03
more influential Sax on the all in pod
01:30:05
or beep as the ambassador of Sweden
01:30:08
actually all in pod is more
01:30:12
impactful by the way this is why I take
01:30:14
issue with your statement about the term
01:30:16
mainstream media because I think you
01:30:18
have become the mainstream media more
01:30:20
than most of the folks that are
01:30:21
Independent Media we're Independent
01:30:23
Media trust me it's independent things
01:30:24
hang by a thread and stop genuflecting
01:30:33
three episodes I just like saying the
01:30:35
word genuflect you like genuflecting I
01:30:37
know because that is the top word of
01:30:38
2023 so far for me oh is that is
01:30:41
somebody doing an analysis with ChatGPT
01:30:43
of the words used here no but
01:30:44
sax brought that word up it was just so
01:30:46
it's a wonderful word it's it's not used
01:30:48
enough
01:30:49
all right everybody we'll see you next
01:30:51
time on the all-in podcast comments are
01:30:53
turned back on have at it you animals
01:30:56
love you guys love you besties bye
01:31:02
Rain Man
01:31:03
[Music]
01:31:16
besties
01:31:19
[Music]
01:31:24
01:31:27
[Music]
01:31:48
[Music]

Episode Highlights

  • Media Bias and Filtering
    A discussion on why subjects avoid traditional media, fearing biased interpretations.
    “Why would you go through their filter when it's always going to be a hit piece?”
    @ 02m 12s
    January 13, 2023
  • Struggles of Small Business Owners
    The challenges faced by small business owners in urban environments are highlighted.
    “The mortality rate of the small business owner is already 90 percent.”
    @ 16m 16s
    January 13, 2023
  • The Power of Language
    Changing how we refer to those in need can shift policy perspectives.
    “If we refer to these people as untreated persons instead of homeless persons...”
    @ 19m 38s
    January 13, 2023
  • Mental Health Crisis Origins
    The mental health crisis traces back to Reagan's policies in the 1980s.
    “A lot of this crisis in mental health started because Reagan defunded all the psychiatric hospitals.”
    @ 22m 35s
    January 13, 2023
  • The Complexity of AI Investment
    Microsoft's convoluted investment in OpenAI raises questions about the future of AI.
    “This is clearly in that second category: too hard to understand.”
    @ 35m 03s
    January 13, 2023
  • The AI Ownership Debate
    The question of who owns AI and its models poses significant political and regulatory challenges for the future.
    “Who owns the AI? What can they do with it?”
    @ 42m 03s
    January 13, 2023
  • The Future of Business Models
    AI's evolution requires a complete rewrite of traditional business models, as the synthesis of data changes everything.
    “Every business model can and will need to be rewritten.”
    @ 56m 25s
    January 13, 2023
  • The Future of Legal Work
    What if AI could analyze every legal case and argument? The implications for the legal profession are profound.
    “What impact could this have on the legal field?”
    @ 57m 21s
    January 13, 2023
  • The Conductor Economy
    We are moving towards a 'conductor economy' where skillful orchestration of AI tools will be key.
    “Instead of narrative, it's the conductor economy.”
    @ 01h 06m 03s
    January 13, 2023
  • Document Classification Dilemma
    The discussion highlights the absurdity of over-classifying documents in government.
    “Just think about the incentives we're creating for our politicians.”
    @ 01h 16m 29s
    January 13, 2023
  • Madoff Series Review
    A deep dive into the dark themes of the Madoff series on Netflix.
    “It is just so dark, don't watch it!”
    @ 01h 22m 16s
    January 13, 2023
  • Ambassadorships and Impact
    Debate on the value of being an ambassador versus a cabinet official.
    “Being a cabinet official is not much different than being an ambassador.”
    @ 01h 29m 51s
    January 13, 2023

Key Moments

  • Podcast Success @ 01:00
  • Small Business Struggles @ 16:16
  • Mental Health Solutions @ 19:18
  • AI Investment Complexity @ 35:03
  • Business Model Revolution @ 56:25
  • Prompt Engineering @ 1:03:00
  • Conductor Economy @ 1:06:03
  • Ambassadorship Debate @ 1:29:51
