E129: Sam Altman plays chess with regulators, AI's "nuclear" potential, big pharma bundling & more

May 19, 2023 / 01:28:05

This episode covers performance reviews of the podcast hosts, discussions on AI regulation, and recent events in the housing market and crime in major cities. Guests include David Friedberg, David Sacks, and Chamath Palihapitiya.

The hosts conduct a humorous performance review segment based on audience feedback from Reddit, with each host reading critiques about themselves. David Friedberg addresses criticism regarding his views on Social Security, while David Sacks responds to comments about his wealth and perceived intelligence.

They then shift to a serious discussion on AI regulation, focusing on a recent Senate hearing featuring Sam Altman, Gary Marcus, and Christina Montgomery. The hosts debate the implications of regulatory capture and the challenges of overseeing AI technology.

The conversation transitions to the housing market, where they analyze a Gallup survey indicating low consumer confidence in buying homes. Sacks explains how rising interest rates are affecting affordability and market mobility.

Finally, the hosts discuss two controversial incidents involving crime in San Francisco and New York, examining the implications of prosecutorial decisions and societal responses to crime and mental health issues.

TL;DR

Hosts review their performances, discuss AI regulation, housing market trends, and crime incidents in major cities.

Video

00:00:00
I have a little surprise for my besties
00:00:02
everybody's been getting incredible
00:00:04
adulation people have been getting
00:00:06
incredible feedback on the podcast it is
00:00:09
a phenomenon As We Know
00:00:11
and I thought it was time for us to do
00:00:14
performance reviews of each bestie now I
00:00:17
as executive producer I'm not in a
00:00:20
position to do uh your performance
00:00:22
reviews I too need to have a performance
00:00:23
review so I thought I would let the
00:00:25
audience do their performance reviews so
00:00:27
we went and we're debuting a new feature
00:00:29
today at this very moment it's called
00:00:31
Reddit performance review oh God cue
00:00:34
some music here some graphics Reddit
00:00:37
performance reviews and so we'll start
00:00:39
off with you David Friedberg this proves
00:00:42
why you haven't been successful in life
00:00:43
so far
00:00:48
instead of doing it
00:00:50
yourself which is what all Elite
00:00:52
performers do yes you turned it over to
00:00:55
a bunch of mids on Reddit yes I'm
00:00:58
already doing it they were already doing
00:00:59
it we just collected it what elucidation
00:01:01
were you going to get from Reddit yeah
00:01:03
really let's not ruin the bit do you
00:01:04
think Elon does 360 performance reviews
00:01:07
on Reddit on Reddit of all things you
00:01:10
know how many 360 performance reviews I've
00:01:12
done in my life zero of course and
00:01:14
that's why this will be so interesting
00:01:16
okay go go start off with me
00:01:19
you are going to be presented with the
00:01:22
candid feedback that you've gotten in
00:01:23
the last 30 days on Reddit
00:01:25
but for the first time and you have to
00:01:28
read it out loud to the class Freeburg
00:01:30
you'll go first here's your first piece
00:01:33
of candid feedback in your 360 from the
00:01:35
Reddit incels go ahead Freeburg read
00:01:37
it out loud David Friedberg deserves
00:01:40
more hate he and the others have made it
00:01:42
their mission to convince us that
00:01:44
reforming Social Security is the only
00:01:46
way forward to survive he hides behind
00:01:48
this nerdy apolitical Persona and then
00:01:51
goes hard right out of nowhere if Sacks
00:01:55
fear-mongered about the deficit as an
00:01:57
excuse to restructure entitlement
00:01:59
programs we would see that as the
00:02:01
partisan right-wing take it is when
00:02:03
Friedberg does it we're supposed to act
00:02:05
like he has no skin in this game he's
00:02:07
just the Science Guy no he's a rich guy
00:02:11
who would rather work your grandparents
00:02:13
to death than pay an extra five percent
00:02:14
tax all right there's your review very
00:02:16
good I think you took it well and you
00:02:19
don't have to respond now don't be
00:02:20
defensive just take it in just a counter
00:02:22
briefly I have highlighted multiple
00:02:24
times I think we're going to 70 percent tax
00:02:26
rates but hey you know all right they're
00:02:29
wrong everyone's got an opinion that
00:02:30
won't generate more Revenue yeah if
00:02:32
history is any guide yeah well the
00:02:34
audience has been waiting for this David
00:02:35
sacks has never taken a piece of
00:02:37
feedback and the feedback he has gotten
00:02:39
he hasn't taken well so here we go David
00:02:41
here's your performance review go ahead
00:02:43
I'm supposed to read this go with the
00:02:45
bit come on it's a little feedback for
00:02:46
you come on all right and I'm gonna do
00:02:48
this too and I haven't seen mine
00:02:50
oh Nick pulled these Nick pulled these
00:02:52
this is nice
00:02:54
these are actual pieces of real feedback
00:02:57
Nick you're fired that's it get out of
00:03:00
here
00:03:01
go ahead sacks
00:03:02
the thing about David sacks if he wasn't
00:03:04
Rich everyone would dismiss him as being
00:03:06
both stupid and boring and the secret to
00:03:09
his wealth is just following Peter Thiel since
00:03:10
College it's like if Turtle from Entourage
00:03:12
pretended to be a public intellectual
00:03:20
has never had a 360 review he informs us
00:03:23
and I think his staff you know for all
00:03:25
the people at Social Capital you can get
00:03:27
in on this by just posting to Reddit
00:03:28
since he never did 360 reviews at Social
00:03:30
Capital go ahead let's pull up your
00:03:31
Chamath's feedback for the quarter
00:03:34
Chamath is the biggest self-serving leech
00:03:36
as long as he can make a dollar trade on
00:03:40
it he will burn anything down to the
00:03:42
ground [ __ ] the consequences to society
00:03:44
or anyone else this was actually the
00:03:46
good part
00:03:48
yeah why'd you give him such a good
00:03:50
review
00:03:52
truth it is
00:03:55
okay I gotta read mine I haven't seen
00:03:57
this I'm bracing for impact here
00:03:58
what's the critique exactly what is the
00:04:02
critique
00:04:03
from that I mean that's like it's pretty
00:04:05
accurate yeah okay all right here we go
00:04:07
I gotta read mine okay I can't wait for
00:04:09
AI to replace Jay cow Jay cow is the
00:04:12
least skilled knowledge worker on the
00:04:14
show I think he has about three shows
00:04:16
left before AI replaces his hosting
00:04:18
skills and ability to trick dentists
00:04:20
into investing in hype
00:04:26
that's pretty good that last part
00:04:30
I do have a lot of dentist friends in
00:04:32
the funds um okay I'm gonna pay for a
00:04:34
lot of kids then to school great idea
00:04:35
great idea Jay Cal to give us what a
00:04:38
great bit there's one group one there's
00:04:39
one for the whole group this is a group
00:04:41
survey this is our group 360. vote to
00:04:44
rename the podcast the vote is binding
00:04:47
and the podcast will be renamed three
00:04:48
billionaires in jaycal three kind of
00:04:51
smart guys and jcal three [ __ ] and
00:04:53
Free Bird that's a pretty good survey
00:04:56
that was pretty good yeah that's a good
00:04:57
one
00:05:01
Rain Man
00:05:02
[Music]
00:05:06
we open source it to the fans and
00:05:09
they've just gone crazy
00:05:10
[Music]
00:05:13
all right everybody Welcome to the all
00:05:16
in podcast we're still here episode 129
00:05:20
all in Summit 2023 general admission
00:05:22
sold out
00:05:23
too many people apply for scholarships
00:05:25
that's on pause and there's a couple of
00:05:27
VIP tickets left get them while they're
00:05:29
hot just search for all in Summit
00:05:30
Freeburg anything to add we'll just get
00:05:32
through the grift real quick here at the
00:05:33
beginning no it's gonna get it all right
00:05:35
that was it yeah I mean just more demand
00:05:37
than we predicted we look for a bigger
00:05:39
venue couldn't find one we're I think
00:05:41
we're excited about Royce Hall
00:05:43
it's still as you pointed out two and a
00:05:45
half times the size of last year so we
00:05:46
want to make sure it's a great quality
00:05:48
event but unfortunately way too many
00:05:50
folks want to go so we have to kind of
00:05:52
pause ticket sales what's my wine budget
00:05:55
300 per person per night a thousand
00:05:58
dollars per VIP per event thank you okay
00:06:01
I will handle it from here so there's
00:06:03
750 of them so I think you have 750 000
00:06:06
in wine budget thank you
00:06:09
I mean I just can't believe I just gave
00:06:11
Chamath 750,000 dollars to buy wine all
00:06:15
right guys I'm gonna curate an
00:06:17
incredible lineup here today by the
00:06:19
way
00:06:19
oh he's doing
00:06:21
yes Melanie Josh somebody hey Josh all
00:06:24
right let's get to work okay the senate
00:06:27
had a hearing this week for AI Sam
00:06:31
Altman was there as well as Gary Marcus
00:06:33
a professor from NYU
00:06:36
that's an overpriced College in New York
00:06:38
City and Christina Montgomery the Chief
00:06:41
privacy and Trust officer from IBM which
00:06:44
had Watson before anybody else was in
00:06:46
the AI business and I think they
00:06:47
deprecated it or they stopped working on
00:06:49
it which is quite paradoxical
00:06:52
there were a couple of very interesting
00:06:53
moments
00:06:55
Sam claimed the U.S should create a
00:06:56
separate agency to oversee AI I guess is
00:06:59
in the Chamath camp he wants the agency
00:07:02
to issue licenses to train and use AI
00:07:05
models a little regulatory capture there
00:07:07
as we say in the biz he also claims and
00:07:10
this was interesting dovetailing with
00:07:12
elon's CNBC interview with I think Dave
00:07:15
Faber which is very good that he owns
00:07:18
no equity
00:07:20
in open AI whatsoever and was quote
00:07:23
doing it because he loves it
00:07:26
any thoughts Chamath you did say that
00:07:29
this would happen
00:07:31
two months ago and here we are two
00:07:32
months later and exactly what you said
00:07:34
would happen is in the process of
00:07:36
happening regulation Licensing and
00:07:39
Regulatory capture Sam went a little
00:07:41
further
00:07:42
then I
00:07:45
sketched out a few months ago which is
00:07:48
that he also said that it may make sense
00:07:50
for us to issue licenses for these
00:07:53
models to even be compiled
00:07:55
um
00:07:57
and for these models to actually do the
00:07:59
learning
00:08:01
and I thought that that was really
00:08:02
interesting because what it speaks to is
00:08:04
a form of kyc right know your customer
00:08:06
and again when you look at markets that
00:08:10
can be subject to things like fraud and
00:08:12
manipulation right where you can have a
00:08:14
lot of Bad actors banking is the most
00:08:16
obvious one
00:08:17
we use things like kyc to make sure that
00:08:19
money flows are happening appropriately
00:08:21
and between parties that where the
00:08:23
intention is legal
00:08:26
and so I think that that's actually
00:08:29
probably the most important new bit of
00:08:32
perspective that he is adding as
00:08:34
somebody right in the middle of it which
00:08:36
is that
00:08:37
you should apply to this agency to get a
00:08:40
license to then allow you to compile a
00:08:42
model
00:08:43
and I think that that was a really
00:08:44
interesting thing the other thing that I
00:08:47
said and I said this in my in a tweet
00:08:49
just a couple days ago is I'm really
00:08:51
surprised actually where this is the
00:08:53
first time in modern history that I can
00:08:55
remember
00:08:56
where we've invented something we being
00:08:58
Silicon Valley
00:09:00
and the people in Silicon Valley are the
00:09:02
ones that are more circumspect than the
00:09:04
folks on Wall Street or other areas and
00:09:07
if you see if you gauge the sentiment
00:09:10
the hedge funds and family offices right
00:09:12
now are just giddy about Ai and it turns
00:09:16
out if you look at their 13fs they're
00:09:17
all long Nvidia and AMD
00:09:20
but if you actually look at the other
00:09:22
side of the coin which is the folks in
00:09:23
Silicon Valley that's actually making it
00:09:25
the rest of us are like hey let's crawl
00:09:27
before we walk before we run yeah let's
00:09:29
think about guard rails let's be
00:09:31
thoughtful here and so the the big money
00:09:33
people are saying let's place bets and
00:09:35
the people building in are saying hey
00:09:37
let's be thoughtful which is opposite to
00:09:40
what it's always been I think right
00:09:41
we're like hey let's let's run with this
00:09:43
and wall Street's like prove it to me
00:09:46
sax you are a less regulation guy you
00:09:49
are a free market monster I've heard
00:09:51
you've been called uh you don't believe
00:09:53
that we should license this what do you
00:09:55
think about what you're seeing here and
00:09:57
there is some cynical cynical thoughts
00:10:01
about what we just saw happen in terms
00:10:03
of people in the lead wanting to
00:10:05
maintain their lead by creating red tape
00:10:09
what are your thoughts
00:10:10
yeah of course I think you know Sam just
00:10:12
went straight for the end game here
00:10:14
which is regulatory capture
00:10:16
normally when a tech executive goes and
00:10:17
testifies at these hearings they're in
00:10:19
the hot seat and they get grilled and
00:10:21
that didn't happen here because you know
00:10:23
Sam Altman basically bought Into The
00:10:25
Narrative of these senators and he
00:10:28
basically conceded all these risks
00:10:30
associated with with AI he talked about
00:10:33
how
00:10:34
GPT style models if unregulated could
00:10:38
increase online misinformation bolster
00:10:39
cyber criminals even threaten
00:10:41
confidence in election systems
00:10:45
so he basically bought into the senators'
00:10:48
narrative and like you said agreed to
00:10:50
create a new agency that would license
00:10:52
models and can take licenses away
00:10:55
he said that he would create safety
00:10:57
standards specific tests that a model has
00:10:59
to pass before it can be deployed he
00:11:02
says he would require independent audits
00:11:04
who can say the model is or isn't in
00:11:06
compliance
00:11:08
and by basically buying into their
00:11:10
narrative and agreeing to everything
00:11:12
they want which is to create all these
00:11:14
new regulations and a new agency I think
00:11:17
that Sam is pretty much guaranteeing
00:11:19
that he'll be one of the people who gets
00:11:20
to help shape the new agency and and the
00:11:24
rules they're going to operate under and
00:11:25
what these independent audits are gonna
00:11:27
how they're gonna determine what's in
00:11:28
compliance so he is basically putting a
00:11:31
big moat around his own incumbency here
00:11:35
and so yes it is a smart strategy for
00:11:39
him but the question is do we really
00:11:41
need any of this stuff and you know what
00:11:43
you heard at the hearing is that just
00:11:45
like with just about every other Tech
00:11:47
issue the senators on the Judiciary
00:11:49
Committee didn't exhibit any real
00:11:51
understanding of the technology and so
00:11:54
they all generally talked about their
00:11:55
own hobby horses so you know you heard
00:11:58
from Senator Blackburn she wants to
00:11:59
protect songwriters Hawley wants to stop
00:12:02
anti-conservative bias Klobuchar was
00:12:04
touting a couple of bills that have her
00:12:07
name on them one's called the jcpa
00:12:09
the Journalism Competition and Preservation Act
00:12:10
Bernie Sanders what does he want to do he wants to
00:12:12
protect us from the one percent of the one percent
00:12:13
Durbin hates section 230 that was the
00:12:16
hobby horse he was riding and then
00:12:17
Senator Blumenthal was obsessed that
00:12:19
someone had published deep fakes of
00:12:20
himself so you know all of these
00:12:23
different Senators had different
00:12:25
theories of harm that they were
00:12:26
promoting and they were all basically
00:12:29
hammers looking for a nail you know they
00:12:31
all wanted to regulate this thing and
00:12:33
they didn't really pay much of any
00:12:36
attention to the ways that existing laws
00:12:37
could already be used absolutely to stop
00:12:40
any of these things if you commit a
00:12:42
crime with AI there are plenty of
00:12:44
criminal laws every single thing they
00:12:45
talked about could be handled through
00:12:47
existing law if they turned out to be
00:12:49
harmful at all yeah but they want to jump
00:12:51
right to creating a new agency and new
00:12:53
regulations and Sam I think did the
00:12:57
you know expedient thing here which is
00:12:59
basically buy into it in order to be an
00:13:02
Insider if this was a chess game Sam got
00:13:04
to the mid game he traded all the pieces
00:13:06
and went right to the end game let's
00:13:08
just try to Checkmate here I've got the
00:13:09
lead I got the 10 billion from Microsoft
00:13:12
everybody else get a license and try to
00:13:13
catch up Friedberg we have Chamath pro
00:13:16
regulation licensing I think or just
00:13:19
being pretty thoughtful about it there
00:13:20
you got
00:13:21
sax being typically a free market
00:13:24
monster let the laws be what they are
00:13:26
but these senators are going to do
00:13:27
regulatory capture where do you
00:13:28
as the Sultan of science stand on this
00:13:31
very important issue I think there is a
00:13:35
more important
00:13:37
kind of broader set of trends that are
00:13:40
worth noting and that the folks doing
00:13:42
these hearings and having these
00:13:43
conversations are aware of
00:13:45
which implies why they might be saying
00:13:47
the things that they're saying that's
00:13:49
not necessarily about regulatory capture
00:13:51
and that is that a lot of these models
00:13:53
can be developed and generated to be
00:13:56
much smaller we're seeing you know
00:13:57
models that can effectively run on an
00:13:59
iPhone we're seeing a number of Open
00:14:01
Source models that are being published
00:14:02
now there's a group called Mosaic ml
00:14:04
last week they published what looks like
00:14:07
a pretty good quality model
00:14:09
that has you know a very large kind of
00:14:12
token input which means you can do a lot
00:14:14
with it and that model can be downloaded
00:14:17
and used by anyone for free you know
00:14:19
really good open source license that
00:14:21
they've provided on that model and
00:14:22
that's really just the tip of the
00:14:23
iceberg on what's going on which is that
00:14:25
these models are very quickly becoming
00:14:27
ubiquitous commoditized small and
00:14:30
effectively are able to move to and be
00:14:32
run on the edge of the network as a
00:14:34
result that means that it's very hard to
00:14:36
see who's using what models how behind
00:14:39
the products and tools that they're
00:14:40
building
00:14:41
and so if that's the trend then it
00:14:45
becomes very hard for a regulatory
00:14:47
agency to go in and audit every server
00:14:50
or every computer or every computer on a
00:14:52
network and say what model are you
00:14:54
running is that an approved model is
00:14:55
that not an approved model it's almost
00:14:57
like having a regulatory agency that has
00:14:59
to go in and
00:15:01
audit and assess whether a Linux upgrade
00:15:05
or some sort of you know open source
00:15:07
platform that's that's being run on some
00:15:09
server is appropriately vetted and
00:15:11
checked and so it it's almost like a
00:15:14
Fool's errand and so if I'm running one
00:15:17
of these companies and I'm trying to you
00:15:19
know get Congress off my butt and get
00:15:21
all these Regulators off my butt I'm
00:15:22
going to say go ahead and regulate us
00:15:23
because the truth is there really isn't
00:15:26
a great or easy path or ability to do
00:15:28
that and there certainly won't be in
00:15:29
five or ten years once these models all
00:15:31
move on to the edge of the network and
00:15:34
they're all being turned around all the
00:15:35
time every day
00:15:37
and there's a great Evolution underway
00:15:39
so I actually take a point of view that
00:15:42
it's not just that this is necessarily
00:15:44
bad and there's cronyism going on I
00:15:47
think that the point of view is just
00:15:48
that this is going to be a near
00:15:50
impossible task to try and track and
00:15:53
approve llms and audit servers that are
00:15:57
running llms and audit apps and audit
00:15:59
what's behind the tools that everyday
00:16:01
people are using
00:16:03
and I wish everyone the best of luck in
00:16:05
trying to do so but that's kind of the
00:16:07
joke of the whole thing it's like let's
00:16:08
go ahead and pat these Congress
00:16:10
people on the shoulder and say you got
00:16:11
it you're right there you have it folks
00:16:13
wrong answer chamatha sacks right answer
00:16:16
Friedberg if you were to look at hugging
00:16:18
face if you don't know what that is it's
00:16:19
a basically an open source repository of
00:16:22
all of the llms the cat is out of the
00:16:25
bag the horses have left the barn if you
00:16:28
look at what I'm showing on the screen
00:16:30
here this is the open llm leaderboard
00:16:32
it's kind of buried on hugging face if
00:16:34
you haven't been to hugging face this is
00:16:35
where developers show their work they
00:16:37
share their work and they kind of
00:16:39
compete with each other in a social
00:16:40
network showing all their contributions
00:16:42
and what they do here is and this is
00:16:44
super fascinating they have a series of
00:16:47
tests that will take an llm a language
00:16:50
model and they will
00:16:52
have it do science questions that would
00:16:54
be for grade school they'll do a test of
00:16:57
mathematics U.S history computer science
00:16:59
Etc there's a Jeopardy test too I don't
00:17:01
know if it's on here but the Jeopardy
00:17:03
test is really good it's like straight
00:17:04
up Jeopardy trivia and see if it can
00:17:05
answer the questions yeah which actually
00:17:07
uh Friedberg actually won his high
00:17:09
school uh Jeopardy Championship three
00:17:11
years in a row
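[The benchmark testing described here, quizzing a model on science, math, and trivia questions and scoring its answers, can be sketched as a simple loop: pose fixed questions, compare each answer to a key, report accuracy. This is a toy illustration with a hypothetical stubbed-in model, not Hugging Face's actual evaluation harness:]

```python
# Toy benchmark runner in the shape of a leaderboard evaluation:
# grade a model's answers against an answer key and report accuracy.
# `stub_model` is a placeholder standing in for a real LLM.

QUESTIONS = [
    {"q": "Which planet is closest to the Sun?",
     "choices": ["Venus", "Mercury", "Mars"], "answer": "Mercury"},
    {"q": "What is 7 times 8?",
     "choices": ["54", "56", "64"], "answer": "56"},
]

def stub_model(question: str, choices: list[str]) -> str:
    # A real harness would prompt an LLM here; this stub naively
    # picks the first choice every time.
    return choices[0]

def accuracy(model, questions) -> float:
    # Count questions where the model's pick matches the answer key.
    correct = sum(model(item["q"], item["choices"]) == item["answer"]
                  for item in questions)
    return correct / len(questions)

print(accuracy(stub_model, QUESTIONS))  # the naive stub scores 0.0 here
```

[The real Open LLM Leaderboard runs EleutherAI's lm-evaluation-harness over benchmarks like ARC and MMLU, but the scoring loop has this same basic shape.]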
00:17:12
but anyway on this leaderboard
00:17:15
you can see the language models are
00:17:18
outpacing what openai did I'm sorry
00:17:22
closed AI is what I call it now because
00:17:24
they're closed Source closed Ai and Bard
00:17:26
have admitted an internal person at
00:17:29
Bard said the language models here are
00:17:31
now outpacing what they're able to do
00:17:33
with much more resources many hands
00:17:35
makes for light work the Open Source
00:17:37
models are going to fit on your phone or
00:17:38
the latest you know Apple silicone so I
00:17:41
think the cat's out of the bag I don't
00:17:43
know how they post something
00:17:44
incompatible about that what Freebird
00:17:46
just said with what I said in fact
00:17:48
Freeberg's point bolsters my point it's
00:17:49
highly impractical to regulate open
00:17:53
source software in this way also when
00:17:55
you look at that list of things that
00:17:56
people are doing on hugging face there's
00:17:58
nothing nefarious about it yeah and all
00:18:00
the harms that were described are
00:18:02
already illegal and can be prosecuted
00:18:03
exactly I need some special agency you
00:18:06
know giving it seal of approval again
00:18:08
this is going to replace permissionless
00:18:10
Innovation which is what has defined the
00:18:12
software industry and especially open
00:18:14
source with the need to develop
00:18:17
some connection or relationship and
00:18:18
lobbying in Washington to go get your
00:18:20
project approved and there's no really
00:18:23
good reason for this except for the fact
00:18:24
that the Senators on the Judiciary
00:18:27
Committee and all of Washington really
00:18:28
wants more control so they can get more
00:18:31
donations sax I have a question do you
00:18:33
think that creating the DMV
00:18:35
and requiring a driver's license limits
00:18:38
the ability for people to learn how to
00:18:40
drive the DMV is like the classic
00:18:42
example of how government doesn't work I
00:18:44
don't know why you'd want to make that
00:18:45
your example I mean people have to spend
00:18:48
the whole time he sends people to it
00:18:50
he's got a VIP person all day waiting in
00:18:53
line to get your photo taken it's insane
00:18:54
I mean everyone has a miserable
00:18:56
experience with it no but it's highly
00:18:58
relevant because you're you're right if
00:19:00
you create an agency where people have
00:19:02
to go get their permission it's a
00:19:03
licensing scheme you're gonna be waiting
00:19:06
in some line of Untold length it won't
00:19:08
be like a physical line at the DMV
00:19:10
building it's going to be a virtual line
00:19:12
where you're in some queue where there's
00:19:13
probably going to be some overworked
00:19:14
regulator who doesn't even know how this
00:19:17
works to approve your project they're just
00:19:19
going to be trying to cover their ass
00:19:20
because if the project ends up being
00:19:22
something nefarious
00:19:24
then they get blamed for it so that's
00:19:26
what's going to end up happening let me
00:19:28
also highlight something that I think is
00:19:29
maybe I think a little bit misunderstood
00:19:33
but you know an AI model is an algorithm
00:19:37
so it's a it's a piece of software that
00:19:40
takes data in and spits data out
00:19:42
and you know we have algorithms that are
00:19:44
written by humans we have algorithms
00:19:46
that have been you know written by
00:19:48
machines either machine learn models
00:19:50
which is what a lot of what people are
00:19:52
calling AI today is effectively an
00:19:54
extension and outgrowth of
00:19:56
and so the idea that a particular
00:19:58
algorithm is differentiated from another
00:20:02
algorithm is also what makes this very
00:20:04
difficult because these are algorithms
00:20:06
that are embedded and sit Within
00:20:07
products and applications that an end
00:20:10
user and End customer ultimately uses
00:20:12
and I just sent you guys a
00:20:15
a link to the you know the EU has been
00:20:17
working towards passing this AI Act
00:20:19
here we go they're a couple of weeks
00:20:21
ahead of these conversations in the U.S
00:20:23
but I mean as you read through this AI
00:20:26
Act and the proposal that it's uh that
00:20:28
it's put forth it almost becomes the
00:20:31
kind of thing that you say
00:20:33
I just don't know if these folks really
00:20:35
understand how the technology works
00:20:37
because it's almost as if they're going
00:20:39
to audit and have
00:20:41
you know an assessment of the risk level
00:20:43
of every software application out there
00:20:46
and that the tooling and the necessary
00:20:49
infrastructure to be able to do that
00:20:50
just makes no sense in the context of
00:20:52
Open Source software in the context of
00:20:54
an open internet uh in the context of
00:20:56
how quickly software and applications
00:20:59
and tools evolve and you make tweaks to
00:21:01
an algorithm and you got to resubmit it
00:21:02
for authorization sure their number one
00:21:06
job freedberg is going to be to protect
00:21:09
jobs so anything there that in any way
00:21:12
infringes on somebody's ability to be
00:21:14
employed in a position whether it's a
00:21:16
artist or a writer or a developer
00:21:18
they're going to say you can't use these
00:21:20
tools or they're going to try to follow
00:21:21
them to try to protect jobs because
00:21:23
that's their job second question for all
00:21:25
three of you do you guys think that this
00:21:27
was Sam's way of pulling up the ladder
00:21:28
behind him of course no 100 just like
00:21:31
okay absolutely it is and it's because
00:21:33
no you can you can prove it he made open
00:21:36
AI closed AI by making it not open
00:21:38
source if you're Sam you're smart enough
00:21:40
to know how quickly the models are
00:21:44
commoditizing and how many different
00:21:46
models there are that you know can
00:21:49
provide similar degrees of functionality
00:21:50
as you just pointed out jaycal so I
00:21:52
don't think it's about trying to lock in
00:21:53
your model I think it's about
00:21:55
recognizing the impracticality of
00:21:57
creating some regulatory regime around
00:21:59
model auditing and so you're very so
00:22:02
that in that in that world in that
00:22:04
scenario where you have that Vision you
00:22:06
have that foresight do you go to
00:22:07
Congress and tell them that they're dumb
00:22:08
to regulate AI or do you go to Congress
00:22:10
and you say great you should regulate AI
00:22:12
knowing that it's like hey yeah you
00:22:14
should go ahead and stop the Sun from
00:22:15
shining you know like it's just yeah so
00:22:18
basically he's telling him to do that
00:22:20
because he knows they can't no therefore
00:22:22
he gets all the points all the joy
00:22:24
points all the social credit I don't
00:22:27
want to say virtue signaling but he
00:22:28
gets all the credit relationship credit
00:22:30
with Washington for saying what they
00:22:32
want to hear reflecting back to them
00:22:34
even though he knows they can't compete
00:22:36
with Facebook's open model which is
00:22:38
number one yeah there is historical
00:22:39
precedent interesting for companies that
00:22:42
are facing Congressional scrutiny to go
00:22:45
to Congress and say go ahead and
00:22:46
regulate us as a way of preemptive relief
00:22:48
yeah and I think that it doesn't
00:22:50
necessarily mean you're going to get
00:22:51
regulated but it's a way of kind of
00:22:53
creating some relief and getting
00:22:54
everyone to take a breather and a sigh
00:22:56
of relief and be like okay the industry
00:22:57
is with us you know what do you think
00:23:00
the gardez strategy he's pulling here
00:23:03
it's gardez I think that's in chess
00:23:05
when you are going to take the queen
00:23:07
anyway what do you think of his chess
00:23:10
moves that's uh not a strategy in chess
00:23:14
so I think it is a chess move
00:23:16
nonetheless is he pulling up the ladder
00:23:19
sacks or no I don't think that's his
00:23:21
number one goal but I think it is the
00:23:23
result
00:23:24
and so I think the the goal here is I
00:23:28
think he's got two paths in front of him and when
00:23:30
you go to testify like this you can
00:23:32
either resist and they will put you in
00:23:35
the hot seat and just Grill you for a
00:23:36
few hours or you can sort of concede and
00:23:39
you buy into their narrative and then
00:23:42
you kind of get through the hearing
00:23:43
without being grilled and so I think on
00:23:46
that level it's preferable just to kind
00:23:48
of play ball and then the other thing is
00:23:51
that by playing ball you get to be part
00:23:53
of the Insiders Club that's going to
00:23:54
shape these regulations and that will I
00:23:56
wouldn't say it's a ladder coming up I
00:23:58
think it's more of a moat where because
00:24:00
it's not that the ladder comes up and
00:24:02
nobody else can get in but the
00:24:04
regulations are going to be a pretty big
00:24:06
moat around major incumbents who know
00:24:09
they qualify for this because they're
00:24:10
going to write these standards so at the
00:24:13
end of the day if you're someone in
00:24:15
Sam's shoes you're like why resist and
00:24:17
make myself a Target or I'll just buy
00:24:21
into the narrative and help shape the
00:24:22
regulations and it's good for my
00:24:24
business I like the analysis this
00:24:25
gentlemen this is a perfect analysis let
00:24:27
me ask you a question Chamath
00:24:28
what is the commercial incentive from
00:24:31
your point of view
00:24:33
to ask for regulation and to be
00:24:35
pro-regulation you're pro-regulation can
00:24:37
you just highlight for me at least what
00:24:38
you think
00:24:39
you know the the commercial reason is to
00:24:42
do that you know how do you benefit from
00:24:44
that like not you personally but
00:24:45
generally like where does benefit arise
00:24:48
I think that certain people in a sphere
00:24:51
of influence and I would put us in that
00:24:53
category
00:24:54
have to have the intellectual capacity
00:24:56
to see beyond ourselves and ask what's
00:24:59
for
00:25:00
the greater good
00:25:02
I think Buffett is right two weeks ago
00:25:05
he equated AI to nuclear weapons which
00:25:10
is an incredibly powerful technology
00:25:11
whose Genie you can't put back in the
00:25:14
bottle whose 99.9% of use cases are
00:25:19
generally quite societally positive but
00:25:22
the point one percent of use cases
00:25:24
destroys Humanity
00:25:26
and so I think you guys are unbelievably
00:25:28
naive on this topic and you're letting
00:25:31
your ideology
00:25:34
fight your common sense the reality is
00:25:36
that there are probably 95 billion
00:25:40
trillion use cases that are incredibly
00:25:42
positive
00:25:43
but the 1000 negative use cases are so
00:25:46
destructive and they're equally possible
00:25:49
and the reason they're equally possible
00:25:50
and this is where I think there's a lot
00:25:51
of intellectual dishonesty here is we
00:25:54
don't even know how Transformers work
00:25:56
the best thing that happened when
00:25:58
Facebook open sourced llama was also
00:26:00
that somebody stealthily released all
00:26:02
the model weights yeah
00:26:04
okay so explain a little bit for a
00:26:07
neophyte what we're talking about so so
00:26:09
there's the model and there's the
00:26:10
weights think about it as it's a
00:26:12
solution to a problem the solution looks
00:26:14
like a polynomial equation okay let's
00:26:17
take a very simple one let's take
00:26:18
Pythagorean theorem you know x squared
00:26:20
plus y squared equals z squared okay so
00:26:23
if you want to solve an answer to a
00:26:25
problem you have these weights you have
00:26:27
these variables and you have these
00:26:28
weights associated with it the slope of
00:26:31
a line Y equals MX plus b okay what a
00:26:34
computer does with AI is it figures out
00:26:36
what the variables are
00:26:37
and it figures out what the weights are
00:26:39
the answer to identifying images
00:26:42
flawlessly turns out to be 2x plus 7
00:26:46
where x equals this thing
00:26:49
now take that example and multiply it by
00:26:52
500 billion parameters
00:26:54
and 500 billion weights and that is what
00:26:58
an AI model essentially gives us to as
00:27:00
an answer to a question
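the analogy above can be sketched in a few lines of Python — a purely illustrative toy on made-up data, not how LLaMA or any real model is actually trained — where "training" just means searching for the two weights m and b in y = mx + b:

```python
# Toy illustration of "a model is an equation and training finds the
# weights": recover m and b in y = m*x + b by gradient descent.
# Real models do the same search over billions of weights, not two.
data = [(x, 2 * x + 7) for x in range(-10, 11)]  # points on y = 2x + 7

m, b = 0.0, 0.0          # start with arbitrary weights
lr = 0.01                # learning rate
for _ in range(5000):    # repeatedly nudge weights to reduce error
    grad_m = sum(2 * (m * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (m * x + b - y) for x, y in data) / len(data)
    m -= lr * grad_m
    b -= lr * grad_b

print(round(m, 2), round(b, 2))  # prints 2.0 7.0
```

releasing a model's "weights," as discussed above for llama, amounts to publishing the final values of m and b rather than just the form of the equation.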
00:27:02
so even when Facebook released llama
00:27:05
what they essentially gave us was the
00:27:07
equation but not the weights
00:27:09
and then what this guy did I think it
00:27:11
was an intern apparently or somebody he
00:27:13
just leaked the weights so that we
00:27:15
immediately knew what the structure of
00:27:17
the equation looked like
00:27:19
so that's what we're basically solving
00:27:21
against but we don't know how these
00:27:23
things work we don't really know how
00:27:25
Transformers work and so this is my
00:27:26
point when I think you guys are right
00:27:29
about yes the overwhelming majority of
00:27:31
the use cases but there will be people
00:27:34
who can nefariously create havoc and
00:27:36
chaos and I think you got to slow the
00:27:39
whole ship down to prevent those few
00:27:42
folks from ruining it for everybody
00:27:46
I haven't had a chance to chime in on my
00:27:48
position so I'd like to just state mine
00:27:50
nobody cares okay well I do I think
00:27:54
actually I split the difference here a
00:27:55
little bit I don't think it needs to be
00:27:56
an agency in licensing I do think we
00:27:58
have to have a commission and we do need
00:28:00
to have people being thoughtful about
00:28:01
those thousand use cases Chamath because
00:28:03
they are going to cause societal harm or
00:28:07
things that we cannot anticipate and
00:28:09
then number two for the neophyte with
00:28:11
the 1600 rating on chess.com sacks gardez is
00:28:15
an announcement to the opponent that
00:28:17
their Queen is under direct attack
00:28:18
similar to the announcement of check the
00:28:20
warning was customary until the early
00:28:21
20th century so since you do not know
00:28:23
the history of check now you've learned
00:28:25
something here early 20th century okay
00:28:26
well since I've only played chess in the
00:28:28
20th and 21st centuries I'm unaware of
00:28:30
that J-Cal in French it's pronounced
00:28:37
in the context of what we're talking
00:28:39
about that models are becoming smaller
00:28:42
and can be run on the edge and there's
00:28:44
obviously hundreds and thousands of
00:28:45
variants of these open source models
00:28:47
that have you know good effect and
00:28:50
perhaps compete with some of these
00:28:53
models that you're mentioning that are
00:28:54
closed source
00:28:56
how do you regulate that how do you and
00:28:58
then they sit behind an application and
00:29:00
they sit behind tooling yeah I think in
00:29:03
order for you to be able to compile that
00:29:04
model to generate that initial
00:29:06
instantiation you're still running it in
00:29:09
a cluster of thousands of gpus but let's
00:29:11
say we're past that you can't be past
00:29:14
that we're not past that yet okay we
00:29:16
don't have five million models we don't
00:29:18
have all kinds of things that solve all
00:29:20
kinds of problems we don't have an open
00:29:22
source available simulation of every
00:29:24
single molecule in the world including
00:29:26
all the toxic materials that could
00:29:28
destroy humans we don't have that yet so
00:29:30
before that is created and shrunk down
00:29:32
to an iPhone I think we need to put some
00:29:35
stage gates up to slow people down what
00:29:37
do you mean by those stage gates yeah I
00:29:38
think you need some form of kyc I think
00:29:40
before you're allowed to run a massive
00:29:43
cluster to generate the model that then
00:29:45
you try to shrink
00:29:46
you need to be able to show people that
00:29:48
you're not trying to do something
00:29:49
absolutely chaotic or have it create
00:29:52
havoc and I think that could be as
00:29:54
simple as putting your driver's license
00:29:55
and your social security number that
00:29:58
you're working on an instance in a cloud
00:29:59
right it could be your putting your name
00:30:01
on your work it becomes slightly more
00:30:03
nuanced than that it's like I think that
00:30:05
J-Cal that's probably the simplest
00:30:07
thing for AWS gcp and Azure to do which
00:30:10
is that if you want to run over a
00:30:13
certain number of GPU clusters you need
00:30:14
to put in that information I think you
00:30:16
also need to put in your tax ID number
00:30:18
so I think if you want to run a real
00:30:19
high scale model that's still going to
00:30:21
run you tens or hundreds of millions of
00:30:23
dollars I do think there aren't that
00:30:25
many people running those things and I
00:30:27
do think it's easy to police those and
00:30:29
say what are you trying to do here so
00:30:31
let me just push back on that because
00:30:32
Mosaic ml published this model that is
00:30:35
let me I can pull up the performance
00:30:37
chart or Nick maybe you can just find it
00:30:39
on their website real quick or the new
00:30:40
model they published Chamath they trained
00:30:42
this model
00:30:43
on open source data that's publicly
00:30:47
available
00:30:48
and they spent two hundred thousand
00:30:50
dollars on a cluster run to build this
00:30:52
model and look at how it performs
00:30:54
compared to some of the top models that
00:30:56
are closed Source just say it for the
00:30:57
people who are listening yeah so for
00:30:59
people that are listening basically this
00:31:00
model is called MPT 7B that's the name
00:31:04
of the AI model the llm model that was
00:31:07
generated by this group called Mosaic ml
00:31:10
and they spent two hundred thousand
00:31:11
dollars creating this model from scratch
00:31:14
and the data that they trained it on is
00:31:16
all listed here it's all publicly
00:31:18
available data that you can just
00:31:19
download off the internet then they
00:31:20
score how well it performs on its
00:31:22
results against other big models out
00:31:25
there like llama 7B
00:31:27
I know but I don't exactly know what the
00:31:30
actual problems they're trying to ask it
00:31:32
to compare right so the but the point is
00:31:35
that this model theoretically could then
00:31:37
be applied to a different data set once
00:31:39
it's been you know built I don't know
00:31:42
this is you know I just want to use your
00:31:43
point earlier about toxic chemistry
00:31:45
because models were generated and then
00:31:47
other data was then used to fine-tune
00:31:50
those models and deliver an output hold
00:31:52
on a second like those answers were to
00:31:54
specific kinds of questions
00:31:56
if you wanted to all of a sudden ask
00:31:58
totally orthogonal thing of that model
00:32:00
that model would fail you would have to
00:32:02
go back and you'd have to retrain it
00:32:04
that training does cost some amount of
00:32:06
money so if you said to me Chamath I could
00:32:09
build you a model trained on the
00:32:11
universe of every single molecule in the
00:32:13
world and I could actually give you
00:32:15
something that could generate the toxic
00:32:17
list of all the molecules and how to
00:32:18
make it for two hundred thousand dollars
00:32:20
I would be really scared I don't think
00:32:22
that that's possible today
00:32:24
so I don't understand these actual tests
00:32:26
but I don't think it's true that you
00:32:28
could take this model and these model
00:32:29
weights apply to a different set of data
00:32:32
and get useful answers but let's let's
00:32:34
assume for a minute that you can in fact
00:32:36
take two hundred thousand dollars worth
00:32:38
it
00:32:39
here's my point I want to tell you
00:32:41
what's happening right now which is
00:32:43
that's not possible so we should stop so
00:32:46
that then I don't have to have this
00:32:47
argument with you in a year from now
00:32:49
which is like hey some jerk-off
00:32:51
just created this model now the cat's
00:32:53
out of the bag so let's not do it yeah
00:32:55
and and then what's going to happen is
00:32:57
like some chaotic seeking organization
00:32:59
is going to print one of these materials
00:33:01
and release it into the wild to prove it
00:33:03
but here's the point for the audience we
00:33:04
are at a moment in time where this is
00:33:06
moving very quickly and you have very
00:33:08
intelligent people here who are very
00:33:09
knowledgeable
00:33:10
talking about to the degree to which
00:33:13
this is going to manifest itself not if
00:33:15
it will manifest you are absolutely 100
00:33:17
certain Freeburg that somebody will do
00:33:19
something very bad in terms of the
00:33:22
chemical example as but one we're only
00:33:25
determining here what level of hardware
00:33:27
and what year that will happen Chamath is
00:33:29
saying we know it's going to happen
00:33:31
whether it's 2 or 10 or you know five
00:33:33
years let's be thoughtful about it and I
00:33:36
think you know this discussion we're
00:33:38
having here I think
00:33:39
on a spectrum
00:33:42
it's this is a unique moment where the
00:33:45
most knowledgeable people
00:33:47
across every single political Spectrum
00:33:49
persuasion for-profit non-profit
00:33:51
Democrat Republican right Elon and Sam
00:33:54
will just use those as the two canonical
00:33:56
examples to demonstrate
00:33:58
are pro-regulation and then the further
00:34:00
and further you get away the less
00:34:02
technically astute you are the more
00:34:04
anti-regulation and like pro-market you
00:34:07
are and all I'm saying is I think that
00:34:08
should also be noted that that's a
00:34:10
unique moment that the only other time
00:34:12
that that's happened was around nuclear
00:34:14
weapons and you know that's when
00:34:16
Einstein I actually think I think it's
00:34:20
politically
00:34:21
incorrect right now I think because of
00:34:24
what you're saying well just give me a
00:34:25
second I think because of what you're
00:34:26
saying everyone on the left and the
00:34:27
right it's it's become popular to be
00:34:30
pro-regulation on AI and to say that AI
00:34:32
is going to Doom the world and it's
00:34:33
unpopular and absolutely
00:34:37
I explained my point of view on Sam Elon's
00:34:39
different but I think like it I think
00:34:42
it's become Politically Incorrect to
00:34:43
stand up and say you know what this is a
00:34:45
transformative technology for Humanity I
00:34:47
don't think that there's a real path to
00:34:48
regulation I think that I totally
00:34:50
there's laws that are in place that can
00:34:52
protect us in other ways with respect to
00:34:54
privacy with respect to fraud with
00:34:56
respect to biological warfare and all
00:34:58
the other things that we should worry
00:34:59
about Elon has said pretty clearly he
00:35:00
doesn't give a [ __ ] about what it does
00:35:02
to make money or not he cares about what
00:35:04
he thinks so all I'm saying is that's a
00:35:06
guy that's not trying to be politically
00:35:07
correct Elon has a very specific concern
00:35:09
which is Agi he's concerned that we're
00:35:12
on a path to a digital superintelligence
00:35:14
it's a singularity and if we create the
00:35:16
wrong kind of
00:35:18
artificial general intelligence that
00:35:20
decides that it doesn't like humans that
00:35:23
is a real risk to the human species
00:35:25
that's the concern he's expressed but
00:35:28
that's not what the hearing was really
00:35:31
about and it's not what any of these
00:35:33
regulatory proposals are about and the
00:35:35
reality is none of these Senators know
00:35:37
what to do about that even the industry
00:35:39
doesn't know what to do about the
00:35:41
long-term risk of creating an AGI nobody
00:35:44
knows nobody really and so and so I
00:35:46
actually I disagree with this idea that
00:35:48
Chamath earlier you said that
00:35:50
there's a thousand use cases here that
00:35:52
could destroy the human species I think
00:35:54
there's only one there's only one
00:35:57
species level risk which is Agi but
00:36:00
that's a long-term risk we don't know
00:36:01
what to do about it yet I agree we
00:36:03
should have conversations what we're
00:36:04
talking about today is whether we create
00:36:07
some new licensure regime in Washington
00:36:09
so that politically connected insiders
00:36:12
get to control and shape the software
00:36:13
industry and that's a disaster let me
00:36:16
give you another detail on this so
00:36:18
and one of the
00:36:20
chat groups I'm in there was somebody
00:36:22
who just got back from Washington I
00:36:25
won't say who they are it's not someone
00:36:26
who's famous outside the industry but
00:36:28
they're kind of like a tech leader and
00:36:30
what they said is they just got back
00:36:32
from Capitol Hill and the White House
00:36:34
and I guess there's like a White House
00:36:36
Summit
00:36:37
on AI you guys know about that yeah so
00:36:41
what this person said is that the White
00:36:45
House meeting was super depressing some
00:36:47
smart people were there to be sure but
00:36:51
the White House and vp's teams were
00:36:53
rabidly negative no real concern for the
00:36:55
opportunity or economic impact just
00:36:58
super negative of course basically the
00:36:59
mentality was that Tech is bad we hate
00:37:02
social media this is the hot new thing
00:37:04
we have to stop it of course that
00:37:06
basically is their attitude they don't
00:37:07
understand the technology White House
00:37:09
yeah and the VP specifically because
00:37:11
she's now the AI czar to put Kamala
00:37:14
Harris in charge of this makes no sense
00:37:15
I mean does she have any background in
00:37:18
this like it just shows like a complete
00:37:19
utter lack of awareness where is the
00:37:22
Megan Smith of course somebody like a
00:37:24
CTO to be put in charge of this remember
00:37:26
Megan Smith was CTO under I guess Obama
00:37:28
like you need somebody with a little
00:37:30
more depth of experience here like
00:37:32
hopefully you guys aren't saying you're
00:37:34
pro-regulation depending on who's in
00:37:36
charge well I'm pro-thoughtfulness I'm
00:37:38
pro-lawfulness I'm illustrating that
00:37:41
really this whole new agency that's
00:37:43
being discussed is just based on Vibes
00:37:45
you're not down with the Vibes of Vibes
00:37:48
the vibe is that a bunch of people in
00:37:51
Washington don't understand technology
00:37:52
and they're afraid of it yeah
00:37:55
these are socialist David they're
00:37:57
socialists they hate progress they are
00:38:00
scared to death that jobs are going to
00:38:02
collapse they're socialists they they're
00:38:04
union leaders this is their worst
00:38:06
nightmare Because the actual truth of
00:38:08
this technology is 30% more efficiency
00:38:11
and it's very mundane this is the truth
00:38:14
here I think that Representatives have
00:38:16
30% more efficiency means Google Facebook
00:38:19
and many other companies Finance
00:38:21
education they do not add staff every
00:38:24
year they just get 30% more efficient
00:38:26
every year and then we see unemployment
00:38:28
go way up and Americans are going to
00:38:30
have to take service jobs and white collar
00:38:32
jobs are going to be confined to like a
00:38:35
very elite few people who actually do
00:38:37
work in the world there is absolutely a
00:38:40
lot of new companies if humans can
00:38:42
become if knowledge workers can become
00:38:43
30% more productive there'll be a lot of
00:38:45
new companies absolutely and the biggest
00:38:47
shortage on our economy is coders right
00:38:49
and we're going to have an unlimited
00:38:50
number of them now they're all going to
00:38:51
go I don't know if it's unlimited but
00:38:53
yes it's a good thing if you give them
00:38:54
superpowers we've talked about this
00:38:55
before yeah yeah so I think it's too
00:38:57
soon to be concluding that we need to
00:39:00
stop job displacement that hasn't even
00:39:02
occurred yet I'm not saying it's
00:39:03
actually going to happen I do agree
00:39:04
there'll be more startups I'm seeing it
00:39:06
already I just think that's what they
00:39:08
fear that's their fear is and that's the
00:39:10
fear of the EU the EU is going to be
00:39:12
protectionist pro-union pro-worker workers
00:39:15
aren't going to be affected because
00:39:17
these are not blue-collar jobs we're
00:39:18
talking about these are knowledge workers
00:39:20
at all the media companies they created
00:39:22
unions and look at them they're all
00:39:23
going circling all these media companies
00:39:25
are circling the drain going into
00:39:26
their business but that's on the margins
00:39:27
I mean that's not just trying to start
00:39:29
Tech unions sure they're trying to start
00:39:30
them but when we think of unionized
00:39:32
workers you're thinking about Factory
00:39:33
workers and these people are not
00:39:35
affected okay listen this has been an
00:39:38
incredible debate this is why you tune
00:39:39
into the pod
00:39:40
a lot of things can be true at the same
00:39:42
time I really think the analogy of the
00:39:43
the atom bomb is really interesting
00:39:45
because what Elon is scared about with
00:39:48
General artificial intelligence is
00:39:49
nuclear Holocaust the whole planet blows
00:39:51
up between those two things are things
00:39:54
like Nagasaki and Hiroshima or a dirty
00:39:57
bomb and many other possibilities with
00:39:59
nuclear power you know Fukushima you
00:40:03
know etc so have we had the Hiroshima not
00:40:06
yet right and you know the question is
00:40:08
is a three mile island is a Fukushima is
00:40:12
a Nagasaki are those things probable and
00:40:15
I think we are all looking at this
00:40:16
saying there will be something bad that
00:40:18
will happen there will be the equivalent
00:40:21
strings together and these large
00:40:23
language models string together words in
00:40:25
really interesting ways and they give
00:40:26
computers the ability to have a natural
00:40:28
language interface that is so far from
00:40:30
AGI I think it's a component
00:40:33
hold on I think it's obviously the
00:40:35
ability to understand language and
00:40:37
communicate in a natural way is a
00:40:39
component of a future AGI but by itself
00:40:42
these are models for stringing together
00:40:44
language auto GPT where these things go
00:40:47
out and pursue things without any
00:40:49
interference I would be the first one to
00:40:51
say that if you wanted to scope
00:40:53
models to be able to just do human
00:40:56
language back and forth
00:40:58
on the broad open internet
00:41:01
you know there's probably a form David
00:41:04
where these chat GPT products can exist
00:41:06
I don't I think that those are quite
00:41:08
benign I agree with you but I think what
00:41:10
Jason is saying is that every week
00:41:12
you're you're taking a leap forward and
00:41:14
already with auto gpts you're talking
00:41:16
about code that runs in the background
00:41:18
without supervision it's not a human
00:41:20
interface that's like hey show me how to
00:41:22
color my
00:41:23
cookies green for St Patty's Day it's on
00:41:26
my trip to Italy yeah yeah it's not
00:41:28
doing that so I just think that there's
00:41:30
there's a place well beyond what you're
00:41:32
talking about and I think you're
00:41:33
minimizing the problem a little bit by
00:41:34
just kind of saying the whole class of
00:41:36
AI is just chat GPT and asking kid
00:41:39
asking it to help it with its homework
00:41:41
this example I hate to say it out loud
00:41:42
but somebody could say here is the
00:41:45
history of financial crimes that were
00:41:47
committed and other hacks please with
00:41:50
their own model on their own server say
00:41:52
please come up with other ideas for
00:41:54
hacks be as creative as possible and
00:41:56
steal as much money as possible and put
00:41:57
that in an auto GPT David and study all
00:42:00
hacks that occur in the history of
00:42:02
hacking and it could just create super
00:42:04
chaos around the world and the computer
00:42:08
industry is going to regret buying into
00:42:08
this narrative because the members of
00:42:10
the Judiciary Committee are doing the
00:42:11
same Playbook they ran back in 2016
00:42:13
after that election they ran all these
00:42:15
hearings on disinformation claiming that
00:42:17
social networks had been used to hack the
00:42:19
election it was all a phony narrative
00:42:20
hold on look what happened
00:42:24
the Russians hacked— who hacked what oh
00:42:26
stop they got all
00:42:28
these Tech CEOs to buy into that phony
00:42:30
narrative why because it's a lot easier
00:42:32
for the tech CEOs to agree and tell the
00:42:35
Senators what they want to hear to get
00:42:36
them off their backs and then what did
00:42:38
that lead to a whole censorship
00:42:40
industrial complex so we're going to do
00:42:43
the same thing here we're going to buy
00:42:44
into these phony narratives to get the Senators off
00:42:47
our backs and that's going to create
00:42:48
this giant AI industrial complex that's
00:42:52
going to slow down real Innovation and
00:42:54
be a burden on entrepreneurs okay
00:42:55
lightning round lightning round we got
00:42:57
to move on three more topics I want to
00:42:58
hit let's keep going if I wanted to become
00:43:00
an even more evil yeah what is it called
00:43:03
an evil uh comic book character a super
00:43:06
villain a super villain if you wanted to
00:43:08
be an even more loathsome super villain
00:43:11
continue I would take every single
00:43:14
virus patch that's been developed and
00:43:17
publicized learn on them and then find
00:43:19
the next zero day exploit on a whole
00:43:21
bunch of stuff I mean it's just like
00:43:22
even publish that idea please don't I
00:43:23
mean I'm I'm worried about publishing
00:43:25
that idea that's not an intellectual
00:43:27
leap I mean you have to be a dullard it's
00:43:29
obvious okay let's move on another great
00:43:32
debate Elon hired a CEO for Twitter
00:43:35
Linda Yaccarino I'm hoping I'm pronouncing
00:43:38
that correctly she was the head of ad sales
00:43:39
at NBC Universal she's a legend in the
00:43:41
advertising business
00:43:43
she worked at Turner for 15 years before
00:43:44
that she is a workaholic is what she
00:43:47
says she's going to take over everything
00:43:49
but product and CTO elon's going to
00:43:52
stick with that
00:43:53
she seems to be
00:43:55
very moderate and she follows people on
00:43:58
the left or right people are starting
00:43:59
the character assassination and trying
00:44:01
to figure out her politics
00:44:04
and that she was involved with the
00:44:05
World Economic Forum which anybody in
00:44:08
business basically does
00:44:10
but your take sacks on this choice for
00:44:13
CEO and what this means just broadly for
00:44:18
the next six months because we're
00:44:19
sitting here at six months almost
00:44:20
exactly since Elon took over obviously
00:44:22
you and I were involved in month one but
00:44:24
but not much after that what do you
00:44:26
think the next six months holds and what
00:44:27
do you think her role is going to be
00:44:28
obviously some precedent this for this
00:44:30
with you know Gwynne at SpaceX basically listen I
00:44:34
think this Choice makes sense on this
00:44:37
level Twitter's business model is
00:44:39
advertising Elon does not like selling
00:44:41
advertising she's really good at selling
00:44:43
advertising so he's chosen a CEO to work
00:44:46
with who's highly complementary to him
00:44:49
in their skill sets and interests and I
00:44:52
think that makes sense I think there's a
00:44:54
lot of logic in that what Elon likes
00:44:56
doing is the technology and product side
00:44:58
of the business
00:44:59
he actually doesn't really like the
00:45:03
let's call it the standard business
00:45:05
chores and especially related to like we
00:45:08
said advertising stuff
00:45:11
with advertisers I think it's his
00:45:13
personal nightmare right so I think the
00:45:15
choice makes sense on that level now
00:45:16
yeah instantly you're right her hiring
00:45:19
led to attacks from both the left and
00:45:22
the right the right you know pointed out
00:45:24
her views on covet and vaccines and her
00:45:27
work with the wef and then on the left I
00:45:30
mean the attack is that she's following
00:45:32
Libs of TikTok which you're just not
00:45:34
allowed to do apparently a follow is not
00:45:36
an endorsement well if you're just
00:45:38
following Libs of TikTok they want
00:45:39
to say you're some crazy right winger
00:45:40
now well she also follows David sacks so
00:45:43
that does mean that she's pretty that
00:45:45
that is a signal but the truth is if you
00:45:47
sacks correct me if I'm wrong here or Chamath
00:45:49
maybe I'll send it to you if you
00:45:50
pick somebody that both sides dislike or
00:45:53
are trying to take apart you probably
00:45:54
pick the right person yeah
00:45:56
here's what I think okay go ahead we're
00:45:58
not going to know how good she is for
00:46:01
six to nine months but here's what I
00:46:03
took a lot of
00:46:04
joy out of here's a guy who
00:46:08
gets attacked for all kinds of things
00:46:10
now right he's an anti-semite apparently
00:46:13
and then he had to be like I'm very
00:46:15
pro-Semitic he's a guy that all of a sudden
00:46:17
people think is a conspiracy theorist
00:46:18
he's a guy that people think is now on
00:46:21
the Raging right all these things that
00:46:23
are just like inaccuracies basically
00:46:25
fire bombs thrown by the left
00:46:28
but here's what I think is the most
00:46:30
interesting thing for a guy that
00:46:32
theoretically is supposed to be a troll
00:46:33
and everything else he has a female
00:46:36
chairman of Tesla a female CEO at
00:46:39
Twitter and a female president at SpaceX
00:46:42
of course it's a great Insight it's the
00:46:44
same insight
00:46:45
I think a lot of these virtue signaling
00:46:47
lunatics on the left virtue signaling
00:46:50
mids and you know what like they're all
00:46:53
you know Elizabeth Warren and and
00:46:55
Bernie Sanders giving you know the CEO
00:46:59
of Starbucks a hard time when he doubled
00:47:02
the pay of the minimum wage gave them
00:47:06
you love mid right that's a great term
00:47:09
isn't it
00:47:10
these [ __ ] mids and he paid for the
00:47:13
college tuition what gives you the right
00:47:16
at Starbucks to pay for college tuition
00:47:18
and double the minimum wage
00:47:20
I don't know why it's so funny to me
00:47:23
isn't it so great like just because you
00:47:25
can picture them when I say that these
00:47:26
are these mids feverishly typing on
00:47:29
their keyboards their virtue signaling
00:47:31
nonsense sex wrapping it up
00:47:33
yeah look like you said Elon has worked
00:47:35
extremely well with Gwen Shotwell who's
00:47:37
the president of SpaceX for a long time
00:47:39
and I think that relationship shows
00:47:43
the way to make it work here at Twitter
00:47:45
which is they have a very complementary skill
00:47:47
set I think my understanding is that
00:47:48
Gwen focuses on the business side and
00:47:50
the sales side of the operation Elon
00:47:52
focuses on product and Technology she
00:47:54
lets Elon be Elon I think if Linda tries
00:47:57
to rein Elon in tell him not to tweet
00:48:00
or tries to meddle in the Free Speech
00:48:03
aspects of the business which is the
00:48:04
whole reason he bought Twitter
00:48:07
yeah that's right that's when it will
00:48:09
fall apart so my advice would be let
00:48:12
Elon be Elon you know he bought this
00:48:14
company to make it a free speech
00:48:16
platform don't mess with that and I
00:48:19
think it could work great and a free
00:48:21
speech platform it is when you are
00:48:23
saying anything about covid and I really
00:48:25
don't even want to say it here because I
00:48:26
don't want to even say the word
00:48:28
covet or vaccine means that this could
00:48:30
get tagged by YouTube and be you know
00:48:33
the algorithm could could uh de-
00:48:37
um I don't know what they call it
00:48:38
deprecate this and then we don't show up
00:48:40
and people don't see us because we just
00:48:41
said the word covid I mean the
00:48:42
censorship built into these algorithms
00:48:44
is absurd but speaking of absurd Lina
00:48:47
Khan
00:48:48
who has been the least effective FTC
00:48:50
chair
00:48:51
I think
00:48:52
started out pretty promising with some
00:48:54
interesting ideas she's now moved to
00:48:56
block a major Pharma deal in December
00:48:59
Amgen agreed to acquire Dublin based
00:49:02
Horizon Therapeutics for 27.8 billion
00:49:05
this is the largest Pharma deal
00:49:06
announced in 2022. FTC has filed a
00:49:08
lawsuit in federal court seeking an
00:49:09
injunction that would prevent the deal
00:49:10
from closing the reasoning is the deal
00:49:13
would allow Amgen to entrench the
00:49:15
Monopoly positions of Horizon's eye and
00:49:16
gout drugs
00:49:18
the agency said that those treatments
00:49:21
don't face any competition today and
00:49:22
that Amgen would have a strong incentive
00:49:24
to prevent any potential rivals from
00:49:27
introducing similar drugs
00:49:29
Chamath the pharmaceutical industry is a
00:49:31
little bit different than the tech
00:49:32
industry insanity
00:49:34
explain why and then sax will go to you
00:49:36
on the gout stuff because I know that
00:49:38
personally impacts you go ahead I think
00:49:41
that this is a little like
00:49:42
scientifically illiterate to be honest
00:49:45
unpack the thing is that you want drugs
00:49:48
that can get to Market quickly but at
00:49:50
the same time you want drugs to be safe
00:49:52
and you want drugs to be effective
00:49:54
and I think that the FDA has a pretty
00:49:56
reasonable process and one of the direct
00:50:00
byproducts of that process is that if
00:50:02
you have a large indication that you're
00:50:04
going after say diabetes you have to do
00:50:06
an enormous amount of work it has to be
00:50:09
run on effectively thousands of people
00:50:11
you have to stratify it by age you have
00:50:13
to stratify it by gender you have to
00:50:15
stratify it by race you have to do it
00:50:17
across different geographies right the
00:50:19
bar is high but the reason the bar is
00:50:21
high is that if you do get approval
00:50:23
these all of a sudden become these
00:50:25
Blockbuster 10 20 30 billion dollar
00:50:27
drugs okay and they improve people's
00:50:30
lives and they allow people to live
00:50:33
etc etc
00:50:34
what has happened in the last 10 or 15
00:50:36
years because of wall Street's influence
00:50:39
inside of the Pharma companies
00:50:41
is that what Pharma has
00:50:44
done a very good job of doing is
00:50:47
actually pushing off a lot of this very
00:50:49
risky r d
00:50:51
to Young early stage biotech companies
00:50:54
and they typically do the first part of
00:50:57
the work they get through a phase one
00:50:59
they even may be able to sometimes go
00:51:01
and start a phase two trial a 2a
00:51:03
trial and then they typically
00:51:06
can get sold to Pharma and these are
00:51:09
like multi-billion dollar transactions
00:51:12
and the reason is that the private
00:51:14
markets just don't have the money
00:51:16
to support the risk for these companies
00:51:18
to be able to do all the way through a
00:51:19
phase three clinical trial because it
00:51:21
would cost in some cases five six seven
00:51:24
eight billion dollars you've never heard
00:51:26
of a tech company raising that much
00:51:27
money except in a few rare cases
00:51:30
in biotech it just doesn't
00:51:32
happen so you need the M&A machine
00:51:35
to be able to incentivize these young
00:51:37
companies to even get started in the
00:51:38
first place otherwise what literally
00:51:41
happens is you have a whole host of
00:51:42
diseases that just stagnate okay
00:51:47
and instead what happens is a younger
00:51:49
company can only raise money to go after
00:51:51
smaller diseases which have smaller
00:51:53
populations smaller Revenue potential
00:51:55
smaller costs because the trial
00:51:57
infrastructure is just less
00:51:59
so if you don't want industry to be in
00:52:03
this negative Loop where you only work
00:52:04
on the small diseases and you actually
00:52:06
go and tackle the big ones
00:52:08
you need to allow these kinds of
00:52:09
transactions to happen the last thing
00:52:11
I'll say is that even when these big
00:52:12
transactions happen half the time they
00:52:15
turn out to still not work there is
00:52:17
still huge risk so don't get caught up
00:52:19
in the dollar size you have to
00:52:21
understand the phase it's in and the
00:52:23
best example of this is the biggest
00:52:24
outcome in in biotech private investing
00:52:27
in Silicon Valley was this thing called
00:52:29
Stemcentrx and that thing was a 10
00:52:31
billion dollar dud right but it allowed
00:52:34
all these other companies to get started
00:52:35
after Stemcentrx got bought for 10
00:52:37
billion freeberg I want to get your take
00:52:40
on this especially in light of maybe
00:52:42
something people don't understand which
00:52:43
is the the amount of time you get to
00:52:46
actually exclusively monetize a drug
00:52:49
because my understanding you correct me
00:52:52
if I'm wrong you get a 20-year patent
00:52:53
it's from the date you file it but then
00:52:55
you're working towards getting this drug
00:52:56
approved by the FDA so by the time the
00:52:58
FDA approves a drug this 20-year patent
00:53:01
window how many years do you actually
00:53:03
have exclusively to monetize that drug
00:53:05
and then your wider thoughts on this FTC case
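Jason's patent-window question above comes down to simple arithmetic: a US drug patent runs 20 years from the filing date, so effective exclusivity is whatever remains after development and FDA review. A minimal sketch of that math, with the development timelines as illustrative assumptions rather than figures from the episode:

```python
# Rough sketch of effective market exclusivity for a drug patent.
# Assumption (not from the transcript): a 20-year term running from the
# filing date, and illustrative 8-12 year development/approval timelines.

PATENT_TERM_YEARS = 20

def effective_exclusivity(years_from_filing_to_approval: float) -> float:
    """Years of patent life left once the FDA approves the drug."""
    return max(0.0, PATENT_TERM_YEARS - years_from_filing_to_approval)

# Typical industry estimates put development at roughly 10-12 years,
# leaving on the order of 8-10 years of exclusive monetization.
for dev_years in (8, 10, 12):
    print(dev_years, "years of development ->",
          effective_exclusivity(dev_years), "years of exclusivity")
```

This ignores patent-term extensions and regulatory exclusivity periods, which can add a few years; it is only meant to show why the monetization window is much shorter than the nominal 20-year term.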
00:53:07
yeah I'm not going to answer that
00:53:09
question right now because I do want to
00:53:10
kind of push back on the point I'm
00:53:13
generally pretty negative on a lot of
00:53:15
the comments Lina Khan's made and her
00:53:17
positioning and obviously as you guys
00:53:18
know we've talked about it on the show
00:53:20
but I read the FTC uh filing it's in um
00:53:24
federal court
00:53:26
and if you read the filing let me just
00:53:28
start the company that Amgen is trying
00:53:29
to buy it's called Horizon Therapeutics
00:53:30
which is a company that's doing about 4
00:53:33
billion in Revenue a year about a
00:53:34
billion to a billion and a half in
00:53:35
EBITDA so it's a business that's
00:53:38
got a portfolio of orphan drugs
00:53:40
meaning drugs that treat orphan
00:53:41
conditions that aren't very big
00:53:43
Blockbusters in the in the
00:53:44
pharmaceutical drug context
00:53:46
and so it's it's a nice portfolio of
00:53:48
cash-generating drugs Amgen buying the
00:53:52
business gives them real revenue
00:53:54
and helps bolster a portfolio that
00:53:58
you know is aging and I think that's a
00:54:00
big part of the Strategic driver for
00:54:02
Amgen to make this massive 28 billion
00:54:05
dollar acquisition the ftc's claim in
00:54:08
the filing which I actually read and I
00:54:11
was like this is actually a pretty good
00:54:12
claim is that the way that Amgen sets
00:54:16
the prices for their pharmaceutical
00:54:17
drugs is they go to the insurance
00:54:19
companies the payers and the health
00:54:20
systems and they negotiate drug pricing
00:54:23
and they often do bulk multi-product
00:54:26
deals so they'll say hey we'll give you
00:54:28
access to those products at this price
00:54:31
point but we need you to pay this price
00:54:32
point for this product and over time
00:54:34
that drives price inflation it drives
00:54:36
costs up and it also makes it difficult
00:54:38
for new competitors to emerge because
00:54:40
they tell the insurance company you have
00:54:41
to pick our drug over other drugs in
00:54:44
order to get this discounted price
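The bundling dynamic Freeberg describes can be shown with a toy model. All the prices below are made up for illustration; they are not figures from the FTC filing:

```python
# Toy model of the bundling tactic described above: a manufacturer offers
# an insurer a discount on a must-have drug only if the insurer also takes
# a second drug, making a cheaper standalone rival unattractive.
# Every number here is a hypothetical for illustration.

def insurer_cost(take_bundle: bool) -> float:
    must_have_standalone = 100.0   # drug with no substitute, list price
    must_have_bundled    = 70.0    # same drug, discounted only in the bundle
    second_drug_bundle   = 40.0    # incumbent's second drug, bundled price
    rival_drug           = 30.0    # cheaper rival to the second drug
    if take_bundle:
        return must_have_bundled + second_drug_bundle   # 110.0
    return must_have_standalone + rival_drug            # 130.0

print(insurer_cost(True), insurer_cost(False))
# The bundle is cheaper overall even though the rival undercuts the
# second drug on its own, which is how bundling can lock rivals out.
```

The insurer rationally takes the bundle, the rival loses the sale despite the lower price, and over time that blunts the competitive pressure that would otherwise push prices down.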
00:54:46
and so it's a big part of their
00:54:47
negotiating strategy that they do with
00:54:48
insurance companies so the ftc's claim
00:54:51
is that by giving Amgen this large
00:54:53
portfolio of drugs that they're buying
00:54:55
from Horizon it's going to give them
00:54:57
more negotiating leverage and the
00:54:58
ability to do more of this drug blocking
00:55:00
that they do with insurance companies
00:55:02
and other payers in the drug system so
00:55:04
they're trying to prevent pharmaceutical
00:55:06
drug price inflation and they're trying
00:55:08
to increase competition in their lawsuit
00:55:10
so I felt like it was a fairly kind of
00:55:12
compelling case I'm no lawyer on
00:55:14
Anti-Trust and monopolistic practices
00:55:17
and the Sherman Act but this was not
00:55:18
sorry let me just say this was not an
00:55:19
early stage biotech risky deal that
00:55:22
they're trying to block it's a mature company
00:55:24
with four billion in revenue and a
00:55:26
billion and a half in EBITDA I
00:55:27
understand and yeah I read it too but
00:55:29
two comments
00:55:30
it is because the people that traffic in
00:55:32
these stocks are the same ones that fund
00:55:34
these early stage biotech companies and
00:55:36
I talked to a bunch of them and they're
00:55:38
like if these guys block this kind of
00:55:39
deal we're going to get out of this game
00:55:41
entirely so just from the horse's mouth
00:55:44
what I'm telling you is you're going to
00:55:45
see a pall come over the early stage
00:55:47
Venture financing landscape because a
00:55:49
lot of these guys that are crossover
00:55:51
investors that own a lot of these public
00:55:52
biotech stocks that also fund the
00:55:54
private stocks will change their risk
00:55:57
posture if they can't make money that's
00:55:59
just the nature of capitalism
00:56:00
the second thing is Lina Khan did
00:56:02
something really good about what you're
00:56:04
talking about this week actually which
00:56:06
is she actually went after the PBMs and
00:56:08
if you really care about drug inflation
00:56:10
and you follow the dollars the real
00:56:12
culprits around this are the pharmacy
00:56:14
benefit managers and she actually
00:56:16
launched a big investigation into them
00:56:18
but this is what speaks to the two
00:56:20
different approaches it seems that
00:56:22
unfortunately for the FTC every merger
00:56:25
just gets contested for the sake of it
00:56:26
being contested
00:56:28
because I think that if you wanted to
00:56:30
actually stop price inflation there are
00:56:32
totally different mechanisms because why
00:56:34
didn't you just sue all the PBMs well
00:56:37
there's no merger to be done but you can
00:56:39
investigate and then you could regulate
00:56:41
and I think that that's probably a more
00:56:43
effective way and the fact that she
00:56:44
targeted the PBMs says that somebody in
00:56:46
there actually understands where the
00:56:48
price inflation is coming from but I
00:56:50
don't think something like an Amgen
00:56:52
Horizon because what I think will happen
00:56:54
is all the folks will then just
00:56:56
basically say well man if if these kinds
00:56:59
of things can't get bought then why am I
00:57:00
funding these other younger things yeah
00:57:02
but you know yeah we're just not seeing
00:57:04
a lot of the younger stuff
00:57:06
get blocked I don't think we've seen any
00:57:08
attempts at blocking speculative
00:57:10
portfolio Acquisitions or speculative
00:57:12
company Acquisitions so I think these
00:57:14
guys are getting caught up in the dollar
00:57:15
number you know yeah so I think the
00:57:18
problem is they see 28 billion they're
00:57:19
like we need to stop it you know what's
00:57:21
amazing I'll just wrap on this because
00:57:22
it's a good discussion but I think we
00:57:24
got to keep moving here is I took the
00:57:25
PDF that you shared and I put it into
00:57:29
ChatGPT now oh wow and you don't need
00:57:32
to upload the PDF anymore you could just
00:57:34
say summarize this and put the link and
00:57:36
it did it instantly
00:57:39
are you using the browsing plugin or I just
00:57:42
used ChatGPT no it's not the browser
00:57:43
plugin I just did this is the 3.5 model
00:57:46
and it pulled the link in the GPT 3.5
00:57:49
model I don't know I could do that
00:57:52
that's new they must have added browsing
00:57:54
in the background yeah or just
00:57:55
pulling it they did today by the way
00:57:57
whoa they did it today yeah remember
00:58:00
last week we said that they had to build
00:58:02
browsing into the actual product like
00:58:04
Bard right otherwise they're moving I
00:58:07
gotta get it today yeah
00:58:08
closed AI is on the top of their game
00:58:10
the app is available in the App Store
00:58:12
now right as of today no they they had a
00:58:15
test app I was on the test flight no no
00:58:16
no they just launched the app they did
00:58:18
oh that's game over man if this thing is
00:58:20
in app form that's going to 10x the
00:58:21
number of users and it's going to 10x
00:58:23
the amount of usage by the way I tested
00:58:24
the same thing for Bard we should
00:58:26
compare the two but amazing pretty good
00:58:28
as well yeah yeah I want to actually
00:58:31
compare its summary with chat GPT
00:58:33
summary
00:58:35
okay tell us which one's better
00:58:37
some interesting news here you know we
00:58:40
speaking of platform shifts do I get to
00:58:42
get my view on the Lina Khan thing oh
00:58:45
well yes
00:58:49
I was getting a little personal here
00:58:51
David and I didn't I didn't want to
00:58:53
trigger you I know you've been
00:58:53
struggling with the gout
00:58:55
because of your lifestyle choices the
00:58:57
alcohol the the foie gras everything but
00:59:00
no you've lost a lot of weight to give you
00:59:01
a lot of credit tell us what do you
00:59:03
think about the bundling we're seeing
00:59:04
here because it does seem like what Microsoft did
00:59:06
with the operating system yeah it's very
00:59:07
similar and what I said in the context
00:59:09
of tech is that we should focus on the
00:59:11
anti-competitive tactics and stop those
00:59:13
rather than blocking all mergers and I
00:59:16
think the same thing is happening on the
00:59:17
Pharma space if bundling is the problem
00:59:19
focus on bundling the problem when you
00:59:22
just block M A is that you deny
00:59:25
early investors one of the biggest ways
00:59:28
that they can make a positive outcome
00:59:29
and what's the downstream effect of that
00:59:31
yeah exactly look it is hard enough to
00:59:34
make money as either a Pharma investor
00:59:36
or as a VC that there's only two good
00:59:38
outcomes right there's IPOs there's
00:59:40
M&As everything else basically goes
00:59:42
everything else is a zero it goes
00:59:43
bankrupt so
00:59:45
if you take M&A off the table you really
00:59:48
suppress the already challenged returns
00:59:51
of venture capital yeah well said well
00:59:53
said and you're right you mentioned
00:59:55
earlier that we were willing to give
00:59:56
Lina Khan a chance we thought that some
00:59:59
of her ideas were really interesting
01:00:00
because I think there are these huge
01:00:02
tech companies that do need to be
01:00:04
regulated these big Tech monopolies
01:00:06
basically that you have the mobile
01:00:08
operating system duopoly with Apple and
01:00:10
Google and you've got Amazon you've got
01:00:11
Microsoft and there is a huge risk of
01:00:14
those companies uh preferring their own
01:00:16
applications over Downstream
01:00:18
applications or using these bundling
01:00:20
tactics yes if you don't put some limits
01:00:23
around that that creates I think an
01:00:25
unhealthy Tech ecosystem this is the
01:00:27
insight and I think it's exactly correct
01:00:29
Sacks
01:00:30
for Lina Khan I know she listens to the
01:00:32
Pod hey Lina you want to go after
01:00:34
tactics not Acquisitions so if somebody
01:00:37
buys something and they lower prices and
01:00:40
increases consumer Choice that's great
01:00:43
if it encourages more people to invest
01:00:45
more money into Innovation that's great
01:00:48
but if the tactics are we're going to
01:00:50
bundle these drugs together to keep some
01:00:52
number of them artificially high or
01:00:53
reduce choice or if we're going to
01:00:55
bundle features into the you know Suite
01:00:57
of products and we do anti-competitive
01:00:59
stuff you have to look at the tactics on
01:01:01
the field are people cheating and are
01:01:04
they using the Monopoly power to force
01:01:06
you to use their App Store just make
01:01:08
apple have a second app store that's all
01:01:11
we're asking you to do there should be
01:01:12
an app store on iOS that doesn't charge
01:01:16
any fees or charges one percent fees
01:01:20
break the Monopoly on the App Store sax
01:01:22
is so right perfectly said she actually
01:01:24
did issue compulsory orders to the PBMs
01:01:28
so to your point sax the FTC has been
01:01:30
worried that what Freeberg said has
01:01:32
been happening but the real sort of
01:01:34
middlemen manipulator in this market are
01:01:36
the pharmacy benefit managers and so
01:01:38
this week she actually issued compulsory
01:01:40
orders to the PBMs and said turn over
01:01:42
all your business records to me I'm
01:01:44
going to look into them that makes a ton
01:01:46
of sense but then on the same hand it's
01:01:48
like you see merger and you're like no
01:01:50
it can't happen it just doesn't speak to
01:01:52
a knowledge of the market we should have
01:01:54
Lina on hey Lina I know you listen to
01:01:56
the Pod I've heard the back Channel just
01:01:57
come on the Pod you'd be a good guest
01:01:58
right we would have a good conversation
01:01:59
with everything yeah uh open invite
01:02:02
Nikki Haley's coming on the Pod by the
01:02:03
way you have homework to do for the
01:02:05
summit which is to see if you can get
01:02:07
Donald Trump to come to the summit okay
01:02:09
huge love J Cal love all in okay
01:02:20
because you your mannerisms are
01:02:22
unbelievable how did you practice it I
01:02:24
did I did a little bit only because I
01:02:25
like to troll people and Trigger them
01:02:27
I'm gonna really dial in my trump in the
01:02:29
coming weeks all right here we go
01:02:30
Apple's long anticipated AR headset that
01:02:33
stands for augmented reality with
01:02:36
VR you can't see the real world you're
01:02:38
just
01:02:39
in uh in a virtual world AR lets you put
01:02:43
digital assets on the real world so you
01:02:45
can see what's happening in the real
01:02:46
world but you can put Graphics all
01:02:47
around that's expected to be revealed
01:02:50
as early as June the projected cost is
01:02:54
going to be around three thousand
01:02:56
dollars it won't ship until the fall this
01:02:57
is a break from Apple's typical
01:03:00
way of releasing products which is to
01:03:02
wait till it's perfect and to wait until
01:03:04
all consumers can afford it this is a
01:03:06
different approach they're going to give
01:03:07
this out to developers early
01:03:09
and Tim Cook is supposedly pushing this
01:03:11
there was another group of people inside
01:03:13
of Apple who did not want to release it
01:03:15
Chamath
01:03:16
but there is some sort of external
01:03:18
battery pack it seems like a bit of a
01:03:20
Franken product Frankenstein kind of
01:03:22
project here
01:03:24
that you know perhaps Steve Jobs
01:03:26
wouldn't have wanted to release but he
01:03:28
needs to get it out I think because
01:03:29
Oculus is making so much progress the
01:03:32
killer app supposedly is a FaceTime like
01:03:34
live chat experience that seems
01:03:36
interesting but they look like ski
01:03:38
goggles
01:03:40
on this as the next compute platform if
01:03:44
they can get it you know to work would
01:03:46
you wear these would they have to be
01:03:48
Prada what's the story here no and no
01:03:50
does this seem like a weird conversation
01:03:51
because none of us [ __ ] know none of
01:03:53
us have seen this product and none of us
01:03:54
have used it well friend of the
01:03:57
pod Palmer Luckey says it's incredible so what
01:03:59
that's just like commenting on one guy's
01:04:01
five word tweets
01:04:03
Palmer knows I mean Palmer invented
01:04:05
Oculus great but what are we talking
01:04:07
about we have nothing to say about it
01:04:08
what do we know about okay let me ask
01:04:09
a really good question here do you
01:04:11
believe this is going to be a meaningful
01:04:13
compute platform in the coming years
01:04:14
because apple is so good at product how
01:04:17
do we know until we see it we gotta see
01:04:18
it
01:04:19
of course they're of course they're good
01:04:22
at product let's let's see the product
01:04:23
though like all right fine sax what are
01:04:25
your thoughts I think it's a good thing
01:04:26
that they're launching this like you
01:04:28
said it is a deviation for what they've
01:04:29
normally done they normally don't
01:04:31
release a product unless they believe
01:04:33
the entire world can use it so their
01:04:36
approach has been only to release mass
01:04:37
market products and have a very
01:04:39
small portfolio of those products but
01:04:42
when those products work they're you
01:04:44
know billion user home runs this
01:04:46
obviously can't be at a three thousand
01:04:47
dollar price point and it also seems
01:04:49
like it's a little bit of an early
01:04:51
prototype where the batteries are like
01:04:53
in a fanny pack around your waist and
01:04:55
there's a ways to go around this but I I
01:04:57
give them credit for launching what is
01:05:00
probably going to be more of an early
01:05:01
prototype so they could start iterating
01:05:03
on it I mean the reality is the Apple
01:05:05
watch the first version kind of sucked
01:05:06
and first five versions yeah now they're
01:05:09
on one that's pretty good I think so
01:05:10
look I think this is a cool new platform
01:05:12
they get knocked on for not innovating
01:05:14
enough I think good let them try
01:05:16
something new I think this will be good
01:05:18
for Meta to have some competition yeah
01:05:21
it's great having two major players in
01:05:23
the race maybe it actually speeds up the
01:05:25
Innovation or maybe we get somewhere
01:05:27
down here I mean I I think they should
01:05:30
have
01:05:31
done something in cars
01:05:34
what should they do in cars if you were
01:05:36
going to talk about the car what would
01:05:38
it be tell me what you think would be
01:05:40
the right approach you you were going to
01:05:41
do the Facebook phone that could have
01:05:42
changed the entire Destiny of Facebook
01:05:43
they should have bought Tesla when they
01:05:45
could have the chance for four or five
01:05:46
billion they could have bought it for 10
01:05:48
billion 20 billion it's only when it got
01:05:50
to 50 60 that it got Out Of Reach
01:05:52
what do you think the car should be they
01:05:54
could have bought it at 100 billion they
01:05:55
could have bought it at 100 billion Tim Cook
01:05:56
famously wouldn't take the meeting Elon
01:05:58
said it he wouldn't he wouldn't be
01:05:59
bizarre bizarre maybe they missed an
01:06:02
opportunity there but I do think the end
01:06:03
game with the AR headset is glasses
01:06:06
right yes where you get the screens and
01:06:08
you get the Terminator mode and
01:06:11
what is that these are just
01:06:13
glasses you're acting like they're fancy
01:06:16
technology glasses this size is
01:06:18
what you're talking about yeah when you
01:06:20
have like a little camera built in and
01:06:22
heads-up display with AI
01:06:24
then it gets really interesting so that
01:06:26
that's the end game here I think give
01:06:27
the audience an example of what this
01:06:29
combination of AI plus AR could do when
01:06:31
you're walking around it could layer on
01:06:33
intelligence about the world you meet
01:06:35
with somebody and it can remind you of
01:06:37
their name and the last time you met
01:06:38
with them and give you a summary of what
01:06:39
you talked about
01:06:41
what action items there are
01:06:43
you could be walking in a city and it
01:06:45
could tell you it knows you like Peking
01:06:47
duck it could show you hey there's a
01:06:48
Peking duck place over here some reviews
01:06:49
of it it just knows you and it's
01:06:51
customized in the world what about for
01:06:52
people that do the same routine 99% of
01:06:55
the time how is it gonna help you
01:06:56
though it could tell you your steps
01:06:58
every day could tell you incoming
01:07:00
messages so you don't have to take
01:07:01
out your phone three thousand dollars on that
01:07:03
no but people spend
01:07:06
money
01:07:07
to get your notifications on your wrists
01:07:10
why do you want it on your eyes for
01:07:11
three thousand I would love this maybe I
01:07:13
just do like a lot of meetings or I'm at
01:07:15
events where people are coming up to me
01:07:17
and I've met them like once a year
01:07:18
before like it would be really helpful
01:07:20
to kind of have the Terminator let's be
01:07:22
honest though the Terminator mode for
01:07:24
you to be able to be present with your
01:07:27
family and friends but be playing chess
01:07:28
with Peter Thiel on those glasses that's
01:07:31
your dream come true you and Peter in AR
01:07:34
playing chess all day long
01:07:36
throw up the picture of sax beating
01:07:38
Peter Thiel I watched the clip from the
01:07:40
earlier all in episodes when we
01:07:42
discussed you beating Peter Thiel what a
01:07:44
great moment it was for you all right
01:07:46
listen let's wrap up with this Gallup
01:07:47
survey the number of Americans who say
01:07:50
it's a good time to buy a house has
01:07:52
never been lower 21 percent say it's a good time
01:07:55
to buy a house down nine points
01:07:58
from the prior low of a year ago prior
01:08:00
to 2022 50 percent or more consistently thought
01:08:03
it was a good time to buy significantly
01:08:04
fewer expect local housing
01:08:07
prices to increase in the year
01:08:09
hey uh Sachs is this like a predictive
01:08:11
of a bottom and pure capitulation and
01:08:15
then that means maybe it is in fact a
01:08:16
good time how would you read this data I
01:08:18
don't see it as a bottom necessarily the
01:08:20
way I read the data is that the spike in
01:08:22
interest rates have made it very
01:08:24
unaffordable to buy a house right now
01:08:26
you've got you know the mortgages are
01:08:28
what like seven percent interest rate or
01:08:31
even slightly higher so people just
01:08:33
can't afford the same level of house
01:08:35
that they did before I mean mortgages
01:08:37
were at three three and a half percent
01:08:39
like a year and a half ago now I think
01:08:42
what's kind of interesting is that even
01:08:43
in the 1980s the early 1980s when
01:08:46
interest rates were at like 15 percent you still
01:09:50
had 50 percent who thought it was an okay time to
01:08:52
buy a house or an attractive time to buy
01:08:53
a house so
01:08:54
for the number to be this low tells me
01:08:58
that it's not just about interest rates
01:09:00
I think consumer confidence is also
01:09:02
plummeting and people are feeling more
01:09:05
insecure so I think it's just another
01:09:09
economic indicator that things are
01:09:11
looking really shaky right now and I'll
01:09:13
tell you one of the the knock-on effects
01:09:15
of this is going to be that people can't
01:09:17
move because in order to move you have
01:09:19
to sell your current house and then buy
01:09:21
a new one and you're not going to want
01:09:23
to sell your current house when prices
01:09:25
are going down and then for the new one
01:09:27
you're going to lose your three percent
01:09:29
mortgage in order to get a new one at
01:09:30
seven percent so you're not going to buy
01:09:32
anything like the same house so it freezes
01:09:35
the market it freezes Mobility I think
01:09:37
over the last few years during covid you
01:09:39
saw tremendous movement between states I
01:09:42
think that's going to slow down a lot
01:09:43
now because people just can't afford to
01:09:45
trade houses
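Sacks's affordability point can be made concrete with the standard fixed-rate amortization formula, M = P·r / (1 − (1 + r)^−n), where r is the monthly rate and n the number of monthly payments. The $500k principal below is an illustrative assumption, not a figure from the episode:

```python
# Compare the monthly payment on the same loan at the ~3% rates of a
# year and a half earlier versus the ~7% rates quoted in the discussion.
# The $500,000 principal is an assumption chosen for illustration.

def monthly_payment(principal: float, annual_rate: float, years: int = 30) -> float:
    """Standard fixed-rate mortgage payment: P * r / (1 - (1 + r)^-n)."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

low  = monthly_payment(500_000, 0.03)   # ~ $2,108/month at 3%
high = monthly_payment(500_000, 0.07)   # ~ $3,327/month at 7%
print(round(low), round(high), f"{high / low - 1:.0%} higher")
```

The same loan costs roughly 58% more per month at 7% than at 3%, which is the arithmetic behind both the affordability squeeze and the lock-in effect on existing 3% borrowers.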
01:09:46
so as a result of that I think
01:09:48
discontent is going to rise because I
01:09:51
think one of the ways that you create a
01:09:52
pressure valve is when people are
01:09:54
unhappy in a state they just move
01:09:56
somewhere else well now they're not able
01:09:57
to do that well and you can also move to
01:09:58
a better opportunity for you and your
01:10:00
family whether that schools taxes a job
01:10:02
lifestyle so yeah you can you can you're
01:10:05
going to reduce joy in the country and
01:10:07
it also it screws with price Discovery
01:10:09
doesn't it if you if you don't have a
01:10:11
fluid Market here then how does anybody
01:10:13
know what their house is worth and then
01:10:15
this just again creates more I think of
01:10:17
a frost I think Friedberg has said this
01:10:19
a couple times freeberg you can correct
01:10:21
me if I'm wrong but like the the home is
01:10:23
like the disproportionate majority of
01:10:26
most Americans' wealth right mm-hmm
01:10:29
it's all their wealth all their wealth
01:10:31
yeah so
01:10:33
I mean
01:10:35
there's that factoid yeah and then what
01:10:37
does that do foreign
01:10:41
what's going on they're bringing you
01:10:42
lunch no I was looking I was looking at
01:10:45
uh
01:10:46
mansion that's for sale
01:10:51
175 million dollars but they just got
01:10:53
the price to 140 so I'm just taking a
01:10:55
little again I mean there's gonna be a
01:10:57
lot of distress in the market soon I'm
01:10:59
predicting a lot of distress actually
01:11:00
can we shift to the commercial side for
01:11:02
a second Sam Zell just passed away yeah Sam
01:11:05
Zell passed away today oh wow rest in
01:11:07
peace yeah rest in peace uh Chicago uh
01:11:10
um yeah crazy but speaking of bombastic
01:11:13
interesting guy yeah but speaking of the
01:11:15
real estate market so I want to give an
01:11:16
update on San Francisco CRE I was
01:11:19
talking to a broker the other day and so
01:11:23
here are the stats that they gave me so
01:11:25
it was a local broker and someone from
01:11:27
Blackstone and they're fans of the Pod
01:11:29
and just came up to me and we started
01:11:31
talking about what's happening in San
01:11:32
Francisco shout out shout out to them
01:11:34
didn't take a didn't take a photo but
01:11:36
in any event they're fans of
01:11:38
the pod so we start talking about
01:11:39
what's happening in San Francisco real
01:11:41
estate
01:11:42
so the SF office market just to level
01:11:45
set is 90 million square feet they said
01:11:47
the vacancy rate is now 35 percent so that's
01:11:49
over 30 million square feet vacant and
01:11:52
vacancy still growing as leases end and
01:11:55
companies shed space because some of
01:11:56
that space that they're not using is not
01:11:59
up for sublease everyone says what about
01:12:00
AI is AI going to be the Savior the
01:12:02
problem is that AI companies are
01:12:05
only about a million square feet
01:12:07
of demand so one million out of 30
01:12:09
million is going to be absorbed by Ai
01:12:12
and you know maybe that number grows
01:12:14
over time over the next five ten years
01:12:15
as we create some really big AI
01:12:17
companies but it's just not going to
01:12:19
bail out San Francisco right now the
01:12:21
other thing is that you know VC backed
01:12:24
startups are very demanding in terms of
01:12:26
their tenant improvements and landlords
01:12:28
don't really have the capital right now
01:12:29
to put that into the buildings and
01:12:31
startups just are not the kind of credit
01:12:33
worthy tenants that landlords really
01:12:36
want so this is not going to bail
01:12:38
anybody out there they said there are a
01:12:40
ton of zombie office Towers especially
01:12:42
in Soma and all these office towers are
01:12:45
eventually going to be owned by the
01:12:46
banks which will have to liquidate them
01:12:48
and then we're going to find out that
01:12:49
these loans that they made are gonna
01:12:51
have to be written off because the
01:12:53
collateral that they thought was Blue
01:12:55
Chip that was backing up those loans is
01:12:57
not so blue chip anymore
01:12:59
so I think we've got not just a huge
01:13:01
commercial real estate problem but it's
01:13:03
going to be a big banking problem as
01:13:05
basically people stop pretending you
01:13:07
know right now they're trying to
01:13:08
restructure loans it's called pretend
01:13:10
and extend you reduce the rate on the
01:13:12
loan but add term to it but that only
01:13:15
works for so long if this keeps going if
01:13:17
the market keeps looking like this I
01:13:19
think we're gonna have a real problem
01:13:20
and and that will be a problem in the
01:13:22
banking system now San Francisco is the
01:13:24
worst of the worst but they said that
01:13:25
New York is similar and all these other
01:13:27
big cities with empty office towers are
01:13:29
directionally similar I'm in New York right now
01:13:31
for the side connections conference and
01:13:33
uh
01:13:34
it is packed the city is packed getting
01:13:38
anywhere there's gridlock you can't walk
01:13:41
down the street you got to walk around
01:13:42
people every restaurant it is dynamic
01:13:45
and then I talk to people about offices
01:13:46
and they said people are staying in
01:13:48
their houses in their tiny little New
01:13:50
York apartments instead of going three
01:13:52
train stops to their office they go to
01:13:54
the office one or two days a week unless
01:13:55
you're like JP Morgan or some other
01:13:57
places that dropped the boom
01:13:59
but there's a lot of people still
01:14:01
working from home the finance people
01:14:03
have all gone back media people are
01:14:04
starting to go back so they're there three
01:14:06
to five days here and the city is
01:14:09
booming
01:14:10
contrast that I spent the last two weeks
01:14:12
in San Francisco walking from Soma to
01:14:14
the Embarcadero back
01:14:16
dead nobody in the city
01:14:19
it's like literally a ghost town it's a
01:14:21
real shame it's a real real shame I
01:14:24
wonder if these this is the question I
01:14:27
have for you sacks can they cut a deal
01:14:30
can they go to like month to month rent
01:14:32
sublets you know loosey-goosey just give
01:14:35
people any dollar amount to convince
01:14:37
them to come back is there any dollar
01:14:39
amount because I'm looking for a space
01:14:40
for the incubator
01:14:41
in San Mateo I've been getting a ton of
01:14:43
inbound but the prices are still really
01:14:45
high and I'm like how do I cut a deal
01:14:47
here because shouldn't people be
01:14:49
lowering the prices dramatically or are
01:14:53
they all just pretending or will I get a
01:14:55
lower rate rents are definitely coming down big
01:14:57
time especially for space that sort of
01:14:59
commodity and not that desirable but
01:15:00
what's happening is according to the
01:15:02
people I talk to is that the demand the
01:15:05
people who actually are looking for new
01:15:06
space they only want to be in the best
01:15:08
areas and they want to be in the newest
01:15:10
buildings that have the best amenities
01:15:11
and so uh that sort of commodity office
01:15:14
tower where there's barely anybody ever
01:15:16
there like no one wants that um people
01:15:19
would rather pay a higher rent I mean
01:15:21
the rent will still be much lower
01:15:23
probably half the price of what it used
01:15:24
to be but they'd rather pay a little bit
01:15:26
more for that than get like a zombie
01:15:28
office tower we can't talk about all
01:15:29
this without talking about two cases
01:15:32
tragically
01:15:35
a shoplifter a criminal who was stealing
01:15:38
from a drugstore in San Francisco got
01:15:41
shot
01:15:42
and the video was released I'm sure
01:15:44
you've seen in sacks and then here in
01:15:45
New York everybody's talking about this
01:15:47
one instance of a Marine
01:15:49
trying to subdue
01:15:52
a violent homeless person with two other
01:15:55
people and it's on everybody's Minds
01:15:57
here and Brooke Jenkins is not
01:16:00
Prosecuting in San Francisco the shooter
01:16:03
they look like
01:16:04
you know a clean shoot as they would say
01:16:06
in the police business an appropriate one
01:16:09
and it's tragic to say it is but the
01:16:11
person did charge the security guard the
01:16:14
security guard did fear for their life
01:16:15
and shot him so Brook Jenkins is not
01:16:17
going to pursue anything but in New York
01:16:20
City they're pursuing manslaughter for
01:16:21
the person which did seem a bit excessive
01:16:23
from the video it's hard to tell what
01:16:25
the reality is in these situations any
01:16:27
thoughts on it David these two cases in
01:16:29
Two Cities
01:16:31
yeah look I mean the only time you can
01:16:33
get a Soros DA excited about
01:16:35
Prosecuting someone is when they act in
01:16:37
self-defense or defense of others I mean
01:16:39
this Marine I guess Daniel Penny is his
01:16:42
name he was acting in defense of others
01:16:44
the person who he stopped
01:16:47
was someone with an extensive criminal
01:16:49
record who had just recently engaged in
01:16:54
an attempted kidnapping who had punched
01:16:56
elderly people
01:16:57
had pretty gnarly dozens of arrests in
01:17:00
fact people on Reddit were talking about
01:17:03
how dangerous this person was apparently
01:17:05
a dozen years ago or so he was seen as
01:17:09
more of like a quirky like Michael
01:17:11
Jackson impersonator street performer
01:17:14
street performer but something happened
01:17:16
this is according to a Reddit post that
01:17:18
I saw where something happened and there
01:17:20
was some sort of psychological break and
01:17:22
then since then he's had dozens and
01:17:24
dozens of crimes and they just keep
01:17:27
letting him loose through this revolving
01:17:30
door of a justice system we have and now
01:17:32
look no one likes to see him basically
01:17:35
dying and yeah it's too bad it's
01:17:38
horrible that that happened tragic I
01:17:40
don't know though that if you're trying
01:17:42
to stop someone I don't know how easy it
01:17:45
is to precisely control whether you use
01:17:48
too much force or not so I think Daniel
01:17:50
Penny has a strong case that he was
01:17:53
acting in self-defense and defense of
01:17:55
others and there were two other people
01:17:56
by the way who were holding this person
01:17:57
down there were three of them
01:17:59
restraining him and what universally New
01:18:01
Yorkers said to me of all different
01:18:04
backgrounds was this is not a race issue
01:18:06
the other I think one or two of the
01:18:08
other people were people of color it was
01:18:09
not a race issue and they're trying to
01:18:11
make it into a race issue in both these
01:18:12
cases and it's this is literally what
01:18:15
happens it's just having been through
01:18:17
this in New York in the 70s and 80s when
01:18:18
you do so who's they who's they when you
01:18:21
say trying to make it a bunch of
01:18:22
protests on the street both in San
01:18:24
Francisco
01:18:25
and New York people protesting these as
01:18:27
you know justice issues the fact is if
01:18:30
you do not
01:18:31
if you allow lawlessness for too long a
01:18:33
period of time you get a Bernie Goetz
01:18:36
situation and Bernie Goetz people can
01:18:38
look it up in the 80s I was a kid when
01:18:40
it happened but they tried to mug
01:18:42
somebody he had a gun he shot him and
01:18:45
like this is what happens if you allow
01:18:46
lawlessness for extended periods of time
01:18:49
it's just you're basically gambling and
01:18:52
what happened to Bernie Goetz he got
01:18:54
found not guilty in the case he got not
01:18:56
guilty but I think he had an illegal gun
01:18:58
so he was guilty of that the Bernie Goetz
01:19:00
thing was really um crazy because at the
01:19:04
time the climate in New York in this
01:19:06
1984 shooting
01:19:10
there was a portion of people who I
01:19:12
don't want to say they made him a hero
01:19:14
but they made it a cause this is what
01:19:17
happens if you allow us to be assaulted
01:19:19
forever we're going to fight back at
01:19:21
some point that was the vibe in New York
01:19:23
when I was a child I was 14 15 years
01:19:24
old when this happened he was charged
01:19:26
with attempted murder assault what was
01:19:28
the name of that vigilante group that
01:19:30
used to walk the streets the something
01:19:31
angels that was the um Guardian Angels
01:19:34
Guardian Angels so it was so bad in the
01:19:36
80s and I actually almost signed up for
01:19:38
the guardian angels I went to their
01:19:39
headquarters because I was practicing
01:19:40
martial arts and I thought I would check
01:19:42
it out and um they had their office in
01:19:44
Hell's Kitchen I didn't wind up joining
01:19:46
but what they would do is they would
01:19:47
just ride the subway they would wear
01:19:49
a certain type of hat
01:19:51
and wear a Guardian Angels shirt and all
01:19:53
they did was ride the subways a red
01:19:56
beret and they would just ride the
01:19:57
subways
01:19:58
and you felt kind of safe martial arts were
01:20:00
you taking Taekwondo I was in Taekwondo
01:20:02
yeah this is before a mixed martial arts
01:20:04
but they just rode the Subways and
01:20:06
honestly I've been on the Subways with
01:20:08
them many times you felt safe and it
01:20:10
wasn't vigilantes they were Guardian
01:20:12
angels they used that term and many
01:20:14
times they would do exactly what this
01:20:16
Marine did which is try to subdue
01:20:18
somebody who was committing crime I was
01:20:20
I had
01:20:21
two distinct instances where people
01:20:23
tried to mug me you know riding the
01:20:25
Subways in New York in the 80s
01:20:27
two distinct times and one was a group
01:20:29
of people and one was one person like it
01:20:31
was pretty scary
01:20:33
both times I navigated it but it was
01:20:35
yeah not pleasant in the 80s in New York
01:20:38
can I say one more thing about this
01:20:39
Daniel Penny Jordan Neely case so look
01:20:42
at the end of the day this is going to
01:20:43
be litigated I don't know all the
01:20:45
details they're gonna have to litigate
01:20:46
whether Daniel Penny's use of force was
01:20:48
was excessive or not but but here's the
01:20:50
thing is that the media has been falsely
01:20:54
representing Jordan Neely by only
01:20:56
posting 10 year old photos of him and
01:20:58
leaving out crucial information this was
01:20:59
a press report so again this is why I
01:21:02
mentioned the whole Michael Jackson
01:21:03
impersonator thing is that the media
01:21:05
keeps portraying Neely as this innocent
01:21:06
harmless guy who is this like delightful
01:21:09
Michael Jackson impersonator in truth he
01:21:12
hasn't done that in more than a decade
01:21:14
because again he had some sort of mental
01:21:15
break and since then he's been arrested
01:21:18
over 40 times including for attempting
01:21:20
to kidnap a seven-year-old child
01:21:23
and so the media is not portraying this
01:21:27
case
01:21:28
I think in an accurate way and I think
01:21:30
as a result of that it leads to pressure
01:21:33
on the D.A to prosecute someone who has
01:21:36
I think a strong self-defense claim or
01:21:38
you know maybe the D.A just wants to do
01:21:39
this anyway and it gives the D.A cover
01:21:42
to do this Soros is I mean I know that
01:21:45
we have this back and forth with this
01:21:46
why is CNN being inaccurate do you think
01:21:48
Sacks
01:21:49
they're basically cooperating with Alvin
01:21:51
Bragg's interpretation of the case and
01:21:52
they're trying to make the case against
01:21:54
penny look as damning as possible yeah
01:21:57
why don't they just take it straight
01:21:58
down the middle it's a tragedy we have a
01:22:00
screwed up situation here we got a
01:22:02
mental health crisis and it's a tragedy
01:22:04
for everybody involved on the Bernie Goetz
01:22:05
stuff he served eight of a 12 month
01:22:08
sentence for the firearm charge and he
01:22:10
had a massive 43 million dollar civil uh
01:22:13
judgment against him
01:22:14
in 1996 a decade later it's just this is a
01:22:18
little different than the the Getz thing
01:22:20
because pulling out a gun and shooting
01:22:22
somebody well yeah no that's deadly
01:22:23
intent yeah yeah that's that's the
01:22:25
contrast with Penny he's a he's a
01:22:28
trained Marine right he's trying to
01:22:30
immobilize him he has to believe that he
01:22:32
he's just trying to subdue yes Neely and
01:22:35
so using a Chokehold to kill him that's
01:22:37
an unfortunate consequence of what
01:22:39
happened but he was trying to restrain
01:22:41
the guy as far as we know right as far
01:22:44
as we know yeah I mean tragedies all
01:22:47
around we got to have Law and Order I I
01:22:49
tweeted like I don't know why we still
01:22:51
have the post office maybe we can make
01:22:52
that like once a week and redo all of
01:22:54
that space and allow every American
01:22:56
who's suffering from mental illness to
01:22:58
check in to what used to be the post
01:23:00
office
01:23:01
you know maybe like once a week and
01:23:03
obviously you can give those people very
01:23:05
gentle Landings but I don't think we
01:23:06
need Postal Service more than once or
01:23:07
twice a week and then let
01:23:09
you know let's reallocate some money
01:23:11
towards mental health in this country
01:23:13
where anybody who's sick who feels like
01:23:16
they're violent or feels like they're
01:23:17
suicidal can just go into a publicly
01:23:20
provided facility and say I'm a sick
01:23:21
person please help me this would solve a
01:23:23
lot of problems in society we've got a
01:23:26
mental health crisis we should provide
01:23:27
Mental Health Services to all Americans
01:23:29
and it's a obviously easy thing for us
01:23:31
to afford to do and if we had done that
01:23:33
then this never would have happened
01:23:34
exactly I mean literally you have sacks
01:23:37
who wants to balance the budget saying
01:23:39
hey this is something we're spending on
01:23:40
we can all agree on this compared to the
01:23:42
impact on society I don't think it would
01:23:44
be a huge expense we would save money
01:23:45
we'd save money because a city like San
01:23:48
Francisco could become quite livable or
01:23:50
New York if and then if God forbid these
01:23:52
terrible school shootings you know if
01:23:54
you avoid even one of them it's 30
01:23:55
people's lives or 10 people's lives
01:23:59
convert post offices what we need to do
01:24:01
is stand up scaled shelters and it
01:24:03
doesn't need to be done on the most
01:24:04
expensive land in a given city outside
01:24:07
of cities
01:24:08
there is no expectation in Europe for
01:24:11
like Paris or London to be affordable or
01:24:14
Hong Kong to be affordable there are
01:24:15
affordable places 30 minutes outside of
01:24:17
those places where you could put these
01:24:19
facilities I just want to ask one
01:24:21
question to sax because I don't know
01:24:23
and I know Saks is a little bit deeper
01:24:25
into the center what is George Soros's
01:24:27
motivation for putting in these Lawless
01:24:31
insane DA's like I understand that he
01:24:34
was able to buy them they're low cost
01:24:36
there's not a lot of money in them okay
01:24:37
I understand that that's table Stakes
01:24:39
but what is his actual motivation for
01:24:41
causing chaos in cities listen we can't
01:24:43
know exactly what his motivation is but
01:24:45
what he did is he went into cities where
01:24:48
he doesn't live
01:24:50
and flooded The Zone with money to get
01:24:52
his preferred candidate elected his D.A
01:24:54
now the reason he did that was to change
01:24:55
the law and the way that he changed the
01:24:57
law was not through legislatures the way
01:24:59
you're supposed to operate but rather by
01:25:01
abusing prosecutorial discretion so in
01:25:03
other words once he gets his Soros D.A
01:25:05
elected they can change the law by
01:25:07
deciding what to prosecute and what not
01:25:09
to prosecute right and that's why there
01:25:11
is so much lawlessness in these cities
01:25:12
but that there's a better path you're
01:25:14
saying yeah this is not the only way
01:25:16
that Soros has
01:25:18
I'd say imposed his values on cities
01:25:23
that he doesn't live in where does he
01:25:25
live I think he's a New York guy but I'm
01:25:26
not sure but but he's gone far beyond
01:25:29
that obviously in these elections but
01:25:31
also he's done this across the world
01:25:34
Soros has this thing called the Open
01:25:36
Society Foundations which sounds like
01:25:38
it's spreading democracy and liberal
01:25:39
values but in fact is fomenting regime
01:25:42
change all over the world and he's been
01:25:45
sponsoring and funding color revolutions
01:25:47
all over the world now if you like some
01:25:49
of the values he's spreading then maybe
01:25:51
you think that's a good thing but I can
01:25:53
tell you that the way this is perceived
01:25:55
by all these countries all over the
01:25:57
world is it creates tremendous
01:26:00
dissension and conflict and then they
01:26:03
look at America and they basically say
01:26:04
you know this American billionaire is
01:26:06
coming into our country and he's funding
01:26:08
regime change and it makes America look
01:26:12
bad now he's doing this I think with
01:26:14
the cooperation of our state department
01:26:16
in a lot of cases and maybe the CIA I don't
01:26:19
know but this is why America frankly is
01:26:21
hated all over the world as we go
01:26:22
running around meddling in the in the
01:26:25
internal affairs of all these countries
01:26:27
too is this guy all there like that was
01:26:29
the other thing I heard is that he's not
01:26:30
all there and the people around him are
01:26:31
doing these kind of things in his
01:26:32
organizations I heard something similar
01:26:34
is that it's the idiot son Alexander
01:26:36
who's really now pulling the strings
01:26:38
would you would you allow source to
01:26:40
speak at all in Summit would you
01:26:41
interview yeah sure yeah let's have
01:26:43
Soros or his son and they could explain
01:26:44
themselves if they're so proud
01:26:45
apparently there is an article that
01:26:47
Alexander Soros has visited the White
01:26:49
House like two dozen times during the
01:26:51
Biden presidency this is an extremely
01:26:53
powerful and connected person I mean I'm
01:26:55
sure he listens to the pod okay we'll see
01:26:57
you all next time this is episode one
01:26:59
two nine of all in we'll see you on
01:27:02
episode 130 bye bye we love you bye-bye
01:27:06
to let your winners ride
01:27:09
Rain Man
01:27:10
[Music]
01:27:13
we open source it to the fans and
01:27:16
they've just gone crazy with it
01:27:20
[Music]
01:27:22
besties
01:27:25
[Music]
01:27:30
release
01:27:33
[Music]
01:27:55
[Music]

Badges

This episode stands out for the following:

  • Funniest (60)

Episode Highlights

  • Reddit Performance Reviews Debut
    The team introduces a new feature where audience feedback is shared live.
    “Cue some music here, some graphics!”
    @ 00m 34s
    May 19, 2023
  • AI Regulation Discussion
    Sam Altman proposes a new agency to oversee AI, sparking debate on regulation.
    “Sam claimed the U.S should create a separate agency to oversee AI.”
    @ 06m 55s
    May 19, 2023
  • Open Source Models on the Rise
    Discussion on how open source AI models are becoming more accessible and powerful.
    “The cat is out of the bag, the horses have left the barn.”
    @ 16m 25s
    May 19, 2023
  • AI and Nuclear Weapons
    Buffett equates AI to nuclear weapons, highlighting its potential dangers.
    “AI is like nuclear weapons, whose Genie you can't put back in the bottle.”
    @ 25m 05s
    May 19, 2023
  • Political Climate on AI
    The White House's negative stance on tech raises concerns about understanding AI.
    “The mentality was that Tech is bad, we hate social media.”
    @ 36m 58s
    May 19, 2023
  • AI and Nuclear Fear
    Elon Musk's concerns about AI are likened to fears of nuclear disaster.
    “Elon is scared about General artificial intelligence is nuclear Holocaust.”
    @ 39m 43s
    May 19, 2023
  • Navigating Controversy
    The discussion touches on the challenges of leadership choices in a polarized environment.
    “If you pick somebody that both sides dislike, you probably picked the right person.”
    @ 45m 53s
    May 19, 2023
  • Lena Khan's Investigation
    Lena Khan launched a big investigation into pharmacy benefit managers, targeting drug inflation.
    “She actually went after the pbms... the real culprits around this are the pharmacy benefit managers”
    @ 56m 02s
    May 19, 2023
  • Apple's AR Headset Launch
    Apple's AR headset is expected to be revealed soon, breaking from their typical product release strategy.
    “This is a break from Apple's typical way of releasing products”
    @ 01h 03m 00s
    May 19, 2023
  • Housing Market Concerns
    A Gallup survey shows a record low in Americans believing it's a good time to buy a house.
    “The number of Americans who say it's a good time to buy a house has never been lower”
    @ 01h 07m 50s
    May 19, 2023
  • The Ghost Town of San Francisco
    San Francisco feels like a ghost town compared to the bustling streets of New York.
    “It's a real shame, it's a real real shame.”
    @ 01h 14m 21s
    May 19, 2023
  • Mental Health Crisis
    A call for better mental health services to address societal issues.
    “We have a mental health crisis; we should provide Mental Health Services to all Americans.”
    @ 01h 23m 27s
    May 19, 2023

Key Moments

  • Audience Feedback @ 00:25
  • AI Regulation @ 06:55
  • Regulatory Challenges @ 20:52
  • Agency on Vibes @ 37:41
  • Nuclear AI @ 39:43
  • Censorship Concerns @ 48:44
  • Housing Market Low @ 1:07:50
  • San Francisco vs New York @ 1:14:10

Related Episodes

  • DOJ targets Nvidia, Meme stock comeback, Trump fundraiser in SF, Apple/OpenAI, Texas stock market
  • AI Bubble Pops, Zuck Freezes Hiring, Newsom’s 2028 Surge, Russia/Ukraine Endgame
  • Fed Hesitates on Tariffs, The New Mag 7, Death of VC, Google's Value in a Post-Search World
  • E133: Market melt-up, IPO update, AI startups overheat, Reddit revolts & more with Brad Gerstner
  • Trump Brokers Gaza Peace Deal, National Guard in Chicago, OpenAI/AMD, AI Roundtripping, Gold Rally
  • E114: Markets update: whipsaw macro picture, big tech, startup mass extinction event, VC reckoning
  • DeepSeek Panic, US vs China, OpenAI $40B?, and Doge Delivers with Travis Kalanick and David Sacks
  • E152: Real estate chaos, WeWork bankruptcy, Biden regulates AI, Ukraine's “Cronkite Moment” & more