
E99: Cheating scandals, Twitter updates, rapid AI advancements, Biden's pardon, Section 230 & more

October 07, 2022 / 01:25:30

This episode covers topics such as the recent cheating scandals in chess and poker, Elon Musk's Twitter acquisition, and marijuana legalization. Hosts include David Sacks, David Friedberg, and Chamath Palihapitiya.

The discussion begins with the cheating scandals in chess and poker, focusing on Hans Niemann's alleged cheating in chess tournaments. Sacks explains how statistical analysis has raised suspicions about Niemann's gameplay, while Friedberg shares insights on the implications of cheating in competitive environments.

The conversation shifts to Elon Musk's ongoing Twitter acquisition, with Sacks detailing the complexities surrounding the deal and the potential financial implications for Musk and Twitter shareholders.

Later, the hosts discuss President Biden's recent announcement to pardon federal offenses of simple marijuana possession. The implications of this decision on the legal cannabis industry and the need for regulatory changes are explored.

The episode concludes with a debate on Section 230 and the responsibilities of social media platforms regarding content moderation and free speech.

TL;DR

The episode discusses cheating scandals in chess and poker, Elon Musk's Twitter deal, and Biden's marijuana pardons.

Video

00:00:00
we're seven minutes in and we've
00:00:01
produced absolutely nothing that will go
00:00:03
in the show
00:00:08
hey burgers
00:00:11
Sacks I'm actually angry at Sacks for not
00:00:14
publishing my AMA from the other day
00:00:17
crashed we had such a crowded room we
00:00:20
had some people in the room for like
00:00:22
four hours it was crazy it was like the
00:00:25
original days of Clubhouse everyone I
00:00:27
know that was trying to get in was
00:00:28
texting saying they couldn't get in so
00:00:29
it definitely capped out right I know
00:00:31
well we hit we hit some scalability you
00:00:33
may want to buy an extra server Sacks
00:00:35
cheap weren't you the same guy who was
00:00:37
responsible for scaling PayPal no that
00:00:39
was somebody else that was eBay
00:00:41
they sold it before it's killed no no
00:00:43
that's not true we had huge scalability
00:00:45
challenges at PayPal too it seems like a
00:00:47
theme yeah the theme is when you have an
00:00:49
app that's breaking out you hit
00:00:51
scalability challenges it's called a
00:00:53
high class problem 2 000 people is not a
00:00:55
high class problem it's a trickle it's
00:00:58
2022. 2 000 people participate in the
00:01:00
conversation is
00:01:02
a child I haven't written code in 20
00:01:04
years here's what you do when you get to
00:01:06
a thousand people coming to the room
00:01:07
that's a lot everybody else is in
00:01:08
passive for you've never written code
00:01:10
ever of course I of course I drive
00:01:12
that's a lie come on be honest oh yeah
00:01:13
it's actually been 25 years the last time I
00:01:15
wrote code was Lotus Notes it's true
00:01:19
and let your winners ride
00:01:23
[Music]
00:01:26
we open source it to the fans and
00:01:29
they've just gone crazy
00:01:30
[Music]
00:01:34
so there have been three cheating
00:01:36
scandals across poker chess and even
00:01:38
competitive uh fishing I don't know if
00:01:40
you guys saw the fishing one but they
00:01:42
found weights and fillets during a fish
00:01:44
weigh-in and everybody wants us to
00:01:47
check in on the chess and the poker
00:01:48
scandals
00:01:50
chess.com just released their report
00:01:53
that this Grand Master has been
00:01:55
suspended they have evidence he cheated
00:01:58
basically in a bunch of
00:02:02
uh tournaments that were in fact for
00:02:04
money he denied that he had done that
00:02:06
but he had previously cheated as a kid
00:02:07
they now have the statistical proof that
00:02:10
he was playing essentially perfect chess
00:02:13
and uh they've outlined this in like
00:02:15
hundreds of pages in a report Saxy what
00:02:19
are your thoughts on this uh scandal in
00:02:20
chess Magnus Carlsen finally came out
00:02:22
and explained why he thought Hans Niemann
00:02:25
was cheating basically he got the strong
00:02:27
perception during the game that Hans
00:02:29
wasn't really putting in a lot of effort
00:02:31
that he wasn't under a lot of stress and
00:02:33
he's he it's his experience that when
00:02:35
he's playing you know the top players
00:02:37
they're intensely concentrating and the
00:02:40
Hans Niemann didn't seem to be exerting
00:02:41
himself at all so his you know hackles
00:02:44
were raised and got suspicious and then
00:02:46
you know he has had this meteoric rise
00:02:48
the fastest rise in classical chess
00:02:51
rating ever and I guess there were he
00:02:54
had gotten suspended from chess.com in
00:02:56
the past for cheating so on this basis
00:02:58
and maybe other things that Magnus isn't
00:03:00
telling us Magnus basically said that
00:03:01
this guy is cheating I think the maybe
00:03:03
the interesting part of this is that
00:03:05
there's been a lot of analysis now of
00:03:06
Hans Niemann's games and I just think the
00:03:09
methodology is kind of interesting so
00:03:10
what they do is they run all of his
00:03:13
games uh through a computer and they
00:03:16
compare his moves to the best computer
00:03:18
move and they basically assign a
00:03:20
percentage for how often his move
00:03:23
matches the computer move
00:03:25
and what they found is there were a
00:03:27
handful of games where it was literally
00:03:28
a hundred percent that's basically
00:03:30
impossible without cheating I mean you
00:03:31
look at the top players who through an
00:03:34
entire career have never had a 100% game
00:03:36
that you know chess is so subtle that
00:03:38
the computer can now see so many moves
00:03:41
into the future that nailing the best
00:03:43
move every single time for 40 50 100
00:03:45
moves is just and and and in chess which
00:03:47
a human really can't do that well is
00:03:49
that there are positional sacrifices
00:03:51
that you will make in short lines that
00:03:54
pay off much much later in the future
00:03:55
which is impossible for a human to
00:03:57
calculate and so you know and you saw
00:03:59
this by the way when uh I think it was
00:04:02
it was the Google AI the DeepMind
00:04:04
AI that also play chess so the idea that
00:04:07
this guy could play absolutely
00:04:09
perfectly according to those lines is
00:04:12
only possible if you're cheating right
00:04:14
exactly so so there were a handful of
00:04:15
games at 100 and then there were
00:04:17
tournaments where his percentages were
00:04:19
in the 70s something plus and so just to
00:04:22
give you some basic comparison Bobby
00:04:24
Fischer during his legendary 20-game
00:04:26
winning streak uh was at 72 percent so
00:04:30
he only matched the computer's best
00:04:32
move 72 percent of the time Magnus Carlsen
00:04:34
playing at his best is 70 percent uh
00:04:38
Garry Kasparov in his career was 69 and
00:04:41
then the you know the the super GM's
00:04:44
category are typically in the 64 to 68
00:04:46
range so I think it's really interesting
00:04:49
actually how you can now quantify by
00:04:51
comparing the human move to the best
00:04:53
computer move and it's multiple
00:04:55
computers training the best they
00:04:57
actually have it provides a way to
00:04:59
assess who the greatest player ever is
00:05:01
um I actually thought that it was Magnus
00:05:02
but now maybe there's a basis for
00:05:04
believing it was Bobby Fischer because
00:05:06
he was at 72 and Magnus is only at 70
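The engine-correlation methodology described here reduces to a simple computation: for each move a player made, ask a strong engine for its top choice in the same position, then report the fraction of exact matches. A minimal sketch follows; real analyses drive an engine such as Stockfish through a library like python-chess (omitted here), and the move lists below are made up for illustration only.

```python
def engine_match_rate(played_moves, engine_best_moves):
    """Fraction of a player's moves that equal the engine's top choice."""
    if len(played_moves) != len(engine_best_moves):
        raise ValueError("move lists must be the same length")
    matches = sum(p == e for p, e in zip(played_moves, engine_best_moves))
    return matches / len(played_moves)

# Hypothetical 10-move game: 7 moves agree with the engine -> 70%,
# roughly the rate attributed to Magnus Carlsen at his best.
played = ["e4", "Nf3", "Bb5", "Ba4", "O-O", "Re1", "Bb3", "c3", "h3", "d4"]
best   = ["e4", "Nf3", "Bb5", "Ba4", "O-O", "Qe2", "Bb3", "d3", "h3", "d3"]
print(f"{engine_match_rate(played, best):.0%}")  # -> 70%
```

On this scale, a handful of 100% games stands far outside the 64-72% band cited for the strongest human players.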
00:05:08
however look the idea that Hans Niemann
00:05:10
is in the 70s 80s or 90s during
00:05:13
tournaments would be you know just an
00:05:16
off the charts level of play and if if
00:05:19
he's not cheating then we should expect
00:05:21
over the next couple of years that he
00:05:24
should rapidly become the world's number
00:05:26
one player over the board uh you know
00:05:29
now that they have all this
00:05:30
anti-cheating stuff right so it'll be
00:05:32
interesting to see what happens in his
00:05:33
career now that they've really cracked
00:05:35
down on you know with
00:05:39
anti-cheating technology I have a general
00:05:40
observation which is these people are
00:05:42
complete losers
00:05:45
the people that cheat in any of these
00:05:47
games don't understand
00:05:49
this basic simple idea which is that
00:05:51
trying is a huge part of The Human
00:05:53
Experience the whole point is to be out
00:05:56
there in the field of play trying and
00:05:59
it's basically taking the wins and the
00:06:01
losses and getting better that is the
00:06:03
path that's what's fun once you actually
00:06:05
win it's actually not that much fun
00:06:08
because then you have this pressure of
00:06:09
maintaining Excellence that's a lot less
00:06:12
enjoyable than the path to getting there
00:06:14
and so the fact that these people don't
00:06:16
understand that makes them slightly
00:06:18
broken in my opinion and then the other
00:06:21
thing is like why is it that we have
00:06:23
this strain of people now that are just
00:06:25
so devoid of any personal responsibility
00:06:28
that they'll just so brazenly take
00:06:30
advantage of this stuff it's really
00:06:32
ridiculous to me it's really
00:06:34
sad these people are pathetic it's
00:06:36
really pathetic this but it is really
00:06:38
interesting how they caught him and
00:06:40
running this against the computer here's
00:06:41
a chart of um his scores in these
00:06:44
tournaments oh well here is this first
00:06:46
chart is how quickly he Advanced which
00:06:49
was off the charts and then the second
00:06:51
chart that's really interesting is
00:06:54
um his chess.com strength so if you
00:06:55
don't know chess.com it has become like
00:06:56
a juggernaut in the Chess World
00:06:57
especially after that HBO series
00:07:01
came out a lot of people subscribe to it
00:07:02
I subscribed to it I like to play chess
00:07:04
there
00:07:05
and man you look at the chess strength
00:07:06
score there
00:07:07
he was just like perfect and then the
00:07:09
number of games he likely cheated in you
00:07:11
can see the last two columns he's
00:07:12
basically cheating in every game
00:07:14
you know Queen's Gambit yeah great show
00:07:16
on Netflix
00:07:18
and he he said he didn't cheat in any uh
00:07:21
of the uh games
00:07:23
where they were live streaming but
00:07:25
they've proven that wrong sax how does
00:07:27
he cheat in person then well that's the
00:07:29
thing no one really knows and I don't
00:07:31
want to overly judge it until they have
00:07:33
hard proof that he was cheating I mean
00:07:36
look here's the thing he was never
00:07:37
caught in the act it's just that the
00:07:39
computer evidence you know seems pretty
00:07:41
damning
00:07:42
and I mean here's the thing is I don't
00:07:44
know how they prove I don't know how
00:07:46
they prove that he was cheating over the
00:07:47
board without actually catching him
00:07:49
doing it and I don't I still don't think
00:07:51
anyone really has a good theory in terms
00:07:53
of how he was able to do that well it's
00:07:55
not just him though like look the the
00:07:56
fishing thing Jason which was crazy I
00:07:58
think Friedberg shared the video this
00:08:00
guy was in a fishing competition and
00:08:02
they basically caught these fish and
00:08:03
then they put these big weighted pellets
00:08:06
inside the fish's body they even put
00:08:07
like a you know chicken breast and
00:08:09
chicken fillets inside of the things
00:08:12
yeah
00:08:14
you know uh then there's no crossover
00:08:17
now in poker everybody's afraid that
00:08:19
there are ways in which you can read the
00:08:22
RFID and some of the cards and some of
00:08:24
these you know televised situations and
00:08:26
and front run what the what the playing
00:08:27
situation is so that you know whether
00:08:29
you're winning or losing and again I
00:08:31
just asked the question like
00:08:33
is it is it is this are things that bad
00:08:35
that this is what it gets to like we all
00:08:38
play poker the idea that we would play
00:08:39
against somebody that would take that
00:08:41
edge
00:08:42
yeah it's gross yeah it's really makes
00:08:45
me really it's sad so disappointing yeah
00:08:48
it's horrible one one observation might
00:08:50
be that across all three because I'm
00:08:52
trying to find some common thread across
00:08:54
these but it could be that there was a
00:08:56
lot of cheating going on for a long time
00:08:59
and maybe the fact that we do have
00:09:02
so much digital imagery that's live on
00:09:06
these things now and so much coverage
00:09:08
and everyone's got a cell phone that
00:09:09
suddenly
00:09:11
our perception of the cheating in
00:09:14
competitive events is becoming
00:09:17
more tuned whereas maybe there's been a
00:09:19
lot of cheating for a long time and it's
00:09:22
just kind of coming to light I mean we
00:09:23
didn't have a lot of live streaming in
00:09:25
poker who knows I mean we could probably
00:09:28
ask Phil this weekend but like for
00:09:30
how many years well there was Visions
00:09:32
yeah well there were tons of cheating
00:09:34
in online poker yeah remember like
00:09:37
people are using these like software
00:09:39
programs that would uh track the hand
00:09:42
history yeah of your opposition yeah
00:09:45
exactly so
00:09:46
so it helped you assess whether the
00:09:48
person might be bluffing in that
00:09:49
particular situation yeah like he has
00:09:51
superhuman memory so I don't know if you
00:09:53
guys I don't know if you guys watch
00:09:54
Twitch like video games like fortnite or
00:09:56
whatever but there are like players that
00:09:59
have been accused of using the screen
00:10:01
overlay systems that basically more
00:10:03
accurately show you and drive the mouse
00:10:05
to where an individual is on the screen
00:10:07
so you can more accurately shoot them
00:10:08
and so there's software overlays that
00:10:11
make you a better
00:10:12
you know competitive video I'll tell you
00:10:14
what the through line is and then this
00:10:17
the stuff basically became like so now
00:10:19
what's interesting is now there's eye
00:10:21
tracking software that people are using
00:10:23
on Twitch streams to see if the
00:10:26
individual is actually spotting the
00:10:28
target when they shoot or if the
00:10:30
software is spotting the target yeah
00:10:31
they're called reverse team Bots they're
00:10:33
aimbots yeah yeah and like reverse cheat
00:10:35
the whole thing and I think what's
00:10:36
interesting is just that there's so much
00:10:38
you know Insight now so much more video
00:10:40
streams so much more I mean think about
00:10:42
all those guys
00:10:44
yeah and there's cell phones and they
00:10:46
all videoed this thing happening yeah I
00:10:48
think 10 years ago that wouldn't have
00:10:49
been the case and there wouldn't have
00:10:50
been a big story about it and so you
00:10:52
said there was a theme you wanted to uh
00:10:54
there was a threat I I think the theme
00:10:55
is is pretty obvious which is that
00:10:57
there's been an absolute decay
00:11:00
of personal responsibility people don't
00:11:03
feel like there's any
00:11:05
downside to cheating anymore and they're
00:11:07
not willing to take it upon themselves
00:11:09
to take a journey of wins and losses to
00:11:11
get better at something they want the
00:11:13
easy solution
00:11:14
the easy solve the quick answer you know
00:11:17
that gets them to some sort of finish
00:11:20
line that they've imagined for
00:11:21
themselves will solve all their problems
00:11:24
the problem is it doesn't solve any
00:11:26
problems and it just makes them a wholly
00:11:28
corrupt individual yeah okay so let's
00:11:30
talk about this Hustler Casino Live uh
00:11:33
cash game play there's this woman Robbi
00:11:36
who is a new player apparently she's
00:11:38
being staked in a very high stakes game
00:11:40
she's playing against a guy Garrett who is a
00:11:42
very very
00:11:44
um well-known winning cash game player
00:11:48
and it was a very strange hand on the
00:11:50
turn all the money gets in she says she
00:11:53
has a bluff catcher then she claims that
00:11:54
she had a thought she misread her hand
00:11:56
now people are saying that
00:11:59
the poker world seems to be 70 30 that
00:12:01
she cheated but people keep vacillating
00:12:04
back and forth
00:12:06
there was a lot of weird word salad that
00:12:08
she said that she had a bluff catcher
00:12:10
which would normally be an ace then she
00:12:12
said she thought she had a pair of
00:12:13
Threes And then she immediately said
00:12:15
afterwards that he was giving her too
00:12:17
much credit
00:12:18
they confronted her in the hallway she
00:12:20
gave the money back because she
00:12:21
supposedly loves production so all this
00:12:23
stuff sounds very weird one side says
00:12:25
okay while this is happening because
00:12:27
she's a new player the other side is
00:12:29
saying somebody was signaling her that
00:12:32
she was good and giving her just a
00:12:33
binary you're good because if you were
00:12:35
gonna cheat cheating with Jack high in a
00:12:38
situation where you just put all in for
00:12:39
a 200 quarter million dollar pot seems
00:12:41
very suspect
00:12:43
uh I don't know if you guys watched the
00:12:45
hand breakdown where does everybody
00:12:46
stand on a percentage basis I guess if
00:12:49
they think she was cheating or not
00:12:51
because we this is not definitive
00:12:52
obviously it's not like they cut it open
00:12:54
and found the ball bearings it's not
00:12:55
it's not so obvious in that situation
00:12:57
but I think the way that that line
00:12:59
played made no sense
00:13:03
and I guess in her previous hand she had
00:13:07
a jack three and there was a three on
00:13:09
the board so if she misread her hand was
00:13:12
ten ten nine three no but you would have
00:13:14
but you would have you would have had to
00:13:15
call the Flop so I'm thinking what
00:13:17
yeah no I get it the hand makes no sense
00:13:19
but I'm just trying to find a logical
00:13:21
explanation and that Jack three
00:13:23
explanation somebody kind of fed that to
00:13:26
her and then she changed her story to
00:13:28
that so this changing of the story is
00:13:30
the thing I was sort of keyed on
00:13:31
Friedberg is why does she keep changing
00:13:33
her story is it because she's
00:13:34
embarrassed maybe she's had a couple of
00:13:36
beverages or whatever
00:13:38
or she's just a new player and she's
00:13:40
embarrassed by her play and can't
00:13:42
explain it she can't explain the hand
00:13:43
history all of the things you're saying
00:13:45
are probable
00:13:46
I don't think that yeah I don't think
00:13:48
there's any data for us to have a
00:13:49
strongly held point of view on this I'm
00:13:52
just looking forward to us all playing
00:13:54
live yeah HCL Poker Live October 21st
00:13:57
minus David Sacks unfortunately Chamath J
00:14:00
Cal gerthner
00:14:02
Stanley Tang Phil Hellmuth so we're
00:14:04
going to be playing on the same stream
00:14:05
we're gonna be playing on the same
00:14:07
screen same table I figured out how to
00:14:09
hack into the video stream for the
00:14:10
camera I just got my RFID sunglasses as
00:14:13
well I'm gonna read all your shitty
00:14:14
hands jaycal I'm gonna take your money
00:14:16
and I'm gonna buy your kids
00:14:18
for my 40th birthday Sky organized poker
00:14:22
in Tahoe okay and and we we brought in
00:14:25
the team from CBS that was the present
00:14:27
and they they taped it as if it was
00:14:29
being broadcast with hole cards and
00:14:31
commentators and we edited it into a
00:14:33
two-day show it was an incredible
00:14:35
birthday present I I it was it's one of
00:14:37
the greatest things that that anybody's
00:14:39
ever given me I appreciate
00:14:41
there was a one hour block where
00:14:44
somebody at the table said okay guys how
00:14:46
about we do a cheating free-for-all yes
00:14:48
where you could look at each other's
00:14:50
cards and you know you could sort of
00:14:52
help somebody else
00:14:53
switch cards whatever in that one hour
00:14:56
our beautiful home game of friendship
00:14:59
became Lord of the Flies I have never
00:15:02
seen so much hatred Angling
00:15:06
Behavior oh my God it was incredible
00:15:10
all that humans are capable of so I I
00:15:12
hope that we never uh we never we never
00:15:14
see cheating in our game yeah well we'll
00:15:16
see how it goes on October 21st at HCL
00:15:18
Poker Live
00:15:20
I'm excited I can't wait it should be a
00:15:22
lot of fun it should be a lot of fun oh
00:15:23
and uh we're not having any official 100
00:15:26
stuff but the fans uh some of the fans
00:15:28
who were at the um all in Summit
00:15:32
2022 are doing
00:15:34
uh their own episode 100 meetups on
00:15:38
October 15th I think allinmeetups.io
00:15:42
so there are fan meetups happening in
00:15:43
Zurich in a bunch of other places I'm
00:15:45
going to FaceTime into some of them and
00:15:46
just say hi to the fans you know it
00:15:47
might be like 10 people in a bar
00:15:49
somewhere
00:15:50
um I think the largest one is like Miami
00:15:52
or San Francisco are going to be like 50
00:15:53
people or something we should all
00:15:55
FaceTime it that'd actually be kind of
00:15:56
fun I I'm basically I told them to send
00:15:59
me an invite and I'll FaceTime in any
00:16:00
time this is next week when is this the
00:16:03
15th I think is this occurring
00:16:06
the October 15th it's a Saturday the
00:16:08
Saturday after the 100th episode people
00:16:09
are doing these allinmeetups.io what's
00:16:12
next earlier this week it was reported
00:16:15
that Elon contacted Twitter's board and
00:16:18
suggested that they move forward with
00:16:20
closing the transaction at the original
00:16:21
terms and the original purchase price of
00:16:24
$54.20 a share in the couple of
00:16:27
days since then and even as of right now
00:16:29
with some news reports coming out here
00:16:31
on Thursday morning
00:16:32
um it's it appears that there are still
00:16:34
some question marks around whether or
00:16:36
not the deal is actually going to move
00:16:38
forward at $54.20 a share because Elon as
00:16:40
of right now the report said is still
00:16:43
asking for a financing contingency in
00:16:45
order to close and there's a lot of back
00:16:47
and forth on what the terms are
00:16:48
meanwhile the court case in Delaware is
00:16:50
continuing forward on whether or not
00:16:52
Elon breached his terms of the original
00:16:55
agreement to close and buy Twitter at
00:16:57
$54.20 as we know leading up to the
00:17:01
signed deal or a post signing the deal
00:17:04
Elon put together a financing Syndicate
00:17:06
a combination of debt investors as well
00:17:09
as Equity co-investors with him to do
00:17:11
the purchase of Twitter at $54.20 a share
00:17:15
so the 40 some odd billion dollars of
00:17:17
capital that's needed was committed by a
00:17:20
set of investors that were going to
00:17:21
invest debt and equity and there's a big
00:17:23
question mark now on whether or not
00:17:25
those investors want to or would still
00:17:27
consummate the transaction with Elon
00:17:29
given how the markets have turned and
00:17:31
given how debt markets are trading and
00:17:33
Equity markets are trading so Chamath
00:17:35
I'd love to hear your point of view on
00:17:37
what um hurdles Does Elon still have in
00:17:39
front of him does he still want to get
00:17:41
this done and is there still a financing
00:17:43
Syndicate that's standing behind him at
00:17:45
the original purchase price to get it
00:17:46
done
00:17:47
that's a great question
00:17:49
um maybe the best way to start is Nick
00:17:50
do you want to cue up what I said on
00:17:53
August 25th the lawsuit really boils
00:17:56
down to one very specific clause which
00:17:58
is the Pinnacle
00:18:00
question at hand which is there is a
00:18:02
specific performance
00:18:05
Clause that Elon signed up to
00:18:08
right which you know his lawyers could
00:18:11
have struck out and either chose not to
00:18:13
or you know couldn't get the deal done
00:18:15
without and that specific performance
00:18:18
Clause says that Twitter can't force him
00:18:21
to close at 5420 a share
00:18:24
and I think the the issue at hand at the
00:18:26
Delaware business court is going to be
00:18:28
that because Twitter is going to point
00:18:30
to all of these you know gotchas and
00:18:33
disclaimers that they have around this
00:18:35
bot issue
00:18:36
as their cover story
00:18:40
and I think that really you know this
00:18:43
kind of again builds more and more
00:18:44
momentum in my mind that the most likely
00:18:48
outcome here is a settlement where
00:18:51
you have to pay
00:18:53
the economic difference between where
00:18:56
the stock is now and $54.20 which is more
00:18:58
than a billion dollars
00:19:00
or you close at some number below $54.20
00:19:05
a share
00:19:06
and I think that that is like you know
00:19:08
if you had to be a betting person that's
00:19:09
probably and if you look at the
00:19:11
the way the stock is traded and if you
00:19:14
also look at the way the options Market
00:19:15
trades that's what people are assuming
00:19:18
that there's a seven to ten billion
00:19:19
dollar swing
00:19:21
and if you impute that into the stock
00:19:22
price you kind of get into the fifty one
00:19:25
dollars a share kind of a an acquisition
00:19:27
price again I'm not saying that that is
00:19:29
right or should be right that's just
00:19:30
sort of what the market says yeah so so
00:19:33
I it turns out that you know sort of
00:19:35
like that
00:19:36
kind of guesstimate it turned out to be
00:19:38
pretty accurate because the stock today
00:19:39
is at $51 a share so I think that the
00:19:42
specific performance thing is exactly
00:19:44
what this thing has always hinged on and
00:19:47
I think that there was a realization
00:19:48
that there were very few outs around how
00:19:51
that contractual term was written and
00:19:53
agreed to so there is an out in the
00:19:56
contract and that out says that I think
00:19:58
it's by April if um if the deal doesn't
00:20:02
get done by April then the banks can
00:20:04
walk away from their commitment to fund
00:20:06
the debt and if the banks walk away then
00:20:09
Elon does have a financing contingency
00:20:11
that allows him to walk away so the
00:20:14
actual set of events that have to happen
00:20:16
is those two things specifically get to
00:20:18
April so the banks can pass and say
00:20:20
we've changed our mind market conditions
00:20:22
are different and then Elon is able to
00:20:24
say well you know the banks just walked
00:20:26
away right now the banks if you look at
00:20:29
all of the debt that they've committed
00:20:30
to
00:20:31
while they committed at a point in time
00:20:34
when the debt markets were much better
00:20:36
than they are today in the last you know
00:20:39
six or seven months since they agreed to
00:20:41
do this the debt markets have been
00:20:43
clobbered and specifically junk bonds
00:20:45
and a bunch of junk bond debt the yields
00:20:48
that you have to pay so the price to get
00:20:49
that kind of debt has skyrocketed so
00:20:52
roughly back of the envelope math would
00:20:55
tell me that right now the banks are
00:20:56
offside between one and two billion
00:20:58
dollars because they're not going to be
00:21:00
able to sell this debt to anybody else
00:21:01
so I think the banks obviously want to
00:21:03
weigh out the problem is their only way
00:21:06
out is to run the shot clock off until
00:21:08
April so I think that's the dance that
00:21:10
they're in right now elon's trying to
00:21:13
find a way to solve you know for the
00:21:15
merger
00:21:16
I think Twitter is going to say we're
00:21:17
not going to give you a financing
00:21:18
contingency you have to bring the banks
00:21:20
in and close right now and then we will
00:21:22
not go to court otherwise we're going to
00:21:25
court
00:21:26
and so I think it's a very delicate
00:21:28
predicament that they're all in but my
00:21:30
estimate is that the equity is probably
00:21:32
20 percent offside so it's not a huge
00:21:35
thing he can make that up because he can
00:21:37
create Equity value like nobody's
00:21:38
business
00:21:39
the debt is way offside by a couple
00:21:41
billion dollars
00:21:43
which is hard to make back but I think
00:21:44
in the end you know given enough time
00:21:47
they can probably make that back
00:21:49
the best off in all of this are the
00:21:50
Twitter shareholders they're getting an
00:21:53
enormous premium to what that company is
00:21:55
worth today in the open market
00:21:57
and so I think this deal is going to
00:21:58
close it's probably going to close in
00:22:00
the next few weeks and had you bought
00:22:02
Twitter when we were talking about it in
00:22:04
August you would have made 25 percent in six
00:22:07
weeks and you know if the deal closes at
00:22:08
54 you would have made it you know a
00:22:10
third of your money in eight weeks which
00:22:12
is you know very hard to do in a market
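Those return figures are internally consistent if Twitter traded around $40-41 when it was discussed in August; that entry price is an assumption implied by the quoted returns rather than stated outright in the conversation.

```python
# Entry price is assumed (implied by the returns quoted above).
entry_price = 40.65   # roughly where the stock would have traded in August
today = 51.00         # current price cited in the discussion
deal = 54.20          # agreed deal price

print(f"{today / entry_price - 1:.0%}")  # gain so far -> 25%
print(f"{deal / entry_price - 1:.0%}")   # gain if deal closes -> 33%
```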
00:22:14
if you're a GP at one of the funds like
00:22:17
Andreessen or Sequoia
00:22:19
and you had made this commitment to Elon
00:22:21
or even Larry Ellison a couple months
00:22:24
ago
00:22:25
do you fight against closing at 5420 do
00:22:28
you stick with the deal and support him
00:22:31
I mean what do you do given that the
00:22:32
premium is so much higher than where the
00:22:34
market would trade it at today some
00:22:35
people are saying the stock should be at
00:22:37
like 20 bucks a share or something the
00:22:38
average premium in an m a transaction in
00:22:40
the public markets is about 30 percent
00:22:43
so um and I think the fair value of
00:22:46
Twitter is around 32 to 35 bucks a
00:22:49
share
00:22:49
so you know it's not like he is
00:22:52
massively massively overpaying
00:22:55
and so you know I would just sort of
00:22:56
keep that in the realm of the possible
00:22:59
so like if you take 35 as the midpoint
00:23:02
fair value is really 45.50 so yeah he
00:23:06
paid 20 percent more than he should have but he
00:23:07
didn't pay a hundred percent more
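The back-of-the-envelope argument here can be checked directly: take the roughly $35 standalone fair value, apply the ~30% average public-market M&A premium, and compare against the $54.20 deal price. The figures below are the ones quoted in the discussion, not independent valuations.

```python
# All inputs are the numbers quoted in the conversation.
fair_value = 35.00    # standalone fair-value midpoint ($32-35 range cited)
avg_premium = 0.30    # typical M&A premium in public markets
deal_price = 54.20    # agreed purchase price per share

fair_deal_price = fair_value * (1 + avg_premium)
overpay = deal_price / fair_deal_price - 1

print(f"fair deal price: ${fair_deal_price:.2f}")  # -> $45.50
print(f"overpaid by:     {overpay:.0%}")           # -> 19%
```

That reproduces the "$45.50 fair value, paid about 20 percent more" framing in the discussion.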
00:23:09
so it's not as if you can't make that
00:23:11
Equity back as a private company
00:23:13
particularly because there's probably
00:23:15
ten dollars of fat in the stock if you
00:23:17
think about just Opex right in terms of
00:23:19
all the buildings they have maybe they
00:23:21
don't need as many employees maybe they
00:23:23
revisit salaries you know one thing is
00:23:25
when I looked at doing an activist play
00:23:27
at Twitter I think I mentioned this five
00:23:30
or six years ago one of the things that
00:23:31
I found was at that time Twitter was
00:23:33
running their own data centers and you
00:23:36
know the most obvious thing for me at
00:23:37
that time was like we're going to move
00:23:38
everything to AWS now I don't know if
00:23:40
that happened but I'm sure that if it
00:23:42
hasn't just bidding that out to Azure
00:23:45
gcp and AWS can raise you know three or
00:23:48
four billion dollars because I'm sure
00:23:49
those companies would want this kind of
00:23:51
an app on their cloud
00:23:53
so there's all kinds of things that I
00:23:55
think Elon can do as a private company
00:23:56
to make back maybe the small bit that he
00:23:59
overpaid and then he can get to the core
00:24:01
job of rebuilding this company to be
00:24:03
usable uh this product to be usable
00:24:05
because look I'll just speak as a user
00:24:06
right now
00:24:08
it has been decaying at a very very
00:24:11
rapid clip
00:24:12
and I think that his trepidation in
00:24:14
closing the merger
00:24:16
in part also even though he hasn't said
00:24:18
it has to do with the quality of the
00:24:20
experience it's just degraded it's not
00:24:22
as fun to use as it was during the
00:24:25
pandemic
00:24:26
um or even before the pandemic so
00:24:28
something is happening inside that app
00:24:30
that needs to get fixed and if he does
00:24:32
it he'll make a ton of money sort of
00:24:34
like what happened with Friendster and
00:24:35
Myspace and any social networking app
00:24:37
over time the quality degrades if it's
00:24:40
not growing it's shrinking and
00:24:41
00:24:43
also if the product hygiene isn't
00:24:46
enforced in code and product hygiene in
00:24:48
this case is you know the spam
00:24:51
bots the trolling it can really
00:24:55
take away from the experience yeah I
00:24:57
mean interestingly like if you think
00:24:59
back to the original
00:25:01
days of Twitter I
00:25:02
don't know if you guys remember you
00:25:04
would send in an SMS to do your Tweet
00:25:06
and then it would post up and other
00:25:08
people would get the SMS notification
00:25:10
and um it would crash all the time and
00:25:13
the app was notoriously
00:25:15
crashing it was poorly architected at
00:25:18
the beginning and some people have
00:25:19
argued that Twitter has had
00:25:22
a cultural technical incompetence from
00:25:25
the earliest days I think that's a
00:25:26
little harsh so I do think look Twitter
00:25:28
was known for what's called a fail whale
00:25:30
you know they used to have these fail
00:25:31
whales constantly
00:25:33
and they did hire people that tried
00:25:36
to fix it I remember the
00:25:38
funniest part of when I went in there
00:25:40
and said hey here's my plan and here's
00:25:42
what I want to do is literally a day or
00:25:44
two later the head of engineering quit I
00:25:46
can't remember what his name was but he
00:25:48
was just out the door uh but it is a
00:25:54
I think it is a team that has tried its
00:25:56
best
00:25:57
that probably at the edges definitely
00:26:00
made some technical miscalculations like
00:26:02
I said at that time the idea that any
00:26:04
app of that scale would use their own
00:26:06
data centers made no technical sense
00:26:09
whatsoever it made the app laggy it made
00:26:11
it hard to use it made it more prone to
00:26:13
downtime to your point
00:26:15
but that being said I would be shocked
00:26:17
if they haven't made meaningful
00:26:18
improvements because the stack of the
00:26:21
internet has gotten so much better over
00:26:22
the last seven years and so to your
00:26:24
point David if they didn't take
00:26:25
advantage of all these new abstractions
00:26:28
and mechanisms to rebuild the
00:26:30
app or to rebuild search or to rebuild
00:26:32
you know all these
00:26:35
infrastructure elements of the app work
00:26:36
I would be really surprised because then
00:26:38
what are they doing over there yeah well
00:26:40
look I mean to the point earlier besides
00:26:42
the product points there was a
00:26:44
really good
00:26:46
tweet I liked
00:26:49
that said for what it's worth I think
00:26:51
Elon will show us just how lean the
00:26:53
Silicon Valley Advertising companies can
00:26:54
be run at the very least it'll be an
00:26:56
interesting thought experiment for
00:26:58
spectators
00:27:00
um because if he does go in and actually
00:27:01
does significantly reduce Opex and head
00:27:03
count and the company does turn
00:27:05
profitable and he can grow it well look
00:27:07
by the way it'll really be
00:27:09
a beacon for the big financial
00:27:11
companies yeah from a financial
00:27:13
perspective there is ten dollars a share
00:27:15
in Opex cuts that he should make right
00:27:17
away just so that he is economically
00:27:19
break even and he looks like every other
00:27:21
M&A transaction you know you paid a 30 percent
00:27:23
premium and you bought a company there
00:27:26
is a lot of margin of safety there if
00:27:27
Elon does that so to your point there
00:27:29
probably is and there probably needs to
00:27:31
be a meaningful RIF at Twitter I'm not
00:27:33
saying it's right I'm not saying it's
00:27:34
you know and I feel for the people that
00:27:36
may go through it but from a financial
00:27:37
perspective the math makes sense for him
00:27:40
to do that because then he is a break
00:27:43
even proposition on a go-in M&A
00:27:45
transaction and I think that there's
00:27:46
there's a lot of intelligent Financial
00:27:49
sense so that all the debt holders feel
00:27:51
like he's doing the right thing and all
00:27:53
the equity holders particularly see a
00:27:56
chance for them to make a decent return
00:27:57
here
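A quick aside on the math in this segment: the hosts' numbers can be sanity-checked in a few lines of Python. The $35 fair-value midpoint, the roughly 30 percent average M&A premium, and the roughly $10 a share of opex cuts are the figures cited above; $54.20 is the actual per-share deal price. Illustrative only, not investment analysis.

```python
# Sanity check on the deal math discussed above (illustrative only).

deal_price = 54.20       # actual per-share deal price
fair_value = 35.00       # hosts' fair-value midpoint for the stock
typical_premium = 0.30   # average M&A premium cited in the discussion

# What a "normal" 30 percent premium deal would have priced at
normal_deal = fair_value * (1 + typical_premium)
print(f"price at a typical 30% premium: ${normal_deal:.2f}")    # $45.50

# Premium actually implied by the deal vs. the fair-value midpoint
implied_premium = deal_price / fair_value - 1
print(f"implied premium over fair value: {implied_premium:.0%}")

# Overpayment relative to a typical-premium deal
overpay = deal_price / normal_deal - 1
print(f"overpayment vs. typical deal: {overpay:.0%}")           # ~19%
```

On these inputs the deal comes out roughly 19 percent above a typical-premium price, which lines up with the "paid 20 percent more than he should have" remark, and the roughly $10 a share of opex cuts would about close that gap.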
00:27:58
all right well let's move on
00:28:01
a great conversation
00:28:03
between
00:28:04
Dave Friedberg uh about
00:28:07
the Twitter transaction and now we're
00:28:09
being rejoined by our besties
00:28:13
yeah how was your cappuccino Jayco that
00:28:16
was great I have a I have a nice cold
00:28:18
brew here a nice iced Cobra and a nice
00:28:20
drip coffee I'm working but I'd love to
00:28:23
talk about topics that I'm not uh being
00:28:24
subpoenaed or deposed about we will
00:28:27
have a lot to say in the coming weeks
00:28:28
I'd love to talk about topics that my
00:28:30
lawyers have advised me not to talk
00:28:31
about how eerie was our prediction 51
00:28:34
bucks a share it is exactly where the
00:28:36
stock is right now that's eerie
00:28:38
yeah uh all right lots of advances let's
00:28:41
keep going yeah speaking of Elon uh
00:28:43
Tesla AI day
00:28:45
was last week I actually went it was
00:28:47
great uh this is a recruiting event
00:28:49
where what did you do after Phil Hellmuth
00:28:52
and I went and uh I drove Phil
00:28:54
Hellmuth home uh the end uh no it's a great
00:28:57
event and
00:28:59
it uh is essentially a giant recruiting
00:29:01
event hundreds of AI sorry sorry I'm
00:29:04
sorry I'm sorry can we just talk about
00:29:06
Phil Hellmuth's non-sequitur in the group
00:29:08
chat about Ken Griffin
00:29:10
I mean oh yeah where he's just like I
00:29:14
made a joke about his net worth and what
00:29:16
he's going on what is going on we were
00:29:18
talking about the most serious of topics
00:29:20
and he just comes in seven seconds to Phil
00:29:22
it's what's going on seven seconds to
00:29:25
Phil by the way I uh I was uh texting
00:29:27
with uh Daniel Negreanu he did an
00:29:30
incredible podcast if you guys with Lex
00:29:32
Fridman if you haven't listened to it
00:29:34
the Daniel Negreanu uh pod with Lex is
00:29:37
incredible but I I was joking with
00:29:40
Daniel that there's a section where he's
00:29:41
talking about the greatest poker players
00:29:43
of all time and if you look in the bar
00:29:45
of YouTube it shows where the most
00:29:47
viewership was and it was exactly the 30
00:29:50
seconds he talks about Hellmuth and I
00:29:52
said to Daniel this must have been Phil
00:29:54
re-watching it over
00:30:05
so anyway the event was um uh super
00:30:08
impressive
00:30:10
Elon only spoke when he showed
00:30:12
Optimus the new robot he's building a
00:30:15
general purpose robot that will work in
00:30:17
the factories it's very early days but
00:30:20
they showed two versions of it and um he
00:30:22
said he thinks they could get it down to
00:30:24
twenty thousand dollars it's going to
00:30:25
work in the factory so it's actually got
00:30:26
a purpose
00:30:27
and obviously the factories already have a
00:30:29
ton of robots but this is more of a
00:30:31
robot that uh will benefit from the
00:30:34
computer vision
00:30:38
and the AI the narrow AI being pursued
00:30:40
by the self-driving team this is like
00:30:42
two and a half hours of really intense
00:30:43
presentations
00:30:45
um the most interesting part for me was
00:30:46
uh they're building their own super
00:30:48
computer
00:30:48
and their chips and the Dojo
00:30:52
supercomputer was really impressive
00:30:55
um at how much they can get through uh
00:30:58
scenarios so they're building every
00:31:00
scenario of every self-driving I
00:31:01
actually have the full self-driving beta
00:31:04
on my car I've been using it it's pretty
00:31:06
impressive I have to say
00:31:08
if you haven't used it yet I feel like
00:31:12
AI is moving at a pretty Advanced clip
00:31:15
the past year if you haven't also seen
00:31:17
meta announced a text to video generator
00:31:20
so this is even more impressive than
00:31:22
DALL-E DALL-E you put in a couple of words
00:31:24
Friedberg and you get a painting or
00:31:26
whatever
00:31:27
this is put in a couple of words and you
00:31:29
get a short video so they had one of a
00:31:31
teddy bear painting a teddy bear
00:31:33
so it looks like you're going to be able
00:31:35
to
00:31:37
essentially create a whole movie by just
00:31:39
talking to a computer
00:31:41
really impressive where do you think we
00:31:43
are Freeburg in terms of
00:31:46
the compounding nature of these narrow
00:31:48
AI efforts you know obviously we saw
00:31:50
poker chess go DALL-E GPT-3 self-driving
00:31:55
it feels like this is all compounding at
00:31:57
a faster rate
00:31:59
or am I just imagining that yeah
00:32:02
look I mean it's interesting when when
00:32:03
people saw the first computer playing
00:32:05
chess they said the same thing I think
00:32:07
any time that you see progress with a
00:32:09
computer that starts to mimic
00:32:11
the predictive capabilities of a human
00:32:14
it's impressive but I will
00:32:17
argue and I just I'll say a few words on
00:32:20
this it's I think this is part of a
00:32:22
60-year cycle that we've been going
00:32:24
through
00:32:25
um fundamentally what humans and human
00:32:28
brains do is we can sense our external
00:32:32
environment then we generate Knowledge
00:32:33
from that sensing and then our brains
00:32:36
build a model that predicts an outcome
00:32:38
and then that that predicted outcome is
00:32:40
what drives our actions and our Behavior
00:32:42
we observe the sunrise every morning
00:32:44
then we observe that it sets and you see
00:32:46
that enough times and you build a
00:32:47
predictive model from that data that's
00:32:49
been generated in your brain
00:32:51
that I predict that the sun has risen it
00:32:53
will therefore set it has set it will
00:32:55
therefore rise and I think that the
00:32:57
Computing approach is very similar it's
00:32:59
all about
00:33:00
sensing or generating data and then
00:33:02
creating a predictive model and then you
00:33:03
can drive action and initially the first
00:33:07
approach was just basic algorithms and
00:33:09
these are deterministic models that are
00:33:11
built it's a piece of code that says
00:33:13
here's an input here's an output and
00:33:15
that that model is really built by a
00:33:17
human and a human designer designed
00:33:18
that algorithmic model and said this is
00:33:20
what the uh the predictive potential of
00:33:24
the software is then there was this term
00:33:26
called data science so as data
00:33:28
generation began to proliferate meaning
00:33:30
there was far more sensors in the world
00:33:32
it was really cheap to to create Digital
00:33:34
Data from the physical world really
00:33:36
cheap to transmit it really cheap to
00:33:37
store it and really cheap to compute
00:33:39
with it data science became the hot term
00:33:41
in Silicon Valley for a while and these
00:33:43
models were not just a basic algorithm
00:33:45
written by a human but it became an
00:33:48
algorithm that was a similar
00:33:49
deterministic model
00:33:51
that had parameters and the parameters
00:33:54
were ultimately resolved by the data
00:33:56
that was being generated and so these
00:33:58
models became much more complex and much
00:34:00
more predictive finer granularities uh
00:34:02
finer range then we used this term
00:34:04
machine learning and in the data science
00:34:06
era it was still like hey there's a
00:34:09
model and you would solve it statically
00:34:12
you would get a bunch of data you would
00:34:13
statically solve for the parameters and
00:34:15
that would be your model and it would
00:34:16
run machine learning then allowed those
00:34:18
parameters to become Dynamic so the
00:34:21
model was static but generally speaking
00:34:23
the parameters
00:34:24
that drove the model became dynamic as
00:34:27
more data came into the system and they
00:34:29
were dynamically updated and then this
00:34:31
era of AI came and that's the new
00:34:33
catch word and what AI is realizing is
00:34:36
that there's so much data that rather
00:34:38
than just resolve the parameters of the
00:34:40
model you can actually resolve a model
00:34:42
itself the algorithm can be written by
00:34:45
the data the algorithm can be written by
00:34:46
the software and so with AI for
00:34:49
example poker you plug in an adaptive model
00:34:52
so you're playing poker and
00:34:55
the software begins to recognize
00:34:56
behavior and it builds a predictive
00:34:58
model that says here's how you're
00:34:59
playing and then over time it actually
00:35:01
changes not just the parameters of the
00:35:03
model but the model itself the algorithm
00:35:05
itself and so Ai and then it eventually
00:35:07
gets to a point where the algorithm is
00:35:09
so much more complex that a human would
00:35:11
have never written it and suddenly the
00:35:13
AI has built its own intelligence its
00:35:15
own ability to be predictive in a way
00:35:16
that a human algorithmic programmer
00:35:18
would have never done and and this is
00:35:21
all driven by statistics so none of this
00:35:23
is new science per se there's new
00:35:25
techniques that all fundamentally
00:35:28
use statistics as their basis and then
00:35:30
there's these techniques that allow us
00:35:31
to build these new systems of model
00:35:33
development like neural Nets and so on
00:35:35
and those statistics build those neural
00:35:37
Nets they they solve those parameters
00:35:38
and so on but fundamentally there is an
00:35:41
um geometric increase in data and a
00:35:44
geometric decline in the cost to
00:35:46
generate data from sensors because the
00:35:48
cost of sensors is coming down with
00:35:49
Moore's Law transmit that data because
00:35:52
the cost of moving data has come down
00:35:53
with Broadband Communications the cost
00:35:55
of storing data because the cost of DRAM
00:35:57
and solid-state hard drives has come
00:36:00
down with Moore's Law and now the
00:36:02
ability to actually have enough data to
00:36:04
do this AI-driven work what people are
00:36:06
calling AI but it really is the same
00:36:07
it's part of the spectrum of things that
00:36:09
have been going on for 60 years to
00:36:11
actually drive predictions in the in the
00:36:13
world is really being realized in a
00:36:15
bunch of areas that we would have
00:36:16
historically been really challenged and
00:36:18
surprised to see and so my argument is
00:36:21
at this point Big Data played a big role
00:36:23
yeah yeah we've over the last decade
00:36:25
we've reached this Tipping Point in
00:36:27
terms of data generation storage and
00:36:28
computation that's allowed these
00:36:30
statistical models to resolve
00:36:32
dynamically and as a result they are far
00:36:35
more predictive and as a result we see
00:36:37
far more human-like behavior in the
00:36:39
predictive systems both physical ones
00:36:41
those that are you know like a
00:36:43
robot is the same as one that existed 20
00:36:45
years ago but the way that it's run is
00:36:47
using the software that is driven by
00:36:49
this dynamic model and that data drives
00:36:52
a better answer
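Friedberg's progression (a hand-written rule, then a fixed model whose parameters are solved statically from data, then parameters updated dynamically as data streams in) can be sketched in a few lines of Python. The data and numbers below are purely illustrative:

```python
# A toy sketch of the progression described above, using a stream of
# (input, output) observations. All names and data are illustrative.

data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.8)]  # noisy y ~ 2x

# 1) Classic algorithm: a human hard-codes the rule itself.
def algorithm(x):
    return 2 * x  # the "model" is fixed by the programmer

# 2) Data science: the model form (y = w*x) is fixed by a human,
#    but the parameter w is solved once, statically, from the data
#    (least-squares fit through the origin).
w_static = sum(x * y for x, y in data) / sum(x * x for x, y in data)

# 3) Machine learning: same model form, but the parameter is updated
#    dynamically as each new observation streams in (online gradient step).
w_dynamic, lr = 0.0, 0.05
for x, y in data * 50:                 # replay the stream a few times
    w_dynamic += lr * (y - w_dynamic * x) * x

print(f"static fit:  w = {w_static:.2f}")
print(f"dynamic fit: w = {w_dynamic:.2f}")
# Both recover roughly w ~ 2; the "AI" step Friedberg describes goes
# further and learns the model structure itself, not just w.
```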
00:36:55
okay I have two things to say but one
00:36:57
the first one is a total non-sequitur so
00:36:59
you used the term data scientist do you know
00:37:01
where the term data scientist came from
00:37:04
as classically used in Silicon Valley it
00:37:07
came from Facebook and it came from my
00:37:09
team in a critical moment this was in
00:37:11
2007 or 2008 I was trying
00:37:13
to build the growth team
00:37:15
this is the team that became very famous
00:37:17
for getting 2 billion users and you know
00:37:20
building a lot of these algorithmic
00:37:21
insights and I was trying to recruit a
00:37:25
person from Google and he was like a PhD
00:37:28
in some crazy thing like astrophysics or
00:37:30
particle physics or something and we
00:37:33
gave him an offer
00:37:34
as a data analyst because this is what
00:37:36
I needed at the time this is what I
00:37:37
thought I needed an analyst you know to
00:37:39
analyze data and he said absolutely not
00:37:43
I'm offended by the job title and I
00:37:45
remember talking to my HR you know
00:37:47
business process partner and I asked her
00:37:49
like I don't understand what is this
00:37:51
where is this coming from and she said
00:37:52
he Fashions himself a scientist and I
00:37:55
said well then call him a data scientist
00:37:57
so we wrote in the offer for the first
00:37:59
time data scientist and at the time
00:38:01
people internally were like this is a
00:38:04
dumb title what does this mean anyways
00:38:06
we hired the guy he was a star and uh
00:38:09
and that title just took off internally
00:38:11
it's funny because in parallel we
00:38:13
started climate Corp in 2006 and the
00:38:15
original the first guy I hired was a
00:38:17
buddy of mine who was a 4.0 in
00:38:18
applied math from Cal and then everyone
00:38:20
we hired on with him we called them the
00:38:23
math team and they were all applied math
00:38:25
and statistics PhDs and we called them
00:38:27
the math team and it was really cool to
00:38:29
be part of the math team but then we
00:38:31
switched the team name to data scientist
00:38:33
and then it obviously created this much
00:38:35
more kind of impressive role impressive
00:38:38
title Central function to the
00:38:39
organization that was more than just a
00:38:42
math person or data analyst as I think
00:38:44
it may have been classically treated
00:38:45
because they really were building the
00:38:47
algorithms that drove the models that
00:38:49
made the product work right Peter Thiel
00:38:51
has a very funny observation not funny
00:38:53
but you know observation which is you
00:38:55
should always be wary of any science
00:38:57
that actually has science in the name
00:38:59
political science social science I guess
00:39:02
maybe data science you know because
00:39:04
the real Sciences don't need to qualify
00:39:05
themselves physics chemistry biology
00:39:07
anyways so here's what I wanted
00:39:09
to talk about with respect to AI two
00:39:12
very important observations that I think
00:39:14
is useful for people to know the first
00:39:17
one Nick if you throw it up here is just
00:39:18
a baselining of you know when we have
00:39:21
thought about intelligence and compute
00:39:23
capability we've always talked about
00:39:24
Moore's Law and Moore's Law essentially
00:39:27
this idea that there is a fixed amount
00:39:28
of time where the density of transistors
00:39:32
inside of a chip would double and
00:39:34
roughly that period for many many years
00:39:36
was around two years and it was largely
00:39:38
led by Intel and we used to equate this
00:39:41
to intelligence meaning the more density
00:39:43
there was in a chip the more things
00:39:46
could be learned and understood and we
00:39:49
used to think about that as the
00:39:50
progression of how Computing
00:39:53
intelligence would grow and eventually
00:39:55
Ai and artificial intelligence would
00:39:57
would get to mass Market
00:39:59
well what we are now at is a place where
00:40:02
many people have said Moore's law has
00:40:04
broken why it's because we cannot cram
00:40:08
any more transistors into a fixed amount
00:40:10
of area we are at the boundaries of
00:40:12
physics
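The doubling cadence described here is easy to put numbers on. A rough sketch, where the two-year period comes from the discussion and the 1971 Intel 4004 baseline of roughly 2,250 transistors is a commonly cited reference point used purely for illustration:

```python
# Quick arithmetic on the doubling cadence described above.
# The ~2-year doubling period comes from the discussion; the 1971/2,250
# baseline (Intel 4004) is a commonly cited reference point, used here
# only for illustration.

base_year, base_transistors = 1971, 2_250
doubling_period_years = 2.0

def moores_law_estimate(year):
    """Transistor count implied by doubling every two years."""
    doublings = (year - base_year) / doubling_period_years
    return base_transistors * 2 ** doublings

print(f"{moores_law_estimate(2021):,.0f}")
```

Fifty years of two-year doublings is 25 doublings, roughly a 33-million-fold increase in density, which is why the gains eventually run into the physical limits mentioned here.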
00:40:14
and so people think well does that mean
00:40:16
that our ability to compute
00:40:18
will essentially come to an end and stop
00:40:20
and the answer is no and that's what's
00:40:22
demonstrated
00:40:23
on this next chart just to make it
00:40:25
simple which is that what you really see
00:40:28
is that if you think about you know
00:40:30
super computing power so the ability to
00:40:33
get to an answer that has actually
00:40:36
continued unabated and if you look at
00:40:39
this chart the reason why this is
00:40:41
possible is entirely because we've
00:40:43
shifted from CPUs to these things called
00:40:46
gpus so you may have heard companies
00:40:48
like Nvidia why have companies like Nvidia
00:40:50
done so well it's because they
00:40:53
raised their hand and said we can take
00:40:54
on the work
00:40:55
and by taking on the work away from a
00:40:58
traditional CPU you were able to do a
00:41:00
lot of what Freeburg said is get into
00:41:02
these very complicated models so this is
00:41:05
just an observation that I think that we
00:41:08
are continuing to compound knowledge and
00:41:11
intelligence effectively at the same
00:41:14
rate as Moore's Law and we will continue
00:41:17
to be able to do that because this makes
00:41:20
it a problem of power and a problem of
00:41:23
money so as long as you can buy enough
00:41:26
gpus from Nvidia or build your own
00:41:29
and as long as you can get access to
00:41:31
enough power to run those computers
00:41:34
there really isn't many problems you
00:41:36
can't solve and that's what's so
00:41:38
fascinating and interesting and this is
00:41:40
what companies like open AI
00:41:42
are really proving you know when they
00:41:43
raise the billion dollars what they did
00:41:46
was they attacked this problem because
00:41:48
they realize that by Shifting the
00:41:49
problem to gpus it left all these
00:41:52
amazing opportunities for them to
00:41:55
uncover and that's effectively what they
00:41:56
have the second thing that I'll say very
00:41:59
quickly is that
00:42:00
it's been really hard
00:42:02
for us as a society
00:42:04
to build intelligence
00:42:06
in a multi-modal way like our brain
00:42:09
works so think about how our brain works
00:42:10
our brain works in a multimodal way we
00:42:13
can process imagery we can process words
00:42:16
and sounds we can process all of these
00:42:19
different modes text
00:42:22
into one system and then Intuit some
00:42:26
intelligence from it and make a decision
00:42:27
right so you know we could be watching
00:42:30
this YouTube video there's going to be
00:42:31
transcription there's video voice audio
00:42:33
everything all at once
00:42:35
and we are moving to a place very
00:42:37
quickly where computers will have that
00:42:40
same ability as well today we go to very
00:42:42
specific models and kind of balkanized
00:42:45
silos correct to solve different kinds
00:42:47
of problems but those are now quickly
00:42:49
merging again because of what I just
00:42:51
said about gpus
00:42:52
so I think what's really important about
00:42:55
AI for everybody to understand is the
00:42:58
marginal cost of intelligence is going
00:43:00
to go to zero
00:43:01
and this is where I'm just going to put
00:43:03
out another prediction of my own when
00:43:06
that happens
00:43:07
it's going to be incredibly important
00:43:10
for humans to differentiate themselves
00:43:12
from computers and I think the best way
00:43:14
for humans to differentiate ourselves is
00:43:17
to be more human it's to be less compute
00:43:20
intensive it's to be more empathetic
00:43:23
it's to be more emotional
00:43:25
because those differentiators
00:43:27
are very difficult for Brute Force
00:43:29
compute to solve be careful the
00:43:31
replicants on this call are getting a
00:43:33
little nervous here they're not
00:43:35
processing that that was an emotional
00:43:36
statement do not want to process that
00:43:38
one well I to your point uh during this
00:43:41
AI day
00:43:42
um they were showing in self-driving as
00:43:45
you're talking about this balkanization
00:43:46
and trying to
00:43:48
um make decisions across many different
00:43:50
uh decision trees you know they're
00:43:52
looking at Lane changes they're looking
00:43:54
at other cars and pedestrians they're
00:43:56
looking at road conditions like fog and
00:43:58
rain and then they're using all this big
00:44:01
data to your point Freeburg to run uh
00:44:04
tons of different simulations so they're
00:44:05
building like this virtual uh World on
00:44:09
Market Street and then they will throw
00:44:11
people dogs cars and whatnot into
00:44:15
the simulation it's such a wonderful
00:44:16
example imagine that system hears a horn
00:44:20
yeah well you hear a horn so clearly
00:44:23
there's some auditory expression of risk
00:44:26
right there's something risky
00:44:28
and now you have to scan your visual
00:44:30
field you have to probabilistically
00:44:32
decide what it could be what the evasive
00:44:35
maneuver if anything should be so that's
00:44:38
a multimodal set of intelligence that
00:44:41
today isn't really available yeah but we
00:44:44
have to get there if we're going to have
00:44:45
real full self-driving so that's a
00:44:47
perfect example Jason a real world
00:44:48
example of how hard the problem is but
00:44:50
it'll get solved because we can brute
00:44:52
force it now with with chips and with
00:44:54
compute I think that's going to be the
00:44:55
very interesting thing with the robots
00:44:57
as well is all of these decisions
00:44:59
they're making moving cars through roads
00:45:03
all of a sudden we're going to see that
00:45:04
with VTOLs vertical takeoff and
00:45:06
landing
00:45:08
you know aircraft and we're going to see
00:45:10
it with this General robot and everybody
00:45:12
wanted to ask Elon about General AI you
00:45:14
know the Terminator kind of stuff
00:45:16
and his position is I think if we solve
00:45:19
enough of these problems Friedberg it'll
00:45:20
be an emergent
00:45:22
Behavior or an emergent phenomenon I
00:45:25
guess would be a better word based on
00:45:27
each of these citadels crumbling you know
00:45:29
each of these tasks getting solved by
00:45:31
groups of people you have any thoughts
00:45:33
as we wrap up here on the discussion
00:45:34
about a general Ai and the timeline for
00:45:37
that because obviously we're going to
00:45:38
solve every vertical AI problem in short
00:45:40
order I spoke about this a little bit on
00:45:43
um the AMA on Callin on Tuesday
00:45:46
night um once sax gets it out you can
00:45:49
listen to it but I really have this
00:45:51
strong belief that the servers crash when
00:45:55
this episode drops okay yeah you guys
00:45:59
can try to download the app but it might
00:46:00
crash so just be careful
00:46:02
of course you were 10
00:46:06
times more popular than J Cal so it's
00:46:08
unexpected levels of traffic
00:46:10
well you did have an account
00:46:11
with 11,000 followers I mean it might
00:46:14
look like you're right we'll put you on
00:46:16
that account next time yeah please yeah
00:46:18
I'm starting from zero yeah that's fair
00:46:21
that's fair yeah look my core thesis is
00:46:23
I think humans transition from being
00:46:26
um let's call it you know passive in
00:46:29
this system on Earth to being laborers
00:46:32
and then we transition from being
00:46:34
laborers to being creators and I think
00:46:36
our next transition with AI is to
00:46:38
transition from being creators to being
00:46:40
narrators and what I mean by that is
00:46:42
as we started to do work on Earth and
00:46:45
engineer the world around us we did
00:46:47
labor to do that we literally plowed the
00:46:49
fields we walked distances we built
00:46:52
things and over time we built machines
00:46:55
that automated a lot of that labor
00:46:57
you know everything from a plow to a
00:47:00
tractor to a caterpillar equipment to a
00:47:03
microwave that cooks for us labor became
00:47:05
less we became less dependent on our
00:47:07
labor abilities and then we got to
00:47:09
switch our time and spend our time as
00:47:10
creators as knowledge workers and a vast
00:47:13
majority of the developed world now
00:47:15
primarily spends their time as knowledge
00:47:18
workers creating and we create stuff on
00:47:20
computers we do stuff on computers but
00:47:22
we're not doing physical labor anymore
00:47:24
as a lot of the knowledge work gets
00:47:26
supplanted by AI as it's being termed
00:47:29
now but really gets supplanted by
00:47:31
software the role of the human I think
00:47:33
transitions to being one of a narrator
00:47:35
where instead of having to create the
00:47:37
the blueprint for a house you narrate
00:47:40
the house you want and the software
00:47:42
creates the blueprint for you dictate
00:47:43
yeah and instead of
00:47:45
creating the movie and not spending 100
00:47:47
million dollars producing a movie you
00:47:49
dictate or you narrate the movie you
00:47:51
want to see and you iterate with the
00:47:52
computer and the computer renders the
00:47:54
entire film for you because those films
00:47:56
are shown digitally anyway so you can
00:47:57
have a computer render it instead of
00:47:59
creating a new piece of content you
00:48:03
narrate the content you want to
00:48:04
experience you create your own video
00:48:06
game you create your own Movie
00:48:07
experience and I think that there's a
00:48:09
whole Evolution that happens and if you
00:48:10
look at Steven Pinker's book Enlightenment
00:48:12
now has a great statistic a set of
00:48:14
statistics on this but the amount of
00:48:16
time that humans are spending on leisure
00:48:18
activities per week has climbed
00:48:20
extraordinarily over the past couple of
00:48:22
decades we spend more time enjoying
00:48:24
ourselves and exploring our creative
00:48:25
interests than we ever did in the in the
00:48:28
past in human history because we were
00:48:29
burdened by all the labor and all the
00:48:31
creative and knowledge work we had to do
00:48:32
and now things are much more accessible
00:48:35
to us and I think that AI allows us to
00:48:37
transition into an era that we never
00:48:39
really thought possible or realized
00:48:40
where the limits are really our
00:48:42
imagination of what we can do with the
00:48:44
world around us and the software
00:48:45
and automation
00:48:48
resolves to make those things possible
00:48:50
and that's making a really exciting kind
00:48:51
of vision for the future that I think AI
00:48:53
enables Star Trek had this right people
00:48:55
didn't have to work and they could
00:48:56
pursue things in the Holodeck or
00:48:58
whatever that they felt was rewarding to
00:49:01
them but speaking of jobs uh the job
00:49:04
reports for August came in we talked
00:49:06
about this we were trimming 300,000 jobs
00:49:09
a month and we're wondering if the other
00:49:11
shoe would drop but boy did it drop
00:49:13
over a million jobs burned off in August
00:49:16
so without getting into the macro talk
00:49:18
it does feel like what the FED is doing
00:49:20
and companies
00:49:22
doing hiring freezes and cuts is finally
00:49:25
finally having an impact if we start
00:49:26
losing a million as we predicted could
00:49:28
happen here on the show
00:49:31
people might actually go back to work
00:49:33
and Lyft and Uber are reporting that the
00:49:36
driver shortages are over they no longer
00:49:38
have to pay people spiffs and stuff like
00:49:40
that to get people to come back to work
00:49:41
so at least here in America feels like
00:49:43
we're turning a corner do we want to go
00:49:45
can we let's talk about the marijuana
00:49:49
yeah yeah I was about to say we got a
00:49:51
couple of things we really want to get
00:49:52
to here uh Ukraine section 230 and then
00:49:55
this breaking news uh we'll pull it up
00:49:57
here on the screen
00:49:58
while we're recording the show President
00:50:00
Biden says and I'm just going to quote
00:50:02
here
00:50:04
first
00:50:06
I'm pardoning all prior Federal offenses
00:50:08
of simple marijuana possession there are
00:50:10
thousands of people who were
00:50:12
previously convicted of simple
00:50:13
possession who may be denied employment
00:50:15
housing or educational opportunities
00:50:16
my pardon will remove this
00:50:18
burden that's big news second I'm
00:50:20
calling on Governors
00:50:21
to pardon simple state marijuana
00:50:24
possession offenses just as no one
00:50:26
should be in a federal prison solely for
00:50:28
possessing marijuana nobody should be in
00:50:29
a local jail or state prison for that
00:50:31
reason either finally this is happening
00:50:33
third and this is an important one we
00:50:36
classify marijuana at the same level
00:50:38
as heroin and even more serious than
00:50:40
fentanyl makes no sense I'm asking
00:50:43
Secretary Becerra and the Attorney
00:50:46
General to initiate the process of
00:50:48
reviewing how marijuana is scheduled
00:50:49
under federal law
00:50:51
I'd also like to note that as federal
00:50:53
and state regulations change we still
00:50:56
need important limitations on
00:50:57
trafficking marketing and underage
00:50:58
sales of marijuana
00:51:00
thoughts on this breaking news is this
00:51:03
giving the timing on this is kind of
00:51:06
midterm
00:51:07
related it just seems is this is this I
00:51:10
guess is this a politically popular
00:51:11
decision to do
00:51:13
I think so I mean look I support it so
00:51:16
Biden finally did something I like
00:51:18
great I mean I thought that we should
00:51:21
decriminalize marijuana for a long time
00:51:24
or specifically you know I agree with
00:51:27
this idea of descheduling it it does not
00:51:29
make sense to treat marijuana the same
00:51:31
as heroin as a schedule one narcotic
00:51:34
just doesn't make any sense it should be
00:51:35
regulated separately and differently
00:51:38
obviously you want to keep it out of the
00:51:40
hands of minors but no one should be
00:51:42
going to jail I think for simple
00:51:43
possession so I do agree with this and I
00:51:45
think the thing they need to do I don't
00:51:48
see it mentioned here is they should
00:51:49
pass a federal law that would allow for
00:51:53
the normalization of let's call it legal
00:51:57
legal
00:51:58
um you know cannabis companies so so
00:52:01
companies that are allowed to operate
00:52:03
under state laws like in California
00:52:06
should have access to the banking system
00:52:08
should have access to payment rails
00:52:09
because right now the reason why the
00:52:12
legal cannabis industry isn't working at
00:52:14
all in California is because they can't
00:52:16
bank they can't take payments so so it's
00:52:19
this weird all-cash business that makes
00:52:20
no sense so so listen if we're not going
00:52:23
to criminalize it as a drug like heroin
00:52:25
if we're going to allow states to make
00:52:28
it legal then allow it to be a more
00:52:31
normal business where the state can tax
00:52:33
it and it can operate in a more above
00:52:36
board way
00:52:38
is what you're saying the feds should legalize it
00:52:40
I think it could still be regulated on a
00:52:42
state-by-state basis but I think you
00:52:44
need the feds to bless the idea that
00:52:47
Banks and payment companies can take on
00:52:49
those clients which states have already
00:52:52
said are legally operating companies and
00:52:54
right now they can't and it's a huge gap
00:52:57
in the laws so maybe that's the one
00:52:58
thing I would add to this but I don't
00:53:00
have any complaints about this right now
00:53:01
based on what we know from this tweet
00:53:03
storm and I would say this by the way
00:53:06
was an about face this was an about
00:53:07
face by Biden yeah do you know what the
00:53:09
polling data says I mean is is there I'm
00:53:12
assuming there's big support in kind of
00:53:14
the independents and the middle
00:53:16
uh for this it was 70 percent at one point yeah
00:53:19
yeah
00:53:20
so so look this to me this is the kind
00:53:22
of thing that Biden should be doing with
00:53:24
the 50 50 Senate finding these sorts of
00:53:26
bipartisan compromises right
00:53:29
so yeah I look this is good news for us
00:53:31
why hasn't this happened in the past
00:53:33
like what's been the political reason
00:53:35
that other presidents Obama even
00:53:40
didn't who had a
00:53:42
similar ideology like why why but why
00:53:44
does anyone know why this hasn't been
00:53:45
done in the past there was rumors he was
00:53:47
going to do in the second term
00:53:49
they just didn't have the political
00:53:50
Capital why to do it why I don't I don't
00:53:53
well pardons don't require yeah
00:53:55
the pardon doesn't require political
00:53:57
capital I think it's probably the
00:53:58
perception that this is soft on crime in
00:54:01
some way or there wasn't enough
00:54:03
broad-based support as David said I mean
00:54:05
I think the the United States population
00:54:07
has moved pretty meaningfully in the
00:54:09
last 20 years look at the chart here
00:54:12
um you know we were talking about 2000
00:54:14
it was only 31 percent uh and then you
00:54:17
look at 2018 it's up at 60 plus percent
00:54:20
so when people saw the states doing it
00:54:22
and they saw absolutely no problem you
00:54:25
know in every state and I think what
00:54:26
people will see next I think that's a
00:54:28
Gallup poll that's a Gallup poll
00:54:29
yeah yeah so I mean it's increased
00:54:31
dramatically MDMA psilocybin and some of
00:54:34
these other plant-based medicines
00:54:35
Ayahuasca are next and they're doing
00:54:37
studies on them now I don't want to take
00:54:39
away from how important this is for uh
00:54:41
all the people whom this will
00:54:43
positively impact I just want to talk
00:54:45
about the schedule change for marijuana
00:54:47
as a parent one of the things that I'm
00:54:50
really really concerned about is that
00:54:52
through this process of legalization
00:54:55
getting access to marijuana has frankly
00:54:57
become too easy particularly for kids
00:55:00
at the same time I saw a lot of really
00:55:03
alarming evidence that the the intensity
00:55:06
of these marijuana-based products have
00:55:09
gone you know I think it's like five or
00:55:11
six times more intense than I don't know
00:55:13
50 or 100 tomorrow much higher right so
00:55:16
so it's no longer you know this kind of
00:55:19
like you know Do no harm drug that it
00:55:22
was 20 years ago this is this could be
00:55:24
actually David the way that it's
00:55:27
productized today as bad as some of
00:55:30
these other
00:55:31
you know uh narcotics so
00:55:34
in June of this year
00:55:36
the Biden Administration basically made
00:55:38
this press release that said the FDA is
00:55:41
going to come out with regulations that
00:55:43
would cap the amount of nicotine in
00:55:45
cigarettes and I think that was a really
00:55:47
smart move because it basically set the
00:55:49
stage to taper nicotine out of uh out of
00:55:53
cigarettes which would essentially you
00:55:55
know decapitate it as a an addictive
00:55:58
product and I think by
00:56:01
thinking about how it's how it's dealt
00:56:03
with what I really hope the
00:56:05
administration does is it empowers the
00:56:08
FDA if you're going to legalize it you
00:56:11
need to have expectations around what
00:56:14
the intensity of these drugs are because
00:56:16
if you're delivering drugs OTC and now
00:56:18
any kid can go in at 18 years old and
00:56:20
buy them which means that 18 year olds
00:56:23
are going to buy them for 16 year olds
00:56:24
16 year olds are going to get fake IDs
00:56:26
to buy them for themselves you need to
00:56:28
do a better job so the parents you're
00:56:30
helping parents do our job here's what
00:56:32
you need
00:56:33
one like alcohol
00:56:36
if alcohol is 21 then of course yeah
00:56:39
fine but even alcohol David you know
00:56:40
that there are there are we know what
00:56:42
the intensity of these are their labels
00:56:43
and there's warnings and you know the
00:56:45
difference between they're they're
00:56:46
getting wine versus hard alcohol but let
00:56:49
me just give you some statistics here
00:56:50
chamath if you think about the
00:56:52
Cannabis in the 90s uh and prior to that
00:56:55
uh there have been a ton
00:56:57
of studies on this in Colorado it was uh
00:56:59
the THC content was less than two
00:57:01
percent
00:57:02
and then in 2017
00:57:04
we were talking about you know things
00:57:06
going up to uh 17 to 28 percent for a specific
00:57:11
strain so they have been building
00:57:12
strains like Girl Scout cookies Etc that
00:57:15
have just increased and increased and
00:57:16
then there are things like shatter and
00:57:18
obviously Edibles you can create
00:57:20
whatever intensity you want so you have
00:57:22
this incredible there you know
00:57:24
variation you could have an edible
00:57:26
that's you know got one milligram of THC
00:57:29
one that has a hundred or you could have
00:57:31
a pack of Edibles and you see this
00:57:33
happen in the news all the time some kid
00:57:34
gets their parents pack or somebody
00:57:36
gives one and the kids don't know um and
00:57:38
this dabbing phenomenon combined with
00:57:40
dabbing is like the shatter like this
00:57:42
really intense uh stuff
00:57:45
combined with the Edibles is really the
00:57:46
issue and the labeling of them so you
00:57:49
got to be incredibly careful with this
00:57:50
it's not good for kids it screws up
00:57:52
their brains and so
00:57:53
yeah be very careful I have a I have a
00:57:56
zero tolerance policy on this stuff I
00:57:57
don't care if it's legal illegal like I
00:57:59
don't want my kids touching any of this
00:58:00
stuff until it's not for kids obviously
00:58:02
yeah but we also should not be over
00:58:03
there until they're 35 or 40 and even
00:58:05
then I hope they never do it but but I
00:58:07
need some help
00:58:09
and I'm not sure I'm the only parent
00:58:10
that's asking you can't have this stuff
00:58:12
be available effectively sold like in a
00:58:14
convenience store no no that's not going
00:58:17
to happen where there isn't even
00:58:18
labeling at least like cigarettes are
00:58:19
labeled it's very clear how bad this
00:58:22
stuff is for you oh do you guys have any
00:58:23
feedback on the job report or anything
00:58:25
they're all going away when when the
00:58:26
when the AI wins
00:58:28
well that's why I brought it up it's
00:58:30
like we're now going to see a potential
00:58:32
you know a situation where Jobs go away
00:58:35
and a lot of the stuff like even
00:58:36
developers right don't you think
00:58:37
Friedberg developers are going to start
00:58:39
development tasks
00:58:42
everyone assumes a static lump of work I
00:58:45
think what happens particularly in
00:58:47
things like developer tools is the
00:58:49
developer can do so much more and then
00:58:52
we generate so much more output and so
00:58:54
the overall productivity goes up not
00:58:55
down
00:58:56
um so it's pretty exciting as these and
00:58:59
remember like like we were talking on
00:59:01
the AMA the other night Adobe Photoshop
00:59:03
was a tool for photographers so you
00:59:05
didn't have to take the perfect
00:59:06
photograph and then print it you could
00:59:08
you know you could use the software to
00:59:10
improve the quality of your photograph
00:59:11
and I think that that's what we see
00:59:13
happening with all software
00:59:15
um in the creative process is it helps
00:59:17
people do more than they realized they
00:59:18
could do before and that's pretty
00:59:20
powerful and it opens up all these new
00:59:21
avenues of interest and things we're not
00:59:23
even imagining today all right so scotus
00:59:25
is going to hear two uh cases for
00:59:27
Section 230
00:59:28
the family of Nohemi Gonzalez a 23 year
00:59:31
old American college student who was
00:59:33
killed in an Isis terrorist attack in
00:59:35
Paris back in 2015. you remember those
00:59:37
horrible attacks is claiming that
00:59:39
YouTube helped and aided and abetted
00:59:41
Isis uh the family's argument is
00:59:43
YouTube's algorithm was recommending
00:59:46
videos which makes it a
00:59:48
publisher of content as you know it's
00:59:49
section 230 common carrier now if you
00:59:52
make editorial decisions if you promote
00:59:53
certain content you lose your 230
00:59:55
protections uh in court papers filed in
00:59:58
2016 they said the company quote
01:00:00
knowingly permitted Isis to post on
01:00:02
YouTube hundreds of radicalizing videos
01:00:04
inciting violence which helped the group
01:00:06
recruit including some who were actually
01:00:09
involved in the terrorist attack so they
01:00:10
have made that connection well look
01:00:12
let's let's be honest we can we can we
01:00:13
can put a pin in this thing because I
01:00:15
think it would be shocking to me if this
01:00:18
current scotus
01:00:19
uh all of a sudden found it in the
01:00:22
cockles of their heart to protect big
01:00:23
Tech I mean they've dismantled
01:00:27
um a lot of other stuff that I think is
01:00:31
a lot more controversial than this
01:00:34
um and so you know we've we've basically
01:00:37
looked at gun laws we've looked at
01:00:39
affirmative action we've looked at
01:00:41
abortion rights sorry well I mean I
01:00:44
think as as we've said I think we all
01:00:46
know where that die is unfortunately
01:00:48
going to get cast
01:00:49
um so to me it just seems like this
01:00:52
could be an interesting case where it's
01:00:53
actually nine zero
01:00:55
in favor for complete for completely
01:00:58
different sets of reasons I mean if you
01:00:59
think of the liberal left part of the
01:01:01
Court they have their own reasons for
01:01:03
saying that there aren't 230 protections
01:01:04
for big Tech and if you look at the the
01:01:06
far right or the right-leaning Parts
01:01:09
members of this of scotus they have they
01:01:11
have another set of different reasons do you
01:01:12
think they're gonna make a political
01:01:13
decision not a legal one no but even in
01:01:15
their politics they actually end up in
01:01:17
the same place they shouldn't
01:01:19
have protections but for different
01:01:20
reasons
01:01:21
so there there's a reasonable outcome
01:01:23
here where you know uh Roberts is going
01:01:26
to have a really interesting time trying
01:01:27
to pick who writes the majority opinion
01:01:28
there was a related case in the fifth
01:01:30
circuit in Texas where do you guys see
01:01:33
this fifth circuit decision where
01:01:36
Texas passed a law imposing common
01:01:39
carrier restrictions on social media
01:01:41
companies the idea being that social
01:01:44
media companies need to operate like
01:01:45
phone companies and they can't just
01:01:47
arbitrarily deny you service or deny you
01:01:49
access to the platform and the argument
01:01:53
why previously that had been viewed
01:01:56
actually is unconstitutional was this
01:01:58
idea of compelled speech that you can't
01:02:00
compel a corporation to support speech
01:02:03
that they don't want to because that was
01:02:04
a violation of their own First Amendment
01:02:06
rights and what the fifth
01:02:09
circuit said is no that doesn't make any
01:02:10
sense Facebook or Twitter can still
01:02:12
advocate for whatever speech they want
01:02:14
as a corporation but as a platform they
01:02:17
if Texas requires them to not
01:02:20
discriminate against people on the basis
01:02:22
of their Viewpoint then Texas has the
01:02:24
right to to impose that because that
01:02:26
doesn't it their quote was that does not
01:02:28
chill speech if anything it chills
01:02:30
censorship so what's the right legal
01:02:32
decision here in your mind putting aside
01:02:34
politics if you can for a moment put on
01:02:37
your legal hat what is the right thing
01:02:38
for society what is the right legal
01:02:40
issue around section 230 specifically in
01:02:43
the YouTube case and just generally
01:02:45
should we look at YouTube should we look
01:02:47
at a blogging platform like medium or
01:02:49
blogger Twitter should we look at those
01:02:52
as common carrier
01:02:54
and they're not responsible for what you
01:02:56
publish on them obviously they have to
01:02:58
take stuff down if it breaks their terms
01:02:59
of service Etc or if it's illegal I've
01:03:01
made the case before that I I do think
01:03:03
that common carrier requirements should
01:03:05
apply on some level of the stack to
01:03:07
protect the rights of ordinary Americans
01:03:10
to have their speech in the face of
01:03:12
these giant monopolies which could
01:03:14
otherwise de-platform them for arbitrary
01:03:16
reasons just to you know just explain
01:03:18
this a little bit so
01:03:20
historically there was always a debate
01:03:22
between so-called positive rights and
01:03:25
negative rights so where the United
01:03:28
States start off as a country was with
01:03:30
this idea of negative rights that what a
01:03:32
right meant is that you'd be protected
01:03:34
from the government taking some action
01:03:36
against you and if you look at the Bill
01:03:37
of Rights you know the original rights
01:03:39
are all about protecting the citizen
01:03:42
against intrusion on their Liberty by by
01:03:45
a state or by the federal government in
01:03:46
other words Congress shall make no law
01:03:48
it was always a restriction so the right
01:03:50
was negative it wasn't sort of
01:03:52
positively enforced and then with the
01:03:54
Progressive Era you started seeing you
01:03:56
know more uh Progressive rights like for
01:03:59
example uh American citizens should have
01:04:01
the right to health care right that's
01:04:03
not protecting you from the government
01:04:04
that's saying that the government can be
01:04:06
used to give you a right that you didn't
01:04:09
otherwise have and so that was sort of
01:04:11
the big Progressive Revolution my take
01:04:13
on it is I actually think that the
01:04:15
problem we have in our society right now
01:04:17
is that free speech is only a negative
01:04:19
right it's not a positive right I think
01:04:21
it actually needs to be a positive right
01:04:23
I'm embracing a more Progressive version
01:04:26
of Rights but on behalf of sort of this
01:04:29
original negative right so and the
01:04:31
reason is because the Town Square got
01:04:33
privatized right I mean you used to be
01:04:35
able to go anywhere in this country
01:04:36
there'd be a multiplicity of town
01:04:38
squares anyone could pull out their
01:04:39
soapbox draw a crowd they could listen
01:04:41
that's not how speech occurs anymore
01:04:43
it's not on public land or public spaces
01:04:46
the way that speech political speech
01:04:48
especially occurs today is in these
01:04:50
giant social networks that are that have
01:04:52
giant Network effects and are basically
01:04:54
monopolies so if you don't protect the
01:04:57
right to free speech in a positive way
01:04:58
it no longer exists so you not only
01:05:01
believe that YouTube should keep its
01:05:04
section 230 you believe YouTube
01:05:06
shouldn't be able to de-platform as a
01:05:09
private company
01:05:10
you know uh Alex Jones as but one
01:05:12
example they should have their free
01:05:14
speech rights and we should lean on that
01:05:16
side of forcing YouTube to put Alex
01:05:18
Jones or Twitter to put Trump back on
01:05:21
the platform is that your position
01:05:23
I'm not saying that the constitution
01:05:25
requires YouTube to do anything uh what
01:05:28
I'm saying is that if a state like Texas
01:05:31
or if the federal government wants to
01:05:33
pass a law saying that YouTube If you
01:05:37
are say of a certain size you're a
01:05:38
social network of a certain size you
01:05:39
have Monopoly Network effects I wouldn't
01:05:41
necessarily apply this to all the little
01:05:43
guys but for those big monopolies we
01:05:45
know who they are if the if the federal
01:05:48
government or state wanted to say that
01:05:50
they are required to be a common carrier
01:05:53
and they cannot discriminate against
01:05:54
certain viewpoints I think the
01:05:56
government should be allowed to do that
01:05:57
because it furthers a positive right
01:05:59
historically they have not been able to
01:06:02
do that because of
01:06:04
this idea of compelled speech meaning
01:06:06
that it would infringe on YouTube's
01:06:08
speech rights I don't think it would I
01:06:10
mean Google and YouTube can advocate for
01:06:11
whatever positions they want they can
01:06:13
produce whatever content they want yeah
01:06:15
but but the point that and I think
01:06:17
section 230 kind of makes this point as
01:06:18
well is that they are platforms they're
01:06:20
distribution platforms they're not
01:06:22
Publishers so if they don't once so
01:06:24
especially if they want section 230
01:06:25
protection they should not be engaging
01:06:27
in Viewpoint discrimination okay so now
01:06:29
there is a rub here wait can I just say
01:06:30
can I just say go ahead your explanation
01:06:32
David your explanation that you just
01:06:33
gave before was so excellent thank you
01:06:36
that it allows me to understand it even
01:06:39
more clearly that was really so chamath
01:06:41
do you think the algorithm is an act of
01:06:44
editorialization yes yes yes and so then
01:06:48
should YouTube look at the end of the
01:06:50
day let me let me break down an
01:06:52
algorithm for you okay effectively it is
01:06:54
a mathematical equation of variables and
01:06:57
weights
01:06:59
an editor 50 years ago was somebody who
01:07:03
had that equation of variables and
01:07:06
weights in his or her mind
01:07:08
okay and so all we did was we translated
01:07:11
again this multimodal model that was in
01:07:14
somebody's brain
01:07:16
into a model that's mathematical that
01:07:19
sits in code you're talking about the
01:07:21
front page and the New York Times yeah
01:07:23
and I think it's a fig Leaf to say that
01:07:25
because there is not an individual
01:07:27
person who writes 0.2 in front of this
01:07:29
one variable and 0.8 in front of the
01:07:31
other
01:07:32
that all of a sudden that this isn't
01:07:34
editorial decision making is wrong we
01:07:36
need to understand the current moment in
01:07:39
which we live which is that these
01:07:41
computers are thinking actively for us
01:07:45
they're providing this you know
01:07:47
computationally intensive
01:07:50
decision making and reasoning
01:07:53
and I think it's
01:07:55
it's pretty ridiculous to assume that
01:07:57
that isn't true that's why when you go
01:07:59
to Google and you search for you know
01:08:01
Michael Jordan we know what the right
01:08:04
Michael Jordan is because it's reasoned
01:08:06
there is an algorithm that is doing that
01:08:09
it's making an editorial decision around
01:08:11
what the right answer is they have
01:08:12
deemed it to be right and that is just
01:08:15
true and so I think we need to
01:08:16
acknowledge that because I think it
01:08:18
allows us at least to be in a position
01:08:20
to rewrite these laws through the lens
01:08:23
of the 21st century
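[A minimal sketch of the "variables and weights" point made above, for readers following along: a ranking algorithm as a weighted sum over item features. All names and numbers are illustrative, not any real platform's ranking code.]

```python
# Hypothetical sketch: an "algorithm" as an equation of variables and weights,
# as described in the discussion. Names and numbers are invented for illustration.

def rank_score(features, weights):
    """Weighted sum of feature values; the weights encode the editorial judgment."""
    return sum(weights[name] * value for name, value in features.items())

# Two candidate videos with made-up feature values in [0, 1].
videos = {
    "video_a": {"watch_time": 0.9, "recency": 0.3, "similarity": 0.8},
    "video_b": {"watch_time": 0.4, "recency": 0.9, "similarity": 0.2},
}

# An editor 50 years ago held weights like these in their head; here they sit in
# code (the "0.2 in front of one variable, 0.8 in front of another" example).
weights = {"watch_time": 0.5, "recency": 0.2, "similarity": 0.3}

# Higher score ranks first -- the choice of weights decides what surfaces.
ranked = sorted(videos, key=lambda v: rank_score(videos[v], weights), reverse=True)
print(ranked)  # ['video_a', 'video_b']: scores 0.75 vs 0.44
```

[Changing a single weight reorders the feed, which is the sense in which the weights themselves are the editorial decision.]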
01:08:25
we need to update our understanding for
01:08:28
how the world works today and you know
01:08:30
jamaat there's such an easy way to do
01:08:32
this if you're Tick Tock if you're
01:08:34
YouTube if you want section 230 if you
01:08:36
want to have common carrier and not be
01:08:38
responsible if it's there when a user
01:08:40
signs up it should give them the option
01:08:42
would you like to turn on an algorithm
01:08:44
here are a series of algorithms which
01:08:46
you could turn on you could bring your
01:08:48
own algorithm you could write your own
01:08:50
algorithm with a bunch of Sliders or
01:08:52
here are ones that other users and
01:08:54
services provide like like an App Store
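[A minimal sketch of the "bring your own algorithm" idea floated here: the platform exposes a registry of feed filters and the user picks one at signup. Everything named below, including the family_education filter, is invented for illustration, not a real API.]

```python
# Hypothetical sketch of user-selectable feed algorithms, app-store style.

def allow_all(item):
    # The default: no filtering at all.
    return True

def family_education(item):
    # Keeps educational items, drops tagged conspiracy content.
    return "education" in item["tags"] and "conspiracy" not in item["tags"]

# A third party (say, a Common Sense Media-style group) could publish
# filters into a registry like this one.
AVAILABLE_FILTERS = {"default": allow_all, "family_education": family_education}

def build_feed(items, filter_name="default"):
    chosen = AVAILABLE_FILTERS[filter_name]
    return [item for item in items if chosen(item)]

feed = build_feed(
    [{"id": 1, "tags": ["education"]},
     {"id": 2, "tags": ["conspiracy"]}],
    filter_name="family_education",
)
print([item["id"] for item in feed])  # [1]
```

[The design choice is that the platform distributes whatever the chosen filter passes, rather than applying one house algorithm to everyone.]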
01:08:55
so chamath you could pick one for your
01:08:58
family your kids that would be I want
01:08:59
one that's leaning towards education and
01:09:02
takes out conspiracy theories takes out
01:09:03
cannabis use takes out this one it's a
01:09:06
wonderful what you're saying is so
01:09:07
wonderful because for example like you
01:09:08
know this organization Common Sense
01:09:10
Media yes I love that website every time
01:09:12
I put in a movie I put Common Sense
01:09:14
Media decide if we should watch it or
01:09:16
like an I use it a lot for apps because
01:09:18
they're pretty good at just telling you
01:09:19
which which apps are reasonable and
01:09:21
unreasonable but you know if Common
01:09:23
Sense Media could raise a little bit
01:09:25
more money and and create an algorithm
01:09:26
that would help filter
01:09:28
stories in Tick Tock for my kids I'd be
01:09:32
more likely to give my kids tick tock
01:09:33
when they turn 14. right now I know that
01:09:36
they're going to sneak it by going to
01:09:37
YouTube and looking at YouTube shorts
01:09:39
and all these other things because I
01:09:41
cannot control that algorithm and it
01:09:43
does worry me what kind of content that
01:09:46
they're getting access to and you could
01:09:48
do this by the way chamath on the
01:09:49
operating system level or on the router
01:09:51
level in your house you could say I want
01:09:53
the common sense algorithm I will pay 25
01:09:55
a month a hundred dollars a year
01:09:59
and then any IP that goes through it
01:10:01
would be programmed properly I want less
01:10:03
violence I want less sex you know
01:10:05
whatever I think we are as a society
01:10:07
sophisticated enough now yes um to have
01:10:10
these controls and so I think we need
01:10:12
them and so I think we do need to have
01:10:15
the right observation of the current
01:10:18
state of play Friedberg
01:10:21
where do you sit on this do you think
01:10:22
the algorithm should be I don't I don't
01:10:24
I don't know right out of 230. yeah I I
01:10:27
don't fully agree with Saks on
01:10:30
um the monopolistic assumption I I think
01:10:33
that there are I think there are other
01:10:35
places to access content and I think
01:10:37
that there is still a free market to
01:10:39
compete
01:10:40
and it is possible to compete I think
01:10:43
that we saw this happen with Tick Tock
01:10:44
we saw it happen with Instagram we saw
01:10:46
it happen with YouTube uh competing
01:10:49
against Google video and Microsoft video
01:10:51
prior to that there has been a very
01:10:53
significant battle for the attention of
01:10:57
kind of being the next gen of media
01:10:59
businesses and we have seen Spotify
01:11:00
compete and we're seeing Spotify
01:11:01
continue to be challenged by emerging
01:11:04
competitors
01:11:05
so I don't buy the assumption that these
01:11:09
are built-in monopolies and therefore it
01:11:11
allows
01:11:12
some regulatory process to come in and
01:11:15
say hey Free Speech needs to be actively
01:11:17
enforced because they're monopolies this
01:11:18
isn't like when utilities laid power
01:11:21
lines and Sewer lines and and trains
01:11:24
across the country and they had a
01:11:26
physical Monopoly on being able to
01:11:28
access and move goods and services the
01:11:30
internet is still thank God knock on
01:11:32
wood open and the ability for anyone to
01:11:34
build a competing service is still
01:11:36
possible and there is a lot of money
01:11:38
that would love to disrupt these
01:11:40
businesses that is actively doing it and
01:11:42
I think every day look at how big Tick
01:11:44
Tock has gotten it is bigger than
01:11:45
YouTube almost or will be soon
01:11:48
and there is a competition that happens
01:11:50
and because of that competition I think
01:11:52
that the the market will ultimately
01:11:54
choose where they want to get their
01:11:56
content from and how they want to
01:11:57
consume it and I don't think that the
01:11:58
government should play a role sacks
01:12:00
rebuttal to that you buy that well so
01:12:02
not all these companies are monopolies
01:12:05
but I think they act in a monopolistic
01:12:07
way with respect to restricting Free
01:12:09
Speech which is they act as a cartel
01:12:11
they all share like best practices with
01:12:13
each other on how to restrict speech and
01:12:16
we saw the The Watershed here was
01:12:18
remember when Trump was thrown off first
01:12:20
Twitter made the decision you know Jack
01:12:22
I don't even know if it was Jack but
01:12:24
basically the company Jack said it
01:12:26
wasn't him actually he said it was the
01:12:27
woman who was running trust and
01:12:29
safety yeah yeah Jack
01:12:31
actually said it was a mistake but any
01:12:32
event Twitter did it first and then all
01:12:34
the other companies followed suit I mean
01:12:36
even like Pinterest and OCTA and
01:12:39
Snapchat like officially YouTube
01:12:42
everybody yeah but Trump was actually on
01:12:44
Facebook he wasn't on all these other
01:12:45
companies they still threw him off so
01:12:47
they all copy each other and Jack
01:12:49
actually said that in his comments where
01:12:51
he said it was a mistake he said he
01:12:53
didn't realize the way in which
01:12:55
Twitter's action would actually Cascade
01:12:58
he said that he thought originally that
01:13:01
the action was okay because it was just
01:13:03
Twitter deciding to take away Trump's
01:13:05
right to free speech but he could still
01:13:07
go to all these other companies and then
01:13:08
all these other companies basically you
01:13:10
know they are all subject to the same
01:13:13
political forces the the leadership of
01:13:15
these companies are all sort of they all
01:13:17
drink from the same monocultural fountain
01:13:19
they all have the same political biases
01:13:21
the polls show this so the problem of
01:13:23
Friedberg is yeah I agree a bunch of
01:13:24
these companies aren't quite monopolies
01:13:26
but they all act the same way but I
01:11:28
agree with you the collective
01:11:30
effect is of a speech cartel so the
01:13:33
question is how do you protect the
01:13:34
rights of Americans to free speech in
01:13:37
the face of a speech cartel that wants
01:13:39
to basically block them go ahead
01:13:40
Friedberg respond here's my argument my
01:13:42
argument is that these are not public
01:13:43
service providers they're private
01:13:45
service providers and the market is
01:13:47
telling them what to do the market is
01:13:49
saying and I think I don't think so I
01:13:51
think that the pressure that was felt by
01:13:53
these folks was that so many consumers
01:13:55
were pissed off that they were letting
01:13:57
Trump rail on or they were pissed off
01:13:58
about Jan 6 or pissed off about whatever
01:14:01
whatever the current fad is the trend is
01:14:04
they respond to the market and they say
01:14:06
you know what this has crossed the line
01:14:08
and this was the case on public
01:14:09
television when nudity came out and
01:14:11
they're like okay you know what we need
01:14:13
to take that off the TV we need to
01:14:14
because the market is telling us they're
01:14:16
going to boycott us and I think that
01:14:18
there's a market pressure here that
01:14:19
we're ignoring that is actually pretty
01:14:21
pretty relevant that as a private
01:14:23
service provider if they're going to
01:14:24
lose half Their audience because people
01:14:26
are pissed about one or two pieces of
01:14:28
content showing up that they're acting
01:14:30
in the best interest of their
01:14:32
shareholders and in the best interest of
01:14:33
their platform they're not acting as a
01:14:35
public service look I love Market forces
01:14:37
as much as the next libertarian but I
01:14:40
just think that fundamentally that's
01:14:42
just not what's going on here this has
01:14:43
nothing to do with Market forces has
01:14:44
everything to do with political forces
01:14:46
that's what's driving this look do you
01:14:48
think the average consumer the average
01:14:50
user of PayPal is demanding that they
01:14:53
engage in all these restrictive policies
01:14:55
throwing off all these accounts who have
01:14:57
the wrong viewpoints no that's nothing
01:14:58
to do with it it has to do with the
01:15:00
vocal minority
01:15:04
who work at these companies and create
01:15:07
pressure from below it's also the you
01:15:09
know the the people from outside the
01:15:10
actors who create these boycott
01:15:12
campaigns and pressure from outside and
01:15:14
then it's basically people on Capitol
01:15:16
Hill who have the same ideology who
01:15:18
basically create threats from above so
01:15:20
these companies are under enormous
01:15:21
pressure from above below and sideways
01:15:24
that and it's 100 percent political hold on it's
01:15:26
not it's not about maximizing profits I
01:15:29
think it's about maximizing you know
01:15:32
political outcomes yeah I don't know
01:15:33
that is that is what the American people
01:15:35
need to be protected from now I will I
01:15:37
will add one Nuance to my theory though
01:15:40
which is
01:15:41
I'm not sure what level of the stack we
01:15:44
should declare to be common carrier so
01:15:46
in other words you may be right actually
01:15:49
that at the level of YouTube or Twitter
01:15:52
or Facebook maybe we shouldn't make them
01:15:54
common carriers and I'll tell you why because
01:15:56
just to take the other side of the
01:15:57
argument for a second which is you know
01:15:59
if you don't because those companies do
01:16:02
have legitimate reasons to take down
01:16:04
some content I don't like the way they
01:16:06
do it but I do not want to see bots on
01:16:08
there I do not want to see fake accounts
01:16:10
and I actually don't want to see like
01:16:12
truly hateful speech or harassment and
01:16:15
the problem is I do worry that if you
01:16:18
subject them to common carrier they
01:16:19
won't actually be able to engage in
01:16:21
let's say legitimate curation of their
01:16:24
social networks yeah right however so so
01:16:27
there's a real debate to be had there
01:16:28
and it's going to be messy but I think
01:16:31
there's one level of the stack below
01:16:32
that which is at the level of pipes like
01:16:34
an AWS like a Cloudflare like a PayPal
01:16:37
like the ISPs like the banks they
01:16:40
are not doing any content moderation or
01:16:43
they have no legitimate reason to be
01:16:44
doing content moderation none of those
01:16:45
companies should be allowed to engage
01:16:47
in Viewpoint discrimination we have a
01:16:48
problem right now where American
01:16:50
citizens are being denied access to
01:16:52
payment rails into the banking system
01:16:54
you're saying because of their view AWS
01:16:56
shouldn't be able to deny service to the
01:16:58
Ku Klux Klan or some hate speech group I
01:17:00
think that they should be under the same
01:17:02
requirements the phone companies are under
01:17:05
okay
01:17:08
you know the question is like look I
01:17:10
could frame the same question to you
01:17:12
should you know something such a
01:17:14
horrible group should such and such
01:17:15
horrible group be able to get a phone a
01:17:17
phone account right yeah no no and you'd
01:17:19
say no they shouldn't get anything but
01:17:20
they have that right that has been
01:17:22
litigated and that's been pretty much
01:17:24
protected by the Supreme Court you know
01:17:26
even if it's a government-conferred
01:17:28
monopoly the Supreme Court has said okay
01:17:30
listen like it's not violating one's
01:17:32
constitutional right for example if your
01:17:34
water bill gets terminated without you
01:17:36
getting due process
01:17:38
and the the inverse is also true so
01:17:41
for what whether we like it or not that
01:17:43
Jason that issue has been litigated I
01:17:45
think
01:17:47
I think I think for me again just like
01:17:49
practically speaking for the functioning
01:17:52
of Civil Society I think it's very
01:17:54
important for us
01:17:55
to now introduce this idea of
01:17:57
algorithmic choice
01:17:59
and I don't think that that will happen
01:18:01
in the absence of us rewriting section
01:18:04
230 in a more intelligent way I don't
01:18:06
know I don't know whether this specific
01:18:08
case
01:18:10
creates an opening for us to do all of
01:18:12
that
01:18:14
but I think it's an important thing that
01:18:17
we have to revisit as a society because
01:18:19
Jason what you described
01:18:21
as having a breadth of algorithmic
01:18:23
choices over time where there are
01:18:25
purveyors and sellers could you imagine
01:18:27
that's not a job or a company that the
01:18:29
four of us would ever have imagined
01:18:31
could be possible five years ago but
01:18:33
maybe there should be an economy of
01:18:35
algorithms and there are these really
01:18:36
great algorithms
01:18:38
that one would want to pay a
01:18:40
subscription for because One Believes In
01:18:42
the quality of what it gives you
01:18:44
we should have that choice and I think
01:18:46
it's an important set of choices that
01:18:47
will allow actually YouTube as an
01:18:50
example to operate more safely as a
01:18:52
platform because it can say listen I've
01:18:54
created this set of abstractions you can
01:18:56
plug in all sorts of algorithms there's
01:18:58
a default algorithm that works but then
01:19:00
there's a Marketplace of algorithms just
01:19:02
like there's a Marketplace of ideas I
01:19:04
don't discriminate and let people choose
01:19:06
this is the key thing with this model
01:19:08
like if if it was on a blockchain if all
01:19:11
the videos all the video content was
01:19:12
uploaded to a public blockchain and then
01:19:15
distributed on distributed computing
01:19:16
system then your ability to search and
01:19:19
use that media would be a function of a
01:19:22
service provider you're willing to pay
01:19:23
for that provides the best service
01:19:24
experience and by the way this is also
01:19:26
why I think over time Sacks and I
01:19:29
are both arguing both sides
01:19:31
a little bit but I think that what will
01:19:33
happen
01:19:34
I don't think that the government should
01:19:35
come in and regulate these guys and tell
01:19:37
them that they can't take stuff down and
01:19:39
whatnot I really don't like the
01:19:40
precedent it sets period I also think
01:19:43
that it's a terrible idea for YouTube
01:19:45
and Twitter to take stuff down
01:19:48
um and I think that there's an
01:19:49
incredibly difficult uh balance that
01:19:52
they're gonna have to find because if
01:19:53
they do this as we're seeing right now
01:19:54
the quality of the experience for a set
01:19:57
of users declines and they will find
01:19:58
somewhere else any Market will develop
01:20:00
for something else to compete
01:20:01
effectively against them and so I that's
01:20:03
why I don't like the government
01:20:04
intervening because I want to see a
01:20:06
better product emerge when the big
01:20:08
company makes some stupid mistake and
01:20:10
does a bad job and then the market will
01:20:12
find a better outcome and it just it's
01:20:14
messy in the middle and as soon as you
01:20:16
do government intervention on these
01:20:18
things and tell them what they can and
01:20:19
can't take down
01:20:21
I really do think that over time you
01:20:23
will limit the user experience to what
01:20:24
is possible if you allow the free market
01:20:26
and this is where the the industry needs
01:20:29
to police itself if you look at the
01:20:30
movie industry with the MPAA and Indiana
01:20:33
Jones and the Temple of Doom they
01:20:35
came out with the PG-13 rating
01:20:37
specifically for things that were a
01:20:38
little too edgy for PG this is where our
01:20:41
industry could get ahead of this they
01:20:43
could give algorithmic Choice an
01:20:45
algorithmic app store and if you look at
01:20:48
the original sin it was these lifetime
01:20:49
bans like Trump should not have been
01:20:51
given a lifetime ban they should have
01:20:52
given him a one-year ban they should
01:20:54
have had a process they overreached we
01:20:57
wouldn't be in this position
01:20:59
when you talk about like having a
01:21:02
industry Consortium like the MPAA what
01:21:04
you're doing is formalizing the
01:21:05
communication that's already taking
01:21:07
place is already happening between these
01:21:08
companies and what is the result of that
01:21:10
communication they all standardize on
01:21:12
overly restrictive policies because they
01:21:13
all have the same
01:21:14
political bias no but if they did it
01:21:16
correctly it's all in the execution
01:21:18
sacks it has to be executed properly
01:21:20
like the movie industry it doesn't
01:21:21
matter you'll end up with the same
01:21:23
problem as having the government
01:21:24
intervene if you have the government
01:21:25
intervene or private body intervene any
01:21:27
sort of set standard intervention that
01:21:29
prevents the market from competing I just
01:21:31
I disagree with you I think you can
01:21:33
create more competition if the
01:21:34
government says
01:21:35
um okay folks you can have the standard
01:21:37
algorithm but you need to make a a
01:21:39
simple
01:21:41
abstracted way for somebody else to
01:21:43
write some other filtering mechanism and
01:21:45
to basically you know so that users can pick
01:21:46
the power users yes
01:21:48
what the MPAA did was I don't know why
01:21:52
isn't that more choice because that's a
01:21:54
product that's a product company I don't
01:21:56
want to be told how to make my product
01:21:57
right if you're not on YouTube you have
01:21:59
you have an algo you're now saying that
01:22:01
there is this distinction of the algo
01:22:02
from the UX from the data and my choice
01:22:04
might be to create different content
01:22:06
libraries for example YouTube has
01:22:08
YouTube kids and it's a different
01:22:09
Content Library and it's a different
01:22:11
user interface and it's a different
01:22:12
algorithm and you're trying to create an
01:22:14
abstraction that may not necessarily be
01:22:15
natural for the evolution of the product
01:22:17
set of that company I would much rather
01:22:19
see them figure it out that's not a good
01:22:21
argument that again if you were not a
01:22:23
monopoly I would be more sympathetic but
01:22:25
because like somebody somebody's
01:22:27
feelings would get hurt a product
01:22:29
manager's feelings will get hurt inside
01:22:30
of Google feelings are not a
01:22:32
reason to not protect free speech I
01:22:34
think you're unnaturally disrupting the
01:22:35
product Evolution and I tough luck
01:22:37
that's what that's what happens when
01:22:38
you're worth two trillion dollars when
01:22:40
you impact a billion people on the
01:22:42
planet when you start having massive
01:22:44
impact in society you have to take some
01:22:46
responsibility those companies are not
01:22:48
taking responsibility if you're not
01:22:50
super super successful if this is not
01:22:52
going to affect you so you don't have
01:22:53
nothing to worry about you'll see you'll
01:22:55
see apps offshore and you'll see Tick
01:22:57
Tock and other things compete because
01:22:58
they'll have a better product experience
01:23:00
no no nobody's going to create a new
01:23:03
Google because they're downranking one
01:23:07
to ten percent of the search results
01:23:09
agreed some accountability hold on in
01:23:12
the ideal World Companies like Google
01:23:13
and so forth would not take sides in
01:23:15
political debates to be politically
01:23:17
neutral but they're not you look at all
01:23:19
the data around the political leanings
01:23:20
the people running these companies and
01:23:22
then you look at the actual actions of
01:23:23
these companies and they have become
01:23:25
fully political and they've waded into
01:23:26
all these political debates with the
01:23:28
result that the American people's rights
01:23:30
to speech and to earn have been reduced
01:23:33
you have companies like PayPal which are
01:23:35
just engaging in retaliation basically
01:23:37
Financial retaliation purely on based on
01:23:41
what political viewpoints they have why
01:23:43
it's not like face it's not like PayPal
01:23:45
needs to be in the business of content
01:23:47
creation let's continue this
01:23:48
conversation
01:23:49
Callin AMA uh if they can get some
01:23:53
servers over there I don't know maybe so
01:23:55
you got to raise some money Sacks for
01:23:56
this app and get some more servers all
01:23:58
right listen for the dictator who needs
01:24:00
to hit the loo to do a number two yes I
01:24:04
am the world's greatest moderator
01:24:06
Friedberg is the Sultan of Science and
01:24:08
David Sachs is the prince of
01:24:12
peace see you all next week on the
01:24:16
episode wait wait is this 98 or 99 no
01:24:19
it's 99 it's 99 only one episode left
01:24:21
Wayne Gretzky get it well let's enjoy it
01:24:24
well let's say that we're wrapping it up here
01:24:25
all right we'll see you all next time
01:24:27
have a great movement bye bye
01:24:30
we'll let your winners ride
01:24:34
[Music]
01:24:38
we open source it to the fans and
01:24:41
they've just gone crazy with it
01:24:44
[Music]
01:24:47
somehow
01:24:50
[Music]
01:25:14
we need to get merch
01:25:19
[Music]
01:25:22
[Music]

Episode Highlights

  • The Rise of Cheating in Competitive Games
    Cheating scandals have emerged in poker, chess, and fishing, raising questions about integrity.
    “There's been an absolute decay of personal responsibility.”
    @ 11m 00s
    October 07, 2022
  • Elon Musk's Twitter Deal Uncertainty
    Elon Musk is still facing hurdles in closing the Twitter deal at $54.20 per share.
    “The lawsuit boils down to one very specific clause.”
    @ 17m 56s
    October 07, 2022
  • Elon's Challenge with Twitter
    Elon Musk faces a delicate predicament regarding the financing of the Twitter deal.
    “I think it's a very delicate predicament that they're all in.”
    @ 21m 28s
    October 07, 2022
  • Twitter Shareholders' Advantage
    Twitter shareholders are set to gain an enormous premium compared to current market value.
    “The best off in all of this are the Twitter shareholders.”
    @ 21m 50s
    October 07, 2022
  • AI Advancements
    Recent developments in AI show rapid progress, including a new text-to-video generator.
    “AI is moving at a pretty advanced clip.”
    @ 31m 12s
    October 07, 2022
  • The Rise of Data Scientists
    The term 'data scientist' emerged from a need for a more impressive title, leading to a new role in tech.
    “We wrote in the offer for the first time data scientist.”
    @ 37m 57s
    October 07, 2022
  • AI and the Future of Work
    As AI evolves, humans will transition from creators to narrators, shaping their experiences rather than creating them from scratch.
    “I think humans transition from being creators to being narrators.”
    @ 46m 23s
    October 07, 2022
  • Biden's Marijuana Pardon
    President Biden announces pardons for federal offenses of simple marijuana possession, a significant political move.
    “I'm pardoning all prior Federal offenses of simple marijuana possession.”
    @ 50m 06s
    October 07, 2022
  • The Dangers of Cannabis for Kids
    Discussing the need for better regulations around cannabis products to protect children.
    “It's not good for kids; it screws up their brains.”
    @ 57m 50s
    October 07, 2022
  • The Role of Algorithms in Free Speech
    Exploring how algorithms can influence free speech and the need for updated laws.
    “We need to acknowledge that these computers are thinking actively for us.”
    @ 01h 08m 16s
    October 07, 2022
  • Market Forces vs. Political Forces
    A discussion on whether market pressures or political ideologies drive content moderation.
    “It's about maximizing political outcomes, not profits.”
    @ 01h 15m 29s
    October 07, 2022
  • Political Neutrality in Tech
    Debating the need for tech companies to remain politically neutral.
    “Companies like Google should not take sides in political debates.”
    @ 01h 23m 13s
    October 07, 2022

Key Moments

  • Personal Responsibility @ 11:00
  • Biden's Marijuana Announcement @ 50:06
  • Cannabis Regulations @ 57:52
  • Zero Tolerance Policy @ 57:56
  • Algorithmic Influence @ 1:08:16
  • Market Pressures @ 1:14:19
  • Algorithmic Choice @ 1:18:35
  • Political Neutrality @ 1:23:13
