
E124: AutoGPT's massive potential and risk, AI regulation, Bob Lee/SF update

April 14, 2023 / 01:33:07

In episode 124 of the All In Podcast, hosts Jason Calacanis, Chamath Palihapitiya, David Sacks, and David Friedberg discuss global fan meetups, generative AI advancements, and the implications of new technologies for business and society.

The episode begins with a mention of self-organized fan meetups for episode 125, highlighting the enthusiasm of listeners. The hosts reflect on the phenomenon of fans gathering, comparing it to past media events.

Generative AI is a major topic, particularly the release of AutoGPT, which lets GPT-based agents prompt one another and complete tasks autonomously. The hosts discuss its potential to streamline processes in various industries, including sales and event planning.
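
The "agents talking to each other" idea can be sketched in a few lines. The two canned agents below are invented stand-ins: `respond()` returns scripted replies where a real implementation would send the incoming message to a model API.

```python
def make_agent(name, script):
    """Stand-in for an LLM-backed agent: replies come from a canned script.
    A real agent would forward the incoming message to a model API."""
    replies = iter(script)
    def respond(message):
        return f"{name}: {next(replies)}"
    return respond

planner = make_agent("planner", ["Find 3 venues", "Book the cheapest one", "DONE"])
worker = make_agent("worker", ["Found: A, B, C", "Booked venue B"])

# The two agents hand messages back and forth until the planner stops.
message, transcript = "start", []
for _ in range(3):
    message = planner(message)
    transcript.append(message)
    if message.endswith("DONE"):
        break
    message = worker(message)
    transcript.append(message)
print(transcript)
```

With real model calls in place of the scripts, the same ping-pong loop is the core of the agent-to-agent pattern discussed in the episode.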

They also touch on the implications of AI on company formation and venture capital, suggesting that smaller teams can now achieve what once required larger organizations. The conversation shifts to the need for regulation in AI, with differing opinions on how to approach oversight.

The episode concludes with a discussion on the societal impact of crime in San Francisco, reflecting on recent events and the media's portrayal of the city's safety.

TL;DR

Episode 124 covers fan meetups, generative AI advancements, and the implications of technology for business and society, featuring hosts Jason Calacanis, Chamath Palihapitiya, David Sacks, and David Friedberg.

Video

00:00:00
welcome to episode
00:00:01
124 of the all in podcast my
00:00:04
understanding is there's going to be a
00:00:06
bunch of global fan meetups for episode
00:00:09
125 if you go to Twitter and you search
00:00:11
for
00:00:12
all in fan meetups you might be able to
00:00:14
find the link but just to be clear we're
00:00:15
not they're not official All In events
00:00:17
they're fans it's self-organized which
00:00:20
is pretty mind-blowing but we can't
00:00:21
vouch for any particular organization
00:00:24
right nobody knows what's going to
00:00:26
happen at these things you can get
00:00:27
robbed it could be a setup I don't know
00:00:29
but I retweeted it anyway because there
00:00:32
are 31 cities where you lunatics are
00:00:35
getting together to celebrate the
00:00:37
world's number one business technology
00:00:39
podcast it is pretty crazy you know what
00:00:42
this reminds me of is in the early 90s
00:00:44
when Rush Limbaugh became a phenomenon
00:00:46
there used to be these things called
00:00:48
Rush rooms where like restaurants and
00:00:50
bars would literally broadcast rush over
00:00:54
their speakers during I don't know like
00:00:56
for the morning through lunch broadcast
00:00:57
and people would go to these Rush rooms
00:01:00
and listen together what was it like sex
00:01:01
when you were about 16 17 years old at
00:01:04
the time what was it like when you
00:01:05
hosted this it was a phenomenon but I
00:01:07
mean it's kind of crazy we've got like a
00:01:08
phenomenon going here where people are I
00:01:10
love it organizing you've said
00:01:12
phenomenon three times instead of
00:01:14
phenomenon he said it's phenomenon
00:01:16
phenomenal why is Sax in a good mood Chamath
00:01:19
what's going on there's a specific
00:01:21
secret toe tap that you do under the
00:01:23
bathroom stalls when you go to a
00:01:24
Rush room
00:01:26
which we're already off the rails I
00:01:29
think you're getting confused about a
00:01:31
different event you went to
00:01:33
[Music]
00:01:37
let your Rain Man
00:01:38
[Music]
00:01:45
[Music]
00:01:49
there's a lot of actual news in the
00:01:51
world and generative AI is taking over
00:01:55
the dialogue and it's moving at a pace
00:01:57
that
00:01:59
none of us have ever seen in the
00:02:01
technology industry I think we'd all
00:02:02
agree the number of companies releasing
00:02:04
product
00:02:06
and the compounding effect of this
00:02:08
technology is phenomenal I think we
00:02:11
would all agree
00:02:12
a product came out this week called Auto
00:02:15
GPT
00:02:17
and people are losing their mind over it
00:02:20
basically what this does is it lets
00:02:23
different gpts
00:02:25
talk to each other and so you can have
00:02:29
agents working in the background and
00:02:30
we've talked about this on previous
00:02:31
podcasts but they could be talking to
00:02:35
each other essentially
00:02:36
and then completing tasks without much
00:02:39
intervention so if let's say you had a
00:02:41
sales team and you said to the sales
00:02:44
team hey look for leads that have these
00:02:48
characteristics for our sales software
00:02:49
put them into our database find out if
00:02:52
they're already in the database alert A
00:02:54
salesperson to it compose a message
00:02:56
based on that person's profile on
00:02:57
LinkedIn or Twitter or wherever and then
00:03:00
compose an email send it to them if they
00:03:02
reply offer them to do a demo and then
00:03:04
put that demo on the calendar of the
00:03:06
salesperson thus eliminating a bunch of
00:03:07
jobs and you could run these what would
00:03:09
essentially be cron jobs in the
00:03:11
background
00:03:12
forever and they could interact with
00:03:15
other llms in real time sex just gave
00:03:18
but one example here but when you see
00:03:19
this happening give us your perspective
00:03:21
on what this Tipping Point means let me
00:03:24
take a shot at explaining it in a
00:03:26
slightly different way not that your
00:03:28
explanation was wrong but I just think
00:03:30
that maybe explain it in terms of
00:03:32
something more tangible sure so I had a
00:03:35
friend who's a developer has been
00:03:36
playing with auto GPT by the way so you
00:03:40
can see it's on GitHub it's kind of an
00:03:41
open source project it was sort of a
00:03:43
hobby project it looks like that
00:03:44
somebody put up there it's been out for
00:03:46
about two weeks it's already got 45 000
00:03:49
stars on GitHub which is a huge number
00:03:51
explain what GitHub is for the audience
00:03:53
is this a code repository and you can
00:03:55
create you know repos of code for open
00:03:57
source projects that's where all the
00:03:58
developers check in their code so you
00:04:01
know for open source projects like this
00:04:02
anyone can go see it and play with it
00:04:03
it's like PornHub but for developers it
00:04:07
would be more like amateur PornHub
00:04:09
because you're contributing your scenes
00:04:11
as it were your code yes but yes
00:04:12
continue this thing has a ton of of
00:04:15
stars and apparently just last night it
00:04:18
got another 10 000 Stars overnight this
00:04:19
thing is like exploding in terms of
00:04:21
popularity but anyway what you do is you
00:04:24
give it an assignment and what Auto GPT
00:04:27
can do that's different is it can string
00:04:30
together prompts so if you go to chat
00:04:32
GPT you prompt it one at a time and what
00:04:34
the human does is you get your answer
00:04:36
and then you think of your next prompt
00:04:37
and then you kind of go from there and
00:04:39
you end up in a long conversation that
00:04:40
gets you to where you want to go
00:04:42
so the question is what if the AI could
00:04:45
basically prompt itself then you've got
00:04:48
the basis for autonomy and that's what
00:04:50
this project is designed to do so what
00:04:53
you'll do is what my friend did is he
00:04:54
said okay you're an event planner
00:04:57
Ai and what I would like you to do is
00:05:01
plan a trip for me
00:05:02
for a wine tasting in Healdsburg this
00:05:06
weekend and I want you to find like the
00:05:08
best place I should go and it's got to
00:05:10
be kid-friendly not everyone's going to
00:05:11
drink we're gonna have kids there and
00:05:12
I'd like to be able to have other people
00:05:14
there and so I'd like you to plan this
00:05:16
for me and so what Auto GPT did is it
00:05:20
broke that down into a task list and
00:05:22
every time it completed a task it would
00:05:25
add a new task to the bottom of that
00:05:27
list and so the output of this is that
00:05:30
it searched a bunch of different wine
00:05:32
tasting venues it found a venue that had
00:05:34
a bocce ball lawn area for kids it came
00:05:38
up with a schedule
00:05:39
it created a budget it created a
00:05:42
checklist for an event planner
00:05:44
it did all these things my friend says
00:05:46
he's actually going to book the venue this
00:05:47
weekend and use it so we're going beyond
00:05:50
the ability just for a human to just
00:05:52
prompt the AI we're now
00:05:54
the AI can take on complicated tasks and
00:05:58
again it can recursively update its task
00:06:00
list based on what it learns from its
00:06:03
own previous prompt so what you're
00:06:04
seeing now is the basis for a personal
00:06:07
digital assistant this is really where
00:06:09
it's all headed is that you can just
00:06:11
tell the AI to do something for you
00:06:12
pretty complicated and it will be able
00:06:15
to do it it will be able to create its
00:06:16
own task list and get the job done
00:06:19
in quite complicated jobs so that's why
00:06:22
everyone's losing their [ __ ] over this
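
The mechanism described above — break the goal into a task list, and append new tasks to the bottom as each one completes — is essentially a work queue wrapped around a model call. A minimal sketch, with a hard-coded `plan_next_tasks()` standing in for the LLM prompt (the task names are invented for illustration):

```python
from collections import deque

def plan_next_tasks(goal, completed):
    """Stand-in for the LLM call: given the goal and what's done so far,
    propose follow-up tasks. AutoGPT does this with a model prompt."""
    followups = {
        "find leads": ["check database for duplicates"],
        "check database for duplicates": ["draft outreach email"],
        "draft outreach email": [],
    }
    return followups.get(completed[-1], []) if completed else ["find leads"]

def run_agent(goal, max_steps=10):
    tasks = deque(plan_next_tasks(goal, []))
    completed = []
    while tasks and len(completed) < max_steps:
        task = tasks.popleft()
        # "Execute" the task (a no-op here); real agents call tools and APIs.
        completed.append(task)
        # Recursively extend the task list based on what just finished.
        tasks.extend(plan_next_tasks(goal, completed))
    return completed

print(run_agent("qualify sales leads"))
# ['find leads', 'check database for duplicates', 'draft outreach email']
```

The autonomy comes entirely from the feedback edge: the output of each step is fed back in to generate the next steps, so no human has to write the follow-up prompts.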
00:06:23
freeberg your thoughts on automating
00:06:26
these tasks and having them run and and
00:06:28
add tasks to the list this does seem
00:06:31
like a sort of seminal moment in time
00:06:34
that this is actually working
00:06:36
I think we've been seeing seminal
00:06:39
moments
00:06:41
over the last couple of weeks and months
00:06:43
kind of continuously every time we chat
00:06:47
about stuff or every day there's new
00:06:49
releases that are Paradigm shifting and
00:06:52
kind of reveal new applications and and
00:06:55
perhaps Concepts structurally that we
00:06:58
didn't really
00:06:59
have a good grasp of before some
00:07:01
demonstration came across chat GPT was
00:07:03
kind of the seat of that and then all of
00:07:05
this Evolution sense has really I think
00:07:08
changed the landscape for really how we
00:07:10
think about our interaction with digital
00:07:12
world and where the digital world can go
00:07:15
and how it can interact with the
00:07:16
physical world it's it's just really
00:07:17
profound
00:07:19
one of the interesting aspects that I
00:07:21
think I saw with some of the
00:07:23
applications of Auto GPT were these
00:07:25
almost like
00:07:28
autonomous characters in in like a game
00:07:31
simulation that could interact with each
00:07:33
other or these autonomous characters
00:07:34
that would speak back and forth to one
00:07:36
another
00:07:37
where each
00:07:38
instance has its own kind of predefined
00:07:41
role and then it explores some set of
00:07:44
Discovery or application or prompt back
00:07:46
and forth with the other agent and that
00:07:49
the kind of recursive outcomes with this
00:07:51
agent to agent interaction model and
00:07:53
perhaps multi-agent interaction model
00:07:56
again reveals an entirely new paradigm
00:07:58
for you know how things can be done
00:08:00
simulation wise
00:08:02
you know Discovery wise engagement wise
00:08:04
where One agent you know each agent can
00:08:06
be a different character in a room and
00:08:08
you can almost see how a team might
00:08:10
resolve to create a new product
00:08:11
collaboratively by telling each of those
00:08:14
agents to have a different character
00:08:16
background or different set of data or a
00:08:18
different set of experiences or
00:08:19
different set of personality traits and
00:08:21
the evolution of those that multi-agent
00:08:23
system outputs you know something that's
00:08:25
very novel that perhaps any of the
00:08:27
agents operating independently we're not
00:08:29
able to kind of reveal themselves so
00:08:31
again like another kind of dimension of
00:08:32
interaction
00:08:34
with these with these models and it
00:08:37
again like every week it's a whole other
00:08:39
layer to The Onion
00:08:41
it's super exciting and compelling and
00:08:43
the rate of change and the pace of kind
00:08:45
of you know New Paths being
00:08:48
being defined here really I think makes
00:08:51
it difficult to catch up and
00:08:53
particularly it highlights why it's
00:08:55
going to be so difficult I think for
00:08:56
Regulators to come in and try and set a
00:08:59
set of standards and a set of rules at
00:09:02
this stage because we don't even know
00:09:03
what we have here yet and it's going to
00:09:05
be very hard to kind of put the genie
00:09:06
back in the Box yeah and you're also
00:09:09
referring I think to the Stanford and
00:09:12
Google paper that was published this
00:09:14
week they did a research paper where
00:09:16
they created essentially The Sims if you
00:09:18
remember that video game put a bunch of
00:09:20
and what you might consider NPCs
00:09:22
non-playable characters you know the
00:09:24
merchant or the whoever in a um in a
00:09:28
video game and they said
00:09:30
each of these agents should talk to each
00:09:32
other put them in a simulation one of
00:09:33
them decided to have a birthday party
00:09:34
they decided to invite other people and
00:09:37
then they have memories and so then over
00:09:38
time they would generate responses like
00:09:41
I can't go to your birthday party but
00:09:43
happy birthday and then they would
00:09:45
follow up with each player and seemingly
00:09:48
emergent behaviors came out of this sort
00:09:51
of simulation which of course now has
00:09:53
everybody thinking well of course we as
00:09:55
humans and this is simulation there are
00:09:57
living in a simulation we've all just
00:09:59
been put into this is what we're
00:10:01
experiencing right now
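
The Stanford/Google setup described here — NPCs that keep memories and generate replies from them — can be caricatured with a toy class. The rule-based `reply()` below is a stand-in for prompting a model with the agent's memory stream; the names and dialogue are invented:

```python
class Sim:
    """Toy NPC in the style of the generative-agents paper: each agent
    keeps a memory stream and answers based on what it remembers."""
    def __init__(self, name):
        self.name, self.memory = name, []

    def hear(self, speaker, message):
        self.memory.append((speaker, message))

    def reply(self):
        # A real agent would feed self.memory into a model prompt;
        # here we just react to the most recent thing we heard.
        speaker, message = self.memory[-1]
        if "birthday party" in message:
            return f"Happy birthday, {speaker}! I can't make it, sorry."
        return "Nice to see you."

alice, bob = Sim("alice"), Sim("bob")
bob.hear("alice", "Come to my birthday party on Saturday!")
print(bob.reply())  # a memory-conditioned response, as in the paper
```

The emergent behavior in the paper comes from running many such agents against a real model, where the memory stream shapes every reply instead of a single hard-coded rule.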
00:10:03
how impressive this technology is or is
00:10:07
it
00:10:08
oh wow human cognition maybe we thought
00:10:11
was incredibly special but we can
00:10:13
actually simulate a significant portion
00:10:15
of what we do as humans so we're kind of
00:10:17
taking the Shine off of
00:10:19
Consciousness I'm not sure it's that but
00:10:21
I would make two comments I think this
00:10:23
is a really important week
00:10:25
because it starts to show
00:10:28
how fast the recursion is
00:10:31
with AI so in other Technologies and in
00:10:35
other breakthroughs
00:10:37
the recursive iterations took years
00:10:39
right if you think about how long did we
00:10:41
wait for from iPhone 1 to iPhone 2 it
00:10:44
was a year
00:10:45
right we waited two years for the App
00:10:48
Store
00:10:49
everything was measured in years maybe
00:10:51
things when they were really really
00:10:53
aggressive and really disruptive were
00:10:56
measured in months
00:10:57
except now these incredibly Innovative
00:11:00
breakthroughs are being measured in days
00:11:02
and weeks
00:11:03
that's incredibly profound
00:11:06
and I think it has some really important
00:11:09
implications to like the three big
00:11:11
actors in this play right so it has I
00:11:14
think huge implications to these
00:11:15
companies it's not clear to me how you
00:11:18
start a company anymore
00:11:20
I don't understand why
00:11:23
you would have a 40 or 50 person company
00:11:27
to try to get to an MVP I think you can
00:11:30
do that with three or four people
00:11:33
and that has huge implications then to
00:11:36
the second actor in this play which are
00:11:38
the investors in Venture capitalists
00:11:40
that typically fund this stuff because
00:11:42
all of our Capital allocation models
00:11:45
were always around writing 10 and 15 and
00:11:47
20 million dollar checks and 100 million
00:11:49
dollar checks then 500 million dollar
00:11:51
checks into these businesses that
00:11:53
absorbs tons of money
00:11:55
but the reality is like you know you're
00:11:57
looking at things
00:11:58
like mid-journey and others that can
00:12:00
scale to enormous size with very little
00:12:03
Capital many of which can now be
00:12:05
bootstrapped
00:12:07
so it takes really really small amounts
00:12:10
of money
00:12:11
and so I think that's a huge implication
00:12:13
so for me personally I am looking at
00:12:15
company formation
00:12:17
being done in a totally different way
00:12:19
and our Capital allocation model is
00:12:21
totally wrong size look fund four for me
00:12:23
was one billion dollars does that make
00:12:26
sense
00:12:27
nope for the next three or four years no
00:12:29
the right number may actually be 50
00:12:31
million dollars invested over the next
00:12:33
four years I think the VC job is
00:12:35
changing I think company startups are
00:12:36
changing I want to remind you guys of
00:12:38
one quick thing as a tangent
00:12:41
I had this meeting with Andrej Karpathy I
00:12:42
talked about this on the Pod where I
00:12:44
said I challenged him I said listen the
00:12:46
real goal should be to go and disrupt
00:12:49
existing businesses using these tools
00:12:51
cutting out all the sales and marketing
00:12:53
right and just delivering something and
00:12:55
I use the example of stripe
00:12:57
disrupting stripe by going to Market
00:12:59
with an equivalent product with
00:13:00
one-tenth the number of employees at one
00:13:03
tenth the cost
00:13:04
what's incredible is that this Auto GPT
00:13:06
is the answer to that exact problem
00:13:09
why because now if you are a young
00:13:12
industrious entrepreneur
00:13:14
if you look at any bloated organization
00:13:17
that's building Enterprise class
00:13:18
software
00:13:19
you can string together a bunch of
00:13:22
agents that will Auto construct
00:13:24
everything you need
00:13:26
to build a much much cheaper product
00:13:28
that then you can deploy for other
00:13:31
agents to consume so you don't even need
00:13:33
a sales team anymore this is what I mean
00:13:35
by this crazy recursion that's possible
00:13:37
yeah so I'm really curious to see how
00:13:40
this actually affects like all of this
00:13:42
all of these you know continuation
00:13:44
companies I mean it's a continuation of
00:13:47
and then the last thing I just want to
00:13:48
say is related to my tweet I think this
00:13:50
is exactly the moment where we now have
00:13:53
to have a real conversation about
00:13:54
regulation and I think it has to happen
00:13:56
otherwise it's going to be a [ __ ] show
00:13:57
let's put a pin in that for a second but
00:13:59
I want to get Sax's response to some of
00:14:01
this so sax we saw this before it used
00:14:04
to take two or three million dollars to
00:14:05
commercialize a web-based software
00:14:07
product app then it went down to 500k
00:14:09
then 250k I don't know if you saw this
00:14:12
story but if you remember the hit game
00:14:14
on your iPhone Flappy Birds Flappy Birds
00:14:18
uh you know was a phenomenon at you know
00:14:21
hundreds of millions of people played
00:14:23
this game over some period of time
00:14:25
somebody made it by talking to chat gpt4
00:14:29
and Midjourney in an hour
00:14:31
so the perfect example and listen it's a
00:14:33
game so it's something silly but I was
00:14:35
talking to two developers this weekend
00:14:37
and one of them was an okay developer
00:14:40
and the other one was an actual 10x
00:14:41
developer who's built you know very
00:14:43
significant companies and they were
00:14:45
coding together last week and because of
00:14:48
how fast chat GPT and other services
00:14:50
were writing code for them
00:14:52
he looked over at her and said you know
00:14:55
you're basically a 10x developer now my
00:14:59
superpower is gone so where does this
00:15:02
lead you to believe company formation is
00:15:05
going to go is this going to be
00:15:08
you know massively deflationary
00:15:09
companies like stripe are going to have
00:15:11
a hundred competitors in a very short
00:15:13
period of time or are we just going to
00:15:14
go down the long tail of ideas and solve
00:15:18
everything with software how is this
00:15:20
going to play out in the in the startup
00:15:22
space David sacks
00:15:24
well I think it's true that
00:15:27
developers and especially Junior
00:15:29
developers get a lot more leverage on
00:15:32
their time and so it is going to be
00:15:34
easier for small teams to get to an MVP
00:15:37
which is something they always should
00:15:38
have done anyway with their seed round
00:15:40
you shouldn't have needed you know 50
00:15:43
developers to build your V1 it should be
00:15:45
you know that's the founders really
00:15:48
so that that I think is already
00:15:51
happening and that Trend will continue I
00:15:53
think we're still a ways away from
00:15:57
these tools being able to replace entire
00:15:59
teams of people I just you know I think
00:16:01
right now
00:16:02
define a ways months years decades well
00:16:07
it's in the years I think for sure we
00:16:08
don't know how many years and the reason
00:16:10
I say that is it's just very hard
00:16:13
to replace
00:16:14
you know 100 of what any of these
00:16:17
particular job functions do 100 of what
00:16:20
a sales rep does 100 of what a marketing
00:16:22
rep does or even what a coder does so
00:16:25
right now I think we're still at the
00:16:26
phase of this where it's a tool that
00:16:28
gives a human leverage and I think we're
00:16:32
still a ways away from the you know
00:16:34
human being completely out of the loop I
00:16:36
think right now I see it mostly as a
00:16:38
Force for good
00:16:40
as opposed to something that's creating
00:16:42
okay a ton of dislocation Friedberg your
00:16:45
thoughts if we follow the trend line you
00:16:48
know to make that video game that you
00:16:49
shared took probably a few hundred human
00:16:52
years then a few dozen human years then
00:16:55
you know with other tool kits coming out
00:16:57
maybe a few human months and now this
00:17:00
person did it in one human day using
00:17:03
this tooling so if you think about the
00:17:06
implication for that I mentioned this
00:17:07
probably last year I really do believe
00:17:10
that at some point the whole concept of
00:17:12
Publishers and Publishing maybe goes
00:17:14
away where you know much like we saw so
00:17:17
much of the content on the internet
00:17:18
today being user generated you know most
00:17:20
of the content is made by individuals
00:17:21
posted on YouTube or Twitter that's most
00:17:24
of what we consume nowadays or Instagram
00:17:25
or Tick Tock in terms of video content
00:17:29
we could see the same
00:17:31
in terms of software itself where you no
00:17:34
longer need a software startup or a
00:17:36
software company to render or generate a
00:17:39
set of tools for a particular user but
00:17:41
that the user may be able to Define to
00:17:44
their agent their AI agent the set of
00:17:47
tools that they would individually like
00:17:48
to use or to create for them to do
00:17:50
something interesting and so the idea of
00:17:52
buying or subscribing to software or
00:17:55
even buying or subscribing to a video
00:17:57
game or to a movie or to some other form
00:17:59
of content
00:18:01
starts to diminish as The Leverage goes
00:18:04
up with these tools the accessibility
00:18:06
goes up you no longer need a computer
00:18:07
engineering degree or computer science
00:18:09
degree to be able to harness them or use
00:18:11
them and individuals may be able to
00:18:13
speak in simple and plain English that
00:18:15
they would like a book or a movie that
00:18:17
does that looks and feels like the
00:18:19
following or a video game that feels
00:18:21
like the following and so when I open up
00:18:23
my iPhone maybe it's not a screen with
00:18:25
dozens of video games but it's one
00:18:27
interface and the interface says what do
00:18:29
you feel like playing today and then I
00:18:30
can very clearly and succinctly State
00:18:32
what I feel like playing and it can
00:18:33
render that game and render the code
00:18:35
render the engine render the graphics
00:18:37
and everything on the Fly for me and I
00:18:39
can use that and so you know I kind of
00:18:41
think about this as being a bit of a
00:18:42
leveling up that the idea that all
00:18:44
technology again starts Central and
00:18:46
moves to kind of the edge of the network
00:18:47
over time that may be what's going on
00:18:50
with computer programming itself now
00:18:52
where the toolkit to actually use
00:18:55
computers to generate stuff for us is no
00:18:58
longer a toolkit that's harnessed and
00:19:00
controlled and utilized by a set of
00:19:02
centralized Publishers but it becomes
00:19:04
distributed and used at the edge of the
00:19:06
network by users like anyone and then
00:19:08
the edge of the Network Technology can
00:19:10
render the software for you and it
00:19:12
really creates a profound change in the
00:19:14
entire business landscape of software
00:19:17
and the internet
00:19:18
and I think it's uh you know it's it's
00:19:20
really like we're just starting to kind
00:19:22
of see have our heads unravel around
00:19:24
this notion and we're sort of trying to
00:19:26
link it to the old Paradigm which is all
00:19:28
startups are going to get cheaper
00:19:29
smaller teams but it may be that you
00:19:30
don't even need startups for a lot of
00:19:32
stuff anymore you don't even need teams
00:19:33
and you don't even need companies to
00:19:35
generate and render software to do stuff
00:19:37
for you anymore when we look at this it
00:19:39
it's kind of a pattern of
00:19:43
augmentation as we've been talking about
00:19:45
here we're augmenting human intelligence
00:19:48
then replacing this replication or this
00:19:52
automation I guess might be a nice way
00:19:54
to say it so it's augmentation then
00:19:56
automation
00:19:57
and then perhaps deprecation where do
00:20:00
you sit on this it seems like sax feels
00:20:03
it's going to take years
00:20:04
and Freeburg thinks hey maybe startups
00:20:06
and content are over where do you sit on
00:20:08
this augmentation automation deprecation
00:20:11
Journey we're on I think that humans
00:20:13
have judgment and I think it's going to
00:20:14
take decades for agents to replace good
00:20:17
judgment I think that's where we have
00:20:20
some
00:20:21
defensible ground and I'm going to say
00:20:23
something controversial I don't think
00:20:25
developers anymore have good judgment
00:20:28
developers get to the answer or they
00:20:30
don't get to the answer and that's what
00:20:31
agents have done because the the 10x
00:20:33
engineer had better judgment than the 1X
00:20:36
engineer
00:20:37
but by making everybody a 10x engineer
00:20:39
you're taking judgment away
00:20:41
you're taking code paths that are now
00:20:44
obvious and making it available to
00:20:46
everybody it's effectively like what you
00:20:48
did in chess an AI created a solver so
00:20:51
everybody understood the most efficient
00:20:53
path in every single spot to do the
00:20:56
most EV positive thing the most expected
00:20:58
value positive thing coding is very
00:21:00
similar that way you can reduce it and
00:21:02
view it very very reductively so there
00:21:05
is no differentiation in code and so I
00:21:07
think Freeburg is right so for example
00:21:09
let's say you're going to start a
00:21:11
company today
00:21:12
why do you even care
00:21:14
what database you use
00:21:16
why do you even care
00:21:19
which Cloud you're built on to free
00:21:21
Brook's Point why do any of these things
00:21:23
matter they don't matter they were
00:21:25
decisions that used to matter when
00:21:27
people had a job to do and you paid them
00:21:30
for their judgment oh well we think gcp
00:21:32
is better for this specific workload and
00:21:34
we think that this database architecture
00:21:36
is better for that specific workload and
00:21:38
we're going to run this on AWS but that
00:21:40
on azure
00:21:41
and do you think an agent cares if you
00:21:44
tell an agent find me the cheapest way
00:21:46
to execute this thing
00:21:48
and if it ever gets not you know cheaper
00:21:50
to go someplace else do that for me as
00:21:52
well and you know ETL all the data and
00:21:54
put it in the other thing and I don't
00:21:56
really care so you're saying it will it
00:21:58
will swap out Stripe for Adyen or it
00:22:01
doesn't for Amazon web services it's
00:22:03
going to be ruthless it's going to be
00:22:05
ruthless and I think that the point of
00:22:06
that that and that's the exact perfect
00:22:08
word Jason AI is ruthless because it's
00:22:11
emotionless it was not taken to a steak
00:22:14
dinner
00:22:14
it was not brought to a basketball game
00:22:17
it was not sold into a CEO it's an agent
00:22:21
that looked at a bunch of API endpoints
00:22:24
figured out how to write code to it to
00:22:27
get done the job at hand that was tasked
00:22:29
to it within a budget right the other
00:22:31
thing that's important is these agents
00:22:32
execute within budgets so another good
00:22:35
example was and this is a much simpler
00:22:38
one but a guy said I would like seven
00:22:43
days worth of meals here are my
00:22:46
constraints from a dietary perspective
00:22:48
here are also my budgetary constraints
00:22:50
and then what this agent did was figured
00:22:53
out how to go and use the instacart
00:22:55
plug-in at the time and then these other
00:22:57
things and execute within the budget
00:22:59
how is that different when you're a
00:23:01
person that raises five hundred thousand
00:23:03
dollars and says I need a full stack
00:23:05
solution that does X Y and Z for two
00:23:07
hundred thousand dollars it's the exact
00:23:09
same problem
00:23:10
so I think it's just a matter of time
00:23:13
until we start to cannibalize these
00:23:15
extremely expensive ossified large
00:23:18
organizations that have relied on a very
00:23:20
complicated go to market in sales and
00:23:22
marketing motion I don't think you need
00:23:24
it anymore in a world of of agents and
00:23:25
auto gpts and I think
00:23:28
that to me is quite interesting because
00:23:30
a it creates an obvious set of public
00:23:33
company shorts
00:23:34
and then B
00:23:36
you actually want to arm the rebels and
00:23:39
arming the rebels to use the Tobi
00:23:40
Lütke analogy here would mean to seed
00:23:43
hundreds of one-person teams hundreds
00:23:47
and just say go and build this entire
00:23:49
stack all over again using a bunch of
00:23:51
Agents
00:23:52
yeah and I think recursively you'll get
00:23:54
to that answer in in less than a year
00:23:57
interestingly
00:23:59
when you talk about the emotion of
00:24:00
making these decisions if you look at
00:24:02
Hollywood I just interviewed on my other
00:24:04
podcast the founder of you have another
00:24:07
podcast I do it's called startups thank
00:24:10
you episode you've been on her four
00:24:12
times
00:24:13
don't give them an excuse to plug it
00:24:15
listen I'm not going to this week in
00:24:17
startups available on Spotify and iTunes
00:24:19
and youtube.com this weekend Runway is
00:24:22
the name of this company I interviewed
00:24:23
and what's fascinating about this is he
00:24:26
told me on everything everywhere all at
00:24:29
once the award-winning film
00:24:31
they had seven visual effects people on
00:24:33
it and they were using his software
00:24:35
the late night shows like Colbert and
00:24:37
stuff like that are using it they are
00:24:39
ruthless in terms of creating crazy
00:24:41
visual effects now without and you can
00:24:44
do text prompt to get video output and
00:24:48
it is quite reasonable what's coming out
00:24:50
of it but you can also train it on
00:24:52
existing data sets so they're going to
00:24:54
be able to take something sax like The
00:24:56
Simpsons
00:24:58
or South Park or Star Wars or Marvel
00:25:00
take the entire Corpus of the comic
00:25:03
books and the movies and the TV shows
00:25:05
and then have people type in have Iron
00:25:07
Man do this have Luke Skywalker do that
00:25:10
and it's going to Output stuff and I
00:25:12
said hey when would this reach
00:25:15
the the level that the Mandalorian TV
00:25:17
show is and he said within two years now
00:25:19
he's talking his own book but it's quite
00:25:21
possible about that all these visual
00:25:23
effects people from Industrial Light &
00:25:25
Magic on down
00:25:27
are going to be replaced with directors
00:25:29
Sacks who are currently using this
00:25:31
technology to do
00:25:33
what do they call the images like that
00:25:35
go with the script storyboards
00:25:36
storyboards thank you they're doing
00:25:38
storyboards in this right now right the
00:25:40
difference between the storyboards Sacks
00:25:42
and the output is closing in the next 30
00:25:45
months I would say right I mean maybe
00:25:48
you could speak to a little bit about
00:25:49
the pace here because that is the
00:25:51
perfect example of ruthless AI
00:25:52
I mean you could have the entire team at
00:25:55
Industrial Light & Magic or Pixar be
00:25:58
unnecessary this decade well I mean you
00:26:01
see a bunch of the pieces already there
00:26:02
so you have stable diffusion you have
00:26:04
the ability to type in the image that
00:26:06
you want and it spits out you know a
00:26:08
version of it or 10 different versions
00:26:10
of it and you can pick which one you
00:26:11
want to go with you have the ability to
00:26:13
create characters you have the ability
00:26:14
to create voices you have the ability to
00:26:18
replicate a celebrity voice the only
00:26:20
thing that's not there yet as far as I
00:26:23
know is the ability to take static
00:26:24
images and stream them together into a
00:26:26
motion picture but that seems like it's
00:26:28
coming really soon so yeah in theory you
00:26:31
should be able to train the model where
00:26:33
you just give it screenplay and it
00:26:35
outputs essentially an animated movie
00:26:37
and then you should be able to fine tune
00:26:39
it by choosing the voices that you want
00:26:41
and the characters that you want and you
00:26:44
know and that kind of stuff so yeah I
00:26:46
think we're close to it now I think that
00:26:48
the question though is you know every
00:26:50
nine let's call it of reliability is a
00:26:54
big advancement so yeah it might be easy
00:26:56
to get to 90 percent within two years
00:26:59
but it might take another two years to
00:27:01
go from 90 to 99 and then it might take
00:27:03
another two years to get to 99.9 and so
00:27:06
on and so to actually get to the point
00:27:08
where you're at this stage where you can
00:27:10
release a theatrical quality movie I'm
00:27:12
sure it will take a lot longer than two
00:27:14
years well but look at this sex I'm just
00:27:15
going to show you one image this is the
00:27:17
input was aerial Drone footage of a
00:27:19
mountain range and this is what it came
00:27:21
up with now if you were watching TV in
00:27:23
the 80s or 90s on a non-HD TV this would
00:27:26
look indistinguishable from anything
00:27:28
you've seen and so this is at a pace
00:27:32
that's kind of crazy there's also
00:27:33
opportunity here right Friedberg I mean
00:27:35
if we were to look at something like The
00:27:37
Simpsons which has gone on for 30 years
00:27:39
if young people watching The Simpsons
00:27:42
could create their own scenarios or with
00:27:45
auto GPT imagine you told The Simpsons
00:27:50
stable diffusion instance read what's
00:27:53
happening in the news have Bart Simpson
00:27:55
respond to it have the South Park
00:27:57
characters parody whatever happened in
00:28:00
the news today you could have automated
00:28:02
real-time episodes of South Park just
00:28:04
being published onto some website before
00:28:07
you move on did you see the the Wonder
00:28:09
Studio demo we can pull this one up it's
00:28:12
really cool yeah please this is a
00:28:14
startup that's using this type of
00:28:16
technology and the way it works is
00:28:19
you film a live action scene with a
00:28:23
regular actor but then you can just drag
00:28:25
and drop and animate a character onto it
00:28:27
and it then converts that scene into a
00:28:32
movie with that character like Planet of
00:28:34
the Apes or Lord of the Rings right yeah
00:28:36
exactly you see the person who kept
00:28:38
winning all the Oscars so there it goes
00:28:40
after the robot has replaced the human
00:28:42
wow you can imagine like every piece of
00:28:44
this just eventually gets swapped out
00:28:46
with AI right like you should be able to
00:28:48
tell the AI
00:28:50
give me a picture of a human leaving a
00:28:56
building
00:28:57
like a Victorian era building in New
00:29:00
York and certainly can give you a static
00:29:02
image of that so it's not that far to
00:29:04
then give you a video of that right
00:29:07
and so yeah I think we're we're pretty
00:29:09
close for let's call it hobbyists or
00:29:11
amateurs to be able to create pretty
00:29:13
nice looking movies using these types of
00:29:16
tools but again I think there's a jump
00:29:19
to get to the point where you're just
00:29:20
all together replacing one of the things
00:29:24
I'll say on this is we still keep trying
00:29:25
to relate it back to the way media
00:29:28
narrative has been explored and written
00:29:31
by humans in the past very kind of
00:29:34
linear storytelling you know it's a
00:29:36
two-hour movie 30 minute TV segment
00:29:38
eight minute YouTube clip 30 second
00:29:41
Instagram clip whatever
00:29:42
but one of the
00:29:44
enabling capabilities with this set of
00:29:48
tools is that these stories
00:29:51
the way that they're rendered and the
00:29:52
way that they're explored by individuals
00:29:54
can be fairly dynamic
00:29:58
you could watch a movie with the same
00:30:00
story all four of us could watch a movie
00:30:02
with the same story but from totally
00:30:05
different Vantage points and some of us
00:30:06
could watch it in an 18 minute version
00:30:08
or a two-hour version or a you know
00:30:10
three season episode episodic version
00:30:13
where the the way that this opens up the
00:30:16
potential for creators and all so so now
00:30:18
I'm kind of saying before I was saying
00:30:20
hey individuals can make their own
00:30:21
movies and videos that's going to be
00:30:23
incredible there's a separate I think
00:30:25
creative output here
00:30:27
which is the leveling up that happens
00:30:30
with creators that maybe wasn't possible
00:30:33
to them before so perhaps a Creator
00:30:35
writes a short book a short story and
00:30:38
then that short story gets rendered into
00:30:40
a system that can allow each one of us
00:30:42
to explore it and enjoy it in different
00:30:43
ways and I as the Creator can define
00:30:46
those different Vantage points I as the
00:30:48
Creator can say here's a little bit of
00:30:50
this personality this character trait
00:30:52
and so what I can now do as a Creator is
00:30:55
stuff that I never imagined I could do
00:30:57
before think about old school
00:30:58
photographers doing black and white
00:30:59
photography with pinhole cameras and
00:31:02
then they come across Adobe Photoshop
00:31:03
what they can do with Adobe Photoshop
00:31:05
was stuff that they could never
00:31:06
conceptualize of in those old days I
00:31:09
think what's going to happen for
00:31:10
creators going forward and this is going
00:31:12
back to that point that we had last week
00:31:13
or two weeks ago about the guy that was
00:31:15
like hey I'm out of a job I actually
00:31:17
think that the opportunity for creating
00:31:19
new stuff in new ways is so profoundly
00:31:22
expanding that individuals can now write
00:31:25
entire universes that can then be
00:31:27
enjoyed by millions of people from
00:31:29
completely different lengths and
00:31:30
viewpoints and and models that can be
00:31:33
interactive they can be static they can
00:31:35
be dynamic
00:31:36
and that the person
00:31:37
personalized but the tooling that you as
00:31:39
a Creator now has you could choose which
00:31:42
characters you wanted to find you could
00:31:43
choose which content you want to write
00:31:46
you could choose which content you want
00:31:48
the AI to fill in for you and say hey
00:31:50
create 50 other characters in the
00:31:52
village and then when the viewer reads
00:31:54
the book or watches the movie Let Them
00:31:56
explore or have a different interaction
00:31:57
with a set of of those villagers uh in
00:32:00
that Village or you could say hey here's
00:32:01
the one character everyone has to meet
00:32:03
here's what I want them to say and you
00:32:05
can Define the dialogue and so the way
00:32:07
the creators can start to kind of
00:32:08
harness their creative chops and create
00:32:10
new kinds of
00:32:12
modalities for content and for
00:32:14
exploration I think is going to be so
00:32:16
beautiful and incredible I mean
00:32:18
Friedberg yeah you can choose the limits
00:32:20
of how much you want the individual to
00:32:23
enjoy from your content versus how
00:32:25
narrowly you want to Define it and my
00:32:27
guess is that the creators that are
00:32:28
going to win are going to be the ones
00:32:30
that are going to create more dynamic
00:32:31
range in the creative output and then
00:32:34
individuals are going to kind of be
00:32:35
stuck they're gonna be more into that
00:32:37
than they will with the static everyone
00:32:39
watches the same thing over and over so
00:32:40
there will be a whole new world of
00:32:42
creators that you know maybe have a
00:32:43
different set of tools that then just
00:32:46
just realizing a lot better to build on
00:32:48
what you're saying Friedberg I think
00:32:49
it's incredibly insightful just think
00:32:50
about the controversy around two aspects
00:32:53
of a franchise like James Bond number
00:32:55
one who's your favorite Bond we grew up
00:32:57
with Roger Moore We lean towards that
00:32:59
then we discover Sean Connery and then
00:33:00
all of a sudden you see you know the
00:33:02
latest one he's just extraordinary
00:33:04
and and Daniel Craig you're like you
00:33:06
know what that's the one that I love
00:33:07
most but what if you could take any of
00:33:09
the films you could say let me get you
00:33:10
know give me the spy who loved me but
00:33:11
put Daniel Craig in it Etc and that
00:33:13
would be available to you and then think
00:33:15
about the next controversy which is oh
00:33:16
my God does Daniel does James Bond need
00:33:18
to be a white guy from the UK of course
00:33:20
not you can you place it around the
00:33:22
world and each region could get their
00:33:24
own celebrity their number one celebrity
00:33:26
to play the lead and controversy over
00:33:29
you know the old story The Epic of
00:33:31
Gilgamesh right so like that story was
00:33:33
retold in dozens of different languages
00:33:35
and it was told through the oral
00:33:37
tradition it was like You Know spoken by
00:33:38
bards around a fire pit and whatnot and
00:33:41
all of those stories were told with
00:33:43
different characters and different names
00:33:44
and different experiences some of them
00:33:46
were 10 minutes long some of them were
00:33:48
multi-hour sagas explained through the
00:33:50
story but ultimately the morality of the
00:33:53
story the storyline the intentionality
00:33:55
of the original creator of that story
00:33:57
yes came through the the Bible is
00:33:58
another good example of this where much
00:34:00
of the underlying morality and ethics in
00:34:02
the Bible comes through in different
00:34:03
stories read by different people in
00:34:05
different languages everything that that
00:34:06
may be where we go like my kids want to
00:34:09
have a 10 minute bedtime story well let
00:34:10
me give them Peter Pan at 10 minutes I
00:34:12
want to do you know a chapter or a night
00:34:14
for my older daughter for a week long of
00:34:16
Peter Pan now I can do that and so the
00:34:19
way that I can kind of consume content
00:34:21
becomes different so I guess what I'm
00:34:23
saying is there's two aspects to the way
00:34:25
that I think the entire content the the
00:34:27
realm of content can be Rewritten
00:34:29
through AI the first is like individual
00:34:31
personalized creation of content where I
00:34:33
as a user can render content that of my
00:34:36
liking and my interest the second is
00:34:38
that I can engage with content that is
00:34:40
being created that is so much more
00:34:42
multi-dimensional than anything we
00:34:43
conceive of today we're current
00:34:45
centralized content creators now have a
00:34:47
whole set of tools now from a business
00:34:48
model perspective I don't think that
00:34:50
Publishers are really the play anymore
00:34:52
but I do think the platforms are going
00:34:53
to be the play and the platform tooling
00:34:55
that enables the individuals to do this
00:34:57
stuff and the platform tooling that
00:34:58
enables the content creators to do this
00:35:00
stuff are definitely entirely new
00:35:02
Industries and models that can create
00:35:04
multi-hundred billion dollar outcomes
00:35:06
let me hand this off to sax because
00:35:08
there has been the dream for everybody
00:35:10
especially in the Bay area of a hero
00:35:13
coming and saving Gotham City
00:35:16
and this has finally been realized David
00:35:20
sacks I did my own little Twitter AI
00:35:24
hashtag and I said to Twitter AI if only
00:35:28
please generate a picture of David Sacks as
00:35:30
Batman crouched down on the bridge the
00:35:33
amount of creativity sacks that came
00:35:35
from this and this is
00:35:38
something that you know if we were
00:35:40
talking about just five years ago this
00:35:42
would be like a ten thousand dollar
00:35:43
image you could create a birthday these
00:35:46
were not professional quote unquote
00:35:47
artists these were individuals
00:35:49
individuals that were able to harness a
00:35:51
set of platform tools to generate this
00:35:53
incredible new content and I think it
00:35:55
speaks to the opportunity ahead and by
00:35:57
the way we're in inning one right so
00:35:59
you see yourself as Batman do you ever
00:36:01
think you should take your enormous
00:36:03
wealth and resources and put it towards
00:36:05
building a cave under your mansion that
00:36:08
lets you out underneath the Golden Gate
00:36:10
Bridge and you could go fight crime so
00:36:11
good do you want to go fight this crime
00:36:14
in Gotham
00:36:16
and I think San Francisco has a lot of
00:36:18
gotham-like qualities I think the
00:36:19
villains are more real than the heroes
00:36:21
unfortunately we don't have a lot of
00:36:23
Heroes but yeah we got a lot of jokers
00:36:25
Jokers yeah that's a whole separate
00:36:28
topic I'm sure a whole separate topic
00:36:30
we'll get to it at some point today you
00:36:31
guys are talking about all this stupid
00:36:33
[ __ ] like there are trillions of
00:36:35
dollars of software companies that could
00:36:36
get disrupted and you're talking about
00:36:37
making [ __ ] children's books and fat
00:36:39
pictures of Sacks it's so dumb no
00:36:41
special conversations
00:36:44
great job
00:36:46
cares about entertainment anymore
00:36:47
because it's totally obvious okay so one
00:36:49
of the biggest industries where the
00:36:50
money is why don't you teach people
00:36:52
where there's going to be actual
00:36:54
economic destruction
00:36:55
amazing economic destruction and
00:36:57
opportunity you spend all this time on
00:36:59
the most stupidest [ __ ] topics listen
00:37:01
it's an illustrative example no it's an
00:37:04
elitist example that you know it's
00:37:06
[ __ ] circle jerk yourself Batman's
00:37:08
not nobody nobody cares about movies
00:37:10
well let's bring nobody
00:37:12
tweet over everybody I mean I think I
00:37:14
think U.S box office is something like
00:37:16
20 billion a year I remember when like
00:37:18
they now got to like 100 billion a year
00:37:19
payment volume and now it's like
00:37:21
hundreds of billions so yeah and stripe
00:37:23
are going to process two trillion
00:37:25
dollars almost why don't you talk about
00:37:26
that disruption you ninny Market size of
00:37:28
U.S media and entertainment industry 717
00:37:30
billion okay it's not insignificant
00:37:33
video games are nearly half a trillion a
00:37:35
year yeah I mean this is the number
00:37:37
insignificant but let's pull up chamat's
00:37:39
tweet of course the dictator wants to
00:37:41
dictate here all this incredible
00:37:43
Innovation is being made and
00:37:45
a new Hero has been born Chamath
00:37:48
Palihapitiya a tweet that went viral over
00:37:51
1.2 million views already
00:37:53
I'll read your Tweet for the audience if
00:37:55
you invent a novel drug you need the
00:37:57
government to vet and approve it FDA
00:37:59
before you can commercialize it if you
00:38:00
invent a new mode of air travel you need
00:38:03
the government to vet and approve it FAA
00:38:04
I'm just going to edit this down a
00:38:06
little bit if you create new security
00:38:07
you need the government to vet it and
00:38:09
approve it SEC more generally when you
00:38:10
create things with broad societal impact
00:38:13
positive and negative the government
00:38:14
creates a layer to review and approve it
00:38:16
AI will need such an oversight body the
00:38:19
FDA approval process seems the most
00:38:21
credible and adaptable into a framework
00:38:23
to understand how
00:38:25
a model behaves and its counterfactual
00:38:28
our political leaders need to get in
00:38:31
front of this sooner rather than later
00:38:32
and create some oversight before the
00:38:34
eventual big avoidable mistakes happen
00:38:36
and Genies are let out of the bottle
00:38:38
Chamath you really want the government
00:38:39
to come in and then when people build
00:38:43
these tools they have to submit them to
00:38:45
the government to approve them that's
00:38:46
what you're saying here and you want
00:38:48
that to start now
00:38:49
here's the alternative the alternative
00:38:51
is going to be the debacle that we know
00:38:53
as section 230. so if you try to write a
00:38:57
brittle piece of legislation or try to
00:39:00
use
00:39:01
old legislation to deal with something
00:39:04
new
00:39:05
it's not going to do a good job because
00:39:07
technology advances way too quickly
00:39:10
and so if you look at the section 230
00:39:12
example where have we left ourselves the
00:39:14
politicians have a complete inability to
00:39:17
pass a new framework to deal with social
00:39:19
media to deal with misinformation and so
00:39:21
now we're all kind of guessing what
00:39:24
a bunch of age 70 and 80 year old
00:39:27
Supreme Court Justices will do in trying
00:39:30
to rewrite technology law when they have
00:39:33
to apply on Section 230. so the point of
00:39:35
that tweet was to lay the Alternatives
00:39:38
there is no world in which this will be
00:39:41
unregulated
00:39:43
and so I think the question to ask
00:39:45
ourselves is do we want a chance for a
00:39:48
new body
00:39:49
so the FDA is a perfect example why even
00:39:52
though the FDA commissioner is appointed
00:39:54
by the president this is a quasi
00:39:56
organization it still arms length away
00:39:59
it has subject matter experts that they
00:40:02
hire and they have many Pathways to
00:40:05
approval some Pathways take days some
00:40:09
pathways are months and years some
00:40:11
pathways are for breakthrough Innovation
00:40:13
some pathways are for devices so they
00:40:16
have a broad spectrum of ways of of
00:40:18
arbitrating
00:40:19
what can be commercialized and what
00:40:21
cannot otherwise my prediction is we
00:40:24
will have a very brittle law that will
00:40:27
not work it'll be like the Commerce
00:40:29
department and the FTC
00:40:31
trying to gerrymander some old piece of
00:40:33
legislation and then what will happen is
00:40:36
it'll get escalated to the Supreme Court
00:40:38
and I think they are the last group of
00:40:40
people
00:40:41
who should be deciding on this
00:40:44
incredibly important topic for society
00:40:45
so
00:40:47
what I have been advocating our leaders
00:40:50
and I will continue to do so is
00:40:52
don't try to Ram this into an existing
00:40:54
body it is so important it is worth
00:40:57
creating a new organization like the FDA
00:41:01
and having a framework that allows you
00:41:03
to look at a model
00:41:04
and look at the counterfactual judge
00:41:08
how good how important how disruptive it
00:41:10
is
00:41:11
and then release it in the wild
00:41:13
appropriately otherwise I think you'll
00:41:14
have these chaos GPT things
00:41:17
scale infinitely because again as
00:41:19
Friedberg said and Sacks said you're
00:41:21
talking about one person that can create
00:41:22
this chaos multiply that by every person
00:41:26
that is an anarchist or every person
00:41:27
that just wants to sow seeds of chaos
00:41:30
and I think it's going to be all
00:41:31
avoidable I think regulating what
00:41:33
software people can write is a near
00:41:35
impossible task number one I think you
00:41:38
can probably put rules and restrictions
00:41:39
around Commerce right that's certainly
00:41:41
feasible uh in terms of how people can
00:41:44
monetize but in terms of writing and
00:41:46
utilizing software it's going to be as
00:41:49
challenged as trying to monitor
00:41:52
and demand
00:41:53
oversight and regulation around how
00:41:55
people write and use
00:41:57
tools for uh for genome and biology
00:42:00
exploration certainly if you want to
00:42:02
take a product to Market and sell a drug
00:42:04
to people that can influence their body
00:42:06
you have to go get that approved but in
00:42:08
terms of you know doing your work in a
00:42:10
lab it's very difficult I think the
00:42:12
other challenge here is
00:42:15
software can be written anywhere
00:42:17
it can be executed anywhere and so if
00:42:21
the US does try to regulate or does try
00:42:25
to put the brakes on the development of
00:42:27
tools where the U.S can have kind of a
00:42:29
great economic benefit and a great
00:42:31
economic interest
00:42:32
there will be advances made elsewhere
00:42:35
without a doubt and those markets and
00:42:37
those
00:42:39
those places will benefit in an
00:42:42
extraordinarily out outpaced way as we
00:42:44
just mentioned there's such
00:42:46
extraordinary kind of economic gain to
00:42:48
be realized here that if we're not
00:42:51
if the United States is not leading the
00:42:54
world we are going to be following we
00:42:56
are going to get disrupted we are going
00:42:57
to lose an incredible amount of value
00:42:58
and talent and so any attempt at
00:43:01
regulation or slowing down or telling
00:43:04
people that they cannot do things when
00:43:06
they can easily hop on a plane and go do
00:43:07
it elsewhere I think is fraught with
00:43:10
Peril so you don't agree with regulation
00:43:12
Sacks are you on board with the Chamath
00:43:14
plan or you're on board with
00:43:15
Friedberg well I'll say I think just
00:43:17
like with computer hacking it's illegal
00:43:19
to break into someone else's computer it
00:43:20
is illegal to steal someone's personal
00:43:22
information there are laws that are
00:43:24
absolutely simple and obvious and
00:43:28
you know no-nonsense laws is it
00:43:31
legal to get rid of a hundred thousand
00:43:33
jobs by making a piece of software
00:43:34
though that's right and so I think
00:43:36
trying to intentionalize how we do
00:43:39
things versus intentionalizing
00:43:41
um the things that we want to prohibit
00:43:43
happening as an outcome we can certainly
00:43:45
try and prohibit the things that we want
00:43:46
to happen as an outcome and pass laws
00:43:48
and Institute governing bodies with
00:43:51
authority to oversee those laws
00:43:53
with respect to things like stealing
00:43:55
data but you can jump on a plane and go
00:43:57
do it in Mexico Canada or whatever
00:43:59
region you get to Sacks where do you
00:44:01
stand on this debate yeah I'm saying
00:44:02
like there are ways to protect people
00:44:03
there's ways to protect Society about
00:44:05
passing laws that that make it illegal
00:44:07
to do things as the output is the
00:44:08
outcome what law do you pass on chaos
00:44:10
GPT explain chaos GPT Give an example
00:44:13
please yeah do you want to talk about it
00:44:14
real quick it's a recursive agent that
00:44:16
basically is trying to destroy itself
00:44:19
try to destroy Humanity yeah but I guess
00:44:22
by first becoming all-powerful and
00:44:23
destroying humanity and then destroying
00:44:25
itself yeah it's a tongue-in-cheek Auto
00:44:28
GPT
00:44:30
it's not a tongue-in-cheek auto GPT the
00:44:33
guy that created it you know put it out
00:44:34
there and said like he's trying to show
00:44:36
everyone to your point what
00:44:38
intentionality could arise here which is
00:44:39
negative intentionality I think it's
00:44:41
very naive for anybody
00:44:44
to think that this is not equivalent to
00:44:47
something that could cause harm to you
00:44:49
so for example if the prompt is hey here
00:44:52
is a security leak that we figured out
00:44:54
in Windows and so why don't you exploit
00:44:57
it so look a hacker now has to be very
00:44:59
technical
00:45:00
today with with these Auto gpts a hacker
00:45:03
does not need to be technical at all
00:45:04
exploit the zero day
00:45:07
exploit in Windows hack into this plane
00:45:09
and bring it down oh okay the GPT will
00:45:12
do it so who's going to tell you that
00:45:14
those things are not allowed who's going
00:45:15
to actually vet that that wasn't allowed
00:45:18
to be released in the wild so for
00:45:19
example if you worked with Amazon and
00:45:21
Google and Microsoft and said you're
00:45:23
going to have to run these things in a
00:45:25
sandbox and we're going to have to
00:45:27
observe the output before we allow it to
00:45:29
run on actual bare metal in the wild
00:45:32
again that seems like a reasonable thing
00:45:34
and it's super naive for people to think
00:45:36
it's a free market so we should just be
00:45:38
able to do what we want this will end
00:45:40
badly quickly and when the first plane
00:45:43
goes down and when the first [ __ ]
00:45:44
thing gets blown up all of you guys will
00:45:46
be like oh sorry Sacks a pretty
00:45:48
compelling example here by Chamath
00:45:50
somebody puts out into the wild chaos
00:45:52
GPT you can go do a Google search for it
00:45:53
and says hey what are the
00:45:55
vulnerabilities to the electrical grid
00:45:58
compile those and automate a series of
00:46:01
attacks and write some code to probe
00:46:04
those and upon success in this
00:46:06
Mission you get a hundred points and
00:46:08
stars every time you Jason do this such
00:46:10
a it's such a beautiful example but it's
00:46:12
even more nefarious it is
00:46:14
hey this is an enemy that's trying to
00:46:17
hack our system so you need to hack
00:46:19
theirs and bring it down you know like
00:46:21
you can easily trick these gpts right
00:46:24
yes they have no judgment they have no
00:46:25
judgment and as you said they're
00:46:27
ruthless in getting to the outcome
00:46:29
right
00:46:31
so why why do we think all of a sudden
00:46:33
this is not going to happen I mean it's
00:46:34
literally the science fiction example
00:46:36
you say Hey listen make sure no humans
00:46:38
get cancer and like okay well The
00:46:39
Logical way to make sure no humans get
00:46:40
cancer is to kill all the humans can you
00:46:43
just address the point so what do you
00:46:44
think you're regulating are you
00:46:45
regulating the code here's what I'm
00:46:47
saying to write if you look at the FDA
00:46:49
no you're allowed to make any chemical
00:46:51
drug you want but if you want to
00:46:52
commercialize it you need to run a
00:46:55
series of trials with highly qualified
00:46:58
measurable data and you submit it to
00:47:00
like-minded experts that are trained as
00:47:02
you are to evaluate the viability of
00:47:05
that and but hold on there are Pathways
00:47:08
that allow you to get that done in days
00:47:09
under emergency use and then there are
00:47:12
Pathways that can take years depending
00:47:14
on how gargantuan the task is at hand
00:47:16
and all I'm suggesting
00:47:18
is having some amount of oversight
00:47:21
is not bad in this specific example I
00:47:25
get what you're saying but I'm asking
00:47:26
tactically how what are you overseeing
00:47:28
you're overseeing ChatGPT you're
00:47:30
overseeing the model you're doing
00:47:32
chips exactly okay look I used to run
00:47:35
the Facebook platform we used to create
00:47:37
sandboxes if you submit code to us
00:47:40
you would we would run it as a Sandbox
00:47:42
we would observe it we would figure out
00:47:43
what it was trying to do and we would
00:47:45
tell you this is allowed to run in the
00:47:46
wild there's a version of that that
00:47:48
Apple does when you submit an app for
00:47:50
review and approval Google does it as
00:47:53
well in this case all the bare metal
00:47:55
providers all the people that provide
00:47:57
gpus will be forced by the government in
00:48:00
my opinion to implement something and
00:48:03
all I'm suggesting is
00:48:04
that it should be a new kind of body
00:48:07
that essentially observes that has phds
00:48:10
that has people who are trained in this
00:48:12
stuff to develop the kind of testing and
00:48:15
the output that you need
00:48:16
to figure out whether it should even be
00:48:18
allowed to run in the Wild on bare metal
00:48:20
sorry but you're saying that the mod the
00:48:22
model sorry I'm just trying to
00:48:23
understand Chamath's points you're saying
00:48:24
that the models need to be reviewed by
00:48:26
this body and those models if they're
00:48:28
run on a third-party set of servers
00:48:36
you cannot run an app on your computer
00:48:38
you know that right it needs to be
00:48:40
connected to the internet right like if
00:48:41
you wanted to run an auto GPT it
00:48:43
actually crawls the internet it actually
00:48:45
touches other apis it tries to then
00:48:47
basically send a push request sees what
00:48:50
it gets back parses the Json figures out
00:48:52
what it needs to do all of that is
00:48:55
allowed because it's hosted by somebody
00:48:56
right that code is running not locally
00:49:00
so the host becomes sure if you want to
00:49:03
run it locally you can do whatever you
00:49:04
want to do but evil agents are going to
00:49:06
do that right so if I'm an evil agent
00:49:08
I'm not going to go use AWS to run my
00:49:10
evil agent I'm going to set up a bunch
00:49:12
of servers and connect it to the
00:49:13
internet how
00:49:15
I could use VPNs the internet is open
00:49:17
there's openings
00:49:20
I think that what you're going to see is
00:49:23
that if you for example try to VPN and
00:49:25
run it out of like Tajikistan back to
00:49:27
the United States
00:49:29
it's not going to take years for us to
00:49:31
figure out that we need to IP block
00:49:32
Rando [ __ ] coming in push and pull
00:49:34
requests from all kinds of ips that we
00:49:36
don't trust anymore because we don't now
00:49:38
trust the regulatory oversight that they
00:49:40
have for code that's running from those
00:49:42
IPs that are not US-domiciled let
00:49:44
me steelman Chamath's position for a
00:49:46
second Jason hold on I I think the
00:49:47
ultimate if what chamoth is saying is
00:49:50
the point of view of Congress and if
00:49:52
Chamath has this point of view then there
00:49:54
will certainly be people in Congress
00:49:55
that will adopt this point of view the
00:49:58
only way to ultimately do that degree of
00:50:00
Regulation and restriction is going to
00:50:02
be to restrict the open internet it is
00:50:04
going to be to have monitoring and
00:50:05
firewalls and safety protocols across
00:50:07
the open internet because you can have a
00:50:09
set of models running on any set of
00:50:10
servers sitting in any physical location
00:50:12
and as long as they can move data
00:50:14
packets around they're going to be able
00:50:16
to get up to their nefarious activities
00:50:18
let me steel man that for you Freeburg I
00:50:21
think yes you're correct the internet
00:50:23
has existed in a very open way but there
00:50:25
are organizations and there are places
00:50:27
like the national highway traffic safety
00:50:29
administration if I were to steel man
00:50:31
Chamath's position if you want to
00:50:33
manufacture a car and you want to make
00:50:36
one in your backyard and put it on your
00:50:37
track in on your land up in Napa
00:50:40
somewhere and you don't want to have
00:50:41
brakes on the car and you don't want to
00:50:43
have you know a speed limiter or airbags
00:50:46
or seat belts and you want to drive on
00:50:47
the hood of the car you can do that but
00:50:49
once you want it to go on the open road
00:50:50
the open internet you need to get you
00:50:53
need to submit it for some safety
00:50:54
standards like NHTSA like Tesla has to
00:50:58
or Ford has to so Sacks where do you sit
00:51:00
on this or is let's assume that people
00:51:04
are going to do very bad things with
00:51:06
very powerful models that are becoming
00:51:09
available Amazon today said they'll be
00:51:10
Switzerland they're going to put a bunch
00:51:11
of llms and other models available on
00:51:13
AWS Bloomberg's LLM Facebook's Google
00:51:17
Bard and of course ChatGPT OpenAI and
00:51:20
Bing all this stuff's available to have
00:51:21
access to that do you need to have some
00:51:24
regulation of who has access to those at
00:51:26
scale powerful tools should there be
00:51:28
some FDA or NHTSA I don't think we know
00:51:32
how to regulate it yet I think that's
00:51:34
too early and I think the harms that
00:51:36
we're speculating about we're making the
00:51:38
AI more powerful than it is and I
00:51:40
believe it will be that powerful but I
00:51:42
think that it's premature to be talking
00:51:43
about regulating something that doesn't
00:51:45
really exist yet take the chaos GPT
00:51:46
scenario
00:51:48
the way that would play out would be
00:51:51
you've got some future incarnation of
00:51:53
Auto GPT
00:51:54
and somebody says okay Auto GPT I want
00:51:57
you to be you know WMD AI and figure out
00:52:01
how to cause like a mass destruction
00:52:03
event you know and then it creates like
00:52:05
a planning checklist and that kind of
00:52:06
stuff
00:52:07
so that's basically the the type of
00:52:10
scenario we're we're talking about we're
00:52:12
not anywhere close to that yet I mean
00:52:14
the chaos GPT is kind of a joke it
00:52:17
doesn't produce it doesn't produce a
00:52:20
checklist I can give an example that
00:52:21
would actually
00:52:22
be completely plausible one of the first
00:52:25
things on the chaos gpt's checklist was
00:52:27
to stay within the boundaries of the law
00:52:29
because it didn't want to get prosecuted
00:52:30
got it so the person who did that had
00:52:32
some sort of good intent but I can give
00:52:34
you an example right now
00:52:36
that could be done by chat GPT and auto
00:52:38
GPT that could take down large swaths of
00:52:40
society and cause massive destruction
00:52:42
I'm almost reticent to say it here say
00:52:44
it uh well I'll say it and then maybe
00:52:45
we'll have to delete this but if
00:52:47
somebody created this and they said uh
00:52:49
figure out a way to compromise as many
00:52:52
powerful peoples and as many systems
00:52:54
passwords then go in there and delete
00:52:56
all their files and turn off as many
00:52:58
systems as you can
00:53:00
ChatGPT and Auto GPT could very easily
00:53:02
create phishing accounts create billions
00:53:05
of websites to create billions of logins
00:53:07
have people log into them get their
00:53:09
passwords log into whatever they do and
00:53:11
then delete everything in their account
00:53:12
which would cause chaos and it could be
00:53:15
done today I don't think we've done
00:53:17
today simpler than this how about how
00:53:18
about a phishing website yeah pieces of
00:53:21
it can be created today but you're
00:53:22
you're accelerating the progress yeah
00:53:25
but you can automate 30 days yeah
00:53:28
exactly and but I think I'm accelerating
00:53:30
it in weeks why don't you just spoof the
00:53:33
bank accounts and just steal the money
00:53:34
like that's even simpler like people
00:53:36
will do the stuff because they're trying
00:53:37
to do it today holy cow now they just
00:53:39
have a more efficient way to solve
00:53:41
somebody think about bank accounts geez
00:53:42
so number one this is a tool and if
00:53:44
people use a tool in nefarious ways you
00:53:46
prosecute them number two the platforms
00:53:49
that are commercializing these tools do
00:53:51
have trust and safety teams now in the
00:53:54
past trust and safety has been a
00:53:56
euphemism for censorship which it
00:53:58
shouldn't be but you know open AI has a
00:54:01
safety team and they try to detect when
00:54:03
people are using their Tech in a
00:54:05
nefarious way and they try to prevent it
00:54:08
well no not on censorship but I think
00:54:11
that there are probably a million people
00:54:13
are using ChatGPT I think they're
00:54:15
policing it are you willing to abdicate
00:54:17
your societal responsibility to
00:54:20
OpenAI to do the trust and safety what
00:54:22
I'm what I'm saying is I'd like to see
00:54:24
how far we get in terms of the system
00:54:27
yes you're saying you want to see the
00:54:29
mistakes you want to see where the
00:54:30
mistakes are and how bad had the
00:54:32
mistakes are I'm saying it's still very
00:54:33
early to be imposing regulation we don't
00:54:35
even know what to regulate so I think we
00:54:37
have to keep tracking this to develop
00:54:39
some understanding of how it might be
00:54:41
misused how the industry is going to
00:54:43
develop safety guard rails okay and then
00:54:47
you can talk about regulation look you
00:54:49
create some new FDA right now okay first
00:54:51
of all we know what would happen look at
00:54:53
the drug process as soon as the FDA got
00:54:55
involved it slowed down massively now it
00:54:57
takes years many years to get a drug
00:54:59
approved appropriately so yes but at
00:55:02
least with a drug we know what the gold
00:55:04
standard is you run a double-blind study
00:55:06
to see whether it causes harm or whether
00:55:09
it's beneficial we don't know what that
00:55:11
standard is for AI yet we have no idea
00:55:14
what's going to happen in a double blind
00:55:16
study in AI what no you'd have somebody
00:55:19
review the code you have two instances
00:55:24
of Auto GPT
00:55:27
it's benign I mean
00:55:28
my friend used it to book a wine tasting
00:55:32
so who's going to review that code and
00:55:34
then speculate say oh well 99.9 of cases
00:55:38
it's perfectly benevolent and fine and
00:55:42
innocuous you know I can fantasize about
00:55:44
some cases someone might do hold on how
00:55:46
are you supposed to resolve that very
00:55:48
simple there are two types of Regulation
00:55:49
that can occur in any industry you can
00:55:51
do what the movie industry did which is
00:55:53
they self-regulate and they came up with
00:55:55
their own rating system or you can do
00:55:56
what happens with the FDA and what
00:55:58
happens with cars which is an external
00:56:01
government-based body I think now is the
00:56:03
time for self-regulation so that we
00:56:05
avoid the massive heavy hand of
00:56:08
government having to come in here but
00:56:09
these tools can be used today to create
00:56:11
massive harm they're moving at a pace we
00:56:13
just said in the first half of the show
00:56:15
that none of us have ever seen every 48
00:56:17
Hours something drops that is
00:56:19
mind-blowing that's never happened
00:56:20
before and you can
00:56:22
take these tools and in the one example
00:56:25
that Chamath and I came up with off the top
00:56:27
of our head in 30 seconds you could
00:56:30
create phishing sites compromise
00:56:31
people's bank accounts take all the
00:56:33
money out Delete all the files and cause
00:56:34
chaos on a scale that has never been
00:56:37
possible by a series of Russian hackers
00:56:40
or Chinese hackers working in a boiler
00:56:42
room this can scale and that is the the
00:56:45
fundamental difference here and I didn't
00:56:46
think I would be sitting here steel
00:56:48
manning Chamath's argument I think humans
00:56:49
have a horrible ability to compound I
00:56:52
think people do not understand compound
00:56:53
interest and this is a perfect example
00:56:55
where when you start to compound
00:56:56
technology at the rate of 24 hours or 48
00:56:59
hours which we've never really had to
00:57:01
acknowledge most people's brains break
00:57:03
and they don't understand what six
00:57:04
months from now looks like and six
00:57:06
months from now when you're compounding
00:57:08
at 48 or 72 hours is like 10 to 12 years
00:57:12
in other Technology Solutions this is
00:57:15
compounding this is this is different
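Chamath's "six months is like 10 to 12 years" claim can be sanity-checked with rough arithmetic; the six-week baseline cadence below is an illustrative assumption, not a figure from the episode:

```python
# Back-of-the-envelope version of the compounding claim: count the
# 48-hour iteration cycles in six months, then ask how long the same
# number of cycles takes at a slower cadence. The six-week baseline
# is an illustrative assumption, not a figure from the episode.
six_months_days = 182                    # roughly six months
cycle_days = 2                           # a breakthrough every 48 hours
cycles = six_months_days // cycle_days   # 91 cycles

baseline_weeks_per_cycle = 6             # assumed "normal" iteration pace
equivalent_years = cycles * baseline_weeks_per_cycle / 52
print(cycles, round(equivalent_years, 1))  # -> 91 10.5
```

With those assumptions, six months of 48-hour cycles does land in the 10-to-12-year range he cites.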
00:57:16
because of the compounding I agree with
00:57:18
that the pace Revolution is very fast we
00:57:20
are on a bullet train to something and
00:57:22
we don't know exactly what it is and
00:57:24
that's disconcerting however let me tell
00:57:26
you what would happen if we create a new
00:57:27
regulatory body like the FDA to regulate
00:57:29
this they would have no idea how to
00:57:32
arbitrate whether a technology should be
00:57:34
approved or not development will
00:57:36
basically slow to a crawl just like drug
00:57:37
development there is no double-blind
00:57:39
standard I agree but can we do what
00:57:42
self-regulation can we do there is no
00:57:43
double blind standard in AI that
00:57:46
everyone can agree on right now to know
00:57:48
whether something should be approved and
00:57:49
what's going to happen is the thing
00:57:51
that's made software development so
00:57:52
magical and allowed all this Innovation
00:57:55
over the last 25 years is permissionless
00:57:57
innovation any developer
00:58:00
any Dropout from a university can go
00:58:03
create their own project which turns
00:58:05
into a company and that is what has
00:58:07
driven all the Innovation and progress
00:58:09
in our economy over the last 25 years so
00:58:11
you're going to replace permissionless
00:58:13
Innovation with going to Washington to
00:58:14
go through some approval process and it
00:58:16
will be the politically connected it'll
00:58:18
be the big donors who get their projects
00:58:21
approved and the next Mark Zuckerberg
00:58:23
who's trying to do his little project in
00:58:24
a dorm room somewhere will not know how
00:58:27
to do that well not know how to compete
00:58:28
and that highly political process I
00:58:31
think you're mixing a bunch of things
00:58:32
together so first of all permissionless
00:58:34
Innovation happens today in biotech as
00:58:37
well it's just that it's what Jason said
00:58:39
when you want to put it on the rails of
00:58:41
society and make it available to
00:58:43
everybody you you actually have to go
00:58:45
and do something substantive
00:58:47
in the negotiation of these drug
00:58:49
approvals it's not some standardized
00:58:51
thing you actually sit with the FDA and
00:58:52
you have to decide what are our
00:58:54
endpoints what is the mechanism of
00:58:55
action and how will we measure the
00:58:57
efficacy of this thing
00:58:59
the idea that you can't do this today in
00:59:01
AI is laughable yes you can and I think
00:59:03
that smart people so for example if you
00:59:05
pit DeepMind's team versus OpenAI's
00:59:08
Team to both agree that a model is good
00:59:10
and correct I bet you they would find
00:59:13
a systematic way to test that it's fine
00:59:15
I just want to point out okay so
00:59:17
basically in order to do what you're
00:59:19
saying okay this entrepreneur
00:59:21
who just dropped out of college to do
00:59:23
their project they're gonna have to
00:59:24
learn how to go sit with Regulators have
00:59:26
a conversation with them go through some
00:59:28
complicated approval process and you're
00:59:30
trying to say that that won't turn into
00:59:32
a game of political connections of
00:59:34
course it will of course it will of
00:59:36
course
00:59:37
which is self-regulation yeah well let's
00:59:40
get to that hold on a second and let's
00:59:42
look at the drug approval process if you
00:59:45
want to create a drug company you need
00:59:46
to raise hundreds of millions of dollars
00:59:48
it's incredibly expensive it's
00:59:50
incredibly Capital intensive there is no
00:59:53
drug company that is two guys in their
00:59:55
garage like many of the biggest
00:59:57
companies like many of the biggest
00:59:59
companies in Silicon Valley started that
01:00:01
is because you're talking about taking a
01:00:04
chemical or biological compound and
01:00:06
injecting into some hundreds or
01:00:08
thousands of people who are both
01:00:11
racially
01:00:12
gender-based age-based highly stratified
01:00:15
all around the world or at a minimum all
01:00:17
around the country you're not talking
01:00:19
about that here David I think that you
01:00:21
could have a much simpler and cheaper
01:00:23
way where you have a version of the
01:00:26
internet that's running in a huge
01:00:27
sandbox someplace that's closed off from
01:00:29
the rest of the internet and another
01:00:30
version of the internet that's closed
01:00:32
off from everything else as well and you
01:00:34
can run on a parallel path
01:00:35
as it is with this agent and you can
01:00:38
easily in my opinion actually figure out
01:00:40
whether this agent is good or bad and
01:00:42
you can probably do it in weeks so I
01:00:45
actually think the approvals are
01:00:46
actually not that complicated and the
01:00:48
reason to do it here is because I get
01:00:51
that it may cause a little bit more
01:00:52
friction for some of these Mom and Pops
01:00:56
but if you think about what's the
01:00:59
societal consequences of letting the
01:01:02
worst case outcomes happen the AGI type
01:01:05
outcomes happen I think those are so bad
01:01:09
they're worth slowing some folks down
01:01:11
and I think like just because you want
01:01:12
to you know buy groceries for a hundred
01:01:15
dollars you should be able to do it I
01:01:16
get it but if people don't realize and
01:01:19
connect the dots between that and
01:01:20
bringing airplanes down then that's
01:01:22
because they don't understand what this
01:01:23
is capable of I'm not saying we're never
01:01:25
going to need regulation what I'm saying
01:01:27
is it's way too early we don't even know
01:01:29
what we're regulating we don't know what
01:01:31
the standard would be and what we will
01:01:33
do by racing to create a new FDA is
01:01:35
destroying American innovation in the
01:01:37
sector and other countries will not slow
01:01:39
down they will beat us to the punch
01:01:40
got it I think there's a middle ground
01:01:43
here
01:01:44
of self-regulation and thoughtfulness on
01:01:46
the part of the people who are providing
01:01:48
these tools at scale to give but
01:01:50
one example here and this tweet is from
01:01:52
five minutes ago so to look at the pace
01:01:54
of this five minutes ago this tweet came
01:01:57
out a developer who is an AI developer
01:02:00
says AI agents continue to amaze my gpt4
01:02:03
coding assistant learned how to build
01:02:04
apps with authenticated users that can
01:02:06
build and design a web app create a back
01:02:08
end handle auth logins upload code to
01:02:12
GitHub and deploy
01:02:14
he literally while we were talking is
01:02:17
deploying websites now if this website
01:02:19
was a phishing app or the one that
01:02:22
Chamath is talking about he could make a
01:02:25
gazillion different versions of Bank
01:02:27
of America Wells Fargo et cetera then
01:02:29
find everybody on the internet's email
01:02:31
then start sending different spoofing
01:02:33
emails determine which spoofing emails
01:02:35
work iterate on those and create a
01:02:37
Global Financial collapse now this
01:02:38
sounds insane but it's happening right
01:02:40
now people get hacked every day at one
01:02:42
two three percent
01:02:44
Sacks fraud is occurring right now in the
01:02:47
low single digit percentages identity
01:02:49
theft is happening in the low single
01:02:50
digit percentages this technology is
01:02:53
moving so fast that bad actors could 10x
01:02:56
that relatively easy and so if 10 of us
01:02:59
want to be hacked and have our credit
01:03:00
card attacked this could create chaos I
01:03:03
think self-regulation is the solution
01:03:05
I'm the one who brought up
01:03:06
self-regulation what I said no I brought
01:03:07
it up first I brought it up first I get
01:03:09
credit no go ahead no it's not about
01:03:11
credit I'm no self-regulations
01:03:15
you talk for eight minutes so if you
01:03:17
have a point to make you should have got
01:03:18
in the eight minutes oh my God you guys
01:03:20
kept interrupting me go ahead what I
01:03:22
said is that there are trust and safety
01:03:25
teams at these big AI companies these
01:03:27
big foundation model companies like open
01:03:30
AI like I said
01:03:32
in the past trust and safety has been a
01:03:34
euphemism for censorship and that's why
01:03:35
people don't trust it but I think it
01:03:37
would be appropriate for these platform
01:03:40
companies to apply some guard rails on
01:03:42
how their tools can be used and based on
01:03:45
everything I know they're doing that so
01:03:47
this guy just released websites to the
01:03:49
open web with GPT-4 and he's going
01:03:51
to have it do it automated you're
01:03:52
basically postulating capabilities that
01:03:55
don't yet exist I just tweeted the guy
01:03:57
is doing it he's got a video of himself
01:03:58
doing it on the web what do you think
01:03:59
that's a far cry from basically running
01:04:02
like some phishing
01:04:04
Expedition that's going to bring down
01:04:06
the entire banking system uh literally a
01:04:08
phishing a phishing site and a
01:04:10
ns are the same thing go ahead Freeburg
01:04:12
I think that that guy
01:04:14
is doing something illegal if he's
01:04:16
hacking into computers uh into people's
01:04:19
emails and bank accounts that's illegal
01:04:21
you're not allowed to do that
01:04:23
and so that action breaks the law that
01:04:27
person can be prosecuted for doing that
01:04:29
the tooling that one might use to do
01:04:31
that
01:04:32
can be used in a lot of different ways
01:04:34
just like you could use Microsoft Word
01:04:37
to forge letters just like you could use
01:04:40
Microsoft Excel to create fraudulent
01:04:43
financial statements I think that the
01:04:44
application of a platform technology
01:04:48
needs to be distinguished from
01:04:50
the technology itself and while we all
01:04:53
feel extraordinarily fearful because the
01:04:55
unbelievable leverage that these AI
01:04:57
tools provide again I'll remind you that
01:05:00
the ChatGPT-4 or this GPT-4 model
01:05:05
by some estimates is call it a few
01:05:06
terabytes you could store it on a hard
01:05:08
drive or you could store it on your
01:05:09
iPhone and you could then go run it on
01:05:11
any set of servers that you could go set
01:05:13
up physically anywhere so you know it's
01:05:16
a little bit naive to say we can go
01:05:18
ahead and you know regulate platforms
01:05:20
and we can go regulate the tools
01:05:21
certainly we should continue to enforce
01:05:24
and protect ourselves against nefarious
01:05:27
actors using you know new Tools in
01:05:29
inappropriate illegal ways
01:05:31
you know I I also think that there's a
01:05:34
moment here
01:05:35
that we should all kind of
01:05:38
observe just how quickly we want to shut
01:05:40
things down when
01:05:43
you know they take away what feels like
01:05:46
the the control that we all have
01:05:49
from one day to the next and you know
01:05:52
that the the real kind of sense of fear
01:05:56
that seems to be quite contagious for a
01:05:58
large number of people that have
01:06:00
significant assets or significant things
01:06:03
to lose
01:06:04
is that uh you know tooling that's
01:06:06
that's you know creating entirely newly
01:06:09
disruptive systems and models for
01:06:11
business and and economics
01:06:13
an opportunity for so many
01:06:15
needs to be regulated away to minimize
01:06:18
you know what we claim to be some
01:06:20
potential downside when we already have
01:06:22
laws that protect us on the other side
01:06:25
so you know I just kind of
01:06:27
want to also consider that this set of
01:06:30
tools creates extraordinary opportunity
01:06:32
we gave one sort of simple example about
01:06:34
the opportunity for creators but we
01:06:36
talked about how new business models new
01:06:38
businesses can be started with one or
01:06:40
two people
01:06:41
you know entirely new tools can be built
01:06:43
with a handful of people entirely new
01:06:45
businesses this is an incredible
01:06:47
Economic Opportunity and again if the
01:06:50
U.S tries to regulate it or the U.S
01:06:51
tries to come in and stop the
01:06:53
application of models in general or
01:06:54
regulate models in general you're
01:06:56
certainly going to see those models of
01:06:57
continue to evolve and continue to be
01:06:59
utilized in very powerful ways that are
01:07:02
going to be advantageous
01:07:03
to places outside the U.S there's over
01:07:05
180 countries on Earth they're not all
01:07:07
going to regulate together it's been
01:07:09
hard enough to get any sort of
01:07:11
coordination around Financial systems to
01:07:13
get coordination around climate change
01:07:15
to get coordination around anything on a
01:07:17
global basis to try and get coordination
01:07:19
around the software models that are
01:07:21
being developed I think is is pretty
01:07:23
naive you don't want to have a global
01:07:25
organization I think you need to have a
01:07:26
domestic organization that protects U.S
01:07:28
and I think Europe will have their own
01:07:30
they again FDA versus EMA Canada has
01:07:34
its own Japan has its own China has its
01:07:36
own and they they have a lot of overlap
01:07:39
and a lot of commonality in in the
01:07:40
guardrails they use and I think that's
01:07:42
what's going to happen here this will be
01:07:43
beneficial only for political insiders
01:07:45
who will basically be able to get their
01:07:47
projects and their apps approved with a
01:07:49
huge dead weight loss for the system
01:07:50
because Innovation will completely slow
01:07:51
down but let me build on Freeburg's
01:07:53
point which is that
01:07:56
we have to remember that AI won't just
01:07:58
be used by nefarious actors it'll be
01:08:00
used by positive actors so there will be
01:08:03
new tools that law enforcement will be
01:08:05
able to use and if somebody's creating
01:08:07
phishing sites at scale they're going to
01:08:09
be probably pretty easy for you know law
01:08:11
enforcement AIS to detect so let's not
01:08:14
forget that there'll be co-pilots
01:08:15
written for our law enforcement
01:08:18
authorities they'll be able to use that
01:08:19
to basically detect and fight crime and
01:08:21
a really good example of this is in the
01:08:23
crypto space we saw this article over
01:08:24
the past week that Chainalysis has
01:08:27
figured out how to basically track you
01:08:30
know illicit Bitcoin transactions and
01:08:32
there's now a huge number of
01:08:33
prosecutions that are happening of
01:08:36
illegal use of Bitcoin and if you go
01:08:38
back to when Bitcoin first took off
01:08:40
there was a lot of conversations around
01:08:43
Silk Road and the only thing that
01:08:44
Bitcoin was good for was basically
01:08:46
illegal transactions blackmailing drug
01:08:49
trafficking and therefore we have to
01:08:51
stop Bitcoin remember that was the main
01:08:53
argument and the counter argument was
01:08:56
that well no Bitcoin like any technology
01:08:59
can be used for good or bad however
01:09:00
there will be technologies that spring
01:09:03
up
01:09:03
to combat those nefarious or illicit use
01:09:06
cases and sure enough you had a company
01:09:08
like Chainalysis come along and now
01:09:10
it's been used by law enforcement to
01:09:12
basically crack down on the illicit use
01:09:14
of Bitcoin and if anything it's cleaned
01:09:16
up the Bitcoin Community tremendously
01:09:18
and I think it's dispelled this idea
01:09:20
that the only thing you'd use Bitcoin
01:09:22
for is Black Market transactions quite
01:09:25
the contrary I think you'd be really
01:09:26
stupid now to use Bitcoin in that way
01:09:29
it's actually turned Bitcoin into
01:09:30
something of a Honeypot now because if
01:09:33
you used it for nefarious transactions
01:09:35
your transactions recorded on the
01:09:37
blockchain forever just waiting for
01:09:38
Chainalysis to find it so again using
01:09:42
Bitcoin to do something illegal be
01:09:43
really stupid I think in a similar way
01:09:45
you're going to see self-regulation by
01:09:47
these major AI platform companies
01:09:49
combined with new tools that are used
01:09:51
new AI tools that spring up to help
01:09:54
combat the nefarious uses and until we
01:09:57
let those forces play out
01:09:59
I'm not saying regulate never I'm just
01:10:01
saying we need to let those forces play
01:10:03
out before we leap to creating some new
01:10:06
regulatory body that doesn't even
01:10:08
understand what its mandate and mission is
01:10:09
supposed to be the Bitcoin story is
01:10:11
hilarious by the way
01:10:12
oh my God Wall Street Journal story it's
01:10:15
unbelievable pretty epic it took years
01:10:17
but basically this guy was buying blow
01:10:19
on Silk Road and he deposited his
01:10:23
Bitcoin and then when he withdrew it he
01:10:25
there was a bug that gave him twice as
01:10:27
many Bitcoins so he kept creating more
01:10:29
accounts putting more money into Silk
01:10:31
Road and getting more Bitcoin out
01:10:33
and then years later the authorities
01:10:35
figured this out again with you know
01:10:37
Chainalysis type things look at James
01:10:39
Zhong over there same Zhong he accused uh
01:10:43
had a Lamborghini a Tesla a lake house
01:10:45
uh and was living his best life
01:10:48
apparently when the feds uh knocked on
01:10:51
his door and found the digital keys to
01:10:54
his crypto Fortune
01:10:55
in a popcorn tin in his bathroom and in
01:10:58
a safe in his basement
01:11:01
floor so there you have it well the
01:11:04
reason the reason I posted this was I
01:11:05
was like what if this claim that you
01:11:09
could have all these Anonymous
01:11:10
transactions actually fooled
01:11:13
the entire Market because it looks like
01:11:17
that this anonymity has effectively been
01:11:20
reverse engineered and there's no
01:11:22
anonymity at all and so what Bitcoin is
01:11:25
quickly becoming is like the most
01:11:27
singular Honeypot of transactional
01:11:30
information that's complete and
01:11:32
available in public and I think what
01:11:35
this article talks about is how
01:11:36
companies like Chainalysis and others
01:11:38
have worked now for years almost a
01:11:41
decade with law enforcement to be able
01:11:44
to map all of it and so now every time
01:11:48
money goes from one Bitcoin wallet to
01:11:49
another they effectively know the sender
01:11:52
and the recipient and I just want to
01:11:53
make one quick correction here it
01:11:55
wasn't actually exactly popcorn it was
01:11:58
Cheetos spicy flavored popcorn
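The wallet-mapping Chamath describes — every transfer is public, so investigators can start from a known illicit wallet and walk outward to see everything the funds touched — is essentially a graph traversal. A toy sketch, with invented wallet names and transfers (not real Chainalysis tooling):

```python
from collections import deque

# Toy version of what chain-analysis firms do: every Bitcoin transfer
# is public, so investigators can start from one known illicit wallet
# and walk outward, tagging everything the funds touched. The wallet
# names and transfers below are invented for illustration.
ledger = [
    ("silk_road_wallet", "mixer_1"),
    ("mixer_1", "exchange_acct"),
    ("clean_wallet", "exchange_acct"),
]

def taint(ledger, seed):
    """Breadth-first walk over the sender -> recipient transfer graph."""
    outgoing = {}
    for sender, recipient in ledger:
        outgoing.setdefault(sender, []).append(recipient)
    tainted, queue = {seed}, deque([seed])
    while queue:
        for nxt in outgoing.get(queue.popleft(), []):
            if nxt not in tainted:
                tainted.add(nxt)
                queue.append(nxt)
    return tainted

print(sorted(taint(ledger, "silk_road_wallet")))
# -> ['exchange_acct', 'mixer_1', 'silk_road_wallet']
```

Because the ledger never goes away, this walk can be run years after the fact, which is what turned the Zhong transactions into evidence.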
01:12:01
and there's the tin of it
01:12:03
where he had a motherboard of a computer
01:12:05
that held is there a chance that that
01:12:07
this project was actually introduced by
01:12:10
the government I mean there's been
01:12:12
reports
01:12:13
of the Tor network that the CIA had their hands
01:12:17
all over Tor if you don't know it
01:12:19
which is an anonymous like multi-relay
01:12:22
peer-to-peer web browsing system and
01:12:25
people believe it's a CIA Honeypot an
01:12:29
intentional trap for criminals to get
01:12:32
themselves uh caught up in
01:12:35
all right as we wrap here what an
01:12:37
amazing discussion my Lord I didn't I
01:12:38
never thought I would be I want to say
01:12:40
one thing yes we saw that
01:12:44
someone was arrested for the murder of
01:12:46
Bob Lee that's what I was about to
01:12:48
answer this morning yeah which turns out
01:12:50
that the report of the sfpd's arrest is
01:12:53
that it's uh someone that he knew that
01:12:55
also works in the tech industry someone
01:12:56
that he possibly knew right so also
01:12:58
breaking news yes possibly
01:13:00
but I I want to say two things one
01:13:03
obviously based on this arrest and the
01:13:05
storyline it's quite different than what
01:13:07
we all assumed it to be which was some
01:13:09
sort of homeless robbery type moment
01:13:12
that has become all too commonplace in
01:13:14
SF it's a commentary for me on two
01:13:17
things one is how quick we all were to
01:13:20
kind of Judge and assume that you know a
01:13:23
homeless robber type person would do
01:13:25
this in SF which I think speaks to the
01:13:28
condition in SF right now
01:13:30
also speaks to our conditioning that
01:13:32
that we all kind of lacked or didn't
01:13:34
even want to engage in a conversation
01:13:36
that maybe this person was murdered by
01:13:37
someone that they knew
01:13:40
because we wanted to kind of very
01:13:42
quickly fill our own narrative about how
01:13:43
bad SF is
01:13:45
and that's just something that I really
01:13:46
felt when I read this this morning I was
01:13:47
like man like I didn't even consider the
01:13:49
possibility that this guy was murdered
01:13:52
by someone that he knew because I am so
01:13:54
enthralled right now by this narrative
01:13:56
that SF is so bad and it must be another
01:13:58
data point that validates my point of
01:13:59
view on SF so you know I kind of want to
01:14:02
just acknowledge that and acknowledge
01:14:03
that we all kind of do that right now
01:14:05
but I do think it also does in fact
01:14:06
unfortunately speak to how bad things
01:14:08
are in SF because we all are we've all
01:14:11
have these experiences of feeling like
01:14:12
we're in danger and under Threat all the
01:14:14
time we're walking around in SF uh in so
01:14:16
many parts of San Francisco I should say
01:14:18
where things feel like they've gotten
01:14:19
really bad I think both things can be
01:14:22
true that we can kind of feel biased and
01:14:25
fill our own narrative
01:14:27
by kind of latching on to our assumption
01:14:30
about what something tells us
01:14:32
but but it also tells us quite a lot
01:14:33
about what is going on in ourselves so I
01:14:35
I just I just wanted to make sure In
01:14:37
fairness and I think it's fine for you
01:14:38
to make that point I am extremely
01:14:41
Vigilant on this program to always say
01:14:42
when something is breaking news to
01:14:44
withhold judgment whether it's the Trump
01:14:45
case or Jesse Smollett or anything in
01:14:47
between January 6th let's wait until we
01:14:49
get all the facts and in fact quote from
01:14:52
sacks
01:14:53
we don't know exactly what happened yet
01:14:56
correct literally Sacks started with that
01:14:59
we do that every [ __ ] time on this
01:15:01
program we know when there's breaking
01:15:04
news to withhold judgment but you can
01:15:06
also know
01:15:07
two things can be true a tolerance for
01:15:10
ambiguity is necessary but I'm saying I
01:15:12
didn't even do that as soon as I heard
01:15:14
this I was like I was like an assumption
01:15:16
but David that is a fine assumption to
01:15:19
make that's a fine essential technical
01:15:21
assumption listen you make that
01:15:23
assumption for your own protection we
01:15:25
got all these reporters who are
01:15:27
basically propagandists trying to claim
01:15:28
that crime is down in San Francisco
01:15:29
they're all basically seeking comment
01:15:31
from me this morning sending emails or
01:15:33
trying to dunk on us because we
01:15:35
basically talked about the Bob Lee case
01:15:38
in that way listen we said that we
01:15:41
didn't know what happened but if we were
01:15:43
to bet at least what I said is I bet
01:15:46
this case it looks like a lot like the
01:15:47
Brianna Kupfer case that was logical
01:15:49
that's not conditioning or biased that's
01:15:51
logic and you need to look at what else
01:15:54
happened that week okay so just the same
01:15:57
week that Bob Lee was killed let me give
01:15:59
you three other examples of things that
01:16:01
happened in Gotham City AKA San
01:16:03
Francisco so number one former fire
01:16:07
commissioner Don Carmignani was beaten
01:16:09
within an inch of his life by a group of
01:16:11
homeless addicts in the marina and one
01:16:14
of them was interviewed in terms of why
01:16:17
it happened and basically Don came down
01:16:19
from his mother's house and told them to
01:16:21
move off his mother's front porch
01:16:23
because they were obstructing her
01:16:24
ability to get in and out of her
01:16:25
apartment they interpreted that as
01:16:27
disrespect and they beat him with a tire
01:16:29
iron or a metal pipe and one of the
01:16:32
hoodlums who was involved in this
01:16:34
apparently admitted this yeah play the
01:16:36
video somebody over the head like that
01:16:38
and attack him as he was really
01:16:42
disrespectful he
01:16:49
so he was being disrespectful and then
01:16:52
but is that enough to beat him up yeah
01:16:56
sometimes oh my Lord I mean so this is
01:16:59
case number one and apparently in the
01:17:01
reporting on that person who was just
01:17:04
interviewed he's been in the marina kind
01:17:05
of terrorizing people maybe not
01:17:07
physically but verbally
01:17:09
so you have you know bands of homeless
01:17:12
people encamped in front of people's
01:17:14
houses Don Carmignani gets beaten within
01:17:17
an inch of his life you then had the
01:17:19
case of the Whole Foods Store on Market
01:17:21
Street shut down in San Francisco and
01:17:24
this was not a case of shoplifting like
01:17:26
some of the other store closings we've
01:17:27
seen they said they were closing the
01:17:29
store because they could not protect
01:17:31
their employees the bathrooms were
01:17:33
filled with needles and pipes that were
01:17:36
drug paraphernalia you had drug addicts
01:17:38
going in there using it they were
01:17:39
engaging in altercations with store
01:17:41
employees and Whole Foods felt like they
01:17:43
had to close the store because again
01:17:44
they could not protect their employees
01:17:46
third example Board of Supervisors had
01:17:50
to disband their own meeting because
01:17:52
their internet connection got vandalized
01:17:55
the fiber for the cable connection to
01:17:58
provide their internet got vandalized so
01:18:00
they had to basically disband their
01:18:01
meeting Aaron Peskin was the one who
01:18:03
announced this and you saw in the
01:18:04
response to this yeah my retweeting of him
01:18:07
went viral there were lots of people
01:18:09
said yeah I've got a small business and
01:18:11
the fiber or the copper wire whatever
01:18:13
was vandalized and in a lot of cases I
01:18:16
think it's basically drug addicts
01:18:17
stealing whatever they can they steal
01:18:18
ten dollars of copper wire sell that to
01:18:20
get a hit and it causes forty thousand
01:18:23
dollars of property damage here's the
01:18:25
insincerity Sacks literally the proper
01:18:28
response when there's violence in San
01:18:30
Francisco is hey we need to make this
01:18:32
place less violent is there a chance
01:18:33
that it could be people who know each
01:18:35
other of course that's inherent in any
01:18:37
crime that occurs that there'll be time
01:18:39
to investigate it but literally the
01:18:41
Press is now using this as a moment to
01:18:44
say there's no crime in San Francisco
01:18:45
where people are acting like I just
01:14:48
had the New York Times email me during
01:18:49
the podcast
01:18:51
Heather knight from The Chronicle San
01:18:53
Francisco Chronicle in light of the Bob
01:18:55
Lee killing appearing to be an
01:18:56
interpersonal dispute she still doesn't
01:18:58
know right we don't have all the facts
01:19:00
with another tech leader do you think
01:19:01
the Tech Community jumped to conclusions
01:19:03
why are so many Tech leaders painting
01:19:05
San Francisco as a dystopian hellscape
01:19:07
with the reality with the reality is
01:19:10
more nuanced I think it's a little typo
01:19:11
there yeah yes it's like of course the
01:19:14
reality is nuanced of course it's a
01:19:16
hellscape walk down the street Heather
01:19:20
can I give you a theory please
01:19:23
I think it was most evident in the way
01:19:26
that Elon
01:19:28
dismantled and manhandled the BBC
01:19:30
reporter oh my God that was brutal this
01:19:33
is a small microcosm of what I think
01:19:35
media is so I used to think that media
01:19:39
had an agenda
01:19:42
I actually now think that they don't
01:19:44
particularly have an agenda
01:19:46
other than to be relevant because they
01:19:49
see waning relevance and so I think what
01:19:53
happens is whenever there are a bunch of
01:19:55
articles that tilt a pendulum into a
01:19:58
narrative
01:19:59
they all of a sudden become very focused
01:20:02
on refuting that narrative
01:20:05
and even if it means they have to lie
01:20:08
they'll do it
01:20:09
right so you know I think for months and
01:20:11
months I think people have seen that the
01:20:13
quality of the discourse on Twitter
01:20:15
became better and better
01:20:16
Elon is doing a lot with Bots and all of
01:20:18
this stuff cleaning it up
01:20:20
and this guy had to try to establish the
01:20:23
counter narrative and was willing to lie
01:20:26
in order to do it then he was dismantled
01:20:28
here you guys I don't have a bone to
01:20:30
pick so much at San Francisco I think
01:20:32
I've been relatively silent on this
01:20:33
topic but you guys as residents and
01:20:36
former residents I think have a vested
01:20:37
interest in the quality of that City and
01:20:39
you guys have been very vocal but I
01:20:41
think that you're not the only ones
01:20:42
Michelle Tandler you know Shellenberger
01:20:44
there's a bunch of smart thoughtful
01:20:46
people who've been beating this drum
01:20:49
Garry Tan
01:20:51
and so now I think reporters don't want
01:20:54
to write the N plus first article saying
01:20:56
that San Francisco is a hellscape
01:20:58
so they have to take the other side
01:21:00
and so now they're going to go and pick
01:21:02
up the counter narrative and they'll
01:21:03
probably dismantle the truth and kind of
01:21:06
redirect it in order to do it so I think
01:21:09
that what you're seeing is
01:21:10
they'll initially tell a story but well
01:21:13
then there's too much of the truth
01:21:14
they'll go to the other side because
01:21:15
that's the only way to get clicks and be
01:21:17
seen
01:21:18
so I think that that's what you guys are
01:21:19
a part of right now they are in the
01:21:21
business of protecting the narrative but
01:21:22
I I do think there's a huge ideological
01:21:24
component to the narrative both in the
01:21:26
Elon case where they're trying to claim
01:21:28
that there was a huge rise in hate
01:21:30
speech on Twitter the reason they're
01:21:31
saying that is because they want Twitter
01:21:33
to engage in more censorship that's the
01:21:36
ideological agenda here the agenda is
01:21:39
this radical agenda of decarceration
01:21:41
they actually believe that more and more
01:21:43
people should be led out of prison and
01:21:45
so therefore they have an incentive to
01:21:48
deny the existence of crime in San
01:21:52
Francisco and the rise in crime in San
01:21:53
Francisco if you poll most people in San
01:21:56
Francisco large majorities of San
01:21:57
Francisco believe that crime is on the
01:21:59
rise because they can see it they hear
01:22:00
it and what I would say is look I think
01:22:02
there's a pyramid of activity a pyramid
01:22:06
of criminal or anti-social behavior in
01:22:09
San Francisco that we can all see the
01:22:11
base level is you've got a level of
01:22:13
chaos on the streets where you have
01:22:15
open-air drug markets people doing drugs
01:22:18
sometimes you'll see you know a person
01:22:20
doing something disgusting you know like
01:22:22
people defecating on the streets or even
01:22:24
worse then there's like a level up where
01:22:26
they're chasing after you or you know
01:22:28
harassing you people have experienced
01:22:29
that I've experienced that then there's
01:22:31
a level up where there's petty crime
01:22:33
your car gets broken into or something
01:22:36
like that then there's the level where
01:22:38
you get mugged and then finally the top
01:22:40
of the pyramid is that there's a murder
01:22:42
and it's true that most of the time the
01:22:45
issues don't go all the way to the top
01:22:46
of the pyramid where someone is murdered
01:22:48
okay but that doesn't mean there's not a
01:22:51
vast pyramid underneath that of
01:22:53
basically quality of life issues and I
01:22:56
think this term quality of life was
01:22:58
originally used as some sort of way to
01:23:02
minimize the behavior that was going on
01:23:04
saying that they weren't really crimes
01:23:06
we shouldn't worry about them but if
01:23:08
anything what we've seen in San
01:23:09
Francisco is that when you ignore
01:23:11
quality of life crimes you will actually
01:23:13
see
01:23:14
a huge diminishment in what it's like to
01:23:17
live in these cities like quality of
01:23:19
life is real and that's the issue and I
01:23:21
think what they're trying to do now is
01:23:23
to say that because Bob Lee wasn't
01:23:25
the case that we thought it was that
01:23:27
that whole pyramid doesn't exist that
01:23:30
pyramid exists we can all experience oh
01:23:32
my God I mean and that's the insincerity
01:23:34
of this it is insincere and the
01:23:36
existence of that pyramid that we can
01:23:37
see and hear and feel and experience
01:23:40
every day is why we're willing to make a
01:23:42
bet we called it a bet that the Bob Lee
01:23:45
case was like the Brianna Kupfer case
01:23:46
and in that with the disclaimer with a
01:23:49
disclaimer and we always do a disclaimer
01:23:51
here and just to George Hammond from the
01:23:53
financial times who emailed me here's
01:23:55
what he asked me there's a lot of public
01:23:56
attention lately on whether San
01:23:57
Francisco's status as one of the top
01:23:59
Business and Technology hubs in the US
01:24:00
is at risk in the aftermath of the
01:24:02
pandemic duh obviously it is I wondered
01:24:05
if you had a moment to chat about that
01:24:06
and whether there's a danger that
01:24:08
negative perceptions about the city will
01:24:10
damage its reputation for Founders and
01:24:11
capital locations in the future so
01:24:13
essentially at the end he says obviously
01:24:15
there's a lot of potential for hysteria in this
01:24:17
conversation which I'm Keen to avoid and
01:24:19
it's like have you walked down the
01:24:21
street and I asked him have you walked
01:24:23
down the street in San Francisco Jason
01:24:25
the best response is send him the thing
01:24:27
that Sac sent which is the amount of
01:24:28
available office space in San Francisco
01:24:33
companies are voting with their feet so
01:24:35
it's already if the quality of life
01:24:37
wasn't so poor they'd stay this is the
01:24:39
essence of gaslighting is what they do
01:24:41
is the people who've actually created
01:24:43
the situation in San Francisco with
01:24:44
their policies their policies of
01:24:47
defunding the police making it harder
01:24:49
for the police to do their job
01:24:50
decriminalizing theft under $950 allowing
01:24:53
open-air drug markets the people who
01:24:55
have now created that Matrix of policies
01:24:57
have created the situation what they
01:24:59
then turn around and do is say no the
01:25:00
people creating the problem are the
01:25:02
ones who are observing this that's all
01:25:04
we're doing is observing and complaining
01:25:06
about it and what they try to do is say
01:25:08
well no you're you're running down San
01:25:10
Francisco we're not the ones creating
01:25:11
the problem we're observing it and just
01:25:12
this week another data point is is that
01:25:15
the mayor's office said that they were
01:25:17
short more than 500 police officers in
01:25:19
San Francisco yeah nobody who who's
01:25:21
going to become a police officer here
01:25:22
are you crazy well and there's another
01:25:24
article just this week about how there's
01:25:27
a lot of speculation rumors are swirling
01:25:29
of an unofficial strike an informal
01:25:31
strike by police officers who are
01:25:33
normally on the force who are tired of
01:25:36
risking life and limb and then you know
01:25:38
they basically risk getting into a
01:25:40
physical altercation with a homeless
01:25:42
person they bring them in and then
01:25:44
they're just released again so there's a
01:25:46
lot of quiet quitting that's going on in
01:25:47
the job it's like this learned
01:25:49
helplessness because why take a risk and
01:25:51
then the police commission doesn't have
01:25:53
your back it seems like the only time
01:25:55
you have prosecutorial Zeal by a lot of
01:25:57
these prosecutors is when they can go
01:25:58
after a cop not one of these repeat
01:26:01
offenders and you just saw that by the
01:26:02
way in L.A oh look motherboard and New
01:26:04
York Times just emailed and dm'd me and
01:26:06
then and then did you guys say that
01:26:08
instead of solving these issues the
01:26:09
Board of Supervisors was dealing with
01:26:12
a wild parrot what was it
01:26:17
the meeting that was disbanded they had
01:26:19
or yeah they had scheduled a meeting
01:26:21
to vote on whether the wild parrots are
01:26:25
the official animal of the city of San
01:26:27
Francisco
01:26:28
so that was the uh the scheduled meeting
01:26:30
that got uh disbanded also connect may I
01:26:33
just clarify what Chamath is talking about with
01:26:35
the Elon interview a BBC reporter
01:26:36
interviewed Elon and said there is much
01:26:39
more racism and hate speech in
01:26:43
the feeds on Twitter and he said can you
01:26:45
give me an example and he said well I
01:26:47
don't have an example but people are
01:26:48
saying this he said which people are
01:26:49
saying it and the BBC reporter said well
01:26:51
just different groups of people are
01:26:53
saying it and you know I've certainly
01:26:54
seen you said okay you saw it and for
01:26:56
you he goes no I stopped looking at for
01:26:57
you he said so give me one example of
01:27:00
hate speech that you've seen in your
01:27:02
feed
01:27:02
now we without speaking about any inside
01:27:05
information which I do not have much of
01:27:07
they've been pretty deliberate of
01:27:09
removing hate speech from places like
01:27:10
for you and you know it's a very
01:27:12
complicated issue when you have an open
01:27:13
platform but people may say
01:27:17
a word but it doesn't reach a lot of
01:27:19
people so if you were to say something
01:27:20
really nasty it doesn't take a genius to
01:27:22
block that and not have it reach a bunch
01:27:24
of people this reporter kept insisting
01:27:26
to Elon that this was on the rise with
01:27:28
no factual basis for it that other
01:27:30
people said it and then he said but I
01:27:32
don't look at the feed he said so you're
01:27:33
telling me that there's more hate speech
01:27:35
that you've seen But you just admitted
01:27:37
to me that you haven't looked at the for
01:27:38
you feed in three months and it was just
01:27:40
like this completely weird thing
01:27:44
and this is the thing if you're a
01:27:46
journalist just cut it down the middle
01:27:49
come prepared with facts listen and
01:27:51
stop taking a position either way I want
01:27:54
to connect one dot please which is that
01:27:56
he filled in his own narrative even
01:28:00
though the data wasn't necessarily there
01:28:02
in the same way that you know we kind of
01:28:05
filled in our narrative about San
01:28:06
Francisco with the Bob Lee
01:28:09
you know Murder being another example we
01:28:12
put a disclaimer on it
01:28:14
we did it hold on a second we so we knew
01:28:17
we didn't know and furthermore we're
01:28:20
taking great pains this week to correct
01:28:22
the record and explain what we now know
01:28:24
yeah
01:28:26
to be intellectually honest
01:28:29
this is just intellectual honesty
01:28:31
honestly you're you're you're going soft
01:28:34
here Freeburg you're getting gas lit by
01:28:35
all these people
01:28:37
I think the guy the guy totally the guy
01:28:40
totally had zero data
01:28:43
pushed the reporter on data and evidence so
01:28:45
he certainly you know I think probably
01:28:47
with Don Carmignani it's the same story
01:28:50
yeah the difference is that Don happened to
01:28:53
survive guys I love I love you but I
01:28:55
gotta go goodbye here's what Maxwell
01:28:57
from motherboard have fun there's been a
01:29:00
lot of discussion about the future of
01:29:02
San Francisco and the death has quickly
01:29:03
become politicized
01:29:05
has that caused any division or
01:29:07
disagreement from what you've seen or
01:29:09
has that not been the case the Press is
01:29:12
gleeful right now like oh my God you
01:29:16
know it's just like the right was
01:29:17
gleeful with Jesse Smollett having
01:29:19
gotten himself beaten up or you know
01:29:21
setting up his own all right everybody
01:29:23
for the Sultan of science currently
01:29:27
conducting
01:29:29
experiments on a beach to see uh exactly
01:29:32
how burned he can get with his SPF 200
01:29:35
under an umbrella wearing a sun shirt
01:29:37
and pants
01:29:39
Freeburg on the beach wears the same
01:29:41
outfit astronauts wear when they do
01:29:43
space walks hey stable diffusion make me
01:29:46
an image of David Friedberg wearing a
01:29:49
full body bathing suit covered in SPF
01:29:51
200 under three umbrellas on a sunny
01:29:55
Beach
01:29:57
oh my God for the dictator
01:30:01
Palihapitiya creating regulations and the
01:30:04
regular oh the regulator you can call me
01:30:06
the regular the regulator see you
01:30:08
tonight when we'll eat our ortolans
01:30:10
what's left of them the final four or
01:30:11
five ortolans in existence otherwise I'm
01:30:15
putting you on the b list today I will
01:30:17
be there I will be there I promise I
01:30:18
promise I promise can't wait to be there
01:30:19
and
01:30:21
the Rain Man himself he didn't even get
01:30:24
to putting Ron
01:30:27
versus
01:30:30
I think you should ask Auto GPT how you
01:30:34
can eat more endangered animals have a
01:30:36
plan for you yes and then have it go
01:30:39
kill those animals
01:30:41
on the dark web to go kill the remaining
01:30:44
rhinos and bring them to chamat's house
01:30:45
for poker night
01:30:48
I don't think rhinos would taste good
01:30:50
wasn't that the plot of a movie but it
01:30:52
was a oh did you guys see is cocaine
01:30:54
bear out yet no it was a Matthew
01:30:55
Broderick Marlon Brando movie right
01:30:57
where they're doing the takeoff on The
01:30:58
Godfather was the Freshman yeah yeah
01:31:00
yeah yeah yeah it's like a conspiracy to
01:31:02
eat uh endangered animals yes the
01:31:06
freshman
01:31:06
The Freshman came out in 1990 yeah that
01:31:10
Marlon Brando did it with um Matthew
01:31:12
Broderick and like Bruno Kirby they
01:31:15
actually they that was the whole thing
01:31:16
Bruno Kirby that's a deep cut
01:31:18
actually uh
01:31:19
they were eating endangered animals what
01:31:22
do you what do you think Heat 2 is
01:31:23
that going to be good Sacks I know Heat's
01:31:25
one of your favorite films me too it's
01:31:27
awesome is there a sequel coming they're
01:31:28
gonna do Heat 2 and the novel's already
01:31:30
come out Adam Driver so I saw the novel
01:31:32
yeah he's amazing
01:31:34
one of those movies where when it comes
01:31:36
on you just can't stop watching yeah
01:31:39
best bank robbery slash shootout in
01:31:42
movie history you know that is literally
01:31:45
the best film ever like it's up there
01:31:47
with like the Joker with Reservoir Dogs
01:31:49
the the the Joker in that Batman movie
01:31:52
where he robs the bank like I mean what
01:31:54
I love you guys all right love you
01:31:55
besties and four blah blah blah blah
01:31:58
blah this has been the all-in podcast
01:31:59
124. if you want to go to the fan
01:32:00
meetups and hang out with others
01:32:07
we'll let your winners ride
01:32:12
[Music]
01:32:17
and they've just gone crazy
01:32:19
[Music]
01:32:33
foreign
01:32:35
[Music]
01:32:56
[Music]

Episode Highlights

  • The Rise of Auto GPT
    Auto GPT is gaining traction, with 45,000 stars on GitHub in just two weeks.
    “Auto GPT is exploding in popularity with 45,000 stars on GitHub.”
    @ 03m 49s
    April 14, 2023
  • Disruption Through Automation
    Auto GPT could disrupt existing businesses by streamlining operations.
    “This Auto GPT is the answer to disrupting existing businesses.”
    @ 13m 06s
    April 14, 2023
  • The Rise of AI in Content Creation
    AI is transforming how content is created, allowing for personalized and dynamic storytelling.
    “AI is ruthless because it's emotionless.”
    @ 22m 08s
    April 14, 2023
  • A New Era for Creators
    The tools available to creators are expanding, enabling new forms of storytelling and interaction.
    “Creators can now write entire universes enjoyed by millions.”
    @ 31m 25s
    April 14, 2023
  • A New Hero in Innovation
    Chamath discusses the emergence of new innovations and the need for oversight.
    “Innovation is being made and a new Hero has been born.”
    @ 37m 43s
    April 14, 2023
  • Challenges of Regulation
    Regulating software and AI will be a monumental task due to its rapid evolution.
    “It’s going to be as challenged as trying to monitor and demand oversight.”
    @ 41m 35s
    April 14, 2023
  • The Open Internet Dilemma
    The discussion highlights the challenges of regulating technology in an open internet environment.
    “The internet is open; there are openings.”
    @ 49m 20s
    April 14, 2023
  • The Pace of Innovation
    Every 48 hours, something mind-blowing drops in technology. It's unprecedented!
    “That's never happened before!”
    @ 56m 19s
    April 14, 2023
  • The Risks of Self-Regulation
    Self-regulation could prevent chaos, but it also risks slowing innovation.
    “Self-regulation is the solution.”
    @ 01h 03m 05s
    April 14, 2023
  • Bitcoin's Unexpected Journey
    Bitcoin has evolved from a tool for illicit transactions to a regulated asset.
    “The Bitcoin story is hilarious by the way.”
    @ 01h 10m 11s
    April 14, 2023
  • Media and Relevance
    The media's struggle for relevance leads to a distortion of narratives, especially regarding crime.
    “They'll do anything to protect the narrative.”
    @ 01h 21m 21s
    April 14, 2023
  • Quality of Life Issues
    Exploring the pyramid of criminal behavior and its impact on daily life in San Francisco.
    “Ignoring quality of life crimes diminishes what it's like to live in these cities.”
    @ 01h 22m 58s
    April 14, 2023

Key Moments

  • AI and Automation (19:56)
  • Dynamic Storytelling (29:51)
  • New Hero (37:43)
  • Open Internet Issues (49:20)
  • Rapid Innovation (56:19)
  • Self-Regulation Debate (1:03:05)
  • Bitcoin's Evolution (1:10:11)
  • Under Threat (1:14:12)

Related Episodes

  • AI Bubble Pops, Zuck Freezes Hiring, Newsom’s 2028 Surge, Russia/Ukraine Endgame
  • Satya Nadella on AI’s Business Revolution: What Happens to SaaS, OpenAI, and Microsoft?
  • E152: Real estate chaos, WeWork bankruptcy, Biden regulates AI, Ukraine's “Cronkite Moment” & more
  • The AI Cold War, Signalgate, CoreWeave IPO, Tariff Endgames, El Salvador Deportations
  • E126: Big Tech blow-out, Powell’s recession warning, lab-grown meat, RFK Jr shakes up race & more
  • E170: Tech's Vibe Shift, TikTok ban debate, Vertical AI boom, Florida bans lab-grown meat & more
  • E122: Is AI the next great computing platform? ChatGPT vs. Google, containing AGI & RESTRICT Act