OpenAI's $150B conversion, Meta's AR glasses, Blue-collar boom, Risk of nuclear war

September 27, 2024
01:35:49
00:00:00
all right everybody uh let's get the
00:00:01
show started here wait wait Jason why
00:00:03
you wearing a tux what's going on there
00:00:05
oh well it's time for a very emotional
00:00:08
segment we do here on the all-in
00:00:11
podcast I just got to get myself
00:00:13
composed for this Jason are you okay
00:00:17
I'm I'm gonna be okay I think it looks
00:00:19
like you're fighting back a tear what's
00:00:22
yeah this is always a tough one this
00:00:23
year we tragically lost Giants in our
00:00:28
industry these individuals bravely honed
00:00:30
their craft at OpenAI before
00:00:34
departing Ilya Sutskever he left in he left
00:00:37
us in
00:00:38
[Music]
00:00:39
May Jan Leike also left in
00:00:44
May John Schulman tragically left us in
00:00:49
August wait these are AI employees yes Barret Zoph
00:00:54
left on Wednesday Bob McGrew also left on
00:00:58
Wednesday Too Short too short and Mira
00:01:03
Murati also left us tragically on
00:01:05
Wednesday we lost Mira too yeah and
00:01:08
Greg Brockman is on extended leave the
00:01:12
enforcer he left too thank you for your
00:01:15
service your memories will live on as
00:01:19
training data and may your memories be ever
00:01:24
vesting let your winners
00:01:27
ride Rain Man David Sachs
00:01:32
and instead we open source it to the
00:01:34
fans and they've just gone crazy with it
00:01:39
[Music]
00:01:41
Queen my goodness all those losses wow
00:01:45
that is three in one day three in one
00:01:47
day my goodness I thought open AI was
00:01:49
nothing without its
00:01:51
[Laughter]
00:01:53
people I well I mean this is a this is a
00:01:56
great whoa we lost something come back
00:01:59
wait oh wait
00:02:02
what this is like the photo it's like
00:02:05
the photo in Back to the Future wow
00:02:07
they're just all gone wait oh no don't
00:02:08
worry he's replacing everybody here we
00:02:10
go he's replacing them with a G700 a Bugatti
00:02:13
and I guess Sam's got mountains of cash
00:02:15
so don't worry he's got a backup plan
00:02:17
chamath anyway as an industry and as
00:02:19
leaders in the industry the show sends
00:02:21
its regards to Sam and the open AI team
00:02:24
on their tragic losses and
00:02:26
congratulations on the
00:02:27
$150 billion dollar valuation and your 7%
00:02:31
Sam now just cashed in 10 billion
00:02:34
apparently so congratulations to fan of
00:02:36
uh friend of the Pod Sam Altman is the
00:02:39
round that's all reported out of some
00:02:41
article right that's not like confirmed
00:02:44
or anything is all of that done I mean
00:02:46
it's reportedly allegedly that he's
00:02:47
going to have 7% of the company and we
00:02:49
can jump right into our first story I
00:02:52
mean what I'm saying is has the money
00:02:53
been wired and the docs been signed
00:02:55
according to reports this round is
00:02:57
contingent on not being
00:03:00
a um nonprofit anymore and sorting that
00:03:03
all out they have to remove the profit
00:03:05
cap and do the conversion there's some article
00:03:07
that reported this right none of us
00:03:09
Bloomberg it's not some article it was
00:03:10
Bloomberg and and it got a lot of
00:03:13
traction and it was re-reported by a lot
00:03:15
of places and I don't see anyone
00:03:16
disputing it so is mainstream media we
00:03:20
trust the mainstream media in this case
00:03:23
because think I think that when we could
00:03:25
do a good bit yeah that's mine no I
00:03:27
think that Bloomberg Bloomberg reported
00:03:29
it based on obviously talks that are
00:03:31
ongoing with investors who have
00:03:32
committed to this round yeah and no
00:03:34
one's disputing it has anyone said it's
00:03:36
not true this has been speculated for
00:03:40
months the $150 billion valuation
00:03:44
raising something in the range of six
00:03:46
to 7 billion if you do the math on that
00:03:49
and Bloomberg is correct that Sam Altman
00:03:52
got his 7% I guess the reality is you
00:03:55
can't raise $6 billion without
00:03:57
probably meeting with a few dozen firms
00:03:59
and some number of Junior people in
00:04:01
those few dozen firms are having a
00:04:03
conversation or two with reporters so
00:04:06
you can kind of see how it gets out all
00:04:08
right and before we get to our uh first
00:04:10
story there about open AI
00:04:12
congratulations to chamath let's pull up
00:04:14
the photo here he was a featured guest
00:04:18
on uh The Alex Jones Show no sorry I'm
00:04:21
sorry uh that would be Joe Rogan
00:04:24
congratulations on coming to Austin and
00:04:26
being on Joe Rogan what was it like to
00:04:27
do a three-hour podcast
00:04:30
with Joe Rogan it's great I mean I loved
00:04:34
it he's really awesome he's super cool
00:04:38
it's good to do long form stuff like
00:04:39
this so that I can actually talk clearly
00:04:41
is the limitation of this podcast is the
00:04:43
other three of us finally you have found
00:04:45
a way to make it about yourself um I saw
00:04:49
a comment somebody commented like oh wow
00:04:51
it's like amazing to hear chamath expand on
00:04:53
topics without the constant interruptions by
00:04:56
J Cal also known as moderation
00:05:00
someone called me someone called me Fez
00:05:03
or Fez from That '70s Show I thought that
00:05:06
was funny the the amount of trash
00:05:09
talking in Rogan's YouTube comments it's
00:05:11
next level it is it is it is the wild
00:05:15
wild west uh in in terms of the comment
00:05:18
section on YouTube yeah um a bunch of
00:05:21
comments asking Jak why do you call it
00:05:22
Alex Jones is that cuz it's just a Texas
00:05:25
podcaster who's short and stout and
00:05:27
they look similar just a but I mean it
00:05:31
looks like Alex
00:05:32
Jones uh sort of lifting weights
00:05:34
actually no they're
00:05:35
both the same height and yeah I saw
00:05:38
Joe Rogan 25 years ago doing standup I have
00:05:40
a photo with him at the club it was like
00:05:42
a small Club in San Francisco and we
00:05:45
hung out with him afterwards he was just
00:05:47
like a nobody back in the day was like a
00:05:48
stand-up guy right now
00:05:50
he's an uberstar well you have to go back
00:05:53
pretty far for Joe Rogan to be a nobody
00:05:55
I mean he had a TV show for a long time
00:05:58
and two of them in fact
00:06:00
comic for a while stand up comic yeah
00:06:03
Fear Factor that's right he also did
00:06:06
Survivor or one of those like so I think
00:06:08
and then the UFC I mean this guy's got
00:06:11
four four blew up UFC yeah yeah well I
00:06:15
mean I think he got the UFC out of Fear
00:06:17
Factor and being a UFC fighter and a
00:06:19
comedian and there's like a famous story
00:06:21
where like Dana White was pursuing him
00:06:25
and he was like I don't know and then
00:06:27
Dana White's like I'll send a plane for
00:06:28
you you can bring your friend he's like
00:06:30
okay fine I'll do it he did it for free
00:06:32
and then Dana White pursued him heavily
00:06:33
to become the voice of the UFC and yeah
00:06:36
obviously it's grown tremendously and
00:06:39
it's you know worth billions of dollars
00:06:42
okay so how is open AI worth 150 billion
00:06:45
dollars can anyone app well why don't we
00:06:49
get into the topic should we make the
00:06:50
bull case and the bear case all right
00:06:51
open AI as we were just joking in the
00:06:54
opening segment is trying to convert
00:06:56
into a for profit benefit Corporation
00:07:00
that's a B Corp it just means we'll
00:07:01
explain B Corp later Sam Altman is
00:07:04
reportedly I thought they're converting
00:07:05
to a C Corp no it's the same thing a
00:07:07
benefit corporation is a C
00:07:10
corporation variant that is not a
00:07:13
nonprofit but the board of directors
00:07:16
Sachs is required not only to be a
00:07:19
fiduciary for all shareholders but also
00:07:22
for the stated mission of the company
00:07:24
that's my understanding of a b Corp am I
00:07:25
right freberg external stakeholders yeah
00:07:28
so like the environment or Society or
00:07:31
whatever but it from all other kind of
00:07:33
legal tax factors it's the same as a C Corp
00:07:37
and it's a way to I guess signal to
00:07:40
investors the market employees that you
00:07:43
care about something more than just
00:07:45
profit so famous most famous B Corp I
00:07:48
think is Toms is that the shoe company
00:07:50
Toms that's a famous B Corp somebody
00:07:52
will look it up here Patagonia yeah that
00:07:54
falls into that category so for profit
00:07:56
with a mission Reuters has cited
00:07:58
anonymous sources close to the company
00:08:02
that the plan is still being hashed out
00:08:04
with lawyers and shareholders and the
00:08:06
timeline isn't certain but what's being
00:08:08
discussed is that the uh nonprofit will
00:08:11
continue to exist as a minority
00:08:13
shareholder in the new
00:08:15
company how much of a minority
00:08:17
shareholder I guess is uh the devil's in
00:08:19
the detail there do they own 1% or 49%
00:08:22
the uh very much discussed freedberg
00:08:25
100x profit cap for investors will be
00:08:28
removed
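The figures being discussed reduce to back-of-envelope arithmetic. This is a minimal sketch using only the numbers reported in this discussion (a $150B valuation, a 7% stake, a 100x return cap), all of which are reported rather than confirmed:

```python
# Back-of-envelope math on the reported OpenAI round. All figures are
# from the reports discussed on the show, not confirmed numbers.

VALUATION = 150e9     # reported post-money valuation
ALTMAN_STAKE = 0.07   # reported equity grant

stake_value = ALTMAN_STAKE * VALUATION  # ~ $10.5B, the figure quoted

def capped_proceeds(invested: float, paper_value: float, cap: float = 100.0) -> float:
    # Under the old capped-profit structure, an investor's upside was
    # limited to `cap` times the amount invested.
    return min(paper_value, cap * invested)

# Illustrative early check: $10M whose stake is notionally worth $10B
# (a 1,000x on paper). With the 100x cap it pays out $1B; removing the
# cap unlocks the full $10B -- the "100x turns into 1,000x" point below.
with_cap = capped_proceeds(10e6, 10e9)   # capped at $1.0B
without_cap = 10e9                       # cap removed: full $10B
```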
00:08:29
that means investors like Vinod friend
00:08:31
of the Pod and Reed Hoffman also friend
00:08:33
of the Pod could see a 100x turn into
00:08:37
a 1,000x or more according to the Bloomberg
00:08:39
report Sam Altman's going to get his
00:08:41
Equity finally 7% that would put him at
00:08:44
around 10.5 billion if this is all true
00:08:47
and OpenAI could be valued as high as
00:08:50
$150 billion we'll get into all the
00:08:54
shenanigans but let's start with your
00:08:55
question freeberg and since you asked it
00:08:57
I'm going to Boomerang it back to you
00:08:59
make the bull case for $150 billion
00:09:05
valuation the bull case would be that
00:09:08
the moat in the business with respect to
00:09:11
model
00:09:12
performance and infrastructure gets
00:09:15
extended with the large amount of
00:09:17
capital that they're raising they
00:09:18
aggressively deploy it they are very
00:09:20
strategic and tactical with respect to
00:09:23
how they deploy that
00:09:24
infrastructure to continue to improve
00:09:26
model performance and as a result
00:09:28
continue to extend their advantage in
00:09:31
both consumer and Enterprise
00:09:32
applications the API tools and so on
00:09:34
that they offer and so they can maintain
00:09:38
uh both kind of model and application
00:09:40
performance leads that they have today
00:09:43
across the board I would say like the
00:09:45
the o1 model uh their voice application
00:09:48
Sora has not been released publicly but
00:09:50
if it does and it looks like what it's
00:09:51
been demoed to be it's certainly ahead
00:09:53
of the pack so there's a lot of uh
00:09:56
aspects
00:09:57
of of open AI today that kind of makes
00:10:00
them a leader and if they can deploy
00:10:01
infrastructure to maintain that lead and
00:10:02
not let Google Microsoft Amazon and
00:10:05
others catch up then their ability to
00:10:08
use that Capital wisely keeps them ahead
00:10:10
and ultimately as we all know there's a
00:10:12
multi-trillion dollar market to capture
00:10:13
here making lots of verticals lots of
00:10:16
applications lots of products so they
00:10:18
could become a a true kind of global
00:10:20
player here plus the extension into
00:10:22
Computing which I'm excited to talk
00:10:23
about later when we get into this
00:10:25
Computing stuff Sachs here's a chart of
00:10:27
open AI's Revenue growth
00:10:30
that has been piecemealed together from
00:10:32
various sources at various times but
00:10:35
you'll see here they are
00:10:38
reportedly as of June of 2024 on a $3.4
00:10:42
billion run rate for this year after
00:10:45
hitting two billion in 23 1.3 billion in
00:10:49
October of 23 and then back in 2022 was
00:10:54
reported they only had 28 million in
00:10:56
Revenue so this is a pretty pretty hot a
00:10:59
pretty a pretty big streak here in terms
00:11:00
of Revenue growth I would put it at 50
00:11:02
times Topline Revenue 150 billion
00:11:05
valuation you want to give us the bear case
00:11:07
maybe or the bull case well so the The
00:11:09
Whisper numbers I heard was that their
00:11:12
revenue run rate for this year was in
00:11:13
the four to six billion range which is a
00:11:15
little higher than that so you're right
00:11:19
if it's really more like 3.4 this
00:11:21
valuation is about 50 times current
00:11:23
Revenue but if it's more like 5 billion
00:11:26
then it's only 30 times and if it's
00:11:28
growing 100% year-over-year it's only 15
00:11:31
times next year so depending what the
00:11:33
numbers actually are the $150 billion
00:11:35
valuation could be warranted I don't
00:11:38
think 15
00:11:39
times forward ARR is a high valuation for a
00:11:44
company that has this kind of strategic
00:11:46
opportunity I think it all comes down to
00:11:49
the
00:11:50
durability of its comparative advantage
00:11:53
here I think there's no question that
00:11:54
open AI is the leader of the pack it has
00:11:57
the most advanced AI model models it's
00:11:59
got the best developer ecosystem the
00:12:02
best apis it keeps rolling out new
00:12:04
products and the question is just how
00:12:06
durable that Advantage is is there
00:12:09
really a moat to any of this for example
00:12:12
meta just announced llama 3.2 which can
00:12:15
do voice and this is roughly at the same
00:12:17
time that OpenAI just released its
00:12:20
voice API so the open source ecosystem
00:12:23
is kind of hot on open ai's heels the
00:12:27
large companies Google Microsoft so
00:12:30
forth they're hot on their heels too
00:12:33
although it seems like they're further
00:12:34
behind where meta is and the question is
00:12:38
just can open AI maintain its lead can
00:12:40
it consolidate its lead can it develop
00:12:41
some moats if
00:12:43
so it's on track to be the next trillion
00:12:46
dollar big tech company but if not it
00:12:48
could be eroded and you could see the
00:12:51
value of open AI get get commoditized
00:12:53
and we'll look back on it as kind of a
00:12:55
cautionary tale okay chamath do us a favor
00:12:57
here if there is a bear case what is it
00:13:00
okay let's steelman the bear case yes
00:13:03
that's what I'm asking
00:13:04
please so one would just be on
00:13:07
the fundamental technology itself and I
00:13:12
think the version of that story would go
00:13:14
that the
00:13:17
underlying Frameworks that people are
00:13:19
using to make these models great is well
00:13:22
described and available in open source
00:13:25
on top of that there are at least two
00:13:28
viable open source models that are as
00:13:32
good or better at any point in time than
00:13:36
open AI so what that would mean is that
00:13:39
the value of those
00:13:41
models the economic value basically goes
00:13:44
to zero and it's a consumer
00:13:46
surplus for the people that use it so
00:13:49
that's very hard theoretically to
00:13:53
monetize I think the second part of the
00:13:57
case would be that specifically meta
00:14:00
becomes much more aggressive in
00:14:03
inserting meta AI into all of the
00:14:06
critical apps that they control because
00:14:10
those apps really are the front door to
00:14:12
billions of people on a daily basis so
00:14:15
that would mean WhatsApp Instagram
00:14:18
messenger the Facebook app and threads
00:14:22
gets refactored in a way where instead
00:14:24
of leaving that application to go to a
00:14:26
chat GPT like app you would just stay in
00:14:29
the
00:14:31
app and then the companion to that would
00:14:37
be that Google also does the same thing
00:14:40
with their version in front of search so
00:14:42
those two big front doors to the
00:14:44
internet become much more aggressive in
00:14:47
giving you a reason to not have to go to
00:14:49
chat GPT because a their answers are
00:14:52
just as good and B they're right there
00:14:54
in a few less clicks for you yeah I'm so
00:14:57
that would be the that would be the
00:14:58
second
00:14:59
piece the third piece is that all of
00:15:03
these models basically run out of viable
00:15:07
data to differentiate themselves and it
00:15:09
basically becomes a race around
00:15:11
synthetic information and synthetic data
00:15:14
which is a cost problem meaning if
00:15:16
you're going to invent synthetic data
00:15:18
you're going to have to spend money to
00:15:20
do it and the large companies Facebook
00:15:23
Microsoft Amazon Google Apple have
00:15:26
effectively infinite money compared to
00:15:27
any startup
00:15:30
and then the fourth which is the most
00:15:32
quizzical one is what does the human
00:15:35
capital thing tell you about what's
00:15:38
going on it reads a little bit like a
00:15:40
telenovela I have not in my time in
00:15:43
Silicon Valley ever seen a company
00:15:46
that's supposedly on such a straight
00:15:48
line to a rocket ship have so much
00:15:51
high-level
00:15:53
churn and I've but and I've also never
00:15:55
seen a company have this much liquidity
00:15:59
and so how are people deciding to leave
00:16:03
if they think it's going to be a
00:16:04
trillion dollar company and why when
00:16:07
things are just starting to cook would
00:16:08
you leave if you are technically
00:16:11
enamored with what you're building so if
00:16:14
you had to construct the bear case I think
00:16:16
those would be the four things open
00:16:18
source front door
00:16:20
competition the move to synthetic data
00:16:23
and all of the executive turnover would
00:16:26
be sort of why you would say May maybe
00:16:29
there's a fire where there's all this
00:16:31
smoke okay I think this is very well put
00:16:33
and I have been
00:16:36
using chat GPT and Claude and Gemini
00:16:40
exclusively I stopped using Google
00:16:42
search and I also stopped Sachs asking
00:16:46
people on my team to do stuff before I
00:16:48
asked chat GPT to do it specifically
00:16:51
freeberg the o1 version and the o1
00:16:54
version is distinctly different have you
00:16:56
gentlemen been using o1 like on a daily
00:17:00
okay so we can have a really interesting
00:17:01
conversation here I did something on my
00:17:04
other podcast this week in startups that
00:17:05
I'll show you right now that was crazy
00:17:08
yesterday is a g is a game changer it's
00:17:10
the first real it's the first real Chain
00:17:14
of Thought production system yes that I
00:17:16
think we've seen are you using o1
00:17:18
preview or o1 mini uh I am using o1
00:17:22
preview now let me show you what I did
00:17:24
here just just so the audience can level
00:17:26
set here if you're not watching us go to
00:17:27
YouTube and type in All-In and you can
00:17:30
you can watch us we do video here so I
00:17:33
was analyzing you know just some early
00:17:36
stage uh deals and cap tables and I put
00:17:38
in here hey a startup just raised some
00:17:40
money at this valuation here's what the
00:17:42
friends and family invested the
00:17:44
accelerator the seed investor Etc in
00:17:46
other words like the history the
00:17:47
investment history in a company what o1
00:17:50
does distinctly differently than the
00:17:52
previous versions and the previous
00:17:54
version I felt was 3 to 6 months ahead
00:17:56
of competitors this is a year ahead of
00:17:58
competitors and so here chamath if you
00:18:01
look it said it thought for 77 seconds
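For context, the cap-table analysis being handed to o1 in this demo boils down to standard dilution arithmetic: each priced round sells a fraction of the post-money company, diluting everyone before it. A minimal sketch with purely hypothetical round numbers (not the ones from the demo):

```python
# Standard cap-table dilution math. All round numbers below are
# illustrative placeholders, not figures from the demo.

def ownership_after_rounds(founder_shares: float, rounds: list) -> dict:
    """rounds: list of (amount_invested, post_money_valuation) tuples."""
    holders = {"founders": founder_shares}
    total = founder_shares
    for i, (invested, post_money) in enumerate(rounds, start=1):
        fraction = invested / post_money           # new investor's slice
        new_shares = total * fraction / (1 - fraction)
        holders[f"round_{i}"] = new_shares
        total += new_shares
    return {name: shares / total for name, shares in holders.items()}

stakes = ownership_after_rounds(
    founder_shares=10_000_000,
    rounds=[
        (100_000, 1_000_000),      # friends & family at $1M post
        (500_000, 5_000_000),      # accelerator/seed at $5M post
        (2_000_000, 20_000_000),   # priced round at $20M post
    ],
)
# Stakes always sum to 1; the latest round holds exactly its purchased
# fraction, while earlier rounds have been diluted below theirs.
```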
00:18:04
and if you click the down arrow Sachs
00:18:07
what you'll see is it gives you an idea
00:18:09
of what its rationale is for
00:18:12
interpreting and what secondary query is
00:18:14
it's
00:18:15
doing in order this is called Chain of
00:18:18
Thought and this is the underlying Mega
00:18:21
model that sits on top of the llms and
00:18:25
the mega model effectively The Chain of
00:18:27
Thought approach is
00:18:29
the model asks itself the question how
00:18:33
should I answer this question right and
00:18:35
then it comes up with an answer and then
00:18:37
it says now based on that what are the
00:18:39
steps I should take to answer the
00:18:40
question so the model keeps asking
00:18:42
itself questions related to the
00:18:44
structure of the question that you asked
00:18:48
and then it comes up with a series of
00:18:49
steps that it can then call the llm to
00:18:52
do to fill in the blanks link them all
00:18:55
together and come up with the answer
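The loop just described can be sketched in a few lines. This is a hedged illustration of the plan-then-execute idea as explained here, with a stubbed `call_llm` standing in for a real completion API; it is not OpenAI's actual o1 implementation, which is not public:

```python
# Sketch of the plan-then-execute ("chain of thought") loop described
# above: ask HOW to answer, turn the plan into sub-queries, then link
# the partial answers together.

def call_llm(prompt: str) -> str:
    # Hypothetical stub; swap in a real LLM completion API call.
    return f"[model output for: {prompt[:40]}...]"

def answer_with_chain_of_thought(question: str) -> str:
    # 1. The model first asks itself how the question should be answered.
    plan = call_llm(f"List the steps needed to answer: {question}")

    # 2. Each planned step becomes its own sub-query to the LLM.
    steps = [line for line in plan.splitlines() if line.strip()]
    partials = [call_llm(f"Do this step: {step}\n(question: {question})")
                for step in steps]

    # 3. The partial answers are linked into a final response.
    combined = "\n".join(partials)
    return call_llm(f"Combine these partial results into one answer:\n{combined}")
```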
00:18:56
it's the same way that a human train of
00:18:58
thought works and it really is the the
00:19:00
kind of ultimate evolution of what a lot
00:19:03
of people have said these systems need
00:19:04
to become which is a much more call it
00:19:07
intuitive approach to answering
00:19:09
questions rather than just predictive
00:19:11
text based on the single statement you
00:19:13
made and it really is changing the game
00:19:15
and everyone is going to chase this and
00:19:17
follow this it is the new paradigm for
00:19:20
for how these AI kind of systems will
00:19:22
work and by the way what this did was
00:19:25
what prompt Engineers were doing or
00:19:27
prompt engineering websites were doing
00:19:29
which was trying to help you construct
00:19:30
your question and so if you look to this
00:19:33
one it says listing disparities I'll
00:19:35
compile a cap table with Investments and
00:19:37
valuations building the cap table
00:19:39
assessing the share valuation breaking
00:19:40
down ownership breaking down ownership
00:19:42
Etc evaluating the terms and then it
00:19:44
checks its work a bit it weighs
00:19:45
investment options and you can see this
00:19:47
is a this is fired off like two dozen
00:19:51
different queries to as Freeberg correctly
00:19:55
pointed out you know build this chain
00:19:58
and
00:19:59
it got incredible answers explain the
00:20:01
formulas so it's thinking about what
00:20:03
your next question would be and this
00:20:06
when I share this with my team it was
00:20:08
like a super game changer uh Sachs you had
00:20:11
some thoughts here well yeah I mean this
00:20:14
is pretty impressive and just to build
00:20:16
on what freeberg was saying about Chain
00:20:18
of Thought where this All Leads is to
00:20:20
agents where you can actually tell the
00:20:22
AI to do work for you you give it an
00:20:26
objective it can break the objective
00:20:28
down into tasks and then it can work
00:20:30
each of those tasks and open AI at a
00:20:34
recent meeting with investors said that
00:20:36
PhD level reasoning was next on its road
00:20:39
map and then agents weren't far behind
00:20:41
that they've now released the at least
00:20:43
the preview of the PHD level reasoning
00:20:45
with this o1 model so I think we can
00:20:47
expect an announcement pretty soon about
00:20:49
agents yeah and so the and if you think
00:20:52
about if you think about business value
00:20:54
you know we think a lot about this is
00:20:55
like where's the SaaS opportunity and all
00:20:57
this the software as a service
00:20:59
opportunity it's going to be an agents I
00:21:02
think we'll ultimately look back on
00:21:04
these sort of chat models as a little
00:21:06
bit of a parlor trick compared to what
00:21:08
agents are going to do in the workplace
00:21:11
if you've ever been to a call center or
00:21:13
an operations center they're also called
00:21:15
service factories it's assembly lines of
00:21:18
people doing very complicated knowledge
00:21:21
work but ultimately you can unravel
00:21:23
exactly what the chain is there The
00:21:26
Chain of Thought that goes into their
00:21:28
decisions it's it's very complicated and
00:21:31
that's why you have to have humans doing
00:21:32
it but you could imagine that once
00:21:37
system integrators or or Enterprise SaaS
00:21:39
apps go into these places go into these
00:21:42
companies they integrate the data and
00:21:44
then they map out the workflow you could
00:21:47
replace a lot of these steps in the
00:21:49
workflow with agents yeah by the way
00:21:51
it's not just it's not just call centers
00:21:53
I had a conversation with um I'm on the
00:21:56
board of a company with the CEO the
00:21:57
other day and he was like well we're
00:21:59
going to hire an analyst that's going to
00:22:01
sit between our kind of retail sales
00:22:03
operations and do the and you know
00:22:05
figure out what's working to drive
00:22:07
marketing decisions and I'm like no
00:22:08
you're not like I I really think that
00:22:11
that would be a mistake because today
00:22:13
you can use o1 and describe or just feed
00:22:17
it the data and describe the analysis
00:22:19
you want to get out of that data and
00:22:21
within a few minutes and I've now done
00:22:23
this probably a dozen times in the last
00:22:24
week with different projects internally
00:22:26
at my company it gives you the entire
00:22:28
answer that an analyst would have taken
00:22:30
days to put together for you and if you
00:22:32
think about what an analyst's job has
00:22:34
been historically is they take data and
00:22:36
then they manipulate it and the big
00:22:38
evolution in software over the last
00:22:40
decade and a half has been tools that
00:22:42
give that analyst leverage to do that
00:22:45
data manipulation more quickly like
00:22:47
Tableau and you know R and all sorts of
00:22:49
different toolkits that are out there
00:22:51
but now you don't even need the analyst
00:22:53
because the analyst is the Chain of
00:22:54
Thought it's the prompting from the
00:22:57
model and it's completely going to
00:22:59
change how knowledge work is done
00:23:00
everyone that owns a function no longer
00:23:03
needs an analyst the analyst is the
00:23:05
model that's sitting on the computer in
00:23:07
front of you right now and you tell it
00:23:08
what you want and not days later but
00:23:10
minutes later you get your answer it's
00:23:13
completely revolutionary in um uh ad hoc
00:23:18
uh knowledge work as well as kind of
00:23:20
this repetitive structured knowledge
00:23:21
this is such a good point freeberg the
00:23:23
ad hoc piece of it when we're processing
00:23:26
20,000 applications for funding a year
00:23:28
we do a 100 plus meetings a week the
00:23:30
analysts on our team are now putting in
00:23:33
the transcripts um and key questions
00:23:36
about markets and they are getting so
00:23:38
smart so fast that you know when
00:23:41
somebody comes to them with a
00:23:42
Marketplace in diamonds their
00:23:45
understanding of the diamond Marketplace
00:23:47
becomes so rich so fast that we can
00:23:49
evaluate companies faster then we're
00:23:52
also seeing uh chth uh before we call
00:23:55
our lawyers when we have a legal
00:23:57
question about a document we start
00:23:58
putting in you know let's say the the
00:24:00
standard note template or the standard
00:24:02
safe template we put in the new one and
00:24:05
there's a really cool project by Google
00:24:07
called NotebookLM where you can put
00:24:10
in multiple documents and you can start
00:24:11
asking questions so imagine you take
00:24:13
every single legal document Sachs that
00:24:16
Yammer had when you had chamath as an
00:24:18
investor I'm not sure if he was on the
00:24:19
board and you can start asking questions
00:24:21
about the documents and we have had
00:24:24
people make changes to these documents
00:24:25
and it immediately finds them explains
00:24:27
them and so everybody's just getting so
00:24:30
goddamn smart so fast using these tools
00:24:33
that I insisted that every person on the
00:24:35
team when they hit control tab it opens
00:24:37
a ChatGPT window in o1 and we burned
00:24:40
out our credits immediately like it
00:24:43
stopped us it said you're you're you
00:24:45
have to stop using it for the rest of
00:24:46
the month Cham your thoughts on this
00:24:48
we're seeing it in real
00:24:50
time in 8090 what I'll tell you is what
00:24:54
Sachs said is totally right there's so
00:24:58
many
00:24:59
companies that have very complicated
00:25:02
processes that are a combination of
00:25:05
well-trained and well-meaning people
00:25:08
and bad software and what I mean by bad
00:25:13
software is that some other third party
00:25:16
came in listened to what your business
00:25:19
process was and then wrote this clunky
00:25:20
deterministic code usually on top of
00:25:23
some system of
00:25:24
record charged you tens or hundreds of
00:25:27
millions of dollars for it and then left and
00:25:30
will support it only if you keep paying
00:25:31
them millions of dollars a year that
00:25:34
whole thing is so nuts because the
00:25:35
ability for people to do work I think
00:25:37
has been very much
00:25:39
constrained and it's constrained by
00:25:41
people trying to do the right thing
00:25:42
using really really terrible software
00:25:45
and all of that will go away the radical
00:25:48
idea that I would put out there is I
00:25:50
think that systems of record no longer
00:25:52
exist because they don't need
00:25:54
to and the reason is because all you
00:25:57
have is data and you have a and just
00:26:00
explain to people what system of record
00:26:01
is just so so inside of a company you'll
00:26:04
have a handful of systems that people
00:26:06
would say are the single source of truth
00:26:08
they're the things
00:26:09
that are used for reporting compliance
00:26:13
and example would be for your general
00:26:17
ledger so to record your revenues you'd
00:24:21
use NetSuite or you'd use Oracle GL or you'd
00:26:24
use workday financials then you'd have a
00:26:27
different system of record for all of
00:26:29
your Revenue generating activities so
00:26:32
who are all of the people you sell to
00:26:34
how are sales going what is the
00:26:36
pipeline there's companies like Salesforce or
00:26:39
Sugar CRM then there's a system of
00:26:42
record for all the employees that work
00:26:44
for you all the benefits they have what
00:26:46
is their salary this is HRIS so the
00:26:48
point is that the software economy over
00:26:52
the last 20 years and this is trillions
00:26:54
of dollars of market cap and hundreds of
00:26:57
billions of Revenue that has been built
00:26:59
on this premise that we will create this
00:27:03
system of record you will build apps on
00:27:06
top of the system of record and the
00:27:07
knowledge workers will come in and
00:27:10
that's how they will get work done and I
00:27:12
think that saaks is right this totally
00:27:14
flips that on its head instead what will
00:27:16
happen is people will provision an agent
00:27:20
and roughly direct what they want the
00:27:22
outcome to be and they'll be process
00:27:24
independent they won't care how they do
00:27:26
it they just want the answer so I think
00:27:28
two things happen the obvious thing that
00:27:31
happens in that world is systems of
00:27:34
record lose their
00:27:36
grip on the Vault that they had in terms
00:27:40
of the data that runs a company you
00:27:42
don't necessarily need it with in the
00:27:45
same Reliance and Primacy that you did
00:27:47
five and 10 years ago that'll have an
00:27:49
impact to the software
00:27:51
economy and the second thing that I
00:27:54
think is even more important than that
00:27:57
is that then the atomic size of companies
00:28:00
changes because each company will get
00:28:03
much more leverage from using software
00:28:05
and few people versus lots of people
00:28:08
with a few pieces of software and so
00:28:09
that inversion I think creates
00:28:12
tremendous potential for operating
00:28:14
leverage all right your thought Sachs
00:28:15
you operate in the SaaS space uh with
00:28:17
system of records and investing in these
00:28:19
type of companies give us your take well
00:28:22
it's interesting we were having a
00:28:23
version of this conversation last week
00:28:26
on the Pod and I started getting texts
00:28:28
from Benioff as he was listening to it
00:28:30
and then oh he he called me and I think
00:28:32
he got a little bit triggered by the
00:28:34
idea that systems of record like
00:28:37
Salesforce are going to be obsolete in
00:28:38
this new AI era and he made a very
00:28:41
compelling case to me about why that
00:28:43
wouldn't happen which is yeah well first
00:28:45
of all I think AI models are predictive
00:28:48
I mean at the end of the day they're
00:28:49
predicting the next set of texts and so
00:28:51
forth and when it comes to like your
00:28:53
employee list or your customer list you
00:28:55
just want to have a source of truth you
00:28:58
don't want it to be 98% accurate you want
00:29:00
it to be 100% accurate you want to know
00:29:02
if the federal government asks you for
00:29:04
the tax ID numbers of your employees you
00:29:06
want to just be able to give it to them if
00:29:07
Wall Street analysts ask you for your
00:29:10
customer list and what the GAAP revenue
00:29:11
is you want to just be able to provide
00:29:13
that you don't want AI models figuring
00:29:15
it out so you're still going to need a
00:29:16
system of record
00:29:19
furthermore he made the point that you
00:29:21
still need databases you still need
00:29:23
Enterprise security if you're dealing
00:29:24
with Enterprises you still need
00:29:26
compliance you still need sharing models
00:29:28
there's all these aspects all these
00:29:30
things that have been built on top of
00:29:31
the database that SaaS companies have been
00:29:33
doing for 25 years and then the final
00:29:35
point that I think is compelling is that
00:29:38
Enterprise customers don't want to DIY
00:29:41
it right they don't want to have to
00:29:42
figure out how to put this together and
00:29:44
you can't just hand them an LLM and say
00:29:47
here you go there's a lot of work that
00:29:51
is needed in order to make these models
00:29:54
productive and so at a minimum you're
00:29:56
going to need system integrators and
00:29:59
Consultants to come in there connect
00:30:01
hold on just connect all the Enterprise
00:30:02
data to these models map the workflows
00:30:05
you have to do that now how's that
00:30:07
different from how this clunky
00:30:09
software is sold today I mean look I
00:30:12
don't want to take away from the quality
00:30:15
of the company that Mark has built and
00:30:17
what he's done for the cloud economy so
00:30:19
let's just put that aside but I wish
00:30:21
this is what we could have actually all
00:30:22
been on stage and talked about I told
00:30:25
him that when he was at the summit I
00:30:26
said that because I agree with basically
00:30:29
every premise of those three things
00:30:31
number one systems integrators exist
00:30:32
today to build apps on top of these
00:30:34
things why do you think you have
00:30:36
companies like Veeva how can a $20 billion
00:30:38
dollar plus company get built on top of
00:30:40
Salesforce it's because it doesn't do
00:30:42
what it's meant to do that's
00:30:47
why app stores are a great way to allow
00:30:50
people to build on your platform and and
00:30:52
and cover those Niche cases the point
00:30:54
I'm trying to make is that's no
00:30:55
different than the economy that exists
00:30:57
today is just going to transform to
00:30:58
different groups of people number one
00:30:59
well by the way he said he's willing to
00:31:01
come on the pod and talk about this very
00:31:02
issue but just with
00:31:05
you he can talk to all of us now great he'll come on
00:31:09
the pod and discuss whether AI makes
00:31:10
SaaS obsolete a lot of people are asking
00:31:12
that question let's talk about it next
00:31:14
year at the summit can you talk about
00:31:16
his uh philanthropy first okay let's get
00:31:18
back to focus here let's get focused
00:31:21
everybody we love you Mark who's coming to
00:31:24
Dreamforce raise your hand I want to
00:31:26
make another point the second point is
00:31:28
that when you have agents I think that
00:31:31
we are overestimating what a system of
00:31:33
record did David what you talked about
00:31:35
is actually just an encrypted file or
00:31:37
it's a bunch of rows in some database or
00:31:39
it's in some data Lake somewhere you
00:31:42
don't need to spend tens or hundreds of
00:31:45
millions of dollars to wrap your Revenue
00:31:49
in something that says it's a system of
00:31:51
record you don't need that actually you
00:31:53
can just pipe that stuff directly from
00:31:56
stripe into Snowflake and you can just
00:31:59
transform it and do what you will with
00:32:00
it and then report it I'll tell you
00:32:02
could do that today it's just that
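The pipe described here — payments data flowing straight from Stripe into a warehouse like Snowflake, then transformed and reported without a system-of-record product in between — can be sketched minimally. The record fields, timestamps, and aggregation below are illustrative assumptions about the transform step, not a real Stripe or Snowflake integration:

```python
from collections import defaultdict
from datetime import datetime, timezone

def monthly_revenue(charges):
    """Aggregate raw Stripe-style charge records into a monthly
    revenue report -- the 'transform' step between the payment
    processor and the warehouse. Field names are illustrative."""
    totals = defaultdict(int)
    for c in charges:
        if c["status"] != "succeeded":
            continue  # only settled payments count toward revenue
        month = datetime.fromtimestamp(c["created"], tz=timezone.utc).strftime("%Y-%m")
        totals[month] += c["amount"]  # amounts in cents, as Stripe reports them
    return {m: amt / 100 for m, amt in sorted(totals.items())}

# Hypothetical raw records, as they might land in a data lake
charges = [
    {"amount": 150000, "created": 1717200000, "status": "succeeded"},
    {"amount": 99000,  "created": 1719900000, "status": "succeeded"},
    {"amount": 50000,  "created": 1719900100, "status": "failed"},
]
print(monthly_revenue(charges))
```

In practice the same shape of transform would run as SQL inside the warehouse; the point is that the reporting logic, not the wrapper product, is the load-bearing part.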
00:32:04
that's an interesting point through
00:32:05
steak dinners and golf outings and all
00:32:07
this stuff we've sold CIOs this idea
00:32:12
that you need to wrap it in something
00:32:13
called a system of record and all I'm
00:32:15
saying is when you confront the total
00:32:18
cost of that versus what the alternative
00:32:21
that is clearly going to happen in the
00:32:23
next 5 or 10 years irrespective of
00:32:25
whether any of us build it or not
00:32:27
you just won't be able
00:32:29
to justify
00:32:31
it I'll say there's probably
00:32:34
also an aspect of this that we can't
00:32:37
predict what is going to work with
00:32:39
respect to data structure so right now
00:32:41
all of the tooling for AI
00:32:45
is on the front end and we haven't yet
00:32:47
Unleashed AI on the back end which is if
00:32:50
you told the AI here's all the data
00:32:52
ingest I'm going to be doing from all
00:32:54
these different points in my business
00:32:56
figure out what you want to do with all
00:32:58
that data the AI will eventually come up
00:33:01
with its own data structure and data
00:33:04
system that will look nothing
00:33:07
like that's already happening right and
00:33:09
so it's nothing like what we have
00:33:11
today in the same vein that we don't
00:33:13
understand how the translation works in
00:33:14
an LLM we don't understand
00:33:16
how a lot of the function works a lot of
00:33:18
the data structure and data architecture
00:33:19
we won't understand clearly because it's
00:33:21
going to be obfuscated by the model driving
00:33:24
the development there are open
00:33:25
source agentic frameworks that already
00:33:27
do what you're saying Freeberg so
00:33:29
it's not true that it's not been done
00:33:31
it's already sure so maybe it's being
00:33:33
done right that's it hasn't been fully
00:33:35
implemented to replace the system of record there
00:33:37
are companies I'll give you an example
00:33:39
of one like Mechanical Orchard they'll
00:33:42
go into the gnarliest of
00:33:44
environments and what they will do is
00:33:45
they will launch these agents that
00:33:47
observe it's sort of what I told you
00:33:49
guys before the I/O stream of these apps
00:33:51
and then reconstruct everything in the
00:33:53
middle
00:33:55
automatically I don't understand why
00:33:58
we think that there's a world where
00:34:00
customer quality and NPS would not go
00:34:02
sky-high for a company that has some old
00:34:05
legacy system and now they can just
00:34:07
pay you know mechanical Orchard a few
00:34:10
million bucks and they'll just replace
00:34:11
it in a matter of months it's going to
00:34:13
happen right yeah that's the very
00:34:15
interesting piece for me is I'm you know
00:34:17
watching startups you know working on
00:34:19
this the AI first ones I think are going
00:34:22
to come to it with a totally different
00:34:24
cost structure the idea of paying for
00:34:25
seats and I mean some of the these seats
00:34:28
are $5,000 per person per year you nailed it
00:34:31
a year ago when you were like oh you
00:34:33
mentioned some company that had like
00:34:34
flat pricing at first by the way when
00:34:37
you said that I thought this is nuts but
00:34:40
you're right it actually makes a ton of
00:34:42
sense because if you have a fixed group
00:34:44
of people who can use this tooling to
00:34:48
basically effectively be as productive
00:34:51
as a company that's 10 times as big as
00:34:53
you you can afford to Flat price your
00:34:56
software because you can just work
00:34:58
backwards from what margin structure you
00:35:00
want and it's still meaningfully cheaper
00:35:02
than any other alternative a lot of
00:35:05
startups now are doing consumption based
00:35:07
pricing so they're saying you know how
00:35:09
many um how many sales calls are you
00:35:12
doing how many are we analyzing as
00:35:14
opposed to how many sales Executives do
00:35:16
you have because when you have agents as
00:35:19
we're talking about those agents are
00:35:21
going to do a lot of the work so we're
00:35:23
going to see the number of people
00:35:24
working at companies become fixed and I
00:35:27
think the static team size that we're seeing
00:35:30
at a lot of large companies is only
00:35:32
going to continue it's going to be down
00:35:33
and to the right and if you think you're
00:35:35
going to get a high-paying job at a big
00:35:36
tech company and you have to beat the
00:35:40
agent you're going to have to beat the
00:35:41
Maestro who has five agents working for
00:35:43
them I think this is going to be a
00:35:45
completely different world uh Chamath
00:35:48
I want to get back to OpenAI with a
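The flat-pricing logic discussed above — a fixed team using agents, so the vendor works backwards from a target margin rather than counting seats — reduces to simple arithmetic. A minimal sketch; all figures are invented for illustration except the $5,000-per-seat number cited earlier, and the function names are hypothetical:

```python
def flat_price(cost_per_customer, target_margin):
    """Work backwards from the margin structure you want:
    price = cost / (1 - margin). Inputs are hypothetical."""
    return cost_per_customer / (1.0 - target_margin)

def seat_price(seats, per_seat):
    """Classic per-seat SaaS pricing."""
    return seats * per_seat

def consumption_price(units, per_unit):
    """Consumption-based pricing, e.g. per sales call analyzed."""
    return units * per_unit

# A 200-seat deployment at $5,000/seat/year (the figure cited above)
legacy = seat_price(200, 5_000)            # $1,000,000 per year
# An AI-first vendor with $60k of model/compute cost per customer,
# pricing for an 80% gross margin regardless of headcount
ai_flat = flat_price(60_000, 0.80)         # $300,000 per year
# Or billing $2 per analyzed call for 120,000 calls a year
ai_usage = consumption_price(120_000, 2)   # $240,000 per year
print(legacy, ai_flat, ai_usage)
```

Under these made-up numbers either AI-first model undercuts the seat-based incumbent by 3x or more, which is the "meaningfully cheaper than any alternative" point.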
00:35:50
couple other pieces so let's wrap this
00:35:52
up
00:35:53
last word for you last word so look
00:35:57
I think that on the whole I agree
00:36:00
with Benioff here that there's more
00:36:02
net new opportunity for AI companies
00:36:05
whether they be startups or you know
00:36:07
existing big companies like Salesforce
00:36:09
that are trying to do AI than there is
00:36:12
disruption I think there will be some
00:36:13
disruption it's very hard for us to see
00:36:15
exactly what AI is going to look like in
00:36:17
five or 10 years so I don't want to
00:36:18
Discount the possibility that ex there
00:36:21
will be some disruption of existing
00:36:23
players but I think on the whole there's
00:36:25
more net new opportunity for example the
00:36:27
most highly valued public software
00:36:29
company right now in terms of ARR
00:36:31
multiple is Palantir and I think that's
00:36:35
largely because the market perceives
00:36:37
Palantir as having a big AI opportunity
00:36:39
what is Palantir's approach the first thing
00:36:41
Palantir does when they go into a customer
00:36:44
is they integrate with all of its
00:36:46
systems and they're dealing with the
00:36:47
largest Enterprises they're dealing with
00:36:48
the government the Pentagon Department
00:36:50
of Defense the first thing they do is go
00:36:52
in and integrate with all of these
00:36:55
Legacy systems and they collect all of
00:36:58
the data in one place they call it
00:37:00
creating a digital twin and once all the
00:37:03
data is in one place with the right
00:37:04
permissions and safeguards now analysts
00:37:07
can start working it and that was their
00:37:08
historical value proposition but in
00:37:11
addition AI can now start working that
00:37:13
problem so anything that the analyst
00:37:15
could work now ai is going to be able to
00:37:16
work and so they're in an ideal position
00:37:19
to master these new AI workflows so what
00:37:22
is the point I'm making it's just that
00:37:24
you can't just throw an LLM at these
00:37:26
large enterprises you have to go in
00:37:29
there and integrate with the existing
00:37:31
systems it's not about ripping out the
00:37:32
existing systems because that's just a
00:37:34
lot of headaches that nobody needs it's
00:37:36
generally an easier approach just to
00:37:38
collect except when the renewal comes
00:37:40
what happens when you have to you know
00:37:42
spend a billion on something yeah and then
00:37:45
you're going to renegotiate or you're
00:37:46
going to spend a billion dollars again
00:37:47
five years from now it just doesn't seem
00:37:49
very likely there's going to be a lot of
00:37:51
hardcore negotiations going on Chamath
00:37:53
people are going to ask for 20% off 50%
00:37:55
off and people are going to be more
00:37:57
competitive that's all I suspect
00:37:59
Palantir's go-to-market when they start to
00:38:01
really scale they'll be able to
00:38:02
underprice a bunch of these other
00:38:04
alternatives and so I think
00:38:07
that when you look at
00:38:11
the impacts and pricing that all of
00:38:15
these open-source and closed-source model
00:38:17
companies have now introduced in terms
00:38:19
of the price per token what we've seen
00:38:22
is just a massive step function lower
00:38:25
right so it is incredibly
00:38:28
deflationary so the things that sit on
00:38:30
top are going to get priced as a
00:38:33
function of that cost which
00:38:35
means it will be an order of magnitude
00:38:37
cheaper than the stuff that it replaces
00:38:39
which means that a company would almost
00:38:42
have to purposely want to keep paying
00:38:45
tens of millions of dollars when they
00:38:47
don't have to they would need to make
00:38:49
that as an explicit decision and I think
00:38:52
that very few companies will be in a
00:38:54
position to be that cavalier in 5 or
00:38:57
10 years so you're either going to
00:38:59
rebase the revenues of a bunch of these
00:39:03
existing deterministic companies or
00:39:05
you're going to create an entire economy
00:39:07
of new ones that have a fraction of the
00:39:09
revenues today but a very different
00:39:11
profitability profile I think whenever
00:39:15
you're dealing with
00:39:16
a disruption as big as this
00:39:19
current one I think it's always tempting
00:39:21
to think in terms of the existing pie
00:39:24
getting disrupted and shrunk as
00:39:27
opposed to the pie getting so big with new
00:39:29
use cases that on the whole the
00:39:32
ecosystem benefits no no no I agree with
00:39:35
that I suspect that's what's happening no I
00:39:37
I agree with that my only point is
00:39:39
that the pie can get bigger while the
00:39:41
slices get much much
00:39:44
smaller well I mean right between the
00:39:47
two of you I think is the truth
00:39:50
because what's happening is if you look
00:39:52
at investing it's very hard to get into
00:39:55
these late stage companies because they
00:39:56
don't need as much Capital because to
00:39:58
your point Chamath when they do hit
00:40:01
profitability with 10 or 20 people the
00:40:03
revenue per employee is going way up if
00:40:07
you look at Google Uber Airbnb and
00:40:10
Facebook Meta they have the same number
00:40:12
or fewer employees than they did three
00:40:14
years ago but they're all growing in
00:40:15
that 20 to 30% a year which means in about
00:40:19
two to three years each of those
00:40:21
companies has doubled Revenue per
00:40:22
employee so that concept of more
00:40:25
efficiency and then that trickles down
00:40:28
Sachs to the startup investing space
00:40:29
where you and I are I'm a pre-seed
00:40:31
investor you're a seed and Series A investor
00:40:34
if you don't get in in those three four
00:40:35
rounds I think it's going to be really
00:40:37
expensive and the companies are not
00:40:38
going to need as much money Downstream
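The doubling claim above is just compounding: with headcount flat, revenue per employee grows at the same rate as revenue. A quick check using rates from the range quoted (the function name is my own) shows 30% annual growth roughly doubles it in three years, while 20% takes closer to four:

```python
def revenue_per_employee_multiple(growth_rate, years):
    """With flat headcount, revenue per employee compounds at the
    same annual rate as revenue itself."""
    return (1 + growth_rate) ** years

print(revenue_per_employee_multiple(0.30, 3))  # 1.3^3 ≈ 2.20
print(revenue_per_employee_multiple(0.20, 3))  # 1.2^3 ≈ 1.73
print(revenue_per_employee_multiple(0.20, 4))  # 1.2^4 ≈ 2.07
```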
00:40:41
speaking of of investing in late stage
00:40:43
companies we never closed the loop on the
00:40:45
whole open AI thing what did we think of
00:40:48
the fact that they're completely
00:40:51
changing the structure of this company
00:40:52
they're changing it into a corporation
00:40:53
from the
00:40:54
nonprofit and Sam's now getting a $10
00:40:57
billion stock
00:40:58
package he's not in for the money he has
00:41:01
health insurance
00:41:03
Sacks but he told Congress I don't
00:41:06
need money I have enough money I just needed
00:41:08
the health insurance pull the clip up
00:41:10
Nick pull the clip up I mean it's the
00:41:11
funniest clip
00:41:13
ever
00:41:14
no he said it in Congress watch this
00:41:17
what he said in
00:41:18
Congress you make a lot of money do you
00:41:21
I make enough to pay for health
00:41:23
insurance I have no equity in OpenAI
00:41:24
really that's interesting you need a
00:41:26
large
00:41:28
I need a what you need a lawyer or an
00:41:30
agent I I'm doing this cuz I love
00:41:33
it thank you it's the greatest look at
00:41:36
me don't believe him can I ask
00:41:38
you a question there Sacks are you doing
00:41:41
this Venture Capital where you put the
00:41:44
money in the startups cuz you love it or
00:41:48
because you're looking to get another home
00:41:50
in a cool city and put more jet fuel in
00:41:53
that plane I need an answer for the
00:41:54
people of the sovereign state of
00:41:56
Mississippi no Louisiana that's John
00:41:59
Kennedy from
00:42:00
Louisiana he's a very smart guy actually
00:42:03
with a lot of you know sort of common
00:42:07
folk wisdom he got that simple talk
00:42:10
simple talk we love straight shooters
00:42:12
here actually yeah he's very funny but
00:42:16
he's very funny if you listen to him he
00:42:19
knows how to slice and dice you might
00:42:21
need to get yourself uh one of them
00:42:23
fancy Agents from Hollywood or an
00:42:26
attorney from the Wilson Sonsini
00:42:28
corporation to renegotiate your contract
00:42:31
son because you're worth a lot more from
00:42:32
what I can gather in your performance
00:42:34
today than just some simple healthcare
00:42:37
and uh I hope you took the Blue Cross
00:42:39
Blue Shield I would like to make two
00:42:42
semi-serious observations let's go you
00:42:44
please get us back on track I think the
00:42:46
first is that there's going to be a lot
00:42:48
of people that are looking at the
00:42:49
architecture of this conversion because
00:42:52
if it passes muster everybody should do
00:42:54
it think about this model let's just
00:42:57
that you're in a market and you start as
00:42:59
a
00:43:00
nonprofit what that really means is you
00:43:02
pay no income tax so for a long time you
00:43:07
put out a little bit of the percentage
00:43:09
of whatever you
00:43:11
earn but you can now outspend and
00:43:14
outcompete all of your competitors and then
00:43:17
once you win you flip to a corporation
00:43:21
that's a great hack on the tax code and
00:43:24
you let the donors get first bite of
00:43:26
the Apple if you do convert because
00:43:29
remember Vinod and Hoffman got all their
00:43:31
shares on the conversion the other way
00:43:33
will also work as well because
00:43:35
there's nothing that says you
00:43:36
can't go in the other direction so let's
00:43:38
assume that you're already a for-profit
00:43:41
company but you're in a space with a
00:43:42
bunch of competitors can't you just do
00:43:45
this conversion in Reverse become a
00:43:48
nonprofit again you pay no income tax so
00:43:51
now you are economically advantaged
00:43:54
relative to your competitors and then
00:43:57
when they wither and die or you can
00:43:58
outspend them you flip back to a
00:44:01
for-profit again I think the the point
00:44:04
is that there's a lot of people that are
00:44:07
going to watch this closely
00:44:10
and if it's legal and it's allowed I I
00:44:13
just don't understand why everybody
00:44:15
wouldn't do this yeah I mean that was
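The quantitative core of the arbitrage described above is that a nonprofit redeploys pre-tax surplus while a taxed competitor redeploys post-tax surplus, so the spending gap compounds. A toy model, with the growth rate and the 21% corporate rate chosen purely for illustration:

```python
def reinvestable_capital(profit_growth, tax_rate, years, start=1.0):
    """Capital available to redeploy when each year's surplus is
    taxed at tax_rate before it can be reinvested."""
    capital = start
    for _ in range(years):
        capital *= 1 + profit_growth * (1 - tax_rate)
    return capital

years = 10
nonprofit = reinvestable_capital(0.25, 0.00, years)  # pays no income tax
taxed     = reinvestable_capital(0.25, 0.21, years)  # 21% corporate rate
print(nonprofit / taxed)  # the compounding edge after a decade
```

In this toy setup the untaxed entity ends the decade with roughly 1.5x the reinvestable capital, which is the "outspend and outcompete" mechanism in miniature.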
00:44:17
Elon's point as well and I mean the
00:44:19
second thing which is just more of
00:44:22
like a cultural observation is and you
00:44:26
brought up Elon my comment to you guys
00:44:28
yesterday and I'll just make the comment
00:44:30
today
00:44:31
it's a little bit disheartening to see a
00:44:34
situation where Elon built something
00:44:37
absolutely incredible defied every
00:44:40
expectation and then had the justice
00:44:44
system take $55 billion away from him
00:44:49
his pay package you're referring to
00:44:50
that pay package the options in Tesla
00:44:54
and then on the other side Sam's going
00:44:56
to pull something like this
00:44:58
off definitely pushing the boundaries
00:45:01
and he's going to make 10 billion and I
00:45:04
just think when you put those two things
00:45:06
in
00:45:07
contrast that's not how the system
00:45:10
should probably work I think is what
00:45:11
most people would say freeberg you've
00:45:13
been a little quiet here any thoughts on
00:45:15
the transaction the nonprofit to
00:45:17
for-profit if you were looking at that
00:45:19
in what you're doing do you see a way
00:45:21
that Ohalo could take on nonprofit status
00:45:25
raise a bunch of money through donations
00:45:27
for virtuous work then license those
00:45:29
patents to your for-profit would that
00:45:31
be advantageous to you and and do you
00:45:33
think this could become I have absolutely zero
00:45:35
idea what they're doing I
00:45:38
don't know how they're converting a
00:45:39
nonprofit to a for-profit none of us
00:45:40
have the details on this there's there
00:45:42
may be significant tax implications
00:45:44
payments they need to make I don't think
00:45:45
any of us know I certainly don't I don't
00:45:48
know if there's actually a real benefit
00:45:50
here if there is I'm sure everyone would
00:45:52
do it no one's doing it so there's
00:45:54
probably a reason why it's difficult I
00:45:56
don't know it's been done a couple times the
00:45:58
the Mozilla Foundation did it we talked
00:45:59
about that in our previous episode Sacks
00:46:01
you want to you want to wrap us up here
00:46:02
on the corporate structure any final
00:46:03
thoughts I mean Elon put in 50 million I
00:46:06
think he gets the same as Sam don't you
00:46:09
think he should just chip off 7% for
00:46:11
Elon and not that Elon needs the money
00:46:13
nor is he asking but I'm just wondering
00:46:15
why Elon doesn't get the 7%
00:46:18
you know he put in 50 did he put in $50
00:46:21
million he put in 50 million is the
00:46:23
report right in the nonprofit yeah
00:46:25
Hoffman put in 10 look I said on a
00:46:27
previous show that this organizational
00:46:30
chart of OpenAI was ridiculously
00:46:32
complicated and they should go clean it
00:46:34
up they should open up the books and
00:46:35
straighten everything out and I also said
00:46:38
that as part of that they could give Sam
00:46:40
Altman a CEO option grant and they should
00:46:42
also give Elon some Fair compensation
00:46:45
for being the seed investor who put in
00:46:47
the first $50 million and co-founder and
00:46:50
what you're seeing is well they're kind
00:46:52
of doing that they're opening up the
00:46:53
books they're straightening out the
00:46:56
corporate structure
00:46:57
they're giving Sam his option Grant but
00:46:59
they didn't do anything for Elon and I'm
00:47:02
not saying this as elon's friend I'm
00:47:04
just saying that it's not really fair to
00:47:07
basically go fix the original situation
00:47:10
you're making it into a for-profit
00:47:13
you're giving everyone shares but the
00:47:15
guy who put in the original seed capital
00:47:16
doesn't get anything that's ridiculous
00:47:20
and you know what they're basically
00:47:21
saying to Elon is if you don't like it
00:47:23
just sue us I mean that's basically what
00:47:26
they're doing and I said that they
00:47:28
should go clean this up but they should
00:47:30
make it right with everybody so how do
00:47:32
you not make it right with Elon I
00:47:34
haven't talked him about this but he
00:47:36
reacted on X saying this is really wrong
00:47:39
it appeared to be a surprise to him I
00:47:40
doubt he knew this was coming so the
00:47:42
company apparently made no effort to
00:47:45
make things right with him and I think
00:47:48
that that is a bit ridiculous if you're
00:47:50
going to clean this up if you're going
00:47:51
to change the original purpose of this
00:47:54
organization to being a standard
00:47:57
for-profit company where the CEO who
00:48:00
previously said he wasn't going to get
00:48:02
any compensation is now getting 10
00:48:04
billion of compensation how do you do
00:48:06
that and then not clean it up for the
00:48:08
co-founder who put in the first $50
00:48:10
million yeah that doesn't make sense to
00:48:12
me and you know when Reid was on our pod
00:48:14
he said well Elon's rich enough well
00:48:17
that that's not a principled excuse I
00:48:19
mean does Vinod ever act that way does
00:48:21
Reid ever act that way do they ever say
00:48:24
well you know you don't need to do
00:48:25
what's fair for me because I'm already
00:48:27
rich that's that's not a principled
00:48:29
answer the argument that I heard was
00:48:31
that Elon was given the opportunity to
00:48:33
invest along with Reid and Vinod and he
00:48:36
he declined to participate in the
00:48:38
for-profit investing side that everyone
00:48:41
else participated we made that argument
00:48:43
and I think it's the best argument the
00:48:44
company has but let's think about that
00:48:46
argument maybe Elon was busy that week
00:48:49
maybe Elon already felt like he had put
00:48:51
all the money that he had allocated for
00:48:53
something like this into it because he
00:48:54
put in a $50 million check whereas Reid put
00:48:57
in 10 we don't know what Elon was
00:49:00
thinking at that time maybe there was a
00:49:01
crisis at Tesla and he was just really
00:49:03
busy the point is Elon shouldn't have
00:49:05
been obligated to put in more money into
00:49:09
this Venture the fact of the matter is
00:49:11
they're refactoring the whole Venture
00:49:13
Elon had an expectation when he put in
00:49:15
the 50 million that this would be a
00:49:17
nonprofit and stay a nonprofit and
00:49:19
they're changing that and if they change
00:49:21
it they have to make things right with
00:49:22
him it doesn't really matter whether he
00:49:24
had a subsequent opportunity to
00:49:26
invest he wasn't obligated to make
00:49:29
that investment what he had an
00:49:31
expectation of is that his $50 million
00:49:34
would be used for a philanthropic purpose
00:49:36
and clearly it has not been yeah and in
00:49:39
fairness to Vinod he bought that
00:49:40
incredible beachfront property and
00:49:42
donated it to the public trust so we can
00:49:43
all Surf and have our Halloween party
00:49:45
there so it's all good thank you Vinod
00:49:47
for giving us that incredible Beach I
00:49:50
want to talk to you guys about
00:49:51
interfaces that came up Chamath in your
00:49:54
headwinds your four-pack of
00:49:56
reasons that you know OpenAI when you
00:49:59
steelman the bear case could have
00:50:01
challenges obviously we're seeing that
00:50:04
and it is emerging that meta is working
00:50:08
on some AR glasses that are really
00:50:10
impressive additionally I've installed
00:50:13
iOS 18 which includes Apple Intelligence and
00:50:16
works on iPhone 15 and 16 phones
00:50:19
did any of you install the
00:50:21
beta of iOS 18 yet and use Siri it's
00:50:24
pretty clear with this new one that
00:50:26
you're going to be able to talk to Siri
00:50:27
as an LLM like you do in ChatGPT mode
00:50:30
which I think means they will not make
00:50:33
themselves dependent on ChatGPT and
00:50:34
they will siphon off half the searches
00:50:37
that would have gone to ChatGPT so I
00:50:38
see that as a serious threat Siri is not very
00:50:40
good J-Cal and you know this because
00:50:42
when you were driving me to the airport
00:50:43
yesterday we tested it it didn't work
00:50:45
yes he tried to he
00:50:47
tries to execute this joke where he's
00:50:48
like hey Siri send Chamath Palihapitiya a
00:50:52
message and it was a very off-color
00:50:54
message I'm not gonna say what it is it was
00:50:55
a spicy joke and then it's like okay
00:50:57
great sending Linda blah blah blah he
00:51:00
like no stop like no don't send that
00:51:02
joke to her it hallucinates it almost
00:51:04
sends it to some other you know some
00:51:06
woman in his contacts it would have been
00:51:08
really damaging it's not very good Jason
00:51:10
it's not very good well but what I will
00:51:11
say is there are features of it where if
00:51:14
you squint a little bit you will see
00:51:17
that Siri is going to be conversational
00:51:20
so when I was talking to it with music
00:51:22
and you know you can have a
00:51:23
conversation with it and do math like
00:51:25
you can do with the ChatGPT version and you
00:51:28
have Microsoft doing that with their
00:51:30
Copilot and now Meta is doing it at the
00:51:33
top of each one so everybody's going to
00:51:35
try to intercept the queries and
00:51:37
the voice interface so ChatGPT is now
00:51:40
up against Meta Apple's Siri and Microsoft
00:51:44
for that interface it's going to be
00:51:46
challenging but let's talk about these
00:51:47
meta glasses here meta showed off the AR
00:51:50
glasses that Nick will pull up right now
00:51:53
these aren't goggles goggles look like
00:51:55
ski goggles that's what Apple is
00:51:57
doing with their Vision Pro uh or when
00:52:00
you see the meta Quest you know how
00:52:02
those work those are VR with cameras
00:52:05
that will create a version of the world
00:52:07
these are actual chunky sunglasses like
00:52:09
the ones I was wearing earlier when I
00:52:11
was doing the bit so these let you
00:52:14
operate in the real world and are
00:52:18
supposedly extremely expensive they made
00:52:20
a thousand prototypes they were letting
00:52:21
a bunch of influencers and folks like
00:52:26
Gary Vaynerchuk use them and they're
00:52:29
not ready for prime time but the way
00:52:31
they work Freeberg is there's a
00:52:32
wristband that will track your fingers
00:52:35
and your wrist movement so you could be
00:52:36
in a conversation like we are here on
00:52:38
the Pod and below the desk you could be
00:52:40
you know moving your arm and hand around
00:52:42
to be doing replies to I don't know
00:52:44
incoming messages or whatever it is what
00:52:47
do you think of this AR vision of the
00:52:49
world and meta making this progress well
00:52:52
I think it ties in a lot to the AI
00:52:56
discussion because I think we're
00:52:58
really witnessing this big shift
00:53:02
this big transition in computing
00:53:04
probably the biggest transition since
00:53:05
mobile you know we moved from
00:53:08
mainframes to desktop computers everyone had
00:53:11
kind of this computer on their desktop
00:53:12
but used a mouse and a keyboard to
00:53:13
control it to mobile where you had a
00:53:15
keyboard and clicking and touching on
00:53:17
the screen to do things on it and now to
00:53:19
what I would call this kind of ambient
00:53:21
Computing method and you know I think
00:53:25
the big difference is control and
00:53:27
response in directed Computing you're
00:53:30
kind of telling the computer what to do
00:53:32
you're controlling it you're using your
00:53:33
mouse or your keyboard to to go to this
00:53:36
website so you type in a website address
00:53:38
then you click on the thing that you
00:53:40
want to click on and you kind of keep
00:53:42
doing a series of work to get the
00:53:43
computer to go access the information
00:53:46
that you ultimately want to achieve your
00:53:47
objective but with ambient Computing you
00:53:50
can more kind of cleanly state your
00:53:52
objective without this kind of directive
00:53:54
process you can say hey
00:53:57
I want to have dinner in New York next
00:53:58
Thursday at a Michelin-star
00:54:00
restaurant at 5:30 book me something and
00:54:02
it's done and I think that there are
00:54:04
kind of five core things that are needed
00:54:06
for this to work um both in control and
00:54:09
response it's voice control gesture
00:54:12
control and eye control are kind of the
00:54:14
control pieces that replace you know
00:54:16
mice and clicking and touching and
00:54:18
keyboards and then response is audio and
00:54:21
kind of integrated visual which is the
00:54:23
idea of the goggles voice control works
00:54:25
have you guys used the open AI voice
00:54:27
control system lately I mean it is
00:54:29
really incredible I had my earphones in
00:54:31
and I was like doing this exercise I was
00:54:33
trying to learn something so I told open
00:54:35
AI to start quizzing me on this thing
00:54:38
and I just did a 30 minute walk and
00:54:39
while I was walking it was asking me
00:54:41
quiz questions and I would answer it
00:54:42
tell me I was right or wrong it was
00:54:43
really this incredible dialogue
00:54:45
experience so I think the voice control
00:54:46
is there I don't know if you guys have
00:54:48
used Apple Vision Pro but gesture
00:54:49
control is here today you can do single
00:54:52
finger movements with apple Vision Pro
00:54:54
it triggers actions and eye control is
00:54:56
incredible you look at the letters you
00:54:58
want to have kind of spelled out or you
00:54:59
look at the thing you want to activate
00:55:00
and it does it so all of the control
00:55:03
systems for this ambient Computing are
00:55:04
there and then the AI enables this kind
00:55:06
of audio response where it speaks to you
00:55:09
and the big breakthrough that's needed
00:55:11
that I don't think we're quite there yet
00:55:12
but maybe Zuck is highlighting that
00:55:14
we're almost there and apple Vision Pro
00:55:15
feels like it's almost there except it's
00:55:17
big and bulky and expensive is
00:55:19
integrated visual where the ambient
00:55:20
visual interface is always there and you
00:55:23
can kind of engage with it so there's
00:55:24
this big change I don't think that
00:55:26
mobile handsets are going to be around
00:55:27
in 10 years I don't think we're going to
00:55:29
have this like phone in our pocket that
00:55:30
we're like pressing buttons on and
00:55:32
touching and telling it where on the
00:55:34
browser to go to the browser interface
00:55:36
is going to go away I think so much of
00:55:38
how Computing is done how much how we
00:55:40
integrate with data in the world and how
00:55:42
the computer ultimately fetches that
00:55:44
data and does stuff with it for us is
00:55:46
going to completely change to this
00:55:47
ambient model so I'm um I'm pretty
00:55:50
excited about this Evolution but I think
00:55:52
that what we're seeing with Zuck what we
00:55:53
saw with apple Vision Pro and all of the
00:55:55
open AI demos they all kind of converge
00:55:58
on this very incredible shift in
00:56:00
Computing um that will kind of become
00:56:02
this ambient system that exists
00:56:04
everywhere all the time and I know folks
00:56:05
have kind of mentioned this in the past
00:56:07
but I think we're really seeing it kind
00:56:08
of all come together now with these five
00:56:10
key
00:56:11
things Chamath any thoughts on Facebook's
00:56:15
progress with ar and how that might
00:56:20
impact Computing and interfaces when
00:56:23
paired with language models I think
00:56:26
think David's right that there's
00:56:28
something that's going to be totally new
00:56:31
and
00:56:32
unexpected so I agree with that part of
00:56:36
what Friedberg says I am still not sure
00:56:40
that glasses are the perfect form factor
00:56:42
to be ubiquitous when you look at a
00:56:46
phone a phone makes complete sense for
00:56:50
literally everybody right man woman old
00:56:54
young every
00:56:58
race every country of the world it's
00:57:01
such a ubiquitously obvious form
00:57:06
factor but the thing is like that
00:57:08
initial form factor was so different
00:57:10
than what it replaced even if you looked
00:57:13
at like flip phones versus that first
00:57:14
generation iPhone so I I do think
00:57:18
Friedberg you're right that there's like
00:57:20
this new way of interacting that is
00:57:23
ready to explode onto the scene
00:57:26
and I think that these guys have done a
00:57:29
really good job with these glasses I
00:57:30
mean like I give them a lot of credit
00:57:32
for sticking with it and iterating
00:57:34
through it and getting it to this place
00:57:36
it looks meaningfully better than the
00:57:37
Vision Pro to be totally
00:57:40
honest but I'm still not convinced that
00:57:42
we've explored the best of our
00:57:45
creativity in terms of the devices that
00:57:47
we want to use with these AI models you
00:57:49
need some visual interface I think the
00:57:51
question is where is the visual
00:57:52
interface is it in the is in in the wall
00:57:55
but do I mean well when you're asking
00:57:57
like I want to watch chamath on Rogan
00:57:59
like I don't just want to hear I want to
00:58:01
see like when I want to visualize stuff
00:58:03
I want to visualize it I want to look at
00:58:05
the food I'm buying online I want to
00:58:06
look at pictures of the restaurant I'm
00:58:08
gonna go to but how much of that time when
00:58:10
you say those things are you not near
00:58:13
some screen that you can just project
00:58:15
and broadcast that on I mean maybe the
00:58:17
model if the use case is I'm walking in
00:58:20
the park and I need to watch TV at the
00:58:22
same time I don't think that's a real
00:58:24
use case I think you're on this one
00:58:25
wrong Chamath because I saw this
00:58:29
revolution in Japan maybe 20 years ago
00:58:32
they got obsessed with augmented reality
00:58:34
there were a ton of startups right as
00:58:36
they started getting into the mobile
00:58:37
phones and the use cases were really
00:58:40
very compelling and we're starting to
00:58:41
see them now in education and when
00:58:42
you're at dinner with a bunch of friends
00:58:45
how often does picking up your phone and
00:58:49
you know looking at a message disturb
00:58:50
the flow well people will have glasses
00:58:52
on they'll be going for walks they'll be
00:58:54
driving they'll be at a dinner party
00:58:55
they'll be with their kids and you'll
00:58:57
have something on like Focus mode you
00:58:59
know whatever the equivalent is in apple
00:59:01
and a message will come in from your
00:59:02
spouse or from your child but you won't
00:59:05
have to take your phone out of your
00:59:06
pocket and I think once these things
00:59:08
weigh a lot less you're going to have
00:59:10
four different ways to interact with
00:59:12
your computer in your pocket your phone
00:59:13
your watch your airpods whatever you
00:59:16
have in your ears and the glasses and I
00:59:17
bet you glasses are going to take like a
00:59:19
third of the tasks you do I mean what is
00:59:22
the point of taking out your phone and
00:59:23
watching The Uber come to you but seeing
00:59:25
that little strip that tells you the
00:59:27
Uber is 20 minutes away 15 minutes away
00:59:29
or what the gate number is I don't have
00:59:31
that anxiety well I don't know if it's
00:59:33
anxiety but I just think it's ease of
00:59:35
use all those 15 minutes 10 minutes
00:59:37
that's that's the definition I think it
00:59:38
adds up I think taking your phone out
00:59:40
of your pocket 50 times those are all
00:59:43
useless notifications the whole thing is
00:59:45
to train yourself to realize that it'll
00:59:47
come when it comes okay Sacks do you have
00:59:48
any thoughts on this uh impressive demo
00:59:51
or you know the demo that people who've
00:59:52
seen have said is pretty pretty darn
00:59:54
compelling I think it it does look
00:59:56
pretty impressive I mean you can wear
00:59:58
these meta Orion glasses
01:00:01
around you know and look and look like a
01:00:03
human I mean you might look like Eugene
01:00:05
Levy but you'll still look like a
01:00:07
semi-normal person whereas you can't
01:00:10
wear the Apple Vision Pro I mean you
01:00:12
can't wear that around what they don't
01:00:13
look good you don't like them Nick can
01:00:16
you please find a picture of Eugene
01:00:19
Levy so I mean it seems like a major
01:00:23
advancement certainly compared to apple
01:00:24
Vision Pro I mean you don't hear you
01:00:26
don't hear about the Apple Vision Pros
01:00:28
anymore at all I mean those things came
01:00:30
and
01:00:31
[Laughter]
01:00:36
went it's pretty funny it seems to me
01:00:39
that who's that meta is executing
01:00:42
extremely well I mean you had the very
01:00:44
successful cost cutting which Wall
01:00:45
Street
01:00:46
liked Zuck published that letter which I
01:00:49
give him credit for regretting the
01:00:50
censorship that meta did which was at
01:00:52
the behest of the deep State yeah
01:00:56
they made huge advancements in AI I
01:00:58
don't think they were initially on The
01:00:59
Cutting Edge of that but they've caught
01:01:01
up and now they're leading the open
01:01:03
source yeah with Llama 3.2 and now
01:01:06
it seems to me that they're ahead on
01:01:08
augmented reality ever since uh Zuck
01:01:11
grew out the hair yeah you know don't
01:01:13
ever don't ever cut the hair it's like
01:01:14
Samson I mean you've been it's like
01:01:16
Samson based Zuck is the best
01:01:20
Zuck I want to be clear I I think these
01:01:23
glasses are are going to be successful
01:01:25
my only comment is that I think that
01:01:28
when you look back 25 and 30 years from
01:01:30
now and say that was the killer AI
01:01:33
device I don't think it's going to look
01:01:35
like something we know today that's my
01:01:38
only point and maybe it's going to be
01:01:40
this thing that Sam Altman and Jony Ive
01:01:42
are baking up that's supposed to be this
01:01:45
AI infused iPhone Killer maybe it's that
01:01:48
thing I doubt that will be a pair of
01:01:51
glasses or a phone or a pin if you think
01:01:55
about like so so take the constraints on
01:01:57
I don't need a keyboard because I'm not
01:02:00
going to be typing stuff I don't need a
01:02:02
normal browser interface you could see a
01:02:05
device come out that's almost like
01:02:08
smaller than the palm of your hand that
01:02:10
gives you enough of the visuals and all
01:02:11
it is is a screen with maybe two buttons
01:02:13
on the side and it's all audio driven
01:02:15
you put a headset in and you're
01:02:17
basically just talking or using gesture
01:02:20
or looking at it to kind of describe
01:02:21
where you want things to go and it can
01:02:22
create an entirely new Computing
01:02:24
interface because AI does all of these
01:02:26
things with text with gesture control
01:02:29
with eye control and with audio control
01:02:32
and then it can just give you what you
01:02:33
want on a screen and all you're getting
01:02:34
is a simple interface so jamat you may
01:02:36
be right it might be a big watch or a
01:02:38
handheld thing that's much smaller than
01:02:39
an iPhone and just all it is is a screen
01:02:42
with nothing I really resonate when you
01:02:45
talk about voice only because I
01:02:47
think it's like it you I think there's
01:02:50
like a part of like social
01:02:52
decorum that all of these goggles and
01:02:55
glasses
01:02:57
violate and I think we're going to have
01:02:58
to decide as a society whether that's
01:03:00
going to be okay and then I think like
01:03:03
are when you like go trekking in Nepal
01:03:06
are you going to encounter somebody
01:03:08
wearing AR glasses I think the odds are
01:03:10
pretty low but you do see people today
01:03:12
with a phone so what do they replace it
01:03:14
with and I think voice as a modality is
01:03:18
is I think it's more incredible that
01:03:20
that could be used by 8 billion people I
01:03:22
think social Fabric's more affected by
01:03:24
people staring at their phones all the
01:03:25
time you sit on a bus you sit at a
01:03:26
restaurant you go to dinner with someone
01:03:27
and they're staring at their phone like
01:03:29
you know spouses friends we all deal with
01:03:31
it where you feel like you're not
01:03:33
getting attention from the person that
01:03:35
you're interfacing with in the real
01:03:36
world because they're so connected to
01:03:38
the phone if we can disconnect the phone
01:03:40
but still get take away this kind of
01:03:42
addictive feedback loop system but still
01:03:44
give you this Computing ability in a
01:03:46
more ambient way that allows you to
01:03:48
remain engaged in the physical world I
01:03:49
think everyone will feel better does it hurt your
01:03:51
feeling when he's playing chess and not
01:03:53
paying attention yeah I'll be playing
01:03:54
chess on my AR glasses while pretending
01:03:57
to listen to you you
01:03:59
idiot oh Friedberg's bought them he got
01:04:03
version one what one point I want to
01:04:05
just hit on is the reason why these
01:04:07
glasses have a chance of working is
01:04:09
because of AI I mean Facebook initially
01:04:12
made these that's exactly my point it's
01:04:14
exactly my point Facebook or meta made
01:04:16
these huge investments before AI was a
01:04:18
thing and in a way I think they've kind
01:04:20
of gotten lucky because what AI gives
01:04:22
you is uh voice and audio so you can
01:04:26
talk to the glasses or whatever the
01:04:27
wearable is it can talk to you perfect
01:04:30
natural language and computer vision
01:04:33
allows it to understand the world around
01:04:35
you so whatever this device is it can be
01:04:37
a true personal digital assistant in the
01:04:40
real world and that's telling you if you
01:04:42
guys play with apple Vision Pro have any
01:04:44
of you actually like used it to any
01:04:46
extent if you actually used it I used it
01:04:49
for a day or a night when we were
01:04:51
playing poker and I've never used it
01:04:54
again since right which I get but I do
01:04:56
think that it has these tools in it
01:04:58
similar to like the original Macintosh
01:05:00
had these incredible Graphics uh editors
01:05:02
like MacPaint and all these things that
01:05:04
like people didn't like get addicted to
01:05:06
at the time but they became this like
01:05:09
tool that completely revolutionized
01:05:11
everything in Computing later and fonts
01:05:14
and so on but like this I think has
01:05:16
these tools Apple Vision Pro with
01:05:18
gesture control and the keyboard and the
01:05:21
eye control those aspects of that device
01:05:24
highlight where this could all go which
01:05:26
is this these systems can kind of be
01:05:28
driven without keyboards without typing
01:05:31
without like you know moving your finger
01:05:33
around without clicking I think that's the I
01:05:35
think that's the key observation I
01:05:36
really agree with what you just said
01:05:38
it's this it's this idea that you're
01:05:40
just you're liberated from the hunting
01:05:43
and pecking the controlling it's you
01:05:45
don't need to control the computer
01:05:47
anymore the computer now knows what you
01:05:49
want and then the computer can just go
01:05:51
and do the work and they can respond so
01:05:53
now this is the behavior change that I
01:05:55
don't think we're fully giving enough
01:05:57
credit to so today part of what Jason
01:06:00
talked about what I called anxiety is
01:06:02
because of the information architecture
01:06:04
of these apps that is totally broken and
01:06:06
the reason why it's broken is when you
01:06:08
tell an AI agent get me the cheapest car
01:06:11
right now to go to XYZ place it will go
01:06:15
and look at Lyft and Uber and whatever
01:06:17
it'll provision the car and then it'll
01:06:19
just tell you when it's coming and it
01:06:21
will break this cycle that people have
01:06:24
of having to check these apps for what
01:06:26
is useless filler information and when
01:06:28
you strip a lot of that notification
01:06:30
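The agent flow Chamath describes, compare providers, provision the cheapest car, and then surface only the one fact the rider needs, can be sketched in a few lines. The provider quotes are stubbed and every function name here is a hypothetical illustration, not a real Uber or Lyft API:

```python
# Hypothetical ride-agent sketch: the agent checks each provider, books the
# cheapest option, and reports a single notification (the ETA) instead of a
# stream of in-app status updates the user has to keep checking.

def get_quotes() -> dict[str, dict]:
    # Stand-in for querying each provider; prices and ETAs are made up.
    return {
        "uber": {"price": 23.50, "eta_min": 7},
        "lyft": {"price": 19.75, "eta_min": 9},
    }

def book_cheapest_ride() -> str:
    quotes = get_quotes()
    # Pick the provider with the lowest quoted price.
    provider = min(quotes, key=lambda p: quotes[p]["price"])
    eta = quotes[provider]["eta_min"]
    # The agent provisions the car, then tells the user only what matters.
    return f"car booked via {provider}, arriving in {eta} min"

print(book_cheapest_ride())
```

The design point is the last line: one message replaces the "useless filler" notification traffic, which is exactly the behavior change being argued for.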
traffic away I think you'll find that
01:06:32
people start looking at each other in
01:06:34
the face more often and I think that
01:06:36
that's a net positive so will will meta
01:06:39
sell hundreds of millions of these
01:06:41
things I suspect probably but all I'm
01:06:44
saying is if you look backwards 30 years
01:06:47
from now what is the device that sells
01:06:49
in the
01:06:50
billions it's probably not a form factor
01:06:52
that we understand today I just want to
01:06:54
point out like the the form factor
01:06:56
you're seeing now is going to get
01:06:58
greatly reduced these were um some of
01:07:00
the early Apple um I don't know if you
01:07:02
guys remember these but frog design made
01:07:05
these crazy tablets in the
01:07:08
80s that were the eventual inspiration
01:07:10
for the iPad you know 25 years later I
01:07:14
guess exactly uh and so that's the
01:07:16
journey we're on here right now this
01:07:18
clunky and and these are not functional
01:07:20
prototypes dude the Apple Newton is like and
01:07:24
then it turns out hey you throw away the
01:07:25
stylus and you got an iPhone right and
01:07:28
everything gets a million x better the
01:07:29
other subtle thing that's happening
01:07:31
which I don't think we should sleep on
01:07:32
is that the airpods are probably going
01:07:36
to become much more socially acceptable
01:07:39
to wear on a 24x7 basis because of this
01:07:41
feature that allows it to become a
01:07:43
useful hearing aid and I think as it
01:07:46
starts being worn in more and more
01:07:48
social environments and as the form
01:07:51
factor of that shrinks that's when I
01:07:53
really do think we're going to find some
01:07:55
very novel use case which
01:07:57
is you know very unobtrusive it kind of
01:08:00
Blends into your own
01:08:02
physical makeup as a person without it
01:08:04
really sticking out I think that's when
01:08:06
you'll have a really killer feature but
01:08:08
I think that the the airpods as hearing
01:08:10
aids will also add a lot so meta's doing
01:08:13
a lot Apple's doing a lot but I don't
01:08:15
think we've yet seen the super killer
01:08:17
Hardware device yet and there was an
01:08:19
interesting Waypoint Microsoft had the
01:08:22
first tablets here's the the Microsoft
01:08:24
tablet for those of you watching
01:08:26
that came you know I don't know this was
01:08:27
the late 90s or early 2000s Friedberg if
01:08:30
you remember it these like incredibly oh
01:08:33
bulky tablets that Bill Gates was
01:08:35
bringing to all the events 99 2000 that
01:08:39
era so you get a lot of false starts
01:08:41
they're spending I think close to 20
01:08:43
billion dollars a year on this AR/VR
01:08:45
anyway we're definitely on this path to
01:08:47
ambient Computing I don't think I don't
01:08:48
think this whole like hey you got to
01:08:49
control a computer thing is anything my
01:08:51
kids are going to be doing in 20 years
01:08:52
this is uh this is the convergence of
01:08:54
like three or four really interesting
01:08:56
technological waves all right just uh
01:08:58
dovetailing with tech jobs and the
01:09:00
static team size there is a report of a
01:09:03
blue collar Boom the tool belt
01:09:06
generation is what gen Z is being
01:09:09
referred to as a report in the Wall
01:09:11
Street Journal reports hey tech jobs
01:09:13
have dried up we all we're all seeing
01:09:15
that and according to indeed Developer
01:09:17
jobs down more than 30% since February
01:09:19
of 2020 pre-COVID of course if you look at
01:09:23
layoffs.fyi you'll see all
01:09:26
you know tech jobs that have been
01:09:27
eliminated since 2022 over a half
01:09:30
million of them bunch of things at play
01:09:32
here and the Wall Street Journal notes
01:09:35
that entry level Tech workers are
01:09:37
getting hit the hardest especially all
01:09:40
these recent college graduates and if
01:09:42
you look at a historical College
01:09:45
enrollment let's pull up that chart Nick
01:09:47
you can see here undergraduate graduate
01:09:48
and total with the red line we peaked at
01:09:51
21 million people in either graduate
01:09:54
school or under graduate in 2010 and
01:09:57
that's come down to 18.6 million at the
01:09:59
same time obviously in the last 12 years
01:10:01
you've had uh the population has grown
01:10:03
so this is even you know if it was a
01:10:05
percentage basis would be even more
01:10:08
dramatic so what's behind this a poll of
01:10:11
a thousand teens this summer found that about
01:10:13
half believe a high school degree trade
01:10:15
program or two-year degree best meets
01:10:18
their career needs and 56% said Real
01:10:20
World on the job experience is more
01:10:22
valuable than obtaining a college degree
01:10:24
something you've talked about with your
01:10:26
own personal experience Chamath at Waterloo
01:10:28
doing apprenticeships essentially your
01:10:31
thoughts on generation tool belt such a
01:10:33
positive trend I mean there's so many
01:10:37
reasons why this is good I'll I'll just
01:10:39
list a handful that come to the top of
01:10:41
my mind the first and probably the most
01:10:43
important is that it breaks this
01:10:46
stranglehold that the university
01:10:49
education system has on America's kids
01:10:54
we have
01:10:56
tricked millions and millions of people
01:11:00
into getting trillions of dollars in
01:11:02
debt on this idea that you're learning
01:11:04
something in University that's somehow
01:11:06
going to give you
01:11:09
economic stability and ideally freedom
01:11:13
and it has turned out for so many people
01:11:15
to not be true it's just so absurd and
01:11:18
unfair that that has happened so if you
01:11:21
can go and get a trade degree and live an
01:11:24
economically productive life where you
01:11:26
can get married and have kids and take
01:11:28
care of your family and do all the
01:11:29
things you want to do that's going to
01:11:30
put an enormous amount of pressure on
01:11:33
higher ed why does it charge so much
01:11:36
what does it give in return that's one
01:11:38
thought the second thought which is much
01:11:40
more narrow Peter Thiel has that famous
01:11:42
saying where if you have to put the word
01:11:45
science behind it it's not really a
01:11:47
thing and what we are going to find out
01:11:50
is that that was true for a whole bunch
01:11:52
of things where people went to school
01:11:54
like political science
01:11:55
social science yeah social science but I
01:11:58
always thought that computer science
01:12:00
would be immune but I think he's going
01:12:02
to be right about that too because you
01:12:04
can spend two or $300,000 getting in
01:12:07
debt to get a computer science degree
01:12:09
but you're probably better off learning
01:12:10
JavaScript and learning these tools in
01:12:12
some kind of a boot camp for far far
01:12:14
less and graduating in a position to
01:12:16
make money right away so those are just
01:12:18
two ideas I think that it allows us to
01:12:21
be a better functioning Society so I am
01:12:26
really supportive of this trend Sacks your
01:12:28
thoughts on this generation tool belt
01:12:30
we're we're reading about and you know
01:12:33
the sort of combination with static team
01:12:36
size that we're seeing in technology
01:12:38
companies keeping the number of
01:12:40
employees the same or trending down
01:12:42
while they grow 30% year-over-year
01:12:44
oh my God I'm like so sick of this this
01:12:46
topic of of job loss or job disruption
01:12:50
you I got in so much trouble last week
01:12:51
you asked a question about whether the
01:12:53
upper middle class is going to suffer
01:12:55
because they're all going to be put out
01:12:56
of work by Ai and I just kind of brush
01:12:58
it off not because I'm advocating for
01:13:01
that but just because I don't think it's
01:13:02
going to happen this whole thing about
01:13:04
job loss is so overdone there's going to
01:13:07
be a lot of job disruption but in the
01:13:09
case of coders just as an example I
01:13:12
think we can say that coders depending
01:13:14
on who you talk to are 10 20 30% more
01:13:17
productive as a result of these coding
01:13:19
assistant tools but we still need coders
01:13:22
you can't automate 100% of it and the
01:13:24
world needs needs so many of them the
01:13:26
need for software is unlimited we can't
01:13:29
hire enough of them at Glue by the way
01:13:31
shout out if you're looking if you're a
01:13:33
coder who is afraid of not being able to
01:13:34
get a job apply for one at Glue believe
01:13:36
me we're hiring I just think that this
01:13:38
is so overdone there's going to be a lot
01:13:41
of disruption in the knowledge worker
01:13:44
space like we talked about the workflow
01:13:46
at call centers and to service factories
01:13:50
there's going to be a lot of change but
01:13:52
at the end of the day I think there's
01:13:54
going to be plenty of work for humans to
01:13:56
do and some of the work will be more in
01:13:58
the blue collar space and I agree with
01:14:00
jamath that this is a good thing I think
01:14:02
there's been perhaps an overemphasis on
01:14:05
the idea that the only way to get ahead
01:14:07
in life is to get uh like a fancy degree
01:14:10
from one of these universities and we've
01:14:12
seen that many of the universities
01:14:14
they're just not that great they're
01:14:16
overpriced you end up graduating with a
01:14:18
mountain of debt and you get a degree
01:14:20
that is you know maybe even far worse
01:14:23
than computer science that's just
01:14:24
completely worthless so if people learn
01:14:27
more vocational skills if they skip
01:14:30
College because they have a proclivity
01:14:32
to do something that doesn't need that
01:14:33
degree I think that's a good thing and
01:14:35
that's healthy for the economy freedberg
01:14:36
is this like uh just the pendulum swung
01:14:38
too much and education got too expensive
01:14:40
spending 200k to make $50,000 a year
01:14:43
distinctly different than our childhoods
01:14:45
or I'm sorry our adolescence when we
01:14:46
were able to go to college for 10K a
01:14:48
year 20K a year graduate with you know
01:14:51
some low tens of thousands in debt if
01:14:53
you did take debt and then you're your
01:14:55
entry level job was 50 60 70K coming
01:14:57
out of college what are your thoughts
01:14:58
here is this a value issue with
01:15:00
college well yeah I think the Market's
01:15:02
definitely correcting itself I think for
01:15:04
years as chamat said there was kind of
01:15:05
this belief that if you went to college
01:15:07
there was regardless of the college
01:15:10
there was this outcome where you would
01:15:12
make enough money to justify the debt
01:15:15
you're taking
01:15:16
on and I think folks have woken up to
01:15:19
the fact that that's not reality again
01:15:22
if there was a free market remember most
01:15:24
people go to college with student loans
01:15:27
and all student loans are funded by the
01:15:29
federal government so the the cost of
01:15:32
education has ballooned and the
01:15:34
underwriting criteria necessary for this
01:15:36
free market to work has been completely
01:15:39
destroyed because of the federal
01:15:41
spending in the student loan program
01:15:44
there's no discrimination between
01:15:46
one school or another you could go to
01:15:48
Trump University or you could go to
01:15:51
Harvard it doesn't matter you still get
01:15:52
a student loan even if at the end of the
01:15:55
process you don't have a a degree that's
01:15:57
valuable and so I think folks are now
01:15:59
waking up to this fact and the market is
01:16:01
correcting itself which is good I'll
01:16:03
also say that I think that there's this
01:16:06
premium with generally mass production
01:16:10
and
01:16:11
industrialization of the human touch and
01:16:15
what I mean is if you think about hey
01:16:18
you could go to the store and buy a
01:16:20
bunch of cheap food off of the store
01:16:22
shelves you could buy a bunch of
01:16:23
Hershey's chocolate bars
01:16:25
or you can go to a Swiss chocolatier in
01:16:27
downtown San Francisco pay $20 for a box
01:16:31
of Handmade Chocolates you'll pay that
01:16:32
premium for that better product same
01:16:35
with clothes there's this big Trend in
01:16:36
kind of handmade clothes and high-end
01:16:38
luxury goods bespoke artisanal
01:16:43
handmade and and similarly I think that
01:16:46
there's a premium in Human Service in
01:16:50
the partnership with a human it's not
01:16:51
just about blue collar jobs it's about
01:16:54
having a human
01:16:55
talk to you and serve you if you go to a
01:16:58
restaurant instead of having a machine
01:17:00
spit out the food to you there's an
01:17:02
experience associated with that that
01:17:04
you'll pay a premium for there's
01:17:06
hundreds and hundreds of micro breweries
01:17:08
in the United States that in aggregate
01:17:09
out sell Budweiser and Miller and even
01:17:12
Modelo today and that's because they're
01:17:14
handcrafted by local people and there's
01:17:16
a there's an artisan craftsmanship so
01:17:19
while technology and AI are going to
01:17:21
completely reduce the cost of a lot of
01:17:24
things and increase the production and
01:17:26
productivity of those things one of the
01:17:29
complementary consequences of that is
01:17:31
that there will be an emerging premium
01:17:33
for Human Service and I think that there
01:17:35
will be an absolute burgeoning and
01:17:37
blossoming in the salaries and the
01:17:40
availability and demand for Human
01:17:42
Service in a lot of walks of life
01:17:44
certainly there's all the work at home
01:17:46
the electricians and the plumbers and so
01:17:48
on but also Fitness classes
01:17:51
food personal service around tutoring
01:17:54
and learning and developing oneself
01:17:56
there's going to be an incredible
01:17:57
blossoming I think in human service jobs
01:17:58
and they don't need to have a degree in
01:18:00
poai to be performed I think that there
01:18:03
will be a lot of people that would be
01:18:04
very happy in that world how do you see
01:18:05
the differentiation the person makes
01:18:08
Friedberg in doing that job versus the
01:18:10
agent or the AI or whatever well these
01:18:12
are in-person human jobs so if I want to do
01:18:15
a fitness class do I want to stare at
01:18:16
the Tonal or this is what
01:18:18
I'm asking you yeah like what I think I
01:18:21
I think that there's an aspect of um it
01:18:23
look it's like your Loro Piana like you you
01:18:26
talk about the story of Loro Piana
01:18:29
where is the wool coming from how's it made
01:18:31
Who's involved in it like yes look God
01:18:36
go don't stop Friedberg I could give you
01:18:39
truffle flavoring out of a can but you
01:18:41
you love the white truffles you want to
01:18:42
go to Italy you want the storytelling
01:18:44
there's an aspect of it right like yeah
01:18:46
and I think that there's an aspect of
01:18:48
humanity that we pay a premium that we
01:18:51
do and will look Etsy crushes I don't
01:18:52
know how much stuff you guys buy an Etsy
01:18:54
I love buying from Etsy I love finding
01:18:56
handmade stuff on Etsy for no you don't
01:18:59
do you really yes I do yeah handcrafted
01:19:01
I yeah handmade so I think that there's
01:19:03
an aspect of this that um that in a lot
01:19:06
of walks of life I mean I have so many
01:19:07
jokes right now
01:19:09
I've never used that site but
01:19:12
I'm going to try it now after this have
01:19:14
you guys taken music lessons lately you
01:19:16
know I start my kids do piano lessons
01:19:17
and so last year I started ducking in to
01:19:19
do a 45-minute piano lesson with the
01:19:21
piano teacher there's just like a great
01:19:23
aspect to paying for for these services
01:19:25
to getting fascinating oh here we
01:19:29
go you can play the harmonica really
01:19:32
I've just playing some I want to play
01:19:34
some Zack Bryan songs and he's got a
01:19:36
couple songs I like with a harmonica in
01:19:37
them so I just got harmonica my daughter
01:19:39
and I have been playing harmonica yeah
01:19:42
are you teaching yourself let's hear it
01:19:43
let's hear it let's hear it I'll play it
01:19:45
next week I'm deep in the laboratory not
01:19:48
a bit it could be a bit but could be a
01:19:50
I'll write I'll write your song next
01:19:52
week the uh a little shy he's a little
01:19:54
shy no no I'll do I'll I'll write a
01:19:56
trump song for you I'll do the uh the
01:19:58
the trials and tribulations of Donald
01:20:01
Trump and I'll I'll do a little Bob
01:20:02
Dylan send up song
01:20:05
for you see that interview with um with
01:20:08
Bob Dylan I don't know when it was
01:20:10
recently about how andc that clip
01:20:15
Bradley the amazing Ed Bradley clip about magic
01:20:18
yeah some of those songs I don't know
01:20:20
how I wrote them they just came out and
01:20:25
can you still do that anymore he's like no but I did
01:20:27
it once no but I did it once and what an
01:20:30
incredible that means something yeah
01:20:34
eclipes ground it's really grounding you
01:20:37
understand too soon there is no chance
01:20:39
of ding yeah that's an incredible
01:20:42
clip guys want to wrap or you want to
01:20:43
keep talking about more stuff we we're
01:20:45
at 90 minutes let me just say tell you
01:20:47
something I think there's going to be a big
01:20:48
war I think by the time the show airs
01:20:51
Israel's incursion into Lebanon is going
01:20:52
to get bigger it's going to escalate
01:20:55
and by next week we could be in a
01:20:57
full-blown multinational war in the
01:21:00
Middle East and if I am you know a
01:21:04
betting man I would bet that the odds
01:21:05
are you know more than 30 40% that this
01:21:08
happens before the election that this
01:21:10
this conflict in the Middle East
01:21:11
escalates thank you for bringing this up
01:21:14
I
01:21:15
am not asking anybody to go listen to
01:21:18
my interview with Rogan uh but I
01:21:22
will say this part of why I was so
01:21:27
excited to go and talk to him in a long
01:21:29
form format was this issue of war is I
01:21:33
think the existential crisis of this
01:21:37
election and of this moment and I really
01:21:40
do agree with you freeberg there is a
01:21:43
non-trivially high probability the
01:21:45
highest it's ever been that we are just
01:21:48
bumbling and sleepwalking into a really
01:21:51
bad situation we can't walk back from I
01:21:54
really hope you're wrong and here's the
01:21:57
here's the situation I really hope
01:21:59
you're wrong if Israel incurs further in
01:22:01
into Lebanon going after Hezbollah and
01:22:05
Iran ends up getting involved in a more
01:22:08
active way does Russia start to provide
01:22:11
supplies to Iran like we are supplying
01:22:14
to Ukraine today does this sort of bring
01:22:17
everyone to a line just to give you a
01:22:19
sense you know of the scale of what
01:22:21
Israel could then respond with Iran has
01:22:24
600,000 active duty military another
01:22:26
350,000 in reserve they have dozens of
01:22:29
ships they have 19 submarines they have
01:22:31
a 600 km range missile system Israel has
01:22:35
90,000 has sorry 170,000 active duty and
01:22:38
half a million Reserve Personnel 15
01:22:39
warships five submarines potentially up
01:22:42
to 400 nuclear weapons including a very
01:22:45
wide range of tactical sub-one-kiloton
01:22:48
nuclear weapons small small payload you
01:22:51
could see that if Israel starts to feel
01:22:55
incurred upon further they could respond
01:22:59
in a more aggressive way with what is
01:23:01
you know by far and
01:23:03
away you know the the most significantly
01:23:06
stocked arsenal and military force in the
01:23:09
Middle East again we've talked about
01:23:12
what are these other countries going to
01:23:13
do what is Jordan going to do in this
01:23:15
situation how are the Saudis going to
01:23:18
respond what is Russia going to do
01:23:20
Russia well the Russia Ukraine thing
01:23:22
meanwhile still goes on and we saw in
01:23:24
our group chat one of our friends posted
01:23:27
that Russia basically said any more
01:23:29
attacks on our land you know we reserve
01:23:31
All rights including nuclear response
01:23:34
that is insane well you know so just to
01:23:36
give you a sense um how how are we here
01:23:40
yeah so the the nuclear bombs that were
01:23:43
set off during World War II I just want
01:23:46
to show you how crazy this this is do
01:23:50
you see that image on the
01:23:52
left that all all the way over on the
01:23:55
left that's a bunker Buster you guys
01:23:56
remember those from Afghanistan and and
01:23:58
the Damage that those bunker Buster
01:24:00
bombs caused Hiroshima is a 15 kiloton
01:24:04
nuclear and you can see the size of it
01:24:06
there on the left that's a zoom in of
01:24:09
the image on the right and the image on
01:24:12
the right starts to show the biggest
01:24:14
ever tested was Tsar Bomba by the Soviets
01:24:18
this was a 50 Megaton bomb it it caused
01:24:21
shock waves that went around the Earth
01:24:25
three times they could be felt as
01:24:26
seismic shock waves around the earth
01:24:28
three times from this one detonation
01:24:30
today there are a lot of 0.1 to 1 kiloton
01:24:35
nuclear bombs that are kind of
01:24:36
considered these tactical nuclear
01:24:38
weapons that kind of fall closer to
01:24:41
between the bunker Buster and the
01:24:43
Hiroshima and that's really where a lot
01:24:45
of folks get concerned that if Israel or
01:24:47
Russia or others get cornered in a way
01:24:51
and there's no other tactical response
01:24:53
that that is what gets pulled
01:24:55
out now if someone detonates a 0.1 or 1
01:24:58
kiloton nuclear bomb which is going to
01:25:00
look like a mega bunker Buster what is
01:25:02
the other side and what's the world
01:25:03
going to respond with that's how on the
01:25:06
brink we are and there's
01:25:08
12,000 nuclear weapons with an average
01:25:10
payload of 100
01:25:12
kilotons around the world the US has a
01:25:15
large stockpile Russia has the largest
01:25:18
many of these are hair trigger alert
01:25:20
systems China has the third largest and
01:25:22
then Israel and India and and so on it
01:25:27
is a very concerning situation because
01:25:29
if anyone does get pushed to the brink
01:25:31
that has a nuclear weapon and they pull
01:25:33
out a tactical nuke does that mean that
01:25:35
game is on and that's why I'm so nervous
01:25:37
about where this all leads to if we
01:25:38
can't decelerate it's very scary because
01:25:40
you could very quickly see this thing
01:25:42
accelerate I am the most objectively scared
01:25:44
I've ever
01:25:45
been and I think that people grossly
01:25:49
underestimate how quickly this could
01:25:51
just spin up out of control and right
01:25:53
now
01:25:55
not enough of us are spending the time
01:25:57
to really understand why that's possible
01:26:00
and then also try to figure out what's
01:26:01
the off-ramp and I think it's I think
01:26:04
it's just incredibly important that
01:26:06
people take the time to figure out that
01:26:08
this is a nonzero probability and this
01:26:11
is probably for many of us the First
01:26:12
Time In Our Lifetime where you could
01:26:14
really say that well I think freeberg is
01:26:16
right that we're at the beginning stages
01:26:18
of I think what will soon be referred to
01:26:20
as the third Lebanon war the first one
01:26:23
was in
01:26:24
1982 uh Israel went into Lebanon and
01:26:27
occupied it until 2000 then it went back
01:26:30
in 2006 left after about a month and now
01:26:33
we're in the third war it's hard to say
01:26:36
exactly how much this will escalate the
01:26:40
IDF is exhausted after the war in Gaza
01:26:44
there's significant opposition within
01:26:47
Israel and within the Armed Forces to a
01:26:49
big ground invasion of Lebanon so far
01:26:53
most of the fighting has has been Israel
01:26:55
using its air superiority overwhelming
01:26:58
Firepower against Southern Lebanon and I
01:27:02
I I think that if Israel makes a ground
01:27:05
Invasion they're giving Hezbollah the war
01:27:08
that Hezbollah wants I mean Hezbollah would love
01:27:11
for this to turn into a Guerilla war in
01:27:13
southern Lebanon so I think there's
01:27:14
still some question about whether
01:27:17
Netanyahu will do that or not at the
01:27:20
same time it's also possible that
01:27:23
Hezbollah will
01:27:24
attack Northern Israel what Nasrallah has
01:27:28
threatened to invade the
01:27:32
Galilee in response to what Israel is
01:27:34
doing so there's multiple ways this
01:27:36
could escalate and if Hezbollah and
01:27:41
Israel are in a fullscale war with
01:27:44
ground forces it could be very easy for
01:27:46
Iran to get pulled into it on Hezbollah's side
01:27:49
and if that happens I think it's just
01:27:51
inevitable that the United States will
01:27:53
be pulled into this war so yeah look I
01:27:55
think we are drifting and and we have
01:27:58
been drifting into a regional war in the
01:28:01
Middle East that you know ideally would
01:28:04
not pull in the US I I think the US
01:28:05
should try to avoid being pulled in but
01:28:08
I think very likely will be pulled in if
01:28:10
it escalates and then meanwhile in terms
01:28:14
of the war in Ukraine I mean I've been
01:28:15
warning about this for two and a half
01:28:17
years how dangerous the situation was
01:28:19
and that's why we should have availed
01:28:22
ourselves of every diplomatic
01:28:23
opportunity to make peace and we now
01:28:26
know because there's been such Universal
01:28:28
reporting that in Istanbul in the first
01:28:31
month of the Ukraine war there was an
01:28:33
opportunity to make a deal with Russia
01:28:35
where Ukraine would get all of its
01:28:37
territory back is just that Ukraine
01:28:39
would have to agree not to be part of
01:28:40
NATO it would have to agree to be
01:28:42
neutral and not part of the western
01:28:44
military block that was so threatening
01:28:46
to Russia the Biden Administration
01:28:48
refused to make that deal they sent in
01:28:50
Boris Johnson to Scuttle it they threw
01:28:51
cold water on it they blocked it they
01:28:53
told Zelensky we'll give you all the
01:28:55
weapons you need to fight Russia
01:28:57
Zelensky believed in that it has not
01:28:59
worked out that way Ukraine is getting
01:29:02
destroyed it's very hard to get honest
01:29:04
reporting on this from the mainstream
01:29:06
media but the sources I've read suggest
01:29:10
that the ukrainians are losing about
01:29:12
30,000 troops per month and that's just
01:29:15
KIA I don't even think that's wounded
01:29:17
that on a bad day they're suffering
01:29:20
1,200 casualties it's more than even
01:29:23
during that counter offensive last
01:29:25
summer that Ukraine had during that time
01:29:27
they were losing about 20,000 troops a
01:29:28
month so the level of Carnage is
01:29:31
escalating Russia has has more of
01:29:35
everything more weapons more Firepower
01:29:38
air superiority and they are destroying
01:29:40
Ukraine and it's very clear I think that
01:29:44
Ukraine within it could be in the next
01:29:46
month it could be in the next two months
01:29:47
it could be in the next six months I
01:29:48
think they're eventually going to
01:29:50
collapse they're getting close to being
01:29:52
combat incapable and in a way that poses
01:29:54
the biggest danger
01:29:56
because the closer Ukraine gets to
01:29:59
collapse the more the West is going to
01:30:00
be tempted to to intervene directly in
01:30:02
order to save them and that is what
01:30:05
Zelensky was here in the US doing over
01:30:07
the past week is arguing for direct
01:30:10
involvement by America in the Ukraine
01:30:13
war to save him how did he propose this
01:30:15
he said we want to be directly admitted
01:30:17
to NATO immediately that was his request
01:30:21
and he called this the the victory plan
01:30:23
so in other words his plan for victory
01:30:25
is to get America involved in the war
01:30:26
and fighting it for him but that is the
01:30:29
only chance Ukraine has and it is
01:30:31
possible that the Biden Harris
01:30:33
Administration will agree to do that or
01:30:35
at least agree to some significant
01:30:37
escalation so far I think Biden to his
01:30:39
credit has resisted another Zelensky
01:30:42
demand which is the ability to use
01:30:44
America's long-range missiles and
01:30:47
British long-range missiles the Storm
01:30:49
Shadows against Russian cities that is
01:30:51
what Zelensky is asking for Zelensky
01:30:52
wants a major escalation of the war because
01:30:54
that is the only thing that's going to
01:30:56
save him save his side and maybe even
01:30:59
his neck personally and so we're one
01:31:02
mistake away from the very dangerous
01:31:05
situation that Chamath and Freeberg have
01:31:07
described if a President Biden who is
01:31:11
basically senile or a President Harris
01:31:14
agree to one of these Zelensky requests
01:31:17
we could very easily find ourselves in a
01:31:19
direct war with the Russians the waltz
01:31:22
into World War III is what it should be
01:31:25
called and the reason why this could
01:31:26
happen is because we don't have a fair
01:31:28
media that's fairly reported anything
01:31:30
about this War I mean Trump is on the
01:31:32
campaign Trail making I think very valid
01:31:35
points about this war that the Ukrainian
01:31:37
cause is doomed and that we should be
01:31:39
seeking a peace deal and a settlement
01:31:42
before this conflict spirals into World War
01:31:44
III that is fundamentally correct but the
01:31:46
media portrays that as being pro-Russian
01:31:49
and pro-Putin and if you say that you
01:31:50
want peace you are basically on the take
01:31:53
from Putin in Russia that is what the
01:31:54
media has told the American public for 3
01:31:56
years the definition of liberalism
01:31:59
has always been being completely against
01:32:02
war of any kind and being completely in
01:32:05
favor of free speech of all kinds that's
01:32:08
what being a liberal means we've lost
01:32:11
the
01:32:12
script and I think that people need to
01:32:14
understand that this is the one issue
01:32:17
where if we get it wrong literally
01:32:19
nothing else matters and we are
01:32:22
sleepwalking and tiptoeing into a
01:32:27
potential massive World War Jeffrey Sachs
01:32:30
said it perfectly you don't get a second
01:32:32
chance in the nuclear age all it takes
01:32:34
is one big mistake you do not get a
01:32:36
second chance and for
01:32:40
me I have become a single issue voter
01:32:43
this is the only issue to me that
01:32:45
matters we can sort everything else out
01:32:47
we can figure it all out we can we can
01:32:50
find common ground and reason should
01:32:53
taxes go up should taxes go down let's
01:32:56
figure it out should regulations go up
01:32:58
should regulations go down we can figure
01:33:00
it
01:33:01
out but we are fighting a potential
01:33:05
nuclear threat on three fronts how have
01:33:08
we allowed this to happen Russia Iran
01:33:12
China you
01:33:14
cannot
01:33:15
underestimate that when you add these
01:33:17
kinds of risks on top of each other
01:33:20
something can happen here and I don't
01:33:23
think people know they're too far away
01:33:24
from it they're too many generations
01:33:26
removed from it war is something you
01:33:28
heard maybe your grandparents talk about
01:33:30
now and you just thought okay
01:33:33
whatever I lived it it's not
01:33:36
good Chamath you're right I mean during the
01:33:38
Cuban Missile Crisis all of America was
01:33:40
huddled around their TV sets worried
01:33:43
about what would happen there is no
01:33:45
similar concern in this day and age
01:33:46
about the escalatory wars that are
01:33:50
happening there's a little bit of
01:33:51
concern I think about what's happening
01:33:52
in the Middle East there's virtually no
01:33:53
concern about what's happening in
01:33:55
Ukraine because people think it can't
01:33:57
affect them but it can and one of the
01:34:00
reasons it could affect them is because
01:34:01
we do not have a fair debate about that
01:34:04
issue in the US media the media has
01:34:06
simply caricatured any opposition to the
01:34:10
war as being pro-Putin so I would say that
01:34:13
when every pundit and
01:34:17
every person in a position to do
01:34:21
something about it says you have nothing
01:34:24
to worry
01:34:25
about you probably have something to
01:34:27
worry
01:34:28
about and so when everybody is trying to
01:34:31
tell you everybody that this is not a
01:34:34
risk it's probably a bigger risk than we
01:34:38
think yeah they're protesting too much
01:34:41
how can you say it's not a risk methinks
01:34:43
thou dost
01:34:44
protest all right love you boys
01:34:48
[Applause]
01:34:49
bye-bye let your winners ride
01:34:53
Rain Man
01:34:56
David and instead we open source it to
01:34:59
the fans and they've just gone crazy
01:35:01
with it love queen
01:35:04
[Music]
01:35:09
of Besties
01:35:12
are gone that's my dog taking a notice in your
01:35:17
driveway oh man my avatar will meet me we
01:35:21
should all just get a room and just have
01:35:22
one big huge orgy because they're all just
01:35:24
useless it's like this like sexual tension
01:35:26
that they just need to release
01:35:33
somehow we need to get merch
01:35:39
[Music]
01:35:43
I'm going all in
01:35:45
[Music]

Podspun Insights

In this episode of the All-In Podcast, the crew dons tuxedos for a hilariously somber tribute to the recent "losses" at OpenAI, where they bid farewell to several key figures in a tongue-in-cheek manner. The banter flows as they navigate the serious implications of OpenAI's transition from a nonprofit to a for-profit entity, exploring the potential valuation of $150 billion and the ramifications of such a shift. Amidst the laughter, they dissect the challenges and opportunities facing the AI industry, including the looming competition from tech giants like Google and Meta.

As the conversation unfolds, they delve into the future of AI and its impact on jobs, emphasizing a shift towards blue-collar roles and vocational training, which they argue could be a positive trend for society. The hosts also touch on the evolving landscape of augmented reality and the potential for new computing interfaces, speculating on how these technologies will change our interactions with the digital world.

With a mix of humor and insightful commentary, the episode captures the essence of the tech industry's rapid evolution, leaving listeners both entertained and informed about the future of AI and its societal implications.

Badges

This episode stands out for the following:

  • Funniest: 95
  • Most intense: 90
  • Most unserious (in a good way): 90
  • Best concept / idea: 90

Episode Highlights

  • Chain of Thought Model
    Introduction of a new AI model that revolutionizes how AI systems think and respond.
    “This is the ultimate evolution of AI systems.”
    @ 01m 55s
    September 27, 2024
  • OpenAI's $150 Billion Valuation
    Discussion on OpenAI's significant valuation and its implications for the future.
    “How is OpenAI worth $150 billion?”
    @ 06m 45s
    September 27, 2024
  • The Evolution of Analysts
    Analysts are becoming obsolete as AI tools allow users to get answers in minutes.
    “The analyst is the model sitting on the computer in front of you.”
    @ 23m 05s
    September 27, 2024
  • Revolutionizing Knowledge Work
    AI is changing how knowledge work is done, making processes faster and more efficient.
    “Everyone's just getting so goddamn smart so fast using these tools.”
    @ 24m 30s
    September 27, 2024
  • The Future of Data Management
    Companies can now directly connect data sources without the need for traditional systems of record.
    “You can just pipe that stuff directly from Stripe into Snowflake.”
    @ 31m 56s
    September 27, 2024
  • Elon Musk's Investment
    Elon Musk invested $50 million in OpenAI, but feels overlooked in the new structure.
    “How do you do that and then not clean it up for Elon?”
    @ 48m 04s
    September 27, 2024
  • The Shift to Ambient Computing
    A discussion on the evolution of computing from mobile to ambient interfaces.
    “I think we're witnessing this big shift in Computing, probably the biggest since mobile.”
    @ 53m 02s
    September 27, 2024
  • The Future of Interaction
    Exploring how glasses could take on tasks currently done by phones and watches.
    “I bet you glasses are going to take like a third of the tasks you do.”
    @ 59m 17s
    September 27, 2024
  • The Rise of Vocational Skills
    A trend towards trade degrees and real-world experience over traditional college education.
    “If you can go and get a trade degree... that's going to put pressure on higher ed.”
    @ 01h 11m 21s
    September 27, 2024
  • Escalating Tensions in the Middle East
    The discussion highlights the potential for a multinational war in the Middle East, with increasing fears of escalation.
    “I think by the time the show airs, Israel's incursion into Lebanon is going to get bigger.”
    @ 01h 20m 51s
    September 27, 2024
  • The Existential Crisis of War
    The conversation emphasizes the existential crisis posed by current conflicts, particularly the risk of nuclear escalation.
    “This is the existential crisis of this election and of this moment.”
    @ 01h 21m 37s
    September 27, 2024
  • The Ukrainian Conflict's Dire Situation
    The ongoing war in Ukraine is causing significant casualties, with fears of direct U.S. involvement.
    “Ukraine is getting destroyed; it's very hard to get honest reporting on this.”
    @ 01h 29m 02s
    September 27, 2024

Key Moments

  • Emotional Tribute @ 00:05
  • AI Valuation Discussion @ 06:45
  • Game Changer @ 17:10
  • Smart Tools @ 24:30
  • Health Insurance @ 41:06
  • Ambient Computing @ 53:02
  • AI Integration @ 1:04:20
  • Trade Skills Rise @ 1:10:33
