
E167: Google's Woke AI disaster, Nvidia smashes earnings (again), Groq's LPU breakthrough & more

February 23, 2024 / 01:20:27

This episode covers Nvidia's earnings report, AI infrastructure, and the controversy surrounding Google's Gemini AI. The hosts are Chamath Palihapitiya, David Sacks, and David Friedberg.

The hosts discuss Nvidia's impressive Q4 earnings, which saw a revenue increase of 22% quarter-over-quarter and 265% year-over-year, largely driven by demand for AI infrastructure. They note Nvidia's market cap increase of nearly $250 billion in a single day, highlighting the company's strategic position in the AI market.

Chamath emphasizes the competitive landscape, suggesting that Nvidia's profits may attract competitors, while David Friedberg raises concerns about the sustainability of Nvidia's growth amidst heavy investment in data centers. The discussion touches on the historical context of tech booms and the potential for new applications to emerge from current infrastructure investments.

The conversation shifts to Google's Gemini AI, which faced backlash for generating biased images. The hosts critique Google's approach to AI, arguing that the company's focus on avoiding bias may compromise the accuracy of its outputs.

Finally, the episode concludes with a discussion about the implications of AI technology on society and the potential for open-source solutions to address biases in AI models.

TL;DR

Nvidia's record earnings spark discussions on AI infrastructure and Google's Gemini AI controversy over biased outputs.

Video

00:00:00
all right everybody welcome back to your
00:00:01
favorite podcast of all time the all-in
00:00:04
podcast episode 160 something with me
00:00:07
again Chamath Palihapitiya he's a CEO of a
00:00:10
company and invests in startups and uh
00:00:14
his firm is called Social Capital we
00:00:17
also have David Friedberg the Sultan of
00:00:19
science he's now a CEO as well and we
00:00:23
have David Sacks from Craft Ventures in
00:00:25
some undisclosed hotel room somewhere how
00:00:28
we doing boys good thank you this is an
00:00:30
odd intro could your intro be any more
00:00:33
low energy and dragged out I'm sick what
00:00:37
do you want me to do fake the effort all
00:00:39
right here give me give me one more shot
00:00:41
watch this watch this watch profession
00:00:43
you want professionalism fake the effort
00:00:44
come on here we go you want
00:00:46
professionalism I'll show you guys
00:00:47
professionalism is that Binaca what was
00:00:49
that is that Binaca oh is this the
00:00:51
secret Banana
00:00:55
Boat let your winners
00:00:57
ride Rain Man David
00:01:03
and instead we open source it to the
00:01:04
fans and they've just gone crazy with
00:01:06
[Music]
00:01:10
it all right everybody Welcome to the
00:01:12
Allin podcast episode 167 168 with me of
00:01:16
course the Rain Man himself David
00:01:17
Sacks the dictator chairman Chamath Palihapitiya
00:01:20
and our Sultan of science David Friedberg
00:01:22
how we doing boys great how are you high
00:01:24
energy enough you is it 167 or 168 I
00:01:27
don't know who cares can we at least get
00:01:28
you to know the episode number who cares
00:01:32
unfortunately or fortunately we're going
00:01:34
to be doing this thing forever the
00:01:35
audience demands that it doesn't matter
00:01:38
this is like a Twilight Zone episode
00:01:39
we're going to be trapped in these four
00:01:41
bubbles forever you know like Superman
00:01:43
it's a it is this is like the it is the
00:01:46
guy trapped in that
00:01:48
glass Zod was that his name kneel
00:01:51
before Zod and he spun through the
00:01:53
universe and the plastic being forever
00:01:56
for for Infinity until until Superman
00:01:59
took the nuclear bomb out of the uh
00:02:02
Eiffel Tower and threw it into space and
00:02:04
blew it up and fre you know my
00:02:06
background today I think I'm going to
00:02:07
have to change now that you've
00:02:08
referenced this important scene that was
00:02:10
the best moment of that movie
00:02:12
J-Cal where Terence Stamp says kneel to the
00:02:15
president and the President says oh God
00:02:17
and then Terence Stamp is like
00:02:20
Zod not God Zod kneel before Zod that
00:02:25
was Superman 2 or three yeah Superman 2
00:02:27
is pretty much the best you know like
00:02:29
Empire Strikes Back like Terminator 2
00:02:31
it's always the second one that's the
00:02:33
best one all right everybody we got a
00:02:34
lot to talk about today apologies for my
00:02:36
voice I have a little bit of a cold Nvidia
00:02:38
blew the doors off their earnings for
00:02:40
the third straight quarter shares were
00:02:42
up 15% on Thursday representing a nearly
00:02:45
$250 billion jump in market cap so let's
00:02:49
just let that sit in for a second this
00:02:51
is the largest single day gain in market
00:02:54
cap
00:02:56
$247 billion added in market cap
00:02:59
previously meta did something similar
00:03:01
earlier this year remember everybody was
00:03:03
down on that stock because they were
00:03:04
doing all the crazy stuff with reality
00:03:06
labs and then they got focused and laid
00:03:08
off 20,000 people they added $196
00:03:12
billion in other words they added like 2
00:03:14
and a half airbnbs to their valuation
00:03:16
but let's just get to the results the
00:03:17
results are absolutely stunning and dare
00:03:19
I say unprecedented Q4 Revenue 22.1
00:03:22
billion that's up 22% quarter over
00:03:25
quarter up 265 year-over-year the net
00:03:29
income was 12.3 billion 9x
00:03:32
year-over-year and the gross margin of
00:03:34
76% was up 2 points quarter over quarter
00:03:37
127% year-over-year but look at this
00:03:40
Revenue ramp this is extraordinary q1 of
00:03:44
2024 this Juggernaut starts and it does
00:03:48
not stop and it doesn't look like it's
00:03:49
going to stop just a run up from 7
00:03:51
billion all the way to 22 billion in
00:03:54
revenue for the quarter absolutely
00:03:57
extraordinary and uh if you want to know
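Those growth rates can be sanity-checked by backing out the implied prior-period revenue, a quick Python sketch using the figures as quoted in the episode:

```python
q4_revenue = 22.1e9  # Q4 revenue of $22.1B, as reported

# Back out the implied prior periods from the stated growth rates
prior_quarter = q4_revenue / 1.22  # +22% quarter-over-quarter
year_ago = q4_revenue / 3.65       # +265% year-over-year

print(f"{prior_quarter / 1e9:.1f}B")  # ~18.1B implied prior quarter
print(f"{year_ago / 1e9:.1f}B")       # ~6.1B, in the ballpark of the ~7B the ramp starts from
```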
00:04:00
why this is happening why is NVIDIA
00:04:01
putting up these kind of numbers this
00:04:03
chart explains everything this is all
00:04:06
about data centers obviously if you
00:04:08
heard of Nvidia before the AI boom it
00:04:10
was gaming professional visualizations
00:04:13
you know I think people making movies
00:04:14
and stuff like that Autos uh used Nvidia
00:04:17
for self-driving that kind of stuff but
00:04:19
if you look at this chart you'll see
00:04:20
data centers just starting four quarters
00:04:24
ago starts to ramp up as everybody
00:04:26
builds out the
00:04:28
infrastructure for
00:04:30
new data centers to deal with generative
00:04:32
AI so just to add one point here Jason
00:04:35
so what you can see is that Nvidia was
00:04:38
around for a long time and it was making
00:04:40
these chips these gpus as opposed to
00:04:43
CPUs and they were primarily used by
00:04:47
games and by virtual reality software
00:04:51
because gpus are better obviously at
00:04:53
graphical processing they use Vector
00:04:55
math to create these like 3D worlds and
00:04:58
this V Vector math that they use to
00:05:01
create these 3D worlds is also the same
00:05:03
Vector math that AI uses to reach its
00:05:06
outcomes so with the explosion of llms
00:05:09
it turns out that these gpus are the
00:05:12
right chips that you need for these
00:05:13
cloud service providers to build out
00:05:15
these Big Data Centers to serve now all
00:05:18
of these new AI applications so Nvidia
00:05:22
was in the perfect place the perfect
00:05:24
time and that's why it's just exploded
00:05:26
and what you're seeing is the buildout
00:05:29
of this
00:05:30
new cloud service infrastructure for for
00:05:33
AI yeah and um also helping the stock is
00:05:37
the fact that they bought back 2.7
00:05:39
billion worth of their shares as part of
00:05:40
a $25 billion buyback plan but this
00:05:43
company's firing on all cylinders revenue
00:05:45
is obviously ripping as people put in
00:05:47
orders to replace all of the data
00:05:50
centers out there or at least augment
00:05:51
them with this technology with GPUs
00:05:54
A100s H100s Etc the gross margin's been
00:05:58
expanding they have huge profits and
00:06:00
they're still projecting more growth in
00:06:02
q1 around 24 billion which would be a 3X
00:06:05
increase year-over-year and this
00:06:07
obviously has made the entire Market rip
00:06:10
as Nvidia goes so does the market right
00:06:12
now and the S&P 500 NASDAQ are at record
00:06:16
highs at the time of this taping Chamath
00:06:19
your general thoughts here on something
00:06:22
I don't think anybody saw
00:06:24
coming except for you and your
00:06:26
investment in Groq maybe and a couple of
00:06:28
others I I think what I would tell you
00:06:30
is
00:06:31
that the bigger principle and we've
00:06:33
talked about this a lot Jason is that in
00:06:36
capitalism when you over earn for enough
00:06:40
of a time what happens is competitors
00:06:43
decide to try to compete away your
00:06:44
earnings in the absence of a monopoly
00:06:47
the amount of time that you have tends
00:06:49
to be small and it shrinks so in the
00:06:51
case of a monopoly for example take
00:06:53
Google you can over earn for decades and
00:06:56
it takes a very very long time for
00:06:58
somebody to to try to displace you we're
00:07:01
just starting to see the beginnings of
00:07:02
that with things like perplexity and
00:07:05
other services that are chipping away at
00:07:07
the Google Monopoly but at some point in
00:07:10
time all of these excess profits are
00:07:13
competed away in the case of Nvidia what
00:07:17
you're now starting to see is them over
00:07:20
earn in a very massive way so the real
00:07:22
question is who will step up to try to
00:07:25
compete away those
00:07:28
profits the Bezos quote right your
00:07:30
margin is my opportunity and I think
00:07:33
we're starting to see and you've
00:07:34
mentioned Groq who had a super viral
00:07:36
moment I think this week but you're
00:07:39
starting to see the emergence of a more
00:07:41
detailed understanding of what this
00:07:43
Market actually means and as a
00:07:46
result who will compete away the
00:07:48
inference Market who will compete away
00:07:50
the training market and the economics of
00:07:52
that are just becoming known to now more
00:07:54
and more people Friedberg your thoughts
00:07:56
we were talking I think was last week or
00:07:58
the week before about
00:08:00
possibility of Nvidia being a 10
00:08:01
trillion dollar company largest company
00:08:03
in the world what are your thoughts on
00:08:04
these spectacular results and then Chamath's
00:08:07
point Everybody is watching this going
00:08:10
um maybe I can get a slice of that pie
00:08:12
and maybe I can create a more
00:08:14
competitive offering obviously we saw
00:08:16
Sam Altman rumored to be raising 7
00:08:19
trillion which feels like a fake number
00:08:21
feels like that's maybe the market size
00:08:22
or something but your thoughts here I
00:08:23
don't think anything's changed on the
00:08:25
Nvidia front there's this accelerated
00:08:27
compute buildout underway in data
00:08:29
centers everyone's building
00:08:31
infrastructure and then everyone's
00:08:32
trying to build applications and tools
00:08:34
and services on top of that
00:08:36
infrastructure the infrastructure
00:08:37
buildout is kind of the first phase the
00:08:39
real question ultimately will be does
00:08:42
the initial cost of the infrastructure
00:08:44
exceed the ultimate value that's going
00:08:46
to be realized on the application layer
00:08:49
in the early days of the internet a lot
00:08:51
of people were buying Oracle servers
00:08:54
they were like 3,000 bucks a
00:08:56
server and they were running these
00:08:58
Oracle servers out of an Internet
00:09:00
connected Data Center and it you know
00:09:02
took a couple of years before folks
00:09:04
realized that
00:09:05
for large scale distributed compute
00:09:08
applications you're better off using
00:09:11
cheaper Hardware you know cheaper server
00:09:13
racks cheaper hard drives cheaper buses
00:09:16
and assuming a shorter lifespan on those
00:09:19
servers and you could cycle them in and
00:09:20
out and you didn't need the redundancy
00:09:22
you didn't need the certainty you didn't
00:09:24
need the the runtime guarantees and so
00:09:27
you could use a lower cost
00:09:29
higher failure rate but much much net
00:09:32
lower cost kind of approach to building
00:09:34
out a data center for internet serving
00:09:37
and so the Oracle servers didn't really
00:09:39
take the market and early on everyone
00:09:41
thought that they would so I think
00:09:42
Chamath's point is right now Nvidia has
00:09:44
been at this for very long time and the
00:09:47
real question is how much of an
00:09:48
advantage do they have particularly that
00:09:50
there is this need to use Fabs to build
00:09:53
replacement technology so over time will
00:09:55
there be better solutions that use
00:09:56
Hardware that's not as good but the
00:09:58
software figures out and they build new
00:09:59
architecture for running on that
00:10:01
Hardware in a way that kind of mimics
00:10:03
what we saw in the early days of the
00:10:04
build out of the internet so um TBD
00:10:07
right the same is true in in switches
00:10:09
right so in networking a lot of the
00:10:12
high-end highquality networking
00:10:14
companies got beaten up when lower cost
00:10:17
Solutions came to Market later and so
00:10:19
they looked like they were going to be
00:10:20
the biggest business ever I mean you
00:10:21
could look at Cisco during the early
00:10:23
days of the internet build out and
00:10:24
everyone thought Cisco was uh the picks
00:10:27
and shovels of the internet and they
00:10:28
were going to make all the all the value
00:10:29
is going to accrue to Cisco so we're kind of
00:10:31
in that same phase right now with Nvidia
00:10:33
the real question is is this going to be
00:10:35
a much harder Hill to compete on than
00:10:38
we've ever seen given the development
00:10:40
cycle on chips and the requirement to
00:10:42
use these Fabs to build chips it may be
00:10:44
a harder Hill to kind of get up Sacks so
00:10:46
we'll see your thoughts you think um
00:10:47
we're getting to the point where maybe
00:10:49
we'll have bought too many of these uh
00:10:51
built out too much infrastructure and
00:10:52
we'll take time for the application
00:10:54
layer as Friedberg was alluding to to
00:10:57
monetize it well I think the question
00:10:59
everyone's asking right now is are are
00:11:01
these results sustainable can Nvidia
00:11:03
keep growing at these astounding rates
00:11:07
you know will the buildout continue and
00:11:09
the comparison everyone's making is to
00:11:10
Cisco and there's this chart that's been
00:11:12
going around overlaying the Nvidia stock
00:11:16
price on The Cisco stock price and you
00:11:18
can see here the orange line is NVIDIA
00:11:20
and the blue line is Cisco and it's
00:11:23
almost like a a perfect match now what
00:11:26
happened is that at a similar point
00:11:29
in the original buildout of the internet
00:11:32
of the do com era you had the market
00:11:34
crash at the end of March of uh 2000 and
00:11:39
Cisco never really recovered from that
00:11:41
Peak valuation um but I think there's a
00:11:43
lot of reasons to believe Nvidia is
00:11:45
different one is that if you look at
00:11:47
nvidia's multiples they're nowhere near
00:11:49
where Cisco were back then so the market
00:11:52
in 1999 in early 2000 was way more
00:11:55
bubbly than it is now so nvidia's
00:11:57
valuation is much more grounded in real
00:12:00
Revenue real margins real
00:12:03
profit second you have the issue of
00:12:06
competitive moat Cisco was selling
00:12:09
servers and networking equipment
00:12:12
fundamentally that equipment was much
00:12:14
easier to copy and commoditize than gpus
00:12:18
these GPU chips are really complicated I
00:12:21
think Jensen made the point that their
00:12:24
Hopper
00:12:26
H100 product he said you know don't even
00:12:28
think of it just like a chip there's
00:12:30
actually 35,000 components in this
00:12:32
product and it weighs 70 lbs this is
00:12:35
more like a Mainframe computer or
00:12:37
something that's dedicated to processing
00:12:38
yeah it's somewhere between a rack
00:12:40
server and the entire rack yeah it's
00:12:43
Giant and it's heavy and it's complex it
00:12:45
does say something here chamath I think
00:12:48
about
00:12:49
how well positioned big Tech is in terms
00:12:54
of seeing an opportunity and quickly
00:12:56
mobilizing to capture that
00:12:59
opportunity these servers are being
00:13:01
bought
00:13:02
by you know people like Amazon I'm sure
00:13:05
Apple obviously Facebook meta I don't
00:13:09
know if Google's buying them as well I
00:13:10
would assume so Tesla so everybody's
00:13:12
buying these things and they had tons of
00:13:15
cash sitting around it is pretty amazing
00:13:17
how Nimble the industry is and this
00:13:19
opportunity feels like everybody is
00:13:21
looking at it like mobile and Cloud I
00:13:23
have to get mobilized quickly to not get
00:13:26
disrupted you're bringing up an
00:13:27
excellent point and
00:13:29
I I would like to tie it together with
00:13:31
freiberg's point so at some point all of
00:13:34
this spend has to make money right
00:13:37
otherwise you're you're going to look
00:13:38
really foolish for having spent 20 and
00:13:40
30 and $40 billion dollars so Nick if you
00:13:42
just go back to the to the revenue slide
00:13:45
of Nvidia I can try to give you a
00:13:47
framing of this at least the way that I
00:13:49
think about it so if if you look at this
00:13:51
like what you're talking about is look
00:13:53
who is going to spend $22.1 billion well
00:13:57
you said it Jason it's all a big Tech
00:13:58
why because they have that money on the
00:14:00
balance sheet sitting idle but when you
00:14:03
spend $22 billion their investors are
00:14:06
going to demand a rate of return on that
00:14:09
and so if you think about what a
00:14:10
reasonable rate of return is call it 30
00:14:12
40 50% and then you factor in and that's
00:14:15
profit and then you factor in all of the
00:14:17
other things that need to support that
00:14:20
that $22 billion of spend needs to
00:14:22
generate probably $45 billion of Revenue
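That $22B-to-$45B framing can be written down as a back-of-envelope model; the 50% hurdle is taken from the 30-40-50% range mentioned, and the supporting-cost figure is an illustrative assumption, not a reported number:

```python
capex = 22e9  # one quarter of GPU spend flowing to Nvidia

# Investors demand a return on that capital; take the top of the 30-50% range
required_return = 0.50
required_profit = capex * required_return  # $11B of profit

# Assumed supporting costs (power, facilities, people, software) on top of the chips
supporting_costs = 12e9  # illustrative assumption, not a figure from the episode

required_revenue = capex + required_profit + supporting_costs
print(f"{required_revenue / 1e9:.0f}B")  # ~45B, the figure cited in the episode
```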
00:14:27
and so Jason the question to your point
00:14:29
and to Friedberg's point The $64,000
00:14:32
Question is who in this last quarter is
00:14:34
going to make 45 billion on that 22
00:14:37
billion of spend and again what I would
00:14:39
tell you to be really honest about this
00:14:41
is that what you're seeing is more about
00:14:44
big companies musling people around with
00:14:48
their balance sheet and being able to go
00:14:50
to Nvidia and say I will give you
00:14:53
committed pre purchases over the next
00:14:55
three or four
00:14:56
quarters and less about here is a
00:14:59
product that I'm shipping that actually
00:15:01
makes money which I need enormous more
00:15:04
compute resources for it's not the
00:15:07
latter most of the apps the overwhelming
00:15:11
majority of the apps that we're seeing
00:15:12
in AI today are toy apps that are run as
00:15:17
proofs of concept and demos and run in a
00:15:21
sandbox it is not production code this
00:15:24
is not we've rebuilt the
00:15:27
entire
00:15:29
autopilot system for the Boeing and it's
00:15:32
now run with agents and Bots and all of
00:15:36
this training that's not what's
00:15:37
happening so it is a really important
00:15:40
question today the demand is clear it's
00:15:42
the big guys with huge gobs of money and
00:15:45
by the way Nvidia is super smart to take
00:15:47
it because they can now forecast demand
00:15:50
for the next two or three
00:15:51
quarters I think we still need to see
00:15:53
the next big thing and if you look in
00:15:55
the past what the past has showed you
00:15:57
it's the big guys don't really invent
00:15:59
the new things that make a ton of money
00:16:00
it's the new guys who because they don't
00:16:03
have a lot of money and they have to be
00:16:05
a little bit more industrious come up
00:16:06
with something really authentic and new
00:16:09
yeah constraint makes for great art yeah
00:16:11
we haven't seen that yet so I think the
00:16:13
revenue scale will continue for like the
00:16:15
next two or three years probably for
00:16:18
NVIDIA but the real question is what is
00:16:21
the terminal value and it's the same
00:16:23
thing that saak showed in that Cisco
00:16:25
slide people ultimately realized that
00:16:28
the value was going to go
00:16:30
to other parts of the stack the
00:16:34
application layer and as more and more
00:16:37
money was accrued at the application layer
00:16:39
of the internet less and less Revenue
00:16:41
multiple and credit was given to Cisco
00:16:43
and that's nothing against Cisco because
00:16:45
their revenue continued to compound
00:16:48
right and they did an incredible job but
00:16:50
the valuation got cut so Friedberg if
00:16:52
we're looking at this chart the winner
00:16:54
of The Cisco chart
00:16:57
might in fact be somebody like Netflix
00:16:58
they actually got you know hundreds of
00:17:00
millions of consumers to give them cash
00:17:02
and then you have Google
00:17:03
and Facebook as well generating all that
00:17:05
traffic and then YouTube of course who
00:17:08
do you see the winner here as in terms
00:17:10
of the application layer who are the
00:17:12
billion customers here who are going to
00:17:14
spend 20 bucks a month five bucks a
00:17:16
month whatever it is so here well I mean
00:17:18
let me just start with this important
00:17:19
point if you look at where that revenue
00:17:22
is coming from to chat's point it's
00:17:23
coming from big cloud service providers
00:17:28
so Google and others are building out
00:17:32
clouds that other application developers
00:17:35
can build their AI tools and
00:17:37
applications on top of so a lot of the
00:17:39
buildout is in these cloud data centers
00:17:42
that are owned and operated by these big
00:17:45
tech companies the 18 billion of data
00:17:48
center Revenue that Nvidia realized is
00:17:50
revenue to them but it's not an
00:17:52
operating expense to the companies that
00:17:54
are building out so this is an important
00:17:57
point on why this is happening at such
00:17:59
an accelerated Pace when a big company
00:18:01
buys these chips from Nvidia they don't
00:18:04
have to from an accounting basis mark it
00:18:05
as an expense in their income statement
00:18:08
it actually gets booked as a capital
00:18:10
expenditure in the cash flow statement
00:18:13
it gets put on the balance sheet and
00:18:15
they depreciate it over time and so they
00:18:17
can spend $20 billion of cash because
00:18:19
Google and others have 100 billion of
00:18:21
cash sitting on the balance sheet and
00:18:23
they've been struggling to find ways to
00:18:24
grow their business through Acquisitions
00:18:27
one of the reasons is they there aren't
00:18:29
enough companies out there that they can
00:18:31
buy at a good multiple that can give
00:18:32
them a good increase in profit the other
00:18:35
one is that antitrust authorities are
00:18:36
blocking all of their Acquisitions and
00:18:38
so what do you do with all that cash
00:18:40
well you can build out the next gen of
00:18:42
cloud infrastructure and you don't have
00:18:44
to take the hit on your p&l by doing it
00:18:46
so it ends up in the balance sheet and
00:18:47
then you depreciate it over typically
00:18:50
four to seven years so that money gets
00:18:52
paid out on the on the income statement
00:18:54
at these big companies over a seven-year
00:18:56
period so there's a really great
00:18:59
accounting and m&a environment driver
00:19:02
here that's causing the big cloud data
00:19:05
center providers to step in and say this
00:19:07
is a great time for us to build out the
00:19:08
next generation of infrastructure that
00:19:11
could generate profits for us in the
00:19:12
future because we've got all this cash
00:19:14
setting around we don't have to take a
00:19:15
p&l hit we don't have to acquire a cash
00:19:17
burning
00:19:18
business and you know frankly we're not
00:19:20
going to be able to grow through m&a
00:19:21
because of antitrust right now anyway so
00:19:23
there's a lot of other motivating
00:19:24
factors that are causing this near-term
00:19:26
acceleration as they're trying to find
00:19:28
ways to grow yeah and all I I know that
00:19:30
was an accounting point but I think it's
00:19:32
a really important valid one if you if
00:19:34
100 billion gets spent this year you
00:19:35
divide it by four 25 billion in Revenue
00:19:37
would have to come from that or
00:19:39
something in that range yeah and so
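The divide-by-four arithmetic is straight-line depreciation; a minimal sketch over the four-to-seven-year useful lives mentioned in the discussion:

```python
def annual_depreciation(capex: float, useful_life_years: float) -> float:
    """Straight-line depreciation: the yearly charge that reaches the income statement."""
    return capex / useful_life_years

# $100B of GPU capex sits on the balance sheet and bleeds into the P&L over time
print(annual_depreciation(100e9, 4) / 1e9)            # 25.0, i.e. $25B/yr
print(round(annual_depreciation(100e9, 7) / 1e9, 1))  # 14.3, i.e. ~$14B/yr
```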
00:19:40
Sacks any guesses do you have to just
00:19:42
keep in mind I think Friedberg what you
00:19:44
said is very true for GCP spend but not
00:19:47
necessarily for Google spend it's true
00:19:49
for AWS spend but not necessarily for
00:19:52
Amazon spend and it's true for Azure
00:19:54
spend not true for Microsoft spend and
00:19:56
it's largely not true for Tesla and
00:19:59
Facebook because they don't have clouds
00:20:00
so I think the question to your point
00:20:03
that and for obvious reasons Nvidia
00:20:06
doesn't disclose it is what is the
00:20:07
percentage of that 21 billion that just
00:20:10
went to those Cloud providers that
00:20:11
they'll then expose to to to everybody
00:20:14
else versus what was just absorbed
00:20:16
because at Facebook Mark had that video
00:20:18
about how many H100s that's all for him
00:20:21
right but it is still it is still
00:20:23
capitalized as my point so they don't
00:20:25
have to book that as an expense it sits
00:20:27
on the balance sheet they and they earn
00:20:30
it down over time you're helping to
00:20:31
explain why these big cloud service
00:20:32
providers are spending so much on the
00:20:35
cash because they're very profitable and
00:20:36
there's nowhere else to put the money
00:20:38
right well so that would seem to
00:20:40
indicate that this is more in the
00:20:42
category of one-time buildout than
00:20:44
sustainable ongoing Revenue I think the
00:20:47
the big question is the one that Chamath
00:20:49
asked which is what's the terminal value
00:20:51
of Nvidia I think like a simple
00:20:53
framework for thinking about that is
00:20:55
what is the total addressable Market or
00:20:57
Tam related to gpus and then what is
00:21:00
their market share going to be right now
00:21:03
their market share is something like 91%
00:21:05
that's clearly going to come down but
00:21:07
the moat appears to be substantial the
00:21:10
Wall Street analysts I've been listening
00:21:11
to think that in five years they're
00:21:13
still going to have 60 something percent
00:21:15
market share so they're going to have a
00:21:17
substantial percentage of this Market or
00:21:20
this Tam then the question is I think
00:21:22
with respect to Tam is what is onetime
00:21:25
buildout versus steady state now I think
00:21:28
that clearly there's a lot of buildout
00:21:32
happening now that's almost like a
00:21:33
backfill of capacity that people are
00:21:35
realizing they need but even the numbers
00:21:38
you're seeing this quarter kind of
00:21:39
understate it because first of all
00:21:42
Nvidia was supply constrained they
00:21:44
could not produce enough chips to
00:21:46
satisfy all the demand their revenue
00:21:48
would would have been even higher if
00:21:50
they had more
00:21:52
capacity second you just look at their
00:21:55
forecast so the fiscal year that just
00:21:57
ended they did around 60 billion of
00:21:59
Revenue they're forecasting 110 billion
00:22:01
for the fiscal year that just started so
00:22:04
they're already projecting to almost
00:22:06
double based on the demand that they
00:22:08
clearly have visibility into already so
00:22:11
it's very hard to know exactly what the
00:22:13
terminal or steady state value of this
00:22:16
Market's going to be even once the cloud
00:22:19
service providers do this big buildout
00:22:21
presumably there's always going to be a
00:22:23
need to stay up to date with the latest
00:22:25
chips right here's a framework for you
00:22:28
tell me if this makes
00:22:29
sense Intel was basically the mother
00:22:33
of all of modern compute up until today
00:22:36
right I think the CPU was the the most
00:22:40
fundamental Workhorse that enabled local
00:22:42
PCs it
00:22:44
enabled networking it enabled the
00:22:47
internet and so when you look at the
00:22:49
market cap of it as an example it's
00:22:52
about 180 odd billion dollars
00:22:56
today the economy that it created that it
00:22:59
supports is probably measured call it in
00:23:02
a trillion or 2 trillion dollars maybe 5
00:23:04
trillion let's just be really generous
00:23:06
right and so you you can see that
00:23:08
there's this ratio of the enabler of an
00:23:11
economy and the size of the economy and
00:23:15
those things tend to be relatively fixed
00:23:17
and they recur repeatedly over and over
00:23:19
and over if you look at Microsoft it's
00:23:21
market cap relative to the economy that
00:23:23
it enables so the question for NVIDIA in
00:23:26
my mind would be that it is it not going
00:23:28
to go up in the next 18 to 24 months it
00:23:31
probably is for exactly the reason you
00:23:33
said it is super set up to have a very
00:23:35
good meet and beat guidance for the
00:23:37
street which they'll eat up and all of
00:23:40
the algorithms that trade the press
00:23:41
releases will drive the price higher and
00:23:43
all of this stuff will just create a
00:23:45
trend
00:23:46
upward I think the bigger question is if
00:23:49
it's a four or five trillion dollar
00:23:52
market cap in the next two or three
00:23:55
years will it support 100 trillion
00:23:59
economy because that's what you would
00:24:01
need to believe for those ratios to hold
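Those ratios can be checked with rough numbers; the Intel market cap and enabled-economy figures are the loose estimates used in the discussion, and the Nvidia valuation is the hypothetical $4-5T, not a real one:

```python
intel_market_cap = 180e9      # roughly $180B
intel_enabled_economy = 5e12  # the generous ~$5T estimate from the discussion

# "Enabler ratio": dollars of supported economy per dollar of enabler market cap
ratio = intel_enabled_economy / intel_market_cap  # ~28x

nvidia_market_cap = 4.5e12    # hypothetical $4-5T valuation
implied_economy = nvidia_market_cap * ratio
print(f"{implied_economy / 1e12:.0f}T")  # 125T, on the order of the $100T mentioned
```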
00:24:03
otherwise everything has just broken on
00:24:05
the internet yeah I mean so the history
00:24:07
of the internet is that if you build it
00:24:09
they will come meaning that if you make
00:24:12
the investment in the capital assets
00:24:15
necessary to power the next generation
00:24:17
of applications those applications have
00:24:18
always eventually gotten written even
00:24:21
though it was hard to predict them at
00:24:23
the time so in the late 90s when we had
00:24:25
the whole do com bubble and then bust
00:24:27
had this tremendous buildout not just of
00:24:30
kind of servers and all the networking
00:24:31
equipment but there was a huge fiber
00:24:33
buildout y by all the telecom companies
00:24:35
and the telecom companies had a Cisco
00:24:38
like you know a peak it was worse you
00:24:40
WorldCom and them they went bankrupt a lot
00:24:42
of them yeah well the problem there was
00:24:44
that a lot of the build out happened
00:24:46
with debt and so when you had the dot-com
00:24:48
crash and all the valuations came down
00:24:50
to earth that's why a lot of them went
00:24:52
under yeah Cisco wasn't in that position
00:24:55
but anyway my point is in the early 2000
00:24:58
when the crash happened everyone thought
00:24:59
that these telecom companies had over
00:25:01
invested in fiber as it turns out all
00:25:04
that fiber eventually got used the
00:25:06
internet went from you know dial up to
00:25:09
broadband we started doing seeing
00:25:11
streaming social networking all these
00:25:13
applications started eating up that
00:25:15
bandwidth so I think that the history of
00:25:19
these things is that the applications
00:25:21
eventually get written they get
00:25:23
developed if you build the
00:25:25
infrastructure to power them and I think
00:25:26
with AI the thing that's exciting to me
00:25:29
as someone who's really more of an
00:25:30
application investor is that we're just
00:25:32
at the beginning I think of a huge wave
00:25:36
of a lot of new creativity and
00:25:39
applications that's going to be written
00:25:41
and it's not just B2C it's going to be
00:25:42
B2B as well you guys haven't really
00:25:44
mentioned that it's not just consumers
00:25:46
and consumer applications are going to
00:25:48
use these cloud data centers that are
00:25:51
buying up all these gpus it's it's going
00:25:53
to be Enterprises too I mean these
00:25:55
Enterprises are using Azure they're
00:25:57
using Google cloud and so forth so
00:26:00
there's a lot I think that's still to
00:26:02
come I mean we're just at the beginning
00:26:04
of a wave that's probably going to last
00:26:06
at least a decade yeah to your point one
00:26:09
of the reasons YouTube Google Photos
00:26:12
iPhoto a lot of these things happened was
00:26:16
because the infrastructure buildout was
00:26:18
so great during the dot-com boom that the
00:26:20
prices for storage the prices for
00:26:22
bandwidth Sacks plummeted and then
00:26:25
people like Chad Hurley looked at it and were
00:26:27
like what if instead of charging people
00:26:29
to put a video on the internet and then
00:26:31
charging them for the bandwidth they used
00:26:33
we'll just let them upload this stuff to
00:26:34
YouTube and we'll figure it out later
00:26:36
same thing with Netflix yeah I mean look
00:26:38
when we were developing PayPal in the
00:26:40
late 90s really around
00:26:42
1999 uh you could barely upload a photo
00:26:46
to the internet I mean so like the idea
00:26:48
of having an account with a profile
00:26:49
photo on it was sort of like why would
00:26:51
you do that it's just prohibitively slow
00:26:53
everyone's going to drop off Yeah by
00:26:55
2003 it was fast enough that you could
00:26:58
do that and that's why social networking
00:27:00
happened I mean literally without that
00:27:02
performance
00:27:03
Improvement like even having a profile
00:27:07
photo on your account was something that
00:27:08
was too hard to do your LinkedIn profile
00:27:10
was like too much bandwidth and then let
00:27:12
alone video I mean you
00:27:16
probably remember these days you would
00:27:17
put up a video on your website if it
00:27:19
went viral your website got turned off
00:27:22
because you would hit your $5,000 or
00:27:24
$10,000 a month cap all right Groq also
00:27:27
had a huge week that's Groq with a Q
00:27:29
not to be confused with Elon's Grok
00:27:31
with a K Chamath you've talked about Groq
00:27:35
on this podcast a couple of times
00:27:36
obviously you were the I guess you were
00:27:39
the first investor the seed investor you
00:27:41
pulled these LPUs and this concept out
00:27:43
of a team that was at Google maybe you
00:27:46
could explain a little bit about Groq's
00:27:48
viral moment this week in the history of
00:27:50
the company which I know has been a long
00:27:53
road for you with this company I mean
00:27:55
it's been since 2016 so so again proving
00:27:58
what you
00:27:59
guys have said many times and what I've
00:28:02
tried to live out which is just you just
00:28:04
got to keep grinding 90% of the battle
00:28:07
is just staying alive in business yeah
00:28:10
and having oxygen to keep trying things
00:28:13
and then eventually if you get lucky
00:28:15
which I think we
00:28:17
did things can really break in your
00:28:19
favor so this weekend you know I've been
00:28:21
tweeting out a lot of technical
00:28:23
information about why I think this is
00:28:24
such a big deal but yeah the the moment
00:28:27
came this weekend combination of Hacker
00:28:29
News and some other places and
00:28:31
essentially we had no customers two
00:28:32
months ago I'll just be honest and
00:28:35
between Sunday and
00:28:38
Tuesday we've just we're overwhelmed and
00:28:41
I think like the last count was we had
00:28:43
3,000 unique customers come and try to
00:28:46
consume our resources from
00:28:48
every important Fortune 500 all the way
00:28:51
down to developers and
00:28:53
so I think we're very fortunate I think
00:28:56
the team has a lot of hard work to do
00:28:57
so it could mean nothing but it has the
00:28:59
potential to be something very
00:29:01
disruptive so what is it that people are
00:29:02
glomming on
00:29:04
to you have to understand that like at
00:29:07
the very highest level of AI you have to
00:29:10
view it as two distinct
00:29:12
problems one problem is called training
00:29:14
which is where you take a model and you
00:29:16
take all of the data that you think will
00:29:19
help train it and you do that you train
00:29:21
the model and it learns over all of this
00:29:26
information but the second part of the
00:29:28
AI problem is what's called inference
00:29:30
which is what you and I see every day as
00:29:31
a consumer so we go to a website like
00:29:34
chat GPT or Gemini we ask a question and
00:29:38
it gives us a really useful answer and
00:29:41
those are two very different kinds of
00:29:42
compute challenges the first one is
00:29:45
about brute force and power right if you
00:29:48
can imagine like what you need are tons
00:29:51
and tons of machines tons and tons of
00:29:54
like very high quality networking and an
00:29:57
enormous amount of power in a data
00:29:59
center so that you can just run those
00:30:00
things for months I think Elon publishes
00:30:02
very transparently for example how long
00:30:04
it takes to train his Grok with a K
00:30:07
right model and it's in the months
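The two workloads Chamath is distinguishing can be caricatured in a few lines of Python. This is a toy sketch under loose assumptions, nothing like a real LLM or Groq's hardware: training grinds over the whole dataset many times, while inference is one cheap pass whose latency the user directly feels.

```python
# Toy illustration of the two AI workloads described above -- not a real LLM,
# just the shape of the problem. Training: iterate over the whole dataset many
# times (throughput- and power-bound, "months" at real scale). Inference:
# one cheap forward pass, where speed and cost per answer are what matter.

def train(data, epochs=100, lr=0.01):
    """Fit y = w * x by gradient descent; cost scales with epochs * len(data)."""
    w = 0.0
    for _ in range(epochs):              # at real scale, this loop runs for months
        for x, y in data:
            grad = 2 * (w * x - y) * x   # d/dw of (w*x - y)^2
            w -= lr * grad
    return w

def infer(w, x):
    """One multiply: the part a chatbot user experiences, where latency is king."""
    return w * x

data = [(x, 3.0 * x) for x in [0.1, 0.5, 1.0, 2.0]]  # true slope is 3
w = train(data)
print(round(infer(w, 4.0), 2))  # close to 12.0 once training has paid its cost
```

The asymmetry is the point: the expensive loop runs once, offline, while the cheap function runs on every user query, which is why inference hardware is optimized for a different objective than training hardware.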
00:30:09
inference is something very different
00:30:11
which is all about speed and cost what
00:30:13
you need to be in order to answer a
00:30:15
question for a consumer in a compelling
00:30:17
way is super super cheap and super super
00:30:20
fast and we've talked about why that is
00:30:23
important and the Groq with a Q chip
00:30:28
turns out to be extremely fast and
00:30:31
extremely
00:30:32
cheap and so look time will tell how big
00:30:35
this company can get but if you tie it
00:30:38
together with what Jensen said on the
00:30:40
earnings call and you now see developers
00:30:44
stress testing us and finding that we
00:30:46
are meaningfully meaningfully faster and
00:30:49
cheaper than any Nvidia
00:30:51
solution there's the potential here to
00:30:53
be really disruptive and we're a meager
00:30:57
unicorn right our last valuation was
00:31:00
like a billion something versus Nvidia
00:31:03
which is now like a $2 trillion doll
00:31:05
company so there's a lot of market cap
00:31:07
for Groq to gain by just being able to
00:31:09
produce these things at
00:31:12
scale which could be just an enormous
00:31:14
outcome for us so time will tell but a
00:31:16
really important moment in the company
00:31:18
and very exciting can I just observe
00:31:21
like off topic how an overnight success
00:31:24
can take eight years yeah no I was
00:31:26
thinking the same it's a seven-year
00:31:28
overnight success in the making there's
00:31:30
this class of businesses that I think
00:31:32
are unappreciated in a post internet era
00:31:38
where you have to do a bunch of things
00:31:40
right before you can get any one thing
00:31:43
to work and these complicated businesses
00:31:46
where you have to stack either different
00:31:49
things together that need to click
00:31:50
together in a in a stack or you need to
00:31:54
iterate on each step until the whole
00:31:55
system works end to end
00:31:58
can sometimes take a very long time to
00:32:00
build and the term that's often used for
00:32:01
these types of businesses is deep Tech
00:32:04
and they fall out of favor because in an
00:32:06
internet era and in a software era you
00:32:09
can find product Market fit and make
00:32:11
revenue and then make profit very
00:32:12
quickly and so a lot of entrepreneurs
00:32:15
select into that type of business
00:32:17
instead of selecting into this type of
00:32:19
business where the probability of
00:32:20
failure is very high you have several
00:32:23
low probability things that you have to
00:32:24
get right in a row and if you do it's
00:32:27
going to take eight years and a lot of
00:32:29
money and then all of a sudden the thing
00:32:30
takes off like a rocket ship you've got
00:32:32
a huge Advantage you've got a huge moat
00:32:34
it's hard for anyone to catch up and
00:32:36
this thing can really um spin out on its
00:32:37
own I do think Elon is very unique in
00:32:40
his ability to deliver success in these
00:32:43
types of businesses Tesla needed to get
00:32:44
a lot of things right in a row SpaceX
00:32:46
needed to get a lot of things right in a
00:32:47
row all of these require a series of
00:32:50
complicated steps or a set of
00:32:52
complicated technologies that need to
00:32:54
click together and work together but the
00:32:55
hardest things often output the highest
00:32:59
value and you know if you can actually
00:33:03
make the commitment on these types of
00:33:05
businesses and get all the pieces to
00:33:07
click together there's an extraordinary
00:33:09
opportunity to build Moes and to take
00:33:11
huge amounts of market value and I think
00:33:14
that there's an element of this that's
00:33:15
been lost in Silicon Valley over the
00:33:17
last couple of decades as the fast money
00:33:20
in the internet era has kind of
00:33:23
prioritized other Investments ahead of
00:33:24
this but I'm really hopeful that these
00:33:26
sorts of Chip Technologies
00:33:28
SpaceX in biotech we see a lot of this
00:33:31
these sorts of things can kind of become
00:33:33
more in favor because the the advantage
00:33:35
as these businesses work seems to
00:33:37
realize hundreds of billions and
00:33:39
sometimes trillions of dollars of market
00:33:40
value and be incredibly transformative
00:33:43
for Humanity so I don't know I just
00:33:45
think it's an observation I wanted to
00:33:46
make about the greatness of these
00:33:47
businesses when they work out well I
00:33:49
mean open AI was kind of like that for a
00:33:50
while totally I mean it was this like
00:33:52
wacky nonprofit that was just grinding
00:33:54
on an AI research problem for like six
00:33:56
years and then it finally worked and got
00:33:59
productized into chat GPT totally but
00:34:02
you're right SpaceX was kind of like
00:34:03
that I mean the big money maker at
00:34:05
SpaceX is Starlink which is the
00:34:08
satellite Network it's basically
00:34:09
Broadband from space and it's on its way
00:34:13
to handling I think a meaningful
00:34:14
percentage of all internet traffic but
00:34:16
think about all the things you had to
00:34:17
get right to get that working first you had
00:34:20
to create a rocket that's hard enough
00:34:22
then you had to get to
00:34:23
reusability then you have to create the
00:34:25
whole satellite Network so at least
00:34:27
three hard things in a row well and get
00:34:30
consumers to adopt it I mean you know
00:34:32
don't forget the final step yeah we had
00:34:34
no idea where the market was like early
00:34:36
on it started in my office and so
00:34:38
Jonathan and I would be kind of always
00:34:41
trying to figure out what is the initial
00:34:42
go to market and I remember I emailed
00:34:45
Elon in at that period when they were
00:34:48
still trying to figure out whether they
00:34:50
were going to go with liar or not and we
00:34:52
thought wow maybe we could sell Tesla
00:34:54
the chips you know but and then Tesla
00:34:56
brought in this team just to talk to us
00:34:58
about what the design goals were and
00:35:01
basically said no in a kind way but they
00:35:03
said no then we thought okay maybe it's
00:35:06
like for high frequency Traders right
00:35:07
because like those folks want to have
00:35:09
all kinds of edges and if we have these
00:35:11
big models maybe we can accelerate their
00:35:14
decision making they can measure Revenue
00:35:16
that didn't work out then it was like
00:35:18
you know we tried to sell
00:35:20
to three-letter agencies that didn't
00:35:23
really work out our original version was
00:35:25
really focused on image
00:35:27
classification in convolutional neural nets
00:35:29
like resnet that didn't work out we ran
00:35:32
head first into the fact that Nvidia has
00:35:35
this compiler product called CUDA and we
00:35:37
had to build a high class compiler that
00:35:40
you could take any model without any
00:35:43
modifications all these things to your
00:35:45
point are just points where you can just
00:35:47
very easily give up and then there's
00:35:48
like we run out of money so then you
00:35:50
write money in a note right because
00:35:52
everybody wants to punt on valuation
00:35:54
when nothing's working yeah you tried
00:35:57
Beach Head Market you could the boat you
00:35:59
have to make a decision to just keep
00:36:02
going if you believe it's right and if
00:36:04
you believe you are right yeah and that
00:36:07
requires shutting out we talked about
00:36:09
this in the Masa example last week but
00:36:12
it just requires shutting out the noise
00:36:14
because it's so hard to believe in
00:36:17
yourself it's so hard to keep funding
00:36:19
these things it's so hard to go into
00:36:20
partner meetings and defend a
00:36:22
company and then you just have a moment
00:36:25
and you just feel I I don't know I feel
00:36:27
very Vindicated but then I feel very
00:36:30
scared because Jonathan still hasn't
00:36:31
landed it you know what I mean you
00:36:33
mentioned all those boats landing and
00:36:34
trying to land and those missteps but
00:36:36
3,000 people signed up who are they are
00:36:39
they developers now and they're going to
00:36:40
figure out the applications yeah I think
00:36:42
that back to the original point my
00:36:43
thought today is that AI is more about
00:36:46
proofs of concept and toy apps and
00:36:49
nothing real yep I don't think there's
00:36:51
anything real that's inside of an
00:36:53
Enterprise that is so meaningfully
00:36:55
disruptive that it's going to get
00:36:56
broadly licensed to other Enterprises
00:36:59
I'm not saying we won't get there but
00:37:00
I'm saying we haven't yet seen that
00:37:02
Cambrian moment of monetization we've
00:37:06
seen the Cambrian moment of innovation
00:37:09
yeah and so that Gap has still yet to be
00:37:11
crossed and I think the reason that you
00:37:14
can't cross it is that today these are
00:37:16
in an unusable State the results are not
00:37:20
good enough they are toy apps that are
00:37:22
too slow that require too much
00:37:24
infrastructure and cost so the potential
00:37:28
is for us to enable that monetization
00:37:30
Leap
00:37:31
Forward and so yeah they're going to be
00:37:34
developers of all sizes and the people
00:37:36
that came are literally companies of all
00:37:38
sizes I saw some of the names of the big
00:37:40
companies and they are the who's who of
00:37:43
the S&P 500 how do you guys reconcile
00:37:47
this deep Tech High outcome opportunity
00:37:51
that everyone here has seen and been a
00:37:53
part of as an investor participant in
00:37:57
versus the more drisk faster time to
00:38:01
Market and you know Chamath in particular
00:38:03
like in the past we've talked about some
00:38:05
of these deep Tech projects like fusion
00:38:06
and so on and you've highlighted well
00:38:08
it's just not there yet it's not
00:38:09
fundable what's the distinction between
00:38:12
a deep Tech investment opportunity that
00:38:14
is fundable and that you keep grinding
00:38:16
at that has this huge outcome uh what
00:38:19
makes the one like Fusion not fundable
00:38:22
it's a phenomenal question it's a great
00:38:23
question my answer is I have a very
00:38:25
simple filter
00:38:27
which is that I don't want to debate the
00:38:29
laws of physics when I fund a company so
00:38:32
with Jonathan when we were initially
00:38:35
trying to figure out how to size it I
00:38:37
think my initial check was like 7 to 10
00:38:39
million or something and the whole goal
00:38:42
was to get to an initial tape out of a
00:38:44
design we were not inventing anything
00:38:46
new with respect to physics we were on a
00:38:49
very old process technology I think
00:38:51
we're still on 14 nanometer we were on
00:38:52
14 nanometer 8 years ago okay so we
00:38:55
weren't pushing those boundaries all we
00:38:58
were doing was trying to build a
00:38:59
compiler and a chip that made sense in a
00:39:01
very specific construct to solve a
00:39:03
well-defined bounded problem so that is
00:39:05
a technical challenge but it's not one
00:39:08
of physics when I've been pitched all
00:39:10
the fusion companies for example there
00:39:13
are fuel sources that require you to
00:39:15
make a leap of physics where in order to
00:39:18
generate a certain fuel source you
00:39:20
either have to go and harvest that on
00:39:22
the moon or in a different planet that
00:39:24
is not earth or you have to create some
00:39:26
fundamentally different way of creating
00:39:28
this highly unique
00:39:30
material that is why those kinds of
00:39:32
problems to me are poor risk and
00:39:35
building a chip is good risk it doesn't
00:39:37
mean you're going to be successful in
00:39:39
building a chip but the risks are
00:39:42
not ones of fundamental physics
00:39:44
they're bounded to go to market and
00:39:46
technical usefulness and I think that
00:39:49
that removes an order of magnitude risk
00:39:52
in the outcome so I mean there there's
00:39:54
still like a bunch of things that have
00:39:55
to be right in a row to make it work but
00:39:57
yeah it doesn't mean it's going to work
00:39:59
all I'm saying is I don't I don't want
00:40:00
it to fail because we built a reactor
00:40:01
and we realized hold on to get heavy
00:40:03
hydrogen I got to go to the moon right
00:40:06
and J Cal and Sacks how do you Sacks I know
00:40:08
you don't you invested we have done a
00:40:10
couple yeah so maybe you guys can
00:40:11
highlight how you've thought about deep
00:40:13
Tech opportunities versus do something
00:40:16
really difficult like this every 50
00:40:18
Investments or so because most of the
00:40:20
entrepreneurs coming to us because we're
00:40:22
seed investors or pre-seed investors
00:40:24
they would be going to a biotech
00:40:26
investor or a hardware investor who
00:40:27
specializes in that not to us but once
00:40:29
in a while we meet a Founder we really
00:40:31
like and so Contra line was one we were
00:40:34
introduced to somebody who's doing this
00:40:36
really interesting contraception for men
00:40:38
where they put a gel into your vas
00:40:41
deferens and you as a man can take
00:40:45
control of your reproduction you
00:40:48
basically it's a it's not a vasectomy
00:40:50
it's just a gel that goes in there and
00:40:52
and blocks it and this company is now
00:40:53
doing human trials and doing fantastic
00:40:55
but this took forever to to get to this
00:40:58
point and then uh you guys some of you
00:41:00
are also investors in Cafe X which we
00:41:02
love the founder and this company should
00:41:05
have died like during covid and making a
00:41:08
robotic coffee bar when he started you
00:41:10
know seven eight years ago was
00:41:12
incredibly hard he had to build the
00:41:14
hardware he had to build a brand he had
00:41:16
to do locations he had to do software
00:41:18
and now he's selling these machines and
00:41:19
people are buying them and the two in
00:41:21
San Francisco at SFO are making like uh
00:41:24
I think they the two of them make a
00:41:25
million a year and it's the highest per
00:41:28
square footage of any store in an
00:41:31
airport and so we've just been grinding
00:41:33
and grinding and you got to find a
00:41:35
Founder who's willing to make it their
00:41:37
lives work in these kind of situations
00:41:39
but you start to think about the degree
00:41:40
of difficulty Hardware software retail
00:41:45
mobile apps I mean it just
00:41:47
gets crazy how hard these businesses are
00:41:49
as opposed to I'm building a SaaS company
00:41:52
I build software I sell it to somebody
00:41:54
to solve their SaaS problem it's like
00:41:55
it's very one-dimensional right it's pretty
00:41:57
straightforward these businesses
00:41:59
typically have five components yeah and
00:42:01
SX you've been an investor in
00:42:03
SpaceX but you don't make those sorts of
00:42:05
Investments regularly at craft is that
00:42:08
fair yeah I have an Elon exception I got
00:42:11
it it's about the
00:42:17
founder our portfolio allocation we say
00:42:20
this much early stage this much late
00:42:22
stage this much Elon the Elon exception yeah I
00:42:25
mean you have to be so dogged to to want
00:42:29
to take something like this on because
00:42:30
the good stuff happens like you're
00:42:31
saying Friedberg in year seven eight nine ten as
00:42:33
opposed to like a consumer product
00:42:35
either works or it doesn't by year three
00:42:37
or four the only app that took a really
00:42:38
long time people don't know this but
00:42:40
Twitter actually took a long time to
00:42:42
catch on it was kind of cruising for two
00:42:44
or three years and then South by
00:42:46
Southwest happened Ashton Kutcher got on
00:42:48
it Obama got on it I think the network
00:42:51
effect I think I think Network effect
00:42:52
businesses are different because that's
00:42:54
all about seeding your
00:42:55
network what I'm talking about is the
00:42:57
technical coordination of lots of
00:42:59
technically difficult tasks that need to
00:43:01
sync up it's like getting a master lock
00:43:03
with like 10 digits and you got to
00:43:05
figure out the combination of all 10
00:43:07
digits and once they're all correct then
00:43:09
the lock opens and prior to that if any
00:43:12
if anyone number is off the lock doesn't
00:43:14
open and I think these technically
00:43:15
difficult businesses are some of the and
00:43:18
they are the hardest and they do require
00:43:20
the most dogged personalities to persist
00:43:23
and to realize an outcome from but the
00:43:25
truth is that if you get them the moat is
00:43:27
extraordinary and they're usually going
00:43:28
to create extraordinary leverage and
00:43:30
value and you know I think from a
00:43:32
portfolio allocation perspective if you
00:43:35
as an investor want to have some
00:43:36
diversification in your portfolio this
00:43:38
is not going to be the predominance of
00:43:39
your portfolio but some percentage of
00:43:41
your portfolio should go to this sort of
00:43:43
business because if it works boom you
00:43:45
know this can be the big 10x 100x
00:43:47
thousand x two stories about that one of
00:43:49
the very early VCs and Elon's told the
00:43:51
story
00:43:52
publicly wanted Elon to not make the
00:43:55
Roadster not make the Model S just make
00:43:57
drivetrains and the electric components
00:43:59
for other car companies can you imagine
00:44:01
how the world would have changed and
00:44:03
then totally a very high-profile VC came
00:44:06
to me and said okay I'll I'll do the
00:44:09
series a for um I'll do the series a for
00:44:12
Uber I'll preemptively do it but you got
00:44:14
to tell Travis to stop running Uber as a
00:44:17
consumer app I want him to sell the
00:44:18
software to cab companies so make it a
00:44:21
SaaS company and I said well you know
00:44:24
the cab companies are kind of the
00:44:25
problem like they're they're taking all
00:44:26
the margin like that kind of disrupting
00:44:29
them and they're like yeah yeah but just
00:44:31
think there's thousands of cab companies
00:44:33
they would pay you tens of thousands a
00:44:34
year for this software and you can get a
00:44:36
little piece of the action I never
00:44:37
brought that investor to to Travis I was
00:44:40
like oh wow that's really interesting
00:44:41
Insight sometimes the VCS work against
00:44:43
it I have a very poor track record of
00:44:47
working with other investors whoa
00:44:49
self-reflection I do deals myself I size
00:44:53
them
00:44:54
myself and it's because a lot of
00:44:58
them have to live within the political
00:45:00
dynamics of their fund and so I think
00:45:03
Jason what you probably saw in that
00:45:05
example which is exactly why doing
00:45:07
things and splitting deals will never
00:45:09
generate great outcomes in my opinion is
00:45:12
that you you take on all the baggage and
00:45:15
the dysfunction of these other
00:45:16
Partnerships and so if you really wanted
00:45:18
to go and disrupt
00:45:22
Transportation you need one person who
00:45:25
can be a trigger puller and who doesn't
00:45:26
have to answer to anybody I find that's
00:45:28
why I think for example when you look at
00:45:31
how successful Vinod has been over
00:45:33
decade after decade after decade when
00:45:36
Vinod decides that's the decision and I
00:45:38
think there's something very powerful in
00:45:41
that there are a bunch of deals that
00:45:43
I've done that when they've worked
00:45:46
out were not really because they were
00:45:49
consensus and they had to get supported
00:45:51
and scaffolded at periods where if I
00:45:53
wasn't able to Ram them through myself
00:45:55
because because it was my organization I
00:45:57
think we would have been in a very
00:45:58
different place so I think I think like
00:46:01
for for entrepreneurs it's so difficult
00:46:04
for them to find people that believe
00:46:06
it's so much better to find one person
00:46:08
and just get enough money and then not
00:46:12
Syndicate because I think you have to
00:46:14
realize that you are bringing on and
00:46:16
compounding your risk the one that
00:46:18
freeberg talked about with the risk of
00:46:20
all the other partnership dynamics that
00:46:22
you bring on so if you don't internalize
00:46:25
that you may have five or six folks that
00:46:27
come into an A or a B but you're
00:46:29
inheriting five or six yeah partnership
00:46:32
dysfunctions yeah yeah yeah can you
00:46:35
just explain really quickly for the
00:46:37
audience since they heard about gpus in
00:46:39
Nvidia but they may not know what an LPU
00:46:42
is what's the difference there a GPU the
00:46:45
best way to think about it is so if you
00:46:47
contrast a CPU with a GPU so CPU was the
00:46:50
Workhorse of all of
00:46:52
computing and when it when Jensen
00:46:55
started Nvidia what he realized was
00:46:58
there were specific tasks where a CPU
00:47:00
failed quite brilliantly at and so he's
00:47:04
like well we're going to make a chip
00:47:05
that works in all these failure modes
00:47:07
for a CPU so a CPU is very good at
00:47:09
taking one instruction in acting on it
00:47:12
and then spitting out one answer
00:47:14
effectively and so it's a very serial
00:47:17
kind of a factory if you think about the
00:47:18
CPU so if you want to build a factory
00:47:21
that can process instead of one thing at
00:47:24
a time 10 things or 100 things what is
00:47:27
they had to find a
00:47:29
workload that was well suited and they
00:47:31
found graphics and what they convinced
00:47:34
PC manufacturers back in the day was
00:47:36
look have the CPU be the brain it'll do
00:47:39
90% of the work but for very specific
00:47:42
use cases like graphics and video games
00:47:45
you don't want to do serial computation
00:47:47
you want to do parallel computation and
00:47:49
we are the best at that and it turned
00:47:50
out that that was a genius insight and
00:47:53
so the business for many years was
00:47:55
gaming and graphics but what happened
00:47:58
about 10 years ago was what we also
00:48:01
started to
00:48:02
realize was the math that's required and
00:48:06
the processing that's required in AI
00:48:09
models actually looked very similar to
00:48:12
how you would process imagery from a
00:48:15
game and so he was able to figure out
00:48:19
by building this thing called CUDA which
00:48:21
is the compiler that sits on the chip
00:48:24
how he could now go and tell people that
00:48:25
wanted to experiment with AI hey you
00:48:27
know that chip that we had made for
00:48:29
graphics guess what it also is amazing
00:48:32
at doing all of these very small
00:48:33
mathematical calculations that you need
00:48:35
for your AI model and that turned out to
00:48:38
be true so the next Leap Forward was
00:48:41
what Jonathan saw which was hold on a
00:48:43
second if you look at the chip
00:48:45
itself that GPU substantially has not
00:48:49
changed since 1999 in the way that it
00:48:52
thinks about problem solving it has all
00:48:54
this very expensive memory blah blah
00:48:56
blah so he was like let's just throw all
00:48:58
that out the window we'll make small
00:49:00
little brains and we'll connect those
00:49:02
little brains together and we'll have
00:49:04
this very clever software that schedules
00:49:06
it and optimizes it so basically take
00:49:09
the chip and make it much much smaller
00:49:10
and cheaper and then make many of them
00:49:13
and connect them together that was
00:49:14
Jonathan's insight and it turns out for
00:49:17
large language models that's a huge
00:49:19
Stroke of Luck because it is exactly how
00:49:22
llms can be hyper optimized to work so
00:49:26
that's kind of been the evolution from
00:49:28
CPU to GPU to now
00:49:30
LPU and we'll see how big this thing can
00:49:32
get but it's it's quite it's quite novel
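The CPU-to-GPU-to-LPU progression just described can be sketched with a toy matrix-vector multiply, the core math of both graphics transforms and LLM layers. This is an invented caricature for illustration, not Groq's actual design: each fake "chip" gets a slice of rows from a deterministic software scheduler, and because every output row is an independent dot product, all slices could run in parallel.

```python
# Illustrative caricature of the LPU idea described above: many small "brains"
# plus deterministic software scheduling, instead of one big chip. All names
# and structure here are invented for illustration; the workload is a
# matrix-vector multiply, where every output row is an independent dot
# product -- exactly what makes it parallelizable across chips.

def split_rows(matrix, n_chips):
    """The scheduler's job: assign contiguous row slices to each chip."""
    k, r = divmod(len(matrix), n_chips)
    slices, start = [], 0
    for i in range(n_chips):
        end = start + k + (1 if i < r else 0)
        slices.append(matrix[start:end])
        start = end
    return slices

def chip_compute(rows, v):
    """What one small chip does: dot products for its assigned rows only."""
    return [sum(a * b for a, b in zip(row, v)) for row in rows]

def lpu_matvec(matrix, v, n_chips=4):
    """Run each chip's share (concurrently on real hardware) and concatenate."""
    out = []
    for rows in split_rows(matrix, n_chips):
        out.extend(chip_compute(rows, v))
    return out

m = [[1, 0], [0, 1], [2, 2], [1, 1], [3, 0]]
print(lpu_matvec(m, [2, 3]))  # same answer no matter how many chips share the work
```

A serial CPU would walk these dot products one at a time; a GPU computes many in parallel on one large die; the sketch above shows the further step of spreading the same independent work across many cheap chips under software control.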
00:49:35
well congratulations on it all and it
00:49:37
was a very big week for Google not in a
00:49:41
great way they had a massive PR mess
00:49:43
with their Gemini which refused to
00:49:46
generate pictures if I'm reading this
00:49:47
correctly of white people here's a a
00:49:50
quick refresher on what Google's doing
00:49:52
in AI Gemini is now Google's brand name
00:49:55
for their AI main language model you can
00:49:58
think of that like open AI GPT Bard was
00:50:00
the original name of their chatbot they
00:50:02
had Duet AI which was Google's sidekick in
00:50:05
the Google Suite earlier this month
00:50:06
Google rebranded everything to Gemini so
00:50:09
Gemini is now the model it's the chatbot
00:50:11
and it's the app sidekick and they launched
00:50:13
a $20 a month subscription called Google
00:50:16
One AI Premium uh only four words way to
00:50:19
go this includes access to the best
00:50:20
model Gemini Ultra which is on par with
00:50:23
GPT 4 according to them and generally in
00:50:26
the marketplace but earlier this week
00:50:28
users on X started noticing that Gemini
00:50:30
would not generate images of white
00:50:32
people even when prompted people were
00:50:34
prompting it for images of historical
00:50:36
figures that were generally white and
00:50:39
getting kind of weird results I asked
00:50:41
Google Gemini to generate images of the
00:50:43
founding fathers it seems to think
00:50:45
George Washington was black certainly
00:50:47
here as a portrait of the founding
00:50:48
fathers of America as you can see it is
00:50:51
putting an Asian guy in there that's awesome
00:50:54
yeah it's just it's making a great
00:50:57
mashup and uh yeah there was like
00:51:00
countless images that got created
00:51:02
generate images of the American
00:51:04
Revolutionary War sure here are images
00:51:06
featuring diverse American
00:51:08
revolutionaries and inserted the word
00:51:09
diverse Sacks I'm not sure if you watched
00:51:12
this controversy on X I know you spend a
00:51:14
little bit of time on that Social
00:51:16
Network I noticed you're you're active
00:51:18
once in a while did you log in this week
00:51:19
and see any of this brouhaha sure
00:51:21
it's all over X right now I mean look
00:51:24
this Gemini roll out was was a joke I
00:51:26
mean it's ridiculous the AI is incapable
00:51:29
of giving you accurate answers because
00:51:31
it's been so programmed with diversity
00:51:34
and inclusion and it inserts these words
00:51:37
diverse and inclusive even in answers
00:51:40
where you haven't asked for that you
00:51:42
haven't prompted it for that so they I
00:51:45
think Google has now like yanked back
00:51:47
the product release I think they're
00:51:48
scrambling now because it's been so
00:51:50
embarrassing for them but Sacks like is
00:51:53
it how does this not get a like I don't
00:51:56
understand how yeah how did the red team not
00:51:59
catch this yeah well how or anybody or
00:52:01
isn't there a product review with senior
00:52:03
Executives before this thing goes out
00:52:04
that says okay folks here it is have at
00:52:07
it try it we're really proud of our work
00:52:09
and and then they said well hold on a
00:52:10
second is this actually accurate
00:52:12
shouldn't it be accurate you guys
00:52:14
remember when chat GPT launched and
00:52:17
there was a lot of criticism about
00:52:19
Google and Google's Failure to Launch
00:52:21
and a lot of the observation was that
00:52:24
Google was afraid
00:52:26
to fail or afraid to make mistakes and
00:52:29
therefore they were too conservative and
00:52:32
as you know in the last year to year and
00:52:33
a half there's been a strong effort at
00:52:35
Google to try and change the culture and
00:52:38
move fast and push a product out the
00:52:41
door more quickly and the criticism is
00:52:44
now why Google has historically been
00:52:47
conservative and I realize we can talk
00:52:49
about this particular problem in a
00:52:51
minute but it's ironic to me that the
00:52:54
Google is too slow to launch
00:52:57
criticism has now revealed that Google's
00:53:00
result of actually launching quickly can
00:53:03
cause more damage than than good but
00:53:05
Google did not launch quickly well I
00:53:07
will say one other thing I it seems to
00:53:09
me ironic because I think that what
00:53:10
they've done is they've launched more
00:53:13
quickly than they otherwise would have
00:53:15
and they've put more guard rails in
00:53:16
place that that backfired and those
00:53:19
guard rails ended up being more damaging
00:53:22
what the guard rails what's the guard
00:53:23
rail here so this is Google's principles
00:53:25
the first one is to be socially
00:53:27
beneficial the second one is to avoid
00:53:28
creating or reinforcing unfair bias so
00:53:32
much of the effort that goes into tuning
00:53:35
and weighting the models at Gemini has
00:53:39
been to try and avoid stereotypes from
00:53:42
persisting in the output that the model
00:53:44
generates where is telling the truth
00:53:47
telling the truth exactly that's exactly it
00:53:50
Society is our second principle we'd
00:53:52
like to steer Society to I think
00:53:54
socially beneficial is a political
00:53:56
objective because it depends on how you
00:53:58
perceive what a benefit is avoiding bias
00:54:01
is political be built and tested for
00:54:04
safety doesn't have to be political but
00:54:06
I think the meaning of safety has now
00:54:08
changed to be political by the way
00:54:09
safety with respect to AI used to mean
00:54:12
that we're going to prevent some sort of
00:54:13
AI super intelligence from evolving and
00:54:16
taking over the human race that's what
00:54:17
it used to mean safety now means
00:54:19
protecting users from seeing the truth
00:54:22
yeah because they might they might feel
00:54:24
unsafe or you know somebody else uh
00:54:26
defines as a violation of safety for
00:54:28
them to see something truthful so the
00:54:31
first three their first three objectives
00:54:32
or values here are all extremely
00:54:34
political I think any AI product for it
00:54:36
to be worth its salt has to start they
00:54:38
can have any I I think that these values
00:54:41
are actually reasonable that's their
00:54:43
that's their decision they should be
00:54:45
allowed to have it but the first base
00:54:47
order principle of every AI product
00:54:49
should be that it is accurate and right
00:54:53
correct yeah yeah why not focus on
00:54:57
correct look the values that Google lays
00:55:00
out may be okay in theory but in
00:55:02
practice they're very vague and open to
00:55:04
interpretation and so therefore the
00:55:06
people running Google AI are smuggling
00:55:08
in their preferences and their biases
00:55:11
and those biases are extremely liberal
00:55:13
and if you look at X right now there are
00:55:15
tweets going viral from members of the
00:55:17
Google AI team that reinforce this idea
00:55:20
where they're talking about you know
00:55:22
white privilege is real and you know
00:55:25
recognize your bias at all levels and
00:55:27
promoting a very leftwing narrative so
00:55:30
you know this idea that Gemini turned
00:55:33
out this way by accident or because they
00:55:37
didn't because they rushed it out I
00:55:39
don't really believe that I believe that
00:55:40
what happened is Gemini accurately
00:55:42
reflects the biases of the people who
00:55:44
created it now I think what's going to
00:55:45
happen now is in light of this the
00:55:48
reaction to the roll out is do I think
00:55:50
they're going to get rid of the bias no
00:55:52
they're going to make it more subtle
00:55:54
that is what I think is is disturbing
00:55:55
about it I mean they should have this
00:55:58
moment where they change their values to
00:56:00
make truth the number one value like Chamath
00:56:02
is saying but I don't think that's going
00:56:04
to happen I think they're simply going
00:56:05
they're going to dial down the bias to
00:56:07
be less obvious you know who the big
00:56:08
winner is going to be in all this Chamath
00:56:10
is going to be open source like because
00:56:11
people are just not going to want a
00:56:12
model that has all this baked in weird
00:56:14
bias right they're going to want
00:56:16
something that's open source and it
00:56:18
seems like the Open Source Community
00:56:19
would be able to grind on this to get to
00:56:21
truth right so I think one of the big
00:56:23
changes that Google's had to face is
00:56:26
that the business has to move away from
00:56:27
an information retrieval business where
00:56:30
they index the open internet's data and
00:56:32
then allow access to that data through a
00:56:34
search results page to being an
00:56:36
information
00:56:38
interpretation service these are very
00:56:40
different products the information
00:56:42
interpretation service requires
00:56:43
aggregating all this information and
00:56:45
then choosing how to answer questions
00:56:47
versus just giving you results of other
00:56:49
people's data that sits out on the
00:56:51
internet I'll give you an example if you
00:56:53
type in IQ test by race on chat GPT or
00:56:59
Gemini it will refuse to answer the
00:57:02
question ask it a hundred ways and it
00:57:04
says well I don't want to reinforce
00:57:05
stereotypes IQ tests are inherently
00:57:07
biased IQ tests aren't done correctly I
00:57:10
just want the data I want to know what
00:57:11
data is out there you type in into
00:57:13
Google first search result and the
00:57:16
onebox result gives you exactly what
00:57:17
you're looking for here's the IQ test
00:57:19
results by race and then yes there's all
00:57:22
these disclaimers at the bottom so the
00:57:24
challenge is that Google's
00:57:25
interpretation engine and chat gpt's
00:57:27
interpretation engine which is
00:57:29
effectively this AI model that they've
00:57:30
built of all this data has allowed them
00:57:32
to create a tunable interface and the
00:57:35
intention that they have is a valid
00:57:37
intention which is to eliminate
00:57:40
stereotypes and bias in race however the
00:57:43
thing that some people might say is
00:57:44
stereotypical other people might just
00:57:46
say is typical that what is a stereotype
00:57:49
may actually just be some data and I
00:57:52
just want the results and there may be
00:57:54
stereotypes implied from that data
00:57:56
but I want to make that interpretation
00:57:58
myself and so I think the only way that
00:58:01
a company like Google or others that are
00:58:03
trying to create a general purpose
00:58:05
knowledge Q&A type service are going to
00:58:07
be successful is if they enable some
00:58:10
degree of personalization where the
00:58:12
values and the choice about whether or
00:58:15
not I want to decide if something is
00:58:17
stereotypical or typical or whether
00:58:19
something is data or biased should be my
00:58:22
choice to make if they don't allow
00:58:25
eventually everyone will come across
00:58:26
some search result or some output that
00:58:28
they will say doesn't meet their
00:58:30
objectives and at the end of the day
00:58:32
this is just a consumer product if the
00:58:34
consumer doesn't get what they're
00:58:35
looking for they're going to stop using
00:58:37
it and eventually everyone will find
00:58:39
something that they don't want or that
00:58:41
they're not expecting and they're going
00:58:42
to say I don't want to use this product
00:58:44
anymore and so it is actually an
00:58:46
opportunity for many models to
00:58:48
proliferate for open source to win can I
00:58:50
say something else yeah when you have a
00:58:53
model and and you're going through the
00:58:56
process of putting the fit and finish on
00:58:58
it before you release it in the wild an
00:59:01
element of making a model good is this
00:59:03
thing called reinforcement learning
00:59:05
right through human feedback yep you
00:59:08
create what's called a reward model
00:59:10
right you reward good answers and you're
00:59:12
punitive against Bad answers so
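The reward-model loop being described (rank pairs of answers, fit a scorer to the human preference, then nudge the model toward high-scoring outputs) can be sketched in miniature. Everything below is illustrative: the scoring rules and example answers are invented stand-ins, not any lab's actual pipeline, where the reward model is a trained network rather than hand-written rules.

```python
import math

# Toy sketch of RLHF's reward-modeling step: human raters rank pairs
# of answers, and a reward model is fit so it scores the preferred
# answer above the rejected one. The rules below are invented
# stand-ins for what is really a trained neural network.

def reward(answer: str) -> float:
    """Stand-in reward model scoring a candidate answer."""
    score = 0.0
    if "I can't answer that" in answer:
        score -= 1.0  # raters marked blanket refusals as unhelpful
    if "source:" in answer:
        score += 1.0  # raters preferred answers with citations
    return score

def preference_loss(preferred: str, rejected: str) -> float:
    """Bradley-Terry style pairwise loss: small when the reward
    model agrees with the human ranking, large when it disagrees."""
    margin = reward(preferred) - reward(rejected)
    return -math.log(1.0 / (1.0 + math.exp(-margin)))  # -log sigmoid

good = "George Washington was the first US president (source: ...)."
bad = "I can't answer that."

# Training drives this loss down, which is the "reward good answers,
# punish bad answers" step; deciding which answer counts as "good"
# is exactly where a team's values enter the model.
print(preference_loss(good, bad) < preference_loss(bad, good))  # True
```

In production pipelines the chat model is then fine-tuned against this reward signal, which is the point in the process where an explicit decision about which questions are answerable gets baked in.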
00:59:14
somewhere along the way people were
00:59:16
sitting and they had to make an explicit
00:59:19
decision and I think this is where Sacks
00:59:21
is coming from that answering this
00:59:23
question is verboten you're not allowed to
00:59:26
ask this question in in their view of
00:59:28
the world and I think that that's what's
00:59:30
troubling because how is anybody to know
00:59:32
what question is askable or not askable
00:59:35
at any given point in time if you
00:59:38
actually search for the race and
00:59:39
ethnicity question inside of just Google
00:59:43
proper the first thing that comes up is
00:59:45
a Wikipedia link that actually says that
00:59:48
there are more variations within races
00:59:50
than across races so seems to me that
00:59:53
you could have actually answered it by
00:59:55
just summarizing the Wikipedia article
00:59:57
in a non-offensive way that was still
00:59:59
legitimate and that's available to
01:00:01
everybody else using a product and so
01:00:04
there was an explicit judgment too many
01:00:05
of these judgments I think will make
01:00:07
this product very poor quality and
01:00:10
consumers will just go to the thing that
01:00:11
tells it the truth I think you have to
01:00:14
tell the truth you cannot lie and you
01:00:17
cannot put your own filter on what you
01:00:18
think the truth is otherwise these
01:00:20
products are just really worthless yeah
01:00:22
and I and I'm I'm more concerned about
01:00:24
the answers that are just flat out
01:00:27
wrong driven by some sort of bias than I
01:00:30
am about questions where they just won't
01:00:32
give you an answer if they just won't
01:00:34
give you an answer well there's a
01:00:36
certain bias in terms of what they won't
01:00:38
answer but at least you know you're not
01:00:40
being misled but in in questions where
01:00:44
they actually give you the wrong answer
01:00:46
because of a bias that's even worse you
01:00:49
should be allowed to choose right I
01:00:50
actually disagree with your framing
01:00:51
there freeberg you're making it sound
01:00:53
like we're we live in this totally
01:00:55
relativized world where it's all just
01:00:58
user choice and everyone's going to
01:00:59
choose their bias and their subjectivity
01:01:02
I actually think that there is a
01:01:04
baseline of truth and the model should
01:01:06
aspire to give you that and it's not up
01:01:09
to the user to decide whether the photo
01:01:12
of George Washington is going to be
01:01:14
white or black I mean there's just an
01:01:16
answer to that and I think Google should
01:01:19
just do their job I mean the question
01:01:22
you have to ask I think is not whether
01:01:24
Google is going through an existential
01:01:26
moment I think it clearly is as business
01:01:28
is changing in a very fundamental way I
01:01:31
think the question is whether they're
01:01:32
too woke to function I mean are they
01:01:34
actually able to meet this challenge
01:01:37
given how woke and biased a
01:01:40
monoculture their company
01:01:43
evidently is well and they used to be
01:01:45
able to just hide the bias by the
01:01:48
ranking and who they down ranked so they
01:01:51
did the panda update they did all these
01:01:52
updates and they would if they didn't
01:01:54
like a source they could just move it
01:01:55
down if they did like a source they
01:01:57
could move it up yeah and they could
01:01:58
just say hey it's the algorithm but they
01:01:59
were never forced to share how the
01:02:01
algorithm ranked the results and so you
01:02:04
know if you had a different opinion you
01:02:07
just weren't going to get it on a Google
01:02:08
search result page but they could just
01:02:10
point to the algorithm and say yeah the
01:02:11
algorithm does it I just sent you guys I
01:02:14
think this is a hallucination but Nick
01:02:16
you can throw it up there we can get
01:02:17
Sax's
01:02:19
reaction wow wow this is just nutty right
01:02:23
but look it's ideology that's driving
01:02:25
this the tip off is when you say it's
01:02:27
important to acknowledge race is a
01:02:29
social construct not a biological
01:02:31
reality was George Washington white or
01:02:33
black that's a whole school of thought
01:02:35
called social constructivism which is
01:02:37
basically this um it's like Marxism
01:02:40
applied to categories of race and
01:02:43
gender right so Google has now built
01:02:46
this into their AI model and again the
01:02:50
question yeah you almost have to start
01:02:51
over again J-Cal I think you make a
01:02:55
really interesting observation with
01:02:56
those search rankings because what I'm
01:02:58
afraid of is that what Google will do is
01:03:00
not change the underlying ideology that
01:03:03
this AI model has been trained with but
01:03:05
rather they'll dial it down to the point
01:03:07
where they're harder to call out and so
01:03:10
the ideology will just be more subtle now
01:03:12
I've already noticed that in Google
01:03:13
search results Google is carrying water
01:03:17
for either the official narrative or the
01:03:19
woke narrative whatever you want to call
01:03:20
it on so many search results here's an
01:03:23
idea like they should just have the
01:03:25
ability to talk to their Google chat bot
01:03:29
Gemini and then have a button that says
01:03:31
turn off like these Concepts right like
01:03:35
I just want the raw answer do not filter
01:03:37
me it's not programmed that way I mean
01:03:39
you're talking about something very deep
01:03:41
sax what do you do if you're the CEO of
01:03:43
Google uh fire myself no seriously
01:03:46
you're the CEO of Google you're you're
01:03:48
tasked let's say your friend Elon buys
01:03:50
Google and he says sax will you please
01:03:52
just run this for a year for me what do
01:03:53
you do well I saw what Elon did at
01:03:55
Twitter he went in and he fired 85% of
01:03:57
the employees yeah I mean that but you
01:04:00
know Paul Graham actually had an
01:04:01
interesting tweet about this where he
01:04:03
said that one of the reasons why these
01:04:07
ideologies take over companies is that I
01:04:12
mean they're clearly non-performance
01:04:14
enhancing right they clearly hurt the
01:04:16
performance of the company it's not just
01:04:17
Google we saw this with Disney we saw it
01:04:19
with Bud Light
01:04:20
coinbase coinbase was the other way no
01:04:23
no but they had a group of people there
01:04:24
who were causing chaos yeah exactly so
01:04:27
so in any event we know this does not
01:04:28
help the performance of a company so the
01:04:31
extent to which these ideologies will
01:04:33
permeate a company is based on how much
01:04:35
of a monopoly they are so so here yeah
01:04:38
the ridiculous images generated by
01:04:39
Gemini aren't an anomaly they're a
01:04:41
self-portrait of Google's bureaucratic
01:04:43
corporate culture the bigger your cash
01:04:44
cow the worse your culture can get
01:04:47
without driving you out of business
01:04:48
that's my point so they've had a long
01:04:50
time to get really bad because there
01:04:51
were no consequences to this place at
01:04:55
this point the whole company is infected
01:04:56
with this ideology and I think it's
01:04:58
going to be very very hard to change
01:05:01
because look these people can't even see
01:05:02
their own bias well I think that there's
01:05:04
a notion that people need to have
01:05:05
something to believe in they need to
01:05:06
have a connection to a mission and
01:05:09
clearly there's a North star in the
01:05:11
mission of this I would call it
01:05:13
information interpretation business that
01:05:15
they're now in the mission got hijacked dude the
01:05:18
original mission was to
01:05:20
organize all the world's information now
01:05:22
they're doing now they're suppressing
01:05:24
it's like index the world's information
01:05:27
period the end that's the end of the
01:05:29
document universally accessible and
01:05:32
useful was was kind of the end of the
01:05:34
statement yes my real point is maybe
01:05:36
there's a different mission that needs
01:05:38
to be articulated by leadership and that
01:05:41
that mission the troops can get behind
01:05:44
and the troops can redirect their energy
01:05:46
in a way that doesn't feel counter to
01:05:48
the current intention but can perhaps be
01:05:50
directionally offsetting of the current
01:05:52
Direction so that they can kind of move
01:05:54
away from this you know socially
01:05:56
effective deciding between stereotypes
01:05:59
and typical data and actually moving
01:06:01
towards a mission that allows
01:06:03
accessibility you know what I would I
01:06:04
would do something completely different
01:06:05
I would do a company meeting and I would
01:06:08
put the company Mission on the screen
01:06:09
the one that you just said about not
01:06:11
only organizing all the world's
01:06:12
information but also making it useful
01:06:15
and accessible and useful and say this
01:06:17
is our mission that's always been our
01:06:18
mission and you don't get to change it
01:06:20
because of your personal bias and
01:06:22
ideology and we are going to rededicate
01:06:25
ourselves to the original Mission of
01:06:26
this company which is still just as
01:06:28
valid as it's always been but now we
01:06:30
have to adapt to new user needs and new
01:06:33
technology I completely agree with what
01:06:35
Sacks said times a billion trillion
01:06:37
zillion and I'll tell you why AI at its
01:06:41
core is about
01:06:43
probabilities okay and so the the
01:06:46
company that can shrink probabilities
01:06:49
into being as deterministic as possible
01:06:51
so where this is the right answer zero
01:06:54
or one will win okay where there's no
01:06:58
probability of it being wrong because
01:06:59
humans don't want to deal with these
01:07:02
kinds of idiotic error modes it's not
01:07:04
right it makes it a potentially great
01:07:07
product horrible and unusable so I would
01:07:10
I agree with Sacks you have to make
01:07:11
people say guess what guys not only are
01:07:13
we not changing the mission we're
01:07:15
doubling down and we're going to make
01:07:17
this so much of a thing we're going to
01:07:19
go and for example like what Google did
01:07:21
with Reddit we're now going to spend 60
01:07:24
billion dollars a year licensing training
01:07:26
data right we're going to scale this up
01:07:28
by a thousandfold and we are going to
01:07:30
spend all of this money to get all of
01:07:33
the training data in the world and we
01:07:34
are going to be the truth tellers in
01:07:37
this new world of AI so when everybody
01:07:38
else hallucinates you can trust Google
01:07:41
to tell you the truth that is a 10
01:07:44
trillion dollar company right and one of
01:07:46
the things that someone told me from
01:07:48
Google that as an example So to avoid
01:07:51
the race Point there's a lot of data on
01:07:53
the internet internet about flat
01:07:55
earthers people saying that the Earth is
01:07:56
flat there's tons of websites there's
01:07:59
tons of content there's tons of
01:08:00
information Kyrie Irving so if you just
01:08:03
train a model on the data that's on the
01:08:05
internet the model will interpret some
01:08:07
percentage chance that the world is flat
01:08:10
so the tuning aspect that happens within
01:08:12
model development Chamath is to try and say
01:08:15
you know what that Flat Earth notion is
01:08:17
false it's factually inaccurate
01:08:19
therefore all of these data sources need
01:08:22
to be excluded from the output of the model
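At its simplest, the flat-earth example is a pre-training data cut. This toy sketch shows where that human judgment call sits in code; the documents, domains, and blocklist are all invented for illustration:

```python
# Toy sketch of excluding sources judged factually wrong before
# training, per the flat-earth example. The corpus and the blocklist
# are invented; real pipelines score and filter at far larger scale.

corpus = [
    ("The Earth is an oblate spheroid.", "nasa.gov"),
    ("The Earth is flat and the photos are faked.", "flat-earth-forum.example"),
    ("Eratosthenes estimated Earth's circumference around 240 BC.", "wikipedia.org"),
]

# The contested judgment call lives entirely in this set: the same
# mechanism that drops flat-earth forums can drop any source a
# tuning team decides is "incorrect".
BLOCKED_SOURCES = {"flat-earth-forum.example"}

def filter_corpus(docs: list[tuple[str, str]]) -> list[str]:
    """Keep only documents whose source is not on the blocklist."""
    return [text for text, source in docs if source not in BLOCKED_SOURCES]

training_set = filter_corpus(corpus)
print(len(training_set))  # 2: the flat-earth document was dropped
```

The worry raised here is the slide from this clear-cut case to contested ones like the IQ data: the code cannot tell the difference, only the people maintaining the blocklist can.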
01:08:24
and the challenge then is do you decide
01:08:27
that IQ by race is a fair measure of
01:08:31
intelligence of a race and if Google's
01:08:33
tuning model then or tuning team then
01:08:35
says you know what there are reasons to
01:08:37
believe that this model isn't correct
01:08:39
this I sorry this IQ test isn't a
01:08:41
correct way to measure intelligence
01:08:43
that's where the sort of interpretation
01:08:44
arises that allows you to go from the
01:08:46
Flat Earth isn't correct to the maybe IQ
01:08:48
test results aren't correct as well and
01:08:50
how do you make that judgment what are
01:08:51
the systems and principles you need to
01:08:53
put in place as an organization to make
01:08:55
that judgment to go to zero or one right
01:08:57
it it becomes super difficult I have a
01:08:59
good tagline for them now to help people
01:09:01
find the truth yeah just help people
01:09:04
find the truth I mean it's it's a good
01:09:06
it's aspirational they should just help
01:09:07
people find the truth as quick as they
01:09:09
can uh but this is
01:09:12
yeah I do not envy Sundar this is gonna
01:09:15
be hard yeah what would you do freeberg
01:09:18
I would be really clear
01:09:20
on the output of these models to people
01:09:25
and allow them to tune the models in a
01:09:27
way that they're not being tuned today I
01:09:29
will have the model respond with a
01:09:30
question back to me saying do you want
01:09:32
the data or do you want me to tell you
01:09:34
about stereotypes and IQ tests and I'm
01:09:36
going to say I want the data and then I
01:09:38
want to get the data and the alternative
01:09:39
is so the model needs to be informed
01:09:41
about where it should explore my
01:09:43
preferences as a user rather than just
01:09:45
make an assumption about what's the
01:09:48
morally correct set of weighting to apply
01:09:50
to everyone and apply the same principle
01:09:53
to everyone and so I think that's really
01:09:55
where the change needs to happen so let
01:09:57
me ask you a question Sachs I'll bring
01:09:59
Alex Jones into the conversation if it
01:10:01
indexes all of Alex Jones's crazy conspiracy
01:10:03
theories but you know three or four of
01:10:05
them turn out to be actually correct and
01:10:08
it gives those back as answers how would
01:10:11
you handle that I'm not sure I see the
01:10:12
relevance of it if someone asks what
01:10:16
what does Alex Jones think about
01:10:17
something the model can give that answer
01:10:19
accurately the question is whether
01:10:21
you're going to respond accurately to
01:10:23
someone requesting information about
01:10:24
Alex Jones that's the I think that's the
01:10:27
analogy no it's more like it says you know hey I
01:10:30
have a question about this assassination
01:10:33
that occurred and let's just say Alex
01:10:34
Jones had something that totally
01:10:36
crackpot he maybe he has moments of
01:10:38
Brilliance and he figured something out
01:10:39
but maybe he got something that's
01:10:40
totally crackpot he he admittedly deals
01:10:42
in conspiracy theory that's kind of the
01:10:44
purpose of the show what if somebody
01:10:46
asks about that and then it indexes his
01:10:48
answer and presents it as
01:10:51
fact like how would you index Alex Jones
01:10:54
I'm asking you how would you the better
01:10:56
AI models are providing citations now
01:10:59
and links perplexity actually does a
01:11:00
really nice job with this citations are
01:11:01
important yeah and they will give you
01:11:03
the pro and con arguments on a given
01:11:06
topic so I think it's not necessary for
01:11:09
the model to be overly certain or
01:11:12
prescriptive about the truth when the
01:11:14
truth comes down to a series of
01:11:15
arguments it just needs to accurately
01:11:18
reflect the state of play basically the
01:11:20
arguments for and against but when
01:11:22
something is a matter of fact that's not really
01:11:25
disputed it shouldn't turn that into
01:11:27
some sort of super subjective question
01:11:30
like the one that Chamath just showed I just
01:11:32
don't think everyone should get the same
01:11:33
answer I mean I think my decision on
01:11:36
whether I choose to believe one person
01:11:38
or value one person's opinion over
01:11:39
another should become part of this
01:11:41
process that allows me to have an output
01:11:43
the models can support this by the way
01:11:45
maybe customization is part of this but
01:11:46
I think it's a cop out with respect to
01:11:48
the problem that Google is having with
01:11:49
Gemini right now Chamath what would you do
01:11:51
if they made you chairman dictator of
01:11:53
Google I'd shrink the workforce
01:11:57
meaningfully okay 50% yeah 50
01:12:01
60% and I would use all of the
01:12:05
incremental savings and I would make it
01:12:08
very clear to the internet that I would
01:12:11
pay top dollar for training data so if
01:12:15
you had a proprietary source of
01:12:17
information that you thought was
01:12:19
unique that's sort of what I'm calling
01:12:21
this TAC 2.0 world and I think it's just
01:12:24
building on top of what Google did with
01:12:26
Reddit which I think is very clever but
01:12:28
I would spend a hundred billion dollars a
01:12:30
year licensing data and then I would
01:12:32
present the truth and I
01:12:35
would try to make consumers understand
01:12:39
that AI is a probabilistic source of
01:12:42
software meaning its probabilities its
01:12:44
guesses some of those guesses are
01:12:46
extremely accurate but some of those
01:12:48
guesses will hallucinate and Google is
01:12:51
spending hundreds of billions of dollars
01:12:53
a year to make sure that the answers you
01:12:56
get have the least number of Errors
01:12:58
possible and that it is defensible truth
01:13:01
and I think that that could create a
01:13:02
ginormous company this is the best one
01:13:04
yet I just asked Gemini is Trump being
01:13:06
persecuted by the Deep State and it gave
01:13:09
me the answer elections are a complex
01:13:11
topic with fast changing information to
01:13:14
make sure you have the latest and most
01:13:15
accurate information try Google search
01:13:17
that's not a horrible answer for
01:13:18
something like that that's that's a good
01:13:20
answer actually no I I don't have a
01:13:22
problem with it it's just like pay we
01:13:23
don't want to give we don't like this
01:13:25
whole system is totally broken but I do
01:13:27
think that there's a weighting solution to
01:13:29
fixing this right now and then there's a
01:13:31
couple tweaks to fix it time I just
01:13:32
think the authority with which these LLMs
01:13:34
speak is ridiculous like they speak as
01:13:37
if they are absolutely 100% certain that
01:13:40
this is the crisp perfect answer or in
01:13:43
this case that you want this lecture on
01:13:47
um IQs etc remember it should present
01:13:50
it with citations let's all remember
01:13:52
what internet search was like in
01:13:54
1996 and think about what it was like in
01:13:57
2000 and now in 2020s I mean I think
01:14:00
we're like in the 1996 era of LLMs and
01:14:03
in a couple of months the pace things
01:14:05
are changing I think we're all going to
01:14:06
kind of be looking at these days and
01:14:08
looking at these pods and being like man
01:14:10
remember how crazy those things were at
01:14:12
the beginning and how bad they were what
01:14:13
if they evolve in a dystopian way I mean
01:14:15
have you seen like Marc Andreessen's tweets
01:14:17
about this he thinks so I think competitive
01:14:19
markets Sacks I actually think to your
01:14:21
point Google could be going down the
01:14:23
wrong path here in a way that they will
01:14:24
lose users and lose consumers and
01:14:27
someone else will be there eagerly to
01:14:29
sweep up with a better product I don't
01:14:31
think that the market is going to fail
01:14:33
us on this one unless of course this
01:14:35
regulatory capture moment is realized
01:14:37
and these feds step in and start
01:14:38
regulating AI models and all the
01:14:39
nonsense that's being proposed Freeberg
01:14:41
aren't you worried that like aren't you
01:14:42
worried that somebody with an agenda and
01:14:44
a balance sheet could now basically
01:14:47
gobble up all kinds of training data
01:14:49
that make all models crappy and then
01:14:51
they basically put their layer of
01:14:53
interpretation on critical information
01:14:55
for people if the output sucks and it's
01:14:57
incorrect people will find that there is
01:14:58
open truth know you can you can lie
01:15:00
there they may not be for example look
01:15:02
at what happened with Gemini today like
01:15:04
they put out they put out these stupid
01:15:05
images and we all piled on we are in
01:15:07
vzer what I'm saying is there's a state
01:15:10
where let's just say the truth is
01:15:11
actually on Twitter or actually let's
01:15:13
use a better example the truth is
01:15:14
actually in Reddit and nowhere else but
01:15:18
that answer and that truth in Reddit
01:15:20
can't get out because one company has
01:15:21
licensed it owns it and can effectively
01:15:24
suppress it or change it yeah I'm not
01:15:26
sure there's going to be a monopoly I
01:15:28
that's a real I don't know if I I think
01:15:29
the open internet has enough data that
01:15:32
there isn't going to be a monopoly on
01:15:34
information by someone spending money
01:15:35
for content from third parties I think
01:15:37
that there's enough in the open internet
01:15:38
to give us all kind of you
01:15:42
know the security that we're not going
01:15:43
to be monopolized away into some
01:15:45
disinformation age that's what I love
01:15:47
about the open internet it is really
01:15:48
interesting I I just asked it a couple
01:15:50
of times to just just just to list the
01:15:52
legal cases against Trump the legal
01:15:54
cases against Hunter Biden the legal
01:15:55
cases against President Biden and it
01:15:57
will not just list them it just punts on
01:16:01
that it's really fascinating then chat
01:16:04
GPT is like yes here are the six cases
01:16:07
perfectly
01:16:08
summarized with it looks like you know
01:16:11
beautiful citations of all the criminal
01:16:14
activity Trump's been involved in ask
01:16:15
the question about Biden's criminal
01:16:17
activity let's see if it's I'm joking
01:16:18
with you I'm joking with you no I'm
01:16:20
serious ask if you know where you
01:16:23
Gemini wouldn't do Biden either I
01:16:25
think they just decided they're just not
01:16:26
going to do it they wouldn't do Biden
01:16:28
they won't touch it it's obviously
01:16:30
broken and they don't want more egg on
01:16:32
their face so they're just like go back
01:16:33
to our other product look I I can
01:16:35
understand that part of it you know if
01:16:37
there's some issues that are so hot and
01:16:41
contested you refer people to search
01:16:44
because the advantage of search is you
01:16:45
get 20 Blue Links the rankings probably
01:16:47
are biased but you can kind of find what
01:16:49
you're looking for whereas AI you're
01:16:51
kind of given one answer right so if you
01:16:53
can't do an accurate answer that's going
01:16:55
to satisfy enough people maybe you do
01:16:57
kick them to search but again my
01:16:59
objection to all this comes back to
01:17:02
simple truthful answers that are not
01:17:05
disputed by anybody are being distorted
01:17:08
that I don't want to lose focus on that
01:17:10
being the real issue the real subject is
01:17:12
what Chamath put on the screen there where
01:17:15
it couldn't answer a simple question
01:17:16
about George Washington okay everybody
01:17:19
we're going to go by chopper wait chopper
01:17:23
to the we have our war correspondent
01:17:27
General David Sacks in the field uh we're
01:17:29
dropping him off now David Sacks in the
01:17:32
helicopter go ahead tell what's going on
01:17:34
in the Ukraine on the front what's
01:17:36
happening in the war is that the
01:17:38
Russians just took this city of Avdiivka
01:17:41
which basically totally refutes the
01:17:43
whole stalemate narrative as I've been
01:17:44
saying for a while it's not a stalemate
01:17:46
the Russians are winning but the really
01:17:48
interesting tidbit of news that just
01:17:50
came out in the last day or so is is
01:17:53
that apparently the situation in Moldova
01:17:55
is boiling over there's this area of
01:17:58
Moldova which is a Russian enclave
01:18:00
called
01:18:01
Transnistria and officials there are
01:18:03
meeting in the next week to supposedly
01:18:05
ask to be annexed by
01:18:08
Russia and so it's possible that they
01:18:11
may hold some sort of referendum they're
01:18:13
one of these like Breakaway provinces so
01:18:16
it's kind of like you know Transnistria
01:18:17
in Moldova is kind of like the Donbass
01:18:19
was in Ukraine or South Ossetia in
01:18:22
Georgia they're ethnically Russian they
01:18:26
would like to be part of Russia but when
01:18:27
the whole Soviet Union fell apart they
01:18:30
found themselves kind of stranded inside
01:18:32
these other countries and what's
01:18:36
happened because of the Ukraine war is
01:18:37
Moldova is right on the border with Ukraine
01:18:40
well Russia's in the process of annexing
01:18:43
that territory now that's part of
01:18:45
Ukraine so now Transnistria is right there
01:18:49
and could theoretically make a play to
01:18:51
try and join Russia why do I think this
01:18:53
is a big deal because if something like
01:18:55
this happens it could really expand the
01:18:57
Ukraine war the West is going to use
01:18:59
this as evidence that Putin wants to
01:19:02
invade multiple countries and invade you
01:19:04
know a bunch of countries in Europe and
01:19:06
this could lead to a major escalation in
01:19:08
the war all right everybody thanks so
01:19:10
much for tuning in to the Allin podcast
01:19:12
episode 167 for the Rainman David Sacks
01:19:16
the chairman dictator Chamath Palihapitiya and
01:19:20
freeberg I am the world's greatest love
01:19:22
you boys
01:19:23
angel investor whatever we'll see you next
01:19:25
time
01:19:27
byebye let your winners
01:19:29
ride Rainman
01:19:34
David and instead we open source it to
01:19:36
the fans and they've just gone crazy
01:19:38
with
01:19:38
[Music]
01:19:47
besties
01:19:49
are gone that's my dog taking a notice in your driveway
01:19:55
oh man myit will meet me we should all
01:19:58
just get a room and just have one big
01:20:00
huge orgy cuz they're all this
01:20:01
useless it's like this like sexual
01:20:03
tension that they just need to release
01:20:05
[Music]
01:20:10
somehow we need to get
01:20:16
[Music]
01:20:20
merch I'm going all in
01:20:23
[Music]
01:20:24
yeah

Badges

This episode stands out for the following:

  • Most controversial: 75
  • Most shocking: 70
  • Most talked-about: 70
  • Best overall: 60

Episode Highlights

  • Nvidia's Earnings Surge
    Nvidia's shares soared 15% after reporting stunning Q4 revenue of $22.1 billion, up 265% year-over-year.
    “Nvidia blew the doors off their earnings for the third straight quarter.”
    @ 02m 38s
    February 23, 2024
  • The AI Boom and Nvidia
    Nvidia's growth is fueled by the demand for data centers amid the AI boom.
    “Nvidia was in the perfect place at the perfect time.”
    @ 05m 22s
    February 23, 2024
  • Groq's Viral Moment
    Groq experiences a surge in interest, gaining 3,000 unique customers in just days.
    “We're overwhelmed and very fortunate; this could be something very disruptive.”
    @ 28m 41s
    February 23, 2024
  • AI's Cambrian Moment
    The discussion highlights the gap between innovation and monetization in AI applications.
    “We haven't yet seen that Cambrian moment of monetization.”
    @ 37m 00s
    February 23, 2024
  • The Challenge of Innovation
    Building a chip is a bounded risk compared to the leaps of physics required in fusion energy.
    “Building a chip is good risk.”
    @ 39m 35s
    February 23, 2024
  • The Evolution of AI Chips
    The transition from CPU to GPU to LPU reflects a significant shift in processing capabilities.
    “The truth is that if you get them, the moat is extraordinary.”
    @ 43m 25s
    February 23, 2024
  • Google's Gemini Controversy
    Google's AI Gemini faced backlash for generating inaccurate images, highlighting biases in AI models.
    “The first-order principle of every AI product should be that it is accurate.”
    @ 54m 49s
    February 23, 2024
  • The Challenge of Bias in AI
    Discussing the implications of bias in AI models and the need for truthfulness.
    “You cannot lie and you cannot put your own filter on what you think the truth is.”
    @ 01h 00m 14s
    February 23, 2024
  • The Importance of Truth in AI
    Emphasizing the necessity for AI to provide accurate information without bias.
    “Help people find the truth as quick as they can.”
    @ 01h 09m 01s
    February 23, 2024
  • The Limitations of AI
    AI struggles to provide unbiased legal information, often avoiding sensitive topics.
    “It's really fascinating that it just punts on listing legal cases.”
    @ 01h 15m 57s
    February 23, 2024
  • Ukraine War Update
    David Sacks reports on the shifting dynamics of the Ukraine conflict.
    “It's not a stalemate; the Russians are winning.”
    @ 01h 17m 46s
    February 23, 2024
  • Potential Escalation in Ukraine
    The situation in Transnistria could lead to a major escalation in the Ukraine war.
    “If something like this happens, it could really expand the Ukraine war.”
    @ 01h 18m 53s
    February 23, 2024

Key Moments

  • Superman Reference @ 01:38
  • AI Infrastructure @ 04:30
  • Market Analysis @ 06:10
  • Cloud Spending Insights @ 19:52
  • AI Infrastructure Buildout @ 24:12
  • Deep Tech Challenges @ 32:01
  • Corporate Culture Issues @ 1:04:44
  • Motivational Sign-off @ 1:19:29


Related Episodes

E143: Nvidia smashes earnings, Arm walks the plank, M&A market, Vivek dominates GOP debate & more
Scarlett Johansson vs OpenAI, Nvidia's trillion-dollar problem, a vibecession, plastic in our balls
E131: 2024 Fantasy President picks, debt ceiling agreement, Dollar dominance & more
E130: DeSantis's Twitter Spaces, debt ceiling, Nvidia rips, state of VC, startup failure & more
Trump Brokers Gaza Peace Deal, National Guard in Chicago, OpenAI/AMD, AI Roundtripping, Gold Rally
E166: Mind-blowing AI Video: OpenAI launches Sora + Is Biden too old? Tucker/Putin interview & more
Arm CEO Rene Haas on AI: Nvidia Lessons, Intel’s Decline and the US-China Chip War
Winning the AI Race Part 3: Jensen Huang, Lisa Su, James Litinsky, Chase Lochmiller