
E168: Can Google save itself? Abolish HR, AI takes over Customer Support, Reddit IPO teardown

March 01, 2024 / 01:26:52

This episode of the All-In Podcast covers topics such as Google's AI issues, Reddit's S-1 filing, and Apple's Project Titan. Guests include David Sacks, David Friedberg, and Chamath Palihapitiya.

The discussion begins with a focus on Google's Gemini AI and its recent controversies, including accusations of bias in image generation. Sacks and Friedberg analyze the implications for Google's leadership and the potential need for significant organizational changes.

Next, the hosts shift to Reddit's S-1 filing, highlighting its revenue growth and challenges in monetization. They discuss the implications of Reddit's user growth and the effectiveness of its advertising strategy.

The conversation then turns to Apple's Project Titan, which has reportedly been scaled back or canceled. The hosts speculate on the reasons behind this decision and its impact on Apple's future direction.

Throughout the episode, the hosts engage in lively banter, sharing insights on the tech industry's evolving landscape and the challenges faced by major companies.

TL;DR

The episode discusses Google's AI issues, Reddit's S-1 filing, and Apple's Project Titan cancellation.

Video

00:00:00
Jason where are you that is that a
00:00:01
virtual background
00:00:03
or oh oh right that's place I I did look
00:00:07
architecturally familiar to me it is
00:00:09
architecturally significant we bleep out
00:00:10
the beep's house but yes I am you know
00:00:13
this is like top three or four of places
00:00:17
I like to be a house guest you know like
00:00:18
I'm in rotation right now my house guest
00:00:20
you're Kato Kaelin-ing through our friend group
00:00:23
basically you know I'm just a great
00:00:24
house guest people like to have
00:00:25
breakfast with me people like having me
00:00:27
around so I just find myself invited around
00:00:30
the world hey Nick do you have an
00:00:32
updated picture of Kato Kaelin is he still
00:00:34
alive and kicking he's alive for sure
00:00:36
he's got to be 70 or something right
00:00:38
he's gotta be he was on one of those
00:00:40
reality TV shows with Dr Drew I think a
00:00:42
couple years ago what does Kato Kaelin do
00:00:44
he's a good hang he's yeah there's a lot
00:00:46
of these people in La you know like just
00:00:48
they just
00:00:49
hang they kind of
00:00:51
hang they set things up oh my
00:00:54
God is that current Kato Kaelin man he's 64 now
00:01:00
wow God Almighty that's incredible how
00:01:02
did he survive he's got an Instagram
00:01:04
account I'll tell you how he survived
00:01:07
see no evil hear no evil exactly I
00:01:10
didn't see nothing all I know is Hey
00:01:13
listen man he gave you a pool house
00:01:14
snitches get stitches I didn't see
00:01:16
anything what did you see
00:01:19
nothing let your winners
00:01:22
ride Rain Man
00:01:26
David Sacks and it said we open sourced it to
00:01:29
the fans and they've just gone crazy
00:01:31
with it love
00:01:31
[Music]
00:01:34
you besties all right everybody welcome to
00:01:37
your favorite podcast the All-In podcast
00:01:39
episode 168 David Sacks can you believe
00:01:42
it we've made it to 168 episodes with me
00:01:46
again the Rain Man David Sacks how you
00:01:50
doing buddy good yeah I heard you got a
00:01:53
big talk coming up another speech yeah
00:01:57
yes a talk yes I'm giving a talk all
00:02:00
right get ready for that all of the GOP
00:02:02
fans out there all your Sacks fans you
00:02:04
got a big keynote coming from Sacks next
00:02:06
week also with me of course the Sultan
00:02:09
of science formerly known as the queen
00:02:11
of quinoa he's got another crop he's uh
00:02:15
growing David Friedberg how are you
00:02:18
doing how's the crops how's the fields
00:02:21
how's life in the fields was that the cold open
00:02:24
not a cold open these are intros oh intros
00:02:27
I'm glad you're not producing there's a
00:02:28
reason why I'm the executive producer go
00:02:30
ahead Freeberg how's your um how's your
00:02:32
crops how's the how's life in the fields
00:02:34
it's great got a great team making
00:02:37
progress Tech is awesome having a lot of
00:02:39
fun it's great back in the CEO seat
00:02:42
huh and of course the chairman dictator
00:02:45
who's becoming completely insufferable
00:02:47
because he invested in a company seven
00:02:48
years ago that is absolutely crushing
00:02:50
right now called Groq we talked about
00:02:52
it last week you're back you're back
00:02:54
Chamath everybody's talking about Groq
00:02:57
you're you're on uh cloud nine it seems Groq
00:03:02
Groq Bitcoin Bitcoin Bitcoin Groq
00:03:05
Groq I mean it is interesting how your
00:03:10
book just literally whatever Chamath's book
00:03:13
is that's his mood it's just Bitcoin 60,000
00:03:17
Groq well the honesty level is going to
00:03:20
be through the roof now oh right he's
00:03:24
going to be running for governor and
00:03:25
buying the Hamptons again the truth
00:03:27
bombs that Chamath is going to drop now
00:03:29
there is nothing like peak ZIRP Chamath I
00:03:32
mean are we going back is are we back
00:03:35
right now on paper what is your stake
00:03:37
in Groq already worth look we're not
00:03:39
talking we're not we're not we're not
00:03:40
talking that's so uncouth it's so uncouth but
00:03:42
he's buying the Hamptons and he's
00:03:44
probably gonna
00:03:45
buy stop when has it been uncouth on
00:03:48
this podcast to talk about about wins or
00:03:51
wins or
00:03:53
assets these are not I agree with that
00:03:55
it's not a win you can never count you
00:03:56
can never count your chickens before
00:03:58
they hatch that's absolutely right I am
00:03:59
not counting any chickens don't book The
00:04:01
win yes there's a lot of hard work to do
00:04:04
there's a lot more people to sell
00:04:07
products to to build products by the way
00:04:09
developers by the way on Groq just this
00:04:11
past
00:04:13
week in the queue the wait list
00:04:16
tripled now there's almost 10,000
00:04:19
developers that's a big deal well that's
00:04:22
crazy that's a great sign I think the
00:04:24
most important North Star metric for
00:04:25
these developer platforms is basically
00:04:27
that as goes the developers so goes the
00:04:30
platform you can kind of count on some
00:04:33
amount of pull through because some of
00:04:34
those developers just statistically will
00:04:35
land really important products they'll
00:04:37
consume more APIs they'll just consume
00:04:40
more stuff it's just all goodness
00:04:42
because if you have 9,000 or 10,000
00:04:44
people just again waiting in line once
00:04:48
those guys get in somebody's going to
00:04:50
create something
00:04:51
magical that's the great part of having
00:04:53
a platform people just take it and they
00:04:56
build on it while you're sleeping
00:04:58
something amazing can just get built and
00:05:01
you get some amount of the uh credit for
00:05:03
that the thing that's really interesting
00:05:05
is the number of developers is
00:05:06
increasing I've seen two or three
00:05:08
founders recently just the last couple
00:05:10
of weeks who were previously idea
00:05:12
Founders who have now taught
00:05:14
themselves to code so you know this
00:05:16
there's 1% of the you know whatever number of
00:05:19
millions of developers in the world I think
00:05:20
it's going to like double or triple so
00:05:21
the the pool of people who are writing
00:05:23
code I think is about to grow very
00:05:26
meaningfully I mean you guys you guys
00:05:28
saw the release from the White House so
00:05:29
they were like we we don't want you to
00:05:32
code in C and C++
00:05:34
anymore yeah that was very interesting I
00:05:36
mean they were talking about memory
00:05:37
leaks and like this is I guess the
00:05:39
source of most memory leaks therefore
00:05:40
the White House wants you to learn to
00:05:42
develop in memory safe languages oh
00:05:44
great awesome yeah so why are they
00:05:46
involving themselves like they got
00:05:48
nothing better to do yeah I mean they
00:05:51
have an opinion on uh you know
00:05:53
preventing security leaks all right
00:05:55
listen the top issue this week is the
00:05:57
same issue as last week Google's Gemini
00:05:59
AI DEI black eye continues we covered
00:06:03
this woke AI disaster last week was kind
00:06:06
of funny I was watching CNBC and they
00:06:08
had a hard time describing the problem
00:06:11
the woman just I guess didn't want to
00:06:12
call it what it is it's a racist AI you
00:06:15
type in text and it gives you the
00:06:18
opposite or just culturally insane
00:06:21
responses so if you put in you want a
00:06:23
picture of George Washington from
00:06:24
Google's Gemini or Sergey Brin you might
00:06:27
get back like a Benetton-style diversity
00:06:29
with like George Washington being black
00:06:31
or Sergey Brin being Asian etc. and so
00:06:36
this has caused um a bit of a kerfuffle
00:06:39
here in the industry to say the least
00:06:40
the stock is down 5% since we talked
00:06:42
about it last week and uh Sundar sent a
00:06:45
memo to the Gemini team of course when
00:06:48
they write these memos to a team it's
00:06:49
written to the entire world because you
00:06:51
know it's going to get leaked and so
00:06:52
you're writing it as such it might as
00:06:54
well be a press release I'll give uh two
00:06:56
quick quotes here and then I'll throw it
00:06:57
to the besties because there's so many
00:06:59
questions that we have to address quote
00:07:00
number one I know that some of its
00:07:03
responses referring to Gemini have
00:07:04
offended our users and shown bias to be
00:07:07
clear that's completely unacceptable and
00:07:09
we got it wrong and to be clear it
00:07:11
wouldn't show white people especially
00:07:13
ones like George Washington or just
00:07:15
somebody who is obviously a Caucasian
00:07:18
and so the next quote will be driving a
00:07:21
clear set of actions including
00:07:23
structural changes to me structural
00:07:25
changes means we're going to lay off a
00:07:28
bunch of people and we're going to get
00:07:29
rid of the DEI group that's that's a new
00:07:31
mantra that's a newfound motivational riff
00:07:34
quote in my mind Friedberg will Sundar
00:07:38
survive and is Google too broken to fix
00:07:42
I'm just going to ask you since you work
00:07:43
there I don't mean to make it
00:07:44
uncomfortable but what are the chances
00:07:46
Sundar survives this and what are the
00:07:49
chances that Google can be fixed and
00:07:51
produce great
00:07:53
products quickly that delight
00:07:57
users I don't know how to answer whether Sundar
00:08:00
will survive because it's kind of an
00:08:01
idiosyncratic organization there
00:08:04
are a couple of Founders who have super
00:08:07
voting shares um and it ultimately comes
00:08:11
down to their decision and the direction
00:08:14
they want to take the company and I have
00:08:15
no insights into what they individually
00:08:19
think so um frankly I've spoken to a lot
00:08:24
of folks who are investors in Google
00:08:25
over the last week and a lot of folks
00:08:27
are just deeply frustrated and angry um
00:08:31
uh on a number of fronts
00:08:34
the
00:08:36
business really there's you know three
00:08:38
businesses inside of Google there's you
00:08:40
know search and ads there's YouTube and
00:08:42
there's cloud and the rest of it is kind
00:08:43
of noise and to give you a sense of how
00:08:46
big these businesses are right YouTube
00:08:47
did 10 billion a quarter roughly Cloud
00:08:49
did 10 billion a quarter they have this
00:08:51
devices business and subscriptions
00:08:53
business does about 10 billion a quarter
00:08:56
and then search does about 50 billion a
00:08:57
quarter and the margin on search is much
00:08:59
higher than any of those other
00:09:01
businesses and so the search margin the
00:09:03
ad revenue on
00:09:04
search is you know probably 100% of the
00:09:08
true operating profit of the business so
00:09:10
the real threat to Google is more are
00:09:14
they in a position to maintain their
00:09:16
search Monopoly or maintain the chunk of
00:09:19
profits that that drive the business
00:09:22
under the threat of AI are they adapting
00:09:24
and less so about the anger around woke
00:09:26
and DEI because most of the investors I
00:09:28
spoke with aren't angry about the woke
00:09:30
DEI search engine they're angry about
00:09:32
the fact that such a blunder happened
00:09:34
and that it indicates that Google may
00:09:36
not be able to compete effectively and
00:09:38
isn't organized to compete effectively
00:09:39
in AI just from a consumer
00:09:41
competitiveness perspective so you know
00:09:44
investors are banging the table and in
00:09:46
the past we saw this with meta I think
00:09:50
it was towards the end of 22 if you guys
00:09:53
remember and it was a similar situation
00:09:55
investors were like why are you
00:09:56
investing in VR and AR this is crazy why
00:09:58
do you have all the these people that
00:10:00
are getting overpaid and everyone
00:10:02
started to write off the stock and the
00:10:03
stock took a big nose dive for a period
00:10:05
of time just like Google's is right now
00:10:07
and then the changes came and much like
00:10:09
Google there's an individual with super
00:10:11
voting shares who basically said you
00:10:13
know what I am going to step in and
00:10:15
we're going to make these changes and
00:10:16
we're going to fix this organization and
00:10:18
we're going to right-size and we're
00:10:19
going to focus on the product winning
00:10:21
and since then meta stock is up a
00:10:24
tremendous amount 5x since then Google
00:10:26
Shares are down about 10% over the past
00:10:28
two weeks by the way I was one of the
00:10:29
stupid people to sell meta around that
00:10:31
time and your thinking was that they
00:10:33
just can't get out of their own way and
00:10:35
the god king is it's it's another yeah
00:10:37
it's exactly it's another one of these
00:10:38
idiosyncratic problems you don't know
00:10:40
what this individual is thinking and
00:10:42
what he is individually going to do the
00:10:44
point I'm making is that at Google Now
00:10:46
something has to give because the noise
00:10:48
is so loud the board is hearing this
00:10:51
left and right investors are banging the
00:10:54
table analysts are banging the table and
00:10:56
I I'll tell you a couple anecdotes
00:10:58
internally employees are now banging the
00:10:59
table a story I a story I heard this
00:11:02
week was that someone stood up in a
00:11:04
meeting and said a couple of weeks ago
00:11:07
if I had stood up in this meeting and
00:11:09
said we can't show black people in the
00:11:13
image generation at the rate that we're
00:11:15
showing I would have been cast as a racist
00:11:18
and I didn't have permission to do that
00:11:20
inside of the organization but today and
00:11:22
everyone's like you're right you would
00:11:23
not have been able to stand up in the
00:11:24
organization and say that but the tenor
00:11:26
has changed inside of Google with a lot
00:11:28
of the employees that I've spoken with who
00:11:29
are now saying I can stand up and I can
00:11:33
say that this group called responsible
00:11:35
AI has too much power and it's a
00:11:38
one-sided asynchronous problem where
00:11:40
they get to come in and say we need to
00:11:42
change this and if you step up and
00:11:43
disagree with them you are deemed a
00:11:44
racist you were deemed you know
00:11:48
culturally inappropriate easier to keep
00:11:50
your head down it's easier to keep your head
00:11:52
down stay circumspect and and keep
00:11:53
collecting your RSUs all of a sudden you
00:11:55
wake up one day and you see the blunder
00:11:56
that happened last week yeah and so now
00:11:58
internally people are waking up and
00:12:00
saying we need to change this and I
00:12:01
heard that that's made its way up to the
00:12:02
higher ranks at Google and they're very
00:12:04
actively you know so there may be a
00:12:06
moment here where Google stock which
00:12:09
currently is trading at just 17 times
00:12:12
2025 consensus earnings which is cheaper
00:12:15
than all the other big tech companies by
00:12:17
far and it's still growing Core Business
00:12:19
is growing cloud is growing 20% search
00:12:21
is growing 15% all these other
00:12:23
businesses YouTube's growing 20% a year
00:12:26
it's a growth business that's very
00:12:28
profitable
00:12:30
and um it's trading at a very cheap
00:12:31
discount so there's you know the bull
00:12:33
case is now's a great time to buy
00:12:35
because it's so cheap and there could be
00:12:37
this moment where you know you see some
00:12:38
of the changes that are needed
00:12:39
internally to get the AI products to
00:12:42
where they need to be to maintain the
00:12:44
lead that is inherent because of C so
00:12:46
that would mean Sachs it would require
00:12:49
because we're using Zuckerberg as an
00:12:50
example we could also bring Twitter and
00:12:52
X into this it would require founder
00:12:54
authority to come in there and make
00:12:56
these changes it would require Larry and
00:12:58
Sergey who have the super voting shares to
00:13:00
come in there and say hey this is all
00:13:01
changing enough of this do you think
00:13:04
there's any chance of that happening or
00:13:07
is Google just too broken to fix and
00:13:10
they're going to just lose this
00:13:11
opportunity and it's Microsoft under
00:13:12
Steve Ballmer missing mobile and
00:13:15
Cloud well to quote Jefferson the tree
00:13:19
of Liberty must be refreshed from time
00:13:20
to time with the blood of patriots and
00:13:23
tyrants and I think there's something
00:13:24
analogous here I mean you have to if
00:13:27
you're going to refresh this company you
00:13:28
have to go in and you got to go in and
00:13:31
make major Cuts not just to rank and
00:13:32
file but to leadership who doesn't get
00:13:34
it and that's the only way it's going to
00:13:37
get fixed do I think that Larry and
00:13:40
Sergey are going to come in and pull an
00:13:41
Elon and go deep and figure out which
00:13:45
50% or 20% of the company's actually
00:13:48
good and doing their jobs probably not
00:13:51
but is it possible that they could make
00:13:54
a leadership change yeah it's possible
00:13:56
probable I don't know I mean I've heard
00:13:58
that
00:13:59
the company is the way it is because
00:14:00
they like the way it is I mean that
00:14:03
they're part of the problem in effect
00:14:04
they don't see they don't see the
00:14:06
problem or at least they haven't until
00:14:08
now maybe they'll get the wakeup call
00:14:10
what do you think Chamath watching all
00:14:12
this happen I think it's basically a
00:14:14
small Cadre of a thousand people that
00:14:17
have built literally the most singular
00:14:20
best business model and Monopoly ever to
00:14:23
be created on the internet and a whole
00:14:26
bunch of other people that have totally
00:14:29
transformed this organization as Sacks
00:14:31
said
00:14:33
into ability and a platform to reflect
00:14:37
their
00:14:37
views and so I'm not a shareholder
00:14:40
of Google and outside of the tools I use
00:14:44
I don't think I really have much voting
00:14:47
power so I don't and and I have so many
00:14:49
Alternatives now so I actually think
00:14:52
like the I don't really care that much I
00:14:55
guess is the point I think that the
00:14:56
employees should care and the
00:14:58
shareholders should care and they should
00:15:00
come together and vote and I think Sacks
00:15:02
is right I think the company is the way
00:15:04
it is because they've chosen to be that
00:15:07
way and I think freeberg is right which
00:15:09
is that there's a small group of people
00:15:11
who have been protecting and breathing
00:15:14
life into the single greatest business
00:15:16
ever built ever in the history of
00:15:20
business but now we need to have a
00:15:22
confrontation amongst all of these three
00:15:24
different groups of people and they need
00:15:25
to make some decisions let me put a few
00:15:27
data points
00:15:29
in play here J-Cal and this all this
00:15:32
all speaks to the problems being let's
00:15:35
say deliberate as opposed to a glitch or
00:15:37
an accident so first you have sundar's
00:15:40
letter to the company which there was a
00:15:44
very interesting tweet by Lulu Cheng
00:15:48
Meservey who does comms at Activision
00:15:51
she writes a Blog called Flack she said
00:15:54
quote she's a comms expert just for
00:15:56
the audience she's a comms expert yeah
00:15:57
so she kind of read Sundar's letter and
00:16:00
it was scorching she said the lack
00:16:03
of clarity and fundamental failure to
00:16:05
grasp the problem are due to a failure
00:16:08
of leadership a poorly written email is
00:16:09
just the means through which that
00:16:10
failure is
00:16:12
revealed so that was one
00:16:14
reaction Marc Andreessen had a series of
00:16:17
posts indicating that the AI was
00:16:20
programmed to be this way again it's not
00:16:22
like a bug it's more of a feature and
00:16:25
that it's not an accident this is this
00:16:28
is happening because because companies
00:16:31
wanted to be that they chose to be this
00:16:32
way and in fact he goes further and says
00:16:34
that these companies are lobbying as a
00:16:36
group with great intensity to establish
00:16:38
a government protected cartel to lock in
00:16:41
their shared agenda and corrupt products
00:16:43
for decades to come wow I mean again
00:16:45
that's a really scorching critique here
00:16:49
and then you know the the question I
00:16:51
would ask is who's been fired for this I
00:16:54
mean imagine if I don't know one of
00:16:56
elon's products had a launch that went
00:16:58
this badly do you think no heads would
00:17:02
roll no heads have rolled and so you
00:17:04
have to kind of Wonder well what if it's
00:17:06
a structural issue Sacks I mean I I I the
00:17:09
point does resonate with me that there
00:17:11
is a group internally and I think it's
00:17:12
called the responsible AI team at Google
00:17:15
and this team's job is to enforce those
00:17:18
principles remember that we brought up
00:17:19
last week that they've defined and so
00:17:21
they go in and they're like well you
00:17:23
know if you just render an image of a
00:17:25
software engineer and all the images are
00:17:27
just white guys in hoodies that's
00:17:29
inappropriate because there are plenty
00:17:30
of non-white people you need to
00:17:31
introduce diversity so then the
00:17:33
programmers say okay we'll go ahead and
00:17:35
overweight the model and make sure that
00:17:36
there's diversity and you can't say no
00:17:38
because otherwise you are deemed a
00:17:40
racist so who is the individual that's
00:17:42
responsible given that structural
00:17:44
circumstance that exists within the
00:17:45
organization it's more of a cultural and
00:17:47
structural problem to me than you know
00:17:50
one I guess ultimately there's
00:17:51
leadership that's lacking but actually I
00:17:53
I I agree with that to some degree let
00:17:55
me describe how I think it it works mhm
00:17:58
so what what I've heard about Google is
00:18:00
that every meeting above a certain size
00:18:02
has a DEI person in it I mean literally
00:18:05
so it's kind of like in the days of the
00:18:07
Soviet Union their military the Red Army
00:18:10
would have in every division or unit
00:18:13
there would be you know a commander or
00:18:15
lieutenant and there'd be a commissar
00:18:18
okay and the commander reported up the
00:18:20
chain and the commissar reported to the
00:18:22
party and the commissar would just
00:18:24
quietly take notes in all the meetings
00:18:26
of the unit and if the commissar didn't
00:18:28
like what the lieutenant was doing
00:18:29
Lieutenant would be taken out and shot
00:18:32
okay now that's kind of a dramatic
00:18:34
example but the point is that in every
00:18:36
large meeting at Google you've got this
00:18:39
DEI commissar who's like quietly taking
00:18:41
notes I'll point out that at Google the
00:18:44
only person we can ever remember to get
00:18:46
fired was James Damore who was an engineer
00:18:49
who complained about the political bias
00:18:51
at Google in other words he was a
00:18:54
whistleblower about the very problem
00:18:55
that's now manifested he's the only
00:18:57
person you can think think of to get
00:18:58
fired at Google Google was mocked on the
00:19:01
show Silicon Valley it was called Hooli
00:19:03
but remember this yeah they the the
00:19:06
nobody ever got fired because it's a
00:19:08
company where it's impossible to get
00:19:09
fired unless you blow the whistle on the
00:19:11
political problem and I think that if
00:19:13
you're sitting in those meetings with
00:19:16
a DEI commissar present and you know or
00:19:19
you have suspicions that the AI product
00:19:22
is not working right are you really
00:19:24
going to speak up and risk the fate of a
00:19:26
James Damore of course you're not
00:19:28
right so you're right freeberg that it's
00:19:30
a structural problem that there's
00:19:31
probably a low-grade fear that's
00:19:35
pervasive through the organization and
00:19:36
no one's willing to say the emperor
00:19:38
wears no clothes the woke emperor
00:19:40
wears no clothes because they don't want
00:19:41
to why stick your neck out you may not
00:19:43
know you're going to get fired but you
00:19:45
know you're taking a chance three years ago on
00:19:48
Twitter nobody would talk about certain
00:19:50
topics related to DEI because it was in
00:19:52
the aftermath of George Floyd and people
00:19:55
just did not feel comfortable even
00:19:56
calling out something minor that was
00:19:58
unfair no I would say they didn't feel
00:20:00
comfortable until Bill Ackman broke the
00:20:02
seal on this just uh like a month ago
00:20:04
where he really went after DEI I mean
00:20:07
because remember the part of the
00:20:09
reaction to Bill Ackman was like wow he's
00:20:11
really going there even though
00:20:12
everyone's saying that like basically
00:20:14
agreed with him and knew that he was
00:20:16
making a correct point but people were
00:20:18
still afraid to like again call out this
00:20:21
this woke
00:20:22
Emperor yeah and the way this works for
00:20:25
people who are not super aware you have
00:20:27
a language model and you know you write
00:20:29
a bunch of code but then there are guard
00:20:30
rails put in and then there are red
00:20:31
teams and people who test it and so at
00:20:34
the very least even if you were trying
00:20:36
to do something with great intent as you
00:20:37
pointed out Friedberg hey you know if we
00:20:39
pull up a doctor it doesn't shouldn't
00:20:41
necessarily always be a white guy there
00:20:43
are other people who are doctors in the
00:20:45
world somebody should have caught it in
00:20:47
testing they probably did Jason and they
00:20:50
didn't report it who wants to report
00:20:52
that problem that's an interesting rub
00:20:54
yeah how guts they didn't miss it they
00:20:57
didn't miss it they didn't have the guts to
00:21:00
you have to build a somebody built a
00:21:01
reward model or people a reward model
00:21:04
was built for the reinforcement learning
00:21:07
from all the humans and their feedback
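[Editor's note: the "reward model built from humans and their feedback" mentioned here is the RLHF idea. A minimal toy sketch of how pairwise rater preferences become an explicit reward signal (Bradley-Terry model) is below; all features, data, and function names are hypothetical illustrations, not Google's actual system.]

```python
import math
import random

# Toy RLHF-style reward model: learn a scalar "reward" for responses
# from pairwise human preferences, using the Bradley-Terry model.

def features(response):
    # Hypothetical hand-crafted features of a response.
    return [len(response) / 100.0, float(response.count("diverse"))]

def reward(weights, response):
    # Reward is a linear function of the response's features.
    return sum(w * f for w, f in zip(weights, features(response)))

def train(pairs, steps=2000, lr=0.5):
    # pairs: list of (preferred, rejected) response strings from raters.
    w = [0.0, 0.0]
    for _ in range(steps):
        preferred, rejected = random.choice(pairs)
        # P(preferred beats rejected) under Bradley-Terry.
        margin = reward(w, preferred) - reward(w, rejected)
        p = 1.0 / (1.0 + math.exp(-margin))
        grad_scale = 1.0 - p  # push the margin up while the model is unsure
        fp, fr = features(preferred), features(rejected)
        w = [wi + lr * grad_scale * (a - b) for wi, a, b in zip(w, fp, fr)]
    return w

if __name__ == "__main__":
    random.seed(0)
    # Raters consistently prefer responses mentioning "diverse":
    data = [("a diverse diverse answer", "a plain answer"),
            ("diverse take", "short take")]
    w = train(data)
    print("learned weights:", w)
```

The point the hosts are making falls out directly: whatever the raters reward, the model explicitly learns to maximize, so the resulting behavior is a chosen objective rather than an accident.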
00:21:09
so these decisions were explicit the
00:21:11
question that is that you guys are
00:21:13
framing
00:21:14
is was this though a case where it was
00:21:17
explicitly imposed on people and people
00:21:20
felt a fear of pushing back or did they
00:21:24
just agree and say this is a great
00:21:26
decision and these rewards make a ton of
00:21:28
sense and the whole point is that I
00:21:30
think what this is highlighted is that's
00:21:33
the truth that Google needs to figure
00:21:35
out and they need to figure it out
00:21:37
quickly because what is going to happen
00:21:39
now is it's we I know I think we talked
00:21:41
about this a year ago all Google needs
00:21:43
to see is 300 to 500 basis points of change
00:21:49
and the market cap of this company is
00:21:50
going to get cut in half okay because
00:21:53
there is only one way to go when you
00:21:54
have 92% share of a market and that is
00:21:57
down and so the setup for the stock is
00:22:00
now that people are looking at this
00:22:02
saying okay if I see 92% go to 91 or 90
00:22:07
or 89 that's all that has to happen and
00:22:10
people will say the trend is to 50% and
00:22:13
you will price this company at a
00:22:15
fraction of what it's worth today so I
00:22:17
think it's really critical this is the
00:22:19
moment that the senior leadership that
00:22:21
really understands business can separate
00:22:24
it from politics and decide is this a
00:22:26
thing of fear where there's one rogue
00:22:29
group that's run amok or is this what
00:22:32
we believe because if it's the latter
00:22:34
you just got to go with it there's
00:22:35
nothing you can do because you're not
00:22:36
going to replace 300,000 employees or
00:22:38
however many Google has but if it's the
00:22:40
former sax is right you're going to have
00:22:42
to some heads need to roll and they need
00:22:44
to tell the
00:22:45
marketplace that this was a mistake
00:22:47
where a group of rogue employees got
00:22:49
way too much power we were asleep at
00:22:52
the job but now we're awake they're
00:22:54
fired and this is so they have two
00:22:56
choices but in in all of these choices
00:22:59
what I'm telling you on the the
00:23:01
dispassionate Market side is if you see
00:23:04
perplexity or anybody else clip off 50
00:23:07
basis points or 100 basis points of
00:23:09
share and search this thing is going
00:23:12
straight down by 50% by the way unless
00:23:14
Cloud takes off right because the other
00:23:16
hedge that Google has is
00:23:18
GCP and the tooling that they've built
00:23:20
in GCP can enable and support and be
00:23:24
integral to a lot of Alternatives and
00:23:27
competitors to what ultimately might be
00:23:28
searched so I think there's also a play
00:23:30
here in thinking about some of the
00:23:32
hedges that Google has implicitly built
00:23:34
into the business I'll just say one more
00:23:36
thing on this structural point I kind of
00:23:39
thought about this as when I started
00:23:41
speaking to people internally about how
00:23:42
this happened from a product perspective
00:23:44
it felt a lot like when you talk to a
00:23:47
lawyer as you guys know you're making a
00:23:48
business decision and there's some risk
00:23:50
I mean think about Travis building Uber
00:23:53
and the lawyer will say it is illegal
00:23:55
you need a medallion license to do what
00:23:57
you're doing in this City and Travis is
00:23:59
like well you know what I'm going to
00:24:00
take that risk Brian chesky and Airbnb
00:24:02
he said you know what I'm going to take
00:24:03
that risk and the problem is that like a
00:24:05
lawyer's job is to tell you what you
00:24:07
can't do to identify all the Peril of
00:24:10
your of Your Action and then you as an
00:24:13
executive or a business leader or a
00:24:15
manager your job is supposed to be to
00:24:18
take that as one piece of input one
00:24:20
piece of data that you then make the
00:24:22
informed business decision about what's
00:24:24
the right way to build this product
00:24:25
what's the right way to build this
00:24:26
company and I'll take on some risk and I
00:24:28
think one of the challenges structurally
00:24:30
inside of Google is that product leaders
00:24:32
and other folks were never enabled to
00:24:35
make the decision that there were these
00:24:37
as Sacks pointed out kind of like
00:24:39
policing type organizations that were
00:24:41
allowed to come in and veto things or
00:24:43
make unilateral decisions
00:24:45
and those vetoes whether it's
00:24:47
weighting or some you know training
00:24:49
data set or output that ends up killing
00:24:52
the opportunity for the smart business
00:24:54
leader to say that doesn't make sense we
00:24:55
can't do this is where this is where
00:24:57
having founders in the organization
00:24:58
every day changes everything because the
00:25:01
founders would say hey we we're here to
00:25:03
index the world's information we're here
00:25:04
to present the world's information we're
00:25:05
not here to interpret it we're not here
00:25:07
to win hearts and Minds we're not here
00:25:09
for a political agenda but there's a
00:25:11
group of people there who it's apparent
00:25:13
think that they're there to change the
00:25:16
world that Google is a vehicle for them
00:25:18
to make social changes in the world and
00:25:21
you know that's art and there's other
00:25:23
ways to do that and just paradoxically
00:25:25
like Hamilton making the founding
00:25:27
fathers diverse and doing art you know
00:25:30
that can win Tonys in that context you
00:25:32
could win tons of awards and acclaim and
00:25:35
and make incredible beautiful art but on
00:25:37
the other side like people are not
00:25:39
looking for Google to do that they're
00:25:41
looking for Google to give them the
00:25:42
answer and the data and then these
00:25:44
people are thinking it's our job to
00:25:45
actually interpret the data for you as
00:25:48
we kind of touched on last week so uh
00:25:51
this I think look I think you guys are
00:25:53
really close to the bullseye here but I
00:25:55
would just refine slightly what you're
00:25:57
saying
00:25:59
so here's how I think it happens I I
00:26:01
don't think this is a rogue group I
00:26:02
actually think this is a highly
00:26:03
empowered group within Google I do agree
00:26:06
with you yeah yeah I think what happens
00:26:08
is Sundar says from the top that we're
00:26:12
going to be on the Forefront of
00:26:13
diversity and inclusion because he
00:26:15
personally believes that and that's the
00:26:17
way the social winds are blowing and
00:26:20
they think it's good for the company on
00:26:21
some level okay to implement that Mantra
00:26:25
or that platitude really HR hires a
00:26:28
bunch of DEI experts okay lots of them I
00:26:30
think the company has like this huge HR
00:26:33
department and a lot of those people are
00:26:36
basically fanatics I mean it's their
00:26:38
chosen career yeah yeah they're trained
00:26:40
Marxists basically and so they're the
00:26:42
commissars and they're sitting in all
00:26:44
these meetings and again they're the
00:26:46
ones taking notes and they're the ones
00:26:48
who push the company in a certain
00:26:49
direction but you have to then go back
00:26:51
to senior leadership and say it's their
00:26:53
fault for letting this happen because
00:26:55
they should have made a course
00:26:56
correction they should have realized
00:26:58
what was happening they should have
00:26:59
realized that through a combination
00:27:03
of bias and through this sort of like
00:27:07
overly
00:27:08
empowered HR team who are pulling a
00:27:11
legal card I mean Friedberg is right
00:27:12
about that they're saying that our point
00:27:14
of view is the law which isn't true but
00:27:16
they're basically pulling the legal card
00:27:18
and pushing the whole organization a
00:27:20
certain direction it was up to
00:27:21
leadership to realize what was going on
00:27:24
and make a correction and the the thing
00:27:28
that you're seeing now is that in the
00:27:30
face of what's happened the statement
00:27:33
that we got really doesn't cut it funny
00:27:35
they use the word problematic yeah
00:27:37
exactly they're describing it as a
00:27:39
glitch or a bug it's not it's a much
00:27:40
deeper problem and so therefore you give
00:27:43
it gives no confidence that the solution
00:27:47
is going to be pursued in as
00:27:49
comprehensive a manner as necessary yeah
00:27:51
I think it's just a that that memo is a
00:27:53
tip off that there is a 20,000 30,000
00:27:55
person RIF and a reorg and
00:27:58
they're just going to keep cutting the
00:27:59
DEI group and they've already Meta and
00:28:01
Google have cut the DEI groups a bit I
00:28:03
don't know if it's about as much about
00:28:05
cutting as much as it is about
00:28:08
empowering if you said to the product
00:28:10
leaders you can make the decision DEI
00:28:14
and responsible AI and whatever other
00:28:16
groups they're going to inform you on
00:28:18
their point of view but they're not
00:28:20
going to tell you what your point is that
00:28:22
DEI org is pulling the legal card they're
00:28:24
saying this is the law they're saying
00:28:25
that say that's what needs to change
00:28:27
right if you don't do things the way we
00:28:29
tell you Google's going to be hit with a
00:28:31
civil rights lawsuit that's what they're
00:28:32
saying oh well I mean let's see if
00:28:35
leadership can overcome that threat but
00:28:37
I think that's exactly the threat that
00:28:39
is keeping the organization from
00:28:41
resolving this problem or could keep the
00:28:42
organization from that but I also wouldn't
00:28:44
underestimate the bias so you know
00:28:47
everybody there not I shouldn't say
00:28:48
everybody but it's a very liberal
00:28:51
culture right it's a monoculture so when
00:28:53
you're swimming in that much bias it's
00:28:54
hard to see right yeah yeah when
00:28:57
everybody is left of center I mean just
00:28:59
look at something like political
00:29:00
contributions right it's 90-something
00:29:02
percent are Democratic I mean like high 90s
00:29:06
so it's just a a liberal bubble
00:29:09
basically and so right when everybody's
00:29:11
liberal it's very hard to see when you
00:29:14
know the results are way off-center as
00:29:17
well also if you if you give people this
00:29:19
job like how are they what are they
00:29:21
going to do every day Friedberg if
00:29:22
they've been given the job Dei and
00:29:24
they've been given the job to do trust
00:29:25
and safety they they're looking to fill
00:29:26
their time and make an impact you got to
00:29:28
I think maybe have less of these people
00:29:30
I honestly think you guys saw you know
00:29:33
Shaun Maguire is a partner at Sequoia and he
00:29:34
used to be a member of the team at
00:29:37
GV when it was called
00:29:39
Google Ventures and he did a Twitter
00:29:40
post did you guys see this yeah where he
00:29:42
said when he was at Google he was told
00:29:45
by his manager I'm not really supposed
00:29:46
to tell you this it could get me fired
00:29:48
but you're one of the highest performing
00:29:49
people here but I can't promote you
00:29:51
right now because I have a quota my
00:29:53
hands are tied you'll get the next slot
00:29:55
please be patient I'm really sorry It
00:29:57
ultimately led Shaun to leave Google and
00:30:00
uh you know the rest is history he's a
00:30:02
partner at Sequoia now you're saying
00:30:05
because he's a white male because he's a
00:30:07
white male yeah and so the you know
00:30:11
yeah no I had the same thing happen to
00:30:12
me at AOL when I was at AOL and Chamath
00:30:14
was there at the same time the whole
00:30:16
organization was white men from Virginia
00:30:20
whatever and they gave me an SVP title
00:30:24
and I said well I want the EVP one and
00:30:25
they're like you know what we can't have
00:30:27
any more white males in that position
00:30:29
right now we have to like get some more
00:30:31
women and people of color in that
00:30:33
position and they told me you'll have
00:30:34
the same comp and bonus but we just
00:30:36
can't give you the title Chamath do you
00:30:38
think these
00:30:40
like race and gender driven
00:30:44
quotas make sense for HR departments to
00:30:48
try and enforce upon
00:30:51
managers to um to increase diversity I
00:30:54
mean is that a good objective for an
00:30:55
organization to have this is like a
00:30:57
topic a lot of people I've heard kind of
00:30:58
flip back and forth on I there is no
00:31:01
company where I have majority control
00:31:03
where I have an HR department you don't
00:31:04
have an HR department no say more why I
00:31:07
think that it's very important you
00:31:09
should go to a very respected lawyer at
00:31:11
a thirdparty firm someone very visible
00:31:15
an Eric Holder type person and you should
00:31:18
work with that law firm and retain them
00:31:20
so that you have an escape valve if
00:31:23
there
00:31:24
are any kind of serious issues that need
00:31:27
to be escalated so that you can get them
00:31:29
into the hands of a dispassionate third
00:31:32
party person who can then appropriately
00:31:35
inform the board the CEO and investigate
00:31:39
so that that covers sort of all the bad
00:31:41
things that can happen then there are
00:31:43
buckets of I think good things one one
00:31:46
important set of things is around
00:31:48
benefits my perspective is that the team
00:31:52
should build their own benefits package
00:31:54
that they want they should understand
00:31:56
the P&L of the company that they work
00:31:58
for they should be given a budget and in
00:32:01
in my companies again what I do is I
00:32:03
allow committees to form and I ask those
00:32:05
committees to be diverse but what I mean
00:32:09
by diverse is I want somebody who has a
00:32:12
sick partner I want somebody who has a
00:32:16
family I want somebody who's young and
00:32:18
single so that the diversity of benefits
00:32:20
reflects what all these people need they
00:32:22
go and talk to folks they come back and
00:32:24
they choose on behalf of the whole
00:32:26
company and there's a voting mechanism
00:32:28
then when it comes to hiring I think
00:32:30
what has to happen is that the person
00:32:32
for whom that new hire will end up
00:32:34
working it's the head of engineering
00:32:37
it's the head of sales those are the
00:32:39
people that should be running the hiring
00:32:42
processes I don't like to Outsource it
00:32:44
to recruiters I don't like to Outsource
00:32:46
it to HR so when you strip all of these
00:32:48
jobs away HR doesn't have a role and
00:32:52
what is left over is the very dark part
00:32:54
of HR in most organizations which is the
00:32:56
police person the policeman right
00:32:58
what does Sacks call them the commissars
00:33:01
that is why everybody hates HR I've
00:33:03
never met a company where that is a
00:33:05
successful role over long periods of
00:33:08
time they are this conflict creating
00:33:11
entity inside of an organization that
00:33:13
slows organizations down
00:33:15
so that allows me to empower individuals
00:33:19
to actually design the benefits that
00:33:20
they want to hire the team that they
00:33:22
want and I let them understand the P&L
00:33:25
in a very clear transparent way and the
00:33:27
results are what the results are you
00:33:28
want more bonuses hire better people and
00:33:31
then what I do is at the end of every
00:33:32
year I talk about this distribution of
00:33:35
talent and I make sure we are
00:33:37
identifying the bottom five or 10% they
00:33:40
need to be managed up or they must be
00:33:44
fired every year and it does not matter
00:33:47
how big the company is you must manage
00:33:50
up or out the bottom 5 to 10% and in
00:33:53
some cases I'm talking about one person
00:33:54
because it's a small company and in
00:33:56
other cases I'm talking about 30 or 40
00:33:58
people so just hire the best person for
00:34:00
the job no you eliminate HR yeah you
00:34:03
empower the team make the make allow
00:34:05
them to make their own
00:34:07
decisions measure it hold them
00:34:09
accountable right so if you have a
00:34:12
salesperson who just hires their I don't
00:34:15
know 15 sorority sisters or fraternity
00:34:17
Brothers whatever it is and it just
00:34:18
becomes not a diverse group well hold on
00:34:22
it is what it is that still may be
00:34:23
diverse this is my point like the the
00:34:25
thing is that like there's different
00:34:27
ways to sell there's different sales
00:34:29
motions there's the sort of like
00:34:31
elephant hunting kind of sales model
00:34:33
there's the Dialing for Dollars thing so
00:34:35
even those 15 sorority sisters the way
00:34:37
you describe it could actually be
00:34:38
diverse my point is that kind of
00:34:41
superficial marking based on the immutable
00:34:43
traits will not yield a great
00:34:45
organization instead it's you're
00:34:46
empowered to hire whomever you want just
00:34:49
know that at the end of the year we're
00:34:50
going to measure them your bonus their
00:34:52
bonus the company's performance all
00:34:54
performance yeah performance I mean
00:34:56
Frank Slootman said this and he he got
00:34:58
barbecued at some point he said I don't
00:35:00
have time to do this diversity stuff the
00:35:03
best person by the way in in my
00:35:04
companies like for example like when you
00:35:05
do SOC 2 compliance you have to
00:35:07
generate these reports okay especially
00:35:10
for some of our customers some of our
00:35:11
companies that actually show the
00:35:13
diversity of our team and when we
00:35:16
measure them on the immutable traits
00:35:19
that whatever they represent as their
00:35:21
gender whatever they represent as other
00:35:24
dimensions we are incredibly incredibly
00:35:27
diverse
00:35:28
anyways by the way I've never seen a startup
00:35:31
that isn't diverse you know Silicon
00:35:33
Valley attracts such a diverse mix of
00:35:37
people you know what do we know about
00:35:39
success factors for startups number one
00:35:42
the ones that are successful have a
00:35:43
cultural meritocracy number two they're
00:35:46
non-bureaucratic and they don't have too
00:35:48
much G&A basically overhead in the
00:35:52
company now all those things are
00:35:56
contradicted by having a large HR team
00:35:59
or especially a Dei organization right
00:36:02
they add bureaucracy they add overhead
00:36:05
and what's that PIPs performance
00:36:09
improvement they cut into well PIPs are
00:36:11
great I PIP people all the time does it
00:36:13
work I mean I I know some people are
00:36:15
just like these Pips don't work if
00:36:16
you're an underperforming member of the
00:36:17
team and you've been identified in the
00:36:19
bottom five or 10 percent we have a
00:36:21
responsibility as management to coach
00:36:23
you up or to get you to an organization
00:36:26
where you're not in the bottom 5 or 10%
00:36:28
that's the right thing to do for people
00:36:30
you do that by being very transparent
00:36:32
and writing it down you are not good at
00:36:34
these things you're underperforming in
00:36:36
these things fix them or you will not be
00:36:38
here that's a very fair thing to tell
00:36:41
somebody yeah look how you want to
00:36:42
implement your own meritocracy I think
00:36:44
there's different ways for Founders to
00:36:46
do that the point is you want these
00:36:47
companies to be a meritocracy you don't
00:36:50
want advancement in the company to be
00:36:52
based on factors other than skill Merit
00:36:55
hard work things like that we know
00:36:57
performance we know that's a bad bad
00:36:59
path to go on I think a lot of Founders
00:37:01
don't understand that Dei is not
00:37:04
something they have to do they don't
00:37:05
have to have a Dei organization this has
00:37:08
somehow become a thing it's it's not
00:37:11
required and I think people are
00:37:13
realizing like why would you do that why
00:37:15
would you create this large bureaucracy
00:37:16
in the company that undercuts the
00:37:19
meritocracy that adds a lot of costs and
00:37:22
that slows you down none of those things
00:37:24
will help your company what you need to
00:37:25
have I think J laid out some really good
00:37:28
best practices you should have an
00:37:30
outside Law Firm that I would say is you
00:37:33
could call it HR law but I would just
00:37:35
call it employment law I would say like
00:37:37
a non-ideological
00:37:38
partner an expert in employment law
00:37:41
who sets up your company correctly and
00:37:43
to whom you can take a complaint if if
00:37:46
an HR complaint gets raised through the
00:37:48
chain of your company you do have to
00:37:49
take it very seriously and there has to be a
00:37:51
proper investigation and that's probably
00:37:53
best handled by an outside lawyer so get
00:37:56
that outside lawyer and then by the way
00:37:58
always there's never a case where your
00:38:00
coworker should know the intimate
00:38:02
details of any of those and that's what
00:38:04
also creates this horribly rotten
00:38:07
culture in HR where these people act
00:38:09
like Gatekeepers of secret information
00:38:12
salary information bonus information and
00:38:15
then all of the other things oh you know
00:38:17
did you know this person did this with
00:38:20
it's terrible to have that inside of a
00:38:21
company it should go to a dispassionate
00:38:23
third party person whose job it is to
00:38:25
maintain confidentiality and discretion
00:38:27
yeah while investigating the truth yeah
00:38:30
when HR is too powerful in a company
00:38:32
that's a red flag at the end of the day
00:38:33
HR should be an administrative function
00:38:35
their job should be to get people on
00:38:38
boarded sign their offer letter and
00:38:40
their confidentiality and uh invention
00:38:43
assignment agreement and set them up
00:38:45
in payroll and get them benefits and
00:38:47
that kind of stuff it should
00:38:48
fundamentally be an administrative
00:38:50
function and if it starts getting more
00:38:53
powerful than that it means there's been
00:38:55
a usurpation you don't even need people
00:38:57
for that like you have software that
00:38:59
does that now and it's all automated you
00:39:01
know you send them a link you go to your
00:39:02
favorite you know HR site boom it's done
00:39:05
yeah I don't think you need a lot of
00:39:06
people doing this anything you want to
00:39:07
add to this discussion Friedberg as we as
00:39:09
we move on to the next topic I think
00:39:10
that unfortunately the term
00:39:13
diversity Equity inclusion has been
00:39:16
captured along as Chamath points out a
00:39:19
single Vector which is this immutable
00:39:21
trait of your racial Identity or gender
00:39:25
and I think the more important aspect
00:39:28
for the success of a team for the
00:39:29
success of an organization is to find
00:39:32
diversity in the people that comes from
00:39:34
different backgrounds different
00:39:36
experiences different ways of
00:39:38
thinking and so I I'm not a huge fan of
00:39:42
race-based
00:39:44
metrics or gender based metrics driving
00:39:46
I'm
00:39:47
generally more oriented around being blind
00:39:50
to those those those variables and focus
00:39:53
much more on the variables that can
00:39:55
actually influence the outcome of your
00:39:56
organization yeah one of the great
00:39:59
paradoxes of this as well is we are
00:40:01
moving to a much more Multicultural
00:40:04
mixed race Society anyway people filling
00:40:06
out forms a lot of our kids you know
00:40:09
could pick two or three of the
00:40:10
different boxes on a Dei form it's it's
00:40:12
not going to make much of a difference
00:40:14
in the coming decades all right issue
00:40:16
two Google goes splashy cash AI licensing
00:40:18
deals for training data and it's now
00:40:21
becoming a bit of a pattern Google we
00:40:23
talked about I think just last week did
00:40:26
a deal with Reddit for $60 million
00:40:29
that's reportedly per year today stack
00:40:33
Overflow is now licensing its OverflowAPI
00:40:36
to train Gemini no word on the contract
00:40:38
value I did get some back channel that
00:40:41
it's a multi-year non-exclusive deal
00:40:43
according to Reddit's S1 they have already
00:40:45
closed 200 million worth of AI licensing
00:40:47
deals over the next two to three years
00:40:49
so you maybe it's going to be 75 million
00:40:51
a year 100 million a year who knows how
00:40:53
big that business can get we're going to
00:40:54
talk about the S1 from Reddit in just a
00:40:58
moment and uh this is on top of all the
00:41:00
other licensing deals that have occurred
00:41:02
Axel Springer and OpenAI you remember
00:41:03
that one and OpenAI is in talks
00:41:06
reportedly with CNN Fox and Time to
00:41:09
license their content that comes on the
00:41:11
heels of that blockbuster New York Times
00:41:13
open AI lawsuit that we talked about I
00:41:15
don't know 10 episodes ago and open AI
00:41:17
lawyers are trying to get that one just
00:41:18
to give you a little update on it
00:41:19
they're trying to get that case
00:41:20
dismissed saying that the New York Times
00:41:22
hacked ChatGPT to get certain results
00:41:25
and that the New York Times took tens of
00:41:27
thousands of tries to generate the
00:41:29
results yada yada and they said here's
00:41:31
the quote from the filing from open AI
00:41:33
the allegations in the times complaint
00:41:35
do not meet its famously rigorous
00:41:38
journalistic standards both OpenAI and
00:41:40
Google and Gemini have been
00:41:42
guardrailing their systems as we talked about
00:41:44
as well to stop copyright infringement
00:41:47
like trying to make pictures of Darth
00:41:48
Vader and that kind of stuff Chamath you've
00:41:50
talked a little bit about your TAC 2.0
00:41:52
framework maybe you could talk about
00:41:54
what you see Happening Here with all
00:41:55
these licensing deals and what it means
00:41:58
for startups in the AI
00:42:01
space well just to maybe catch everybody
00:42:03
up TAC is this thing called traffic
00:42:06
acquisition cost and you can see it most
00:42:09
importantly in
00:42:11
Google's quarterly releases which is
00:42:13
that what they realized very early on at
00:42:16
the beginning of the search wars in the
00:42:19
early 2000s is that they could pay
00:42:21
people to offer Google search people
00:42:25
would use it and then it would generate
00:42:26
so much money that they could give them
00:42:28
a huge rev share and it would still make
00:42:30
money so I remember Jason when you and I
00:42:32
were at AOL this is the first time I met
00:42:34
Omid we were flying back to California
00:42:37
we were both in Dallas at the same time
00:42:39
in like
00:42:40
2003 or four and that's when Omid did
00:42:44
the first big search deal between AOL
00:42:46
and Google and it was it was I want to
00:42:48
say hundreds of millions of dollars back
00:42:50
then where Google pays you up front you
00:42:53
have to Syndicate Google search and then
00:42:55
they clean it up on the back end with
00:42:57
some kind of rev share so what's incredible
00:43:00
is that that process has escalated to a
00:43:03
point now where for example on the
00:43:05
iPhone it's somewhere between 18 and 20
00:43:08
odd billion dollars a year is what now
00:43:10
Google pays Apple so that's the traffic
00:43:13
acquisition cost 1.0 or TAC
00:43:16
1.0 and I just said that we should call
00:43:19
this TAC 2.0 except now what Google is
00:43:22
doing is instead of paying for search
00:43:24
they're actually paying for your data
00:43:26
and saying give it to me so that I can
00:43:27
train my models and make it better and I
00:43:29
think that that's an incredible thing
00:43:31
both it's very smart for Google but also
00:43:34
it's great for these businesses because
00:43:36
it's an extremely high
00:43:38
margin thing to do when you have a
00:43:41
really good Corpus of of data that is
00:43:43
very unique so in the case of Reddit
00:43:45
that $60 million deal I didn't I I
00:43:47
looked through the S1 to try to figure
00:43:49
out whether it was a multi-year deal or
00:43:50
not it wasn't totally clear but the
00:43:54
point is that you know Google's paying
00:43:56
Reddit 60 million bucks and Jason you
00:43:57
just said that they're they've done a
00:43:59
couple more of these things that's
00:44:01
incredible this TAC 2.0 thing is amazing
00:44:04
so if you're an entrepreneur building a
00:44:06
website or building an app that has
00:44:08
really unique training data or really
00:44:10
unique data you'll be able to license
00:44:12
and sell that and that'll be an
00:44:14
incremental Revenue stream to everything
00:44:15
you do in the near future that's amazing
00:44:17
that's what TAC 2.0 is it's going to be
00:44:19
incredible for the entire uh content and
00:44:22
community-based industries you think
00:44:23
this could sustain content creation
00:44:25
where advertising has become very
00:44:27
difficult Sacks I guess my question to
00:44:29
Chamath would be do you think this is going
00:44:31
to be available to small websites
00:44:34
they'll somehow be some sort of program
00:44:36
or because I mean Reddit is one of the
00:44:37
biggest sources of content on the entire
00:44:39
web right it's like a top five traffic
00:44:41
site with tons and tons of user
00:44:44
generated content yes do you think like
00:44:46
a small publication would be able to
00:44:48
make these types of deals yeah and in
00:44:50
fact I think like if you go back to
00:44:51
search 1.0 that's exactly what these
00:44:53
small companies were able to do which
00:44:55
was in a more automated way they were
00:44:57
able to basically partner and in that
00:44:59
example what they would say is here
00:45:01
Google why don't you just run your ads
00:45:03
on our page right and that was sort of
00:45:05
in that web 1.0 world so Google had
00:45:07
solutions for the largest companies on
00:45:10
the internet all the way to the smallest
00:45:12
and in this TAC 2.0 world I do think
00:45:15
that that it works in that way as well
00:45:17
the problem in the in that world if it's
00:45:19
a small website that says here's my
00:45:21
training data the question is how do you
00:45:24
attribute how much incremental value a
00:45:27
model derived from it versus something
00:45:29
else and so I think that that part has
00:45:32
to get figured
00:45:33
out and so you know what Google will be
00:45:36
able to pay you will probably be pretty
00:45:38
de
00:45:41
minimis if the bound is 60 million for Reddit
00:45:44
then the average website's going to get
00:45:45
a few hundred
00:45:46
bucks but that still may be a good start
00:45:49
and when Google figures out how to
00:45:50
monetize stuff or somebody else where
00:45:53
then they can give you back some way to
00:45:56
make money I think I think that there's
00:45:57
a
00:45:58
real monetization here I really do you
00:46:02
took the other side of
00:46:04
this but now that you see the
00:46:06
market-based solution starting
00:46:08
to emerge what do you think of this
00:46:10
market-based solution think it's got
00:46:12
legs well I'm not convinced that this
00:46:16
looks like Tac in the traditional sense
00:46:20
where you're basically buying a
00:46:21
continuous stream of traffic and then
00:46:23
you're helping to monetize that traffic
00:46:25
that's effectively what Google did in
00:46:27
the ad syndication business and does
00:46:28
today that business makes about 10
00:46:30
billion a quarter at
00:46:32
Google and they're paying call it 70 to
00:46:36
80 cents on every dollar back out to the
00:46:38
owners of that traffic the folks where
00:46:39
that traffic is derived
00:46:41
from I would say that this looks a lot
00:46:44
more like the content licensing deals to
00:46:48
build a proprietary audience which is
00:46:51
effectively what
00:46:52
Netflix did they they paid Studios for
00:46:56
Content Apple does this they have
00:46:59
proprietary content that they pay
00:47:01
producers to make and they put on Apple
00:47:03
TV Amazon does this and so on this is a
00:47:06
lot more like that where there are
00:47:08
content creators out there whether that
00:47:10
content is proprietary like the New York
00:47:12
Times or user generated like Reddit and
00:47:14
what they're trying to do is acquire
00:47:15
that content to build a better product
00:47:17
on Google Search and I'm not sure how
00:47:20
you get paid a continuous licensing
00:47:22
stream for that content once you've
00:47:24
trained the model the content gets old
00:47:26
it gets stale at some point in a lot of
00:47:28
cases like news and then
00:47:30
eventually if you don't have a
00:47:32
high-quality continuous stream of content
00:47:35
it's not worth as much anymore to give
00:47:37
you guys a sense humans generate in
00:47:39
total well let me just give you some
00:47:41
stats there's about a million petabytes
00:47:42
of data on the internet today and humans
00:47:46
are generating about 2500 petabytes of
00:47:49
data new data uh per day right now
00:47:52
remember I shared a couple weeks ago
00:47:53
YouTube's generating about two petabytes
00:47:54
of data uh per
00:47:57
day half of all data generated is never
00:48:00
used so this is like records and files
00:48:03
and stuff that gets put on log files log
00:48:05
files gets stored somewhere never
00:48:07
accessed never used the majority of the
00:48:09
rest of that data is not in the public
00:48:11
domain it's not on the internet so there
00:48:13
is a lot of data out there what some
00:48:16
people might call dark data to train on
00:48:19
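As a quick sanity check on the figures quoted above, the sketch below runs the conversation's round numbers (about a million petabytes on the internet, about 2,500 petabytes of new data per day, roughly half never used); all inputs are the hosts' back-of-envelope figures, not measured values:

```python
# Back-of-envelope check of the data-volume figures quoted in the discussion.
# All inputs are the round numbers from the conversation, not measurements.
total_internet_pb = 1_000_000   # ~1 million petabytes on the internet today
new_pb_per_day = 2_500          # ~2,500 PB of new data generated per day
never_used_fraction = 0.5       # roughly half is written once and never read

usable_new_pb_per_day = new_pb_per_day * (1 - never_used_fraction)  # 1,250 PB
days_to_double = total_internet_pb / new_pb_per_day                 # 400 days

print(f"Usable new data per day: ~{usable_new_pb_per_day:,.0f} PB")
print(f"At this rate the total stock doubles in ~{days_to_double:,.0f} days "
      f"(~{days_to_double / 365:.1f} years)")
```

At these rates the total stock of data would double in roughly 400 days, which is the arithmetic behind the later point that old data loses relative value every year.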
and I think that as the identification
00:48:22
of better sources of training data and
00:48:25
the value of training data right now
00:48:27
we're in this kind of shotgun approach
00:48:28
we're trying to blast out and you know
00:48:30
Source lots of content lots of data just
00:48:33
like over time Netflix got better at
00:48:35
figuring out what content to buy and
00:48:37
what to pay for it and they're the best
00:48:39
at it I think so too will Google and
00:48:42
others figure out what data is actually
00:48:44
particularly useful what it's worth and
00:48:47
what to pay for it and so there's a lot
00:48:49
of data out there to go and identify to
00:48:51
mine to pay license fees to get access
00:48:54
to whether those are continuous license
00:48:56
fees or one time is still TBD that's a
00:48:58
key issue yeah so I think we're still a
00:49:00
little bit early to know if this is like
00:49:02
you know a continuous model like a tack
00:49:04
type business or if these are sort of
00:49:06
chunky type deals and we don't really
00:49:08
know what the real value is yet and that
00:49:09
all changes over time and remember the
00:49:11
the the rate of data generation is
00:49:14
increasing so while we're generating 200
00:49:16
petabytes of data per day as a species
00:49:18
on the internet that number is going up
00:49:20
every day and so every year all the old
00:49:23
data becomes worth even less so this is
00:49:25
all changing fairly dynamically and I
00:49:27
think there's a lot still to be figured
00:49:29
out on what the monetization model will
00:49:31
be for Content creators and and how
00:49:33
that's going to change over time yeah I
00:49:35
think it's a really good point some
00:49:36
things will be like The Sopranos or
00:49:38
Seinfeld or Simpsons where that library
00:49:40
is worth a fortune and people will pay a
00:49:42
billion dollars a half billion dollars
00:49:44
some no one's paying a
00:49:45
lot for old NFL games you know Nick
00:49:47
found it by the way the Reddit licensing deal
00:49:49
was 203 million over it says two to three
00:49:53
years so let's assume it's call it three
00:49:55
just to be safe so it's you know $66 million a
00:49:58
year it doesn't say whether the deal
00:50:00
with Google is exclusive wow so they
00:50:02
could do they could do that same licensing
00:50:03
deal multiple times that's interesting
00:50:05
yeah none of these are exclusive it
00:50:06
seems this is what I don't understand
00:50:08
why don't you do deals like
00:50:10
with folks like Reddit where you
00:50:12
actually do it exclusively like it seems
00:50:14
like it's more valuable to spend a
00:50:16
multiple of this number for for one of
00:50:19
the big seven who have tens of billions
00:50:21
of dollars of cash anyways and block the
00:50:23
other players and block everybody else
00:50:25
that just seems Reddit multiple gets
00:50:26
capped if I'm Reddit I don't want to do
00:50:27
that deal yeah if I'm Reddit I put it on the
00:50:29
table Yeah because then my multiple is
00:50:31
capped like the way I can monetize my
00:50:33
content is now set and I'm done if I get
00:50:36
bought investors are like oh you're
00:50:37
worth five times EBITDA you know that's it
00:50:40
Reddit Quora Stack Overflow they're going
00:50:42
to just get taken out I this is I think
00:50:43
this is going to be the new model by the
00:50:45
way actually didn't Quora raise uh
00:50:47
500 recently I think they're going to
00:50:49
get taken out I think these
00:50:51
businesses will become too valuable
00:50:52
because they do have ongoing content
00:50:54
that
00:50:56
used to run you started Weblogs
00:50:59
right and but think about the value of
00:51:01
that content today it's negligible like
00:51:03
it was very valuable at the time and as
00:51:06
time went on more content was being
00:51:07
created a 100 times a thousand times
00:51:09
10,000 times more content that started
00:51:12
to overshadow the value of that content
00:51:13
at the time the acquisition was done it
00:51:15
made a ton of sense but all of a sudden
00:51:17
two years later particularly with the
00:51:18
rate at which data is growing on the
00:51:20
Internet it's like does it make sense to
00:51:22
buy any content anymore so you well on
00:51:24
the other side of that is historical
00:51:26
content could be worth a lot of money
00:51:28
especially some could be so if you had
00:51:30
the Charlie Rose archive as an example
00:51:32
what is you know he's probably
00:51:33
interviewed Kissinger 10 times and he's
00:51:35
interviewed Kissinger for you know 10
00:51:37
hours I've got two almost 2,000 episodes
00:51:39
of podcasts I've done with startups over
00:51:42
13 years like yeah this weekend startups
00:51:45
archive is going to be worth something
00:51:46
at some point right I don't think it's
00:51:47
worth what are old baseball games Worth
00:51:49
right like I mean yeah they're
00:51:52
rewatchable and I don't think the data
00:51:54
from them is particularly important so I
00:51:56
I agree on that one but historical yeah
00:51:59
we don't know but and I just question
00:52:00
how much of reddit's content is actually
00:52:02
like long-term valuable versus like
00:52:05
they're covering a topic and they're
00:52:06
talking about interesting stuff and
00:52:08
then later it's dated it's a really really
00:52:10
excellent point yeah and we'll figure
00:52:13
that out okay and that's why I think
00:52:14
it's like it's the early days of knowing
00:52:15
how to Value all this content
00:52:16
particularly for llms and so we don't
00:52:18
really know yet over the next year this
00:52:20
will all start to become clearer but
00:52:22
it's it's again same thing happened in
00:52:24
music licensing yeah right but if if all
00:52:26
the content creators could kind of
00:52:28
unionize then it might increase that's
00:52:32
kind of like the yeah like music music
00:52:34
industry has ASCAP you socialist I love it
00:52:37
well I'm not I'm not advocating for this
00:52:39
but the point I'm making is if they
00:52:40
can't unionize then there's a lot
00:52:43
there's just a huge number of vendors of
00:52:46
content and so models will need to buy
00:52:49
some but as long as they can get some
00:52:50
they don't need to have all and
00:52:52
therefore it's basically highly
00:52:54
competitive among suppliers and there's
00:52:56
a very limited number of buyers so that
00:52:59
tends this is why the news industry
00:53:00
should have always had a federation
00:53:01
because they could have just said to
00:53:03
Google hey we're going to de-index
00:53:05
ourselves from the Google search engine
00:53:07
and so you won't have the New York Times
00:53:08
Washington Post LA Times you're just not
00:53:10
going to have any of us unless you give
00:53:11
us X Y and Z and they were just too
00:53:14
stupid and not coordinated to do it yeah
00:53:17
music industry the exact opposite you
00:53:19
you try to do anything with the music
00:53:20
industry they're going to come down on
00:53:22
you like a ton of bricks to this day I
00:53:24
see what
00:53:25
that's actually really interesting yeah
00:53:27
if all the um old school Legacy
00:53:30
newspapers and magazines and so on
00:53:34
basically formed their own whatever cartel
00:53:38
trade Association yeah that would have
00:53:39
been powerful yeah I mean that's what
00:53:41
Murdoch wanted Murdoch saw it clearly he
00:53:43
was like you know Google's the enemy
00:53:45
here they're going to take all of our
00:53:46
revenue and they're going to get all our
00:53:47
customer names and we're not going to
00:53:48
even know the names of our customers all
00:53:50
right issue three Klarna crushes
00:53:51
customer queries with AI you may have
00:53:53
seen this trending on X and Twitter and
00:53:55
in the press if you don't know Klarna
00:53:57
they're a Swedish fintech company they do
00:53:59
that buy now pay later stuff I think
00:54:00
they were The Originators of that online
00:54:03
and they put out a press release with
00:54:04
some really eye popping claims AI
00:54:07
assistants are now doing the work of 700
00:54:10
full-time agents at Klarna they moved issue
00:54:14
resolving times from 11 minutes with
00:54:16
humans to 2 minutes with
00:54:19
AI and customer satisfaction is on par
00:54:22
with human
00:54:23
agents and it said its resolutions are
00:54:25
more accurate than humans creating a 25%
00:54:27
drop in repeat inquiries that tracks and
00:54:30
so far their AI which they built with
00:54:33
open AI has had 2.3 million
00:54:35
conversations accounting for two-thirds of
00:54:37
Klarna's customer support service chats
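Those volume and speed claims can be sanity-checked with a little arithmetic (the inputs are the figures cited here; the derived totals are my own back-of-envelope, not Klarna's):

```python
# Sanity check on the cited Klarna support numbers: 2.3M AI-handled
# conversations said to be two-thirds of all chats, resolved in ~2
# minutes versus ~11 minutes for a human agent.
ai_chats = 2_300_000
ai_share = 2 / 3
minutes_human, minutes_ai = 11, 2

total_chats = ai_chats / ai_share                           # implied overall volume
hours_saved = ai_chats * (minutes_human - minutes_ai) / 60  # handling time saved

print(f"implied total support chats: {total_chats / 1e6:.2f}M")
print(f"handling time saved: ~{hours_saved:,.0f} hours")
```

That implies roughly 3.45M total support chats over the period, with the AI shaving about nine minutes off each of the 2.3M conversations it handled.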
00:54:40
Klarna estimates its AI agent will drive
00:54:43
wait for it a $40 million increase in
00:54:46
profits this year we could talk a little
00:54:49
bit more about Klarna and their
00:54:50
valuations but freeberg what what do you
00:54:52
think this means if we're in year this
00:54:55
is the start of year two of ChatGPT as
00:54:58
like a phenomenon let's say what do we
00:55:01
think year three looks like I think the
00:55:03
techno-pessimist point of view
00:55:05
is oh my God look at all these jobs that
00:55:06
are getting lost I think the optimistic
00:55:08
point of view is that that company has
00:55:11
all of that excess Capital now to
00:55:13
reinvest in doing other things that
00:55:15
Capital doesn't just become profits that
00:55:17
flow out the door and everyone's done
00:55:19
with that money and that money just gets
00:55:21
put away in a sock that money gets
00:55:23
reinvested and that money gets
00:55:24
reinvested invested in higher order
00:55:26
functioning work and that's really where
00:55:28
there's an opportunity to move the
00:55:30
workforce overall forward which is what
00:55:32
I think is super exciting and um I'm
00:55:34
super positive about we've seen this in
00:55:36
every technological Evolution that's
00:55:39
happened in human history from you know
00:55:41
the plow in agriculture to automobiles
00:55:44
to Computing and to now ai that you know
00:55:47
humans moved from manual labor to
00:55:50
knowledge work to now ideally and
00:55:52
hopefully more creative work and so I do
00:55:55
think that it isn't just about
00:55:57
eliminating jobs and making more money
00:55:59
but it's about enabling the creation of
00:56:01
entirely new class of work whether
00:56:03
that's prompt
00:56:04
engineering or you know building
00:56:06
entirely new businesses that simply
00:56:07
can't exist today or perhaps even
00:56:09
downscaling businesses where you no
00:56:11
longer need to have a 10,000 person
00:56:12
organization smaller organizations can
00:56:15
be stood up as startups to start to
00:56:17
replace large functioning organizations
00:56:18
so I don't know I think it's a time of
00:56:20
great opportunity I know that some
00:56:21
people would view it as being highly
00:56:23
shocking I think it's inevitable that
00:56:25
knowledge labor where the job of the
00:56:26
human is simply the ingestion of data
00:56:28
and then communicate an output of data
00:56:31
seems like it will eventually be
00:56:32
replaced by Computing somehow and this
00:56:34
is happening now in an accelerated way
00:56:36
with these llms so I think that what we
00:56:38
should focus on and think about is what
00:56:41
are all the new businesses all the new
00:56:42
jobs all the new opportunities that are
00:56:44
just couldn't have existed 10 years ago
00:56:46
that are now emerging that are very
00:56:47
exciting as the workforce transitions
00:56:50
Sacks do you buy this techno-utopian view
00:56:52
of this all these jobs that are going to
00:56:55
obviously be retired are going to open
00:56:58
up the opportunity for these humans to
00:57:00
do even better work at Klarna or do you
00:57:02
think it's just going to go straight to
00:57:04
the bottom
00:57:05
line well it sounds like they're able to
00:57:08
eliminate a lot of Frontline customer
00:57:10
support roles by using AI which is what
00:57:13
I would expect yeah I think this is a
00:57:15
very natural application for AI you know
00:57:19
it was already the case that you could
00:57:21
pretty much find answers to questions by
00:57:23
searching the FAQ things like this this
00:57:25
this is an even better way of doing that
00:57:28
so so look I I believe that this will be
00:57:30
a big area for AI is saving on again I
00:57:32
use the word Frontline customer support
00:57:34
because the way that customer support is
00:57:37
typically organized is there's level one
00:57:39
level two level three the more difficult
00:57:43
queries or cases get escalated up the
00:57:45
chain depending on how hard they are and
00:57:47
I think the AI will do a really good job
00:57:49
eliminating level one it'll start to
00:57:51
eating to level two but you're probably
00:57:53
going to
00:57:53
need humans to deal with the more uh
00:57:56
complex cases now the question is where
00:57:59
do those displaced humans go I think
00:58:01
there's going to be new jobs new work
00:58:03
that's always been the history of
00:58:05
technological progress and one of the
00:58:07
things you're already seeing is there's
00:58:08
a whole bunch of new AI companies that
00:58:10
are exploiting this technology and they
00:58:12
need to hire people so I basically agree
00:58:14
with freeberg that you will Elevate
00:58:16
people's work by automating away the
00:58:19
less interesting parts of people's jobs
00:58:21
and then creating more productivity and
00:58:23
more opportunity by the just to use your
00:58:25
just to use your example sacks so
00:58:27
imagine if all the level one support
00:58:29
people some chunk of them can now do
00:58:31
level two support and so the customers
00:58:33
are going to get greater Hands-On care
00:58:35
more customers will get access to a
00:58:37
higher level of service the organization
00:58:39
can afford to do that they'll be more
00:58:41
competitive in the marketplace because
00:58:43
customers feel better taken care of I
00:58:45
just think that's how the organizations
00:58:47
get leveled up as new technology kind of
00:58:49
shows up like this it's a great point
00:58:51
and then those folks can have a much
00:58:52
deeper level of interaction with with
00:58:54
their customers than they are
00:58:56
today as the question gets more complex and
00:58:59
then people might get better at the
00:59:00
software and they might discover new
00:59:02
features that you might be able to
00:59:03
redeploy those people if you look at
00:59:05
coffee like I don't know was it 40 years
00:59:07
ago you went to order a cup of coffee it
00:59:09
was decaf or regular coffee milk and
00:59:11
sugar those are like those are your four
00:59:13
choices and and now you go order coffee
00:59:16
I don't know if you guys have used the
00:59:17
Starbucks app or the you know I just had
00:59:20
the Sweetgreen CEO on the pod and man
00:59:22
you can the fidelity and the nuance
00:59:25
of what you want to order is absurd Chamath
00:59:28
where do you think this is all heading
00:59:30
because there is the issue of
00:59:31
displacement how people how quickly
00:59:33
people can re be redeployed and if we're
00:59:35
seeing in year two customer support and
00:59:39
developers getting 10x what other
00:59:41
categories do you think we're going to
00:59:42
see fall next I think the truth is that
00:59:45
as you said the real world applicability
00:59:47
to AI was not last year so I think we're
00:59:50
really in the first five or six weeks of
00:59:53
the first year
00:59:55
so you consider that year zero that's
00:59:57
year zero that was sort of like the you
00:59:59
know where everybody was running around
01:00:00
building toy apps proofs of concept this is
01:00:03
one of the first few times where you're
01:00:05
seeing something in production where
01:00:06
there's measurable economic value and
01:00:09
the important thing to note about that
01:00:10
is that it's not just what it means for
01:00:12
Clara but what it means for everybody
01:00:13
else so if you look at everybody else
01:00:15
for example here's Teleperformance which
01:00:17
is a French company that runs call
01:00:20
centers they lost $1.7 billion of market
01:00:23
cap when that tweet went out about 20%
01:00:26
of their market cap so this is this is
01:00:28
the real practical implication yes Klarna
01:00:31
replaced 700 people and they saved 40
01:00:34
million of opex but Teleperformance meanwhile
01:00:38
while they were just doing their
01:00:39
everyday work lost 1.8 billion of their
01:00:42
market cap at the exact same moment and
01:00:45
so what does it
01:00:47
mean I think that what Klarna should do
01:00:50
is open source what they've built and
01:00:54
the reason is that you want to give
01:00:56
companies like
01:00:58
Teleperformance a chance
01:01:01
to retool themselves with the best
01:01:03
possible technology so they they can
01:01:05
actually preserve as many of the jobs as
01:01:07
possible because at the limit if every
01:01:10
single company is able to implement
01:01:12
something that is as economically
01:01:13
efficient as what Klarna did
01:01:15
Teleperformance doesn't exist and
01:01:17
there's 10 billion and 335,000 employees
01:01:21
that will not have a job Y and so for
01:01:24
Klarna the reason to open source it is
01:01:26
twofold one is they don't lose anything
01:01:29
because you will still need to train it
01:01:31
on your own data and so there's no
01:01:33
disadvantage that Klarna will have right
01:01:36
they're just saying look I built this on
01:01:37
top of GPT here's what it looks like and
01:01:40
that production code can be used by
01:01:42
anybody else go for it but it has to run
01:01:44
on your own data that's a very
01:01:45
reasonable thing so I think it has the
01:01:47
benefit of both a setting a technical
01:01:50
Pace that can help them attract better
01:01:52
employees and more highly qualified
01:01:54
people who find the scope of work even
01:01:56
more interesting and B I think it's on
01:01:59
the right side of history with all this
01:02:00
AI stuff where it's allowing everybody
01:02:03
to sort of benefit in a way that is the
01:02:05
least destructive but I just wanted to
01:02:07
show you that the destruction was quite
01:02:09
quick and it was pretty severe and if
01:02:11
two or three other big companies launch
01:02:13
these kinds of tweets after real
01:02:15
measurable results Teleperformance will
01:02:17
be a 1 billion doll company short in
01:02:19
short order there's a third reason I I
01:02:21
think it's a brilliant idea for Klarna to
01:02:22
open source this this tool because it's
01:02:25
not their business right this is just
01:02:27
something they did as a as a
01:02:29
productivity Improvement they get the
01:02:30
benefit back to them of the community
01:02:32
working to advance that technology so
01:02:34
they don't have to put more Engineers
01:02:36
like advancing the ball on their
01:02:38
customer support AI totally they can
01:02:40
just re-merge the changes that the
01:02:42
open source Community comes up with and
01:02:44
since they're not in the business of
01:02:46
selling AI directly there's no reason
01:02:48
not to do it like Chamath said so I think
01:02:50
it's kind of brilliant yeah this is meta
01:02:52
strategy by the way I mean they should be
01:02:54
building this on meta's Open Source
01:02:56
Products and Apple's Open Source
01:02:57
products right Sacks yeah well so what
01:02:59
meta said what Zuck said on the last
01:03:02
meta call is the reason we open source
01:03:04
everything is because we don't directly
01:03:05
sell AI we create products that AI makes
01:03:09
better so by open sourcing this we allow
01:03:12
the community to advance the ball and we
01:03:14
get to reincorporate those changes so
01:03:16
it's a very smart strategy for companies
01:03:19
that aren't directly selling the AI now
01:03:21
you know if you're if you're like Brett
01:03:23
Taylor's new company Sierra obviously
01:03:25
you're not going to open source it
01:03:26
because your whole business model is to
01:03:28
create a proprietary solution yeah but
01:03:30
then Chamath's you know 8090 whatever he
01:03:34
was talking about the with his you know
01:03:35
incubator concept and you know these
01:03:38
things it's a company it's a company
01:03:40
okay sorry um is it 8090 or 980 I'm
01:03:42
sorry 8090 yeah 8090 is there a third
01:03:45
word that comes after it or are you just going to
01:03:46
call it 8090 just 8090 got it okay 80% of
01:03:49
the features at a 90% discount so back
01:03:51
to that like what does this mean if
01:03:53
these things are getting to freeberg
01:03:55
your point about the pace man the pace
01:03:57
of these things is it going to improve
01:03:59
10% a year or 10% a month if it's
01:04:01
improving 10% a month we're going to get
01:04:03
to 98% of queries done this year if it's
01:04:07
doing 10% a year okay we're going to get
01:04:08
to 99 or 98% of queries in four years in
01:04:11
other words this is happening folks and
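One way to make the 10%-a-month versus 10%-a-year comparison concrete is to model each improvement step as shrinking the unresolved share of queries (the two-thirds starting point is the Klarna figure cited earlier; the compounding interpretation is my assumption, and it lands somewhat below the off-the-cuff 98-99% figures in the conversation):

```python
# Model each improvement period as shrinking the *unresolved* share
# of queries by 10%. Starting point: two-thirds of chats handled, as
# cited earlier; the compounding model itself is an assumption.
START_HANDLED = 2 / 3

def handled_after(steps: int, rate: float = 0.10) -> float:
    """Share of queries handled after `steps` improvement periods."""
    unresolved = 1 - START_HANDLED
    return 1 - unresolved * (1 - rate) ** steps

print(f"10%/month for 12 months: {handled_after(12):.1%} handled")
print(f"10%/year for 4 years:    {handled_after(4):.1%} handled")
```

Under this reading, monthly compounding gets you to roughly 90% of queries handled within a year, while yearly compounding takes four years to reach about 78%; the directional point stands even if the exact percentages are loose.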
01:04:13
it's happening at a blistering Pace by
01:04:15
the way think of it's so not just
01:04:18
Teleperformance which was a $10
01:04:20
billion now a $7.5 billion company
01:04:24
but think about Zendesk right Zendesk was a
01:04:27
I think an 8 to 10
01:04:29
billion yeah it was a $10 billion take
01:04:32
private in the hands of PE the entire
01:04:35
Zendesk workflow could be replaced by a
01:04:37
handful of these open source agents
01:04:39
where all of a sudden people can
01:04:41
eliminate a lot of opex I
01:04:44
think the thing to keep in mind here is
01:04:45
where the world is going has always been
01:04:48
to try to lower cost and the original
01:04:52
foundational principle of SAS was that
01:04:54
there's these line items in on-prem
01:04:56
software that are just extremely
01:04:58
expensive over time very hard to justify
01:05:01
right and so when people moved to SAS
01:05:03
from on Prem they were looking for cost
01:05:05
savings that was the initial thing now
01:05:07
yes it's actually not cheaper anymore
01:05:09
but it's much more full-featured so you get a
01:05:11
lot more value in SAS etc etc but the
01:05:14
point of these AI agents and Bots and
01:05:16
workflows is that it'll reintroduce the
01:05:19
concept of cost savings of
01:05:22
this idea that you can have cheaper
01:05:24
faster and better and the more that that
01:05:27
stuff is open source my gosh I think it
01:05:29
just makes it very hard for companies
01:05:31
that have Point products to survive yeah
01:05:34
we I was talking to a friend of mine
01:05:35
Josh Mohrer who was the city head
01:05:39
of uber in New York and he launched his
01:05:41
own note-taking app he's writing it
01:05:43
himself and he's obsessed with this
01:05:45
concept of building a billion dollar a
01:05:46
unicorn company with one employee and
01:05:48
this is something that a lot of people
01:05:50
have been talking about my friend Phil
01:05:52
Kaplan of DistroKid built a very large
01:05:54
business DistroKid a unicorn with a
01:05:56
very like I think low single digit
01:05:58
number of people this could be the
01:05:59
future of you know efficiency you you
01:06:02
could build if you catch fire with a
01:06:03
really hot company that gets a million
01:06:05
custom absolutely the future it's
01:06:06
absolutely the future and then if you
01:06:07
think about our jobs in terms of capital
01:06:09
allocation well how much Capital does
01:06:11
that founder need do they need to dilute
01:06:13
10% 20% 30% they're not going to need to
01:06:15
dilute 60 70 80% a one-person company
01:06:18
should be able to spend less than a few
01:06:19
hundred grand to get to product Market
01:06:21
fit in the next few years yeah I mean
01:06:23
that's that's kind of what we're seeing
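The dilution point can be made concrete with a toy cap-table calculation (all round sizes here are hypothetical, purely to illustrate how dilution compounds across rounds):

```python
# Toy illustration of compounding dilution: a founder who needs only
# a small amount of capital keeps far more of the company than one
# who raises several big rounds. All percentages are hypothetical.
def ownership_after(dilutions):
    """Founder's stake after a series of dilutive rounds."""
    stake = 1.0
    for d in dilutions:
        stake *= 1 - d  # each round sells off a fraction of the whole
    return stake

lean = ownership_after([0.15])                     # one small round
heavy = ownership_after([0.20, 0.20, 0.20, 0.15])  # several rounds

print(f"lean path:  {lean:.0%} founder ownership")
print(f"heavy path: {heavy:.0%} founder ownership")
```

With these illustrative numbers the lean founder keeps 85% while the heavily funded one keeps about 44%, which is the gap the hosts are gesturing at.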
01:06:24
just to go back to your question J like
01:06:28
where does this go next yeah please I
01:06:30
think it's it's it's really interesting
01:06:32
to speculate about that so what Clara
01:06:34
seems to be talking about are
01:06:36
email-based customer support cases I
01:06:39
think where this is going to go next is to
01:06:41
phone 100% and these call centers use
01:06:44
what are called IVRs these
01:06:46
interactive voice response systems but
01:06:48
they're very rigid it's a lot of
01:06:49
pre-recorded messages and it says push
01:06:51
one if your problem is this push two if
01:06:53
your problem is this everyone hates
01:06:54
those things yeah I think where it goes
01:06:57
next is you'll call up the call center
01:06:59
and you'll get a a voice that sounds
01:07:01
like a human just talk to you yeah and
01:07:04
you won't even necessarily realize that
01:07:05
you're talking to an AI because they're
01:07:06
already these AI companies that can do
01:07:09
generative voices any language any
01:07:12
accent oh and they're fast now and
01:07:14
they're fast and multi language think
01:07:16
about that just localizing them across
01:07:18
the globe you know you want to launch
01:07:19
your product in Japan by the way did you
01:07:21
guys see there was a meta demo which I
01:07:23
thought was really cool which was it was
01:07:25
run on Lama 70b but it was a real-time
01:07:29
translation tool where the person was
01:07:30
speaking in Chinese and the
01:07:32
other person was speaking in English and
01:07:34
they were able to understand each other
01:07:35
but saxs to your point when that person
01:07:38
calls for example BofA now Spanish is
01:07:42
not a language it's actually many
01:07:44
dialects and many many many different
01:07:46
accents right depending on which country
01:07:48
you're from like the accent that you
01:07:49
hear in Spain is totally different than
01:07:52
the accent in El Salvador the accent in
01:07:54
Chile or Argentina and so wouldn't it be
01:07:56
amazing where you call your BofA app
01:07:59
it picks up your accent and your
01:08:01
tonality and it responds with a person
01:08:03
of that exact same accent and tonality
01:08:06
that is incredible well I'll take it to
01:08:08
another level you call it recognizes
01:08:10
your number it knows what you're doing
01:08:12
in the software it knows the problem
01:08:13
you've had it knows the last three times
01:08:15
you called and how long you've been
01:08:17
using the software and it anticipates
01:08:19
like okay I know this person has a
01:08:20
Windows machine and it's you know still
01:08:23
5 years old and it's like are you still
01:08:24
using that same 5-year-old Windows
01:08:26
machine yeah we know it's a bug with you
01:08:28
know Windows whatever uh you should
01:08:30
probably upgrade I mean it's going to
01:08:31
know the entire context of this and so
01:08:33
it's going to just get it could be more
01:08:36
efficient than a human could ever be the
01:08:38
best customer support interaction I had
01:08:40
was I called JP Morgan Chase because I
01:08:42
had an old credit card that I had had
01:08:44
for 20 years and I think that they had
01:08:46
outsourced it to somewhere in the
01:08:48
Caribbean and this woman picked up the
01:08:50
phone it was so cool Chamath what is the
01:08:52
problem man and I was like oh this is
01:08:54
the best and I had this whole
01:08:56
conversation with her for 15 20 minutes
01:08:58
nothing to do with the phone uh nothing
01:09:00
to do with my credit card rather it was
01:09:02
great she's like man you want to
01:09:04
cancel your card I was like yes please
01:09:06
no but you could do celebrities best
01:09:08
yeah celebrity voices yeah J Cal do the
01:09:11
djt okay let's let's role play uh you're
01:09:13
Donald Trump okay and I and I am JP
01:09:17
Morgan hi Mr President chath you're huge
01:09:21
you've got big spending I every time I
01:09:22
go see Chamath I go see your house I go in
01:09:25
amazing Mr Mr President I would like to
01:09:27
cancel my credit card sir okay you don't
01:09:29
want to cancel it we've got a great a
01:09:32
3.9% but for jamas you know what he's my
01:09:36
Sri Lankan friend how much do we love
01:09:37
Sri Lankans
01:09:38
okay huge and not jcal nasty nasty man
01:09:44
jcal very nasty he acts like a dog a dog
01:09:48
okay he's a dog some people say
01:09:51
TDS how much do we love our Sacks Saxy
01:09:54
poo great Mar-a-Lago I go to Sacks's house
01:09:58
his house huge huge
01:10:01
house almost as big as Mar-a-Lago not quite
01:10:06
I gotta work on it because I haven't
01:10:07
done it for years that's really good
01:10:09
it's really good have you seen that guy
01:10:11
there's a there's a new like Impressions
01:10:13
guy who's amazing he does seen him he's
01:10:16
on Howard Stern all the time right he
01:10:17
does the Howard no not Shane Gillis this
01:10:20
is another kid Matt Friend yeah
01:10:23
yeah yeah does Howard back to Howard
01:10:25
wait that video is good too hold on
01:11:31
do other Trump impressions so
01:10:34
like I'll tell you what Zack there are
01:10:36
so many people that try to do it you see
01:10:38
the failed Alec Baldwin Baldwin comes out
01:10:40
on SNL which used to be a lot funnier
01:10:42
back in the day with chff but Baldwin
01:10:45
goes like this he goes we've got a great
01:10:50
show I don't do that I never say boy boo
01:10:53
this Stephen Colbert he comes out like I know a
01:10:57
do do dot a DOT do a dot dot dot a China
01:11:00
I don't say the D and the last one is
01:11:03
Jimmy Fallon and he touched my
01:11:05
hair like a dog and Jimmy Fallon goes
01:11:09
okay look like dog we're rolling okay I
01:11:12
don't do these moves they're all wrong
01:11:15
so that's
01:11:16
a
01:11:19
incredible let's see him doing uh
01:11:23
Howard you got to see him do Howard Stern it's
01:11:23
Next Level the other one he does that's
01:11:25
incredible but subtle is Stanley Tucci
01:11:27
how are you by the way Robin you look
01:11:28
beautiful right thank you is this still
01:11:32
you up Howard it's me up right yeah
01:11:35
right because when I start talking to
01:11:36
Robin after you leave I get crazy you
01:11:39
know right can we just talk about this
01:11:42
can we
01:11:43
talk right or wrong right right right
01:11:47
right Howard left or right right uh
01:11:50
right right right right right right
01:11:53
right on red right right right right
01:11:57
it's so great he is killing it I think
01:12:00
we have our halftime act for the all in
01:12:01
Summit 24 oh for sure oh my God yeah
01:12:03
he'd be amazing I love Stanley Tucci by
01:12:06
the way he's great oh you got to see
01:12:07
this kid Stanley
01:12:09
Tucci have you guys watched the
01:12:11
Stanley Tucci on HBO where he's cooking yeah
01:12:15
where like he goes to Italy it's so
01:12:17
great it's so great he does it on Tik
01:12:18
Tok too he just randomly cooks something
01:12:21
and he's like I'm going to make a
01:12:22
frittata I have some leftovers and I'm
01:12:24
just going to I do watch him on Tik Tok
01:12:27
Thanksgiving is a special time I'm in
01:12:29
London celebrating so today I'm making
01:12:33
stuffing and also sugo di carne or as
01:12:37
it's known gravy now the thing about
01:12:40
stuffing is you might use a traditional
01:12:43
white bread but in my household I use a
01:12:46
homemade focaccia because I'm Italian on
01:12:49
both sides and nothing tastes better
01:12:51
than bread with a little meat sauce
01:12:54
it's a time for gratitude and giving
01:12:56
thanks enjoy your holidays and thank you
01:12:59
very much Felicity good that is really
01:13:01
good he's so good he's so good all right
01:13:03
listen I don't know how I keep the show
01:13:05
going here but three two okay Issue four
01:13:07
Reddit's S1 is kind of fun let's break it
01:13:09
down everybody 2023 revenue 804 million
01:13:12
up 21% year-over-year they're still
01:13:15
losing money uh net loss 91 million in
01:13:17
2023 they lost 159 million in 2022 so
01:13:20
they're cutting the loss free cash flows
01:13:22
negative
01:13:23
here's a chart of their quarterly
01:13:25
revenue and cash flow so they're kind of
01:13:27
bouncing along the break even Mark as
01:13:29
you can see there in the chart they got
01:13:31
a wonderful gross margin because they
01:13:33
don't pay to produce the content unlike
01:13:36
the New York Times or Netflix and so an
01:13:39
86% gross margin up 2% year-over-year
01:13:42
their daily active uniques 76 million up
01:13:45
27% year-over year average revenue per
01:13:47
user is incredibly low three bucks and
01:13:50
42 and their daily active unique users
01:13:55
uh here you can see another chart
01:13:57
growing nicely quarter over quarter they
01:13:59
got a billion two in cash most
01:14:02
interesting probably uh and could be
01:14:05
challenging to execute on is their
01:14:07
direct share program they're going to
01:14:08
carve out a bunch of shares in the IPO
01:14:12
to sell to their most active mods those
01:14:14
are the moderators the people who run
01:14:16
the different channels or subreddits as
01:14:18
they're called what could go wrong what
01:14:20
could go wrong wrong we've seen this
01:14:22
movie before
01:14:23
and uh but they're going to invite users
01:14:24
to participate in it on a rolling
01:14:27
basis more
01:14:28
unqualified retail buyer pool does not
01:14:32
exist than the Reddit mods except maybe
01:14:34
the Reddit participants in WallStreetBets who
01:14:35
knows what they're doing yeah I think
01:14:37
they'll buy the uh I think they'll buy
01:14:38
after the lock up comes up but let's
01:14:41
just get started here I think you looked
01:14:45
at the S1 a little bit freeberg and you
01:14:49
had asked us to you asked me to put on
01:14:51
the docket because you were you were
01:14:52
digging into it anything stick out to
01:14:54
you or thoughts on the business overall
01:14:57
no I mean I wasn't pulling I just asked
01:15:00
if you guys had read it
01:15:02
um hey I think I made a funny
01:15:07
joke read it Reddit get it I'll let
01:15:12
you guys go on for a minute go ahead go
01:15:14
ahead tell me when you're
01:15:16
ready it was a good pun too bad it was
01:15:21
accidental no it was it was intend I'll
01:15:23
give you credit it was we'll give it to
01:15:25
you so I think the the thing about
01:15:26
Reddit if you could pull up the chart
01:15:28
with the quarterly average daily active
01:15:30
user data this was a business that the
01:15:33
last couple of years everyone was like
01:15:34
had flatlined because it was only
01:15:36
growing 5% a year in terms of usage and
01:15:40
then all of a sudden in the last two
01:15:42
quarters so starting in late summer
01:15:45
early fall of 23 so just six months ago
01:15:48
the usage started to climb pretty
01:15:50
significantly growing 15% and in the
01:15:52
most recent quarter 27% year-over-year
01:15:55
absent that growth story it's a really
01:15:58
challenged business because a business
01:16:00
without much growth gets value typically
01:16:02
on a multiple of the cash flow that
01:16:04
they're generating and you know there's
01:16:06
less upside and all this kind of
01:16:08
optionality goes away that's kind of a
01:16:09
key point I I don't know I think like
01:16:11
for you to make an investment at a $5
01:16:12
billion valuation here you've really got
01:16:15
to believe that the growth continues at
01:16:17
this rate and it doesn't revert back to
01:16:19
the mean growth rate of the last couple
01:16:21
of years of basically 5% which is
01:16:23
roughly flatlined the other challenge
01:16:25
they have is that their ARPU is only in that
01:16:27
kind of $3 range which is like less than
01:16:30
10% of where Facebook is at and the data
01:16:33
that Facebook collects on their users
01:16:35
gives them the ability to do much better
01:16:37
targeting on ads and therefore monetize
01:16:39
Their audience much better than Reddit
01:16:41
has been able to do to the order of over
01:16:43
10x and then if you look at the ARPU
01:16:45
number how much they've been able to
01:16:46
grow that metric it's also you know been
01:16:49
a little bit flatlined so this business
01:16:52
I think is a real question mark I mean
01:16:54
you could argue it's probably worth in
01:16:55
the best best case in the2 to3 billion
01:16:58
kind of valuation range and then you
01:17:00
have to believe the bull case that the
01:17:02
growth continues or accelerates from
01:17:04
here and they have a plan to address the
01:17:07
ARPU problem they have other paths for
01:17:09
monetizing Their audience than what
01:17:11
they're kind of doing today what do you
01:17:14
guys think this thing's worth do you buy
01:17:15
it if it goes out at three billion or
01:17:17
five billion I think the first question
01:17:19
which you nailed that a buy-side
01:17:24
investor will ask is what happened in
01:17:27
the last two quarters that was different
01:17:28
than the last 15
01:17:31
quarters that's going to be a very
01:17:33
important question and I think they're
01:17:34
going to have to have a very buttoned up
01:17:35
answer for that right and if they can
01:17:38
point to very specific repeatable things
01:17:42
I think that'll be good the thing that
01:17:44
they this IPO if it goes off in the next
01:17:48
four weeks they won't have to wait but
01:17:52
if it doesn't get off in the next four
01:17:53
weeks they'll have to update the S1
01:17:55
probably with
01:17:56
Q1 and so you'll see whether this thing
01:17:59
is a trend or whether it was a one-time
01:18:01
thing do you know what it is well the
01:18:04
growth in the logged out is probably
01:18:06
largely because typically if you use it
01:18:08
on a
01:18:09
[Music]
01:18:10
phone it tries to force you to use the
01:18:14
app right so that you can be in this
01:18:16
logged in experience and if you just
01:18:19
turn that off you can get a lot more
01:18:20
logged out because Reddit gets
01:18:22
tremendous rank Authority from
01:18:24
Google so if you just if you just turn
01:18:27
that off I think that you'll have a lot
01:18:28
of logged out customers and that will
01:18:30
grow very quickly and so maybe it's a
01:18:32
decision that they'd rather have the
01:18:33
Topline number grow than have logged in
01:18:36
users grow but the logged in user growth
01:18:37
has still been pretty healthy it's
01:18:39
basically doubled in the last three
01:18:40
years so but to your point freeberg if
01:18:43
they said or our business is really only
01:18:46
these 30 odd million logged in users it
01:18:49
would be worth a lot less than saying 75
01:18:51
million I think it's I think you're
01:18:53
right it's kind of like in the mid you
01:18:57
know kind of two three four billion
01:18:59
dollarish range the big problem is the
01:19:02
ARPU because these are not
01:19:04
users that represent sort of Facebook's
01:19:07
bread and butter kind of a $40 ARPU
01:19:11
lives in a good suburb in the United
01:19:14
States and is monetized like
01:19:17
crazy I just don't think that's what
01:19:19
these users are but you could see that
01:19:21
as a challenge or opportunity because if
01:19:23
their ARPU is only 10% of Facebook's
01:19:26
there's a lot of Headroom there to grow
01:19:27
it if those users become economically
01:19:31
more valuable ARPU is actually down 2%
01:19:34
year-over-year Sacks the issue with this
01:19:35
user base is they're incredibly
01:19:37
sophisticated internet users who don't
01:19:39
click on ads and are kind of anti-ads
01:19:42
as opposed to you know the the general
01:19:45
population on a Facebook or a generic
01:19:46
service and it's Anonymous so you don't
01:19:48
know who the user is which is how
01:19:50
Facebook has such incredible demographic
01:19:51
targeting capabilities there's been a
01:19:53
lot
01:19:54
of conspiracy theories long con
01:19:56
theories that came up Sachs that we were
01:19:59
talking about on group chat maybe you
01:20:01
could summarize this
01:20:04
long game that was played by Sam Altman
01:20:07
and allegedly the founders of Reddit to
01:20:10
wrestle control of Reddit back from
01:20:13
their previous corporate owner
01:20:15
Condé Nast well this was a post by Yishan who is
01:20:18
a former CEO of Reddit that was
01:20:20
published back in I think 2015
01:20:23
and he kind of lays out what I think
01:20:26
happened or I mean he says at the end
01:20:28
just kidding but if he's the former CEO
01:20:30
describing these events he must be
01:20:33
describing something he knows about I
01:20:35
would just
01:20:36
think but in any event what happened is
01:20:39
that Reddit was sold for only about $10
01:20:41
million a year after it launched so like
01:20:43
really really small and I think that it
01:20:45
kept growing and the founders realized
01:20:47
maybe they made a mistake or that this
01:20:50
was actually a bigger property and so
01:20:53
they started scheming on how to get
01:20:54
Condé Nast to spin it back out and so
01:20:57
Yishan lays out the steps they went through
01:21:00
they
01:21:01
recruited a CEO who they kind of
01:21:04
pre-agreed on then they had that CEO
01:21:07
demand options specifically in Reddit
01:21:10
from Condé Nast which meant that Condé Nast
01:21:12
had to create a separate cap table for
01:21:15
it and then once they had a separate cap
01:21:17
table as a subsidiary of Condé Nast then
01:21:21
they could sort of pressure to have like
01:21:22
an outside investor bought in for the
01:21:25
expertise that just happened to be Sam
01:21:26
Altman and his
01:21:28
fund and you know eventually like step
01:21:31
by step they worked it to the point
01:21:33
where they got Condé Nast to spin off
01:21:35
the
01:21:35
company and I guess this plan worked now
01:21:39
it should be said that the largest
01:21:40
shareholder in Reddit according to the
01:21:43
S1 is Condé Nast or Condé Nast's parent company so
01:21:46
no one's going to benefit more from this
01:21:49
plan if you want to or scheme if you
01:21:51
want to call it that than Condé Nast it
01:21:52
was a smart thing for them to do to spin
01:21:56
out Reddit to allow the employees to
01:21:59
have options and then I would say to
01:22:00
bring back the founder Steve Huffman as
01:22:03
CEO several years ago so great it worked
01:22:05
out for everybody you know who knows if
01:22:07
it was all premeditated so if they own
01:22:09
30% and it goes out for five billion
01:22:12
they got 1.5 billion and they paid 10
01:22:15
million for it
01:22:17
so
01:22:19
that's a Web 2.0 1.0 2.0 no it's it
01:22:23
launched in 2005 2.0 the same time as
01:22:26
like Weblogs Inc and Delicious and
01:22:28
Flickr and all that that whole cohort
01:22:31
of little web apps that used Ajax and
01:22:34
other things to you know the web was
01:22:36
just getting faster and there were a lot
01:22:39
of users well anyway congratulations it
01:22:41
was it was smart for Condé Nast to do the
01:22:44
spin out and give up 70% in order for to
01:22:47
have 30% of what's going to be a
01:22:48
multi-billion dollar IPO great outcome
01:22:50
for everybody all right issue five five
01:22:53
Apple doesn't have a fast car project
01:22:55
Titan is DBA dead before arrival apple
01:22:59
as you know has been working on an
01:23:00
electric vehicle for a decade
01:23:02
self-driving as well they've invested
01:23:04
billions in the project according to
01:23:06
Bloomberg and apple was targeting 100K
01:23:09
price point basically going after the
01:23:10
model S plaid with FSD company had 2,000
01:23:15
employees working on this project it was
01:23:16
called Titan and they had designers from
01:23:19
Aston Martin Lamborghini Porsche
01:23:21
according to the report most of the team
01:23:22
will be transferred to Apple's
01:23:24
generative AI division we talked a
01:23:26
little bit about MGIE their um
01:23:28
generative AI image language model
01:23:32
there's going to be some layoffs it's
01:23:33
unclear how many what does building a
01:23:35
car have to do with building an llm well
01:23:37
they were yeah so they weren't just
01:23:39
building a car they were all in on
01:23:41
self-driving and not having a steering
01:23:42
wheel so I understand I'm just saying it
01:23:45
doesn't make a lot of sense that 2,000
01:23:47
employees that were specialized in
01:23:49
building a car all of a sudden now
01:23:51
become the AI
01:23:53
team it would be more like the full
01:23:55
self-driving team is probably getting
01:23:57
those AI jobs and the rest are probably
01:23:58
getting laid off with Incredible
01:24:00
packages but what do you think sax I was
01:24:02
always skeptical that Apple was even
01:24:03
working on a car it's just a very
01:24:05
different kind of product than anything
01:24:06
else they make and so I never really
01:24:10
treated it that seriously that they were
01:24:11
going to make a car so I'm not I the
01:24:14
surprise to me is not that they canceled
01:24:16
this but that the that it was even true
01:24:18
that they're working on it in the first
01:24:20
place yeah no it was it there were
01:24:22
reports of them having test tracks and
01:24:24
everything it was pretty well
01:24:25
established and you could see that
01:24:26
people from Tesla and other places had
01:24:29
does it say why they actually killed it
01:24:32
no this is just a
01:24:33
report and I the speculation is that
01:24:36
they're going all in on AI they just see
01:24:39
that as a much better future well I
01:24:41
think it's more core to what they do I
01:24:43
mean the car never seemed that
01:24:45
core no and uh there were reports and
01:24:49
elon's talked about it publicly so we're
01:24:50
not speaking out of school here or
01:24:51
anything but
01:24:54
that Elon and apple had and Tim Cook
01:24:57
reportedly had talked or there have been
01:24:59
overtures that maybe Tesla was going to
01:25:01
get bought and Elon it's been pretty
01:25:02
clear that during the model 3 roll out
01:25:05
he was considering selling it to Apple
01:25:07
that would have been a smart acquisition
01:25:08
if they had done that can you imagine
01:25:10
all the Apple showrooms having a Model S
01:25:13
in it or something or a model y I mean
01:25:14
boom they would have just sold so many
01:25:16
of them and having the Apple operating
01:25:18
system on that display well wait was it
01:25:20
before the Model S came out or the model
01:25:23
3 it was model 3 was the time I think
01:25:24
they were talking seriously but remember
01:25:27
like no one really believed in the model
01:25:28
3 till it came out and started selling
01:25:30
like hot cakes the opposite they thought
01:25:32
it was going to kill the company right
01:25:35
all right everybody for the Rainman
01:25:38
David Sacks chairman dictator Chamath
01:25:40
Palihapitiya and the Sultan of Science David
01:25:44
freeberg I am the world's greatest
01:25:46
moderator and we will see you next time
01:25:48
bye-bye
01:25:53
we'll let your winners
01:25:55
ride Rainman
01:25:59
David and instead we open sourced it to
01:26:01
the fans and they've just gone crazy
01:26:03
with it love queen
01:26:06
[Music]
01:26:12
of
01:26:15
Besties my dog taking
01:26:18
driveways man oh man
01:26:22
meet me we should all just get a room
01:26:24
and just have one big huge orgy cuz
01:26:26
they're all this useless it's like this
01:26:28
like sexual tension that they just need
01:26:29
to
01:26:30
[Music]
01:26:36
release we need to get
01:26:41
[Music]
01:26:50
merch

Badges

This episode stands out for the following:

  • Most shocking: 60
  • Best concept / idea: 60

Episode Highlights

  • Google's AI Controversy
    Google's Gemini AI faces backlash for biased responses, leading to a significant stock drop.
    “It's a racist AI!”
    @ 06m 15s
    March 01, 2024
  • Investors Frustrated with Google
    Investors express deep frustration over Google's inability to compete effectively in AI.
    “Investors are banging the table!”
    @ 09m 46s
    March 01, 2024
  • The Fear of Speaking Up
    There's a pervasive fear in organizations that stops people from calling out issues.
    “The woke Emperor knows he wears no clothes.”
    @ 19m 38s
    March 01, 2024
  • A Deeper Problem Uncovered
    Recent memos reveal a significant issue within the organization, beyond just surface-level glitches.
    “It's a much deeper problem than just a glitch or a bug.”
    @ 27m 40s
    March 01, 2024
  • Questioning Bureaucracy in DEI
    The need for a DEI organization is being questioned as it may hinder meritocracy.
    “Why would you create this large bureaucracy that undercuts the meritocracy?”
    @ 37m 15s
    March 01, 2024
  • The Future of Content Licensing
    Google's new model for acquiring content could reshape how small websites monetize their data.
    “TAC 2.0 is going to be incredible for the entire content community.”
    @ 44m 19s
    March 01, 2024
  • AI Revolutionizes Customer Support
    Klarna claims its AI now resolves issues faster than human agents, boosting profits by $40 million.
    “AI assistants are now doing the work of 700 full-time agents.”
    @ 54m 07s
    March 01, 2024
  • The Shift to Creative Work
    Humans are moving from manual labor to knowledge work, and now to more creative roles.
    “It's about enabling the creation of entirely new classes of work.”
    @ 55m 59s
    March 01, 2024
  • AI in Customer Support
    AI is set to revolutionize customer support by automating lower-level tasks.
    “AI will do a really good job eliminating level one support.”
    @ 57m 49s
    March 01, 2024
  • The Future of One-Person Companies
    The future may see billion-dollar companies run by just one person.
    “You could build a unicorn company with one employee.”
    @ 01h 05m 46s
    March 01, 2024
  • Reddit's Growth Surge
    Reddit's usage has surged significantly, growing 27% year-over-year in the latest quarter.
    “The usage started to climb pretty significantly.”
    @ 01h 15m 48s
    March 01, 2024
  • Apple's Car Project Canceled
    Apple has reportedly canceled its electric vehicle project, Titan, shifting focus to AI.
    “I was always skeptical that Apple was even working on a car.”
    @ 01h 24m 02s
    March 01, 2024

Key Moments

  • AI Controversy @ 06:15
  • Fear of Speaking Up @ 19:31
  • HR Culture Issues @ 38:04
  • Diversity Discussion @ 39:28
  • Content Licensing Evolution @ 44:14
  • AI in Customer Service @ 54:03
  • AI Revolution @ 57:49
  • Spin-off Success @ 1:22:41


Related Episodes

E126: Big Tech blow-out, Powell’s recession warning, lab-grown meat, RFK Jr shakes up race & more
E101: Ye acquires Parler, Snap drops 30%, macro outlook, VC metrics, valuing stocks & more
Meta's scorched earth approach to AI, Tesla's future, TikTok bill, FTC bans noncompetes, wealth tax
AI Bubble Pops, Zuck Freezes Hiring, Newsom’s 2028 Surge, Russia/Ukraine Endgame
E167: Google's Woke AI disaster, Nvidia smashes earnings (again), Groq's LPU breakthrough & more
Hurricane fallout, AlphaFold, Google breakup, Trump surge, VC giveback, TikTok survey
E122: Is AI the next great computing platform? ChatGPT vs. Google, containing AGI & RESTRICT Act
E102: Elon closes Twitter deal, $META uncertainty, Zuck's historic bet, big tech decline & more
E118: AI FOMO frenzy, macro update, Fox vs Dominion, US vs China & more with Brad Gerstner
E165: Vision Pro: use or lose? Meta vs Snap, SaaS recovery, AI investing, rolling real estate crisis