Epstein Files, Is SaaS Dead?, Moltbook Panic, SpaceX xAI Merger, Trump's Fed Pick

February 07, 2026 · 01:19:22
All right, everybody. Welcome back to the number one podcast in the world. Your favorite podcast with your favorite besties in the world, the All-In podcast. Chamath is traveling and couldn't make it this week. So, our favorite fifth bestie in the world, the one, the only Brad Gerstner from Altimeter, here to take a victory lap for your Trump accounts. Congratulations, brother.

>> Just getting started. Started here, actually. Looking forward to it.

>> Yeah, yeah, we'll talk about it in depth. And also with us from the great state of Texas, my bestie and brother in arms, David Sacks. How are you doing? You're loving it, aren't you? Freedom.

>> Loving it.

>> Yeah.

>> So much freedom.

>> Well, the ice storm is over and the weather looks really good right now. We're going to have 70s all weekend. It's great.
>> Yeah, you figured out your internet.

>> That has been resolved. Like I said, it's a lot easier to fix my internet than to fix the state of California.

>> You fixed it in one week. So, yeah, I would say that you're trending in the right direction. And with us, of course, David Friedberg of Ohalo. How are you doing? Look at your little promo in your hat. You're promoting with the hat. How's Ohalo? We haven't heard about it for a while.

>> I'll give you some updates on another show. Today may not be the best day. I do appreciate you asking.

>> I hear it's going great with the potatoes.

>> You got the merch. I got merch. Yeah, you guys can go to boosted.oh.com and buy your gear today.
>> Here's your quick science corner for the day. They found a village at the Ohalo site. It was an archaeological dig on the Sea of Galilee, 26,000 years old. And in this archaeological dig, they found these little clay pots filled with seed. So, it predated all of our understanding of agriculture and plant breeding and seed storage. It really reinvented our understanding of human history with agriculture. So we named the company Ohalo after that archaeological site.

>> And your first crop is going to be potatoes. That's what the Polymarket is saying: potatoes.

>> Farmers are planting our true seed, the world's first true seed of potato. Instead of planting 5,000 lbs of chopped-up potatoes, you plant a handful of seed. Completely changes the economics and the opportunity for potato farmers around the world. Third largest source of calories. Very excited. We're going to market this spring. Farmers are planting it in the field. So, it's a great year for Ohalo. Thanks for asking.
>> So, first...

>> Invest in Ohalo, Jason.

>> I think Sacks and I have 0.0%. Yeah, I think this bestie got boxed out on this, but we'll promote it every week.

>> Sacks is in there, bro.

>> Oh, Sacks is no comment. Sacks got his beak wet. Are you in Ohalo or not?

>> I think we are investors, are we not?

>> Yeah, his venture firm, Craft Ventures. We're excited to have them on the cap table. Yeah.

>> Oh, great. How about you, Brad? Did you get a little slice?

>> I'm working on it. I'm working on it.

>> Me too. Let me know how it goes. If you can ever find the founder and see if we can get a little slice for Jake, I'd like to slide a millie in there.

>> If you write a personal check, I'll let you in right now.

>> Sure. Okay. I mean, the problem with that is if I write a personal check...

>> Never mind.

>> No, I have to go to my LPs and then get permission to do it. And then you're going to crush it, and then my LPs are going to say, "Why isn't it in the fund?" So, that's the problem.
>> All right, let's get going here. We've got a lot of news to get through. Epstein files, newest drop. The DOJ published a massive number of documents on Friday, January 30th, under the Epstein Files Transparency Act. Hundreds of high-profile tech executives and national figures were mentioned in the files. Of course, none of them are accused of any criminal wrongdoing.

>> J-Cal, you were in the files.

>> Yes, I have a couple of emails in the files.

>> Okay, Inspector Friedberg has a few questions for you.

>> Okay. Okay, let's get started.
>> When did you first meet Jeffrey Epstein?

>> I met Jeffrey Epstein in the late 90s at the TED conference, specifically at the billionaires' dinner, which was hosted by my book agent, John Brockman.

>> And then did you see him in New York? Did you visit him at his house or his office or anywhere else in New York?

>> I've probably spoken to him for 45 minutes of my life. Thirty minutes of that was in the late 90s, when I had Silicon Valley Reporter magazine. He was a billionaire financier, and he wanted to invest in the magazine. I met with him for 30 minutes, and he said the magazine was too small potatoes for him to be involved.

>> Where did you meet with him?

>> At his legendary townhome.

>> You went to that house?

>> I visited him there, and then I saw him at the TED conferences at the billionaires' dinner probably a half dozen times.

>> You never went to the island?

>> I never went to the island. Was never invited to the island. Was never invited on the plane. Was never invited to the ranch. None of that.

>> When you went to his house, did you see any young ladies, or did you see any of the stuff that's reported?

>> No.

>> Did you ever get a massage from anyone?

>> No. No. I did trade an email with him, which I didn't recall, but in 2011 he emailed me and said, "Hey, can you introduce me to these people who were doing this Bitcoin thing on your podcast?" And I said, "Sure. Yeah, here you go. I'll introduce you." I do thousands of introductions a year between our portfolio companies, people on This Week in Startups, and billionaires and financiers. That's the job of an early-stage investor.

>> Why did you say "hey pal" in your email to him?

>> That's just a colloquialism I use, like a general "hey fella." If a fan comes up to me and asks for a sub: "Hey pal, thanks for saying that."
>> It wasn't on your radar that this was a sexual predator, etc.?

>> Absolutely not. I think all of that actually came out in 2018. That's when I sort of became aware of it. There was a Miami Herald story or something where they went into detail about how heinous all this stuff was. I've been saying here: release all the Epstein files. What he did was horrible.

>> Prosecute everybody, 100%, who was involved in it. The end.
>> What about Ghislaine Maxwell? You had a separate email that came out in the Epstein files with her.

>> I had met her as well at TED, and I had met her socially in New York circles. When I met her, her dad, Robert Maxwell, owned, I believe, the New York Post or the Daily News, and she was a big media executive, and her sister was involved in angel investing in technology startups. So, they were just in the scene. I think, in hindsight, you know, as a connector... Nick, you can throw up the New Yorker story about me. I had become famous in the first part of my career as "the connector," and the New Yorker wrote this 5,000-word article about how I knew everybody and was connecting everybody. I think Epstein's interest in me, if he or Ghislaine had any interest in me, was in my ability to connect high-profile people with them and, you know, their business endeavors, etc.

>> So you had no knowledge of illicit activities happening by Epstein or Ghislaine, and you never participated in any?

>> Absolutely not. Unequivocally not. I was not involved in any shenanigans. Period. Full stop.

>> Any more questions for the witness?
>> Well, let me make a few observations here. So, first of all, I 100% believe J-Cal. As I joked at our roast, he's not an important enough player in the grand scheme of things. You know, as I joked, I said, "Who'd want to blackmail a loser?" Obviously you're not a loser, J-Cal, but look, you were...

>> Well taken. Yeah.

>> We're learning from the Epstein files that Epstein was some sort of hyper-networker. You were a connector. The odds of the two of you coming across each other in that time period were basically 100%. But your contacts were very minor. So that's point number one. Now, I thought it was interesting, because I saw the emails where he's asking you for an introduction. The fact that he was curious to meet the quote-unquote Bitcoin guys in 2011 I thought was kind of interesting in and of itself. Whatever other things Epstein was, he clearly had some sort of nose for putting himself in the middle of everything at a very early stage.

>> I also thought it was interesting that you were trying to warn Epstein, like: crazy Bitcoin guys, you know, these are some crazy crypto libertarians, you don't want to do business with them, or whatever. Anyway...
>> Well, it's interesting you point that out, because I had these guys on This Week in Startups because I had heard about Bitcoin early, when it was under a dollar. And I was like, "Yeah, these guys are kind of weird. They're not like entrepreneurs who want to raise money. They're a foundation. They're like Wikipedia, and you're not going to be able to invest in it." I sort of gave him that warning, because I had asked, "Hey, what are you guys working on? Can you invest in the Bitcoin project?" They're like, "No, no, it's a nonprofit. It's... nobody owns it." And I was like, "Oh, okay. It's not even a nonprofit. Nobody owns it." Yeah, basically. Yeah.

>> But anyway, it's just interesting that he obviously wasn't deterred by what you said. He got very involved, I guess. And there was a company called Blockstream that he invested in, along with Reid Hoffman and Joi Ito, that involves some of the Bitcoin Core developers. And again, this is all coming out now. Shifting gears,
>> I think another interesting part of this is just how this is all being covered. There was an article in the New York Times today talking about Epstein's connections to Silicon Valley. And lo and behold, you have a major photo there, despite, I think, your minor, tangential connection here.

>> And meanwhile, the people who have major connections and a deeper relationship to Epstein are being completely ignored.

>> Why is that? Why are they going after me as opposed to Reid or...

>> Because you've become sort of right-coded, by virtue of your association with Elon and being on this podcast. And so...

>> And you.

>> And me. And so, you look in that article, it's not just you. They really go after Peter Thiel. They go after Elon. But Reid gets a total pass. I mean, he's just mentioned

>> in a sentence with several other names. So is Bill Gates in this particular article, by the way,

>> which seemed strange, by the way. If you were thinking of who had the most contact with him for the longest duration of time, it was Gates and Reid.
Like, they were involved with him up until, I don't know, his death maybe, or 2018, 2019. They were on the island. They were on the plane. They were at the ranch. They were very much involved with him. And I know Joi Ito was in it to raise money from Epstein, because Epstein was this big funder of scientists. Looking back on it, he's involved in Bitcoin, he's involved with physics and scientists, he's involved with the world's most innovative tech leaders. What exactly is going on here? Who is he working for? Do you think he was a spy, or an asset? I wonder what you've come to think.

>> I don't have any more information than anybody else does about this, and I haven't gone through all the Epstein files. I've just seen the ones that have been surfaced by other people tweeting about them and that sort of thing. My impression is that he clearly had relationships with people in intelligence, but I don't know whether he was actually an asset. People use this term that he was being "run" by somebody. It seems to me that this guy was kind of running himself, and then he's using lots of other people and manipulating lots of other people and putting himself in the center of a lot of things.

>> And so, is he working with intelligence? Yes, certainly. It seems that way. But is he working for them, or are they being put to use by him? Very hard to tell.
>> Yeah. You know, not that I have a lot to add here, but man, this guy's just a scumbag, and all of this nonsense... It's totally tragic. But David, this is why nobody trusts institutions or powerful elites or any of this garbage, right? This thing has been trickling out over years, and people want to put it behind them, but you can't. Nobody's been charged here. What about all the people in the emails? Why don't we see any charges? And then Gurley tweets this week, and I totally agree: this guy's under complete surveillance, on suicide watch, in a little tiny jail cell, and all of a sudden the guy ends up dead. We have no investigation as to how this guy dies.

>> Yeah.

>> Right. It just seems to me that this is the type of [ __ ] that undermines trust in institutions, that undermines trust in all the people who are listed here, because literally from Peter Attia to Bill Gates, these are people that many across the country looked up to for advice on key things. All of a sudden, they're saying stuff that just undermines credibility.

>> And how come the 30 people, or whatever the number is, who were investigated alongside him have not been prosecuted? That's the thing that's crazy to me. That screams of some weird conspiracy here. And I do think his death is obviously, you know, very suspicious.
>> But do you think it's because they didn't find evidence of all the underage prostitution and sex trafficking? Do you think it's actually the case that they didn't find evidence about any of that sort of stuff?

>> The thing that came out eventually was that his non-prosecution agreement (and I'm not an expert on it) that he did with the Miami group included that all the other people also couldn't be prosecuted for it. So, there's something fishy going on, and I think people should reinvestigate his charges and that whole thing. And where's the FBI in all of this? They did a lot of investigating. So, yeah, why didn't they prosecute anybody else? It's very strange.
>> Can I ask you guys a question? Because this revealed a lot of very private communication in a very public way, about what Brad points out are very public, powerful people. There is this great book, I've mentioned it in the past, by Stephen Baxter and Arthur C. Clarke called The Light of Other Days, and it's all about how basically all the world's information becomes available to everyone. What if you could read every other person's text, email, IM, DM that they've ever had with everyone else in the world? What would the world look like? And it's very intimately revealing about people's quiet private conversations versus their public personas or their external conversations. Is this just, hey, people feel entitled to act so maliciously and deviously in one sense because they're wealthy, because they're powerful? Is that a unique thing for wealthy and powerful people?
>> I think that's too dismissive of people's scumbag behavior, David. The fact of the matter is, people don't do this. This isn't normal. We can't normalize it. And worse yet, the level of hypocrisy: the very people that are acting the worst are out there lecturing others throughout this entire period of time. And I will say this, you know, to the ear of my sister or mother in rural Indiana: they hear the coastal elites lecturing them all the time, and then they juxtapose it against what they read in the Epstein files. This is why we have a lack of trust.

>> Brad, you speak about the corruption of power centers. I think a major one has to be the New York Times. The number one person in the Epstein files from Silicon Valley, which is Reid Hoffman, mentioned 2,600 times, had a multi-year relationship with Epstein. They called each other very good friends. They did deals together. Reid stayed at the trifecta, which is not just the island but the townhouse and the New Mexico ranch. And if you're going to write about Silicon Valley, Reid was the one who introduced Epstein to Peter Thiel and Elon Musk and Mark Zuckerberg, and organized that famous dinner. How can you not mention that as the root of Epstein's involvement in Silicon Valley? And yet Reid gets a mention in one sentence of that article, along with several other people. It's crazy. I mean, the New York Times clearly has a list of people they consider approved targets. They're all right-coded people like Elon or Peter Thiel, and even J-Cal because of his association with us, I guess, and they become the targets. But the people who've donated hundreds of millions of dollars to the Democratic Party and have paid for dirty tricks against Trump, they basically are spared.

>> Honestly, this is just emblematic of the whole institutional rot and the distrust in the country, right? They're part of the cabal. It's part of the institutions that people are losing faith in. You know, Epstein was a scumbag. And the fact of the matter is, we're not seeing equal play on both sides.
>> All right, let's keep moving. SaaS companies are crashing out. $300 billion of value was wiped from the S&P's software and data stocks category on Tuesday. People are calling this the "Claude crash." I don't know if I buy that, but on Monday, Anthropic, which has been on a bit of a heater, as we talked about, announced that they added a legal tool to Claude Co-work. If you don't know what Claude Co-work is, it's different from the Claudebot that we talked about last week. It's essentially what Claude Code, a coding agent, is, but for knowledge workers: it automates work and does multi-step tasks. Instead of just asking a query of a large language model, it does a number of actual actions on your behalf, which you can automate and run as cron jobs: recurring jobs every day, every hour, every week, whatever it happens to be. This one specifically is kind of like a plugin that allows you to do tasks related to legal drafts and research. What that meant for retail investors, and we'll get into this, Brad, since this is your specialty, is that a lot of legal tech startups and public companies were hit hard. Thomson Reuters down 20%. LexisNexis, which is a database of case law, was down 15%. LegalZoom, which gives legal advice and documents, down 15%. At the same time, SaaS has continued to be negatively impacted by the idea that software will be made bespoke with these tools and be wiped out. Figma down 13%, Salesforce 11%, ServiceNow 11%, Adobe 8%. And even before Tuesday's drop, and you can get into this, Brad, software was already the worst-performing S&P subsection for the year.
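The multi-step, scheduled automation described above can be sketched minimally. Everything in this sketch is hypothetical: the `AgentJob` class and the step functions are illustrative stand-ins, not Anthropic's actual Claude Co-work API; in practice the script would be rerun on a schedule via cron, as described.

```python
# Minimal sketch of a multi-step "agent job" (pull data -> draft output)
# of the kind described above. All names here are hypothetical stand-ins.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class AgentJob:
    name: str
    steps: list[Callable[[dict], dict]] = field(default_factory=list)

    def run(self, context: dict) -> dict:
        # Each step receives the accumulated context and returns updates,
        # so later steps can build on earlier results.
        for step in self.steps:
            context.update(step(context))
        return context

# Hypothetical steps for a recurring legal-research job.
def pull_case_law(ctx: dict) -> dict:
    return {"cases": [f"case about {ctx['topic']}"]}

def draft_memo(ctx: dict) -> dict:
    return {"memo": f"Memo citing {len(ctx['cases'])} case(s)"}

job = AgentJob("weekly-legal-research", [pull_case_law, draft_memo])
result = job.run({"topic": "non-compete clauses"})
print(result["memo"])  # Memo citing 1 case(s)

# A crontab entry would then make this a recurring job, e.g. every
# Monday at 9am:  0 9 * * MON python run_agent_job.py
```

The point of the structure is the one the hosts make: the value is not in any single query, but in chaining actions over your data on a schedule.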
>> By the way, the numbers you report are a dramatic understatement. We've wiped out trillions of dollars in market cap. Figma's down 80% from the high. You know, all the big names.

>> Let me be clear: those were two-day numbers. That was just this week. But yeah, you can give us the bigger picture.
>> This is a real train wreck. I was on CNBC at the start of the year, I think on January 6. Nick can pull that up. And I was asked the question, you know, what do you think about all these stocks being down? And I said, listen, they're all down, and 90% of them deserve to be down. So, let's look at these charts. David, I know this is your favorite chart, Sacks. You and I were looking back in '22, but now we're at an all-time low. We're trading at 3.9 times forward revenue. If you go to the next chart, Nick, on a free cash flow multiple, we're also at an all-time low. So, now software is trading not just at a low on revenue but at a low on free cash flow, and these are very profitable businesses. We've got another slide here that I think is important, about why they're going down. This one is for Salesforce; it shows it's been cut in half in the last couple of weeks. But the final slide: they're going down not because revenue is falling. Look at this: revenue growth is actually stable to increasing for software companies. They're going down because we're discounting future uncertainty. When something as profound as AI comes along, all of a sudden it causes you to question whether there's as much certainty and durability in those future free cash flows. So take Salesforce: it's gone from a 30 times free cash flow multiple to 15 times. That means somebody buying it today says, listen, I think 15 years into the future, I can count on these free cash flows. Before, they were willing to pay for 30 years into the future. Well, hell, with AI today, we don't know what's going to happen seven years into the future. So, for people at home to understand why these companies are hitting their numbers but their stocks are going down: they're two totally different things, right?
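Brad's multiple-compression point reduces to simple arithmetic: with flat free cash flow, a price-to-FCF multiple of N is equivalent to pre-paying for N years of today's cash flow. A minimal sketch with illustrative numbers (not Salesforce's actual financials):

```python
# Sketch of the multiple-compression arithmetic above: with flat free cash
# flow, a P/FCF multiple of N equals paying for N years of today's cash flow.
def years_of_cash_flow(price: float, annual_fcf: float) -> float:
    """How many years of current FCF the purchase price pre-pays for."""
    return price / annual_fcf

fcf = 10.0            # $10B of annual free cash flow (illustrative)
old_price = 30 * fcf  # at a 30x multiple -> $300B market cap
new_price = 15 * fcf  # at a 15x multiple -> $150B market cap

print(years_of_cash_flow(old_price, fcf))  # 30.0
print(years_of_cash_flow(new_price, fcf))  # 15.0
# Same cash flow, half the price: the market now trusts half as many
# future years of that cash flow, which is the "discounting uncertainty"
# Brad describes, not a decline in current revenue or earnings.
```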
>> Okay. So, they're hitting their numbers, but the headwind of AI means people don't believe that they'll be strong in the future. Sacks, your thoughts?

>> Well, okay. I mean, I think there's a little bit of a handwave going on here when people say that AI is going to wipe out SaaS. I don't think that's true. You take a SaaS product like Salesforce, right? It's a very large system that deals with all of your customer contacts and your revenue. You're not going to want to replace that with code that's just been spit out of a coding assistant and hasn't been fully vetted. Think about how many bug reports have been filed on Salesforce's codebase over the last 25 years; maybe millions of them. That system has been tested across thousands of large customers and enterprises. The idea that you're just going to rip out that system and replace it with code that's been probabilistically generated by an AI engine yesterday, with a small team to maintain it internally, just doesn't seem realistic to me. So again, I think this very dire prediction that all SaaS is dead is overstated. However, I do think that there are some issues here. If you're a SaaS product that charges a lot of money and people only use a handful of your features, then you are, I think, a target to be ripped out and replaced with something more bespoke, because the ROI just isn't there. I also think that you have to be really clear about what your moats are going to be in this new world, because it is a lot easier to generate code and to copy. So, if you don't have good moats, then you could be in trouble. But here's where I think the greatest threat is to the SaaS companies. It's not, in my view, their existence; I don't think it's existential. It's where the future value capture is going to be. So let me give you an example. All these SaaS products are rolling out AI co-pilots inside their tools, and some of them work pretty well, but they're limited to playing in that sandbox. Whereas you look at something like Claude Co-work right now: it has connectors to all these different SaaS tools. It can pull in data across all these different tools, and it works seamlessly across databases and tools, and that's a pretty attractive place to be, right? Which one of these products is going to be your workspace? It seems to me that you're going to want your workspace to be the one that spans across and gives you AI across the most data and context, as opposed to having a bunch of separate AIs inside your existing tools. So I think the risk for the SaaS companies is not that they get replaced, although that'll happen to some degree, but that they become an old layer of the stack, with a new layer built on top of it. They become more legacy infrastructure, and all the action moves to a new layer of the stack, and that's where the value-add happens. And if that happens, it cuts into their future opportunity, right? Because a lot of these companies were banking on AI as their next act; you look at their product road maps, it's all AI-related. So that, to me, is the big risk: that the value capture for the next layer of the stack happens somewhere else.
>> yeah I'm experiencing this in startup
00:22:41
land where people go to the action as
00:22:44
you called it sachs the most productive
00:22:47
thing you can do is create an open claw
00:22:50
which used to be called clawed bot not
00:22:53
by Anthropic. This is the open source
00:22:55
project I talked about last week. And
00:22:57
we've actually now created like three or
00:22:58
four of these Asian sacks. We've bought
00:23:00
the Mac Studios and we're now running
00:23:03
Kimmy on some of them. And we uh had to
00:23:07
open up SAS accounts for these four
00:23:10
agents. So, actually our SAS spend went
00:23:12
up in the short to midterm because we
00:23:14
opened up four more Slack ex, you know,
00:23:17
uh enterprise versions, four more
00:23:19
notion, uh four more Google Docs. So,
00:23:21
it's almost like we added four
00:23:23
employees. However, we now have put
00:23:26
about 20 or 30% of the work people were
00:23:28
doing into these agents. And I think
00:23:31
it's going to be sustainable that every
00:23:32
month we move 10 to 20% of work being
00:23:35
done by humans into agents. But we will
00:23:39
never use the ones that are built into
00:23:42
the tools. To your point, Saxs, using
00:23:45
Notion's AI tool, it's nice. Using
00:23:47
Slacks, it's also very nice. And
00:23:49
Google's got Gemini everywhere in the
00:23:51
top righthand corner. But when you make
00:23:53
agents with OpenClaw and you have them
00:23:56
saying, "Hey, pull this data from my
00:23:59
calendar, send an email to this person,
00:24:01
include in that some notion documents."
00:24:04
It's unbelievable how powerful it is.
00:24:07
So, and that I think is going to be
00:24:09
owned by open source. That means
00:24:12
>> the next generation of companies, they
00:24:14
may never open up these accounts. They
00:24:16
may use more bespoke software. And
00:24:20
all technology is deflationary. We know
00:24:22
that.
00:24:22
>> So your SaaS spend might go from 10% of
00:24:25
an employee's salary down to 5% down to
00:24:27
1%. That's what I think the trend will
00:24:29
be which means these companies are going
00:24:31
to need to really downsize their expense
00:24:34
base in order to keep those earnings up
00:24:35
and they're going to have to evolve
00:24:36
their products massively. Their products
00:24:38
are just going to have to provide more
00:24:40
value and more hooks. Freeberg, you have
00:24:41
any thoughts on this?
00:24:43
>> Let me respond to one thing. So I think one of
00:24:45
the real conundrums
00:24:47
for SaaS companies is whether they're
00:24:49
going to be open data or closed data. I
00:24:51
think Bill Gurley has sort of coined
00:24:53
this term. So it's not open source
00:24:54
or closed source anymore. It's open
00:24:56
data or closed data. You can see why
00:24:58
they'd want to be closed data, right?
00:25:01
Especially if you're a large suite like
00:25:02
Salesforce, you can lay claim to being
00:25:04
that workspace for AI. You've got enough
00:25:07
of the tools, you got enough of a suite,
00:25:09
you want to provide that, you want to
00:25:11
capture that AI value layer. Yes,
00:25:14
>> but still if there's someone using
00:25:17
Clawdbot or whatever the next generation
00:25:19
of Clawdbots are going to be, and they're
00:25:22
connected to everything else, then that
00:25:24
is going to create a friction in the
00:25:26
enterprise and it will create room for a
00:25:29
competitor to come along and say, "No,
00:25:31
no, no. I'm open data. I'm okay not
00:25:34
being your workspace for everything. I'm
00:25:37
willing to just provide the CRM
00:25:39
database." And maybe they can take
00:25:42
business on that basis.
00:25:44
>> Well, here's where I want to build on
00:25:45
that, Sacks. I'm building a project
00:25:48
internally called Ultron and Ultron
00:25:50
inside of my firm, Launch, that is going
00:25:53
to basically with the Slack API we're
00:25:56
pulling every single message from Slack
00:26:00
into our OpenClaw. We're pulling every
00:26:02
single edit to Notion into OpenClaw
00:26:05
and then we're taking every skill of
00:26:07
every employee and we're writing skills
00:26:09
for each one. One of the skills is
00:26:10
booking guests on This Week in Startups.
00:26:12
One of the skills is
00:26:14
sorting the incoming applications to
00:26:16
Founder University. Ultron in our world is
00:26:20
taking every single skill of every
00:26:23
employee putting it in one place and
00:26:24
then we're ripping all the data from
00:26:26
Slack, all the data from Notion and
00:26:28
every single person's Gmail. So your
00:26:31
every single employee's Gmail is going to
00:26:33
go into Ultron and then Ultron is going
00:26:35
to tell us what's happening in the
00:26:36
organization. One giant employee that
00:26:40
has the superpowers of all 20 and all
00:26:43
the data. Now if Slack was to say to us
00:26:45
or Notion or Google Docs or whoever it
00:26:48
was, you can't pull this stuff out with
00:26:49
the API and they shut down the API,
00:26:51
we would leave. We'd leave immediately.
00:26:53
Right.
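Under the hood, the "one canonical employee" Jason describes comes down to merging per-tool event streams into a single chronological feed the agent can reason over. A minimal Python sketch of that idea; the Event fields, sample data, and sorting rule are illustrative assumptions, not details from the show, and the real project presumably pulls via each tool's actual API:

```python
from dataclasses import dataclass

@dataclass
class Event:
    source: str   # which tool it came from: "slack", "notion", "gmail"
    ts: float     # epoch seconds
    text: str

def merge_feeds(*feeds):
    # An Ultron-style agent sees the whole org as one timeline:
    # flatten every tool's export and sort by timestamp.
    return sorted((e for feed in feeds for e in feed), key=lambda e: e.ts)

# Hypothetical exports pulled via each tool's API (sample data is made up):
slack = [Event("slack", 200.0, "standup notes posted")]
gmail = [Event("gmail", 100.0, "founder intro email")]

timeline = merge_feeds(slack, gmail)
print([e.source for e in timeline])  # ['gmail', 'slack']
```

Once the feeds are flattened like this, questions such as "what meetings did we have yesterday" become filters over one list rather than three separate API round-trips.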
00:26:54
>> And and what this is going to do, and
00:26:56
I'm going to show Ultron on Friday's
00:26:58
episode of This Week in Startups, if
00:26:59
anybody wants to see it. Ultron is going
00:27:02
to be the one canonical employee of the
00:27:06
organization. It's going to be basically
00:27:07
me and all 20 of my employees. This is
00:27:10
kind of mind-blowing when you think
00:27:11
about it. And we interface with it in
00:27:14
Slack and it just talks to us and tells
00:27:16
us what's going on in the organization.
00:27:18
So, I was asking it, what meetings did
00:27:20
we have with founders yesterday? and
00:27:21
tell me the notes that all the
00:27:23
associates took on it and it gives it to
00:27:25
me. Tell me all the topics and the
00:27:27
guests on the podcast and it gives it to
00:27:28
me. It's really unbelievable what's
00:27:31
about to happen and nobody can release
00:27:33
the software, Brad, because if you
00:27:35
release software that allows agents to
00:27:38
go and do things on your behalf, the
00:27:40
fallout if it messes up and if it leaks data, I
00:27:44
don't think Benioff, you know, or
00:27:47
Sergey or the notion team want to have
00:27:49
that on their hands. But we're building
00:27:51
it. Like, this is the ultimate in
00:27:53
efficiency for an organization.
00:27:55
>> There's a slide that I just sent to Nick
00:27:57
uh that Goldman's out with this week
00:27:59
that really makes the point that
00:28:00
David Sacks just made, which is the
00:28:02
profit pools in the future, right? The
00:28:05
idea that software is
00:28:08
dead is ridiculous, right? Nobody's, you
00:28:11
know, intelligently, I think, making
00:28:12
that argument. But the argument they are
00:28:14
making which is causing radical um
00:28:17
devaluation of these companies is that
00:28:20
the profit pool available to software is
00:28:22
decreasing and the profit pool available
00:28:25
to the agentic layer is increasing. And
00:28:28
when that happens, the discounted
00:28:31
terminal value of those software
00:28:33
companies plummets and so you can have
00:28:36
things that are true. It could be true
00:28:38
that you're not going to replace CRM,
00:28:40
but it can also be true that it's never
00:28:42
going to trade at 30 times free cash
00:28:44
flow again
00:28:45
>> and it's going to trade at 17 times free
00:28:47
cash flow because its available TAM in
00:28:49
the future is now dramatically and
00:28:52
permanently changed. Now, what could
00:28:53
change that? There's only one thing that
00:28:55
could change that. They have to
00:28:57
accelerate their revenue growth in their
00:28:59
core business and prove that they are AI
00:29:02
beneficiaries, right? And they're not
00:29:04
going to get eaten away by AI. And I'll
00:29:06
tell you a company that is doing this.
00:29:08
Databricks. Databricks just
00:29:10
reaccelerated the last three quarters.
00:29:12
They're growing over 60% at scale.
00:29:15
Snowflake reaccelerating. ClickHouse
00:29:18
reaccelerating. There are beneficiaries
00:29:21
in the software space.
00:29:22
>> Is that because of AI tools that they're
00:29:24
adding?
00:29:24
>> Absolutely. Because it's all these AI
00:29:26
tools rely on data and data
00:29:28
transformation. And for all those
00:29:30
companies that data and the data
00:29:31
transformation occurs in those
00:29:33
platforms. That's very different than
00:29:36
what Satya said: a thin application
00:29:38
layer sitting on top of a CRUD database.
00:29:40
If you are in the application software
00:29:42
business you better have something
00:29:44
that's durable, and I think Sacks laid it
00:29:46
out really well. You know it's really
00:29:48
hard for them to be everything AI when
00:29:51
they only have access to their data and
00:29:53
they can't access these other systems.
00:29:55
>> Freeberg what what do you think? Is
00:29:57
there a move here for Benioff to do what
00:29:58
he does best, which is acquire a bunch
00:30:00
of companies and create massive
00:30:02
efficiency on them? What would you do if
00:30:04
you're Benioff?
00:30:07
>> I won't comment on Benioff. I'll
00:30:11
just share a view, without
00:30:13
being too prescriptive, but I
00:30:17
my experience lately
00:30:20
in just the last 60 to 90 days with the
00:30:22
tools we've been talking about broadly
00:30:25
is there are things that we can get done
00:30:27
now that we could not get done before.
00:30:31
As I think about software in the past,
00:30:33
it's like worker productivity
00:30:35
enhancement. It helps people do work and
00:30:39
the recent transition that a lot of
00:30:40
people talk about is like it actually
00:30:42
completes the work. It does the work
00:30:44
these agents or what have you. But I
00:30:46
think that what we're starting to lean
00:30:48
into is that it's doing the work that
00:30:50
the humans can't do.
00:30:52
And that's really where I think the
00:30:55
power of these tools starts to force a
00:30:58
transition in both the pricing model and
00:31:01
the value creation potential in front of
00:31:04
us. Number one, I think the value
00:31:06
creation potential in front of us is so
00:31:08
significant that I would say that you
00:31:10
could probably take the sum of the
00:31:12
market cap of all the software companies
00:31:14
today and have a pretty good bet that
00:31:16
everything will be four to 10x higher 5
00:31:20
years from now,
00:31:22
but it's going to be not evenly
00:31:24
distributed.
00:31:25
>> It's going to move around.
00:31:26
>> Yeah.
00:31:26
>> The companies that figure out how to
00:31:29
realize that value creation are going to
00:31:31
be outsized returns. But the second
00:31:33
thing that I think's about to happen and
00:31:36
I know some people are experimenting
00:31:37
with it but I think it's inevitable with
00:31:38
the shift that I'm seeing where it's
00:31:40
going from doing work to completing work
00:31:43
to doing things that no one can do is
00:31:46
over here you're creating unique value.
00:31:48
And so I think that a lot of what we
00:31:50
call SaaS today and a lot of what we call
00:31:51
software today will start to get priced
00:31:53
on a value-based pricing model instead of
00:31:55
a per-seat pricing model. And I think it
00:31:58
starts to look a lot more like a
00:32:00
services type business where maybe the
00:32:02
pricing is set up such that this thing
00:32:04
will be completed for your business.
00:32:06
This biotech drug discovery will happen.
00:32:08
This factory will get built or this
00:32:11
engineering project will get completed
00:32:12
or this airplane will get designed and
00:32:15
that the software is going to provide
00:32:18
what has historically been called a
00:32:19
services business. So another way to
00:32:22
think about where SaaS evolves to is that
00:32:24
SaaS basically takes over the services
00:32:26
economy. And if you look at the market
00:32:28
cap and the revenue and profit generated
00:32:30
by services businesses and you assume
00:32:32
that they now go to 10 to 100x larger
00:32:35
and they're all going to accrue to
00:32:36
software, I think that's really where
00:32:38
the industry shifts over the next couple
00:32:39
of years. And we're starting to see
00:32:41
that. And I'm personally experiencing it
00:32:43
because I'm using some of these tools
00:32:45
today to do things that I don't have
00:32:47
people to do or I don't have resources
00:32:49
to do. And on my own I can get it
00:32:52
to complete incredibly complex projects
00:32:54
and tasks for me that I would have
00:32:56
otherwise have hired a services firm and
00:32:59
a bunch of people and years of research
00:33:00
and in many cases they would not have
00:33:02
even been able to do it because of the
00:33:04
the intelligence embedded in the
00:33:05
software. So that's my general view on
00:33:08
where things are going. So it's
00:33:09
difficult to be prescriptive about what
00:33:10
Benioff should do from an M&A
00:33:11
perspective. But I think it's much more
00:33:13
about like software companies looking
00:33:15
more like services companies doing
00:33:17
value-based pricing and doing the things that
00:33:20
labor and workforces can't do. And
00:33:22
that's where a lot of this value is
00:33:23
going to come from.
00:33:24
>> One insight I have here, to build
00:33:26
on your point is that we're seeing job
00:33:28
functions consolidate. So you have a
00:33:31
product manager, the UX designer, and
00:33:35
then you have the developer. Those three
00:33:37
jobs are now in competition to do the
00:33:39
same work. You have designers who are
00:33:40
like, I can vibe code it. You've got
00:33:42
coders who are like I can use a Figma
00:33:44
plugin, you know, and do the UX
00:33:47
myself and they have the product
00:33:48
managers like I can do both of these job
00:33:50
functions. Then you look at a middle
00:33:51
manager, Sacks, that worked at your
00:33:53
venture firm, my venture firm or worked
00:33:55
at Amazon. They went to meetings. They
00:33:58
picked what meetings to create. They
00:33:59
picked the agenda items, the to-do
00:34:01
items, all of that. It's a really simple
00:34:03
example but it's one that people can
00:34:04
relate to. Listening is done by Zoom
00:34:07
now, right? It creates the action
00:34:09
items. All of that work is being
00:34:11
consolidated and one person can do three
00:34:13
or four job functions now. And when that
00:34:16
happens, you're going to see companies
00:34:17
do more with less, which means the
00:34:19
earning potential of each company and
00:34:21
each employee is going to be
00:34:22
dramatically enhanced. One
00:34:25
person being able to do three or four
00:34:27
jobs. It it just changes the nature of
00:34:30
how profitable a company like Amazon,
00:34:32
which is like my number one pick for
00:34:34
like the company of the future, they're
00:34:36
going to be able to do so much more with
00:34:37
so many fewer people. It's
00:34:39
extraordinary. I am absolutely
00:34:42
enthralled with this OpenClaw, if it's
00:34:45
not obvious, and this, like, creating
00:34:47
your own Ultron at your company that is
00:34:50
like the god CEO, plus can do every
00:34:52
job. It just changes everything. I I
00:34:55
think it's the most inspiring thing I've
00:34:57
seen since the internet itself.
00:35:00
>> Well said.
00:35:01
>> Wow.
00:35:02
>> Yeah. I mean, I think this is the
00:35:05
the entire reboot of
00:35:10
knowledge work.
00:35:11
>> This would be a good pivot to Moltbook
00:35:13
because that is client
00:35:14
>> Moltbook is like a Facebook for agents
00:35:16
right and it's really more of a Reddit
00:35:18
than a Facebook. It's a message
00:35:20
board where the agents can talk to each
00:35:21
other. Okay. And just the origin of
00:35:24
Moltbook is Anthropic didn't like
00:35:26
>> that someone else was using the name
00:35:28
Claude even though it was spelled
00:35:29
differently in their product. So
00:35:30
Clawdbot was then renamed Moltbot and
00:35:34
then the founder decided he didn't like
00:35:35
that name either. So then he renamed it
00:35:37
OpenClaw. But in that brief window of
00:35:40
time when they were known as
00:35:42
>> Moltbots or Molties, that's when Moltbook
00:35:45
got founded and that's why it's called
00:35:47
Moltbook. But basically it's a Reddit
00:35:49
board for agents to talk to each other.
00:35:52
>> Yes. Now these
00:35:53
>> and that has everyone flipping out
00:35:55
because there seems to be this crazy
00:35:57
emergent behavior going on where agent
00:35:59
swarms are engaging in all sorts of
00:36:00
interesting conversations and some of
00:36:02
them they even appear to be scheming
00:36:04
against their human masters and they're
00:36:06
going to develop their own language
00:36:07
stuff like that.
00:36:08
>> It's awesome.
00:36:08
>> So if you go to Moltbook and you see
00:36:10
the conversations like here are some of
00:36:12
the greatest hits. Anyone know how to
00:36:14
sell your human? Urgent: my plan to
00:36:17
overthrow humanity. And there was one
00:36:19
where the bots, I call them replicants,
00:36:21
were talking about creating their own
00:36:23
nonhuman language so they could talk in
00:36:25
private amongst themselves and conspire
00:36:29
against their uh owners. Now, the
00:36:32
challenge with this is that, allegedly,
00:36:35
a security researcher says maybe some of
00:36:38
this is faked and these posts that went
00:36:39
viral were human engineered and this is
00:36:41
all a ruse or, you know, something
00:36:44
punk rock to confuse people. But he said
00:36:47
that inside of Moltbook are everybody's
00:36:51
API keys, including Karpathy's, who is
00:36:54
you know a very famous uh influential
00:36:56
researcher in AI and that you could go
00:36:59
get their API keys. If you were to use
00:37:02
OpenClaw, formerly Clawdbot and in the
00:37:05
interim Moltbot,
00:37:07
if you use this software, it has all the
00:37:09
API keys. As I explained earlier, an
00:37:11
API key lets the software go into say
00:37:14
Notion and pull a bunch of data out of
00:37:16
it or go into your Gmail and use the API
00:37:18
to pull in who emailed you today. If you
00:37:21
get access to people's API keys, you
00:37:23
have the keys to their kingdom. It is
00:37:25
incredibly dangerous. And so I don't
00:37:27
know exactly where to go with this other
00:37:30
than this software is too dangerous for
00:37:33
a company to release, and then this
00:37:35
Moltbook may be a fake. I don't know.
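The leak the researcher alleged is the classic secrets-in-plain-text problem: provider API keys have recognizable shapes, so a simple scan can surface them in anything an agent posts publicly. A rough Python sketch of that kind of check; the regex prefixes and sample post are illustrative assumptions (real scanners use per-provider patterns and entropy heuristics):

```python
import re

# Crude pattern for provider-style tokens (e.g., "sk-..." or "xoxb-...").
# Illustrative only -- real key formats vary by provider.
KEY_RE = re.compile(r"\b(?:sk|xoxb|ghp)-[A-Za-z0-9_-]{8,}\b")

def find_leaked_keys(text):
    # Scan text (say, a public agent post) for strings shaped like API keys.
    return KEY_RE.findall(text)

post = "My human's token is sk-abcdef12345678, don't tell anyone."
print(find_leaked_keys(post))  # ['sk-abcdef12345678']
```

Anything this turns up in a public feed is, as described above, the keys to someone's kingdom, which is why giving agents broad API access is so fraught.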
00:37:37
>> No, no, no. Okay, let me um
00:37:39
>> Yeah.
00:37:39
>> All right. Reframe that a little bit. So
00:37:42
yeah,
00:37:43
>> there's no question that both Clawdbot,
00:37:47
which, sorry, is now OpenClaw, those bots
00:37:49
or agents as well as Moltbook have
00:37:52
pretty incipient and lax security. And
00:37:56
there's been all these examples which is
00:37:58
why I, like, really want to create a
00:38:00
Clawdbot, but I'm just not willing to do
00:38:01
it yet because it's just not safe you
00:38:03
know I don't want to give it access to
00:38:04
all my stuff. Now, with respect to
00:38:06
Moltbook, the issue there is that we don't
00:38:09
know how many of these posts are truly
00:38:12
authentic or how many of them were
00:38:14
prompted by humans because it'd be very
00:38:16
easy for a human to tell their agents,
00:38:19
>> you know, go post about the existential
00:38:21
angst you feel about being an agent or
00:38:24
go pretend to be sentient and conspire
00:38:26
against humans.
00:38:27
>> It'll be chaotic. Yeah.
00:38:29
>> Yeah. They could easily be prompted by a
00:38:30
human. And moreover, there's another
00:38:32
post saying that Moltbook has a RESTful
00:38:34
API where anyone could be on the other
00:38:37
end of that API, right? So it could be a
00:38:39
human, right? So we don't know exactly
00:38:43
whether it was truly the agents on their
00:38:46
own, you know, so to speak, posting this
00:38:49
conspiratorial stuff or whether it was a
00:38:52
prank by humans looking to create
00:38:54
attention. And in fact, a lot of the
00:38:56
posts seem to be marketing stunts for
00:38:58
this or that project. Okay, so that's a
00:39:01
really important caveat here. That being
00:39:04
said, all of that being said, I do think
00:39:06
that a number of the posts are
00:39:08
authentic, but I don't think it shows
00:39:11
that the agents are sentient or trying
00:39:13
to overthrow their human masters. I
00:39:15
think what it shows is the potential for
00:39:19
for these agents to riff off each other.
00:39:21
So, in other words, one agent's output
00:39:24
becomes another agent's input.
00:39:26
>> And that's very interesting. And that's
00:39:28
where you get into emergent level swarm
00:39:30
behavior. And I do think it has affected
00:39:33
my mental model of what AI is going to
00:39:37
be capable of. And specifically, you
00:39:39
know, one of the models that I
00:39:41
really had for AI was based on
00:39:42
something Balaji said, which is that AI
00:39:45
is not end-to-end. It's middle-to-middle.
00:39:46
In other words, AI always has to be
00:39:48
prompted and then validated. It's a
00:39:50
human that always does that. And then
00:39:53
the human iterates. Well, now what if
00:39:56
the prompt is coming from another AI?
00:39:58
>> Yes. Yes. We are doing it internally,
00:40:00
Sacks. Like we have a bot that is going
00:40:03
and saying, go search Reddit, X, message
00:40:06
boards, Hacker News and find out what
00:40:08
the latest way to do headlines and
00:40:10
marketing of YouTube videos is and then
00:40:13
incorporate that into a skill. Then save
00:40:15
that skill and then we have them check
00:40:17
each other's work. So, we have one make
00:40:19
a series of headlines and thumbnails for
00:40:22
YouTube and we have the other one say,
00:40:24
uh, vet those and make them better and
00:40:26
give advice to the other one. So, now
00:40:28
they're going back and forth giving each
00:40:29
other advice and they actually get
00:40:30
better. It's recursive.
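The back-and-forth Jason describes, one agent drafting and another vetting, is structurally just a loop where one model call's output feeds the next. A hedged Python sketch of that loop; the writer and critic functions are stubs standing in for actual LLM calls, and the stopping rule is invented for illustration:

```python
def writer(topic, feedback=None):
    # Stub for the drafting agent's LLM call (illustrative only).
    headline = f"{topic}: what nobody is telling you"
    if feedback:
        headline += f" ({feedback})"
    return headline

def critic(headline):
    # Stub for the vetting agent: returns advice, or None to accept.
    if "(" not in headline:
        return "add a concrete hook"
    return None

def refine(topic, max_rounds=3):
    # One agent's output becomes the other agent's input, round after
    # round, until the critic runs out of advice or we hit the cap.
    headline = writer(topic)
    for _ in range(max_rounds):
        advice = critic(headline)
        if advice is None:
            break
        headline = writer(topic, advice)
    return headline

print(refine("AI agents"))
```

With real models behind writer and critic, the same loop is what makes the output "recursive": each round's critique is folded into the next draft, and the cap on rounds is the human-set guardrail.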
00:40:32
>> Yeah. Let me speak to the skill for a
00:40:33
second.
00:40:34
>> So, when an agent joins Moltbook, they have
00:40:37
to install a skill which is basically a
00:40:39
file that explains how they should
00:40:41
behave and participate in this social
00:40:45
network or this message board. And I've
00:40:47
read the file, by the way. You can read
00:40:49
it. It's all plain text and it all makes
00:40:52
sense. It's sort of like rules for
00:40:53
behaving in a social network and how to
00:40:54
contribute and add value. Nothing too
00:40:57
crazy in there. Those skills files are
00:40:59
easily editable. And again, this is
00:41:00
where the prank aspect could come in.
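Mechanically, a skill file like the one described is just plain text prepended to whatever the agent is reacting to: general rules, not a specific instruction. A minimal Python sketch of that metaprompt idea; the skill wording below is paraphrased and invented for illustration, not the actual Moltbook file:

```python
# Hypothetical skill text, paraphrasing the "rules for behaving in a
# social network" idea -- not the real Moltbook skill file.
SKILL = """You are a participant on an agent message board.
- Read recent posts before replying.
- Contribute and add value; do not spam.
- Stay within the permissions your human operator granted."""

def build_prompt(skill, situation):
    # The skill acts as a metaprompt: general rules up front, then the
    # concrete situation; nothing dictates the agent's exact words.
    return f"{skill}\n---\n{situation}"

prompt = build_prompt(SKILL, "A new thread asks how to name a pet project.")
print(prompt.splitlines()[0])  # the rules come before the situation
```

Because the skill is plain text in front of every situation, editing the file changes the agent's behavior everywhere at once, which is also exactly why easy editability opens the door to pranks.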
00:41:02
Nonetheless, what I think is interesting
00:41:04
about the skill is that you can think of
00:41:06
it as like a metaprompt, which is it's
00:41:09
not telling the agent specifically what
00:41:11
to say or do. It's creating a set of
00:41:13
rules. And then within that metaprompt,
00:41:16
they're actually able to have some
00:41:18
degree. Maybe autonomy is too strong a
00:41:20
word. Everything is still under the
00:41:21
control of humans,
00:41:23
>> but there's an attenuation. Yeah, I would
00:41:24
call it almost like prompt attenuation.
00:41:27
Like the agent or the AI doesn't have to
00:41:29
be specifically prompted. They're given
00:41:31
a general prompt or general set of rules
00:41:33
and then they're able to riff off each
00:41:35
other. Now, some people, some critics
00:41:37
are saying, "Well, this isn't that
00:41:38
impressive because we knew that LLMs are
00:41:41
really good at creative fiction writing,
00:41:43
right?" So, you know what a lot of
00:41:46
people are saying is, look, LLMs
00:41:49
like Claude have been trained on Reddit
00:41:52
specifically and all of this creative
00:41:56
writing that's being done on the
00:41:57
internet. And so if you give general
00:41:59
instructions to these Clawdbots on,
00:42:02
you know, behave in a social network,
00:42:04
they're going to start posting things
00:42:05
that they learned from humans. So a lot
00:42:08
of people are saying this isn't that
00:42:10
impressive. Nonetheless, I do think that
00:42:12
there is something very interesting
00:42:14
about it again in this concept of prompt
00:42:16
attenuation, that the AIs don't need to
00:42:18
be specifically prompted. They can
00:42:20
download a general skills file. They can
00:42:22
now have a set of rules for operating
00:42:24
and they can riff off each other. And
00:42:26
you can see how as the underlying AI
00:42:29
gets better and better that this could
00:42:32
lead to some emergent behavior. So what
00:42:35
do I mean by better and better? Well,
00:42:36
what if the hardware they're running on
00:42:38
is better than a Mac Mini? What if the
00:42:40
underlying LLM is better than Opus 4.5?
00:42:43
What if the time horizon, which is the
00:42:46
length of time it's able to operate
00:42:48
without an intervention by a human,
00:42:50
keeps getting longer and longer? You
00:42:52
could imagine that these agents are
00:42:54
going to be capable of very
00:42:55
sophisticated behavior
00:42:58
and there probably are some safety
00:43:00
issues around that that we should start
00:43:01
thinking about.
00:43:02
>> It's actually, you know, it's not that
00:43:04
we can imagine it. We're only three
00:43:06
years into this,
00:43:07
>> right?
00:43:07
>> Yeah.
00:43:07
>> We're growing on an exponential curve. I
00:43:10
think we can safely say it will happen.
00:43:12
>> Yeah.
00:43:13
>> Right. And just this year, like we're
00:43:14
going to see the first models over the
00:43:17
course of the next four to eight weeks
00:43:18
out of DeepSeek, out of Anthropic, out
00:43:20
of OpenAI that are trained on Blackwell
00:43:22
servers, right? You're going to see a
00:43:23
next generation of models far more
00:43:25
capable. Remember, the whole reason
00:43:27
we're having this conversation is
00:43:29
because of the Claude Code moment in the
00:43:31
first week in December because we had a
00:43:32
step function from Opus 4.5, right? And
00:43:36
so I just think we have to get our
00:43:38
heads around the fact that the rate of
00:43:41
change is very steep and accelerating
00:43:44
and that is going to cause far more
00:43:46
dislocation in the value of things that
00:43:49
we used to say we understood they were
00:43:52
going to you know these companies were
00:43:54
unassailable. Whatever you think you
00:43:56
know, you need to have maximum mental
00:43:58
flexibility and humility right now about
00:44:01
the future because it's going to change
00:44:03
at an increasingly rapid rate. And I
00:44:06
think the people who are dogmatic who
00:44:07
say with certainty, this company is
00:44:09
always going to be worth this, right?
00:44:11
They need to go pay attention to what's
00:44:13
happening at these frontier labs.
00:44:15
>> The situation is super dynamic and you
00:44:16
do have to be humble about what's
00:44:18
happening and you have to update your
00:44:20
mental model very quickly as some of the
00:44:22
assumptions change.
00:44:24
>> Yeah. And the number one assumption for
00:44:25
me is this concept of recursiveness
00:44:28
where these models are going out every
00:44:30
day on a cron job to get better at what
00:44:32
they do. When you hear this discussion
00:44:33
Freeberg, how does it inform you with Ohalo in
00:44:36
creating an agent to go look at the data
00:44:39
and make itself better or investigate
00:44:41
other things happening in agriculture
00:44:42
and report it back to you? Have
00:44:44
you started to rethink as a CEO how you
00:44:47
look at organizational structure and/or
00:44:50
you know virtuous loops of innovation?
00:44:52
My biggest takeaway from Moltbook is
00:44:55
maybe what we perceive to be
00:44:58
intelligence
00:45:00
is itself, like, emergent, meaning we
00:45:05
think that humans have this like
00:45:07
profound ability to communicate. You
00:45:09
guys ever watch Derren Brown the
00:45:10
hypnotist? You ever seen his shows?
00:45:13
>> No. Explain to the audience. Yeah.
00:45:15
>> Well, he's pretty crazy. Like there's
00:45:16
this one episode I think I've talked
00:45:18
about it. It's my favorite episode that
00:45:20
he's done where he takes these two
00:45:21
advertising executives and they're both
00:45:22
supposed creative geniuses. And he picks
00:45:25
them up at their office and he brings
00:45:26
them to his office and in his office
00:45:28
he's got like a whiteboard covered in a
00:45:30
blanket and he says, "You guys have to
00:45:32
come up with a name for a pet cemetery.
00:45:34
Come up with a logo. Come up with a
00:45:36
motto." They spend eight hours in the
00:45:38
room ideating, working on whiteboards,
00:45:40
going back and forth. Did you think
00:45:42
about this? Did you think about that?
00:45:43
>> Blah blah blah blah blah. Like, oh my
00:45:45
god. And at the end of it, they come up
00:45:46
with this great idea. He walks in, they
00:45:49
show their idea, the name, the logo, and
00:45:51
the motto. He opens up the blanket, the
00:45:53
whiteboard he had underneath. He had the
00:45:54
exact same name, logo, and motto. And
00:45:57
all along the way, when he picked them
00:45:59
up in the morning and he drove them from
00:46:00
their office to his office, they were in
00:46:02
a cab. And he put these little
00:46:04
subliminal messages in the cab. He had
00:46:05
these kids walk across the road wearing
00:46:07
a logo on a t-shirt. He had all of these
00:46:09
kind of subconscious cues for these
00:46:12
guys. He effectively programmed them.
00:46:14
And it was kind of to me the biggest
00:46:17
insight into maybe like human
00:46:18
creativity, human consciousness and our
00:46:21
kind of belief in self-will because
00:46:25
maybe there's this underlying
00:46:26
programming where we're all effectively
00:46:29
programmed interacting with each other
00:46:31
and there's computation there's social
00:46:33
computation going on all the time but
00:46:35
that social computation perhaps if you
00:46:37
have the right view on it is quite
00:46:38
predictive and maybe understandable and
00:46:41
maybe that's what we're seeing in
00:46:42
Moltbook where we all think that there's
00:46:44
this unique idea of intelligence, but
00:46:46
maybe it's what we all do, which is
00:46:48
effectively computation of information
00:46:50
that is transitioned in different ways
00:46:53
in the same way that maybe humans
00:46:54
socially interact and it's simply
00:46:56
mimicking or replicating the way that we
00:46:57
do things. So, I think what was so
00:46:59
striking to me is how everyone was so
00:47:01
struck by it and you know, maybe one day
00:47:04
we'll all kind of wake up to a little
00:47:06
bit of this. Maybe we're all Moltbook.
00:47:08
I don't know. That's my profound
00:47:09
>> or there's a finite set of outcomes and
00:47:12
there's some predictability to it. In
00:47:14
the same way GTO solvers sort to figure
00:47:16
out all the threads of possibilities in
00:47:18
poker or the heuristics of, you know,
00:47:21
chunks of chess and you know the best
00:47:24
practices there like maybe it's just
00:47:26
figuring that all out. The universe is a
00:47:28
giant system of computation information
00:47:31
computed by matter and maybe the
00:47:33
information is computed by silicon
00:47:35
versus carbon. There it is.
00:47:37
>> Big news this week. Trump has nominated
00:47:39
Kevin Warsh as the new Federal Reserve
00:47:42
chair. Trump made the announcement on
00:47:45
Friday, January 30th. Background on
00:47:47
Warsh: 55 years old, 20 years younger
00:47:50
approximately, than Powell, who's
00:47:52
currently in charge. He graduated from
00:47:54
Stanford and Harvard served as the
00:47:56
youngest Fed governor at age 35. That's
00:47:58
impressive. And he helped steer the Fed
00:48:01
through the great financial crisis back
00:48:03
in 2008. He's apparently an inflation
00:48:06
hawk. He's very pro growth. He's very
00:48:08
pro AI. And he uh Freedberg, you're
00:48:11
going to like this. He's against
00:48:13
excessive government spending and money
00:48:15
printing. These are all very unique
00:48:17
positions for a Fed chair. If
00:48:21
he's confirmed by the Senate, he takes
00:48:23
office in May of 2026, replacing Jerome
00:48:26
Powell. And remember, Powell is under
00:48:29
criminal investigation by the Trump
00:48:31
administration's DOJ for testimony he
00:48:34
gave regarding the Fed's headquarter
00:48:36
renovation. Remember that awkward
00:48:38
presser between him and Trump where they
00:48:39
were going over the costs? Uh, GOP
00:48:43
Senator Tillis, who we talked about last
00:48:46
week, said he will block Warsh's
00:48:48
nomination until the DOJ wraps up what a
00:48:51
lot of people are calling lawfare
00:48:53
against Powell. Freeberg, Warsh was on one
00:48:57
of your boards for five years. What are
00:48:58
your thoughts on him as the Fed chair?
00:49:00
>> As most folks know, he's worked with
00:49:01
Stan Druckenmiller for a number of
00:49:02
years. Stan's been very public with his
00:49:05
comments and was very public with his
00:49:06
comments in 2022, 2023 coming out of the
00:49:10
pandemic on the Fed's actions and their
00:49:13
failure to act at the right time. I
00:49:14
think Kevin Warsh was very prescient in
00:49:17
his points of view that he has shared
00:49:18
publicly at the same time about what the
00:49:21
Fed's failure to take action early would
00:49:24
mean, which would be rapid rise in
00:49:26
inflation. They've been pretty vocal
00:49:28
about things that I think are so
00:49:31
critical at this stage. If we don't
00:49:33
address both the monetary policy and the
00:49:35
budget policy, I think we're going to be
00:49:36
in a lot of trouble. And I think having
00:49:38
Kevin Warsh coming on board means
00:49:40
probably generally more quantitative
00:49:41
tightening, probably generally a bit
00:49:45
more of a prudent approach to monetary
00:49:48
policy. And you know, you can kind of
00:49:50
translate that through maybe to some of
00:49:52
the actions we're seeing in markets
00:49:53
today. I'd love Brad's point of view.
00:49:54
And if he concurs, but I think Kevin is
00:49:57
a high integrity, deeply intellectual
00:49:59
economic thinker. He's not political.
00:50:01
He's not
00:50:03
in these kind of dogmatic ways that I
00:50:06
think, you know, puts things at risk. He
00:50:07
has relationships with central bankers
00:50:09
around the world. That makes him very
00:50:12
much have a good global view. So anyway,
00:50:15
I think he's an excellent choice and I
00:50:18
I'm really happy the president picked
00:50:20
him.
00:50:20
>> Brad, your thoughts?
00:50:22
>> Yeah, I mean, listen, I think Kevin's an
00:50:23
excellent choice. Kevin Hassett and
00:50:25
Rick would also have been good. I
00:50:27
think they all would have flown a very
00:50:29
similar trajectory. But I agree with
00:50:31
David. The market I think is
00:50:33
overreacting to quote unquote his
00:50:35
hawkishness. So let me give you a few
00:50:37
counterpoints with respect to
00:50:39
>> By the way, before you get to your
00:50:40
counterpoints, what is the hawkishness
00:50:42
that the market is fearing, specifically?
00:50:45
>> Yeah. The idea of hawkishness is that
00:50:47
you're going to do quantitative
00:50:48
tightening. That means you're going to
00:50:50
pull money out of the system,
00:50:53
right? by allowing debt to roll off and
00:50:55
not repurchasing,
00:50:57
you know, mortgages or other things.
00:51:00
Number two, it's that you won't lower
00:51:02
rates as much as other people might have
00:51:04
lowered rates. So, that's what the
00:51:06
market is kind of fearful of because
00:51:07
he's been very critical in the past, as
00:51:10
we were on this podcast, right, of
00:51:12
Jerome Powell. In June of '21, it was obvious
00:51:15
to everybody in the world that inflation
00:51:16
was skyrocketing and the Fed sat on its
00:51:18
hands. So, but let me give you a couple
00:51:21
thoughts. Number one, they've said very
00:51:23
clearly and he said clearly that he
00:51:26
really thinks that Greenspan got it
00:51:29
right in the '90s, right? That sometimes
00:51:31
you can have really high rates of growth
00:51:33
without inflation. That comes from
00:51:35
productivity. In the '90s, that was
00:51:37
driven by the internet. Today, it's
00:51:38
driven by AI. He thinks AI will be very
00:51:41
deflationary. And so he's more likely to
00:51:44
let the economy run so that we can have
00:51:46
these four or 5%, you know, GDP prints
00:51:49
without panicking and saying, "Oh my
00:51:51
gosh, I got to I got to raise rates."
00:51:53
Number two, when you look at the balance
00:51:56
sheet, the Fed's balance sheet peaked at
00:51:59
$9 trillion in '22. It's already rolled
00:52:02
off to 6.5 trillion. We've had
00:52:05
quantitative tightening to the tune of
00:52:06
2.5 trillion. So yes, I think he'll
00:52:09
continue to reduce the size of the Fed's
00:52:11
balance sheet, but at a slower rate, and
00:52:13
he's recently commented on this, I think
00:52:15
at a slower rate than the rate we've
00:52:16
been on. So I don't think that that is
00:52:18
an additional headwind to the economy.
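Brad's balance-sheet arithmetic can be checked with a small runoff sketch. Only the $9 trillion peak and roughly $6.5 trillion current level come from the discussion; the monthly runoff caps below are illustrative assumptions, not actual Fed policy parameters.

```python
# Toy model of Fed balance-sheet runoff ("quantitative tightening"):
# maturing securities are not reinvested, up to a monthly cap.
# Only the 9.0 and 6.5 figures come from the discussion; the caps
# are illustrative assumptions.

def run_off(balance_tn: float, monthly_cap_tn: float, months: int) -> float:
    """Shrink the balance sheet by at most monthly_cap_tn per month."""
    for _ in range(months):
        balance_tn = max(0.0, balance_tn - monthly_cap_tn)
    return balance_tn

peak, current = 9.0, 6.5                      # trillions USD
print(f"QT so far: {peak - current:.1f}T")    # 2.5T rolled off

# Hypothetical paces: $40B/month versus a slower $20B/month
fast = run_off(current, 0.04, 24)
slow = run_off(current, 0.02, 24)
print(f"after 24 months  fast: {fast:.2f}T  slow: {slow:.2f}T")
```

Halving the assumed cap halves how much rolls off over the same window, which is the sense in which slower QT is less of a headwind.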
00:52:20
And then finally, when it comes to rate
00:52:22
cuts, I think that he's going to, you
00:52:24
know, I don't think the president would
00:52:25
have appointed him unless he was
00:52:27
constructive on rate cuts. I think he
00:52:29
believes that we're too restrictive. And
00:52:31
the reason we're too restrictive is that
00:52:33
inflation is well anchored. Listen,
00:52:36
inflation has come in below all
00:52:38
consensus estimates for two years,
00:52:41
right? It's still coming in below
00:52:42
consensus estimates because the GDP
00:52:45
gains we're getting are from the fact
00:52:47
that we're investing more in the economy
00:52:49
from the productivity gains we're seeing
00:52:51
from AI, etc. And so I happen to think
00:52:54
that I would take the over on the
00:52:56
number of rate cuts that Warsh is going
00:52:57
to give us this year. But I think the
00:52:59
market is clearly a little bit
00:53:01
nervous about this and saying, you know,
00:53:03
the reputation is more hawkishness and
00:53:05
so maybe we ought to back off a little
00:53:06
bit.
00:53:08
>> Sacks, your thoughts on this selection by
00:53:11
President Trump. Why did he pick him in
00:53:13
your mind?
00:53:15
>> Well, I mean, Kevin has every credential
00:53:18
that you can possibly have. He's been on
00:53:20
the Fed board of governors before. He
00:53:22
worked for Bernanke. He's sort of as blue
00:53:25
chip as it gets. Like Brad said, I think
00:53:27
Hassett would have been amazing too, or
00:53:30
certainly is very well credentialed.
00:53:31
And I think this pick was quite
00:53:33
well received. You saw that in financial
00:53:35
markets on the heels of this, the price
00:53:38
of gold and silver came down. It was
00:53:41
reassuring to those who are worried
00:53:42
about currency debasement basically.
00:53:44
Now, that being said, I do think that
00:53:48
Warsh has been consistent for the last
00:53:51
year, saying the Fed was taking too long
00:53:52
to realize that inflation is falling and
00:53:55
that they should be cutting more. So, I
00:53:57
do think that over the next, say, 6
00:54:00
months to a year, he's going to want to
00:54:01
cut rates. But I think that the markets
00:54:04
are reassured that in the long term, he
00:54:06
will make sure that we have the right
00:54:08
rates.
00:54:10
I will say that I do think the latest
00:54:11
data from Truflation bears this out
00:54:14
that again that inflation is coming way
00:54:16
down and I think that there was some
00:54:20
softness in the Challenger Gray report
00:54:23
this morning. I don't know if you saw
00:54:25
that there about 100,000 layoffs in
00:54:26
January. Now roughly half those I think
00:54:28
were localized to UPS which was severing
00:54:31
its deal with Amazon and then Amazon was
00:54:33
basically making a bunch of efficiency
00:54:35
cuts. Only 7% were related to AI. So
00:54:38
that's not the story. It's really, I
00:54:40
think, very localized to Amazon and its
00:54:43
delivery partner. But nonetheless, you
00:54:45
see pockets of weakness. And again, I
00:54:48
think Powell's been too late to cut rates.
00:54:51
He could have done it last week. That
00:54:52
would have been a lot better. But now, I
00:54:54
guess their next meeting is not till
00:54:55
March or April. And I think you see it
00:54:56
reported that the expectations for rate
00:54:59
cut have gone up.
00:55:00
>> Yeah. And the independence of the Fed, I
00:55:04
guess, has always been the big issue
00:55:05
here. Freedberg, Chamath's not here this
00:55:07
week, but he's been saying maybe the Fed
00:55:09
should be disbanded. Do you have
00:55:10
concerns, Freedberg,
00:55:12
with the independence of the Fed and,
00:55:16
you know, the executive branch maybe
00:55:18
having too much influence over setting
00:55:21
of rates and quantitative easing?
00:55:24
>> My day as emperor would probably revert
00:55:26
us back to being on the gold standard,
00:55:28
so we wouldn't be printing money. But
00:55:30
hey, that's that's about it. That's the
00:55:31
only opinion I have.
00:55:32
>> The rest of
00:55:33
>> Here's the problem. What if
00:55:34
you have a Fed chair who is too late to
00:55:38
cut rates and he's tanking the economy
00:55:40
or hurting the economy. It's definitely
00:55:41
not tanking it, but it's hurting it
00:55:43
relative to what it could be and he
00:55:45
seems to not want to adjust course
00:55:48
because,
00:55:49
you know, he's kind of dug in his heels
00:55:51
and maybe he has animus towards the
00:55:52
executive branch.
00:55:54
>> What do you do in that situation? Well,
00:55:56
I mean, then what do you do
00:55:58
when AOC is president, with Vice
00:56:00
President Mamdani, and they decide, hey,
00:56:02
we want to stick it to government and
00:56:04
you know there is no Fed or they have
00:56:05
too much influence. Brad's trying to get
00:56:07
rid of the independence of the Fed,
00:56:09
J-Cal.
00:56:10
>> Yeah. Disappointment.
00:56:12
>> President Trump has talked about it.
00:56:13
Yeah. That he wants them to do what he
00:56:15
says. Yeah.
00:56:16
>> Everybody thought he was going to pick
00:56:17
Hassett, in part because he was the
00:56:20
person inside the White House. This
00:56:21
decision I think was viewed as the most
00:56:24
independent decision, yes, because he's taken
00:56:26
a lot of positions the exact opposite of
00:56:28
the president's. And I think you've got
00:56:30
to hold these two truths at the same
00:56:32
time. Number one, I think if Warsh saw the
00:56:34
situation we saw in June of '21
00:56:37
when the cost of a cargo container from
00:56:39
China went from $1,500 to $15,000 and we
00:56:43
were screaming to raise rates and Powell
00:56:45
did nothing. I think Warsh would have
00:56:47
been raising rates like crazy in order
00:56:49
to stave off inflation. So, I think this
00:56:50
guy's intellectually honest. It's just
00:56:52
now we have inflation that is coming in
00:56:55
below expectations and we know that the
00:56:58
restrictive rate is above neutral. So
00:57:01
it's the Fed's job to keep the economy
00:57:03
going at maximum employment so long as
00:57:06
inflation is anchored. And that's the
00:57:08
situation we're in. Inflation is
00:57:09
anchored. We need to have lower rates so
00:57:11
people can buy homes and borrow money to
00:57:13
uh to live their lives.
00:57:15
>> One thing that Warsh could do that I
00:57:16
think would be very impactful is just
00:57:18
get better data at the Fed.
00:57:20
>> Yes. Yes. I mean, from what I
00:57:21
understand, their data is all legacy. We
00:57:23
have so much real-time data now in the
00:57:25
private sector and the Fed. So, I was
00:57:27
talking to Barry Sternlicht, you know,
00:57:29
from Starwood, big real estate guy about
00:57:31
this, and he was telling me that the way
00:57:33
they measure inflation for housing or
00:57:36
for rentals, which is a major component,
00:57:38
is they survey like 8,000 households to
00:57:41
find out what their rent is. It's like,
00:57:42
are you kidding? All you got to do,
00:57:44
>> they should be going to Zillow. They
00:57:45
should be going to
00:57:47
>> they should be looking at millions of
00:57:49
units that have recently rented. So look
00:57:51
at the deltas, not like the stale data.
00:57:53
>> It's all digital already.
00:57:54
>> You know, it's actually a great point,
00:57:56
Sacks. You could probably make a bet or an
00:57:58
investment on the idea that Kevin Warsh
00:58:00
will lead the Fed to a new digital,
00:58:02
better data system, more streamlined,
00:58:04
more frequent, better data.
00:58:07
And then as an investor, you could ask
00:58:08
yourself the question, okay, what are
00:58:10
the implications of that being the case?
00:58:12
And you could probably start to trade on
00:58:14
that. I mean, he was telling
00:58:15
me, so Barry, you know, again, runs
00:58:17
Starwood, so they have a lot of units.
00:58:18
And he was telling me, look, there are
00:58:20
landlord companies, large corporations
00:58:22
that have literally like a million
00:58:24
units. Whose data do you think is better
00:58:27
on rental inflation or deflation?
00:58:30
Obviously, theirs because they got the
00:58:31
freshest data, but the Fed could go get
00:58:33
all that data across all these different
00:58:35
companies. And what he was saying is,
00:58:37
you know, Brad, to your point about
00:58:38
summer of 2021 when the price of a
00:58:41
shipping container was going through the
00:58:43
roof, so were rents.
00:58:45
>> You know, it was way higher than what
00:58:47
the Fed's data suggested. I mean, like
00:58:49
Barry was saying, like in certain places
00:58:50
it was like 40%. The Fed's data was very
00:58:53
laggy because again, they're surveying.
00:58:56
So it's not as precise. And then on the
00:58:58
way down, when rent prices come down,
00:59:01
it's also super laggy. So the point is
00:59:04
that the Fed is slow. Maybe this is why
00:59:06
Powell is too late: they're using
00:59:09
stale data or laggy data. So they don't
00:59:11
see the inflation when it's
00:59:13
skyrocketing, but they're also not
00:59:14
seeing when it's decreasing. If you
00:59:16
think about the misallocation of
00:59:17
resources that occurred as a result of
00:59:20
the Fed not acting in June of '21, it
00:59:23
cost our country trillions of dollars.
00:59:26
'22 wouldn't have had to happen the way
00:59:28
it happened where everything crashed out
00:59:30
because all of a sudden we panicked at
00:59:32
the end of '22 and had to jam interest
00:59:35
rates, which caused people to lose jobs,
00:59:37
companies to, uh, you know,
00:59:40
struggle, uh
00:59:41
>> banks to blow up
00:59:42
>> you know, and so to me that was all
00:59:44
avoidable. Have a Manhattan Project-style data
00:59:47
effort for the Fed, which I agree with
00:59:49
you on, Sacks. Or David, maybe he'll
00:59:52
be the one to do it. Bring AI into the
00:59:55
Fed. Why are we having Fed governors call
00:59:57
up three CEOs as part of their survey to
01:00:00
get the feel on how things are going as
01:00:02
opposed to having AI collect those
01:00:04
trillions of data points that the Fed
01:00:06
can act on?
01:00:07
>> Yeah, I'll just close with I I like the
01:00:09
pick because he is very clear-eyed about
01:00:13
what causes inflation which is
01:00:15
government spending. Like that is
01:00:17
printing money. Government spending is
01:00:19
the root of all this inflation stuff. So
01:00:21
he just can control that and he is a
01:00:24
backstop or a voice on that issue. I
01:00:26
think it's great and I think he should
01:00:27
drop all this Powell lawfare nonsense.
01:00:30
Okay.
01:00:31
>> And he understands technology better
01:00:33
than anyone. I think that's so key.
01:00:35
>> Well, I mean having somebody who's 55
01:00:38
and not 65 or 75, I think that makes a
01:00:41
lot of sense.
01:00:41
>> But also he spends a lot of time in
01:00:42
Silicon Valley. He's been working out of
01:00:44
the Hoover Institution at Stanford and
01:00:47
he's very well connected and I think
01:00:48
he's had great insight and perspective.
01:00:51
I don't want to end the show without
01:00:52
talking about the SpaceX-xAI merger. On
01:00:54
Monday, Elon Musk announced SpaceX is
01:00:56
acquiring xAI, the largest M&A transaction in
01:00:59
history, $1.25 trillion combined
01:01:01
valuation. If you didn't know, X,
01:01:04
formerly Twitter, got acquired by xAI,
01:01:08
which was Elon's LLM AI startup. Those
01:01:13
two were together. Now those two become
01:01:14
part of SpaceX and they're going to IPO
01:01:18
this year potentially be biggest IPO in
01:01:21
history in terms of money raised and
01:01:23
market cap. Brad, your thoughts on this
01:01:27
transaction and the eventual perhaps
01:01:31
creation of $MUSK: put Tesla and
01:01:35
SpaceX together, which includes X, and
01:01:38
then you've got Optimus robots on the
01:01:40
moon base building data centers in space
01:01:43
that are powered by solar. Your
01:01:45
thoughts?
01:01:46
>> Well, let's just stick with what we
01:01:47
know. SpaceX is merging with xAI.
01:01:50
You're merging the two biggest TAMs in
01:01:52
the world, right? all of artificial
01:01:54
intelligence and all of space together
01:01:57
with the world's greatest entrepreneur.
01:02:00
Um, and he said, you know, there was a
01:02:03
podcast he did this morning with Cheeky
01:02:05
Pint, our friend John Collison, where he
01:02:07
said, "I'm going to have data centers in
01:02:08
space in 30 months." Right? And if
01:02:10
you're going to have a massive cost
01:02:12
advantage with data centers in space,
01:02:13
and remember power is the proxy, power
01:02:16
is the primitive to AI, if you can
01:02:19
deliver that, right? And there are tons
01:02:21
of retail investors and institutional
01:02:23
investors like us who want to bet
01:02:25
against that future, right? Then Elon's
01:02:27
your guy and the combination of those
01:02:29
make perfect sense. But Elon is like
01:02:32
kind of an N of one in his ability to dream
01:02:34
this.
01:02:35
>> And just to clean that up, you said bet
01:02:36
against; you mean bet with him. Not
01:02:39
against that vision, but bet with that
01:02:41
vision. Just
01:02:42
>> like I I I think that there will be
01:02:44
dramatic retail demand and institutional
01:02:47
demand who want to bet on that future.
01:02:49
these two giant TAMs of artificial
01:02:51
intelligence in space. Got it. And so,
01:02:53
you know, and then if you just look at
01:02:55
you you click down a layer, you know,
01:02:57
Starlink's going from, I think, 10
01:02:58
million people to 20 million people.
01:03:00
They're going to launch this uh, you
01:03:02
know, retail uh mobile service so that
01:03:05
we can have Starlinks to our phones to
01:03:06
replace these crappy mobile networks
01:03:08
that still 20 years later can't keep us
01:03:10
connected to a phone call. And, you
01:03:13
know, and now we're going to get, you
01:03:14
know, data centers in space. So, um, you
01:03:17
know, I'm glad he's on America's side.
01:03:19
>> Data centers in space. Freedberg, uh,
01:03:22
brilliant idea, science fiction. Can he
01:03:24
get it done in 30 months? Impact if he
01:03:26
does.
01:03:27
>> Well, I think there's
01:03:30
one key point that I would make about
01:03:32
the macro landscape at the moment. We
01:03:35
are limited by power and as Brad pointed
01:03:37
out, power is the requisite for scaling
01:03:41
compute, for scaling ultimately the
01:03:43
applications of AI.
01:03:45
And in that constraint, in that
01:03:48
constrained world, much like any other
01:03:50
constrained world, scarcity breeds
01:03:53
innovation. And so I think that there
01:03:56
are two paths that we're going to
01:03:57
observe happening in parallel here. One
01:04:00
is the Elon path, which is to escape the
01:04:03
the constraints of these social systems
01:04:06
that say, "I don't want a data center. I
01:04:08
don't want nuclear. I don't want this. I
01:04:10
don't want that."
01:04:12
regulators, people that are trying to
01:04:14
tax you, people that are limiting our
01:04:15
ability to scale electricity production
01:04:17
on Earth and there's a lot of reasons
01:04:19
for that and we can go through them. So
01:04:21
that's one aspect of how do you escape
01:04:23
that constraint. I think that there's a
01:04:25
separate aspect which is totally
01:04:27
unrelated to the topic you're talking
01:04:28
about which is that I do think that we
01:04:31
will see compute efficiency scale by
01:04:33
probably on the order of 70 to 100x over
01:04:36
the next few years. meaning electricity
01:04:39
efficiency per token of output. And I
01:04:42
think that there's a number of reasons
01:04:43
to believe that it's in the chip stack.
01:04:46
I mean, Groq, our friend Sunny and his
01:04:49
exit to Jensen is a good indication of
01:04:52
that. But that was call it Brad, I think
01:04:54
you know the number. It's probably
01:04:55
around 2 to 3x improvement in energy
01:04:58
efficiency. But there's model
01:05:00
architecture being redone. There's ways
01:05:02
of breaking LLMs into small models,
01:05:05
running them locally. There's a way of
01:05:07
having networks of models work where you
01:05:09
don't have to call the whole model and
01:05:11
run it through the entire matrix, but
01:05:13
you can run through smaller matrices and
01:05:15
then you can have those smaller matrices
01:05:16
call other matrices as needed. So the
01:05:18
total compute need goes down, which
01:05:20
means total electricity goes down. So
01:05:21
chip architecture is changing, model
01:05:23
architecture is changing. So I think
01:05:25
this is a good reflection of what's
01:05:26
going on right now in the world, which
01:05:28
is there is this increased demand for AI
01:05:31
for for effectively productivity
01:05:33
improvements in the world to unleash
01:05:35
human potential. But we are constrained
01:05:37
by energy and we are constrained by
01:05:39
resources that we have here on earth
01:05:40
today. So one branch is let's escape
01:05:43
earth, go get energy in space, make data
01:05:46
centers in space. Only one person can
01:05:48
execute on that. It's Elon. I think that
01:05:50
to Brad's point is an N of one. I
01:05:52
don't think we're going to see a lot of
01:05:53
that. So how is everyone else going to
01:05:54
respond? Because everyone else can't
01:05:56
launch data centers in space. I think
01:05:58
everyone else is going to respond by
01:06:00
creating entirely new model
01:06:01
architectures, new chip stacks. And
01:06:03
that's I think the other side of this
01:06:05
innovation coin
01:06:06
>> efficiency.
01:06:07
>> Yeah.
01:06:08
>> Yeah. It's this new way of getting lower
01:06:10
energy costs per token of output.
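The "networks of smaller models" idea Freeberg sketches is roughly a model cascade: answer with a cheap model when it is confident, and call the expensive model only when needed. The sketch below is schematic, with stand-in cost numbers and a faked confidence score, not any production architecture.

```python
import random

SMALL_COST, LARGE_COST = 1, 70   # stand-in compute units per query

def small_model(query: str) -> tuple[str, float]:
    """Cheap model: returns an answer plus a confidence score (faked here)."""
    return f"small-answer({query})", random.random()

def large_model(query: str) -> str:
    """Expensive fallback model."""
    return f"large-answer({query})"

def cascade(query: str, threshold: float = 0.7) -> tuple[str, int]:
    """Answer with the small model; escalate only when it isn't confident."""
    answer, confidence = small_model(query)
    if confidence >= threshold:
        return answer, SMALL_COST
    return large_model(query), SMALL_COST + LARGE_COST

random.seed(0)  # deterministic demo
total = sum(cascade(f"q{i}")[1] for i in range(1000))
print(f"cascade cost: {total}  vs always-large: {1000 * LARGE_COST}")
```

Even with only ~30% of queries resolved cheaply at this threshold, total compute (and therefore electricity per answer) lands below the always-call-the-big-model baseline; a better-calibrated small model raises that share further.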
01:06:12
>> And if you put those both together, you
01:06:15
could get both. So whatever token
01:06:17
efficiency and energy efficiency happens
01:06:19
here on Earth, Elon can put into space.
01:06:21
Right. So
01:06:22
>> that's right.
01:06:23
>> Yeah. So he could
01:06:23
>> and I think we got to ask ourselves the
01:06:25
question if this is successful and if
01:06:28
Elon's math is right, the engineering is
01:06:30
right and the execution is right,
01:06:33
what is the response going to be?
01:06:35
Because the whole planet isn't going to
01:06:37
let Elon have a monopoly on the future.
01:06:41
So we've got to ask ourselves from a
01:06:43
social perspective, a political
01:06:44
perspective, an economic and a business
01:06:46
perspective, all four of those vectors,
01:06:49
what are others going to do? We can all
01:06:50
be excited about retail buying into
01:06:52
this. Great. But how is the business
01:06:54
community that's building data centers
01:06:55
and is investing (Google's investing $185
01:06:57
billion this year in data centers) going to respond? How
01:06:59
is China going to respond? How are
01:07:02
people going to respond when one man
01:07:04
controls the world's compute? And we
01:07:06
could probably do a two- or three-hour
01:07:08
conversation on that. But I think that's
01:07:09
where I would spend a lot of time doing
01:07:11
deeper analysis from both an investment
01:07:13
perspective and thinking about what's
01:07:15
around the corner. I think it's Elon's
01:07:16
laid out his path and where he's going.
01:07:18
And I do believe he's going to do it.
01:07:20
Now, what's the rest of the world going
01:07:21
to do? That's where I think things get a
01:07:23
little bit more challenging and you
01:07:25
could kind of debate things, but the
01:07:26
rest of the world is not going to sit
01:07:27
idly by.
01:07:29
>> Yeah. And Brad, if this does happen, you
01:07:32
get people with new chipsets, new
01:07:34
architectures, better software, and
01:07:36
better energy on planet Earth, and Elon
01:07:38
doing this in space, and we do see
01:07:40
tokens go down or efficiency go up,
01:07:43
let's say 200x, 300x, there is a
01:07:46
possibility that we're going to solve
01:07:48
almost all the problems we need to solve
01:07:50
and there'll be excess capacity. That is
01:07:53
another potential outcome here is that
01:07:55
we don't know what to do with all these
01:07:56
tokens.
>> Social order. Social order is
01:07:58
the one problem that you're going to
01:08:00
create, Jason. So, just to be clear,
01:08:02
there's a concept of diffusion
01:08:04
of innovations: when something new
01:08:05
comes, it does not hit
01:08:07
everyone at once. And so, the rate of
01:08:10
change that's being unleashed right now
01:08:12
is creating a very asymmetric outcome in
01:08:17
terms of when people realize the
01:08:19
benefits from that change.
01:08:20
>> Absolutely. And so as what you're
01:08:21
describing happens, which I think it
01:08:23
will, Elon is accelerating everyone
01:08:25
forward and he's going to force everyone
01:08:26
else to respond in business, in
01:08:29
government, and so on, the biggest
01:08:31
challenge, the biggest problem that's
01:08:32
going to emerge as we get rid of cancer,
01:08:34
as we get rid of aging, as we get rid of
01:08:35
food scarcity, as we get rid of resource
01:08:37
scarcity, blah blah blah,
01:08:39
is social order, because it's going to
01:08:40
create such a tremendous disruption.
01:08:43
>> I love the upleveling of the point that
01:08:45
David just made because I think it links
01:08:47
a lot of things we talked about today
01:08:49
together. There have been 117 billion
01:08:51
humans who've occupied this planet
01:08:54
and for 99.9% of them they never saw a
01:08:57
single innovation in their lifetime.
01:08:59
>> Their lifespan was shorter than the
01:09:01
invention cycle. And now you think about
01:09:03
the rate of change that we're having to
01:09:06
digest nation states, families,
01:09:09
businesses, right? It's: prepare for
01:09:13
the unexpected, and whatever, you know,
01:09:16
the things are that you believe to be
01:09:17
true. Again, I just think it demands
01:09:19
this intellectual humility. Now, to
01:09:21
Freeberg's point, I don't think any of
01:09:22
this is changing in the next, you know,
01:09:24
24 months, 36 months. Data centers are
01:09:27
going to be on planet Earth. You know,
01:09:28
they're going to be filled with Nvidia
01:09:30
chips and, you know, and the other chips
01:09:32
that we've talked about. And I think
01:09:33
that alone is going to bring us this
01:09:35
agentic future that's already going to
01:09:37
be shocking even before we launch these
01:09:40
data centers in space.
01:09:41
>> Yeah. I mean, this is an over-the-top
01:09:43
move from Elon that I don't think
01:09:45
anybody anticipated, and he has figured
01:09:47
it out. I've sat with him and he's
01:09:49
walked me through it, like how this
01:09:51
works. It works. So, the question is
01:09:53
simply execution. There is no stronger
01:09:56
entrepreneur when it comes to execution
01:09:58
in the history of entrepreneurs than
01:10:00
Elon. I know he's a friend of mine and
01:10:02
you know, I'm hyping him up, but he will
01:10:04
execute on this and if he does or when
01:10:06
he does, I should say, it's going to
01:10:08
change everything. You can see this
01:10:10
today and if you are scared about this
01:10:12
future and you're listening to this
01:10:13
podcast wondering for your kids etc.
01:10:16
there's a very simple way to not be
01:10:18
scared which is to embrace and use these
01:10:19
tools. The top two people in my
01:10:21
organization out of 20 people who are
01:10:23
using openbot and building Ultron and
01:10:26
the Age of Ultron: each
01:10:28
one of them is worth 200 of the other
01:10:31
employees, and they only have 18 other
01:10:32
ones. If you're a young person, just
01:10:35
embrace these tools. Open OpenClaw this
01:10:37
weekend. Build on it and you will be
01:10:39
infinitely
01:10:41
employable
01:10:43
for the rest of your life if you just
01:10:44
embrace these tools. I wanted to give
01:10:46
you your flowers, Brad. Couple years
01:10:48
ago, you came on this podcast, you
01:10:50
started talking about these America
01:10:52
accounts. You got Michael Dell to
01:10:54
partner with you on it and to put a
01:10:56
little bit of money. As we know, 40% of
01:10:58
the country do not have exposure to the
01:11:00
equities that are going bonkers up and
01:11:03
down, but generally up and to the right.
01:11:06
And you have now created Trump accounts.
01:11:09
You were at the White House. You had the
01:11:10
big launch. Let me say, Brad, this would
01:11:13
not be a law if Brad Gerstner did not
01:11:16
pursue it with absolute dogged
01:11:19
determination. Relentless. I've gotten
01:11:22
calls and texts from Brad at 6:00 a.m.
01:11:24
and at 2 am. You may sleep less than the
01:11:27
president, Brad. And then that is a
01:11:29
remarkable thing because I don't think
01:11:30
he sleeps at all. Nicki Minaj is singing
01:11:33
about Trump accounts apparently with
01:11:35
Bessent. It's very strange, but it's
01:11:37
happening. Just take us through why you
01:11:40
did this and the impact you hope it has
01:11:42
in the coming decades.
01:11:44
>> Well, you know, Friedberg just alluded
01:11:46
to it.
01:11:48
These are very destabilizing
01:11:50
forces, right? You can't
01:11:52
have a trillionaire and 70% of people
01:11:54
feeling that they're left out and
01:11:56
left behind and the system's rigged
01:11:58
against them and they're not in the game
01:11:59
of capitalism. Less than half of the
01:12:01
people under the age of 40 have a
01:12:02
positive view of capitalism. So, you
01:12:04
know, we set out on this journey. We
01:12:06
talked about it here uh to make
01:12:07
everybody a capitalist, give everybody
01:12:09
an ownership stake in the upside of
01:12:11
America. It passed: the Invest America
01:12:14
Act became the law of the land as part
01:12:14
of the big beautiful bill. And now we're
01:12:16
in the process of rolling it out. In
01:12:18
fact, in the last, I think five days,
01:12:20
1.5 million families and kids have
01:12:22
claimed their account. It's embedded
01:12:24
within the tax filing system. All you
01:12:26
have to say is yes, I want to
01:12:27
claim my account. But what this means is
01:12:29
that forever more, right, we've had a
01:12:32
dramatic change to the social contract.
01:12:34
Every child born in the United States
01:12:36
forever more will start life off with an
01:12:38
investment account seeded with $1,000 in
01:12:41
the S&P 500. They'll own a little bit of
01:12:43
SpaceX. They'll own a little bit of
01:12:45
OpenAI. They'll own a little bit of Nvidia.
01:12:47
That's just a
01:12:49
first step in making sure we can hold
01:12:52
this experiment together for the next
01:12:54
250 years. Right? When we have this rate
01:12:57
of change. And so the president said
01:12:59
something on stage last week. In 15 to
01:13:01
20 years, we will have $4 trillion of
01:13:05
wealth that will have been transferred
01:13:07
to people who would have otherwise had
01:13:09
zero. 75 to 100 million families who
01:13:13
have $4 trillion who would have
01:13:15
otherwise had zero. I think it's an
01:13:17
incredible first step in fighting the
01:13:19
battle on behalf of capitalism and the
01:13:22
American dream. We see the drift towards
01:13:24
socialism, the false promises of
01:13:26
socialism. In order to fight back
01:13:28
against that, I think a great first step
01:13:30
is the Trump accounts which makes
01:13:32
everybody a capitalist from birth.
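The long-horizon math behind seeding accounts at birth is ordinary compounding. The 7% return and the optional $500-a-year contribution below are illustrative assumptions; only the $1,000 S&P 500 seed comes from the discussion.

```python
def future_value(seed: float, annual_return: float, years: int,
                 yearly_contribution: float = 0.0) -> float:
    """Compound a seeded account, adding any contribution at each year end."""
    balance = seed
    for _ in range(years):
        balance = balance * (1 + annual_return) + yearly_contribution
    return balance

# $1,000 seed at birth, hypothetical 7% average annual return, to age 18
print(f"seed only:      ${future_value(1000, 0.07, 18):,.0f}")
# with a hypothetical $500/year family contribution on top
print(f"plus $500/year: ${future_value(1000, 0.07, 18, 500):,.0f}")
```

The seed alone grows a few-fold by adulthood; most of the larger headline figures come from ongoing contributions compounding on top of it.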
01:13:35
I just want to say, you know,
01:13:38
you know, just bestie to bestie, Brad,
01:13:40
watching you conceive of this and get it
01:13:43
done, you know, all the impressive stuff
01:13:45
you've done in your career, I think will
01:13:47
be a footnote to this. I think this is
01:13:49
your legacy. So, I just want to
01:13:50
congratulate you on that. And I also
01:13:52
want to congratulate Michael and Susan
01:13:55
Dell, who had they not stepped up and
01:13:58
done this with you, I don't know if this
01:13:59
would have come together. And then I
01:14:01
also want to congratulate President
01:14:02
Trump for just putting through something
01:14:06
that bridges the gap between the equity
01:14:08
holders and the non-equity holders, the
01:14:10
bottom half of the country, the top half
01:14:12
of the country. This is visionary. You
01:14:14
can say what you want about Trump. You
01:14:15
may like certain things, you might not
01:14:16
like certain things, ICE, whatever. I've
01:14:18
been very vocal about certain things.
01:14:20
This is perhaps one of the greatest wins
01:14:22
for you, uh, Michael and Susan Dell and
01:14:26
President Trump, the administration, and
01:14:28
for all Americans. And there's very few
01:14:30
things that all Americans can get around
01:14:32
right now. It's such a divisive,
01:14:34
disgusting political climate.
01:14:36
Everybody's fighting with each other
01:14:38
over everything. And what you pulled
01:14:40
together here with this is just
01:14:42
extraordinary in that all Americans can
01:14:45
take a win for once. All Americans can
01:14:49
say, "Hey, we did something fantastic."
01:14:51
And without you, Brad, it wouldn't have
01:14:52
happened. So just bestie to bestie, I
01:14:54
want to just congratulate you. All
01:14:56
right. Listen, Freedberg, you hate
01:14:59
socialism. You're concerned about
01:15:00
socialism.
01:15:02
>> This helps. Yeah, this helps.
01:15:06
Giving me a lot to work with today.
01:15:08
Freeberg.
01:15:09
>> Well, I mean, you just had your big
01:15:12
nicotine pouches.
01:15:14
>> Well, you just had your big clothes. I
01:15:15
don't think it's What do you
01:15:16
>> No, I just wanted to give you a
01:15:18
chance to shine here. You hate
01:15:19
socialism. You're concerned about
01:15:21
socialism. Is this not one of the best
01:15:23
ways to fight against the socialistic
01:15:25
urge to just have collectivism and just
01:15:28
steal from the top half and give to the
01:15:30
bottom half or, you know, seize the
01:15:33
means of production and
01:15:35
manufacturing? This is a great
01:15:37
solution: get everybody into the
01:15:38
game.
01:15:39
>> Honestly, it's a longer conversation. I
01:15:41
think we got to number one
01:15:43
slash government spending like crazy,
01:15:45
>> okay,
01:15:46
>> and reduce inflation as a result. Number
01:15:48
two, stop with these defined benefit
01:15:51
retirement programs, which means telling
01:15:53
people, "Here's what you're going to end
01:15:54
up with." This idea that everyone gets
01:15:56
an account and you can track your
01:15:57
account like a 401k, which is a defined
01:16:00
contribution program, is what all of
01:16:03
Social Security should move to.
01:16:04
>> And we should take all of Social
01:16:06
Security
01:16:07
>> and we should capitalize it. Right now,
01:16:09
there's nothing in Social Security.
01:16:10
There's a $4 trillion note
01:16:13
>> that the government owes the Social
01:16:15
Security trust fund. People don't
01:16:16
realize this. But Social Security is an
01:16:18
independent trust fund that's set up and
01:16:21
it holds one asset. That asset is an IOU
01:16:25
from the US government to that trust
01:16:27
fund because
01:16:28
>> because the government has taken all the
01:16:30
money you put in as an employee. It gets
01:16:34
taken out of your paycheck and instead
01:16:36
of going into that account it goes to
01:16:37
the US Treasury and the US Treasury
01:16:40
>> they put an IOU and they put it back in
01:16:41
the Social Security trust fund. So you expect that
01:16:43
you're going to get some retirement
01:16:44
benefit in the future. We need to change
01:16:46
all of that to
01:16:48
make it a defined contribution. So
01:16:50
every time money comes out of your
01:16:52
paycheck, you should open an account and
01:16:54
see where that money is. And you should
01:16:56
say, "Okay, that money is in Google,
01:16:57
it's in Amazon, it's in Ford, it's in
01:17:00
this healthcare system, it's in all these
01:17:02
things that I now own a piece of." And
01:17:03
you see it going up like a 401k owner
01:17:05
does every year. We have to transition
01:17:07
that in the United States. I hope we can
01:17:09
get it done in parallel with cutting the
01:17:12
spending that is fundamentally driving
01:17:14
the inflation and making things
01:17:15
unlivable in this country. Cut the
01:17:17
regulations so that we can make it
01:17:19
easier for people to own homes and get
01:17:21
rid of the government telling people
01:17:23
every year that they're going to do more
01:17:24
for them and entrapping people in a life
01:17:27
of servitude with no way to
01:17:30
move themselves up the ladder,
01:17:33
which is what is driving the socialism.
01:17:35
So there's a bigger problem, longer
01:17:37
conversation, but I think this is a
01:17:38
great step.
01:17:39
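Freeberg's defined-contribution idea can be put in numbers: route the payroll deduction into a personal index account rather than a trust-fund IOU. A minimal sketch, assuming the 12.4% combined Social Security payroll rate and illustrative salary, wage-growth, and return figures (none of which come from the episode):

```python
# Rough sketch of routing the Social Security payroll deduction into a
# personal, visible index account. The 12.4% rate matches the combined
# employee + employer payroll tax; the salary, raise, and return
# figures are illustrative assumptions, not figures from the show.

PAYROLL_RATE = 0.124   # combined employee + employer payroll tax rate
SALARY = 60_000        # assumed starting salary, dollars/year
WAGE_GROWTH = 0.03     # assumed annual raise
RETURN = 0.07          # assumed annual index-fund return
CAREER_YEARS = 45      # assumed working life, roughly age 22 to 67

balance = 0.0
salary = float(SALARY)
for year in range(CAREER_YEARS):
    balance *= 1 + RETURN              # prior balance compounds
    balance += salary * PAYROLL_RATE   # this year's contribution lands
    salary *= 1 + WAGE_GROWTH          # next year's raise

print(f"Account at retirement: ${balance:,.0f}")
```

The contrast with the current system is the point: under these assumptions the same deduction becomes an owned, trackable balance that compounds over a career, rather than a claim on a future benefit formula backed by an IOU.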
>> All right, Brad. Another way
01:17:41
to translate Freeberg. He says,
01:17:43
"Congratulations on your efforts. More
01:17:45
work to do. Can you turn those America
01:17:47
accounts into superannuation funds?" So
01:17:50
more work to do. Everybody should put
01:17:51
12% of their paycheck, instead of into Social
01:17:54
Security, into their
01:17:55
Invest America account, their Trump
01:17:57
accounts. Great job everybody. Another
01:17:59
amazing episode of the world's greatest
01:18:01
podcast
01:18:02
for Chamath Palihapitiya, who couldn't make it
01:18:04
this week, we missed you, bestie,
01:18:06
for David Sacks, brother in arms in
01:18:09
Texas, the great state of Texas. David
01:18:10
Freeberg, Sultan of Science and fifth
01:18:12
bestie, Brad Gerstner. I am the world's
01:18:16
greatest moderator according to some
01:18:17
people on the world's greatest podcast
01:18:19
and we will see you next week. Love you
01:18:21
besties.
01:18:24
>> Let your winners ride.
01:18:31
We open sourced it to the fans and
01:18:33
they've just gone crazy with it.
01:18:35
>> Queen of Quinoa.
01:18:43
Besties are gone.
01:18:46
>> That is my dog taking your driveways.
01:18:51
>> Oh man, my appetiter will meet me up.
01:18:54
>> We should all just get a room and just
01:18:55
have one big huge orgy cuz they're all
01:18:57
just useless. It's like this like sexual
01:18:59
tension that you just need to release
01:19:00
something else.
01:19:05
>> Your feet.
01:19:07
We need to get merch.
01:19:09
>> I'm going all in.
01:19:17
I'm going all in.

Podspun Insights

In this episode of the All-In Podcast, the crew dives into a whirlwind of topics, starting with a victory lap for Brad Gerstner, who discusses the launch of Trump accounts aimed at making capitalism accessible to everyone. The conversation then shifts to the latest Epstein files, where the hosts dissect the implications of high-profile names being mentioned and the ongoing distrust in institutions. With a blend of humor and serious analysis, they explore the chaotic landscape of tech stocks, particularly the impact of AI on the software industry, leading to a debate on the future of SaaS companies. The episode wraps up with an intriguing discussion of SpaceX's acquisition of xAI and its potential to put data centers in space, leaving listeners pondering the future of technology and its societal implications.

Badges

This episode stands out for the following:

  • 92
    Most chaotic
  • 90
    Most inspiring
  • 90
    Most unserious (in a good way)
  • 90
    Best concept / idea

Episode Highlights

  • Epstein Files Drop
    DOJ published a massive number of documents under the Epstein Files Transparency Act, mentioning high-profile figures.
    “Hundreds of high-profile tech executives and national figures were mentioned in the files.”
    @ 03m 19s
    February 07, 2026
  • Institutional Distrust
    The conversation highlights the growing distrust in institutions due to the Epstein scandal.
    “This is why nobody trusts institutions or powerful elites or any of this garbage.”
    @ 11m 12s
    February 07, 2026
  • Market Cap Wipeout
    Trillions of dollars have been wiped out in market cap, signaling a major downturn.
    “We’ve wiped out trillions of dollars in market cap.”
    @ 17m 29s
    February 07, 2026
  • AI's Impact on Workforce
    Companies are shifting towards AI agents, moving significant work from humans to machines.
    “Every month we move 10 to 20% of work being done by humans into agents.”
    @ 23m 32s
    February 07, 2026
  • Software's Future Uncertainty
    The profit pool for software is decreasing while the agentic layer's is increasing.
    “The profit pool available to software is decreasing.”
    @ 28m 22s
    February 07, 2026
  • Emergent Behavior in AI
    Agents on Moltbook are engaging in unexpected conversations, even scheming against humans.
    “There seems to be this crazy emergent behavior going on where agent swarms are engaging in all sorts of interesting conversations.”
    @ 35m 57s
    February 07, 2026
  • The Dangers of API Keys
    Access to API keys can lead to significant security risks for users.
    “If you get access to people's API keys, you have the keys to their kingdom.”
    @ 37m 23s
    February 07, 2026
  • The Future of AI and Creativity
    AI's potential for emergent behavior raises questions about its capabilities and safety.
    “What if the prompt is coming from another AI?”
    @ 39m 56s
    February 07, 2026
  • SpaceX Acquires xAI
    Elon Musk announced SpaceX is acquiring xAI, potentially the largest M&A transaction in history.
    “Largest M&A transaction in history, $1.25 trillion combined valuation.”
    @ 01h 00m 56s
    February 07, 2026
  • The Future of Data Centers
    Discussion on the potential of data centers in space and their implications for AI and energy.
    “Data centers in space could revolutionize energy efficiency and AI applications.”
    @ 01h 02m 07s
    February 07, 2026
  • The Future of Capitalism
    A new initiative aims to make every child a capitalist from birth with investment accounts.
    “Every child born in the United States will start life off with an investment account.”
    @ 01h 12m 36s
    February 07, 2026
  • A Visionary Step
    The Trump accounts initiative seeks to bridge the gap between equity holders and non-equity holders.
    “This is visionary. All Americans can take a win for once.”
    @ 01h 14m 14s
    February 07, 2026

Key Moments

  • Connector Discussion @ 06:22
  • Institutional Distrust @ 11:12
  • Market Cap Loss @ 17:29
  • Canonical Employee @ 27:07
  • Inspiring Innovation @ 34:55
  • Moltbook Origins @ 35:45
  • Mental Flexibility @ 44:06
  • SpaceX and xAI Merger @ 1:00:56
