
MAGA Civil War Erupts Over Iran | Pivot

March 20, 2026 / 01:14:15

This episode of Pivot covers topics such as masculinity, introspection, the metaverse, and the current state of oil prices. Hosts Kara Swisher and Scott Galloway discuss their experiences at South by Southwest, including a notable moment involving Representative Talarico's thoughts on masculinity.

They critique Marc Andreessen's views on introspection, arguing that it is essential for personal growth and societal progress. The conversation touches on historical figures like Socrates and Einstein, emphasizing that introspection has been valued throughout history.

The hosts then shift to discuss the metaverse, announcing Meta's decision to shut down its VR platform Horizon Worlds, which struggled to attract users. They reflect on their skepticism about the metaverse and its potential impact on society.

In the latter part of the episode, they address rising oil prices due to geopolitical tensions, the implications for the economy, and the challenges faced by the Trump administration in managing these issues.

Finally, they touch on OpenAI's recent strategic shifts and the challenges it faces in the competitive AI landscape, concluding with predictions about the future of various tech companies.

TL;DR

The episode discusses masculinity, introspection, the metaverse's decline, and rising oil prices, featuring insights from Kara Swisher and Scott Galloway.

Video

00:00:00
I predict this mega micro penis war is
00:00:03
going to get worse and I am here for it.
00:00:12
>> Hi everyone, this is Pivot from New York
00:00:13
Magazine and the Vox Media Podcast
00:00:15
Network. I'm Kara Swisher
00:00:16
>> and I'm Scott Galloway.
00:00:18
>> Scott, have you recovered from South by
00:00:19
Southwest yet?
00:00:21
>> I'm still basking in the glow. I thought
00:00:22
it was great.
00:00:23
>> It was good. Yeah, I had a good time in
00:00:25
the end. It was really fun. Um, what was
00:00:27
your favorite highlight of it?
00:00:29
>> My favorite highlight, um,
00:00:31
>> besides our time together,
00:00:32
>> the way Representative Talarico
00:00:36
described masculinity and he talks about
00:00:39
his father, I guess, used to come home
00:00:40
on Sunday and immediately change and
00:00:43
then mow their lawn and then without
00:00:46
ever talking about it, just went next
00:00:48
door and mowed the lawn of this old
00:00:49
lady's, and he described
00:00:53
>> that really struck you. You mentioned it
00:00:54
at the show that way. Did you did Did
00:00:56
you ever mow people's lawns, Scott?
00:00:57
>> I used to mow lawns for money, but I I
00:01:00
didn't do it. I showed up to the house
00:01:02
before I mowed a lawn and said, "Hey,
00:01:04
seven bucks in Ohio and I'll mow your
00:01:06
lawn."
00:01:07
>> Oh, wow.
00:01:08
>> And I had a manual lawn mower and I was
00:01:10
all of about 120 lbs pushing a manual
00:01:12
lawn mower around.
00:01:13
>> So you you were not taught to mow
00:01:15
people's lawns?
00:01:16
>> No, I was taught to make money. I My dad
00:01:18
was like, "Go make money." My my son got
00:01:21
a um a job at a taco truck last summer.
00:01:26
>> It was so good for him. Um you know, so
00:01:28
many Anyways, I think chores chores,
00:01:32
jobs, and sports. I mean, anyways, and
00:01:36
>> anyway, the reason I'm asking is cuz
00:01:38
it's about introspection. Are you
00:01:39
introspecting? Well, I've gotten I I've
00:01:41
gotten much more introspective just to
00:01:43
today I'm here in Tulum and I've had
00:01:45
some time to
00:01:47
>> really contemplate and I've decided to
00:01:49
it's time for me at this age, Cara. It's
00:01:51
time for me Life is finite. It's time
00:01:53
for me to start to start living my
00:01:54
dream. So, I'm going to start showing up
00:01:56
for tests I'm not prepared for naked.
00:02:01
>> Well, as always, so you're in the Marc
00:02:03
Andreessen school. Have you heard about this
00:02:05
situation? Marc, let me just say what it
00:02:07
is. Marc Andreessen, who I really don't like
00:02:10
anymore. I didn't like that much then,
00:02:11
but he's really become such a troll. He
00:02:14
said uh on the he's a famous he
00:02:17
was part of the Netscape browser thing.
00:02:20
I wouldn't say he was the only person.
00:02:21
He did take a lot of credit. Um but uh
00:02:24
important entrepreneur in Silicon Valley
00:02:26
um and etc. Now he's a venture
00:02:28
capitalist. On the founders podcast with
00:02:30
David Senra, he said, "My goal is zero
00:02:33
introspection as little as possible. 400
00:02:35
years ago it would never occur to
00:02:36
anybody to be introspective like the
00:02:38
whole idea I mean just all the modern
00:02:40
conceptions around introspection and
00:02:42
therapy and all the things that kind of
00:02:43
result uh from that are kind of
00:02:45
manufactured, 1910s, 1920s." Uh, this is very
00:02:49
much in line with him being an expert on
00:02:51
everything. He used to lecture me about
00:02:52
things he knew nothing about a lot. Um
00:02:55
it all it says to me is this man is in
00:02:58
desperate need of therapy. Um to you
00:03:01
know he's just trying to be like I don't
00:03:02
think about anything. Um, and I find it
00:03:05
I found it very uh dystopian and I find
00:03:08
him dystopian in general. Um, but this
00:03:11
idea that introspection is a weakness
00:03:13
again is not masculine. It's not
00:03:15
feminine. It's not human, I think, in
00:03:18
some way.
00:03:19
>> Yeah. I think it's important to
00:03:20
occasionally,
00:03:22
you know, do do some sort of I don't
00:03:25
know pondering. I, you know, ask
00:03:26
yourself ask yourself if you could only
00:03:29
bring one thing to a desert island, what
00:03:30
would you bring? And I decided the
00:03:32
answer is I wouldn't go.
00:03:33
>> No.
00:03:34
>> Uh I need edibles streaming media and my
00:03:37
plane car.
00:03:38
>> Mhm.
00:03:38
>> Um no. Look, look, in all seriousness,
00:03:41
it's as if these guys
00:03:43
>> Mhm.
00:03:43
>> Altman and Andreessen hired a publicist, the
00:03:46
brightest comms person in the world, and
00:03:48
said, "How do we convince humanity we're
00:03:50
bad for humanity?"
00:03:52
>> Right. Right. And this notion that
00:03:54
technology requires less energy to get
00:03:56
to a point of critical thinking than a
00:03:58
human is just so nihilist and so weird.
00:04:02
And then introspection is how we move
00:04:05
forward as a species.
00:04:05
>> It was like Socrates, Plato, Marcus
00:04:08
Aurelius, like it's been around. He's
00:04:10
like, "Oh, it's just the 1910s." He's so
00:04:12
ignorant. Like the idea introspection is
00:04:15
a critical element of all philosophy
00:04:17
going back. Also, by the way, Jesus, the
00:04:21
Bible. It's all about thinking about
00:04:24
>> reflecting on how you become a better
00:04:27
introspection is why we have the
00:04:28
Marshall Plan and why people reconnect
00:04:30
with their family members. Introspection
00:04:32
is how you
00:04:33
>> try to become a better person and
00:04:35
realize the errors of your actions and
00:04:38
that your actions have ramifications and
00:04:40
what can you do to be leave the world a
00:04:43
better place. It
00:04:44
>> Yeah.
00:04:44
>> And it's it's indicative again of this
00:04:47
far-right
00:04:48
performative. I won't even call it
00:04:50
masculinity, but macho that I don't
00:04:52
care. I just plow ahead because I'm such
00:04:53
a baller.
00:04:54
>> Yeah.
00:04:55
>> It's just so It's just like, okay,
00:04:56
>> it's crazy. I I I I don't like my
00:04:59
grandmother didn't introspect a lot. She
00:05:00
grew up in the depression, right? She
00:05:02
didn't wonder if she was happy. I think
00:05:03
she was probably could have been
00:05:05
happier, right? That kind of thing. And
00:05:06
there's there's an element to that. But
00:05:08
this idea that this the idea of
00:05:10
thoughtfulness has not been around since
00:05:12
the dawn of time drives me
00:05:14
crazy. The second thing is look this guy
00:05:16
very famously doesn't speak to his
00:05:19
family right like there's all manner of
00:05:22
upness that is buried very deep
00:05:24
in this particular person who has
00:05:26
influence on other who has massive
00:05:28
influence on everybody else and you know
00:05:31
he's like a he's an emotional uh I don't
00:05:34
want to use you know he's just a tiny
00:05:36
little man from a from a soul point of
00:05:39
view like extraordinarily small um and I
00:05:42
find it really just bragging about it is
00:05:45
is the last is you know he's the one
00:05:47
that said we should fight more like we
00:05:49
should physically fight like as if he
00:05:51
could get in a fight with anybody he'd
00:05:52
lose in a second but this it's just they
00:05:55
just you're right they they're trying to
00:05:56
be villains or something by the way the
00:05:59
the main villain in the Marvel movies is
00:06:02
quite introspective FYI but go ahead
00:06:05
>> well I would argue that probably I mean
00:06:09
people would say the greatest mind of
00:06:10
the 20th century is Einstein but they
00:06:12
should take a page from
00:06:14
the playbook of the greatest
00:06:16
>> arguably the greatest technologist
00:06:18
>> of the 20th century. And that was
00:06:20
someone who not only had a vision for
00:06:21
technology but but could bring together
00:06:22
people to what was at that moment
00:06:26
develop and deploy the most important
00:06:27
technology in history or at least the
00:06:29
most profound, and that was Oppenheimer, and
00:06:31
he was hugely introspective.
00:06:32
>> So was Einstein. If you read some of
00:06:34
this,
00:06:34
>> they were hugely introspective. They
00:06:36
were they were really worried about what
00:06:39
about the the ramifications of their
00:06:42
actions and how they could spend the
00:06:44
rest of their lives trying to, you know,
00:06:48
they didn't just say the the in
00:06:50
introspection isn't some AI guy who
00:06:52
vests his shares and then scares the
00:06:54
out of the world as he peaces out
00:06:56
to the cotzora. That is not
00:06:58
introspection. Mhm.
00:06:59
>> Uh Bill Gates for all the Bill
00:07:01
Gates is getting and a lot of it is
00:07:02
warranted.
00:07:03
>> He is a he decided I have become the
00:07:06
wealthiest person in the world at that
00:07:08
moment.
00:07:09
>> I am smart. What could I do with my
00:07:12
resources to impact millions of people?
00:07:14
And he started distributing.
00:07:15
>> He decided I think I can stop malaria in
00:07:18
a continent.
00:07:19
>> That is introspection.
00:07:21
>> Yeah. Anyway, it's it's it's led to a
00:07:23
lot of very funny memes, you know,
00:07:25
Marcus Andronicus and then nothing. It's
00:07:28
called nothing. What a soulless empty
00:07:31
person. And these are not where we
00:07:32
should be getting clues as we go
00:07:34
forward. That's just my feeling. And I
00:07:36
think one of the more damaging figures
00:07:38
uh from a from in terms of training
00:07:40
young men uh at Silicon Valley is this
00:07:43
guy. He's not someone to follow. Let me
00:07:45
just say I've known him since he was
00:07:46
very young and he and he's progressed
00:07:49
negatively and and backwardly in a way
00:07:51
that's really quite depressing. Uh oddly
00:07:54
enough in in in relation and then we'll
00:07:56
finish up on this. I had lunch at South
00:07:58
by Southwest with Mark Cuban. What a
00:08:01
person who has developed in a really good way. He
00:08:02
was telling me all about his Cost Plus,
00:08:04
the passion around it.
00:08:06
>> Um I just was like,
00:08:07
>> he also looks great, by the way.
00:08:08
>> He looks great. Yeah, he's eating clams.
00:08:10
That's another story.
00:08:11
>> Jesus, did he bore you with that story?
00:08:13
I had to suffer through that. He buys on
00:08:15
Amazon.
00:08:16
>> Yeah, I Let's not get into it. We We'll
00:08:18
have him on to talk about it at some
00:08:19
point. He's trying to get protein.
00:08:21
>> If oysters means GLP1, I believe it. I
00:08:24
seriously I met with him and Michael
00:08:26
Dell and they're both claiming that
00:08:27
they're playing a lot of padel and I'm
00:08:30
like your old you could eat you right
00:08:32
now. Padel my ass.
00:08:34
>> Padel. Anyway, they um I just was like
00:08:37
I had a wonderful talk about uh
00:08:39
prescription drugs about life about his
00:08:41
kids. Like what a It was such a
00:08:43
difference like he is the opposite of of
00:08:46
like a good man trying to add value.
00:08:48
>> Yeah, exactly. Anyway, uh we have to
00:08:50
move on but Mark honestly stop, all of
00:08:53
you. Alex Karp said a number of things
00:08:55
stupid things like stop talking all of
00:08:57
you stop talking because what you say is
00:09:00
nonsensical and actually makes you look
00:09:02
so stupid and pathetic that it's I'm
00:09:05
just here to help you on that issue
00:09:07
anyway
00:09:08
>> but look even if you're religious you're
00:09:11
supposed to reflect on some of it is
00:09:13
somewhere Jesus
00:09:16
>> if Jesus could feed the world with two
00:09:17
fishes and a loaf you know if you really
00:09:20
think about it that's tapas
00:09:24
I mean, peace out. Peace out.
00:09:26
>> All right, let's move on. I can't do any
00:09:28
better than that. I can't do any better
00:09:29
than that.
00:09:29
>> Okay. Um, as of this recording, oil
00:09:32
prices in the morning,
00:09:34
>> that's a segue.
00:09:35
>> That's a segue in the in a more real
00:09:37
situation. Uh, the price of oil was over $119
00:09:40
a barrel at one point following attacks
00:09:42
on energy sites in the Gulf. President
00:09:44
Trump has been lashing out at US allies
00:09:46
this week demanding they send warships
00:09:48
to help secure the Strait of Hormuz. Uh
00:09:51
the response has been quote global
00:09:53
raspberry as one analyst put it. We're
00:09:55
seeing also the first resignation over
00:09:57
this war. Counterterrorism official Joe
00:09:59
Kent stepped down saying Iran posed no
00:10:01
imminent threat. Of course he went on
00:10:03
Tucker Carlson. He's of that ilk. He
00:10:05
has some he has some problems himself.
00:10:07
But nonetheless, he quickly went on
00:10:09
Tucker Carlson to discuss his departure
00:10:11
because this is like the train, the
00:10:12
right-wing MAGA train if you're going in
00:10:15
one direction. Meanwhile, the Pentagon
00:10:17
is asking the White House to approve
00:10:18
a $200 billion request for Congress to
00:10:21
fund the Iran war. I think of an entire
00:10:24
I think Joe Biden's was 180 billion for
00:10:27
like years long wars or whatever.
00:10:29
Defense Secretary Pete Hegseth just
00:10:30
said in a briefing that the number could
00:10:32
move because it takes money to kill bad
00:10:34
guys. Speaking of introspection, what an
00:10:36
idiot. Um, uh, the next move, what
00:10:40
happens here? Trump also said this week,
00:10:41
by the way, that a former president told
00:10:43
him he regretted not bombing them, but
00:10:45
all the former living presidents denied
00:10:47
saying that. So, I guess he's talking to
00:10:48
himself. I mean, they've denied it.
00:10:51
Like, he's such a liar. It's astonishing
00:10:54
like what this guy does. Um, obviously
00:10:57
either cognitive decline or just a liar. I'm not
00:10:59
sure. Where do you think what do you
00:11:00
think's happening here now? Let's have a
00:11:02
quick update. Well, I mean, the mother
00:11:05
of all understatements is it's
00:11:06
complicated. Look, I think the fatal
00:11:09
flaw of the Trump administration is they
00:11:11
don't recognize our power as a species
00:11:12
and as a country. And that is as
00:11:14
powerful as we are, we're only a third
00:11:16
of the world's GDP, but because we were
00:11:18
seen as the good guys and innovators,
00:11:20
and that we did
00:11:22
embrace this notion that if we can make
00:11:24
you wealthier and more peaceful,
00:11:26
ultimately that wealth and peace will
00:11:28
return home in the form of you buying
00:11:30
our trucks and being our ally. And we
00:11:32
can put a military base there. And the
00:11:34
operating system of 60 or 70% of the
00:11:38
world was US laws, military flows of
00:11:41
energy, general rule of law, even
00:11:43
democracies, even laws and justice
00:11:45
systems were based off the US model. And
00:11:48
to a lesser extent, the British
00:11:50
model; it just evolved. We were sort
00:11:52
of 2.0. And he's decided, no, with 30%,
00:11:55
I can go at it alone. And what he's
00:11:57
found is all of a sudden he's 1/3 versus
00:12:00
2/3. And this is just like when we, you
00:12:04
know, warn my son not to take
00:12:05
grapefruit juice into the to the living
00:12:07
room with the brand new couch and he
00:12:09
tells us, "Don't be an idiot. I can
00:12:11
handle it." And then he screams, "Dad, I
00:12:13
need help." And I know exactly what's
00:12:14
happened. It's just
00:12:15
>> Yeah, he spilled the grape juice.
00:12:15
>> Well, what do you know? We're going to
00:12:17
I'm going to do this unilaterally. I'm
00:12:19
not going to go to the UN. I mean, talk
00:12:22
>> Gulf War One. George Bush put together a
00:12:25
coalition of I think 31 countries. He
00:12:27
got UN authorization and he got the
00:12:29
allies to pay $62 billion of the $70 billion
00:12:32
that war cost.
00:12:33
>> and at great and at great sacrifice for
00:12:35
many of them and of course he's been
00:12:36
downplaying their sacrifice and they're
00:12:38
now like literally saying no and by the
00:12:41
way and if you want to come out of NATO fine
00:12:44
like they're now at that point I mean
00:12:45
you know whatever
00:12:47
>> he asked China for help and by the way
00:12:50
>> China's ships are flowing through. So
00:12:52
the notion that he's going to quote
00:12:54
unquote an enemy or nemesis going to
00:12:56
people he's been really rude to. I mean
00:12:59
this is just and they didn't anticipate
00:13:00
that they wouldn't be able to count on
00:13:02
their allies.
00:13:04
I I
00:13:05
>> didn't anticipate the Iran push back the
00:13:07
strength of the I mean he was advised by
00:13:10
by the way pretty much stories coming
00:13:12
out now are like he was told this he was
00:13:14
told this he was told they would do this
00:13:16
they would close the Strait of Hormuz.
00:13:18
like everyone's leaking their the
00:13:20
out of things which is really I mean
00:13:22
what's interesting I I know it's the
00:13:24
smallest part of it but the lie about
00:13:26
presidents was weird was just weird like
00:13:30
why would you say that and then they all
00:13:32
say no and it looks like he's it's a lie
00:13:35
or he's talking to himself or whatever
00:13:38
it the whole thing seems like lies come
00:13:41
out of his mouth every day now that are
00:13:42
easily checkable like easily checkable
00:13:45
lies and that don't really work and so
00:13:47
Something's going on. Something's
00:13:49
happening in a way that's I mean, I
00:13:52
don't want to give him an excuse. Maybe
00:13:53
he's just a malevolent prick, but it
00:13:55
seems problematic that he's leading this
00:13:58
coalition of the one.
00:13:59
>> When you hire incompetent conspiracy
00:14:02
theorists, which is what Joe Kent is.
00:14:04
>> Mhm.
00:14:05
>> I mean, this is very upsetting for me as
00:14:07
someone, you know, quite frankly, as a
00:14:09
Jew, and that is
00:14:10
>> he immediately said that
00:14:12
>> that basically the largest military in
00:14:14
the world in the United States is being
00:14:16
manipulated by Jews. Mhm.
00:14:18
>> And this just plays into a very
00:14:19
antisemitic trope being fomented on
00:14:21
the far right. And I don't I don't Megyn
00:14:24
Kelly, there's a bunch of them, right?
00:14:26
>> Yeah. That this is all be this is all
00:14:27
Jews fault.
00:14:28
>> Yeah.
00:14:29
>> That is just what they were doing.
00:14:30
>> That's not helpful.
00:14:32
>> Yeah.
00:14:32
>> So,
00:14:33
>> it's one thing to be against the war and
00:14:35
I think there's some legitimate like
00:14:37
>> or to say has too much influence over. I
00:14:40
get it.
00:14:40
>> No, I get it. No, I'm just saying the
00:14:42
America First people can say we don't
00:14:43
like wars and but they do always take it
00:14:45
right into that. That was that that
00:14:48
Tucker Carlson's a dangerous person in
00:14:50
that regard, I'll tell you. Like
00:14:52
>> they're doing a video. They're like the
00:14:54
number whatever five at the,
00:14:56
>> you know, in our intelligence unit is is
00:14:59
saying what what Candace Owens and
00:15:01
Tucker Carlson are saying. It's the
00:15:02
Jews. We're being manipulated by the
00:15:05
Jews.
00:15:05
>> Yeah. It's problematic. It's
00:15:08
problematic. So what So oil prices, what
00:15:10
what is what actual impact is this going
00:15:12
to have on the economy as a whole? It's
00:15:14
immediate. I mean, unfortunately, and it
00:15:17
always happens. It hurts. It hurts uh
00:15:20
middle-income families and lower income.
00:15:22
Already, you're talking about an
00:15:24
increase for every dollar increase at
00:15:26
the pump. And it looks like we are going
00:15:27
to have about a dollar increase. It's
00:15:29
another $530 a year. And low-income
00:15:31
families spend almost, get this, 20% of
00:15:34
their income on home and auto energy
00:15:37
costs.
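The arithmetic behind those figures can be sanity-checked with a quick sketch. Note the assumptions are mine, not numbers stated on the show: roughly 530 gallons of gasoline per household per year (which is what makes a $1/gallon rise cost about $530), and a $30,000 illustrative income for the 20% energy-share figure.

```python
# Back-of-the-envelope check of the pump-price math quoted in the episode.
# ASSUMPTION (not stated on the show): a typical US household buys about
# 530 gallons of gasoline per year, so a $1/gallon rise costs ~$530/year.
GALLONS_PER_YEAR = 530        # assumed average household gasoline use
price_increase = 1.00         # $ per gallon, the rise discussed on the show

extra_cost = GALLONS_PER_YEAR * price_increase
print(f"Extra annual fuel cost: ${extra_cost:,.0f}")

# Share of income for a low-income household spending 20% of income on
# home and auto energy, per the figure cited in the episode.
income = 30_000               # assumed illustrative annual income
energy_share = 0.20
energy_spend = income * energy_share
print(f"Annual energy spend at 20% of ${income:,}: ${energy_spend:,.0f}")
```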
00:15:37
>> Yeah. And then the the residual effects
00:15:39
of food from everything. Every single
00:15:41
>> everything you touch is impacted.
00:15:43
everything got to you using some form of
00:15:45
fuel or is consuming fuel
00:15:48
>> and it's going to probably spike
00:15:49
inflation an additional 100 bps um in
00:15:53
the short run. So
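For anyone unfamiliar with the jargon: a basis point is one hundredth of a percentage point, so the roughly 100 bps spike mentioned here is one full percentage point of inflation. A minimal sketch (the 3.0% baseline is an illustrative assumption, not a figure from the show):

```python
def bps_to_pct_points(bps: float) -> float:
    """Convert basis points to percentage points (1 bp = 0.01 pct point)."""
    return bps / 100.0

# The ~100 bps spike mentioned on the show is one percentage point:
spike = bps_to_pct_points(100)

# Illustrative: inflation at an assumed 3.0% plus a 100 bps spike reads 4.0%.
baseline_inflation = 3.0
new_inflation = baseline_inflation + spike
print(f"{baseline_inflation}% + 100 bps = {new_inflation}%")
```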
00:15:55
>> speaking speaking of which Jerome Powell
00:15:57
says he'll stay on as Fed chair until
00:15:58
his successor is confirmed by the
00:16:00
Senate. Even if that's after his term
00:16:02
expires, which he has every right to do. It could
00:16:03
be a while. The Senate hasn't even
00:16:05
scheduled a hearing for Trump's nominee
00:16:07
Kevin Warsh. GOP Senator Thom Tillis, who
00:16:09
I'm talking to uh next week, says he
00:16:11
won't vote on confirmation until the DOJ
00:16:14
investigation on Powell is over. They've
00:16:16
they've been handed some um uh court
00:16:20
things, Jeanine Pirro and the rest around
00:16:23
Powell. Uh for his part, and they're
00:16:24
appealing it, I think for his part,
00:16:26
Powell also says he'll stay on as Fed
00:16:28
governor, which I said he would,
00:16:30
remember I said this until the
00:16:31
investigation is well and truly over. Um
00:16:34
this is ex I thought he would do this.
00:16:36
He looks like he ran out of a long
00:16:38
time ago. Um, and well and truly over
00:16:40
means he could stay quite a
00:16:42
while there on that Fed governor thing.
00:16:44
You know, as you noted many times,
00:16:46
enormous influence. So, he's the this is
00:16:49
the opposite of what Trump wanted and
00:16:51
he's stuck with Powell and Tillis, I can
00:16:53
tell you, is not giving up, I mean, is not
00:16:56
stopping at all at all. So
00:16:59
>> I think if it had been a different
00:17:01
president who'd demonstrated more grace
00:17:03
to him, I don't doubt he would have
00:17:05
stepped down or if he'd said to him,
00:17:07
"Listen, I want you to be my chief
00:17:09
economic advisor. I I you know, I have a
00:17:12
even something more important for you."
00:17:13
But keep in mind, as long as Jerome Powell
00:17:15
is in the room, I've said this. There's
00:17:18
there's how you think there's the
00:17:19
governance structure and then there's
00:17:21
actually how boards and body politic
00:17:24
works. And this is how this is
00:17:26
essentially a board of directors. This
00:17:28
is how they work.
00:17:29
>> There's a bunch of them
00:17:31
>> in every board. There's 12 people, and
00:17:33
there's two people who matter.
00:17:35
>> There's the largest shareholder, which
00:17:37
doesn't doesn't
00:17:39
uh apply here. And then there's someone
00:17:41
who's so smart that everyone
00:17:43
they don't speak a lot. They listen a
00:17:44
lot, but when they speak, everyone has a
00:17:46
tendency to nod their head. and that
00:17:49
tell me the whatever it is the other 11
00:17:52
governors are going to when when Jerome
00:17:54
Powell says, you know, whoever the
00:17:57
chair is the person who
00:17:58
>> well he's going to run it I don't think
00:18:00
I think Tillis isn't giving up. I know Tillis
00:18:02
isn't giving up. He said it. He's like he
00:18:04
Tillis now suddenly, as you said, found his
00:18:06
balls and he's like, no, I'm going to do
00:18:08
the right thing. He's very offended
00:18:11
by the Jerome Powell thing I know that
00:18:12
and so I think it's he's a business
00:18:15
person he's a really well he had a you
00:18:17
know he was
00:18:18
even though you know he sounds like he's
00:18:20
like from the country smart guy
00:18:21
>> very smart guy is very stuck on this
00:18:24
Powell, not putting uh Warsh through. Um
00:18:27
obviously he helped take down Christine
00:18:30
um I think there's there's such a push
00:18:32
back not just from our allies abroad but
00:18:34
here and if you're someone like Tom
00:18:36
Tillis and can stop this you do it like
00:18:39
why not what's what's the negative for
00:18:41
him there's nothing because he's now
00:18:43
because Trump tried to push him out
00:18:45
essentially of the Senate and now he's
00:18:47
in an enormous position of power and
00:18:50
influence the same thing. And so
00:18:52
Powell's Powell is not going to bring
00:18:53
rates down, by the way, especially with
00:18:55
inflation up. So Trump has gotten the
00:18:57
opposite of everything he wanted. So
00:19:00
>> no, Kalshi said there was a 99% likelihood
00:19:03
they would not cut rates. But where I
00:19:05
was headed was I would bet 98% of the
00:19:09
decisions in the Fed from the board of
00:19:12
governors, regardless of who's in
00:19:14
charge, regardless of who takes the mic,
00:19:17
uh the new chair, whatever Jerome Powell
00:19:20
said was probably the right move in that
00:19:22
meeting is what they're going to do.
00:19:24
>> Yeah. This is the guy that had a Mary Lou
00:19:29
Retton-like stick-the-landing of the
00:19:31
economy where he basically tamed
00:19:33
inflation
00:19:34
>> by 600 basis points while not going into
00:19:37
recession. like no one
00:19:39
>> in economics.
00:19:40
>> No,
00:19:41
>> you know,
00:19:43
and
00:19:44
>> I think Warsh is perfectly qualified,
00:19:45
but Trump now has Jerome Powell forever
00:19:47
like especially the dumb attack
00:19:49
>> another six or 12 years or something or
00:19:51
whatever. Basically,
00:19:53
>> he's going to stay there as the as the
00:19:55
head of it. It's just Anyway, it's it's
00:19:57
>> good for him.
00:19:58
>> Good for him.
00:19:59
>> I think he is I think he is literally the
00:20:01
first hero.
00:20:02
>> The first Medal of Freedom recipient.
00:20:04
>> Yeah, he's a hero. One, Democrats love
00:20:06
to show that they're bipartisan. It'll
00:20:08
probably be Vice President Spence will
00:20:10
be first, the first one, and the second
00:20:12
one will be Jerome Pal.
00:20:13
>> Vice President Spence. Who's
00:20:16
>> um Pence? I'm sorry.
00:20:17
>> Oh, Vice President Pence. Yeah. Oh,
00:20:19
that's a good idea. The two of them. Oh,
00:20:22
Pence does not get enough recognition.
00:20:26
>> Yes. Daddy. Dad. Father. I like father
00:20:29
at this point. Father is actually father
00:20:32
and Powell. No, that's that'll work for
00:20:34
me. Uh, okay, Scott, let's go on a quick
00:20:36
break. When we come back, we'll say
00:20:38
goodbye to the metaverse. We hardly knew
00:20:40
you at all.
00:20:43
>> Support for the show comes from
00:20:44
CoreWeave. AI isn't just a new tool. It
00:20:47
encompasses so much more. It's spurring
00:20:49
a revolution across all industries and
00:20:51
reshaping itself to become a big part of
00:20:52
our future together. CoreWeave is at
00:20:54
the center, powering some of the biggest
00:20:56
names in AI. As the essential cloud for
00:20:58
AI, CoreWeave provides an AI platform
00:21:00
that combines next generation
00:21:02
infrastructure, intelligent tools, and
00:21:03
expert support. It's powering the
00:21:05
world's most complex AI workloads faster
00:21:07
and more efficiently. From medical
00:21:09
research and diagnosis to education,
00:21:10
from complex visual effects for movies
00:21:12
to breakthroughs in science and
00:21:14
technology. If it's AI, CoreWeave is
00:21:16
uniquely ready to power with
00:21:17
purpose-built tech, the big ideas, the
00:21:19
wild visions, and what-ifs and why-nots.
00:21:22
CoreWeave is working to build what's
00:21:24
never been built before. CoreWeave is
00:21:26
the essential cloud for AI. Ready for
00:21:28
anything, ready for AI. To learn more
00:21:30
about how CoreWeave powers the world's best
00:21:32
AI, go to coreweave.com. Ready
00:21:34
for anything.
00:21:42
Support for this show comes from
00:21:43
MongoDB. If you're tired of database
00:21:46
limitations and architectures that break
00:21:48
when you scale, it's time to think
00:21:49
outside the rows and columns. Because
00:21:51
let's be honest, you didn't get to tech
00:21:53
to babysit a broken database. You got
00:21:55
into it to actually build something.
00:21:57
MongoDB lets you do that. It's flexible,
00:22:00
developer-first, ACID-compliant,
00:22:02
enterprise ready, and built for the AI
00:22:04
era. Say goodbye to bottlenecks and
00:22:05
legacy code. Start innovating with
00:22:08
MongoDB. There's a reason it's trusted
00:22:10
by so many of the Fortune 500, and
00:22:12
that's because it's a platform built by
00:22:14
developers for developers. They swear by
00:22:16
it. Literally, they call it a great
00:22:18
database. Start building at
00:22:21
mongodb.com/build.
00:22:27
Scott, we're back with more news. And
00:22:29
Meta is shutting down its VR metaverse
00:22:31
on June 15th. And legless people are
00:22:34
gone. The VR social network Horizon
00:22:36
Worlds never drew more than a couple
00:22:38
hundred thousand active users a month. I
00:22:40
mean, I think I think it was like a
00:22:42
thousand were using it by the end. Some
00:22:44
users reported that daily active users
00:22:46
actually dropped to under a thousand.
00:22:48
Who are those people? I want to meet
00:22:49
those people. Over $70 billion was spent
00:22:52
on the project over time. Uh you have
00:22:54
talked about this for a long time. I
00:22:56
never liked the metaverse.
00:22:57
>> So some breaking news that broke after
00:22:59
the recording. Cara, we just learned
00:23:01
that Meta is not shutting down VR
00:23:04
support for Horizon Worlds. That's
00:23:06
according to an Instagram post from Meta
00:23:07
CTO Andrew Bosworth. He said there was
00:23:11
uh open quote a lot of misinformation
00:23:12
about the company's plans. We announced,
00:23:15
hey, we're moving away from Horizon
00:23:16
Worlds in VR. And the headline is that
00:23:18
Horizon is dead. He said it's not. And
00:23:21
likewise, VR is not dead. We're
00:23:24
continuing to invest tremendously.
00:23:26
Uh, this is weak sauce. We up.
00:23:29
Nana is on life support. And despite the
00:23:31
fact she might have brain waves, we're
00:23:32
pulling the plug soon. This is, in my
00:23:35
view, an attempt to uh backtrack and not
00:23:39
totally freak out the remaining
00:23:40
employees before they find them another
00:23:42
job or lay them off. This is this is
00:23:45
dead. uh in my view and uh you know uh
00:23:50
an attempt to if you will say no there's
00:23:53
still hope when they believe uh and
00:23:56
every indication here is that uh this
00:23:59
thing is maybe in hospice, but be clear,
00:24:02
it's on the green mile.
00:24:06
Let's move on.
00:24:07
>> while the press was fawning over the
00:24:08
idea you were not impressed let's take a
00:24:10
look at what you've said over the years
00:24:12
>> I was the original hater of headsets and
00:24:15
metaverse Web 3.0, the metaverse. It's
00:24:18
supposed to be the next dimension of the
00:24:20
internet.
00:24:20
>> I just love the fact that Mark
00:24:21
Zuckerberg is showing up with
00:24:24
literally the biggest thud in
00:24:26
history.
00:24:27
>> I am proud to announce that starting
00:24:29
today, our company is now Meta.
00:24:34
>> I think this thing is already a giant
00:24:35
flaming bag of If it was working,
00:24:38
they'd be putting out all sorts of
00:24:39
numbers and press releases about people
00:24:42
signing up. You know, by the end of the
00:24:44
decade, we hope to basically get to
00:24:46
around a billion people in the metaverse
00:24:48
doing hundreds of dollars of commerce
00:24:50
each.
00:24:50
>> If I were to try and devise a strategy
00:24:53
to weaken the corpus that is Facebook, I
00:24:56
would invent this distraction called the
00:24:58
metaverse and specifically the Oculus to
00:25:01
pour billions of dollars down a single.
00:25:03
>> They pivoted their entire company to the
00:25:05
metaverse. I think I mean if I had tried
00:25:07
to figure out a way to kneecap Meta
00:25:10
which is a net negative for society I
00:25:11
couldn't come up with a strategy as
00:25:13
brilliant as this.
00:25:14
>> Nice call, Scott, let me just say that.
00:25:16
Second of all, I didn't like Meta because
00:25:18
of the legless thing; it was weird. Remember when
00:25:19
he introduced it? It was so weird and
00:25:21
awkward at the time. Um, one of the
00:25:24
things that's astonishing here is that
00:25:26
he could have this much of a loss and
00:25:27
still they're doing so well elsewhere um
00:25:32
that this $70 billion loss doesn't
00:25:34
matter and he's feted while other people
00:25:36
who have losses get slapped back to but
00:25:39
this is such a failure. Please, please
00:25:41
take a lap and conclude this chapter of
00:25:45
Mark Zuckerberg's life for us.
00:25:47
>> Care of the fact that $70 billion in capex got
00:25:50
taken into the street and burned and that
00:25:52
people didn't want to live on a legless
00:25:53
future where they didn't want to be in a
00:25:55
place where 40% of them were within 20
00:25:58
minutes nauseous, or that they further
00:26:00
separate us from humanity? I'm shocked, Cara. I'm
00:26:02
shocked this didn't work. I had big
00:26:05
hopes for it because anything Mark
00:26:06
Zuckerberg does is clearly right
00:26:09
uh the scariest thing I think the
00:26:12
scariest thing about our economy other
00:26:14
than the income inequality is the fact
00:26:15
that we have now tied the fate of the
00:26:17
S&P and the 10% wealthiest households
00:26:21
who control the economy now and
00:26:23
government. We've tied it to our ability
00:26:25
to evolve a new species of asocial
00:26:28
asexual males and some females.
00:26:31
And the thing is, this is a
00:26:34
healthy gag reflex for mammals. One, on
00:26:38
a very instinctive level, it's very
00:26:41
uncomfortable, especially for women, but
00:26:43
for everybody, when you're walking on
00:26:45
the sidewalk alone and you hear
00:26:47
footsteps behind you or the side of you,
00:26:50
>> because the things you can eat and the
00:26:52
things that can eat you don't come
00:26:53
straight at you. They have a habit of
00:26:54
coming from behind you or from the side.
00:26:57
>> And so your peripheral vision, and the
00:26:59
reason why billboards on the highway are
00:27:01
still a big business, is you notice
00:27:03
in your peripheral vision. You're very
00:27:04
subconsciously conscious of what's in
00:27:06
your peripheral vision or what isn't.
00:27:08
>> And when it's blocked with a headset,
00:27:11
>> you feel uncomfortable.
00:27:12
>> So, no, they never spoke to an
00:27:15
anthropologist
00:27:17
to say, "All right, what happens when we
00:27:20
invent a technology that from the moment
00:27:21
they turn it on, it's like if you turned
00:27:23
on your PC and it made you feel slightly
00:27:25
nauseous by turning it off."
00:27:27
>> Yes. Remember all
00:27:29
of that stuff? Remember they showed it
00:27:30
at CES years ago where you looked at a
00:27:32
TV that was jumping out at you? It was
00:27:34
sickening and no one wanted it. It was a
00:27:36
big thing one year at CES and then it
00:27:38
wasn't. Let me ask you I'm going to ask
00:27:40
you a more challenging question. All
00:27:41
right. Look, we're going to have
00:27:42
immersive worlds, right, in some way.
00:27:44
And some of it is kind of cool. I
00:27:46
remember 20 years ago, Walt and I went
00:27:48
to Korea and went to either Sony or LG
00:27:51
and we were looking at these headsets in
00:27:52
movies.
00:27:54
Pretty cool. I remember thinking
00:27:56
that. And I wasn't nauseous. I went to
00:27:58
the sphere this week which I loved. I
00:28:00
saw the Dorothy thing and I thought it
00:28:02
was wonderful and we were all in the big
00:28:04
room and I have to say it was a lovely
00:28:06
communal experience because everyone was
00:28:08
laughing and they dropped apples out of
00:28:10
the sky and everything else. There is
00:28:13
something I want you to say what will
00:28:15
work here because there is an immersive
00:28:17
experience with screens that is very
00:28:21
satisfying. What would you, if you had to
00:28:23
pick a business in the immersive
00:28:26
screens either on your head or in a
00:28:28
situation like the sphere which I think
00:28:29
is a spectacular achievement um in a lot
00:28:32
of ways and it's also beautiful on the
00:28:34
outside cuz it's delightful. Um what do
00:28:37
you imagine that to be?
00:28:39
>> I don't think they'll ever be
00:28:41
big businesses, Cara. I think they're
00:28:43
niche experiences. I think that our
00:28:45
species has gotten really used to and
00:28:48
comfortable with as bad as it is this
00:28:50
world. So IMAX is an immersive
00:28:52
experience, but it's never really lived
00:28:55
up to the potential outlined.
00:28:56
>> It's a good business though. It's a good
00:28:58
business.
00:28:58
>> Yeah, it's been quite frankly over the
00:29:00
last 40 years, it's been a shitty
00:29:01
business. IMAX
00:29:03
>> relative to the cost, it's been okay. I
00:29:06
love IMAX every time. That's what I do
00:29:08
when I take my I love seeing in
00:29:10
IMAX.
00:29:10
>> I'm going to see Project Hail Mary
00:29:12
tomorrow night.
00:29:13
>> It's a niche business. The only place I
00:29:16
want an immersive experience is when I'm
00:29:18
having my teeth cleaned by a hot single
00:29:20
mother, Brazilian single mother. And
00:29:22
then she puts on headset that I can
00:29:24
watch Heated Rivalry,
00:29:26
>> right?
00:29:26
>> And she, you know, and then I start
00:29:28
crying cuz I start thinking about my mom
00:29:29
and I'm under the influence. I tell her
00:29:31
to when she says,
00:29:32
>> she says 1 to 10 nitrous. I go 12 baby.
00:29:36
>> 12 baby. I'll have it. So, look,
00:27:38
can I tell you
00:27:40
what I liked? Like, you're right.
00:29:41
They're experiential things. One of the
00:29:43
things that was cool about the Sphere, I
00:29:45
have seen Wizard of Oz a million times
00:29:47
recently too because my little kids are
00:29:49
now watching it. So, it's not something
00:29:51
I want to see again and again. But one
00:29:53
of the things I thought was quite
00:29:54
beautiful was the ability to see things
00:29:56
in the movie that I never saw like some
00:29:58
of the beautiful costumes, some of the
00:30:00
beautiful, you know, set design and
00:30:03
oddly enough the faces of all the people
00:30:05
that weren't Dorothy or the
00:30:08
main characters. It's like I found
00:30:10
myself looking at these beautiful faces
00:30:12
from another era, right? Like there was
00:30:14
two twins there that I never noticed.
00:30:16
And so one of the things I found it
00:30:18
wasn't just everyone was like, "Oh, the
00:30:20
tornado." And I was like, "That was
00:30:21
cool." But what was beautiful was I
00:30:24
could really see things in a way that I
00:30:26
appreciated. So
00:30:28
there is something valuable about
00:30:30
immersive in some way like travel I
00:30:33
suppose if or or when you go to a theme
00:30:36
park and you you get on one of those
00:30:37
rides that you like you know you go you
00:30:39
soar past the Golden Gate Bridge. I love
00:30:41
all those things.
00:30:42
>> No look going into another world you
00:30:45
feel like an explorer. It's it's sensory
00:30:47
overload. It's really exciting and then
00:30:49
you want out.
00:30:50
>> Yeah.
00:30:50
>> Escape. Escape room is correctly named.
00:30:54
>> You wouldn't you wouldn't want to live
00:30:56
in this sphere. Your body can't handle
00:30:58
that much sensory stimulation. And
00:31:01
the sphere, by the way, similar to IMAX,
00:31:03
>> an amazing product,
00:31:05
>> it's not doing well economically.
00:31:07
>> Yeah.
00:31:08
>> So, the idea or even the ultimate
00:31:11
sensory experience, the ultimate moment
00:31:13
of awe supposedly according to
00:31:14
astronauts is to go into space and see
00:31:16
the world
00:31:17
>> from another perspective. But guess
00:31:19
what? What's the first thing they want
00:31:20
to do after a week?
00:31:21
>> They want to get home.
00:31:23
>> Yeah. So what what I think I wish
00:31:26
technology was more focused on I hate
00:31:30
this notion that we need to colonize
00:31:31
Mars. No, the real genius here is
00:31:34
something that's going to make this
00:31:35
place a little bit more
00:31:36
habitable.
00:31:38
>> I'm in Tulum staring out at palm trees
00:31:40
and coconuts and the sand, the sugary
00:31:42
sand,
00:31:43
>> and I'm in awe and I'm comfortable
00:31:46
>> and this is the only universe I
00:31:48
want to be in.
00:31:48
>> Yeah. No, I know it's I've never wanted
00:31:50
to go to the space anyway. It it look
00:31:52
it's a disaster. Mark, you you were
00:31:53
wrong and Scott was right. That's all I
00:31:55
have to say. Speaking of scaling back,
00:31:57
OpenAI is scaling back on projects and
00:31:59
focusing on coding and business uh
00:32:01
users. Pressure for the change comes
00:32:02
from competitors like Anthropic, which
00:32:04
you and I have been talking about,
00:32:05
dominating the business AI market.
00:32:07
Employees also felt the company's
00:32:09
do-everything strategy led to a lack of
00:32:11
focus. Speaking of which, uh, OpenAI
00:32:13
delayed the launch of the adult mode,
00:32:15
which would allow sexually explicit
00:32:16
conversations due to concerns from
00:32:18
advisers over mental health risks.
00:32:20
Also of concern: an age prediction
00:32:22
system that has been misclassifying minors
00:32:24
as adults 12% of the time. The feature
00:32:26
which the company still plans to release
00:32:28
eventually would be text only. Um this
00:32:31
is all the influence of Fidji Simo, who
00:32:33
is the new top executive
00:32:36
there. Very similar to when Eric Schmidt
00:32:38
came to Google. They were sort of
00:32:40
chaotic and did everything, the two
00:32:42
founders, Larry and Sergey. Um and then
00:32:44
they brought uh Eric in to really clean
00:32:46
it up. It seems sort of basic, this, you know,
00:32:49
this executive move, but they have
00:32:51
made like a million stupid
00:32:53
announcements and it does remind one of
00:32:55
Google in that regard. Thoughts?
00:32:58
>> You're exactly right. Remember when
00:33:00
Google was trying to
00:33:01
cure death? And then I feel like Eric
00:33:04
brought in managerial competence and how
00:33:06
to scale an organization, but Ruth Porat
00:33:09
showed up and said all right mom is home
00:33:11
>> fun time's over the dog's pregnant and
00:33:13
the garage is on fire I'm in charge now
00:33:15
>> y And this is the right move for open
00:33:18
AI. And that is and by the way, and this
00:33:22
will go to my prediction, Anthropic is
00:33:24
now worth more than OpenAI. I don't care
00:33:25
what the last mark is on a preferred
00:33:28
funding, but Anthropic has surged to 19
00:33:32
billion in annual recurring revenue up
00:33:34
from 14 billion just a couple weeks ago.
00:33:37
6 billion in ARR was added just in
00:33:39
February. OpenAI ARR was 20 billion at
00:33:43
the end of 2025.
00:33:45
And here's the key. It's all about the
00:33:46
enterprise because they're the only ones
00:33:48
that are willing to make these huge
00:33:49
investments. And get this, Cara:
00:33:52
Anthropic enterprise market share has
00:33:53
increased to 32% surpassing
00:33:57
OpenAI's 25%. And since 2023,
00:34:01
enterprise AI revenue has exploded from
00:34:04
1.7 billion
00:34:05
>> to 37 billion. Yeah, they've got to be
00:34:08
the open AI is really
00:34:09
>> and then the other the other the other
00:34:11
staggering statistic here that is why
00:34:13
open AI is focusing which is the right
00:34:16
thing to do
00:34:17
>> is that Anthropic is now capturing
00:34:20
>> three out of every four dollars of new spending in
00:34:24
enterprise AI.
00:34:26
>> So they're getting 73% of all spending
00:34:28
among companies buying AI tools for the
00:34:30
first time.
00:34:31
>> Yeah.
00:34:32
>> And 10 weeks ago the split with OpenAI
00:34:34
was 50/50. So get this,
00:34:38
get this:
00:34:39
>> it was 60/40 in OpenAI's favor as
00:34:42
recently as early December. So from
00:34:44
December to now it's gone from 60/40
00:34:49
to 27/73.
00:34:52
>> Yeah.
00:34:52
>> So they are they are literally losing
00:34:54
the enterprise market.
00:34:56
>> Yeah. So it's starting to feel like OpenAI
00:34:58
is Netscape, not Google. Right. That's
00:35:00
how it
00:35:01
>> that's an interesting analogy you've
00:35:02
drawn. I was there when Google was
00:35:04
the first bout of chaos was at the
00:35:06
beginning and there was you know there
00:35:08
was a cover of Fortune magazine chaos at
00:35:10
Google and of course Ruth also shut down
00:35:12
all manner of things; they had so many
00:35:14
ridiculous they were doing and they
00:35:16
could do it just like Mark with with the
00:35:18
metaverse because they had all this
00:35:20
money but it was like dumb like it was
00:35:22
at the time when they would have you in
00:35:24
and I was always like this seems dumb
00:35:26
like why are you doing this like why
00:35:28
don't you stick with your business and
00:35:29
they just wanted to be more creative or
00:35:31
more something more interesting in some
00:35:33
fashion. But it's really interesting
00:35:35
because this is at a time when I think
00:35:37
you know Anthropic's been under pressure
00:35:40
from the government but in the end they
00:35:42
will soar and Pete Hegseth will be a you
00:35:45
know a a sad little footnote a sad
00:35:48
little drunken footnote in our history.
00:35:50
Um anyway we'll see what happens. Um
00:35:53
speaking of someone who won't be a
00:35:54
footnote I would say is Bob Iger stepped
00:35:56
down as Disney CEO again. Iger passed
00:35:59
the baton to his successor Josh D'Amaro
00:36:01
at Disney's annual shareholder meeting
00:36:03
this week. D'Amaro, a 28-year
00:36:05
veteran of the company, was most
00:36:06
recently head of Disney Experiences,
00:36:08
which includes parks, cruises, and
00:36:09
resorts. Iger is set to stay on as an
00:36:12
adviser and board member until the end
00:36:13
of 2026. Not very long. It's unclear
00:36:16
what he'll do after that. Before the
00:36:17
last time he left, he did a bunch of
00:36:19
advising and sailing around on a boat in
00:36:21
the South Seas. Um, uh, last time he
00:36:24
retired, which I said he wasn't going to
00:36:26
stay retired, I asked him whether he
00:36:27
planned to get into politics. Let's
00:36:29
listen to what he told me in 2022.
00:36:32
Would you ever run for office?
00:36:34
>> Uh, I'm not planning to run for office.
00:36:36
>> That is that a no?
00:36:38
>> That's just what I said. I'm not.
00:36:39
>> Okay. All right. Fine. I think you are.
00:36:41
Um, so last thing. Um, I
00:36:44
usually do not tell
00:36:46
another white guy, "Oh, please run for
00:36:48
office. We don't have enough of you."
00:36:49
But I think you would be an excellent
00:36:51
because I think you'd be an excellent
00:36:52
politician because I don't think you
00:36:53
give a Uh anyway, I don't think
00:36:55
he's going to run for office. Actually,
00:36:57
I don't I can't imagine he's going to do
00:36:58
that. Um but what do you think his next
00:37:01
act will be? Uh I mean, he
00:37:03
certainly had his ups and downs and the
00:37:05
stock has not reflected much of it.
00:37:07
Although I do think he did a lot around
00:37:09
digital. I think he did a lot around
00:37:10
streaming. Um I think he was a very good
00:37:14
CEO for much of his tenure and not so
00:37:16
good in other things. I think probably
00:37:18
the Fox purchase is one people point to
00:37:20
as being problematic, but in general,
00:37:23
pretty good tenure. Um, especially
00:37:25
around streaming. I I think that he made
00:37:27
those moves. Um, what do you think his
00:37:29
next act should be?
00:37:30
>> Hit the golf course and enjoy his life.
00:37:32
And I would call challenge on his
00:37:34
tenure, Cara, because
00:37:35
>> Okay.
00:37:36
>> The last 10 years have been the most
00:37:38
prosperous in the history of the world
00:37:40
for American companies, and his stock is
00:37:42
below where it was 10 years ago. And at
00:37:44
the end of the day, as a CEO,
00:37:46
>> that's what you're evaluated on.
00:37:47
That's your metrics one, two, and
00:37:49
three. He quite frankly, he really
00:37:52
up. He's the guy who went to
00:37:54
Vietnam, completed his tour honorably,
00:37:56
came home with medals pinned to his
00:37:57
chest. He could be a viable candidate
00:38:00
for the Democratic nomination right now,
00:38:02
but he looks less like Mark
00:38:04
Cuban and more like Sheryl Sandberg.
00:38:06
>> And that is his second tenure. First
00:38:09
off, he was heckling from the cheap
00:38:10
seats. He left and never really left the
00:38:12
room, but convinced the board, as far as
00:38:14
I can tell, to fire the new guy and put
00:38:16
me back in like some returning hero. And
00:38:18
he has had huge winds in his face. But
00:38:22
Disney has become
00:38:24
Disney has gone from being probably the
00:38:26
most iconic company in the creative
00:38:28
community. To a certain extent, it
00:38:29
represents what's happened to the
00:38:30
creative community. And that is distinct
00:38:32
of how incredible it is and their great
00:38:35
IP and their great creativity. It's been
00:38:37
bad for shareholders and it's probably
00:38:39
been a difficult place to work the last
00:38:41
10 years. And he did he did make a lot
00:38:44
of the right moves. He launched a
00:38:46
streaming network. He invested in the
00:38:47
parks. But at the end of the day, his
00:38:50
last 10 years, there was a there was
00:38:52
never a clear succession path. He
00:38:54
started to feel a little bit like I
00:38:56
forget the name of that guy at Citicorp
00:38:58
that anytime someone got near him, got
00:39:00
shot in the head. So he leaves. He's
00:39:04
very likable. He's very smooth. Had he
00:39:07
stayed away and then just let someone
00:39:10
else run with it, I think he'd probably
00:39:12
be a cabinet member, maybe even, you
00:39:15
know, in the next administration at a
00:39:17
minimum. Now he's Now he's the guy that
00:39:21
quite frankly took Disney. He
00:39:25
didn't take the stock anywhere. I get
00:39:26
that. I I understand. I think doing the
00:39:28
streaming stuff was critical to its
00:39:30
future and he definitely pushed that
00:39:31
through. Like I I was there watching. I
00:39:33
mean, he made a number of dumb digital
00:39:35
moves over the years. They kept changing
00:39:36
Disney Buena Vista. I mean, I
00:39:39
wrote stories on every one of them and
00:39:41
but I do think directionally very few
00:39:44
people leaned into digital and streaming
00:39:46
the way he did, right? And I think
00:39:47
>> Oh, I don't know. I would argue Netflix
00:39:49
leaned in.
00:39:49
>> Well, Netflix, of course. No, no. Yes.
00:39:51
No, they should have bought Netflix when
00:39:52
they had the chance and they everybody
00:39:54
had the chance at one point, but yeah,
00:39:56
you're right. Netflix was in the in the
00:39:58
right position. But you are dragging
00:40:00
around a legacy organization makes makes
00:40:02
it very hard
00:40:03
>> a legacy organization that had the
00:40:05
world's best IP. I mean
00:40:07
>> Netflix so okay so Disney in the last 10
00:40:12
years has market returns of zero and
00:40:18
Netflix is up four, I'm sorry, it's up
00:40:23
600%. Yeah. Yep.
00:40:26
>> Granted, the other studios have not
00:40:28
fared any better,
00:40:30
>> right?
00:40:30
>> But with that IP, with the parks for cash
00:40:34
>> flow,
00:40:36
>> Yeah. Look, Bob, what's the lesson here?
00:40:41
The lesson is the following, and I think
00:40:43
about this a lot. Mhm.
00:40:44
>> It is very hard to pull off the ultimate
00:40:48
gangster move for your brand when you're
00:40:49
in a position of power and you're doing
00:40:51
well, and that is to leave the party too
00:40:54
early.
00:40:56
Um, and that is people have a tendency
00:40:59
when they're doing well and they're so
00:41:01
iconic as Bob Iger is and was to think
00:41:04
to just stay too long.
00:41:06
>> Yeah.
00:41:06
>> You want to leave the stage while people
00:41:08
are clapping. You want to leave a party
00:41:10
10 minutes too early. You want to leave
00:41:11
the Vanity Fair Oscar party at midnight,
00:41:14
not at 4:00 a.m. when you're wandering
00:41:15
around alone. And it's clear Emily
00:41:16
Ratajkowski is not going to speak to you.
00:41:19
>> Was she there?
00:41:21
>> By the way, at one point I was sitting
00:41:23
at the bar.
00:41:23
>> We didn't talk about this cuz you were
00:41:24
blabbing away to all your other
00:41:26
>> I was sitting at the bar, no joke, in
00:41:28
between Jon Hamm, who's quite handsome,
00:42:31
and Jacob Elordi, who is even more
00:41:33
handsome and much taller.
00:41:35
>> Yeah. Emily started walking towards the
00:41:37
bar and all I could think of is there's
00:41:39
no way she's coming to me right
00:41:40
now. Yeah. No way. I'm like
00:41:43
>> I'm like the price is right.
00:41:45
>> This is the real Emily. You saw her.
00:41:48
>> Oh yeah. Trust me. I saw her.
00:41:50
>> Okay.
00:41:51
>> Uh yeah. By the way, she looks she looks
00:41:53
pretty good. She looks pretty.
00:41:55
>> So wait, what happened? Wait, the I I
00:41:57
only want the Ratajkowski part. Go ahead.
00:42:00
>> Nothing. She did. She just walked up and
00:42:01
had a drink. And at some point I'm like,
00:42:02
I I want to be the professor, not the
00:42:05
stalker.
00:42:06
>> So, uh, but my favorite moment is
00:42:08
>> You didn't say hello.
00:42:09
>> I'm too intimidated.
00:42:11
>> Oh my god.
00:42:12
>> I said hi to Maureen Dowd and
00:42:15
Kaitlan Collins. Those are my friends.
00:42:16
>> I saw that.
00:42:18
>> Those are That's who I hang out with.
00:42:19
And the Smartless guys. Those guys are
00:42:21
fun. I like those guys.
00:42:22
>> Those are fun.
00:42:23
>> And they they're like they feel sorry.
00:42:24
The only people that come up to me
00:42:26
think of me as an intellect. They
00:42:27
think, "Oh, it's so cute. They have a
00:42:29
professor here. Let's go be nice to him.
00:42:31
That's our charity for the night. And
00:42:33
everybody comes up to me and says, I
00:42:34
have sons and I very much appreciate
00:42:36
your work. And then they say, oh, can I
00:42:37
meet, you know, can I meet Judd
00:42:40
Apatow now? I mean, I'm convinced half
00:42:43
the people half the people talking to me
00:42:46
>> Yeah.
00:42:46
>> were checking themselves out in the
00:42:48
reflection of my glasses.
00:42:49
>> Oh no.
00:42:51
>> I can't believe you didn't speak to
00:42:52
Emily Ratajkowski.
00:42:53
>> By the way, that party.
00:42:55
>> Yeah.
00:42:55
>> Vanity Fair.
00:42:56
>> Yeah.
00:42:57
>> Those people are geniuses. I'm going to
00:42:59
subscribe twice. Okay. The the the
00:43:02
environment they pulled together that
00:43:03
night.
00:43:04
>> Yeah. It's nice. It's a nice party.
00:43:05
>> I would I think it's the most
00:43:06
aspirational environment I've ever been
00:43:08
in in my life. I just couldn't get over
00:43:10
the wardrobe, the environment, the food,
00:43:13
the vibe.
00:43:13
>> You've always done a good job.
00:43:14
>> I just saw uh the new editor in chief.
00:43:17
Unbelievable.
00:43:18
>> Mark. Yeah.
00:43:19
>> Yeah. Mark just is an amazing handsome
00:43:21
guy, too.
00:43:22
>> Yeah. I have to say they've always had a
00:43:24
good party. They they've been good at
00:43:25
that under under all their different
00:43:27
editors. I think it's been
00:43:28
>> And I got to hang out with Larry David.
00:43:29
It's like angry meet depressed.
00:43:30
Depressed meet angry.
00:43:31
>> Oh my god, you look alike. What
00:43:33
happened? Was there like a moment?
00:43:35
>> Larry and I are friends now.
00:43:37
>> Oh, you're friends.
00:43:37
>> We totally got along. Okay. All right.
00:43:39
>> Yeah, we hit it off. And by the way, the
00:43:41
Larry David Show is really the David
00:43:43
Larry David Show. He's like, that's
00:43:45
exactly who he is.
00:43:47
>> He's like, what's the point of an Oscar?
00:43:48
He just starts into a and you're
00:43:50
like, okay, here we are.
00:43:51
>> He has a new show that looks hysterical
00:43:53
that he did with the Obamas about
00:43:54
history.
00:43:55
>> My very lovely wife, too. Anyways, I
00:43:57
very much
00:43:58
>> I don't know. Was Bob Iger there?
00:44:01
Because it was
00:44:02
>> I did not see Bob. I did not sense a
00:44:04
cashmere sweater or tuxedo anywhere.
00:44:06
>> Uh but the thing is you walk in and they
00:44:09
like, do you want to do the red carpet?
00:44:10
It's like, I'm not doing a
00:44:11
red carpet. I'm like I am so doing the
00:44:13
red carpet and they have
00:44:14
>> Yeah. So you had
00:44:15
>> hundreds of photographers and there's
00:44:17
three X's and I guess you're supposed to
00:44:20
go to one X.
00:44:21
>> Yeah.
00:44:21
>> And take pictures.
00:44:23
>> Yeah.
00:44:23
>> I didn't know that. So, I go to the
00:44:24
first X and they're like, "Hello,
00:44:26
professor and they're all nice." And I'm
00:44:28
like, "Now I'm going to go to the second
00:44:29
X and sit here and pose.
00:44:30
>> Get the out of here."
00:44:31
>> And then I go to the third X and by the
00:44:32
time I got to the third X, I realized
00:44:34
everyone's like, "What the is this
00:44:35
guy doing?"
00:44:38
>> And one of the one of the photographers
00:44:40
just out of a moment of like feeling
00:44:41
sorry for me, kind of waved me along.
00:44:43
He's like, "You're supposed to go to
00:44:44
just one X."
00:44:45
>> And I turned red. I'm a bad celebrity.
00:44:48
>> Oh my god. Can I ask you one question?
00:44:49
Did you see Jeff Bezos? He was there
00:44:51
looking.
00:44:52
>> I saw him with Lauren. I thought they
00:44:53
looked great. I don't
00:44:54
>> No, I didn't. I thought they
00:44:55
>> I don't mind Jeff's midlife crisis. I'm
00:44:58
here for it.
00:44:58
>> I know. But did you say hello?
00:45:00
>> I said hello to all three of them. I
00:45:01
mean, I said Yeah. No. No. I said hello.
00:45:04
>> Um I did not. I'm tell I'm intimidated.
00:45:07
Unless people come up to me, I'm
00:45:08
intimidated. I don't like
00:45:10
>> gone and said Cara says hello. That
00:45:11
that in that one you could have done
00:45:13
that.
00:45:13
>> That's like hi. My rich father knows
00:45:16
you. I just don't want to do that. I
00:45:17
don't know.
00:45:18
>> He doesn't like me. I would be bad. He
00:45:19
would
00:45:20
>> I literally freaked out at about
00:45:21
midnight. And I'm like, this is the best
00:45:23
party of my life. I need to go home and
00:45:25
take a Xanax and just recover from all
00:45:26
the I feel like a kid who's been in a candy store
00:45:29
for 8 hours.
00:45:30
>> Did you? I can't believe my only note is
00:45:32
I can't believe you didn't say hello to
00:45:34
Emily Ratajkowski. You're a loser. You're a
00:45:36
loser. Anyway, Bob Iger's next act very
00:45:39
quickly.
00:45:40
>> He'll go on a couple boards and he'll
00:45:42
enjoy his life and he deserves to do all
00:45:43
of this. Hang out with his lovely wife
00:45:46
and speak at USC's film school.
00:45:48
>> He's got to do something else. I think
00:45:49
it's something else. Let me tell you,
00:45:50
when he was
00:45:51
>> Bob is 74, 73.
00:45:53
>> He is in really good shape. When he
00:45:55
looks really good, when he was in the
00:45:56
last one, he texted me far too much. And
00:45:58
I was like, I think you need to do
00:45:59
something else cuz I think he's got
00:46:01
another thing in him. I don't know what
00:46:02
it is.
00:46:03
>> I know. He's 75. Yeah.
00:46:04
>> Yeah. He could be in the cabinet. He
00:46:06
could be in a
00:46:07
>> Well, what is that? 77
00:46:10
head. Yeah.
00:46:11
>> Could be the head. Could he be the
00:46:12
commerce secretary? I don't know.
00:46:13
>> I don't know. He probably doesn't want
00:46:14
to. What do you need that for
00:46:16
anyway? If he wants to help people,
00:46:18
>> ambassador to France and throw amazing
00:46:19
parties at like the the US residents.
00:46:23
>> That's perfect. Ambassador to France.
00:46:25
Let's do it. Bob, we're going to send
00:46:26
you to France. Anyway, let's go on a
00:46:28
quick break. When we come back, we'll
00:46:30
talk about Kalshi facing criminal charges.
00:46:32
Your favorite groups of people there,
00:46:34
Scott.
00:46:36
>> Support for the show comes from BMC.
00:46:38
Before you scale AI to every corner of
00:46:40
your business, before you supercharge
00:46:42
your agents with AI ready data, before
00:46:44
you trust your entire business to AI,
00:46:46
BMC first. BMC is here to help you look
00:46:49
past the hype of the AI revolution and
00:46:51
look toward an orderly AI evolution. For
00:46:54
decades, BMC has powered the systems the
00:46:56
world can't afford to fail with
00:46:58
automation, orchestration, and control
00:47:00
at enterprise scale. And today, they are
00:47:02
the automation engine for the AI era,
00:47:04
the foundation for the Agentic
00:47:05
Enterprise at scale. And as companies
00:47:07
seek to harness the power of automation
00:47:08
to streamline and accelerate their most
00:47:10
complex and critical business processes,
00:47:12
BMC is ready to partner with them.
00:47:14
Because BMC is uniquely qualified to
00:47:16
solve the orchestration, data and
00:47:18
execution challenges that AI creates.
00:47:20
Before AI, before automation, before
00:47:23
orchestration, BMC first. How can you
00:47:26
change the course of your business when
00:47:28
you partner with BMC? Learn more at
00:47:30
bmc.com.
00:47:35
Scott, we're back with more news. Kalshi
00:47:37
is facing criminal charges in Arizona
00:47:39
where prosecutors say the prediction
00:47:40
market platform illegally let people bet
00:47:42
without a gambling license. Kalshi says
00:47:44
the charges are meritless and said they
00:47:46
should be regulated federally rather
00:47:47
than by individual states. The case is
00:47:49
the first criminal prosecution against a
00:47:51
prediction market company. There's more to
00:47:52
come. I actually, when I was at South by
00:47:54
Southwest, met with the California
00:47:56
Attorney General who today um did a
00:47:59
lawsuit uh one of the lawsuits against
00:48:01
the Nexstar, um, the Nexstar, whatever that
00:48:05
ridiculous merger was. Um in any case
00:48:08
this is the states have been regulating
00:48:10
gambling, like, forever, for
00:48:12
decades. So it's not meritless. Um so
00:48:15
what do you think? Because
00:48:17
what was interesting, another story
00:48:19
popped up which I found fascinating.
00:48:20
Times of Israel reporter received death
00:48:22
threats from gamblers on Polymarket
00:48:24
after reporting an Iranian missile
00:48:26
strike that affected a high stakes
00:48:27
prediction market bet. Some bettors
00:48:28
tried to pressure him to change the
00:48:30
story so the market would resolve in
00:48:32
their favor. And let me just say, I feel
00:48:35
this is a topic people are really
00:48:36
interested in. I'll read an email from
00:48:38
one of our listeners. I'm a journalist
00:48:39
and a fan of the show. I don't
00:48:41
understand why I'm hearing Kalshi
00:48:42
percentages cited during the show as
00:48:44
if it's anything. It's people guessing. I
00:48:46
think it's more harmful than helpful.
00:48:47
That's you doing it, Scott. I don't do
00:48:48
that. I agree with you. Um what do you
00:48:50
think about these markets shifting from
00:48:52
predicting events to actively
00:48:54
influencing them? Given the
00:48:56
gaming part, easily gamed, uh, unregulated,
00:49:01
bad actors, it is gambling, um, and
00:49:04
gambling is very well regulated. So what
00:49:07
do you think about that?
00:49:08
>> I think there's some truth to all of
00:49:10
that. I I'm this is one of those things
00:49:11
I'm hugely conflicted by because I am
00:49:13
absolutely fascinated with the data
00:49:15
where I would push back on the listener
00:49:18
is oh no this data is incredibly
00:49:20
insightful
00:49:22
um, the wisdom, this is the wisdom of
00:49:23
crowds. This does illuminate:
00:49:26
whenever I'm looking at political races
00:49:29
whenever I'm looking at interest rate
00:49:31
movements, I go to
00:49:32
>> A trailing indicator? You don't think
00:49:34
it's a trailing indicator?
00:49:35
>> it's pretty much up to date and the
00:49:36
thing about money and the thing about
00:49:38
looking at it. Typically, the people who did
00:49:41
this stuff were were academics,
00:49:43
economists or an investment banking
00:49:46
analyst. All of them are conflicted. All
00:49:48
of them want to catastrophize because it
00:49:50
makes us look smarter. All of us have
00:49:51
third party influences. Nothing is more
00:49:54
amoral and pure than money. It just when
00:49:58
someone bets on something, it really
00:50:01
shows you what they really think is
00:50:02
going to happen. And if you look at it,
00:50:05
these speculative markets,
00:50:07
speculation markets or prediction
00:50:08
markets, have essentially put pollsters,
00:50:10
and to a certain extent investment banking
00:50:12
analysts out of work because guess what?
00:50:14
They're much
00:50:14
>> kind of I I would push back on that. I
00:50:16
just met with a bunch of pollsters on
00:50:17
this topic, but go ahead.
00:50:18
>> In my opinion, they're done. If you
00:50:20
look at
00:50:21
the prediction markets' record versus
00:50:24
pollsters in the last election, the
00:50:26
prediction markets kicked their ass.
00:50:28
Absolutely. I love the data. I am
00:50:32
swimming in the data. It's one of the
00:50:33
first things I do before I get on a show
00:50:36
is I look at Kalshi data. Kara,
00:50:40
I'm totally conflicted because at the
00:50:42
same time
00:50:43
>> Mhm.
00:50:44
>> there's a really good argument that this
00:50:45
is just gambling. Now what
00:50:48
>> what's happening is they're being
00:50:49
charged with four counts of election
00:50:50
wagering.
00:50:52
>> Um, the debate is over the
00:50:55
fundamental definition of gambling
00:50:57
versus event contracts. And Arizona
00:50:59
charges claim that putting money on a
00:51:01
contingent future event or occurrence is
00:51:03
illegal. But at the same time, Kara, if
00:51:05
that's true, then traditional options
00:51:09
would be illegal. And here's the
00:51:12
problem or the issue.
00:51:15
Gambling and tapping into a prefrontal
00:51:19
cortex, an immature prefrontal cortex
00:51:21
that is dopamine-hungry and susceptible,
00:51:25
uh, in some ways there's just no getting
00:51:26
around it, feels predatory
00:51:28
and unhealthy. So what do you do? Do you
00:51:31
infantilize? I think Kalshi is trying to be
00:51:33
the cleanest, best-lit
00:51:36
place of this. They're not doing
00:51:38
contracts on things like war
00:51:40
>> whereas Polymarket is offshore and
00:51:43
Kalshi is trying to get licensed by the
00:51:45
same people who license the options
00:51:47
exchange.
00:51:49
>> But I I want to hear what you think. I
00:51:52
have no more clarity around it.
00:51:53
>> I think the states have been regulating
00:51:55
gambling forever. So I think that's
00:51:56
nonsense. If
00:51:58
gambling is going on, they need to... it
00:52:00
reminds me
00:52:01
>> they're approving it. They're approving
00:52:02
it everywhere
00:52:03
>> Approving it in different places.
00:52:05
>> States have been approving gambling all
00:52:08
over the place.
00:52:08
>> They are but so they need to be
00:52:09
regulated in the same way. Like it's
00:52:11
it's my thing with everything. It's like
00:52:12
if OpenAI is giving legal, medical, or
00:52:16
psychological advice, they need to be
00:52:18
subject to the same rules people are
00:52:20
right? The same as everybody. Like, I was
00:52:23
in Vegas for a second. I have to tell
00:52:24
you you're absolutely right. It's dead.
00:52:26
Vegas is dead. Like I
00:52:27
>> You need to be in Vegas. Vegas is in
00:52:28
your pocket.
00:52:29
>> That's right. I was like I literally
00:52:30
like Oh my god, Scott was right. It was
00:52:32
so freaky to be in Vegas without people.
00:52:34
It felt like I was in, like, Pluribus,
00:52:36
right? It was so weird. And you could
00:52:39
feel the the innovation of a place that
00:52:41
is just with these big rooms and the
00:52:44
casinos empty. It's weird. And so it's
00:52:47
it's definitely hurting businesses,
00:52:48
right? these kind of things, whether
00:52:50
it's sports betting online or this kind
00:52:52
of thing, there's they need to be
00:52:54
regulated the same way everybody else
00:52:56
is. And and states have every right to
00:52:58
do this. This is this is not and maybe
00:53:01
there should be federal gambling laws,
00:53:02
but there haven't been really.
00:53:04
>> I think that would be good. I think they
00:53:05
would want that. I think they want some
00:53:06
regulation.
00:53:07
>> Yes.
00:53:07
>> But let me ask you, let me ask you: has any
00:53:10
old tech company said, "Oh, please
00:53:12
bring us regulation"?
00:53:12
>> I actually think
00:53:14
they would say,
00:53:16
please bring us regulation.
00:53:18
Let me ask you this. You have sons.
00:53:20
>> I think about this a lot. Let's be
00:53:22
clear, much of this is gambling.
00:53:25
>> Yeah.
00:53:25
>> Uh and and it and but at the same time,
00:53:28
do you infantilize children? And I I I
00:53:31
know firsthand as someone who
00:53:32
appreciates data. There is real value in
00:53:36
this data.
00:53:37
There is. It can also be easily gamed.
00:53:40
So easily.
00:53:41
There's a lot of potential for
00:53:43
insider trading. But the more liquid
00:53:45
markets, people are more greedy. Anyway,
00:53:48
huge potential for insider trading. I
00:53:49
get it. But let me ask you this.
00:53:51
>> Mhm.
00:53:52
>> Do you think it should be they should be
00:53:54
put out of business, regulated, or let
00:53:56
to just run free?
00:53:58
>> Regulated.
00:53:58
>> And what does that mean?
00:54:00
>> I'm not sure. I'm not I'm not an expert
00:54:01
on this, but I feel like I
00:54:03
want to know how gambling things are
00:54:05
regulated and how
00:54:06
>> Age-gating to 21 would be one good
00:54:08
start, right?
00:54:10
>> Possibly. Yes. 21. It's interesting.
00:54:12
Yes. Yes. Yes. Actually, on certain
00:54:14
parts, other parts it's fine. But yes,
00:54:16
age-gating would be one thing and it's
00:54:18
not infantilizing. We do it all the time
00:54:21
with with with real businesses. And so
00:54:24
what
00:54:25
>> Firearms, alcohol, military,
00:54:26
>> what upsets me is: we're different.
00:54:29
It's the same song and
00:54:31
dance from all internet companies. We're
00:54:33
different. We don't deserve the same.
00:54:35
And they get unfair advantage here. Um
00:54:38
as to
00:54:38
>> Who gets unfair advantage?
00:54:40
>> These these these markets get unfair
00:54:41
advantage. It made me very
00:54:44
uncomfortable, for example, when uh CNN
00:54:47
and others sign deals with them because
00:54:49
I'm like because I don't think they know
00:54:50
how to use them properly. That's the
00:54:52
other thing. It can be so
00:54:54
>> it's not reporting like it's not it's
00:54:56
some it's an indicator. It's a data
00:54:58
point, but it's not I guess I don't like
00:55:01
them doing polls either. So, I guess I I
00:55:04
just I find it very weak and it can be
00:55:06
very influential in a way. And so I just
00:55:09
feel like it it it obviously needs to
00:55:11
have some regulatory thing. With my sons,
00:55:13
they're
00:55:14
not big bettors. I don't
00:55:17
know why. I mean I I get that why like I
00:55:19
was in Vegas for two days and I didn't
00:55:21
bet once. Like I was like I walked right
00:55:23
through the casinos. But that's me. Um
00:55:25
but I just feel like it's the death
00:55:27
threat. This reporter thing was a really
00:55:29
interesting thing. Like this this has
00:55:31
implications that have been around since
00:55:34
the dawn of time. The these and and they
00:55:36
think they're different. And so I
00:55:38
think we need to have more transparency
00:55:40
into how they're doing things. I think
00:55:42
they should have you know they shouldn't
00:55:44
bet on deaths like I mean they they
00:55:46
shouldn't be I don't know if we should
00:55:48
make them not do it or if you say okay
00:55:51
you're going to do that.
00:55:52
>> Yeah. But to be fair, I do think Kalshi has
00:55:54
said, we're not going
00:55:56
to create markets in things like war
00:55:58
that might involve an incentive that
00:56:00
might involve death or
00:56:01
>> geopolitical. That's the kind of stuff
00:56:03
but there's going to be someone who's
00:56:04
going to. So maybe we need some laws,
00:56:07
right? Anyway, we we have to move on.
00:56:09
It's a really interesting it's a
00:56:10
developing situation, but I think it's
00:56:12
in every state's rights to do this. So
00:56:14
Kalshi should stop being so, like, high-handed
00:56:16
with them. Of course, they're going to
00:56:17
come in. It's affecting things. So um
00:56:20
this is exactly why the government
00:56:21
should come in in some fashion. At least
00:56:23
think about it, have hearings, talk
00:56:24
about it, and and let's discuss the
00:56:26
things. Um just before we finish, this
00:56:29
is the last thing. Uber plans to invest
00:56:31
$1.2 billion in Rivian as part of a
00:56:33
deal to deploy 50,000 robo taxis. I
00:56:36
recently spoke with uh Rivian founder
00:56:38
and CEO RJ Scaringe on On with Kara
00:58:41
Swisher. I also saw him for an extended
00:56:43
amount of time at South by Southwest.
00:56:45
Let's listen to a clip where he talked
00:56:47
about self-driving.
00:56:48
>> If you're a customer and you have a
00:56:49
choice of: I can buy a car for $35-40,000
00:56:52
and it can, you know, drop me at the
00:56:54
airport, it can go to the grocery store
00:56:55
to pick up, you know, stuff for me. It
00:56:58
can drop a friend at a house. It can do
00:57:00
all those things or a car that doesn't
00:57:02
do that. It's it's going to be very
00:57:04
binary where I think there'll be very
00:57:06
few people that will self- select to say
00:57:09
I don't want those features. Even folks
00:57:11
who are not comfortable with the idea of
00:57:13
self-driving once you experience it one
00:57:14
or two times.
00:57:15
>> It does. I try to say that to everybody.
00:57:17
>> It's so sticky because you get your time
00:57:19
back. Suddenly you can be reading a book
00:57:21
on your phone. It's it's just so sticky.
00:57:23
>> My one way of convincing one person,
00:57:26
who likes to party. I'm like,
00:57:28
you can you can text and drink.
00:57:30
>> I don't know what to say. There's my
00:57:32
that's my sell for you. I think that was
00:57:34
you I was talking about. Um it was
00:57:36
really it was super interesting. I think
00:57:38
it's a real blow again to Tesla. Um and
00:57:40
I drove the Rivian R2 at, um, South by
00:57:43
Southwest. I also they have a really
00:57:45
nifty bike called Also, which I liked a
00:57:48
lot. Um I really like the Rivian. I I
00:57:51
think he's interesting. I think he's a
00:57:53
great spokesperson for this stuff. Um,
00:57:56
and they're wonderful. It's a wonderful
00:57:58
I may buy one. I may buy an R2 um
00:58:00
because I was super impressed with it.
00:58:02
Um, in any case, it's a really
00:58:04
interesting um move by Uber who needs to
00:58:06
get into this business and uh and uh and
00:58:09
it's a good thing for Rivian who, you
00:58:11
know, it's a tough struggle to get these
00:58:13
cars to get a car company going. Um,
00:58:15
your thoughts on Rivian?
00:58:17
>> I I think it's a win-win. I I think it's
00:58:20
uh Rivian is subscale. Automobile
00:58:22
platforms cost so many billions to
00:58:23
produce. I think Rivian has done as good
00:58:25
a job as anyone. When I move
00:58:28
back to the US, if I buy a
00:58:31
car and I've really enjoyed not having a
00:58:33
car for four years, I'm probably going
00:58:34
to buy a Rivian.
00:58:36
>> The two is nice. It's smaller.
00:58:37
>> I was one of those people that put 5,000
00:58:39
bucks down on it like five, six years
00:58:41
ago and never took delivery of it. By
00:58:43
the way, I should probably look into
00:58:44
that.
00:58:45
>> Um, new Kalshi market: What's the
00:58:48
likelihood that Scott gets his
00:58:49
money back?
00:58:50
>> I think they're there. Look,
00:58:52
Tesla's missed a real opportunity here
00:58:54
again and again and again, but I don't
00:58:55
think he cares about the cars anymore,
00:58:56
does he? I mean, he he was introducing a
00:58:58
a Cybercab that doesn't exist and isn't
00:59:01
being used anywhere. I mean, think
00:59:03
between Waymo and Rivian, I think
00:59:04
they've sort of run around.
00:59:05
>> But let me It's also very one,
00:59:09
>> they need more scale. So, this is a
00:59:11
great win for Rivian. Two, I think one of
00:59:14
the biggest brand enhancements is to be
00:59:16
known like there are few brands that
00:59:19
have fallen further faster in the last
00:59:20
20 years and made shittier cars than
00:59:22
Jaguar. This is one
00:59:25
of the great British brands in history.
00:59:27
The design and the and the cars the last
00:59:30
20 years have just been remarkably
00:59:31
uninspiring.
00:59:33
Now the best brand move in my opinion of
00:59:35
Jaguar is they have been um the car of
00:59:38
choice that I've seen for
00:59:41
>> Waymos. Yes, they are.
00:59:42
>> So immediately it's like, "Oh, Jaguar is
00:59:45
the kind of the Pepsi generation new
00:59:48
cool car." I didn't even know, I
00:59:50
had to look. I didn't even recognize
00:59:52
the car. That's a Jaguar.
00:59:54
>> So it's brand-enhancing for Rivian. It
00:59:56
gives them all sorts of scale.
00:59:58
>> And also what people have
01:00:01
underappreciated is that the biggest
01:00:03
winner, the obvious biggest winner in
01:00:06
autonomous, regardless of all the
01:00:08
press releases, people realize
01:00:10
it's not Tesla. It's likely Waymo. They
01:00:12
have the capital. They're miles ahead of
01:00:13
everyone. They have exponentially more
01:00:16
miles under
01:00:18
their belt in terms of testing this. But
01:00:20
there's an outside shot that the biggest
01:00:23
winner here
01:00:24
>> Mhm.
01:00:25
>> is going to be Uber
01:00:26
>> because when you control
01:00:28
>> sort of like the Apple. See, they're
01:00:29
sort of like the Apple.
01:00:31
>> I love that. In consulting, we
01:00:33
always used the term
01:00:35
custody of the consumer. My first client
01:00:38
was the Levi Strauss company, and they were
01:00:39
always complaining about JCPenney and
01:00:41
Sears. I'm like, "Yeah, but they have
01:00:42
custody of the consumer. You need to
01:00:43
open your own stores. You need to go
01:00:45
vertical to control the relationship
01:00:46
with the consumer."
01:00:48
>> In the US, Uber has 75% market share.
01:00:52
They're basically a monopoly.
01:00:54
>> Yeah. Dara is very effective.
01:00:56
>> And so what they can do is they can say
01:00:58
they can push up an icon saying, "Why do
01:01:00
you need to download the Waymo or the
01:01:02
Tesla app? Just click here for
01:01:04
driverless." Yeah, they could also do
01:01:06
deals with Waymo too by
01:01:08
>> and they can play them off against each
01:01:09
other. They can find the company that
01:01:11
wants to work with them the most and get
01:01:13
market share the same way Apple
01:01:15
>> You could also use Uber to summon Waymo
01:01:17
if you I mean think why not like why not
01:01:19
>> my point. Yeah. And then take a take a
01:01:22
large margin. So what did Apple do?
01:01:24
Because they controlled custody of the
01:01:26
billion wealthiest people in the world
01:01:28
through UI and people don't want to
01:01:29
learn a new app. They extract $20
01:01:32
billion a year from Alphabet to
01:01:34
make Google the default search
01:01:37
engine.
01:01:37
>> Uber's in a position to extract
01:01:39
extraordinary deals around autonomous
01:01:42
>> and make it and say to people, "Oh yeah,
01:01:44
you want autonomous? No problem. Here's
01:01:46
the Uber app
01:01:47
>> you love." And so, look, Waymo,
01:01:51
>> it's going to be interesting.
01:01:52
Autonomous. I think I think one of the
01:01:54
places that AI actually comes to
01:01:55
fruition and exceeds our expectations is
01:01:57
around autonomous.
01:01:58
>> I agree. The question is, what's
01:02:01
interesting is
01:02:02
>> two of the biggest winners hands down
01:02:04
>> are going to be Uber and Waymo. And I
01:02:07
wouldn't be surprised if Uber is in fact
01:02:09
the biggest winner because they have
01:02:11
custody of the consumer.
01:02:12
>> Yeah. Ultimately, I've been a big, as
01:02:14
you know, a big proponent of of
01:02:16
self-driving in a safe mode. I I will
01:02:18
tell you, I would never get in a Tesla
01:02:20
given. I had a long talk with R.J.
01:02:23
about, you know, I think he's he's more
01:02:25
on you don't need this many points of
01:02:27
safety, but he put them on there anyway,
01:02:29
right? And so compared to Elon who's
01:02:32
like, I just have one camera with the
01:02:33
guy in the back. Like I feel so unsafe
01:02:35
in Teslas in that regard. Um, and I
01:02:38
think the way Waymo's done it is
01:02:40
correct, but you're right. Uber's in
01:02:41
a very... it could
01:02:43
have been easily sidelined by all these
01:02:45
companies, but I always
01:02:47
used to say they have the reservation
01:02:48
system and that and you're right it's
01:02:51
the chain of custody and and you do
01:02:52
trust Uber what a brand I mean I know
01:02:55
Travis Kalanick is trying to come back in
01:02:56
this sector, but I got to say Dara took
01:02:59
that company and really made it into one
01:03:01
like, you know. You know what has
01:03:03
comparatively
01:03:03
>> really been eye opening for me and it
01:03:05
goes to something you said that's always
01:03:06
really resonated with me and that is
01:03:08
>> the thing about tech executives: They're
01:03:10
traditionally white males who went to
01:03:12
elite schools, raised in wealthy
01:03:14
families. And when you've never been a
01:03:16
victim, it's difficult to understand
01:03:19
victimization. That's always struck me
01:03:20
that: until you walk in those shoes,
01:03:23
you don't. And you know what women say
01:03:25
to me, it makes so much sense. I
01:03:27
just never realized it. I get into an
01:03:30
Uber, the driver usually doesn't
01:03:32
talk to me. I don't want to talk to him.
01:03:34
And I know that sounds terrible. I just
01:03:35
don't I don't want to talk. I want to be
01:03:36
on my phone. Every woman I've talked to
01:03:39
says when they get in an Uber, the Uber
01:03:41
driver tries to chat her up.
01:03:44
>> And it's not me, but Yes. Yes.
01:03:46
>> Well, it's uncomfortable, especially if
01:03:49
you talk to young women.
01:03:51
>> Mhm.
01:03:51
>> And they don't. And you know who
01:03:53
really loves Waymo? Women.
01:03:56
>> Women. They do. Or else you can also now
01:03:57
on Uber, by the way, uh request a woman.
01:04:01
>> Um there's a they they've done a great
01:04:03
job. Let me tell you, Dara. I
01:04:05
don't agree with him on everything. I think
01:04:06
sometimes he can be a little too
01:04:07
compromise-y with terrible people. I think
01:04:09
he knows it. He's a great He's a great
01:04:12
CEO. He's done a great job here.
01:04:14
>> All right. Uh, one more quick break.
01:04:16
We'll be back for predictions.
01:04:19
>> Support for the show comes from Back
01:04:20
Market. Listen, there's a lot of ads out
01:04:22
there telling you to buy new products.
01:04:24
I've actually found that buying a new
01:04:26
car makes no sense. I've owned cars my
01:04:28
whole life, raised in California, and
01:04:31
I've bought a bunch of new cars and a
01:04:32
bunch of used cars, and I think that
01:04:34
quite frankly, used is the way to go.
01:04:35
And it's the same with tech products,
01:04:37
but Back Market is giving you the option
01:04:39
to find sustainable tech products at a
01:04:40
much more affordable rate. Back Market
01:04:42
offers a range of highquality tech
01:04:45
inspected and refurbished by
01:04:46
professionals. It's all they do. They
01:04:48
have phones, computers, gaming consoles,
01:04:50
vacuum cleaners, and even iPods. All of
01:04:52
the tech at Back Market has been
01:04:53
inspected and restored by professionals
01:04:55
to ensure it is in perfect working
01:04:57
condition. They offer 30-day returns and
01:04:59
a one-year warranty. And not only is
01:05:01
Back Market refurbished tech more
01:05:03
affordable than buying new, it's also
01:05:04
more sustainable. Back Market is on a
01:05:06
mission to reduce the environmental toll
01:05:07
that fast tech has on our planet as
01:05:09
refurbished tech has proven to use less
01:05:11
raw materials, leave behind less waste,
01:05:13
and create fewer carbon emissions than
01:05:15
new tech. They say e-waste is the
01:05:17
fastest-growing waste stream in the
01:05:19
world and it's an issue that Back Market
01:05:21
wants to actively address through
01:05:22
initiatives including their Earth Month
01:05:24
campaign. Shop now at backmarket.com.
01:05:31
Okay, Scott, let's hear a prediction.
01:05:32
Can I just start very quickly? Of
01:05:34
course. I predict this MAGA micropenis
01:05:37
war is going to get worse. And I am here
01:05:39
for it. Do you know about this? Right.
01:05:42
Megyn Kelly said Mark Levin had a
01:06:44
micropenis and then President Trump defended
01:05:47
his micropenis and then Marjorie Taylor
01:05:49
Greene came in with a micropenis and
01:05:52
Megyn Kelly's doubling down on it. It's
01:05:54
completely crude and awful and repulsive
01:05:56
and I think it's going to get a lot
01:05:58
worse and I'm very pleased. Thank you.
01:06:02
>> Yeah, that's my prediction.
01:06:02
>> No, I I I don't I don't I think it's
01:06:04
inappropriate to talk about a man's
01:06:06
genitalia. Um, by the way, I was at a
01:06:09
stall last night and a guy looked over
01:06:11
and he said, "Circumcised?" And I said,
01:06:13
"Nope, that's just the wear and tear."
01:06:16
>> Oh my god, you've told that joke before.
01:06:19
I'm going to start clocking your penis.
01:06:21
By the way,
01:06:22
>> I'm not going to say Adam Grant, but
01:06:23
Adam Grant said, "You got to cut back on
01:06:25
the penis jokes. At the party you missed,
01:06:26
cuz you wrote the
01:06:27
>> Oh, my nemesis."
01:06:28
>> Yes. Your nemesis was like
01:06:30
>> the more successful version of Scott
01:06:31
Galloway.
01:06:32
>> He's doing a podcast on the Vox Media
01:06:34
Podcast Network with, uh, with Brené Brown,
01:06:36
the two of them. They're trying to be
01:06:37
the nice version of Scott and Kara, I
01:06:39
think. Um and he was he commented on
01:06:42
your penis jokes and I said, and I
01:06:43
literally Scott I I said I love them. I
01:06:47
defended you so hard.
01:06:48
>> I appreciate that. And I was like,
01:06:50
"People love them." And he's like,
01:06:51
"Yeah, but do you think it's the right
01:06:52
thing?" I go, "It's the right thing. I'm
01:06:55
no matter how much my brain
01:06:56
>> That's right. I don't want to be wrong."
01:06:57
And then also also I want to be a little
01:07:00
bent to the left. Um
01:07:02
>> anyway, just congratulations
01:07:04
and Adam on
01:07:05
>> It's a condition. I'm a special needs
01:07:08
person.
01:07:09
>> Anyway,
01:07:09
>> it's a condition. I'm here
01:07:11
for now, talking about micropenises. Sorry, Adam,
01:07:14
but the micropenis war cracks me
01:07:16
up and I'm here for it and I hope
01:07:18
more to come and I think I think we're
01:07:20
not done with the micropenis.
01:07:22
>> I've got to be honest, I love it when they're at war
01:07:23
with each other because one of the
01:07:24
things I don't like about the Democratic
01:07:26
party
01:07:27
>> is that I find for the most part when I
01:07:30
just went on this great podcast, PBD, with
01:07:34
this really lovely guy. He's a
01:07:35
conservative out of Fort Lauderdale.
01:07:37
>> Oh, you mean that guy? Mhm.
01:07:39
>> Oh, I like him. I thought he was nice.
01:07:40
Um anyways, um
01:07:44
the thing I find generally speaking
01:07:45
about Republicans is they're like, "Oh,
01:07:47
you want to be a Republican? Come on
01:07:48
in."
01:07:49
>> Uh and when you say, "Oh, I want to be a
01:07:51
pro progressive." It's like, "We'll
01:07:53
see."
01:07:55
>> I feel like we apply way more scrutiny.
01:07:58
>> There's a new Democrat in town. But go
01:08:00
ahead.
01:08:01
>> If you don't choose the right words, if
01:08:02
you don't hold the gun correctly, let's
01:08:04
court-martial you and hang you.
01:08:06
>> It's the right that's doing it now. But
01:08:08
go ahead. No, you know what this
01:08:09
is? This is a bunch of podcasters who
01:08:13
know the algorithm: the more fights they
01:08:15
get into and the more incendiary they
01:08:16
are.
01:08:16
>> I suppose you're right.
01:08:17
>> You know, Candace Owens makes a living
01:08:19
off of saying really vile things because
01:08:22
the algorithms and the reason our nation
01:08:24
is being torn apart at the seams is
01:08:26
there's now a financial incentive in
01:08:28
being vile and incendiary. The
01:08:30
algorithms love it. In a world where
01:08:32
there were editors and fact checkers and
01:08:34
more reasonable people saying is that a
01:08:37
reasonable thing we want to print? She
01:08:39
would be nowhere. Anyway. So I
01:08:43
I don't I love it when these guys fight
01:08:46
but at the end of the day it's it's it's
01:08:48
indicative of a bigger problem and that
01:08:51
is our media, our overlords,
01:08:54
our algorithms, deciding that this is
01:08:57
news.
01:08:57
>> Oh, you're so good. I don't care. I like
01:08:59
the micropenis stuff. Anyway, I defended you
01:09:01
to Adam Grant on the page. I appreciate
01:09:03
that. All right.
01:09:04
>> I tell you, other than academic
01:09:05
credibility and talent and higher IQ,
01:09:08
that that dude has nothing on me.
01:09:09
>> That dude has nothing on you. Do you know he
01:09:11
was a diver in college?
01:09:13
>> Even better.
01:09:13
>> He was a diver.
01:09:14
>> Yeah.
01:09:15
>> All his sort of like, you know, his
01:09:18
tweets about, you know, characters doing
01:09:21
the right thing when no one's looking.
01:09:22
Oh, you.
01:09:23
>> All right, move along.
01:09:25
>> Adam Grant, you and Bnee Brown.
01:09:28
thoughtfulness and deep
01:09:29
>> interest. Pound for pound, they're better
01:09:31
people than us, but that's okay.
01:09:32
>> Well, that's Brené, obviously.
01:09:35
>> I love that, Brené. Anyway,
01:09:37
>> I want to be I like them both and Adam
01:09:39
is a friend, so I I trust he's taking us
01:09:41
all in just
01:09:42
>> I hope so. One would assume. Maybe
01:09:44
they'll discuss it on their new show.
01:09:46
Maybe we should have a Rumble with them.
01:09:47
>> I think we could be evil twins. I think
01:09:49
with his intellect and my I don't know,
01:09:52
my something, we could take over
01:09:53
Australia and Brené would be queen of
01:09:55
>> a crossover show. We're gonna I'm gonna
01:09:56
invite them on a crossover show. All
01:09:57
right. We could switch partners. You
01:09:59
know when they switch the husbands and
01:10:00
wives?
01:10:01
>> You know, I've tried it, but I'm
01:10:03
the one that ends up alone and no one's
01:10:05
up for it.
01:10:06
>> You know it's called show. You could
01:10:08
have Brené and, well, you've been on
01:10:09
Bernation anyway. Finish. Do your do
01:10:11
your prediction. Prediction.
01:10:12
>> Oh, I'm sorry.
01:10:14
>> Uh, my prediction is OpenAI's Sora social
01:10:17
media app will be shut down soon.
01:10:19
>> Oh,
01:10:20
>> Sora. Really? What do you know? You know
01:10:22
something?
01:10:23
>> No, I don't. I I've done no original
01:10:25
reporting.
01:10:26
Okay. All right.
01:10:26
>> Um, but they're focusing, which is the
01:10:29
right thing to do. Sora is essentially
01:10:31
OpenAI's, uh, sort of a TikTok-like
01:10:34
social media platform for AI
01:10:36
generated content.
01:10:37
>> Yeah, it's pointless.
01:10:38
>> And users use their video model to
01:10:41
generate short form content and they can
01:10:44
upload it and share it, right?
01:10:47
And upon its release, Sora came out at
01:10:50
number one in the app store. Um, and
01:10:54
actually got more downloads out of the
01:10:56
gates than ChatGPT did. However, like
01:10:59
the party's ended. Downloads fell 32%
01:11:02
month over month in December and another
01:11:04
45% in January. And so Sora is the
01:11:06
little engine that didn't. And also
01:11:10
users continue to drop like flies. Um,
01:11:13
and but at the same time, OpenAI
01:11:15
has to spend a ton of money to keep the
01:11:17
lights on there. And some estimates are
01:11:20
that
01:11:20
>> also brings a lot of legal challenges.
01:11:22
But
01:11:23
>> well, it's costing them 15 million bucks
01:11:25
a day or 5 billion a year.
01:11:27
>> And despite that, the app is bringing in
01:11:30
less than half a million dollars per
01:11:31
month. And given their new focus, which
01:11:33
is the right one,
01:11:35
>> uh, on focus,
01:11:37
>> it's not central to OpenAI's, uh, core
01:11:40
competencies. They're an AI company, not
01:11:42
a social media company.
01:11:43
>> It's not creating revenue.
01:11:45
>> Yep.
01:11:46
>> Big losses. And also, it's really
01:11:48
unpopular. 62% of Americans disapprove
01:11:51
of online videos created with AI.
01:11:53
>> A lot more trouble than it's worth.
01:11:54
>> It feels dystopian. 70% of people
01:11:56
globally would be uncomfortable
01:11:57
consuming fully AI generated creative
01:11:59
content. So
01:12:01
>> this new focus, this new adult in the
01:12:03
room saying we need to focus on the
01:12:05
enterprise market is now quite frankly
01:12:07
we have ceded so much share and value
01:12:09
to Anthropic. The first
01:12:11
>> Yeah. It's stupid. It's stupid.
01:12:13
>> Anyways, the first example I like of
01:12:15
this focus is that OpenAI Sora, uh, rest
01:12:19
in peace. It's going to be shut down.
01:12:21
>> All right. Okay. Oh, I like that. That's
01:12:22
a big that's a big call. I think that's
01:12:24
a good one. Um I just want to make one
01:12:26
other note before we go. Um, President
01:12:29
Trump's comments about dyslexia. Um, I
01:12:31
have a lot of friends who have dyslexia,
01:12:33
by the way. He said Governor Newsom
01:12:34
should not be president because he had
01:12:36
dyslexia or has, but uh, gross. What a
01:12:39
gross thing to say. I just was I'm just
01:12:41
like, stop it. Like, stop
01:12:44
demonizing things that are learning
01:12:46
disabilities. It's gross. He does it
01:12:47
all the time, but it's a continual thing
01:12:49
and everyone just lets him go. But
01:12:51
honestly, what it just And I predict it
01:12:54
will have bad effects. Anyway,
01:12:56
>> are you really are you really surprised
01:12:59
though?
01:12:59
>> No, but I just don't I'm like no. Like
01:13:01
we I I I think we should keep saying no
01:13:03
to this. Ugh. It was just gross. I
01:13:05
have a lot of friends.
01:13:06
>> I don't know. I I put the I put the sex
01:13:08
in dyslexia. Wait. Sexy dyslexia. Sexy
01:13:12
>> Anyway. Anyway,
01:13:15
that's good. Got it. I got it. Anyway,
01:13:16
you're grotesque. I I'm sorry. I think
01:13:18
we have to call these out all the time.
01:13:20
Anyway, we want to I'm not angry about
01:13:21
it. It's just
01:13:23
um we want to hear from you. My favorite
01:13:25
stripper has uh has dyslexia. Her name
01:13:28
was Density.
01:13:32
>> Adam Grant, take that. Talk about
01:13:35
character and being a good manager. Try
01:13:37
to compete with my my stripper density.
01:13:40
My dyslexic stripper density.
01:13:43
>> Yeah, right.
01:13:44
>> Okay. Let's see who wins the iHeart
01:13:47
podcast of the year award.
01:13:48
>> Density should be president. Anyway, we
01:13:50
want to hear from you. To send us your
01:13:52
questions about business, tech, or
01:13:53
whatever's on your mind. Go to
01:13:54
nymag.com/pivot
01:13:56
to submit a question for the show or
01:13:57
call 855-51-PIVOT. Okay, that's the show.
01:14:00
Thanks for listening to Pivot and be
01:14:01
sure to like and subscribe to our
01:14:03
YouTube channel. We'll be back next week
01:14:06
because there's so much news.

Badges

This episode stands out for the following:

  • Most shocking (60)

Episode Highlights

  • The Importance of Introspection
    Introspection is crucial for personal growth and understanding our actions' ramifications.
    “Introspection is how we move forward as a species.”
    @ 04m 05s
    March 20, 2026
  • Critique of Modern Masculinity
    A discussion on how modern views of masculinity dismiss introspection as a weakness.
    “Introspection isn’t some AI guy who vests his shares and then scares the world.”
    @ 06m 52s
    March 20, 2026
  • Meta's Metaverse Struggles
    Meta is shutting down its VR metaverse, Horizon Worlds, after failing to attract users. Over $70 billion was spent on the project, which never gained traction.
    “I never liked the metaverse.”
    @ 22m 56s
    March 20, 2026
  • OpenAI's Strategic Shift
    OpenAI is scaling back on projects to focus on coding and business users, responding to competition from Anthropic.
    “OpenAI is focusing which is the right thing to do.”
    @ 34m 16s
    March 20, 2026
  • Bob Iger Steps Down as Disney CEO
    Bob Iger passes the baton to Josh D'Amaro, marking a significant leadership change at Disney.
    “Iger is set to stay on as an adviser until the end of 2026.”
    @ 35m 56s
    March 20, 2026
  • Kalshi Facing Criminal Charges
    Kalshi's prediction market platform faces legal issues as it operates without a gambling license.
    “This is the first criminal prosecution against a prediction market company.”
    @ 47m 37s
    March 20, 2026
  • The State of Vegas
    A chilling description of Las Vegas's current emptiness and its impact on businesses.
    “Vegas is dead.”
    @ 52m 26s
    March 20, 2026
  • Regulation in Gambling
    A conversation about the need for regulation in gambling markets and its implications.
    “I think they want some regulation.”
    @ 53m 04s
    March 20, 2026
  • Uber's Strategic Move
    Uber plans to invest in Rivian, potentially reshaping the autonomous vehicle market.
    “I think it’s a win-win.”
    @ 58m 20s
    March 20, 2026
  • Uber's Market Dominance
    With 75% market share, Uber is positioned as a monopoly in the ride-hailing industry.
    “They’re basically a monopoly.”
    @ 01h 00m 52s
    March 20, 2026
  • Micro Penis Discussion
    A humorous take on micro penis jokes and their cultural significance.
    “I’m here for it and I hope more to come.”
    @ 01h 07m 16s
    March 20, 2026
  • AI Generated Content Concerns
    A critical look at the unpopularity of AI-generated videos and their implications.
    “62% of Americans disapprove of online videos created with AI.”
    @ 01h 11m 51s
    March 20, 2026

Episode Quotes

Key Moments

  • Critique of Introspection @ 06:52
  • Tillis's Stance @ 18:11
  • Meta's Metaverse News @ 22:56
  • Vegas is Dead @ 52:26
  • Gambling Regulation @ 53:04
  • Uber and Rivian @ 58:20
  • Micro Penis Humor @ 1:07:16
  • Dyslexia Critique @ 1:12:39
