
Nancy Guthrie Disappearance Raises New Surveillance Questions | Pivot

February 13, 2026 / 01:05:50

This episode of Pivot covers economic protests, the impact of unsubscribing from tech platforms, and the ongoing Epstein hearings. Hosts Kara Swisher and Scott Galloway discuss how celebrity actions, like Chelsea Handler's Instagram post, can lead to significant economic consequences for companies. They highlight unsubscribing from services as a way to protest companies like OpenAI, emphasizing that individual actions can collectively add up to substantial financial impact.

The conversation shifts to Attorney General Pam Bondi's testimony regarding the Epstein case, where she faced criticism for her handling of the situation and her lack of engagement with victims. Galloway and Swisher express their disbelief at her comments and the overall state of the hearings, noting the disconnect between the administration's focus on economic metrics and the serious nature of the allegations against Epstein.

They also touch on the trial against Meta and YouTube, where the platforms are accused of designing addictive experiences for young users. The hosts discuss the implications of social media addiction, citing statistics on teen usage and its correlation with mental health issues.

Finally, the episode concludes with a discussion on surveillance technology, particularly in relation to the Nancy Guthrie abduction case, and the ethical concerns surrounding privacy and data retention by companies like Google.

TL;DR

Hosts discuss economic protests, the Epstein hearings, social media addiction, and surveillance technology in this episode of Pivot.

Video

00:00:00
I'm glad they got these pictures of this
00:00:01
guy. At the same time, this is an edge
00:00:04
case. They're they're keeping your video
00:00:06
that which I which everyone thought they
00:00:08
were doing and they said they weren't.
00:00:16
>> Hi everyone, this is Pivot from New York
00:00:18
Magazine and the Vox Media Podcast
00:00:19
Network. I'm Kara Swisher.
00:00:21
>> And I'm Scott Galloway.
00:00:22
>> Scott, we just did a great On with Kara
00:00:25
Swisher about resist and unsubscribe,
00:00:27
but I'd like
00:00:28
>> you have another podcast. Did I find
00:00:29
that out? This
00:00:30
>> dude, you were quite substantive. Where
00:00:32
are we right now? Give us a quick
00:00:33
update.
00:00:34
>> Uh, it lulled Tuesday and Wednesday. It
00:00:36
appears to have come back today because
00:00:38
Chelsea Handler, who reached out to me,
00:00:41
posted something of all the things she
00:00:43
was unsubscribing to. And just to give
00:00:44
you an example of how much impact one
00:00:46
person can have. Uh, I went on AI, I
00:00:50
went on to my site analytics. I think
00:00:51
she just that one video she did on
00:00:54
Instagram, that one post is going to
00:00:55
inspire 6 to 7,000 unique site visits.
00:00:58
conversion of 5% that's 300 people
00:01:01
unsubscribing average of two platforms
00:01:04
600 unsubscribes average 200 that's
00:01:06
$12,000
00:01:08
times or excuse me $120,000 times 10 so
00:01:12
$1.2 million in market cap getting
00:01:15
taken out of these companies because of
00:01:16
one Insta Post. So,
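Galloway's chain of numbers above can be sanity-checked in a few lines. This is only a sketch of his on-air arithmetic: the rounding down to 300 people, the $200 average annual subscription, and the 10x revenue-to-market-cap multiple are all his assumptions, not audited figures.

```python
# Back-of-the-envelope version of the unsubscribe math quoted on air.
site_visits = 6500        # midpoint of "6 to 7,000 unique site visits"
conversion = 0.05         # 5% of visitors actually unsubscribe
people = round(site_visits * conversion)   # ~325, rounded on air to 300
people_quoted = 300
platforms_each = 2                          # average platforms each person drops
unsubscribes = people_quoted * platforms_each          # 600

avg_annual_revenue = 200                    # assumed $ per subscription per year
lost_revenue = unsubscribes * avg_annual_revenue       # $120,000 a year
valuation_multiple = 10                     # assumed market-cap-to-revenue multiple
lost_market_cap = lost_revenue * valuation_multiple    # $1.2 million

print(f"unsubscribes: {unsubscribes}")
print(f"lost revenue: ${lost_revenue:,}/yr -> market cap hit: ${lost_market_cap:,}")
```

Under those assumptions, one Instagram post maps to roughly $1.2 million of market value, which matches the figure he lands on after correcting himself.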
00:01:17
>> Right. Exactly. And, you know, I'm going
00:01:19
to see her uh tomorrow night, I think.
00:01:22
Um, tomorrow night. She's here in DC. We
00:01:24
should get them all to do things like
00:01:26
that. Let's let's reach into the celebs,
00:01:28
we know, and get them to do.
00:01:30
>> I'm going to bug them all.
00:01:32
>> Okay.
00:01:32
>> I like it. Thank you.
00:01:33
>> Yeah. If they do that and put even just
00:01:35
one thing up, it matters and it's an
00:01:37
easy thing for a lot of them. And they
00:01:39
>> what people don't realize about about
00:01:40
economic protests, the most famous one
00:01:42
was a Montgomery bus strike. It wasn't
00:01:44
the one cinematic moment. It was it was
00:01:46
um an organization of thousands of car
00:01:49
pools over the course of a year.
00:01:50
>> Yeah.
00:01:51
>> So it takes it takes a while, but
00:01:54
>> any individual who unsubscribes
00:01:56
from OpenAI right now is taking
00:01:58
$10,000 out of their market valuation,
00:02:01
>> which is great. And that
00:02:02
>> and there's a substitute, the free the
00:02:04
free ChatGPT
00:02:05
>> and also all kinds of other free
00:02:07
services. Gemini, all the others, you
00:02:09
don't have to pay for it necessarily.
00:02:11
And by the way, you can use things for
00:02:13
free. You're taking stuff from them,
00:02:15
right, without paying them. Like paying
00:02:17
is the issue is what you pay for. So
00:02:19
just keep that in mind. Everyone's like,
00:02:20
"Oh, now I can't use Google." I'm like,
00:02:22
"No, it's free."
00:02:24
>> Well, just just to give an example how
00:02:25
how these unsubscribes hit these
00:02:27
recurring revenue tech platforms or tech
00:02:30
companies, T-Mobile just had an earnings
00:02:33
call. They were projected to add 992,000
00:02:36
new subscribers. They added 962,000. So
00:02:39
30,000 fewer because 30,000 people
00:02:42
didn't show up for subscription. So not
00:02:44
only do these do these actions punch
00:02:46
above their weight class in terms of
00:02:47
economic impact, if you take if Sam
00:02:51
Altman grows his subscriptions 7 and a half percent
00:02:55
versus 8% a month, he's not going to close
00:02:58
his $850 billion round.
00:03:00
>> Yep.
00:03:01
>> So this is literally this is the string.
00:03:03
If you if if you don't have the time or
00:03:06
the energy to do some of the very other
00:03:07
important work, whether it's protests or
00:03:09
or or calling your congressman, you can
00:03:12
have a massive impact by unsubscribing
00:03:14
right now.
00:03:15
>> Yeah, you can. Now, speaking of which,
00:03:17
and something the administration does
00:03:18
care about, Attorney General Pam Bondi,
00:03:20
who we'll talk about more in a minute,
00:03:21
was testifying in front of the the Jan,
00:03:24
crazy Jan, was testifying in front of
00:03:26
Congress about Epstein on Wednesday. She
00:03:28
made it clear she'd prefer to be talking
00:03:29
about other things. What did she zero in
00:03:31
on? Let's listen. Dow is over $50,000. I
00:03:35
don't know why you're laughing. You're a
00:03:37
great stock trader, as I hear, Raskin. The
00:03:39
Dow is over $50,000
00:03:42
right now. The S&P at almost 7,000
00:03:46
and the NASDAQ smashing records.
00:03:50
Americans 401ks and retirement savings
00:03:53
are booming. That's what we should be
00:03:56
talking about.
00:03:57
>> Well, she's not the Treasury Secretary,
00:03:59
but this is what shows what they care
00:04:00
about. They really do. The fact that it
00:04:02
was inappropriate to bring this up in
00:04:04
here given they were talking about
00:04:05
victims sexual uh uh abuse victims, but
00:04:09
nonetheless, this is what floats their
00:04:11
boat is is this money, right? And so,
00:04:14
let's also listen to a great idea one of
00:04:15
our listeners sent in.
00:04:17
>> Every child of an elderly person should
00:04:21
also go through all of their parents
00:04:24
subscriptions. I went through my
00:04:26
mother's this weekend and was able to
00:04:28
take $125
00:04:30
off of some bills by unsubscribing
00:04:33
subscriptions she didn't even know she
00:04:35
had.
00:04:36
>> That is a great idea. I do that with my
00:04:37
mom all the time and I'm trying very
00:04:39
hard to take the New York Post off of
00:04:41
her subscriptions.
00:04:42
>> Two years after my mother died, I found
00:04:44
that Geico was still taking $220 out of
00:04:47
her bank account a month for car
00:04:49
insurance.
00:04:50
>> Wow. Crazy. if you don't, and I've used
00:04:52
this example before, when I unsubscribed
00:04:54
from AT&T, went to Noble, I'm saving
00:04:56
about 20 or 30 bucks a month, but in
00:04:58
addition, I found out
00:04:59
>> I had four accounts with AT&T for
00:05:01
Blackberries and iPads, which have been
00:05:03
in landfills for years, cuz I never went
00:05:05
on and unsubscribed them. And even
00:05:07
though they know they're not getting a
00:05:08
GPS signal from these things, and they
00:05:10
could send you an email saying, "Hey,
00:05:11
you know, you're paying 70 bucks a month
00:05:13
>> for something you haven't used in 5
00:05:15
years,
00:05:15
>> you're going to save money. It's these
00:05:17
companies are very good at figuring out
00:05:19
a way to get you to subscribe and get
00:05:20
you to forget that it's coming that this
00:05:23
money is coming out of your pocket every
00:05:24
month.
00:05:24
>> Yeah. You know, there's a couple
00:05:25
services and I don't have the names to
00:05:27
show where your subscriptions are and to
00:05:29
unsubscribe, but this is a better way to
00:05:31
do it. But then you can use those
00:05:32
services to find them all over the
00:05:34
place. You'd be surprised of what you're
00:05:36
I found an AT&T thing that was still from
00:05:38
when Apple first had the iPhone when
00:05:40
they had unlimited if you remember.
00:05:42
Anyway, uh it's a great thing to do.
00:05:44
Keep going. We're going to do more.
00:05:45
We're gonna every little thing we can
00:05:47
pull on. The administration cares about
00:05:48
this issue. Uh it's the only thing left
00:05:51
is the DAO at this point. The fallout
00:05:53
from the
00:05:54
>> What's that $50,000? She's the [ __ ]
00:05:56
attorney general. She clearly knows
00:05:58
nothing about economics. What is she
00:05:59
talking about? She called the Dow.
00:06:02
>> I know. Also, also calling a
00:06:03
Representative Raskin. Who the [ __ ]
00:06:06
does she think she is? She's in his
00:06:08
house. She's in his house. She calls him
00:06:10
Raskin. I'm going to call you Galloway
00:06:12
when I use your house. Hey, Galloway.
00:06:14
Anyway, the fallout from the Epstein
00:06:17
files continues. Speaking of which, as I
00:06:18
mentioned, crazy Attorney General Pam
00:06:20
Bondi, who really needs to be medicated,
00:06:23
testified before the House Judiciary
00:06:25
Committee on Wednesday, and things got
00:06:27
heated. She soiled herself multiple
00:06:29
times. Um, Bondi sparred with
00:06:31
Democrats, not just Democrats, over DOJ's
00:06:33
handling of the Epstein files and
00:06:35
refused to apologize to survivors. She
00:06:37
wouldn't even look at them there. It
00:06:38
turned out she's never talked to them.
00:06:40
She also clashed with uh GOP Congressman
00:06:42
Thomas Massie. Massie criticized Bondi
00:06:45
and the DOJ for failing to redact
00:06:46
victims' names while blacking out the
00:06:48
names of businessmen uh businessman Les
00:06:50
Wexner. Let's listen to the exchange.
00:06:53
>> Within 40 minutes, Wexner's name was
00:06:56
added back
00:06:57
>> within 40 minutes of me catching you
00:06:59
redhanded.
00:07:00
>> Red hand. There was one redaction out
00:07:06
and we invited you in. We This guy has
00:07:09
Trump derangement syndrome. He needs to
00:07:11
get You're a failed politician.
00:07:14
Uh really crazy crazy craziness I have
00:07:18
to say. I just don't know what to say.
00:07:19
She's What is wrong with her? Like, seriously,
00:07:22
speaking of derangement syndrome. Like
00:07:25
honestly I don't know what she was doing
00:07:27
up there. I know it's an audience of one
00:07:29
but he can't even find this impressive.
00:07:31
It's grotesque. I mean I don't know.
00:07:35
>> Yeah. It really feels like we have
00:07:38
>> wheels are coming off. I mean it's it's
00:07:39
it's a shame because it's just so
00:07:42
>> it's a serious issue
00:07:43
>> weird and it's the attorney general
00:07:46
making a mockery of the institution and
00:07:48
just
00:07:49
uh, no decorum, but I'm curious what
00:07:54
you thought about the hearings but the
00:07:56
moment that I found really chilling
00:07:59
was when I think it was representative
00:08:02
Jayapal
00:08:04
had um some of the the survivors uh
00:08:08
stand up and asked how many of them have
00:08:10
reached out to the DOJ
00:08:12
>> to provide evidence or input, but all
00:08:14
these survivors stood up.
00:08:16
>> Yeah.
00:08:16
>> And it was clear they've reached out to
00:08:18
the DOJ and the DOJ has um has is has
00:08:22
ignored them. And you thought, let me
00:08:23
get this, the Department of Justice
00:08:26
>> investigating what is arguably may go
00:08:28
down as the crime of the century to date
00:08:31
>> and survivors and people with direct
00:08:33
knowledge about what happened or what
00:08:35
didn't happen. They could also, quite
00:08:36
frankly, they might exonerate some
00:08:38
people.
00:08:38
>> Right. Exactly.
00:08:39
>> They don't want to talk to them.
00:08:41
>> Right. Right. And she wouldn't look at
00:08:42
them. That was another moment. She
00:08:44
wouldn't turn around. She wouldn't do
00:08:46
it. She This woman is insane. I just I
00:08:49
don't She a crazy one. She's like, it
00:08:52
was so strange. And And I know this
00:08:54
audience of one they always do, but in
00:08:56
this case, I was like, "Wow, you people
00:08:57
are desperate and terrified of what's
00:08:59
coming next for you." You know, I
00:09:01
thought Massie was effective. I thought
00:09:03
Becca Balint was effective. I thought
00:09:05
Jayapal's effective, Raskin, um I thought
00:09:08
they all one of the things someone who
00:09:10
works there said, "How do you think it
00:09:11
went?" And I said, "The only problem
00:09:13
with this kind of thing was you lay down
00:09:17
with pigs. The only one only people that
00:09:20
like wrestling with pigs are the pigs,
00:09:21
right? If you get in the mud with them."
00:09:23
But I thought they rel they relatively
00:09:25
handle it well. It's just that the the
00:09:27
craziness is what gets attention and not
00:09:29
the victims, right? It becomes a
00:09:30
ridiculous circus. And on some level,
00:09:33
what was interesting is Fox didn't show
00:09:35
it, right? They they they they keep
00:09:37
they're obsessed with the Nancy
00:09:39
Guthrie kidnapping, which is a terrible
00:09:41
thing, too. But they're not even airing
00:09:43
it. They don't want to see show you the
00:09:45
crazy like and any normal person looking
00:09:47
at this would be like, "What? Honey, you
00:09:50
need some you need some therapy like
00:09:52
stat kind of thing." And you're you
00:09:54
know, and what happened to you? So, I
00:09:56
thought that was it was a really
00:09:58
interesting. This Epstein thing isn't going
00:09:59
away, Pam. I'm sorry. It's just not now
00:10:01
cuz it's so very clear that you didn't
00:10:03
do your job and neither did people
00:10:05
before you, by the way. But guess what?
00:10:07
>> It's a valid point.
00:10:08
>> You're in the chair now. I don't really
00:10:10
>> It's her It's her DOJ.
00:10:12
>> It's her DOJ
00:10:12
>> and her boss her boss is mentioned
00:10:15
>> in the Epstein files more times than
00:10:17
Jesus is mentioned in the Bible or the
00:10:19
term meth is mentioned in Breaking Bad
00:10:21
over eight seasons. And I felt like
00:10:23
every day, every time yesterday, she
00:10:25
>> she claimed that, you know, the
00:10:26
president had been the most quote
00:10:27
unquote transparent president. When she
00:10:30
uses the term transparent, I think some
00:10:32
somewhere there's a thesaurus filing for
00:10:34
protective custody. It's just
00:10:36
>> why are you laughing at me was just
00:10:41
and al just it's it was so weird. It's
00:10:43
so weird. It's so culty. It's so
00:10:45
strange. One of the things I do think is
00:10:46
effective is a lot of these Congress
00:10:48
people are going in and seeing
00:10:49
unredacted versions which are very
00:10:50
upsetting. um
00:10:52
>> when they come out and they look like
00:10:53
they've seen a ghost.
00:10:54
>> I know. Even Cynthia Lummis, who was I
00:10:57
didn't know it was there now. Whoa.
00:10:59
Whoa. Folks, like Cynthia Lummis, I'm so
00:11:02
glad she's leaving politics. But I have
00:11:04
to say, even someone like that who
00:11:06
literally puts in the least effort
00:11:08
possible. Um same thing. They're looking
00:11:10
like, "Oh my [ __ ] god, you're kidding
00:11:12
me here." And you know, nobody's
00:11:14
>> I got to be honest. I didn't I didn't
00:11:16
realize it was this bad.
00:11:18
>> Yeah. when the more information you you
00:11:20
read about this
00:11:21
>> Yeah.
00:11:22
>> in terms of the number of victims.
00:11:24
>> Yeah.
00:11:25
>> In terms of how many people were
00:11:27
involved, uh how many
00:11:29
>> how many opportunities there were to
00:11:31
stop it.
00:11:32
>> Yeah.
00:11:33
>> And it just gets the web keeps getting
00:11:35
deeper and uglier.
00:11:36
>> Yes. And the lies like when commerce
00:11:38
Secretary Howard Lutnick was on Capitol
00:11:39
Hill this week as well. He told the
00:11:41
Senate committee he and his family had
00:11:42
lunch on Epstein's island in 2012, but
00:11:44
insisted he did not have a relationship
00:11:45
with him. Of course, this was he had
00:11:47
given this sort of haha interview with
00:11:50
one of these right-wing outfits where he
00:11:52
said, "I never in that."
00:11:53
>> He was indignant. I was disgusted by him
00:11:56
and I said, "We're never we're going to
00:11:58
have no contact with him again." And
00:12:00
here's the thing.
00:12:01
>> He took his kids. I I I took my four
00:12:03
kids and their nannies and I got all the
00:12:05
kids off the island. That by
00:12:07
>> But this is the thing. It's not It's
00:12:09
It's not It's usually not the the
00:12:11
infraction itself. It's the cover up. If
00:12:13
he had said,
00:12:14
>> why'd he say the first thing? That's
00:12:16
>> But if he had just said right up front,
00:12:18
he's a neighbor. He had powerful
00:12:20
friends. I didn't do the diligence I
00:12:22
should have. I went with me and my kids
00:12:23
to his island once cuz it sounded like
00:12:25
fun.
00:12:26
>> Yeah.
00:12:26
>> Okay. Poor judgment, but go along, get
00:12:29
along.
00:12:30
>> Instead of trying to wrap yourself in
00:12:32
some sort of indignance that you
00:12:33
immediately smelled a rat and you're
00:12:35
lying
00:12:36
>> and you decided to
00:12:38
>> I mean, if he just come clean in the
00:12:40
beginning, said like, "Yeah, it was bad
00:12:42
judgment. Took my kids to his island,
00:12:43
had a lunch. I'd heard he was a big
00:12:45
philanthropist and who knows maybe it
00:12:47
was okay. All right. Bad judgment. Move
00:12:51
along. But it again, it's the cover.
00:12:54
>> He had to take a laugh. He had to take a
00:12:56
I'm so pure laugh. And he that's cuz
00:12:58
he's a [ __ ] Let's let's be clear. This
00:13:00
guy's a [ __ ] And people are asking for
00:13:02
him to resign. He really is a liar. He's
00:13:03
a liar and a [ __ ] And it doesn't mean
00:13:06
he had to do anything, but he's a liar
00:13:08
and a [ __ ] The one that one that's
00:13:10
interesting under scrutiny is
00:13:12
entertainment executive Casey Wasserman.
00:13:14
Chappell Roan and other artists have cut
00:13:15
ties with Wasserman, as is their right,
00:13:17
after the latest files show he exchanged emails
00:13:19
with Ghislaine Maxwell and seemed to have some
00:13:21
kind of relationship with her, probably
00:13:24
extramarital, who knows. He serves as
00:13:26
chairman of the LA Olympics organizing committee and
00:13:28
appears to be holding on to that role.
00:13:30
They're backing him. There were other
00:13:31
names floated to take his place. Let
00:13:33
me be clear for people not letting him
00:13:35
off, but it was 2003 before any of this
00:13:38
was known. He may have been able to pick
00:13:40
it up. That's different. Um but uh but
00:13:43
this was well before the first
00:13:45
conviction, the first um sweetheart deal
00:13:48
that Epstein did with in Florida. Um so
00:13:52
he's even even he's under scrutiny and
00:13:54
people are cutting ties. And again, this
00:13:57
is this artist right. They don't like
00:13:58
the cut of his jib. That's perfectly
00:14:00
fine. In his case, there's just the the
00:14:03
blast zone of this is so far right. It's
00:14:06
really
00:14:07
>> it's so indiscriminate. And again, I go
00:14:09
to the following.
00:14:10
>> Yeah.
00:14:10
>> If we had an institution we could trust,
00:14:12
including the Department of Justice and
00:14:14
the institutions that actually assembled
00:14:15
these files, if they could go through it
00:14:17
and go, "Okay, there are three circles
00:14:19
here. There's people who either engaged
00:14:22
in, provided infrastructure for, or trafficked
00:14:25
and facilitated crimes, we are going to
00:14:27
release those names in the form of grand
00:14:29
jury indictments, and we're going to go
00:14:31
after these people." That's the headline
00:14:32
here. That's what the Department of
00:14:34
Justice is for. It isn't supposed to ruin people's
00:14:36
careers. It's supposed to create an
00:14:37
incentive system where people follow the
00:14:39
law by prosecuting criminals and
00:14:41
exonerating people who are not guilty.
00:14:43
That is what they are there to do. And
00:14:45
then the second circle, and this is a
00:14:47
harder one, is okay, if a cabinet, if a
00:14:50
cabinet member has clearly lied under
00:14:52
their testimony or under oath, should
00:14:54
they release that information? Didn't
00:14:56
didn't commit a crime. This is Howard
00:14:57
Lutnik. Should the president, who has
00:15:00
not so far been accused of a crime, if
00:15:02
he's mentioned in this thing 6,000
00:15:04
times, should we release that
00:15:05
information? I think that is a really
00:15:08
important point. The biggest circle,
00:15:10
quite frankly, is, I have seen on
00:15:14
TikTok and on Instagram people talking
00:15:16
about models, how they talked about
00:15:19
going to a museum with Jeffrey Epstein
00:15:21
and we should no longer uh have anything
00:15:24
to do with them. They're trying to shame all these
00:15:25
people and it's like, you know what,
00:15:27
folks, that's just pure gossip. And
00:15:30
unfortunately, the ring light shaming of
00:15:33
all these courageous, virtuous people
00:15:35
when they're behind a a keyboard and
00:15:37
have much higher standards for other
00:15:38
people than they do for themselves, that
00:15:40
is distracting from what the Department
00:15:42
of Justice is supposed to do. And that
00:15:44
is: put pedophiles in prison.
00:15:47
>> Yeah, I would urge people to read. It
00:15:48
was really interesting. You know,
00:15:50
Kathryn Ruemmler, who's the legal
00:15:52
head of Goldman, you know, she was she
00:15:55
had a lot of emails and very chummy kind
00:15:57
of emails with Epstein going on for a
00:15:59
while. Um, I thought Bill Cohan did a
00:16:03
great job talking about why she was in
00:16:06
that relationship and most of it was in
00:16:08
fact she was professional. She's looking
00:16:10
for work, right? And that's a whole
00:16:12
different
00:16:12
>> guy who knows rich guys who can send me
00:16:14
a referral for wealth management.
00:16:15
>> Yes, exactly. So I would urge people to
00:16:17
read that and again one or two of them
00:16:20
and and one or two places she when he
00:16:22
said oh I it was only prostitution she
00:16:24
goes, that's just as abusive, Jeffrey. Like
00:16:27
she she unfortunately he kept saying
00:16:28
there are gifts there was a business
00:16:30
relationship I thought it was it was
00:16:32
actually a really um complex situation
00:16:36
that made me think god if she was a guy
00:16:38
and she did like golf with him she'd get
00:16:41
off because she was a woman was vaguely
00:16:43
flirty kind of she wasn't like it was it
00:16:47
was a great piece cuz it made me rethink
00:16:49
I was like okay like not great judgment
00:16:53
right should have known better should
00:16:55
have stopped talking to him after the
00:16:58
first thing um but didn't business it
00:17:01
was just interesting it was it made me
00:17:03
think a lot I recommend Bill Cohan's
00:17:04
column in Puck and I thought this was
00:17:08
this is his area of expertise in finance
00:17:10
and I thought okay I got this is why
00:17:12
there she was he was trying to explain
00:17:14
why they haven't let her go, right? So,
00:17:16
I thought that was interesting. Anyway,
00:17:18
um speaking of uh um power, six
00:17:21
Republicans joined Democrats in the
00:17:22
House on Wednesday to vote for a
00:17:23
resolution aimed at ending President
00:17:25
Trump's tariffs on Canada, though it's a
00:17:27
symbolic gesture, even if it clears the
00:17:29
Senate. Uh Trump would veto it, but that
00:17:31
didn't stop him from making threats.
00:17:32
Trump posted on Truth Social that any
00:17:34
Republican who votes against the
00:17:35
tariffs would seriously uh suffer
00:17:38
consequences come election time, and
00:17:39
that includes primaries. Uh I think he's
00:17:41
losing his grip, as they say. What what
00:17:44
do you think?
00:17:44
>> Well, there's some new data that shows
00:17:46
that about So, the initial notion was
00:17:48
the tariffs would
00:17:50
uh mostly be paid by either
00:17:52
corporations, sort of a populist thing,
00:17:55
or the uh uh importer or excuse me, the
00:17:59
exporter themselves, the the country
00:18:01
would absorb it or whoever was sending
00:18:03
the products. It ends up and there's
00:18:04
finally an analysis: 94%
00:18:07
of the costs have been borne by US
00:18:09
consumers and then the other 6% have
00:18:11
been borne by companies either deciding
00:18:12
to take a bit of a hit or the the
00:18:15
importer themselves or excuse me the
00:18:17
exporter themselves reducing their
00:18:19
prices. You have about 15% of the
00:18:22
economy is um imports.
00:18:25
It they thought the tariffs average
00:18:27
around 20%, so that's 3%. Some managed to
00:18:30
get out of it. So, call it a 2% to the
00:18:32
economy, but the problem is it's an
00:18:34
unnecessary 2% hit to the economy. To be
00:18:37
fair, it hasn't had the catastrophic
00:18:39
effect a lot of people thought it was
00:18:40
going to have, but in a weird way.
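The tariff arithmetic in this stretch can be reconstructed in a few lines. This is a sketch of the hosts' round numbers only: the 15% import share, roughly 20% average tariff, and 94% consumer pass-through are figures cited in conversation, not official statistics.

```python
# Rough reconstruction of the tariff arithmetic from the exchange above.
import_share = 0.15      # imports as a share of the US economy, per the hosts
avg_tariff = 0.20        # average tariff rate cited
gross_hit = import_share * avg_tariff      # 0.03 -> a ~3% gross price-level hit
effective_hit = 0.02     # "call it 2%" after some imports avoided the tariffs
consumer_share = 0.94    # share of tariff costs borne by US consumers, per the cited analysis

print(f"gross hit: {gross_hit:.0%}, effective hit: {effective_hit:.0%}")
print(f"consumers pay {consumer_share:.0%}; companies and exporters absorb the rest")
```

That is where the "unnecessary 2% hit to the economy" figure comes from: 15% of spending taxed at about 20%, minus what was rerouted or exempted.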
00:18:44
>> Well, if Yeah, it's just
00:18:46
>> I feel it myself and the shelves are
00:18:48
emptier. It's weird. I never have
00:18:50
noticed that.
00:18:50
>> Well, but why reduce people's prosperity
00:18:52
by 2% for no real reason? It doesn't
00:18:54
cause growth. It doesn't cause
00:18:55
innovation. And all it's doing is is is
00:18:58
urging or reconfiguring the supply chain
00:19:01
>> around the United States. The EU is
00:19:02
entering into an agreement with
00:19:04
Mercosur. There are all kinds of new trade
00:19:06
zones being opened up such that people
00:19:08
are not as reliant on the US. And a
00:19:10
weird a weird thing though is that if
00:19:13
his tariffs are overturned
00:19:16
in by the Supreme Court or by the
00:19:18
Congress, I actually think the markets
00:19:20
will rip. So, in a weird way, it could
00:19:23
end up it could end up helping him if if
00:19:26
these things are turned back. I think
00:19:28
the markets will scream if these tariffs
00:19:31
are found to be uh illegal.
00:19:33
>> Yeah. Well, we'll see. And although
00:19:35
apparently he's got all these plans to
00:19:36
put other kinds of fees in place to take
00:19:39
their place that are that he that he'll
00:19:41
have to go back to court and stop him
00:19:43
for those. He's doubling down. This is
00:19:45
something he's talked about for years.
00:19:46
So, I don't know if he's going to back
00:19:47
off so quickly and take the
00:19:49
>> take the victory here. He'd like to take
00:19:52
the L. Honestly,
00:19:54
>> I don't know. No, they you know that
00:19:57
lunatic Peter Navaro talks about him
00:19:58
like that we have a whole bunch of
00:20:00
things to happen if the Supreme Court
00:20:02
>> overturns this. What's taking the
00:20:04
Supreme Court so long by the way?
00:20:06
Anyway, uh we'll see what happens. I do
00:20:07
think on the broader sense that there's
00:20:09
lots more um Republicans willing to push
00:20:12
back because of their own political
00:20:14
survival is not linked to Donald Trump
00:20:16
as much anymore. The other thing is it
00:20:18
looks like they may lose control of the
00:20:19
House.
00:20:20
>> That's right. Another person's
00:20:21
resigning, right?
00:20:23
>> So, you know, they're one they're one
00:20:25
sick person away from having the
00:20:28
Democrats in control. So, it's a really
00:20:30
interesting time. He doesn't have the
00:20:32
the power is slipping away and that's
00:20:33
why you get screamy Pam and this nonsense and
00:20:36
stuff. So we'll see more of that I
00:20:38
think. Um okay Scott let's go on a quick
00:20:40
break. When we come back social media on
00:20:42
trial, a very important case.
00:20:45
support for pivot comes from anthropic.
00:20:48
There are bumps in the road. The ones
00:20:50
you can just throw a band-aid on and be
00:20:51
done with it. And then there are the
00:20:53
bigger problems. The ones where you
00:20:55
really have to stop and think through.
00:20:57
The ones when you finally crack it feels
00:20:59
unbelievable. And for those problems,
00:21:01
you're going to need a partner to help
00:21:02
you understand where you're at, where
00:21:04
you're going, and how you're getting
00:21:05
there. Claude from Anthropic is that
00:21:08
partner. Claude is the AI for minds that
00:21:10
don't stop at good enough. It's a
00:21:12
collaborator that actually understands
00:21:14
your entire workflow and thinks with
00:21:15
you, whether you're debugging code at
00:21:17
midnight or strategizing your next
00:21:19
business move. Claude extends your
00:21:21
thinking to tackle the problems that
00:21:22
matter. Plus, Claude's research
00:21:24
capabilities go deeper than basic web
00:21:26
search. It can deliver comprehensive,
00:21:28
reliable analysis with proper citations,
00:21:30
turning hours of research into minutes.
00:21:33
Ready to tackle bigger problems? Start
00:21:35
with Claude today at claude.ai/pivot.
00:21:39
That's claude.ai/pivot.
00:21:41
And check out Claude Pro, which includes
00:21:43
access to all the features mentioned in
00:21:45
today's episode. Claude.ai/pivot.
00:21:56
Support for the show comes from
00:21:57
CoreWeave. AI isn't just a new tool. It
00:21:59
encompasses so much more. It's spurring
00:22:02
a revolution across all industries and
00:22:03
reshaping itself to become a big part of
00:22:05
our future together. CoreWeave is at the
00:22:07
center, powering some of the biggest
00:22:09
names in AI. As the essential cloud for
00:22:11
AI, CoreWeave provides an AI platform
00:22:13
that combines next generation
00:22:15
infrastructure, intelligent tools, and
00:22:16
expert support. It's powering the
00:22:18
world's most complex AI workloads faster
00:22:20
and more efficiently. From medical
00:22:21
research and diagnosis to education,
00:22:23
from complex visual effects for movies
00:22:25
to breakthroughs in science and
00:22:26
technology. If it's AI, CoreWeave is
00:22:29
uniquely ready to power it with
00:22:30
purpose-built tech. The big ideas, the
00:22:32
wild visions and what-ifs and why nots.
00:22:35
CoreWeave is working to build what's
00:22:37
never been built before. CoreWeave is
00:22:39
the essential cloud for AI. Ready for
00:22:41
anything, ready for AI. To learn more
00:22:43
about how CoreWeave powers the world's best
00:22:44
AI, go to coreweave.com/refor.
00:22:54
Scott, we're back with more news. A
00:22:56
landmark social media trial got underway
00:22:57
this week with Meta and YouTube accused
00:22:59
of deliberately designing their
00:23:00
platforms to addict young users. You
00:23:02
think the lawsuit is the first of more
00:23:04
than 500 similar cases to go to trial.
00:23:06
This is something that's been building
00:23:07
for a long time. The plaintiff's lawyer
00:23:09
is arguing that his client, a
00:23:10
20-year-old woman, got hooked on these
00:23:12
apps as a kid because they were like
00:23:13
digital casinos delivering dopamine
00:23:15
hits. Instagram head Adam Mosseri uh
00:23:18
testified on Wednesday that he doesn't
00:23:19
think users can be quote clinically
00:23:21
addicted to the app. Adam Mosseri is not a
00:23:24
doctor, just so you know. I can't
00:23:26
believe he said that. It was kind of a a
00:23:28
mistake on his part. Meanwhile, and also
00:23:30
he's wrong. Meanwhile, YouTube is
00:23:32
arguing it's not social media, it's an
00:23:34
entertainment platform like Netflix and
00:23:36
it's not addictive. That is also not
00:23:38
true. The jury, anyone who has kids
00:23:40
knows that. Uh it's very different from
00:23:41
Netflix. The jury trial, I mean, it's
00:23:44
become more like Netflix recently, but
00:23:46
it's also an addictive situation. The
00:23:48
jury trial is expected to last six to
00:23:50
eight weeks with Mark Zuckerberg and
00:23:51
YouTube's Neal Mohan expected to testify.
00:23:54
This is a really important trial. The
00:23:56
big names are coming out and talking
00:23:57
about an issue you and I have talked
00:23:59
about for years. Um, what what are the
00:24:02
actual effects and who is responsible
00:24:04
for creating an addictive product? And
00:24:07
I'm sorry, Adam. I'm not a doctor
00:24:08
either, but any fool will tell you it's
00:24:11
anyone, not fool, any person will tell
00:24:13
you it's addictive. Anyone who uses it.
00:24:15
Um, and you design and there's so much
00:24:17
proof that you've designed it like a
00:24:19
casino or a cigarette or whatever it
00:24:21
happens to be. Thoughts?
00:24:22
>> Well, imagine you're 14 and you
00:24:26
go into your room and if you were like
00:24:28
me, your mom wasn't home until 6:00 or 7
00:24:30
p.m. and you're home alone.
00:24:31
>> Gilligan's Island
00:24:32
>> and yeah, that's it was Bugs Bunny and
00:24:34
Gilligan's Island and I Dream of Jeannie
00:24:36
for me. But what if in the corner there
00:24:37
was a casino? What if there was an
00:24:41
arcade? What if there was?
00:24:45
What if there was unlimited music? What
00:24:47
if there And then you say, "No, no, no.
00:24:50
Study. What if there was the high school
00:24:52
cafeteria where I could say something
00:24:54
mean about someone else or someone could
00:24:57
say something mean about me and all I
00:24:58
could think about the rest of the day
00:24:59
and night was what they were saying
00:25:00
about me?" That the high school
00:25:02
cafeteria never never left. And it ends
00:25:06
up that about 6% of teenagers are
00:25:08
clinically addicted or or meet the
00:25:11
clinical definition of addicted to
00:25:12
either drugs or alcohol. But under that
00:25:15
same those same standards, 24% are
00:25:17
addicted to social media. And just some
00:25:20
data, the average American teen
00:25:22
spends 4.8 hours a day using social
00:25:24
media. 16% of teens or one in six use
00:25:27
TikTok almost constantly. 15% for
00:25:30
YouTube, 13% for Snap, 12% for
00:25:33
Instagram. And roughly half of all teens
00:25:35
report feeling addicted to social media.
00:25:38
And you say, well, okay, fine. What's
00:25:40
the impact? Teens who are in the highest
00:25:43
use group expressed two times more
00:25:45
suicidal intent or self harm than those
00:25:47
in the lowest use group. And the highest
00:25:50
use group also express poor uh body
00:25:52
image at three times more than the
00:25:56
lowest use group. And it typically takes
00:25:58
a society or it takes America 20 to 30
00:26:01
years to respond to really negative
00:26:04
externalities. Took us 30 years with
00:26:06
tobacco. It took us 20 years with
00:26:07
opiates. And if you think about social media
00:26:10
going on mobile in 2012, 20 years is
00:26:14
probably the right number. I think when
00:26:15
I'm I mean parents always ask me what
00:26:17
should I do with my kids and I say how
00:26:18
old are your kids? And if they say three
00:26:20
or five, I'm like, we'll have it figured
00:26:21
out by then because the data here is so
00:26:24
overwhelming and we're up against uh
00:26:28
intransigence and people trying to delay and
00:26:30
obfuscate, similar to those tobacco
00:26:32
executives and they have more money and
00:26:33
they're more skilled this time. But
00:26:35
eventually the tide, the tsunami of
00:26:38
parental concern here, you know,
00:26:41
understandable parental concern is
00:26:43
washing over all this [ __ ] where so
00:26:46
I think in I would say I mean you have
00:26:48
entire countries now age gating. Look at
00:26:50
what Australia is doing. I think another
00:26:53
two to three years I'm hopeful the
00:26:55
landscape's going to be much different
00:26:57
for children. The the remedies would be
00:26:59
warning sign there's lots of remedies
00:27:01
like with cigarettes. Age
00:27:03
gating warning signals um they check
00:27:07
ages legal liability the age checking is
00:27:10
harder
00:27:11
>> what every what every other substance
00:27:13
company, manufacturer, and media company
00:27:15
is subject to
00:27:16
>> they've got to be kidding you know
00:27:18
there's so much you they have so many
00:27:20
emails of them talking about this that's
00:27:22
the problem for Adam is sorry to say
00:27:24
this he doesn't think it's clinically
00:27:25
addictive come on Adam, come on. We all
00:27:28
think it is. We The problem is every
00:27:30
adult knows this in their bones, right?
00:27:33
It's like
00:27:33
>> cuz we're addicted.
00:27:34
>> We're addicted. Like, we are. It's a
00:27:36
problem. You cannot put it down. And it
00:27:38
is different from television. It is very
00:27:40
different. And television. Listen,
00:27:42
Gilligan's Island's addictive enough. I
00:27:44
can't believe I watched all that [ __ ]
00:27:45
But you can walk away from it in a way.
00:27:47
You cannot walk away from this. It's I
00:27:50
find myself I'm I have to throw the
00:27:51
phone across the room, right? Sometimes
00:27:54
I'm like, "Put it down." Um, you know,
00:27:57
every Amanda, same thing. We just It's
00:27:59
really interesting. And sometimes I
00:28:01
think about it. I'm like, I like news
00:28:03
and I'm read I'm mostly reading news,
00:28:05
but I don't stop. That's the difference.
00:28:07
I put down magazines. I put down
00:28:09
newspapers. And I love news. So, this is
00:28:12
the all this stuff as it gets out, as
00:28:14
you see the emails in inside the company
00:28:17
talking about it. And especially early
00:28:20
on, they knew just what they were doing.
00:28:21
And um perhaps they weren't meaning to
00:28:24
be malevolent at the beginning, but it's
00:28:26
malevolent for many young people and the
00:28:28
impact is huge. And then they just keep
00:28:30
doubling down with AI relationships and
00:28:32
synthetic relationships and everything
00:28:34
else. This is the time has come round at
00:28:37
last for these companies. We'll see how
00:28:38
well how how this trial does, but it's
00:28:41
going to it's going to just uncover more
00:28:43
and more about what they knew. very much
00:28:45
like the cigarette companies.
00:28:46
>> When you have hundreds of billions of
00:28:48
dollars in shareholder value, trillions
00:28:50
of dollars of shareholder value, hundred
00:28:51
billions in revenue, millions of some of
00:28:53
the brightest people in the world, and
00:28:55
trillions of data points, all trying,
00:28:56
all aiming towards one thing. How do we
00:28:58
get people to spend one more second
00:29:02
every day on social and less time
00:29:06
somewhere else, whether it's sports,
00:29:08
friends, studying, sleep, and they're
00:29:10
winning. And young people, especially
00:29:12
young men, who have this tremendous flaw
00:29:13
in their brains where they're constantly
00:29:15
dopamine hungry, they're up against an
00:29:17
indomitable foe. And then the other
00:29:21
>> like sugar. It's like sugar. It's the
00:29:22
same thing. It's the same.
00:29:23
>> And then there's two or three. But your
00:29:25
kid your kid can take, you know, a 10
00:29:27
pound bag of sugar into his bedroom with
00:29:29
him.
00:29:30
>> The the the other kid my kid could, but
00:29:33
go ahead.
00:29:34
>> The other two things it is a cumulative
00:29:36
effect that I think have really hurt our
00:29:37
youth are one. I do think parents have
00:29:40
some culpability here and that is we
00:29:44
have decided that our job is to clear
00:29:46
out all borders and obstacles for our
00:29:48
kids. We engage in concierge and
00:29:49
bulldozer parenting and by the time the
00:29:51
kid gets to college he or she has never
00:29:53
had a sea or a disappointment
00:29:55
>> and we've created this princess and the pea
00:29:58
um generation with good intentions. We
00:30:00
thought we were doing our kids a good
00:30:02
thing. And then something that doesn't
00:30:03
get talked a lot about but I absolutely
00:30:06
think is adding up to a generation that
00:30:09
is at a disadvantage and that is if you
00:30:11
are 21 since the age of 10 the person
00:30:15
you are supposed to look up to most in
00:30:17
the world is Donald Trump. So
00:30:21
performative venality, coarseness and
00:30:23
cruelty,
00:30:25
>> online scams,
00:30:27
>> crypto, doubling down on lies. This has
00:30:31
been the role model
00:30:33
>> as kids brains are being wired during
00:30:36
puberty. And no matter who is president
00:30:38
or what you think of the office,
00:30:40
>> president is the person that millions of
00:30:44
young Americans look to as the as the
00:30:46
ultimate of success in American values.
00:30:49
So what have we done? We've raised a
00:30:51
generation of kids who are dopamine hungry
00:30:53
and their primary role model maybe with
00:30:55
a close second the richest man in the
00:30:58
world
00:30:58
>> the greatest troll of all time
00:31:00
>> are exhibiting values that are very
00:31:04
>> I mean and what do you know these 21
00:31:06
year olds are not it's shocking it's
00:31:09
shocking what good people they are what
00:31:11
they have to deal with
00:31:12
>> I would agree I think they do resist
00:31:14
more than you think and actually there
00:31:15
are a lot of parents one of the things I
00:31:17
spent a lot of time doing with my kids
00:31:19
whenever like can Can you go get this
00:31:21
from me? Can you talk to that person if
00:31:22
they wanted something? I'm like, you
00:31:23
need to do it. Like you I you figure it
00:31:26
out became one of my lines with my
00:31:29
kids, my older kids. You figure it out.
00:31:31
I do it with my younger kids now. With
00:31:33
Saul, I'm like, you figure it out. I
00:31:34
don't know. I know, but you can do it
00:31:36
yourself. And so that's it's the best
00:31:38
piece of advice you can give to like a
00:31:40
kid. You
00:31:41
>> I've started giving my kid pounds when
00:31:43
he gets good grades. Is that wrong?
00:31:45
>> I slip him I slip him a 20 pound. And I
00:31:48
slip him a note when he gets an A on a
00:31:49
test.
00:31:50
>> Do not do that. Well,
00:31:51
>> totally.
00:31:52
>> No. No.
00:31:53
>> Anyway, um
00:31:54
>> that's called that's called capitalism.
00:31:56
>> Okay.
00:31:56
>> You got to get a bunch of money.
00:31:58
>> Okay. All right. Whatever. Whatever you
00:32:00
want to do there, Scott. We should write
00:32:01
competing parenting books. Uh in the
00:32:03
same genre about surveillance, as you
00:32:05
know, that's another thing I go crazy
00:32:06
about. Um investigators in the Nancy
00:32:09
Guthrie abduction case have recovered
00:32:10
footage from the Nest doorbell. Nest is
00:32:12
owned by Google. It was initially
00:32:13
thought to have no video because there
00:32:14
was no active subscription. When you
00:32:16
sign up, you have, for people who don't
00:32:18
know, for Nest or any of these things,
00:32:20
you can buy a subscription. If you
00:32:22
don't, they say they don't keep the
00:32:23
video. As it turns out, they do. The
00:32:26
incident shows that Nest uploads video
00:32:28
to Google Cloud before you decide to
00:32:30
keep it with a paid plan so it can
00:32:31
linger after it says it's been deleted,
00:32:34
is supposed to be deleted. I'm glad they
00:32:37
got these pictures of this guy. At the
00:32:38
same time, this is an edge case. They're
00:32:41
they're keeping your video that which I
00:32:43
which everyone thought they were doing
00:32:45
and they said they weren't. The FBI
00:32:47
working with Google engineers took 10
00:32:48
days to recover the footage from
00:32:49
Guthrie's camera. I think the companies need to
00:32:52
spell out in plain English how long
00:32:55
deleted footage actually remains on
00:32:57
their servers. And by the way, they're
00:32:58
also getting incredible push back from
00:33:00
the Ring ad for the Super Bowl, which is
00:33:02
like, "We're watching everybody, but
00:33:04
only for your dogs." And there's been a
00:33:06
million memes about only for people we
00:33:08
need to take away. like the surveillance
00:33:11
of these kind of things and the ease of
00:33:14
which they are hacked by the way not
00:33:16
just taken off the door like this
00:33:18
terrible person did um but hacked into
00:33:21
are quite something a lot of people are
00:33:22
getting them hardwired into their house
00:33:24
so that they can't do that and also so
00:33:27
that they can't be um taken via wireless
00:33:31
there's a lot of wireless activity here
00:33:32
but there are ways to a lot of these
00:33:35
things are open season on your home I
00:33:38
don't when I Just speaking of my son, my
00:33:40
kids, Alex took I had one of them up at
00:33:43
one of our houses when we bought it. It
00:33:45
was there, one of these Amazon or Echo
00:33:47
or whatever. He took them all out. He
00:33:49
took one day I came back and everything
00:33:50
was gone. And I was like, "Why?" And he
00:33:53
goes, "Because they can watch us." And I
00:33:55
was like, "Don't be paranoid." He goes,
00:33:56
"I'm not." And he was right. So
00:33:59
>> I think we're I think we have a bit of a
00:34:01
different view on this in the sense that
00:34:03
I think technology I think we gave up
00:34:05
our privacy a long time ago. Yes, Scott
00:34:07
McNealy, we did
00:34:08
>> what I want to see. Oh, remember Scott?
00:34:11
>> Yeah, he said that
00:34:12
>> privacy doesn't exist. Get used to it.
00:34:14
Remember,
00:34:14
>> if you are in London or New York, you
00:34:17
can't go more than I think it's 30 feet
00:34:19
agree
00:34:19
>> outside without a camera.
00:34:21
>> And the reason they did that was they
00:34:23
implemented massive they have like a a
00:34:25
security headquarters because of 9/11.
00:34:28
And I actually what I think you need
00:34:31
though is really, really well-thought-out
00:34:34
laws and institutions that say we're not
00:34:38
going to go fishing unless it's a felony
00:34:41
crime. We don't investigate it.
00:34:43
>> In other words, people have the right
00:34:45
You said something I've thought about a
00:34:46
lot and that is people have the right to
00:34:48
have secrets.
00:34:49
>> Yeah.
00:34:50
>> And if you want to if you want to go
00:34:52
into a store, if you're I don't know,
00:34:55
you you should be able to do what you
00:34:57
want. If you murder somebody then quite
00:35:00
frankly and there are enough there's
00:35:03
enough evidence to say that you are a
00:35:07
reasonable person of interest then we
00:35:10
are going to utilize uh cameras data
00:35:14
video footage
00:35:15
>> I agree with you I just think you buy
00:35:16
this product and it says it isn't
00:35:18
keeping it if you don't pay for it then
00:35:20
it's not keeping it like I'm sorry
00:35:22
that's just the deal that's just the
00:35:24
deal when you buy I have several of
00:35:25
these and I've taken most of them off my
00:35:27
house, but they say ex and I pay a lot
00:35:30
of attention. We if you don't pay this
00:35:33
stuff is deleted. This is deleted. If it
00:35:36
says it's deleted, it should be deleted.
00:35:38
That's all. It's just the deal you make
00:35:40
with them. And so I don't think they
00:35:41
should keep it if it's supposed to be
00:35:43
deleted. Same thing with Echo. It
00:35:45
shouldn't be listening if it says it's
00:35:47
not listening. Right. That's what that's
00:35:49
if you want it to listen, you can tell
00:35:51
it. That's in your home. I'm talking
00:35:53
about this outside. I think we've lost
00:35:55
that battle. They're going to their
00:35:56
cameras are everywhere in talk about
00:35:58
London. Monte Carlo is really wired. So
00:36:00
is the United States of America. And
00:36:02
that's a good thing when it comes to
00:36:04
crime, but it's a very bad thing when it
00:36:06
comes inside of your house. Cuz Scott, I
00:36:08
know if you want to wear your frilly
00:36:10
underwear, I think I Oh, wait. Was that
00:36:12
a secret? Um, I back people in their
00:36:15
homes. I'm just
00:36:16
>> Daddy goes commando. Big and the twins
00:36:19
want to be free. But I I think it's it's
00:36:22
in this case it was good to be able to
00:36:24
get the picture of this guy. At the same
00:36:26
time, she didn't the intent wasn't to.
00:36:30
So I want plain English of what you're doing
00:36:33
and how long it remains and then it
00:36:34
should tell you when it's deleted and
00:36:36
permanently deleted. If they say
00:36:39
permanently deleted, it needs to be
00:36:40
deleted. That's I I feel like that's
00:36:42
>> at some point you should be able to
00:36:44
have, you know, I have cameras around my
00:36:45
house. You can see almost everything.
00:36:47
I try to sneak in all the time
00:36:48
>> if someone were to break in. But I think
00:36:51
what you want is like
00:36:54
this is the hack that I think is coming.
00:36:58
Somebody hacks into Uber with your Uber.
00:37:01
If you use Uber a lot, I think you can
00:37:03
find out when someone is with a thin
00:37:05
layer of AI on top of your Uber trips
00:37:06
where
00:37:06
>> they go.
00:37:07
>> Mhm.
00:37:08
>> They'll be able to know if you just
00:37:09
terminated a pregnancy.
00:37:10
>> Yep.
00:37:11
>> Or if you're a Russian spy. Why is this
00:37:13
person continually going to the Russian
00:37:14
embassy? Why is this are you having
00:37:17
same-sex affairs?
00:37:19
A thin layer of AI on top of your ride
00:37:22
history when and where you are going
00:37:24
places.
00:37:26
>> It would be they would it would be very
00:37:28
easy to say, okay, this person is
00:37:31
clearly suffering from diabetes. This is
00:37:34
why they keep going to this
00:37:36
>> type of clinic. You could. This person
00:37:39
is clearly engaged in a love affair with
00:37:44
this dude at this address. This person
00:37:48
is clearly
00:37:49
>> sure is constantly going to Amtrak. But
00:37:50
go ahead.
00:37:51
>> This person is clearly cooperating with
00:37:53
the CIA as evidenced by the fact they
00:37:55
keep going to this one address that is a
00:37:58
co. They could find out. So that hack,
00:38:04
folks, this is this is the trade we all
00:38:07
make and we all talk a big game. Anyone
00:38:09
who talks about privacy is typically
00:38:11
over the age of 50 and in Brussels or
00:38:12
DC. We consistently trade our privacy
00:38:16
for utility.
00:38:18
>> Yep, we do.
00:38:18
>> And and what I want is massively
00:38:24
Okay. Unless it's a felony, maybe even
00:38:26
more than that, it's a felony that with
00:38:29
that has a threat of violence and
00:38:31
there's really strong evidence against
00:38:33
one person, all that [ __ ] is off limits.
00:38:36
No one can use it.
00:38:37
>> All I'm saying is if they say it's off,
00:38:39
it needs to be off like
00:38:40
>> or at least give you the power to delete
00:38:42
it.
00:38:42
>> It's like if you buy like I don't know,
00:38:45
organic apple, it's not organic. You
00:38:47
can't do that. I mean,
00:38:48
>> it's the same thing. You're selling a
00:38:50
product, you say what it is, stay with
00:38:52
what you say it is. But at the same
00:38:53
time, I love the fact, okay, when when
00:38:56
there's a crime,
00:38:57
>> crime is hitting despite all the
00:38:59
scariness and everyone saying whether
00:39:00
it's whether they saying saying it's,
00:39:02
you know, Eric Adams or Mamdani or
00:39:04
>> it's it's bedlam in the streets,
00:39:06
>> crime, the number of shootings in New
00:39:08
York last year, I think, hit like an
00:39:09
all-time low.
00:39:11
>> Violence is going and crime, violent
00:39:13
crime has consistently gone down the
00:39:16
last several decades. It was, is it
00:39:17
because we're a better people? I don't
00:39:19
think so. is because if you commit
00:39:21
crimes now, everyone has seen those Law
00:39:23
and Order SVUs
00:39:25
>> that if you if you go into a 7-Eleven in
00:39:28
the middle of [ __ ] nowhere
00:39:30
>> and shoot the clerk,
00:39:32
>> they're film
00:39:32
>> ATMs have cameras. So, was there any
00:39:35
ATMs outside? Then they check the
00:39:37
footage on the ATM. I like I don't like
00:39:39
a surveillance state. I like a state I
00:39:42
like a place with really strong
00:39:45
laws, where they consistently
00:39:47
say, "I get you think a crime is
00:39:49
committed here. There's not enough
00:39:50
evidence. You do not have access to this
00:39:52
video.
00:39:53
>> Right.
00:39:53
>> Stop. There's evidence that you're
00:39:56
planning a terrorist attack. Sorry,
00:39:58
boss. We're violating your privacy
00:39:59
rights. Every ring light, but we have
00:40:02
Uber. We still I still think we have a
00:40:05
due process. We can't have the wrong
00:40:07
people getting a hold of stuff. Anyway,
00:40:09
I I hope they find Nancy Guthrie and I
00:40:12
hope it helps that they have these, but
00:40:15
we have to be it brings up a big issue
00:40:17
about surveillance and we should pay
00:40:19
attention to it. Um, and I hope it helps
00:40:22
find find her and bring her home safely
00:40:24
to her family. Um, anyway, let's go on a
00:40:27
quick break. We come back, uh, we'll
00:40:29
talk about the latest in AI news.
00:40:31
There's a lot of it.
00:40:33
Support for the show comes from Indeed.
00:40:35
Hiring isn't just about finding someone
00:40:37
willing to take the job. It's about
00:40:38
connecting with someone who can move
00:40:40
your business forward. For that, check
00:40:42
out Indeed Sponsored Jobs. Indeed
00:40:44
sponsored jobs boosts your job post for
00:40:46
quality candidates so you can reach
00:40:48
people that can help your business
00:40:49
thrive. People are finding quality hires
00:40:51
on Indeed right now as we speak. In the
00:40:53
minute I've been talking to you, 27
00:40:55
hires were made on Indeed, according to
00:40:56
Indeed data worldwide. Join the 3.3
00:40:59
million employers worldwide that use
00:41:00
Indeed to connect with quality talent
00:41:02
that fits their needs. Spend less time
00:41:04
searching and more time actually
00:41:06
interviewing candidates who check all
00:41:07
your boxes. Less stress, less time, more
00:41:09
results now with Indeed sponsored jobs.
00:41:12
And listeners to this show will get a
00:41:13
$75 sponsored job credit to help get
00:41:16
your job the premium status it deserves
00:41:18
at indeed.com/pivot.
00:41:20
Just go to indeed.com/pivot
00:41:22
right now and support our show by saying
00:41:24
you heard about Indeed on this podcast,
00:41:26
indeed.com/pivot.
00:41:28
Terms and conditions apply. Hiring, do
00:41:30
the right way with Indeed.
00:41:36
Support for this show comes from Quince.
00:41:39
Style doesn't come from chasing new
00:41:41
trends every season. Real style comes
00:41:43
from slowly and intentionally
00:41:44
cultivating a wardrobe filled with
00:41:46
high-quality staples that will last. And
00:41:47
if you're on the lookout for a perfect
00:41:49
addition to your closet, look no further
00:41:50
than Quince. You'll find organic cotton
00:41:52
sweaters, polos for every occasion,
00:41:54
light jackets that will help keep you
00:41:55
warm as the seasons change year after
00:41:57
year. Not to mention their famous 100%
00:42:00
Mongolian cashmere. If there's anything
00:42:01
better than cashmere, I'd love to hear
00:42:03
it. Every Quince item is built for
00:42:05
everyday wear and made with ethically
00:42:07
sourced materials from top factories.
00:42:09
And by partnering with manufacturers
00:42:10
directly, Quince keeps things affordable.
00:42:12
So, you're only paying for the quality
00:42:14
clothing and not the brand markup. I
00:42:16
have finally bought new Quince clothes,
00:42:18
not just uh soft pants that I can wear
00:42:20
when I do sports. I actually bought more
00:42:22
of those, but I also bought a lovely
00:42:23
cardigan that is so soft. I wear it all
00:42:25
the time. I fell asleep in it the other
00:42:27
day. I bought a beautiful jacket and I
00:42:29
just love it. I have to say this this
00:42:31
cardigan I'm wearing is so comfortable.
00:42:33
It's really good-looking. The fabric,
00:42:35
everything else, it feels richer than it
00:42:37
was. Um, and the same thing with the
00:42:39
coat. It's really good-looking and I
00:42:40
really like wearing it. Again,
00:42:42
comfortable, simple, uh, and just
00:42:45
lovely. I really, really like it.
00:42:47
Refresh your wardrobe with Quince. Don't
00:42:49
wait. Go to quince.com/pivot
00:42:51
for free shipping on your orders and
00:42:53
365-day returns. Now available in Canada,
00:42:56
too. That's quince.com/pivot
00:43:00
to get free shipping and 365-day
00:43:03
returns. quince.com/pivot.
00:43:08
Scott, we're back with more news. Time
00:43:09
for rapid fire of AI news. First up,
00:43:12
Anthropic is in the final stages of
00:43:13
raising $20 billion in new capital at a
00:43:15
valuation of $350 billion
00:43:19
And also at Anthropic, a
00:43:21
researcher submitted a resignation
00:43:23
letter saying the world is in peril,
00:43:25
saying employees constantly face
00:43:27
pressures to set aside what matters
00:43:29
most. That researcher is going off to
00:43:31
write poetry, by the way, which should
00:43:33
trouble you. Over at XAI, Elon Musk has
00:43:36
lost two co-founders, Jimmy Ba and Tony
00:43:38
Wu. Both announced their departures amid a
00:43:40
big restructuring over there too when he
00:43:42
as he's brought it into uh SpaceX. The
00:43:45
company at OpenAI, the company's fired
00:43:47
an executive after she opposed plans for
00:43:49
an AI erotica feature in ChatGPT, citing
00:43:52
sexual discrimination. We don't actually
00:43:54
know what happened here. Uh Anthropic
00:43:56
raised the funding uh raised twice the
00:43:59
funding initially sought based on
00:44:00
investor demand. Uh so thoughts on any
00:44:03
of these stories? Lots of different lots
00:44:05
of stuff happening around AI again.
00:44:07
>> Yeah, the why people get fired or why
00:44:10
they say they were fired? I don't know.
00:44:11
I haven't sorted through that. What I I
00:44:13
think is already happened whether it's
00:44:15
reflected in the valuations or not. I
00:44:18
think Anthropic is now worth more than
00:44:19
OpenAI. I think OpenAI
00:44:22
>> what was their valuation? 800 billion.
00:44:24
>> Well, they're I think they're trying to
00:44:25
close around at 850.
00:44:26
>> Yeah. 850.
00:44:27
>> But that one VC who kind of if there was
00:44:32
a moment where the the the balloon was
00:44:34
burst, if you will, the bubble was
00:44:35
burst. It was when that VC had Sam Altman
00:44:37
on his podcast and said, "You've made a
00:44:40
trillion dollars in spending commitments
00:44:41
on a company with 20 billion in revenue.
00:44:43
How are you going to do that?" And he
00:44:45
got very defensive about it. And they've
00:44:47
gone consumer, Anthropic's gone
00:44:50
enterprise. Uh they haven't made the
00:44:53
kind of crazy commitments. I I think
00:44:56
there's been the kind of the mother of
00:44:58
all industrial pivots. I think now, if
00:45:01
you will, Avis is now Hertz. I think
00:45:03
Anthropic is now worth more or will be
00:45:06
soon than OpenAI.
00:45:07
>> They are not making the money.
00:45:10
>> Yeah, that's but they're they're
00:45:12
stronger in the enterprise. Anyways, I
00:45:14
none of this makes any sense in terms of
00:45:16
a multiple on revenues, but uh I think I
00:45:19
think OpenAI is in real
00:45:23
um I don't know, crisis is the wrong
00:45:25
word. There's a lot of arguments over on
00:45:28
X that they now do not have.
00:45:30
His big thing was, I have the best AI
00:45:32
researchers. Now he does not, right, from
00:45:35
what most intelligent people are
00:45:37
saying about it but you know he always
00:45:40
does this he always goes in and shakes
00:45:42
the tree and then shakes the tree again
00:45:44
that's that's his M.O. I guess they're a
00:45:47
distant what third or fourth something
00:45:49
>> well these guys are all here's an a
00:45:53
symbol of how easy it is and how
00:45:55
difficult or how vulnerable they are. It
00:45:58
says, "Here are some Daario and Daniela
00:46:00
Emodi.
00:46:02
Uh, we're at OpenAI now at anthropic."
00:46:05
Ilia Sitsker, open AAI, now at safe
00:46:08
super intelligence. Aravan Shinavas,
00:46:11
Open AI, now at Perplexity. Mera Morati,
00:46:13
open AAI now at thinking machines.
00:46:15
Arthur Mench was at Google, now at
00:46:18
Mstral AI. It's the brightest minds here
00:46:21
are supposedly in I used to work with a
00:46:24
lot of luxury brands and they said the
00:46:25
biggest problem they were having in
00:46:26
China
00:46:27
>> Mhm. is that at the biggest malls, if
00:46:29
Prada had a store across the street
00:46:31
from Bottega Veneta,
00:46:33
>> if if the manager of that Prada didn't
00:46:36
have people show up, he could go across
00:46:38
the street during the lunch hour to the
00:46:41
lunch court and offer someone 11 bucks
00:46:43
an hour from the Bottega store who was
00:46:45
making 10 and they wouldn't even go back
00:46:47
after their lunch break. They would go
00:46:49
over and work out. It was just so easy
00:46:51
to pick off people by offering them a
00:46:52
dollar more per hour. And it feels so
00:46:55
many of these deep these people who
00:46:57
have, you know, fairly or unfairly have
00:47:01
established themselves as some of the
00:47:03
few minds that really understand this
00:47:04
stuff. The amount of money and
00:47:07
temptation to go do their own thing or
00:47:09
join another firm. It is I mean
00:47:12
supposedly wasn't there reports that
00:47:15
Zuckerberg was paying some people $100 or
00:47:17
$300 million and then he wasn't.
00:47:19
>> I mean it just feels like it's total I
00:47:22
don't know, bedlam right now. Right. It's
00:47:24
it's they all think they're going to be
00:47:26
the one, right? I'm going to be the
00:47:28
final one standing and I'm going to own
00:47:29
the world essentially, which is a bet.
00:47:32
It's a bet, right? I think one of the
00:47:34
things that continues to plague these
00:47:35
companies are these researchers who are
00:47:37
like, we're [ __ ] everybody. Like they
00:47:40
come out and almost, you know, like
00:47:43
they're sort of like, sh it's going to
00:47:45
kill us,
00:47:46
>> I think. But quite frankly, K, I think a
00:47:47
lot of it is people
00:47:51
backfilling
00:47:53
uh the reason why they're leaving, with
00:47:55
morality sometimes or some
00:47:57
sort of victimhood. If you look at just
00:48:00
to go back to musical chairs here, if
00:48:02
you look at XAI, the company lost its
00:48:04
second co-founder in just two days. And
00:48:06
that means that half of XAI's founding
00:48:09
team, six of the 12 have left the
00:48:11
company in less than three years of
00:48:13
existence. And Musk said, you know, we
00:48:16
reorganized XAI to improve the speed of
00:48:18
execution, which required parting
00:48:20
ways with some people. And I think for
00:48:23
some of these founders, there's legal
00:48:24
risk to staying at XAI. The EU is
00:48:27
currently investigating the company for
00:48:29
its creation of non-consensual sexual
00:48:31
deep fakes based on real people,
00:48:33
including children. So, this really is
00:48:36
the wild west. This is um you know, I
00:48:41
don't know. I I think it's just it's so
00:48:43
difficult to even keep track of
00:48:45
>> Yeah. Yeah.
00:48:46
>> You know, who ends up where and why.
00:48:48
>> It's as if the science people went
00:48:50
crazy, right? But I do think the
00:48:53
warnings are getting really interesting.
00:48:55
They're like, I wish someone would just
00:48:56
explain why we're in peril. How are
00:48:59
we in
00:49:00
>> Yeah. How does that manifest? What does
00:49:01
that mean? Hey. Hey. Like, oh, it's like
00:49:04
the people who knew that we were about,
00:49:06
you know, in those movies where a bunch
00:49:07
of people know we're about to get hit by
00:49:09
a like a comet or something and they're
00:49:12
not telling us. They're like, I would
00:49:15
>> love your family. Why
00:49:17
>> is it Arnold Schwarzenegger showing up
00:49:19
at your door wearing Oakleys and a lot
00:49:21
of leather? Like, what is it?
00:49:22
>> What is happening?
00:49:23
>> What does it look like here? What does
00:49:24
it What does it mean? Cuz the employment
00:49:26
destruction that was supposed to be
00:49:27
already well underway, I would argue is
00:49:29
not happening yet. I don't know. But why
00:49:31
would someone say they're in peril?
00:49:33
We're in peril and set aside what
00:49:36
matters most, which is safety
00:49:37
presumably. And then they go off and
00:49:38
write poetry. I would like some more
00:49:40
information if you don't mind. If you're
00:49:42
going to do that, you need to tell me.
00:49:44
>> Yeah. Why exactly why are we in peril?
00:49:46
>> Why are we in peril? But
00:49:48
>> from what? Tell me. Tell us. I know. I
00:49:51
know there's these legal things, but if
00:49:52
it's so terrifying, you need to like
00:49:55
step out and like tell us tell us what
00:49:57
it is, and bring proof, too. By the
00:50:00
way, would love to know when the comet's
00:50:02
going to hit us. In any case, uh
00:50:04
>> but the VP of product policy at OpenAI
00:50:06
was fired after she voiced opposition to
00:50:09
OpenAI's upcoming erotica features for
00:50:11
adult users.
00:50:12
>> Yeah,
00:50:13
>> she said something else,
00:50:15
>> that enabling erotica would likely
00:50:17
strengthen feelings that users already
00:50:19
have for the chatbot. Based on a recent
00:50:21
report released by OpenAI, out of chat
00:50:23
GPT's 800 million weekly users,
00:50:26
>> 1.2 million users are prioritizing
00:50:29
talking to ChatGPT over their family,
00:50:31
friends, school, or work. That's less
00:50:32
than I would have thought. Roughly 560K
00:50:36
are experiencing psychosis or mania.
00:50:38
This is shitty research. As a
00:50:41
ratio of 800 million people, is that
00:50:42
normal or not normal?
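As a quick aside, the shares implied by the figures quoted in this exchange can be checked in a few lines. This is a back-of-envelope sketch; the counts are the ones cited in the conversation, not independently verified:

```python
# Back-of-envelope: the OpenAI figures quoted in the discussion,
# expressed as shares of the reported 800 million weekly users.
weekly_users = 800_000_000
prioritizing_chat = 1_200_000   # prioritize ChatGPT over family/friends/work
psychosis_mania = 560_000       # showing possible signs of psychosis or mania
discussing_suicide = 1_200_000  # discuss suicide with ChatGPT

for label, n in [("prioritizing ChatGPT", prioritizing_chat),
                 ("psychosis/mania", psychosis_mania),
                 ("discussing suicide", discussing_suicide)]:
    print(f"{label}: {n / weekly_users:.3%}")
# prioritizing ChatGPT: 0.150%
# psychosis/mania: 0.070%
# discussing suicide: 0.150%
```

Whether 0.07% sits above or below the population base rate for psychosis is exactly the question being raised here; the ratio alone doesn't answer it.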
00:50:44
>> That's a lot.
00:50:44
>> And about 1.2 million people discuss
00:50:46
suicide with chat GPT. Again, what I
00:50:49
want to see is someone to say,
00:50:51
>> "All right, is that just a function of
00:50:52
people who are depressed thinking they
00:50:53
can talk to chat GPT just as they would
00:50:55
talk to a friend or a therapist, right?
00:50:57
>> Or is it something about talking to chat
00:50:59
GPT?"
00:51:00
>> Right? You get the psychosis,
00:51:01
>> suicidal ideation or psychosis.
00:51:04
>> Latter, you know, interesting. I just
00:51:05
did an interview with Sher Challe from
00:51:06
my doc series and she's been saying it
00:51:08
for years and she's like, I've never
00:51:10
seen anything like it now. It was before
00:51:12
on the sidelines and in the darker
00:51:14
places or people had, you know, it was a
00:51:16
small group of people. She goes, "It's
00:51:18
really gone mainstream in a way. I I
00:51:20
would like the information from these
00:51:22
people. Would you come out and bring a
00:51:23
bag and bring it to me or Scott or
00:51:25
something like that?"
00:51:26
>> Anyway,
00:51:28
Cara Swisher, on a separate note, speaking
00:51:31
of sort of normal journalism and getting
00:51:33
information out, one of the most
00:51:35
depressing things, Hong Kong media mogul
00:51:37
and pro-democracy activist Jimmy Lai was
00:51:39
sentenced this week to 20 years in
00:51:40
prison after he was found guilty of
00:51:42
sedition and collusion with foreign
00:51:44
forces. It's the longest sentence ever
00:51:46
handed down, essentially a death sentence.
00:51:48
>> Death sentence. Uh Lai's children are
00:51:50
saying a potential visit by President
00:51:51
Trump in April could be crucial in securing
00:51:53
the release of their 78-year-old father.
00:51:55
This is something Trump should do. Back
00:51:56
in December, Trump said he asked uh
00:51:58
President Xi to consider releasing Lai.
00:52:01
But on the campaign trail in 2024, he
00:52:03
was a lot more confident saying 100%
00:52:05
I'll get him out. He'll be easy to get
00:52:06
out. He's not so easy to get out. Let's
00:52:09
not forget the real surveillance
00:52:11
economy, the real control economy. We've
00:52:13
talked about these issues around control
00:52:15
and the uses of AI for badness. Um,
00:52:19
China wins the boats everywhere and they
00:52:21
they go after this guy who's a really
00:52:24
important uh figure um uh figure in in
00:52:29
this area. And so if President Trump can
00:52:32
do anything, please do it. If anyone can
00:52:34
do anything, but Jimmy Lai is a hero and
00:52:37
and what's happened to him is is as you
00:52:39
say a death sentence.
00:52:40
>> Look, I go to the economics. when you
00:52:42
start imprisoning journalists, whether
00:52:44
it was Turkey in 2012, Soviet Union at
00:52:47
the turn of the century or China, uh,
00:52:50
putting the, you know, taking a very
00:52:53
heavy-handed approach to Hong Kong in 2021 as
00:52:56
kind of best epitomized by Jimmy being
00:52:58
imprisoned,
00:53:00
distinct of the morality of it, distinct
00:53:02
of the importance it plays in a society,
00:53:05
the nation gets poorer and angrier. It
00:53:06
is literally a canary in
00:53:09
the coal mine saying we are about to
00:53:11
send a chill across some of the most
00:53:13
talented people and scrutiny about what
00:53:15
can be said about companies that hurts
00:53:17
the economy. The nations get poorer and
00:53:20
angrier and it's literally a symbol of
00:53:24
when an economy is about to move to an
00:53:26
authoritarian state which is really bad
00:53:29
for innovation for attracting outside
00:53:30
capital.
00:53:32
When you're thinking about investing
00:53:35
in Turkey, and all of a sudden they
00:53:36
start locking up journalists. If you're
00:53:37
Google, do you think, "Yeah, I'm going
00:53:40
to open an office
00:53:41
in
00:53:45
Istanbul"?
00:53:46
You think, you know, I'm going to wait
00:53:47
and see if they sort that out. If you're
00:53:49
one of the brightest PhDs in the world
00:53:52
and you're doing research on
00:53:55
authoritarian governments or you're
00:53:56
doing research on innovation and you're
00:53:58
worried that your research might
00:54:01
contradict something that the leadership
00:54:03
is espousing, do you go teach at those
00:54:05
universities? No, you go somewhere else.
00:54:08
So
00:54:10
this is look China is not you know is
00:54:13
not a model for
00:54:16
but having said that I was just supposed
00:54:17
to be on with Don Lemon who got
00:54:19
arrested. Why the [ __ ] why are they
00:54:21
arresting Don Lemon?
00:54:22
>> Don Lemon like give me a break. They
00:54:24
shouldn't be arresting any journalist
00:54:26
like this. It's just ridiculous. I would
00:54:28
agree. Um I'm going to finish up with
00:54:30
anyway, Jimmy Lai. Let's get him out. Let's
00:54:32
let's get him out. He's a hero. Um I'm
00:54:34
going to finish up with something that
00:54:35
just happened. Um Gail Slater, a
00:54:38
hugely respected lawyer, antitrust
00:54:40
lawyer who was running antitrust at DOJ,
00:54:42
just announced she's stepping down. It
00:54:44
follows uh the resignation of a guy
00:54:46
named Mark Hammer, who was one of
00:54:48
her top deputies. She's had clashes with
00:54:51
Pam Bondi over the handling of antitrust
00:54:53
investigations. I have heard she was in
00:54:55
a real bind over the Paramount thing.
00:54:58
They're trying to like shove through
00:54:59
things that are friendly to the Trump
00:55:02
administration and she just can't do it.
00:55:04
She can't do it. During her 11
00:55:05
months on the job, she found herself in
00:55:07
this bind caught between the Trump
00:55:09
administration's um
00:55:12
she was close to JD Vance. This is a
00:55:14
very respected and well-regarded
00:55:16
antitrust person. This should be an
00:55:18
enormous signal that Gail Slater is
00:55:20
stepping down. Um I had hoped to talk to
00:55:22
her, but everyone had told me they
00:55:24
didn't know what she was going to do
00:55:25
about the Netflix-Paramount thing.
00:55:28
Um you cannot be against the Netflix
00:55:30
thing if you're not against the
00:55:31
Paramount thing. I'm sorry. like and of
00:55:34
course she's being, you know, she she
00:55:36
had a she's been put in a bind all over
00:55:38
the place. A talented and and well
00:55:41
regarded person has put into a bind and
00:55:43
so she's stepping down. Um I just don't
00:55:45
know who they'll put in some idiot like
00:55:47
a Brendan Carr type of person who will
00:55:49
just do what they say. Um but it really
00:55:51
brings it down rather significantly.
00:55:53
Even um Makan Delrahim, who works for
00:55:56
Paramount actually now very well
00:55:58
regarded like they're going to have to
00:56:00
put in a village [ __ ] idiot in
00:56:02
the Pam Bondi mode. So not a good sign.
00:56:04
Not a good sign.
00:56:05
>> Yeah, great.
00:56:06
>> Anyway, uh one more quick break. We'll
00:56:08
be back for predictions.
00:56:11
>> Support for the show comes from
00:56:12
NetSuite. We all hear all the time how
00:56:14
AI can push businesses to new frontiers.
00:56:16
If you're still not sure what that
00:56:17
actually looks like, particularly for
00:56:19
you and your company, then look no
00:56:21
further than NetSuite by Oracle.
00:56:23
NetSuite is a top AI cloud ERP trusted
00:56:26
by over 43,000 businesses. It's a
00:56:28
unified suite that brings your
00:56:29
financials, inventory, commerce, HR, and
00:56:31
CRM into a single source of truth. With
00:56:34
all that connected data, your AI doesn't
00:56:35
throw out its best guess. It actually
00:56:37
knows what it's talking about. It can
00:56:39
intelligently automate routine tasks,
00:56:40
deliver actionable insights, and help
00:56:42
make fast AI powered decisions with
00:56:44
confidence. Now with NetSuite AI
00:56:46
connector, you can use the AI of your
00:56:48
choice to connect to your actual
00:56:49
business data. Plus, you can automate
00:56:51
those tiresome manual processes. It's AI
00:56:54
built into the system that runs your
00:56:55
business, affording you total
00:56:57
flexibility. Get ahead of the game and
00:56:59
put AI to work today with NetSuite. If
00:57:02
your revenues are at least in the seven
00:57:04
figures, get the free business guide,
00:57:05
Demystifying AI, at netsuite.com/pivot.
00:57:09
The guide is free to you at
00:57:10
netsuite.com/pivot.
00:57:11
netsuite.com/pivot.
00:57:15
>> Okay, Scott, let's hear a prediction.
00:57:18
>> I think that
00:57:20
what was supposed to be the most
00:57:21
anticipated IPO, maybe with the
00:57:24
exception of kind of SpaceX, xAI, Tesla,
00:57:28
the
00:57:29
>> whatever Tesla's not in there yet.
00:57:30
>> Probably the most anticipated was
00:57:33
the IPO of OpenAI in 2026, sometime
00:57:36
this year or early '27. I don't think
00:57:38
that's going to happen. Um I think that
00:57:42
yeah I think this company has now
00:57:44
gone into full
00:57:47
um I don't call it panic mode but it
00:57:50
feels as if momentum has a habit
00:57:53
of creating more momentum and I think
00:57:54
the momentum is really negative around
00:57:56
this company
00:57:57
>> what happens where does it go what does
00:57:59
it do
00:58:00
>> well I think they'll substantially
00:58:03
um scale back. I mean have you
00:58:05
already seen, have you already
00:58:06
seen Jensen Huang and Sam Altman, who were
00:58:09
you
00:58:09
Bud buddies are already [ __ ] posting
00:58:11
each other,
00:58:12
>> right?
00:58:12
>> Claiming that the hundred billion dollar
00:58:14
agreement was a framework and
00:58:16
they're not actually making the hundred billion
00:58:17
investment.
00:58:18
>> May I just say you said that
00:58:21
>> well that was ridiculous. These circular
00:58:23
deals I'll give you a hundred billion.
00:58:25
I'll invest 100 billion if you invest
00:58:26
100 billion in our chips. And now and
00:58:28
now quote unquote Jensen's backtracking
00:58:30
and saying well it was just a framework
00:58:33
they couldn't justify it. Nvidia stock
00:58:35
has gone down because people are worried
00:58:37
about exposure to open AI. Right. So
00:58:38
what does OpenAI do? They start shit
00:58:40
posting Nvidia and saying no, it was because
00:58:42
their chips didn't live up to our
00:58:44
expectations. When when the biggest
00:58:47
player in the space, Jensen Huang, and kind
00:58:50
of the young gun OpenAI start shit
00:58:52
posting each other and and they back out
00:58:55
of this hundred billion dollar
00:58:56
investment framework. That is a really
00:59:00
bad sign.
00:59:01
>> He kept using what was the word? We're
00:59:02
honored to be invited. What was he
00:59:05
saying? It was so funny. Yeah, but
00:59:06
they're both going on background now and
00:59:07
blaming each other.
00:59:08
>> Oh, totally. Utterly. Like, can I just
00:59:10
give people a lesson? When you hear
00:59:12
sources close to the situation, if they
00:59:14
were any closer to either of them,
00:59:15
they'd be on the other side of them.
00:59:17
>> That's them, right? The
00:59:20
>> So, I think the momentum, the
00:59:23
worm has turned. And it's not that
00:59:24
OpenAI isn't an unbelievable company
00:59:26
that could go public at like a $50
00:59:28
billion market capitalization. But the
00:59:29
problem is when you sell some investors
00:59:32
in at 250, 450, and then if he's able to
00:59:34
close this round at 850, they're not
00:59:37
willing to go public or let you have a
00:59:39
liquidity event that cuts there. What
00:59:41
happens in an IPO? Say he went public at
00:59:44
300 billion next year and said, "Okay,
00:59:45
the market isn't what we thought."
00:59:47
Unless there's a couple years where the
00:59:49
latest round of investors get so
00:59:51
fatigued they're willing to take a 60%
00:59:53
haircut. All of your shares, the last
00:59:55
round of investment has a preference,
00:59:58
meaning they they're the first money
01:00:00
out. So the 50 or 100 billion going in
01:00:01
at 850 doesn't want to give up their
01:00:04
liquidity preference and let them go
01:00:06
public if they're going public at less
01:00:07
than 850, which I think they would. So
01:00:10
your last round of investors become a
01:00:12
veto block for going public unless
01:00:14
you're going to go public at a valuation
01:00:16
greater than 850.
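The veto mechanics described here can be sketched numerically. This is a minimal model assuming a standard 1x liquidation preference and the round sizes mentioned in the conversation (roughly $100 billion in at an $850 billion valuation); the function name and exact figures are illustrative, not from any actual term sheet:

```python
# Sketch of why a last-round investor with a 1x liquidation preference
# blocks an IPO priced below their entry valuation (figures illustrative).
def last_round_outcomes(invested_b: float, entry_valuation_b: float,
                        ipo_valuation_b: float) -> tuple[float, float]:
    """Value (in $B) to the last-round investor if they convert to common
    at IPO, vs. the 1x first-money-out preference they keep by staying private."""
    ownership = invested_b / entry_valuation_b       # stake bought in the round
    as_common = ownership * ipo_valuation_b          # IPO forces conversion to common
    preference = min(invested_b, ipo_valuation_b)    # 1x preference: first money out
    return as_common, preference

# $100B in at an $850B valuation, hypothetical IPO at $300B:
common, pref = last_round_outcomes(100, 850, 300)
print(round(common, 1), pref)  # 35.3 100
# Converting to common would cost them ~$65B versus the preference,
# so they veto any IPO priced below the $850B they came in at.
```

At any IPO valuation above the entry round, conversion beats the preference and the veto incentive disappears, which is the "greater than 850" condition in the discussion.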
01:00:17
>> So what do they do? You haven't answered
01:00:19
my question.
01:00:20
>> They'll dramatic, in my opinion, they'll
01:00:22
dramatically scale back their capital,
01:00:24
their capex and they'll end up with a
01:00:26
much smaller, much less ambitious
01:00:28
amazing company that's only worth a
01:00:30
$100 or $200 billion. It's only one of
01:00:31
the 30 most valuable companies in
01:00:34
America. Not the
01:00:35
>> Do they get bought or what? Well, that means
01:00:36
everyone else will get collapsed, right?
01:00:38
Or not. I think the whole my opinion if
01:00:40
you look at and I look at weird signals
01:00:42
the percentage of ads at the Super Bowl
01:00:44
right if you look at all this I think
01:00:47
there's a ton of anecdotal evidence
01:00:49
showing that while AI may live up to its
01:00:51
potential the market cap of the biggest
01:00:54
players this year is about to throw up
01:00:57
which isn't to say that similar in 2000
01:00:59
when the market cap of Amazon went down
01:01:00
95% it's still not going to be an
01:01:03
unbelievable company but I think we're
01:01:05
about to see a dramatic recalibration in
01:01:07
the markets, which includes OpenAI's
01:01:09
IPO plans getting queered. Now, who's
01:01:11
going to take their place? And this is
01:01:13
the prediction.
01:01:14
>> Mhm.
01:01:15
>> The most impressive numbers hands down
01:01:17
that no one I wasn't expecting.
01:01:20
>> Kalshi's
01:01:23
year on year
01:01:24
>> and not Polymarket, right?
01:01:25
>> Well, Kalshi is actually of the two
01:01:29
the clean well-lit space of this I see.
01:01:32
Okay.
01:01:33
>> Of this marketplace, right? It's a
01:01:35
little Kalshi is CFTC regulated. It's also
01:01:39
in the US. It's peer-to-peer trading.
01:01:41
It's federally regulated. Um I have some
01:01:45
I don't have moral clarity around these
01:01:46
issues because I do think they tap into
01:01:48
the dopamine of a young, more risk-aggressive
01:01:50
male brain. But just let me go straight
01:01:52
to the numbers here.
01:01:54
>> In 2026 or in this Super Bowl, right,
01:01:58
>> over a billion dollars in trading volume
01:02:01
on Kalshi. That's up 2,700%.
01:02:05
It was up 28-fold this year. And you know
01:02:08
who's getting absolutely the [ __ ] kicked
01:02:10
out of them is Flutter, the gambling
01:02:13
sites.
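As a quick arithmetic aside, the two growth figures quoted above are the same statement: a percentage increase converts to a multiple by dividing by 100 and adding back the original 1x.

```python
# "Up 2,700%" and "up 28-fold" describe the same growth.
pct_increase = 2700
multiple = 1 + pct_increase / 100
print(multiple)  # 28.0
```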
01:02:15
They're killing these guys. The sports
01:02:17
market accounted for about 90% of Kalshi's
01:02:19
activity this month. And it's it's
01:02:22
having an incredible impact on
01:02:24
traditional gambling and sportsbooks. Um
01:02:28
analysts have noted that Kalshi's
01:02:30
rise coincides with the underperformance
01:02:32
in major sportsbook stock prices,
01:02:34
DraftKings, Flutter, as traders shift some
01:02:36
activity towards prediction markets and
01:02:39
with a venue that's easy to access
01:02:40
nationwide, which Kalshi is, even in states
01:02:43
without legal sports betting the firm is
01:02:45
attracting bettors who might otherwise
01:02:47
um have used traditional sports books.
01:02:49
So this is this company, and my
01:02:53
prediction is the following.
01:02:55
OpenAI, way to the downside, doesn't go
01:02:58
public. Kalshi is going to be, in
01:03:01
my opinion the kind of IPO we're all
01:03:04
trying to get into in Q2 or Q3 of this
01:03:06
year.
01:03:07
>> Kalshi it is. All right. Well, that's
01:03:09
interesting. You've been sounding this
01:03:10
alarm for these companies. Interesting.
01:03:12
Fascinating. That's a big one, Scott.
01:03:14
That's a big one.
01:03:15
>> We'll see. Right.
01:03:16
>> Yeah. Anyway, we want to hear from you.
01:03:19
Send us your questions about business,
01:03:20
tech, or whatever's on your mind. Go to
01:03:21
nymag.com/pivot
01:03:23
to submit a question for the show or
01:03:25
call 855-51-PIVOT. Uh elsewhere in the
01:03:28
Cara and Scott universe this week on
01:03:29
Prof G Markets, Scott spoke with Eswar
01:03:32
Prasad, professor of trade policy and
01:03:34
economics at Cornell University to
01:03:36
discuss why he thinks economics,
01:03:38
domestic politics, and geopolitics are
01:03:40
stuck in a doom loop. Doom loop. Let's
01:03:43
listen to a clip. Globalization used to
01:03:46
be seen as a positive sum game where
01:03:48
countries could benefit mutually from
01:03:50
trade and that would be an offset to
01:03:53
what is intrinsically the zero sum game
01:03:55
of geopolitics where one country can
01:03:57
gain influence only at the expense of
01:04:00
another. But now even globalization has
01:04:02
become seen as a zero sum game. So it
01:04:05
isn't offsetting the zero sum game of
01:04:07
geopolitics and worse some of the
01:04:10
negative dynamics of globalization have
01:04:12
started infecting domestic politics not
01:04:14
just in the US but in many other
01:04:16
countries.
01:04:17
>> God I feel smarter already my people
01:04:19
>> you know there's
01:04:22
professor um
01:04:23
>> Prasad that one of the things that
01:04:26
struck me and I said this
01:04:27
>> we graduated the same year from
01:04:29
undergraduate, me from UCLA, him from the
01:04:32
University of Madras.
01:04:33
>> Mhm. I graduated with a 2.27 GPA with an
01:04:37
incredible ability to make bongs out of
01:04:39
any household item.
01:04:40
>> He won a scholarship in India that
01:04:42
identified like one of the 50 smartest
01:04:44
kids of a billion kids.
01:04:46
>> And so what what do what does a guy
01:04:49
who's one of the 50 like he this guy
01:04:52
could walk into the Rose Bowl and take
01:04:54
the average IQ of those 80,000 people up
01:04:56
a couple points. That's how smart and
01:04:58
hardworking this man is. So what are we
01:04:59
doing to what are we saying to these
01:05:01
people now? Can you imagine a kid coming
01:05:02
out of the University of Memphis right
01:05:04
now in 2026? Is he going to go to Brown?
01:05:07
>> Yeah.
01:05:07
>> No, he's going to go to McGill or he's
01:05:09
going to go to Instituta or he's going
01:05:11
to go to
01:05:11
>> INSEAD
01:05:12
>> or who knows maybe
01:05:14
>> maybe the University of Córdoba in
01:05:16
Argentina. I mean,
01:05:17
>> speaking of doom loops, academic doom
01:05:20
loops.
01:05:20
>> We're the sports team that used to have
01:05:22
access to the number one draft pick at any
01:05:25
college in the world and we've said no,
01:05:27
we don't want
01:05:27
>> and now we just have Prof coming back to
01:05:30
the
01:05:32
Anyway, it's it sounds like a great
01:05:34
interview. I'll be listening to it. That
01:05:35
is the show. Thanks for listening to
01:05:37
Pivot and be sure to like and subscribe
01:05:39
to our YouTube channel. Uh we'll be back
01:05:41
next week.


Episode Highlights

  • Managing Subscriptions for Loved Ones
    A listener shares how they saved money by managing their elderly parent's subscriptions.
    “Every child of an elderly person should go through all of their parents' subscriptions.”
    @ 04m 21s
    February 13, 2026
  • Pam Bondi's Controversial Testimony
    Attorney General Pam Bondi faced criticism during her testimony about the Epstein files, sparking heated exchanges.
    “It's grotesque. I mean I don't know.”
    @ 07m 18s
    February 13, 2026
  • Social Media Trial Begins
    A landmark trial against Meta and YouTube kicks off, focusing on addiction claims.
    “This is a really important trial.”
    @ 22m 57s
    February 13, 2026
  • Addiction Statistics
    24% of teens are clinically addicted to social media, compared to 6% for drugs.
    “The average American teen spends 4.8 hours a day using social media.”
    @ 25m 22s
    February 13, 2026
  • Parental Concerns
    Parents are increasingly worried about the impact of social media on their children.
    “The tsunami of parental concern is washing over all this.”
    @ 26m 43s
    February 13, 2026
  • The Illusion of Privacy
    Privacy is a myth in today's world; cameras are everywhere, especially in major cities.
    “Privacy doesn't exist. Get used to it.”
    @ 34m 12s
    February 13, 2026
  • Rights to Secrets
    In a world of surveillance, the right to keep secrets remains vital.
    “People have the right to have secrets.”
    @ 34m 48s
    February 13, 2026
  • The Surveillance Debate
    The balance between surveillance for safety and personal privacy is increasingly contentious.
    “We consistently trade our privacy for utility.”
    @ 38m 16s
    February 13, 2026
  • AI's Troubling Trends
    Concerns rise as AI's impact on mental health and privacy becomes more apparent.
    “Why are we in peril?”
    @ 49m 46s
    February 13, 2026
  • Jimmy Lai's Imprisonment
    Jimmy Lai has been sentenced to 20 years, raising concerns about freedom of speech.
    “He's a hero and what's happened to him is a death sentence.”
    @ 52m 37s
    February 13, 2026
  • Impact of Imprisoning Journalists
    Imprisoning journalists leads to economic decline and societal unrest.
    “When you start imprisoning journalists, the nation gets poorer and angrier.”
    @ 52m 40s
    February 13, 2026
  • OpenAI's IPO Uncertainty
    OpenAI's anticipated IPO faces challenges as market momentum shifts negatively.
    “The momentum has turned. It's not that OpenAI isn't an unbelievable company.”
    @ 59m 23s
    February 13, 2026

Episode Quotes

Key Moments

  • Economic Impact 00:55
  • Pam Bondi Testimony 07:18
  • Expert Testimony 23:21
  • Parental Concerns 26:43
  • Privacy Myth 34:12
  • Jimmy Lai's Sentence 51:39
  • Journalism and Economy 52:40
  • OpenAI IPO Predictions 59:23
