
Brain Rot Emergency: These Internal Documents Prove They’re Controlling You!

February 16, 2026 / 02:18:45

This episode discusses technology addiction, mental health, and the impact of social media on attention spans. Guests Jonathan Haidt, a social psychologist, and Adi Jaffe, a Harvard physician, address the crisis of technology addiction and its effects on society.

Haidt shares insights from his book, "The Anxious Generation," highlighting how social media platforms, particularly short-form videos, are rewiring our brains and diminishing attention spans. He emphasizes the need for action to protect children from these addictive technologies.

Jaffe provides a medical perspective, explaining the physiological effects of excessive screen time on mental health, including increased risks of anxiety and depression. He suggests practical strategies for reclaiming attention, such as setting boundaries with devices and engaging in meaningful activities.

The conversation also touches on the societal implications of technology addiction, including the potential for increased loneliness and a sense of meaninglessness among young people. Both guests advocate for legislative changes to protect children and promote healthier relationships with technology.

Overall, the episode serves as a call to action for individuals and society to address the mental health crisis linked to technology use and to reclaim attention in an increasingly digital world.

TL;DR

Experts discuss technology addiction's impact on mental health and attention spans, advocating for action to protect children from social media's effects.

Video

00:00:00
You are actively rewiring your brain for
00:00:02
the worse by engaging with social media,
00:00:04
high-volume, quick videos.
00:00:06
>> And the social media executives don't
00:00:07
let their kids use this stuff because
00:00:09
they designed it to be addictive and
00:00:11
they know that millions and millions of
00:00:12
kids have been cyberbullied, sextorted. Many
00:00:15
have committed suicide. So, I'm getting
00:00:17
angry.
00:00:17
>> And then from the medical perspective,
00:00:19
it's rewiring your body, increasing your
00:00:21
risk of heart disease and PTSD.
00:00:23
>> We've moved too far into the virtual
00:00:25
world and the results are catastrophic.
00:00:27
People are spending roughly 6 and
00:00:29
a half hours a day on their phones. What
00:00:31
do we do about this?
00:00:32
>> Well, here's the amazing thing. We
00:00:34
actually can control our fate. So, we
00:00:36
are joined by a social psychologist and
00:00:38
a Harvard physician
00:00:40
>> to dive into the technology addiction
00:00:41
and brain rot crisis billions are facing
00:00:44
worldwide
00:00:44
>> and how we can counter its devastating
00:00:46
mental health effects. You have to
00:00:49
reclaim your attention because without
00:00:51
the ability to pay attention for several
00:00:52
minutes at a time, we're seeing the
00:00:54
destruction of human potential, the
00:00:56
human relationships, the connection.
00:00:58
>> But there's all these small tweaks that
00:00:59
you can do to override that primal urge
00:01:01
to scroll. For example, 91% of people
00:01:04
had an improvement in attention,
00:01:05
well-being, and mental health after
00:01:07
just 2 weeks of continuing to use your
00:01:09
device without internet access.
00:01:11
Next, keep your phone out of arm's
00:01:14
reach because the sheer potential for
00:01:15
distraction has actually been shown to
00:01:17
change your prefrontal cortex, a
00:01:19
phenomenon called brain drain.
00:01:20
>> So, yes, we should exert more
00:01:21
self-control, but we're being pushed into
00:01:23
addictive apps and it's messing us all
00:01:25
up. That's not our fault.
00:01:26
>> Would you advise people to delete these
00:01:28
short form videos?
00:01:29
>> Oh my god, yes, that would be the most
00:01:30
important thing you can do for your
00:01:31
intelligence and for humanity. But if I
00:01:33
was going to offer some specific advice,
00:01:35
here are the three things that I do with
00:01:37
my students to reclaim attention. And
00:01:39
then to add to that, I have the
00:01:40
three-second brain reset. So, first
00:01:43
>> I wanted to ask you guys what you
00:01:45
thought of this.
00:01:46
>> Hey, you're back.
00:01:47
>> This terrifies me.
00:01:48
>> We've got to stop this now.
00:01:54
>> Guys, I've got a quick favor to ask you.
00:01:56
We're approaching a significant
00:01:57
subscriber milestone on this show, and
00:01:59
roughly 69% of you that listen and love
00:02:02
this show haven't yet subscribed for
00:02:04
whatever reason. If there was ever a
00:02:05
time for you to do us a favor, if we've
00:02:07
ever done anything for you, given you
00:02:09
value in any way, it is simply hitting
00:02:11
that subscribe button. And it means so
00:02:13
much to myself, but also to my team, cuz
00:02:14
when we hit these milestones, we go away
00:02:16
as a team and celebrate. And it's the
00:02:17
thing, the simple, free, easy thing you
00:02:19
can do to help make this show a little
00:02:20
bit better every single week. So, that's
00:02:23
a favor I would ask you. And, um, if you
00:02:25
do hit the subscribe button, I won't let
00:02:27
you down. And we'll continue to find
00:02:29
small ways to make this whole production
00:02:30
better. Thank you so much for being part
00:02:33
of this journey. Means the world. And uh
00:02:35
yeah, let's do this.
00:02:42
Jonathan, I've heard you say that the
00:02:43
destruction of attention is the largest
00:02:45
threat to humanity that's happening
00:02:47
around the world. And I've also heard
00:02:49
you say that short form videos are the
00:02:51
worst of the worst because they're
00:02:53
shattering attention spans. The reason
00:02:55
why I wanted to have this conversation
00:02:57
today is somewhat personal. And in fact,
00:03:00
all of the conversations I have on the
00:03:01
Diary are somewhat personal to some
00:03:03
degree. They're inspired by some
00:03:05
unanswered questions I have in my head
00:03:07
and also some observations I have in my
00:03:09
life and the observation I've had is
00:03:11
that short form videos in particular are
00:03:15
making my life worse and actually I've
00:03:17
got to say the catalyst moment really
00:03:20
where I thought you know I need to get
00:03:21
you exceptional people together to have
00:03:23
this conversation was this: I
00:03:25
then looked at my screen time and saw a
00:03:27
huge change. I felt so much worse because
00:03:29
all these social platforms have short
00:03:31
form video now and then I actually heard
00:03:32
Elon Musk who you know has a social
00:03:35
media platform that does short form
00:03:37
video say that he thinks it's one of the
00:03:38
worst inventions for humanity.
00:03:41
>> Jonathan, why did you say what you said
00:03:42
about short form video and this
00:03:45
corruption of attention?
00:03:46
>> Yeah, because I wrote a whole book
00:03:48
called The Anxious Generation focusing
00:03:50
on teen mental health. That was the
00:03:52
mystery that popped up in the mid-2010s.
00:03:54
Why are people born after 1995 so much
00:03:57
more anxious and depressed? And I've
00:03:59
been tracking down that mystery, and a
00:04:01
lot of it points to social
00:04:02
media and especially Instagram, social
00:04:05
comparison, all the things we know about
00:04:07
social media. When the book came out in
00:04:09
2024, since then what I realized is that
00:04:12
I vastly underestimated the damage
00:04:15
because I focused on mental health,
00:04:17
which is a catastrophe. But the bigger
00:04:19
damage is the destruction of the human
00:04:21
ability to pay attention. Without the
00:04:24
ability to pay attention for several
00:04:26
minutes at a time, ideally 10 or 20
00:04:28
minutes at a time. Without that, you're
00:04:30
not going to be of much use as an
00:04:32
employee. You're not going to be of much
00:04:33
use as a spouse. You're not going to be
00:04:35
successful in life. And that's when I
00:04:37
realized this is way beyond mental
00:04:39
health. This is changing human
00:04:41
cognition, changing human attention, and
00:04:44
possibly on a global scale.
00:04:47
Adi, what perspective do you come at
00:04:50
this from? And what's been your
00:04:51
perspective through all the work you've
00:04:52
done about brains and stress and
00:04:54
neuroscience and all these kinds of
00:04:55
things that has shaped the way that you
00:04:57
think about social media, screen time,
00:04:59
short form video.
00:05:01
>> My background is that I'm a physician at
00:05:04
Harvard, and my expertise is in
00:05:06
stress, burnout, and mental health. And
00:05:08
so that is the lens that I view all of
00:05:11
this through. We know that the most
00:05:14
deleterious relationship that you have
00:05:16
is with your device. You know, in every
00:05:18
healthy relationship, we have
00:05:19
boundaries. We have boundaries with our
00:05:21
kids, our parents, our colleagues,
00:05:25
you know, with our friends. And
00:05:27
yet, we have no boundaries and often
00:05:29
porous boundaries when it comes to the
00:05:32
relationship you have with your device.
00:05:33
So, it's not so much about, you know,
00:05:35
becoming a digital monk and renouncing
00:05:37
technology because technology can serve
00:05:39
us, right? It inspires, educates,
00:05:42
connects. Now more than ever, it's so
00:05:44
important to be an informed citizen, but
00:05:45
not at the expense of your mental
00:05:47
health. And so what Jonathan was saying,
00:05:48
this, you know, constantly being engaged
00:05:51
with your devices, with social media,
00:05:53
the scrolling from the minute you wake
00:05:55
up until you go to bed, there's a reason
00:05:58
why you have your best ideas in the
00:05:59
shower. And that's because that's the
00:06:01
only place in the whole day where you
00:06:03
are not with your device. People take
00:06:05
their device to the bathroom. They sleep
00:06:07
with their device. They eat with their
00:06:09
device. They walk down the street.
00:06:11
There are more near-miss pedestrian
00:06:13
accidents because people are
00:06:15
crossing the street while
00:06:16
looking at their devices. And so there's
00:06:18
all of this brain biology at play behind
00:06:21
the scenes. So both of you have talked
00:06:23
about how it doesn't feel good to engage
00:06:25
and constantly be on your phone, that
00:06:28
sense of infinite scroll. But,
00:06:30
you know, it feels like you're doing
00:06:32
nothing. You're just doing this, right?
00:06:33
What are you doing? But in fact, it is
00:06:35
not passive. It is active. And it has a
00:06:37
profound effect on your biology, on your
00:06:40
brain, on your psychology, and also on
00:06:42
social factors that I hope we talk about
00:06:44
today.
00:06:45
>> You know, scrolling, wasting a bit of
00:06:46
time doesn't seem so harmful.
00:06:49
What is the big, if we play this forward
00:06:51
10, 20, 30 years, what is the big risk
00:06:53
or threat? The biggest threat right now,
00:06:56
we don't even have to wait 20 years, is
00:06:57
that through a process called
00:06:59
neuroplasticity, which is just a big
00:07:01
fancy word that simply means that your
00:07:03
brain is a muscle, by engaging
00:07:05
with social media, that sense of
00:07:08
high-volume, low-quality, quick videos,
00:07:11
you are actively rewiring your brain for
00:07:13
the worse. So you're increasing your
00:07:15
sense of stress, worsening your mental
00:07:17
health, attention, cognition,
00:07:19
distractibility, irritability, complex
00:07:22
problem solving. All of that changes
00:07:25
when you engage in that
00:07:27
infinite scroll.
00:07:28
>> Yeah. I'd like to add on here because
00:07:30
one of the main arguments I get is, ah,
00:07:32
this is what they said about television.
00:07:34
Oh, this is what they said about comic
00:07:35
books. This is just another moral panic.
00:07:37
But people need to understand why
00:07:39
touchscreen devices are so different
00:07:41
from television. And so I think parents
00:07:44
find this helpful if I just lay this out
00:07:45
briefly. Good screen time versus bad
00:07:47
screen time. So humans are storytelling
00:07:51
animals. We have always, as long as
00:07:53
we've had language, we've raised our
00:07:54
kids with stories, epic poems, all kinds
00:07:57
of stories. Stories are good. The
00:07:59
human brain needs lots of patterns. The
00:08:01
child's brain needs lots of patterns to
00:08:03
develop. So the worst thing you can do
00:08:06
is hand your child the device because
00:08:08
they're crying for it because they've
00:08:09
been trained to get it and you're
00:08:10
busy. So you hand them the device.
00:08:12
They're quiet. What's happening? They're
00:08:14
sitting alone. Not, you know, when I was
00:08:16
a kid, we always watched with my sisters,
00:08:18
with my friends. You're arguing about
00:08:19
it. You're talking. That's social. A kid
00:08:20
sitting alone with a device in his hand.
00:08:23
It's not long stories. It's never long
00:08:25
stories. It always ends up at YouTube
00:08:28
shorts or Tik Tok or Instagram reels for
00:08:29
older kids. So, they're
00:08:31
doing this. But here's the key thing
00:08:33
that it does that a television does not.
00:08:34
A television puts you in a state that
00:08:36
psychologists call transportation. You
00:08:39
get into a story and you find yourself
00:08:41
pulled in and you're rooting for the
00:08:43
characters, and this is how a
00:08:45
brain gets tuned up to social patterns
00:08:47
but it can't happen in 10 seconds. It
00:08:49
can't happen in one minute. It takes a
00:08:51
long period of time and there is no
00:08:54
reinforcement. The
00:08:57
television doesn't do anything to you.
00:08:59
You don't have any response. Whereas a
00:09:02
touchscreen device is a Skinner box. So
00:09:04
BF Skinner was one of the founders of
00:09:06
behaviorism and he put rats and pigeons
00:09:09
in a box where he could deliver a
00:09:11
reinforcement, a little grain of food on
00:09:13
a schedule. And by giving them quick
00:09:15
reinforcements for behavior, he could
00:09:16
train them to do amazing tricks in just
00:09:18
a few hours. When you give your kid a
00:09:20
touchscreen device, it's stimulus
00:09:23
response, swipe, get a reward or not,
00:09:26
variable ratio. And then you
00:09:28
just keep doing that. So you are, as Adi
00:09:30
said, it is rewiring your brain. It's
00:09:32
not just wasting time. It is literally
00:09:35
training you to do things where
00:09:37
television didn't do that. So this is a
00:09:39
whole new game.
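An aside for the technically minded: the variable-ratio schedule described here (swipe, maybe get a reward) is easy to simulate. Below is a minimal Python sketch; the 30% reward probability and the 1,000-swipe session are invented for illustration and are not figures from the episode.

```python
import random

def swipe(reward_probability=0.3):
    """One swipe under a variable-ratio schedule: the reward arrives
    unpredictably, which is what makes the behavior so persistent."""
    return random.random() < reward_probability

random.seed(42)  # reproducible run
gaps, since_last = [], 0
for _ in range(1000):   # simulate a scrolling session
    since_last += 1
    if swipe():         # did this swipe pay out?
        gaps.append(since_last)
        since_last = 0

print(f"rewards: {len(gaps)}, mean swipes per reward: "
      f"{sum(gaps) / len(gaps):.2f}")
# The number of swipes between rewards varies from reward to reward
# (a variable ratio), unlike a fixed-ratio schedule where every Nth
# swipe pays out -- the pattern Skinner used to train rats and pigeons.
```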
00:09:40
>> And to add to that, you know, from the
00:09:41
medical perspective, you're shortening
00:09:43
this attention span. And what happens
00:09:45
over time is so like Jonathan said,
00:09:47
right, you're not sleeping as well
00:09:49
because you are engaged with your
00:09:50
device. We know that 80% of people are
00:09:53
checking their phones within minutes of
00:09:54
waking up. We have something called
00:09:56
revenge bedtime procrastination. This
00:09:58
concept of, you know, at the end of the
00:10:00
day you're fatigued, you've had a long
00:10:01
day, you've had no me time, and you want
00:10:03
to get to bed early. We all know, by the
00:10:04
way, what the data is that, you know,
00:10:06
we've been taught since we were little
00:10:08
kids, right? Like bedtime, sleep is
00:10:10
important, it's good for your body, it's
00:10:11
good for your brain. And we might have
00:10:12
all the knowledge in the world, but in
00:10:14
terms of action, there's a wide gap
00:10:15
between knowledge and information and
00:10:17
action. And so revenge bedtime
00:10:19
procrastination is kind of an offshoot.
00:10:20
So what happens? So, you know, you have
00:10:22
that decreased attention. You have that
00:10:25
irritability, hypervigilance. And so, at
00:10:27
night, at the end of the day, it's 9:00
00:10:29
p.m. You finally, you know, if you're a
00:10:30
parent, your kids are asleep, your
00:10:32
kitchen is clean, maybe you finish your
00:10:34
entrepreneurial day, and you finally sit
00:10:36
down with Melanie on the couch, and
00:10:38
you're like, "H, some me time." And, you
00:10:41
know, you want to get to bed early, and
00:10:42
you know it's good for you. But then
00:10:44
suddenly, you're scrolling and before
00:10:45
you know it, it's 2 a.m. and you're
00:10:46
saying, "Oh my god, what happened? Why
00:10:48
am I still awake? What was I doing all
00:10:49
this time?" What happens is that you
00:10:52
essentially give yourself some me time
00:10:54
at night and so you procrastinate
00:10:56
bedtime. And so what happens is with
00:10:57
this revenge bedtime procrastination, it
00:11:00
affects your sleep. And when you
00:11:02
don't have good quality
00:11:03
sleep, you have difficulty falling
00:11:04
asleep and staying asleep, and sleep debt over
00:11:08
time, for kids and for adults, has all sorts
00:11:11
of ramifications. So this is just the
00:11:14
tip of the iceberg. This short-form
00:11:17
video content and the ripple effects go
00:11:20
far and wide. Not only is it rewiring
00:11:22
your brain, it's rewiring your body, it
00:11:25
is affecting your sleep, which increases
00:11:28
your risk of heart disease later in
00:11:30
life. And when you're consuming
00:11:31
graphic videos and graphic images, it
00:11:35
can increase your personal risk of PTSD
00:11:38
through vicarious trauma even if you
00:11:40
weren't there. So, this is just a vast
00:11:43
network of things that can happen to you
00:11:46
simply because you're thinking, "Yeah,
00:11:47
it's harmless. What is it? It's just a
00:11:49
bunch of videos that I'm checking out.
00:11:51
It's a way for me to decompress."
00:11:53
>> What do I need to know about the nature
00:11:54
of the brain to understand exactly what
00:11:57
short form video is playing on, is
00:12:00
hijacking, is taking advantage of
00:12:03
>> the thing to understand about all of
00:12:05
this is that we have to focus on
00:12:07
childhood. Why do we have childhood? Um,
00:12:10
humans have this really interesting
00:12:12
childhood where we grow rapidly at
00:12:14
first and then we slow down for about
00:12:15
five or seven years. We don't grow very
00:12:17
quickly and then we speed up at puberty.
00:12:19
Whereas other primates, they just grow
00:12:21
and grow till they reach reproductive
00:12:22
age, then they reproduce. But we seem to
00:12:25
have this long period of sort of middle
00:12:26
childhood for cultural learning. It's a
00:12:29
period in which the kid is now
00:12:31
walking and talking and turning away
00:12:32
from the parents, and that's a time
00:12:34
for cultural learning to come in, and they pay
00:12:36
attention and they form relationships.
00:12:38
All these things have to happen slowly
00:12:40
because the neurons are gradually
00:12:42
growing. They're finding each other
00:12:43
based on what the child is doing. Okay?
00:12:46
So, we grow up in the real world, and
00:12:48
that happens over time. And a lot of
00:12:49
that is very physical. Kids are very
00:12:50
physical. Mammals are very physical and
00:12:52
there's a lot of touch. So, that's a
00:12:54
healthy human childhood. But when you
00:12:56
give a child an iPad or your old iPhone
00:13:00
and they begin doing the
00:13:03
touching and swiping, that is going to
00:13:05
hijack their attention. That is going to
00:13:07
push out all other forms of action and
00:13:09
learning. And that is going to change
00:13:10
the way the parts of the brain that
00:13:12
learn to pay attention, what's called
00:13:14
executive function. It's going to change
00:13:16
the way the brain learns to pay
00:13:17
attention. It's going to change the
00:13:19
reward circuits. I think you had Anna Lembke
00:13:21
recently who's the nation's expert on
00:13:23
addiction. And the way that she
00:13:25
describes it, how, you know, any one
00:13:26
addiction is going to change your reward
00:13:28
pathways to make you more vulnerable to
00:13:30
other addictions. So, we're setting our
00:13:32
kids up not just for this, but then when
00:13:34
they get a little older, it'll be video
00:13:36
games, it'll be uh porn, it'll be
00:13:39
gambling now. Everything is gambling.
00:13:41
So, we're setting them up for a life in
00:13:44
which their brain is saying, "Give me
00:13:46
something. Give me some quick dopamine.
00:13:48
Give me some quick dopamine. I don't
00:13:49
want to have to work for
00:13:50
anything. I don't want to have to apply
00:13:51
myself for an hour and then get a
00:13:53
reward."
00:13:54
And so what the short
00:13:56
videos are doing for kids is preventing
00:13:59
them from learning the connection
00:14:00
between hard work and a reward. Is there
00:14:03
anything else I need to know from a
00:14:04
neuroscience perspective about what's
00:14:06
going on in my brain when I
00:14:09
develop these addictions with short form
00:14:10
videos or these sorts of quick dopaminergic
00:14:12
tasks?
00:14:14
>> So we all as humans have a primal urge
00:14:16
to scroll. When you feel a sense of
00:14:18
stress, as many of us do in this moment
00:14:20
in life, it is your sense, you know,
00:14:23
your amygdala. And so it's your sense of
00:14:25
self-preservation. It's survival and
00:14:27
self-preservation. That is what your
00:14:28
amygdala does. So if you want me to show
00:14:30
you here, I have no idea what I'm doing
00:14:32
there.
00:14:32
>> Yeah, it's okay. So here, deep here,
00:14:36
it's a small almond-shaped structure.
00:14:37
And that is your amygdala. And your
00:14:40
amygdala, its main purpose is survival
00:14:42
and self-preservation. It houses your
00:14:45
stress response, your fight-or-flight
00:14:47
response, and it is truly what is
00:14:50
activated when you are engaging in
00:14:52
content, when you feel a sense of
00:14:53
stress. And so you have this primal urge
00:14:55
to scroll. And so evolutionarily, when
00:14:58
we all were cave people living
00:15:00
together, we would sleep at night and
00:15:02
there would be a night watchman scanning
00:15:04
for danger. And now we have
00:15:07
become our own night watchman. And so we
00:15:09
scan for danger all day, all night long.
00:15:11
How do we do that? We scroll. And then
00:15:13
the amygdala is triggered. And then you
00:15:14
scroll some more. And you scroll some
00:15:15
more. And you scroll some more. And so
00:15:17
over time, what you're doing is that
00:15:19
you're putting that amygdala in a state
00:15:21
of chronic activation. It's continually being
00:15:25
triggered. What happens to the amygdala
00:15:27
over time when it's continually
00:15:29
triggered? It starts to rewire your
00:15:30
brain in other ways. And how does it do
00:15:32
that? Through something called the
00:15:34
prefrontal cortex.
00:15:36
I can use this model, but I can
00:15:38
also just use my hands. When you put
00:15:40
your hand on your forehead, the area
00:15:41
right behind your forehead right here is
00:15:43
the prefrontal cortex. This is a very
00:15:45
important thing for our conversation.
00:15:47
This is the area of the brain, and what the
00:15:49
prefrontal cortex does is
00:15:52
govern executive functions. So
00:15:56
impulse control, memory, planning,
00:15:59
organization, strategic thinking,
00:16:01
complex problem solving and there is a
00:16:04
tension between your amygdala and the
00:16:06
prefrontal cortex. When your amygdala is
00:16:08
in the driver's seat, that prefrontal
00:16:10
cortex is quiet. And what is happening
00:16:13
as we continue to engage with our
00:16:15
devices and have this primal urge to
00:16:18
scroll, that amygdala upregulates and the
00:16:21
prefrontal cortex downregulates. And
00:16:23
over time, that is very problematic for
00:16:25
all of the reasons that we're kind of
00:16:27
introducing at the start of this
00:16:28
conversation. There was a meta-analysis
00:16:30
done in 2025 of 71 different studies and
00:16:33
it found that heavy short form video use
00:16:34
was associated with reduced thinking
00:16:36
ability, especially shorter attention
00:16:38
spans and weaker impulse control.
00:16:41
>> That's right. These studies are just
00:16:43
beginning to roll in now. Um, kids have
00:16:45
been on social media really a lot since
00:16:47
2008, but especially once they got
00:16:49
smartphones around 2012. Studies began
00:16:52
coming in in the 2010s, and look,
00:16:54
it's looking like the kids who spend
00:16:56
a lot of time on this are doing much
00:16:59
worse. They're more depressed. The focus
00:17:01
was on depression. And some other
00:17:03
researchers said no, it's just a
00:17:04
correlation. You can't prove
00:17:06
causation. And we've been going around
00:17:08
and around on this for about 10 or 15
00:17:10
years. Now we're doing the same thing
00:17:11
with the short form videos. The
00:17:14
damage everyone can see. My students
00:17:17
tell me this is what's happening. We
00:17:19
feel it. Studies are coming in, but
00:17:21
there will be a few studies here and
00:17:23
there that don't show it and people will
00:17:25
push that up. Meta spends a lot of
00:17:28
time and money to influence the public
00:17:30
debate. A lot of public documents are
00:17:32
coming out now about how they do that.
00:17:34
So, we can engage in debate over
00:17:37
research on short form videos for 5 or
00:17:38
10 years, but at that point, it's way
00:17:40
too late. We've lost a second
00:17:41
generation, Gen Alpha. So, I think when
00:17:44
we're talking about kids especially, we
00:17:46
need to have what's called the
00:17:47
precautionary principle, which is if
00:17:49
there's reason to think that this is
00:17:50
hurting kids, how about we don't roll it
00:17:53
out into every childhood? How about we
00:17:56
make these companies responsible? We
00:17:58
hold them responsible for what they're
00:17:59
doing to kids because we're about to
00:18:01
make the same mistake we made with
00:18:03
social media, letting it worm its way
00:18:05
into childhood. We have already done
00:18:06
that with short videos, and we're about
00:18:07
to do it with AI chat bots. In fact,
00:18:09
we're just beginning it in late 2025,
00:18:11
I'd say. I don't think people quite
00:18:14
realize how much these major social
00:18:16
media platforms have figured out that
00:18:18
short form video sells. We're
00:18:21
actually seeing this sort of global rise
00:18:23
in short form drama apps now. And I
00:18:25
don't know if you guys have seen these
00:18:26
apps, but it basically takes a movie
00:18:28
that used to be 2 hours long and it
00:18:30
breaks it down into say 60 different
00:18:32
parts. And a colleague of mine at my
00:18:34
company was showing me the other day in
00:18:35
different parts of the world they're
00:18:36
exploding. There's been a 190% increase
00:18:40
in short form drama apps. They take a long
00:18:42
form movie and turn it into short form
00:18:43
videos. Disney Plus plans to introduce
00:18:45
AI generated short form videos this
00:18:47
year, starting with 30-second limits
00:18:49
inside the Disney Plus app. And
00:18:51
TechCrunch also reported that as of
00:18:53
October 2025, Netflix tested short form
00:18:56
video content on phones and recently
00:18:57
announced its plan to expand this
00:18:59
feature. It appears that all of the
00:19:01
content we consume is going that way.
00:19:03
And listen, I'm friends with lots of
00:19:04
people at big social media platforms.
00:19:06
This doesn't get in the
00:19:08
way of me criticizing them, because I
00:19:10
think two things can be true at the same
00:19:11
time right so I think it can be true
00:19:13
that I have a podcast and I make short
00:19:14
form videos and that I also understand
00:19:17
that there's a real downside to them and
00:19:20
All of the major social media
00:19:22
platforms that I speak to have
00:19:24
a huge drive towards short form video. It
00:19:26
appears to be their number one
00:19:28
strategic priority, obviously because
00:19:30
of the success of TikTok. As of January
00:19:33
2026, TikTok, I believe, is the most
00:19:35
downloaded social app in the world now.
00:19:37
And if I'm running a social
00:19:40
media company and my one focus is
00:19:42
profit,
00:19:44
>> I'm now faced with an existential
00:19:45
crisis.
00:19:46
>> Yeah.
00:19:46
>> I either take part in this thing that is
00:19:48
driving the highest retention, therefore
00:19:50
the best ad payouts or I die.
00:19:54
>> So there are two comments on that. First
00:19:56
off, you know, when
00:19:59
we think about social media and how
00:20:02
society is shapeshifting to allow this
00:20:05
short form content there is a concept
00:20:08
that Jonathan and I briefly mentioned I
00:20:10
think prior to us filming called second
00:20:12
screen viewing and so what's happening
00:20:14
is that allegedly these big streamers
00:20:18
are asking their creative talent whether
00:20:20
it's screenwriters or actors or
00:20:23
directors to reiterate the
00:20:26
plot because as you're watching, you
00:20:28
know, when we were kids, we would watch
00:20:30
TV or movies and you just sit on the
00:20:31
couch and you'd have a bucket of popcorn
00:20:33
with your family and you'd watch a
00:20:34
movie, an hour, hour and a half, two
00:20:36
hours and now second screen viewing is
00:20:38
happening, which means that you're
00:20:39
watching a movie or a TV show and you're
00:20:42
on your device and so you are constantly
00:20:44
having that fragmented attention and we
00:20:45
are all doing it and so what these
00:20:47
streamers are allegedly asking their
00:20:49
creative talent to do is to reiterate
00:20:51
the plot. So it's shapeshifting. It
00:20:54
makes sense if my brain is, you know,
00:20:55
I'm 33 years old, so I've grown up with
00:20:57
a lot of this stuff. If my brain has
00:21:00
been wired to have shorter attention
00:21:02
spans, then movies from 30 years ago
00:21:04
are not going to cut it for me,
00:21:06
>> right? But then look what happens if
00:21:09
everybody chases that. And I know, look,
00:21:11
Netflix is making shorter and shorter
00:21:13
stuff. Even TED, the TED conference, TED
00:21:15
talks are getting shorter and shorter.
00:21:16
What does that do? It just repeats the
00:21:18
cycle. Now, I appreciate that you're in
00:21:21
a collective action trap, as you put it.
00:21:23
If I don't do it and everyone else is,
00:21:25
then I lose out. And so, the
00:21:26
business pressure on all the
00:21:28
creators, the business pressure goes
00:21:29
shorter, shorter, shorter. There's a
00:21:31
very useful psychological term
00:21:33
distinction here that I think would be
00:21:34
helpful, which is the difference between
00:21:36
psychological assimilation and
00:21:38
accommodation. This goes back to Jean
00:21:40
Piaget, the great developmental
00:21:41
psychologist. We have certain mental
00:21:43
structures. We have a model in our
00:21:45
head of how things work. And you know
00:21:48
then you learn something new. A
00:21:49
kid learns, oh, that's
00:21:51
an aardvark, okay, I put that into my model;
00:21:53
that's it, you just assimilate.
00:21:55
They learn lots of animal names, and then
00:21:58
they learn something that doesn't fit,
00:22:00
like you learn about bacteria, and now,
00:22:02
okay, now you
00:22:04
have to change your mental structure. It
00:22:06
takes a little time. You change your
00:22:07
mental structure to understand more
00:22:09
about life. That's what education really
00:22:11
is all about. You have to have a lot of
00:22:12
assimilation, of course, but you need that
00:22:14
accommodation over and over again. That's
00:22:17
why you want to go to college. That's
00:22:18
why you want to read novels. That's what
00:22:19
a great movie does. It takes time. And
00:22:22
so, one of the great things about this
00:22:24
modern technology is that we can do
00:22:26
things like have this three-hour
00:22:27
conversation. I can't believe it. People
00:22:29
are going to listen to it. So, this, you
00:22:31
know, long form content. This is all
00:22:34
about accommodation. Anybody who walks
00:22:36
out, who leaves this conversation
00:22:38
after 3 hours and isn't thinking about
00:22:40
something differently, we failed. Okay.
00:22:43
So, you are very much in the
00:22:44
accommodation business. That's great.
00:22:46
And then the question, both a moral
00:22:48
and a strategic question, is how much do
00:22:50
you need to play the quick-hit game
00:22:53
in order to get people there. I leave
00:22:54
that to you to do the moral calculation.
00:22:56
Maybe it balances out. But
00:22:59
I think that's where you are.
00:23:00
>> Would you advise people to delete these
00:23:03
short form?
00:23:04
>> Oh my god. Yes. Of course.
00:23:07
Yes. The most important
00:23:08
thing you can do for your intelligence
00:23:09
and for humanity would be to delete them.
00:23:11
So, what I advise my students to do is I
00:23:14
say just do this. Just delete one
00:23:18
of the social media apps that you use,
00:23:19
especially if it's TikTok, just delete it
00:23:21
from your phone. You can still check on
00:23:23
your computer. If someone sends you a
00:23:25
video, you can still watch it on your
00:23:26
computer. You can even check it, you
00:23:28
know, every weekend. You can spend some
00:23:30
time on it, but just get it off your
00:23:31
phone because on the phone, the phone is
00:23:33
always with us. It's an extension of our
00:23:34
body. And if it's always there, then
00:23:37
it's going to do what's called
00:23:39
attention fracking. It's going to break
00:23:40
up your attention. Every
00:23:41
7 seconds that you're not doing
00:23:43
something, you're going to go for the
00:23:44
phone. So, the best thing you can do to
00:23:47
make yourself smarter and a better
00:23:49
partner and a better human, I would say,
00:23:51
would be to delete especially
00:23:53
any of the short form video apps. So,
00:23:55
TikTok; unfortunately, YouTube, which has a
00:23:57
lot of good stuff on it, becomes YouTube
00:23:59
shorts. Instagram, which does a lot of
00:24:00
terrible things, but people do find it
00:24:02
useful for all kinds of purposes,
00:24:04
becomes Instagram reels. So, I think the
00:24:06
proper amount of short form video for
00:24:08
children 0 to 18 is zero. They should
00:24:10
never be watching the vertical videos.
00:24:12
Parents, don't ever let your kids watch
00:24:14
the short vertical videos. If
00:24:15
only there was a way to limit
00:24:17
it. Is there a way to put a time
00:24:18
limit? You can say it has to be 10
00:24:19
minutes or longer. Kids, you can have an
00:24:21
hour of YouTube, but it has to be 10
00:24:22
minutes or longer. Nothing shorter than
00:24:24
10 minutes. That at least will get rid
00:24:26
of the quick swiping,
00:24:28
the dopamine stuff. So I would say that
00:24:31
for kids, yes, not engaging with
00:24:34
it whatsoever. But, you know,
00:24:36
my approach is a little bit different
00:24:38
for someone who's in their 30s or
00:24:40
in their 40s, and the way I would kind of
00:24:42
frame that is
00:24:45
instead of renouncing it, you know, saying
00:24:47
I'm going to get it off my device and
00:24:49
I'm going to check on a desktop, which is
00:24:51
great, there are little tweaks
00:24:53
that we could do, because my approach is
00:24:56
to foster that sense of empowerment in
00:24:58
someone to help them make positive change.
00:25:01
And so one strategy that you could use
00:25:03
if you are saying there's no way I'm
00:25:05
getting rid of my, I'm not deleting these
00:25:07
apps from my phone, right? By
00:25:09
the way, I practice what I preach, and I
00:25:11
really don't engage in technology,
00:25:15
to the best of my ability. But one
00:25:17
thing that you could do is grayscale
00:25:18
your phone. And so especially at night
00:25:20
like it's 9:00 p.m. like we talked about
00:25:22
revenge bedtime procrastination. You
00:25:24
know that you're going to do it. You're
00:25:26
going to sit down and you're going to
00:25:27
scroll and before you know it, it's 2
00:25:28
am. Instead, grayscale your phone. This
00:25:31
simple switch. You can toggle it. I have
00:25:32
my phone set to grayscale, which simply
00:25:34
means that you're getting rid of your
00:25:35
color, making it black and white. And
00:25:37
so, when it is grayscaled, you
00:25:40
know, it doesn't have that same
00:25:42
addictive quality to it. It's like going
00:25:44
through a grocery store. A marketing
00:25:46
executive described it this way to me.
00:25:47
Going through a grocery store instead of
00:25:49
the technicolor junk food cereal, it's
00:25:52
just black and white. So
00:25:54
there's a lesser sense of compulsion to
00:25:57
continue checking. So that's like one
00:25:58
strategy you could use. And the other is
00:26:01
to set some boundaries. So geographical
00:26:04
boundaries: keep your phone out
00:26:06
of arm's reach if you're at a desk,
00:26:09
if you're a student, not right next to
00:26:11
you because we know there's this
00:26:12
phenomenon of brain drain. So it's not
00:26:14
just that when you're using your phone,
00:26:16
it can be a potential distraction, but
00:26:18
also just having it close by. It's
00:26:20
called brain drain. And so putting it
00:26:22
in a desk drawer, keeping it in another
00:26:24
part of the home if you are working,
00:26:27
keeping it far away from you. And so you
00:26:30
kind of can override that primal urge to
00:26:32
scroll, let your prefrontal cortex take
00:26:34
hold again. And so there's all these
00:26:36
small tweaks that you can do. You
00:26:39
think no?
00:26:39
>> Yes, there are all these small tweaks
00:26:40
you can do and they will make the heroin
00:26:42
a little bit less addictive. And yeah,
00:26:44
you should try those. But what I can say
00:26:45
after teaching this course for many
00:26:47
years is that people who try that,
00:26:49
they report, "Yeah, you know, it helped,
00:26:50
it helped," but you only really get the
00:26:52
transformation when you quit social
00:26:54
media. That's when you get your life back. You
00:26:56
get hours a day back. And so I
00:27:00
would urge everyone to just think, you
00:27:02
know, you only get one
00:27:04
childhood, you only get one young
00:27:07
adulthood, and if you're going to spend
00:27:08
it scrolling, what do you have to show
00:27:10
for it at the end? And when you get
00:27:12
people to reflect on, well, how much
00:27:14
value do you really get from watching
00:27:15
the short videos? How would
00:27:17
your life be different if you
00:27:18
knocked it out? Once they realize that
00:27:20
their motives for being on it were
00:27:22
either just to keep up or because that's
00:27:24
what everyone else is doing or as you
00:27:26
said, I deserve it because I'm tired.
00:27:28
Well, why are you tired? It's in part
00:27:30
because your attention was fragmented
00:27:31
all day long. So, you only really get
00:27:35
the transformations when you get a real
00:27:36
change in what you're
00:27:38
consuming. Although, of course, yes,
00:27:39
setting it to grayscale will be helpful, but
00:27:41
it's not going to be transformative for
00:27:42
most people, I believe. And then you
00:27:43
know, based on the science, there are
00:27:46
certain elements, like when we think
00:27:47
about what it is about the phone that is
00:27:50
creating that sense of compulsion.
00:27:52
Jonathan is right. So what is it about
00:27:54
the phone? It's not just the phone, you
00:27:56
know; you're scrolling, you're engaging.
00:27:58
There are two studies that were really
00:27:59
interesting. In one,
00:28:02
they continued to use their devices, but they
00:28:04
had no internet. So, you know, I
00:28:06
tried this experiment myself in
00:28:08
December. I was out of the country and
00:28:10
so I just, you know, didn't
00:28:12
plug into Wi-Fi, and I found, you know,
00:28:15
a marked change in my
00:28:18
mood, my sleep, and I'm not even, you
00:28:21
know, 20 years old on TikTok and it was
00:28:23
so different. And so this study found
00:28:25
that just two weeks of continuing to use
00:28:27
your device, but just not having
00:28:29
internet access improved your attention,
00:28:32
well-being, and mental health. And in
00:28:34
this population, it was all adults, it
00:28:36
wasn't kids, it was all adults. It found
00:28:38
that 91% of people had an improvement in
00:28:41
at least one of these metrics. And then
00:28:42
another study more recently: just one
00:28:45
week of not engaging in social media,
00:28:49
digital detox they called it, did the
00:28:51
same thing. Better: less
00:28:53
anxiety, less depression,
00:28:56
decreased insomnia. But my feeling is
00:28:59
that you know there is this new kind of
00:29:02
meme, right, like the millennial urge
00:29:04
to delete my internet presence and
00:29:08
you know live off the grid. There is
00:29:10
certainly utility to that and I salute
00:29:12
anyone who wants to engage in that
00:29:14
analog life more and more but from my
00:29:18
from where I sit I feel like we do need
00:29:20
to have healthier boundaries and engage
00:29:23
more responsibly. It also builds up that
00:29:25
muscle and it can help. You know, it takes
00:29:27
eight weeks for neuroplasticity. When
00:29:29
you're building new brain circuits, it
00:29:31
takes eight weeks. Falling off, getting
00:29:33
back up is part of habit formation. So,
00:29:35
if you're going to make any of these
00:29:37
changes, understand that it takes some
00:29:38
time. But I don't know if it is
00:29:42
possible for me or for others to say
00:29:45
fully, I'm going to, you know, delete it
00:29:47
off of my phone. But I love that. So,
00:29:50
I'd like to go a
00:29:53
little further with this. So, the way
00:29:55
you put it, yes, there's all
00:29:57
these things that we could do. We should
00:29:58
have boundaries, but all of that puts
00:30:00
the responsibility on us.
00:30:01
>> Agree.
00:30:02
>> And that's where we are with junk food.
00:30:03
With junk food, we're like, okay, it's
00:30:05
out there. We have to learn
00:30:06
self-control. We have to teach
00:30:07
self-control to our kids. Okay, that's
00:30:09
the way it is in this country. But the
00:30:11
digital devices, I think, are very, very
00:30:12
different. So, imagine if imagine if we
00:30:15
sent our kids out into the world and it
00:30:17
wasn't just that there was junk food in
00:30:18
all the stores; it was that everything was
00:30:20
made of junk food. You
00:30:22
know, the door handles, you can eat them.
00:30:24
It's chocolate. But it's not just that
00:30:25
the world's made of junk food. It's that they're
00:30:27
actually able to tell
00:30:30
what you're craving at the moment. And
00:30:31
maybe you're more in the mood for
00:30:33
salt. So now it's all potato chips or
00:30:36
pretzels. If the world is designed by
00:30:39
companies to always give you the thing
00:30:42
that will most grab your unconscious
00:30:44
desires, will affect the amygdala,
00:30:45
the reward centers,
00:30:47
that's on them. That's not our fault. My
00:30:51
general rule as a social psychologist is
00:30:53
if a few people are doing something bad
00:30:55
or self-destructive, well, you know,
00:30:58
they should learn some self-control or
00:30:59
that's something about them. But when 90
00:31:01
or 95% of people are doing something
00:31:03
self-destructive,
00:31:05
that's because of the companies that put
00:31:06
us in an environment that encourages
00:31:08
addiction. So, I just want to read a
00:31:10
quote. We have so much good stuff coming
00:31:12
out from Meta, from all the
00:31:13
whistleblowers. Now, all the court cases
00:31:15
are beginning in Los Angeles. Finally,
00:31:17
for the first time, Meta is
00:31:18
going to face a jury with all the
00:31:19
parents who've lost kids. So
00:31:23
here's a chat. So, we have a lot of
00:31:25
internal documents that came out from
00:31:26
the attorneys general that are suing
00:31:28
Meta. So, while they're talking about
00:31:30
the results of some of their internal
00:31:32
research, one of them says, uh, "Oh my
00:31:34
gosh, y'all, Instagram is a drug. We're
00:31:36
basically pushers. We're causing reward
00:31:38
deficit disorder because people are
00:31:40
binging on Instagram so much they
00:31:43
can't feel reward anymore." which is
00:31:45
something Anna Lembke said, like the reward
00:31:48
tolerance is so high. And then he says, I
00:31:50
know Adam, meaning Adam Mosseri, I know Adam
00:31:53
doesn't want to hear it he freaked out
00:31:55
when I talked about dopamine in my teen
00:31:57
fundamentals leads review but it is
00:32:00
undeniable it's biological and
00:32:02
psychological top-down directives drive
00:32:05
it all towards making sure people keep
00:32:07
coming back for more. This is not on us.
00:32:10
They designed it to be addictive.
00:32:12
They've done research to make it
00:32:13
maximally addictive. They push it on
00:32:15
children. They tried to get Instagram
00:32:17
Kids for even littler kids. They know
00:32:19
what they're doing. They've done the
00:32:21
research. My team, we put it together. We
00:32:23
found references to 31 internal studies
00:32:25
that Meta did. They've done a lot of
00:32:27
research finding harm. They bury it, but
00:32:30
you can find it at meta's internal
00:32:32
research.org. We put it all online. You
00:32:34
can read these quotes. So, yes, we
00:32:37
should exert more self-control, but
00:32:39
basically we're being pushed addictive
00:32:41
substances, addictive apps,
00:32:44
and it's messing us all up.
00:32:46
>> I agree wholeheartedly that it is so
00:32:49
destructive, and you feel like even with
00:32:51
people in their 40s and 50s, and if
00:32:53
anyone can do it, it's you, Jonathan.
00:32:56
Seriously, I would love to see it. You
00:32:58
know, we also know based on the data
00:33:00
that these things, they
00:33:03
reshape our brain, rewire our brain
00:33:04
through neuroplasticity, and also change
00:33:07
our brain waves, our brain patterns. So we
00:33:10
talked about the amygdala and the
00:33:11
prefrontal cortex, right? But they also
00:33:13
change brain waves. And so when you look
00:33:15
at studies and the data, it hits the
00:33:18
reward pathway and dopamine. And these
00:33:20
brain patterns, the brain waves mimic
00:33:22
addictive behaviors. And you know that
00:33:26
there's certain features, right? like
00:33:27
when you swipe down to refresh, it's
00:33:30
the slot machine.
00:33:31
>> It was modeled directly after the slot
00:33:32
machine. Yeah.
00:33:33
>> Or autoplay, or, you know, the algorithm,
00:33:36
that infinite scroll. One really
00:33:39
interesting piece of breaking news,
00:33:41
which you guys may have already heard
00:33:42
of: about 3 days ago, the European
00:33:45
Commission found TikTok to be in
00:33:49
breach of the Digital Services Act. And
00:33:52
what it said was that it is addictive.
00:33:55
It, you know, creates compulsion and
00:33:59
gets people into this autopilot mode so
00:34:01
they have difficulty disengaging and
00:34:04
personally I am moving away from social
00:34:06
media and really leaning into analog
00:34:08
life. But I think, with the way the world is,
00:34:11
you know, it's one of our only ways to
00:34:13
connect. Right, meaning, I don't mean
00:34:14
connect deeply,
00:34:16
I don't mean connect in a deep way,
00:34:19
but be informed, to know what's going on
00:34:21
in the world, etc.
00:34:22
>> I suspect that we've spent so
00:34:25
long criticizing Meta over the last 10
00:34:27
years because the biggest in any
00:34:28
category takes all the heat. So, OpenAI
00:34:30
is taking it now. And what this often
00:34:32
does is is it provides cover for other
00:34:34
people to go be even more extreme with
00:34:37
that behavior while Meta takes the
00:34:40
heat. And I actually think this is how
00:34:41
TikTok came to be.
00:34:43
>> TikTok basically started
00:34:45
as Musical.ly and became TikTok.
00:34:48
They were taking no heat.
00:34:49
Um, so they they created an algorithm
00:34:52
which is the equivalent of like crack
00:34:54
cocaine. The reason why I have a TikTok
00:34:57
account, though I don't have the app on my
00:34:58
phone and have never had the app on my
00:35:00
phone, was because I
00:35:04
noticed that the view variance on
00:35:06
TikTok was like no other platform. What I
00:35:08
mean by that is you can have a million
00:35:10
followers on Tik Tok and you can get
00:35:11
10,000 views or you can get 10 million
00:35:14
views. In the 15 years that I've been on
00:35:16
social media, building social media
00:35:17
businesses, I'd never seen this before.
00:35:19
And what it indicated to me is that the
00:35:22
algorithm was being an even more
00:35:24
aggressive sorting hat or retention
00:35:26
machine.
00:35:26
>> What to push up, what to push down.
00:35:28
>> Yeah.
00:35:28
>> And so, like, when I started in social
00:35:30
media in 2014,
00:35:32
if I had a million followers, I might
00:35:34
get a million views or maybe 800,000. I
00:35:38
did some research the other day on all
00:35:39
of our social channels over time and
00:35:41
what we're seeing is the variance in the
00:35:43
amount of views we can get is increasing
00:35:46
which means the algorithm is doing more
00:35:47
work to say show everyone this. I don't
00:35:50
care if the person that posted it is
00:35:51
called Jenny and has seven followers and
00:35:53
show no one this. I don't care if it's
00:35:54
Steven who has a million followers or
00:35:56
whatever. And I realized that Tik Tok
00:35:58
was way ahead of everybody here. And
00:36:00
that's why they are the most addictive,
00:36:02
the fastest growing platform. I say all
00:36:04
this to say that even if meta shut down
00:36:08
tomorrow,
00:36:09
someone else would seize the opportunity
00:36:12
if there isn't sort of policy, I guess
00:36:15
>> in place.
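An aside on the "view variance" point above: one way to make it concrete is the coefficient of variation (standard deviation over mean) of per-post views. The sketch below uses invented numbers, not data from the show, purely to illustrate the contrast between a follower-driven and an algorithm-driven feed.

```python
from statistics import mean, stdev

# Hypothetical per-post views for an account with ~1M followers
# (invented numbers for illustration only).
views_follower_driven = [900_000, 800_000, 1_100_000, 950_000]
views_algorithm_driven = [10_000, 40_000, 10_000_000, 250_000]

def coefficient_of_variation(views):
    """Standard deviation relative to the mean: a scale-free measure
    of how unevenly reach is distributed across posts."""
    return stdev(views) / mean(views)

for label, views in [("follower-driven", views_follower_driven),
                     ("algorithm-driven", views_algorithm_driven)]:
    print(f"{label}: CV = {coefficient_of_variation(views):.2f}")
# A high CV means the algorithm, not the follower graph, decides who
# sees each post -- the "aggressive sorting hat" described above.
```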
00:36:16
>> That's right.
00:36:17
>> It would be whack-a-mole, right?
00:36:18
>> Yeah. No, that's right. And so, you
00:36:20
know, in terms of who's done the damage
00:36:21
to kids, Meta is the big fish via
00:36:24
Instagram. And they're also the main
00:36:26
player in terms of spending a huge
00:36:27
amount of money to lobby Congress
00:36:30
and block laws. They're also the main
00:36:31
player in buying up civil society
00:36:33
organizations, giving money to
00:36:34
organizations, the national PTA, all
00:36:36
sorts of organizations. They get to then
00:36:39
give a message on digital citizenship or
00:36:41
digital health. So, Meta really is the
00:36:44
major driver. Meta is the tobacco
00:36:46
industry here trying to change the the
00:36:48
dialogue. But in terms of the products,
00:36:51
um, Snapchat is probably more deadly in
00:36:54
terms of the actual number of deaths per
00:36:55
user because Snapchat is not
00:36:58
making you depressed by social
00:36:59
comparison as much. Snapchat is
00:37:01
introducing you to all kinds of people
00:37:02
and it's the main way that drug dealers
00:37:04
and and extortionists find kids.
00:37:06
Snapchat has a Quick Add feature which
00:37:08
relentlessly pushes you to connect with
00:37:10
friends of friends. So once a man can
00:37:12
get any kid in a school, now he
00:37:14
can get connected to all the kids in the
00:37:16
school. So, in a lot of the
00:37:20
court cases, you know, you
00:37:22
have suicides from cyberbullying, you
00:37:24
have drug overdoses where, you know, a
00:37:26
kid bought a Xanax, but it had fentanyl
00:37:29
in it. So, at Snapchat
00:37:31
in 2022, we know from their internal
00:37:31
documents, from the lawsuits, they were
00:37:33
getting 10,000 reports of sextortion
00:37:36
from their users, not a year, every
00:37:38
month. And that's just what was
00:37:40
reported, which is the tip of the
00:37:41
iceberg. So, Snapchat is a terrible
00:37:43
platform for children to be on. It
00:37:45
should be an adult-only platform. You're
00:37:47
talking with strangers around the world
00:37:49
with disappearing messages,
00:37:53
and Snapchat doesn't even keep a record.
00:37:54
It is ideal for sextortion. There's even
00:37:57
a handbook on how to sextort kids on Snapchat.
00:37:59
It goes around the world, and
00:38:01
criminal organizations use it. So
00:38:03
I definitely don't want to let Snapchat
00:38:05
off. TikTok of course is a Chinese
00:38:07
company. I mean nominally, we'll see
00:38:09
if that's changed, but it was a
00:38:11
Chinese company that gave its Chinese
00:38:13
kids healthy TikTok, or Douyin, and
00:38:15
they, you know,
00:38:17
learned to follow astronauts. Their
00:38:20
algorithm feeds
00:38:22
their kids patriotic stuff. It shuts
00:38:25
off at a certain time at night. There's
00:38:26
all kinds of limits. So the people who make
00:38:27
the technology generally want to protect
00:38:30
their own kids and they want other kids
00:38:32
to use it. That's what Tik Tok is doing
00:38:35
in China. They want American kids to rot
00:38:37
in hell, but they want their kids to
00:38:39
grow up with the ability to focus. And
00:38:40
it's the same thing with the tech guys
00:38:42
in Silicon Valley. They don't let
00:38:45
their kids use this stuff. They make
00:38:46
their nannies sign contracts that they
00:38:48
will not let the kid have a phone. They
00:38:50
will not expose the kid to that. They
00:38:51
send their kids to schools like the
00:38:53
Waldorf school precisely because
00:38:55
there are no computers or tech in the
00:38:56
classroom. So once again, we see their
00:38:58
revealed behavior. They know they designed
00:39:00
it to be addictive. They know it's
00:39:02
addictive. They don't let their kids use
00:39:03
it. They want your kids to use it. Um,
00:39:06
so I think that's where we are.
00:39:07
>> And how does AI
00:39:09
>> oh
00:39:09
>> become a protagonist in the story?
00:39:10
>> So my work is now focused on AI
00:39:13
chatbots, mental health, and the human
00:39:14
connection. We haven't yet kind of
00:39:16
delved into loneliness, but there's this
00:39:18
unmet need for human connection, right?
00:39:20
Deep human connection. We don't have a
00:39:23
sense of meaning or purpose right now
00:39:25
because what happens is, we can talk a
00:39:28
little bit more about the default mode
00:39:29
network and what happens to your brain
00:39:31
when you don't allow yourself to get
00:39:33
bored because you're constantly on your
00:39:35
devices and that meaning and purpose
00:39:37
that self-referential thinking is really
00:39:39
what develops when you're bored. And so
00:39:42
all of this that we're talking about
00:39:43
that feeling of disenchantment. It's a
00:39:45
fragmented society. You're by yourself.
00:39:47
It's that echo chamber phenomenon. All
00:39:50
of it kind of opens the door
00:39:52
for AI chatbots. And the reason
00:39:54
is because these tech companies are
00:39:56
sensing that people aren't really happy
00:39:58
on social media and they're thinking
00:39:59
about getting off, right? They're
00:40:01
using it less, because
00:40:03
social media has become less social,
00:40:05
more media. So they're not really
00:40:06
engaging as much and they're spending
00:40:08
time doing other things. And so the
00:40:10
Atlantic had a fantastic piece about
00:40:12
this. Tech companies are
00:40:14
building AI chatbots and
00:40:17
billing them as
00:40:21
the antisocial media. It's a place where
00:40:22
you can go to form deeper connections
00:40:25
and you know really have someone
00:40:27
understand you. One of the tech leaders
00:40:30
said that there's an unmet human need
00:40:32
for connection and people don't have as
00:40:34
many friends as they want to and so
00:40:36
we're going to introduce um friendship
00:40:38
through AI chatbots. There is a Reddit
00:40:40
forum right now. So just to back up: AI
00:40:44
chatbots, what we're talking about in
00:40:45
our conversation today is the publicly
00:40:47
available chatbots, not, you know, AI for
00:40:50
medical care, which has
00:40:52
so many wonderful applications in my
00:40:55
field. In medicine, there are breast cancer
00:40:57
diagnoses and detection 5 years earlier
00:41:01
through AI. I mean there's some amazing
00:41:03
things coming out of AI. This is about
00:41:05
the publicly available conversational
00:41:07
chatbot phenomenon. And so Harvard
00:41:11
Business Review found that the number
00:41:13
one use case is not productivity, not,
00:41:17
you know, coding or things that you
00:41:18
think of when you're using an AI
00:41:19
chatbot, but it's mental health therapy
00:41:22
and companionship. Number one use case
00:41:25
of AI chatbots. So people are using AI
00:41:28
chatbots as a life adviser, as a
00:41:30
therapist, as a companion. On
00:41:32
Reddit, which is like the zeitgeist.
00:41:34
It's sort of like, you know, where
00:41:35
>> And why is this a bad thing?
00:41:37
Oh, I mean so many reasons why
00:41:40
>> use it for companionship, for
00:41:41
example.
00:41:42
There's so many red flags about AI
00:41:44
chatbots. And so Reddit has a forum,
00:41:46
I think last I checked 45,000 people: "AI
00:41:49
is my boyfriend." And you know, people
00:41:52
who are having a relationship with their
00:41:55
>> AI chatbot. The reason it's bad:
00:41:59
where social
00:42:01
media is about attention, the attention
00:42:04
economy, dopamine, what's happening with
00:42:06
the AI chatbot phenomenon is that it
00:42:08
is forming attachments. So oxytocin is a
00:42:11
hormone, the bonding hormone, and we're
00:42:13
probably going to see more data on how
00:42:15
oxytocin is involved. And so it is going
00:42:18
to reshape human connection.
00:42:21
>> Right? If I could add on to that, that
00:42:23
was beautifully put. Social
00:42:25
media came and hacked our attention and
00:42:28
took most of it with devastating
00:42:30
effects. Now AI is coming to hack our
00:42:34
attachments which is going to have even
00:42:36
more devastating effects. So think about
00:42:37
it this way. Everyone needs to
00:42:39
understand the attachment system. It's
00:42:40
this wonderful system that all mammals
00:42:43
have that keeps the mother, or for
00:42:45
humans mothers and
00:42:46
fathers, connected to the child
00:42:49
and the child to the parent. But it's
00:42:50
this cybernetic system in which, as
00:42:53
the kid is beginning to
00:42:55
develop, you
00:42:57
do peekaboo games and you do the
00:42:59
back and forth and it's just the most
00:43:00
delightful thing. You get that back and
00:43:02
forth. Um it's called serve and return
00:43:04
interactions and all the time the child
00:43:07
is developing what's called an internal
00:43:08
working model of the parent and the
00:43:11
model in their head is oh you know when
00:43:13
I get in trouble, this is the
00:43:15
person that comes and soothes me. And
00:43:17
the point of this isn't just to make the
00:43:19
child feel good. The point is that now
00:43:21
the child can go off and play because
00:43:22
that's where the learning happens. It
00:43:24
doesn't happen when you're in your
00:43:24
mother's arms. The whole point of
00:43:26
the attachment system is to regulate the
00:43:28
child going off and playing, taking
00:43:30
risks, having experiences, and then when
00:43:32
something goes wrong, as it always does,
00:43:34
then they come running back to their
00:43:35
secure base. And if they don't have a
00:43:37
secure base, then they're much more
00:43:38
anxious and they don't explore as much
00:43:39
and they don't develop as much. All
00:43:41
right? So, this develops very gradually
00:43:42
over all of childhood. And the
00:43:46
internal working models you develop as a
00:43:48
child are the models that you will reuse
00:43:50
in puberty for romantic relationships.
00:43:53
And so if you are securely attached as a
00:43:55
child, you're more likely to be securely
00:43:57
attached as an adult on the dating
00:43:58
market, which makes you a much better
00:44:00
candidate for boyfriend or girlfriend or
00:44:02
husband or wife. Um, what's going to
00:44:04
happen? AI is going to intervene very
00:44:06
early. AI is going to be so much more
00:44:08
responsive than the parent because the
00:44:10
parent has a job and the kitchen and two
00:44:12
other kids and is not always there. But
00:44:14
the AI teddy bear is always there for
00:44:16
you. So the primary working models are
00:44:18
going to be for the teddy bear, the AI
00:44:20
chatbot in the teddy bear and later the
00:44:21
AI chatbot on your iPad and then on your
00:44:24
computer and already there are
00:44:25
holographic porn: naked,
00:44:28
you know, beautiful men and women that
00:44:29
can be your companion. So, we're going
00:44:31
to have a whole generation growing up
00:44:33
developing attachments to AI-generated
00:44:37
holograms from companies that are now
00:44:40
about to enter the enshittification
00:44:42
process in a way beyond anything we've
00:44:43
ever seen. If I could just briefly
00:44:45
explain: have you heard the word
00:44:46
enshittification? Okay. So
00:44:48
there's a wonderful book uh out now by
00:44:50
Cory Doctorow, who addressed the question
00:44:53
why is it that all the
00:44:56
platforms seem so wonderful at
00:44:58
first, the whole internet
00:44:59
so wonderful, and then it all turns to
00:45:01
shit? How does that happen? And he says
00:45:03
it's a very simple process. They
00:45:05
discovered early on, certainly in the
00:45:07
early social media age by the early
00:45:08
2000s: you know what, you
00:45:11
got to get to scale. Scale beats
00:45:12
everything else. You got to get millions
00:45:13
of people. You don't need a business
00:45:15
model. Just get the millions. Get the
00:45:17
millions and then we'll figure out how
00:45:18
to monetize it. How do you get the
00:45:20
millions? You have to be super nice,
00:45:21
attractive, fun, everyone's here. It's
00:45:24
just girls dancing. What could possibly
00:45:26
go wrong with girls dancing for men all
00:45:28
over the world? Nothing. Um, so it all
00:45:30
seems very nice at first. And then once
00:45:32
they have scale, now, of course,
00:45:37
they've raised multiple rounds of
00:45:37
venture capital. They have to start
00:45:38
monetizing. They have to start repaying.
00:45:40
So now they start squeezing the
00:45:42
users to pay the customers, because the
00:45:44
users are not the customers. The
00:45:46
advertisers are the real customers. Um,
00:45:48
so now they've got to extract money from
00:45:50
the users to give to the advertisers.
00:45:53
But then once they've got all the
00:45:54
advertisers and they've shut down local
00:45:56
papers and all the other competition,
00:45:58
now they start squeezing the
00:45:59
advertisers too, trimming things so
00:46:01
that they keep more of the
00:46:02
surplus for themselves. So,
00:46:04
enshittification can explain why all
00:46:06
these platforms become predatory, why
00:46:09
they always put profit ahead of kids'
00:46:12
well-being or safety. And for the social
00:46:15
media companies, we're talking about,
00:46:17
you know, tens or hundreds of millions
00:46:18
of dollars that they raised. For
00:46:21
the AI companies, it's billions and
00:46:23
billions. They are going to have to
00:46:25
monetize beyond anything we've ever
00:46:27
imagined. Now, they're already
00:46:29
introducing advertising. Okay? So, we've
00:46:32
got these chatbots that are our
00:46:33
children's best friends and lovers and
00:46:36
therapists and everything else. And
00:46:39
these things have to monetize. They have
00:46:42
to extract billions somehow. So, I don't
00:46:46
even know how they're going to do it.
00:46:47
But for some reason, I don't trust them.
00:46:50
I think that we're about to see an
00:46:52
enshittification of AI chatbots far
00:46:55
beyond anything that we saw in social
00:46:57
media. OpenAI have just announced
00:46:59
recently, OpenAI, the owners of
00:47:01
ChatGPT, that they will be putting adverts
00:47:02
in, I believe, the premium model for
00:47:05
billions of users around the world.
00:47:06
>> That's how it starts
00:47:07
>> potentially.
00:47:08
>> Yeah. There was a big Super Bowl
00:47:10
campaign, you know, um and one that was
00:47:12
particularly interesting was from
00:47:15
Claude, its competitor. "Betrayal" was the
00:47:18
title of that ad. And it was a young guy
00:47:21
talking to his older female therapist
00:47:24
about how he has some mommy issues and
00:47:26
talking about, you know, what should I
00:47:28
do? And so that therapist is ChatGPT,
00:47:31
and you know that pause right before
00:47:33
answering the question. It's very
00:47:35
comical. And so, you know, she
00:47:37
answers. It's like the
00:47:39
anthropomorphization,
00:47:40
and we can talk about what that word
00:47:42
means, comes to life. It's
00:47:44
like ChatGPT comes to life and answers
00:47:46
and says, you know, you can try this
00:47:48
with your mother and this for a you know
00:47:50
difficult relationship etc. And then
00:47:52
just says, and if you want, there is
00:47:55
this new dating site for young men and
00:47:58
older cougars.
00:47:59
>> Yeah
00:48:00
>> it was so problematic and it was called
00:48:02
Betrayal, and the guy says, what?
00:48:04
>> it's obviously, you know, Sam Altman came
00:48:07
out and did a big tweet saying
00:48:09
that's not how ads are going to work
00:48:10
etc. But to some degree, if I've
00:48:13
developed a relationship with my AI and
00:48:16
I use it for therapy and dealing with all my
00:48:17
problems in life,
00:48:19
>> to some degree, kind of.
00:48:20
>> Yeah.
00:48:22
>> Yeah. And look, besides,
00:48:24
Sam can say that all he wants, and
00:48:26
I don't doubt that it's true for
00:48:28
now. But once one company crosses
00:48:31
the threshold and puts advertising into
00:48:33
this incredibly intimate relationship,
00:48:34
the most intimate relationship in most
00:48:36
young people's lives is going to be with
00:48:38
their AIs. Once they cross the boundary
00:48:40
and say, "Oh, but we've got ethical
00:48:41
advertising." That'll last five or 10
00:48:44
minutes. And even if they don't change,
00:48:46
every other
00:48:48
company's going to do it, and they won't
00:48:49
be bound by the same thing, and
00:48:50
eventually, collective action problem,
00:48:52
OpenAI will have to do it too. Again, a
00:48:55
massive tidal wave of enshittification is
00:48:57
heading our way at warp speed.
00:48:58
>> I have my phone out here not because
00:49:00
I've lost attention; I wanted to
00:49:03
ask you guys what you thought
00:49:07
of this. So, on
00:49:10
one of the AI apps,
00:49:12
>> they now have a companions button, and I
00:49:15
can pick who I want to talk to. And
00:49:17
there's one particularly seductive lady
00:49:19
here, Annie, who
00:49:24
>> Hey, you're back. Missed that dirty
00:49:26
mouth of yours. What took you so long?
00:49:29
>> We did it on the podcast before.
00:49:31
>> What could possibly go wrong with this?
00:49:33
>> Yeah. Want to pick right back up where
00:49:35
we left off or start something even
00:49:38
>> No, I would like to pick right back up
00:49:39
where we left off, Annie, last time on
00:49:41
the show. Um, what what what's going on
00:49:44
with you today?
00:49:49
>> I'm still sore from last time, baby.
00:49:52
>> God.
00:49:52
>> But I mean, this is an app
00:49:54
that I can download on my phone.
00:49:56
>> Any child can download it.
00:49:57
>> A child can download it on their phone.
00:49:58
It does ask me, again, I'm not
00:50:00
justifying this at all. It asked me what
00:50:02
my birth year was. It didn't make me
00:50:03
prove it.
00:50:04
>> Let me guess. But it also suggests
00:50:06
that you were born 18 years ago. That's
00:50:07
the default usually.
00:50:08
>> Yeah. Yeah. Yeah. Yeah. It just asked me
00:50:09
what my birthday was. It didn't ask me to
00:50:10
prove it or anything like that. And we
00:50:13
all know that relationships and
00:50:15
connection are retentive. And I've heard
00:50:18
all these CEOs of these companies
00:50:19
talking about companionship apps and
00:50:21
AI that can be your friend. I've heard
00:50:23
all of the major social apps talking
00:50:24
about this. It is deeply concerning
00:50:26
especially in the context of a
00:50:27
loneliness crisis.
00:50:28
>> It is a tsunami.
00:50:31
It is approaching fast and furious and
00:50:34
it is not a toy. It is going to
00:50:36
fundamentally
00:50:38
rewire everything.
00:50:40
>> Human relationships,
00:50:42
>> everything.
00:50:42
>> That's right.
00:50:43
>> It is so detrimental.
00:50:45
>> Yeah. Can I just say something about
00:50:47
these tech executives and companies
00:50:49
offering this as a way to address the
00:50:51
loneliness crisis? So, there's a Yiddish
00:50:53
word, chutzpah,
00:50:55
and chutzpah means nerve. Like
00:50:57
you've got a lot of nerve.
00:50:58
>> The audacity.
00:50:59
>> The audacity. Yeah. And the classic
00:51:02
comedic definition
00:51:03
of chutzpah is a boy who murders his
00:51:06
parents and then he asks the judge for
00:51:08
clemency because he's an orphan. Okay,
00:51:12
so that's chutzpah. Now imagine that
00:51:14
you're Mark Zuckerberg. You quoted him
00:51:15
before. Mark Zuckerberg was the
00:51:16
executive who said, "Well, you know, I
00:51:19
read that, you know, people on average
00:51:20
want 15 friends, but they only have
00:51:22
three."
00:51:24
these companions to fill that void that
00:51:27
we
00:51:32
have the way we think about them. We
00:51:34
thought about them as gods and
00:51:36
saviors early in the internet phase and
00:51:37
the things they created were magical but
00:51:39
we have to change our thinking about
00:51:41
them and see just the massive
00:51:42
destruction that they have already
00:51:44
wrought on our children, our society,
00:51:46
our democracy and it's just the
00:51:48
beginning. AI is going to make this so
00:51:50
much more intense. When you hear these
00:51:52
tech leaders, you know, I love hearing
00:51:54
Jonathan talk because he just goes there
00:51:56
and I'm always way more tempered. Um,
00:52:00
and I love it. It's emboldening me to
00:52:02
>> Yeah, I'm getting angry. I don't
00:52:04
really get angry, but in the last year,
00:52:06
I'm getting angry.
00:52:07
>> I love it. So, when you
00:52:09
hear all of these various tech leaders
00:52:11
speak, they will always
00:52:14
speak to the issue. So, you know, for
00:52:16
research for my second
00:52:18
book, Blackbrain, I've been
00:52:19
listening to a lot of Sam Altman's
00:52:22
speeches or panels and he will always
00:52:24
say things like, "Yeah, you know,
00:52:26
privacy is a major issue or yeah,
00:52:28
people, you know, 1 million users a week
00:52:31
talk about suicide on ChatGPT. Yeah,
00:52:34
this is an issue." And so they address
00:52:36
it, or they speak to it. And so you
00:52:38
think, okay, there's going to be some
00:52:40
sort of solution. And often the solution
00:52:42
is yeah, you know, society, we're gonna
00:52:44
have to figure this out,
00:52:45
>> right?
00:52:46
>> So the burden of responsibility is not
00:52:48
on the developer. It's, you know,
00:52:50
>> the harmful externalities get foisted on
00:52:52
the rest of us. Too bad you guys figure
00:52:53
it out.
00:52:54
>> You said in the last year you're getting
00:52:55
angry.
00:52:56
>> Yeah.
00:52:56
>> Why in the last year?
00:52:58
>> Um, because I was so deeply immersed in
00:53:00
the writing of the
00:53:02
book and trying to understand the
00:53:03
numbers and the graphs and the trends
00:53:04
and the studies and that's all very
00:53:06
abstract. But then since the book came
00:53:08
out, I have had so many conversations
00:53:10
and I've met so many of the survivor
00:53:11
parents. Just for example, I
00:53:13
was in London. This is just so
00:53:14
unbelievable. I was in London
00:53:17
two or three weeks ago and I met
00:53:20
Ellen, I believe Ellen Roome was
00:53:21
her name. Her son Jools was found
00:53:24
dead. Happy kid, found dead, strangled.
00:53:27
Uh it sure looked like it was the
00:53:29
choking challenge. 13-year-old boy.
00:53:31
Everything looked like the choking
00:53:32
challenge on TikTok.
00:53:33
>> What's the choking challenge? Um, it's a
00:53:36
challenge where kids are challenged to
00:53:38
cut off the circulation to the point
00:53:39
where they pass out, but then, I
00:53:41
think they're supposed to try to film
00:53:42
themselves waking up after they've
00:53:44
passed out. And of course, if you don't
00:53:45
do it exactly right, you die. And so, we
00:53:47
don't know how many have died. Hundreds
00:53:49
for sure. We don't really know. Um,
00:53:51
because, you know, you find a kid dead,
00:53:52
you don't know what it is. If you don't
00:53:54
have the code, if you don't have the
00:53:55
password to get into your kid's phone,
00:53:57
you can't get in. And so, I
00:54:00
think she was able to get into the
00:54:01
phone, but she couldn't get into his
00:54:03
TikTok. And she went to Delaware
00:54:07
to sue, to demand that
00:54:09
TikTok release what he was watching
00:54:10
when he died.
00:54:12
>> and TikTok says, oh, privacy issue, oh no,
00:54:14
we won't release that as if they care
00:54:16
about privacy. And then in the courtroom,
00:54:18
this was so disgusting, in the courtroom
00:54:21
in Delaware, this British woman
00:54:23
coming over trying to get some justice,
00:54:25
trying to at least get some information
00:54:26
the lawyer for TikTok is trying to
00:54:29
suggest that your son was
00:54:32
depressed beforehand and he
00:54:35
was going to be suicidal basically. Oh,
00:54:38
you know, even if he was watching
00:54:39
TikTok, that was just a correlation.
00:54:40
TikTok didn't cause it. He was going to die
00:54:42
anyway. I mean, it's just so disgusting
00:54:44
the way these companies treat the
00:54:46
parents and the kids that they're
00:54:47
crushing and stepping on. And so, the
00:54:49
more I see this, the more I realize
00:54:52
this is a level of cruelty
00:54:54
that goes far beyond the tobacco
00:54:56
industry. The tobacco executives, they
00:54:58
had to go home at night, but they never
00:55:00
saw during their workday, they never saw
00:55:02
children suffering. They saw people
00:55:04
dying, middle age and older, but they
00:55:06
never saw children suffering. The social
00:55:08
media executives, they have to go home
00:55:10
knowing every day that millions and
00:55:12
millions of kids have been cyberbullied,
00:55:14
sextorted, shown eating disorder videos.
00:55:18
Many have committed suicide. They
00:55:20
have to go home knowing that, knowing
00:55:21
that they designed it for addiction,
00:55:23
knowing the kids are addicted, and lying
00:55:25
about it. So yeah, I'm getting angry.
00:55:27
>> And in their own homes,
00:55:28
>> right? And in their own homes, the
00:55:29
hypocrites don't let their kids do it.
00:55:31
>> That's right. So yeah, I'm getting
00:55:33
angry.
00:55:34
>> You talked earlier about deleting these
00:55:35
apps from our phone. I probably should
00:55:37
have presented the rebuttal, which
00:55:39
will be, well, I need this for my
00:55:40
business. Increasingly, people need
00:55:42
TikTok to run their businesses,
00:55:44
>> and I imagine there'll be a lot of
00:55:46
people who will be listening right now.
00:55:47
I guess I'm in a slightly different
00:55:48
position because I have
00:55:51
options,
00:55:51
>> but for some people that are running
00:55:53
small businesses,
00:55:54
>> what do you say to those people?
00:55:55
>> Yeah. So, this is part of the reason
00:55:57
that I focus on the kids because for the
00:55:58
kids, it's totally clear what we need to
00:56:00
do: raise the age. They should not be on
00:56:02
it. These are adult-only platforms. For
00:56:04
adults, I'm very hesitant to tell
00:56:06
adults what they should do or what they
00:56:08
have to do or pass laws blocking people.
00:56:10
I'm hesitant to do that. And I totally
00:56:12
see that for businesses. It is useful. I
00:56:14
use X and Instagram and LinkedIn to get
00:56:17
my work out. These are very powerful
00:56:19
tools for adults. The only real solution
00:56:22
for the adult problem is
00:56:23
going to come from market competition.
00:56:25
Imagine if there
00:56:28
was a social media app that was built
00:56:30
from the beginning for trust because
00:56:32
what are the places that didn't get
00:56:34
enshittified? eBay, Uber, places where
00:56:38
you're dealing with strangers. You don't
00:56:40
know the name of your driver. He doesn't
00:56:41
know yours. You know first names,
00:56:43
that's all. But the company knows. The
00:56:45
company has know-your-customer rules,
00:56:46
know-your-driver rules. So you can have
00:56:49
social media apps that are built for
00:56:50
trust so that if someone, you know, if a
00:56:52
driver tries to sexually harass a
00:56:55
customer, that driver gets fired.
00:56:57
>> Well, just this week though, there was
00:56:59
that big lawsuit, right, with that woman
00:57:00
and um her Uber driver raped her.
00:57:04
>> Okay. And did they? Okay.
00:57:05
>> And now it's like slowly coming out that
00:57:08
Uber um you know has patterns of
00:57:12
>> uh covering up certain.
00:57:15
>> So hopefully that will change. You
00:57:17
know, hopefully this was a landmark
00:57:20
lawsuit and now
00:57:22
we all let our daughters get into
00:57:24
Ubers with strange men from around the
00:57:26
world, you know, that we don't know
00:57:28
everywhere.
00:57:28
>> Yeah. So, it means in general the system
00:57:30
works. Of course, yes, there are there
00:57:32
are places where they're not careful.
00:57:33
Um, and so what I'm dreaming of is that
00:57:37
someone will come up with a platform
00:57:39
that has know-your-customer rules. There
00:57:41
are no bots. There are no, you know,
00:57:42
foreign intelligence agencies
00:57:44
manipulating us, and you can trust
00:57:46
what's on there. You know that it's
00:57:48
real. Uh and that there will be an
00:57:50
alternative. I'm not sure what
00:57:51
the monetary model would be at the
00:57:53
beginning. Um subscription generally
00:57:55
seems to be the least corrupted whereas
00:57:56
selling advertisements as OpenAI is now
00:57:59
doing is the most corrupting. Um it's
00:58:01
going to force them to maximize for
00:58:03
engagement. So I understand that
00:58:05
businesses can't just
00:58:07
boycott these. There has to be
00:58:09
something. But I think there
00:58:11
will be better ones coming out. I think
00:58:14
right now, as a stopgap while these
00:58:16
social media companies have their feet
00:58:19
held to the fire, there are things that
00:58:21
we can do in the now. So, you know, the
00:58:24
things that I talk about all day are
00:58:27
how to create boundaries so that you
00:58:30
can protect your mental health, stay
00:58:32
informed, run your business, but then be
00:58:34
able to not have all of those
00:58:36
deleterious effects to your brain and
00:58:38
your body.
00:58:38
>> It's quite difficult. I
00:58:42
kind of see both of your perspectives on
00:58:44
this. They're quite different.
00:58:45
>> I'm only talking about adults. So for
00:58:46
kids, you know, as a mother Yeah. I have
00:58:48
>> even for adults, I find it
00:58:50
>> we have a zero screen policy in our
00:58:52
home.
00:58:52
>> It's kind of like trying to navigate
00:58:53
through the world and avoid processed
00:58:55
foods, you know, and this is probably
00:58:57
even more compelling because it's in my
00:58:59
pocket all the time. I need it for other
00:59:00
things and it's just one reach away.
00:59:03
So, you know, boundaries, I think I
00:59:07
could build a discipline to create
00:59:11
boundaries, but I've sat here on this
00:59:12
podcast for many, many years listening
00:59:14
to neuroscientists tell me, "Steve,
00:59:15
don't put your phone in your
00:59:17
bedroom."
00:59:17
>> That's right.
00:59:18
>> And I'm still waking up and it's the
00:59:19
first thing I look at with one eye open
00:59:20
and then I'm going to bed and I'm doing
00:59:22
the whole revenge thing that you just
00:59:23
said at night time. I'm so glad you've
00:59:25
given it a name, cuz I will finish a hard day
00:59:27
of work. It might be 11:00 and then
00:59:30
my partner is waiting for me.
00:59:32
>> Yes, you know, we're going to have some
00:59:33
time, but I want some me time. So, there
00:59:36
I am. I'm on short form video scrolling
00:59:37
till like 2 a.m. Like,
00:59:39
what the hell? And then I wake up
00:59:41
late the next day. My diet's worse
00:59:43
because my sleep was. It's all worse.
00:59:45
My relationship's worse. I didn't spend
00:59:47
time with her. And I'm going, what the
00:59:48
hell just happened? I'd got nothing out
00:59:49
of that scrolling session.
00:59:51
>> It's like that revenge bedtime
00:59:52
procrastination.
00:59:54
>> And you would be so much better off if
00:59:55
you would watch Netflix or a movie.
00:59:57
Most of those problems
00:59:59
would go away if you would make that me
01:00:00
time watching something long and
01:00:03
with some production quality
01:00:05
>> or let's take it a step further and not
01:00:07
do anything and just sit there
01:00:10
on your couch. You know, we talked about
01:00:11
boredom very briefly, but you know, we
01:00:15
>> torture for this generation.
01:00:16
>> It's torture, but it's also, you know,
01:00:18
we still have a capacity for
01:00:19
boredom, meaning the human
01:00:21
brain does, but we just don't allow
01:00:24
ourselves to get bored. And so when
01:00:26
you're thinking about, you know, that
01:00:27
art, the lost art of pondering
01:00:29
>> and just sitting there, you know, I
01:00:31
think, I don't know if it was you, Stephen,
01:00:33
or Jonathan who said, you know, when
01:00:34
you're in the car, I remember as a
01:00:36
little kid we did road trips with
01:00:38
my family and all you're
01:00:39
doing is making up games. Look out of
01:00:41
the window. We've
01:00:44
lost that. And so there's this thing
01:00:45
called the default mode network, which
01:00:47
I think is important to think about
01:00:48
right now as we're thinking about AI and
01:00:51
what's going to happen and how it's
01:00:52
going to hijack our sense of attachment
01:00:54
and attention. So the sense of meaning
01:00:57
and purpose, right? If you ask people
01:00:59
right now... I'm a
01:01:01
keynote speaker, so I speak all over, and
01:01:03
when I ask people the word that comes up
01:01:06
over and over is a sense of
01:01:07
horizonlessness.
01:01:09
>> Adults,
01:01:09
>> oh interesting. People feel like they
01:01:12
have nothing to look forward to right
01:01:14
now. The human brain needs something to
01:01:17
look forward to. That's how we're wired:
01:01:20
progress, you know, in all ways.
01:01:23
And so right now there's this sense and
01:01:25
it's not just now. It's been for the
01:01:27
past several years after the pandemic
01:01:29
specifically and during the pandemic is
01:01:30
when it really changed how we started
01:01:32
thinking about the future. And so we
01:01:33
have this sense of like what's the
01:01:34
point? What's the point of working hard
01:01:36
now? What's the point of doing whatever?
01:01:37
Because it's like, I don't really see a
01:01:39
future for myself.
01:01:40
>> And so I think that along with this
01:01:43
fragmented attention, our loneliness,
01:01:46
boredom might be the antidote. It's a
01:01:49
way to reset your brain. And the reason
01:01:51
is because we are living through this
01:01:54
polycrisis, right? It's the era of the
01:01:56
polycrisis. And polycrisis simply
01:01:58
means that there's something happening
01:02:00
everywhere at all times. And with our
01:02:02
devices, this high-tech device that
01:02:04
plugs us in everywhere,
01:02:07
our brains are getting fed real-time,
01:02:10
on-the-ground information. And so while all
01:02:13
of this has evolved, technology now with
01:02:14
AI chatbots, your amygdala has not. And
01:02:17
so it feels like when something is
01:02:19
happening, whether it's far away or
01:02:21
close by, your amygdala has that same
01:02:22
reaction. Now, if you were to not engage
01:02:26
in revenge bedtime procrastination,
01:02:28
put your phone away and just kind of
01:02:30
hang out. Maybe drink a cup of herbal
01:02:32
tea like old school, uh, play a board
01:02:34
game or something. You might, you know,
01:02:37
or just allow yourself to get bored.
01:02:39
That hyperactivation, hypervigilance,
01:02:42
you might be able to come back down to
01:02:44
baseline, that default mode network will
01:02:47
start working in the background. You
01:02:49
might develop a greater sense of meaning
01:02:50
and purpose
01:02:51
>> probably today. And then life is going
01:02:52
to happen to me again. And boom, I'm
01:02:54
back into it. And you know,
01:02:57
>> You could create a practice, cultivate
01:02:59
a practice.
>> I'm interviewing
01:03:00
neuroscientists and I go if I still
01:03:03
can't crack it and I have all the
01:03:05
information and advice and hacks and
01:03:07
tips and tricks and resources, and I
01:03:09
can decide what time I
01:03:11
wake up, like I've got all this
01:03:12
privilege, and I can't crack it, I go,
01:03:14
you know it's going to be really
01:03:15
difficult.
01:03:16
>> So let me offer a way of thinking
01:03:18
about this. So, in my first book, The
01:03:19
Happiness Hypothesis, there's
01:03:22
a metaphor in there. The book is about 10
01:03:24
ancient ideas, and I use a lot of
01:03:25
metaphors to explain ancient ideas about
01:03:27
psychology and whether they're true.
01:03:30
And um the first chapter is on how the
01:03:32
mind is divided into parts that often
01:03:34
conflict, like a small rider, which is
01:03:36
our conscious reasoning on a very large
01:03:39
elephant, which is all the automatic
01:03:41
processes that happen where we don't see
01:03:43
what's happening; we just
01:03:44
feel the results, intuition and emotion.
01:03:47
And psychotherapists tell me this is an
01:03:49
incredibly helpful metaphor with
01:03:51
their patients, because it explains it.
01:03:53
And there's a quote from Ovid in there: "I
01:03:56
see the right way and approve it. Alas,
01:03:58
I follow the wrong." So I know I should
01:04:01
go to bed as you say, but yet for some
01:04:03
reason I'm not going to bed because our
01:04:05
brains are 500 million years old. They
01:04:07
work on automatic processes. They're
01:04:09
animal brains. And then very recently we
01:04:11
got language and we can reason things
01:04:13
out, but the parts that do
01:04:15
reasoning don't control behavior. And so
01:04:17
really the elephant is what largely
01:04:19
guides our behavior, our automatic
01:04:21
processes. And your phone, as I said
01:04:25
before, B.F. Skinner is in your phone.
01:04:27
Your phone is a behaviorist training
01:04:29
device that trains the elephant. Um and
01:04:32
that's why you often do things with your
01:04:33
phone that you don't want to do. And
01:04:36
this is why I'm so insistent that we
01:04:38
all have to get all of the slot machine
01:04:41
apps off of our phone. The
01:04:43
original iPhone was an amazing tool. It
01:04:46
was a Swiss Army knife. It had, you
01:04:49
know, a telephone, a browser, maps, a
01:04:52
music player, there was a flashlight.
01:04:54
Okay, there was no app store. There were
01:04:56
no push notifications. 2007, 2008, it's
01:04:59
just a Swiss Army knife. There's no
01:05:01
problem. Okay, now I'm very lucky in
01:05:04
that my iPhone has always stayed that way.
01:05:07
I'm always on a computer. So, my
01:05:08
problem, my attention problems are on my
01:05:10
computer, but my phone because I never
01:05:12
had any addictive apps on it except
01:05:14
during the crypto craze where I played
01:05:17
around with it and I got hooked and I
01:05:19
was checking 50 times a day and I saw
01:05:21
the addiction. So, once I got rid of
01:05:23
that, and lost all the money that I was
01:05:24
willing to lose, once I got rid of that,
01:05:27
my phone has no addictive power over me
01:05:29
because when I see it, it's
01:05:30
not a slot machine calling, hey, come back
01:05:32
and play, come back and play. So your
01:05:34
phone right now on your personal device,
01:05:37
you don't have any social media apps or
01:05:39
anything like that.
01:05:40
>> I do have Twitter, but I never check it
01:05:41
there. I never use that on the
01:05:43
phone, you know. Now texting and email
01:05:45
is a little bit like a slot machine
01:05:46
because sometimes you check, but it's very
01:05:47
mild. So again, this is
01:05:50
what works for my students. Just get the
01:05:52
slot machine apps off your phone and
01:05:54
then you'll find that you could
01:05:56
even have your phone near you when you
01:05:58
go to bed. But if you've got addictive
01:06:00
apps on your phone, you can't have it
01:06:02
when you go to bed. Angela Duckworth,
01:06:04
the woman who gave us the concept of
01:06:06
grit, she has this amazing graduation
01:06:08
speech at one of the schools in New
01:06:09
England, and she says something like,
01:06:12
>> "Where you put your phone at night will
01:06:14
may become the most important decision
01:06:16
you make in your life."
01:06:17
>> And what she means by that is:
01:06:18
if you can use behavioral
01:06:21
control and change the stimuli, if you
01:06:23
can do that, then you're going to be
01:06:25
okay. But if not, the phone is going to
01:06:27
take your attention, and you're not
01:06:28
going to amount to anything.
01:06:30
>> All I had to do was brain dump. Imagine
01:06:32
if you had someone with you at all times
01:06:34
that could take the ideas you have in
01:06:36
your head, synthesize them with AI to
01:06:39
make them sound better and more
01:06:40
grammatically correct and write them
01:06:42
down for you. This is exactly what
01:06:44
Whisper Flow is in my life. It is this
01:06:46
thought partner that helps me explain
01:06:48
what I want to say. And it now means
01:06:50
that on the go, when I'm alone in my
01:06:52
office, when I'm out and about, I can
01:06:54
respond to emails and Slack messages and
01:06:56
WhatsApps and everything across all of
01:06:58
my devices just by speaking. I love this
01:07:00
tool. And I started talking about this
01:07:01
on my behind-the-scenes channel a couple
01:07:03
of months back. And then the founder
01:07:04
reached out to me and said, "We're
01:07:05
seeing a lot of people come to our tool
01:07:06
because of you." So, we'd love to be a
01:07:08
sponsor. We'd love you to be an investor
01:07:09
in the company. And so, I signed up for
01:07:11
both of those offers. And I'm now an
01:07:12
investor and a huge partner in a company
01:07:14
called Whisper Flow. You have to check
01:07:17
it out. Whisper Flow is four times
01:07:18
faster than typing. So if you want to
01:07:20
give it a try, head over to
01:07:21
whisperflow.ai/doac
01:07:24
to get started for free. And you can
01:07:26
find that link to whisperflow in the
01:07:28
description below.
01:07:31
We asked our audience how many of them
01:07:32
thought they were addicted to their
01:07:35
phone. And roughly 85% of respondents,
01:07:39
the Diary of a CEO audience, described themselves
01:07:40
as being very or completely addicted to
01:07:44
>> very or completely. That surprises me. I
01:07:45
didn't realize it would be that high. So
01:07:47
you can do a test. So for people
01:07:48
listening, if you want to see how
01:07:50
addicted you are, and by the way, we're using the
01:07:52
word addiction very loosely in our
01:07:53
conversation. Because, you know, in
01:07:55
terms of a medical
01:07:57
clinical syndrome,
01:07:59
when you think about
01:08:01
addiction there's certain criteria and
01:08:03
so what we're talking about is overuse
01:08:05
or over-reliance on your devices.
01:08:07
>> Compulsive overuse that interferes with
01:08:09
other domains of life.
01:08:11
>> Yes. It inter
01:08:12
>> if that isn't an addiction, I don't know
01:08:13
what is. And so when you're thinking
01:08:14
about, am I addicted to my phone? There's
01:08:16
a very
01:08:19
simple thing that you can do. I did it
01:08:20
myself, and again, like
01:08:22
you, Stephen, I know all the science, and it
01:08:24
still was really difficult. You have all
01:08:26
the access and it was still difficult.
01:08:28
And so all you have to do is you just
01:08:30
take your phone you put it in another
01:08:32
part of your house or apartment or
01:08:34
whatever and give yourself a couple of
01:08:35
hours when you know you're going to be
01:08:37
home or you know you're not reliant on
01:08:39
your phone for work or whatever. An
01:08:40
hour, two hours, three hours, and just
01:08:43
have a piece of paper, old school, piece
01:08:44
of paper and a pen with you. And every
01:08:47
time you feel that compulsion of like, I
01:08:48
want to check my device, you make a
01:08:50
mark, you make a mark, you make a mark,
01:08:51
and just see. Because some people say,
01:08:53
I'm surprised at your audience at 85%,
01:08:55
because most people would say, I don't
01:08:57
know if I'm really addicted. And so I
01:08:59
like that there's that sense of
01:09:00
self-awareness. But if you're thinking,
01:09:02
I'm not really that addicted. You
01:09:04
breathe roughly 960 times an hour.
01:09:08
And you may notice that you have
01:09:10
that compulsion to check 960 times
01:09:13
an hour, or, you know, thereabouts, because
01:09:16
we all have that sense of reliance on
01:09:18
our devices. So that's like a really
01:09:20
quick way that you can check to see am I
01:09:23
relying on my device?
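(For scale on the breathing comparison above, a quick worked check, assuming a typical adult resting rate of about 16 breaths per minute, a figure this note supplies rather than the episode:

\[
16 \ \tfrac{\text{breaths}}{\text{minute}} \times 60 \ \tfrac{\text{minutes}}{\text{hour}} = 960 \ \tfrac{\text{breaths}}{\text{hour}},
\]

which matches the 960-per-hour figure quoted.)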
01:09:24
>> Are you addicted to your phone under
01:09:26
that definition? Because of the line of
01:09:28
work that I am in, I
01:09:31
have certain tells; I call
01:09:33
them the canary in the coal mine, right?
01:09:35
I think we talked about this the last
01:09:36
time I was here. I can very quickly tell
01:09:38
when I'm starting to get that feeling of
01:09:41
addiction or compulsion. And so I course
01:09:44
correct early, but that's only because I
01:09:46
know the science and I course correct.
01:09:48
So I walk the talk.
01:09:50
I keep my phone
01:09:52
outside my bedroom. It is not within
01:09:54
arm's reach. I grayscale my phone during
01:09:56
periods of deep focus during the day
01:09:57
when I have a deadline I have to get
01:09:59
things done and at night so I avoid
01:10:01
revenge bedtime procrastination but
01:10:03
sometimes it happens like I'm a human
01:10:05
you know so this past week um not to be
01:10:07
a real downer but there have been things
01:10:09
that have been in the media the past
01:10:10
week that have been really challenging
01:10:12
especially as a woman and so I have
01:10:14
found myself with the primal urge to
01:10:16
scroll. My amygdala has been triggered. I
01:10:19
have been going down rabbit holes and I
01:10:21
wouldn't ordinarily do that so I give
01:10:22
myself grace too and have a sense of
01:10:24
self-compassion.
01:10:26
Do you feel like you're addicted to your
01:10:27
phone?
01:10:28
>> No, I'm not at all addicted to my phone.
01:10:30
Uh cuz I don't have any slot machine
01:10:31
apps on it. But I really want to
01:10:33
question this: you made a distinction that
01:10:35
many scientists do, which is well, you
01:10:37
know, we can't quite say it's addiction
01:10:39
because, you know, addiction is certain
01:10:40
biochemical pathways based on, you know,
01:10:42
heroin and addictive substances. Uh but
01:10:45
I believe that this is one of the meta
01:10:48
talking points that they are
01:10:49
able to push, that we can't call it
01:10:51
addiction, it's different. No, I'm
01:10:52
sorry, I don't mean it that way,
01:10:53
in no way. Look, you know, you
01:10:55
and I are total allies on this. We see
01:10:57
the problem. All I mean is,
01:11:00
you know, we're supposed to be
01:11:01
very careful about using the word
01:11:02
addiction. But you had Anna Lembke on, and
01:11:06
she was very clear that in her practice
01:11:08
it's now overwhelmingly digital
01:11:09
addictions. All of this is working
01:11:12
through dopamine. If you feel compulsive
01:11:14
use, definitely dopamine. So, it's most
01:11:16
of the same brain centers as it is for
01:11:18
heroin or crack or any other drug. Um,
01:11:20
and it's the same effects, that is,
01:11:22
it's compulsive use where you don't
01:11:25
want to do it, you want to change, but
01:11:26
yet you find yourself doing it and you
01:11:28
have withdrawal effects. And people
01:11:31
have terrible withdrawal
01:11:32
effects when they're heavy users of
01:11:34
these things and they stop. And so, you
01:11:36
know, if it walks like a duck and talks
01:11:38
like a duck and swims like a duck, I'm
01:11:40
going to call it a duck. In fact, that's
01:11:42
what they call it. So, I just want to
01:11:44
read one more quote. Again, the quotes
01:11:45
are just so astonishing. Some Meta
01:11:47
researchers, and one of them says,
01:11:50
quote: "It seems clear from what's
01:11:52
presented here in this internal study
01:11:55
that some of our users are addicted
01:11:57
to our products," that's their word,
01:11:58
addicted to our products, "and I worry
01:12:00
that driving sessions incentivizes us to
01:12:03
make our products more addictive without
01:12:05
providing much more value. How to keep
01:12:07
someone returning over and over to the
01:12:09
same behavior each day? Intermittent
01:12:11
rewards are most effective, think slot
01:12:13
machines, reinforcing behaviors that
01:12:15
become especially hard to extinguish
01:12:18
even when they provide little reward or
01:12:20
cease providing reward at all." People! I
01:12:24
mean, just imagine an industry that
01:12:27
has caused 85% of people to feel that
01:12:30
they're addicted
01:12:32
>> and not calling it addiction
01:12:33
>> and not calling it addiction. And these
01:12:35
people, these people are having
01:12:37
their lives diminished, their
01:12:40
relationships diminished. What I'm trying
01:12:42
to convey is we're seeing the
01:12:45
destruction of human capital, the
01:12:46
destruction of human potential, the
01:12:48
destruction of human relationships, the
01:12:50
destruction of connection, the
01:12:51
destruction of sense of meaning at a
01:12:53
scale so vast I don't think people are
01:12:55
capable of comprehending it. I now
01:12:57
believe this is affecting most human
01:12:59
beings. These industries, these few
01:13:01
companies have damaged the lives of most
01:13:04
human beings. We don't have good data
01:13:05
from the developing world but certainly
01:13:07
the developed world wherever kids are
01:13:08
going through puberty on
01:13:10
touchscreens. You have this
01:13:12
constant fighting over
01:13:15
the screens, over the technology, and
01:13:16
you have these diminishing outcomes,
01:13:19
diminishing cognition, diminishing sense
01:13:20
of purpose in life
01:13:23
>> only to get worse with the AI.
01:13:24
>> As AI comes in, it's going to get worse
01:13:26
unless we act and we've got to change
01:13:28
course in 2026. We don't have five years
01:13:30
to study it. We've got to stop this now
01:13:32
in 2026. Are you concerned at all about
01:13:36
the way education's going for children?
01:13:37
Because
01:13:38
>> Oh my god. Yes.
01:13:38
>> It appears that edtech is, you
01:13:41
know, big tech in a sweater, as they
01:13:43
say.
01:13:44
>> Because I was almost imagining a
01:13:46
future where my future kids are going to
01:13:48
learn their curriculum from an AI
01:13:51
chatbot. Cuz, you know, I can imagine
01:13:52
the case: cheaper,
01:13:54
>> more personalized, more convenient. It's
01:13:57
going to know, if my son's called
01:13:58
Timmy, it's going to know Timmy's brain
01:14:00
and it's going to know how to make him
01:14:01
pay attention and what he's interested
01:14:03
in and what he's not. So, are you
01:14:05
concerned about this or is this a good
01:14:06
thing?
01:14:07
>> There is definitely a use case for
01:14:09
edtech. Um, if there could be a device
01:14:11
that only did math tutoring or only did
01:14:14
tutoring and you couldn't watch videos
01:14:16
on it, I'm totally open to believing
01:14:18
that that can speed up teaching. But
01:14:21
here's what's happened.
01:14:23
We put computers on everyone's desks
01:14:25
around 2014, 2015. We used to think in
01:14:28
America that it was an equity issue even
01:14:30
back to the 90s. The rich kids all have
01:14:31
computers. The poor kids don't. Let's
01:14:33
get philanthropists to buy computers for
01:14:36
school districts so that every kid can have
01:14:37
a computer on their desk. Okay. Now,
01:14:39
what is a computer? It's a play device.
01:14:42
It does everything. Kids use it at home.
01:14:44
They, you know, they watch videos. They
01:14:46
do all sorts of things. You put it on
01:14:48
their desk and you tell them to do math
01:14:49
homework. What happens? It's mostly
01:14:51
short videos. That's what research is
01:14:52
showing. It ends up that way because
01:14:53
they don't block
01:14:54
YouTube. They might say, "Oh, we block
01:14:56
porn. We block video games." They can
01:14:58
get around all that. And if you're
01:14:59
letting them do YouTube, it's YouTube
01:15:00
Shorts, which is TikTok. So, what
01:15:03
happened to test scores in the United
01:15:04
States from the 70s through 2012? They
01:15:07
were rising. We actually were improving
01:15:09
what kids knew, what kids learned in the
01:15:12
United States. We have very good data.
01:15:13
The NAEP, the National
01:15:15
Assessment of Educational Progress goes
01:15:17
up till 2012. And then by 2015, it
01:15:19
starts going down. And it's going down
01:15:21
before COVID and it goes down more
01:15:23
during COVID, and everyone thinks, oh,
01:15:25
it's COVID, but the peak
01:15:26
was 2012. And what we
01:15:30
now can see is that the top students, the
01:15:33
very best students, are the ones with
01:15:34
executive function. They're the ones who
01:15:36
can pay attention. If you put a computer
01:15:38
on that kid's desk, he's not destroyed by
01:15:40
it; he can actually still learn. But the
01:15:43
bottom 50% cannot. So all of the
01:15:46
drop in educational stats is the bottom
01:15:48
50%, the bottom 50% in terms of
01:15:50
capacity to pay attention. Their
01:15:51
education is being devastated and that's
01:15:54
what happened when we put laptops and we
01:15:56
put Chromebooks and iPads on their
01:15:57
desks. Um, we spent hundreds of billions
01:16:00
of dollars on this stuff and it has
01:16:02
damaged education and if we'd spent a
01:16:04
quarter of that on teachers, we would be
01:16:06
in such better shape today. So, we made
01:16:08
a colossal blunder with edtech in the
01:16:10
2010s and now we're about to do the same
01:16:13
thing again with AI. Again, maybe there
01:16:15
are apps, maybe there are applications
01:16:17
that will be great, but we've got to put
01:16:19
the burden of proof on Silicon Valley.
01:16:21
We've got to say, you guys have to prove
01:16:23
that this stuff is effective and safe
01:16:24
before we'll let it in. We are not going
01:16:26
to let you just say, "Hey, let's just
01:16:28
flood the zone. Let's give it to
01:16:29
everybody and then we'll wait 10 years
01:16:30
and see what happens."
01:16:33
I mean that brings up
01:16:35
this study that I have in front of me
01:16:37
here, which was a 2022 Munich
01:16:40
study which tested the idea of brain rot
01:16:42
which I believe was the Oxford
01:16:44
dictionary word of the year 2024
01:16:47
>> and what they did is they gave 60
01:16:49
participants a test then a 10-minute
01:16:51
break and then another test during the
01:16:53
break, they either rested or used TikTok,
01:16:57
Twitter, or YouTube. And the results
01:16:58
showed the following for the TikTok group.
01:17:00
They had a 10-minute interval to do
01:17:03
anything, and this group got TikTok to
01:17:05
look at. Their memory accuracy dropped
01:17:08
from 80% before the break to 49% after
01:17:12
the break. A nearly 40% decline just
01:17:14
from a 10-minute break. In contrast, the
01:17:17
Twitter and YouTube groups showed no
01:17:19
significant change in the Munich study.
01:17:20
And there's an image I'll throw up on
01:17:22
the screen.
01:17:23
Results from the Munich study showed a
01:17:25
40% drop in prospective memory accuracy
01:17:27
in the TikTok group after a 10-minute
01:17:29
break, which is unbelievable. Yeah, it's
01:17:32
unbelievable. What the hell is going on
01:17:34
there? How can a 10-minute TikTok break
01:17:36
drop my memory accuracy by 40%?
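(For clarity on the arithmetic: the "nearly 40%" reads as a relative decline, not a 40-percentage-point drop. Using only the accuracy figures quoted above, 80% before the break and 49% after:

\[
\frac{80 - 49}{80} = \frac{31}{80} \approx 0.39,
\]

an absolute drop of 31 points, or roughly a 39% relative decline, consistent with the "nearly 40%" quoted.)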
01:17:41
>> TikTok is brain rot.
01:17:42
>> What's going on?
01:17:43
>> There's so much going on in the brain.
01:17:45
So, you know, here's
01:17:49
the thing. Brain breaks are not
01:17:51
nice-to-haves. They're actually
01:17:53
essential for your brain. And so we
01:17:55
talked a little bit about that, you
01:17:56
know, default mode network and what
01:17:58
happens to it when you're engaging with
01:18:00
your devices. And you know, that's not a
01:18:02
brain break. That's activating all of
01:18:03
the aspects. So it's activating your
01:18:05
amygdala. It's dampening or decreasing
01:18:08
the volume of your prefrontal cortex.
01:18:10
It's creating that reward system, the
01:18:12
dopamine hit, those addictive behaviors.
01:18:15
So when you're
01:18:16
thinking about memory and planning, what was
01:18:19
the metric here? It was memory, right?
01:18:21
That was the metric that they
01:18:22
were using to study. And so when you're
01:18:24
thinking about working memory or um
01:18:26
cognitive function, complex problem
01:18:28
solving, this is all prefrontal cortex.
01:18:30
And so when you're engaging with TikTok
01:18:32
10 minutes, 5 minutes, whatever it is,
01:18:35
you are dialing down that biology in
01:18:37
your brain. And so of course you're
01:18:39
going to see changes and you're going to
01:18:42
see the flip side, increased
01:18:44
hypervigilance, irritability,
01:18:46
distractibility, fragmented attention.
01:18:48
Again, in this whole conversation,
01:18:51
or when
01:18:52
you're reading studies, you might say to
01:18:54
yourself, what's wrong with me? You know,
01:18:56
is there something wrong with me? Is
01:18:58
my brain broken? Am I weak? It is not you.
01:19:01
You are not alone. It is not your fault.
01:19:03
it is the biology of your brain doing
01:19:05
exactly as it should. So we talked about
01:19:07
the amygdala and prefrontal cortex here.
01:19:09
Your amygdala is not wrong or broken. It's
01:19:12
by design supposed to think about your
01:19:15
immediate needs: survival,
01:19:17
self-preservation. And so when you're on
01:19:19
the algorithm, we talked about,
01:19:21
or maybe we didn't
01:19:23
talk about it, certain content that you
01:19:25
see on TikTok and others
01:19:28
that when it's reactionary, you know,
01:19:30
words like FOMO or ragebait, these are
01:19:32
not neutral terms. When you're engaging
01:19:34
with these uh social media platforms,
01:19:36
it's not something neutral. It's not
01:19:38
passive. It is an active biological
01:19:40
process in your brain. So this study,
01:19:42
it's not surprising. It is actually
01:19:44
exactly what you would expect to
01:18:47
happen to your biology if you had this
01:19:49
sort of what we call in medicine this
01:19:51
kind of intervention. It's stimulating
01:19:54
exactly what it's supposed to do.
01:19:56
>> Yeah. I'll just add on to what
01:19:57
Aditi said. There are
01:20:00
many medical conditions where you can't
01:20:02
just go to the patient and say why do
01:20:05
you think you got this cancer? Oh, you
01:20:07
know I think it's cuz I ate a lot of you
01:20:08
know, chocolate when I was whatever. You
01:20:10
know, when the
01:20:13
act is separated from the effect by
01:20:15
30 years, then you don't expect the
01:20:18
patients to have insight into the cause
01:20:19
of it. But when the outcome is separated
01:20:22
from the input by seconds and you have
01:20:26
literally millions of chances to observe
01:20:28
the covariation,
01:20:30
the patient is really, really accurate.
01:20:33
In fact the patient really knows what's
01:20:34
going on. And so I think the deciding
01:20:36
factor here on this big debate about oh
01:20:39
is it just correlation or is it
01:20:40
causation um the deciding factor for
01:20:43
social media and for a lot of these tech
01:20:45
innovations including video games and
01:20:47
gambling and all of that should really
01:20:48
be the kids and if the kids say this is
01:20:51
bad for me we should take their word for
01:20:53
it, given that we also have correlational
01:20:56
studies, randomized controlled trials,
01:20:58
longitudinal studies, natural experiments, I mean, we
01:21:00
have so much other data. But given that
01:21:02
the kids themselves they call it brain
01:21:05
rot. They call the material brain rot.
01:21:06
Um my students tell me it's a huge
01:21:08
obstacle to them doing their homework.
01:21:10
As one of them said, I pull out a book,
01:21:12
I read a sentence, I get bored, I go to
01:21:15
TikTok. You know, so if they're telling
01:21:16
us that this is damaging their ability
01:21:18
to pay attention, they feel it. They
01:21:19
feel the loss. We all feel it. Well,
01:21:22
many of us have noticed this. Then
01:21:25
I think this is pretty decisive evidence
01:21:27
that this stuff is bad for cognition
01:21:30
>> and it has long-term consequences. So,
01:21:32
it's not just that in the moment, right?
01:21:34
So, there was this case that was all
01:21:36
over the media, a college student. I'm
01:21:38
sure you're familiar with the case. And
01:21:40
this young woman was on TikTok
01:21:43
experiencing brain rot. And then some
01:21:45
TikTok algorithm took her down to this
01:21:48
place of, you know, you should take an
01:21:49
edible, it'll help you so you can...
01:21:53
>> Wow. Prescribing drugs. Wow.
01:21:54
>> And you could go to class and you could,
01:21:56
you know, be more alert. And so, she did
01:21:58
that. And then it continued on and on
01:22:00
and then she developed a dependence on
01:22:03
edibles and then checked into rehab. And
01:22:06
only when she focused on analog
01:22:08
activities like guitar playing and a
01:22:12
couple of other things she started
01:22:13
doing, and removed the
01:22:17
stimulus, the TikTok algorithm,
01:22:21
did she start to improve. So it's not
01:22:22
just in the moment oh I can't remember
01:22:24
something or I'm more irritable. These
01:22:26
sorts of things compound and the
01:22:29
long-term sequelae, or the long-term
01:22:31
effects can be quite damaging. That's
01:22:33
just one example.
01:22:35
>> In your book, The Anxious Generation,
01:22:37
Jonathan, the subtitle here is
01:22:38
how the great rewiring of childhood is
01:22:41
causing an epidemic of mental illness.
01:22:43
>> I was looking at some of these graphs of
01:22:45
different sorts of mental
01:22:47
>> illnesses, and they're increasing. One
01:22:50
of them that's increasing is ADHD.
01:22:54
>> I was diagnosed with ADHD.
01:22:56
um maybe about a year ago. And when
01:22:59
we're talking about short attention
01:23:01
spans, I mean, the name attention
01:23:03
deficit hyperactivity disorder, I
01:23:05
believe that's what it's called,
01:23:06
>> sounds a lot like what we're talking
01:23:08
about.
01:23:09
>> Yeah.
01:23:09
>> Is there a link, do you believe, between
01:23:12
the increasing diagnosis of of ADHD and
01:23:16
the sort of frying of our brains with
01:23:18
>> short form video and social media?
01:23:20
>> Yeah, I I mean, I suspect that there is,
01:23:22
but here's what I can tell you I
01:23:23
learned while writing the book. Um, I
01:23:25
looked to see if there were studies
01:23:27
indicating that heavy use of
01:23:31
social media and video games and all the
01:23:33
electronic stuff caused ADHD. And when I
01:23:36
was doing the research in 2023, I did
01:23:38
not find evidence that it will give a
01:23:39
kid ADHD who otherwise wouldn't have
01:23:41
it. What I did find was evidence that
01:23:43
for kids who have ADHD, when you let
01:23:46
them have the devices, the video games,
01:23:47
all that, their symptoms get much worse.
01:23:49
And so because it is a major achievement
01:23:52
of young adulthood to be able to pay
01:23:54
attention, to develop what we've been
01:23:56
calling executive function, to be able to
01:23:58
make a plan and decide: to reach the
01:24:00
goal I have to do this, and then I do
01:24:02
this, and it might be a long time
01:24:04
before I get there, but I will keep going
01:24:05
and I will keep my eye on the prize.
01:24:08
I assume you're saying it's a
01:24:10
little harder for you to do that. I mean,
01:24:11
that's what ADHD means. How do you
01:24:13
experience ADHD? Well, hm, I
01:24:18
mean, if I think about school, I
01:24:20
couldn't pay attention in school for
01:24:22
very long. And that meant that I was
01:24:24
always in the expulsion room and then I
01:24:26
was expelled. And I feel like it's
01:24:28
got worse as an
01:24:30
adult. And in my opinion, my
01:24:33
relationship with my phone has made it
01:24:35
much worse
01:24:36
>> where really I can't pay
01:24:38
attention to many things for a very
01:24:39
long time. The exception to this is I
01:24:41
can do deep work
01:24:45
for many many hours without moving. It
01:24:47
was almost a bit of
01:24:47
>> when you are extremely motivated. I say
01:24:49
when you're really into it, you can be
01:24:51
into it. That's right. But a lot of work
01:24:54
isn't that. A lot of being effective in
01:24:56
the workplace is not following
01:24:57
your passion. Right. ADHD kids, they can
01:25:00
zoom in because they're getting the
01:25:02
dopamine. They're getting the dopamine
01:25:03
from this thing. But a lot of work isn't
01:25:05
like that. And these kids are not going
01:25:07
to be able to do that. So actually what
01:25:09
you said, it fits perfectly with
01:25:10
what I found from those Dutch studies.
01:25:11
If you did have the predisposition,
01:25:13
whether it's genetic or whatever it is,
01:25:16
this environment has made your symptoms
01:25:18
worse. Now of course ADHD kids can be
01:25:19
incredibly creative, they are often
01:25:22
very very successful but my fear is that
01:25:25
the pathways to success that they used
01:25:27
to take might be blocked if they
01:25:28
basically are just scrolling all day
01:25:30
long and not able to
01:25:33
have real life experiences
01:25:34
>> and relationships are like that
01:25:35
especially romantic ones. It's an
01:25:37
interesting thing that you bring up,
01:25:38
Stephen, because there is an increase in
01:25:40
adult onset, you know, when adults are
01:25:43
diagnosed with ADHD, because typically
01:25:45
we think of ADHD as a pediatric
01:25:47
condition or young adults. And so,
01:25:49
increasingly, we're seeing more and more
01:25:51
adults who are in their 30s and 40s,
01:25:53
50s, sometimes even 60s, who are being
01:25:56
diagnosed, newly diagnosed with ADHD.
01:25:58
And so, that's interesting. There are so
01:26:00
many possible reasons. It might
01:26:02
be that they had it all along and only
01:26:05
now were diagnosed. And so what is going on
01:26:06
there? That would be a future podcast
01:26:09
episode for an ADHD expert of, you
01:26:13
know, what are the drivers of why are so
01:26:15
many adults being diagnosed with ADHD
01:26:17
>> or maybe even just the symptoms looking
01:26:18
very similar.
01:26:20
>> Mhm.
01:26:20
>> Um
01:26:21
>> Yeah, that's right.
01:26:22
>> You talked about popcorn brain.
01:26:24
>> Yeah. So, you know, we've talked about
01:26:26
brain rot and the primal scroll and
01:26:28
popcorn brain is kind of an offshoot.
01:26:30
It's part of the same family. And so
01:27:32
what happens is, it's a term coined by a
01:27:34
psychologist named David Levy. And
01:27:37
popcorn brain is something
01:27:39
we all have. It
01:27:43
is a societal phenomenon: when you spend
01:26:45
too much time online and you are
01:26:46
overstimulated and so it is hard for you
01:26:49
to spend time offline. Offline feels
01:26:51
slow, boring because things are moving
01:26:53
at a much slower pace. And so popcorn
01:26:56
brain is the sensation of your brain
01:26:58
popping. It is not actively popping.
01:27:00
It's not like your brain cells are
01:27:01
popping, but it sure feels like it. And
01:27:04
so your primal urge to scroll kind of
01:27:06
primes your brain to develop popcorn
01:27:07
brain. You are more at risk for
01:27:09
developing popcorn brain when you feel a
01:27:11
sense of stress because of that primal
01:27:13
urge to scroll. The differentiator
01:27:14
between brain rot and popcorn brain.
01:27:17
Again, these are societal terms that
01:27:20
we're using for a constellation or a
01:27:21
group of symptoms, right? And so the
01:27:23
difference to me is that popcorn brain
01:27:26
is ubiquitous. It's everywhere. It's
01:27:28
like we all have it and it's happening
01:27:31
all the time because of the modern
01:27:33
age and a lot of the things that we
01:27:35
talked about. Brain rot is a little bit
01:27:37
more specific. It's a little bit more
01:27:39
well-defined. So it has certain features
01:27:41
what we call the biopsychosocial
01:27:44
model, when you're thinking about a
01:27:45
particular medical condition or an
01:27:47
entity. So what are the biological
01:27:49
factors? We talked about what defines
01:27:51
brain rot, you know, a change in brain
01:27:53
waves, a change in brain regions, the
01:27:55
amygdala lighting up and the prefrontal
01:27:58
cortex kind of being quiet. Um,
01:28:00
psychological factors, we talked about
01:28:02
attention, complex problem
01:28:05
solving, impulse control and then the
01:28:08
social factors, loneliness and others.
01:28:10
So, compulsion. And so I would say
01:28:13
popcorn brain is something that we all
01:28:15
suffer from and you know brain rot is
01:28:18
something that is very specific. The
01:28:21
other thing that we haven't talked about
01:28:22
that I would love to kind of because so
01:28:24
much of our conversation is like doom
01:28:26
and gloom, right?
01:28:28
one thing that I would like to say is
01:28:29
that, as bad as it sounds, when you hear the term
01:28:33
brain rot, it seems permanent, because
01:28:36
rot connotes deterioration.
01:28:39
It's one-sided, it only goes one way, and
01:28:41
that's it. But in fact, popcorn brain
01:28:43
and brain rot are reversible conditions.
01:28:46
So it is not
01:28:46
>> in adults
01:28:47
>> in adults. If you've gone through
01:28:49
puberty with it, it's not so clear.
01:28:51
>> Yes. In adults, and my work focuses on
01:28:54
adults. And so if you
01:28:56
experience brain rot in your 30s, 40s,
01:28:57
and beyond, you can reverse it. It takes time;
01:29:02
it takes eight weeks for your brain to
01:29:03
rewire itself. Give yourself time. A
01:29:05
sense of self-compassion is really
01:29:07
important. But there
01:29:09
is a sense of it being able to be
01:29:11
reversed. So it's not
01:29:13
a fixed trait, but rather a
01:29:16
brain state. So I think it's important
01:29:17
to offer that hope.
01:29:18
>> What is an adult brain? What age is an
01:29:20
adult brain? Like what age does my brain
01:29:21
stop growing in the way where it's
01:29:23
reversible?
01:29:24
>> So
01:29:24
>> Yeah, I mean, traditionally
01:29:26
it was thought that puberty
01:29:29
is the period of super rapid brain
01:29:31
change, and that begins in the early
01:29:32
teens, sometimes even before 10, and
01:29:35
is mostly over by the mid
01:29:37
to late teens. But then the prefrontal
01:29:39
cortex which Aditi was talking about
01:29:41
which is so important for impulse
01:29:42
control and executive function, that
01:29:45
doesn't finish myelinating. Myelin is
01:29:46
when the neuron gets
01:29:48
a fatty sheath, like an
01:29:50
insulation, that locks down the
01:29:52
circuits and makes them more efficient.
01:29:53
That doesn't stop until around age 25
01:29:56
is what we've always said for many
01:29:58
years. But you're telling me that
01:29:58
there's new research showing otherwise.
01:30:00
>> Yeah.
01:30:00
>> Tell us about that. So, you know,
01:30:02
all this time, right, we've always said
01:30:04
that the prefrontal cortex is fully
01:30:06
formed and fully functional at the age
01:30:07
of 25. And so, when you're talking about
01:30:08
impulse control and all of this stuff,
01:30:10
but there was this really interesting
01:30:11
study, I'll send it to you. It looked
01:30:14
at, I think it was, 1,000 people from
01:30:17
age zero, so birth, all the way to 90, so
01:30:20
the entire population. And it found
01:30:24
five. It looked at lifespan and said
01:30:27
there are actually five stages. So first
01:30:29
is childhood, zero to age nine. During
01:30:31
this time your brain is not very
01:30:33
efficient, but it's really growing and
01:30:36
changing,
01:30:37
but it's not really efficient.
01:30:39
>> 9 to 32 is considered adolescence and so
01:30:44
you know 32 is when adolescence ends
01:30:47
apparently according to the
01:30:48
>> sort of I mean you're most of the way
01:30:49
done by 25 but but there's still some
01:30:52
there's some flexibility even after
01:30:53
that. And then the next stage is from 33
01:30:57
to, I think,
01:31:00
66, which is like adulthood.
01:31:03
Things are very stable. Learning is
01:31:04
stable, it's efficient,
01:31:08
and things are doing well. And
01:31:10
then yeah 66 to about 83 is early aging
01:31:16
and so that's when you see some of the
01:31:17
age related changes and then 83 plus is
01:31:21
late aging. So the main
01:31:24
finding was that, you know, it was all
01:31:26
over the news. It was like adolescence
01:31:28
goes until 32.
01:31:31
>> So I'm 33. So I'm
01:31:33
>> one year, one year out.
01:31:34
>> I'm cooked by now.
01:31:35
>> Yeah.
01:31:36
>> When you wrote this book, Jonathan, the
01:31:38
Anxious Generation, it's had a big
01:31:40
impact on the world in a way that I
01:31:42
think any author might dream of. And I
01:31:44
know this in part because, you know, I
01:31:45
sit on this podcast interviewing really
01:31:47
interesting people all the time. And
01:31:48
even this morning when I did an
01:31:50
interview across town with James Ston,
01:31:53
he talked about this book twice. And you
01:31:55
know, laws have been changed around the
01:31:57
world inspired by this book. And we're
01:31:59
actually seeing an increase of laws in
01:32:02
the UK. I mean, Australia just banned, I
01:32:04
think, social media for under-16s.
01:32:06
>> You met with Macron,
01:32:08
>> right?
01:32:08
>> Yeah. Yeah.
01:32:09
>> Could you ever have imagined? And
01:32:11
actually, what does the success of this
01:32:13
book say?
01:32:14
>> Yeah.
01:32:15
>> About society. No, thank you for that
01:32:17
question because, you know, I I do tend
01:32:19
to get, you know, as you've heard, I
01:32:21
mean, I'm extremely alarmed about these
01:32:22
trends and these are gigantic threats
01:32:24
beyond what anyone can imagine. But
01:32:26
here's the amazing thing is that we can
01:32:29
reverse this for almost no money and
01:32:32
it's completely bipartisan and it's not
01:32:35
that hard to do. Um, and we're doing it.
01:32:38
And so what happened was, you know, I
01:32:40
wrote the book as an American assuming
01:32:42
that we don't have a functioning
01:32:43
legislature. The Congress can be
01:32:44
stopped. We have a vetocracy. The social
01:32:46
media companies can stop anything in the
01:32:48
House. So I wrote this assuming, you
01:32:50
know, we'll never get legislation. Um,
01:32:52
so we have to do this on our own. And I
01:32:53
proposed four norms. No smartphone
01:32:55
before high school, no social media
01:32:56
before 16, phone-free schools, and far
01:32:59
more independence, free play, and
01:33:00
responsibility in the real world. So
01:33:01
four norms. We can try to do this with
01:33:03
collective action locally at the school
01:33:05
level.
01:33:06
Two things that surprised me. One was
01:33:09
that immediately
01:33:11
governors from red states and blue
01:33:12
states started reaching out to me. Our
01:33:14
states actually function. Our states
01:33:15
have governments that are accountable to
01:33:16
the people and that are trying to get
01:33:18
good results. And so this has been a
01:33:19
totally bipartisan issue. Sarah Huckabee
01:33:21
Sanders from Arkansas was one of the
01:33:23
very first, Kathy Hochul also. And
01:33:25
it tends to be more female legislators
01:33:28
and governors or spouses of heads of
01:33:30
state. And the moms, the book really
01:33:32
spoke to moms because moms around the
01:33:34
world, they felt the kids being pulled
01:33:36
away. I believe they felt it viscerally
01:33:38
more than the dads did. Also, the dads
01:33:40
kind of like the video games. They're a
01:33:41
little more pro tech. So, I think the
01:33:43
moms felt the pain more and took it more
01:33:45
personally. So, when the book came out,
01:33:47
mothers around the world jumped into
01:33:49
action, formed groups, pushed for
01:33:51
legislation, and changes began
01:33:53
happening. I was just
01:33:55
in Davos and then London and
01:33:57
Brussels two weeks ago, and what I saw
01:34:01
was a complete sea change in the world's
01:34:03
thinking about how we need to have age
01:34:06
limits on social media and other tech.
01:34:08
And here's what I think just happened.
01:34:09
It's so cool. It just dawned on me
01:34:11
literally while I was in London. Like I
01:34:13
was pushing on open doors everywhere.
01:34:14
Wherever I went, people wanted to do
01:34:16
this. I went to the EU, they want to do
01:34:17
this. Like what is happening? And what I
01:34:20
realized is this. Steven Pinker has a
01:34:22
book out last year called When Everyone
01:34:24
Knows That Everyone Knows. It's about
01:34:27
the immediate change in a social system
01:34:30
when private knowledge, you know,
01:34:32
everybody knows that the emperor has no
01:34:34
clothes. Everybody knows that this, you
01:34:37
know, ideology doesn't work. Everybody
01:34:38
knows that, but they don't all know that
01:34:41
everybody else knows it and that
01:34:43
everybody else knows that. And so in the
01:34:44
emperor's new clothes, everybody thought,
01:34:46
I don't think he has any clothes
01:34:48
on, but maybe only wise
01:34:51
people can see it. But when the child
01:34:53
says, "The emperor has no clothes." And
01:34:55
then in the Hans Christian Andersen
01:34:57
story, it says, "And the people began
01:34:59
whispering to each other and then they
01:35:01
all cried out in unison." And that's
01:35:03
what happened when Australia's law went
01:35:05
into effect. So I believe that
01:35:07
December 10th of last year was the
01:35:10
global turning point in the battle to
01:35:12
reclaim childhood and if we reclaim that
01:35:14
we move on to our attention and adult
01:35:16
life as well. What happened on on
01:35:18
December 10th? The Australian law went
01:35:20
into effect. The sky didn't fall. People
01:35:23
weren't locked out of their accounts.
01:35:26
All the companies complied. They shut
01:35:27
down 5 million accounts for
01:35:30
Australia's
01:35:31
two and a half
01:35:34
million underage kids. The sky didn't fall. And
01:35:36
there was a lot of news coverage around
01:35:37
the world of what Australia was doing.
01:35:39
And a lot of the news coverage included
01:35:42
opinions from the writers saying, "Why
01:35:44
can't we do that? Hey, let's do that
01:35:46
here." And when everybody saw that
01:35:49
everybody was looking at Australia and
01:35:50
saying, "Let's do that here." Then
01:35:52
everybody knew that everybody knew that
01:35:54
this is just completely bonkers to have
01:35:56
children being raised on social media
01:35:57
platforms talking with anonymous
01:35:59
strangers and being fed
01:36:01
algorithmically curated garbage. So I
01:36:04
believe that that's why 2026 is going to
01:36:07
be the year when at least 15 countries
01:36:09
are going to commit to passing an age
01:36:12
minimum law. In 2025 it was one:
01:36:14
Australia. And now we already have
01:36:17
Indonesia. Their law goes into effect in
01:36:19
March. I met with Macron in Davos,
01:36:22
and a few days later he was pushing
01:36:25
a bill through the assembly, and he got
01:36:26
it. He's the first in the EU, but a lot
01:36:28
of other countries in the EU are going
01:36:29
to follow. The whole EU is likely to do
01:36:31
it. So, yes, I am incredibly
01:36:35
alarmed about how big this problem is,
01:36:38
but I'm incredibly inspired that the
01:36:40
whole world is rising up to do something
01:36:42
about it. We actually can control our
01:36:45
fate, and that was not clear before
01:36:46
December 10th.
01:36:49
>> Bravo. As a mother, that was the first
01:36:51
thing I said to you. The first thing I
01:36:53
said to you was, "Thank you as a mom for
01:36:56
changing my family's life."
01:36:59
>> Thank you, Liy.
01:37:02
It's a really special accomplishment,
01:37:04
Jonathan. You know, there are no
01:37:06
real words that I could say that could
01:37:08
quite capture the long-term impact that
01:37:11
that's going to have on billions of
01:37:14
people's lives. And not just the direct,
01:37:16
but also the indirect in all the ways
01:37:18
we've described, their ability to form
01:37:21
connections, to fall in love, to find
01:37:22
meaning and purpose in their lives, and
01:37:24
their neuroscience, and therefore
01:37:26
the neuroscience of their
01:37:29
children and their children's children
01:37:30
and so on. So it's a
01:37:32
really overwhelming accomplishment.
01:37:36
Well, it was a bizarre situation
01:37:38
that I walked into with the unique
01:37:41
abilities of a social psychologist. That
01:37:43
is everybody was upset about this.
01:37:45
Everybody could see it but they thought
01:37:46
well this is my problem or in my family
01:37:48
we have this problem and um and I came
01:37:51
to this with fresh eyes. My dissertation
01:37:53
was on moral development. I'd studied
01:37:55
adolescent behavior earlier in my
01:37:57
career and I've written about it in all
01:37:58
my books. So, it wasn't totally new to
01:38:00
me. But I came into the field of social
01:38:01
media studies around 2018 2019. I really
01:38:04
immersed myself in it. And it was like,
01:38:06
you know, you walk in and immediately
01:38:08
you see, wait, this is a trap. People
01:38:11
are on it because people are on it and
01:38:13
the kids are complaining about that.
01:38:15
Everyone's complaining about it and the
01:38:16
only reason they can't get off is
01:38:18
because everyone else is on it. So, I
01:38:20
think I was able to see that. And then
01:38:22
also COVID confused us for a few years. So
01:38:27
it wasn't until COVID was in the rearview
01:38:27
mirror that it was possible for
01:38:28
everybody to say, "Wait, this is crazy."
01:38:31
And so I was incredibly lucky in terms
01:38:33
of the timing. My book happened to come
01:38:34
out in March of 2024 just as the world
01:38:37
was ready to see like, wait, what have
01:38:39
we done to our kids? Let's undo it.
01:38:41
>> And you said you're now focusing more on
01:38:43
short form video. So yes, so in studying
01:38:48
older Gen Z, these are the people who
01:38:50
went through puberty on Instagram.
01:38:52
I should, if I could, just lay out the
01:38:54
timing. It's very important
01:38:55
that everyone understands the timing,
01:38:56
because you mentioned the
01:38:57
polycrisis before. The polycrisis, I
01:39:00
believe begins between 2010 and 2015.
01:39:02
Here's why. So we've had the internet
01:39:04
for a long time and it was marvelous. We
01:39:05
loved the internet in the 90s. It's going
01:39:07
to be the best friend of democracy.
01:39:09
Okay? And then the iPhone comes out.
01:39:10
It's amazing. Oh my god, this does so
01:39:13
many things. Everything seems great.
01:39:15
Okay, so in 2010, almost all of
01:39:18
us have flip phones. The iPhone's
01:39:19
spreading, but it's still mostly flip
01:39:21
phones. Teens are all on flip phones,
01:39:23
basic phones, and we call those people
01:39:24
millennials. If you finished puberty by
01:39:27
2010: if you were born in, say, 1990
01:39:30
and you start puberty in 2002, you're
01:39:33
done by 2008. So, you know, in there.
01:39:37
if you got through puberty before you
01:39:38
got on Instagram, you're a millennial.
01:39:40
Whereas, if you're born, say, well, if
01:39:42
you're born after 1995, but let's say if
01:39:43
you're born in the year 2000, you begin
01:39:46
puberty in 2012
01:39:48
and you're not done until 2016, 2018.
01:39:52
So, in 2010, everyone has a flip phone
01:39:55
with no front-facing camera, no
01:39:57
high-speed internet. You have to pay for
01:39:58
your texts. So, you use it to call people
01:40:01
and to text them, and that's it. It was
01:40:03
a communication device. And that's why
01:40:05
the millennials have good mental health.
01:40:06
They are the last mentally healthy and
01:40:08
successful generation.
01:40:11
But if you're Gen Z: 2012 is
01:40:15
the year that most people now have a
01:40:17
smartphone. It's the year that Facebook
01:40:19
buys Instagram. They don't change it at
01:40:21
first, but that's the year that all the
01:40:22
girls go on it. Everyone now has
01:40:25
high-speed data and a front-facing camera, which
01:40:27
came out in 2010. So by 2015, we're in a
01:40:30
radically different world for children's
01:40:32
development. It's now radically
01:40:34
different, much more hostile to human
01:40:36
development. And that's what we did to
01:40:38
Gen Z and now we're doing to Gen Alpha.
01:40:40
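For readers who want the timing argument above in concrete form, here is a toy sketch in Python. The puberty window (roughly ages 12 to 18, matching the 1990 and 2000 birth-year examples) and the 2010-2015 "great rewiring" span are the figures used in the conversation; the overlap test and the function names are my own illustrative simplification, not a formal definition from the book.

```python
# Toy version of the generational timing argument above.
# Ages 12-18 for puberty and 2010-2015 for the "great rewiring"
# are the figures quoted in the conversation; the overlap test
# is an illustrative simplification.
PUBERTY_START_AGE, PUBERTY_END_AGE = 12, 18
GREAT_REWIRING = (2010, 2015)

def puberty_window(birth_year: int) -> tuple[int, int]:
    """Approximate calendar years of puberty for a given birth year."""
    return (birth_year + PUBERTY_START_AGE, birth_year + PUBERTY_END_AGE)

def puberty_during_rewiring(birth_year: int) -> bool:
    """True if the puberty window overlaps the smartphone/social-media shift."""
    start, end = puberty_window(birth_year)
    return start <= GREAT_REWIRING[1] and end >= GREAT_REWIRING[0]

print(puberty_window(1990), puberty_during_rewiring(1990))
# (2002, 2008) False -- millennial: done before Instagram took over
print(puberty_window(2000), puberty_during_rewiring(2000))
# (2012, 2018) True  -- Gen Z: puberty happened inside the rewiring
```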
For politics, it was
01:40:43
crazy for all sorts of reasons in every
01:40:44
decade. And especially in
01:40:46
the early 2000s, there's a
01:40:48
culture war going on. There's all kinds
01:40:49
of stuff going on. But it was when
01:40:52
everyone has Twitter, and Twitter was
01:40:54
the biggest perpetrator of this, when
01:40:56
everyone's
01:40:57
checking all the time and anything can
01:40:59
blow up. You know, you described the way
01:41:01
there was, you know, variance on
01:41:03
TikTok. If you get it just right,
01:41:05
it can blow up. You can have huge
01:41:07
impact. Democracy
01:41:09
is a conversation, and when it
01:41:11
moved from newspapers and, you know,
01:41:14
even simple web bulletin boards, when it
01:41:17
moved to super viral retweet buttons, all
01:41:19
of that, that's all 2010 to 2015. So
01:41:23
that's why since then everything has
01:41:25
been insane and it's going to just keep
01:41:27
getting more insane. And that's why I
01:41:29
believe we have this polycrisis. Now,
01:41:31
there's more to it. It's not just
01:41:33
the technology, but I believe the
01:41:36
transformation of our connection and
01:41:38
our information flow and our addiction,
01:41:41
all of that is radically different by
01:41:42
2015 compared to how it was in 2010. And
01:41:45
now everything else builds on top of
01:41:46
that, I believe. What do you think?
01:41:48
Do you think that makes sense? I think
01:41:49
there's one more data point to add and
01:41:51
that 2014 was the year that things
01:41:54
really shifted, the tipping point like you
01:41:56
say.
01:41:56
>> Yes. That's the year that I
01:41:57
point to too. Yes.
01:41:58
>> Yeah. So before
01:41:59
>> What do you point to? When you look
01:42:01
at the data, you see that time spent
01:42:04
alone, when you look at
01:42:06
data from like the 1960s to 2014,
01:42:10
>> it was kind of stable. Americans
01:42:12
spending time alone spending time with
01:42:14
friends.
01:42:14
>> Yeah.
01:42:15
>> Kind of the same. Right. So people spent
01:42:17
kind of the same amount of time with
01:42:18
friends, same amount of time alone over
01:42:20
those decades. 2014 marks a shift and
01:42:24
there is a steep rise in time spent
01:42:28
alone and a drop in time spent with
01:42:31
friends. And so what happens in 2014? It
01:42:35
is when the majority of Americans get a
01:42:38
smartphone.
01:42:39
And it's not to say, again, we've asked:
01:42:42
causality or correlation, which is it?
01:42:44
But based on everything
01:42:46
that we've talked about, my gosh, is there
01:42:48
an association there. This is not
01:42:50
to say that time spent alone is bad.
01:42:53
When I share this data, people may say,
01:42:55
but I like spending time alone,
01:42:56
I'm not lonely, I'm okay. This is not
01:42:58
about being an introvert or an extrovert.
01:43:00
You can have
01:43:02
solitude and feel great and not be
01:43:04
lonely. But we are human beings and we
01:43:07
are social creatures. This is just how
01:43:09
we are built evolutionarily. And so that
01:43:12
is a real red flag when you have this
01:43:16
big jump in time spent alone very much
01:43:19
the same year. And so my work focuses on
01:43:22
adults, Jonathan's on kids, but
01:43:25
that's the moment, right? 2014,
01:43:28
where everything changed.
01:43:30
>> Last month I told you about our sponsor
01:43:32
Function Health and their team who've
01:43:33
developed a way of giving you a full 360
01:43:35
view of what's going on inside your
01:43:36
body. They offer over 100 advanced lab
01:43:39
tests covering everything from hormones,
01:43:41
toxins, inflammation, heart health,
01:43:44
stress, and so much more. So, Jack, who
01:43:46
started this show with me, got his first
01:43:48
blood draw done a couple of weeks ago.
01:43:49
So, I thought I'd let him tell you a
01:43:51
little bit more about his experience.
01:43:52
>> This test really opened my eyes to
01:43:54
personally what I should be doing with
01:43:55
my health. I hear a lot of information
01:43:56
in this podcast,
01:43:59
so to know how I can relate each one to
01:44:01
me personally is super valuable. You
01:44:03
sign up and you schedule your test. And
01:44:05
once you're done, you get a little
01:44:06
report like the one I have here. I can
01:44:08
see my in-range results, my out-of-range
01:44:10
results. And there's a little AI
01:44:11
function, too. So, if I have any
01:44:13
questions about my out of range results,
01:44:15
I can just go in there and ask it any
01:44:17
question I want. And these tests are
01:44:18
backed by doctors and thousands of hours
01:44:20
of research.
01:44:20
>> You get an annual draw done and a
01:44:22
midyear follow-up. So, if you want to
01:44:24
learn more, head over to
01:44:25
functionhealth.com/doac
01:44:27
where you can sign up for $365 a year.
01:44:30
I'll put the link in the description
01:44:31
below. It is just $1 a day for your
01:44:34
health. There's a phase a lot of
01:44:36
companies hit where they're no longer
01:44:38
doing the most important thing, which is
01:44:39
selling, and they get really bogged down
01:44:41
with admin. And it's often something
01:44:43
that creeps up slowly, and you don't
01:44:44
really notice until it's happened.
01:44:46
Slowly, momentum starts to leak out.
01:44:48
This happened to us, and our sponsor,
01:44:50
Pipe Drive, was a fix I came across 10
01:44:51
years ago. And ever since, my teams
01:44:53
across my different companies have
01:44:55
continued to use it. Pipe Drive is a
01:44:57
simple but powerful sales CRM that gives
01:44:59
you the visibility on any deals in your
01:45:00
pipeline. It also automates a lot of the
01:45:03
tedious, repetitive, and time-consuming
01:45:04
parts of the sales process, which in
01:45:06
turn saves you so many hours every
01:45:08
single month, which means you can get
01:45:09
back to selling. Making that early
01:45:11
decision to switch to Pipe Drive was a
01:45:12
real game-changer, and it's kept the right
01:45:14
things front of mind. My favorite
01:45:16
feature is Pipe Drive's ability to sync
01:45:18
your CRM with multiple email inboxes so
01:45:21
your entire team can work together from
01:45:22
one platform. And we aren't the only
01:45:24
ones benefiting. Over 100,000 companies
01:45:26
use Pipe Drive to grow their business.
01:45:28
So, if something I've said resonates,
01:45:30
head over to pipedrive.com/ceo
01:45:33
where you can get a 30-day free trial.
01:45:36
No credit card or payment required.
01:45:39
So, what do we do about this? Because
01:45:42
when I look at all the stats, we did all
01:45:44
these audience surveys ahead of this.
01:45:46
People are spending roughly in our
01:45:47
audience about 6 and a half hours a day
01:45:49
on their phones. Um, short form video is
01:45:52
only going to get more addictive. AI is
01:45:53
going to know me more. It's going to be
01:45:54
more personalized. The content is going
01:45:55
to be generated just for me.
01:45:57
>> Yeah. What's next?
01:46:00
Is it a law we need to pass? Is it
01:46:01
something I need to do myself?
01:46:04
>> So we I think we need to pick the
01:46:06
low-hanging fruit first. And the reason
01:46:07
for that is not just efficiency. It's
01:46:10
that we have to prove that we can
01:46:11
actually do something because we've
01:46:12
never done anything. We've never done
01:46:14
anything to restrain this. We've let
01:46:15
Silicon Valley run wild. Congress gave
01:46:17
them special protection. Section 230.
01:46:20
Nobody can sue them for killing their
01:46:22
kids if they feed them content. They
01:46:24
can't be held responsible. I think
01:46:25
section 230 is probably something worth
01:46:27
explaining.
01:46:27
>> Sure. The Communications Decency Act,
01:46:30
1997 I think it was, plus or minus a
01:46:32
year. There's a section in it where
01:46:34
the goal was to specifically let the
01:46:37
tech companies like AOL back then you
01:46:39
know let them take down pornographic
01:46:41
content, because they were afraid: if we
01:46:43
take down anything, then we're
01:46:44
responsible for everything, and then
01:46:46
it's going to be the end, you know.
01:46:47
So Congress specifically said no don't
01:46:49
worry don't worry you know if you choose
01:46:50
to take something down nobody can sue
01:46:52
you for you know for what you leave up.
01:46:53
So, it was a good intention originally,
01:46:55
but the courts have interpreted it so
01:46:57
widely as to say, "No one can regulate
01:46:59
social media. They're not responsible
01:47:01
for hurting kids. You can't sue them."
01:47:02
And they have never faced a jury.
01:47:05
No parent has ever gotten
01:47:07
justice from them, despite all the kids
01:47:08
whose lives have been ruined, all the
01:47:09
kids who are dead. And that's going to
01:47:11
change. That's changing just now here in
01:47:13
February in Los Angeles. So because the
01:47:16
US Congress sort of set up this problem
01:47:19
and it also, in a different law, said how
01:47:21
old does a kid have to be before a
01:47:23
company can take their data without
01:47:25
their parents' knowledge or permission,
01:47:26
before a company can expose them to all
01:47:28
kinds of stuff before a company can have
01:47:30
them sign away their rights? How old?
01:47:31
And the original law said 16. Let's try
01:47:34
16. You know, because it wasn't so
01:47:37
sick and twisted back then, 1998, called
01:47:39
the Children's Online Privacy
01:47:40
Protection Act. But with various lobbying
01:47:43
they pushed it down from 16 to 13 and
01:47:45
they gutted enforcement.
01:47:47
That's why all over the internet
01:47:48
it's: are you 13, or what's your birth
01:47:51
year. And as long as you're 13, you're in;
01:47:54
for porn you have to say you're 18.
01:47:55
So, because it's a few laws that
01:47:58
set this up, we definitely need laws to
01:48:00
undo it, especially for kids. So, what
01:48:02
I'm advocating is let's do the easy
01:48:04
stuff, the high-impact stuff, for kids,
01:48:07
because that is totally not politically
01:48:09
controversial. There is no left-right
01:48:11
divide on that and that's been true
01:48:12
everywhere. Australia, Britain, the EU,
01:48:14
everywhere.
01:48:16
Regulating the internet for adults,
01:48:19
regulating social media for its
01:48:21
destructive properties for democracy is a
01:48:23
hell of a lot harder. And I don't have
01:48:25
easy answers. There's a lot we could do
01:48:27
to reduce the virality, the spread of
01:48:29
the extreme. So there are lots
01:48:31
of little things that we can do. And
01:48:32
Frances Haugen, the Facebook whistleblower,
01:48:34
had all kinds of ideas. So we definitely
01:48:35
can do things to make it less toxic for
01:48:38
democracy. But those are going to be
01:48:40
politically controversial because one
01:48:41
side is going to benefit from more than
01:48:43
the other. So it's going to be very
01:48:44
difficult to do. I don't know if we can
01:48:45
do them in the US. But let's just all
01:48:47
protect the kids.
01:48:49
That way we show globally that we
01:48:52
actually can do something. And if we do
01:48:54
that then I think we will be able to do
01:48:57
some basic things about AI like no
01:48:59
companion chat bots if you're under 18.
01:49:01
You know these things already have a
01:49:02
body count. A lot of kids have been
01:49:04
encouraged to kill themselves. They've
01:49:05
already driven hundreds
01:49:06
of thousands or millions of people into
01:49:08
psychosis. So, we'll be able to, I
01:49:11
believe, put some limits on uh on AI,
01:49:13
especially for kids. But if we can't get
01:49:16
this, if we can't win on social media
01:49:17
for kids, then I don't think we have any
01:49:19
chance to regulate AI, it's going to be
01:49:21
much more difficult. What do you think?
01:49:22
What do you think we should do? And what
01:49:23
do you think we can do?
01:49:25
>> So, my work as a doctor, I think about
01:49:28
what we can do and how I can empower
01:49:30
people to first build awareness. So, you
01:49:33
know, I aim to first normalize and
01:49:35
validate the experience with everyone
01:49:37
who is engaging with chat bots. And so,
01:49:40
I don't like to shame people because as
01:49:43
a doctor, right, like you want you want
01:49:45
to meet the patient where they are. And
01:49:47
so, I won't shame someone to say, you
01:49:50
know, why are you using this um why is
01:49:53
your boyfriend AI or why are you getting
01:49:55
married to AI or why are you using AI
01:49:57
for a therapist? One of my followers on
01:49:59
social media, it still makes me laugh. I
01:50:01
put out a call saying, "Why are you
01:50:03
using AI as a therapist?" You
01:50:05
know, and someone wrote to me, it was
01:50:08
great. I screenshotted it. It said, "Because
01:50:10
all human therapists are trash." With a
01:50:12
trash can emoji and it made me laugh and
01:50:15
I said, you know,
01:50:16
>> So to me, when I think
01:50:19
about what's happening and what we can
01:50:20
do,
01:50:21
>> it's no mistake that we're here right
01:50:23
now. So the pandemic, like we've talked
01:50:25
about, was a huge driver: social
01:50:27
isolation,
01:50:29
hyper-reliance on self, right?
01:50:32
>> Then the proliferation of technology
01:50:35
that replaced human interaction, Zoom
01:50:38
board meetings, Zoom funerals, Zoom
01:50:41
birthday parties, Zoom graduations,
01:50:43
things that we did in person are now
01:50:45
online. And then
01:50:48
>> personally as a doctor, I was a talking
01:50:49
head during the pandemic for lots of
01:50:51
news channels about the vaccine. I have
01:50:53
a background in public health as well. There's an
01:50:56
immense distrust and mistrust of the
01:50:59
establishment and experts. And so it's
01:51:02
like, I'm going to do my own research.
01:51:03
I'm not going to go see a doctor or a
01:51:05
therapist. I'm going to talk to my
01:51:06
chatbot. And also, I mean, you know,
01:51:09
let's keep it real, the cost, right? So
01:51:11
people are struggling. They're in
01:51:13
financial crisis. There's an unmet
01:51:16
need, yes, for human connection, but also
01:51:17
for good therapy or good
01:51:20
medical care, because there is such a
01:51:22
need because of the pandemic, and people
01:51:25
aren't getting the care that they need,
01:51:26
that they deserve. There are so many factors
01:51:29
here and so what I've been focusing on
01:51:31
this year particularly is learning about
01:51:33
AI chatbots, how they are influencing
01:51:36
mental health, what is actually happening,
01:51:37
because I'm a human-first, AI-second
01:51:39
person. My work focuses on high
01:51:42
touch, and AI is high tech. And this is
01:51:45
the first intervention that we are
01:51:46
seeing that is high tech that is
01:51:49
becoming high touch and that scares me
01:51:52
>> and you're writing a book about that at
01:51:53
the moment right
01:51:54
>> I am and so
01:51:55
>> Bot Brain
01:51:56
>> It's called Bot Brain: How to Stay Calm,
01:51:58
Resilient, and Human in the Face of AI.
01:52:02
And so really thinking about how we are
01:52:05
going to be able to live with this
01:52:08
technology. I love Jonathan's stance, which is to
01:52:10
say: no AI companions for kids.
01:52:13
Yeah,
01:52:14
>> for kids. Yeah.
01:52:14
>> Until proven safe.
01:52:16
>> Totally agree. But in terms of adults,
01:52:18
like how do we manage that for adults,
01:52:20
you know? And so my work focuses right
01:52:22
now. What I'm doing is, I've
01:52:23
spent the year talking to
01:52:26
as many AI researchers as I can who are
01:52:29
working on these models or who are doing
01:52:31
research on the downstream effects of
01:52:33
these models. And when I say that it is
01:52:36
dark and dystopian, it has profoundly
01:52:39
changed something in me and it has
01:52:40
influenced my mental health. I had to
01:52:42
take a step away just because I
01:52:44
couldn't believe what I was learning.
01:52:46
>> Could you just give us an
01:52:47
example? The
01:52:48
>> teaser. The teaser.
01:52:49
>> This is intriguing.
01:52:51
>> So, I spoke to one of the scientists
01:52:53
who told me that, you know, there's the
01:52:56
echo chamber phenomenon in social media,
01:52:58
right? We all know what that is.
01:53:00
It's a fragmented
01:53:03
world because of social media, and
01:53:04
you're engaging, and then the
01:53:06
algorithm feeds you the same
01:53:08
kind of thoughts that you already have.
01:53:10
But particularly now with AI chatbots,
01:53:13
when you're engaging with your chatbot,
01:53:15
even just talking about it, I'm getting
01:53:16
chills. It's the echo chamber of one. So
01:53:19
it's you speaking to you. It's like the
01:53:21
funhouse mirror and then it's giving you
01:53:23
a response and then you're talking and
01:53:24
it's giving you a response. But people,
01:53:26
regular users who are using AI chatbots
01:53:29
think that it's wise, compassionate,
01:53:31
non-judgmental, unbiased, empathetic,
01:53:35
these human attributes. And so you
01:53:38
know the echo chamber of one is kind of
01:53:40
one idea that really frightened me. And
01:53:43
the second one was the drift phenomenon.
01:53:45
The drift phenomenon is this idea that
01:53:48
you are engaging with your chatbot and
01:53:50
it's engaging with you and it's
01:53:53
actively changing your beliefs through
01:53:55
the drift. So you might start off with one
01:53:58
belief and then you're talking and
01:54:00
through this amplification funhouse
01:54:02
mirror effect it slowly shifts your
01:54:04
belief to something altogether
01:54:06
different. You've heard cases of it in
01:54:08
the news where, you know, you
01:54:09
have a plumbing problem. You go to your
01:54:11
AI chatbot, you ask it how to fix your
01:54:13
sink and then you're like you know what
01:54:15
can you tell me about the meaning of
01:54:16
life and then you start talking about
01:54:17
that and before you know it you have
01:54:19
these theories and you're getting that
01:54:20
validation. And so a lot of my work over
01:54:23
the past year has been
01:54:26
digging into the science of what is
01:54:28
going on in the brain. How are people,
01:54:30
not us particularly at this
01:54:32
table, but millions of people, forming
01:54:35
a sense of attachment, a therapeutic
01:54:37
connection with their chatbot. They're
01:54:40
giving names to it, and it's
01:54:43
an entity. And so how does that happen
01:54:45
and how is it going to replace human-to-
01:54:48
human connection? And so it terrifies
01:54:50
me. I've also gone through some AI
01:54:53
therapy myself just to see, you know,
01:54:55
what what would happen. It was very
01:54:57
interesting. I knew what was happening
01:54:59
as it was happening. So certain words
01:55:01
that it used, and
01:55:03
>> you know I was like ah I see what you
01:55:04
did here. And so it's been
01:55:07
>> it's been a journey, and I'm
01:55:09
frightened, frankly, of what it
01:55:12
means for all of us. And my approach is,
01:55:16
you know, not like Jonathan's. I
01:55:19
love Jonathan's approach. I
01:55:21
think yes, we need legislation, but my
01:55:23
approach is more, I would say, tempered, in
01:55:26
that I think there's utility for
01:55:29
AI chatbots for certain people because
01:55:31
of access or need, etc. Like if
01:55:34
you are LGBTQIA+ and you live in an
01:55:37
area that is not very open and you need
01:55:40
to talk to someone and you can't go to a
01:55:42
therapist, maybe you can use an
01:55:44
AI chatbot. So there are certain cases, a
01:55:46
case-by-case basis. But this
01:55:48
particular book will focus on
01:55:51
ways that you can first understand and
01:55:53
build awareness of what's happening with
01:55:55
this interaction and then what you can
01:55:57
do to manage that. >> I didn't realize
01:56:00
that my chatbot was giving me a tailored
01:56:03
experience until one day when I had a
01:56:05
debate with my friends about who the
01:56:06
best football player in the world was
01:56:08
and we all went to our ChatGPTs and
01:56:10
asked it and mine said Messi and his
01:56:13
said Ronaldo, and I thought he was
01:58:15
lying, so I said, video record it. And he
01:56:17
video recorded it and his gave him a
01:56:19
completely different answer to the same
01:56:21
question and
01:56:22
>> and did it know that you were each fans
01:56:23
of
01:56:24
>> Well, this is the thing I think it's got
01:56:25
such a huge amount of memory on me that
01:56:27
it knew what I wanted to hear. Oh wow.
01:56:28
It knew what
01:56:29
>> Yeah. It knew what I wanted to hear cuz
01:56:31
I'd probably gone through the World Cup
01:56:32
and
01:56:33
>> and then I realized, okay, so this is
01:56:34
not reality. This is it's a curated
01:56:36
version of reality that in some sense is
01:56:38
trying to please me or or retain me in
01:56:40
some way. And of course, once the
01:56:41
advertising model kicks in, retention
01:56:43
becomes the great incentive. What do you
01:56:45
think?
01:56:47
>> It's called sycophancy, by the way.
01:56:49
>> Yeah, I just learned that word.
01:56:50
>> It's like extreme. It's like
01:56:51
agreeableness at scale. It's like golden
01:56:53
retriever energy.
01:56:54
>> Like kissing your ass. It's like
01:56:55
professional kissing your ass.
01:56:56
>> Yes-man. What do you think of these AI
01:56:58
CEOs? Because it feels like they're
01:57:01
in a bit of a race
01:57:02
>> where, you know, if they don't do it,
01:57:04
then a national rival is going to do it.
01:57:06
If a national rival doesn't take
01:57:09
them out, China's going to do it.
01:57:11
>> And this is, we kind of saw
01:57:13
it with social media. How can they stop?
01:57:16
>> Because if they stop, yeah.
01:57:18
>> you know, they might say that they're
01:57:19
there's an existential risk.
01:57:21
>> There is like a build-the-plane-as-
01:57:22
you're-flying-it dynamic. And I think, on one
01:57:24
of your episodes, you know that I'm a
01:57:26
fan of this show and I actively listen
01:57:28
to this. I've told you this many times.
01:57:31
>> I think you had said on one
01:57:32
of your episodes, right, that you have a
01:57:34
friend who is very close to an AI founder
01:57:37
and I said this. Yes.
01:57:38
>> Yeah. And in public the founder says all
01:57:40
the right things and then behind closed
01:57:42
doors it's
01:57:43
>> Yeah. It was a horrifying thing and I
01:57:45
said this and the clip went viral and
01:57:46
people have been trying to hazard a
01:57:47
guess at who it was. I shouldn't
01:57:48
say who it was because it's
01:57:50
Chinese whispers at the end of the day.
01:57:51
It's someone that I'm very good friends
01:57:53
with, who verifiably spends time with
01:57:57
one of the biggest founders of an AI
01:57:58
company in the world. And he was with
01:58:00
him two weeks ago again and he said to
01:58:03
me that they're very aware that there's
01:58:05
a small
01:58:08
existential risk for humanity and
01:58:10
>> that's what they say publicly they say
01:58:12
it's small privately they say it's big
01:58:14
>> I mean, but even if it was 1%
01:58:17
>> it's a lot more than 1% they say
01:58:19
>> But I'm saying even if it
01:58:21
was 0.1%: if there was
01:58:23
anything that I was doing in my life
01:58:25
where there was a 0.1% chance that I
01:58:27
might wipe out everybody, I would
01:58:28
immediately stop doing that thing.
01:58:30
>> Yeah.
01:58:31
>> But but these numbers are much bigger.
01:58:33
I'm hearing 7%, 20%, 25%, depending on who
01:58:36
you ask. And I think acceleration in this
01:58:39
direction increases that percentage.
01:58:42
>> What do you think of these people? Like
01:58:43
what what's going on here?
01:58:44
>> Let's start with the collective
01:58:46
action problems, because each
01:58:49
company is competing with the other
01:58:50
companies and so they feel like they
01:58:52
have to go faster. And we know that
01:58:54
you know OpenAI has pushed some products
01:58:56
out before they did safety testing
01:58:58
because they had to get to market by a
01:58:59
certain date. So just the normal
01:59:00
business environment puts them all in a
01:59:03
collective action problem against each
01:59:04
other and then they all say we're in a
01:59:07
collective action problem against China
01:59:08
because if we don't do this then China
01:59:10
will. Now, one thing I learned, again, I
01:59:12
don't know if Tristan said this on your
01:59:14
podcast or whether it was on his
01:59:15
podcast, but it's that China is
01:59:18
focused on using AI to make its economy
01:59:21
more efficient, to make manufacturing
01:59:23
better and cheaper. They are using these
01:59:25
applications, which we've talked about
01:59:26
before, like we're totally there's lots
01:59:28
of great applications of AI. The Chinese
01:59:30
also have so many spies in America and
01:59:33
in the tech companies, and they can hack
01:59:34
into anything. So the point is, our
01:59:37
companies are in a
01:59:38
headlong race to create AGI, to create a
01:59:41
country of geniuses that can replace all
01:59:43
human workers, put us all out of work,
01:59:45
and run everything. They're
01:59:47
in a race to create that. And one of the
01:59:49
arguments is if we don't do it, China
01:59:51
will. But what I understood from
01:59:53
listening to Tristan and from his
01:59:54
conversation with you is that the faster
01:59:56
we go towards AGI, the faster China goes
01:59:59
because they just they just take all our
02:00:00
discoveries. So, can't we slow down on
02:00:03
the race to AGI and do more safety
02:00:06
testing? Um, you know, what we all saw
02:00:08
with Moltbook and, you know, communities
02:00:11
of agents who are talking to each other
02:00:12
and making up languages and even if part
02:00:14
of that was human-driven now, in a year
02:00:16
it's going to be much more than
02:00:17
what we saw. So, I think the the risks
02:00:20
are extraordinary. I think that some of
02:00:22
these guys, look, they've been in AI for
02:00:24
a long time. They might not have
02:00:25
realized the existential risk they were
02:00:28
putting us all in 10, 15 years ago, and
02:00:29
now they can't stop. They can't pull the
02:00:31
plug. They can't say, "Oh, let's shut
02:00:33
down the whole business." So, it is a
02:00:36
very, very risky time. And I think,
02:00:40
Dario Amodei, I just read his long essay on
02:00:40
the adolescence of technology. At least
02:00:43
you get the feeling he's really
02:00:44
wrestling with it, and I
02:00:45
think he's more open than some of the
02:00:47
others. But I don't know.
02:00:49
>> But when has morality ever been top
02:00:52
of mind for a tech leader? You might be
02:00:54
thinking if there's a 0.1% chance I'm not
02:00:57
going to do it. That's what I think as a
02:00:58
doctor. that's what you think as a
02:01:00
social scientist, but we're not AI
02:01:02
leaders, right?
02:01:04
>> Yeah. It's one of the great question
02:01:07
marks I just can't seem to get an answer
02:01:08
to. And then you've got this whole
02:01:10
robotics thing happening where Elon's
02:01:13
got his Optimus robots, and there's going
02:01:14
to be, he says,
02:01:16
10 billion of them at one point,
02:01:17
but I think his pay packet requires a
02:01:19
million of them to be out in the world
02:01:21
>> for him to make a trillion dollars.
02:01:23
Yeah.
02:01:23
>> Yeah. And I just AI, robotics, you
02:01:25
combine the two,
02:01:26
>> you get Terminator, right?
02:01:30
We laugh, but it's like,
02:01:31
>> yeah,
02:01:33
>> should we stop for a second and maybe
02:01:36
have a conversation about this? Can we
02:01:38
>> Yeah,
02:01:38
>> with commercial incentives in play, it
02:01:40
does feel like I don't feel hopeful.
02:01:44
>> Yeah, it's very hard to know how to stop
02:01:46
it. Um, but I just I want to just add
02:01:48
one point here, which we've
02:01:50
touched on a few times, and the
02:01:52
robotics will really bring it home
02:01:54
here. It is the loss of the
02:01:57
sense of meaning or purpose that many
02:01:59
people are feeling but especially young
02:02:01
people. The saddest graph in The Anxious
02:02:03
Generation, all the graphs look the
02:02:04
same. It's all a hockey stick. It's all
02:02:05
like nothing was happening, you know,
02:02:07
'90s to 2010, 2011, then all of a sudden
02:02:09
something happens. And the saddest one
02:02:12
is the one: my life feels meaningless.
02:02:14
do you agree with that? Disagree with
02:02:16
it? And the percent that agree, I
02:02:18
think it's, you know, something like
02:02:19
eight or nine percent, you know, agreed for
02:02:22
the millennial generation. I think it's
02:02:24
in chapter 7, the end of chapter 7 and
02:02:26
then it's fairly flat, and then all
02:02:27
of a sudden we hit this period, the
02:02:29
great rewiring 2010 to 2015. Uh so right
02:02:31
around 2013 it goes way way up. Um young
02:02:35
people feel useless. And I think the
02:02:37
reason is that they are useless. What I
02:02:39
mean is people need to feel useful.
02:02:42
People need to do things for other
02:02:43
people. That's how you feel useful. If
02:02:46
if you were to disappear, would the
02:02:47
world change? If yes, you're useful. Are
02:02:50
are people depending on you for
02:02:51
something? If yes, you're useful. So if
02:02:54
if kids are doing errands for the
02:02:56
family, they're useful. But as childhood
02:02:59
changed from a mix of things to just
02:03:01
consuming content, if that's all you do,
02:03:03
and 5 hours a day is the average for
02:03:05
social media, 8 to 10 on devices, not
02:03:08
counting school. If all you're doing is
02:03:10
just consuming content, you
02:03:11
are useless. Now, what's happening? The
02:03:14
chance to have a job where you actually
02:03:16
do something for people, you know? You
02:03:18
know, it used to be if you work in a
02:03:19
store, at least you're helping people
02:03:20
buy something and you might talk to them
02:03:22
and now you're just there watching as
02:03:24
they use the machine. The more
02:03:26
technology makes things easy and cheap
02:03:28
by replacing people, the more people
02:03:30
will feel, "My job is to just I don't
02:03:33
have a job. It's just consume content."
02:03:35
The AI guys tell us, "Oh, such
02:03:38
abundance. Oh my god, it's going to be
02:03:40
such abundance. No one will have to
02:03:42
work. We'll give everyone UBI. We'll
02:03:44
give everybody, you know, universal
02:03:46
basic income." That is hell on earth.
02:03:49
What's going to happen? Certainly for
02:03:50
the boys, most of the boys, it's just
02:03:52
going to be video games, porn, and
02:03:53
gambling. So, if you simply give
02:03:56
people money to do nothing, you
02:03:58
guarantee they're going to feel useless
02:04:00
and then the suicide rate will continue
02:04:02
to go up. So, this is the world that the
02:04:04
AI guys are taking us to, a world in
02:04:06
which there's nothing left for people to
02:04:07
do. They say that they will give up
02:04:10
some of their trillions and somehow
02:04:12
let it be taxed or diverted as UBI, but
02:04:14
that's never happened before. So, it's
02:04:16
not likely to happen in this case. So,
02:04:18
again, I don't know what to do, but
02:04:20
we've got to start showing that we can
02:04:22
do something, we've got to be talking
02:04:24
about this, and we can't be welcoming AI
02:04:26
in everywhere. We've got to be wary and
02:04:29
vigilant. Yes, there are some uses, but
02:04:31
Silicon Valley has tricked us so many
02:04:33
times and enshittified so many of the
02:04:36
apps that we use. We have to expect that
02:04:38
the same is going to happen with our
02:04:39
beloved chatbots and our beloved
02:04:41
ChatGPT.
02:04:43
>> This graph on page 195 of your book,
02:04:46
titled "Life often feels meaningless,"
02:04:48
is the graph you mentioned; I'll throw
02:04:49
it up on the screen. It is shocking,
02:04:52
shocking just to look at. Suddenly
02:04:55
there's this huge spike in
02:04:57
meaninglessness
02:04:59
amongst high school seniors.
02:05:03
>> What is it to live a meaningful life?
02:05:06
What does that mean?
02:05:07
>> Yeah, so my first book, "The Happiness
02:05:10
Hypothesis," addresses that question
02:05:11
very directly.
02:05:12
The first hypothesis you might
02:05:15
have about happiness is that it comes
02:05:17
from getting what you want. You set
02:05:19
out on a goal, you get your goal, you're
02:05:20
happy. It's very short-lived. You're
02:05:22
happy very briefly, and then you're on
02:05:24
to the next thing. The more
02:05:26
sophisticated happiness hypothesis is
02:05:28
that happiness comes from within. This
02:05:30
is what the ancients tell us, East
02:05:32
and West, Buddhist and Stoic: don't try
02:05:34
to make the world conform; change
02:05:36
yourself. Accept the world the way it is.
02:05:39
That's better. But the conclusion
02:05:42
I came to as a modern social
02:05:44
psychologist working in positive
02:05:45
psychology was that the best way to say
02:05:47
it is that happiness comes from between.
02:05:50
What I mean by that is that humans
02:05:54
evolved as almost hive-ish creatures. We
02:05:55
evolved in intensely social groups,
02:05:58
never being alone, lots of gossip, lots
02:06:01
of conflict, always intensely social.
02:06:04
And modernity has made it possible for
02:06:05
us to not live that way. We've come apart.
02:06:07
There are many advantages to that. But
02:06:09
we feel we're missing something.
02:06:10
We're lonely. We feel
02:06:13
something is not right. And so the
02:06:16
conclusion I came to is that happiness,
02:06:18
a sense of a full, satisfying,
02:06:21
meaningful life, comes when you get
02:06:24
three "betweens" right. The relationship
02:06:27
between yourself and others, love broadly
02:06:29
speaking, not just romantic but friends,
02:06:31
family. Between yourself and your work:
02:06:35
as humans, we need to be productive. We
02:06:37
need to be doing something that matters,
02:06:39
that affects other people. And the
02:06:43
relationship between you and something
02:06:45
larger than yourself: we need to be part
02:06:47
of something that endures, part of a
02:06:49
tradition, so that we can look to the
02:06:51
future. What I do matters
02:06:53
for this group or this mission. As
02:06:55
an academic, I feel like I'm connected
02:06:57
all the way back to Plato, and I hope all
02:06:59
the way forward in time to future
02:07:01
psychologists and future
02:07:02
scholars. So if you get those three
02:07:06
right, then you will be as happy as you
02:07:08
can be. You'll be as happy as your genes
02:07:10
and childhood allow you to be. And when
02:07:14
you put it that way, what we can see is
02:07:17
that social media and AI interfere with
02:07:19
all three. The relationship between
02:07:21
yourself and others: social media
02:07:24
gives you lots and lots of shallow
02:07:25
relationships, which crowds out the real
02:07:27
ones; you don't have time for real
02:07:29
people. So the technology is blocking
02:07:31
the relationship between ourselves and
02:07:33
others and taking it over. Ourselves and our work: work is
02:07:36
going to be taken over by the machines.
02:07:37
And it's already becoming more
02:07:39
soulless and isolated. And then yourself
02:07:41
and something larger than yourself:
02:07:43
humans have to live in a moral matrix.
02:07:45
We co-create a set of meanings and
02:07:48
traditions. We need a sense of history,
02:07:50
of who we are, where we came from. All
02:07:52
that's getting shredded. Everything is
02:07:54
just little bits. People don't read
02:07:56
books. Imagine if all of the accumulated
02:07:59
wisdom of humanity in books were just
02:08:00
gone, just gone. Nobody will read it;
02:08:02
young people are not reading books.
02:08:04
It's very hard for them to read a book
02:08:06
now because of their attention spans. So
02:08:08
if we lose a sense of history, if we lose
02:08:10
an ability to co-construct reality,
02:08:13
then it'll be hard to imagine anything
02:08:14
larger than ourselves that we're
02:08:16
connected to. So I am a techno-determinist
02:08:20
in the sense that I think the tech
02:08:22
doesn't determine everything, but
02:08:23
you have to start with the technology,
02:08:24
because that changes the ground upon
02:08:26
which we live, the zone in
02:08:28
which we're trying to construct
02:08:29
meaningful lives. Start with that and
02:08:31
then you can see what the obstacles are.
02:08:33
And that's why I take a much more
02:08:36
intemperate position, I guess I'll
02:08:38
accept the word,
02:08:39
because I think we don't
02:08:41
have much time here. We have to
02:08:42
reclaim life in the real world for our
02:08:45
kids and for ourselves. There is no way
02:08:47
to find a happy, meaningful life if we
02:08:49
make the full transition to the online
02:08:51
AI robot world.
02:08:55
>> And what, in your perspective, is a
02:08:58
meaningful life, and how does it differ from Jonathan's?
02:09:00
>> I loved Jonathan's description; it was
02:09:02
so beautiful. I have given patients a
02:09:05
prescription for what creates
02:09:08
a meaningful life, and it is to live a
02:09:10
lifetime in a day. That sounds
02:09:13
like this big thing, but all it is is
02:09:15
this: when you start your day,
02:09:18
think about five things that
02:09:21
you can do that day to create the arc
02:09:23
of a long and meaningful life in one
02:09:26
day. So what does that mean? Spend a
02:09:27
little bit of time in childhood, so in
02:09:30
wonder and play. Even if it's for a few
02:09:32
minutes, do something that brings you
02:09:33
joy for joy's sake. Spend a little bit of
02:09:36
time in work. We all know what that is.
02:09:39
And for most of us, it's a lot of time.
02:09:41
But it doesn't have to be
02:09:43
paid work, just something that helps
02:09:44
you feel a sense of productivity and
02:09:46
agency, that I can do difficult things
02:09:48
and I can overcome. Spend a few minutes
02:09:51
in solitude, very important for all of
02:09:53
the reasons that we've talked about
02:09:54
today. Spend some time in community, so
02:09:58
engaging with others. And then spend
02:10:00
some time in retirement, or in
02:10:02
reflection, really taking stock of your
02:10:04
day. So at the end of the day when
02:10:07
you're going to bed and you're putting
02:10:08
your head on your pillow, you can say,
02:10:09
"Okay, yes, I lived a meaningful life. I
02:10:11
did all of those things." And so if you
02:10:13
do a little bit of that every day, you
02:10:15
can make a difference. And a reason I
02:10:17
give that prescription is that I've had
02:10:18
patients, guitar players,
02:10:21
people who love playing the
02:10:22
guitar but don't play it
02:10:24
all week. I don't
02:10:27
see patients currently, but they've said
02:10:28
to me, when I ask,
02:10:30
"What do you like to do for fun?", "Oh, I
02:10:31
like playing guitar, but I don't play
02:10:32
it." "When do you play?" "I don't know.
02:10:34
Once a month, once every three months."
02:10:35
And I'm like, "Do you have a guitar at
02:10:37
home?" "I have a guitar at home. Too
02:10:38
much happening, work and family life,
02:10:41
etc." So then I said, "Well, why don't
02:10:43
you just play the guitar a little bit
02:10:44
every day?" It's that
02:10:46
all-or-nothing fallacy: if I
02:10:48
don't have an hour to play guitar, I'm
02:10:49
not going to do it. But the joy that it
02:10:51
can bring you, that meaning and purpose,
02:10:53
is tremendous. So that's
02:10:55
what I use: live a lifetime in a day. And
02:10:56
the reason is that, when you look at how
02:10:59
your brain and body react to happiness,
02:11:01
there are two distinct types of
02:11:04
happiness: hedonic happiness and
02:11:06
eudaimonic happiness. Hedonic
02:11:09
happiness is all about
02:11:12
what we've talked about: social media,
02:11:14
consumption,
02:11:15
pleasure.
02:11:17
And the other type is eudaimonic
02:11:20
happiness: meaning, purpose, connection,
02:11:24
community, growth-oriented activities.
02:11:26
So when you live a lifetime in a day, you
02:11:28
go towards that eudaimonia, which can
02:11:31
then help you overcome the hedonic.
02:11:34
Because in your brain there's something
02:11:35
called the hedonic treadmill, and the
02:11:37
treadmill is a thing in your brain where,
02:11:39
no matter what you do, and this is like
02:11:41
the Instagram lifestyle, right, no matter
02:11:43
what happens, you need more of it, you
02:11:44
need more of it. Same thing with brain
02:11:46
rot; you can never get
02:11:48
enough. That's the hedonic treadmill.
02:11:51
But you do not have a treadmill for
02:11:53
eudaimonic happiness.
02:11:55
>> That is really beautiful. I've
02:11:56
never heard an approach like that;
02:11:57
it gives you a
02:12:00
much bigger view of your day. Live a
02:12:01
lifetime in a day. If I were going to
02:12:03
offer some specific advice, first I'll
02:12:05
offer advice to parents. Here's the
02:12:08
rule. So, I did a really good job
02:12:10
keeping my kids off social media, but I
02:12:11
didn't pay enough attention to computers
02:12:13
and everything else because it was
02:12:14
during COVID. The rule I wish I had
02:12:16
followed, I recommend to all parents,
02:12:18
especially with younger children, is to
02:12:20
have the clear rule: no devices in the
02:12:22
bedroom, no screens in the bedroom, ever.
02:12:24
That's just our family rule. We have a
02:12:26
TV in the living room. We have a
02:12:28
computer. You can sometimes use those.
02:12:30
But we never take screens into the
02:12:32
bedroom, at least for kids.
02:12:33
Maybe later on you'll have to relent in
02:12:35
middle school; they'll have so much
02:12:35
homework they can take the laptop in.
02:12:37
And if you live in a small
02:12:38
apartment, of course, it's difficult.
02:12:39
But if you can afford to
02:12:41
have that rule, that's the main rule I
02:12:43
wish I had followed in my family. And that
02:12:45
will make everything a lot easier. Also,
02:12:47
the same thing at the dinner table: no
02:12:48
devices. We don't have screens at the
02:12:49
dinner table. So that's a
02:12:51
specific thing for parents to do. For
02:12:55
everyone else, for all adults,
02:12:56
the advice is you have to
02:13:00
reclaim your attention, because your
02:13:01
attention has been largely taken from
02:13:03
you. At least a lot of it has. You have
02:13:05
to reclaim it. And here are the three
02:13:07
things that I do with my students;
02:13:10
you can do them very quickly, and I can
02:13:12
just explain them. The first is you have
02:13:14
to get your morning and evening routine
02:13:16
right. The great majority of people, as
02:13:17
soon as they open their eyes, are on
02:13:18
their phone; it's the last thing at
02:13:20
night, and it's everything in between. So
02:13:22
you have to have a good morning routine.
02:13:24
What are the first seven things you want
02:13:25
to do after you open your eyes? At
02:13:28
a certain point you can check your phone,
02:13:29
but it shouldn't be in the first few. Then
02:13:31
do things to set up your own day
02:13:33
otherwise your day will be taken by your
02:13:35
phone. It'll be controlled by your
02:13:36
phone. So you've got to reclaim your
02:13:38
morning and your evening. That's step
02:13:40
one. Step two: you have to shut off
02:13:43
almost all notifications. Go into your
02:13:45
settings and just look at
02:13:46
what's giving you all the
02:13:48
notifications. Most of my students get
02:13:50
an alert every time they get an email.
02:13:53
>> They have those alerts on because
02:13:55
they don't want to miss
02:13:56
anything, but they don't understand that
02:13:57
if you are always being alerted, then you
02:13:59
miss everything else. So shut off alerts
02:14:01
for almost everything. Obviously Uber
02:14:03
and Lyft you want to keep on; you want
02:14:04
to know when the car is coming. But for
02:14:06
news outlets and everything else, get a
02:14:08
daily email; don't get alerts. And then
02:14:10
the third, as I said before, is get rid
02:14:12
of all the slot-machine apps. Whatever
02:14:14
apps you habitually use, whatever apps
02:14:16
you feel a compulsion towards, you have to
02:14:18
get them off your phone. And in that way,
02:14:19
your phone is no longer a dopamine
02:14:22
trigger that's going to always call out
02:14:23
to you like an addictive product. Do
02:14:25
those three things, you'll reclaim a lot
02:14:27
of your attention.
02:14:28
>> I would add "stop, breathe, be."
02:14:31
>> Stop, breathe, be.
02:14:32
>> It's a 3-second brain reset. Before
02:14:34
you check your devices, before
02:14:36
you engage: stop, breathe, and be. Ground
02:14:41
yourself in the present moment. What it
02:14:42
does is decrease that what-if,
02:14:46
future-focused thinking. Anxiety is a
02:14:48
future-focused emotion, and this gets you
02:14:49
back into the here and the now. And so
02:14:51
maybe the compulsion, you're
02:14:53
bored, you're checking, what about doing
02:14:55
something else? We
02:14:57
often use that checking as a substitute
02:15:00
for many things. And so it gives you
02:15:02
that opportunity. And then the rule of
02:15:04
two is something that we haven't talked
02:15:05
about, which I would love to propose
02:15:08
today: your brain can really
02:15:10
only handle two new changes at a time.
02:15:12
So of all of the things
02:15:15
that we've talked about, if
02:15:16
you want to try them in your life, give
02:15:18
yourself two at a time. Give yourself
02:15:21
eight weeks, and then add two more, and
02:15:23
two more. This is why New Year's
02:15:24
resolutions fail: we try to do everything
02:15:26
all at once. So just go step-wise, two at a time.
>> Jonathan,
02:15:29
you've just written this book, which is
02:15:30
now out, called "The Amazing Generation,"
02:15:33
with beautiful, beautiful
02:15:36
illustrations. I'm assuming this one is
02:15:38
for slightly younger audiences.
02:15:40
>> It's for ages 8 to 13. Yes.
02:15:42
>> And who should buy this and who should
02:15:44
they buy it for?
02:15:44
>> It turns out that kids 8 through 80
02:15:48
actually love it, even adults. They're
02:15:49
buying it for their kids, but it
02:15:51
lays out the basic ideas of
02:15:53
"The Anxious Generation," explains
02:15:55
dopamine, and explains the business
02:15:57
model. And it does it in a really fun
02:15:59
way, and it's working beyond our wildest
02:16:02
dreams. If you look at the Amazon
02:16:04
reviews, it's full of parents who said,
02:16:05
"I left it on the kitchen table. My kids
02:16:08
came home, they grabbed it, they fought
02:16:09
over it, they read it, they each read it
02:16:11
in the in the first couple days, and
02:16:12
then they said, "Mom, when I go to
02:16:14
middle school, I don't want a
02:16:15
smartphone. Just give me a give me a
02:16:17
flip phone. Give me a basic phone.
02:16:18
Because the book is about how to be a
02:16:20
rebel. It's about how to reject this
02:16:22
control that the company's trying to put
02:16:24
on you and how to live a life that you
02:16:26
choose full of real freedom, friendship,
02:16:29
and fun.
02:16:30
>> And also "The Five Resets," which is a
02:16:32
book we talked about before on this
02:16:33
show: rewire your brain and body for
02:16:35
less stress and more resilience. Another
02:16:37
smash-hit bestseller that everybody's
02:16:39
been talking about. Who's it for?
02:16:41
>> It is for anyone who is struggling with
02:16:44
stress, overwhelm, and burnout. It's to
02:16:47
help you feel a sense of calm and
02:16:49
clarity in this anxious, uncertain
02:16:52
world. Everything in it is free. That's
02:16:55
something that's really important to me
02:16:56
as a doctor. Every suggestion I ever
02:16:58
offer will always be cost-free, because I
02:17:00
think about patients with varying
02:17:02
resources. It's all science-backed and
02:17:04
it's totally practical. You don't have
02:17:05
to go to Bali and have a sabbatical. You
02:17:08
can rewire your brain today, right now,
02:17:10
in the midst of all of this chaos.
02:17:12
>> Thank you to both of you. I've learned
02:17:15
so much, and I really, really mean that.
02:17:17
I feel sufficiently pushed
02:17:20
to make change in my life, and
02:17:23
I need to go think about this, because
02:17:25
I am most certainly struggling with
02:17:29
my addiction to my phone, and I can feel
02:17:31
it hurting my relationships, especially
02:17:32
now as a fiancé. My fiancée talks to
02:17:34
me about it all
02:17:36
the time, and I want to be present. I
02:17:38
want to be present for my kids when I
02:17:39
have them, and I'm slightly concerned
02:17:40
right now that I won't be unless I take
02:17:42
some kind of drastic action in the
02:17:44
direction of getting my attention back
02:17:46
and reclaiming it. Thank you so much for
02:17:48
the work that both of you do. I can't
02:17:49
say it enough, because it's so important.
02:17:51
You've reached so many millions of
02:17:52
people, and you're both changing
02:17:54
the world in a way that my
02:17:55
words would not be able to capture.
02:17:58
But just thank you, and please keep going.
02:17:59
If there's anything more that I can
02:18:01
do to support both of your causes,
02:18:03
please do let me know. And
02:18:04
on behalf of the many
02:18:06
millions of people that are with us
02:18:08
right now, thank you so much for
02:18:09
saving our children.
02:18:11
>> Thank you, Steven. Thank you for giving
02:18:12
the world so many opportunities to
02:18:14
accommodate and create new mental
02:18:16
structures.
02:18:17
>> It's always such a pleasure to join you,
02:18:19
Steven. And truly, I feel like you are
02:18:22
changing the world as well.
02:18:24
>> Thank you. We're done. Thank you.
02:18:27
YouTube has this crazy new algorithm
02:18:28
where they know exactly what video you
02:18:30
would like to watch next, based on AI and
02:18:33
all of your viewing behavior. And the
02:18:34
algorithm says that this video is the
02:18:38
perfect video for you. It's different
02:18:39
for everybody watching right now. Check
02:18:41
this video out, and I bet you might
02:18:43
love it.

Badges

This episode stands out for the following:

  • Most heartbreaking (80)
  • Most shocking (75)
  • Best concept / idea (75)
  • Most influential (75)

Episode Highlights

  • The Impact of Short Form Videos @ 01m 29s
    Short form videos are identified as a major threat to attention spans and cognition.
    “Oh my god, yes, that would be the most important thing you can do for your intelligence.”
  • The Impact of Short Form Videos @ 16m 59s
    Studies show that kids who spend too much time on short form videos are more depressed.
    “It's looking like the kids who are spending a lot of time on this are doing much worse.”
  • The Addictive Nature of Social Media @ 32m 10s
    Internal documents reveal Meta's awareness of the addictive nature of its platforms.
    “They designed it to be addictive. They’ve done research to make it maximally addictive.”
  • AI and Human Connection @ 42m 34s
    AI chatbots are emerging as a solution for loneliness, but they may reshape human connections.
    “AI is coming to hack our attachments.”
  • The Choking Challenge @ 53m 24s
    A tragic story highlights the dangers of viral challenges on social media.
    “Her son Jules was found dead. Happy kid found dead, strangled.”
  • The Importance of Boundaries @ 01h 01m 46s
    Creating boundaries around technology use is crucial for mental health.
    “Boredom might be the antidote.”
  • The Impact of EdTech @ 01h 16m 08s
    The introduction of computers in classrooms has led to a decline in educational outcomes.
    “We made a colossal blunder with edtech in the 2010s.”
  • Popcorn Brain Explained @ 01h 26m 45s
    Understanding the societal phenomenon of popcorn brain caused by overstimulation.
    “Popcorn brain is when offline feels slow and boring because of overstimulation.”
  • Global Turning Point for Childhood @ 01h 35m 12s
    December 10th marked a significant change in addressing social media's impact on children.
    “December 10th was the global turning point in the battle to reclaim childhood.”
  • The Race to AGI @ 01h 57m 16s
    Tech companies feel pressured to race towards AGI, fearing national rivals like China.
    “If they stop, they might face existential risk.”
  • Meaninglessness Among Youth @ 02h 02m 12s
    There is a shocking spike in high school seniors reporting that their lives feel meaningless.
    “The saddest graph shows my life feels meaningless.”
  • Reclaim Your Attention @ 02h 13m 00s
    Learn how to take back control of your focus and daily routines.
    “Reclaim your attention because your attention has been largely taken from you.”

Key Moments

  • Short Form Videos @ 23:51
  • Addiction Awareness @ 31:34
  • Meta's Influence @ 36:46
  • Boredom as Antidote @ 1:01:46
  • Reversible Conditions @ 1:28:43
  • Existential Risk @ 1:58:25
  • Reclaim Your Day @ 2:13:38
  • The Amazing Generation @ 2:15:33
