
Under Secretary of State Sarah B. Rogers on dismantling the "Censorship Industrial Complex"

January 22, 2026 · 45:04
David and I are staying in a 300-year-old house, and, uh, we've both smashed our heads on the beams twice already. Uh, but this is both our first Davos, David. It's our first Davos. We've been here for 24 hours, and, um, any first impressions here?
>> Um, it's interesting. You know, we're staying very far away.
>> Yeah.
>> Apparently they didn't want you to be part of this.
>> They didn't want me too close. Yeah.
>> But we finally got you an invitation. Your invitation managed to not get lost in the mail.
>> My invitation didn't get lost in the mail this time.
>> For those of you who watch the pod, you know what I'm talking about.
>> Inside joke. Um, but it's great to be here, and great to be here at USA House. Thanks to all the sponsors, and we're really delighted for our first guest for the pod. Sarah Rogers is the Under Secretary, uh, for Public Diplomacy at the State Department. For members here who don't know this position, or what you've been charged with, or what you've decided to work on, I'm curious about that. Do they tell you what to do, or do you come up with your own mandate? But yeah, tell us everything about what you're doing.
>> Longtime listener, first-time guest, and thank you to both of you. Uh, thank you from all of us at America House for joining us here.
>> Yeah.
>> So I am the Under Secretary for Public Diplomacy, and when I got this nomination, my friends and family all congratulated me and then invariably said, what is that? Uh, so diplomacy traditionally concerns the relationship between the American government and foreign governments. Two ambassadors shake hands, make a deal, um, solve a war. But public diplomacy is different. Public diplomacy addresses the relationship between the American government and foreign publics. And this has become a very important under secretariat with the rise of the internet, and then, during the Biden administration especially, uh, these mushrooming concerns about so-called disinformation: what do we do when there are allegedly malign influences on the public view of America, um, the public's intersection with American interests, how do we interact with the internet and the information ecosystem? That is part of my portfolio. I also oversee other soft power activities, including our educational and cultural and sports diplomacy. So I am privileged to play a role in the World Cup this summer and the LA Olympics coming up, and, uh, the Fulbright program and others like it.
>> You seem particularly focused on freedom of speech, and a little bit of tension between our standards and the companies in America, which have made the move to being strongly free-speech, something that kind of got lost in our industry for a couple years in technology but has now made, I think, um, some significant progress. It does seem like some folks in Europe don't share our love of freedom of speech. Maybe you could explain to us what the tension is today, and what some of the regulations are that have been put in place in Europe.
>> Sure, absolutely. So there are two main regulations that I've interfaced with since taking office, and part of this is just a product of my first official trip being to Europe; while I was in Europe, a large fine came down on an American platform, X, under the Digital Services Act, which I'll get into in a moment. So Europe, um, especially since the Second World War, but I think really since the American founding and our codification of the First Amendment, you know, America has taken a much stronger approach on free speech than even most of the West. And with the rise of the internet, and all communication, or a lot of communication, becoming transnational, we see these new technocratic regulatory frameworks in Europe bumping up against the commitments to free speech in the United States. And Jason makes an important observation: for a while, some of these large American technology platforms were more inclined to moderate, or to censor, kind of in conformity with some prevailing, um, some prevailing norms and concerns in the United States. But I think in the United States we've shifted back toward a less censorious approach, and so have these platforms. And at the same time, you have, uh, regulatory efforts in Europe and the UK. And I'll name a couple that I think have been particularly relevant. So the UK has something called the Online Safety Act. The Online Safety Act imposes age-gating obligations on a broad swath of content, almost any content that's upsetting, and then requires platforms to run risk assessments for, and in some cases remove, content that the UK would say is illegal. And in the UK, you know, major categories of content are banned, are rendered illegal, that would not be illegal in the United States, which is where these platforms are located, which is where their original user base is, which is where their executives live, um, and which is their primary regulator. So, under the Online Safety Act, we now have, um, active litigation by the relevant regulator against several American websites. These are websites that don't reach into the UK. These aren't websites dedicated to discussing the Queen. They're not websites that sell goods in the United Kingdom. These are websites that exist on American soil, host large quantities of American users, and oftentimes discuss American political topics. But because users are permitted to discuss them in a way that offends UK law, there's the imposition of a UK fine. The Digital Services Act in the EU is similar. So the DSA contains, but doesn't just contain, uh, content-based regulations, hate speech regulations. So the DSA requires all of the EU member states to adopt, at minimum, kind of a floor for hate speech prohibition. And those prohibitions in the statute, I think, are much vaguer than American lawyers are accustomed to. And one of our jurisprudential principles under the American First Amendment is that if you're going to enact any regulation that comes close to touching speech, it needs to be very clear what you are prohibiting, because you have this chilling-effect concept. A vague prohibition will chill speech, especially when that prohibition is imposed on a large risk-averse corporation. So you impose vague prohibitions on large risk-averse corporations, and that's how it becomes illegal to make jokes around the water cooler, for example. You see the same effect here. The Digital Services Act also regulates other aspects of, um, digital commerce and social media. So it regulates, um, things like transparency and competition. And I think, you know, we have a lot of Europeans in the audience today, and I hope none of them will find it contentious if I suggest that in Europe there is more of a focus on, uh, technocratic regulation as an arbiter of what's acceptable than there might be in America, where we have this tradition that, um, really emphasizes, like, rugged individualism and individual conscience. And to be clear, no one is saying, certainly not the State Department or America, hey, you can't have your own platforms in Europe, right?
>> Build your own. Build your own Facebook, build your own Instagram, build your own Twitter/X, uh, TikTok, whatever you'd like to build, and you can have whatever standards you like on your platforms. We're saying, "Hey, these are our platforms. This is our standard, and we don't want our users, uh, or our platforms to be receiving fines." That's our position.
>> I think that's basically it. And look, when American companies operate abroad, they abide by the laws where they operate. But at a certain point... so we recently issued some sanctions, which we'll get into. And one of the individuals we sanctioned, uh, was a former EU official who threatened Elon Musk with enforcement action because X, within the United States, had said that it was going to host, on a live, uh, Twitter Space, an interview with Donald Trump, our president. So it wasn't that Donald Trump had said anything violative. It wasn't that there was a specific piece of content that the EU wanted to ban. It was just that the act of an American business hosting an interview with an American president might offend, uh, EU preferences about speech, and that generated a regulatory threat. And when you reach across borders and make a threat like that, that offends American interests and American values, and so you can expect America to respond. And I think... so my history is as an American lawyer in American courts, and, you know, we're a nation of 50 states, and each state has its own regulations, and we've had to think about, you know, when there's a website in California that operates in Texas, how do you decide to what extent Texas gets to regulate? And we have all these jurisdictional concepts, like does the website purposefully avail itself of the forum, are you posting defamatory statements about a person in Texas. But the mere existence of a website in California that Texas doesn't like is hardly ever, basically never, a basis for regulation. And so when we talk about things like extraterritoriality, you know, what we're really talking about is this: it's undisputed that Europeans get to have their own laws in Europe, but we also get to have our own laws in the United States. And we're celebrating 250 years of American independence. And so we want, you know, we want our markets to be able to interoperate online. Um, but we're not willing to give up American freedom of speech in the bargain.
>> Hey, David, when we look at, and I'm asking you this one so I can give you a pass on it, but what do you think people are so scared of in the UK when it comes to freedom of speech? Uh, and maybe the most, um, raucous platform, X, specifically.
>> Well, I don't think the people are afraid. I think the government is afraid of the people criticizing it, and therefore they're engaged in what censors always do, which is protect the people in power. Um, there's something, Sarah, you should explain this to us, but as I understand, there have been over 12,000 people, is it prosecuted, or...
>> Arrested.
>> ...arrested under the Online Safety Act. Was that just in one year, or is that since it...
>> That was in 2023 alone.
>> Okay.
>> But that isn't just under the Online Safety Act. So I think what's particularly insidious, and particularly relevant, about statutes like the OSA and the DSA is that these are portals through which existing censorship laws get applied to the internet. So a lot of these Brits are arrested under, um, existing statutes; there's a Communications Act, there's a law against inciting quote-unquote racial or religious hatred. And we, I think, have differences of opinion about, you know, what amounts to incitement in America versus the UK. But so, for example, uh, you had a comedian called Graham Linehan who tweeted that, um, if a woman sees a penis in a ladies' room, you know, she should feel free to kick that guy in the balls. And that's something a lot of comedians say, and I think it channels an impulse that a lot of Americans and Europeans would frankly consider common sense. But he was dragged out of the airport like a terrorist, had his devices confiscated, was thrown in jail overnight, and lost access, my understanding is, to his heart medication, if I recall correctly.
>> And because this was an incitement to violence?
>> Because this offended some existing law against, uh, provocative speech in the UK, and the Online Safety Act is a device through which all of those existing laws get applied to the internet. You had another case in the UK where Joey Barton, a footballer, um, called somebody a "bike nonce," which... nonce is not an American term, but I imagine you're insinuating someone is effeminate for riding a bike so much, or in the manner that he rides the bike. And that resulted in a suspended prison sentence, but still a prison sentence.
>> Because he called somebody desty.
>> Yeah, basically.
>> That's what we call it in the United States.
>> And there were some other tweets, too, but none that would meet the bar for American incitement. So, David, you're absolutely right. That was, in a single year, slightly over 12,000 Brits arrested for speech acts, and that is more than were arrested that year in Russia, more than in China, more than in Turkey. And when you talk to Brits about this, you know, you're absolutely right: most of the British people that you talk to say this is totally unacceptable. And if you look at the polls in the UK, you see public sentiment against this kind of thing. But I've had, you know, both public and private engagements with regulators in these countries. And the defense you hear is, well, you know, we have a less chilling, less totalitarian environment than China, so maybe more people are willing to break the rules, more people are willing to offend. But if you arrest 12,000 people a year for speech, and you're raising children in an ecosystem where you can be dragged out of the airport for offending the dogmas of transgender activism,
>> then you might not have a different culture than China for long. And why should the United States be paying to defend your country, and support it in fighting, say, a proxy war against Russia, if those are basically the values that are being enforced?
>> Right, exactly. When we interact with our NATO allies in the NATO context, we hear a lot about our shared history and shared values. And it's time to ask, you know, what values do we still share? We, together with our allies, comprised the free world after World War II, the free world that was assembled against communism. But the cornerstone of a free world, of any free society, has to be freedom of speech.
>> And criticizing, and the uncomfortable speech, is where this actual defense is necessary.
>> Yes.
>> And we have a very special, you know, bent in the United States to really go after our leaders. I do it every week with David, um, since he's now a public servant. I mean, we go at it. And they're knocking on people's doors strictly for saying, like, "Hey, you know, I might have disagreements with the Catholic Church. I'm a Catholic."
>> Well, a lot of it's about immigration, right? I mean, I've seen a bunch of these examples, you know, on X. I saw, um, one clip on X where a judge was handing down a two-year sentence against somebody, I don't know if this rings a bell, but supposedly for speech that I think was criticizing the UK's open immigration policies. That's where I sense a lot of the prosecutions are. Right?
>> Right. And this is another place where, you know, free speech and freedom of expression are American values and interests in and of themselves, but another priority for the administration is common sense on mass migration. And a lot of the speech that offends those in power has to do with migration policy. So, there was a 31-month sentence handed down to a suburban mother named Lucy Connolly in the UK, uh, because after a man called Axel Rudakubana stabbed, I think it was, a seven-year-old girl, an eight-year-old girl, and a nine-year-old girl at a birthday party, there was ensuing unrest, and she tweeted, uh, something anti-migration. And it was pretty inflammatory, but it would have been unambiguously legal in the United States. Um, she said, and I'm paraphrasing slightly, but I remember it pretty well: if this is what migration means, then burn down the migrant hotels for all I care. Uh, this was a bereaved mother who'd lost a child. She saw three little girls murdered for no reason. And she reacted, and then she felt bad and she deleted the tweet. That was a 31-month sentence.
>> 31 months.
>> A 31-month sentence in the United Kingdom. Meanwhile, you have actual pedophiles, actual child sex offenders, who get minimal prison or none in the United Kingdom. And that's led to this, um, epithet that you hear among UK activists, this quote-unquote two-tier policing, this activist cause that they've assembled around, because they sense that if you oppose mass migration, if you make that kind of critique, you are subject to a different justice system than the kind of person who merely agitates for Sharia law in Britain, or merely downloads child pornography in Britain.
>> Yeah.
>> So, okay. So...
>> I emphasized that for...
>> Yeah. So, I mean, so I think, um...
>> That was the gong of righteousness.
>> We appreciate it.
>> And then, just to add one more dimension to this: uh, so US companies have been getting fined like crazy, right, in the UK and then the EU. Um, and I think it's related to this issue, but can you just describe that? Like...
>> Yeah.
>> Um, because that's where this crosses over from, you know, an ally doing something that we think is mad into directly hurting American interests, I guess. Right? Or...
>> Right. So I don't believe there have been any big fines under the UK Online Safety Act yet, but, um, its provisions take effect over time, and some of those provisions are just coming online now, including the ones relating to AI.
>> Okay.
>> And we have active litigation in American courts right now. Um, one of the leading lawsuits involves the website 4chan, which people who are very online in America may be familiar with. 4chan is kind of a no-holds-barred, uh, primordial soup for memes and the like. The cat memes come out of there. A lot of activism, Occupy Wall Street, came out of 4chan. But 4chan has essentially no censorship rules. It bans child pornography; that's pretty much it. And so the UK has decided that 4chan is not allowed to exist unless it pays a bunch of money to the United Kingdom for not policing its speech in accord with, uh, UK laws, the numerous UK speech statutes that led to prison sentences like the one I just discussed. But there was a large fine handed down during my recent European tour against X, and I believe it was around 140 million euros, but that might be dollars. I'm...
>> Elon, because they disagree with his influence, uh, in...
>> ...uh, the UK.
>> Look, I can't speak for the UK regulators, but I can make inferences.
>> What's your...
>> X has a particular political valence. Um, we saw Joe Biden, after Elon acquired Twitter, saying, you know, we've got to find ways to go after him. And I think that sentiment might be shared. But as an Under Secretary of State, I'm not an advocate for one American company, or even one American viewpoint on the free speech issue. If any American company were fined, let's say, $140 million by a foreign power for upholding the American First Amendment, um, if General Motors were treated that way, you know, the US government would have something to say about it. I also think that X is not the first company to be fined under EU digital regulations. So there's an infographic that circulated recently comparing, you know, revenues raised within the EU through other metrics, and then revenues raised just by fining American tech companies. And there's a suspicion that this is really kind of a de facto tax, and pretexts are contrived for fining large American tech companies in order to raise revenue.
>> Yeah. So that was the thing I think I was referring to, is that, um, and I think actually the president may have Truthed that out, um, that, maybe this is more the EU, but the DSA has become almost like a digital speed trap to try and fine American companies. And it does massively disproportionately affect them, to the point where you could argue that it's effectively like a tariff on American tech companies operating in Europe. And if that's the case, well, I mean, I guess Europe is allowed to have tariffs, but then that's going to change the tariffs that we set. So it's all part of a larger trade negotiation, right?
>> Exactly. I've referred to the DSA before as a censorship tariff, because the cost of maintaining the censorship apparatus under the DSA is intentionally levied on specific companies, mostly American ones, that are subject to higher and more intricate regulatory standards than other companies are. And EU regulators say, "Well, that's not because they're American, that's because they're large." But the fact that they're American and not European surely makes them easier, as a political proposition, to tax. And so a lot of Americans see this as a tax.
>> Really bizarre. David, we're living in a time where we're seeing freedom of speech and expression go down in Europe and go up in the Middle East. You know, they just had the Riyadh Comedy Festival. There were some rules: hey, you can't criticize the kingdom, let's leave religion off the table, but you can go after your own. But, uh, we might have some, uh, sensitivities there. And, um, you know, then there's everything in between. South Korea does require you to have a social security number, essentially, to post online. Um, but David, I'm wondering what you think about this overall trend in the world of what we're seeing with censorship.
>> I mean, it's not a good trend. Um, I think that the purpose of censorship, like I mentioned before, is always to protect the people in power, and specifically, it insulates them from criticism. But it'd be a lot better for them to hear that criticism and adjust their policies than it would be to try and switch off the, you know, the feedback altogether. And it's very clear, I think, in Europe and the UK that these policies of open migration, mass migration, are very unpopular. Why not listen to the people and adjust your policies, instead of trying to silence them?
>> Uh, say what you will about President Trump, uh, and people have varying opinions, but South Park has been deranged this season. I mean, they have gone full bore in attacking him, like, to a level that I wouldn't feel comfortable explaining the details of, uh, here.
>> Not on a family podcast.
>> Not on a family podcast. But even...
>> President Trump has a thick skin on these things. We did have one weird thing that occurred, I think it was before your time: the Jimmy Kimmel,
>> Charlie Kirk kerfuffle. But even that, it seemed, David, um, President Trump and the administration and Brendan Carr, friend of the pod who's been on a couple times, um, kind of rethought that one. Yeah.
>> Well, Jimmy Kimmel was back on the air within, was it, like, two or three nights? So, yeah, I mean, there's no real censorship there. In that case, it was the network affiliates who were upset, because Jimmy Kimmel said something untrue and malicious and outrageous. So, in any event, this was...
>> The heat of the moment, which kind of worked itself out. There was no, like, government censorship.
>> Brendan probably shouldn't have said what he said, in my estimation. Anyway, in any event, there was no government censorship. That's the bottom line. Um, yeah, I think it is disturbing that, uh, countries that we see as our closest allies, that share similar values, that are part of the same Western culture and history, are moving in this direction of more and more censorship. And I'm glad to see that, um, under President Trump, the Department of State is pushing back on this. I think, Sarah, the work that you're doing, and Secretary Rubio, is extremely important. So I think you're making a huge difference. I think we have to use the tools that we have, whether they're tools on trade, or the denial of visas, or, um, expressing condemnation, to push back on this as, you know, as we will, as we can.
>> Let's talk about some of the new issues. Uh, AI. Uh, it was pretty obvious, but 18 months ago, when you saw a deepfake, it just didn't pass the uncanny valley. Uh, Grok images, uh, Nano Banana from our friends at Google: I mean, these things now, if you're flipping by very quickly, you could make a mistake. This also, uh, in terms of censorship: we have significant protections in the United States for, say, cartoonists, as do the French, and they're mocking public figures. How is it different when you're mocking public figures, uh, presidents, prime ministers, cabinet members, but the public can't tell? Cuz this is new, right?
>> I think this is a really interesting question, and it's our privilege to be at this new technological frontier where these new questions arise. Um, I'm glad you mentioned cartoonists. So, after Charlie Kirk was murdered, and I knew him, I represented him on some First Amendment issues in the United States, I saw Americans walking around in an old t-shirt from ten years ago, and that t-shirt said "Je suis Charlie." Uh, because that was a t-shirt we bought when free speech in France was under threat and French people stood up for it. The Charlie Hebdo cartoonists were bombed. They were murdered.
>> Yeah.
>> Um, for saying things that offended religious zealots. And I think, you know, it's a different kind of religious zealotry to not want to allow any dissent online. And thinking back on that episode, and how European consensus on free speech might have shifted since then, is really sobering. But in America, I think, we take pride in being the kind of civilization where Charlie Kirk and Charlie Hebdo can both speak. I think making fun of public officials, you know, pointing out when the emperor has no clothes, is one of the most essential things you can do in a democracy. If you believe in self-governance, you have to believe in that. And what's interesting about a deepfake is that the point of parody is that you can tell that it's parody. But if you're depicting a public official falsely, in a way that people can't tell is a satirical or, um, non-authentic depiction, then the parody dimension really isn't there. What I would say, though, is whenever we reach a new technological frontier, there's a temptation to just enact a flurry of new regulations. And if we look back over history at other frontiers that have caused similar instability: like the invention of the printing press, people thought that was the end of the world. The invention of the telegraph: there were worries about disinformation and attention spans. The invention of the film strip: people thought the train was coming at them through the screen. The impulse to restrain that zeal to regulate, and to allow people to adapt, and to give freedom the benefit of the doubt, that impulse tends to be vindicated over time. When it comes to deepfakes, I think we have, in America and in Europe, strong legal remedies against defamation. So if someone creates an image of a public figure that is false, and people are believing that image, and a reasonable viewer would believe that that person engaged in that action, you can already sue for defamation.
>> Right.
>> And that doesn't mean that...
>> And we have child protection and underage laws. Those are very strong.
>> Yes.
>> The thing is, just because you don't have AI-specific laws doesn't mean that you can do whatever you want with AI. You could still use AI as a tool to then break the law, and be prosecuted.
>> I mean, if you engage in cyber hacking, for example,
>> and you use AI to do it, you're guilty of a cyber crime.
>> Yeah. So there's a lot of things like that, where, uh, there are plenty of existing laws that apply to AI, and we should just think about using all of those before you start creating a bunch of AI-specific ones.
>> Yeah. If those didn't cover it,
>> then there would be, hey, maybe we need to have a thoughtful discussion,
>> because, you know, I'm trying to think of an edge case here, but...
>> Um, for cyber security, it's even hard to do. Like, using voice clones: it's just fraud, you know. It's...
>> Wire fraud, you know.
>> And it might be that you use the existing fraud statute, but there are little regulatory tweaks you can make to make the fraud easier to detect. So one approach, which I don't think is always correct, but it exemplifies one direction of thinking, is maybe there's watermarking or something on some AI images that would mitigate tort liability for some of the providers. Um, or when we invented capital markets on a mass scale, right? We had our old laws against fraud, but we kind of made some more fine-grained securities regulations, like now you have to, uh, file a certain disclosure annually with your earnings and whatnot. And we didn't fundamentally change how we treated false information; we just developed some slightly more fine-tuned devices. But when I say fine-tuned, I think that's an important piece of guidance. You don't just go crazy and try to put the technological innovation back in the bottle, especially when, you know, we have foreign policy rivals like China that are developing AI at, um, an aggressive pace. And if we cocoon ourselves in safety, we hurt our standing in that race.
>> So where, Sarah, where do you think this relationship between the US and EU is now headed on this topic of free speech? I mean, there does seem to be a fundamental divergence. Um, I don't know the conversations that you're having, but do you think this gets worked out, or do you think the divide gets greater? Like, where's this headed?
>> So before, uh, President Trump and Secretary Rubio did me the honor of this appointment, I was a litigator, and it was my job to fight. And now I'm a diplomat, so it is my job to be diplomatic. And in that spirit, I would like to sound a, uh, gong of optimism.
>> Yeah.
>> I think that a lot of ordinary Europeans are not comfortable with comedians getting dragged out of the airport, just like Europeans weren't comfortable with comedians getting murdered for publishing offensive cartoons. And if you look at polls in Europe, I think you see some of that sentiment. So, I don't know where things are going. I can't promise a panacea, but I will say that I've had productive conversations, and I hope that I'll have more.
>> I mean, if it does become more acute, is Europe prepared for all the American social networks to be turned off and blocked by IP address? Because we really don't need the money, like, these platforms... it's nice to make money in Europe, but maybe it's time...
>> Or would various European countries demand their own version? You know, would there be a UK-specific version of X? Is that where this would be headed?
>> To an extent, because, you know, one way to resolve the transnational issue is geofencing. Now, I understand that in some of the UK enforcement actions, geofencing has not been enough, which is pretty ridiculous. But you're essentially...
>> Saying that's not enough for them.
>> Yeah. Like, there's a small American website, and I can't recall the name of it, that Ofcom has sued, and that website responded: well, we've geofenced, we've blocked UK IPs, so you should have nothing to say about the content on our website. But Ofcom is still going.
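[A minimal sketch of the geofencing being described, for readers who want the mechanics: the server looks up each client IP in a country database and refuses UK traffic, conventionally with HTTP 451, "Unavailable For Legal Reasons." The `geoip2` package, the GeoLite2 database path, and the toy HTTP server are illustrative assumptions, not details from the conversation.]

```python
# Illustrative geofencing sketch: refuse UK visitors by IP country lookup.
# Assumes `pip install geoip2` and a MaxMind GeoLite2-Country database;
# both are hypothetical choices for this example.
import geoip2.database
import geoip2.errors
from http.server import BaseHTTPRequestHandler, HTTPServer

BLOCKED = {"GB"}  # ISO 3166 code for the United Kingdom
reader = geoip2.database.Reader("GeoLite2-Country.mmdb")  # hypothetical path

def country_of(ip: str) -> str:
    try:
        return reader.country(ip).country.iso_code or "??"
    except geoip2.errors.AddressNotFoundError:
        return "??"  # unknown addresses pass through in this sketch

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if country_of(self.client_address[0]) in BLOCKED:
            self.send_response(451)  # Unavailable For Legal Reasons
            self.end_headers()
            self.wfile.write(b"Not available in your region.\n")
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"Hello from a US-hosted site.\n")

if __name__ == "__main__":
    HTTPServer(("", 8080), Handler).serve_forever()
```

[As the next exchange notes, a VPN defeats exactly this check: the lookup sees the VPN endpoint's country, not the visitor's.]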
>> Yeah, that should be good. But those people then chose... their citizens chose to get a VPN.
>> Yes, that's... well, that's...
>> Which costs 30 bucks a year, and then you can make your own decision.
>> Which is what the people of Iran are doing, too. So...
>> Um, but I think, you know, you mention, um, blocking by IP address. I think some countries, where the people don't have that Charlie Hebdo tradition, countries like Russia or China, they just admit that they are censorious societies, and they just block these websites. When you have...
>> Yeah. I mean, in China we have the Great Firewall, which we're not trying to take down. If you want to...
>> But could that happen? Could the UK put up a great firewall and just say, you know, we're blocking the outside world?
>> We call your bluff.
>> I think it's technologically feasible, to a point. There are circumventions. But it is not politically feasible, because British people want to be free. And I think if Keir Starmer said, we're putting up a great firewall and you're not allowed to access any American social media anymore, he'd be out of office.
>> Right. So that is their right, but they don't want to do that, because it would be too obvious what they're doing. And so, therefore,
>> they want to do the fines.
>> They want it to be fines, and just have it be more subtle. And we're pushing back, saying, "No, you can't do that."
>> Yeah. Good luck doing that against Elon. He's pretty principled. I think he can pay for the speeding ticket; I don't think it's going to be a problem. And I would say President Trump, because I do think that President Trump's election definitely changed the direction of free speech in the United States, never mind the rest of the world. Because I think under the Biden administration, we now know, from cases like Missouri v. Biden, and then also, uh, you know, what was released in the Twitter Files, and then, since then, even more disclosure that's come out, that the Biden administration was pressuring social networks to engage in censorship.
>> David, we did discuss this. You could bring up Biden for all of 2025, but when 2026 came around, you had to...
>> I'm recounting what was happening for several years, and President Trump changed that direction.
>> Absolutely.
>> And so, if it weren't for that, I think we'd still be on a censorship...
>> If it was up to Zuckerberg, he would have continued to do it under Kamala. He did it with no problem under Biden. He's a weather vane. His entire position is based on what makes the system grow.
>> Look, I think when it comes to these...
>> I don't have any personal feelings on it.
>> I think when it comes to the tech companies, there's, let's say, a range of courage.
>> Yes.
>> And so I'd say Elon is an outlier in terms of being willing to stand up...
>> Yeah.
>> ...to the government, in terms of protecting free speech, and there's others who sort of just kind of more blow with the wind and do whatever the government is sort of, um, suggesting. Or demanding. But it was wrong. It was wrong for the government to be doing that, particularly in the US, where you have a First Amendment. But let me...
>> And to catch people up: we literally had our FBI putting pressure on our own tech companies to say, we don't like the tone of these tweets, the tone of these posts, we think they're damaging. But we would have never gotten to the bottom of COVID, and... actually, we should all be taking mysterious, you know, experimental vaccines. I took it; I'm okay. But, like, the folks who were saying, "Hey, maybe we don't need this. Maybe we don't need to give it to kids." That whole discussion was shut down by the Biden administration. Now you got me doing it.
>> Irrefutably. And the pretext for some of this was "disinformation," a term that was really distended to encompass... if you read the white papers put out by these disinformation NGOs, they will admit: yeah, the information can be true, but if it promotes an adverse narrative, we don't like it. And that's such an Orwellian...
>> "Adverse narrative." Misinformation.
>> No, not misinformation. There's misinformation and disinformation. And the way some people defined these was: misinformation is false, and disinformation is bad. And if disinformation pollutes your democracy, the wrong candidate might win, I think, is really the impulse. But we had information suppressed under the auspices of combating disinformation that turned out to be true. So things that were suppressed included the assertion that the vaccine did not completely prevent transmission. That turned out to be true. It mitigated transmission significantly, but it was not a sterilizing vaccine. Another thing that was suppressed...
>> Which, by the way, is why a lot of people took it. They wanted to be a blocker in the system: like, it's my social obligation to do it.
>> And if given the choice, they might not have.
>> And another thing that was suppressed was the assessment that the virus might have leaked from a lab. And we now know that was the same assessment of a House committee and the CIA. So the government thinks the virus, they're not sure, but it might have leaked from a lab, more likely than not. And you're reaching out to Twitter, to Facebook, to Instagram saying, "Hey, you know, we can't force you to take these posts down under the First Amendment, but we'd really appreciate it if you did. And if you want to stay in our good graces, you should."
>> Which is a long way of saying we all need to be vigilant about it, in the United States and internationally. If you're not vigilant about free speech, there are people who will take it away.
>> Yes. And let me ask a question about that. So, you mentioned, um, organizations, or NGOs, who are kind of instigating. They're, like, you know, ginning up these regulators. They're showing them cases: what about this? What about this? What about that? I'm curious, and I think you've called this a censorship industrial complex. Could you just explain what this thing is? And, I know that some of these groups are in the US, not just Europe; in fact, they mostly might be in the US. What I'm wondering is, are they going to the European regulators as an end run around the First Amendment in the US, because they can get European regulators to censor material that otherwise could not be censored in the US?
>> I mean, it's a great question, but,
um, it's a question we don't even need to ask, because we know the answer, and the answer is yes. So, we have emails that have leaked from some of these NGOs. So, um, one of them, the Center for Countering Digital Hate, which is a British NGO whose leader was the target of, um, some of our visa sanctions: there are emails exchanged with Democratic politicians in the United States, and with politicians now very close to Keir Starmer, saying, um, our number one priority should be to kill Musk's Twitter. So, kill an American company, right? Um, in order to suppress American political speech. And our second priority is to instigate UK and EU regulatory action. So this is an entity taking government money to get foreign governments to come after American businesses. And this whole fact pattern, where these American NGOs were working with the American government to send kind of forceful, they allege not technically coercive, emails to Twitter and Meta: um, that was an attempt by these activists to replicate sort of the EU DSA in a way that would kind of dodge the American First Amendment. So the EU DSA requires that member states designate NGOs as so-called trusted flaggers. Meaning, this organization's job is to sit on Twitter, look for offending tweets that might be hateful or whatever, and report them to Twitter. And they get a privileged reporting channel, and the company is required to give those reports first-tranche priority. If you look at what was happening under Biden, it was a very similar system. These government agencies would arrange first-tranche priority for these reports, and some of what was put into those channels... the reports were technically made by NGOs. You also had, you know, one of the small upsides of COVID, if you care about government transparency, is everyone was holding their meetings on video. And so you have these videos of these Zoom meetings, with government operatives saying, you know, we couldn't do this under the First Amendment, but fortunately, this NGO on this call with us is going to do it instead.
>> Yeah. And they have ways of pressuring people, which leads to my final questions for you. There are firms that would... it actually started in the conservative space and moved to the liberal. Let's try to get advertisers to cancel on this program. They went after Howard Stern; they went after the liberals first. Then they went after the conservatives: let's get, uh, you know, Rush Limbaugh's advertisers to cancel. Oh my god, he said these incendiary things, yada yada. Um, but we had an even more pernicious one, which is, people started to say, well, hey, you're Cloudflare. Hey, you're Amazon. Uh, hey, you're PayPal. Hey, you're Stripe. We're going to go after you, and make sure that we shame slash pressure you, sometimes behind the scenes, to debank and to demonetize. YouTube got pulled into this as well: we're going to shadowban your videos. They started labeling All-In videos because we had conversations with scientists about COVID. Okay, uh, labeling also suppressed, I think. So what are your thoughts on that? Which seems even more pernicious, because if you take away a person's ability to monetize, how do they scale, right?
>> So, there's been one successful Supreme Court case in US history on viewpoint-based debanking, and that was my case, um, which we won, and it was called NRA v. Vullo. And the way we got that into court was, you had the New York financial regulator, um, per the, you know, per the pleadings, literally reaching out to financial institutions saying, you know, it would really be better for your enterprise risk management framework if you didn't do business with any pro-gun group.
>> "Enterprise risk management framework."
>> So you have all these professionals, and you guys have, you know, been in finance and you know this, um, these bureaucracies hired within financial institutions to ensure compliance with all these regulations. And so they have these elaborate risk management protocols, and this gets a bit into the weeds, but there's this thing in finance called reputational risk, and that's supposed to be the reputation of a bank for safety, solvency, and soundness. You don't want a run on the bank if everyone thinks the bank might fail; that's bad for the system. But there was this ESG movement to expand the concept of reputational risk to include things like: do you have a reputation for letting naughty, disfavored speakers have bank accounts? And that came up in the NRA case. The Supreme Court says the government is not allowed to do that. Even though, you know, our First Amendment says Congress shall make no law, you know, restricting the freedom of speech, this wasn't Congress making a law restricting their freedom of speech; it was a government entity, uh, adversely applying regulations to choke off certain viewpoints. And they were applying it, instead of going directly to the guy saying the thing you don't like, by putting pressure on this risk-averse middleman, this bank. And the debanking and deplatforming is insidious for exactly that reason. When you have a risk-averse middleman, like a financial institution,
>> it's almost like they designed it that way.
>> Yeah, they don't have skin in the game with respect to... they don't believe in your speech the way you do. They have their in-house counsel telling them that this is going to piss off the financial regulator, so it's easier to take it down. And my office is very...
>> So, that's a common theme: when the government can't do it directly, because it'd be a violation of the First Amendment, they use an intermediary to do it. So, you get the bank to debank someone, or you get an NGO,
>> a dark NGO,
>> which is really a government organization, because it's funded by the government, but they call themselves non-government. So they do the quote-unquote fact-checking.
>> Yeah.
>> You know, or, um, you get one of these other cases, where, um, this was a little bit more overt, but where the FBI, through the Biden administration, is then putting pressure on the social networks. In any event, you get, like you said, a middleman to do the dirty work, because the government can't do it directly,
>> and to do it in a really nefarious...
>> Yes.
>> ...way that's hard to detect.
>> Yes. It's like, it'd be a shame if we blocked a merger. Like, they were very... Zuckerberg's a pragmatist, I'm going to go at him again. Like, he likes to buy things. And if the FBI is calling you, and you've got to get something through the FTC next, you're going to try to make nice, right?
>> And that is the risk of giving any regulator kind of a capricious cudgel over the internet: even if the regulation isn't explicitly speech-based, if it's just, you can only do your merger if this guy likes the look of the merger, then companies are going to vie to impress that regulator. I think a bit of that was going on, frankly, with the Jimmy Kimmel thing, um, because you had this merger in the works, this Tegna merger that they tried to complete under the Biden administration. And my friends in telecom tell me that when Tegna was trying to sell itself during the Biden administration, it went out of its way to show the regulator how woke it was. And so now it has an incentive to show the administration that it's MAGA-aligned. And that's what happens when you give a regulator a large cudgel. Now, I want to say something about labeling.
>> Labeling videos sounds like, oh, it's just
00:42:02
transparency, what could be wrong with
00:42:04
that? People should know if fact-
00:42:06
checkers think that something is wrong.
00:42:07
And I think my office's approach on that
00:42:09
is: it depends on who's putting the label
00:42:11
on there and for what purpose. A lot
00:42:13
of these disinformation NGOs, it was
00:42:15
almost like the
00:42:17
Red Scare: they would make a list of
00:42:19
outlets that were spreading
00:42:20
disinformation, but they wouldn't just
00:42:21
publish that list. They would send it
00:42:23
around to the credit card companies and the
00:42:24
payment processors and suggest, with
00:42:27
kind of an implicit government
00:42:28
imprimatur, because they're all
00:42:29
government funded, as David very
00:42:31
relevantly points out: you know, you
00:42:34
guys really shouldn't be funding these
00:42:35
websites. And the websites would never
00:42:36
know why, and the viewer would never know
00:42:38
why. I think a type of labeling that's
00:42:40
really good is the type exemplified by
00:42:42
community notes on X where I can read
00:42:45
the tweet and then I can see what the
00:42:46
community notes say about it and you can
00:42:48
see a ranking of the community notes.
00:42:51
It's been a total game changer. I
00:42:52
remember when they were doing
00:42:53
fact-checking and the fact-checking was so bad
00:42:55
cuz the fact checkers were biased or
00:42:56
sometimes it just wasn't good quality
00:42:58
control but the community note thing has
00:43:00
like really worked and you'll see that
00:43:02
when someone posts something that's
00:43:04
truly misinformation like it's a fake
00:43:06
image or a fake article or something it
00:43:09
always gets caught in a short amount of time and
00:43:11
then you get notified
00:43:13
>> You get a... have you noticed
00:43:15
this? If you liked it, it'll circle back
00:43:17
>> and then it gets community noted, I get a
00:43:19
notification, and then I feel like an
00:43:20
idiot, like, oh, I fell for
00:43:23
that. But you know what? People will
00:43:24
drag social media for, you know, the
00:43:26
fakes or whatever. But I never get
00:43:28
notified when the New York Times makes a
00:43:30
mistake and posts a correction on page
00:43:33
43. They never notify anybody about
00:43:35
that. They bury that on the last page.
00:43:37
So I still think social media is by far
00:43:39
the best.
00:43:40
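The "circle back" behavior described here is easy to picture in code. A hypothetical sketch, assuming an invented Post data model rather than X's actual implementation: once a post receives a community note, everyone who liked it gets a notification.

```python
# Hypothetical sketch: notify past likers when a post gets a community note.
# The Post model and message format are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: str
    likers: set = field(default_factory=set)  # user ids who liked the post
    note: str = ""                            # community note text, if any

def attach_note(post: Post, note_text: str) -> list:
    """Attach a community note and build notifications for past likers."""
    post.note = note_text
    return [
        f"@{user}: a post you liked ({post.post_id}) received a community note."
        for user in sorted(post.likers)
    ]

post = Post("p42", likers={"dave", "jason"})
for message in attach_note(post, "The image in this post is AI-generated."):
    print(message)
```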
>> Were you in the meeting, for accuracy,
00:43:41
when they were deciding on community
00:43:43
notes or not?
00:43:46
>> And Elon was like, "Tell me about
00:43:48
it." And I was like, "Elon, I think this
00:43:49
is really interesting. You should
00:43:50
>> Yeah.
>> double click on it because it's
00:43:53
actually working. Um, and he looked at
00:43:56
it, he immediately understood the
00:43:57
algorithm and he said, "Keep the group."
00:43:59
>> Yeah,
00:43:59
>> that group stays.
00:44:00
>> And you know what's the genius of that
00:44:02
algorithm? The note only gets
00:44:03
promoted if users who usually disagree
00:44:05
agree that the note is helpful.
00:44:07
>> they look for consensus amongst rivals.
00:44:10
>> Yes.
00:44:11
>> Which is a fascinating uh
00:44:13
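A minimal sketch of the "consensus amongst rivals" idea they describe, using made-up ratings. The production Community Notes scorer (which X has open-sourced) uses matrix factorization rather than this pairwise check, so treat this as an illustration of the principle only:

```python
# Toy illustration of Community Notes-style "bridging": a note is promoted
# only if raters who usually disagree with each other both rated it helpful.
# All names and ratings below are invented for illustration.
from itertools import combinations

# rater -> {note_id: 1 (helpful) or 0 (not helpful)}
ratings = {
    "alice": {"n1": 1, "n2": 1, "n3": 0},
    "bob":   {"n1": 1, "n2": 0, "n3": 1},
    "carol": {"n1": 1, "n2": 1, "n3": 0},
}

def past_agreement(a, b, exclude):
    """Fraction of co-rated notes (excluding the note being scored)
    on which raters a and b gave the same rating."""
    shared = (set(ratings[a]) & set(ratings[b])) - {exclude}
    if not shared:
        return None
    return sum(ratings[a][n] == ratings[b][n] for n in shared) / len(shared)

def promoted(note_id, rivalry_cutoff=0.5):
    """Promote a note only if some pair of raters who usually disagree
    (historical agreement below the cutoff) both rated it helpful."""
    helpful = [r for r in ratings if ratings[r].get(note_id) == 1]
    for a, b in combinations(helpful, 2):
        agree = past_agreement(a, b, exclude=note_id)
        if agree is not None and agree < rivalry_cutoff:
            return True  # consensus across the divide
    return False

for note in ("n1", "n2", "n3"):
    print(note, "->", "promoted" if promoted(note) else "not promoted")
# n1 -> promoted (alice and bob usually disagree but both found it helpful)
# n2 -> not promoted (only one "camp" rated it helpful)
# n3 -> not promoted (a single helpful rating is not consensus)
```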
>> I'll tell you the other game changer
00:44:15
on X has been Grok, cuz you can just
00:44:18
go, "@grok, what's the truth?" Yeah.
00:44:20
>> And you won't necessarily always agree
00:44:22
with Grok. I'm not saying it's perfect,
00:44:23
but it's pretty darn good.
00:44:24
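The same "ask Grok" move can also be scripted. A rough sketch assuming xAI's OpenAI-compatible API; the endpoint, model name, and key handling below are assumptions to check against xAI's current docs (on X itself, you just tag @grok in a reply):

```python
# Hedged sketch: asking Grok to fact-check a claim via xAI's
# OpenAI-compatible API. Endpoint and model name are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.x.ai/v1",   # assumed xAI endpoint
    api_key="YOUR_XAI_API_KEY",       # placeholder, not a real key
)

response = client.chat.completions.create(
    model="grok-3",  # substitute whatever model xAI currently offers
    messages=[
        {"role": "system",
         "content": "You are a careful fact-checker. Explain your reasoning."},
        {"role": "user",
         "content": "What's the truth: was this viral image AI-generated?"},
    ],
)
print(response.choices[0].message.content)
```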
>> It's trending in the right direction.
00:44:25
>> I mean, it's really good on the
00:44:27
whole. And yeah, I mean, it does a
00:44:30
really good job fact-checking, too. So, we
00:44:33
don't really need these bureaucrats and
00:44:35
politicians and regulators telling us
00:44:36
what's true or not. We have community
00:44:38
notes. We have AI now. You've got other
00:44:40
users
00:44:41
>> and you can file a lawsuit if you feel
00:44:43
you've been defamed. Like, this exists
00:44:44
in the United States.
00:44:45
>> Exactly.
00:44:46
>> As a concept. Uh, listen Sarah,
00:44:49
I think I can speak for everybody uh
00:44:51
here in the USA House that we're really
00:44:53
glad that you're so vigilant and dogged
00:44:55
in protecting the First Amendment. Give
00:44:57
it up for Sarah Rogers.

Podspun Insights

In this episode, the hosts and their guest Sarah Rogers, the Under Secretary for Public Diplomacy at the State Department, dive into the complexities of free speech in the digital age, particularly in the context of international regulations. The conversation kicks off with a humorous nod to their shared experience of staying in a 300-year-old house, setting a lighthearted tone before delving into serious topics.

Sarah explains the nuances of public diplomacy, emphasizing its role in addressing so-called disinformation and advancing American interests abroad. The discussion takes a sharp turn as they explore the tension between American free speech values and European regulations, particularly the Online Safety Act and the Digital Services Act, which impose stricter content moderation standards on American tech companies operating in Europe.

As they dissect various cases of censorship and their implications for freedom of expression, the episode becomes a riveting exploration of how governments navigate the balance between regulation and individual rights. They highlight alarming statistics, such as the thousands of arrests in the UK for speech-related offenses, prompting a critical examination of the chilling effects of such laws.

With wit and insight, they tackle the rise of AI and deep fakes, questioning the future of satire and parody in a world where misinformation can easily spread. The episode culminates in a call to action for vigilance in protecting free speech, leaving listeners with a sense of urgency and empowerment.

Badges

This episode stands out for the following:

  • Most shocking: 90
  • Best concept / idea: 90
  • Best overall: 89
  • Most intense: 88

Episode Highlights

  • First Impressions of Davos (@ 00m 36s)
    The hosts share their initial thoughts on attending Davos for the first time.
    “It’s great to be here at USA House.”
  • Understanding Public Diplomacy (@ 01m 34s)
    Sarah Rogers explains her role as Under Secretary for Public Diplomacy and its significance.
    “Public diplomacy addresses the relationship between the American government and foreign publics.”
  • The Tension of Free Speech (@ 03m 25s)
    Discussion on the differences in free speech regulations between the US and Europe.
    “America has taken a much stronger approach on free speech than even most of the West.”
  • Arrests Under the Online Safety Act (@ 11m 52s)
    Over 12,000 people were arrested in the UK under the Online Safety Act in 2023.
    “That’s more than were arrested that year in Russia, more than in China.”
  • Censorship Tariff (@ 19m 03s)
    The DSA is seen as a censorship tariff targeting American tech companies in Europe.
    “It's effectively like a tariff on American tech companies operating in Europe.”
  • Global Censorship Trends (@ 19m 50s)
    A discussion on the troubling rise of censorship across various regions.
    “Freedom of speech and expression is going down in Europe and up in the Middle East.”
  • Technological Innovation and Regulation (@ 25m 43s)
    The historical impulse to regulate new technologies often leads to negative consequences.
    “The impulse to restrain technological innovation tends not to be vindicated over time.”
  • The Importance of Vigilance (@ 34m 40s)
    A call to action for protecting free speech in the face of censorship.
    “If you're not vigilant about free speech, there are people who will take it away.”
  • Censorship Industrial Complex (@ 34m 56s)
    A discussion on the entities working to suppress American political speech through foreign regulators.
    “It's a censorship industrial complex.”
  • Debanking and Deplatforming (@ 39m 58s)
    Exploring the insidious nature of debanking and deplatforming in the current landscape.
    “The debanking and deplatforming is insidious for exactly that reason.”
  • Community Notes as a Game Changer (@ 44m 36s)
    The effectiveness of community notes on social media in combating misinformation.
    “We don't really need these bureaucrats and politicians telling us what's true or not.”

Key Moments

  • Public Diplomacy Explained (01:14)
  • Censorship Tariff (19:03)
  • Regulation of Innovation (25:43)
  • Vigilance for Free Speech (34:40)
  • Censorship Complex (34:56)
  • Insidious Debanking (39:58)
  • Community Notes Success (44:36)
