Charlie Kirk Murder, Assassination Culture in America, Jimmy Kimmel Suspended, Ellison Media Empire

September 19, 2025 · 01:22:22
00:00:00
Okay, everybody. There's no easy way to start today's show. Eight days ago, Charlie Kirk was savagely murdered while
00:00:06
doing what Americans love to do, debate. And when someone is senselessly killed
00:00:12
like this, especially at a young age and at the hands of another human, we try to make sense of it. It's only natural. And
00:00:18
uh it's hard to imagine anything worse than a young father of two, just 31 years old, and entering the best and
00:00:24
most productive years of his life, being killed by a 22-year-old who's barely out of adolescence. Our hearts go out to the
00:00:29
Kirk family, his friends, fans, and every American who understands that no one should be killed for expressing
00:00:36
their beliefs. That's the core of the great American experiment. So, let's keep that experiment alive today and the
00:00:44
memory of Charlie Kirk by continuing the great debate. Besties, there's a lot to
00:00:49
process here as a community, a country, a society, and uh I just want to check
00:00:55
everybody's temperature at the top of the program. We're obviously not going to do a cold open here because that
00:01:00
would be inappropriate. But Chamath, how are you processing the last 8 days? I
00:01:06
actually wrote down something as well, which I normally don't do. I just like
00:01:11
to kind of react. But let me just read that and then maybe we can just talk from there. So to me, what Tyler
00:01:18
Robinson illustrates is the emergence
00:01:24
of a lost generation that was shaped by COVID: years of isolation,
00:01:32
a reliance on screens, and an immersion in online subcultures
00:01:39
that have created a vacuum where some young men are drifting in without any
00:01:45
grounding. No institutions, no friends, no communities, no family. And out of that void,
00:01:53
I think what comes out can at best only be called ideological incoherence.
00:02:00
Somebody used the term "salad bar extremism."
00:02:05
These individuals are not clearly aligned with any one ideology completely, but they seem to be
00:02:12
assembling fragments of memes, of conspiracies, of cultural signals
00:02:19
into an unstable identity and in some cases now it's exploding into violence.
00:02:26
I think the most troubling consequence isn't just the facts themselves,
00:02:33
which are abhorrent, and I feel incredibly bad for all of Charlie's friends and his family and his children,
00:02:40
obviously himself. But beyond the act itself, there's an enormously chilling effect, I think, on public discourse.
00:02:48
When you express an idea, it cannot be that you then risk becoming a target,
00:02:56
because the ultimate outcome of that is fewer people will then enter the public debate.
00:03:01
And then what happens is the range of acceptable dialogue really narrows
00:03:07
and it only leaves space then for the most benign voices in the public square.
00:03:15
And if you have that kind of anodyne discussion, I just think you have very bad outcomes
00:03:21
for society. I've watched a lot of his content since
00:03:27
he was murdered and I'm still trying to grapple with why people could not shout
00:03:34
the man down if they disagreed with him and instead shot the man down. And I think that that is a completely
00:03:40
unacceptable response to what he thought.
00:03:45
That was well said. Uh, Freeberg, your thoughts? I think it wasn't his controversy, it
00:03:53
was his effectiveness. He was too smart,
00:03:58
too open, too honest, too willing to engage in discourse, too
00:04:04
willing to debate. He was too effective in changing people's minds. And I think that's why he became such a cultural
00:04:10
threat. And it wasn't necessarily the things he said because there are people out there who say far more controversial
00:04:17
things than Charlie. It's that he sounded sensible and he changed people's
00:04:22
minds through his discourse. And as he changed people's minds, I think he became a real threat to the ideologies
00:04:30
that he spoke up against. And I think that's why he was targeted. And it's obvious that if he wasn't as
00:04:37
effective as he was, if he wasn't as smart, as empathetic, as optimistic, as honest, as direct as he was, he wouldn't
00:04:45
have been a target perhaps. And it was very sad. One of the things that we've seen is the power of going direct. Sort
00:04:51
of like what we did with this podcast. He went direct. He went to college campuses, but he also recorded it and
00:04:57
put it on the internet. And millions of people saw it. And that new form of
00:05:03
media, that new form of communication where someone can actually have a town
00:05:08
square, the internet, that they can stand at and speak their mind and be heard. There's no longer these filters
00:05:14
and these controlling powers of influence that decide what we get to know and not know and what our opinions
00:05:19
need to be. The media and the traditional kind of systems are being degraded. I think it's important to
00:05:26
continue that. And I think one of the things that I've been optimistic about, I was very sad and angry the day this happened. I called Chamath. I was
00:05:32
actually tearful that day. But since that time, I think it's been
00:05:38
amazing to see the optimism not just from one side, but from a lot of different people from different
00:05:44
backgrounds standing up and saying discourse is so important and re-underwriting this American process.
00:05:50
So, I'm very sad, but I'm very hopeful that people take this as a sign of how important this discourse
00:05:56
is. Sax, I don't know if you knew him, but obviously he was instrumental, I think,
00:06:02
in um the MAGA movement, the conservative movement, and moving so
00:06:07
many young people over to it. Did you know him? Did you interact with him? What was he like? And how
00:06:14
are you feeling 8 days into this? I know this is the second assassination attempt, third actually in
00:06:22
this recent political season. Two on Trump and now this one. So this feels um pretty dark.
00:06:30
How are you feeling? Yes, Jason. I knew him and I considered Charlie a friend. I started doing his show, his radio show a few years ago. He
00:06:37
invited me on and I came on his show and he proceeded to have me back. I mean, I think I went on the show about half a dozen times.
00:06:43
It was always on a wide range of issues, but usually when I was on his show, we talked about tech. We talked about
00:06:48
things like censorship. We talked foreign policy. He was extremely versatile in the issues he could speak
00:06:55
about. And you see that when he was speaking on campuses, he would really take on all comers and every possible
00:07:01
issue would be thrown at him and he'd always have a well-thought-out response. But in any event, I always tried to say
00:07:07
yes to him. I always moved things around in my schedule when he asked me to come on the show. I just felt that he was a very positive person, very upbeat about the
00:07:15
future. I think that he engaged in a very respectful type of debate with
00:07:20
people. You see this on college campuses. He saw every opponent as someone that he could potentially
00:07:26
persuade and convince to convert to his side. You know, there are clips of him
00:07:31
going around where he explains that we all have to engage in dialogue in a democracy because the alternative when
00:07:37
we stop talking is that we become enemies. And eventually it leads to violence. And he was very concerned
00:07:43
about the rise of assassination culture. He wrote about this months before his death.
00:07:48
And so in a way he almost I don't think he saw his own death coming, but he definitely saw this disturbing rise of
00:07:55
violence as the alternative to free speech and and open discourse. And he
00:08:01
dropped out of college when he was 18 to start Turning Point. I mean, he was sort of a force of nature who put together
00:08:07
this organization from scratch. And he would go on these college campuses and kind of put down his soapbox or
00:08:13
tent and chair under a banner that said, "Prove me wrong." And he would engage people in conversation. Liberals like to
00:08:19
use this term platforming. It's usually the prelude to, you know, justifying why someone needs to be denied
00:08:26
the opportunity to speak on a tech platform, but Charlie wasn't platformed by somebody else. He created his own
00:08:31
platform. You know, he created this organization from scratch. And he was platforming liberals or leftists or woke
00:08:37
or whoever. He was giving them the opportunity to debate with him and his crowds got larger and larger. People
00:08:43
liked to see him speak and engage in these debates and he was platforming the
00:08:48
other side and giving them the opportunity. But like Freeberg was saying, the more
00:08:53
he did it, the more it became clear that his views were very well thought out and oftentimes the other side's weren't. And
00:08:59
if one side became hysterical, it was always the other side. It was never Charlie. He was always sort of cool,
00:09:04
calm, and collected. And he was winning the debate. He was winning the debate very effectively and I think he was
00:09:10
revealing in a lot of cases that these woke students had these very strongly
00:09:15
held views but they couldn't really explain beyond the level of cliches why they believe what they believe. I mean
00:09:22
when he would just start to Socratically ask them questions, it was rare that you would get someone on the other side who
00:09:28
could go multiple questions deep with him and and debate him with the level of
00:09:35
knowledge that he had. And so I think that that fundamentally is why he was winning the debate. And that's why I
00:09:40
think that so many people of his generation and younger were really
00:09:45
paying attention is because he was offering them something that they weren't getting elsewhere. They weren't getting it through the school system.
00:09:51
The school systems were kind of drilling into their heads this sort of woke catechism about what they're supposed to
00:09:56
believe. And he was actually offering them critical thinking and dialogue and debate. And I think it's why young
00:10:03
people were flocking to him in in droves. And I do think that ultimately the reason why he was killed is because
00:10:09
he was so effective at engaging in this kind of debate. And we have it in the killer's own words. I mean, the killer
00:10:16
signed this very detailed confession, texted it, I should say, to his alleged roommate and lover,
00:10:25
about why he did it. And he said that in his view, Charlie represented something
00:10:32
hateful. And then I think the key thing he said right after that was the killer
00:10:37
said, "I had enough of his hatred. Some hate can't be negotiated out." So in other words, there's no imperative
00:10:45
to engage in dialogue or debate or discourse with people who disagree with you, countering their arguments with
00:10:50
better ones. They just need to be silenced by any means necessary. That's basically how the killer described his own
00:10:56
motive: he essentially just thought that there are views that just have to be silenced with extreme
00:11:03
violence, with murder, and that that was justifiable. And this is what I think hit a
00:11:10
nerve for the whole country. I mean, beyond those of us who knew Charlie and considered him a friend and how much
00:11:16
sadness and grief we have over losing him, and for his wife and for his kids, who are going to grow up fatherless
00:11:22
now. I think that we all felt that this was a huge invasion of the town square, of the marketplace of ideas. This
00:11:28
happened on a college campus which is supposed to be the ultimate marketplace of ideas. It's supposed to be the place
00:11:35
where people engage in reasoned discourse to work out our differences. And so it felt like an assault on the citadel of
00:11:42
democracy. Freeberg, you set up this amazing panel we just released, which I think we all agree was kind of a sleeper panel at the
00:11:49
summit with two of the heads of colleges, and we talked deeply at the summit about dialogue and freedom of
00:11:57
speech and nurturing that. Um, your thoughts on sort of college campuses and
00:12:05
the free expression whether it's the college campus or whether Charlie Kirk and that genre
00:12:12
moves people via the internet. One of the things we talked about with the head of Berkeley and the head of Dartmouth is
00:12:20
that college needs to be a place that teaches kids how to think, not what to think.
00:12:25
And I think that was one of the things that Charlie exemplified better than anyone in this kind of
00:12:32
modern media genre is showing people that you can have a discourse, you can have a debate, you can have a dialogue
00:12:40
in the same way that as a group we've had fights, we've had disagreements, we've had different points of view that we challenge each other on.
00:12:48
One of the things early on when we started doing this podcast was people telling me that they thought it was so cool to see friends argue like this
00:12:55
and have discourse and have debate and we still do it and I think Charlie exemplified that and
00:13:01
that teaches people how to think and it allows them to form their own point of view. I do not believe in this notion
00:13:08
that some system or some institution should be telling us what the truth is or what to believe or
00:13:16
what reality needs to be. They can present us with the facts and the evidence that they've sourced and they can give us a recommendation. But
00:13:22
ultimately, we need to have agency as individuals to form our own belief systems, to form our own opinions, to
00:13:28
make our own decisions. And I think what made Charlie so powerful was his ability to teach people
00:13:35
how to do that, how to have a discourse and reason your way to an opinion. And I
00:13:41
think that's one of the things that's changed so much in the last 20 years is just this idea that, you know, there's
00:13:46
truth and there's not truth and whatever one institution tells you is truth and the other institution tells you is not truth as opposed to giving ourselves
00:13:52
individual agency. I wasn't his target audience, and so I was aware of Charlie Kirk, but I had only seen, you know,
00:13:59
flipping through social media once in a while some of the clips. So, I took the time to, knowing we're going to talk about it here, watch at least 40 or 50
00:14:07
of these interactions, listen to a couple of his podcasts. And, you know, my takeaway from it was, to your point,
00:14:14
Sax: he was incredibly respectful to people he disagreed with, certainly more respectful than you at times, Sax, when
00:14:20
we have debates. And he uh he was actually playful. He was gregarious with
00:14:25
these kids when they didn't understand something. Um, and there's a lot of talk
00:14:30
right now, people trying to, you know, point out where he was wrong. By the way, you don't get to murder somebody
00:14:37
because you believe they're wrong. I know this sounds like uh the most obvious statement in the world. But what
00:14:43
I came away with was he reminded me of all the Catholics, my family members I grew up with. He was for family values.
00:14:49
He was against abortion. He had specific feelings on affirmative action or DEI
00:14:54
that we all might agree with. He might have said them at times in a spicy way or in a really full-contact debate, but
00:15:02
these were the same conversations I had, you know, growing up. And, you
00:15:08
know, this is, you know, a situation where I'm
00:15:13
just perplexed. And I guess it goes back to where you started, Chamath, that there are some number of people who are
00:15:21
sick right now in our society, and it seems to be young men. It seems to be the COVID generation.
00:15:27
How does a person that is seemingly a 4.0 student on the right track getting a
00:15:34
scholarship, how do you so severely go off the rails? And then how do you end up in a place
00:15:41
where you believe that something like this is even tolerated or justified? How can a person that's that seemingly
00:15:48
intelligent get to that place? He was from a conservative family. It seemed like he had his whole
00:15:54
life ahead of him and was on a right track at a certain age. And if he's capable of it, how many other people are going to end up in that
00:16:00
place? So what what is happening that basically allows these young men to
00:16:06
become so unmoored? I think a lack of socialization, and we
00:16:11
don't know in this case, but I have very deep concerns about these SSRIs,
00:16:17
Adderall. These kids have all kinds of mental health issues, and then they get put on a lot of different prescription
00:16:23
drugs. We don't know if that's the case here, so I'm not speculating about that. But just talking about this generation in general, you know, if they
00:16:30
are permanently online, they're part of these subcultures. If they are,
00:16:35
you know, having their brains scrambled with all these different drugs. Many young people are on two or three different
00:16:41
medications. And I'm not necessarily anti them, but I do have a concern about
00:16:46
them being massively overprescribed. His entire justification for this, beyond the
00:16:52
fact that it's completely abhorrent and unacceptable is also completely all over the place. I actually asked my team
00:16:59
to explain to me the things that this guy wrote down, like, where did they come from? One of them is from a video game
00:17:06
called Helldivers 2. Another one is from the furry subculture. Another one is a fascist theme from a Netflix show.
00:17:14
What is going on? Yeah. Well, look, I mean, we know some things about the killer's
00:17:21
ideology from what his family said, from what other people who knew him said, and and from what he put in his own text
00:17:28
messages explaining why he perpetrated this murder. And I think a big part of it was that in his view, the basically
00:17:36
mainstream conservative views that Charlie was discussing
00:17:43
were hateful and fascist. I think on one of the bullet casings he etched "Hey fascist, catch."
00:17:48
He had this view, first of all, that violence is justified as a way
00:17:54
to end a political debate and that relatively mainstream conservative
00:17:59
thought was so hateful and beyond the pale that it needed to be stopped. And
00:18:05
I'd like to say that those views are so unusual that nobody else has them. But
00:18:11
the truth is if you look at the data, if you look at polling, you see that there's been a huge rise on the part of
00:18:18
young people and particularly on the left, although there's some on the right, but it's mostly on the left. They
00:18:24
believe now that political violence can be justified. Yeah.
00:18:29
And so, look, I mean, there have been many different ways of getting at this. There was a study, I
00:18:35
guess from Rutgers University's Social Perception Lab. They asked about whether the murder of Donald Trump or Elon Musk
00:18:43
could be justified, and an insane 50% of left-of-center respondents said yes; 14% of
00:18:52
right-of-center said yes. Even that number is shocking to me. How could it even be 1%? But obviously it's three times
00:18:58
greater on the left. Then they asked about destroying Tesla dealerships in protests, whether that could be justified,
00:19:06
and roughly 60% of left-of-center said yes and 23% of right-of-center said yes. So
00:19:14
there is I think an increasing view on the part of young people and this also coincides with similar declines in
00:19:20
respect for the value of free speech. They've been polling people of all different age groups and different
00:19:25
political parties for many, many years on whether they believe in free speech, and those numbers have been on the decline
00:19:32
again particularly among young people and particularly on the left and so you
00:19:38
see it now in the data that there is a generation of young liberals I mean
00:19:43
obviously not all to be sure but a sizable contingent who believe that the
00:19:50
other side is so hateful that it's okay to use violence against them
00:19:55
And then you... Yeah, that is a huge societal problem, because
00:20:01
democracy ends when we can't debate ideas. What's at the root of it? I have two
00:20:07
things I think are at the root of this. I mean, I talked a little bit earlier about the medication uh that they're putting all these kids on, and I'm going
00:20:14
to keep talking about it because we're talking about, you know, upwards of 20% of kids have been on SSRIs or on Adderall
00:20:21
or some stimulant right now. And it's not possible that all these kids need
00:20:27
this. And there are many other solutions to it. And then I'll add the algorithms. Not that an algorithm is responsible for
00:20:33
the murder, but we are taking people down rabbit holes online. And in fact,
00:20:39
Charlie's content was exceptional at engaging people, right? Short form, you
00:20:44
know, really interesting debates and all that kind of stuff. And that combination of being online too much, being
00:20:51
overmedicated, being isolated, you put all that together, Chamath, I think that's what leads to these kinds of
00:20:58
extremist theories and disconnects from reality. But we could also be dealing with somebody who's severely mentally ill.
00:21:04
And we're trying to understand a severely mentally ill person. Hold on a second. Hold on. You're starting to sound a little bit like Paul
00:21:09
Graham there. That this is just some sort of random nut. Here's why it's not just some random nut. First of all, we
00:21:15
have an extraordinarily detailed statement and confession from the killer giving us a lens into his motives. But I
00:21:22
would say the even bigger thing is the reaction of hundreds of thousands of people online
00:21:27
who celebrated this murder who were gleeful who expressed some degree of happiness
00:21:33
about it or tried to downplay what the killer had done by saying on some level that
00:21:40
Charlie Kirk deserved it. So you have to ask... Yeah. Where does that come from? And I think
00:21:46
this has to do again with the ideology of the left, not the traditional like old left because I think the sort of old
00:21:52
left really believed in free speech, maybe more than anybody else, you know, folks like Nent. And by the
00:21:58
way, I think guys who I would consider kind of more old left like Bernie Sanders and Ezra Klein
00:22:04
put out excellent statements denouncing, yes, this murder, agreeing with this core
00:22:10
idea that political violence is never acceptable. And then they were shouted down, which is insane. Ezra, he
00:22:16
got crushed for that. That tells you something too. So look, I think that the psychotic break, if you will, that is now occurring in a bunch of
00:22:24
different cases may be exacerbated by, who knows, medication, by isolation
00:22:29
experienced during COVID, or what have you. But there is clearly a huge ideological component to this: that there are huge
00:22:36
numbers of people, even if they don't commit the crime themselves, who celebrate it, because they have had drummed
00:22:43
into their heads for decades now the idea that the other side are fascists. Their
00:22:48
leader is like Hitler. What do you do to Hitler? You have to stop him at all costs. Right? And I think that is a
00:22:55
significant part of the problem here. Now you ask, where does this ideology come from? I think it's coming from the school system. I mean, we've
00:23:01
had now for the last couple of decades this sort of woke ideology, this cultural Marxism, being taught through the
00:23:08
school system where they drum into the kids that there are two kinds of people, oppressors or oppressed, and they divide
00:23:16
people into these groups and say that the oppressors have to be stopped. And that's really as deep as the discussion
00:23:21
gets. And if you want to get a sense for this exaggerated, destructive rhetoric,
00:23:27
just look at the new book that has come out by Randi Weingarten,
00:23:34
who is head of the teachers' union, basically the head of our school system, because she runs the teachers' unions and
00:23:40
they're the ones who have all the political power. She wrote "Why Fascists Hate Critical Thinking." And by the way,
00:23:46
this is not just an article. I guess Rolling Stone did an excerpt. This is an entire book. And the book is called Why
00:23:52
Fascists Fear Teachers. So she's basically saying that Trump and all of his supporters are fascists. They have
00:23:58
to be stopped. You've had this hyperbolic rhetoric. For a decade, you've had the press describe Trump as
00:24:04
some sort of new Hitler. To use Scott Adams' term, they've basically Hitlerized him. And so, you asked, you know, we
00:24:12
don't know exactly where Tyler Robinson imbibed all of these ideas, but you have to also take into account that
00:24:19
probably since he was 12 years old, the media, the mainstream media has been beating into his head that the leader of
00:24:26
the United States is a new Hitler and the worst thing that could ever happen is his return and his supporters are
00:24:32
fascists. And I'm sure this has to play into, if not his thinking, the thinking
00:24:37
of the people who are celebrating the death of Charlie Kirk, because they have
00:24:43
bought into this idea that the other side are fascists who have to be stopped at all costs.
00:24:49
Yeah. I think it's important to point out, as you did: Mamdani, Nancy Pelosi,
00:24:55
gosh, AOC, Ezra Klein, they all came out just very strongly that this is
00:25:00
horrible and unacceptable, and they, you know, obviously
00:25:06
think it's abhorent. So, I think that leadership is strong and I think that's what we need to see more of is people
00:25:13
trying to tone this down and, yeah, not exacerbate it. Look, I think you're right. I think
00:25:19
that what we need to see from all of our political leaders is a condemnation
00:25:25
of this murder in particular and political violence in general without any caveats whatsoever, without
00:25:32
intimations that somehow Charlie Kirk deserved it, without this exaggerated
00:25:39
rhetoric that the other side are fascist, that Trump is Hitler and so on. There just needs to be an unequivocal
00:25:45
denunciation of this crime and political violence and the idea that we can settle our
00:25:50
differences in this manner. Yeah. And I give credit to any liberal or Democrat politicians who say that. But
00:25:56
unfortunately, that has not been the discussion. And I think that's been one of the most disturbing things about this
00:26:03
is that if it had just been Tyler Robinson and everybody agreed that this is a heinous act, a heinous murder,
00:26:13
I think that we wouldn't have had the same national conversation. It still would have been awful. But I think the reason why this became such a national
00:26:20
conversation is that we can all see that living among us, you have huge numbers
00:26:26
of people who've been indoctrinated into what Scott Adams called this Hitler bubble. And I think it's a real question
00:26:31
what we're going to do to get out of this bubble, because in order to have civil society,
00:26:37
we have to have the ability to engage in conversations, to have political debates
00:26:43
without fear of violence or murder. And when you have this sizable contingent of
00:26:48
people in our society who've deviated from that, it makes you really worry for the future of our society.
00:26:55
Absolutely. And it makes you wonder, what are we going to do about this? And there's a whole trans angle to
00:27:02
this that is being talked about, but maybe people are withholding a little
00:27:07
bit. But he was answering a trans question when he was shot, by somebody whose, I guess, girlfriend would be the
00:27:13
right way to say it, who was in the middle of transitioning. So there's many different angles here that we're going
00:27:20
to uh try to understand over time. Yeah. Right. And then in response to that, you saw an ABC News reporter describe the
00:27:27
text chain, which was a murder confession, as touching, you know, and so now I think that ABC
00:27:34
reporter has published an apology. So, I'm not going to dwell on that part of it, but I think what you see here is an upside down
00:27:40
morality. What Tyler Robinson did was an act of hate. The ultimate act of hate.
00:27:46
He committed murder. He left a wife without a husband. He left two children without a father. It was the ultimate
00:27:52
act of hate. But in his twisted mindset, it was somehow an act of love. And the
00:27:58
question is, how do we get to this place? And again, if it was just Tyler Robinson, then we could just chalk it up
00:28:04
to the random nut theory. But... Well, and we don't know about the random nut who shot Trump. Like, we don't have any
00:28:09
information on that person either. So I'm... Right. But what bothers me more, again, are all these people in Reddit groups or
00:28:16
online and blue sky which is just a cesspool of hate now who are
00:28:22
celebrating. Celebrating to some degree. Uh, no, there are definitively people celebrating. Yeah. This is why I
00:28:29
really think one of the most constructive things that we can do coming out of this is we have to find
00:28:36
out how to course-correct these people. Actually, J.K. Rowling had a really good tweet on this. She says that if you
00:28:43
believe free speech is for you but not your political opponents, you're illiberal. I mean I think we've known
00:28:49
that for a while, and I don't know if the woke left cares about that anymore. But anyway, she then goes on to say, "If
00:28:55
no contrary evidence could change your beliefs, you're a fundamentalist. If you believe the state should punish those
00:29:01
with contrary views, you're a totalitarian." And then I think the most important one is if you believe political opponents should be punished
00:29:07
with violence or death, you're a terrorist. And I think actually this was an act of terrorism in the sense that it
00:29:14
was designed to discourage other people from speaking out. Again, it was done in the most public place, in this town
00:29:21
square. But I think what Rowling is getting at here is that in order for civil society to survive, we're going to
00:29:27
need some sort of minimum standard of decency. You know, I remember when we were in college, we used to have these
00:29:33
like thought-experiment-type questions, which is: how much intolerance can a tolerant society
00:29:38
tolerate, things like that. And I think we have our answer which is listen you
00:29:44
can say whatever you want but our minimum standard of decency in a society
00:29:50
is that you cannot use violence to settle political disagreements. And if
00:29:57
you cannot accept that... I mean, really, that should be the pledge, instead of, frankly, Grover Norquist's tax pledge or
00:30:03
even the pledge about deficit spending. I think we need a minimum decency pledge that political violence
00:30:09
is unacceptable. And if you can't sign that pledge, you should not be participating in the discourse.
00:30:16
I mean, the fact that you have to even suggest signing a pledge to not murder people you disagree with shows how far gone this all is. Um, and how disturbing
00:30:23
it is. Uh, that that just should be something that is inherent in every human being in their operating system.
00:30:29
You just don't murder people because you disagree with them. Um, all right. Obviously, yeah, this is an
00:30:34
evolving story. We don't have all the answers here, but you see the Jimmy Kimmel
00:30:40
comments and should we play those and talk about it? Yeah, I guess it's a pretty good segue here. Um, so, uh, political discourse
00:30:47
adjacent. Uh, the fallout from the murder of Charlie Kirk is going to
00:30:53
stay with us for some time here. ABC has suspended Jimmy Kimmel from Jimmy Kimmel Live indefinitely after he made the
00:31:00
following remarks, and after pressure from the FCC and the affiliates of ABC. Here's
00:31:06
the 15-second clip. We hit some new lows over the weekend with the MAGA gang
00:31:11
desperately trying to characterize this kid who murdered Charlie Kirk as anything other than one of them and
00:31:16
doing everything they can to score political points from it. In between the finger-pointing there was, uh, grieving.
00:31:22
And he goes on to show a clip of Trump. Yeah. But I think you also left out that
00:31:28
sentence right after that where he says, "Yeah, there was grieving, but it was like a four-year-old grieving the death of a goldfish."
00:31:34
Yeah. So, um yeah, pretty inappropriate comments, I think. And here's the
00:31:40
timeline of it. So, cuz there was some thinking this could have been a MAGA
00:31:45
person because the parents were conservatives and during that breaking news,
00:31:51
you know, maybe that first sentence wouldn't be as off-putting or inaccurate as it seems to be, but he said it Monday
00:31:58
night, and this was 3 days after we knew about the anti-fascist bullet casings
00:32:04
that Chamath was referencing. "Hey fascist! Catch!" etc. And this was one day after Utah's
00:32:10
governor was very clear that this was a murderer who had leftist ideology and
00:32:17
one day after it was disclosed that Robinson had been living with a trans partner which may or may not turn out to
00:32:23
be relevant. Uh, you could totally be gay or have a trans partner or any of that, and it may have nothing to do with
00:32:29
this or it could have everything to do with this. So we'll have to wait and see. And Jake, weren't the
00:32:34
text messages out by then as well? Well, the text messages with the partner, that's a very good point. Those were actually released on Tuesday after the
00:32:41
show aired and they usually tape in the afternoons for those nightly shows, just so people know that. So, there was
00:32:47
enough information here that, you know, he was inaccurate or spinning the story,
00:32:53
however you want to phrase it. It was clumsy. Megyn Kelly put out a very long description of everything that was known
00:32:59
prior to Jimmy Kimmel's remarks and there was a huge amount of information to indicate that this was someone who
00:33:06
was in the grips of a radical leftist ideology and no evidence to suggest that
00:33:11
they were somehow part of MAGA. And what Jimmy Kimmel did there was viciously
00:33:17
lie, I think, implying or stating outright that effectively Charlie Kirk was
00:33:24
killed by one of his own, by a MAGA person, thereby, I think, diminishing
00:33:30
the crime somehow implying maybe that he deserved it and then he diminished the sincere grief and outrage that millions
00:33:38
of people are feeling. You know, beyond those of us who were friends with Charlie and knew Charlie, but millions of people.
00:33:43
But that is okay. You know, in free speech, Sacks, that statement,
00:33:48
that's his choice. He might lose audience over it, but free speech protects him doing that. The challenge was did he say something that he knew to
00:33:56
be not true and declare it as fact? And did that kind of warrant
00:34:02
the reaction that he got? Yeah. Yeah. And so if he said something that he knew... well, just reading his sentence, you
00:34:07
know, he was saying, "We hit some new lows over the weekend with the MAGA gang desperately trying to characterize this
00:34:13
kid who murdered Charlie Kirk as anything other than one of them." So he's referencing the weekend. So he, you
00:34:18
know, you could say he's being disingenuous here, but I think... he's just supposed to be a late night
00:34:25
comedian. It is inaccurate. It's highly inaccurate. It's a lie and it's a vicious lie. I'm
00:34:31
not defending it, but it was not inaccurate that people were doing that speculation. During a podcast on
00:34:37
Tuesday, FCC Chair Brendan Carr, who we all know, said the FCC would look into
00:34:42
revoking the broadcast license of ABC affiliates because Jimmy Kimmel was engaging in, quote, news distortion by
00:34:50
falsely claiming Tyler Robinson was MAGA. On Wednesday, Nexstar, one of ABC's affiliates, uh, that has 28 local
00:34:57
stations, said it would be replacing Kimmel show with different programming. And there's now this backstory that
00:35:03
Nexstar is buying a rival for $6 billion. So, the speculation is, like the
00:35:08
CBS and Paramount situation where they fired Colbert on July 17th, that they
00:35:13
don't want to get caught up in uh having the 47th administration gumming up their M&A activity. And then ABC after all
00:35:22
that pulled Jimmy Kimmel indefinitely. So I guess the question here is,
00:35:28
pull up the ratings. Pull up the ratings over time. We'll get to all that, but just let me ask the question. Do you have
00:35:35
concerns about this, Sacks? Obviously you've been very pro free speech here. Do you think that this is overreaching?
00:35:40
If he hurt people's feelings, if he said something inaccurate, do you think he should be the government should be
00:35:46
taking him off the air or putting pressure on them to take him off the air? Well, no. The government shouldn't be ordering him to be taken off the air,
00:35:51
but neither President Trump nor the FCC chairman Brendan Carr did that. The reason why Kimmel lost his job is
00:36:00
because the two largest affiliates, which are Nexstar and Sinclair, told ABC/Disney that they wouldn't air it
00:36:06
anymore. And I think part of this is due to the genuine offense at what he
00:36:13
said. Which, okay, I guess you can put the best face you want on it, but it was false and it was a flip way of
00:36:19
describing something that was causing pain to millions of Americans. Nexstar called his remarks offensive and
00:36:25
insensitive. That is true. And then Sinclair said that they're going to air a Kirk tribute in Kimmel's time slot
00:36:31
on Friday. So look, I think this is about the affiliates first of all finding what he said completely
00:36:38
offensive, but then secondarily the truth is that I think there's a lot of opportunism in this and this is really
00:36:44
about poor ratings. Kimmel has even worse ratings than Stephen Colbert, who recently got cancelled because his show
00:36:50
was too expensive and not enough people were watching it. And I think that what's happening here is the ABC
00:36:57
affiliates are frankly they're seizing an opportunity to rid themselves of this
00:37:02
money-losing disaster, and they're getting back that hour of television for other things. The question is whether Carr should have
00:37:09
said we can do this the easy way or the hard way; these companies can find ways to change conduct and take actions on
00:37:15
Kimmel or there's going to be additional work for the FCC ahead. That's what I'm sort of referencing in terms of the
00:37:21
government putting pressure on these folks to cancel him. Well, look, Carr was asked, I think on Benny Johnson's
00:37:27
show, whether there's anything that can be done about this, and then Carr said that there is a public
00:37:34
interest requirement with respect to the broadcast networks. So if Jimmy Kimmel's
00:37:42
show was on a podcast or on cable, that public interest requirement wouldn't
00:37:47
apply. But because those affiliates receive public spectrum for free, there
00:37:53
is a requirement that the reporting be truthful to some degree and serve the
00:37:59
public interest. And I think he was right to point that out. Now look, did
00:38:04
it ever get to the point where the government ordered him off the air? No. It didn't need to get to that point.
00:38:10
Because the guy's show was a ratings disaster. Look, even Jon Stewart, who
00:38:15
was a big late night host at one point, said... let me just read you his quote, not on this topic, but about late night. He
00:38:20
said, "Being a late night host is like operating a blockbuster video kiosk inside a Tower Records." In other words,
00:38:27
the business model is totally obsolete. And I think that's what's ultimately driving this. Look, we never got to the
00:38:32
point where the FCC took an action. And I think if we got to that point, then maybe we debate this issue of what the
00:38:38
government should do. But we never got to that point. And I think it's because the affiliates decided on their own that
00:38:44
it was time for Kimmel to go. Chamath, your thoughts? It's mostly that if you just look at the
00:38:49
ratings, the reality is that these folks get a lot more attention than they
00:38:54
actually deserve. I mean, it is an anemically sad audience approaching zero.
00:39:02
160,000 people watch Jimmy Kimmel live. More people hate stream our show in one
00:39:11
day on YouTube. You know what I mean? So it's like I mean so what are we talking
00:39:17
about here? This is like the pimple of the pimple of the pimple of the dog's ass. So I think that people needed air
00:39:25
cover to make a decision to cancel it today. And the reason why they wanted to cancel it today was if they didn't take
00:39:31
this opportunity, they would have had to wait another year or two for his renewal to come up. And so I think that there
00:39:38
was an economic motivation that was lingering for a long time before this
00:39:43
happened. His show sucks. It's not very good. At a very basic level, the social, cultural, economic reality, as these
00:39:51
numbers demonstrate, is that many, many, many millions of Americans who had a choice
00:39:56
every day to listen to this guy voted with their feet that this show sucked.
00:40:04
Let's move on. I will, yeah, say I agree that they made a very simple decision
00:40:10
that was economic because these same networks ABC and CBS were more than
00:40:17
happy when these shows were pulling in serious ratings and serious money back, you know, three, four, five years ago.
00:40:24
They were doing much better, which shows you how cable has come apart over the last couple of years. In 2022, ABC signed
00:40:31
Kimmel to three seasons for $45 million. In 2023, CBS signed Colbert to a three-year deal
00:40:37
that will be ending and as you showed in that chart, it's been plummeting since then. Quick note,
00:40:43
that's just for the very important 18 to 49 demographic. Obviously, they have a bunch of old boomers watching it on
00:40:50
their cable channels. That's the discrepancy in those numbers. Nobody watches this. But I do have grave concerns myself,
00:41:03
personally, in what Brendan Carr is doing. I don't think the FCC... Can I finish my
00:41:09
sentence, maybe? Okay. Or you can just interrupt me every time I start a sentence. And I'm... We just
00:41:09
talked about having civil discourse. Do you want to have civil discourse or do you want to just shoot me down? I'm... I'm civil.
00:41:15
In the first sentence? Please go ahead. I do have concerns over both how the administration is doing this. I think
00:41:21
it's disgraceful. I didn't like the pressure they put CBS under to get Colbert
00:41:27
cancelled, and I don't like the pressure they're putting ABC News under. We should have a tradition of free speech
00:41:33
and comedians should be protected even if you don't like what they have to say. This is the exact opposite of what many
00:41:40
folks backed this administration for, which was to have less censorship. And we sat here on this
00:41:46
program for many years talking about Biden censoring COVID talk and Biden censoring social media, with the
00:41:53
Twitter files showing a direct line in there to cancel things. I think that this... hold
00:41:59
on, let me finish. It's a dark moment for this administration
00:42:04
uh to be so obsessed with trying to silence their critics and it's not going
00:42:11
to work. It's only going to drive them to start podcasts and do more of it. Yes. Go ahead. Can I ask you a question? Yeah. Well, first I'll make a statement. I think
00:42:17
your comments are insane. But here's my question. I think you're also sycophantic to Trump.
00:42:23
What? What did I say? I said he didn't do anything. No, I think you're trying to suck up to Trump by saying it's a terrible show.
00:42:28
He's not funny. None of that matters in terms of freedom of speech. I'm trying to ask you a question. Do you think because the government did
00:42:35
nothing? Let's just be clear. They did nothing. They threatened him after he was already cancelled.
00:42:42
No. And they shouldn't be doing any pressure. The government should be staying out of
00:42:48
Okay. Let me ask you a question. Yeah. Do you think the people that made the decision to dump Kimmel, what percentage
00:42:55
of it do you think was the economic reality of a busted show? And what
00:43:00
percentage of it do you think was the impingement of free speech as you characterize it?
00:43:06
What percent... How do you... Absolutely. I think they would have let both of their contracts run out,
00:43:11
renegotiate them with smaller dollar amounts, and reset the economics of those shows if not for the pressure put
00:43:17
on them by the 47th administration. I think they would have just rode these shows out till their next contracts went
00:43:23
out and then they would have renegotiated them at much lower amounts. You think Trump is the reason why
00:43:29
Colbert got cancelled? What are you talking about? I think that the pressure... This is like a whole mythology now
00:43:34
that's been created. It's not a mythology. I think they wanted to get the Paramount deal through. No, no. I think getting the Paramount
00:43:40
deal through, the sale of the Paramount and I think they've settled a lot of these agreements because they don't want
00:43:48
going away. Well, that can all be true. Two things can be true at the same time. Jon Stewart understands the medium and
00:43:53
he says it's a Blockbuster store. They can't wait to be rid of these guys. These guys are overpaid. They're
00:43:59
entitled because they're grossly overpaid. They think they deserve this. By the way, free speech does not mean you have a
00:44:05
right to an ABC show. Sorry. You actually have to be able to get ratings. You have to be talented. You have to be
00:44:11
funny, which none of these guys are anymore. That's the reason they're getting cancelled. And exactly.
00:44:16
You know what? They're talking about replacing Kimmel with, not another late night performer,
00:44:22
Celebrity Family Feud. Like that's they're going to replace it with a game show. That's what they're talking about
00:44:28
replacing. Great show, by the way. Yeah, great show. It's a more affordable show. They would not have cancelled either of
00:44:34
these folks if it wasn't from pressure from the administration. The FCC head basically said, "We're going to do this
00:44:40
one way or the other." So, I believe that this is the kind of thing we don't want to see out of the Trump administration. And by the way, Tucker
00:44:46
Carlson, Glenn Greenwald, and I are all in sync on this. We all think that this is a bad look for Trump and the Trump
00:44:52
administration. You guys can disagree, but that's what we do here. We have several disagreements. Where were all the people who were complaining about this when Gina Carano got
00:44:59
fired by Disney? Oh, I absolutely felt that was inappropriate. I think there's cancellation now
00:45:05
on both sides, and I understand people are very hurt by what happened. Are you cool with Roseanne Barr getting
00:45:11
fired? I am against anybody getting cancelled for their political beliefs. I was against it when they did it to the right
00:45:17
and I'm against it when they're doing it now to the left. Is that clear? Let's talk about cancel culture for a
00:45:22
second. I actually think that Dave Portnoy had a great take on this. There is a legitimate question here about
00:45:29
cancel culture, like should it ever be used, when should it be used? And I think Portnoy made a really good point.
00:45:34
He says here, to me, cancel culture is when people go out of their way to dig up old tweets, videos, etc., looking for
00:45:40
dirt on someone. So, in other words, they're going through 10, 20-year-old tweets of yours. Maybe they find some
00:45:47
joke that was either off-color or in poor taste, or maybe it's just
00:45:55
something they can spin as being that thing and it's done in an opposition research type of way, right? They decide
00:46:01
they're going to go after you and so they sort through everything you've ever written. That's cancel culture. Okay. He
00:46:07
contrasts that with somebody who is on live TV basically saying something that
00:46:13
a ton of people find offensive, rude, and dumb in real time. So, in other words, this was not like opposition
00:46:19
research. This was a spontaneous reaction to a deeply offensive and insulting
00:46:25
and dishonest and false thing that Jimmy Kimmel said. And he already was on thin ice because his ratings are terrible and
00:46:34
his syndicates want to get rid of him. And like Portnoy says, that is not cancel
00:46:40
actions. And I think he's right about that. And I am a free speech absolutist. If Jimmy Kimmel wants to create a
00:46:47
podcast and say all the types of things he said on ABC, I got no problem. That's his right. He will not be censored. He
00:46:53
can do that. Okay. And my view on the whole public interest requirement is what we should do is just auction off
00:46:59
all the network spectrum. I think it is a can of worms to basically say to all these broadcast affiliates that we're
00:47:07
going to give you all this wonderful free spectrum, but then you got to uphold the public interest. It's way too debatable what that means. I do think it
00:47:13
basically invites challenges to what could be free speech. So, I think just
00:47:19
get rid of the public spectrum. Just have an auction and we don't need free public spectrum for broadcasting.
00:47:25
I definitely agree with that. In the era of the internet, you do not need you do not need public spectrum anymore.
00:47:31
I think that that solution has a lot of positives beyond this issue. I think that's
00:47:37
the best use of that spectrum anyway. And just to go back to this idea of, look, if someone in real time is
00:47:42
advocating for the murder of someone for their political views or for any reason,
00:47:48
maybe that does justify a spontaneous outpouring where it should be reported
00:47:54
to their employer, hey, this person is saying this thing. If they're dancing on the grave of someone, if they're praising that act of murder, you know, I
00:48:02
have to say that again, I have a minimum standard of decency for civil society.
00:48:07
And that minimum standard of decency is an unwillingness to engage in political violence or to celebrate political
00:48:15
violence when somebody engages in it because otherwise we're down the path of civil war. I think we all understand
00:48:21
that. So, I have to say that if someone's going to advocate for these views, I'm okay with there being a
00:48:28
spontaneous reaction of people complaining about it. Yeah. And if that means that Jimmy Kimmel does
00:48:34
a podcast instead of an ABC show in prime time, I'm I'm okay with that. I think, you know, the issue I have with
00:48:41
any of these cancellations is when the president is asking the networks to cancel people during his, you know, 17
00:48:48
times-a-day press conferences and working the gaggle, he's literally telling them to fire the last two, and then the FCC
00:48:54
chair is telling them to fire them and that they'll figure out a way to do it. That to me is the height of putting
00:49:01
pressure on these companies, and I think it's abhorrent. Okay, that's just my personal belief. You guys don't have to
00:49:07
agree with it in this civil debate we're having here. Let's talk about Larry and David Ellison. They're making
00:49:13
massive media moves both in legacy and social media. Let's talk about the legacy stuff first.
00:49:19
The Paramount-Skydance merger is basically a month old, and already CEO David
00:49:25
Ellison, that's Larry Ellison's son, is looking to acquire Warner Brothers Discovery. As you probably know, they're
00:49:31
run by David Zaslav. They own CNN, HBO, DC Comics, Discovery, a bunch of other brands. And that would put the Ellisons
00:49:38
in charge of not just CBS News, which owns 60 Minutes, they would also own CNN,
00:49:45
as everybody knows, Larry Ellison, big GOP donor in the past. And there also
00:49:50
are reports, interestingly, that David Ellison wants to buy The Free Press from Bari Weiss for $200 million and put
00:49:56
her in charge of CBS News and 60 Minutes. That would be seismic in the news business.
00:50:01
If you put those assets together, it's about 200 million paid subs: HBO and Discovery, 120 million paid subs right now. CBS reaches
00:50:09
80 million paid with Paramount Plus. And the Free Press is obviously niche and brand new. Uh but they're growing. They
00:50:14
got 136,000 paid subscribers and, you know, almost a million followers across
00:50:20
their social media. The second piece to this puzzle that's super interesting is social media. Oracle is now the heavy
00:50:27
favorite to acquire TikTok, and Trump said that was going to happen as part of the US and China
00:50:34
negotiations that are ongoing for trade. So what are your thoughts here
00:50:39
Freedberg? You wanted to talk about this is an interesting collection of assets coming under the Ellison umbrella. Is
00:50:46
this strategic? Is this vanity? Is it a nepo baby with a huge, you know,
00:50:52
blank checkbook? What are we seeing here? And what do you think? I mean, what do you do when you're worth a couple hundred billion dollars and
00:50:57
you're 81 years old and you're thinking about what legacy you want to leave behind, except to perhaps empower your
00:51:04
kid to build the largest, most influential media company in history.
00:51:09
And I think that's the story that's unfolding in front of us. Not to mention, I think that the TikTok deal
00:51:15
and Ellison's role in TikTok is going to be instrumental in realizing potential future distribution. So if you
00:51:22
think about the way media has evolved, it used to be kind of centralized studio-based production models, the old Paramount, the old Warner Brothers, etc.
00:51:30
And then there's been a lot of streamers that have come on that started to syndicate that content and contract for
00:51:35
production of content like Netflix. And obviously Paramount Plus and HBO Max
00:51:40
have their own streaming services to deliver their own content. But at the end of the day, there's kind of two big
00:51:46
behemoths that each come at it from a different place. One is YouTube and the other one is Netflix. Netflix has
00:51:51
historically been in the kind of scripted production or contracted production side and YouTube in the
00:51:56
social production side. The alternative to that might be this consolidated merger and Tik Tok and now Larry
00:52:03
Ellison's going to have a hand in both. And so there may be and I do think that there's going to be this convergence
00:52:09
between this kind of socially generated content platform, like a YouTube, like a TikTok, and the high-value produced
00:52:15
content like the studios like Netflix has been doing. And the reason is Netflix has had to compress margins on
00:52:23
the content creators and we talked about this at our summit last week where a lot
00:52:28
of the creators now that went to Netflix to get good deals or went to Amazon,
00:52:34
they're all finding that the budgets are getting cut and that the only way these folks are going to get paid is like cost
00:52:40
of production plus 10%. With cost of production getting squeezed and budgets getting squeezed by Netflix. So, as a
00:52:46
creator, you may actually make more money by taking your content to a bigger audience with sponsors or advertisers on
00:52:54
YouTube. And so, a lot of big-time creators, by the way, the audience at YouTube is over 10x bigger than it is on
00:52:59
Netflix. So, Netflix is only paying to retain subscribers. Now, you know, subscriber growth is kind of slowed down
00:53:06
a bit. At the end of the day, Netflix is just spending money on content to keep people on the platform. So if you take
00:53:13
the incredibly rich content and production capabilities of HBO and all the Warner Bros. Discovery media
00:53:19
properties and production houses underneath this combined company and you combine that with the direct to consumer
00:53:26
distribution of TikTok, there may in the future be a merger between this media company and TikTok or a deep
00:53:32
commercial relationship where, imagine going on TikTok and you can now get premium content for 10 bucks a month or
00:53:39
two bucks an episode, and watch all of your HBO shows in the TikTok app, or watch all of the Discovery or all of the
00:53:45
other content that's available. So, I do think that the distribution that has been delivered by this kind of social
00:53:52
media model like YouTube and TikTok, combined with the premium model, may end up creating a real category killer that
00:54:00
can challenge both YouTube and Netflix. And so, I would kind of look at this story as like a beginning of an unfolding of something that may rewrite
00:54:06
the entire media landscape. I mean, when you look back on it, Freeberg, Katzenberg
00:54:12
was maybe ahead of his time with Quibi. He wanted to try to make this bridge between the two, and he just
00:54:17
might have been too early or just didn't execute. Well, think about the network effects. What made TikTok so big and what made
00:54:22
YouTube so big is the long tail of user generated content and that's what drove
00:54:29
the audience and that's what built the platform and the mainstream, right? And... Yeah. Well, then you can do the
00:54:34
premium stuff on top of it. Yeah. But when you try and create premium content in a small way like Katzenberg tried to
00:54:39
do, it's very hard to build the audience to make the economics make sense. And the economics are really challenging
00:54:44
already with premium content in the big players as you can see with Netflix. They're they're compressing budgets. So
00:54:50
I would I would argue that you really need to have them both to make the model work now because the audience is so attuned to user generated content.
00:54:57
Here's what our partner Polymarket says, shout out to my guy Shane: 84% that Larry Ellison or Oracle, kind of the same thing
00:55:04
here, they put it as a slash, acquires TikTok. That's up over 20 percentage points in
00:55:09
the last couple of days. What do you think all this means? The divestiture of TikTok, Larry Ellison
00:55:15
getting it as opposed to Google, Apple, Microsoft. I mean, there were so many people, maybe even Elon, who might have
00:55:21
wanted TikTok. What do you think of this divestiture, and then, uh, our
00:55:27
government gets the golden vote now? I understand that the US government will have a seat on the board of TikTok.
00:55:34
So the Chinese have given it up except for the algorithm. What are your thoughts? Remember, it's TikTok US.
00:55:39
It's only 5 to 8% of the overall TikTok business.
00:55:45
So TikTok US is what's being spun out. Yes. Correct. Chamath, what do you think here? I think there are two things that are
00:55:51
important. The first is that in the future and this may sound very dystopian but he who controls the algorithm will
00:55:58
control what people think. And if you think about it in that context, you need, as you've said before, a
00:56:06
marketplace of very different approaches and algorithms that are essentially
00:56:12
fighting for mind share. If you don't have that, you'll have a massive zombie
00:56:17
group think culture. So from that perspective, you have to put Tik Tok
00:56:23
into the hands of a completely different owner than any of these other social media sites so that they are motivated
00:56:30
to compete against each other. That's one. The second thing is there's been a lot that has been said about the TikTok
00:56:37
algorithm, called Monolith, and a bunch of it has already been put out openly, and TikTok was very
00:56:42
transparent and they published a paper. You can find it on arXiv. You can show the link to it, Nick, maybe in the show
00:56:47
notes, but it's an incredible paper that describes a very simplistic approach to essentially moving people into different
00:56:54
directions of thought. So if you put all of these ideas together, I think where we are maybe all
00:57:01
the way coming back to where we started this discussion about Charlie Kirk,
00:57:07
it is increasingly important to make sure that the overwhelming majority of
00:57:14
how people get ideas is understood by the rest of the people. Because if you
00:57:20
start to go down these rabbit holes in ways that are algorithmically programmed in models that you don't understand that
00:57:26
then push you into extremism, you will end up in a very very very bad place as will society.
00:57:33
And then the outcomes of that are completely avoidable, as we're seeing. Yes. So I think that the TikTok thing is
00:57:40
going to be one of these important moments where we shine a light on the importance of these algorithms. It's
00:57:46
poorly understood. It's not well talked about, but I think what the Trump administration is doing is important to
00:57:52
keep it away from everybody else so that there's more competition. Yeah, I'm gonna um strongly agree. I
00:57:58
think the Trump administration did a great job on this one. Just calling balls and strikes of not making sure
00:58:03
Meta got it or somebody who has already a lot of algorithmic control over what
00:58:09
society is seeing. And just on this, you know, I give
00:58:14
a lot of credit to Elon too because his algorithm is open source. We know what's happening there. He just did it two weeks ago. Yeah.
00:58:20
If that's open source and Monolith is open source, then I think the next big question we have to ask is what is going
00:58:27
on in Reddit? What is going on in Meta? How are we guiding and shaping people?
00:58:33
Why does YouTube sometimes, for example... like, our Tulsi Gabbard video is apparently deemed too risqué and adult
00:58:39
content? Who decided that? How did that happen? An algorithm. Yeah. Not any human. And you know the thing I think is
00:58:46
important to start thinking about here is as an industry we're going to either need to regulate this ourselves or be
00:58:52
regulated. I think that there should be regulation, and I'm not, like, super a fan of regulation, but algorithms must be
00:58:59
disclosed and you must have the option given to you upfront to switch your
00:59:06
algorithm. There should be a BYA, bring your own algorithm. There should be an algorithm store. And I talked to Elon
00:59:12
about this publicly on on X many times. If you could say, I want one that just gives me a chronological feed. I want
00:59:18
one that is from the highest quality sources. And then you should be required to show what the default algorithm is
00:59:24
doing. And if you don't do that, I think you should lose your Section 230 protection, because an algorithm is more powerful than an
00:59:32
editor at the New York Times or an editor at HBO or Netflix deciding what we see, deciding what goes on the
00:59:38
homepage. The algorithm is a black box and who knows who's controlling it. And then what happens in society when young
00:59:45
men are pushed towards more and more fringe content, more and more dark content? Why not just have all this be
00:59:51
transparent? Why not let people see? Unfortunately, they don't need to be pushed to fringe content. All they need to do is go to a school where the woke
01:00:01
curriculum has been mainstreamed. And bingo, Randi Weingarten's foot soldiers
01:00:07
are saying that conservatives are fascists. So, you're acting like you got to go through some weird rabbit hole to
01:00:13
imbibe these ideologies when all you have to do is attend a public school and
01:00:18
probably a lot of private ones, too. I think Dalton's probably the worst. So, again, you're acting like the radicalization's coming from the
01:00:24
fringes. I think what's so concerning about where we are as a society is that the radicalization's coming from our
01:00:31
institutions, or a lot of these institutions. You're not going to get an argument from me. I think these schools
01:00:36
should be teaching, you know, basic skills and they should not be involved
01:00:41
in the culture wars in any way. I don't think any parent is sending their kids there for that. And if you are a parent and you're in California, I would
01:00:49
encourage you to figure out what they're teaching your kids. Uh, and here in the great state of Texas, it's quite
01:00:54
Totally agree. But hey, can I It's quite different here in the great state of Texas, by the way, I can tell you,
01:01:00
having lived in both places with three kids. Yeah. Yeah. I get it. And look, that's why a lot of people move to red
01:01:06
states, because they care about it. So, but hey, let me just Did someone mention that our Tulsi video from the All-In Summit
01:01:13
has been partially censored on YouTube. So, what happened is, I was at my office yesterday
01:01:19
and I open up our YouTube channel cuz Nick had sent out a link that something new had been posted. So, I went to go see what had been posted that day. So, I pull it
01:01:26
up and I saw the Summit videos and I saw that Tulsi wasn't there. I'm like, "Wait, didn't we post Tulsi?" And then I
01:01:32
looked for it. I couldn't find it. So I texted Nick and he says it is there, and then I clicked on the thing and it
01:01:37
said safe search is on. What does that mean? I searched and apparently
01:01:42
if your network administrator on your enterprise, cuz I was at my office, we had our network set to restricted mode,
01:01:49
it's called. In restricted mode, people can't, like, look at adult content on the network and that kind of stuff. And when
01:01:54
you have restricted mode on, YouTube turns off anything that would fall under mature audiences. So here it is:
01:02:01
activate restricted mode. So it prevents a huge number of people. Huge number of people.
01:02:06
That's a huge number of people. This would have been the second most viewed video. I guarantee
01:02:11
you after the Elon video maybe and maybe the most viewed and I was shocked. I said, "How is it possible that this
01:02:18
video is struggling to be in the top half and this is why?" And you can't
01:02:24
even see it. So then you put up Well, it's unclear. So
01:02:29
certain videos then get tagged as being for mature audiences and they don't show up in restricted mode by the algorithm
01:02:35
on YouTube. Oh, so the algorithm did it again. It wasn't reporting. I'll give you guys my theory. My theory
01:02:40
is because, you know, the term Russiagate comes up and those sorts of terms, and I think those terms are triggers if
01:02:46
they're in the transcript. They're triggers to be put under kind of safe mode. She's debunking Russiagate, right? But do you think if
01:02:53
it was a video espousing and promoting the Russiagate hoax that it would have gotten censored this way?
01:02:59
It's a good question. I don't know. Yeah. I mean, we can test that by looking for other people discussing it. I mean,
01:03:05
you mentioned Scott Adams earlier. He talks about it 17 times an episode. By the way, this is a government official in the White House who is the
01:03:13
director of national intelligence who is providing an interview on national intelligence and is being filtered out
01:03:19
as being for adults only. Very weird. This is insane. It's insane. This is like when Trump as the sitting
01:03:26
president got banned from social media. I mean this you're right. This is a public official describing
01:03:32
intelligence investigation. She is doing an investigation, reading out an
01:03:38
investigation. I mean, putting aside the topic, it's a discussion. Who on earth
01:03:44
would think that this is something that people shouldn't consume? Why is this being flagged adult? What's adult about
01:03:49
it in any way? It makes no sense. Or it's for mature audiences. Yeah. Some definition around it. But I
01:03:54
think that there's some keywords that triggered it, without the algorithm acknowledging or recognizing it. This is one of the points Chamath was making. We
01:04:01
don't know how the algorithms work, because the algorithm clearly cannot determine
01:04:08
that this is an actual government official and therefore should not be restricting access to the content. I don't know, man. AI's gotten a lot
01:04:14
better in the last year. It seems like this would be an easy
01:04:19
algorithm. Yeah, here's what YouTube says about their filter: This helps hide potentially mature videos. No filter is
01:04:26
100% accurate. No. I wonder if it's also people Here's the question with respect to
01:04:33
the error rate. What percent of those videos are conservative versus liberal in their
01:04:39
ideological orientation? Are Bernie Sanders and Mamdani getting the same treatment? If conservative content and liberal content are both subject to
01:04:46
the same error rate, then you know, they have a leg to stand on, maybe. I still don't think they should be doing this, by the way.
01:04:51
I mean, look, potentially mature videos should be I just want to add one technical thing here and then I'll go to you. This could also
01:04:58
be caused, I know in the past, by people reporting stuff they don't like. So people would do this report bombing, you
01:05:05
know, get 20 different people in a, you know, Signal group or in a Discord group to go report this from different IP
01:05:11
addresses and file the reports to then silence it. So that happens. Wait a second. Did Tucker and
01:05:17
Cuban also get censored? That's a debate. I don't understand. Wait, what happened with that?
01:05:22
Anyway, Neil, can you check this out and report back to customer support here? Go ahead, Chamath.
01:05:28
The Tucker Cuban debate was a perfect example of civil discourse.
01:05:34
Yeah, there was a hug. I think this is an example. There was a famous media theorist, his name is
01:05:40
Marshall McLuhan and he had this quote which is the medium is the message, right? And what did he mean by that? He
01:05:48
meant that the characteristics of the medium itself, meaning the way information is delivered,
01:05:54
have a greater impact on society and individuals than the actual content of
01:05:59
the message. What he's foreshadowing is the importance of these algorithms and these decisions that happen in the
01:06:05
shadows, because the weight of those decisions has all of these downstream impacts. This is why they should be transparent. Why
01:06:12
can't I see my YouTube algorithm? I can confirm that Tucker and Mark Cuban also get filtered.
01:06:19
This is insane. So Tucker and Tulsi are getting filtered. All right, listen. Great segue here
01:06:26
because uh we didn't get to recap the summit and we had
01:06:32
quite paradoxically, we had a real discussion about the importance of debate, friendship, and, you know, the
01:06:39
process we try to do here and at the summit, which has let a lot of people speak and debate important issues, of
01:06:45
which the Tulsi and the college university heads discussions, so many of
01:06:50
these were important discussions. But people didn't see our recap, so just going around the horn here. Great
01:06:56
moments you felt were important or enlightening to you.
01:07:04
Chamath, did you have a moment at the summit, you know, one of your amazing
01:07:09
insights that you felt was particularly insightful that you want to share with the audience?
01:07:15
I think the most important conversation there was the one with Tulsi. I really do. I mean, I think at a very
01:07:21
basic level, why does America go to war? Why do we fight wars? Why are men and
01:07:28
women being killed? Why? And it's the same thing with Charlie Kirk. I
01:07:34
just get so agitated with this idea that a human being believes that they have the judgment
01:07:41
to take somebody else's life. To me it's incomprehensible. It's just so wrong. It's so immoral.
01:07:48
It's just so wrong. And what she talked about and what she's exposing
01:07:54
is the cascade of lies that causes an entire generation to lose tens of
01:07:59
thousands of their brothers and sisters. To lose your son and your daughter. So
01:08:04
yeah, it's um that was the most important conversation we had by a country mile. Freeberg, you
01:08:11
have a moment for you that was either entertaining, enlightening, or
01:08:17
important. I'll let you take any angle on it. I would recount our wrap-up that we did at the end of day two.
01:08:25
When we first started planning the content for the summit this year, I was actually thinking about trying to get a
01:08:32
bunch of American speakers for day one and then a bunch of international speakers for day two where we could kind
01:08:39
of contrast the American perspective from the global perspective and hear how does the rest of the world view the
01:08:45
world and and view America and how does America view America and view the world and it was hard to kind of get the
01:08:50
scheduling right. So the way things kind of ended up getting scheduled out, we we tried to cluster the conversations into
01:08:56
different themes, but I didn't feel like we were going to get the orientation I was hoping for. And so we ended up having just a bunch of conversations.
01:09:03
But what was really powerful for me was the similarity we saw with Elon,
01:09:08
with Tucker and Mark Cuban, with Alex Karp, and we brought it up, I think also with Eric Schmidt, is this idea that
01:09:17
much of what we see in terms of the evolution of society in the West can be
01:09:23
viewed through a lens of suicide: that the West may be committing
01:09:29
suicide financially in our deficit spending in terms of birth rate decline. So, we're
01:09:36
kind of destroying our dollar. We're destroying our headcount because of the birth rate decline. And then this
01:09:41
immigration policy issue where we're potentially destroying cultures, which
01:09:47
was a topic that came up a few times, and perhaps even thinking about the Tulsi comments, destroying ourselves by
01:09:53
by thrusting ourselves into war and chopping our heads off. And then we heard a lot of optimism. We heard a lot of technologists. We heard a lot of
01:10:00
folks talk about the options and the choice we have and that we have a choice to kind of reverse these trends and kind
01:10:07
of change how we're behaving as a society. And so it all came together as one long string. And we ended it with
01:10:12
this idea that there's choice and you have a choice to kind of go forward or you have a choice to commit suicide. And
01:10:19
then what really hit me was we walked out and then the Charlie Kirk murder happened. And so it created this
01:10:24
emotional wrapper for me on the conversations we had that there were a lot of people that are going to make
01:10:30
different choices. And so I just wish we could do a summary of what it was like at the summit instead of each one of these
01:10:36
talks standing on their own. They work well on their own. But I think that that whole kind of set of conversations
01:10:41
leading to that point of view was really important and poignant for me. Did we end up publishing our final like
01:10:46
wrap-up conversation? Not yet. We have not done it. What I want to do is I want to do that with the
01:10:52
summary video at the end of the summary video. So, I want to be like here's the a couple of key points from the the
01:10:57
talks once we get all the talks out, and then our wrap-up conversation, which I think we should do. Sax, you were
01:11:03
incredibly engaged in this summit. You showed up. You were engaged.
01:11:09
Were there moments that you found particularly enlightening, inspiring,
01:11:15
fun? Oh my gosh. We just found out that the Karp interview has also been censored in safe mode on YouTube. So wait, basically
01:11:24
the three talks were Tucker, Tulsi, and Karp, which by the way, I think if you had the audience vote on what the most
01:11:30
engaging and interesting, those were the three best. Those are up. Yeah. And by the way, there are a lot of other really good ones, but those ones
01:11:37
I think were the most captivating. I think the audience you could see was like really engrossed,
01:11:43
animated. It could be because of Jason's point, which is that if enough people click report an issue, it gets put into
01:11:48
that part of the algorithm. So, yeah, we're going to have to give a little grace to Neil here. Oh, so who's the one who's doing
01:11:54
all the reporting here? Who's weaponizing the YouTube reporting system? It's a bunch of tolerant liberals.
01:12:00
It could be Putin. You never know. Sax, you have moments. We don't need the Russians doing this to
01:12:06
us. There could be a cabal. That is, we could be inflicting this on ourselves.
01:12:12
I agree with Freeberg's wrap. I thought it was remarkable how the whole thing kind of came together at the end and and
01:12:17
there seemed to be a really stark choice and I agree that the next day Charlie was assassinated and um I mean I stopped
01:12:25
thinking about the summit, but I agree that it really did play into this idea of the suicide of the West, in the
01:12:31
sense that like we talked about there apparently are a large contingent of people who don't believe in what I
01:12:38
would consider to be the central virtue or tenet of Western civilization, which
01:12:45
was expressed, or at least it's attributed, to Voltaire. What Voltaire said is, I disapprove of what you say, but
01:12:52
I will defend to the death your right to say it. I think that is the central tenet behind Western civilization,
01:12:59
because out of that came the right to free expression, to freedom of religion. Out of that core principle came all the
01:13:06
other principles all the other key freedoms. The right to think and believe and worship as you please. freedom of
01:13:13
religion, freedom of speech, freedom of the press, it all comes out of that. And what you have here is people like
01:13:21
Tyler Robinson, or the people who celebrated what he did. It's the exact opposite. It's: I disapprove of what you
01:13:26
say and I will murder you to prevent your right to say it. That is
01:13:31
the antithesis. It's the antithesis and it's not Western civilization. It's something different
01:13:37
and destructive. And where it will lead is to a civil war that none of us should
01:13:42
want on either side of the political spectrum. One thing I would just like to say, and then Jason, I'd love to hear what
01:13:48
you think about this, but one of the most powerful things that I think Charlie got absolutely and precisely
01:13:54
right was the plight of young people. Meaning the malaise and the anxiety and
01:14:01
what was at the root cause of it. At the root cause, in the way people think and now in the way people act, is economic
01:14:08
instability. And that was a thing that I think also came out at the summit. And
01:14:13
Charlie does a really good job of very precisely putting into a box what they're going through. The lack of home
01:14:20
ownership, the overwhelming and just crushing debt, the failure of the school
01:14:25
system, the inability to find a great job, the hookup culture that takes people away from having a committed
01:14:30
relationship, getting married, having children. There's a totality of issues that are really cultural. And I would
01:14:37
just encourage people to go and find this out, because I think at the root cause of all of this is that I think
01:14:43
that we need to sort of find a way to give folks a chance to believe in
01:14:48
something. And if you don't, I think the algorithm will take you to a very bad
01:14:54
place. It's well said. I was absolutely delighted. Great job uh Freeberg and to
01:14:59
the team, John and Kimber and Nick and just so many people put so much work into that. It was amazing to see us
01:15:05
double the size of it while increasing the quality of it from the speakers to the experience. Food was excellent. I heard the food was
01:15:10
excellent. No lines. I mean, all that logistical stuff was just so dialed in in year four. So, I give a lot of credit to our
01:15:16
amazing team. I have to thank our sponsors. They did an amazing job of making this summit available to so many
01:15:23
people. We can do the scholarships and have so many people involved. And there are three very special partners who did
01:15:28
insane buildouts at the show. Solana, our friends over there, went crazy. They did the entire upstairs. I don't
01:15:34
know if you hung out there, but I was at their coffee bar and they had a juice bar, IPO themed bar. It was just
01:15:41
awesome. It was awesome. They hosted two of the dinners, too, the public market and geopolitical
01:15:46
dinners. Our friends at OKX, they showed up big for us at the summit. They built out an awesome lounge uh modeled after a
01:15:52
saloon. I went in there and had a little bourbon myself on Sunday. They screened their first film, Mild Mild West. They
01:16:00
gave away 250 in Bitcoin to all attendees and somebody won a hot lap with an F1 driver because they sponsored
01:16:07
the McLaren team. That was super nice of them, huh, Freeberg? Yeah, that was awesome. And we're going to be at that Formula 1 event in Vegas
01:16:14
in November, which we're going to Vegas. Yummy. Oh, we are. Yeah. Yeah. Yeah. I is a new partner this year and uh they
01:16:21
operate all these NextG data centers. So, big shout out to them for doing the casino night which was a huge hit at
01:16:27
Cliff. I went to that. Did you go to that? I didn't. I was at the other event. It
01:16:32
was insane. I thought it was like the pictures. Look at the pictures. I actually went to all four parties on
01:16:38
the Monday night. I was completely wiped out. But man, the casino night was awesome. It was so fun.
01:16:43
You're like Phil Collins at Live Aid. You went to everything. You got on your Concorde and you went everywhere.
01:16:50
That's right. And they had a great engraving station in the expo hall that packed the event.
01:16:55
So just couldn't do it without you guys and gals. Again, without the sponsors, J-Cal,
01:17:01
we would not be able to put this event on because uh we spent way more than the ticket revenue
01:17:06
and we would not be able to have the scholarship tickets, which I think are really important. I remember in my early
01:17:12
career, like I couldn't afford to go to a fancy event. And I think You guys debated me about the
01:17:17
scholarships for the first couple years, and now you realize it's great to have that young energy, right? Yeah. Sometimes you get something right. I really
01:17:24
enjoyed having Joe Tsai there and talking about the China and America relationship not being, uh, one in which, you know, we
01:17:32
have to go down a prescribed route that there could be offramps and there could be collaboration there. I thought Neil
01:17:38
um from YouTube uh despite what we're talking about here today I appreciate him coming and engaging us with the
01:17:43
censorship issue he got into with us, and I thought that was very productive. I love the fact that people from this administration, Tulsi and
01:17:50
Chris Wright, they were willing to engage in a vibrant debate. The Dartmouth and the Berkeley discussion on
01:17:56
civil discourse is just so important and they have such an impact on kids' lives
01:18:02
when their brains are developing as we talked about on this show. Rick Caruso on the management of our cities I think
01:18:08
was amazing. And then you know Tucker and Karp they went right to some really
01:18:14
hard discussions around anti-semitism and the situation in Gaza. You know also
01:18:19
these are very difficult things to discuss. This event was a celebration of debating hard topics and for that I am
01:18:26
very thankful for my friendship with each of you and to have this amazing experience of the All-In podcast and
01:18:32
most of all the community, meeting all of our I won't call them fans; I think they're our friends, our besties in the audience,
01:18:39
and what a great job Freeberg in accommodating the scholarships. I know some people get bent out of shape. Oh,
01:18:45
you gave somebody a scholarship, whatever. We got a lot of young people in there to experience it as well as the
01:18:51
people who could afford the tickets, etc. And I'll say what a crazy joyful
01:18:56
moment at the end when Diplo played and three of us were on stage
01:19:01
dancing and uh just celebrating with our kids and I was having another seizure but thanks.
01:19:07
Yeah, thanks for letting me up there for a few minutes. Yeah, you came up for a couple of minutes and enjoyed yourself. Uh but
01:19:13
yeah, just um congratulations to the team. It was an amazing effort and what a great dialogue and I think that spirit
01:19:19
of dialogue and making a choice to engage the hard topics in a respectful manner at the end of every program.
01:19:25
Chamath says love you besties and Sax gives a back at you. But the truth is, even in a contentious debate, you can
01:19:32
still remain friends and love each other despite those debates. It's the most important American ideal. It's the most
01:19:38
human ideal, and, dare I say, you know, I'm not a Catholic anymore but I've remained a Christian, it's one of the
01:19:45
most Christian ideals. And I know faith was a major
01:19:50
part of Charlie's life and his legacy and um you know that hit
01:19:58
me personally as well. His Christianity um even if I disagree with him about you know issues and debates, he was a
01:20:05
Christian at his core. and he cared about other people deeply and he cared about humanity. So, rest in peace.
01:20:12
Let me just um say, Jake, I'll thank you most of all for taking the time to watch all of those Charlie videos. How much
01:20:17
time, did you say it was like 40 hours? I watched dozens of hours and 50
01:20:22
videos, um, just to make sure I got from first principles my understanding of who he was. I disagree
01:20:29
with him about gay rights. I disagree with him about, you know, how he said certain things. But there was nothing I
01:20:35
saw in there that could even come close to justifying anything
01:20:41
that would be anywhere near violence. Well, in any event, I I appreciate that
01:20:46
you took the time to watch Charlie for yourself as opposed to listening to the
01:20:52
mainstream media's mischaracterizations of him, which are attempts to either downplay or minimize
01:20:59
or even to some degree justify what happened. And by doing that, J-Cal, I think you really honored our friend
01:21:05
Charlie. So, thank you for that. All right. Rest in peace, Charlie Kirk. And love to his family and his and his
01:21:11
friends. By the way, there was an incredible video that he did where somebody said
01:21:16
something along the lines of define what it means to be a conservative. And he said, you know, sometimes people think
01:21:22
that what that means is to like limit rights or whatever. And he said, that's not what it means. It's just that I
01:21:28
believe that I would like to conserve and honor a specific way of life that
01:21:33
has existed in the past. And he talked about it, you know, where you could go get a job, you could do
01:21:39
better than your parents, you get married, you have kids, everybody's safe. I mean, it kind of shocked me
01:21:47
because I thought, wow, that definition is actually what many, many people
01:21:52
believe even if they don't believe that they are a conservative because a lot of people um believe that you know and this
01:21:59
is not taking away from other people but that's what we'd want for a lot of us
01:22:05
for our kids. Let's all set an example here. Have vibrant debate and love your besties. Okay. And we'll see you all next time.
01:22:13
Love you, boys. Love you, besties. Bye-bye. There's a back at you. No back
01:22:18
at you. We missed the back at you.

Podspun Insights

In this poignant episode, the hosts grapple with the shocking murder of Charlie Kirk, a young father and political commentator, who was killed while engaging in the very debates that define American discourse. The conversation dives deep into the implications of his death, exploring the chilling effects on public dialogue and the rise of violence against differing opinions. With heartfelt reflections from those who knew him, the hosts discuss the dangers of ideological extremism and the importance of maintaining open channels for debate in a democracy. They also touch on the broader societal issues that may contribute to such tragic events, including the isolation of young people and the impact of online culture. As they share their thoughts, the episode becomes a tribute to Kirk's legacy and a rallying cry for the preservation of civil discourse. The emotional weight of the discussion is palpable, as they emphasize the need for empathy, understanding, and the courage to engage with opposing views, reminding listeners of the fundamental values that underpin a healthy society.

Badges

This episode stands out for the following:

  • Most shocking (100)
  • Most controversial (100)
  • Most emotional (95)
  • Best concept / idea (95)

Episode Highlights

  • The Chilling Effect on Discourse
    The murder raises concerns about the future of public debate and discourse.
    “The ultimate outcome of that is fewer people will then enter the public debate.”
    @ 02m 56s
    September 19, 2025
  • Creating a Platform for Debate
    Kirk's approach to engaging in dialogue exemplified the importance of free speech.
    “He created his own platform.”
    @ 08m 31s
    September 19, 2025
  • Ideology and Violence
    The killer's justification for murder reflects a troubling rise in political violence.
    “Violence is justified as a way to end a political debate.”
    @ 17m 54s
    September 19, 2025
  • The Hitler Comparison
    Discussing the dangerous rhetoric comparing political leaders to Hitler and its implications.
    “They've basically Hitlerized him.”
    @ 24m 04s
    September 19, 2025
  • The Act of Hate
    Examining the murder of Charlie Kirk and its broader societal implications.
    “What Tyler Robinson did was an act of hate. The ultimate act of hate.”
    @ 27m 40s
    September 19, 2025
  • Jimmy Kimmel's Controversial Remarks
    Kimmel faces backlash for comments made after the murder of Charlie Kirk.
    “We hit some new lows over the weekend with the MAGA gang...”
    @ 31m 00s
    September 19, 2025
  • Free Speech and Censorship
    Debating the implications of free speech in the context of Kimmel's cancellation.
    “Free speech does not mean you have a right to an ABC show.”
    @ 44m 11s
    September 19, 2025
  • Cancel Culture Debate
    A discussion on the implications of cancel culture and its impact on free speech.
    “I am against anybody getting cancelled for their political beliefs.”
    @ 45m 11s
    September 19, 2025
  • The Power of Algorithms
    Exploring how algorithms influence public opinion and the importance of competition among them.
    “He who controls the algorithm will control what people think.”
    @ 55m 58s
    September 19, 2025
  • The Suicide of the West
    A discussion on societal decline and the choices we face.
    “The West may be committing suicide financially in our deficit spending.”
    @ 01h 09m 23s
    September 19, 2025
  • The Importance of Debate
    The summit emphasized the value of civil discourse and friendship in discussions.
    “This event was a celebration of debating hard topics.”
    @ 01h 18m 26s
    September 19, 2025
  • Reflections on Charlie Kirk
    A heartfelt tribute to Charlie Kirk's legacy and values.
    “Rest in peace, Charlie Kirk.”
    @ 01h 20m 12s
    September 19, 2025

Episode Quotes

Key Moments

  • Cultural Impact @ 03:58
  • Justifying Violence @ 17:54
  • Hyperbolic Rhetoric @ 23:58
  • Indoctrination Concerns @ 26:26
  • Economic Motivations @ 39:31
  • Cancel Culture @ 45:11
  • Algorithm Control @ 55:58
  • Societal Choices @ 1:10:19
