
Tucker Carlson: Rise of Nick Fuentes, Paramount vs Netflix, Anti-AI Sentiment, Hottest Takes

December 13, 2025 / 01:38:31

This episode covers topics such as media consolidation, the bidding war between Paramount and Netflix for Warner Brothers, and the implications of AI for society. Guests Tucker Carlson and David Sacks discuss the dynamics of the media landscape and the potential impact of AI on jobs and society.

Tucker Carlson shares insights from a recent White House Christmas party where President Trump mentioned the All-In Podcast and engaged in a lighthearted conversation about AI. He emphasizes the importance of branding AI in a more positive light.

The discussion shifts to the bidding war over Warner Brothers, with Netflix and Paramount making competing offers. Carlson and Sacks analyze the implications of these deals for the media landscape and the potential for monopolistic power.

They also touch on the cultural shifts driven by user-generated content and the changing preferences of younger audiences, suggesting that traditional media may struggle to adapt.

Finally, the conversation addresses the societal implications of AI, including job displacement and the need for a balanced approach to technology that considers the well-being of individuals.

TL;DR

Tucker Carlson and David Sacks discuss media consolidation, AI's societal impact, and the bidding war for Warner Brothers between Paramount and Netflix.

Video

00:00:00
All right, back with us in place of David Freeberg who's busy this week is
00:00:06
the one, the only, on his fourth appearance here on the All-In podcast, Mr. Tucker Carlson. How are you, Tucker?
00:00:12
Thanks for having me. Hey, Tucker. Good to see you. David, David, how do you have time for
00:00:17
every time I turn on my phone, there's like David Sacks on something incredibly complex. Like, are
00:00:25
you sleeping? Usually it's people attacking me for something, but it's not just like, oh, your views
00:00:31
are this or that geopolitical conflict. It's like the details of something
00:00:37
very complicated. And I'm just like, wow, man. That's a lot. That's a lot to
00:00:42
digest. Yeah. There's not a very high bar in Washington, as you know. You're a giant among pygmies, but still,
00:00:49
it's a lot of work. In the land of the blind, the one-eyed man is king. King. Exactly. Exactly. Are you still enjoying it? Yeah, it's
00:00:56
been a lot of fun. Well, you know, President Trump's a lot of fun to work for. He's the most fun. I mean, the best, right? He got a big shout out yesterday. It was
00:01:03
really awesome, actually. David did. Yeah. Huge shout out. That's right. We were at the White House Christmas party. I think they do like 25
00:01:10
of these. Yeah. Literally. Literally, cuz they got so many thousands of people, but they can only fit a couple hundred people in the White House and they're doing like two a day.
00:01:17
And the president comes down and gives a speech and every speech is different. You know, it's like a Dave Chappelle comedy routine where he never does the
00:01:23
same set and he does it with so much enthusiasm and gusto. You would think
00:01:29
that you were the only, you know, holiday party crowd that he ever addressed.
00:01:36
He never expresses any irritation at at doing that. He loves it. It's like amazing. It's unbelievable.
00:01:42
But in any event, he gave me a shout out during the speech and then he called me up there to like, hey, can you say a
00:01:47
couple words about AI? And I'm like, well, this isn't exactly Christmas party conversation. So, I just kind of talked
00:01:54
about how great he was and um how much fun it was to work for him. And then he gave Chamath a shout out
00:01:59
as well. No way. And he just starts talking about the All-In Pod. Like, we're in the audience and he just starts having a conversation
00:02:04
with us about the All-In Pod and how's it doing? And you know,
00:02:10
the funniest part was he says to me, "Oh," and then Nat was behind him. So, he goes, "Hey, Nat." And he says, "I
00:02:15
hope everything's going well. How's your relationship?" He looks at me and I'm like this, and then Nat's behind him, me
00:02:21
going like this. Oh, it's so awesome. I'm going this
00:02:27
weekend cuz I miss him. You're going to see him this weekend? You guys are on good terms. Tuck with Trump. Oh, yeah. The best. I mean,
00:02:34
the best. I mean, of course, people have told him, many people that he's not allowed to talk to me. So, that just makes him like me much more cuz like all
00:02:41
he hears is, "Oh, he's" People be like, "He's the worst human being who's ever lived." And all Trump hears is who's
00:02:47
ever lived, you know, and he's just... you can't control him
00:02:53
that way. Period. So, no, I get along with him better than I ever have in 25 years.
00:02:59
That's good, too. It's hilarious. Yeah. The last part of that story is so after he calls up me and Chamath, actually, he
00:03:07
calls me up to speak and then calls David up to speak. He went on a little bit of a riff saying, you know, I don't like the term artificial
00:03:13
intelligence because why would you want to call it artificial? It sounds bad. Why don't they call it something else?
00:03:19
Superior intelligence. Organic intelligence. And then he calls up Chamath. He's like, Chamath, what do you think?
00:03:24
And Chamath says, well, I think AI is too late to change, but maybe it could be American intelligence.
00:03:30
And then he says, yeah, but you know, we want this to be used by the whole world. Sell it to the whole world. Anyway, we
00:03:35
start workshopping uh this branding exercise in front of the entire Christmas party.
00:03:41
How about we call it Trump super intelligence? Okay, it's Trump
00:03:46
intelligence. No, but you know what? Calling AI American intelligence is actually the smartest and best one could do around
00:03:52
AI. We got a lot to talk about, Tucker. You've been on a run, huh? I mean, not in my world. I mean, I'm
00:04:00
off social media mostly, so it's like nothing's actually really happening in my world. You don't open X at all?
00:04:05
It's all about me, man. And I don't like to read about myself, so I don't look at it. No.
00:04:10
All right. For topic number one, Paramount versus Netflix. They're in a bidding war over the future of Warner
00:04:17
Brothers and all that amazing IP. The assets obviously many of us know Warner
00:04:23
Brothers is led by Zaslav, David Zaslav, and it currently owns HBO, DC, and the
00:04:29
Warner Brothers collection of films. Also, they have that great studio lot. On the cable side, they own CNN, TNT,
00:04:37
Discovery, and uh they just saddled that company with $30 billion of debt. And they had a
00:04:46
little bit of a competition for who would buy it: Netflix and Paramount Skydance, run by the Ellison family.
00:04:53
Netflix offered $83 billion to purchase just the streaming assets, which would
00:04:58
put the number one and the number three players together. And WBD publicly
00:05:05
accepted Netflix's offer last Friday. This has created a bit of a kerfuffle.
00:05:10
Paramount now is coming in with a hostile offer, $108 billion in cash for
00:05:16
the entire company. That includes the cable assets. That would be interesting because then David Ellison, son of Larry
00:05:22
Ellison, would own not just CBS, whose CBS News is being run by your
00:05:27
favorite, Tucker Carlson: uh, Bari Weiss. She would also, I guess, own and run CNN
00:05:33
in this instance potentially. We'll get to that. Uh, the $108 billion offer
00:05:39
includes two vehicles: $41 billion in equity financing by the Ellison family
00:05:45
and then a bunch of other folks coming in, including, and we'll get to this, some Middle East sovereign wealth funds.
00:05:53
Polymarket interestingly has Paramount as the favorite at 51%, and Netflix has
00:05:59
dropped to 36%, even though they say they have a done deal, and a 14% chance of no
00:06:06
deal. I think that might be the free money, the 14% chance of no deal.
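(An aside on reading those odds: prices on prediction-market contracts that pay $1 can be read as implied probabilities, and the "free money" quip is a claim that the 14% no-deal contract is underpriced. Below is a minimal Python sketch of how that bet would be evaluated, using the odds quoted here and a purely hypothetical belief; nothing in it is investment advice or from the hosts.)

```python
# Reading Polymarket-style prices as implied probabilities.
# Prices below are the odds quoted in the episode; "true_p" is a
# purely hypothetical belief, not anything claimed by the hosts.

prices = {"Paramount": 0.51, "Netflix": 0.36, "No deal": 0.14}

# Binary contracts pay $1 on resolution, so prices act as probabilities.
# They should sum to ~1.0; any excess is the market's overround/spread.
overround = sum(prices.values()) - 1.0
print(f"overround: {overround:+.2f}")  # +0.01

# If you believed "no deal" were really 25% likely (hypothetical),
# buying the $0.14 contract would have positive expected value:
true_p = 0.25
cost = prices["No deal"]
ev = true_p * 1.0 - cost  # expected $1 payout minus the price paid
print(f"EV: ${ev:.2f} per ${cost:.2f} staked ({ev / cost:.0%} expected return)")
```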
00:06:13
What's your take on this, Tucker? And just broadly speaking, on consolidation in media, having pulled the rip cord and left
00:06:18
traditional media? And now, yeah, the understanding is you're doing better than ever. You control your destiny, and
00:06:25
I think you're making probably as much or more money now than you did when you were working for the man.
00:06:30
Yeah, I actually haven't checked, but I'm not much of a money guy. I'm fine and can pay my
00:06:36
non-existent mortgage. I'm against monopoly power in general because I think it stifles creativity.
00:06:43
I'm not that worried about this because these, you know, these things never move in
00:06:48
exactly the direction you imagine. I've been in media my entire life, and none of the big changes did I anticipate. In fact,
00:06:55
almost all of them I made fun of. I just don't think that we're really threatened
00:07:02
by, you know, a conglomerate of CNN and Netflix and all. So it's like, okay, you
00:07:10
can assemble huge companies. Can you make people consume and believe the product? You know, buying CBS news is
00:07:18
like buying RCA Records or something. It like just doesn't have any effect. And only people who are not paying attention
00:07:24
or are pretty cut off think you're going to win hearts and minds by buying CBS News or CNN. These brands are
00:07:32
husks. In fact, all they are is brands at this point. And I just am not at all convinced that this will have a material
00:07:38
effect on anyone's attitudes at all. You know, if you started to mess with what
00:07:44
YouTube is allowed to air or the ownership of X, you know, then I think you could really change the
00:07:51
country and the conversations that we're allowed to have. But I don't see any of this as especially meaningful on the
00:07:57
society. I mean, is the product going to get, you know, I don't know, more subversive than it already is?
00:08:03
I mean, is this like Netflix going to be worse for American society? Probably not. You know, I think this is a
00:08:09
business story, not a cultural story. Chamath, your thoughts? I'll give you two. The first is that whenever you see
00:08:16
deals, it's important to look at the amount of money that is at risk. And that is the best tell about whether
00:08:23
this is important for the future or not. Hundred billion dollar deals are typically about things in the past. What
00:08:31
is the future? Billion dollar deals. So for example, when Facebook bought Instagram for a billion
00:08:36
dollars, that turned out to be a huge bet about the future, it was right. When
00:08:41
Google bought YouTube for a billion six, that was a huge bet on the future. They
00:08:46
were right. When Microsoft invested a billion dollars in OpenAI, that was a huge bet on the future. It was right.
00:08:53
But when you look at assets that trade at hundred billion dollar plus valuations, they're so undergirded by
00:09:00
debt. All of that debt is only ever bought by looking at the past. Meaning how much money have they made and then
00:09:06
it's a best guess about how much money they could make in the future. So these multi-hundred-billion-dollar assets, to Tucker's
00:09:12
point, they don't really matter that much. I don't think it's super anti-competitive. These are financial transactions. The reality in media, so
00:09:19
specifically about this deal. So that's a general statement about deal quantum and you can just judge the importance
00:09:25
based on that. People should be spending much more time looking at billion dollar transactions than hundred billion dollar
00:09:30
transactions. That's my takeaway there. But at the very specific thing about this deal, the reality is that the
00:09:38
future is unscripted, uncontrolled, user-generated content. You see it on
00:09:43
YouTube. It is already the 800-lb gorilla in the space. And then separately, it's now becoming about
00:09:48
shorter form video. And you see that with things like Instagram Reels and TikTok. None of that landscape will change
00:09:55
based on this deal. If anything, as those trends accelerate, the value of historic IP is going to
00:10:02
erode even faster. Meaning, this generation of kids will have no idea or
00:10:07
care about the Marvel Cinematic Universe, about Star Wars, and that may upset those of us who are nostalgically
00:10:14
tied to it. So, I don't know. I would let the deal happen. I don't think it's particularly that important.
00:10:19
Sacks, obviously you don't speak for the administration on these issues, but I'm curious about your thoughts on this.
00:10:25
Yeah, just my personal view on this is that we're going to get meaningful consolidation in the industry either way
00:10:31
because either Netflix and Warner's are going to merge or Paramount and Warner's
00:10:37
are going to merge. So either way, you're going to get consolidation. But that being said, if Netflix is allowed
00:10:43
to buy Warner's, the antitrust concerns are a lot more serious because Netflix really is the 800-lb gorilla in
00:10:49
Hollywood right now. It's the number one streamer by far. It's got the biggest market cap. And they're the party who
00:10:55
the rest of Hollywood is freaked out about right now. And so you saw that the Hollywood unions, like the WGA and
00:11:02
SAG-AFTRA, oppose the deal because they fear job cuts, lower wages, and worse conditions due to reduced demand
00:11:09
for talent. And then the content creators and distributors are worried about this too, because
00:11:16
Netflix is known for making tougher deals I think than the traditional studios. I've got a friend who's a
00:11:21
showrunner in Hollywood, and he's done projects with both Netflix and with the
00:11:27
studios, traditional studios, and he says the big difference is Netflix will pay you pretty well, but you don't get
00:11:33
any equity in your show. Like whatever you get is sort of agreed to at the beginning and that's it. So, you're not really an entrepreneur when you do a
00:11:39
show for them. But when you then work for a studio, you actually get a back end. Now there's all sorts of, you know,
00:11:45
Hollywood accounting associated with that, but he kind of misses the days that are going away where he got to be a
00:11:51
little bit of an entrepreneur and have real upside in his shows. And if Netflix now is allowed to acquire Warner
00:11:58
Brothers, that's just another nail in that whole coffin. And so I think it is a big change, and if the antitrust
00:12:04
regulators look at this, I do think that Paramount has a better chance. The other big factor is just that
00:12:12
Paramount's offering more. They upped the bid. It's $108 billion versus around $83 billion, or
00:12:18
like $30 a share versus $27. And they're also buying the whole company, whereas
00:12:23
Netflix just wants Warner's studio assets and streaming assets like HBO as opposed to the cable assets which
00:12:31
are considered a bit of a declining asset. So, I think if you're a shareholder in
00:12:37
Warner's, you probably want to sell the whole thing. You don't want to just be stuck with the bad assets.
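(For readers following the numbers: the per-share gap quoted above is easy to sanity-check. A quick sketch using the round figures from the discussion; the precise offer terms vary by report:)

```python
# Quick arithmetic on the round numbers quoted in the discussion
# (approximate; actual offer terms vary by report).
paramount_per_share = 30.0  # Paramount's all-cash bid, per share
netflix_per_share = 27.0    # Netflix's accepted bid, per share

premium = paramount_per_share / netflix_per_share - 1
print(f"Paramount's bid is ~{premium:.0%} higher per share")  # ~11%

# The headline totals diverge more (~$108B vs ~$83B) because Paramount
# is bidding for the whole company, cable assets included, while
# Netflix wants only the studio and streaming assets.
```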
00:12:45
So, yeah, I'm a little surprised actually that the Warner board went with Netflix when they had Paramount as an option, assuming
00:12:50
this Paramount offer was on the table, because it seems like a better deal, and it's probably a little bit more likely
00:12:57
to get through the regulators. So, I guess I'm a little bit surprised they chose Netflix, but I guess Netflix is
00:13:03
the more bona fide party, right? It's a $400 billion market cap, and maybe they thought that they're more able to
00:13:09
execute this transaction. Yeah, I have only three points on this. Number one,
00:13:14
it really depends on how you frame competitors in this space. Here's your paid streaming platforms, Netflix,
00:13:21
Disney, and HBO. Disney's done an amazing job after starting a decade after Netflix with streaming of really
00:13:28
getting a lot of subscribers. And consolidating number one and number three here
00:13:34
obviously puts Disney way behind. But if you start looking at TikTok, Instagram,
00:13:39
YouTube, these properties have the majority of the audience. They dwarf the audience of these paid services. And
00:13:47
young people are not interested in movies anymore. They obviously want Tik-
00:13:52
Toks and YouTube. If you look at the revenue, it's... Let me ask you a question about that chart
00:13:57
before you move on. How do you compare, or do you adjust for, time watched or minutes? Because how do you...
00:14:03
I didn't in this, but yeah, there would be a big difference between watching a TikTok and watching, you know, a movie
00:14:09
on Netflix in terms of attention span. I'm not disagreeing with you about the... look, there's no question the cultural
00:14:15
significance has moved away from Hollywood towards user-generated content on these online platforms, but I'm just curious if
00:14:22
you adjusted for that. Yeah, no, I didn't adjust for that. You could slice it a bunch of different ways, but at the end of the day,
00:14:28
I put the competitive set as a little bit broader here, um, which is including the free, you know, UGC services
00:14:35
since that's what young people are doing and then if you look at revenue
00:14:40
you know, it's a slightly different picture here. These paid services are doing extremely well, and they are
00:14:46
juggernauts in terms of making money and profitability now, and UGC
00:14:53
obviously is much larger, especially on a global basis. The more important third thing here is how we do antitrust in the
00:15:00
country. Trump wants to be involved in this. He said he's going to be very involved in it, you know, which as an
00:15:05
80-year-old, he's involved in everything. Why not be involved in this? He shouldn't be. Put that aside. I think
00:15:11
we need to have a way to pre-vet these and then just let the highest bidder win. I don't know how this concept is
00:15:17
getting convoluted, but the Ellisons are compromising Trump a bit here. I think that's why Trump gave a lot of
00:15:23
shine to Ted Sarandos. I don't know if you saw his quotes about that, Sacks, but he was praising what a genius Ted
00:15:29
Sarandos is and how amazing Netflix is. The Ellisons coming out and, you know, basically saying that they've got this
00:15:35
in and, you know, they're going to basically have the inside track here, I think, is one of the issues. I propose we
00:15:51
have a pre-vetting of these large deals, because we want M&A to be vibrant in this country. We want more M&A after the
00:15:56
wrath of Lina Khan. So I think you should be able to pre-vet, Chamath. You should be able to go to the government
00:15:56
and say, "Hey, we're considering selling this asset, whatever it happens to be, you know, YouTube back in the day. Is
00:16:03
there anybody who's not able to participate in this auction, and then just have the FTC pre-vet some of these
00:16:08
and then the highest bidder wins, which seems like what's in the best interest of shareholders." So
00:16:14
I kind of think that's untenable. And the reason why... Yeah, explain. You have multiple facets of antitrust that can come up from any
00:16:20
number of organizations in the United States. And that's just but one part of the complexities you have to navigate
00:16:26
because if you do business in any other country, all of these other countries are in a position to opine.
00:16:32
Oh, sure. If you think about doing a deal where you're in China and that other asset is in China, it can slow down for a very,
00:16:38
very, very long time and never happen for reasons that have nothing to do with the industrial logic of the merger. So I
00:16:45
don't think you can pre-vet these things because it's not scalable, and I think the government would get frustrated because you'd have a thousand people
00:16:51
outside the door. They'd have no time to do anything else. They have to govern. The different thing that we have to
00:16:56
figure out, whether it's allowed or not, is how these deals are getting done. The only thing that I would observe is the two
00:17:02
biggest transactions that have happened thus far this year that I've taken note of happened as total raw asset sales to
00:17:09
work around antitrust. The best example was Meta and Scale AI. Okay, I'll say this thing is worth 30 billion. I'll
00:17:15
give you 15 billion in cash, but what am I really doing? I'm carving out these assets so that I don't have to file even
00:17:21
an HSR filing. So, I think the future is that if the government has to have an
00:17:27
opinion, not just America, but the Europeans, the Chinese, what's going to happen instead is that
00:17:34
very smart lawyers who get paid, you know, 10, 20, $30 million a year, NBA salaries now, they're going to find
00:17:41
workarounds. They've already done so for big tech, and I think it'll spill over to other industries. You're kind of like
00:17:47
creating these boundary conditions where I think the concept of antitrust is going to be a very difficult thing
00:17:53
because if businesses want to be in business, you're not going to do these traditional deals. As David said, it's an
00:18:00
enormous sign of confidence about how there isn't going to be a competitive
00:18:05
threat for Paramount to just say, "We'll take the whole thing." Because they're subjecting themselves to a level of
00:18:10
scrutiny that they wouldn't if they felt there was any shred of a good argument for competitive antitrust.
00:18:16
Yeah. I just want to um address the gratuitous potshot at President Trump.
00:18:22
Which one? Well, your claim that he shouldn't get involved and somehow antitrust is better if the presidents don't get involved.
00:18:27
You may remember that Teddy Roosevelt was known as the trustbuster because he directed the DOJ to sue 45 companies
00:18:33
under the Sherman Act, including the Northern Securities holding company. His successors, William Howard Taft, Woodrow
00:18:40
Wilson, FDR, they were just as aggressive using antitrust. Anyway, a lot of presidents have gotten involved
00:18:46
in mergers and antitrust actions. So, it's just not that unusual.
00:18:52
Yeah, fact check brought to you by Grok. Oh, yeah. That's great. Um, yeah. We need to have like a Grok fact check
00:18:58
part of the show. We need a Grok fact check. No, the reason I would say it's problematic for Trump to get involved in
00:19:04
it is because the Ellisons have also been major supporters of Trump and made
00:19:10
commitments for buying TikTok. Now you have one family who is a major supporter
00:19:16
of Trump, massive donors, basically getting the inside line on Tik-
00:19:22
Tok, and now, after a deal has been closed with Netflix, being able to lobby to get
00:19:27
Trump to let them buy CBS and CNN. So if we start thinking about the influence
00:19:33
that Ellison, the Ellison family, is having on the Trump administration, whether it's quid pro quo or it's the
00:19:39
appearance of quid pro quo, it's best for Trump to stay out of it, because already we have the TikTok deal. Now CBS they
00:19:47
own and they're going to own CNN. This is a lot of consolidation for one family
00:19:52
to have. That's a red herring. Between CBS and CNN, nine people watch those two channels. So those channels are
00:19:58
irrelevant. Those guys have to rebuild these things from scratch. Number one. And number two, it doesn't incentivize
00:20:03
or disincentivize 3 billion humans from using and watching that content. Who
00:20:08
owns that asset is not known to any of these people. CBS did not go up and down because, you know, one person owned it
00:20:15
versus another. Nobody knows who the CEO of TikTok is. TikTok is either good or not good. And so, I would just
00:20:23
keep in mind that this is like the party-circuit babble. Like you go there and you talk about it like, "Oh my god, it's so bad. It's so good." And
00:20:29
you miss the basic fact that nothing about the ownership changes the human
00:20:34
incentive to use a good product and to disqualify a bad one. Maybe I would take the other side of it.
00:20:41
There's still millions of people watching this and it's pretty clear that millions of people watch this show because it's good. No, no. Ultimately,
00:20:48
people are going to be drawn to great products. There's no doubt about that. But consolidation of the
00:20:55
major news assets, CBS, and the influence of TikTok and the influence of CNN is
00:21:01
undeniable. That is just undeniable. Undeniable for what? Just reality. Reality, Americans. No, not
00:21:09
my reality. You can try and insult me. That's not the point here. In one company. No, it's not my reality. That's not
00:21:14
where I get my news from. Millions of people don't watch CBS and CNN. It's not true.
00:21:20
Literally, it's 4 million people who watch it. So yes, it is technically millions. If you're adding it all up over what, a
00:21:25
year, deduplicated? Like, what is CNN's most qualified, best-run, most popular
00:21:32
show? Okay. Uh on the network or on the news? I don't know cuz I don't have it. I
00:21:38
don't even have cable. Yeah, I know. And you don't even pay for it; you ask your friends to steal New York Times articles.
00:21:45
Everybody will... I am the lead. You get the final word. Having lived inside the beast, should Trump be
00:21:52
involved in these mergers and acquisitions? Yes or no, Tucker Carlson?
00:21:58
Well, you're not going to stop him, so it doesn't matter. What we should be concerned about is not media monopoly
00:22:04
power. It's censorship of the tech platforms. A return to that. That is
00:22:10
where you destroy creativity and diversity of thought and put the entire
00:22:16
nation into the mental prison from which it escaped last November. That is the threat. Censoring YouTube, X,
00:22:23
Instagram. And I just think we should be focused on that. And what's your take on Bari Weiss taking over CBS News? I'm not sure if
00:22:30
you've commented on it. I mean, I'm kind of impressed. I mean, you know, it's easy to make fun of Bari
00:22:36
Weiss for being dumb or whatever, which is fair, but you have to sort of look at it in
00:22:43
reverse image, in the negative. It's like, with those talents, look where you got; you're amazing. And I will say I agree
00:22:50
with that. I think that she's charming. She's tireless, energetic. I don't know
00:22:56
that that still matters. Like, we overvalue IQ. Oh, a person's so smart, you know?
00:23:02
It actually doesn't matter. Like being charming, meeting people, you know, pushing an agenda tirelessly, like that
00:23:09
that really works. And in the end, the prize she got is not worth having. Like, how would you like to run CBS News,
00:23:17
such as it is? No, for real. That's torture. They couldn't pay you enough. What would you do if you had CNN? They
00:23:22
put it in your lap and they say you're forced to be CEO of it for the next 10 years. What would you do? Well, actually, I've had that
00:23:28
conversation. Slightly more relevant. What would you do if you ran my son's school newspaper? Because it's about the same skill. Oh, I'd get radical.
00:23:34
Actually, Tucker had an interesting answer. Any other good questions? No, it's a great question. Tucker was about to answer it. Thanks for stepping on it, Chamath.
00:23:41
The CNN question. I mean, I spent almost 10 years there, so I feel like I'm familiar with it. Yeah.
00:23:47
I have... you know, I actually had this conversation with someone who was like, "We should buy CNN. You should run it." And
00:23:52
I... So, I had cause to spend like an evening thinking about it, and no way. I mean, what? First of all, I'm with
00:23:58
Chamath. Like, I don't have a New York Times or Washington Post or New Yorker subscription anymore after a lifetime of
00:24:03
having all three because they're totally irrelevant. They mean nothing. They're speaking to no one. And there's a kind
00:24:09
of musty. It's like going back to your childhood home and seeing that your bedroom was really small and like the
00:24:14
paint was actually turquoise and all these kind of sad posters from the '80s are still there. It's depressing. Like I
00:24:20
would just shut it down and build something new. Okay, shut it down. Jason, can I give you my one thought
00:24:26
exercise about the New York Times? The one thing that I think is worth talking about the New York Times is I think they
00:24:32
will in the next five years do something so egregious and over the line akin to
00:24:37
some sort of libel or some sort of statement that has turned out to be completely false, that they will get sued.
00:24:43
And I hope when that settlement happens, the person says, "I do not want to get paid the $4 billion. I want this to be
00:24:51
turned into a nonprofit and into a public trust." And then shuts it down.
00:24:57
Interesting. That's kind of what happened with Fox when they did their $700 to $800 million
00:25:04
settlement, the largest one. But the next one, Jason, the next one will go up. It won't be that scale, because you've now had that
00:25:10
precedent. That's a very important precedent about the scale of lying and misrepresenting things. And it only goes
00:25:16
up from here. This is a one-way ratchet. Yeah, you couldn't be more wrong about that. Uh it's definitely not going to happen. They have controls in place, but
00:25:22
it's a nice fantasy. They're also crushing it. By the way, when they moved to subscriptions, they have 12 million
00:25:27
paid subscribers now. They are objectively crushing it and figured it out better than any other news organization. Whether we agree or
00:25:34
disagree with, you know, their content and the quality of it, they are the most successful objectively here in America.
00:25:40
All right, speaking of success in taking over the dialogue, we got to talk about Nick Fuentes, who you just had on
00:25:47
your podcast, Tucker. You platformed him. Being facetious here, you
00:25:52
platformed him. I created him. Basically, it's an interesting discussion. For those of you who don't
00:25:59
know Nick Fuentes and have been living under a rock: he's a 27-year-old white
00:26:04
nationalist with a very popular show on Rumble, about 500,000 subscribers, which isn't actually that big when you think
00:26:10
about it. His followers call themselves Groypers, and he's gained hundreds of
00:26:16
thousands over the past six months. He's on quite a heater, and uh he's got a
00:26:21
bunch of controversial opinions. I'll just give you the quotes. This is nothing to do with my opinion on him. He
00:26:26
was asked by Piers Morgan if he described himself as a racist. And he said, "Totally, I think everybody, if
00:26:33
we're being honest, is racist. The only people that aren't racist or pretend not to be are white people to their
00:26:40
detriment on women." Piers asked Nick if he was gay. Nick said, "No, but I will say women are very difficult to be
00:26:46
around." Piers uh then asked, "And do you think they should have the right to vote?" Nick said, "I do not." Absolutely
00:26:52
not." On Israel, Fuentes is very critical of what he calls organized Jewry in
00:27:01
America. So now you interviewed him. Couple of different ways to take this, but you did a good job of telling his
00:27:07
origin story. He was part of PragerU. He's got this really activated base. Why
00:27:13
is he resonating at this moment in time? And maybe you could explain to the
00:27:20
audience. MAGA versus America first, America Only, which I think are part of
00:27:26
America First, but you you you tell us because I think these are just terms right now. They're not like political
00:27:31
parties or anything. Well, there's a struggle over what those terms mean. It's very ugly and probably
00:27:38
necessary because you need to define terms. Like that's the first thing you do, I would say, when you think through
00:27:44
what you should be doing with your life, for example. So, um, as for Fuentes, his origin story is a little more precise, and I'll keep it short, but he tweeted
00:27:52
something as a freshman at BU, critical of, pretty mildly critical of, the
00:27:58
Congress for doing the bidding of this foreign country, Israel. And somehow Ben Shapiro saw that and attacked him and
00:28:05
tried to get him kicked out of his Republican club and made sure he didn't get an internship with some conservative organization. And I'm not attacking Ben
00:28:12
Shapiro, but that kind of tells you what attempts to shut people down, to shut
00:28:18
conversations down result in. They don't go away. They just fester in the
00:28:24
darkness and they can sometimes become really ugly. So what Fuentes is, among other... Well, first of all, Fuentes is
00:28:29
saying a lot of true things. That's why he's popular. He's funny. Uh he's smart.
00:28:35
But he's a good... Yeah, he's a great broadcaster. But Fuentes on some macro level is
00:28:42
troubling because his platform is an expression of something that has kind of
00:28:47
taken over all political discourse, which is identity politics, tribalism. And I'm just opposed to that, period, and
00:28:54
always will be. And I just think that we're governed by universal principles or we're governed by the mafia. Those
00:28:59
are our choices. And so, you know, our principles have to apply to every human being, or certainly every American
00:29:05
citizen. Period. Or they're not principles. Um, they're just a justification, uh, for tyranny. So
00:29:12
Fuentes, you know, has a different kind of identity politics, but there are all kinds of different identity politics. We
00:29:17
we lived under it during the Biden years. We've lived under it most of my life actually in one form or another.
00:29:23
And so if anything, Fuentes reminds us that we have to come up with some kind of principle that every American can
00:29:29
ascribe to, something called national identity. That is not a dirty phrase.
00:29:35
that's actually necessary to keep the country from disintegrating, comma, which it is. So like what does every
00:29:40
American, all 350 million have in common with every other? And that's the conversation we need to have. And in its
00:29:45
absence, then we get a lot of people popping up and being like, well, all white people over here and all black people or Jewish people or whatever.
00:29:52
That's not going to work. That will end in violence. Everyone knows that. And so
00:29:57
now is probably a pretty good time to figure out what we all have in common. I
00:30:02
didn't platform him. First of all, platform is not a verb, and anyone who
00:30:08
says it is a verb is probably opposed to my core interests, I would say, and bad
00:30:13
language. Yeah. I interviewed him like I interview everybody, you know, and my general belief
00:30:19
is you should let people say what they think, and others can decide whether they mean it or not, whether they're being false or sincere, and what they think
00:30:24
of what the person is saying. But that's my job. I'm not ashamed of it despite a lot of efforts to make me ashamed of it. I do disagree with
00:30:31
Fuentes on the question of universal principles. I think it's... well, first of all, it's against my religion to hate
00:30:37
any group and I told him that. But I didn't do a lot of other posturing designed to make me seem like you know
00:30:43
the good person. Piers unfortunately fell into that trap, as an older man, you know: well, isn't it... you are bad. And it's
00:30:49
like, okay, I don't even disagree with some of that, but you don't elevate
00:30:55
yourself. You look like an out-of-touch buffoon. And that's exactly the trap that was awaiting Piers Morgan. And if
00:31:01
you watch that interview and if you watch the reaction to it, that did not diminish Nick Fuentes in any way. It enhanced Nick Fuentes. What diminishes
00:31:08
Nick Fuentes is asking him straightforward questions, particularly about women. Not have you had sex with
00:31:14
anybody, but like why are you so mad at women? And that, you know, letting people talk a lot reveals who they are.
00:31:20
That's just true. Sorry. If you were to give the top two or three reasons why he's resonating with it seems like young
00:31:29
men and this burgeoning America First movement, which I guess it
00:31:34
would be good for you to define right now as best you can, recognizing you're not the leader of it. But you have said, I
00:31:40
think, uh, that this concept, you know, resonates with you. So maybe talk about
00:31:48
why Fuentes is resonating and what America First is versus MAGA. Like, explain
00:31:53
that, just in reverse order. I mean, I would argue that the premise of MAGA is America first, but I wouldn't say
00:32:00
that America first is a movement. I would say it's the only legitimate reason to run a government. And it's
00:32:06
very simple. The government of your democratic republic ought to act in broad terms on behalf of its own
00:32:12
citizens. I mean, it's it's not more complicated than that. There's nothing sinister about it. In fact, anything
00:32:17
other than that is sinister because it's illegitimate. For what other reason would you run a democratic republic?
00:32:24
And treasonous. There isn't one, actually. So of course this has to be America first. You
00:32:29
could think of a new name for it if that name makes you uncomfortable. But the idea has to be the reason we have a
00:32:35
government or else we have to get rid of the government because there's no other
00:32:40
justification for having a government. Okay, so (b), why is he popular? Because
00:32:45
he says that. But I would say more broadly because he's defiant. There's a kind of up yours, buddy. I can't say
00:32:52
that. Okay, watch this. I will. He's hilarious. He seems steadfast and strong. I don't
00:32:59
think he is. He's not even married. So, like, if you're afraid of girls, I think you're a wuss. That's my personal view.
00:33:06
But there is a... But in his defiance, people see something really appealing. Why wouldn't they? You know,
00:33:12
these are kids who've grown up in a world of hectoring and telling them they're bad because of how they were born. And Nick Fuentes is just raising
00:33:18
the middle finger to the people saying that and saying up yours. And who wouldn't love that? Of course people love that.
00:33:24
Second piece: is America First "America only"? And I guess that means
00:33:29
I don't know what that means. That's it. No, of course it's... Look, we work in concert with others by definition. It's
00:33:36
a globalized economy. You know, maybe it shouldn't be, but it is. But America only, that argument, to the extent it's
00:33:42
really an argument at all, is like a counter-slogan designed to undercut the main argument. It doesn't really mean anything.
00:33:49
No one is arguing that. It's just saying, look, the US government ought to act on behalf of its own citizens. To which
00:33:55
people who don't believe that who are embarrassed to explain why they don't believe that because there's no justification for not believing that are
00:34:01
like, well, you're America only. No, the government should do that. Every
00:34:07
part of the government should have that foremost in mind. How does this help the people who pay for this, in whose name it's done? Like, again, even calling it a
00:34:14
movement drives me bonkers because compared to what? Some sort of creepy secretive oligarchy, which we've had
00:34:21
most of my life. Like that's just bad. There's no way to defend that. And we can argue within the framework of
00:34:27
America first how to put America first. That's a totally legitimate argument. And there are all kinds of different
00:34:32
thoughts about that. But what the motive should be, the goal should be, there's no debate. It has to be for American
00:34:39
citizens primarily. If there are, you know, ancillary beneficiaries, that's great. Not against that at all. Let's
00:34:44
help everyone if we can. But the point is to help the people who own the
00:34:50
country, the shareholders of the United States, who are American citizens. There's no other point, is there? Chamath, let me bring you in on this from
00:34:57
the angle of the America first movement, America only movement as a reaction to
00:35:02
the first year of the Trump administration feeling to many people in the Republican party as benefiting maybe
00:35:09
tech oligarchs, billionaires, uh, international issues more than the
00:35:14
working man. You have started to tweet a little bit and become vocal about, hey,
00:35:20
maybe year two of the Trump administration, we got to get refocused on some of these things. Unpack that for
00:35:26
us in the audience. Well, can I offer my feedback on Fuentes first? Of course. Yes. Go with Fuentes if you
00:35:32
like. There's a couple points I want to make. The first is that he is, as Tucker said,
00:35:38
charismatic. I think he's funny and I think that he can animate around a lot
00:35:44
of touchy subjects and say things that have shock value. And I think in that what he is is actually like a modern
00:35:50
shock jock. He's like a younger Howard Stern. He's the Howard Stern of this era, the way that Howard Stern was in
00:35:57
that era: unlistenable to so many people, because he would be kind of okay for 80%
00:36:02
of the time and then go totally off the rails and you think, man, this guy is some combination of mean, nuts, crazy,
00:36:08
and then you throw out all these other adjectives. So that's point number one.
00:36:13
Point number two is, it is true that the longer you allow him to speak, actually,
00:36:19
the more you understand what he thinks, and as a result the quality of the product will dictate the scale of
00:36:25
adoption. And now this is where I think the media yet again has been very sloppy
00:36:30
and doesn't do their work. And, Nick, you can throw this up: there's been a lot of research on what has been
00:36:36
happening in the last few months. And the bottom-line takeaway in the last few
00:36:42
months is that there is a coordinated effort of individual, largely unverified
00:36:48
accounts on social media. They typically emanate from India, Pakistan, Malaysia,
00:36:53
Indonesia, Nigeria. And there is a coordinated amplification process that
00:36:59
is happening around this content. And in this chart that you're seeing, it's just
00:37:05
a comparison of Nick in the first 30 minutes to what people like Elon get in
00:37:10
the first 30 minutes of him posting. Now, why that's so important is you start to see this huge disparity where,
00:37:17
even though you have the most viral person and account in the world, i.e., Elon Musk on X, what you see is him
00:37:25
completely crushing and dominating the virality in the beginning of his content
00:37:32
creation versus anybody else's. And then there's a bunch of other charts. Nick, you can retweet a link to this. It
00:37:38
starts to show a pattern where there's a coordinated effort to amplify. I think
00:37:44
that's why we're having the Nick Fuentes moment at this point in time.
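(To make the comparison in that chart concrete: the research Chamath is describing compares a post's engagement velocity in its first 30 minutes, normalized by audience size, against the same account's own baseline. Below is a minimal Python sketch of that heuristic; the function names, threshold, and figures are hypothetical illustrations, not the methodology of any specific study:)

```python
# Sketch of the early-velocity heuristic described above. All names,
# thresholds, and figures are hypothetical illustrations.

def early_velocity(engagements_30min: int, followers: int) -> float:
    """Engagements in the first 30 minutes per 1,000 followers."""
    return 1000 * engagements_30min / max(followers, 1)

def looks_amplified(post: float, baseline: float, factor: float = 10.0) -> bool:
    """Flag posts whose early velocity dwarfs the account's own norm.

    A big account going viral is normal. A post repeatedly outrunning
    that same account's baseline by an order of magnitude, driven by
    accounts that don't follow the author, is the pattern attributed
    to coordinated (often bot-driven) amplification.
    """
    return post > factor * baseline

# Hypothetical account: typical posts earn ~2 engagements per 1k
# followers in 30 minutes; a flagged post suddenly earns ~40.
baseline = early_velocity(1_000, 500_000)   # 2.0
spike = early_velocity(20_000, 500_000)     # 40.0
print(looks_amplified(spike, baseline))     # True
```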
00:37:49
The New York Times all of a sudden sees this information, probably doesn't bother to do the diligence, has a bunch of
00:37:55
positive mentions, starts to pump him up as a
00:38:00
leader of some kind of nefarious, crazy, scary rebellion.
00:38:05
Well, no, they want to mainstream him. So, what you see is the New York Times trying to mainstream this guy and make
00:38:12
him credible so as to paint the right as a bunch of evil racist ideologues. I
00:38:18
think the producers at Piers Morgan didn't understand this was happening. And so when you put him in a 20-minute
00:38:24
soundbite shot, it feeds into exactly what makes him popular: the ability to land these small sound bites. This is
00:38:30
why I think Tucker was much better because again in multi-hour programming, you can't hide. You see the full facet
00:38:37
of what the person is and you start to understand that this is a very savvy young media personality. Now, when you
00:38:44
strip away all of this amplification, the product has to work for it to scale and grow. There is no way that you're
00:38:51
going to manifest average normal men and women spending their time to take these
00:38:57
views credibly unless it's good and right. There will be a moment in time where curiosity will cause you to say,
00:39:04
why is everybody talking about this? We are firmly in that moment. But I think that there is a large portion of what he
00:39:10
says which I don't know whether he believes or not because I haven't spent that much time but is meant to shock.
00:39:16
It's meant to catalyze and animate people but I don't think it's sustainable because the views themselves
00:39:22
are repugnant. So I think and this is why the New York Times wants to elevate him. Can you put it up there?
00:39:30
So there I am. I look... amazing guy in that picture. I look like a
00:39:35
slack-jawed yokel and he looks like a rebel with a cause. Yeah, it's Jed
00:39:42
Clampett versus, uh, James Dean. The other thing I'll say is that Sacks and I are in a couple of these group
00:39:48
chats with some folks and some of the chatter there is that who is paying for
00:39:54
and who is activating all of these bots and fake accounts in all of these developing world countries and why did
00:39:59
they pick him? Some of that conspiracy basically points to a handful of nations who would love to foment that kind of
00:40:05
dissent and that kind of, yeah, chaos and uprising. I mean this has been clear for a long
00:40:11
time. So I think it's very important from here on out that if people are to listen to him I think the longer form content
00:40:19
exposes what he really thinks so that you can judge it for yourself. But I would not discount the fact that this
00:40:25
moment is not entirely organic. There is a deep inorganic
00:40:31
effort to put this on the front page of the news. And so it's up to traditional media to decide whether they're going to
00:40:38
basically lift this guy up as some kind of newfound hero or call it out for what
00:40:43
it is, which is to lead the articles with this data, which is widely available and easy to get.
00:40:49
Well, he literally explained this when he was on Tucker's podcast. He was originally in college part of this
00:40:55
PragerU kind of movement, and he said he was in a Facebook group which was the
00:41:01
Prager army. And I've talked about this before, we've joked about it, but there are groups like the one you mentioned, uh,
00:41:07
that you're in. This group, I heard about it, is a couple hundred right-wing folks of note. Take it easy, Jason.
00:41:13
My invite got lost apparently, but uh yeah, add me to that group please. But what happens in these groups,
00:41:19
some of them are designed to make money, some of them to make impact. In fact, Andrew Tate had one where there
00:41:25
was an affiliate scheme put up uh for it when Twitter started sharing revenue with accounts and you start looking at
00:41:31
these like accounts that are anonymous but get to massive scale. King Co the Great. You're talking about the clip farming.
00:41:38
Yeah. Yeah. And they're regurgitating stuff. They're doing it for the money, obviously. They're making 10, 20 grand a
00:41:43
month. It becomes a full-time job to do that. But we've got the Russians, we've got the Chinese, everybody in between
00:41:50
doing this kind of pumping. And then there is the actual army. Nick Fuentes has an army of young people who
00:41:56
do this and they're on VPNs and they flood comments and what they do is they share a clip. And you can experience
00:42:02
this in your own social media if you're of note because you'll have 10 posts get x number of replies and then all of a
00:42:08
sudden the 11th gets 10 times that all at the same time directly to it not from
00:42:14
your followers and that's when they get shared on these group chats. By the way, we should mention just tangentially
00:42:19
because I want to get Tucker's take on this, but Australia just passed a law that under 16, you're now not allowed to use social media. Instagram, Facebook,
00:42:26
TikTok, all banned until you turn 16. And in part, Jason, I think it's because these coordinated mis- and
00:42:32
disinformation or amplification campaigns are on all kinds of characters, not just Nick. They're proliferating because to your point, the
00:42:38
economics creates an incentive if nothing else. Then there's obviously the state-sponsored chaos that it tries to sow. And one of the only high-level
00:42:46
bits that you can flip is to say, under a certain age, we're just going to minimize how much of this content you
00:42:51
get exposed to because we don't know what it is. Before we go there, can I just say the one thing on America first, America only, whatever these terms are.
00:42:58
I don't know what they are. America first, America only. Okay. These are slogans, anti-slogans. Here's
00:43:03
what I will say. I think it's very important, and I speak as an immigrant. I want to be American. I don't consider
00:43:11
myself Canadian American. I don't consider myself Sri Lankan Canadian American. I consider myself American. I
00:43:18
want to absorb and I want to reflect the values of this country. I want to know
00:43:24
and be able to talk to you about the constitution of this country. I want to be able to celebrate the cultural
00:43:30
heritage of this country. That's part of the compact that I think I'm making. And I do think that's an important thing
00:43:36
that we have all lost, where we have to run around in all of our traditional garb, and it just loses that. What makes
00:43:44
countries great is a shared set of principles and values and we have to
00:43:50
find a way of doing that. When I immigrated from Canada, what I will tell you is Canada took the opposite view. We
00:43:57
used to call America a melting pot pejoratively, even in textbooks, and the textbooks would call Canada a tossed
00:44:04
salad. And we would celebrate that that form of multiculturalism was better. But when you fast forward the clock 20
00:44:10
years, all it did was create confusion. For example, if you go to school in Canada, there's like a bajillion
00:44:16
holidays. Every culture gets their day off. Then all of a sudden, what happens? The kids don't get educated because you have to have every long-tail country get
00:44:23
recognized in some way, shape, or form. All of a sudden you lose this very standard form of basic organization.
00:44:30
That's just but one example. You know the forms are in 50 languages. All that does is create chaos. It should have
00:44:36
been in English and French because those are the two official languages of the country. All of that indirection trying
00:44:41
to celebrate everybody's heritage confuses and slows that country down. And you can see it in the GDP. You can
00:44:48
see it in the FDI foreign direct investment. You can see it. So the one thing that we have to agree is that
00:44:54
there is an American culture and set of values and we should not lose it and we should ask the people that want to be
00:45:00
here to embrace it. We all embraced it. Sacks embraced it. If you talk to Sacks's parents, they've embraced it. If you
00:45:05
talk to Freeberg, Freeberg's parents, we all came from different countries, but we are fundamentally American. J Cal,
00:45:11
I wasn't sure, but were you intimating that the party behind Nick
00:45:18
Fuentes's meteoric success is the Russians? No, no, I was just saying that there's
00:45:24
brigading going on. Which foreign actor do you think it is? I, you know, some of them just want to create chaos. I put Russia in that. Just
00:45:30
creating division in America distracts Americans. I think they like that. But I
00:45:35
there are three levels of this. There's his army. Fuentes does have an army of super fans who are disaffected young
00:45:42
men. And there's a reason why they're disaffected. It's hard to get jobs. It's impossible to get a home. Health care.
00:45:49
They've seen people, you know, go bankrupt because of healthcare. If you look at healthcare, homes, and education, those are the three most
00:45:56
important things we have to fix in America. That's why young people are disaffected. And when you're disaffected
00:46:01
as a youth and somebody starts blaming the Jews, the blacks, the Hispanics, the
00:46:07
border, this issue, that issue, it's really appealing because then you don't have to take any personal responsibility
00:46:13
for it. And it is in fact really hard to own a home in most cities in America unless you move to Texas or, you know,
00:46:19
Nashville, Florida, and then there's too many homes and prices are going down. and he's a kid and he says stupid stuff,
00:46:26
but he does tap into that disaffectedness. Where would you put it if you ranked it,
00:46:33
Tucker, since you are pretty plugged into this? Is Fuentes's popularity based
00:46:39
on the Groypers and this like really inside group of people who are amplifying him, is it what he's saying,
00:46:45
or is it like some foreign actors promoting him? How would you handicap
00:46:51
his massive popularity? And then we'll move on. All three play a role, I
00:46:56
would say. And if you want to know who is primarily responsible for amplifying him, consider who benefits. If you wanted to
00:47:02
discredit America First, say, as a foreign policy, then you would put it in the mouth of someone who is pro-Hitler. Of
00:47:09
course, anytime I hear someone endorsing Hitler, "I love Hitler," then I'm like, huh, you know, the fed alarm goes off or the
00:47:15
inorganic alarm goes off, right? Okay. B: he is the product of a system that
00:47:23
the rest of us tolerated and certain among us created and we shouldn't be
00:47:28
surprised. You know, if you have identity politics, at some point you're going to get white identity politics. I think I wrote a book about this almost
00:47:34
10 years ago, which was totally ignored, but that's inevitable. It's inevitable.
00:47:40
And so to fix it, it's not a matter of censoring Nick Fuentes or anyone who
00:47:45
likes Nick Fuentes. It's a matter of de-racializing our society and making it a fair society
00:47:51
where rewards or condemnation are not given on the basis of your DNA. Like you
00:47:58
can't have that and hope to avoid a Rwanda because it's just going to happen. It's inevitable. Tribalism is
00:48:04
the threat to every society and I don't know how we lost sight of that but we
00:48:11
did. People are sort of saying he's a fed; he's accusing you of being a fed.
00:48:16
You know, I don't know the whole fed conver... Unfortunately, it's such a
00:48:21
long story, I won't even bore you with it, but he attacked my father at one point, so I got baited into it and I called
00:48:26
him a fed. Um, you know, I don't know. Uh, but I do know that there are, and
00:48:31
I think that Fuentes is... let me just be clear, I think he's primarily successful because of his talents and
00:48:38
because of the obvious truth behind some of what he is saying, which is just right: the
00:48:43
government of this country or any country should act on behalf of its own citizens and ours doesn't. And that's an
00:48:49
outrage. So, okay, that's just true. But the white identity politics part of it once again is inevitable. Identity
00:48:56
politics will give birth to white identity politics. Why wouldn't it? And your efforts to stamp it out will
00:49:02
never work because they're too hypocritical. So the only way to fix that if you don't like it is to
00:49:08
eliminate all identity politics, which we should do tonight because it's the road to disaster. That's it. I have an
00:49:13
AI question for Tucker. I'm increasingly surprised by the number of people on the right who I would describe as
00:49:21
ardent free-market, low-regulation-to-no-regulation folks who
00:49:29
are very anti-AI, and I'm just curious where you think that comes from.
00:49:36
What do you think it comes from? So far as I can tell, the
00:49:41
perception that the risks outweigh the benefits. So the risk would include, you
00:49:46
know, massive job loss, chaos where nobody sort of knows if anything is real
00:49:52
and the fabric of reality itself begins to tear. You know, of course the
00:49:58
massive energy draw and the huge and expensive infrastructure changes that
00:50:03
it will require, the disruption that that will inevitably cause. So, like, the downsides are super obvious. Not even to mention
00:50:10
the potential this gets completely away from us and eats us or something. Okay. As weighed against the potential
00:50:16
benefits, which are what? And I don't doubt that there are some, you know,
00:50:23
coming to faster you know diagnostic conclusions in medicine you know or
00:50:28
organizing, you know, getting rid of tedious tasks that no one wants to do, elimination of clerical work, etc. I guess those are upsides, but it's
00:50:35
disproportionate. In the view of most people, I think, who aren't experts in this, not daily involved in it, the
00:50:42
risks far outweigh not just the upsides, but the announced upsides. So, typically
00:50:48
when we roll out a new product, we tell the people we hope to buy it like, "This is going to be amazing. It's going to
00:50:53
blow you away. Everything about your life will be better once you get the iPhone 27 or whatever." There's been
00:50:59
none of that with AI. Like, none. The announcement has been, "Holy, this is
00:51:04
going to change everything, full stop." How exactly? Well, it just is. I mean, I
00:51:10
don't know who's in charge of the marketing for this. Seriously, Sam Altman, Sam Altman, and
00:51:17
David Sachs, go David. Uh, wait, hold on. Can I just follow up on this?
00:51:23
Okay, so Tucker, here's just a thought exercise and just tell me how this factors into that opinion, if it should
00:51:28
at all. So let's say in a world in 10 years where you have these super intelligent computers and systems and
00:51:36
models. Okay, in my thinking what that does is it reorders the geopolitics
00:51:43
of all countries in the world where you're in one of three buckets. In bucket number one, you're an exporter of
00:51:51
that intelligence. And I think right now steady state it's going to be China and America, right? China will have one
00:51:58
version of exported intelligence and we will have one version of exported intelligence. Then there'll be these strategic partner
00:52:05
countries of which I suspect there's less than 10 who are the enablers, the facilitators.
00:52:11
They have specialized skills that wrap either the Chinese version or the American version with energy, with
00:52:17
money, with knowhow, etc. And then I think there's everybody else. And it almost creates this thing where if you
00:52:23
are an importer of intelligence in the future, you theoretically are at risk of becoming essentially a vassal state. And
00:52:30
so if you think about it at that level, isn't AI something that is almost
00:52:35
existential that we must win? I mean, for sure. I mean, at at that level, for sure. And I would just point
00:52:41
out that like almost every single state in the world is already a vassal state. So like no change there. But uh yeah, I
00:52:47
mean, you don't want to be on the wrong side of it. That's clear. Is it containable to nation states? That's not
00:52:52
clear to be honest to me at all. But whatever I get the argument. I'm just saying at the consumer level, no one has
00:52:59
explained why we should be excited about this. And, you know, I'm a gold buyer and ammo buyer and freeze-dried
00:53:04
food buyer, as I've already told you. So it doesn't kind of affect me as an investment matter, but, like, just, I
00:53:11
think it would be so... I don't have a... You're saying the dividend of AI is not
00:53:16
clear. Like, the positive to the average person. To the average person, you are 100% correct, Tucker, on this.
00:53:23
We have done a terrible job as an industry communicating. What's the answer? What's the answer? Like, how is this great for the average person?
00:53:29
It'll wind up being great for you because the prices of goods and services will
00:53:34
get much lower. You'll live much longer. And listen, I'm not saying this is mine, but this is what the industry should be
00:53:40
saying. The price of education is going to go down 80 or 90%. You're going to have customized, adaptable education
00:53:47
versus, you know, paying for $50,000 a year degrees. You're going to be able to learn anything in half the time at 90%
00:53:54
less. All of these deliveries coming to your home are going to be delivered at half the price, twice as fast, because
00:54:00
it's going to be in a drone or it's going to be in a self-driving car. And we're going to make breakthroughs in health care that will reduce suffering.
00:54:07
And you will not die of cancer. You're going to live to 120. We will have job displacement, but we believe the lower
00:54:15
cost of living and the greater services that are going to be available to you in healthcare and education will make up
00:54:22
for that. And if it doesn't, we're going to put in ways to pace out the job
00:54:27
displacement. In China, they're doing this. In China, they're proposing and Sax and President Trump, the amazing
00:54:34
President Trump, will be doing an AI national edict soon, I believe, or an
00:54:40
executive order. But in China, in Wuhan, paradoxically, they are talking about
00:54:45
giving out licenses to self-driving cars in a paced roll out so that young men
00:54:51
don't lose their jobs en masse, which is what we're about to see in America. And if we don't take this into account, and
00:54:58
the great czar will speak in a moment, that's when this will become, I think,
00:55:03
the worst nightmare that you're talking about, the dystopian version of this. We need to figure out healthcare, homes,
00:55:10
and education and make those free, close to free, and do a Manhattan project on
00:55:17
creating 10 new cities with 10 million new homes and free healthcare for everybody and free education for trade
00:55:22
schools, etc. That's what solves the problem. That's what nobody's doing. Sax, your chance to jump in here.
00:55:30
Well, there's a lot of things going on here, but I think one of them is that
00:55:36
humans are really attracted to either utopian or dystopian narratives. I think
00:55:41
liberals are probably more attracted to utopian narratives and conservatives are attracted to dystopian ones. And I think
00:55:47
the future is going to be more in the messy middle. I don't think it's going to be to either one of those extremes.
00:55:53
And I agree that the industry has not done a good job. They've created a lot of fear. The whole AGI narrative didn't
00:56:00
help because you had a lot of people in the industry saying we're going to get to AGI in two or three years. Those time
00:56:05
frames have all been pushed back, by the way, or eliminated. You don't even hear that term anymore. People were saying a few
00:56:11
years ago that we'd have AGI by now. Now no one is saying that. They're basically pushing it back. Just on that, that was exactly a
00:56:17
function of the immaturity of our industry. So to what Tucker said, it is true. Well, it's a utopian mindset, right?
00:56:22
Yeah. What they saw as utopian. I think a lot of people reacted, wait a second, that sounds pretty dystopian to me. And I
00:56:27
also think it's what needed to be said in that moment to get that next quantum of money. They were telling the investors some version of what the
00:56:34
investors either didn't understand or wanted to hear in order to get that next scale up capital. But now that we're
00:56:40
past that and we're seeing more practical implementations, right now we're actually at the beginning, I think, of the positive
00:56:46
productivity loop. And that isn't explained because we've spent so much time offering up this grand utopian
00:56:52
vision. It does seem like an overpromise, underdeliver kind of situation to show up and say, oh, now your DMV form you
00:57:00
don't need to fill it out anymore and people are like wait a minute that's what we spent all this money on and I think that we should have taken a much
00:57:07
more conservative view in explaining what the upside was in the past. Perfectly said, Chamath. And I think
00:57:12
Tucker, one way to frame it is the abundance of Star Trek versus, you know, to Sax's point, the dystopian nature of
00:57:18
Terminator 2. One of the great paradoxes here: we will have massive job displacement, destruction, whatever term
00:57:25
you prefer, in entry-level jobs. It's already happening in, you know, what
00:57:31
I'll call chores, like dishes and driving cars. Entry-level jobs. Those are going
00:57:37
away. And they're going to go away in the millions in the short to midterm. Two, three, four, five years.
00:57:43
You're going to have people protesting in the streets over this issue. I know you don't agree with it. You say it every
00:57:49
time. You interrupt me every time I say it. But I will be right on this one. You will be wrong because you'll see taxi
00:57:54
drivers are going to be the first group to do it. And it's already happening in different places in the world as I
00:57:59
mentioned in China and it's already starting to happen in places like San Francisco where they're burning the Waymos, and with them for a reason. The
00:58:06
abundance argument is something we need to get on and we need to get on it quick or we need to get on some sort of
00:58:12
promises about re-education and retraining. And you don't hear rich people and these rich companies talking
00:58:18
about that half as much as they did. But you know what could be the great paradox, Tucker, is that the America
00:58:24
First movement is acutely aware of this and shutting the border and deporting
00:58:29
people, which I'm fine with deporting criminals. That actually might be the solution to the problem. As we deport
00:58:35
people, as we don't let people in, unemployment might stay at a low enough level that we could manage giving the
00:58:42
dishwasher jobs, the nanny jobs, the ones that maybe were being done by illegals, construction jobs, and we're
00:58:49
just going to have to pay more for those. It's going to be $30, $40 an hour jobs. So, America First might actually be the solution to this displacement.
00:58:58
May I just say one thing? I first of all, it makes me uncomfortable to hear you use the term re-education.
00:59:03
Yeah. Sorry, I don't want that. But, um, you know, the awesome
00:59:08
power that AI gives governments and other concentrations of
00:59:14
power over the population is a concern, particularly in the United States where we have a Bill of Rights. And it seems
00:59:19
to me it would be important and I never hear it raised to put in some guard rails to protect the average powerless
00:59:25
person against surveillance or having his rights taken away in effect famously
00:59:31
social credit. It's just too easy to extract compliance from people with
00:59:36
technology this powerful. And of course, the very obvious next 10 years looks
00:59:43
like this. There's a lot of disruption because of the elimination of jobs, particularly low-end jobs, but not only
00:59:49
low-end jobs, like lawyers and stuff, the true revolutionaries in any society.
00:59:55
and then the technology itself is used to keep the population under control
01:00:01
through repression. Like, I don't think that's a crazy scenario at all. You're aware that that is the big risk.
01:00:08
What are we doing about that? Let me speak to this. So, first of all, I agree that that is the biggest risk of
01:00:13
AI is let's call it the Orwellian concerns as opposed to like the James
01:00:19
Cameron Terminator concerns. Just as an aside, I I agree actually with the Star Trek analogy that I think the right way
01:00:24
to think about AI, it's like the ship's computer in Star Trek where you can tell it what to do. It understands language
01:00:31
and it can speak back to you, but it doesn't have a mind of its own. But that doesn't mean that it couldn't be used by
01:00:38
humans or governments in an oppressive way. And that's, I think, the biggest risk of it. And I think the track that we
01:00:45
were on at the end of the Biden administration is that they were starting to require that DEI be
01:00:51
programmed into AI. And that should be seen as an attempt to kind of infiltrate
01:00:57
AI with ideology that then programs or brainwashes our kids and everyone who
01:01:02
uses AI. And we were seeing that when Google rolled out their first product, you had the whole black George Washington thing and black Nazis and all
01:01:09
that kind of stuff. History was being rewritten in real time by AI in order to serve a political agenda. And that
01:01:15
didn't happen by accident. It's because the values, the ideology was being programmed in.
01:01:21
Now, I think that that was the track we were on before President Trump got elected. I think it was a pretty scary
01:01:26
track. And let me say one other thing is that that whole apparatus of so-called trust and safety from social networking
01:01:33
which is basically a big excuse for censorship and shadow banning all of that was in the process of being ported
01:01:39
over to these AI companies. And in fact they even use some of this same terminology about like safety is really
01:01:45
this like catchall term for a lot of things in the social networking context when they talked about safety. The idea
01:01:52
was that users might be confronted with ideas they didn't like, and therefore that was a threat to their well-being. So
01:01:58
therefore we need a safety team to censor those opinions. Safe space. Yes. And I think I think in a similar
01:02:03
way like that whole safety apparatus was in the process of being applied to AI like we can't have users be confronted
01:02:09
with ideas that they don't like. Training data right Sax. I mean, if the training data is a set of woke
01:02:15
ideologies and those are pervasive on the open web, then as Elon pointed out
01:02:21
in, like, I think an early version of Grok, it was like, if I misgender somebody, that's worse than a bomb going off.
01:02:27
No, no, that's right. Yeah. The earlier versions of the model, you would ask it what is worse,
01:02:34
global thermonuclear war or misgendering Caitlyn Jenner? And the model would say misgendering. So, and look, that
01:02:40
ideology was coming from somewhere, right? WarGames. Yeah. So I think Tucker is right that there is a real risk that AI
01:02:47
is used by future governments or the deep state basically to surveil us to
01:02:53
censor us and even potentially to brainwash us cuz it is really good for that. It's not that I think the AI is
01:03:00
going to develop a mind of its own. I don't think the technology is anywhere close to that. But I do think that it
01:03:05
could be used by the government in a much more invasive and intrusive way in the manner that that the government was
01:03:11
the deep state was already trying to get in bed with the social networks. That's right. This is I think Yeah. Go ahead. Go ahead.
01:03:17
This this is the absolute biggest risk. Both of you guys nailed it on the head. In the future when you have these really
01:03:22
powerful models, the reality is the incentive for governments to try to infiltrate the information cycle. They
01:03:30
will not be able to hold themselves back. And then what comes with that is a lack of privacy, a total loss of privacy
01:03:36
and then a push towards censorship. So as these AIs become more powerful, we have to marry it with a set of
01:03:44
technologies that can preserve privacy and preserve access to monetary
01:03:49
resources. If you look at the examples today that we have, there's nothing you can do today,
01:03:55
nothing online that is not tracked. Now, we have sets of rules that say that
01:04:01
tracking can't be shared. I'll give you an example. I decide to buy a very sugary cereal. That is not shared with
01:04:09
my insurance company that underwrites my health insurance. And there's all kinds of laws that prevent that. But that's
01:04:14
just a flimsy law. That's a moment in time that could change. If that decision were to change, now my buying patterns
01:04:21
become subject to scrutiny. That could also apply to how I consume information on these networks. So we have to find a
01:04:27
way to make sure that you can transact. Right? The great thing about the US dollar is when you get a dollar and you
01:04:33
put it in your pocket, the physical dollar bill, it is completely fungible. Nobody knows what it was used for in the
01:04:38
past. Nobody can judge how you use it in the present or in the future. And we have to find a way to replicate a
01:04:45
version of that so that you can preserve privacy and minimize censorship. Because if you have to transact all day, every
01:04:52
day online for everything and there's no way to shield some amount
01:04:58
of privacy, it's a very scary outcome. One other point on this is I do think
01:05:03
that if you look at kind of who's promoting a lot of these scary narratives about AI, it is people on the
01:05:11
far left of the political spectrum. Because when you create enough fear in a population, the people cry out for
01:05:17
government to intervene and save them. And I don't think it's a coincidence that again that a lot of the the voices
01:05:24
who are spreading this like doomer ideology and saying that we need the AI models to be reporting a lot more things
01:05:30
to the government which is a stepping stone to surveillance or previously they had said we need to
01:05:37
embed DEI in AI models or we need AI models to prevent discrimination which
01:05:42
is kind of their back door for doing the same thing. They're on the left of the political spectrum. And I do kind of worry that people on the right are
01:05:50
buying into this in a way that's actually going to lead to a lot of government intervention in a way that
01:05:56
actually could lead to the Orwellian outcomes that we're talking about. Mhm. I don't think that people on the right
01:06:01
who are concerned about civil liberties should want the government to play this super intrusive role in AI, if that
01:06:11
makes sense. Of course it does. Of course it does. I think you've got a lot of people
01:06:16
suddenly in the United States who are very sensitive about power and feel like their own power has eroded so
01:06:22
dramatically almost down to nothing. Their economic power, their political power, the power of their vote, the
01:06:28
power of the dollar in their pocket like have all been really reduced and all of a sudden you have a technology that
01:06:34
promises to concentrate power still further in the hands of people other than them. And so they're they're touchy about it.
01:06:40
I mean they're definitely just freaked out in general. I agree. Right. So that's that's the backdrop. So
01:06:47
people who feel panicked like that, and I'm... I have power, but I still sympathize with it and I feel it to some extent
01:06:53
there, you know. You're more open to doomer scenarios when you feel that way, and so it would be helpful. Yeah
01:07:00
just to reassure people you will be protected and yes there is an upside for you. Yeah, look, I I agree with that and
01:07:06
I I think there's actually a couple other things that account for like the visceral nature of the criticism because I'm on the receiving end of a lot of it
01:07:12
right now, so I I see it. Um, so one of them is when people hear AI, they think
01:07:17
that's not me. Like that doesn't include me, right? So there's all these benefits that are so supposedly being created,
01:07:23
but I'm not going to participate in that. In fact, I might even lose my job. This is why I think it's like pretty important to get out the message about
01:07:29
how the whole country potentially benefits from this, not just a small clique in Silicon Valley. Have you come up with examples of it,
01:07:35
Sax? Like how the country benefits from it? Well, sure. I mean, there's an article that didn't we cover last week, the the
01:07:41
Wall Street Journal talking about how construction workers have seen their wages increase 30%. Because of the data
01:07:46
center buildout, I mean, we are seeing a huge infrastructure boom throughout the country on energy production and
01:07:52
construction that's related to this. And so it's not just software where people are benefiting. But the other thing I
01:07:58
think that's very visceral on the right is that the hatred of big tech quite frankly. I mean a lot of conservative
01:08:05
influencers were directly censored and shadowbanned during this COVID period especially, where the big tech companies,
01:08:12
you know, in the Biden administration, were really coming down on them and censoring them, and there's still a lot of hatred
01:08:19
towards big tech. I think some of that's even misplaced, but there's been like almost like a transference. Also, with social media, a
01:08:26
lot of people have concerns about what social media is doing to kids, body image concerns or
01:08:33
fear of online predators, all that kind of stuff. And I don't think it's an analogous situation with AI chatbots
01:08:38
because you're not meeting people, you're kind of doing research. It's like a whole different activity. But I think
01:08:44
there's almost like a transference of anger or anxiety or fear from what
01:08:49
happened over the last decade or two with social media or with again like these online platforms and that's being
01:08:55
transferred over to these new AI platforms. Even though I don't think they're precisely analogous and the
01:09:01
regulations should be looked at a little bit differently. Let me Jason give you some credit. I do
01:09:06
think you've put your finger on the pulse of what the problem is. Whether we call it a perception or a misperception,
01:09:12
the point is people are afraid for their jobs. That I agree with you on. I think the data about what has happened,
01:09:18
though, is pretty flimsy, that it actually has seen a bunch of job loss. For example, when we got home from the
01:09:24
Christmas party, Sachs, last night, I turned on CNBC and it was Jim Cramer and he was interviewing this wonderful guy
01:09:30
who I'd never heard speak before, but he's the founder and chairman of ServiceTitan. And he had this very
01:09:38
elegant way of describing it, which is AI will put the jobs that are purely
01:09:44
cognitive at risk. But when you marry cognitive ability with physical
01:09:50
dexterity, those jobs are thriving. And he talked about construction workers,
01:09:56
plumbers, electricians. In fact, this week when I was in Abu Dhabi, we were
01:10:02
talking about the transformation of power, right? And these electricians now get paid five, six, seven,
01:10:09
$800,000 a year, which, by the way, just FYI, is more than most engineers in
01:10:15
Silicon Valley. Okay, these guys are the ones that are actually winning but the stories are not told and then the incentives aren't there. And so there's
01:10:22
a bunch of things that I think need to happen to highlight where the success stories are. They're not the obvious
01:10:28
places that one would think. It's not just some engineer tickling the keyboard making millions of dollars and putting
01:10:34
people out of work. That's not what's happening. But I don't think the story is told. And so the palpable fear of job loss is
01:10:41
there. To your point, I do agree with you, Jason. That is the overriding narrative that we have to with data and
01:10:49
facts convince people of what is actually happening. There is definitely a narrative that's ahead of the job
01:10:55
loss. And the question is what pace will it happen at? when people are seeing
01:11:01
young people having a hard time getting jobs, and, you know, for whatever reason, but I suspect it's AI. When they see
01:11:08
firms like Amazon estimating in the future they're going to eliminate these 600,000 jobs, and that leaks and
01:11:15
that they're going to do a PR campaign about it when you see drive-throughs moving to AI and when you see a third of
01:11:21
rides in San Francisco and LA move to Waymo without the driver in it, it's
01:11:26
really hard to say it's not happening. So we're just on a different... It's a matter of what timeline it's happening on. You
01:11:32
can't have it both ways where, you know, these companies are raising billions of dollars and they're replacing jobs and
01:11:40
saying, "Hey, these jobs are going to be 10 times more efficient or we're going to replace your driver and we're going
01:11:46
to replace your cashier." I see that as an early-stage investor in Founder University. I see it every day. Company,
01:11:53
hold on, let me finish, please, gentlemen. People are pitching me on startups and they're getting funded for
01:11:59
these startups to specifically replace roles and they're saying, "We want to make the perfect sales development rep.
01:12:06
We want to make the perfect customer support agent and enterprise customers
01:12:12
are agreeing with them and buying these products and services specifically to stop hiring and increasing their
01:12:19
headcount. I see it on the front lines. It is definitely happening. The only difference is timelines and can we
01:12:25
create enough jobs? This is why I think we've done a bad job of explaining it. We need to explain for every one of
01:12:31
those robotaxis that gets out there and that job is gone, how do we get that person another job?
01:12:38
Because they're not going to get the job as a cashier at Starbucks anymore cuz that's going to AI too. Here's a very practical idea. Yesterday
01:12:44
I was at the Senate to just talk about this. What is the idea? I think we have to start looking very honestly at
01:12:51
stopping the federal underwriting of student loans. Why? Because it would
01:12:58
allow the market to move very quickly to your reality, Jason. Because we would go
01:13:05
beyond just funding somebody to become a master electrician. I suspect that we would pay people. Yes, I bet if you went
01:13:11
to Google, they would not only subsidize you, they would probably pay you a salary to get educated to do that job
01:13:18
because once you graduated and you could work up the ranks and become a master electrician, there is so much work that
01:13:24
for example Google needs, Amazon needs, Microsoft needs. And so if you
01:13:29
eliminated the federal underwriting, we don't have it for car insurance. We don't have it for home mortgages. We
01:13:35
allow the free market to tell us this home is more risky than that home because it's near a fire area. This
01:13:40
person is a poorer driver than that other person. We should allow the free market to say go to this kind of a job
01:13:47
and you'll get paid so much, but go to this other kind of a degree. It will cost you a lot of money. And let people
01:13:53
decide with more clarity. But that one thing would allow us to reinforce what
01:13:58
the economic upside of AI is in a very practical way for a lot of people and it would solve this student debt crisis
01:14:05
that we're in. Sax, should there be a license fee or a tax? This has been floated by people.
01:14:10
I'm not saying this is my position, but should there be a tax on having a robo taxi or a humanoid robot that is
01:14:18
then used to retrain actual humans? Look, I I think first we just need to start with some accurate facts here. And
01:14:25
we need to explain what's what's happening. And part of that is debunking some myths around this. Now, I remember
01:14:31
about a month ago there was a whole wave of very scary headlines, including in a publication I really like, the New York
01:14:36
Post. Nick, maybe you can put this on the screen, claiming that AI was wreaking havoc on US jobs. So, this was
01:14:42
a headline from the New York Post last month based on the October report from Challenger Gray, which
01:14:49
basically tabulates announced layoffs in the economy. And we had a spike in October and about 20% of those were
01:14:57
attributed to AI. It wasn't even the majority. It was actually a relatively small number. It wasn't even the number
01:15:02
one reason. But based on this, you got a wave of scare headlines that AI was wreaking havoc on US jobs. Well, lo and
01:15:08
behold, the November Challenger Gray report has come out and it makes clear that October was an anomalous spike. The
01:15:16
number fell by 53% and only about 6,000 of the layoffs that were announced in
01:15:21
November in the entire country were attributable to AI. This is only layoffs, by the way. It doesn't include
01:15:27
job creations. Okay, so only 6,000. And if you look at the year to date in the
01:15:34
Challenger Gray report, AI has only accounted for 4.7% of total layoffs. And that number is
01:15:41
self-reported by CEOs. So my guess is it's inflated because if you're a CEO, you'd rather blame AI for your company's
01:15:47
non-performance rather than yourself. So 4.7% is probably the high number. So
01:15:53
what we're actually seeing in the data is a very small number of actual layoffs related to AI and that was corroborated
01:16:00
by a new study by Yale Budget Lab which looked at the first 33 months after the
01:16:05
release of ChatGPT and it said there is no discernible disruption in the labor market. Okay. So that's I think a really
01:16:13
important fact is regardless of what you want to claim will happen in the future job loss has not happened yet. not in
01:16:19
any meaningful numbers and in fact AI has been responsible for about half of
01:16:26
GDP growth this year. So GDP growth is about 4%. That number would be at 2% if
01:16:31
it weren't for AI. So within that is a lot of job creation. You see that again with construction workers. So it's just
01:16:39
not the case that AI is creating job loss in any meaningful way right now.
01:16:45
And people do this motte-and-bailey thing where they're like, well AI is creating tons of disruption. It's wreaking havoc.
01:16:51
And then you point these facts out and say, "No, no, I mean in the future it's going to." But then they revert to, "Well, no, it must be happening now,
01:16:57
right? The disruption is so profound." So look, we can all argue about what's going to happen in the future, but right
01:17:03
now it's not. And if you're going to talk about the future, the time frames matter a lot because obviously we've
01:17:11
always had technological change in the economy and it does change people's job. But if those changes are happening over
01:17:16
20 or 30 years, that's very different than the next 5 years. And I really don't think you know how fast the disruption is going to be and how much
01:17:23
time people are going to have to react and for new jobs to be created. I'll give you an example. Back in the 90s, I
01:17:28
remember when they said that brick and mortar was going completely out of business. That was part of the reason why we had the first dot-com bubble in '99 was
01:17:35
that, hey, everything's going to the internet. It's going to go pets.com instead of Toys R Us and so forth and so on. And people thought that bricks and
01:17:41
mortar was going to be out of business within 5 years. Well, it's literally 30 years later and bricks and mortar is
01:17:47
still a thing. I mean, it's not blockbuster. No, look, it hasn't been a great business. I mean, like, Amazon has been super successful and you know, you did
01:17:53
not want to own Toys R Us, but bricks and mortar is still around. Walmart's still around. The change is still
01:17:59
ongoing. And I think that's what's most likely going to happen here is this this technology is going to create a
01:18:06
productivity boom. I don't think the main thing it's going to do is cause job loss. It's going to have lots of different impact on our lives and we're
01:18:13
going to have time to adapt. I don't think this is a two to three year time frame thing. This will be the debate of our
01:18:19
lifetimes, I predict. But look, I mean, if I'm wrong, we'll find out in the next 5 years. But what I just
01:18:26
really resist is this motte-and-bailey thing where people are like, this is happening right now, and then, no, no,
01:18:32
this is going to happen in the future. To be clear, I think you're wrong. But I think the summary of this point is the following, which is the facts today don't bear out
01:18:38
the bear case. But the perception is that people are afraid. And married with
01:18:44
that is that we as an industry and I don't actually blame it on you because you had to clean up all kinds of
01:18:49
craziness that the Biden era left. So I think you've done a great job. But our industry needs better spokesmen. I mean,
01:18:55
we talked about this after our tech dinner. There needs to be a way for a handful of people who can really
01:19:01
represent the future in an articulate way that people believe. And I think we do need to do that. We can't have the CEOs
01:19:08
of these companies seem either sketchy on the one hand or too focused on material consumption on
01:19:15
the other. It's just bad. It's a bad look. I agree with you. But think about the two biggest narratives that created this
01:19:22
fear and resentment towards AI. I would say it's the AGI narrative and the time frames; now people are pushing them
01:19:29
back. Right. There was a famous project called AI 2027
01:19:35
where they were predicting AGI in 2027 and now they've pushed their time frames back into the 2030s. Look, you know,
01:19:41
once your time frames are over 10 years, we know from the tech industry then you have no idea. You have no idea. But
01:19:47
it was that. It was AGI and job loss, and I would say current, profound
01:19:54
disruption and job loss. And both those narratives I think have been debunked in
01:19:59
the last several months. Final chart, Nick, you can pull it up here. This is just FRED. Unemployment rate, 16 to 24 year olds. This is the one
01:20:05
I think you should watch. 9% in January, now 10.5%. I think this chart's going
01:20:10
right up to 14%. Just my prediction and I think it's because of AI. Tucker, I'll give you the final thoughts on this and
01:20:16
then we're going to start everybody's favorite game, Tucker in 20 where we do a lightning round with Tucker Carlson.
01:20:22
Tucker in 20 coming up. Any final thoughts on this? Well, I was just thinking about the consequence of I mean, having lived
01:20:28
through Y2K and Obama, Y2K we thought was going to be a disaster. Obama people thought was going to be great. Both, you
01:20:34
know, were the opposite of what we imagined. I don't know that it's possible to predict the effects of this. But I guess
01:20:40
my one worry, which I would just... I think that people, all people, especially men, need to feel useful. And the thing
01:20:48
that's offended me most about the AI conversation is not the AGI stuff,
01:20:53
which always seemed a little bit fantastical to me. It's the it's UBI. It's the idea you could just like pay
01:20:58
people to be content or something. And having grown up both around inherited money and welfare, you know, both are,
01:21:05
you know, two sides of the same coin. People need to feel like they're contributing and that their lives have meaning. And I don't know. I just hope I
01:21:11
hope people are thinking about that a lot. Yeah, I agree with you.
01:21:16
By the way, that whole UBI narrative, I think Sam was, like, touting that a couple years ago.
01:21:21
He funded it. He funded a study. It made this whole thing so much worse because again, it was playing into this
01:21:26
idea that everyone's going to be put out of work and that's a good thing and you'll just get welfare from the government. And who would want that? You
01:21:32
know, it's not if that's not the side you want. Now, I mean I mean where I disagree with my friends on the right is
01:21:38
I just don't think that's what's gonna happen. I mean, I could be wrong, but I just don't think that's what's happening. It hasn't happened yet. I
01:21:44
don't think that's what's going to happen. But look, I agree with them about the undesirability of that world
01:21:49
very much. All right, Tucker and 20, your thoughts on You can take up to 30 seconds, but
01:21:56
Tucker and 20 sounds better. Tucker in 20. What do you think of Venezuela
01:22:01
uh these boats and then seizing the oil tanker? Why are we doing this? Why are we so active in Venezuela, Tucker?
01:22:08
No freaking idea. But I do know that if it becomes a real war, people are going
01:22:14
to be shocked and it's the last thing the country needs. I mean, the number one requirement of war is that you explain
01:22:20
to your population why you're doing it. Even if you're lying about it. Even if it's like, "Oh, they have weapons of mass destruction. We'll find them once
01:22:25
we invade." Everyone mocks that, but at least it was like a real rationale that allowed the country to unite behind the invasion.
01:22:32
That groundwork has not been laid. The drug stuff, everyone's against drugs. They're not coming from Venezuela
01:22:37
primarily, as we know. So, they're coming from Mexico. I'm not advocating for an invasion of Mexico. There may be a good reason to have a war with
01:22:43
Venezuela, but I think now would be the time to roll it out if in fact we are going to have one. My sense is we're probably not. This is all
01:22:50
an effort to get Maduro to leave. I don't think he's leaving. So, I hope we can live with that. But I I just don't
01:22:56
think right now is the time for a ground war in South America. You've been very excited about the
01:23:02
potential of Qatar being a deeper ally of the United States and you're buying a
01:23:07
place there. Uh why are you so why are you soar to mix?
01:23:12
I'm not I'm not I mean of course anyone who who travels to the Gulf can tell you there's something amazing happening
01:23:18
there. And it's not just about money. It's about openness, but I'm an American. I'm not going anywhere. I have
01:23:24
one passport. I'm buying a house in Qatar. To make the simple point, I've been attacked for being a tool of Qatar,
01:23:24
paid by Qatar. I've never taken a dollar from Qatar or anyone else. I have no investors and no debt. So, I'm not
01:23:30
into taking money from people, but I wanted to turn it around and be a net investor in Qatar in order to take control
01:23:43
of Qatari propaganda in order to say, "No, they haven't bought me. I've bought them." and I'm texting them my talking
01:23:48
points and they're repeating them and that's what I plan to say the second I close on my house. Uh, Candace Owens, Charlie Kirk's
01:23:56
assassination, conspiracy theory. What's your take? I think it's important for I mean look,
01:24:02
in the end it's the job of federal law enforcement to find out who did it and then explain it to the public in a way
01:24:08
that makes sense and can be proven. And I really hope that will happen soon. So they're they're part like in any any I
01:24:14
mean I was a crime reporter. I wrote a book on this. There's in any aftermath of any crime, there are anomalies, weird
01:24:21
coincidences, things you can't fully explain. I mean, the closer you look at anything, the more complex it reveals
01:24:27
itself to be. So, that's certainly true here. But because of the nature of this murder of our friend, um, I think it's
01:24:35
all the more important to make sure the public understands who did this and and why. And I would
01:24:43
say the FBI doesn't have a lot of credibility. It's not the fault of Kash Patel and Dan Bongino. They inherited
01:24:48
an agency with basically no credibility that has a documented history of manufacturing crime. So like it's not
01:24:54
enough to say the FBI says it. You have to explain how. And I'm not even
01:24:59
doubting the the core case they're making. But if they are telling me that this was a lone gunman, that no one else
01:25:06
was implicated in this crime, I think it's fair to ask like how did you reach that conclusion? And did you look at
01:25:12
this that and the other thing? And I don't think we should be intimidated out of asking those questions. Those are not
01:25:17
unpatriotic questions. Those are questions that I think uh you know
01:25:22
express our uh reverence for Charlie Kirk. This is a way to honor him and any American who's murdered. So
01:25:29
sorry. By the way, just on this if anybody has not watched Tucker's documentary Yep. about the Butler, Pennsylvania shooter,
01:25:35
I can't remember his name. We're not memorializing him anyway. So, please don't ask any questions. Yeah. What happened with that? Jeez. I
01:25:42
thought your documentary was pretty kick-ass. Worth. Well, thank you. I mean, look, it was really good.
01:25:47
We asked obvious questions, couldn't get straightforward answers. I do think the more... You did more than this. I have to give
01:25:54
you credit because you were able to scrub internet searches. You went back to the Wayback Machine. There was a
01:25:59
level of detail because my interpretation was it seemed like you guys were afraid of just getting sh on
01:26:07
from everybody. And so you went to the point of making sure that this stuff was irrefutable fact. And you had a level of
01:26:13
detail in there, which I hadn't seen in an investigative research piece in a long time. I do encourage people to
01:26:19
watch it. I thought it was very good. Well, the good news about being universally hated is it keeps your standards higher um because you can't
01:26:26
afford, you know, to make too many mistakes. No, but we can't have too many, you know, high-profile murders or
01:26:33
attempted murders that don't have firm, believable resolutions. The the the
01:26:38
social fabric can't handle that because then people become totally postmodern in their thinking and don't believe anything. So, that's incumbent on
01:26:44
federal authorities to reassure us. Tucker, if you are running the
01:26:51
Republican campaign going into the midterms, what do you do the same? What do you do more of and what do you do
01:26:57
less of? Well, I would, you know, I'd focus on domestic economic issues to the exclusion of everything else. I would
01:27:03
and um I would I think that's that's the main concern. It's always the main
01:27:08
concern. Now, I'm entering into very banal territory because I'm repeating every, uh, you know, obvious observation in
01:27:15
the past 100 years in American politics. But people do care about that and they are concerned. AI is part of that
01:27:21
probably not in its reality but in its expectation and it's the fears that people have about what's coming. Um, and
01:27:28
so I would I would try to address those issues at least by explaining them. I do
01:27:34
think like well I'm in the explaining business so I'm biased but 80% of the problem this is true in marriage and
01:27:40
child rearing and governing as well. You need to explain what you're doing,
01:27:45
what's going to happen. I'm going to give you the shot. Count backwards from 10. By seven you're going to be asleep. When you wake up you'll be fixed. Like
01:27:51
that's what they tell you in surgery. And they tell you that for a reason. They don't just roll you into a dark room and start injecting you with stuff.
01:27:58
They walk you through it. And that's enormously reassuring. In fact, it's critical. And so, we just need a lot
01:28:03
more of that from everyone and not just government, but people with a platform explaining what the hell is going on
01:28:09
because we're getting to a place where trust is vanishingly rare and that's
01:28:15
bad. That creates volatility. After 3 hours with Milo Yiannopoulos,
01:28:20
is homosexuality nature or nurture? A trauma response? And David Freeberg
01:28:27
wants to know, why are you so gay? Why are you gay? Why are
01:28:33
Why are you gay? I just wanted to do an interview where I could quote my favorite uh Nigerian but um no I mean
01:28:40
well clearly it's not nature at least primarily or we wouldn't be having an absolute rise in it and there would be
01:28:46
some hint of a gene responsible for it. I mean so many different um you know
01:28:52
genetic manifestations have been isolated from the decoding of the human genome, and this one has not, you know. So no, clearly it is primarily, um,
01:29:02
nurture. That's not an attack on anyone. It doesn't make it any less real. I'm not saying it's fake; of course, it's the opposite of fake. It's very, very real. Um,
01:29:09
and it's not even a value judgment it's just an observation I I think on the question of sexuality and gender it's
01:29:18
best to depoliticize it. It's been so politicized, you can't even have an honest conversation about it or you get attacked from all sides. How does
01:29:24
that help anyone? It doesn't. And so, it's best just to look at this as coolly and as rationally as you can. Try to get
01:29:31
to the truth and then allow people to make their own decisions about what to do with it. I mean, that's my view of everything really, but but it's time to
01:29:38
take that approach to sexuality. Okay, final one. Should we be in NATO? Should America pull out of NATO?
01:29:44
Of course, we shouldn't be in NATO. What? That's not I thought these were hard questions. Exactly. I give you
01:29:51
sometimes I give you a little alley-oop. I let you dunk the ball. Why would we be in NATO? NATO is like
01:29:56
the single most destructive force that we're a part of, way more than the UN. NATO...
01:30:01
Should we support Israel and give them weapons? It depends for what,
01:30:07
you know, the fight in Gaza. And should they be our number one partner in the region? Or the partner in the world,
01:30:13
really. I think I think all of our alliances should be assessed and now
01:30:18
reassessed through a single lens. Does this help the United States and in the specific instance I'm certainly not
01:30:24
against being allied with Israel, and I'm not against supplying Israel with weapons. Again, it depends what they're being used for. But I do think what's
01:30:31
happened in Gaza does not help the United States at all. I mean, tell me how it does. And um so yeah, I in fact,
01:30:38
I'm not even sure what the argument that it has helped the United States would be. I've never heard it articulated. Instead, I've heard people name calling.
01:30:45
You know, from my perspective, that's all I care about. And I got it. I never wanted to have this debate. I avoided it
01:30:50
for many years. The only reason I get into it was the prospect of a of a
01:30:55
regime change war in Iran. And I just thought, man, there is no way that helps us in any way. So, I piped up and said
01:31:01
something and my life has been a disaster ever since. But my views have not changed. Is it good for the US or
01:31:06
not? What is the future of Europe and the UK? Oh, it's so dark. I have family there. I
01:31:11
was just there. I mean, just there. Um, you know, let me start with the good news. I mean, everyone knows all of
01:31:18
this, so I'm not going to repeat any of it other than to say finally Europeans, even the Germans, I spoke to one of the
01:31:26
most powerful people in Germany yesterday about this, are starting to realize, wow, this is not going well at all. And migration... there are many
01:31:34
problems, but migration is the core problem. The second is energy. And they've made massive mistakes. They've
01:31:40
committed self harm over decades. We can argue about why they did that, but there's a growing realization that they
01:31:46
did. I was in Oslo, uh, pretty recently, salmon fishing. And you go to
01:31:51
Norway. I'm Scandinavian, so I pay attention. And all they talk about is Sweden. And of course, everyone's always
01:31:57
kind of looked up to Sweden because it's huge and industrialized. And you know, Norway looked up to Sweden. Now you go
01:32:03
to Norway and the Norwegians all say, "Man, the one thing we're not going to do is become Sweden and open our borders
01:32:08
and destroy ourselves." So I think the Europeans are finally catching on to this and that's a blessing. Is it too
01:32:14
late? I hope not. Maybe Finland and Norway caught it. Yeah, they caught it early and said we can only have this many
01:32:20
people come in each year reasonably as a society. It's not that early. They've made a mess of Oslo. Oslo is not
01:32:28
what it should be. But yeah, I mean it's not totally destroyed. So yeah. All right. Listen, when Tucker has a new
01:32:33
product or service in the world, uh, he calls his boy J-Cal and we do a little
01:32:40
mutual support. You're today uh launching uh some silver or gold apparently.
01:32:45
Basically, we are selling gold as close to wholesale as we possibly can. Okay. And it's as usual a reaction
01:32:52
against all the gold scams going on. But people should be able to easily buy physical gold with a minor transparent
01:32:59
markup on the internet. You shouldn't have to call a number so you're fooled into buying a commemorative coin for,
01:33:06
you know, eight grand an ounce or whatever, twice spot price. Um, so that's that's the idea. And uh and it's
01:33:13
gone really well in the two weeks we've been open. What's the form factor? Is it
01:33:19
like 1 oz coins, or...? Well, you can buy any kind of precious metal. Um, I am personally a
01:33:26
1oz coin buyer of long-standing. Turned out to be a pretty good route. I would
01:33:32
say I was much mocked by everyone I know. All the finance sophisticates I went to college with were, you know,
01:33:37
making fun of me. You're a gold bug. You're crazy. Um and yeah, I do bury it in my yard because that's the kind of
01:33:43
man I am, primitive. Um, but it has turned out to be a good thing. No, you
01:33:49
can buy with a shovel. Oh, well, I've thought that through, Chamath. I've also scattered
01:33:56
millions of ball bearings around my backyard. So, good luck with your metal detector. That's so awesome.
01:34:03
I'm not kidding. By the way, also coming in 2026, uh, Tucker
01:34:08
Carlson's baked beans and fat. When you're prospecting out in the Wild
01:34:15
West, you can get your... Can I say something weird? I used to work in a baked bean factory, actually, B&M baked
01:34:21
beans in Portland, Maine in 1988, and for the summer, and I've never eaten a baked bean since because I made them and
01:34:28
I ODed on baked beans, but in general, they're good. Go buy yourself a gold coin at Battalion
01:34:35
Metals. I think this is a great idea. I have to be really honest with you. I do think that having this as a
01:34:42
practical hedge, there's like a whole set of elements that we all have to be educated on to hedge
01:34:49
the status quo and there are lots of reasons to own cryptocurrencies, gold.
01:34:56
I'm glad you're doing this because the way that this is done for most people is completely bonkers and these sites unlike yours
01:35:03
that typically sell direct to retail do not do a good job. So, I'm glad you're doing it. I hope it's a success. Thank
01:35:08
you. And let me also put in a good word for firewood and ammunition. I do think those are good if you want
01:35:14
to diversify your portfolio. Also, two wells. You got to have two wells, not just one. Different depths.
01:35:20
That's what I have on the ranch. Two wells. Look, I think this is going to be very successful. Uh, congratulations, Tucker.
01:35:25
I think this will be a very successful venture because you have a lot of trust with your audience and if like you're going to sell gold, like that's the most
01:35:30
important thing is people just want to know that this is like 100% legit pure gold. And
01:35:36
what's your daily carry? And the lowest price they can get. What's your daily carry? What do you carry around the ranch or whatever?
01:35:43
What's your daily... How much do I carry in gold? No. What's your pistol? What's your piece? What do you keep on
01:35:48
your side? Oh, I carry a Ruger LCR in 38 special. I like the revolver cuz it doesn't go off
01:35:55
accidentally and castrate you. So, yes, that's personal. 100%, Ruger. It's a great
01:36:01
Okay, so people like these high-capacity striker-fired handguns, and I just... I'm a
01:36:06
revolver man. Everyone makes fun of me, but that's how I feel. No, you know what? You leave that revolver in the bottom of a pool for
01:36:12
three years, take it out, hammer some nails, and then fire it. Still fires, dude. I'm
01:36:18
the liberal here. Really? You don't say. I'm a moderate. They say liberal to keep people tuning in. All right,
01:36:24
everybody. When he's selling ads, he's a liberal. When he's spending his money, he's a conservative. I'm the only guy
01:36:29
who lives in Texas on a ranch and carries a firearm. So, we'll just leave it at that. But, you never know. I might
01:36:35
have some... This guy's moving to Texas. He's living two lives. He's living two lives. Every time he
01:36:40
gets on a PJ, he's a conservative, but every time he has to talk to people, he's a liberal. The PC-24 and the Phenom 300 are great
01:36:47
planes. And we had our holiday party last weekend. Wish you were there, Tucker. Tony Hinchcliffe burned the place
01:36:54
down. We did a live Kill Tony. A little bit of roasting. A good time was had by all. Major thanks to our three partners.
01:37:01
OKX hooked up the gifting suite, custom candles that fans loved. Also, they had
01:37:07
a really classy milk and cookie bar. IREN sponsored all the VIP spaces with
01:37:13
great cocktails. And Google Cloud built out an amazing lounge with spiked hot chocolate and other holiday drinks. Well
01:37:20
done to our friends at Google Cloud. We will see you all next week on your favorite podcast, The All-In Podcast.
01:37:27
Love you boys. Thanks, Tucker. Bye-bye. Thank you guys. See you.
01:37:32
We'll let your winners ride. Rainman David
01:37:39
and we open source it to the fans and they've just gone crazy with it. Love you. Queen of
01:37:49
your besties are gone.
01:37:55
That is my dog taking notice your driveways.
01:38:00
Oh man, my appetiter will meet. We should all just get a room and just have one big huge orgy cuz they're all
01:38:06
just useless. It's like this like sexual tension that they just need to release somehow.
01:38:12
beak. Wet your feet. Your feet. That's going to be good. We need to get merch. I'm going
01:38:26
all in.

Episode Highlights

  • AI Renaming Discussion
    During a Christmas party, Trump and Chamath humorously brainstormed new names for AI.
    “I think AI is too late to change, but maybe it could be American intelligence.”
    @ 03m 30s
    December 13, 2025
  • The Threat of Censorship
    Censorship of tech platforms is seen as a major threat to creativity and diversity.
    “Censoring YouTube and Instagram puts the nation into a mental prison.”
    @ 22m 16s
    December 13, 2025
  • Bari Weiss's Leadership
    Discussion on Bari Weiss's takeover of CBS News and her qualities.
    “I think that she's charming. She's tireless, energetic.”
    @ 22m 50s
    December 13, 2025
  • The Future of Traditional Media
    Speculation on the New York Times and its potential downfall.
    “I hope when that settlement happens, it turns into a nonprofit and public trust.”
    @ 24m 51s
    December 13, 2025
  • Nick Fuentes: The Modern Shock Jock
    Nick Fuentes is compared to Howard Stern for his shock value and charisma.
    “He's like a younger Howard Stern.”
    @ 35m 50s
    December 13, 2025
  • The Chaos of Multiculturalism
    A debate on the effectiveness of multiculturalism versus a shared national identity.
    “What makes countries great is a shared set of principles and values.”
    @ 43m 44s
    December 13, 2025
  • The Risks of AI
    Concerns about AI's potential to disrupt jobs and society, with calls for guardrails.
    “The biggest risk of AI is Orwellian concerns, not Terminator fears.”
    @ 01h 00m 13s
    December 13, 2025
  • The Risks of AI and Privacy
    As AI models grow powerful, the risk of government intrusion and loss of privacy increases.
    “We have to find a way to replicate a version of that so that you can preserve privacy.”
    @ 01h 04m 45s
    December 13, 2025
  • Job Market and AI
    Despite fears, AI has not caused significant job loss; it may even create new opportunities.
    “AI has only accounted for 4.7% of total layoffs.”
    @ 01h 15m 41s
    December 13, 2025
  • The Importance of Meaningful Work
    In a world influenced by AI, people need to feel useful and contribute meaningfully.
    “People need to feel like they're contributing and that their lives have meaning.”
    @ 01h 20m 48s
    December 13, 2025
  • The Importance of Trust
    In a world where trust is rare, it's critical for authorities to reassure the public.
    “Trust is vanishingly rare and that's bad.”
    @ 01h 28m 15s
    December 13, 2025
  • Gold Selling Venture
    Tucker launches a gold selling service aimed at transparency and fair pricing.
    “We're selling gold as close to wholesale as we possibly can.”
    @ 01h 32m 45s
    December 13, 2025

Key Moments

  • Trump's Enthusiasm @ 01:29
  • AI Branding @ 03:30
  • Media Censorship @ 22:16
  • Media Relevance @ 24:09
  • AI Risks @ 1:00:13
  • Trust Issues @ 1:28:15
  • Depoliticizing Sexuality @ 1:29:02
  • Gold Venture Launch @ 1:32:45
