
H-1B Shakeup, Kimmel Apology, Autism Causes, California Hate Speech Law

September 27, 2025 / 01:23:48

This episode covers the recent overhaul of H-1B visas, the implications of a new $100,000 fee for applications, and the ongoing discussion about autism and its potential causes. Guests include Chamath Palihapitiya, David Sacks, and Jason Calacanis.

The episode begins with Chamath discussing his recent trip to the UAE and the luxurious Emirates airline experience. The conversation quickly shifts to the significant changes in the H-1B visa program, with Sacks explaining the rationale behind the proposed $100,000 fee, which aims to reduce the number of low-skilled applicants and encourage higher-skilled workers.

Chamath shares his personal experience with the H-1B visa system, highlighting the abuse he witnessed in the IT sector. He and Sacks agree that the current system has been exploited, leading to wage suppression and a misallocation of visas.

The discussion then transitions to autism, with Freeberg presenting findings from a recent press conference involving Bobby Kennedy and President Trump. They discuss potential biological drivers of autism, including the role of folate receptors and the impact of acetaminophen during pregnancy.

The episode concludes with a debate about censorship and the implications of a new California bill targeting hate speech on social media, raising concerns about free speech and the potential for increased censorship.

TL;DR

The episode discusses H-1B visa changes, autism research, and California's hate speech bill implications.


00:00:00
All right, everybody. Welcome back to the number one podcast in the world, the All-In podcast. We're back. We're back.
00:00:07
We got the original crew here. It's a tight foursome with me again. He's
00:00:13
returned from, I believe, the UAE. And now, the one, the only, your chairman,
00:00:19
your dictator, Chamath Palihapitiya. He puts the dick in dictator. That's what they all say. How you doing?
00:00:25
Good. You? So, you went and you got that. Wow. What's that beautiful airline that we
00:00:30
all take to the region? Emirates first class. Yeah. Oh, Emirates first class cabin.
00:00:35
It's insane. With the wine. Take everybody in cuz I do business class for 14 dimes round
00:00:41
trip. Emirates is unbelievable. But the problem is there's like literally a thousand movies. A thousand. So you have
00:00:47
to like favorite out 30 or 40 of them. There was like 95 different menu
00:00:53
choices. I had probably 8,000 calories. Oh, really? And that was just the wine,
00:00:58
I take it. Yeah. By the way, the wine is incredible. The wine list had, like, 1996 Montroses. And I was
00:01:05
like, is this an airline? I had never seen an airline wine list. It was pretty strong.
00:01:10
Did you bring your sommelier, Josh? Was he in the cabin next to you with his own? I didn't need it. No, I didn't need it.
00:01:15
I like that. I like that. And of course, your Sultan of science. We got a great docket for him. It's kind of like the
00:01:20
Super Bowl for Sultan fans this week because Yeah. I hope I deliver, bro. I mean,
00:01:27
they cured autism and you're going to comment on it. It's a pretty big deal. It's a pretty big deal.
00:01:33
Let your winners ride. [Music]
00:01:41
We open sourced it to the fans and they've just gone crazy with it.
00:01:48
And then of course, ah, the one, the only. He puts the bizarre in czar.
00:01:55
David Sacks, calling in from his You got it backwards. It's It's You put the
00:02:02
czar in bizarre, not the bizarre in czar. Yeah, I kind of was. I'm playing with it. I'm workshopping it. I've been
00:02:09
trying both ways. Um but yes. And uh you're calling in. You're echoing which is great.
00:02:14
I'm on the road. You're on the road between meetings? Well, yeah, basically. Exactly.
00:02:19
Are you in a Are you in a motorcade? No, no. Okay, listen. The topic of the week,
00:02:24
H-1B visas are being overhauled. Trump administration announced a new
00:02:30
$100,000 fee for all future H-1B applications. It's a one-time fee. There's been a
00:02:37
little confusion about it uh and the details, but you know, that's how they they do things in the 47th. Just some
00:02:43
excitement, a big announcement, and then we figure out the details. Lutnick originally said it would be $100,000 a year, but then the White House
00:02:49
clarified it will be a one-time fee. This is a huge jump. The current fee is nothing. It's like 2 to 5K that you pay
00:02:56
to the government. You might pay a lawyer, you know, double that or triple that to to do the work for you if you're a big corporation. Um but um you know,
00:03:04
this this hits on a lot of the Trump campaign promises, tougher on immigration, looking out for US workers.
00:03:10
We've talked before here about the abuse uh in the H-1B system. I'll give some of my personal
00:03:17
um, you know, insights on that after maybe I throw it to you, Chamath. And before I do, there was an interesting Polymarket:
00:03:23
Will courts block Trump's 100K H-1B fee by September 30th? 3% chance of that
00:03:28
happening. So it looks like everybody's kind of aligned with this program. All right, Sacks. Uh I know
00:03:34
you're on the road, but your fans demand to hear your take on this.
00:03:40
What's your take? I think it's a good idea to have this $100,000
00:03:45
fee. And I'll tell you the reason why is because right now there's something like
00:03:51
five times as many H-1B applications as there are slots. So I think they grant
00:03:56
about 85,000 H-1Bs a year and many more apply for it. And as a result they have
00:04:04
a lottery where they just kind of I guess they randomly choose who the winners are going to be. And if you look
00:04:09
over the past decade, roughly half the H-1Bs go to these like IT consulting
00:04:14
firms. Yes. And the average salary is like $65,000 a year. So it kind of puts the lie to this
00:04:21
idea that you hear that H-1Bs are for like high-skilled engineers, AI researchers, things like that. That's
00:04:27
not in practice what happens. Uh in practice, what happens is you have this lottery and a huge chunk of them end up
00:04:34
going to low-end IT jobs. And I think by putting this $100,000 fee on it, you
00:04:42
encourage the applications to go to the actual higher skilled, higher paid jobs
00:04:47
where there's actually a shortage of Americans and you encourage US companies to try to fill those jobs with Americans
00:04:54
first. And so I think, you know, putting aside some of the the details, I think
00:04:59
the big picture here is that they're using market forces to put some scarcity
00:05:06
around that H-1B application. And I think what that's going to do is encourage applicants to to actually be
00:05:13
these higher-paid, higher-skilled jobs that the program is supposed to be for instead of these lower-end IT shops.
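As a back-of-the-envelope sketch of the lottery odds described above (the figures are the ones cited in the conversation, not official USCIS statistics), five applications per slot works out like this:

```python
# Rough sketch of H-1B lottery odds using the figures cited in the episode.
# These numbers are illustrative, not official USCIS data.
annual_cap = 85_000            # approximate H-1B visas granted per year
applications = 5 * annual_cap  # "five times as many applications as slots"

selection_probability = annual_cap / applications
print(f"Chance of winning the lottery: {selection_probability:.0%}")  # → 20%
```

A roughly one-in-five chance per application is what makes mass-filing by IT consulting firms attractive: submitting many low-cost applications reliably wins a large share of the slots.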
00:05:19
Yeah, these are supposed to be for highly specialized workers. I can tell you,
00:05:25
you know, when I was in IT in the early 90s, the abuse was happening all the time and it was indentured servitude. It
00:05:33
was disgraceful. The IT people would hire uh typically Indians and they would say stuff to the effect of these guys
00:05:40
are going to work for half as much and twice as long and they can't say no. That's the best part of it. They can't
00:05:46
say no when we put them on weekend coverage. You can't say no if we want to do a buildout and they have to work 10
00:05:51
days in a row because we can kick them out of the country and they have 30 days to find a new job. And so it's it's a
00:05:58
giant scam on the bottom half of these. I witnessed it firsthand. Every discussion I've ever had about H1Bs,
00:06:06
you know, in relation to IT and consulting has always been about saving money. And the truth is it's been
00:06:11
abused. And I talked about this in 2015 on CNBC Chamath when Trump first started
00:06:17
to talk about it. He's been on this for a while and uh it's just great to see them I I had suggested 20K a year and
00:06:24
that's kind of where they wound up. I additionally think they should do an auction for one-third of these. Let all
00:06:30
these big tech companies that are truly trying to get in very unique PhDs from Oxford in AI, man, let them just put out
00:06:39
how many they want to buy and at what price. Do a reverse auction and fill one-third of them with, I don't know, maybe
00:06:44
OpenAI or xAI, or Microsoft jumps the fence and pays 100K, 200K per person.
00:06:51
What do you think, Chamath, just broadly speaking, on this and the policy, the abuse, everything? I came to the United
00:06:58
States initially on a TN visa which is the NAFTA visa between Canada and
00:07:05
America and then I switched to an H-1B and then I got my green card and my citizenship in
00:07:10
the early 2010s. Elon came in on an H-1B. Sundar Pichai came in on an H-1B. Satya Nadella came in on an H-1B. There's a lot
00:07:17
of folks that have done a lot of good things that have used this specific visa.
00:07:23
That being said, I think Sacks is right that people have found an end-around and have
00:07:31
been abusing this H-1B system. There was an incredibly exhaustive thread by Robert Sterling.
00:07:39
I think it was about a year ago, but I wanted to use that as a jumping off point to explain a couple of reasons why
00:07:45
I think that there's been rampant abuse. The first thing is the H-1B program is
00:07:51
supposed to be 85,000 visas a year, but here is the data. And so what you see is
00:07:58
that in many years, including the last several, it's been upwards of 10 times
00:08:04
that number. And so there are a lot of people that
00:08:10
are getting shoehorned into this program. And when you see this, you can start to
00:08:16
see why a lot of people are saying that there is wage suppression and that it's taking away from American jobs. Because
00:08:22
if the program was meant to be for 85,000, you would think, well, listen, that's a drop in the bucket. Nobody would feel
00:08:28
that in the American economy. But when you start talking about almost a million people a year, 600,000 to a million a
00:08:34
year, that starts to be perceptible and that is absorbing a lot of revenue and wages that would
00:08:41
otherwise go to domestic-born workers and legal immigrants that are already
00:08:48
here. So that's thing number one. Thing number two is there was a myth
00:08:53
that these H-1Bs were these extremely highly skilled people. And what Robert
00:08:59
found out in the data is that actually, no, that's not really the case. And so I
00:09:05
think the average salary, I just want to get this exactly right, it's slightly under $120,000.
00:09:12
Now, if you started to tell me that these were the best-in-class PhDs in all
00:09:18
of these whiz-bang industries where the companies are raising billions and billions of dollars, you guys already
00:09:24
know that this salary would not pass the smell test. Most executive assistants at tech startups make more than $119,000 a
00:09:32
year. So, the idea that some qualified grad is making this should already sort of set off alarm bells that maybe where
00:09:39
there's smoke, there's fire. So that's the second thing. So number one, we've been overallocating by 5 to 10x. Number
00:09:47
two, these salaries aren't these incredible salaries that you think of, which tends to mean that there is the
00:09:53
potential, as Jason you said, in some form of indentured servitude and wage suppression. That's not good. And then
00:10:00
the third thing is you would ask the question, well, who gets these things? And it turns out, as Sax said, a large
00:10:08
plurality of these visas don't actually go to American companies that are
00:10:14
looking to hire talent to make this American business do better. These are
00:10:20
foreign companies that are arbitraging labor and bringing people in.
00:10:25
So crazy. Cognizant is not an American business. Infosys is not an American business. Tata, Wipro.
00:10:32
It's not to say that that in and of itself is wrong. But you need to find the right visa class to do this under.
00:10:38
Yeah, they're hacking it. And so when you put all of this together, I think the sort of broad
00:10:44
takeaway is from where this started and what it was intended to do, we've
00:10:50
deviated pretty wildly. And I think that this is a very important reset. Now, the
00:10:56
last comment I want to make is about the people that say, "Hold on, we are going to cut off our nose to spite our face."
00:11:02
And it's going to stop an inflow of incredible talent. And what I would just remind people is
00:11:09
that it is really important to remember that when you are in the United States for a masters or PhD, you already get an
00:11:15
automatic visa. It's called OPT. So you have multiple years when you graduate
00:11:21
from a useful degree program in the United States to find a job. I have several of these folks that work for me
00:11:26
at 8090. These are incredible grads from Carnegie Mellon. They are off-the-charts smart, but because they did a master's or
00:11:33
a PhD they come with a couple of years and you can oftentimes extend that and
00:11:38
that will give us a very good amount of time to figure out how exceptional they
00:11:44
are and then quite honestly I would gladly pay the 100,000 to get these guys on an H-1B program. So, I think if we're
00:11:52
going to try to return this to what it was meant to be, which is to help American companies excel, get the best
00:12:00
and the brightest. These changes, I think, are very good measures to course-correct and get us
00:12:06
towards that. And to just give people the history of this, this was something that was started after World War II to get really
00:12:11
specialized people like Polish and German, like geniuses building out rockets. And I had a really interesting
00:12:18
discussion. If you could pull this tweet up, there's another sinister wrinkle to
00:12:24
this. I had um this gentleman uh Colin on and he went to apply for a product
00:12:29
manager. I had him on this weekend startup. It's my other podcast and he applied and sent a resume with the
00:12:35
reference number for the specific job, Freeberg, that he wanted at a company, New Relic. And
00:12:43
he did this because in order to have an H-1B visa, you have to put the job in a newspaper,
00:12:50
right? So what these companies allegedly are doing is putting these jobs in these obscure newspapers so that
00:12:57
Americans don't see them. They're not putting them in places, you know, that you might see them. And there's a group of Americans who are going and finding
00:13:02
these jobs and saying to Americans, "Go ahead and apply. Here's the shadow jobs," I think is what they call them.
00:13:08
And so he put the reference number in there and they wouldn't even interview him. And I talked to him and he's kind of crestfallen. He's like, you know,
00:13:14
I'm I would like to apply for this job, but it's obvious that I can't get into it. This is like I think just shows the
00:13:21
entitlement of these tech companies. And I don't know New Relic's position on this. They can email us and I'll I'll
00:13:27
give it in the next episode, but they're basically listing fake jobs. and uh
00:13:33
somebody Abby from People Ops over there just kind of doesn't even let him interview for the job. The whole thing
00:13:39
is just really dirty at the low end, and at the high end it's under-monetized. So Freeberg, your thoughts on
00:13:47
this? I know you have a lot of friends. You're an immigrant yourself. I'm not sure how you got here and what visa you
00:13:53
came under, but I think it came when you were a kid, right? I'm not sure what your parents came under, but what are your thoughts on this and the impact it
00:13:59
might have? Well, I think there should be two separate programs for what we could call highly skilled workers. What
00:14:05
you were referring to after the end of World War II, there was a secret US
00:14:10
operation called Operation Paperclip where we tried to recruit German
00:14:17
scientists and engineers. Between 1945 and 1959,
00:14:23
America recruited, I think, 1,600 of these scientists. So it was both call it
00:14:30
disabling to an American rival or adversary but also expansive because that was when the nuclear industry was
00:14:36
growing and much of nuclear science was being pioneered in the earlier days in Germany.
00:14:42
And so the kind of American workforce expanded but more importantly a new industry was able to be enabled and
00:14:48
unlocked and grown in the US and then the the the German state was disabled by
00:14:54
losing these scientists. One could make the case that a similar sort of scenario should exist today that we should have a
00:14:59
second Operation Paperclip, and perhaps it should be a continuing process rather than necessarily kind of this laissez-faire
00:15:06
process that we have today, where we identify some of the top industries and the top scientists and these top domains
00:15:14
and go after those scientists proactively with government action, government support in partnership with
00:15:20
private industry. If you look at papers being published across mainstream
00:15:26
scientific journals, the majority of papers today across nearly every scientific domain are being
00:15:33
published out of China. And this ranges from physics to chemistry to material
00:15:39
science to biotech. And there's a real case to be made that perhaps those scientists would be better off and
00:15:45
America would be better off if they were doing their research pioneering here rather than there. Yeah. So I think that
00:15:52
there's a very good strategic case to be made that perhaps like a more directed high energy high effort kind of
00:15:59
operation paperclip be undertaken again around the world. The H-1B program I do agree has been heavily abused as a way
00:16:05
of kind of compensation arbitrage. And you know, if you find a highly qualified, excellent talent, as we all
00:16:12
know, for a high-skilled laborer in engineering or science today, that person, if you amortize the H-1B over 7
00:16:19
years at 100K, that application fee, that's 15K a year, call it, that
00:16:24
certainly seems worth it for the right sort of talent and it forces the question about can this person be found
00:16:30
in the United States or not. The alternative would be to force a higher salary range such that you as a company
00:16:36
are now basically being forced to pay a higher salary which means you have to justify that this person is worth it to
00:16:43
bring them in from ex US and you can't find the talent locally. I'll tell you a program where we do this where it
00:16:48
doesn't work in the US is called the H2A program. This is the immigrant farm worker program that we use for temporary
00:16:54
labor on farms. And the way that program is set up today is you have to pay the
00:17:00
farm worker that comes in on an H2A some amount over minimum wage. And the
00:17:06
amount that you have to pay over minimum wage is a function of the average wage in that state across all industry. In
00:17:12
the case of Florida, they're paying $5 to $10 over minimum wage for farm workers
00:17:18
and they cannot get any Americans to work on the farm. and they're being forced to pay $5 to $10 overage. And by
00:17:26
the way, these farmers and these farm businesses are being heavily subsidized by the government. One way to think about the ridiculousness of what's going
00:17:32
on is the US taxpayer is paying a premium salary to foreign workers.
00:17:39
What we should be doing is enabling when there's no workers available in the US, we should be enabling a free flow of
00:17:45
labor, but only in the case where there's no workers available in the US. But there is a downside to that model, as we're now seeing in the ag
00:17:51
industry. Farmers are losing money across the board and they're having to pay a premium for foreign workers to
00:17:57
come and work on the farm and they can't get US workers. There's two sides to the sword on this is my point. But I do
00:18:03
think this Operation Paperclip notion should be taken on as a separate kind of strategic mandate. Absolutely. Yeah. And that was a lot of
00:18:09
the Jewish scientists had already fled Germany. My understanding paperclip was for the Nazis, the former Nazi
00:18:16
scientists, and they were working on some pretty dark and cutting edge stuff in chemical
00:18:22
and biological. It just wasn't rockets, right? Well, it was everything. It was everything. Remember, at this era, we were just
00:18:27
developing quantum theory, and quantum theory led to nuclear science, which led to the development of the atomic bomb.
00:18:33
So, yeah, I think Operation Paperclip was pretty far-reaching. But today as an American, do you really want all of the
00:18:40
cutting-edge research in material science, in physics, in chemistry, etc. to accrue to China? Or should we be
00:18:46
thoughtful about it's intellectual talent that's making these breakthroughs? It's not necessarily institutional capacity. It's not like
00:18:53
they have better institutions per se than we do. We have amazing institutions, amazing capacity, amazing
00:18:58
place to live and so on. So there's a real kind of mandate that we should probably think about undertaking here, not just for extension of our industry
00:19:05
base, industrial base, but also for disabling what we would consider rivals, or what America might consider rivals. Just on this Operation Paperclip point,
00:19:12
it's interesting that China, the Chinese government took away the passports of the engineers at
00:19:21
DeepSeek after the launch of that model, or at least it was publicly reported. I can't attest to
00:19:28
this from firsthand knowledge, but there are definitely a lot of reports about this and you can see why. I mean, if we
00:19:35
could recruit or snap up a few hundred or at most a couple of thousand of these
00:19:41
top AI engineers, that would be a game-changer totally in the AI race. So, China has started to
00:19:46
see those people as a strategic asset and they're not going to let them immigrate, I don't think. But it it
00:19:52
would be something for us to think about. Certainly on the chip design front, there's probably just a few
00:19:58
thousand people. That's all we're talking about. That would be a game-changer on that side. Although they're largely
00:20:04
in Taiwan, not China. And again, I don't think the Taiwanese government's going to be too excited for us to snap them all
00:20:10
up and move them to America. But in both these fields, there are a relatively
00:20:16
small number of people, kind of like in the space race, who if they were all in America, it'd be a huge game changer. Here's the thing. We do have a really
00:20:24
rich diversity of people from all around the world in higher ed institutions in
00:20:29
the United States getting master's degrees and PhDs. We just need to be better organized about what to do with them and
00:20:37
we need to sort of reach out to those people, build relations with them, take advantage of OPT and then we can always
00:20:43
create a different class of visa for them. We have the ability to do these things called national interest
00:20:48
waivers. So all of the infrastructure exists and I think that if we can clean
00:20:54
the decks on the H-1B stuff, it'll give people a lot more incentive to support
00:21:01
the national interest waiver concept. I think the reason why people don't believe in this entire immigration
00:21:08
conversation is on every part of the distribution of immigration,
00:21:13
people see problems. They see an open border on the one side, worried it's going to be abused. Yeah. Exactly.
00:21:18
Exactly. Yeah, it's abused and it feels terrible. There have been too many of these horror stories where an American is told to
00:21:26
train their H-1B replacement who's making 25% as much. Look, I shouldn't even tell this story, but it was told to
00:21:33
me yesterday, and I don't know this to be true, so I don't want to fan the flames of speculation here, but what I
00:21:39
was told is that in certain countries, there are these puppy farms that
00:21:44
essentially get these kids onboarded into US colleges. They're not the great
00:21:50
colleges, but they are decent enough in all far-flung corners of America. get
00:21:56
them into master's programs, they pay for their school, and then these folks have to then send money back to pay off their
00:22:02
degree. So, if that's happening, then these kids are being abused as well, right? So, the whole thing has just
00:22:08
completely run amok, and I think we need to clean it up. At that point, we have the chance of rebuilding trust
00:22:13
where then we can propose what Freeberg talked about and everybody would be supportive because they see that the
00:22:19
system works as intended. I think closing the border has made people feel a lot better about it
00:22:25
because that was so abused that people just look at immigration in one bucket and they don't separate it into multiple
00:22:30
buckets. There's compassionate, you know, people who are true dissidents whom we want to show compassion for. It's a
00:22:36
small number of people. Then you have all these folks who were being taken with mules and coyotes over the border
00:22:43
and uh then using that same compassionate designation and abusing that. And and when you do see that
00:22:50
abuse, I think people when they hear Trump come on here and when he was um a presidential candidate, President Trump
00:22:55
came on and said, "Hey, we're going to staple a green card right to those degrees." And then he immediately got backlash. Well, now I think if things
00:23:02
have calmed down and now President Trump and the administration have the high ground. They could say, "Look, we closed the border. Now we should have a really
00:23:09
thoughtful discussion. We're giving these people American educations. They want to start companies. They want to
00:23:14
build our companies. They want to build products and services. Those products and services are going to create more jobs. And look, we're at 4% unemployment
00:23:20
4.x%. You know, we're in good shape because we closed the border." And that's when you had millions of people
00:23:26
coming in. And that was the true problem. And this isn't the problem. You make a really great point,
00:23:32
J-Cal. I mean, using the one word, immigration, to wrap up all of the people
00:23:38
that are coming to the United States, I think masks the real series of things or set of things that are underlying people
00:23:44
seeking asylum, people in need, but then also people that we want to go attract and bring here actively.
00:23:50
And it's not probably not the right term to use just the word immigration. It should have a recruitment and a
00:23:57
qualifier for every term. Yeah. The term is recruitment. What you described, the Paperclip model, that's recruitment. Then you have dissidents,
00:24:05
and that is compassion for true dissidents. And then there's family. Great. So you have
00:24:11
compassionate dissidents and family members. And then you have this big thing in the middle, which is everybody in the world wants to live here.
00:24:16
Everybody can't. That's immigration. And so just put it in three buckets. And then our leaders need to have
00:24:21
discussions, three different discussions and have it in a thoughtful way, not muddy the waters and politicize this.
00:24:28
That's what's been causing such a big problem. Both sides of this argument have been so charged hopefully now that
00:24:34
the biggest one has been dealt with by Trump. Trump said it at the UNGA. He said, "You
00:24:41
know how many people have crossed the border since I came into office?" He said, "Exactly zero."
00:24:46
Yeah. Yeah. That's incredible. I mean, we have the greatest military in
00:24:52
the world. We can't close the border to people who are coming on, you know, with backpacks across the border. Like, we
00:24:58
should be able to secure the border with drones and cameras. It was done deliberately. They opened the border and you know that,
00:25:05
of course, because remember when Texas tried to put up barbed wire
00:25:10
to enforce their own border? That's right. The Biden-controlled Border Patrol uh
00:25:16
got rid of it. They removed it. That was a test. No, this was done deliberately. Yeah, that was a perfect tell. Like I remember
00:25:23
I remember when we covered that last year. That was the total tell. Okay, Freeberg, some major news uh in science
00:25:29
this week. Let's talk about autism and the press conference that happened this week with Bobby Kennedy and President
00:25:37
Trump. Here is the chart. Autism has increased dramatically over the years.
00:25:42
There's a big debate of what's causing this and there's obviously correlations, there's causations, there's the testing
00:25:48
of this and maybe we're just testing uh a little bit too loosely around this,
00:25:53
but we went from 1 in 10,000 in 1970, to 1 in 1,000 in 1995, to 1 in 32 in 2022.
00:26:02
The press conference, Freeberg, was a little spicy and unique, performative
00:26:07
maybe were some of the criticisms, but there's a real issue here and why don't you take us through it and educate us so
00:26:12
we can kind of get to reality because the press is having a field day with this obviously on both sides. I think autism just like Alzheimer's
00:26:20
there may be several underlying conditions that lead to what we would call the phenotype of autism. That is
00:26:28
what we all observe as autism. You know, it's considered a spectrum disorder. There's many different variations of it.
00:26:34
There may actually be many different underlying conditions or underlying drivers, biological drivers that are
00:26:40
causing it. One of the drivers that came up during the press conference and in the subsequent interviews that Marty
00:26:46
Makary, head of the FDA, has done, is that they've identified and shared papers that have been out for some time
00:26:53
that there is a receptor that absorbs folate, a type of vitamin B and that
00:26:59
that folate receptor may be attacked by the immune system and as a result you can't really uptake vitamin B and so
00:27:04
those cells dysfunction and when those cells are dysfunctional you end up having what looks like what we call
00:27:10
autism. And so one of the things that they announced is they're going to work on getting the label updated for leucovorin,
00:27:16
which will resolve for many people the folate receptor issue. The other thing
00:27:21
they brought up is a paper that was done by Andrea Baccarelli, who's
00:27:29
dean of the Harvard T.H. Chan School of Public Health. This paper is a bit
00:27:34
old where he took several studies and analyzed them and showed that across 46
00:27:40
studies. Nine of them showed no association with acetaminophen, the main active
00:27:46
ingredient in Tylenol; four showed a negative association, meaning it was actually protective and good for the
00:27:51
fetus. And 27 had a slightly positive association, which means that it was
00:27:56
having some contributory effect on both ADHD and autism spectrum disorder when
00:28:03
women would take acetaminophen while pregnant. And Nick, if you want to just pull up that image from the paper, this
00:28:08
is the original paper that was published by Baccarelli. So again, he didn't do any primary research. He didn't actually go
00:28:14
and study patients. He took the data from 46 other studies and then he added
00:28:21
it all together to run this kind of macro analysis. And you can see here that he showed some risk. There's no
00:28:29
specific way to quantify that risk. But there's some increased risk of having attention deficit hyperactivity or
00:28:35
autism as a result of taking acetaminophen while pregnant. Now I
00:28:40
think autism again one of the underlyings might be this autoimmune condition associated with the folate
00:28:46
receptor. What causes autoimmunity is a whole other conversation, and we can get into the vaccine stuff if you guys want to, because there's obviously a lot
00:28:53
of conversations going on right now about the immune system being
00:28:58
primed to have kind of an autoantibody response, but there may be other
00:29:04
things contributing to it. So I think it's pretty clear that in our modern world, over the last couple of decades, there's a
00:29:10
cumulative effect of environmental exposures that children are getting. Whether it's microplastics,
00:29:17
whether it's chemicals in the food, whether it's just the environmental exposure in the air related to small
00:29:23
molecules, whether it's related to other things we're putting in our body. Every one of these things, the way to think
00:29:29
about it is maybe if it has an effect, it might increase your chance of autism by 0.05%. And then another thing
00:29:36
might increase your chance by 0.07%. And so on and so forth. And so when you add up all the things in our environment,
00:29:43
there may be a cumulative effect that results in different underlying
00:29:48
conditions in our body that may result in what looks like things that we call autism. And so none of these are very
00:29:55
specific. It's not one shot, one path, one specific thing, and I think that's very important.
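Freeberg's way of stacking small per-exposure risk increments, and the back-of-the-envelope point made later in the conversation that such increments multiplied across thousands of pregnancies could mean an extra case or two, is basic probability arithmetic. A minimal sketch, using the illustrative 0.05% and 0.07% figures from the conversation plus a hypothetical third increment and a hypothetical cohort size (none of these are real epidemiological numbers):

```python
# Illustrative per-exposure risk increments from the conversation
# (0.05% and 0.07%), plus a hypothetical third value.
increments = [0.0005, 0.0007, 0.0003]

# Naive additive estimate of the cumulative added risk.
additive = sum(increments)

# Treating exposures as independent, the chance that at least one
# has an effect is 1 - product of (1 - p_i).
combined = 1.0
for p in increments:
    combined *= 1.0 - p
combined = 1.0 - combined

# Multiplied across a hypothetical cohort, a tiny added risk still
# implies a few expected extra cases.
cohort = 4000
expected_extra_cases = combined * cohort

print(f"additive: {additive:.4%}")
print(f"independent combination: {combined:.4%}")
print(f"expected extra cases in a cohort of {cohort}: {expected_extra_cases:.1f}")
```

For increments this small, the independent combination is nearly identical to the simple sum, which is why the "just add them up" intuition in the conversation is a reasonable first approximation.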
00:30:00
Yeah. Let me just ask two clarifying questions, really lightning round, Freeberg. Number one, for the audience.
00:30:06
How is autism diagnosed in these studies? Is there a blood test, a genetic test or is it just a bunch of
00:30:14
questions? I know the answer, but I wanted you to clarify it for everybody. And then, what about the geographical
00:30:21
differences in autism just like we saw with trans kids, you know, there's many
00:30:28
in certain cities and none in others. So maybe you could talk a little bit about
00:30:33
those two issues which I think many people have been talking about. So I don't think that there's one
00:30:38
specific diagnostic test for autism as if it was one disease. Again, these are
00:30:44
phenotypes. These are behaviors that are being measured that people call autism spectrum disorder. And so the diagnostic
00:30:50
criteria fall under a set of screening and behavioral tests. And you know, one of the things
00:30:57
there's a survey, there's an observation, and there's a bunch of things. Is it like a score? Yeah, there's a scoring system.
00:31:03
Exactly. And so then there's different levels. What did you and Sacks score? Yeah. Where are you guys? Cuz we could
00:31:08
we could bet. Hold on, before you tell us, Chamath and I are going to bet on it. I'm going to say Sacks over by 16
00:31:15
points. Make it an over/under. Over/under. I think these guys are... Well, who's highest on
00:31:21
the spectrum? Well, Sacks may have gotten in the 90th percentile. He's definitely in the 90th.
00:31:26
We need a Polymarket. We need a... What did Reaper get on the market?
00:31:32
By the way, just to go back to the acetaminophen study. It's so great. Oh, look. There's the emotional
00:31:37
detachment. Go ahead. Go ahead. No reaction. You and I are laughing. Well, I mean, I'm making an active
00:31:43
choice not to engage. But I am making an active choice to repress
00:31:48
my emotions. One thing to note, and one of the controversies about all of this: the
00:31:53
paper that was published by this guy Baccarelli, I don't know if I'm pronouncing his name right, from Harvard. It's actually been challenged
00:32:00
because in 2023 he was called as an expert witness in a lawsuit against the maker of Tylenol. And in that lawsuit,
00:32:06
the judge threw out his testimony as unreliable because he was being paid $150,000
00:32:13
to give the expert testimony, to work on the case. And so because of the payment... What side was he on?
00:32:19
So this is the guy who published the paper that showed an increase. On the expert testimony, was he pro
00:32:25
Tylenol or anti Tylenol? So he was an expert witness for the lawyers that were filing
00:32:32
the claim against Tylenol. I will say in the last couple of days since this press conference he has publicly said that we
00:32:38
are not yet certain or sure about the link and I want to remind everyone that this association study indicates an
00:32:45
increased risk, but that doesn't mean that if you take Tylenol your child is going to have autism. That doesn't mean that there's a determinism here. There is
00:32:52
a statistical chance of a slight increase, and the more acetaminophen you take, the higher the
00:32:58
chance, and that's what the papers show. Let me take the other side of this. Here's what we... No, we know for sure that
00:33:06
there is this potential autoimmune issue that you as the mother can express as an
00:33:13
antibody. There's an antibody test for it. We also know that the child can be tested for it. So, at a minimum, we're
00:33:21
now at a point where we can create a very thorough well-funded study to get
00:33:28
to the core of this issue. Separately, to the extent that you do test positive,
00:33:34
there will be some doctors and some parents that may decide to take leucovorin prophylactically
00:33:41
and then also to administer it to the child before it's clear whether they do or do not have autism. I have several
00:33:48
friends who have kids on the spectrum. I talked to them about leucovorin. What they
00:33:53
say is that when you have extreme autism, the drug is very effective. But when you have kids that are more
00:33:59
sort of mild on the spectrum, then there's a lot of benefit from behavioral
00:34:05
modifications and behavioral training, and that it's not clear how effective that drug is. But in one specific
00:34:11
case, one of my friends is considering it for their child. Here's the point though. I think the point is that we need to test for this. And I think that
00:34:17
some combination of governments and industry should come together. Beyond that, the point that I want to
00:34:23
make, though, is this: Freeberg, I don't think you're adding to the conversation when you say that there's
00:34:29
no determinism because on the off chance that there is, I would say that we don't know yet. When you see women taking
00:34:35
acetaminophen in this performative way, basically
00:34:41
to try to own Donald Trump, and it's all over TikTok and it's all over X. I just think it's reckless and I
00:34:48
think it solves nothing. You're supposed to go talk to your doctor about it under any conditions anyway, but then
00:34:55
when you transform it into some sort of protest vote without really knowing, I think it's really dumb.
00:35:01
Yeah. And this test, by the way, Chamath, I'll say: this folate receptor autoantibody, there has been a
00:35:06
test available, I think since 2012. This has been around for some time. In fact, cancer patients take this test, of
00:35:12
course. And there was a paper published on this. I'm trying to find this paper. I'm not sure how much
00:35:18
follow-up there's been. This was a SUNY paper where these guys went and found
00:35:24
folks that scored very high on the autism spectrum disorder diagnostic test, and they found a high prevalence: over
00:35:30
70% had this folate receptor autoantibody. This was one paper, so I don't want to give it like a ton of
00:35:37
credence. There's a lot of follow-up that's happened since then and I'm not an expert in this space, but I did kind of do some research on what the history
00:35:44
is of this and so this is a very well-known kind of correlative effect and it could be a very big contributor
00:35:49
but obviously no one should feel like, hey, if I don't have autoantibodies then my kid is fine. That isn't
00:35:56
necessarily the case. Again, there may be several paths to autism. And the thing about acetaminophen, just like, hey,
00:36:01
have a drink or smoke a cigarette, the more you do, the higher the risk. This is the case with anything we put in
00:36:07
our body. And I think that's the point about like what they've identified in the in the paper that was published
00:36:12
regarding acetaminophen. And so there's still a lot to be kind of determined on how they're going to provide guidance to
00:36:18
women that are pregnant: do you have a fever, do you take it for pain, what's the right criteria. We all agree
00:36:25
medicating yourself to make a political point is... It's hilarious. Why would we do that? That is
00:36:31
absolutely stupid. Yeah,
00:36:37
I think for his next trick, President Trump should warn people not to snort rat poison. Yes. Also not a good idea.
00:36:44
Somebody was tweeting, "Next, President Trump's going to recommend not using toasters while you're in the bathtub."
00:36:50
Yeah. Just to see what people do. My fellow Americans, in light of recent
00:36:55
studies, I wanted to warn against the use of toasters in your bathtub or your
00:37:01
shower. Don't do it. Don't put the toaster in the... You know, people are gonna do it if he
00:37:06
tells them not to. Exactly. Exactly. I mean, I have such an inappropriate joke right now. I'm not going to say it.
00:37:11
Let's take Freeberg's math. When I saw that, you think maybe there's a 0.05%
00:37:17
chance. Well, guess what? If you stack that up and it actually turns out to be true across the thousands of women that
00:37:23
then performatively did this idiotically, you could actually have an extra kid or two with autism that didn't
00:37:30
need to have it. What is going on in America? Well, look, I think there were a
00:37:36
lot of people who instantly had a snarky reaction because it was Trump and Bobby
00:37:43
Kennedy making these claims about Tylenol and the media played into that. And look, I don't know what the truth of
00:37:49
it is, but there are... Even Tylenol tells you to not take it. They say call your doctor. Yeah, exactly. There are
00:37:55
How stupid are these people? There are plenty of articles establishing this risk. There's a paper from Johns
00:38:02
Hopkins from 2019 called taking Tylenol during pregnancy associated with
00:38:07
elevated risk for autism and ADHD. Pull this up. This is to your point. This is CNN's coverage of when these
00:38:13
acetaminophen studies came out historically. And you can see that when
00:38:19
um it was announced at the Trump conference yesterday, they said Trump links autism to acetaminophen used
00:38:24
during pregnancy despite decades of evidence it's safe. And then if you read back, so this is 2017, 2016,
00:38:33
acetaminophen during pregnancy may increase risk of hyperactivity. Studies link acetaminophen...
00:38:38
So when Trump says it, in the headline they dismiss it. But every other time, every other
00:38:46
time they've taken the paper and they've published a news article on the paper itself. I just couldn't believe this. All this
00:38:51
like CNN. Yeah. They ignore all their previous reporting on the subject. I mean, that's what's kind of crazy about it.
00:38:57
This supercut was hilarious. Check out... this is the supercut of Trump
00:39:02
saying Tylenol. It was hilarious. Don't take Tylenol. Don't take it with Tylenol. Don't take it. Don't take it.
00:39:10
Don't use Tylenol. Don't take Tylenol. Don't take Tylenol. Fight like hell not
00:39:15
to take it. I think you shouldn't take it. It's just hilarious how he presents information. Just don't take Tylenol.
00:39:22
Don't take Tylenol. Great week for Motrin, though. So, I heard sales are up for that. No, because it's also been established
00:39:27
for a very long time. You don't take ibuprofen or aspirin. Yeah, you definitely don't. Those are worse.
00:39:32
Ibuprofen is terrible if you're pregnant. So, wait a minute. Why is it specifically that this one mechanism of
00:39:38
action was completely inoculated and every other mechanism of action for things like pain relief and headaches
00:39:44
are known to be pretty bad for women and babies? Come on guys.
00:39:49
Yeah. Uh Freeberg, explain to us the disparity in different regions of the
00:39:54
world and autism. Would you think that's the surveys, and maybe people in the United States are a little more... Again,
00:40:02
I'm not an expert in this stuff, J-Cal. I can opine on kind of what I've read and I've done some studying, but think
00:40:08
about my level of knowledge being in the range of hours compared to yours, which is... Yeah, but I would say
00:40:15
I'm a complete expert on this because I've watched all seasons of Love on the Spectrum and I love that show. Have you seen
00:40:22
that? Love on the Spectrum. What is it? Oh, watch it with Nat. It's
00:40:28
She'll love it. It's great. I do agree with the general notion that we have a cumulative effect. Like many studies
00:40:34
that are done, by the way, are very organized around short duration, high exposure when it relates to novel
00:40:41
molecules in the environment and how we're exposed to them. Yeah. The cumulative long-term effect of
00:40:46
these things as we learned recently with respect to like the amount of microplastics and the endocrine disrupting nature of microplastics where
00:40:53
they can actually bind to specific receptors and as a result block the expression or binding of other proteins
00:40:59
which can have a systemic kind of effect on you. We didn't realize that until
00:41:04
recently after we've been making plastics for 70 years. And then we didn't realize these plastics were breaking down and cumulatively kind of,
00:41:11
you know, growing in our environment, in our water supply, as a result in our food, in our balls, in our brains, in
00:41:18
our hearts. And so what are these molecules doing as they accumulate in our body? And that now needs to be
00:41:24
studied and understood. And then we have this big challenging industrial question of what do we do about it? But that's true for nearly every novel molecule
00:41:31
we've developed over the past hundred years on Earth. Some of them, by the way, we could make the case. We can show deterministically they break down. They
00:41:38
don't persist. They're fine. And then some of them we could say like, look, there's a cumulative effect as more of
00:41:43
these molecules enter our body and some of them stay in our body. they over time increase the probability of things like
00:41:49
DNA breaks resulting in cancer or things like endocrine disruption resulting in metabolic effects or things like we
00:41:56
might be seeing now with folate receptor autoantibody presentations that drive autism spectrum disorder. So, I think
00:42:02
that there's a lot to be said generally speaking about the cumulative effect
00:42:08
about a lot of the stuff we put in our environment. And as much as I could disagree with things that Bobby Kennedy
00:42:13
might say, I can tell you asking these questions is critically important. As scientists, we have to constantly
00:42:18
interrogate and this cannot be just a political point. I'm not like going to sit here and say I agree with the
00:42:23
statements that might be made by the president when he says never take acetaminophen. I think it's a little more nuanced than that, frankly. But I do think that the
00:42:30
idea that we should be asking the questions and interrogating for answers is very very important. And so
00:42:35
I think that's an important takeaway because if you talk to people who have been impacted by this and people in the
00:42:41
community, the reaction was we're glad we're getting a lot of attention. Yeah, that was a weird press conference on the
00:42:47
margins, but we're glad that this is a focus now. And I think people are also glad about vaccines. Again, I don't
00:42:53
think we should be banning vaccines, but but I do like the idea that we're questioning all these things and we're not just giving the medical industrial
00:42:59
complex a complete pass on this and that we're spending money and investing in it. So, let's go on to our next story
00:43:06
around censorship. How much money do you think is spent in the USA annually on autism?
00:43:12
Gosh, do you think it's enough to fund a longitudinal trial?
00:43:18
Yeah, I mean, that's... I don't have the data to know, and then I guess you would
00:43:23
on a societal basis you would be thinking, is there a better use of capital, right? But we're a rich country. We have the ability to fund
00:43:29
some of this stuff, and there's private companies who will, too, right? I bet you we're wasting money in all
00:43:36
kinds of random places where finding a few billion dollars to run this trial is a good thing.
00:43:41
Yeah, that totally makes sense. All right, we've got a lot of news in the censorship space. So, this is this week
00:43:47
in censorship. So much to talk about. First story up. Jimmy Kimmel is back on
00:43:52
the air. Earlier this week, Disney announced that ABC would resume airing Kimmel's show. And I think that was
00:43:58
Tuesday night. Disney explained why they suspended Kimmel last week to avoid
00:44:04
further inflaming a tense situation at an emotional moment for our country. And they called Kimmel's comments ill-timed
00:44:11
and thus insensitive. He came back and had a massive amount of reach,
00:44:18
but Nexstar and Sinclair affiliates decided not to air it. And that was 60% of the market. So interesting. Polymarket
00:44:26
had "Jimmy Kimmel canceled by September 30th" at nearly 80% after
00:44:32
Kimmel was suspended last week, but that's plummeted to 1% since then.
00:44:38
It was quite emotional. I'm guessing everybody watched it. And I did not. I did not watch it.
00:44:43
Oh, you didn't watch it. Okay. Well, we'll play a clip here. It was, I felt, heartfelt and... Yeah, it was incredibly
00:44:49
heartfelt and deft in terms of its execution. It was sincere. And here it is.
00:44:54
I don't think what I have to say is going to make much of a difference. If you like me, you like me. If you don't, you don't. I have no illusions about
00:44:59
changing anyone's mind. But I do want to make something clear because it's important to me as a human. And that is,
00:45:05
you understand that it was never my intention to make light of the murder of a young man. I I don't
00:45:16
I don't think there's anything funny about it. I I posted a message on Instagram on the day he was killed
00:45:22
sending love to his family and asking for compassion and I meant it and I still do. Uh nor was it my intention to
00:45:28
blame any specific group for the actions of what was obviously a deeply disturbed individual. That was really
00:45:35
the opposite of the point I was trying to make. But I understand that to some that felt either ill-timed or unclear or
00:45:42
maybe both. And for those who think I did point a finger, I get why you're upset. If the situation was reversed,
00:45:48
there's a good chance I'd have felt the same way. I have many friends and family members on the other side who I love and
00:45:55
remain close to. Even though we don't agree on politics at all, I don't think the murderer who shot Charlie Kirk
00:46:02
represents anyone. This was a sick person who believed violence was a solution. And it isn't, ever.
00:46:11
Yeah. And he references later in this apology of sorts (I think it was pretty
00:46:18
clear he was apologizing) his own faith, his Christianity, and just how beautiful it was that the widow of
00:46:24
Charlie Kirk had forgiven the shooter and he got broken up about that as well. He then went on to do a bunch of jokes
00:46:30
and have a normal show. It was massive ratings. Obviously, everybody was tuned into it and
00:46:36
we'll see. He didn't apologize. I didn't hear an apology.
00:46:41
Okay. Did he apologize? Do you know what that means in the English language?
00:46:48
My understanding, if you look at Twitter and stuff, was that there was not technically an apology. Okay.
00:46:53
He didn't apologize and he didn't say what he had done wrong. I guess what he said is that I didn't mean to make light
00:46:59
of the situation. Yeah. Wasn't my intent. Yeah. It wasn't his intent. An explanation more than an apology.
00:47:05
Okay. Nobody was accusing him of making light of the murder. What he did and
00:47:10
what people were upset about is that he lied and said that the shooter was MAGA
00:47:16
and he did not hit the nail on the head in terms of addressing that and he's being called out for that. Now look, I
00:47:22
still think that his statement there, let's call it an apology, was constructive and positive because at
00:47:28
least he is showing empathy towards the other side. He obviously feels bad for
00:47:36
Erika Kirk and for Charlie Kirk, and in the current overheated political
00:47:42
environment, expressing empathy for the other side is a positive statement
00:47:49
and I think he definitely brought the temperature down and I think later in the statement he also makes an important
00:47:54
point about, he says, you know, just selfishly, I have threats on me, and what
00:48:01
he's basically saying is, look, we don't want to get into a civil war here. We don't want to get into a cycle of tit-
00:48:08
for-tat retaliation. Let's not play Hatfields and McCoys. These are my words. He didn't say this, but that was sort of
00:48:14
the intimation of what he was saying. And I think that is a good thing to say. I mean, no one here should want a civil
00:48:21
war and this thing can go off the rails really badly. So, look, I think that his statements were positive and and and
00:48:28
welcome and they showed empathy for the other side, but he did not fess up to
00:48:34
what he really did wrong here, which was to claim that the shooter was MAGA. That was the thing that was deeply offensive.
00:48:41
Yeah. And hold on. And the reason why he did that is he was not the only one
00:48:46
doing it. In the early days of this shooting, of this assassination, it was a talking point on the left that this
00:48:54
shooter could be right-wing. And the reason why people on the left were saying that is it was exculpatory. It
00:49:01
was basically to put the blame on the other side instead of looking in the mirror and owning up to the fact that
00:49:07
there is this rise of left-wing political violence and assassination culture. As we demonstrated on the pod
00:49:13
last week by looking at all the data and all the numbers, there really is this poisonous ideology
00:49:19
that is on the left, and yes, there's some of it on the right, but way more of it on the left, that political violence
00:49:26
can be used to solve problems. And the left really does need to look in the mirror and rid itself of that ideology.
00:49:33
And by not admitting that this assassin was motivated by that ideology, they are
00:49:40
ignoring that opportunity for self-reflection and for progress. That last part is important. I think the
00:49:45
point is that when you say that somebody is mentally deranged, what most normal people do is then say, "Oh, it was an
00:49:52
aberration. It was an outlier." And I think that that is a dangerous way to
00:49:57
try to sweep under the rug what is something that's more virulent
00:50:03
and is increasingly acceptable in society. This guy might have been crazy in the
00:50:08
sense that he was willing to use murder to achieve his objectives. I think we can all say on some level that's crazy.
00:50:15
It doesn't mean he wasn't animated by an ideology that lots of people believe.
00:50:20
And I think the proof of this was the celebratory reaction to the assassination of Charlie Kirk. You saw
00:50:26
it on TikTok. You saw it on Bluesky. You saw it in the corners of social media. You definitely saw it on Reddit
00:50:32
where you had thousands, maybe hundreds of thousands of people celebrating the death of Charlie Kirk and basically
00:50:39
buying into this idea of political violence as the solution to their
00:50:44
problem and to the idea that it was acceptable to use violence against people they hated. And so again, this is
00:50:51
the problem with the random nut theory is that it really ignores all the evidence we have about the larger
00:50:58
reaction to the Charlie Kirk assassination. And this is a thing that the left really doesn't want to
00:51:04
confront. It does not want to look in the mirror here and say that we have a problem on the left with this
00:51:10
assassination culture. And we talked about this last week. And if you look at polling, they just did polling around
00:51:16
this and there's still millions of people on the left who believe that the shooter was MAGA. And Jimmy Kimmel
00:51:22
helped foster that belief with this disinformation that he put out there.
00:51:28
And he really should have hit the nail on the head in terms of saying that he got that wrong, that it was wrong to say
00:51:34
that. And he he could have been a little bit clear about that. I'm not dismissing the positive things he said because I do
00:51:39
think it was good for him to show that. I mean, I'm gonna give him credit for getting emotional. I know I know a lot
00:51:45
of people on the right think that it wasn't sincere. I think that it probably was sincere. I'm going to give him
00:51:50
credit for that. I think his comments were constructive, but he did not apologize for the thing he actually did
00:51:56
wrong. And in fact, he just replaced that original lie with a new form of
00:52:02
left-wing spin, which was the random nut theory. And I think, yeah, let me stop there.
00:52:07
But the but the point is that really we need to come to grips with the fact that there is this toxic political ideology
00:52:15
now that's mostly on the left that does need to be confronted. I'm going to go ahead and say, you know,
00:52:20
we should clean up a little bit here, or I'm going to clean up. He should have, I think, made it clear that this
00:52:27
wasn't a MAGA person, but I'm just going to repeat the quote. We hit some new lows over the weekend with the MAGA gang
00:52:33
desperately trying to characterize this kid who murdered Charlie Kirk as anything other than one of them and doing
00:52:39
everything they can to score political points from it. Now, he should have said that turned out not to be true. It was
00:52:45
actually a liberal and I think it was not that he said the guy was MAGA. He said people were speculating he was
00:52:52
MAGA. That actually was true. So, just to be clear here, he never said the person was MAGA. Wait, who was
00:52:58
speculating? A bunch of people were speculating. Hold on. Charlie Kirk's fans, his loved ones, his
00:53:05
co-workers weren't speculating about this. It was people on the left. Hold on. People on the left of which Jimmy
00:53:11
Kimmel was one were trying to plant the story that this assassin could be MAGA.
00:53:18
Well, I mean, this is done deliberately. Hold on. That was done deliberately as disinformation to take the blame off of
00:53:25
the political ideology. Yeah. I don't think anybody... No. And the culture of political assassination that's been rising on the
00:53:32
left. Okay. So, over that weekend, there was definitive speculation that this might be a Groper, one of Nick Quentis' fans.
00:53:38
There was also significant speculation. And this is why people shouldn't speculate in a breaking news environment
00:53:44
because you'll frequently get it wrong. And people had gotten it wrong because they assumed that this individual was
00:53:50
MAGA because his parents were. And then when it turned out it wasn't true,
00:53:56
that's when the record got corrected. But this was one of these instances
00:54:01
where people actually were speculating and they were wrong. And that's why you should always wait in a breaking news
00:54:08
environment. Well, you're using a lot of passive tense in order to avoid who said what.
00:54:14
How so? You said there was a lot of speculation. No, I was talking about social media speculation. There was an effort to create a
00:54:20
narrative, a false narrative that somehow MAGA was to blame for this. And Jimmy
00:54:27
Kimmel was one of the leading people who did that. We covered last week
00:54:33
the rundown, the timeline of what was known at the time that Jimmy Kimmel said that. And Megyn Kelly did an excellent
00:54:40
job summarizing everything we knew. We knew what was written on the bullet casings. We knew what the parents had said. We knew what the friends had said.
00:54:46
Megyn's tweet on this is very exhaustive. So, you're acting like there was a legitimate basis for Jimmy to say
00:54:52
what he said and there wasn't. And I don't know why you're covering for him right now. No, no, no. I'm not covering for anybody. I'm clarifying a factual error.
00:54:58
He did not say the person was MAGA. He said there was speculation that weekend, and that weekend there was speculation
00:55:04
that this was a Groyper and that the family was MAGA. And I'm just... On social media, everybody was
00:55:11
speculating about this. And I think the problem here is that we're saying that the left wants to
00:55:18
assassinate people. The left has assassination culture. I know many people on the left, I don't know anybody, I haven't talked to a single
00:55:25
person on the left who is in favor of what happened or in any way supports it.
00:55:30
And it's quite the opposite. Every single person I know on the left, every single person, you know, who is a
00:55:36
high-profile person on the left, with the exception of maybe one or two really dark people,
00:55:43
they have all said that this is horrible and tragic and there is no place in
00:55:48
civil society for violence. So, I'm trying to actually tamp this down and
00:55:53
say this wasn't the left. This is not the left's strategy. The left does not want to murder people. That is
00:56:00
absolutely false. And the left is not pro assassination culture. All the leadership on the left,
00:56:06
you certainly changed your tune, because last week we showed you plenty of data showing that there were three times as
00:56:12
many people on the left who are celebrating or endorsing political
00:56:18
violence. You didn't object to that then. You didn't show me data on the
00:56:23
opposite. You're just saying that no one you know... Well, I'm glad that no one you know is celebrating political murder.
00:56:29
The only person, and the only leader, I saw was Ilhan Omar, who was just saying
00:56:35
like Charlie Kirk's got a terrible legacy, etc. Inappropriate comments there, obviously, but every other person
00:56:41
uniformly condemned this. Now, I know there's tons of surveys out there, and I think if we're going to talk about
00:56:46
political violence, the right also has a political violence problem that they need to address. We saw on January 6th MAGA
00:56:54
and all of these people beat police officers and destroy the Capitol. That is also political violence. It's not an
00:57:00
assassination. It's obviously distinctly different, but they were beating cops. Okay? And the price they paid for that
00:57:08
was they were all given pardons for beating police officers. Okay? So there
00:57:14
needs to be better leadership from the Democrats and the right on this. And everybody needs to calm this down and
00:57:20
say this is not acceptable. Whether you're beating cops up on January 6th, assassinating somebody, threatening
00:57:27
people so that they have to fight, this is all terrible. Everybody has to calm
00:57:32
everything down. And that's an example I'm trying to make here on this program is to have productive dialogue even
00:57:38
though we disagree about things. Every person I know who's on the left absolutely believes
00:57:45
this is abhorrent and they would never condone it. Period. Full stop. And every leader, with the exception of like one or
00:57:52
two people who I don't understand why they would ever criticize, you know, a Christian who was murdered in cold
00:57:57
blood. I'm sorry, that's totally unacceptable. But anyway, that's my position on it.
00:58:03
Look, I understand this is a heated issue. Let me just make one final point. I know that both sides have their
00:58:08
quote unquote nut cases, violent extremists who engage in horrific crimes and they should all be denounced
00:58:15
equally. The difference here that I think we saw with the Charlie Kirk assassination is that you saw thousands,
00:58:22
maybe even upwards of a hundred thousand people on the left on social media
00:58:27
rejoice and celebrate his assassination or downplay it and minimize it on the
00:58:33
grounds that somehow he deserved it for the things he said. And I just have to say I don't think we've seen that
00:58:39
behavior before on the part of the right, whenever there's been some horrific crime. I don't remember anyone
00:58:46
on the right ever celebrating that. It was not something that was mainstream discourse by any means. I certainly do
00:58:52
not see thousands of videos and Reddit posts celebrating that. And I think what
00:58:57
you see in the polling data is that yes, there are some people on the right who feel that political violence is
00:59:04
acceptable or a solution, but that number is three times greater on the left. I'm glad J-Cal doesn't know any of those
00:59:10
people. That's reassuring, but nonetheless, it's there in large amounts of data. And I think that we need to
00:59:16
address that problem without minimizing it or both sidesing it or else we're never going to make progress as a country.
00:59:22
Okay, J-Cal and Chamath both had to run. We started late today and we ran a little bit too long for both of them.
00:59:28
Sacks and I are going to wrap it up with a quick conversation on AI. Sacks, I don't know if you saw, but there were two
00:59:33
papers that were published this week, each of which on their own, I would say, were pretty important. I'll
00:59:39
highlight the first one and Nick if you could just pull this first one up. This is the MIT paper. So this paper is
00:59:46
called teaching LLMs to plan and effectively what this uh team did and
00:59:52
again they were out of MIT in collaboration with a scientist at Microsoft AI in Mountain View. They
01:00:00
basically created an instruction tuning framework that teaches LLMs to do
01:00:05
symbolic planning, which basically means the LLM thinks step by step, or
01:00:10
does chain of thought, in a smarter way by making it generate explicit state-
01:00:17
action-state chains, and then they trained that model by giving it
01:00:23
feedback with an external plan validator, which is effectively going to be a human or a
01:00:29
software tool that says, did this series of steps make sense to do the thing you're trying to do? If not, here's what
01:00:36
you did wrong, here's what you should have done better. And they were able to achieve planning accuracy of up to 94%
01:00:43
on some standardized benchmarks that are used for chain-of-thought reasoning and planning with LLMs. This is a 66%
01:00:52
absolute improvement over baseline models. And so this is pretty substantial. They took Llama 3 and they
01:00:58
were able to increase the performance from 1% to 64%. The outcome of this
01:01:03
basically is that this sort of a system can be used to train LLMs to do better
01:01:08
reasoning and better chain of thought in such a dramatic way that LLMs will look
01:01:13
like they are starting to reason. And so by training them effectively on the steps in planning on how to reason, the
01:01:20
LLMs get better at looking like they're doing reasoning using this kind of symbolic planning method that they then
01:01:26
built a tuning framework around. That sounded a little bit complicated, but I think ultimately what it translates to
01:01:31
is they figured out a method to get AI to act in a more reasoned way in developing step-by-step plans and
01:01:38
executing against those plans. And the results and the benchmarks are incredible. So this was a big
01:01:43
breakthrough I would say this week. Sacks, I don't know if you spent any time looking at this paper from MIT or talked with your team about it.
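The validator-in-the-loop idea described above (the model proposes an explicit state-action-state chain, an external checker verifies each transition, and the failure reason becomes a feedback signal) can be sketched in a few lines. The toy room-navigation domain and all the names below are illustrative assumptions, not code from the MIT paper:

```python
# Hypothetical sketch of an external plan validator: it checks each
# (state, action, next_state) step of a proposed chain against a known
# domain and returns corrective feedback for the first bad step.
# The domain, function names, and plans here are invented for illustration.

def validate_chain(chain, transitions):
    """Check that each (state, action, next_state) step is legal.

    chain: list of (state, action, next_state) tuples proposed by the model.
    transitions: dict mapping (state, action) -> expected next_state.
    Returns (ok, feedback) where feedback explains the first bad step, if any.
    """
    for i, (state, action, next_state) in enumerate(chain):
        expected = transitions.get((state, action))
        if expected is None:
            return False, f"step {i}: action '{action}' is illegal in state '{state}'"
        if expected != next_state:
            return False, (f"step {i}: '{action}' in '{state}' leads to "
                           f"'{expected}', not '{next_state}'")
        # The chain must be connected: each step starts where the last ended.
        if i + 1 < len(chain) and chain[i + 1][0] != next_state:
            return False, f"step {i + 1}: chain is disconnected"
    return True, "plan is valid"

# Tiny illustrative domain: a robot moving between rooms.
TRANSITIONS = {
    ("hall", "go-kitchen"): "kitchen",
    ("kitchen", "pick-cup"): "kitchen-holding-cup",
    ("kitchen-holding-cup", "go-hall"): "hall-holding-cup",
}

good_plan = [("hall", "go-kitchen", "kitchen"),
             ("kitchen", "pick-cup", "kitchen-holding-cup"),
             ("kitchen-holding-cup", "go-hall", "hall-holding-cup")]
bad_plan = [("hall", "pick-cup", "kitchen")]

print(validate_chain(good_plan, TRANSITIONS))  # (True, 'plan is valid')
print(validate_chain(bad_plan, TRANSITIONS))   # (False, "step 0: action 'pick-cup' is illegal in state 'hall'")
```

In the tuning framework the hosts describe, feedback strings like these would be fed back to the model alongside labeled good and bad plans, rather than just a pass/fail bit.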
01:01:48
I haven't seen it, but what what exactly is the symbolic framework they're talking about exact you know what
01:01:53
exactly is that that I mean I understand chain of thought but what is it that that improves accuracy?
01:01:59
There's an old language called PDDL, or Planning Domain Definition Language.
01:02:05
PDDL is kind of an attempt to standardize AI planning languages. So
01:02:11
it's been around for a long time. I think it's been around since like the late 90s and it's effectively a series
01:02:17
of symbols that define planning. What they were then doing is basically using PDDL
01:02:24
to try and set a series of steps that the LLM would use to reason and get to
01:02:29
an answer on doing a task or running an action. And then they tuned the PDDL
01:02:35
using this tuning framework that they developed, giving it feedback. And then they also fed it good plans and bad plans and said this is a good plan, this
01:02:42
is a bad plan. And so overall the LLM was then run in such a way that it
01:02:47
actually had a better set of steps that it would use to solve a particular problem. And so this can then lead to
01:02:53
all of the underlying machinery of an LLM being better utilized to solve a bigger problem, to solve kind of a
01:02:59
chain of thought or to solve some reasoning problem that requires several steps or planning. I think it was a very
01:03:05
good breakthrough. The benchmark data that they shared was pretty impressive and it's getting quite a bit of
01:03:11
attention this week. That was one I think really interesting paper that came out this week. The other one and Nick
01:03:17
maybe you can pull this one up. So this one's really impressive, Sacks. This comes from a team in Germany. This paper
01:03:24
was published in the journal Nature Computational Science. These folks took a GPU, and for each token, typically
01:03:31
you'll have the entire key-value cache transferred from high-bandwidth memory
01:03:37
to cache memory. So this means that you're moving a lot of data between one type of memory and another type of
01:03:43
memory. And what they were able to do is they were actually able to reduce the
01:03:50
physical memory size that's needed to run the attention window. As a result,
01:03:55
the energy and the total token cost to run inference went down significantly.
01:04:02
I'm trying to simplify this down as best I can, but what matters is the end data that they provided. Their
01:04:08
architecture led to a speed-up of 7,000x compared to the Nvidia Jetson Nano, 300x
01:04:16
compared with the Nvidia RTX 4090, and then 100x compared to the Nvidia H100. And
01:04:22
the energy was reduced by 40,000x compared to the Jetson Nano, 90,000x compared to the RTX 4090,
01:04:28
and a 70,000x energy reduction for the same outcome over an H100.
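To see why shrinking the attention window's physical memory footprint matters so much, a rough back-of-envelope sketch of KV-cache size helps. The model shape below is a generic hypothetical, not the German team's setup; the formula is just the standard key-value cache size calculation for transformer inference:

```python
# Back-of-envelope sketch: during inference, every generated token re-reads
# the entire cached key/value tensors from high-bandwidth memory, so KV-cache
# size is a good proxy for memory traffic and energy per token.
# KV cache bytes = 2 (K and V) * layers * kv_heads * head_dim * bytes per value * tokens.

def kv_cache_bytes(seq_len, n_layers, n_kv_heads, head_dim, bytes_per_val=2):
    """Bytes of key/value cache held for a context of seq_len tokens."""
    return 2 * n_layers * n_kv_heads * head_dim * bytes_per_val * seq_len

# Hypothetical 7B-class model in fp16: 32 layers, 8 KV heads, head_dim 128.
per_token = kv_cache_bytes(1, 32, 8, 128)     # cache added per token
at_8k = kv_cache_bytes(8192, 32, 8, 128)      # cache for an 8k-token context

print(f"{per_token / 1024:.0f} KiB of KV cache per token")
print(f"{at_8k / 2**30:.1f} GiB of KV cache at 8k context")
```

Streaming a cache that size through the memory hierarchy for every new token is the traffic the German architecture attacks, which is why reducing the resident memory for the attention window cuts both energy and token cost.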
01:04:35
So I think that this mechanism, if it scales, this new kind of technique can
01:04:41
have a pretty dramatic effect on the energy consumption needed to run AI. And
01:04:47
importantly, because you need far less memory, you can actually move a lot of
01:04:52
AI inference to the edge of the network. Meaning you could put, for example, a very high-powered LLM that could be
01:04:59
run in a robot or in a piece of equipment or in a computer or on your phone that historically you'd need to
01:05:05
run in a data center because you needed a very high-powered GPU chip stack. And so
01:05:11
this architecture, I think, could be one of these big architectural breakthroughs. We've spoken with Sergey
01:05:17
Brin and Eric Schmidt and Sundar and Demis about the big architectural
01:05:23
breakthroughs that are coming in AI that could ultimately lead to many orders of
01:05:28
magnitude reduction in the energy cost needed to run inference and to run AI models. Again, if this scales, then all
01:05:35
of our assumptions about the data center, about the energy can start to
01:05:42
kind of be thought about under this new kind of architectural framework, which might actually result in much much lower
01:05:48
need states. We'll see. But it was a really, I think, important paper and folks are going to look up this paper and say this could be a pivot point in
01:05:55
how we think about the energy and infrastructure needs to support AI. I don't know if you and your team have reviewed it, but it's definitely worth
01:06:01
spending some time on. Yeah, look, I think the writing was on the wall that models are going to get smaller and
01:06:09
smaller and more efficient to the point where they can run on the edge on local
01:06:14
devices. I mean, that was one of the implications of Deep Seek. But if you
01:06:19
look more recently at I think the launch of Llama 4, their smallest model, I think it's called Scout,
01:06:26
runs on a single GPU, right? So, I think we're going to have a
01:06:32
whole range of smart devices that will have a single GPU running a pretty
01:06:37
decent AI model. And I mean, obviously, your phone will have one too, probably a
01:06:43
much better one. Have you and your team talked about what the energy demand curve looks like as these better architectures come in? Like, if
01:06:49
we're talking about 10,000x reduction in energy to run a token, have you guys thought about well, you know, does
01:06:55
energy scale as we've projected it to scale? Does data center need scale like we've projected it to? Or do you think
01:07:00
that because they're more efficient, we'll actually have more demand? That just sounds a little too good to be true right now,
01:07:05
right? As between papers and products, I pay a lot of attention to the launch of products and I don't pay a lot of
01:07:11
attention to papers, right? I know that some papers end up being really important. For example, the paper
01:07:19
on the transformer architecture back in 2017 turned out to be enormously important. But I think that a lot of
01:07:26
papers just don't really go anywhere for whatever reason. Maybe they're hard to reproduce or they don't scale or what have you. So I just don't really pay
01:07:33
that much attention to the academic literature. I do pay a lot of attention to product launches. And when someone launches something
01:07:39
revolutionary, then it immediately gets everyone's attention because you don't have to speculate about whether a proof
01:07:45
of concept is going to be possible or not. You actually see it. I guess what I'm saying is that the proof of the
01:07:50
pudding is in the eating, right? I think we're going to need a lot more power, a lot more electricity. I
01:07:56
think that's pretty well known. We haven't even gotten to the robot revolution yet. That's coming in the next 5 years. That's going to be energy
01:08:03
intensive. So, if this thing's even close to being correct, then you could run the most
01:08:09
kind of sophisticated LLMs in a robot without it needing to be run out of a
01:08:15
data center going forward. And the robots can simply make a request for information from the internet that they
01:08:21
need. But all of the actual computation, the reasoning, all of the base knowledge would sit
01:08:28
locally in that device. It's really incredible to think about. Like we are going to end up with these like robots. It's amazing.
01:08:34
Yeah. Well, I think that's right. I mean, I think that self-driving wouldn't work if you had to run all the inference on the
01:08:41
cloud. I mean, it's run locally, right, by powerful AI chips.
01:08:47
And then obviously it can connect when it needs to. But no, I I would expect that robots are going to have a local AI
01:08:52
model. Yeah. Okay, cool. Well, that's it. I mean, those were the two papers I
01:08:58
thought were really pretty impressive this week. Hey, Freeberg, what exactly happened with YouTube? Do you have an update on
01:09:04
what happened with our episodes from the All-In Summit that appear to be shadowbanned? Yeah, tell us what
01:09:11
happened there. Okay, so thank you to the folks at YouTube. They actually worked all weekend to help us figure out what happened and there was nothing
01:09:17
nefarious. There was no shadow banning going on. What happened was, you guys may recall a couple of months ago, we
01:09:22
stopped bleeping out curse words in our episodes and we muted them instead. And when we muted them, the YouTube
01:09:29
algorithm still thought that we were saying the curse word quietly and it still showed up in the YouTube
01:09:35
transcript. When you have a curse word in a video, YouTube marks it as restricted. So, it's kind of not age
01:09:41
appropriate. And so, that's why it was getting the restricted mark. When we went back, the episodes that did get restricted all had a curse word in them
01:09:49
and we understand clearly what happened. So going forward, we are going to use the bleeping again instead of just
01:09:54
muting. It was very benign, not nefarious. YouTube did a great job supporting us. We went back and fixed
01:10:00
all the old episodes, so they're all out of restricted mode and we started reposting all of our summit videos again. So yeah, I mean, conspiracy
01:10:08
corner is closed on that one. Well, hold on. Do creators know about this, that if you have F-bombs in your
01:10:15
show, you go into restricted mode? You know, that's a great question. We were talking to the
01:10:21
YouTube product team about this. There's no easy way for YouTube creators to see that a video has been tagged as
01:10:28
restricted. And so, they need to fix that. They're going to fix that. They told us. And so, I think we should all
01:10:34
kind of continue to hold them to that, because right now creators don't know why. One of the questions we had for them, which we thought was a
01:10:40
theory was if people report your video does it automatically go into restricted
01:10:47
mode and the answer is no. So the reporting triggers a review separately but the restricted mode algorithm is
01:10:54
distinct. But when it comes to this like restricted mode being triggered you
01:10:59
don't know that it happened. You don't get a notice. You're not aware of it. And they need to address that. Obviously, they need to have like a dashboard that
01:11:06
shows you any kind of restriction on your videos and a reason code for why.
01:11:11
That's right. And specific timestamps, cuz their engineers were able to pull it up for us, look at the timestamps, point us to them, and we
01:11:18
could see what happened. That should be apparent; they should present that to the creators so they know why they got restricted. I think part of the argument
01:11:24
was like, well, restricted mode in YouTube isn't a big deal. It turns out it is a big deal. We saw it in our traffic. We had big drop off because a
01:11:31
lot of network administrators, so the people that run the Wi-Fi at Starbucks or on your public bus and subway or in
01:11:38
your, you know, office, they have a network setting that's called safe mode. And safe mode was originally designed to
01:11:45
block porn or other not safe for work content at work. But it also triggers the restricted mode being blocked on
01:11:52
YouTube. And so if you're in one of those public networks and you're trying to access YouTube and you're in a restricted video, you lose that entire
01:11:59
audience. So it turns out I think it actually is a bigger deal than folks realize that videos are getting tagged as restricted mode. At least I think it
01:12:05
is, and they should do a better job kind of surfacing things, and then creators should be able to go back in and
01:12:12
correct any issues that might be causing that to be restricted. But I don't feel like the policy itself was bad. I think
01:12:17
there was an algorithm problem where their software didn't pick up that we had muted bad words and it was more
01:12:24
apparent previously when we bleeped them. So, we're going to go back to bleeping until they
01:12:30
was there any weaponized reporting of content or we just don't think that was a thing. No, we know that to not be true. We
01:12:36
checked that. I checked that at the high level and the answer is no. And I think we feel very good about that. There's no
01:12:42
like mechanism either that if people do blast reporting or they try and, you know, as we used to joke, brigade,
01:12:49
it doesn't actually trigger anything. So, got it. And speaking of YouTube, there
01:12:54
was a really important report out this week where I think we kind of knew this,
01:13:00
but YouTube acknowledged that during the Biden administration, I think this was like roughly 2019 to 2022, that time
01:13:08
frame, that they censored, I think something like a million videos at the
01:13:14
behest of the Biden administration. I guess that would have started in 2021.
01:13:19
And they admitted that they were pressured by the administration. Zuckerberg had said the same thing about
01:13:24
Meta and the Twitter files informed us about the same thing. But now YouTube has finally acknowledged that and
01:13:31
there's a big release on that. I'll tell you my view on this. I think that the censorship that happened during
01:13:36
that era is very important to have happened because it has brought a light
01:13:42
to it in a way that now there's a hypersensitivity to it not happening again.
01:13:47
And I actually think that that's very good. So the fact that it happened has now created a real sense that going
01:13:53
forward the policy limits, the boundaries are now more clear than they ever were. It's not just about the Trump
01:14:00
admin, which I think a lot of mainstream media tries to make it about, but it really is about the importance of free
01:14:05
speech and censorship and who decides what's objective truth or not. I mean, going back to all of the COVID era
01:14:11
discussions, not allowing people to have discussions clearly is a problem. And
01:14:16
speaking of this topic, I don't know if you saw this, but there is this kind of hate speech bill that passed out of
01:14:23
the Assembly and the Senate in California that's now on Gavin Newsom's desk to sign, which basically would fine
01:14:29
social networks that allow content to show up on their social network that the state of California deems to be hate
01:14:36
speech. And so whatever language or terms the state of California calls hate speech, and you could see how this could
01:14:42
become a very slippery slope very fast, they can now find a social media company millions of dollars, which in and of
01:14:48
itself could actually propagate a whole new censorship regime where people that are using certain terms that in that era
01:14:55
are considered bad terms or hate speech terms, they're afraid that they don't want to get fined tens of millions of
01:15:01
dollars. So they block all that content. And I do think that if this gets signed by Governor Newsom, it could trigger a
01:15:08
whole new kind of censorship battle in the months and years ahead. We'll see. Well, I think that's exactly right. I
01:15:14
think the bill you're referring to is SB 771, and it is an EU-style suppression of
01:15:20
quote unquote hate speech on social networks. The problem is that there is no definition of hate speech. That's not
01:15:26
a category that exists. It's just whatever the people in power say it is.
01:15:31
That's right. And so there is no constitutional exception for hate speech
01:15:37
under the first amendment. They reference in the bill, the California bill, because I read it, civil rights statutes which speak to
01:15:45
certain types of discrimination, certain types of hate speech. But to your
01:15:50
point, those words are not defined. And so what ends up happening is you could say, well, using that word is
01:15:57
discriminatory to this group in some way, or using that word is hateful because it offends another group. And
01:16:04
suddenly you start to blur the line between what the average person might call hate speech and what perhaps some
01:16:10
people in an administrative body are calling hate speech. And suddenly it becomes more like, hey, is this really a
01:16:17
civil rights violation or is it just offensive content? And it's a very slippery slope that offensive content
01:16:23
suddenly can get wrapped up and be called hate speech. And then the government starts to tell us all what we are and aren't allowed to say. And
01:16:30
we're obviously seeing the repercussions of that in the UK right now where the police are knocking down doors to arrest
01:16:36
people for putting stuff on Twitter. The direction I thought you
01:16:41
were going in a minute ago was that it sounded like you were saying that it's good that we've learned all these
01:16:47
lessons from this COVID period where YouTube and
01:16:52
Yeah. And I think that's exactly right. I don't see any evidence that, I'd say especially, the political left
01:16:59
has learned its lesson. You got Gavin Newsom now trying to ban hate speech in California. By the way, he also signed
01:17:06
that bill, was it like a year ago, banning parody. Remember that? Parody
01:17:12
videos? I do remember that. Yeah. Political AI. It was like political AI videos.
01:17:17
Yeah. Yeah. Because it was in the wake of a humorous fake advertisement for Kamala
01:17:22
Harris. Then you've got these folks who on the left are already saying that
01:17:28
Sinclair and Nexstar need to be punished for not putting Jimmy Kimmel back on the
01:17:35
air. So in other words, the same people who are saying a week ago that the Trump administration jaw boned ABC Disney that
01:17:43
that was fascism, but if they jawbone Nexstar and Sinclair, that's democracy. I mean, it's completely hypocritical. I'm
01:17:49
not convinced anyone's learned a lesson from this. And just to be clear, I don't think Jimmy Kimmel should be taken off
01:17:55
the air or censored or whatever. I'm pretty sure that his show is not going to be back next year because it's got
01:18:01
such low ratings. I don't think there's really a need to censor him. It is true that there is a public interest
01:18:07
requirement for using public spectrum, but nobody seems to agree anymore on
01:18:12
what's in the public interest. So, I completely agree with what you said last time. We just got to auction off that spectrum. We don't
01:18:18
need that spectrum. We have the thing called the internet now. And so we no longer need broadcast television. And there shouldn't be a government
01:18:24
regulated, right, broadcast television system where they're deciding what is and isn't
01:18:30
appropriate content and in the public interest. That just doesn't make sense for the government to do in a market
01:18:36
that's supposed to fully support free speech. Yes. And I agree. I think
01:18:41
the Jimmy Kimmel issue should be up to the people that are spending their money to put Jimmy Kimmel on the air and they
01:18:46
can decide what they want to do with, you know, if no one watches it, they'll take him off. And if people watch it, they'll keep him on. That's that's their
01:18:53
decision. We shouldn't, you know, I don't think that it makes sense to quote cancel or ban someone for saying
01:18:59
something that's offensive. And I think that if you do it on uh one side, eventually it'll happen on the other
01:19:04
side. But that's a tried-and-true point in free speech advocacy, obviously.
01:19:10
Yeah. I mean, look, you could definitely argue that throughout history, both the right and the left have sought to censor
01:19:16
inconvenient speech when they've been in power. But again, I just think this is one of those issues where just because
01:19:23
both sides have done it throughout history doesn't mean that in the present day one side isn't a lot more guilty of
01:19:30
it than the other. And you see this with whether it's Scott Wiener and Elizabeth
01:19:36
Warren jawboning Sinclair and Nexstar to keep Jimmy Kimmel on the air, or whether it's Gavin Newsom possibly signing this
01:19:42
bill to ban hate speech in California. Well, do you think he'll sign it? I don't know. If he does not sign it, if he vetoes it,
01:19:49
will you give him credit? Yeah, for sure. That would be a good sign. Yeah. Like when he vetoed that AI bill,
01:19:55
I thought that was fantastic and I give him a lot of credit for that. Yeah. I think he's look he's not a
01:20:00
completely unreasonable person, and there are points when he realizes things have crossed the line.
01:20:06
Yeah. Which by the way is scary because then I think about who's the next governor. You know these things pass out of the Senate
01:20:11
in the Assembly, and he's the only thing standing in the way. Scary. Gavin Newsom isn't even the craziest
01:20:18
person on the left. I think we can all agree on that. Yeah.
01:20:23
I'm not a fan, but there are way crazier people. But let's see. I I suspect he will sign it. Mhm.
01:20:28
Because I think the left very much likes this sort of thing. You see it in Europe. You saw it in the Twitter files.
01:20:34
See it in these acknowledgements that YouTube has just made. You see it in what Zuckerberg told Rogan about the
01:20:43
censorship that Meta was pressured to do. I think there is a clamoring on the part of the left to silence speech that
01:20:49
they don't agree with. And they do call it hate speech. It's speech they hate. And the right, in a fit of pique when
01:20:55
they're angry about the assassination of one of their heroes, are they capable of saying that Jimmy Kimmel should be taken
01:21:01
off the air? Yeah. But is that something that's been broadly acted upon on the part of the right? No, it's not. The left is the one that's engaged in
01:21:08
massive amounts of speech suppression over the last few years. It's not a both-sides problem. So I hope Newsom
01:21:15
will veto that bill, but let's see what happens. I hope he vetoes it. I think it would be an incredible statement if he did, but
01:21:21
if he doesn't, I could see why this creates effectively a free option for him, for the Democrats, for whoever is
01:21:27
in charge with administering California statute,
01:21:33
giving themselves basically a free option on whether or not to enforce it and how to enforce it. It creates a
01:21:38
mechanism, and I don't I just don't like that mechanism existing, obviously. But I could see how a system in power can
01:21:44
find this to be a good mechanism for maintaining power and influence or at least influence over speech.
01:21:50
I think Newsom's going to have to sign it into law cuz it's what his base wants and he intends on running for president
01:21:57
in a few years and it's not just the California base. It's the overall base of the Democrat party. I don't think he's going to want to risk alienating
01:22:03
them. And even if he harbors some qualms, which I'm not sure he does, he
01:22:08
probably realizes he's got the Supreme Court to back him up in the sense that they're probably going to find this unconstitutional, they're going to throw
01:22:15
it out. But he can still send a message to the left that he's with them, that he wants to suppress conservative speech
01:22:22
just like they do. So, I suspect he'll sign it. Well, we'll see. All right. Thanks everyone. Sacks, have a good drive
01:22:28
wherever you're going. Thank you guys for joining us. Sorry we lost our other besties. This has been your favorite
01:22:34
podcast, the All-In podcast. I am your closing host, David Friedberg,
01:22:40
joined by David Sacks. Goodbye to Chamath Palihapitiya and J-Cal. We'll miss you guys.
01:22:46
Bye-bye. Let your winners ride.
01:22:52
[Music] We open sourced it to the fans and
01:22:58
they've just gone crazy with it. Love you. Queen of
01:23:04
[Music]
01:23:09
besties are my dog taking a notice in your driveway.
01:23:17
Oh man, my dasher will meet me at We should all just get a room and just have one big huge orgy cuz they're all just
01:23:23
useless. It's like this like sexual tension that they just need to release somehow.
01:23:29
Wet your feet. We need to get merch.
01:23:36
[Music]
01:23:42
I'm going all in.

Episode Highlights

  • Emirates First Class Experience
    Chamath shares his luxurious experience flying Emirates first class, complete with an extensive wine list.
    “The wine is incredible.”
    @ 00m 58s
    September 27, 2025
  • H-1B Visa Overhaul Discussion
    The podcast dives into the Trump administration's new $100,000 fee for H-1B applications and its implications.
    “This is a huge jump.”
    @ 02m 56s
  • Abuse of the H-1B System
    The hosts discuss the rampant abuse of the H-1B visa program and its impact on American jobs.
    “It's a giant scam on the bottom half of these.”
    @ 05m 58s
  • The Complexity of Immigration
    Immigration should be viewed through multiple lenses, not just one.
    “We should have a really thoughtful discussion.”
    @ 23m 02s
  • Trump's Border Claims
    Trump claimed zero crossings since he took office, raising eyebrows about border security.
    “That's incredible. We have the greatest military in the world.”
    @ 24m 52s
  • The Autism Debate
    A press conference sparked controversy over autism and acetaminophen, highlighting the need for thorough research.
    “Medicating yourself to make a political point is hilarious.”
    @ 36m 25s
  • Jimmy Kimmel's Return
    Jimmy Kimmel returns to the air after a suspension, addressing sensitive topics with sincerity.
    “It was heartfelt and sincere.”
    @ 44m 49s
  • The Importance of Empathy
    Kimmel emphasizes the need for compassion and understanding in a divided political climate.
    “Expressing empathy for the other side is a positive statement.”
    @ 47m 49s
  • AI Breakthroughs
    Recent papers from MIT and Germany showcase significant advancements in AI reasoning and efficiency.
    “This architecture could be one of these big architectural breakthroughs.”
    @ 01h 05m 11s
  • YouTube's Shadowbanning Explained
    YouTube clarified that recent episode restrictions were due to muted curse words, not shadowbanning.
    “There was nothing nefarious.”
    @ 01h 09m 17s
  • California's Hate Speech Bill
    A new bill could fine social networks for allowing hate speech, raising censorship concerns.
    “It could trigger a whole new kind of censorship battle.”
    @ 01h 15m 08s

Key Moments

  • Emirates Luxury @ 00:30
  • Immigration Discussion @ 23:02
  • Border Security Claims @ 24:52
  • Autism Controversy @ 25:29
  • Political Protests @ 36:25
  • Censorship Discussion @ 43:47
  • YouTube Update @ 1:09:04
  • California Legislation @ 1:14:23
