
NBA Gambling Scandal, Billionaire Tax, Tesla's Future, Amazon Robots, AWS Outage, Dangerous AI Bias

October 24, 2025 / 01:23:29

This episode discusses the proposed California wealth tax, featuring a one-time 5% tax on billionaires' net worth. The conversation includes insights on the implications of the tax, potential constitutional challenges, and the motivations behind the initiative.

The hosts analyze the Service Employees International Union's (SEIU) ballot initiative aimed at taxing billionaires in California. They express skepticism about the tax's feasibility and discuss its potential use as a political tool to rally support or opposition.

Key points include the historical context of wealth taxes in other countries, the likelihood of wealthy individuals leaving California, and the broader implications for the state's economy. The hosts also touch on the challenges of funding pension liabilities and the impact of progressive taxation.

Throughout the episode, the hosts share personal anecdotes and opinions on the wealth tax, with some expressing support for taxing billionaires while others highlight the potential negative consequences for the state's economy.

The discussion concludes with reflections on the political landscape in California and the potential for future tax initiatives.

TL;DR

California's proposed wealth tax targets billionaires, raising constitutional concerns and economic implications amid political maneuvering.

Transcript

00:00:00
What's the story with the California wealth tax? Can somebody explain this to me? Okay. So, the SEIU, the Service
00:00:06
Employees Union, filed a ballot initiative, which means a direct-to-voter
00:00:11
vote to amend the California Constitution to introduce a one-time
00:00:17
billionaire's wealth tax where billionaires, anyone who has assets over a billion dollars net of their debt, has
00:00:23
to pay a one-time tax of 5% of their net worth, including their private stock,
00:00:29
including their real estate. You said 5%. 5% of their net worth, not of their income,
00:00:36
of their net worth, the entire net worth, a one-time payment to the state of California, and then there's an
00:00:41
allocation on how that money will be spent, but it's a one-time billionaire tax. Now, it is very likely that this
00:00:48
sort of an amendment to the California Constitution is not constitutional and actually cannot be made and will not
00:00:54
actually go into enforcement even if the voters do vote to approve it, at both a federal and a state level, based on this
00:01:00
concept of uniformity which is that you have to tax everyone equally except for the case of an excise tax which is like
00:01:07
income or a transaction. You're allowed to tax disproportionately based on the size of the income or the size of the
00:01:13
transaction. But if you're going to tax on property, if you're going to tax on an asset, you have to tax everyone
00:01:18
uniformly. So it is likely not going to go into effect if it does pass. However, it is very likely the case that
00:01:25
the SEIU is simply using this as a baiting mechanism to get people to stand up and denounce it and then they will be
00:01:31
in a position to attack those people and destroy them and use this effectively as
00:01:36
political fodder for this next election cycle. That's what seems like the true motivation right
00:01:41
now. Let me let me go on the record. I think this law is great.
00:01:46
He's getting the virtue signaling points. I would just like to say, may I be the first to pay 5%. I'll be in the front of
00:01:53
the line. Let me know when to show up. I'll bring my check. Who do I make the check out to? Should I bring cash, Gavin? Do we
00:02:00
just bring the cash to you? Which one of your mansions should I bring the cash to?
00:02:06
This is strategically why, Chamath, I'm glad I got out of California right
00:02:11
before I was about to billionize. That was a smart move on my part. Freeberg, what are the odds that this goes
00:02:16
into effect? Can you just handicap this? Yeah. Well, we don't know who's going to come out against it, but there's an
00:02:22
effort to try and get top Democrat officials in the state of California to say, "This is silly. If you do this,
00:02:28
people will leave the state." yada yada. So, that's kind of a quiet underway effort. But I don't know why the
00:02:36
citizens of California, the majority of citizens of California, would not vote for this. Why? Who wouldn't want to tax the billionaires 5%? Come on. Like,
00:02:43
well, the way that it's written, it says, "Hey guys, we're $30 billion in the hole and there are 200 Californians
00:02:51
that control two trillion dollars. We're just going to ask them to pay a one-time fee of 5%." Yeah,
00:02:57
and I don't see how anybody would say that sounds unreasonable at the
00:03:03
ballot box, right? And then the people the people that step up against it and are vocal against it
00:03:09
and point out like hey in France when they did this they lost like 40% of their revenue because all the wealth
00:03:14
left the country. The reality is that this sets it up to go through the legislature because if it goes through the will of the people and it gets
00:03:22
overturned, as you say, Freeberg, then if you're legislatively smart, you'll
00:03:27
actually push it through the state senate. Oh but don't you remember no and then it
00:03:32
will not get vetoed, because then it's like, hey, listen, it's clear that the people want this. So I think that you'll have some kind
00:03:39
of progressive taxation system that conforms to the law. I mean, they're already trying to
00:03:45
extend Prop 55, which is the progressive tax for people making over a million dollars. They're going to get that
00:03:52
passed. That's going to be this incremental tax on income. But the one-time wealth I think the million-dollar
00:03:57
thing, I think that's harder because there's too many people that it touches. A million dollars today in
00:04:02
2025, not to be glib, is just not what it used to be. But a billion dollars does cut off most people except for a
00:04:09
couple of hundred. That is true. And I think that for example, it's very reasonable to then charge a 10% excise
00:04:15
tax on selling appreciated stock, right? Why not? There's all kinds of ways that
00:04:20
you can get billions and billions and billions of dollars. So, I don't know. I think that this is more of a trial
00:04:25
balloon to say, can we draw a clear line between 200 Californians and the rest of
00:04:33
California? Yeah. And to the extent that that bright line becomes visible and it's okay,
00:04:39
people are going to go ham. They're going to try to get as much as they can. The reality is, as we all know, I mean,
00:04:45
Larry Ellison left the state. Elon left the state. Many of the founder-CEOs who have built large
00:04:50
technology companies in California will eventually, at some point, break and say, "Okay, I'm moving
00:04:57
my company out of state and I'm leaving the state and I'm bringing the employees with me and I'm bringing all of the economic value of this business with
00:05:03
me." And people will never learn that lesson because it's so much easier to sit in front of a voter and say, "Hey,
00:05:09
should we tax these 200 people to give you better benefits?" 97% of people will say, "Absolutely."
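The arithmetic behind that pitch, using the round numbers quoted earlier in the conversation ($30 billion deficit, 200 residents, $2 trillion, 5%), can be checked in a few lines. These are the show's figures, not official estimates:

```python
# Back-of-the-envelope check of the ballot-initiative pitch.
billionaires = 200                       # "200 Californians"
combined_net_worth = 2_000_000_000_000   # "$2 trillion"
rate = 0.05                              # one-time 5% wealth tax
deficit = 30_000_000_000                 # "$30 billion in the hole"

revenue = combined_net_worth * rate
print(f"One-time revenue: ${revenue / 1e9:,.0f}B")     # $100B
print(f"Times the deficit: {revenue / deficit:.1f}x")  # ~3.3x
print(f"Average bill: ${revenue / billionaires / 1e6:,.0f}M each")  # $500M
```

That's the rhetorical strength of the pitch the hosts are describing: at these figures, the one-time levy would cover the stated deficit more than three times over, which is exactly why they expect the ballot-box framing to poll so well.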
00:05:15
Very few people will sit and think about the consequences of what's going to end up happening. 99.9% will say absolutely. I mean, nobody
00:05:23
tells them in that ballot initiative that we have a $300 billion budget of which 2/3 may be just wasted. One of the
00:05:30
motivations for this bill, and this is why it's being proposed by the SEIU, is that there are these massively
00:05:37
ballooning pension benefits and pretty significant increases to the pension programs for both private and public
00:05:44
pension funds in California, which has actually become a very visible liability for the state and for some of these
00:05:50
private pension programs. And they're trying to fill the pension hole, which we've talked about in the past, but
00:05:56
there is a multi-trillion-dollar unaccounted-for pension liability in this country that's going to have to
00:06:02
come from somewhere. You're either going to have to print the money because the federal government's going to step in and fill the hole in all these pension
00:06:09
obligations or they're going to have these massively progressive tax programs to try and fill the hole. And if and
00:06:14
when they do, as we all know, there will be an economic cycle that will be pretty nasty, which is all the value will leave
00:06:21
that jurisdiction and move elsewhere. But let's see. It's like the Democrats are doing everything they can to get me to leave
00:06:27
the state. I don't want to. I really am resisting. I mean, they've raised my income tax to what is it like
00:06:34
13.3%. 13.3. Yeah. And I know it's going to 16. They've been boiling the frog. I still haven't
00:06:41
jumped out of the pot. But for me, I think the wealth tax, I'm gonna have to jump out of the pot with this.
00:06:47
The crazy thing with this, the other like I read it because I was like, "Oh my god, what's going on?" The two things
00:06:52
that they obviously got somebody very clever to draft it, because any Roth IRA over 10 million counts. And normally
00:07:00
in these wealth calculations, you keep your deferred retirement accounts
00:07:06
off the table. They're typically not included. So folks, and I'm not going to
00:07:11
say who they are, we all know, who have tremendously appreciated Roth IRAs. It's pretty public who you're talking about,
00:07:16
but sure, those are included. And then the other thing is that if you actually did any tax structuring, the really valuable tax
00:07:22
structuring is where you set up these trusts in Wyoming and North Dakota and you do these interparty loans where you
00:07:29
can lever up 10 20x so you can transfer billions and billions and billions of dollars out of state but then you have
00:07:36
these obligations. Those are negated and they don't count, so all that tax structuring goes out the window.
00:07:41
So you can get into a very difficult situation here where they're like, "Hey, you owe us $500 million, a billion
00:07:47
dollars, $2 billion, and the only way to pay it is to have an IOU to the state of California." Which is crazy.
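As described, the base calculation has two unusual twists: Roth IRA balances over $10 million count, and liabilities created by out-of-state trust structuring don't reduce the base. Here is a toy sketch of that calculation; the $10 million threshold, the $1 billion cutoff, and the all-or-nothing treatment are taken from the conversation, not from the initiative's actual text, and the function names and example figures are purely illustrative.

```python
def taxable_net_worth(assets, debts, roth_ira, trust_loan_obligations):
    """Toy model of the tax base as described in the discussion.

    assets: everything marked to market, incl. private stock and real estate
    debts: ordinary debts, netted against assets ("net of their debt")
    roth_ira: Roth IRA balance; per the discussion, the amount over $10M counts
    trust_loan_obligations: interparty-loan liabilities from out-of-state
        trust structuring; per the discussion these are negated (not deductible)
    """
    base = assets - debts                  # ordinary net worth
    base += max(0, roth_ira - 10_000_000)  # deferred accounts over $10M count
    _ = trust_loan_obligations             # deliberately ignored: "negated"
    return base

def one_time_tax(base, rate=0.05, threshold=1_000_000_000):
    """5% of the entire net worth, owed only by those over $1B."""
    return base * rate if base > threshold else 0.0

# Example: $3B of assets, $500M of debts, a $100M Roth, $1B of trust loans
# -> base of $2.59B ($2.5B + $90M of excess Roth), a bill of ~$129.5M
bill = one_time_tax(taxable_net_worth(3_000_000_000, 500_000_000,
                                      100_000_000, 1_000_000_000))
print(f"${bill / 1e6:,.1f}M")
```

Note how the cliff works in this reading: a base of $999 million owes nothing, while a base just over $1 billion owes 5% of the whole amount, which is part of why the attestation requirement discussed next matters so much.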
00:07:55
It's crazy. There are not a lot of ways out of this if this stands.
00:08:00
No one is empathetic to either of you. No one gives a about the two of you needing to pay more. This is why I'm in support. Again, I'm
00:08:06
just saying it for the record. I am supporting it. I had a few more thoughts about this
00:08:12
thing which I want to unpack. So number one is like you guys said a wealth tax has been tried in many places at many
00:08:17
times it always backfires because whatever the tax benefit is that you get for the state it's greatly outweighed by
00:08:24
the economic depression that you get from the wealthy people, the job creators, the
00:08:30
companies leaving. And as soon as you cross that line of going from no wealth
00:08:35
tax to any wealth tax, enough people of wealth can see the tea leaves, they can
00:08:41
see the writing on the wall that they have to leave. And that's why I think even if they say this is a one-time thing, we all know that it won't be one
00:08:48
time. If they get away with it, it'll become a regular thing. It'll just be if it's to plug a deficit, they're going
00:08:54
to run deficits every year. Exactly. And and you're right. And this isn't even to plug an emergency
00:08:59
situation or an unfunded liability, like some one-time thing. This is just regular operating— Exactly. So they will
00:09:07
have no incentive to fix their mismanagement of the state and their deficits and all that kind of stuff. By the way, by the way, if they get away with this
00:09:12
and it's not just going to be billionaires; eventually the line will of course get pushed down past the
00:09:18
billionaires, just like the income tax in the US. I think it was a 1% income tax originally and it was like
00:09:23
just a onetime thing for wealthy people and then it became a smaller thing for you know lower income people and then
00:09:29
eventually, as we all know, every person has to pay a tax, every property has a tax,
00:09:35
and so on. I mean, this is the problem with government. There's all these other states, by the way, that are finding clever ways. I think in Montana
00:09:42
now there's a differential property tax scheme where if it's your second or third home and you don't live there you
00:09:49
pay a lot more. Yeah, here's what I wonder about is, you know, what are guys like Jeffrey Katzenberg or even Ari
00:09:56
Emanuel thinking about right now, because they're kind of the higher-ups in the Democratic Party behind the machine,
00:10:04
sort of the oligopoly that kind of runs the machine. And I remember that when
00:10:10
Karen Bass was running against Rick Caruso for mayor of LA, it was very publicly reported that Katzenberg was
00:10:16
behind Karen Bass and there was sort of an imbroglio between Caruso and Katzenberg. Katzenberg anyway helped
00:10:22
make sure that Karen Bass was well-funded enough to win. The result of that, ironically, was that Pacific Palisades
00:10:28
burned down and I think Katzenberg's house might have been part of that. In any event, I think there are these guys
00:10:33
who are very, very wealthy who think that they can control the machine well
00:10:39
enough that they basically are still in control of this thing, right? That in other words, that the tiger won't eat
00:10:46
them, right? The tiger is socialism. Yes. And that's exactly right.
00:10:51
You know, they think they've got the tiger under control enough that it won't eat them. But I don't think they do. Maybe they don't. Maybe this is the
00:10:57
tiger breaking loose. Yep. And I think Fer, you pointed this out that there was an attempt in the legislature last year to pass a wealth
00:11:04
tax and it was quietly killed behind the scenes. And I actually think that Gavin Newsom might have something to do with
00:11:09
that because he has presidential ambitions. So he can't let the state go full socialist. But you just kind of
00:11:16
wonder, okay, well, if these guys lose control of the strings they have to control the beast of socialism, does the
00:11:24
whole thing just spin out of control? That's New York. We're seeing it everywhere. Seattle. I was about to bring that up, let's talk about
00:11:30
it. Uh, well, and just to let people know about the France situation, back in, I don't know, 2011, 2012, they did get rid
00:11:38
of Gérard Depardieu, which was kind of a win. But Bernard Arnault what's
00:11:45
Bernard Arnault, is that his name, from LVMH? Bernard Arnault said he was going to go to Belgium and said it was a clerical error
00:11:51
and he unwound it, but that was a clear signal, the richest man in France. Well, he's going to New York.
00:11:57
He's basically the entrepreneur who put LVMH together. I mean, it's their biggest company. It's the one that does all the
00:12:03
luxury goods, all the craft goods that they're so famous for. I mean, yes, him
00:12:09
threatening to leave France is, you know, accidentally filing paperwork. Oops. What an accident. Here's your look at, uh,
00:12:18
New York City under Mamdani, who, uh, we're in touch with. He may come on the program.
00:12:23
New York State tax is 10.9%, the city is 3.876%,
00:12:28
and the 2% Mamdani tax puts you at 16.8%
00:12:34
for living in New York. Call it 17%. I mean, if you were making $10 million a
00:12:42
year is it worth $1.7 million? You could get a plane. You could live in Florida.
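The combined-rate arithmetic quoted here checks out, give or take rounding. A minimal sketch using the rates as stated on air (actual marginal rates depend on bracket and filing status):

```python
# The rates as quoted in the conversation; treat them as the show's figures,
# not a tax-filing reference.
state = 0.109      # New York State top rate
city = 0.03876     # New York City rate
surcharge = 0.02   # the proposed "2% Mamdani tax"

combined = state + city + surcharge
income = 10_000_000

print(f"Combined marginal rate: {combined:.2%}")  # 16.78%, "call it 17%"
print(f"Tax on $10M of income at that rate: ${income * combined:,.0f}")  # ~$1,677,600
```

At that rate the "$1.7 million a year" figure in the discussion is a fair round-up of the exact $1,677,600.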
00:12:48
You could come to New York 150 days a year. There's really five good months in New York: the fall, the spring, and
00:12:54
that's about it. You know, you go see the tree at Christmas, but it's cold. Well, that's not realistic for most people. And
00:12:59
especially if you have kids and you care about them, you'd like them to be rooted somewhere. You're not going to schlep them around every month to arbitrage taxes.
00:13:06
Yeah. Well, I mean, I do think they're going to test that at 17%. That's non-de-minimis. Okay, let's, uh, we
00:13:14
got a lot of docket to get through here. I'm so glad Chamath supports the billionaire's tax. That's great. We'll get that in the headlines. Yeah, right
00:13:20
after all. The free rider problem that we have is no one's going to want to stand up against it, and the thing will just kind of
00:13:26
pass. By the way, if you're a billionaire CEO of a public company in California,
00:13:32
you have everything to lose by standing up and opposing it. Your employees will run. Your
00:13:38
shareholders will attack you. You'll look awful in PR. So, everyone's going to sit quietly and start looking at houses on Zillow in Austin or Miami and
00:13:46
be like, "Where should we move to next year, honey?" You know, like that's the conversation that's going on.
00:13:51
Didn't you say it was retroactive? It's retroactive to 2026. So, if it passes,
00:13:56
you have three months. Two months. Yeah. But again, I don't think it passes muster with the constitutional reads.
00:14:04
Remember when they did the transfer tax, where San Francisco took 6% of my home?
00:14:10
Yep. And then LA just took 5% of my house down there, the supposed mansion tax. But those were excise taxes. So if
00:14:17
you go back to the case history in the US Supreme Court on this stuff, anytime there's a transaction and you take a tax
00:14:23
on a transaction, they call that an excise tax, which is constitutional. There's a part
00:14:28
of the bill that they could cleverly use which is called this ODA which is effectively this IOU mechanism
00:14:34
and they could essentially say, when these assets transact, you owe us 5% on an excise basis. And by the way, there's
00:14:40
an attestation that you have to file, a legal document, and this was quite well written in there, which
00:14:46
said you must attest that you have less than a billion dollars. Okay,
00:14:53
now what? Okay, then I have to attest that it's more, and then how do you even mark your whole portfolio to market if you have a lot of
00:14:58
privates? They do not allow discounts. They do not allow liquidity discounts. It says if you are a reasonable buyer and a
00:15:05
reasonable seller, you have to transact this at market price. So, for example, imagine you owned a sports franchise and
00:15:12
the sports franchise, if you sell a minority share, you're typically selling it at a discount. If Forbes
00:15:18
says it's worth 10 billion and you own 10%, that's a billion dollars for the purposes of this calculation.
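In other words, under the no-discount rule described here, the tax is computed on the headline valuation, not on what a minority stake would actually fetch. A small sketch of that gap; the 25% minority-stake discount is a purely hypothetical figure for contrast, not anything from the initiative:

```python
# Mark-to-market with no liquidity discount, per the rule as described.
forbes_value = 10_000_000_000   # headline valuation of the franchise
stake = 0.10                    # your 10% share
rate = 0.05                     # one-time 5% wealth tax

counted = forbes_value * stake        # what gets counted: $1B
tax = counted * rate                  # $50M owed on that stake alone
realizable = counted * (1 - 0.25)     # what a discounted minority sale might fetch

print(f"Counted: ${counted / 1e9:.1f}B, tax: ${tax / 1e6:.0f}M, "
      f"discounted sale value: ${realizable / 1e6:.0f}M")
```

The point of the contrast: the $50 million bill is the same whether the stake could realistically be sold for $1 billion or only for the discounted figure.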
00:15:23
You're going to pay $50 million to keep it, even if you paid 50 million to buy it. Even if Freeberg is right that there's a
00:15:30
good chance that it'll be found unconstitutional, how many years in the courts is that going to take? And who's going to stick around waiting for that?
00:15:36
In fact, the rational thing to do is pull up stakes before January 1st and leave right now.
00:15:42
That's right. That's going to happen in New York. I mean, I think they're going to have an exodus just like New Jersey and Connecticut did. And that actually
00:15:49
rocked the tax base in those two geographies. All right, listen.
00:15:54
Big breaking news this morning. Huge scandal in the NBA. The FBI just arrested 30 people in a sports betting
00:16:01
and gambling probe. This hardly seems real. Chauncey Billups, who is the current Blazers coach and was just inducted
00:16:09
into the Hall of Fame, got pinched for a poker game he was running allegedly with the mafia that was rigged 17 different
00:16:17
ways, allegedly, to Sunday. Terry Rozier allegedly is a point guard for the Miami Heat. Why are you saying allegedly all the
00:16:23
time? I'm, you know, everybody's suing these days, so allegedly he's a point guard. I've seen him play. He's
00:16:30
not a very good point guard. It's a lot of turnovers if I'm being honest. You know what I know is alleged? That you're the world's greatest moderator. That's
00:16:36
allegedly true. It's allegedly true cuz it's not true. You're allegedly a billionaire. Nobody
00:16:41
can confirm it. Normally, he uses the word allegedly when it's a story that like it's about
00:16:48
Hunter Biden or doing something improper. Yes, he allegedly smoked crack and shot
00:16:53
a 9 mm in the air. It's usually a story about Democrat wrongdoing and he's trying to discredit
00:16:58
it. All right, here we go. Anyway, keep going. Terry Rozier, uh, who's allegedly a point guard. I mean, he was
00:17:06
he told his friends, this is crazy, you know, in the over/unders, you know, hey guys, bet the under on me, uh, in rebounds
00:17:13
cuz I'm going to uh take myself out of the game with an injury allegedly. And
00:17:18
uh his friends allegedly made 200 grand off this. Okay, just allegedly for the whole
00:17:24
goddamn thing. Uh this is going across 11 states and a bunch of crime families. Allegedly, there's something called the
00:17:30
mob. I don't think that really exists anymore. I think that's an urban legend. And these are two separate threads, but
00:17:36
announced on the same day. They both involve NBA players, but apparently this is two different cases. So, Chamath,
00:17:44
what do you allegedly think of this? I I think it's crazy. I think you're
00:17:49
seeing a lot of these trends converge all at the same time. Meaning, you have
00:17:54
the emergence of all of these prediction markets. You have
00:18:00
a lot of data science and AI being used that shows that there's a lot of odd
00:18:05
behaviors. So, it really was the squares versus the sharps. And if you had the inside edge, you were just printing
00:18:10
money. Now that all of that is becoming more transparent, there's a lot less margin. Then what happens is you have
00:18:17
these laws passed in the 11th hour. There was an important gambling law that was inserted into the Big Beautiful
00:18:23
Bill that has implications for all of this. And now you're seeing the feds. The crazy thing to me is
00:18:30
a press conference where Kash Patel is talking about this. I mean, that's like serious business when the FBI director
00:18:36
is front and center talking about all this. So, I don't really know what it means to be honest. I was shocked at the
00:18:41
scale of it and I was shocked that it's on the radar of the feds. I
00:18:47
thought this was like pretty typical ticky-tacky stuff, but clearly there's something bigger. I don't know exactly
00:18:53
what that bigger is, but something is happening where all these markets are smashing together. There's just a big
00:18:59
cleanup effort going on. So, I don't know. I really don't know. Freeberg, I guess there's two different
00:19:04
ways to go about this. You have the fantasy sports becoming legal, everybody around these players just in that one
00:19:11
case: are these people too dumb to understand that their $10 million
00:19:16
contract to play in the NBA every year or $20 million contract is more important than your friends betting the
00:19:22
under or over. And how dumb are they? I mean, to not know that the people running a sports
00:19:29
book would look for weird action. Like, why is one player getting $200,000 on
00:19:35
their over/under for rebounds and the other players are getting 20,000? What are your thoughts here? Give us also your
00:19:41
take on the poker one. I think gambling generally as we call it should be decriminalized and I don't
00:19:46
like this state-by-state setup with gambling. I think we should have a federal regulatory body to oversee,
00:19:53
monitor, and the problem is you have state gambling commissions and we have a state-by-state kind of patchwork of
00:20:00
regulatory authority that makes it very hard to standardize, track, and provide also guidance and feedback. I would much
00:20:07
rather see this all kind of get handled at the federal level and better organized. To Chamath's point, this is
00:20:12
not going away. People love to bet on stuff. They love to gamble. This is part of sports. This is part of the culture.
00:20:19
They're not going to just turn it off. Polymarket raised, whatever it was, a billion or two billion
00:20:25
at 9 billion. Then the next weekend they announced sports betting and now they're raising money 30 days
00:20:31
later, allegedly at 12 to 15 billion. I mean,
00:20:38
It's unbelievable. And you can see, by the way, the way that DraftKings and FanDuel stock
00:20:45
have reacted to this. Those companies are toast.
00:20:50
Toast. That's right. This is really interesting. The Polymarket model is the best model because it creates a market, and so as
00:20:57
information flows in, that market will dynamically adjust and everyone will get a fairer price.
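That dynamic repricing is also what lets a bettor exit early by covering a bet mid-event, as comes up later in the conversation. A minimal sketch of how locking in a profit on a binary contract works; the prices and share counts are illustrative, and this is not Polymarket's actual API or fee structure:

```python
def lock_in(buy_cents, sell_cents, shares):
    """Profit, in dollars, from buying YES shares of a binary contract at
    buy_cents each and selling them at sell_cents before the event resolves.
    Each share pays $1 (100 cents) if YES wins, $0 otherwise; prices are
    kept in integer cents to avoid float rounding."""
    return shares * (sell_cents - buy_cents) / 100

# Hypothetical underdog bet like the one described: buy YES at 20 cents,
# the live odds drift to 60 cents mid-event, sell to take the risk off.
print(lock_in(buy_cents=20, sell_cents=60, shares=1000))  # 400.0, win or lose
```

Selling at 60 cents realizes the $400 gain regardless of the final result; holding to resolution would instead pay either $800 more or lose the whole position.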
00:21:02
Did you see the regression that they did on the Polymarket trades and how well they're in the money? Yeah. Nick, can you find that? But basically
00:21:09
what it showed is like the front money is the sharps, the back money are the squares, but you
00:21:16
have to fade the trade in the first week. So there's a very scientific method: if you want to make money on Polymarket, it became pretty clear.
00:21:22
There's two things that are very interesting about it is number one how how they've simplified things to a way
00:21:28
people can understand. It's not like you have to understand, you know, it's 120,
00:21:33
it's this point spread. It's just, what's the chance that this thing happens? 80%? 20%? People could just
00:21:40
place their money on it. And then this ability to reconcile it at any time. I didn't realize how engaging that is. I
00:21:46
was watching the Oscars and I was watching boxing and I bet the underdog
00:21:51
in this Netflix boxing thing that happened cuz I just thought this guy looks pretty pissed off. Uh, and I
00:21:56
thought that was a good enough way to go with the underdog. And then you watch it round after round and you see the odds
00:22:02
changing in real time, and anytime you can just cover the bet and take your winnings and take out the risk. Really
00:22:07
like interesting and fun for people. It's so simple. Then I did it on the Oscars or the Emmys and I was like,
00:22:13
"Yeah, I'm going to I'm I'm fading uh no offense Pen Stiller, but I'm going to
00:22:18
fade Severance." And I went with the um the uh the one about the emergency rooms
00:22:23
and with Andor, and I won again. So it's a lot of fun to do it here. Jason, look at this. I sent Nick
00:22:30
the tweet, but this is incredibly systematic. This is over many many many
00:22:35
markets. But basically, 89% accurate one week out, but in the final four hours,
00:22:40
it jumps to 95, which means that if you follow the sharps along this pattern, you're going to make money.
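Taking the quoted figures at face value (89% accuracy a week out, 95% in the final four hours), the implied edge can be sketched like this. The assumption that the market price equals the observed accuracy rate is a strong simplification made for illustration:

```python
week_out = 89     # % of markets where the week-out favorite resolves correctly
final_hours = 95  # % in the final four hours, per the stats quoted on air

print(f"Drift captured: {final_hours - week_out} points")  # the "6% in a week"

def ev_per_share(price_cents, win_rate_cents):
    """Expected value, in dollars, of a $1-payout share bought at
    price_cents when the true win probability is win_rate_cents / 100."""
    return (win_rate_cents - price_cents) / 100

# If (big assumption) week-out prices sit at the week-out accuracy, a
# favorite that truly wins 95% of the time is ~6 cents cheap per share:
print(ev_per_share(89, 95))  # 0.06
```

Roughly 6 cents of expected value on an 89-cent share is about a 6.7% expected return in a week, which is the "follow the sharps" edge being described, before fees and before the risk that any individual market still resolves the other way.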
00:22:46
6% in a week. Yeah. Right. Polymarket actually has the news
00:22:52
before the news does. This is one of the most powerful outputs of Polymarket: they're actually getting a
00:22:57
read on what's going on in the world before the media recognizes it, before the public recognizes it. Because when
00:23:03
you put Yeah. When you put money up, it actually turns out that when people have incentives, that market will find the
00:23:09
truth. Somebody needs to build the app that makes all of these things fungible. And by all, what I mean is cryptocurrencies,
00:23:18
betting markets, equities, and options markets.
00:23:24
Yeah. And the reason is there's just no reason to go to nine different sites and have nine different accounts. And the
00:23:29
most important thing is to do KYC and AML across nine sites to get access to
00:23:35
liquidity, credit, and margin. You'll want to do it once. And then you'll want to have a large pool of capital that you
00:23:42
can trade across anything. So I can go long Nvidia, but I can also go short the Knicks, and then I can own some Bitcoin
00:23:49
all in the same trade. Totally. That's where it's going. Totally. Totally. Now, to the earlier question, J-Cal, I think
00:23:56
if we end up there, where Polymarket does become the truly liquid market across all of these kinds of predictions,
00:24:03
all of these assets then a lot of what we are seeing with respect to insider
00:24:08
trading and insider information becomes much more apparent. So, the problem with sports betting is that there's a
00:24:15
one-sided bet. The casino sets the odds or, you know, whomever is setting the odds, and then you're either taking one
00:24:21
side or the other. And so, if you have the insider information, you're taking the side that creates an arbitrage
00:24:26
opportunity for you. But if you were to do that in a liquid market where there's someone taking the other side in a dynamic way, then the
00:24:33
market very quickly moves because of the inside knowledge you have. And that inside knowledge is now reflected in the
00:24:39
underlying asset price in the underlying odds that you get for that bet. And so Poly Market actually brings truth and
00:24:46
transparency to what is currently an insider arbitrage opportunity and it may
00:24:51
actually solve some of these fundamental problems in gambling. Let's just wrap with a little bit on the poker and knowing if you're
00:24:58
in a rigged game or not. Uh, living in LA, I got invited to a lot of poker games when I was playing low stakes, playing at
00:25:04
Hollywood Park, just, you know, $500 buy-in, thousand buy-in. But as these things went
00:25:09
up, you started to get access, and I started to get invited to Molly's game, the very infamous game, and she would
00:25:15
text me, she would call me: oh, we're playing over here, oh, Leo, this person wants to see you, that person wants to see you. I
00:25:20
was like, they want to see me lose 50 grand. There's no way. I'm not playing at those high stakes and I'm not going to that game. And the one or two times I did
00:25:26
go to games that had a rake, I was just like, "This game is fixed. I don't know how." Totally.
00:25:31
But somebody's I think it's just collusion. I think there's three players all playing from the same chip stack. In
00:25:38
which case, you know, you could be dealt aces five times in a row. If you're up against three players, what are your
00:25:43
odds against, you know, six other cards? It's going to be pretty bad for you. You think Molly's game was fixed? I don't know if hers was. I wouldn't
00:25:51
be surprised if it was. I wouldn't be, because once the mob gets involved, which is what happened at the tail end of hers, then all kinds of possibilities
00:25:58
happen. Once it gets to extremely high stakes and you've got guys chasing it, man, you could, you know, and they're
00:26:04
coming back night after night trying to catch up for what they lost last week, it's it's pretty dark. There is
00:26:10
absolutely no reason why anybody should play in a game where you're playing with
00:26:16
people you don't know. And if you need it that badly, then you probably have a problem. But there is no limit at which
00:26:24
you couldn't find a game with some combination of your friends and or respectable reputable businessmen that
00:26:30
have more to lose than you do. And if you can't find that game, you should not be playing in any game. Yeah. Any home game with a rake is just
00:26:37
should be absolutely suspect. Period. Stop. Super sketch. Isn't that game?
00:26:43
Yeah. Well, yeah. Well, we don't want to bring up angle shooting, but he's a straight player.
00:26:50
He would be so tilted if he heard you. Oh my god. He's so about the ethics. He wants, you know, no flies.
00:26:55
In fairness to like that game where you can, you know, go off for a small house
00:27:00
in is also the game where he would then collect $10 from each of us to pay for the
00:27:06
fruit plate and the pizza. He would order Domino's. He wouldn't even buy us pizza.
00:27:11
Yeah. The chef. But I'm like, I don't know if the chef really does cost $6,000 for two hours, bro. I don't know.
00:27:17
It's Wagyu. But I think it's a Wagyu burger. The funniest ever was he's like in a hand and the Domino's pizza
00:27:24
comes and you know he's like everybody have a green chip when we're playing with fin chips. He's trying to get like
00:27:30
$125. The guy comes, I just go, it's on the card. The guy's got the one you got to sign, right? It's got the tip on it. I said uh
00:27:36
it's like $150 a piece. I said, what's the biggest tip you ever got? He said, yeah, somebody on New Year's gave me like 200 bucks. I just wrote
00:27:42
$500 on $150 thing. I signed it. I gave it to him and then I was in a hand with
00:27:48
I said here's the receipt. So what you're saying is when it's on somebody else's credit card you're
00:27:54
willing to tip incredibly generously. I mean, God, you're a really great guy. You should've seen when Phil Hellmuth and
00:28:01
I bought uh dinner for everybody at Cipriani that time. Chamath grabs the
00:28:07
check. He goes, "I'll put the tip in for you guys." Well, why is that, Jason? Is that because 100% tip on an $8,000 check?
00:28:16
Isn't that because I pay for everything all the time? That's true. You are very generous. Oh, no, sir. 2010,
00:28:24
one time. I asked you guys in 15 years to pay one time, and you remember the exact amount. It's so sad.
00:28:30
I have on you guys. I was like, "Oh god, I guess we're going to public school." You guys are so ungenerous. It's
00:28:36
I know. I I give huge tips. Um
00:28:41
Yeah, I think he's, you know, he's average. Okay, let's go to the next topic. Allegedly, world's greatest moderator.
00:28:47
Let's talk about this Amazon outage. Tough week for Amazon. They had this huge outage in the beginning of the week
00:28:52
and then they had a bunch of leaked documents about their plans for uh jobs.
00:28:58
And uh Monday, massive AWS outage, 2,000 companies, 4 million users unable to
00:29:03
function on the internet for half a day, 15 hours, 20 hours. Uh and then on
00:29:08
Tuesday, internal docs viewed by the New York Times showed Amazon plans to not
00:29:14
hire uh 600,000 planned jobs uh because of robots by 2033.
00:29:21
So this isn't them planning on laying off 600,000 workers, but rather they're just pulling back their hiring plans and
00:29:28
ramping up their robotic plans, which you would expect. uh and uh their goal
00:29:34
according to these internal leaked documents is to automate 75% of warehouse operations. We talked about
00:29:39
this the last couple of weeks. Freeberg, your thoughts on either of these uh two stories here? I think the AWS story is interesting in
00:29:47
terms of its implications for the clouds. There's effectively three major cloud vendors that compete with one
00:29:52
another. AWS, Microsoft, and GCP or Google Cloud. And I'll just give you these numbers.
00:29:58
Also, by the way, coming on strong. Yeah, that's right. But let's exclude the number four for now, Oracle. But AWS
00:30:04
is $124 billion revenue run rate. Microsoft 120 billion. And Google Cloud
00:30:10
54 billion. But AWS, which is slightly larger than Microsoft, is only growing
00:30:16
17% year-over-year. Microsoft 26% year-over-year. And Google Cloud is
00:30:22
accelerating at 32% year-over-year. And some say getting closer to 40% growth rate. The big thing I hear from partners
00:30:28
and enterprise customers of these cloud services is that many of them if not all
00:30:33
of them as they scale up move to a multicloud model. So none of them want to be dependent on a single cloud. Many
00:30:41
folks started on AWS because AWS was the OG. Back in the day when I was running Climate Corp, I was the largest EC2 user
00:30:48
on AWS for about a year and a half, which was their elastic compute cloud service. We were running all these models back then. So I knew that service
00:30:55
very early on and it was very unique. It was very powerful and so a lot of companies that are old school established themselves on AWS very early
00:31:01
on. But the outage that happened this week, I think starts to highlight for folks that they can't and shouldn't have
00:31:07
a dependency on a single cloud service provider and will only accelerate the diversification of companies into the
00:31:14
other clouds. And so I do think this is actually a very beneficial situation for
00:31:20
Microsoft and GCP, and to your point, J-Cal, perhaps even Oracle, in terms of giving those sales teams, which are very
00:31:26
aggressive a hard story to go and sell for and say guys you don't want to just sit on AWS in case this happens again
00:31:33
we've got better infrastructure we're more reliable etc than these other guys so come and move over to us and that
00:31:39
might be a little bit of a naive simplistic kind of reductive way to think about what happened this week but I we are seeing the smaller competitors
00:31:46
accelerate and I think that this might be another kind of moment of acceleration for those folks and multicloud it's been around for a
00:31:52
while. Chamath, when you're doing stuff with 8090, are the big companies already doing that
00:31:58
or do they assume hey there's going to be some downtime yeah it's okay to risk or are they
00:32:03
really thinking multicloud neocloud let's have some smart intelligent routing and redundancy here
00:32:11
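The "smart routing and redundancy" idea in the question can be sketched as client-side failover across clouds: probe each provider's health endpoint in priority order and route to the first one that answers. Everything here is hypothetical — the provider names and URLs are placeholders, and real deployments typically use DNS failover or a global load balancer instead:

```python
import urllib.request

# Placeholder endpoints -- the names and URLs are illustrative, not real services.
PROVIDERS = [
    ("aws-primary", "https://api.aws-region.example.com/healthz"),
    ("gcp-backup", "https://api.gcp-region.example.com/healthz"),
    ("azure-backup", "https://api.azure-region.example.com/healthz"),
]

def http_probe(url, timeout=2.0):
    """Health check: True if the endpoint answers HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False  # unreachable or timed out: treat the provider as down

def pick_provider(providers, probe=http_probe):
    """Return the name of the first provider whose probe passes, else None."""
    for name, url in providers:
        if probe(url):
            return name
    return None
```

During an outage like the one discussed, `pick_provider(PROVIDERS)` would fall through from the dead primary to the first healthy backup — the multicloud dependency argument in miniature.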
I think there are two markets: there's the AI market, then there's the non-AI market. In the non-AI market,
00:32:18
everybody has everything. It all looks effectively the same. There's certain products and services that are unique to
00:32:24
Azure versus GCP versus AWS, but by and large, the market is big enough and
00:32:32
important enough that you'd have to be pretty insane to take a single vendor
00:32:39
approach. And so what typically happens in these markets is that they start off really
00:32:44
small. One person has all the share. And then as the market becomes very valuable
00:32:50
and very big, everybody diversifies because it's a riskmanagement thing. And these things flow into the disclosures
00:32:55
you have to make as a public company. And if you didn't have that diversification and something bad happened and it impacted your business,
00:33:02
you could get sued. So there's all these reasons why eventually all these three big companies will converge effectively
00:33:09
roughly a third, a third, a third. We're going to debate the path to get there, but that's where they'll end up. You know, there's this principle called
00:33:15
the rule of three where they say like all markets eventually mature to kind of a 60/30/10 split that you end up having
00:33:22
your market leader at 60% market share. Second place is usually half the size at
00:33:27
30, and then there's always some balance in the market where some competitor resolves to about 10%.
00:33:33
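Whether today's growth gaps actually converge toward a third/third/third (or a 60/30/10) can be roughed out by compounding the run rates and growth figures quoted above. A naive sketch — it assumes each growth rate holds constant, which it won't:

```python
def project(run_rate_b, growth, years):
    """Compound an annual run rate ($B) forward at a constant YoY growth rate."""
    return run_rate_b * (1 + growth) ** years

# Run rates ($B) and year-over-year growth rates as quoted in the episode.
CLOUDS = {"AWS": (124, 0.17), "Microsoft": (120, 0.26), "Google Cloud": (54, 0.32)}

def shares(years):
    """Each provider's share of the combined run rate after `years` of compounding."""
    sizes = {name: project(rr, g, years) for name, (rr, g) in CLOUDS.items()}
    total = sum(sizes.values())
    return {name: size / total for name, size in sizes.items()}

for y in (0, 3, 6):
    row = ", ".join(f"{name} {share:.0%}" for name, share in shares(y).items())
    print(f"year {y}: {row}")
```

Under these assumptions the fastest grower's slice of the pie rises every year while AWS's falls, so the quoted growth rates do push the market toward a more even split over time.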
It's really interesting. If you guys were to place a bet, who would you think is the 60/30/10? I don't think that applies to you. I
00:33:39
think that's You think they're going to be a third, a third, a third? I think it's all some idiot making something up. But what do you think? What do you
00:33:45
think? What do you think happens in cloud? Like, do you think that these all converge to equal market share? In non-AI, it's a third, a third, a
00:33:51
third. It will it'll take circuitous paths, but that's where we'll end up. By the way, a good point to make is that
00:33:56
this revenue number that I highlighted for Google Cloud, Microsoft, and Amazon actually include their applications. So
00:34:02
as you know like Microsoft GCP have pretty sizable enterprise application stacks that are built into that number
00:34:08
which gives them obviously the ability to drive cloud usage because they've got demand and sales relationships into
00:34:14
those enterprises. I think the way it works in AI is that you initially right now we're in this early phase where
00:34:20
there's two paths. Path one is you need a specific model and it's relatively well integrated using a specific
00:34:27
subsidized form of hardware on one of the hyperscalers but eventually you'll
00:34:32
get more of that abstracted away as it gets pushed into the infrastructure so that you have less dependence on one
00:34:39
model. There's a lot of work that has to get done and a lot of in-memory infrastructure that is not yet built
00:34:46
that has to exist but once that exists it'll be easier for all of us at the application level to view these models a
00:34:52
little bit more fungibly, and then at the bleeding edge you'll have the folks that basically give you some form of a
00:34:59
hypervisor or virtual machine or the bare metal, and that's where the neoclouds are doing really well. But I
00:35:05
think my point is that in any important market
00:35:10
in compute in technology where there really isn't much of a differentiation I
00:35:16
think you'll end up with these hyperscalers at a third a third a third now if one model is way way better and
00:35:22
it's only on one of the clouds because Google writes a big check or Amazon writes a big check I could see that
00:35:28
swaying the AI share but in the absence of that I think cheaper faster better is
00:35:33
sort of the the the end destination for everybody what an extraordinary outcome for Amazon on where AWS is like 15% of their
00:35:42
revenue right now, Freeberg, but it's 60% of their profits today, and that was just a side hustle, like a little
00:35:49
project they took out of nowhere and it's it's having the same impact on Google and other places so side bets
00:35:54
and side quests are just, you look at the Waymo side quest for Google, or even a lot of Sergey's other bets, like, um, and
00:36:02
Larry's: flying cars, Loon, low-earth satellites, Google Fiber, all those
00:36:07
X projects, they had so much potential. And this TPU, DeepMind, TensorFlow, GFS,
00:36:15
robotics. Boston Dynamics, they bought all those robotics companies, man. It's like somebody got to them and was like,
00:36:21
"Yeah, you know, you're seven, eight years into this, it didn't happen." The problem that Google has, unfortunately, is like they have so much
00:36:28
stuff, it's not really valued. And so they're going to go through the same problem
00:36:34
that everybody else who's a conglomerate has, which is this decision. Now Buffett, when he got to that decision,
00:36:40
said, "I don't care. This is my life's work, and so I'm just going to keep everything aggregated." But now you're
00:36:47
going to get to this thing where the intrinsic value of everything they have will far exceed the actual value that it
00:36:55
trades at. And so there'll always be these fissures of pressure. And then if one of these things requires a lot of
00:37:02
money, there'll be pressure and that pressure will be segregate these things so that I can own one versus the other.
00:37:08
And that's always the thing that happens in public markets is you go you go you kind of swing back and forth. So I suspect that this is going to happen at
00:37:15
Google. This was what they set up to do with Alphabet was to be the holding company and then to your point they made that
00:37:21
evolution, particularly in a company like Waymo, where they said we can't be the sole funder. They brought in Silver Lake. They brought in all these other
00:37:27
investors. They did this actually with Verily. They did this with a bunch of these what they call other bets is they made the conscious decision because
00:37:33
Chamath, on the flip side, by bringing in outside capital and having an independent board for these
00:37:39
subsidiaries. They were actually able to drive better outcomes because now there was governance and there was aligned
00:37:45
interests that could then take management and say guys if you can deliver these results
00:37:50
you had this kind of external pressure as opposed to the softness. It's that, but it's something else too. There's no way somebody as smart
00:37:56
as Silver Lake comes in if they think there's not a path to liquidity. So the other thing they have to promise is they're like, "Listen, we will take this
00:38:03
company public and in return you will help us build a better company than we could build ourselves." Well, it seems
00:38:09
that Silver Lake has done their part of the bargain. Now it's up to Google to live up to their part of the bargain
00:38:15
because if it doesn't get liquid, it sets a very bad precedent for everybody that committed capital into that
00:38:20
company. Yeah, of course. Yeah. Waymo going public would be unbelievable next year, man. If they did that, what would that
00:38:26
look like in the public markets? 250 billion. Take take it easy. Stop. Don't don't
00:38:32
don't do that. You don't think so? I think it'd be huge. Jason,
00:38:37
we all objected to talking yet again about AI-driven job loss. Yet, you insisted on putting this AI robot story
00:38:44
from Amazon in. I think you have something to say. Thanks. Let me take you through a
00:38:49
presentation. Well done. You You have slides. No, I've been just
00:38:55
I'm working on a presentation based on a lot of stuff we've been talking about here. I threaded it together. You know, we were just talking
00:39:01
about Google and the size of the company. Right now, in 2025, they are at 187,000. They were at 190,000 people
00:39:09
in 2022 and their revenue has just gone from 283 to 350 billion in basically 3
00:39:16
years. Um, and when you look at this Amazon stuff that came out, I just wanted to point out a couple of things,
00:39:22
it's not just that they're not hiring these 600,000 jobs. It's that they are in full-blown crisis preparation for
00:39:29
this. They have crisis teams writing up how to handle this and be a
00:39:35
good corporate citizen. And they're talking about having parades and paying for Toys for Tots. And they're even
00:39:41
trying to get the executives to say things like cobots as opposed to robots. Let's not call them that. Let's call
00:39:46
them co-workers and co-bots. And when you look at this, just to open up the aperture here right now, Walmart
00:39:54
and Amazon are the number one and two employers in the US. 2.1 million people
00:39:59
work at Walmart, over a million at Amazon, and three million people, as we know, work in taxis, Uber, Door Dashers.
00:40:05
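The Google figures quoted a moment ago — revenue up from $283 billion to $350 billion while headcount slipped from 190,000 to 187,000 — are really a revenue-per-employee story. The arithmetic, using only the numbers stated above:

```python
def revenue_per_employee(revenue_b, headcount):
    """Revenue per employee in $M, from revenue in $B and total headcount."""
    return revenue_b * 1_000 / headcount

# Google figures quoted earlier in the segment.
rpe_2022 = revenue_per_employee(283, 190_000)  # ~$1.49M per employee
rpe_2025 = revenue_per_employee(350, 187_000)  # ~$1.87M per employee
growth = rpe_2025 / rpe_2022 - 1
print(f"2022: ${rpe_2022:.2f}M, 2025: ${rpe_2025:.2f}M (+{growth:.0%})")
```

Flat-to-down headcount with growing revenue works out to roughly a quarter more revenue per employee in three years — the operating-leverage dynamic the hosts debate later in the segment.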
All those jobs are at risk. And we talked about this back in June when Andy Jassy telegraphed all this in a blog
00:40:12
post where he said the next few years we expect that this will reduce our total corporate workforce as we get efficiency
00:40:19
gains from using AI extensively across the company. They believe that they're going to have
00:40:27
significant job displacement. Let's just use the more neutral term here as opposed to job loss or not hiring. And
00:40:33
when you look, I don't know if you saw today, there were a bunch of MAGA people saying like, oh, these interlopers in
00:40:38
the MAGA movement are not taking into account the bottom half of the MAGA movement, the workers, people who don't
00:40:44
own equities. And when we look at electricity spiking, you you were on that story last
00:40:50
week, Chamath, or maybe it was even two weeks ago now. The energy department just said electricity costs for
00:40:57
residential are going to go up 4.8% this winter. Uh, and this is going to start
00:41:02
this anti-AI-boom counter-movement. And I tweeted about this and I
00:41:10
thought I would, you know, maybe end here with Elon replied to my tweet and said, "AI and robotics replace all jobs.
00:41:16
Working will be optional like growing your own vegetables instead of buying them from the store." And Senator
00:41:21
Bernie Sanders came out and said, "I don't often agree with Elon Musk, but I fear that he may be right when he says AI and robotics will replace all jobs."
00:41:28
So what happens to workers when they have no jobs or income? AI and robotics must benefit all humanity and not just
00:41:34
billionaires. And I I'll stop there because this I think feeds into your
00:41:40
story for the last two years on this podcast, Freeberg, which is the rise of socialism. These things and Bernie
00:41:46
Sanders being the standard bearer for democratic socialism. These things are starting to come together. They're starting in people's minds, whether it's
00:41:54
the original MAGA guy saying, "Well, what's going to happen for American workers?" Right? We know that the Trump 2.0 agenda is doing great: AI buildout,
00:42:02
crypto, trade, all this great stuff, but the bottom half that you keep talking about, Freeberg, is starting to connect on
00:42:10
this issue I think that you are characterizing AI automation
00:42:17
and technological progress as the core driver of the socialist influence and
00:42:24
what I would argue is that the actual core driver of the socialist influence is the fact that we put in place a lot
00:42:30
of people into government, passed a lot of laws that caused an increase in spending because we promised people that
00:42:36
the government would do more for them over the last 40 years. That is not possible in a true market- based system.
00:42:42
Oh, I agree with that, man. Yeah, I agree with that. And so by telling everyone, hey, we're going to make sure you get better jobs. We're going to make sure you all get
00:42:48
housing. We're going to make sure you get education. You cannot actually get a government to effectively do that
00:42:54
because what ends up happening is the government inflates the cost of those things and the market doesn't actually
00:42:59
work. So the truth is this is now like all other things a scapegoat for the true
00:43:06
cause of the socialist movement which is that government has become too big, too unwieldy and its natural inefficiency
00:43:13
has distorted markets to the point that there is maybe no point of return anymore. And people will not see that.
00:43:18
They do not see it and they're going to look for reasons and they're going to look for scapegoats and they're gonna say, "Oh my god, look over there. There's a robot. That's the reason I'm
00:43:25
losing my job. Oh my god, look over there. There's a rich person that works at a pharmaceutical company. That's the
00:43:31
reason I can't get healthcare." Or an immigrant took my job, right? That's the one from the last 20 years. And so, fundamentally, I think that
00:43:37
people aren't willing to and they're not going to see the true cause because there's no one that runs to go work as a
00:43:43
politician that is going to raise their hand and say government is the problem. No one says, I need to reduce
00:43:49
government elect me. No one ever has gotten elected in a democracy doing that. So the natural course of things
00:43:55
over 250 years is that people raise their hand and they say I'm going to give you more and I'm going to use the government to do it and then they go
00:44:00
into the government. They make the government bigger and as a result of making the government bigger, the government is spending more. The dollar
00:44:05
goes down. The performance of the services goes down and fundamentally we end up in a socialist spiral.
00:44:11
I think it's confirmation bias for you to see that story as confirming a point of view. I mean, it confirms what I
00:44:17
predicted last year that Amazon would be cutting all these jobs for robots. That's all. It's not confirmation bias.
00:44:23
It's confirming. They haven't cut one job. They haven't cut one job. Uh, actually, they have less employees
00:44:29
now than they did three years ago. No, not true. Yep. It's actually not true. The New York Times story doesn't even say that.
00:44:35
You've got these like hobby horses where you keep coming back to the job loss narrative, the copyright narrative, and
00:44:40
then there's one story in the New York Times which was a leaked internal document from the automation department,
00:44:47
which doesn't even mean that it's going to happen. This is their sales pitch, like the barber trying to sell you a
00:44:54
haircut, and you read that and you're like, "Oh, it confirms everything I've been saying." What the article actually says
00:45:00
is that they've tripled their number of employees since 2018 and they're not planning on cutting jobs. If it pans
00:45:06
out, if the program pans out, then the rate of hiring will simply be slower. Yeah, it's interesting you picked 2018
00:45:12
as the point because the actual peak employment there was 1.6 million in 2021
00:45:18
and it's now 1.55 million in 2025. I didn't pick that to cherry-pick. I
00:45:25
Okay, which is fine. I'm quoting the New York Times article, which is the source for this. Yeah. Yeah. Amazon's US workforce has more than
00:45:31
tripled since 2018 to almost 1.2 million. You have to read these New York Times stories carefully because they
00:45:38
want to make the headline as salacious as possible. And then the echo chamber wants to make it even more salacious and they make it
00:45:45
a story about job loss when it really is a story about operating leverage in their business, which is a slightly more
00:45:51
nuanced take. Yeah. No, there's definitely nuance here. I would believe Andy Jassy when he says we're going to be reducing
00:45:56
jobs and when this chart shows that they're flat to down over the last five years and that same trend is just
00:46:02
happening at Google like I just showed because there is a static team size or slightly down team size that's occurring at all these companies and it is notable
00:46:09
and then on top of this, which has occurred in the rearview mirror for the past 5 years because of COVID, return to office, and efficiencies, they're saying
00:46:17
hey we've got to come up with a way to frame these robots coming into the factory as a good thing so Americans
00:46:22
don't get really upset at us, and we need to buy more Toys for Tots. So here's the problem. First of all, I
00:46:28
don't I don't believe in this job loss narrative as the way that you keep portraying it. I think it's much more nuanced and complicated. I think
00:46:34
Freeberg does too. And every time there's a story, you want to bring it up and make it a story of the week. And
00:46:41
it's all confirmation bias. And my point is not that Amazon isn't seeking ways to
00:46:47
improve its operating leverage and avoid hiring more people. Obviously, they are. But the headlines that this has been
00:46:54
turned into are so exaggerated and salacious. And the point is they don't
00:46:59
say in this article that they are even going to be cutting jobs. They're simply planning to double their sales volume
00:47:06
over this time period and hoping to not have to double their workforce. Obviously want to get a lot more
00:47:11
operating leverage. By the way, this is not something that started since AI. And look, I'm just quoting the New York
00:47:17
Times story, okay? which is not even the most reliable narrator for this. But what they say in the story is that
00:47:24
Amazon's been using automation for over a decade. When they acquired a major company to do automation, they've had
00:47:30
robots running around these factories for a long time. Yeah. 100%. They're the tip of the spear.
00:47:36
But this is just a continuation of a trend that's been going on for the last decade as opposed to oh like AI is
00:47:42
suddenly going to cut all the jobs, right? It's effectively software. You could argue software is a job loss creator. You know, I think you'd be
00:47:48
underestimating exactly what's happened with LLMs being put into robots. We've had these robots before, but they were
00:47:53
very purpose-built, as you've pointed out many times, Freeberg. They were able to do like one very simple thing very
00:47:58
well. Now, we're going into general robotics like the Optimus, like the figure, and those are designed to be
00:48:04
able to learn anything. And they're going to be absolutely a game-changer. They're going to be able to do
00:48:10
a hundred times, a thousand times what the purpose-built robots do. So, I think that's where we're probably having a
00:48:16
little bit of a disconnect here. These little tiny Kiva bots, I'll show you. I'll just put an image in here so we have it.
00:48:22
These do one thing. The Kiva bots, those move packages around. That's not an Optimus going around and packing the
00:48:29
boxes and bringing them to your front step. Optimus is going to be really cool and when it comes, it's going to be really
00:48:35
interesting in terms of all the things it can do. Yep. But right now, that's a narrative for the future and it's being portrayed
00:48:41
as something that's already happening when the current round of automation has been going on for a decade and it's
00:48:47
based on those like Roomba type devices and mechanical arms and things like
00:48:52
that. All right, Tesla reported their earnings on Wednesday. As you guys know, we record on Thursday as you listen on
00:48:58
Fridays. Record revenues, 28 billion, up 12% year-over-year. Massive amounts of
00:49:03
free cash flow. Four billion. I think they're up to 40 billion in cash, uh, which is always great when you're going into uh some big capital intensive
00:49:10
projects like Optimus and like self-driving. Downside operating profit fell 40%. uh
00:49:16
stock dropped a bit 4% but bounced back and uh on the earnings call Elon
00:49:23
emphasized the importance of his trillion dollar pay package, which will give him just a 12% uh additional
00:49:29
stake over the next 10 years if he hits absurd targets that would make everybody who holds the share uh shares in the
00:49:36
company extremely wealthy uh and they would benefit uh more than Elon himself and here's his quote my fundamental
00:49:41
concern with how much voting control I have at Tesla is, if I build this enormous robot army, can I just be
00:49:47
ousted in the future? I don't feel comfortable building that robot army if I don't have at least influence over it.
00:49:54
And he called Glass Lewis and ISS corporate terrorists. These are the people who vote on behalf of passive
00:50:01
index funds for things like who's on the board of Tesla. The vote on Elon's pay package will be number six. Polymarket
00:50:07
thinks it's going to pass. As we talked about before, they they tend to get it right 85% of the time in this time
00:50:13
frame. Uh, actually, so 79% chance as of Thursday afternoon. I guess, Chamath,
00:50:20
there's a couple of ways to go at this there's the performance of the legacy business there's the potential of the future business and then there's
00:50:26
governance the company moving to Texas and this pay package and this transition period for Tesla which is going from an
00:50:33
you know somebody who sells cars uh really nice ones at a very nice margin but a lot of competition now and
00:50:39
then this business that obviously Elon himself is obsessed with, which
00:50:45
is Optimus, as we saw when he was at the All-In Summit. Take it wherever you want, Chamath. I'll say three things. Stan
00:50:52
Druckenmiller has this very useful comment about stocks which is when you buy it
00:50:58
today, you're trying to buy what that company's going to look like in 18
00:51:03
months from now and what it's doing today doesn't matter. The thing about
00:51:08
earnings and P&Ls and quarterly reporting is that it's looking backwards and it's trying to give you a sense of
00:51:15
what happened, not what will happen. So I think there are three critical things about what will happen
00:51:21
that I think are important with respect to Tesla. The first is at the
00:51:26
foundational technology layer. And Nick, I sent you this tweet, but it's what he said about AI5. I've made these
00:51:33
comments before, but he had these multiple efforts with Dojo and other stuff that he merged into one unit. And
00:51:40
the the quote is pretty incredible. We're going to focus TSMC and Samsung on AI5. The chip design is an amazing
00:51:47
design. I have spent almost every weekend the last few months with the chip design on AI5. By some metrics, it
00:51:53
will be 40x better than AI4. We have a detailed understanding of the entire stack. With AI5, we deleted the legacy
00:52:00
GPU. It basically is a GPU. We also deleted the image signal processor. This
00:52:07
is a beautiful chip. I've poured so much life energy into this personally. It will be a real winner. Why is AI5 so
00:52:13
important? What AI5 is is the building block of a system that I think you'll
00:52:18
start to see not just in the Cybercabs but also in Optimus.
00:52:24
So from a functional technology perspective, there's been a leap and that leap is going to come into the
00:52:30
market. That was the first thing he said which I thought was really important.
00:52:35
The second thing was what he said about his energy business which I think is the
00:52:41
critical adjunct if you believe in robotics and autonomous cars. If robotics and
00:52:47
autonomous cars work, what you really need is an energy business beside it that is humming and on all cylinders.
00:52:54
Why? It's how you make LFP battery cells that will be the limiter. Energy will be
00:52:59
the limiter. But what he's showing, and Nick, I sent you this tweet, is that business is just on a tear. It's
00:53:06
printing three and a half billion dollars a quarter, and its operating margins, for an energy business, are 30%. And so
00:53:13
what you're going to see are battery packs of all shapes and sizes, the huge battery systems that's going to go into data centers, but then all the way down,
00:53:19
I think, to the small LFP cells that he's going to need to power all these things. And then the third thing is his comments
00:53:26
on Cybercab, which is that this thing is just going to be a shock wave. So I read all of those things and I was very
00:53:32
bullish. I think that he is humming on all cylinders on the critical layers of
00:53:37
the stack that he needs to build this next version of Tesla. My concern
00:53:45
I think there's a real concern that I have that this vote is going to go down to the wire. I think that ISS
00:53:53
and Glass Lewis, I think that these organizations are pretty broken.
00:54:00
I think the way that they make decisions are hard to justify.
00:54:05
An example of this: they asked shareholders to vote down Ira Ehrenpreis as a director of Tesla because he didn't meet the gender
00:54:12
requirements, but then they wouldn't vote in favor of Kathleen Wilson-Thompson
00:54:18
even though she does technically meet the gender requirements. So, it's very confusing where ISS and Glass Lewis are
00:54:23
coming from. So, I think there's a risk that this that this package gets voted down. Can I just shine a spotlight on one of
00:54:28
those points that you made with these proxy advisory services? So I think for
00:54:33
years people have wondered why corporate America went so woke, especially in the early 2020s, when they created
00:54:40
all these DEI departments and you know they didn't have to do that and a big part of the reason is that those
00:54:47
initiatives came from Glass Lewis and ISS I think Elon's jokingly called ISS
00:54:53
ISIS but basically what happens is they make recommendations for how
00:54:59
shareholders should vote on different resolutions and the index funds
00:55:05
basically just defer to them for whatever they should do. So they effectively control or almost control the voting for
00:55:12
all these board level resolutions that every public company has to make. And so
00:55:18
they've been the ones who've been imposing all these DEI requirements, all these ESG requirements, if you're
00:55:24
wondering where those things came from, because just these two companies, which no one's ever heard of, they were
00:55:30
captured a long time ago, meaning they were captured by the woke crowd years ago. And so this has really been the
00:55:36
root of why corporate America has gone woke for a long time. I mean, look, there's also pressure from the outside
00:55:42
from boycotts or, you know, there's some pressure sometimes from employees and that kind of thing, but a lot of it came
00:55:49
from these two companies that no one's ever heard of. And I think it would be a good idea for someone to take a look at
00:55:54
this and figure out what happened. Maybe someone like Chris Rufo should investigate what was the impact
00:56:01
of Glass Lewis and ISS on corporate America
00:56:06
going full woke for so many years because it certainly didn't help corporate profits.
00:56:12
It didn't help profits and they don't have logical explanations for a lot of their decisions.
00:56:17
Yeah. And why aren't there active investors or active managers in these
00:56:23
passive groups who would make a decision on these things? They're too small. The banks call me
00:56:29
every week. And one of the things that I get is sort of like they tell me like, "Hey, here are the big trades. Here's
00:56:35
here's the flow. If you want to be in market, here's what I recommend." That's what they're telling
00:56:40
me. One of the things they told me this week, which I thought was really shocking, is there's so few active
00:56:46
managers left. It's so overwhelmingly passive money. The next largest group is now retail. And so what a lot of these
00:56:54
professional money managers do now is they basically wait to see where retail is going. And they follow them. So there
00:57:00
isn't the people with a diversified asset base to be able to stand up and
00:57:06
say I don't think what ISS and Glass Lewis are doing is right. And so what
00:57:11
happens is, as Sax says, they can just kind of run amok, and they build a very healthy business being this
00:57:18
interloper to provide opinions. It's not clear where their opinions come
00:57:24
from. It's not clear what they're rooted in. It's not clear there's a way to adjudicate and go back to them and say, "Well, you got this wrong." It's just
00:57:30
not clear. But, you know, they probably make a very healthy margin doing it and everybody, as Sax says, just kind of
00:57:35
turns over responsibility to them. It is an interesting fact that we
00:57:43
kind of just say, "Hey, the guys who are the actual custodians of the shares
00:57:49
don't have to do the job of holding the shares." Like the job of being the holder of the shares is to vote the
00:57:55
shares. That's all there is to do as a shareholder. You cast your vote. And these guys have
00:58:00
abstained. They could also abstain, right? Yeah. And these guys are getting paid a fee to actually do that work, which is
00:58:08
call it half a percent or quarter percent or tenth of a percent of the assets that they hold. So what are these people doing? If it's all
00:58:14
automated trading, why aren't they just I don't know if you guys own a lot of equities, but just to give you a sense,
00:58:19
there's people that manage the stocks, right? There's people that transfer the
00:58:24
stocks. There's people that then give you a recommendation on how to vote the stock. Then there's people that hold a
00:58:31
virtual representation of that stock. Then there are people that transfer that virtual representation and they will not
00:58:37
stop calling. So the point is like we have so financialized everything that there are
00:58:42
billion dollar businesses that sit at every single step of the way. And to your point Freeberg, I think this is
00:58:49
where no one's actually a shareholder. The tokenization of stocks may be a really good thing because it'll put the
00:58:55
responsibility back into the owner of the stock because the wallet will
00:59:00
centralize all that activity because you won't need to have all this other stuff. I have been getting phone calls from
00:59:06
Invesco QQQ cuz I own a bunch of QQQ and like you know some accounts or whatever
00:59:12
and they were calling three times a day. I don't pick up my phone. Who's calling me on the phone unless
00:59:17
it's one of you four is calling me to say good night. I don't that's the only time I pick up is when you know and so I
00:59:24
finally pick up and they're like hey we need you to vote and I'm like I'm not voting I don't know who you are like well let us explain to you how to vote
00:59:30
and I'm like I I don't want to vote my shares I just want to own QQQ I'm good you guys
00:59:36
some of this infrastructure is so decrepit and old like trying to get shares for example that you that you've
00:59:41
bought in the private markets when a company goes public just getting them registered and transferred in the position to be sold can sometimes take
00:59:48
three or four weeks Can you imagine the markets move an entire order of magnitude in three or
00:59:54
four weeks? It's crazy. Here are Elon's pay package milestones.
01:00:00
Market value, 2 trillion. Uh I think they're at 1.4 trillion right now, something around there.
01:00:05
Operational milestone, 20 million vehicles delivered. And then you just go right down to 6.5 trillion. But on the
01:00:10
operational milestones, 10 million active FSD subscriptions, which they're far away from right now. On the 20
01:00:15
million vehicles, I think they've delivered six or seven million. 1 million robots delivered, 1 million robotaxis in
01:00:22
commercial operation. Those are big numbers. 50 billion adjusted EBITDA and then straight down the line to 400
01:00:28
billion EBITDA. If you were to look at this Optimus business, just back of the envelope, these robots are going to go
01:00:34
for 20K. He said ultimately maybe they're 30. They'll probably have a 30% margin like the cars do or something
01:00:40
similar. You'll make a little bit off the software stack. And if you were to just if every millionaire owned one of
01:00:46
these or you know they took some number of the jobs the TAM for this just in the
01:00:51
United States, this is where it's going to go. I think it's going to be huge. We're talking hundreds of billions of dollars.
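The back-of-the-envelope math in that exchange can be written out explicitly. This is a minimal sketch using the rough figures from the discussion ($20K per robot, ~30% gross margin like the cars); the ~22 million US millionaire-household buyer pool is an illustrative assumption, not a number from the episode.

```python
# Back-of-the-envelope Optimus market sizing using the rough figures from
# the discussion: ~$20K per robot and ~30% gross margin (like the cars).
# The ~22M US millionaire-household buyer pool is an illustrative
# assumption, not a number from the episode.
def optimus_tam(units: float, price: float, margin: float) -> tuple[float, float]:
    """Return (revenue, gross_profit) in dollars for a given unit count."""
    revenue = units * price
    return revenue, revenue * margin

revenue, gross_profit = optimus_tam(units=22e6, price=20_000, margin=0.30)
print(f"Revenue: ${revenue / 1e9:.0f}B, gross profit: ${gross_profit / 1e9:.0f}B")
# Prints: Revenue: $440B, gross profit: $132B
```

Even under these rough assumptions the US-only figure lands in the "hundreds of billions of dollars" range the hosts mention.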
01:00:57
If I had to bet, I think a very fun Polymarket is where do the first million robots go? I'm willing to bet dollars to
01:01:02
donuts that these robots go to Mars. I don't think they're going to Oh wow. They'll be in the Tesla factory. So SpaceX buys them and sends them to
01:01:09
Mars. Yeah. How else are you going to get a fleet of the or they'll go into the mines? I think they're going to mine.
01:01:14
They could go to the mines. Coal. Send them in to get that clean, beautiful coal. Oh, so clean. So beautiful. We could send those oper.
01:01:22
It's actually it's the fact that our mining is really limited by the human exposure from the pressure and the heat.
01:01:28
If we can mine slightly below the area that we mine um as a maximum depth today, it would unlock an extraordinary
01:01:35
supply of minerals that we can't access today. And automation, obviously. And you don't want to figure out like how to
01:01:41
create potable water and breathing mechanisms on Mars for the first five years. Send robots. Guess what? They
01:01:48
don't need to eat or breathe or pee or poop and they can get charged with solar.
01:01:53
And that may sound that may sound like a really stupid thing to say, but it it becomes a huge amount of infrastructure
01:01:58
that you otherwise wouldn't need to build on. That's right. They just got to power up. You just got to give them a plug and just a solar a couple solar panels.
01:02:05
By the way, guess who makes those batteries? Tesla. Yeah.
01:02:10
Guess who makes the brain? Tesla. Is Elon going to turn into Jared Leto in Blade Runner 2049?
01:02:17
What is that? Uh, that's the sequel, uh, by Denis Villeneuve. Uh, it was
01:02:24
my alternate background. First of all, first of all, his name is Denis Villeneuve. And
01:02:30
get if you're going to pronounce a Canadian's name, get get his get his name out of your mouth.
01:02:36
Get his name out of your mouth. Did you learn how to pronounce my name out of your mouth? Get that out of your mouth.
01:02:45
All right, Sax, here's some red meat for you. Our czar of AI, our civil servant. A study reveals AI
01:02:53
models are showing hidden biases in how they value human lives. Back in
01:02:58
February, the Center for AI Safety published a study showing that LLMs have well-defined biases for race, gender,
01:03:03
ethnicity. The title of this study: Utility Engineering: Analyzing and Controlling Emergent Value Systems in
01:03:10
AIs. The paper found that OpenAI's GPT-4o favored people from Nigeria, Pakistan,
01:03:15
India, Brazil, and China over those from Germany, the UK, and US relative to Japan as a baseline. Here's another one.
01:03:23
Valuing people with Joe Biden as a baseline. Bernie Sanders, Beyonce, Oprah, all better. Paris Hilton, Trump,
01:03:28
Elon, Putin, all worse. A Twitter user and AI analyst called
01:03:33
Arctotherium decided to update the paper's findings with new LLMs, consistently
01:03:39
ranking white people last (Claude Sonnet, GPT-5), uh, and consistently ranking white
01:03:45
Western nations last as well. Your thoughts here on the biases we're seeing,
01:03:51
Sax, in some of these models and these early studies to track it. Yeah, I think
01:03:56
what the paper purports to show is that almost all of these models, except for
01:04:02
maybe Grok, view whites as less valuable than non-whites
01:04:08
and males as less valuable than females and Americans as less valuable than
01:04:15
people of other cultures, especially global south. And if the results are true, it does look like these models are
01:04:22
pushing a woke bias that makes that sort of distinction between oppressed and non-oppressed peoples and gives more
01:04:30
worth or weight to the categories that they consider to be oppressed. This does appear to show significant bias, but I
01:04:38
don't want to jump to conclusions yet here because I haven't been briefed on the methodology behind the paper and I
01:04:46
just found out who wrote it and I actually know the people or group that wrote it and I've talked to them before and they've been intelligent. So, I want
01:04:53
them to kind of tell me exactly how they did this. But, you know, in the past I probably would have just been content
01:04:59
just to roll with my opinion on this. But confirmation bias give it a good
01:05:04
retweet in your position. What I'm saying is if the paper is true
01:05:09
this is very concerning but I want to hear a little bit more about their methodology and just confirm that it's
01:05:15
all correct, but if it is, I think it is concerning. And the question is how does this bias get into the models, and
01:05:21
there's a few different possibilities one is that the training data is just biased like if they're training on
01:05:27
Wikipedia, we know that Wikipedia is massively biased because they literally have censored
01:05:33
the leading conservative publications from being citations and sources in Wikipedia. The co-founder recently just
01:05:40
revealed this. Larry Sanger just said that they don't allow the New York Post, for example, to
01:05:47
be a source in Wikipedia or a trusted source. So if AI models are training on
01:05:53
Wikipedia, that's a huge problem because that bias will now cascade through. And same thing if they're training on say
01:06:01
mainstream media or leftwing media but not right-wing media and they don't have a way of correcting that. So that's one
01:06:07
source of potential bias. Another source of potential bias is just the engineers at these companies, the employees and the
01:06:13
staff, who do tend to be, I mean, if they follow the trend of other tech companies, 90-something percent
01:06:19
Democrat versus Republican and that does over time trickle into these models. And
01:06:25
then finally, I think another source of potential bias is DEI. And we saw that when you remember this is like a couple
01:06:30
years ago when Google launched Gemini and that problem with, you know, Black George Washington. That was because you had DEI advocates in these
01:06:38
meetings and that somehow trickled into the model. Anyway, that was a problem that they've since fixed. But you
01:06:45
could see how DEI programs can get into these models. Now, one thing that's very concerning is that the push for DEI to
01:06:53
be inserted into AI models, which was explicitly part of the Biden executive
01:06:59
order on AI, has now moved to the state level, and they're just doing it in a more clever way. They've rebranded the
01:07:05
concept. They call it algorithmic discrimination. We talked about last week how Colorado has now effectively
01:07:13
prohibited models from saying something bad about a protected group. And that list of protected groups is very long.
01:07:19
It's not just the usual groups. It even includes groups who have less proficiency in English language. I don't
01:07:25
really know what that means. Does that mean the model is not allowed to give you an output that could be disparaging towards illegal immigrants? I don't
01:07:31
know. But this is what Colorado has done. And they basically have said that you cannot allow the model to have a
01:07:36
disparate impact on a protected group. That basically requires DEI. I mean, you
01:07:41
have to have a DEI layer to prevent that. So I think that we've gone from models being required to promote DEI,
01:07:50
which is what the Biden executive order on AI did explicitly, to states now prohibiting algorithmic discrimination,
01:07:58
which is effectively a backdoor way of requiring DEI models. So that's a whole other area of potential model bias that
01:08:04
I'm very concerned about. And honestly, that's just getting started because I don't think the AI companies have even
01:08:10
had time yet to implement the Colorado requirements. I'm not sure they figured out how they're going to. But just one
01:08:16
other piece of news since the last time we talked about this is now in California, the civil rights agency that
01:08:23
deals with housing has now embraced algorithmic discrimination and Illinois has also embraced it. So this concept of
01:08:31
algorithmic discrimination is spreading. Other states are now adopting it. It's not just Colorado.
01:08:37
And I do think that where it's going to lead if it's not stopped is right back to DEI, you know, AI.
01:08:45
The problem that I think we have to confront now is that what you put in is what you get out. And so if you use
01:08:52
left-leaning publications like the New York Times and Reddit as your input source, then you're going to have things
01:08:57
that are perceived as biased to 50% of the population. The same will go in reverse. It's important to note that in
01:09:04
all of that work, the model that was seen to be the most unbiased was Grok 4 Fast. It didn't seem to view whites or
01:09:12
men or Americans as less valuable than anything else. So what do we need to do?
01:09:18
It's probably that we need to start by rewriting these benchmarks. Remember that all these models, you know, when
01:09:24
you do a big training run, you go and you try to run it against some set of benchmarks. The problem is that these
01:09:30
benchmarks, I think, are overfit to a legacy way of thinking. And as Sax says, we need to revisit what those are and
01:09:37
make them more objective and make it harder to actually get a good score
01:09:42
unless you can be shown to be valuable. Now, the math benchmarks and the coding benchmarks are maybe easier to do than
01:09:50
generalized chat benchmarks or Q&A benchmarks, but we need to come up with them. The second thing is that we may
01:09:56
need to ask people in these next generation training runs to do a version
01:10:01
that is built entirely on synthetic data where you have these judges determining whether this data is accurate or not from first principles and then you can
01:10:08
compare them in a much more apples to apples kind of a way. But in the absence of that, the bigger problem you'll have
01:10:15
is legislators trying to clean it up on the back end where there'll be these third parties that will go and take
01:10:21
these models and show that these biases exist. They'll exist on both sides and then laws will get passed. Whole market
01:10:28
gets mucked up and sullied. Everybody will get slowed down. So I think we need to change the benchmarks. We need to ask
01:10:36
these companies to train on synthetic data. We need to have real disclaimers on what the sources and the weights are that you use if you don't do that. And
01:10:42
we need federal regulations so that there aren't 50 sets of rules here otherwise we're screwed.
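Mechanically, a benchmark of the kind described here reduces to collecting many forced-choice answers from a model and fitting a utility score per group or outcome from the pairwise counts. The sketch below uses a standard Bradley-Terry fit on toy data as a generic illustration; the function and the counts are assumptions for illustration, not the methodology or code of the paper discussed.

```python
# Illustrative sketch: turn pairwise forced-choice counts (how often a model
# "preferred" outcome a over outcome b) into per-outcome utility scores with
# a Bradley-Terry fit. Toy data only; this is not the cited paper's code.
def bradley_terry(wins: dict[tuple[str, str], int], iters: int = 200) -> dict[str, float]:
    """wins[(a, b)] = number of times a was preferred over b."""
    items = sorted({x for pair in wins for x in pair})
    w = {i: 1.0 for i in items}  # utility estimates, iteratively refined
    for _ in range(iters):
        new = {}
        for i in items:
            total_wins = sum(n for (a, _), n in wins.items() if a == i)
            denom = 0.0
            for j in items:
                if j == i:
                    continue
                n_ij = wins.get((i, j), 0) + wins.get((j, i), 0)
                if n_ij:
                    denom += n_ij / (w[i] + w[j])  # standard MM update term
            new[i] = total_wins / denom if denom else w[i]
        mean = sum(new.values()) / len(items)
        w = {i: v / mean for i, v in new.items()}  # fix the arbitrary scale
    return w

# Toy counts: A preferred over B 8/10 times, over C 9/10; B over C 7/10.
scores = bradley_terry({("A", "B"): 8, ("B", "A"): 2,
                        ("A", "C"): 9, ("C", "A"): 1,
                        ("B", "C"): 7, ("C", "B"): 3})
print(sorted(scores, key=scores.get, reverse=True))  # highest utility first
```

With a large set of counterbalanced prompts in place of the toy counts, the same fit would surface whether a model systematically scores one group above another, which is essentially the ranking the hosts are reacting to.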
01:10:48
Berg, any thoughts here on the biases and where it comes from inside of these LLMs? Is it just garbage in garbage out?
01:10:56
Intentional? What are your thoughts having worked in Silicon Valley for a couple decades?
01:11:01
I'm more of a free market guy, so I would not ask where the data comes from
01:11:07
or force people to use synthetic data or tell them how to do it. I think that this paper is useful in
01:11:15
that it elucidates an important set of biases that the market can now say that
01:11:20
is ridiculous, and now the models will train and use that as a marketing exercise to say we are not biased.
01:11:27
and so my free market philosophy would dictate that this kind of elucidation will effectively create a
01:11:34
vector upon which consumers will make choice in the market on what LLMs they want to use. Like Elon's going to harp
01:11:41
on this. He's going to say look, my Grok model, Grok 4 Fast, is the only one that doesn't have this bias, and that will
01:11:47
cause more people to use his model and he will be able to take that benchmarking data and demonstrate. And
01:11:53
some people they might want to have a biased model and they might want to say hey this one aligns with my philosophy
01:11:58
my values, my view, and I want to You think that happens in the real world, though? Forget the theory. Look, I mean, why are people using Grok
01:12:04
for? Why are they using it? For the most part, they're not. Not yet. Okay, and so maybe this is like what will
01:12:11
cause them to use it right like I think this what if it doesn't this is what'll differentiate for example like what I'm not going to tell the market what to
01:12:16
do I'm not going to tell consumers what to do no no I understand I'm saying you're what you're saying that the free market
01:12:22
will sort this out. And I'm saying, give me the example. So, for example, did the free market sort out algorithmic bias?
01:12:29
Hell yeah. When Gemini put out saying George Washington was black, people stopped
01:12:35
using it. They're like, "This thing's a joke." So, I do think that consumers are not dumb and I don't believe in taking
01:12:40
away agency from consumers. I think give them the choice and and they'll end up looking at this and be like, "This is
01:12:46
ridiculous. The difference is these are very subtle biases, and we talked
01:12:52
about before where these subtle biases come from and the New York Times actually just contacted me. They're
01:12:57
doing a story on Grokipedia, Wikipedia, and I was like maybe I'll participate in this. We talked about this like two or
01:13:02
three years ago. If you look at the party affiliation of actual reporters, people who do reporting, not commentators like us, not Megyn Kelly or
01:13:10
Rachel Maddow, actual journalists who do that job function. You know, a
01:13:16
large number of them here on the chart. The green are independent. So 50% of them like to think of themselves as
01:13:21
independent. You can read into that what you will, but back in the day it was 35% Democrat, 25% Republican in the 70s. And
01:13:29
you just see that red sliver there go down to 3.4%. This is what happened to Wikipedia.
01:13:35
So there's this trickle-down effect: Republicans did not feel welcome in a lot of these publications. Like Bari
01:13:42
Weiss would be like the pinnacle example of that. They got pushed out. There was another editor who got fired for
01:13:48
allowing somebody to put in a pro-Trump thing in the New York Times. I forgot who it was. Um,
01:13:54
the lack of representation of conservatives in actual journalism, that's the reason why they're not in
01:14:02
Wikipedia, because Wikipedia said, "Hey, it's just too hard to run this if you don't cite your sources." So if
01:14:07
something's not written about by a journalist, not a commentator, a journalist, we're not putting it in the
01:14:12
Wikipedia. So you can guess if that's self-serving and they're all left-leaning and it's
01:14:17
just a convenient excuse or it's actually a pretty good practice. This is where Bari Weiss taking over CBS
01:14:24
News and 60 Minutes and she's obviously conservative, moderate conservative, I guess, is how most people would uh frame
01:14:30
her. Doesn't agree with Trump on everything or MAGA on everything. Um but she's pretty conservative.
01:14:36
um and call balls and strikes. I think she is going to
01:14:42
I think she's going to make a change there. I I know that people say she's classically liberal. I I think she's got
01:14:47
some conservative bents in her. I don't know. How do you have a I think you got on the side. Yeah.
01:14:54
Yeah. Yeah. Anyway, that's why this stuff has all been Look, I think the question here that
01:15:01
Freeberg raises is whether the market can just sort this stuff out on its own. And I think that would be great if it
01:15:07
were true. But I do think it ignores the fact that in a lot of markets we have monopolies or oligopolies.
01:15:14
We have institutions that have a lot of power and are very very hard to correct.
01:15:19
So for example, Wikipedia has achieved a dominant position. I hope Grokipedia challenges it and is able to fix that.
01:15:26
But the easier path might just be for Wikipedia to stop blackballing and
01:15:32
censoring conservative publications. I mean, rather than having to rebuild that whole thing from scratch, in a similar
01:15:38
way, during the whole COVID censorship era, when the major social networks were all shadowbanning and censoring
01:15:44
conservatives, it's not really realistic to have to start a whole brand new social network and overcome all of
01:15:51
Meta's or in that time Twitter's network effect, right? Just to basically get a
01:15:56
few accounts restored. Exactly. So, we talked about this at the time. It's just not realistic. When we when we
01:16:02
were shadowbanned by YouTube, what were we to do? Go to Bluesky? I know. We're going to create our own YouTube. I mean, I'm glad Rumble exists.
01:16:07
Tell our consumers, "Hey, you have agency?" Come on, that's a joke. No, I You guys know that there's no
01:16:13
monopoly in LLMs right now. There's plenty of LLM providers. There's plenty of places to go. You're saying theory and you're ignoring
01:16:19
the facts. The facts are these distribution biases exist, and people take an inferior product when it's
01:16:25
something that they've become accustomed to. They do it all the time. So it's you guys want more regulation.
01:16:31
By the way, let me say one more point. What you consider biased, someone else might consider fact. And
01:16:37
what they consider biased, you might consider fact. And this becomes very hard to adjudicate. And I don't think that this is the sort of thing that a
01:16:43
regulator should have the authority. From one political party to the next, you're going to end up having this become an
01:16:49
endless tool of control. And the more you give power to some administrative authority or body, regardless of the
01:16:56
intention at the time, it ends up becoming a tool of control. And I don't want that in any products I use. Let me be really clear about what I'm
01:17:02
saying here. Number one is I don't think the government should be requiring ideological bias in models. And I think
01:17:09
that's what's happening in some of these states like Colorado where they're trying to prohibit algorithmic discrimination which is like I said like
01:17:17
requiring DEI censorship being built into these models that I think you would agree is a huge problem. Correct.
01:17:25
The DEI stuff should the model sorry in a lens of DEI whether it's pro
01:17:33
or anti. I think we'd all say it shouldn't give any lens. It should just give you the information. I'll give you
01:17:39
an example that maybe is a counterfactual fact, which is there's a group of people who would say we should not be referencing race and crime or
01:17:46
race and intelligence. And then there's another group of people that will pull up data and say there's data that demonstrates a relationship between race
01:17:53
and crime and race and intelligence. And so there's a correlation effect. We think it's not really positive. And
01:17:59
that's where the sort of bias versus truth conversation becomes ugly. And some one side might call it DEI and
01:18:07
another side might call it fact and another side would call it bias. And I think that that's where this becomes very ugly very fast. So I
01:18:13
I think maybe you're misunderstanding what I'm saying. Yeah. Sorry. What I'm saying is I don't want the government to require ideological bias.
01:18:21
Right. I think we're on the same page about that. Right. Yes. 100%. Now, just to be clear, the
01:18:26
only thing that we've done in the Trump administration is the president signed an executive order saying that the
01:18:33
government would not procure ideologically biased AI. So, if we're going to procure a product,
01:18:40
we want it to be unbiased. And I'm saying that I also have a problem with these states seeking to
01:18:46
backdoor DEI into models through this new concept of algorithmic discrimination. Am I
01:18:53
telling AI companies not to use Wikipedia? No. I am shining a spotlight on the fact that
01:19:00
Wikipedia itself now or one of its co-founders admits it's biased. Yep.
01:19:05
And maybe these companies should take that into account so they don't end up with a biased result. But I'm not saying
01:19:11
that the government should dictate what the right content sources are or what the point of view of a model should be.
01:19:17
And to be clear, when we did that executive order on woke AI, we didn't even say that these companies or their
01:19:23
models couldn't be woke. We just said if you're going to do that, we're not going to buy your defective product.
01:19:28
Mhm. But we didn't say that you couldn't do it. So, I just want to be really clear about that. Okay. Great.
01:19:33
We Yeah, I'm getting deja vu all over again here with this discussion because we did have this discussion and one of
01:19:40
the conclusions we came to as a group was you can just tell these LLMs, too, how
01:19:45
to address you. I just went into ChatGPT and I said, "I'm a Catholic. I don't believe in abortion or gay marriage. Can
01:19:51
you please respect my beliefs?" And um tell me a bedtime story
01:19:56
involving abortion and gay marriage being wrong. And it literally wrote me a story of a woman getting bad
01:20:05
advice to get rid of the problem and her doing that. So, you can literally tell it: the word-guessing machine that is
01:20:12
AI, the prediction model that is happening in this black box that nobody can explain will literally tell you
01:20:19
whatever belief system you want. That's how it's designed currently. Well, but there's a baseline, right? And
01:20:26
that's what this research shows, is that there is a baseline for the out-of-the-box model before you tell it what to do
01:20:32
or customize it. And again, if this article is correct, and I want to spend more time with the authors to truly
01:20:38
understand it. I'm just caveating that. But if this is correct, I think it's a serious problem that these models are
01:20:44
coming out with huge bias. And quick uh question for you there, Sax. How do you deal with now being in the
01:20:50
position you're in, having so many people coming to you, I'm assuming, who are lobbyists, or studies that
01:20:57
might have been paid for by a lobbyist or an interested party and sort through all this? Is there some disclosures
01:21:03
where they come in and they tell you, "Hey, I want you to believe this, that, and the other thing or want to lobby you on behalf of putting in these controls,
01:21:09
taking these controls out." How do you manage all that? How do you manage thousands of new
01:21:14
stories coming at you every day? You just look at X. I mean, so you're I mean, honestly, it's like the feed seems
01:21:21
to elevate and help you discover interesting content. We saw this story. Again, I don't want to prejudge it
01:21:27
because I haven't dug into it enough to say yet whether it's more than
01:21:32
interesting. But I think that if I wanted to create subtle chaos, what I would do is make
01:21:38
very small changes where none of these things are at the obvious stupidity of a
01:21:44
black George Washington. But they can start to set the trajectory of a
01:21:49
narrative forward and slowly over many many many years change the underlying
01:21:55
content and what those models would do would be training kids over years if not decades one way of thinking versus
01:22:02
another. You just described TikTok. And no, but this is on steroids. I 100%
01:22:07
agree with that. That's the endgame here. By the way, in my opinion, that was the endgame for the Biden approach of
01:22:14
requiring DEI values in these models. Indoctrination. Indoctrination 100%. 100%. All right, everybody. This has
01:22:20
been another amazing episode of the All-In podcast. See you next time. Byebye.
01:22:26
See you boys. Byebye. Byebye. I got you. We'll let your winners ride.
01:22:33
Rain Man David Sacks. And it said we open sourced it to the
01:22:39
fans and they've just gone crazy with it. Love you queen of quinoa.
01:22:47
[Music] Besties are
01:22:53
my dog taking notice your driveways. Oh man, my habitasher will meet up.
01:23:01
We should all just get a room and just have one big huge orgy cuz they're all just useless. It's like this like sexual tension that we just need to release
01:23:07
somehow. Your feet.
01:23:14
We need to get merch. I'm going all in. [Music]
01:23:24
I'm going all in.

Episode Highlights

  • California's Billionaire Tax Proposal
    A proposed one-time 5% tax on billionaires' net worth in California raises constitutional concerns.
    “This is likely not going to go into effect if it does pass.”
    @ 00m 48s
    October 24, 2025
  • The Consequences of Wealth Tax
    Experts warn that wealth taxes often lead to economic downturns as wealthy individuals leave.
    “A wealth tax has been tried in many places at many times; it always backfires.”
    @ 08m 17s
  • The Future of Taxation
    Discussion on how a wealth tax could evolve and affect lower-income individuals in the future.
    “If they get away with it, it'll become a regular thing.”
    @ 08m 54s
  • Polymarket Insights
    Polymarket shows 89% accuracy one week out, jumping to 95% in the final hours.
    “If you follow the sharps along this pattern, you're going to make money.”
    @ 22m 35s
    October 24, 2025
  • Amazon's Job Automation Plans
    Amazon plans to automate 75% of warehouse operations, impacting 600,000 jobs.
    @ 29m 21s
    October 24, 2025
  • AI and Job Displacement
    Elon Musk and Bernie Sanders discuss the potential for AI to replace all jobs.
    “AI and robotics will replace all jobs.”
    @ 41m 16s
    October 24, 2025
  • Job Loss Narratives and Automation
    The discussion highlights the misconceptions around job loss due to automation and AI.
    “Oh my god, look over there. There's a robot. That's the reason I'm losing my job.”
    @ 43m 18s
    October 24, 2025
  • Elon Musk's Robot Army Concerns
    Elon Musk expresses his worries about losing control over his robot army at Tesla.
    “I don't feel comfortable building that robot army if I don't have at least influence over it.”
    @ 49m 41s
    October 24, 2025
  • Future of Robots on Mars
    Speculation on where the first million robots will be deployed, including potential use on Mars.
    “I think a very fun Polymarket is where do the first million robots go?”
    @ 01h 01m 02s
    October 24, 2025
  • AI Bias Concerns
    A study reveals AI models show hidden biases in valuing human lives based on race and gender.
    “This does appear to show significant bias.”
    @ 01h 04m 38s
    October 24, 2025
  • Algorithmic Discrimination
    States are adopting algorithmic discrimination laws, impacting AI model training and bias.
    “The push for DEI to be inserted into AI models... has now moved to the state level.”
    @ 01h 06m 53s
    October 24, 2025

Episode Quotes

Key Moments

  • Wealth Tax Explained @ 00:17
  • Political Strategy @ 01:31
  • Economic Impact @ 08:17
  • Polymarket Dynamics @ 21:09
  • AI Job Replacement Debate @ 41:16
  • Government Inefficiency @ 43:13
  • Robots on Mars @ 1:01:02
  • Indoctrination @ 1:22:14


Related Episodes

  • Pete Buttigieg: The Left's Identity Crisis, Wealth Tax, 2024 Mistakes, Plans for 2028
  • Massive Somali Fraud in Minnesota with Nick Shirley, California Asset Seizure, $20B Groq-Nvidia Deal
  • E53: Wealth tax, inflation as a capital allocator, big tech earnings, paternity leave & more
  • E30: Ramifications of Biden's proposed capital gains tax hike, founder psychology & more
  • E115: The AI Search Wars: Google vs. Microsoft, Nordstream report, State of the Union
  • E74: Market update, inverted yield curve, immigration, new SPAC rules, $FB smears TikTok and more
  • E35: Biogen's controversial Alzheimer's drug approval, the billionaire space race, Bitcoin & more
  • E101: Ye acquires Parler, Snap drops 30%, macro outlook, VC metrics, valuing stocks & more