
Biggest LBO Ever, SPAC 2.0, Open Source AI Models, State AI Regulation Frenzy

October 03, 2025 / 01:29:31

This episode of the All-In Podcast covers the $55 billion take-private deal of Electronic Arts, the impact of AI on the gaming industry, and state-level AI regulations. Hosts Jason Calacanis, Chamath Palihapitiya, and David Friedberg discuss the implications of the deal involving Jared Kushner's Affinity Partners and the Saudi PIF, as well as the future of gaming in relation to AI.

The conversation begins with a light-hearted discussion about fitness tests for government officials, leading into the significant news of EA's acquisition. The hosts analyze the deal's potential to reshape the gaming landscape, emphasizing the importance of video games in the digital economy and the role of AI in enhancing user engagement.

Chamath shares his bullish perspective on the gaming sector, highlighting the risks and opportunities presented by the rise of AI and private equity in the industry. He notes that the deal allows EA to focus on long-term growth without the pressure of quarterly earnings.

The episode also addresses the regulatory landscape surrounding AI, particularly the new California and Colorado laws aimed at ensuring safety and preventing algorithmic discrimination. The hosts express concerns about the potential for overreach and the challenges posed by having different regulations across states.

Overall, the episode provides a mix of humor, industry analysis, and critical insights into the evolving relationship between technology, regulation, and the economy.

TL;DR

The episode discusses EA's $55 billion acquisition, AI's impact on gaming, and state-level AI regulations.

Video

00:00:00
All right, everybody. Welcome back to the number one podcast in the world. Of course, that's the All-In podcast. I'm
00:00:05
your host, Jason Calacanis. With me again, your chairman, dictator, Chamath Palihapitiya, and the Sultan of Science,
00:00:11
David Friedberg. David Sacks will be calling in from the SCIF. He's in some deep negotiations uh for
00:00:18
the United States of America. From the SCIF. It's not there's no tea at the end. It's just tea. No tea. Oh, skip to my lou. Okay. He's in a SCIF
00:00:26
doing something with his BlackBerry and a bunch of generals. Nobody knows what's going on in Sacks's life, but he'll
00:00:31
crack in from the SCIF any moment now. But we'll start with this: you guys see that Pete Hegseth has
00:00:37
announced a PT and fitness test for the generals? Could you imagine if Sacks had to pass a PT?
00:00:42
Oh my god. They should totally make it for the administration. Sacks, we need you to do one push-up. Sacks,
00:00:48
what do they do if you don't pass? They remove you from your post? You probably get a cure period. We should do a push-up contest. That
00:00:55
would be great. Winner take all. How many push-ups can you do, Friedberg? You have to adjust for people's heights.
00:01:00
I'm I'm the tallest of all of you. I have a much longer limb system. What does that mean? So 20 for me is much harder than 20 for
00:01:05
you, Jason. I mean, 20 is easy for me at this point. Yeah. You're like Bilbo Baggins.
00:01:11
It'll take you like 8 seconds to put on a tank, man. Bilbo Baggins. How about Thor? I'm like Thor at this. I'm going
00:01:17
to my Daniel Craig era. Kell has a elbow to my Daniel Craig weight to weight ratio. That's highly
00:01:23
advantaged. All right, let's get started. Enough shenanigans. EA is what is your arm
00:01:28
length, Jake? Do you have a good arm length? My wingspan, my wingspan uh technically is enough to kick your ass with one hand
00:01:34
and tied behind my back. That's actually Let your winners ride.
00:01:41
Rain Man David Sacks. And it said we open sourced it to the fans and
00:01:47
they've just gone crazy with it. Love you. [Music]
00:01:53
Okay, EA is being taken private in the largest take-private deal in history. $55
00:02:00
billion. Man, that just stacks up to, let's see,
00:02:05
the Texas power company TXU in 2007, HCA Healthcare at $33 billion. This is a
00:02:10
large deal. Investors in the take-private include the Saudi PIF, Silver Lake,
00:02:17
and Friend of the pod, Jared Kushner's Affinity Partners. 210 bucks a share, 25
00:02:22
premium on the stock. Kushner's largest LP at Affinity, as you know, is the PIF as
00:02:28
well. The PIF has invested over $900 billion
00:02:33
in, you know, many things: Lucid Motors, LIV Golf, the SoftBank Vision Fund, Uber back in the day, Newcastle of
00:02:39
the Premier League, Electronic Arts obviously is in the video game business.
00:02:45
They were founded at uh Sequoia's office in 1982 in San Mateo. Shout out to our
00:02:50
guy Roelof Botha, who joined us for the All-In Summit. Their headquarters is still in Redwood City. Madden NFL,
00:02:57
the Sims. Oh, that's why you have the background of the Sims this week. Need for Speed. Pretty insane
00:03:03
deal here, Chamath, and this is a high watermark for private equity any way you look at it. And the PIF loves games.
00:03:11
They are the biggest shareholder in Nintendo, Savvy Games, Scopely. I mean, they just keep buying games. Uh, what
00:03:17
are your thoughts here on this deal happening right now? I really like it. Let me give you the
00:03:25
bull case and then let me give you what the bear case would have to believe. The thing to remember is that video games is
00:03:31
the anchor pillar of usage across the entire internet.
00:03:36
Last week at our poker game, we had Matt Bromberg join in just for dinner, who's the CEO
00:03:43
of Unity, and Alex Blum, who's the COO of Unity. And one of the stats that they
00:03:48
shared with us at dinner was that about 3 billion DAUs play games
00:03:53
which is just an incredible Exactly. It's an incredible, incredible stat. So in many
00:03:59
ways it's much bigger than social networking and social media, or as big. And in that, EA is sort of this 800-lb
00:04:06
gorilla. But I think the problem is that there have always been these gatekeepers, and I think that there's a risk and a chance that these gatekeepers
00:04:13
get eroded away. Specifically who I'm talking about are folks like Microsoft and Xbox. And at the point that this
00:04:19
company is going private, there's some really interesting things that are happening. So Xbox, I think the day
00:04:25
after the EA deal got announced, decided to hike prices 50% to their subscription
00:04:31
service. And what happened over the subsequent few days is that so many people tried to cancel that the site
00:04:36
went down. So what are you seeing happening? You have distribution gatekeepers
00:04:41
trying to raise prices and take share. And then you have the original IP owners
00:04:46
who have not had a well-funded way of fighting back in a category that is
00:04:52
basically as important and frankly more important than social media. So I think if you take an asset like this private,
00:05:00
it allows you to take your time to clean up the opex model, right? Figure out who
00:05:05
does what, be able to use the best of all these nextgen tools,
00:05:10
and then be able to find ways of finding distribution outside the scope of Xbox
00:05:15
and PlayStation so that you can take more of your share. If you do those things, this is a multi-hundred-billion
00:05:22
dollar asset. And in that, I think it could be just an enormous win. So, I think it's very smart. What's the bear
00:05:29
case? I think the bear case is extending a
00:05:34
theme that I've talked about here a few times which is I think the value of
00:05:40
patents and by extension IP and copyrights are going to go away
00:05:46
and in that there's going to be a spectrum where certain content IP
00:05:52
holders lose and other ones win. I think gaming is on the winning side to be honest and I think content studios in
00:05:59
general, like traditional content, the Disneys, the Hulus, the Netflixes, are on the losing side. But the bear case
00:06:06
would be that these tool chains allow the number of games to be built to
00:06:13
increase by two three four orders of magnitude and that they are distributed
00:06:19
by other places like the social media sites. I just think that that's a pretty low probability. So on balance, I think
00:06:27
that Jared and Egon did a killer deal. I really like it. And for people who don't
00:06:33
know, Unity makes the 3D software that people build games in. It's a public company, $16 billion, also backed by Roelof
00:06:39
and Sequoia back in the day. Incredible company. Friedberg, what are your thoughts on the
00:06:47
gaming industry versus say social media versus traditional media? We're seeing
00:06:54
massive amounts of money being put into each of these, but this is time and for this next generation, let's say
00:07:00
millennials and younger, we're seeing a big mix. Obviously, they don't have cable TV, so that's been plummeting, but
00:07:06
they do play games. They do like the YouTube, the Tik Tok, etc. And they do love social social media. What what's
00:07:13
the future here as you see it? One way to answer that question is to think about how people spend their time.
00:07:19
Do you spend more minutes on social media or on traditional media or playing
00:07:26
games and how is that trending? But importantly, which of those will accrue more benefit
00:07:32
and as a result drive more hours spent from AI? Is AI going to create more
00:07:38
social media engagement? Is AI going to create more traditional media engagement or is AI going to create more video game
00:07:44
engagement? And I think that one way to kind of think about this thesis is that AI is going to ultimately accrue to video
00:07:49
game entertainment far more than social media entertainment or traditional. Why is that? Why? Explain to me.
00:07:55
Because I think you can create dynamic, more engaging experiences that will
00:08:00
benefit from kind of a back and forth sort of relationship than you can with traditional content or with social
00:08:06
media. And what we see now in a lot of gaming systems that didn't exist, call it 12 years ago, is AI-driven players
00:08:15
embedded in the games that act and feel a lot more like real human engagement.
00:08:20
That is very hard to kind of mimic from traditional programming methods that were used in gaming. And so that makes a
00:08:26
a big difference. Like for example, if you're playing Fortnite, I don't know if you guys play Fortnite or have played Fortnite, but if you're a noob in
00:08:33
Fortnite, like you're an early player in Fortnite, you're mostly playing, even though you go online and play against
00:08:39
what are supposed to be kind of other players, you're mostly playing against AI because what they do is they tune the
00:08:45
AI to be easier to beat so that you can slowly develop your skills. Because what was happening early was they were seeing
00:08:50
a high degree of churn in Fortnite because kids would go on and play for the first time and they'd get paired up
00:08:56
with kids that were better than them and so they would never win and they would get frustrated and they would quit the game and stop. So the churn rate was
00:09:02
high. So AI unlocked higher engagement and higher retention on the Fortnite platform. And I think we're seeing that
00:09:08
in a lot of different gaming platforms now. So AI can be used for example to
00:09:14
maximally increase time, engagement, satisfaction, happiness. I think the the Saudis saw this and if they're trying to
00:09:20
diversify away from their oil holdings, entertainment, and how people spend their free time, which by the way, I
00:09:27
think is a general macro bet that everyone should consider making because if you believe in AI and you believe in
00:09:32
the improvements in productivity, generally speaking, people in the industrialized world will generally have
00:09:38
more free time on their hands and be able to support themselves with the deflationary effects of AI over time. So
00:09:44
if there's more time on people's hands, the general market for entertainment is growing. And if the general market for
00:09:49
entertainment is growing, gaming is the future of entertainment. And the future of gaming is AI. Now the the Saudis own
00:09:55
10% of this company prior to the deal. And I don't know if you guys have tracked the investments they've
00:10:01
made, but they've been extremely aggressive with gaming. So they have this like investment division called Savvy Games. And within Savvy Games,
00:10:09
they bought Scopely for 4.9 billion in 2023. And then earlier this year, they spent 3.5 billion to buy Niantic, the
00:10:16
company that makes Pokémon Go. And then they also own 4% of Nintendo. They own
00:10:21
6% of Take-Two. They own a sizable percent of Activision Blizzard. So they've put
00:10:27
quite a bit of capital in small investments in other gaming platforms. They own a few gaming platforms. So this
00:10:33
is clearly like a big thesis and a big investment that they see as the future of entertainment over time. Jared's
00:10:38
firm, Affinity, is going to own about 5% of the company post-transaction. The Saudis are going to be the majority owners. So, I think that this is going
00:10:44
to end up being the next big platform play for them and and it allows them to
00:10:51
make the important long-term investment in furthering the transition to AI and not have to worry about quarter-to-quarter
00:10:57
earnings, but really making a 10-year bet. And they do talk a lot about this Vision 2030. And if you look across those three
00:11:04
categories we've been discussing here, video game usage about 60% of US adults do it every week. Social media about 75%
00:11:11
of Americans use it every week and uh streaming, traditional media, the Netflixes, Disney Pluses of the world,
00:11:17
that's still 83%. So these are the three buckets of of people's time. Uh books
00:11:22
and going to the movies, those are obviously the big losers. You know, the market was totally
00:11:27
getting this wrong because the tick-tock of the deal is super interesting. When they were looking for the debt
00:11:33
financing, it was about 36 billion of equity, 20 billion of debt. They called Jamie Dimon, and Jamie basically ripped
00:11:41
the 20 billion in on the same day just because I think I think he also
00:11:47
could underwrite this pretty fast. I mean, some of the biggest deals are frankly so obvious that it just takes
00:11:54
the courage to put it together and then everybody's like, "Oh, this just makes so much sense." And then Andrew Wilson,
00:11:59
who's the CEO, is going to stay on. He's a great guy. Super super compelling.
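As a back-of-the-envelope check on the numbers quoted in this segment, the offer premium and financing split work out as follows. This is a hedged sketch; all figures are the hosts' approximations, not official deal terms:

```python
# Back-of-the-envelope check on the EA take-private figures quoted in the
# episode. All numbers are the hosts' approximations, not official deal terms.

def implied_unaffected_price(offer_price: float, premium: float) -> float:
    """Back out the pre-announcement share price implied by an offer premium."""
    return offer_price / (1 + premium)

offer_price = 210.00   # "210 bucks a share"
premium = 0.25         # "25% premium on the stock"
equity = 36e9          # "about 36 billion of equity"
debt = 20e9            # "20 billion of debt"

print(f"Implied unaffected price: ${implied_unaffected_price(offer_price, premium):.2f}")
print(f"Equity + debt financing: ${(equity + debt) / 1e9:.0f}B (vs. the $55B headline)")
print(f"Leverage: {debt / (equity + debt):.0%} of the capital structure")
```

Note that the quoted equity and debt checks sum to roughly the $55 billion headline figure, with debt a bit over a third of the capital structure, which is a fairly conservative leverage level for an LBO of this size.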
00:12:04
It's worth talking a little bit about the impact I think of private equity. If um you spend any time in the region, I'm
00:12:10
going to be in Saudi and Dubai in the first week of November doing my Founder University, and I've been out there
00:12:17
twice a year maybe for the last three years. They will tell you, whether you're in Doha, Abu Dhabi, or Riyadh, we've got six
00:12:24
or seven industries we really care about. Technology is at the top of the list. Private equity is at the top of
00:12:30
the list. Live entertainment and sports at the top of the list. And then actually hospitality also at the top of
00:12:36
the list. Real estate building new places for people to go. And if you look at private equity, pull up that chart I
00:12:43
had there. This is just stunning how big this industry is getting. You know, $5
00:12:49
trillion is what we're up to here. And it just keeps growing. I I think private equity is totally
00:12:54
screwed. I I don't think Silver Lake or Affinity or this deal
00:12:59
are screwed, but I think private equity in general is totally hosed. All right. Right. Well, it's it's gotten huge just since 2015 and tripling in
00:13:08
size. So, why is this I guess my question for the gentleman here and for
00:13:13
the audience, why is private equity becoming so large and what impact does
00:13:18
that have on society? If people can't put EA into their retirement account,
00:13:25
they can't put Stripe into their retirement account. If we take all the great companies and we start to privatize them, SpaceX, let's say never
00:13:31
goes public. What impact does that have on people's retirement accounts? Okay, look, I think I think the history of this is important. There was a
00:13:38
long-standing belief that the best way to generate the best risk adjusted
00:13:45
return, what does that mean? That means to manage through periods where the stock markets go down and to manage
00:13:52
through periods of volatility. The best way to do that was to have what's called a 60/40 allocation: 60% to equities and 40%
00:14:00
to bonds. Over many years, especially when we artificially suppressed rates at
00:14:06
zero through Obama, a lot of people started to move their allocations away
00:14:11
from 60/40, and they started to make more and more investments further out on the risk curve. The biggest beneficiaries of
00:14:18
that were venture capital, private equity, and hedge funds. The thing with private equity is that
00:14:25
because rates were zero, they had an infinite amount of borrowing capacity,
00:14:30
had very little downside to them, and so they were able to manufacture returns much faster than venture capital and
00:14:37
hedge funds could. So, as a result, you had an initial group of people that were defining the asset class, making a ton
00:14:43
of money, and then you had all these fast followers that said, "Well, if they're doing it, I can do it, too. So
00:14:49
far, so good." But then always what happens is then you have this flood of laggards that just flood the zone. And
00:14:57
it's these laggards that make it very difficult to generate returns because
00:15:02
they start overpaying for assets. They start mismanaging and undermanaging the assets that they do own. And so where we
00:15:10
are is that private equity has seen a very consistent way of returning money to help improve that 60/40 portfolio. As
00:15:17
a result they got a lot of money but then that created a lot of competition and so that's why you see this hockey
00:15:24
stick graph Jason and when you see that kind of graph it doesn't matter what asset class it is
00:15:29
the returns go to zero and so we've seen this in venture capital we've seen this in hedge funds
00:15:35
and we're now going to see this in private equity. Too much money going in. To be clear, what you're saying means you kind of exit it
00:15:44
right? There's there's no returns. And so again, I've said in any of these alternative asset classes, there's only
00:15:49
one thing you should always ask if you had to have one critical question.
00:15:55
What are your distributions? Don't show me your IRR. What is your
00:16:00
DPI? The distributions on your paid-in capital. And if the answer is zero,
00:16:07
then it is a very challenged asset class. And what I will tell you in private equity is that over the last
00:16:13
four or five years distributions have been few and far between.
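The distinction being drawn between IRR and DPI is simple to compute. A minimal sketch, with hypothetical fund numbers for illustration only (none of these figures come from the episode):

```python
# DPI = distributions to paid-in capital: cash actually returned to LPs
# divided by the capital they put in. Fund figures below are hypothetical.

def dpi(paid_in: float, distributions: float) -> float:
    """Distributions over paid-in capital."""
    return distributions / paid_in

# A fund can show a flattering paper IRR from unrealized markups while its
# DPI sits at zero -- the "critical question" is how much cash came back.
fund_realized = dpi(paid_in=100e6, distributions=220e6)   # exits happened
fund_paper = dpi(paid_in=100e6, distributions=0.0)        # all NAV, no cash

print(f"Realized fund DPI: {fund_realized:.1f}x")   # real money out
print(f"Paper-only fund DPI: {fund_paper:.1f}x")    # a challenged fund
```

The point of the metric is exactly the one made here: IRR can be inflated by unrealized marks, while DPI only moves when cash is actually distributed.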
00:16:20
So I think what's going to happen is that the money is going to come out of private equity and it's going to get concentrated into the few companies that
00:16:26
know what they're doing, of which Silver Lake has generated over, you know,
00:16:32
the last 15 20 years tens and tens of billions of dollars of distributions. They are just an
00:16:38
exceptionally well-run organization. They've done these huge buyout deals
00:16:44
successfully before. So, we need to go through that in PE. Where does the money go? The money's already leaked into
00:16:51
private credit, which is the next big bubble that's building. It looks like this chart that you just showed,
00:16:57
which is loaning businesses money. You know, it's super interesting because you make such a good point. What we're
00:17:03
seeing in private equity is these continuation funds. Now continuation funds are coming, Chamath, to venture. So
00:17:10
I've been getting pitched on these continuation funds where like hey take all your assets sell it to a new group of people and then reset the clock and
00:17:17
then there's never an exit. The good news is I will say the last year we've
00:17:22
seen a lot more activity for shares of our companies that are still private. So
00:17:28
the secondary market, Friedberg, is coming back in a major way. But I do get worried about these continuation funds
00:17:34
because now you're just moving an asset from one class to the other and we need to have a functioning IPO market. How
00:17:41
functioning is the IPO market today? Would we say it's completely dysfunctional?
00:17:46
How dysfunctional is the IPO market? Let me say it another way. And and how do we
00:17:51
correct that? And this leads into your new SPAC. Look, there are three ways to go public. There's the traditional IPO,
00:17:58
there's the direct listing, and then there's the reverse merger or the SPAC.
00:18:04
Up until I floated IPOA in 2018, I think the first way was really
00:18:11
the only way. I was involved in two direct listings, Slack and Coinbase.
00:18:18
And in both of those, what I learned is that, you know, it has the same vagaries
00:18:24
as the traditional IPO. So in the traditional IPO, you go to a bank, they underwrite you, they act as a
00:18:29
gatekeeper, and they take six, seven, 8% fees as a result, and then they allocate
00:18:35
what is essentially underpriced stock to their best customers. Then you see a
00:18:40
one-day pop, maybe a two or three day pop. All of those customers tend to unload and then the stock tends to drift
00:18:48
down. So the IPO is expensive and it typically is mispriced.
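The cost argument here can be made concrete: the issuer pays underwriting fees and effectively gives away the first-day pop on the shares it sold. A rough sketch with illustrative numbers (the 7% fee is within the 6-8% range cited above; the 30% pop is hypothetical):

```python
# Rough cost model of a traditional IPO: underwriting fees plus "money left
# on the table" when underpriced shares pop on day one. Numbers illustrative.

def ipo_cost(amount_raised: float, fee_rate: float, day_one_pop: float) -> dict:
    fees = amount_raised * fee_rate
    # Shares sold at the offer price immediately traded up by the pop, so the
    # issuer effectively underpriced the whole raise by that amount.
    underpricing = amount_raised * day_one_pop
    return {"fees": fees, "underpricing": underpricing,
            "total": fees + underpricing}

cost = ipo_cost(amount_raised=1e9, fee_rate=0.07, day_one_pop=0.30)
print(f"On a $1B raise: ${cost['fees'] / 1e6:.0f}M in fees, "
      f"${cost['underpricing'] / 1e6:.0f}M left on the table, "
      f"${cost['total'] / 1e6:.0f}M total")
```

Under these assumptions the underpricing dwarfs the explicit fees, which is the crux of the critique: the allocation to the bank's best customers is the expensive part, not the fee line.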
00:18:53
The direct listing you have a different dynamic which is the first trade is always the highest
00:18:59
trade and then it just goes straight down. That happened with Slack and it happened with Coinbase. So
00:19:04
Spotify would be in that group as well. Yeah. Yeah. With Slack I remember like I I was like offside a billion dollars and I was
00:19:09
like well I'm never letting this happen again. And so when I had the Coinbase thing, I sold it the first day. And I texted Brian. I said, "This is not a
00:19:15
directional indication of your company. It's the dynamics of the direct listing because I learned it the hard way that
00:19:21
the time to sell is on day one." So where does the SPAC come in, you know,
00:19:26
especially now in version two? Version two being the thing that I have been
00:19:33
tinkering and refining with and am trying to push in in this new version.
00:19:39
I think that it's creating an incredibly competitive vehicle where you can have a ton of money go into these private
00:19:46
companies, take them public at a very, very low cost of capital. And I think that should be very enticing.
00:19:54
So, you closed your financing. Can you just tell us what the capital raise was like as you went out and met with folks?
00:20:00
What do you hear? Yes. You know, Nick, maybe you can find it. You know that image of the Raptor
00:20:05
engines? Yes. super complex to being elegantly simple. Yeah. Nick, can you can you maybe just
00:20:10
throw that up? What I would say is like SPAC 1.0, of which I was, you know, right in the front of the parade, had a
00:20:16
bunch of misfires and it was complicated, but it worked. There were some hot fires that worked, but then
00:20:22
there were some clear misfires. And the whole point was to prove that you could create a competitive alternative to the
00:20:28
IPO. The thing that I'm the most proud of quite honestly is for all intents and purposes I started
00:20:36
a normalization of this vehicle that's now raised more than $150, $200 billion for American companies. I am
00:20:43
very proud of them. That's an important thing for the American capital markets. I think what we did in American
00:20:50
Exceptionalism is Raptor 2. It's not yet perfect, but I do think it tries to
00:20:56
improve on the things that I noticed were not working in Raptor 1. And in that is
00:21:01
a lot of the compensation and incentives. And so when I showed that to investors, they were quite excited. I
00:21:08
think that they want a competitive IPO market that brings many, many American
00:21:14
businesses to the public market so that they can be owned by everybody. The transparency they like, and the fact that
00:21:21
the incentives are such now where there's absolutely no compensation unless this thing really works.
00:21:26
And historically they received warrants in the company typically with a strike
00:21:32
price of $11.50, so 15% above the issue price of the stock, and founder shares that were basically
00:21:37
And there were founder shares. But like, did you have a reaction from them saying, hey, we want some warrants, we need a
00:21:44
little extra kicker here like there's some sort of desire for that? No, in fact it was the opposite. I think that
00:21:49
the institutional investors, and you know my investors in this, 98.7% of the capital
00:21:55
was allocated to these guys are the best of the best. You you know who they are.
00:22:00
So they're every single blue chip A+ institutional investor. And what they
00:22:08
wanted was great companies. They want great companies to be public. And the reason is the thing that, Friedberg, I
00:22:14
think you mentioned this before. When a good company gets public, the amount of money that they can raise in the publics
00:22:20
and then the amount of growth that they have in the publics far outclasses what they'll ever do as a private company.
00:22:27
And so they want the simplest and cheapest way of great businesses to get
00:22:34
out. Chamath, do you think that the transaction when you find a merger partner the traditional SPAC has been
00:22:42
announced as a merger concurrent with a PIPE being done where new investors are underwriting the valuation of the deal
00:22:49
and saying we like this company at this price cuz we are now going to write money in in the form of a PIPE, and
00:22:55
historically the PIPE was for common shares. So it kind of was like this is a good price and everyone felt good about
00:23:01
it. Number one, do you anticipate that there'll still be a PIPE being done concurrent with the merger in this
00:23:07
transaction? And then number two is do you think it'll look like a common PIPE? Because after the SPAC frenzy died down,
00:23:14
in order to get deals done, the PIPE started to get done with convertible preferred securities. So they were
00:23:20
senior to common and they almost were like debt. How do you think this is going to play out? Because a clean deal
00:23:26
has not happened in quite some time where a SPAC has announced a merger and simply raised money via common in the
00:23:34
form of a PIPE. It's a great question. I think it comes down to the underlying asset. But there are some incredible
00:23:39
companies that are private that if they go public
00:23:44
will be able to demand common PIPE capital. I think that the
00:23:49
future, maybe just prognosticating and guessing, what does Raptor 3 look like in this SPAC? I think the Raptor 3 will
00:23:56
look like where somebody a sponsor like me rolls everything up into one thing so
00:24:02
that it's already pre-wired from the beginning where I'll just speak to
00:24:08
a billion, two billion, three billion, whatever it is, flexible capital that can come in as common so that it's a
00:24:13
totally pre-baked IPO at a very fair price. I think that I think that that's what the Raptor 3
00:24:19
version of a SPAC will look like. So more capital, and then they put their full trust and faith in the sponsor to run the deal.
00:24:25
Well, then meaning there's no conversion risk that all the money comes over, right? It comes over, right? And so then you
00:24:31
have to fully commit. And you set your compensation to be a bit Elon-like in terms of your compensation as
00:24:38
the sponsor, which comes, if I read it correctly, Chamath, when it hits certain milestones
00:24:44
in terms of share price. Yeah. Nothing can be earned unless the stock is up 50%.
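The milestone structure being described (nothing vests until the stock is up 50%, with a further tranche at 75%) can be sketched as a simple hurdle check. The tranche dollar values here are hypothetical, since the episode gives only the price hurdles:

```python
# Milestone-vested sponsor compensation: tranches unlock only at price
# hurdles above the issue price. Tranche sizes below are hypothetical.

def vested_value(issue_price: float, current_price: float,
                 tranches: list[tuple[float, float]]) -> float:
    """Sum the tranche values whose price-gain hurdles have been met."""
    gain = current_price / issue_price - 1
    return sum(value for hurdle, value in tranches if gain >= hurdle)

# (required gain, tranche value): vest at +50% and again at +75%
schedule = [(0.50, 10e6), (0.75, 10e6)]

assert vested_value(10.0, 12.0, schedule) == 0.0    # stock +20%: nothing earned
assert vested_value(10.0, 15.0, schedule) == 10e6   # +50%: first tranche only
assert vested_value(10.0, 17.5, schedule) == 20e6   # +75%: both tranches
```

The contrast with SPAC 1.0 is that the old warrants and founder shares paid the sponsor regardless of performance, whereas under a hurdle schedule like this the sponsor earns nothing unless public shareholders are already well in the money.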
00:24:49
And then there's a tranche at 50. Then when the stock is up 75%, there's another tranche. And when the stock is There's no
00:24:54
founder warrants in the deal, or there are? There's no founder warrants. Nothing. I think this is great. You know, I was
00:25:01
asked By the way, the reason why this is important is all of those things that you guys mentioned increase the cost of
00:25:08
capital to the founder and to the private company board and to the employees. All that's unnecessary dilution. So now we take it all off the
00:25:15
table. Yeah. Smart. The thing I, you know, the observation I had at the time, not just for your collection of SPACs in the 1.0
00:25:22
era, but just all of them in general, and I tried to explain this to our syndicate members and investors as well
00:25:29
as the CEOs because a lot of my CEOs were like, should we do a SPAC? And one of them, Desktop Metal, did
00:25:34
this felt like venture investing. And you know, if you look at Opendoor, Virgin Galactic, um, Joby, which I don't
00:25:41
think was one of yours, SoFi, MP Materials, all of these companies,
00:25:46
you have to look at it if it is a venture type investment, 80% of venture goes to zero, 20% pays up for the other
00:25:53
80%. I think people were looking at this like it was Netflix and they were not
00:25:58
thinking of these companies and the stages they were at. Well, can I just ask a question? Yeah.
00:26:04
And then I'll drop it to a question, cuz SoFi and MP Materials did extraordinarily well. So in this class of
00:26:12
companies you're going to be taking out is it going to be the same early stage or are you thinking more robust more
00:26:18
predictable revenue let's call it um resilient revenue maybe rugged revenue?
00:26:23
I think it's the latter but I think it's also important to note that this time around I've tried to really minimize
00:26:29
retail exposure to this. I don't think that retail is well suited right now to have these things and what my my
00:26:35
honest advice is avoid maybe not all SPACs, but definitely my
00:26:40
SPAC, just avoid it. I think that there is more than enough liquidity on the
00:26:46
institutional side for us to do an interesting deal, but it fits in our portfolio and our construction which is
00:26:51
a very different risk model. And so I would hate that, you know, people are out on the risk curve without really
00:26:57
understanding the risks because Jason, you can't predict the market. You don't know where these things are going to go.
00:27:02
Yeah. I mean, Desktop Metal, 3D printing, this is like a very cutting-edge, nascent technology company, should have stayed
00:27:09
private a couple more years. Or people investing in it need to understand you're you're now acting like a venture capitalist, which means the return
00:27:15
profile and how the portfolio management works is distinctly different than doing
00:27:20
Netflix and Nvidia and whatever other publicly traded companies. I would just say do not invest in these things. Don't At least, you know,
00:27:27
just, I think you just inspire people to do it. I know that's not your intent. But when you say don't do it Stupid. I I'm
00:27:34
being very honest. Don't do it. No, no, I know. Don't buy SPACs unless it's like less than 1% of your portfolio
00:27:40
would be my advice. Before we move on, can I just make one comment? And I'd like you guys to
00:25:46
know about the private equity stuff, because Chamath made a comment that private equity is baked. But I think one of the things to take note of in this
00:27:52
take private of EA and we talked about it is the theme of AI empowering EA to
00:27:58
kind of transform the business. And Jared's brother Josh has at Thrive been
00:28:04
executing a rollup of CPA accounting firms that he's been applying AI to to
00:28:09
reinvent that business. Oh, is he really? Yeah. Oh, I should talk to him because we have
00:28:14
an investment in a company called TaxGPT.com that is basically like copilots with AI for accountants that's
00:28:20
doing spectacular. So what he's done is he's bought these kinds of traditional accounting firms at some multiple of EBITDA, and then he can transform the
00:28:27
business with AI and really create a new opportunity. And I've said like I think this is one of those few moments in history where there really is an
00:28:34
opportunity to beat the market and make money in the public markets if you can be thoughtful and selective about the
00:28:40
companies that stand to benefit from an AI execution strategy. Because in all of
00:28:46
these traditional kind of markets where you have competition, everything's commoditized and the market is mature.
00:28:52
It's very hard for any of these players to differentiate product service and obviously you know unit economics. But
00:28:58
with AI, it's completely transformative and has transformative potential in nearly every industry. So as a public
00:29:04
market investor, if you can identify those opportunities, select them where the management team has the right
00:29:10
leadership in place to execute against this, you could make real money. The problem is most of these companies are not led by folks that understand AI or
00:29:17
software first. And so I think there's an opportunity for more buyouts. They're not going to
00:29:22
be of the $55 billion scale. It's worse than that. In what sense?
00:29:28
So we at 8090 have done the dance with
00:29:33
all the big major private equity firms. And here's how it goes. It always goes
00:29:39
the same way. The partners love it because they're looking at minimal
00:29:44
distributions, companies that are like good but not great in many cases
00:29:50
and they want to see improvements to EBIT and performance so that they can either sell them or move them out.
00:29:56
And you're... sorry, you're saying you've looked at this with their portfolio? All of them. Yeah. All of them
00:30:01
with their existing portfolio companies. So the GPs are like, this is genius. We should do it. Then they're
00:30:07
like here's a handful of companies to go talk to. And I'll be really honest with you, what you find in most private equity
00:30:13
portfolios are B and C companies run by C and D folks. Yes.
00:30:19
And so the ability for them to go and embrace this is basically next to none. So if I look at my customer distribution
00:30:25
and concentration at 8090, okay: run-rating into nine
00:30:32
figures, already working on a three to $400 million deal. Okay? Not a single
00:30:38
dollar comes from a private equity firm. Although we spent initially a lot of time trying to sell it, trying to sell
00:30:43
our software factory and trying to sell work into them. It's really hard and
00:30:48
it's what you said before Freeberg, which is the people incentives at these businesses are misaligned to the AI
00:30:54
outcome, right? And you can't fire these people and I don't think the right answer is to fire
00:31:00
them. So I don't know what the right answer is. This is why I think private equity is very challenging. Do you think there's a power
00:31:06
law situation where perhaps a handful of investors in the public markets and perhaps a handful of investors in the
00:31:11
private markets can identify and then put the right people in place and execute against these strategies like
00:31:17
Josh is trying to do with his... I think Josh is smart, so I think Josh will figure it out no matter what. What
00:31:22
I'm saying is if I can show you 20, 30 customers, a ton of revenue, all
00:31:29
these white papers that show upside, and I still can't get it done inside one of these companies, I think it's not us,
00:31:35
it's them, right? So, it's not inherent in traditional private equity to do this either, which maybe begs the
00:31:41
question: is there a new kind of private equity that can execute this? Maybe that's an opportunity, like Josh is
00:31:46
showing, right? Like, he's a venture investor that's executing a private equity strategy, and maybe that becomes
00:31:52
the play. I think if this works well, two of our biggest customers are individual deca
00:31:57
billionaires who own businesses and they're like you're doing this. Mhm. So to the extent that Josh looks more
00:32:04
like that, which is an owner of 100% of the business where it's like, you're going to do it, then I think it can work. So I think the
00:32:11
owner-operated model is the only way the AI transformation really works, and then the other end
00:32:18
of the spectrum it's the public market CEO who realizes that they have to do
00:32:24
something real because they'll otherwise lose their job or they'll be disrupted. Those are the two cohorts that I feel
00:32:29
today are on their front foot. Everybody else is, like, sticking their head in the sand. Just on the EA front, I forgot to
00:32:36
ask you: Sir Demis, my Greek brother, didn't he show a...
00:32:41
It's always the Greeks who get these things done. Yeah. Didn't he show, like, the 3D engine
00:32:47
that would make like infinite games? Yeah. So, it's not actually a 3D engine. It's a class of these AI models that can
00:32:55
render what the experience looks and feels like, a 3D world, but it doesn't have an underlying kind
00:33:01
of traditional object um rendering engine. It doesn't have a traditional 3D physics engine. So, it's a new way of
00:33:08
experiencing these kinds of world-interaction systems. And there are several startups. I think, um, Fei-Fei Li,
00:33:15
the Stanford AI professor. Yeah. And she has one of these. That's a virtual-worlds company that has the same
00:33:20
principle. I asked Bramberg and Alex about exactly this at dinner.
00:33:26
What was their take? Yeah. He said it's just really, really hard to get these things to actually be
00:33:31
legitimate engines at the scale of what Unity offers for the quality of game that needs to be made for it to work.
00:33:38
The interim step is going to be that the assets in it are created by AI. That's what I've seen a lot of startups doing.
00:33:44
So if you want to make a character, you know, you drop in characters and they would be done in real time.
00:33:49
I think the whole thing is Unreal and Unity as the rendering engine, and the AI sits on top
00:33:57
and the AI basically can render objects can render concepts can render structure can render the direction that you as an
00:34:03
engineer would typically provide to the Unity or Unreal 3D engine, and that's going to unlock not just video
00:34:09
games but also in film. You're 100% right. Can I tell you an example? Yesterday there was um you know
00:34:14
in our group chat, a bunch of people sent around Sora, the slop app. Yeah. And I downloaded it just to play with
00:34:20
Sora yesterday, and the first video that came up was exactly this. It was like an ATP tennis
00:34:26
match. Yeah. Where it was a guy's face, like, imagine you, playing against,
00:34:34
like, a Federer. And then I thought, well, what if he was playing against his friend and that was the actual video
00:34:39
game. To your point, you get away from all this IP licensing, gatekeeping stuff, and you can just get to good
00:34:45
games faster, good content faster. I think they'll be adaptive in terms of the competition, so you're not playing
00:34:51
somebody who's going to just dominate you. It just get 5% better every time you play it. You'll get 4% better and
00:34:57
it'll just make it perfectly challenging so you don't quit, and you'll learn as you go. It's really going to be
00:35:02
interesting. And the same will exist in content, J-Cal. Like, you'll make shorts and films, and then the ones that have
00:35:08
the most engagement the AI prompting system will get better and better and ultimately it will yield like
00:35:16
you know, bits of content that people love. You could see that happening with Star Wars or Marvel. If all of a sudden Silver Surfer
00:35:21
is an interesting character to you, or Ahsoka is interesting to you, it'll sort of make that world or enhance that
00:35:28
character and tell you more of their backstory. And that can be very interesting. How can you sit in your seat and, like,
00:35:35
make fun of me, call me a nerd, when you actually know the name of this Star Wars character? I don't even know who you
00:35:41
are. Very important character. Ahsoka is Anakin Skywalker's Padawan. She is a very important character. If you watch
00:35:47
the Clone Wars, you would know this: the animated series that threads through the saga. Watch it.
00:36:02
Actually, oh, look who dropped in. Oh, David Sax
00:36:02
is here. Did you get out of your... Were you in a SCIF or something? What's going on, Czar?
00:36:08
I was in some meetings, but actually, no, I was just uh buying some domain names. Oh, you are? Did you get mahalo.com?
00:36:14
I got mahalo.com for the bargain price of $1 million. That's what it's worth. Go to mahalo.com. I'm selling it for a
00:36:21
million. I mean, it's in the dictionary. Yeah, I have some old assets. Somebody else should use them. I have
00:36:26
Begin.com and I'm going to be working on that in partnership, probably with one of the large... I might give you an equity stake
00:36:34
for that. I'll give you a... Mahalo is the second most important word after
00:36:40
aloha in the, um, Hawaiian language. I'm surprised Benioff hasn't tried to
00:36:46
ask you for it. I was just texting with Benioff. Give it to him as a gift, dude. He's a great guy. Just give it to him.
00:36:52
I will give Benioff mahalo.com if he gives me four weeks
00:36:58
in one of his Hawaii resorts per year. He would do that. Oh, for the next 20. Oh my god. Imagine
00:37:05
J-Cal for 80 weeks. Oh my god. For 80 weeks as a house guest. He could be there. He could be there.
00:37:11
It doesn't matter. I'll give him the money so he buys it. Don't worry, donate it to his nonprofit foundation. Then you
00:37:16
can take a tax write-off. Look at everybody... When I have something to sell, the guy with the
00:37:21
lowest net worth on the program when I'm trying to pay off my jet, you guys all
00:37:26
have criticism. How come I can't wet my beak? Let me ask you a serious question. So
00:37:31
you had investors in Mahalo, right? Yes. And I assume this is their domain. This is their domain. It will go to them.
00:37:37
Oh, so it will. Oh, okay. It will go to those investors. You're paying off liquidation preference, correct?
00:37:42
Okay. It's just sitting there. So now instead of losing 100%, I'll lose 99.
00:37:48
Something like that. Uh, it's just... startups are hard, folks. But I have begin.com and I've been
00:37:53
talking to folks. You know, Mahalo was originally a human-powered search engine, like Wikipedia, which we're about to get to, and my concept was to do
00:38:01
comprehensive search like Naver.com or Daum in Korea. I had seen those services. Yeah. And it turned out to be exactly
00:38:07
like Perplexity, but at the time we tested machine learning, which is what everybody called AI back then, and it just didn't work. So, we were trying to
00:38:13
hand-roll search results and then back them up with, you know, computer-generated
00:38:18
ones, algorithmically generated ones, but the tech wasn't there. Now, um, I want to do something again with begin.com. I'm really excited about that
00:38:25
domain name. All right, listen. We brought up slop. Let's get into it. Two slop apps in a fortnight here. Uh, no pun
00:38:32
intended. Zuck and Sammy the Bull have both released, uh...
00:38:38
The Bull? What a deep pull. Sammy the Bull Gravano. There it is. And, uh, here's a look at
00:38:44
Sora. It's objectively extremely impressive. Here's Sam Altman. People
00:38:49
don't know this: early in his career, when he was starting OpenAI, he didn't have the money from Elon.
00:38:56
And here's Sam Altman stealing an H100. Here's Sam Altman. Also, this is when he
00:39:02
was, um, storming the Capitol on January 6th. Here he is when he was working at Google. Yeah, lots of... but it's really
00:39:10
good and they are basically taking a ton of risk and solving some problems with
00:39:15
IP. As we all know, the IP in the outputs is where people think you're going to have to be really thoughtful or get a bunch
00:39:21
of lawsuits. On this app, you can opt in and make your persona like Sam did
00:39:27
available to everybody to use. So that whole concept of notable persons
00:39:32
allowing their image to be used: you opt into that, and that's pretty clever. And you can make it so
00:39:38
your friends can, you know, basically make videos of you but nobody else can. It's a thoughtful way of doing it.
00:39:43
However, very controversially, this thing had everybody's IP in it and you
00:39:49
have to opt out if you don't want your IP used. That's going to get him another whole collection of lawsuits to go with the New York Times and Ziff Davis ones. And
00:39:56
there have obviously been a bunch of settlements now, uh, Anthropic settling their book thing for $1.5 billion. So,
00:40:03
anybody play with these tools yet? And what do you think, folks? And what's the point of these? Do we think this is like a TikTok competitor
00:40:10
tomorrow? Do you think it's just a back door to training data? What do you think? The closest thing is a TikTok
00:40:15
competitor, but I used it. I thought it was okay. But again, the thing that I
00:40:21
have that I keep in mind whenever I try these apps for the first time is today is the worst it'll ever be.
00:40:27
Sure. It only gets better from here. And so if you look at the starting point, it
00:40:33
won't take but a year where this thing I think or maybe two years where this thing is legitimately excellent. It has
00:40:40
to get the scripting right. It has to get the prompting right. It has to be a little bit easier for you to use. There
00:40:46
was a bunch of prompts that I used that were rejected by Sora, or by the IP, right? Well, it just said use me, but I
00:40:53
couldn't validate that I was me. And so, you have to take a picture of yourself. The app is a little clunky right now, but you're right. It's going
00:40:59
to get better in each version. The one by Zuckerberg is called Vibes. You know,
00:41:04
I was looking at these, Sax, and I don't know that this is intended to be like the next great social media app as much
00:41:11
as it's a data play to get folks to generate training data. When you see them, do you have any thoughts on them other than
00:41:18
interesting? Yeah, I haven't played with it yet, so it's hard for me to say. Freeberg, you got any thoughts on it?
00:41:25
Just... uh, no, I don't have, like, thoughts. I think, you know, we're kind of early
00:41:30
innings. I do think there's like new categories of media that none of us are really considering today. Like
00:41:36
traditional media, as I've mentioned in the past, is like centrally produced and then broadly consumed. And I think that
00:41:41
there are models of media that are going to emerge that are going to create new business categories or new business models, and also new media categories
00:41:48
that are all about kind of distributed production and not necessarily like central production, distributed
00:41:55
consumption. So that kind of changes things quite a bit, and I think maybe this is going to start to open that door a bit. One of the things...
00:42:02
because I thought about this and I mentioned this in the past, where I'm like, everyone's going to make their own movie, their own video game, their own music, but there is this notion of, like,
00:42:11
shared cultural context. Like everyone wants to talk about, you know, how did the 49ers do this weekend or did you
00:42:18
guys see that show Adolescence? Did you guys... Like, we want to have a conversation about some shared stories. That's the
00:42:25
basis of kind of societal interaction and memetics. So I think, like, there are elements of this being the beginning of
00:42:32
the enabling tools, but I don't think we've actually seen what's going to happen, which is how do you take one story and then create a distributed way
00:42:38
of consuming that story where everyone experiences and consumes it differently. So I do think like this notion it's like
00:42:45
hey, everyone's making fun of Sam, or... maybe there's some cultural context about Sam Altman that we all share, and then we're all, like, engaging
00:42:52
with Sam Altman in different ways, you know. So I think we're very early and we don't yet know kind of how it's all going to play out, but I
00:42:58
think that's really critical. To bring it back: something is lost, because we
00:43:03
used to all talk about the latest Tarantino movie or the latest, you know, Sopranos episode. We don't do it
00:43:10
anymore. We do share stuff, we talk about tweets and stuff, and there are other forms of groups, but it's not
00:43:18
like it used to be, where 30 or 40 million people would see Raiders of the Lost Ark and it would be the discussion of the
00:43:24
summer or whatever it is. And so I literally bought 20 tickets to the new Paul Thomas Anderson, One Battle After
00:43:30
Another, just so I could have a conversation with 20 friends about the new PTA. And so people really are
00:43:37
longing for this shared experience. Paul Thomas Anderson: he did The Master, There Will Be Blood, one of the
00:43:44
greatest ever. He is a top-five director of all time. But I know you don't care about culture. Um, but is he like... is he like
00:43:51
Michael Bay? No, the opposite of that, actually. Michael Bay makes things that go boom.
00:43:57
Paul Thomas Anderson makes things that make you go hmm. Michael Bay: super cool, fun to hang out with, fun to party with.
00:44:03
Right. Okay, well, way to bring it back to you. Um, okay, hold on, you dropped a name here.
00:44:09
I don't know Paul Thomas Anderson, but it was a heck of a film. Sax... Sax is actually very cultured
00:44:15
when it comes to cinema. Did you see it yet, Sax? I have not seen it yet. No, it's of the moment
00:44:21
and I heard it was anti-conservative. So, it doesn't have some
00:44:28
leftwing take. No, it kind of mocks the left and the right. It's kind of mocking both extremes. You'd love it.
00:44:34
I think you very much appreciate it. All right, I'll check it out. Yeah, I would check it out. Uh, hey, I have an idea. Why don't we find a topic that's interesting to talk about?
00:44:40
Yeah. Okay, great. Yeah. Well, if you contributed to the docket or showed up on time, maybe we could do that. Unbelievable. Just so you
00:44:48
know, the inner workings right now, there's a little resentment in the group because one of us decides to change the
00:44:54
time of the pod for four weeks in a row and then show up half an hour late. I won't say which person that is, Sax. Uh,
00:44:59
Sax, but here's an interesting topic, some red meat for you. DeepSeek, the Chinese AI lab, just dropped their latest
00:45:06
model, V3.2-Exp. It's faster, it's cheaper, and it has a new feature called DSA,
00:45:14
DeepSeek Sparse Attention, which makes training and inference faster on long-context tasks. The key
00:45:20
takeaway is it can reduce API cost by up to 50%. The new model charges 28 cents
00:45:26
per million input tokens, 42 cents per million output tokens. Claude, which is a leading model from Anthropic that a lot of
00:45:32
developers use, a lot of startups use, is like $3 and $15, so 10 times and 35 times more
00:45:38
expensive. Obviously, people are cutting their prices pretty quick. But, uh, Sax, this is your wheelhouse as our czar of
00:45:44
crypto and AI for the United States of America. What are your thoughts here on
00:45:49
the continued execution of the Chinese government with DeepSeek?
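The price gap Jason cites is simple per-token arithmetic. The quoted "10 times, 35 times" multiples imply roughly $3 per million input tokens and $15 per million output tokens for Claude; all figures in this sketch are illustrative snapshots from the conversation, not authoritative rate cards:

```python
# Per-million-token API pricing arithmetic, using the figures quoted
# in the discussion (illustrative, not authoritative rate cards).

def workload_cost(input_mtok: float, output_mtok: float,
                  in_price: float, out_price: float) -> float:
    """Dollar cost for a workload measured in millions of tokens."""
    return input_mtok * in_price + output_mtok * out_price

deepseek = {"in": 0.28, "out": 0.42}   # $/M tokens, as quoted
claude   = {"in": 3.00, "out": 15.00}  # $/M tokens, implied by the 10x/35x gap

# Hypothetical monthly workload: 500M input tokens, 100M output tokens.
ds = workload_cost(500, 100, deepseek["in"], deepseek["out"])
cl = workload_cost(500, 100, claude["in"], claude["out"])

print(f"DeepSeek: ${ds:,.0f}/month")                          # $182
print(f"Claude:   ${cl:,.0f}/month")                          # $3,000
print(f"input gap:  {claude['in'] / deepseek['in']:.1f}x")    # 10.7x
print(f"output gap: {claude['out'] / deepseek['out']:.1f}x")  # 35.7x
```

At this hypothetical volume the bill differs by more than 16x, which is why startups test the cheaper open models even when switching is painful.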
00:45:55
Well, I want you to hear Freeberg's thoughts on this because he was paying attention to this, weren't you? Yeah, I mean I think there's a total
00:46:01
rearchitecture underway, and we're at the early stages of driving down cost per token, in terms of dollars and energy. My
00:46:07
understanding is there's actually a lot of work going on with US labs right now in a similar kind of track that's going to result in similar results. Maybe
00:46:14
they're a little bit ahead of the curve, but we should really pay attention to the curve. I think you know what do the
00:46:20
models say in terms of energy demand in terms of cost per token if these
00:46:26
architectural changes really do drive down costs 10x, 100x, 1,000x, 10,000x over
00:46:32
the coming months. And this is open source, so, just so everybody understands, it's available on
00:46:38
AWS, it's available on GCP; at least 3.1 is, I don't know if 3.2 is available there now. But I'm hearing from a lot of
00:46:44
startups, I don't know if you're hearing this in the field, Chamath, that they're testing it and playing with it, in some cases using it, because it's so much
00:46:50
cheaper. Are you seeing that? We are a top 20 consumer of Bedrock. So,
00:46:57
let me tell you what it looks like on the ground. We redirected a ton of our workloads to Kimi K2 on Groq
00:47:04
because it was really way more performant and frankly just a ton cheaper than OpenAI and Anthropic. The
00:47:13
problem is that when we use our coding tools, they route through Anthropic, which is fine because Anthropic is
00:47:19
excellent, but it's really expensive. The difficulty that you have is that
00:47:24
when you have all this leapfrogging, it's not easy to all of a sudden just, like, you know, decide to pass all of these
00:47:32
prompts to different LLMs because they need to be fine-tuned and engineered to kind of work in one system. And so, like
00:47:38
the things that we do to perfect codegen or to perfect back-propagation on Kimi
00:47:44
or on Anthropic, you can't just hot-swap to DeepSeek. All of a sudden, it comes out and
00:47:50
it's that much cheaper. It takes some weeks, it takes some months. So, it's a complicated dance, and
00:47:57
we're always struggling as a consumer. What do we do? Do we just make the
00:48:03
change and go through the pain? Do we wait on the assumption that these other models will catch up?
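Chamath's point, that prompts are engineered per model and can't simply be repointed, is why teams end up maintaining a per-model adapter layer. A hypothetical sketch (the adapter structure, model IDs, and prices are invented for illustration, not 8090's actual stack):

```python
# Hypothetical sketch of why hot-swapping LLMs is non-trivial: each
# backend needs its own prompt template and output parsing, so a router
# carries per-model adapters instead of one shared prompt.
from dataclasses import dataclass
from typing import Callable

@dataclass
class ModelAdapter:
    model_id: str                       # e.g. "kimi-k2" (illustrative)
    cost_per_mtok_out: float            # $ per million output tokens
    build_prompt: Callable[[str], str]  # model-specific prompt engineering
    parse: Callable[[str], str]         # model-specific output cleanup

def route(adapters: list[ModelAdapter]) -> ModelAdapter:
    """Pick the cheapest backend. The catch: a new, cheaper model only
    becomes routable after someone writes and tunes its adapter."""
    return min(adapters, key=lambda a: a.cost_per_mtok_out)

adapters = [
    ModelAdapter("claude-x", 15.00,
                 lambda t: f"<instructions>{t}</instructions>",  # tuned for A
                 lambda out: out.removeprefix("Answer:").strip()),
    ModelAdapter("kimi-k2", 0.42,
                 lambda t: f"[task]{t}[/task]",                  # tuned for B
                 lambda out: out.strip()),
]

chosen = route(adapters)
print(chosen.model_id)                      # kimi-k2
print(chosen.build_prompt("summarize Q3"))  # [task]summarize Q3[/task]
```

The routing decision itself is one line; the weeks of migration cost live inside `build_prompt` and `parse`, which is the refactoring pain described above.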
00:48:10
So... people are making tools now that, by the way, make
00:48:17
it easier to switch between them. No. And like, you know, this weekend a different company with a huge model came
00:48:24
to us and gave us a preview of their next-gen model. Okay, and it's incredible.
00:48:29
but then when I sit on Monday morning with my team and I'm like okay what do we do? We don't know what to do. Do we
00:48:36
cut it? Do we move over and say, great, we'll refactor all these workloads to run on this new model? It's a
00:48:44
really hard problem, and it's getting worse the more complicated the tasks that we undertake. Okay. And just for people who don't know, Kimi is made by Moonshot
00:48:51
AI. That's another Chinese startup in the space. Sax, your thoughts? Well, I think this is actually a really
00:48:56
interesting topic, this topic of open source. I'm a big fan of open-source software because it's a check on
00:49:03
the power of big tech in a way. What we've seen in the past in the history of
00:49:08
technology is that these major categories end up getting dominated by one or two big tech companies and they
00:49:14
have all the power and control. And open source provides an alternate path, right? Because the community of open
00:49:21
source developers just puts things out there and then you can take it and run it on your own hardware and you're not
00:49:26
dependent, right? It's a path to sort of software freedom, if you will. So, so far so good. I think the thing that is
00:49:33
now tricky about this is that all the leading open-source models are from
00:49:38
China these days. China has made a really big push on open source. Obviously, DeepSeek is an open source
00:49:44
Chinese model. That was the first big one. Kimi is one. Qwen from Alibaba.
00:49:50
And so, if you want the US to win the AI race, well, we're all of two minds about this. On the one
00:49:56
hand, it's good that there are open-source alternatives to the closed-source proprietary models. On the other hand,
00:50:03
they're all coming from China. Now, there were some American efforts that have been important. So, Meta most
00:50:10
notably has invested billions of dollars in Llama. But the release of Llama 4, I think, was
00:50:16
considered disappointing by a lot of people. And now there's statements by Meta that they might be backing away from open source and just going
00:50:22
proprietary. OpenAI released an open source model, but it's nowhere near their frontier.
00:50:29
And there are some startups that are trying. So there's one called Reflection that looks promising; it's developing an
00:50:36
open- source American model. But so far, this is maybe the one area in AI where
00:50:42
the US is behind China: this area of open-source models. I'd say every other part of the stack, closed models, chip
00:50:48
design, chip manufacturing, semiconductor manufacturing equipment, even data
00:50:53
centers, I would say we're ahead. But this one area of open source is a little bit concerning.
00:50:58
Interestingly, Sax, two things of note. The first is OpenAI: the "open" was originally
00:51:05
because they were supposed to do open source. So, that's kind of hilarious. But the second is that Apple, which is
00:51:11
the furthest behind of everybody, they have a really interesting open-source model. So when you're behind, like Apple is or the
00:51:18
Chinese were, you do open source, and when you're ahead, like OpenAI became with ChatGPT, you close it
00:51:24
down. But that... uh, can I tell you? OpenELM. OpenELM? Yeah.
00:51:29
Efficient language models from Apple. Keep an eye on that one. Can I tell you what's going to make this open source closed source battle even
00:51:35
worse? Because effectively what this is is the US versus China: the US is closed and China is open, at least at the scaled
00:51:43
models that work. But that doesn't have to be the case, right? Because we could release open models too.
00:51:48
No, no, you're right. I'm just saying today if you look at the conditions on the field, the closed source, highly performant models are American. The
00:51:54
open-source highly performant models are Chinese. And you would say, okay, well, what is the next downstream thing?
00:52:01
It's what Freeberg mentioned, which is the energy and the cost of generating these output tokens. And I talked to
00:52:07
somebody yesterday who runs a huge energy business and I have to tell you it's not in a
00:52:13
good place. Meaning you saw I think this week where the residents of Indianapolis
00:52:20
were able to reject or get their city to reject a billion dollar data center that
00:52:26
Google was going to build near Indianapolis largely because of concerns
00:52:31
of price inflation around electricity. And what this energy CEO told me is,
00:52:37
look, the next five years are baked. And if we don't find some compelling solves,
00:52:44
and I'll tell you what the two ideas were, but if we don't find some compelling solves, electricity rates will double in the next 5 years. Now if
00:52:52
you think about how then consumers will view the use of AI
00:52:57
and then if you think about companies like us and others trying to use the cheapest version so that we are
00:53:02
minimally impacting the downstream cost of these things because it will become an energy problem. This is a very
00:53:09
complicated thing. Now, his point is it's a huge PR crisis, because if you want to
00:53:14
take big tech, which is already viewed negatively, and make their perception even worse, you start to finger-point
00:53:20
at them and say these guys are the reason my electricity costs have doubled in the last 5 years. That is no bueno for
00:53:26
them, and they need to find an off-ramp ASAP. It's a bad look.
00:53:33
Doubling, plus this could take your jobs, right? Yeah, it's terrible. Whether you believe that's true or not, that is the
00:53:38
perception. There are two off-ramps that he suggested which I think are worth considering.
00:53:45
Off-ramp number one is what's called a cross-subsidy, which is essentially to say that they pay a rate card, which they
00:53:53
can absorb with all their free cash flow, materially higher than what other ratepayers would pay in that geographic
00:54:00
area. So the homeowner his or her electricity costs stay flat to down. The
00:54:06
data center costs are higher and it's the Metas, the Googles, the Apples, the
00:54:12
Amazons who have hundreds of billions of free cash. They absorb it. That's that was idea number one. And idea number two
00:54:19
is to start to set up some mechanism so that they can install things like batteries at every single home in and
00:54:26
around these data centers to allow those homes to have a better chance of actually um absorbing some of this
00:54:33
inflation without having to pay it. That's a really good idea, and this is playing out, Sax, in Virginia in a major
00:54:38
way because that's where data center alley is and 40% of the energy in Virginia now is going to data centers.
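The cross-subsidy off-ramp is straightforward rate arithmetic: hold the residential rate flat and recover the rest of the utility's revenue requirement from data-center load at a premium. Every number in this sketch is invented for illustration:

```python
# Toy cross-subsidy: data centers pay a premium rate so residential
# rates stay flat despite a higher total revenue requirement.
# All numbers are invented for illustration.

residential_load_mwh = 8_000_000   # annual residential consumption
datacenter_load_mwh = 2_000_000    # annual data-center consumption
required_revenue = 1_200_000_000   # dollars the utility must recover

flat_residential_rate = 100.0      # $/MWh, held flat for homeowners
residential_revenue = residential_load_mwh * flat_residential_rate

# Data centers absorb the remainder out of their free cash flow.
datacenter_rate = (required_revenue - residential_revenue) / datacenter_load_mwh
premium = datacenter_rate / flat_residential_rate

print(f"data-center rate: ${datacenter_rate:.0f}/MWh ({premium:.1f}x residential)")
# -> data-center rate: $200/MWh (2.0x residential)
```

The mechanism only works while data-center operators can absorb the premium, which is the free-cash-flow argument made above.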
00:54:45
This is becoming acute. So what are your thoughts here, Czar? Well, Chris Wright spoke to this pretty well at the All-In Summit in terms of
00:54:51
what we have to do. I mean, there's no question that AI is going to create a huge need for power over the next 5 or
00:54:56
10 years. I think on a 5 to 10 year time frame, the answer is probably nuclear or
00:55:02
at least that's a big part of it. But nuclear takes at least 5 years. Within the next 5 years is probably gas, you
00:55:08
know, natural gas. But the issue there is there's a huge backlog for gas turbines, basically the engines that
00:55:15
burn the gas to create power, and there's like a two-to-three-year backlog
00:55:20
to spin those up. So the question is, what do you do in the next few years? And I think Chris Wright talked to this, and I've heard this from other energy
00:55:26
executives, which is: we just need to squeeze more out of the grid. If we were to shed just 40 hours a year of peak load to,
00:55:35
say, backup generators, diesel, things like that, you could get an extra 80 gigawatts out of the grid. This is what one
00:55:40
energy executive told me. The reason is because they build the grid and they have regulations on it based on the
00:55:48
peak, right? Which is basically the coldest day in winter or the hottest day in summer. And the same way that you
00:55:53
know you build your church for Easter Sunday, the rest of the year it runs at 50%. Same thing with the grid. And so if
00:56:00
they could just reduce the peak 40 hours, if they could shed that load to backup generators, to diesel, things
00:56:06
like that, then they could run the grid to squeeze an extra 80 gigawatts out of it. And I think that's the bridge over the
00:56:13
next few years that we need to then get a lot more gas and then eventually some nuclear as well. But unless you want to
00:56:20
keep talking about electricity, I think there's some other things to talk about on the open source cuz I think it's a
00:56:26
pretty interesting topic, actually. And if...
00:56:31
Can we just go back there? Yeah.
00:56:37
because I can't pay $3 an output token and then also pay for all this actually I want to I want to ask you
00:56:43
when you're running like Kimmy or something like that. So I think it would be good just to explain to the audience how this works because I think there's a
00:56:49
lot of confusion about what it means to be an open-source model. A lot of people think that when a Chinese company
00:56:56
publishes one of these models it's still somehow theirs. No. But the reality is once it's
00:57:01
published, it's no longer theirs. It belongs to anyone who wants to take that code and you're not running that on a
00:57:07
Chinese cloud or something like that. The data is not going back to China. You're taking that model and you're running it on your own infrastructure.
00:57:15
Can you just explain this? Yeah. So when I first started 8090, my only solution was uh Bedrock, which is a
00:57:21
service that Amazon provides that allows you to essentially get inference as a service. Right? So as we are building
00:57:28
our product and we need inference and we need inference tokens, Bedrock basically
00:57:34
handles everything. So it's what AWS is, but for this vertical of AI,
00:57:40
right? So they have the servers. These are in American data centers. They're managed by Americans and what they do is
00:57:47
they take a handful of models and they make sure that they can support usage of
00:57:53
those models. That was how we started. But as with everything, we have to
00:57:59
manage our costs and our operating profile. And so we're always looking for, are there other models and other
00:58:06
places other than Amazon that can service our needs? Because in fairness, Amazon is very expensive. So, a
00:58:13
different company that I helped get off the ground, Groq with a Q, they have a cloud, and what they've been doing is
00:58:21
they've been working with, initially, Llama, then they worked with OpenAI to
00:58:26
bring their open source model, but they also brought online a couple of these Chinese models. And what they do, exactly
00:58:32
as you said, Sacks, is they take the source code and they basically fork it,
00:58:37
and now it's implemented domestically, on American soil, by Americans, inside of
00:58:45
an American data center. So China gave us the road map, if you will, the architectural
00:58:51
plans, but we, as in, you know, the American company, in this case Groq, built the house and then launched it. And so now we
00:58:58
as 8090 basically made a cost decision to move to this open source model because it was just materially cheaper
00:59:04
right. And what Groq with a Q will give you, you're the application company, 8090, they're like Amazon for us, they'll
00:59:10
give you an API. Exactly. So the same way, if you want to use a closed model like OpenAI or, you know, ChatGPT, they'll give you an API. You
00:59:17
submit prompts. They give you answers. Basically, tokens in, tokens out. What Groq does is they will take
00:59:23
this open source model, run it on its own infrastructure, and then give you the API so that you can then get tokens
00:59:29
in, tokens out through their API. Well, for me, as a consumer, it reduces us to a pure economic decision. Where is
00:59:36
it cheaper? And you know, it's not dissimilar to the last generation of the internet. You'd run on AWS, but then
00:59:42
you'd bid it against GCP. You'd bring in Azure. You'd say who is cheaper, because ultimately you're running a database,
00:59:48
you know, you're running, I don't know, pick your service, Snowflake, right? It didn't really matter where it was.
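The "pure economic decision" being described, same tokens in, tokens out, lowest bill wins, can be sketched as a cost comparison. All prices below are hypothetical placeholders, not actual Bedrock or Groq quotes:

```python
# Hypothetical per-million-token prices for the same open-weights model served
# by different hosts. These numbers are illustrative only, not real quotes.
PRICES = {
    "bedrock":   {"input": 3.00, "output": 15.00},
    "groq":      {"input": 0.60, "output": 0.99},
    "self_host": {"input": 0.25, "output": 0.25},
}

def monthly_bill(host, input_mtok, output_mtok):
    """Dollar cost for a month of usage, token volumes given in millions."""
    p = PRICES[host]
    return p["input"] * input_mtok + p["output"] * output_mtok

def cheapest(input_mtok, output_mtok):
    """The 'pure economic decision': identical workload, lowest bill."""
    return min(PRICES, key=lambda h: monthly_bill(h, input_mtok, output_mtok))

print(cheapest(100, 50))  # -> self_host
```

Because all of these hosts expose the same tokens-in, tokens-out API shape, the workload itself doesn't change when you switch vendors, which is what makes the choice reducible to price, just as it was with AWS versus GCP versus Azure.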
00:59:54
You were just really trying to find the cheapest vendor. Right. Now, here's what's compelling about it. So, first of all, like you said, it's cheaper to just run
01:00:00
it on your own infrastructure if you know what you're doing. Also, enterprises like it because it's more customizable and there's going to be a
01:00:07
lot of fine-tuning of these open source models for specific applications 100%. And enterprises frequently want to
01:00:13
run these models on prem in their own data centers because they want to keep their own data on their own
01:00:18
infrastructure. But now the challenge is you've got these models that they're no longer Chinese. They've been forked.
01:00:24
It's an American company but they originated in China. That's right. And they could be running on some
01:00:30
critical infrastructure, and that does raise issues. I mean, what is Groq doing, I guess, to, like,
01:00:36
test whether these models are safe, whether they could be backdoored. I mean, how do they think about that?
01:00:42
They have an entire pipeline of stuff that they do, the details of which I don't exactly know because I've not
01:00:47
asked exactly what they run through, but yeah, they're rigorous in this. They go through an incredibly rigorous process. They basically do, like, safety testing to
01:00:54
make sure. Absolutely. So, I mean, a lot of people think that if you run a Chinese model, the data must be going back to China, but it's not if it's being run on your own
01:01:00
infrastructure. I think the issue is more theoretical that like could a Chinese model somehow be backdoored with
01:01:06
an exploit or vulnerability. Well, if you take a compiled version, sure, but if you just take the open
01:01:11
source and you do it yourself, no. Right. Well, that's the thing. So, I mean, and if someone did discover a
01:01:16
vulnerability, it would get widely shared in the community very, very quickly. I think at this
01:01:22
point you can expect that every single major company that is in security that
01:01:28
is a cloud vendor, and also every single major model maker, is trying to
01:01:34
prove and invalidate how the other models are inferior or bad in some way and so that's where the competitive
01:01:40
cycle I think is really valuable, because you do have the best and the brightest computer scientists. Like, you know,
01:01:45
yesterday I was just talking to a certain person, he's Italian, a leading security guy at one of these
01:01:52
model makers. He's in charge of this security stuff. They're hammering everything to try to
01:01:58
figure out whether there's a vulnerability, because it slows these other folks down. So that made me feel
01:02:04
quite positive that we haven't seen anything yet on any of these models which is to say that generally everybody
01:02:10
has actually been a pretty good actor so far. The other piece to this puzzle, Sacks, is
01:02:15
there's a lot of crypto distributed projects. The one I've been working on is Bittensor and TAO. I think you've
01:02:21
also done a deep dive on this, Chamath, and I'm a partner in, you know, an emerging
01:02:27
crypto fund called Still Core Cap, and we're buying TAO and we're looking at Bittensor and all of these subnets that
01:02:35
are being made to do distributed computing and this is a big push for Apple as well. A lot of these M4 Mac
01:02:40
minis you've seen out there. Their plan is to put all of this, all these LLMs,
01:02:46
Sacks, on people's personal computers and then distribute them and have this, like, SETI@home, and an incentive layer. And
01:02:53
I think that's going to be a big part of this. People are not going to want their AI jobs to go to the cloud necessarily.
01:02:59
They might want to do it locally and I think that's where the phones and all this silicon is going with um you know
01:03:04
Apple's big focus on it. It's going to be Yeah. Well, brave new world. Yeah. You bring up an interesting point.
01:03:10
You know, in the early years of this AI revolution, I'm talking about like 2023, 2024. I mean, this started in the last
01:03:16
three years. There was this analogy that AI was like nuclear weapons. I mean, you
01:03:22
hear the the doomer crowd, the safety advocates saying this that like AI was this really threatening technology.
01:03:29
And they would even say things like GPUs are like plutonium, you know, things like that. And I think that model of the
01:03:35
world is just wrong, right? Because of what we're seeing. And Justin actually had a pretty good line about this. He
01:03:41
says nobody needs nuclear weapons. Everyone needs AI. And it's true like every consumer, every business is going
01:03:47
to want to run AI. A lot of them are going to want to run it on their own infrastructure. Consumers are going to
01:03:52
want to run it on their own phone. You're going to have an AI that's highly personalized to you. And so everyone's
01:03:57
going to have AI. It's not like a nuclear weapon where we want to stop all proliferation.
01:04:02
AI is, first and foremost, a consumer product that is going to proliferate, and so the question is,
01:04:09
bearing that in mind how do you then create an appropriate response for the
01:04:14
national security risk. But this idea that we're just going to stop AI and only have two or three companies who
01:04:14
have it, which I think was the view a few years ago among policymakers, is ridiculous. Even now. Yeah, they were thinking in
01:04:20
very centralized terms, and I think what we're seeing now is, regardless of what certain policymakers might want, it
01:04:34
it's already highly decentralized, right? You've got five major American disclosed source companies. You've got
01:04:40
eight major Chinese models and then you've got everything that's happening with startups. So, this is going to be
01:04:47
highly decentralized and verticalized, right? All the Hugging Face models, there are specific ones for
01:04:52
images, specific ones for video. Like, it's it's going to be super fragmented. The vast majority of this activity is
01:04:58
benign. I mean, that's the thing. These are business solutions. These are consumer products. These are viral
01:05:04
videos. Most of the stuff does not rise to the level of a nuclear weapon or something like that.
01:05:10
This is a good chance for us maybe to talk about AI regulation. There is uh a lot of and and maybe we'll get to
01:05:16
Wikipedia as well, but there's a lot of states that are starting to look into
01:05:22
regulating AI. California SB53,
01:05:28
the Transparency in Frontier Artificial Intelligence Act, is working through the system. It's going to serve as a
01:05:33
template possibly for other states. It was introduced in January as an alternative to the more sweeping bill,
01:05:39
the SB 1047. That bill would have required AI developers to conduct extensive safety
01:05:46
tests before rolling out the models. It got a lot of pushback from tech, obviously, and Newsom ultimately vetoed
01:05:53
it. But this new law focuses only on the most advanced large frontier models that
01:05:59
we just talked about. And it requires companies to release a framework for knowing how they're approaching safety
01:06:04
issues, including standards and best practices, whatever that means, and however safety is defined.
01:06:10
These are companies, I guess, in this definition, that have half a billion in annual revenue. I don't know how they
01:06:15
pick that out, but it requires these companies to release transparency reports before deploying. So they're
01:06:21
going to be like the App Store, I guess, if this gets through, to approve frontier models with updates. Oh, that sounds
01:06:27
great. You got to go to the government to release a new model. Your thoughts, David Sacks?
01:06:34
As czar of AI, I think it's very concerning. There's a regulatory frenzy happening at the
01:06:39
states right now. Just to be very clear about what happened in California, there was an original bill, SB 1047, that
01:06:46
was incredibly obtrusive. Newsom vetoed that, but now they've passed a new one, which is called SB 53.
01:06:55
And like you said, it's not as burdensome and intrusive as the previous
01:07:00
version. It focuses on making frontier AI models report safety risks.
01:07:08
They're supposed to report if they have... Can I stop you there for a sec? What is the safety risk they're going to be
01:07:14
required to report? It's such a nebulous term. What safety? What, that it's going to jump out of the computer
01:07:20
and murder me? Safety that it's going to give me the wrong answer? They're supposed to
01:07:25
report on potential catastrophic harms related to cyber attacks, bio threats,
01:07:31
model autonomy, which is the Terminator scenario. And they're supposed to
01:07:37
okay, let the government know if there's a safety incident. I mean, look, all these
01:07:42
things are quite nebulous. It's almost like a nuclear power plant having to report if there was an incident. Are any of these in your mind
01:07:50
thoughtful? Let me just interrupt for a second. I think it's the equivalent
01:07:56
of saying I need any factory to report to me on the risk of something of a
01:08:02
nuclear explosion. Even though the factory might not be working with nuclear material at all, you see what I'm
01:08:08
trying to get at here? I'm confused. I mean, it effectively uses terminology that makes everyone nod
01:08:13
their head and say, "Oh, yeah, that makes sense. That's a good idea." When the reality is that the legislators have
01:08:20
actually no concept of what they're talking about. They have no concept of how these models are built. They have no
01:08:26
concept of how they're deployed. And they're using language that they think is inevitably going to result in giving
01:08:32
them ultimately tools and control over a private market system. And that's fundamentally what I think a lot of this
01:08:37
comes down to. Think about this issue that's going on with free speech in California, this hate speech bill, SB 771,
01:08:44
that's sitting on the governor's desk to be signed right now where effectively the state of California's administrators
01:08:50
have the ultimate say of what is deemed hate speech and not which if you think about it if they had this bill in
01:08:57
Alabama during the civil rights era there would have never been the ability to have the protest and realize the
01:09:02
equal rights that arose from the civil rights movement because the government would have said those are inappropriate
01:09:07
hate speech things that you guys are saying. And we're now putting those same tools in the hands of the legislators. They're going to do the same thing with
01:09:13
AI. They're giving enormously powerful tools to the legislators to let them decide what is and isn't appropriate for
01:09:20
private market actors when they actually have no sense and no sensibility about what they're talking about.
01:09:25
So, yeah, I actually think that's a really important point. Just let me give you some stats on this regulatory
01:09:31
frenzy that that's happening. So, all 50 states have introduced AI bills in 2025.
01:09:38
There's been over a thousand bills in state legislatures. 118 AI laws have
01:09:43
already been passed across the 50 states. The red state proposals for AI in general have a lighter touch than the
01:09:49
blue states. But everyone just seems to be motivated by the imperative to do something on AI, even though no one's
01:09:55
really sure what that something should be. Exactly. And there's no real agreement on like what all these AI regulations are
01:10:00
supposed to do. So, they're just making things up or what the risks are. That's what I'm trying to get at. So,
01:10:05
let me ask you a specific question. Yeah. Well, I was going to finish the point about California. So, look, California, they've kind of gotten to
01:10:12
this point where now it's about reporting on all these safety risks. And if this is all it was, then it would
01:10:19
just be basically a bunch of red tape and it wouldn't be so bad. The problem is that you've got to multiply this by
01:10:25
50 states. So, you've got 50 different states, each with their own reporting regime, which is going to be a trap for
01:10:30
startups. They've all got to figure this out about what they're supposed to report on, what the deadlines are, who to report to. I mean, this is like very
01:10:38
European style regulations. Actually, maybe even worse than the EU because the EU tried to basically harmonize to get
01:10:43
to one authority. We're going to have 50. They're going to have one. But the other problem is that this is just the camel's nose under the tent. So even in
01:10:50
California, Scott Wiener, who's the legislator who did SB 1047, now he did this one. He's got a bloc of legislators,
01:10:57
and they have 17 more AI regulation bills that they want to pass. So this is just the beginning. And if you want to
01:11:03
see where this is going, okay, look at Colorado. We should talk about this Colorado bill because this has already
01:11:09
been passed into law and it's called SB24-205, Consumer Protections for Artificial
01:11:15
Intelligence. It was passed all the way in May of 2024. So, it was one of the first to pass. Even though they didn't
01:11:22
really know what they were trying to regulate, no one's quite sure how to implement it. But what the law does is
01:11:28
it bans something they call algorithmic discrimination. Okay? And algorithmic
01:11:34
discrimination is defined as unlawful differential treatment or disparate
01:11:39
impact based on protected characteristics. So things like age, race, sex, disability. If any of those
01:11:47
factors drive an AI decision and it results in a disparate impact, then both
01:11:54
the developer of the AI model and the deployer, which means basically the business that's using it, can be in
01:12:00
violation of this law and they can be prosecuted by the Colorado attorney general. Let me give you a practical
01:12:05
application here. So, let's say that you've got someone like a mortgage loan officer who's reviewing applications, okay? And
01:12:13
let's say they don't even discuss race. It's not on the form, okay? They're just using race-neutral criteria,
01:12:19
like a credit rating or financial holdings, something like that. If the result of their decision nevertheless
01:12:26
had a disparate impact on a particular protected group, its decisions could be found to be discriminatory. And
01:12:33
moreover, the developer of that model could be liable even though their model
01:12:38
just gave an answer that under the circumstances was truthful. The only way that I see for model developers to
01:12:45
comply with this law is to build in a new DEI layer into the models to
01:12:50
basically somehow prevent models from giving outputs that might
01:12:56
have a disparate impact on protected groups. So, we're back to woke AI again. And I think that's the whole point.
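To make "disparate impact" concrete: one common way regulators measure it is the EEOC's "four-fifths rule," comparing approval rates across groups. The Colorado statute does not necessarily define the test this way; this is a minimal illustrative sketch of the general concept:

```python
# Sketch of a disparate-impact check in the style of the EEOC "four-fifths
# rule": a group is flagged if its selection rate falls below 80% of the
# highest group's rate, even when the underlying criteria (credit score,
# financial holdings) never mention the protected characteristic.
# Illustrative only; this is not how SB24-205 defines the test.
def selection_rates(outcomes):
    """outcomes: {group: (approved, applicants)} -> {group: approval rate}"""
    return {g: approved / total for g, (approved, total) in outcomes.items()}

def has_disparate_impact(outcomes, threshold=0.8):
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return any(r / best < threshold for r in rates.values())

# Race-neutral criteria can still produce unequal approval rates:
print(has_disparate_impact({"group_a": (80, 100), "group_b": (50, 100)}))  # -> True
```

This is why an outcome-based test is so different from an intent-based one: the flag trips on the statistics of the results, regardless of what inputs the decision actually used.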
01:13:02
Yeah, that's the whole point of this Colorado law. But let's get Chamath in on this discussion. Chamath? I think that this is really, really
01:13:08
dumb, what's happening. If you have 50 sets of rules, what you will have are
01:13:13
some conservative versions of AI. You'll have some progressive leaning versions
01:13:19
of laws. These 50 sets of laws will essentially just render this industry impotent and incapable of maximizing
01:13:27
itself and and actually doing what's necessary to drive productivity and GDP on behalf of the country.
01:13:33
There is no conceivable way, as Friedberg said, that anybody in Sacramento or
01:13:38
Little Rock or, you know, name your state capital will have the intellectual wherewithal to get to an answer as good
01:13:46
as the federal government will, and as Sacks will, just to be totally honest with everybody. So what should happen here is
01:13:53
that there needs to be a complete moratorium, and the federal government should be
01:13:59
given the time to figure out what the framework should be, so that there is one set of rules. Now, if that
01:14:07
doesn't happen and this is allowed to stand there is a perfect example of where this has happened before and that
01:14:13
is in the car market because in the car market what happened was there is a complete set of rules in California
01:14:22
for emissions that is entirely different than the rest of the country and you can look and see what it did now that's just
01:14:29
two sets of rules. And what, let me finish, okay,
01:14:35
these two sets of rules, going from one set of rules to two, what did it do? It drove most of these
01:14:41
companies to go towards barely break even or massively money losing. It has
01:14:46
been something that the entire industry has been fighting back on for now 10
01:14:51
plus years now. Can you imagine instead of two sets of rules you have 50? I
01:14:57
think you know what the economic consequences will be. You'll render this entire category incapable of being able
01:15:04
to generate any positive economic output. So I guess the steel man if we were to
01:15:10
make one is transportation, education, abortion, taxes, alcohol,
01:15:17
cannabis. I think I mentioned those are all state... Cannabis is a poison and, uh, it is
01:15:23
the worst thing in the world, right? But for you... Okay, that's your opinion. Great. But
01:15:29
should states have some general... ...are trash? Oh, okay. We know your position on that. I'm talking about the different, which is,
01:15:35
what should states... ...zombies. Perfect. I don't disagree with that statement.
01:15:42
The question I'm asking is, we let states, just to steel-man this for the audience,
01:15:47
decide how they want to execute against things like taxes, alcohol, education,
01:15:54
abortion, transportation. Should, David Friedberg, states have some rights here? This is... I'm just steel-manning here. I'm
01:16:00
not saying this is my opinion. But if this is the most transitional technology of our lifetime,
01:16:06
shouldn't states have a say or what's the argument for states having a say? It's the United States. It's a federated
01:16:12
republic. I am 100% in favor. I think what we're pointing out is the idiocy of these decisions, for number one. Number
01:16:18
two, so the internet created a virtual
01:16:24
network system for media, communications, content, productivity.
01:16:31
So, you know, we're talking about something that stretches across the federal landscape. What needs to happen is there needs to be federal
01:16:36
preemption. So, the federal government, Congress, needs to pass a law that says, "Here are the standards that
01:16:43
we are going to set, or here are the rules that we think are relevant for AI. Here are the things that states can and
01:16:48
can't do, if we want this country to succeed on the opportunities and
01:16:54
advantages that will arise from AI." The second thing I'll say is that much of the law that's being drafted by
01:17:00
these state legislators are regulatory oversight laws, not laws that define a
01:17:07
new civil or criminal penalty because of something you did that caused harm. They
01:17:12
are specifically written in such a way that they say we need to have oversight. We need to have review. We need to have
01:17:18
control over your systems because we get to review them. They don't say, for
01:17:23
example, if your AI kills someone, you are going to jail. That is what they
01:17:28
should say. And in fact, one could argue that much of the civil and criminal statutes that already exist in the
01:17:35
states cover much of the harm that is already being talked about as the potential safety risk associated with
01:17:41
AI. You don't actually need more because at the end of the day if the AI system, the producer of the tool, the user of
01:17:47
the tool causes harm to someone or something or some business, there is already statute to protect against that
01:17:54
harm. The statute that's being drafted is all about oversight. It is about giving the government the regulatory
01:18:01
control, the ability to go in and interrogate and investigate and create approval systems on whether or not what
01:18:08
you're creating as a private market business or citizen is appropriate to be used. And it is one of many points of
01:18:14
overreach that this federated republic has been able to withhold itself against historically. And after 250 years, the
01:18:23
day may be up. This makes sense. So, Sacks, in the case of a large language model being constructed in a
01:18:31
non-thoughtful way, so that it could be used to do cyber attacks and, you know,
01:18:36
dox people, or, I don't know, be used for impersonation, the law
01:18:42
should be able to... I'm trying to think of a scenario here where they do these security things that would be concerning,
01:18:48
and the law should... I don't know, if OpenAI allowed theirs to go hack credit cards? That's already
01:18:54
illegal, right? It's already illegal to conduct a cyber attack. And if you manage to take
01:18:59
an AI model and use it as a tool to perform a cyber attack, that's still going to be illegal. Same thing in Colorado. Okay,
01:19:06
they've got this bill that they want to outlaw algorithmic discrimination, but discrimination is already a violation of
01:19:12
the law. So, what they're doing there is they're not just going after the
01:19:17
business that's performing discrimination. That's already illegal. What they want to do is get into the
01:19:22
tool itself, right? And they want to make the developer liable. If their model creates an output that supposedly
01:19:30
ends up creating a disparate impact in a decision. And imagine if we did this with the internet. Imagine if we went back to the
01:19:36
start of the internet and we said, "Hey, if someone uses the internet to do something bad, therefore the government needs to approve everything that's done
01:19:41
on the internet." I mean, you can talk about mobile communications. You can say, "Okay, Verizon's responsible if
01:19:47
people use it in a terrorist attack. Verizon's not responsible if people use it to coordinate a bank robbery. That's
01:19:52
so obvious. So, yeah, this does seem like it's overreach. Sacks, what is the situation on Capitol
01:19:57
Hill in having a conversation about creating federal preemption, passing a bill that says the federal
01:20:03
government's going to set standards around AI utilization that states cannot kind of intervene on and creating a
01:20:09
mechanism that allows this market to to develop and allows things to prosper. Well, here here's the situation is in
01:20:14
the big beautiful bill, there was a federal moratorium on state AI regulation, and I think it was
01:20:20
well-intentioned and well motivated by the fact that we do see this huge knee-jerk reaction to state legislators
01:20:27
wanting to do something without knowing what it is they want to do. However, there was not enough Republican support. There wasn't enough Republican or
01:20:34
Democrat support for it. And I think that part of the reason why Republicans in particular have been opposed is just
01:20:40
because there's so much anger at the big tech companies right now for all the censorship that happened during
01:20:46
especially COVID, but even before, and you still see it. You saw it with this Wikipedia news where they're banning all
01:20:53
conservative publications from being sources. There's just a lot of anger towards the big tech companies and tech
01:20:59
bros, and basically there's a lot of Republicans who don't want to get on board with anything that is perceived as
01:21:05
helping tech. Now the reality is who does that ultimately benefit? I mean
01:21:10
ultimately it benefits the blue states who are in the lead on this type of regulation. It's Gavin Newsom who just
01:21:17
signed this new bill. It's, you know, again, it's Jared Polis in Colorado who ultimately signed this Colorado law. And
01:21:23
if and if there is no federal standard, what you're going to see is that the blue states will drive this ban on quote
01:21:30
unquote algorithmic discrimination, which will lead to DEI being promoted in
01:21:36
models, which is what the Biden administration wanted. You will see the return of woke AI at the state level.
01:21:42
It's not something any Republican should want. I mean, I understand the justifiable
01:21:47
anger at these tech companies because their behavior in the past has been really bad towards conservatives. I
01:21:53
mean, they did engage in a lot of censorship, shadow banning, demonetization, debanking, all that kind
01:21:59
of stuff. So, I get it. But we have to look at what the results are going to be. And the single federal standard is
01:22:05
the best way to make sure that we do not have woke AI, that we do not have
01:22:10
insanely burdensome regulations that allow China to basically get ahead of us in this AI race. And it's to ensure that
01:22:16
we actually have truthful, unbiased AI instead of highly ideological AI.
01:22:22
Do you think you can get it done? Let me go to Polymarket. "US enacts AI safety bill in 2025." Not getting done
01:22:29
this year. Well, here's the good news. It doesn't really matter what I think. The important thing is what President
01:22:35
Trump thinks. And in his July 23rd speech on AI, he was really clear that there needs to be a single national
01:22:41
standard for AI. He said it was impractical. It doesn't make sense to have 50 different regulatory regimes and
01:22:48
that that could cost us the AI race. And he would like there to be a single federal standard just like he promoted
01:22:54
for vehicle emissions because again, we didn't have a federal standard there. And then it was California taking the
01:23:00
lead and then the blue states set the standards. President Trump didn't think that made sense for California to be
01:23:06
setting the rules for the whole country. So the feds preempted that. And I think we should do the same thing on AI.
01:23:11
That's what the president basically said in his speech. So I think the administration ultimately will support this. And I think more
01:23:17
Republicans will come on board as they realize what the blue states are doing here is not helpful for conservatives.
01:23:25
It's not helpful for having an unbiased information environment. I'm torn on this one. You know, I
01:23:31
moved to the great state of Texas to get rid of, you know, to have certain freedoms that we have here that we don't
01:23:36
have in other states. And I kind of like the idea of states having certain rights, but I don't like the way these laws are being written. So, I remain
01:23:43
torn, and the devil's going to be in the details on this one. Chamath had to bounce. Well, do you like the Colorado
01:23:49
law? Would you like to have... No, of course not. So it's how these laws are executed that, you know, is my concern. You know, I had this
01:23:56
concern with gun rights in California, like you should have the right to own a gun and then they're just like, well,
01:24:01
you can't have a gun. Okay, well, you know, and then the states have to go back and forth in these lawsuits to see
01:24:09
can New York City or San Francisco ban guns. And one of the reasons crime is out of control in some of these places is
01:24:15
because homeowners can't have guns, and the stand-your-ground laws, etc., etc. And one of the nice things about this
01:24:21
country is you can pick a state, where, hey, I want to live in a state where abortion's legal. I don't want to live
01:24:26
in a state where abortion is legal. I want to live in a state without taxes, state taxes, ones with taxes. You get to
01:24:32
choose. It's one of the powerful things and we get to debate these things in real time. So I do have a concern of
01:24:37
centralized government and overreaching federal governments, especially with the way executive power is being deployed
01:24:43
these days, from Obama to Biden to Trump. There's too much executive power, in my mind. So I have concerns on both
01:24:49
sides of it, but you know it's this is a devil's in the details of the execution and I trust you to come up with
01:24:55
something good as our civil servant. So come up with something good, Sacks. Well, we will. But, you know, just
01:25:00
to go back to one of your points on states' rights, look, there's a commerce clause in the Constitution. And the reason that exists is to create a
01:25:06
seamless national market economy. One of the reasons why the US has such a strong economy, why it's the number one economy
01:25:12
in the world, is because we have a single national economy, which is the largest market for products. Imagine if
01:25:19
we had 50 separate markets, each with their own rules and regulations. And then doing business in the US would be
01:25:25
like Europe. Remember, one of the reasons why the US dominated the internet in the '90s is because if you
01:25:31
launched a startup in America and you won the American market, you were basically right there in terms of
01:25:36
winning the global market. Whereas, if you were in a European country and you won your local country, whether it was the UK or the Netherlands or France or
01:25:43
something, you would just win a small part of Europe, and then you would have to go figure out all the rules and regulations to get into just the other
01:25:50
30 European countries, never mind the rest of the world. So, it's that seamless national market that's given
01:25:56
our companies the scale they need to then dominate across the world. And if
01:26:01
you restrict that by making every state have different laws for every product, we're going to lose that massive
01:26:07
advantage that we have. Here's the thing. You know, I look at the car standards which Chamath brought up, Friedberg, and, uh, you know,
01:26:15
Trump, I guess, doesn't want to have California having their own car standards. That got rid of 70% of the
01:26:20
pollution in California. I was in favor of that. I wanted to see higher standards, not lower standards, because
01:26:26
I don't want to pollute. And the smog over California, especially Los Angeles, was insufferable at times.
01:26:33
Those standards, which led the nation, which led the world: did they add extra cost? Of course, but it made California
01:26:39
a great place to live because it's car culture there and people were dying and taking years off their lives from the
01:26:44
smog. So that's an example of it I think working really well. And I am for cannabis regulation and for it being
01:26:50
legal. And California led the country in that whereas other states want to ban
01:26:55
cannabis and they don't want to have higher standards for pollution. I like the fact that California led in those
01:27:01
two ways. Now it's all in the execution of course. And so the problem is that because California is such a big market, those vehicle
01:27:07
emission standards that may or may not have been right for California apply to every other state because the car
01:27:12
companies can't manufacture different models for different states. Nor should they have to. They did, though.
01:27:18
Practically, they did produce different models for different states. But yeah, do you want to have different AI
01:27:24
models for every state? Do you want to have a DEI model for Colorado?
01:27:30
In the case of cars, I do like the fact that California did push the car companies to make cleaner cars.
01:27:37
Now, in the case of AI, that's why I was asking you which safety concerns you have, cuz I'm trying to find a safety
01:27:43
concern that we can all say is a legit concern for AI, and we can't come up with one. So, that's the interesting
01:27:49
part about this is, like, these are obviously overreaching laws right now, because we can't come up with something
01:27:54
where AI is going to jump out of the computer and do something in the real world that regular laws don't account
01:28:00
for. We can't come up with an example here, and we're deep in this industry. Can you come up with a single example of
01:28:05
AI doing something bad in the world that we should be concerned about that isn't covered by existing laws? I can't.
01:28:12
If somebody in the audience figures that out, please email me. Another amazing episode of the All-In Podcast. Great to
01:28:18
see you, Chamath, who had to jump. David Friedberg, and of course, my bestie, my
01:28:23
bestie, David Sacks, our czar, getting it done in DC for the country. Well done,
01:28:28
and we'll see you all next time on the All-In Podcast. Bye-bye. We'll let your winners ride.
01:28:35
Rain Man David Sacks. And it said we open sourced it to the
01:28:41
fans and they've just gone crazy with it. Love you, Queen of Quinoa.
01:28:47
[Music]
01:28:52
Besties are gone. That's my dog taking a notice in your driveway.
01:29:00
Oh man, my habitasher will meet. We should all just get a room and just have like one big huge orgy cuz they're
01:29:05
all just useless. It's like this like sexual tension that they just need to release somehow.
01:29:13
Your feet. We need to get our [Music]
01:29:25
I'm going all in.

Badges

This episode stands out for the following:

  • Best overall: 60
  • Best concept / idea: 60

Episode Highlights

  • EA's Historic Take-Private Deal
    Electronic Arts is being taken private in a historic $55 billion deal, marking a high point for private equity.
    “This is a large deal.”
    @ 01m 53s
    October 03, 2025
  • The Rise of Private Equity
    Private equity has grown significantly, raising questions about its impact on society and retirement accounts.
    “Private equity is totally owed.”
    @ 12m 54s
    October 03, 2025
  • The Future of SPACs
    Discussion on the evolution of SPACs and the potential for clean deals in the future.
    “I think that’s what the Raptor 3 version of a SPAC will look like.”
    @ 24m 19s
    October 03, 2025
  • AI's Transformative Potential
    Exploration of AI's impact on traditional industries and investment opportunities.
    “AI has transformative potential in nearly every industry.”
    @ 28m 58s
    October 03, 2025
  • New Media Categories
    The emergence of distributed production models in media and their societal implications.
    “We're kind of early innings in new categories of media.”
    @ 41m 30s
    October 03, 2025
  • AI vs Nuclear Weapons
    Jason argues that while nuclear weapons are unnecessary, AI is essential for everyone.
    “Nobody needs nuclear weapons. Everyone needs AI.”
    @ 01h 03m 41s
    October 03, 2025
  • California's AI Regulation
    California's SB53 aims to regulate AI safety, but faces pushback from the tech industry.
    @ 01h 05m 28s
    October 03, 2025
  • Algorithmic Discrimination Law
    Colorado's SB24-205 bans algorithmic discrimination, holding AI developers accountable for biased outcomes.
    @ 01h 11m 28s
    October 03, 2025
  • Federal Preemption Needed
    Calls for a single federal standard for AI to avoid a chaotic patchwork of state laws.
    @ 01h 16m 36s
    October 03, 2025
  • The Power of State Choices
    You can choose to live in a state that aligns with your values, whether it's abortion laws or taxes.
    “It's one of the powerful things and we get to debate these things in real time.”
    @ 01h 24m 21s
    October 03, 2025
  • Seamless National Market
    A single national economy allows US companies to dominate globally, unlike fragmented markets in Europe.
    “Imagine if we had 50 separate markets, each with their own rules.”
    @ 01h 25m 12s
    October 03, 2025
  • California's Environmental Leadership
    California's car standards significantly reduced pollution and set a precedent for the nation.
    “Those standards led the nation and the world, making California a great place to live.”
    @ 01h 26m 26s
    October 03, 2025

Key Moments

  • SPAC Evolution @ 24:19
  • AI Transformation @ 28:58
  • Regulatory Frenzy @ 1:06:34
  • Algorithmic Discrimination @ 1:11:34
  • Federal Standards @ 1:16:43
  • Economic Advantages @ 1:25:12
  • California Standards @ 1:26:20
  • Podcast Wrap-Up @ 1:28:23

Related Episodes

  • Trump Takes On the Fed, US-Intel Deal, Why Bankruptcies Are Up, OpenAI's Longevity Breakthrough
  • E170: Tech's Vibe Shift, TikTok ban debate, Vertical AI boom, Florida bans lab-grown meat & more
  • IPOs and SPACs are Back, Mag 7 Showdown, Zuck on Tilt, Apple's Fumble, GENIUS Act passes Senate
  • AI Psychosis, America's Broken Social Fabric, Trump Takes Over DC Police, Is VC Broken?
  • E85: SBF's crypto bailout, Zendesk sells for ~$10B, buyout targets, US diplomacy, AlphaFold & more
  • Fed Hesitates on Tariffs, The New Mag 7, Death of VC, Google's Value in a Post-Search World
  • JD Vance's AI Speech, Techno-Optimists vs Doomers, Tariffs, AI Court Cases with Naval Ravikant
  • Grok 4 Wows, The Bitter Lesson, Third Party, AI Browsers, SCOTUS backs POTUS on RIFs
  • Epstein Files, Is SaaS Dead?, Moltbook Panic, SpaceX xAI Merger, Trump's Fed Pick
  • OpenAI's GPT-5 Flop, AI's Unlimited Market, China's Big Advantage, Rise in Socialism, Housing Crisis