
Trump AI Speech & Action Plan, DC Summit Recap, Hot GDP Print, Trade Deals, Altman Warns No Privacy

August 01, 2025 / 01:23:51

This episode covers the All-In podcast's recent AI summit in Washington, D.C., featuring discussions on AI, trade deals, and a speech by President Trump. Key moments include the summit's organization by David Friedberg, insights from various tech leaders, and the implications of Trump's policies on the AI race.

David Friedberg shares details about the AI summit, highlighting speakers like Lisa Su from AMD and the focus on dispelling negative narratives about AI. The discussions centered on the economic boom driven by AI and the infrastructure needed to support it.

Jason Calacanis recounts his experience meeting President Trump, including humorous moments and the president's remarks about the podcast hosts. The conversation touches on the significance of Trump's speech regarding AI and its impact on the U.S. economy.

David Sacks discusses the implications of the recent trade deal with the EU, emphasizing the benefits for the U.S. economy, including a significant investment and tariff structures. The conversation also addresses concerns about inflation and the potential effects of tariffs on consumer prices.

Throughout the episode, the hosts engage in lively debates about energy policies, immigration, and the future of AI, showcasing their differing perspectives while maintaining a humorous tone.

TL;DR

The episode discusses the All-In podcast's AI summit, Trump's speech, trade deals, and lively debates on AI and energy policies.

Video

00:00:00
How much founder mode did you do? Are you saying that I popped an ALP? I need an ALP right now. Hold on. You don't need anything right now. Are you
00:00:06
chewing it? What are you doing? No. You put this nicotine pouch, you upper deck it, releases it, and then you
00:00:11
become a god. Is that the app that Tucker sent you? Yeah. Tucker and I are going to do a crossover. Wait, did you work out a side hustle
00:00:18
here? I haven't presented it to the group for a vote yet. You're pre- Wait a second. Are you being paid for this plug right now?
00:00:24
Yes. I'm just saying if you use the promo code JCL. Wait a second. Promo code J 15.
00:00:32
Okay, he broke up, which is good. Is he on drugs? Is he taking drugs? He's on drugs. No, I'm not on drugs.
00:00:37
And he's doing a deal with This is like a PSA for not taking this stuff. You're so out of control.
00:00:42
Did you take two of them? What are you doing? I thought this stuff relaxes you. What the hell is going on? Your internet's on
00:00:49
the fritz, too. I fixed it. I fixed it. I fixed it. That was Putin. Putin's got my internet.
00:00:55
Putin's got my Oh my god. What flavor are you eating or using? Oh, today's chilled mint. Today's
00:01:00
chilled mint. You don't seem very chill. You seem This is the first one. Agitated and angry.
00:01:07
No, I'm trying to get us back to that original all-in energy where we laughed and we had fun and we enjoyed each
00:01:12
other's company. No, but J-Cal, seriously, do you have a side deal going on with Alp right now?
00:01:18
No, I don't have a deal yet. I don't know if I have a deal. There's no deal.
00:01:23
I'm texting Tuck right now just to cut you. [Music]
00:01:29
Let your winners ride. [Music]
00:01:36
We open sourced it to the fans and they've just gone crazy with it. Love you guys. Queen of
00:01:44
All right, everybody. Welcome back to what Jensen from Nvidia has confirmed is
00:01:50
the number one podcast in the world. Yes, the All-In podcast is here. We had an amazing time in uh DC last week and
00:01:59
we'll get into that. But uh hey Friedberg, you crushed it on all those
00:02:05
incredible speakers last week. 10 days you had to pull off that event, Friedberg,
00:02:11
and you did it. Chamath and I just parachuted in to DC last week for the AI
00:02:16
summit. Sacks was busy working with POTUS to get all those executive orders done. Take us behind the scenes, Friedberg. All
00:02:23
of these incredible speakers. You got Lisa from AMD. You had Lutnick, I liked him. Bessent, I liked him. We had to say
00:02:31
no to a lot of tech company CEOs that found out about the event and wanted to speak on stage. So, there was a big kind
00:02:37
of cut off that we had to make around making sure that we got our message across. I think if you watch the
00:02:43
content, we talked briefly about it at the beginning, but the focus was really on trying to dispel the negative AI
00:02:50
narrative and myth that AI is just here to destroy jobs because there's this big economic boom that's happening both with
00:02:56
respect to new industries that are emerging, which is why we showcased Hadrian and others, but then also the
00:03:02
infrastructure needed to support the AI race with data centers, chips, mining,
00:03:09
and energy. And so we highlighted each of those four industries. And then the cabinet people found out about it and
00:03:14
wanted to get involved. So we were unfortunately squeezing people on and off stage. It's kind of crazy to tell
00:03:20
the secretary of treasury he has to get off the stage because he's passed his 20-minute allocation. But uh we had to
00:03:26
line everything up so that the president could get his Secret Service detail to clear the stage and get set up in time. That's why we were rushing everyone. But
00:03:33
man, what a week. What a rush. It was awesome. Thanks to David Sacks for the leadership and pulling it all together,
00:03:39
bringing those folks to the table. And Sacks, congrats on getting your EO signed and your action plan published. That was
00:03:45
pretty cool. Pretty awesome to meet the president and meet all those cabinet members and have all of this day come
00:03:51
together because of the work you've been doing. How does it feel? Like, Sacks, how are you doing in the afterglow there? I
00:03:57
can see that you're in the afterglow. You sent me a picture of the four besties with our incredible 47th president. How
00:04:05
you feeling right now? Are you going to put that on the screen? I may have it. I don't I don't know if
00:04:10
that's allowed. Are we allowed to put that on the screen? I don't know what the protocol is. Yeah, I think we can. Yeah. I mean, I haven't gotten my
00:04:17
picture. Um I did notice that I was unfortunately when they took the picture of the four of us with the president,
00:04:22
somehow I got cropped out by accident. I think maybe they weren't using a wide lens. Wait, Jason, what was it like for you to
00:04:28
meet the president? Cuz just for the audience, we all stood in line and then we took a photo with the president backstage and then we did a photo with
00:04:34
the four of us. But Jason, when you had your moment with the president, what did you say? Did you ask him about immigration?
00:04:40
Did you ask about I have your photo with the president? Oh, it's on my phone. Did you Did you bring up solar panels
00:04:45
with him? Like what was your big moment all about? I didn't know we were taking a picture. That was like sprung on me. So I
00:04:54
was like, "Oh, we're taking a picture." So my brother Josh who runs security for us was like they need you in the back to
00:05:00
take a picture with the president and I was like yeah I'm good I I got to I got to prepare for you know some Oh you were going to pass.
00:05:06
Well I thought he was joking with me. So I was like yeah I'm good. I'm good. So he's like no no I'm serious.
00:05:12
They're they're taking pictures with the president. I was like we are okay. So, I ran back and uh they put us in line and
00:05:19
then I was like, I think I'm getting punked here because they kept repeating to me, okay, Jason, you're last. You're
00:05:26
last. And they, you know, and I know you guys like to put in a joke or two. So, I, you know, I just got in line last.
00:05:31
And it's obviously, you know, it's it's a big deal to take a picture with the president. So, I didn't want to um, you
00:05:37
know, use that time inappropriately or anything. So, I just said it's a pleasure to meet you. Just say it already. You like it. Just
00:05:43
say it. Just let's get it over with. Just get it. Get to the end. You like him. You tried not to. You know, you're
00:05:48
all Mr. Big Shot, Mr. Big Talk, and then you got in front of him and you like
00:05:54
him. Just say it. Uh, what I will say is, Jesus Christ.
00:05:59
Like him or dislike him. What a joke. You're such a goon. I I had a great time. I had a great time. Predictable. You're a predictable
00:06:06
goon. You know, you don't even know what goon is. Okay, stop rizzing. Stop aura farming. You don't
00:06:12
know what gooning is. Okay. I had a great time meeting him. It was a great event. Aura farming. Obviously, he's trying
00:06:18
to get his rizz up to impress his kids. But, um, it was great. And I didn't know what to do in the picture. So, he did.
00:06:25
We can move on. I What do you think of his speech, Jason?
00:06:30
After he gave you a shout out, Jason after the president gave you a shout out. I don't know about love. He said, "Even
00:06:36
Jason." How many times have you listened to that clip over and over? How many times? How many times? How many people have you
00:06:43
shared that with? How many? Play the clip. Play the clip. I want to also uh Oh, no.
00:06:48
Say hello and thanks to Chamath and his wonderful wife Nat. Thank you very much
00:06:53
for being here. Thank you very much. It was great seeing you again.
00:06:58
Great couple. David Friedberg and uh even as we know Jason Cal.
00:07:08
I say even. Thank you, Jason. [Applause] appreciation. I appreciate that.
00:07:14
Yeah, he's a good person. I mean, he's a good person. He called you a good person. He called you a good person. So, here we are.
00:07:20
What president What president's ever called you a good person this? Come on. I mean, it's it's obviously like surreal
00:07:25
for all of us, I think, to be this close to the administration and then for Sacks to be part of it. What I will say is you
00:07:32
have to give a lot of credit to this administration for the velocity they're going, what they're accomplishing, even
00:07:39
if you disagree with certain items on the margins and their ability to engage with leaders doing important work. And
00:07:47
if we compare that to Biden and Kamala, like they weren't even letting people
00:07:52
come to the White House. Is it Is this like I love this administration. I love the administration. I I like Trump. This is
00:07:58
a cabinet of CEOs. Let me just say this. I'm not in love with Trump. I'm in like with Trump.
00:08:04
That's where I'm at. I'm not in love with Trump. I'm in like with Trump. But what better team has ever been put
00:08:10
together? It is a cabinet of CEOs. It is a cabinet of managers. It is a cabinet of people who know how to get things done. And
00:08:15
every time I go there, I'm impressed by this cabinet. I pull my hair out when I meet. You admit that you're pro-Trump finally
00:08:22
Friedberg. You've been splitting it. You've been dancing around the issue. Are you full 100% in support of Trump?
00:08:28
You want to sit here and put me on the spot? I put you on the spot. I support my president. I support the president. Okay. So, you voted for him and you love
00:08:34
Trump. You voted for him and you love Trump. I love what he's doing and you voted for him
00:08:40
and I have issues with the spending and that's not been resolved. So like I said before, okay, great. Here we are folks.
00:08:45
My full-throated endorsement will come around when DOGE actions are taken seriously and/or the White House puts
00:08:51
pressure on Congress to take action on spending and the budget. What is everybody's favorite moment? Favorite other than Trump, you know,
00:08:59
being absolutely amazing, great speech. He's he's hilarious. Whatever. We'll put POTUS outside of that because it's hard to
00:09:05
compete with the president of the United States. Sacks, did you have a couple of favorite moments? Give us a couple favorite moments. First of all, I think we should talk
00:09:11
about the substance of the speech because I think this was the first speech that President Trump has given on
00:09:17
AI since the AI boom began. He's he's spoken about it before, but this was a full-length policy speech and he
00:09:24
declared that the United States was in an AI race. It's a global competition.
00:09:29
I think the the language that he used was reminiscent of how President John F. Kennedy declared that America was in a
00:09:36
space race. And in a similar way, President Trump declared that we have to win the AI race. I think you can argue
00:09:41
that the AI race is more important than the space race. It's going to reshape the global economy. It's going to
00:09:46
determine who the superpowers are of the 21st century. And President Trump was
00:09:51
really clear that we had to win it and that he was going to support a strategy
00:09:56
for winning it. And then he laid out what some of those key pillars are. Number one was was innovation. We have
00:10:02
to get the red tape out of the way and let our geniuses cook and clearly was very supportive to a lot of the CEOs and
00:10:08
entrepreneurs in the crowd. Number two is infrastructure. He touted the hundreds of billions of dollars of
00:10:14
investments in energy and power generation and grid upgrades and data centers that he's supporting. And then
00:10:21
he also supported AI exports. He said that we have to make America's tech stack the global standard. So I think
00:10:27
those were really important messages. And then on top of that, I think there was also some parts of the speech that
00:10:34
maybe have gotten less attention but are also important where he said that it's
00:10:39
not only important that we win. He said it's important how we win. And he sort
00:10:45
of mentioned three non-negotiables here. Number one was that American workers have to be at the center of the
00:10:50
prosperity that we create. Number two is that the AI models that
00:10:57
the government procures and buys must be free of ideological bias. So no woke AI.
00:11:03
And he also signed an executive order to prohibit woke AI in the federal government. We could talk about that in
00:11:08
a second. That probably was my favorite moment. That was your favorite moment. That was my favorite moment. The red meat moment. I thought that was
00:11:14
that was the red meat. Yeah. That was the red meat for the base. Yeah. Yeah. The third thing is he he did say that we do want to prevent our
00:11:19
technologies from being misused or stolen by malicious actors. And look, we we are going to monitor for emerging and
00:11:25
unforeseen risk. So, you know, we're not going to disregard the risk. But he had this really good line in the speech
00:11:31
about how even though AI, look, it's it's a daunting technology because it's so powerful and like any revolutionary
00:11:38
technology like that, it can be used for bad as well as good. But the the daunting nature of it is all the more
00:11:45
reason why we have to do it in the United States. Why the United States has to be the pioneer and the leader is cuz we don't want the power of
00:11:52
that technology being developed in other parts of the world at least other parts of the world are going to have it but we
00:11:58
want to be the ones on the cutting edge who are defining it and leading it. Fantastic. Okay. So I think it was a it was a really
00:12:03
important speech. I think this idea of an AI race that is similar to the space race, I think is going to be the
00:12:10
dominant frame on AI policy for years to come. Well, it's pretty clear, you know, this
00:12:16
presidency, this term is going to be earmarked, I think, by four key
00:12:21
initiatives. AI, crypto, immigration, and tariffs. I think that feels like
00:12:27
what they're locking into as what's important for the next three and a half years. I think you would agree with
00:12:33
that. And it's just great that you're spearheading and helping the president with two of those four. And just I the
00:12:39
velocity to me is what's super impressive. Any way you could take us behind the the scenes of how this stuff
00:12:45
is getting done so quickly. It feels like there's some operational
00:12:53
cadence here that we didn't see in his first term. Certainly didn't see in the Biden term, but there's a there's a
00:12:59
cadence here that's different. Yeah. Startup speed. H how is that? Well, yeah, we call it he's working at tech speed. I just think that the
00:13:05
president's constantly working. I mean, he's just so energetic. I mean, he basically works like two full work days.
00:13:11
I think it's well known that he doesn't need a lot of sleep and he's continues to work late into the night. And I just
00:13:18
think his energy propels everything forward. I also think that there's a very cohesive team at the White House
00:13:27
under the chief of staff, Susie Wiles. I think it's very important. I think she runs a tight ship and then you've got
00:13:32
the deputy chiefs of staff under her and I think most of these people have been working together for a long time and
00:13:38
it's a team that works well together and it just feels very coherent and cohesive
00:13:45
to me. So I think it's a very effective team. It does feel like that. The pace is great. It means you're going to get more
00:13:52
shots on goal and you'll be able to try more things and get more accomplished, just like we see in startups. Chamath, outside of the
00:13:58
president's talk, we'll go around the horn here. Top two or three moments from the discussions, just lightning round here, rapid fire. What do you got? Top
00:14:05
two or three moments for you, Chamath, just in the discussions that were enlightening to you, inspiring to you,
00:14:10
notable to you. I came out of it very motivated. I think that the combination of the speech, the
00:14:16
executive orders, and the clarity of the big beautiful bill
00:14:22
now give those of us that are in these markets a ton of runway to go and
00:14:27
execute. And so those things reinforced by the various members of the cabinet I
00:14:34
think were very important. That was one. And then the second thing was the
00:14:43
market commentary from both Lisa Su and Jensen, which I thought was really valuable.
00:14:48
And then the third was Chris Wright and Doug Burgum talking about energy. And I
00:14:56
tweeted this yesterday, but we are sort of back to basics almost in a sense
00:15:01
where in the absence of power, I think AI is is not going to be the thing that we think it can be. So that's going to
00:15:08
create an enormous amount of appetite by the federal government to do deals and get players on the field. That's to me
00:15:15
very exciting. So yeah, I came away really
00:15:20
risk-on, I guess, is the best way to say it. I love it. Friedberg, you have two or three moments outside of the president's
00:15:26
speech. Obviously, that's the pinnacle there. So, let's just go below the pinnacle. What were the other two or three moments for you that were salient,
00:15:33
inspiring, notable? I thought Jensen did a great job. I don't know what you guys thought, but he
00:15:39
is very compelling and has a incredible uh vision and view on where AI is taking
00:15:45
us, where it's headed, and what the challenges are. So, I really appreciated him taking the time to come and join us.
00:15:51
last minute he rearranged his schedule to come out for it and it was great. By the way, on the point on energy, which I
00:15:57
still think is the biggest unsolved issue right now in America besides the
00:16:02
uh the federal deficit and the debt problem, Chris Wright agreed to rearrange his
00:16:08
schedule to come and join us at the all-in summit in September. Oh, great. To continue the conversation. We didn't get enough time to talk about it. So, we
00:16:14
are going to hear more from Chris with a particular focus, which is what I wanted to spend time on but didn't get a chance to last week: nuclear
00:16:20
and where are we cuz he actually is very passionate like he said at the thing it's where he's spending most of his time right now and I think it's very
00:16:26
good to hear the deep dive on where we are in the cycle on trying to accelerate
00:16:32
nuclear energy deployment in the United States. Same question to you: after POTUS,
00:16:37
you got two or three moments that stood out? Let's just talk about the executive orders for a second because I think it's pretty cool that the president of the
00:16:44
United States signed three executive orders at the all-in summit that we just hosted. And that was pretty amazing.
00:16:49
One of them was to promote AI exports because we want the American tech stack
00:16:55
to become the global standard. The second was around AI infrastructure to make permitting easier so that we can
00:17:01
help solve those energy problems you're talking about, Friedberg. And then the third one was on preventing woke AI in
00:17:07
the federal government. And that to me is probably my personal favorite because we spent a couple years on the show
00:17:14
talking about how when we're talking about woke, you're really talking about censorship, right? We were talking about
00:17:19
censoring people's views based on ideological bias, ideological dogmas. We
00:17:24
saw what was happening in social media before Elon bought X that helped
00:17:30
bring things back. But we were on a track, I think, before President Trump's election to repeat that whole social
00:17:37
media censorship apparatus in the form of AI bias or AI censorship. And we saw
00:17:43
this with the whole Black George Washington thing and where some AI models were
00:17:48
saying it was worse to misgender someone than to have a global thermonuclear war. Yeah. And this wasn't an accident
00:17:55
because if you go back to the Biden executive order on AI, there was something like 20 pages of language on there encouraging DEI values to be
00:18:03
infused into AI models. So again, we were on track to repeat all
00:18:09
the social media censorship, all the trust and safety stuff in this new world of AI, but it would have been even more
00:18:15
insidious because at least when someone gets censored, you kind of find out about it. It's explicit. It's not.
00:18:21
It's explicit. But with AI, it would have been worse because you wouldn't have even known. It would just be there
00:18:27
rewriting history in real time to serve a current political agenda. It would have been brainwashing our kids.
00:18:32
Oh, and people trust these AIs more than they should. I mean, these things are making a prediction of the next word
00:18:38
coming. This is not like God-given truth here. And so, Freeberg, you wanted to
00:18:44
interject about this one because this is actually I'll be honest, Sacks, I'm surprised you're saying this was the most important one to you. I like that
00:18:50
you clarified it because it was the one that was mocked or kind of like people were like what why is this important? I
00:18:56
think you made a good case for why it's important. Friedberg, your response. Yeah, but Sacks, this is not about broadly making
00:19:03
quote AI non ideological. Private companies should still have the right
00:19:08
through freedom of speech or freedom of expression or freedom to operate to make AI that does whatever they want it to
00:19:14
do. What the EO was was that the federal government would not procure ideologically biased AI. Is that correct?
00:19:19
Yes. Exactly. No, we're we're aware. Just to make sure that the federal government is not trying to instruct private companies how to operate. It's
00:19:26
simply saying if you want to sell to us, this these are the rules of the road. Yes, that's true. So, we were very
00:19:32
careful about the First Amendment issues. And you're right that if a private company wants to put out a
00:19:37
biased AI product, we're not going to tell people they can't use it. And it could work. It could be successful. People might like it. Yeah.
00:19:44
Yeah. We're just saying that the federal government is not going to spend taxpayer money buying AI models that have compromised
00:19:52
their accuracy and quality because they're beholden to some, you know, ideological agenda,
00:19:58
which is similar to the approach with the universities, right? Hey, listen, you could have a biased university. We're just not going to fund it. We're
00:20:04
not participating. I think it's it's quite reasonable in that way. And yeah, and I would just say that, you know, we were a lot more careful about
00:20:11
this than the Biden administration was when they required that DEI be inserted into all these models. They didn't
00:20:16
distinguish between public and private money or government procurement versus private models. So they just they were
00:20:23
trying to suffuse DEI into everything. And what we're looking for here is just neutrality, right? We're looking for a
00:20:30
lack of ideological bias. The first step was to get rid of that Biden EO, which the president did his first week in
00:20:35
office. This goes a little bit further and it's a little bit of a shot across the bow of these Silicon Valley
00:20:41
companies saying, "Look, you need to play it straight. You need to be ideologically unbiased." As the default,
00:20:46
as the default, when you sell to the government, you can't insert your values at the expense of accuracy. Look, at the
00:20:52
end of the day, accuracy and truth-seeking is the standard, right? You can measure. That's the goal.
00:20:59
So, we don't want the quality, accuracy, and truth-seeking to be sacrificed because of these. Are you still, are
00:21:07
you still seeing that are like when you say these Silicon Valley companies I mean is this still kind of a widespread
00:21:13
concern or widespread deployment from your point of view where you're sitting like are you still seeing a lot of the models being trained on ideological
00:21:20
systems that you know are preferential to one group and not to another I think it was a much bigger concern 6
00:21:26
months ago and I think there's been such a huge vibe shift since President Trump's election and taking office that
00:21:31
like the woke stuff is sort of going away on its own but I and I I think that's the trajectory
00:21:36
we're headed. But it was But you still think it's important enough to make sure that there's an EO? Yeah. It's like, look, this is make sure
00:21:41
this thing doesn't come back from the dead. I think that there's been a huge vibe shift since President Trump's election and woke has definitely fallen
00:21:49
out of favor and it seems to be going away on its own. But we could still get,
00:21:54
you know, Orwellian outcomes with AI. And I do think it's very important to just keep underscoring that what AI
00:22:01
models should be focused on is the truth is on accuracy. And we don't want
00:22:06
ideological agendas to sacrifice that. And um and I think I think that even though this is a less salient issue now
00:22:14
than 6 months ago, precisely because of the vibe shift, I still think it's important to underscore this point that
00:22:19
we don't want would you go so far we don't want AI taking an Orwellian direction. Yeah. Would you go so far as to limit
00:22:26
free speech and and make it non ideologically biased? Like would you make that law if you could
00:22:33
because again the the decision about the federal government procuring versus what these private companies can choose to
00:22:39
reflect as their quote values in their systems. No, he just you already answered that he would not. Yeah. No, look, I we understand the
00:22:45
difference between public procurement and private speech and again in a way that the Biden administration did not
00:22:51
because they were saying that all AI models Yes. had to adhere to a specific ideology,
00:22:56
the DEI stuff. So yes, it was an ideology they wanted embedded in it. You're saying don't put an ideology in. But just to be clear
00:23:02
here, I want to make one point. This is the default. Anybody who wants to could when they start their prompt or they set
00:23:08
up their preferred language model could say, "I'm an atheist. Here's what I believe. Please speak to me with this in
00:23:14
mind." Or, "I'm a Catholic. Uh, you know, I'm a Protestant." Whatever you want. Here's my belief system. Please
00:23:22
never reference, you know, these three subject matters in this way. So this is the default. I think it's a great thing
00:23:28
to I think that's a great example, JCAL. I do think we'll end up seeing religious AI. I think we'll see AI that's tuned to
00:23:34
people's religious ideological, but I think yeah, I have one of the startups we did was
00:23:39
doing a learning app and they were struggling and they just made a prayer app and their prayer app went parabolic
00:23:46
and now they're just like printing money. So there is definitely a huge market here.
00:23:51
J-Cal, what were your highlights? It was great to be, you know, included in everything. So I appreciate that we
00:23:57
had uh No, I'm being dead serious. Your invitation finally didn't get lost in the mail. No, but here's the thing. It I think
00:24:04
this could have been a non-All-In thing. It could have just been, you know, you could have done it and just invited who
00:24:09
you wanted to. So I like that it was under the all-in umbrella and that we didn't censor anything and we went right at hard topics. I'm a moderate. I know
00:24:15
people want to make me into like a stupid lib, but I am an independent moderate. And there were moments in time when we had great debate, too. This
00:24:22
wasn't just a love letter to the administration. One of the great moments was J. D. Vance. It was just great that
00:24:28
he wanted to come chop it up and just hang with the besties. And he came out and he went right at me. He was like,
00:24:34
"Hey, you treated me like a beep at the thing. We had a big debate and you know, he went right at me." And then I was
00:24:41
like, "Okay, it's on. Want to talk about stuff?" And he said, "Yeah, let's get into it." And that's what I love about JD. JD to me seems like the politician
00:24:49
of the future. I know this is like the Trump administration, but So, you like him? No. No. I I'm in like with Trump. I'm in
00:24:56
love with JD because he's young. He's opinionated and he likes to mix it up. He's on Twitter all day long. He engages
00:25:03
people on Twitter. He engages people in other groups. I'll leave it at that. And uh we had a really, I think, honest
00:25:10
discussion about immigration. And we got back to the highskilled immigration question. That's the third rail for MAGA
00:25:16
and and for the country right now. Immigration recruiting. You mean you brought it up right off the bat?
00:25:21
No. No. He said he wanted to. Your hobby. You brought it up. Your hobby horse. He said,
00:25:27
"I want to continue the debate," and I said, "Okay, let's continue the debate." So, here we go. He was super spicy and he made a great super spicy point that I
00:25:34
want to point out here and amplify. If companies are going to be laying people off. And there was an incredible uh
00:25:40
chart that came out. It was in the Financial Times and they showed male college graduates versus
00:25:47
non-college-graduate males. And there was usually a huge gap in unemployment between those two. In other words, if
00:25:52
you had the college degree, you you had a much better chance than the non-college male. And now those two things have
00:25:59
flipped or they're like neck and neck. If you have a college degree, you have no advantage as a man coming out in
00:26:05
this, you know, 20 to 27 year old range. This is men. Women are actually doing better. More women in college than men,
00:26:12
yada yada. But he's very attuned to this. And he said he's got big concerns right now. So this is again why I love
00:26:19
JD because JD is very tuned into the fact that people are asking for more H-1B visas and that typically is to save
00:26:27
money and supposed to be very skilled people. But why is Microsoft laying off 9,000 people then asking for more, you
00:26:32
know, H-1B visas? This is a really honest, truth-seeking question. And it's hard for this administration to
00:26:39
talk about this issue because I know you got Stephen Miller, Bannon, whatever people all the way on one side who want
00:26:44
to deport 20, 30 million people, Tucker, and you know, and then you have other people who are more moderate. And I thought that was like a really great
00:26:50
moment in time for America and for us as a podcast to challenge and have a really important discussion. And he made some
00:26:56
great points there. Number two, we had a great debate, I think, about energy. Uh,
00:27:02
disagree. You disagree. Okay. I disagree because I think you challenged him with things that were not facts and not true.
00:27:08
And I'm happy to debate that with you offline. I think he was caught off guard, but I think it was
00:27:14
pretty rough and inappropriate. 100%. If you think it's inappropriate, that's fine. Jake's favorite moments were all the
00:27:21
ones where we had debates. That's what you're describing. No, no. Where there were debates. You got into it
00:27:27
with the vice president. You got into it with the secretary of energy. Those were your favorite moments when you got to
00:27:33
Okay, great. So, okay, fine. I like when there's a little conflict, a little debate about an important issue. And when I walk the audience, which was, you
00:27:41
know, 90% Republican, GOP, MAGA, etc. People said
00:27:46
that was a great moment. I really like that debate because he kept saying like nonreliable energy and whatever. And I
00:27:51
said, are you talking about solar? And I think there was a little misinformation. Reliable. No, it's not. You put it with a battery. Right now,
00:27:58
Texas is 30% some days wind and solar. You know, like I can tell you I live in the great state of Texas.
00:28:03
Texas is roughly 5% solar. Just so you know. What's that? Texas is roughly 5% solar,
00:28:10
right? And wind puts it up to 25 to 30% on the top days is coming from that. My point about that is and it's cheaper to
00:28:16
put in a solar uh and battery farm than a new coal plant. It is 100%. We can
00:28:21
pull up the stats. It is twice the cost to do solar than it is to do nat gas. It takes 4,000 acres whereas gas takes 20 acres.
00:28:28
I said coal. Yeah, but these the big advocacy with these guys is to use nat gas to use
00:28:34
methane. He was saying coal clean coal clean coal. He said it 50 times. These methane plants are half the cost
00:28:39
of solar. They can get stood up in less than two years to generate a gigawatt. And instead of being 4,000 acres of
00:28:46
solar, you can get it done for, you know, call it 20 acres. Now talk a little bit about pollution. And that's a
00:28:52
big part of why they're doing this. Well, a big part of methane is that it's actually cleaner than coal, which is why
00:28:57
they're using it. Cleaner than oil. Cleaner than oil and the two ways of
00:29:02
getting energy. Now, science guy. I'm trying to give you the facts about
00:29:08
why it is cheaper and faster, which is what he was making an advocacy for, right? It's not about like solar.
00:29:14
Yes, you're right. It has a lower carbon footprint when you're running it. But at the end of the day, what these guys are
00:29:19
focused on and a big challenge for America is how do we scale energy production in the states and scaling
00:29:24
energy production, I personally think we need to fix the regulatory roadblocks in nuclear. And Chris Wright's been very vocal on this.
00:29:29
We all agree on that. But the fact is this nat gas supply that we have in the United States and the
00:29:36
fact that we can deploy nat gas energy production very quickly is what makes it such a reliable source right now. if the
00:29:41
US wants to have a chance at scaling from 1 terawatt to 2 faster than projected today, and that's the reason
00:29:48
you know, it's not about solar being bad, like that's not the argument. It's just like, dude, we got to get moving fast
00:29:54
and we got to have reliable energy. In our debates, I just want to point out, in our debates when there's bad faith
00:30:00
moments, I think it's a bad faith moment when I say coal versus solar and then you say, no, you're wrong, it's solar
00:30:07
versus nat gas. And that's what he was doing. This is what politicians do. Here on All-In, we like to do, you know, uh,
00:30:14
fact-based, truth-first stuff, not biased stuff. And so you're comparing solar and how
00:30:22
fast it is versus how fast it is to go to nat gas. Of course, it's faster to go to nat gas if we have those available. Let's put that aside. It's an important
00:30:28
debate. The fact that you and I are debating it is important. And I also thought Lisa from AMD was fantastic. I
00:30:34
haven't heard from her. By the way, I just want to point out that when I got back to the conference, so I I left for
00:30:40
a time to go back to the White House and then I came back. The first thing everyone said to me when I got back was,
00:30:47
"Did you see Jal being a jerk to Chris Wright?" They were everyone was like, "Oh, about the Yeah, a jerk.
00:30:54
He's a civil servant. He has to answer hard questions. You didn't talk to him in the way that you
00:31:00
would." Basically, everyone thought you were a jerk to Chris, right? And you were kind of a jerk to JD. And what are
00:31:05
your favorite moments? Call me an [ __ ] What are your favorite moments from the conference? You're reminiscing about
00:31:10
that. You were an [ __ ] to me. Anyway, the point is one thing you're
00:31:17
going to get here at the All-In. This is where everyone
00:31:22
Two out of three people, everyone was saying this. You almost derailed the whole thing. Nobody derailed. You're a civil servant,
00:31:29
Mr. Sachs. You're a civil servant. You're all civil servants. I've been putting up with you for 5 years on this podcast.
00:31:34
The hard questions. It was perfect training for government services beyond the podcast being
00:31:39
interrupted by you for 5 years. Yes. That's why I'm so ready to dwell. You learn about you work for us, all of
00:31:46
you. And you're all going to take hard questions. And you're all going to take hard questions on September 7th, 8th, and 9th when we have the allin summit in
00:31:55
Los Angeles. By the way, by the way, one thing I'll say is Chris Wright's chief of staff came out to me afterwards and I
00:32:01
said, "Oh, I'm sorry. I heard Jake was a jerk to to Secretary Wright and he's like, "Oh, no. Chris loved it. He loves
00:32:07
mixing it up." Okay, of course. And he's coming to he's coming to Olan Summit in on September 8th. So,
00:32:12
can't wait to debate him more. Can't wait to mix it up more. So, he he likes mixing it up. So, kudos to him.
00:32:18
Okay. And so did so did JD Vance, the vice president to you. Stop calling him JD,
00:32:24
by the way. Well, I mean, listen, I just want to say Vice President J. D. Vance and I have been directly communicating.
00:32:31
We had a We Yes. No, you haven't. David Sachs, your worst nightmare. Oh my god. Your worst nightmare.
00:32:38
The nation is ruined. What the [ __ ] We let J-Cal into Washington and now look what's happening.
00:32:43
Yeah. And listen, I want to level set with everybody. I am going to ask whatever [ __ ]
00:32:50
question I want to whatever guest we have and nobody's stopping me. The only way you're going to stop me is by
00:32:55
writing me a huge [ __ ] check to buy me out of this podcast and replacing me with some mid until
00:33:00
or if the Secret Service keeps you off stage, which might be an option. Or Secret Service keeps up stage. But the truth is, this is one of the great
00:33:06
things about this administration, Sachs, is that they love to mix it up. They like great debate. You know who didn't like
00:33:13
great debate and ran from it? Kamala Ding-Dong. She wouldn't even come on this podcast. You know who doesn't like
00:33:19
debate? Tick-tock, Weekend at Bernie's Biden, who didn't even know what a podcast [ __ ] is.
00:33:25
Tim Walz, who doesn't own You definitely have your moments, bro. You definitely have.
00:33:30
But Tim Walz doesn't own any equity. He literally doesn't own one share of any company, doesn't own his home,
00:33:36
and Tim Walz is on there giving a hard time about the uh Trump savings accounts.
00:33:42
I mean, I don't even know if that's a hat though, which you loved. You thought that was going to win the election. I thought he might be able to speak to
00:33:49
like the middle of America and then I find out like when they do the the Deep Oppo research
00:33:54
that the guy doesn't own one stock. The guy doesn't own his home. He's
00:33:59
financially illiterate and we're making him employed by the government. He's been employed by the government his whole life.
00:34:05
I mean, have you There it is. That's what Jake thought would win the election.
00:34:11
You're never going to live that down. I remember when you tweeted you thought that was it. You thought that was the master stroke. I thought it might
00:34:17
master stroke that was going to win them the election. Hey, listen. Nostradamus does not bat a thousand. No, no, even Nostradamus
00:34:25
cannot bat a thousand. But it did come out, by the way, that Nancy Pelosi wanted to do the speedrun primary. I
00:34:30
don't know if you saw that, just not to rehash too much stuff. Sachs, I want to um say there was one point of difference, if you want to get into it, around the
00:34:42
content part of it, where and this is something that
00:34:42
the press was having a field day with and they really keyed on which was hey
00:34:47
respecting IP, respecting copyright. What's the feedback been so far on that which was a pretty spicy part of
00:34:54
President Trump's speech? Well, I think what the president said was just very pragmatic. He said we had
00:35:01
to have a common sense approach towards intellectual property. And he said if you have to make a deal with every
00:35:07
single article on the internet, every single website, every single book, every piece of IP in order to train an AI
00:35:15
model, it wasn't feasible. He said, "Look, I appreciate the work that went into people creating these works, but
00:35:22
you're not going to be able to negotiate a deal for every single one of them. And if we require our AI models to do that
00:35:29
and China doesn't and they won't. They're just training on everything whether it's you know pirated or not
00:35:34
then we're going to lose the AI race. So I think he took the side of a fair use definition. I don't know if he used the term fair use, but effectively he was
00:35:41
taking the side of a reasonable fair use. What did you think of that part, David
00:35:47
Friedberg? You have any thoughts for Chamath on that part? I think he's absolutely right. I've said this before. If something's on the
00:35:52
internet, if something's in the open domain, and I strongly disagree with the idea that AI getting trained is the same
00:36:00
as AI replicating copyright material. If AI outputs text or outputs audio or
00:36:07
outputs video that contains copyright material, it is 100% in violation of
00:36:12
copyright. And he said that, by the way, yes. And if the AI is learning, it is understanding patterns, it is
00:36:18
understanding reasoning, it is understanding concepts by reading copyright material just like humans do.
00:36:24
A writer, an author reads a bunch of fiction, learns good techniques, learns
00:36:29
good concepts, learns good theory from reading all those books and then goes and writes his or her own book. They are
00:36:36
not violating copyright material. In the same way, Friedberg, what if it's all the New York Times on the open internet?
00:36:43
100%. You're you're 100% correct that should be paid for or licensed. I'm talking about the open internet. I'm
00:36:49
talking about open material. I'm talking about stuff that's in the open domain, which Common Crawl. There's a thing called Common Crawl.
00:36:55
If there was if somebody stole a hundred books, let's say, and put them on their
00:37:01
website and it was a pirated Russian website with a thousand books on it and you accidentally crawled it, you would
00:37:06
be obligated to take that out then. I think we all agree. Correct. Okay. Correct. Cuz that's what a lot of the lawsuits are about. So, I think we're reaching
00:37:12
something. I just want to say, you know, this is such an important point, especially to me as a content creator and somebody who spent his career in
00:37:17
this. I've been thinking about the endgame and um I was I'm here in Park City. I was just giving a a keynote and
00:37:24
I wanted to show you something I made, Sachs, because I think we have to get to the endgame here. So, in my talk, I
00:37:33
talked a little bit about how can we get through this fight and then maybe getting to a solution. So, I had my team
00:37:39
mock up the New York Times website here and ChatGPT doing a deal with them. So, here you
00:37:46
see you're on the New York Times website and you ask it a question powered by GPT. You ask it, hey, you might ask this
00:37:53
question. In fact, you log in with your ChatGPT credentials. And it could be Grok, it could be Gemini. Give
00:37:58
me the earliest mentions of Putin, you know, if you were a fan of Putin or something. And it would then go through that and give you your Putin
00:38:04
references. And then I made another one. And then obviously this would be an exclusive to ChatGPT. It would be one
00:38:09
of those things where you know they get an exclusive. And then here on the uh Disney Plus channel, imagine you could
00:38:15
make yourself into a Jedi Knight and you could then upload your photo. You know, kids might really get into this. You
00:38:21
upload your photo, you can make You talked about this, Friedberg, a couple of times, the future of narrative storytelling. You upload your photo and then it makes you
00:38:27
into a Jedi Knight. There's there's Darth Calacanis. So that looks to me like you're infringing
00:38:33
on their trademark. What's that? Are you infringing on their copyright? This is fair use. This is fair use. This
00:38:40
is a perfect example of fair use for editorial. You're also infringing on some Ozempic. That's absolutely infringing.
00:38:46
Trust me, I am definitely infringing on some Ozempic here, guys. I'm past those. I'm on to peptides now, man. I'm
00:38:52
on the Wolverine protocol. So, look at Are you Yeah, I started doing the I mean I I don't What could go wrong?
00:38:59
Don't take a podcaster's advice. Please don't take a podcaster's advice on your
00:39:06
healthcare, rule number one. Take Chamath's advice because he's got 6% body fat, which I think contributes to much of
00:39:13
your pomp and circumstance around your privates. I think it has to do with the lack of fat. But I'm going to leave it
00:39:18
at that. First of all, it's 11 and a half, but you know, it that's that's like right that's like right before I go on summer
00:39:25
vacation. Then it then it ends up at 12 or 13. Did you go get that? Did you go get that gelato? What was that place we went that
00:39:30
we love? Lulu me. I've gone there every day. Every day so far. Did you do two or one? Be honest. Two or
00:39:36
one? Bro, I've had I've had I've been doing No, no. Per session. Do you do two or one? Be honest. Per session, too. I start with the
00:39:42
medium and then and I finish with a small. Yeah, exactly. You This stuff is so good. I've never tasted any gelato like this.
00:39:49
It's incredible. I mean, it's unbelievable. We have to license it for the United States and the all-in brand. We have to license it from them.
00:39:54
It's really incredible. But Chamath just generally speaking, or anybody who wants to have at it. Friedberg, Sachs, what do
00:39:59
we think about the endgame here? Because there's some major lawsuits here. They're going to get settled in the next year or two. What What do we think about
00:40:06
sort of the future I've shown here today? I think what Sax just highlighted is exactly right.
00:40:12
Look, we got to have a common sense approach here or we're going to lose the AI race. I mean, one of the out one of
00:40:18
the key determinants of AI quality is the amount of data that you have. It's very simple, right? It's
00:40:24
there's a few building blocks. There's energy, there's chips, and there's data, and there's algorithms. And if you lose
00:40:30
on any one of those dimensions, then you're in trouble, right? So, look, you just can't have a
00:40:35
situation where China can train on the entire internet and our AI models are
00:40:40
hamstrung by needing to negotiate contracts with every single website. But right now, Elon owns X,
00:40:47
right? He owns Twitter, now X. Does Sam Altman have the right to use X in his
00:40:53
corpus? It's public. No, it's not. No, it is not a public
00:40:58
endpoint. It's not public. I just honestly don't know the answer to that. There's some edge cases here. We're
00:41:03
going to have to come up with It's not about whether it's behind a paywall or not. It's whether these APIs exist and whether you're
00:41:09
actually contractually allowed to use them or not. The terms of service. Correct. The terms of service. It's published on
00:41:15
every website what the terms of service are with respect to the content. I think it would be okay to let people opt out,
00:41:21
you know. So, we already have this with Common Crawl. You can put it in the root of the website, you put in robots.txt
00:41:27
and you opt out of Common Crawl. Common Crawl is like this nonprofit organization that basically archives the
00:41:33
entire web every few months. Funded by Gil Elbaz, former Yeah.
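The robots.txt opt-out mechanism described here can be checked with Python's standard library. A minimal sketch, with real crawler user-agent names (CCBot is Common Crawl's crawler, GPTBot is OpenAI's) but a hypothetical robots.txt policy and example URL, for illustration only:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a publisher might serve to opt out of AI crawlers.
# CCBot (Common Crawl) and GPTBot (OpenAI) are real user-agent names, but this
# policy is an illustrative sketch, not any real site's file.
robots_txt = """\
User-agent: CCBot
Disallow: /

User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Crawlers that honor robots.txt consult it before fetching any page.
print(parser.can_fetch("CCBot", "https://example.com/article"))    # opted out
print(parser.can_fetch("GPTBot", "https://example.com/article"))   # opted out
print(parser.can_fetch("OtherBot", "https://example.com/article")) # allowed by the * rule
```

Note this is purely an honor system: the file only declares the publisher's policy, and compliance depends on the crawler choosing to respect it, which is exactly the terms-of-service question being debated here.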
00:41:39
formerly of Google. Great fan of the pod. Shows up to our summits. Great guy. And all of Open AI was built off of
00:41:45
Common Crawl originally. And he but they're very clear by the way. They say you have to clear copyrights. You don't get to just use
00:41:51
Common Crawl. Can I go out on a limb? I don't know if you guys saw this Amazon deal with the
00:41:57
New York Times for $25 million. Did you see that today? No, I didn't see today. Explain it, please.
00:42:02
I think that the New York Times licensed Amazon all of their content, including The Athletic and a bunch of other things
00:42:09
for training. 20 million. Sorry, 20 million a year. Okay, here we I read that and I thought
00:42:16
this is the peak of these deals. These deals will only go down in terms of
00:42:22
dollar value from here. And it it actually brought me to this
00:42:27
point where I was thinking to myself, is it even realistic to believe that patents and copyrights
00:42:36
actually exist in 5 years? And I went through this exercise of like if a
00:42:41
computer studies the periodic table and also understands the
00:42:48
laws of physics, the laws of biology, the laws of chemistry, and then independently derives
00:42:53
some material that is otherwise patented, what will happen?
00:42:59
And then separately, if two competing AIs invent a new material from scratch,
00:43:05
how will the international courts deal with this? And if you take all of these examples to the limit, at the limit, the
00:43:13
idea that there are copyrights, enforceable copyrights,
00:43:18
I think is a very fragile assumption. So I'm actually thinking more that we have
00:43:24
to spend some time understanding the landscape of a world that doesn't have
00:43:30
copyrights and patent protections, and instead what is the surface area in which you compete? What is trade secret?
00:43:37
What does that mean in a world of AI? I think it's quite an interesting thing to think about.
00:43:43
Patents are a totally different piece. I think that's a fascinating string to pull on. I will tell you, I will take the
00:43:49
other side of the bet if we want to make a Polymarket on this. I will guarantee that this will be the beginning of the
00:43:54
deals and the deals will go up from here. I'll tell you why. The reason the New York Times made that deal is to make
00:44:00
it apparent that what OpenAI has done has damaged their business because now they have a customer and their customer
00:44:06
is Jeff Bezos at Amazon and Jassy and now they can show damages because and
00:44:11
now those damages could give them an injunction against OpenAI and OpenAI's got to take it out of their crawl of
00:44:18
their you know construct and that's going to be really expensive for them. It's not not doable, but it's going to
00:44:24
be expensive. And let's think on a societal basis of what we want as a
00:44:29
society. Do we want a society in which journalists, writers, artists, musicians, filmmakers, actors cannot
00:44:35
make a living, podcasters, or do we want a world in which they can? And I think technologists,
00:44:40
hold on, let me hold on. As a technologist, as a technologist, we typically think if we can crawl it, it's
00:44:47
ours. What I can tell you as an artist is if I make it, it's mine. and you need my permission because it's my art and I
00:44:53
think it the industry will do better if they respect them because now the New York Times can hire more fact checkers.
00:44:59
But can I just ask you a question? Yeah, go ahead. Sure. But why do you have to connect the two
00:45:04
as immutable things? meaning why can't somebody make something still you know
00:45:10
let's just say it's a song but that song can now be made by multiple AI models
00:45:16
but if they make the song there's a reasonable claim that even if
00:45:21
they don't have the copyright more people will want them to perform the song than some random AI
00:45:27
so can't you make a living without having the copyright? Which is the choice of the artist. Some artists were very well known for not
00:45:35
wanting their art to exist in some mediums. As a perfect example, the Rolling Stones for a long time thought
00:45:40
they would be sellouts if they had their music used in commercials. And when they did Start Me Up with Windows, that was a
00:45:45
really big concession from them. And that's up to the artist to make that decision. You make a valid claim. Hey, yeah, you go on tour and make more
00:45:51
money. But that's the artist's decision, not the technologist's or the people stealing their content. And by the way, $20 million a year is a hundred $200,000-a-year
00:45:59
highly paid journalists and fact checkers at the New York Times. They're going to get 10 of those deals. And it's going to
00:46:06
create a golden age of journalism and content. And we should be happy.
00:46:11
I told you this example, Jason, but at Beast, we did a licensing deal of our content to allow OpenAI to learn
00:46:19
Yeah. to run training runs on our videos. And at the board, the thing that we kept
00:46:25
talking about was I was really concerned like, let's just do a couple-year deal max.
00:46:30
And the reason is we have no idea what this looks like in five or 10 years. And there's just as
00:46:37
much chance to your point that we get it wrong as right. Now that was about 6 months ago. And so the intuition that I
00:46:42
had back then was maybe we should keep the deal term as short as possible. But now when I see how important AI is in
00:46:50
the global landscape and what China is doing, I think on the margins that this idea that these copyrights will mean
00:46:56
something in my mind, I am underwriting the value
00:47:03
of these things going to zero and I'm asking myself instead for my businesses, how are we actually building a real
00:47:09
defensible moat and not a piece of paper that we can use to sue somebody? Okay, Friedberg, you want the last word
00:47:15
here? We got to move on to some other topics. I just want the last word. I just want to be clear that nobody is
00:47:22
losing their copyright. Copyright is the right not to have your work copied. And if an AI model produces
00:47:29
outputs that copy or plagiarize your work, then that's a violation of the
00:47:34
law. And I think the president specifically said that we're not allowing copying or plagiarizing. The question is whether AI
00:47:42
models are allowed to do math on the internet. You know, pattern recognition pattern recognition. Basically, that's what it is. And it's
00:47:49
and J-Cal, I think you're conflating the two. And I don't want to be interrupted. I just want to say this. I understand the distinction.
00:47:55
And and I think that this idea that like I can't, for example, go to the library, rent a book, read it, and then learn
00:48:01
some of the good techniques on how to write a good book should be restricted to humans in this AI context. Like this
00:48:07
is exactly what they're doing. They're identifying patterns and then they're building predictive algorithms that
00:48:12
allow them to output stuff that starts to fit within different kind of, you know, variable settings. Do you guys
00:48:18
think it's possible that if you allocated enough compute at the problem, you could write Michael
00:48:26
Crichton's Jurassic Park de novo without ever having read it? Yeah, me too. Me too. I think
00:48:33
I don't know what that would mean. Like, well, this is my point. I know who Michael is and I know what Jurassic Park
00:48:39
is. I don't know what it means to say, can AI write that? But you guys remember the Ed Sheeran
00:48:45
lawsuit. Do you remember the lawsuit? I I did. But let me just make one point here on this cuz you're saying I don't understand it. I spent my career in it.
00:48:51
I understand it much better than you do and I understand it from lawsuits and being in the weeds on it. Like I
00:48:56
understand it from first principles which you do not. And I will say this is what we're talking about here is the
00:49:02
definition. It's the definition of a derivative work. and the output matters.
00:49:07
So if you were to take my knowledge and then create a derivative work from it and you used a percentage of my work and
00:49:13
that's where this will get into the nuance is what percentage of the original work is used in the derivative
00:49:19
work and under what context, a commercial context or a non-commercial one. This is clearly a commercial one. If
00:49:25
OpenAI was a nonprofit right now, we'd be having a distinctly different discussion, because you
00:49:30
wouldn't be competing with me as the copyright holder to use this new medium and create the derivative works. And it
00:49:36
has to change substantially. So if it's a CliffsNotes... When China has the only models that are
00:49:42
able to meet your stringent definitions of copyright. Well, no. Here's the thing. I think the China fear the China fear [ __ ] is
00:49:48
[ __ ] I'll be totally honest here. Just because China steals IP does not mean you get to steal from Americans. In
00:49:55
America, we have rules. And when you go to China, and by the way, we spent the last 30 years. The major issue with
00:50:00
China is not Taiwan. It has been the technology industry itself. Let me finish. The technology industry itself
00:50:07
has leaned on our government for 30, 40 years, including Microsoft, including Google, to make sure our trade secrets
00:50:13
are not stolen, our IP is not stolen, our movies are not stolen. That is the key issue with China. So just because
00:50:18
China's a thief does not mean American companies get to Have you seen the latest
00:50:23
batch of Chinese open source models or open models? They steal everything. Does that
00:50:28
mean you should be able to steal Windows? Should you be able to steal Jason? We don't think it's stealing.
00:50:35
Elon has said this pretty clearly, but Grok 5 and for sure Grok 6 will not use Common Crawl. It will not use the
00:50:42
internet. Okay? It'll just be an enormous amount of synthetic data. And back to what Friedberg and I just agreed
00:50:49
upon, if you synthetically go and try to generate all this content to learn across, you're invariably going to
00:50:54
produce something that's already been created. That's like some sci-fi level.
00:51:00
I understand. That's what's happening now. It's happening now. If somebody
00:51:05
What do you think happens to Grok 5 or Grok 6? Is that violating copyright? It didn't even know that it existed
00:51:11
on the output. Yeah, that's fine. If it on the output created a similar work,
00:51:17
they would need to then take it down. And so that that would be a really interesting new that's a new space we're
00:51:24
going to have to contend with. So you can if it does happen is a new concept that we would have to address in a new
00:51:30
way. I'll give you a science corner example. There's this EVO2 model that they published at the Arc Institute, which Patrick Collison, you
00:51:37
know, is the chairman of. So that EVO2 model, they just ingested all the DNA
00:51:43
world. Trillions and trillions of base pair of data that they ingested and then they looked at patterns in DNA
00:51:50
and that's it. They had no context for what the DNA represented. They had no context for the concept of genes. none
00:51:56
of the structured understanding of what that DNA does, what it is. And you know what it did? They fed in the BRCA gene
00:52:03
variant and the thing output a warning saying, I think that this is a pathogenic
00:52:08
variant of DNA, without having any context. This is the breast cancer allele, and it didn't have any knowledge
00:52:14
and it did it wasn't trained on that at all. It had no knowledge that there are pathogenic variants for cancer and it
00:52:20
identified that this was a genetic variant that can cause some sort of pathogenic outcome in the organism. So that was that's a great example where
00:52:27
there's a lack of understanding at the human level on what really drives some of the patterns in nature, the patterns
00:52:33
in society, the patterns in behavior that are kind of emergent phenomena perhaps that these AI models are
00:52:38
starting to identify. And I think to Chamath's point, we may end up seeing this in things like entertainment as well. All right, this has been an amazing
00:52:44
debate. We got to move on. And you know what? We're going to have more amazing debates September 7th through 9th in Los Angeles at the All-In Summit. The lineup
00:52:51
is stacked. Alibaba co-founder Joe Tsai, Thoma Bravo's co-founder, ARK Invest's Cathie
00:52:56
Wood, Uber CEO Dara, Sequoia's Roelof Botha, YouTuber Cleo Abram, and many many more
00:53:02
coming. You get the last word here. Go. I was just highlighting this tweet that
00:53:08
I saw that was talking about how Chinese open-weight models are basically open source models.
00:53:14
So basically all the leading American models are closed source and all the leading Chinese models are open source.
00:53:19
This is kind of the way things played out. It's a pretty good technique for catching up because then you got the
00:53:26
larger open source developer community helping you out. But the point is just that these open source models are catching up pretty
00:53:33
fast. We're ahead in many other aspects. Our chips are a lot better. Our data centers are better and so on. And I'd say our closed source models are better.
00:53:40
But they have this one area of open-source models. So again, if you hamstring our AI models' access to data
00:53:46
by creating a whole bunch of new requirements for contract negotiations, like we could really lose the AI race.
00:53:52
This is a really big deal. It's not a made-up concern. I don't know why you think it's made up. I never said that it's made up. I think
00:53:59
it's an opportunity for America to actually have a distinct advantage, which is that $20 million from Amazon
00:54:04
alone is 1% of the New York Times revenue. And that's going to go directly to the bottom line. It's going to allow them to hire more journalists. Then that
00:54:11
protected site will be giving, in real time, something these language models would otherwise have to go hack and
00:54:17
steal. That real-time data is going to be a distinct advantage for Gemini, OpenAI, Amazon, whoever chooses to do it. And we
00:54:23
can create... You have these like nostalgic, sort of quasi-romantic notions about, like,
00:54:29
journalism and the need to save the New York Times. It's also art. It's I mean you can say
00:54:34
all the derogatory things you want about me personally, Sacks. That argument doesn't work. No, no. You just said I
00:54:39
have this whole nostalgia whatever when you Yeah, you do. You're nostalgic for journalism as it used to exist. When I know I've beat you in the debate
00:54:45
is when you make it personal like that. It's not personal. I'm not being nostalgic. I'm trying to create a sustainable advantage for
00:54:52
America. And you are our public servant and you're learning AI. You will take my feedback.
00:54:59
You will take my feedback. We're going to ignore your feedback. Take your feedback. Public service. Throwing in the trash. No, you take it
00:55:06
and I will be showing up at the White House for my tour. You have this crazy idea that we're
00:55:11
going to win the AI race by tying one hand behind our back so that you can subsidize journalists so you can
00:55:17
subsidize movies. You'll get more content. You said before you want more training data, pay for it. Pay for more
00:55:24
training data. You're the czar. Take it back to POTUS. All right, let's keep moving here.
00:55:29
We have to keep moving. This is a great debate here on the All-In podcast. It's not going to
00:55:35
stop, folks. It's just you yelling. It's just you yelling saying things that don't make sense. Okay, you can say that.
00:55:40
You only have like three topics going on. You can you can personally attack. You know what it is? It's like we got to
00:55:46
let in more immigrants. Number one. Number two, high skilled immigrants. AI is going to put everyone out of work.
00:55:52
By the way, no sense of perceived contradiction between those two things. Number three, we need to, like, subsidize
00:55:57
here. You know, the audience sees the same. When the three of you guys attack
00:56:02
me, Jason, when you gang up on me like this, the three of
00:56:08
you gang up and you personally attack me, the audience comes up to me and they say, "Wow, you really nailed it and beat them."
00:56:13
Have you done that today? No, not yet. Not yet. But a little bit of the buting.
00:56:19
Yeah, that's true. Let him eat. He's emaciated. He's 11% body fat. Let
00:56:25
him eat. Let him cook. All right. Listen, you and I, Sacks, will do more debate and it's going to be amazing.
00:56:30
allin.com/ yada yada yada for tickets. Get in there, folks. Uh, we have to get to the docket. We're an hour in and we
00:56:37
still have all the news. We should talk about this um AI privacy issue that Sam Altman mentioned. All right, that's a great segue cuz I
00:56:43
saw that as well, David Sacks, and as our civil servant working on AI, this is something where you can have an
00:56:49
additional contribution. There's more work we can give you. All right, listen. Here it is. AI user privacy is becoming
00:56:54
an issue because friend of the pod Sam Altman says there is no legal
00:57:00
confidentiality when using his product ChatGPT. Here's a 30-second clip,
00:57:06
again, of friend of the pod (FOP) Sam Altman on Theo Von's podcast: people talk about the most
00:57:13
personal [ __ ] in their lives to ChatGPT. Young people especially, like, use it as a therapist, a life coach, uh, having these
00:57:18
relationship problems. What should I do? And right now, if you talk to a therapist or a lawyer or a doctor about
00:57:25
those problems, there's like legal privilege for it. We haven't figured that out yet for when you talk to ChatGPT. So, if you go talk to ChatGPT
00:57:31
about your most sensitive stuff and then there's like a lawsuit or whatever, like we could be required to produce that. And I think that's very screwed up. I
00:57:38
think we should have like the same concept of privacy for your conversations with AI that we do with a
00:57:44
therapist or whatever. Okay, Sacks, this is bringing up something super important. What's your take on it?
00:57:50
Okay. Well, I I think this is an interesting topic because like copyright, this is an area where we have existing law, but it does make you
00:57:59
rethink whether those laws are truly applicable or make as much sense in this new world. So, the existing law, the
00:58:08
existing example is search history. You know, the government can get a copy of your search history. They can subpoena
00:58:14
it. Yeah. Every true crime story starts with the person's search for how do I kill my husband slowly with poison, and then
00:58:21
they Yeah, that's right. Exactly. The point is though that I think Sam is right about the legal
00:58:28
treatment right now, which is that your chat history isn't any different than the search history in the eyes of the
00:58:34
law. But it is much more personal. It's much more interactive than your search history. You are using it like you said,
00:58:41
you could use it as your doctor. You could use it as your therapist. You could use it as your lawyer. And so the
00:58:48
ability for the federal government to be intrusive is so much greater than with
00:58:53
your search history. So I don't know what like the right policy should be
00:58:59
yet, but I I will say it does make me uncomfortable. Yeah, there's a market.
00:59:04
Can I make a recommendation to my AI czar? Yes, please. He's our... Why don't we let AI models
00:59:11
get bar certified and get medically certified? So, if the AI models, it
00:59:18
turns out, are actually proving to be more accurate, more thoughtful,
00:59:23
more responsive, more reasonable, whatever it is, whatever metric we're using, and they pass the same criteria
00:59:29
as one would need to pass to qualify for the bar or to qualify for a doctor certificate. Why don't we do that for
00:59:35
the AI? If that then happens, then the same privilege accrues to the AI as it does to the individual human that
00:59:42
does it. And now if you extrapolate from where that takes us, if we're suddenly giving AI the same sort of privileged
00:59:47
rights that we give to privileged humans, where's that going to take us ultimately with respect to the overall rights for AI?
00:59:52
Well, and they have responsibility. Hold on a second. I just want to point out here once again, you have a mind-blowing
00:59:58
concept here. I've never heard anybody vocalize that. Could they actually be certified in that knowledge? And if they
01:00:04
pass the test, makes sense they would. But then you also get responsibility. So with great power comes great responsibility. I will tell you this.
01:00:10
You can turn this stuff off. But this is an opportunity. I'm going to send a note to you. And it sounds crazy today, but I guarantee if you put it on Polymarket,
01:00:16
there will be a date when this happens. Polymarket. Shout out to Shayne. Let's get that up there. I just want to point out I'm going to email Elon about this
01:00:22
when I get off the pod. This is an opportunity to create the Signal equivalent of an
01:00:28
LLM. All of your chat should be encrypted. All of it should be, by default. Encrypt it by default on Grok.
01:00:35
Make it so that Grok can't even see it. They don't have it. So when you try to subpoena it, you can do what Tim Cook
01:00:41
does, which is he says, like, I don't have it. If you want to try to backdoor it, you can. That's a market
01:00:46
opportunity. I can tell you I only use the Brave browser and Brave search for this reason. I don't want my search
01:00:52
history like saved somewhere or whatever. [ __ ] that. You can take control of this as an individual, but the defaults matter and you have to then
01:00:59
do the work. It's a great market opportunity. Chamath, I don't even want to know what you're talking to ChatGPT about. What's in your Chat
01:01:05
GPT logs? What's in there, Chamath? How to extend? How to get the extra centimeter? What's
01:01:10
in there? You trying to What's in there? I keep asking it to find me a moderator.
01:01:16
Oh, great. I keep asking it to find me a participant who's not a douche. Um,
01:01:25
my god, you are so deep in your villain era and you're leaning into it and I'm so here for it, Chamath. I love your
01:01:30
villain era. You know why? I am so... Why are you going into your villain era? I am so risk-on right now. It's like
01:01:37
it's liberating actually. It's amazing. It's really amazing. Is there any blowback to how outlandish
01:01:45
you've become this year? Any blowback at all? Has it had any negative consequence on business or hiring or anything?
01:01:51
No. But outlandish? How have I been outlandish? Your filter's just off.
01:01:56
Your filter's off. And I think it's great. I think you're over two windows back. It's absolutely fantastic, what we're seeing here. I asked ChatGPT about my
01:02:03
future and uh my IQ. It's very interesting when you ask ChatGPT to analyze you. I suggest everyone do it.
01:02:08
Well, actually, yeah, when you just ask ChatGPT or whatever, what do you know about me?
01:02:14
And it's scary how much it already does. It's scary. There's this great personality test. You can put this personality test into Grok, and this guy
01:02:20
like made this prompt, and it goes and it tells you all your personality based on your Twitter/X history. It is wild how
01:02:28
accurate it is. What does it say about you, J? I'm actually curious. It says the same thing about all of us.
01:02:33
We're all, like, overworked narcissists, ENTJ. You can literally run
01:02:38
the Myers-Briggs against your... Yeah, your chat history. It's actually... But I like your mind-blowing concept
01:02:44
there, by the way, of like them becoming certified in some way. Okay, fresh economic news. It's time for the
01:02:51
administration to take their victory lap. GDP growth was 50% higher than expectations in Q2 as the Fed held rates
01:02:59
at 4.25%. In Q1, GDP declined 50 basis points. That's probably due to the
01:03:04
imports. People were stockpiling goods. That's the most pointless chart ever. Okay. And then Yeah, it is. I agree.
01:03:10
It's a little bit Yeah, it's distorted by I wanted to have both. I wanted to have both as bar charts.
01:03:15
This one... You're totally on drugs. Just say it. It's okay. What drugs are you on? I'm not. I had
01:03:21
coffee and now I'm out. I'm out. We're all friends. You can tell us. Is it really just out? All right, that's it. I'm taking it out.
01:03:27
Oh my god. I took it out. And now let's get back to it here. Okay. The Fed kept rates
01:03:32
unchanged for the fifth straight meeting. This time, two out of 11 Fed governors dissented from Powell's decision.
01:03:38
The two dissenters were both Republicans nominated by Trump. So, it seems like the Fed is becoming a little polarized now, too. First time in 32
01:03:44
years that more than one governor dissented. And um yeah, even one person
01:03:50
dissenting is rare. Here's a 25-second clip of Powell explaining how GDP factored
01:03:57
into the cut decision. Nick, please play the clip. Recent indicators suggest that
01:04:02
growth of economic activity has moderated. GDP rose at a 1.2% pace in
01:04:08
the first half of this year, down from 2.5% last year. Although the increase in
01:04:14
the second quarter was stronger at 3%, focusing on the first half of the year helps smooth through the volatility in
01:04:21
the quarterly figures related to the unusual swings in net exports. The PCE
01:04:26
index, and then I'll throw this over to you, Sacks, for the official position here, for June dropped on Thursday. PCE is
01:04:32
the Fed's preferred gauge of inflation over CPI. PCE rose 30 bips in June, in
01:04:37
line with estimates. And um, if you remember, we talked about in a previous episode, CPI rose a bit, 0.3% or 30 bips,
01:04:45
from May to June. So, we're not close to the 2% uh target, and
01:04:50
that's what the Fed keeps saying: we're not there yet. And the economy is al fuego, Sacks. I don't know if you
01:04:58
noticed this, but people are talking about the second quarter print, which was amazing for GDP. You were
01:05:04
talking about it a bunch on the socials. He keeps referencing the first half. So, he's trying to blend those two together,
01:05:10
I think, because of the tariff differences or, you know, maybe to smooth it out, as he said. What's your
01:05:15
take on this? The GDP boomed, you know, 3%, which is pretty great. The problem... No, the problem that Jerome
01:05:22
Powell has is that he's trying to smooth it because it allows him to justify his
01:05:29
political decision. Okay. But the reason why you have to segregate Q1 and Q2, Q1 was before tariffs and Q2
01:05:37
was after tariffs. So I think you have to segregate these two things. And if you look at the run rate from Q2, what
01:05:42
you're probably going to see in Q3 and beyond is more similar to Q2, which is
01:05:48
to say a large surplus, good GDP expansion,
01:05:53
and moderating inflation. So why does the Fed not cut? Because at this point,
01:06:00
not cutting is the only thing that you can do to slow the Trump administration down going into the midterms if you
01:06:06
wanted to politicize the job. If, however, on the other hand, you just
01:06:12
take the data as is and you ignore Q1 because it was pre-tariff and you start to look at Q2 and you project forward,
01:06:20
if you inject a 100 basis point cut into the economy, this thing is going to go gangbusters, and Trump is going to look
01:06:25
like an economic genius going into 2026. So I think that again in the absence of
01:06:30
politics, you cut. Okay. Sacks, what's the take from inside the administration and around it? I know
01:06:36
you're not speaking for the president on this issue, but you're in the administration, so I'm assuming... Look, I'm not speaking for anyone, but
01:06:42
obviously the 3% number is way ahead of expectations. It's a fantastic number. It just feels like, you know,
01:06:48
everything's humming on all cylinders here. One thing you didn't mention, but I think is relevant, is the new trade
01:06:54
deal with the EU. We're about to get to that, by the way. That's the next story. Oh, okay. Well, I mean, I would include
01:07:00
that because Okay, include it. Yeah. I mean, I think it was a deal that just got announced where the EU is going to
01:07:06
open its markets to US products. No tariff on US products, but they will pay a 15% tariff coming into the US. They're
01:07:13
going to be investing 600 billion in the US. They're going to be buying 750 billion of US energy. And then some very
01:07:22
large number, I guess they didn't specify the number, on defense products, basically American military products, hundreds of
01:07:28
billions, which is the follow-up to their commitment to raise their contribution to NATO to 5% of GDP, up from, I guess, it
01:07:36
was sort of like 2% before so I mean this is a huge deal for the
01:07:42
United States I think it's a huge win for the Trump administration and the deal is so good that what I'm seeing
01:07:48
from European sources on X, European publications, just commenters, is that
01:07:55
they were, like, outraged. They felt like they got taken to the cleaners here. Good. And
01:08:00
okay, you see a lot of that on X from the European side. A lot of the European leaders are saying that Ursula chickened
01:08:07
out. So, you know, all those stupid TACO memes are going away now because people are realizing that
01:08:14
Trump's willingness to raise tariffs on these countries as a threat to
01:08:21
renegotiate better trade deals is working. It's working extraordinarily well. Just this EU deal, one way to
01:08:27
think about it is you add it all up, it's about $2 trillion. It's effectively $2 trillion of stimulus into the US, but
01:08:34
without money printing. Yeah. Over the next three years. So it's not inflationary. It's not insignificant. Friedberg, your
01:08:40
thoughts on the Fed, the GDP print, and then maybe you could get into the granular
01:08:47
details of that print. If you pull up the schedule of data, so this is the
01:08:53
national income and product accounts data from the Bureau of Economic Analysis. So this is where the inflation
01:09:01
print comes from. I think there are two lines worth taking significant note of.
01:09:06
The first is the furnishings and durable household equipment line. So in June, the cost for furnishings and household
01:09:13
stuff jumped 1.3% month over month, right? That's almost
01:09:19
15% year-over-year if it were to continue at that level. And then the second one is this recreational goods
01:09:27
and vehicles, that jumped 9% month over month. Neither of those categories have jumped that much in kind of recent
01:09:33
history. So part of the argument that's being made is that what we are seeing in
01:09:39
these jumps is actually some of the first effects of the tariffs and the
01:09:44
cost of goods that are being imported because these are largely imports having an adverse effect on the consumer. And
01:09:51
so I think this is kind of a wait-and-see moment on some of these categories that are predicted to have a tariff price
01:09:57
effect starting to show through. So I think this is where a lot of folks are keeping a close eye, and it kind of provides a little bit of the support for
01:10:04
the economists that are saying we should keep rates steady, because if we are seeing a significant inflationary effect here, it's worth noting that there's
01:10:10
something that we need to be thoughtful about in rate policy. I think this is um a really good point. If you look at this debate, which is
01:10:17
obviously highly political, we're at inflation 2.5, 2.6, 2.7%.
01:10:24
Spending is increasing. Obviously, stock market at an all-time high. Unemployment trending down again. So, we're at like
01:10:31
4.1%. And people are just YOLOing into crypto and they're doing sports betting.
01:10:36
Bitcoin at an all-time high. I think the Fed now is in a position where cutting rates seems like putting kerosene on the
01:10:43
fire. If Trump tanked the economy in Q2, he probably would have gotten the rate cuts. But now I
01:10:49
don't think it's reasonable. As you're saying, Dave, the reasons to not cut
01:10:54
are building because the economy is on fire. So maybe the shock-and-awe approach to tariffs, which is now
01:11:02
becoming a playbook. I had a nice talk with Lutnick about this, who I love, by the way. He really described to me how
01:11:08
they're doing these. And the shock-and-awe playbook is basically: Trump says something completely outrageous,
01:11:14
shocking, everybody goes crazy, the media loses their mind, business leaders lose their mind. Lutnick told me that
01:11:19
what he does is he sets the table and proposes something reasonable. Because, you know, now I'm in, you know,
01:11:25
direct contact with all the administration. Sacks, thank you for that. Um, and he described it. Trump comes in,
01:11:31
sees all the stuff, and then he starts making these micro tweaks. So it's on the finish line. It's in the red zone, 5
01:11:38
yard line. Trump comes in and then he sticks it to them again with three or four extra asks and then they wrap it up
01:11:44
and this is becoming really effective. So it was chaotic at first. It seemed nonsensical, but they've put
01:11:50
the Fed in a really bad position because they've never seen this before. They've
01:11:56
never seen this before. So now they're going to be in this defensive position of what if we cut it and the market
01:12:02
rips. To your point, Chamath, you just said the market will rip the second they cut. And the cynical view of this
01:12:08
is the market rips as we go into the midterms, which is the same claim the Republicans made about the cuts that
01:12:14
Biden did in September right before the election. So, there is some level of politics and gamesmanship going on here.
01:12:20
But you have to hand it to the Trump administration for what they're doing
01:12:25
with this sort of 2.0 playbook. If this was, Sacks, premeditated, and we all just didn't understand it, fine.
01:12:32
The outcome here is this administration has to live or die by the results of these 600 billion from
01:12:40
the EU, 550 billion in investment from Japan. You put those two together, I
01:12:46
asked Lutnick, at the event, is that going into the sovereign wealth fund? And how does that get, you know,
01:12:53
spent? And he said at the discretion of the president and he's advising him to spend it on putting more nukes in. So
01:12:59
that's fascinating. We have a trillion dollars now that we can put into nuclear
01:13:05
power plants and these small modular reactors. And that's what Lutnick said he wanted to spend it on. He's going to
01:13:11
advise the president to spend it on. But now we've got them investing in our country. It's absolutely brilliant. If
01:13:16
it works out; we'll see if it works out. April 2nd was Liberation Day, and the
01:13:22
media went crazy. They were predicting a Black Monday, a market crash. They basically tried to spook the markets and
01:13:28
create fear. They said that we're going to go into a recession or depression. And now look at where we are. It's just
01:13:33
a few months later. All the markets are at all-time highs. Trump has extracted trillions of dollars in these trade
01:13:40
deals that people know. Premeditated. Tell us the truth. Premeditated. Hold on. President Trump has extracted
01:13:47
trillions of dollars from other countries using powers that other presidents didn't even know they had.
01:13:53
100%, 100%. Was it premeditated? Cuz it was chaotic. The market did the market
01:13:59
moves because of the media, by the way. Those moves were because they were scared. And we just had a 3% GDP growth
01:14:05
print. How things could be... What I think happened is that President
01:14:11
Trump saw an opportunity here that other people ignored. It's like when a CEO comes into a company, a new CEO comes in,
01:14:18
and that company's been mismanaged for a decade, but it's got wonderful assets on its balance sheet, it's got a market
01:14:24
position that's still very strong but has been underutilized. And he came in and understood that the United States had
01:14:30
tremendous leverage in all these trade negotiations. Actually, they weren't even trade negotiations then, in all
01:14:35
these trade relationships. And he was able to essentially renegotiate all of them. And look at the results. I mean,
01:14:42
they're just staggering. And you know, everyone said that, "Oh, Trump's going to chicken out. He's not going to hang
01:14:47
tough." It's all these other countries that have folded like, I don't know, lawn chairs. I mean, they have all
01:14:54
capitulated. Yeah. Yeah, they folded. It's really remarkable. But you're not answering my question.
01:14:59
Was this premeditated? Give us some insight here. I don't know what this is. What are you talking about? When they came out and they were like,
01:15:04
"Oh, 100% tariffs, 200% tariff." The market was not making that reaction based upon the media. They were making
01:15:11
it based on what Trump was saying. So, was it premeditated, this shock-and-awe,
01:15:16
shock-and-reasonable negotiating strategy, or do you not know? Well, you're not privy to... Look, I'm not speaking as an insider
01:15:22
here, but we said at the time, when all of that was happening and Larry Summers was on the pod preaching doom, that
01:15:29
all of that was an opening bid. It was all a start to a negotiation and we had to see where it ended up and that the
01:15:36
administration still had to stick the landing. Okay, but I got to say based on EU, Japan, and South Korea, I mean, this
01:15:43
is looking really good right now. Well, listen, it's the top five that are like 90% of the negotiation. As Trump
01:15:49
said, there was another little note he did in the keynote when he kind of drifted into his, you know, different things he wanted to talk about where he
01:15:54
said, "I don't even need to know about the bottom countries. I've never even heard the names of some of these countries." He's just got to nail the
01:16:00
what, the top five, the top 10, and we're done. And this administration has to stick the landing as well because
01:16:07
these are handshake deals right now. They have to be inked. They have to be approved. So, there's there's a lot more
01:16:12
work left to be done. But I said as well, there's one other piece of it. We talked about the um,
01:16:18
you know, the fact that Europe has 0% tariffs on American products, but
01:16:23
even after this deal, European products coming into the US will have a 15% tariff. And
01:16:30
we're not including the $600 billion of European investment in the US. We're not including the 750 billion of sales of
01:16:36
American energy to Europe. Okay, just talk about the tariff. That 15%, and what we're seeing now across the board, is
01:16:42
generating about 300 billion a year of additional tariff revenue that goes to help balance the budget.
01:16:48
Yeah. So 300 billion a year over 10 years is 3 trillion. That is a big number. It's incredible. Yeah. It's got to
01:16:54
I don't know if that completely satisfies Friedberg, but that's a big help. Friedberg, do you think that there is a
01:17:01
chance that inflation is going to tick up because of all this? Like, this is a
01:17:06
lot of money being pushed into the system again. So, could we see a three-handle on inflation in the next 6 months or what's the probability of that
01:17:12
in your mind? That's the big concern everybody has. I don't know. I don't know. I think the big question, if you look
01:17:18
at each of these categories, one way to think about it is how much margin is the seller making?
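Friedberg's margin question above can be turned into back-of-the-envelope arithmetic. The sketch below is purely illustrative: the 15% tariff, the margins, and the absorbed share are assumed numbers, not figures from the episode, and treating margin points and price points as directly trading off is a deliberate simplification.

```python
# Hypothetical tariff pass-through sketch: the seller absorbs some share of a
# tariff in margin and passes the rest through as a higher consumer price.
# All rates are fractions (0.15 = 15%).

def price_increase(tariff: float, absorbed_share: float) -> float:
    """Price rise (in price points) when the seller absorbs `absorbed_share`
    of the tariff (0.0 = pass it all through, 1.0 = eat it all in margin)."""
    return tariff * (1.0 - absorbed_share)

def new_margin(margin: float, tariff: float, absorbed_share: float) -> float:
    """Seller margin (in margin points) after absorbing part of the tariff."""
    return margin - tariff * absorbed_share

TARIFF = 0.15

# Thin-margin seller: nothing to absorb, the whole tariff shows up in price.
print(price_increase(TARIFF, absorbed_share=0.0))    # 0.15, a full 15% price rise

# 30% margin seller splitting it: absorb two-thirds, pass one-third through.
print(new_margin(0.30, TARIFF, absorbed_share=2/3))  # ~0.20
print(price_increase(TARIFF, absorbed_share=2/3))    # ~0.05
```

On these assumed numbers, the two-thirds-absorbed case lands exactly on the "margin down to 20%, price up 5%" split, while a seller with no margin cushion is the pure pass-through, pure-inflation case.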
01:17:24
If they're making 30% margin and we charge a 15% tariff, does their margin
01:17:30
go down to 15%? Or do they take their margin down to 20% and raise the price
01:17:35
by 5%? What's the right balance? And what will happen is that now, with this
01:17:41
effective, you know, tariff, which is a sort of tax on the system, a tax, the market will find its kind of new
01:17:48
equilibrium where the buyers are willing to pay X and the sellers are willing to sell at Y, and I think every market's
01:17:54
going to be a bit different. So I think in some of these categories we will see significant inflation where there is a
01:17:59
very thin margin that the seller has in selling and in some of the categories where there's a monopoly and they have a
01:18:04
big margin they're going to eat it because they don't want to have competition and they don't want to see pricing competition emerge. So I think
01:18:10
we'll see it vary by category, and, you know, we'll see how it goes. All right, listen, this has been another
01:18:16
amazing episode of the number one podcast in the world, according to Jensen Huang from Nvidia, and me. And uh, great job
01:18:23
everybody. Great job to everybody. It's a class. Great job everyone. Even JCAL. Even J. Even Jason. Great job.
01:18:31
And actually, I want to thank Friedberg, cuz Friedberg did most of the work to organize the AI summit. Let's give him a big shout out. There's
01:18:37
me. Great job. I mean, guys, can we just make a note here? One of us can run for Manchurian candidate
01:18:45
president in eight years. And look at me and the president. I put on the red tie out of respect. I put my blue suit on
01:18:51
out of respect for the president. Does it not look like I'm running? President
01:18:56
Jason.com? All right, listen. That photo could be like, you know, that
01:19:02
famous photo of uh Bill Clinton meeting JFK. You know, that could be the thing that that could be the thing that
01:19:08
I'm in, like, that famous image that propels you to the presidency. I'm in, like... Thank you for giving me that
01:19:14
and for putting me in touch with each member of the administration directly. Thank you for that. And we had a wonderful tour of the White House the
01:19:21
next day. What a wonderful tour some of us had at the White House the next day. But in all honesty, no. I was
01:19:27
Did you? No. I was taking the pictures. That was my joke, cuz all of you guys were getting a tour. We could have gotten
01:19:33
you a tour. I mean, listen, I love J. Did you ask for a tour? I'm I'm not the kind of guy to ask. I'm
01:19:38
the guy. Some of us have actual meetings to do, bro. I mean, it's all good. It's all good. I got a lot going on. I got a lot to announce in the coming weeks. But Sacks,
01:19:44
do take us behind the scenes here. And I think it was hilarious. So, I don't mind getting trolled by the president. It was great. But how did you
01:19:51
how did that go about behind the scenes that he nailed that joke? Don't tell him. Leave it. Leave it. What did you do? I mean, cuz that looked
01:19:57
like it was workshopped or is he just naturally I mean, he's obviously naturally comedic, but did you put that in with him? Did you have to clear that
01:20:03
with him? Hey, Dunan, Jal, whatever. Well, they asked me for the names of,
01:20:09
you know, my co-hosts, so they could do shout-outs. So, I gave him the list. Oh, no. And I just, I said, and I put "even
01:20:16
JCal," but, I mean, he went for it.
01:20:23
No, he he he got we went through it. So he got the joke. Okay. He got the joke. We went through it. He got the laugh. He got it. He heard
01:20:28
the laugh and he heard the laugh and he doubled down. I thought it'd be funny. But no, we went through everyone's names beforehand
01:20:35
and uh I mean, talk about EQ. The guy's EQ is off the charts, man. His timing
01:20:42
is great. I suggested I suggested the name JCAL and he's like, "No, no, give me his full name." He thought it was more courteous.
01:20:48
He's actually a very courteous man. He wanted to use your
01:20:54
full name, not just your nickname. I think what he probably realized was for my parents who were just over the moon. So, thank you for
01:21:00
that. It meant a lot to my dad, who's... That's lovely. Yeah. He's been struggling a bit, and it really... Let me get a little choked up
01:21:06
here, but my dad's been struggling a bit. And um I got to see him in Brooklyn after that and we were on a tech stream
01:21:11
and it meant a lot, you know, cuz for a kid from Brooklyn to get a shout-out from the president of the United States
01:21:16
is... you made it. I mean, it's just... Your father should be really proud of you. Thanks, man. Appreciate it. I appreciate
01:21:22
it, boys. All right, listen, for your Sultan of Science, the amazing Dave Friedberg, who put that event together
01:21:27
in 10 days and then jumped right in. He's got to run Ohalo at the same time. So I just want to give our MVP of
01:21:33
the week. We should give a shout out to the Hill and Valley guys for partnering with us. Jacob Helberg did a great job.
01:21:38
Love Jacob. I love Jacob. And Delian and Chris. Thank you guys. They were our partners on the event.
01:21:44
Hill and Valley did a great job. Yeah, I love those guys. But yeah, I'm giving the MVP of the week of the besties to you, David
01:21:51
Friedberg. You put a lot of work into this, and we appreciate it. You're running Ohalo and then you went right into working on the All-In Summit, which
01:21:57
we'll be at in a couple weeks. Chamath, thank you for buttoning up. We're getting a few complaints from the HR department about the buttons, and
01:22:03
so we've now renegotiated that. I'm going to unbutton three buttons now and walk around Forte.
01:22:09
Perfect. And Sacks, I will see you at the White House. JD and I will be in the commissary. So, we'll invite you to lunch with us and JD.
01:22:16
It's called the Navy Mess, actually. Yeah. And you know what? Lutnick's joining us as well. And
01:22:21
um, who's our energy guy? Chris. Chris said he wanted to jump in on that. So maybe you can join us. I'll invite you,
01:22:27
now that I am deep into the administration. Thank you for tuning in, everybody. Allin.com/events. The scholarship
01:22:33
tickets are up. So, if you want to try to get one of the very few scholarship tickets, we always like our up-and-comers. Please, if you're of means,
01:22:40
don't apply for the scholarship. You won't get in. But if you're up-and-coming and you're part of the audience and you want to get one of those discounted tickets, we have a limited
01:22:45
number of those available. Allin.com/events. Love you, besties. Bye-bye. Love you. Bye-bye.
01:22:52
Let your winners ride. Rain Man David Sacks.
01:22:59
And instead, we open sourced it to the fans and they've just gone crazy with it. Love you. Queen of quinoa.
01:23:07
[Music]
01:23:12
Besties are gone. That's my dog taking a notice in your driveway.
01:23:20
Oh man, my dasher will meet me up. We should all just get a room and just have one big huge orgy cuz they're all just
01:23:26
useless. It's like this like sexual tension that we just need to release somehow. Wet your feet. Wet your feet.
01:23:34
Your feet. We need to get merch.
01:23:41
[Music] I'm going all in.

Badges

This episode stands out for the following:

  • Most shocking: 70
  • Funniest: 70
  • Best overall: 70
  • Best concept / idea: 70

Episode Highlights

  • The Importance of Unbiased AI
    A discussion on the need for AI models to prioritize accuracy and truth over ideological agendas.
    “We don't want AI taking an Orwellian direction.”
    @ 22m 19s
    August 01, 2025
  • JD Vance: The Politician of the Future
    Admiring JD Vance for his engaging and opinionated approach to politics.
    “I love JD because he's young, opinionated, and likes to mix it up.”
    @ 24m 56s
    August 01, 2025
  • Debate and Discussion in Politics
    The podcast emphasizes the importance of open debate and discussion in political discourse.
    “This is one of the great things about this administration: they love to mix it up.”
    @ 33m 06s
    August 01, 2025
  • The Fragility of Copyrights
    Exploring the future of copyrights in an AI-driven world raises critical questions about their validity.
    “The idea that there are copyrights is fragile.”
    @ 43m 13s
    August 01, 2025
  • AI and User Privacy
    Sam Altman highlights the lack of legal confidentiality for conversations with AI, raising privacy concerns.
    “We should have the same concept of privacy for your conversations with AI as with a therapist.”
    @ 57m 38s
    August 01, 2025
  • The Future of AI Certification
    A proposal emerges to certify AI models like doctors or lawyers, raising questions about rights and responsibilities.
    “Could AI actually be certified in knowledge?”
    @ 59m 58s
    August 01, 2025
  • Emailing Elon
    A bold move to reach out to Elon Musk about a market opportunity.
    “I just want to point out I'm going to email Elon about this”
    @ 01h 00m 16s
    August 01, 2025
  • Villain Era
    Embracing a new persona can be liberating and empowering.
    “It's liberating actually. It's amazing. It's really amazing.”
    @ 01h 01m 37s
    August 01, 2025
  • Scary Insights
    Asking AI about oneself can reveal unsettling truths.
    “It's scary how much it already does.”
    @ 01h 02m 08s
    August 01, 2025
  • Brilliant Economic Strategy
    A potential $2 trillion stimulus without money printing could reshape the economy.
    “It's absolutely brilliant. If it works out, we'll see if it works out.”
    @ 01h 13m 16s
    August 01, 2025
  • A Personal Touch
    A presidential shout-out meant the world to a struggling father.
    “It meant a lot to my dad who's been struggling a bit.”
    @ 01h 21m 00s
    August 01, 2025
  • Proud Brooklyn Moment
    A heartfelt moment as someone reflects on receiving a shout out from the president.
    “For a kid from Brooklyn to get a shout out from the president...”
    @ 01h 21m 11s
    August 01, 2025

Episode Quotes

Key Moments

  • JD Vance Admiration (24:56)
  • Open Debate (33:06)
  • Copyright Fragility (43:13)
  • AI User Privacy (57:38)
  • AI Insights (1:02:08)
  • Economic Strategy (1:13:16)
  • Proud Recognition (1:21:11)
  • Motivational Sign-off (1:22:52)


Related Episodes

Trump Takes On the Fed, US-Intel Deal, Why Bankruptcies Are Up, OpenAI's Longevity Breakthrough
Software Stocks Implode, Claude's Hit List, State of the Union Reactions, Trump's Tariff Pivot
AI Psychosis, America's Broken Social Fabric, Trump Takes Over DC Police, Is VC Broken?
Trump Rally or Bessent Put? Elon Back at Tesla, Google's Gemini Problem, China's Thorium Discovery
Bond crisis looming? GOP abandons DOGE, Google disrupts Search with AI, OpenAI buys Jony Ive's IO
Inside the White House Tech Dinner, Weak Jobs Report, Tariffs Court Challenge, Google Wins Antitrust
IPOs and SPACs are Back, Mag 7 Showdown, Zuck on Tilt, Apple's Fumble, GENIUS Act passes Senate
Inside the All-In Summit: Behind the Scenes of the World's Greatest Conference 🚀