Transcript of Inside the Iran War and the Pentagon's Feud with Anthropic with Under Secretary of War Emil Michael

All-In with Chamath, Jason, Sacks & Friedberg
01:22:58 587 views Published 29 days ago
00:00:00

All right, everybody. Emergency podcast time. Episode 263 of All In. We have Emil Michael, the Under Secretary of War for Research and Engineering, working directly for Pete Hegseth. We had to get this out to you on Thursday night because it is an emergency pod. One of my old besties, Emil Michael, is here. Emil and I were part of Team Uber back in the day; he was Travis's right-hand man. Some might say, fixer. And Emil Michael is now the Under Secretary for War here in the United States, serving his country like our bestie, David Sacks. Welcome to the program for the first time. Emil Michael, how are you doing, brother?

00:00:41

I'm doing good. I hope it was more than the fixer, but you know, raising $20 billion. Dealmaker, fixer.

00:00:47

You got it done. You got it done. He would give you the hardest things. Yeah, just to- That's right. Fair enough. If it was hard, and that's what a fixer is.

00:00:55

An operational ax, that's what they call him.

00:00:58

All right, sure. In Brooklyn, we call them fixers.

00:01:01

A Rainmaker.

00:01:03

There's that, too. There's that, too. Making it happen. With us again, Chamath Palihapitiya. How are you, brother?

00:01:10

Great.

00:01:11

Yeah, look at that smile. What do you got going? You got some pokers in the fire. I'm not going to say any. In the coming weeks, I think some news is going to drop. That's my prediction.

00:01:23

I don't have my- Are you loving Chamath's tweet mogging that's been going on this week?

00:01:26

So good.

00:01:27

So good.

00:01:28

So good.

00:01:30

He's looksmaxxing by default, but he's been mogging the gooners.

00:01:34

Yeah, so funny.

00:01:36

What was your favorite favorite?

00:01:38

The one I sent you this morning that you said, what did you say? It's so funny.

00:01:41

Are you collecting your losses by tax harvesting?

00:01:45

What did you say? He's like, Chamat said... Oh, my God. It was just like...

00:01:50

Yes, I did. Yes, I did.

00:01:51

Yes, I did. Someone said something to Chamath. He dropped in. Why is everyone so mad at Chamath? All he did was lose billions of retail investors' money, probably on one-page SPACs. It's not like he then told them to enjoy their capital losses or anything. Give the man a break. Chamath's response: Yes, I did.

00:02:08

All right, piling on. He's your sultan of science, everybody's favorite. Some great science that he brought to the show last week. Friedberg, how are you doing?

00:02:20

Yeah, I've been traveling this week back at home.

00:02:22

All right. Sacks is out today. He's very busy on Capitol Hill. We'll talk about what he's up to next week.

00:02:27

Let's go. Come on. Let's go. Let's go.

00:02:29

Let's go. Go, Jason.

00:02:30

All right, the US and Israel launched a joint attack on Iran on Saturday. Today is day six of Operation Epic Fury. Iran's Supreme Leader Ali Khamenei was killed within hours of the operation. Forty senior officials have also been killed. The death toll so far: about a thousand people, according to reports. Tragically, six US Army Reserve soldiers were killed following a drone strike on a base in Kuwait. A US submarine sank an Iranian ship off the coast of Sri Lanka, the first torpedo kill since World War II. Why we're at war has been a bit of a moving target and a debate. The first explanation came from Rubio: he said Israel was going to attack and the US had no choice but to participate. He later walked that back. Trump made it clear this is not a regime change effort; we're doing this to stop terrorism and the development of ICBMs by, obviously, a pretty crazy group of individuals, and obviously nuclear bombs, which we blew up a couple of weeks ago. Trump also mentioned the people of Iran should seize the moment and take their country back. Hegseth, who I believe is your boss, Emil, said: This is not a so-called regime change war. But the regime sure did change, and the world is better off for it.

00:03:49

So here's an interesting Polymarket. Right now: US forces enter Iran, meaning boots on the ground, by the end of March, 40% chance; by the end of the year, 59% chance. So the idea that we're not going to have boots on the ground, the sharps on Polymarket believe we will. Will the Iranian regime fall? By June 30th, 39% chance, according to Polymarket, and by the end of the year, 51% chance. So, Emil, I guess there are two questions people really want to know. I'll leave off why we're doing this; I think President Trump has been pretty clear now. But how long is this going to take is one question. And are we going to have to have boots on the ground? Maybe, what is success here?

00:04:34

I think the President talked about this as a weeks-not-months operation, and it's aimed at essentially disarming the regime, or the country, in such a way that they can't supply Hezbollah, Hamas, the Muslim Brotherhood, all the terror groups that get sponsored with weapons and money from Iran, not to mention the nuclear bit. And that's why you see from the reporting, they're going after the depots. We went after the nuclear sites before. They're a prodigious drone maker: these huge one-way attack drones that can go hundreds and hundreds of miles. Lots of ballistic missiles that are aimed at every country in the Middle East; as you've seen, they've attacked them. So I think that's one. In terms of boots on the ground, there's no scenario where we have some protracted boots-on-the-ground, Afghanistan or Iraq II-like scenario.

00:05:40

Friedberg, your thoughts on this war? Obviously, a lot of people voted for Trump in order to have the peace dividend; he was, in his first term, absolutely the peace president. Now, here we are. Eight countries have been bombed, we've had two leaders deposed, and one of those two has been killed. Your thoughts, Friedberg?

00:06:03

I think the President and the administration have probably the biggest meetings of the term coming up in China in April. My estimation, based on the conversations and the comments made by the President before he came into office and since he's been in office, is that finding a grand bargain or a deal with China is probably one of his top priorities. If you think about the importance of that, is the US going to wade into a giant global conflict led by a US-China rift, or is the US going to find some grand bargain? I think he would probably have a preference for the grand bargain. That being the case, I think you could look in the context of Maduro and the actions in Iran as creating maximal leverage going into those negotiations.

00:06:51

The reason for that, Friedberg?

00:06:53

90% of the oil that comes out of Iran goes to China. There's been a long-developing and developed relationship between Maduro's government and China, and these are big economic drivers, or support the economic drivers, in China. Creating leverage by having significant influence or damage or destruction to those supply chains for China gives the United States footing to be able to negotiate a better deal for America. I would imagine that the President's intention here isn't to go and decide who should be in charge, drive regime change, and end up in a years-long conflict with Iran. But ultimately, if there's some transaction with China that gets everyone out of this and puts the US on a strong footing where American businesses can sell into China, which is very challenging today, as everyone knows; where there's regulatory parity, economic and trade parity between the US and China; where there's a point of view on what happens with Taiwan and the availability of key technologies like semiconductors; I think it could be a win-win. I think that a deal with China could be the crowning achievement of this administration, particularly going into the midterms. So the timing is right, and I think that's probably a core part of the motivation here.

00:08:07

Chamath, your thoughts on this action and why we're doing it? You've heard, obviously, the President's position: we're not doing regime change. It's a secondary effect, obviously, but we want to stop those ICBMs and nuclear bombs from being developed, and we want to stop terrorism. Additionally, Friedberg says, hey, we're framing this great discussion we're going to have with Xi and China, and oil is part of that. Where do you stand on all this?

00:08:38

I'll build on both what Emil said and what Friedberg said. I don't think this is about regime change, and I don't think it's about a local, regional conflict. I think if you take a step back and zoom out, the most important thing that we did in the last three months was, by taking out Maduro and by taking out the Iranian leadership, we created enormous leverage, as Friedberg said, with China. Now, why is that important? Because I think all of this centers around that geopolitical discussion. Last night, something important happened, which is that the official Chinese bureaucracy posted what their GDP targets were. And it was shocking to anybody reading it, because what we saw was that they guided to a range of 4.5% to 5%, which, if you look at the historical context of that growth, is the lowest that it has been in about 30 years, three decades, since before they entered the WTO. And the question that one should ask is: when a country that was growing at 8, 9, and 10% starts to grow at half that rate, yet has double the number of people and double the GDP, what happens? You already have incredibly high domestic unemployment, especially youth unemployment.

00:10:06

Does it become more or less chaotic? I think the historical record of every other country would show that it will become more chaotic. If you have that as a starting point, what is it in China's best interest to do? I think it becomes obvious that the right thing to do would be to invade Taiwan. Why? Because you start to create a sinkhole that occupies your people, that occupies resources, that can get domestic production up and running, that can start to generate a war machine. And you see the economic impact of war machines in any country during any conflict. And if I had to guess, just to build on what Emil said, the President saw that, and I think what they did can be summarized in this chart, which I sent to Nick. So if your goal is to prevent war with China, which is a massive global conflict, which could be nuclear, which could be cataclysmic, how would you do it? This chart paints one way to do it. If you look at the conditions inside of the Chinese economy, the most interesting takeaway is that they are enormously dependent on imported oil: about 20% of their economy.

00:11:18

But it's not just 20% of their economy, because it's 100% of these critical things that create GDP: logistics, transportation, aviation, feedstock inputs. And of that, about 19%, about a fifth, comes exclusively from Iran and Venezuela. And now all of that is off the table. So if you take that, and then you see what Steve Witkoff and Jared Kushner and Josh Greenbaum have been doing, which is trying to get a deal done in Russia, and you put all of these things together, because, by the way, if you add Russia into that mix, it's about 40% of China's oil. Not only do you re-dollarize, not only do you stop the funneling of all these illicit oil funds into creating chaos all around the world, but you hem in China going into a massive moment at the end of March, beginning of April, where, as Friedberg said really astutely, there is the potential for a grand bargain. I think that secures global safety. And that is a huge thing for America.

00:12:20

Emil, how much does this have to do with China?

00:12:24

I think my instinct is, and I'm not speaking for the administration on this, that's a second-order benefit of some of these things. You said eight conflicts. There have not been eight conflicts. We inherited Gaza, we inherited Russia-Ukraine. Venezuela was its own operation, and then you could attach to it the drug boats that were coming out of there as one big operation. Then the Houthis: Biden was just ignoring the Houthis. They were just shooting at our ships. That was very limited in terms of, stop shooting at our ships, we need freedom of the seas. That's something any president should be doing, generally, I think. Iran is the one material conflict outside of Venezuela, so it's not that many. And how long did Venezuela last? It was one raid, one night.

00:13:21

I guess that was really a few hours. This is an important note, I think, for you, Emil, to explain to us. I've made love for longer than that. There's a new approach here with regard to these actions, which is no boots on the ground, and you, of course, have better information than anybody else does. I don't think anybody would have known Venezuela would have gone as well as it did. And so far, and listen, we've got a long way to go with Iran, this has gone very well as well. So explain to us what you, the President, and Hegseth know that makes these two operations go so smoothly. What is it? And then there's obviously some new technology here in the case of what happened in Venezuela.

00:14:08

Yeah. Besides the discombobulator, what we've got is a very well-trained military. The global war on terror was a disaster in so many respects. But the people who fought it are generals now, and so they've learned a lot of lessons. When you compare that to the Chinese military, they don't have a lot of experience. In fact, in the decapitation they did in the Chinese military, the one guy they took out was the one guy who had experience in Vietnam. So they don't have conflict experience. And that matters, because you understand going in what are the things that could go wrong. And then you have incredible technology: space, air, land, sea, cyber, all kinds of effects that you can bring together. And so you imagine a hundred guys go into the most fortified compound in Venezuela, where the President is, take him and his wife out safely, and are out with no KIAs. Incredible. I mean, it's incredible, right? Stunning. Yeah. And these things, these war games, have been on the shelf for a long time. Every scenario has been planned for years ahead of time. Midnight Hammer in Iran was planned years ahead of time in terms of how you would do it if you were going to do it.

00:15:30

And then you keep refreshing the tactics, techniques, and procedures, and you're updating them. So we have a very sophisticated way of doing these things to minimize loss of life, and that's why they succeed.

00:15:42

Can I ask a question? Of course. I don't want to derail this conversation, but is the discombobulator real? What can you say about the discombobulator? I can't say. It's real. I was obsessed with this when I saw it on X. I was like, what is this thing? I mean, I need it in my house. Can I just push a button on it?

00:15:59

My kids just- That's just for when Helmut shows up.

00:16:01

Oh, my God.

00:16:02

Not meant for your kids. I don't know, maybe if they're behaving badly. No, can't talk about it.

00:16:06

Emile, do you think we would have been able to pull off that mission as successfully as we did five years ago, 10 years ago? Has the technology improved that quickly, that this is not something that's been possible historically? And how does that change the pacing and the face of war for the next couple of years?

00:16:22

I'd say no, it wasn't only a technology maturation from five years ago. It's the rules of engagement. The rules of engagement that we used to have, if you read about them, some of them were insane. In Afghanistan, if the guy had a small gun, you had to have a small gun. There was this parity, in weird ways. And you're like, well, is the objective to have a fair fight or an unfair fight? Well, if you're on our side, you want it to be unfair. So the rules of engagement were relaxed to be- Who writes those, Emil?

00:16:58

Who sits in an office and says, You can't shoot back if a combatant is shooting at you if you aren't matched gun for gun.

00:17:06

Who writes that? Crazy policies that are written in the military departments. That's why, when Secretary Hegseth talks about this thing and what was happening with him when he was in Afghanistan and Iraq, if you ever read his book, he's like: the rules of engagement were so punishing that we were at risk all the time, because you had to have a legal understanding of what was happening every minute on the battlefield. As opposed to: well, your job is to take out these guys and protect these guys. Here are your munitions. Here are the red lines. And then, in the middle of that, go, use your judgment. Your commanding officer, use your judgment on how to win. And we've gone back to that: use your judgment, push responsibility to the field, still have your red lines. But other than that, the objective is the objective. It's more of a Colin Powell approach: go all in, have a clear objective, use overwhelming force, come out. And we were not doing that for the last four years.

00:18:05

And then going back to the face of war going forward, my understanding is that there have been more drones deployed by the United States this past week than we've done in the history of military activity. Is that right? And how does that really change things going forward here?

00:18:20

It changes it big time. The Predator drone was the first big drone program, 10, 15 years ago. It was this big, honking drone. And then, if you remember, Obama would take out some of these Al Qaeda leaders with drones on their balconies and things like that. I think President Trump took out Soleimani with a drone near his car. That was the beginning. And then the Russia-Ukraine war happened, where it's drone on drone; 70% of the casualties are because of drones. So drone-on-drone warfare, robot-on-robot warfare, those things are the future for sure. And that's why companies like Anduril are what they are. It's because they're making unmanned systems.

00:19:04

And this has been something you've specifically been very focused on, and you tweeted today a little bit about a competition. We'll play a little video here. And this LUCAS, the low-cost unmanned combat attack system. It used to take a lot of time, certainly not startup time, to get new product into the channel for our military to use. Explain what program you're running here. It feels like the DARPA self-driving challenge all over again. And what do these drones cost? I know there's a company making them for, I think, $35,000. Am I correct?

00:19:37

I mean, the small drones like the one I'm holding right here are way cheaper than that. The LUCAS one-way attack drone, which can go 5, 6, 700 miles at the speed of an airplane and carry a big warhead, those are like $50,000 to $80,000, depending on what equipment you put on it. But we have a drone dominance program, and we basically have to build an arsenal of drones. Now, are we likely to have a territorial conflict like Russia-Ukraine with Canada and Mexico? No. But we do want to take out drug drones at the border. And long-range one-way attack drones are important for any major conflict like you're seeing in Iran, but also to protect military bases, for America250, the World Cup, the Olympics in '28. There are a lot more uses of drones for surveillance, not just for combat.

00:20:34

There, you're showing drones that are human-operated, but how much of this should basically be AI, so that it's just computer vision? Again, back to what you said before: a model understands the rules and the red lines, but otherwise it's, be on task and accomplish your mission. How much of it is one versus the other?

00:20:54

I believe that a sophisticated drone war is going to be drone swarms, controlled by AI to some degree or another; to what degree, the control matters. For example, drones have decoys they can spit out; they can dazzle, they can put out things. So how do you discriminate what's a drone and how to hit it? You can use AI for that, because it's learned how to do automatic target recognition, for example. And then also, could it identify a person? And does that make it safer, so it's going after someone you actually want to get and not someone you don't want to get? So there are a lot of uses for AI at the edge, if you will, in the future here. The Ukrainians and Russians do something called a kill box, where a drone loses comms because it's jammed, and then it just starts going in a box and looking for the person they're trying to get, and they're starting to use AI to do that.

00:21:50

And China has this ability already, probably times some order of magnitude, yeah?

00:21:58

They have drone swarms because they can force the companies that make them, not just DJI, to interoperate. Interoperating drones is called heterogeneous autonomy, right? Taking different kinds of drones, having them communicate with one another, and then making sure they're not going after the same target is a pretty complex thing that they're definitely working on.

00:22:21

Let's talk about the fidelity of these. Obviously, AI is a new technology. It can make mistakes. Anybody who uses it on a day-to-day basis might experience a hallucination. How confident are you in the AI? In the Ukraine and Russia conflict, they obviously are not going to be as thoughtful, maybe, as we are in putting this together; they're in a hot war right now. But we, as the United States, have to be very thoughtful about this. So how confident are you that this isn't going to make a mistake? I think that's the key to a lot of this debate. And when will it be, and I guess this dovetails with the self-driving thoughts, perfect being defined as much better: it has to be a magnitude better than a human. So when will this be a magnitude more accurate than when we make a mistake as a military and we kill a civilian?

00:23:12

No, it's a good question. I don't know when that moment hits, that FSD moment, where it gets better. It's certainly not there. And you wouldn't want to take huge risks with that. There's a gradation of when you would use it and what risk you're trying to take or not. If you were trying to take out a drone using AI, using a laser or something, you'd be pretty okay making mistakes, because you just missed the drone with the laser; the laser goes off, it's all over. If you were doing something more sophisticated in a populated area, a densely populated area, you'd take less risk. So we're developing procedures and tactics for each scenario. And this is part of the debate I had with Anthropic, which is: we need AI for things like Golden Dome. A Chinese hypersonic missile comes up, you've got 90 seconds before it separates into all kinds of decoys and you don't know where the actual payload is, and you want to hit it from space. And a human doesn't have the reaction time and may not be able to discriminate with their own eyes what they're going after. That's a pretty low-risk thing, because it's in space and you're just trying to hit something that's trying to hit you.

00:24:26

So I think in the next 10 years, you're going to see a lot of these applications develop AI to one degree or another, so long as we think it's safe and it's not going to make mistakes.

00:24:35

Before we get on to the Anthropic discussion, and we really appreciate you coming here. My Lord, this has been so informative. So thank you, Emil, for coming here and explaining to the American public and to us what you're working on. It really makes us, and I think I speak for everybody, really confident in what you're doing. It's so great that you've left the private sector to do this.

00:24:55

What I would say, just very quickly, Emil, is I think that not enough people understand that the American military has had to fight with one hand tied behind their back. Just that little insight that you just gave about Afghanistan seems so scary to me, because the men and women who sign up for the American military are doing this to fight on behalf of this country. They deserve a lot more than being sent there and all of a sudden being handed this rule book and told, do your best. And then it's, oh wait, you violated 19 rules trying to protect America. Do your job. That's insane.

00:25:30

It's really insane in some cases. My belief is that's the frustration those soldiers who were out there in those wars had more than anything. There was the broader frustration of, what are we doing here? And then the secondary frustration was, while I'm here, why can't I do my job?

00:25:47

Is there much of a debate internally, Emil? I'm sorry, J-Cal, before we move forward on this. No, go ahead. Regarding this idea of full autonomy in military action, I don't want to speak ahead to the Anthropic point, but it was something the media seemed to say was part of Dario's concern: that when you press the button and hand over complete autonomy, and there's a kill action that you're now giving to a robot or to some autonomous system, do we then have a moral issue at hand? And is that something that's debated or discussed? And is that the right way to think about the framing of what goes on?

00:26:22

I mean, we're not even close to there yet. The systems are not... We wouldn't feel that a system that would have real risk for a civilian is ready to launch yet. So we're not even debating that yet. We're just trying to get basic autonomy in drones, basic autonomy in unmanned underwater vehicles, basic autonomy in these collaborative combat aircraft you've heard of that fly along with the jet so that it has more firepower, but it's still tethered to what the jet does. That's incredible. Yeah. So we're just at the very beginning of this stuff. But Golden Dome is a good example of, like, yeah, who can oppose that? It's the only way to get out of a threat like that. So who could oppose, if you have a military base with a bunch of soldiers sleeping, having a laser on it that can take down drones autonomously? So it's pretty much scenario by scenario, but we're not having a lot of debate, because the Skynet thing is so not a realistic thing at this moment. Except if... One thing I did tell the Anthropic guys, and I'd tell this to any company: your models are getting stolen by the Chinese, and they're going to un-guardrail them and use them against us.

00:27:36

And then do you want our models to be less capable than their models?

00:27:41

They're not going to be thoughtful, in other words. They're going to go for it. If we just benchmark this against where we were 10, 15 years ago, there was the WikiLeaks video, Collateral Murder, I think they called it, where we tragically had an Apache take out some journalists. And this technology, even applied today, probably would have avoided that, in my mind. We have enough that when you're targeting not drones but people on the ground with an Apache, this would probably have avoided that.

00:28:14

Yeah. Or the Kuwaiti aircraft hitting an American aircraft, making a mistake because it doesn't have the identification. I mean, it's the same self-driving argument, to a degree. Self-driving could save lives, even though it's scary to look at a car without a human behind the wheel. But there are tons of scenarios where it's a way better, safer, more precise option than the alternative.

00:28:39

All right, before we move on to the Dario thing at Anthropic, that whole brouhaha, there was one piece that we haven't addressed with this interaction, Friedberg, Chamath, which is the Israeli government and their desire to take out this regime. And according to Tucker Carlson and a large contingent of the MAGA base, they feel that we are captured by this group. Does Israel have too much influence over the United States with regard to these actions in the Middle East? This is a big debate within the party, within the Republican Party, within the MAGA constituency. Hey, number one, we don't want these wars. Number two, is Israel driving this thing, to the point of Rubio's quotes that, hey, we're doing this because Israel is going anyway? I think we should address it here. Not that I have a personal stake in this. I'll give my personal opinion at the end.

00:29:30

I don't think the President is captured by Israel in the least. I think he decides what is in the best interests of the United States. If Israel can be a part of that, then they're a part of it. Look, let's be clear, they're incredibly capable. And so in something like this, to be able to incorporate the intelligence of Mossad... What you're seeing today in this operation, Epic Fury, we're four days in, and Iran has been 90% depleted of all of their munitions, it looks like. They're just firing no more missiles out from Iran to anywhere else. There are fleets of drones and planes just waiting. Everybody knew where the Iranians were. It's great that when we make a decision on something that we need to do, we can rely on our allies. I think the opposite question should also be asked: what was the UK doing? Why is Spain pontificating? Why was Europe taking the weekend off before they could even issue a statement? Why don't you ask that question?

00:30:27

Yeah, no, it's an equally valid question. Friedberg, do you want to get in on this or no?

00:30:34

No. I'm a Jew. No one's going to care what I have to say. They're either going to be totally like, or they're going to say, This guy's a fucking Jew. We shouldn't listen to him. So let's move on. Go ahead.

00:30:44

Emil... Well, any thoughts on this, Chamath?

00:30:46

What I do want to know from Emil, though: is this Iron Dome working, this laser system in Israel? Is it operational? And if so, are there any success metrics you can share around it?

00:30:57

I think the... Iron Dome was the first generation of the Israeli air defense system, and then they're building Iron Beam, and I think it's still early-ish, but yeah, it works. They're a technologically sophisticated country that's very small, that has a reason to invest in these things, and they have a lot of smart people to do them. So I think it's good.

00:31:23

Does it primarily work on rockets? I guess I just want to understand the logical evolution of this, because in the '80s and '90s, there was a lot of conversation about space-based lasers that could shoot ICBMs out of the sky to avoid global nuclear war, so we could always take out every nuclear warhead delivered on an ICBM. Is that technology feasible? Is there a place in the near future where we could see basically maximal global deterrence using these systems, either ground-based or space-based, to take out hypersonic missiles?

00:31:53

I think the harder but more valuable problem to solve would be the space-based way of doing it, because then you could get at almost any threat that transits space. But you still need a ground layer, because there are cruise missiles that could come at you, there are drones, and so on. So we call it multi-layered: how do you get every weapon at every layer? But directed energy, lasers, as they get more powerful, you can take on a bigger weapon farther away. Those technologies, as they improve, get more and more capable. I think all these defense systems are going to get more and more capable at getting more and more of a variety of weapons at farther standoff, which is what you want. You don't want to shoot it when it's right over Tel Aviv. You want to shoot it when it's still over their land, ideally.

00:32:48

Are the laser interceptors in the field today? There's reports that they are.

00:32:53

I think there's some. I think they've demonstrated some of them.

00:32:56

Got it. Is this our technology or Israel's technology? Because Trump said, Hey, that's actually our technology. Is there any insight there?

00:33:04

We have collaborations with Israel on some of this stuff. They have their own, we have our own. So it's not... But they're good at tech. We're good at tech. There are certain ways you get part of our system and part of their system, because it's a quickly evolving part of science right now. How do you cohere beams of light to get distance? How do you use high-powered microwaves to just drop drones in their tracks? There's lots of different ways to get at some of these things. Yeah, a lot of it's ours and some of it's theirs.

00:33:41

Yeah. To the earlier question, I am pro-regime change if it can be done thoughtfully. And obviously, isolating a dictator, that's the best thing you can do. We've done that successfully with Putin, Kim Jong Un, et cetera. Keep diplomacy up. But if there is a moment in time where you could free the people of Iran after 50 years of being subjugated by these lunatics and dictators, I'm all for it. I actually trust President Trump to make that decision. I know this may sound crazy. People think I'm a libtard or something because of the way my besties frame me on this program, which is completely inaccurate. I'm an independent.

00:34:20

You are. I actually- You are not independent.

00:34:22

I'm completely independent.

00:34:23

You're not, but okay.

00:34:24

I am, just based on my voting, and I'm not on either one of these sides. I am pro President Trump, and I trust his judgment. I think he has more information. I think you have more information. I actually trust you guys to do it thoughtfully. And there obviously was a window here. Israel can have their own motivation. There could be the China motivation, but there's also spreading democracy. Can I say something on that?

00:34:42

Which might be the least of people's concerns here, but that's on the top of my list.

00:34:47

I would like to see the people of Iran free.

00:34:49

Just to build on your point, Jason, the thing that Emil said before, which I think is important as well, is we have an enormous amount of learnings about what happened in Iraq. We also have a ton of learnings from the Iran-Iraq war, and a ton of learnings from '53, when we and the British deposed Mossadegh, or at least fomented that, and put in the Shah, and then the Shah was booted out. If you take those three chapters in Iranian history, or that regional history, there's a ton to learn. And to your point, there is a way to effect what we need to do without creating some 20-year forever war. There was an incredible tweet. I don't know if you guys saw this. Somebody said, So every war doesn't have to be three decades and trillions of dollars to your friends in Virginia, Maryland, and DC. Did you guys see that tweet? It's true. These things can be one and done, in and out.

00:35:43

And if President Trump succeeds here, I just want to also give him some flowers here, the people of Venezuela and the people of Iran being free represent about 5% of the people in the world living under an autocracy, under a dictator. If those both flip back to democracies, he'll have done more for the spread of democracy than any president for many decades, perhaps in our lifetime. This would be incredibly noble, incredibly just.

00:36:11

Would you and the human rights set want him to get the Nobel then?

00:36:15

Absolutely. I'll give him all the Nobels. Like, literally, if you can free those people, give him all of them, give him every prize, give him an Oscar, a Grammy.

00:36:25

For physics, chemistry.

00:36:26

You can give him everything.

00:36:28

Physics, philosophy. J. Cal is an independent. When's the last time you voted for a Republican presidential candidate? Just curious.

00:36:37

Yeah. Say it.

00:36:39

No, no, no, no.

00:36:41

Mondale.

00:36:42

No, no. I would have voted for... If I was of age, I would have voted for... I wouldn't have voted for the Bushes. I voted for the moderates, obviously, Clinton and Obama.

00:36:54

Oh, we're playing the woulda-shoulda game?

00:36:56

I would have voted for Reagan in this situation.

00:36:58

I would have bought NVIDIA at... Well, no.

00:37:01

I didn't vote for Kamala, so I'll leave it at that. But I voted probably 35%.

00:37:06

Why don't you say that you voted for President Trump?

00:37:07

Just say you voted for President Trump.

00:37:09

You did. I don't want to complicate things here.

00:37:10

But you did. So just say it.

00:37:12

I didn't vote for Kamala. I'll leave it at that.

00:37:14

All right. It's so weird that you'll say you're a moderate, but you won't say that you voted for President Trump.

00:37:18

I am supporting President Trump in about 60, 70% of what he does. Let's leave it at that. Three, two. All right, let's talk about the economic impact on oil and insurance. Oil rose to $84 a barrel Wednesday. The Strait of Hormuz, here's a video, is basically at a standstill at this point. Here's the clip. You can see the traffic slowing down. And then, hey, some of the dots are even going away. That could be ships being taken out. Unless the strait opens, 3.3 million barrels of daily production would be lost early next week. And then there's insurance companies. They've all canceled the war risk coverage of vessels in the Gulf, effective March 5th. Supertanker traffic dropped 94% within the first 48 hours. Trump said the US will provide political risk insurance for all maritime trade through the Gulf, especially energy. Friedberg, your thoughts on the economic second-order effects that we're starting to experience here? Over the next four weeks this could be intense and acute.

00:38:21

The modern insurance market emerged specifically to solve the risks of maritime trade. In the 17th century, Lloyd's of London was a coffee shop in London where all the maritime traders would get together and talk about, Hey, what's the safest route so pirates don't get our ship and you don't run into weather? That's where they would have these conversations. Then eventually they started underwriting the risks of the shipping routes and giving each other guarantees. They said, Hey, if you make this route, great, you pay me a certain amount. If you don't make the route, I'll pay you the lost value. That's how Lloyd's of London, which is the world's biggest reinsurance market, started. Today, Lloyd's of London has 78 what are called syndicate members. These are pools of reinsurance that underwrite big, crazy risks like maritime insurance for folks that are moving oil tankers through the Strait of Hormuz, which the IRGC just announced they're shutting down. When the IRGC announced that they were shutting down the Strait of Hormuz, there's a significant risk of mines going into the strait and the ships getting attacked and blown up, so loss of value.

00:39:25

The insurance premium spiked initially from a quarter percent, so 0.25% of the value of the ship, to 1.25%. So it went up by 5X, and folks had to pay a lot more of the value of their ship in order to continue the routes and get guarantees that they'll make it through. And then all of the markets started to shut down. Once the conflict got heavier, everyone said, Let's shut this thing down. And that's obviously a massive risk to energy prices globally, which drives inflation and puts US economic security at risk. And so this is a brilliant move. The US government stepped in with the US International Development Finance Corporation, which was, funny enough, started a couple of years ago, in 2019 or something like that, as an outgrowth of one of the agencies that provided credit from USAID. Much-talked-about USAID. They're leveraging the credit capacity of this old USAID agency to go out and say to all the shipping companies, Hey, we'll give you insurance on your routes. The reason they need it is the shipping companies are levered. They take on debt to buy the ships, and the debtors require that they have insurance, or else they're not allowed to take the routes, because the debtors are ultimately going to be out the money.
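The premium math Friedberg walks through can be sanity-checked in a few lines. The vessel value here is an assumed figure for illustration; only the 0.25% to 1.25% rates come from the conversation.

```python
# Back-of-the-envelope check on the war-risk premium spike described above.
# The $100M vessel value is an assumption for illustration.
ship_value = 100_000_000

premium_before = 0.0025   # a quarter percent of the ship's value
premium_after = 0.0125    # 1.25% after the strait closure was announced

cost_before = round(ship_value * premium_before)  # premium per insured transit, before
cost_after = round(ship_value * premium_after)    # premium per insured transit, after
multiple = round(premium_after / premium_before)  # how many times more expensive

print(cost_before, cost_after, multiple)  # 250000 1250000 5
```

On an assumed $100M hull, that is the jump from a $250,000 premium per transit to $1,250,000, which is why the traffic stopped once coverage was pulled.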

00:40:39

The shipping companies themselves need to have insurance, and so this replaces a market that has now gone away. Very smart. And ultimately, a lot of people are saying this could actually reshore or onshore maritime insurance back to the United States and create an entirely new insurance industry here in the US, in a market that has historically been served almost exclusively by European syndicates and European partners. And it actually creates a big economic opportunity, as this war dies down, for American insurance companies and American brokers to basically be the underwriters and the guarantors of this insurance and create a new industry. That's cool. Super interesting side story on what's going on here.

00:41:19

All right, some breaking news here, folks. Via Bloomberg, the Pentagon has formally notified Anthropic that it's been deemed a supply chain risk. This has never happened to an American company. It has happened to Russian companies and Chinese companies, Huawei. For background, the Department of War canceled Anthropic's $200 million contract on Friday and said they would do this. The dispute came down to two clauses, according to sources, and we have one of the principals here, so we will hear directly from him in a moment. Anthropic had two concerns. Number one, fully autonomous weapons, AKA murder bots, as we previously discussed. Dario didn't feel that their technology was reliable yet and wanted some assurances. The second thing Anthropic said was that they were concerned about mass surveillance of Americans, because they believe this technology is uniquely powerful and can do things beyond what a series of webcams or a network of 7-Eleven cameras can do. The Pentagon said they wanted all lawful use. Dario, you're welcome to come on the program next week, or any time, to give your side of the story. But this week we have Emil. Emil, your thoughts? Explain to us what happened here and how this broke down.

00:42:40

It's worth a little short history. If you remember the Biden executive order on AI, which was this crazy executive order that limited the amount of compute any model company could use and essentially grandfathered in a small number of AI companies, they were going to designate the winners and everyone else was out, so they could have more control over what they did. Anthropic was one of those winners. And then they were smart. Actually, it was a good sales strategy to sell into the most sensitive parts of the US government, like all of our combatant commands: Central Command, that's doing the Iran fight now, INDOPACOM, which is responsible for China, several of the intelligence agencies. And they did forward-deployed engineering, Palantir-style. So they got very sticky to the workflows and all that. So I came in and I got the AI portfolio for the department in August, and I said, I just want to see the contracts, the old lawyer in me. And I looked at the contract, and I was like, Holy cow. They said you can't use them to plan a kinetic strike. You can't use their AI model to move a satellite.

00:43:56

You can't... It was a 20-page... You can't do a war game scenario with it? You could do a scenario, but you can't... Let's suppose you're writing a plan saying, If this happens, this is what we would do, and it might involve a kinetic strike which causes harm to a human. So what do you think these folks do? This is the Department of War. This is what we do. And so I said, Okay, well, number one, I've got to have direct relationships with these companies, not just through Palantir, because I want to use it more broadly. And number two, I need the terms of service to be rational relative to our mission set. So we started these negotiations, and it took three months, and I had to give them scenarios, like this Chinese hypersonic missile example. They're like, Okay, we'll give you an exception for that. Well, how about this drone swarm? We'll give you an exception for that. And I was like, The exceptions don't work. I can't predict for the next 20 years all the things we might use AI for. And so all lawful use seems like a good thing. If Congress wants to act, great.

00:44:59

We have our internal policies. We'll follow them. We're not knuckle-draggers here. We don't want to hurt people unnecessarily. So it's our province to decide how we fight and win wars, so long as they're lawful. And I think at some point it turned into a PR game for them, because they were not going to win this intellectual battle of, Well, we're going to stop you. We're going to use our judgment because we think Congress is behind, and impose it on the US military. And it became this: let's find the issues that are most inflammatory, robot weapons and mass surveillance. We're the Department of War. We're not the FBI, we're not Homeland Security, we're not ICE.

00:45:43

You're not allowed to legally spy on Americans.

00:45:45

Yeah, you're not. And then what it came down to on that issue, just as an anecdote, is they didn't want us to bulk collect public information on people using their AI system. And they wrote it in a way that I was like, So you're telling me, before we even get to bulk collection, if someone types in Chamath's LinkedIn, using publicly available information, I would be violating your terms of service? Like, yeah. Well, okay, let's rewrite it. So there's months of this stuff, which was interminable. And then the trigger point was, after the Maduro raid, one of their execs called Palantir, whom we buy their software through, and asked them, Was our software used in that raid? Which is, by the way, classified information. Anyway, so they're trying to get classified information, and implying that if it was used in that raid, that might violate their terms of service.

00:46:45

So they wanted to enforce... This is very important here. They wanted to enforce their terms of service. They went behind your back to try to collect information to then maybe pull your license for their technology.

00:47:00

It wasn't behind our back. I don't want to accuse them of that. Palantir is the prime contractor; they're the sub. But it raised enough alarm with Palantir, who's got a trusted relationship with the department, to tell me, and I'm like, Holy shit, what if this software went down, some guardrail kicked in, some refusal happened, in the next fight like this one, and we left our people at risk? So I went to Secretary Hegseth, I told him this had happened, and that was a whoa moment for the whole leadership at the Pentagon: that we're potentially so dependent on a software provider, without another alternative, that has the right or ability to not only shut it off... Maybe it's a rogue developer who could poison the model to make it not do what you want at the time, or trick you. I mean, all these things that we know and worry about with models, like hallucinating purposefully, or not following instructions, like some insider threat stuff. So then that culminated in a dramatic Tuesday meeting with Secretary Hegseth and me and Dario, with the Friday deadline that got blown. And I never thought they really wanted to make it.

00:48:09

Is the model entirely hosted by Anthropic? Or just explain to us technically: does this sit in a cloud that Palantir runs for you guys? Is there really, technically, a way that employees at Anthropic could interfere or intervene in the use of the model?

00:48:26

Yeah. They put their model in AWS GovCloud. GovCloud, yeah. And then Palantir serves it from there, and they refresh it. They hold the control plane for the model. So, yeah.

00:48:41

They can change the model weights if they want. They can do whatever they want. The insight into this thing is unbelievable. Not just governments, but now if you're running a company. The reality is that what Anthropic showed, which, by the way, is their right at some level, is that they are going to have a political perspective and a set of terms that reflect their philosophy, and that philosophy can change on a dime. But what the government did was also completely reasonable, which is: we can't rely on you if you're going to be completely unreliable and disallow things that are reasonable. I'll give you a different example to make the point. There's a state that wants to run some health care program, but they're a pro-life state. You can't conduct abortions in that state. Does that mean that the Anthropic engineers can decide, You know what? We're pro-choice, so we're going to change the access model and the capability of that model inside of that state? Is that allowed? Should that be allowed? At one level, you'd say this is a private company, they're allowed to choose. But what that really means is, for the government, for all the states, for any city, for every company, you cannot choose to only use one of these, because it is just a matter of time until some person inside of one of these companies goes on some lunatic moral tirade and then jeopardizes your business over something that is nothing about law, but is everything about subjectivity.

00:50:16

That is the huge thing that this thing tore open this weekend. So if you're not figuring out how to be multi-model and agnostic across these models, you're taking on enormous business risk after Friday, because you can't tolerate that these folks will do that. It's too critical of a technology. By the way, this is deplatforming all over again. Remember what happened when you didn't like what was said? Now, all of a sudden, you were deplatformed? This is that times a thousand, because this is not about posting on social media. This is about using fundamental technology to either advantage or disadvantage your business.
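The multi-model, provider-agnostic posture Chamath is describing can be sketched in a few lines. Everything here is hypothetical for illustration (the provider stubs, the call interface, the exception type are not real SDK APIs): route each request to a primary provider and fail over when one refuses or goes down.

```python
# Hypothetical sketch of provider-agnostic routing with failover: try each
# model provider in order, and move on if it refuses or is unreachable.
from typing import Callable

class ProviderUnavailable(Exception):
    """Raised when a provider refuses a request or is down."""

def complete_with_fallback(prompt: str,
                           providers: list[Callable[[str], str]]) -> str:
    failures = []
    for call in providers:
        try:
            return call(prompt)        # first provider that answers wins
        except ProviderUnavailable as exc:
            failures.append(str(exc))  # record the refusal and keep going
    raise RuntimeError(f"all providers failed: {failures}")

# Stub providers standing in for real model APIs.
def flaky_provider(prompt: str) -> str:
    raise ProviderUnavailable("terms-of-service refusal")

def backup_provider(prompt: str) -> str:
    return f"answer to: {prompt}"
```

Under this kind of structure, a refusal or outage at one vendor degrades to a fallback rather than a hard stop, which is the redundancy argument Emil makes about wanting two or three interchangeable models.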

00:50:49

Emil?

00:50:50

Yeah, I mean, I think I described it the other day as: the leaders of these companies say they're going to cause 50% white-collar unemployment, that this is as powerful as a nuclear bomb, that it's like 50,000 geniuses in a data center, so a small country could coerce the world with it, whatever. So you're like, Holy cow. All right, so this is a general substrate of intelligence, of technology, that's applicable to a lot of things. Very generalized. It's not like Workday HR software, where you could just use a competitor. This is going to be part of our everyday life in so many different ways. And they're controlling whether it has a moral conscience. Anthropic has its own constitution. It has its own soul. It's not the US Constitution. So you're subject to that, plus whatever whims, and how that changes. And that's a scary thought for Americans, generally. And I think that did come through a little bit today. And in the coming years, it's going to be a bigger and bigger deal.

00:51:50

So take us through OpenAI's software, Gemini's software, and Grok's software. Have they pushed back on any use, or are they like Dell or Apple: they sell you a computer, and you have the computer, and you can use it as you will? Have any of those given you any pushback?

00:52:09

So Grok is all in for all lawful use cases across all classified and unclassified networks, as you'd expect, because Elon's truth-seeking. We want truth in the Department of War. We don't want ideology, because ideology will mess with operational decisions. You don't want anything to be fake or tilted.

00:52:28

We're surging Google, and we have them.

00:52:31

We have Google for all lawful use cases on unclassified networks, and we're trying to move them to classified networks. They have to build out infrastructure, because this stuff is complicated.

00:52:40

So they're in compliance in terms of what you're looking for as a partner. And then I guess the last one is OpenAI, and Sam seems to be just, characteristically, playing both sides a bit. No, no. Where is he at?

00:52:53

To his credit, I called him and said, I need a solution if this thing goes sideways. I need multiple solutions. I'd like you to be one of them. And he's like, Okay, what can I do for the country? I was like, I need to get you up and running as soon as I can. And he was trying to protect Anthropic, to his credit. He was like, Don't call them a supply chain risk. That's bad for the industry. Maybe I can negotiate terms that they'll find acceptable. But he's in the middle, because they compete for the same researchers. So a lot of this comes down to this thousand researchers, like baseball players, that get traded between these companies. Moneyball, yeah. It's a very Moneyball-ish thing, and there's not that many of them, and you lose 20% of them, and all of a sudden they launch Claude Code before you launch Codex, or something like that. And then the numbers change pretty dramatically. So he was being a real patriot, to his credit, and trying to help Anthropic while they were trashing him and recruiting from his company. And I'm not biased. I want all of them.

00:53:59

I want to give them all the same exact terms, because I need redundancy. I want to see if they diverge or not, or if they converge; maybe I only need two over time. But we don't know. It's too early.

00:54:11

But why keep them in the mix, Emil? If there's clearly a difference of operations and philosophy and how they want to run their business, and there's other models, is their model particularly good at particular applications that makes it important to keep it in the mix, given that there are three or four other alternatives here?

00:54:32

Anthropic, you mean?

00:54:33

Yeah.

00:54:33

Well, because the number one reason we were having this conversation at all is because they were deeply embedded. So now I have to disentangle them. And the other companies have not gone as heavy on enterprise, enterprise sales, forward-deployed engineers, the government business. So they have to catch up, not necessarily on the capability of the model, but just on how do you serve the government, the government market. And Anthropic is probably just way ahead on that.

00:54:58

But the models themselves, you don't think, are uniquely advantaged? Or do you have a view on that at this point?

00:55:03

I don't have a view on that. I don't think they're... Certainly, Claude Code was innovative and ahead. That's true. But do I believe in 12 months Codex is not going to be close? I think it will be.

00:55:17

I think you're right. There's an asymptoting that's happening. If you just look at the confidence interval on how overperforming or underperforming some of the leading models are, the error bars are shrinking. The confidence intervals, these things, are all becoming the same. Eventually, they're all getting access to enough power, enough compute. They're generating similar results, it turns out, which I think you would expect. So it's even more important that you have a complexion of models. The other thing, Emil, I don't know if you saw this, but they posted about the revenue ramp of Anthropic. I have a small software company called 8090, and I asked the team, Let's go look at our OpEx. I posted it because I was so shocked at these numbers. Our costs have more than tripled since November of '25. Between the inference cost that we pay AWS, which is ginormous, between our cost with Cursor, between Anthropic, we are just spending millions.

00:56:17

So more per unit and more in aggregate.

00:56:20

Both.

00:56:21

But the problem is that my costs are going up 3X every three months. My revenues are not. Token use is very... And by the way, because everybody has gotten infatuated with what we call these Ralph Wiggum loops: just send the thing off and it'll just go figure something out. A, it never figures anything out. And B, you just get this ginormous bill from Cursor. So one of the things we had to do was just say, Guys, you've got to deprecate Cursor, because you're just wrapping Claude Code and charging us way too much for these tokens. But I don't know if you're seeing any of this thing where the tool usage... It's so great to use these tools. Let's be honest. It's super fun. They're phenomenal. You feel like a genius. But then the ROI of these tools is really important. I'm not sure whether that's as much of an issue for you or not in the Department of War.

00:57:07

Not yet, but it will be. For sure. As people find more and more use cases, the use cases get more sophisticated. So the next marginal thing you have to do is likely to be harder and therefore be more consumptive, right?

00:57:22

Right.

00:57:24

Let me just ask, Emil, the important question that I think triggered a lot of the news this week: why then designate them as a supply chain risk? Why not just abandon them, move on, use the other vendors? Why take this punitive action?

00:57:38

I don't view it as punitive, and I'll tell you why. If their model has this policy bias, let's call it, based on their constitution, their culture, their people, and so on, I don't want Lockheed Martin using their model to design weapons for me. I don't want the people who are designing the things that go into the componentry to come to me, because if you believe the risk of poisoning... You're compounding that risk. Yes, it can enter into any part of the defense enterprise, but it's just the defense enterprise. So Boeing wants to use Anthropic to build commercial jets, have at it. Boeing wants to use it to build fighter jets, I can't have that, because I don't trust what the outputs may be, because they're so wedded to their own policy preferences.

00:58:30

I guess a dovetail to that is: why couldn't this have been handled quietly? Is it Anthropic who made this a public spat, or was it the administration that made it a public spat, or did it take two to tango?

00:58:42

I mean, they have a very good, sophisticated press operation, really good, and they painted us as doing mass surveillance, where their issue was some commercial database thing that someone else could buy. They didn't want us to buy it or use it, and I'm not even sure we buy them, except to do recruiting for soldiers. We run schools, hospitals. We do a lot of things at DOD. We don't just fight wars. And the way they were able to characterize these two things, which are genuinely scary to people, but were not the real issues. The real issue was, I'm worried about them shutting off our system at a moment of need, or messing with our system at a moment of need. That's what we were scared of.

00:59:27

The thing that came to mind is, if they're selling batteries, and you need to use the batteries or the laptops, however you need to use them lawfully, okay, that should be enough for them, unless they're peaceniks and they don't want to be involved in selling weapons, which, by the way, was Google's position for many years. They just didn't want to be involved in it, because, to your point, they want to recruit talent that is also aligned with that. So maybe this just isn't the right partner for the Department of War.

00:59:56

If you don't want your stuff to be used for Department of War stuff, you shouldn't be selling to the Department of War. Pretty straightforward.

01:00:03

It's in the name.

01:00:04

It's in the name. Well, and then also, I have to say, you said, Hey, we don't know how we're going to use this thing. It immediately came to mind, it was like 9/11: you'd have to go check with them. If there's another 9/11, a unique black swan event, you'd have to go clear it with them. That speed is important.

01:00:27

That was literally the comment. Oh, really? Yeah. It was in a room of 20 people, so this is not deniable, if ever Dario wants to deny it. And I was given these scenarios, these Golden Dome scenarios and so on. And he's like, Just call me if you need another exception. And I'm like, But what if the balloon's going up at that moment, and it's a decisive action we have to take? I'm not going to call you to do something. It's not rational. So that was another holy cow moment of how they think about it.

01:00:59

That just means that what he wants to be is the Secretary of War.

01:01:02

That's right.

01:01:03

He wants to be the God King there, I guess.

01:01:05

You can't do that. The thing that shocks me, Emil, I don't know, maybe you can't say anything, but guys, you can comment on this. It's clear that Anthropic just lost all the Republicans. But if they think that they have the Democrats, that's fleeting as well, because I think progressive Democrats fundamentally just hate Silicon Valley and technology. And so there's no way they're going to let some God King over here that they don't control either. And so in both ways, I think they accidentally may have pissed off every constituency. The longer-term fallout between them and the progressives will come home to roost, because as the progressives want more control, and these guys push back on them, they're just going to fall into the same situation.

01:01:49

Yeah. I mean, it's an interesting perspective. I think if you don't want to be involved in war, that's your right. I think you mentioned this three times, Chamath. Just don't sell bullets. If you don't want to be in... But you can't call Smith & Wesson and say, Can I...

01:02:03

The other thing is, what the hell were the senior management and the board talking about over these last few days? Because to me, it would have sounded insane. So then the question is, were people just so breathless to buy this revenue curve? What is the board doing? What is the senior management really doing? What do you change, guys? What do you think you would tell them if you were sitting inside the board of Anthropic?

01:02:26

If you're an investor, you're on the board, what do you say to Dario when he says, Hey, I need to dictate to Emil and Hegseth how they use my tool, and everybody else is just saying lawful use is the standard? What's your coaching advice?

01:02:39

Well, it's also a very unusual circumstance because I don't think any business in history has grown as fast as they have in the last 90 days. They've added, what was it, 6 billion of ARR in a month or something? I mean, that's absurd. It's absurd.

01:02:55

It's a great product.

01:02:57

OpenClaw has driven a lot of this.

01:02:59

If you're on the board- You're closing your eyes. You're shutting the fuck up. You're just shutting the fuck up because something's working.

01:03:05

You're selling a secondary?

01:03:07

I think he's off doing his thing and they're going to let him do it. I don't think that company's worth 350 billion anymore. God knows what it's worth.

01:03:14

That's interesting. If you could get a block of stock right now, where do you put a bid in? I'll tell you where I'd put it. Oh, my God.

01:03:20

I had this conversation at dinner two nights ago. It's like, you have to pick between OpenAI at their current mark, Anthropic at their current mark, or Google, and it's either multiple from here or net market value creation from here, because those are actually two very different conversations.

01:03:38

Explain the difference.

01:03:39

I think the net market value creation, because Google is already worth 3 trillion. So if they double, they've added 3 trillion. I think Google is the market-value-creation bet, but I think Anthropic is the multiple bet. I think Anthropic is a trillion-five market cap at the end of the day, unless this blows them up.

01:03:58

You're still buying the 5X versus the 3X thing.

01:04:00

You'd buy the 5X instead of the 2X.

01:04:03

But if you could put a block of stock now, do you buy it at the last post or do you buy it at a discount? Or do you just say, I just buy it at the last post?

01:04:09

Anthropic is worth a lot more than 350.

01:04:12

That's for sure. It's undervalued compared to ChatGPT.

01:04:15

They just added 6 billion in the last month. I will tell you anecdotally, everyone I talk to is on Cowork. Everyone has gone deep on this. Everyone's amazed and shocked and actively using it. Everyone's saying the same thing, which is: it may actually be fulfilling the promise of AI. I will also say that it's only going to take 90 days for Google to flip on a virtual version of Cowork. Once Google has this integrated with G Suite and you have a virtual hosted version of Cowork, I think Google sweeps the market with this competitor. But right now, Cowork is such an incredible product, and everyone's saying the same thing. It's like giving truth to AI.

01:04:54

Elon said something with respect to Grok, which was that he expects it to exceed all of these coding models, probably in the May spin, but for sure by June. So to your point, Friedberg, I guess my question to you guys is: what happens? Emil, what do you do when all the models are asymptotic? Let's just say, for the thought exercise, that by October of this year all the models are the same. Do you just take a collection of them all and say, Great, we're going to build some governance layer around it, and now we're indifferent?

01:05:34

Or what do you do? I would love to be indifferent, because then I could compete on price. Then I'd have one main and one redundant, or two mains, and I'd need at least two. Anthropic is not going to be one of them if they continue with their posture. So then it would be three, in case one gets wobbly from a policy scenario too, because they all, except for Elon's, are based in San Francisco and have that vibe. So you want to have two or three at any given time, and yeah, then you price-compete them. I do think Google has a long-term strategic advantage, not only because of their consumer business, but because they have their own cloud. So they don't have the margin on top of the cloud that Anthropic would have to pass on. It's an interesting economic proposition from them.

01:06:30

And just to build on your point, Friedberg, after you finish your insightful comments here, pull this up, Nick. Almost on cue, Friedberg, you're such an oracle. Here is the announcement from Google: Google Workspace is now integrated for agents, and 40 agent skills were included today. Emil, you've been great today. Super honest. Dario's position, I'm going to give you some fastballs here. Dario says, The real reason the Pentagon and Trump admin do not like us is that we haven't donated to Trump, while OpenAI's Greg has donated a lot. Here's Claude's answer to that claim: here are nine companies and their activities with the administration, from attending the inauguration to the White House CEO dinner to the Melania documentary. If you go through and look at these nine companies, Microsoft, Apple, Tim Apple, NVIDIA, Amazon, they have all participated. There's one company that hasn't participated, and that's Anthropic. Is Anthropic being singled out because they are not genuflecting and because they're not paying the cover charge? People say this administration is pay-for-play. That's the accusation he's making. I'd say maybe there's a cover charge. Nobody likes to pay it, but the other companies have.

01:07:56

What do you think here?

01:07:57

I mean, it's one of the dumbest things I've ever heard, truly, because I'm in the Department of War. I need to win wars. If you help me win wars, and I don't have to waste time transitioning you out, that makes me thrilled. It's a criticism of me, because it's not like President Trump dialed in and said, Hey, Emil, by the way, those guys didn't give me any money, you can't use them anymore. Obviously not. It's an invention in his own mind. I don't know how people sleep at night if those thoughts get in there. I was trying to work with them. Why would I spend three months trying to negotiate with them to get to a sensible standard if I could have just said, Okay, guys, you're out. Bye. I think it's just some internal psychosis. That's the only way I can explain it.

01:08:47

Okay.

01:08:48

It could be on Dario that he's antagonistic to the administration, both with respect to how he operates commercially, and it's also reflected in the fact that he doesn't want to support the administration.

01:08:57

I have a different theory. I have my own take. I think that they have a massive instance of Cowork running internally that helps them come up with business strategy. I bet you there's some element of AI that says, Yeah, you should do it. Do it. It just makes sense.

01:09:12

Zig where they zag and get more press.

01:09:15

And so now there's some f'ing Claude bot telling them to basically tell the Department of War to pound sand. It's going to turn out to be the stupidest decision.

01:09:24

Listen, if I was chairman of the board of that company, I'd pull Dario aside and say, Listen, you're obviously a genius. We obviously have the best tool in town. This is not a battle you can win, and it makes no sense. You're going to come across as not being patriotic. Tim Cook is showing up for the Melania premiere. Would it kill you to support the President? Would it kill you to show up?

01:09:44

Look what happened when Biden- I think that's the wrong advice.

01:09:46

Look what happened when Biden excluded Elon. No. That angered him. Show up for the President. I'll tell you what: show up for America and be a patriot. You don't have to donate, but be a patriot and show up for the dinners.

01:09:57

That's terrible advice. Here's my advice.

01:09:59

Okay, here's your advice. Okay.

01:10:00

Hey, Dario, call Emil back right now and say, You know what? Sorry, we fucked up. We're going to own this, and we're going to put out a press release that says, number one, we support our customers' use of our models to do anything and everything that's lawful, and number two, our terms of service are written in stone and you can expect solidity and reliability from us. And this was just a misstep.

01:10:29

Emil, how would you respond?

01:10:31

I mean, I would say that's what I've always wanted. I need a reliable, steady partner that gives me something that will work with me on autonomy, because someday it'll be real, and we're starting to see early versions of that. And I need someone who's not going to wig out in the middle, and we're just at the early stages, so it's rational. But then you called President Trump a wannabe dictator in your 5,000-word essay on Friday. You're going to have to apologize to more people than just me.

01:11:01

Yeah, maybe time to re-underwrite the position here. Let's just say kumbaya, everybody. Kumbaya. We've solved the problem. And look who's on the line. Surprise guest, Dario is here. I thought I would surprise everybody. Nick, pull Dario up. No, he's not here.

01:11:14

What's your view on how the industrial supply chain for hardware components and systems is coming along in the United States? Because my understanding is we're trying to reduce dependency on Chinese-manufactured components. Where are we with respect to where we need to get to in the US manufacturing supply chain?

01:11:33

We are in the early days. Critical minerals, you've seen the action around that. So I have the Office of Strategic Capital, which has $200 billion in lending authority. What we're trying to do is lend, at something like Treasuries plus 100 basis points, to companies, and show them that the Department needs their solid rocket motors, their batteries, their fiberglass, all the things we're heavily dependent on for our defense industrial base that are completely outsourced to China, and domesticate them here. We've got a bunch of great people running it, but it's early days. I think we'll get critical minerals done before the end of the term, where we have access to what we need from US or allied countries. Batteries are the next problem I'm trying to solve, for example. Batteries are totally outsourced to China, both technologically and from the lithium up. And there are, call it, 20 critical things. If I could get to all of them at some level, it'll still take a few years for companies to build plants and do that stuff. But it's very important.

01:12:48

I hope whatever administration comes next continues it, because I'm all free market, but we outsourced so much that it crippled the assembly part of putting all these things together.

01:13:04

Do we have a munitions risk right now, given the conflicts that we're involved in?

01:13:08

We don't have a munitions risk, but we do need to plus up, because the Europeans are taking a long time to contribute, and Ukraine-Russia has consumed a lot of munitions from all over the world. Then obviously these conflicts we've been in. And we need the next generation. To a large degree we're still fighting with 1980s Cold War weapons and not modern weapons. We need to plus up those things and regenerate them. Our nuclear missiles are 50 years old. Some of the planes are 40 years old. All that has to be renewed.

01:13:49

Do you think, just speak to the venture capitalists in the audience. Are we in the early stages of this defense tech boom? Is defense tech well-funded at this point, or is it too hypey and bubbly? And that's not really the issue. It's not about funding the companies. It's about funding some of the further upstream issues that we're facing. What's your view on where we are?

01:14:10

There's more defense tech venture capital than ever, 3x or more than last year. So it's growing. What I need to do, and what the Department needs to do, is have some of these companies win big contracts quickly. Whether it's Anduril, sure, Saronic, sure, a bunch of these companies, so that more money flows in, more entrepreneurs do it, and I can buy more. Because generally, I do think warfare is going from big carrier ships that cost $20 billion and take a decade and a half to build, to mass, attritable, low-cost things. And that's what these new entrants can do. So we need those to succeed so the flywheel goes with venture capital money and entrepreneurs' capabilities.

01:15:00

In that sense, what I've heard as the explainer for this is that we're moving from the old primes to the new primes, that there's going to be a small set of big winners, and then obviously lots of seconds and subs and whatnot. Is that really how this market is going to evolve? Are we going to end up with Anduril, Palantir, maybe three or four others, and that's where most of the value is going to accrue from a market perspective?

01:15:22

I mean, Anduril and Palantir want that, and I joke with them all the time about it. But I definitely want at least a second layer that's innovative and trying to disrupt the first layer all the time. I've had a mom-and-pop, wholly owned company that makes these missiles called ERAMs that we regularly send to Ukraine, and they do it with like 30 people, and they could do a thousand a year because it's designed for manufacturability. It's awesome. So I want companies like that to continue innovating. Maybe Anduril then buys them. But one of the reasons the primes are such a small number (it's not the only reason, but it's one) is they learned how to contract with the government. They learned how to go through the bureaucracy, and that became a competitive advantage. I'm trying to take that competitive advantage away.

01:16:07

That's a really important point. How do you disassemble all that bureaucracy so that product innovation can actually get to you?

01:16:15

Yeah. So we did a big... Part of it comes down to requirements reform. What used to happen is people were like, Oh, we need a new fighter jet. So Army, Navy, Air Force put in their requirements: it needed to be stealthy, to hold a missile, to hold four humans. And it became this unbuildable thing. But the contractor didn't care, because they're getting paid cost-plus. Sure, I'll fulfill your requirements. Two years from now, you realize it was never engineered properly. It'll be a few years late and a couple more billion dollars. So we're trying to change that to: I tell you my common operational problem. I need a bunch of missiles that go 500 miles or more and have this blast. Come to me with solutions, with as few requirements as possible on my side. And then on the contract piece, trying to get as close to commercial contracts as possible. And this is where the startups are so good. They'll do fixed-cost pricing. They'll do: you don't pay me as much if I deliver late; you pay me more if I deliver early.

01:17:20

It's very disruptive to the existing system.

01:17:22

Super disruptive, but that's what I'm waking up every day trying to do.

01:17:28

So you could put out something saying, Hey, the Strait of Hormuz is super important. We need to keep it open. We need these types of devices to keep it open. But come to us with your ideas, and let them be creative entrepreneurs as opposed to just trying to goose the profits. Yeah, it's really brilliant.

01:17:45

Emil, you also oversee DARPA, yeah?

01:17:47

Yeah.

01:17:48

DARPA is the father of the modern Internet, and it's created a lot of really critical technologies. Can you talk about what's going on in there? Are there interesting things that you think our audience should know about that you're trying to push? I mean, it's probably my favorite part of my office...

01:18:05

Because that's where it's... It's still a very honored profession to be part of DARPA. Government service in general has declined in stature since the Manhattan Project, because now, if you're someone great who wants to do rockets and stuff, you go to SpaceX. DARPA still has the best of the best, and so the most creative ideas happen there. One of the things they're working on that's public is trying to use biology to synthesize critical minerals. How can you pull them out of the ground using biology, so you don't need to do all this crazy, messy, dirty refining? That would change the game big time on our ability to get the critical minerals we need faster and leapfrog the Chinese in terms of tech. So they're doing a lot of that stuff. They're deep in cyber. Cyberattacks are the next huge threat with AI. What we saw with people creating all these agents to attack systems actually happened to Anthropic. So they're working on this. There's not a ton I can talk about with DARPA because it's so classified, but those are a couple of examples for you.

01:19:17

All right. Speaking of classified, just two quick questions before we wrap here. Are there aliens, and when are you going to tell us? Number two, in all seriousness, I'm curious, what have you learned about China and where they're at and the threat there and our ability to counter it? Give us some idea of where we're at as a country, because we hear a lot of hyperbolic stuff. They're building this incredible mobile small navy. They've got hypersonics. They're just way ahead of us. We hear these things, but realistically, are we competitive?

01:19:52

Well, I'll answer your first question: I fought for the alien portfolio. I didn't get it. That's true. All the guys on my team were like, Dude, you've got to get this for us. Please talk to the Secretary. We want to do this. But I said I'd do it as long as I had 100% access to everything, because that would be amazing, right?

01:20:15

Lifesavers would be a game changer.

01:20:17

But on the second one, it is true that the Chinese have had the greatest military buildup in world history over the last 15 years. We were asleep at the wheel to some degree; we were focused on the Global War on Terror. So they've advanced without us thinking about the threat. That being said, our operational expertise and our space assets, we have some sophisticated stuff. Our subs, our space layer. We still have the best stuff in the world, but we have to make sure that gap doesn't narrow.

01:20:54

Right. We can't be complacent. We should sleep well at night knowing you're there, knowing President Trump's allocating money towards this, and he's decisive in his actions, but we cannot be complacent.

01:21:05

I feel like this week was a true reminder of how fortunate we are to have the defense that we have for the United States. When you look at what happened in Dubai and in Doha and in Tel Aviv, and you see how people in their residential homes are getting attacked and bombed, you realize just how fortunate we are to have all of the layers of protection that we have from our government. I've actually come around to this quite a lot. I'm, arguably, a true libertarian at heart, small government. But the one thing I've realized is so critical for us to have the freedom to do all the things we want to do is defense. I think it's an amazing institution, very valuable to the United States. Emil, thank you for what you do.

01:21:47

Yeah, thank you, Emil. I appreciate you coming on and being so candid and thoughtful and insightful.

01:21:53

This has been an amazing episode.

01:21:55

That was a fun episode. We'll see you next time. Bye-bye.

01:21:57

Love you, boys. Bye-bye. We'll let your winners ride.

01:22:03

Rain Man, David Sacks.

01:22:05

I'm going all in.

01:22:07

And instead, we open sourced it to the fans and they've just gone crazy with it.

01:22:11

Love u besties.

01:22:12

I'm the queen of quinoa. I'm going all in. I'm going all in. Let your winners ride. Let your winners ride. Besties are gone.

01:22:21

That's my dog taking a notice in your driveway.

01:22:25

Wait a minute. Oh, man.

01:22:28

I.

01:22:40

What?

01:22:41

You're the bee. We need to get merch.

01:22:45

I'm going all in.

01:22:53

I'm going all in. I'm going all in.

Episode description

(0:00) The Besties welcome Under Secretary of War Emil Michael
(2:30) US war with Iran: Bigger picture and why now?
(13:16) Trump's new approach to warfare, AI, drones, rules of engagement
(28:39) Israel's role in the conflict, relationship with the US, Iron Beam
(37:24) Oil prices, Trump's maritime insurance play
(41:19) Pentagon vs Anthropic: Why Anthropic was labeled a supply-chain risk
(1:02:03) How to value Anthropic after its supply chain risk designation
(1:11:14) State of the US defense supply chain, the defense tech industry, DARPA, and China's military

Follow Emil Michael:
https://x.com/USWREMichael
https://x.com/emilmichael

Follow the besties:
https://x.com/chamath
https://x.com/Jason
https://x.com/DavidSacks
https://x.com/friedberg

Follow on X: https://x.com/theallinpod
Follow on Instagram: https://www.instagram.com/theallinpod
Follow on TikTok: https://www.tiktok.com/@theallinpod
Follow on LinkedIn: https://www.linkedin.com/company/allinpod

Intro Music Credit: https://rb.gy/tppkzl https://x.com/yung_spielburg
Intro Video Credit: https://x.com/TheZachEffect

Referenced in the show:
https://x.com/chamath/status/2029584905831891069
https://polymarket.com/event/us-forces-enter-iran-by
https://polymarket.com/event/will-the-iranian-regime-fall-by-the-end-of-2026
https://x.com/chamath/status/2029416079781736844
https://x.com/USWREMichael/status/2029539950962626734
https://x.com/addyosmani/status/2029372736267805081
https://github.com/googleworkspace/cli
https://x.com/chamath/status/2029634071966666964
https://www.lloyds.com/about-lloyds/history/lloyds-buildings
https://www.bloomberg.com/news/articles/2026-03-05/pentagon-says-it-s-told-anthropic-the-firm-is-supply-chain-risk
https://www.bloomberg.com/news/articles/2026-03-03/anthropic-nears-20-billion-revenue-run-rate-amid-pentagon-feud