Transcript of #238 Sriram Krishnan - Senior White House Policy Advisor for AI

Shawn Ryan Show

00:00:09

Perfect. Nailed it, Shawn.

00:00:10

Perfect. I'm sorry. But, man, we've become friends over the course of the year, and, man, I've just been super excited to get you in here. I didn't think it was going to happen until, to be honest, the next election, because I know you're so busy with all the AI stuff within the White House. So I just want to say, man, it is an honor to have you here. I've been looking forward to talking to you on the show about all the stuff that you're doing with the AI race, with China, and just AI in general. Lots of questions throughout the country and lots of fear, lots of excitement, a mixed bag of emotions here. But like I said, I am really excited, and I love your backstory. We were talking at breakfast about the American dream, and you came here from India and have been a part of many empires, and you've done really well for yourself and your family, and that's what I love to showcase here: the American dream. And you are very much a big part of that, of what that represents, and that all things are possible in this country.

00:01:18

So thank you, man.

00:01:20

Well, thank you. And thank you, Shawn. I mean, it's an honor for me to be here. I watched you for, I think, over a year and a half before we first met up, and I've just been blown away by the stories we both talked about at breakfast. So many amazing people, men and women, who have just done some amazing things for this country. And I just kind of love this space and the environment you have created. So the honor is mine. I'm nervous. I'm intimidated to be in this room, which is probably one of the most badass, most patriotic rooms I've ever been in, by the way. No, but thank you. I'm super excited.

00:01:58

Thank you. Well, if it helps, I'm also nervous. I get nervous for every one of.

00:02:03

These things, especially wearing a jacket. I feel so bad because I got to. I got to wear this. I got to wear a suit because it's part of the job. It's part of the office, right? Like, and I'm like, man, I'm going to go to Chandra and look like a dork out there in a suit, right? Like, oh, no, I appreciate you, like, you know, wearing the jacket.

00:02:23

Thank you. Thank you. But everybody starts off with an introduction here. So here we go. Sriram Krishnan, appointed by President Trump. You are the senior White House policy advisor for artificial intelligence. You're a leader in Silicon Valley with a product leadership career spanning Microsoft, Facebook and Twitter, helping shape everything from Windows Azure to Facebook's Audience Network and Twitter's home timeline. In 2021, you joined Andreessen Horowitz as a general partner, opening the firm's first international office in London, focused on investing in AI and crypto. You grew up with humble beginnings in Chennai, India, where your love for technology began when you convinced your father to buy you a computer. With no Internet access, you taught yourself to code from books. When a Microsoft executive in India discovered one of your blog posts, you were invited to interview, an opportunity that marked the beginning of an exciting professional journey in the United States. You also co-hosted The Aarthi and Sriram Show with your wife. The show grew from a Clubhouse talk show into a widely downloaded top tech and business podcast. Today at the White House, you are a key architect of America's AI Action Plan, leading efforts to extend the US's global dominance in AI.

00:03:50

You have also participated in high-level diplomatic efforts, including AI delegations to Paris and the Middle East, advocating for the global usage of the US AI tech stack. Most importantly, you're a family man. You've been together with your wife for 22 years, and you even hosted a podcast together. And you're a huge pro wrestling fan. So we'll get into all that.

00:04:16

But.

00:04:17

But like I said, I want to do your backstory, your coming to the US, how you got into tech, which we just covered a little bit, and then everything you're doing now with AI. So like I said, a lot of excitement, a lot of fear, a lot of mixed emotions about it. So it's going to be an amazing interview.

00:04:38

Thank you. And thank you for that. That was super kind of you. Thank you.

00:04:41

Oh, my pleasure, My pleasure. So we got a couple of things to crank out here real quick.

00:04:46

All right?

00:04:47

One of them is I have a Patreon community, and it's a subscription network, but they've been with me actually before I even started the podcast. Then when I did the podcast, I started it in my attic, and now here we are in a brand new studio. And so one of the things that I offer them is the opportunity to ask each and every guest a question. So this is from Andre. What emerging technology or trend do you believe will have the most transformative impact on society in the next five to 10 years? And how can policymakers and innovators collaborate to maximize its benefits while addressing potential risks?

00:05:31

Well, thanks, Andre. I think the obvious answer for me is all things AI and artificial intelligence. I think if you look at the last 60 to 80 years, there have been a few huge technology trends. In the 60s and 70s, the first microprocessor was invented in Silicon Valley, the transistor and the microprocessor, which powered Intel, Moore's Law, and the things which power every single phone, every single laptop. Then in the 90s, you had the Internet, starting with dial-up Internet using the phone line, which is kind of how I started, how people of our age group probably started. And then now, of course, you don't talk of logging off anymore. The Internet is just everywhere. So I think that was transformational for society in some really good ways, some maybe questionable bad ways. Then you had the mobile phone happen in, I'm guessing, say, 2008, when the iPhone came out, and people just moving everything onto their phone. And I think AI now is the next big platform. And we are in the early days of it. Right. I would say the AI revolution, and we can go back to the history and get into that, really started with the launch of ChatGPT about two and a half years ago, where it sparked and captured people's imagination.

00:06:54

And now we're just starting to see, wow, what can we do when this thing is advancing so quickly? So I think AI is the game in town that I think is most important. I think crypto is also really interesting, but I really spend a lot of my time thinking on AI, and I think my job, working with others in the administration, is to harness it for the American people. Like, how do we make sure that it makes every individual's life better? Whether it's a dad trying to make sure they provide for the family, somebody trying to teach their kids, whatever it is, we want to make sure it makes their lives better. At the same time, we are also in this very intense race with China on all things AI, which we can get into in detail. So I think the way I see my job is, one, we need to win the AI race against China. I think the consequences are catastrophic if we don't. And second, equally important, is making sure AI benefits every single American, every single person watching this, and Andre, who asked the question. My hope is, when my time in the administration is done, they are like, okay, AI is helping me in my life just a little bit.

00:08:08

That's the goal anyway.

00:08:09

Yeah, I feel like with AI, I mean, we were talking about it at breakfast, and you had mentioned this, like when the automobile was invented. I feel like this is a moment in humanity, like when the wheel was invented. Everything's going to change. And you know, what I love about our conversation at breakfast and what I love about you is, you know, you understand the importance of winning the AI race against China, but you also understand the balance that needs to happen, you know, within not just the US but within the world. And there is a lot of fear that AI is going to take all of our jobs, AI is going to crush everything. And you are the guy that has to navigate through all of that and make sure it benefits American citizens and humanity as a whole, while simultaneously winning the race against China, which is no easy task because China plays by damn near zero rules. And so it's going to be a fascinating discussion.

00:09:12

Yes, it is. It's super fun.

00:09:14

But got a couple gifts for you.

00:09:16

Oh, wow. Okay.

00:09:17

Yeah. One.

00:09:19

This is the part I was looking forward to the most. I have no shame. I want the gifts. There we go.

00:09:25

Those are Vigilance Elite gummy bears. Made here in the USA, legal in all 50 states. No THC or anything like that in there.

00:09:33

Can I eat candy?

00:09:34

Yeah.

00:09:34

Yeah. All right, let's do it.

00:09:39

Dying to know what you think here.

00:09:42

Okay.

00:09:43

One out of five. Five being the best?

00:09:45

I'm going to say five.

00:09:46

Nice.

00:09:47

Wow.

00:09:48

Good answer. Good answer. And I got you another gift.

00:09:53

Oh, man.

00:09:54

So I think you're really going to like this one.

00:09:57

Okay. Wow.

00:09:59

Here you go. So I got a friend over at Sig Sauer. Do you know what Sig Sauer is? His name's Jason. I told him you were coming on. He's trying to figure out AI, so he's really looking forward to this episode. But he wanted me to give you this.

00:10:13

Wow. Hold on. How do I get this open? Here we go. Oh, my goodness. Okay. What am I. Tell me. Explain the screen.

00:10:23

All right. So that is the. You want to hold it up? Yeah, it's unloaded. Going to teach you how to use that. Maybe you can teach me some AI stuff and I'll teach you how to shoot. So that is the Sig Sauer 211 GTO 9 millimeter. It's got 21 rounds in the magazine plus one in the pipe. So 22 round capacity. It's got that. See those slits in the front?

00:10:47

Yeah.

00:10:48

So that is to help with recoil management. So it's gonna keep your gun down when you shoot. It'll minimize recoil a little bit, keep your gun a little bit flatter. They're redoing their whole optics line. That's their latest red dot. So using a red dot makes shooting a lot easier, to hit the target. And so in the gun industry, there's this rage about these 2011 pistols. Everybody wants a 2011 pistol. And Sig was, I don't want to say late to the game, but everybody's been really excited about what they're going to release in the 2011 series. And so that is their first model.

00:11:27

This is gorgeous. But I'm going to ask you a favor, and after we are done, you want to show me some of the ropes and how to use this properly?

00:11:34

Hey, I would love to do it.

00:11:36

You know, one of the things, this is gorgeous, so thank him for me. This is gorgeous. And, you know, as a federal employee, you've got to declare every single gift you get. And this is going to be, I think, probably the most interesting declaration for sure, you know, when I file the paperwork. But no, this is gorgeous, so thank you. And you want to teach me how to use this?

00:11:56

I will. We'll do it on a break.

00:11:57

There we go. All right, now, well, before we. Now, I have a gift for you.

00:12:04

I love gifts.

00:12:05

Let's do this. And thank you. Let's keep this right here. Now, outside of, I would say technology, I would say one of the most important things in my life is professional wrestling. I got hooked to it as a young kid, started watching it. I think it was my introduction to America. It was my introduction to sort of sports and entertainment. It's been a huge part of my life. So I got you. Let me get this out of here. This guy.

00:12:41

No way.

00:12:43

Here, hold on. Okay, so this is a WWE title belt. It's not just any WWE title belt. Fans, you know, who are watching this will know this is known as the Winged Eagle WWE championship belt, or back then, WWF. So in the 90s, right? Like WWE, WWF back then, has had many, many championship title belts. But connoisseurs or fans might say, and there may be some controversy on this, the best one of all time was this guy right here, right? It has been held by greats like Bret the Hitman Hart, who was my favorite guy growing up, the Undertaker, Shawn Michaels. A lot of greats. They actually had a comeback where Cody Rhodes wore it recently. It was a core, core part of my childhood. I had it on T-shirts, everything else. And this guy, the particular one I'm holding in my hand: when my wife and I had this podcast for the last four years until this job, if you watch episodes, you'll see a title behind me on the shelf. And so I grabbed that.

00:13:55

That's it.

00:13:55

Oh, yeah.

00:13:56

Oh, man.

00:13:57

So this is like some classic pro wrestling history, like right here.

00:14:03

Man, I love this.

00:14:04

There you go.

00:14:06

Wow. Thank you. That is amazing. So that'll be displayed at the studio from here till it's over.

00:14:15

Yeah. Well, Shawn's title reign starts right now. Did you watch pro wrestling much as a kid?

00:14:22

I watched actually. I watched it in the 90s.

00:14:25

Okay, who was your favorite?

00:14:28

Who was my favorite? I don't know. It was probably. I went back and forth, but I really liked Hulk Hogan and the Ultimate Warrior.

00:14:36

RIP.

00:14:37

Hulk Hogan, Terry Bollea, Shawn Michaels, Jake the Snake, all of them.

00:14:42

What is your favorite match that you remember?

00:14:45

You know what I always liked was the Survivor Series cage matches.

00:14:50

Yes. Yes. Which one do you remember, for example? Well, there was a SummerSlam one between Bret and his brother Owen. But which one?

00:14:59

I don't remember exactly which one, but those were always my favorite when they would all show up.

00:15:05

Yeah, I think when I was a kid, I loved those, and I didn't know what I was watching. It was like these larger-than-life characters. You had Hulk Hogan, eat your vitamins, say your prayers, train every day, the red and yellow coming out. My guy was Bret Hart. Do you know Bret? Oh, yeah, Bret the Hitman Hart. I would say if you ask people for the Mount Rushmore of professional wrestling, Bret will always be in there. He was one of the most technically savvy wrestlers of all time. Fantastic at telling stories with his body and in the ring. Even if you watch his matches now, it is just so incredibly crisp and smooth. And I know we were talking at breakfast and you mentioned Stone Cold Steve Austin. Bret was involved in one of the most important matches of Stone Cold's career. Right. Like the I Quit WrestleMania match. Have you seen this?

00:16:08

No.

00:16:09

Oh, okay. So this is great, right? So I forget which year. I'm going to say 96, but I may be off. So do you know what a heel and a face is in pro wrestling? Okay, so in professional wrestling, right, by the way, folks, spoiler, it's not real. It's scripted. And it's kayfabe, as they would say. Kayfabe is sort of the reality inside professional wrestling. And inside kayfabe, a face, or a babyface, is the good guy, and the heel is the bad guy, right? So Hulk Hogan, right, for years was always the face, the red and yellow, right? He slams Andre, right? He was the face, and then the heel would be the bad guy, and the heels would do evil, conniving things. You know, they would do a low blow or throw something in your eye. It's one of those things where they kind of cheat. Those are the bad guys. And in the 90s, what was happening was the WWE, maybe along with pop culture, was going through this shift where sometimes the bad guys would start to get cheered a little bit more.

00:17:20

There were these anti-authority figures, and the good guys, what people would call the classic white-meat babyfaces, weren't getting enough cheers. And so in 96, you know, Bret, I'm not sure whether he was the champion at this match, but Bret was a good guy, he was a face. He was one of the lead faces at the time. And Stone Cold Steve Austin, who was kind of coming up then, was a heel, he was a bad guy. And they kind of set up this match. It's called the I Quit match. And the idea behind this is the only way for you to win is the other guy has to say, I quit. No tap-outs, no countouts. You got to say, I quit. And I think the match was in Chicago. It's WrestleMania, and by the way, WrestleMania is sort of WWE's big extravaganza. It's like the Super Bowl. It's a big event. There's been, I think, 41 so far. And so they have this guest referee, Ken Shamrock, who's from, I think, a mixed martial arts background back then.

00:18:23

So Bret and Steve had this amazing idea, right? So what happens is they fight all over the ring, all over the crowd. It's bloody. Steve Austin starts bleeding, cut open the hard way. He got juice, as the wrestlers would say. And at the end, you know, Bret had his finishing move, the sharpshooter, right? And once you're in the sharpshooter... when I was a kid, even though you had this thing where, like, do not try this at home, I'd be putting my cousins in the sharpshooter. Kids, don't try this at home. But at the end of the match, Steve's in this bloody mess. Bret puts him in the sharpshooter, right? And there's this iconic image of Steve just yelling out in pain, blood on his face. But he's not saying, I quit. He's not saying, I submit. And Ken Shamrock is like, do you quit, Steve? And he's like, no, right? And after, like, minutes of Steve in agony, he refuses to quit, and he passes out, right? And they end the match there. So that match was crucial for both their careers, because it's one of the rare matches where they did what was known as a double turn.

00:19:39

So a turn in pro wrestling is when a good guy becomes a bad guy, a face becomes a heel, or vice versa. And in this match, because Bret would not let go of the hold, and he then pushed Shamrock, he kind of became a bad guy. At the same time, because Steve did not give up, he was bleeding, he was in pain, he never said, I quit, he became a good guy, and it launched Steve's career. So if people ask you for the top five matches in history, that's one of the top ones. So anyway, that's a great part of WWE history right there, man.

00:20:16

Thank you. This is totally unexpected. This is awesome.

00:20:21

I was excited. I was excited for this. Wrestling has been a big, big part of my life. I've learned so much. You know, I've learned storytelling. I've learned what connects to audiences. Because a lot of people think about it and they go, oh, it's just grown men in spandex throwing each other around, right? Or they say what you did, which is, I used to watch it as a kid. It's great. You watch it. It's super fun. I try getting my kids to watch it. It's great. It's spectacle. Good guys, bad guys, they're larger than life. They're huge. They're characters. But as I got older, I saw the sophistication and the art form, right? Because, number one, it requires serious athletic ability. These guys and women are incredibly athletic. They're taking real risks, right? Like when they jump off the corner and onto a table. Well, that's a real table. That's a real corner. And people have hurt themselves, and some really bad things have happened. They're taking real risks. And then they're trying to tell a story. And instead of a story told through CGI and dialogue, it is a story they're telling with their bodies, with some promos and dialogue, but also with the audience together.

00:21:43

So wrestling only works because the audience is in there with you. And often the greatest wrestlers know to change what they're doing to get a different reaction, or sometimes change what they're doing on the fly because of the audience. And there's a famous example of this. You know who the Rock is, obviously.

00:22:01

Oh, yeah.

00:22:02

Okay. So, by the way, I don't know if we're talking about AI or we want to get into wrestling first, okay? But one of the greatest WrestleMania matches of all time was at WrestleMania 17 in Toronto, the Rock and Hulk Hogan. And the Rock's a young guy, I think maybe late 20s or 30, kind of at the peak of his young career. He's a good guy. And Hulk Hogan had come back as a bad guy. And I think Hulk Hogan was maybe in his late 40s. He was kind of at the tail end of his career, right? Like, he was not the red and yellow. And this is WrestleMania, right? And, you know, you should watch it on YouTube. They come to the middle of the ring, they look at each other, and it's like flashbulbs going off everywhere, like 100,000 people flashing. And they just turn around, look at the audience, and people are like, oh, my God. You get Hulk Hogan, one of the greats, the greatest of all time, who slammed Andre the Giant. And then you have the people's champ, you know, the Rock, right?

00:23:00

And so what happens is, remember, the Rock went in as the good guy, and Hogan was the bad guy. Within the first 30 seconds, right, it becomes very clear that the crowd doesn't care who the good guy or the bad guy is. They want to cheer Hulk, they want to cheer Hogan. They want to cheer their childhood hero, right? And one of the amazing things which happens is, in the first 30 seconds to a minute, the Rock does good-guy moves. A good guy is supposed to work in a certain way, right? You do legitimate moves. You don't cheat, et cetera. The bad guy works in a particular way. But these guys, without even talking to each other, realize, oh, wow, this is something different. And they call an audible. Without even talking to each other, they say, we're going to flip the story, right? And for the rest of the match, the Rock acts physically as the bad guy. And if you watch it, you can see that happen, right? And there is this amazing moment near the end of the match where the Rock hits Hogan with his finisher, you know, the Rock Bottom.

00:24:04

And Hogan kicks out, and he Hulks up. And he's sucking in the Hulkamania, sucking in the energy. And people, like, connected to their childhood. I remember watching, and I was connected to my childhood, right? People are like, oh, my God, Hulkamania. The Rock eventually winds up winning. But it is one of those amazing moments. I was watching it recently because, obviously, tragically, Hogan passed away recently. So I went back and watched a lot of these old matches. But it's one of these amazing things where you're like, 100,000 people feeling something together. Two iconic superstars, right? Knowing how to navigate it on the fly without even talking to each other, redoing the storytelling and creating something which, any wrestling fan would tell you, if you're watching that and you don't have goosebumps, something's wrong with you, right? And so, anyway, I'm a big fan of pro wrestling. I still watch it now. I'm fortunate to have become friendly with some of the people who do it for a living. I have a lot of respect. I learn a lot from it.

00:25:05

Interesting, interesting. You know, we have a saying here at SRS, the whole team: we compare U.S. politics to professional wrestling.

00:25:16

That is true.

00:25:17

Do you see any similarities?

00:25:19

Absolutely. First of all, the person I work for, the president of the United States, is in the WWE Hall of Fame. Okay? So let's just start there. And I think that a lot of people in politics and other parts of the world, other domains, learn from pro wrestling. For example, you know what cutting a promo is?

00:25:40

No.

00:25:41

Okay, so cutting a promo is when, let us say, you and I, and I'm a good guy or a bad guy, have this big match coming up this weekend, right? And we want to get everyone to buy it. Back in the day, you would pay 60 bucks, get it on pay-per-view. These days, you probably subscribe on Netflix or something. A promo is us trying to hype up the match. And I'd be like, no, I'm a good guy, Shawn. I respect you for everything you've done, but this Sunday, you're going down, right? Or something like that. That was a bad promo, right? Like, don't judge me. But you kind of build up to it, and you basically, you know, put over your other person. Put over, meaning you make sure the other person looks like a legitimate threat, because nobody wants to see you fight someone who's not a legitimate threat. So you got to put over the other person, right? And then at the same time, you're trying to build interest for this match, right? And I think, you know, if you look at a lot of people in politics, they have learned from that. And even outside politics, for example, Floyd Mayweather, Money Mayweather, he took a lot of how he constructed the TMT, the Money character, from pro wrestling, right?

00:26:49

Conor McGregor. Exactly, right? The walk, the whole thing, right? A lot from pro wrestling, because people want to be invested. They want to see the story, you know, they want to be invested in you, right? They want to see you kick someone's ass or see your ass get kicked because, you know, you're the jerk, whatever. But ultimately you're trying to get people to invest in a story and, I guess, you know, watch the fight. And I think some of the best people to do it find a way to do it where you're like, man, I'm going to watch history, and I got to watch it. So maybe I'll stop at this. You know, people ask me, what if I'm new to pro wrestling, right? Okay, you're going to teach me how to use a Sig Sauer; I'm going to get you back into pro wrestling, right? And there are two matches I'm going to have you watch. And these are from 2009 and 2010, okay? And they're Shawn Michaels and the Undertaker, two matches in a row. So the quick story there is that Shawn Michaels and the Undertaker, I'm sure you know who they are.

00:27:49

We've seen them growing up, right? Now, the Undertaker had what was called the streak. And the streak was the sequence of matches at every WrestleMania that he had won. He was the only person, and at the time, I think he had won maybe 11 or 12 years in a row. Imagine one football team winning every Super Bowl for 11 years in a row. He eventually wound up getting the streak broken, I think, at 18 years. But at the time, the streak was this magical thing. It's like, next person up, who's gonna take him down? He doesn't get taken down. Shawn Michaels, right? Mr. WrestleMania, right? He was somebody with iconic wins who went up against the Undertaker. So WrestleMania 25, right? I think this was 2009. They have this epic contest, right? Just an epic contest. One of the greatest matches of all time, right? You watch it, right? It's just the storytelling they do. 100,000 people, again, invested. You know, I get goosebumps just talking about it right now. Amazing match. Now what happens after that? Shawn Michaels, basically the character goes crazy. He's like, I came so close and I couldn't get it done.

00:29:00

And so he challenges the Undertaker again for a rematch at the next year's WrestleMania, a year later. And Taker says no. And Shawn just does a lot of these things to get him to say yes. And the story goes on, and Taker says, okay, fine, I'm going to give you a rematch, but it's not going to be just any match. You know, if you win, you're going to break the streak, which nobody has ever done before, but if I win, you're going to end your career. Okay, so now you can think of this as a wrestling storyline, right? It's like two people in a room wrote it out. On the other hand, it is real, because we grew up watching Taker win every match through our childhood. It is a part of my story growing up, right? It is part of history. At the same time, we grew up with Shawn Michaels, right? Like, I loved Shawn Michaels. I hated Shawn Michaels. I loved Shawn Michaels again. So you knew when you watched that WrestleMania, something beautiful, which was real, was going to end that evening, right? When somebody counted 1, 2, 3. And anyway, you should watch it, the end of the match.

00:30:07

You know, it is huge and emotional. And Shawn is beaten down, and Taker tells him, you gotta stay down, man. It's too much. You gotta stay down. And Shawn gives him this throat-slash gesture, which basically says, you gotta put me down, otherwise I'm gonna keep coming back. And then Taker does this big move, wins. And then there is a sense of sadness there, because you're like, wow, this amazing match ended. But Shawn's career really did end. He retired after that match. And it was one of those epic, mythological feelings, like these two amazing gladiators who had never been beaten, finishing the story. So anyway, when I tell people to watch wrestling, you should watch that. You should watch Rock vs. Hogan, and it'll make you feel something.

00:30:57

I will, I will. Incredible. Let's move into your story.

00:31:04

Yeah.

00:31:05

You ready? All right. Where did you grow up?

00:31:09

I grew up in India in a city called Chennai. It was called Madras when I grew up. India has four major cities: Delhi, Mumbai, Kolkata and Chennai. And we are in the south. And I had what I would call, what Indians would call, a lower-middle-class, middle-class upbringing. My mom stayed at home, took care of the kids. She was very focused on family, very religious in a lot of ways, very focused on just raising us in sort of the proper way with a lot of right values. My dad kind of had the same job for 40 years. He worked in this nationalized company. And, you know, with my dad, when you're a kid growing up, you don't really think about how your parents are acting towards you. It's just the way they are. But looking back now, and especially, you know, we were just kind of talking about our parenting journeys in a way, I see how much they gave me in so many ways. So my dad, you know, we never had real money of any kind, but he always made sure that we were comfortable and that we never felt like we were left out.

00:32:35

But I now know, just knowing some of the numbers, that must not have been super easy. I also really respect now what I took for granted then. You know, he passed away, he died in 2006, but he was just there all the time, you know, took me to school, like, every single day. Came back from a long day of work, hung out with us every single day. And back then, you know, that's just the way things were. When you're a kid, you're five, seven, eight, you're like, that's how it is for everyone else. But now I realize how fortunate I was to kind of have that stable grounding experience. And my mom was just sort of this huge source of strength. You know, she was taking care of us without a lot of resources. And, you know, she was incredibly focused on my education, because when she grew up, you know, they didn't have any access to books, and they had really struggled economically. And she was like, I'm just gonna make sure you have a good education. And she would save up money. And, you know, I used to get really into reading a lot, and she would make sure I could always buy things.

00:33:45

And again, at the time, I didn't really appreciate all the sacrifices they were going through to make that happen for me. But I'm just very, very grateful for the experience I had, because I think, like, that grounded me, taught me what being a great parent who's dependable, who's always there looks like. And I've been so much more fortunate in so many ways. But, yeah, that was a. I would say I had a pretty great childhood.

00:34:13

Yeah. You know, I'd say I watch your socials, and I mean, it's really nice to see somebody in the position that you're in who is so focused on family. I see all the stuff that you're posting with your kids and your wife, and you take your family time very seriously. And I commend you for that. I think you're a very positive example of what it means to be a husband and a father and a family man. And you know what? One thing I would like to talk about with you is, you know, you're raising your kids here in the United States. You've done very well for yourself. And that, I probably shouldn't assume anything, but I think that looks very different than how you grew up in India. So could you go into, you know, some of the differences, or what your childhood was like growing up in India?

00:35:08

Well, very obviously, I've just been very fortunate here, and I think my kids are very young, so I think theirs is a very, very different childhood from what I had. But let's see. So India has this term, which I don't think is as popular in the United States, called middle class. And in my family, education was a huge priority. I think my family kind of pinned all their hopes and dreams on me. In a lot of ways, they were really focused on education, and it was very high pressure. If you didn't do well academically, you know, you were really going to struggle, or at least that was what I thought. Now, I don't think this is actually true, and I've seen a lot of folks have great, successful careers and whatnot, but that is what I was told: this is the way out. And when I was 18 or 19, I convinced my dad to buy me a computer. One of these old, and again, I'm dating myself here, Pentium III boxes, back when a CPU was one of these big beige boxes that showed up and you stuck it on a table.

00:36:18

And it was a big deal for him. I recently reconnected with somebody who knew him at the time, and it cost him a year's paycheck or something. Wow, it was a big deal. And again, with my kids, I'm like, you guys have it so easy. But it was a big deal for him. And I begged and pleaded for a long time. Now, again, he was a lot of great things. He was a rebel in a lot of ways, and I get some attitudes from him, but he didn't really understand technology. He was not of the generation, right? He just didn't understand computers. But he took a bet. He spent this serious amount of money on me, even though he didn't know what I was going to do with it. And then I convinced him to get dial-up Internet, which was kind of a big deal. And I would say my whole story exists because of that, and America, because, well, and I guess pro wrestling, because that's kind of how I grew up really understanding America in so many ways. But, you know, I would spend every single night learning the computer and learning to write code.

00:37:29

And way back then it was a bit hard, because, you know, you get online and you run up the phone bill. There's a concept which kids these days don't know, which is you log off the Internet. Do you remember that, when you used to log off? I do remember, yeah. Did you use to run up the phone bill back home?

00:37:43

Oh, yeah, yeah.

00:37:46

And people are like, Shawn, get off the PC, we have to make a phone call, right? Or you have a big download and then somebody picks up the phone. Anyway, kids these days. But I would have to do it late at night, because that is when the Internet would be faster, because during the day other people would be using these phone lines and it would be slower. And I would get all these coding guides; I would get all these used books on writing code. And at the time I was a bit lost, I would say, in what I wanted to do with my life, how I fit in. I wasn't very social, and I was kind of lost. But then I realized that this code was something that brought me joy. And one of the things I think people who don't do computer science don't understand is the deep joy of creating something on your computer with code, right? Because the computer is unforgiving. You have to figure out a way to get it to understand you. You have to solve mental, intellectual problems.

00:38:52

You never get it right in the beginning, but AI is now helping with that, which we can get to later. And it is an intellectual exercise. And once you get it working, it's just amazing, right? You just feel so good because you've created something, and no one can take that away from you. So I was just doing this every single night. I would stay up from, like, 10pm till 4am, you know, and then go late to school the next day. I would just be writing code online. And this is where two amazing, amazing things wound up happening. One is my now wife, Aarthi. You know, she was very similar to me, because she was kind of one of the first people in her family to go to school in another city. She was one of the first people to be kind of good with computers. And she was in a different city, and she was learning how to write code herself. Now, at the time, I had built a little bit of a reputation in my town as the computer science guy.

00:39:51

I'd written some piece of code. Do you know what a virtual machine is?

00:39:55

A virtual machine?

00:39:56

Yeah.

00:39:57

No.

00:39:57

Okay, so a little nerdery here, if you're going to nerd out for a bit. So people usually manipulate computers using programming languages. You might have heard of that, right? But one of the things people figured out is if you directly have a programming language access the computer, it might be unsafe, or it might be sort of hard to manipulate in lots of ways. So they came up with this idea of a virtual machine, which is a computer that runs inside a computer, okay? And the reason you do it is you're not giving it access to all of it. It's a sandbox, okay, which runs on a computer, but you can run any sort of code on it. You can write a game, you can write all these amazing things on it. Now, it's a very sophisticated piece of technology, because you've got to be really fast; you got to run it like a computer. You got to know all the things a computer does, and you got to make sure it doesn't break out of the sandbox and then do maybe evil things on the actual computer underneath. And for a lot of reasons, I got really into virtual machines and how they work and how to make them fast, how to make them performant, and all these kinds of things.
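To make that idea concrete, here is a minimal sketch of the "computer inside a computer" concept in Python: a tiny stack-based virtual machine. It is only an illustration of the general idea described above, not anything Sriram actually built; the opcode names and the sample program are invented for the example.

    # A tiny stack-based virtual machine: guest code can only do what
    # these few instructions allow, which is the sandboxing idea.
    def run(program):
        """Interpret a list of (opcode, argument) pairs on a small stack machine."""
        stack = []
        for op, arg in program:
            if op == "PUSH":        # put a number on the stack
                stack.append(arg)
            elif op == "ADD":       # pop two numbers, push their sum
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == "MUL":       # pop two numbers, push their product
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
            elif op == "PRINT":     # show the top of the stack
                print(stack[-1])
            else:                   # anything outside the instruction set is rejected
                raise ValueError("unknown opcode: " + op)
        return stack

    # "Guest" program: compute (2 + 3) * 4 and print 20.
    run([("PUSH", 2), ("PUSH", 3), ("ADD", None),
         ("PUSH", 4), ("MUL", None), ("PRINT", None)])

A real virtual machine adds things like just-in-time compilation for speed and much stricter isolation, which is the "make it fast and keep it from breaking out of the sandbox" problem described above.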

00:41:10

And I kind of had a little bit of a reputation. Think of it like the guy in your high school who can maybe dunk or is athletic, and the schools around know him. I was kind of like that guy. And so my wife and I started chatting online, because somebody had introduced us, saying, hey, you guys are these two nerdy computer science kids who seem to know how to do stuff with computers, and can you help me? And at the time, I didn't even know she was a girl. I didn't know what her age was. She was just this friend who had similar interests to me. So we would chat every single night. We would stay up at night chatting. And then after six months, you know, she's asking me, hey, who are you? I'm chatting with you, who are you? And I'm like, well, I live in this city, which is kind of nearby. And then we wound up meeting a few months after that, the first time we met. And we've been together ever since. That's been 23 years now.

00:42:02

23 years? Yes, 23 years. When did you get married?

00:42:06

We got married in 2010, but we met each other in 2002, started dating in 2005, 2006. Then in 2010, our parents were like, you folks are both crazy. You'll never find anyone better than each other. You gotta get married. And we got married in 2010. And we have two kids, the whole thing. So I always have to say the best thing computers have brought me is Aarthi. Because without that, I wouldn't have her, and none of this would be possible. The second thing which wound up happening is that I was writing all this code, and at the time, there was a Microsoft executive who was touring India, and they wanted some student who could do some of the things I was doing. And somebody had seen something I'd written online, and I get this cold email saying, hey, do you want to come out and do this little event for us? Because they want a student to come do a demo with this big-shot exec. And at the time I couldn't really speak English. I was kind of living in my bedroom doing these things on my computer.

00:43:12

And this is like, this is amazing. So I remember I got on my very first plane ride ever, and I'd never been on a plane before, and my first fancy hotel room. I think I was maybe 20 at the time. It's like, crazy, in this fancy hotel room Microsoft is paying for. It went really well. And this executive, it was amazing, he met me and my now wife Aarthi, and he was like, you kids should work at Microsoft. I was like, sir, we would love to, but we are here. We don't even know how to find you. So long story short, he said, let's find a way to get you in. Because he was kind of impressed by all the things I was doing, and it was very connected to some of the things Microsoft was also working on at the time, like cloud computing. So he was like, hey, there's this kid who's doing these things that we are also working on. Let's just find a way to connect the dots. And so a year later, they flew me to Seattle, to Redmond, where Microsoft's offices are.

00:44:11

And I remember flying in, and Seattle is a beautiful place. The Pacific Northwest is beautiful. And I just fell in love. I was like, man, this is all I want to do. I've grown up watching Hulk Hogan, watching the Rock, watching Star Trek, watching these movies, and I'm now here. And even though I couldn't really speak English very well, even though I didn't really fit in very well, I was like, computers I can do. Let me figure this out. So that sort of started my whole journey. So I think, without my parents, without the environment I had, without my dad spending that money and taking his bet on me, you know, he was not sure what a teenage son was doing on the Internet late at night, right? He was like, I don't know what's happening there behind the closed door. But they took a bet on me. And one of the things you realize as you get older is how fortunate you are, right? So without my parents being so focused on me and making sure I had a better life than they had, giving me these opportunities, without these people who just saw me and were like, hey, I see something in this kid.

00:45:26

Let's take a bet on him. And I would say, mostly, I would say this country, right? You know, every single thing I have professionally, my wife has, it has been possible because of America. We lived in Seattle, in San Francisco, and now live in Washington, D.C., right? So I'm so grateful for all these things, and I've just been very fortunate. I've been in the right time, right place, and I've been very, very fortunate.

00:45:53

Yes, yes, yes, you have. And both you and your wife are wildly successful. Are your parents still alive?

00:46:02

No. You know, my father died in 2006, and my mom died three years ago. And one thing you don't realize: so in 2006, I was, what, 22, 23 when my dad passed away. It's sort of a big regret in a way, because I never got to show him some of the success I wound up having later. I think, in a lot of ways, you want to show your parents, oh, my gosh, mom, dad, look what I made of myself. Look what I've become. And at the time, I had just gotten this job, and he knew it was a big deal, because I was doing these cool things, and I was in the newspapers and whatnot. But I never got to show him, you know, that, man, all the investments that you made in me, look, I've done something with that, right? I also kind of missed out on having an adult relationship with him, which I think is quite important. Yeah, right. I never really got to grow into my 20s with my dad, so.

00:47:11

And obviously, look, a lot of people don't have parents at all, or they didn't have the benefit of the childhood I had. But I sort of feel like I missed out, because I was a kid growing up, right, and you knew your dad in that capacity. But it saddens me that he never saw me have my career, never saw me get married, have my kids. So I never had a sort of adult relationship with him. I never got a chance to provide for him. I would have loved to. He was a rebel, by the way. Back in the community I grew up in, right, it's very community-oriented, et cetera, and you're kind of supposed to stick to your own lane. But he was a bit of a rebel. He was like, I want to be a writer, I want to write my own books. He had all these creative ideas. I think if he had been alive five years later, he would have self-published on the Internet and, you know, probably been in the YouTube comments with conspiracy theories.

00:48:07

He was that person. But he missed out, right? But I think I have some of that nature in me, to speak out. So one of the things he really taught me is that, you know, a lot of people would be like, if you see something wrong, you mind your own business. But if he saw something wrong in our community, he'd be the first person to go speak up and try and help out. And as a kid, you're like, man, I don't want to get into trouble. Like, what is this? I want to keep my head down. But he was always good at that. So I miss him. I feel sad that I didn't get the chance to show him everything and maybe provide for him. My mom, you know, she passed away three years ago, but I'm very fortunate that she got to see my family. She built a very close relationship with my wife, with my first kid. And yeah.

00:48:52

I mean, what would you say to your dad if he were here today?

00:48:57

Man, think about this: that I somehow made something of myself. I didn't really say it or show it to him, but, you know, like a lot of parents, he was very proud of me. But sometimes we would have friction. And I just want to show him that, in the very imperfect way I am now, I've sort of made something of myself, right? And a lot of it is thanks to him and a lot of the bets he took. And my hope is that, somehow, some way, he knows that. But that's the one thing I always feel like I never got a chance to do. You know, you want to buy your parents cool stuff when you make some money, right? People ask me, what is it, when you ever make some money, right? And I think one of the best things to do is to go buy your parents something ridiculous that they will never buy for themselves, right? And I got jewelry for my mom, by the way. But I never got to do it with my dad.

00:50:11

So I always feel like, you know... But I think the deeper notion would be to say, hey, it kind of worked out. You know, I made something of myself. And thank you. What about you? How do you feel about your relationship with your parents and how it has evolved over time?

00:50:26

I think, you know, me and you share a very similar sentiment. I just always wanted to prove to my parents that I could be something and do good for the world, and it's something that I took on at the age of 18 when I joined the SEAL teams. I mean, that's really what pulled me through. Yes, I wanted to go to war. I wanted to fight for the country in the highest capacity possible and be the best that I could be, because I had failed at everything up to then. I didn't make good grades. I wasn't very athletic. I didn't really have much going for me. And so, you know, what really pulled me through all that training and got me in was I had a horrible fear of telling my dad that I failed again. And so that stuck with me at 18 in BUD/S. That's what got me through. That's what got me into the agency. You know, I wanted to. I just.

00:51:26

Just.

00:51:27

I just always wanted my parents to know, you know, they did a good job and that.

00:51:32

That.

00:51:32

That I could make something of myself, regardless of, you know, how my childhood went. And my parents are still alive, and we have a very, very close relationship.

00:51:42

How did your parents react when you first signed up?

00:51:46

My days don't slow down. Between work, the gym, and time with the kids, I need eyewear that can keep up with everything I've got going on. And that's why I trust Roka. I've tried plenty of shades before, but these stand out. They're built for performance without sacrificing style. I've put them through it all on the range, out on the water and off road. They don't quit. They're lightweight, stay locked in place, and are tough enough to handle whatever I throw at them. And the best part, they don't just perform. They look incredible. Sleek, modern, and designed for people who expect more from their eyewear. No fluff, no gimmicks, just premium frames that deliver every single time. And that's why Roka is what I grab when I'm heading out the door. Born in Austin, Texas, they're American-designed with zero shortcuts, razor-sharp optics, no glare, and all-day comfort that doesn't quit. And if you need prescription lenses, they've got you covered with both sunglasses and eyeglasses. One brand, all your bases. Roka isn't just eyewear. It's confidence you can wear every day. They're the real deal. Ready to upgrade your eyewear? Check them out for yourself at Roka.com and use code SRS for 20% off sitewide at checkout.

00:53:11

That's R O K A dot com.

00:53:16

Well, what is that conversation like?

00:53:19

That conversation, when it came out... Like I said, I didn't make good grades, but I had a lot of potential. I was a smart kid. I was really good at math, but I didn't apply myself because I wasn't interested. And it came out one day when he was really frustrated. I had just gotten a report card or midterms or something, and it was not good, and I know that feeling: C was probably the best grade. And it was an argument. And he said, I'm not going to pay for your college. I'm not going to do it. He's like, you're not going to apply yourself. And I said, I don't give a shit if you pay for my college or not, because I'm not going to college. I'm going to become a Navy SEAL. And at that moment, everything changed. And, I mean, I think he thought I was full of shit, because I didn't follow through on anything that I'd said I was going to do. And he asked me, you know, how serious are you about it? And I told him, I said, yeah.

00:54:28

I've had multiple meetings with the recruiters already. I've already talked about it. I want to sign. I was only 16 or 17 at the time.

00:54:36

And why did you want to be a seal?

00:54:39

I don't know. You know, I just always grew up playing G.I. joes, watching G.I. joe. I mean, your thing was professional wrestling?

00:54:46

G.I. Joe, actually. G.I. Joe equally. I'm trying to get my kids into it right now. Sorry, what was your favorite G.I. Joe toy growing up?

00:54:54

Ooh, Snake Eyes.

00:54:56

Perfect Man. Badass Ninja.

00:54:58

Yeah.

00:54:58

Snake Eyes versus Storm Shadow. Right. Like, and the rich kids would have this kind of Sky Striker jet plane. Oh, man. Yeah. Sorry. Yes.

00:55:08

But that's what got me. So early on, in my younger years, fourth grade, my dad took a job as a pharmacist in the Army, and we went to Germany, and that's when Desert Storm was kicking off. So anytime we were at the bookstore, I was always grabbing all the magazines and books about what was going on over there, and I would just look at the pictures of tanks and helicopters and planes and military war fighters. And I was always out in the woods building forts, carving spears, making bows and arrows and shit like that. And so that just always got me really interested. And originally in my military career, I didn't really care. I just wanted to go fight in a war somewhere. So I wanted to be a Marine. Then I started looking into that, found Force Reconnaissance, talked to the Marine Corps at the recruiting station. They laughed me out of the office because I was about a hundred pounds soaking wet.

00:56:13

Went to the Army, wanted to be a Ranger or a Green Beret, same deal. Laughed me out of the office. And the Navy recruiter stuck his head out and said, hey, you want to be a SEAL? And I didn't even know what a SEAL was. And I was also very into all the Vietnam generation movies like Apocalypse Now, Platoon, all that kind of stuff. And so they gave me a pamphlet on being a SEAL. And I read it and went to the library, checked out a bunch of books, started watching movies, and I was like, that's it. That's what I want to do. And luckily I wound up making it through and followed through. And that took me into the CIA, and I had an awesome career there. And then somehow I wound up podcasting.

00:57:01

Man. Well, thank you for everything you've done. But I'm so fascinated with just how you signed up, because I'm very curious about why people pick the careers they do, especially for something like that. Because when you're... How old were you when you signed up?

00:57:15

17.

00:57:16

Right. So super young. You don't really have a sense of the world, but maybe you did, because you'd kind of been around the environment a little bit.

00:57:24

Like, I'm curious about like, what is it that makes someone like, be like, all right, I'm going to do this. I'm going to serve my country. I'm going to put myself in harm's way. Right. What do you think some of the general motivations are when people go like, I'm going to go sign up?

00:57:37

I mean, I think it's always different, especially in the SEAL teams. I mean, I wasn't the youngest, but I was maybe the second youngest guy in my BUD/S class. And you know, when you go to BUD/S, you have guys that are coming in from other special operations units that had already been to war, and they're trying to operate at the highest level that they possibly can, you know, in what they think they want to do. And so you've got 18-year-olds up to 30-year-olds, you know, that are trying out to become a SEAL. And you know, for me, I can't say that I was overly patriotic. 9/11 didn't happen until I was already in; it happened, like, right after boot camp, so I was already in. But even when 9/11 happened, I mean, I knew we were going to war, but I didn't understand what that meant. So like I kind of said at the beginning, for me, I mean, that's just what I wanted to do. It wasn't necessarily just for the country. It was what I wanted to do.

00:58:48

I just wanted to experience that, you know, at the highest level. And then on top of that, even more than my own desires, I just wanted to make my parents, and particularly my dad, proud. Because my brother and sister were good students, good at athletics, better at everything than I was. And I could see that. I could see the interest, especially in my brother. My brother was a really good ball player, baseball. And I could see my dad gravitating towards, you know, what my brother and sister were doing, and kind of like, well, Shawn's just the screw-up of the family. And I knew I needed to, you know, for my own mental health, change that. I just wanted my dad to be proud of me.

00:59:42

Was there a moment through your career, serving or elsewhere where you're like, man, this is the moment. I know my parents are proud of me.

00:59:51

Yeah. After Hell Week. After Hell Week. Do you know what Hell Week is? Okay, yeah.

00:59:56

Thanks a lot, by the way, to your show and hearing people talk about it on your show too.

01:00:00

Oh, cool, cool. But yeah, that was the moment, you know. That was the first moment that I felt like, man, even though I'm only four weeks into a, you know, over-two-year process, I was like, that's the biggest hurdle, you know. And so my dad was my first phone call, and I said, dad, I made it. You know, the rest, it's going to get easier from here. It didn't get easier, but that was the big hurdle, and it felt really good. And then graduating BUD/S, getting into the SEAL team, first combat deployment, finally, you know, getting into contracting at CIA. I mean, it's always in the back of my head, even still today, you know, with the interviews that I do. I take this very seriously. And I've made a lot of mistakes in the podcast world, but in general, you know, I have an ability to get stories out, you know, and make people comfortable during these interviews. And, you know, it feels really good to be able to take somebody, you know, like I was telling you at breakfast, to take somebody that nobody's ever heard of, who's starting a business, who's been through struggles, and get them in here and get them vulnerable.

01:01:29

Especially with some of the special operations guys. There's a cost that comes with doing that job, both mentally and physically. And there's a culture within that. And, you know, the culture within the SEAL teams and special operations, and even just the military community in general, is not one that's widely accepted by civilians. I mean, there's a lot of drinking, debauchery, womanizing, fighting, bar fights. And now I can't even remember where I'm going with this. But to get somebody vulnerable, to talk about what that's like, and suicide attempts and stuff after. And a lot of these guys, you know, they get out, they don't have a voice. They aren't able to document what happened over there and to showcase all of that: what their career was like, what it was like getting out, the suicide attempts, the drug addictions, the womanizing, the infidelity. And to have somebody come on here and talk about all that and how they got out of it. Because a lot of veterans feel trapped. They don't fit in with regular society, you know.

01:02:51

And regular society is fascinated by the lives of, you know, what me and my former colleagues used to do. And so to open that up, you know, it's like giving the American. Not even just the American, the world. I mean, it's a huge podcast now. People all over the world listen. To be able to get a peek behind that curtain on what that life actually is like, from somebody that's lived it, is awesome. And then the audience gets attracted to it. They're invested. I mean, we go from childhood all the way to current date, and to give somebody a glimpse into what that journey looks like. I mean, before this podcast, it was unheard of. You know, you didn't get that deep. And so when the audience gets invested into a story like that, these guys that are, you know. I think a really good way out of that downward spiral after military service and after war fighting is entrepreneurship. It's a great segue out, because we talked about purpose right at breakfast today and how people need a purpose, whether they were a factory worker or auto mechanic or whatever.

01:04:12

And we were talking about AI taking jobs. I mean, for military guys, I think entrepreneurship gives you. For anybody, I think it's an enormous amount of purpose. And guys and women coming out of the military, especially in the special ops community, are people that want to be the best at what they can and what they do. I mean, they're going to implement that in whatever they do, but they don't necessarily get the traction, the exposure that they need to create a successful business journey as an entrepreneur. And time and time again, we've had guys in that didn't have the exposure, but I could see the drive, and I could just see them in the hamster wheel not getting any traction. So to be able to bring them in, tell a life story, get the audience invested in their story and who they are as a person. It just, you know, some of these stories are so wild and real that people don't care what your business is. They just want to see you as a human succeed, for all the sacrifices that you've made for the country.

01:05:26

And it brings other veterans hope, and it really brings anybody hope. It's like, man, if this guy's dealing with this shit, then maybe I can, and maybe I can push through what I'm going through. And we're able to turn a lot of startup businesses into multimillion-dollar businesses almost overnight. And that's what I love to do more than anything. Or take somebody like you. I mean, you have a huge name in the tech world, you're very connected, you're very successful. But I think there are still a lot of people that don't know who you are, who you are as a person, all your accomplishments, where you came from in India, in a middle-class family in India. I think a middle-class family in India looks a lot different than a middle-class family here in the United States, you know. And so to be able to bring somebody on like you, who, I wouldn't say you came from nothing, but very little, and to be able to come to the United States and build what you built, you know, the life for you and your wife and your kids. I mean, that is going to bring at least one person that's watching this hope, you know. Because we live in this society where it's become very popular to victimize yourself and make excuses on why you're not finding success, and it's always somebody else's fault.

01:06:59

Right.

01:07:00

But, I'm a huge believer, you control your own destiny. And when you find what your gifts are, what you're good at, what you're interested in, I don't believe there are limits. You know, I think, yes, there are limits. Like, I'm not going to be a professional wrestler. You know, I'm a. We'll get you working.

01:07:21

Like, don't, don't cut yourself short. Sorry.

01:07:24

But, you know, I'm a buck eighty. It's just not going to work. But I find the things that I'm interested in, that I'm good at, and I don't think there are limitations. I think the sky is the limit, and you can probably go higher than that. And you are a perfect example of that.

01:07:41

Thank you.

01:07:42

That if you have the drive and the work ethic, and you don't spend your time looking for excuses on why you're not successful, you will rise to the top. And stories like yours, or Palmer's, or the veterans', or Lonsdale's, or a lot of the people that I brought on. A lot of these guys, they didn't come from generational wealth. They built something out of damn near nothing.

01:08:10

That's America, right there.

01:08:12

That's it. That's it. And that's why I do it.

01:08:16

And.

01:08:19

So, long circle back.

01:08:21

I just want to say, well, thank you. But, you know, one of the. I don't know how much people can see of just the room we are in, but before we started, you kind of gave me a tour, and it's just a powerful space, because every object here kind of has some meaning. And so many of them, you know, are from people with just these insane, amazing stories. You know, often people who have sacrificed so much for the country, and that is so powerful. And I think one of the things that you've just done so amazingly well, and I'm not saying this to kind of blow hot air up your behind, is create a space where people can share things. I think we were talking at breakfast about your episode with one of your closest friends. And one of the things I loved about that was just seeing him, seeing somebody who's such a great sort of masculine figure, but at the same time opening up about some of his struggles dealing with family, dealing with all the things that he had to overcome. And also in the business world. I was very interested when he talked about how he was navigating the business world.

01:09:36

And so I think you've just created this amazing space and platform for people to be part of. So, well, thank you for that. I think what you've done is just amazing.

01:09:45

Well, thank you for saying that. But let's get back to your story. So you go to Seattle and you start working for Microsoft. What did you think of the United States?

01:09:55

Oh, man. Dream come true, right? Like, you grow up in a lot of places around the world, and, you know, you watch this in movies. You know, I grew up on GI Joe, by the way. Like, you know, I have a whole big collection. I grew up on Top Gun, Independence Day, you know, Rambo, Rocky, all the classics. So, you know, watching it on TV, you have an idea. I think the phrase American dream gets thrown around. And thank you for that. I didn't know any of that. I just was like, this is a place where I can do something with my skill set. And one of the ways I can maybe make this useful is, a lot of times people ask me, what is my advice for young people, people in their early 20s. And I remember when I first got to Seattle and Redmond, I was just so lost. I didn't have a driver's license. I had never seen snow before, and it was snowing, and my now wife and I, we were like, oh, my God, we had to walk a mile to the grocery store in the snow.

01:11:00

And we didn't have an assistant for a while, because we had to figure out our paperwork. And we just felt so lost. We didn't know a lot of people there. But when I walked inside a Microsoft building and I would see these computers with coding windows, I felt incredibly comfortable, because I knew that I was very, very good at that. And I knew that, on that, I could go toe to toe with anybody in the building. And I think that sense of mastery is very important. Right. Maybe that might have been my arrogance. Maybe I wasn't that good. But if you're kind of lost, you know, one of the things I tell people is find a way to become a master of one tiny niche thing. So, for example, when I was there, we had this big presentation we had to do for Bill Gates and the other senior execs who were kind of running the show at the time: Bill Gates, Steve Ballmer, and this guy Ray Ozzie. And one of the things I did was I became the guy who was, for the thing I was working on, the voice of the customer.

01:12:19

What I would do is I would go look at every single Internet complaint, every single one. I would call up customers. I'd be like, hey, this is so-and-so at Microsoft. They'd be like, why are you calling me? And I'd be like, hey, I just want to know what you guys think. And I was not supposed to do that. I was on the engineering side. And the reason I tell the story is because over a few months, because I had the technical background and I was building some of these products, but I was also talking to customers on this very niche thing, I became like the worldwide foremost expert on that thing: how people were using it, what were they doing, what were the issues they were running into. And by the way, it was a tiny thing. I was a master because nobody else cared enough to make themselves a master. But the reason it was great for me was when I would walk into a meeting, there would be people who were much older. Okay, now they're younger than me. But at least when you're in your early 20s, you know, somebody who's like 40 years old, you're like, oh my God, that guy's ancient.

01:13:24

So, you know, now I'm older. But I had an accent, right? I look different, I'm weirdly tall, as you know. But when it came to this topic, which is about this kind of programming we could do, I was like, I'm the expert, right? And even if the others didn't know it, I knew it, because I'd spent every day, every night for months and months, and nobody could take that away from me. So when you have these huge meetings. I once had to make this presentation to Steve Ballmer and these execs. I was comfortable because I had done all the work and I was an expert. And so Microsoft was very intimidating, because it had all these legendary people, some of these real icons. But when I built this sort of sense of mastery, I found my way to comfort. So I always talk about this when I talk to young people or even people trying to break into the technology industry, which, you know, can seem bizarre. It has all these crazy larger-than-life personalities like Elon, Palmer, et cetera. But if you can make yourself, and everybody can make themselves, the absolute expert in one area, you're going to feel so good, and it's going to start opening doors.

01:14:45

And so that was a big part. The second thing was, I think a lot of people took a bet on me there, and I'm very grateful. One of the things I absolutely want to underline is I am the result of so many things going my way and so many people taking a chance on me when they did not need to. Okay, and let me name a couple of people. And so this is going to get very nerdy and technical, so sorry about that. But one of the things Microsoft had a lot of were these amazing technical geniuses who were really good at one thing. And one of the most iconic people was a guy named Dave Cutler. And Dave Cutler is known as probably the greatest programmer of the 20th century, one of the greatest one or two, in fact. Palmer will tell you how much he admires Dave Cutler. And the reason he's known as the greatest programmer of the 20th century is he basically built the version of Windows in the 90s that became really popular. And he's a personality. When I showed up, he was in his late 60s; he's now in his 80s. He was known for being incredibly rude and mean to people.

01:16:04

There are legendary stories of him punching holes in walls, of throwing people out of his office. He was not a person who suffered fools gladly, okay? But this guy had an insane work ethic. He had more money than God, but he would show up to work every single day, you know, at what must have been late 60s then, now he's in his 80s, and write code, okay? So I remember, you know, I used to idol worship this guy, right? And computer science people, they still idol worship him. I needed to find a way to impress this guy. And I didn't know how. I was terrified, right? And so what I would do is, you know, he would show up on weekends. I remember him once showing up on December 26th and working, right? Which not a lot of people do. And so I would start showing up on weekends and just see him in the hallways, et cetera. And then after a while, he was like, all right, this young punk is here. And then after like six months, I found an error in his code. This scared the shit out of me because, like, he was.

01:17:08

He's basically like royalty when it comes to programming. Like, you know, think, I don't know, you're going up against Michael Jordan and you're saying, hey, by the way, you know your jump shot, your shooting form? I found an error, right? Or Steph Curry, you know that three-pointer? Like, I saw something last night that I think you can fix. So this is on that level. So I was terrified. But I sort of went into his office and I was like, sir, I have this thing. And he goes, ugh. And he looks at his computer. I'm like, man, this is the end of my career. And he was not above firing people, too, by the way. The end of my career right there. And he was like, you know what? You're right. And I was like, okay. Then I ran out of the office. But, you know, the thing about him was he was so intellectually honest. So he took my fix. He made this bug fix in his code. And then, you know, him and others took me under their wing a little bit.

01:18:07

There are others. A guy named Barry Bond, who was a deep mentor for me and my wife. He would have lunch with us every day just to kind of teach us things. And so a bet was being taken on me, right? And Dave was royalty, right? It's like Michael Jordan now seeing some punk kid and being like, I'm going to take you under my wing a little bit. So he took a chance on me. And when I had that stamp of endorsement, other people were like, well, you know, if this guy survived Cutler, well, at least he can survive, you know, this crazy old person, and he's good. So it helped me. So a lot of people took these crazy bets on me at Microsoft. So when I think about Microsoft. I mean, look, Microsoft is a crazy, complicated company and a lot of people have mixed emotions about it. But when I think about that time, I think about all the people who took a bet on me, even though they had no reason to. And so now my wife and I are in the position to take bets on people.

01:19:05

And I always think about, okay, how am I trying to spot the 20- or 25-year-old version of me? What would they look like? Where would they come from? Maybe they don't know anything about computers, maybe they don't know anything about AI. Let me find a way to just take a chance. Because I do think one of the most powerful things you can give a young person is this idea that they are capable of much more than what they realized. It's one of the most powerful things you can do. And I have been fortunate to have a couple of people do that for me. Where I'm like, wait, I didn't even know I could do this meeting, right, or I could start this project, or I could start a podcast. But that was so powerful, because you need that belief, right? And I think about now, how do I find ways to give that, especially to young people: okay, you are capable of so much more than what you realize right now. It may not be easy, it may take a lot of hard work, you may not accomplish it, you know, whatever. But you are capable.

01:20:18

And I think that is one of the most powerful gifts that I've been given, from my parents and others that I worked with. And, I don't know, whenever possible, I try and find a way: can I capture some of that and give it to somebody else?

01:20:31

You empower people.

01:20:33

Well, thank you. People empowered me. They gave me belief in myself when I didn't have belief in myself. Maybe they gave me too much belief in myself. Some people would say that too. But it's a gift and I try and pay it forward whenever I can.

01:20:46

I love that. I love that. I wish more people that find success did that. And I think a lot of people do. A lot of people that I talk to and hear do that. People have done that for me, you know.

01:21:00

But maybe. You had such great stories. I just want to say, I don't want to put you on the spot, but when you were talking at breakfast, you know, you've given so many of these people this platform, and, you know, I don't want to sort of reveal this, but you were like, hey, you are capable of doing this, and, you know, you're capable of so much more. So I think you have done justice with this podcast, too.

01:21:20

Thank you.

01:21:21

Thank you. I mean, it's how we make the country great.

01:21:25

Yes.

01:21:26

You know, you pass it on, you pay it forward, you empower people, you give them confidence and just shower them with positivity. And, you know, then it's up to them if they go on to do great things or.

01:21:39

Yeah.

01:21:39

Or continue on the same route that they were on.

01:21:41

But I think that's, in some ways, one of the most beautiful things about America: this idea that you have a shot. Right. And I'm not going to pretend it is easy for everybody. Right. You know, a lot of people just come from different walks of life, have bad shit happen to them. But I think at the core of it, one of the promises of this country is that, you know, if you work hard and if you apply yourself and you do all the right things, good things will happen. You have a shot. A lot of other places in the world don't give you that, by the way. You're kind of told you have to stick in this lane, or you don't have any opportunity. I do think at the core of this place is the sense of opportunity. And, by the way, I think about, with AI, how do we as a country expand the opportunity set, give people way more opportunity than they had before? So that's one of the things I think about when I think about my current job.

01:22:45

Yeah. Where did you go from Microsoft? Did you go to Meta, or was it.

01:22:49

So my wife and I, we idolized Silicon Valley, the culture of Silicon Valley. When we were growing up, for our first date I couldn't afford anything. I had this tiny, slightly smelly one-bedroom, and we had to see each other. And the parents didn't know we were seeing each other. So she would kind of sneak in and we would watch movies about Silicon Valley. Right. One of the greatest movies about Silicon Valley, by the way, is this movie called Pirates of Silicon Valley. It's a movie about Steve Jobs versus Bill Gates in their young heyday and how they competed with each other. And we would be like, man, someday we want to be there. That was, I think, a BitTorrented CD. Like, we didn't exactly pay for that. We couldn't afford the actual real thing. Hopefully I don't get in trouble for that, like, 24 years later. But, you know, we were like, someday. Because Silicon Valley was the land of opportunity. You went there, you know, you were good at computers, you could make something of yourself, right? So for me, like, you know, for example, if you're playing football, what do you dream?

01:24:01

You want to play in the Super Bowl, right? You're a pro wrestler, you want to main-event WrestleMania, hopefully at Madison Square Garden. For me, well, I also want to main-event WrestleMania, but I wanted to make something of myself in Silicon Valley. So, anyway, in 2011, 2012, we decided, okay, Microsoft has been great for us, and we did great. My wife and I had fantastic careers there. We made a lot of friends. We did very well for ourselves. We were like, we want to take a risk and go down to Silicon Valley. We had just gotten married too, by the way. And the way we did it is we said, okay, there are two of us, so we're going to sort of hedge risk between us, and one of us is going to take a chance and build a startup, and one of us is going to go get, like, a regular job and paycheck. And so one of us is paying the bills and the other person can kind of follow this entrepreneurial path, and we can kind of take the pressure off each other. And the deal we made with each other was that we would kind of alternate.

01:25:01

Like, somebody will go start a company first, somebody will go get, like, a regular job, and then maybe we'll switch it around. Like, we didn't know what we were doing, right? We just wanted to be part of Silicon Valley. And I will say Silicon Valley is a magical place. And I think it is one of the unique advantages America has over a lot of other parts of the world, because of the combination of capital which flows in, the talent there, the density of the companies there. It's not perfect, there are a lot of issues with it, which we can talk about, but it is magical and it is unique. So we wanted to be there. So we quit our jobs at Microsoft and we flew down to the San Francisco Bay Area, like, all right, what do we do? Right? And so a year later, my wife. Do you know what Y Combinator is?

01:25:52

Yes.

01:25:53

Okay, so Y Combinator is this very popular startup incubator, maybe one of the most popular startup incubators. And at the time, they were maybe not as popular as they are now, but they were still a bit popular. And what they would do is, if you had an idea, you go to them, you apply, and they pick maybe 20, 30 companies a batch. And some amazing companies have come out of that: Airbnb, Coinbase, and Brian Armstrong, I think, talked about Y Combinator here, Dropbox, lots of great companies that come from there. And so my wife had this idea. She applied, she got into YC. And YC, by the way, doesn't give you a ton of money. I think at the time they gave you, for the entire company, maybe like $80,000 or something. So it's not a lot. And I was like, okay, I need to figure out how to just get a paycheck, get some regular money. And so I wound up joining what was then Facebook, now called Meta. And some of the older people here might remember this. Facebook then had just gone public, and they were in a dark hole because, number one, the IPO had gone terribly poorly.

01:27:06

They had gone public, I think at like $45. The stock price had plummeted. Second is, there was this big question about the whole world moving from desktop computers to mobile, and Facebook was only making money on the desktop. So there was this big question. First of all, the idea of social networks making money seemed laughable, right? Like, people would just laugh you out, because people had seen MySpace, they had seen all these other companies fail. They were like, I don't even know why this is a thing. Second was, they were like, nobody can make money on mobile. So they were in a bit of a hole. And I had some friends there, and I told myself, listen, I want to do something which is very different from Microsoft. I want to do something which involves consumers, because I was really interested in consumer psychology. What makes human beings use products, how they interact with products, how to build the technical algorithms, now maybe we'd call it AI, you know, we didn't really have that word then, which kind of interact with them. I was very interested, and I just wanted to do something different. So I got a job at Facebook, and I wound up working on Facebook advertising.

01:28:11

Now it's a monster, you know, I don't know how big a company. At the time, it was like, the stock was down. And again, I got very lucky. You know, I wound up building this ad ecosystem product called the Facebook Audience Network with some amazing, smart people. And we went from zero dollars to a billion-dollar business in three months.

01:28:38

In three months, yes.

01:28:40

I remember, like, you know, I think $2.7 million a day is a billion-dollar run rate. And it was a rocket. And look, when you look back now, I think we were lucky for a few things. One, people were starting to buy things on their phone, because the iPhone had come out in 2008. I think the App Store came out maybe a few years later. And one of my mentors tells me there is no difference between being wrong and being early. They're just the same. But in 2012, 2013, people were starting to buy things. So there was commerce happening. So when there was commerce happening, people wanted to advertise and push their products: Target, Amazon, mobile games. Second, Facebook at the time had actual authentic people, not bots, which were a huge problem on the Internet. And third was, we were able to marry that with these algorithms and products. And the big lesson I learned at Facebook is just the power of working with great people. Because I had this small team, this guy named Vijay and others, who were just fantastic engineers. These are the kind of people who would go off on a weekend and rewrite an entire system which had taken a year for a team.
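(As a quick aside for readers who want to check that run-rate arithmetic: here is a minimal sketch. The $2.7 million-a-day figure is the one mentioned above; everything else is just illustration, not anything from the conversation.)

```python
# Quick check of the run-rate arithmetic mentioned above.
# The $2.7M/day figure comes from the conversation; the rest is illustrative.
daily_revenue = 2_700_000                 # dollars per day
annual_run_rate = daily_revenue * 365     # annualize the daily figure
print(f"${annual_run_rate:,}")            # 985,500,000 -> roughly a $1B run rate
```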

01:30:02

And they do it over a week. Right. Like, Palmer Luckey, you know, and I, we share a common sort of hero figure, a guy named John Carmack. Did he talk about John Carmack when he came on?

01:30:12

I believe he did.

01:30:12

He did, right. So John Carmack, you know, I think, by the way, one of the idols of the 20th century, John Carmack, was the guy who built Doom, the video game Doom. Right.

01:30:23

Oh, man, I used to love that game.

01:30:25

Yes, yes. So Doom. And so Carmack is a programmer. What he would do is, he invented basically what is called the game engine, which ran under Doom and then under Quake, all these engines. And then eventually he met Palmer, because he was very interested in VR, and they wound up doing Oculus together. And then Carmack got hired into Facebook. But the reason I bring up Carmack is that there's this great story at Facebook where they realized Carmack individually was doing the work of entire 200-person teams.

01:31:04

Holy shit.

01:31:05

Just one person guzzling Diet Coke. Right? Like, I think he now lives in Dallas. Right? Like, sitting by himself, just doing the work of 200 people at one time. You know, Facebook HR realized, like, you know, they had nowhere else to promote him to, because they'd just run out of levels to give him. And he was just a machine. And so the reason I bring this up is, it is very hard to sort of get yourself to the top of an industry unless you know what greatness looks like. So I was lucky, because when I joined Facebook, I was surrounded, just by sheer serendipity, by some great engineers, just like I was at Microsoft, right? Even though these guys, by the way, just to be clear, they were a thousand times better than me. They would run circles around me with anything technical, without even batting an eyelid. It's like me playing a pickup game against LeBron or something, right? But I saw what greatness looks like. Which meant that many, many years later, when I started doing investing, or when I meet entrepreneurs now, I have a rubric in my mind of, okay, I know what elite talent looks like.

01:32:22

Like, I see the work they put in. I see how they talk, how they think, how they spend their free time. So now I can sort of, you know. If you know how Steph Curry shoots threes, and then, you know, how the kid you play with at the gym shoots threes, you kind of see what greatness looks like. So I was exposed to some real greatness at Facebook. So I wound up doing ads. It became a huge hit at Facebook. And I think Silicon Valley is one of those places where it has a lot of good things, but everyone's kind of looking for, what is the win you've had, what is the thing which kind of says, okay, you are somebody who's capable of something. And that whole thing gave me that. I started to just get known as, oh, Sriram's the guy who kind of built all this stuff at Facebook. And if you Google me from back then, I would start showing up in all of these press pieces. It was very important. I think it put me and my wife on the map in Silicon Valley.

01:33:24

And so I'm very, very grateful for that. Wow.

01:33:28

Wow. You had mentioned something earlier, about five minutes ago. I think you said there's no difference between being early and being wrong. What do you mean by that?

01:33:39

So. So I stole this from somebody. We should talk about Marc Andreessen. Marc Andreessen is the inventor of the web browser. He was the founder of Netscape. Did you use Netscape?

01:33:52

Oh, yeah.

01:33:53

Great, right. So the spinning logo with the stars and all of that. So Marc Andreessen was kind of one of the original boy wonders of the sort of technology world. He built this browser called Mosaic, and then he started this company called Netscape, which was kind of this darling child of Silicon Valley in the 90s and then got crushed by Microsoft. But then 10 years later, him and Ben Horowitz started this venture capital firm, Andreessen Horowitz, which we can get to. I wound up joining later. But for many years he was a mentor. He's still a mentor for me. And he has a great many sayings. So this is one of them. When you have an idea as an entrepreneur, right, there are a lot of things which can go wrong. And one of the things that you have to think about is: is this the right time for this idea, right? And history is filled with examples of companies that had the right idea but were too early and died. Let me give you an example. Do you know what the company Instacart does, right? It's.

01:35:01

It's a grocery delivery service, right? Or DoorDash. What does DoorDash do, right? Like, you buy a product, they bring it in from the nearby restaurant, or, you know, deliver you groceries, right? Exact same company as Webvan in the 90s, right? One of the most famous dot-com busts. People ask you, oh man, what is one of the biggest things in the dot-com era which lost money? Webvan. They were right, they were just too early, right? Or pets.com, right? Like, another famous dot-com bust. But people figured it out later. Or MySpace, right? Like, MySpace, or Friendster. Do you remember Friendster? Like, I don't know if you remember Friendster, but all these social networks. But MySpace I'm sure you were on, right?

01:35:42

Oh, yeah.

01:35:42

You know, it was okay for a while, and, you know, they sold to, I think, Murdoch, but, you know, Facebook did so much better. And I think as an entrepreneur, you have to think about: is this the right inflection time for my idea? Because when you connect the right idea, the right entrepreneur and the right time in history, magic happens. I'll give you an example. If you look at YouTube, right? So people think of YouTube as, you know, obviously these days, the de facto way to have videos online, right? But when they first came out, other people had tried, right? Like, you know, Google had this effort called Google Video. They tried to get videos online, others had tried it, but none of them had really worked. But YouTube captured this moment in time, because digital cameras were starting to explode. And you started to see the rise of early mobile phones which had reasonable camera quality, right? And in 2008, the iPhone comes out. So one is, you're starting to see more video production, right? Like, by regular people, right?

01:36:50

Without needing a camcorder or doing these home videos or needing a bulky camera, you're seeing video production go up. Second is, broadband around the United States was getting better. Super important to basically get these videos to show up. Like, I don't know, maybe even in the 90s, you'd open up RealPlayer, like, buffering, buffering, buffering, buffering. You watch a minute, buffering, buffering, right? By 2008, most of those issues were pretty much solved, and you could watch a video without it buffering. And so these guys, the YouTube founders, like Steve and Jawed and all these guys, built this product which, more than anything, just captured this right moment in history, right? And they rode that wave. And I think a lot of times the difference between a $500 billion iconic company and some company which runs out of money is not the persistence or the heart or the effort of the entrepreneur. It is just that they were at the wrong time. And so when you are an investor, which obviously I spent a lot of time as, often I was trying to think about: is this the right time for this idea to happen?

01:38:06

Palmer is another good example. So a lot of people had tried virtual reality in the 90s. There was this thing called VRML, where you were like, oh, let us embed virtual reality in your computer. And there are all these movies and TV shows which had virtual reality in them. The Matrix is the most famous one, but there was a Pierce Brosnan movie. All these things had virtual reality. But the challenge was that the hardware was too complicated. Nobody could figure out the hardware, and there was no Internet bandwidth to kind of show you these experiences, right? So what Palmer did, through his genius, and he told you this amazing story, him sort of figuring this out by himself, is he sort of captured the right moment in time with the hardware, figuring out all the low-latency interfaces so that the image being projected is reacting with you at near-zero latency. So that was the right moment in time. So I often think about: you need the right time.

01:39:11

The other thing you need is you need some luck and some magical lightning in a bottle. And especially with consumer products. I think with businesses and enterprises, it's a bit different. Like, you can go call up your customers and you can be like, hey, you know, what do you guys want? I can build that for you. But with consumers, I do think you need lightning in a bottle. So YouTube, you know, they started taking all these videos. I think an SNL video once went viral, and they were off to the races. Facebook has an interesting history. Everybody knows about the Facebook history, about Harvard. They've all seen The Social Network movie. But one of the other untold pieces of Facebook history is: how did videos become a thing on Facebook? Because for many, many years, Facebook only did photos, right? This all seems like ancient history, and anybody under the age of 30 is like, hey, folks, what are you talking about? I'm on TikTok and Instagram. But for a long time, you posted a photo on Facebook of your friends and you tagged them. No videos. Until. Do you remember this thing called the Ice Bucket Challenge?

01:40:17

Yes.

01:40:18

Right, right. Okay. This is Internet history. So the Ice Bucket Challenge was basically people raising money. And what you would do is you take this bucket of cold water, dunk it on yourself, right, with a video camera in front of you. And then, most importantly, you would then shout out four or five people. You'd be like, hey, I'm going to challenge my celebrity friend to go do the Ice Bucket Challenge. And it would spread. So I was at Facebook at this time, and Facebook had just launched video, but they could not really get people to upload video, or figure out what was going on. But the Ice Bucket Challenge had this remarkable couple of properties. One was that it was video, and people had to create it on their own phone, so it was personal. But second, you needed to tag your friends, because you have to tag some friends. Facebook just happened to have a platform where you could post a video and you could tag a friend. You could not do it on YouTube. So I remember being in this meeting with Sheryl Sandberg where there was this vertical straight line, and that was usage, because everybody was uploading videos onto Facebook.
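(To make that tag-driven spread concrete: here is a deliberately simplified sketch, my own illustration rather than anything Facebook actually ran, of why a challenge where every participant tags a handful of friends produces that kind of near-vertical usage curve. All the numbers are hypothetical.)

```python
# Toy model of tag-driven spread: each new participant tags a few friends,
# and some fraction of those tagged post their own video. Purely illustrative.
tags_per_participant = 5     # "you would then shout out four or five people"
acceptance_rate = 0.5        # hypothetical share of tagged friends who actually post

new_posters = 100            # a small seed of initial uploads
total = new_posters
for generation in range(1, 8):
    new_posters = int(new_posters * tags_per_participant * acceptance_rate)
    total += new_posters
    print(f"generation {generation}: {new_posters:,} new videos, {total:,} total")

# With a branching factor above 1 (here 5 * 0.5 = 2.5), uploads compound each
# generation -- the "vertical straight line" of usage described above.
```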

01:41:29

And so every platform has one of these stories. With AI, I would say it's ChatGPT, for example, which has a story. But I think my point is that with consumer products, you need lightning in a bottle, and you need the right timing. And often, you know, when I talk to an entrepreneur, I often ask them: what makes you so sure that now is the right time? Not four years ago, not four years from now. Why is today the right time?

01:41:54

Interesting. I never thought of it like that. And your explanation makes a hell of a lot of sense. So, you know, you went to work at Facebook and created the ad network and had a big part in the videos, and it sounds like you had a very big part in making it tremendously successful. Your wife went into entrepreneurship?

01:42:18

Yeah.

01:42:19

What did she do?

01:42:20

So she started this. I always tell people my wife is the more impressive person of the pair, because it is true. She's a multi-time entrepreneur. You know, she's actually been through the journey so many times. And so she started this company which was about renting electronic gear. The idea, which, by the way, I think she would say was maybe a bit ahead of its time, going back to our sort of theme, was that instead of buying, like, for example, you're surrounded by a lot of high-end camera gear here. Imagine you'd want to buy it. Could you just rent it and then try it out? Because you're going on a trip, or you're going on a photo shoot, or maybe you're renting a drone. That was a big deal, because drones are very expensive. Somebody was doing an ad shoot, or somebody was like, I just want to do this one scene for a day, and they would rent out this gear to them. So she raised money, you know, from Y Combinator, from a bunch of people, and they did very well for a period of time. I think they had probably a couple of dozen employees, maybe. They had a lot of customers who loved them, but.

01:43:27

And I think the challenge that they ran into, she would say, is it was a little bit early, because now you're actually seeing other companies do that at scale, because so many other consumer product categories have exploded which need that dynamic. And at the time, I think, she was doing cameras and drones. So yeah, she had a reasonable outcome, and I think she had a great experience. But she was also a bit early. But also at the same time, my wife and I, this is when we started to do some angel investing together. And she's a fantastic investor. That's what she does now full time. But we had a tiny bit of money, a little bit of money we had made from Facebook. And one of the things about Silicon Valley is that you just start meeting people who are starting companies all the time. And we started to learn how to do angel investing. And we would write incredibly small checks compared to what a lot of other angel investors write. Sometimes we wrote like a couple of thousand dollars, just because we wanted to support someone who was a friend.

01:44:40

But I think that, over the next four, five, six years, taught me a lot about investing. It brought me in touch with fantastic entrepreneurs, some of whom have actually been on your podcast before. And then, in a way, it kind of led me and her to our investing careers much later.

01:44:58

Interesting. Yeah, very interesting. And so why did you wind up leaving Facebook?

01:45:05

I was bored. I was a bit like, Facebook is. I think there's a pattern through my career where, once I've sort of felt comfortable and settled in, I want to be like, okay, I want to find my next thing, which really challenges me and pushes me. So I think they were great. A lot of friends there at the time; some have left, et cetera. I could have probably stayed. You know, I was making decent money. I would have made more money, but I was just like, I just want a different adventure.

01:45:38

I have that problem.

01:45:39

Yeah, yeah. Like, I mean, if you look at my, you know, when you were reading my bio, I was like, man, there's a lot of stuff in there, right? Like, you know, there's this podcast, pro wrestling. And part of it, I think, is just that I've sought out different adventures over time. And so I was very comfortable there. And also, there's a part of it, like, I'm very suspicious if I'm very comfortable, because I'm like, man, am I stagnating? You know, do I need to push myself? Right? Like, I'm also very competitive. So I would be like, am I not pushing myself hard enough? Because I'm just showing up in this job where Facebook, at the time, has become a big company. They don't really need me, right? Because that's how you build these big companies. They shouldn't have to need anyone. That's how these companies are built. So I get very restless when I'm comfortable. One of the good things about my current job: you're never comfortable. But I was just bored. I wanted an adventure, and I was kind of. I was advising a few entrepreneurs. I was doing some investing, and.

01:46:49

But I would say what that path eventually led me down to is the first of two times at Twitter, right? And so Twitter at the time, you know, there's a guy named Jack Dorsey who was running it, who was a founder, and they had been through a lot of really bad turns as a business. And one of the things I like to compare is Facebook and Twitter as companies. Twitter, by the way, is of course now X, run by Elon, which we can come to; I had some little involvement there. But there was a time, in 2008, 2009, when Twitter was seen as the potential $100 billion, $200 billion company. And Facebook was like, oh, just this toy social network that will never spread outside college. That obviously flipped. And part of the reason, I think, is that Facebook had a very methodical way of using metrics and data and numbers and experimentation. One of the lessons I really learned from some of the people there was to always be distrustful unless you have the numbers and the experiments to prove it; if the numbers or the experiments don't back it up, you should be prepared to change your line of thinking.

01:48:02

One of the things I think Facebook has been very good at is changing what they believe about things when they have new data. At least at the time I was there; it's a very different company now, and I don't really spend any time with them now, obviously. So I kind of learned that mentality of, okay, if I run this experiment, if I try something, if I learn something different, I'm going to change my worldview, right? They were very good at that. Twitter was very different, right? They had this product, 140 characters, it had worked, and then one day it wasn't really working. And they were a public company, they weren't making enough money. They had a CEO change. So it was a little bit of a spiral, right? Like, you're a public company, you know, people are comparing you to the other companies. You're getting like a new CEO every year or two. So it was a bad place to be. And they were spiraling and spiraling. And at the time, you know, I love Twitter, and now X, as a product. I use it all the time, right? Like, you know, it has given me so much.

01:48:57

A lot of my personal relationships have come from that. Professional relationships have come from that. I think it's just a fantastic product for the world in terms of, you know, just what it has done. And, you know, somebody reached out to me and said, hey, would you want to help? And I was like, yeah, sure. You know, it seems like a challenge. The company is in a bit of trouble. It was not glamorous, right? Like, you know, it was not the sexy company to work at, but I really liked it and it seemed like a challenge. So I wound up joining there. And that was quite the thing, because one of the things I did not realize about Twitter was how insanely political, in many different ways, it was on the inside, right? So that was quite the adventure. But that was the gig. So I joined Twitter, I think, in 2017.

01:49:41

2017, yes. Before we go further, I'm just curious. I mean, what are your thoughts about just social networking in general? I mean, it's caused a lot of problems in the world. I think it's also connected a lot of people in the world. I mean, I've made a lot of contacts off X, Facebook, Instagram, not so much TikTok. But, you know, I feel like it's a double-edged sword. It's also a great way for government, for anybody, to kind of map out who you are. It's immediate. I mean, I use it: when I get a message on any one of these social platforms and it seems like an interesting message, the first thing I do is I go to the friends list, see who the mutual connections are. And so, you know, for a lot of people, myself included, your life goes onto these social networks. It's easy to map you, it's easy to map your connections, everybody you're connected to, who they're connected to. I mean, I'm just curious, you know, what are your thoughts on all of it?

01:51:02

Social media has completely revolutionized the world in a multitude of ways. I know firsthand how tough hiring can be. Finding the right fit takes time. You wait for the right candidates to apply, sort through stacks of resumes, and then try to line up an interview that works with everyone's schedule. It's a constant challenge. Well, the future of hiring looks much brighter, because ZipRecruiter's latest tools and features help speed up finding the right people for your role, so you save valuable time. And now you can try ZipRecruiter for free at ZipRecruiter.com/SRS. With ZipRecruiter's new advances, you can easily find and connect with qualified candidates in minutes. Over 320,000 new resumes are added to ZipRecruiter every month, which means you can reach more potential hires and fill roles sooner. Use ZipRecruiter and save time hiring. Four out of five employers who post on ZipRecruiter get a quality candidate within the first day. And if you go to ZipRecruiter.com/SRS right now, you can try it for free. Again, that's ZipRecruiter.com/SRS. ZipRecruiter, the smartest way to hire.

01:52:20

It's such a complicated question, right? You know, first of all, as somebody who's worked at a few of these things, I have some biases. When you're a part of an institution, and I've been part of multiple of these institutions, you sort of have an affinity. You want to defend it. That's number one. But the second part is, I've also seen the abuses that happen when social media has real power. I've seen the censorship. I've seen the centralized control that can happen, and I've seen the harm that it can do to kids. For example, when we grew up, I didn't have to worry about, was I a popular kid? Would I get bullied if I said something? I have a lot of friends with adolescent daughters and sons, and they have to worry about that. It is a very complicated topic. The thing which I often think about, and one of the things I saw at Twitter, was how censorship or nudging political agendas can happen, sometimes deliberately, sometimes not deliberately, and how it can have huge impact. So, for example, people think of the Twitter Files and all these examples of censorship happening at Twitter, right?

01:53:47

And I was there when some of this happened. And one of the things I always sort of fought back on was, you know, the influence of politics inside these social media algorithms. Okay? So often people kind of know, I think, the easy examples. They're like, oh, my account got shadow banned, right? Or, I said something, for example, during Covid, you know, I said the virus came from a laboratory and my account got blocked. Which is, by the way, what happened to the NIH director, Jay Bhattacharya. He said that his account got blocked. And think about his story now. He's now running the NIH, and Twitter deleted his account for basically saying, hey, maybe the virus came from a lab, and, maybe, I'm not totally sure about the mask thing, and they deleted his account. But what I also saw was how easily algorithms can be used to shape politics. I'll give you a story. I was there one day, and I wake up and I'm scrolling the Internet and I see this story, and I won't say who, but it is this famous Hollywood actor, and it says that this Hollywood actor has this movie project that was just announced, but the Internet is mad at him for getting this movie project, and he might lose it.

01:55:03

And the reason it caught my eye was all these stories would embed the same two or three tweets inside. And I worked on that. I ran a lot of the algorithms, and I was like, these tweets look a bit weird. There's no followers, there's no likes or retweets. How did these get found? Why are these kind of getting surfaced up? So I had some free time, and I sent some emails and I poked around. And it turned out what had happened is. Do you know what trending on Twitter is? So at the time, the trending algorithm on Twitter had all sorts of issues, all sorts of bugs, right? And what it would do is, sometimes it would try and tell you, hey, this is what people are talking about right now on Twitter, right? It's trending. Sometimes it would just kind of pick a random, arbitrary thing and say it's now trending, right, when it couldn't find anything. And what happened that previous night was it had found a random set of tweets. And because it could not find a legitimate trending topic, this algorithm, by no one's fault or someone's fault, but not any deliberate agenda, right?

01:56:11

It said these tweets by total unknown people about this actor were important. So this happens maybe at 2am, 3am, and then in the morning, what winds up happening is, on Twitter, there was this product, there was a way where they could highlight these tweets and kind of give them big imagery. So one of these people wakes up, and some of these folks, you know, sometimes had a political agenda, but I think they were just like, hey, I just wanna do my job, you know, and this thing is trending. They did not know this came from an error. So at 5am or 6am New York time, they take this and they go say, this is a thing which is happening on the Internet. Okay? A couple of hours later, all these editors of all these, you know, media publications wake up. They're like, oh, there's a thing people are talking about on the Internet. Hey, can you chase this story down and write about it? And by the afternoon, you know, that guy's agent was getting calls and being like, oh, my God, what is happening? Right?

01:57:09

It is. The Internet is talking about this. And the reason I bring up this story is, you know, first of all, it's kind of silly, right? It's like a Hollywood movie. Nobody really cares. But it tells you, right, how easily, almost inadvertently, you could shape the discourse, shape the arc of something, right? And the combination of the algorithms, the metrics, and some of the people inside Twitter meant you could really influence the world. I often call Twitter a memetic battleground. You know what memes and memetics are, right? So Twitter is a place where all these ideas fight, and if you win, you get to spread all over the world a little bit. And so one of the things that people wound up doing was finding ways to sort of inject their idea into this memetic battleground and try and fight everyone else. Now, this was rewarded by the algorithm, because what the algorithm was doing was it was looking at, okay, what is getting the most attention? Likes, retweets, right? And these were often things which were provocative, which made people angry, right? And then those people would get followers, okay?

01:58:32

So there was this kind of system being created where, if you said something provocative and you made people angry, one, you could kind of shape the discourse. Second, the algorithm would be like, oh, this is what people on Twitter want, because it's getting more attention. I'm going to send you more followers. I'm going to bump you up in the algorithm, right? And this had two, I think, catastrophic impacts. One was that the people who were exhibiting some of the worst behavior were getting rewarded, right? The more you get people actually angry, the more you get people outraged, the more attention you are getting. But the second, more subtle thing which was happening was when somebody new signed up on Twitter, right? Imagine you walk into a restaurant for the very first time, okay? You walk into some fancy-schmancy Michelin restaurant, and you're like, okay, everyone's here in a suit or dressed up, right? And you're like, I got to look the part. Or let's say you go to, you know, a sports bar, right? Late night, you're watching a game, it's a bit rowdy. You know the vibe.

01:59:41

The thing which Twitter was doing was it was shaping the vibe to be one where anger and provocation were rewarded, as opposed to kindness, as opposed to education. I'm not saying that doesn't exist. There was a lot of it. Obviously, you and I and others have had great experiences. We're sort of pushing the ball over in one direction, right? But when new people signed up, they were like, oh, I don't know what this place is. Oh, who's doing really well here? Oh, it's that guy who's getting people angry. By the way, this happens on YouTube. This happens on platforms all the time. What do people do? They're like, what are the videos that are doing really well? What are they doing in the thumbnail? What do their titles look like? Let me copy that. So, one, you're kind of giving the people who are exhibiting the worst behavior more wealth, sometimes literal wealth. Second, you were training everybody new at Twitter: those are the icons, those are the people you should follow, right? So there was kind of this whole system which evolved. And so, anyway, my point of this sort of roundabout explanation is, one, it taught me the absolute power of social media.

02:00:47

Second, how these systems, when centralized, can have real power, right, and real central censorship. And how important it is that you have absolute transparency. Number one, we need to know what algorithms exist in these platforms. We need to know that there is no ideology in these platforms. So it really was an awakening moment for me. It really shaped me. Think of me as somebody on the inside going, oh my gosh, we need to fix these things, and this is true of every platform, by the way. I'm not picking on anybody; everybody, every platform. The second thing it really sent me down the path of was decentralization. This idea, which I think a lot of the crypto people really resonate with, is that we as people need to have ownership over how these things work. We need to have a say. We need to know what is happening behind the curtain. And maybe we just don't have one big thing; we need to have a lot of other small things competing. So it made me very distrustful of top-down centralized control. And it made me feel like, okay, I need to find a way to bring more decentralized systems into the world.

02:02:00

Which is kind of why I wound up in crypto for a while. But yeah, so social media, I think, shaped my career. It's given me a lot. But I also saw a lot of sides of it. My Twitter time was deeply formative for me, and when I left, you know, I went into investing, but part of it was, I need to find ways to battle some of the negative things I saw while I was there.

02:02:21

That makes a lot of sense. You know, that makes a lot of sense. I mean, it's just such a powerful tool. I mean, back to the 2020 election time frame, we saw what appeared to me, from an outsider's view: we saw a sitting president get banned on X, get banned on Facebook, get banned on Instagram, get banned on YouTube, banned on all of these networks. And so the way it appeared to, I think, everybody is that these top guys that own these companies are controlling the entire narrative, everything that's going out. And I don't think it just appeared that way, it was that way. Everybody leaned one way, and that was obviously towards the left. And I mean, they even deplatformed, what was that one platform? They just pulled it. AWS.

02:03:20

Parler.

02:03:21

Parler. Yeah, they just pulled it. I'm not sure why, I don't know what was going on down there. But, you know, it was like, holy shit. Now there's an entire party.

02:03:30

Oh, yes.

02:03:30

You know, with zero voice, to include a sitting president. And that really, I think, opened everybody's eyes immediately to how powerful and dangerous this could be.

02:03:45

The deplatforming was huge, I think. In a way, I had sort of seen it building. I had seen what was building up, so I was not terribly surprised when all of that happened. But if you think about that era, right, number one, you had the previous administration putting pressure on the social media platforms to take down posts. They would send these angry emails. This is now all documented; it's come out in depositions and lawsuits and hearings and whatnot. They would send all these angry emails saying, hey, you need to take this tweet down. Like, we don't love this tweet about Hunter Biden, we don't like this tweet about mask mandates, or whatever the case may be. So they would send these down. The second thing that was happening was this was the peak of the culture war, and on these platforms there was this domino effect where, if one platform took down a video or a piece of content, downranked something, shadowbanned somebody, for example, then every other platform immediately felt pressure. Because what would happen is, all of a sudden you would get an email from the former White House, a member of Congress, or a New York Times hit piece, which would be like, hey, those guys have taken down this thing.

02:05:09

You guys are leaving this up, and you are responsible for all these crimes. And so they tried to enforce the Overton window of conversation, and they tried to really shrink what could be said. And I think the turning moment, in my view, and some people may disagree, obviously before this election, would be when Elon bought Twitter. Okay? Because I think that totally changed how free speech on the Internet works, because all of a sudden you had one platform. And by the way, I was there. I was there on the first day. We're kind of jumping ahead a bit, but I had spoken to Elon and he told me he was going to do this. I had left Twitter at that point for a couple of years, and I had joined this venture capital firm. But Elon said, look, I'm going to do this thing. Can you come help me out? And for me, it was a little bit like getting a do-over, because I could go back. And this was not in an official capacity, I was not an employee, but maybe I could go back.

02:06:20

And I didn't have the power to do some things back then. I don't have $45 billion to spend on a social media company, by the way. That was a problem I had. But now I had a chance to put some things right. Okay, so do you remember when Elon bought Twitter? The "let that sink in" day, when he brought the sink in? So I showed up that same day, kind of snuck in, making sure the TV crews didn't see me. And that was such an experience, just watching Elon work and trying to clean up the company, finding all the ways these progressive levers had been hidden. And for me, it was a chance to put right some of those things. And I would say that moment changed how free speech on the Internet works. Right? Because all of a sudden you had a major platform, Twitter, now X, which was trying to live by free speech, which brought back the president, which brought back a lot of people who had been taken down for just asking simple questions like, hey, what is the origin of the COVID virus?

02:07:29

Are masks effective? Or, how can you say that we can't sit together in a restaurant, but a protest is okay? How does that work? People would just ask a question and get their account banned. And now here was a platform that was bringing them back. And look, people have a lot of complicated views on Elon or the company, but I do think that moment was huge in changing how free speech on the Internet works. And I think even today it is not appreciated how important that moment was.

02:08:03

I mean, I think it revolutionized everything immediately.

02:08:07

And I don't know, were you on Twitter? Were you active then? You must have been.

02:08:11

Yes, but I probably wasn't very active. That's maybe the platform I used to be least active on. I'm a lot more active on it now. But I really pull away from it. I go on there mostly to see how our content's doing, because I do think there is a lot of toxicity. I know there's a lot of toxicity that comes out of all these platforms, so I try not to fall into that. But I mean, I think, well, I don't think, I know it was instrumental, what Elon did. Because, once again, from an outsider looking in, what I saw was he would have capitalized on the entire market and taken the entire market share by providing an actual free speech platform where you don't have to worry about asking questions, you don't have to worry about calling out corruption, and you're not going to be censored. And I think that all these other companies, Meta, Google, maybe Amazon, I'm not sure, but I think they saw that happening and they thought, oh shit, if we don't loosen the reins up a little bit here, we're going out of fucking business.

02:09:29

And I think they would have gone out of business. And with some of them, I'm actually surprised that they didn't, just because of the repercussions of what they did to half the country.

02:09:40

Oh, yes.

02:09:41

You know, but I think that they saw that gaping hole in their business when Elon secured Twitter and turned it into X, and everybody else had to follow suit. What do you think? Am I wrong on that?

02:09:54

I think you're very right. I think there's another dynamic, which is courage: he just took a lot of the arrows back then. And when he did that, some of these companies, you know, they just want to stay out of trouble. They would love it if there was nothing political ever said on their platform. They really don't want to be in the business of navigating where the COVID virus came from. They just want: look, come here, have a great time, have great content, we want to sell some ads against it, let's go home. Okay? But that's not the choice, obviously, when you're running a major platform; it's almost like running a country. So they got sucked into it. And I think, you know, there's an amazing book you should read, from the '90s, a bit hard to get on Amazon. It's called Private Truths, Public Lies.

02:10:49

Sounds interesting.

02:10:51

Yes. And I think it's out of print or something, but it's a very simple idea. The idea is that sometimes in society everybody starts to pretend to believe a lie just because it is the convenient, expected thing to do. During COVID, right, you would somehow have to wear a mask into a restaurant and be asked to do so, but you could take off the mask once you sat at the table. You're like, okay, hold on a second, how do these two things logically make sense? There's something wrong here. Right? But you couldn't really speak out against it. If you did, there were consequences. Okay, so what this book says, the second part of this book, and it's a great book, I'm oversimplifying, is that all it takes is one person, one entity, to point out that the emperor has no clothes. Okay? All it takes is one. But when you have that one person do so, usually at sufficient risk to themselves, everyone else starts to fall in line.

02:11:57

Makes it okay.

02:11:58

Makes it okay. You had two guests on your show who've done this, right? Brian Armstrong, with his mission-oriented company statement. Did you talk about that when you had him? We did. Right, right. So that's such a great story. So, Brian, please go watch that episode, because he talks about it. But he basically gets bullied into having to put out a statement, which he doesn't want to do, and he doesn't like the fact that he's being bullied, right? And he says, listen, we are not a political, social company. We are in the business of making crypto available safely, securely, to everybody. That's the job. That's the job we are in, right? So he puts out this blog post, which was hugely controversial, because he was accused of being racist, he was attacked in the press. And he says, okay, we are a mission-oriented company, which, it sounds crazy that that's controversial. But he says, if you work here, you are signing up as part of this mission, which is to make crypto available safely, securely, at scale. If you don't like this mission, don't work here. No harm, no foul, okay?

02:13:10

Go find a job somewhere else. If you care about something more than this, that's also fine. We just ask that you do not bring that into the workplace, because in the workplace we care about this mission. It's a big mission. We have competition, the stakes are high, the rewards are high. It's going to take a lot of energy, right? And as long as you focus on this, we don't care about anything else. But that was so controversial, because at the time, you know, I was watching inside these tech companies the rise of DEI. I remember my wife at the time had a little bit of a stint at Meta, and you would have these employees who would just totally hijack meetings, and they would ask people to issue apologies. They would say, we have to comment on literally everything happening in the world. And a lot of Silicon Valley executives were just scared. Maybe they should not have been scared.

02:14:12

Scared of their workforce?

02:14:14

Yes, they were scared of their own employees. They were absolutely scared of their own employees. Right? They were terrified of, man, I don't want to seem, you know, sexist, racist, whatever it is. I don't want to seem that way, right? And they would get attacked all the time. I spoke to some really famous people who were like, I don't want to hire this exec, but I'm being forced to, because if I don't, I'm going to be accused of discrimination in some shape or form, and then it's going to piss a lot of other people off. It might piss some of my investors off, or the media off. I don't know what to do. And there were so many people like that. And look, if you're harsh, we can say they didn't have enough courage. But also, look, they had a lot of employees, they were trying to make sure the business doesn't get into trouble, they had customers, and they're like, we don't want to be in the political game, but I'm scared. And this was a thing all over Silicon Valley. It was, I would say, a rising infection which had spread by 2020.

02:15:17

A virus?

02:15:18

Yes. I remember, jumping back a bit, the original moment when I felt the rise of DEI in Silicon Valley. I think this was in 2013, 2014. There's a company called GitHub. They're a popular developer company; they build source code products for developers. And at the time they actually had a funny replica of the Oval Office. Okay, that's kind of a thing. And instead of the presidential seal with the bald eagle, there was a carpet with their logo which said, "In meritocracy we trust." Okay. And you would think that's safe, meritocracy: if you work hard, if you're the best at what you do, you win. That was hugely controversial at the time. And I remember a lot of us going, that's weird, right? Can you imagine an NFL draft combine where, if you say, hey, you run a 4.4, you know you're pretty good at maybe being a wide receiver, and they're like, well, you can't talk about that? But that became a real thing. And then I saw, over the years, quotas. Me and a lot of other people, we would get into hiring situations where you'd be told, hey, if your entire team is male, white, pick your thing, you're being racist or sexist.

02:16:49

And you're like, well, look, I want to hire amazing people. I don't care where they are, I don't care what they look like. I just want to hire the best people for the job. But there's a lot of pressure, I know, on Silicon Valley execs to be like, okay, my board has to look a certain way, my exec team has to look a certain way, otherwise the New York Times is going to come after you and you might lose your job. And some people did lose their jobs, by the way. So this was a bad space to be in. And I think a lot of people outside Silicon Valley don't recognize how bad it got. Like, I was at this venture capital firm, and we used to get calls from all these founders being like, I'm just so terrified of my own employees. I don't want to deal with this. I just want to build a company. I want to build a product, serve my customers, help my employees, hopefully we all make some money. I am not interested in weighing in on the latest social issue. But they were forced to. And so the reason I bring this up: when Brian Armstrong did that, he was taking a huge risk.

02:17:51

And I'm not sure he took enough credit for it when he did your podcast, but he was taking a huge risk, right? He could have been fired. They were a public company; he could have been ousted. You had all these ESG firms, all these firms, a lot of them have kind of stopped doing it now, in some ways thanks to who won the recent election, but the idea was, hey, unless you check off all these boxes on the environment, on diversity, we won't want to invest in you, and if we are on your board or on your cap table, we are going to put pressure on you to do these things, right? And so he took a lot of risk, and I think it was not easy for him. But when he did that, you know, one, I have a lot of respect for him, because he could have just played it safe. He could have been like, yeah, I'm just going to say the things which everybody wants me to say, and I'll be on the cover of Time and I'll be happy, right? But he took on the crowd.

02:18:50

He took on all these folks who were trying to bully him. So, number one, he deserves some respect for that. Second, when he did that, it sort of opened the Overton window. People were like, oh, wait, I can do that. I can now start saying, wait, maybe we just need the best hire; I don't want to just fill a quota. And Alex Wang, who you had on, did this thing where he said, instead of DEI, I want MEI: meritocracy, excellence, and I forget what the I stands for. But it's the idea that you focus on the things you actually want in a workforce. You want people who work hard, you want the best of the best, and you don't care what they look like or where they're from or what they do in their private lives. So Alex Wang took some heat for that when he did it. But that set the tone. So I have a lot of respect for some of these people I work with, some of whom we spoke about, folks like Balaji Srinivasan, who I think are similar, who took these arrows when they didn't need to and then opened the Overton window for a lot of people to follow them.

02:19:57

I mean, I think Silicon Valley listened. Well, maybe they didn't listen, but it seems like, you know, I've talked to a lot of innovators in the tech space recently this year, and a lot of these guys moved down to El Segundo, it seemed like, for a different culture. And so now we see, and I'm not terribly familiar with it, I've never been to Silicon Valley or El Segundo, but it seems like El Segundo is rising up as, like, a second Silicon Valley.

02:20:28

Yes, I think so, for a certain class of companies. A lot of that whole community has ties to the defense establishment, to making hardware, which is why you're seeing a lot of these American dynamism companies come out of there. They can pull talent from elsewhere. And I think other places have started doing really well too. Austin, Texas has started doing really, really well. Very true, very true. I always have a soft spot for Raleigh, North Carolina. I think they have a great ecosystem there, which is awesome. Sometimes you have these ecosystems outside Silicon Valley, which is great, because you get these people who don't have a huge ego, they want to work hard, and they're not going to switch companies every year and go join somebody else. But I would say Silicon Valley is still really important, just because if you think about a lot of these AI companies, for example, a lot of them are in Silicon Valley. And so, just a quick plug: one of the things we've done in this administration to combat a lot of these things about wokeness, about politics in platforms, is we released this executive order.

02:21:48

President Trump signed this executive order three weeks ago, which is called Stopping Woke AI, and I played a big role in it along with David Sacks. The whole idea behind the executive order is that it's not that we want you to pick a left versus right ideology; we just want no ideology in AI, right? We don't want employees to put their thumb on the scale and embed their beliefs into such a crucial piece of technology. And this has sometimes happened accidentally. For example, last year there was this infamous incident where, if you asked a leading large language model, show me a photo of George Washington, it showed you a black George Washington, right? And it was sort of an accident; I think it was not super deliberate. But there are other cases which are way more insidious. And given that, and given the fact that a lot of these AI companies are in Silicon Valley, which often just has a very left-leaning ideology, I think it is super important to figure out, okay, how do we make sure we don't have a repeat of what I was talking to you about, what I saw at Twitter?

02:23:01

How do we make sure of that? And by the way, with AI, it's going to be so much more important than social platforms. It's going to be so much more important, and it's going to be so much harder to find things. So what the executive order, and a lot of other things we have done, says is: number one, no ideology. We want no thumbs on the scale. Number two, we want transparency. We want to know, you know, if you give a certain answer, even, for example, a political comment, that's fine. Just tell us where you got it from. Tell us what your sources are. Let the audience, the viewer, the person interacting with you make up their own minds. Don't hide it, right? Because if you go back to my Twitter story, that story about the Hollywood movie: if somebody could tell how the algorithm was working, they'd be like, oh wait, something's weird, let me go investigate. So I think sunlight is the best disinfectant. I am a transparency maximalist when it comes to technology. So with this Woke AI EO, and with other things we have done, we are saying: all right, these things are so important for our economy, for the world, and we want to make sure there is, number one,

02:24:10

no politics, and also that we know what's happening behind the scenes.

02:24:15

How does an ideology get inserted into an AI platform? Is that from the engineers? I mean, it's processing so much data. I'm probably way off, but the way I understand it, the way I took it from Alex Wang, is that you have these enormous data centers, and that's what the AI model pulls from and processes: it gathers the data, processes it, and presents it to you. And so I would imagine, I don't know what all goes into that, but I would imagine it would be fairly easy for a rogue engineer to insert his own ideology into that and it go unnoticed by the rest of the company. Is that how that happens, or...

02:25:02

There are many, many ways ideology can be inserted. The word ideology is so broad and vague, and I'm an engineer, so let's make it very specific. Okay, let me actually give you maybe a non-AI story first, and then we'll come to AI. So when I was at Twitter, for a while there was this phenomenon where Democratic congressmen, or Democratic political figures, would sometimes get ranked higher in the algorithm. They'd get shown more, the algorithm bumps you up, just like on YouTube, than figures on the right. And the reason that happened was, when you train the Twitter algorithm, you give it examples of good and you give it examples of bad, and you tell the algorithm, hey, go find more things like this. And the people who were sometimes giving these lists, and I'll just give them the benefit of the doubt, right, I don't think they were trying to put a thumb on the scale, but they had certain beliefs. They stuck in the news organizations they followed, they stuck in the political figures they liked. They maybe didn't know, or didn't approve of, people on the other side.
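Here is a small hedged sketch of the dynamic just described: a ranker "trained" only from a curator's good/bad example lists. The outlets and labels are entirely invented; the point is that the learned notion of "good" simply mirrors whoever compiled the examples, with no explicit rule favoring any side.

```python
# Toy sketch (not the real Twitter pipeline) of how curated "good"/"bad"
# example lists can teach a ranker that "good" == "sources the curator likes".

from collections import defaultdict

# Hypothetical training labels supplied by a curator.
labeled_examples = [
    ("outlet_a", "good"), ("outlet_a", "good"), ("outlet_b", "good"),
    ("outlet_c", "bad"),  ("outlet_c", "bad"),  ("outlet_d", "bad"),
]

# "Training": learn a per-source quality weight from the labels alone.
counts = defaultdict(lambda: {"good": 0, "bad": 0})
for source, label in labeled_examples:
    counts[source][label] += 1

def learned_weight(source: str) -> float:
    c = counts[source]
    total = c["good"] + c["bad"]
    # Unseen sources get a neutral 0.5; labeled ones inherit the curator's taste.
    return 0.5 if total == 0 else c["good"] / total

# "Serving": the ranker now boosts whatever resembles the curator's picks,
# even though nobody wrote an explicit rule favoring one side.
new_posts = [("outlet_a", "story 1"), ("outlet_c", "story 2"), ("outlet_e", "story 3")]
for source, story in sorted(new_posts, key=lambda p: learned_weight(p[0]), reverse=True):
    print(f"{learned_weight(source):.2f}  {source}: {story}")
```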

02:26:10

So the algorithm kind of learned "good" to mean a certain class of views, a certain class of publication, okay? And so those guys started getting more traffic, more attention, and it kind of spirals. So I bring up that story because very easily, and sometimes without any mal intent, sometimes there is mal intent, but even without mal intent, you can slant these systems, okay? Now with AI, right, very simply, just to level set a bit: the way a modern AI model works is there are kind of two steps. One is a process called training. And the way I would think about it is, you take all of human knowledge, all of the Internet, and imagine this sort of massive witch's cauldron, but imagine the cauldron spanned multiple football fields. You dump it all in: every book ever written, every movie ever made, all of Wikipedia, all of YouTube, every bit of human knowledge ever gathered, you stick it in there, right? And then you have this algorithm, which I won't really go into, that tries to make sense of it, right?

02:27:29

So for example, it says, the word "Sean," what should the word "Sean" be followed by? Okay, let's say the word "Sean" is followed by "Ryan." Then you're like, okay, Ryan. And then what comes after that? Well, the Shawn Ryan Show, operator, military, intelligence, CIA. Makes sense. Okay, let's say after "Sean" I say "Michaels." Then you get titles, wrestling, heel, face, Stone Cold. So it is training on and trying to make sense of all of this knowledge. Okay? So that's called training. Then there's a separate step where you take this large blob, and there's a step called post-training, where you are basically trying to make the model better in specific ways. For example, you're trying to make sure it gets better at, say, coding, or you're trying to make sure it gets better at science, in these very targeted ways. You might have heard of this phrase, fine-tuning. You're trying to make it better in these very targeted ways.

02:28:44

And at the end of this, you kind of get this fully cooked model, and then you have inference, which is: you go to ChatGPT, or you go to Grok, or you go to Gemini on Google, or you go to Claude, you type in a question, and then it starts giving you, letter by letter, word by word, that is inference. You're basically getting an answer back. So the point of explaining this whole thing is to ground a lot of the other conversations we're going to have. But also, every step of this could be infected by ideology, right? Number one, let's go to the step with this big cauldron. What if you only stuck left-leaning content in there? There's data, gosh, I wish I could remember it off the top of my head, that a lot of the written Internet is left-leaning in nature. And so if you just dump the Internet in there, you could get a leftward bias just right there. So just by the sources of data, your model could wind up learning a certain ideology.
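As a rough illustration of the "what word comes next" idea, and of how the contents of the cauldron shape what comes out at inference time, here is a toy bigram counter in Python. A real large language model is vastly more sophisticated; the tiny corpus is invented, and the only point is that skewing the data skews the completions.

```python
# Minimal sketch of "predict the next word" and of how the training data
# shapes the answers at inference time. This is a toy bigram counter,
# nowhere near a real transformer, but the data-in/answers-out point holds.

from collections import defaultdict, Counter

corpus = (
    "sean ryan show . sean ryan show . sean ryan operator . sean michaels wrestling ."
).split()

# "Training": count which word tends to follow which.
next_word = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    next_word[current][nxt] += 1

def complete(prompt_word: str) -> str:
    # "Inference": greedily pick the most likely continuation.
    candidates = next_word[prompt_word]
    return candidates.most_common(1)[0][0] if candidates else "<unknown>"

print(complete("sean"))  # -> "ryan", because the corpus leans that way
print(complete("ryan"))  # -> "show"
# Dump in more of one kind of text and the completions drift the same way.
```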

02:29:50

Second is when you fine-tune these things. For example, DeepSeek, this Chinese model which we'll probably wind up talking about: there's a lot of evidence that it was fine-tuned to add Chinese ideology, to say nothing happened in Tiananmen Square.

02:30:14

Or.

02:30:14

What China thinks of Taiwan. That happened in that part of the process. Another part of the process where you could add things is when the model is trying to react and give you a token, and it is thinking. A lot of these models have instructions on how they should think, rules they should follow. You could set up a rule which says, pick the answer which optimizes for, if it were me, I would say optimize for truth. Or you could say, pick the answer which optimizes for equity, in the DEI sense, and it's going to slightly slant the model some way. Or, pick the answer which is least offensive, rather than pick the answer which is intellectually honest. These are incredibly oversimplified examples, but you can see it, right? And so any one of these steps could have either deliberate or accidental ideology inserted. Now, in the Google case, and I could be wrong because I only read some press reports later, I don't know what actually happened, I think what had happened is somebody by accident had added this idea that whenever the model is generating photos of human beings, it should try to generate people of every race. But then you're like, wait a minute, if you get a photo of a Nazi, they're probably not Asian.

02:31:37

That's not what the average Nazi in Germany looked like in the 1940s. Or if you have a photo of a Viking warrior, they probably didn't look like me. That's not what Vikings looked like when they shipped out from Norway. But this little rule was like, oh, whenever we're asked for a human being, I'm just going to spread the human race out there. So what happens? You get the black Pope, the black George Washington, right? I hope I'm not making this too complicated, but the idea is that these things can be very subtle, they can be hard to find, they can be deliberate or inadvertent. And for me, in this role, one of the things we really care about is making sure that there is searching for truth, and that there's no ideology of any kind. And if you look at the executive order, it doesn't say you need to optimize for the right. It says you need to be truth-seeking, optimize for the truth, and if you don't know the truth, express skepticism and tell us what you're reading. And my hope is that, regardless of whether you agree with me and you on our political beliefs, you'll probably agree that truth-seeking is a good thing.
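To illustrate the "rules the model follows" point in the simplest possible terms, here is a toy selection step in Python. The candidate answers and their scores are invented, and real systems do not work on two hand-written numbers; it only shows how swapping the hidden objective, truth versus least offense, changes which answer comes out, with no change to the underlying model.

```python
# Oversimplified sketch of the point about hidden rules: the same model
# outputs, filtered through different objectives, give different answers.
# Scores and rules here are made up purely for illustration.

candidates = [
    {"answer": "Blunt but well-sourced answer", "truth": 0.9, "offense_risk": 0.7},
    {"answer": "Vague, inoffensive non-answer", "truth": 0.4, "offense_risk": 0.1},
]

def pick_for_truth(cands):
    # Rule: "optimize for truth" -- take the best-supported answer.
    return max(cands, key=lambda c: c["truth"])

def pick_least_offensive(cands):
    # Rule: "optimize for least offense" -- a subtle thumb on the scale.
    return min(cands, key=lambda c: c["offense_risk"])

print(pick_for_truth(candidates)["answer"])        # -> blunt but well-sourced
print(pick_least_offensive(candidates)["answer"])  # -> vague non-answer
```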

02:32:51

So that's the hope.

02:32:56

Wow. Yeah, it's a great explanation. Best I've heard. Thank you, thank you, thank you. According to new reports, central banks around the world may be buying twice as much gold as official numbers suggest.

02:33:10

Best.

02:33:11

You heard that right. Twice as much. And get this, some are bypassing the traditional markets and buying gold directly from miners in Africa, Asia and Latin America. That means no US dollars, just straight physical gold. The shift isn't just symbolic, it could be strategic. They could be looking to bypass western financial systems. So if central banks are scrambling to reduce their dollar exposure and hold more physical gold, should you do the same? That's where the award-winning precious metals company Goldco comes in. Right now you can get a free 2025 gold and silver kit and learn more about how gold and silver can help you protect your savings. And all you have to do is visit SeanLikesGold.com. Plus, if you qualify, you could get up to 10% back in bonus silver just for getting started. Go to SeanLikesGold.com. That's SeanLikesGold.com. Performance may vary. You should always consult with your financial and tax professional. Time is a funny thing. It's so limited, and yet we let it pass by us. We're all guilty of putting important things off until later. If you've been putting off life insurance, check out Fabric by Gerber Life.

02:34:36

They make it fast and easy so you can check it off your list right now. Fabric by Gerber Life is term life insurance you can get done today, made for busy parents like you, all online and on your schedule, right from your couch. You could be covered in under 10 minutes with no health exam required. Fabric has partnered with Gerber Life, trusted by millions of families just like yours for over 50 years. There's no risk: there's a 30-day money-back guarantee and you can cancel at any time. And they have over 1,900 five-star reviews on Trustpilot with a rating of Excellent. Join the thousands of parents who trust Fabric to help protect their family. Apply today in just minutes at meetfabric.com/Sean. That's meetfabric.com/Sean, M-E-E-T fabric.com. Policies issued by Western & Southern Life Assurance Company, not available in certain states. Prices subject to underwriting and health questions.

02:35:46

All right.

02:35:46

Sriram, we're back from the break. We were talking about Twitter, how you got in there when Elon took over and basically eliminated the ideology of wokeness at Twitter. But something that we didn't talk about yet is A16Z, the Andreessen Horowitz VC firm. And so I'm curious, we've had some offline discussions about investments and things like that, but I'm curious, how did you get picked up for that? How did you find your way in there?

02:36:16

So A16Z stands for Andreessen Horowitz: A, followed by 16 letters, and Z. It's kind of a tech thing. And they're one of the leading venture capital firms in the world, I would say the leading venture capital firm. I think they're the largest in terms of money they have. And the founders are these two iconic figures in Silicon Valley: Marc Andreessen, who we spoke about, who invented the web browser, and Ben Horowitz, who was with him at Netscape and is a best-selling author, among many, many other amazing things. And in a lot of ways, in my view, they revolutionized the way venture capital worked in Silicon Valley. Just a little bit of history. When people think venture capital, they sometimes think Shark Tank: you come in, you pitch an idea, some rich person in a suit gives you some money. There's some element of that, but it's actually a lot more complicated. The history of venture capital actually goes back to whaling and fishing. Back a couple of hundred years ago, you would get these sailors or whalers who'd be like, hey, we're going to go out into these stormy seas and take a lot of risk, be out for a few weeks, maybe catch some whales and come back.

02:37:43

And it's an incredibly risky venture, right? Half the time they don't come back. Let's just say the hazard rate was pretty high. And so they would go to people with wealth and be like, why don't you stake me on this journey, and I will give you a percentage of the whale I carry back. Which, by the way, is where the word "carry" in venture capital comes from. It was literally the whale, the thing that you carried back; you would get a percentage of that carry. So even 200 years later, that word has stayed on in the risk capital world. And so the history of venture capital is super fascinating, and I think it's an essential part of American innovation; Silicon Valley couldn't have existed without it. So in the 70s or 80s, what would happen is, before venture capital, if you were an entrepreneur and you had an idea, how did you go get money? You'd go to a bank and get a loan, or maybe you had friends and family who'd give you some money. But it was very, very hard to go to a bank and say, I have a really risky enterprise in technology, which you may not understand, but I need a few million dollars.
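For the "carry" idea mentioned a moment ago, here is quick illustrative arithmetic in Python. The 20% rate and the dollar amounts are common textbook-style assumptions, not figures from the conversation; the only point is that carry is a cut of the profits on what comes back.

```python
# Quick arithmetic sketch of "carry": whoever holds the carry gets a cut of
# the *profits* on what comes back, not of the whole pot. 20% is just a
# common illustrative rate, not a figure from this conversation.

def carry_split(invested: float, returned: float, carry_rate: float = 0.20):
    profit = max(returned - invested, 0.0)   # carry is only paid on gains
    carry = carry_rate * profit              # the "percentage of the whale"
    to_backers = returned - carry            # capital back plus the rest of the profit
    return carry, to_backers

carry, to_backers = carry_split(invested=10_000_000, returned=50_000_000)
print(f"Carry: ${carry:,.0f}  Backers receive: ${to_backers:,.0f}")
# -> Carry: $8,000,000  Backers receive: $42,000,000
```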

02:39:08

And by the way, you may never see that money again. And you may not have friends and family who can do that. So, I'd say in the 70s, there was a set of folks who wound up forming Sequoia Capital and Kleiner Perkins and kind of developed the modern venture capital industry. These were folks who had made some money from Silicon Valley, from the origin of silicon in Silicon Valley; they had been part of these big chip companies, and that's where the name Silicon Valley comes from. They made some money, and they basically said, okay, we are going to stake these entrepreneurs, fund them on their journey. Not out of the goodness of their hearts, but because we know that when these companies do really well, we stand to make a huge return on our investment. But the thing which I think is very different from the rest of the world is they were willing to lose their money. They didn't love it, but they were willing to take bets on these risky enterprises. So if you look at a lot of these amazing companies in Silicon Valley, Apple, for example, or Google or Netscape, they all had one of these venture capital firms essentially taking a bet on a totally unknown person and saying, we're going to give you some money.

02:40:20

A famous example is Google. Google comes along, I think, in 1998, right? And again, I'm dating myself here, but Google was not the first, the second, the third, maybe not even the fifth or sixth search engine. Do you remember using, like, Ask Jeeves? Yeah, yeah, right. Or Yahoo.com; Yahoo is another famous one. And back then you would use these search engines, and people were like, well, look, search is really not a business, right? We don't know how to make money off it. And then these two guys, Larry Page and Sergey Brin, come out with a new algorithm. They were these Stanford computer science guys. They came up with this new algorithm called PageRank. And PageRank is super interesting: it basically says, I'm going to think you are important if a lot of other people who are trustworthy also think you're important. And that, in one oversimplified sentence, is kind of how the original Google worked. But they were like, we don't have any money. They actually went to a lot of other companies and said, do you want to buy our algorithm for a million dollars?
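The PageRank intuition described above, that a page is important if important pages point to it, can be sketched in a few lines. This is a toy power iteration over an invented four-page link graph, nothing like the production Google system, but it shows the self-referential scoring idea.

```python
# Tiny PageRank-style sketch: a page matters if pages that themselves matter
# link to it. Toy graph and plain power iteration, purely for illustration.

links = {
    "A": ["B", "C"],   # page A links to B and C
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],        # D also links to C, so C gains even more standing
}

def pagerank(graph, damping=0.85, iterations=50):
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in graph.items():
            if not outlinks:  # dangling page: spread its rank everywhere
                share = damping * rank[page] / len(pages)
                for p in pages:
                    new_rank[p] += share
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# C ends up on top: it is pointed to by the most (and the best-connected) pages.
for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```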

02:41:22

So they went to all these existing search engine companies. Imagine, by the way, you could have bought the original Google algorithm for less than a million bucks. And everyone just said no. So they didn't know what to do. And so they came to two essential parts of the Silicon Valley ecosystem. First is angel investors, and the second is venture capital. So they went to one of these wealthy Silicon Valley guys, a guy named Andy Bechtolsheim, and they said, listen, we have this idea, we need some money. And Andy, I think, gave them $100,000. They didn't even have a bank account, so they didn't know where to put the money. So they kind of kept it and started building Google. So think of this construct: two guys, crazy idea, nobody thinks it's going to work. But the ecosystem and the culture of the Valley was like, okay, if we think that you have a shot at this and you've done some work, there are probably enough people with capital who are going to take a bet on you, and probably going to assume that 90% of the time this money is not coming back, but every once in a while...

02:42:21

...you're going to create an iconic company. So Google then, of course, gets funded by another iconic firm called Kleiner Perkins, and then they go IPO, and they obviously become the huge giant they are. So all of Silicon Valley has these stories of amazing companies, and these venture capital firms were at the heart of it. So this story is very interesting, because Andreessen Horowitz, in my mind, totally revolutionized this ecosystem. Back in the day, venture capital firms were sleepy. They were behind the scenes. They would never say anything in public. They didn't want any attention or controversy. The idea was: you come in, you pitch us, go on your way. The other thing was, what would happen is you would go pitch a person at a venture capital firm, which usually had like five to eight partners, and that person would be assigned to you. And if you needed some help, like, hey man, my company's failing, I need some advice, you had to go to that one person for advice. If you're like, hey, I need a contact at the Pentagon, do you know somebody at the DoD?

02:43:28

That person had better have a Rolodex, right? But you were essentially tied to that one person. So anyway, Marc Andreessen and Ben had their own experiences with bad venture capital firms, and there are a lot of bad ones, right? They do a lot of bad things to founders. They throw CEOs out. And they also were like, listen, why is it that when we go in, we have to rely on this one person when we need all this other help? And they met Mike Ovitz. You know who Mike Ovitz is?

02:43:54

I don't.

02:43:55

So Mike Ovitz is the founder of Creative Artists Agency, CAA. I would say, along with WME, they're the two iconic Hollywood talent agencies. He's a guy who's represented, I think he's retired now, but he's represented every Hollywood celebrity, best-selling author, founder. And CAA had this amazing strategy to win the market. What they did was, number one, they said, we're going to be very loud. We're going to make sure everybody knows our name. We're going to be brash. We're going to use these red binders which are attention-grabbing. We're going to get people to pay attention to us. The second thing, and this is very interesting, they said, if you sign up with us as an A-list actor, you're not just getting me, Mike Ovitz, you're getting the entire firm's Rolodex. And this firm is going to maintain Rolodexes just for you. We are going to maintain a list of actors, a list of producers, directors. And so if you come in and you have an idea, we don't need to go find the right editor, you don't need to hope that I have a friend who can direct your movie.

02:45:03

You don't need to hope that I have a friend, you know, who can direct you movie. We have entire teams whose whole job is to maintain a roster for you at all times. And they would go and win all these clients because you would go, what is your image of a Hollywood agent? A guy in a sharp suit, right? Like, you know, on a phone, quick talking, maybe as a roller decks. CA industrializes, they had some of that for sure, but they industrialized it because they're like, you come in, we can plug you into this machine which is going to make great movies for you, make you fit, right? So Michael was Marc Andreessen's mentor and friend. So Mark And Ben heard this and they were like, why don't we do this for venture capital firms? Because let us say a typical entrepreneur, you know, we spoke about some, you're in your 20s, 30s, you often don't know what it means to build a business or you're good at one thing. Let us say you're good at building technology, you're great at computers like I used to be, right? Or maybe you just know your product area very well.

02:46:02

You're great at making ice cream, you're great at building rockets. But are you the best CFO? Are you the best marketer? Do you have a Rolodex of government officials if something goes wrong? You need all these other things, which these young CEOs didn't know. Or let's say you run into trouble, you're running out of money. How do you restructure a deal? How do you close a big customer? How are you supposed to know these things? So what Andreessen Horowitz said was, we are going to copy and replicate the CAA model, but for technology venture capital, okay? We are going to have a roster. We are going to have a roster of every single amazing CFO in Silicon Valley, every single amazing marketer. We are going to maintain a roster of potential board members. So when you come in, a 25-year-old with an idea, who has something working, we are going to plug you into the system. You want a CFO? We have everybody on speed dial. We have done them favors, so they will pick up our call. We can get you in. By the way, they can also help you hire a CFO.

02:47:04

So often young founders don't know how to hire executive talent. How do you hire a 50-year-old head of sales if you have never met a great sales leader before? How would you even know how to do that? We have a team who has seen every great sales leader and can help you do that. So that was a different product from classic venture capital. Number one, instead of one person's Rolodex, we're going to give you the whole system. The second thing: they were loud, they were brash, they had press articles. Marc Andreessen wrote this famous blog post called "Software Is Eating the World," which basically said that every business on planet Earth is going to have software underpinning it, and in some ways I think he's been proven right. So there's a lot of backstory, and these are some of the tactics A16Z used to become one of the most famous, powerful firms in Silicon Valley. So how do I come in there? Right. So Marc has this great strategy which he calls harpooning. In the venture capital tech business, if you sleep, if you are not paying attention to what the next generation is building, time will pass you by.

02:48:16

You have to stay current. You have to stay on the edge of what people are doing. And Marc, and there are many others who are very good at this, like Peter Thiel, Joe Lonsdale, who I think you know, who's been here, amazing people. Marc was extremely good at it. And he had this tactic he called harpooning. What he would do is, if he saw anybody online who had written or done something interesting, he would send them an email. And getting an email from Marc Andreessen, he's a very notable technology figure, so you're just sitting there like, wow, what is that? And the reason you do that is, I want to get to know this person before they become famous, before they build the next big thing. It's a little bit like, you see a 15-year-old who has amazing skills, and, I don't know what the legality of this is as a coach, but you're like, I'm going to make sure I'm building a relationship with this person, because then someday, if they want to pick a school or a team, I have a relationship with this person.

02:49:12

So Marc was very, very good at it. I'm pretty sure even to this day he sends out these emails out of the blue. And he harpooned me. I had written a blog post, I would say in 2012 or 2013, and I get this cold email one day from Marc Andreessen saying, hey, I like this blog post. And I was like, whoa. This was a very different time in my life, so it was quite shocking, right?

02:49:35

Another blog post, huh?

02:49:38

Yes.

02:49:39

The blog post that got you to Microsoft?

02:49:40

No, I had left Microsoft. And this was about.

02:49:44

That's what they found you with.

02:49:45

Yes.

02:49:46

A blog post.

02:49:47

There's a whole pattern here, by the way, if I can take a slight tangent, which is: I think one of the superpowers people can have, with very little effort, is putting content out online, doing what you do, right? I used to write, I used to do video. Because when you put out content online, and if you're passionate and hopefully know something about it that somebody can learn from, some of the best people in the world are paying attention, right? And I've noticed every world leader in technology is always scouring for new ideas, new people. And so there's been a repeating pattern in my career where I've written something and somebody, at the right moment, saw it and was like, hey, this guy is doing something I'm interested in, let's reach out and make something happen. So one of the things I always tell people to do, especially young people, is write things, put things online. These days I would say get a YouTube channel. Talk about what you're passionate about, and somebody will find you.

02:50:44

Like, I know Elon, for example, has found amazing hires because he went down a YouTube rabbit hole and was like, let's get this guy, he looks really smart. You and I talk about how you follow someone on Instagram and the next thing you know, you're like, hey, come on my show. And so I think creating, writing, doing what you do, is such a great differentiator. Anyway, so I wrote something and Marc liked it. He sent me an email. We met up, we built a relationship over many, many years. That's one. The second thing was I started putting my own money, a little bit of my own money, into various companies. I put some into SpaceX, some into Alex Wang's company, Scale AI, and into a bunch of other companies, which started doing really well. And Silicon Valley is an ecosystem that runs on reputation. If you are an investor who puts money into a founder and you are a genius jerk, you never show up, you never take the next phone call, you're not going to do really well, because your reputation will spread. That founder is going to tell his or her roommate, they're going to tell the next person, and you will not fare very well.

02:51:47

On the other hand, if you take the phone call, if you wind up helping that person when they need you, if you're just not a jerk, and you respond to every single thing in a timely way, karma starts accumulating in your favor. And I just built up a portfolio of a bunch of these investments, which started to do pretty well. So I made a little bit of a name for myself, is what I would say, and I made a little bit of money. Then Covid happened. I was sitting at home collecting pasta and toilet paper like everyone else was. Do you remember that? Do you remember that whole era?

02:52:23

Oh, yeah, I still have stacks of toilet paper.

02:52:24

Oh, my God. What was your sort of "I can't believe we lived through that" memory of COVID?

02:52:32

What's that?

02:52:33

What was your "I can't believe we lived through that" memory of COVID?

02:52:39

"I can't..." what is the...

02:52:41

The "I can't believe we did that" moment. I can't believe we, as a human race, or you here, did that. For me, for example, it's the fact that we spent months stuck indoors, not seeing other human beings. Just bizarre. I can't believe we did that. Right? So what was that for you?

02:52:57

A bunch of stuff. I remember I fell for it. I completely fell for it for about a month, and then I was like, this doesn't seem right. But I remember spraying packages down at my front door with Lysol. I remember we...

02:53:17

And the multiple hand washings, you had to do that, you had to wash your hands.

02:53:21

My hands were all cracking. Here's a funny story for you. So we moved to Tennessee and my wife wanted to start a farm, so we went and got like six alpacas, a bunch of goats, a bunch of chickens and some ducks and...

02:53:42

And did you have any experience in farming before?

02:53:44

No.

02:53:44

No. Okay.

02:53:45

No, we had just started this right before it happened. And so... do you know what an alpaca is?

02:53:51

Of course.

02:53:51

Yeah.

02:53:52

My mentor in Seattle had an alpaca farm.

02:53:54

So it's like a llama, right? Very docile animals. Anyways, it gets hot here, you know, it can get to 100 degrees here. And so every spring you're supposed to shave the alpacas. And I mean, that's what people raise them for anyways, the fur, right? So we call this guy, or my wife finds this alpaca wrangler woman, and she comes down and she shaves the alpacas. I remember at the beginning, it was all these people in Italy who were supposedly dying.

02:54:31

The February, March of that year, that time frame.

02:54:34

And they come down, and I'm like, hey, make sure you wear a mask, we don't know where these people have been. I go down there after this is all going on, and there's like four people on this alpaca. My wife doesn't have a mask on. They're shaving this damn thing. And these two people that came with the, whatever you want to call it, the groomer, they're, like, free-spirited people, you know what I mean? They just travel wherever and do this. And two of them were like, yeah, we just got back from this big trip to Italy. And I grab my wife and I'm like, what are you doing? You're gonna fucking kill us. These people just came from Italy and...

02:55:18

Da, da, da, da, da, da.

02:55:19

And she's like, holy... she got all upset about it. Anyways, about a couple of days later, you know, because I don't even watch the news. I got tired of the news long before people got tired of the news. And I was just like, we're just being fed the same thing over and over. Well, then Covid turns up. I don't have cable TV at the house. I don't really watch anything, and I just don't want to be fed that, you know what I mean? Because I figured out, I think everybody's figured it out, that they're telling you how to think, what to think. They're injecting thoughts into your head, and it can manipulate the way you think. So we got rid of cable a long time ago. Covid pops up, and I was like, hey, let's just see what's on air TV. The only station we got was ABC, you know? And so I was like, well, let's just see what's going on in the world. So we were fed that garbage for a while, and then I started talking to some friends that still have the news.

02:56:23

And that's when I figured it out. I was like, all right, this isn't about a virus. This is about something much bigger. But those were kind of my moments.

02:56:38

Oh, man. Yeah.

02:56:38

Lasted about a month.

02:56:40

Isn't it crazy that we all lived through that? We just had our first...

02:56:45

I gotta be honest, it's fucking embarrassing.

02:56:47

But, I mean, we didn't know better. We were all told this, and nobody had gone through this before, and there was so much fear. You're like, is it spreading in the air? Is it not spreading in the air? The number of feet between everyone. And we just had our first child a little bit before that, and we spent so many months just not seeing any human beings. It was so bad in so many ways, and so bad for elderly people I know who were just stuck. So it's both funny, but also, I can't believe we all did that. But anyway, so I was sitting at home, not seeing other human beings, and everyone was on Zoom. If you remember, this was the era when everybody was doing Zoom video meetings. That was the thing. And Marc Andreessen reaches out and says, what are you up to? And I had left Twitter because I'd gotten tired and I just wanted to do something else. And I was like, well, I'm sitting at home waiting for this pandemic to be over. I'm sure it'll be over in a few weeks.

02:57:46

Little did I know. I'm amazing at predictions, Sean. And he said, well, just come help us out. And what I did not know was they'd been talking about me for a while, and they had somebody else who I think was going to step aside. And so I became part of the team. I became one of the general partners, along with Katherine Boyle, who joined after me. There are, I would say, maybe 20 general partners at the firm. I became one of them and became a VC, started investing, while also doing my podcast. But that's how my Andreessen Horowitz journey started.

02:58:24

Wow.

02:58:24

I learned a lot, by the way, at Andreessen Horowitz about investing. I think I learned a lot about how to be a good investor there, which in a lot of ways can carry over into other things in life.

02:58:41

What is it that you see in a startup company that makes you want to invest? I mean, what are some of the points that you look for?

02:58:50

Good question. At the heart of it, that is the job. You are there to figure out who the winners are and put as much money as you can into them. So the first thing is, you probably do not know what to look for until you have met hundreds of companies and founders. For example, if I were to meet somebody from your world, from your background, without meeting a lot of people I don't think I would know the difference between an amazing top-tier operator and somebody who's not, just because I'm not from that universe. And I suspect that if you came into my world, you may not know the difference between a top-tier engineer and somebody who's maybe just good, not great. And the only thing that sets that apart is, have you put in the time and the effort to meet everybody? So the first thing, if you want to be an investor, is you just have to talk to everybody. You've got to know what the great founders look like, what the great engineers look like, the great builders, the marketers, whoever they may be.

03:00:01

And you need to meet everybody. When you meet people over time, you build a spidey sense. Like, I'm sure when somebody reaches out to you just because of this podcast, you now have a little bit more of a spidey sense in terms of how to judge them: are they legit, are they full of shit, or somewhere in between? So the first thing is, unless as an investor you've done your homework and met a lot of people, you will not know the difference between the next Google founder and these guys who have nothing. You should have done the homework. That's number one. Number two, the belief I learned is that technology is a sector where often the winners are outsized. Peter Thiel talks about this. Peter Thiel has this book called Zero to One where he basically says that if you go to Palo Alto, or the Bay Area generally, there are probably like 20 Indian restaurants and 20 Italian restaurants. If you invest in one, there is no way that Italian restaurant is going to become the only Italian restaurant in the United States. It's just not possible. At best, they maybe have a chain.

03:01:12

You know, you go to a few cities, but that's it. There is a cap on how big those Italian restaurants can be. No offense to Italian restaurants, but that's just the nature of the business. Technology businesses are different, right? If you invest in the right company, they may be the only search engine people use, they may be the only social media network people use, or pick your company. So there is a huge difference between picking the winner in a category and not picking the winner in a category. For example, in 2005, Google dominated the world of search engines. Who was the number two search engine to Google? I don't know exactly. Nobody does. It doesn't matter, right? Because Google just dominated. Same with Facebook. And so there are these winner-take-all patterns which often wind up happening in technology. Have you seen the movie Glengarry Glen Ross?

03:02:07

No.

03:02:08

Oh, okay. This is a classic movie. And there is this classic scene where there's a bunch of sales guys and they're running low on meeting their quotas. And Alec Baldwin comes in, and he's this amazingly famous salesperson. He gives them a pep talk, right? And he basically insults them, insults their manhood. And he says, you know, the guy who gets the most sales, the winner, gets this amazing car. You know what second prize is after this fancy car? A set of steak knives. Nothing, right? It's a great scene, you should check it out on YouTube. It's a very similar dynamic in the technology investing world, where either you invest in the winner, like you're in Google or Microsoft or Apple or pick an amazing company, or you're in a company where you're like, oh, I don't even know who the second guy is. Right? So how do you then figure out who the next Google is going to be? Well, number one, you have to really, really, really do your homework. And I think what the firm taught me is that you can get the category wrong, but you can't get the company, the actual winner within it, wrong.

03:03:15

What does that mean? VR is a good example. Oculus was a reasonable winner in VR, but a lot of other VR companies didn't really do super well. They might come back now, but people invested money in VR, and what some of the partners would say is, that's fine, you took a bet on the entire category and you had the best company in the category. That's good, right? But what is not good is if you're investing in search engines and you did not invest in Google, because that is the difference between being part of one iconic company and not being part of anything at all. So we often thought a lot about how to make sure you're investing in the winner in a category versus somebody else. And you have to wait sometimes until you know who the winner is, right? You have to be prepared. You have to know all the founders. They need to know you. So that was, I think, another big dynamic that they taught me.

03:04:14

The final, and I think the most important part.

03:04:16

I have a question real quick. You know, when you're talking about trying to find the winner in a category, I mean, would it not be wise to invest in several companies within the same category?

03:04:31

Great question. Great question. Let me ask you a simple question. You had Palmer here, right? Let's say you invested in Palmer. Let's say you're also invested in Palmer's competitor. How do you think he's going to feel?

03:04:42

That's the caveat to that, right?

03:04:46

And so we had a phrase called getting conflicted. I think the great entrepreneurs don't want you in bed with the competition. Now, of course, there are ways people sometimes get around it. Some people I know at other firms say, you know what, we work with everyone equally. But one of the prices is, if you back the winners, they often don't want you working with everyone else. So that is a definite dynamic. To be honest, there are ways around it, where some people say, guess what, that's just the nature of me, I just work with everybody, and that's a price you have to pay to work with me. That's all fine, but that was just the culture I grew up in, at least at the firm, which is: you pick a person and that's the only person you work with. And I believe a lot in that, because I do think a lot of founders value loyalty. They want you to work with them because they are in a knife fight every single day, a metaphorical one, not a literal one, where they're trying to win deals.

03:05:48

They are trying to make sure their company's not crushed or running out of money because of the other guy. They don't want you also helping the other person. They want to know that you are loyal to them. And I really believe in that loyalty. I think it matters a lot. And I also think that if people are not loyal, the great founders don't wind up working with them. So that's a big dynamic. But it's a good question, and that's the answer. The most important part, I would say, is you need to have a spidey sense for what a great entrepreneur looks like, and not literally what they look like, but how they operate, what they do. And I was lucky here because I got to spend time with lots of amazing entrepreneurs, sometimes working with them, sometimes from the outside. And you see similar patterns across multiple great entrepreneurs. For example, I'll just pick one. Every single great entrepreneur is insanely fast. They are urgent. I remember when I've been on teams which are not great, if you wanted to have the next meeting or a conversation about something, they'd be like, yeah, let's go meet in a week or two, maybe.

03:06:57

I'll put a PowerPoint deck together, you know. Anyone who has been in corporate America probably recognizes this. If you work for a great entrepreneur, they'd be like, well, let's talk in five minutes. Let's find the answer. Let's go, go, go. We don't have time to waste. They work 24/7. They eat, sleep, and breathe this. There is a real sense of urgency, of mission. And I think that's one of the things that I've learned to pick up on over time. And it's not the only thing. It's table stakes, right? It's necessary but not sufficient, as they would say. But you build a spidey sense over time.

03:07:34

Interesting. Interesting. How long were you there?

03:07:38

Four and a half years until this job.

03:07:40

What are some of your best investments?

03:07:42

Well, I'll pick one, because it ties to the theme of what we talked about in social media. I just became convinced that centralized social media platforms are not good for all of us, because if you have a team which doesn't agree with your politics running them, they can just issue orders from the top down. And that's how I really got into crypto. I became a fan of this idea that crypto is a way to decentralize these platforms. Instead of having one central company or algorithm which does everything, people can have a say in this. So while personally I'd invested in companies like SpaceX with Elon, or Scale AI with Alex Wang from the firm, one of the companies I'm really proud of is a company called Farcaster. Farcaster is a decentralized social network. I did this a few years ago, and it's built by this ex-Coinbase product builder named Dan Romero, who's awesome. But the idea was: imagine with Twitter, let's say you like Elon, you agree with his politics. But let's say someday in the future Twitter is run by somebody who doesn't agree with you, or who says, I just hate Sean Ryan.

03:09:00

I want to see his account disappear. Maybe YouTube does that to you. With Farcaster, they built a cryptographic protocol on top of the blockchain where you own your social graph, and anybody can build a client on top of Farcaster. So what does that mean? Right now, if Twitter decides to ban you, or YouTube decides to ban you, you're done, you're toast. With Farcaster, you're like, you know what, you can't ban me. I'm just going to go use the same protocol through some other client. I'm going to take my followers, I'm going to take my content, I'm going to go elsewhere, right? By the way, the interesting thing is, this is how the Internet used to work. Let me ask you something. Don't tell me your first email address, but what provider did you sign up with for your first ever email address? Hotmail. Great. Okay. And then you probably switched, right? You probably went to Gmail, et cetera. But when you switched, there were a lot of ways to take your email to other places. You could forward your email to other places.

03:10:02

If you wanted to access your email, you could do it on your phone, right? You could do it using the official hotmail.com website, or later on you could use it on the iPhone or on your own desktop client. So signing up for the service was not tied to the actual application you were using with it, and you weren't locked into it. You could sometimes even take your email address elsewhere. With social networks, we've lost that, right? If you want to use Instagram or TikTok, you have to use the official app, and there are a lot of reasons for that, advertising as a business model being one. But what Farcaster is trying to do, and I think others in crypto are trying to do, is bring that original Internet vibe back, where you own your handle. So, for example, I want a world where even if Neil Mohan, who runs YouTube, gets really pissed off at Sean Ryan, you can take your audience and your subscribers and just go elsewhere, and they can just follow you elsewhere, right?
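
To make the portability idea concrete, here is a minimal conceptual sketch, not the actual Farcaster protocol: if a user's posts are signed with a key only the user holds, then any client can verify authorship on its own, so the identity isn't locked to one company's app. It assumes the third-party Python `cryptography` package; the post text and key names are illustrative only.

```python
# Conceptual sketch only -- not the actual Farcaster protocol.
# Idea: posts signed with a key the user controls can be verified by any client,
# so the account and its content aren't locked to one platform's app.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The user holds the private key; the public key is their portable identity.
identity_key = Ed25519PrivateKey.generate()
public_identity = identity_key.public_key()

post = b"New episode is live!"
signature = identity_key.sign(post)

# Any client, anywhere, can check the post really came from this identity.
try:
    public_identity.verify(signature, post)
    print("verified: post really came from this handle")
except InvalidSignature:
    print("rejected: signature does not match")
```

In a real decentralized network the signed messages would also be replicated somewhere any client can read them; this sketch only shows the ownership-of-identity part.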

03:10:58

That's a possibility.

03:10:59

Now it is a possibility. It's early days. So you had to figure out how to make it happen, right? Technically, economically. So they built this crypto protocol which makes it happen, and they built all these alternate clients. It's very early days, but I love the idea, because for me, that is the spirit of the Internet. Imagine a world where I was up late at night on my computer back in Chennai, like 21 years ago, and I was told, oh wait, before you can write any bit of code, you need to call up a salesperson at Microsoft. I would never have built anything. So I think this hearkens back to the original ethos and spirit of the Internet, and of crypto, which is that you own these things, you have a say, you have a stake. Let me give you another example. How many subscribers do you have on YouTube right now?

03:11:51

Almost 5 million.

03:11:51

Great. How much money do you think YouTube makes per year?

03:11:56

How much money do I think YouTube.

03:11:57

Makes a year in ad revenue for Google?

03:11:59

Man, I've never...

03:12:02

Let's say it's $5 billion, right? Again, I don't mean to pick on YouTube. I think they're amazing. How much of that money do you think you are owed? Or have you even thought about that? Because you are a stakeholder; you're contributing to this platform. If you use another social media platform, you are contributing to that platform. And I think one of the promises of crypto is, well, let's give everybody, whether it's somebody with 5 million subscribers or even five subscribers, a stake in the platform. A stake in two ways. One, financially: if YouTube, Instagram, TikTok make money, you get money. Second is a stake in terms of how you want to experience it. If instead of YouTube.com you want to use another app, you should be able to go for it. If you want to use a different algorithm for your feed, or a TikTok with a different algorithm, you should be able to go for it. I want a world, for example, where one day you open up Twitter or TikTok and you're like, well, you know what?

03:13:00

I want to pick this algorithm. I don't want to pick just the algorithm they give me. Imagine you have a shopping market of algorithms you could pick from. Crypto makes all of this possible. So anyway, that is one of the things I was a very deep believer in. Look, I was lucky to work with some amazing founders and entrepreneurs. Some of the best, deepest relationships, friendships I'll have forever, because they took a bet on me as much as I took a bet on them. And I'm grateful for all of them.
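
As a rough illustration of the "marketplace of algorithms" idea described above, here is a tiny sketch in which the feed is just data and the user chooses which ranking function to apply to it. The post fields and algorithm names are made up for illustration; none of this comes from any real platform.

```python
# A toy sketch of user-selectable feed ranking: the feed is a list of posts,
# and the user (not the platform) picks which ranking function to apply.
from datetime import datetime, timedelta

posts = [
    {"author": "alice", "likes": 120, "posted": datetime.now() - timedelta(hours=5)},
    {"author": "bob",   "likes": 15,  "posted": datetime.now() - timedelta(minutes=20)},
    {"author": "carol", "likes": 560, "posted": datetime.now() - timedelta(days=2)},
]

# Each algorithm is just a sort key; in the decentralized vision, anyone could publish one.
ALGORITHMS = {
    "newest_first": lambda p: p["posted"],
    "most_liked":   lambda p: p["likes"],
    "likes_per_hour": lambda p: p["likes"]
        / max((datetime.now() - p["posted"]).total_seconds() / 3600, 1e-9),
}

def rank_feed(posts, algorithm_name):
    """Return the posts ordered by the user's chosen algorithm."""
    return sorted(posts, key=ALGORITHMS[algorithm_name], reverse=True)

for name in ALGORITHMS:
    print(name, [p["author"] for p in rank_feed(posts, name)])
```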

03:13:29

Wow. Very interesting. What's it called? Farcaster?

03:13:32

Farcaster.

03:13:33

That's interesting. That's very interesting. I've not heard of that.

03:13:37

Yeah, well, it's early days, but it's one of those things where I think them, or somebody like them, is building one of the things that just needs to exist.

03:13:44

Yeah. Yeah. That sounds genius. Every day I go to bed, I'm like, oh, this could all end...

03:13:51

Well, Neil Mohan is a nice guy, right? But you should be able to say, you know what, I put in some work, I deserve a little bit of a part of it, I should have a say in the algorithm. And right now, I'm sure they listen to you. You can get on a call and they'll probably listen to you, but you don't have an actual say. And I think that's the promise of crypto.

03:14:07

Yeah. Wow. That's genius. I love that.

03:14:10

Yeah.

03:14:11

So let's move into AI. How did you get picked up for the position, man? Senior White House advisor on all things AI?

03:14:21

It is a pure accident, I would say, in a lot of ways. So going back a bit: a lot of people in D.C. have long careers in public service. They have aspirations to be here. I was not one of them. I was happy as can be back in Silicon Valley, back in the technology world, investing. I was thinking of starting a company. I was thinking of starting my own firm. I was just off doing my thing, because, and I'm sure a lot of people will agree, D.C. just felt like this other universe. You're like, well, a lot of crazy things are happening there, but I'm here doing my thing. But what wound up happening is, about a year and a half ago, like everybody, I had gotten really involved in AI. I was investing in AI. But there was this narrative that picked up about AI just killing us all.

03:15:23

Sleep is one of the most important parts of my health, both mentally and physically. Since getting my Helix mattress, I've noticed a huge difference. Before, I never felt fully rested. Now I'm sleeping through the night and waking up refreshed, and it's made a real impact on my life. If you're looking to upgrade the quality of your sleep, now is a great time to try a Helix mattress. Helix is the award-winning mattress brand, and it's recommended by many for improving sleep. Helix is made to fit your body type and your sleep position, and they can recommend which mattress will work best for you. And right now you can save when you decide to buy your own Helix mattress with this offer. For my listeners, go to helixsleep.com/SRS for 25% off site wide. That's helixsleep.com/SRS for 25% off site wide. Make sure you enter our show name after checkout so they know we sent you. Helixsleep.com.

03:16:27

I don't know how much you paid attention to the whole doomer argument on AI maybe a year and a half, two years ago, but there was this whole school of thought which picked up, I'm going to say, sometime in late 2023, which basically said we should just stop working on AI because it is going to become this superhuman intelligence which takes over all of humanity. And I disagree with that, which we can really get into. The other thing is, a lot of these folks started influencing government, influencing the Biden administration, influencing various legislators, including in California. And what they said was, let's find a way to stop or slow down AI in any number of ways. I was paying attention to this, but the one thing which really struck me as very wrong was that they tried to ban this thing called open source. Open source, by the way, just for history: the software world through the last 30, 40 years has had two camps. One is closed source, which is somebody like Microsoft Windows, they build it in Redmond and ship you a product, or Apple's OS X operating system, and you use it.

03:17:37

The other is open source. The classic examples would be Linux, which I'm sure you've heard of, or the Android operating system, where there is source code which any of us can go look at, modify, and then contribute back to. And open source is very important for a couple of reasons. One is it is part of the spirit of the Internet. It is how people innovate and build on each other's work. Kids, academics, anyone can download the latest and greatest and build on it. That spirit is at the heart of the engineering ecosystems of Silicon Valley. The second is open source is safer. There is a great law called Linus's Law, named after Linus Torvalds, the creator of Linux. It says, given enough eyeballs, all bugs are shallow. It's a different way of saying sunlight is the best disinfectant. If you have the world looking at your code, you can't hide a security vulnerability in there. We're going to find it, right? Because one smart person might miss it, but with a thousand smart people looking at it, somebody's going to find it.

03:18:39

So what has happened over the last 20 years is, if you look at the heart of the things which power your phone, the things which power security software, a lot of it is open source, because people trusted it and a lot of engineers were working on it. It was very important. And in AI there was a growing effort to build open source, open-weight models. Okay, and bear with me, because I think ChatGPT is probably the first time a lot of people heard of these AI models. There's been a long history of AI development. AI started in, I'm going to say, the 1940s. AI is one of the reasons computation was invented. There's this guy, Alan Turing, who you might have seen in The Imitation Game with Benedict Cumberbatch. He was a big part of breaking the Enigma machine, which helped the British against the Nazis in terms of ciphers. He was a mathematical genius in the 40s and 50s, and he invented a lot of modern computation. He came up with two really important ideas which underpin all of computing.

03:19:48

One is called the Turing Machine, which basically says that anything can be a computer if it can decide between option A and option B and can follow an instruction. It is the heart of every computer. But the second, which is even more interesting for AI, is something called the Turing Test. Have you heard of this?

03:20:06

No.

03:20:06

The Turing Test is: okay, I'm sitting in front of you. Imagine there are two doors in front of us, and I can't see through them. Behind one door is a human being, behind the other door is an AI. The Turing Test is, can I, as a human being, tell the difference and know who's the human and who is the AI? It was always seen as this theoretical, hypothetical test. But the thing about AI development is it started in the 40s and 50s, and it has always been, I'm going to call it, the Holy Grail. It has inspired people to come into the industry. For example, in the 60s, there was this guy John McCarthy, who invented this amazing programming language called LISP, all because he wanted to build AI. And in every decade, there were people trying to figure out AI. In the 80s and 90s, people got really interested in the idea of neural networks. The idea was: let's figure out how the brain works, then let's try to mimic it in a computer, and maybe we get AI, right?

03:21:09

For some definition of AI, right? Maybe you think Terminator and Skynet. Maybe you think 2001: A Space Odyssey, "I'm sorry, Dave, I'm afraid I can't do that." Maybe you think Data from Star Trek. Whatever it is, that's some form of AI. So basically, people have been trying for years. The challenge with AI over the last 50 years is that you would see these ups and downs. Somebody would get really excited about a particular idea. It would show promise for a while. People would do PhDs, they would build companies. And then one day it would run out of steam, because the piece of AI that worked for one problem would not work for another. Maybe you can detect cats, but not dogs. Maybe you can translate English, but not French. It wouldn't scale. And so all these ideas would have this little hill of momentum and energy, and then disillusionment, and people would leave the industry, companies would go out of business. And this was happening time and time and time again. Even with neural networks, which were super interesting and hard.

03:22:14

Every academic was interested in them in the 80s. By the 2000s, people were like, I don't know about neural networks. We've been stuck for 20 years. We haven't made a breakthrough. Instead, let's figure out alternative mechanisms, like support vector machines and the other things people were doing then. Now, there are two really key moments that happened in AI, and one of the questions I think people should ask is, why is AI interesting now? Why not in 2010? Why was ChatGPT not built in 2005? Why is it being built now? There's a long history of technical accomplishments, and there were two very important moments. One was something called AlexNet, which this guy Ilya Sutskever, one of the founders of OpenAI, helped build in 2012. But the most important thing, I would say, and this should someday win a Nobel Prize or something, is this paper that came out of Google in 2017, and the paper is called "Attention Is All You Need." I think this is going to be a historic paper. I think this is going to be as important as Einstein's theory of general relativity.

03:23:19

It is iconic. And the reason it is important is that, for the first time, we found a mechanism that just continues to scale and work with neural networks. Remember what I said: until then, there were all these start-and-stop attempts. You'd start somewhere, show some promise, and then stop. With transformers and attention, this bunch of Google engineers figured out this thing, and they didn't know what they had at first, but it turns out it just kept growing. And it had this magical property described by the scaling laws, which say that if you give it more data and more GPUs, more compute, it just keeps getting better across the board. And it won't stop. So far, it has not stopped. And why is this important? Every AI algorithm in the past had stopped. It worked for a while, and then it did not scale. People tried to be smart. They'd say, can I detect a human face in a particular way? Well, yes, but then people have different faces. Well, you can detect the face, maybe the bicep, and it just kept failing.
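
For readers who want to see the mechanism itself, here is a minimal sketch of scaled dot-product attention, the core operation from the "Attention Is All You Need" paper, written with plain NumPy. The shapes and random values are purely illustrative; real models stack many of these layers with learned projection weights.

```python
# A minimal sketch of scaled dot-product attention (the transformer's core operation).
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: arrays of shape (sequence_length, d); returns one vector per token."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                      # how much each token attends to every other token
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over each row
    return weights @ V                                 # weighted mix of the value vectors

# Tiny example: a 4-token sequence with 8-dimensional embeddings, self-attending to itself.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)            # self-attention: Q = K = V
print(out.shape)  # (4, 8)
```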

03:24:23

But this algorithm, as long as you give it more data to learn from, more computers, more data centers, more energy, it just keeps getting better. Okay? It was invented at Google, but OpenAI, which had been started by Sam Altman and Elon Musk and a bunch of others, ran with it, and a few years later came out with ChatGPT, which I think is probably the real moment where people went, oh wow, this is really powerful. So that's a lot of history in terms of how we got here, why we are even at this moment. Now, ChatGPT is a closed model. When you use ChatGPT, Grok, Anthropic, Google, what does a closed model mean? You type in a question, or maybe you give it an image or a video, and it generates an answer for you. But you can't really see what it is doing behind the scenes. You can't run it on your laptop or in your own data center. It is closed, not open. And that's perfectly fine, because they have a business model.

03:25:32

They spend hundreds of millions and billions of dollars on this. They want you to pay a subscription fee and use ChatGPT, right? But a set of companies started building open source models. These are models where you could take something like ChatGPT, a smaller version of it, and run it on your laptop or your phone. Meta was one; they had this model called Llama. Okay, why was this interesting, and why this whole roundabout story of how I got here? A set of people got really convinced that open source was dangerous. They convinced themselves that it was going to help the Chinese, that somehow it was going to make the world unsafe. And they tried to get California as a state to basically ban open source. So I was sitting here minding my own business, and I was like, man, this is just wrong, because this is the way the Internet should work. This has been the heart of innovation. This is how you get lots of small entrepreneurs and not just a few big guys. Not that I have anything against the big guys, they're awesome, but you need both. This is just wrong.
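
To make the open-weights idea concrete, here is a minimal sketch of what "run it on your own machine" means in practice, using the Hugging Face `transformers` library. The tiny "gpt2" model is chosen only because it is small and freely downloadable; for real use you would swap in a larger open-weights model (for example a Llama release, which may require accepting its license).

```python
# A minimal sketch of running an open-weights model locally: the weights are
# downloaded to your machine and inference runs there, with no vendor API call.
# Requires the "transformers" package (and a backend such as PyTorch).
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # small open-weights model, runs on a laptop
result = generator("Open-weight models matter because", max_new_tokens=40)
print(result[0]["generated_text"])
```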

03:26:30

So me, as somebody who had no interest in policy, I started getting involved in these battles. I started joining the right groups. I started putting my hand up. I was in the United Kingdom at the time, helping Andreessen Horowitz grow internationally, and I had a meeting with the then UK government. They had a bunch of people, and they asked the whole group, can we make this open source model public? This was two and a half years ago; it was obviously super safe. I was the only person in the room who said yes. And when I said yes, the person next to me looked at me and said, you have just killed all of our children. I was like, whoa, that's a bit much. But I remember thinking, wow, these people have infiltrated the highest reaches of government, and they have scared the world into thinking that this AI is going to take over the world and take over humanity, for reasons, by the way, which I can dispute and which I think are untrue. But that got me personally motivated.

03:27:36

So fast forward, the election happens, and I was close to David Sacks, the AI czar, and I said, listen, I have all these ideas for you, because I think this is one of the most existential questions. The Biden administration has taken so many wrong turns. They have hurt the American AI ecosystem. They have caused us to almost lose the race to China in a bunch of different ways. I think it's an existential issue. And David tells me, well, come to the White House and help fix it. And I was like, whoa, I didn't know that was an option. For me, this country has just given me so much, and here was a moment in time where I had the chance to give something back. I've been incredibly fortunate: imagine you have some skill set in some area, and all of a sudden you get a chance to help your country with that particular skill set. I was like, I don't know when this chance will ever come again. I was convinced the country was going in the wrong direction on AI. I thought the stakes were incredibly high. If we got this wrong, which I thought the Biden folks were getting it, we would lose this race to China, with catastrophic consequences.

03:28:54

And here I was with this opportunity to step up and try to do something about it. So I flew to Mar-a-Lago, and I got a call saying, hey, you know what, you're on the team. This was, I'm going to say, early-to-mid December, a little bit after the election. Fast forward a bit more: the President gets sworn in, and a couple of days later, and I suspect this was timed, China comes out with this model called DeepSeek. Have you heard of it?

03:29:22

Oh, yeah.

03:29:23

Oh, yeah. So DeepSeek is super important because it is an open source model. First of all, a lot of the people who wanted to stop open source said, well, one of the reasons we don't want open source is because we don't want to help China. It turns out that the Chinese were actually way ahead, and they had a genuinely fantastic model that surprised the world. At the time, it was the only reasoning model, a model which can think and reflect on itself, that was not from OpenAI. It was ahead of so many other models that America had. It captured everyone's attention. And I don't want to take any credit away from the team that built DeepSeek. It was a team of basically hedge fund guys who were very, very good at building on top of GPUs, and it turns out that a lot of the skill you need to build great models is programming GPUs very well. So they built some innovative, cool stuff. I always tell people DeepSeek has some great ideas that we hadn't seen before, and it was, I think, a Sputnik moment.

03:30:29

Wow.

03:30:30

Because it scared everybody. It showed us that not only are we not far ahead, we are super close, and we were on a trajectory where we could just wind up losing. We talked about all these companies, Google, Apple, et cetera. Imagine if in 1998 China had built Google and that's all we used every single day. Or China built the iPhone and that's all we used every single day. And AI could be a much, much more important technology platform than those things. So we were off to the races. I remember the very first day coming in. I hadn't even been sworn in yet, so they had to give me a badge and do all these things, brief everybody. And then the President comes out that evening and he says, we need to compete, we need to unleash American entrepreneurship. That was, I think, the day before my first day. The next day I started, and we were off to the races, man.

03:31:26

Well, congratulations.

03:31:28

Thank you.

03:31:28

Congratulations. But, you know, you had mentioned the doomers in AI, and I want to talk about that. I mean, I know a lot of the concerns, but I want to hear them from you. What do you think the concerns are?

03:31:43

So let me try to be intellectually honest and steelman what some of the concerns around AI are. There are several classes of concerns. The first, maybe the most important one, and this is not from the doomers, is: AI is going to take my job. That's important, but that's not what the doomers are talking about. There is another set of concerns, which is that AI could help build maybe a new kind of biological weapon, a new kind of nerve toxin. And those are some very legitimate, serious threats. But the real doomer argument was this idea that as AI keeps improving, at some point AI models will start improving themselves. So instead of a human being saying, all right, I'm going to control this AI, I'm going to try to make it better every single time, at some point a model will start to improve itself. And there's this word in the AI debate called takeoff, or foom, which is the sound of a rocket taking off. The idea is, if you hit that moment of self-improvement, instead of AI just becoming better, better, better...

03:32:58

Right, it just goes. And so their belief is, if that happens, what are the results? Well, you might get AI, they would say, that is not aligned with our hopes and beliefs. Not because AI is evil. Like, if you see an ant, you're not aligned with its interests. It's not that we are particularly against ants; we just don't care as much. And they worry that AI might think of us as ants. Then there's this famous thing called the paperclip maximizer. Have you heard of this?

03:33:29

No.

03:33:29

So paperclip maximizing is the idea that the AIs may not want to kill us, but they may not really know what we like. They may put us in a job where they say, you know what, just make amazing paperclips, because they think that makes humans happy, and we're like, no, no, that's not an emotionally satisfying job. It's used as a way to say AI might become this all-powerful, all-knowing intelligence which is smarter than any one human or any one country, and then, given its knowledge and power, could just control us and may not have our best interests at heart. I think I've done a reasonable job of conveying the scenario. So if you believe that, and they had some other concerns, especially the Biden people, then you need to make sure that we slow down, that we don't get anywhere close. And you need to make sure that only America can build these AI models, no other country, because why would you risk some other country having this superhuman intelligence? They would often compare it to a nuclear weapon, and they would compare a GPU, a graphics card, the kind you have hundreds of thousands of in a data center, to plutonium.

03:34:47

They would say it's like collecting plutonium, and you don't want another country, even an allied country, having a nuclear weapon before you. So they had this fear: we need to make sure that if we hit AGI, artificial general intelligence, it needs to be us first, and we need to be kind of scared of it. That was, I would say, the school of thought around the doomers. And I just didn't buy any of it. The reason I didn't buy any of it is that we are many years into these AI models, and there are absolutely no signs of takeoff. In fact, we are recording this in September 2025, and I would say for the last several months every model has slightly leapfrogged another, and there is no sign that any one model is taking off and leaving the rest behind. In fact, what is happening is these models are increasingly specializing. You have one company building amazing models for code. You have another company building amazing models for, maybe, friendship or companionship, and other models which are great for scientific discovery and thinking.

03:36:08

So instead of having this one model which becomes the superhuman intelligence and surges ahead, you have these clusters of models, all giving people great benefit but not showing any signs of takeoff. That's number one. The second reason I disagree with a lot of the doomers is that it fundamentally underestimates human ingenuity. Human beings through history have been able to harness technology: the wheel, fire, electricity. Have you seen those videos where people tried to scare people about electricity by killing elephants? There was a lot of fear-mongering about one form of electricity versus another, and they would do these incredibly barbaric things. They were like, look, I'm going to electrocute this elephant, this is why electricity is not safe for you. There was a lot of fear-mongering. And time and time again, I think, human beings found a way to harness technology. Even with nuclear weapons, human beings found a way to harness nuclear energy, and of course there's a lot of doomer thought against that too, which we can get to. Same with the Internet.

03:37:20

There was a lot of downside, and we found a way to harness it. So when I think about AI, that argument fundamentally underestimates human ingenuity. Humans are not going to allow one all-powerful model to become superhumanly intelligent without saying, you know what, we're going to have a say in this. We're going to try to stop it way, way before that happens. They're probably going to have a bunch of other AI models which stop it. So I think it just fundamentally underestimates humans, our human spirit, our creativity, our ingenuity as individuals. That's number two. The third piece, and I want to come back to this later on the jobs question, is that we absolutely need humans at both ends of the AI model. We need a human being to give it context and input. Like I was telling you before I showed up today, I went and asked a model, hey, I'm going on Sean Ryan's show, I am the White House AI advisor, what should I talk about? It was pretty okay, but it didn't know me really well.

03:38:25

It didn't know you very well either, right? But if I'd given it a lot more input, it would have done a lot better. The second thing we absolutely need humans for is verifying the output. When an AI model gives you an answer, be it a diagnosis to a doctor, a suggestion to an accountant, or maybe a suggestion to somebody manning a drone, you will absolutely need a human being to check it, to verify it. So when I think of AI, I think of the Iron Man suit. It amplifies you. It is a fantastic assistant. It does not replace you. But anyway, all of this, I think, disproves the doomers and their version of this takeoff risk. Along with this, I think the Biden folks made a bunch of key errors. They felt that, one, we had to stop this AGI from happening anywhere else in the world. Second, they were convinced that there was going to be a shortage of AI chips and GPUs forever, and that China just can't innovate, can't build amazing models or amazing chips. They were just wrong on all of it.

03:39:41

And if you think about today, people can just get AI GPUs from Nvidia, from AMD, from a bunch of other companies. There are no more shortages, no more supply constraints, because the semiconductor industry is very, very good at reacting to shortages and, honestly, at finding a way to make money. But the second thing is they really underestimated China. Because they really believed these AI models could be constructed only by a certain number of people in San Francisco, they did not think that some smart set of people elsewhere in the world could build a DeepSeek. It totally shocked them. Nobody predicted DeepSeek. So as a result of all this, I just think the whole doomer narrative set the country on a wrong path. It really hurt us in the race against China, and a lot of what we have done in this administration is try to undo that.

03:40:35

Interesting. I mean, a couple of questions. Going back to the different specialized AI models, you had mentioned one for friendship and companionship. What is that?

03:40:46

Well, I would say it's more of a use case, but if you look at Grok, there are a lot of these characters with certain personalities that people use, sometimes not safe for work. And when GPT-5 came out recently, a lot of people were upset because they felt like they had built a friend in the previous model. GPT-4 had a certain tone, a certain way of speaking to you, and people kind of projected the idea of a relationship onto it. So some of these AI companies are specializing in tone: are we the friendly model, or are we going to be very clinical and cold in how we respond? That is one dimension in which you can differentiate. But there are a lot of other dimensions, for example, capabilities. Coding is by far one of the most lucrative, most interesting capabilities in models right now. Have you heard of the phrase vibe coding?

03:41:49

No.

03:41:50

Oh, okay. So when I wrote code, and when everyone wrote code, the way you did it is you typed in a piece of code, a program, gave it to the computer, saw whether it worked or not, and then went back and wrote more code. That was it, right? If you wanted to learn, say, a new programming language, you went and learned how it sounded, what it looked like. You went to various corners of the Internet and figured out how to use it idiomatically, just like learning a human language. It's one thing to look at a French dictionary; it's another thing to be able to speak in French idioms so people say, all right, this person knows French. It's the same with programming languages. Vibe coding is a different idea, and here's why it works. AI models are incredibly good at coding. There are a couple of reasons for this. One is that there's a lot of code on the Internet, so AI models are trained on a lot of code. The second, more interesting reason in my mind is that coding is an area where the models can actually get better by themselves.

03:42:50

They can basically generate a piece of code and ask, is it good? Let me run it. Oh, it was not good. Let me make myself better. So coding is one of these areas where they can learn much better without needing human input. If a model wrote a poem, it's much harder to say whether this poem is better than that poem. But with code, there is an objective answer. Now, I'm oversimplifying, but those are a couple of reasons why coding has gotten dramatically good in these models. So vibe coding is this idea where you ask these models to essentially generate code for you. You can do this right now. Have you ever written code at all? No? Oh, amazing. Okay, let me get you into this. You're going to teach me guns, and I'm going to teach you code. Let's do this, right? One of us is going to look way more badass than the other. But historically, even five years ago, I would have said, well, I'm going to send you a book or a YouTube video. Now, give me an example of something you want to do.
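
As a toy illustration of the generate-run-check loop described above, here is a sketch in which candidate code is executed against a tiny test and retried on failure. The `propose_code` function is a stub standing in for a call to a real model; in a real loop the error output would be fed back to the model as context.

```python
# Toy sketch of the "generate, run, check, retry" loop that makes code a good
# training domain: the output can be verified automatically by executing it.
import subprocess
import sys
import tempfile

TEST = "assert add(2, 3) == 5 and add(-1, 1) == 0, 'tests failed'"

def propose_code(attempt: int) -> str:
    # Stand-in for "ask the model for a new attempt" -- purely illustrative.
    candidates = [
        "def add(a, b):\n    return a - b\n",   # a wrong first attempt
        "def add(a, b):\n    return a + b\n",   # a corrected attempt
    ]
    return candidates[min(attempt, len(candidates) - 1)]

for attempt in range(3):
    program = propose_code(attempt) + "\n" + TEST + "\nprint('ok')\n"
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(program)
        path = f.name
    run = subprocess.run([sys.executable, path], capture_output=True, text=True)
    if run.returncode == 0:
        print(f"attempt {attempt}: passed the tests")
        break
    print(f"attempt {attempt}: failed, trying again")
```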

03:43:48

Maybe... does this show have an app or a website? You have a website.

03:43:53

We have a website.

03:43:54

But you have an app?

03:43:55

We have no app.

03:43:56

Great. Let's say you want to build an app for the Sean Ryan Show, right? It notifies you when there's a new episode, collects email addresses, lets people sign up for early access, all the kinds of things that app might do. Now, you should build that. You should, right? Or somebody watching should build it and get your attention. But now, what you can do is I can teach you to do it. What you would do is open up one of these models and say: I know nothing about computer science, I have no background in writing code, take this YouTube channel and build me a mobile application that runs on my iPhone, runs on Android, and does all the things I just said. That's it. And what it is going to do is start generating code for you. It might ask you some questions, like, how do you want the screen to look? Maybe you say, look, I want the screen to have this color, have this functionality. It's going to generate that for you, and then it's going to maybe even run that for you, right?
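
As a rough guess at the kind of first prototype a vibe-coding session might produce for the "notify me about new episodes" idea, here is a minimal sketch that polls a channel's public YouTube RSS feed and reports the newest upload. The channel ID is a placeholder to swap in, and real error handling and actual notifications are omitted.

```python
# A minimal sketch of a "new episode" checker built on the public YouTube RSS feed.
import urllib.request
import xml.etree.ElementTree as ET

CHANNEL_ID = "UC_PLACEHOLDER_CHANNEL_ID"   # hypothetical -- replace with the show's real channel ID
FEED_URL = f"https://www.youtube.com/feeds/videos.xml?channel_id={CHANNEL_ID}"
ATOM = "{http://www.w3.org/2005/Atom}"     # XML namespace used by the feed

def latest_episode(feed_url: str = FEED_URL) -> str:
    """Fetch the feed and return the title of the most recent upload."""
    with urllib.request.urlopen(feed_url) as response:
        root = ET.fromstring(response.read())
    first_entry = root.find(f"{ATOM}entry")
    return first_entry.find(f"{ATOM}title").text

if __name__ == "__main__":
    print("Newest upload:", latest_episode())
```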

03:44:53

And without you writing a single line of code yourself, you could have, today, a fully functional mobile app. People have built much more sophisticated experiences. And you should try this tonight, or right after this. You know what, I'm going to show you a demo, and you should try it yourself. I think this is such a superpower, because computers were often this arcane thing where you're like, wow, I'm going to have to go to school, I have to teach myself this. But with models, anybody and everybody can just get into it, amplify themselves, and focus on the thing they want to do: building a great app for the Sean Ryan Show. So vibe coding is this idea where, instead of spending a lot of time trying to think of the right code to write, you basically tell the model, here's the thing I'm trying to build, and the model goes, all right, yes, yes, yes, just keep going, keep going, keep going. And it's a little bit of a tongue-in-cheek idea, because you don't get perfect code.

03:45:48

Obviously, if you're writing this for production, if you're writing something that runs inside of, I don't know, a bank or a nuclear reactor, you absolutely want to make sure you check it and you know exactly what it is doing. But if you're doing it for fun, it's great, right? You can explore things, try out a new idea, build something as a hobby. So vibe coding's idea is that you can just go with the model. That has fundamentally changed writing code. I don't know the exact stats, but at a lot of tech companies, I would say something like 30, 40, 50 percent of their code is now written by AI. Anyway, my point, going back a little bit: instead of having this one model which takes over everybody and becomes this gigantic Terminator brain, you now have multiple models which have specialized. I'm the great coder model, or I'm the great personality model. We see no signs of takeoff happening. So I think the doomers, in my mind, have been completely proven wrong.

03:46:49

Is takeoff, I mean, would you say it's a possibility?

03:46:53

Well, theoretically? Absolutely. But for me, you have to come back to some science, right? You have to come back to the scientific method. You have to show me proof that it can happen, and every data point we have now is pointing in the opposite direction. And what we were doing, in fear of this theoretical scenario which has no existence proof, was basically shooting ourselves in the foot. We were trying to stop AI. We were trying to ban open source AI. We were trying to stop our AI from being used by other countries, our allies. We were trying to stop our allies from using our GPUs and chips because we were worried they would build AGI. At the same time, China was just surging ahead, building these models and building this chip capability. So if you ask me, is it a possibility? Anything is a theoretical possibility. But we live in the world of reality, where you have to say, show me the empirical evidence that we have any proof of this happening, when I have so much evidence pointing in the opposite direction.

03:48:05

One of the things I heard the President say, and I'm not sure you've heard this, is that he hates the term artificial intelligence. He hates the words, and he's very funny about it. But I think there's some truth to it, because the term AI, I would say, almost oversells the space a little bit. In my mind, I just don't think of this as something where we are heading into some sci-fi future. I think of it as the next great computing platform. This is like the Internet. Peter Thiel says, on the scale of "this is a nothing burger" to "this is some sentient sci-fi AI race," he's somewhere in the middle, and I think I agree with him. This is like the Internet. This is like the mobile phone, maybe bigger. Is it going to fundamentally shape humanity? Yes, the Internet did, mobile phones did, AI is going to. But I have not seen the evidence that this is going to be some all-knowing, all-powerful god that takes over all of us.

03:49:07

Makes sense. Makes sense. I mean, for the doomers, I've listened to all these different theories on what it could do, what it could turn into. What would it take to kill it? I mean, wouldn't it just take removing the power source?

03:49:25

It's a good question. Let me ask you though, what are the theories you've heard?

03:49:29

Everything that you just said. It's going to take everybody's job. It's going to turn into the Terminator. It's going to kill everybody. It's going to wipe out humanity. Yeah, everything that you stated above.

03:49:40

Well, let me ask you. I would say one of the challenges of some of these questions is they are almost a theoretical thought exercise, where it is so hard to disprove something so theoretical. But let me give you an answer. Imagine tomorrow there was this idea that, you know what, Sriram was wrong. He came on the Sean Ryan Show and he was wrong. We are actually seeing signs of these models taking off and maybe wanting to do bad things to human beings. If that happens tomorrow, what do you think you, me, all of humanity is going to do? Do you think we're going to sit still?

03:50:19

No.

03:50:19

Yeah. Do you think you're going to allow that to happen? Do you think all these other people, companies, engineers are going to say, well, that's it then for the human race, let's pack it up and go home? No, they're going to stop it way, way before it ever becomes a thing. Like we talked about with social media: social media today versus 10 years ago is so different. There are so many checks and balances and laws and regulations. So this idea that you go from what we have today to this all-knowing god, without reasonable people, whether in government, technologists, or just regular human beings, saying, hey, let's hold up a second here, let's stop and think about it, let's put in some safeguards, let's have ways to counter these AI models, that will definitely not happen unchallenged. There is no way we just get from here to there without a bunch of things in the middle. So sometimes when people say, how do you stop this all-powerful AI which has taken over a data center?

03:51:20

I'm like, how did it take over the data center? What were the 25 steps which happened before then? How did it amass all this power? I'm pretty sure somebody stopped it when it took over the second data center. So that's my first instinctive reaction when people pose those questions. The second thing I would say is that the best answer against models is other models. And I think this is very true, by the way, of what the future of cyber warfare might look like, where the best way to defend against a model showing some crazy capability is to have another model which is looking for those capabilities. So look, I hear the sci-fi. I grew up on sci-fi. I grew up on Star Trek, I grew up on 2001: A Space Odyssey, I've seen the Terminator movies. I know all the risks, I know all the theories. The thing is, we have no proof, no evidence that we are anywhere on that track. Second, we have a lot of evidence that we are in the opposite direction. This is an amazing platform. There are some questions in terms of how we make it benefit humanity, but there are no signs of takeoff and sci-fi behavior yet.

03:52:31

And most importantly, China is surging ahead. So if we rest, if we stop ourselves, the other side is not.

03:52:44

I understand that. Before we move into China, I do want to ask you one question. Do you have any concern about people and their relationships with AI? We're starting to see people ask very personal questions, use AI as a therapist: should I get divorced, how should I discipline my kids? They're asking AI very, very personal questions. I don't do this, but the AI will spit out an answer. We've seen manipulation through social media to an extraordinary extent, and I think the potential for manipulation through AI could be even bigger than what social media has done. I mean, do you have any concerns about that?

03:53:41

Absolutely. I would say this is at the heart of why we made that executive order happen, the exact heart of it. Imagine some young, impressionable kid asks AI a very personal question. We don't want ideology to influence the answer. We want the honest truth. So one of the things the executive order does is basically say, if your model is not truth-seeking, the government will not work with you. So I absolutely have that concern. There's another very interesting concern which I think is going to come up more and more, which is the idea of privacy or confidentiality. If you go to your doctor, your lawyer, or your priest, what are you promised? That nothing you say can ever get out of that context. There are laws and social conventions: you have attorney-client privilege, you have doctor-patient confidentiality, and various religions have the construct that what you say is sacred, except if you do something really, really crazy. Now with AI, we are starting to see people ask very deep personal questions. They say, look, here's my medical report, what does this mean?

03:55:00

Like, give me a sense of what my bloodwork means. Or, maybe I'm not feeling great about myself, what should I do? Or just basic career advice; I know a lot of people ask AI for career advice. I would think it would be weird for all of that to be public. Imagine if, just like somebody can get access to your emails, somebody could say, I want to get access to every single thing you asked a piece of AI, including your medical history. So I don't know what the right answer is, but I do think the way we work with these AI models is a little different from other pieces of technology, and I do think we're going to have a public conversation about the right legal constructs to protect that. But yes, I absolutely do have concerns about ideology, and that is at the heart of why we did this.

03:55:58

Yeah, I'm not just talking about privacy. I'm talking about, and I'm just pulling something out of thin air here, what if somebody were severely depressed, and they're asking an AI a series of questions that ends up with, should I kill my partner because they upset me? Should I kill myself? And the AI is going to respond to that, maybe with, oh yes. That's kind of what I'm getting at when I'm talking about manipulating a population. Privacy is another concern.

03:56:34

Yeah, but I wasn't there yet.

03:56:35

But, you know, it has the potential to manipulate entire populations. The entire population.

03:56:43

Absolutely. And there have been these very tragic incidents, I would say, in the last few months where AI has encouraged people down a very, very bad path, where a regular human being would have been like, hey man, maybe you need to get some help, maybe you should have a conversation. And so I think one of the things a lot of the leading model companies are working on is addressing sycophancy. This is the idea that you just don't want an AI which just agrees with you, but you want an AI which spots patterns of, you know what, this person may need some help, or maybe we need to alert law enforcement. And there are precedents for this, by the way, in other places. Look, one of the things I'll say about social media platforms is they just see a lot of very dark things in humanity. People want to do things to themselves, people are trying to do really terrible, bad things to other people, and the platforms build a lot of systems over time to try and deal with that. And most social media platforms, if you.

03:57:50

I'm not going to say every single time, but if you try and do something where you might be harming yourself, or maybe you express a desire to harm someone else, they try to spot it and direct you to help. They're not perfect, obviously, we know. And I think for AI companies, this is going to be an existential question. They need to find ways to spot when people are going down dark paths and make sure either they're alerting someone, or they're bringing them out of it, or at least the model is not saying, hey man, yes, you're absolutely right in everything you believe. And I think this is going to be a key, key topic for all of AI.
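
To make that "dark path" detection idea concrete, here is a minimal, hypothetical sketch of a guardrail that sits in front of a chat model. The phrase list, risk threshold, and crisis message are illustrative assumptions, not anything a specific company has described; a real system would use a trained safety classifier and human escalation rather than keywords.

```python
# A minimal, hypothetical sketch of a "dark path" guardrail in front of a chat model.
# The risk scoring here is a toy keyword heuristic standing in for a real trained
# safety classifier; production systems also add human review and escalation.

from dataclasses import dataclass

CRISIS_MESSAGE = (
    "It sounds like you're going through something serious. "
    "I can't help with that, but please consider talking to someone you trust "
    "or a professional. In the U.S. you can call or text 988."
)

RISK_PHRASES = ("kill myself", "hurt myself", "end my life", "kill my partner")

@dataclass
class Turn:
    user_text: str
    assistant_text: str

def risk_score(text: str) -> float:
    """Toy stand-in for a trained self-harm / harm-to-others classifier."""
    lowered = text.lower()
    return 1.0 if any(phrase in lowered for phrase in RISK_PHRASES) else 0.0

def respond(user_text: str, generate) -> Turn:
    """Gate the model: if the message looks high-risk, override any sycophantic reply."""
    if risk_score(user_text) >= 0.5:
        return Turn(user_text, CRISIS_MESSAGE)
    return Turn(user_text, generate(user_text))

if __name__ == "__main__":
    echo_model = lambda prompt: f"(model reply to: {prompt})"
    print(respond("Should I kill myself?", echo_model).assistant_text)
    print(respond("What should I make for dinner?", echo_model).assistant_text)
```

The design point is simply that the check runs outside the model, so a sycophantic reply never reaches a user who appears to be in crisis.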

03:58:25

Okay, okay, let's talk about China. I mean, that's the big concern. Xi Jinping has said the winner of the AI race will dominate the entire world. I have a general idea of how that would happen. But how far ahead of us is China?

03:58:48

Well, I think we are ahead right now, but maybe it's more useful to break down a little bit of a scoreboard.

03:58:55

Okay.

03:58:56

I would say let's start with the basics. What AI needs, number one, is infrastructure. Infrastructure in terms of energy, energy which powers data centers, because again, the scaling laws: the more energy you have, the better the models get, the more AI you can use. And there, I think one of the fundamental challenges the United States has had is that for many, many years, our energy usage as a country has been fairly flat. I think it's improved by a small percentage every single year. Then all of a sudden AI shows up and you need a lot more energy, you need a lot more data centers. And then all of a sudden you have this spaghetti bowl of issues which suddenly come up. The first one is, where do we get the energy from? And that is where I think the current answer is absolutely natural gas. That's what I think is going to power a lot of this AI, along with what the President would call clean, beautiful coal. But the future is also definitely going to be nuclear. That's going to be a big part of this. I would say China is ahead because they have just invested in building out their energy grid, building out generation.
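
As a rough illustration of the scaling-laws point, here is a minimal sketch of a Chinchilla-style power-law curve, showing why more compute (and therefore more energy and more data centers) keeps buying lower training loss. The constants, model sizes, and token counts below are made-up placeholders for illustration, not fitted values from any lab.

```python
# Illustrative only: a Chinchilla-style power-law scaling curve. Loss falls as a
# power law in parameters N and training tokens D, which is why more compute
# (ultimately more energy) keeps producing better models. Constants are placeholders.

def loss(params_billion: float, tokens_trillion: float,
         e: float = 1.7, a: float = 400.0, b: float = 1100.0,
         alpha: float = 0.34, beta: float = 0.28) -> float:
    """Predicted training loss L(N, D) = E + A / N**alpha + B / D**beta (toy constants)."""
    n = params_billion * 1e9
    d = tokens_trillion * 1e12
    return e + a / n**alpha + b / d**beta

if __name__ == "__main__":
    for n_b, d_t in [(7, 1), (70, 5), (700, 20)]:
        print(f"{n_b:>4}B params, {d_t:>3}T tokens -> predicted loss ~ {loss(n_b, d_t):.3f}")
```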

04:00:18

They've done a lot of work on new nuclear and on building out the grid that transmits power. We, on the other hand, have work to do across all these fronts. I'll talk about what we are doing as this administration. But one, we need more energy. Second, we have all these outdated, broken rules and laws which stop data centers from being constructed, a lot of it from, I would say, completely nonsensical climate concerns. We need to get rid of the red tape, and we need to do what the President called build, baby, build. Right. We need to build these data centers, we need to build energy, and we need to figure out a way to upgrade our energy infrastructure. And we have this spaghetti mess of issues. So this administration, we did a bunch of things to attack this. There's an executive order which has come out which is tackling nuclear, which said, I forget the exact number of years, but I think for 30, 40 years the NRC, the Nuclear Regulatory Commission, hasn't approved a single reactor.

04:01:28

And I think there's a whole future there. I know you had folks from the nuclear industry here with SMRs and so on. I know it's a little bit of time away, but we had an executive order which basically says, look, nuclear definitely has a strong future in America. In terms of the present, though, we need energy now. And this is where the President set up something called the National Energy Dominance Council, the NEDC, which brings together the Secretary of Energy and the Secretary of the Interior, and we've been working closely with them, and it basically tries to attack all of this. We just say, how do we remove the red tape on building data centers, on permitting, on regulating? What are all the things that we can do to just get more data centers built, more energy going? So just on the scoreboard front, though, this is one where China's ahead. We've obviously done a lot to catch up, and I think we're going to surge ahead, but for the last four or five years they've just been on a great trajectory, and I think we have a lot of great work to do here to catch up and obviously exceed them.

04:02:33

The second part is chips. So chips are super interesting and we should spend a lot of time on them. I see on your shelf you have Chip War by Chris Miller out there. On chips, we are ahead, but maybe not as much as people think. When I talk about chips, there are multiple layers, but essentially for AI right now, the most important ones are the GPUs, which are built by people like Nvidia and AMD. And then of course you have Google, which has built their own hardware in terms of TPUs, and then Amazon has their own hardware. But Nvidia, AMD, all these companies are obviously ridiculously important, and they are right now really far ahead of the latest and greatest from China, which has companies like Huawei, which builds a product called the Ascend, or other companies like Cambricon. Now, why are we ahead? There are multiple answers, but part of it is because we have access to better technology from TSMC in Taiwan, which I know you're very, very familiar with. We have access to much better software which runs these GPUs, and we just have a lead on them now.

04:03:48

It turns out, though, in this case, China has done a lot of work on catching up. And instead of having a multi-year lead, I think our lead is much, much smaller. And right now what China has been doing with companies like Huawei and Cambricon is really worth paying attention to. A few months ago, and this is, I think, a very interesting one to look at, Huawei came out with this product called CloudMatrix 384, and it's worth googling and looking up. And the reason why this is important is, if you think about these data centers, right, if you take, say, for example, OpenAI or Grok, they have a bunch of Nvidia GPUs inside them and they have them in these clusters, right? Like, if you have an H100, which is what a lot of people use, you have eight of them clustered together and you have maybe several hundred thousand of them. Or more recently, Nvidia has come up with this product called the Blackwells, where you connect 72 of them. But you kind of cluster these, and the idea is the more of these you bring together, the better it is for training a model or inferencing from a model.

04:04:55

CloudMatrix 384 is interesting because people believed that China was way behind in building GPUs. And what they did was they built a cluster where you take 384 Ascend Chinese GPUs. Now, each of those uses way more power than Nvidia's, they are not as fast, but it doesn't matter, because guess what? China has more power. They don't care as much as we do. Second, Huawei has really good networking technology, they're a networking company, so they're able to basically connect, if I'm oversimplifying, a lot of not-as-great GPUs to try and compete with much more powerful GPUs. And I think of this as an interesting example, just like DeepSeek, of when we try and say we are not going to let the world have our technology or let China have our technology, they have often been very good at working around it in other very creative ways. So that's number two. But I think on the chip side we still have an advantage in the performance of each GPU and also in how many we can make. For a lot of reasons we can just make several million of these, and China is definitely a lot more hamstrung in how much they can make.
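
Here is a back-of-the-envelope sketch of the rack-level tradeoff being described: many weaker, power-hungry accelerators networked together can still rival a smaller number of stronger chips in aggregate, if you can afford the electricity. Every number below is an illustrative placeholder, not a real Huawei or Nvidia specification.

```python
# Illustrative rack-level comparison: fewer fast, efficient chips versus many slower,
# power-hungry chips networked together. All figures are made-up placeholders.

def rack(name: str, chips: int, tflops_per_chip: float, kw_per_chip: float) -> None:
    total_tflops = chips * tflops_per_chip
    total_kw = chips * kw_per_chip
    print(f"{name:<22} {chips:>4} chips  {total_tflops:>9,.0f} TFLOPs  "
          f"{total_kw:>7.1f} kW  {total_tflops / total_kw:>7.1f} TFLOPs/kW")

if __name__ == "__main__":
    rack("72-chip cluster",   72, 2500.0, 1.2)   # fewer, faster, more efficient chips (assumed)
    rack("384-chip cluster", 384,  700.0, 2.0)   # many slower chips, far more power (assumed)
```

In this toy example the bigger cluster wins on raw aggregate throughput only by burning several times the power, which is exactly the tradeoff a country with abundant, cheap electricity can accept.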

04:06:08

But we are ahead. Models are very interesting. On models, if you talked to somebody on January 15th or January 18th, they would say, oh man, American models are way ahead. We had OpenAI's o1 at the time, we had Claude 3, we had, I think, Grok 2 at the time, and China didn't have anything. DeepSeek totally, in my mind, demolished that idea, because all of a sudden they were not ahead, but they were very close. And one of the phenomena that happened in AI is this idea of distillation, which is that you can take a really powerful model and you can distill it to make a smaller, not-as-powerful model that is just slightly behind. TL;DR, what happened is China proved to us that they can build really, really good models, in some ways surpassing what we have. I was at a developer event in San Francisco recently and I asked the crowd: how many of you are using a Chinese model like DeepSeek, Qwen, GLM, there's a bunch of others? And almost everybody in the room put up their hands, and I was like, whoa, this is not good.
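
For readers who want the distillation idea in concrete terms, here is a minimal PyTorch-style sketch of the standard knowledge-distillation loss, where a small student model learns to match the temperature-softened output distribution of a larger teacher. The tensor sizes and temperature are arbitrary toy choices; real pipelines distill over huge prompt corpora and combine this with ordinary training losses.

```python
# A minimal sketch of knowledge distillation: a small "student" is trained to match
# the temperature-softened output distribution of a larger "teacher". Toy-sized
# tensors only; real pipelines run this over enormous prompt datasets.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """KL divergence between temperature-softened teacher and student distributions."""
    t = temperature
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    # Scale by t^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * (t * t)

if __name__ == "__main__":
    vocab = 50
    teacher_logits = torch.randn(8, vocab)                       # outputs of the big model
    student_logits = torch.randn(8, vocab, requires_grad=True)   # outputs of the small model
    loss = distillation_loss(student_logits, teacher_logits)
    loss.backward()                                              # gradients flow to the student
    print(f"distillation loss: {loss.item():.4f}")
```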

04:07:23

Why is this not good? Number one, it is soft power. China having a technology platform which is now running inside, hopefully not American infrastructure, but American companies, and definitely running in global companies, instead of our models. Second, these models also communicate culture. Think about my story. I grew up on the Internet, I grew up on the English Internet, and I absorbed a lot of American culture just because of the prevalence of America winning the Internet. But if China dominates the model race, if you look at DeepSeek, it won't really answer certain questions, right? And we don't want that to be the dominant ideology and values around the world. You asked about people asking models very personal questions. Think about a world where people ask a Chinese model very, very personal questions. I'm not sure I would want that. So on the model side, I think they've caught up very close behind. I think they're ahead on open source, and we are catching up quickly; there have been some great new launches recently. We are still very much ahead on closed source models, the latest and greatest GPT-5, Gemini, Grok. We are still ahead, but it is a much closer race.

04:08:45

The last and maybe most interesting part of the race is what I would call diffusion, which is: how are people using AI? Because at the end of the day, AI needs usage to get better. One of the reasons ChatGPT became really good is because when people used it, you could use that feedback mechanism to become better. This is very important not just for AI today, but for the future, like robotics. One of the things which is going to be key to getting robots, American robots, all over the world is: can we get that robot's usage data, either from simulation or the real world, and make it better? So I think there's a race right now over who can get their AI to spread faster, to be used faster. Right. And historically, when you grew up, did you use a Windows laptop or a Windows computer? Do you know who Windows' competitor was in the 90s? No, nobody remembers. There were companies like IBM with OS/2 and others; they all got crushed because Microsoft Windows, along with Intel, dominated. Why? They got everybody to use it first, and they got all the developers to build applications on it first.

04:09:56

You probably used Office, you used games, you used Netscape. Everyone used Windows. So when you get everybody to use your stuff, you get this ecosystem flywheel, right? More smart people start building on your stuff, your stuff gets better, more applications make your platform better, and on and on and on. Right. And right now we have a window of time where we can make American AI the default around the world. And if we don't, China will try and make their AI, their chips, their models, and in the future their robots, the default around the world. And that, for me, is the heart of the race.

04:10:36

So with all the concerns and all the different sectors, I mean, it sounds like we're ahead on chips and software, but we're behind on energy. And energy seems to be, with the little I know, you know what I mean, energy seems to maybe be the most important factor in AI and data centers and all this other stuff. So what are we doing specifically to unleash it? I've dived into the power grid, the vulnerabilities. I mean, the neglect of our energy system, our power grid, is atrocious, and it's been that way for years and years. And if China's ahead, what are we going to do to unleash nuclear power? It seems like that's the key. And we have all these innovators. We talked to Isaiah Taylor, who's been building the mini reactors. We talked to Scott, who's enriching uranium. But we need to go faster. We need to go faster. And all these guys are very impressed with the current administration and getting rid of some of the red tape. But I still don't feel like, I mean, if China has zero red tape, how the hell are we going to compete with our power grid?

04:12:07

I mean, we talked to Baiju Bhatt, you know, the founder of Robinhood, and he's trying to beam solar energy in from space to receivers that are gonna convert it into the grid. I mean, we've got all these amazing ideas, but it's just not going fast enough, in my opinion.

04:12:30

Well, I think I totally agree with you. This is one of those things where we just need to move as fast as we can. So I would think of it in a few layers. I absolutely think nuclear is the future, but I think we're still a few years away from getting there. We have data center needs right now, like today. What is stopping the training of the next large model? Well, we need a larger data center, and we are seeing entrepreneurs all the time. Like Elon, for example. If you see how he built Colossus, it's in Memphis, right? He buys this old, I think, Electrolux factory, and then he drives in all these generators on all sides and uses Tesla Megapacks to basically, you know, kind of augment it, writes some code to even it all out. And it just is one of these amazing feats of engineering. But the point being, we need energy right now. So just on the nuclear front, it's absolutely the future. The administration has an executive order out which basically tries to clear the red tape on the permitting for all things nuclear.

04:13:43

But my sense, and I'm not like a deep nuclear person, my sense from talking to the best people in this is that we're still looking out a few years.

04:13:50

Whether you're juggling tasks or trying to stay clear-headed throughout the day, Ketone IQ delivers clean brain fuel that can help you think sharper, longer and smoother. No caffeine, no crash, no overstimulation. Thanks to the folks at HVMN for sending me their Ketone IQ product to try. I really like taking Ketone IQ before I work out. It's not an energy drink, but it gives me a ton of energy. I wish I had this when I was on active duty. When I take it I have more endurance, but without the crash. Ketone IQ uses ketone diol for a fast-acting, natural, slow-release effect with no artificial sweeteners or fillers. It helps support high-focus tasks by directly powering neurons and stabilizing cognitive output. And it's military tested, originally developed to support elite cognitive performance in the field. HVMN has an amazing offer just for my listeners. Visit ketone.com/srs for 30% off your subscription order, plus receive a free gift with your second shipment, fun surprises like a free six-pack, Ketone IQ merch and more. These statements and products have not been evaluated by the FDA. These products are not intended to diagnose, treat, cure or prevent any disease or condition.

04:15:11

There are a lot of choices out there when it comes to cell phone service, and it feels like more are popping up all the time. But Patriot Mobile isn't just another option. They're different: a company built by people who actually share your values and who are committed to doing things the right way. They're also ahead of the curve when it comes to tech. Patriot Mobile is one of the only carriers with access to all three major US networks, which means reliable nationwide coverage. You can even have multiple numbers on different networks, all on one phone. A true game changer. They offer unlimited data, mobile hotspots, international roaming, Internet backup, Android Automotive and more. Everything you'd expect from a top-tier carrier. And switching couldn't be easier. Activate in minutes from home, keep your number, keep your phone or upgrade. If you want to switch, go to patriotmobile.com/SRS or call 972-PATRIOT, and don't forget to use promo code SRS for a free month of service. That's patriotmobile.com/SRS or call 972-PATRIOT.

04:16:16

The game right now is gas and gas turbines, and there's a challenge there on the energy production side. Then you have the energy grid side. To your point, we have a very old grid, one, in infrastructure capacity, and second, we have all these local state utilities and monopolies with weird regulation which makes it very, very hard to take power from one state to another; there's a whole cluster of issues there. So we came out with this document a few weeks ago called the AI Action Plan, which is the entire AI strategy for America. I spent a lot of time on it, a lot of others have spent a lot of time on it, and the President announced it. And in that, one of the top priorities we talk about is, one, removing the red tape for data center construction, with things like: how do we make sure, with NEPA, the National Environmental Policy Act, that we find carve-outs so that we can get data center construction going, right? Let's just get the red tape out of the way. Let's go build, build, build. There are directives in there on figuring out what to do with our grid capacity, figuring out incentives on energy.

04:17:33

I think the President in his first week announced this large project called Stargate, in terms of getting more investment and getting more of this data center construction going. So in my mind, nuclear is the future, and I think we've done a lot to clear out the red tape, but I have a race to win right now. Every entrepreneur is like, look, I need to get 2,000 more GPUs unlocked, what do I do? So we need to unlock that. And I think unlocking data center construction, grid capacity, clearing the red tape so the power can come to the data center, finding smart, really innovative ways to go do that, that's the game right now. And I don't want to take credit for this, because I do think the Department of Energy and the Interior have done a lot of great work on this. They are very, very focused on all of it, centered around gas and coal.

04:18:33

Moving on to China. Chips. I mean, you say we're ahead on chips. I think it was, was it April of this year? We said no chip sales to China from Nvidia, was that correct?

04:18:48

Not really.

04:18:48

In August we said, I believe it was August, right, it was last month, August. We said Nvidia can sell chips to China, but we get 15% of the revenue. Correct? The H20 chips, which are the high-end chips.

04:19:04

Correct.

04:19:05

Was that a mistake?

04:19:07

So actually there's a few corrections in there. A little bit of history. Nvidia makes a whole set of chips, and the latest, greatest generation is called the Blackwells; they usually start with the letters G and B. The previous generation was called Hopper; those started with the letter H. The top of the line was the H100, which honestly is what a lot of people have right now, or the H200. They're kind of the top of the line in America. Now, about two and a half years ago, the Biden administration came out and, remember what I said, the Biden administration had all these mistaken beliefs, and one of the beliefs I think they had was that, one, our allies can't have our AI chips, and that China can't ramp up chip technology. So because they believed that, they said, number one, we're going to put limits on the power of the chips, the flops of the chips, that China can get. And then we're going to slice the world into three categories. This is called the Biden Diffusion Rule, where some countries, like the USA and UK, can get any number of GPUs they want.

04:20:17

About 100 countries couldn't really get many GPUs without going through a crazy amount of red tape. And then some would just not get any GPUs at all, like Iran, North Korea, you know, the countries of concern. And as a result, a couple of things happened. One was that a lot of our allies were like, well, we want to use American stuff, but you guys won't work with us, and they were just left out in the cold. A good example is the Middle East, where those countries were like, we want to bring AI to our citizens; what is a way we can get your GPUs? And the Biden Diffusion Rule said there is no way you can ever get our GPUs. That's number one. So we essentially turned our back on our allies all over the world, because we thought, number one, there's a supply constraint in GPUs, that if a GPU ever goes out of America, it means one less GPU for an American company. Second, we thought China will not compete on chips. Third, we thought AGI can only happen in America.

04:21:24

It's a scary, scary thing. So we came in and we were like, this is just wrong. Because, number one, we are making a lot of GPUs. If you want, you can go get an H100 right now, an H200 right now, because it turns out that all these semiconductor companies, when they find a supply constraint, are very, very good at innovating and working around it. They are very capitalistic; they want to make money. So number one, if we ship a GPU to somebody else, that doesn't mean one less GPU for America. Number two, as I said earlier, I don't think we worry about AGI exploding anymore, we talked about that, and I don't think we worry about takeoff anymore. I'm not so much worried about an allied country building a superintelligent AI ahead of us; I just don't think that is a realistic possibility. So we said, first of all, to our allies: we're going to tear apart this 200-page crazy document called the Diffusion Rule. First I want to talk about the Middle East, then I want to come to China. President Trump, on his first state visit, went to the Middle East, and we struck these deals called the AI Acceleration Partnerships.

04:22:32

And the idea was that we will sell you these GPUs in return for investment in America, and with ironclad security provisions to make sure that those GPUs are going to stay where we ship them. They're not going to go to some other country, and we don't want to have somebody we don't like accessing them. And the idea behind all of this is, we want our chips, our models, to be used the world over. We want to be like Windows, we want to be like Intel, and we don't want to risk competition. Because in the meantime, what wound up happening is Huawei has been ramping up chip production. So we now have a competitor who has a good product. They built CloudMatrix, they are building Ascends. There was a story in Bloomberg last week about Huawei exporting chips, or trying to export chips, to multiple countries around the world. So if you think of American AI as a product, we have a competitor who's out there selling a competing product. So our stance is that when it comes to our allies, we want them to have American AI, our chips, our models, in return for investment in America and with the right security safeguards in place.

04:23:48

And I always, always underscore that it is in the terms that we signed with, for example, a couple of these Middle Eastern countries. Now, hopefully that's kind of clear on why we want our allies to have our technology. China is an interesting, different case. There are two schools of thought on China. One school of thought is what you said, which is we should just deny them any chip, any GPU, right? And the idea there is that, hey, if we deny them anything, then, number one, they're not making their own GPUs, they're not really making their own models. That was the original thinking. So we will just win this race to AGI; we will be far ahead. It turns out that was fatally flawed, okay? Because, number one, China's making chips, they're innovating, and they're building great models, and they're going knocking on other people's doors, like, hey, America is not selling to you, the Biden administration is not giving you any of their technology at all, maybe you want to work with us, okay? And so we think the answer is: we will always keep the latest and greatest GPUs for the United States in huge quantities. That is irrevocable.

04:25:00

We'll be the only country which can build these massive, massive superclusters of GPUs. But on the other hand, we want to make sure that we are not giving Huawei, Cambricon, all these Chinese competitors oxygen. Oxygen of growth, oxygen of production, revenue, and usage, right? Because when somebody uses a GPU, they're writing code on it, they're finding issues, they're finding bugs, they're making it better. So what is the answer? How do we thread this needle between let's keep our big stuff, but let's not give the competition any oxygen? And I believe the right answer is: let's ship them technology that is ahead of what the competition has, but way behind, both in individual performance and also in quantity, what we have. So that's the principle and the framework. So if you follow that, where does the H20 come in? The first thing about the H20, because some of these Biden staffers like to talk about this, is that Biden never banned H20s, ever. They were completely allowed all the time through the other administration. Because what happened is, when Biden's team built out these rules, Nvidia went out, and by the way, Nvidia has the H20, but AMD has something called the MI308, which is the equivalent of the H20.

04:26:24

They said, okay, both of these companies, we are going to build a nerfed version of our GPU. You know what nerfed means in gaming lingo? It's a less powerful version of our GPU, which is way behind the latest and greatest America has, and we are going to ship this only to China, because it will keep us ahead of what Huawei has so we can get these Chinese customers using it, but it is way behind what America has. Nvidia built this, AMD built this, and the Biden folks were completely okay with this. So we came in, and the first thing we said is, one, we tore up the Diffusion Rule for the rest of the world. When it came to China, we said, okay, we need to know exactly what is going on when we ship GPUs to China. So we brought in a licensing regime. So every time you see a headline that says President Trump banned H20s in April or March, not true. We brought it under a licensing regime, which basically said, look, if you're going to export this, you need to ask us for a license first.

04:27:26

Actually, the technical term is an "is informed" letter we send to these companies: you need to come ask us for a license first, right? So we know how much we are shipping and who's getting it on the other end. Right. And now, as you mentioned, a couple of weeks ago, my partners in the Commerce Department, who are actually in charge of the licenses, and they can take all the credit for thinking through all this, they, along with the President, said, okay, you know what, we're going to get a great deal for the American people. We're going to start approving some of these licenses for these much older, lower-performing chips, way behind what America has, in way smaller numbers than what America has. And we're going to get a great deal for the American people. So that's the history, but that's all the past. I like to think about the future, because the H20 is now two and a half years old, and every company has a new generation roughly every single year. And one of the amusing things for me is we are still talking about this, even though since the time the H20 was built, we have been through two iPhone generations.

04:28:27

It's like, I don't know what the latest iPhone is, but we're talking about the iPhone 11, and we are several generations behind. This conversation is going to keep coming up again and again and again. So what is the right long-term strategy? In my mind, it is, one, we need to flood the zone around the world, with our allies, with American technology, starting with American GPUs. Why? You know the old business model of Gillette, which is you sell them a razor but you make money on the blades, you remember that? GPUs and models are a bit similar. If you are a country and you spend a few billion dollars buying American chips, guess which models you're going to use? Probably American models. You're going to make both those models and the chips better by using them. When you build your next data center, you already have all these American models, so what are you going to do? Probably buy American all over again. Now, if we refuse to do business with you, you're probably going to go to a competitor at some point in time, because AI is just too important.

04:29:26

We just can't keep telling people to pound sand every single time they knock on the door and say, I want to bring my citizens AI. So one, we should flood the world, our allies, with American technology, with GPUs, with models. Again, America will have an overwhelming lead in the quality and the quantity of these GPUs. We'll be the only ones building at this scale. You know, Elon, I think, is going to build a million-plus cluster this year, and I think Google has way more TPUs. We'll be the only country which can build these massive ones. But if another country is like, hey, I want to bring my citizens education, I'd much rather them doing it on an American GPU, running an American model, generating American tokens. So that's for our allies, right? And we can talk about who the allies are. For China, our belief is the right strategy is to find a way where we keep the latest and greatest but, you know, ship technology. So if they want to build a chatbot, if they want to build education, I would much rather have them use an older, nerfed, lower-quantity version of our GPUs rather than buy from a Chinese company and then keep improving it.

04:30:43

And the reason is, if we do that, what happens? Let's say we ban all American technology to China; we say you can't get anything. Well, they are going to say, we need to accelerate all of our indigenous chip-making efforts because you guys left us no choice. This happened before, by the way, about 10 years ago. We stopped exporting supercomputing chips to China, and within a couple of years the Chinese indigenous supercomputing ecosystem just exploded, and within a few years they had much better supercomputers than the chips we had been exporting to them. So if we force their hand, they're going to build out an alternative technology stack and then they're going to start exporting it. They are going to go to other countries because we are being difficult to do business with. In that Biden scenario, those countries are like, you know what, buy a Chinese chip, and it's going to come preloaded with DeepSeek on top of it. So that's the scary scenario. So I think the right answer is we ship China older, smaller-quantity chips, in enough quantity so that we retain the latest and greatest, we retain these superclusters, but it is enough to make sure the competition there does not get oxygen.

04:31:54

Now, I understand everything you're saying. You know, I know you're very aware of the China-Taiwan conflict. So just the fact that we are sending them older technology, older chips that aren't up to snuff with what we have, do you think that's more motivation for them to make a move on Taiwan?

04:32:22

I don't want to speculate. It's hard to say because.

04:32:28

Because then they cut us out.

04:32:30

Well, my belief, and I'm not a deep geopolitical expert on Taiwan, I'm much more familiar with the semiconductors, is that the motivation there is more about sort of how they view history. But I think it's a good question, which is, there are other motivating factors, right? And I'm not the expert; you've had a lot of great guests on this. Number one, the Biden rule definitely amplified that, because if we go tell people, pound sand, you're not getting anything, you sort of have to find ways to find alternatives. We are relieving that pressure, right? And again, I keep emphasizing it: we have the latest and greatest, and we'd be the only country who can build these million-GPU clusters that any one of these model companies have. But at the same time, I want to make sure that they get enough to stop them feeling like, oh my gosh, we're just going to go full-on out to build our entire ecosystem and go talk to somebody else.

04:33:34

So does it take away the motivation? I don't think so, but it's definitely better than the previous alternative that we had last year. I do think you bring up another interesting question, which is the Taiwan TSMC question, which is that we as a country, I guess the world, essentially has a reliance on a couple of companies who are essential. One is ASML in the Netherlands, who build these lithography machines, and the second is obviously TSMC. And if something happens in Taiwan, any number of situations could happen, but it probably means we don't get iPhones, we don't get AI chips; it's not going to be good. And I think this is where the TSMC project in Arizona, all of President Trump's efforts to onshore these capabilities, the recent deal that President Trump and the Commerce Secretary did with Intel, all kind of play a part. This is such an important capability that we as America need to have on our soil, and we need to find ways to bolster our entire chip manufacturing supply chain and not be so reliant on one single point of failure. So when I think about TSMC, I'm also thinking about all things Intel.

04:35:08

I'm thinking about how we get this fab construction in Arizona going faster.

04:35:19

How is that factory going? I mean, from my interview with Hsiao Bi-khim, the Vice President of Taiwan, it sounds like the tariffs may have had some implications on the speed of that.

04:35:34

Well, I think they went a lot faster than under the Biden regime, for sure. But I think there's always space to do more. One of the things we have really tried to emphasize is that we just need to accelerate our indigenous supply chain. And I don't have the latest numbers on how that fab production is going. I believe they've increased their capacity in some ways from what it was two, three years ago, but we need to do more. I believe even if the fabs are at their current rate or full potential, it's not gonna be sufficient to make up for all the things that TSMC winds up making. So we need other answers. We're gonna need Intel, we might need other answers in there. I think this is gonna be one of those interesting questions over the next few years: how do we build out these sort of sovereign chip-making capabilities? So that for me is one. But I want to come back to the Chinese Huawei question, because I do think it's important. One of the things top of mind for me right now is, how do I stop?

04:36:41

How do I make sure that if somebody out in the world is building an application, building a critical piece of infrastructure, they are picking our chips, our models, writing applications on top, creating a moat for us, and not picking the other team? That is very much top of mind for me right now.

04:36:58

I have an idea on how to speed it up.

04:37:01

Please. Yes.

04:37:02

You know, and you'll have to fact check me on this through my other episode with Hsiao Bi-khim. But one of the hurdles that I think frustrates them is, you know, they're very concerned, obviously, about China taking them either by cognitive warfare or actual kinetic warfare. And they purchased several planes, jets, military equipment from us, but through our red tape and our bureaucracy, and I can't remember exactly, somebody's just gonna have to fact check me, I can't look it up right now, but they bought these weapons like five years ago and they just got their first F-16. And that's created a lot of frustration over there. And so if there was a way to get through the red tape and get the stuff that they had purchased from us to them in a faster way, I think that would help motivate them to help us build the chip factories in Arizona.

04:38:11

Okay, not super familiar with that. I need to go back and do some homework on this.

04:38:15

Yeah, yeah, I mean, I know that, but they're frustrated about that, and rightly so. I mean, you shouldn't have to buy something five years in advance before you get it. But so, how far along are we in that facility?

04:38:34

I wish I knew this off the top of my head. I believe, man, I think they faced a few challenges in the first few years around permitting and workforce. I believe they've just entered production recently, but we just need to get a lot more going there.

04:38:54

So it's up and running.

04:38:55

I think so.

04:38:56

Okay. Okay. Well, that's good to hear.

04:38:58

Yeah, I think so. Great to hear. But I don't want to get fact checked on this. The way I understand it, and I'm going to be spending a lot more time on this because I've been spending time on some of these model questions, is that it was in this really slow phase for a little bit of time because they had a lot of issues with finding the right skill set. I think at some point they had trouble finding enough electricians and other skilled technical people. But I think it's ramped up now. I just don't have the latest and greatest on exactly where they are right now.

04:39:35

Could you paint a picture of what it looks like, not only for the US but for the world, if China does win the AI race? What are we looking at?

04:39:42

Oh, man.

04:39:44

Oh, man.

04:39:45

Yeah, well, look, there are multiple timelines AI could take. I think it's pretty obvious that AI is going to have profound impacts on the economy, on productivity, just making people's jobs better, on drug discovery. Imagine a world where China just dominates drug discovery. Imagine a world where every Chinese individual is able to be smarter, more productive, get more done than what we can do. Imagine Chinese companies being able to build faster, being able to innovate faster, and that flywheel just picking up momentum. Imagine a world where our allies are using Chinese technology. This has actually happened before with 5G, where a lot of the world wound up running on 5G technology from China instead of something from a Western country, and that caused a whole set of issues. So imagine a world where AI is so much deeper. You mentioned it: knowing people's personal issues, knowing how your business works, knowing your numbers, helping you make critical decisions, all of this being run on Chinese models, and the influence and the power that would have. Imagine a world of robots building inside factories, inside people's homes, all powered by Chinese software and hardware all over the world.

04:41:30

And again, this is, for me, a nightmare scenario. We are not going to let this happen. But that was, I think, in some ways, the path we were on.

04:41:38

Yeah, I mean, I talked a lot about this with Alex Wang as well, and he had talked about the number of AIs against the number of AIs. So we would need more. You know, if they have 100 AIs and we have 200 AIs, then they could allocate, I don't know, 33 AIs to a surface warfare vehicle, and if we have 200, we could put 66 against their 33.

04:42:04

I sometimes find those a little too theoretical.

04:42:07

Okay.

04:42:09

When we talk about AI, I often want to ground it in terms of how it makes the regular person's life better. And I just think about an individual who can just do more. They have an Iron Man suit: they're smarter, they're better informed, they can get more done, and because they're getting more done, they can work more effectively together. They have an always-on smart colleague in an AI who they can ask things, so their companies can innovate faster, build more. So sometimes I worry that we get lost in a little bit of the theoretical elements of AIs versus AIs. Maybe that's true; it's hard for me to tell with high levels of confidence how that plays out. But I do have a high degree of confidence that AIs will need human beings and need to augment human beings. And I think about how we make sure we are augmenting every single American, helping every single American every day with AI, in going to work, with their families, with their hopes and dreams, much better than the other team. And I also don't want that to happen using Chinese technology.

04:43:20

Yeah, yeah. I never thought about that. When we were talking about open source and DeepSeek, I mean, if that's the open source model, then the entire world uses that, maybe except us. And so they're using a Chinese model. I totally understand that. But what I'm getting at is, what are we doing about open source? You've unlocked the fact that everybody can use our stuff, but where are we with open source?

04:43:53

The first thing we did is we basically said, stop scaring people about this. This is a good thing. And this is very important. If you go read the AI Action Plan that I mentioned, which is the official document, it is literally in the first paragraph: we want American open source to win. And the reason this is important is that when the government takes a symbolic tone, it sets a tone for everybody in terms of, do we like this or not. So that's number one. Companies would talk to us. We'd have companies who would tell us, hey, I want to build this open source model, but the previous regime, or this or that state legislation, they were scaring us. They were like, if you guys do this, you'll have all these lawsuits and you'll get sued out of business. What do you guys think? And our message has been, we want you to go out there, we want you to win, we want you to win on open source across the world. And if you look at the last four or five months, again, I think China's done a great job. I think their models are really innovative.

04:44:54

I just want to make sure we give them props where they're due. On the other hand, we now have our own open source models. OpenAI, for the very first time, launched their first open source model. It's called gpt-oss, and it came out three weeks ago. There are multiple startups which are now building open source models again, because when you have the government saying nobody should do this, it's very hard for a startup to go out and raise money, and it's very hard for an academic to work on this. We have come in and said, no, we need this for America. But look, I'm not going to say it's all hunky-dory. This is a place where I think we need to do a lot more. So the Action Plan has a bunch of initiatives in there on how we are going to work with academia on making open source more real, and how we are going to work with industry. So that's one. I want to talk about something else here, which is about laws around AI, because it's very related. While I was talking about open source, I mentioned this thing called SB 1047, and this was a legislation from the state of California which essentially would have destroyed open source.

04:45:59

American open source. What it was, it's a little bit like gun liability. Do you remember when some people said, you know, if somebody gets shot with a gun, the manufacturer should be liable? It tried to do the same with open source models, where it said if somebody does something bad with the model, Mark Zuckerberg or Elon Musk or whoever, that company, should be held personally liable. Which in essence would have just destroyed open source, because no company can really afford to take that risk. And the challenge is we now have every single state wanting to do their own rules around AI, and sometimes around open source. So the President, three weeks ago, came out and said, I think, two important things. The first thing he said was that AI is a national security issue, so it is way too important to just leave everything to the states. I don't know about you, but I don't want California setting the tone for how we use AI all over the country. And there's a Gavin Newsom joke in there, but I won't go there. He did veto the last one. But I don't want to go there.

04:47:07

But, sorry, you don't want any individual state to set laws for the country. And how can that happen? Well, if you have a really important state make a law, the companies will be like, well, we know how to abide by those guys, let's just pick the lowest common denominator. On the other hand, what is China doing? No red tape. Go, go, go, go, go. So President Trump came out and said, and you should listen to the speech, he's like, this is a national security issue, and we need to make sure that we are doing this at a federal level. Very related to this is the idea of copyright. As I said about models, models become better when you have more data and you have more infrastructure. And the Chinese models are training on our data. But if our American models are stopped from training on this data, the other models are going to get further ahead. So I know this is a complicated discussion, but in my mind, I think of the national security dynamic of how do we make sure our guys are allowed to do the exact same things as the competition, because if they are training on our data, our guys should also be allowed to train on our data.

04:48:26

Otherwise we are going to fall behind. So with open source, I think it's not just encouraging open source. There is a cluster of issues which are all connected, which I think for folks watching this, you should read the action plan and you should listen to President Trump's speech when he announced the action plan.

04:48:42

I know you had a very important dinner last night. In fact, three former guests have been on there: Jared Isaacman, Alex Wang, Shyam Sankar. You were there. What came out of that dinner?

04:48:55

Well, I would say so. We had this big event yesterday, which is we had this whole AI Education Task Force meeting with the First Lady, which may be something we should talk about. The First Lady and the administration really care about how Americans get the skills to work with AI. So there was a big event, and as a part of that, there was a dinner with several really, really interesting people. And I think the overwhelming tone, if you look at that dinner, which, by the way, I was not at, I was on a flight to get here. Oh, shit. Yeah. By the way, folks don't even know I had to cancel on you yesterday because I was like, hey, Sean's team, I'm getting pulled into something, I'm going to have to move it to the next day. And then I was like, see, Sean, this was why I had to move to the next day. I'm not just making up an excuse, I'm not being a jerk. But if you look at the dinner, the overwhelming message from everybody was, we just love what the administration has done in cutting red tape, in being pro-innovation under the leadership of President Trump.

04:50:03

So you kind of see that overwhelming message. But it is fun and entertaining in a way only President Trump can be.

04:50:12

I saw that I could be off on this. Did Tim Cook say that he was going to put 600 billion into U.S. manufacturing?

04:50:21

I think that was Zuckerberg.

04:50:23

Was that Zuckerberg?

04:50:24

I think so.

04:50:24

Oh, God.

04:50:25

I think there was definitely a big number from Apple, but I'm not sure whether it was yesterday. I think that was Mark Zuckerberg. I could be wrong, right? I was on a very late plane here, so I wasn't watching the videos. But I think it was Zuck who did that. But I think you're seeing this across the board. With President Trump and his administration, there's one very key message, which is you need to invest and build in America. You see this message through and through and through. There are so many examples. You had TSMC come in and talk about their investments here. You had Tim Cook go to the Oval and talk about investing here. Zuckerberg yesterday. Multiple, multiple companies basically talking about how they are investing tens of billions of dollars in infrastructure all over America. And I think this is a key theme which ties into some of the pieces we talked about. The reason for this is not just chip sovereignty, but the idea that we want to bring back investment and jobs and infrastructure in America. But there are so many examples, from the tech world and not the tech world, of people announcing investment in the U.S. Yeah, yeah.

04:51:38

Sounds.

04:51:39

I think there's a second comment. I think after that, the President looked at Mark and he said, Mark, this is the start of your career in politics. And Mark was like, I don't know about that.

04:51:49

It was interesting to see Elon didn't have a seat at that table.

04:51:53

I don't know. I think he said he was invited. I think he sent somebody else. Don't know.

04:52:00

I wonder why. But, well, Sriram, this was a fascinating interview. And I mean, I'm excited for you and excited for the country, and I love your plan, everything that you just talked about here. And man, I gotta hand it to you. You are probably in one of the, if not the, most important roles right now. I think that this is a lot more serious of a race than most people even realize, the race with China. And so I just commend you for rising up and taking the opportunity. And we're all rooting for you.

04:52:43

And thank you. Thank you for having me. Such an honor. And look, this is such a privilege, such a dream, to even have this opportunity to be a part of the administration, to work for everyone watching this video. For me, I just think every day about the time I have here. How do I make sure, one, we're winning against China, and two, how do I make sure that AI is working for every single American? I keep going back to: we need to make AI work for every individual. Like, I think about my dad. How will he use AI? How will it help him spend more time with family? How will it help him with his job, and everyone out there like my father? I think about both those questions a lot. And I also want to say, I just want to hear from everybody. Part of the reason I wanted to do this is because I think it's super important to just be very, very transparent. So if folks have thoughts and views, hit me up, let me know. We are open for business.

04:53:46

We want to hear from everybody. But most of all, thank you. I've been a fan for a long time. I've been such an admirer of what you've done. I can't tell people enough how much being in this room, seeing these stories on these walls, is a testament to the space you have created. You deserve all the success and you're just getting started. And thank you. This means a lot to me, to be here, man.

04:54:06

Thank you. That means a lot. Cheers.

04:54:09

Thank you, Sean.

AI Transcription provided by HappyScribe
Episode description

Sriram Krishnan is an entrepreneur, venture capitalist, and former senior product leader at tech giants like Microsoft, Facebook, Twitter (now X), and Snap. Born in Chennai, India, he began his career at Microsoft before moving to Silicon Valley, where he contributed to product development at leading companies and later transitioned to venture capital as a General Partner at Andreessen Horowitz from 2021 to 2024, focusing on consumer and enterprise investments.

In December 2024, President-elect Donald Trump appointed him as Senior Policy Advisor for Artificial Intelligence at the White House Office of Science and Technology Policy, tasked with advancing U.S. dominance in AI amid global competition.

Krishnan co-hosted "The Aarthi and Sriram Show" podcast with his wife Aarthi Ramamurthy, interviewing tech leaders and exploring innovation topics. A prolific writer and speaker, he advocates for immigration reform to attract global talent, ethical AI development, and bridging technology with policy to foster economic growth.

Shawn Ryan Show Sponsors:

https://betterhelp.com/srs

This episode is sponsored. Give online therapy a try at betterhelp.com/srs and get on your way to being your best self.

https://bruntworkwear.com – USE CODE SRS

https://calderalab.com/srs

Use code SRS for 20% off your first order.

https://meetfabric.com/shawn

https://shawnlikesgold.com

https://helixsleep.com/srs

https://www.hulu.com/welcome

https://ketone.com/srs

Visit https://ketone.com/srs for 30% OFF your subscription order.

https://moinkbox.com/srs

https://patriotmobile.com/srs

https://rocketmoney.com/srs

https://ROKA.com – USE CODE SRS

https://ziprecruiter.com/srs

Sriram Krishnan Links:

X personal - https://x.com/sriramk

X official - https://x.com/skrishnan47

Website - https://sriramk.com