Transcript of Adam Mosseri Returns (Head of Instagram)

Armchair Expert with Dax Shepard
00:00:00

Welcome, welcome, welcome to Armchair Expert, Experts on Expert. I'm Dax Shepard, and I'm joined by Miniature Mouse.

00:00:06

Hi there.

00:00:08

I'm using Miniature Mouse because our guest has almost mouse in their name. Oh, right. Adam Mosseri. Yes. Mosseri. That's right. Mouseri.

00:00:17

Yeah, it sounds like a European version of mouse.

00:00:21

Like an erudite mouse.

00:00:22

Yeah.

00:00:23

Adam is the current CEO of Instagram. That's right. He's willing to talk about all the things that are scary in a very fearless way, and I admire it, and I thought this was a great, great conversation. So did I. Please enjoy Adam Mosseri. He's an objets man.

00:00:44

He's an ultra-expert. He's an ultra-expert. He's an ultra-expert.

00:00:55

It's lovely in here, right? Let me make a little room for you. It's like a glow-up.

00:01:00

Oh, yeah. We try.

00:01:02

We hated it at first.

00:01:03

Because you got used to the old way.

00:01:05

Yeah, that had a special juju.

00:01:07

How long are you in town?

00:01:09

Just a day. Just a couple hours? Yeah, I got a couple of things. Then it's Nico, my 10-year-old's birthday. Oh, happy birthday. Got to get back for Buster's. The 20th? 21st? The 21st is his birthday, yeah.

00:01:19

You can tolerate Dave & Buster's. I put earplugs in. Yeah. Okay, I was going to say, we have a mutual friend, Eric, who kept wanting to do playdates there with our daughters. Last time I went, I was like, I don't know what's up with my sensory system, but it's way too much.

00:01:33

It's wildly overstimulating.

00:01:35

Yeah, it's too much. It makes me feel like I'm getting attacked.

00:01:38

My eight-year-old wanted to go for his birthday. This was November. My wife signed up. She's like, Cool, we'll take the whole class. Yes, 25 kids. Oh, my God. She's like, and we got them like, no other parents.

00:01:50

She's ambitious.

00:01:51

Talk about ambitious.

00:01:52

She's insane. It ended up being five adults because a couple of parents did come, and we went on the bus. She's a genius. She got them all yellow vests to label them.

00:02:02

That wouldn't have flown at my birthday party. Mom, we look like dorks. They're crossing guards. But they're eight. Eight's a good age. They thought it was cool.

00:02:09

They were just like, I don't care. They didn't even think about it. I counted them off like, one, two, three. I'm like, What number are you? They're like, Oh, no.

00:02:16

Oh, God. This is an eight-year-old birthday party?

00:02:19

Wow, you guys.

00:02:19

They're 10, 7, and 4, but now they're 11, 8.

00:02:23

They're 10, 8, and 5 now. Five, okay. The bruiser is five. He'll be six soon.

00:02:27

What are the names again?

00:02:28

Nico, Blaze, and Elio.

00:02:30

He's great. He's really great.

00:02:33

They are wonderful, and they also are just exhausting. The youngest one is like a brute. Really? Yeah. He plays soccer with his brothers. They don't take it easy on him. They just beat him. Then he goes and plays soccer with his own age. I took him to his older brother's soccer game, and he's like, Oh, I scored a goal. I was like, Oh, cool. How many goals did you score? He goes, I scored 15. I was like, he's clearly lying. He's five. I called his mom, and she's like, Oh, no, he definitely scored.

00:02:59

No, no way.

00:03:00

I tell that all the time on here, which is like, I thought I was so weak and so bad at skateboarding because my brother's five years older than me. Then I got to school and was like, Hold on a second. I think I might be strong.

00:03:11

Someone told me recently, I don't know if this is true, but they told me that a disproportionate number of successful athletes are little siblings.

00:03:17

I mean, that would make sense because you're competing with somebody so...

00:03:21

And you immediately want to. It's like, as soon as they do it, you want to do it, even if you're way too young for it.

00:03:25

Kristen and I ended up on a trip, and it was with all strangers. I didn't know anyone. When we got to the airport, you were there. Yeah. I really felt like, Oh, well, fucking thank God. Adam's here. I know Adam. A friend. I really, really loved him as a guest. I do remember being charmed by you like crazy.

00:03:42

I was really new to the role because it must have been 2019, 2020? I guess.

00:03:46

That you were on?

00:03:47

That I was on. Yeah.

00:03:48

Maybe 2019. I don't know. It was a long time ago.

00:03:50

I was new to the role, so I was still trying to figure out what was what, but I remember having a lot of fun. We hadn't seen each other in years before that trip. Right.

00:03:57

Yeah. I was watching, of course, I did some interviews with you prior to your arrival. I just wanted to start with a little bit of compassion. You have a hard job to talk about. Because you're really at the forefront of a lot of things people are concerned about. It's AI, it's social media, it's politics, it's deep fakes. It's like everywhere you're going, you're pretty much being asked.

00:04:18

Are you sweating? Just hearing all those things? I saw your face go.

00:04:21

He handles it so well. We are at the intersection of a lot of contentious things, and so it is tricky. Before I worked at Instagram, I worked on Facebook for a long time. I remember a really long time ago, maybe 10, 15 years ago now, I just felt like, and we talked about this at the company, that we weren't participating in the conversation, and there were all these conversations happening about tech, about social media. My position was like, Look, we've got to participate.

00:04:46

Whether we want to or not.

00:04:47

Yeah, we can put our heads in the sand, but this conversation is happening, and we might as well participate. I'm going to get out there and I'm going to start talking to people, and I'm going to start traveling. I showed up a lot on Twitter because that's where all the journalists were. I was clear internally. I was like, Look, I'm going to do this, and hopefully it'll work. At some point it won't. I'll say something silly and everyone will hate me, and I'll stop.

00:05:07

That's what happens.

00:05:09

That was the game plan.

00:05:11

But it will be better for having done it for a couple of years before I get destroyed.

00:05:16

Wait, can I say before we get too far that Adam said something very nice about us on Instagram? Did you see this?

00:05:21

He did his wrap-up of podcasts he likes.

00:05:23

But I misread the question.

00:05:25

I know. I laughed so hard. Someone sent it to me and was like, You're his favorite... What's your favorite food? Or whatever the question was.

00:05:31

Yeah, what's your favorite food? I was like, What's your favorite food? I don't know why I had podcasts on the brain. Maybe the next question was about podcasts, so I rattled off a couple of pods I love. Then everyone, the comms team was like, What's wrong?

00:05:41

I laughed so hard.

00:05:43

I just do them between meetings on Fridays. It's not like a production.

00:05:47

Yeah. It's not very well thought out.

00:05:49

No, it's not.

00:05:50

It doesn't need to be. That's okay. I thought it was very funny and flattering. Yeah. To be your favorite food.

00:05:57

I was like, Oh, God, that's not the best.

00:06:00

Yeah, what do we taste like? Are we salty? Are we sweet?

00:06:03

You got a burger and a chili dog.

00:06:06

I think we're savory, I like to think. I do, too.

00:06:08

There's some sweet stuff out there.

00:06:10

I dabble in it.

00:06:11

Well, maybe there's some relish on the hot dog.

00:06:14

There's some relish.

00:06:15

You really laid on the emphasis on the relish.

00:06:18

I think relish came up yesterday. Oh, garnish did.

00:06:21

Garnish, a gritty garnish. That's right. Yeah, you didn't like that, and I thought it was pretty impressive.

00:06:25

I said I'd sleep on it, and I like it now.

00:06:27

Anyways, I'm just really sympathetic because not only do you have to go talk about these issues, so often you're being asked to be clairvoyant. That's really what people want to know. It's like, Well, in one year, what will AI be doing? In two years, what's it going to be? What will Instagram be in this iteration? It's like, you don't know any better than anyone else. I go to these conferences, the fucking people who are designing this shit, they don't know. Most of the good ones will admit that. Nobody knows what the timeline is. They don't know what the ceiling of its aptitude is. No one knows, but we need you to know.

00:06:58

Yeah, that is a tension. I do think some of us in this industry really pride ourselves on being able to put pen to paper on what's going to happen. I tend to find that sometimes feels like false precision. It's a little bit more driven by our own wanting to be able to do that and to be prescient. But I think it's just good to always be honest about what you know and what you don't. Then to also qualify things, not as a way to couch, but to be like, Look, I think this is probably what's going to happen and why. But if this other thing happens, maybe we're going to go left instead of right. The world is changing more and more quickly. It's full of nuance. It's all gray. You have to embrace that ambiguity and that uncertainty. Otherwise, I feel like it's going to bite you.

00:07:42

Okay, so now I also want to... This is the value of having spent a weekend with you. This wasn't clear to me the first time we interviewed you. I know that you come from a design background, but that didn't set in. As I spent time with you, you're very much an artist and a designer. That's what you wanted to do, what you set out to do. That is who you are. It's almost weird you're in this role in some way.

00:08:04

Yeah, it's a little weird. I romanticize the specialist, the amazing architect, or the amazing graphic designer, or the amazing machine learning or AI engineer. But it's just not my shape. I'm not great at anything. I just have a lot of range. That's my strength. Generalist? Yeah, I'm a generalist, for sure. We need that. For me, it's been important, and this happened maybe when I was right around 30, to embrace that about myself. Before 30, I was always trying to be a designer. I'm the son of an architect. My mom's an architect. My brother's a musician. My sister's a designer. So I'm surrounded by... I'm the suit.

00:08:40

You guys are like a Squid and the Whale family. They're like New York intellectual, artistic. Yeah. So I do think it's a bit ironic that you've ended up where you ended up.

00:08:51

Can you tell people what your role is just in case?

00:08:53

Oh, yeah, sorry. I lead the Instagram team, so I'm the Head of Instagram at Meta. I also support the Threads app as well.

00:09:00

We're going to get into how much I made fun of you. We just started it.

00:09:02

My running bit with him for three days was making fun of Threads. He did, but it wasn't as bad as the fact that my wife has not installed it.

00:09:12

Got to keep you humble.

00:09:14

Well, she's at an age where she's like, I'm sure I'm done with the new... That's how I felt about TikTok. I'm like, No, I'm going to sit that out. I can't do it.

00:09:19

Yeah, she's got what she uses and what she doesn't, and she does not care about my world at all. It's great.

00:09:24

But you do have the title of CEO of Instagram, yeah?

00:09:26

Yeah, we say Head of Instagram.

00:09:28

Isn't that funny? They do. All the tech companies, they have their own fingerprint on what they call people. It's very specific. I find the wording very interesting.

00:09:37

That's a very nice "interesting," as my mom would say when she's like, Oh, he's interesting. We are particular and odd in our own special way. For sure, I will acknowledge that.

00:09:46

Obviously, there's enough social science to back up the fact that cultures are really important at organizations. But I think if you're at a certain threshold of intelligence, a lot of it sounds condescending. That's what triggers me when it's like, I'm mopping up the toilet, you're calling me a partner or whatever, a buddy.

00:10:03

I'll give you the context. For us, it's a little bit like, Instagram is not its own company. It's part of a bigger company. And there's pros and cons to that. But there's a lot of pros for us. People move back and forth between the teams or build on top of the same safety systems, the same ad system, etc. And so we try to make sure we use language that embraces the fact that it's a part of a bigger company, not its own company. So that's the inside baseball on this language bit. Also, just because it's important for people to know. I don't want people to be surprised that Instagram is owned by Meta. Meta also owns Facebook and WhatsApp and these other things.

00:10:36

It's also very fascinating how much public opinion can vary within the offshoots of still the same company. I grew up working for General Motors, and that had that as well. It's like a Cadillac is such a different vibe than Chevrolet. A house of brands. Saturn is your progressive younger consumer. It's all General Motors, where the fucking motors are coming out of the same plant. But it's just shocking how many different opinions.

00:11:01

Well, we build up emotional affinity for these different brands. I'm trying to think of a good example. In the world of soda and drinks, how many different companies does Coke own? You might be anti-Coke and pro... I forget which water is theirs.

00:11:13

Sure, Smartwater? I don't know that that's it.

00:11:15

Oh, no, I'm going to get yelled at. Dasani? Dasani. I was just trying to say that. I think it's Dasani. Oh, yeah.

00:11:19

There are some people- I hope it's Dasani. I think there might even be a big war between Aquafina and Dasani. I hate Dasani. Those are the camps. I think they're both Coca-Cola, but Rob will look it up.

00:11:28

Dasani is very salty. I'm saying it. Oh, unless they salt it for us. Just throwing shade. I'm serious. There's salt in it. Electrolytes? No, I do love electrolytes.

00:11:39

Dasani is Coke, Aquafina is Pepsi. Oh, that's a legit battle.

00:11:43

Yes. Yeah, that's real.

00:11:45

That's shocking that I love Aquafina, that that's my pick. I know. Because I bleed Diet Coke.

00:11:50

Oh, yeah. See, again, as you're saying, we're proving your point.

00:11:53

I got through college on Diet Coke. Acquired just did a pod on them recently, actually. Oh, they did? I haven't listened to it yet, but I'm excited to because those two guys are really smart. They also have different recipes for different countries, though, which I think is interesting. Coke in Mexico tastes pretty different than Coke in the US, for instance.

00:12:07

That supposedly is due to whether it's cane sugar or high fructose, but I don't know. I don't want to get sued by either Mexico or the US contingent. The country of Mexico sues Dax Shepard. Okay, so all to say, I'm going to ask you all those similar questions, and they probably will have a tiny bit different bent coming from us. But before that, I just want to double back on the design fact. As I watched you interact with people, and I had to assess from afar what I think your skill set is: you have a very, very intuitive aesthetic that you trust deeply, and you're a great challenger of people in hierarchies, with some savvy that has allowed you to still exist.

00:12:53

Without getting crushed. Yeah, it'll happen eventually.

00:12:56

Yeah, again, like you being in public, this could be the day.

00:12:59

This could be your last interview. Here we go. Microphones are right here.

00:13:01

Good place to end.

00:13:02

You're holding the loaded weapon.

00:13:04

If I'm going to go down, this is a great place to have.

00:13:06

Can you agree with that assessment at all?

00:13:08

Yeah, I'll take the compliment. Design is about problem solving at a fundamental level. I think you can apply problem solving to any industry. You're trying to identify what are you trying to get done, what are the different options, what are the trade-offs? These are pretty standard patterns that you can apply elsewhere. That's the structured thinking that I have to do in my job, even though I'm not designing anything anymore, even though I would like to. That lends itself to being comfortable in debate because you can have a position and a reason for that idea, and you're comfortable articulating that reason.

00:13:43

It's not personal to you, obviously. It's a function. You're arguing how functional this is.

00:13:48

Exactly. But that's the thing that designers, and I'm not an exception when I was a designer, often struggle with, is divorcing your sense of self-worth from the worth of your work. Critique is a whole part of design where you're going to go and you're going to show your stuff and people are going to rip it apart. And you have to embrace the signal and understand that it's not an attack on you. It's a way to support you by improving the work itself. And I didn't learn that properly because I didn't go to design school, really. I learned that from a handful of young designers who went to proper design school, who joined Facebook in the early years, who brought that culture to Facebook. And I learned a lot from them.

00:14:27

Am I wrong that you had a couple of defining moments in your career where you did challenge Mark on something specifically?

00:14:33

Oh, yeah, a couple of things. Yeah.

00:14:34

What were the things?

00:14:35

Okay, a couple of different ones. I'm trying to think which ones I may have told you about, but I might just share other things I'm not supposed to share. Well, look, I think careers are often defined by a few major decisions which may be informed or uninformed. For me, there were a couple. The first one was I was a design director at Facebook. I was managing a bunch of designers. There was this project that became the Facebook phone. All the PM leadership ended up leaving the company for different reasons, some personal, some professional. It was this giant crater where the leadership was in what we call product management, or the PMs. I just declared myself the PM, which you can't do anymore. You have to interview, because that's a reasonable thing to do to have a job. But this was a long time ago, probably 13 years ago. Mark definitely told me not to do it. Mark was like, No, you're a designer. You shouldn't do that. But I did it. He was nice about it. About six weeks later, it was late on a Friday, I was at the office. He was complimenting me, but he said, You're doing a lot better in this role than I thought you were.

00:15:36

It is a very Mark way of complimenting someone.

00:15:39

Yeah, exactly, which is great, which I took the signal. There's been a couple of instances like that over the years, some with Mark and disagreeing with him, some otherwise, where you make these decisions. I wasn't doing it because I thought it was a good career decision. I was doing it because I was working on the project, there was a need, and I just was trying to fill that need. But I got lucky because, like I said before, I'm a generalist, and PMs are basically generalists. You work with the designers, the engineers, the comms people, the lawyers, the policy people.

00:16:08

You're a translator now. You have to be able to speak each person's language well enough to bridge all these communication gaps.

00:16:14

Absolutely. You're a bridge across everybody. If somebody doesn't know something on your team that they should know, that's probably on you. My career, for a couple of reasons, but this was probably the main one, started taking off because I was in a role that matched my skills much better. I was a middle-of-the-pack designer. I made up for it with sheer work, like hours.

00:16:35

Abundance. Yeah.

00:16:37

In terms of raw talent, middle of the pack is maybe even a little generous. I did well, but by sheer force of will. Then when I kept working hard, but my role embraced my strengths and weaknesses, then things started to accelerate.

00:16:53

Okay, so I guess my first question is, how are you guys... Let's just leave what's on the app alone for one second. Just how have you guys been able to utilize AI in the running of the product?

00:17:07

A couple of different things. One thing that we've been doing for a long time is, and this is controversial, is we rank content. We try to show you the content we think you're the most interested in at the top. We've been using different forms of AI for that for many, many years. Another thing we've done with forms of AI for many years is we try and classify content. It could be something positive, is this about a certain topic that you're interested in? Or it could be negative, does this violate our community guidelines and therefore shouldn't be on the platform?

00:17:35

And the AI is sifting through all that material and trying to put those in buckets.

00:17:39

Because there are, however you want to count it, hundreds of millions, even billions of things uploaded a day, we can't have people review all of it. So we use technology to look at content at scale. And to be clear, and this is changing, but historically, I've said this before, people are better at nuance and technology is better at scale. We've had to focus technology on things that are less nuanced, like what our rules are, because technology hasn't been as good at understanding nuance, historically.
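[Editor's note] The scale-versus-nuance split described above could be sketched as a triage pipeline: a classifier acts automatically only when it is confident, and routes borderline cases to human reviewers. Everything below, the `classify` stub, the thresholds, and the names, is invented for illustration; the real systems are obviously far more complex.

```python
# Illustrative sketch only: a toy triage pipeline for the idea that
# "technology is better at scale and people are better at nuance."
# classify() is a stand-in for a real ML model; every threshold is invented.

def classify(item: str) -> float:
    """Pretend model: probability that an item violates the rules."""
    banned = ("spam-link", "scam")
    hits = sum(1 for word in banned if word in item)
    return min(1.0, 0.45 * hits)

def triage(items, remove_above=0.9, review_above=0.4):
    """Act automatically on confident calls; send nuanced ones to people."""
    removed, human_review, allowed = [], [], []
    for item in items:
        p = classify(item)
        if p >= remove_above:      # high confidence: handle at scale
            removed.append(item)
        elif p >= review_above:    # borderline: route to a human reviewer
            human_review.append(item)
        else:
            allowed.append(item)
    return removed, human_review, allowed
```

The point of the two thresholds is exactly the trade-off he describes: automation handles the clear-cut volume, while the ambiguous middle band is the part that still needs human judgment.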

00:18:08

And humans are so crafty at quickly figuring out what is triggering the safety net.

00:18:13

Oh, yeah. It's very adversarial because one challenge we have is spammers who are constantly trying to work around all of our safety protections. You'll have bots, for instance, but bots that don't post every second, 24 hours a day. They pretend to go to sleep. They pretend to go to work. They're inconsistent. They add spelling mistakes.

00:18:31

What's the Japanese word for imperfection?

00:18:33

I don't know it.

00:18:34

This is the second day in a row. You can't get it. Wabi-sabi. It's Wabi-sabi, but it's not Wabi-sabi.

00:18:37

It's Wabi-sabi, but it's not. It's the one that has the gold that you piece together.

00:18:41

Oh, that's a specific art form. Yes, exactly. Where they take broken ceramics and they put it back together with gold lining. Correct.

00:18:48

It's like the imperfection is really the beauty.

00:18:50

That stuff is beautiful.

00:18:51

It has a name, and we're all supposed to know it. It's applicable.

00:18:54

There's a word that's not as helpful in Japanese for Instagrammable, which I learned when I was there recently, which was insta-bae. Oh, really?

00:19:01

You mean it's so aesthetically pleasing, it's worth posting?

00:19:03

Yes, exactly. Okay. I'm torn on how I feel about that. That's the thing. What's happening to us now, though, is we're building AI products that are more natively AI, into the product, into the app. I say product, that's Silicon Valley speak for the apps. But the other thing that's happening to us is we are also getting really disrupted ourselves. How we build, how we write code, how we do research, how we handle data is changing really quickly. We are also having to reinvent how we do what we do, which I think some people outside of the industry don't realize because they're like, Oh, tech is just going to keep doing tech. It's very different, and it's going to get a lot more different over the next year or two. I can tell you some of the things that we're doing now, but honestly, I think it's all changing really fast.

00:19:51

People are very right to be concerned because the power of these products is enormous. We've never seen this before. It is the most powerful thing. It does deserve the most amount of scrutiny. But also, nothing operates perfectly. Everything is iterations built on previous mistakes and correcting those mistakes. The notion, again, that people are going to be able to be clairvoyant... It's even like when I listened to that Rabbit Hole podcast about YouTube and its algorithm of increasing engagement, everyone had great intentions. No one really could foresee, Oh, it's going to radicalize people. They didn't think, Oh, well, what will be more exciting is more and more radical content, and it starts at Jordan Peterson, and then you're a white nationalist. I'm sympathetic to that to some degree. Now, it has to be immediately corrected the moment that's discovered, all hands on deck to fix that problem. But again, there's probably, like what you're saying, this illusion that because you guys are creating the product, you're not also victims of it all as well. No one's really immune to the challenges of this AI growth.

00:20:53

We're not immune to it at all. I would say two different things, one on the industry and the scrutiny and the size. Not only is it big and important and obviously a lot of power in the hands of a small number of companies, it also grew really quickly. There are other industries that are equally important, but they took decades or centuries to get to their scale. I don't know, railroads, the automotive industry. The power sector. Exactly. It was 10, 20 years from not a thing to a big deal. That's tough because society doesn't have time to really adapt or consider it or get comfortable with it. Laws will take much longer to happen as well. We're dealing with that now as a backlog of incoming compliance work.

00:21:36

The second they understand the tech to legislate, it's gone.

00:21:39

Yeah, exactly. But on the second thing, an app like Instagram, you make a lot of decisions as the team running it, and you have a lot of responsibility. But what you are mostly doing in some ways is designing a system that has rules. You could almost think about it as a city. It's like, Hey, where are the roads? What are the speed limits? Where are the traffic lights? But then people fill it and they decide what to do. I'm not trying to pass the buck. That doesn't mean that you don't have responsibility as the planner, but that responsibility is different. It's indirect. You have to set healthy incentives. You have to, in our case, moderate content effectively. Even outside of the world of really high scrutiny areas, the decisions you make about the design of Instagram will affect the vibe, how it feels. Is it more positive or is it negative? Is it more about debate? Is it more about visual expression?

00:22:29

Yes. You're right. It's a great point, the city, because it's like, yes, we'll install these traffic lights at all these intersections. Then people will blow red lights and they'll drive drunk. It's questionable whether now the city is responsible. The light was hung. It is really tricky to figure out where the culpability lies. Yeah.

00:22:48

Look, I just think the responsibilities are shared and they're just different. We have to understand how the system works. We have to set healthy guidelines and rules. We have to enforce those rules effectively and consistently and appropriately. But people could just decide not to open up Instagram tomorrow, or they could decide to try to abuse the hell out of it tomorrow.

00:23:09

But there is an acknowledgement, right, that it's so hard to not open up Instagram. And I mean, good job. But it's built to get you to keep pressing it. But I guess if we're doing the metaphor about the city, it's like, Okay, we hung the light, but we know everyone runs that light. And, Oh, well?

00:23:30

Well, I don't think running the light is using Instagram. I think running the light is posting something that's against the rules. But yes, obviously, we are interested in creating something that people are going to use more. We think if you use it more, on average, it's a sign that it's valuable to you in some way. But there's time well spent and there's time poorly spent. We're not ignorant to that fact. So we try not to use tactics that convince you to use Instagram in ways that will make you regret using it later. And we try to focus more on ones where you're going to be more excited about how you felt about the product later. So we do things like, we don't just look at how many things you like or how much time you spend. We have these "worth your time" surveys, we actually do a lot of these, where we'll ask you, was this thing worth your time? And we see, are people's answers to that question trending up or trending down over time. So it's a balance of different types of goals.
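[Editor's note] The "trending up or trending down" check he describes could be sketched very simply: take weekly batches of survey answers (1 = worth it, 0 = not), compute the share of positive answers per week, and fit a least-squares slope. All names and data below are invented for illustration.

```python
# Hypothetical sketch of the "worth your time" signal described above.

def weekly_positive_rates(weeks):
    """weeks: list of lists of 0/1 survey answers, one inner list per week."""
    return [sum(week) / len(week) for week in weeks]

def trend(rates):
    """Least-squares slope over time: > 0 means trending up, < 0 down."""
    n = len(rates)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(rates) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, rates))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var
```

The slope is a crude stand-in for whatever trend estimation the real teams use, but it captures the idea: the metric is the direction of the answers over time, not any single week's number.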

00:24:18

It's funny, too, because, yes, I'm critical of that, but then when I turn the lens back onto us, I go, like, I don't feel guilty that people would listen to our show six hours a week. We put out six hours of content a week. I would want them to listen to all six hours. That would be my dream because I think what we're doing is good. It's tricky because I believe in it, and I'm not really too concerned with how much time people are spending or wasting on this thing I believe in. It's a little hard for me to then be critical of someone else that has a creative endeavor that they also want people to consume.

00:24:47

It's tricky, but I think you can do both. I think the scrutiny is merited because the importance is there, and I think there are responsibilities. I just think that evaluating how we do is not the same as evaluating how a person behaves because it's a platform. There's lots of people. What you control and what you don't is different. What's appropriate for you to do and not do is different. I don't think you want a tech company in Silicon Valley deciding this topic is the most important topic today or this news event is the most important news event today, and we're pushing that out.

00:25:17

Or a dancing video is irrelevant and a waste of everyone's time.

00:25:20

Exactly. But you also don't want us to be agnostic. I really don't think that you can pretend that we're neutral. We make decisions that affect how people use the product, what they see, and therefore, those responsibilities that come along with those decisions.

00:25:32

Have you guys ever approached this with the Danny Kahneman framing of thinking of multiple selves? For a shortcut, we'll call one the experiential self and one the narrative self. So one is the you that lies in bed at night and evaluates your day, evaluates where you are in this broader story of your life. Am I reaching my goals? Am I the family member I want to be? That's your narrative self. And then the experiential self is like, oh, boob, horsepower, right? That's instantly... the experiential self is a self you have that you're servicing. I can almost think of your "was this worth your time" as trying to strike a balance between that narrative self and the experiential, where the experiential is good, but also it doesn't give you a hangover from having used it.

00:26:20

I talk about the tension as your first order preferences and your second order preferences.

00:26:26

That's also common.

00:26:27

Yeah, I like chocolate. It tastes good. That's a first-order preference. I also want to be healthy, and I want to eat healthy.

00:26:33

You don't want diabetes.

00:26:34

I want to not want the chocolate. That's a second-order preference. I think one of the things that the industry needs to be honest about and struggles with is that we optimize for things that we can measure, and it's much easier to measure your first order preferences than your second order preferences. Did you like this thing in the next 500 milliseconds is pretty easy for us to tell. Did you talk with a friend that night at the dinner, or did you feel good about it the next day when you thought about it?

00:27:03

Did you go to the GoFundMe page and donate money to that cause you were made aware of on there?

00:27:07

You can measure these things. It's just more difficult. I have found in my time in this industry, there doesn't seem to be any correlation between how easy something is to measure and how important it is. But there is a strong correlation between how easy something is to measure and how much we optimize for it, not out of any malicious intent, but just because- That's what's available. That's what's available. What I have to do is always push the teams to get more creative about trying to understand these second-order preferences or this narrative self so that we can actually lean into it. That's why I really love the Dear Algo stuff. Have you guys seen this yet? No, what's that? That's actually not the external name. That's just how I say it. It's called Your Algorithm. Two years ago, there was a meme on Threads where people were writing letters to the algorithm making requests. Stop showing me my high school friend's kids' photos. I do not care. That stuff. That inspired this. Then the tech got to the place where we could do it. It's only in the reels tab right now. But if you go to the reels tab in the top right in the US, there are these little sliders with these hearts.

00:28:08

We'll tell you what we think you're interested in, and then you can make edits. You can say, I actually want to see these things. I don't want to see these other topics.

00:28:17

Wait, where is it? Where's the reels tab?

00:28:20

It's the second tab. Just click on the little...

00:28:23

Oh, that at the bottom. Oh, boy. Okay.

00:28:26

We're going to do it live. All right, this is great. All right, Do you see in the top right, a little icon?

00:28:31

Yeah. Wait, up top there, the hearts?

00:28:34

With the hearts, yeah.

00:28:35

Okay. Click it. I love that we're doing this live.

00:28:39

Your algorithm. Yes. Does it know you? What does it say?

00:28:42

It says, Lately, you've been into- Fashion. Lately, you've been into award show glamor, sweat sessions, and culinary indulgences.

00:28:55

It's pretty good. What are sweat sessions? What is that? No.

00:28:58

Like, workouts. Workouts. I do get served a lot of workouts.

00:29:02

We think you're into them.

00:29:03

Her narrative self knows she's supposed to be working out more, so it's already servicing your narrative.

00:29:07

But you can see there's a list of- Yeah, it says, What you want to see more of based on your activities summarized by AI.

00:29:13

Golden Globes, Fashion, Luxury, Fashion, Fashion.

00:29:16

Monica Fashion.

00:29:17

Apple Podcasts, Los Angeles, Food Scene. Okay. Beauty, Comedy, Food, Fitness, Motivation, Golden Globes. It was pretty good.

00:29:25

But now, let's say we were wrong. Let's say you're not into sweat sessions. You can actually press and move that to the, I don't want to see this thing.

00:29:32

What you want to see less of.

00:29:33

You can get your hands in there and you can change it. Because also, sometimes we might be misreading a signal. You don't really want to see it, but you can't stop looking at it. Yes. Another thing is sometimes your interests can change. This is a tragic example, but one that I have a friend who went through, which is she got pregnant and she was going deep on the nesting thing, and her Instagram got to be all about baby gear. Tragically, she lost the pregnancy. How horrible is it that she's opening up Instagram and we're pummeling her with baby stuff? Yes. So that was one of the inspirations for this, which is she should be able to go in and be like, nothing about nurseries.

00:30:07

I like that.

00:30:09

And so by allowing you to get in there and not only see what we think you're interested in, but just tell us yes and no, that hopefully allows people to express their second-order preferences or their narrative self. Now, not everyone's going to do it, but one way to learn about these things is just to ask. We pick up on your interests, we learn from them, and we adapt. But we don't understand what you're interested in quite as literally as people think we do. For instance, we don't understand that that's a joke about race from a Black comic. It's not nuanced. The way it works is an approach we call embeddings. But basically, the way you can think about it is you can take any video and map it into a space. It's not two-dimensional, but imagine it's a giant two-dimensional map. There are no borders or boundaries or labels. It's just a bunch of videos on a big board. Similar videos end up in similar places on the board.

00:31:01

They concentrate in little areas.

00:31:03

Yes, exactly. Because the way the technology works is similar things end up in similar places. Then if you like one video, whatever it is, we look and find videos near it in that space and show you those videos. What allows us to do the Your Algorithm work is that now we can embed topics into the same map as the videos. We can be like, Oh, okay, cool. You said you're into men's vintage fashion. That's here. Let's find the videos near here. You said you don't want to see bikini videos. That topic is here. We won't show you any videos that are near it, over here.

00:31:40

You're giving it a locus, basically.

00:31:43

Kind of, yeah. We couldn't do that effectively even two years ago. That's actually how a lot of ranking works: we're like, All right, you like this thing. What is similar to it? We'll show it to you. People think it's like, Oh, you know I'm into surfing. It's like, two years ago, it would have just been this giant number. I wouldn't have been able to tell you what it means. Now I can say, Oh, that's pretty close to surfing, or to skater culture, or whatever it is. Yeah.
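The mechanic Mosseri describes can be sketched in a few lines. This is a toy illustration, not Instagram's actual system: the video names, topic names, and tiny 3-D vectors are all hypothetical stand-ins (real embeddings have hundreds of learned dimensions), but the logic is the one he outlines, embed videos and topics in one space, recommend videos near "want more" topics, and filter out videos near "want less" topics.

```python
import math

def cosine(a, b):
    """Cosine similarity: how close two embedding vectors are in the space."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical embeddings: similar content lands in similar places.
videos = {
    "vintage_jacket_haul": (0.9, 0.1, 0.0),
    "surf_competition":    (0.0, 1.0, 0.1),
    "beach_bikini_clip":   (0.0, 0.3, 0.95),
}
# Topics are embedded into the same space as the videos.
topics = {
    "mens_vintage_fashion": (1.0, 0.0, 0.0),
    "surfing":              (0.0, 1.0, 0.0),
    "bikini_videos":        (0.0, 0.2, 1.0),
}

def rank(want_more, want_less, block_threshold=0.95):
    ranked = []
    for name, vec in videos.items():
        # Hard filter: skip anything embedded near a muted topic.
        if any(cosine(vec, topics[t]) >= block_threshold for t in want_less):
            continue
        # Score by proximity to the topics the user asked for.
        score = max(cosine(vec, topics[t]) for t in want_more)
        ranked.append((score, name))
    return [name for _, name in sorted(ranked, reverse=True)]

feed = rank(want_more=["mens_vintage_fashion"], want_less=["bikini_videos"])
# The bikini clip sits near the muted topic and is dropped entirely;
# the fashion video ranks first because it is nearest the requested topic.
```

In a production system the nearest-neighbor lookup would use an approximate index over millions of vectors rather than a linear scan, but the "find videos near this point, avoid videos near that point" idea is the same.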

00:32:06

Stay tuned for more Armchair Expert.

00:32:10

If you dare, we are supported by Allstate.

00:32:14

Checking Allstate first could save you hundreds on car insurance. That's smart. Not checking the pockets of your jeans before doing laundry? Classic oversight. That mystery clunking in the dryer? Yeah, that was your lip balm's final moments. And somehow, there's always a random receipt in there dissolved into confetti. Yeah, checking first is smart. So check Allstate first for a quote that could save you hundreds. You're in good hands with Allstate. Potential savings vary, subject to terms, conditions, and availability. Allstate North America Insurance Co and affiliates, Northbrook, Illinois. We are supported by HubSpot. Did you know that most businesses, Monica, only use 20% of their data?

00:32:53

That's like reading a book with most of the pages torn out.

00:32:55

Yeah, or paying for a coffee that's one-fifth full.

00:32:58

Yuck.

00:32:59

Point is, you miss a lot unless you use HubSpot. Their customer platform gives you access to the data you need to grow your business. The insights trapped in emails, call logs, and transcripts, all that unstructured data that makes all the difference. Because when you know more, you grow more. And when you get a full cup of coffee, you can do more, too. But I digress. Visit HubSpot.com today.

00:33:30

I appreciate you reading your algorithm out loud, by the way. That took some bravery.

00:33:33

Thank you. I didn't know it was coming. It's good. Is there a way to add? I see you can drag, but can you type in?

00:33:40

Whatever you want. Things that are getting you angry on a regular basis, figure out what words are associated with that anger and block them out.

00:33:48

That's called hidden words. Yeah, you can do that. You can do that for comments, too, and you can also do that for your messages.
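The hidden-words feature described above amounts to a user-defined mute list applied to comments and messages. Here is a minimal sketch of the idea, not Instagram's implementation (which also handles variants, emoji, and built-in offensive-term lists); the function name and sample comments are made up for illustration.

```python
def hide_filtered(comments, hidden_words):
    """Return only the comments that contain none of the user's muted words."""
    hidden = {w.lower() for w in hidden_words}
    visible = []
    for text in comments:
        # Case-insensitive whole-word match against the mute list.
        words = set(text.lower().split())
        if words.isdisjoint(hidden):
            visible.append(text)
    return visible

comments = ["love this", "this is outrageous politics", "great reel"]
visible = hide_filtered(comments, {"politics"})
# → ['love this', 'great reel']
```

A real filter would normalize punctuation and deliberate misspellings, but the core is just this membership check run before a comment is displayed.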

00:33:52

I like that one a lot.

00:33:54

It's just people, they like to be angry. This isn't your issue, but I don't see people wanting to get rid of that because they want it.

00:34:03

I think it depends. For politics, specifically, this research is a bit out of date. We say research, it's usually surveys. We found that there was a small percentage of people who really wanted more politics, but most people were like, I want to see less politics. That's, again, this tension between the narrative self and what was the other self? Experiential. Experiential self. Yeah. As they saw it, they clicked on it, we think they like it, but they don't. Their narrative self or their second-order preference was like, I just need a break.

00:34:31

They don't feel good. They know they don't feel good after work.

00:34:33

Yeah. Most people in these surveys were in that bucket. There's a very loud, call it 5 or 10% of people. I forget what the number is specifically. A meaningful minority that are all in. It also varies by platform. Threads has way more politics than Instagram does, not because it ranks for politics, but because you make these other decisions about how the app works. Threads is designed around back-and-forth conversation. That's better for debate. That's going to mean topics like politics are better served.

00:35:05

Am I right to say Threads is also encouraging you to share thoughts, not necessarily images?

00:35:10

Yeah, I think Instagram is about sharing a creative object, a visual creativity. Videos, photos. Threads is about perspectives, your hot takes, and you're going to have a lot of takes on what's going on in politics. The percentage of threads that is politics is way higher than the percentage of Instagram.

00:35:27

That's concerning to me. I don't know that that's doing anyone any good, but whatever. I don't get to decide. Okay, now let's talk about AI-generated content. Yes.

00:35:35

It's a very uncontroversial topic.

00:35:38

That I'm sure you have a perfect- You really put yourself out in the fire.

00:35:41

I'm impressed.

00:35:42

You got to talk about these things. I would way rather that everything that is in your mind that you're worried about, or that someone watching is thinking about, we talk about, than sweep anything underneath the rug.

00:35:53

I hope people have an appetite, too. We're not sure, and we're doing the best we can to monitor it and see as it evolves. I hope there's some appetite for that and some latitude for that, because that's the reality of how this is unfolding. So I have a lot of questions about it. One is, we should start by defining it, right? Because here's what's tricky. You were asked during this Bloomberg thing, Could you label everything that was AI generated? That would be helpful for all of us. And you were like, yes, we agree. We did that. And what it was doing is it would read anything you did in Photoshop or Adobe as being AI generated. So it's like, where is this line? If someone uses some Photoshop to brush up a pimple on their face in the video, that's now AI. Where are we drawing the line? So maybe we should just define it.

00:36:41

Like most complicated issues, the edges are much easier to define. So there's clearly: you used a film camera, and it took light, and those photons hit a piece of film, and you captured a moment. It's not perfect, but it's an authentic capture of a moment in reality. And then there's: I went to some AI model, and I came up with an idea and expressed it as words, and out came this crazy video of a hippo in a tutu doing a backflip. That clearly never, ever happened. That's purely synthetic content, as opposed to what I would call maybe captured content. But I think most content is in the middle. With all of our phones today, when you capture, it's doing a lot of photo processing. Some of it's using AI, some of it's not. It's trying to make sure you are not wildly backlit. If the sky is bright in front of you, it might use AI to brighten you up and darken the sky so it looks more like you would perceive it, as opposed to a film camera, where you would never be able to see the person.

00:37:36

AI is doing color correction like crazy.

00:37:38

All sorts of stuff. Sometimes in some of the default cameras, it's doing skin smoothing. It's doing things that are a little bit more contentious. Really? There's lots of work to try to make sure lighting works properly for different skin colors, which is another interesting thing.

00:37:51

That just happened to me. I took a picture with a gal that was working at a restaurant in Detroit that I frequent. As she held it for the selfie and took the picture, I was like, I look 29 in this photo, and she looks much younger. I was like, Oh, her go-to camera is this augmented-It's like an app or something she uses? She only takes photos clearly through this app. I looked at it as it came. I was like, That's neither of us. But here we go.

00:38:14

There's your picture. When you do portrait mode, it's recreating the bokeh effect you get from a shallow depth of field from a really wide-open lens. And so then it's like, okay, well, if it's using this model or this AI model, what is AI? It's an interesting question. If I'm doing a spot cleanup for a pimple, it might be just literally copying pixels, or it might be using AI to generate pixels. Do we call that AI? I'm not trying to throw my hands up in the air and say we can't do anything. I'm just saying the reality of the situation is a spectrum.

00:38:45

Hard to define, almost impossible.

00:38:46

There's that challenge. Then there's another challenge, which is the models are getting so good that the work that we do now, and we do work to try to detect things that are automatically generated by AI, will get less and less effective over time as the models get better. Now, ours will get better, too, but it's an arms race. I actually think at this point, we will continue that race, but it might be more practical to essentially mark things that were captured when they are captured by the camera, with what's called a fingerprint, and verify that those were actually captured, than to verify that things were generated. It might be easier to identify what was actually captured by a camera, by doing industry-wide solutions around some of these interesting technologies, than to try to automatically detect what was created with an AI. So we're now exploring that. So then you could click on something that you see on social media, and it could be like, Hey, this was signed by the Sony camera in a way that can't be replicated or forged by the AI.

00:39:42

Will they be using NFT technology to create that?

00:39:45

There's a couple of different approaches out there. I'm forgetting the name of the one that Adobe's leaned into right now. I was reading about theirs last week or the week before. But basically, you work with manufacturers. So whether it's Apple or Google or Samsung or Canon or whatever, and they actually put, think of it as a signature in the file that can't be replicated.
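The "signature in the file" idea can be illustrated with a small sketch. The real Adobe-backed scheme (C2PA / Content Credentials) uses asymmetric signatures from keys held in secure hardware, so anyone can verify without knowing a secret; the HMAC below is only a stdlib stand-in for that, and the device key and function names are hypothetical. The point it shows is the property Mosseri describes: an untouched capture verifies, and any edit to the bytes breaks the signature.

```python
import hmac
import hashlib

# Hypothetical secret; real schemes use a per-device asymmetric key pair
# provisioned into secure hardware at manufacture time.
DEVICE_KEY = b"secret-key-burned-into-camera-hardware"

def sign_capture(image_bytes: bytes) -> bytes:
    """Camera side: produce a signature over the captured bytes at capture time."""
    return hmac.new(DEVICE_KEY, image_bytes, hashlib.sha256).digest()

def verify_capture(image_bytes: bytes, signature: bytes) -> bool:
    """Platform side: check the file still matches its capture-time signature."""
    expected = hmac.new(DEVICE_KEY, image_bytes, hashlib.sha256).digest()
    # Constant-time comparison avoids leaking signature bytes via timing.
    return hmac.compare_digest(expected, signature)

photo = b"\x89PNG...raw capture bytes..."
sig = sign_capture(photo)
assert verify_capture(photo, sig)                # untouched capture verifies
assert not verify_capture(photo + b"edit", sig)  # any edit breaks the signature
```

A platform could then surface "signed at capture by this manufacturer" as context on a post, which is the verify-what-was-captured approach he contrasts with trying to detect what was generated.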

00:40:05

Is there a way to just require that whoever creates the content says so? So instead of you guys having to go in and decipher, it's illegal to create something with AI and not disclose it. The onus is on the creator, not on you guys. Ads are like that, where you have to say this is an ad.

00:40:26

Yes, but then the challenge is enforcement.

00:40:28

Yeah, you have to police it, which puts you in the situation that you're currently in.

00:40:31

Ads is a good example. In the US, there is no law that says you have to mark something as an ad if it's an ad. In Germany, you do. In Germany, you'll see a lot of creators will just put "ad" on everything because they're worried that if there's unintentionally a Liquid Death water in the background, they're going to get in trouble. Then the question is, how does the government police it? Because you said illegal. Now, if we had a rule, let's say you have to mark it as AI, the rule doesn't matter unless we enforce it. Back to this traffic light analogy from before. To enforce it, we need to be able to detect it on our own and say, Hey, you didn't label this.

00:41:03

Then again, what percentage are we going to let slide? Is it 8% synthetic? Is it 12% synthetic? When do we enforce? Am I writing a ticket for 5 over or 15 over?

00:41:14

I want to give people context to make more informed decisions. So we could maybe show you how likely we think it is to be AI. We don't know for sure, but we think there's a 70% chance. Or we could show you, Oh, this was actually signed, so we know this was actually captured, because it's signed by this industry standard. I think it's better for us to adopt a standard than build our own, one that works across the whole industry, because obviously people spend a lot of time on a lot of social networks, not just Instagram. My bias is to not avoid the work, but to do the work that is going to be robust over the long run, and then give people information to decide what they want to trust or not.

00:41:47

But again, in this same Bloomberg interview, you said something I think is very profound. And again, I think people in general want everyone else to do the work that's also their work, whether that's your kid's not doing well in school. It's the teacher's fault, but you've not done one bit of homework with your kid. Everyone's always offloading it.

00:42:04

The homework fights are hard.

00:42:06

Yes. But you said what you do with your own boys. There is some personal responsibility, which is, we're maybe not asking the right question. The question we're asking is, is this AI or not? And a better framing of it, what you tell your own children, is to ask who created it, what their incentives are, and what they're after. You have to identify that, because that's really how you're going to figure all this out.

00:42:32

I think that's what digital literacy is going to become for our kids. In a world where anybody can create something that looks real, sure, you can be like, does it have a signature or not? Is it generated by AI or not? But it's still going to become much more important to consider who said it and why they might have said it. And that is something that historically isn't how we think, not just online, but offline. We just evaluate what we see based on exactly that, what we see. Exactly. We believe our eyes.

00:43:05

Yes, and of course, we get into the vast spectrum of skepticism that people inherently have. What's funny is we want to not have to do on this platform what we do in real life, which is: you're telling me you need $20 because you need a bus ticket back to wherever because of your kids. So I'm evaluating, am I getting scammed right now? Is this person genuinely trying to get home, or are they trying to go get Mad Dog 20/20? I have to try to figure that out, and I have to do my best to assess their intention. And we're doing it all the time in all of our interpersonal relationships. If a guy is selling you something, you've got to consider the fact, well, they're a salesman who gets paid on units moved. I must keep that in mind when I'm evaluating how effective this product is or not. We're going to just have to do that with the content we see.

00:43:54

Yeah. Then our role, I think, is going to be to give you and make more prominent, because some of this already exists, but it's too hard to get to, more information. This account is based in this country. It was created two days ago or 20 years ago, or in our case, 15 years ago.

00:44:08

You use yourself as an example. Yeah.

00:44:10

You could be like, Okay, Adam, this account is based in the US. It has had no username changes ever. It's been registered for, call it 13, 14 years.

00:44:19

This is reputation.

00:44:20

Yeah, exactly. Then we could add other things. This post was recorded with the Sony camera, authenticated by Sony, or this post was created with AI. That might not mean it's a bad post. Then you just know.

00:44:32

When you said, and I guess this is probably forthcoming, I would be so supportive of this: your biases. You are left-leaning, you are right-leaning. You mentioned financial bias, which is incredibly pertinent.

00:44:44

But then there are even other things over time. We're talking about step two, now three, four, five. I think it would be interesting to be like, Okay, well, this meme or this narrative, where did it come from? Who was the first person who posted about this? That's interesting not just from a safety perspective, but from a the-value-should-go-to-the-original-creator perspective. Yeah, giving credit to who- Creatives, right? On one hand, it could be like, Okay, this is a lie. Who started it? But another could be like, This is an amazing dance or bit. Who came up with it first? There are all sorts of really interesting things that we can do under this umbrella of surfacing information about what you're seeing, so that you can not just decide whether or not to trust it, but decide what it means.

00:45:25

I can tell you the one that scared me. I saw, and this was a breakdown, thank God, I'm not going to expose this, but this video that had gone very, very viral. It was a New York City police officer confronting ICE agents, saying, You don't have jurisdiction in this city. I, too, am upholding the Constitution. It was compelling. It was inspiring. You wanted to take action when you saw it. It was millions and millions of forwards. Someone had the wherewithal to zoom in on the badge on the shoulder of the cop. It's not real writing. But I'm telling you, I think I'm quite good at seeing this stuff, as probably all of us think, the literacy competency illusion.

00:46:04

Yeah, we're not good at it.

00:46:05

And I was like, Oh, now this is what's really fucking dangerous. This is like a super inspirational video that could make you leave your house and go react to something that didn't happen.

00:46:15

And was designed to be. It's not a coincidence that that thing was sent around. It was designed to be sent around. It was designed to invoke a specific emotion.

00:46:22

And I just think, man, when you start having people responding to atrocities that haven't even happened, that's the scary part, I think.

00:46:29

There are tons of things that are concerning, and there are tons of things that are exciting. Technology has continued to change more and more quickly, and we're just accelerating a lot right now. This has been happening as a trend for a long time, but this last year or two and this next couple of years are a real inflection point in the speed at which technology is changing. There are lots of things to be excited about, and there are lots of things to be concerned about. Our job as platforms like Instagram is to consider both. In what ways can AI give people superpowers, make you happy, remember everything, process immense amounts of information, produce more, be more creative, get people who weren't creative to be creative in the first place? But also, how can it be abused? Where do we have to make sure we're giving people more context that we didn't think about having to give people just a year or two ago? Where do we have to be more careful about how it's going to be misused, and then get ahead of that and prevent those misuses from happening in the first place? When invariably something happens that we didn't plan on that's bad, how can we react quickly?

00:47:27

It's just the two sides.

00:47:29

It's a stressful fucking job you've taken.

00:47:30

Yeah, very.

00:47:31

It is amazing. It is very stressful, and it's also really amazing.

00:47:35

Yes, like the product itself.

00:47:37

It's not lost on me.

00:47:38

What's your personal high from the job?

00:47:41

I get to meet amazing people, which for me is really energizing for two reasons. One, I love people. I'm an extrovert. But two, I'm really curious about different industries. I get to go talk to a bunch of different podcasters about how their world works, how does the business work, how does the creative process work. I get to do the same thing in fashion. I get to do the same thing with European footballers.

00:48:02

You're just exposed to everyone.

00:48:04

I'm lucky enough that I get to talk to people who are amazing in all these different worlds and all these different countries.

00:48:11

We have the same upside, same job upside. Yeah, for sure. Yeah, literally. That's actually identical. It's identical.

00:48:16

Yeah, that's one of my favorite perks of this job.

00:48:17

Yeah, I keep walking around just like, it's impossible the amount of people we've got to meet in this one lifetime.

00:48:22

Yeah, and you get to really talk to them. Yes. In a world where, not least of all me, we have been part of moving the world to shorter and shorter bits of information that are consumed, you all are part of this almost counter-trend, which is like, no, we're just going to talk for an hour or two, and we're going to go deep. People want that.

00:48:41

I get nervous because even with what we do, so much has changed in the past eight years since we started, when people knew they would come and you'd have to listen to the whole thing, or most of it. But now clips are such a huge part of doing podcasts, and I do worry. I'm like, Oh, no. It's just back to the thing that we were escaping, which is late night.

00:49:04

The antidote.

00:49:05

Yeah, literally.

00:49:06

To the 140 characters.

00:49:07

Yeah. And the five-minute fun clip on late night, which we love. But the whole point is to have a real conversation. Now it feels like we're almost reverting back to that.

00:49:16

I think what's key, and you should tell me if I'm off base here because this is your world, not mine, is to not conflate what I see as more of the marketing for the conversation and the actual conversation. There might be way more views of the short clips on YouTube and Instagram and elsewhere. But the core of the business, I think, is still the long form, both from a who-do-you-have-the-deepest-connection-with-as-a-creative perspective, and also from a business perspective.

00:49:43

Yeah, the real estate to sell is on the long form. Exactly.

00:49:47

That's what's a bummer, I think, sometimes where I'm like, Oh, my God. I mean, I ask so many people. I'm like, Oh, did you listen to that? Or let's say they say something about Good Hang, and I'm like, Oh, did you listen? They're like, I saw clips. I mean, I saw clips is so common.

00:50:00

Totally. There will be more people who see clips than listen to the whole thing. But if you're growing the group of people who listen to the whole thing by having 10 times as many people do clips, you're still growing your core base.

00:50:11

I know. It's just hard to compete.

00:50:13

But if that's shrinking, then it's concerning.

00:50:15

Yeah, I don't have that fear. I don't think we're losing listeners to clips. I think we're getting exposed to more people who don't do two-hour shows, personally.

00:50:25

The thing I also like about your world is you get to participate with just audio. Obviously, video has become a big deal. This is why this place is so beautiful. But I listen to you guys when I'm on the way to work, when I'm in the car, when I'm working out. I can't do that with Instagram. I can't do that with clips and videos. And so you get to participate in parts of people's lives that things and products and apps like Instagram don't.

00:50:52

I agree. That's what I love about it. But I worry that it is changing and moving more in that direction because our attention spans are much shorter, that it's like, Oh, I got the gist.

00:51:03

It's a real worry. I think people underappreciate audio in general. My whole industry does. I've been trying to push it a little bit, which is: most time spent on Instagram is watching videos now, and that is half audio, right? But we're not thinking about that experience as much. What does Instagram sound like is an interesting question.

00:51:19

Because the visual component has just gotten better and better and better, and the audio experience is really not.

00:51:25

And we tend to, I think, think more about what we see than what we hear. But what we hear, I think, can have just as strong an emotional impact.

00:51:32

Yeah. Another thing that I was sympathetic to is, first of all, I think it might shock people to learn. And you do this great example. When you have a crowd of people, it's very easy to demonstrate the reality of this, because when you first say it, you almost don't believe it. But tell us where people spend time on Instagram. Oh, yeah. Because you have basically four options. You have the feed, you have reels, you have DMs. Or is that three? Stories. And stories.

00:51:58

Stories is a big one.

00:51:59

Right. So You got these four columns. And also just start maybe at the beginning, what it was and where we're at now.

00:52:05

People my age, I'm in my 40s, they think of Instagram almost always as a feed of square photos, because that's what Instagram was when Instagram launched. But if you look at how people spend their time, and even more, if you look at what people do, it's just not the core part of Instagram anymore. People share way more to stories than they do to feed. Particularly young people spend way more time in stories than they do in feed. But the thing people do the most, particularly young people, is actually message, is DM. Teens and young people will often spend more time in DMs than in some of these other surfaces, and they definitely share way more. And forget about all the text. If you just look at photos and videos sent, there are way more photos and videos sent as DMs than there are posted to stories or feed, or even those two combined at this point.

00:52:51

Yeah, you ask everyone in the room, put your hand up if you've sent a DM on Instagram in the last day. And 90% of the room puts their hand up. Great. Now, keep your hand up if you have-If you posted a story.

00:53:03

I actually did this just this one time at Bloomberg. I made it up. I was like, Did you open up Instagram today? Did you send a DM today? Did you post a story today? And a bunch of hands went down. Did you post a feed post today? And it was like, every hand went down.

00:53:14

Nobody has posted a feed post today.

00:53:17

I did.

00:53:18

Well, we did because we have a schedule.

00:53:20

Yeah, that's right.

00:53:22

But when you did it that way, it really sank in. And then what I appreciated was you being honest about the fact that we have that data, right? So had we only stuck with feed, and we said, no, we're feed only, I wouldn't be on this stage today. We wouldn't be relevant, because we see what people want to do, and they would have gone to a place where they could have done that. And that's just the hard facts of the business. And so what I thought about is we've had very few changes. We went from audio to video, and people lost their fucking minds. And then the other crazy thing was you could get it a week early if you joined the subscription. People lost their minds.

00:54:01

Also, when we first went to Spotify, people really freaked out.

00:54:05

Yeah, but all to say, we've only had three or four changes in eight years, and it was very difficult, and people were very, very angry. It's stressful. I was thinking about how you're weathering the outrage, and also the frustration that, despite what everyone's saying, once they get over the hurdle, they like it.

00:54:24

This challenge of change management was my first exposure to this controversy in the industry, when I was the designer on a new design of the homepage of Facebook in late 2008, early 2009. People lost their minds. I was reading the comments. It was gnarly. Death threats. I'm like 25. Yeah. Again, my self-worth is totally tied up in my work.

00:54:54

You're nothing if you're not that.

00:54:55

Yeah, and I'm just getting pummeled.

00:54:57

You ruined it.

00:54:59

Yeah, it was just devastating. You learn to accept the fact that, look, if you're going to spend half an hour, an hour using our product today, the analogy I used to use is it's like a desk. You wrote some letters, you organized some photos, you did some things, and I just came in and rearranged your desk. You're going to tell me to F off because I didn't ask for permission.

00:55:19

Get out of my space.

00:55:20

And you think of it as your desk. But the alternative is to become irrelevant. And I don't think that's being bombastic. I'll give Mark a lot of credit on this. Mark has always said that companies usually fail by aiming too low and hitting their goals all the way down. That's about not really defining success properly and also not being willing to make hard decisions. In that case, it's about aiming high. But in this case, it's about being willing to actually do things differently. I would rather lead Instagram, for however long I lead it, through a bunch of changes and occasionally go too fast, too far and get backlash, than have it become irrelevant under my watch.

00:55:59

Yeah, this is relevant because my phone updated last night. Do you have the new update?

00:56:05

Are you just getting Liquid Glass?

00:56:07

I hate it so... I was like, I'm in a bad mood today. I can't figure out how to use it. I don't like the way it looks. I'm so mad.

00:56:14

We've pushed too far. We've made mistakes. We've experienced a fair share of slaps for it. We're trying something now that's different that I don't know if it's going to work or not. But we are trying a version of Instagram that is basically you open it up and it's just stories, and then you swipe straight into reels. If and when we launch it, there's going to be all of the energy.

00:56:33

Are you going to Tahiti for two weeks?

00:56:35

So there'll be no grid posts at all?

00:56:37

No, you can swipe through them, too.

00:56:39

You'd have to select to be on the feed?

00:56:41

The feed would just be more video. It'll still support photos, but it'll be full screen, one at a time, how you consume video, mostly. Oh, it's like TikTok. And the reels tab and shorts on YouTube and all of it. Okay.

00:56:52

Yeah, that scares me. I know.

00:56:53

I don't want you to do that. I'm really scared.

00:56:55

I don't know if it's going to work. When we try new things, we have to be ready to answer questions because people see it. So right now we are allowing people, not everybody, but we're allowing people in India to opt in. The idea is to manage this, if it's successful, over a year or even two, and keep making the experience better and better, and then see how many people opt in, and of the people who do switch, how many people keep it. And then when we get to a place where, let's say, the majority of people switch and the majority of them actually also keep it, then we can consider moving everybody else over. That creates a really healthy incentive for us to make it something people want to switch to, but it also takes a long time.
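
The gating logic he describes — let people opt in, then only consider moving everyone once a majority switch and a majority of switchers keep it — can be sketched as a simple check. This is an illustrative sketch, not Instagram's actual system; the function name, fields, and the 50% threshold are assumptions for illustration.

```python
# Hedged sketch of the opt-in rollout gate described above (hypothetical names).
# Only consider migrating everybody once a majority of eligible users opt in
# AND a majority of those who opted in actually keep the new experience.

def ready_to_migrate(eligible: int, opted_in: int, kept: int,
                     threshold: float = 0.5) -> bool:
    """Return True when both the opt-in rate and the retention rate clear the bar."""
    if eligible == 0 or opted_in == 0:
        return False
    opt_in_rate = opted_in / eligible      # how many people chose to switch
    retention_rate = kept / opted_in       # how many switchers stayed
    return opt_in_rate > threshold and retention_rate > threshold

ready_to_migrate(1000, 600, 400)  # 60% opt in, ~67% keep -> True
ready_to_migrate(1000, 600, 200)  # retention too low -> False
```

The incentive he points at lives in the second ratio: the team only "wins" if people who try the new version voluntarily stay on it.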

00:57:37

It's a trial.

00:57:38

So we're trying it out.

00:57:39

Is there a way to mass DM? Is there a way to get people to sign up or something where we DM them? Channels.

00:57:46

You can create a channel, which is basically a broadcast DM, where people can sign up. It's basically a group chat, but instead of having every Armcherry in the group chat able to post, you can just message all of them, and then they can reply to your messages so you can consume it.
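
The broadcast-channel model he describes — one owner who can post, members who can only reply to the owner's messages — boils down to a one-to-many permission rule. A minimal sketch, with class and method names that are purely illustrative, not Instagram's API:

```python
# Minimal sketch of a broadcast channel: only the owner can post;
# members can only reply to a specific broadcast, not post freely.

class BroadcastChannel:
    def __init__(self, owner: str):
        self.owner = owner
        self.members = set()
        self.messages = []

    def post(self, sender: str, text: str) -> None:
        """Broadcast a message to every member; owner-only."""
        if sender != self.owner:
            raise PermissionError("only the owner can broadcast")
        self.messages.append({"from": sender, "text": text, "replies": []})

    def reply(self, member: str, msg_index: int, text: str) -> None:
        # Members respond to one of the owner's messages, not to each other.
        self.messages[msg_index]["replies"].append({"from": member, "text": text})

channel = BroadcastChannel(owner="armchairexpert")
channel.members.update({"fan1", "fan2"})
channel.post("armchairexpert", "New episode is out!")
channel.reply("fan1", 0, "Loved it")
```

The difference from a group chat is entirely in `post`: in a group chat everyone passes the permission check, here only the owner does.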

00:58:02

But it's not creating a message board?

00:58:04

No, it's just in DMs.

00:58:05

I think we should do that and start sending out the episode.

00:58:09

Yeah, it's not for... You know what I'm understanding about that is the number of people who will sign up for that is much smaller than the people who are going to follow your account. So it's for your most passionate people. And so the content strategy should match that.

00:58:23

Okay, so great. That brings up what was my grievance the first time we spoke with you, and it continues to be my grievance, and I need to know the rationale behind it. Please. Why can I put a link in stories and I can't put a link anywhere else?

00:58:34

Yeah, we hate that.

00:58:35

Why the fuck can't I put a link in my feed?

00:58:38

It's contentious. I get this one every week.

00:58:42

Might be time to buckle.

00:58:43

Yeah, well, I have buckled before. If we do links for everybody everywhere, I think what will happen is there'll be a lot of links posted very quickly, and it'll change how Instagram feels significantly, and I don't think it'll be for the better. One, you get a lot more scam and spam. Links are the vector for all of that. You get a lot more politics and you get a lot more news, because they usually are link- and text-oriented as opposed to photo- and video-oriented. I think we become less differentiated from other platforms like Facebook. I want us to be about creativity. I think, specifically, Instagram is about visual creativity. I think when we do links, we become more about news and publishers and politics and less about fashion and art and film. And so that's the reason. I don't expect that to be remotely... Satisfying? Satisfying, yeah. But that is the reason.

00:59:36

Stay tuned for more Armchair Expert, if you dare. Okay, another quick thing I wanted to ask you about is I watch a lot of your two camera posts, and they are generally bits of advice for creators.

00:59:58

Oh, my video?

00:59:58

Yeah, yours. Your comfort level of speaking directly to the lens is outstanding. The fact that you're not in show business is incredible, because I suffer from that. You get right on that motherfucker and talk directly to lens all the time. You're giving tips to creators. I was curious, do you have a distinction in your mind between the occasional poster who puts food or their vacation photos up versus someone who's making content? Yes. What would we call the two groups? There's creators and then what?

01:00:25

Just average folk.

01:00:26

Users. And what percentage is on the platform? Do you have any sense of that?

01:00:30

It depends on how you define it, but they're definitely, at the very least, tens of millions of creators, and there's over 3 billion people who use Instagram.

01:00:37

So it's a tiny minority. But my hunch is, because you're servicing them so much, are they the lifeblood of the platform?

01:00:45

They are very valuable for the platform. I think that creators... Well, a couple of things. One, we define creators as individuals or groups of people, so not like a company or a corporation or brand, that produce original content with an intent to grow. You're trying to get something done. Maybe you're trying to evangelize a cause, maybe you're trying to get elected, maybe you're trying to sell some shoes, but there's an intent there to grow. If you're just posting pictures of your vacations or the occasional hobby, that's cool, but we don't think of you as a creator. You can be creative, that's great. But I believe, and we believe, I should say, that power has been shifting from institutions to individuals across industries for years, and we should be leaning into that. Athletes are much more relevant relative to the teams that they play for compared to 10, 20, 30 years ago.

01:01:33

Or the fact that these college kids are now incredibly well paid all through that.

01:01:37

Journalists are building their own brands outside of the publications that they work for. You'll see actors have their Instagram following explode when the new Netflix show gets announced, even before it drops. And I think this is because we want to understand the world through the eyes of people, people we admire or trust or look up to. And that's what creators are. And so we lean in there because that's what people are most interested in. That's who's going to become more and more important over time.

01:02:04

This is so obvious, but I didn't think of it until you said it out loud, which is you said, I think it's useful occasionally to pull all the way back and look at the very big picture, which is what the Internet, the first iteration, gave rise to. Prior to the Internet, people could not self-publish. The expense was too high, or there were institutions that controlled that. And so the first wave of the Internet was really allowing every human to be a publisher. That's a big, big concept. It's been great, and it's fucked things up, because not everyone should be publishing. We're hearing from a lot of people that we wouldn't have otherwise, and they're very helpful.

01:02:39

We've given a lot of power to everybody. Yes.

01:02:42

Very democratized. Yes, the democratization of everything. Well, this is what it looks like. Now, the second wave is really producers, production. Before, only studios could afford the Arriflex camera. That was $250,000. The film you put in it, that was $35,000, and the processing and all the stuff that went into creating content. It was impossible for someone to do it. That's been changing, but AI is really now the insane breakthrough, where it's like your environment could be any location. So what we're now doing is democratizing production, which is a fascinating thought.

01:03:18

Yeah. My hope is that the way we participate in this wave will empower human creativity. I think that there's versions of AI that will disintermediate people and displace people in jobs. But there's also versions of AI that can give creative people the ability to do more of what they do so well and to be more successful. My hope is that on Instagram, the ways that we lean into AI, with creative tools and assistance and the ability to understand your insights, or to understand the patterns for what works and what doesn't, or to recreate or create whatever is in your imagination, will be really empowering. But it's the same construct. The internet allowed anybody to publish and reach an audience because the cost of distributing things went almost to zero. AI is going to make the cost of producing things go way, way, way down as well. You will just see more content. Our job is to figure out the right way and the responsible way to manage one of the platforms. I'm hoping that we can really empower human creativity in how we lean into that. But that's the big question for us over the next couple of years.

01:04:28

Okay, now, how has Instagram evolved in terms of monetizing the work of those creators? How is Instagram generating money for these creators? And how does it compare to, say, a creator on YouTube?

01:04:42

So what creators do, and they're not the only people who do this, is they make content, people consume that content. And then we sell ads, obviously between pieces of content. You could think of it as selling ads against that attention.

01:04:53

It's really the old network television model. You get this free show, Seinfeld, and you're going to watch a Clorox commercial.

01:04:59

Exactly. Now, YouTube has historically been more focused on long-form video, and they show ads before and during the video, and then they give creators a cut. Our videos are shorter, so we don't put ads before. You don't get a pre-roll on Instagram, which is like, you have to watch this, and you don't get ads in the middle of the videos. So there's one challenge, which is who deserves the credit? Yeah. YouTube is the best in the industry at creator payouts, I think. I mean specifically paying creators directly, because the platform can pay them, the fans can pay them, or brands can pay them. In terms of the platform paying them, YouTube does the best. Now, what we've done is we've run tests, currently in the US, India, Japan, and Korea at different times, where we've tried to pay creators for creating content. And what we keep finding, particularly with videos, if we pay creators, we're like, Hey, if you make this number of posts, we'll pay you this number of dollars. They do create more posts, but the incremental posts that they make are not as good, and they don't get that many more views. So they actually don't drive up the amount of time people use Instagram that much.

01:06:00

I was going to say there's really no metric you can decide on that isn't going to skew the output, because it could either be number of posts or it could be total views. Well, now with total views, I'm incentivized to be provocative. I'm incentivized to chase whatever the algorithm wants, like it was on YouTube.

01:06:12

But even if you ignore those challenges, all the tests we have are just burning money. If we invest $10 million in a bonus program, we've never made anywhere near $10 million back. I'd be happy just to break even. Sure. For me, if we could have this program break even, have the eligibility be clear, so it's obvious how you get to be part of it because it can't feel like a lotto, and ideally, the payouts aren't embarrassing, then it's a success. People create so much content on their own, and the incremental content that people create when we pay them is not that high in quality or quantity on average. I'm sure some people are doing amazing things, but we end up burning most of the money. We're a business. I want to be honest about that. We're not just going to burn the money. So we'll continue iterating because that could change, or trying, I should say. But we focus more now on the other ways we can help creators make a living. How can we help with fans paying you directly?

01:07:06

I was going to say you have a Patreon-ish version existing, right?

01:07:09

We have subscriptions, which is actually quietly growing and doing well, where if you have a platform where people are willing to pay for your content, you can do that on Instagram. You can set the price. I think we just cover the cost of running it. We don't actually make money off that. And then the big one, though, is getting paid by brands. Last I checked was years ago, it seemed like it was more than a $15 billion industry a year, which is creators making deals off platform and getting paid by companies to advertise their products.

01:07:37

To do a post about a pair of sneakers.

01:07:38

So we've tried to support that. We have the creator marketplace, where brands and creators can find each other and suss each other out. We have ads tools where advertisers can then take that creative and run ads with that same creative, with the creator's permission.

01:07:53

It's like a system that integrates that perfectly into there.

01:07:56

So that the advertiser can find out the value that they're getting from actually paying for this content. We want creators to get paid, but we can't just burn money writing checks. So we're trying to find the balance.

01:08:06

Yeah. So interesting.

01:08:10

We're going to do a hard pivot. I can feel it.

01:08:13

You made a prediction. Dangerously, perhaps. But your 2026 prediction was: Looking forward to 2026, one significant tension is that authenticity is becoming infinitely reproducible.

01:08:24

Yeah, this is a real tension.

01:08:26

Tell me about this statement. I think I understand it, but I'm not sure. It's abstract enough that you could deny any implications of the statement. In legal terms, this is real messy language. I wrote that thing and I was like, I should write more.

01:08:40

I wrote this thing and it was really boring. Then I was like, All right, I should make it less boring. Then I rewrote it and I was like, Now it's just purposely controversial. That's annoying. That's just as boring, just different. Then I was like, All right, I should just cut half of the stuff that I don't have any business weighing in on anyway, and I'll focus on the things that I care most about. I don't know how it turned out, but you read it, so that's cool. Maybe you had to, as your homework for this. People think of Instagram's aesthetic, particularly people my age, as these perfect photos, sunsets, skin smoothing, makeup, this whole thing. That is not what is in the cultural zeitgeist today. What works, even on Instagram, is content that is very purposely counter to that polished aesthetic. Messy. Messy, raw, pimples, blurry, not cropped properly. Real. Because people just want a little bit of authenticity in this world where we're inundated with processed, intense information. I think that there's this interesting thing now where that perfect aesthetic is becoming cheap because it's easy to produce. And so the most savvy creatives across a lot of industries are rebelling against that, not atypical of what happens in different art worlds.

01:09:46

Well, as any person in a business, you're looking for a hole in the market and you're trying to fill it. If everything is perfect and polished, what's available to exploit is this.

01:09:55

We're in this moment now, and I don't know how long it'll last, where imperfections are an indication of authenticity. That's one of the reasons why they're working. They're also relatable. It's a way of being like, I'm not AI, or I'm not a brand who's trying to sell you something. I'm just a person.

01:10:12

Yeah, a fucked up mess.

01:10:14

Yeah, I curse, I stutter over my words. I make mistakes just like everybody else. That's where we are, and you're already starting to see it.

01:10:21

AI is going to figure that out, right?

01:10:23

It's already recreating it. If you talk to ChatGPT, you hear it take a breath. It doesn't need to breathe. That's not a thing. You're going to be able to create the illusion of that raw imperfection. It's just starting now.

01:10:37

Adam, I called the front desk of the hotel I was staying at in Detroit. Busy, busy, busy, busy. Then I went online. I got the actual hard number to the place and called on my phone.

01:10:46

Yeah, landline style.

01:10:47

It was like, Hello, thank you for calling such and such hotel, and blah, blah, blah. It had a bunch of imperfections in the thing. I was talking for a while. I don't know what clued me in, but I go, Hold on, are you a computer? She's like, I could see why you would ask that. I was like, Oh, yeah. Fuck, dude. This is nuts.

01:11:05

Yeah, yeah, yeah. It's a thing.

01:11:11

It's insane. I talked for a while before I figured out it was a computer.

01:11:16

But in that world, what's left, I think, is our taste. It's our perspective. It's our opinions that are going to be the thing that makes some piece of content interesting or not. But the other question for me is, how can you point the technology at what you want to get done? One way to think about it is: it is an it, it is an algorithm, it is AI, and it is coming for you. There's another way, which is like, all right, it's a bunch of tools. It doesn't care about me. It's agnostic. But what are the things that I want to do that it can help me do more of, better, faster, stronger, etc.? Then how do I figure out how to do that? Because it's an interesting tension. On one hand, it's so technical. You're talking about these giant foundation models and these data centers with all of these GPUs. Anybody listening to this is already like, I don't care what he said. On one side, it's so technical. On the other side, it makes things so much more approachable. Like I said before, I'm not creating anything. I used to program. I used to write a lot of code, really bad code.

01:12:11

There'll be some engineers now who will go look up my diffs.

01:12:14

Just if they need to throw up. Just to make fun of me.

01:12:16

They ate some poison. It was just rage. Just dunk on me on internal forums. I can code again now because I can work with an AI to program, and I can speak to it in English. Because I'm technical enough, I can give it pretty good direction. But it's getting easier and easier. And so a couple of months ago, I started programming and I was like, All right, this will be fun. I haven't done this in forever, both my own stuff, but also at work. And the way you did it before is you basically had an AI you talked to, and you watched it program. That's not how people do it. Well, some people do it, but it's not how I do it now. It's not how a lot of people do it now. Now, I've got four of them. That's four engineers, basically. And I talk to each one of them. To see who does it better? No, I just check back. I do four things in parallel, or I do two things and see who does it better. It's a completely different way of exercising your brain. I'm multitasking aggressively. I've got four really junior employees that I talk to every 10 minutes.
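
The workflow he describes — farming out independent tasks to several AI coding assistants in parallel and checking back on each — can be sketched with a thread pool. This is an illustrative sketch, not Meta's tooling; the `agent` function is a stand-in for whatever assistant or API you'd actually call.

```python
# Sketch of the "four junior engineers" workflow: dispatch independent
# coding tasks to several assistants at once, then review each result.
from concurrent.futures import ThreadPoolExecutor

def agent(task: str) -> str:
    # Stand-in for a call to a coding assistant; a real version would
    # send the task to an API and return the proposed patch.
    return f"draft patch for: {task}"

tasks = ["fix login bug", "add photo filter", "refactor feed", "write tests"]

# map() preserves task order, so results[i] corresponds to tasks[i].
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(agent, tasks))

# Each entry is one draft to review, like checking in on a junior engineer.
```

The point of the structure is that the human's job shifts from writing each change to reviewing four drafts on a cycle, which is the "check back every 10 minutes" pattern he mentions.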

01:13:07

Without any civilities?

01:13:09

I actually do. I say, please.

01:13:10

Me too.

01:13:11

I do, too. I know.

01:13:12

It's so weird. I heard that it's wasting X amount of money. It is.

01:13:14

I'm sure it's burning water and energy a year. Yeah. But I do this because I've always said please and thank you to Alexa, because of my kids. I don't want them to hear me barking orders at any personality. It's definitely gotten into my programming. It is. Why am I being so polite? I'm sorry, I gave you the wrong instructions the last time.

01:13:31

It's like, why am I apologizing for your mistakes? I know. It's so bad. Just being accountable to the computer. It's just so ridiculous.

01:13:37

But I don't think it's limited to programming. I think it's going to make technical things, not just programming, more approachable to a bunch of people, because it can bridge the gap. That's an interesting tension that I'm not sure I know how to make sense of yet.

01:13:50

Okay, my last thing is Threads. We were on this trip and I was like, What is Threads? You're telling me about Threads. I'm like, So it's Twitter? I don't know. What are you doing? I want to say, at that time, I was trying to remember because I said it so often, I think you had 300 million subscribers at that time. Is that possible?

01:14:05

About 300 million users, yeah.

01:14:06

Every morning, I'd ask where we were at and stuff.

01:14:08

The goal was to create a place for the healthy exchange of perspectives, of ideas. It started really in reaction to Twitter. We didn't want to be a Kumbaya, super friendly Twitter. People are going to argue, I don't want to pretend otherwise, but we just wanted it to be more open for more types of perspectives, more ideas, a bit more civil. Over the time, it's evolved, though, and now it's much more focused on trying to support communities, trying to support just the open exchange of ideas. It's growing. We're doing pretty well here in the US. We're crushing right now in Japan, which is interesting.

01:14:43

You're doing a great job of integrating it into Instagram because I see headlines I can't resist but click on and open threads.

01:14:51

That's our equivalent of clips. We show little bits of threads in Instagram that raise awareness.

01:14:56

And I go.

01:14:57

We look at two things. We look at how many people use Threads or see Threads because of its Instagram integration. Then we look at how many people just go on their own. That go-on-their-own is the core business. There are more people who go because of the... Yeah, you still haven't gotten me to go directly to Threads. We might not ever. Who knows? We haven't even gotten my wife to install it.

01:15:16

Well, we're on Threads if our listeners want to join the conversation. Join the conversation. It's follow. It's follow. Yeah, follow us.

01:15:24

What is your evaluation of X, formerly Twitter?

01:15:28

I don't want to underestimate them. I mean, they've gone through a lot of change.

01:15:32

Have they grown, shrunk, plateaued? What have they done?

01:15:35

It's hard to say. Somewhere between plateaued and shrunk a little. But maybe they're growing in some countries and shrinking in others. That's pretty normal for social networks. They obviously laid off, I think, over 90% of their employees. That's scary. It's also impressive. It's very efficient. They've got some good people over there. I just want us to beat them.

01:15:52

Yeah.

01:15:54

That's honest and fair.

01:15:55

As I'm watching the Iran stuff and I'm learning, they've shut off all the social media and they've shut off even WhatsApp there.

01:16:03

We got blocked partially there years ago. We were actually really big in Iran. The government there will block certain social media networks, certain websites, and then they'll shut the whole Internet down at times.

01:16:13

I wanted to ask just a stupid question. How did they mechanically decide what's on the Internet? To me, the Internet seems like this huge pipeline that you pull different addresses.

01:16:24

Yeah, how did they do it?

01:16:25

How did they shut off just some parts of the Internet?

01:16:28

They work with the Internet service providers to block specific domains. You don't use Instagram by going to instagram.com, most likely. But behind the scenes, there is an address. It might be a number, it might be words somewhere.

01:16:39

My app goes to there.

01:16:40

Your app goes to there to talk to the servers. They'll have the people who provide Internet block specific addresses. Then you can play this game of cat and mouse where you can switch your addresses or you can go through a VPN.
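
The mechanism he describes — ISPs refusing to resolve specific domains, and the cat-and-mouse of switching addresses — can be sketched as a resolver with a blocklist. This is a toy illustration of the idea, not how any real ISP implements it; the domain table and blocklist contents are made up for the example.

```python
# Toy sketch of ISP-level domain blocking: the resolver refuses to answer
# for blocked domains (and their subdomains), which is why changing
# resolvers or tunneling through a VPN routes around the block.

BLOCKED_DOMAINS = {"instagram.com"}  # illustrative blocklist

def resolve(domain, dns_table):
    """Return the IP for a domain, or None if the resolver blocks it."""
    for blocked in BLOCKED_DOMAINS:
        # Match the domain itself and any subdomain, e.g. api.instagram.com.
        if domain == blocked or domain.endswith("." + blocked):
            return None  # NXDOMAIN-style refusal
    return dns_table.get(domain)

dns_table = {"instagram.com": "157.240.0.1", "example.com": "93.184.216.34"}
resolve("instagram.com", dns_table)  # blocked -> None
resolve("example.com", dns_table)    # allowed -> "93.184.216.34"
```

The cat-and-mouse he mentions happens because this check is keyed on names and addresses: move the service to a new address, or tunnel the lookup past the ISP's resolver, and the block no longer applies until the list is updated.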

01:16:52

What about Starlink? How does that factor in? Obviously, there's no control over Starlink. If you somehow have Starlink, you do have access to everything, yeah?

01:16:59

Yeah, straight from space.

01:17:03

Okay. Crazy. It's the future.

01:17:05

It's more than half the satellites in orbit.

01:17:07

It's crazy. The last two things. I pride myself on being able to look at someone and say the exact car that they should own.

01:17:14

Yeah, you were right.

01:17:15

I spent this time, Monica, with Adam, and I said, I'm just going to say it, man, you need a '70s BMW 3.0 CSi. I just need to know what availed you to that advice. Because you went on... I went on a hunt for a long time on bringatrailer.com.

01:17:32

I don't know. I've never been a car guy. I drive around a Toyota Sienna. Also, you can't be a tech exec in a Porsche. It's too much of a... Yeah, stereotype. It's just too much of a cliché. I'm saying that as I'm thinking of friends of mine who are tech execs with Porsches, who I apologize to for saying this on the record.

01:17:50

Well, you used cliché. That was nice. I used douchey.

01:17:53

I was trying to find something that would give me some pleasure that wasn't crazy. But that also was a little bit, if you know, you know. I drive that thing around. Most people don't even look, but every once in a while, someone is like, Oh, man, check out the 3.0. They come over and they want to talk. Wow.

01:18:09

What color is it?

01:18:10

It's Fjord Blue. It's like this blue. It's a little bit of a cult color, a little bit of a cool pale color from the early '70s.

01:18:17

I had said, too, blue would be my pick for you. That's my favorite.

01:18:20

It's blue all the way through. It's blue on the inside and blue on the outside. It's tiny. It's like a gentleman's coupe.

01:18:26

It's very Art Deco-y. There's a mix of steel.

01:18:29

I love that car. I'm so glad you brought that up. I never would have bought it. I would have just been driving the Sienna around.

01:18:35

Do your boys like it?

01:18:37

Yeah, particularly Blaze. He calls it the Blueberry. He calls it Blueberry.

01:18:41

Okay, and then the last thing I find so charming about you is you are a hardcore burning man person. What do you call yourselves? Burners or something?

01:18:48

In a past life.

01:18:49

What do you mean in a past life?

01:18:51

I have kids now. I haven't gone in years.

01:18:53

When's the last time you went?

01:18:54

I don't know, but I've been more times than I can count, so you're not wrong.

01:18:57

Adam's the type that goes, Monica, a week before and fucking builds the enormous contraptions.

01:19:03

The build is, for me, the most fun part. You go and you're building your camp. You're really making a city in the middle of the desert. By the end of the week, it's chaos and it gets a little bit too messy for me. I'm getting old. But to get out there and to be using your hands. We built these yurts, which is where we sleep, or these giant trapeze tents, which is where we hang, or the kitchen. Wow.

01:19:23

There's AC.

01:19:24

We built a shower. We built everything. Oh, my God. Then you rip it all down. It's a lot of work. Yeah. But in this job, I'm on glowing rectangles all day. It's nice to be outside with no internet in the desert with your hands all dirty.

01:19:36

That makes sense. And fabricating. Yeah. Well, Adam, I adore you. You'll be back. Yeah. That was round two. We'll do round three. We'll be laughing at our naivete in this conversation. That's right.

01:19:49

It will all be irrelevant.

01:19:50

In fact, the conversation will happen with our avatars.

01:19:54

I believe in human creativity.

01:19:56

All right. Well, thanks for coming. This was a blast.

01:19:58

Thanks for having me. It was so much fun. We hope you enjoyed this episode.

01:20:03

Unfortunately, they made some mistakes. Excuse me, I'm just sorting my sides here. I just did a little bit of acting a second ago.

01:20:16

And how do you feel like you did, sir?

01:20:17

Well, it was a cold read. It was my first time reading any of it. It was a scene with my friend Monica. Yeah. Yeah. It looks like she might be doing this role, so we had to do a little self tape.

01:20:30

Oh, my God. I hope that doesn't happen when I'm on set.

01:20:34

Oh, my God. I'm glad this happened after the audition.

01:20:38

Yeah, I just put myself on tape for something, which is fun. Now, did it make you think you want to act with me someday?

01:20:45

First of all, I have acted with you.

01:20:46

Oh, yeah. Bless This Mess.

01:20:49

Bless This Mess. CHIPS, I'm just outside the window.

01:20:52

You're outside the window, but you weren't in a scene with me. Just paying you.

01:20:57

You're right, but definitely Bless This Mess. Yeah, Bless This Mess. Confusing. You were very upset. You were hot. So I fucked up your... Yeah, it was easy. Farmer's market stand or something.

01:21:07

It was really easy for me to get into character for that one. Yeah, yeah, yeah. That was fun. I forgot about that one.

01:21:12

Like a duck to water. Yes. Okay. I had forgotten to say this on the last fact check, but I'd like to say it here. One thing I left out about my trip to Florida was that I had the time, finally, to binge Best Dead. And I just want to give a testimonial. Most importantly, the first eight episodes, completely entertaining. I'm entertained. It's sleuthy. It's whodunit-y. We got an expert.

01:21:38

Yeah.

01:21:39

Just top-notch entertainment. Thank you.

01:21:40

Yes.

01:21:42

Extremely interesting. Interesting. And then I started episode nine right as I was taking off into the skies, leaving Miami. Yeah. I know, so you just said you don't like when I cry, but... No, I do. I mean, sincerely. What did you say? No, no, no. Do you think I'm exploiting it or something?

01:22:06

No, I know it's all true to you, and I want you to cry, because it's not good for you to bottle that up. But because you never did, and now you do every day, this is also happening. Simultaneously, this is happening with Jess.

01:22:21

Okay. I think it's an older male thing. I think it is. I think we've been holding it in for a very long time. Yeah. As all our other muscles are atrophying, whatever holds in those tears is failing, too. I know.

01:22:33

I think it's very sweet. I'm just around a lot of men who are crying. Sometimes I'm like, Wow, do you mean it? But of course, you mean it.

01:22:44

That's what you said last time, too. I know.

01:22:45

Of course, you mean it, but it's so weird to be around someone for so long who you never see do it. It takes so much. Yeah. Then now it's like drop the hat, D-O-T-H.

01:23:00

It's not drop the hat. There are very specific categories that get me. You're right. They're pretty predictable. It's not like someone says, I just had my first baby, and I'm weeping. No one says, I lost my aunt, and I start weeping. No. There's categories that I'm now defenseless against.

01:23:18

Yeah, of course. It makes sense.

01:23:20

To hear someone be totally accountable for their bad behavior, I find so moving. It's impossibly moving. So as that's happening, I'm also a little self-conscious because I'm sitting next to a dude who, right when we got... I sat down, he wanted to bond over the fact that we had the same watch, which was cool, and I gave that some time. But all that let me know is he is aware of me. It's not like I'm an invisible object sitting next to him, which is often the case when you fly. Sure. He's aware of me.

01:23:55

Yeah.

01:23:55

You're being perceived. I'm probably being observed, I think. And then so I have my headphones on and it starts getting me. And then I start welling up. And then there is a point in that episode nine where now tears are just streaming down my face. I'm not audibly doing anything. I just have my eyes closed, I have my headphones on, and I'm hoping the dude's not looking at me. And also, maybe it doesn't matter. There's nothing I can do about it. And it was just so moving. I was so moved by it in a very similar way to that episode we've talked about for years, Blaine from Radiolab. Yeah. Very moved by it. Then there's the score.

01:24:40

Andy. Shout out, Andy.

01:24:41

And then there's a reaction from Elizabeth, and now I audibly, I do the thing where I start laughing because I'm clearly crying, really crying now. But anyways, I just was really, really, really blown away by that. This thing that was just entertaining for me ultimately became very, very poignant and a wonderful display of why it's worth finding out everyone's point of view in all situations. So it was great. I just wanted to encourage people to listen to it.

01:25:12

Thank you. I really appreciate it. Obviously means a lot to me that you liked it.

01:25:18

Then I can answer without any spoilers. I guess there's debate. You've since brought me up to speed that there's debate. I 1,000% believe this person. I have zero doubt whatsoever.

01:25:31

Yes. The end of the show has led to a lot of questions for people. A lot of people are skeptical of the end, and a lot of people have said, What does Dax think about this? Because, no spoilers, but you have a connection to something that's going on in this.

01:25:54

I hope it's known, as well, that I have a very good bullshit meter. Yes.

01:25:58

We were definitely expecting you to be cynical about the end. Oh, yeah. No. Well, it was funny because when we were making it and stuff, I think I told you a little bit, like, Oh, this is going on. This is crazy. We're doing this. I didn't at all think you'd be cynical. But then as soon as people started saying it, I was like, Oh, no. He might hear this and be on that team. Yeah.

01:26:32

Which could have happened. Again, back to, and I can't say it enough up front first. I absolutely love A Million Little Pieces. James Fry.

01:26:43

James Frey.

01:26:44

Frey. I love the book. I don't care at all that it was fictional. I love the book. It's a beautiful book. And then the follow-up, My Friend Leonard, is almost maybe as good. It's just a beautiful story. But I also was reading it when, as Stephen King said it best, he wrote a little op-ed about it, not one person in recovery believed it. That's not the physics of how it all works. Yeah. And unless this guy was this one anomaly, which is possible, there are anomalies.

01:27:15

Yeah.

01:27:15

But we all thought it was bullshit. Yeah. But I didn't care. Yeah. But yes, if I hear... This is why I hated Hillbilly Elegy. The whole time I'm like, this is bullshit. This dude's story, he's overheard it secondhand and hasn't lived any of it.

01:27:30

The ride that Hillbilly Elegy has taken in my life is wild. You were definitely right about that. I was so wrong. Yeah.

01:27:42

So given that track record, I'm a thousand percent certain, as much as one can be, that that person was speaking honestly.

01:27:51

Yeah.

01:27:54

Stay tuned for more Armchair Expert, if you dare. Well, I wish everyone heard it so we could really deconstruct it a little bit. But it's just like, I think people... It's much easier, when you're afraid of somebody or someone has wronged you, to assume that there was a great deal of calculation happening. Because it fits better with the archetype of a villain, and it fits better with the archetype of a bad person. And as someone who's just done a lot of inexplicable things fucked up, I'm like, I just can't relate so much. People could call me out about some stuff, and I would be just as in the absence of an explanation as the other people would be. Yes. I don't know why. In my four-day blackout, the time I think I got closest to dying, it's like, only later the next day am I realizing, Well, clearly, I tried to take a motorcycle, right? Because the motorcycle was sitting on its side. I have pulled ribs. I clearly must have tried to pick that motorcycle up for a very long time. I don't know. I don't know where I was going to go or what I thought I should be doing on a motorcycle or whatever.

01:29:17

I don't know.

01:29:19

Yeah. I mean, I brought you up, obviously, multiple times at the end, in our episode 10 when we recap, and then we also did two more episodes on Patreon where we took listener voicemails, and then we answered those. And there's a lot of poking holes. Yeah. I think everyone has their own relationship with what happens in this story. And that's also interesting to me.

01:29:51

For the people who don't believe, I'm very compassionate toward them and sympathetic. My hunch is they have been lied to multiple, multiple, multiple times by an addict or a manipulator or a deceiver, and they have lost the willingness to trust again. And I get it. Me, too. I felt that way about my dad in some categories. I was like, I'm not even ever buying back into this thing. So if you come in with that, yeah.

01:30:20

And if you have no connection to it whatsoever, then you also have all these ideas. And I think that was me. Before knowing you and Eric, I would have been skeptical, too. I definitely would have been one of those people. But there have been so many experiences with you that are contradictory in a way that has made me change the way I think about people, in a good way. Like, oh, it's really not black and white. It's not that this person behaves in this way and they don't behave in this way, and they're this and they're not this. It's like, oh, my God. We really can be- We run the gamut. All of the things. It has totally changed my perspective on humans, and for the better. I mean, I'm a better person because of it, definitely. It's the idea that I could never that's the part I don't believe in anymore. For your relapse, when all that was going on, it was so hard for me to compute that that was still you. How is this person who I know and love and trust and know wouldn't lie doing all of this?

01:31:47

It's very- Disorienting. It's very disorienting. But what I'm saying, and same with some stuff with some of our other friends, is it's helpful. It's helpful to be reminded that we're all capable of all things at all times, and it doesn't mean someone's good or bad. It's definitely possible to do things that you don't think you'd ever do. Oh, yeah.

01:32:14

It doesn't have to be addiction-related.

01:32:16

Everyone has the chance to do something they thought they would never do. That's right.

01:32:20

Everyone knows what it's like to have secrets.

01:32:23

Yeah. To have regrets.

01:32:24

To lie to protect secrets.

01:32:26

To have regrets and have remorse. Well, because a lot of people were saying, How could the depth of the things that this person was doing, how could they have possibly done that if they had this other thing going on? And I was like, No. I was like, Dax was interviewing big experts.

01:32:51

One of the smartest people we've ever interviewed, in fact, at the lowest point of detox. Exactly.

01:32:57

It was like, It's possible. And I understand if their specific relationship with addiction or those people doesn't present that way. Yeah.

01:33:05

That's the other thing I think is misleading about people who have a generic idea of what an addict is, which is like, everything the person is before they're fucked up also carries over to when they're fucked up. So it's like, if someone's completely irresponsible sober, guess what? That's going to get exaggerated. But if you're incredibly controlling and buttoned up, guess what? My version of addiction looks very much like that. So it's like there isn't a single version of it. It's like you carry into it all the things you already are and add that on top of it. Yeah.

01:33:43

I also think what's good about knowing addicts is it is helpful, because we all just want answers. I mean, this show explores this, too. We want answers. The whole time during this show, the questions are coming up and I'm like, Why? Wait, why? Why did this person do this? Why? And you don't get answers often.

01:34:11

We're trying to understand.

01:34:12

Yeah, but it's also, when that person has done something painful, you want to know why. If someone cheats on someone, they're going to be like, Why? Why? With addiction, what I have just realized is there isn't a why. That's an obvious, common question. I think the reality is you don't get that often. It's okay to not get it, but it's a practice. You got a really good piece.

01:34:46

I think the why, too, it does imply a strategy. Back to what we were saying, it's like this thing was unfolding in the other person's experience of it as much as it was unfolding for Andy and Elizabeth. There wasn't a game plan. There was a first thing that led to a second thing that led to a third thing that led to a... And now it's this enormous snowball that is quite complex by the end of it. But an initial dumb comment on a thing is almost innocuous. And then it's, maybe I'm commenting... I know. It does grow. It doesn't necessarily require a game plan or a goal in mind from the person. It can be getting away from them just as much as it's getting away from the victim.

01:35:32

Yeah. Yeah. That's good. Well, I'm really glad you liked it a lot. It means a lot that you like it. Great job.

01:35:39

Yeah, to you and Elizabeth and Andy. Thanks.

01:35:43

Yeah, check out Best Dead if you haven't. It's all available in your podcast app.

01:35:47

Now you can really barrel through it. If you're driving back and forth from Key West a lot, couldn't recommend it enough.

01:35:55

You want to do facts? Yeah.

01:35:57

Let's do some facts.

01:35:58

When was Adam on last? He was on March 12th, 2020. Wow. Right at the- Wow, like maybe one of our first. He was in person.

01:36:10

Oh, maybe one of our last.

01:36:11

Yeah, because it came out then. So for sure we did it in February.

01:36:16

February 22nd, 2020 with him.

01:36:18

We were recording. Probably one of our last, yeah, before COVID. Okay. Now, are a disproportionate number of professional athletes little siblings? Says yes. Research indicates that a disproportionate number of professional and elite athletes are younger siblings. Oh, my God. The phenomenon is often called the little brother effect.

01:36:41

You haven't heard it in a positive way yet, have you?

01:36:44

No, mine is little brother energy. Energy. Thank you. Studies show that having older siblings to compete with from a young age forces younger children to develop greater skills, tenacity, and physical ability to keep up. The little brother sister effect. Growing up. Yes, same.

01:36:58

Reggie Miller's story is great. His older sister, Cheryl Miller, is the greatest female basketball player of all time. Reggie Miller became one of the greatest of all time. But he was competing with his sister. Yeah, he talks about how long it took him to be able to beat his sister.

01:37:12

Oh, cool.

01:37:13

I like that one.

01:37:14

Yeah, I like that, too. Okay, does Coke own Dasani? Yes. The Coca-Cola Company owns, licenses, or markets more than 200 brands worldwide as of 2025, a reduction from over 400 in recent years to focus on stronger, high-potential products. While known for its flagship soda, the company's portfolio spans thousands of beverage products across categories like water, coffee, tea, juice, and dairy. What's the difference between Mexican Coke and US Coke? The primary difference between American and Mexican Coke is the sweetener. Mexican Coke uses cane sugar while American Coke uses high-fructose corn syrup. I think that's what you said. Also, Mexican Coke is typically packaged in glass bottles. Count Sook Yee comes up yet again. I still don't have it memorized. I know. He said creators making deals off platform to do posts is a $15 billion industry. The content creator brand deal industry is a massive, rapidly expanding sector, with influencer marketing alone valued at roughly $24 billion in 2024 and projected to grow significantly.

01:38:25

Wow.

01:38:26

How much energy and water are we wasting when we say please and thank you to ChatGPT? Oh, great. It really wasn't giving me a very correct answer, and that was concerning, because it didn't want me to know. Okay. It didn't want to tell on itself. Yeah. A recent study found that AI models like ChatGPT use significant amounts of water to stay cool. Every 20 to 50 prompts you type, regardless of their urgency, depth, or silliness, consume the equivalent of half a liter of fresh water.

01:38:52

Okay.

01:38:53

Okay. He said Starlink is more than half the satellites in orbit. As of early 2026, Starlink satellites account for approximately 65% of all active operational satellites in orbit.

01:39:05

And now Bezos is going to be putting satellites up, too. So there's just going to be a trillion satellites. Yeah.

01:39:11

It says with over 9,400 satellites in low Earth orbit. Low Earth orbit, L-E-O.

01:39:20

I'm just wondering, when they launch a spacecraft up out of low orbit, are they dodging those things? How do they plan it? Will there not be a point where there's just a net of those fucking things and we can't even exit?

01:39:33

I know.

01:39:34

Because certainly, if you hit one of those going 17,500 miles an hour, which is how fast they're going, that's going to be an ish.

01:39:42

That's an ish, yeah.

01:39:43

That's a big time ish.

01:39:44

I have it.

01:39:46

How's it going?

01:39:48

It's fine so far. I have to have it. It's the only option. I mean, that's interesting.

01:39:56

I had a thought. I find myself doing this a little bit, and I caught myself doing it. It was like, let's just say if you hate Elon Musk, it's natural that you would root for him to fail. You want him to fail, but what you don't think about is, like, I don't know, he's probably got 50,000 employees. Tesla is an American company that's employing 50,000 people. And there are a lot of people who would love to see Tesla just go under because they hate him. But you've got to really look at it from a... Utilitarian. Yeah, a utilitarian point of view. It's like, do you really want 50,000 jobs to go away? Do you really want a big, successful American company to go under? Do you really want all that tax revenue to go away? That's a lot of damage because you hate one person.

01:40:45

Yeah.

01:40:47

Does that make sense?

01:40:48

It does make sense.

01:40:48

It's complicated.

01:40:51

I do think that people with that money just have this unlimited power. But if they feel it, if their pocketbook feels it or their bottom line feels it, they do adjust. He's still a businessman at the end of the day. So if everyone stops- But currently, they're talking about SpaceX having an IPO, and they're talking about it being probably the biggest IPO in the history of the stock market.

01:41:27

And so even if you took away Tesla, this dude is going to be the richest guy. He's going to get richer. That company is wildly successful, and Starlink is like... So at that point, he's probably beyond hurting, unfortunately.

01:41:43

Yeah.

01:41:44

And then you're just talking about these tens of thousands of people that live in Texas that manufacture these cars, and then the dealerships, and all the mechanics, and then just a big American car company. I found myself really going, like, Yeah, I guess I have to check that. Because, well, I have found myself getting excited when I hear that their stock went down or something. Right.

01:42:05

But there are other companies, there are other huge competing companies that I align with more ideologically. And so I want to support that. I just want to support that more.

01:42:22

Yeah. Unfortunately, none of the American Big Three have gotten their foot in the door on electric. That's the only company that really got its foot in the door globally with electric.

01:42:31

Yeah. I don't have that issue because I do not like Teslas, regardless of him. Sure. If that was a car that I was like, Fuck, I love that car... I don't. I feel so nauseous in those cars. I specifically ask on Uber for no Teslas because I don't like them. So that's not an issue for me. Kristen has an American one. A Bolt, yeah. Yeah, that's a great... I much prefer that driving experience.

01:43:04

I personally do, too. But I also want there to be thriving American industry that employs a lot of people. Then I feel ethically compromised if I let my hatred of one person get in the way of 50,000 people's livelihood.

01:43:20

Yeah, I guess I know what you mean, but there's all these other people who have jobs at other companies, too, that those jobs are being threatened with this monopoly being potentially held by him. It's like, no, I'm probably just going to veer off over here and try to put my money there. It's all personal. People get to spend their money how they want to spend it. That's right.

01:43:43

You get to vote with your money.

01:43:44

You do. You get to do whatever you want.

01:43:47

Well, I would think it was abundantly clear in the interview, but I love Adam.

01:43:52

I'm impressed that he puts himself out there on a hard topic.

01:43:57

He's also pretty staggeringly smart. When I've watched all these other interviews, the amount of things he has a full comprehension of is pretty massive, which I think you have to in that role.

01:44:09

Yes, definitely. Yeah, he was great. All right. Love you. Oh, and we're his favorite pod. Oh, right. That's right. That was flattering. All right. Love you. Love you.

Episode description

Adam Mosseri (Instagram, Facebook, Fortune’s 40 Under 40) is the CEO/Head of Instagram at Meta. Adam joins the Armchair Expert to discuss being the suit in a family of artists and designers, how we build up emotional affinities for particular brands, and why his approach to design is based in problem solving. Adam and Dax talk about using intelligent technology to evaluate safety at scale, how the Instagram algorithm actually works, and the arms race of the ability to detect when something was made by AI. Adam explains the process of rolling out new features and dealing with mistakes, the implications of how power has been shifting from institutions to individuals, and his prediction that authenticity is becoming infinitely reproducible.