
Transcript of Trending Towards Violence with Charlie Warzel

The Weekly Show with Jon Stewart
00:00:00

Hello, everybody. Welcome once again to the Weekly Show podcast. My name is Jon Stewart. We're recording on a Wednesday. I'm even going to tell you around the time: it's around noon. I want to tell you the time, because last time we recorded on a Wednesday, I thought, well, just let them know it's on a Wednesday, and it'll be out on Thursday. And obviously, by Wednesday night, there was the horrific assassination, and we hadn't addressed it in any way in the podcast because of that. I'm preemptively suggesting that we don't know what happens in 24 hours anymore in this general shit show of a society that we're working in now. On a positive note, Colbert won an Emmy, The Late Show. Well-deserved, thank goodness. It was lovely to see the response from the audience in Los Angeles and in the country showing a much-deserved love, much deserved, well earned. It was really lovely to watch, and I thought his words were perfect and beautiful, and a little bit of a tonic in what is just... It's sad. It's difficult to even muster the necessary performative enthusiasm for these types of intros at some level. One thing I should mention is I hope people realize that shit posting after tragedies is not mandatory.

00:01:40

I don't know if people know that, that you don't have to say the worst thing you can possibly think of to inflame the spirits of people who may be suffering. You can actually write it in a journal and put that journal under your mattress and not demonstrate to all those around you that you really lack humanity. You could write it on a piece of paper and then chew it and swallow it rather than... It is not mandatory. You don't have to caveat. You don't have to go out. That's one of my least favorites: I don't agree with everything he said, but he shouldn't have been killed. Just fucking don't say anything, because the truth is, it's one thing. It's this. It's these individuals who believe themselves to be judge, jury, and executioner. A vigilante society is not a society. And that's the part that cannot stand. Ironically, they then find themselves in the warm embrace of every entitlement that our legal system has to offer. As I'm watching all the cable news go through all their pre-ordained steps and all that, I actually came upon an article in The Atlantic that I thought was really interesting, that talked less about the individual event and more about the system around it, a system that we really don't know a lot about, that is an enigma to so many people, and that isn't talked about much.

00:03:26

So I wanted to bring the writer of that article onto the program to have that conversation, which I think could be illuminating to some extent. But man, oh, man. All right, so we're going to get to our guest. He is a staff writer at The Atlantic. He's the author of The Atlantic's newsletter, Galaxy Brain. Charlie Warzel is joining us. Charlie, thank you for joining us.

00:03:58

Thank you for having me.

00:03:59

Charlie, in the week that has followed this terrible, terrible incident, I have read and seen and heard pages of print, hours of chatter, much of it circular, unhelpful, much of it hurtful. Your article on this assassination was, I thought, one of the most interesting because it takes a macro view. A lot of people are talking about... I'll liken it to something else: a lot of people are talking about the weather. You seem to be talking about the climate. Everybody wanted to point fingers at a very specific point of blame. You're talking about a cultural shift in the atmosphere, the complexities of this new communication world and how these individuals exist in the world. Do you want to talk a little bit about that, Charlie? Talk a little bit about the thesis of your article.

00:05:06

Yeah, sure. This whole thing happened last week. As usual, I think for most of us now, you watch this stuff unfold online. You may be seeing stuff happening on cable news or wherever you are or however you're getting it. But these things play out over all of our different social feeds. It's very fractured. It's very confusing. I happen for my job to be more plugged in, probably than the average person, but I think for anyone who is watching this.

00:05:36

Explain your job very quickly to give a background.

00:05:39

Yeah. I cover technology and media and politics, essentially. Just how the internet weirds everything that it touches. I've been doing that for about 15 years.

00:05:50

The three horsemen of the Apocalypse, as it were.

00:05:53

That's right. I'm watching this play out. The moment that I watched this video pop into my feed on wherever, Bluesky, X, you name the thing, watching essentially someone being murdered in front of your face and all the horror that comes with that, I felt this genuine roller-coaster-drop, pit-of-the-stomach feeling for a lot of reasons. The first reason being it's horrifying to watch someone get murdered. You immediately think about who the person is. I know who Charlie Kirk is, young family, young father, all of those human reasons why you feel horrible. But the next reason I felt horrible was I knew exactly how this was going to play out. You knew what was coming. I have watched this happen for more than a decade on these different platforms, and the cycle only refines itself. It only gets faster, it only gets smoother. The participants know what their roles are in the production. They only get better at it. It's more efficient. What I believed was going to happen, outside of the political ramifications of this or even the cultural ramifications, was an immediate... The two polarized sides of our discourse were immediately going to take this and fit it into their ideological box in one second.

00:07:31

We saw this, and I'd love to talk more about this, but there is, I think, a real asymmetry between the two polarized sides. But as I'm scrolling through my feed, it's not clear what has actually happened to Charlie Kirk. We don't have any information. One of the first things I see is an Elon Musk tweet that the left are murderers, essentially.

00:07:57

He's not a very influential force on that platform. So I don't imagine anybody saw that.

00:08:03

It's not like he owns it or anything like that. And so I started watching this. I have feeds that I've set up to monitor some of these shock jocks, some of these influencers, some of these pundits, whatever, people across this spectrum. And it was very clear immediately that this was war, essentially, especially for the far right.

00:08:30

Supercharged, online, and performative.

00:08:35

Yeah. Immediately pointing the finger of blame: this is what happens. No need for evidence. When these types of things happen, motives take a while to... Just physically, you need to go, usually, to the person's house, unless they've left a note right there at the crime scene. You have to apprehend the killer. You have to know things, information. What we have in the aftermath of these big events is an information vacuum. The internet has become so good at filling that. The internet abhors an information vacuum, and it basically incentivizes all of these people, through these different social media feeds, to fill it. And there is this attentional incentive baked into that. Elon Musk is performing, essentially, whether he believes it or not, whatever. But what he's doing when he's tweeting, when they probably still haven't moved Charlie Kirk's body to a hospital to be treated, is this notion of, I am going to give this crowd what it wants.

00:09:44

And in some ways, give himself what he wants, which is the dopamine hit of that engagement. Everybody online is... they're looking for those hits. Charlie, is this rewiring our brains? How much of this is cynical? How much of this is a biohack? In the same way, I liken it to: if you're a chef in a restaurant and you want people to eat your food, you've got some hacks. You'll throw a little sugar into the marinara. You'll put a little fat, a little oil, a little salt, a little something to get it lip-smacking good. That's one thing. But there are labs where people work on ultra-processed food that's designed to hack your body's ability to stop you from eating, that forces you, in some ways beyond your ability to control it, to continue to eat. Is that the insidious secret sauce, that online is ultra-processed speech?

00:10:56

It's a really interesting way to put it. I think I want to be careful with terms like rewiring, because there is this great debate going on about how bad are the phones? How bad is the internet? What is it doing? What is it doing to kids? What is it doing to adults? I fall into the camp of: it's very clear that something is going on here. I can't leave this thing, ever. It's right next to me.

00:11:27

I'm doing a freaking podcast with you. We poop with it. For God's sakes, Charlie, we poop with it.

00:11:32

Yes, among other things. There's clearly something going on. There's something with the dopamine, with the intermittent reward cycle. Because you post something, you do something for this crowd, and it's not reliable whether or not the world is going to shower you with the attention and the affection that you want, or whether the algorithms from these companies are going to do that. You're pulling the slot machine all the time. What it's doing long term, I'm not exactly sure, but what is very clear is that it is changing our incentives, our behavioral incentives. I think it is turning us often into the worst possible versions of ourselves. I think a lot about the way that tech companies and algorithms work with food as well. But the way I think about it is: if you said to somebody, Hey, I'm going to try to diet this year. I'm going to try to eat well and get fit, whatever. Part of that is, every day at four o'clock, I'm going to exercise. Seven days a week, I'm doing it. I want to be a better person or a better version of myself.

00:12:53

And then every day, that friend at 3:57 slides you a donut. What a dick. Exactly. And says to you, You don't have to eat that donut, but it's there. I mean, you could nibble it. You know how much you like donuts, right? You eat the donut because it's been a long day, you're in a moment of weakness, whatever it is. You just feel like you want to do it. Then the way you feel...

00:13:25

Yeah, that's the thing, though. I would say that they understand the biorhythmic changes in your day to know the exact right time to wave that donut under your nose. That it's almost worse than just being a dick. I was watching the trending topics, Charlie, like a horse race, hoping upon hope, like this is what's happened to us. You're praying that it's not committed by somebody that the other side can weaponize. And so you're watching trans start to trend, and then Groyper starts to trend, and then this other thing starts to trend. I don't want to say misinformation, because you don't know yet. It's non-information. It's merely accelerants. These are just accelerants. All right, folks. I'm going to share something with you. You might not know: I'm lazy. When I talk about that, I'm talking about, obviously, food, eating food. Cooking food takes time, and I'm hungry now. But I got to tell you something. I got the perfect solution. There's a company called Factor. Factor, they got you. Chef-prepped, dietitian-approved meals make it unbelievably easy to get your meals, to stay on track. Healthy, healthy food, yet still comforting and delicious. You get a wide selection of weekly meal options, including GLP-1 friendly meals.

00:15:10

Yeah, that's right. They now make meals that are GLP-1 friendly. GLP-5, I don't know. But one? Yes. Get premium seafood options, salmon and shrimp, at no extra charge. 97% of customers, an insane positive rating, say that Factor helped them live a healthier life. Feel the difference no matter your routine. Eat smart at factormeals.com/tws50off. Use code TWS50OFF to get 50% off your first box, plus free breakfast for a year. It's code TWS50OFF at factormeals.com for 50% off your first box, plus free breakfast for a year. Delicious, ready-to-eat meals delivered with Factor. Offer only valid for new Factor customers with code and qualifying auto-renewing subscription purchase.

00:16:08

I think what I struggle with, covering this stuff, is to know how much of this is an intended consequence. When you say something like there's almost an understanding of the biorhythms and things like that, I think a lot of it is actually dumber than that. I think that these algorithms are optimized for engagement, just broad engagement. They don't care whether that engagement is screaming to end the rights of trans people or that engagement is, Look at this kitten. It's furry and beautiful and L-O-L.

00:16:49

Don't you think they understand that outrage and fear travel faster than even adorability? Absolutely. When you monetize, you're maximizing for that. Don't you think they know? I mean, I watch it on the evening news, man. They understand that if a terrible crime happened in the Bronx and they've got video of it, that's what's going first. They always save the... The "and in America tonight" is always the last story, where they're like, A girl sold a cupcake for an adoption shelter. They know what drives the most engagement.

00:17:30

They absolutely do. I'm not trying to let them off the hook, but I think the technology is dumber in that way. I mean, some of them are more sophisticated. TikTok, for example, is a genius, an evil, insidious genius invention, in that every time you scroll to a new video on TikTok, you are giving it a signal. It takes what that video is, and it has it categorized 50 different ways. It's a sports video. It's a music video. It's a band. Oh, it's this clip, this sound clip, whatever it is. And you're voting every six seconds or less. If you flick off something really quickly, your algorithm changes, just an infinitesimal bit. They say, Charlie's not into that today.
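
(A rough illustration, not from the conversation and not any platform's actual code: the feedback loop being described here, where every skip or long watch nudges a per-user preference profile over content categories and the feed is re-ranked toward whatever gets engaged with, can be sketched in a few lines of Python. Every name and number in it, update_profile, rank_feed, LEARNING_RATE, the ten-second dwell normalization, is hypothetical.)

```python
# Minimal sketch of an engagement-driven feed, purely illustrative.
# Not any platform's real code; all parameters are made up.
from collections import defaultdict

LEARNING_RATE = 0.05  # hypothetical: how far a single interaction moves the profile

def update_profile(profile, video_tags, dwell_seconds, skipped):
    """Nudge category weights up on a long watch, down on a quick flick-away."""
    signal = -1.0 if skipped else min(dwell_seconds / 10.0, 1.0)
    for tag in video_tags:
        profile[tag] += LEARNING_RATE * signal
    return profile

def rank_feed(profile, candidates):
    """Order candidate videos by overlap with the learned preference weights."""
    def score(video):
        return sum(profile[tag] for tag in video["tags"])
    return sorted(candidates, key=score, reverse=True)

# Example: one quick skip of a political clip, one long watch of an MMA clip.
profile = defaultdict(float)
update_profile(profile, ["politics", "outrage"], dwell_seconds=1, skipped=True)
update_profile(profile, ["sports", "mma"], dwell_seconds=45, skipped=False)
feed = rank_feed(profile, [
    {"id": "a", "tags": ["politics"]},
    {"id": "b", "tags": ["sports", "mma"]},
])
print([v["id"] for v in feed])  # -> ['b', 'a']: the feed drifts toward what got watched
```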

00:18:15

Do you understand how these work, Charlie? Are you privy in any way to these algorithms?

00:18:22

No. Nobody is, really. These are company secrets.

00:18:27

But how is that possible? Think about it with food, right? They have to tell us what's in it. Now, they don't have to tell us that beaver anus has been substituted for strawberry flavoring, but there does have to be a chemical listing there. How is it possible that these programs, which have such an impact on our lives, and certainly on the lives of young people, have no responsibility to the public to in any way deconstruct how the fuck they're put together and what they're doing to us?

00:19:01

I think there are two reasons. I think the first reason is simply that the way regulation works, the way to get all this going, works so much slower than these companies work. So any time they're about to be put in a box by lawmakers, things change. The whole paradigm changes. You make a rule to govern Facebook after the 2016 election, say, which didn't happen, but say you did that, and then ByteDance from China comes up with TikTok, and there's this whole new way, this whole new thing. There's a little of that going on. I guess maybe there are three things. There's also a lack of understanding, very clearly. Congress has gotten slightly better, but there's a lack of understanding. Third, I think, is the horrific... It's the same problem we were just talking about, the polarization of all this stuff. There is, from the left, this desire of, Hey, we clearly have a neo-Nazi problem on some of these websites and platforms, and these harassment problems, and these things of targeting real human beings, swatting them, doxing them, getting them in trouble with their job, getting them fired, all these actual real-world harms. These are not necessarily vague speech harms.

00:20:29

These are real-world harms and things that are happening. That is the perspective that progressives come to this fight with. Then on the right, you have this free speech maximalist, in theory, idea, which is: if you try to do anything to these platforms whatsoever, you are putting your finger on the scales.

00:20:54

That's right.

00:20:54

It's cloaked in this idea of free speech. It doesn't talk about the fact that... There's a line from this researcher, Renée DiResta, who is great on all these topics, and she has talked about freedom of speech versus freedom of reach. And these platforms, with their algorithms, are about reach. It's not about speech, necessarily. You can say a lot of stuff on these platforms and it gets through, or now a lot of these platforms don't care about content moderation. But the whole point of the algorithm is: what is it boosting? What is it sending out to more people? It's the difference between standing in the town square and speaking versus standing there with a bullhorn that reaches entire municipalities or whatnot.

00:21:42

Right. Or knowing in the town square when you're going to show up and then following you, because they also do that. There are notifications. If somebody on your list puts something out there, you know they're in the town square and you can run and harass them.

00:21:58

Right. Or waking you up when you're asleep and telling you, Hey, you got to listen to this, right? Right. You have this argument that is coming from the right, which is basically: any attempt to shut down some of this political intimidation, some of these communities that exist to harass and belittle and cause this physical-world violence, is immediately seen by the right as putting fingers on the scales of who gets to speak and who doesn't.

00:22:33

Or it was until a week ago, and now it's not.

00:22:36

Or until they get power.

00:22:39

Or until it's not convenient and they want to get somebody else off there.

00:22:42

Until they get power. Elon Musk's whole reason, supposedly, for buying Twitter, which is now X, was this idea of restoring free speech, right? That Democrats have censored conservatives forever. So he buys the company, he reinstates people like Alex Jones, a couple of neo-Nazis, just a whole slew of people who've been banned, not because of their speech, but because they broke the rules. They broke the rules of engagement for this.

00:23:12

Because it's a platform, it's not the government. It's a platform, it's not the government.

00:23:15

He does all this and very clearly has put his fingers on all kinds of different scales there.

00:23:27

MechaHitler would say. That's the one that we know about. He's turned Grok... That's what I'm saying, though. It's completely obscured. You can't see into it.

00:23:39

Right. And so this also causes regular people, and I've been watching this happen for over a decade, to become conspiracy theorists about what happened to the thing that they posted or said. There are so many people I know who are totally just regular, not public-figure-type people, who have said, I think I'm being shadow banned. I think I'm being shadow banned by X or Facebook or whatever. I'm like, You're probably not on their radar if you're just a regular person living your life. But because these things are so opaque, because we don't know what is happening, we create these different stories for why this post didn't go viral, or why I'm not getting followed as much as I think I should, or why X, Y, or Z is happening. But the reality is, too, that there is stuff happening. These companies are tweaking these things. I mean, Elon Musk- Of course.

00:24:36

It's not free speech. It's ultra-processed speech. An algorithm, it's one thing. Look, the fact that it links you to other people, the fact that it drives comments and the comments drive more engagement is what turns it from free speech into ultra-processed speech. It is designed to pull you further and further into the platform and further and further down the rabbit holes. There is gravity. There is inertia. There are forces that are at work, that are not at work in normal free speech settings that are manipulating the dynamics.

00:25:20

One of the most sobering stories of this that I covered: right after the 2020 election, right after January 6th, I was looking at different people who had stormed the Capitol and looking through their social media presences. There's one guy who was a small-time right-wing influencer. I went back, we downloaded his whole Facebook account to look at a decade. It was very clear that starting in 2014, 2015, he was just a guy who was trying to get some attention. He was, Hey, look at my stand-up comedy thing. No, that's not working. Okay, open mic night. All these different, Here's my startup, blah, blah, blah, blah, scenes. He was getting maybe one or two likes on every single post. Then comes a post, not even that extreme, just like, I'm skeptical of this thing, and it gets 250 likes. The next post, a little further. He goes just a little bit further with it, right? Cut to six months later, this guy is storming the Capitol and logging himself doing it. And it's like, that's it right there.

00:26:43

Do you think the difficulty here is you're creating an army of nihilist influencers, because you have to get the dopamine hit? Look, anybody out there who's done drugs, me being one of them, understands that part of the problem with drugs is that to get that same hit, you got to go further. That's the issue, that it goes very quickly from, I feel a part of something, to, I need this, to somewhere that my nervous system has overtaken this, that the pleasure is now not pleasure, it's desperation.

00:27:33

I think the worst part about all of this, though, is that for some people, I think that that's accurate how you're describing it, some people get rewarded and figure it out. If you look at the political situation that we have in the aftermath of the Kirk stuff, we have an influencer who was assassinated. The Trump administration sent two podcasters who just happened to be working for the FBI to Utah to investigate that.

00:28:09

You're talking about Kash Patel and Dan Bongino, who are now the heads of the FBI.

00:28:14

Sent two other influencers, people who've made their bones using all these platforms to create these audiences and continue to ratchet up the takes that they put out. And the Vice President of the United States hosted the influencer's show, Kirk's show. I mean, you just have these influencer dynamics, because the show must go on, right? Because that's really important.

00:28:43

But don't you think some of that is because power understands... Look, propaganda and information: whoever controls that has a huge advantage in terms of political power. And so they're just understanding how to weaponize these various platforms. Everybody is trying to do that, to whatever aim it is. Every medium that came before it had the same problem. I've just never seen it supercharged in this manner. It's not that people didn't... Father Coughlin loved radio. Once radio hit, his propaganda was supercharged. Television had the same thing, man. Everybody remembers Morton Downey Jr. That wasn't so much political, or Jerry Springer, but it was a ratcheting up of dopamine hits to watch something. You saw those things aren't sustainable. They start to spin out of control into caricature. Even though it's done that online, it hasn't been a detriment to the purveyors of it. All right, guys. These days, every headline feels like it's been engineered to make you either furious, terrified, or both. It's honestly maddening. You can't even... You can't watch it. You can't read it. You can't deal with it. Well, Ground News is here to help you fight back against the tyranny of reptilian emotion that these other news organizations are trying to hit you with.

00:30:20

It's a response to this fear- and anger-based media. They don't tell you how to think or feel. They aggregate and organize information just to help readers make their own decisions. Ground News provides users reports that easily compare headlines, or reports that give a summarized breakdown of the specific differences in reporting across the spectrum. It's a great resource. Go to groundnews.com/stewart and subscribe for 40% off the unlimited access Vantage subscription. That brings the price down to about $5 a month. It's groundnews.com/stewart, or scan the QR code on the screen.

00:31:02

It's all personalized, too. That's the big difference, right? Because, again, like what I was saying with the stuff on TikTok and the scrolling and giving the micro feedback to tailor everything to your interests perfectly, right? There are a lot of people who are coming to any influencer who is polarizing, whether it's Charlie Kirk or whomever. They're coming to them in the exact right way. It's not like a clip of the influencer saying something that they might disagree with. It's them talking about the subject that they care about, that they like. Influencers will spend a lot of time on air, and they will clip up these things in all kinds of ways, because they're also trying to find as broad and diverse an audience as they can. They'll talk about cancel culture, Netflix specials or whatever, the movie of the week, whatever the little thing that matters is. Then they'll talk about sports.

00:32:07

The aggregators at the larger media companies will find the most explosive clips, and they'll put that out. I mean, the one thing that I want to say is everybody's playing the same game, and that is to join in this new everything-is-content economy, and an assassination... I think the thing that has been so jarring is that an assassination is content, is another driver of everybody's, I picked up these followers. People shit posting, people posting horrible things about a family that's just lost a father. All of this. You know what's interesting to watch? Cable television displaying its own vestigial uselessness. They're going back and forth as though they might be the issue when they're not even part of the game right now. They're following. And by the way, this kid, he's on Discord or some other platform. Why would you name your platform after argument? It's discord. It's shittiness. But who even knows the ecosystem that he's swimming in? We don't know any of this.

00:33:42

It doesn't look like it was a great one, honestly.

00:33:45

How could it be?

00:33:46

The charging document, or whatever we're calling it, of the shooter. First of all, I don't want to name the shooter, because that's also a part of all of this. These mass shooters now are showing us with each of these killings that they are absorbing all of this. They're absorbing how you and I talk about it, how I write about it, how cable news portrays it, all of these things. In that charging document, the shooter inscribed a bunch of different hyper, very online phrases on the bullets and stuff, right? Right. And was texting with the roommate right after, or right before, the assassination.

00:34:33

The roommate or the boyfriend or a girl. They're in a relationship, I think. It seems.

00:34:39

But again, who the fuck knows? Right. I can't quote it directly, but it's something like, if I see whatever the phrase is on Fox News, I'm going to lose it. Essentially saying it's a troll. The bullets, these things.

00:34:57

Oh, my God. They're saying, Oh, if you do that and you get on Fox News, you win the day on the internet. So these people are, like, assassination influencers, like murder influencers. Exactly. On the very same day, there was a 15-year-old kid who shot up his high school in Denver. Again, it hasn't really received the coverage, because I don't think it fits as well into the algorithm's engagement model. But apparently this kid was involved in a whole other subculture online that is a public murder subculture, if that's not the wrong phrase to use. But do you know anything about that subculture as well, or is it all part of the same weird thing?

00:35:58

It's very amorphous. Online communities tend to be this way. I mean, some of them are very rigid, but generally, the best thing and the worst thing about the internet is that if you are somebody who feels very alone and has some really niche things... I love whatever, this trombone player. I'm growing up in Iowa. I feel like I'm a weirdo. I'm ostracized by whatever. You can go on the internet and you can find the 11 other people who have that strange hobby, that strange, weird thing, and you can find each other. There are great stories there. It's a wonderful thing. The exact same thing works with people who love and are really interested in the Columbine shooting, and not because they think it's an interesting historical thing; it's because they idolize the killers. The internet puts these different communities together, and then, not quite exactly the same as Netflix's, If you liked this, you might be interested in... But organically, there is this referral thing where, Oh, you're involved with this very messed-up subculture. Well, there are some people on the outskirts who are part of that, and they'll help introduce you. There are these...

00:37:16

There's Terrorgram, which is basically a terrorist subculture, public acts of terrorism, but not joining ISIS or whatever. It's just regular people who are fascinated by this and want to take up a political cause. There's this far-right, the neo-Nazi stuff, that's very real and actually bigoted.

00:37:41

Luigi Mangione has an entire subculture of supporters for vigilante assassins. I don't know if that's... The thing you don't know is, are they part of those communities, or were they lionized by those communities but separate from them? What's the causation?

00:38:07

We have some evidence in certain instances. All right. Talk about that. There is some evidence, and I'm probably going to get the particulars wrong, but there are two recent school shootings, from last year and earlier this year, in Nashville and in Madison, Wisconsin. I'm going to confuse which is which. But there's investigative reporting that shows that they crossed paths online.

00:38:38

The two shooters separately knew each other online.

00:38:43

On the internet.

00:38:44

On the internet.

00:38:45

It's not that they were best friends or anything like that, but when the first shooter committed the act of violence and murder, the second shooter, and I'm just trying to not name names here because I'm sticking with that, the second shooter was like, Oh, my God, I know her. By the way, it was a young girl. It was a 15-year-old, I believe, girl. That is something that is also changing. The threat profile of this is... Experts who I've spoken to who study this stuff have said it's democratizing, because there are mass shooter online fandom communities that exist. Some of them are around specific instances, like Columbine. Some of them are just simply... There's the Christchurch shooter, who is lionized. There was a shooting that I wrote about, and I think it's been two weeks now, but it feels like 20 years.

00:39:46

Jesus.

00:39:47

At a church in Minneapolis. The shooter had inscribed all these other shooters on their gun and all these online memes and phrases. That was very clearly, very, very clearly, just fan service, essentially, to this community saying, I'm going to do it. I'm going to be the next guy, and you're going to write my name on the next gun. That's what it's about. Jesus.

00:40:18

It's the cinnamon challenge. It's the thing that goes viral online, but to the extent of depravity and murder. Is that similar?

00:40:29

Yeah. Essentially, yes. Very truly, these dynamics... The thing about the internet is the specifics are always very different, and the specifics certainly matter. But there are a lot of very similar dynamics that happen. When I talk to some of these analysts and researchers who spend time, at great psychological pain, in these subcultures or around them, what they describe essentially is a love bombing that these groups do. So someone will wander in. Maybe they're just interested in true crime. They're pretty depressed, and they're pretty upset, and they see this subreddit, or not subreddit, but some community, online community, that is interested in school shootings. And they don't know exactly what they're getting, but the people there know how to groom them. They know how to bring them into the movement. They know how to treat them, give them something they're not getting anywhere else. And a lot of times, the people who end up going way down the rabbit hole tend to feel extremely alienated from their communities or from their families, or personally alienated by their own life choices in whatever way. There's this way that these communities pull them in, and know how to do that, and get better at doing that, and give them, essentially, a safe space.

00:42:11

Like a cult. That's how cults operate. They love bomb you and bring people in who are searching for something and bring them into these smaller subcommunities, offer them. But the question then always becomes, what is the radicalizing factor? And can they take somebody from neutral? I know you did a blank slate user thing on X. What happened then? What was the dynamic of that for you?

00:42:43

When I did that, this was right around the 2024 election. My personal experience, and the experience I was having doing research and reporting, was that the site was getting worse. I was starting to see straight-up neo-Nazi content in my feeds without showing any affinity for that myself.

00:43:10

I'm always like, when your For You page shows you that, you're like, what have I done to make you think that that's for me?

00:43:19

That's the thing, right? That's the, quote unquote, invisible algorithmic hand where you're saying, well, okay, is this Elon Musk? Is this whatever? So what I did is I created a new account, got a different browser, whatever, so it didn't know who I was, created a different account. I can't remember precisely the things that I said I was interested in, but I think it was technology and sports, right? Like, very bland. Sure. I'm a person. I have some interests. Yes. I'm not going to tell you what. I didn't follow anyone. First thing I see when I open up the new For You feed, because I'm not following anyone: it's eight Elon Musk tweets. Just like- That's a welcome from the- Of course, from the concierge.

00:44:06

It's like when you move into a neighborhood and somebody brings over the big basket of fruit. You've joined a community. Yeah.

00:44:13

I'm trying to see what else. I got it here. Okay. A Musk post was the first thing, then a post from Donald Trump, then a tweet from an account called MJ Truth Ultra, which offered a warning from a supposed FBI whistleblower that said: vote, arm yourself, stock up three to four months' supply of food and water, and pray. After that was a post from a MAGA influencer.

00:44:38

Wait, whoa. Nothing tripped this up. You're in a clean... Does it think... Because is there something suspicious maybe about not having any data history that makes it think like, Oh, this is an experiment or a Russian bot? Was there something to the fact that you were a clean slate that seems suspicious?

00:45:07

I mean, if that is the case, I don't know why you'd be serving a supposed Russian bot with, like, vote and arm yourself and stock up.

00:45:14

Wouldn't a Russian bot enjoy-I guess, maybe. Destroying the fabric of the United States?

00:45:20

The answer to that is that I can't be sure. But I also, again, these things are not that savvy. They're just trying... All these platforms are numbers games, both with engagement but also with users.

00:45:36

But it's clearly not agnostic. It's pointing you in a very particular direction.

00:45:41

Yeah. I mean, there are a couple of posts from Libs of TikTok. There was a post from Benny Johnson and Jack Posobiec.

00:45:50

All right wing content. All of it. All of it. Yes. You've not engaged with any of this yet?

00:45:56

No, nothing. This is just the first scroll. Just the first scroll.

00:46:01

That's your starter kit. Yeah.

00:46:03

No, truly. I let it marinate. I checked back in a few times, and it essentially devolved into videos from manosphere influencers like Andrew Tate. There was a lot of MMA stuff. It was a very stereotypical version of a user, which happens to be the core user now.

00:46:33

Young male, technology, sports. They just think, like, if you're into technology and sports, you're one of us, you're in the bro sphere, and we're going to give you that stuff that just massages that reptilian part of your not-quite-formed brain and bring you in.

00:46:54

Yeah. It's mind-blowing because you said, Well, okay, what was the catch there? When I was performing this experiment, I wasn't expecting to just be able to copy and paste that into an article and be like, Well, there you go. I thought I was going to have to work a little for it, right?

00:47:15

Right. You're going to have to work to join the manosphere. Right. Now, what would that have been, if I can ask, what do you think that would have been on a different social... Like, either Instagram or Bluesky. Does Bluesky immediately bring you to Lilith Fair? What's the difference?

00:47:37

Bluesky is not really algorithmic in the same way. It's an experiment to try to do the internet in a different way. It has its own polarization issues. It has its own... people have been bullied off or whatever. But it's a different experiment. It's trying not to speedrun all the mistakes of Twitter. Threads is like, you have REI and Johnson & Johnson tweeting, what's up, babe?

00:48:13

It sounded like you were going down to RC Cola. You're like, Well, there's Coke and there's Pepsi. Hey, RC Cola is, yeah, that's fine.

00:48:21

No, it's like brands mindlessly tweeting. It's like Starbucks saying to Burger King, You up? It's just as bad. Very dead inside.

00:48:32

It's radicalizing the Burger King.

00:48:34

But again, TikTok is the most sophisticated version of all of this, because you open that up and it's going to throw you really popular posts, and sometimes those will be political, right? But a lot of times those will be silly dances or what have you. The thing is, it gets real good at it. There's a situation where, in my reporting, all sorts of people have gotten in trouble with their spouses because they stay too long on a video of a buxom woman, and then TikTok is like, Oh, he's a perv. Got it. Okay, let's give him- Oh, my God. Let's give him boobs. And so you just end up with this for a while.

00:49:23

They've got to try and game their own feed, like, Neil deGrasse Tyson, Neil deGrasse Tyson. Come on, Tyson. Let me get some astrophysics in there. We got to get the boobs out.

00:49:33

Yeah, but this is also something that, not to overgeneralize for a generation, but younger people, early millennials, Zoomers, whatever the other generation is called, I don't remember, they're very good at speaking to the algorithms in this way. They know that they're being perceived in this way, and they're good at trying to clean up their timeline, or do whatever, or try to get something out of it.

00:50:14

The thing that struck me, if the communications are to be believed, is the utter lack of understanding that you've ended a life. The impact of that seemed completely bereft of consequence; he either did not understand or did not care.

00:50:44

Yeah, I think some of that is beyond my scope when it comes to the psychological issues of depression or whatever. But the thing that I really can understand, and I actually think the thing to take away from most of these violent acts, is this irony-poisoned nihilism that is the lingua franca of the internet. That there is this L-O-L, nothing matters, feeling. The only thing that really matters is getting a rise out of different people, performing for your people. This Minnesota, Minneapolis shooter.

00:51:30

People immediately tried. This is the church or this is... This is the church. Here's how horrible this is. I'm like, Are you talking about the church or the guy who went and killed Melissa Hortman? I mean, that's- No, sorry.

00:51:40

Yeah. No, I know. Sorry. This is the church shooting that happened two and a half weeks ago. This person had all kinds of awful stuff, racist, white supremacist, anti-Semitic stuff, scrawled on his guns and his magazines. He also had weird stuff about Exxon, down with Exxon and down with BlackRock, all these weird things. He had all these shooters...

00:52:07

Sort of that extreme left and extreme right meeting. Everybody talks about that, the extreme left and extreme right meeting in this weird place.

00:52:16

But I think it wasn't that. I think the point is they knew, when they did this, that it was going to make news, that it was going to be picked apart, that he was going to be treated like a puzzle to solve by the media, by you and me. And that's why there's all this stuff on there that is actually ideologically incoherent.

00:52:41

So in some ways, you're saying that the idea behind all this is nothing more than murder, fame in your online group. And also, now let's all sit back and watch the world burn.

00:52:58

Yeah.

00:52:59

Are we... Are they already behind on all of it? We're discussing it in terms of the algorithm on social media. Have they already found other, deeper caverns where they're living that aren't even touched by those algorithms? Are these people even on Instagram? Or are they living in... Is it subterranean, even, from that?

00:53:30

So these communities, a lot of them will exist in chat spaces, which aren't really algorithmically driven. Some of the- What does chat space mean? There are WhatsApp groups. There's Telegram. Telegram is a messaging service that, infamously, doesn't moderate. It took the European Union and the Department of Justice a lot of work to get them to try to get rid of ICS content. They do not moderate for anything. They're a free speech maximalist platform. There are communities there, on Telegram. The terrorism, white supremacy, the school shooter fandoms, whatever. There are... Discord, as you mentioned, is a platform that does moderate, but is locked in this whack-a-mole game. You were mentioning cults and leaders. There's no leader to a lot of these things.

00:54:38

It's an amoeba. It's the Borg. It's this weird- It's headless, yeah. It's a headless network. I wonder sometimes, this is where we're at now. AI is going to supercharge this, I'm assuming. It's going to supercharge the algorithm. It's going to make this worse. Do you have a sense, though, of whether other countries are facing this issue? Why does America seem to be... I mean, other than our unbelievable access to the weapons of death, is that the only thing that supercharges this in this country? Because Europe, for all its problems, doesn't seem to have this one.

00:55:25

It seems like it's probably the original sin when it comes to taking online violence and chaos and nihilism and making it physical.

00:55:40

It's just pretty easy. But it doesn't seem to have happened in other places. Certainly not as often and as constantly as it does here.

00:55:54

Yeah. I think the closest thing that you see globally is terrorist organizations, like fringe radical and terrorist groups. Oh, my God.

00:56:03

But I think- We have an ISIS of disaffected fucking high school students and college students? Kind of. Wow. I mean, that's deep, man. But there's not really an ideology behind it, other than this.

00:56:18

I mean, again, I think when I go back to watching, one of the things that was so frustrating, whether it was this church shooter in Minneapolis or the stuff with Kirk and the bullets: the first thing that came out was that unsourced, or single-sourced, Wall Street Journal report, citing the ATF, that said there was transgender ideology on the thing. When I see that, as someone who's spent a lot of time trying to wrap my head around this stuff, I go, Oh, man, whoever that person was who wrote down that report, I don't care if they're politically motivated or not, and maybe they are, maybe they're not, but I know for damn sure that they have never spent time in these communities. You don't just say words like that.

00:57:04

That we are jumping to conclusions that favor whatever our narratives are, when the exact point of these people is to be enigmas, to be puzzles, to have people not understand. Or maybe some of them go that way, and some of them are much clearer about their political aims. But the certainty that you see in legacy media creates more misinformation, and uncertainty is the point.

00:57:33

In the chaos. In the chaos. If you go into these communities... we hear a lot about 4chan. If you go into one of the political message boards on 4chan, part of what you're seeing is just a lot of chaos. Just a lot of people insulting each other, but they don't really mean it, or they do mean it. It's nuts. There is a way in which these communities have this feeling. And by committing this act, by murdering a bunch of people, they see it as a troll. They are trolling everyone, because just you watch: Donald Trump is going to get on Fox News and say something, right? Or MSNBC is going to do this, or these lawmakers are all going to point the finger.

00:58:32

Or this dumb- Oh, God. They're playing the response.

00:58:36

I mean, truly, I feel sometimes like I'm, again, over generalizing when I'm looking at this.

00:58:46

It's hard to have any conversation without over generalizing.

00:58:48

But every person has got their own motivations and whatnot. Then you read the charging document of Charlie Kirk's shooter, and it says, Man, if I see this on Fox News, I'm going to have a stroke. That's the quote. I'm like, Well, there it is. That's it.

00:59:06

That's one of the things: they want to see the reaction from the so-called normal world or whatever.

00:59:12

And wants to see the normal people not be able to understand it. They want the reporters, the journalists, the lawmakers, the investigators to try to solve their puzzle and fail, because they... When you think about it, that's so human, right? They want to be understood, but they also want to be seen as unendingly complex.

00:59:35

I've got no sympathy for the misunderstood miscreant that's like, I just want to be seen. You're like, Man, fucking take a painting class. Get good at something.

00:59:46

I'm certainly not defending it. No, no, no.

00:59:48

But that's what it is. I didn't mean that. I just meant, yeah, I get what you're saying. Do you think, look, anything that can be constructed can be deconstructed. Are there tools that can give us a better sense of when these things are going to break into real life? And are we going to have to understand how to utilize those? Is that the way out of this? I don't want to be like the kids in the underground subreddits, but let's face facts. The overwhelming majority of it is pretty harmless. But is there some way to crystallize where the real peril is and how to find our way out of it?

01:00:39

I think there are so many ways you could go with all of this. When I talk to a lot of these researchers, they all put their heads in their hands, because they're like, What I'm going to say to you sounds so naive and so pie in the sky, and you're going to dismiss me immediately. But they're like, We need more third spaces and community spaces. We need the infrastructure that's not digital. We need the physical infrastructure.

01:01:14

God, the answer is midnight basketball? Yes.

01:01:18

The answer is a community of people who care about, and keep, people who are showing these types of tendencies from driving themselves into, or running into the arms of, these other people. The one thing that this one researcher at the University of Colorado Boulder, Alex Newhouse, told me is that you can see in these people a receding from the physical world when they start to really go down these paths. They're not going to school. They stop going to these things, and they spend all of their time there.

01:02:01

But this kid didn't seem like that at all. No. He's always having dinner with his parents. He's going to work, he's going to school, he's doing the whole thing.

01:02:08

That's why we don't know whether... We still don't really even know; he's not being cooperative. He may fit into this other box, but it's very clear there are a few little things, signs and things that he left, that he thought would either be nihilistically funny or whatnot. But there are a few... It's a physical thing. Right.

01:02:30

Or was like, this person is a hateful person, and I've decided to be the judge, jury, and executioner. It could simply be that. Right.

01:02:39

Truly. I wrote this in the last piece, but the thing that, I think, unnerves people the most is this idea that there might not be an ideology to pin it to. I think a lot of people were like, Oh, I hope it's not my side who did this. I hope that person's not on my side, because I don't want to have to answer for that or whatnot, or that's going to make things worse for me somehow. But I think even more unnerving to people is this: what made them do this? What failsafe didn't work? Sure.

01:03:23

No, it takes you back. I think in some respects, you say, Okay, we've seen assassinations before in our history, we've seen times of volatility and insecurity, and they generally revolved around either the Vietnam War, racial issues, or different things. This feels like A Clockwork Orange. This is a whole other subculture that, like you say, has its own language, has its own mores. It's why the Manson case created such hysteria back when it first surfaced.

01:04:10

I think it's really interesting. People have brought this up, and again, this is a little outside of my zone of expertise, but I think it's interesting. We don't really have a serial killer problem in the same way that historically we had, but we do have this mass shooting problem. It's interesting: is this a different expression of a type of violence, of a type of thing? Are we seeing the same things, but they're being transformed by our current culture, our current media, our current whatever?

01:04:48

A nihilism that's always been with us, supercharged by technological advances.

01:04:52

Again, that's a little beyond my scope, but I think it's certainly fascinating. I think it's good to keep in perspective with this because you can immediately go into the doomer territory with all this. It's very easy to do.

01:05:08

Yeah.

01:05:08

But I do think that this isn't the common experience of being online.

01:05:15

No, I know. It's one of those things where you're like, But think of all the people who aren't shot. It's one of those like, Yeah, okay. All right.

01:05:23

No, totally. But I think when you're trying to address this, I think one of the biggest issues, there's obviously the technological issues. I think the government needs to really start taking this even more seriously.

01:05:43

Well, they apparently are going to on the left. Apparently, they're not going to do it. Apparently, George Soros is going to jail for this one.

01:05:51

Right. Yeah. No, that's not going to make anything better. But I think the Biden administration, right at the end, designated this online terrorist group, Terrorgram, as a domestic threat, which puts it in a different category and allows the actual law enforcement agencies to take this more seriously, to monitor this stuff. Again, that gets into conversations about surveillance, and what are we doing surveilling our kids, and are our kids going to get caught up in this thing because they're ironically posting about whatever. It's not a coincidence that these communities are all cloaked in 85 layers of irony, because it also makes it a little harder to track and understand what is actually going on until it might be too late. But I also think one of the biggest things here that I see in the aftermath of Kirk's assassination is that people need to start taking the internet much more seriously. I've seen journalists who have glibly and proudly been like, Well, I don't know about those spaces or those types of things. What I would say is, if you are trying to cover politics in 2025, you need to know what a Discord room is.

01:07:29

You need to know how these things operate. You need to understand. I'm not saying you have to drive yourself mad or anything like that, but you have to understand these things, because what we are doing right now, very clearly, as a broad media and political system, a cultural system, is we are playing into the hands of these people who are doing this. We are giving them what they want.

01:07:54

We are performing for the internet. You see this after Charlie Kirk's assassination. You're watching internet residents perform, with either shit posting or whatever. You're seeing cable news certainly perform. All of the, Our prayers, and other things. They're going through all the stages of how a story is, in some ways, put through the refinery of our modern media outlets. There's very little insight and a lot of performance. And maybe that's the exact wrong thing to be steering into.

01:08:38

I have no idea if this would have as much of an effect as I think it would. But if the media stopped naming these shooters, I think that that is such a small thing that actually goes a bit of a way. Because, again, the thing that I think was so chilling about the Minneapolis church shooting is that they wrote all these names of other mass shooters on the gun. The shooters- That was very clearly a- Christchurch.

01:09:13

I want whatever media storm those people had. I want that.

01:09:18

Precisely. I want to be the next name on the gun.

01:09:20

Hey, listen, man, I know you got a lot going on, and I appreciate you spending the time with us. I got to tell you, I made my bones in this business spending hours and hours watching cable news. It was a corrosive experience. I can't imagine, because I think you're doing the same, but in an even more difficult and acidic environment. You're in the atmosphere of Venus. We were still on Earth, but you're in a whole other area. So much appreciated. Charlie Warzel, staff writer at The Atlantic, author of its newsletter, Galaxy Brain. Thanks very much for talking to us, Charlie.

01:10:07

Yeah, thanks for having me.

01:10:12

Wow. I have to say, it's what I thought it was: this strange, unregulated Wild West culture where the worst version of yourself is not just visible, I would say, but encouraged, cultivated, fertilized. It is exactly the environment that we need to aerate if we are to move forward in non-nihilist fashion. Yes, that is for sure. But I do want to thank Charlie for taking the time. I also, as always, want to thank all those who work on this podcast, who do such a great job week in and week out. The not unsung heroes; they are the song. We should sing their praises more often. Our lead producer, Lauren Walker; producers Brittany Mehmedovic and Gillian Spear; video editor and engineer, Rob Vitolo; audio editor and engineer, Nicole Boyce; and our executive producers, Chris McShane and Caity Gray. We will see you next week under, hopefully, more pleasant circumstances. Bye-bye. The Weekly Show with Jon Stewart is a Comedy Central podcast. It's produced by Paramount Audio and Busboy Productions.


AI Transcription provided by HappyScribe
Episode description

In the aftermath of Charlie Kirk’s assassination, Jon is joined by Charlie Warzel, staff writer at The Atlantic and author of its "Galaxy Brain" newsletter. Together, they explore how algorithms distort the way we experience these tragic events while rewarding the most extreme reactions, investigate the online ecosystems that can radicalize individuals, and consider whether our responses to violence perpetuate the very cycles we condemn.

This podcast episode is brought to you by:

FACTOR - Go to https://www.factormeals.com/TWS50OFF  to claim 50% off your first box, plus Free Breakfast for 1 Year.

GROUND NEWS - Go to https://groundnews.com/stewart to see how any news story is being framed by news outlets around the world and across the political spectrum. Use my link to get 40% off unlimited access with the Vantage Subscription.

Follow The Weekly Show with Jon Stewart on social media for more: 

> YouTube: https://www.youtube.com/@weeklyshowpodcast

> Instagram: https://www.instagram.com/weeklyshowpodcast

> TikTok: https://tiktok.com/@weeklyshowpodcast 

> X: https://x.com/weeklyshowpod  

> BlueSky: https://bsky.app/profile/theweeklyshowpodcast.com

Host/Executive Producer – Jon Stewart

Executive Producer – James Dixon

Executive Producer – Chris McShane

Executive Producer – Caity Gray

Lead Producer – Lauren Walker

Producer – Brittany Mehmedovic 

Producer – Gillian Spear

Video Editor & Engineer – Rob Vitolo

Audio Editor & Engineer – Nicole Boyce

Music by Hansdale Hsu

Learn more about your ad choices. Visit podcastchoices.com/adchoices