
Transcript of 'The Interview': The Culture Wars Came for Wikipedia. Jimmy Wales Is Staying the Course.

The Daily
00:00:00

Hi, this is Melissa Clark from New York Times Cooking. Who doesn't love a simple one-pan meal? Take my Shakshuka With Feta recipe, for instance. In a single skillet, you get perfectly cooked eggs nestled in a bright and fragrant tomato sauce, surrounded by creamy nuggets of melted feta. It's a delicious breakfast, but it's just as good for dinner, and it won't leave you with a lot of cleanup.

00:00:21

You can find this recipe and all of our fan-favorite one-pan recipes at NYTCooking.com.

00:00:27

NYT Cooking has you covered with recipes, advice, and inspiration for any occasion. From The New York Times, this is The Interview. I'm Lulu Garcia-Navarro. As one of the most popular websites in the world, Wikipedia helps define our common understanding of just about everything. But recently, the site has gone from public utility to a favorite target of Elon Musk, congressional Republicans, and MAGA influencers, who all claim that Wikipedia is biased. In many ways, those debates over Wikipedia are a microcosm of bigger discussions we're having right now about consensus, civil disagreement, shared reality, truth, facts, all those little easy topics. A bit of history: Wikipedia was founded back in the Paleolithic era of the internet, in 2001, by Larry Sanger and Jimmy Wales. It has always operated as a nonprofit, and it employs a decentralized system of editing by volunteers, most of whom do so anonymously. There are rules over how people should engage on the site (cordially) and how changes are made (transparently). It's led to a culture of civil disagreement that has made Wikipedia what some have called the last best place on the internet. Now, with that culture under threat, Jimmy Wales has written a book called The Seven Rules of Trust, trying to take the lessons of Wikipedia's success and apply them to our increasingly partisan, trust-depleted world.

00:02:04

I have to say, I did come in skeptical of his prescriptions, but I left hoping he's right. Here's my conversation with Wikipedia co-founder Jimmy Wales. I wanted to talk to you because I think this is a very tenuous moment for trust, and your new book is all about that. In it, you lay out what you call the seven rules of trust, based on your work at Wikipedia. We'll talk about all those, as well as some of the threats and challenges to Wikipedia. But big picture, how would you describe our current trust deficit?

00:02:47

I draw a distinction between what's going on maybe with the politics and journalism, the culture wars and all of that, and day-to-day life. Because I think in day-to-day life, people still do trust each other. People generally think most people are basically nice, and we're all human beings bumping along on the planet, trying to do our best. And obviously, there are definitely people who aren't trustworthy. But the crisis we see in politics, trust in politicians, trust in journalism, trust in business, that is coming from other places and is something that we can fix.

00:03:29

One of the reasons why you can be an authority on this is because you created something that scores very high on trust. You have built something that people want to engage with.

00:03:42

Yeah. I do think Wikipedia isn't as good as I want it to be. I think that's part of why people do have a certain amount of trust for us: because we try to be really transparent. You see the notice at the top of the page sometimes; it says, the neutrality of this page has been disputed, or, the following section doesn't cite any sources. People like that. Not many places these days will tell you, Hey, we're not so sure here. And it shows that the public does have a real desire for unbiased, neutral information. They want to trust. They want the sources. They want you to prove what you're saying and so forth.

00:04:26

How does Wikipedia define a fact?

00:04:30

Basically, we're very old-fashioned about this. What we look for is good-quality sources. We like peer-reviewed scientific research, for example, as opposed to populist tabloid reports. We look for quality magazines, newspapers, et cetera. We don't typically treat a random tweet as a fact. We're pretty boring in that regard.

00:05:00

Yeah, it's like: the publication that you cite gets cited by other reputable sources, and it issues corrections when it gets things wrong.

00:05:08

It's all the old-fashioned good stuff. I think it's important to say, when we look at different sources, they will often come to things from a different perspective or a different political point of view. That doesn't diminish the quality of the source. So for example, I live here in London, in the UK. We have the Telegraph, which is a generally right-leaning but quality newspaper. We have The Guardian, a generally left-leaning but quality newspaper. Hopefully, as you read the articles and glean through them, the facts should be reliable and solid, but you have to be very careful as an editor to tease out: okay, what are the facts that are agreed upon here? And what are the things that are opinions on those facts? That's an editorial job. It's never perfect and it's never easy.

00:05:55

Wikipedia also is famously open source. It's decentralized, and essentially it's run by thousands of volunteer editors. You don't run Wikipedia, we should say.

00:06:06

It runs me.

00:06:08

How do those editors fix disputes when they don't agree on what facts to be included or on how something is written? How do you negotiate those differences?

00:06:19

Well, in the best cases, what happens, and what should happen always, is: take a controversial issue like abortion. Obviously, if you think about a kind and thoughtful Catholic priest and a kind and thoughtful Planned Parenthood activist, they're never going to agree about abortion. But probably they can come together, because I said they're kind and thoughtful, and say, Okay, but we can report on the dispute. So rather than trying to say abortion is a sin or abortion is a human right, you could say the Catholic Church's position is this, and the critics have responded thusly. You'll start to hear a little of the Wikipedia style, because I believe that that's what a reader really wants. They don't want to come and get one side of the story. They want to come and say, Okay, wait, hold on. I actually want to understand what people are arguing about. I want to understand both sides. What are the best arguments here?

00:07:14

Yeah, and basically every page has what's called a talk tab, where you can see the history of the discussions and the disputes, which relates to another principle of the site, which is transparency. You can look at everything and see who did what and what their reasoning was.

00:07:27

Yeah, exactly. Sometimes, if you see something and you think, okay, well, why does it say that? Often you'll be able to go on the talk page and read what the debate was and how it went, and you can weigh in there and you can join in and say, actually, I still think you've got it wrong. Here's some more sources, here's some more information, maybe propose a compromise, that sort of thing. And in my experience, it turns out that a lot of pretty ideological people on either side are actually more comfortable doing that, because they feel confident in their beliefs. I think it's the people who, and you find lots of them on Twitter, for example, are not that confident in their own values and their own belief system, and they feel fear or panic or anger if someone's disagreeing with them, rather than saying, Okay, look, that's different from what I think. Let me explain my position, which is where your more intellectually grounded person will come from.

00:08:27

What you're saying is actually supported by a study about Wikipedia that came out in the science journal Nature in 2019. It's called The Wisdom of Polarized Crowds. Perhaps counterintuitively, it says that politically contentious Wikipedia pages end up being of higher quality, meaning that they're more evidence-based and have more consensus around them. But I do want to ask about the times when consensus building isn't necessarily easy as it relates to specific topics on Wikipedia. Some pages actually have restricted editing privileges: the Arab-Israeli conflict, climate change, abortion, unsurprising topics there. Why are those restricted, and why doesn't the wisdom of polarized crowds work for those subjects?

00:09:12

Well, typically, with the subjects that are restricted, we try to keep that list as short as we can. The most common type of case is if something's really big in the news, or if some big online influencer says, Wikipedia is wrong, go and do something about it. We get a rush of people who don't understand our culture, don't understand the rules, and they're just vandalizing or they're just being rude and so on and so forth. And just as a calming-down measure, we say, Okay, hold on, just slow down. We're going to protect the page. The most common type of protection we call semi-protection, which just means you have to have had an account for, I forget the exact numbers, it's something like four days, and you have to have made 10 edits without getting banned. Now, typically, and this is what's surprising to a lot of people about Wikipedia, 99% of the pages, maybe more than 99%, you can edit without even logging in, and it goes live instantly. That's mind-boggling. But it points to the fact that most people are basically nice. Most people are trustworthy. People don't just come by and vandalize Wikipedia.

00:10:18

Often, if they do, it's because they're just experimenting, or they didn't believe it would... They're like, Oh, my God, it actually went live, and I didn't know it was going to do that. It's like, Yeah, please don't do that.
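As a rough illustration, not Wikipedia's actual implementation: the semi-protection rule Wales half-remembers could be sketched in Python like this, with the four-day and ten-edit thresholds taken from his own hedged recollection.

```python
# Toy sketch of the semi-protection rule Wales paraphrases: roughly
# four days of account age and ten edits without a ban. The real
# thresholds live in MediaWiki configuration, not in code like this;
# the numbers below follow his hedged recollection.
from datetime import datetime, timedelta, timezone

def can_edit_semi_protected(account_created: datetime,
                            edit_count: int,
                            is_banned: bool) -> bool:
    account_age = datetime.now(timezone.utc) - account_created
    return (account_age >= timedelta(days=4)
            and edit_count >= 10
            and not is_banned)
```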

00:10:33

This brings me to some of the challenges. Wikipedia, while it has created this very trustworthy system, is under attack from a lot of different places. One of Wikipedia's superpowers can also be seen as a vulnerability: the fact that it is created by human editors. Human editors can be threatened, even though they're supposed to be anonymous. You've had editors doxxed and pressured by governments to doctor information. Some have had to flee their home countries. I'm thinking of what's happened in Russia and in India, where those governments have really taken aim at Wikipedia. Would you say this is an expanding problem?

00:11:13

Yeah, I would. I think that we are seeing, all around the world, a rise of authoritarian impulses towards censorship, towards controlling information. And very often these come as a wolf in sheep's clothing, because it's all about protecting the children or whatever it might be that you move forward in these control ways. But at the same time, the Wikipedians are very resilient and they're very brave. And one of the things that we believe is that in many, many cases, what's happened is a real lack of understanding by politicians and leaders of how Wikipedia works. A lot of people really have a very odd assumption that it's somehow controlled by the Wikimedia Foundation, which is the charity that I set up that owns and operates the website. Therefore, they think it's possible to pressure us in ways that it's actually not possible to pressure us. The community has real intellectual independence. But yeah, I do worry about it. It's always something that weighs very heavily on us: volunteers who are in dangerous circumstances, and how they remain safe, is critically important.

00:12:33

I want to bring up something that just happened here in the US. In August, James Comer and Nancy Mace, two Republican representatives from the House Oversight Committee, wrote a letter to Wikimedia, requesting records, communication, analysis on specific editors, and also any reviews on bias regarding the state of Israel in particular. Their reason, and I'm going to quote here, is because they are investigating the efforts of foreign operations and individuals at academic institutions subsidized by US taxpayer dollars to influence US public opinion. Can you tell me your reaction to that query?

00:13:13

Yeah. I mean, we've given a response to the parts of that that were reasonable. What we feel is that there's a deep misunderstanding, or lack of understanding, about how Wikipedia works. Ultimately, the idea that something being biased is a proper and fit subject for a congressional investigation is, frankly, absurd. In terms of asking questions about cloak-and-dagger whatever, we're not going to have anything useful to tell them. I'm like, I know the Wikipedians; they're a bunch of nice geeks.

00:13:47

Yeah. I mean, the Heritage Foundation here in the United States, which was the architect of Project 2025, has said that it wants to dox your editors. How do you protect people from that?

00:14:01

I mean, it's embarrassing for the Heritage Foundation. I remember when they were intellectually respectable, and that's a shame. If that's what they think is the right way forward, they're just badly mistaken.

00:14:14

But it does seem that there is this movement on the right to target Wikipedia over these types of concerns, and I'm wondering why you think that's happening.

00:14:26

I mean, it's hard to say. There's a lot of different motivations, a lot of different people. Some of it would be genuine concern, if they see that maybe Wikipedia is biased. I have seen, for example, Elon Musk has said Wikipedia is biased because they have these really strong rules about only citing mainstream media, and the mainstream media is biased. Okay. I mean, that's an interesting question, an interesting criticism. Certainly, I think, worthy of some reflection by everyone, the media and so on and so forth. But it's hardly news to anybody and not actually that interesting. Then, for other people in various places around the world, not speaking just of the US, facts are threatening. If you and your policies are at odds with the facts, then you may find it very uncomfortable for people to simply explain the facts. I don't know, that's always going to be a difficulty. But we're not about to say, Gee, maybe science isn't valid after all. Maybe the COVID vaccine killed half the population. No, it didn't. That's crazy, and we're not going to print that. They're going to have to get over it.

00:15:43

I want to talk about a recent example of a controversy surrounding Wikipedia, and that's the assassination of Charlie Kirk. Senator Mike Lee called Wikipedia wicked because of the way it had described Kirk on its page as a far-right conspiracy theorist, among other complaints he had about the page. I went to look, and at the time that we're speaking, that description is now gone from Wikipedia. Those on the left would say that that description was accurate. Those on the right would say that that description was biased all along. How do you see that tension?

00:16:21

Well, I think the correct answer is you have to address all of that. You have to say, look, this was a person. I think the least controversial thing you could say about Charlie Kirk is that he was controversial. I don't think anybody would dispute that. To say, okay, this was a figure who was a great hero to many people and treated as a demon by others. He had these views, many of which are out of step with, say, mainstream scientific thinking, many of which are very much in step with religious thinking and so on and so forth. Those are the kinds of things that, if we do our job well, which I think we have in this case, we're going to describe. Maybe you don't know anything about Charlie Kirk. You just heard, Oh, my God, this man was assassinated. Who was he? What's this all about? Well, you should come and learn all that. You should learn who his supporters were and why they supported him, and what are the arguments he put forward, and what are the things he said that upset people. That's just part of learning what the world is about.

00:17:20

Those words that were there, far-right and conspiracy theorist, those were, in your view, the wrong words, and the critics of Wikipedia had a point?

Well, it depends on the specific criticism. So if the criticism is, this word appeared on this page for 17 minutes, I'm like, you know what? You've got to understand how Wikipedia works. It's a process, it's a discourse. It's a dialog. But to the extent that he was called a conspiracy theorist by prominent people, that's part of his history. That's part of what's there. Wikipedia shouldn't necessarily call him that, but it should definitely document all of that.

00:18:02

You mentioned Elon Musk, who's come after Wikipedia. He calls it Wokipedia. He's now trying to start his own version of Wikipedia, called Grokipedia, and he says it's going to strip out ideological bias. I wonder what you think attacks like his do to people's trust in your platform, writ large. Because as we've seen in the journalism space, if enough outside actors are telling people not to trust something, they won't.

00:18:33

Well, it's very hard to say. I mean, I think for many people, their level of trust in Elon Musk is extremely low, because he says wild things all the time. So to that extent, when he attacks us, people donate more money. That's not my favorite way of raising money, but the truth is, a lot of people are responding very negatively to that behavior. One of the things I do say in the book, and I've said to Elon Musk, is that that type of attack is counterproductive, even if you agree with Elon Musk. Because to the extent that he has convinced people, falsely, that Wikipedia has been taken over by woke activists, then two things happen. Your kind and thoughtful conservatives, who we very much welcome, and we want more people who are thoughtful and intellectual and maybe disagree about various aspects of the spirit of our times to come and join us and make Wikipedia better, if those people think, Oh, no, it's just going to be a bunch of crazy woke activists, they're going to go away. And then, on the other side, the crazy woke activists are going to be like, Great, I found my home.

00:19:45

I don't have to worry about whatever. I can come and write rants against the things I hate in the world. We don't really want them either.

00:19:52

You said you talked to Elon Musk about this. When did you talk to him about it and what was that conversation like?

00:20:00

I mean, we've had various conversations over the years. He texts me sometimes. I text him sometimes. He's much more respectful and quiet in private. But that you would expect. He's got a big public persona.

00:20:17

When was the last time you had that exchange?

00:20:20

That's a good question. I don't know. I think it was the morning after the last election. He texted me that morning. I congratulated him.

00:20:30

Obviously, the debate that happened more recently was because of the hand gesture that he made that was interpreted in different ways, and he was upset by the way that it had been characterized on Wikipedia.

00:20:47

Yeah, I heard from him after that. I mean, in that case, I pushed back, because I went to check, like, Oh, what does Wikipedia say? And it was very matter-of-fact. It said he made this gesture. It got a lot of news coverage. Many interpreted it as this, and he denied that it was a Nazi salute. That's the whole story. It's part of history. I don't see how you could be upset about it being presented in that way. If Wikipedia said Elon Musk is a Nazi, that would be really, really wrong. But to say, look, he did this gesture, and it created a lot of attention, and some people said it looked like a Nazi salute. Yeah, that's great. That's what Wikipedia is. That's what it should do.

00:21:25

Do you think Elon Musk is acting in good faith? You're saying that in private, he's nice and cordial, but his public persona is very different.

00:21:36

I think it's a fool's errand to try and figure out what's going on in Elon Musk's mind. So I'm not going to try.

00:21:46

I don't mean to press you on this. I'm just trying to refer to something that you said, which is that people, human to human, are nice, right? That we're good, that we should assume good faith. You're saying that Elon one-on-one is lovely, but he is attacking your institution and potentially draining support for Wikipedia.

00:22:11

Well, I don't think he has the power he thinks he has or that a lot of people think he has to damage Wikipedia. I mean, we'll be here in 100 years, and he won't. I think as long as we stay Wikipedia, people will still love us. They will say, You know what? Wikipedia is great. And all the noise in the world and all these people ranting, that's not really the real thing. The real thing is genuine human knowledge, genuine discourse, genuinely grappling with the difficult issues of our day. That's actually super valuable. So there's a lot of noise in the world. I hope Elon will take another look and change his mind. That'd be great, but I would say that of anybody. And in the meantime, I don't think we need to obsess over it or worry that much about it. We don't depend on him for funding. And yeah, there we are.

00:23:05

I hear you saying that part of your strategy here is just to stay the course, do what Wikipedia does. Are there changes that you do think Wikipedia needs to make to stay accurate and relevant?

00:23:19

Well, I think, when we think about the long run, we also have to keep up with technology. We've got this rise of the large language models, which are an amazing but deeply flawed technology. And so the way I think about that is to say, okay, look, I know for a fact no AI today is competent to write a Wikipedia entry. It can do a passable job on a very big, famous topic, but anything slightly obscure and the hallucination problem is disastrous. But at the same time, I'm very interested in how we can use this technology to support our community. One idea is: take a short entry and feed in the sources. Maybe it's only got five sources and it's short. And just ask the AI, is there anything in the entry that's not supported by the sources, or is there anything in the sources that could be in Wikipedia but isn't? And give me a couple of suggestions if you can find anything. As I've played with that, it's pretty okay. It needs work and it's not perfect. But if we react with just, like, oh, my God, we hate AI, then we'll miss the opportunity to do that.

00:24:32

If we go crazy, like, Oh, we love AI, and we start using it for everything, we're going to lose trust, because we're going to include a lot of AI-hallucinated errors and so on.
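To make the idea concrete: a minimal sketch, in Python, of the entry-versus-sources check Wales describes. The ask_llm parameter is a hypothetical stand-in for whatever model API an editing tool would actually call; nothing here is Wikipedia's real tooling.

```python
# Minimal sketch of the idea Wales describes: give an LLM a short
# entry plus its cited sources and ask for two lists of suggestions.
# ask_llm is a hypothetical caller-supplied function (prompt -> reply).

def build_check_prompt(entry_text: str, sources: list[str]) -> str:
    numbered = "\n\n".join(
        f"SOURCE {i}:\n{text}" for i, text in enumerate(sources, start=1)
    )
    return (
        "Below is a short encyclopedia entry followed by its cited sources.\n\n"
        f"ENTRY:\n{entry_text}\n\n{numbered}\n\n"
        "1. Is there anything in the entry not supported by these sources?\n"
        "2. Is there anything in the sources that could be in the entry but isn't?\n"
        "Give a couple of suggestions if you find anything; otherwise say so."
    )

def suggest_entry_fixes(entry_text: str, sources: list[str], ask_llm) -> str:
    # Returns the model's raw reply; a real tool would surface this to a
    # human editor rather than applying anything automatically.
    return ask_llm(build_check_prompt(entry_text, sources))
```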

00:24:41

I mean, that's interesting, because Wikimedia writes this yearly global trends report on what might impact Wikipedia's work. For 2025, it wrote, We are seeing that low-quality AI content is being churned out, not just to spread false information, but as a get-rich-quick scheme, and it is overwhelming the internet. High-quality information that is reliably human-produced has become a dwindling and precious commodity. I read that crawlers from large language models have basically crashed your servers because they use so much of Wikipedia's content. It did make me wonder: will people be using these large language models to answer their questions and not going to the source, which is you?

00:25:28

Well, this has been a question since they began. We haven't seen any real evidence of that. I use AI personally quite a lot, and I use it in a different way, though, for different use cases. How? I like to cook. It's my hobby. I fancy myself as being quite a good cook. I will often ask ChatGPT for a recipe. I also ask it for links to websites with recipes. It sometimes makes them up, so that's a bit hilarious. And I'd also suggest being careful using ChatGPT for cooking unless you actually already know how to cook, because when it's wrong, it's really wrong. But Wikipedia would be useless for that. Wikipedia doesn't have recipes. It's a completely different realm of knowledge from encyclopedic knowledge. So, yeah, I'm not that worried about it. What I do worry about, in this time when journalism has been under incredible financial pressure, is that there's a new competitor for journalism, which is low-quality, churned-out content, produced for search engine optimization, to compete with real human-written content. To the extent that that further undermines the business model of journalism, particularly local journalism, that's a big problem, and something I'm very worried about.

00:26:56

And it's not directly about Wikipedia, but it is about the fact that it's very cheap to generate very plausible text. That, yeah, that doesn't seem good to me.

00:27:09

It definitely doesn't seem good to me either, as a journalist. I just recall that there was this hope that as the internet got flooded with garbage, and this is even before AI, this was just troll farms and clickbait, that it would benefit trustworthy sources of information. Instead, we've seen that the opposite has happened. Wikipedia, news organizations, academic institutions, they're all struggling with the same thing. Why do you think that they are struggling in an era where they should be flourishing, if what you say is true, that people ultimately do want to trust the information that they're getting?

00:27:52

Well, I think that a big piece of it is that the news media has not done a very good job of sticking to the facts and avoiding bias. I think a lot of news media, not all of it, but a lot of news media, has become more partisan, and there are reasons for it, and it's short-termism. You'll even see, there's some stuff in the book about this, some arguments by some people in journalism that objectivity is something we should give up on, that we should be partisan and so on and so forth. I think that's a huge mistake. And I think there's lots of evidence for that. Wikipedia is incredibly popular, and that's one of the things people say about Wikipedia, that they really value it, and they're really disappointed if they feel like we're biased and so on and so forth. So I gave the example earlier, because I live in the UK, I read these papers. But if you look at the Telegraph and you look at the Guardian on an issue related to climate change, I can already tell you, before we start reading, which attitude is going to come from which paper. Neither of them is doing a very good job of saying, actually, it's not our job to be on one side of that issue or the other.

00:29:03

It's our job to describe the facts and to understand the other side and so on and so forth. Returning to that value system is hugely important because otherwise, how are you going to get the trust of the public?

00:29:17

I guess I'm surprised at you saying this, because Wikipedia has been faced with similar attacks on its own credibility. You say that you are neutral and credible and that the system that you employ is fair. And yet there are people who completely dispute that. What you're describing is a very common broadside against journalists and journalism in this era, that they have taken a side; those of us on the inside would say it is part of a larger project of discrediting facts. We've seen those attacks on Wikipedia, we've seen them on academic institutions, and we've seen them on the media. They are all part of the same thing. I'd love you to tease out why it's unfair when it happens to Wikipedia, but it's fair when it happens to journalistic institutions.

00:30:20

Well, it's either fair or unfair for both, depending on what the actual situation is. There have been cases in the history of Wikipedia when somebody said to me, Wow, Wikipedia is really biased on this topic. And I say the response should be, Wow, let's check it out. Let's see. Let's try and do better. If we find that we have been biased in some area, then we need to do a better job. And I think that for many outlets, that isn't happening. Instead, what's happening is pandering to an existing audience. And I understand why. The reason why that's happened has to do with the business model. It has to do with getting clicks online and so on and so forth. It has to do with the fact that without sufficient financial resources, you have to scrap for every penny you can get in the short run, rather than saying, no, we're going to take the long view, even though a few people may cancel. I'm encouraging us all to say, you know what? Let's double down on that. Let's really, really take very, very seriously the need to be trustworthy.

00:31:30

After the break, Jimmy and I speak again about how he thinks part of Wikipedia's success is the fact that profit isn't even on the table.

00:31:38

The most successful tweet I ever had: I think it was a New York Post journalist who tweeted to Elon, You should just buy Wikipedia, when he was complaining about something about Wikipedia. And I just wrote, Not for sale.

I'm Wesley Morris.

00:31:58

I'm a critic for The New York Times, and I'm the host of a podcast called Cannonball.

00:32:03

We're going to talk about that song you can't get out of your head, that TV show you watched and can't stop thinking about, and the movie that you saw when you were a kid that made you who you are, whether you like it or not. I was so embarrassed the whole time because it's a bad film, and I still love it.

00:32:20

You can find Cannonball on YouTube and wherever you get your podcasts. Hi.

00:32:35

Hello.

00:32:37

I was thinking about our first conversation, and I was thinking about the moment that Wikipedia was created in, a time before social media, before the dramatic polarization that we've seen, before the political weaponization of the internet that we've seen. I'm still, after talking to you, not sure that the lessons of how Wikipedia was created apply to today. I wanted to ask you, do you think Wikipedia could be created now and exist in the same way that it does?

00:33:14

Yeah, I do. I think it could. I actually think that the lessons are pretty timeless. At the same time, yeah, it's absolutely valid to acknowledge the internet is different now, and there are new problems, problems that come from social media and all the rest, and the aggressively politicized culture wars that we live in. That is different, but I don't think that's a permanent change to humanity. I think we're just going through a really crazy era. Here we are.

00:33:50

Why do you think the internet didn't go the way of Wikipedia? Collegial, working for the greater good, fun, nerdy, all the words that you use to describe that moment of creation.

00:34:04

Well, the thing is, I'm old enough that I grew up on the internet in the age of Usenet, which was this massive message board, like Reddit today, except not controlled by anyone, because it was, by design, distributed and uncontrollable, unmoderatable for the most part. And it was notoriously toxic. There was some skepticism then, and that was when it was first recognized, I suppose, that anonymity can be problematic, that people behind an alias, behind their keyboards, with no accountability, can be just really bad and really vicious. That's when we first started seeing spam. I remember some of the early spam, and everybody was like, Oh, my God, what's this? Spam? It's terrible. I think some of these things are just human issues. But now, to a larger degree than then, we live online.

00:35:08

It's in our pocket all the time.

00:35:10

It's in our pocket all the time. Yeah. Obviously, the impact is much more.

00:35:15

I was thinking about Wikipedia in particular, and maybe why it went a different way, in that you chose at a certain point to make it a not-for-profit. You chose not to capitalize on the success of Wikipedia. It made me wonder about OpenAI, which started as an open-source, for-the-greater-good project, like Wikipedia, and has now shifted into being a multibillion-dollar business. I'd love to know your thoughts on that shift for OpenAI, but more broadly, do you think that the money part of it also changed the equation?

00:35:54

Yeah, I do think it made a difference in lots of ways. I'm not against for-profit. There's nothing wrong with for-profit companies. But even as a nonprofit, you do have to have a business model, so to speak. You've got to figure out how you're going to pay the bills. And for Wikipedia, that's not too bad. The truth is, we don't require billions and billions and billions of dollars in order to operate Wikipedia. We need servers and databases, and we need to support the community and all these kinds of things. I would say, in terms of the development of Wikipedia and how we're so community-first and community-driven, you wouldn't necessarily have that if the board were made up largely of investors who are worried about profitability and things like that. Also, I think it's important today for our intellectual independence. We're under attack in various ways, as we've talked about. What's interesting is one of the things that isn't going to happen. Actually, the most successful tweet I ever had, I think it was a New York Post journalist who tweeted to Elon, You should just buy Wikipedia, when he was complaining about something about Wikipedia. And I just wrote, Not for sale.

00:37:13

That was very popular. But it isn't for sale. And I just thought, you know what? I would like to imagine myself as a person who would say to Elon, No, thank you, for a $30 billion offer if I owned the whole thing. But would I actually? 30 billion? 30 million? Yeah, I'm not interested. That's not going to happen, because we're a charity, and I don't get paid and the board doesn't get paid and all of that. I do think that's important for our independence: we don't think in those terms. We're not even interested in that.

00:37:48

Since we last spoke, the co-founder of Wikipedia, Larry Sanger, has given an interview to Tucker Carlson that's getting a lot of attention here in the United States, on the right. He has had a lot to say about Wikipedia, and not a lot of it's good. In the past, he's called it one of the most effective organs of establishment propaganda in history. We should say that he believes Wikipedia has a liberal bias. In this interview, and on his X feed, he's advocating for what he's calling reforms to the site, which include reveal who Wikipedia's leaders are and abolish source blacklists. I just wonder what you make of it.

00:38:28

I haven't watched it. I can't bear Tucker Carlson, so I'm going to have to just suck it up and watch, I suppose. I can't speak to the specifics in that sense, but the idea that everything is an equally valid source, and that it's somehow wrong that Wikipedia tries to prioritize the mainstream media and quality newspapers and magazines and make judgments about that, is not something I can in any way apologize for. But there's no question, one of my fundamental beliefs is that Wikipedia should always stand ready to accept criticism and change. And so to the extent that a criticism says Wikipedia is biased in a certain way, and these are the flaws in the system, well, we should take that seriously. We should say, Okay, is there a way to improve Wikipedia? Is our mix of editors right? At the same time, I also think, you know what? We're going to be here in 100 years, and we're designing everything for the long haul. The only way I think we can last that long is not by pandering to this raging mob of the moment, but by maintaining our values, maintaining our trustworthiness, being serious about trying to make things better if we've got legitimate criticism.

00:39:59

Other than the fact that, okay, we're just going to do our thing and we're going to do it as well as we can. I don't know what else we can do.

00:40:06

I think you and Larry did build something beautiful that has endured. I do wonder if it's going to be part of our future because I feel some despair about where we're all headed and some fear. I guess you'll just say that I have to trust that it's all going to end up okay. But I do worry that it might not.

00:40:36

Yeah. I mean, there's so much right now to worry about. I can't dismiss all that. I can try and cheer you up a little bit. But we just saw Donald Trump talking about the enemy within and suggesting the military should be in cities. Doing what? Shooting people? It's unbelievable. On the other hand, I think he just is blustering and being Donald Trump and all that. But you have to worry.

00:41:10

That didn't cheer me up, I got to tell you, right there.

00:41:14

Fair enough.

00:41:15

I got to tell you, as a pep talk- Pretty low.

00:41:19

I think we're going to be all right, but it's a rough time. Okay.

00:41:28

What was the last page you read on Wikipedia and what were you trying to find out?

00:41:31

Oh, that's a good question. Can I take a second to look?

00:41:34

Sure.

00:41:36

Show full history. Search Wikipedia. Now, I'm going to skip over List of Wikipedians by number of edits. That's just me doing work. I'm going to look. Oh, I know. This is fun. Admiral Sir Hugo Pearson, who died in 1912, used to own my house in the countryside. I found this out, and there's a picture of him, which I found on eBay and ordered. I was trying to remember something. There's nothing about my house in the article, because he was there and then he moved away. But I love it. I'm thinking of replacing the AI voice assistant. I use Alexa. People use, Okay, Google, or whatever. I want to make my own, and I want to have it be a character, and the character will be the ghost of Hugo Pearson. Anyway, that's what I was researching. I'm not sure I'll ever get around to it. I'm really busy promoting my book and things like that. But when I get spare time, I dream of being a geek, and I'm going to go home in February, maybe, and just work all month playing in my house.

00:42:41

You are a geek in the best possible way. Thank you so much for your time. I really appreciate it.

00:42:47

Thank you. Yeah, it's been great.

00:42:50

That's Jimmy Wales. His new book, The Seven Rules of Trust: A Blueprint for Building Things That Last, comes out October 28th. To watch this interview and many others, you can subscribe to our YouTube channel at youtube.com/@TheInterviewPodcast. This conversation was produced by Wyatt Orme. It was edited by Annabelle Bacon, mixing by Efim Shapiro. Original music by Dan Powell, Rowan Niemisto, and Marion Lozano. Photography by Philip Montgomery. Our senior booker is Priya Matthew, and Seth Kelly is our senior producer. Our executive producer is Allison Benedikt. Video of this interview was produced by Paula Newdorff. Cinematography by Zebediah Smith and Daniel Bateman. Audio by Sonia Herrero. It was edited by Amy Moreno. Brooke Minters is the executive producer of podcast video. Special thanks to Molly White, Rory Walsh, Ronan Barale, Jeffrey Miranda, Nick Pitman, Maddy Masiello, Jake Silverstein, Paula Szuchman, and Sam Dolnick. Next week, David talks with Sir Anthony Hopkins about his new memoir and what he's learned in his 87 years.

00:44:01

You get to a certain age in life. You're going through, you got ambitions, you got great dreams, and everything's fine. And there on the distant hill is death. And you think, well, now is the time to wake up and live.

00:44:17

I'm Lulu Garcia-Navarro, and this is The Interview from The New York Times.

AI Transcription provided by HappyScribe
Episode description

Attacks on the site are piling up. Its co-founder says trust the process.

Thoughts? Email us at theinterview@nytimes.com

Watch our show on YouTube: youtube.com/@TheInterviewPodcast

For transcripts and more, visit: nytimes.com/theinterview
Subscribe today at nytimes.com/podcasts or on Apple Podcasts and Spotify. You can also subscribe via your favorite podcast app here https://www.nytimes.com/activate-access/audio?source=podcatcher. For more podcasts and narrated articles, download The New York Times app at nytimes.com/app.