Two decades ago, Wikipedia emerged as a quirky online project that aimed to collect and document all of humanity’s knowledge and history in real time. Skeptics worried that much of the site would consist of unreliable information, and they frequently pointed out its mistakes.
But now the online encyclopedia is more often seen as a place that, on balance, helps combat the misinformation and disinformation spreading elsewhere.
Last week, the Wikimedia Foundation, the group that oversees Wikipedia, announced that Maryana Iskander, a social entrepreneur in South Africa who has worked for many years in nonprofit organizations addressing youth unemployment and women’s rights, will become its chief executive in January.
We talked to her about her vision for the group and how the organization works to prevent misinformation and disinformation on its sites and across the web.
Tell us about your direction and vision for Wikimedia, especially in such a fraught information landscape and such a polarized world.
There are a few core principles of Wikimedia projects, including Wikipedia, that I think are important starting points. It’s an online encyclopedia. It’s not trying to be anything else. It certainly isn’t trying to be a traditional social media platform in any way. It has a structure led by volunteer editors. And as you may know, the foundation has no editorial control. This is very much a community led by users, which we support and enable.
The lessons to learn, not just from what we’re doing but from how we continue to iterate and improve, start with this idea of radical transparency. Everything on Wikipedia is cited. It is debated on our talk pages. So even when people hold different points of view, those debates are open and transparent, and in some cases they really allow for the right kind of back-and-forth. I think that’s what is needed in such a polarized society — you have to make space for that back-and-forth. But how do you do it in a transparent way that ultimately leads to a better product and better information?
And the last thing I’ll say is that this is a community of incredibly humble and honest people. As we look to the future, how do we build on those attributes so this platform can continue to serve society and provide free access to knowledge? How do we ensure that we are reaching the full diversity of humanity in terms of who is invited to participate and who is written about? How do we ensure that our collective efforts reflect more of the Global South, more women, and more of the diversity of human knowledge?
What do you think about how Wikipedia fits into the widespread problem of online misinformation?
Many of the platform’s core attributes are very different from those of traditional social media platforms. Take misinformation about Covid: the Wikimedia Foundation entered into a partnership with the World Health Organization, and a group of volunteers organized around what is called WikiProject Medicine, which focuses on medical content and creates articles that are then very carefully monitored, because these are exactly the kinds of topics where misinformation proliferates.
Another example is that the foundation assembled a task force ahead of the U.S. elections, again trying to be proactive. [The task force supported 56,000 volunteer editors watching and monitoring key election pages.] The fact that there were only 33 reversions on the main U.S. election page is an example of what focusing closely on key topics where misinformation poses real risks can achieve.
Another example that I think is really interesting is a podcast called “The World According to Wikipedia.” On one of the episodes, a volunteer was interviewed, and she has made it her job to be one of the main watchers of the climate change pages.
We have technology that notifies these editors when changes are made to any page, so they can see what the changes are. If there is a real risk that misinformation could creep in, there is an opportunity to temporarily lock a page. No one wants to do that unless it’s absolutely necessary. The climate change example is useful because the talk pages behind it feature robust debate. This editor was saying: “Let’s have the debates. But this is a page I am watching and monitoring carefully.”
A major debate currently taking place on social media platforms is the question of censoring information. Some argue that biased views get precedence on these platforms and that more conservative views are taken down. As the head of Wikipedia, when you think about how to handle these debates, how do you make judgment calls with that as the backdrop?
What has inspired me about this organization and these communities is that there are core pillars that were established on day one of Wikipedia’s founding. One of them is the idea of presenting information with a neutral point of view, and neutrality requires understanding all sides and all points of view.
It’s what I said earlier: there are debates on the talk pages, but then you come to an informed, documented, verifiable kind of conclusion on the articles. I think this is a core principle that, again, could offer something for others to learn from.
Coming from a progressive organization fighting for women’s rights, have you thought about how bad actors might weaponize your background to claim it could influence the calls you make about what’s allowed on Wikipedia?
I will say two things. The first is that the really relevant aspect of the work I’ve done in the past is that leading volunteer-led movements is probably a lot harder than it might seem, and that I played a very active role in understanding how to build the systems, cultures, and processes that I think would fit an organization and a volunteer community trying to scale their reach.
The second thing I would say is that I am on my own learning journey, and I invite you to join it with me. The way I choose to be in the world is to interact with others on the assumption of good faith and to engage in respectful and civil ways. That doesn’t mean others will do the same. But I think we have to hold on to that as an aspiration and as a way of being the change we want to see in the world.
When I was in college, I did a lot of my research on Wikipedia, and some of my professors would say, “You know, that’s not a legitimate source.” But I still used it all the time. I wonder if you have any thoughts on that!
I think most professors now admit that they sneak onto Wikipedia to look things up, too!
You know, this year we’re celebrating Wikipedia’s 20th anniversary. Early on, this was something people scoffed at and said would go nowhere. And now it has become one of the most widely referenced sources in human history. I can tell you, just from my own conversations with academics, that the narrative around citing and using Wikipedia has changed.