
The slow and steady battle to close Wikipedia’s dangerous gender gap

As online information fractures into alternate realities, Wikipedia is still struggling to quash its gender biases.

CREDIT: Adam Peck/ThinkProgress

This time last year, Emily Henning was standing in a computer lab before a group of preteen girls in Detroit.

Every week, as part of a course she was taking at Wayne State University, she went to the local public middle school to teach web and research skills: smart Google searches, some basic coding, online safety and civility, how information travels online, how to analyze sources, how to attribute information.

Now, each girl was writing a short article — on their school’s cheerleading team, on Jamie Foxx, on police brutality victims — in a Wikipedia-like online sandbox for the classroom.

This was no ordinary after-school club. It was an intervention, part of researcher Stine Eckert’s Wikid GRRLS initiative to teach middle school girls how to contribute to Wikipedia. The project taught more than 65 preteens, mostly minorities, the online skills and confidence to begin creating knowledge online.

As part of Wikid GRRLs, Wayne State University student Emily Henning teaches middle school girls in Detroit tools to present information online. Photo by Stine Eckert

“About the time they reach adolescence, girls are already behind in this mentality,” said Jade Metzger, Wikid GRRLs assistant and a doctoral student researching gender in online communication. “They don’t see themselves as experts, they don’t see themselves as people who produce knowledge. Wikid GRRLs takes this group of young minority women and tells them, ‘You are an expert. So let’s add your experiences to the knowledge out there, because knowledge is partial until everyone participates.’”


As debates swirl over the spread of fake news and misinformation on Twitter and Facebook, the web’s leading source of free knowledge continues to struggle with its own slanted coverage. Projects like Wikid GRRLs are doing their best, but Wikipedia’s gender gap is just as wide as it was five years ago.

In 2011, an internal survey by the Wikimedia Foundation found that less than 13 percent of Wikipedia’s contributors were women. The statistic set off a flurry of think pieces and activism urging women to add their expertise to the collaborative web encyclopedia.

Non-profit groups, academics, and volunteers have led efforts to correct Wikipedia’s systemic gender imbalances. Until that imbalance is solved, the site will never come near its goal of becoming the web’s independent, neutral arbiter of knowledge. But the going is slow.

“You can’t put it all on Wikipedia to solve what’s obviously a larger problem.”

This year, more than 2,500 people participated in 175 global events organized by the Art+Feminism campaign, which holds annual edit-a-thons to collaboratively create and expand Wikipedia articles on female artists. In 2016, participants created or improved more than 3,500 pages about women on Wikipedia. And the new WikiWomenWeek project encourages editors to write articles about women for seven consecutive days. So far, the initiative has led to more than 200 new articles in 20 languages.


Editors have also created collaborative projects dedicated to improving Wikipedia’s coverage of women scientists, writers, athletes, and noteworthy historical figures. Others have teamed up to create articles for notable women whose names are missing from Wikipedia: this year alone, the Women in Red project has turned more than 18,000 red links blue with new articles. That brings the proportion of English Wikipedia’s biographies about women to 16.7 percent this month, up from about 15 percent in 2014. There’s also a gender gap task force to help counter the systemic bias.

When the Wikimedia Foundation announced funding for projects to tackle the encyclopedia’s gender imbalance, it received 266 ideas. The campaign culminated in 16 grant-funded projects, including women’s editing meetups with childcare, training to moderate discussions related to gender, and more.

Rather than being an egalitarian “sum of human knowledge,” Wikipedia reflects existing sexism in art, academia, science, technology and literature.

But despite hundreds of women’s edit-a-thons, blog posts, and educational initiatives, Wikipedia’s gender gap continues to mirror the gaps offline.

“Wikipedia was really proactive about the issue,” said Eckert, Wikid GRRLs founder and a Wayne State University assistant professor researching women in online spaces. “But you can’t put it all on Wikipedia to solve what’s obviously a larger problem.”

Still, the nonprofit that runs Wikipedia has pushed for progress. The Wikimedia Foundation has launched efforts to introduce more women to Wikipedia, build an environment more friendly to women contributors, and ease the transition from user to contributor. In the wake of its landmark 2011 survey, the online encyclopedia published a comprehensive page on its gender imbalance and set up a public mailing list to solicit feedback.


Soon, then-executive director Sue Gardner confidently declared her commitment to raise the share of women contributors to 25 percent by 2015.

But by 2013, shortly before she left her position, Gardner was much less assured of the effort’s success. “I didn’t solve it. We didn’t solve it. The Wikimedia Foundation didn’t solve it,” she said bluntly. “The solution won’t come from the Wikimedia Foundation.”

Founder Jimmy Wales admitted much the same to the BBC last summer: “We’ve completely failed. We realize we didn’t do enough. There’s a lot of things that need to happen to get from around 10 percent to 25 percent. A lot of outreach, a lot of software changes.”

It’s now nearly 2017, and the numbers have still barely budged since the original Wikimedia survey was released about six years ago.

Anyone who expected any dramatic progress was kidding themselves, Eckert said. “This is a cultural societal problem where we have huge gender gaps in areas connected to technology and countless other fields,” she explained. “To think Wikipedia could change that apart and in isolation from society is overly ambitious and not realistic.”

But while the worlds of scholarly and literary publishing do contain significant obstacles for women, Wikipedia touts itself as the encyclopedia that anyone with internet access can edit. So why are so few women contributing?

In Gardner’s informal survey of women who use Wikipedia, many said they found the interface complicated. Others lacked the time or the confidence in their expertise to write for a community site, or found the site’s culture too sexualized, misogynistic, and aggressive.

“We’ve completely failed. We realize we didn’t do enough. There’s a lot of things that need to happen to get from around 10 percent to 25 percent.”

Another part of the answer lies at the roots of the internet itself, which emerged from the intersection of the U.S. government, military, academia, and engineering, Eckert said. “Those are four fields that in the middle of the 20th century when the internet was developed, were intensely male-dominated.” So it’s no coincidence Wikipedia contributors and editors tend to be young, white, Western, tech-savvy men.

Nor can we ignore how gender, technology, and knowledge have been constructed over thousands of years.

“The entire history of how women have been framed, how they’ve been socialized to not speak out, goes back to Athens and who was recognized as a public speaker,” Eckert explained. “You had to be a citizen and a white male body in order to have the authority to speak. That’s the lineage of who is allowed to speak publicly, and then in our time, who’s allowed to speak online.”

For all the talk of the democratic potential of online information sharing, Wikipedia and the broader digital community are often far from equal. Because the site is built on the idea that anyone can access and add information, user bias is ingrained in its self-policing wiki model. A well-respected professor could add a piece of information, and a white nationalist leader could remove it just as easily. Each user has their own version of facts, their own version of truth. The fracturing reality of the web means some users see Wikipedia as too liberal — enough that they’ve created their own wiki-realities, like Conservapedia — while others on the left have created alternatives like RationalWiki.

On any of these wikis, the only way to combat bias in crowdsourced contributions is through moderation and editing. But when moderators, too, are predominantly male — the latest surveys estimate that between 13 and 23 percent of Wikipedia editors identify as female — it’s not surprising when moderating decisions lean in directions that critics say erase women and minority identities.

For example, Wikipedia’s commitment to a neutral perspective has resulted in editors changing the term “rape scene” to “sex scene” to describe depictions of non-consensual intercourse in films. When one woman journalist attempted to give actress Hedy Lamarr’s inventions higher prominence on her biographical page, another user reversed the change. “Invention is significant, but it’s not more significant or more noted by sources than her entire acting career,” they wrote. “Her acting was serious work she was proud of.”

Results of the Wikimedia Foundation’s 2015 Harassment survey.

And remember Gamergate, when misogyny in the gaming community reached fever pitch in 2014? Wikipedia found itself embroiled in the controversy after the site’s arbitration committee banned several feminist editors from updating the Gamergate article and other gender-related pages, including the page on feminist media critic Anita Sarkeesian. Sarkeesian received countless death and rape threats over her criticism of video game depictions of women. During Gamergate, anonymous accounts vandalized her Wikipedia page with pornography and misogynist and racist language, calling her a “cunt” and “lying whore.” Around the same time, game developer Zoe Quinn’s page called her a prostitute; developer Brianna Wu’s page claimed she gave her husband AIDS after her father raped and infected her.

It’s that kind of aggressive backlash that makes some women think twice before adding their knowledge. In Wikimedia’s 2011 survey, more than half of editors reported getting into an argument with other editors on discussion pages; 12 percent of female editors reported someone leaving inappropriate messages about or for them.

“A Wikipedia editing war is not my style,” the journalist who attempted to improve Hedy Lamarr’s page told New Statesman last year. The preteen girls that Wikid GRRLs worked with shared the same self-preservation instinct: “When we’d ask, ‘Would you consider editing a Wikipedia article if you saw any issues?’ I remember some girls saying ‘No, because I would be afraid of what people would say because people would be mean,’” Metzger told ThinkProgress.

The Wikimedia Foundation has attempted to make the atmosphere more friendly for new users, particularly women, by introducing Wikilove, a tool that makes it “easy and fun to send barnstars or whimsical messages of appreciation to other users.” They’ve also rolled out friendly space expectations for some projects, and a space for female users to discuss their experiences as contributors.

In a sense, the lack of barriers to access — to trolls and misogynists, for instance — actually creates more barriers to participation for women and minorities. It’s a lesson Twitter and Reddit are learning, too, as troll culture and online harassment become rampant. There, as well, non-profits and activists have taken the lead in developing mechanisms to thwart trolls.

Around 30 people participated in an Art+Feminism edit-a-thon hosted at Washington, D.C.’s National Museum of Women in the Arts. Photo by Wikipedia user Geraldshields11, used via CC BY-SA 4.0

“Wikipedia aspires to reflect the sum of all knowledge,” Wikimedia Foundation spokeswoman Samantha Lien told ThinkProgress in an email. “Systemic bias including the gender gap on Wikipedia runs counter to that vision, and it’s something we take very seriously.”

Last year, a group of German and Swiss researchers found that Wikipedia articles on women are more likely to use words referencing the subject’s gender than are articles about men: more than 20 percent of the most used words in articles about women relate to family, relationships, or gender. That number is less than 4 percent in articles about men. Articles on women are also more likely to link to articles about men than the other way around.

“Wikipedia aspires to reflect the sum of all knowledge. Systemic bias including the gender gap on Wikipedia runs counter to that vision.”

Today, Wikipedia is the world’s sixth-most visited site. And rather than being an egalitarian “sum of human knowledge,” Wikipedia reflects existing sexism in art, academia, science, technology and literature in its content. Say a user can’t find a Wikipedia page for a certain writer or engineer and assumes she isn’t noteworthy, or sees an incomplete list of Iranian women journalists and assumes there are few prominent Iranian women in media. The next step is to perpetuate these representations outside of the world of Wikipedia.

It’s a dangerous circle.

But there is evidence that the numbers have begun shifting: comparing two surveys of female editors in the U.S., the figure jumped by a statistically significant 3.4 percent between 2011 and 2012. It’s hardly a trend across the board, but it’s something.

The work is only beginning for those committed to ensuring Wikipedia represents the full breadth of human knowledge. Gender isn’t the end of Wikipedia’s imbalances — its race gap and Eurocentric bias are just as fraught.

“What about black women thought processes? What about Muslim women thought processes?” Metzger asked. “Their knowledges aren’t held as valuable. We need to be having a conversation about where are the voices of people of color and people from the global south. Because we’re not hearing those voices either.”