NewsQ Strategist Kate Harloe (at right) moderates a panel at WikiCredCon. Panelists, from left to right: Ivonne Lujano, Ambassador for the Directory of Open Access Journals (DOAJ) for Latin America; Benjamin T. Decker, researcher at Harvard’s Shorenstein Center and Lead Analyst for the Global Disinformation Index; and Melissa Zimdars, critical media studies scholar at Merrimack College. Photo by Nevin Thompson.

How Online Communities Can Identify and Promote Quality News

Lessons from WikiConference North America and the Wikipedia community

--

By Nevin Thompson

The NewsQ team recently participated in WikiConference North America, a four-day, 70-session conference at the Massachusetts Institute of Technology that focused on information reliability and credibility. Here are a few of the lessons we learned from the Wikipedia community.

This article was originally published on the NewsQ blog.

In November 2019, WikiConference North America brought together Wikipedians from all over North America for discussions, collaboration and exploration.

Co-branded “WikiCredCon,” the 2019 edition of the annual gathering also included people from outside the Wikipedia community, such as technologists, researchers, librarians, journalists, and representatives from web platforms that increasingly rely on Wikipedia both as a source of information and as a tool for assessing the quality of information.

The NewsQ team was among these outside collaborators. We came to WikiCredCon hoping to discuss a number of questions that guide our work, including:

  • What are the indicators of quality for news sources?
  • Is it possible to increase the flow of quality news in a way that is economically viable as well as sensitive to freedom of speech?

In particular, we wanted to discuss how effective blacklists and whitelists are at guiding readers to reliable sources of information. In turn, the NewsQ team gained valuable insights into how communities collaborate to identify and promote quality news.

Community and Collaboration Help Build Tools that Target Misinformation

WikiCredCon was the direct result of a grassroots Wikipedia project aimed at tackling information reliability and credibility. Traveling to a hackathon at CredCon in Austin, TX, in 2018, Wikipedia community members Kevin Payravi and Josh Lim were looking for solutions to one of Wikipedia’s biggest challenges: while the online encyclopedia allows anyone to edit and add factual information, the open nature of the platform means anyone can also spread misinformation.

To address this challenge, Payravi and Lim came up with Cite Unseen, a tool that enhances Wikipedia citations to make them more useful at a glance. By making citations more prominent, the tool also prompts readers to take a closer look at where their information is coming from, using icons to flag source types such as government sources, opinion pieces, blogs, tabloids, and more.
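
To give a flavor of the approach (this is not Cite Unseen’s actual code, which ships as a Wikipedia userscript), a minimal TypeScript sketch might map citation domains to source categories and prepend an icon to each reference. The category names, domain lists, and icons below are hypothetical examples.

```typescript
// Hypothetical sketch of the Cite Unseen idea: classify each citation's
// domain and prepend a category icon. The domain lists and icons here are
// illustrative placeholders, not the tool's real data.
type Category = "government" | "opinion" | "blog" | "tabloid";

const CATEGORY_DOMAINS: Record<Category, string[]> = {
  government: ["senate.gov", "parliament.uk"],
  opinion: ["example-opinion.com"],
  blog: ["example.blogspot.com"],
  tabloid: ["example-tabloid.co.uk"],
};

const ICONS: Record<Category, string> = {
  government: "🏛",
  opinion: "💬",
  blog: "✍",
  tabloid: "📰",
};

function categorize(url: string): Category | null {
  const host = new URL(url).hostname;
  for (const [category, domains] of Object.entries(CATEGORY_DOMAINS)) {
    if (domains.some((d) => host === d || host.endsWith("." + d))) {
      return category as Category;
    }
  }
  return null;
}

// In a userscript, something like this would run over every citation link
// in an article's references section and prepend the matching icon.
function annotate(citation: HTMLAnchorElement): void {
  const category = categorize(citation.href);
  if (category) {
    citation.insertAdjacentText("beforebegin", ICONS[category] + " ");
  }
}
```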

Although Cite Unseen was developed at the 2018 hackathon, it is still a work in progress. At WikiCredCon, Payravi and Lim asked the Wikipedia community and other conference attendees for ways to improve the tool and effectively crowdsource the analysis of Wikipedia citations.

For NewsQ, Cite Unseen was a good example of how efforts to guide readers to reliable sources of information don’t always need to be developed and implemented solely by platforms. Instead, grassroots communities can collaborate to develop and refine these tools.

The role community plays in the success of Cite Unseen also validates the emphasis the NewsQ project places on encouraging discussions about insights generated by a community of researchers and practitioners committed to news quality. Typically, tools are developed by platforms and institutions first, with community engagement and user education coming afterward.

How Internet Archive Can Help Improve Information Quality on Wikipedia

In many of the sessions the NewsQ team was able to attend, Wikipedians and others repeatedly raised another key challenge with Wikipedia citations: link rot. Because cited links often break as web sources disappear or are intentionally removed over the years, it can sometimes be impossible to verify the information in Wikipedia articles, or to ensure the citations will still be around in the future.

Also present at WikiCredCon, the Internet Archive offered a solution, pledging not only to use its Wayback Machine to preserve outgoing links, but also to recognize book references and add links where the books are available on the Archive. While this project is in its preliminary stages, the lesson, once again, is that community and collaboration can help promote news and information quality.
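
For a sense of how this kind of link repair can work in practice, here is a minimal sketch against the Internet Archive’s public Wayback Machine availability API. It is illustrative only: the example URL is a placeholder, and this is not the code the Archive actually runs against Wikipedia.

```typescript
// Minimal sketch: look up the closest archived snapshot of a URL via the
// Wayback Machine availability API (https://archive.org/wayback/available).
// Illustrative only; not the Internet Archive's actual Wikipedia tooling.

interface WaybackResponse {
  archived_snapshots: {
    closest?: {
      available: boolean;
      url: string;       // web.archive.org snapshot URL
      timestamp: string; // e.g. "20190101000000"
      status: string;    // HTTP status of the snapshot, e.g. "200"
    };
  };
}

async function findSnapshot(url: string): Promise<string | null> {
  const endpoint =
    "https://archive.org/wayback/available?url=" + encodeURIComponent(url);
  const res = await fetch(endpoint);
  if (!res.ok) return null;
  const data = (await res.json()) as WaybackResponse;
  const closest = data.archived_snapshots.closest;
  // Only return a snapshot the Archive reports as available.
  return closest && closest.available ? closest.url : null;
}

// Example: a citation whose original link has rotted could be repaired by
// swapping in the archived snapshot, if one exists.
findSnapshot("https://example.com/some-cited-article").then((snapshot) => {
  console.log(snapshot ?? "No archived snapshot found");
});
```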

Addressing the Problem of Bias

Wikipedia and WikiCredCon also demonstrated that bias can complicate efforts to find a solution to disinformation and misinformation. Carwil Bjork-James of Vanderbilt University, a board member of Wiki Education, presented on next steps for countering systemic bias in Wikipedia. The session noted that the demographics of Wikipedians drive imbalances of interest, and therefore of coverage, resulting in systematic misrepresentation along the lines of race, gender, sexuality, geography, indigeneity, and economic class.

In NewsQ’s own panel discussion on the use of whitelists and blacklists as a tactic to disempower false information, bias was highlighted as a key obstacle: biases of all kinds can easily be built into algorithms that were originally intended to surface quality news.

“Battling Fake News Has Been Wikipedia’s Mission Since 2001”

The “principled design” of the Wikipedia community itself may provide lessons for how to surface quality news. In a 2018 essay recently republished on the MisinfoCon blog, WikiCredCon participant Pete Forsyth argues that Wikipedia’s design and governance, rooted in carefully articulated values and policies, serve to ensure information can be trusted.

One reason, Forsyth says, is that the Wikipedia experience is the same for everyone: unlike on Google, Facebook, or other platforms where many people consume information, there is little personalization on Wikipedia, and where personalization does exist, Forsyth notes, the design of the algorithms is mostly transparent. On top of that, unlike most major platforms, Wikipedia does not collect, harvest, or monetize user data.

The result, according to Forsyth, is that there is no effort on Wikipedia to anticipate what users will find interesting and then present results that could be tainted by political spin or advertising interests.

It’s Critical to Address Bias

The NewsQ team came away from WikiCredCon with many valuable insights that will help inform our work surfacing quality news, as well as plenty of new questions. Notably, it is critical to address bias when building tools and developing approaches to misinformation, and community and collaboration can help with that. How to do this for specific topics and communities is something the NewsQ team continues to discuss and reflect on.
