MisinfoCon DC Summary: Day One

Karina M. Chapman
Published in MisinfoCon · 11 min read · Aug 15, 2018


Photo credit: Karina M. Chapman
MisinfoCon 4.0 (Photo credit: Filippo Menczer, @truthyatindiana)

Attending a MisinfoCon has been one of my goals since the first was held in Boston in February of 2017. Like so many others who tried to register, I was unable to get a spot. However, Hacks/Hackers soon announced that there would be more MisinfoCons in our future. One was held in London, and another in Kyiv — unfortunately, both of these were out of my reach.

Enter MisinfoCon 4.0, held in Washington, DC on August 6 and 7, 2018. I was delighted to be one of the estimated 300 people lucky enough to find a place at the tables.

The two-day conference, held in the Knight Conference Center inside the Newseum (which, by the way, you must visit whenever you find yourself in our nation’s capital), was phenomenal. Billed as the “largest MisinfoCon to date,” it featured multiple daily Lightning Talks of ten minutes each, rigorously timed and sometimes hilariously adhered to by the presenters.

I could have listened to each of the presenters for hours.

Each day followed the same format: four ten-minute “Lightning Talks,” followed by a Q&A and a break; then four (or more) Lightning Talks, another Q&A, and lunch. After lunch, we broke into workshops of our choice to tackle an issue dealing with mis- or disinformation. These workshops lasted several hours. At the end of the day, one person from each group presented an overview of what was accomplished (with slides, of course).

Day one focused on state actors. The following summaries are based on my notes from the sessions, and I absolutely welcome corrections and input.

The first Lightning Talk session of day one featured the following speakers:

  • Renee DiResta, Mozilla
  • Adam Hickey, US Department of Justice
  • Lisa-Marie Neudert, Oxford Internet Institute
  • Haroon Ullah, Chief Strategy Officer, Broadcasting Board of Governors

Renee DiResta, Mozilla Foundation

Renee DiResta began MisinfoCon by discussing the cybersecurity implications of mass dissemination of misinformation.

Renee DiResta (@noUpside), Mozilla Foundation. (Photo credit: Dina Sadek)

DiResta stated, “We’re still treating misinformation as an issue of truth, and not as an issue of cybersecurity.” Misinformation dissemination included some unexpected platforms, such as Pinterest and Pokémon Go. There were also websites that looked like citizen journalism, in that the content looked much like what one would expect; however, they were created by foreign actors. The main topic of dissemination was racial tension. DiResta explained that misinformation aimed at both sides of the 2016 presidential campaign was unified in its anti-Clinton stance. Some of the misinformation on the left was focused on suppressing the Clinton vote, using such tropes as “a vote for Stein is not wasted” and “Bernie’s nomination was stolen.” On the right, it was mostly anti-Rubio, pro-Trump, and (of course) anti-Clinton. Further, because manipulation of the 2016 election was “so easy,” it actually ramped up in 2017. DiResta stated that campaigns now target industry, entertainment, energy, and more: “Our narratives are being laundered,” she explained. With the advent of audio and video manipulation, DiResta warned, it will become even harder “to trust our own eyes,” while these campaigns continue to take advantage of our laws protecting freedom of speech.

Adam Hickey, US Department of Justice

Adam Hickey spoke next about the IRA — the Internet Research Agency.

Deputy Assistant Attorney General Adam Hickey, US Department of Justice

The IRA is a Russian troll farm which has since been indicted for efforts to interfere with our elections, spreading what Hickey called “absolute untruths.” Unlike American opinions, which, even if unpopular, are protected by the First Amendment, the disinformation published by the IRA was paid for by Russia to spread propaganda; it was, he said, a “conspiracy to defraud.” The United States does not hinder the free speech of foreign individuals; however, it does require them to file a disclosure statement periodically. The goal of the United States, therefore, said Hickey, was “transparency, not prohibition.” A report by the Attorney General’s Cyber Digital Task Force warned of “foreign malign operations” that may try to hack, rig elections, doxx, influence state media, and even create and support rallies attended by unwitting Americans. Hickey said that doxxing, for example, may violate laws as well as values and privacy, all of which could have a chilling effect on free speech. He ended by saying that the Justice Department is sharing information through the FBI’s foreign influence task force, and is working to safeguard Americans from dis- and misinformation spread by foreign actors.

Lisa-Marie Neudert, Oxford Internet Institute

Lisa-Marie Neudert discussed regulations on social media.

Lisa-Marie Neudert, Oxford Internet Institute (@lmneudert)

On one hand, she said, regulation may not be the solution. On the other hand, we may be “past the point” of effective self-regulation. Regardless, she said, “Countries and governments are tightening their control on information and content posted online.” Germany, for example, has enacted a Network Enforcement Act, the Netzwerkdurchsetzungsgesetz (NetzDG). In full effect since January 2018, it “holds companies liable” for what users post. Based on German law, the act requires libel, slander, hate speech, or misinformation to be taken down within 24 hours of notification, or the company will be fined. The Oxford Internet Institute found that on YouTube alone, for example, just under 215,000 complaints were filed, resulting in over 58,000 posts being removed. Google+ had almost 3,000 complaints against just over 1,200 takedowns, and Twitter had over 260,000 complaints but took down only about 28,600 posts. Facebook was the anomaly, with only 856 complaints and 362 takedowns. Neudert explained that the Facebook number may be so small because the button for reporting is somewhat hard to see! Unfortunately, the people doing the work of taking down posts aren’t lawyers or policymakers; they’re social media employees, and as Neudert explained, sometimes they get it wrong. She added that the “deep threats to democracy” reach far beyond mere Russian meddling. Regulations are now being passed worldwide, and their side effects, she warned, could be chilling to public discourse.

Haroon Ullah, Broadcasting Board of Governors

Haroon Ullah, Chief Strategy Officer for the BBG, gave the final morning Lightning Talk of day one.

Haroon Ullah, Broadcasting Board of Governors (@haroonullah)

He began by explaining that BBG sees four Disinformation Challenges in 2018:

  1. Declining media and digital literacy
  2. EU and NATO countries at risk
  3. Content attribution and distribution
  4. Censorship and circumvention

Foreign actors are increasingly targeting audiences with low media literacy, in places like Italy, Romania, and Greece, because people in these smaller countries aren’t as ”resilient to disinformation”. As a result, disinformation travels further than it would in countries with more robust media literacy programs. Ullah explained that disinformation campaigns are becoming hyper-local, focusing on smaller areas that larger news agencies don’t cover, which makes the disinformation harder to counter. He mentioned ten languages in which BBG is ramping up its fight against disinformation: Spanish, Russian, Portuguese, Chinese/Mandarin, Arabic, Tagalog, Farsi, Rohingya, Uyghur, and Urdu. Journalists working in these languages who tackle disinformation are ending up in jail, and sometimes their families are, too. To combat this, BBG is doing the following:

  • Launching broadcasting in smaller niche languages (Bulgarian, Romanian, etc.)
  • Establishing local and international fact-checking programs
  • Working to enhance media/digital literacy with training and materials
  • Promoting high standards of journalism and best practices
  • Empowering local media with ideas and technology
  • Collaborating with NGOs, local initiatives, and influencers

Ullah stated that, as journalists become more effective at countering disinformation, they are being targeted, and some are going missing. BBG is working hard to stay ahead of this dangerous curve.

After these four Lightning Talkers concluded their remarks, there was a short Q&A session, and then a quick break before session two began. The speakers for the second session included the following:

  • Deen Freelon, UNC School of Media & Journalism
  • Jess Leinwand, Facebook
  • Farida Vis, Visual Social Media Lab
  • Melanie Smith, Graphika

Deen Freelon, UNC School of Media & Journalism

Deen Freelon began the afternoon Lightning Talks by defining disinformation.

Deen Freelon, UNC School of Media and Journalism

Unlike misinformation, disinformation is “The surreptitious purposeful distribution of messages intended to harm targets.” Disinformation is meant to benefit its source, and both the source and the purpose of the message are hidden. Freelon said we need to know baseline information such as “who is being targeted, what is being said (what the disinformation really is), and which identities and tactics are most successful.” The IRA is active on Twitter, YouTube, and Facebook. Twitter identified 3,814 IRA accounts, which produced 175,993 tweets that 677,775 users interacted with; these were engagements, not simply views. Twitter states it has so far deleted over 200,000 troll tweets, which NBC then promptly invited people to read on its website. The House Intelligence Committee also identified 2,752 Twitter handles as Russian IRA human-coordinated accounts, according to a report dated November 1, 2017. Freelon noted that many of these accounts were created prior to 2016 and simply awaited activation once a purpose was found. As he said, “Putin didn’t roll out of bed in 2016.” Freelon stated that trolls infiltrate what’s popular and what people are interested in; some favored topics included the election, conservatives, and black issues. He finished by giving these “next steps”:

  • Get a paper out (any ideas?)
  • Continue theorizing disinformation
  • Analyze datasets that show how the messages spread and who spread them
  • Whither other state-sponsored disinfo orgs?

Jess Leinwand, Facebook

Jess Leinwand began by describing Facebook as an “authentic communication platform for all ideas,” where users control their experience, and friends and family come first.

Jessica Leinwand, Facebook

She explained Facebook’s algorithm, which uses inventory, signals, and predictions. The algorithm examines the content, who posted it, and how recently, along with Facebook’s prediction of how you’ll engage with it; Facebook then assigns the content a “score of relevancy,” and the higher the number, the more likely you are to see it. Leinwand also said Facebook’s policies and algorithm are subject to change at any time, although they “try to be transparent” about it. Facebook deals with misinformation by removing it, reducing its ranking, or informing the public why it’s inauthentic. Facebook does not, however, have a policy requiring only true content, nor is there a policy about removing something that’s untrue. Leinwand stated, “Nobody wants to have Facebook determine the truth or falsity of their content.” Instead, Facebook removes content for violating standards: bullying, harassment, hate speech, or anything contributing to “imminent violence.” Leinwand stated that Facebook will deal with misrepresentation as well: “You have to be who you say you are”; otherwise, Facebook will remove monetization opportunities and reduce content distribution. Facebook uses third-party fact-checkers who identify, review, and rate information, and it currently plans to expand fact-checking to additional companies and to target photos and videos. Leinwand stated that Facebook is taking more aggressive action against repeat offenders, and she assured attendees that Facebook is “working on transparency of misinformation” with the community, fact-checkers, and stakeholders.
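To make the “inventory, signals, predictions” pipeline a bit more concrete, here is a minimal sketch in Python. Every name, weight, and formula below is a hypothetical illustration of the general shape of such a ranker, not Facebook’s actual code.

```python
from dataclasses import dataclass

# Hypothetical stand-ins for the three inputs Leinwand described: inventory
# (candidate posts), signals (facts about each post), and predictions (how
# likely the viewer is to engage). All names and weights are illustrative.

@dataclass
class Post:
    author_affinity: float       # signal: how close the viewer is to the poster (0-1)
    hours_old: float             # signal: how recently the post was made
    predicted_engagement: float  # prediction: modeled chance of a like/comment/share (0-1)

def relevancy_score(post: Post) -> float:
    """Combine signals and predictions into a single 'score of relevancy'."""
    recency = 1.0 / (1.0 + post.hours_old)  # newer posts score higher
    return (0.5 * post.predicted_engagement
            + 0.3 * post.author_affinity
            + 0.2 * recency)

def rank_feed(inventory: list[Post]) -> list[Post]:
    """Sort candidate posts: the higher the score, the more likely you'll see it."""
    return sorted(inventory, key=relevancy_score, reverse=True)

# Example: a fresh post from a close friend can outrank an older viral post.
feed = rank_feed([
    Post(author_affinity=0.9, hours_old=1.0, predicted_engagement=0.4),
    Post(author_affinity=0.1, hours_old=30.0, predicted_engagement=0.8),
])
```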

Farida Vis, Visual Social Media Lab

Farida Vis spoke about visual mis- and disinformation, explaining how images were used to spread disinformation during the French and UK elections.

Farida Vis, Visual Social Media Lab

Visuals, such as that of an immigrant child crying at the feet of an ICE agent, had impact overnight, she said. Images tap into fears and can be very powerful. She discussed one example of disinformation: Nigel Farage’s “Breaking Point” poster, which contributed to the EU Referendum outcome leading to Brexit. Vis explained that BuzzFeed will put out lists of false and misleading news and images, but they don’t always catch everything. One disinformation image from the Westminster Bridge attack showed a woman in a hijab walking past an injured person, turning away in dismay. She was portrayed by a now-suspended IRA account, “Texas Lonestar” (@southlonestar), as “[casually] walking by a dying man while checking [her] phone.” Images such as these, Vis stated, “press emotional buttons,” and despite debunking, this particular image lives on. The Visual Social Media Lab has run 95 image mini case studies on the UK and French elections (52 UK, 43 France) and identified 21 image types of mis- and disinformation: video; photo; data visualization; photo with text; screenshot of tweet; screenshot from the web; screenshot of Facebook; collage; mixed media; and meme, including image macros, maps with text, and photographs with quotes. Vis noted that some of these may be accurate but humorous, and that memes make up a relatively small share compared to photographs. In examining these images, Vis noted that 30% of the problematic content is a true image given a false context. She ended with this information:

Images can be powerful and are not given the attention they deserve. Visuals and memes are the most complex to understand and therefore potentially dangerous vehicles of mis- and disinformation. What does the problem actually look like? Do we know? If we don’t know, how can we know what to do about it? No “straightforward” technical solutions!

Melanie Smith, Graphika

Melanie Smith was the final Lightning Talker of day one. She discussed state-sponsored disinformation dealing with a humanitarian crisis: the White Helmets of Syria.

Melanie Smith, Graphika (@MelanieFSmith)

Graphika created network maps of actors who spread disinformation about the White Helmets by filtering for accounts that mentioned #WhiteHelmets more than 50 times, on the assumption that these were the most interested (or obsessed). This filtering reduced 12 million tweets by 2.65 million accounts down to 12,000 accounts. What they found was an overlap among the US left (reached by disinformation), “truthers” (defined as “alternative media and conspiracy theorists”), and “Russia/Syria regime support accounts.” Disinformation about the White Helmets reached 56 million people and focused on sowing discontent while increasing support for Russian military action against the White Helmets. Oddly, one subset of the community was Pink Floyd fans, drawn in after band member Roger Waters ranted about the White Helmets and why they’re “actually terrorists.” At this point, White Helmets misinformation is being mainstreamed by outlets such as the Times. A comparative analysis was run on the Amesbury Novichok poisonings: the most popular hashtag in that group was #WhiteHelmets, and 40% of the accounts engaged in disinformation about Amesbury, while 42% were anti-Western-establishment/pro-Russia accounts working to distort information. These included officials of Russian embassies, ministers, and state apparatus “spreading false narratives” about the White Helmets. Graphika’s conclusion was that “disinformation targeting humanitarian actors has a different set of objectives and tactics to those used in electoral interference campaigning.” They will “continue their efforts to document attacks on the credibility and reputation of humanitarian workers in conflict zones.”
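To illustrate the filtering step Smith described, here is a minimal sketch in Python. The data layout and field names are assumptions for the sake of the example; Graphika’s actual pipeline is not public.

```python
from collections import Counter

def frequent_accounts(tweets, hashtag="#whitehelmets", threshold=50):
    """Keep only accounts that used `hashtag` more than `threshold` times.

    `tweets` is assumed to be an iterable of dicts with hypothetical
    "author" and "text" fields; the real data model may differ.
    """
    mention_counts = Counter(
        tweet["author"]
        for tweet in tweets
        if hashtag in tweet["text"].lower()
    )
    return {account for account, count in mention_counts.items()
            if count > threshold}

# In Smith's description, a step like this reduced ~12 million tweets from
# ~2.65 million accounts down to roughly 12,000 high-frequency accounts,
# which then became the nodes of the network map.
```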

After a Q&A, we broke for lunch, then individually chose a workshop to attend for the afternoon.

Photo Credit: @Misinfocon

The workshops lasted close to three hours, after which each group designated one person to present a summary, with slides, of what they had accomplished. Rather than attempt to summarize the summaries, and given the length of the day and the information overload which may have compromised my notetaking skills, I will simply add this Evernote link to said notes.

However, I can’t end this summary of day one without a shoutout to my personal favorite slide of the day, which was the “protest slide” from Credibility Coalition:

Connie Moon Sehat (@msconnie) and the “protest slide”

Well played, Connie. Well played.

___________________________

MisinfoCon is a community of people focused on the challenge of #misinformation & what can be done to address it. Events so far at MIT, London, Kyiv, and DC.


Educator, writer, photographer, traveler. Teacher of American History and Government. NEH Scholar. Member, Hacks/Hackers Boston. Coffee shop addict.