Slide 1

Slide 1 text

Combatting misinformation online: The role and relevance of the social sciences
The University of Manchester
24 September | 16:00 – 17:00

Slide 2

Slide 2 text

Combatting misinformation online
The role and relevance of the social sciences
Professor Rachel Gibson | Dr Ariadna Tsenina
24th September 2020

Slide 3

Slide 3 text

Structure
Insights from social science research:
1. What is misinformation – and why does it matter?
2. How does misinformation operate online?
3. Is fact-checking enough?
Looking forward:
4. How can we work together?
5. Introducing DILPS

Slide 4

Slide 4 text

What is misinformation?
1. Misinformation: false or inaccurate information spread without the intent to cause harm
2. Disinformation: information created and disseminated with the intent to deceive or mislead
3. Malinformation: the dissemination of true facts obtained through illegal means to inflict harm on others and achieve strategic goals

Slide 5

Slide 5 text

Why does it matter?
Misinformation is harmful to society, governments and business:
• Threatens the physical and mental well-being of citizens
• Threatens democracy
• Erodes consumer and citizen trust
• Contributes to a negative user experience and harms the brand of online social platforms

Slide 6

Slide 6 text

How does misinformation operate online?

Creation of Problematic Content
Who: state actors; domestic political actors; third parties; citizens
What: false information; misleading information; facts obtained illegally; constructed narratives; opinions

Initial Publication of Problematic Content
Where: ‘fake news’ sites; social media; mainstream media; other media outlets (such as political blogs)

Amplification of Problematic Content
How: replication (reposting); reformed sharing (incorporating information into new content)
Why, to influence: fake actors (‘bots’); influencers (politicians, opinion leaders, NGOs); journalists; citizens
Why, to warn: fact-checkers; influencers (politicians, opinion leaders, NGOs); journalists; citizens

Slide 7

Slide 7 text

Is fact-checking enough?
Misinformation → claims corrected by fact-checking organisations → corrected claims seen by citizens → corrected claims accepted by citizens → citizens change attitudes and behaviour in response to corrected claims
Misinformation persists at every stage of this chain:
• Some claims remain uncorrected
• Some corrected claims are not seen
• Some citizens reject corrected claims
• Misinformation continues to influence attitudes and inferences (subconsciously) even after correction

Slide 8

Slide 8 text

In sum, we have highlighted two major insights from social science research which are relevant for combatting misinformation:
1. Online misinformation is a complex ‘eco-system’ of different actors and pathways for dissemination
2. Misinformation can persist after fact-checking, particularly when it is linked to political beliefs
Solutions should therefore focus more on preventative, rather than corrective, strategies, with shared responsibility and cooperation amongst:
• Governments/regulators
• Tech and social media companies
• Journalists
• Citizens

Slide 9

Slide 9 text

Why involve the social sciences in the fight against misinformation?
The understanding of human behaviour and social relationships offered by the social sciences is crucial both for preventing harmful social activity and for inhibiting the effects of misinformation once individuals are exposed to it.

Slide 10

Slide 10 text

Some collaborative preventative strategies
• Creating common practical standards and pre-publication support for ‘influencers’ and office-holders on social media: research on the causes, forms and consequences of online misinformation can inform codes of conduct and guidelines on how to spot and avoid spreading misinformation as an influential content creator
• Redesigning digital media platforms to encourage mindful consumption and creation of information online: behavioural research can inform ‘nudges’ that make harmful activity more laborious and help users to reflect on and reconsider their actions as consumers and creators of information
• Empowering citizens: research on misinformation and media literacy can inform mechanisms that boost citizen resilience to misinformation and enable individuals to identify, avoid and tackle online misinformation effectively

Slide 11

Slide 11 text

DILPS: A Digital Information Literacy Programme for Schools
• Despite being ‘digital natives’, young people possess low levels of digital information literacy: only 2% of children and young people in the UK have the skills needed to tell if a news story is real or fake (Commission on Fake News and the Teaching of Critical Literacy in Schools in the UK, 2018)
• School curricula make poor provision for digital information literacy, and the delivery of this learning is largely the province of the teacher
• Teachers are, therefore, important gatekeepers of digital information literacy for future generations

Slide 12

Slide 12 text

DILPS: A Digital Information Literacy Programme for Schools
The project aims to develop a pilot commercial resource for secondary school teachers that will:
a) increase their understanding of the dynamics behind the spread of online political misinformation and the key strategies for challenging it
b) empower them to identify the teaching and learning opportunities for passing this understanding on to their students at Sixth Form level

Slide 13

Slide 13 text

Social Science Research | Teachers | Commercial Sector
• Knowledge exchange workshops with practitioners to identify gaps and needs in practice
• Academic workshop: applying insights from social science research to develop the resource
• Knowledge exchange workshops with commercial actors in the secondary education sector to identify opportunities for more widespread dissemination

Slide 14

Slide 14 text

Conclusion
• Misinformation is pervasive online and threatens the interests of society, governments and businesses alike
• Existing social science research suggests that combatting misinformation will require cooperation across all sectors of society, and that such action should focus on prevention rather than cure
• The social sciences have a crucial role to play in informing such strategies: initiatives such as DILPS exemplify what that could look like in practice

Slide 15

Slide 15 text

Anatomy of a Disinformation Campaign
Vladimir Barash, Ph.D. | Science Director, Graphika
June 24, 2020

Slide 16

Slide 16 text

Coronavirus Disinfo

Slide 17

Slide 17 text

We Are the Cartographers of Cybersocial Terrain

Slide 18

Slide 18 text

Our Philosophy
For thousands of years, human civilization has organized around geographic terrain. It is where nations were formed, wars fought, and goods traded. The rise of the networked society has resulted in the emergence of a new cybersocial terrain, which is now the key domain.

Slide 19

Slide 19 text

Our Technology Detects Network Deception & Helps Clients Understand Social Influence
Graphika leverages the power of machine learning to create the world’s most detailed maps of social media landscapes. Our platform discovers how communities form online and maps how influence and information flow in real time within large-scale networks. Our patented SaaS technologies and groundbreaking analytical capabilities currently serve several key areas:
• Strategic Communications
• Digital Marketing
• Disinformation Detection & Analysis
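Graphika's actual pipeline is proprietary, but the core idea behind this slide, discovering communities in a social graph, can be illustrated with a toy label-propagation algorithm. The sketch below is a hypothetical minimal example under assumed data, not Graphika's method:

```python
from collections import Counter

def label_propagation(edges, rounds=10):
    """Toy community detection: every account starts in its own
    community, then repeatedly adopts the most common label among
    its neighbours until labels stabilise. Ties are broken
    deterministically by label name."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    labels = {node: node for node in graph}  # one community per node
    for _ in range(rounds):
        changed = False
        for node in sorted(graph):
            counts = Counter(labels[n] for n in graph[node])
            best = max(counts, key=lambda lab: (counts[lab], lab))
            if labels[node] != best:
                labels[node] = best
                changed = True
        if not changed:
            break
    return labels

# Two tightly knit clusters joined by a single "bridge" edge:
edges = [("a", "b"), ("b", "c"), ("a", "c"),
         ("x", "y"), ("y", "z"), ("x", "z"),
         ("c", "x")]
communities = label_propagation(edges)
# The two triangles settle into separate communities despite the bridge.
```

Production systems replace this with far more sophisticated clustering over millions of accounts, but the principle is the same: densely interconnected accounts end up grouped together, and the resulting "map" reveals which communities are amplifying which content.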

Slide 20

Slide 20 text

The ABC’s of Digital Deception: Three Key Layers
A. Deceptive Actors (Who?) The Command & Control: IRA, GRU, Iran, China, dark-arts PR, disinformation-for-hire shops
B. Deceptive Behavior (How?) The Distribution System: disinformation armies, bot networks, trolls, embedded assets, alt news sites, overt state outlets
C. Deceptive Content (What?) Ammunition: true information, fake news, memes, hacked documents, cheap fakes, deep fakes, read fakes

Slide 21

Slide 21 text

B. Anatomy of a Disinformation Army
Roles in the pyramid: simple amplifiers, fake community, bodyguards, personas
Level 1: “eggs,” baby bots, simple automation, black market accounts
Level 2: Turing-testable automation, crafted profiles, integration into statecraft
Level 3: embedded assets, integration into organic communities, hybrid human + automated control, integration into the offline spectrum of public sphere manipulation

Slide 22

Slide 22 text

B. Anatomy of a Disinformation Army (continued)
The same three-level pyramid, here annotated with two axes: Automation and Asset Value
Level 1: “eggs,” baby bots, simple automation, black market accounts
Level 2: Turing-testable automation, crafted profiles, integration into statecraft
Level 3: embedded assets, integration into organic communities, hybrid human + automated control, integration into the offline spectrum of public sphere manipulation

Slide 23

Slide 23 text

Reconstructed Network: CCP-Attributed “Hong Kong Riots” Campaign
A. Actor: China
B. Behavior: “spamouflage” accounts hiding political messages in cultural content; personas targeting specific technical and industry segments, such as cloud computing and wireless tech
C. Content: CCP talking points; messages framing the Hong Kong protests as “riots” caused by separatists and terrorists

Slide 24

Slide 24 text

Hong Kong Protests & #BoycottNBA: A Global Problem With Business Implications
Network communities involved: Int. Anti-CCP; US Right; NBA Sports; INT US Media; US Left; Hong Kong Protestors; Boycott NBA

Slide 25

Slide 25 text

#BoycottNBA
The #BoycottNBA campaign was consistently pushed, simultaneously but separately, by the Hong Kong protesters group and the US Right. The NBA Sports group saw little activity.

Slide 26

Slide 26 text

Blizzard Blowback: Other Companies Were Also Targeted
Network communities involved: Int Illustration | Anime; US Politics | Media; Hong Kong Protestors; Int US Gaming | E-Sports

Hashtag                  Count   Focus
#boycottblizzard          1936    38.7
#boycottchina             1243    34.2
#blizzardboycott           600    15.9
#boycotnbat                362     5.3
#boycottblizard            314    12.8
#boycottmulan              263    17.0
#boycottvans               210    16.9
#boycottcathaypacific       67     9.8
#boycottccp                 62    10.1
#boycottespn                47     6.2
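Raw counts like those on this slide can be reproduced from any corpus of post texts with a simple tally. The sketch below uses invented example posts, not Graphika's data or pipeline:

```python
import re
from collections import Counter

def hashtag_counts(posts):
    """Count hashtag occurrences across a corpus of post texts.
    Matching is case-insensitive, so #BoycottBlizzard and
    #boycottblizzard are tallied together."""
    tags = Counter()
    for text in posts:
        tags.update(t.lower() for t in re.findall(r"#\w+", text))
    return tags

# Hypothetical example posts:
posts = [
    "Stand with HK #boycottblizzard #boycottchina",
    "Done with them. #BoycottBlizzard",
    "#boycottmulan and #boycottblizzard too",
]
counts = hashtag_counts(posts)
# counts["#boycottblizzard"] == 3, counts["#boycottchina"] == 1
```

Note that variant and misspelled tags (e.g. #boycottblizard vs #boycottblizzard in the table above) show up as separate entries, which is why campaign analyses often report the variants side by side rather than merging them.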

Slide 27

Slide 27 text

The Threat of Disinformation to Corporations
● Disinformation and network deception are, unfortunately, a booming business. These threats are not only political: brands and corporations have been hit by everything from fake boycott movements to coordinated harassment campaigns
● Foreign actors also have track records of targeting companies caught in the crosshairs of geopolitical rivalries, across a wide range of industries from sports and entertainment to technology
● The challenge for most companies is to understand when an online issue genuinely impacts key stakeholders, versus when it is merely noise that never reaches key audiences
● Brands should treat disinformation strategically, as a risk factor, and examine all aspects of a campaign: the messaging, how the narrative is evolving and being amplified, and who is engaging
● Early warning and monitoring empower corporations and organizations to get ahead of potential threats and manage false or misleading information that may impact the bottom line or brand reputation

Slide 28

Slide 28 text

Thank you.

Slide 29

Slide 29 text

Thank you for joining today’s session @aspect_network Aspect Social Sciences Network