Technology is redefining our world—its
economy, culture, societies and democracies.
Slide 3
Slide 3 text
“We demonstrated that the Web had failed instead
of served humanity. The increasing centralisation
has ended up producing a large-scale emergent
phenomenon which is anti-human.”
Slide 4
Slide 4 text
Technology negatively affects mental health.
Source: Is social media bad for you? The evidence and the unknowns
Slide 5
Slide 5 text
Source: How heavy use of social media is linked to mental illness
Slide 6
Slide 6 text
Technology is a megaphone for harassment.
Source: Twitter Has a Serious Harassment and Abuse Problem but Doesn’t Seem to Want to Cure It
Slide 7
Slide 7 text
No content
Slide 8
Slide 8 text
Technology treats personal data as currency.
Source: Uber concealed massive hack that exposed data of 57m users and drivers
Slide 9
Slide 9 text
The doctor I saw was terrible. But luckily I received
another opinion and the receptionist/nurse was lovely.
Which is why I would go back
BECAME
The receptionist/nurse was lovely.
Source: GP booking service HealthEngine sanitises patient reviews
Slide 10
Slide 10 text
Technology fosters unfairness and exclusion.
Source: Tech Leavers by Kapor Center
Slide 11
Slide 11 text
78%
of tech workers report some form of unfair
treatment.
Source: Tech Leavers by Kapor Center
Slide 12
Slide 12 text
0.8%
of jobs in tech companies are held by Black
women.
Source: Here’s the clearest picture of Silicon Valley’s diversity yet: It’s bad. But some companies are doing less bad
Slide 13
Slide 13 text
6%
of Fortune 500 chief executives are women.
Source: Fortune CEOs
Slide 14
Slide 14 text
98%
of VCs are white or Asian males.
Source: Silicon Valley’s Morality Crash
Slide 15
Slide 15 text
29%
is how wide the pay gap between men and
women can get.
Source: The Gender Pay Gap in Tech
Slide 16
Slide 16 text
60%
of women reported unwanted sexual advances.
Source: Elephant in the Valley
Slide 17
Slide 17 text
30%
of women of colour were passed over for a
promotion.
Source: Tech Leavers Study
Slide 18
Slide 18 text
95%
of Open Source contributors are male.
Source: GitHub Open Source Survey
Slide 19
Slide 19 text
Humans are paying the price for (un)intended
consequences of rapid advancement.
We are facing a crisis.
Slide 20
Slide 20 text
We’ve become complacent, engaging in an act
of cultural denial.
A process in which we avoid uncomfortable realities,
like poverty, suffering, injustice and racial oppression.
Slide 21
Slide 21 text
“A dangerous form of magical thinking often
accompanies new technological developments, a
curious assurance that a revolution in our tools
inevitably wipes the slate of the past clean.”
Virginia Eubanks
Slide 22
Slide 22 text
“Move fast and break
things”
PITFALLS OF TECHNOLOGY
Slide 23
Slide 23 text
No content
Slide 24
Slide 24 text
Disruption isn’t a get-out-of-jail-free card to bypass
laws, freedoms, moral compass, civil rights
and human protections.
Slide 25
Slide 25 text
“Technology is neutral”
PITFALLS OF TECHNOLOGY
Slide 26
Slide 26 text
Technology is not, never was and never will be,
neutral. Our biases and prejudices are baked in.
Slide 27
Slide 27 text
“This myopic focus on what’s new leads us to miss
the important ways that digital tools are embedded
in old systems of power and privilege.”
Virginia Eubanks
Slide 28
Slide 28 text
Lacking ethical
education
PITFALLS OF TECHNOLOGY
Slide 29
Slide 29 text
No content
Slide 30
Slide 30 text
“Ethical decision-making is like a muscle that
needs to be exercised lest it atrophy,
and experiments have shown that ‘moral reminders’
help people make more ethical decisions.”
Irina Raicu
Slide 31
Slide 31 text
Exclusion and
homogeneity
PITFALLS OF TECHNOLOGY
Slide 32
Slide 32 text
Until everyone, no matter their background, is
able to survive, thrive and be represented
online, the Web won’t be open.
Slide 33
Slide 33 text
How do we establish what’s ethical?
Slide 34
Slide 34 text
Ethical principles defend and systematise
moral, righteous conduct.
Slide 35
Slide 35 text
Human rights
and democracy
№1
Slide 36
Slide 36 text
Algorithms create filter bubbles that shield us
from content we disagree with, making it harder
to make informed choices as citizens.
Source: Your filter bubble is destroying democracy
Slide 37
Slide 37 text
Tech has to support and improve the civic
processes on which democratic societies
depend.
Slide 38
Slide 38 text
Ethical technology:
• Respects and extends human rights
• Serves and supports democracy
• Fights against the spread of misinformation
• Encourages civic engagement
Slide 39
Slide 39 text
Well-being
№2
Slide 40
Slide 40 text
A tech-saturated world decreases our
cognitive capacity and fosters anxiety,
depression and stress.
Source: The Future of Well-Being in a Tech-Saturated World
Slide 41
Slide 41 text
Technology should be a mindful, quiet
companion to our lives rather than an
overbearing disruptor, hijacking attention.
Reference: Calm Tech and Centre for Humane Technology
Slide 42
Slide 42 text
Ethical technology:
• Requires minimal attention
• Informs and creates calm
• Works in the background
• Respects societal norms
Slide 43
Slide 43 text
Security and safety
№3
Slide 44
Slide 44 text
Data breaches can have catastrophic
consequences, from identity theft to doxxing
or swatting.
Source: My Three Years in Identity Theft Hell
Slide 45
Slide 45 text
Ethical technology:
• Responds promptly to crises
• Eliminates single points of failure
• Invests in cryptography and security (see the sketch below)
• Protects the most vulnerable
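To make "invests in cryptography and security" slightly more concrete, here is a minimal sketch of storing credentials as salted key derivations instead of plain text, using only Python's standard library (hashlib.scrypt, secrets and hmac). The function names and parameter values are illustrative assumptions, not a prescription from the talk.

```python
import hashlib
import hmac
import secrets

# Illustrative scrypt parameters; real deployments should follow
# current guidance and their own memory budget.
SCRYPT_N, SCRYPT_R, SCRYPT_P = 2 ** 14, 8, 1


def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a key from the password; only (salt, key) is ever stored."""
    salt = secrets.token_bytes(16)
    key = hashlib.scrypt(password.encode(), salt=salt,
                         n=SCRYPT_N, r=SCRYPT_R, p=SCRYPT_P)
    return salt, key


def verify_password(password: str, salt: bytes, stored_key: bytes) -> bool:
    """Re-derive the key and compare it in constant time."""
    key = hashlib.scrypt(password.encode(), salt=salt,
                         n=SCRYPT_N, r=SCRYPT_R, p=SCRYPT_P)
    return hmac.compare_digest(key, stored_key)
```

The point of the design is simply that the raw password is never kept and the comparison leaks no timing information; which key-derivation function and parameters to use is a separate decision.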
Slide 46
Slide 46 text
Responsibility
and accountability
№4
Slide 47
Slide 47 text
It’s humans who put artificial intelligence in
place, and it’s humans who should take
ownership of its systemic flaws.
Slide 48
Slide 48 text
Source: Able, Allowed, Should: Navigating Modern Tech Ethics by Margaret Gould Stewart
Slide 49
Slide 49 text
Ethical technology:
• Is aware of diverse social and cultural norms
• Creates policies for algorithmic accountability
• Collaborates with lawmakers to advance
regulations
• Complies with national and international
guidelines
Slide 50
Slide 50 text
Data protection
and privacy
№5
Slide 51
Slide 51 text
Data can be used to enhance user experience,
but it can also be easily weaponised.
Source: Facial recognition software is not ready for use by law enforcement
Slide 52
Slide 52 text
Sensitive data should be easy to modify,
restrict, export and delete.
Don’t collect it. Don’t store it. Don’t keep it.
Source: Haunted by Data by Maciej Cegłowski
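As one possible illustration of "don't collect it, don't store it, don't keep it", here is a minimal Python sketch of a user record that holds only the fields a service needs, with export and permanent deletion built in. The User and UserStore names and fields are hypothetical, chosen only to make the principle tangible.

```python
import json
from dataclasses import asdict, dataclass


@dataclass
class User:
    # Only the fields the service actually needs to operate;
    # nothing is collected "just in case".
    user_id: str
    display_name: str
    email: str  # needed for login and account recovery


class UserStore:
    """Hypothetical in-memory store showing export and real deletion."""

    def __init__(self) -> None:
        self._users: dict[str, User] = {}

    def add(self, user: User) -> None:
        self._users[user.user_id] = user

    def export(self, user_id: str) -> str:
        # Hand people their data back in a portable format.
        return json.dumps(asdict(self._users[user_id]))

    def delete(self, user_id: str) -> None:
        # Permanent, immediate removal rather than a soft "deactivated" flag.
        del self._users[user_id]
```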
Slide 53
Slide 53 text
Ethical technology:
• Only collects data necessary for operation
• Gives full control of data, including permanent
and swift deletion
• Allows anonymity
Slide 54
Slide 54 text
Transparency
№6
Slide 55
Slide 55 text
Lack of transparency increases the magnitude
of harm and lowers accountability.
Black boxes cannot be challenged.
Slide 56
Slide 56 text
Ethical technology:
• Responsibly discloses abuse of software
• Establishes clear rules for reporting
and accountability
• Has mission and value statements
Slide 57
Slide 57 text
Misuse and
bias awareness
№7
Slide 58
Slide 58 text
No content
Slide 59
Slide 59 text
Algorithms are thoughtless.
Software doesn’t learn. We teach it.
Source: How Machines Learn to Be Racist
Slide 60
Slide 60 text
“Technologies and their design do not
dictate racial ideologies; rather, they reflect
the current climate.”
Safiya Umoja Noble, Algorithms of Oppression
Slide 61
Slide 61 text
Ethical technology:
• Is aware of and combats unconscious bias
• Tests for misuse and malice (see the sketch below)
• Fights against harmful societal inequalities
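One simple, widely used heuristic for "tests for misuse and malice" is the four-fifths rule: compare favourable-outcome rates across groups and flag any group whose rate falls well below the highest. Below is a minimal sketch, assuming decisions arrive as (group, outcome) pairs; the threshold and data shape are illustrative, and passing such a check is evidence at best, not proof of fairness.

```python
from collections import defaultdict


def selection_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """decisions are (group label, favourable outcome?) pairs."""
    totals = defaultdict(int)
    favourable = defaultdict(int)
    for group, positive in decisions:
        totals[group] += 1
        if positive:
            favourable[group] += 1
    return {group: favourable[group] / totals[group] for group in totals}


def flags_disparate_impact(decisions: list[tuple[str, bool]],
                           threshold: float = 0.8) -> bool:
    """True if any group's favourable-outcome rate falls below
    `threshold` times the highest group's rate (the four-fifths rule)."""
    rates = selection_rates(decisions)
    highest = max(rates.values())
    return any(rate < threshold * highest for rate in rates.values())


# Example: group B's rate (0.5) is below 0.8 x group A's rate (1.0).
assert flags_disparate_impact([("A", True), ("A", True),
                               ("B", False), ("B", True)])
```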
Slide 62
Slide 62 text
Diversity
and inclusion
№8
Slide 63
Slide 63 text
Diverse teams are more creative, performant
and welcoming.
Source: Why Diverse Teams Are Smarter
Slide 64
Slide 64 text
Technological redlining, which reinforces oppressive
social relationships and enacts racial profiling,
cannot exist in an ethical world.
Slide 65
Slide 65 text
“When automated decision-making tools are not
built to explicitly dismantle structural inequities,
their speed and scale intensify them.”
Virginia Eubanks, Automating Inequality
Slide 66
Slide 66 text
Ethical technology:
• Is inclusive of all people
• Prioritises diverse teams and organisations
• Prevents technological redlining
Slide 67
Slide 67 text
Tools and resources
Slide 68
Slide 68 text
Recommended reading
Slide 69
Slide 69 text
No content
Slide 70
Slide 70 text
Canvases, workshops and applications
Source: How to practice ethical design?
Slide 71
Slide 71 text
Manifestos and pledges
Source: Ethical Design Manifesto by ind.ie
Slide 72
Slide 72 text
“We demonstrated that the Web had failed instead
of served humanity. The increasing centralisation
has ended up producing a large-scale emergent
phenomenon which is anti-human.”
Sir Tim Berners-Lee
Slide 73
Slide 73 text
The shift towards more humane technology is
happening, but it needs your help.
Slide 74
Slide 74 text
We are all responsible for what the Web
is today and will become tomorrow.
Slide 75
Slide 75 text
“Should we build this?” has to become
the ethical foundation for our work.
Slide 76
Slide 76 text
The Web ought to enhance our lives and
fulfil our dreams, rather than crush hopes,
magnify fears and deepen our divisions.