UNDERSTOOD, GOVERNED AND MADE ACCOUNTABLE? THESE ARE CHALLENGING BUT SIGNIFICANT POLITICAL QUESTIONS, WHICH REQUIRE URGENT PUBLIC DEBATE. Dries Buytaert, creator of Drupal ALGORITHM DEVELOPMENT
▸ Algorithm design for office temperature optimization ▸ Facebook’s 2014 “emotional contagion” mood manipulation ▸ Search: ▸ Men are more likely to be shown Google Ads for high-paying jobs ▸ Ads for arrest records are more likely to show up on searches for distinctively black names or historically black fraternities ▸ Learned autocomplete ▸ AI assistants returning jokes when users complain of assault (e.g. Siri)
American Institute for Behavioral Research & Technology, information displayed in Google’s search engine could shift voting preferences for undecided voters by 20% or more ▸ Voting Machines ▸ Medical Devices ▸ Algorithms to determine early release from prison ▸ Breathalyzers: the algorithms behind breathalyzers have been requested in court, but access has been denied
goes far beyond insensitivity & insults to arbitrarily limit people’s opportunities ▸ Most ML objective functions create models accurate for the majority class at the expense of the protected class ▸ For minority populations, the number of training samples is dwarfed by the majority ▸ Accuracy-Fairness tradeoff
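The imbalance problem above can be made concrete: with a 95/5 class split, a model that ignores the minority class entirely still scores 95% accuracy. A minimal sketch in plain Python, using hypothetical labels:

```python
# Hypothetical dataset: 95 majority-class samples (0), 5 minority-class samples (1).
labels = [0] * 95 + [1] * 5

# A degenerate "model" that always predicts the majority class.
predictions = [0] * len(labels)

correct = sum(p == y for p, y in zip(predictions, labels))
accuracy = correct / len(labels)

minority = [(p, y) for p, y in zip(predictions, labels) if y == 1]
minority_acc = sum(p == y for p, y in minority) / len(minority)

print(accuracy, minority_acc)  # 0.95 0.0
```

Overall accuracy looks excellent while accuracy on the protected class is zero, which is exactly how an objective function optimizing raw accuracy can hide unfairness.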
FAIRNESS? ▸ No current model for algorithmic fairness ▸ We don’t even have one “best” conception of what fairness actually is! ▸ One way to characterize fairness is to ensure that both the majority and the protected population have similar outcomes. ▸ We know we must involve as many people as possible in design & discussion of algorithms
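The “similar outcomes” characterization above has a simple quantitative form, often called demographic parity: compare the rate of favorable outcomes across groups. A sketch with made-up numbers (the loan-approval scenario is hypothetical):

```python
def positive_rate(outcomes):
    """Fraction of favorable (1) outcomes in a group."""
    return sum(outcomes) / len(outcomes)

# Hypothetical loan-approval outcomes (1 = approved, 0 = denied).
majority_outcomes = [1, 1, 0, 1, 1, 0, 1, 1]   # 6/8 approved
protected_outcomes = [1, 0, 0, 1, 0, 0, 0, 1]  # 3/8 approved

# Demographic-parity gap: 0 means identical approval rates across groups.
gap = abs(positive_rate(majority_outcomes) - positive_rate(protected_outcomes))
print(gap)  # 0.375 -- a large disparity by this measure
```

Demographic parity is only one of several competing fairness definitions, which is the point of the slide: different metrics can disagree, and some are mutually incompatible.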
leads to missed opportunities ▸ Netflix’s “Recommended for you” as an example of optimization leading to tunnel vision ▸ User-centered design values > pure optimization ▸ If the algorithm is always trying to disprove its own model, we avoid over-fitting ▸ Allows for “serendipity”
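One common way to build the self-questioning behavior described above into a recommender is an epsilon-greedy policy: usually serve the top-scored item, but occasionally serve a random one so the model keeps testing its own assumptions. A minimal sketch with hypothetical relevance scores (not Netflix’s actual method):

```python
import random

def recommend(scores, epsilon=0.1, rng=random):
    """Epsilon-greedy pick: usually the top-scored item,
    sometimes a random one to allow for serendipity."""
    items = list(scores)
    if rng.random() < epsilon:
        return rng.choice(items)       # explore: challenge the model
    return max(items, key=scores.get)  # exploit: trust the model

# Hypothetical relevance scores from a recommender model.
scores = {"drama": 0.9, "documentary": 0.4, "anime": 0.2}
print(recommend(scores, epsilon=0.0))  # deterministic: "drama"
```

Raising `epsilon` trades a little short-term accuracy for exposure to items the model would otherwise never test, which is the “serendipity” the slide refers to.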
KNOW WHICH DATA IS CAPTURED, HOW THAT DATA IS USED, BUT ALSO HOW THESE ALGORITHMS WORK…IT WOULD BE GOOD IF SOMEBODY COULD AUDIT THESE ALGORITHMS TO BE SURE THERE ISN’T BIAS BUILT INTO THEM —EITHER ON PURPOSE OR BY ACCIDENT
sell, and rent algorithms now ▸ Allows non-algorithm developers to make business choices about which algorithms to use ▸ Follows “free market” principles: ▸ The best algorithms will become the most popular ▸ People will seek out algorithms specific for their needs ▸ Creates space for two-way communication between creators and users
▸ Similar to microservice & SOA architecture ▸ More cutting-edge algorithms on the market; fewer stuck in academia, and a lower barrier to entry ▸ Standardization through reuse and chaining ▸ Commercialization incentivizes accuracy & validation ▸ Follows “free market” principles: ▸ Not necessarily concerned with ethics ▸ People can choose to use biased algorithms
AGENCIES, COURTS, AND STATE AND FEDERAL LAWMAKERS UNDERSTAND THE TECHNOLOGY WELL ENOUGH TO MAKE POLICY. Ryan Calo, assistant professor of law at the University of Washington & expert in cyberlaw ALGORITHMIC ACCOUNTABILITY
to oversee companies’ algorithms ▸ Proposed by many in the Open Web movement ▸ 2014 White House report on Big Data raised concerns about privacy and fairness ▸ India recently banned “Free Basics” provided by internet.org for violating the essential rules of net neutrality
▸ It can be in a business’s best interest to have oversight ▸ Businesses led the push toward corporate responsibility ▸ Established the idea of “public interest” in journalism ▸ Led to legal rules accordingly ▸ Consumers began demanding more rigorous oversight
intelligence ethics board, founded after buying DeepMind ▸ They refuse to name who is on the board ▸ Not transparent about what they do ▸ Other AI startups are more inclined to share who is on their ethics boards, such as Lucid.AI, which has made its board public ▸ If we don’t know who is on the ethics oversight boards, how do we know they are ethical?
DESIGN SYSTEMS WITH STRONGER VALUES. THEY MAY NOT CHANGE US (WE ARE OLD), BUT OUR CHILDREN WILL SEE THE VALUES IN THESE SYSTEMS AS NORMAL. THAT IS BOTH SCARY AND EXCITING. Buster Benson on Eric Meyer’s XOXO 2015 Conference Talk CHOICE
Four-Letter Word” for RubyConf 2011 ▸ Rachel Shadoan’s “Reasoning About Opaque Algorithms” in The Recompiler ▸ Cathy O’Neil’s “Weapons of Math Destruction: How Big Data Increases Inequality & Threatens Democracy” ▸ Eli Pariser’s “The Filter Bubble” ▸ “The Secret Rules of Modern Living: Algorithms” on Netflix