
Decisions: Week 6

Will Lowe
August 31, 2021


Transcript

  1. M Machine learning and automation → Converging case studies → Resistance to automation → Stakeholders → Three ways to think about automated decision making → Exercise
  2. M We’ll take a broad definition of machine learning for decision making → any kind of algorithmically defined, automated process ending with an action. Examples that differ only in implementation → rule-based expert system, neural network, fly-by-wire piloting system. We’ll focus on issues around automation. Claim 1 (Algorithm Watch Report 2020): we’re no longer just automating society. We have automated it already. Claim 2: → The problems of automation have very little to do with machines, or learning
  3. M → pre-1725: designs are realized by a ‘draw-boy’ moving threads, directed by a weaver → 1725-28: Basile Bouchon and Jean-Baptiste Falcon figured out how to realize a design using punched cards → 1745: Jacques de Vaucanson arranged for the punched tape to move to the next row, a ‘carriage return’ (a weaver provided the power source) → 1804: Joseph Marie Jacquard invents and popularizes the programmable loom. A portrait in 1839, 24,000 cards
  4. M Weaving gets with the program. Powered looms were possible since Cartwright (1785) and perfected by Kenworthy and Bullough in the ‘Lancashire Loom’ (1842). Note the steady removal of the human from designing, realizing, and powering the process. At the same time → 1806-14: The Continental System (a trade blockade from most of Europe) → Difficulties with American trade
  5. R Luddites destroying a Jacquard loom. 1810s: ‘Luddites’ destroyed weaving machines, rioted, and assassinated a mill owner. What was this resistance to? → Machines themselves? → British macroeconomic policy? → Lack of social safety net? → Worsening factory working conditions? → Removal / denigration of the intrinsic value of labour
  6. R Luddites destroying a Jacquard loom. 1810s: ‘Luddites’ destroyed weaving machines, rioted, and assassinated a mill owner. What was this resistance to? → Machines themselves? → British macroeconomic policy? → Lack of social safety net? → Worsening factory working conditions? → Removal / denigration of the intrinsic value of labour. 1842: ‘Lancashire Loom’ invented. 1842: Friedrich Engels is sent to Salford (Greater Manchester) to oversee his father’s weaving factory
  7. A ... Suppose one of these men, as I have seen them, – meagre with famine, sullen with despair, careless of a life which your lordships are perhaps about to value at something less than the price of a stocking-frame – suppose this man surrounded by the children for whom he is unable to procure bread at the hazard of his existence, about to be torn for ever from a family ... Are we aware of our obligations to a mob? It is the mob that labour in your fields, and serve in your houses – that man your navy, and recruit your army – that have enabled you to defy all the world, – and can also defy you, when neglect and calamity have driven them to despair. You may call the people a mob, but do not forget that a mob too often speaks the sentiments of the people. Lord Byron, Feb 27th, 1812, on the Frame Work Bill (Hansard link)
  8. ... The Frame Work Bill passed and turned into the Destruction of Stocking Frames, etc. Act 1812 (52 Geo. 3 c. 16). Made destruction of looms a ‘capital felony’ → Previously up to 14 years in a penal colony(!) → Later changed to transportation → And then back to a capital punishment. The Luddite movement was effectively suppressed: many hanged, killed by troops, imprisoned...
  9. ... The Frame Work Bill passed and turned into the Destruction of Stocking Frames, etc. Act 1812 (52 Geo. 3 c. 16). Made destruction of looms a ‘capital felony’ → Previously up to 14 years in a penal colony(!) → Later changed to transportation → And then back to a capital punishment. The Luddite movement was effectively suppressed: many hanged, killed by troops, imprisoned... In the meantime, Byron’s daughter was becoming a programmer
  10. E Charles Babbage and Ada Lovelace (née Byron) worked on computing machines: the Difference Engine (∼1822) and, later, the general-purpose Analytical Engine
  11. W ? The British government funded most of the development. Why? Because it wanted the output: → Astronomical and mathematical tables. When the machines were not built or abandoned they were (perhaps understandably) unhappy... → Babbage could not or did not see this dynamic → Lovelace didn’t need to → They were building something that ‘did’ mathematics. A century later → Turing and Welchman knew – and built the ‘Bombe’
  12. W ? As soon as labour in the direct form has ceased to be the great well-spring of wealth, labour time ceases and must cease to be its measure, and hence exchange value [must cease to be the measure] of use value. The surplus labour of the mass has ceased to be the condition for the development of general wealth, just as the non-labour of the few, for the development of the general powers of the human head. With that, production based on exchange value breaks down, and the direct, material production process is stripped of the form of penury and antithesis. The free development of individualities[, ...] the general reduction of the necessary labour of society to a minimum, which then corresponds to the artistic, scientific etc. development of the individuals in the time set free, and with the means created, for all of them. Marx, Grundrisse (the ‘Fragment on Machines’)
  13. T How to pull humans out of the decision making loop → Human decisions about action, individual and unaided → Human decisions about action, in institutions → Human decisions about action, aided by calculation tools → Human decisions about inputs to calculation, e.g. probability or loss elicitation, and as trigger → Human decision outcomes (behaviour) as input, e.g. collaborative filtering → Combinations of the elements above, e.g. automated trading → Human decisions about goals/losses, e.g. self-driving cars, automated fraud checking. Separately these roles may be hidden or transparent...
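The ‘behaviour as input’ role above can be made concrete with a minimal sketch of item-based collaborative filtering. Everything here — the users, items, and ratings — is invented for illustration; a real recommender would work on a large sparse matrix, not small dicts.

```python
# Minimal item-based collaborative filtering: past human decisions
# (ratings) become the input that drives an automated recommendation.
# Users, items, and ratings are hypothetical.
from math import sqrt

# rows = users, columns = items; 0 means "not rated"
ratings = {
    "ann":  {"loom": 5, "press": 3, "engine": 0},
    "bert": {"loom": 4, "press": 0, "engine": 2},
    "cleo": {"loom": 0, "press": 4, "engine": 5},
}

def cosine(a, b):
    """Cosine similarity between two item-rating vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def item_vector(item):
    """The column of ratings for one item, across all users."""
    return [ratings[u][item] for u in ratings]

def recommend(user):
    """Score each unrated item by similarity-weighted ratings of rated items."""
    rated = {i: r for i, r in ratings[user].items() if r > 0}
    scores = {}
    for item, r in ratings[user].items():
        if r == 0:
            sims = {j: cosine(item_vector(item), item_vector(j)) for j in rated}
            denom = sum(sims.values())
            if denom:
                scores[item] = sum(sims[j] * rated[j] for j in rated) / denom
    return scores
```

Note how cleanly this fits the taxonomy: no one decided a rule for what ‘ann’ should see; other people’s recorded behaviour decides it.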
  14. R G Rabbi Judah Loew ben Bezalel (1500s), ‘Josef’; c.f. McElreath (2020) Statistical Rethinking (ch. 1). Ann Leckie (2013), c.f. Fall (2020). James Cameron’s Aliens (1986)
  15. N Consequentialism → Judge a technology by its outputs, e.g. utility → Often softened to ‘rule consequentialism’, e.g. ‘maximize expected utility’, transparency optional. Corollary: → Judge the fairness of a technology by its actual and its counterfactual outputs
  16. N Consequentialism → Judge a technology by its outputs, e.g. utility → Often softened to ‘rule consequentialism’, e.g. ‘maximize expected utility’, transparency optional. Corollary: → Judge the fairness of a technology by its actual and its counterfactual outputs. Deontology → Judge a technology by its operating principles → Most naturally applied to rule-based decision systems, transparency a virtue. Corollary: → Judge the fairness of a technology by the operating principles it realizes
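The ‘maximize expected utility’ rule mentioned above can be sketched in a few lines. The loan scenario, the probabilities, and the utilities below are hypothetical numbers chosen only to show the mechanics of the rule.

```python
# 'Maximize expected utility' as a decision rule, on a toy loan decision.
# P(state) and U(action, state) are invented for illustration.

p = {"repays": 0.9, "defaults": 0.1}          # beliefs about the world
utility = {                                    # payoffs per (action, state)
    ("approve", "repays"):   100,
    ("approve", "defaults"): -500,
    ("deny",    "repays"):     0,
    ("deny",    "defaults"):   0,
}

def expected_utility(action):
    """EU(a) = sum over states of P(state) * U(a, state)."""
    return sum(p[s] * utility[(action, s)] for s in p)

def choose(actions):
    """The consequentialist rule: pick the action with highest EU."""
    return max(actions, key=expected_utility)
```

Note what the corollary above then asks of such a system: judging its fairness means comparing not just the chosen outputs but what it *would* have output under counterfactual inputs.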
  17. T ? Transparency is often considered a virtue in decision making, but identifying it is tricky → Not quite visibility: seeing the mechanism may not help me much → Not quite explanation: you can try to explain EU anti-trust law to me but I’ll fall asleep → Not quite justification: the rule you show me may not be the cause, e.g. ‘parallel construction’ → Not quite manipulability: seeing and doing give different information. We might like these other properties more for themselves. Some of these are more useful for fairness than others → Did my foreignness actually figure in your decision? → Would things have gone differently had I not been foreign? → Should my foreignness have mattered to your decision?
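The second question above (‘Would things have gone differently had I not been foreign?’) can be probed mechanically when the rule is fully visible: rerun the decision on an input with the attribute flipped. The rule and the applicant below are hypothetical.

```python
# Probing a visible decision rule with a counterfactual input.
# The scoring rule and thresholds are invented for illustration.

def decide(applicant):
    """A toy, fully transparent decision rule."""
    score = applicant["income"] / 10_000
    if applicant["foreign"]:       # the suspect term in the rule
        score -= 2
    return "approve" if score >= 3 else "deny"

def counterfactual_flip(applicant, attribute):
    """Did `attribute` make a difference, holding everything else fixed?"""
    actual = decide(applicant)
    twin = {**applicant, attribute: not applicant[attribute]}
    counterfactual = decide(twin)
    return actual, counterfactual, actual != counterfactual
```

A caution: flipping one field holds everything else fixed, so this answers only the narrow interventional question. A genuine counterfactual would also propagate the change through variables that causally depend on the attribute, which requires a causal model — and the third question (should it have mattered?) is normative and cannot be answered by rerunning code at all.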
  18. A Algorithms are neither “neutral” nor “objective” even though we tend to think that they are. They replicate the assumptions and beliefs of those who decide to deploy them and program them. Humans, therefore, are, or should be, responsible for both good and bad algorithmic choices, not “algorithms” or ADM systems. The machine may be scary, but the ghost within it is always human. And humans are complicated, even more so than algorithms. Algorithm Watch (2020). What do we make of this with respect to functional roles, rhetorical frames, and normative assumptions?
  19. E : C A. Black boxes → What are the technical objections to ‘black boxes’? What are the societal objections? B. Algorithmic status quo → Can a ‘trustworthy AI put people first’? Does this make sense as a goal? → What is the role of oversight? And where do ‘activists’ fit in? (if they do) C. Lack of auditing, enforcement, skills, and explanation → Is it “important to know what it is that is being ‘optimized’ in terms of public services”? → In what sense are transparency and explainability relevant? D. ‘Techno-solutionism’ → What is ‘techno-solutionism’? → What is (the relevance of) “Arthur C. Clarke’s Third Law”?
  20. References → Algorithm Watch. (2020, October). Automating Society Report 2020. Algorithm Watch and Bertelsmann Stiftung. → Fall, I. (2020). ‘I sexually identify as an attack helicopter’ [magazine]. Clarkesworld Magazine. → Leckie, A. (2013). ‘Ancillary Justice’. Orbit. → Marx, K. (1973). ‘Grundrisse’ (M. Nicolaus, Trans.). Penguin. (Original work published 1857-58) → McElreath, R. (2020). ‘Statistical Rethinking: A Bayesian Course with Examples in R and Stan’. CRC Press, Taylor & Francis Group.