Concepts on AI Fairness

Yoriyuki Yamagata

March 05, 2020

Transcript

1. Group discrimination vs. individual discrimination
   • Group discrimination
     • e.g. men are hired more/less often than women
   • Individual discrimination
     • e.g. a man/woman with the same skill is hired while a woman/man is not
   • The two notions can contradict each other
2. Motivating example
   • Gender discrimination in hiring
     • Men may be more/less likely to be hired than women
   • Extreme measure: hire exactly 50%/50% men and women
     • This extreme measure may cause individual discrimination
     • Women with the same skill may then be more/less likely to be hired than men
3. Direct discrimination vs. indirect discrimination
   • Direct discrimination
     • e.g. hire applicants based on their gender
   • Indirect discrimination
     • e.g. hire based on features that correlate with gender
4. Motivating example
   • Hire people with longer experience of full-time employment
   • This can be legitimate
     • Full-time employment may indicate higher skill
   • But it can adversely affect women
     • Women tend to hold more part-time jobs
     • Family responsibilities, childbirth, etc.
5. General setting
   • S: sensitive features (gender, nationality, religion, ...)
   • X: other features
   • Y: output of the algorithm
     • Can be deterministic or probabilistic
     • Assume Y is binary
6. Direct / indirect discrimination
   • Fairness against direct discrimination
     • Pr[ Y = 1 | X, S ] = Pr[ Y = 1 | X ]
   • Fairness against indirect discrimination
     • Pr[ Y = 1 | S ] = Pr[ Y = 1 ]
     • With X0 a set of explainable features: Pr[ Y = 1 | X0, S ] = Pr[ Y = 1 | X0 ]
   • (A toy check of these conditions is sketched below)
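Both conditions can be estimated from data as conditional relative frequencies. Below is a minimal sketch of such a check, not taken from the talk; the DataFrame and its columns gender (S), skill (X) and hired (Y) are hypothetical toy data.

```python
# Minimal sketch (toy data, not from the talk): estimating the two fairness
# conditions as conditional relative frequencies with pandas.
# Columns "gender" (S), "skill" (X) and "hired" (Y) are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "gender": ["m", "m", "f", "f", "m", "f"],
    "skill":  [1, 2, 1, 2, 2, 1],
    "hired":  [0, 1, 0, 1, 1, 0],
})

# Fairness against indirect discrimination: Pr[Y=1 | S] should equal Pr[Y=1].
print("Pr[Y=1]     =", df["hired"].mean())
print("Pr[Y=1 | S] =\n", df.groupby("gender")["hired"].mean())

# Fairness against direct discrimination: Pr[Y=1 | X, S] should equal
# Pr[Y=1 | X], i.e. within each skill level the rate must not depend on gender.
print("Pr[Y=1 | X, S] =\n", df.groupby(["skill", "gender"])["hired"].mean())
```

In this toy data the hiring rate is identical within each skill level (no direct discrimination), yet the overall rates for men and women differ (indirect discrimination), which shows why the two conditions are stated separately.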
7. Motivating examples
   • Fairness against direct discrimination
     • Hire regardless of gender if applicants have the same skill
   • Fairness against indirect discrimination
     • If 25% of applicants are women, 25% of those hired are also women (regardless of the relative skill levels of men and women)
8. Problem of group fairness
   • Hire men by their skill, but women by the length of their hair
   • Or hire men randomly, but women by their skill
   • The hiring probabilities can still be the same
   • More generally, group fairness is based only on statistical properties
     • It addresses "disparate impact"
     • It does not address "disparate treatment"
9. Idea
   • "Similar" people (for a given task) must be treated similarly
     • e.g. a person with the same skill level is treated in the same way, regardless of gender
10. Formal setting
   • V: set of individuals (= X, S)
   • Y: outcome (Y = 0 or 1)
   • d: similarity measure on V (a real number)
     • d(x, y) >= 0, d(x, y) = d(y, x), d(x, x) = 0
   • M: assignment of outcomes to individuals (M : V -> distributions on Y)
     • may use a lottery (e.g. when choosing a jury)
   • D: measure of the difference between two distributions on Y
11. (D, d)-Lipschitz property
   • D(Mx, My) <= d(x, y) for all pairs of individuals x, y
   • Example choices of D:
     • D_tv(P, Q) = (1/2) Σ_{y=0,1} |P(y) − Q(y)|
     • D_∞(P, Q) = max_{y=0,1} log( max{ P(y)/Q(y), Q(y)/P(y) } )
   • (A toy check of this property is sketched below)
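As a concrete illustration, here is a minimal sketch (my own toy example, not code from the talk) that checks the (D, d)-Lipschitz property for a small set of individuals; the individuals, the similarity measure d and the assignment M below are all hypothetical.

```python
# Minimal sketch (my own toy example, not from the talk) of the
# (D, d)-Lipschitz check: D(Mx, My) <= d(x, y) for every pair x, y.
import math
from itertools import combinations

# M maps each individual to a distribution over Y in {0, 1},
# stored as (Pr[Y=0], Pr[Y=1]).
M = {"alice": (0.3, 0.7), "bob": (0.4, 0.6), "carol": (0.9, 0.1)}

# Toy similarity measure d (0 = identical, larger = less similar).
d = {
    frozenset({"alice", "bob"}):   0.1,
    frozenset({"alice", "carol"}): 0.8,
    frozenset({"bob", "carol"}):   0.7,
}

def D_tv(P, Q):
    """Total variation distance between two distributions on {0, 1}."""
    return 0.5 * sum(abs(p - q) for p, q in zip(P, Q))

def D_inf(P, Q):
    """D_infinity: the largest |log(P(y) / Q(y))| over the outcomes y."""
    return max(abs(math.log(p / q)) for p, q in zip(P, Q))

def is_lipschitz(M, d, D):
    """Check D(Mx, My) <= d(x, y) for all pairs of individuals."""
    return all(D(M[x], M[y]) <= d[frozenset({x, y})]
               for x, y in combinations(M, 2))

print("(D_tv, d)-Lipschitz: ", is_lipschitz(M, d, D_tv))   # True for these numbers
print("(D_inf, d)-Lipschitz:", is_lipschitz(M, d, D_inf))  # False for these numbers
```

With these toy numbers M satisfies the condition under D_tv but violates it under D_∞, so the choice of D matters.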
12. Relation to group fairness
   • μ_S(1): average probability of Y = 1 for an individual belonging to S
   • bias_{D,d}(S, T): the largest μ_S(1) − μ_T(1) over all M with the (D, d)-Lipschitz property
   • d_EM(S, T): Earth mover's distance between S and T
   • Theorem: bias_{D,d}(S, T) <= d_EM(S, T)
   • (A toy illustration of the theorem follows below)
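To get a feel for the theorem, here is a minimal sketch under my own simplifying assumptions (not from the talk): individuals are described by a single real-valued feature, d(x, y) = |x − y|, and the hiring rule Pr[Y = 1 | x] is 1-Lipschitz in x, so the resulting M has the (D_tv, d)-Lipschitz property; SciPy's wasserstein_distance then gives the one-dimensional Earth mover's distance under this metric.

```python
# Minimal sketch (my own simplification, not from the talk) of the theorem
# bias_{D,d}(S, T) <= d_EM(S, T). Assumptions: individuals have one real-valued
# feature, d(x, y) = |x - y|, and Pr[Y=1 | x] is 1-Lipschitz in x, so the
# resulting M satisfies the (D_tv, d)-Lipschitz property.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
S = rng.normal(0.6, 0.1, 1000)   # feature values of individuals in group S
T = rng.normal(0.4, 0.1, 1000)   # feature values of individuals in group T

def p_hire(x):
    # Pr[Y = 1 | x]; clipping keeps it a probability and is 1-Lipschitz in x.
    return np.clip(x, 0.0, 1.0)

mu_S = p_hire(S).mean()               # average probability of Y = 1 in S
mu_T = p_hire(T).mean()               # average probability of Y = 1 in T
d_EM = wasserstein_distance(S, T)     # Earth mover's distance under d(x, y) = |x - y|

print(f"mu_S(1) - mu_T(1) = {mu_S - mu_T:.3f}")
print(f"d_EM(S, T)        = {d_EM:.3f}")   # theorem: the first never exceeds the second
```

Running this shows μ_S(1) − μ_T(1) staying below d_EM(S, T): the gap between groups in average hiring probability cannot exceed how far apart the groups are under d.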
13. Reflective equilibrium
   • Methodology of modern ethics
   • Start from intuition
   • Find a governing principle
     • Explanatory power
     • Internal consistency
     • Consistency with other principles
   • Change intuition
   • [Diagram: feedback loop between Principle and Intuition]
14. When does a distinction become discrimination?
   • Competing views (* marks popular views)
     • If it causes disadvantage to one of the groups*
     • If it shows/expresses exclusion of one of the groups*
     • If it is not rationally justified
     • If it is based on an aspect of one's nature that cannot be changed
15. References
   • Kamishima & Komiyama, "Fairness in Machine Learning and Data Mining" (in Japanese), Journal of the Japanese Society for Artificial Intelligence, Vol. 34, No. 2, 2019
   • Dwork et al., "Fairness Through Awareness", ITCS 2012
   • Black et al., "FlipTest: Fairness Testing via Optimal Transport", ACM FAT 2020
   • Altman, "Discrimination", The Stanford Encyclopedia of Philosophy (Winter 2016 Edition)