Slide 71
The Issue of Biases
• Ideally, we would like $d_k^{+,b'} - d_i^{+,b} = 0 \;\; \forall i, k$ and $d_t^{-,b'} - d_j^{-,b} = 0 \;\; \forall t, j$
• However, this condition is very strict, as it would enforce a uniform distance among all positive (resp. negative) samples.
• We propose a more relaxed condition where we force the distributions of distances, $\{d_k^{+,b'}\}$ and $\{d_i^{+,b}\}$, to be similar (same for the negatives).
• Assuming that the distance distributions follow a normal distribution, $B^{+,b} \sim \mathcal{N}(\mu_{+,b}, \sigma^2_{+,b})$ and $B^{+,b'} \sim \mathcal{N}(\mu_{+,b'}, \sigma^2_{+,b'})$, we minimize the Kullback-Leibler divergence of the two distributions with the FairKL regularization term:
$$\mathcal{R}_{\mathrm{FairKL}} = D_{KL}\!\left(B^{+,b} \,\|\, B^{+,b'}\right) = \frac{1}{2}\left[\frac{\sigma^2_{+,b} + (\mu_{+,b} - \mu_{+,b'})^2}{\sigma^2_{+,b'}} - \log\frac{\sigma^2_{+,b}}{\sigma^2_{+,b'}} - 1\right]$$
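As an illustration, a minimal PyTorch sketch of this regularization term (shown for the positive distances; the negative case is analogous) could look as follows. The function name fairkl_term, the eps stabilizer, and fitting the Gaussians with the batch mean and variance are assumptions of this sketch, not the authors' reference implementation:

```python
import torch

def fairkl_term(d_b: torch.Tensor, d_bprime: torch.Tensor,
                eps: float = 1e-8) -> torch.Tensor:
    """KL divergence between Gaussian fits of two distance distributions."""
    # Gaussian parameters fitted to each batch of anchor-sample distances
    mu_b, var_b = d_b.mean(), d_b.var()
    mu_bp, var_bp = d_bprime.mean(), d_bprime.var()
    # Closed-form KL( N(mu_b, var_b) || N(mu_bp, var_bp) ) for univariate Gaussians
    return 0.5 * ((var_b + (mu_b - mu_bp) ** 2) / (var_bp + eps)
                  - torch.log((var_b + eps) / (var_bp + eps))
                  - 1.0)

# Hypothetical usage: distances from an anchor to positives of each bias class
d_pos_b = torch.rand(32)    # positives carrying bias attribute b
d_pos_bp = torch.rand(48)   # positives carrying bias attribute b'
reg = fairkl_term(d_pos_b, d_pos_bp)  # added, weighted, to the contrastive loss
```

Because both distributions are summarized by just a mean and a variance, the term only matches the first two moments of the distance distributions, which is exactly the relaxation described above.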