Slide 26
Shameless plug: other works
Interpolating between OT and KL: regularized OT using Rényi divergences

Rényi divergence ∉ {f-divergences, Bregman divergences}, α ∈ (0, 1):
\[
R_\alpha(\mu \mid \nu) := \frac{1}{\alpha - 1} \ln \int_X \left(\frac{\mathrm{d}\mu}{\mathrm{d}\tau}\right)^{\!\alpha} \left(\frac{\mathrm{d}\nu}{\mathrm{d}\tau}\right)^{\!1-\alpha} \mathrm{d}\tau,
\qquad
\operatorname{OT}_{\varepsilon,\alpha}(\mu, \nu) := \min_{\pi \in \Pi(\mu, \nu)} \langle c, \pi \rangle + \varepsilon R_\alpha(\pi \mid \mu \otimes \nu)
\]
is a metric, where ε > 0, µ, ν ∈ P(X), X compact.
\[
\operatorname{OT}(\mu, \nu) \xleftarrow{\ \alpha \searrow 0 \text{ or } \varepsilon \to 0\ } \operatorname{OT}_{\varepsilon,\alpha}(\mu, \nu) \xrightarrow{\ \alpha \nearrow 1\ } \operatorname{OT}^{\mathrm{KL}}_{\varepsilon}(\mu, \nu).
\]
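As a quick illustration (not from the slides), here is a minimal numerical sketch of the Rényi divergence for discrete measures, taking τ as the counting measure; choosing α close to 1 approximately recovers KL, matching the limit above. All names are illustrative.

```python
import numpy as np

def renyi_divergence(mu, nu, alpha):
    """R_alpha(mu | nu) for discrete measures (tau = counting measure),
    alpha in (0, 1): ln(sum mu_i^alpha nu_i^(1-alpha)) / (alpha - 1)."""
    return np.log(np.sum(mu**alpha * nu**(1.0 - alpha))) / (alpha - 1.0)

mu = np.array([0.2, 0.5, 0.3])
nu = np.array([0.4, 0.4, 0.2])
kl = np.sum(mu * np.log(mu / nu))            # KL(mu | nu)
print(renyi_divergence(mu, nu, 0.999), kl)   # nearly equal, as alpha -> 1
```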
In the works: debiased Rényi-Sinkhorn divergence
\[
\operatorname{OT}_{\varepsilon,\alpha}(\mu, \nu) - \tfrac{1}{2}\operatorname{OT}_{\varepsilon,\alpha}(\mu, \mu) - \tfrac{1}{2}\operatorname{OT}_{\varepsilon,\alpha}(\nu, \nu).
\]
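A hedged sketch of how the debiasing combines three solver calls; only the combination formula comes from the slide, and `ot_eps_alpha` is a hypothetical placeholder for an OT_{ε,α} solver, not a real library routine.

```python
def renyi_sinkhorn_divergence(a, x, b, y, eps, alpha, ot_eps_alpha):
    """Debiased Renyi-Sinkhorn divergence assembled from a generic solver.

    ot_eps_alpha(a, x, b, y, eps, alpha) should return OT_{eps,alpha}
    between the discrete measures (weights a on points x) and (weights b
    on points y); it is a hypothetical placeholder, not a library call."""
    return (ot_eps_alpha(a, x, b, y, eps, alpha)
            - 0.5 * ot_eps_alpha(a, x, a, x, eps, alpha)
            - 0.5 * ot_eps_alpha(b, y, b, y, eps, alpha))
```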
\(W_2\) gradient flows of \(d_K(\cdot, \nu)^2\) with \(K(x, y) := -|x - y|\) in 1D.
Reformulation as a maximal monotone inclusion Cauchy problem in \(L^2(0, 1)\) via quantile functions.
Comprehensive description of the solutions' behavior: instantaneous measure-to-\(L^\infty\) regularization; the implicit Euler scheme is simple.
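A hedged particle sketch, assuming the standard equal-weight particle discretization of this flow rather than the quantile-space implicit scheme from the slide: one explicit-Euler step, with particles repelling each other and being attracted to samples of the target ν.

```python
import numpy as np

def mmd_flow_step(x, y, tau):
    """One explicit-Euler step for equal-weight particles x flowing along
    the W2 gradient flow of d_K(., nu)^2 with K(x, y) = -|x - y| in 1D;
    y holds samples of the target nu. sign(0) = 0 handles the diagonal."""
    repulsion = np.sign(x[:, None] - x[None, :]).mean(axis=1)   # x-x interaction
    attraction = np.sign(x[:, None] - y[None, :]).mean(axis=1)  # pull toward y
    return x + 2.0 * tau * (repulsion - attraction)

rng = np.random.default_rng(0)
x = rng.normal(-4.0, 1.0, size=200)   # initial particles (cf. "initial")
y = rng.normal(2.0, 1.0, size=300)    # target samples (cf. "target")
for _ in range(500):
    x = mmd_flow_step(x, y, tau=1e-2)
```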
[Figure: plot of the initial measure µ0]
[Figure: "Iteration 0" comparison of the explicit and implicit schemes; legend: initial, target, explicit, implicit]