Slide 12
Linear SVM - Primal Problem
Given a linearly separable training set
D = {(x_1, y_1), (x_2, y_2), ..., (x_l, y_l)} ⊂ R^n × {+1, −1},
we can compute the maximum margin decision surface ⟨w*, x⟩ = b* by solving the convex program
(P)    min_{w,b}   φ(w, b) = ½ ⟨w, w⟩
       subject to  ⟨w, y_i x_i⟩ ≥ 1 + y_i b,
       where (x_i, y_i) ∈ D ⊂ R^n × {−1, +1}.        (1)
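Note that, since y_i ∈ {−1, +1} and the inner product is linear in its second argument, the constraint in (1) can equivalently be written in the more familiar margin form:

⟨w, y_i x_i⟩ = y_i ⟨w, x_i⟩ ≥ 1 + y_i b   ⟺   y_i (⟨w, x_i⟩ − b) ≥ 1.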
1. The objective function does not depend on b.
2. The displacement term b appears only in the constraints.
3. The number of constraints equals the number of training points.
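Program (P) is a small convex quadratic program, so it can be handed directly to a generic convex/QP solver. Below is a minimal sketch using the cvxpy modelling library on a toy 2-D data set; the data, the variable names, and the choice of cvxpy are illustrative assumptions, not part of the slide.

import numpy as np
import cvxpy as cp

# Toy linearly separable 2-D training set (hypothetical data, for illustration only).
X = np.array([[ 2.0,  2.0], [ 3.0,  3.0], [ 2.5,  3.5],    # class +1
              [-1.0, -1.0], [-2.0, -0.5], [-1.5, -2.0]])   # class -1
y = np.array([1, 1, 1, -1, -1, -1])

w = cp.Variable(X.shape[1])   # normal vector of the decision surface
b = cp.Variable()             # displacement term

# Objective of (P): minimize (1/2)<w, w>.
objective = cp.Minimize(0.5 * cp.sum_squares(w))

# One constraint per training point: <w, y_i x_i> >= 1 + y_i b.
constraints = [y[i] * (X[i] @ w) >= 1 + y[i] * b for i in range(len(y))]

cp.Problem(objective, constraints).solve()

print("w* =", w.value)
print("b* =", b.value)
# The decision surface is <w*, x> = b*; a new point x is classified as sign(<w*, x> - b*).

The list comprehension builds one constraint per training point, matching remark 3 above.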