
SVM classifiers

vhqviet
February 26, 2018


Transcript

  1. Linear Classifiers • Input: given training data (x_i, y_i) for i = 1 … N, with x_i ∈ R^d and y_i ∈ {−1, 1} • Task: learn a classifier f(x) such that f(x_i) ≥ 0 for y_i = +1 and f(x_i) < 0 for y_i = −1
  2. Linear Classifiers • A linear classifier has the form f(x) = w^T x + b • w is the normal to the separating line (the hyperplane), and b is the bias • w is known as the weight vector
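A minimal sketch of that decision rule (the weights, bias, and test points below are illustrative, not from the slides):

```python
# Linear classifier sketch: f(x) = w . x + b, predict sign(f(x)).
# w and b are hand-picked here for illustration only.

def linear_classify(w, b, x):
    """Return +1 or -1 according to the sign of w . x + b."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1

w = [1.0, -1.0]   # normal to the separating line
b = 0.0           # bias term

print(linear_classify(w, b, [2.0, 1.0]))   # prints 1  (positive side)
print(linear_classify(w, b, [1.0, 3.0]))   # prints -1 (negative side)
```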
  3. Linear Support Vector Machine (LSVM) • SVM can be formulated as an optimization: maximize the margin between the two classes • Or equivalently: minimize ‖w‖² subject to y_i (w^T x_i + b) ≥ 1 for every training point
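The slide's equations did not survive the transcript; the optimization it refers to is conventionally written as the hard-margin problem:

```latex
\min_{w,\,b} \;\; \tfrac{1}{2}\lVert w \rVert^2
\quad \text{subject to} \quad
y_i\,(w^{\mathsf T} x_i + b) \ge 1, \qquad i = 1,\dots,N
```

Minimizing ‖w‖² is equivalent to maximizing the margin 2/‖w‖ between the two supporting hyperplanes.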
  4. Linear Support Vector Machine (LSVM) • Change to the Lagrange (dual) formulation: maximize over the multipliers α_i • The maximization depends only on dot products of pairs of training vectors
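The dual problem the slide refers to is, in standard notation (reconstructed, since the equation image is missing from the transcript):

```latex
\max_{\alpha} \;\; \sum_{i=1}^{N} \alpha_i
  \;-\; \tfrac{1}{2} \sum_{i=1}^{N} \sum_{j=1}^{N}
        \alpha_i \alpha_j \, y_i y_j \, (x_i \cdot x_j)
\quad \text{subject to} \quad
\alpha_i \ge 0, \qquad \sum_{i=1}^{N} \alpha_i y_i = 0
```

The training data enter only through the dot products x_i · x_j, which is what makes the kernel substitution on the next slide possible.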
  5. Support Vector Machine (SVM) • Not linearly separable data: transform the points to a higher-dimensional space, called the feature space • A kernel function gives the dot products in that space directly, so the maximization still depends only on dot products
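A kernel computes the feature-space dot product without ever building the mapping explicitly. A common choice is the RBF (Gaussian) kernel; a small sketch (the gamma value is illustrative):

```python
import math

def rbf_kernel(x, z, gamma=1.0):
    """RBF kernel: exp(-gamma * ||x - z||^2).
    Acts as a dot product in an implicit, very high-dimensional
    feature space, so it can replace x . z in the dual problem."""
    sq_dist = sum((xi - zi) ** 2 for xi, zi in zip(x, z))
    return math.exp(-gamma * sq_dist)

print(rbf_kernel([1.0, 2.0], [1.0, 2.0]))  # prints 1.0 (identical points)
print(rbf_kernel([0.0, 0.0], [1.0, 1.0]))  # strictly between 0 and 1
```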
  6. Support Vector Machine (SVM) • Multiple Classes: there are two kinds of comparison for handling more than two classes >> .SVC(kernel='linear', decision_function_shape='ovr') • OVO (One vs One): Pros: less sensitive to imbalanced classes. Cons: more classifications to run. • OVR (One vs Rest): Pros: fewer classifications. Cons: the classes in each binary problem may be imbalanced.
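The `.SVC(...)` call shown is scikit-learn's `sklearn.svm.SVC`, whose `decision_function_shape` parameter selects `'ovr'` or `'ovo'`. The trade-off between the two schemes shows up directly in how many binary classifiers each trains; a small counting sketch (function names are mine):

```python
def n_classifiers_ovo(k):
    """One-vs-One trains one binary classifier per pair of classes."""
    return k * (k - 1) // 2

def n_classifiers_ovr(k):
    """One-vs-Rest trains one binary classifier per class."""
    return k

for k in (3, 10):
    print(k, n_classifiers_ovo(k), n_classifiers_ovr(k))
# For k = 10 classes: OVO trains 45 classifiers, OVR only 10 --
# but each OVR problem pits one class against all the rest,
# so its two sides can be badly imbalanced.
```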
  7. Support Vector Machine (SVM) • Pros: good at dealing with high-dimensional data; works well on small data sets; different kernel functions can be used for various decision functions, or two kernels can be combined for a better result • Cons: picking the right kernel and parameters can be computationally intensive; SVMs do not provide probability estimates
  8. References • Support Vector Machines - Patrick Winston, MIT OpenCourseWare • Understanding Support Vector Machine algorithm from examples - Sunil Ray