SmartSifter: On-line Unsupervised Outlier Detection Using Finite Mixtures with Discounting Learning Algorithms, proposed by Yamanishi, K., Takeuchi, J., Williams, G. et al. (2004).
Refs: http://cs.fit.edu/~pkc/id/related/yamanishi-kdd00.pdf
SmartSifter splits each input into a discrete vector x and a continuous vector y and models them hierarchically. The discrete part p(x) is a histogram density learned with SDLE (Sequentially Discounting Laplace Estimation). For each cell identified by SDLE there is a separate continuous model p(y | x): a Gaussian mixture learned with SDEM (Sequentially Discounting Expectation and Maximizing), or a prototype-based kernel mixture learned with SPDU (Sequentially Discounting Prototype Updating). When an input (x_t, y_t) arrives, only the model attached to its cell is updated, and the input is scored with either the log loss or the Hellinger score. The log loss is S_L(x_t, y_t) = -log p^(t-1)(x_t, y_t), i.e. the negative log-likelihood of the new point under the model as it stood before the update, so points that the recently learned model considers unlikely receive high scores.
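The discounting idea behind SDLE can be sketched in a few lines: per-cell counts decay by a factor (1 - r) at every step, so recent inputs outweigh old ones, and cell probabilities are read off with Laplace smoothing. This is a minimal illustrative sketch, not the library's API; the type and function names here are made up for the example.

```go
package main

import "fmt"

// sdle is a minimal sketch of Sequentially Discounting Laplace Estimation:
// a histogram density over k cells whose counts decay by (1-r) each step.
type sdle struct {
	r      float64   // discounting parameter
	beta   float64   // Laplace smoothing hyperparameter
	counts []float64 // discounted per-cell counts
	total  float64   // discounted total count
}

func newSDLE(k int, r, beta float64) *sdle {
	return &sdle{r: r, beta: beta, counts: make([]float64, k)}
}

// update discounts all counts, then credits the observed cell.
func (s *sdle) update(cell int) {
	for i := range s.counts {
		s.counts[i] *= 1 - s.r
	}
	s.total = s.total*(1-s.r) + 1
	s.counts[cell]++
}

// prob returns the Laplace-smoothed estimate of p(cell).
func (s *sdle) prob(cell int) float64 {
	k := float64(len(s.counts))
	return (s.counts[cell] + s.beta) / (s.total + s.beta*k)
}

func main() {
	s := newSDLE(3, 0.1, 1.0)
	for i := 0; i < 50; i++ {
		s.update(0) // cell 0 dominates the stream
	}
	fmt.Printf("p(0)=%.3f p(1)=%.3f\n", s.prob(0), s.prob(1))
}
```

Because of the discounting, the effective sample size is bounded by roughly 1/r, which is what lets the histogram track non-stationary streams.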
r := 0.1 // Discounting parameter (assumed example value; the original line was truncated).
alpha := 1.5 // Hyper parameter for continuous variables.
beta := 1.0 // Hyper parameter for categorical variables.
cellNum := 0 // Only continuous variables.
mixtureNum := 2 // Number of mixtures for GMM.
dim := 2 // Number of dimensions for GMM.
ss := smartsifter.NewSmartSifter(r, alpha, beta, cellNum, mixtureNum, dim)
logLoss := ss.Input(nil, []float64{0.1, 0.2}, true)
fmt.Printf("Score using logLoss: %f\n", logLoss)