Regression builds a model that predicts, for every unlabelled instance $x$, a numeric value $y = f(x)$ with high accuracy. Examples: stock-market price prediction, airplane delay prediction.
Mean square error: $\mathrm{MSE} = \frac{1}{N}\sum_i (f(x_i) - y_i)^2$
Root mean square error: $\mathrm{RMSE} = \sqrt{\mathrm{MSE}} = \sqrt{\frac{1}{N}\sum_i (f(x_i) - y_i)^2}$
Mean absolute error: $\mathrm{MAE} = \frac{1}{N}\sum_i |f(x_i) - y_i|$
Relative absolute error: $\mathrm{RAE} = \sum_i |f(x_i) - y_i| \,\big/\, \sum_i |\bar{y} - y_i|$
Forgetting mechanism for estimating these measures: a sliding window of size $w$ with the most recent observations.
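The sliding-window forgetting mechanism can be sketched as a small accumulator class (the name `WindowedErrors` and the default window size are illustrative, not from a specific library):

```python
from collections import deque

class WindowedErrors:
    """Estimate MSE, RMSE, and MAE over a sliding window of the w most
    recent (prediction, target) pairs -- a simple forgetting mechanism."""

    def __init__(self, w=100):
        self.window = deque(maxlen=w)  # oldest pairs fall out automatically

    def update(self, y_pred, y_true):
        self.window.append((y_pred, y_true))

    def mse(self):
        return sum((p - y) ** 2 for p, y in self.window) / len(self.window)

    def rmse(self):
        return self.mse() ** 0.5

    def mae(self):
        return sum(abs(p - y) for p, y in self.window) / len(self.window)
```

Because the deque has a fixed `maxlen`, each `update` is O(1) and only the `w` most recent observations influence the estimates.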
Cost function: $J(w) = \frac{1}{2}\sum_i (y_i - h_w(x_i))^2$
Stochastic gradient descent: $w = w - \eta \nabla J_{x_i}$
Gradient of the error function: $\nabla J = -\sum_i (y_i - h_w(x_i))\, x_i$
Weight update rule: $w = w + \eta \sum_i (y_i - h_w(x_i))\, x_i$
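The weight update rule can be sketched as an online learner that applies $w \leftarrow w + \eta\,(y_i - h_w(x_i))\,x_i$ once per arriving instance (the function name `sgd_linear` and the explicit bias term are illustrative choices):

```python
def sgd_linear(stream, n_features, eta=0.01):
    """Online linear regression by stochastic gradient descent.

    For each (x, y) in the stream, apply the update
        w <- w + eta * (y - h_w(x)) * x
    plus the corresponding bias update.
    """
    w = [0.0] * n_features
    b = 0.0
    for x, y in stream:
        pred = sum(wi * xi for wi, xi in zip(w, x)) + b  # h_w(x)
        err = y - pred                                   # residual y_i - h_w(x_i)
        w = [wi + eta * err * xi for wi, xi in zip(w, x)]
        b += eta * err
    return w, b
```

Each instance is touched once, so the model fits the one-pass constraint of stream learning.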
Regression with HT (Hoeffding Trees):
1. Splitting criterion
2. Numeric attribute handling using BINTREE
3. Linear model at the leaves
4. Concept drift handling: Page-Hinckley
5. Alternate tree adaptation strategy
Information Gain = Entropy(before split) − Entropy(after split)
Entropy $= -\sum_{i=1}^{c} p_i \log p_i$
Gini Index $= \sum_{i=1}^{c} p_i (1 - p_i) = 1 - \sum_{i=1}^{c} p_i^2$
Regression: Gain = SD(before split) − SD(after split)
Standard deviation: $\mathrm{SD} = \sqrt{\frac{1}{N}\sum_i (\bar{y} - y_i)^2}$
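The regression gain can be sketched as a standard-deviation-reduction computation; weighting each child's SD by its share of the instances is a common instantiation of "SD(after split)" and is an assumption here, as are the function names:

```python
def sd(values):
    """Standard deviation: sqrt(sum((ybar - y_i)^2) / N)."""
    mean = sum(values) / len(values)
    return (sum((mean - y) ** 2 for y in values) / len(values)) ** 0.5

def sd_reduction(parent, children):
    """Regression gain: SD before the split minus the size-weighted
    average SD of the child partitions (assumed weighting)."""
    n = len(parent)
    sd_after = sum(len(c) / n * sd(c) for c in children)
    return sd(parent) - sd_after
```

A split that separates low targets from high targets yields a large reduction, so the tree prefers it, mirroring how information gain works for classification.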
BINTREE (Gama et al., 2003): the closest implementation of a batch method. Incrementally update a binary tree as data is observed. Issues: high memory cost, high cost of split search, sensitivity to data order.
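The incremental binary tree can be sketched as follows; this is a simplified illustration that stores only value counts, not the full per-node split statistics of the original method:

```python
class Node:
    """Binary-tree node for one observed attribute value."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.n = 1  # instances observed with exactly this value

def insert(root, key):
    """Grow the tree incrementally as each value arrives; memory grows
    with the number of distinct values (the high-memory drawback)."""
    if root is None:
        return Node(key)
    if key == root.key:
        root.n += 1
    elif key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def candidate_splits(root, acc=None):
    """In-order traversal: every node is a candidate split threshold,
    which is why the split search is costly."""
    if acc is None:
        acc = []
    if root is not None:
        candidate_splits(root.left, acc)
        acc.append(root.key)
        candidate_splits(root.right, acc)
    return acc
```

The data-order sensitivity is also visible here: sorted input degenerates the tree into a linked list.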
The Page-Hinckley test: $g_0 = 0$, $g_t = g_{t-1} + (\epsilon_t - \upsilon)$, $G_t = \min(g_t)$; if $g_t - G_t > h$ then alarm and reset $g_t = 0$. Equivalent one-sided form: $g_t = \max(0,\, g_{t-1} + \epsilon_t - \upsilon)$; if $g_t > h$ then alarm and reset $g_t = 0$.
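The test above can be sketched as a small detector class (the class name, parameter defaults, and reset-on-alarm behaviour are illustrative choices, not from a specific library):

```python
class PageHinckley:
    """Page-Hinckley test for detecting an increase in the mean error.

    Maintains g_t = g_{t-1} + (e_t - upsilon) and G_t = min over past g,
    and raises an alarm when g_t - G_t exceeds the threshold h.
    """

    def __init__(self, upsilon=0.005, h=50.0):
        self.upsilon = upsilon  # tolerated magnitude of change
        self.h = h              # detection threshold
        self.g = 0.0
        self.g_min = 0.0

    def update(self, error):
        """Feed one error observation; return True if drift is detected."""
        self.g += error - self.upsilon
        self.g_min = min(self.g_min, self.g)
        if self.g - self.g_min > self.h:
            self.g = 0.0        # reset after the alarm
            self.g_min = 0.0
            return True
        return False
```

While the error stays small, $g_t$ drifts downward and tracks its own minimum, so no alarm fires; a sustained rise in the error pushes $g_t - G_t$ past $h$.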