Slide 1

Neri Van Otten [email protected]

Slide 2

What is Reservoir Computing?
- Temporal Recurrent Neural Networks (Dominey 1995)
- Liquid State Machines (Natschläger, Maass and Markram 2002)
- Echo State Networks (Jaeger 2001)
- Backpropagation-Decorrelation learning (Steil 2004)

Slide 3

Types of Neural Networks

Slide 4

Recurrent Neural Networks (RNN)
- RNNs are an attractive solution to many engineering problems:
  - they learn by example
  - they can model highly nonlinear systems
- The training phase, however, is slow and unstable
- Reservoir Computing avoids this problem by not training the recurrent network at all

Slide 5

A Reservoir is like Water

Slide 6

Reservoir Computing (RC)

Slide 7

RC Explained
- A large random RNN is used as an excitable medium
- Driven by input signals, each unit in the RNN creates its own nonlinear transform of the input
- Output signals are read out from the excited RNN
  - Typically a simple linear combination of the reservoir signals
- The outputs are trained in a supervised way
  - Typically by linear regression of the teacher output on the tapped reservoir signals (see the sketch below)
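To make the mechanism concrete, below is a minimal NumPy sketch of an echo state network in the spirit of these bullets: a fixed random RNN is driven by the input, and only a linear readout is fitted by ridge regression on the collected reservoir states. The toy task, reservoir size, spectral radius (0.9), washout length and ridge factor are illustrative assumptions, not values from the presentation.

import numpy as np

rng = np.random.default_rng(42)

# Build a large random reservoir; its weights are never trained.
n_inputs, n_reservoir = 1, 200
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))   # fixed input weights
W = rng.normal(0.0, 1.0, (n_reservoir, n_reservoir))     # fixed recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))          # scale spectral radius to 0.9 (assumed)

def run_reservoir(u):
    # Drive the reservoir with input sequence u (shape T x n_inputs) and
    # collect each unit's nonlinear transform of the input at every step.
    x = np.zeros(n_reservoir)
    states = np.empty((len(u), n_reservoir))
    for t, u_t in enumerate(u):
        x = np.tanh(W_in @ u_t + W @ x)
        states[t] = x
    return states

# Toy task (assumed for illustration): predict a phase-shifted sine wave.
t = np.linspace(0, 40, 2000)
u = np.sin(t)[:, None]            # input signal
y = np.sin(t + 0.2)[:, None]      # teacher output

states = run_reservoir(u)

# Train the readout by ridge regression of the teacher output on the reservoir signals.
washout, ridge = 100, 1e-6        # discard the initial transient; small regulariser
S, Y = states[washout:], y[washout:]
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_reservoir), S.T @ Y)

# The output is a simple linear combination of the reservoir signals.
y_pred = states @ W_out
print("training MSE:", np.mean((y_pred[washout:] - Y) ** 2))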

Slide 8

Advantages of RC
- The training phase is computationally fast
- Works as a dynamical system, just like other RNNs
- A form of memory is used
- Suited to a wide variety of problems:
  - epileptic seizure detection
  - brain-computer interfaces
  - time series prediction

Slide 9

Feedback Controller
“Feedback control by online learning an inverse model” (Waegeman 2012)

Slide 10

Training

Slide 11

Testing

Slide 12

Feedback Controller

Slide 13

Use RC with Oger
> import Oger
> resnode = Oger.nodes.ReservoirNode(output_dim=100)
> readoutnode = Oger.nodes.RidgeRegressionNode()
> flow = resnode + readoutnode
> flow.train(data)
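The snippet above leaves data unspecified. A rough sketch of how the example is often completed in the Oger/MDP style follows; xtrain, ytrain and xtest are placeholder time-series arrays, and the exact packaging of the training data depends on the task and on the Oger version.

# Assumed continuation of the flow built above (xtrain, ytrain, xtest are placeholders).
# Each node in the flow gets its own training entry: the reservoir weights themselves
# are not trained, while the ridge-regression readout needs (input, target) pairs.
data = [[xtrain], zip([xtrain], [ytrain])]
flow.train(data)
# Apply the trained flow to new input (MDP flows are callable).
ypred = flow(xtest)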