state to another on a state space. The probability distribution of the next state depends only on the current state and not on the sequence of events that preceded it." Andrey Markov (1856-1922)
Simple weather prediction: a sunny day is followed by another sunny day 90% of the time; a rainy day is followed by another rainy day 50% of the time
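A minimal sketch of that weather chain in Python (state and function names are just illustrative): tomorrow is sampled from a distribution that depends only on today's state, not on the whole history.

```python
import random

# Transition probabilities from the slide: sunny -> sunny 90%, rainy -> rainy 50%.
TRANSITIONS = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"rainy": 0.5, "sunny": 0.5},
}

def next_state(current):
    """Pick tomorrow's weather using only today's state (the Markov property)."""
    states, probs = zip(*TRANSITIONS[current].items())
    return random.choices(states, weights=probs)[0]

def simulate(start="sunny", days=10):
    chain = [start]
    for _ in range(days - 1):
        chain.append(next_state(chain[-1]))
    return chain

print(simulate())  # e.g. ['sunny', 'sunny', 'sunny', 'rainy', ...]
```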
form a directed cycle. This creates an internal state in the network, which allows it to exhibit dynamic temporal behavior. RNNs can use their internal memory to process arbitrary sequences of inputs. This makes them applicable to tasks such as unsegmented connected handwriting recognition or speech recognition.
www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-1-introduction-to-rnns
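A minimal sketch of that recurrence in numpy (toy dimensions, random weights, no training): the hidden state h is the internal memory that the directed cycle feeds back into the network at every step, so the same cell can process a sequence of any length.

```python
import numpy as np

input_size, hidden_size = 8, 16
rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden (the cycle)
b_h = np.zeros(hidden_size)

def step(x, h):
    """One time step: the new state depends on the current input and the previous state."""
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

# Process an arbitrary-length sequence, carrying the state forward.
h = np.zeros(hidden_size)
for x in (rng.normal(size=input_size) for _ in range(5)):
    h = step(x, h)
```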
lyrics from Metallica and Taylor Swift, letter by letter
- Extract sequences of random sizes
- Post to Tumblr
unforgiven-swift.tumblr.com
github.com/herval/unforgiven-swift
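The generate-and-post loop could look roughly like this sketch; `generate_text` and `model.sample_next_char` are hypothetical stand-ins for the trained character-level model, not the actual unforgiven-swift code.

```python
import random

def generate_text(model, seed, length):
    """Sample `length` characters, one letter at a time (model interface is assumed)."""
    text = seed
    for _ in range(length):
        text += model.sample_next_char(text)  # hypothetical method on the trained model
    return text

def random_excerpt(text, min_len=140, max_len=500):
    """Cut a random-sized slice out of the generated stream for posting."""
    size = random.randint(min_len, max_len)
    start = random.randint(0, max(0, len(text) - size))
    return text[start:start + size]

# excerpt = random_excerpt(generate_text(model, seed="the ", length=2000))
# ...then post `excerpt` to unforgiven-swift.tumblr.com via the Tumblr API.
```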
(use signals such as Likes to reinforce the network)
- Evolve the topologies & learning parameters automatically (NEAT)
- ???
en.wikipedia.org/wiki/Neuroevolution_of_augmenting_topologies
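One possible reading of "use signals such as Likes to reinforce the network", sketched below: re-sample past posts into the next fine-tuning batch in proportion to their Like counts, so well-received output is seen more often. `posts` is a hypothetical list of (text, like_count) pairs pulled from the blog, not part of the project's actual code.

```python
import random

def weighted_training_batch(posts, batch_size=32):
    """Favor texts that got more Likes when building the next training batch."""
    texts = [text for text, _ in posts]
    weights = [likes + 1 for _, likes in posts]  # +1 so zero-Like posts still appear
    return random.choices(texts, weights=weights, k=batch_size)
```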