Creating Creative Machines (v0)

In which I present a quick rundown on some attempts to beat The Herval Test.

Herval Freire

April 28, 2016

Transcript

1. The Turing Test: "Can a machine answer your questions so convincingly that you couldn’t tell it apart from a human?" Developed by Alan Turing in 1950.
2. The Herval Test: "Can a machine create original content so convincingly terrible that you couldn’t tell it apart from a really bad human writer?"
3. Markov chains: "a random process that undergoes transitions from one state to another on a state space. The probability distribution of the next state depends only on the current state and not on the sequence of events that preceded it." Named after Andrey Markov (1856-1922). Simple weather prediction example: a sunny day is followed by another sunny day 90% of the time; a rainy day is followed by another rainy day 50% of the time (see the first sketch after this list).
4. Markovian Literature: load text from Project Gutenberg books into a Markov chain, extract sequences of 1-3 phrases, and tweet them (see the text-generation sketch after this list). twitter.com/markovian_lit github.com/herval/markovian_literature
5. Recurrent Neural Networks: a neural network where connections between units form a directed cycle. This creates an internal state, which allows the network to exhibit dynamic temporal behavior. RNNs can use their internal memory to process arbitrary sequences of inputs, which makes them applicable to tasks such as unsegmented connected handwriting recognition or speech recognition (see the RNN recurrence sketch after this list). www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-1-introduction-to-rnns
6. LSTM (Long Short-Term Memory): an RNN with memory, good for classifying and predicting time series. karpathy.github.io/2015/05/21/rnn-effectiveness
7. Unforgiven Swift: train a 200-neuron LSTM on all lyrics from Metallica and Taylor Swift, letter by letter; extract sequences of random sizes; post to Tumblr (see the character-level LSTM sketch after this list). unforgiven-swift.tumblr.com github.com/herval/unforgiven-swift
8. Haikuzao: train an LSTM with two layers of 100 neurons each on a database of 5k haikus; extract haikus; tweet them (see the stacked-LSTM sketch after this list). twitter.com/haikuzao github.com/herval/haikuzao
9. Next steps: use a bigger dataset; reinforcement learning (use signals such as Likes to reinforce the network); evolve the topologies & learning parameters automatically (NEAT); ??? en.wikipedia.org/wiki/Neuroevolution_of_augmenting_topologies
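
Below are a few illustrative sketches referenced in the slides above; they are not the actual project code. First, the weather example from the Markov chains slide: the transition probabilities are the ones quoted on the slide, everything else (state names, walk length) is made up for illustration.

```python
# Minimal Markov chain: the next day's weather depends only on today's.
# Probabilities from the slide: sunny -> sunny 90%, rainy -> rainy 50%.
import random

TRANSITIONS = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"rainy": 0.5, "sunny": 0.5},
}

def next_state(current):
    """Pick the next state using only the current state's distribution."""
    states, weights = zip(*TRANSITIONS[current].items())
    return random.choices(states, weights=weights)[0]

def simulate(start="sunny", days=7):
    """Walk the chain for a number of steps."""
    sequence = [start]
    for _ in range(days - 1):
        sequence.append(next_state(sequence[-1]))
    return sequence

print(simulate())  # e.g. ['sunny', 'sunny', 'rainy', 'rainy', 'sunny', ...]
```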
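
A sketch of the Markovian Literature idea: build a word-level, order-1 chain from a plain-text book and random-walk it to produce short passages. The real github.com/herval/markovian_literature implementation may work differently (it extracts 1-3 phrases and tweets them); the file name here is a placeholder and the tweeting step is omitted.

```python
# Word-level Markov chain text generation (illustrative only).
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    chain = defaultdict(list)
    words = text.split()
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, max_words=30):
    """Random-walk the chain to produce a (usually terrible) passage."""
    word = random.choice(list(chain.keys()))
    output = [word]
    for _ in range(max_words - 1):
        followers = chain.get(word)
        if not followers:
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

corpus = open("gutenberg_book.txt").read()  # placeholder: any plain-text book dump
print(generate(build_chain(corpus)))
```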
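
The "directed cycle" and "internal state" from the RNN slide come down to a single recurrence: the new hidden state is a function of the current input and the previous hidden state. A minimal numpy sketch with untrained placeholder weights:

```python
# One-step vanilla RNN recurrence: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b).
# Weights are random placeholders, not a trained network.
import numpy as np

input_size, hidden_size = 8, 16
W_xh = np.random.randn(hidden_size, input_size) * 0.01   # input -> hidden
W_hh = np.random.randn(hidden_size, hidden_size) * 0.01  # hidden -> hidden (the cycle)
b_h = np.zeros(hidden_size)

def rnn_step(x, h_prev):
    """The new state depends on the current input and the previous state."""
    return np.tanh(W_xh @ x + W_hh @ h_prev + b_h)

h = np.zeros(hidden_size)
for x in np.random.randn(5, input_size):  # an arbitrary-length input sequence
    h = rnn_step(x, h)                    # internal memory carried step to step
```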
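
A sketch of the Unforgiven Swift setup using Keras: one 200-unit LSTM trained character by character to predict the next letter of the combined lyrics. Only the layer size comes from the slide; the file name, window length, and training hyperparameters are assumptions, the actual github.com/herval/unforgiven-swift repo may use different tooling, and the sampling/Tumblr steps are left out.

```python
# Character-level LSTM next-letter prediction (illustrative only).
import numpy as np
from tensorflow.keras.layers import LSTM, Dense
from tensorflow.keras.models import Sequential

text = open("lyrics.txt").read()               # placeholder: Metallica + Taylor Swift, concatenated
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

seq_len = 40                                   # assumed context window
X = np.zeros((len(text) - seq_len, seq_len, len(chars)), dtype=np.float32)
y = np.zeros((len(text) - seq_len, len(chars)), dtype=np.float32)
for i in range(len(text) - seq_len):
    for t, c in enumerate(text[i:i + seq_len]):
        X[i, t, char_to_idx[c]] = 1.0          # one-hot encode the input window
    y[i, char_to_idx[text[i + seq_len]]] = 1.0 # target: the next character

model = Sequential([
    LSTM(200, input_shape=(seq_len, len(chars))),  # the 200-neuron LSTM from the slide
    Dense(len(chars), activation="softmax"),       # distribution over the next character
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=128, epochs=20)
```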
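
Haikuzao differs mainly in shape: two stacked LSTM layers of 100 units each, per the slide, trained the same way on the ~5k-haiku corpus. A sketch of just the model definition, with placeholder dimensions; the actual github.com/herval/haikuzao code may differ.

```python
# Two stacked LSTM layers of 100 units (illustrative model definition only).
from tensorflow.keras.layers import LSTM, Dense
from tensorflow.keras.models import Sequential

seq_len, vocab_size = 40, 60   # placeholder dimensions

model = Sequential([
    LSTM(100, return_sequences=True, input_shape=(seq_len, vocab_size)),  # feeds the full sequence down
    LSTM(100),
    Dense(vocab_size, activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
```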