Lorena Mesa
November 03, 2018

¡Escuincla babosa!: Creating a telenovela script with a neural network

Telenovelas, Spanish-language soap operas, are beloved for their over-the-top drama (el drama es real) and complicated twists and turns. In this talk, we'll look at some of the most popular telenovelas of recent years to devise a common arc for a telenovela. Using this formula, we'll determine whether a recurrent neural network is able to successfully produce a telenovela script. If so, what would a telenovela script look like as imagined by a neural network? Hold on tight, as this is bound to be a bumpy ride full of amor, pasión, and teenage angst.


Transcript

  1. ¡Escuincla babosa!: Creating a telenovela script with a neural network

    Lorena Mesa @loooorenanicole SLIDES @ http://bit.ly/2P8F5gs North Bay Python
  2. ¡HOLA! Soy Lorena Mesa.

    I am here because I am really interested in deep learning and wanted to start diving deeper into the topic. Also, I love telenovelas. Win win? You can find me at @loooorenanicole on most platforms (yes, that’s FOUR letter Os).
  3. “The social phenomenon of televised melodramas, called telenovelas in Spanish … are serial melodramas … [based] on the construction of a relationship of fidelity with the audience” - Jorge Gonzalez, 2003, Understanding Telenovelas as a Cultural Front
  4. 2 billion people watch telenovelas, approximately ⅓ of the global population (Telegraph, 2006)
  5. “Things have to be cleaned up so the audience has satisfaction. They won’t worry about Maria — did she find true love, her true mother or her true father” - Dr. Diana Rios, Associate Professor of Communications, University of Connecticut
  6. The arc of a telenovela

    Unlike English soap operas, telenovelas are defined by:
    - A fixed melodramatic plot (e.g. love lost, mothers and daughters fighting, long-lost relatives, love found)
    - A finite beginning and end
    - A conclusion that ties up loose ends, generally with a happy element (e.g. a big wedding)
  7. “A field of study that gives computers the ability to learn without being explicitly programmed” - Arthur Samuel, pioneer in Machine Learning (1959)
  8. “A computer program is said to learn from experience (E) with respect to some task (T) and some performance measure (P), if its performance on T, as measured by P, improves with experience E.” - Machine Learning, Tom Mitchell (1997)
  9. Deep Learning: Classification

    A programmable flashlight has an ML model to classify whether a verbal phrase contains the audible cue “dark”, and turns on when it does. A deep learning implementation can learn:
    - any phrase with the word “dark”
    - other relevant cues like “I can’t see” or “light switch won’t work”
    A deep learning model learns through its own computation (activation) method; its “brain”.
  10. Neural Network vs Deep Learning?

    “The difference between a neural network and deep learning is that deep learning is the act of using a subset of neural networks called deep neural networks” - What is a Neural Network and How are Businesses Using Them, Erika Morphy (May 2018)
  11. A Neuron: What is it?

    Biological neurons can learn from large volumes of data and are the basis of the ML neuron. Neuron model:
    - inputs: features (x1 … xn), each represented as a number
    - weights: the weight of a feature (w1j … wnj) indicates the strength of that feature
    - output: the result of the weighted sum of the inputs (netj) passed through the activation function
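A minimal sketch of that neuron model in plain numpy (the input values, weights, and the choice of a sigmoid activation below are illustrative, not values from the talk):

```python
import numpy as np

def sigmoid(net):
    # Activation function: squashes the weighted sum into (0, 1)
    return 1.0 / (1.0 + np.exp(-net))

def neuron(x, w, b=0.0):
    # net_j: the weighted sum of the input features x1..xn
    net = np.dot(w, x) + b
    # output: the activation function applied to the weighted sum
    return sigmoid(net)

x = np.array([0.5, 0.1, 0.9])   # inputs: features x1..x3
w = np.array([0.4, -0.6, 0.2])  # weights: strength of each feature
output = neuron(x, w)           # ≈ 0.58
```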
  12. Activation Function & How a NN Learns

    The activation function models the “firing rate” of a biological neuron, converting the weighted sum into a new number based on a formula. Training the neuron means iteratively updating the weights with the inputs so that it can progressively approximate the underlying relationship in the given dataset. Once trained, it can be used to do things, e.g. sort/classify samples of images into cats and dogs.
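That iterative weight update can be sketched as gradient descent on a single sigmoid neuron; here it learns logical AND from a four-row toy dataset (the dataset, learning rate, epoch count, and cross-entropy gradient are illustrative choices, not from the talk):

```python
import numpy as np

def sigmoid(net):
    return 1.0 / (1.0 + np.exp(-net))

# Toy dataset: the neuron should learn logical AND
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)

rng = np.random.default_rng(0)
w = rng.normal(size=2)  # start from random weights
b = 0.0
lr = 0.5

for _ in range(5000):
    out = sigmoid(X @ w + b)           # forward pass through the activation
    error = out - y                    # gradient of cross-entropy loss w.r.t. net
    w -= lr * (X.T @ error) / len(y)   # iteratively update the weights...
    b -= lr * error.mean()             # ...and the bias

preds = (sigmoid(X @ w + b) > 0.5).astype(int)  # [0, 0, 0, 1]
```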
  13. The many architectural types of NNs

    The differences among the types of neural networks available depend on such things as the number of nodes included, how the neurons are connected, the movement of information, etc. Examples of types:
    - Feedforward - information only moves one way, from input layer to output layer
    - Radial basis function - includes a distance criterion
    - Recurrent neural network - propagates data forwards and backwards, from later processing stages back to earlier processing stages; it’s a directed graph!
  14. Python + Neural Networks

    Python is an ideal language for scientific programming; if you are curious as to why, listen to Jake Vanderplas explain in his 2017 PyCon keynote “The Unexpected Success of Python in Science”. Many libraries are available:
    - PyTorch
    - Keras ** (we’ll use this!)
    - TensorFlow
    - Theano
  16. Data Sources

    - Jane the Virgin scripts from “Forever Dreaming”
    - Queen of the South scripts from “Springfield Springfield”
    - Ugly Betty scripts from “Springfield Springfield”
  17. Steps to creating an RNN

    1. Generate character hot encodings
    2. Transform data to input sequences and rescale to [0, 1]
    3. Fit the model by training with epochs
    Hot encoding allows the model to predict the probability of each character in the vocabulary. Jupyter Notebook here!
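Those three steps might look like this minimal sketch in plain numpy (the toy corpus and `seq_length` are illustrative; the talk's Jupyter notebook has the authoritative version):

```python
import numpy as np

# Toy corpus standing in for the telenovela scripts
raw_text = "el drama es real. el drama es real."

# 1. Generate character encodings: map each unique char to an int
chars = sorted(set(raw_text))
char_to_int = {c: i for i, c in enumerate(chars)}
n_vocab = len(chars)

# 2. Transform data to fixed-length input sequences and rescale to [0, 1]
seq_length = 5
dataX, dataY = [], []
for i in range(len(raw_text) - seq_length):
    dataX.append([char_to_int[c] for c in raw_text[i:i + seq_length]])
    dataY.append(char_to_int[raw_text[i + seq_length]])
X = np.reshape(dataX, (len(dataX), seq_length, 1)) / float(n_vocab)

# 3. Hot-encode the targets so the model can predict a probability
#    for each character in the vocabulary
y = np.eye(n_vocab)[dataY]
```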
  18. Building LSTM with Keras

    from keras.models import Sequential
    from keras.layers import Dense, Dropout, LSTM
    from keras.callbacks import ModelCheckpoint
    from keras.utils import np_utils

    model = Sequential()
    model.add(LSTM(256, input_shape=(X.shape[1], X.shape[2])))
    model.add(Dropout(0.2))
    model.add(Dense(y.shape[1], activation='softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='adam')
    # checkpoint weights during training (callbacks_list was defined off-slide)
    callbacks_list = [ModelCheckpoint('weights-{epoch:02d}.hdf5')]
    model.fit(X, y, epochs=20, batch_size=128, callbacks=callbacks_list)
  19. Generating Text with Keras LSTM

    model.load_weights(filename)
    model.compile(loss='categorical_crossentropy', optimizer='adam')
    for i in range(1000):
        x = numpy.reshape(pattern, (1, len(pattern), 1))
        x = x / float(n_vocab)
        prediction = model.predict(x, verbose=0)
        # Map the predicted int to a char and output the letter
  20. Generating Text with Keras LSTM

    # pick a random seed
    pattern = dataX[numpy.random.randint(0, len(dataX)-1)]
    # generate characters
    for i in range(1000):
        x = numpy.reshape(pattern, (1, len(pattern), 1))
        x = x / float(n_vocab)
        prediction = model.predict(x, verbose=0)
        index = numpy.argmax(prediction)
        result = int_to_char[index]
        seq_in = [int_to_char[value] for value in pattern]
        sys.stdout.write(result)
        pattern.append(index)
        pattern = pattern[1:len(pattern)]
  21. “do not threaten me. do you understand me? do you understand me? whoa, whoa, all right? hey, look, i”
  22. Generating Text with Keras LSTM

    - Make an LSTM model per character you want in your script
    - Find similar enough “sources of inspiration” per character
    - Weave all these models together?
    - Likewise, this approach can be used to create LSTM models for telenovelas of each type of melodrama (e.g. lost love)
  23. Script generation is hard. Generating a “learned” plot may be possible, but without coherent narratives and emotionally compelling characters we miss the mark.
  24. How can we do better? Telenovelas may have a format, but text generation is complex.
  25. Varying Data Quality

    01x01 - Chapter One (Pilot): Latin lover narrator: Our story begins 13 and a half years ago, when Jane Gloriana Villanueva was a mere ten years old. It should be noted that at a mere ten years old, Jane's passions include... in no particular order... her family, God, and grilled cheese sandwiches. This is Jane's grandmother, Alba Gloriana Villanueva. Her passions include God and Jane, in that particular order. Woman: Really, Mom? Shh. But this is so lame.
    Queen of the South (2016) s01e01 Episode Script: (helicopter blades whirring) (upbeat music) TERESA: My name is Teresa Mendoza. I am from México. I was born poor, not that that's bad. But take it from me, I've been poor. And I've been rich. Rich is better. Believe me.
  26. More Data! Want to contribute to my telenovela sets? Please do! Add at http://bit.ly/telenovlas-for-all
  27. Resources

    Blogs + Articles:
    - Develop your first NN in Python with Keras Step-by-Step
    - A NN in 11 lines of Python
    - How to build a 3 layer neural network from scratch
    - What is the best alternative to Keras?
    Videos + MOOCs:
    - 10.4 NN: Multilayer Perceptron Part 1: The Nature of Code
    - Exploring Deep Learning Framework in PyTorch - Stephanie Kim, PyCon USA 2018
    - Mathematics for Machine Learning: Linear Algebra
    - Fast.ai
  28. ¡Gracias! Any questions? Find me after this session and let’s talk! Otherwise I’m @loooorenanicole on most social media platforms.
  29. Credits

    Special thanks to all the people who made and released these awesome resources for free:
    ◆ Presentation template by SlidesCarnival
    ◆ Photographs by Unsplash