
Dive into Deep Learning


Date: March 4, 2016
Venue: Trondheim, Norway. Doctoral Seminar at NTNU

Please cite, link to, or credit this presentation when using it or any part of it in your work.

#MachineLearning #ML #DeepLearning #DL #TensorFlow #TF

Darío Garigliotti



Transcript

  1. Deep dive = Deep (Learning: a shallow) dive
     • Deep learning is a very hot topic
     • A recently very successful ML paradigm
     • Key: Data + GPUs
  2. TensorFlow
     • Data: tensors
     • Graph representation of computations
     • Nodes: operators
     • State held in Variables
     • Execution in Sessions (see the sketch below)
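
     A minimal sketch of this model (an addition, not part of the original slide), using only plain TensorFlow calls: ops are added as nodes to the default graph in a construction phase, and a Session evaluates them in an execution phase.

     import tensorflow as tf

     # Construction phase: these calls only add nodes to the default graph.
     a = tf.constant(3.0)
     b = tf.constant(4.0)
     total = tf.add(a, b)  # an operator node; nothing is computed yet

     # Execution phase: the Session evaluates the requested node.
     with tf.Session() as sess:
         print(sess.run(total))  # 7.0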
  3. TensorFlow Advanced features
     • Construction phase
     • Execution phase

     import tensorflow as tf

     # Create a Constant op that produces a 1x2 matrix. The op is
     # added as a node to the default graph.
     #
     # The value returned by the constructor represents the output
     # of the Constant op.
     matrix1 = tf.constant([[3., 3.]])

     # Create another Constant that produces a 2x1 matrix.
     matrix2 = tf.constant([[2.], [2.]])

     # Create a matmul op that takes matrix1 and matrix2 as inputs;
     # 'product' represents the result of the matrix multiplication.
     product = tf.matmul(matrix1, matrix2)

     # Launch the graph in a Session and run the matmul op.
     with tf.Session() as sess:
         result = sess.run([product])
         print(result)  # [array([[ 12.]], dtype=float32)]
  4. TensorFlow Advanced features
     • Working with Variables (a usage sketch follows this slide)

     # Create two variables.
     weights = tf.Variable(tf.random_normal([784, 200], stddev=0.35),
                           name="weights")
     biases = tf.Variable(tf.zeros([200]), name="biases")
     ...

     # Add an op to initialize the variables.
     init_op = tf.initialize_all_variables()

     # Later, when launching the model:
     with tf.Session() as sess:
         # Run the init operation.
         sess.run(init_op)
         ...
         # Use the model
         ...
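
     A small sketch of how Variables hold state across session runs (an illustrative addition using the same 2016-era API as the slide; the counter and its name are made up for the example):

     import tensorflow as tf

     # A Variable keeps its value between calls to sess.run().
     counter = tf.Variable(0, name="counter")
     increment = tf.assign(counter, counter + 1)

     init_op = tf.initialize_all_variables()

     with tf.Session() as sess:
         sess.run(init_op)  # Variables must be initialized before use.
         for _ in range(3):
             print(sess.run(increment))  # prints 1, then 2, then 3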
  5. TensorFlow Advanced features
     • Graph Visualization
     • Using GPUs
     • Sharing variables (a sketch follows this slide)
     https://www.tensorflow.org/

     # Pin the construction of these ops to the second GPU of the machine.
     with tf.Session() as sess:
         with tf.device("/gpu:1"):
             matrix1 = tf.constant([[3., 3.]])
             matrix2 = tf.constant([[2.], [2.]])
             product = tf.matmul(matrix1, matrix2)
         result = sess.run(product)
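
     For the "Sharing variables" bullet, a hedged sketch with tf.variable_scope and tf.get_variable (the mechanism TensorFlow documents for this; the layer shapes and names below are illustrative only):

     import tensorflow as tf

     def linear_layer(x):
         # tf.get_variable creates the variable on the first call and, once
         # reuse is enabled for the scope, returns the same variable again.
         w = tf.get_variable("weights", [784, 200],
                             initializer=tf.random_normal_initializer(stddev=0.35))
         b = tf.get_variable("biases", [200],
                             initializer=tf.constant_initializer(0.0))
         return tf.matmul(x, w) + b

     with tf.variable_scope("model") as scope:
         out1 = linear_layer(tf.ones([5, 784]))  # dummy input, for illustration
         scope.reuse_variables()                 # the second call shares the weights
         out2 = linear_layer(tf.ones([5, 784]))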