
Tensorflow Hub - DevFest Nairobi 2018

Barsolai
September 22, 2018

TensorFlow Hub is a library introduced by Google focused on the discovery and consumption of reusable parts of ML models. This presentation gives a gentle introduction to the library and the reusable modules one can leverage today.


Transcript

  1. TensorFlow Hub: Frontier for Reusable ML Modules. Chris Barsolai, Organizer,

    Nairobi AI @chrisbarso GDG DevFest Nairobi 2018
  2. [Image slide]

  3. • Make solar energy affordable • Provide energy from fusion

    • Develop carbon sequestration methods • Manage the nitrogen cycle • Provide access to clean water • Restore & improve urban infrastructure • Advance health informatics • Engineer better medicines • Reverse-engineer the brain • Prevent nuclear terror • Secure cyberspace • Enhance virtual reality • Advance personalized learning • Engineer the tools for scientific discovery www.engineeringchallenges.org/challenges.aspx U.S. NAE Grand Engineering Challenges for 21st Century
  4. [Photo slide] Using TensorFlow

    to keep farmers happy and cows healthy Source: https://www.blog.google/technology/ai/using-tensorflow-keep-farmers-happy-and-cows-healthy/
  5. Current: Solution = ML expertise + data + computation Can

    we turn this into: Solution = data + 100X computation ???
  6. Neural Architecture Search to find a model Controller: proposes ML

    models Train & evaluate models 20K times Iterate to find the most accurate model
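The controller loop described on this slide can be illustrated with a toy random search standing in for the learned RL controller; the model parameters and the proxy scoring function below are invented for illustration, not the actual NAS setup:

```python
import random

def propose_model(rng):
    # Controller stand-in: randomly propose an architecture
    # (a real NAS controller is a learned RNN policy).
    return {"layers": rng.randint(1, 10), "width": rng.choice([32, 64, 128])}

def train_and_evaluate(model):
    # Stand-in for "train & evaluate": a made-up proxy score,
    # so the sketch runs without any real training.
    return 1.0 / (1.0 + abs(model["layers"] - 5) + abs(model["width"] - 64) / 64)

def search(num_trials=200, seed=0):
    rng = random.Random(seed)
    best_model, best_score = None, float("-inf")
    for _ in range(num_trials):  # the slide's "train & evaluate models 20K times"
        model = propose_model(rng)
        score = train_and_evaluate(model)
        if score > best_score:
            best_model, best_score = model, score
    return best_model, best_score

best, score = search()
print(best, score)
```

The real search replaces random proposals with a controller that is itself trained on the evaluation results, which is what makes 20K expensive trials pay off.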
  7. [Chart: Inception-ResNet-v2 — accuracy (precision @1) vs. computational cost.

    AutoML outperforms handcrafted models] Learning Transferable Architectures for Scalable Image Recognition, Zoph et al. 2017, https://arxiv.org/abs/1707.07012
  8. Inception-ResNet-v2: years of effort by top ML researchers in the

    world. [Chart: accuracy (precision @1) vs. computational cost. AutoML outperforms handcrafted models] Learning Transferable Architectures for Scalable Image Recognition, Zoph et al. 2017, https://arxiv.org/abs/1707.07012
  9. [Chart: accuracy (precision @1) vs. computational cost. AutoML outperforms

    handcrafted models] Learning Transferable Architectures for Scalable Image Recognition, Zoph et al. 2017, https://arxiv.org/abs/1707.07012
  10. ...you have to worry about so much more. Configuration Data

    Collection Data Verification Feature Extraction Process Management Tools Analysis Tools Machine Resource Management Serving Infrastructure Monitoring Source: Sculley et al.: Hidden Technical Debt in Machine Learning Systems ML Code
  11. Story time, ML software: 2005, no source control; 2010, source

    control & C.I. … but not for ML; 2018, great tools and practices … some still to be defined
  12. A repository of pre-trained model components, packaged for one-line reuse.

    The easiest way to share and use best approaches for your task. tensorflow.org/hub
  13. # Download and use NASNet feature vector module. module =

    hub.Module( "https://tfhub.dev/google/imagenet/nasnet_large/feature_vector/1") features = module(my_images) logits = tf.layers.dense(features, NUM_CLASSES) probabilities = tf.nn.softmax(logits) 62,000+ GPU hours!
  14. # Download and fine-tune NASNet feature vector module. module =

    hub.Module( "https://tfhub.dev/google/imagenet/nasnet_large/feature_vector/1", trainable=True, tags={"train"}) features = module(my_images) logits = tf.layers.dense(features, NUM_CLASSES) probabilities = tf.nn.softmax(logits)
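The two snippets above follow one pattern: the hub module produces feature vectors, and only a small classification head is new. A dependency-free sketch of that head (the `tf.layers.dense` + `tf.nn.softmax` steps), with toy 4-dim features standing in for the module's output (a real NASNet feature vector is much wider):

```python
import math

# Pretend the hub module already mapped two images to 4-dim feature vectors.
features = [[0.5, 1.0, -0.3, 2.0],
            [1.5, -0.2, 0.8, 0.1]]
NUM_CLASSES = 3
# A tiny fixed weight matrix standing in for the trainable dense layer.
weights = [[0.1, -0.2, 0.3],
           [0.0, 0.5, -0.1],
           [0.2, 0.1, 0.4],
           [-0.3, 0.2, 0.1]]
bias = [0.0, 0.0, 0.0]

def dense(features, weights, bias):
    # The tf.layers.dense(features, NUM_CLASSES) step: logits = features @ W + b.
    return [[sum(f * w for f, w in zip(row, wcol)) + b
             for wcol, b in zip(zip(*weights), bias)]
            for row in features]

def softmax(row):
    # The tf.nn.softmax step, stabilized by subtracting the row max.
    m = max(row)
    exps = [math.exp(x - m) for x in row]
    total = sum(exps)
    return [e / total for e in exps]

logits = dense(features, weights, bias)
probabilities = [softmax(row) for row in logits]
print(probabilities)  # one probability distribution per image
```

With `trainable=True` and the `"train"` tag, the module's own weights also receive gradients, instead of only the dense head.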
  15. • NASNet-A mobile, large • PNASNet-5 large • MobileNet V2

    (all sizes) • Inception V1, V2, V3 • Inception-ResNet V2 • ResNet V1 & V2 50/101/152 • MobileNet V1 (all sizes) Available image modules
  16. Available text modules • Neural network language model (en, jp,

    de, es) • Word2vec trained on Wikipedia • ELMo (Embeddings from Language Models)
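For a rough intuition of what such text modules compute, a bag-of-words module embeds a sentence by combining its word vectors. The toy 3-dim table below is invented for illustration; real modules ship vectors learned on large corpora such as Wikipedia:

```python
# Toy "word2vec" table; real modules use vectors learned on Wikipedia.
word_vectors = {
    "clean": [0.9, 0.1, 0.0],
    "water": [0.8, 0.2, 0.1],
    "solar": [0.1, 0.9, 0.2],
    "energy": [0.2, 0.8, 0.1],
}

def embed_sentence(sentence):
    # Roughly what a bag-of-words text module does: average the word vectors.
    # (Real modules also normalize and handle out-of-vocabulary tokens.)
    vectors = [word_vectors[w] for w in sentence.lower().split() if w in word_vectors]
    if not vectors:
        return [0.0] * 3
    return [sum(dim) / len(vectors) for dim in zip(*vectors)]

print(embed_sentence("clean water"))  # averages the two word vectors
```

Contextual models like ELMo go further: the vector for a word depends on the whole sentence, not a fixed lookup table.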
  17. Universal Sentence Encoder embed = hub.Module( "https://tfhub.dev/google/universal-sentence-encoder/1") messages = [

    "The food was great, and USIU rocks", "I didn't understand a thing today", ... "Where is jollof rice?" ] embeddings = embed(messages) print(session.run(embeddings)) Check out the semantic similarity colab at https://alpha.tfhub.dev/google/universal-sentence-encoder/2
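The semantic-similarity colab scores sentence pairs by the inner product of their normalized embeddings. A minimal sketch of that scoring step, with toy low-dimensional vectors standing in for the encoder's real 512-dim output:

```python
import math

def cosine_similarity(u, v):
    # Normalized inner product, as used to compare sentence embeddings.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy embeddings standing in for Universal Sentence Encoder output.
embeddings = {
    "The food was great": [0.9, 0.1, 0.2, 0.0],
    "The meal was excellent": [0.8, 0.2, 0.3, 0.1],
    "Where is jollof rice?": [0.1, 0.9, 0.0, 0.4],
}

a, b, c = embeddings.values()
print(cosine_similarity(a, b) > cosine_similarity(a, c))  # → True
```

Semantically close sentences land near each other in embedding space, which is what makes the encoder useful for similarity, clustering, and retrieval tasks.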
  18. # Use pre-trained universal sentence encoder to build text vector

    column. review = hub.text_embedding_column( "review", "https://tfhub.dev/google/universal-sentence-encoder/1") features = { "review": np.array(["an arugula masterpiece", "inedible shoe leather", ...]) } labels = np.array([[1], [0], ...]) input_fn = tf.estimator.inputs.numpy_input_fn(features, labels, shuffle=True) estimator = tf.estimator.DNNClassifier(hidden_units, [review]) estimator.train(input_fn, max_steps=100)
  19. # Use pre-trained universal sentence encoder to build text vector

    column. review = hub.text_embedding_column( "review", "https://tfhub.dev/google/universal-sentence-encoder/1", trainable=True) features = { "review": np.array(["an arugula masterpiece", "inedible shoe leather", ...]) } labels = np.array([[1], [0], ...]) input_fn = tf.estimator.inputs.numpy_input_fn(features, labels, shuffle=True) estimator = tf.estimator.DNNClassifier(hidden_units, [review]) estimator.train(input_fn, max_steps=100)
  20. # Use pre-trained universal sentence encoder to build text vector

    column. review = hub.text_embedding_column( "review", "https://tfhub.dev/google/universal-sentence-encoder/1") features = { "review": np.array(["an arugula masterpiece", "inedible shoe leather", ...]) } labels = np.array([[1], [0], ...]) input_fn = tf.estimator.inputs.numpy_input_fn(features, labels, shuffle=True) estimator = tf.estimator.DNNClassifier(hidden_units, [review]) estimator.train(input_fn, max_steps=100)
  21. More modules • Progressive GAN • Google Landmarks Deep Local

    Features (DELF) • Audio, video, and more coming...