Slide 1

Slide 1 text

May 7, 2020

Slide 2

Slide 2 text

GDG Seattle

GDG Seattle is an inclusive community where developers, designers, and entrepreneurs of all skill levels, genders, religions, and backgrounds are welcome to learn, practice, and share Google technologies, services, and platforms. Our motto is “be excellent to each other”. If you see or experience anything different, please contact the GDG Seattle organizers.

Slide 3

Slide 3 text

Agenda

● Instructors intro
● Why Study Jams?
● Who should participate & why?
● Study Jam series & key takeaways
● How do we study?
● Next steps

Post questions to the YouTube live stream, or tweet with the hashtags below.

@AndrewFerlitsch | @margaretmz | @GDGSeattle | #TensorFlow 2.x | #Keras | #DeepLearning | #ComputerVision

Slide 4

Slide 4 text

The Instructor(s) - Andrew Ferlitsch

● Google Cloud AI / Developer Relations
○ Conferences
○ Universities
○ Meetups
○ Training / Workshops
○ Enterprise Clients
● Expertise in Computer Vision and Deep Learning
○ Formerly a principal research (imaging) scientist for 20 yrs.
○ 115 issued US patents based on my research
● Writing a Deep Learning book to be published by Manning

Slide 5

Slide 5 text

The Instructor(s) - Margaret

● ML GDE (Google Developer Expert)
● Leading GDG Seattle, and Seattle Data/Analytics/ML
● Working on AI for art & design

Slide 6

Slide 6 text

Why Study Jams? A great way to learn together!

Andy’s upcoming book - Deep Learning Design Patterns

● Materials are catered to job roles instead of academia.
● Practical hands-on workshops & multimodal learning:
○ Slides & workshops available on GitHub
○ Videos on YouTube
○ Study Jam sessions with Q&As

Slide 7

Slide 7 text

Who is the audience?

Data scientists, ML engineers, software engineers

Prerequisites:
● Basic knowledge of Python
● Basic knowledge of ML

Slide 8

Slide 8 text

Why should you attend?

● This Deep Learning Study Jam series builds the foundation for developing DL skills as a profession.
● Accelerated learning - our teaching style is widely recognized as easy to learn from.
● Understand what’s under the covers without the statistics:
○ the why, the what, and the how

Slide 9

Slide 9 text

Study Jam series

● Session 1 - Overview
● Session 2 - Primer: Deep and Convolutional Neural Networks, Training Foundation, and Data Engineering
● Session 3 - CNN Design Patterns, Wide vs. Deep, Alternative Connectivity Patterns, and Mobile Convolutional Networks
● Session 4 - AutoEncoders, Hyperparameter Tuning, and Transfer Learning
● Session 5 - Training Pipeline and Data Augmentation

Taught with TF 2.x/Keras, with accompanying workshops.

Slide 10

Slide 10 text

Key takeaways

● Session 2 - Primer
○ Know how a DNN/CNN works and be able to code them in TF.Keras (see the sketch after this slide).
○ Know the design principles behind ConvNets and Residual Nets, and code them in TF.Keras.
○ Know how to prepare image data for training and the fundamentals of training CNN models.
● Session 3 - Design Patterns
○ Be able to quickly and accurately code large and mobile SOTA models from research papers.
○ Know the differences and advantages of deep layers, wide layers, and other connectivity patterns in SOTA models.
○ Know how to use TF.Lite for deploying mobile models.
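To make the Session 2 and 3 takeaways concrete, here is a minimal sketch of a small CNN in TF.Keras followed by a TF Lite conversion. The layer sizes, input shape (28x28 grayscale), and class count are illustrative assumptions, not material from the sessions.

import tensorflow as tf

# A small CNN in TF.Keras (illustrative layer sizes; assumes 28x28 grayscale inputs)
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # assumes 10 classes, e.g. MNIST
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Session 3 takeaway: after training, convert the model for mobile deployment with TF Lite
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()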

Slide 11

Slide 11 text

What are the takeaways?

● Session 4 - AutoEncoders, Hyperparameter Tuning, Transfer Learning
○ Understand the purpose of autoencoders and quickly code them for compression, denoising, super-resolution, and unsupervised pre-training.
○ Have good judgment in the manual selection of hyperparameters and in constructing learning rate schedulers.
○ Understand under the covers how automatic tuning works, and know how to use KerasTuner for automatic hyperparameter tuning (see the sketch after this slide).
○ Know when and how to do both fine-tuning transfer learning (similar tasks) and incremental tuning (dissimilar tasks).
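A minimal sketch of automatic hyperparameter tuning with KerasTuner. The search space, model architecture, and trial count here are illustrative assumptions (note the import name has varied between kerastuner and keras_tuner across releases).

import tensorflow as tf
import kerastuner as kt  # newer releases use: import keras_tuner as kt

def build_model(hp):
    # Illustrative search space: hidden-layer width and learning rate
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(hp.Int("units", min_value=32, max_value=512, step=32),
                              activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(hp.Choice("learning_rate", [1e-2, 1e-3, 1e-4])),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"])
    return model

tuner = kt.RandomSearch(build_model, objective="val_accuracy", max_trials=10)
# tuner.search(x_train, y_train, validation_split=0.2, epochs=5)
# best_model = tuner.get_best_models(num_models=1)[0]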

Slide 12

Slide 12 text

What are the takeaways?

● Session 5 - Training Pipeline and Data Augmentation
○ You’re ready to start doing deep learning at your job.
○ You’ll train both pre-built models and models you build yourself for image classification, using full training and transfer learning.
○ You’ll be able to increase the accuracy of your models using data augmentation techniques (see the sketch after this slide).
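A minimal sketch of adding data augmentation to a Keras training pipeline with ImageDataGenerator; the specific transforms and ranges are illustrative assumptions to be tuned per dataset.

from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Illustrative augmentation settings
datagen = ImageDataGenerator(
    rotation_range=15,       # random rotations up to 15 degrees
    width_shift_range=0.1,   # random horizontal shifts up to 10% of width
    height_shift_range=0.1,  # random vertical shifts up to 10% of height
    horizontal_flip=True,    # random left-right flips
)

# Train on augmented batches instead of the raw arrays (x_train/y_train assumed):
# model.fit(datagen.flow(x_train, y_train, batch_size=32), epochs=10)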

Slide 13

Slide 13 text

How do you study?

● Read Andy’s DL Primer (link)
● Watch the videos (bit.ly/2z3kbIA)
● Do the exercises (bit.ly/2zMRVdw)
● Join Slack to help each other out (bit.ly/ml-study-jams)
● Bring your questions to Study Jam sessions for live Q&A

Slide 14

Slide 14 text

What’s next?

Please study prior to session 2 (5/21/2020):

Topics                                                               Video  Workshop
Deep Neural Networks (45 min talk + 15 min workshop)                 Link   Link
Convolutional and Residual Networks (45 min talk + 15 min workshop)  Link   Link
Training Foundation (90 min talk + 30 min workshop)                  Link   Link

Slide 15

Slide 15 text

Thanks for joining us!

Missed this session? Catch up with the YouTube recording here.

Happy studying & see you on 5/21!