About me
• ... for Open-Source Data & AI Technologies (http://codait.org)
• Founder, R-Ladies (http://rladies.org)
• Member, R Foundation (https://www.r-project.org/foundation/)
• Member, Infrastructure Steering Committee (ISC), R Consortium (https://www.r-consortium.org/)
Data + Community + Mentor + Advocate
@gdequeiroz | www.k-roz.com | [email protected]
Environment
• Popular language for data scientists
• More extension packages than any other data science software
• RStudio - an IDE with a lot of functionality
• Awesome community (#rstats + R-Ladies + R Forwards)
Tools
• keras - High-level API for building and training neural networks
• tfestimators - Implementations of model types such as regressors and classifiers
• tensorflow - Low-level interface to the TensorFlow computational graph
• tfdatasets - Provides various facilities for creating scalable input pipelines for TF models
• tfruns - Manage experiments (runs)
• tfdeploy - Deploy TF models from R
• cloudml - Interface to Google Cloud ML
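As a starting point, the keras R package documents a one-time setup that installs the package from CRAN and then provisions the TensorFlow backend; a minimal sketch:

```r
# Install the R interface to Keras, then provision the TensorFlow
# backend into a Python environment (one-time setup)
install.packages("keras")
library(keras)
install_keras()
```

After this, the other packages above (tfruns, tfdeploy, cloudml) can be installed from CRAN in the usual way with install.packages().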
• Allows the same code to run on CPU or on GPU
• User-friendly API
• Capable of running on top of multiple back-ends including TensorFlow, CNTK, or Theano
source: https://keras.rstudio.com/articles/why_use_keras.html
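To illustrate the user-friendly API, here is a minimal sketch of defining and compiling a model in R; the input shape and layer sizes assume flattened MNIST-style data and are illustrative, not taken from the talk:

```r
library(keras)

# A small dense classifier for 784-feature inputs
# (e.g. flattened 28x28 MNIST digits)
model <- keras_model_sequential() %>%
  layer_dense(units = 128, activation = "relu", input_shape = c(784)) %>%
  layer_dense(units = 10, activation = "softmax")

# Configure the learning process: loss, optimizer, and metrics
model %>% compile(
  loss = "categorical_crossentropy",
  optimizer = "adam",
  metrics = "accuracy"
)

summary(model)
```

The same script runs unchanged on CPU or GPU; the backend decides where the computation executes.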
tfruns - Track and Visualize Training Runs
• Track the hyperparameters, metrics, and output of every training run.
• Compare hyperparameters and metrics across runs to find the best performing model.
• Generate reports to visualize individual training runs or comparisons between runs.
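A typical workflow launches a training script as a tracked run, passing hyperparameters as flags; a sketch, where "mnist_mlp.R" is a hypothetical script that declares matching flags() internally:

```r
library(tfruns)

# Launch a training script as a tracked run, overriding two flags
# ("mnist_mlp.R" and the flag names are assumed for illustration)
training_run("mnist_mlp.R",
             flags = list(dense_units1 = 128, epochs = 30))

# List all recorded runs, ordered by evaluation accuracy
ls_runs(order = eval_acc)
```

Each run's flags, metrics, and output land in a runs/ directory, which is what the comparison below is drawn from.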
ls_runs(order = eval_acc)

  run_dir                   eval_acc flag_dense_units1 flag_epochs
1 runs/2018-11-02T02-05-19Z 0.9778   128               30
3 runs/2018-11-02T02-03-11Z 0.9765   128               20
2 runs/2018-11-02T02-03-52Z 0.9747   256               30
4 runs/2018-11-02T02-02-12Z 0.9745   256               20

The best model is model #1, with 128 dense units and 30 epochs.
• The habitual form of saving a Keras model
• The resulting file contains:
  • Weight values
  • Model's configuration
  • Optimizer's configuration
• Resume the training later from the same state

# save the model as hdf5
model %>% save_model_hdf5("my_model.h5")

# now recreate the model from that file
new_model <- load_model_hdf5("my_model.h5")
new_model %>% summary()
• Export the model to be used in R or outside R
• Serve it locally before deploying

# export the model as a TensorFlow SavedModel
tfdeploy::export_savedmodel(model, "savedmodel")

# test the exported model locally
tfdeploy::serve_savedmodel("savedmodel", browse = TRUE)
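Once the model is being served locally, it can be exercised with a plain HTTP POST; a sketch using httr, where the port, endpoint path, input name, and payload shape are all assumptions to adapt to your model (browse = TRUE opens a page that shows the actual endpoints):

```r
library(httr)
library(jsonlite)

# Hypothetical request against a locally served model: the port (8089),
# the "serving_default/predict" path, the "image_input" name, and the
# 784-value zero vector are illustrative placeholders
body <- list(instances = list(list(image_input = rep(0, 784))))

resp <- POST("http://localhost:8089/serving_default/predict",
             body = toJSON(body, auto_unbox = TRUE),
             content_type_json())
content(resp)
```

The value of testing this way is that the identical request works against the deployed model later; only the URL changes.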
• API
• CloudML
• RStudio Connect
• TensorFlow Serving

# deploy the saved model to CloudML
library(cloudml)
cloudml_deploy("savedmodel", name = "keras_mnist", version = "keras_mnist_1")

The same HTTP POST request we used to test the model locally can be used to generate predictions on CloudML.
• Predicts the 1000 different classes of objects in the ImageNet 2012 Large Scale Visual Recognition Challenge.
• The model consists of a deep convolutional net using the Inception-ResNet-v2 architecture that was trained on the ImageNet-2012 data set.
• Input: a 299×299 image
• Output: list of estimated class probabilities
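The keras R package ships this architecture as a ready-made application with pre-trained ImageNet weights; a sketch, where "elephant.jpg" is a hypothetical image file in the working directory:

```r
library(keras)

# Load Inception-ResNet-v2 with pre-trained ImageNet weights
model <- application_inception_resnet_v2(weights = "imagenet")

# Load an image at the expected 299x299 size and preprocess it
# ("elephant.jpg" is a placeholder file name)
img <- image_load("elephant.jpg", target_size = c(299, 299)) %>%
  image_to_array()
img <- array_reshape(img, c(1, dim(img)))
img <- inception_resnet_v2_preprocess_input(img)

# Predict and decode the top 3 ImageNet class probabilities
preds <- model %>% predict(img)
imagenet_decode_predictions(preds, top = 3)
```

The decoded output maps the raw probability vector back to human-readable class labels from the 1000 ImageNet categories.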