Slide 1

Slide 1 text

Build and deploy PyTorch models with Azure Machine Learning
Henk Boelman, Cloud Advocate @ Microsoft
HenkBoelman.com | @hboelman

Slide 2

Slide 2 text

Agenda
- Machine Learning on Azure
- The data science process
- Simpson recognition using Azure Machine Learning

Slide 3

Slide 3 text

Machine Learning: the ability to learn without being explicitly programmed.

Slide 4

Slide 4 text

Programming: Data + Algorithm → Answers

Slide 5

Slide 5 text

Machine Learning: Data + Answers → Algorithm

Slide 6

Slide 6 text

Machine Learning: Data + Answers → Model

Slide 7

Slide 7 text

Machine Learning: Data + Answers → Model

Slide 8

Slide 8 text

Machine Learning: Data + Answers → Model; Data + Model → Predictions

Slide 9

Slide 9 text

Machine Learning on Azure
- Sophisticated pretrained models, to simplify solution development: Cognitive Services (Vision, Speech, Language, …), Azure Search
- Popular frameworks, to build advanced deep learning solutions: TensorFlow, Keras, PyTorch, ONNX
- Productive services, to empower data science and development teams: Azure Databricks, Azure Machine Learning
- Powerful infrastructure, to accelerate deep learning: Machine Learning VMs
- Flexible deployment, to deploy and manage models on intelligent cloud and edge: on-premises, cloud, edge

Slide 10

Slide 10 text

Cognitive Services: pre-trained models in the cloud and on the edge (Vision, Speech, Language, Knowledge)

Slide 11

Slide 11 text

No content

Slide 12

Slide 12 text

Slide 13

Slide 13 text

The data science process
1. Ask a sharp question
2. Collect the data
3. Prepare the data
4. Select the algorithm
5. Train the model
6. Use the answer

Slide 14

Slide 14 text

Ask a sharp question

Slide 15

Slide 15 text

Ask a sharp question
- How much / how many?
- Is it this or that?
- Is it weird?
- Which group?
- Which action?

Slide 16

Slide 16 text

What is the price of a second-hand car?

Slide 17

Slide 17 text

Collect the data

Slide 18

Slide 18 text

Collect the data

Slide 19

Slide 19 text

Collect the data

Slide 20

Slide 20 text

Prepare the data

Slide 21

Slide 21 text

Prepare the data
Every column in your dataset has to be:
- Relevant
- Independent
- Simple
- Clean
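As an illustration of these four points, a minimal sketch using pandas; the file and column names are made up for the example:

import pandas as pd

df = pd.read_csv('cars.csv')               # hypothetical second-hand car dataset
df = df.drop(columns=['listing_id'])       # relevant: drop columns unrelated to the price
df = df.drop(columns=['price_incl_tax'])   # independent: drop columns derived from the label
df['age_years'] = pd.Timestamp.now().year - df['build_year']   # simple: one number instead of a raw date
df = df.drop_duplicates().dropna(subset=['price', 'mileage'])  # clean: remove duplicates and missing values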

Slide 22

Slide 22 text

Prepare the data: Relevant

Slide 23

Slide 23 text

Prepare the data: Independent

Slide 24

Slide 24 text

Prepare the data: Simple

Slide 25

Slide 25 text

Prepare the data: Clean

Slide 26

Slide 26 text

Prepare the data: Clean

Slide 27

Slide 27 text

Select the algorithm

Slide 28

Slide 28 text

Select the algorithm

Slide 29

Slide 29 text

Train the Model

Slide 30

Slide 30 text

No content

Slide 31

Slide 31 text

No content

Slide 32

Slide 32 text

Use the answer

Slide 33

Slide 33 text

No content

Slide 34

Slide 34 text

Azure Machine Learning studio: a fully managed cloud service that enables you to easily build, deploy, and share predictive analytics solutions.

Slide 35

Slide 35 text

Is it Marge or Homer?

Slide 36

Slide 36 text

Simpson Lego dataset https://github.com/hnky/dataset-lego-figures

Slide 37

Slide 37 text

Machine Learning on Azure
- Sophisticated pretrained models, to simplify solution development: Cognitive Services (Vision, Speech, Language, …), Azure Search
- Popular frameworks, to build advanced deep learning solutions: TensorFlow, Keras, PyTorch, ONNX
- Productive services, to empower data science and development teams: Azure Databricks, Azure Machine Learning
- Powerful infrastructure, to accelerate deep learning: Machine Learning VMs
- Flexible deployment, to deploy and manage models on intelligent cloud and edge: on-premises, cloud, edge

Slide 38

Slide 38 text

No content

Slide 39

Slide 39 text

1. Prepare your environment
2. Experiment with your model & data
3. Deploy your model into production

Slide 40

Slide 40 text

Step 1: Prepare your environment

Slide 41

Slide 41 text

Set up your environment: VS Code, Azure Notebooks, Azure Portal
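Whichever environment you choose, the rest of the session uses the Azure ML Python SDK. A minimal setup sketch, assuming SDK v1 installed with "pip install azureml-sdk":

import azureml.core
print(azureml.core.VERSION)  # confirm the SDK is installed and importable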

Slide 42

Slide 42 text

Azure Notebook / Jupyter Notebook

Slide 43

Slide 43 text

Create a workspace

from azureml.core import Workspace

ws = Workspace.create(name='',
                      subscription_id='',
                      resource_group='',
                      location='westeurope')
ws.write_config()

ws = Workspace.from_config()

Slide 44

Slide 44 text

No content

Slide 45

Slide 45 text

Azure Machine Learning Service
- Datasets: registered, known data sets
- Experiments: training runs
- Models: registered, versioned models
- Endpoints:
  - Real-time Endpoints: deployed model endpoints
  - Pipeline Endpoints: training workflows
- Compute: managed compute
- Datastores: connections to data
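As a rough illustration (assuming the Python SDK v1 and an existing workspace config file), most of these objects can be inspected from code:

from azureml.core import Workspace, Experiment, Model

ws = Workspace.from_config()
print(Model.list(ws))             # registered, versioned models
print(list(Experiment.list(ws)))  # experiments / training runs
print(ws.compute_targets)         # managed compute
print(ws.datastores)              # connections to data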

Slide 46

Slide 46 text

Create compute

from azureml.core.compute import AmlCompute, ComputeTarget

cfg = AmlCompute.provisioning_configuration(vm_size='STANDARD_NC6',
                                            min_nodes=1,
                                            max_nodes=6)
cc = ComputeTarget.create(ws, '', cfg)

Slide 47

Slide 47 text

No content

Slide 48

Slide 48 text

Demo: Set up your workspace
- Create a workspace
- Create compute
- Set up storage

Slide 49

Slide 49 text

Step 1: Prepare your environment
- Create a workspace
- Create compute
- Set up storage
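The "set up storage" step is not shown in code on the slides; a minimal sketch, assuming the default datastore and a hypothetical local folder of Simpsons images:

from azureml.core import Workspace

ws = Workspace.from_config()
ds = ws.get_default_datastore()

# Upload the local training images to the workspace's default blob storage.
ds.upload(src_dir='./data/simpsons',       # hypothetical local path
          target_path='simpsons-dataset',  # hypothetical path inside the datastore
          overwrite=True,
          show_progress=True)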

Slide 50

Slide 50 text

Step 2: Experiment with your model & data

Slide 51

Slide 51 text

Create an experiment

from azureml.core import Experiment

exp = Experiment(workspace=ws, name="")

Slide 52

Slide 52 text

Create a training file
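The training file itself is not shown on the slide; a minimal sketch of what train.py could look like, assuming Keras and the '--data-folder' parameter passed in by the estimator on the next slide:

import argparse, os
from azureml.core import Run

parser = argparse.ArgumentParser()
parser.add_argument('--data-folder', type=str, dest='data_folder')
args = parser.parse_args()

run = Run.get_context()              # handle to the current Azure ML run

# ... load images from args.data_folder, build and fit a Keras model ...

run.log('accuracy', 0.0)             # placeholder metric, logged to the run history
os.makedirs('outputs', exist_ok=True)
# model.save('outputs/model.h5')     # files in ./outputs are uploaded with the run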

Slide 53

Slide 53 text

Create an estimator

from azureml.train.dnn import TensorFlow

params = {'--data-folder': ws.get_default_datastore().as_mount()}

estimator = TensorFlow(source_directory=script_folder,
                       script_params=params,
                       compute_target=computeCluster,
                       entry_script='train.py',
                       use_gpu=True,
                       conda_packages=['scikit-learn', 'keras', 'opencv'],
                       framework_version='1.10')

Slide 54

Slide 54 text

Submit the experiment to the cluster

from azureml.widgets import RunDetails

run = exp.submit(estimator)
RunDetails(run).show()
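Besides the widget, two SDK calls that are commonly used at this point (a sketch, not shown on the slide):

run.wait_for_completion(show_output=True)  # block and stream the training log
print(run.get_metrics())                   # metrics logged with run.log(...) in train.py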

Slide 55

Slide 55 text

Demo: Creating and running an experiment

Slide 56

Slide 56 text

Components: Azure Notebook, Compute Target, Experiment, Docker Image, Data store
1. Snapshot folder and send to experiment
2. Create Docker image
3. Deploy Docker image and snapshot to compute
4. Mount datastore to compute
5. Launch the script
6. Stream stdout, logs, metrics
7. Copy over outputs

Slide 57

Slide 57 text

Register the model

model = run.register_model(model_name='SimpsonsAI',
                           model_path='outputs')

Slide 58

Slide 58 text

Demo: Register and test the model
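A sketch of fetching the registered model for a quick local test, assuming the model name used on the previous slide:

from azureml.core.model import Model

model = Model(ws, name='SimpsonsAI')   # latest registered version
local_path = model.download(target_dir='.', exist_ok=True)
print(model.name, model.version, local_path)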

Slide 59

Slide 59 text

Step 2: Experiment with your model & data

Slide 60

Slide 60 text

Step 3: Deploy your model into production

Slide 61

Slide 61 text

AMLS to deploy: the model + score.py + environment file → Docker image

Slide 62

Slide 62 text

Score.py

%%writefile score.py
import os, json, urllib.request
import cv2
import numpy as np
from keras.models import model_from_json
from azureml.core.model import Model

def init():
    global loaded_model
    model_root = Model.get_model_path('MyModel')
    # file names inside the model folder are assumptions
    with open(os.path.join(model_root, 'model.json')) as f:
        loaded_model = model_from_json(f.read())
    loaded_model.load_weights(os.path.join(model_root, 'model.h5'))

def run(raw_data):
    url = json.loads(raw_data)['url']
    # download and decode the image (abbreviated on the original slide)
    image_bytes = urllib.request.urlopen(url).read()
    image_data = cv2.imdecode(np.frombuffer(image_bytes, np.uint8), cv2.IMREAD_COLOR)
    image_data = cv2.resize(image_data, (96, 96))
    predicted_labels = loaded_model.predict(np.expand_dims(image_data, axis=0)).tolist()
    return json.dumps(predicted_labels)

Slide 63

Slide 63 text

Environment file

from azureml.core.runconfig import CondaDependencies

cd = CondaDependencies.create()
cd.add_conda_package('keras==2.2.2')
cd.add_conda_package('opencv')
cd.add_tensorflow_conda_package()
cd.save_to_file(base_directory='./', conda_file_path='myenv.yml')

Slide 64

Slide 64 text

Inference config

from azureml.core.model import InferenceConfig

inference_config = InferenceConfig(runtime="python",
                                   entry_script="score.py",
                                   conda_file="myenv.yml")

Slide 65

Slide 65 text

Deployment using AMLS

Slide 66

Slide 66 text

Deploy to ACI

from azureml.core.webservice import AciWebservice
from azureml.core.model import Model

aciconfig = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=2)

service = Model.deploy(workspace=ws,
                       name='simpsons-aci',
                       models=[model],
                       inference_config=inference_config,
                       deployment_config=aciconfig)

Slide 67

Slide 67 text

Deploy to AKS

from azureml.core.compute import AksCompute
from azureml.core.webservice import AksWebservice
from azureml.core.model import Model

aks_target = AksCompute(ws, "AI-AKS-DEMO")
deployment_config = AksWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)

service = Model.deploy(workspace=ws,
                       name="simpsons-ailive",
                       models=[model],
                       inference_config=inference_config,
                       deployment_config=deployment_config,
                       deployment_target=aks_target)
service.wait_for_deployment(show_output=True)
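Once either deployment succeeds, the endpoint can be called over REST or through the SDK. A sketch, with a placeholder image URL:

import json

print(service.scoring_uri)   # public REST endpoint of the web service
result = service.run(json.dumps({'url': 'https://example.com/lego-homer.jpg'}))
print(result)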

Slide 68

Slide 68 text

Demo: Deploy to ACI

Slide 69

Slide 69 text

No content

Slide 70

Slide 70 text

Pipelines
- Azure ML Service Pipelines
- Azure Pipelines

Slide 71

Slide 71 text

Azure Machine Learning Pipelines
Workflows of steps that can use Data Sources, Datasets and Compute targets
- Unattended runs
- Reusability
- Tracking and versioning

Slide 72

Slide 72 text

Azure Pipelines
Orchestration for Continuous Integration and Continuous Delivery
- Gates, tasks and processes for quality
- Integration with other services
- Trigger on code and non-code events

Slide 73

Slide 73 text

Create a pipeline step: runs a script on a Compute Target in a Docker container, with Input, Output and Parameters.

Slide 74

Slide 74 text

Create a pipeline
Dataset of Simpsons images (Blob Storage Account) → Prepare data → processed dataset → Train the model with a PyTorch estimator → model → Register the model (Model Management)
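A rough sketch of how this pipeline could be wired up with the SDK v1 pipeline classes; the script names, arguments and folder layout are assumptions:

from azureml.core import Workspace
from azureml.pipeline.core import Pipeline, PipelineData
from azureml.pipeline.steps import PythonScriptStep

ws = Workspace.from_config()
compute = ws.compute_targets['']            # compute name left blank, as on the earlier slides
datastore = ws.get_default_datastore()

processed = PipelineData('processed_dataset', datastore=datastore)

prep_step = PythonScriptStep(name='Prepare data',
                             script_name='prep.py',        # hypothetical script
                             arguments=['--output', processed],
                             outputs=[processed],
                             compute_target=compute,
                             source_directory='./scripts') # hypothetical folder

train_step = PythonScriptStep(name='Train and register the model',
                              script_name='train.py',      # hypothetical script
                              arguments=['--input', processed],
                              inputs=[processed],
                              compute_target=compute,
                              source_directory='./scripts')

pipeline = Pipeline(workspace=ws, steps=[prep_step, train_step])
run = pipeline.submit(experiment_name='simpsons-pipeline')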

Slide 75

Slide 75 text

Continuous Integration

Slide 76

Slide 76 text

Demo: Set up an Azure Pipeline for the AMLS Pipeline

Slide 77

Slide 77 text

Complete Pipeline

Slide 78

Slide 78 text

“The future we invent is a choice we make, not something that just happens” Satya Nadella

Slide 79

Slide 79 text

Thank you!
@hboelman | github.com/hnky | henkboelman.com
Read more on: henkboelman.com