Train your own model using Azure Machine Learning

== Workshop AI - Live360 Orlando AI Live ==

Henk Boelman

November 17, 2019

Transcript

  1. Henk Boelman · Cloud Advocate @ Microsoft

     Train your own model using Azure Machine Learning

     HenkBoelman.com · @hboelman
  2. The data science process

     Ask a sharp question → Collect the data → Prepare the data →
     Select the algorithm → Train the model → Use the answer
  3. Azure Machine Learning: a fully-managed cloud service that enables you
     to easily build, deploy, and share predictive analytics solutions.
  4. Machine Learning on Azure

     Sophisticated pretrained models, to simplify solution development:
       Cognitive Services (Vision, Speech, Language, …), Azure Search
     Popular frameworks, to build advanced deep learning solutions:
       TensorFlow, Keras, PyTorch, ONNX
     Productive services, to empower data science and development teams:
       Azure Databricks, Azure Machine Learning
     Powerful infrastructure, to accelerate deep learning:
       Machine Learning VMs
     Flexible deployment, to deploy and manage models on intelligent cloud and edge:
       On-premises, Cloud, Edge
  5. Create a workspace

     ws = Workspace.create(
         name='<NAME>',
         subscription_id='<SUBSCRIPTION ID>',
         resource_group='<RESOURCE GROUP>',
         location='westeurope')
     ws.write_config()

     ws = Workspace.from_config()
  6. Azure Machine Learning Service

     Datasets – registered, known data sets
     Experiments – training runs
     Models – registered, versioned models
     Endpoints:
       Real-time Endpoints – deployed model endpoints
       Pipeline Endpoints – training workflows
     Compute – managed compute
     Datastores – connections to data
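     As a rough illustration (not from the deck), all of these assets are reachable from the Workspace object in the Python SDK; a minimal sketch, assuming the config.json written earlier:

     from azureml.core import Workspace

     ws = Workspace.from_config()

     # Each asset type listed above hangs off the workspace object
     print(ws.datasets)          # registered datasets
     print(ws.experiments)       # experiments (training runs)
     print(ws.models)            # registered, versioned models
     print(ws.webservices)       # deployed real-time endpoints
     print(ws.compute_targets)   # managed compute
     print(ws.datastores)        # connections to data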
  7. Create compute

     cfg = AmlCompute.provisioning_configuration(
         vm_size='STANDARD_NC6',
         min_nodes=1,
         max_nodes=6)
     cc = ComputeTarget.create(ws, '<NAME>', cfg)

     Steps: Create a workspace → Create compute
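     A small follow-up that usually goes with this step (a sketch, not shown in the deck): block until provisioning finishes, and reuse the cluster on later runs instead of recreating it.

     # Wait for the cluster to finish provisioning
     cc.wait_for_completion(show_output=True)

     # On subsequent runs, attach to the existing cluster by name
     cc = ComputeTarget(workspace=ws, name='<NAME>')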
  8. Create an estimator

     params = {'--data-folder': ws.get_default_datastore().as_mount()}

     estimator = TensorFlow(
         source_directory=script_folder,
         script_params=params,
         compute_target=computeCluster,
         entry_script='train.py',
         use_gpu=True,
         conda_packages=['scikit-learn', 'keras', 'opencv'],
         framework_version='1.10')

     Steps: Create an experiment → Create a training file → Create an estimator
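     The step list mentions creating an experiment and a training file, which the slide does not show; a minimal sketch of both, where the experiment name and the script contents are assumptions:

     from azureml.core import Experiment

     exp = Experiment(workspace=ws, name='my-experiment')

     # train.py (skeleton) – receives the mounted datastore via --data-folder
     import argparse

     parser = argparse.ArgumentParser()
     parser.add_argument('--data-folder', type=str, dest='data_folder')
     args = parser.parse_args()

     print('Training data mounted at:', args.data_folder)
     # ... load data, build and fit the Keras/TensorFlow model,
     # then save it under ./outputs so Azure ML uploads it with the run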
  9. Submit the experiment to the cluster

     run = exp.submit(estimator)
     RunDetails(run).show()

     Steps: Create an experiment → Create a training file → Create an estimator → Submit to the AI cluster
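     Once submitted, the run can be followed to completion and its logged metrics read back; a short sketch using the same run object:

     # Block until the remote run finishes, then inspect logged metrics
     run.wait_for_completion(show_output=True)
     print(run.get_metrics())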
  10. Demo: Creating and running an experiment

     Steps: Create an experiment → Create a training file → Create an estimator → Submit to the AI cluster
  11. Azure Notebook · Compute Target · Experiment · Docker Image · Data store

     1. Snapshot folder and send to experiment
     2. Create docker image
     3. Deploy docker image and snapshot to compute
     4. Mount datastore to compute
     5. Launch the script
     6. Stream stdout, logs, metrics
     7. Copy over outputs
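     Step 7 copies the script's ./outputs folder back onto the run, where it can be listed and downloaded; a sketch, with the file name being an assumption:

     print(run.get_file_names())
     run.download_file(name='outputs/model.h5', output_file_path='./model.h5')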
  12. Register the model

     model = Model.register(
         ws,
         model_name='My Model Name',
         model_path='./savedmodel',
         description='My cool Model')

     Steps: Create an experiment → Create a training file → Create an estimator → Submit to the AI cluster → Register the model
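     A common variation (a sketch, not shown on the slide) registers the model straight from the run, which keeps the registered model linked to the experiment that produced it; the outputs path is an assumption:

     model = run.register_model(
         model_name='My Model Name',
         model_path='outputs/savedmodel')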
  13. Demo: Register and test the model

     Steps: Create an experiment → Create a training file → Create an estimator → Submit to the AI cluster → Register the model
  14. Score.py

     %%writefile score.py
     from azureml.core.model import Model

     def init():
         model_root = Model.get_model_path('MyModel')
         loaded_model = model_from_json(loaded_model_json)
         loaded_model.load_weights(model_file_h5)

     def run(raw_data):
         url = json.loads(raw_data)['url']
         image_data = cv2.resize(image_data, (96, 96))
         predicted_labels = loaded_model.predict(data1)
         return json.dumps(predicted_labels)
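     The slide elides where the model JSON and weights come from, how the image behind the URL is fetched, and the global model handle. A more complete sketch of score.py under those assumptions (the file names model.json / model.h5 and the {'url': ...} payload are assumptions):

     import json
     import os
     import urllib.request

     import cv2
     import numpy as np
     from keras.models import model_from_json
     from azureml.core.model import Model

     def init():
         global loaded_model
         model_root = Model.get_model_path('MyModel')
         with open(os.path.join(model_root, 'model.json')) as f:
             loaded_model = model_from_json(f.read())
         loaded_model.load_weights(os.path.join(model_root, 'model.h5'))

     def run(raw_data):
         # Fetch the image referenced in the request, resize to the model's input size, predict
         url = json.loads(raw_data)['url']
         image_bytes = np.asarray(bytearray(urllib.request.urlopen(url).read()), dtype='uint8')
         image = cv2.imdecode(image_bytes, cv2.IMREAD_COLOR)
         image = cv2.resize(image, (96, 96))
         data = np.expand_dims(image / 255.0, axis=0)
         predicted_labels = loaded_model.predict(data)
         return json.dumps(predicted_labels.tolist())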
  15. Environment File

     from azureml.core.runconfig import CondaDependencies

     cd = CondaDependencies.create()
     cd.add_conda_package('keras==2.2.2')
     cd.add_conda_package('opencv')
     cd.add_tensorflow_conda_package()
     cd.save_to_file(base_directory='./', conda_file_path='myenv.yml')
  16. Create the image

     image_config = ContainerImage.image_configuration(
         runtime="python",
         execution_script="score.py",
         conda_file="myenv.yml")

     image = Image.create(
         name="my-image",
         models=[model],
         image_config=image_config,
         workspace=ws)
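     Image builds run server-side and can take a few minutes; a one-line follow-up that is commonly used here (a sketch):

     # Block until the Docker image has been built in the workspace
     image.wait_for_creation(show_output=True)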
  17. Deploy to ACI

     aciconfig = AciWebservice.deploy_configuration(
         cpu_cores=1,
         memory_gb=2)

     aci_service = Webservice.deploy_from_image(
         deployment_config=aciconfig,
         image=image,
         name='<ImageName>',
         workspace=ws)
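     Once deployed, the service exposes a scoring URI and can be called directly from the SDK; a sketch, where the image URL in the payload is an assumption:

     import json

     # Wait for the container instance to come up, then call the endpoint
     aci_service.wait_for_deployment(show_output=True)
     print(aci_service.scoring_uri)

     result = aci_service.run(json.dumps({'url': 'https://example.com/cat.jpg'}))
     print(result)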
  18. Prepare → Experiment → Deploy

     Prepare: Prepare Data
     Experiment: Build model (your favorite IDE) → Train & Test Model
     Deploy: Register and Manage Model → Build Image … → Deploy Service → Monitor Model