
AI in the Battle against fakes

Counterfeit products are a growing problem. As well as cutting into a company's revenue, they hurt brand reputation and customer confidence. Our company was asked to create a solution for a global electronics brand that can identify fake products from a single smartphone picture.

In this session we zoom in on the building blocks that make this AI solution work. We'll find out that there is much more to it than just training a convolutional neural network. We look at challenges such as: how do you manage and monitor the AI, and how do you build and improve the AI model so that it fits your DevOps production chain?
Interested in this technological journey?

Learn how we used Azure Functions, Cosmos DB and Docker to build a solid foundation; Azure Machine Learning Service to train the neural networks; and Azure DevOps to control, build and deploy this state-of-the-art solution.

AI used to be in the realm of scientists in white lab coats. Now any smart developer can integrate this exciting technology into their next project. Join us to find out how.

Henk Boelman

December 03, 2019

Transcript

  1. Henk Boelman, Cloud Advocate @ Microsoft. AI in the battle against fakes. HenkBoelman.com | @hboelman
  2. Agenda: assignment & challenges; solution architecture; icon detection and validation with Custom Vision; logo classification with AMLS; recap and lessons learned.
  3. Ingredients: Azure Durable Functions, Docker containers, Azure Kubernetes Service, Cosmos DB, Azure Blob Storage, Umbraco, AMLS, Cognitive Services, Azure DevOps, API Management.
  4. AI Engine flow: photo of product → pre-process image → match to product in DB → Text AI / Logo AI / Icon AI → end evaluation.
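The AI Engine flow on the slide can be sketched in plain Python. Every function below is a stand-in (names, return shapes and the threshold are assumptions, not the production code); the point is the shape of the pipeline: pre-process, match, run the three models, then combine their verdicts.

```python
def pre_process(photo):
    return photo                          # stub: crop / normalise the photo

def match_product(image):
    return "shaver-x100"                  # stub: look the product up in the DB

def text_ai(image, product):  return 0.90  # stub confidence scores
def logo_ai(image, product):  return 0.80
def icon_ai(image, product):  return 0.85

def end_evaluation(scores, threshold=0.5):
    # The product passes only if every model is sufficiently confident.
    return all(s > threshold for s in scores)

def ai_engine(photo):
    image = pre_process(photo)
    product = match_product(image)
    scores = [text_ai(image, product),
              logo_ai(image, product),
              icon_ai(image, product)]
    return end_evaluation(scores)

print(ai_engine("photo.jpg"))  # True with these stub scores
```

Combining the three specialised models in one final evaluation step is what lets each model stay small and replaceable.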
  5. Machine Learning on Azure: sophisticated pretrained models to simplify solution development (Cognitive Services: Vision, Speech, Language, Azure Search); popular frameworks to build advanced deep learning solutions (TensorFlow, Keras, PyTorch, ONNX); productive services to empower data science and development teams (Azure Machine Learning, Azure Databricks, Machine Learning VMs); powerful infrastructure to accelerate deep learning; flexible deployment to deploy and manage models on the intelligent cloud and edge (cloud, on-premises, edge).
  6. Icon validator API: object detection followed by per-icon classification. Input: product photo. Output: { Icon: Car, Genuine: False, Score: 95% }, { Icon: Truck, Genuine: True, Score: 85% } (classifier scores per icon, e.g. Genuine: 2% | Fake: 95% and Genuine: 85% | Fake: 3%).
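A small sketch of consuming the icon validator's output as shown on the slide. The JSON field names are assumptions mirroring the slide, not the actual API contract; the rule that one fake icon flags the whole product is one plausible way to fold the per-icon results into a verdict.

```python
import json

# Hypothetical response from the Icon validator API (field names assumed).
response = json.dumps([
    {"icon": "Car",   "genuine": False, "score": 0.95},
    {"icon": "Truck", "genuine": True,  "score": 0.85},
])

def product_is_genuine(raw):
    """A product counts as genuine only if every detected icon does."""
    return all(item["genuine"] for item in json.loads(raw))

print(product_is_genuine(response))  # False: the Car icon was flagged as fake
```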
  7. Icon detection with the Custom Vision Service: label images (Visual Object Tagging Tool) → export → train model → deploy model → test model (run test set against model).
  8. Icon verification with the Custom Vision Service API: label images (images in 2 folders, genuine / fake), image augmentation → export → train model → deploy model → test model (run test set against model).
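Image augmentation, mentioned on the slide, multiplies the training set by generating variants of each labelled image. A minimal pure-Python sketch (an image here is just a list of pixel rows; real pipelines would use a library and more transforms than a horizontal flip):

```python
def flip_horizontal(image):
    # Mirror each row of pixels left-to-right.
    return [row[::-1] for row in image]

def augment(images):
    """Return the originals plus their mirrored variants."""
    return images + [flip_horizontal(img) for img in images]

original = [[1, 2, 3],
            [4, 5, 6]]
augmented = augment([original])
print(len(augmented))   # 2: the original and its mirror
print(augmented[1][0])  # [3, 2, 1]
```

This matters because, as the closing slide notes, you never have enough data; augmentation is the cheapest way to get more.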
  9. Logo classification (Icon Validator API): the Computer Vision API cuts out the PHILIPS logo → pre-process → predict image → return result.
  10. Logo classification with Azure Machine Learning Service: label images (images in 2 folders, genuine / fake), image augmentation → pre-process → train model → deploy model → test model (run test set against model).
  11. Azure Machine Learning Studio: a fully managed cloud service that enables you to easily build, deploy, and share predictive analytics solutions.
  12. Azure Machine Learning Service concepts: Datasets (registered, known data sets); Experiments (training runs); Pipelines (workflow runs); Models (registered, versioned models); Endpoints: real-time endpoints (deployed model endpoints) and pipeline endpoints (training workflows); Compute (managed compute); Datastores (connections to data).
  13. Create a workspace:
      ws = Workspace.create(name='<NAME>',
                            subscription_id='<SUBSCRIPTION ID>',
                            resource_group='<RESOURCE GROUP>',
                            location='westeurope')
      ws.write_config()
      ws = Workspace.from_config()
  14. Create compute:
      cfg = AmlCompute.provisioning_configuration(vm_size='STANDARD_NC6',
                                                  min_nodes=1,
                                                  max_nodes=6)
      cc = ComputeTarget.create(ws, '<NAME>', cfg)
  15. Create an estimator:
      params = {'--data-folder': ws.get_default_datastore().as_mount()}
      estimator = TensorFlow(source_directory=script_folder,
                             script_params=params,
                             compute_target=computeCluster,
                             entry_script='train.py',
                             use_gpu=True,
                             conda_packages=['scikit-learn', 'keras', 'opencv'],
                             framework_version='1.10')
  16. Submit the experiment to the cluster:
      run = exp.submit(estimator)
      RunDetails(run).show()
  17. Demo: create and run an experiment.
  18. Azure Notebook, compute target, experiment, Docker image, data store: 1. snapshot folder and send to experiment; 2. create Docker image; 3. deploy Docker image and snapshot to compute; 4. mount datastore to compute; 5. launch the script; 6. stream stdout, logs, metrics; 7. copy over outputs.
  19. Register the model:
      model = run.register_model(model_name='SimpsonsAI',
                                 model_path='outputs')
  20. Demo: register and test the model.
  21. Score.py:
      %%writefile score.py
      import json, cv2
      from azureml.core.model import Model
      from keras.models import model_from_json

      def init():
          global loaded_model
          model_root = Model.get_model_path('MyModel')
          with open(model_root + '/model.json') as f:  # file names assumed
              loaded_model = model_from_json(f.read())
          loaded_model.load_weights(model_root + '/model.h5')

      def run(raw_data):
          url = json.loads(raw_data)['url']
          image_data = fetch_image(url)  # download step elided on the slide; fetch_image is a placeholder
          image_data = cv2.resize(image_data, (96, 96))
          predicted_labels = loaded_model.predict(image_data.reshape(1, 96, 96, 3))
          return json.dumps(predicted_labels.tolist())
  22. Environment file:
      from azureml.core.runconfig import CondaDependencies
      cd = CondaDependencies.create()
      cd.add_conda_package('keras==2.2.2')
      cd.add_conda_package('opencv')
      cd.add_tensorflow_conda_package()
      cd.save_to_file(base_directory='./', conda_file_path='myenv.yml')
  23. Deploy to ACI:
      aciconfig = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=2)
      service = Model.deploy(workspace=ws,
                             name='simpsons-aci',
                             models=[model],
                             inference_config=inference_config,
                             deployment_config=aciconfig)
  24. Deploy to AKS:
      aks_target = AksCompute(ws, "AI-AKS-DEMO")
      deployment_config = AksWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)
      service = Model.deploy(workspace=ws,
                             name="simpsons-ailive",
                             models=[model],
                             inference_config=inference_config,
                             deployment_config=deployment_config,
                             deployment_target=aks_target)
      service.wait_for_deployment(show_output=True)
  25. Recap: Azure Functions for the orchestrator; Custom Vision for easy models; Azure Machine Learning Service; make services (APIs) for specific tasks / models; Git / CI / CD / test everything.
  26. Lessons learned: you never have enough data; apply DevOps practices from the start…; break the problem into smaller pieces; think big, start small, grow fast.