Make your IoT even smarter with Tensorflow Lite @SnowCampIO

While Machine Learning is usually deployed in the Cloud, lightweight versions of these algorithms, adapted to constrained IoT systems such as microcontrollers, are starting to appear.

Running Machine Learning "at the edge" offers several advantages, such as reduced latency, data privacy, and operation without an internet connection.

In this talk, we will see that it is indeed possible to deploy Deep Learning algorithms on connected devices thanks to TensorFlow Lite. We will then see how to use it to design the "agriculture of the future", able to predict and optimize vegetable production, both at home and in developing countries where the internet connection is intermittent.

Alexis DUQUE

January 23, 2020

Transcript

  1. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Make your IoT Smarter with

    Tensorflow Lite ... … to Design the Future of Vertical Farming @alexis0duque
  2. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Who am I?

  3. @alexis0duque #Snowcamp #IoT #Tensorflow #AI

  4. @alexis0duque #Snowcamp #IoT #Tensorflow #AI

  5. @alexis0duque #Snowcamp #IoT #Tensorflow #AI

  6. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Outline & What You Will

    Learn
  7. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Indoor Vertical Farming

  8. @alexis0duque #Snowcamp #IoT #Tensorflow #AI System Architecture

  9. @alexis0duque #Snowcamp #IoT #Tensorflow #AI System Architecture

  10. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Irrigation System

    • https://www.hydroponic-urban-gardening.com/hydroponics-guide/various-hydroponics-systems
  11. @alexis0duque #Snowcamp #IoT #Tensorflow #AI System Architecture

  12. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Computer Vision

  13. @alexis0duque #Snowcamp #IoT #Tensorflow #AI System Architecture

  14. @alexis0duque #Snowcamp #IoT #Tensorflow #AI

  15. @alexis0duque #Snowcamp #IoT #Tensorflow #AI System Architecture

  16. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Why Machine Learning?

    Johnson AJ, et al. (2019) Flavor-cyber-agriculture: Optimization of plant metabolites in an open-source control environment through surrogate modeling. PLOS ONE 14(4): e0213918.
  17. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Why Edge Computing?

  18. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Tensorflow

  19. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Tensorflow

  20. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Tensorflow Lite (TF-Lite)

  21. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Tensorflow Lite (TF-Lite)

  22. @alexis0duque #Snowcamp #IoT #Tensorflow #AI ML Workflow with Tensorflow Lite

  23. @alexis0duque #Snowcamp #IoT #Tensorflow #AI ML Workflow with Tensorflow Lite
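
A minimal sketch of the workflow these two slides describe (train a Keras model on the laptop, convert it to a .tflite flatbuffer, ship it to the device). The tiny regression model and synthetic data below are illustrative placeholders, not the model from the talk:

    import numpy as np
    import tensorflow as tf

    # 1. Train an ordinary Keras model on the laptop.
    x = np.random.uniform(0, 10, size=(1000, 1)).astype(np.float32)
    y = (2.5 * x + 1.0).astype(np.float32)
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(1,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(x, y, epochs=5, verbose=0)

    # 2. Convert it to a TF-Lite flatbuffer for the edge device.
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional quantization
    tflite_model = converter.convert()

    # 3. Save the flatbuffer; for a microcontroller it is then embedded as a
    #    C array, e.g. with `xxd -i model.tflite > lg_weight_model.h`.
    with open("model.tflite", "wb") as f:
        f.write(tflite_model)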

  24. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Setup on your Laptop

  25. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Setup on your Laptop

    $ python3 --version
    $ pip3 --version
    $ virtualenv --version
    $ virtualenv --system-site-packages -p python3 ./venv
    $ source ./venv/bin/activate
    $ pip install --upgrade pip
    $ pip install --upgrade tensorflow==2.0
    $ pip install numpy pandas jupyter jupyterlab notebook matplotlib
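
A quick sanity check after the laptop setup; a minimal sketch that only assumes the virtualenv created above is active:

    # Verify the TensorFlow 2.0 install inside the virtualenv.
    import tensorflow as tf

    print(tf.__version__)                           # should start with "2.0"
    print(tf.reduce_sum(tf.ones((2, 2))).numpy())   # eager execution: prints 4.0
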
  26. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Setup on your RPI

  27. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Setup on your RPI

    $ sudo apt install swig libjpeg-dev zlib1g-dev python3-dev python3-numpy unzip
    $ wget https://github.com/PINTO0309/TensorflowLite-bin/raw/master/2.0.0/tflite_runtime-2.0.0-cp37-cp37m-linux_armv7l.whl
    $ pip install --upgrade tflite_runtime-2.0.0-cp37-cp37m-linux_armv7l.whl
  28. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Setup on your RPI

    $ sudo apt install swig libjpeg-dev zlib1g-dev python3-dev python3-numpy unzip
    $ wget https://github.com/PINTO0309/TensorflowLite-bin/raw/master/2.0.0/tflite_runtime-2.0.0-cp37-cp37m-linux_armv7l.whl
    $ pip install --upgrade tflite_runtime-2.0.0-cp37-cp37m-linux_armv7l.whl
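
Once tflite_runtime is installed, inference on the Raspberry Pi only needs the lightweight interpreter. A minimal sketch, assuming a single-input float32 regression model saved as model.tflite (a placeholder file name, not from the slides):

    import numpy as np
    from tflite_runtime.interpreter import Interpreter

    # Load the converted model with the standalone TF-Lite runtime.
    interpreter = Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Feed one sample shaped like the model's input tensor, then run inference.
    x = np.array([[4.2]], dtype=np.float32)
    interpreter.set_tensor(input_details[0]["index"], x)
    interpreter.invoke()

    y = interpreter.get_tensor(output_details[0]["index"])
    print("prediction:", y)
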
  29. Demo @alexis0duque #Snowcamp #IoT #Tensorflow #AI

  30. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Tensorflow Lite Benchmark Source: Alasdair

    Allan
  31. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Tensorflow Lite Benchmark Source: Alasdair

    Allan
  32. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Tensorflow Lite Limitations

  33. @alexis0duque #Snowcamp #IoT #Tensorflow #AI TFLite on Microcontrollers Arduino Nano 33
  34. @alexis0duque #Snowcamp #IoT #Tensorflow #AI TFLite on Microcontrollers

    #include <TensorFlowLite.h>
    // This is your tflite model
    #include "lg_weight_model.h"
    #include "tensorflow/lite/experimental/micro/kernels/all_ops_resolver.h"
    #include "tensorflow/lite/experimental/micro/micro_interpreter.h"
    #include "tensorflow/lite/schema/schema_generated.h"
    tflite::ErrorReporter *error_reporter = nullptr;
    const tflite::Model *model = nullptr;
    tflite::MicroInterpreter *interpreter = nullptr;
    TfLiteTensor *input = nullptr;
    TfLiteTensor *output = nullptr;
  35. @alexis0duque #Snowcamp #IoT #Tensorflow #AI TFLite on Microcontrollers

    // Finding the min value for your model may require tests!
    constexpr int kTensorArenaSize = 2 * 1024;
    uint8_t tensor_arena[kTensorArenaSize];
    // Load your model.
    model = tflite::GetModel(g_weight_regresion_model_data);
    // This pulls in all the operation implementations we need.
    static tflite::ops::micro::AllOpsResolver resolver;
    // Build an interpreter to run the model with.
    static tflite::MicroInterpreter static_interpreter(
        model, resolver, tensor_arena, kTensorArenaSize, error_reporter);
    interpreter = &static_interpreter;
  36. @alexis0duque #Snowcamp #IoT #Tensorflow #AI TFLite on Microcontrollers

    // Allocate memory for the model's tensors.
    TfLiteStatus allocate_status = interpreter->AllocateTensors();
    // Obtain pointers to the model's input and output tensors.
    input = interpreter->input(0);
    output = interpreter->output(0);
    // Feed the interpreter with the input value
    float x_val = random(0, 10);
    input->data.f[0] = x_val;
    // Run Inference
    TfLiteStatus invoke_status = interpreter->Invoke();
    // Get inference result
    float y_val = output->data.f[0];
  37. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Further Work

  38. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Summary

  39. @alexis0duque #Snowcamp #IoT #Tensorflow #AI Thanks! frama.link/rtone-iot-tflite frama.link/rtone-jobs

  40. @alexis0duque #Snowcamp #IoT #Tensorflow #AI References