I gave a presentation on an introduction to tinyML (machine learning on the edge), with a bit about TensorFlow Lite for Microcontrollers, at the Mozilla Firefox Club VIT webinar.
What is tinyML?
• Machine learning + embedded systems = intelligent IoT devices
• It explores what kinds of models you can run on small, low-power devices like microcontrollers.
• Summarizing and analyzing data at the edge, on low-power devices
Why tinyML?
• Instant response, since inference happens on the device itself
• It enables low-latency, low-power (roughly 1000x lower), low-bandwidth model inference on edge devices.
• Privacy, because raw data can stay on the device
• You can collect more data, which helps to build better products.
TensorFlow Lite
• You can use this tool to wrap your models so they run on embedded systems (see the sketch after this list).
• It supports Android, iOS, Arduino, and more.
• Not only Python: you can also use C, C++, and Java.
• Pretrained models are available.
• Other tools in this space include CoreML and PyTorch Mobile.
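To make the "wrap your models" idea concrete, here is a minimal sketch: a throwaway two-layer Keras model (a placeholder for whatever you actually trained) is converted to a TensorFlow Lite flatbuffer and then sanity-checked on the host with the tf.lite.Interpreter. Layer sizes and values are illustrative only.

```python
import numpy as np
import tensorflow as tf

# A tiny stand-in for whatever model you actually trained.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(1,)),
    tf.keras.layers.Dense(1),
])

# Wrap the model into a TensorFlow Lite flatbuffer.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Sanity-check the converted model on the host with the TF Lite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

interpreter.set_tensor(input_details["index"], np.array([[0.5]], dtype=np.float32))
interpreter.invoke()
print(interpreter.get_tensor(output_details["index"]))
```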
TensorFlow Lite for Microcontrollers
• Designed specifically to run machine learning models on microcontrollers and other devices with only a few kilobytes of memory.
• TensorFlow Lite for Microcontrollers is written in C++11 and requires a 32-bit platform.
• The framework is also available as an Arduino library.
Workflow
• Train your machine learning model.
• Convert it to a TensorFlow Lite model using the TensorFlow Lite converter.
• Convert that to a C byte array using standard tools so it can be stored in read-only program memory on the device.
• Run inference on the device using the C++ library and process the results (a host-side sketch of the first steps follows this list).
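The first three steps of this workflow happen on the host. The sketch below assumes a toy model trained on synthetic data; the post-training quantization shown is optional but common when targeting devices with only a few kilobytes of memory, and the xxd command noted in the comments is one standard way to produce the C byte array.

```python
import numpy as np
import tensorflow as tf

# Step 1: train a toy model (a stand-in for your real training pipeline).
x = np.random.rand(200, 1).astype(np.float32)
y = 2.0 * x + 1.0
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(1,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=5, verbose=0)

# Step 2: convert to a TensorFlow Lite model. Post-training quantization
# (optional) shrinks the model, which matters on memory-constrained devices.
def representative_dataset():
    for sample in x[:100]:
        yield [sample.reshape(1, 1)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)

# Step 3 is usually a standard command-line tool, e.g.:
#   xxd -i model.tflite > model_data.cc
# which emits the C byte array that gets compiled into the firmware.
# Step 4 (running inference and processing the results) happens on-device
# through the C++ library.
```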
Limitations of TF Lite for Microcontrollers
• Support for a limited subset of TensorFlow operations
• Support for a limited set of devices
• Low-level C++ API requiring manual memory management
• On-device training is not supported
Complete project flow
• Define the problem
• Set up your hardware
• Set up your software
• Build and prepare the binary
• Get ready to flash the binary
• Flash the binary
• Generate the output