Andrei Diaconu
2010 – Android
Android Iasi with Andrei Verdes
2018 – Flutter
Wrote PostMuse using Flutter
2020 – Microsoft
Developer advocate in the Flutter community
This is a story, not a lecture.
Slide 3
Machine Learning
What is it?
What does it excel at?
How do you implement it?
What is inference and how do you do it in Flutter?
Slide 4
ML Frameworks
• Training data
• Training algorithms
• Operators
• Inference Model
ONNX can be compared to a programming language specialized in mathematical functions.
Slide 5
ORT + Pieces Article
• TensorFlow Lite is more restrictive.
• ORT offers a C library.
• Quantization
• Continuous model refinement
Article: ONNX Runtime: performant on-device inferencing - Microsoft Open Source Blog
The app: https://pieces.app
Slide 6
LET’S FIRST GO TO THE PUB
Tip: The easiest way to contribute to OSS is to solve your own issues and upstream the changes.
Slide 7
WHICH ONE DO YOU PICK?
Tip: Whenever you decide which library to use, have a look at the code and the issues raised on GitHub. This gives you a better picture of overall quality and completeness.
Slide 8
Run the sample app
A good example of audio processing.
It is far too complicated.
I am still learning.
I need something simpler.
Link: https://github.com/gtbluesky/onnxruntime_flutter/tree/main/example
Slide 9
STEP 1: THE SIMPLEST MODEL
Tip: When setting up ONNX Runtime, use a simple model. It eliminates issues that stem from processing, model complexity, supported operators, and so on.
Slide 10
Link: onnxruntime-inference-examples/mobile/examples/basic_usage/model at main · microsoft/onnxruntime-inference-examples · GitHub
Slide 11
Link: onnxflutterplay/lib/main.dart at main · andreidiaconu/onnxflutterplay · GitHub
Load model → A and B (input tensors) → Run model → C (output tensor)
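A minimal sketch of that flow, assuming the onnxruntime package on pub.dev (the plugin from the gtbluesky repo linked above). The asset path is a placeholder, the input/output names A, B, and C are the ones shown on the slide, and the API names follow the package README, so double-check them against the version you install.

```dart
import 'dart:typed_data';
import 'package:flutter/services.dart' show rootBundle;
import 'package:onnxruntime/onnxruntime.dart';

Future<void> runSimplestModel() async {
  // Initialize the ONNX Runtime environment once per app.
  OrtEnv.instance.init();

  // Load the model from the asset bundle and create a session.
  final modelBytes = await rootBundle.load('assets/simplest_model.onnx');
  final session = OrtSession.fromBuffer(
      modelBytes.buffer.asUint8List(), OrtSessionOptions());

  // A and B: two small float input tensors.
  final a = OrtValueTensor.createTensorWithDataList(
      Float32List.fromList([1.0, 2.0, 3.0]), [3]);
  final b = OrtValueTensor.createTensorWithDataList(
      Float32List.fromList([4.0, 5.0, 6.0]), [3]);

  // Run the model and read C, the single output tensor.
  final outputs = session.run(OrtRunOptions(), {'A': a, 'B': b});
  print('C = ${outputs.first?.value}');

  // Release native resources when done.
  a.release();
  b.release();
  session.release();
}
```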
Slide 12
Link: onnxflutterplay/lib/main.dart at main · andreidiaconu/onnxflutterplay · GitHub
Slide 13
STEP 2: COMPLEX MODEL
Tip: Install Netron so you can simply double-click a model file to view it on your PC. Always check the inputs and outputs this way to avoid confusion.
Slide 14
Link: examples/fast_neural_style at main · pytorch/examples · GitHub
Slide 15
Link: models/validated/vision/style_transfer/fast_neural_style at main · onnx/models · GitHub
Slide 16
STEP 2: COMPLEX MODEL
Tensor shape?
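Netron answers that question: for this style transfer model the input is a channels-first float tensor, and 224x224 matches the test images used later in this deck. The pre-processing then has to repack image pixels into that layout. A rough sketch, assuming RGBA bytes (as ui.Image.toByteData gives you) and a 0-255 value range with no normalization; verify the shape, range, and channel order against what Netron and the model card show.

```dart
import 'dart:typed_data';

/// Pre-processing sketch: repack RGBA pixel bytes into the channels-first
/// (NCHW) float layout the style transfer model expects.
/// Assumes a 0-255 value range; verify in Netron and the model card.
Float32List rgbaToNchw(Uint8List rgba, int width, int height) {
  final plane = width * height;
  final floats = Float32List(3 * plane);
  for (var i = 0; i < plane; i++) {
    final p = i * 4; // RGBA uses 4 bytes per pixel
    floats[i] = rgba[p].toDouble();                 // R plane
    floats[plane + i] = rgba[p + 1].toDouble();     // G plane
    floats[2 * plane + i] = rgba[p + 2].toDouble(); // B plane
  }
  return floats;
}
```

The resulting list then gets wrapped with OrtValueTensor.createTensorWithDataList(floats, [1, 3, height, width]), as in the earlier sketch.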
Slide 17
STEP 1: THE SIMPLEST MODEL
Slide 18
CONVERT ONNX TO ORT
Link: ORT model format | onnxruntime
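For reference, the page linked above documents a converter that ships with the onnxruntime Python package; the model filename below is just an example.

```
python -m onnxruntime.tools.convert_onnx_models_to_ort fast_neural_style.onnx
```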
Slide 19
What images to use
Tip: Working with images means very large arrays, which are hard to follow. Whenever some input or output is hard to debug, ask yourself what image you can manufacture to help find the problem (a sketch follows the example images below).
Example images: Red 224x224 and Red-Green 224x224
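A manufactured image like the solid red one does not even need decoding; it can be built directly as a tensor. A sketch in the NCHW, 0-255 layout assumed above: with this input the whole R plane should be 255 and the G and B planes 0, so channel-order or indexing mistakes are immediately visible.

```dart
import 'dart:typed_data';

/// Build a solid-red 224x224 test image directly as NCHW floats (0-255).
/// Feeding a fully known input makes swapped channels or off-by-one
/// indexing obvious in the intermediate values.
Float32List solidRedNchw({int width = 224, int height = 224}) {
  final plane = width * height;
  final floats = Float32List(3 * plane);
  for (var i = 0; i < plane; i++) {
    floats[i] = 255.0; // R plane: full red
    // G and B planes stay 0.0 (Float32List is zero-initialized).
  }
  return floats;
}
```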
RUN POST-PROCESSING ON PRE-PROCESSING RESULTS
Tip: Working with images? Feed the output of your pre-processing straight into your post-processing and display it on screen; this makes many issues easy to spot (see the sketch below).
(Illustrative tensor values: 255 255 255 255 … and 255 0 0 0 …)
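One way to wire up that check, reusing the layout assumptions from the earlier sketches: run the post-processing directly on the pre-processed tensor and put the result on screen. If the pipeline is consistent, the original picture comes back unchanged; a red image coming back blue, for example, points at a channel-order bug. decodeImageFromPixels and RawImage are standard dart:ui / Flutter APIs; everything else here is a sketch under the same NCHW, 0-255 assumptions.

```dart
import 'dart:async';
import 'dart:typed_data';
import 'dart:ui' as ui;

/// Post-processing sketch: unpack NCHW floats (0-255) back into RGBA bytes.
Uint8List nchwToRgba(Float32List floats, int width, int height) {
  final plane = width * height;
  final rgba = Uint8List(plane * 4);
  for (var i = 0; i < plane; i++) {
    rgba[i * 4] = floats[i].clamp(0, 255).toInt();                  // R
    rgba[i * 4 + 1] = floats[plane + i].clamp(0, 255).toInt();      // G
    rgba[i * 4 + 2] = floats[2 * plane + i].clamp(0, 255).toInt();  // B
    rgba[i * 4 + 3] = 255;                                          // opaque alpha
  }
  return rgba;
}

/// Sanity check: feed the pre-processing output straight into the
/// post-processing and decode it for display (e.g. with a RawImage widget).
Future<ui.Image> roundTrip(Float32List preprocessed, int width, int height) {
  final completer = Completer<ui.Image>();
  ui.decodeImageFromPixels(
    nchwToRgba(preprocessed, width, height),
    width,
    height,
    ui.PixelFormat.rgba8888,
    completer.complete,
  );
  return completer.future;
}
```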
Slide 25
RESULTS
Tip: Take the smallest step you can think of. There is a lot that can go wrong when processing large tensors such as those for images. Creating bespoke images to use as input is also a skill you need to learn.
Photo of a Eurasian blue tit by Francis Franklin
Slide 26
Resources
These slides
https://speakerdeck.com/andreidiaconu
Code
https://github.com/andreidiaconu/onnxflutterplay
Article
https://devblogs.microsoft.com/surface-duo/flutter-onnx-runtime/