
ONNX-Go: neural networks made easy

Feel free to reach out on Twitter or the Gophers Slack: @owulveryck.

The main project is hosted here: https://github.com/owulveryck/onnx-go

If you want to test the code, here is how:
```sh
# Get the model:
curl https://www.cntk.ai/OnnxModels/mnist/opset_7/mnist.tar.gz | tar -C /tmp -xzvf -

# Get the demo binary from https://github.com/owulveryck/onnx-go/releases/tag/v0.1-mnist-cli

# Run it:
./mnist-reader.darwin -model /tmp/mnist/model.onnx
```
------------------------------------------------------------------------------------------------
# Why would developers need machine learning in their code?

Software has not eaten the world (yet), but it has drastically changed it.

Software gives us, as humans, new capabilities and helps us make our choices.
The current goal of our tools is to help us answer questions of the form:
"Given X, I want to predict Y".
This is where machine learning enters the scene.

However, mastering the machine learning ecosystem is hard. As a developer, it's normal to freak out in front of tensors, gradients, non-linearities, and back-propagation, but I think this should not prevent us, the “regular developers,” from being part of this evolution.

We should be able to use a machine learning model as a regular library: we feed it with some input and get the output, regardless of the wiring inside the model.

_Let’s take an illustration._
A neural network is essentially a set of trained values applied to an input; the algorithm itself is a relatively complex mathematical formula.
The goal of the data scientist is to find the right equations for a given use-case and to train them until the model gives an accurate output for a given input.
For example, the MNIST model returns an integer given a picture of a handwritten digit.
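
To make that concrete, here is a minimal sketch of what “using a model like a regular library” could look like from the caller’s side. The package and interface names (`digit`, `Classifier`, `Predict`) are hypothetical, purely for illustration:

```go
// Package digit is a hypothetical consumer-facing wrapper around the MNIST model.
package digit

import "image"

// Classifier is all a consuming program needs to know about the model:
// a picture goes in, a digit comes out. The tensors, gradients and
// back-propagation stay hidden behind the interface.
type Classifier interface {
	// Predict returns the digit (0-9) read from a picture of a handwritten number.
	Predict(img image.Image) (int, error)
}
```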

We see that machine learning's strength lies in the model and its trained values, not in its runtime. The runtime is just an execution constraint.

# How can we easily use neural networks?

Now let's say we can encode the model and its trained values in an open format, independent of the execution backend. We now have the power of machine learning, and we can choose the best execution backend.

Such a format exists; it's called Open Neural Network eXchange (ONNX).

Now suppose you want to use Go as a runtime (we all love programming in Go).
Go has an efficient execution engine for machine-learning workloads, and it’s called Gorgonia.
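
For a taste of what Gorgonia looks like on its own, here is a small sketch in the spirit of the example from Gorgonia's README: it builds an expression graph for z = x + y and evaluates it with a tape machine. Treat the exact calls as indicative of the API rather than as a reference:

```go
package main

import (
	"fmt"
	"log"

	"gorgonia.org/gorgonia"
)

func main() {
	// Build an expression graph.
	g := gorgonia.NewGraph()

	// Declare two scalar nodes in the graph.
	x := gorgonia.NewScalar(g, gorgonia.Float64, gorgonia.WithName("x"))
	y := gorgonia.NewScalar(g, gorgonia.Float64, gorgonia.WithName("y"))

	// Symbolically define z = x + y.
	z, err := gorgonia.Add(x, y)
	if err != nil {
		log.Fatal(err)
	}

	// Bind concrete values to the inputs and run the graph.
	gorgonia.Let(x, 2.0)
	gorgonia.Let(y, 2.5)
	machine := gorgonia.NewTapeMachine(g)
	if err := machine.RunAll(); err != nil {
		log.Fatal(err)
	}

	fmt.Printf("z = %v\n", z.Value()) // z = 4.5
}
```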

# What can we use?

Let me introduce you to onnx-go, an interface that allows importing pre-trained ONNX models into any Go program and running them with an execution backend (Gorgonia is one example).

onnx-go is a Go package that exposes functions to read a model encoded in the ONNX protobuf definition.
onnx-go does not implement any execution backend; instead, it relies on pre-existing engines such as Gorgonia.

Its goal is to abstract the neural network away so that Gophers can use any neural net like any other library.
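
Here is a minimal sketch of that workflow, adapted from the MNIST example on slide 5 of the deck below (onnx-go v0.1 era). The package's API may have changed since then, and `setData`/`getData` are placeholder helpers, not real onnx-go functions: they stand in for code that copies the picture into the model's input tensor and reads the raw `[]float32` scores back out.

```go
package main

import (
	"image"
	"io/ioutil"
	"log"

	"github.com/owulveryck/onnx-go"
	"gorgonia.org/gorgonia"
)

func main() {
	// Create the execution backend (Gorgonia) and the onnx model receiver.
	backend := gorgonia.NewGraph()
	model := onnx.NewModel(backend)

	// Read the onnx file and deserialize it into the model receiver.
	b, err := ioutil.ReadFile("model.onnx")
	if err != nil {
		log.Fatal(err)
	}
	if err := model.Unmarshal(b); err != nil {
		log.Fatal(err)
	}

	// Set the input: copy the 28x28 picture into the model's first input.
	var picture *image.NRGBA            // loaded elsewhere
	setData(model.Input[0], picture)    // placeholder helper, see note above

	// Run the backend to compute the result.
	if err := gorgonia.NewTapeMachine(backend).RunAll(); err != nil {
		log.Fatal(err)
	}

	// Read the scores back: one float32 per digit, the highest wins.
	output := getData(model.Output[0]) // []float32, placeholder helper
	log.Println(output)
}
```

Whatever the exact calls, the shape of the workflow is the point: pick a backend, unmarshal the ONNX file into it, set the inputs, run, read the outputs.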

## Where to go from here?

* Take a look at what software 2.0 is.
* Take a look at the [onnx project](https://onnx.ai/), the [model zoo](https://github.com/onnx/models), pre-trained models.
* Imagine what you, as a Gopher, can do with this capacity named Machine Learning.
* Get involved in [onnx-go](https://github.com/owulveryck/onnx-go).

You may even want to join the community of Gopher data scientists (the #data-science channel on the Gophers Slack), where people are willing to help…

and *altogether, let’s make programming (with) neural networks fun again.*

Olivier Wulveryck

March 25, 2019

# Transcript

  1. ONNX-GO: Neural Networks made easy. dotGo, March 25th 2019. Olivier Wulveryck, Octo Technology, @owulveryck
  2. - I want to give my code super power! - I want to use it to predict Y, given X! - Use machine learning, Luke!
  3. (word cloud of machine-learning jargon: Gradient, Tensorflow, Tensors, non-linearities, Sigmoid, backpropagation, overfitting, PyTorch, LSTM, convolution, K-means)
  4. (diagram: inputs flow into a Machine Learning Model, which produces outputs)
  5. SHOW ME SOME CODE! ONNX-GO (MNIST) example:
     // Create the execution backend (Gorgonia) and the onnx model receiver
     backend := gorgonia.NewGraph()
     model := onnx.NewModel(backend)
     // Read the onnx file and deserialize it into the model receiver
     b, err := ioutil.ReadFile("model.onnx")
     err = model.Unmarshal(b)
     // Set the input
     var picture *image.NRGBA
     setData(model.Input[0], picture)
     // Run the backend to compute the result
     gorgonia.NewTapeMachine(backend).RunAll()
     output := getData(model.Output[0]) // []float32
  6. DEMO of the POC:
     # Get the model:
     curl https://www.cntk.ai/OnnxModels/mnist/opset_7/mnist.tar.gz | tar -C /tmp -xzvf -
     # Get the demo binary from https://github.com/owulveryck/onnx-go/releases/tag/v0.1-mnist-cli
     # Run it:
     ./mnist-reader.darwin -model /tmp/mnist/model.onnx
  7. Imagine what you, as a Gopher, can do with Machine Learning. Get involved, nobody is a nobody! github.com/onnx/models, github.com/owulveryck/onnx-go, gorgonia.org/gorgonia
  8. Let's make programming (with) Neural networks fun (again). Welcome to software 2.0 with Go! Thank you! @owulveryck