Feel free to reach out on Twitter or Gopher slack: @owulveryck.
The main project is hosted here: https://github.com/owulveryck/onnx-go
If you want to test the code, here are the steps:

```shell
# Get the model:
curl https://www.cntk.ai/OnnxModels/mnist/opset_7/mnist.tar.gz | tar -C /tmp -xzvf -
# Get the demo binary from https://github.com/owulveryck/onnx-go/releases/tag/v0.1-mnist-cli
# Run it:
./mnist-reader.darwin -model /tmp/mnist/model.onnx
```
# Why would developers need machine learning in their code?
Software has not eaten the world (yet), but it has drastically changed it.
Software has given us, as humans, new capabilities, and it helps guide our choices.
Today, the goal of many tools is to help us answer questions of the form:
"Given X, I want to predict Y."
This is where machine learning enters the scene.
However, mastering the machine learning ecosystem is hard. As a developer, it's normal to freak out in front of tensors, gradients, non-linearities, and back-propagation. But I think this should not prevent us, the “regular developers,” from being part of this evolution.
We should be able to use a machine learning model as a regular library. We feed it with some input and get the output regardless of the wiring inside the model.
_Let’s take an example._
A neural network is essentially a set of trained values (weights) applied to an input. The algorithm itself is a relatively complex mathematical formula.
The goal of the data scientist is to find the right equations for a given use case and to train them to produce an accurate output for a given input.
For example, the MNIST model returns an integer given a picture of a handwritten digit.
This shows that the strength of machine learning lies in the model and its trained values, not in its runtime. The runtime is just an execution constraint.
# How can we easily use neural networks?
Now let's say we can encode the model and its trained values in an open format, independent of the execution backend. We then have the power of machine learning, and we are free to choose the best execution backend.
Such a format exists; it's called the Open Neural Network eXchange (ONNX).
Now suppose that you want to use Go as a runtime (we all love programming in Go).
Go has an efficient execution engine for machine learning, and it’s called Gorgonia.
# What can we use?
Let me introduce you to onnx-go, an interface that allows you to import pre-trained ONNX models into any Go program and run them with an execution backend (Gorgonia is one example).
onnx-go is a Go package that exposes some functions to read a model encoded in the ONNX protobuf definition.
onnx-go does not implement any execution backend; instead, it relies on pre-existing engines such as Gorgonia.
Its goal is to make an abstraction of the neural network so that Gophers can use any neural net like any other library.
## Where to go from here?
* Take a look at what software 2.0 is.
* Take a look at the [onnx project](https://onnx.ai/), the [model zoo](https://github.com/onnx/models), pre-trained models.
* Imagine what you, as a Gopher, can do with this capability called machine learning.
* Get involved in [onnx-go](https://github.com/owulveryck/onnx-go).
You may even want to join the community of Gopher data-scientists (#data-science channel on the gopher slack), where people are willing to help…
and *all together, let’s make programming (with) neural networks fun again.*