Learn Elixir: Building a Neural Network from Scratch

Karmen Blake

March 04, 2016

Transcript

  1. LEARN ELIXIR: BUILDING A NEURAL NETWORK FROM SCRATCH

  2. KARMEN BLAKE @KBLAKE

  3. None
  4. Dev Coop Meetup Group wanted to learn more about neural networks. Each in a language of
     their choice: JavaScript, Ruby, Python, and Elixir.
  5. Goal for me: Learn more Elixir

  6. Neural Networks

  7. What have I gotten myself into?!

  8. More than academic

  9. What does a neural network do?

  10. “The task that neural networks accomplish very well is pattern recognition. You
      communicate a pattern to a neural network and it communicates a pattern back to you. At
      the highest level, this is all that a typical neural network does.” - Jeff Heaton
      (http://www.heatonresearch.com)
  11. “A neural network is not just a complex system, but a complex adaptive system, meaning
      it can change its internal structure based on the information flowing through it.” -
      Daniel Shiffman
  12. Diagram: [Input data pattern] → INPUT LAYER → HIDDEN LAYERS (not required) →
      OUTPUT LAYER → [Ideal output pattern]
  13. Patterns: SWIM - keep your head above water; COOK - combine consumable items and warm
      them up to be eaten; DAYDREAM - enjoy an imaginative story
  14. Neuron: each input value is multiplied by its connection weight; the weighted inputs
      are summed; output = activation_function(sum_of_weights)
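      A minimal Elixir sketch of that feed-forward step, assuming a sigmoid activation
      (the module name and the choice of sigmoid are illustrative, not taken from the talk's
      code):

        defmodule FeedForwardSketch do
          # Multiply each input by its connection weight, sum the results,
          # then squash the sum through a sigmoid activation.
          def activate(inputs, weights) do
            sum_of_weighted_inputs =
              inputs
              |> Enum.zip(weights)
              |> Enum.reduce(0, fn {input, weight}, acc -> acc + input * weight end)

            sigmoid(sum_of_weighted_inputs)
          end

          defp sigmoid(x), do: 1 / (1 + :math.exp(-x))
        end

        # FeedForwardSketch.activate([1, 0, 1], [0.5, 0.2, 0.8]) #=> ~0.786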
  15. None
  16. Back Propagation - train your network

  17. Neuron: update each connection weight - update delta (output - target output) -
      weight = gradient_descent. Back propagation - “backward propagation of errors”
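      A minimal sketch of that update for a single output neuron, assuming a sigmoid
      activation and a fixed learning rate (both are assumptions; the talk's library may
      choose differently):

        defmodule BackpropSketch do
          @learning_rate 0.3

          # Delta for an output neuron: the error (output - target output)
          # scaled by the derivative of the sigmoid at the neuron's output.
          def delta(output, target_output) do
            (output - target_output) * output * (1 - output)
          end

          # One gradient-descent step for a single connection weight.
          def update_weight(weight, source_output, delta) do
            weight - @learning_rate * delta * source_output
          end
        end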
  18. Process of Learning: Activation - use weights, update outputs, feed forward. Backprop -
      update connection weights determined by delta.
  19. “I CAN LEARN NOW!!”

  20. ELIXIR

  21. Learn Elixir - Initial thoughts going in…

  22. Well… one thought, really…

  23. TRANSFORM ALL THE DATA

  24. I started out with code like this:

  25. defmodule NeuralNet.Neuron do
        defstruct input: 0, output: 0, incoming: [], outgoing: []
        …
      defmodule NeuralNet.Connection do
        defstruct source: %{}, target: %{}, weight: 0.5
        …
      defmodule NeuralNet.Layer do
        defstruct neurons: []
        …
  26. neuron = %NeuralNet.Neuron{
        incoming: [
          %NeuralNet.Connection{source: %NeuralNet.Neuron{output: 2}},
          %NeuralNet.Connection{source: %NeuralNet.Neuron{output: 5}}
        ]
      }
  27. neuron.ex

      {:ok, neuron_a, neuron_b} = NeuralNet.Neuron.connect(neuron_a, neuron_b)

      def connect(source, target) do
        {:ok, connection} = Connection.connection_for(source, target)
        source = %Neuron{source | outgoing: source.outgoing ++ [connection]}
        target = %Neuron{target | incoming: target.incoming ++ [connection]}
        {:ok, source, target}
      end
  28. def activate(layer, values \\ nil) do
        values = values || []

        activated_neurons =
          layer.neurons
          |> Stream.with_index
          |> Enum.map(fn(tuple) ->
            {neuron, index} = tuple
            NeuralNet.Neuron.activate(neuron, Enum.at(values, index))
          end)

        {:ok, %NeuralNet.Layer{neurons: activated_neurons}}
      end
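      A sketch of how that early, struct-based API would be called (the values here are made
      up; only the structs and function shown above are assumed):

        layer = %NeuralNet.Layer{neurons: [%NeuralNet.Neuron{}, %NeuralNet.Neuron{}]}
        {:ok, activated_layer} = NeuralNet.Layer.activate(layer, [1, 0])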
  29. This is awesome!

  30. But…

  31. “Connecting layers” is difficult!

  32. Transforming this is hard!

  33. Diagram: source_neuron in Layer1 has outgoing: [connection1, …]; target_neuron in
      Layer2 has incoming: [connection1, …]; connection1 has source: source_neuron and
      target: target_neuron. Data is being transformed and duplicated. It gets out of sync
      and thus becomes more difficult to transform later. source_neuron, target_neuron, and
      connection1 are defined and stored in different places.
  34. # TODO: refactor this
      # accumulate connections then map
      Enum.each NeuralNet.Layer.neurons(input_layer_name), fn(source) ->
        Enum.each NeuralNet.Layer.neurons(output_layer_name), fn(target) ->
          {:ok, s, _} = NeuralNet.Neuron.connect(source, target)
          add_neurons(:source_neurons, [s])
        end
      end
  35. input_layer_neurons =
        build_input_layer_neurons_with_connections(input_layer_name, output_layer_name)
      output_layer_neurons =
        build_output_layer_neurons_with_connections(input_layer_name, output_layer_name)

      set_neurons(input_layer_name, input_layer_neurons)
      set_neurons(output_layer_name, output_layer_neurons)

      stop_agent(:source_neurons)
      stop_agent(:target_neurons)

      {:ok, input_layer_neurons, output_layer_neurons}
  36. # TODO: simplify this method
      defp build_input_layer_neurons_with_connections(input_layer_name, output_layer_name) do
        # group neurons by source
        input_layer_outgoing_connections =
          Enum.chunk(neurons(:source_neurons), length(NeuralNet.Layer.neurons(output_layer_name)))
          |> Enum.map(fn(neurons) ->
            # collect the connections for each source neuron
            Enum.map neurons, fn neuron ->
              List.first neuron.outgoing # list of connections for a source neuron
            end
          end)

        # reduce each source neuron with collected outgoing connections
        NeuralNet.Layer.neurons(input_layer_name)
        |> Stream.with_index
        |> Enum.map(fn tuple ->
          {neuron, index} = tuple
          %NeuralNet.Neuron{neuron | outgoing: Enum.at(input_layer_outgoing_connections, index)}
        end)
      end
  37. This is painful :(

  38. None
  39. https://howistart.org/posts/elixir/1

  40. “…when we need to keep some sort of state, like the data transferring through a portal,
      we must use an abstraction that stores this state for us. One such abstraction in Elixir
      is called an agent.” - José Valim
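      A minimal, self-contained example of that idea, keeping a counter's state in an agent
      (the names and values are illustrative, not from the talk):

        # Start an agent holding a map, then read and update its state.
        {:ok, pid} = Agent.start_link(fn -> %{count: 0} end)
        Agent.get(pid, & &1)                                     #=> %{count: 0}
        Agent.update(pid, fn state -> %{state | count: state.count + 1} end)
        Agent.get(pid, &Map.get(&1, :count))                     #=> 1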
  41. None
  42. PIDs

      defmodule NeuralNetwork.Connection do
        defstruct pid: nil, source_pid: nil, target_pid: nil, weight: 0.4
        …
      defmodule NeuralNetwork.Neuron do
        defstruct pid: nil, input: 0, output: 0, incoming: [], outgoing: [], bias?: false, delta: 0
        …
      defmodule NeuralNetwork.Layer do
        defstruct pid: nil, neurons: []
        …
      defmodule NeuralNetwork.Network do
        defstruct pid: nil, input_layer: nil, output_layer: nil, hidden_layers: [], error: 0
        …
  43. Agents are cool. PID management became a thing.

  44. neuron.ex

      def start_link(neuron_fields \\ %{}) do
        {:ok, pid} = Agent.start_link(fn -> %Neuron{} end)
        update(pid, Map.merge(neuron_fields, %{pid: pid}))
        {:ok, pid}
      end

      def update(pid, neuron_fields) do
        Agent.update(pid, &(Map.merge(&1, neuron_fields)))
      end

      def get(pid), do: Agent.get(pid, &(&1))
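      Using those three functions together might look like this (a sketch based only on the
      definitions above):

        {:ok, pid} = NeuralNetwork.Neuron.start_link(%{input: 1})
        NeuralNetwork.Neuron.get(pid).input    #=> 1
        NeuralNetwork.Neuron.update(pid, %{output: 0.5})
        NeuralNetwork.Neuron.get(pid).output   #=> 0.5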
  45. Diagram: source_neuron in Layer1 has outgoing: [connection1 PID, …]; target_neuron in
      Layer2 has incoming: [connection1 PID, …]; connection1 has source: source_neuron PID
      and target: target_neuron PID. Data is being transformed but NOT duplicated. It stays
      in sync and thus becomes easier to transform later. source_neuron, target_neuron, and
      connection1 are defined and accessed via PID.
  46. “Connecting layers” is easier!

  47. def connect(input_layer_pid, output_layer_pid) do
        input_layer = get(input_layer_pid)

        unless contains_bias?(input_layer) do
          {:ok, pid} = Neuron.start_link(%{bias?: true})
          input_layer_pid |> add_neurons([pid])
        end

        for source_neuron <- get(input_layer_pid).neurons,
            target_neuron <- get(output_layer_pid).neurons do
          Neuron.connect(source_neuron, target_neuron)
        end
      end

      That’s it! For real!
  48. :D

  49. Transformations are important! |>

  50. Macro data transformations can be painful:
      - trying to transform a whole neural network is hard
      - maintaining the state of a neural network is hard
      Micro data transformations (|> |> |>) are beautiful and should be used liberally:
      reconstructing manageable data from one form into a more meaningful form given the
      context it is in.
  51. Train the Network

  52. def run(args) do
        gate_name = args |> List.first

        # Setup network
        {:ok, network_pid} = NeuralNetwork.Network.start_link([2, 1])

        # grab data set
        data = NeuralNetwork.DataFactory.gate_for(gate_name)

        # Run trainer
        NeuralNetwork.Trainer.train(network_pid, data, %{epochs: 10_000, log_freqs: 1000})
      end
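      Called with run(["or"]), for example, this would build a network with a 2-neuron input
      layer and a 1-neuron output layer, fetch the OR gate samples from the data factory, and
      train for 10,000 epochs, logging the error every 1,000.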
  53. data = NeuralNetwork.DataFactory.gate_for(gate_name)

      @or_gate [
        %{input: [0, 0], output: [0]}, %{input: [0, 1], output: [1]},
        %{input: [1, 0], output: [1]}, %{input: [1, 1], output: [1]}
      ]
      @and_gate [
        %{input: [0, 0], output: [0]}, %{input: [0, 1], output: [0]},
        %{input: [1, 0], output: [0]}, %{input: [1, 1], output: [1]}
      ]
      @xor_gate [
        %{input: [0, 0], output: [0]}, %{input: [0, 1], output: [1]},
        %{input: [1, 0], output: [1]}, %{input: [1, 1], output: [0]}
      ]
      @nand_gate [
        %{input: [0, 0], output: [1]}, %{input: [0, 1], output: [1]},
        %{input: [1, 0], output: [1]}, %{input: [1, 1], output: [0]}
      ]
  54. @doc """
      Iris flower data set. The output labels are:
      Iris setosa, Iris versicolor, Iris virginica.
      https://www.wikiwand.com/en/Iris_flower_data_set
      4 inputs, 3 outputs, 150 samples
      """
      @iris_flower [
        %{input: [5.1, 3.5, 1.4, 0.2], output: [1, 0, 0]}, %{input: [4.9, 3.0, 1.4, 0.2], output: [1, 0, 0]},
        %{input: [4.7, 3.2, 1.3, 0.2], output: [1, 0, 0]}, %{input: [4.6, 3.1, 1.5, 0.2], output: [1, 0, 0]},
        %{input: [5.0, 3.6, 1.4, 0.2], output: [1, 0, 0]}, %{input: [5.4, 3.9, 1.7, 0.4], output: [1, 0, 0]},
        %{input: [4.6, 3.4, 1.4, 0.3], output: [1, 0, 0]}, %{input: [5.0, 3.4, 1.5, 0.2], output: [1, 0, 0]},
        %{input: [4.4, 2.9, 1.4, 0.2], output: [1, 0, 0]}, %{input: [4.9, 3.1, 1.5, 0.1], output: [1, 0, 0]},
        %{input: [5.4, 3.7, 1.5, 0.2], output: [1, 0, 0]}, %{input: [4.8, 3.4, 1.6, 0.2], output: [1, 0, 0]},
        %{input: [4.8, 3.0, 1.4, 0.1], output: [1, 0, 0]}, %{input: [4.3, 3.0, 1.1, 0.1], output: [1, 0, 0]},
        %{input: [5.8, 4.0, 1.2, 0.2], output: [1, 0, 0]}, %{input: [5.7, 4.4, 1.5, 0.4], output: [1, 0, 0]},
        %{input: [5.4, 3.9, 1.3, 0.4], output: [1, 0, 0]}, %{input: [5.1, 3.5, 1.4, 0.3], output: [1, 0, 0]},
        …
  55. trainer.ex

      for epoch <- 0..epochs do
        average_error = Enum.reduce(data, 0, fn sample, sum ->
          Network.get(network_pid) |> Network.activate(sample.input)
          Network.get(network_pid) |> Network.train(sample.output)
          sum + Network.get(network_pid).error / data_length
        end)

        if rem(epoch, log_freqs) == 0 || epoch + 1 == epochs do
          IO.puts "Epoch: #{epoch} Error: #{unexponential(average_error)}"
        end
      end
  56. OR gate learning
      Epoch: 0     Error: 0.0978034950879825143
      Epoch: 1000  Error: 0.0177645755625382047
      Epoch: 2000  Error: 0.0065019384961036274
      Epoch: 3000  Error: 0.0032527653252166144
      Epoch: 4000  Error: 0.0019254900093371497
      Epoch: 5000  Error: 0.0012646710040632755
      Epoch: 6000  Error: 0.0008910514800247452
      Epoch: 7000  Error: 0.0006602873040322224
      Epoch: 8000  Error: 0.0005081961006147329
      Epoch: 9000  Error: 0.0004028528701046857
      Epoch: 9999  Error: 0.0003270377487769315
      Epoch: 10000 Error: 0.0003269728572615501
  57. Chart: error rate (0 to 0.1) vs. epochs (0 to 10000) - learning: error rate going down
  58. OR logic

  59. Other Elixir lessons and tips…

  60. |> (pipe operator) is beautiful

      network.hidden_layers
      |> Enum.reverse
      |> Enum.each(&(Layer.train(&1)))

      # vs.

      reversed = Enum.reverse(network.hidden_layers)
      Enum.each(reversed, &(Layer.train(&1)))
  61. Pattern Matching on Functions

      defp create_neurons(nil), do: []
      defp create_neurons(size) when size < 1, do: []
      defp create_neurons(size) when size > 0 do
        Enum.into 1..size, [], fn _ ->
          {:ok, pid} = Neuron.start_link
          pid
        end
      end

      neurons = create_neurons(nil) # []
      neurons = create_neurons(0)   # []
      neurons = create_neurons(3)   # […,…,…]
  62. $ mix docs
      Docs successfully generated.

  63. None
  64. TDD
      $ mix test.watch
      Running tests...
      .................................
      Finished in 0.1 seconds (0.1s on load, 0.01s on tests)
      33 tests, 0 failures
      Randomized with seed 956963
  65. None
  66. Dev Coop meetup group asked me to teach functional programming concepts to them using
      Elixir
  67. Code
      https://github.com/kblake/neural_network_elixir
      https://github.com/kblake/neural-net-elixir-v1 (failed attempt)
      https://github.com/kblake/neural_network (Ruby version)
      https://hex.pm/packages/neural_network

  68. @usertesting #UTLife

  69. Thank you! Karmen Blake @kblake

  70. RESOURCES
      ▸ My Neural Network resources:
      ▸ https://gist.github.com/kblake/55e8ef457075a80a1dc3
      ▸ http://www.unikaz.asia/en/content/why-it-neural-network-and-why-expanses-internet
      ▸ https://www.technologyreview.com/s/600889/google-unveils-neural-network-with-superhuman-ability-to-determine-the-location-of-almost/sdfsdf
      ▸ https://howistart.org/posts/elixir/1
  71. RESOURCES
      ▸ http://www.unikaz.asia/en/content/why-it-neural-network-and-why-expanses-internet
      ▸ http://tfwiki.net/mediawiki/images2/thumb/d/d3/G1toy_sons_of_cybertron_optimus_prime.jpg/300px-G1toy_sons_of_cybertron_optimus_prime.jpg
      ▸ http://fitandstrongdads.com/wp-content/uploads/2013/04/rocky-training-partner.jpg
      ▸ http://i.cbc.ca/1.3004920.1427053176!/cpImage/httpImage/image.jpg_gen/derivatives/16x9_620/jose-pirela.jpg
      ▸ http://www.lifeloveandsugar.com/wp-content/uploads/2014/09/Caramel_Apple_Layer_Cake4.jpg
      ▸ http://i.telegraph.co.uk/multimedia/archive/02839/pompeii_2839323b.jpg
      ▸ http://www.tnooz.com/wp-content/uploads/2014/02/child-eureka.jpg
      ▸ http://www.hookedgamers.com/images/134/the_matrix_path_of_neo/reviews/header_57_the_matrix_path_of_neo.jpg
      ▸ http://img.pandawhale.com/85469-Keanu-WHOA-gif-The-Matrix-Neo-jirD.gif
      ▸ http://images.military.com/media/military-fitness/improving-your-pft-run-time-image.jpg
      ▸ http://1.bp.blogspot.com/-hitbg0RMZ7c/TmfJAXq_T_I/AAAAAAAAJms/YBszXIeBdnM/s1600/mainstream.jpg
      ▸ http://hearinghealthmatters.org/hearingeconomics/files/2014/12/onion.jpg