
Can Neuroscience insights transform AI?

Lawrence Spracklen

July 10, 2021

Transcript

  1. Can Neuroscience insights transform AI? Dr. Lawrence Spracklen, Director, ML Architecture
  2. Numenta: Developing machine intelligence through neocortical theory
     • Understand how the brain works
     • Apply neocortical principles to AI
     Developed the “Thousand Brains” theory of how the neocortex works
  3. Artificial Neural Networks (ANNs)
     (Diagram: Input → Layer 1 → Layer 2 → Layer 3 → … → Layer N → Output)
     Dense, fully connected and computationally expensive
  4. Traditional approach to ANNs: perform matrix multiplications very fast
     • GPUs have become AI workhorses: 500+ trillion arithmetic operations per second per card
     • Hardware performance doubles every few years
     • Hardware cannot keep pace with growth in model size
     • Exploding AI costs: in 2018, BERT cost $6K+ to train; in 2020, GPT-3 cost $10M+ to train (a 17,000X increase in 3 years)
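To make the "matrix multiplications" point concrete, here is a minimal sketch of why fully-connected forward passes are so expensive: every input unit multiplies into every output unit, so the cost grows with the product of layer widths. The layer sizes below are illustrative, not from the talk.

```python
import numpy as np

def dense_forward(x, weights):
    """Forward pass through fully-connected layers, counting multiplies."""
    macs = 0  # multiply-accumulate operations performed
    for W in weights:
        macs += x.size * W.shape[1]   # every input feeds every output unit
        x = np.maximum(x @ W, 0.0)    # dense matmul + ReLU
    return x, macs

# Toy 3-layer network: 1024 -> 4096 -> 4096 -> 1000 (sizes are illustrative)
rng = np.random.default_rng(0)
dims = [1024, 4096, 4096, 1000]
weights = [rng.standard_normal((a, b)) * 0.01 for a, b in zip(dims, dims[1:])]
x = rng.standard_normal(dims[0])

out, macs = dense_forward(x, weights)
print(f"{macs:,} multiply-accumulates for one input")  # ~25 million
```

Even this toy network needs roughly 25 million multiply-accumulates per input; billion-parameter models scale this by several orders of magnitude, which is why hardware throughput dominates AI economics.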
  5. AI today: incredible progress, but at what cost?
     • Vast models: trillions of parameters
     • Expensive training: massive compute, power and training-data requirements
     • Catastrophic forgetting: static, task-specific models that can’t keep learning
     • Fragility: significant real-world dangers
     Still a long way from AGI (Artificial General Intelligence). Can we continue down this current path?
  6. Going forward:
     1. Improve model performance
     2. Decrease frequency of retraining
     3. Decrease training complexity
     • Both algorithms and hardware need to evolve; focusing on just one dimension doesn’t solve the problem (“faster horse” issues)
     • Ensuring synergy provides a lasting solution: hardware feasibility needs to influence algorithm evolution, and vice versa
  7. Can neuroscience help? Examine the neocortex:
     • Neuron interconnections are sparse
     • Neuron activations are sparse
     • Neurons are significantly more complex than AI’s point-neuron abstraction
     • Humans can learn from very limited examples
     Numenta’s roadmap
  8. Make models fast: sparse models
     • Deliver comparable accuracy with up to 20X fewer parameters
     • Also leverage activation sparsity for multiplicative benefits: 100X+ reduction in compute costs
     • Hardware needs to be capable of exploiting sparsity: efficiently avoid multiplying by the zeros!
     • 100X on FPGAs, 20X on CPUs with Numenta’s sparsity
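The "multiplicative benefits" claim can be illustrated by counting useful multiplies: if only a fraction of weights and a fraction of activations are nonzero, only products where both are nonzero matter, so the reductions compound. The 5% weight density and 10% activation density below are illustrative assumptions, not Numenta's published figures.

```python
import numpy as np

rng = np.random.default_rng(42)
n_in, n_out = 2048, 2048

# Dense baseline: every weight multiplies every activation.
dense_macs = n_in * n_out

# Sparse weights (~5% nonzero) and sparse activations (~10% nonzero).
W = rng.standard_normal((n_in, n_out))
W[rng.random(W.shape) > 0.05] = 0.0       # ~95% weight sparsity
x = rng.standard_normal(n_in)
x[rng.random(n_in) > 0.10] = 0.0          # ~90% activation sparsity

# Only rows with a nonzero activation, and only the nonzero weights in
# those rows, contribute a useful multiply.
active = np.flatnonzero(x)
useful_macs = int(np.count_nonzero(W[active, :]))

print(f"dense:  {dense_macs:,} MACs")
print(f"sparse: {useful_macs:,} MACs "
      f"(~{dense_macs / useful_macs:.0f}X fewer)")
```

In expectation the reduction is 1 / (0.05 × 0.10) = 200X, which is the multiplicative effect the slide refers to; realizing it in wall-clock time requires hardware that can actually skip the zeros rather than streaming dense matrices.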
  9. Always be learning: active dendrites
     • Point neurons only incorporate proximal synapses, a small proportion of a neuron’s total synapses
     • Extend artificial neurons to incorporate distal synapses; basal synapses are used to modulate neuron behavior
     • Applying context signals enables networks to learn multiple tasks and facilitates online continuous learning: relevant neurons are primed based on context
     • Unsupervised determination of context is critical
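A minimal sketch in the spirit of the active-dendrites idea described above: each unit keeps several dendritic "segments" that are matched against a context vector, and the best-matching segment gates (primes or suppresses) that unit's feedforward response. The class name, segment count, and sigmoid gating function are illustrative assumptions, not Numenta's implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ActiveDendriteLayer:
    """Point neurons extended with dendritic segments: a context signal is
    matched against each neuron's segments, and the strongest match
    modulates that neuron's feedforward activation."""

    def __init__(self, n_in, n_out, n_ctx, n_segments, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((n_in, n_out)) * 0.1              # proximal (feedforward) weights
        self.D = rng.standard_normal((n_out, n_segments, n_ctx)) * 0.1 # distal (dendritic) segments

    def forward(self, x, context):
        feedforward = x @ self.W        # proximal drive, as in a standard layer
        seg = self.D @ context          # (n_out, n_segments) context matches
        best = seg.max(axis=1)          # strongest segment per neuron
        return feedforward * sigmoid(best)  # dendritic gating / priming

layer = ActiveDendriteLayer(n_in=64, n_out=32, n_ctx=16, n_segments=4)
x = np.random.default_rng(1).standard_normal(64)
ctx_a = np.zeros(16); ctx_a[0] = 1.0    # one-hot context for task A
ctx_b = np.zeros(16); ctx_b[1] = 1.0    # one-hot context for task B

# The same input produces different activations under different contexts,
# which is what lets one network hold several tasks without interference.
out_a = layer.forward(x, ctx_a)
out_b = layer.forward(x, ctx_b)
```

The key design point is that context never drives the output directly; it only modulates which neurons respond strongly, so different tasks recruit different sub-networks from the same weights.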
  10. Reduce learning repetition: reference frames
      • Training an ANN to recognize even a cup requires many images: hundreds of pictures of cups at different orientations, distances, designs and colors
      • Separate the problem into two base components: an invariant representation of the object, and an understanding of the positional relationship to it
      • Create robust, position-independent representations of objects; make observer orientation and distance explicit considerations
      • Inspired by human grid cells: object-independent, and significantly reduces the number of training examples
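A toy sketch of the separation described above: store an object as features at locations in its own reference frame, and recognize it by asking whether a single rigid transform maps the sensed locations onto the stored ones. This is 2D and translation-only for brevity; the object model, feature names, and matching rule are all illustrative assumptions, not Numenta's grid-cell mechanism.

```python
import numpy as np

# An "object model" is a set of (feature, location) pairs expressed in the
# object's own reference frame -- independent of where the observer stands.
cup = {
    "handle": np.array([2, 0]),
    "rim":    np.array([0, 3]),
    "base":   np.array([0, 0]),
}

def observe(model, observer_offset):
    """Sense the object from some viewpoint: locations arrive in
    observer-centric coordinates (object coordinates + unknown offset)."""
    return {f: loc + observer_offset for f, loc in model.items()}

def matches(model, observation):
    """Recognition: the observation fits the model if one rigid
    translation maps every sensed location onto the stored one."""
    if set(observation) != set(model):
        return False
    offsets = [observation[f] - model[f] for f in observation]
    return all(np.array_equal(o, offsets[0]) for o in offsets)

# The same stored model recognizes the cup from any viewpoint -- there is
# no need to train on every observer position separately.
view1 = observe(cup, np.array([10, -4]))
view2 = observe(cup, np.array([-7, 99]))
print(matches(cup, view1), matches(cup, view2))  # True True
```

Because the representation is position-independent and the observer's displacement is solved for explicitly, one stored model covers every viewpoint, which is the training-example reduction the slide claims.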
  11. Conclusions
      • Continued progress in AI is threatened by exponentially increasing costs
      • The neocortex provides critical insights into how AI should evolve
      • Numenta has developed a neocortex-inspired roadmap to AGI
      • Already demonstrated 100X AI model speedups using brain-inspired sparsity
      • Working to incorporate continual learning and positionally invariant representations into AI systems, reducing both retraining frequency and the number of training examples
      • Cumulative benefits reduce AI costs by many orders of magnitude
  12. THANK YOU Questions? lspracklen@numenta.com https://numenta.com