
The Shape of U -- Befriending Tensors

Nishant Sinha
February 01, 2020


The tensor (an n-dimensional array) is the principal software abstraction for designing and coupling the components of modern machine learning systems. Unfortunately, writing code against popular tensor libraries is very challenging. First, the libraries expose developers to a low-level, physical memory model, forcing them to translate continuously between their semantic mental model and the physical one. Second, the broadcast semantics between tensors of different ranks are not robust and lead to surprising errors. Finally, in the absence of a common language for specifying tensor shapes explicitly, developers are compelled to write ad hoc comments and woefully cryptic code.

In this talk,

• We present a new language to specify tensor shapes and their transformations by attaching "semantic" names to individual tensor dimensions. The shape annotations are library-agnostic, improve code readability, and make it easy to write semantic shape assertions and transformations.
• We present the tsalib library (http://github.com/ofnote/tsalib), which integrates named shape annotations into existing Python tensor programs seamlessly using optional type annotations. With examples (self-attention in Transformers, convolution sequences in ResNets), we illustrate how shape annotations dramatically improve code readability and even reduce code size.
• We also discuss a tool for Python code, tsanley (http://github.com/ofnote/tsanley), which (a) helps catch tricky shape errors at runtime and (b) improves understanding of third-party code by annotating it with shape annotations.


Transcript

  1. The Shape of U: Befriending Tensors
    Nishant Sinha, Founder and Chief Scientist, OffNote Labs (nishant@offnote.co)
  2. Key Ingredients of Software 2.0 Design
    • Example (X, Y): Generator / Annotator • Inductive Biases, Performance: Model • The Diff Engine: Loss Function • Knob Tweaker: Optimizers • Tensors: numpy, tensorflow, pytorch, mxnet, ...
  3. As you dive into 2.0 ...
    Weird shape modifiers that hijack your code: reshape/view, permute/transpose, expand, dot, matmult, batch_matmult, reduce_*, select APIs
  4. Talk Roadmap
    • Getting to know Tensors ◦ Dimensions and Shapes, Memory and Mental Models ◦ Flaws in Tensor APIs • Befriending Tensors ◦ Tsalib: named shapes via Python types, library-agnostic ◦ Tsanley: a dynamic checker for named shapes
  5. Tensor Representations: Physical Layer
    A tensor maps an index tuple to a value: T: (i1, ..., in) -> v. Image source: https://stackoverflow.com/questions/32034237/how-does-numpys-transpose-method-permute-the-axes-of-an-array
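To make the physical layer concrete, here is a minimal pure-Python sketch of the T: (i1, ..., in) -> v map for a row-major (C-contiguous) layout. The function names are illustrative, not from any tensor library; real libraries store a strides tuple alongside the flat buffer in exactly this spirit.

```python
def row_major_strides(shape):
    """Strides (in elements) for a C-contiguous (row-major) layout:
    the last dimension varies fastest."""
    strides = []
    acc = 1
    for dim in reversed(shape):
        strides.append(acc)
        acc *= dim
    return tuple(reversed(strides))

def flat_index(index, strides):
    """Map a multi-dimensional index (i1, ..., in) to an offset
    into the flat one-dimensional buffer."""
    return sum(i * s for i, s in zip(index, strides))

shape = (2, 3, 4)
strides = row_major_strides(shape)        # (12, 4, 1)
offset = flat_index((1, 2, 3), strides)   # 1*12 + 2*4 + 3*1 = 23
```

This is exactly why operations like transpose can be "free": they permute the strides tuple without touching the buffer, which is also why the API stays tied to the physical, indexed layout.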
  6. Why are Tensor Programs hard to write?
    Tensor Libraries (numpy, tensorflow, pytorch, ...): A. Expose developers to the physical memory model ◦ unable to enforce a semantic view of the data B. Implicit / ad hoc broadcast semantics ◦ esoteric bugs! C. Hard to write tensor transformations ◦ tensor shapes are latent across the program. T: (i1, ..., in) -> v
  7. A. Exposure to a Low-level Memory Model
    The API is tied to the physical, indexed layout. There is no semantic notion of an 'axis'.
  8. B. Ad hoc Broadcast Semantics
    Rule 1: Make the ranks equal by padding the shorter shape with 1s on the left. Rule 2: On a mismatch in the same dimension, stretch a size-1 dimension to the larger size. Example: (1, 32)
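The two rules fit in a few lines of pure Python. The sketch below mirrors numpy-style broadcast semantics (the function name `broadcast_shape` is illustrative, not a library call) and shows the kind of silent surprise the slide warns about:

```python
def broadcast_shape(a, b):
    """Broadcast result shape of two shapes, per the two rules above."""
    # Rule 1: pad the shorter shape with 1s on the left.
    n = max(len(a), len(b))
    a = (1,) * (n - len(a)) + tuple(a)
    b = (1,) * (n - len(b)) + tuple(b)
    out = []
    for da, db in zip(a, b):
        if da == db or da == 1 or db == 1:
            # Rule 2: a size-1 dimension stretches to the larger size.
            out.append(max(da, db))
        else:
            raise ValueError(f"incompatible dimensions {da} and {db}")
    return tuple(out)

# The esoteric bug: a (32,) vector against a (32, 1) column quietly
# broadcasts to a (32, 32) matrix, even if the programmer meant an
# elementwise op between two length-32 vectors.
broadcast_shape((32,), (32, 1))   # -> (32, 32)
```

No error is raised; the mistake only surfaces downstream, far from its cause, which is what makes these bugs so hard to trace.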
  9. C. Hard to Read / Write Tensor-manipulating Code
    Shapes are latent ◦ no standard way to track them ◦ ad hoc comments
  10. Proposals for Naming Tensor Dimensions
    • No consensus on named APIs • Require deep changes to tensor libraries • Long due, no action ◦ Recently, PyTorch added named-dimension support ◦ Mesh-TensorFlow, Tensor-Networks ◦ xarray (Alexander Rush)
  11. tsalib: A Tensor Shape Annotation Library
    • Goals ◦ Enable tracking tensor shapes in programs, as first-class citizens ◦ A language for tensor shapes ▪ Write crisp, intuitive shape transformations (no cryptic code!) ▪ Semantic (named) shape assertions ▪ Allow abstraction ◦ Integrated with Python, immediately usable ▪ Works with arbitrary tensor backends, avoiding deep integration • Insight: use Python 3.x (optional) type annotations
  12. Resnet.forward
    Full model: https://github.com/ofnote/tsalib/blob/master/models/resnet.py
  13. Named Dimensions --> Grammar of Named Shapes
    Shape Transformation API
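One way to picture such a grammar: a named-shape spec like 'b,t,d' can be checked against a concrete shape, binding each name to a size on first use and requiring consistency on reuse. The `check_shape` helper below is a hypothetical sketch of this idea in pure Python, not tsalib's actual API:

```python
def check_shape(shape, spec, env=None):
    """Check a concrete shape against a named-shape spec like 'b,t,d'.
    Named dimensions bind on first use and must match on reuse;
    integer literals must match exactly. Returns the binding env."""
    env = dict(env or {})
    names = spec.replace(' ', '').split(',')
    if len(names) != len(shape):
        raise ValueError(f"rank mismatch: '{spec}' vs {shape}")
    for name, size in zip(names, shape):
        if name.isdigit():                      # literal dimension
            if int(name) != size:
                raise ValueError(f"dimension {name} != {size}")
        elif name in env:                       # reused name: must agree
            if env[name] != size:
                raise ValueError(f"{name} bound to {env[name]}, saw {size}")
        else:                                   # first use: bind
            env[name] = size
    return env

env = check_shape((32, 10, 100), 'b,t,d')      # binds b=32, t=10, d=100
env = check_shape((32, 100, 10), 'b,d,t', env) # consistent reuse passes
```

Because the bindings thread through the program, a mismatch anywhere (say, a tensor arriving as 'b,d,t' where 'b,t,d' was declared) fails loudly at the assertion site rather than silently downstream.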
  14. From Transformer Attention Module (old vs new)
    https://github.com/huggingface/transformers/blob/master/transformers/modeling_gpt2.py
  15. Rewriting BERT with warp
    • Enhances code readability • Reduced the BERT attention_layer fn by ~25 lines (200 -> 175)
  16. tsalib: Lowering Named Shape Transformations
    Under the hood: • Sympy: symbolic expressions • Fast lookup, substitution
  17. Big -> Small Step Transformations
    'b*t,n,h ->> bnth'
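A big-step spec like the one above can be mechanically decomposed into small steps: a reshape that splits each product axis into its factors, followed by a permutation into the target order. The `decompose` function below is a simplified, hypothetical sketch of that idea in pure Python, not tsalib's implementation (it assumes single-letter dimension names on the target side):

```python
def decompose(src, tgt):
    """Decompose a big-step transformation src ->> tgt into two small
    steps: a reshape spec, the permutation tuple, and a permute spec."""
    src_axes = src.split(',')
    # Step 1: reshape splits every product axis ('b*t') into its factors.
    flat = [f for ax in src_axes for f in ax.split('*')]
    reshape_spec = f"{src} -> {','.join(flat)}"
    # Step 2: permute the flattened axes into the target order.
    perm = tuple(flat.index(a) for a in tgt)
    permute_spec = f"{','.join(flat)} -> {','.join(tgt)}"
    return reshape_spec, perm, permute_spec

decompose('b*t,n,h', 'bnth')
# -> ('b*t,n,h -> b,t,n,h', (0, 2, 1, 3), 'b,t,n,h -> b,n,t,h')
```

The permutation tuple is exactly what a backend call like `transpose`/`permute` expects, so lowering big steps to library calls from here is straightforward.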
  18. Tensors Considered Harmful
    http://nlp.seas.harvard.edu/NamedTensor
  19. Two Issues
    • Manually writing named shape annotations • Manually writing named shape assertions
  20. Tsanley: Dynamic Shape Checking
    • Interplay of AST parsing and Python tracing (trace, inspect) ◦ Filter a subset of functions to track • Piggyback runtime shape checks on trace callbacks ◦ Track the last executed statement in a function • Access concrete shapes from the runtime frame ◦ Match against named shape annotations ◦ Log shapes for post-hoc code annotation
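The tracing machinery can be sketched with the standard library alone: install a `sys.settrace` callback, filter frames by function name, and read concrete shapes out of `frame.f_locals` when a watched function returns. Everything below (`shape_tracer`, `FakeTensor`, `attention`) is illustrative scaffolding, not tsanley's actual code:

```python
import sys

def shape_tracer(watch_fns):
    """Build a trace callback that logs the shape of every local
    variable carrying a .shape attribute when a watched function
    returns."""
    log = {}
    def tracer(frame, event, arg):
        name = frame.f_code.co_name
        if name not in watch_fns:
            return None               # don't trace into other functions
        if event == 'return':
            # Access concrete shapes from the runtime frame's locals.
            log[name] = {k: tuple(v.shape)
                         for k, v in frame.f_locals.items()
                         if hasattr(v, 'shape')}
        return tracer                 # keep receiving events for this frame
    return tracer, log

class FakeTensor:                     # stand-in for a real tensor object
    def __init__(self, *shape):
        self.shape = shape

def attention(x):
    q = FakeTensor(32, 8, 10, 64)     # e.g. (batch, heads, time, head_dim)
    return q

tracer, log = shape_tracer({'attention'})
sys.settrace(tracer)
attention(FakeTensor(32, 10, 512))
sys.settrace(None)
# log['attention'] now maps local names to their concrete shapes
```

From here, the logged concrete shapes can be matched against named shape annotations, or dumped to annotate unfamiliar third-party code.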
  21. Open Source Research @ OffNote Labs
    Bridging the divide between real and imaginary research
  22. Summary: Befriending Tensors
    • Disconnect between the memory model of tensor libraries and the developer's semantic model • Naming dimensions helps bridge the disconnect ◦ A language for named shapes and transformations • Naming has multiple benefits ◦ improves code readability, enables shape assertions ◦ semantic shape transformations • Shapes for arbitrary data?