Behind the Scenes with Auto Layout or How to Solve Constraints with the Cassowary Algorithm (iOSConfSG)

Auto Layout is the most powerful tool iOS developers have today for creating adaptive user interfaces. The power of this technology lies in how simple its foundations are. The engineers behind Auto Layout at Apple have recently started to reveal more of its internals, helping us understand its principles and get the best performance out of our layout code. Let's look more closely at how the Auto Layout engine solves constraints by exploring the Cassowary algorithm that runs under the hood, turning constraints into linear equations and solving them. By doing the math ourselves, we'll better understand which layouts are easier or harder to satisfy, and which coding practices help us achieve the best Auto Layout performance.

Agnes Vasarhelyi

January 18, 2019

Transcript

  1. BEHIND THE SCENES WITH AUTO LAYOUT, or how to solve constraints with the Cassowary algorithm. Agnes Vasarhelyi, @vasarhelyia
  2. Your input • Satisfiable, non-ambiguous layouts • Size and position of content • Constraints (either in code or in Interface Builder)
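
In code, that input might look like the minimal sketch below (the view and controller names are made up for illustration): a position and a size for the view, which keeps the layout satisfiable and non-ambiguous.

    import UIKit

    final class BadgeViewController: UIViewController {
        let badge = UIView()

        override func viewDidLoad() {
            super.viewDidLoad()
            badge.translatesAutoresizingMaskIntoConstraints = false
            view.addSubview(badge)
            // Position (leading + top) and size (width + height): enough constraints
            // for a satisfiable, non-ambiguous layout.
            NSLayoutConstraint.activate([
                badge.leadingAnchor.constraint(equalTo: view.leadingAnchor, constant: 20),
                badge.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor, constant: 20),
                badge.widthAnchor.constraint(equalToConstant: 44),
                badge.heightAnchor.constraint(equalToConstant: 44)
            ])
        }
    }

The same constraints could equally be laid down in Interface Builder; either way the engine receives the same set of linear relationships.
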
  3. The Render Loop • 1. Constraints: updateConstraints(), setNeedsUpdateConstraints(), updateConstraintsIfNeeded() • 2. Layout: layoutSubviews(), setNeedsLayout(), layoutIfNeeded() • 3. Display: draw(_:), setNeedsDisplay()
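
As a rough sketch of how a view hooks into these phases (the GaugeView class and its property are hypothetical): the setNeeds... methods only mark work as pending, the engine runs the corresponding pass on the next turn of the render loop, and the ...IfNeeded() variants force the pass immediately.

    import UIKit

    final class GaugeView: UIView {
        var level: CGFloat = 0 {
            didSet {
                // Mark the constraint and layout passes dirty; they run on the next render loop pass.
                setNeedsUpdateConstraints()
                setNeedsLayout()
            }
        }

        override func updateConstraints() {
            // 1. Constraints pass.
            super.updateConstraints()
        }

        override func layoutSubviews() {
            super.layoutSubviews()
            // 2. Layout pass: frames computed by the engine are in place here.
        }

        override func draw(_ rect: CGRect) {
            // 3. Display pass, scheduled with setNeedsDisplay().
        }
    }
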
  4. Auto Layout Engine [diagram: View and AL Engine] The view asks the engine "What are the values for this frame?"; the engine stores the values.
  5. Auto Layout Engine [diagram: View and AL Engine] A value changes; the view asks the engine "What are the values for this frame?"; the engine stores the values.
  6. Auto Layout Engine [diagram: View and AL Engine] A value changes, setNeedsLayout() is scheduled, and layoutSubviews() asks the engine "What are the values for this frame?"; the engine stores the values.
  7. WWDC 2018 • Layout performance scales linearly • Constraint inequalities are not expensive • Error minimization has a cost • Constraint solver algorithm: Simplex (Cassowary)
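
The point about inequalities translates directly to layout code: an inequality anchor is one constraint with a single extra slack variable for the solver, as in this small sketch (the label is hypothetical).

    import UIKit

    let label = UILabel()
    label.translatesAutoresizingMaskIntoConstraints = false
    // One inequality: the label may be narrower, but never wider than 240 points.
    label.widthAnchor.constraint(lessThanOrEqualToConstant: 240).isActive = true
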
  8. Cassowary paper '97 • Based on the Simplex linear programming algorithm (1940s) • Constraint solving • UI applications • Incremental, optimized
  9. Performance: incremental, optimized • Similar problems solved repeatedly • Add / remove / edit constraints efficiently
  10. Linear Programming • Maximizing or minimizing a model • Requirements = linear equations / inequalities • Objective function = goal
  11. Constraint solving as an LP problem • Objective function: minimizing distance / error • Constraints: linear equations / inequalities
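
In its general form, a linear program asks for variable values x_1 ... x_n that minimize (or maximize) a linear objective c_1·x_1 + ... + c_n·x_n subject to linear requirements of the form a_1·x_1 + ... + a_n·x_n ≤ b (or = b, or ≥ b). For layout, the variables are positions and sizes, and the objective measures the distance, or error, between the current values and what the constraints ask for.
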
  12. Let's solve constraints! [diagram: views a and b along the x axis] x_a = 15, width_a = 30, x_b = x_a + width_a + 10
  13. Let's solve constraints! x_a = 15, width_a = 30, x_b = x_a + width_a + 10, so x_b = 15 + 30 + 10 = 55
  14. Let's solve constraints! x_a = 15, width_a = 30, x_b = x_a + width_a + 10, so x_b = 15 + 30 + 10 = 55
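
Expressed as layout code, those three constraints might look like the sketch below (boxA, boxB and container are invented names); each anchor constraint is one linear equation handed to the engine, and substitution gives x_b = 55 just as above.

    import UIKit

    let container = UIView()
    let boxA = UIView(), boxB = UIView()
    [boxA, boxB].forEach {
        $0.translatesAutoresizingMaskIntoConstraints = false
        container.addSubview($0)
    }
    NSLayoutConstraint.activate([
        boxA.leadingAnchor.constraint(equalTo: container.leadingAnchor, constant: 15), // x_a = 15
        boxA.widthAnchor.constraint(equalToConstant: 30),                              // width_a = 30
        boxB.leadingAnchor.constraint(equalTo: boxA.trailingAnchor, constant: 10)      // x_b = x_a + width_a + 10
    ])
    // The engine resolves boxB's x origin to 15 + 30 + 10 = 55.
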
  15. [diagram: x_left, x_middle, x_right along the x axis] x_left ≤ x_middle ≤ x_right, x_left ≥ 0, x_right ≤ 100
  16. [diagram: x_left, x_middle, x_right along the x axis] x_middle = (x_left + x_right) / 2, x_left + 10 ≤ x_right
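
Taken together, this is the system the rest of the walkthrough solves: x_middle = (x_left + x_right) / 2, x_left + 10 ≤ x_right, x_left ≥ 0, x_right ≤ 100. From here on the slides abbreviate the variables to x_l, x_m and x_r.
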
  17. Cassowary: 1. Augmented Simplex Form 2. Basic Feasible Solved Form 3. Basic Feasible Solution 4. Simplex Optimization
  18. Converting to equalities. Original constraints: 2x_m = x_l + x_r, x_l + 10 ≤ x_r, x_l ≥ 0, x_r ≤ 100
  19. Converting to equalities. Original constraints: 2x_m = x_l + x_r, x_l + 10 ≤ x_r, x_l ≥ 0, x_r ≤ 100
  20. Converting to equalities: 2x_m = x_l + x_r
  21. Converting to equalities: 2x_m = x_l + x_r, x_l + 10 + s_1 = x_r
  22. Converting to equalities: 2x_m = x_l + x_r, x_l + 10 + s_1 = x_r
  23. Converting to equalities: 2x_m = x_l + x_r, x_l + 10 + s_1 = x_r, x_r + s_2 = 100
  24. Converting to equalities: 2x_m = x_l + x_r, x_l + 10 + s_1 = x_r, x_r + s_2 = 100
  25. Converting to equalities: 2x_m = x_l + x_r, x_l + 10 + s_1 = x_r, x_r + s_2 = 100, 0 ≤ x_l, s_1, s_2
  26. Converting to equalities: 2x_m = x_l + x_r, x_l + 10 + s_1 = x_r, x_r + s_2 = 100, 0 ≤ x_l, s_1, s_2 (s_1, s_2: slack variables)
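
The rule applied above is the standard slack-variable trick: any inequality l ≤ r becomes the equality l + s = r with a new variable s ≥ 0. That is how x_l + 10 ≤ x_r turns into x_l + 10 + s_1 = x_r, and x_r ≤ 100 into x_r + s_2 = 100, leaving the solver with only equalities plus non-negativity restrictions.
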
  27. minimize x_m − x_l subject to: 2x_m = x_l + x_r, x_l + 10 + s_1 = x_r, x_r + s_2 = 100, 0 ≤ x_l, s_1, s_2
  28. The same system split into C_U: 2x_m = x_l + x_r and C_S: x_l + 10 + s_1 = x_r, x_r + s_2 = 100, 0 ≤ x_l, s_1, s_2 (C_U: unrestricted constraints, C_S: Simplex constraints)
  29. The same system split into C_U: 2x_m = x_l + x_r and C_S: x_l + 10 + s_1 = x_r, x_r + s_2 = 100, 0 ≤ x_l, s_1, s_2 (C_U: unrestricted constraints, C_S: Simplex constraints)
  30. Substituting x_r = 100 − s_2: 2x_m = x_l + 100 − s_2, x_l + 10 + s_1 = 100 − s_2, x_r = 100 − s_2, 0 ≤ x_l, s_1, s_2 (C_U / C_S)
  31. Substituting x_r = 100 − s_2: 2x_m = x_l + 100 − s_2, x_l + 10 + s_1 = 100 − s_2, x_r = 100 − s_2, 0 ≤ x_l, s_1, s_2 (C_U / C_S)
  32. minimize x_m − x_l subject to: 2x_m = x_l + 100 − s_2, x_l + 10 + s_1 = 100 − s_2, x_r = 100 − s_2, 0 ≤ x_l, s_1, s_2 (C_U / C_S)
  33. minimize x_m − x_l subject to: x_l + 10 + s_1 = 100 − s_2, x_r = 100 − s_2, 0 ≤ x_l, s_1, s_2 (the 2x_m row is being solved for x_m)
  34. minimize x_m − x_l subject to: x_m = 50 + (1/2)x_l − (1/2)s_2, x_l + 10 + s_1 = 100 − s_2, x_r = 100 − s_2, 0 ≤ x_l, s_1, s_2
  35. subject to: x_m = 50 + (1/2)x_l − (1/2)s_2, x_l + 10 + s_1 = 100 − s_2, x_r = 100 − s_2, 0 ≤ x_l, s_1, s_2 (the objective is rewritten next)
  36. minimize 50 − (1/2)x_l − (1/2)s_2 subject to: x_m = 50 + (1/2)x_l − (1/2)s_2, x_l + 10 + s_1 = 100 − s_2, x_r = 100 − s_2, 0 ≤ x_l, s_1, s_2
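
The rewritten objective follows from substituting the solved x_m row into x_m − x_l: (50 + (1/2)x_l − (1/2)s_2) − x_l = 50 − (1/2)x_l − (1/2)s_2, so the objective is now expressed purely in the parametric variables x_l and s_2.
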
  37. 2. Basic Feasible Solved Form: x_0 = c + a_1·x_1 + ... + a_n·x_n, where x_0 is basic, x_1 ... x_n are parameters (C_S), and c ≥ 0
  38. 2. Basic Feasible Solved Form: minimize 50 − (1/2)x_l − (1/2)s_2 subject to: x_m = 50 + (1/2)x_l − (1/2)s_2, x_r = 100 − s_2, s_1 = 90 − x_l − s_2 (x_m, x_r, s_1 are basic; x_l, s_2 are parametric)
  39. 2. Basic Feasible Solved Form: minimize 50 − (1/2)x_l − (1/2)s_2 subject to: x_m = 50 + (1/2)x_l − (1/2)s_2, x_r = 100 − s_2, s_1 = 90 − x_l − s_2 (x_m, x_r, s_1 are basic; x_l, s_2 are parametric)
  40. 3. Basic Feasible Solution: x_r = 100, x_m = 50, s_1 = 90, x_l = 0, s_2 = 0. Objective function: 50 − (1/2)x_l − (1/2)s_2
  41. 3. Basic Feasible Solution: x_r = 100, x_m = 50, s_1 = 90, x_l = 0, s_2 = 0. Objective function: 50 − (1/2)x_l − (1/2)s_2 = 50
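
As a toy illustration of the solved form and its basic feasible solution (the Row type and variable names are invented, not the engine's real data structures), each basic variable is a constant plus coefficients over the parametric variables, and the solution falls out by setting every parameter to zero:

    // Basic variable = constant + coefficients over the parametric variables.
    typealias Row = (constant: Double, coefficients: [String: Double])

    // The tableau from the basic feasible solved form: x_m, x_r, s_1 are basic; x_l, s_2 are parametric.
    let tableau: [String: Row] = [
        "x_m": (constant: 50,  coefficients: ["x_l": 0.5, "s_2": -0.5]),
        "x_r": (constant: 100, coefficients: ["s_2": -1.0]),
        "s_1": (constant: 90,  coefficients: ["x_l": -1.0, "s_2": -1.0])
    ]
    let objective: Row = (constant: 50, coefficients: ["x_l": -0.5, "s_2": -0.5])

    // Parametric variables are 0, so each basic variable takes its constant:
    // x_m = 50, x_r = 100, s_1 = 90, and the objective evaluates to 50.
    let solution = tableau.mapValues { $0.constant }
    print(solution, objective.constant)
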
  42. A basic feasible solution: x_r = 100, x_m = 50, x_l = 0, objective = 50 [diagram: x_left, x_middle, x_right along the x axis at 0, 50, 100]
  43. A basic feasible solution: x_r = 100, x_m = 50, x_l = 0, objective = 50. Original constraints: x_m = (x_l + x_r) / 2, x_l + 10 ≤ x_r, x_l ≥ 0, x_r ≤ 100 [diagram: x_left, x_middle, x_right along the x axis at 0, 50, 100]
  44. 4. Simplex Optimization: 1. Find an (adjacent) basic feasible solved form 2. Pivot: swap out basic and parametric variables to improve f_obj
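
In this example the pivot choice is forced by the signs: the objective 50 − (1/2)x_l − (1/2)s_2 still decreases if x_l grows, and the row s_1 = 90 − x_l − s_2 caps x_l at 90 before s_1 would go negative. So s_1 leaves the basis, x_l enters as x_l = 90 − s_1 − s_2, and substituting that into the remaining rows and the objective produces the tableau on the following slides, whose objective 5 + (1/2)s_1 has no negative coefficients left, meaning it cannot be improved further.
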
  45. 4. Simplex Optimization: minimize 50 − (1/2)x_l − (1/2)s_2 subject to: x_m = 50 + (1/2)x_l − (1/2)s_2, x_r = 100 − s_2, s_1 = 90 − x_l − s_2
  46. 4. Simplex Optimization: minimize 50 − (1/2)x_l − (1/2)s_2 subject to: x_m = 50 + (1/2)x_l − (1/2)s_2, x_r = 100 − s_2, s_1 = 90 − x_l − s_2
  47. 4. Simplex Optimization: minimize 50 − (1/2)x_l − (1/2)s_2 subject to: x_m = 50 + (1/2)x_l − (1/2)s_2, x_r = 100 − s_2, s_1 = 90 − x_l − s_2
  48. 4. Simplex Optimization: pivot x_l into the basis (s_1 leaves): x_l = 90 − s_1 − s_2
  49. 4. Simplex Optimization: after the pivot, x_m = 95 − (1/2)s_1 − s_2, x_r = 100 − s_2, x_l = 90 − s_1 − s_2
  50. 4. Simplex Optimization: after the pivot, x_m = 95 − (1/2)s_1 − s_2, x_r = 100 − s_2, x_l = 90 − s_1 − s_2
  51. 4. Simplex Optimization: minimize 5 + (1/2)s_1 subject to: x_m = 95 − (1/2)s_1 − s_2, x_r = 100 − s_2, x_l = 90 − s_1 − s_2
  52. 4. Simplex Optimization: minimize 5 + (1/2)s_1 subject to: x_m = 95 − (1/2)s_1 − s_2, x_r = 100 − s_2, x_l = 90 − s_1 − s_2 (x_m, x_r, x_l are basic; s_1, s_2 are parametric)
  53. 4. Simplex Optimization: minimize 5 + (1/2)s_1 subject to: x_m = 95 − (1/2)s_1 − s_2, x_r = 100 − s_2, x_l = 90 − s_1 − s_2 (x_m, x_r, x_l are basic; s_1, s_2 are parametric)
  54. 4. Simplex Optimization: minimize 5 + (1/2)s_1 subject to: x_m = 95 − (1/2)s_1 − s_2, x_r = 100 − s_2, x_l = 90 − s_1 − s_2; objective value = 5 (x_m, x_r, x_l are basic; s_1, s_2 are parametric)
  55. Optimal solution: x_r = 100, x_m = 95, x_l = 90, objective = 5 [diagram: x_l, x_m, x_r along the x axis at 90, 95, 100]
  56. Optimal solution: x_r = 100, x_m = 95, x_l = 90, objective = 5. Original constraints: x_m = (x_l + x_r) / 2, x_l + 10 ≤ x_r, x_l ≥ 0, x_r ≤ 100 [diagram: x_l, x_m, x_r along the x axis at 90, 95, 100]
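
Checking the result against the original constraints: x_m = (90 + 100) / 2 = 95, x_l + 10 = 100 ≤ x_r = 100, x_l = 90 ≥ 0 and x_r = 100 ≤ 100 all hold, and the objective x_m − x_l = 5 is as small as the required 10-point gap between x_l and x_r allows.
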
  57. Adding constraints • Convert to augmented simplex form • Substitute the new constraint in using the current tableau • Reach basic feasible solved form
  58. Removing constraints • Marker variables to keep track of original constraints • Pivot until marker is basic • Remove row
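
On the UIKit side these incremental edits correspond to activating and deactivating constraints; the solver updates its existing tableau rather than re-solving from scratch. A hypothetical sketch of switching between two constraint sets:

    import UIKit

    var compactConstraints: [NSLayoutConstraint] = []
    var regularConstraints: [NSLayoutConstraint] = []

    // Each activation or deactivation is an incremental add / remove for the solver.
    func apply(compactLayout: Bool) {
        NSLayoutConstraint.deactivate(compactLayout ? regularConstraints : compactConstraints)
        NSLayoutConstraint.activate(compactLayout ? compactConstraints : regularConstraints)
    }
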
  59. Constraint priorities • Weights added to constraints • Makes objective function non-linear • Quasi-linear optimization
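
In code, a weight is just a priority below .required: the constraint becomes an error term the solver tries to minimize instead of a hard requirement, which is where the extra cost comes from. A small sketch with a hypothetical banner view:

    import UIKit

    let banner = UIView()
    banner.translatesAutoresizingMaskIntoConstraints = false

    // Preferred width, allowed to give way to required constraints.
    let preferredWidth = banner.widthAnchor.constraint(equalToConstant: 320)
    preferredWidth.priority = .defaultHigh   // 750: weighted, breakable

    // Hard lower bound that must always hold.
    let minimumWidth = banner.widthAnchor.constraint(greaterThanOrEqualToConstant: 280)
    minimumWidth.priority = .required        // 1000: never broken

    NSLayoutConstraint.activate([preferredWidth, minimumWidth])
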
  60. WWDC '18 • Performance scales linearly (in independent views) • Inequalities are not expensive (one extra variable) • Error minimization has a cost (constraint priorities)
  61. WWDC '18 • Performance scales linearly (in independent views) • Inequalities are not expensive (one extra variable) • Error minimization has a cost (constraint priorities)
  62. WWDC '18 • Performance scales linearly (in independent views) • Inequalities are not expensive (one extra variable) • Error minimization has a cost (constraint priorities)
  63. WWDC '18 • Performance scales linearly (in independent views) • Inequalities are not expensive (one extra variable) • Error minimization has a cost (constraint priorities)
  64. Resources • Auto Layout Documentation • High Performance Auto Layout, WWDC '18 • Cassowary paper '97 • The Simplex Method: An Example (UT Dallas)