Avito Mobile: State of the Union

Egor Tolstoy
November 17, 2018

Size matters. That is true for the Avito app as well: it is developed by several dozen independent cross-functional teams, each with several iOS developers. Over the past couple of years we have run into a variety of scaling problems, some of which we have managed to solve. In this talk I share the techniques and tips that let us keep product quality high and ship a release every two weeks, while still giving developers maximum freedom in choosing their architecture and the way they solve their problems.

Transcript

  1. Avito Mobile

    State of the Union
    INSERT COIN TO CONTINUE

  2.

  3. Backend
    Mobile
    Frontend
    Product
    Designer
    QA

  4. Unit Seller, Unit Mnz

  5. Unit Seller
    Make sellers happy
    +15% items
    -34s create item

  6. Unit Mnz
    Make more money
    +40% $ from vas
    +35% $ from ads

  7. Mnz
    Seller
    Buyer
    Billing
    Growth

  8. > Stop the release, I'm finishing a feature
    > I don't want to use VIPER
    > App startup time is not my problem

  9. Product units
    > Work for end users
    Platform units
    > Work for product teams

  10. > Releases
    > Architecture
    > Performance

  11. > Releases
    > Architecture
    > Performance

  12. [Diagram: features ft.A, ft.C, ft.D, ft.E from Unit 1, Unit 2, Unit 3 go into Release 1]

  13. [Diagram: features ft.A-ft.I from Unit 1, Unit 2, Unit 3 are split across Release 1 and Release 2]

  14.

  15. > Release cycle
    [Timeline: feature freeze -> 1st regress end -> 2nd regress end -> staging -> release]

  16. Criteria:
    1. Closed in Jira
    2. Tested
    3. Has feature toggle
    > ... - feature freeze
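
For context, criterion 3 can be as simple as a boolean flag read from a config. A minimal Swift sketch of such a feature-toggle gate, assuming a hypothetical FeatureToggles config and screen names (this is an illustration, not Avito's actual implementation):

```swift
import Foundation

// Hypothetical feature-toggle config: unfinished features ship compiled in
// but switched off, so the release branch can always be cut on schedule.
struct FeatureToggles: Decodable {
    let newItemCreationFlow: Bool
    let sellerProfileRedesign: Bool
}

func loadToggles(from data: Data) throws -> FeatureToggles {
    try JSONDecoder().decode(FeatureToggles.self, from: data)
}

// Call site: the old flow stays the default until the toggle is flipped,
// and a broken feature can be disabled during regress without reverting code.
func itemCreationScreenName(toggles: FeatureToggles) -> String {
    toggles.newItemCreationFlow ? "NewItemCreationFlow" : "LegacyItemCreationFlow"
}
```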

  17. Automation:
    1. Git+Jira
    2. Feature Toggle config
    3. Slack mentions
    4. Branch cut
    > ... - feature freeze

  18. > feature freeze - 1st regress end
    Unit tests, Component tests, E2E tests, Manual tests
    1. Regression test suite is run
    2. Each unit performs impact analysis
    3. Each unit manually tests its features
    4. P0/P1 bugs are fixed or features are disabled

  19. > 1st regress end - 2nd regress end
    Unit tests, Component tests, E2E tests, Manual tests
    1. Regression test suite is run
    2. Impact analysis for fixed bugs
    3. Broken features are disabled

  20. > 2nd regress end - staging
    1. Roll out to 1% of users
    2. Automatically watch crashes & metrics
    3. Hotfix if necessary
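
One way to implement a deterministic 1% slice on the client is to hash a stable user identifier into buckets; a rough sketch under that assumption (the hashing scheme and names are illustrative, not Avito's actual rollout mechanism):

```swift
import Foundation

// Map a stable user identifier into 100 buckets so the same user always
// lands in the same rollout group across launches.
func rolloutBucket(for userID: String) -> Int {
    // djb2-style hash: deterministic, unlike Swift's per-process-seeded Hasher.
    var hash: UInt64 = 5381
    for byte in userID.utf8 {
        hash = hash &* 33 &+ UInt64(byte)
    }
    return Int(hash % 100)
}

// Only users in the first `percentage` buckets see the staged behavior;
// crash rate and product metrics for this slice are then watched automatically.
func isInStagedRollout(userID: String, percentage: Int) -> Bool {
    rolloutBucket(for: userID) < percentage
}

// Example: isInStagedRollout(userID: "user-42", percentage: 1)
```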

  21. > staging - release
    1. Manual roll out
    2. Anomaly detection
    3. Hotfix if P0 bug

  22. > Release cycle
    1. Release automation
    2. CI/CD hacks
    3. Tooling

  23. > E2E Testing
    [Chart: UI test count growth - 30 (Q1), 200 (Q2), 700 (Q3), 860 (now)]
    Began automation in Q4'2017
    Currently 860 UI test methods
    3 devices
    2610 tests in the regression suite

  24. Our tooling
    - XCUITest
    - Wrapper (MixBox)
    - Test runner (Emcee)
    > E2E Testing
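
For readers unfamiliar with the stack, a bare-bones XCUITest end-to-end test looks roughly like this; the accessibility identifiers, launch argument and screen are made up, and the real tests go through the MixBox wrapper rather than raw XCUITest calls:

```swift
import XCTest

final class ItemSearchE2ETests: XCTestCase {
    func testSearchShowsResults() {
        let app = XCUIApplication()
        app.launchArguments += ["-UITests"] // hypothetical flag to stub the backend
        app.launch()

        // Drive the UI through accessibility identifiers (all illustrative).
        let searchField = app.searchFields["search_field"]
        XCTAssertTrue(searchField.waitForExistence(timeout: 10))
        searchField.tap()
        searchField.typeText("iphone\n")

        // Assert that at least one result cell appears.
        let firstResult = app.cells["search_result_cell"].firstMatch
        XCTAssertTrue(firstResult.waitForExistence(timeout: 10))
    }
}
```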

  25. Who writes tests
    - Manual QA
    - Engineers
    > E2E Testing
    [Chart: 65% QA, 35% engineers]

  26. 1. Why XCUITest
    2. Millions of hacks
    3. MixBox
    > E2E Testing

  27. 1. Infrastructure
    2. Running tests in parallel
    3. Simulators
    > E2E Testing

  28. > Releases
    > Architecture
    > Performance

  29. > Tragedy of the Commons

  30. > Monolithic Architecture
    Problems
    - Build time
    - Responsibility scopes
    - Impact analysis

  31. > Modular Architecture

  32. > Modular Architecture
    Core layer
    - Networking
    - Routing
    - Deeplinking
    - Persistence
    - Logging
    - etc
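
To make the core layer idea concrete, here is a small sketch of the kind of interfaces such a layer might expose to feature modules; the protocol and type names are illustrative, not Avito's real API:

```swift
import Foundation

// Core-layer abstractions that feature modules depend on; concrete
// implementations live in separate core modules.
protocol NetworkClient {
    func send(_ request: URLRequest, completion: @escaping (Result<Data, Error>) -> Void)
}

protocol DeeplinkHandler {
    /// Returns true if the URL was recognized and routed to a screen.
    func handle(_ url: URL) -> Bool
}

protocol Logger {
    func log(_ message: String)
}

// A feature module only sees the protocols, so it can be built and tested
// without pulling in the networking, routing or persistence implementations.
final class FavoritesService {
    private let network: NetworkClient
    private let logger: Logger

    init(network: NetworkClient, logger: Logger) {
        self.network = network
        self.logger = logger
    }

    func markFavorite(itemID: String) {
        logger.log("favorite: \(itemID)")
        // a network.send(...) call would go here
    }
}
```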

  33. > Modular Architecture
    module 1
    module 2
    module 3
    module 4

  34. > Modular Architecture
    Feature consists of
    - Code modules
    - UI
    - Network requests
    - Persistence requests
    - Models
    - Navigation

  35. > Modular Architecture

  36. > Modular Architecture
    Mediator layer
    - Module interfaces
    - Service interfaces
    - Models
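
A tiny sketch of what such a mediator layer can look like: features talk to each other only through interfaces and models that live in the mediator, never through concrete types from another feature module. All names here are hypothetical:

```swift
import UIKit

// Lives in the mediator module: a shared model plus an interface
// describing what the "item details" feature offers to others.
struct ItemID: Hashable {
    let rawValue: String
}

protocol ItemDetailsModule {
    /// Builds the item-details screen for the given item.
    func makeItemDetailsViewController(itemID: ItemID) -> UIViewController
}

// Lives in the "search" feature module: it depends only on the mediator,
// so neither feature imports the other and both can be built independently.
final class SearchCoordinator {
    private let itemDetails: ItemDetailsModule

    init(itemDetails: ItemDetailsModule) {
        self.itemDetails = itemDetails
    }

    func openItem(_ id: ItemID) -> UIViewController {
        itemDetails.makeItemDetailsViewController(itemID: id)
    }
}
```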

  37. > Modular Architecture
    Metrics
    - 150 modules
    - 9 months for transition
    - Clean build: 635s -> 438s
    - Incremental build: 65s -> 15s

  38. > Modular Architecture
    How we use it
    - Each module has its owner
    - Demo apps out of the box
    - Module lazy load
    - Module quality checks: usage, contributors, build time, etc

  39. 1. Architecture details
    2. Transition process
    3. Experiments
    4. Usage
    > Modular Architecture
    Evgeniy Suvorov

  40. > Releases
    > Architecture
    > Performance

  41. App performance affects business metrics.
    > Hypothesis

  42. 5% of the daily audience, each group - 1%
    - A, B - control groups
    - C - +2s delay
    - D - +4s delay
    - E - +6s delay
    > Experiments
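
Injecting the extra latency for groups C-E can be as blunt as an artificial delay on the screen-loading path; a hedged sketch of that idea (group handling and names are illustrative):

```swift
import Foundation

// Experiment groups from the slide: A/B are controls, C/D/E get extra latency.
enum PerfExperimentGroup {
    case control                          // groups A and B
    case delayed(seconds: TimeInterval)   // C: 2s, D: 4s, E: 6s
}

// Wrap the real loading work and add the group's artificial delay;
// bounce rate and business metrics are then compared between groups.
func loadScreen(group: PerfExperimentGroup, load: @escaping () -> Void) {
    switch group {
    case .control:
        load()
    case .delayed(let seconds):
        DispatchQueue.main.asyncAfter(deadline: .now() + seconds, execute: load)
    }
}
```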

  43. 1. Performance affects bounce rate
    2. Hygienic minimum - 4s
    3. Most problems are in the regions
    4. [Web] 1s of speed-up -> 0.2-4% in business metrics
    > Results

  44. > Avito - a daily-use service
    Our targets:
    - Cold start - 1.5s
    - Screen first paint - 0.1s
    - Screen first interactive - 0.5s
    - Scrolling - 60fps

  45. > Performance
    Monitoring
    Culture
    Optimisations

  46. > Optimisations
    1. Modules lazy loading
    2. Network optimisations
    3. Lazy DI
    4. Lazy tabs creation
    5. DI without reflection
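
As an illustration of points 3 and 4, a lazy dependency wrapper builds expensive objects on first use instead of at app start, which takes their cost out of the cold-start path; a minimal sketch with made-up service names:

```swift
import Foundation

// A thread-safe lazy container: the factory runs only when the dependency
// is first requested, not during app start-up.
final class Lazy<T> {
    private let factory: () -> T
    private var value: T?
    private let lock = NSLock()

    init(_ factory: @escaping () -> T) {
        self.factory = factory
    }

    func get() -> T {
        lock.lock(); defer { lock.unlock() }
        if let value = value { return value }
        let created = factory()
        value = created
        return created
    }
}

final class HeavyAnalyticsService { /* expensive init */ }
final class HeavyImageCache { /* expensive init */ }

// Hypothetical composition root: nothing heavy is built until a tab
// or a feature actually asks for it (same idea as lazy tab creation).
final class AppDependencies {
    let analytics = Lazy { HeavyAnalyticsService() }
    let imageCache = Lazy { HeavyImageCache() }
}
```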

  47. > Monitoring

  48. > Monitoring
    init
    load
    parse
    draw
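
One way to instrument stages like init/load/parse/draw is with os_signpost intervals, which show up in Instruments and can also be mirrored into custom dashboards; a sketch of that approach, not Avito's actual monitoring stack:

```swift
import os.signpost

// A log handle dedicated to screen-loading performance.
private let perfLog = OSLog(subsystem: "com.example.app", category: "ScreenLoad")

// Measure one stage of a screen's life: init, load, parse or draw.
func measureStage<T>(_ name: StaticString, _ work: () throws -> T) rethrows -> T {
    let id = OSSignpostID(log: perfLog)
    os_signpost(.begin, log: perfLog, name: name, signpostID: id)
    defer { os_signpost(.end, log: perfLog, name: name, signpostID: id) }
    return try work()
}

// Usage on a screen's critical path (stage boundaries are illustrative):
// let data  = try measureStage("load")  { try loadData() }
// let model = try measureStage("parse") { try parse(data) }
// measureStage("draw") { render(model) }
```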

  49. > Culture
    1. Internal meetups
    2. Competitors dashboards
    3. All screens dashboards
    4. Success stories
    5. Trending

  50. > Releases
    > Architecture
    > Performance

  51. Egor Tolstoy
    > Avito Mobile
    many features
    many developers
    many challenges
    Avito
    t.me/etolstoy
