
Avito Mobile: State of the Union

Egor Tolstoy
November 17, 2018

Size matters. That is true for the Avito app as well: it is built by several dozen independent cross-functional teams, each with a few iOS developers. Over the past couple of years we have run into various problems of scale, some of which we managed to solve. In this talk I share the techniques and advice that let us keep product quality high and ship reliably every two weeks, while still giving developers maximum freedom in choosing their architecture and the way they solve their problems.

Transcript

  1. PAGE 08/51 > Stop the release, I'm finishing a feature > I don't want to use VIPER > App start-up time is not my problem
  2. PAGE 13/51 [Diagram: release train - features ft.A-ft.I from Unit 1, Unit 2 and Unit 3 grouped into Release 1 and Release 2]
  3. PAGE 16/51 > ... - feature freeze Criteria: 1. Closed in Jira 2. Tested 3. Has feature toggle (see the feature-toggle sketch after the transcript)
  4. PAGE 17/51 > ... - feature freeze Automation: 1. Git+Jira 2. Feature Toggle config 3. Slack mentions 4. Branch cut
  5. PAGE 18/51 > feature freeze - 1st regress end Unit tests, component tests, E2E tests, manual tests. 1. Regression test suite is run 2. Each unit performs impact analysis 3. Each unit manually tests its features 4. P0/P1 bugs are fixed or features are disabled
  6. PAGE 19/51 > 1st regress end - 2nd regress end Unit tests, component tests, E2E tests, manual tests. 1. Regression test suite is run 2. Impact analysis for fixed bugs 3. Broken features are disabled
  7. PAGE 20/51 > 2nd regress end - staging 1. Roll out to 1% of users 2. Automatically watch crashes & metrics 3. Hotfix if necessary (see the staged-rollout sketch after the transcript)
  8. PAGE 23/51 > E2E Testing [Chart: UI test count growth - 30 in Q1, 200 in Q2, 700 in Q3, 860 now] Began automation in Q4 2017. Currently 860 UI test methods, 3 devices, 2610 tests in the regression suite.
  9. PAGE 24/51 > E2E Testing Our tooling: XCUITest, a wrapper (MixBox), a test runner (Emcee) (see the XCUITest sketch after the transcript)
  10. PAGE 25/51 > E2E Testing Who writes tests: Manual QA (65%), Engineers (35%)
  11. PAGE 30/51 > Monolithic Architecture Problems: build time, responsibility scopes, impact analysis
  12. PAGE 32/51 > Modular Architecture Core layer: networking, routing, deeplinking, persistence, logging, etc.
  13. PAGE 34/51 > Modular Architecture A feature consists of code modules: UI, network requests, persistence requests, models, navigation (see the feature-module sketch after the transcript)
  14. PAGE 37/51 > Modular Architecture Metrics: 150 modules, 9 months for the transition, clean build 635s -> 438s, incremental build 65s -> 15s
  15. PAGE 38/51 > Modular Architecture How we use it: each module has its owner, demo apps out of the box, module lazy load, module quality checks (usage, contributors, build time, etc.)
  16. PAGE 39/51 > Modular Architecture 1. Architecture details 2. Transition process 3. Experiments 4. Usage (Evgeniy Suvorov)
  17. PAGE 42/51 > Experiments 5% of the daily audience, 1% per group: A, B - control groups; C - +2s delay; D - +4s delay; E - +6s delay (see the experiment-group sketch after the transcript)
  18. PAGE 43/51 > Results 1. Performance affects bounce rate 2. Hygienic minimum - 4s 3. Most problems are in the regions 4. [Web] 1s of speed-up -> 0.2-4% in business metrics
  19. PAGE 44/51 > Avito - a daily-use service Our targets: cold start - 1.5s; screen first paint - 0.1s; screen first interactive - 0.5s; scrolling - 60fps
  20. PAGE 46/51 > Optimisations 1. Module lazy loading 2. Network optimisations 3. Lazy DI 4. Lazy tab creation 5. DI without reflection (see the lazy DI sketch after the transcript)
  21. PAGE 49/51 > Culture 1. Internal meetups 2. Competitor dashboards 3. All-screens dashboards 4. Success stories 5. Trending
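
The sketches below expand on a few points from the slides. None of them are Avito's actual code; every type, key and identifier in them is invented for illustration. The first relates to slides 16-17, which list "has feature toggle" as a release criterion and a feature-toggle config as part of the release automation. A minimal sketch, assuming toggles arrive as a remote config dictionary and unknown features default to "off":

```swift
import Foundation

/// Hypothetical toggle store; Avito's real toggle API is not shown in the slides.
struct FeatureToggles {
    private let flags: [String: Bool]

    init(remoteConfig: [String: Bool]) {
        self.flags = remoteConfig
    }

    /// Unknown or unfinished features default to "off", so a half-done feature
    /// merged before the branch cut stays invisible in the release build.
    func isEnabled(_ key: String) -> Bool {
        flags[key, default: false]
    }
}

// Usage: gate the entry point of an unfinished feature behind its toggle.
let toggles = FeatureToggles(remoteConfig: ["new_checkout": false])
if toggles.isEnabled("new_checkout") {
    // route to the new, still-in-progress flow
} else {
    // keep the stable flow shipped in the previous release
}
```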
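Slide 20 rolls a release out to 1% of users before watching crashes and metrics. The slide does not say how that cohort is selected (an App Store phased release is one option); purely as an illustration, here is a sketch of deterministic in-app bucketing, where the same user always lands in the same bucket so the 1% cohort stays stable. The hash and function names are assumptions:

```swift
import Foundation

/// Maps a user id to a stable bucket in 0..<buckets.
/// A djb2-style hash is used because Swift's Hasher is seeded per process.
func rolloutBucket(userId: String, buckets: UInt64 = 100) -> UInt64 {
    var hash: UInt64 = 5381
    for byte in userId.utf8 {
        hash = hash &* 33 &+ UInt64(byte)
    }
    return hash % buckets
}

/// True if the user falls into the first `percentage` buckets.
func isInStagedRollout(userId: String, percentage: UInt64) -> Bool {
    rolloutBucket(userId: userId) < percentage
}

// Start with 1% of users, then widen if crash rate and metrics hold.
let included = isInStagedRollout(userId: "user-42", percentage: 1)
```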
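Slide 24 names XCUITest, the MixBox wrapper and the Emcee test runner as the E2E tooling. Below is a plain XCUITest sketch of the kind of scenario such a regression suite might contain; it does not use MixBox or Emcee, and the accessibility identifiers and screen flow are invented:

```swift
import XCTest

final class SearchE2ETests: XCTestCase {
    func testSearchShowsResults() {
        let app = XCUIApplication()
        app.launch()

        // "searchField" and "itemCell" are hypothetical accessibility identifiers.
        let searchField = app.searchFields["searchField"]
        XCTAssertTrue(searchField.waitForExistence(timeout: 5))
        searchField.tap()
        searchField.typeText("iPhone")
        app.keyboards.buttons["Search"].tap()

        // The test fails if the first search result never appears.
        let firstResult = app.cells["itemCell"].firstMatch
        XCTAssertTrue(firstResult.waitForExistence(timeout: 10))
    }
}
```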
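Slides 32-34 describe a core layer (networking, routing, deeplinking, persistence, logging) and feature modules that own their UI, network and persistence requests, models and navigation. A sketch of one way a feature module can expose a single public entry point while depending only on core-layer protocols; all protocol and type names here are illustrative, not Avito's API:

```swift
import UIKit

// Core layer: protocols owned by shared core modules.
public protocol NetworkClient {
    func send(_ request: URLRequest, completion: @escaping (Result<Data, Error>) -> Void)
}

public protocol Router {
    func push(_ viewController: UIViewController)
}

// Feature module: only the assembly is public; its UI, models and
// requests stay internal, which keeps impact analysis local to the module.
public struct FavoritesFeature {
    private let network: NetworkClient
    private let router: Router

    public init(network: NetworkClient, router: Router) {
        self.network = network
        self.router = router
    }

    public func start() {
        // builds the internal view controller and asks the router to show it
    }
}
```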
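Slide 42 describes the startup-delay experiment: 5% of the daily audience, 1% per group, A and B as controls, and C/D/E with +2s/+4s/+6s delays. A sketch of how group assignment and the injected delay could be expressed; the bucketing scheme and names are assumptions, only the group sizes and delays come from the slide:

```swift
import Foundation

enum ExperimentGroup: CaseIterable {
    case a, b, c, d, e

    /// Extra startup delay injected for the group, in seconds.
    var addedDelay: TimeInterval {
        switch self {
        case .a, .b: return 0   // control groups
        case .c: return 2
        case .d: return 4
        case .e: return 6
        }
    }
}

/// Assigns a user bucket (0..<100) to a group: only the first five buckets,
/// i.e. 5% of the audience, take part, one bucket (1%) per group.
func experimentGroup(bucket: UInt64) -> ExperimentGroup? {
    guard bucket < 5 else { return nil }
    return ExperimentGroup.allCases[Int(bucket)]
}
```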
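Slide 46 lists lazy DI and DI without reflection among the startup optimisations. A sketch of a closure-based container that builds dependencies on first use and never touches reflection, so dependency graphs that are not needed during launch cost nothing at start-up; the container name and API are invented for illustration:

```swift
import Foundation

final class LazyContainer {
    private var factories: [ObjectIdentifier: () -> Any] = [:]
    private var instances: [ObjectIdentifier: Any] = [:]

    /// Registration stores only a closure; nothing is constructed yet.
    func register<T>(_ type: T.Type, factory: @escaping () -> T) {
        factories[ObjectIdentifier(type)] = factory
    }

    /// The factory runs on the first resolve; later resolves return the cached instance.
    func resolve<T>(_ type: T.Type) -> T {
        let key = ObjectIdentifier(type)
        if let cached = instances[key] as? T {
            return cached
        }
        guard let factory = factories[key], let instance = factory() as? T else {
            fatalError("No factory registered for \(type)")
        }
        instances[key] = instance
        return instance
    }
}

// Usage: the dependency is built when a feature first asks for it, not at app start.
protocol Analytics { func track(_ event: String) }
struct ConsoleAnalytics: Analytics { func track(_ event: String) { print(event) } }

let container = LazyContainer()
container.register(Analytics.self, factory: { ConsoleAnalytics() })
let analytics = container.resolve(Analytics.self)
analytics.track("screen_opened")
```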