Scaling Collective Code Ownership SCNA 2018

Contemporary code review has been adopted by many organizations and open-source projects as a quality assurance measure. The process typically involves software tools such as Review Board and the time of one or more peer reviewers.

However, a growing body of empirical research on this broadly adopted practice can help us learn more about the effects of code reviews on software quality. The available research shows that peer code reviews are better suited to knowledge sharing and code improvement than to eliminating code defects or reducing error rates. The effectiveness of the process appears to depend on the reviewer’s experience with the code, the size of the change set, and the rate at which the review is conducted.

In this talk, I will share how the LinkedIn Flagship Product Engineering team leverages the code review process to spread organizational knowledge and uplevel individual contributors. I will also discuss the tools we’ve built to support this practice across a team which spans three offices, four development stacks, dozens of teams, and hundreds of individual contributors.

Nikolai Avteniev

April 19, 2018

Transcript

  1. Scaling Collective Code Ownership and Code Review in LinkedIn’s Flagship Product Engineering Team. Nikolai Avteniev, Software Engineer
  2. About Me
    • First product launched in 2001
    • Worked in different industries
    • LinkedIn Video by day
    • Learning and teaching by night
  3. The Big Picture
    • Ownership – Code ownership impacts quality
    • Code Review – Way more than just quality assurance
    • Review Process – Mechanism for scaling code ownership
  4. Code Ownership Has Impact on Quality
    • Defect Prediction – Organizational structure is a strong predictor
    • Collective Ownership – More contributors leads to more defects
    • Individual Ownership – Stronger ownership leads to fewer defects
    • Reproducible? – Kind of
  5. Code Review Isn’t Just About Defects
    • Modern – Lightweight, asynchronous, tool-enabled
    • Opinion Forming – Influences opinions and future collaboration
    • Time Consuming – 10-15% of development time; hours to weeks
    • Multi-purpose – Code improvement, knowledge transfer, defects
  6. Issues Raised During Code Review – Replicated in Industry and Open Source
    75% of raised issues relate to maintainability
    • Finding defects is difficult
    • Consistent across teams at MSFT
    • Results similar for OSS and industry
  7. Useful Code Reviews – What Is a Useful Comment?
    About 65% of review comments are rated useful
    • 63-68% across five projects
    • Grows to 80% with reviewer experience
    • Decreases with review size
    • Functional defects are the most useful comments
  8. Code Review Tech Stack
    • Kibitzer – Bot to run automated checks
    • Review Butler – Platform review analytics
    • Review Board Reporting – General code review analytics
    • RB Tracker – Team RB process tracking
    • Review Board – Manages code reviews
    • Git Review – CLI extension to create / update / submit code for review
    • ACL – Fine-grained code ownership (an ownership-lookup sketch follows below)
    • Code of Conduct – Review guidelines and best practices
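    The ACL piece above maps parts of the codebase to owning groups. A minimal sketch of what such a fine-grained ownership lookup could look like, assuming a hypothetical path-prefix table; the names and format are illustrative, not LinkedIn’s actual tooling:

        # Hypothetical fine-grained ownership lookup: longest-prefix match of a
        # changed file path against an ACL table (illustrative data only).
        ACL = {
            "feed/": ["feed-owners"],
            "feed/stylesheets/": ["feed-owners", "stylesheet-owners"],
            "messaging/": ["messaging-owners"],
        }

        def owners_for(path):
            """Return owner groups for the most specific matching ACL prefix."""
            best = ""
            for prefix in ACL:
                if path.startswith(prefix) and len(prefix) > len(best):
                    best = prefix
            return ACL.get(best, ["default-reviewers"])

        print(owners_for("feed/stylesheets/post.css"))  # ['feed-owners', 'stylesheet-owners']
        print(owners_for("search/query_parser.py"))     # ['default-reviewers']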
  9. Code Review Process (workflow diagram)
    The author creates or updates a review with git review create / update; the Kibitzer bot runs automated checks; ACL members respond with Ship It or with comments and fix-its; once approved, the author applies the change with git review apply and lands it on trunk with git review submit; review reports are written to a database. A rough sketch of this loop follows below.
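    A rough sketch of that loop in Python, with placeholder functions standing in for the real git review, Kibitzer, and Review Board integrations; none of these names are the actual APIs:

        # Illustrative review-workflow loop; function names are placeholders.
        from dataclasses import dataclass, field

        @dataclass
        class Review:
            author: str
            files: list
            approvals: int = 0
            fix_its: list = field(default_factory=list)

        def run_automated_checks(review):
            """Stand-in for the Kibitzer bot: return a list of check failures."""
            return []  # assume the checks pass in this sketch

        def request_reviews(review, owners):
            print(f"requesting review from {owners} for {len(review.files)} files")

        def review_cycle(review, owners):
            """One pass: automated checks, owner feedback, then submit or revise."""
            failures = run_automated_checks(review)
            if failures:
                review.fix_its.extend(failures)
                return False  # author must update the review and try again
            request_reviews(review, owners)
            review.approvals += 1  # pretend an ACL member said "Ship It"
            return review.approvals >= 1 and not review.fix_its

        r = Review(author="nikolai", files=["feed/post.py"])
        if review_cycle(r, owners=["feed-owners"]):
            print("git review submit -> change lands on trunk")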
  10. Issues Raised by Team (bar chart)
    The most common issue type is Solution Approach across every team: messaging, platform, search, onboarding, windows, profile, feed, opportunities, my-network, stylesheets.
  11. Usefulness by Type (bar chart)
    Share of comments rated useful, broken down by issue type: Defect, Organization, Solution Approach, Documentation, Safeguarding, False Positive, Other.
  12. Time in Review
    • Median TTFR (time to first response) is about 45 minutes
    • Median TTC (time to close) is about a day
    • 25% of reviews have TTFR > 14 hours
    • 25% of reviews have TTC > 4 days
    A sketch of computing these percentiles follows below.
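    Numbers like these can be derived by computing medians and 75th percentiles over per-review durations exported from the review tool. A small sketch using Python’s statistics module with made-up sample data, in hours:

        # Sketch: Time To First Response (TTFR) and Time To Close (TTC) percentiles.
        from statistics import median, quantiles

        ttfr_hours = [0.25, 0.5, 0.75, 1.0, 2.0, 6.0, 14.5, 18.0]      # sample data
        ttc_hours = [8.0, 20.0, 24.0, 26.0, 30.0, 90.0, 100.0, 110.0]  # sample data

        def p75(values):
            """75th percentile (third quartile) of a list of durations."""
            return quantiles(values, n=4)[2]

        print(f"median TTFR {median(ttfr_hours):.1f} h, 75th pct {p75(ttfr_hours):.1f} h")
        print(f"median TTC  {median(ttc_hours):.1f} h, 75th pct {p75(ttc_hours):.1f} h")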
  13. Organization
    • Allocate Time – Code reviews take time and can’t be treated as less important
    • Code Review Culture – Review needs to be incorporated into engineering culture
    • Expect Delays – Reviews introduce a step between writing code and shipping code
    • Instrument – Monitor the review process for bottlenecks (see the sketch below)
    • Not QA – Complements other QA activities like testing
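    Instrumenting the process can be as simple as flagging reviews that have waited too long for a first response. A minimal sketch, assuming a hypothetical 14-hour threshold and made-up review records:

        # Sketch of a bottleneck monitor: flag open reviews with no first response
        # past a chosen threshold (threshold and records are illustrative).
        from datetime import datetime, timedelta

        STALE_AFTER = timedelta(hours=14)

        open_reviews = [
            {"id": 101, "created": datetime(2018, 4, 18, 9, 0), "first_response": None},
            {"id": 102, "created": datetime(2018, 4, 19, 8, 0), "first_response": datetime(2018, 4, 19, 8, 45)},
        ]

        def stale_reviews(reviews, now):
            """Return reviews still waiting on a first response past the threshold."""
            return [r for r in reviews
                    if r["first_response"] is None and now - r["created"] > STALE_AFTER]

        now = datetime(2018, 4, 19, 10, 0)
        for r in stale_reviews(open_reviews, now):
            print(f"review {r['id']} has waited {now - r['created']} for a reviewer")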
  14. Code Owner (aka Reviewer)
    • Take Your Time – Don’t rush through a review; you’ll miss the defects
    • Hurry Up – Respond in a timely manner
    • Don’t Do It All at Once – The longer you review, the less effective you become
    • Communicate Compassionately – Understand the needs of the author
    • Don’t Nitpick – Focus on useful feedback
  15. Code Author
    • Take Your Time – Prepare your code for review: test it and self-review it
    • Small Change – Small changes get better reviews
    • Hurry Up – Respond to feedback in a timely manner
    • Find the Right Reviewer – Figure out who needs to review this code
    • Communicate Compassionately – Understand the needs of the code owner
  16. References
    1. Bacchelli, Alberto, and Christian Bird. "Expectations, outcomes, and challenges of modern code review." Proceedings of the 2013 International Conference on Software Engineering. IEEE Press, 2013.
    2. Baum, Tobias, et al. "A faceted classification scheme for change-based industrial code review processes." 2016 IEEE International Conference on Software Quality, Reliability and Security (QRS). IEEE, 2016.
    3. Beller, Moritz, et al. "Modern code reviews in open-source projects: Which problems do they fix?" Proceedings of the 11th Working Conference on Mining Software Repositories. ACM, 2014.
    4. Bird, Christian, et al. "Don't touch my code!: Examining the effects of ownership on software quality." Proceedings of the 19th ACM SIGSOFT Symposium and the 13th European Conference on Foundations of Software Engineering. ACM, 2011.
    5. Bosu, Amiangshu, et al. "Process aspects and social dynamics of contemporary code review: Insights from open source development and industrial practice at Microsoft." IEEE Transactions on Software Engineering 43.1 (2017): 56-75.
    6. Bosu, Amiangshu, Michaela Greiler, and Christian Bird. "Characteristics of useful code reviews: An empirical study at Microsoft." 2015 IEEE/ACM 12th Working Conference on Mining Software Repositories (MSR). IEEE, 2015.
    7. Czerwonka, Jacek, Michaela Greiler, and Jack Tilford. "Code reviews do not find bugs: How the current code review best practice slows us down." Proceedings of the 37th International Conference on Software Engineering, Volume 2. IEEE Press, 2015.
    8. Cohen, Jason, et al. Best Kept Secrets of Peer Code Review. Somerville: Smart Bear, 2006.
    9. Foucault, Matthieu, Jean-Rémy Falleri, and Xavier Blanc. "Code ownership in open-source software." Proceedings of the 18th International Conference on Evaluation and Assessment in Software Engineering. ACM, 2014.
    10. Greiler, Michaela, Kim Herzig, and Jacek Czerwonka. "Code ownership and software quality: A replication study." Proceedings of the 12th Working Conference on Mining Software Repositories. IEEE Press, 2015.
    11. MacLeod, Laura, et al. "Code reviewing in the trenches: Understanding challenges and best practices." IEEE Software (2017).
    12. Nagappan, Nachiappan, Brendan Murphy, and Victor Basili. "The influence of organizational structure on software quality." 2008 ACM/IEEE 30th International Conference on Software Engineering (ICSE '08). IEEE, 2008.
    13. Rigby, Peter C., and Christian Bird. "Convergent contemporary software peer review practices." Proceedings of the 2013 9th Joint Meeting on Foundations of Software Engineering. ACM, 2013.
    14. Shimagaki, Junji, et al. "A study of the quality-impacting practices of modern code review at Sony Mobile." IEEE/ACM International Conference on Software Engineering Companion (ICSE-C). IEEE, 2016.
    15. Ciavolino, Amy. "Mindful Communication in Code Reviews." http://amyciavolino.com/assets/MindfulCommunicationInCodeReviews.pdf