Agile Software Development Practices: Perceptions & Project Data

Talk on agile software development practices and their relationship to team members' perceptions, held at the 2020 Software Engineering (SE) conference, organized by the Gesellschaft für Informatik (GI), 24-28 Feb 2020 in Innsbruck, Austria. Conference website: https://se20.ocg.at/

Paper:
C. Matthies, J. Huegle, T. Dürschmid, and R. Teusner, “Attitudes, Beliefs, and Development Data Concerning Agile Software Development Practices,” in Software Engineering 2020, M. Felderer, W. Hasselbring, R. Rabiser, and R. Jung, Eds., Bonn: Gesellschaft für Informatik e.V., 2020, pp. 73–74. doi: 10.18420/SE2020_20 (CC BY-SA 4.0)
[Online] available: https://dl.gi.de/handle/20.500.12116/31697

Christoph Matthies

February 27, 2020

Transcript

  1. Agile Software Development Practices: Perceptions & Project Data

    Christoph Matthies, Johannes Hügle, Tobias Dürschmid*, Ralf Teusner. Hasso Plattner Institute, University of Potsdam, Germany; *Carnegie Mellon University, Pittsburgh, USA. christoph.matthies@hpi.de @chrisma0. February ’20, Innsbruck
  2. Background 2 An undergraduate software engineering capstone course:

    “Methods, concepts, and technologies that help successfully deliver large software products developed by multiple teams.” [1] [1] https://hpi.de/plattner/teaching/archive/winter-term-201819/softwaretechnik-ii.html
  3. Motivation: Agile SE Teams 3 Teamwork in collaborating (student) groups

  4. 4 Motivation: Agile SE Teams Teamwork in collaborating (student) groups

  5. 5 Relationship between Process Perceptions and Project Data Motivation: Perceptions

    & Data Two worlds, each relevant
  6. Perceptions vs. Project Data Main research questions ▪ Q1: What

    are perceptions of agile practice usage in student project teams? ▪ Q2: Which practices are perceived to be most related to agile values? ▪ Q3: What is the relationship of perceptions and software project data? 6 Regarding agile practices
  7. Perceptions vs. Project Data ▪ SE lecture with agile (collaboration)

    process novices ▪ Subset of eight agile best practices under study ▪ Repeatedly collect perceptions of practice use ▪ 42 students ▪ Surveys after each of the four Sprints ▪ Define data measurements reflecting Agile practice usage 7 Methods
  8. Overall Survey Results 8 Q1: Perceptions regarding practices, answers over

    all sprints
  9. Overall Survey Results 9 Q1: Perceptions regarding practices, answers over

    all sprints
  10. Overall Survey Results 10 Q2: Correlations to perceptions of agile

    value implementation
  11. Overall Survey Results 11 Practices most related to “Agile Mindset”:

    ▪ Practicing Collective Code Ownership (Q2, τ=0.15, p < .05) ▪ Not working near the deadline (Q5, τ=−0.21, p < .01) ▪ Following “check in early, check in often” principle (Q6, τ= 0.24, p < .01) Q2: Correlations to perceptions of agile value implementation
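The τ values above are Kendall rank correlation coefficients between survey responses. As a minimal pure-Python sketch of how such a coefficient is computed (the example rankings are invented for illustration, not the study's data):

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant) / number of pairs.
    Assumes no tied values; real Likert-scale data with ties needs
    the tau-b variant, e.g. scipy.stats.kendalltau."""
    concordant = discordant = 0
    for (xi, yi), (xj, yj) in combinations(zip(x, y), 2):
        sign = (xi - xj) * (yi - yj)
        if sign > 0:
            concordant += 1
        elif sign < 0:
            discordant += 1
    n = len(x)
    return (concordant - discordant) / (n * (n - 1) / 2)

# Two hypothetical rankings that mostly agree:
print(kendall_tau([1, 2, 3, 4, 5], [1, 3, 2, 5, 4]))  # → 0.6
```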
  12. Evidence of Agile Practice Usage ▪ Dev. practices “inscribed into

    software artifacts” [de Souza et al., 2005] ▪ Analysis of teams’ GitHub project data ▪ Define measures of agile practice usage □ Based on previous related literature where available □ “Analytics cold-start” problem [de Souza et al., 2005] □ Intuitively traceable to underlying data 12 Based on project data evidence [de Souza et al., 2005] de Souza, C., Froehlich, J., & Dourish, P., “Seeking the Source: Software Source Code as a Social and Technical Artifact”. In Proceedings of the 2005 International ACM SIGGROUP Conference on Supporting Group Work - GROUP ’05, p. 197, 2005.
  13. Extract of Employed Measures ▪ Code reviews □ Amount of

    Pull Request comments by a developer in a Sprint ▪ Test-driven Development □ Ruby on Rails conventions separate test from application code □ Ratio of line changes in test and application code ▪ Last-Minute Commits □ Percentage of commits by developer within 12 hours of sprint review meeting ▪ ... 13 Agile practice measures based on project data
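As a rough sketch of how two of these measures could be derived from commit metadata (the data structure, values, and review time here are invented for illustration; the study's actual tooling is not shown):

```python
from datetime import datetime, timedelta

# Hypothetical commit records for one developer in one Sprint.
commits = [
    {"time": datetime(2020, 1, 14, 9, 0), "test_lines": 10, "app_lines": 60},
    {"time": datetime(2020, 1, 14, 22, 30), "test_lines": 0, "app_lines": 40},
]
sprint_review = datetime(2020, 1, 15, 10, 0)  # assumed review meeting time

# TDD proxy: ratio of test-code to application-code line changes.
tdd_ratio = (sum(c["test_lines"] for c in commits)
             / sum(c["app_lines"] for c in commits))  # 10 / 100 = 0.1

# Last-minute commits: share of commits within 12 hours of the review.
last_minute = sum(
    c["time"] >= sprint_review - timedelta(hours=12) for c in commits
) / len(commits)  # 1 of 2 commits

print(tdd_ratio, last_minute)  # → 0.1 0.5
```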
  14. Code Reviews in PRs ▪ Expectations □ Many developers with

    few comments □ Barrier for leaving comments ▪ Some devs very motivated: “Hero reviewers”, cf. [Mockus et al., 2002] 14 [Figure: histogram of code review comments; x-axis: amount of comments per reviewer, y-axis: frequency] [Mockus et al., 2002] A. Mockus, R. T. Fielding, and J. D. Herbsleb, “Two case studies of open source software development: Apache and Mozilla,” ACM Transactions on Software Engineering and Methodology (TOSEM), vol. 11, no. 3, pp. 309-346, 2002.
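The per-reviewer comment counts behind that distribution can be tallied directly; a minimal sketch (the Pull Request comment events are invented for illustration):

```python
from collections import Counter

# Hypothetical stream of authors of PR review comments in one Sprint.
comment_authors = ["alice", "bob", "alice", "carol", "alice", "alice", "bob"]

comments_per_reviewer = Counter(comment_authors)
# Expected shape: many reviewers with few comments, a few "hero reviewers".
print(comments_per_reviewer.most_common())
# → [('alice', 4), ('bob', 2), ('carol', 1)]
```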
  15. Test-Driven Development ▪ Test Statements per Solution Statement [Buffardi et

    al., 2012] ▪ Expectation: low ratios of test to app code changes ▪ 10 test LOC changes / 100 app LOC changes = 0.1 15 [Figure: histogram of the ratio of test to application code line changes; x-axis: ratio test to app code, y-axis: amount of devs] [Buffardi et al., 2012] K. Buffardi and S. H. Edwards, “Impacts of Teaching Test-Driven Development to Novice Programmers,” International Journal of Information and Computer Science (IJICS), vol. 1, no. 6, pp. 135-143, 2012.
  16. Last-Minute Commits ▪ Expectation: high percentage of commits shortly before

    Sprint end ▪ “Deadline-Driven Development” [Ariely et al., 2002] 16 [Figure: histogram of the percentage of last-minute commits per developer; x-axis: ratio of last-minute commits by developer, y-axis: frequency] [Ariely et al., 2002] D. Ariely and K. Wertenbroch, “Procrastination, deadlines, and performance: self-control by precommitment.” Psychological Science, vol. 13, no. 3, pp. 219-224, 2002.
  17. Project Evidence vs. Perceptions 17 Correlations between perceptions and data

    ▪ TDD - Ratio Test/App Code ▪ CCO - Unique Files Edited ▪ Deadline-Driven Development - Last-Minute Commits ▪ “Check in early, check in often” - Avg. LOC churn ▪ Parallel User Stories - Unique User Story Identifiers ▪ Useful code reviews - Pull Request Comments
  18. Study Summary 18 A case study on agile practice usage

    in student teams ▪ Case study within education context on selected Agile practice usage in teams ▪ Initial measures for Agile practice usage ▪ Self-assessments correlated with measurements concerning TDD and last-minute work □ Well-defined concepts, intuitive to grasp and measure □ Shared mental models
  19. Conclusions & Interpretation 19 Contributions and lessons learned in this study

    ▪ Differing assumptions between measurement creator and participants ▪ Is the employed proxy not measuring the intended construct, or are perception and data at odds in this context? Starting points for discussion and improvement
  20. Future Work 20 Integrating project data analysis into SE processes

    Software Process Improvement
  21. Future Work 21 Integrating project data analysis into SE processes

    ▪ Scrum Retrospective: “an opportunity for the Scrum Team to inspect itself” [Schwaber, 2017] ▪ Common process problems, common diagnoses approaches ▪ Project data: additional perspective on team development process ▪ New Retrospective activities based on project data ▪ e.g. Remedy Appraisal: did a (process) change manifest in project data? [Schwaber et al., 2017] Schwaber, K., & Sutherland, J., “The Scrum Guide - The Definitive Guide to Scrum: The Rules of the Game”, 2017, [online] Available: http://scrumguides.org/docs/scrumguide/v2017/2017-Scrum-Guide-US.pdf
  22. 22 Retro Bot Vision

  23. Summary 23

  24. Image Sources 24 In order of appearance ▪ attitude by

    Nithinan Tatah from the Noun Project ▪ Data by Alice Design from the Noun Project ▪ agile by Florent B from the Noun Project ▪ Mortar Board by Mike Chum from the Noun Project ▪ developer by Becris from the Noun Project ▪ GitHub mark by GitHub, Inc. ▪ questions by Gregor Cresnar from the Noun Project ▪ Survey by unlimicon from the Noun Project ▪ Merge by Danil Polshin from the Noun Project ▪ measures by supalerk laipawat from the Noun Project ▪ sum by Trevor Dsouza from the Noun Project ▪ end by priyanka from the Noun Project ▪ Future by Alice Design from the Noun Project