
Beyond Surveys: Analyzing Software Development Artifacts to Assess Teaching Efforts

Slides for the talk on the paper "Beyond Surveys: Analyzing Software Development Artifacts to Assess Teaching Efforts" by Christoph Matthies, Ralf Teusner, and Guenter Hesse, given at the Frontiers in Education (FIE) 2018 conference in San Jose, CA, USA, in October 2018.

A preprint of the paper is available on arXiv (https://arxiv.org/abs/1807.02400).

Christoph Matthies

October 05, 2018

Transcript

  1. Hasso Plattner Institute
    University of Potsdam, Germany
    [email protected]
    @chrisma0
    Beyond Surveys:
    Analyzing Software Development Artifacts
    to Assess Teaching Efforts
    Christoph Matthies, Ralf Teusner, Guenter Hesse
    ’18, San Jose, CA, October 2018



  2. Background
    Course Focus
    You will learn how to manage a long-running software
    project with a large number of developers. [1]
    2
    [1] https://hpi.de/plattner/teaching/archive/winter-term-201718/softwaretechnik-ii.html

    An undergraduate software engineering capstone course



  3. Background
    Course Focus
    You will learn how to manage a long-running software
    project with a large number of developers. [1]
    3
    [1] https://hpi.de/plattner/teaching/archive/winter-term-201718/softwaretechnik-ii.html
    ■ All participants (in teams) jointly develop a software product
    ■ Self-organizing teams
    ■ Collaboration > technical skills

    An undergraduate software engineering capstone course



  4. Background
    Course Focus
    You will learn how to manage a long-running software
    project with a large number of developers. [1]
    4
    [1] https://hpi.de/plattner/teaching/archive/winter-term-201718/softwaretechnik-ii.html
    ■ All participants (in teams) jointly develop a software product
    ■ Self-organizing teams
    ■ Collaboration > technical skills
    ■ Project work & intro exercises & lectures & tutors
    ■ Learn and apply agile methods
    ■ Real-world scenario, real development tools

    An undergraduate software engineering capstone course


  5. Challenges & Ideas
    ■ Course employed Scrum
    ■ Structured, prescriptive
    ■ Good for getting started [2]
    ■ Kanban gaining popularity in industry
    5
    Evaluate and adapt the course over time
    [2] V. Mahnic, “From Scrum to Kanban: Introducing Lean
    Principles to a Software Engineering Capstone Course”


  6. Challenges & Ideas
    ■ Course employed Scrum
    ■ Structured, prescriptive
    ■ Good for getting started [2]
    ■ Kanban gaining popularity in industry
    ■ Idea: Update course and project!
    ■ Employ Scrum Sprints first, then switch to Kanban
    ■ Kanban for “finishing touches”
    6
    Evaluate and adapt the employed development methodology
    [2] V. Mahnic, “From Scrum to Kanban: Introducing Lean
    Principles to a Software Engineering Capstone Course”


  7. Challenges & Ideas
    7
    Scrum: the usual agile development process
    Scrum key ideas
    ■ Sprints: timeboxed
    iterations
    ■ Planning and estimation
    ■ Review and
    retrospectives
    ■ Prescriptive process


  8. Challenges & Ideas
    8
    Kanban: the new kid on the block
    Kanban key ideas
    ■ Visualize work items
    on Kanban board
    ■ “Pull” them through
    the process
    ■ Limit work-in-progress
    ■ Handle bottlenecks


  9. Research Question
    9
    Goals of the research and background
    How can we gauge the effect of curriculum
    changes on student behavior during project work?


  10. Research Question
    ■ Usual approach: end-of-term surveys
    10
    Goals of the research and background
    How can we gauge the effect of curriculum
    changes on student behavior during project work?


  11. ■ Performed regularly after the end of the course (before grades are published)
    ■ Allows student feedback on the course
    ■ Standardized questions, overwhelmingly positive responses
    → Hard to gauge changes in curriculum design
    End-of-term Surveys
    11
    General satisfaction indicators of the last 5 years


  12. Research Question
    ■ Usual approach: end-of-term surveys
    12
    Goals of the research and background
    How can we gauge the effect of curriculum
    changes on student behavior during project work?


  13. Research Question
    ■ Usual approach: end-of-term surveys
    ■ Programming project provides unique opportunity
    ■ Developers regularly produce software artifacts
    ■ GitHub: Version control system, issue tracker
    13
    Goals of the research and background
    How can we gauge the effect of curriculum
    changes on student behavior during project work?


  14. ■ Collected data from the last 5 course instalments
    ■ Last 3 course iterations introduced Kanban
    ■ The 2 before these used only Scrum
    Development Artifact Collection
    14
    Comparing development artifacts over course instalments


  15. ■ Collected data from the last 5 course instalments
    ■ Last 3 course iterations introduced Kanban
    ■ The 2 before these used only Scrum
    Crawled GitHub APIs and extracted artifacts (sketch below)
    Development Artifact Collection
    15
    Comparing development artifacts over course instalments
    commits
    tickets / user stories
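
For reference, a minimal sketch of such a crawl, using the public GitHub REST API from Python with the requests library. The repository name and access token below are placeholders, not the actual course repositories.

```python
import requests

API = "https://api.github.com"
HEADERS = {"Authorization": "token <YOUR_TOKEN>"}  # placeholder token

def fetch_all(url, params=None):
    """Collect every page of a GitHub API listing via the Link header."""
    results = []
    while url:
        resp = requests.get(url, headers=HEADERS, params=params)
        resp.raise_for_status()
        results.extend(resp.json())
        url = resp.links.get("next", {}).get("url")  # next page, if any
        params = None  # already encoded into the `next` URL
    return results

repo = "example-org/team-1-project"  # hypothetical team repository
commits = fetch_all(f"{API}/repos/{repo}/commits", params={"per_page": 100})
# Note: GitHub's issues endpoint also returns pull requests
issues = fetch_all(f"{API}/repos/{repo}/issues",
                   params={"state": "all", "per_page": 100})
print(len(commits), "commits,", len(issues), "tickets / user stories")
```

Authenticated requests are rate-limited, so a real crawl would cache responses between runs.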


  16. Development Artifact Analysis
    Kanban usage had some noticeable effects
    16
    Gaining insights into team behaviors


  17. Development Artifact Analysis
    Kanban usage had some noticeable effects
    ■ Higher mean number of non-comment events
    ■ Assigning labels (status, priority) & developers (responsibility)
    → More interaction with issues
    17
    Gaining insights into team behaviors


  18. Development Artifact Analysis
    Kanban usage had some noticeable effects
    ■ Higher mean number of non-comment events (sketch below)
    ■ Assigning labels (status, priority) & developers (responsibility)
    → More interaction with issues
    ■ More commits towards the end of a Sprint in Kanban
    ■ Scrum (planning) vs Kanban (dynamic)
    → Better ability to adapt to changes
    18
    Gaining insights into team behaviors
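
A sketch of how the non-comment event counts could be reproduced, reusing the fetch_all helper, API, and HEADERS from the crawl sketch above. The GitHub issue events endpoint already excludes comments, so counting events per issue and averaging yields the metric; the repository names are hypothetical.

```python
from collections import Counter

def mean_noncomment_events(repo):
    # The issue events endpoint lists labeled / assigned / closed / ...
    # events but not comments, which live under a separate endpoint.
    events = fetch_all(f"{API}/repos/{repo}/issues/events",
                       params={"per_page": 100})
    per_issue = Counter(e["issue"]["number"] for e in events)
    # Mean taken over issues with at least one event
    return sum(per_issue.values()) / max(len(per_issue), 1)

# Hypothetical Scrum-only vs. Scrum+Kanban team repositories
for repo in ["example-org/team-scrum", "example-org/team-kanban"]:
    print(repo, mean_noncomment_events(repo))
```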


  19. Development Artifact Analysis
    Key development artifact measures did not change significantly (sketch below)
    ■ Mean number of commits & touched files
    ■ Mean line changes per commit
    ■ Mean number of unique issues referenced
    ■ Mean issues closed, mean comments
    19
    Gaining insights into team behaviors
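
A sketch of how the commit-level measures above could be computed, again reusing requests, API, HEADERS, and the crawled commits from the earlier sketches. Note that GitHub's commit list endpoint omits per-commit file and line-change details, so each commit is fetched individually.

```python
import statistics

def commit_measures(repo, shas):
    """Mean touched files and mean line changes per commit (sketch)."""
    touched, changed = [], []
    for sha in shas:
        # The commit list endpoint omits `files` and `stats`;
        # the single-commit endpoint includes both.
        detail = requests.get(f"{API}/repos/{repo}/commits/{sha}",
                              headers=HEADERS).json()
        touched.append(len(detail.get("files", [])))
        changed.append(detail["stats"]["total"])  # additions + deletions
    return statistics.mean(touched), statistics.mean(changed)

shas = [c["sha"] for c in commits]  # `commits` from the crawl sketch
print(commit_measures(repo, shas))
```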


  20. Development Artifact Analysis
    Hypotheses regarding changes in artifacts were violated
    20
    Gaining insights into team behaviors


  21. Development Artifact Analysis
    Hypotheses regarding changes in artifacts were violated
    ■ Similar percentage of issues opened and closed by the same person
    ■ No dedicated Product Owner role
    → Expected higher engagement of entire team
    21
    Gaining insights into team behaviors


  22. Development Artifact Analysis
    Hypotheses regarding changes in artifacts were violated
    ■ Similar percentage of issues opened and closed by the same person (sketch below)
    ■ No dedicated Product Owner role
    → Expected higher engagement of entire team
    ■ No change in user story length
    ■ No need to estimate, focus on throughput
    → Expected smaller user stories
    22
    Gaining insights into team behaviors
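
A sketch of how the opened-and-closed-by-the-same-person share could be checked, reusing fetch_all, API, and HEADERS from the crawl sketch. The closed_by field used here is only populated on GitHub's single-issue endpoint, and the repository name is hypothetical.

```python
def same_closer_share(repo):
    """Share of closed issues whose opener also closed them (sketch)."""
    closed_issues = fetch_all(f"{API}/repos/{repo}/issues",
                              params={"state": "closed", "per_page": 100})
    same, total = 0, 0
    for issue in closed_issues:
        if "pull_request" in issue:  # the issues endpoint also lists PRs
            continue
        # `closed_by` is only populated on the single-issue endpoint
        detail = requests.get(issue["url"], headers=HEADERS).json()
        closer = (detail.get("closed_by") or {}).get("login")
        if closer:
            total += 1
            same += closer == issue["user"]["login"]
    return same / total if total else 0.0

print(same_closer_share("example-org/team-kanban"))  # hypothetical repo
```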


  23. Kanban Survey
    Survey in 2017/18 course instalment (N=18, 5-point Likert scale)
    23
    Asking the questions that actually matter


  24. Kanban Survey
    Survey in 2017/18 course instalment (N=18, 5-point Likert scale)
    ■ “Was the Kanban sprint more useful and
    productive than another Scrum sprint?”
    ■ Yes! (mean 4.08)
    24
    Asking the questions that actually matter


  25. Kanban Survey
    Survey in 2017/18 course instalment (N=18, 5-point Likert scale)
    ■ “Was the Kanban sprint more useful and
    productive than another Scrum sprint?”
    ■ Yes! (mean 4.08)
    ■ “Did you adapt your workflow?”
    ■ Yes (mean 3.83)
    25
    Asking the questions that actually matter


  26. Kanban Survey
    Survey in 2017/18 course instalment (N=18, 5-point Likert scale)
    ■ “Was the Kanban sprint more useful and
    productive than another Scrum sprint?”
    ■ Yes! (mean 4.08)
    ■ “Did you adapt your workflow?”
    ■ Yes (mean 3.83)
    ■ “Biggest (dis)advantages of Kanban?” (free text)
    ■ Advantages: Efficiency & Autonomy
    ■ Drawbacks: Only work on small stories,
    uneven task distribution
    26
    Asking the questions that actually matter


  27. Kanban Survey
    Survey in 2017/18 course instalment (N=18, 5-point Likert scale)
    ■ “How did user stories change from using Scrum to Kanban?”
    ■ More bug-oriented (11 mentions)
    ■ Shorter (11 mentions)
    ■ With more detailed requirements (8 mentions)
    27
    Asking the questions that actually matter


  28. Kanban Survey
    Survey in 2017/18 course instalment (N=18, 5-point Likert scale)
    ■ “How did user stories change from using Scrum to Kanban?”
    ■ More bug-oriented (11 mentions)
    ■ Shorter (11 mentions)
    ■ With more detailed requirements (8 mentions)
    ■ “Would you recommend using Kanban
    to next year’s participants?”
    ■ YES! (mean 4.33)
    28
    Asking the questions that actually matter


  29. Summary & Conclusion
    ■ Kanban introduction was liked by students, but w/ mixed success
    29
    Take-away messages
    [email protected] @chrisma0


  30. Summary & Conclusion
    ■ Kanban introduction was liked by students, but w/ mixed success
    ■ Development artifacts represent another dimension of analysis
    ■ Beyond the perceptions of students
    ■ Based on data naturally produced, high “response rate”
    30
    Take-away messages
    [email protected] @chrisma0


  31. Summary & Conclusion
    ■ Kanban introduction was liked by students, but w/ mixed success
    ■ Development artifacts represent another dimension of analysis
    ■ Beyond the perceptions of students
    ■ Based on data naturally produced, high “response rate”
    ■ Analysis revealed the areas where expectations were…
    ■ Confirmed
    ■ Violated! (even more interesting)
    31
    Take-away messages
    [email protected] @chrisma0


  32. Summary & Conclusion
    ■ Kanban introduction was liked by students, but w/ mixed success
    ■ Development artifacts represent another dimension of analysis
    ■ Beyond the perceptions of students
    ■ Based on data naturally produced, high “response rate”
    ■ Analysis revealed the areas where expectations were…
    ■ Confirmed
    ■ Violated! (even more interesting)
    → Opportunity for conversation and improvement
    32
    Take-away messages
    [email protected] @chrisma0


  33. Image Credits
    33
    In order of appearance
    ■ Archaeologist by Gan Khoon Lay from the Noun Project (CC BY 3.0 US)
    ■ Mortar Board by Mike Chum from the Noun Project (CC BY 3.0 US)
    ■ Target by Arthur Shlain from the Noun Project (CC BY 3.0 US)
    ■ Process by Laymik from the Noun Project (CC BY 3.0 US)
    ■ Questions by Gregor Cresnar from the Noun Project (CC BY 3.0 US)
    ■ Data collection by H Alberto Gongora from the Noun Project (CC BY 3.0 US)
    ■ Search Code by icon 54 from the Noun Project (CC BY 3.0 US)
    ■ Clipboard by David from the Noun Project (CC BY 3.0 US)
    ■ Idea by Gilbert Bages from the Noun Project (CC BY 3.0 US)
