Research Questions
So many questions, so little time!
■ How can we
□ “lint” project data to find process improvement areas? [1]
□ measure success in implementing agile practices? [2]
□ design curricula and exercises for ease of data collection? [3]
□ use project data to assess curriculum goals? [4]
□ integrate data-informed reflection into Agile processes? [5]
[1] C. Matthies, T. Kowark, K. Richly, M. Uflacker, and H. Plattner (2016), “How Surveys, Tutors, and Software Help to Assess Scrum Adoption,” in Proceedings of the 38th International Conference on Software Engineering Companion (ICSE), ACM, doi:10.1145/2889160.2889182
[2] C. Matthies, T. Kowark, M. Uflacker, and H. Plattner (2016), “Agile Metrics for a University Software Engineering Course,” in 2016 IEEE Frontiers in Education Conference (FIE), IEEE, doi:10.1109/FIE.2016.7757684
[3] C. Matthies, A. Treffer, and M. Uflacker (2017), “Prof. CI: Employing Continuous Integration Services and GitHub Workflows to Teach Test-Driven Development,” in 2017 IEEE Frontiers in Education Conference (FIE), IEEE, doi:10.1109/FIE.2017.8190589
[4] C. Matthies, R. Teusner, and G. Hesse (2018), “Beyond Surveys: Analyzing Software Development Artifacts to Assess Teaching Efforts,” in 2018 IEEE Frontiers in Education Conference (FIE), IEEE, doi:10.1109/FIE.2018.8659205
[5] C. Matthies (2019), “Feedback in Scrum: Data-informed Retrospectives,” in Proceedings of the 41st International Conference on Software Engineering: Companion Proceedings (ICSE), IEEE, doi:10.1109/ICSE-Companion.2019.00081