
Leveraging educational software, the VideoLab case study

A case study in educational software simulation design, development, and use.
Presented at END 2019 – International Conference on Education and New Developments, Porto, June 22, 2019.

Eduardo Morais

June 22, 2019

Transcript

  1. Leveraging educational software with exploration guides in Higher Arts Education: the VideoLab simulation case study
    Eduardo Morais ([email protected])
    With Carla Morais & João C. Paiva (cmorais | [email protected])
    END 2019 – International Conference on Education and New Developments, Porto, June 22-24.
    This work was supported by the UT Austin | Portugal Program and the Foundation for Science and Technology (FCT) doctoral scholarship PD/BD/128416/2017.

  2. My story
    University of Porto Digital Media PhD scholarship (2016-).
    Experience teaching technological subjects to arts students (ESAP 2005–2015, U.Porto 2014–2017):
    • Digital video editing and processing (2005–2017);
    • Computer programming / ‘creative code’ (2010–2017).

  3. The context
    There is a lack of exploration of digital technologies’ potential in arts classrooms. Computers are still regarded as “tools rather than a medium.”
    Wilks, J., Cutcher, A., & Wilks, S. (2012). Digital Technology in the Visual Arts Classroom: An [Un]easy Partnership. Studies in Art Education, 54(1), 54–65.
    Rizwan, O. (2015). Bring on the Real Computer Revolution. Intersect, 8(2).

  4. The context
    Artists are expected to “encounter technology with impunity.” Educators and institutions are called to “negotiate the effects of mediation.”
    McLuhan, Marshall (1964/2001). Understanding Media: The Extensions of Man. Routledge.
    Ferneding, Karen (2007). “Understanding the Message of the Medium: Media Technologies as an Aesthetic,” in L. Bresler (Ed.), International Handbook of Research in Arts Education (pp. 1331–1352). Springer.

  5. … but also a highly technical craft (dealing with complex technologies and materials, e.g. digital video).

  6. Simulation development. Why?
    Effectiveness grounded in literature reviews; material practice is central in arts education.
    Hausman, J. J. (2000). Arts Education in the Twenty-First Century: A Personal View. Arts Education Policy Review, 102(2), 17–18.
    Escueta et al. (2017). Education Technology: An Evidence-Based Review. Cambridge, MA.

  7. VideoLab is educational software that lets students learn about and interact with basic concepts of digital video technology.

  8. Design
    According to the principles of Cognitive Load and Multimedia Learning theories.
    Mayer, Richard E. (2003). “The Promise of Multimedia Learning: Using the Same Instructional Design Methods across Different Media.” Learning and Instruction, 13(2), 125–139.
    van Merriënboer & Ayres (2005). “Research on Cognitive Load Theory and Its Design Implications for E-Learning.” Educational Technology Research and Development, 53(3), 5–13.

  9. Design
    Modality: four different simulation modes:
    • Aspect ratios;
    • Interlaced & progressive scanning;
    • Luma, chroma & sampling types;
    • Bit depth.
    Video can be input from files or a live camera.

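The last two modes come down to simple sample arithmetic. The deck does not show VideoLab's internals, so the function below is only a hypothetical sketch of how chroma subsampling and bit depth determine the size of a raw Y'CbCr frame:

```python
def raw_frame_bits(width, height, subsampling=(4, 2, 0), bit_depth=8):
    """Bits needed for one raw Y'CbCr frame under J:a:b chroma subsampling.

    Assumes the common reading of J:a:b: 'a' chroma samples per J luma
    samples horizontally, halved vertically when b == 0 (e.g. 4:2:0).
    """
    j, a, b = subsampling
    luma = width * height
    chroma = 2 * luma * (a / j) * (1.0 if b else 0.5)  # Cb + Cr planes
    return int((luma + chroma) * bit_depth)

# 1080p: 8-bit 4:2:0 vs. 10-bit 4:4:4 (sizes in MiB).
print(raw_frame_bits(1920, 1080) / 8 / 2**20)                 # ≈ 2.97
print(raw_frame_bits(1920, 1080, (4, 4, 4), 10) / 8 / 2**20)  # ≈ 7.42
```
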
  10. Exploration guide
    To ease students “from worked examples into independent problem-solving.”
    To “guide students’ learning and avoid dissipation.”
    To measure outcomes (a small questionnaire at the end of each section).
    van Merriënboer & Ayres (2005). “Research on Cognitive Load Theory and Its Design Implications for E-Learning.” Educational Technology Research and Development, 53(3), 5–13.
    Paiva, João C. and Luiza A. Costa (2010). “Exploration Guides as a Strategy to Improve the Effectiveness of Educational Software in Chemistry.” Journal of Chemical Education, 87(6), 589–591.

  11. Case studies
    Constraints:
    • VideoLab use shouldn’t impact the course syllabus or the session time assigned to its topics;
    • No control group.

  12.–14. Case studies
    Method:
    • Learning Object Evaluation Scale for Students (LOES-S): learning efficacy, object quality, object engagement;
    • Exploration Guide quizzes;
    • Screen recording for usage measurements (with full written consent of the students and the institution).
    Kay & Knaack (2008). “Assessing Learning, Quality and Engagement in Learning Objects: The Learning Object Evaluation Scale for Students (LOES-S).” Educational Technology Research and Development, 57(2), 147–168.

  15. Case studies
    Participants:
    • N = 40 students, divided into two classes (n = 20 each);
    • 22 male students, 18 female students;
    • Ages 18 to 27 (median 19, mean 20.2);
    • Knowledge of the topics: 7 with no knowledge; 25 with a little familiarity; 8 very confident about their knowledge.

  16.–17. [Figures]
    Plots of the exploration guide outcomes vs. time using VideoLab (L) and reading the help (R) (n = 40, quadratic fit lines).
    Plots of the exploration guide outcomes vs. module loads (L) and help invocations (R) (n = 40, quadratic fit lines).

  18. Findings – EG quiz outcomes × usage
    Strong Spearman correlations found between quiz outcomes and…
    • Reading the text explanations (p < .001);
    • Switching modes less often (p = .005), suggesting the student is following the EG more closely.

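For readers who want to reproduce this kind of analysis, a minimal SciPy sketch follows; the variable names and synthetic data are placeholders, not the study's actual measurements.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Synthetic stand-ins for the study's per-student measurements (n = 40);
# the real values came from EG quizzes and screen recordings.
n = 40
reading_time = rng.uniform(0, 600, n)                 # seconds reading explanations
quiz_score = reading_time / 60 + rng.normal(0, 2, n)  # outcome, loosely tied to reading
mode_switches = rng.poisson(8, n)                     # simulation mode switches

rho, p = spearmanr(quiz_score, reading_time)
print(f"reading vs. outcomes:       rho = {rho:+.2f}, p = {p:.4f}")

rho, p = spearmanr(quiz_score, mode_switches)         # the study found a negative link
print(f"mode switches vs. outcomes: rho = {rho:+.2f}, p = {p:.4f}")
```
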
  19. Findings – LOES-S questionnaire
    Analysis:
    • Item groups checked for internal reliability (Cronbach α > .7);
    • Factor analysis used to corroborate the groups’ consistency.

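Cronbach's α has a closed form: α = k/(k−1) · (1 − Σ item variances / variance of total score). A minimal sketch of the reliability check, using synthetic data rather than the authors' script:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Synthetic Likert responses (-2..+2) for one four-item construct; the study
# accepted each LOES-S item group when alpha exceeded .7.
rng = np.random.default_rng(1)
latent = rng.normal(1.0, 0.6, (40, 1))         # one shared trait per student
responses = np.clip(np.rint(latent + rng.normal(0, 0.4, (40, 4))), -2, 2)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```
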
  20.–21. Findings – LOES-S questionnaire
    Results (-2 to +2 scale):
    • Learning value: mean +1.05 (std. dev. = .40);
    • Design quality: mean +1.23 (std. dev. = .51);
    • Engagement: mean +0.68 (std. dev. = .66).
    Correlations (Spearman, 2-tailed):
    • All three constructs correlate with each other (p ≤ .014 in all cases);
    • Strong correlation between students’ opinion of VideoLab’s design quality and their quiz outcomes (p < .001).

  22.–23. Discussion
    VideoLab’s educational efficacy was shown – but not its effectiveness (i.e., it works under study conditions; routine-practice benefits remain untested);
    Room for improvement in some of the modules/topics;
    Instructional flexibility led to different learning approaches – but reading was very important;
    Outcomes were dependent on adherence to the exploration guide.

  24.–27. Future work
    Work with other educators in other learning contexts (e.g. in vocational studies, or as a blended learning tool);
    Further study aimed at determining effectiveness;
    Use of acceptance research methodologies;
    Address the literature gap on the interactions between educational software and:
    • the instructor,
    • other materials (such as the EG),
    • the curriculum.

  28. Thank you!
    Eduardo Morais ([email protected])
    END 2019 – International Conference on Education and New Developments, Porto, June 22-24.
    This work was supported by the UT Austin | Portugal Program and the Foundation for Science and Technology (FCT) doctoral scholarship PD/BD/128416/2017.