Learner Centred Design for Learning Analytics

Slides for a presentation at The 14th annual Academic Practice and Technology (APT) Conference.
Abstract: https://showtime.gre.ac.uk/index.php/ecentre/apt2016/paper/view/980

Ed de Quincey

June 28, 2016

Transcript

  1. 1.

    Learner Centred Design for Learning Analytics Dr Ed de Quincey,

    Dr Mark Turner, Dr Theo Kyriacou, Dr Nikki Williams, School of Computing and Mathematics, Keele University. Photo by GotCredit www.gotcredit.com
  2. 2.

    Dr Ed de Quincey @eddequincey Postgraduate Course Director, Lecturer in

    Computer Science, School of Computing and Mathematics, Keele University. Lead of the Software Engineering Research Cluster. instagram.com/eddequincey
  3. 3.

    TRADITIONALLY, A STUDENT’S PROGRESS AND LEVEL OF ENGAGEMENT HAVE BEEN

    MEASURED BY ASSESSMENT. Photo by Alberto G. https://www.flickr.com/photos/albertogp123/5843577306
  4. 4.

    HOW ELSE CAN WE MEASURE ENGAGEMENT AND PROGRESS IN REAL-TIME?

    Photos by Cropbot https://en.wikipedia.org/wiki/Lecture_hall#/media/File:5th_Floor_Lecture_Hall.jpg See-ming Lee https://www.flickr.com/photos/seeminglee/4556156477
  5. 5.

    Learning Analytics has been defined as a method for “deciphering

    trends and patterns from educational big data … to further the advancement of a personalized, supportive system of higher education.” (Johnson et al., 2013) Co-authorship network map of physicians publishing on hepatitis C (detail). Source: http://www.flickr.com/photos/speedoflife/8274993170/
  6. 8.
  7. 9.

    Apple Lisa. PRESENTING STUDENT DATA BACK TO STUDENTS and LECTURERS,

    USING USER-CENTRIC QUERIES, FORMATS and METAPHORS
  8. 10.
  9. 11.

    WP1. KLE Data Review A review of activity log data

    that are currently available via the KLE. ✓ ✖
  10. 12.
  11. 13.
  12. 14.

    WP1. KLE Data Review ✓ ✖

    Around 11 “reports” available: 9 Course Reports, a Performance Dashboard and a Retention Centre.
    • Inconsistent formats
    • Reports take a long time to run
    • No way to bulk download all “raw” data
    • The Single Course User Participation Report is the most useful, BUT it has to be run for each student
  13. 15.

    WP1. KLE Data Review (and WP2. LA Pilot Study) A

    set of log files, generated from previous KLE module activity. ✖ ✖
    • Data Persistence: logs deleted after 6 months
    • Data Protection/Usage Policy: current policies do not cover usage for Learning Analytics
  14. 17.

    Review of Existing Systems Few systems available for general use

    ✖ Primarily targeted at educators.
    • Only 5 of the 22 systems were designed purely for use by students
    • Only 4 of the studies gathered the requirements for the systems directly from students
    • Currently analysing the visualisation techniques used and the types/sources of data used
  15. 18.
  16. 19.

    “Jisc is hosting an event in February where we’ll bring

    together people from universities and colleges across the UK to look at what they think can and should be provided to students directly.” December, 2014 “What data and analytics should be presented directly to students? This was the subject of a workshop on 27th Feb in Jisc’s London offices, attended by staff from across UK higher education. We also had with us a couple of students with a keen interest in the area … In advance of the meeting members of Jisc’s Student App Expert Group had anonymously provided a range of potential requirements, which I then grouped and used as the basis for the workshop.” March, 2015 Effective Learning Analytics Using data and analytics to support students https://analytics.jiscinvolve.org/
  17. 20.

    “I’m now carrying out sessions directly with students to find

    out what they would find most useful… The students were from a variety of levels and backgrounds, ranging from engineering to drama.” April 29, 2015 Effective Learning Analytics Using data and analytics to support students https://analytics.jiscinvolve.org/
  18. 21.

    “I joined colleagues from Jisc and Therapy Box last week

    in Manchester to apply the same methodology to app design that our SOSI students have been using. The technique, developed by live|work includes devising personas and user journeys, and competitor and SWOT analyses, defining business and technical requirements, walking through concepts with other teams, and the development and user testing of wireframes” August 21, 2015 Effective Learning Analytics Using data and analytics to support students https://analytics.jiscinvolve.org/ https://analytics.jiscinvolve.org/wp/2015/04/29/what-do-students-want-from-a-learning-analytics-app/
  19. 22.

    “A major influence on our thinking is the use of

    fitness apps such as MapMyRide and Google Fit: some of us are already avid users of these technologies. To emulate their addictive qualities in an app for learning is one of our aims.” https://analytics.jiscinvolve.org/wp/2015/08/21/student-app-for-learning-analytics-functionality-and-wireframes/
  20. 23.

    “About half of the respondents (427/934, 45.7%) had stopped using

    some health apps, primarily due to high data entry burden, loss of interest, and hidden costs.”
  21. 24.
  22. 26.

    WP3. Requirements Gathering

    • Set as Coursework Case Study: 2nd Year Computing Module, 84 students in 14 groups
    • Contextual Interviews with staff: asked questions as they completed tasks on the KLE; 2 completed so far
  23. 27.

    WP3. Requirements Gathering Preliminary results from Thematic Analysis of key

    features students said the dashboard should include:
    • Students want an overview of their activity/learning, i.e. everything in one place
    • Comparison to the average seems important
    • A common metaphor was a timeline/calendar, i.e. students wanting a way of showing their progress against deadlines
    • Common functionality was support for (Instant) Messaging/Discussion
  24. 28.

    'My Timeline': Past events which have been completed successfully provide

    gratification to the user in the form of positive icons such as thumbs up or smiley faces. The past timeline balances against upcoming events, aiming to alleviate future workload stress by demonstrating past success at the same time.
  25. 29.

    Digital Syncable Diary: The diary can be edited and synced

    to various devices; a countdown system to exams and assignments; the ability for students to set priorities for set work.
  26. 30.

    Course/Module Progress Bar: a progress bar based upon how

    far through the course users are, indicating how much of the course they should know and how much time is left until the course finishes.
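    The progress-bar calculation is simple date arithmetic; a minimal sketch, assuming only that the module's start and end dates are known:

```python
from datetime import date

def course_progress(start: date, end: date, today: date) -> float:
    """Fraction of the module elapsed so far, clamped to [0, 1]."""
    total_days = (end - start).days
    elapsed_days = (today - start).days
    return max(0.0, min(1.0, elapsed_days / total_days))

# A 100-day module, 25 days in:
print(course_progress(date(2016, 2, 1), date(2016, 5, 11), date(2016, 2, 26)))  # 0.25
```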
  27. 31.

    Score Indicator: The score indicator is a metric derived from

    an algorithm which would track a student’s engagement with course materials and other important areas of engagement that students should be utilising (comments, discussion).
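    A score of this kind could be sketched as a weighted sum over activity counts; the event names and weights below are purely illustrative assumptions, not the algorithm the project used:

```python
def engagement_score(events, weights=None):
    """Weighted sum of activity counts; the default weights are illustrative only."""
    if weights is None:
        weights = {"logins": 1.0, "content_views": 0.5,
                   "comments": 2.0, "discussion_posts": 2.0}
    # Events with no configured weight contribute nothing to the score.
    return sum(weights.get(event, 0.0) * count for event, count in events.items())

print(engagement_score({"logins": 10, "content_views": 40, "comments": 3}))  # 36.0
```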
  28. 32.
  29. 33.

    Comparison with peers: The chance for the user to see

    how they compare with the top 20% of the class and how they are doing compared with the average mark.
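    One way this comparison might be computed, assuming a list of (anonymised) class marks is available:

```python
def peer_comparison(class_marks, my_mark):
    """Compare a mark against the class average and the top-20% cutoff."""
    ordered = sorted(class_marks, reverse=True)
    # Mark of the lowest-scoring student still inside the top fifth of the class.
    cutoff_index = max(0, len(ordered) // 5 - 1)
    top20_cutoff = ordered[cutoff_index]
    average = sum(class_marks) / len(class_marks)
    return {"average": average,
            "above_average": my_mark >= average,
            "in_top_20_percent": my_mark >= top20_cutoff}

print(peer_comparison([40, 50, 55, 60, 62, 65, 70, 75, 80, 90], 72))
```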
  30. 34.

    Goals and Trophies: The student will get a list of

    goals or tasks sent to their dashboard, where they can also add more tasks of their own.
  31. 35.

    News Feed: An aggregated news feed is a pattern followed

    on many major platforms, and provides an intuitive way to group content into a digestible stream of information from multiple sources (modules).
  32. 36.

    Discussion Forum: An open forum within the learning dashboard application,

    to allow students to post questions about specific issues related to their course or materials, and for staff to then see where collective issues arise.
  33. 37.

    Did we find agreement with the 6 Jisc principles?

    • Comparative ✓
    • Social ✓
    • Gamified ✓
    • Private by default (✓)
    • Usable standalone ✖: seen as part of the VLE
    • Uncluttered (✓): VLEs have poor usability
    Effective Learning Analytics: Using data and analytics to support students https://analytics.jiscinvolve.org/ https://analytics.jiscinvolve.org/wp/2015/08/21/student-app-for-learning-analytics-functionality-and-wireframes/
  34. 38.

    “So all in all some great suggestions from this group

    of students in Lincoln. Some of them aren’t what are normally considered learning analytics applications but they all rely on data – some of it new data such as students being able to specify their interests in more detail in order to receive targetted materials and details of events.” April 29, 2015 Effective Learning Analytics Using data and analytics to support students https://analytics.jiscinvolve.org/ https://analytics.jiscinvolve.org/wp/2015/04/29/what-do-students-want-from-a-learning-analytics-app/
  35. 39.

    WP3. Requirements Gathering Contextual Interviews with staff

    • Access does not necessarily mean engagement.
    • Only interested in low values, e.g. not accessed for a long time.
    • The currently inaccurate Dashboard and Retention Centre alerts make people less trustful of the data.
    • Information in current reports is interesting rather than useful.
    • Comparison against average values (for a module/class) is important as it provides context to the data; otherwise it is not clear what is 'good' or 'bad'.
    • The reports need to be easy and quick to run/use.
    • No easily accessible method for seeing a student’s overall level of interaction across all modules.
  36. 40.

    WP4. Implementation of Prototype LA Dashboard WP5. Pilot Study into

    uses of LA Dashboard within a Module. A Python programme that utilises 2 reports to calculate files uploaded since last login.
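    That report-combining step might look something like the sketch below; the CSV column names (`item_name`, `uploaded`, `student_id`, `last_login`) are assumptions for illustration, since the actual KLE report formats are not shown here:

```python
import csv
from datetime import datetime

def files_uploaded_since_last_login(uploads_csv, logins_csv):
    """Count content items uploaded to a module after each student's last login.

    Assumed (hypothetical) report columns:
      uploads_csv: item_name, uploaded (ISO timestamp)
      logins_csv:  student_id, last_login (ISO timestamp)
    """
    with open(uploads_csv, newline="") as f:
        upload_times = [datetime.fromisoformat(row["uploaded"])
                        for row in csv.DictReader(f)]
    counts = {}
    with open(logins_csv, newline="") as f:
        for row in csv.DictReader(f):
            last = datetime.fromisoformat(row["last_login"])
            counts[row["student_id"]] = sum(1 for t in upload_times if t > last)
    return counts
```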
  37. 41.
  38. 44.

    “The average user interface has some 40 flaws. Correcting the

    easiest 20 of these yields an average improvement in usability of 50%. The big win, however, occurs when usability is factored in from the beginning. This can yield efficiency improvements of over 700%.” (Landauer, 1995) © CollegeDegrees360 via Flickr
  39. 45.

    “The cost of poor usability is high. It includes unsatisfied,

    ineffective learners and ineffective e-learning initiatives. Learners who find an e-learning program hard to use might:
    • Carry out their task reluctantly
    • Be confused about the learning exercise
    • Fail to engage with the e-learning, possibly abandon the e-learning completely, fail to learn or retain knowledge.” (Abedour and Smith, 2006) © CollegeDegrees360 via Flickr
  40. 47.

    Appendix: Other Example Dashboards Dr Ed de Quincey, Dr Mark

    Turner, Dr Theo Kyriacou, Dr Nikki Williams, School of Computing and Mathematics, Keele University. Photo by GotCredit www.gotcredit.com
  41. 48.
  42. 49.
  43. 50.
  44. 51.
  45. 52.
  46. 53.
  47. 54.