
Learner Centred Design for Learning Analytics


Slides for a presentation at the 14th annual Academic Practice and Technology (APT) Conference.
Abstract: https://showtime.gre.ac.uk/index.php/ecentre/apt2016/paper/view/980

Ed de Quincey

June 28, 2016


Transcript

  1. Learner Centred Design for
    Learning Analytics
    Dr Ed de Quincey, Dr Mark Turner, Dr Theo Kyriacou, Dr Nikki Williams
    School of Computing and Mathematics, Keele University
    Photo by GotCredit (www.gotcredit.com)


  2. Dr Ed de Quincey @eddequincey
    Postgraduate Course Director, Lecturer in Computer Science
    School of Computing and Mathematics, Keele University
    Lead of the Software Engineering Research Cluster
    instagram.com/eddequincey


  3. TRADITIONALLY, A STUDENT’S PROGRESS AND LEVEL OF
    ENGAGEMENT HAVE BEEN MEASURED BY ASSESSMENT
    Photo by Alberto G.
    https://www.flickr.com/photos/albertogp123/5843577306


  4. HOW ELSE CAN WE MEASURE ENGAGEMENT
    AND PROGRESS IN REAL-TIME?
    Photos by
    Cropbot https://en.wikipedia.org/wiki/Lecture_hall#/media/File:5th_Floor_Lecture_Hall.jpg
    See-ming Lee https://www.flickr.com/photos/seeminglee/4556156477


  5. Learning Analytics
    has been defined as a method for
    “deciphering trends
    and patterns from
    educational big data
    … to further the
    advancement of a
    personalized,
    supportive system
    of higher education.”
    (Johnson et al., 2013)
    Co-authorship network map of physicians publishing on hepatitis C (detail)
    Source: http://www.flickr.com/photos/speedoflife/8274993170/


  6. NTU student dashboard:
    Learning analytics to improve retention


  7. Blackboard Analytics


  8. (image-only slide)

  9. Apple Lisa
    PRESENTING STUDENT DATA BACK TO STUDENTS and LECTURERS,
    USING USER-CENTRIC QUERIES, FORMATS and METAPHORS


  10. (image-only slide)

  11. WP1. KLE Data Review
    A review of activity log data that are
    currently available via the KLE.





  12. (image-only slide)

  13. (image-only slide)

  14. WP1. KLE Data Review
    Around 11 “reports” available:
    9 Course Reports, a Performance Dashboard and a Retention Centre
    • Inconsistent formats
    • Reports take a long time to run
    • No way to bulk download all “raw” data
    • Single Course User Participation Report is the most useful,
    BUT it has to be run for each student


  15. WP1. KLE Data Review (and WP2. LA Pilot Study)
    A set of log files, generated from
    previous KLE module activity.
    Data Persistence:
    logs are deleted after 6 months
    Data Protection/Usage Policy:
    current policies do not cover usage for Learning
    Analytics


  16. Identified and Reviewed 22 Learning Analytics Systems


  17. Review of Existing Systems
    Few systems available for general use
    Primarily targeted at educators, with only 5 of the 22
    systems designed purely for use by students
    Only 4 of the studies gathered the requirements for
    the systems directly from students
    Currently analysing the visualisation techniques used
    and the types/sources of data used


  18. (image-only slide)

  19. “Jisc is hosting an event in February where we’ll bring together people
    from universities and colleges across the UK to look at what they
    think can and should be provided to students directly.”
    December, 2014
    “What data and analytics should be presented directly to students? This
    was the subject of a workshop on 27th Feb in Jisc’s London offices,
    attended by staff from across UK higher education. We also had
    with us a couple of students with a keen interest in the
    area … In advance of the meeting members of Jisc’s Student
    App Expert Group had anonymously provided a range
    of potential requirements, which I then grouped and used as the
    basis for the workshop.” March, 2015
    Effective Learning Analytics
    Using data and analytics to support students
    https://analytics.jiscinvolve.org/


  20. “I’m now carrying out
    sessions directly with
    students to find out
    what they would find
    most useful… The
    students were from a variety
    of levels and backgrounds,
    ranging from engineering to
    drama.” April 29, 2015
    Effective Learning Analytics
    Using data and analytics to support students
    https://analytics.jiscinvolve.org/


  21. “I joined colleagues from Jisc and Therapy Box last
    week in Manchester to apply the same methodology
    to app design that our SOSI students have
    been using. The technique, developed by live|work,
    includes devising personas and user
    journeys, and competitor and SWOT
    analyses, defining business and
    technical requirements, walking
    through concepts with other teams,
    and the development and user
    testing of wireframes” August 21, 2015
    Effective Learning Analytics
    Using data and analytics to support students
    https://analytics.jiscinvolve.org/
    https://analytics.jiscinvolve.org/wp/2015/04/29/what-do-students-want-from-a-learning-analytics-app/


  22. “A major influence on our thinking is the use of fitness apps such
    as MapMyRide and Google Fit: some of us are already avid users of
    these technologies. To emulate their addictive qualities
    in an app for learning is one of our aims.”
    https://analytics.jiscinvolve.org/wp/2015/08/21/student-app-for-learning-analytics-functionality-and-wireframes/


  23. “About half of the respondents (427/934, 45.7%) had stopped
    using some health apps, primarily due to high data entry
    burden, loss of interest, and hidden costs.”


  24. (image-only slide)

  25. User-Centered Design Process Map
    http://www.usability.gov/how-to-and-tools/resources/ucd-map.html


  26. WP3. Requirements Gathering
    Coursework Case Study:
    set for a 2nd Year Computing Module
    (84 students in 14 groups)
    Contextual Interviews with staff:
    asked questions as they complete tasks on the KLE
    (2 completed so far)


  27. WP3. Requirements Gathering
    Preliminary results from Thematic Analysis of key
    features students said the dashboard should include:
    Students want an overview of their activity/learning, i.e.
    everything in one place
    Comparison to the average seems important
    A common metaphor was a timeline/calendar, i.e. students
    wanting a way of showing their progress against deadlines
    Common functionality was support for (Instant) Messaging/
    Discussion


  28. 'My Timeline': Past events which have been completed successfully provide
    gratification to the user in the form of positive icons such as thumbs up or smiley faces.
    The past timeline balances with the upcoming events to try and alleviate future workload
    stress by demonstrating positive success at the same time.


  29. Digital Syncable Diary: the diary can be edited and synced to
    various devices; a countdown system to exams and assignments;
    the ability for students to set priorities for set work.


  30. Course/Module Progress Bar: a progress bar based upon how far through the course
    users are, indicating how much of the course they should know and how much time is
    left until the course finishes.


  31. Score Indicator: the score indicator is a metric derived from an algorithm
    which would track a student’s engagement with course materials and
    other important areas of engagement that students should be utilising
    (comments, discussion).
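    The slide does not specify the algorithm itself. A minimal hypothetical sketch of such a metric is a weighted sum over activity counts, capped at 100; the activity names, weights and calibration constant below are illustrative assumptions, not the project's actual design.

    ```python
    # Hypothetical Score Indicator sketch: a weighted sum over activity
    # counts, normalised and capped at 100. The activity names, weights
    # and nominal_max constant are illustrative assumptions.

    WEIGHTS = {"content_views": 1.0, "comments": 2.0, "discussion_posts": 3.0}

    def engagement_score(activity, weights=WEIGHTS, nominal_max=100.0):
        """Return a 0-100 score from a dict of activity counts."""
        raw = sum(weights.get(name, 0.0) * count for name, count in activity.items())
        return min(100.0, 100.0 * raw / nominal_max)

    score = engagement_score({"content_views": 20, "comments": 5, "discussion_posts": 4})
    ```

    Weighting discussion posts above passive content views reflects the slide's emphasis on comments and discussion as areas students "should be utilising".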


  32. Degree Classification Requirements: Panel showing the Percentage
    needed in Future Assignments to get certain Degree Classifications.
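    The arithmetic behind such a panel is simple to sketch, assuming the overall mark is a credit-weighted average and standard UK classification boundaries; the boundary values and function names are my own illustrative assumptions, not the project's implementation.

    ```python
    # Minimal sketch of the "percentage needed" calculation. Assumes an
    # overall mark that is a credit-weighted average and standard UK
    # boundaries; boundary values and names are illustrative assumptions.

    BOUNDARIES = {"First": 70.0, "2:1": 60.0, "2:2": 50.0, "Third": 40.0}

    def needed_average(current_avg, weight_done, target):
        """Average needed on the remaining (1 - weight_done) fraction of
        credit to finish with an overall mark of `target`."""
        remaining = 1.0 - weight_done
        if remaining <= 0:
            raise ValueError("no assessments remaining")
        return (target - current_avg * weight_done) / remaining

    def classification_requirements(current_avg, weight_done):
        """Needed future average for each classification boundary."""
        return {name: round(needed_average(current_avg, weight_done, b), 1)
                for name, b in BOUNDARIES.items()}
    ```

    For example, a student averaging 62% with half the credit complete would need a 78% average on the remaining assessments to reach a First.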


  33. Comparison with peers: the chance for the user to see how
    they are comparing with the top 20% of the class and how they
    are doing compared with the average mark.
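    As a hedged sketch of this comparison (the top-20% cutoff convention and the rounding are my assumptions, not the project's):

    ```python
    # Hypothetical peer-comparison sketch: a student's mark against the
    # class average and the top-20% cutoff. The cutoff convention (the
    # mark held by the lowest student inside the top 20% of the ranking)
    # is an illustrative assumption.
    import statistics

    def peer_comparison(my_mark, class_marks):
        avg = statistics.mean(class_marks)
        ranked = sorted(class_marks, reverse=True)
        cutoff_index = max(0, round(len(ranked) * 0.2) - 1)
        return {"vs_average": round(my_mark - avg, 1),
                "in_top_20pct": my_mark >= ranked[cutoff_index]}
    ```

    In a real dashboard this would run over anonymised cohort marks so no individual peer's mark is exposed, only the aggregate.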


  34. Goals and Trophies: The student will get a list of goals or tasks
    sent to their dashboard and there they can also add more tasks
    of their own.


  35. News Feed: an aggregated news feed is a pattern followed on many
    major platforms, and provides an intuitive way to group content into a
    digestible stream of information from multiple sources (modules).


  36. Discussion Forum: an open forum within the learning dashboard application, to allow
    students to post questions about specific issues related to their course or to materials,
    and for staff to then see where collective issues were found.


  37. Did we find agreement with the 6 Jisc principles?
    • Comparative ✓
    • Social ✓
    • Gamified ✓
    • Private by default (✓)
    • Usable standalone ✗ (seen as part of the VLE)
    • Uncluttered (✓), though VLEs have poor usability
    Effective Learning Analytics
    Using data and analytics to support students
    https://analytics.jiscinvolve.org/
    https://analytics.jiscinvolve.org/wp/2015/08/21/student-app-for-learning-analytics-functionality-and-wireframes/


  38. “So all in all some great suggestions
    from this group of students
    in Lincoln. Some of them aren’t
    what are normally
    considered learning analytics
    applications but they all rely
    on data – some of it new data such
    as students being able to specify their
    interests in more detail in order to
    receive targeted materials and details
    of events.” April 29, 2015
    Effective Learning Analytics
    Using data and analytics to support students
    https://analytics.jiscinvolve.org/
    https://analytics.jiscinvolve.org/wp/2015/04/29/what-do-students-want-from-a-learning-analytics-app/


  39. WP3. Requirements Gathering
    Contextual Interviews with staff
    • Accesses do not necessarily mean engagement.
    • Only interested in low values, e.g. not accessed for a long time.
    • The currently inaccurate Dashboard and Retention Centre alerts make
    people feel less trustful of the data.
    • Information in current reports is interesting rather than useful.
    • Comparison against average values (for a module/class) is
    important as it provides context to the data; otherwise it is not clear
    what is 'good' or 'bad'.
    • The reports need to be easy and quick to run/use.
    • No easily accessible method for seeing a student’s overall level of
    interaction across all modules.


  40. WP4. Implementation of Prototype LA Dashboard
    WP5. Pilot Study into uses of LA Dashboard within a Module
    A Python program that combines 2 reports to calculate the files uploaded since a student’s last login
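    The slide does not show the program itself. A minimal sketch of the idea, assuming the two report exports arrive as rows with `uploaded`, `student_id` and `last_login` timestamp columns (those column names and the timestamp format are my assumptions; real KLE/Blackboard exports will differ):

    ```python
    # Sketch of combining two report exports to count files uploaded
    # since each student's last login. Column names ('uploaded',
    # 'student_id', 'last_login') and the timestamp format are
    # assumptions, not the actual report schema.
    from datetime import datetime

    def parse(ts):
        return datetime.strptime(ts, "%Y-%m-%d %H:%M")

    def files_since_last_login(content_rows, login_rows):
        """content_rows: content report rows (one per uploaded file).
        login_rows: participation report rows (one per student)."""
        uploads = [parse(row["uploaded"]) for row in content_rows]
        return {row["student_id"]: sum(up > parse(row["last_login"]) for up in uploads)
                for row in login_rows}
    ```

    The per-student counts could then feed the reminder emails on the next slide, flagging students who have missed newly uploaded material.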


  41. (image-only slide)

  42. Personalised (semi) automated reminder emails
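    One way to implement the "semi" part is to draft the messages automatically but leave sending to a human. In this hedged sketch, the template wording, field names and the 14-day threshold are all illustrative assumptions:

    ```python
    # Hedged sketch of drafting personalised reminder emails for students
    # who haven't accessed the module recently. Template wording, field
    # names and the 14-day threshold are illustrative assumptions; drafts
    # are returned for a human to review and send ("semi" automated).
    from datetime import datetime

    TEMPLATE = ("Hi {name},\n\nWe noticed you haven't accessed the module on "
                "the KLE for {days} days. New materials have been uploaded; "
                "do get in touch if you need any help.\n")

    def draft_reminders(students, now, threshold_days=14):
        """students: dicts with 'name' and 'last_access' (datetime)."""
        drafts = []
        for s in students:
            days = (now - s["last_access"]).days
            if days > threshold_days:
                drafts.append((s["name"], TEMPLATE.format(name=s["name"], days=days)))
        return drafts
    ```

    Returning drafts rather than sending directly keeps a lecturer in the loop, which also sidesteps some of the data-usage policy issues raised on slide 15.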


  43. Learning Analytics in the classroom?
    Students in lecture hall © Jirka Matousek via Flickr


  44. “The average user interface has
    some 40 flaws. Correcting the easiest
    20 of these yields an average
    improvement in usability of 50%.
    The big win, however,
    occurs when usability is
    factored in from the
    beginning. This can yield
    efficiency improvements of
    over 700%.” (Landauer, 1995)
    © CollegeDegrees360 via Flickr


  45. “The cost of poor usability is
    high. It includes unsatisfied,
    ineffective learners and
    ineffective e-learning
    initiatives. Learners who find an
    e-learning program hard to use
    might:
    • Carry out their task reluctantly
    • Be confused about the learning
    exercise
    • Fail to engage with the e-learning,
    possibly abandon the e-learning
    completely, fail to learn or retain
    knowledge.” (Abedour and Smith, 2006)
    © CollegeDegrees360 via Flickr


  46. Good morning
    class…
    Do you prefer lecturing in the dark?


  47. Appendix: Other Example Dashboards
    Dr Ed de Quincey, Dr Mark Turner, Dr Theo Kyriacou, Dr Nikki Williams
    School of Computing and Mathematics, Keele University
    Photo by GotCredit (www.gotcredit.com)


  48–54. (image-only slides: screenshots of other example dashboards)