
Quality in Use — Meet the Needs of Users

LINE DEVDAY 2021

November 11, 2021

Transcript

  1. Difference in perspective. Provider: functionality, performance, black-box testing, sign-off. Users: pleasure, safety, not being annoyed, getting things done quickly.
  2. Difference in perspective. Provider: functionality, performance, black-box testing, sign-off. Users: pleasure, safety, not being annoyed, getting things done quickly. The gap between the two views is a well-known story, but we need a method to address it from the quality side.
  3. Difference in perspective. How much do we really understand the user? Is black-box testing the same as testing from the user's viewpoint?
  4. Difference in perspective. How much do we really understand the user? Black-box testing versus testing from the user's viewpoint.
  5. Service quality measurement. How difficult is it to measure? Service quality is difficult to measure quantitatively:
    - Service quality is subjective.
    - It is difficult to collect customer data.
    - The customer is a factor of change.
  6. Service quality measurement. How difficult is it to measure? For our tests to be more meaningful, the quality in the context of use should be known.
  7. What is Quality in Use? Its characteristics are satisfaction, effectiveness, efficiency, freedom from risk, and context coverage, covering both pragmatic and hedonic aspects.
    - A concept that has existed for some time.
    - Definition and measurement standards exist (ISO/IEC 25010 / 25022).
    - It is not just usability, but the overall quality of use.
  8. Quality in Use measurement: the effect of the software product. When do we perform Quality in Use measurement, in the development phase or the QA phase? [Diagram of the ISO/IEC quality lifecycle model: process quality influences internal properties, internal properties influence external properties, and external properties influence quality in use; each level in turn depends on the next, and quality in use also depends on the contexts of use. Each level has its own measure: process measure, internal measure, external measure, and quality-in-use measure.]
  9. Goal of Quality in Use measurement: measure from the user's point of view periodically, find what we can do, and make the customer happy.
  10. Process.
    - Quality in Use standard & research, then search for a measurement method: (i) check the available logs, (ii) check the measurement method, (iii) select proper metrics.
    - Develop metrics & pilot service measurement: (i) choose pilot services (LINE Pay, TH Bank), (ii) define tasks, (iii) measure weekly, (iv) evaluate the indicators.
    - Analysis and improvement through the results: (i) analyze the results, (ii) analyze the error codes.
  11. Measurement method. Logs are better than questionnaires and user tests: questionnaires and user tests are attractive but hard to run periodically. Candidate sources: questionnaire, user test, API log, Google Analytics, click log, VOC log.
  12. Measurement method. For a service where GA cannot be applied for security reasons (LINE Pay), and because most services lack log data for user analysis, we match APIs to screens and user actions. For the "attempt transfer" task this covers the get-user-and-friends APIs, auth APIs, transfer APIs, and balance info APIs; a sketch of this mapping follows below.
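To make the mapping concrete, here is a minimal Python sketch of how API-log rows could be grouped into per-user attempts of the "attempt transfer" task. The endpoint paths, log fields, and step names are assumptions for illustration, not LINE Pay's actual APIs.

```python
from collections import defaultdict

# Hypothetical mapping from API endpoint to a step of the "attempt transfer"
# task; the paths are placeholders, not LINE Pay's real APIs.
STEP_OF_ENDPOINT = {
    "/v1/users/friends": "task_start",      # get user and friends APIs
    "/v1/auth/verify":   "step_2_auth",     # auth APIs
    "/v1/balance":       "step_3_balance",  # balance info APIs
    "/v1/transfer":      "task_complete",   # transfer APIs
}

def group_into_attempts(api_log):
    """Group raw API-log rows into per-user event lists for the task.

    api_log: iterable of dicts such as
        {"user": "u1", "ts": 1636600000.0, "endpoint": "/v1/auth/verify", "status": 200}
    Returns {user: [(ts, step, status), ...]} with events sorted by time,
    keeping only rows whose endpoint belongs to the task being measured.
    """
    attempts = defaultdict(list)
    for row in api_log:
        step = STEP_OF_ENDPOINT.get(row["endpoint"])
        if step is not None:
            attempts[row["user"]].append((row["ts"], step, row["status"]))
    for rows in attempts.values():
        rows.sort()  # order each user's events by time
    return attempts
```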
  13. Develop metrics on a log basis. Chosen and developed indicators, with method and source:
    - Effectiveness: Task completed ratio = A/B, where A = number of unique tasks completed and B = total number of unique tasks attempted. Source: API log, GA.
    - Efficiency: Task time ratio = T/A, where A = task time and T = ideal time (steps of the task x 2 s). Source: API log, GA.
    - Freedom from risk: Safety of users affected by use of the system = 1 - A/B, where A = number of users who hit a blocked/system error during the task and B = total number of users using the service for the task. Source: API log.
    - Satisfaction: Proportion of user complaints about a task = 1 - (A/B) / (C/D), where A = number of complaints for a particular task, B = total number of users using the particular task, C = number of complaints for the service, and D = total number of users using the service. Source: VOC.
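As a reference, the four indicators above can be written out as plain functions. This sketch only encodes the arithmetic of the formulas; extracting the A/B/C/D/T counts from the API log, GA, or VOC data is a separate step.

```python
def effectiveness(tasks_completed, tasks_attempted):
    """Task completed ratio = A / B."""
    return tasks_completed / tasks_attempted

def efficiency(ideal_time_s, task_time_s):
    """Task time ratio = T / A, with T = steps * 2 s and A = measured task time."""
    return ideal_time_s / task_time_s

def freedom_from_risk(users_hit_error, users_on_task):
    """Safety of users affected by use of the system = 1 - A / B."""
    return 1 - users_hit_error / users_on_task

def satisfaction(task_complaints, task_users, service_complaints, service_users):
    """Proportion of user complaints about a task = 1 - (A/B) / (C/D)."""
    return 1 - (task_complaints / task_users) / (service_complaints / service_users)
```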
  14. Effectiveness. Check whether users achieve their goals. [Funnel diagram for the task: Task start, Step 2, Step 3, Step 4, Task complete, annotated with per-step inflow and drop-off percentages and per-step times.]
  15. Effectiveness. Check whether users achieve their goals. Task complete ratio = tasks completed / total tasks attempted. Analysis:
    - Check the drop-off at each step.
    - Check the response codes at each step (excluding success codes).
    - Check the complete-ratio trend by task.
    Result = 0.312 (tasks completed = 151,646; total tasks attempted = 488,944).
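One possible shape of the per-step drop-off analysis, assuming attempts grouped as in the earlier mapping sketch: count how far each user got, collect non-success response codes at the step where they stopped, and derive the task complete ratio from the funnel. Treating status codes of 300 and above as non-success is an assumption.

```python
from collections import Counter

STEPS = ["task_start", "step_2_auth", "step_3_balance", "task_complete"]

def effectiveness_funnel(attempts):
    """attempts: {user: [(ts, step, status), ...]} as in the mapping sketch."""
    reached = Counter()
    fail_codes = Counter()
    for rows in attempts.values():
        seen = [STEPS.index(s) for _, s, _ in rows if s in STEPS]
        if not seen:
            continue
        last = max(seen)
        for i in range(last + 1):          # funnel approximation: reaching a
            reached[STEPS[i]] += 1         # later step implies the earlier ones
        for _, step, status in rows:       # non-success codes at the step
            if step == STEPS[last] and status >= 300:   # where the user stopped
                fail_codes[(step, status)] += 1
    attempted = reached[STEPS[0]]
    completed = reached[STEPS[-1]]
    ratio = completed / attempted if attempted else 0.0  # task complete ratio
    return reached, fail_codes, ratio
```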
  16. Efficiency. How long does it take for users to achieve their goals? [Diagram of the same task funnel, Task start through Task complete, annotated with the time spent at each step (stay time + transition time).]
  17. Efficiency. How long does it take for users to achieve their goal? Task time ratio = ideal time / task time, where the ideal time is an internal baseline of 2 s per step (a step being a user action or screen transition). Analysis:
    - Check the steps that took the most time.
    - Check back-end API performance.
    - Check the task-time-ratio trend by task.
    Result = 0.426 (task time = 18.8 s; ideal time = 4 steps x 2 s).
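A small sketch of how the task time ratio could be derived from the same event data: the task time is taken as the span from the first to the last event of a completed attempt (an assumption about how the 18.8 s was obtained), and the ideal time is the number of steps times 2 s as defined on the slide.

```python
def task_time_ratio(rows, n_steps=4):
    """rows: one user's (ts, step, status) events for a completed attempt."""
    times = [ts for ts, _, _ in rows]
    task_time = max(times) - min(times)   # measured task time in seconds
    ideal_time = n_steps * 2.0            # internal basis: each step ~2 s
    return ideal_time / task_time

# With the numbers reported on the slide: 4 steps * 2 s = 8 s ideal,
# 18.8 s measured -> 8 / 18.8, about 0.426.
print(4 * 2.0 / 18.8)
```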
  18. Freedom from risk. Does the user feel unsafe while using the service? Safety of users affected = 1 - (users facing an error on the task / total users on the task). Result = 0.992 (users facing an error = 1,279; total users on the task = 166,056). Analysis:
    - Analyze the pattern of the defined errors.
    - Analyze the types of system errors.
    - Check the error trend by task.
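A sketch of the freedom-from-risk count on the same data: the distinct users who hit a blocked or system error during the task, divided by all users who ran the task. Treating HTTP 5xx as the defined error class is an assumption; the deck only names "blocked / system error".

```python
def safety_of_users(attempts, is_error=lambda status: status >= 500):
    """attempts: {user: [(ts, step, status), ...]} for one task."""
    users_on_task = set(attempts)
    users_with_error = {
        user for user, rows in attempts.items()
        if any(is_error(status) for _, _, status in rows)
    }
    return 1 - len(users_with_error) / len(users_on_task)

# With the counts from the slide: 1 - 1,279 / 166,056, about 0.992.
print(1 - 1_279 / 166_056)
```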
  19. Satisfaction. How satisfied are users, and how much do they enjoy using the service? Proportion of user complaints about a task = 1 - (A/B) / (C/D), where A = number of complaints for a particular task, B = total number of users using the particular task, C = number of complaints for the service, and D = total number of users using the service. Analysis:
    - Analyze VOC items by task (in particular, inquiries about how to use it).
    - Check the VOC trend by task.
    Result = 0.861 (A = 51; B = 166,056; C = 2,788; D = 1,252,910).
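A sketch of the satisfaction calculation from VOC data, assuming each complaint record is already tagged with the task it refers to (the tagging scheme is not described in the deck); the user counts B and D come from the API log as on the previous slides.

```python
def complaint_proportion(voc_records, task, task_users, service_users):
    """voc_records: list of dicts like {"task": "transfer", "text": "..."},
    one per VOC complaint; task/service user counts come from the API log."""
    task_complaints = 0       # A
    service_complaints = 0    # C
    for voc in voc_records:
        service_complaints += 1
        if voc.get("task") == task:
            task_complaints += 1
    task_rate = task_complaints / task_users            # A / B
    service_rate = service_complaints / service_users   # C / D
    return 1 - task_rate / service_rate

# Plugging in the counts from the slide gives roughly the reported value.
print(1 - (51 / 166_056) / (2_788 / 1_252_910))  # about 0.86
```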
  20. Quality in Use score experiment. QIU score = (Effectiveness + Efficiency + Satisfaction + Freedom from risk) / number of items. The individual measurement results are related to each other, but we have not found the relation yet. Results: Effectiveness 31, Efficiency -, Freedom from Risk -, Satisfaction 99, QIU Score 64.5.
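Finally, a sketch of the experimental QIU score: with no established relation between the characteristics yet, the per-characteristic results (on a 0-100 scale) are simply averaged with equal weight. The example call uses placeholder scores, not the deck's data.

```python
def qiu_score(effectiveness, efficiency, freedom_from_risk, satisfaction):
    """Equal-weight average of the four characteristic scores (0-100 scale)."""
    items = [effectiveness, efficiency, freedom_from_risk, satisfaction]
    return sum(items) / len(items)

print(qiu_score(80, 70, 95, 90))  # placeholder scores -> 83.75
```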