Quality in Use — Meet the Needs of Users

LINE DEVDAY 2021

November 11, 2021

Transcript

  1. Difference in perspective
    Provider: Functionality, Performance, Black-box testing, Sign-off
    Users: Pleasure, Safety, "Don't annoy me", Quickly
    → Difference

  2. Difference in perspective
    View of provider: Functionality, Performance, Black-box testing, Sign-off
    View of user: Pleasure, Safety, "Don't annoy me", Quickly
    → GAP
    A well-known story, but we need a method from the quality side.

  3. Difference in perspective
    How much do we really understand the user?
    Black-box testing = viewpoint-of-user testing?

  4. Difference in perspective
    How much do we really understand the user?
    Black-box testing ≠ viewpoint-of-user testing

  5. Difference in perspective
    Provider / Users
    "I represent the user"

  6. Difference in perspective
    Provider / Users
    Looking from the same direction is difficult.

  7. Service quality measurement
    How difficult is it to measure?
    Service quality is difficult to measure quantitatively:
    ✓ Service quality is subjective
    ✓ Customer data is hard to collect
    ✓ The customer is a factor of change

  8. Service quality measurement
    How difficult is it to measure?
    For our tests to be more meaningful, the quality of the context in use should be known.

  9. Quality in Use
    What is Quality in Use?
    Characteristics: Effectiveness, Efficiency, Satisfaction, Freedom from Risk,
    Context Coverage (pragmatic and hedonic aspects)
    ✓ A concept that already exists
    ✓ Definition and measurement standards (ISO/IEC 25010 / 25022)
    ✓ Not just usability, but the overall quality of use

  10. Quality in Use measurement
    When do we perform Quality in Use measurement? Develop phase, QA phase, or … ?
    [Diagram: the ISO software-quality lifecycle model. Process quality
    influences the software product's internal properties, which influence
    its external properties, which influence quality in use (the effect of
    the software product in its contexts of use); each later stage depends
    on the earlier one. The stages are assessed by process measures,
    internal measures, external measures, and quality-in-use measures.]

  11. Goal of Quality in Use
    Measure the user's point of view periodically
    Find what we can do
    Make the customer happy

  12. Process
    1) Quality in Use standard & research (search for a measurement method):
       I. Check the available logs
       II. Check measurement methods
       III. Select proper metrics
    2) Develop metrics & measure the pilot service:
       I. Choose a pilot service (LINE Pay, TH Bank)
       II. Task definition
       III. Weekly measurement
       IV. Indicator evaluation
    3) Analysis and improvement through results:
       I. Analyze results
       II. Analyze error codes

  13. Measurement method
    Logs are better than questionnaires and user tests.
    ✓ Questionnaires & user tests: attractive, but hard to measure periodically
    Log sources: API log, Google Analytics, click log, VOC

  14. Measurement method
    Some services cannot use GA: for LINE Pay it is ruled out by security
    requirements, and most services lack log data for user analysis.
    So we match APIs to screens and user actions:
    Auth APIs, Get-user-and-friends APIs, Balance-info APIs, Transfer APIs (attempt transfer)
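A rough sketch of this API-to-action matching is shown below. The endpoint paths, task-step names, and helper function are all hypothetical illustrations (not LINE Pay's real APIs); the point is that, without GA or click logs, each screen or user action is identified by the API it calls:

```python
# Hypothetical mapping from API endpoints to steps of a "transfer" task.
# Endpoint paths are illustrative only.
TRANSFER_TASK_STEPS = {
    "/auth/login": "task_start",       # Auth APIs
    "/users/friends": "step_2",        # Get-user-and-friends APIs
    "/balance": "step_3",              # Balance-info APIs
    "/transfer/attempt": "step_4",     # Transfer APIs (attempt transfer)
}

def to_task_events(api_log):
    """Translate raw (user_id, endpoint) log rows into task-step events,
    dropping rows whose endpoint is not part of the task mapping."""
    return [
        (user, TRANSFER_TASK_STEPS[endpoint])
        for user, endpoint in api_log
        if endpoint in TRANSFER_TASK_STEPS
    ]

events = to_task_events([
    ("u1", "/auth/login"),
    ("u1", "/users/friends"),
    ("u1", "/healthcheck"),   # not part of the task, dropped
    ("u1", "/balance"),
])
```

Once API calls are rewritten as task-step events like this, funnel and timing analysis can run on the API log alone.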

  15. Develop metrics on a log basis
    Choose and develop indicators (method and source):

    Effectiveness (source: API log, GA)
      Task completed ratio = A / B
      A = number of unique tasks completed
      B = total number of unique tasks attempted

    Efficiency (source: API log, GA)
      Task time ratio = T / A
      T = ideal time (steps in task × 2 s)
      A = task time

    Freedom from risk (source: API log)
      Safety of users affected by use of the system = 1 − A / B
      A = number of users hitting a blocked/system error on the task
      B = total number of users using the service on the task

    Satisfaction (source: VOC)
      Proportion of user complaints about a task = 1 − (A / B) / (C / D)
      A = number of complaints for a particular task
      B = total number of users performing that task
      C = number of complaints for the service
      D = total number of users using the service

  16. Effectiveness
    Check whether users achieve their goals well.
    [Funnel diagram: Task start → Step 2 → Step 3 → Step 4 → Task complete,
    annotated with each step's inflow % and drop-off % (100% start,
    31% complete) and elapsed times (5 s, 12 s, 16 s, 18.8 s).]

  17. Effectiveness
    Check whether users achieve their goals well.
    Task complete ratio = tasks completed / total tasks attempted
    ✓ Result = 0.312
    ✓ Tasks completed = 151,646
    ✓ Total tasks attempted = 488,944
    Analysis
    ✓ Check the drop-off at each step
    ✓ Check each step's response codes (excluding success codes)
    ✓ Check the complete-ratio trend by task
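The effectiveness calculation above can be sketched directly (the function name is mine; the figures are the slide's):

```python
def task_completed_ratio(tasks_completed, tasks_attempted):
    """Effectiveness: unique tasks completed / unique tasks attempted."""
    return tasks_completed / tasks_attempted

# Figures from the slide: 151,646 completed out of 488,944 attempted.
effectiveness = task_completed_ratio(151_646, 488_944)  # ≈ 0.31
```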

  18. Efficiency
    How long does it take for users to achieve their goals?
    [Diagram: Task start → Step 2 → Step 3 → Step 4 → Task complete,
    with elapsed times of 5 s, 12 s, 16 s, and 18.8 s;
    each step's time = stay time + transition time.]

  19. Efficiency
    How long does it take for users to achieve their goal?
    Task time ratio = ideal time / task time
    * Ideal time = internal basis: each step (user action or screen transition) × 2 s
    ✓ Result = 0.426
    ✓ Task time = 18.8 s
    ✓ Ideal time = 4 × 2 s = 8 s
    Analysis
    ✓ Check the steps that took the most time
    ✓ Check back-end API performance
    ✓ Check the task-time-ratio trend by task
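The efficiency metric is a one-liner; a minimal sketch using the slide's figures (function and parameter names are mine):

```python
def task_time_ratio(task_time_s, steps, ideal_step_s=2.0):
    """Efficiency: ideal time / actual task time, where the ideal time
    budgets 2 s per step (user action or screen transition)."""
    return (steps * ideal_step_s) / task_time_s

# Figures from the slide: 4 steps, 18.8 s actual task time.
efficiency = task_time_ratio(18.8, steps=4)  # 8 / 18.8 ≈ 0.426
```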

  20. Freedom from Risk
    Do users feel unsafe while using the service?
    Safety of users affected = 1 − (users facing an error on a task / total users on the task)
    ✓ Result = 0.992
    ✓ Users facing an error = 1,279
    ✓ Total users on the task = 166,056
    Analysis
    ✓ Analyze the pattern of defined errors
    ✓ Analyze the types of system errors
    ✓ Check the trend of errors by task
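A sketch of the freedom-from-risk metric with the slide's figures (function name is mine):

```python
def safety_ratio(users_facing_error, users_on_task):
    """Freedom from risk: 1 - A/B, the share of users NOT hit by a
    blocked/system error while performing the task."""
    return 1 - users_facing_error / users_on_task

# Figures from the slide: 1,279 of 166,056 users hit an error.
freedom_from_risk = safety_ratio(1_279, 166_056)  # ≈ 0.992
```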

  21. Satisfaction
    How satisfied and enjoyable do users feel while using the service?
    Proportion of user complaints about a task = 1 − (A / B) / (C / D)
    A = number of complaints for a particular task
    B = total number of users performing that task
    C = number of complaints for the service
    D = total number of users using the service
    ✓ Result = 0.861
    ✓ A = 51
    ✓ B = 166,056
    ✓ C = 2,788
    ✓ D = 1,252,910
    Analysis
    ✓ Analyze VOC items by task (mainly inquiries about use)
    ✓ Check the VOC trend by task
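The satisfaction metric compares the task's complaint rate to the whole service's complaint rate; a sketch with the slide's figures (function name is mine):

```python
def complaint_satisfaction(a, b, c, d):
    """Satisfaction: 1 - (A/B) / (C/D), i.e. the task's complaint rate
    relative to the service-wide complaint rate, inverted so that
    higher is better."""
    return 1 - (a / b) / (c / d)

# Figures from the slide: A=51, B=166,056, C=2,788, D=1,252,910.
satisfaction = complaint_satisfaction(51, 166_056, 2_788, 1_252_910)  # ≈ 0.86
```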

  22. Quality in Use score (experiment)
    QIU score = (Effectiveness + Efficiency + Satisfaction + Freedom from Risk) / number of items
    The measurement results are related to each other, but we haven't found the relation yet.
    Metrics             Result
    Effectiveness       31
    Efficiency
    Freedom from Risk
    Satisfaction        99
    QIU score           64.5
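The experimental score is a plain average, since no weighting relation between the metrics has been found yet. In the sketch below the effectiveness (31) and satisfaction (99) values are the slide's; the efficiency and freedom-from-risk inputs (40, 90) are placeholders I supply only to make the example run, as the slide leaves those cells blank:

```python
def qiu_score(*metric_scores):
    """Experimental QIU score: simple average of the per-metric results,
    each expressed on a 0-100 scale. Equal weights are used because the
    relation between the metrics is not yet known."""
    return sum(metric_scores) / len(metric_scores)

# 31 and 99 are from the slide; 40 and 90 are illustrative placeholders.
score = qiu_score(31, 40, 99, 90)  # (31 + 40 + 99 + 90) / 4 = 65.0
```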

  23. We focus on the Context in Use, by user.
