
User Experience Maturity Model (Prachi Sakhardande)

October 25, 2013


Quality of User Experience [UX] is often defined using subjective or relative measures. We did not have a model to compare the UX of two different products, or even to define it in absolute terms. At TCS Products, we aspire to build products with a great UX. But what is great? What is an acceptable quality of UX? We had to define a model that would quantitatively and objectively assess UX quality for a given product. Thus was born the User Experience Maturity Model [UXMM]. This model defines four incremental levels of UX: Usable, Useful, Desirable and Delightful, each mapped to a set of formative and summative techniques. For example, Usable means that the product is designed based on generally accepted best practices or heuristics, but with little knowledge of who the end users are. Assessment for this level is carried out through an expert review guided by a scorecard covering 10 UX KPAs. The other three levels are similarly defined; each level is associated with a scorecard and passing criteria. We have been conducting assessments using this model for the last year and a half. In this talk we present UXMM and how it has helped us define UX benchmarks, drive focused improvements and institutionalize UX in the product development domain.





  1. Prachi Sakhardande, UXINDIA13 Conference Presentation

  2. Prachi Sakhardande

  3. About Us
     • The Component Engineering Group is a vertical of TCS that focuses on the Non Linear Growth Strategy
     • We build software products across various verticals, ranging from Healthcare, Retail, e-commerce and Transport to Productivity Tools
     • As the User Experience CoE, we are responsible for guiding the user experience design of these products, as well as for quality control through a process called Final Inspection
     • The Final Inspection (FI) process necessitates that we objectively evaluate the user experience of a product and define a benchmark, failing which the product will not be released in the market
  4. Problem Statement
     • Is there a way to quantify and compare the user experience of software products, irrespective of type of application, domain, channel and other variables?
  5. Tab versus Phone: To compare or not to compare?

  6. Challenges
     1. Products covering a wide spectrum in terms of domain, audience and devices
     2. No common definition of user experience
     3. Limited assessment window
     4. Need for a clear definition of "minimum acceptable quality"
  7. User Experience for Software Products
     Ease of Use, Speed of Use, Learnability, Consistency, Content, Accessibility, Flexibility, Aesthetics, Recovery from Errors, Brand Recall, Persuasiveness, Differentiator, Greater Good
     These are the Key User Experience Parameters [KUXPs] on which the evaluation model is based
  8. User Experience Maturity Model
     • Usable: product displays a basic consideration of usability
     • Useful: product displays a conscious effort to cater to specific end user requirements
     • Desirable: product displays a conscious effort for a better user experience compared to peers
     • Delightful: product displays thought leadership in user experience
  9. Assessment Methodology
     • Level 1 - Usable: Define Benchmark, Expert Review, Certified Usable. For a product that has considered UX guidelines, but not necessarily end user requirements
     • Level 2 - Useful: Usability Testing, Certified Useful. For products where end user needs are met, but that is not necessarily the USP
     • Level 3 - Desirable: Competitor Analysis, Certified Desirable. For products where UX is an important *economic* differentiator
     • Level 4 - Delightful: Emotional Response, Certified Delightful. For products where UX is one of the primary reasons people buy the product
  10. Quantifying a KUXP across the 4 levels: Ease of Use
      Sub-parameter: Ease of Access
      • L1 - Usable: Does the system provide easy access to primary content and functionalities?
      • L2 - Useful: Are users able to quickly access the content or features required to accomplish tasks?
      • L3 - Desirable: Is the system better designed than peers to provide easy access to primary content and functionalities?
      • L4 - Delightful: Did users feel the system was simple and intuitive?
  11. Nuts and Bolts - Level 1 Assessment: Ease of Use
      Ease of Access
      • Do primary pages define the goals and functionalities users can accomplish with the application?
      • Do primary pages provide clear entry points for users to accomplish these goals or use these functions?
      Adaptation to user needs
      • Can the user customize the interface as per her needs, if required?
      • Is personalized content available (if required)?
      • Does the system have any implicit personalization, based on the user's current context?
      • Does the system have an ability to recognize a context and only show relevant features or content?
      Visibility of System Status
      • If a task (e.g. processing) takes reasonable time, is an appropriate message displayed on the UI?
      • Does the system provide a confirmation of task completion?
      • Does the system provide a clear indication if a task cannot be completed, whether due to system error or manual error?
      + Context of current location + Ease of Data Input + Navigation
  12. Parameters for evaluation
      • Frequency of occurrence
      • Effectiveness Rating
      • KUXP Score
      • Category Weightage
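The deck lists the scoring inputs but not the arithmetic that combines them. A minimal sketch of one plausible roll-up, assuming per-issue penalties are the product of frequency and effectiveness, subtracted from a maximum KUXP score, and the overall score is a category-weighted average (function names, the 0-5 scale and all weights here are invented for illustration, not taken from the talk):

```python
# Hypothetical scoring sketch; the actual TCS formula is not shown
# in the deck, so the scale and weights below are assumptions.

def kuxp_score(findings, max_score=5.0):
    """Score one KUXP from (frequency, effectiveness) pairs, each 0.0-1.0.

    frequency:     how often users encounter the issue
    effectiveness: how severely it degrades the experience
    """
    penalty = sum(freq * eff for freq, eff in findings)
    return max(0.0, max_score - penalty)

def overall_score(kuxp_scores, category_weights):
    """Roll KUXP scores up into one number using category weightages."""
    total_weight = sum(category_weights.values())
    weighted = sum(kuxp_scores[k] * w for k, w in category_weights.items())
    return weighted / total_weight

scores = {
    "Ease of Use": kuxp_score([(0.8, 0.5), (0.2, 0.9)]),  # 5 - 0.58 = 4.42
    "Learnability": kuxp_score([(0.4, 0.3)]),             # 5 - 0.12 = 4.88
}
weights = {"Ease of Use": 2.0, "Learnability": 1.0}
print(round(overall_score(scores, weights), 2))  # → 4.57
```

The two-dimensional penalty (frequency x effectiveness) mirrors Learning 1 on the next slide: a rare but severe issue and a frequent but mild one can carry comparable weight.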
  13. Learning
      1. User experience quality is a combination of two dimensions: frequency and effectiveness
      2. Severity definitions need to address both UX and UI
      3. A single pass/fail score is insufficient to capture KUXP-specific issues; hence the application has to meet the benchmark at the KUXP level as well
      4. Context needs to be considered to define the "bare minimum" score, as well as during the actual assessment
      5. Define the scope of application for a single assessment; modules that have diverse functionalities should be treated as different applications
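Learning 3 describes a two-tier gate: the product must clear the overall benchmark and a floor for every individual KUXP. A hedged illustration of that check (the cut-off values are invented here; the deck does not state them):

```python
# Hypothetical two-tier pass/fail gate per Learning 3; cut-offs are
# assumptions, not the benchmarks used in the actual TCS assessments.

def passes_assessment(kuxp_scores, overall, overall_cutoff=3.5, kuxp_cutoff=2.5):
    """Pass only if both the overall benchmark AND every per-KUXP floor are met."""
    weak_kuxps = [k for k, s in kuxp_scores.items() if s < kuxp_cutoff]
    return overall >= overall_cutoff and not weak_kuxps

# A strong overall score cannot hide one badly failing KUXP:
result = passes_assessment(
    {"Ease of Use": 4.0, "Recovery from Errors": 2.0}, overall=3.8
)
print(result)  # → False
```

This is why a single aggregate number was found insufficient: averaging lets one severely weak parameter be masked by strengths elsewhere.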
  14. Benefits of using a structured quantification model
      • Objective quality assessment
        - Enables products to be specific about their user experience objectives
        - Defines specifics of the current state and what needs to be done to take the user experience to the next level
        - Allows definition of a specific numeric score as a cut-off benchmark
      • KUXP-based structure
        - Enables UX objectives to be defined in the context of the application
        - Allows analysis of overall UX strengths and weaknesses across the product landscape
        - Lends scalability, so the same model can be extended to user interfaces for mobile devices and kiosks, or used across multiple domains
      • Maturity levels are informed by specific UCD methodologies, which in turn helps the institutionalization of user experience
        - Enables a left-shift in the UCD process, with more and more product teams thinking about UX early in the product lifecycle
  15. What Next
      • User Experience Final Inspection is based on the Level 1 assessment; numerous FIs have been conducted over the past 18 months
      • Level 2 and Level 3 assessment pilots are underway
      • While numerous emotional response elicitation methods are recommended for the Level 4 assessment, research is in progress to identify the optimum assessment strategy
      • We are also in the process of aligning our maturity assessment methodology with the product lifecycle, as products mature through their lifetime up until retirement
  16. Questions?
      Friend to Groucho Marx: "Life is difficult!" Marx to Friend: "Compared to what?"
      - Monitoring and Evaluation NEWS