UXINDIA17 - Transforming user needs into design priorities through research and collaboration


uxindia

November 20, 2017

Transcript

  1. Transforming user needs into design priorities through research and collaboration

    Image: https://www.flickr.com/photos/richardsummers/213085702
  2. Steve Fadden, Ph.D. 2 Researcher & Director, Analytics UX Research,

    Salesforce Lecturer, UC Berkeley School of Information @sfadden on Twitter https://www.linkedin.com/in/stevefadden/ Slides...
  3. Who’s here and why? ▪ Design ▪ Research ▪ Engineering

    ▪ Marketing ▪ Management ▪ Student ▪ Other? 5
  4. “ 7 “Design is a solution to a problem. Art

    is a question to a problem.” John Maeda Ref: @johnmaeda (Twitter), June 6, 2009
  5. Methods to understand problems 1 Observing 2 Listening 3 Interviewing

    9 Phase of Development Concept Design Production Problems Values Progress
  6. Concerns ▪ Simplification bias ▪ Translation bias ▪ Hawthorne effect

    ▪ Logistics ▪ Operational definitions ▪ Decision rules 13 Ref: Baxter, K., Courage, C., & Caine, K. 2015. Understanding your users: A practical guide to user research methods, 2nd edition. Morgan Kaufmann.
  7. Observation Framework Activities: Goal-directed actions Environment: Space where activities take

    place Interactions: Between people, things Objects: Purpose and use of things in environment Users: Behaviors, attitudes, preferences, needs, values, roles 14 Ref: https://help.ethnohub.com/guide/aeiou-framework
  8. Reflect ▪ Did you do anything differently? ▪ Thoughts about

    your process? ▪ Observations? ▪ Interpretations? ▪ Problems and solution opportunities? 17
  9. Best Practices ▪ Write up notes ASAP ▪ Debrief with

    colleagues ▪ Arrange into a chronology ▪ Clarify and elaborate - Reflections - Questions - Hypotheses - Themes - Direct vs. paraphrased quotes 18
  10. Reflect ▪ Process - Challenges - Highlights ▪ Observation -

    Reflections - Questions - Hypotheses - Themes - Quotes 20
  11. Goals ▪ Develop empathy ▪ Understand intent - the why

    - Reasoning - Reactions - Values - Guiding principles 22 Ref: Young, I. 2015. Practical empathy for collaboration and creativity in your work. Rosenfeld Media.
  12. Approach ▪ Start broad ▪ Don’t take notes ▪ Let

    speaker control topics - don’t switch abruptly ▪ Avoid “I” ▪ Use fewest words possible ▪ Don’t introduce new words ▪ Be respectful; don’t judge 23 Ref: Young, I. 2015. Practical empathy for collaboration and creativity in your work. Rosenfeld Media.
  13. 24 Practice listening ▪ Considerations - Start broad - Don’t

    take notes - Let speaker control topics - Avoid “I” - Fewest words possible - Don’t introduce words - Be respectful; don’t judge ▪ Goal: Understand why - Reasoning - Reactions - Guiding principles - Values ▪ Prompts - Reiterate main topic - Why’s that? Because? - [Repeat phrase]
  14. Reflect ▪ Process - Challenges - Highlights ▪ The why?

    - Reasoning - Reactions - Guiding principles - Values 25
  15. Goals ▪ Understand context ▪ Learn about - Goals, intentions

    - Needs, frustrations - Techniques - Experience ▪ Identify problems, opportunities 27
  16. 28 Interview Types

    ▪ Structured: consistent; faster to capture and analyze; more quantitative; more controlled; less depth and opportunity to explore or discover ▪ Semi-structured: less consistency; more challenging to capture and analyze; quantitative and qualitative; interviewer mostly in control; some opportunity to follow up on answers ▪ Unstructured: no consistency; most challenging to capture and analyze; more qualitative; least control, like a conversation; flexibility to explore and follow up; very rich data Ref: Baxter, K., Courage, C., & Caine, K. 2015. Understanding your users: A practical guide to user research methods, 2nd edition. Morgan Kaufmann.
  17. Typical Structure 29

    Intro (set expectations) → Warm-up (build trust) → Focus (draw out the story) → Deep focus (find useful details) → Retrospective (debrief) → Wrap-up (closure) Image: http://pixabay.com/es/reloj-de-arena-reloj-temporizador-152090
  18. Build Rapport ▪ Set expectations at the beginning ▪ Use

    ordinary language, fluid ordering, obvious transitions - “Ok, let’s shift our focus to searching...” ▪ Be aware of body language; signal that you’re paying attention - “Mmmhmm,” “interesting” ▪ Respond dynamically and follow up - “So what I hear you saying is you ask for help only after searching. Is that correct?” 30
  19. Open and Closed Questions ▪ Open - “Explain how you

    search for information” - “Describe your thoughts about browsers” - “Talk through the steps you take to search” ▪ Closed - “How many searches do you do in a day?” - “Do you prefer Safari?” - “When do you do most of your searching?” 31 Ref: Portigal, S. 2012. Seventeen types of interviewing questions, at: https://www.portigal.com/seventeen-types-of-interviewing-questions
  20. 32 ▪ “Is finding the best price through searching on

    a familiar browser something that’s important to you or your family?” ▪ “With your level of experience, why do you continue to use an inferior tool for searching?” ▪ “Most technology experts recommend Chrome over Internet Explorer. Which browser do you prefer?” ▪ “Is Safari a great browser?” ▪ “Would you use voice-based searching in the future?” Improve these questions
  21. Critical Incident Questions ▪ Gain insight - Evidence of problems

    - Solution opportunities ▪ Based on - Recent events - Specific details - Feelings and perceptions - Future behaviors and responses 33 Ref: http://www.usabilitynet.org/tools/criticalincidents.htm
  22. Critical Incident Process ▪ Identify time since last experience ▪

    Gather details - Description - Actions taken - Feelings - Outcome - Future actions/responses desired 34 Ref: http://www.usabilitynet.org/tools/criticalincidents.htm
  23. “ 35 “Consider the last time you had to search

    online. How long ago was this? What did you search for? Describe the steps you took, and highlight any surprises or problems (if any) that happened. What would you do differently?”
  24. 36 Practice critical incident questioning

    Structure: Set expectations → Build trust → Draw out the story → Find useful details → Debrief → Closure ▪ Develop question - Time since last experience - Description of experience - Actions taken - Feelings - Outcome - Future actions, desired responses
  25. Reflect ▪ Process - Challenges - Highlights - Changes you’d

    make? ▪ Examples - Details - Problems uncovered - Opportunities 37
  26. Methods to understand values 1 Photo elicitation 2 Reaction cards

    3 Kano survey 4 Delphi methods 5 Valuation methods 39 Phase of Development Concept Design Production Problems Values Progress
  27. Photo Elicitation ▪ Goal: Identify values, associations ▪ Method -

    Overall prompt - Initial response to materials - Elaborate with stories ▪ Provides context around chosen values 41 Ref: Goodman, E., Kuniavsky, M., & Moed, A. 2012. Observing the User Experience: A Practitioner's Guide to User Research, 2nd Edition. Morgan Kaufmann
  28. “ 42 “We want to understand how you feel about

    travel planning. Consider each image/description provided, and write the first words and feelings that come to mind.”
  29. Example People at a beach, walking in the water and

    sitting under umbrellas on chairs in the sand. 43 Image: https://commons.wikimedia.org/wiki/File:Security_Check_At_Terminal_1B.JPG
  30. Example Bengaluru airport entrance with automobiles queued on the road.

    44 Image: https://commons.wikimedia.org/wiki/File:Security_Check_At_Terminal_1B.JPG
  31. Example Airport security area with people preparing their belongings for

    the X-ray machine. Water bottles are on the table. 45 Image: https://commons.wikimedia.org/wiki/File:Security_Check_At_Terminal_1B.JPG
  32. Discussion 1. Words used for each image? 2. Why selected?

    3. Implications for travel planning app? 46
  33. Reflect ▪ Process - Challenges - Highlights ▪ Feedback received

    - Values - Stories - Benefits - Limitations 47
  34. Reaction Cards ▪ Goal: Summarize experience ▪ Method - Understand

    scenario - Perform task - Review terms - Identify all that apply - Pick top 3 - Stack rank ▪ Also useful for desired values 49 Ref: Benedek, J. & Miner, T. 2002. Measuring desirability: New methods for evaluating desirability in a usability lab setting. UPA Conference Proceedings, at: https://www.microsoft.com/usability/UEPostings/DesirabilityToolkit.doc
  35. Terms 50

    ▪ Positive: Accessible, Advanced, Appealing, Approachable, Attractive, Business-like, Calm, Clean, Clear, Collaborative, Comfortable, Compatible, Compelling, Comprehensive, Confident, Connected, Consistent, Controllable, Convenient, Creative, Customizable, Cutting edge, Desirable, Easy to use, Effective, Efficient, Effortless, Empowering, Energetic, Engaging, Entertaining, Enthusiastic, Essential, Exceptional, Exciting, Expected, Familiar, Fast, Flexible, Fresh, Friendly, Fun, Helpful, High quality, Impressive, Innovative, Inspiring, Integrated, Intuitive, Inviting, Low Maintenance, Meaningful, Motivating, Novel, Optimistic, Organized, Personal, Powerful, Predictable, Professional, Relevant, Reliable, Responsive, Satisfying, Secure, Sophisticated, Stable, Stimulating, Straight Forward, Time-Saving, Trustworthy, Unconventional, Understandable, Usable, Useful, Valuable ▪ Negative: Annoying, Boring, Busy, Complex, Confusing, Dated, Difficult, Disconnected, Disruptive, Distracting, Dull, Fragile, Frustrating, Gets in the way, Hard to Use, Impersonal, Incomprehensible, Inconsistent, Ineffective, Intimidating, Irrelevant, Not Secure, Not Valuable, Old, Ordinary, Overbearing, Overwhelming, Patronizing, Poor quality, Rigid, Simplistic, Slow, Sterile, Stressful, Time-consuming, Too Technical, Unapproachable, Unattractive, Uncontrollable, Undesirable, Unpredictable, Unrefined Ref: https://www.nngroup.com/articles/desirability-reaction-words/
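A lightweight way to aggregate reaction-card results across participants is to count how often each term was picked and weight the top-3 rankings. A minimal sketch; the function name and the 3-2-1 weighting scheme are illustrative assumptions, not part of the method as presented:

```python
from collections import Counter

def tally_reaction_cards(selections, top_threes):
    """Tally reaction-card results across participants.

    selections: list of lists -- every card each participant picked
    top_threes: list of lists -- each participant's ranked top 3 (rank 1 first)
    Returns (raw selection counts, weighted top-3 scores).
    """
    picked = Counter(card for cards in selections for card in cards)
    # Weight ranked picks: rank 1 -> 3 points, rank 2 -> 2, rank 3 -> 1
    weighted = Counter()
    for ranked in top_threes:
        for rank, card in enumerate(ranked, start=1):
            weighted[card] += 4 - rank
    return picked, weighted

picked, weighted = tally_reaction_cards(
    selections=[["Useful", "Fast", "Busy"], ["Useful", "Fun"]],
    top_threes=[["Useful", "Fast", "Busy"], ["Fun", "Useful"]],
)
print(picked.most_common(2))  # [('Useful', 2), ('Fast', 1)]
print(weighted["Useful"])     # 3 + 2 = 5
```

The same tally can be run twice, once for the current experience and once for the ideal, to compare value profiles.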
  36. 51 Practice reaction cards ▪ Perform task ▪ Review cards

    - Pick all cards that apply - Select top 3 - Rank order - Discuss ▪ Repeat for ideal experience Do task Current experience Ideal experience
  37. Reflect ▪ Process - Challenges - Highlights ▪ Values -

    Current; Ideal ▪ Other applications - Between groups - Competitive evaluation - Change over time 52
  38. Kano Model (partial) 54

    Chart axes: feature presence from Completely Absent to Completely Present; response from Dissatisfaction to Satisfaction. Curves: Must-have (required), One-dimensional (more is better), Attractive (surprising, unexpected). Ref: Moorman, J. 2012. Leveraging the Kano Model for Optimal Results, at: https://uxmag.com/articles/leveraging-the-kano-model-for-optimal-results
  39. Kano ▪ Goal: Identify feature type - Understand scenario -

    Assess satisfaction with feature, satisfaction without feature, importance 55 Scenario: “Imagine using an app to plan a trip for your family. You need to identify flight options that are most cost-effective, balancing cost with risk and duration.” Question 1: “How would you feel if the app allowed you to filter flight results by number of stops?” Question 2: “How would you feel if the app did not allow you to filter flight results by number of stops?” Question 3: “How important is this function to you?”
  40. Analyzing results 56

    Cross the answer to the "allowed" question with the answer to the "did not allow" question: Satisfied when allowed + Neutral when not allowed → Attractive; Satisfied when allowed + Dissatisfied when not allowed → One-dimensional; Neutral when allowed + Dissatisfied when not allowed → Must-have. Ref: Moorman, J. 2012. Leveraging the Kano Model for Optimal Results, at: https://uxmag.com/articles/leveraging-the-kano-model-for-optimal-results
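The answer-pair grid can be expressed as a small lookup. A sketch covering the three categories shown here; the function name and the fallback "indifferent" label for unlisted answer pairs are illustrative assumptions (the full Kano table has more categories):

```python
def kano_category(allowed, did_not_allow):
    """Classify a feature from the 'allowed' and 'did not allow' answers.

    Each argument is "satisfied", "neutral", or "dissatisfied".
    """
    if (allowed, did_not_allow) == ("satisfied", "dissatisfied"):
        return "one-dimensional"  # more is better
    if allowed == "satisfied":
        return "attractive"       # delights when present, tolerated when absent
    if (allowed, did_not_allow) == ("neutral", "dissatisfied"):
        return "must-have"        # expected; its absence hurts
    return "indifferent"          # assumption: other pairs need the full table

print(kano_category("satisfied", "neutral"))       # attractive
print(kano_category("satisfied", "dissatisfied"))  # one-dimensional
print(kano_category("neutral", "dissatisfied"))    # must-have
```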
  41. 57 Practice Kano questions

    ▪ Describe scenario ▪ Create feature list (~5) ▪ Create 3 questions - Allowed... - Did not allow... - Importance ▪ Plot and interpret results on a grid: "Allowed you to…" answers (Dissatisfied to Satisfied) on one axis, "Did not allow…" answers (Dissatisfied to Satisfied) on the other
  42. Reflect ▪ Process - Challenges - Highlights ▪ Results -

    Attractive - One-dimensional - Must-have 58
  43. Delphi ▪ Goal: Understand uncertainty - Select and assemble experts

    - Conduct group activity (usu. simulation or scenario) to gather initial estimates - Anonymously collect and lay out responses to identify IQR (middle 50%) - Discuss responses above and below IQR - Repeat (perhaps multiple times) - Arrive at consensus estimate or weighted estimate 60 Ref: Helmer-Hirschberg, O. 1967. Analysis of the future: The Delphi Method The RAND Corporation, at: http://www.rand.org/pubs/papers/P3558.html
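The collect-estimates, identify-IQR, discuss-outliers step can be sketched with the standard library; the function name and the sample estimates are illustrative assumptions:

```python
import statistics

def delphi_round(estimates):
    """Summarize one Delphi round: quartiles and outliers to discuss."""
    q1, median, q3 = statistics.quantiles(estimates, n=4)  # exclusive quartiles
    outliers = [e for e in estimates if e < q1 or e > q3]  # outside middle ~50%
    return {"q1": q1, "median": median, "q3": q3, "outliers": outliers}

# Hypothetical first-round estimates (% of searches leading to incorrect info)
round1 = delphi_round([5, 10, 10, 15, 20, 40, 60])
print(round1)  # panelists above/below the IQR explain their reasoning next round
```

Re-running the same summary after each discussion round shows whether the group is converging toward a consensus estimate.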
  44. “ 61 “Consider all the times you have searched for

    something online. How often did your search lead to incorrect information?” (First iteration)
  45. “ 62 “Consider all the times you have searched for

    something online. How often did your search lead to incorrect information?” (Second iteration)
  46. “ 63 “Consider all the times you have searched for

    something online. How often did your search lead to incorrect information?” (Third iteration)
  47. Discussion ▪ Process - Observations - Ways to improve ▪

    Results - Thought process? - Convergence? 64
  48. 65 Vacation planning You are planning a vacation flight, and

    need to make a connection. How likely is it that you will miss your connection? Practice Delphi ▪ Determine scenario ▪ Come up with estimates ▪ Anonymously collect and lay out estimates ▪ Identify IQR ▪ Discuss outliers ▪ Repeat
  49. Reflect ▪ Process - Challenges - Highlights ▪ Results -

    Scenario - Range: Likelihood of missing connection - Consensus? 66
  50. Feature Cost Tradeoff ▪ Goal: Identify relative importance of features

    - Create feature list - Determine “cost” of each feature - Based on: factors that matter most to your team (e.g. effort, risk, complexity) - Provide user a limited “budget” - Record rationale as features are selected 68 Ref: http://www.uxforthemasses.com/buy-the-feature/
  51. Identify Features of Interest ▪ Product (enhanced search app) -

    Feature 1 (naming) - Feature 2 (sharing) - Feature 3 (color coding) - Feature 4 (search with friends) - Feature 5 (alerting) - Feature 6 (using images) - etc. 69
  52. Determine Effort and Risk Team Can Handle* 70 *Represents amount

    of work product team can do (based on time, story pointing, risk assessment, etc.) In this case, 1 box = 1 low-effort, low-risk feature
  53. Determine Cost to Implement Each Feature 71

    Feature | Effort (1=Low, 2=Med, 3=High) | Risk (1=Low, 2=Med, 3=High) | Cost (Effort * Risk): Feature 1 (naming) | 1 | 1 | 1; Feature 2 (sharing) | 1 | 3 | 3; Feature 3 (color coding) | 2 | 1 | 2; Feature 4 (search with friends) | 2 | 3 | 6; Feature 5 (alerting) | 3 | 1 | 3; Feature 6 (using images) | 3 | 3 | 9
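The cost column is simply effort multiplied by risk. A sketch that computes and sorts the same six features (the function name is an illustrative assumption):

```python
def feature_costs(features):
    """Compute cost = effort x risk per feature, cheapest first."""
    costed = [(name, effort * risk) for name, effort, risk in features]
    return sorted(costed, key=lambda pair: pair[1])

features = [
    ("Feature 1 (naming)", 1, 1),
    ("Feature 2 (sharing)", 1, 3),
    ("Feature 3 (color coding)", 2, 1),
    ("Feature 4 (search with friends)", 2, 3),
    ("Feature 5 (alerting)", 3, 1),
    ("Feature 6 (using images)", 3, 3),
]
for name, cost in feature_costs(features):
    print(name, cost)
```

A fixed "budget" of boxes then constrains which combinations a participant can buy, which is what surfaces the rationale behind their choices.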
  54. Assign Features to Matching Shapes 72 Feature Product (Effort *

    Risk) Feature 1 (naming) 1 Feature 2 (sharing) 3 Feature 3 (color coding) 2 Feature 4 (search with friends) 6 Feature 5 (alerting) 3 Feature 6 (using images) 9
  55. Assign Features to Matching Shapes 73 Feature Product (Effort *

    Risk) Feature 1 (naming) 1 Feature 2 (sharing) 3 Feature 3 (color coding) 2 Feature 4 (search with friends) 6 Feature 5 (alerting) 3 Feature 6 (using images) 9 Naming Sharing Color coding Search with friends
  56. User Places Features in Grid, Explaining Why 74

    Sharing, Alerting, Search with friends, Using images, Naming, Color coding
  57. 75 Practice Feature-Cost Tradeoff ▪ Consider scenario ▪ Identify 10

    travel app features ▪ Estimate effort to implement (1=low, 2=med, 3=high) ▪ Write features in corresponding shapes ▪ User selects features for grid, explaining why High High High Med Med Med Med Low Low Low Low Low
  58. Reflect ▪ Process - Challenges - Highlights ▪ Results -

    Insights - Patterns - Connections (features that must be together) 76
  59. Methods to assess progress 1 Commenting 2 Impression testing 3

    Sentiment 4 Value audit 78 Phase of Development Concept Design Production Problems Values Progress
  60. Commenting ▪ Goal: Identify clarity and concerns as users encounter

    concept ▪ Gain insights - Initial confusions - Overall acceptability 80 Ref: Fadden, S., & Bedekar, N. 2015. How to effectively implement different online research methods, Presented at UXPA: https://www.slideshare.net/SteveFadden1/how-to-effective-implement-different-online-research-methods-uxpa-2015-fadden-bedekar
  61. Process ▪ Present overall scenario description - Identify concerns and

    questions - Assess comprehension of goal ▪ Illustrate steps, flows, interactions ▪ Solicit feedback about: - Concerns, confusions - Benefits, positives - Open questions ▪ Capture final comments at end 81 Ref: Fadden, S., & Bedekar, N. 2015. How to effectively implement different online research methods, Presented at UXPA: https://www.slideshare.net/SteveFadden1/how-to-effective-implement-different-online-research-methods-uxpa-2015-fadden-bedekar
  62. “ 82 “Consider the data export concept presented on the

    next 4 slides. As you read through the concept, comment on anything you find to be confusing, problematic, useful, or appealing about the concept.” Ref: Fadden, S., & Bedekar, N. 2015. How to effectively implement different online research methods, Presented at UXPA: https://www.slideshare.net/SteveFadden1/how-to-effective-implement-different-online-research-methods-uxpa-2015-fadden-bedekar
  63. Slide 1 83 “You first see a view of your

    spreadsheet, with instructions describing how to set up the export.” 1. “Makes sense so far. Not sure how applicable this is to my work though.” Ref: Fadden, S., & Bedekar, N. 2015. How to effectively implement different online research methods, Presented at UXPA: https://www.slideshare.net/SteveFadden1/how-to-effective-implement-different-online-research-methods-uxpa-2015-fadden-bedekar
  64. “For each field you want to export, you first click

    on the name of the field, and then select the range of cells you wish to export.” Slide 2 84 2. “Doing this would require a lot of clicks, even for a small number of columns.” Ref: Fadden, S., & Bedekar, N. 2015. How to effectively implement different online research methods, Presented at UXPA: https://www.slideshare.net/SteveFadden1/how-to-effective-implement-different-online-research-methods-uxpa-2015-fadden-bedekar
  65. “After selecting your fields and ranges, you will see a

    confirmation screen with the field names you have chosen.” Slide 3 85 3. “You should embed best practices for field naming here. Otherwise, the result could be messy.” Ref: Fadden, S., & Bedekar, N. 2015. How to effectively implement different online research methods, Presented at UXPA: https://www.slideshare.net/SteveFadden1/how-to-effective-implement-different-online-research-methods-uxpa-2015-fadden-bedekar
  66. “After you initiate the export process, a progress screen will

    dynamically update with the percent of data exported.” Slide 4 86 4. 100% “Will we be able to save the mappings? That could save time in the future.” Ref: Fadden, S., & Bedekar, N. 2015. How to effectively implement different online research methods, Presented at UXPA: https://www.slideshare.net/SteveFadden1/how-to-effective-implement-different-online-research-methods-uxpa-2015-fadden-bedekar
  67. “Any final concerns, comments, or benefits to discuss?” Summary 87

    1. 2. 3. 4. 100% “It’s great that you don’t have to jump around different parts of the system to do this. Very valuable to be able to complete this from one place.” Ref: Fadden, S., & Bedekar, N. 2015. How to effectively implement different online research methods, Presented at UXPA: https://www.slideshare.net/SteveFadden1/how-to-effective-implement-different-online-research-methods-uxpa-2015-fadden-bedekar
  68. Impression Testing ▪ Goal: Determine immediate response users have to

    your product ▪ Useful for identifying reactions that - Form in as few as 50 ms - Stay the same after additional viewing time - Are not influenced by actual usability issues 89 Ref: Lindgaard, G., Fernandes, G., Dudek, C., & Brown, J. (2006) Attention web designers: You have 50 milliseconds to make a good first impression!, Behaviour & Information Technology, 25:2, 115-126
  69. Process ▪ Present interface for up to 5s ▪ Assess

    factors of interest - Appeal - Ease of use - Efficiency 90
  70. “ 91 “You will be shown an interface for 5

    seconds. After viewing the interface, indicate your response to the following questions.”
  71. Prompt 92 Mark how you feel about the interface you

    just saw. The interface is: Very Attractive - - - - - - - Very Unattractive Very Easy - - - - - - - Very Hard Very Efficient - - - - - - - Very Inefficient
  72. Impressions Correlate with System Usability Scale 93 1. I think

    that I would like to use this system frequently. 2. I found the system unnecessarily complex. 3. I thought the system was easy to use. 4. I think that I would need the support of a technical person to be able to use this system. 5. I found the various functions in this system were well integrated. 6. I thought there was too much inconsistency in this system. 7. I would imagine that most people would learn to use this system very quickly. 8. I found the system very cumbersome to use. 9. I felt very confident using the system. 10. I needed to learn a lot of things before I could get going with this system. Ref: Sauro, J. 2010. Can you use the SUS for websites? http://www.measuringu.com/blog/sus-websites.php
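The ten SUS items listed above are scored with the standard SUS formula: odd-numbered (positive) items contribute rating − 1, even-numbered (negative) items contribute 5 − rating, and the sum is scaled by 2.5 to a 0-100 range. A sketch, with the function name as an illustrative assumption:

```python
def sus_score(responses):
    """Score one SUS questionnaire (10 items, each rated 1-5)."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, rating in enumerate(responses, start=1):
        # Odd items are positively worded; even items are negatively worded
        total += (rating - 1) if i % 2 == 1 else (5 - rating)
    return total * 2.5  # scale 0-40 raw sum to 0-100

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0 (best possible)
print(sus_score([3, 3, 3, 3, 3, 3, 3, 3, 3, 3]))  # 50.0 (all neutral)
```

Averaging these per-respondent scores gives the usability measure that the impression ratings are correlated against.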
  73. Semantic Differentials ▪ Goal: Understand how people feel about your

    concept, interface, experience ▪ Alignment with intended goals, values 95
  74. Semantic Differentials ▪ Determinations - Granularity: Overall concept, interface, task

    performance - Criteria: utility, efficiency, satisfaction ▪ Administer as survey or interview prompt ▪ Consider participant characteristics, interface differences, tasks 96 Ref: https://www.surveygizmo.com/survey-blog/how-to-measure-attitudes-with-semantic-differential-questions/
  75. Ref: Johnson, F. 2012. Using semantic differentials for an evaluative

    view of the search engine as an interactive system, EuroHCIR2012 Prompt 97 “Mark how you feel about this experience.” Powerful - - - - - Simplistic Attractive - - - - - Unattractive Valuable - - - - - Not valuable Relevant - - - - - Irrelevant Satisfying - - - - - Frustrating Fast - - - - - Slow Predictable - - - - - Unpredictable Intuitive - - - - - Rigid Easy - - - - - Difficult
  76. Value Audit ▪ Goal: Review product against multiple value attributes

    - Emotion - Aesthetics - Identity - Impact - Ergonomics - Technology - Quality 99 Ref: Cagan, J. & Vogel, C.M. 2002. Creating breakthrough products: Innovation from product planning to program approval. New Jersey: Prentice Hall.
  77. Assess Values against Needs, Wants, Desires Emotion Sense of adventure

    Feel of independence Sense of security Sensuality and luxury Supports confidence Promotes power Aesthetics Visual Tactile Auditory Olfactory Gustatory Product identity Personality Point in time Sense of place 100 Impact Social Environmental Ergonomics Ease of use Safety Comfort Core technology Enabling Reliable Quality Craftsmanship Durability Ref: Cagan, J. & Vogel, C.M. 2002. Creating breakthrough products: Innovation from product planning to program approval. New Jersey: Prentice Hall.
  78. Value Opportunity Analysis 101 Category Values Low Med High Emotion

    Adventure Independence Security Sensuality Confidence Power Ergonomics Comfort Safety Ease of use Aesthetics Visual Auditory Tactile Olfactory Taste Identity Point in time Sense of place Personality Impact Social Environmental Core Technology Reliable Enabling Quality Craftsmanship Durability Ref: Cagan, J. & Vogel, C.M. 2002. Creating breakthrough products: Innovation from product planning to program approval. New Jersey: Prentice Hall.
  79. Group Workshops ▪ Goal: Explore concepts, topics in context -

    Value and goal exploration - Rapid idea generation - Open up new ideas, areas of inquiry, opportunity - Interactive exploration of problems and potential solutions 103
  80. Process ▪ Identify focus areas ▪ Determine roles ▪ Create

    activities ▪ Develop discussion guide ▪ Pilot 104 Ref: Baxter, K., Courage, C., & Caine, K. 2015. Understanding your users: A practical guide to user research methods, 2nd edition. Morgan Kaufmann.
  81. Focus Areas ▪ Values ▪ Goals ▪ Problems ▪ Priorities

    ▪ Context 105 Ref: Baxter, K., Courage, C., & Caine, K. 2015. Understanding your users: A practical guide to user research methods, 2nd edition. Morgan Kaufmann.
  82. Roles ▪ Moderator ▪ Facilitator ▪ Note-taker(s) ▪ Setup /

    Technical Support ▪ Greeter / Escort 106 Ref: Baxter, K., Courage, C., & Caine, K. 2015. Understanding your users: A practical guide to user research methods, 2nd edition. Morgan Kaufmann.
  83. ▪ Template (choose ~3 in between) - Introductions - Problems:

    Critical incident exercise, sketching - Issues and risks: Delphi activity - Current / ideal values: Reaction cards - Addressing issues: Design ideation - Feedback to new approach: Commenting - Feature prioritization: Feature-Cost Tradeoff - Final takeaway: “One message” Activities 107
  84. Pilot ▪ Walk through each activity with unfamiliar colleagues ▪

    Same time and location, if possible ▪ Test out protocols ▪ Realistic time estimates ▪ Evaluate effectiveness of products ▪ Revise 108
  85. Discussion guide 109

    - Introduction | 5 minutes | Introduce first name, job title, hobby | Goal: Familiarize group, build comfort level - Warm-up | 5 minutes | Consider the last time you needed to "perform task/achieve goal" → Jot down on sticky note | Goal: Activate topics of interest; gauge level of knowledge - Topic / Activity: Problem focus [Role, process, actors, resources] | 15-20 minutes | Identify all the people and resources you work with when performing task → Sketch on paper | Goal: Understand contexts and problem(s) - Topic / Activity: Reaction card | 15-20 minutes | Identify values associated with current / ideal solution → Reaction cards (x2) | Goal: Determine how people feel about current / ideal solution Ref: Baxter, K., Courage, C., & Caine, K. 2015. Understanding your users: A practical guide to user research methods, 2nd edition. Morgan Kaufmann.
  86. Discussion guide 110

    - Topic / Activity: Solution ideation | 15-20 minutes | Illustrate problems and potential solutions → Crazy 8's | Goal: Identify opportunities and issues - Topic / Activity: Review solutions | 15-20 minutes | Review mockups and flows → Feedback and value ratings | Goal: Get feedback on approaches team is exploring - Wrap-up | 10 minutes | What's most important to you → 1 thing you'd say to CEO | Goal: Bring closure to discussion - Summary, what's missing? | 10 minutes | Review key themes, answer questions | Goal: Review summary, identify interests Ref: Baxter, K., Courage, C., & Caine, K. 2015. Understanding your users: A practical guide to user research methods, 2nd edition. Morgan Kaufmann.
  87. ▪ Fold paper in half 3 times ▪ Round 1:

    8 ideas in 5 minutes - May be sufficient for group purposes ▪ Round 2: 1 big idea in 5 minutes - Opportunity for participants to add details ▪ Round 3: 1 storyboard in 5 minutes - Opportunity to put ideas in a flow Design Ideation: Crazy 8s 111 Ref: Kaplan, K. 2017. Facilitating an effective design studio workshop, at: https://www.nngroup.com/articles/facilitating-design-studio-workshop/
  88. Example Workshop Flow 113 ▪ Recruit stakeholders ▪ Introductions ▪

    Need (problem) centered discussion ▪ Value assessment ▪ Design brainstorm ▪ Feedback
  89. “ 114 “Consider the last time you had to share

    something online. How long ago did this happen? What did you share? Describe the steps you took to share, and highlight any surprises or problems (if any) that happened. Create a sketch to illustrate this.”
  90. “ 116 “Select the values that you’d most like to

    see reflected in a sharing tool. First, just select all the terms you like. Then, pick your top 3.”
  91. Essential Engaging Empowering Efficient Ideal Values (subset) 117 Effective Easy

    to use Customizable Creative Convenient Consistent Organized Low maintenance Inspiring Innovative Impressive Helpful Fun Friendly Comprehensive Attractive Flexible Fast Familiar Exciting Useful Understandable Straightforward Professional Predictable Powerful
  92. “ 119 “Select the values that you currently associate with

    our sharing tool. First, just select all the terms you want. Then, pick your top 3.”
  93. Essential Engaging Empowering Efficient Ideal Values (positive subset) 120 Effective

    Easy to use Customizable Creative Convenient Consistent Organized Low maintenance Inspiring Innovative Impressive Helpful Fun Friendly Comprehensive Attractive Flexible Fast Familiar Exciting Useful Understandable Straightforward Professional Predictable Powerful
  94. Ideal Values (negative subset) 121 Time-consuming Undesirable Uncontrollable Unattractive Stressful

    Simplistic Rigid Overwhelming Ordinary Irrelevant Inconsistent Impersonal Hard to use Frustrating Dull Difficult Confusing Complex Busy Boring
  95. “ 123 “Fold a piece of paper in half 3

    times. In each square, sketch (or write) an idea that represents your ideal sharing experience, or a frustration with the current experience.”
  96. Feedback Session 125 ▪ As part of the group activity

    - Review concepts - React to mockups - Create and evaluate sketches - Assess usability of prototypes 1. 2. 3. 4. 100%
  97. “ 126 “Imagine you are leaving the building and you

    meet our CEO in the elevator. Based on what we’ve discussed in this session, what would you say? Write it on a note and explain to the group.”
  98. 127 Scenario: Vacation planning Your participants need to plan a

    vacation using your app. They need to get costs for an inexpensive flight, affordable hotel, and transportation to and from the airport. You want to understand the real problems people face, features your app should provide to address these problems, and relative priority of each. Practice Workshop ▪ Introductions ▪ Understand problems ▪ Learn about Issues/risks ▪ Determine values ▪ Addressing issues ▪ Gather feedback ▪ Prioritize features ▪ Conclusion
  99. Reflect ▪ Process - Challenges - Highlights ▪ Results -

    Problems uncovered - Features? - Prioritization? - Next steps? 128
  100. Recruiting 130 ▪ Clarify target vs. nontarget - Demographics -

    Psychographics - Usage - Goals - Contexts
  101. Order and Flow 131 ▪ Support gradual understanding - Agreement

    to the study - Overall experiences and goals - Problems and incidents - New or existing concepts - Interfaces and interactions - Consider implications for impression testing - Critical feedback and open questions
  102. Pilot 132 ▪ Aim for high-fidelity - Participants who aren’t

    fully aware - Similar location and time - Equipment, tools, resources - Protocols, stimuli - Adequate durations ▪ Immediate debrief - Participant feedback on experience - Clarify protocol - Address equipment, tool issues
  103. Triangulation 133 ▪ As many sources as possible - Different

    participants, respondents - Varied contexts (outside of research focus) - Multiple methods - Usage data - Desk research
  104. Credits Special thanks to all the people who made and

    released these resources for free. 135 ▪ Presentation template by SlidesCarnival