* Mind Mapping = central idea + connect related thoughts, radiant thinking * advantages: - visual overview - group related concepts - easy to grasp whole picture MIND MAPPING
* introduced by British psychologist Tony Buzan * visual tool to help you organize your thoughts A Mind Map is a visual and graphic Holistic Thinking Tool. — Tony Buzan
* Questions that come up: how should this work, and why would Mind Mapping be of interest to visually impaired people? Hi, I'm a visually impaired user of Apple products & I'm wondering if your app works with VoiceOver? — Jessica
* perfectly normal use case: write down ideas & organize thoughts * realization: visually impaired users think visually too I'd like to write an autobiography, and need a place to write down my ideas and construct chapters. — Jessica
Several reasons: * easy: you can get the basics done in a few hours * increase user base (WHO: 285 million visually impaired people worldwide, 39 million blind) * UI Testing WHY SHOULD WE CARE?
* Main reason: empathy * unique chance for developers: improve the lives of millions of people * Smartphone + Accessibility = life-changing for people with disabilities (empowerment, independence) * Goal: equal experience for all users, regardless of disabilities BECAUSE IT'S THE RIGHT THING TO DO!
* very common belief * we implicitly assume most extreme case * reality: entire spectrum [corrective lenses -> complete blindness] * needs of partially-sighted users wildly differ from blind users MYTH 1 VISUALLY IMPAIRED USERS ARE BLIND.
* users rely on memory of location * once you remember the location of a control, you don't scan the whole screen MYTH 2 VISUALLY IMPAIRED USERS ACCESS THINGS SEQUENTIALLY.
* listen to just enough to orient themselves * labels should be as short as possible: concise, not verbose * users don't want to read a book, they want to use your app * it's all about efficiency MYTH 3 VISUALLY IMPAIRED USERS LISTEN TO ALL ON-SCREEN TEXT.
* wide range of accessibility technologies: localization, RTL, Dynamic Type, Bold Text, Reduced Motion, Reduced Transparency, ... * VoiceOver = screen reader * turns your whole screen into an input area * reads out what's under your finger (the focused element) VOICEOVER
* to activate the currently focused element, double-tap anywhere on screen * equivalent to a single tap in normal mode * calls "func accessibilityActivate() -> Bool" ACTIVATE DOUBLE TAP
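The activation callback named above can be customized by overriding it on a custom element. A minimal sketch, assuming a hypothetical checkbox-style view (the class name and its state are not from the talk):

```swift
import UIKit

// Sketch: a custom view that handles VoiceOver's double-tap
// activation itself. CheckboxNode and isChecked are hypothetical.
class CheckboxNode: UIView {
    private var isChecked = false

    override func accessibilityActivate() -> Bool {
        // Called when the user double-taps while this element is focused.
        isChecked.toggle()
        accessibilityValue = isChecked ? "checked" : "unchecked"
        // Returning true tells VoiceOver the activation was handled.
        return true
    }
}
```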
* moves focus to the previous or next element on screen * good way to explore screen * order of elements is very important (UIAccessibilityContainer) CHANGE FOCUS SEQUENTIALLY SWIPE LEFT/RIGHT
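One lightweight way to control that element order is to set `accessibilityElements` on the container explicitly, rather than implementing the `UIAccessibilityContainer` methods by hand. A sketch with hypothetical subview names:

```swift
import UIKit

// Sketch: fixing the left/right-swipe order of a container.
// titleLabel, mapView and addButton are hypothetical subviews.
func configureFocusOrder(of container: UIView,
                         titleLabel: UILabel,
                         mapView: UIView,
                         addButton: UIButton) {
    // VoiceOver visits the elements in exactly this order.
    container.accessibilityElements = [titleLabel, mapView, addButton]
}
```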
* Magic Tap is optional, triggers the most important action in your app * e.g. phone app: take incoming call; music app: play/pause music; camera app: take picture MOST IMPORTANT ACTION 2-FINGER DOUBLE TAP
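The Magic Tap is implemented by overriding `accessibilityPerformMagicTap()` somewhere in the responder chain, typically a view controller. A sketch, where the action itself is a hypothetical placeholder:

```swift
import UIKit

// Sketch: handling the two-finger double tap in a view controller.
// startOrStopRecording() is a hypothetical app-specific action.
class RecorderViewController: UIViewController {
    override func accessibilityPerformMagicTap() -> Bool {
        startOrStopRecording()
        // Return true so the event isn't passed further up the responder chain.
        return true
    }

    func startOrStopRecording() { /* app-specific */ }
}
```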
* framework provided by Apple * allows us to provide accessibility information about UI elements * this information is then used by VoiceOver & other assistive technologies * at the heart: informal protocol on NSObject: NSObject(UIAccessibility) UIAccessibility
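Because the protocol lives on `NSObject`, any view can opt in by setting a few properties. A minimal sketch of making a custom view visible to VoiceOver (the view and label text are illustrative):

```swift
import UIKit

// Sketch: the minimum needed to expose a custom view to VoiceOver.
let nodeView = UIView()
nodeView.isAccessibilityElement = true          // opt in to accessibility
nodeView.accessibilityLabel = "Central Idea"    // what VoiceOver reads first
nodeView.accessibilityTraits = .button          // conveys the control type
```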
* single most important tool * it's what VoiceOver reads out first (compare with Myth #3) * rules: - as short as possible, ideally a single word - don't include the type of control -> that's what accessibilityTraits is for - localized - starts with a capitalized word - does not end with a period - More: http://developer.apple.com/library/ios/#documentation/UserExperience/Conceptual/ accessibilityLabel "WHAT'S MY NAME?"
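The rules above can be shown as a before/after pair (the button and label text are illustrative):

```swift
import UIKit

let deleteButton = UIButton()
// Bad: includes the control type and ends with a period.
deleteButton.accessibilityLabel = "Delete node button."
// Good: short, capitalized, no period; the type comes from traits instead.
deleteButton.accessibilityLabel = "Delete"
```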
lots of other properties to check out: * hint: what happens when you trigger me? * traits: what's my personality? * frame/path: where am I? * value: what's my value? accessibilityHint accessibilityTraits accessibilityFrame/Path accessibilityValue ... AND MANY MORE ...
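A sketch of the label/hint/value trio on a single control, annotated with the questions each property answers (the slider and its strings are illustrative):

```swift
import UIKit

let zoomSlider = UISlider()
zoomSlider.accessibilityLabel = "Zoom"                      // what's my name?
zoomSlider.accessibilityHint = "Adjusts the map zoom level" // what happens when triggered?
zoomSlider.accessibilityValue = "75 percent"                // what's my value?
```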
* think of it as a context menu for objects on screen * can make the difference between an app that's merely accessible and one that's genuinely great to use with VoiceOver UIAccessibilityCustomAction "RIGHT-CLICK MENU"
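A sketch of attaching such a "right-click menu" to a node view; VoiceOver users then pick an action with vertical swipes and double-tap to run it. The view controller and its selectors are hypothetical:

```swift
import UIKit

// Sketch: custom actions on a mind-map node. NodeViewController
// and the two selectors are hypothetical.
class NodeViewController: UIViewController {
    func configureActions(for nodeView: UIView) {
        nodeView.accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Add Child Node",
                                        target: self,
                                        selector: #selector(addChildNode)),
            UIAccessibilityCustomAction(name: "Delete Node",
                                        target: self,
                                        selector: #selector(deleteNode))
        ]
    }

    @objc func addChildNode() { /* app-specific */ }
    @objc func deleteNode() { /* app-specific */ }
}
```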
* Example: Browsing a Website. - Sighted users don't read the whole screen top to bottom, they scan for images, headlines, links, ... - the Rotor allows visually impaired people to perform the same workflow: skim through content * can be shown by rotating fingers on screen * new in iOS 10: custom rotors - you can add your own categories UIAccessibilityCustomRotor "WHAT'S ON THIS PAGE?"
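The iOS 10 custom rotor mentioned above takes a search block that returns the next or previous item for a category. A sketch for a "Nodes" category, where the node-lookup helper is a hypothetical placeholder:

```swift
import UIKit

// Sketch: a custom rotor that lets users skim from node to node.
// node(after:forward:) is a hypothetical app-specific lookup.
func makeNodeRotor() -> UIAccessibilityCustomRotor {
    return UIAccessibilityCustomRotor(name: "Nodes") { predicate in
        let forward = predicate.searchDirection == .next
        guard let next = node(after: predicate.currentItem.targetElement,
                              forward: forward) else { return nil }
        return UIAccessibilityCustomRotorItemResult(targetElement: next,
                                                    targetRange: nil)
    }
}

// Hypothetical: returns the next/previous node element, or nil at the ends.
func node(after element: NSObjectProtocol?, forward: Bool) -> NSObject? {
    return nil // app-specific lookup
}
```

The rotor is then registered on a view via `accessibilityCustomRotors = [makeNodeRotor()]`.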
* challenge: take a selfie with VoiceOver enabled 1. activate VoiceOver with triple-click on home button 2. select time in status bar 3. 3-finger swipe up to show control center 4. 3-finger triple tap to enable "Screen Curtain" (turns screen off) 5. find camera icon 6. activate camera app 7. change camera to selfie camera TAKE A SELFIE!
the power of making your app accessible = being able to answer this kind of question with YES, and knowing that it improves the lives of so many people out there Hi, I'm a visually impaired user of Apple products & I'm wondering if your app works with VoiceOver? — Jessica