
First Step Towards Accessibility

Presented by Mohit Mangla at MobileFlock III


Mobile Flock

January 15, 2022


Transcript

  1. Myths • “It’s not that many people” • “It’s time consuming” • “My app is too complicated to be accessible”
  2. “15 percent (1 billion) of the world’s population (7 billion) live with some kind of impairment” –2011 WHO World Report on Disability
  3. “75% of the FTSE 100 companies in the UK do not meet basic levels of accessibility, thus missing out on more than £96 ($147) million in revenue.” –UN Factsheet on Persons with Disabilities
  4. VoiceOver - Perceive and interact without seeing the screen or using direct manipulation. VoiceOver replicates the UI for users who can’t see it. Move through UI elements sequentially and have them announced via speech or braille. Perform actions using a single button: the whole screen. Free (saving thousands), built-in, high-quality, localizable.
  5. accessibilityLabel - Returns localized text that represents the accessibility element - Image-based controls need to specify this! - Don’t include the control type. “Polo T-Shirt”
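    A minimal Swift sketch of the point above, assuming a hypothetical image-only product button (the image name is illustrative):

      import UIKit

      // An image-based control carries no text, so VoiceOver cannot infer
      // a meaningful announcement without an explicit label.
      let productButton = UIButton(type: .custom)
      productButton.setImage(UIImage(named: "polo_tshirt"), for: .normal)

      // Localized, descriptive, and without the control type: the button
      // trait already makes VoiceOver announce "Button".
      productButton.accessibilityLabel = NSLocalizedString(
          "Polo T-Shirt",
          comment: "Accessibility label for the product image button"
      )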
  6. accessibilityTraits - Combination of traits that best characterize the accessibility element - UIKit controls come with standard traits - Combine traits with an OR operator. “Order, Button” button.accessibilityTraits = UIAccessibilityTraits.button
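    A short sketch of combining traits, assuming a hypothetical orderButton. In Swift, UIAccessibilityTraits is an OptionSet, so the OR combination is expressed as a set:

      import UIKit

      let orderButton = UIButton(type: .system)
      orderButton.setTitle("Order", for: .normal)

      // Single trait, announced as "Order, Button".
      orderButton.accessibilityTraits = .button

      // Combined traits, announced as "Order, Selected, Button".
      orderButton.accessibilityTraits = [.button, .selected]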
  7. accessibilityHint - Describes the outcome of performing an action - Keep it brief. “Order, Button, Double tap to order product” button.accessibilityHint = “Double tap to order product”
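    A sketch putting label, trait, and hint together on the same hypothetical orderButton, so VoiceOver announces “Order, Button, Double tap to order product”:

      import UIKit

      let orderButton = UIButton(type: .system)
      orderButton.accessibilityLabel = NSLocalizedString("Order", comment: "")
      orderButton.accessibilityTraits = .button
      orderButton.accessibilityHint = NSLocalizedString(
          "Double tap to order product",
          comment: "Outcome of activating the order button"
      )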
  8. Moving VoiceOver focus UIAccessibility.post(notification: .screenChanged, argument: brandLabel) - Notification posted when a major part of the screen has changed - Pass an element to move focus to it - Pass nil to focus the first accessible element - Pass a string and VoiceOver reads out that text - screenChanged, layoutChanged, announcement, etc.
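    Sketches of the three argument styles described above (brandLabel stands in for any visible element on screen):

      import UIKit

      let brandLabel = UILabel()

      // Move VoiceOver focus to a specific element after a major change.
      UIAccessibility.post(notification: .screenChanged, argument: brandLabel)

      // Pass nil to focus the first accessible element on the new layout.
      UIAccessibility.post(notification: .layoutChanged, argument: nil)

      // Pass a string and VoiceOver reads it out without moving focus.
      UIAccessibility.post(notification: .announcement, argument: "Order placed")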
  9. Order of Elements - Specify the order VoiceOver should go through the elements. self.accessibilityElements = [self.firstLabel, self.button, self.secondLabel]
  10. Voice User Interface – How It Works. Pipeline: Voice Assistant (Dictation, Siri, Alexa, Google Assistant) → Speech Recognition → Intent Selection and Entity Extraction (Natural Language Understanding) → Fulfillment (content, logic, app selection). Example query: “How will the weather be tomorrow in Mumbai?”
  11. How Voice is Utilized in Apps - Voice Search – perform a search based on vocal instructions and take the user to the search results - Voice Navigation – navigate the user to any page. “Show Cart” - Voice Action – perform an action with voice commands. “Order Voltas AC 1.5 ton”
  12. Voice Interaction - INPUT: Dictation, AVAudioRecorder / AVAudioEngine, Speech framework, SiriKit - OUTPUT: VoiceOver, AVSpeechSynthesizer, AVAudioPlayer
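    A minimal sketch of the output side: speaking text with AVSpeechSynthesizer (the utterance text and voice are illustrative):

      import AVFoundation

      // Keep a strong reference to the synthesizer while it speaks.
      let synthesizer = AVSpeechSynthesizer()
      let utterance = AVSpeechUtterance(string: "Your order has been placed.")
      utterance.voice = AVSpeechSynthesisVoice(language: "en-IN")
      synthesizer.speak(utterance)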
  13. Voice Interaction Stages - Listening Mode: as the user speaks, the recognized text appears on a UI element - Voice Processing Mode: after an interval, the voice assistant stops recording and starts processing the text using Core ML models and the Natural Language framework - Output Mode: output based on the voice command – navigation, text read-out, etc.
  14. Transcription Basics - SFSpeechRecognizer: the primary controller in the framework. Its job is to generate recognition tasks and return results. Handles authorization and configures locales. - SFSpeechRecognitionRequest: base class for recognition requests. Its job is to point the recognizer to an audio source. Read from a file – SFSpeechURLRecognitionRequest. Read from a buffer – SFSpeechAudioBufferRecognitionRequest.
  15. Transcription Basics - SFSpeechRecognitionTask: objects are created when a request is kicked off by a recognizer. Used to track the progress of a transcription or cancel it. - SFSpeechRecognitionResult: objects contain the transcription of a chunk of the audio.
  16. Ask for permission - Considering the user’s privacy, we should ask for permission before using the user’s audio for transcription.
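    A minimal sketch of the permission request. The app’s Info.plist must also declare NSSpeechRecognitionUsageDescription (and NSMicrophoneUsageDescription for live audio):

      import Speech

      SFSpeechRecognizer.requestAuthorization { status in
          // The callback may arrive on a background queue.
          DispatchQueue.main.async {
              switch status {
              case .authorized:
                  print("Speech recognition authorized")
              case .denied, .restricted, .notDetermined:
                  print("Speech recognition unavailable: \(status)")
              @unknown default:
                  break
              }
          }
      }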
  17. Transcribing a file 1. SFSpeechRecognizer() provides a recognizer with the device locale; isAvailable checks if the recognizer is ready. 2. An SFSpeechURLRecognitionRequest is created, and a loader is shown while transcribing. 3. recognitionTask processes the request and triggers a closure on completion. 4. isFinal will be true once transcription is complete; bestTranscription contains the most confident transcription; formattedString provides the string output to display. (See the sketch below.)
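    A sketch of the four steps, assuming a hypothetical fileURL pointing at an audio recording:

      import Speech

      func transcribeFile(at fileURL: URL) {
          // 1. Recognizer for the device locale; isAvailable gates readiness.
          guard let recognizer = SFSpeechRecognizer(), recognizer.isAvailable else {
              print("Recognizer not available")
              return
          }

          // 2. Point the recognizer at the audio file (show a loader here).
          let request = SFSpeechURLRecognitionRequest(url: fileURL)

          // 3. The task triggers this closure with partial and final results.
          recognizer.recognitionTask(with: request) { result, error in
              guard let result = result else {
                  print("Transcription failed: \(error?.localizedDescription ?? "unknown")")
                  return
              }
              // 4. Display-ready text once the transcription is complete.
              if result.isFinal {
                  print(result.bestTranscription.formattedString)
              }
          }
      }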
  18. Transcribing from Audio - AVAudioEngine object to process audio signals 1. The input audio node is obtained from the device’s microphone, along with its output format. 2. A tap is installed on the node’s output bus. When the buffer is filled, the closure returns the data in the buffer, which is appended to the request. 3. The audioEngine is prepared and started to begin recording. (See the sketch below.)
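    A sketch of the live pipeline, assuming speech authorization was already granted (error handling trimmed):

      import Speech
      import AVFoundation

      let audioEngine = AVAudioEngine()
      let request = SFSpeechAudioBufferRecognitionRequest()

      func startRecording() throws {
          // 1. The input node wraps the microphone; ask for its output format.
          let inputNode = audioEngine.inputNode
          let format = inputNode.outputFormat(forBus: 0)

          // 2. Each filled buffer is handed to the closure and appended
          //    to the recognition request.
          inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
              request.append(buffer)
          }

          // 3. Prepare and start the engine to begin recording.
          audioEngine.prepare()
          try audioEngine.start()

          SFSpeechRecognizer()?.recognitionTask(with: request) { result, _ in
              if let result = result {
                  print(result.bestTranscription.formattedString)
              }
          }
      }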
  19. Stop Transcription 1. The audioEngine is stopped, releasing its resources. 2. endAudio() tells the request not to expect any more incoming audio. 3. cancel() is called on the task to signal that its work is done and to free all resources.
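    A matching teardown sketch; audioEngine and request are the objects from the previous step, and recognitionTask is assumed to have been stored when the task was started:

      import Speech
      import AVFoundation

      var recognitionTask: SFSpeechRecognitionTask?

      func stopRecording(audioEngine: AVAudioEngine,
                         request: SFSpeechAudioBufferRecognitionRequest) {
          // 1. Stop the engine and remove the tap to release audio resources.
          audioEngine.stop()
          audioEngine.inputNode.removeTap(onBus: 0)

          // 2. Tell the request not to expect any more incoming audio.
          request.endAudio()

          // 3. Cancel the task so it frees its resources.
          recognitionTask?.cancel()
          recognitionTask = nil
      }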
  20. What is SiriKit? SiriKit is a framework that allows apps to work with Siri, a virtual assistant that responds to a user’s voice. SiriKit processes all interactions with the user and works with extensions to process user queries. We can create shortcuts using intent donations.
  21. SiriKit Basics - Extensions: SiriKit supports two types of extensions. 1. Intents app extension – transforms user requests into app-specific actions. 2. Intents UI app extension – displays content in the Siri interface. - Domains: Apple provides predefined intent domains to work with – Lists, Ride Booking, Messaging, Payments, Workouts, etc.
  22. Shortcuts - We should create shortcuts for actions a user performs repeatedly. Steps to create shortcuts: 1. Define shortcuts – define the functionality exposed to Siri so it understands what the app can do. 2. Donate shortcuts – donate a shortcut for a particular feature when the corresponding action is performed in the app. 3. Handle shortcuts – implement the handling for the shortcut. Two ways to create shortcuts: 1. NSUserActivity 2. Donation (see the sketch below).
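    A sketch of the NSUserActivity route. The activity type string is illustrative and would also be listed under NSUserActivityTypes in Info.plist:

      import UIKit

      func donateOrderShortcut(on viewController: UIViewController) {
          let activity = NSUserActivity(activityType: "com.example.app.order")
          activity.title = "Order product"
          activity.isEligibleForSearch = true
          activity.isEligibleForPrediction = true  // lets Siri suggest it

          // Attaching the activity to the visible view controller and making
          // it current performs the donation.
          viewController.userActivity = activity
          activity.becomeCurrent()
      }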
  23. Define Shortcuts - System-provided intents can be used to describe the actions that can be performed in the app. Create custom intents for customised functionality.
  24. Handling Extension - When a user request is received, SiriKit loads the Intents app extension and creates an object of the INExtension subclass. The extension returns the handler for the intent.
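    A sketch of the extension entry point; OrderIntent and OrderIntentHandler are hypothetical names for a custom intent generated from an intent definition file:

      import Intents

      class IntentHandler: INExtension {
          // SiriKit asks the INExtension subclass for a handler per intent.
          override func handler(for intent: INIntent) -> Any {
              if intent is OrderIntent {
                  return OrderIntentHandler()
              }
              return self
          }
      }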
  25. Handling Extension - For each intent object, an intent-handling protocol is implemented. An intent is handled in the following 3 steps: 1. Resolve – resolve each parameter, clarifying with SiriKit whether all parameters were received. 2. Confirm – confirm that all parameters are validated. The intent can now be handled by opening the app or by the intent extension. 3. Handle – the intent is handled and the response object is sent to SiriKit.
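    A sketch of the three stages using the system Messaging-domain intent:

      import Intents

      class SendMessageHandler: NSObject, INSendMessageIntentHandling {

          // 1. Resolve: validate each parameter; ask Siri to clarify if needed.
          func resolveContent(for intent: INSendMessageIntent,
                              with completion: @escaping (INStringResolutionResult) -> Void) {
              if let text = intent.content, !text.isEmpty {
                  completion(.success(with: text))
              } else {
                  completion(.needsValue())  // Siri prompts the user
              }
          }

          // 2. Confirm: all parameters validated; report readiness.
          func confirm(intent: INSendMessageIntent,
                       completion: @escaping (INSendMessageIntentResponse) -> Void) {
              completion(INSendMessageIntentResponse(code: .ready, userActivity: nil))
          }

          // 3. Handle: perform the action and send the response to SiriKit.
          func handle(intent: INSendMessageIntent,
                      completion: @escaping (INSendMessageIntentResponse) -> Void) {
              // The app-specific sending logic would go here.
              completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
          }
      }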
  26. Q&A