Slide 1

Getting started with manual accessibility testing
Martin Underhill
23rd January 2024

Slide 2

Martin Underhill
Lead Accessibility Specialist

Slide 3

What we’ll cover
1. Accessibility conformance
2. Automated accessibility testing
3. Manual accessibility testing
4. Process (ideal, and pragmatic)

Slide 4

Accessibility conformance

Slide 5

The Web Content Accessibility Guidelines (WCAG)
“WCAG … defines how to make Web content more accessible to people with … a wide range of disabilities, including visual, auditory, physical, speech, cognitive, language, learning, and neurological disabilities”
— W3C, Background on WCAG 2

Slide 6

The Web Content Accessibility Guidelines 2.2
Versions
• 2.0
• 2.1 ← Most common
• 2.2 ← Current recommendation (brand new!)

Slide 7

Conformance levels (from lowest to highest)
1. A
2. AA (A + AA) ← Most common
3. AAA (A + AA + AAA)
www.w3.org/TR/WCAG21/#wcag-2-layers-of-guidance

Slide 8

“Although these guidelines cover a wide range of issues, they are not able to address the needs of people with all types, degrees, and combinations of disability”
— W3C, Background on WCAG 2

Slide 9

Automated accessibility testing

Slide 10

axe DevTools
• Other tools are available!
• Fewer false positives than other tools (anecdotally)
• Widely used
• Maps issues to Deque University knowledge base
• UI (in Chrome/Firefox dev tools) for axe-core
• Many services run axe-core under the hood
• Decent free baseline product with room to grow
www.deque.com/axe/devtools

Slide 11

axe-core
Disclaimer: I know very little about this; best left to experts like you!
1. Install in your CI/CD pipeline
2. Configure to appropriate version, level, etc.
3. Then either:
  • Flag issues on build
  • Export reports
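
A minimal sketch of those three steps (not from the deck): it assumes a Playwright test suite with the @axe-core/playwright package, a placeholder URL, and rule tags standing in for “appropriate version, level, etc.”. Failing the test flags issues on build; the results object could equally be written out as a report instead.

```ts
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('home page has no detectable WCAG A/AA violations', async ({ page }) => {
  await page.goto('https://example.com/'); // placeholder URL

  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa']) // rule tags ≈ version + level
    .analyze();

  // Fail the build on any violation; alternatively, write `results` out as a report
  expect(results.violations).toEqual([]);
});
```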

Slide 12

Facts and figures: how much automated testing catches
• 20% to 25% of WCAG 2.1 AA issues – Level Access
• Around 40% of ‘accessibility issues’ – Smashing Magazine
• 57% of ‘digital accessibility issues’ – Deque
www.levelaccess.com/blog/automated-accessibility-testing-tools-how-much-do-scans-catch/
www.smashingmagazine.com/2022/11/automated-test-results-improve-accessibility/
www.deque.com/blog/automated-testing-study-identifies-57-percent-of-digital-accessibility-issues/

Slide 13

Automated testing is really useful for things like
• Colour contrast
• Suspect markup
• Missing content, like image alt text and the document
• Touch target size
github.com/dequelabs/axe-core

Slide 14

No content

Slide 15

Manual accessibility testing

Slide 16

What I won’t be covering

Slide 17

Mobile testing
• This is a WCAG requirement
• You’re probably already doing it
• To a great extent, also covers when users:
  • have browser zoom turned up on larger screens
  • change the font
  • add extra text spacing

Slide 18

A bunch of design stuff
• Descriptive (alt) text makes sense
• Form field labels provide enough information
• Captions match a video’s audio track
• Using colour alone to convey meaning
• Icon-only buttons

Slide 19

We’ll focus on just three things
1. Keyboard-only use
2. Screen reader software
3. Speech recognition software

Slide 20

Keyboard-only use

Slide 21

Keyboard controls
• Tab ⇥ moves through the tab index
• Enter/Return ⏎
  • Activates a link or button
  • Submits a form
• Space ␣
  • Activates a button (not a link)
  • Checks a checkbox
• Up ↑ and down ↓ move through dropdown options and radio groups

Slide 22

No content

Slide 23

Some keyboard-only testing finds
• Incorrect markup
• Keyboard interaction expectations not met
• Focus indicators
• Unexpected tab order
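
To make “keyboard interaction expectations not met” concrete: a native <button> is focusable with Tab and activates on Enter and Space for free, while a custom control built from a <div> needs that behaviour added by hand. A hypothetical sketch (the helper name is made up; using a real <button> in the first place is usually the better fix):

```ts
// Hypothetical helper: give a div-based "button" the keyboard behaviour a
// native <button> provides automatically.
function makeDivActLikeButton(div: HTMLElement, onActivate: () => void): void {
  div.setAttribute('role', 'button'); // announced as a button
  div.tabIndex = 0;                   // reachable with the Tab key

  div.addEventListener('click', onActivate);
  div.addEventListener('keydown', (event) => {
    // Buttons are expected to activate on both Enter and Space
    if (event.key === 'Enter' || event.key === ' ') {
      event.preventDefault(); // stop Space from scrolling the page
      onActivate();
    }
  });
}
```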

Slide 24

Keyboard-only testing top tips
• Check for skip links
• Only interactive elements should be focusable
• Double check against default browser behaviour
• Try to cancel an action with Esc
playground.tempertemper.net

Slide 25

Screen reader software

Slide 26

Popular screen reader software
• NVDA for Windows
• JAWS for Windows
• Narrator for Windows
• VoiceOver for Mac
• VoiceOver for iOS/iPadOS
• TalkBack for Android

Slide 27

No content

Slide 28

Some screen reader testing finds
• Whether form inputs are properly hooked up to their labels, hint text, etc.
• Components that look like one thing and are read out as another
• Incorrect heading structure
• Missing page ‘landmarks’
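
What “properly hooked up” means in practice, as a sketch rather than anything from the slides: the label and hint text have to be programmatically associated with the input (for/id and aria-describedby), not just placed next to it visually.

```ts
// A screen reader announces "Email address" plus the hint for this input only
// because of the explicit for/id and aria-describedby associations.
const label = document.createElement('label');
label.htmlFor = 'email';
label.textContent = 'Email address';

const hint = document.createElement('p');
hint.id = 'email-hint';
hint.textContent = 'We will only use this to reply to your enquiry'; // made-up hint text

const input = document.createElement('input');
input.type = 'email';
input.id = 'email';                                   // matches label.htmlFor
input.setAttribute('aria-describedby', 'email-hint'); // ties the hint to the field

document.body.append(label, input, hint);
```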

Slide 29

Page landmark examples
1. Header
2. Navigation
3. Main content
4. Sidebar with content filters
5. Footer

Slide 30

Screen reader testing top tips
• Don’t use the tab key!
• Understand the ‘modes’:
  • Navigation
  • Interaction
• Learn the ropes on an accessible website (like GOV.UK)
• Once you’ve mastered the basics, learn shortcuts
www.tempertemper.net/blog/screen-reader-users-and-the-tab-key

Slide 31

Speech recognition software

Slide 32

Popular speech recognition software
• Dragon for Windows
• Speech Recognition for Windows
• Voice Control for Mac
• Voice Control for iOS/iPadOS
• Voice Access for Android

Slide 33

Some speech recognition software testing finds
• Decorative button icons that have not been hidden
• Labels not matching their underlying accessible name
• Incorrect semantics
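
The second point maps to WCAG 2.5.3 (Label in Name). A small illustrative sketch, not from the slides: a speech recognition user says the label they can see, so the accessible name has to contain it.

```ts
const button = document.createElement('button');
button.textContent = 'Send';

// ✗ The user sees "Send" and says "click Send", but the accessible name is
//   now "Submit form", so the voice command doesn't match.
button.setAttribute('aria-label', 'Submit form');

// ✓ Fix: remove the override (the visible text becomes the accessible name),
//   or make sure any aria-label starts with the visible label, e.g. "Send enquiry".
button.removeAttribute('aria-label');
```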

Slide 34

Speech recognition software testing top tips
• Learn the ropes on an accessible website (like GOV.UK)
• Don’t begin testing on your own product until it works well for:
  • Keyboard-only users
  • Screen reader users

Slide 35

Process

Slide 36

Ideal process involves
• axe-core in CI/CD pipelines; configured to catch WCAG 2.2 AA, plus flag best practices and AAA
• Screen reader testing with multiple screen readers across multiple operating systems
• Speech recognition software testing with multiple speech recognition software apps across multiple operating systems
• Testing on multiple platforms:
  • Parallels for Mac (or equivalent) to test on Windows
  • Device lab for native mobile accessibility testing
  • Windows users to test on Mac
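
One way the first bullet might look, as a sketch under assumptions (plain axe-core running in a browser or test context; the function name is made up; import style depends on your bundler/TS config): the rule tags select the WCAG version and level, with AAA and best-practice rules as extra signal.

```ts
import axe from 'axe-core';

// Hypothetical audit of the current document with the configuration above:
// WCAG 2.0/2.1/2.2 A and AA, plus AAA and best-practice rules.
async function auditCurrentPage() {
  const results = await axe.run(document, {
    runOnly: {
      type: 'tag',
      values: [
        'wcag2a', 'wcag2aa',   // WCAG 2.0 A and AA
        'wcag21a', 'wcag21aa', // WCAG 2.1 additions
        'wcag22aa',            // WCAG 2.2 additions
        'wcag2aaa',            // AAA rules, as extra signal
        'best-practice',       // axe-core best-practice rules
      ],
    },
  });

  // Summarise violations; in CI this could fail the build or export a report
  console.table(
    results.violations.map((v) => ({ rule: v.id, impact: v.impact, nodes: v.nodes.length }))
  );
  return results;
}
```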

Slide 37

Getting started
• Start by aiming for WCAG 2.1 AA conformance
• Run axe DevTools on each page (until axe-core is installed and configured)
• Start manual accessibility testing with the software on your machine
• Once comfortable, introduce testing with a second screen reader or speech recognition software package, ideally on a different platform
• Ask devs to check with the keyboard and their built-in screen reader software
• Get eyes on designs early – you’ll spot issues others won’t

Slide 38

Don’t worry – it’ll get quicker!
• Getting to grips with a new way to use UI takes time
• Eventually, it’ll be as quick as testing with the mouse
• Introducing manual accessibility testing gradually is the best way

Slide 39

Don’t forget the three steps
1. Keyboard-only testing usually finds the big issues
2. Screen reader software finds a bunch more
3. Speech recognition software finds the remaining issues

Slide 40

Thanks 🙇

Slide 41

Questions!