Testers & QA
You are the gatekeepers of inclusion.
While developers build the features, QA ensures they actually work for people with disabilities. Accessibility testing is not just about running a scanner; it’s about verifying the human experience. Automated tools can identify roughly 30% of issues (mostly syntactic failures, such as missing attributes or invalid markup), but manual testing is required to catch the other 70% (usability and logical flow).
Quick Wins
Keyboard Only
Put your mouse away. Can you reach and operate the entire feature using only Tab, Shift + Tab, Enter, Space, and the arrow keys?
Visible Focus
As you tab through, is there a clear, visible ring around the active element at all times? Or is it suppressed by CSS overrides such as outline: none?
Zoom to 200%
Use Ctrl + + (Cmd + + on Mac) to zoom the browser to 200%. Does text overlap or get cut off? Do navigation menus disappear?
Error Handling
Trigger a form error. Does focus move to the error? Is the error identified in clear text, not just a red border?
Multimedia
If there is video, are captions available? If there is audio, is there a transcript?
The Hybrid Methodology
We use a two-pronged approach to validate WCAG 2.2 AA compliance.
Automated Scans
Run these first to catch low-hanging fruit (missing alt text, low contrast, invalid HTML).
- Tool: axe DevTools (Browser Extension)
- Coverage: ~30% of total errors
- Frequency: Every sprint / Every pull request
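To illustrate the kind of “low-hanging fruit” automated scans catch, here is a minimal, hypothetical Python sketch (an illustration only, not a substitute for axe DevTools): it flags `img` elements with no alt attribute and computes the WCAG contrast ratio used by the AA thresholds (4.5:1 for normal text).

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Collects <img> tags that have no alt attribute at all."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            # alt="" is valid for decorative images; a *missing* alt is the failure
            if "alt" not in attr_map:
                self.missing_alt.append(attr_map.get("src", "<no src>"))

def relative_luminance(hex_color):
    """WCAG relative luminance of an sRGB hex colour like '#336699'."""
    channels = (int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))
    def linearize(c):
        # sRGB gamma expansion, per the WCAG definition
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in channels)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio; AA requires at least 4.5:1 for normal text."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

html = '<img src="logo.png"><img src="hero.jpg" alt="Team photo">'
auditor = AltTextAuditor()
auditor.feed(html)
print(auditor.missing_alt)  # ['logo.png']
print(contrast_ratio("#000000", "#ffffff"))  # 21.0 (maximum possible)
```

Checks like these are exactly what scanners automate reliably; they cannot tell you whether the alt text is meaningful or the reading order makes sense, which is why manual verification follows.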
Manual Verification
Human testing using Assistive Technology (AT) to verify logical flow and operability.
- Tool: NVDA (Windows) or VoiceOver (Mac)
- Coverage: ~70% of total errors
- Frequency: Major releases / UAT
References
Testing Templates
Copy-paste bug descriptions for common failures (e.g., “Focus not trapped in modal”).
WebAIM: Keyboard Accessibility Testing
Covers keyboard testing techniques, common failure patterns, and the issues automated tools cannot detect.
WAVE (WebAIM)
Flags common accessibility issues on web pages to support early feedback and triage.