Accessibility Testing Tools: axe, WAVE, Lighthouse
Automated accessibility testing catches approximately 30-40% of WCAG violations. The other 60-70% require manual testing — keyboard walkthroughs, screen reader evaluation, and user research. Automated tools are essential but insufficient alone. Understanding what each tool detects (and misses) determines how to build a complete testing strategy.
The Big Three: axe, WAVE, Lighthouse
axe DevTools (Deque)
What it is: A browser extension (Chrome, Firefox, Edge) and JavaScript library (axe-core) for automated accessibility testing. axe-core is open-source and powers dozens of other tools.
What it detects:
- Missing alt text on images.
- Color contrast failures.
- Missing form labels.
- ARIA attribute errors (invalid roles, missing required properties).
- Focus order issues.
- Missing document language.
- Empty headings and buttons.
What it misses:
- Whether alt text is actually meaningful (it checks presence, not quality).
- Logical reading order.
- Whether keyboard focus indicators are visible and sufficient.
- Whether ARIA roles match actual component behavior.
- Whether content is understandable.
Best for: CI/CD integration. axe-core runs in Node.js, Selenium, Cypress, Playwright, and virtually every test runner. Adding axe.run() to your end-to-end tests catches regressions automatically.
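In CI, the usual pattern is to run axe and fail the build when high-impact violations appear. Here is a minimal sketch of that gating logic, assuming axe-core's documented result shape; the helper name and the sample object are mine, not part of the axe API:

```javascript
// Hypothetical CI helper: collect violations at or above a chosen
// impact level. The `results` shape mirrors what axe.run() resolves
// with: a `violations` array whose entries carry an `impact` of
// "minor", "moderate", "serious", or "critical".
function blockingViolations(results, blockOn = ["critical", "serious"]) {
  return results.violations.filter((v) => blockOn.includes(v.impact));
}

// Trimmed-down stand-in for a real axe-core result object.
const results = {
  violations: [
    { id: "image-alt", impact: "critical", nodes: [{ target: ["img.hero"] }] },
    { id: "region", impact: "moderate", nodes: [{ target: ["div.sidebar"] }] },
  ],
};

const blockers = blockingViolations(results);
// In a real pipeline you would exit nonzero here; this sketch just reports.
console.log(blockers.map((v) => `${v.impact}: ${v.id}`)); // ["critical: image-alt"]
```

The same filter works against the results object returned by axe integrations for Playwright, Cypress, or Selenium, so one threshold policy can apply across all test runners.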
Pricing: axe DevTools browser extension has a free tier. axe DevTools Pro (paid) adds guided intelligent testing, issue grouping, and WCAG 2.2 coverage.
WAVE (WebAIM)
What it is: A browser extension (Chrome, Firefox) and web service that visually overlays accessibility information on the page.
What it detects:
- Same core WCAG violations as axe (alt text, contrast, labels, ARIA).
- Structural elements: headings, landmarks, lists.
- Alerts for potential issues that need manual review (redundant alt text, suspicious link text, missing first-level heading).
What it misses:
- Same categories as axe — automated tools share fundamental detection limits.
- Does not integrate into CI/CD (it is a manual review tool).
Best for: Visual debugging during development and design review. The overlay showing icons on each element makes it easy to scan a page for issues. WAVE’s “Alerts” category flags items that are not definitive failures but warrant human review — a feature axe does not emphasize as strongly.
Pricing: Free browser extension and web service. WAVE API (for batch testing) is paid.
Lighthouse (Google)
What it is: An automated auditing tool built into Chrome DevTools. Runs accessibility, performance, SEO, and best-practices audits.
What it detects:
- Uses axe-core as its accessibility engine, so detection capabilities overlap heavily with axe.
- Produces a 0-100 accessibility score (useful for dashboards but potentially misleading: a score of 100 does not mean full compliance).
What it misses:
- Everything axe misses, because it uses axe-core under the hood.
- The score incentivizes “pass all checks” rather than comprehensive accessibility.
Best for: Quick baseline audits and performance-focused teams already using Lighthouse. Good for initial awareness. Not sufficient as a primary accessibility testing tool.
Pricing: Free (built into Chrome).
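The score deserves a closer look, because it explains why one failure can move the number so much: Lighthouse's accessibility score is a weighted average of binary pass/fail audits. The sketch below shows that aggregation; the audit IDs are real axe rule names, but the weights are illustrative, not Lighthouse's actual table:

```javascript
// Simplified sketch of how Lighthouse aggregates its accessibility
// category score: each audit scores 1 (pass) or 0 (fail), and the
// category score is the weight-normalized average, scaled to 0-100.
function categoryScore(audits) {
  const totalWeight = audits.reduce((sum, a) => sum + a.weight, 0);
  const earned = audits.reduce((sum, a) => sum + a.score * a.weight, 0);
  return Math.round((earned / totalWeight) * 100);
}

// Illustrative weights: a single failing high-weight audit
// (color-contrast here) drags the score well below 100.
const audits = [
  { id: "image-alt", weight: 10, score: 1 },
  { id: "color-contrast", weight: 7, score: 0 },
  { id: "label", weight: 7, score: 1 },
  { id: "document-title", weight: 3, score: 1 },
];
console.log(categoryScore(audits)); // 74
```

Note what the aggregation cannot express: audits the tool never runs (alt-text quality, focus visibility, reading order) contribute nothing, so a 100 only means "no detectable failures among the automated checks."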
Specialized Testing Tools
| Tool | Purpose | Type |
|---|---|---|
| pa11y | CLI-based automated testing, CI/CD integration | Open source |
| Tenon | API-first automated testing | Commercial |
| SortSite | Full-site crawling and accessibility audit | Commercial |
| Accessibility Insights (Microsoft) | Guided manual testing + automated checks | Free |
| ARC Toolkit (TPGi) | Browser extension with detailed issue reporting | Free |
| Colour Contrast Analyser (TPGi) | Desktop app for color contrast checking | Free |
| Contrast Checker (WebAIM) | Online contrast ratio calculator | Free |
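The contrast tools above all implement the same WCAG 2.x formula: compute each color's relative luminance, then take the ratio of the lighter to the darker (offset by 0.05). A self-contained sketch:

```javascript
// Relative luminance of an sRGB color, per the WCAG 2.x definition:
// linearize each 0-255 channel, then weight by the standard coefficients.
function relativeLuminance([r, g, b]) {
  const lin = (c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4;
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio: (lighter luminance + 0.05) / (darker luminance + 0.05),
// ranging from 1:1 (identical colors) to 21:1 (black on white).
function contrastRatio(fg, bg) {
  const [l1, l2] = [relativeLuminance(fg), relativeLuminance(bg)];
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

console.log(contrastRatio([0, 0, 0], [255, 255, 255])); // 21 (the maximum)
// WCAG AA requires >= 4.5:1 for normal text, >= 3:1 for large text.
// #767676 on white is ~4.54:1, just above the normal-text threshold.
console.log(contrastRatio([118, 118, 118], [255, 255, 255]) >= 4.5); // true
```

This is the entire check the automated tools run; what they cannot judge is whether the low-contrast text mattered (decorative text and logos are exempt under WCAG), which is why contrast results still need human triage.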
Manual Testing (the Other 60-70%)
Automated tools cannot replace these manual checks:
Keyboard Testing
Unplug the mouse. Tab through every page. Verify:
- Every interactive element is reachable.
- Focus order is logical.
- Focus indicators are visible (meeting WCAG 2.2 focus requirements).
- No keyboard traps exist.
- Custom widgets respond to expected keys (Enter, Space, Escape, arrows).
For detailed keyboard testing methodology, see our keyboard navigation guide.
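When verifying that focus order is logical, it helps to know the order browsers actually compute. The sketch below models that ordering over a plain list of element descriptors (a simplification: real tabbability also depends on element type, visibility, and the disabled state):

```javascript
// Browser tab order: elements with a positive tabindex come first, in
// ascending tabindex value (ties resolved by DOM order), followed by
// tabindex="0" / naturally focusable elements in DOM order.
// tabindex="-1" is focusable via script but skipped by the Tab key.
function tabOrder(elements) {
  const positive = elements
    .filter((el) => el.tabindex > 0)
    .sort((a, b) => a.tabindex - b.tabindex); // Array.sort is stable, so DOM order survives ties
  const natural = elements.filter((el) => el.tabindex === 0);
  return [...positive, ...natural].map((el) => el.id);
}

// Elements listed in DOM order; tabindex 0 means "default position".
const dom = [
  { id: "search", tabindex: 2 },
  { id: "logo-link", tabindex: 0 },
  { id: "skip-link", tabindex: 1 },
  { id: "hidden-helper", tabindex: -1 },
  { id: "nav-home", tabindex: 0 },
];
console.log(tabOrder(dom)); // ["skip-link", "search", "logo-link", "nav-home"]
```

The example also shows why positive tabindex values are widely discouraged: they jump ahead of everything in natural DOM order, so a single stray `tabindex="2"` can scramble the focus order for the whole page.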
Screen Reader Testing
Navigate with a screen reader and verify:
- All content is announced in a logical order.
- Interactive elements have meaningful names, roles, and states.
- Dynamic content updates are announced via live regions.
- Forms are navigable and errors are announced.
Test with at least two combinations: NVDA + Firefox on Windows and VoiceOver + Safari on macOS. See screen reader compatibility for details.
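Whether an element announces a meaningful name comes down to the accessible name computation. Below is a heavily simplified sketch of its precedence; the full W3C accname algorithm has many more steps, and the element shape here is invented for illustration:

```javascript
// Simplified accessible-name precedence: aria-labelledby wins, then
// aria-label, then an associated <label>, then visible text content.
// `byId` maps element ids to descriptors, standing in for document lookup.
function accessibleName(el, byId = {}) {
  if (el.ariaLabelledby) {
    return el.ariaLabelledby
      .split(" ")
      .map((id) => byId[id]?.text ?? "")
      .join(" ")
      .trim();
  }
  if (el.ariaLabel) return el.ariaLabel.trim();
  if (el.label) return el.label.trim();
  return (el.text ?? "").trim();
}

const byId = { "billing-h": { text: "Billing address" } };
console.log(accessibleName({ ariaLabelledby: "billing-h" }, byId)); // "Billing address"
console.log(accessibleName({ ariaLabel: "Close dialog", text: "×" })); // "Close dialog"
console.log(accessibleName({ text: "Submit" })); // "Submit"
```

The precedence explains a common screen reader surprise: an `aria-label` silently overrides the visible text, so a button labeled "Buy now" on screen can announce something entirely different.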
Content Review
- Alt text quality: Is the description meaningful, not just present?
- Heading hierarchy: Does the structure create a logical outline?
- Link text: Does every link make sense out of context?
- Language: Is the content written in plain language at an appropriate reading level?
- Error messages: Do they identify the problem and suggest a fix?
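Parts of the content review can be semi-automated. Here is a rough lint for link text that fails out of context, in the spirit of WAVE's suspicious-link-text alert (the phrase list is illustrative; real tools use broader heuristics, and a human still judges the survivors):

```javascript
// Link texts that carry no meaning when read in isolation, as a screen
// reader's links list presents them. Illustrative, not exhaustive.
const GENERIC_LINK_TEXT = new Set([
  "click here", "here", "read more", "more", "learn more", "link", "this page",
]);

function suspiciousLinks(links) {
  return links.filter((l) => GENERIC_LINK_TEXT.has(l.text.trim().toLowerCase()));
}

const links = [
  { href: "/pricing", text: "See pricing plans" },
  { href: "/report.pdf", text: "Click here" },
  { href: "/blog/a11y", text: "Read more" },
];
console.log(suspiciousLinks(links).map((l) => l.href)); // ["/report.pdf", "/blog/a11y"]
```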
Building a Testing Strategy
Development Phase
- Linting: eslint-plugin-jsx-a11y (React), vue-axe (Vue), or equivalent catches issues at code time.
- Unit/integration tests: axe-core assertions in component tests catch regressions.
- Pre-commit hooks: Run axe-core on changed templates before code is merged.
QA Phase
- Automated full-page scan: axe DevTools or WAVE on every unique template.
- Keyboard walkthrough: Manual testing of every user flow.
- Screen reader walkthrough: NVDA + VoiceOver on critical flows.
Release Phase
- CI/CD gate: axe-core in the pipeline blocks deploys with critical accessibility regressions.
- Accessibility audit: Quarterly manual audit against the full WCAG 2.2 AA checklist.
Post-Release
- Monitoring: Continuous automated scans of production pages.
- User testing: Sessions with assistive technology users at least once per major release.
Key Takeaways
- Automated tools (axe, WAVE, Lighthouse) catch 30-40% of WCAG issues — they detect presence, not quality.
- axe-core is the best choice for CI/CD integration; WAVE is best for visual debugging; Lighthouse is a convenient baseline.
- Manual keyboard and screen reader testing are required to cover the remaining 60-70% of potential issues.
- A complete strategy layers linting, automated testing, manual review, and user testing across the development lifecycle.
Next Steps
- Use testing tools to audit WCAG 2.2 compliance across your product.
- Apply manual testing to specific areas: forms, e-commerce checkout, data visualizations.
- Review ARIA best practices to fix the issues these tools flag.
Sources
- axe-core GitHub Repository — Open-source accessibility testing engine by Deque.
- WAVE Web Accessibility Evaluation Tool — WebAIM’s visual accessibility testing tool.
- WebAIM: WebAIM Million Report — Annual analysis of accessibility across the top million websites.
- Deque University: Automated Testing — Guidance on integrating automated accessibility testing.
- MDN Web Docs: Accessibility — Comprehensive developer accessibility reference.
Tool descriptions based on current versions as of early 2025. axe-core is open source under the Mozilla Public License 2.0. WAVE is maintained by WebAIM at Utah State University. Lighthouse is maintained by the Google Chrome team.