Accessibility Testing Tools: axe, WAVE, Lighthouse

By EZUD

Automated accessibility testing catches approximately 30-40% of WCAG violations. The other 60-70% require manual testing — keyboard walkthroughs, screen reader evaluation, and user research. Automated tools are essential but insufficient alone. Understanding what each tool detects (and misses) determines how to build a complete testing strategy.

The Big Three: axe, WAVE, Lighthouse

axe DevTools (Deque)

What it is: A browser extension (Chrome, Firefox, Edge) and JavaScript library (axe-core) for automated accessibility testing. axe-core is open-source and powers dozens of other tools.

What it detects:

  • Missing alt text on images.
  • Color contrast failures.
  • Missing form labels.
  • ARIA attribute errors (invalid roles, missing required properties).
  • Focus order issues.
  • Missing document language.
  • Empty headings and buttons.

What it misses:

  • Whether alt text is actually meaningful (it checks presence, not quality).
  • Logical reading order.
  • Whether keyboard focus indicators are visible and sufficient.
  • Whether ARIA roles match actual component behavior.
  • Whether content is understandable.

Best for: CI/CD integration. axe-core runs in Node.js, Selenium, Cypress, Playwright, and virtually every test runner. Adding axe.run() to your end-to-end tests catches regressions automatically.
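A minimal sketch of this kind of integration, assuming the @playwright/test and @axe-core/playwright npm packages (the URL and violation filter are placeholders to adapt to your project):

```javascript
// Sketch: running axe-core inside a Playwright end-to-end test.
// Requires a browser environment; install @playwright/test and @axe-core/playwright.
const { test, expect } = require("@playwright/test");
const { AxeBuilder } = require("@axe-core/playwright");

test("home page has no critical accessibility violations", async ({ page }) => {
  await page.goto("https://example.com/"); // placeholder URL

  const results = await new AxeBuilder({ page })
    .withTags(["wcag2a", "wcag2aa"]) // limit the scan to WCAG 2.x A/AA rules
    .analyze();

  // Fail the test (and the build) only on critical-impact violations.
  const critical = results.violations.filter((v) => v.impact === "critical");
  expect(critical).toEqual([]);
});
```

Filtering by impact is a judgment call: strict teams assert that `results.violations` is empty outright, while teams retrofitting accessibility often gate only on critical issues first.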

Pricing: axe DevTools browser extension has a free tier. axe DevTools Pro (paid) adds guided intelligent testing, issue grouping, and WCAG 2.2 coverage.

WAVE (WebAIM)

What it is: A browser extension (Chrome, Firefox) and web service that visually overlays accessibility information on the page.

What it detects:

  • Same core WCAG violations as axe (alt text, contrast, labels, ARIA).
  • Structural elements: headings, landmarks, lists.
  • Alerts for potential issues that need manual review (redundant alt text, suspicious link text, missing first-level heading).

What it misses:

  • Same categories as axe — automated tools share fundamental detection limits.
  • Does not integrate into CI/CD (it is a manual review tool).

Best for: Visual debugging during development and design review. The overlay showing icons on each element makes it easy to scan a page for issues. WAVE’s “Alerts” category flags items that are not definitive failures but warrant human review — a feature axe does not emphasize as strongly.

Pricing: Free browser extension and web service. WAVE API (for batch testing) is paid.

Lighthouse (Google)

What it is: An automated auditing tool built into Chrome DevTools. Runs accessibility, performance, SEO, and best-practices audits.

What it detects:

  • Uses axe-core as its accessibility engine, so detection capabilities overlap heavily with axe.
  • Produces a 0-100 accessibility score (useful for dashboards but potentially misleading — a 100 score does not mean full compliance).

What it misses:

  • Everything axe misses, because it uses axe-core under the hood.
  • The score incentivizes “pass all checks” rather than comprehensive accessibility.

Best for: Quick baseline audits and performance-focused teams already using Lighthouse. Good for initial awareness. Not sufficient as a primary accessibility testing tool.

Pricing: Free (built into Chrome).

Specialized Testing Tools

  • pa11y: CLI-based automated testing and CI/CD integration (open source).
  • Tenon: API-first automated testing (commercial).
  • SortSite: full-site crawling and accessibility audits (commercial).
  • Accessibility Insights (Microsoft): guided manual testing plus automated checks (free).
  • ARC Toolkit (TPGi): browser extension with detailed issue reporting (free).
  • Colour Contrast Analyser (TPGi): desktop app for color contrast checking (free).
  • Contrast Checker (WebAIM): online contrast ratio calculator (free).
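Contrast checkers like the two above implement the WCAG 2.x relative luminance and contrast ratio formulas, which can be sketched in plain JavaScript (sRGB channel values 0-255):

```javascript
// WCAG 2.x relative luminance of an sRGB color given as [r, g, b], 0-255 each.
function relativeLuminance([r, g, b]) {
  const linear = [r, g, b].map((channel) => {
    const s = channel / 255;
    // Undo sRGB gamma encoding (piecewise, per the WCAG definition).
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * linear[0] + 0.7152 * linear[1] + 0.0722 * linear[2];
}

// Contrast ratio between two colors: (L_lighter + 0.05) / (L_darker + 0.05).
function contrastRatio(colorA, colorB) {
  const [lighter, darker] = [
    relativeLuminance(colorA),
    relativeLuminance(colorB),
  ].sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05); // ranges from 1:1 up to 21:1
}
```

White on black yields the maximum ratio of 21:1; WCAG 2.2 AA requires at least 4.5:1 for normal text and 3:1 for large text.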

Manual Testing (the Other 60-70%)

Automated tools cannot replace these manual checks:

Keyboard Testing

Unplug the mouse. Tab through every page. Verify:

  • Every interactive element is reachable.
  • Focus order is logical.
  • Focus indicators are visible (meeting WCAG 2.2 focus requirements).
  • No keyboard traps exist.
  • Custom widgets respond to expected keys (Enter, Space, Escape, arrows).

For detailed keyboard testing methodology, see our keyboard navigation guide.
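The "expected keys" check lends itself to a small data-driven helper. The key lists below are an illustrative subset loosely following the ARIA Authoring Practices patterns, not an exhaustive specification:

```javascript
// Illustrative subset of keys each widget role is expected to handle.
const EXPECTED_KEYS = {
  button: ["Enter", " "],
  listbox: ["ArrowUp", "ArrowDown", "Home", "End"],
  dialog: ["Escape", "Tab"],
};

// Given a role and the keys a custom widget actually handles,
// return the expected keys it fails to support.
function missingKeys(role, handledKeys) {
  return (EXPECTED_KEYS[role] || []).filter(
    (key) => !handledKeys.includes(key)
  );
}
```

For example, a custom button that only responds to Enter would report Space as missing, a common defect in div-based buttons.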

Screen Reader Testing

Navigate with a screen reader and verify:

  • All content is announced in a logical order.
  • Interactive elements have meaningful names, roles, and states.
  • Dynamic content updates are announced via live regions.
  • Forms are navigable and errors are announced.

Test with at least two combinations: NVDA + Firefox on Windows and VoiceOver + Safari on macOS. See screen reader compatibility for details.
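Dynamic updates are a common failure point in these walkthroughs. A minimal live-region pattern to verify during testing (illustrative markup; the element ID and message are placeholders):

```html
<!-- Text inserted into this container is announced by screen readers
     without moving focus; aria-live="polite" waits for a pause in speech. -->
<div role="status" aria-live="polite" id="save-status"></div>

<script>
  // After a successful save, updating the text triggers an announcement.
  document.getElementById("save-status").textContent = "Changes saved.";
</script>
```

Note that the live region must exist in the DOM before the update; injecting the container and its message at the same time is a frequent reason announcements are silently dropped.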

Content Review

  • Alt text quality: Is the description meaningful, not just present?
  • Heading hierarchy: Does the structure create a logical outline?
  • Link text: Does every link make sense out of context?
  • Language: Is the content written in plain language at an appropriate reading level?
  • Error messages: Do they identify the problem and suggest a fix?

Building a Testing Strategy

Development Phase

  1. Linting: eslint-plugin-jsx-a11y (React), vue-axe (Vue), or equivalent catches issues at code time.
  2. Unit/integration tests: axe-core assertions in component tests catch regressions.
  3. Pre-commit hooks: Run axe-core on changed templates before code is merged.

QA Phase

  1. Automated full-page scan: axe DevTools or WAVE on every unique template.
  2. Keyboard walkthrough: Manual testing of every user flow.
  3. Screen reader walkthrough: NVDA + VoiceOver on critical flows.

Release Phase

  1. CI/CD gate: axe-core in the pipeline blocks deploys with critical accessibility regressions.
  2. Accessibility audit: Quarterly manual audit against the full WCAG 2.2 AA checklist.

Post-Release

  1. Monitoring: Continuous automated scans of production pages.
  2. User testing: Sessions with assistive technology users at least once per major release.

Key Takeaways

  • Automated tools (axe, WAVE, Lighthouse) catch 30-40% of WCAG issues — they detect presence, not quality.
  • axe-core is the best choice for CI/CD integration; WAVE is best for visual debugging; Lighthouse is a convenient baseline.
  • Manual keyboard and screen reader testing are required to cover the remaining 60-70% of potential issues.
  • A complete strategy layers linting, automated testing, manual review, and user testing across the development lifecycle.

Sources

Tool descriptions based on current versions as of early 2025. axe-core is open source under the Mozilla Public License 2.0. WAVE is maintained by WebAIM at Utah State University. Lighthouse is maintained by the Google Chrome team.