What Automated Testing Misses
Why axe-core and Lighthouse aren't enough for true accessibility
Introduction
Automated accessibility testing tools like axe-core are invaluable for catching structural issues. However, they can check only about 30% of WCAG success criteria. This page demonstrates errors that automated tools will NOT catch but that are critical accessibility issues.
Bad Alt Text (Not Caught)
Automated tools check if alt text EXISTS, but they cannot evaluate if it's GOOD.
✅ Good Alt Text
alt="Woman in business attire presenting to a meeting"
Clear, descriptive alt text helps users understand the image's context and purpose.
❌ Bad Alt Text
alt="image123.jpg"
Filename alt text is useless to screen reader users—they hear "image one two three jpg" instead of understanding the image.
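In markup, the contrast might look like this (the src values are illustrative); note that both images carry a non-empty alt attribute, so both pass an automated check:

<!-- Good: describes what the image shows -->
<img src="presenter.jpg" alt="Woman in business attire presenting to a meeting">

<!-- Bad: a filename, yet structurally indistinguishable from good alt text -->
<img src="image123.jpg" alt="image123.jpg">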
More Bad Alt Text Examples (All Pass Axe)
These images all have alt text, so axe-core will report every one of them as passing, but they're terrible for users:
Example 1: Generic "Image"
alt="Image" — Not helpful
Example 2: Filename
alt="photo_2024_01_19_3947.jpg" — Meaningless
Example 3: Generic "Picture"
alt="Picture" — Too vague
Example 4: Redundant "Photo"
alt="Photo" — Users already know it's an image
Semantic Misuse (Not Caught)
Axe can check that elements have valid roles, but it cannot determine if a role is APPROPRIATE in context.
These are technically valid but semantically wrong:
❌ Button marked as heading
role="heading" aria-level="2" on a button — Confusing to screen reader users
❌ Text in interactive role
role="button" on plain div without keyboard handling — Only visually appears interactive
❌ Inappropriate list usage
- This is a navigation menu
- But it's marked as a list
- When it should be wrapped in a nav landmark
Technically valid HTML, but without a nav landmark, assistive technology can't identify this list as the site's navigation
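A sketch of the fix (link targets and the label are illustrative). The list itself can stay; what's missing is the landmark around it:

<!-- Just a list: no landmark, so assistive technology sees generic content -->
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/products">Products</a></li>
</ul>

<!-- Correct: the nav element exposes a navigation landmark -->
<nav aria-label="Main">
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/products">Products</a></li>
  </ul>
</nav>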
Visual Relationships Not Marked Up (Not Caught)
Axe can't determine if related elements should be grouped together.
Related form fields shown visually but not semantically:
Issue: These fields are visually grouped as a date input, but they're marked up as three separate, unrelated fields. Axe won't catch this because each field has a label. A screen reader user won't know these fields belong together.
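A minimal sketch of the problem and the fix (field names and ids are illustrative). Wrapping the fields in a fieldset with a legend exposes the grouping, and screen readers typically announce the legend when focus enters the group:

<!-- Three individually labeled fields; nothing ties them together -->
<label for="day">Day</label>     <input id="day" type="text">
<label for="month">Month</label> <input id="month" type="text">
<label for="year">Year</label>   <input id="year" type="text">

<!-- Correct: fieldset and legend mark the fields as one group -->
<fieldset>
  <legend>Date of birth</legend>
  <label for="dob-day">Day</label>     <input id="dob-day" type="text">
  <label for="dob-month">Month</label> <input id="dob-month" type="text">
  <label for="dob-year">Year</label>   <input id="dob-year" type="text">
</fieldset>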
Content Appropriateness (Not Caught)
Automated tools cannot determine if list content is actually appropriate for a list:
- Product price: $19.99
- Shipping: Free
- Total: $19.99
Issue: This should be a description list or a table, not an unordered list. Axe will pass it because the HTML is technically valid, but it's not semantically correct for the content.
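One way to express the same content semantically, sketched below. A description list pairs each label with its value, so a screen reader conveys the relationships instead of three disconnected items:

<!-- Passes axe: structurally valid, semantically wrong -->
<ul>
  <li>Product price: $19.99</li>
  <li>Shipping: Free</li>
  <li>Total: $19.99</li>
</ul>

<!-- Better: each term is explicitly paired with its value -->
<dl>
  <dt>Product price</dt> <dd>$19.99</dd>
  <dt>Shipping</dt>      <dd>Free</dd>
  <dt>Total</dt>         <dd>$19.99</dd>
</dl>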
Hidden Content & Keyboard Navigation (Not Caught)
While axe checks for keyboard accessibility, it cannot determine if keyboard users have adequate ways to navigate:
Missing skip links
This page lacks a "Skip to main content" link. Keyboard and screen reader users must tab through all navigation before reaching content. Axe won't catch this.
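A typical fix looks like this (the class name and id are illustrative; the link is usually hidden visually until it receives keyboard focus):

<body>
  <!-- First focusable element on the page -->
  <a class="skip-link" href="#main-content">Skip to main content</a>
  <header><!-- site header --></header>
  <nav><!-- site navigation --></nav>
  <main id="main-content">
    <!-- page content -->
  </main>
</body>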
Key Takeaway
Automated Testing Is 30% of the Solution
Axe-core and similar tools cover about 30% of WCAG success criteria. They check:
- ✅ That alt attributes exist
- ✅ That ARIA attributes are valid
- ✅ That lists have correct structure
- ✅ That color contrast meets standards
But they CANNOT check:
- ❌ If alt text is meaningful
- ❌ If a role is contextually appropriate
- ❌ If related content is grouped correctly
- ❌ If content in lists belongs in lists
- ❌ If the page has adequate navigation aids
Always combine automated testing with manual testing by real users with disabilities.