
What Automated Testing Misses

Why axe-core and Lighthouse aren't enough for true accessibility

Introduction

Automated accessibility testing tools like axe-core are invaluable for catching structural issues. However, they can check only about 30% of WCAG success criteria. This page demonstrates errors that will NOT be caught by automated tools but are critical accessibility issues.

Bad Alt Text (Not Caught)

Automated tools check if alt text EXISTS, but they cannot evaluate if it's GOOD.

✅ Good Alt Text

alt="Woman in business attire presenting to a meeting"


Clear, descriptive alt text helps users understand the image's context and purpose.

❌ Bad Alt Text

alt="image123.jpg"


Filename alt text is useless to screen reader users—they hear "image one two three jpg" read aloud instead of learning what the image shows.
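The contrast above can be shown in markup. Both of these images pass axe-core's "images must have alternate text" rule, because the tool only verifies that an `alt` attribute exists (the file names here are illustrative):

```html
<!-- Good: describes the image's content and purpose -->
<img src="presentation.jpg"
     alt="Woman in business attire presenting to a meeting">

<!-- Bad: a filename conveys nothing to a screen reader user,
     yet this still passes automated checks -->
<img src="image123.jpg" alt="image123.jpg">
```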

More Bad Alt Text Examples (All Pass Axe)

These images all have alt text, so axe-core will report them as PASSING. But they're terrible for users:


Example 1: Generic "Image"


alt="Image" — Not helpful

Example 2: Filename


alt="photo_2024_01_19_3947.jpg" — Meaningless

Example 3: Generic "Picture"


alt="Picture" — Too vague

Example 4: Redundant "Photo"


alt="Photo" — Users already know it's an image

Semantic Misuse (Not Caught)

Axe can check that elements have valid roles, but it cannot determine whether a role is APPROPRIATE in context.

These are technically valid but semantically wrong:

❌ Button marked as heading

role="heading" aria-level="2" on a button — Confusing to screen reader users

❌ Text in interactive role

This looks like a button but doesn't work

role="button" on plain div without keyboard handling — Only visually appears interactive

❌ Inappropriate list usage

Technically valid HTML, but the list structure is semantically wrong for this navigation content
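The first two misuses above can be sketched in markup. Each passes axe's role-validity checks; the third snippet shows one way to repair the fake button (the `activate()` handler is a hypothetical placeholder for whatever the real button does):

```html
<!-- Button announced as a heading: screen reader users hear a
     level-2 heading, not an actionable control -->
<button role="heading" aria-level="2">Settings</button>

<!-- Looks interactive, but a plain div with role="button" is neither
     focusable nor operable from the keyboard -->
<div role="button">This looks like a button but doesn't work</div>

<!-- Fix: prefer a real <button>; if a div must be used, add the
     missing focus and keyboard behavior yourself -->
<div role="button" tabindex="0"
     onclick="activate()"
     onkeydown="if (event.key === 'Enter' || event.key === ' ') activate()">
  Now keyboard-operable
</div>
```

A native `<button>` element gives you focus, keyboard activation, and correct semantics for free, which is why it is almost always the better choice.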

Visual Relationships Not Marked Up (Not Caught)

Axe can't determine if related elements should be grouped together.

Related form fields shown visually but not semantically:




Issue: These fields are visually grouped as a date input, but they're marked up as three separate, unrelated fields. Axe won't catch this because each field has a label. A screen reader user won't know these fields belong together.
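A minimal sketch of the problem and one fix, assuming the three fields are month, day, and year (the original labels are not shown, so these names are illustrative). Axe passes both versions because every field has a label:

```html
<!-- Problem: visually one date input, but three unrelated fields
     as far as assistive technology is concerned -->
<label for="m">Month</label> <input id="m" name="m">
<label for="d">Day</label>   <input id="d" name="d">
<label for="y">Year</label>  <input id="y" name="y">

<!-- Fix: a fieldset with a legend tells screen readers the
     fields belong together -->
<fieldset>
  <legend>Date of birth</legend>
  <label for="month">Month</label> <input id="month" name="month">
  <label for="day">Day</label>     <input id="day" name="day">
  <label for="year">Year</label>   <input id="year" name="year">
</fieldset>
```

With the fieldset, a screen reader announces "Date of birth, Month" when the first field receives focus, restoring the grouping that was previously only visual.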

Content Appropriateness (Not Caught)

Automated tools cannot determine if list content is actually appropriate for a list:

Issue: This should be a definition list or table, not an unordered list. Axe will pass because the HTML is technically valid, but it's not semantically correct for the content.
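As a sketch (the list content here is illustrative, since the original list is not shown), key-value content forced into an unordered list is valid HTML but loses the term-description relationship that a definition list conveys:

```html
<!-- Valid HTML, but the key-value structure is invisible to
     assistive technology -->
<ul>
  <li>Name: Ada Lovelace</li>
  <li>Role: Programmer</li>
</ul>

<!-- Better: a definition list exposes each term and its description -->
<dl>
  <dt>Name</dt> <dd>Ada Lovelace</dd>
  <dt>Role</dt> <dd>Programmer</dd>
</dl>
```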

Hidden Content & Keyboard Navigation (Not Caught)

While axe checks for keyboard accessibility, it cannot determine if keyboard users have adequate ways to navigate:

Missing skip links

This page lacks a "Skip to main content" link. Keyboard and screen reader users must tab through all navigation before reaching content. Axe won't catch this.
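A skip link is simple to add: it is the first focusable element on the page, pointing at the main content region. It is commonly hidden off-screen until it receives keyboard focus (the `skip-link` class name is an assumption; any styling hook works):

```html
<body>
  <a href="#main" class="skip-link">Skip to main content</a>
  <nav>
    <!-- long site navigation keyboard users can now bypass -->
  </nav>
  <main id="main">
    <!-- page content -->
  </main>
</body>
```

No automated rule fails when this link is absent, which is exactly why its omission slips through axe and Lighthouse.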

Key Takeaway

Automated Testing is 30% of the Solution

Axe-core and similar tools can catch about 30% of WCAG success criteria. They check:

- That alt attributes exist
- That ARIA roles and attributes are valid
- That form fields have labels
- Other programmatically detectable issues, such as color contrast and document structure

But they CANNOT check:

- Whether alt text actually describes the image
- Whether roles and semantics fit the content
- Whether visually related fields are grouped in the markup
- Whether keyboard users have adequate ways to navigate, such as skip links

Always combine automated testing with manual testing by real users with disabilities.
