Most accessibility audits do not test what regulators actually check.
The European Accessibility Act has been enforceable since 28 June 2025. Eleven months in, the audits being sold under the EAA banner vary wildly in scope. Some test only the website. Some run a WCAG scanner against a homepage and call the work done. Few of them grade against the six elements that an EAA regulator, or a private complainant in Germany via consumer or competition law citing a BFSG breach, will actually look at.
This is a plain-English breakdown of those six elements. The framework comes from Article 13 and Annex V of Directive (EU) 2019/882, the EAA itself, and from the established model template set out in WAD Implementing Decision (EU) 2018/1523. If your accessibility audit does not produce findings against all six, it is not an EAA audit. It might be a useful piece of work, but it is not the audit a regulator will recognise.
AUDITSU grades these six elements automatically against any consumer app or website. We will come back to that at the end.
An EAA accessibility audit is a statement audit
The single most common misunderstanding about EAA audits: an EAA audit is not a code-level WCAG scan. It is an audit of your accessibility statement and the conformity claims that statement makes.
Under the Act, the statement is the legally binding artifact. It declares your conformance status to the public and to the regulator. The audit checks whether that statement exists, whether it cites the right standard, whether it is honest about gaps, and whether it gives users a working route to flag barriers. The requirements for the statement itself are covered in our Article 13 breakdown.
A WCAG scanner output is an input to the audit, not the audit itself. The scanner finds specific code-level violations. The accessibility audit asks a different question: does the statement reflect those violations, does it cover the mobile app and not just the marketing site, is the contact path real, does the named enforcement body match the user's jurisdiction.
This is why a £15,000 consultancy audit and a £197 per month tool can both produce valid EAA audits. The scope is the statement and its supporting evidence, not the line-by-line code. A consultancy that hands you a 200-page WCAG findings document without grading the statement has missed the regulatory test. An automated tool that grades the statement against Article 13 and runs the scan underneath has met it.
An EAA audit grades the statement. The scanner grades the code. You need both, but only one of them is what regulators read.
Most companies coming to us already have scanner output in a Jira board somewhere. What they do not have is the grade against the statement, because nothing in their existing tooling produces it.
The 6 elements an EAA accessibility audit checks
These six elements come from the WAD Implementing Decision (EU) 2018/1523 model, the established public-sector accessibility statement template since 2018. The structure has been adapted to private-sector EAA contexts through national transpositions: Germany's BFSG, France's Loi 2023-171, and Spain's Law 11/2023 all expect a structured statement along these lines. Article 13 of the EAA, Directive (EU) 2019/882, sets the obligation to publish service-accessibility information, and Annex V sets the categories the information must cover. The WAD model is what national regulators across the EU recognise when they grade what gets published.
An accessibility audit checklist that does not name all six is incomplete.
Accessibility audit checklist:
- Accessibility statement published
- EN 301 549 referenced as the standard
- Compliance status declared
- Non-accessible content listed
- Mobile app in scope
- Feedback and enforcement contact
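The checklist above maps naturally onto a small grading structure. The sketch below is purely illustrative (element names, the `Finding` record, and the `grade` function are hypothetical, not AUDITSU's implementation): each element gets a pass or fail plus a piece of evidence, and the overall grade fails if any element is missing or failed.

```python
# Minimal sketch of a six-element grade. Names are illustrative.
from dataclasses import dataclass

ELEMENTS = [
    "statement_published",
    "en_301_549_referenced",
    "compliance_status_declared",
    "non_accessible_content_listed",
    "mobile_app_in_scope",
    "feedback_and_enforcement_contact",
]

@dataclass
class Finding:
    element: str
    passed: bool
    evidence: str  # URL, snippet, or screenshot reference

def grade(findings: list[Finding]) -> dict:
    """Overall grade: every element needs a finding, and all must pass."""
    covered = {f.element for f in findings}
    missing = [e for e in ELEMENTS if e not in covered]
    failed = [f.element for f in findings if not f.passed]
    return {
        "complete": not missing,
        "passed": not missing and not failed,
        "missing_elements": missing,
        "failed_elements": failed,
    }
```

The point of the structure is the completeness check: an audit that never looks at, say, mobile scope does not merely pass that element by default, it is flagged as incomplete.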
The rest of this section expands each one with what auditors look for, what passes, and what fails.
1. Accessibility statement published
The statement is a public, discoverable document at a stable URL. Common locations include /accessibility, /accessibility-statement, or a clearly named link in the site footer.
What auditors look for: does the URL resolve, is it linked from the homepage or footer, is it discoverable inside the app (in app store metadata or in the app's settings menu).
What passes: a present, dated statement linked from the consumer entry point. The link is one click from the homepage and named clearly enough that a user can find it without a search.
What fails: missing entirely, returns a 404, hidden behind a login, only mentioned inside the privacy policy, or buried two levels deep under a "Legal" submenu. Guidance in several member states is explicit that the statement must be discoverable from any page, not just from the legal landing page.
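The discoverability half of this check can be automated with a simple link scan of the homepage HTML. The sketch below is a heuristic, not a legal test: the candidate paths and link texts are assumptions drawn from the common locations named above.

```python
# Sketch: find a link that looks like an accessibility statement.
from html.parser import HTMLParser

STATEMENT_PATHS = ("/accessibility", "/accessibility-statement")  # common, not exhaustive
STATEMENT_TEXTS = ("accessibility", "accessibility statement")

class LinkFinder(HTMLParser):
    """Collect (href, link text) pairs from a page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, " ".join(self._text).strip()))
            self._href = None

def statement_link(html: str):
    """Return the first plausible statement link, or None."""
    finder = LinkFinder()
    finder.feed(html)
    for href, text in finder.links:
        if href.lower().rstrip("/").endswith(STATEMENT_PATHS) or \
           text.lower() in STATEMENT_TEXTS:
            return href
    return None
```

A `None` result on the homepage is exactly the "buried two levels deep" failure: the statement may exist somewhere, but a user starting from the consumer entry point cannot find it.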
2. EN 301 549 referenced as the standard
The statement names the technical standard it is measuring the product against. EN 301 549 v3.2.1 is the current harmonised standard cited in the EAA's implementing measures, and WCAG 2.1 Level AA is incorporated by reference for web content.
What auditors look for: is the standard named explicitly, is the version cited, is the citation correct.
What passes: an explicit reference such as "EN 301 549 v3.2.1, incorporating WCAG 2.1 Level AA." Some statements also reference the platform-specific clauses in EN 301 549 Chapter 11 for native mobile apps.
A note on WCAG 2.2: the W3C published WCAG 2.2 as a Recommendation in October 2023. EN 301 549 v3.2.1 still formally references WCAG 2.1, so a statement citing v3.2.1 today is current. The forthcoming EN 301 549 v4 is expected to reference WCAG 2.2 directly. A statement that ignores WCAG 2.2 in its scan plan will be out of date the moment v4 lands.
What fails: vague language such as "we follow accessibility best practices" with no named standard. References to outdated versions (EN 301 549 v2.x) also fail. The harmonised standard moved on, and a statement citing a version superseded years ago is itself evidence the document is stale.
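This check is mechanical enough to script. The sketch below looks for an explicit EN 301 549 citation and flags superseded versions; the version threshold reflects the state described above and would need rechecking once EN 301 549 v4 is published.

```python
# Sketch: detect an EN 301 549 citation and whether the version is current.
import re

EN_PATTERN = re.compile(r"EN\s*301\s*549\s*v?(\d+)\.(\d+)(?:\.(\d+))?", re.I)

def standard_citation(statement_text: str) -> dict:
    m = EN_PATTERN.search(statement_text)
    if not m:
        return {"named": False, "version": None, "current": False}
    version = tuple(int(g) for g in m.groups() if g is not None)
    return {
        "named": True,
        "version": version,
        # v2.x is superseded; v3.2.1 is the current harmonised version.
        "current": version >= (3, 2, 1),
    }
```

The "named: False" branch is the "we follow best practices" failure: no standard cited at all, which is a finding regardless of how the product actually performs.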
3. Compliance status declared
The statement says, plainly, where the product currently stands. Three positions are recognised under the WAD template and used by EAA enforcement bodies: fully compliant, partially compliant, or non-compliant.
What auditors look for: is one of the three states actually declared, is the declaration unambiguous, does it match what the underlying audit data shows.
What passes: an explicit state, with reasons. "Partially compliant" with a list of known gaps is the most common outcome for mid-market products and the most defensible position to hold.
What fails: hedge language such as "we strive for accessibility" or "the site is broadly accessible." None of these are recognised states under the EAA. A statement claiming full compliance for a product that visibly fails WCAG 2.1 Level AA is also a finding, because the public claim does not match the conformance evidence underneath.
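A crude lexical version of this check: look for one of the three recognised states, and flag hedge phrases that map to none of them. The phrase lists below are illustrative, taken from the examples in this section.

```python
# Sketch: classify the declared compliance status.
RECOGNISED = ("fully compliant", "partially compliant", "non-compliant")
HEDGES = ("strive for accessibility", "broadly accessible", "best practices")

def declared_status(statement_text: str):
    """Return (status, hedged): the first recognised state found, if any,
    and whether the text also contains hedge language."""
    lower = statement_text.lower()
    status = next((s for s in RECOGNISED if s in lower), None)
    hedged = any(h in lower for h in HEDGES)
    return status, hedged
```

A real grader would also cross-check the declared state against the scanner output, since a "fully compliant" declaration over a failing product is itself a finding.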
4. Non-accessible content listed
An honest list of the parts of the product that do not meet the standard, with named exceptions for technical limits, third-party content, or disproportionate burden claims under Article 14.
What auditors look for: does the list exist, is it specific enough to be falsifiable (a regulator can verify the named feature behaves as claimed), are exceptions cited correctly.
What passes: a named list of non-conformant features with a documented reason for each. "Video content uploaded by users does not have closed captions; captioning third-party uploads is outside our control and is claimed as a disproportionate burden under Article 14" is the kind of specificity that holds up under review.
What fails: an empty list (which implies full compliance and is contradicted by the scanner output), generic boilerplate such as "some content may not be fully accessible," or a list that does not match the conformance data underneath. "We did not get to it yet" is not a recognised exception.
5. Mobile app in scope
If the company ships a native mobile app, the statement covers the app and not just the marketing website. This is the single biggest gap in mid-market portfolios.
What auditors look for: does the statement mention iOS or Android, is there a separate statement linked from inside the app or from the app store listing, are mobile-specific clauses (in particular EN 301 549 Chapter 11) referenced. The Chapter 11 reference matters because the success criteria for native mobile differ from the web. Our breakdown of EN 301 549 Chapter 11 covers the specific clauses.
What passes: an explicit mobile section in the main statement, or a separate statement reachable from the app's About or Help menu. Some companies publish two statements, one for web and one for mobile, which is the cleanest pattern.
What fails: a web-only statement on a company that ships a mobile app. A regulator who can find the app on the App Store or Play Store and cannot find a statement for it has a documented finding before they have run a single test.
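A first-pass version of this check is purely lexical: does the statement text acknowledge the platforms and the mobile clauses at all? The sketch below is a heuristic with obvious false-positive risk (substring matching), offered only to show the shape of the check.

```python
# Sketch: does a statement's text cover the native apps at all?
def mobile_coverage(statement_text: str, ships_mobile_app: bool) -> list[str]:
    """One finding string per missing mobile signal; empty list means pass."""
    if not ships_mobile_app:
        return []
    lower = statement_text.lower()
    findings = []
    if "ios" not in lower and "android" not in lower:
        findings.append("statement does not mention iOS or Android")
    if "chapter 11" not in lower and "clause 11" not in lower:
        findings.append("no reference to the EN 301 549 mobile clauses")
    return findings
```

Passing this lexical check is necessary, not sufficient: the substantive Chapter 11 testing against the actual builds still has to happen.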
6. Feedback and enforcement contact
A working route for users to flag accessibility barriers, and a route to escalate to the national enforcement body if the company does not respond within a stated window.
What auditors look for: is there a working email or contact form, is the response time committed to (national guidance in several member states expects a response within 30 working days), is the relevant enforcement body named for the user's jurisdiction. In Germany the relevant body is the Land authority designated under the BFSG. In France, national enforcement sits with the DGCCRF; in November 2025 disability organisations filed public injunctions against Auchan, Carrefour, E.Leclerc, and Picard, the first significant private-actor enforcement under the French transposition. In Spain, enforcement is distributed across sectoral regulators under Law 11/2023, with OADIS acting as the national accessibility observatory. Other member states have similar bodies, all listed in the national transposition acts.
What passes: a feedback path that works (an email that does not bounce, a form that does not silently fail), a stated response time, and the relevant enforcement body named with a link.
What fails: feedback only with no escalation route. Enforcement only with no feedback channel. A contact email that bounces. A form that returns a generic success message but does not actually deliver to a human.
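The static half of this check, before any test message is sent, is verifying the statement names both halves of the route. The sketch below uses an email address as a proxy for a feedback contact and a caller-supplied list of enforcement body names; both simplifications are assumptions for illustration.

```python
# Sketch: does the contact section contain both a feedback route and
# a named enforcement body? One finding per missing half.
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def contact_findings(section_text: str, enforcement_bodies: list[str]) -> list[str]:
    findings = []
    if not EMAIL.search(section_text):
        findings.append("no feedback contact (no email address found)")
    lower = section_text.lower()
    if not any(body.lower() in lower for body in enforcement_bodies):
        findings.append("no enforcement body named")
    return findings
```

This only proves the statement is complete on paper. Whether the address actually delivers to a human is the live test described in the audit steps below, and a bounce there is a separate finding.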
A regulator-credible accessibility audit produces a finding for every element on every product the company ships. AUDITSU's snapshot grades all six in around 30 seconds, from a domain. That output becomes the input to the full audit underneath.
How a real accessibility audit runs
Six steps, in order. The first two and the last two are statement-level. The middle two are code-level. The mistake most teams make is buying steps 3 and 4 alone and calling the work an audit.
1. Discovery. Pull the existing accessibility statement, if one exists. Pull the app's store listing on iOS and Android. Confirm the scope of what is being audited: which products, which versions, which markets.
2. Statement grading. Run the six-element check. Produce a pass or fail per element with evidence (URL, snippet, screenshot). This is the artifact that maps directly to Article 13 and Annex V.
3. Substantive conformance. For the parts of the product the statement claims are conformant, run the WCAG and EN 301 549 scan. Cross-check the scanner output against the statement's claims. A statement claiming "fully compliant" against a homepage that fails three Level A criteria is a documented contradiction, and contradictions are the easiest findings to support.
4. Mobile coverage. If the company ships a native app, run the EN 301 549 Chapter 11 clauses against the iOS and Android builds. Mobile-specific criteria (target sizes, gesture alternatives, platform accessibility API support) are not visible to a desktop scanner.
5. Feedback loop. Send a test message through the published feedback contact. Note whether it delivers, how long the response takes, and whether the response addresses the substance. A bounced email or silent form is a finding on its own.
6. Findings report. Produce a document with one finding per failed element plus one finding per WCAG violation. This is what the company remediates against, and what a market surveillance authority can request under EAA Article 19 onwards (and the relevant national transposition) to verify the underlying work.
A statement audit without the substantive scan is shallow. A scan without the statement audit is misdirected. The two together are the audit the EAA expects.
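The final step reduces to a simple aggregation. The sketch below is illustrative only: it merges the two finding streams, statement-level first, because the statement findings are what a regulator reads before any code-level item.

```python
# Sketch of the findings report: one entry per failed statement element
# plus one per WCAG violation, statement-level findings first.
def findings_report(failed_elements: list[str], wcag_violations: list[str]) -> list[str]:
    report = [f"STATEMENT: {e}" for e in failed_elements]
    report += [f"WCAG: {v}" for v in wcag_violations]
    return report
```

The ordering is the remediation order: fix the public claims first, then the code underneath them, so the statement never over-claims while the backlog is worked down.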
What an EAA accessibility audit is not
Three things worth naming directly.
Not a one-time event. The statement is dated. A statement last updated 18 months ago is itself a finding under most national guidance. Audits that do not include a maintenance plan produce a document that is out of date the moment the next product release ships.
Not a WCAG scanner output. Scanner output is a feed of code-level violations. The audit reads that feed against the statement's public claims. A scanner alone cannot tell you whether the statement is honest, whether it covers the mobile app, or whether the contact path works.
Not optional now that enforcement is live. The question is no longer whether regulators will ask. The first public injunctions in France landed in November 2025. The German BFSG carries a maximum administrative fine of EUR 100,000 per breach. Companies still treating the EAA as a nice-to-have are working from a 2024 mental model.
How to run an EAA accessibility audit on your own app
Three viable paths, in increasing order of resilience.
Hand-write the audit. Pull your statement, grade it against the six elements above, run a scanner against your homepage and three core flows, write up the findings. Free, slow, manually verifiable. Suitable for a single-product company with internal accessibility expertise.
Buy a one-off external audit. A consultancy runs the six-element check plus the substantive scan and hands you a findings document. Typical cost is GBP 15,000 to GBP 50,000 per app on a six to twelve week engagement. The output is accurate at delivery and stale six months later. Our EAA compliance software buyer's guide compares this against the platform option.
Use a platform that grades continuously. AUDITSU runs the six-element grade automatically plus the substantive scan against EN 301 549 and WCAG 2.1 Level AA. The statement is generated from live findings, not from a template, so it stays current as the product changes. Rule-based grading, not AI: every finding maps to a specific clause and can be defended.
There is no "no audit" option that produces a credible position. Skipping the audit and writing a statement anyway leaves you with a public document the company cannot back up if a regulator asks for the file behind it.
Run the six-element grade on your own app
If you want the six-element grade on your own product, AUDITSU produces it from a domain or app store URL. The output names every gap, ties each one to the Article 13 element it sits under, and gives you a remediation order to work from. Request a free EAA snapshot, or read how the full continuous audit works at the statement generator page. Beta access to the audit platform is GBP 197 per month, with the first month free.