With the European Accessibility Act (EAA) now in effect and 72% of mobile user journeys containing critical accessibility barriers, mobile app testing has become a business-critical requirement. Over 1.3 billion people worldwide experience significant disability, and 91% of screen reader users access content via mobile devices. This guide provides practical, actionable steps to test your iOS and Android apps against WCAG 2.2, EN 301 549, and platform-specific guidelines.
What is mobile accessibility testing?
Mobile accessibility testing verifies that iOS and Android applications can be used by people with disabilities, including those who rely on screen readers, switch controls, voice commands, or modified display settings. It combines automated scanning, manual evaluation with assistive technologies, and validation with real users.
Why mobile accessibility testing matters now
The regulatory landscape has shifted dramatically. The EAA enforcement deadline of June 28, 2025 applies to any company trading in the EU, covering mobile apps in banking, e-commerce, transport, and telecommunications. In the United States, over 8,800 ADA digital accessibility lawsuits were filed in 2024, with 75% targeting businesses under $25 million in revenue.
The business case is equally compelling. The disability community controls over $6 trillion in global spending power. Forrester Research calculates a $100 return for every $1 invested in accessibility. Meanwhile, 40% of 2024 lawsuits targeted companies that had been sued previously, demonstrating that reactive fixes fail to address underlying issues.
Understanding WCAG 2.2 mobile requirements
Web Content Accessibility Guidelines (WCAG) 2.2, published in October 2023, introduced several criteria specifically relevant to mobile interfaces. While WCAG uses "web" in its name, these guidelines are the global standard for native mobile apps and form the technical basis for EN 301 549 compliance. For a detailed breakdown of what changed, see our guide to WCAG 2.2 changes.
Key mobile-specific criteria
2.5.8 Target Size (Minimum) at Level AA requires interactive elements to be at least 24×24 CSS pixels, or have sufficient spacing to prevent accidental activation. Platform guidelines set higher bars: Apple requires 44×44 points and Android requires 48×48 dp.
2.5.7 Dragging Movements at Level AA mandates single-pointer alternatives for any functionality that uses dragging, such as sliders, drag-and-drop interfaces, or list reordering. Users with motor impairments who rely on head wands or eye-tracking cannot perform drag gestures.
2.4.11 Focus Not Obscured at Level AA prevents sticky headers, floating action buttons, or cookie banners from completely hiding the currently focused element. This affects users navigating with keyboards or switch controls.
3.3.8 Accessible Authentication at Level AA prohibits requiring cognitive function tests like CAPTCHAs or memorised passwords as the only authentication method. Apps must support biometrics, password managers, or copy-paste functionality.
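The 2.5.8 size-or-spacing rule lends itself to a simple geometric check. The sketch below is illustrative only: the `Target` type is hypothetical, and the centre-distance comparison is a rough stand-in for the normative spacing exception (which is defined in terms of non-overlapping 24px circles).

```kotlin
import kotlin.math.hypot

// Illustrative sketch of WCAG 2.5.8 Target Size (Minimum).
// The Target type and the centre-distance simplification are
// assumptions, not the normative WCAG test procedure.
data class Target(val x: Double, val y: Double, val width: Double, val height: Double) {
    val centerX get() = x + width / 2
    val centerY get() = y + height / 2
}

const val WCAG_MIN = 24.0 // WCAG 2.2 Level AA minimum, CSS pixels

// Passes if the target is at least 24x24, or if every neighbouring
// target's centre is at least 24px away (spacing exception, simplified).
fun meetsTargetSizeMinimum(target: Target, neighbours: List<Target>): Boolean {
    if (target.width >= WCAG_MIN && target.height >= WCAG_MIN) return true
    return neighbours.all { other ->
        hypot(target.centerX - other.centerX, target.centerY - other.centerY) >= WCAG_MIN
    }
}
```

Remember that passing 24×24 only satisfies WCAG; platform reviews will still expect the larger 44pt/48dp minimums.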
How mobile accessibility differs from web
Mobile apps use platform-specific accessibility APIs rather than HTML and ARIA. iOS uses the UIAccessibility protocol with properties like accessibilityLabel and accessibilityTraits. Android uses AccessibilityNodeInfo with contentDescription and semantic properties. For more on these differences, see why mobile accessibility is more challenging.
Touch interactions replace mouse precision. Fingertips contact an area of 16-20mm, eliminating hover states and requiring larger touch targets. Apps must also respect system settings including Dynamic Type for text scaling, Reduce Motion for animation preferences, and Bold Text for enhanced visibility.
Both platforms generate an Accessibility Tree parallel to the visual view hierarchy. Screen readers query this tree, not the pixels on screen. If an element is not in the tree, it is invisible to assistive technology regardless of how visible it appears visually.
Testing on iOS with VoiceOver
VoiceOver is Apple's built-in screen reader, used by 70% of mobile screen reader users according to the WebAIM Survey. Testing with VoiceOver should be your primary iOS accessibility validation method.
Enabling VoiceOver
Navigate to Settings > Accessibility > VoiceOver and toggle it on. For faster access during testing, configure the Accessibility Shortcut to triple-click the Side button by going to Settings > Accessibility > Accessibility Shortcut and selecting VoiceOver.
Essential VoiceOver gestures
- Swipe right moves focus to the next element
- Swipe left moves focus to the previous element
- Double tap activates the focused element
- Two-finger double tap (Magic Tap) triggers the primary action, like pausing media
- Two-finger rotate opens the Rotor for navigation options
- Three-finger triple tap toggles Screen Curtain to test without visual cues
VoiceOver testing workflow
Start by enabling Screen Curtain to simulate the experience of a blind user. Navigate through every screen using swipe gestures. Verify that all elements announce meaningful labels, not just "button" or "image." Check that the focus order matches the visual layout from top to bottom, left to right. Use the Rotor to navigate by headings and verify your heading structure is logical.
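The top-to-bottom, left-to-right expectation can also be checked programmatically if you can export element identifiers and origins from the accessibility tree. A minimal sketch, with a hypothetical `Frame` type standing in for whatever your test harness provides:

```kotlin
// Hypothetical sketch: derive the expected reading order from element
// origins (top-to-bottom, then left-to-right) and compare it to the
// focus order a screen reader actually follows.
data class Frame(val id: String, val x: Int, val y: Int)

fun expectedReadingOrder(frames: List<Frame>): List<String> =
    frames.sortedWith(compareBy({ it.y }, { it.x })).map { it.id }

fun focusOrderMatchesLayout(focusOrder: List<String>, frames: List<Frame>): Boolean =
    focusOrder == expectedReadingOrder(frames)
```

This catches gross ordering bugs automatically, but manual VoiceOver testing remains the authority on whether the order actually makes sense.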
Accessibility Inspector in Xcode
Access via Xcode > Open Developer Tool > Accessibility Inspector. This tool connects to the iOS Simulator or a physical device and displays the Accessibility Tree without requiring audio output.
The Inspection mode lets you click any element to see its label, traits, hints, and frame. The Audit feature scans the current screen for contrast issues, undersized touch targets, and missing labels. The Settings panel simulates Dynamic Type sizes and Reduce Motion without changing device settings.
Voice Control testing
iOS Voice Control allows users to navigate entirely by voice. Enable it via Settings > Accessibility > Voice Control and say "Show Names" to overlay accessibility labels on all interactive elements. This immediately reveals missing labels and verifies that labels match visible text, which is required by WCAG 2.5.3 Label in Name.
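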
Switch Control testing
Switch Control assists users with motor impairments who cannot tap the screen. Configure it in Settings > Accessibility > Switch Control and test that all interactive elements receive focus, the scan order is logical, and no focus traps exist where users get stuck in a loop.
Testing on Android with TalkBack
TalkBack is Google's screen reader, used by 28% of mobile screen reader users. Enable it via Settings > Accessibility > TalkBack or by holding both volume keys for three seconds.
Essential TalkBack gestures
- Swipe right moves to the next item
- Swipe left moves to the previous item
- Double tap activates the focused item
- Swipe up then down changes navigation granularity (headings, links, controls)
- Swipe down then right opens the TalkBack menu
For debugging, enable "Display speech output" in TalkBack's developer settings to see text captions of announcements.
Accessibility Scanner
Google's Accessibility Scanner app, available on the Play Store, provides automated auditing on the device itself. Navigate to any screen, tap the floating checkmark button, and the scanner captures a screenshot and analyses the view hierarchy for contrast issues, touch targets below 48×48 dp, and missing content descriptions.
The scanner cannot evaluate label quality or navigation flow. Combine it with manual TalkBack testing for comprehensive coverage.
Switch Access testing
Switch Access allows navigation via external switches or the volume keys. Test that focusable items appear in logical order and no focus traps exist in custom views or WebViews.
Automated testing in your build pipeline
Automated accessibility testing catches 20-40% of WCAG violations. This includes missing labels, contrast failures, undersized touch targets, and structural issues. Automation cannot evaluate label quality, logical reading order, or the usability of complex interactions.
iOS automated testing with XCTest
Xcode 15 and iOS 17 introduced accessibility audits in XCTest:
```swift
func testAccessibilityAudits() throws {
    let app = XCUIApplication()
    app.launch()
    try app.performAccessibilityAudit()
}
```
You can filter specific audit types:
```swift
try app.performAccessibilityAudit(for: [.contrast, .dynamicType])
```
Android automated testing with Espresso
Enable accessibility checks in your Espresso tests with a custom test runner:
```kotlin
class AccessibilityChecksTestRunner : AndroidJUnitRunner() {
    init {
        AccessibilityChecks.enable()
            .setRunChecksFromRootView(true)
            .setThrowExceptionFor(AccessibilityCheckResultType.ERROR)
    }
}
```
Cross-platform automated testing
For React Native, use react-native-accessibility-engine for Jest matchers and eslint-plugin-react-native-a11y for static analysis during development.
For Flutter, the built-in Accessibility Guideline API tests tap targets and contrast:
```dart
testWidgets('Follows a11y guidelines', (tester) async {
  final handle = tester.ensureSemantics();
  await tester.pumpWidget(const MyApp());
  await expectLater(tester, meetsGuideline(androidTapTargetGuideline));
  await expectLater(tester, meetsGuideline(iOSTapTargetGuideline));
  await expectLater(tester, meetsGuideline(textContrastGuideline));
  handle.dispose();
});
```
CI/CD integration
Add accessibility checks to your build pipeline to catch regressions early:
```yaml
name: Accessibility Check
on: [push]
jobs:
  access-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run accessibility tests
        run: ./gradlew connectedAndroidTest
```
Set quality gates to fail builds on critical errors, but use severity thresholds so not every issue blocks deployment.
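One way to express such a severity gate, as an illustrative sketch — the `Finding` and `Severity` types here are hypothetical, not any particular scanner's output format:

```kotlin
// Hypothetical CI severity gate: block the build on error-level
// findings, surface warnings without failing deployment.
enum class Severity { ERROR, WARNING, INFO }
data class Finding(val check: String, val severity: Severity)

// Fails the build if any finding is at or above the threshold severity
// (ERROR is the most severe, so a lower ordinal means more severe).
fun shouldFailBuild(findings: List<Finding>, threshold: Severity = Severity.ERROR): Boolean =
    findings.any { it.severity.ordinal <= threshold.ordinal }
```

Teams typically start with an ERROR-only gate and tighten the threshold as the backlog shrinks.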
Manual testing checklist
Since automation catches less than half of accessibility issues, manual testing is essential. Use this checklist on physical devices.
Visual and touch verification
- Touch targets — 44×44pt on iOS, 48×48dp on Android (WCAG 2.5.5, 2.5.8)
- Text contrast — 4.5:1 for normal text, 3:1 for large text (WCAG 1.4.3)
- UI component contrast — 3:1 for borders and icons (WCAG 1.4.11)
- Colour not sole indicator — Icons or text must accompany colour (WCAG 1.4.1)
- Text scaling — Works at 200% without clipping (WCAG 1.4.4)
- Orientation — Supports both portrait and landscape (WCAG 1.3.4)
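The contrast thresholds in the checklist above come from WCAG's relative-luminance formula, which you can compute directly when spot-checking colour pairs:

```kotlin
import kotlin.math.pow

// WCAG relative luminance and contrast ratio, per the WCAG 2.x definitions.
fun linearise(channel: Int): Double {
    val s = channel / 255.0
    return if (s <= 0.03928) s / 12.92 else ((s + 0.055) / 1.055).pow(2.4)
}

fun relativeLuminance(r: Int, g: Int, b: Int): Double =
    0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b)

// Ratio ranges from 1:1 (identical colours) to 21:1 (black on white).
fun contrastRatio(fg: IntArray, bg: IntArray): Double {
    val l1 = relativeLuminance(fg[0], fg[1], fg[2])
    val l2 = relativeLuminance(bg[0], bg[1], bg[2])
    return (maxOf(l1, l2) + 0.05) / (minOf(l1, l2) + 0.05)
}
```

For example, #767676 on white scrapes past the 4.5:1 threshold for normal text, while the visually similar #777777 narrowly fails it.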
Screen reader verification
- Navigation order — Focus moves logically (left-to-right, top-to-bottom)
- Element roles — Buttons announced as "button", links as "link"
- Element labels — Icons have descriptive labels ("Close" not "X")
- Element state — "Selected", "Checked", "Disabled" announced correctly
- Hints — Complex elements explain how to interact
- Decorative images — Hidden from screen reader
- Dynamic updates — Success and error messages announced
- Modal traps — Focus stays in modal, returns to trigger on close
Hardware verification
- Focus visibility — Clear highlight ring on focused element
- Keyboard navigation — All actions possible without touch
- Gesture alternatives — Single-tap alternatives for complex gestures
Common accessibility issues and fixes
Missing accessibility labels
This is the most common issue: without a label, screen readers announce only "unlabeled button", giving users no context.
SwiftUI fix:
```swift
Button(action: { /* action */ }) {
    Image(systemName: "heart.fill")
}
.accessibilityLabel("Add to favorites")
```
Jetpack Compose fix:
```kotlin
Icon(
    imageVector = Icons.Filled.Share,
    contentDescription = stringResource(R.string.label_share)
)
```
Insufficient touch targets
Affects 35% of mobile users. Platform minimums are 44×44pt (iOS) and 48×48dp (Android).
SwiftUI fix:
```swift
Button(action: { /* action */ }) {
    Image(systemName: "xmark")
        .padding(12)
        .contentShape(Rectangle())
}
.frame(minWidth: 44, minHeight: 44)
```
Jetpack Compose fix:
```kotlin
Icon(
    imageVector = Icons.Default.Close,
    contentDescription = "Close",
    modifier = Modifier
        .clickable { onClose() }
        .minimumInteractiveComponentSize()
)
```
Fixed font sizes ignoring Dynamic Type
Users with low vision cannot enlarge text. Use system text styles and scalable units.
SwiftUI fix:
```swift
@ScaledMetric var iconSize: CGFloat = 24

Image(systemName: "star")
    .frame(width: iconSize, height: iconSize)
```
Jetpack Compose fix:
```kotlin
// Use sp (scalable pixels) for text, never dp
Text("Title", fontSize = 20.sp)
```
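The difference shows up directly in the pixel math. A sketch mirroring the semantics of Android's DisplayMetrics `density` and `fontScale` fields (the function names are illustrative):

```kotlin
// dp ignores the user's font-size preference; sp multiplies it in.
// density and fontScale mirror Android's DisplayMetrics fields.
fun dpToPx(dp: Float, density: Float): Float = dp * density
fun spToPx(sp: Float, density: Float, fontScale: Float): Float = sp * density * fontScale
```

At density 2.0 with a font scale of 1.3 (user has enlarged text), 20sp renders at roughly 52px while 20dp stays fixed at 40px, which is exactly why fixed dp font sizes lock out low-vision users.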
Dynamic content not announced
Form submissions or state changes go unnoticed by screen reader users.
Jetpack Compose fix:
```kotlin
Text(
    text = message,
    modifier = Modifier.semantics {
        liveRegion = LiveRegionMode.Polite
    }
)
```
React Native fix:
```jsx
// Imperative announcement (works on both iOS and Android)
AccessibilityInfo.announceForAccessibility('Form submitted successfully');

// Declarative live region (Android only)
<Text accessibilityLiveRegion="polite">
  {statusMessage}
</Text>
```
Testing with real users
Even combined, automated and technical manual testing leave a substantial share of issues undetected. Judgements about label quality, navigation flow, and cognitive burden require real people using their own assistive technology.
When to involve users with disabilities
Include user testing during the design phase for wireframe and prototype feedback, during development for build testing, and before release for final validation. Focus on "Red Routes" — the critical user journeys like login, checkout, and key workflows.
Recruiting participants
Reach out to disability organisations, university disability services, and specialised research panels. Allow 2-4 weeks for recruitment. Screen for the specific assistive technologies and device configurations you need to test.
Remote testing setup
For iOS testing via Zoom, have the participant join from their iPhone, tap Share Content > Screen, and enable "Share Device Audio" to capture VoiceOver speech.
For Android testing, have the participant share their screen with the "Share Audio" toggle enabled to transmit TalkBack speech.
Plan 60-minute session gaps to accommodate setup time. Compensate participants fairly, recognising that skilled assistive technology users possess specialised expertise.
Frequently asked questions
Can mobile accessibility testing be fully automated?
Automated tools catch 20-40% of accessibility issues, including missing labels, contrast failures, and undersized touch targets. Manual testing with screen readers and user validation with people with disabilities are required for comprehensive coverage.
What is the difference between VoiceOver and TalkBack?
VoiceOver is Apple's screen reader for iOS, used by around 70% of mobile screen reader users. TalkBack is Google's screen reader for Android, used by 28%. Both serve the same purpose but use different gestures and have different announcement behaviours.
Is mobile app accessibility legally required?
Yes, in most jurisdictions. The European Accessibility Act requires accessible mobile apps in the EU from June 2025. In the United States, the Robles v. Domino's ruling confirmed that ADA Title III applies to mobile apps. The UK Equality Act 2010 requires reasonable adjustments for disabled users.
How often should I test my app for accessibility?
Integrate automated accessibility checks into your CI/CD pipeline to catch regressions on every build. Conduct manual screen reader testing before each release. Include user testing with people with disabilities for major feature launches.
What touch target size should I use?
Apple requires 44×44 points, Android requires 48×48 dp, and WCAG 2.2 Level AA requires 24×24 CSS pixels or adequate spacing. Use the platform-specific minimum to meet both platform guidelines and WCAG requirements.
Start testing today
Mobile accessibility testing requires a layered approach: automated scanning in your build pipeline catches the obvious issues, manual testing with VoiceOver and TalkBack validates navigation and announcements, and user testing with people with disabilities confirms real-world usability.
With the EAA now in effect and mobile app lawsuit precedents established, the question is no longer whether to implement accessibility testing but how quickly you can integrate it into your existing quality processes. Start with automated testing in your build pipeline, add screen reader testing to your QA checklist, and plan user testing before major releases.
For a comparison of how US and European regulations differ, read our guide on EAA vs ADA compliance requirements.
Learn how AUDITSU helps teams integrate accessibility testing into their development workflow.