VoiceOver Quick Start for iOS App Testing

by AUDITSU · 7 min read

You're running an audit on app.auditsu.com and a step has asked you to test something with VoiceOver. This is the quick start that gets you there.

The AUDITSU platform walks you through the full audit mapped to WCAG 2.1 AA and EN 301 549. This post is the side-reference: just enough VoiceOver to follow the platform's checks confidently on your device.

VoiceOver is Apple's built-in screen reader for iOS. It ships on every iPhone and iPad. Daily users have years of muscle memory and customised settings, and you don't need any of that. You need just enough to verify a finding, walk through a screen, and report what you hear. For everything beyond the basics, we link out to Apple's official documentation.

Enable VoiceOver in two taps

The fastest path is voice activation:

  • Say "Hey Siri, turn on VoiceOver"
  • Say "Hey Siri, turn off VoiceOver" when you're done

For a hands-on approach:

  • Settings, then Accessibility, then VoiceOver, then toggle on
  • Or set up the Accessibility Shortcut: Settings, then Accessibility, then Accessibility Shortcut, then tick VoiceOver. Triple-click the side button (Face ID iPhones) or Home button (Touch ID iPhones) to toggle.

Once VoiceOver is on, every interaction changes. A single tap selects an item and reads it aloud. To activate a button you double-tap. Everything in the rest of this guide assumes VoiceOver is running.

The six gestures you actually need

You don't need to memorise Apple's full gesture list. For an audit, six gestures get you everywhere.

  • Single tap. Select the item under your finger and hear it announced.
  • Swipe right. Move to the next item in reading order.
  • Swipe left. Move to the previous item in reading order.
  • Double tap. Activate the focused item (button, link, switch).
  • Two-finger swipe down. Read all from the current position.
  • Two-finger tap. Pause or resume speech.

With those six, you can walk any screen and confirm focus order, label quality, and that controls actually fire.

Two extras worth knowing for specific audit tasks:

  • Two-finger twist. Opens the rotor (next section).
  • Three-finger triple-tap. Screen Curtain. Turns the display off while VoiceOver keeps running. Useful when you want to prove a screen is operable without sight. Repeat the gesture to bring the screen back.

Apple's complete gesture list is at Use VoiceOver gestures on iPhone.

The rotor, your audit shortcut

The rotor is VoiceOver's contextual navigation dial. Twist two fingers on the screen and you can switch between categories like Headings, Links, Form Controls, Words, and Lines. Once a category is selected, swipe up or down with one finger to jump between items in that category.

For an audit, three rotor categories matter most.

  • Headings. Twist to Headings, swipe down. VoiceOver should announce each heading and its level ("Heading level 2"). If the hierarchy is wrong or headings are missing, you'll hear it immediately.
  • Form Controls. Twist to Form Controls, swipe through every input. Any field that reads as bare "text field" or "button" with no descriptive label is a missing-accessibilityLabel bug.
  • Links. Twist to Links. A link that reads as just "link" with no context, or as the raw URL, has a labelling problem.
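On the developer side, both of those labelling bugs usually come down to a missing accessibilityLabel. A minimal UIKit sketch (the button and label text are hypothetical, for illustration only):

```swift
import UIKit

// Hypothetical icon-only button: with no label, the rotor's
// Form Controls category announces it as just "button".
let playButton = UIButton(type: .system)
playButton.setImage(UIImage(systemName: "play.fill"), for: .normal)

// With an accessibilityLabel set, VoiceOver announces
// "Play, button" instead.
playButton.accessibilityLabel = "Play"
```

When you file the bug, quoting exactly what VoiceOver announced ("button" versus "Play, button") makes the missing label obvious to the developer.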

To customise which categories appear: Settings, then Accessibility, then VoiceOver, then Rotor. The full reference is at Control VoiceOver using the rotor.
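For context on where the "Heading level 2" announcements come from: in SwiftUI apps they map to the accessibility heading modifiers. A hedged sketch (the view and its content are hypothetical):

```swift
import SwiftUI

// Hypothetical settings section: the heading trait plus an explicit
// level is what the rotor's Headings category reads out.
struct SettingsSection: View {
    var body: some View {
        VStack(alignment: .leading) {
            Text("Notifications")
                .font(.headline)
                .accessibilityAddTraits(.isHeader)
                .accessibilityHeading(.h2)  // announced as "heading level 2"
            Toggle("Email alerts", isOn: .constant(true))
        }
    }
}
```

If a heading is announced with no level, or not announced as a heading at all, that is the symptom to put in your finding.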

Four settings to change before you audit

Out of the box, VoiceOver is configured for someone who uses it every day. For a tester, four tweaks help. All paths begin Settings, then Accessibility, then VoiceOver.

  • Speaking Rate. Drag it down to about 30%. Default is fast. Slow it down for your first few sessions.
  • Caption Panel. Turn on. VoiceOver shows what it's reading as on-screen text. Invaluable for filing bugs with screenshots.
  • Large Cursor. Turn on. The focus rectangle thickens, making it easier to follow with your eyes while VoiceOver speaks.
  • Screen Recognition, under VoiceOver Recognition. Turn off for audits. This one matters enough to need its own section.

Why Screen Recognition has to be off

Screen Recognition is an Apple AI feature that generates synthetic labels for any control lacking one. If your app has an unlabelled button, Screen Recognition might announce it as "Possibly Play button."

For daily users it's helpful. For testers it's a trap.

If Screen Recognition is on, missing-label bugs disappear into AI guesses that sound reasonable. Users on older devices, or users who haven't enabled the feature, will hear nothing on those buttons. Your audit will miss the bug because the AI papered over it.

Rule: turn Screen Recognition OFF before any audit. Only turn it on when you're intentionally testing how the AI handles your unlabelled UI for that subset of users.

Path: Settings, then Accessibility, then VoiceOver, then VoiceOver Recognition, then Screen Recognition. Apple's overview is at Use VoiceOver Recognition.

Your first walk-through

Before you head back to your AUDITSU audit, run this short loop once on any screen to get comfortable.

  1. Open the app to a screen with a few buttons and some text.
  2. Turn VoiceOver on ("Hey Siri, turn on VoiceOver" is fastest).
  3. Tap the top-left of the screen. VoiceOver focuses the first element.
  4. Swipe right a few times. VoiceOver reads each element in order. You're listening for whether labels make sense.
  5. Two-finger twist to open the rotor. Swipe up or down with one finger to step through items in the active category.
  6. Turn VoiceOver off when done ("Hey Siri, turn off VoiceOver").

Three minutes of this is usually enough to feel oriented. Now jump back to your audit, where the platform will tell you exactly what to listen for on each screen.

Common gotchas

A few things trip up testers on their first audits.

  • Double-tap not registering on a custom button. The developer wired a gesture recogniser directly instead of overriding accessibilityActivate(). Increase Double-tap Timeout (Settings, then Accessibility, then VoiceOver, then Double-tap Timeout) to rule out timing first. If split-tap works (hold one finger on the item, tap with another) but double-tap doesn't, file the bug.
  • Focus jumping back to the top of a list. Usually a parent view re-rendering and resetting focus. Note the gesture that triggered the jump.
  • "Actions available" with no obvious action. Open the rotor, swipe to Actions, swipe up or down to step through custom actions. Double-tap to perform.
  • Modal that traps focus. The dialog needs accessibilityViewIsModal = true AND a working close affordance. If both are present and focus is still stuck, file the bug.
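The last two gotchas correspond to two UIKit accessibility overrides. A minimal sketch of both, with hypothetical class names, showing the shape of the fix you'd expect the developer to make:

```swift
import UIKit

// Hypothetical custom control: overriding accessibilityActivate()
// lets VoiceOver's double-tap fire the same action as a direct tap.
final class PlayPauseControl: UIControl {
    override func accessibilityActivate() -> Bool {
        sendActions(for: .touchUpInside)  // reuse the existing tap handler
        return true                       // tell VoiceOver activation succeeded
    }
}

// Hypothetical dialog view: marking it modal keeps VoiceOver focus
// inside the dialog until it is dismissed.
final class DialogView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        accessibilityViewIsModal = true
    }
    required init?(coder: NSCoder) { fatalError("init(coder:) not supported") }
}
```

You don't need to write this code as a tester, but recognising the pattern helps you describe the bug precisely.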

Go deeper

For the gestures, settings, and APIs not covered here, Apple's VoiceOver section of the iPhone User Guide is the canonical reference; the three Apple guides linked above are good entry points.

Back to your audit

AUDITSU runs the full audit mapped to WCAG 2.1 AA and EN 301 549. The platform tells you which screens to test, what to listen for, and which findings need a real VoiceOver check rather than an automated scan. VoiceOver is the verification layer for the parts machines can't reach: focus order quality, real announcement clarity, custom action discoverability.

Head back to app.auditsu.com to continue. The six elements an EAA-credible audit grades against are the same whether the surface is iOS, Android, or web.

Not started an audit yet? AUDITSU helps you audit and evidence your app's accessibility alignment to the European Accessibility Act.