
TalkBack Quick Start for Android App Testing

by AUDITSU · 9 min read

You're running an audit on app.auditsu.com and a step has asked you to test something with TalkBack. This is the quick start that gets you there.

The AUDITSU platform walks you through the full audit mapped to WCAG 2.1 AA and EN 301 549. This post is the side-reference: just enough TalkBack to follow the platform's checks confidently on your device.

TalkBack is Google's built-in screen reader for Android. It ships on Pixel and is bundled (sometimes as "Screen reader" under Samsung) on every modern Android phone. Daily users have years of muscle memory and customised settings, and you don't need any of that. You need just enough to verify a finding, walk through a screen, and report what you hear. For everything beyond the basics, we link out to Google's official documentation.

The baseline here is TalkBack 15.x on Android 15. Pixel 9 ships with this by default. Older devices on TalkBack 14.x work the same way for the core gestures.

Enable TalkBack in two taps

The fastest path is voice activation:

  • Say "Hey Google, turn on TalkBack"
  • Say "Hey Google, turn off TalkBack" when you're done

For a hands-on approach:

  • Settings, then Accessibility, then TalkBack, then toggle on
  • Or use the volume key shortcut: press and hold both volume keys for a few seconds. TalkBack confirms aloud when it activates.

The Settings path differs slightly across phone makers. On Samsung One UI it sits under "Screen reader" rather than "TalkBack." On Xiaomi and others the wording may vary. Google confirms OEM variation but does not enumerate every menu structure.
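If you set up test devices over USB, TalkBack can also be toggled from a script by writing Android's secure accessibility settings through adb. A minimal sketch in Python that builds the command lines; the component name below is the stock Google TalkBack service, and OEM builds (notably Samsung) may use a different package, so treat it as an assumption to verify on your device:

```python
# Sketch: construct the adb invocations that enable or disable TalkBack
# by writing the secure accessibility settings. The component name is the
# stock Google TalkBack service; OEM builds may differ.
TALKBACK = ("com.google.android.marvin.talkback/"
            "com.google.android.marvin.talkback.TalkBackService")

def talkback_adb_cmds(enable: bool) -> list[list[str]]:
    """Return the adb command lines that toggle TalkBack on or off."""
    if enable:
        return [
            ["adb", "shell", "settings", "put", "secure",
             "enabled_accessibility_services", TALKBACK],
            ["adb", "shell", "settings", "put", "secure",
             "accessibility_enabled", "1"],
        ]
    return [
        ["adb", "shell", "settings", "delete", "secure",
         "enabled_accessibility_services"],
        ["adb", "shell", "settings", "put", "secure",
         "accessibility_enabled", "0"],
    ]

# Pass each command list to subprocess.run() to execute it against
# a connected device.
```

Note this overwrites the enabled-services list, so don't use it on a device where other accessibility services are active.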

Once TalkBack is on, every interaction changes. A single tap selects an item and reads it aloud. To activate a button you double-tap. Everything in the rest of this guide assumes TalkBack is running.

The six gestures you actually need

You don't need to memorise Google's full gesture list. For an audit, six gestures get you everywhere.

  • Single tap. Select the item under your finger and hear it announced.
  • Swipe right. Move to the next item in reading order.
  • Swipe left. Move to the previous item in reading order.
  • Double tap. Activate the focused item (button, link, switch). On TalkBack 15.1+ this works on links inside and outside WebViews.
  • Two-finger swipe up. Scroll forward through a list or page.
  • Two-finger single tap. Pause or resume speech.

With those six, you can walk any screen and confirm focus order, label quality, and that controls actually fire.

Two extras worth knowing for specific audit tasks:

  • Three-finger swipe up or down. Cycle through Reading Controls (next section). On older devices without multi-finger support the fallback is a single-finger swipe up-then-down or down-then-up.
  • Three-finger single tap. Open the TalkBack menu, which exposes custom actions on the focused item.

If you've used older TalkBack tutorials, you may have seen "angle gestures" (swipe right then down, swipe up then right). Those still work as fallbacks on devices without multi-finger support, but on modern Android they have been replaced by Reading Controls.

Google's complete gesture list is at Use TalkBack gestures.

Reading Controls, your audit shortcut

Reading Controls are TalkBack's contextual navigation dial, equivalent to VoiceOver's rotor on iOS. Three-finger swipe up or down to switch between categories like Headings, Controls, Links, Words, and Characters. Once a category is selected, swipe down with one finger for next, up for previous.

For an audit, three categories matter most.

  • Headings. Select Headings, swipe down. TalkBack should announce each heading and its level. On TalkBack 15.1+ you can also add individual levels (Heading 1, Heading 2, etc.) for direct hierarchy verification.
  • Controls. Select Controls, swipe through every interactive element. Each should announce a clear label plus its role ("button", "checkbox checked", "edit box"). Anything that reads as just "button" or "edit box" with no label is a missing-contentDescription bug.
  • Links. Select Links. Any link that reads as just "link, here" or as the raw URL has a labelling problem.

To customise which categories appear: Settings, then Accessibility, then TalkBack, then Settings, then Customize menus, then Customize reading controls. TalkBack 15.1 expanded the available categories significantly, adding Buttons, Edit fields, individual heading levels 1-6, Images and videos, Lists, Tables, and Visited links among others. The full reference is at Use the TalkBack menu & reading controls.
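The Controls sweep above can be partly pre-screened before you ever swipe: `adb shell uiautomator dump` writes the current view hierarchy as XML, and clickable nodes that expose neither `text` nor `content-desc` are exactly the elements TalkBack announces as a bare "button". A sketch in Python (the sample dump fragment is invented for illustration):

```python
import xml.etree.ElementTree as ET

def unlabeled_controls(dump_xml: str) -> list[str]:
    """Return class names of clickable nodes in a `uiautomator dump`
    hierarchy that expose neither text nor a content description --
    the nodes TalkBack will announce as a bare role."""
    root = ET.fromstring(dump_xml)
    hits = []
    for node in root.iter("node"):
        clickable = node.get("clickable") == "true"
        labelled = node.get("text") or node.get("content-desc")
        if clickable and not labelled:
            hits.append(node.get("class", "?"))
    return hits

# Hypothetical fragment: one labelled button, one unlabelled image button.
SAMPLE = """
<hierarchy rotation="0">
  <node class="android.widget.Button" clickable="true"
        text="Submit" content-desc=""/>
  <node class="android.widget.ImageButton" clickable="true"
        text="" content-desc=""/>
</hierarchy>
"""

print(unlabeled_controls(SAMPLE))  # → ['android.widget.ImageButton']
```

A static pass like this finds candidates; the TalkBack sweep is still what confirms how each one actually announces.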

Five settings to change before you audit

Out of the box, TalkBack is configured for someone who uses it every day. For a tester, a few tweaks help. All paths begin Settings, then Accessibility, then TalkBack, then Settings.

  • Speech rate. Drop to about 1.0× for your first sessions if the default feels fast. You can speed up later.
  • Verbosity. Switch the preset to Custom and turn everything on. High verbosity announces roles, states, hints, and capital letters, which is exactly what a tester needs.
  • Display speech output. Under Advanced settings, then Developer settings. Turn on. TalkBack overlays its spoken text on-screen, which is invaluable for bug-report screenshots.
  • Log output level. Under Advanced settings, then Developer settings. Set to VERBOSE. Pipes detailed events to adb logcat if you need to attach logs.
  • Automatic descriptions. Under Audio. Turn off for audits. This one matters enough to need its own section.

Google's settings reference is at Learn about TalkBack settings.
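With Log output level at VERBOSE, TalkBack's events land in `adb logcat`, mixed in with everything else the device emits. A small filter can pull just those lines out of a capture before you attach it to a finding. A sketch assuming the default threadtime log format and log tags starting with `TalkBack` (verify the exact tags your build emits; the sample messages are invented):

```python
import re

# Threadtime-format logcat line: date time PID TID level tag: message.
LOG_RE = re.compile(
    r"^\d{2}-\d{2} [\d:.]+\s+\d+\s+\d+\s+([VDIWEF])\s+([^:]+):\s?(.*)$"
)

def talkback_lines(logcat: str, tag_prefix: str = "TalkBack") -> list[str]:
    """Extract the messages of lines whose tag starts with `tag_prefix`
    from raw `adb logcat -v threadtime` output."""
    out = []
    for line in logcat.splitlines():
        m = LOG_RE.match(line)
        if m and m.group(2).strip().startswith(tag_prefix):
            out.append(m.group(3))
    return out

# Invented capture: one TalkBack line, one unrelated system line.
SAMPLE = "\n".join([
    "03-15 10:22:01.123  1234  5678 V TalkBack: FEEDBACK: Submit, button",
    "03-15 10:22:01.456  1234  5678 D WindowManager: relayout",
])
```

Feed it the output of `adb logcat -v threadtime -d > capture.txt` after reproducing the issue, and attach only the filtered lines to the report.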

Why Automatic Descriptions has to be off

Automatic descriptions is a TalkBack AI feature that synthesises announcements for any image without a contentDescription. If your app has an unlabelled ImageView, TalkBack might announce a sentence describing what the AI sees in it.

For daily users it's helpful. For testers it's a trap.

If automatic descriptions are on, missing-label bugs disappear into AI guesses that sound reasonable. Users with the feature off, or on devices where it isn't available, will hear nothing on those images. Your audit will miss the bug because the AI papered over it.

Rule: turn Automatic descriptions OFF before any audit. Only turn it on when you're intentionally testing how the AI handles your unlabelled images for that subset of users.

Path: Settings, then Accessibility, then TalkBack, then Settings, then Audio, then Automatic descriptions, then No descriptions. Google's overview is at Use image descriptions in TalkBack.

Pixel 9 runs the AI model on-device; other supported devices use a server-side model. Both can mask missing labels equally.

Your first walk-through

Before you head back to your AUDITSU audit, run this short loop once on any screen to get comfortable.

  1. Open the app to a screen with a few buttons and some text.
  2. Turn TalkBack on ("Hey Google, turn on TalkBack" is fastest).
  3. Tap the top-left of the screen. TalkBack focuses the first element.
  4. Swipe right a few times. TalkBack reads each element in order. You're listening for whether labels make sense.
  5. Three-finger swipe up to open Reading Controls. Swipe down with one finger to step through items in the active category.
  6. Turn TalkBack off when done ("Hey Google, turn off TalkBack").

Three minutes of this is usually enough to feel oriented. Now jump back to your audit, where the platform will tell you exactly what to listen for on each screen.

Common gotchas

A few things trip up testers on their first audits.

  • Double-tap doesn't activate a custom view. Usually the view receives focus but isn't marked clickable, or it handles raw touch instead of calling performClick(). File the bug; the developer needs to fix the view's accessibility delegate.
  • TalkBack reads nothing. Check that the container isn't marked android:importantForAccessibility="no". If the screen is a single WebView or custom Canvas, run the Accessibility Scanner to confirm nodes are exposed.
  • Focus stuck off-screen. Two-finger swipe to scroll, or two-finger triple-tap for "Read from focused item" to relocate.
  • Reading Controls stuck on speech rate. Symptom: swipe down adjusts speed instead of moving focus. Three-finger swipe up or down to cycle back to item-by-item navigation.
  • Custom actions missed. Apps register accessibility actions ("Dismiss", "Mark as read") with no on-screen button. Open the TalkBack menu (three-finger tap) on the focused item and look under Actions. Missing this is a common source of false "missing functionality" reports.
  • OEM differences. Samsung's accessibility stack overlays Google TalkBack. Pixel and Samsung can announce the same screen slightly differently. Test on the device families your real users use, not just Pixel.
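Two of the gotchas above leave fingerprints in layout source before you ever listen to a screen: a container hidden with android:importantForAccessibility="no", and an ImageView shipped without any contentDescription. A minimal static check in Python over a layout resource file (the sample layout and drawable name are invented):

```python
import xml.etree.ElementTree as ET

# Android attributes in layout XML live under this namespace.
ANDROID = "{http://schemas.android.com/apk/res/android}"

def layout_flags(layout_xml: str) -> list[str]:
    """Flag two audit-relevant patterns in an Android layout resource:
    elements hidden with importantForAccessibility="no", and ImageViews
    declared without a contentDescription attribute."""
    flags = []
    for el in ET.fromstring(layout_xml).iter():
        if el.get(ANDROID + "importantForAccessibility") == "no":
            flags.append(f"{el.tag}: hidden from accessibility")
        if el.tag == "ImageView" and (ANDROID + "contentDescription") not in el.attrib:
            flags.append("ImageView: missing contentDescription")
    return flags

# Invented layout exhibiting both patterns.
SAMPLE = """
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:importantForAccessibility="no">
  <ImageView android:src="@drawable/logo"/>
</FrameLayout>
"""
```

Both patterns can be legitimate (decorative images, deliberately hidden chrome), so treat hits as questions for the TalkBack walk-through, not as findings by themselves.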

Go deeper

For the gestures, settings, and APIs not covered here, start with Google's official TalkBack documentation linked throughout this guide: the gesture list, the menu and reading controls reference, and the settings reference.

Back to your audit

AUDITSU runs the full audit mapped to WCAG 2.1 AA and EN 301 549. The platform tells you which screens to test, what to listen for, and which findings need a real TalkBack check rather than an automated scan. TalkBack is the verification layer for the parts machines can't reach: focus order quality, real announcement clarity, custom action discoverability.

Head back to app.auditsu.com to continue. The six elements an EAA-credible audit grades against are the same whether the surface is Android, iOS, or web.

Not started an audit yet? AUDITSU helps you audit and evidence your app's accessibility alignment to the European Accessibility Act.