The Invisible UI – iOS Accessibility
This post is long overdue. Since I became a full-time iOS developer, the discovery that has left the greatest impression on me is the accessibility support built into iOS. Tucked away in a corner, the accessibility API has largely been ignored by the majority of app developers. In fact, I was only vaguely aware of its existence, and only by name at that.
That is, until I met Victor Tsaran, the Accessibility Lead at Yahoo, in person.
Victor has been with the company for a long time and is renowned as an advocate for improving accessibility support in our web products. His role is essential, as most product managers and developers have never seen a screen reader in action, let alone learned how to build products that support one well. I strongly suggest checking out Victor’s TED talk here, as well as a clip, recorded at the Karlsruhe Institute of Technology in Germany, in which he demos how he uses the iPhone as a visually impaired user. It’s mind-blowing.
With VoiceOver mode turned on, the iPhone empowers visually impaired users by giving them ways to send commands to the device that are not possible in the standard mode. A continuous stream of verbal feedback cues users in on the state of the app and the results of their actions.
Perhaps this is easier to understand with an example. Take a look at the app I put together below. It has a UIScrollView at the top showing a set of album covers, and a UIWebView below that loads an album’s last.fm page when its cover is tapped. For a sighted user, a quick look and a few exploratory swipe gestures would be enough to figure out what the app is about and how to use it.
The second clip below shows the exact same build of the app running with VoiceOver mode on (Settings > General > Accessibility > VoiceOver). You’ll need to turn your audio on to hear the verbal feedback from the device. Since the video was recorded after being AirPlay-mirrored from an iPad, you won’t see any fingers or mouse cursors manipulating the screen. In any case, I was using double-tap and three-finger scroll gestures to navigate between the UI elements.
How awful was that? VoiceOver announced the album covers simply as “buttons”, without indicating what kind of buttons they are or what tapping them might do. Swiping in the scroll view brought up an unhelpful “Page x of 5” announcement. And nothing is reported to the user after a button has been tapped. In this state, the app is downright unusable for visually impaired users. Let’s tackle those issues one by one.
The announcement over the album covers (i.e. “buttons”) can be fixed by setting the accessibilityLabel and accessibilityHint properties on each UIButton, like so:
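Here is a minimal sketch of the idea; albumButton, the album object, and its artworkImage and name properties are hypothetical stand-ins for the app’s real code:

```objc
// Hypothetical model object standing in for the app's real data.
UIButton *albumButton = [UIButton buttonWithType:UIButtonTypeCustom];
[albumButton setImage:album.artworkImage forState:UIControlStateNormal];

// Announced first when the element is highlighted, e.g. "Abbey Road".
albumButton.accessibilityLabel = album.name;

// Announced after a short pause, telling the user what tapping will do.
albumButton.accessibilityHint = @"Loads this album's last.fm page below.";
```

Since UIButton is already an accessibility element by default, setting these two properties is all it takes.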
Now whenever a button is tapped or highlighted in VoiceOver mode, the album name (accessibility label) is announced, followed by a further instruction for action (accessibility hint).
The next thing we want is to notify users that the webview has finished loading and their album info is ready. For this, we make the view controller the webview’s delegate and, in webViewDidFinishLoad:, call UIAccessibilityPostNotification to announce the event.
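A sketch of how that delegate method might look, assuming the controller has already been set as the webview’s delegate (the announcement string is my own wording):

```objc
// The view controller adopts UIWebViewDelegate and is assumed to have been
// assigned as self.webView.delegate, e.g. in viewDidLoad.
- (void)webViewDidFinishLoad:(UIWebView *)webView
{
    // Have VoiceOver speak a short announcement once the page is ready.
    UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification,
                                    @"Album information loaded.");
}
```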
We’re almost there. One last improvement I wanted to make was ensuring that the three-finger scroll wouldn’t cause the scroll view to jump to arbitrary positions. This meant implementing the accessibilityScroll:(UIAccessibilityScrollDirection)direction method and controlling exactly how far to move the scroll view when the gesture is detected.
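Here is one way that might look in a UIScrollView subclass. The paging math assumes each album page is exactly one bounds-width, and that a left swipe advances to the next page; posting UIAccessibilityPageScrolledNotification also lets us replace the unhelpful “Page x of 5” announcement with our own:

```objc
// Sketch of a paging implementation, assuming a horizontally paged
// scroll view whose pages are each exactly one bounds-width wide.
- (BOOL)accessibilityScroll:(UIAccessibilityScrollDirection)direction
{
    CGFloat pageWidth = self.bounds.size.width;
    NSInteger pageCount = (NSInteger)ceil(self.contentSize.width / pageWidth);
    NSInteger page = (NSInteger)round(self.contentOffset.x / pageWidth);

    // A three-finger swipe left advances one page; right goes back one.
    if (direction == UIAccessibilityScrollDirectionLeft) {
        page = MIN(page + 1, pageCount - 1);
    } else if (direction == UIAccessibilityScrollDirectionRight) {
        page = MAX(page - 1, 0);
    } else {
        return NO; // Only horizontal scrolling is handled here.
    }

    [self setContentOffset:CGPointMake(page * pageWidth, 0) animated:YES];

    // Tell VoiceOver where we landed so it announces a meaningful position.
    UIAccessibilityPostNotification(UIAccessibilityPageScrolledNotification,
        [NSString stringWithFormat:@"Album %ld of %ld",
                                   (long)page + 1, (long)pageCount]);
    return YES; // Report that the scroll action was handled.
}
```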
Let’s pack all of these in and re-run the app:
The difference is so profound that it felt as if someone was narrating the interaction. With just a few lines of code and a little bit of imagination, we have opened up our app to an often-neglected group of users. With this, I hope more mobile developers will take the extra hour or so to learn about the Accessibility API and adopt it in their apps. A little bit really does go a long, long way.