Apple didn’t make a big deal out of it, but its iOS-based devices (iPhone, iPod touch, iPad) offer a revolutionary approach to accessibility for users who are blind or visually impaired. The devices are built around a touch screen that relies on Multi-Touch gestures applied by the user to objects on the screen. So how could this ever be made accessible?
Touch-screen devices have been available to sighted users for quite a while; what’s new is that the user can use a finger or fingers instead of a special stylus. For blind and low-vision users, the fact that Apple figured out how to make this complex and extremely visual interface accessible is truly revolutionary. Indeed, it flies in the face of the commonly held belief that a touch-screen interface would be nearly impossible to make accessible.
Apple succeeded by integrating two important accessibility tools: VoiceOver and Zoom. When either tool is activated, the behavior of the screen gestures changes. Here’s how they work:
Obviously, a person who is blind must know what’s on the screen before activating an icon. So when VoiceOver is running, simply touching an icon does not activate it; instead, VoiceOver speaks the icon’s name. To activate the icon, the user performs another gesture: a double-tap or a split-tap.
Double-tapping with one finger activates the last item spoken by VoiceOver; split-tapping does the same. There are also flick and swipe gestures to adjust a volume control, read continuously, move the insertion point back and forth through text, and scroll back and forth through pages. One of the cleverest is the rotor, which involves rotating two fingers on the screen clockwise or counter-clockwise. The rotor acts as a control center for the other flick and swipe gestures: use it to choose whether to navigate by word, line, heading, table, or form control when browsing, or to speed up the voice or change languages. There is more to VoiceOver than I have room to discuss here, including very nice braille support; to learn more, go to apple.com/accessibility and read about VoiceOver in all of its incarnations.
Zoom is the low-vision alternative to VoiceOver. It doesn’t need as many gestures, because the user can see what’s on the screen; its gestures mainly allow for easy enlargement of the screen image.
Zoom has a three-finger double-tap gesture to jump instantly from 100% to 200% magnification, and a three-finger double-tap-and-drag to move anywhere from 100% to 500% magnification.
At this time VoiceOver and Zoom will not work simultaneously. However, both work with the built-in white-on-black reverse-video effect. You can also assign a triple-press of the Home button to activate or deactivate Zoom or VoiceOver. It’s worth mentioning, too, that a separate function called AutoText will automatically speak predictive text even when VoiceOver and Zoom are off.
Room for Growth
Okay, if VoiceOver and Zoom are so great, why don’t we just declare that we’ve reached accessibility nirvana and celebrate? Well, first of all, the accessibility functionality can still be improved. Hopefully a future release will let Zoom and VoiceOver work together, giving people who have some vision but need speech support the best of both worlds.
Other improvements I’d like to see include:
- a truly independent volume level for VoiceOver
- a pronunciation dictionary
- support for third-party TTS engines
- a separate settings area for Bluetooth braille displays, which would be much less confusing than listing them among all the other Bluetooth devices
- cloud functionality for labels: if I come across an app with badly labeled buttons and icons and label them all myself, I should be able to share those labels with the community of users, so that another user who installs the same app and takes advantage of the VoiceOver cloud instantly gets every icon properly labeled
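On the developer side, the unlabeled-icon problem in the list above is usually easy to prevent at the source. As a minimal sketch in modern Swift (the button name and image name here are hypothetical, not from any particular app), a developer can give an image-only control an explicit label so VoiceOver announces something meaningful instead of nothing or a file name:

```swift
import UIKit

// Hypothetical example: an image-only button that VoiceOver would
// otherwise announce poorly or not at all.
let shareButton = UIButton(type: .custom)
shareButton.setImage(UIImage(named: "share-icon"), for: .normal)

// Giving the control an explicit accessibility label and trait lets
// VoiceOver announce it as "Share, button".
shareButton.isAccessibilityElement = true
shareButton.accessibilityLabel = "Share"
shareButton.accessibilityTraits = .button
```

When developers do this routinely, users never need to relabel icons by hand in the first place.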
There are many more improvements and bug fixes I could mention, and I’m sure other users of iOS products have their own suggestions. The VoiceOver functions and features in iOS 5 are not covered here, since iOS 5 is still in beta and not yet finalized. Developers need to hear from us. Send your suggestions to firstname.lastname@example.org.
Peter Cantisani is a consultant on AT and accessibility. He is the author of 26 Useful Apps for Blind iPhone Users, released in June by the National Braille Press.