Smartphones Are Introducing Tech That Makes the World More Accessible for People With Disabilities
"People with disabilities need the same things from their phones as those without: affordability, reliability, usability, value, functionality," says Salimah LaForce, senior policy analyst at the Georgia Institute of Technology's Center for Advanced Communication Policy. LaForce and her colleagues are currently collecting data on 2019 smartphone accessibility as a follow-up to a 2017 study published earlier this year. The results have been encouraging, as there are a growing number of accessibility features on devices that help those with many different disabilities. "Some were specifically designed to improve access by people with disabilities, such as screen readers, screen contrast, and font size customization for those with vision and print disabilities, and captions so that people who are hard of hearing can access video content," says LaForce. "Other features may not have been designed specifically for people with disabilities but impact the accessibility and the utility of the device. For example, we saw a large increase in the presence of two-way video, which is incredibly important for people who are deaf who communicate using a signed language."
Indeed, accessibility was a major topic of conversation at Apple's Worldwide Developers Conference in June, a sign that this is a key area of investment for the company. One of the buzziest breakthroughs was Apple's upgraded Voice Control feature, which lets people navigate their Macs, iPads, and iPhones entirely by speech. For instance, to text a photo to a friend named Tim, a user would simply say things like "open photos," "scroll up," "click 'share,'" and "Tim." This is a major win for those who aren't physically able to tap, swipe, and scroll on a screen. On a similar note, both Google and Apple have created smartphone shortcuts that trigger a complex set of actions with a single tap or voice command: opening Maps, sending a pre-written text, and turning off the lights when a person is leaving the house, for instance.
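The core idea behind these shortcuts is simple: one trigger phrase mapped to an ordered list of actions. Here's a minimal sketch of that pattern in Kotlin; the routine name, trigger, and actions are hypothetical placeholders, not Apple's Shortcuts or Google Assistant's actual APIs.

```kotlin
// Minimal sketch of the "one command, many actions" idea behind
// smartphone shortcuts. All names here are hypothetical; real
// implementations use Apple's Shortcuts or Google Assistant Routines.

// A routine is just a named trigger plus a list of actions to run in order.
data class Routine(val trigger: String, val actions: List<() -> Unit>)

val leavingHome = Routine(
    trigger = "I'm leaving",
    actions = listOf(
        { println("Opening Maps with the route...") },
        { println("Sending pre-written text: 'On my way!'") },
        { println("Turning off the lights...") },
    )
)

fun runRoutine(spokenPhrase: String, routines: List<Routine>) {
    // Match the recognized phrase against known triggers and fire
    // every action in the matching routine, one after another.
    routines.firstOrNull { it.trigger.equals(spokenPhrase, ignoreCase = true) }
        ?.actions?.forEach { action -> action() }
}

fun main() {
    runRoutine("i'm leaving", listOf(leavingHome))
}
```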
There are also smartphone accessibility settings that make communication easier for those who are deaf or hard of hearing. Android users can utilize the Live Transcribe app, which uses the phone's microphone to create a real-time text transcript of what the people around them are saying; third-party apps such as Ava and MyEar let iPhone users do the same. Google Pixel 4's Live Caption automatically captions voice memos, social videos, and other audio and video content, while Samsung phones can detect specific sounds, like a baby crying or a doorbell ringing, and alert users to them. Apple has even created a Made for iPhone hearing aid program, through which hearing aids from third-party manufacturers sync directly with iPads and iPhones and help users monitor their hearing health.
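Live Transcribe's own models aren't public, but the real-time speech-to-text pattern it relies on can be sketched with Android's standard SpeechRecognizer API. This is an illustrative sketch, not Live Transcribe's implementation; it assumes the RECORD_AUDIO permission has been granted and a recognition service is available on the device.

```kotlin
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

// Sketch of live transcription using Android's standard SpeechRecognizer.
// Partial results are requested so the transcript updates as people speak.
fun startLiveTranscription(context: Context, onText: (String) -> Unit) {
    val recognizer = SpeechRecognizer.createSpeechRecognizer(context)
    val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
        putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
            RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
        putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true)
    }
    recognizer.setRecognitionListener(object : RecognitionListener {
        override fun onPartialResults(partialResults: Bundle?) {
            partialResults
                ?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                ?.firstOrNull()
                ?.let(onText) // stream the best running hypothesis to the UI
        }
        override fun onResults(results: Bundle?) {
            results?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                ?.firstOrNull()?.let(onText) // final transcript for the utterance
        }
        // Remaining callbacks are unused in this sketch.
        override fun onReadyForSpeech(params: Bundle?) {}
        override fun onBeginningOfSpeech() {}
        override fun onRmsChanged(rmsdB: Float) {}
        override fun onBufferReceived(buffer: ByteArray?) {}
        override fun onEndOfSpeech() {}
        override fun onError(error: Int) {}
        override fun onEvent(eventType: Int, params: Bundle?) {}
    })
    recognizer.startListening(intent)
}
```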
For those who are blind or partially sighted, smartphones also offer a number of benefits. "For me, as a blind user, accomplishing tasks in the fastest and the most efficient way possible is one of the biggest features I expect from my phone," says Victor Tsaran, senior technical program manager on Google's accessibility team. "The utility of today’s smart devices for people with disabilities really shines when hardware and software work together to fill in the sensory gap between users and a physical reality."
One of his favorite examples is Google's Lookout app, which allows blind or partially sighted users to take a photo of, say, a sign or a menu, and have the app read it aloud. Detailed Voice Guidance, a function of Google Maps' accessibility settings, verbally tells users what street they're on, how much farther their next turn is, and when to use caution near a busy intersection. Microsoft's Soundscape tool offers similar functionality: it narrates the shops, restaurants, and other landmarks a person passes as they walk down the street, and guides them to their destination using a headset with 3D sound, kind of like sonar.
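Lookout's internals aren't public, but the camera-to-speech pipeline it represents can be assembled from public Google libraries: ML Kit's text recognition pulls words out of a photo, and Android's TextToSpeech reads them aloud. The sketch below shows that pipeline under those assumptions; it is not Lookout's actual code.

```kotlin
import android.content.Context
import android.graphics.Bitmap
import android.speech.tts.TextToSpeech
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// Sketch of a Lookout-style flow: recognize text in a photo with ML Kit,
// then speak whatever was found using Android's TextToSpeech engine.
fun speakTextInPhoto(context: Context, photo: Bitmap) {
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
    val image = InputImage.fromBitmap(photo, /* rotationDegrees = */ 0)

    recognizer.process(image)
        .addOnSuccessListener { visionText ->
            val found = visionText.text // all recognized lines, joined
            if (found.isBlank()) return@addOnSuccessListener
            // Initialize TTS, then read the recognized text aloud once ready.
            var tts: TextToSpeech? = null
            tts = TextToSpeech(context) { status ->
                if (status == TextToSpeech.SUCCESS) {
                    tts?.speak(found, TextToSpeech.QUEUE_FLUSH, null, "sign")
                }
            }
        }
        .addOnFailureListener { e ->
            // In a real app, surface the error through an accessible channel.
            e.printStackTrace()
        }
}
```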
There are also many ways that smartphones are making home life easier for those without full sight. One app, called Be My Eyes, connects blind people with sighted volunteers who can do things like read expiration dates on food or tell them what color a shirt is. Samsung Galaxy phones let users "tag" objects that are easily confused because they're similar in size and shape, such as two different bottles of medication, using near-field communication (NFC) stickers, and record a voice cue that plays when the phone is held near the tag.
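Samsung's own tag format and audio playback aren't public, but the underlying NFC "voice label" idea can be sketched with Android's standard NFC classes. This hypothetical version assumes each sticker holds a plain NDEF text record, which the phone decodes and speaks when tapped.

```kotlin
import android.content.Intent
import android.nfc.NdefMessage
import android.nfc.NfcAdapter
import android.speech.tts.TextToSpeech

// Sketch of an NFC voice label: when the phone is held near a tag,
// decode the tag's NDEF text record and speak it aloud. Assumes tags
// were written as standard UTF-8 NDEF text records.
fun speakNfcLabel(intent: Intent, tts: TextToSpeech) {
    if (intent.action != NfcAdapter.ACTION_NDEF_DISCOVERED) return

    val messages = intent.getParcelableArrayExtra(NfcAdapter.EXTRA_NDEF_MESSAGES)
    val record = (messages?.firstOrNull() as? NdefMessage)
        ?.records?.firstOrNull() ?: return

    // NDEF text records start with a status byte: bits 0-5 hold the
    // length of the language code that precedes the actual text.
    val payload = record.payload
    val langCodeLength = payload[0].toInt() and 0x3F
    val label = String(
        payload, 1 + langCodeLength,
        payload.size - 1 - langCodeLength, Charsets.UTF_8
    )

    tts.speak(label, TextToSpeech.QUEUE_FLUSH, null, "nfc-label")
}
```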
Impressive as all these accessibility settings are, gaps in the landscape still exist. According to Georgia Tech's study of 2017 phones, those offered by the federal Lifeline Assistance program for low-income households often didn't have the same breadth of features as those sold by retailers. For instance, 84 percent of phones sold by wireless carriers included built-in text-to-speech readers, whereas the same was only true of 26 percent of Lifeline phones. According to LaForce, this is concerning because a significant percentage of Lifeline users have disabilities.
There's also some bias inherent in AI tools, says LaForce, particularly around voice-recognition software. "While voice assistants have been an incredible access tool for people with vision and mobility disabilities, people who are deaf and those with atypical speech patterns have limited engagement with them," she says. Google is aiming to rectify this: it recently announced Project Euphonia, which is recording the voices of people with atypical speech patterns, such as those with ALS and Down syndrome, to help computers recognize and respond to them. The project is also working to recognize the gestures and facial expressions of those who aren't able to speak at all. LaForce adds that this type of accessibility needs to extend to all of the things we now control with our smartphones, like lights and home appliances, to make them inclusive for all. "We need more proactive and inclusive technology design that considers all of the potential users without inserting bias as to who the users are. They can be anyone, so when designing, let's think of everyone."
Luckily, LaForce says, the future looks bright on this front. "Work is being done to allow for devices to be controlled by your thoughts, or the brain activity and signals responding to certain visual stimuli," she says, noting that brain-computer interfaces (BCIs) would make technology accessible for everyone, regardless of their ability to see, hear, speak, or even move. "I cannot say whether or when we’ll see BCIs as an accessibility option in smartphones, but I will say that with wearable devices connected to smartphones being quite common, they may not be as ‘far out’ as one might imagine."
And as Tsaran points out, accessibility innovations like these would likely end up benefiting those with disabilities and those without. "Working on Accessibility at Google, I am always impressed with how technology built for specific needs often ends up benefiting everyone," he says. "The recently introduced detailed voice guidance feature for Google Maps, for example, which provides more verbose instructions for visually impaired users, turns out to be a welcome improvement for all pedestrians." Perhaps one day we won't design accessibility settings in a silo at all, but will instead create technology that serves everyone's needs at once.