At WWDC 2018, Apple introduced a range of improvements designed to make its products genuinely usable by people who are often “left out” on other platforms. Here’s a short run-down:
One of the most potentially liberating enhancements, Siri Shortcuts should deliver significant hands-free benefits to people who have autism or mobility challenges. It lets users create verbally triggered ways to get things done.
That’s useful for small jobs — sending emails or messages — but also for more specialized ones, such as controlling devices in a smart home, and it will extend to more complex workflows as third-party developers add Shortcuts support to their apps. This automation should help users with motor-skill or cognitive challenges.
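For developers, adopting Shortcuts in iOS 12 can be as simple as donating an `NSUserActivity` when the user performs a task, so Siri can later suggest it or trigger it with a custom phrase. A minimal sketch — the activity type, title, and phrase below are illustrative, and the activity type would also need to be listed under `NSUserActivityTypes` in the app’s Info.plist:

```swift
import UIKit
import Intents

// Donate a user activity so Siri can suggest it as a Shortcut and the
// user can record a voice phrase for it in Settings > Siri & Search.
func donateStartRoutineShortcut(from viewController: UIViewController) {
    // "com.example.app.startRoutine" is a hypothetical activity type.
    let activity = NSUserActivity(activityType: "com.example.app.startRoutine")
    activity.title = "Start my morning routine"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true            // iOS 12: surfaces it as a Shortcut
    activity.suggestedInvocationPhrase = "Morning routine"
    viewController.userActivity = activity             // associates it with the screen
    activity.becomeCurrent()                           // performs the donation
}
```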
Apple’s introduction of Group FaceTime chat for up to 32 users makes simultaneous sign-language interpretation between chat participants possible.
As long as both parties have an Apple Watch, people will be able to contact family and/or key care providers from the wrist — no need to reach an alarm button or hunt for an iPhone.
Apple’s decision to make AirPods compatible with Live Listen in iOS 12 seems useful. The technology was originally developed for Made for iPhone hearing aids. It uses the iPhone mic to monitor what is being said and streams the audio directly to the hearing aid, or (now) to AirPods. This won’t replace hearing aids, of course — people with complex hearing problems should get medical advice — but it does put this kind of solution within more people’s reach.
Apple has also made a significant change to the Speak Selection tool. Previously, it used a separate set of voices to read what was on screen; in iOS 12, selected text is spoken in the much-improved Siri voice by default.
iWork gains Optical Character Recognition support for handwritten notes created in Apple’s productivity suite. This means handwritten notes can be read aloud, so blind and low-vision users can hear handwritten mark-up in documents. That’s particularly useful for those who write or collaborate on written projects.
Apple has also made improvements to the accessibility keyboard when typing using Switch Control. These improvements make typing much faster and easier than before, boosting productivity and enabling those who use such controls to realize their ideas much more quickly than they could in the past.
MacBook Pro with Touch Bar
Apple has made a useful improvement for the MacBook Pro with Touch Bar: VoiceOver users with those Macs can now build custom Automator shortcuts and access them from the Touch Bar. A second enhancement, available on all Macs, is Dark Mode in macOS Mojave, which may make screen content easier to see for users who are sensitive to bright displays.
Backwards compatibility and performance
Apple’s decision to ensure iOS 12 will deliver performance improvements on devices as old as the iPhone 5s will benefit people who have disabilities, many of whom use older devices.
“For our customers, that’s going to be really huge,” said AssistiveWare CEO David Niemeijer. He notes that many of his customers use older devices, so extending the usable life (and performance) of the hardware they already own will deliver significant benefits to them.
Accessibility: a ‘requirement’
“Developers need to realize that accessibility shouldn’t be a feature. It is a requirement,” wrote WWDC 2018 student scholarship winner John Ciocca, developer of youBelong and MyVoice.
“An app that is accessible has the potential to reach a substantially larger audience. Those users who are typically left out now have the ability to achieve more and will remember that your app enabled them to do so.”
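In practice, making an app reachable to VoiceOver users often starts with something as small as labeling UI elements. A minimal sketch using UIKit’s UIAccessibility properties — the image asset name and label text here are illustrative, not from the article:

```swift
import UIKit

// Give a custom icon-only button a spoken description for VoiceOver.
// Without a label, VoiceOver users hear only "button" and must guess.
let playButton = UIButton(type: .custom)
playButton.setImage(UIImage(named: "waveform-icon"), for: .normal) // hypothetical asset
playButton.isAccessibilityElement = true
playButton.accessibilityLabel = "Play recording"        // what VoiceOver speaks
playButton.accessibilityHint = "Plays the selected voice memo"
playButton.accessibilityTraits = .button
```

Labels of this kind cost a line or two per control, which is part of why developers like Ciocca argue accessibility should be treated as a baseline requirement rather than an optional feature.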
Source: WWDC: Accessibility has become a requirement | Computerworld