What’s New In Accessibility In iOS 16 For Blind And DeafBlind Users
Learn about accessibility updates in iOS 16 that help individuals who are blind or DeafBlind, including improvements to dictation, VoiceOver, and more.
By Scott Davert, Coordinator: Technology, Research and Innovation Center, Helen Keller National Center for DeafBlind Youths and Adults
September 12, 2022
Introduction
I’m back yet again. Just like in years past, September brings us a new major release of iOS. This latest edition includes many mainstream changes, such as a revamped Lock Screen, enhancements to privacy and safety features, improvements to Focus modes, and new functionality in Messages and Mail, along with many other improvements. For the full rundown of iOS 16 features and changes, please see Apple’s official list.
A lot of articles will cover these changes in detail, but far fewer will cover the accessibility enhancements and features for individuals who are blind or DeafBlind.
Noteworthy Mainstream Changes
Hello Automatic Verification, Goodbye CAPTCHA!
CAPTCHAs may soon become a thing of the past thanks to a new feature in iOS 16 called Automatic Verification. By allowing iCloud to automatically and privately verify your device and account, it may soon be possible for users to bypass CAPTCHAs. For more detailed information on how Automatic Verification works, please see this TechCrunch article.
This capability will need to be further developed, so it could take some time before it works, but it’s a wonderful idea which will hopefully gain the support of developers. Though this feature was enabled by default for me, you can ensure it is set up for you by heading over to Settings > Apple ID > Password & Security > Automatic Verification and setting it to “on”.
Siri
Each time I cover what’s new in iOS, Siri gets a mention. This article is no exception. This time around, Siri gains the option to carry out new tasks. You can shut down your phone, restart it, hang up a call, and other actions. With “Hey Siri” enabled, simply tell it to carry out any of these tasks.
If you are someone who prefers to collect your thoughts before speaking, you now also have the ability to control the amount of time Siri will wait for you to finish before it responds. You can adjust this by going to Settings > Accessibility > Siri and choosing among the 3 options. The default setting is what we have always had, but there are now options for “Longer” and “Longest”.
Finally, there are new sounds for when Siri is listening and when it stops. These new sounds are much lower pitched and may be easier for someone with a high frequency hearing loss to detect.
Dictation
With iOS 16, it is, in most cases, no longer necessary to speak the punctuation needed to formulate a proper sentence. Even when I spoke with no inflection in my voice, the dictation was still able to correctly insert punctuation marks in many instances. That said, I’ve also had random times where no punctuation shows up at all, so it is always best to verify what you are sending before doing so.
If you want to add or delete punctuation, you can edit on the fly as the onscreen keyboard remains available during dictation. It’s also possible to dictate emojis. For example, saying “smiling face with smiling eyes emoji” would produce the correct emoji.
I have found that punctuation and emoji insertion is about 80% accurate on my iPhone SE 3, though it was a bit higher on an iPhone 12 Pro Max.
It’s worth noting that the automatic insertion of emojis and punctuation is only available on the iPhone XS and later, which does include the SE 2, 3, and XR.
VoiceOver
There Are More Voices Inside My Phone!
iOS 16 brings a large number of new speech options to those who use VoiceOver, Speak Screen, or Speak Selection. According to the Apple page linked above, speech options are now available in over 20 additional languages and locales, including Bangla (India), Bulgarian, Catalan, Ukrainian, and Vietnamese.
These additions aren’t limited to new languages. Newer versions of various software synthesizers are included, such as a higher sampling rate for the voice called Tom, a new premium version of Ava, and entirely new voices such as Nathan, Joelle, Noelle, and others. Explore these and other new options by going to Settings > Accessibility > VoiceOver > Speech > Voice.
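For those curious how these voices are exposed beyond VoiceOver, the installed voices are available to developers through the AVSpeechSynthesisVoice API in AVFoundation. The following is a minimal sketch, not anything Apple ships, that lists the English voices installed on a device and speaks a short line with one of them; the filtering and sample text are my own assumptions.

    import AVFoundation

    // Keep a reference to the synthesizer so speech isn't cut off when it goes out of scope.
    let synthesizer = AVSpeechSynthesizer()

    // List every installed English voice along with its quality level.
    let englishVoices = AVSpeechSynthesisVoice.speechVoices().filter { $0.language.hasPrefix("en") }
    for voice in englishVoices {
        print(voice.name, voice.language, voice.quality.rawValue)
    }

    // Speak a short sample with the first English voice found, if any.
    if let voice = englishVoices.first {
        let utterance = AVSpeechUtterance(string: "Hello from a newly installed voice.")
        utterance.voice = voice
        synthesizer.speak(utterance)
    }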
Read Eloquently With Reed
Yes, Eloquence has come to iOS. You can listen to your iOS device with any variant of Eloquence you would like. In English, this includes both the U.S. and U.K. variants. Many prefer this speech synthesizer to others, as it is the default with JAWS and is very responsive. Its comparatively robotic and more predictable nature can also help those who are hard of hearing, as it can sometimes be difficult to understand the more modern concatenative synthesizers at faster rates.
Much like food, music, books or many other things in life, this is highly subjective. With iOS 16, your choice of voices and languages is larger than it ever has been. Even the Novelty voices macOS users are familiar with are now available.
A complete list of the U.S. voices, along with an audio demonstration, can be found in this podcast recorded by Thomas Domville.
There is a second podcast, which demonstrates and lists the English voices for those outside the United States.
Less Is Sometimes Better
When iOS 15 was released, it brought more Verbosity options to both speech and braille users, including how to convey availability of the Actions rotor. The challenge was that these messages were either always on or always off. Always on caused issues in the Mail app, since the Actions rotor is almost always available there, but it was very helpful for learning about Actions rotor availability in unfamiliar contexts. In iOS 16, there is now a setting that, when enabled, only speaks or displays this information once until it changes. Those using Mail and similar apps will no longer find themselves regularly disrupted by these repeated messages. Find it under Settings > Accessibility > VoiceOver > Verbosity > Actions; “First Item Only” is the only new option.
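For context, the actions VoiceOver surfaces on the Actions rotor come from the custom actions a developer attaches to an element through UIKit's UIAccessibilityCustomAction API. The snippet below is a minimal, hypothetical sketch of how a mail-style app might offer "Delete" and "Flag" on the rotor for a message cell; the cell and method names are my own.

    import UIKit

    // Hypothetical message cell in a mail-style app.
    final class MessageCell: UITableViewCell {
        func configureRotorActions() {
            let delete = UIAccessibilityCustomAction(name: "Delete") { _ in
                // Delete the underlying message here.
                return true
            }
            let flag = UIAccessibilityCustomAction(name: "Flag") { _ in
                // Flag the underlying message here.
                return true
            }
            // VoiceOver lists these on the Actions rotor when the cell is focused.
            accessibilityCustomActions = [delete, flag]
        }
    }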
Time To Start
A new feature for Apple Maps is sound and haptic feedback to help VoiceOver users identify the starting point for walking directions. As I understand it, this feature is automatically active whenever VoiceOver is running.
Magnifier
Door Detection
The Magnifier app now does more than magnification. One of its new capabilities is detection. After opening the Magnifier app, on the lower right side of the screen you will find a button called “Detection”. Activating this mode reveals a new screen with 3 buttons on it: People Detection, Door Detection, and Image Description. Any of these 3 buttons can be activated, and your iOS device will then find and report what it thinks it has detected.
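Apple has not published the exact pipeline behind Detection Mode, which combines the camera, LiDAR, and on-device machine learning. For readers curious about the general technique, the sketch below shows people detection of a broadly similar kind using Apple's Vision framework; it is an illustration under my own assumptions (a single camera frame passed in as a CGImage), not the code Magnifier actually runs.

    import Vision
    import CoreGraphics

    // Run people detection on one camera frame, assumed to be supplied as a CGImage.
    func detectPeople(in frame: CGImage) {
        let request = VNDetectHumanRectanglesRequest { request, _ in
            guard let people = request.results as? [VNHumanObservation] else { return }
            for person in people {
                // boundingBox is normalized (0-1) with the origin at the bottom left.
                print("Person at \(person.boundingBox), confidence \(person.confidence)")
            }
        }
        let handler = VNImageRequestHandler(cgImage: frame, options: [:])
        try? handler.perform([request])
    }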
Walking around my neighborhood with all 3 buttons activated, the amount of information was overwhelming. This is partially because there were many businesses on each block, people going about their afternoon, and lots of advertising and information to detect. Turning on only Door Detection seemed to slow the flood of information down some.
While we are on the topic of slow, that is exactly the speed you will need to travel for the LiDAR and other hardware to give you accurate information that makes sense. When identifying objects and their distance measured in feet, 1 or 2 steps can change the entire perspective. However, if you are looking for a door or a sign, you are probably going to move at a slower pace anyway. My issue was that the information was without context unless I already knew someone or something was there. For example, I was often informed of the presence of others but had no information about where they were in relation to me. I was also able to read address numbers, which was nice, but there were often 2 or 3 doors detected at the same time, so figuring out exactly what was being reported took work.
There is also the issue of not having any free hands when conducting this exercise. I was walking with my cane in one hand and the phone in the other. The phone has to face outward, which could be problematic if someone bumps into you, or decides they would like a new iPhone and tries to steal it. A totally DeafBlind person would most likely have to secure their phone with a lanyard or some other method, as they would have the phone, a mobility aid, and a braille display to juggle when utilizing this technology.
Navigating indoors with this new set of tools, I found they worked much better and were able to provide me with useful information. For example, when I stopped in front of a door in a random office building, the iPhone was able to identify the room. It also read me the name of the person who probably thought I was a weirdo for hanging around outside their door.
As is the case with other kinds of machine learning, I feel that we are approaching the time when these tools will provide extremely valuable information. It is my view, no pun intended, that such technology would function far better in the form of glasses or some other head worn device. Detection in the Magnifier app is only available on the Pro and Pro Max versions of the iPhone 12, 13, and 14.
Activities
iOS 13 brought this feature to VoiceOver, and iOS 16 extends the same kind of contextual flexibility to the Magnifier. You can save your preferred Magnifier controls, including camera, brightness, contrast, filters, and more, as an Activity and quickly switch between them. This makes it more efficient to switch between configurations optimized for recurring tasks and situations, such as reading a restaurant menu versus an airport flight board.
Hearing
Live Captions
New in iOS 16 is Live Captions, a feature which not only provides captions in apps like FaceTime but is also available system-wide. You can enable it by going to Settings > Accessibility > Live Captions (Beta).
You can also control the appearance of the captioning panel. Options are available to make the text bold and to adjust the text size, text color, background color, and idle opacity.
For braille users, the Live Captions feature is useful, though a bit frustrating to navigate. When it is turned on and providing system-wide captions, you can typically find its icon in the lower right corner of your Home screen by pressing Space with dots 4-5-6 and then Space with dot 4 to move to the icon. However, sometimes the Live Captions icon isn’t there.
For those wishing to make regular use of this feature, I recommend setting up a keyboard command to jump directly to the Live Captions panel from anywhere. The command that needs to be customized is “move to next app”. To set up a customized command for the Live Captions feature, follow these steps:
1. Head over to Settings > Accessibility > VoiceOver > Braille > More Info (under the device you have connected), and then Commands.
2. Activate the Device button.
3. Find “move to next app” among the list of choices.
4. Scroll to “Assign New Braille Keys”.
5. Have a command in mind so that you can immediately press that keyboard combination; for example, dot 8 with dots 2-6. If the combination you have chosen doesn’t already have something assigned to it, you are done with this process. If the combination already has a command associated with it, you will get an alert telling you what the currently assigned action is and asking whether you wish to change it.
6. Choose “OK” to reassign the command, or “Cancel” to keep the existing assignment.
Once this is set up, you can go to the captions panel by pressing the command assigned above. In the panel, in addition to the captions themselves, you will encounter a few options. If no captions are available, you will find “Listening…” After that is a “Pause” button, followed by a microphone button and a “Minimize” button. Turning on the microphone will enable Live Captions of the environment around you. If the microphone is not selected, you will receive captions of any audio, other than VoiceOver, playing in other apps. For example, I pulled up a stream of WCBS AM in the OOTunes app and received captions of what the host was saying. Frustratingly, the arrival of new text returns focus to the beginning of the line and disrupts the flow of reading. This is an issue even when using Live Captions with those around you, and it may prove to be a nuisance for even fast braille readers.
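Live Captions itself is a system feature with no public API, but the same kind of on-device transcription it relies on is available to developers through Apple's Speech framework. Below is a minimal sketch, under my own assumptions, of transcribing microphone audio entirely on the device; it is only meant to illustrate the underlying technique, not how Apple implemented Live Captions.

    import Speech
    import AVFoundation

    // Minimal on-device transcriber; the app must already have microphone and
    // speech recognition permission (see SFSpeechRecognizer.requestAuthorization).
    final class Transcriber {
        private let engine = AVAudioEngine()
        private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
        private let request = SFSpeechAudioBufferRecognitionRequest()

        func start() throws {
            request.shouldReportPartialResults = true
            request.requiresOnDeviceRecognition = true // keep audio on the device

            // Feed microphone buffers into the recognition request.
            let input = engine.inputNode
            let format = input.outputFormat(forBus: 0)
            input.installTap(onBus: 0, bufferSize: 1024, format: format) { [weak self] buffer, _ in
                self?.request.append(buffer)
            }
            engine.prepare()
            try engine.start()

            // Print each updated transcription as it arrives.
            recognizer?.recognitionTask(with: request) { result, _ in
                if let result = result {
                    print(result.bestTranscription.formattedString)
                }
            }
        }
    }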
When in a FaceTime call and using VoiceOver, you can go to the “More Info” button and turn Live Captions on for FaceTime calls. On my iPhone SE, I was only able to get Live Captions to work when making a video call, not with FaceTime Audio.
Even standard phone calls are supported, which means Live Captions could eventually replace apps such as ClearCaptions. ClearCaptions puts new captions on separate lines, so the text in focus does not change each time new text arrives. The advantages of the built-in feature over a third-party app are that a separate phone number is not needed and that there is no complex identity verification, which FCC regulations require for relay and Internet Protocol Captioned Telephone Service.
The captions themselves are about as reliable as dictation, meaning that they sometimes come out well and other times not so much. I called a friend in South Carolina who has a very thick Southern accent, and Live Captions did not do so well with his voice. However, when I called my brother, who lives in Michigan, it was much more accurate. At launch, Live Captions is only available in the U.S. and Canada and requires an iPhone 11 or later.
Your Sounds Identified
With the release of iOS 14, Apple introduced Sound Recognition. A few new sounds it could identify were featured in iOS 15, but now you can teach your device what to listen for.
To set this feature up, go to Settings > Accessibility > Sound Recognition and choose whether you would like to customize an alarm or an appliance. After selecting “Custom”, you will be walked through a setup process that first has you assign a name to the sound and then play it 5 times. I had some issues with all 5 attempts being registered, but after playing the sound 7 or 8 times, it seemed to work.
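Apple hasn't documented how the custom sound training works internally, but the built-in side of Sound Recognition corresponds to the sound classifier Apple exposes to developers in the SoundAnalysis framework. The sketch below, written under my own assumptions, runs that built-in classifier on live microphone audio and prints the most likely label for each chunk; it illustrates the technique rather than the code behind the Settings feature.

    import AVFoundation
    import SoundAnalysis

    // Receives classification results from the analyzer and prints the top label.
    final class SoundObserver: NSObject, SNResultsObserving {
        func request(_ request: SNRequest, didProduce result: SNResult) {
            guard let result = result as? SNClassificationResult,
                  let top = result.classifications.first else { return }
            print("Heard \(top.identifier) with confidence \(top.confidence)")
        }
    }

    let engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)

    // Analyze the microphone stream with Apple's built-in sound classifier.
    let analyzer = SNAudioStreamAnalyzer(format: format)
    let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
    let observer = SoundObserver()
    try analyzer.add(request, withObserver: observer)

    input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
        analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
    }
    engine.prepare()
    try engine.start()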
Another new feature in iOS 16 is the ability to choose a sound and a custom vibration for each alert. So, if you have an urgent notification like a smoke alarm, you may wish to use one of the more persistent vibration patterns. For the few sounds I have active, I recorded my own vibration in the form of a three-character Morse code pattern so that the alert would be immediately recognizable.
Performance seems to have improved, and recognition appears especially reliable for custom sounds. I found, for example, that the doorbell alert was not very reliable in previous versions of iOS. Now that I can record my own, the reliability has improved, with no false positives. I still find, though, that the sound in question has to be quite loud in order for Sound Recognition to alert me, although the volume threshold for customized sound recordings seems to be a bit lower than for the standard sounds available from Apple. Am I ready to throw out my smoke detector and other alarms? Not at all, as my default alerting system feels safer and more reliable.
I’ve Been Notified
Users of AirPods and Beats headphones have had the option to have notifications announced by Siri and delivered directly to those devices. iOS 16 brings this ability to users of MFi hearing aids.
Hearing Health
In iOS 15, users were given the ability to have Headphone Accommodations compensate for their particular hearing loss. The user would complete a short hearing screening that would then be taken into account when using compatible headphones. iOS 16 expands this by allowing the generated audiogram to be imported into the Health app.
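Once an audiogram is in the Health app, it is stored as HealthKit's audiogram sample type, which apps can read with the user's permission. The following is a minimal sketch under my own assumptions of how an app might read the most recent audiogram and print each sensitivity point; it is illustrative only, and access still has to be granted by the user.

    import HealthKit

    let store = HKHealthStore()
    let audiogramType = HKObjectType.audiogramSampleType()

    // Ask the user for read access to audiogram data.
    store.requestAuthorization(toShare: nil, read: [audiogramType]) { granted, _ in
        guard granted else { return }

        // Fetch the most recent audiogram sample.
        let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)
        let query = HKSampleQuery(sampleType: audiogramType, predicate: nil,
                                  limit: 1, sortDescriptors: [newestFirst]) { _, samples, _ in
            guard let audiogram = samples?.first as? HKAudiogramSample else { return }
            for point in audiogram.sensitivityPoints {
                let hertz = point.frequency.doubleValue(for: .hertz())
                let left = point.leftEarSensitivity?.doubleValue(for: .decibelHearingLevel())
                let right = point.rightEarSensitivity?.doubleValue(for: .decibelHearingLevel())
                print("\(hertz) Hz: left \(left ?? .nan) dB HL, right \(right ?? .nan) dB HL")
            }
        }
        store.execute(query)
    }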
Hanging It Up
By default, it is now possible to end calls by pressing the Side Button on your iPhone; however, I was not able to get it to work with my iPhone SE. I tried leaving a Bluetooth audio device connected, disconnecting the Bluetooth audio and holding the phone to my ear, and also tried the feature with VoiceOver turned off. Others have had success with this feature, but I still can’t seem to hang up on people reliably. It is possible to turn this setting off if you wish; it can be found at Settings > Accessibility > Touch > Prevent Lock to End Call.
Conclusion
Apple continues to innovate and bring new accessibility features to users who need them. Having new features that I can specifically use as a VoiceOver and braille user each year makes me feel included in the process. Though there are certainly bugs in the iOS 16 release, I have chosen to upgrade my main device since I’m able to work around the bugs present, and many of the bugs I have found are minor in nature. While this is true for my circumstances, it may not be for yours. I would recommend checking out iOS 16 on another device before installing it yourself, especially as a low vision user, since Apple has indicated in the article linked in the first paragraph that there are many design changes. Further, I would recommend checking out the list of known bugs on the AppleVis website; the original post and comments often provide valuable information which may help in deciding whether iOS 16 is right for you. If so, you can get it on the iPhone 8 and later. Sorry, iPod touch lovers: iOS 15.6.1 was the end of the line for the 7th generation iPod touch. You can download and install the update by going to Settings > General > Software Update. Remember, though, that you may only be able to go back to 15.6.1 for a very limited time, and that the process of doing so is far from simple.