Full Issue: AccessWorld October 2022

Editor's Notes: AFB Employment Initiatives

Dear AccessWorld readers,

Once again it is October, and that means it is National Disability Employment Awareness Month. In acknowledgement of this month, I wanted to introduce you to two AFB initiatives that seek to promote the employment of people who are blind or have low vision.

In January 2022, AFB conducted our Workplace Technology Study, gathering data on various aspects of technology use in the workplace by people who are blind or have low vision. The study specifically focused on "technology used for hiring and onboarding, required work-related training, and productivity; receipt of workplace accommodations; interactions with Information Technology (IT) staff; and experiences with telework." In addition to the survey, in-depth interviews were conducted with 25 of the 323 participants who took part in the study. With this extensive data, we aim to help companies and organizations fill gaps in access for workers who are blind or have low vision. If you would like to read the full report, you can do so here.

One of AFB's newest flagship programs is our Talent Lab, where cohorts of interns and apprentices take part in a multi-year program to learn the ins and outs of web and app accessibility, how to test for access issues, and how to be an access leader in their future workplace. The program is the new form of our Access Consulting department, allowing participants to learn from our seasoned access professionals. We began the first cohort this past summer, and applications for the next are now open. You can learn more about the Talent Lab and view apprentice and intern information and applications.

We hope you enjoy this issue of AccessWorld. Bill Holton brings us another of his Vision Tech articles, this time focusing on potential uses of ultrasound to treat vision conditions. With the release of iOS 16, Janet Ingber and Judy Dixon review the mainstream and accessibility features, respectively, of this latest iOS update. Finally, Steve Kelley reviews a brand new strategy for delivering vision rehabilitation services remotely through the Eschenbach Haus Call Telelowvision Program.

As always, I would like to thank our authors for bringing us another excellent issue, and you, for reading our magazine.

Aaron Preece

AccessWorld Editor in Chief

American Foundation for the Blind.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Vision Tech: The Sound of Sight--Preserving and restoring vision using high-frequency ultrasound

Bill Holton

For most of us, our experience of medical ultrasound is limited to sonograms of a new baby or diagnostic evaluations of the heart and other internal organs. In this article we will present two exciting new uses for medical ultrasound: the possibility of restoring sight with a non-surgical retinal bypass, and the latest innovation in the treatment of glaucoma.

Bypassing the retina with sound waves

If you’ve ever poked yourself in the eye or, like so many cartoon characters, walked underneath a falling piano, safe, or anvil, you are likely familiar with the expression “seeing stars.” It happens even in pitch dark—pressure-induced flashes of light called phosphenes that originate from inside the eye rather than from an outside light source. But what if you could control this phenomenon and use it to transmit patterns, shapes, even the scene ahead of you?

Retinal degenerative diseases are among the leading causes of blindness, affecting tens of millions of people worldwide. In many cases, however, while the light-sensitive rods and cones are no longer functioning, the underlying neural circuitry connecting them to the brain is relatively intact. And if, as with phosphenes, we could pressure-stimulate those neural pathways directly… That was the thinking of a USC research group co-led by Drs. Qifa Zhou, Professor of Biomedical Engineering and Ophthalmology, and Mark S. Humayun, Professor of Ophthalmology and Biomedical Engineering and one of the inventors of the Argus II.

Happily, you don’t need a poke in the eye or a falling anvil to stimulate retinal nerves. “You can do it with ultrasound,” says Zhou. And whereas current retinal stimulators and bypass solutions require invasive eye and/or brain surgery, “Using ultrasound could lead to a non-invasive retinal prosthesis that works without retinal or brain implants. Special glasses with a camera and an ultrasound transducer could be all that is required to restore at least partial vision.”

To test this hypothesis, the team stimulated blind rats’ eyes with ultrasound waves, sound with a frequency far above what a human can hear. “These high-frequency sounds can be well-manipulated and tightly focused on a desired area of the retina,” says Zhou. This approach made it possible to physically stimulate extremely small groups of neurons in the rat’s eye, just as light signals can activate a normal eye.
The team then measured visual activity directly from the superior colliculus, a midbrain structure that plays a central role in visual information processing and to which the optic nerve is directly connected. Using a multi-electrode array, the team was able to record a matching retinal activation. “When the ultrasound was projected as a pattern on the retina—for example, a circular point, or the letter ‘C’--it was possible to measure corresponding activities in the superior colliculus,” says Zhou.

The research is still in its infancy, and there are two major hurdles to overcome: resolution and frame rate. To be useful, an ultrasound retinal stimulator would have to provide images at a fairly high resolution, containing as many individual points as possible. “The camera attached to the system may be able to handle this, but whether the sound waves are capable of transmitting such small details to the retinal neurons without merging into each other is still an open question,” says Zhou. In the test rats, the ultrasound beam stimulated a circular area of the retina about 600 millionths of a meter in diameter, while the individual neurons packed together in the retina are only a few thousandths of a millimeter in size. “To achieve better resolution we need to experiment with higher-frequency ultrasound beams with even shorter wavelengths.”

In the tested rats, the neurons gave off the strongest signal when they were activated about five times per second. But human brains have a much higher processing speed, so at that rate a person might see each image separately. Film and video game makers know that motion appears smooth only at frame rates of roughly 24 frames per second or higher. Unfortunately, when the team tested rats at this frame rate, the images failed to register. “But we are hopeful that using even higher frequencies will solve this problem, as our work has shown that the higher the frequency of the ultrasound the stronger the retina reacts. We’re optimistic this will enable us to at least double the current frame rate.” There is a pending patent on this new ultrasound stimulation system with a Texas-based company, and Zhou is looking forward to initiating human testing sometime in the next three to five years.
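The resolution argument above can be made concrete with a little arithmetic. This is a rough sketch of our own, not from the study: the acoustic wavelength, which sets a lower bound on how tightly an ultrasound beam can be focused, is the speed of sound divided by the frequency, and we assume a typical speed of sound in soft tissue of about 1,540 meters per second.

```python
# Rough illustration (assumed values, not from the USC study):
# higher ultrasound frequency -> shorter wavelength -> tighter focus.
SPEED_OF_SOUND_TISSUE_M_S = 1540.0  # typical speed of sound in soft tissue

def wavelength_um(frequency_hz: float) -> float:
    """Acoustic wavelength in micrometers at the given frequency."""
    return SPEED_OF_SOUND_TISSUE_M_S / frequency_hz * 1e6

for f_mhz in (3, 10, 20):
    print(f"{f_mhz} MHz -> wavelength ~{wavelength_um(f_mhz * 1e6):.0f} micrometers")
```

At a few megahertz the wavelength is on the order of the roughly 600-micrometer spot reported in the rat experiments, while pushing into the tens of megahertz brings it down toward tens of micrometers, which is why the team is experimenting with higher frequencies.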

A five-minute blast of ultrasound can treat glaucoma

Glaucoma is the leading cause of irreversible blindness, affecting more than 70 million people around the world. The disease progresses slowly, often showing no symptoms until after it has already caused permanent damage—which is why medical professionals recommend regular glaucoma screenings. Glaucoma is usually associated with increased ocular pressure. “This is caused mostly by an excess of aqueous humor, due to the drainage channels becoming blocked or otherwise non-functioning,” says Luís Abegão Pinto, M.D., PhD, Assistant Professor of Ophthalmology at Lisbon University. “The hydraulic pressure increases, putting pressure on the optic nerve and leading to irreversible vision loss.”

Currently the standard treatment for glaucoma uses eyedrops to decrease the amount of aqueous humor produced or to increase its drainage. The drops must be taken regularly and indefinitely; however, people often forget to take them, especially since they have no visible symptoms to remind them. For some, even diligent use of drops will not sufficiently reduce the intraocular pressure. For these patients, surgery or laser treatments may be required to physically open the blocked drainage tubes, create new channels, or reduce the production of aqueous humor. “Each of these options has its risks, involves cutting, stitching, and/or burning away tissue, and may not be effective,” says Pinto.

Recently, Pinto and other European researchers have begun work on a new, less invasive glaucoma treatment called ultrasound cycloplasty. The procedure uses high-intensity, focused blasts of ultrasound to reduce the production of aqueous humor. The ultrasound device, known as the EyeOP1 and developed by France’s Eye Tech Care, resembles a tiny telescope with a ring probe containing six piezoelectric transducers that deliver ultrasound to six different places at different times. The patient’s eye is numbed with anesthetic drops or an anesthetic block, then the device is positioned over the eye. It aims and fires pulses of ultrasound onto the ciliary body, the area of the eye that pumps out the aqueous humor. In the weeks following the treatment, the pressure drops as less aqueous humor is produced—although enough is still made to keep the eye healthy. The postoperative period is mostly painless, and the procedure can be repeated if necessary.

As part of a two-year study by ophthalmologists at Genoa University and other centers in Italy, 66 patients were treated with the device. The results, reported in the journal Scientific Reports, showed that in 68 percent of patients the treatment was a success, with their need for eye drops cut in half. Ten percent of patients were classed as a complete success, with eye pressure below 21 mmHg.

Ultrasound cycloplasty is not yet FDA approved in the US, but so far over 10,000 glaucoma patients have been treated with the procedure. “The procedure could be done in a clinic—even a mobile clinic, which would be especially useful in areas where medical specialists are not readily available,” says Pinto.

Of course, the best treatment is always preventative. According to one study of 6,000 men and women aged 40 or over who reported what they ate each day, those with glaucoma had a “significantly lower” daily niacin intake compared with those who did not have the disease. One theory is that the vitamin, found in liver and chicken breast as well as in supplements, may have a protective effect on optic nerve cells.

A Guide to New Mainstream Features in iOS 16

Janet Ingber

iOS 16 became available on September 12, 2022, and it is packed with new features, both mainstream and accessibility-related. In this article, I will describe many of the new mainstream features.

Since September 12, Apple has issued several updates to this operating system, as is common with major software releases. I recommend waiting at least a week after a release to give Apple time to correct bugs. The AppleVis website has information on bugs and fixes.

I will describe updates to major apps including Messages, Mail, and Siri. As you read through the article, I recommend exploring each app’s settings. For this article, I used an iPhone 13 mini.

Before updating your device, make sure it is fully backed up.

Lock Screen

On the newly designed lock screen, notifications have been moved lower down. There is a Show Notifications button that opens the Notification Center. The Flashlight is in the bottom left corner and the Camera is in the lower right corner. If you flick left and right, you can explore the whole screen. Scheduled summaries appear in the same place they did previously, as do text messages. You can now add widgets to your lock screen; unfortunately, at this time, it is a complicated procedure when using VoiceOver. To give it a try, use the excellent description on the AppleVis website, posted by an author under the screen name HEXAGON.

I put a widget on my lock screen and now text messages are displayed below the widget.

Messages

The Messages app now has some very useful features, including the ability to recall a message, to mark it as unread after reading, and to edit a message after it has been sent. If you swipe through a message’s options, there is a new one: marking it as unread. This might be useful if you want to revisit the message. If the Delete option is selected, VoiceOver will say that the message will be deleted from all your devices.

If you accidentally delete a text message, go to the main messages screen and activate the Edit button in the upper left corner. Swipe right until you hear VoiceOver say “Show Recently Deleted.” A list of your deleted conversations will appear. Swipe down to the Recover option and select it. Select the Done button in the upper right corner.

Have you ever sent an iMessage and then wished you could unsend it? In iOS 16 you can do that, and you can also edit a message once it has been sent. Unfortunately, these new features require that both the sender and recipient are using iMessage on iOS 16. Unsend a message by doing a one-finger double tap and hold on the message area. You will hear three notes. Select the Undo option. Be aware that you only have two minutes to unsend a message.

Edit a message that has been sent by first doing a one-finger double tap and hold on the message. This time, select the Edit option. Make your edit and re-send the message. You can do up to five edits per message.

There is another very useful feature, the ability to report junk texts. If you receive a text from someone not in your contacts or from a number you don’t recognize, you can report it to your carrier and Apple. That option will appear when you go to delete the text.

When dictating a text, you can bypass Siri asking whether you want to send the message. Go to Settings>Siri & Search and select the Automatically Send Messages button.

Unfortunately, audio messages are not as easy to do in iOS 16 as in iOS 15. I quickly realized that on the message screen there was no button for dictating an audio message. To send an audio message after adding recipients, swipe to the Apps button and select it. Next, swipe right until you hear Audio. Select that option. The Record button is at the bottom of the screen.

There are two ways to activate the Record button. The first is to double tap the button to start recording and double tap again to stop. The other method is to double tap and hold on the Record button; when you are finished recording, release the button. Be aware that if you use this method, the message will be sent automatically. In either case, VoiceOver’s spoken feedback may be recorded into your message.

Mail

Mail now has some of the same features as Messages. For example, you can now unsend an email. Be aware that the window is very short, approximately 12 seconds, much less than the time allotted in Messages. The Undo Send button is just before the Compose button in the lower right corner of the screen; once time is up, it goes away.

When an email is open, the following buttons will be at the bottom of the screen: Delete, Move, Reply, and Compose. Selecting the Reply button brings up the standard list of options including Reply, Reply All, Forward, and Move Message.

Mail has some new options that are available by swiping up or down on the closed email. The Read Later option lets the Mail app remind you about the email. Select the Read Later option and a new screen will load with options to receive a reminder about the email. Options include Remind Me in One Hour and Remind Me Tonight.

Mail now has the ability to schedule the time an email is sent. After completing your email, triple tap the Send button. A menu will appear with several time options and a Send Later button. This button did not work well for me: I was able to set the date but not the time.

If you receive an email containing a rich link, it can be opened within the Mail app. Double tap and hold on the link.

Siri 

When Siri is activated, the tone you hear is lower pitched and not as intrusive as in previous iOS versions. You will hear this new tone whether you use the side button, “Hey Siri,” or dictation. It is now possible to add more emojis to your dictation. First, say the name of the emoji you want to add, followed by the word “emoji.” You will need to know the exact name of the emoji. I tried this feature with the following emojis: birthday cake, musical notes, red heart, and guide dog. It took several attempts to get the guide dog emoji; you really need to enunciate the word “guide.”

In iOS 16, you can now ask Siri which commands you can use in a particular app. This feature is called Siri Command Guidance. Ask Siri, “What can I do here?” and Siri will tell you what it can do in the current app. For example, in the Messages app, Siri told me that I could make a call, make a FaceTime call, and read my messages.

Siri’s new Automatic Punctuation feature adds punctuation as you dictate, so you no longer have to say the name of the punctuation you want. This feature works well most of the time. It is on by default; you can enable or disable it by going to Settings>General>Keyboard>Dictation>Auto-Punctuation.

Safety Check

Safety Check settings can be found by going to Settings>Privacy & Security>Safety Check. Here is Apple’s description of what Safety Check does: “If circumstances or trust levels change, Safety Check allows you to disconnect people, apps, and devices you no longer want to be connected to.” Below the description is a button labeled “Learn More.”

There are two ways to use Safety Check. The first, the Emergency Reset button, will remove access for all people and apps and also lets you review your account security.

The other option is Manage Sharing Access, where you choose who has access to your information. If you choose this option, a new screen will load with information about who has access. At the top of this screen are two buttons, People and Information.

In another effort to increase security, Apple will automatically download and install security updates.

Apple Music

Apple Music now lets you choose an artist as a Favorite. By doing this, you get personalized recommendations based on that artist as well as any new releases from the artist. Go to the Search tab and put the artist’s name in the Search box. Select the artist’s name in the results. On the next screen, near the top, is a Favorite button. Select the button and it will then say “Unfavorite,” so you can always change your mind.

There are new sorting options for playlists. Go to your library and select Playlists. On the next page, flick right to the Sort button and select it. Sorting options include By Title, Recently Played, and Recently Updated. Make your selection.

You also have the ability to sort the order of a specific playlist. Open the playlist and when the new screen loads, flick right to the More button and select it. On the next screen, flick right to the Sort By button and select it. Sorting options for the playlist include Title, Artist, Album, and Release Date.

Focus

The Focus feature has been significantly upgraded. Go to Settings>Focus. There are specific kinds of Focuses, including Sleep, Personal, and Workout; Do Not Disturb is also an option. There is a Set Up button next to each name. Below the list are two buttons that are on by default: Share Across Devices and Focus Status. The first lets a Focus be activated on all your devices: if you activate or turn off a Focus on one device, the change applies to all of them. Focus Status lets recipients know that you have notifications silenced.

You can also create a Focus by selecting the Add button in the upper right corner of the screen. When the new screen loads, there is another list of options including both Fitness and Mindfulness. Make your selection. The app will then guide you through creating your Focus. The Back button is in the upper left of the screen and the Done button is in the upper right.

Conclusion

Upgrading to iOS 16 or iPadOS 16 is very worthwhile. Remember to back up your device first.

Eschenbach Haus Call Telelowvision Program

Steve Kelley

Here's a troubling statistic: Vision Serve Alliance (VSA) reports that only 3% of individuals who might benefit from vision rehab services actually receive them. Think about that for a moment… only 3 out of every 100 people needing vision rehab services receive them.

The reasons for this are complicated, but nearly everyone agrees there is a shortage of vision rehab professionals and low vision doctors to serve the growing need. Imagine for a moment that a virtual house call were available with a vision rehab professional—someone trained to do a low vision assessment and make suggestions for devices that might be useful with low vision: magnifiers, reading glasses, glare filters, video magnifiers, etc. And what if, after the assessment, these devices could be demonstrated right in the home or workplace, where they could be tried out to see if they'd be useful before buying them?

Now that's really dreaming, isn't it? Maybe not... Eschenbach's new Haus Call Telelowvision Program is designed to make this a reality: it enables the professional to meet virtually with a client who might live a great distance away or be unable to get into the office or clinic, provide a virtual assessment, and introduce them to aids and appliances useful with vision loss.

How does it work?

Haus Call can be set up with a vision rehab professional through Eschenbach. If a client or patient wants to find a doctor or vision rehab provider that supports Haus Call, Eschenbach customer service can get them connected. Each virtual house call includes two kits—one for the initial assessment, and a second with recommended devices to try. Both kits have the option of including an iPad with preset, HIPAA-compliant meeting software for videoconferencing, enabling the patient or client to work virtually with the vision rehab professional.

The Assessment Kit

My assessment kit arrived by UPS in a hard-sided case with a handle and an accessible lock. Directions for the lock were emailed, with a link to a YouTube video on how to unlock the case. The case included:

  • Near vision acuity test
  • Distance acuity test
  • 4 glare filters
  • iPad, which opened to a video-conferencing app
  • hands-free iPad stand
  • Patient instructions

Both near and distance acuity tests contained a premeasured cord to establish the correct viewing distance—16 inches for the near viewing assessment, and 10 feet for the distance viewing acuity chart.

Using the large print instructions provided in the kit, and with a pre-appointment phone call, the client is prepared to work through each of the assessments under the guidance of a vision professional connected and observing through the camera on the iPad. A stand for the iPad is included in the kit, so the iPad can be positioned hands-free, allowing the professional guiding the assessment to observe and make suggestions with the various assessment tools.

For Haus Call clients who may be unfamiliar with an iPad, a preliminary call before the meeting will help them get oriented to the iPad and the meeting software. The iPad powered up to a home screen with the Eschenbach logo and an icon of a video camera. Touching the camera presented two tabs—one to Start a Meeting, and the other to Join a Meeting. Although the home screen was simple and the icon fairly large, there was no additional accessibility incorporated into the software, such as the VoiceOver support found in Zoom. Siri could be turned on following visual prompts, and with it enabled, VoiceOver could be turned on if necessary for a client or practitioner accessing the iPad with a screen reader.

With the assessments completed—near vision, distance vision, and glare filters—the practitioner uses the assessment results as a guide to order devices that will be useful vision aids. Eschenbach has a wide range of devices that can be ordered for the client to try in the home: readers, handheld magnifiers, monoculars, portable and desktop video magnifiers, glare filters, and more. When finished, the client packs up the Haus Call assessment kit, locks the case, attaches the enclosed shipping label, and calls Eschenbach to arrange for UPS pickup.

The practitioner uses the findings of the assessment to order aids and appliances appropriate for the client, directly from Eschenbach, to be shipped to the client's home for a trial period.

Based on the results of this writer's assessments, the following items were requested from Eschenbach for an in-home trial:

  • 5X handheld magnifier with light
  • 6D reading glasses
  • Yellow glare filter
  • Handheld video magnifier

My request was followed up with a clarification about the handheld magnifiers and a suggestion that both the readers and the handheld magnifier be 'bracketed,' meaning that in addition to the ones requested, devices in a size or magnification power above and below the one requested would also be included. For example, not only was the 5X handheld magnifier ordered, but a 4X and a 6X were also ordered so each could be tried to fine-tune the best match. This is what a client or patient might experience during a clinic evaluation—the option to try several similar devices for the best match.

Haus Call Demo Devices

Like the initial Haus Call assessment kit, the trial devices were shipped in a hard plastic case with an accessible padlock. Instructions for unlocking the padlock were emailed prior to the kit's arrival.

Carefully packed within the case were the following items:

  • 4X, 5X, and 6X handheld magnifiers
  • 7D and 9D readers
  • 2 yellow glare filters (size small and large)
  • 1 handheld video magnifier (the Eschenbach Smartlux Digital)

Also included in the kit were an iPad and hands-free stand, like those in the initial assessment kit, set up for a video conference for any questions or instructions related to the devices sent. The near acuity chart, instructions, and a return shipping label were also included in the case.

The addition of the bracketed items really made this experience similar to what might be expected in an office or clinic, with different items of various powers and sizes to try. Although the vision rehab professional is available virtually rather than in person for both the assessment and the device trials, one benefit of the virtual Haus Call is that the devices are tried out in the home environment where they will be used. It is not uncommon for a client to try something in a clinical environment, only to find that in their home, with different lighting conditions, it is not as effective, or that another power would be the better choice.

Once the in-home trial is complete, the devices are packed back up in the case, the shipping label is put on the case, and Eschenbach is called to arrange pickup. Any devices the client found helpful may be purchased directly through the practitioner.

Wrapping it Up

Eschenbach's Haus Call was convenient and provided the tools necessary for both a virtual assessment and a hands-on trial of beneficial products afterward. The iPad used during the assessment provided appropriate support for a functional assessment and for determining which devices might be helpful aids. Shipping the devices directly to the client's home for a trial is, in many ways, a better alternative to trying them only in a clinical environment, because conditions in the home will be different, and that is where the devices ultimately need to work well for the client.

After trying out the devices, clients have the option of keeping any of the devices in the kit by purchasing them through their doctor or vision rehab professional. The devices may be kept from the kit, or returned with the kit and ordered for later delivery. It's important to note that Eschenbach does not sell directly to the client or patient; all sales go through the vision professional. For more information, contact Eschenbach customer service at (800) 487-5389 or email info@eschenbach.com.

What's New in iOS 16: New and Updated Accessibility Features

Judy Dixon

Once again Apple has released the latest updates to some of its operating systems. This time it was iOS 16 and watchOS 9, and both come with many new and exciting features. iPadOS 16 will not be released until later this year.

iOS 16 is compatible with all iPhone models back to the iPhone 8. It drops support for several older devices, including the iPhone 6s, 6s Plus, 7, and 7 Plus, as well as the original iPhone SE and the iPod Touch.

Here is a rundown of major new accessibility features that have been added to iOS 16.

Door Detection

Door Detection is available on Pro-model iPhones that include LiDAR capability, beginning with the iPhone 12 Pro. Door Detection is part of the Magnifier app; a four-finger triple tap is the gesture to bring up Detection Mode.

There are three buttons in the Detection Mode section of the Magnifier app: People Detection, Door Detection, and Image Descriptions. You can turn on any combination of these, and it is possible to have all of them on simultaneously, but the resulting cacophony of information can be a bit overwhelming.

Door Detection gives information about the door such as Swing or Slide and attempts to provide details of how the door works such as "Turn handle or knob."

If you double tap the End button at the top of the Detection Mode screen, you will be on the main screen of the Magnifier app. There is a Settings button in the lower left corner of this screen. This brings up a Context Menu with another Settings button. Here you can rearrange the elements that appear on the main screen of the app. Below this section is a Detectors heading. Here you can change the settings for the three types of detection.

In Door Detection settings, you can set the unit of measurement to either feet or meters. You can adjust the Sound Pitch Distance, the distance at which the app gives a higher-pitched indication; the default is six feet. You can set the type of feedback (sound, speech, or haptics) and have any or all of these. You can select a color to outline detected doors; the default is white. You can set Back Tap to On or Off; when on, you can double tap the back of the phone to hear more information about a detected door. In the Descriptions section, you can set Door Attributes on or off to get details about the door such as color, material, or shape, and there is a switch for Door Decorations, which will read any signs or text on or near the door.

New Languages and Voices

iOS 16 adds speech for more than 20 new languages including Bangla (India), Bulgarian, Catalan, Ukrainian, and Vietnamese. In English, there are many new voices. Among the most exciting is Eloquence: the speech that so many blind people use on computers running Windows has now come to the iPhone. There are other high-quality voices for U.S. English, including Evan, Nathan, and Zoe. In addition, there are dozens of new voices for other English-speaking regions and for other languages, and a few novelty voices including Bells, Bubbles, Jester, and Superstar.

Siri

Siri has several new features with accessibility implications.

New Siri Sound

Siri has a new sound. It has a much lower pitch and is designed to be easier for people with common kinds of hearing impairment to hear.

Tell Siri to Hang Up

You can now say "Hey Siri, hang up," and Siri will hang up a phone call. This works even if you don't have "Hey Siri" enabled.

Control the Amount of Time Siri Waits

You can now set the Siri Pause Time. This is the amount of time that Siri will wait for you while you are speaking. In Settings, Accessibility, Siri, the options for Siri Pause Time are Default, Longer, and Longest. Default is about 2 seconds, Longer is about 3 seconds, and Longest is about 4 seconds.

Use Siri to Turn Auto-answer Calls On and Off

You can now tell Siri to turn Auto-answer Calls on or off. When Auto-answer Calls is enabled, the phone will answer a call after 3 seconds of ringing; you can change this default time in Settings, Accessibility, Touch, Call Audio Routing. Simply tell Siri "Turn auto-answer calls on," or tell her to turn it off. This affects phone calls as well as FaceTime calls.

Voice Control Spelling Mode

You can now dictate names, addresses, or other custom spellings letter by letter using Voice Control's spelling mode. When in Voice Control's Dictation mode, you can invoke spelling mode by saying "spelling mode." Spelling mode is turned off when you return to Dictation mode.

Dictation

The iPhone will now automatically punctuate your dictated text. You no longer need to say "comma," "period," "question mark," and so forth while dictating; the iPhone adds this sort of punctuation automatically.

You can also now dictate emojis, but you have to say the word "emoji." Saying "smiling face emoji" gets you the smiling face with smiling eyes and rosy cheeks emoji; saying "sad face emoji" gets you the sad, pensive face emoji. These new dictation features are only available on the iPhone XR and later.

Rotor Actions

In Settings, Accessibility, VoiceOver, Verbosity, Actions, you have the option to have VoiceOver do something such as speak, play a sound, or change pitch when rotor actions are available. Now, First Item Only has been added as an additional option so that VoiceOver doesn't repeatedly say "Actions available" when you are in an app such as Mail, where actions are almost always available.

When this option is enabled, you will only be alerted when the availability of actions changes.

Sound and Haptic Feedback in Maps

When you request walking directions in Maps and you have VoiceOver enabled, Maps will quietly ping several times and vibrate to indicate the beginning of the walking route. This feature is turned on automatically when VoiceOver is detected.

Live Captions

This feature displays the text of audible speech in Phone and FaceTime calls as it is happening. Live Captions is currently labeled as beta and is off by default. This feature can be enabled in Settings, Accessibility under the Hearing section. You can set the appearance of the text by enabling bold text, and setting text size as well as foreground and background colors. You can also set the Idle Opacity to reduce the visibility of the Live Captions button when not in use. Instead of enabling it globally, you can optionally enable it for specific apps. At the moment, only FaceTime and RTT are available as apps in which Live Captions can be specifically enabled.

When you are in a call with Live Captions enabled, there is a Live Captions button just to the left of the keypad row that includes 7, 8, and 9. By default, this button is collapsed. If you double tap it, the button is expanded and the text of the conversation is displayed across a single line. At the moment, it is difficult to scroll to the current point in the conversation and it is not possible to review or save the text after a phone call.

Live Captions requires iPhone 11 or later and is only available in English in the U.S. and Canada.

Custom Sounds in Sound Recognition

In iOS 15, the Sound Recognition feature was added so that a hearing-impaired user could be alerted to specific sounds, but the feature could recognize only a small number of sounds. Now, in iOS 16, the ability to add custom sounds has been added.

Sound Recognition is off by default. When you enable it for the first time, the iPhone will download the feature and tell you how many megabytes of storage it is using. To turn it on, go to Settings, Accessibility, and you will find Sound Recognition in the Hearing section.

By default, no sounds are selected. To add sounds, press the Sounds button and you will find four categories of sounds: Alarms which includes fire, siren, and smoke; Animals which includes cat and dog; Household which includes appliances, car horn, doorbell, door knock, glass breaking, kettle, and water running; and People which includes baby crying, coughing, and shouting. You can add any of the pre-selected sounds by turning the switch from off to on. For each sound, you will have options for how you want to be alerted to the sound.

In two of the categories, Alarms and Household, there is an option to add custom sounds. When you activate one of these buttons, you will be prompted to play the sound that you want the feature to recognize.

Hanging Up the Phone

It is possible to hang up the phone by pressing the side or lock button. To enable this, go to Settings, Accessibility, Touch, and be sure that Prevent Lock to End Call is set to Off.

Apple Watch Mirroring

This feature allows you to fully control your Apple Watch from your phone. You can even use other accessibility features such as Switch Control, Voice Control, and so forth.

To turn on Apple Watch Mirroring, go to Settings, Accessibility. You will find Apple Watch Mirroring in the Physical and Motor section.

When you turn this feature on, the Apple Watch screen is replicated on the iPhone. It is fully usable with VoiceOver. Along the right edge of the phone's screen is a Digital Crown button and a Side Button button. This feature is very handy for someone having trouble reading the Apple Watch, or if you want to make a screen recording of an app's behavior on the Apple Watch.

Haptic Touch on the iPhone Keyboard

You can enable a very subtle haptic feedback on the iPhone's keyboard. It is off by default. To turn it on, go to Settings, Sounds and Haptics, and it is under Keyboard Feedback.

Startup and Shutdown Sound

Technically not an iOS 16 feature, but all models of the iPhone 14 now have a startup and shutdown sound. This is off by default. To turn it on, go to Settings, Accessibility, Audio/Visual, in the Hearing section. When the phone is turned on, a prominent two-tone sequence sounds as soon as the Apple logo appears. When you turn the phone off, the same tone sounds just as the phone shuts down.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.


AccessWorld News

APH ConnectCenter and NSITE Deliver Resources and Access to Job Opportunities to Create a Diverse and Inclusive Workforce for Job Seekers

American Printing House for the Blind ConnectCenter and NSITE have partnered to create the Job Seeker's Toolkit, an accessible, self-paced, free online training course that helps students and job seekers who are blind or low vision develop career exploration and job-seeking skills. The course provides five 60-minute sessions that cover self-awareness, career exploration tools and resources, the preliminary employment process, the interview, and maintaining employment. The toolkit can also be used by professionals to follow their students' progress and provide feedback.

This collaboration between APH ConnectCenter and NSITE will help bring employment resources to a habitually underserved demographic in the way that works best for them.

“We have an entire resource library of tools for blind and low vision adults including bringing the Job Seeker’s Toolkit to those who are seeking jobs today,” notes Olaya Landa-Vialard, APH ConnectCenter Director. “Working with NSITE’s team, APH CareerConnect is able to best engage and bring the Job Seeker’s Toolkit to a larger audience and in a format for today’s learning style, whenever and wherever the candidate is in their job search.”

Blind and low vision job seekers who visit the APH ConnectCenter will find a link to the Job Seeker’s Toolkit, which will send them to NSITE U. While there, users will find multiple programs for employment preparation and a regularly updated job board.

"NSITE offers workforce training and development programs and a NSITE Connect job board specifically for blind and low vision individuals," said Jonathan Lucus, NSITE Executive Director. "By collaborating with APH's team and their 160+ years of providing resources to this community, we can ensure that talented people with blindness or low vision have equal access to jobs in today's competitive work environment."

The new Job Seeker's Toolkit features five 60-minute sessions designed to be taken in sequential order, as the content builds on each prior course. However, each module can also work as a stand-alone course:

  1. Self-Awareness
  2. Career Exploration: Methods & Resources
  3. Finding Employment
  4. The Interview
  5. Maintaining Employment - Advancing Your Career

To participate in the sessions and learn more, go to: https://communities.nonprofitleadershipalliance.org/nsitecommunity/home

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.
