Full Issue: AccessWorld January 2023

Editor's Notes: A Focus on Braille

Dear AccessWorld readers,

As always, for our January issue, we celebrate the month with a braille-focused issue in honor of Louis Braille's birthday. It is an exciting time to be a braille reader: over the past few years, braille has become more accessible than ever. With the development of low-cost braille displays such as the Orbit Reader and Braille Me, along with long-awaited progress in multi-line braille, including the launch of the Canute multi-line display and the upcoming dynamic tactile tablet from APH, it is the best time to be an electronic braille reader.

As someone who grew up reading hardcopy braille, I've seriously missed the multi-line format it provides, so the fact that we are seeing tangible developments in multi-line displays (displays that are actually coming to market!) is heartening.

In case you missed it, our entire archive of AccessWorld Premium Content published to date is now available for easier viewing on our website.

I would like to once again thank our authors for their hard work producing articles for this month's issue; AccessWorld is what it is today thanks to their effort and dedication. I would also like to thank you, AccessWorld readers, for reading our magazine each month. Knowing that there is an audience out there that derives value from what we publish is a great gift.

Sincerely,

Aaron Preece

AccessWorld Editor in Chief

Check out the January 2023 Issue now!

Braille Codes and Characters: History and Current Use - Part 1

Judy Dixon

Part 1 of this article will discuss the history of braille codes in the United States with particular emphasis on how braille became usable with computers. This encompasses both braille as it is sent from a computer for embossing, and braille as it is read from a computer with a refreshable braille display. How did the braille codes change to make these things possible?

Part 2 will look at how braille is used with screen readers on computers and mobile devices. What braille codes are available to access the contents of a computer screen? How do these codes make it possible for a braille reader to pursue different tasks?

Introduction

To be a useful tool for reading and writing, braille must be able to represent, to the extent possible and practical, the array of printed material that a blind person needs to pursue education, employment, hobbies, and all the other activities of everyday life. With only six dots in a braille cell, representing the more than ten thousand print characters found in today's world has become a real challenge.

The codes used to display this vast array of characters must be comprehensive, able to represent printed material accurately and completely. But a code must also be straightforward and understandable to the blind person using it. It would be a simple matter to devise a code representing thousands of characters, but making that code easy to learn and memorable is definitely not a simple matter.

Technology has helped by increasing the number of dots available, but it has also significantly increased the complexity of material to be represented. Nevertheless, today, braille users are reading literature, science, math, music, knitting patterns, computer programs, and much more all in braille.

This article presents a brief history of the braille codes in the United States and discusses how they are used by screen readers on computers and mobile devices to provide braille readers the greatest possible access to the vast world of print.

Early Braille Codes in the U.S.

For well over a century, blind people read braille only as it was embossed on paper. Every braille character had no more than six dots. There were only a few different braille codes, used to represent different languages; for the most part, there was one braille code per language.

Slowly, over time, the braille codes used throughout the world evolved. In the United States, contracted (grade 2) braille, much as we know it today, was adopted in 1932, though a simpler form, grade 1-1/2, with about 50 contractions, remained in common use until the middle of the 20th century. With the publication of English Braille, American Edition in 1959, contracted (also then called grade 2) braille became the accepted code. This code had 189 contractions, and virtually all literary braille reading material published in the United States was produced in it for the next 57 years.

ASCII Braille

In the 1960s, a braille code for sending text from a computer to a braille embosser was developed. This code was based on ASCII, the American Standard Code for Information Interchange, which was originally published as a computer standard in 1963. This braille code is referred to as North American ASCII braille, but also as the MIT Braille Code, because MIT had a hand in its development, creating it for use with their braille translator, Dotsys, and their embosser, the Braille Mboss.

Initially, the 64 possible dot combinations of six-dot braille were mapped onto the 95 printable ASCII characters. The 26 letters and the 10 digits were assigned to their ASCII counterparts; other characters, such as plus and minus, were assigned the dots that were then in use; still others, such as left and right parentheses, were assigned logical, mirror-image combinations. Many of the rest received essentially arbitrary assignments.
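For readers curious how such a mapping looks in practice, here is a minimal sketch in Python. The table is a small, illustrative excerpt only; the dot assignments shown for these particular characters follow the standard North American ASCII braille table, but the full code covers all 95 printable characters, and the function name and structure are simply for demonstration.

```python
# A partial, illustrative mapping from printable ASCII characters to
# six-dot braille cells, written as tuples of dot numbers (1-6).
ASCII_TO_DOTS = {
    "a": (1,), "b": (1, 2), "c": (1, 4),         # letters keep their braille cells
    "1": (2,), "2": (2, 3),                      # digits use "dropped" letter cells
    "+": (3, 4, 6), "-": (3, 6),                 # signs kept the dots already in use
    "(": (1, 2, 3, 5, 6), ")": (2, 3, 4, 5, 6),  # a mirror-image pair
}

def to_dot_patterns(text):
    """Return the dot pattern for each character (letters lowercased first)."""
    return [ASCII_TO_DOTS[ch] for ch in text.lower()]

print(to_dot_patterns("ab("))  # [(1,), (1, 2), (1, 2, 3, 5, 6)]
```

Notice how the parentheses are literal mirror images of one another: dots 1-2-3-5-6 versus dots 2-3-4-5-6.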

Later in the 1960s, ASCII was extended to an eight-bit code, which meant that there were 256 possible characters. Despite the expansion of the printed computer character set, the braille code, a six-dot code, remained at 64 characters until the early 1970s, when braille embossers capable of using an eight-dot braille code became available from Triformation Systems, Inc. These embossers, based in part on the Braille Mboss developed at MIT, added dots 7 and 8 to the bottom of the braille cell and were capable of embossing braille from computers using eight-bit ASCII.

In the United States, there is wide agreement on the mapping of braille characters to the lower 128 characters of ASCII but there is no accepted standard for which braille character stands for what character in the extended ASCII character set from 128 to 255.

Refreshable Braille Displays

Refreshable braille displays made their first appearance in the late 1970s. The very early devices, the Elinfa Digicassette and the Telesensory VersaBraille, had six-dot braille cells. These were stand-alone devices with limited note-taking and document-reading capabilities.

But soon, as the general public began regularly using computers, the idea of using a braille display to access the contents of a computer screen caught on. Six-dot braille cells were not capable of accurately showing the text on a computer screen without braille translation software in between. If the text were translated into contracted braille, the one-for-one relationship between the characters on the screen and the braille cells would be lost. The solution was to create braille displays with eight-dot cells. Eight-dot cells quickly became the norm and remain in use today.

Dots 7 and 8, together or separately, were used to provide information about the character being displayed that could not otherwise be shown with the normal six dots. For example, it is a fairly common practice to use dot 7 on a refreshable braille display to show that a letter is capitalized, and to use dots 7 and 8 to indicate that the character is in a certain type form, such as bold, highlighting, etc. Most screen readers use dots 7 and 8 together to indicate the presence of the cursor.
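As a sketch of this convention (not any particular screen reader's actual implementation; the function and parameter names are hypothetical), dots 7 and 8 can be thought of as attribute flags layered onto an ordinary six-dot cell:

```python
# Illustrative only: how dots 7 and 8 can layer attribute information
# onto a six-dot cell on an eight-dot refreshable braille display.
def display_cell(base_dots, capital=False, cursor=False):
    """Return the sorted set of dots actually raised for one cell."""
    dots = set(base_dots)
    if capital:
        dots.add(7)            # dot 7 commonly marks a capital letter
    if cursor:
        dots.update({7, 8})    # dots 7 and 8 together commonly mark the cursor
    return sorted(dots)

print(display_cell((1,), capital=True))   # capital 'A' -> [1, 7]
print(display_cell((1, 2), cursor=True))  # cursor on 'b' -> [1, 2, 7, 8]
```

The key point is that the underlying six dots never change; the extra two rows carry metadata about the character rather than the character itself.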

The Computer Braille Code (CBC)

The fact that computers and braille embossers were using an eight-bit code but braille books were still printed in a six-dot code was not a problem for computer programmers who were the primary users of computer-related information in the 1970s and early 1980s.

However, in the late 1980s, things began to change. Books and magazines for the general public began to contain what had been thought of as computer-related material: filenames, websites, email addresses, and so forth. Blind people who used braille displays could read web addresses and filenames easily, but there was no good way to show such material on paper. The literary braille code of the day was not up to the challenge.

The Braille Authority of North America (BANA) decided that what was needed was a Computer Braille Code. The purpose of this code was to represent, in six-dot braille, on paper, the computer-related text that had become common in everyday literature.

The 26 letters of the alphabet would be the same in the computer braille code as in literary braille, but the characters would stand only for the letters themselves. The letters would be used to spell words. No contractions would be used. Every letter, number, and punctuation mark would have its own separate meaning so that there would be no ambiguity, precisely as was done in ASCII.

Because there are only 64 possible dot combinations in a six-dot braille cell and there were 95 printable characters in ASCII, some characters had to be assigned a two-cell sequence. This was accomplished by using a prefix of dots 4-5-6, which appeared as the first cell in all of the computer braille code's two-cell symbols. Dots 4-5-6 indicated to the reader that the next cell was something different from what it would have been without the prefix. If it was a letter of the alphabet, that meant it was uppercase. If it was dots 4-5, then it was a tilde and not a caret. If it was a second dots 4-5-6, then it was an underscore character. The same dot prefix was used to create indicators to go into and out of the Computer Braille Code so that it would be clear to the reader exactly what braille code was being read.
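The prefix mechanism can be sketched in a few lines of Python. This is an illustrative model only, not the published code table; the handful of cell assignments shown (letters, caret as dots 4-5, tilde and underscore as prefixed two-cell symbols) follow the description above, and the function name is hypothetical:

```python
# Illustrative model of the Computer Braille Code's dots 4-5-6 prefix.
PREFIX = (4, 5, 6)  # first cell of every two-cell symbol

ONE_CELL = {"a": (1,), "b": (1, 2), "^": (4, 5)}  # caret is dots 4-5 alone
TWO_CELL = {"~": (4, 5), "_": (4, 5, 6)}          # need the prefix to disambiguate

def encode(ch):
    """Return the braille cell sequence for one character."""
    if ch.isupper():                     # prefix + letter = uppercase letter
        return [PREFIX, ONE_CELL[ch.lower()]]
    if ch in TWO_CELL:                   # prefix + second cell = tilde, underscore
        return [PREFIX, TWO_CELL[ch]]
    return [ONE_CELL[ch]]

print(encode("A"))  # [(4, 5, 6), (1,)]
print(encode("~"))  # [(4, 5, 6), (4, 5)]
print(encode("^"))  # [(4, 5)]
```

The sketch shows why the scheme is unambiguous: dots 4-5 alone can only be a caret, while dots 4-5-6 followed by dots 4-5 can only be a tilde.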

In November 1986, BANA approved the Code for Computer Braille Notation for publication, and it was officially adopted in 1987. According to BANA, the goal was to "make the Code for Computer Braille Notation a realistic code, capable of unambiguous representation of current computer notation but flexible enough to respond to changing and demanding needs." (Braille Authority of North America 1987).

During the 1970s and 1980s, braille codes to represent music, math, and science continued their development. They too have evolved over time but a detailed look at these codes is beyond the scope of this article.

Unified English Braille (UEB)

In 1991, Dr. Tim Cranmer, chair of the committee that had developed the Computer Braille Code, and Dr. Abraham Nemeth, originator of the Nemeth Code for Mathematics and Science Notation, wrote to BANA to raise a red flag over the state of the braille codes. The literary, math, and computer codes all had different characters to show some of the same symbols. The alphabet, of course, was the same, but after that, things diverged rather quickly.

What was needed was to unify the braille code. BANA began working on code unification for the literary, math, and computer codes, everything except music. Soon, BANA realized that unifying English braille was not just a United States concern. BANA reached out to other English-speaking countries, and in 1993, the International Council on English Braille (ICEB) was formed. Over the next decade, work continued on developing and evaluating a unified English braille code. The overarching goal for this code was unambiguity, which was accomplished in all but a few minor ways.

In 2004, ICEB decided that Unified English Braille was sufficiently complete to be released as a viable and usable code for literary, math, and computer materials. In 2012, UEB was adopted as the code of the land by BANA with an implementation date set for 2016. All of the ICEB member countries are now using UEB.

With the adoption of UEB, the Computer Braille Code was no longer needed so it was declared obsolete by BANA. Because of its familiarity to many readers, the Nemeth Code for Mathematics and Science Notation continues to be an official braille code in the United States.

The North American ASCII Braille Code that was developed so long ago continues to be used with refreshable braille displays, and for sending braille from computers to embossers. Users who want to read a one-for-one representation of text on a computer screen can opt to use this code, usually referred to as Computer Braille. This term, however, should not be confused with the Computer Braille Code which had been created to represent computer-related material on paper, and is no longer in use.

The Mustang GT Amp from Fender—a case study in VoiceOver accessibility

Bill Holton

Two years ago, I used my renewed interest in playing the guitar to explore the ways YouTube, smartphone apps, and other modern technologies could make pursuing a hobby more accessible. Well, as my family is known to say, “He’s at it again.”

In this article I’m going to use those same guitars as a springboard to talk about the benefits and limitations of an ever-growing trend in the consumer electronics marketplace—pairing, accessing, and controlling a physical device with a smartphone app or smart speaker.

If you’re like me, you’ve already had experience in this realm. Myself, I operate my thermostat and yard sprinklers entirely via my iPhone and Alexa and/or Hey Google. Each of these offers me complete access to controls that would otherwise be touch screen only. At the other end of the access spectrum, however, when I needed to purchase a new washer and dryer I paid extra for smartphone and Alexa compatibility, and since you can’t test large appliances in the store, it wasn’t until after the set was delivered that I learned the smartphone app wasn’t accessible and Alexa offered few commands other than “Check status.”

In my very first article for AccessWorld, “Reaching Out: How You Can Help App Developers Improve Accessibility,” I discussed reaching out to developers when you encounter app accessibility issues. In the article I mentioned that the smaller the company, the easier this would likely be. Sure enough, when the initial release of the control app for my sprinkler system had a few VoiceOver quirks, I was actually able to speak to one of the developers, who fixed the issues in short order. The appliance company was a different story altogether. I tried several times to report my issues, but the response was, say it with me: “I will happily pass your comments and suggestions to the members of our team.”

So, back to guitars. My first guitar was an acoustic, but it wasn’t long before I, like Dylan before me, was ready to go electric. So I bought a relatively inexpensive used guitar, and an even cheaper amplifier to “make it go.” Needless to say, the amp sounded as terrible as my playing.

Obviously, I could cure both problems with a new, more powerful amp. After many hours of research comparing prices, features, and YouTube amp demos, I settled on Fender’s Mustang GT, an earlier model of their current Mustang GTX. The amp featured a collection of physical switches, knobs, and buttons, but it also connected via Bluetooth with the Fender Tone app, which offered smartphone access to and control of nearly every feature.

Unfortunately, no guitar store in the area stocked the GT or GTX, and the Fender Tone app required a Bluetooth connection to an amp to fully function. When I called Fender’s customer support, I could find no one who was familiar with VoiceOver. But I did find one customer rep who was willing to learn the basics from me over the phone and put the app through its paces. She spent nearly 30 minutes tapping and swiping with me listening in, and the app seemed quite accessible. That is, until I purchased the amp, set it up, and discovered the one hidden accessibility glitch that cut its functionality in half.

To understand the glitch, I first need to explain just a bit about how guitar players achieve the nearly infinite variety of sound profiles, usually called tones. First, most classic amps have their own distinctive sound. This sound can be modified using either the tone and volume controls on an electric guitar or the amp's onboard controls for EQ, reverb, and gain. Professional players take these controls even further, however, with at least one, and perhaps an entire series of, foot pedals that modify the signal in various ways to add effects such as tremolo, fuzz, tape delay, and compression. Each of these effects offers its own adjustments: a half-second delay, for example, versus a quarter-second delay. These are the sorts of adjustments that provide the bubbling tremolo of Tommy James’ Crimson and Clover, the in-your-face bulldozer of Black Sabbath’s Iron Man, and Jimi Hendrix’s, well, everything.

(Yes, I know all my music examples are quite old. But so am I, and—excuse me for a second—“Hey you kids out there! Get off my lawn!” OK, let’s continue.)

Most professional guitarists still use their favorite amps and collections of foot pedals to create their unique sound. However, for the rest of us, technology has provided a new option. DSPs (digital signal processing chips) have given manufacturers the ability to emulate the sound profiles of various amps and effect and modulation pedals, and to combine them into what’s known as a modeling amp.

The Fender GT and GTX are both modeling amps. They can emulate dozens of classic amplifiers, and offer a host of effects and modulation settings available both via the physical knobs and buttons and via the Fender Tone app, which is available for both iOS and Android.

The GT and GTX offer 100 preset tones, with names such as Basic Clean, Ethereal, and Country Deluxe. Each can be accessed either by a physical amplifier knob or by scrolling through the various choices on the main Fender Tone app screen. The default tones run the gamut from blues to rock to country to jazz. You can also log onto Fender’s Tone Cloud and download and install other company- and user-generated tones with descriptive names such as Almost Santana Smooth and Nashville Gospel.

Accessibility-wise, this is where the issues began.

Along with using the predefined tone profiles, you can create your own or modify a pre-existing tone. Each can be modified with any number of effects and modulations, and each of these allows at least one and often several parameters to be changed. For example, if you want to add Ambient reverb you can either accept the defaults or set specific parameters including Level, Decay, Dwell, and Tone. The trouble was, once you tried adding Ambient reverb to your effects chain, the parameter screens were utterly invisible to VoiceOver; you had to accept the defaults. Consequently, I could not modify a tone or create a new one without sighted help. I was left with a guitar amp with numerous options, which was good, but without the ability to create new tones or fine-tune any of the presets: a definite frustration.

I did reach out several times to Fender, and received the standard reply, basically describing their ongoing commitment to offering their users the best experience possible. A company exec also told me they had recently hired a subcontractor to help them with accessibility.

To my delight, they were actually working on the issue, and beginning with version 3.1.7 of the Fender Tone iOS app the parameter controls became visible. There were still a few bugs, such as the slide controls working but not reporting the results until you swiped away and then back again. But these were resolved in subsequent versions, and as of the latest release, version 3.2.3, they have all been successfully squashed. There are still times when I need sighted help, such as when I purchased the GT’s four-button footswitch and looper pedal and had to enable it on the unit itself. And the tuning option works via colored LEDs. But all in all my current experience is positive, and I have no hesitation recommending the Mustang GT or GTX.

I decided to cast my net a bit wider and try out a few other modeling amps with smartphone controls. Two others, in fact: The Spark Mini from Positive Grid and the Boss Waza-Air Wireless Personal Guitar Amplification System.

The Spark Mini features three simple physical control knobs. One controls amp volume; a second controls the volume of the music or backing tracks you play through the input jack from your phone or other music device; and the third controls the four default guitar presets: Rhythm, Lead, Solo, and Custom. You can set these four either individually or in groups using the Spark app. Unfortunately, the app is almost completely inaccessible. There are unlabeled buttons that do unexpected things, or nothing at all. Sliders don’t work with touch or report their status. Only by luck was I able to change any of the presets.

I did enjoy the simplicity of the hardware controls, and any players who stick to a few basic tone profiles might be happy with this model. The sound was surprisingly good for its size, and it is extremely portable. Basically, it’s the perfect beside-the-couch, pick-up-the-guitar-and-play-for-a-few-minutes amplifier.

I also tried the Boss Waza-Air, which is a headset-based practice amp with some rather unusual features. A Bluetooth dongle plugs into your guitar and broadcasts to a dedicated pair of headphones. The software has some interesting features, such as the ability to position the sound behind you, as though you were on stage, and lock the playback in place so that if you turn your head the sound remains stationary. It also offers an audible tuner that provides different pitched sounds depending on whether the string is flat or sharp. Unfortunately, the accompanying software was also mostly inaccessible. I was not even able to locate the tuning feature using VoiceOver.

I already use several app-controllable devices, such as my home thermostat, lawn irrigation system, and washer/dryer. But these products are aimed toward sighted consumers, who are used to dials and knobs. Consequently, too many developers try to duplicate that hardware experience by creating look-alike controls. Standard controls, such as edit fields, slide controls, and checkboxes, are usually quite accessible out of the box. But when developers create custom controls, such as the skeuomorphic dials and knobs on modeling amplifiers, washing machines, and other devices, they tend not to test for accessibility. Too often an accessibility overlay is required to meet W3C guidelines. Fender has taken this step with their GT and GTX amplifiers, proving it can be done. Positive Grid, Boss, and too many others have not.

The blind are a relatively small part of the total marketplace. Our total buying power is usually not enough, on its own, to sway companies toward positive changes. Without the persistent pressure of individual consumers and blindness organizations, there will continue to be precious little incentive for companies to place accessibility on their radar and keep it there.

If and when you reach out to a developer or company regarding their less-than-VoiceOver-friendly app, here are just a few resources you may also wish to pass along:

Information for Developers on how to Build Accessible iOS, iPadOS, Mac, Apple Watch, and Apple TV Apps | AppleVis

Supporting VoiceOver in your app | Apple Developer Documentation

How to make your app accessible using VoiceOver - Rootstrap

Apple Fitness+ Revisited

Janet Ingber

When Apple Fitness+ became available in January 2021, I wrote an AccessWorld article about it. Unfortunately, at that time, if a trainer gave instructions only visually, someone who could not see the video could not follow what the trainer was demonstrating. The app offered a wide variety of classes but the trainers did not give clear descriptions on how to do each movement or exercise.

In March 2022, Apple added “Audio Hints” to their Fitness+ workouts. This new feature is designed specifically to make workouts accessible to people who are blind or have low vision. VoiceOver speaks hints and descriptions that are not spoken by the trainers, though it does not describe everything. If you have the volume option in your VoiceOver rotor, you can change the VoiceOver volume from there. You can also change the volume with a two-finger quadruple tap, which opens VoiceOver settings; the volume control is there. When doing workouts, try raising VoiceOver’s volume and slowing down the speaking rate. Fitness+ is available on many devices; here is a link from Apple Support regarding which ones are compatible.

You do not need an Apple Watch to use Audio Hints. Audio Hints are given in your chosen VoiceOver voice and come through your device’s speaker. However, it is beneficial to have the watch, since you can get more information about your workout. Providing your weight and height will make Apple Watch more accurate with calories burned. Enter your weight and height in the Health app: on your iPhone, go to My Watch > Health > Health Details. For this article, I used an iPhone 13 mini with iOS 16.2 and an Apple Watch Series 6 with watchOS 9.2.

Subscribing to Apple Fitness+

Start by opening the Fitness app on your device. If this is your first time opening the app, there will be general information about fitness and setting goals. Eventually you will come to the main Fitness screen. At the bottom of the screen are three tabs: Summary, Fitness+, and Sharing. Select the Fitness+ tab.

On the next screen is a brief description of Apple Fitness+. There is a Continue button at the bottom of the screen. Below it is information about Apple’s use of data, followed by another Continue button.

On the following screens, there is information about the workouts. There are also buttons to Start Working Out. Apple Fitness+ costs $9.99 per month or $79.99 per year. Select the option you want. Apple Pay can be used to make the purchase.

The Home Screen

Once your purchase is complete, a new screen will load. There is a vertical scroll bar on the right edge of the screen. The first control is a More button. When you select it, a list of options is shown. They include Copy and Share. This same control will be at the top of some other screens as well.

The first part of the screen has the list of workout categories. They are Kickboxing, Meditation, HIIT, Yoga, Core, Strength, Pilates, Dance, Cycling, Treadmill, Rowing, and Mindful Cooldown.

Selecting any of the options above will load an information screen containing a brief description of the benefits of participating in the exercises from that category. Below the description is a More button. If VoiceOver is active, there is no need to use it.

Some of the categories include a short “Get Started” video. Flick right, on the screen, until you hear “Get Started.” VoiceOver says it is a button, but there is no button option in the rotor.

The Meditation category has a video describing nine themes of meditation and how they are organized into three categories: Renew, Connect, and Grow. The video explains the benefits of meditation.

The Get Started video for cycling explains how to set up your bike. Instructions are very clear. The Treadmill Get Started video discusses safety, adjusting your treadmill and body position.

The Rowing Get Started video talks about adjusting your feet before rowing. Then the instructor explains the order of rowing. He then talks about power during your workout.

When you select one of the categories, a new screen will load. It is the same basic screen for any of the category choices. The Dismiss button is in the upper left corner of the screen. Next is the workout category name followed by a Filter button. Since form controls are not in the rotor, just flick right. Using Headings navigation, you can quickly get through the filter options. When you find the heading you want, flick right to review options. Headings include Trainer, Time, Music, and Equipment. Make your selections and select the Done button in the upper right corner.

There are some variations in Filter categories. For example, the Yoga filter has Flow Style and the Strength category has Body Focus.

Further down the screen is a Sort button. Just below it is a list of your Filter choices. You can easily remove any option. The Sort button gives options for how your workouts are listed. They are by trainer, by time and/or by music. When you are choosing how to list, select the Dismiss Context Menu button. Below the categories list is a button that plays a video called “This Week, What’s new in 60 seconds.” It gives information about some of the new workouts.

Next is an extensive list of workout offerings. You can read them by flicking right, navigating by heading, or using the vertical scroll bar on the right. I found that when I navigated by headings, as long as I read the heading and swiped right to find out what was available, everything worked well. However, if I wanted to go back to previous headings, VoiceOver did not read those headings. All I heard was “Explore Image” and “Activate.” If you encounter this issue, swipe left or right and then go to the previous or next heading. Information changes regularly, so it is worth revisiting the main screen.

Apple Fitness+ has individual workouts and Collections. According to Apple, Collections are “Curated from the library to help you go for a goal.” For example, on one day, Collections included Pilates for More Than Your Core, 30-Day Core Challenge, and 6 Weeks to Restart Your Fitness. The Pilates workout had 6 episodes, the 30-Day Core Challenge had 28 episodes and the 6 Weeks to Restart Your Fitness had 21 episodes. Categories for this workout were Strength, HIIT, Core, and Yoga.

Further down on the Home screen is an Artist Spotlight heading. In this section, a particular artist is played throughout the workout. Examples include Taylor Swift, Keith Urban, Billie Eilish, and Jennifer Lopez. The list is extensive. Selecting the artist’s name will bring up a list of workouts associated with the artist.

Next is a list of additional workouts and programs. They can be navigated by headings. At the bottom of the screen is My Library. This is where your saved workouts are located.

Apple Fitness+ will make suggestions for you based on your workouts. For example, since I did a core workout, the app listed some other core workouts to try.

The Workout Home Screen

No matter how you choose a workout, the same basic information is on its home screen. The first item is a “More” button. Selecting it brings up options including Share and Add to My Library. Next is the type of activity and the trainer’s name, followed by a button to learn more about the trainer. This screen also has some of the same controls as the main home screen. Get back to the Home screen with the Back button in the upper left corner.

Next on the workout’s home screen is a button to add the workout to your Workouts list. The added workouts can be found in your library, which is on the main Fitness+ page. Next are the time length, type of music, episode number, and date. Accessibility information is the next item; however, this was not on all workouts. More accessibility information is at the bottom of the screen.

The “Let’s Go” and Preview buttons are the next items on the screen. Below these buttons is a description of the workout. Some of the exercises for the workout may be listed. I found that there were also times when no exercises were listed or only a few exercises were listed.

Below is a “More” button. VoiceOver reads the entire description although not all of it is immediately visible on the screen.

Next is an Apple Music heading. Below the heading is a “Listen to Music” button. If you are an Apple Music subscriber, selecting this button opens the Music app and you can add the playlist to your Apple Music library. Below the button is the list of songs for that workout.

The next part of the screen is a list of related workouts. This is followed by the list of available languages. Next is the Accessibility heading. It describes what the accessibility features do. For example, for VoiceOver the description is “Audio Hints are spoken descriptions of workout moves that a trainer only demonstrates visually.”

Doing the Workout

Once you have selected a workout by activating the Let’s Go button, turn your phone to landscape orientation so the charge port is to the right. The following controls will be on the screen: Close, Volume, Mute, Play, AirPlay, and Metrics. VoiceOver users do not need to use the Metrics button; metrics are automatically on the screen once the workout begins. At the same time as the screen loads, your Apple Watch will automatically bring up the workout screen. The workout can be started on your device or watch.

When your workout is over, the screen will display: Done, Share, Mindful Cooldown, Add to my Workouts, Total Time, Average heart rate, Active kilocalories, Total kilocalories, and Activity ring information. During the workout, tap your iPhone or iPad to check progress. Each time the trainer mentioned heart rate, VoiceOver gave me my heart rate information.

If you have a supported Apple Watch, there are additional workouts. Open the Workout app on your watch. At the top of the list of workouts is a new choice, Fitness+ Audio Workouts. These workouts are designed for running or walking while using your watch and AirPods or other Bluetooth device. The Mindfulness app on your watch includes Fitness+ Audio Meditations.

My Experience with the Current Apple Fitness+

Fitness+ has many, many workouts. The trainers are enthusiastic and welcoming, but they did not always clearly describe an exercise, and VoiceOver did not always provide that information either. The app includes modifications for different levels of exercise, but descriptions were not always given, and VoiceOver would sometimes speak the modification when the exercise was almost over. Some of the classes, such as Dance and Kickboxing, move very quickly, and there were times when I missed something important. It is not possible to go back a few seconds to hear what was said. I understand this might change the “flow” of the routine, but I think it would be useful. A sighted person who didn’t catch what the trainer said could simply look at their iPhone or iPad.

One option would be to have VoiceOver describe exercises in more detail; this would be especially helpful during the first round of an exercise. Another would be to list all the moves in the class, allowing someone who is blind or has low vision to learn about unfamiliar moves before doing the workout. A third option is for Apple to create a master file describing how to do all the exercises, available either as a download or directly in the app.

Conclusion

Although accessibility for Apple Fitness+ has definitely improved, it is not totally accessible for someone who is blind or visually impaired. My recommendation is if you want to try it, subscribe for one month to start. If I were sighted, I would definitely use this app.

A Review of Evidence 111, An Audio Game Optimized for People who are Blind or Have Low Vision

Jamie Pauls

In the fall of 2022, game developer Play By Ears released their first audio game in the United States. Entitled Evidence 111, the game uses no graphics and is designed as an audio-only experience. The developers did not target the title specifically at blind and low vision players, but realized that it would be perfect for that audience. To that end, they worked with members of the blind community to ensure the best experience for those who cannot see the screen.

In a recent Mosen At Large podcast interview with game developer Tom Oramus, we learn that Tom is an audio designer by trade. When playing through Evidence 111, it becomes abundantly evident that his skills are put to good use in the game. Many scenes are recorded in binaural audio. Assuming you are wearing a good pair of headphones or earbuds, there are times when it sounds as though characters in the game are moving around you instead of simply moving right to left or left to right. A thunderstorm is woven throughout the game, and this along with a great musical score drew me into the game instantly. The voice acting in this game matches the level of music and sound to present one of the best audio gaming experiences I have ever encountered.

Setting the Scene

Although the game costs $4.99 in the Apple App Store or the Google Play store, the introductory scenes are available for free, giving you an idea of what to expect. The story of Evidence 111 is set in England and begins in the year 1975. Officer Alice Wells receives a call on her radio and is told to be on the lookout for a fast-approaching vehicle. The scene quickly cuts to exactly ten years later. Now Chief Inspector Wells gets a phone call from an unidentified person who knows something about the events of that night ten years earlier that no one is supposed to know—something that Inspector Wells has kept hidden all this time. Alice Wells is told to go to an old hotel with a folder containing the contents of evidence 111, which relates to the events of that ill-fated incident in 1975. Once there, Alice meets a number of other people, all associated in one way or another with the hotel—and, in at least one case, with Alice herself.

The game takes about two hours to play through, depending on the choices you cause Alice to make. There are ten different endings depending on Alice’s decisions throughout the game. I have played through the game twice, but I plan to explore further as time permits.

Playing the Game

Once you have downloaded the free demo of the game and opened it, you will hear VoiceOver on iOS tell you that you are in a direct touch area. Turn VoiceOver off, and the high-quality game narration will begin giving you instructions on how to navigate the game. Sighted players can turn off this audio help, but the game is smart enough to know that VoiceOver is running at launch and turns on help for blind and low vision players automatically.

Swipe up with two fingers to repeat instructions from help at any time. Swipe right with one finger to start a new game, or swipe left to restore a previous game. Swipe down to purchase the game; you will need to turn VoiceOver back on at this point to complete the purchase. During game play, a two-finger swipe down pauses the game. A two-finger swipe to the left repeats the section of the game you are currently in or moves you to a previous section, and a two-finger swipe right moves you to the next section of the story.

Throughout the game, you control the actions of only one character--Inspector Alice Wells. At various points in the story, Alice will ask you what she should do. For example, “Should I explore the closet, or go downstairs to talk with the desk clerk?” Very occasionally, there are three choices to pick from. Swipe right with one finger to pick the first option, swipe left to pick the second, and swipe down to pick the third option.

My Thoughts On Game Play

As I played through the game, I was impressed at how responsive it was to my gestures. At no time did I encounter any sluggishness. Scene changes within the story are seamless; there are no pauses while a new area of the game loads. Music, sounds, and dialog are uninterrupted. Sounds within the game are very realistic without ever drawing attention to themselves, and the high-quality music tracks likewise do not intrude into the game play experience. The voice acting and the help for blind and low vision players are of the highest quality without exception.

The plot of the game is detailed and realistic. The story drew me in from the very start and kept my attention for the entirety of the game. All game play instructions were very clear, and all commands worked as expected. The only exception was one scene where it sounded as though I should have three choices but only two were available. I’m not sure if a bug prevented me from making the third choice, or if there were actually only two options available to me.

The ending scene is a bit of a cliff-hanger. I am not sure if the player is supposed to use their imagination to decide how the story really ends, or if there is the possibility of a sequel in the future. Either way, I am happy with how the game ended. I especially enjoyed listening to the full game credits read aloud as the musical score played out after I had finished the game.

Although the story involves some dark subject matter, there is little to no profanity in the game, and violence is only referenced, never depicted. As far as I can determine, there are no graphic sounds in the game that would disturb most people.

The Bottom Line

Evidence 111 is an audio game that is extremely well designed with a rich plot, excellent sound and music, and superb voice acting. At $4.99 U.S. it was well worth the price, in my opinion. Although the game is available on the iOS and Android platforms, I tested the game using an iPhone 12. I would encourage others to support the developers of this game in hopes that they will release more titles in the future. Perhaps they could branch away from detective fiction and explore other areas such as games with fight scenes. I can only imagine the sound design gems that could be hidden in a game of that sort.

In addition to the Mosen At Large podcast interview referenced earlier, you can also read about Evidence 111 on AppleVis. You can obtain Evidence 111 from the Apple App store or the Google Play store for $4.99 U.S.