Full Issue: AccessWorld May 2021

Editor's Page: Happy Global Accessibility Awareness Day!

Dear AccessWorld Readers,

Happy Global Accessibility Awareness Day! For the last 10 years, the third Thursday in May has been Global Accessibility Awareness Day (GAAD). Considering that the event started with a single blog post, it is impressive how widely it is now celebrated. Like many organizations, AFB is celebrating GAAD in several ways. We have produced a GAAD blog post containing AFB and GAAD resources. In addition, check out this month's AccessWorld News for information on the episode of our Inform and Connect podcast focused on GAAD.

As part of AccessWorld's celebration of GAAD, I thought I would take this space to share some useful NonVisual Desktop Access (NVDA) addons with you. An NVDA addon is essentially a packaged piece of code that adds features to NVDA. JAWS has had this functionality for decades in the form of scripts; for a sterling example of their power, see our review of Leasey in the July 2020 issue of AccessWorld. NVDA addons serve a similar purpose.
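
To make "packaged piece of code" concrete, here is a minimal sketch of an NVDA global plugin, the most common form an addon takes. The plugin, its message, and its keystroke are invented for illustration; the framework pieces (globalPluginHandler, ui.message, and the script decorator) are part of NVDA's addon API in version 2019.3 and later.

    # A minimal sketch of an NVDA global plugin; the name, message, and
    # keystroke are illustrative, not taken from any real addon.
    import globalPluginHandler
    import ui
    from scriptHandler import script

    class GlobalPlugin(globalPluginHandler.GlobalPlugin):
        @script(description="Announce a greeting from the example addon",
                gesture="kb:NVDA+shift+h")
        def script_sayHello(self, gesture):
            # ui.message routes text to the synthesizer and braille display.
            ui.message("Hello from the example addon!")

Packaged with a manifest and installed, those dozen lines would add a brand-new spoken command to NVDA, which is all an addon really is.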

Before I begin, I would like to welcome the newest AccessWorld author, Judy Dixon! You may know Judy from her role of Consumer Relations Officer at the National Library Service (NLS). You also likely know Judy from the many useful technology books that she has published with the National Braille Press, many of which have been reviewed here in AccessWorld. Judy's debut article discusses the uses of Lidar technology on the latest iPhones for people with vision loss.

As I mentioned, addons can add functionality to NVDA, but they can also be used to make programs more accessible or serve as programs in their own right. Installing an addon is as simple as can be:

  • Locate an addon package (most likely online).
  • Run it directly, or save the file and then run it from Windows Explorer.
  • After launching the file, you will be asked whether you would like to install the addon.
  • Once installation is complete, you will be prompted to restart NVDA.

That is all there is to it. There is one caveat to keep in mind when searching for addons online: starting with version 2019.3, NVDA uses Python 3, which means addons that have not been updated for that version of the Python programming language will not work with modern versions of NVDA. Generally, when you install an addon, NVDA checks for compatibility and lets you know if the addon will not work for you.
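
That compatibility information lives in the addon's manifest. The following is a hypothetical example; the two version fields are the ones NVDA's installer checks, though the values shown here are made up.

    name = exampleAddon
    summary = Example addon
    version = 1.0
    author = Jane Doe
    minimumNVDAVersion = 2019.3
    lastTestedNVDAVersion = 2021.1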

The main source for NVDA addons is the NVDA Addon Repository. This site is the official source for NVDA addons which means that addons from this repository should be safe to use and compatible with the most recent version of NVDA. There are other sources of NVDA addons online, but you are on your own when it comes to determining safety and compatibility. Below, I have listed a few addons that I find personally useful.

Instant Translate: This addon allows you to translate text on the fly. To use it, press the NVDA key (Insert or Caps Lock) + Shift + T. After this, you can press T again to hear a translation of the text you have selected, or press Shift + T to hear a translation of what is on the clipboard. To change languages and adjust other settings, go to the NVDA menu > Preferences > Settings; the Translate settings will be at the bottom of the list.

Speech History: This deceptively simple addon is one I have found incredibly useful. It allows you to copy whatever NVDA last spoke to the clipboard by pressing the F12 key. Considering how much of what a screen reader speaks you normally could not copy to the clipboard, this addon has many uses. In addition, you can press Shift + F11 and Shift + F12 to move backward and forward, respectively, through the last 100 items spoken by NVDA. This means you can cycle to something you would like to copy with these commands and then press F12 to copy it to the clipboard. It is the addon I personally use most often.
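
For the technically curious, here is a rough sketch of how an addon like this might work. This is not Speech History's actual source, and module paths vary between NVDA versions, so treat the details as assumptions; the general technique is to wrap NVDA's speak function, keep a bounded history, and copy on demand.

    from collections import deque

    import api
    import globalPluginHandler
    import speech
    from scriptHandler import script

    history = deque(maxlen=100)  # remember the last 100 spoken items
    _originalSpeak = speech.speak

    def _loggingSpeak(sequence, *args, **kwargs):
        # Record the plain-text parts of each speech sequence, then speak as usual.
        text = " ".join(part for part in sequence if isinstance(part, str))
        if text:
            history.append(text)
        return _originalSpeak(sequence, *args, **kwargs)

    speech.speak = _loggingSpeak

    class GlobalPlugin(globalPluginHandler.GlobalPlugin):
        @script(description="Copy the most recently spoken text to the clipboard",
                gesture="kb:f12")
        def script_copyLast(self, gesture):
            if history:
                api.copyToClip(history[-1])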

NVDA Remote Access: This extremely useful addon allows you to control another computer running NVDA from your PC. See my review in the March issue of AccessWorld for full details and information on how to operate the addon. As you would expect, an addon like this has many uses, from allowing an instructor to assist a student remotely to letting you manage multiple computers seamlessly. It probably seems obvious, but if you use this addon, be extremely careful about whom you give access to your machine, as it makes a very effective tool for anyone who wants to steal data from your device or otherwise has malicious intent.

LION, Live OCR Reader for NVDA: This addon is one of the most revolutionary I have seen in a while. In short, it is an automatic subtitle reader for NVDA. The addon performs Optical Character Recognition (OCR) continuously on the screen of your Windows 10 computer, and speaks only when it detects text that was not present in a previous scan. In addition, you can crop the recognition window in the options, allowing you to focus on the part of the screen where the subtitles reside; this keeps the addon from repeatedly speaking extraneous information and lowers your CPU load. If you would like to learn more and discuss the addon, this forum thread on the Audio Games forum is a good place to do so.

I have found the addon to work fairly well for its intended purpose, but also to be helpful in other situations. For example, a website I use has text that appears after performing a function, with no indication of when the next text will appear. I run LION while using the site, and as soon as text appears, LION begins speaking, letting me know I can read the new text. You may notice that LION stops speaking randomly; I find that toggling the addon off and on (NVDA Key + Alt + L) can fix the issue. I also find that the addon stops working or crashes more often if I have it performing OCR on the entire screen; narrowing the recognition window tends to fix that as well. The addon can take some fiddling to find the proper recognition window for the application you are using it with, but I have personally found it an invaluable tool.
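
The core idea behind LION, speaking only text that was not present in the previous scan, is simple enough to sketch outside NVDA. The following is illustrative only, not LION's actual code; it assumes the third-party mss and pytesseract libraries for screen capture and OCR, and the subtitle region coordinates are made up.

    import time

    import mss
    import pytesseract
    from PIL import Image

    # A hypothetical subtitle region near the bottom of the screen.
    REGION = {"left": 100, "top": 600, "width": 1000, "height": 120}

    previous = set()
    with mss.mss() as screen:
        while True:
            shot = screen.grab(REGION)
            img = Image.frombytes("RGB", shot.size, shot.rgb)
            lines = {line.strip()
                     for line in pytesseract.image_to_string(img).splitlines()
                     if line.strip()}
            for line in lines - previous:  # only text absent from the last scan
                print(line)                # a real addon would speak this instead
            previous = lines
            time.sleep(0.5)                # rescan twice per second

Cropping the capture region, as LION's options allow, shrinks the image handed to the OCR engine, which is exactly why it reduces both CPU load and spurious speech.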

Are there any NVDA addons or JAWS scripts that you use that you find particularly valuable? Let us know and we will be happy to share your suggestions with your fellow AccessWorld readers.

Sincerely,

Aaron Preece

AccessWorld Editor in Chief


Employment Matters: Jim Kracht, Retired Miami-Dade County Attorney

Deborah Kendrick

Sometimes the most extraordinary characteristic of a highly successful human being is the ability to present himself or herself as entirely ordinary. Not that there is anything exactly typical or average about Jim Kracht. There isn't. He's brilliant, excelling in both his personal and professional realms, and by no means content to rest upon past achievements. While those achievements are numerous and varied, what you notice most about Jim Kracht, whether just making his acquaintance for the first time or revisiting a decades-old friendship, is just what a genuinely good and warm person he is.

Five Faces of Change

The Diversity and Inclusion Committee of the Florida Bar Association launched a plan in 2020 to recognize outstanding lawyers over the past several generations who have expanded the boundaries of inclusion on the Florida legal stage. Five exceptional Florida attorneys were selected for the Five Faces of Change traveling exhibit, which showcases the attorneys' original portraits. The portraits are being created by student artists, and the exhibit as a whole will move from courthouse to courthouse throughout Florida, coming to rest permanently at the Florida Bar headquarters.

Each of the five attorneys selected for this prestigious exhibit served as a pioneer for one segment of the population, breaking through where barriers had kept others from participating. James Weldon Johnson, 1871-1938, who was the first Black lawyer to gain acceptance to the Florida Bar by examination, heads the pack. In addition to law, James Weldon Johnson excelled as a novelist, educator, and civil rights leader, best known for writing the poem "Lift Every Voice and Sing" which was set to music by his brother and is today often referred to as the Black national anthem.

The other four faces in the exhibit include a prominent female attorney, an attorney who is a gay rights activist, an appellate judge who was born in Cuba, and one Miami tax attorney who happens to be blind. That fifth face belongs to James Kracht.  Jim is recognized for being an outstanding attorney in the arena of tax and finance, as well as for his involvement as a mentor and leader for other blind lawyers, and for his advocacy work in disability rights, particularly in the area of voting accessibility.

Jim's primary professional role was as a tax attorney with the Miami-Dade County Attorney’s office, where he began as a fledgling lawyer fresh from Harvard Law School in 1975 and retired as section head of the firm's tax and finance area in 2012.

Growing Up

Jim and his twin sister were born prematurely and, like so many babies in the 1950s, were given too much oxygen while in incubation. Both lost vision as a result—in Jim's case, he lost all of it.

Growing up in California, Jim learned braille at an early age and his academic prowess was matched by summers rich in experiences with horses, bicycles, swimming pools, and more. While his childhood was idyllic by many standards, things got a bit bumpier by early adolescence. His parents divorced when he was ten, and he began dreaming of a future that was far from home. He turned down a four-year scholarship to Stanford University because it was too close to home, and rallied instead for the opportunity to travel to the other side of the country to attend Harvard.

Leaving home involved more adjustment than the usual adaptation to college life, though. As a high school senior, Jim fell in love. At 18, he married the love of his life, and together, Jim and Pat Kracht set out for Harvard.

Law School and Beyond

When Jim refers to his days in college and law school, he often uses the plural pronoun: “We worked to get permission to record lectures,” or “we were worried that everyone in class had found a job except me.” His “we” refers to himself and his wife, Pat, because they did so much as a team. Pat dropped out of college and became Jim’s full-time reader. Although Jim became an avid braille reader at an early age, he never became proficient with a braille slate and stylus. In law school, he recorded lectures on a heavy tape recorder, and then listened a second time to make notes in braille on his Perkins Brailler.

A degree from Harvard Law School, he thought, would certainly be a ticket to a great job, but one application after another led to a dead end. “I’d never encountered discrimination based on my blindness,” he says, “until I was looking for a job.”

Then, while he was studying for finals, alarmed that everyone in his class had found a job after law school except him, Jim got his fair chance. In a confluence of coincidences, each of which could have gone another way, he was offered a job as the 20th lawyer to join the Miami-Dade County Attorney’s office in Miami, Florida. He went to work in July 1975, and continued to shine and thrive in that role until his retirement in 2012.

It would be a decade before any significant technology, as we understand it today, would become available. Representing government entities in tax and financial matters clearly called for a calculator, and Jim did have one with synthesized speech. An IBM electric typewriter that typed braille was purchased, but there was little else in the way of technology in those early years. The firm paid for human readers. His work weeks were long and arduous. Braille paper was the primary tool of the trade. He recalls one uncharacteristically awkward moment in an early court appearance. He had inherited some cases from another lawyer, and found it necessary to take a few notes in court one day. His slate and stylus sat on a table atop a platform, and the echo of the punch, punch, punch of the dots was unavoidable. The judge boomed that there was an unknown disruption in the court. Jim explained that he was blind and taking a few notes. Opposing counsel interjected that it was no problem, and the judge moved on.

In the late 1980s, by contrast, when he got a Braille 'n Speak and found he could take notes on a tiny device and then send them to paper by connecting to a braille embosser, productivity soared and the job became far more manageable.

Like all attorneys, he had law clerks and secretaries to assist, and says that everyone in his office, the appraiser’s office with whom he often worked, and most of the judges were supportive and accepting of him. When access technology became more sophisticated around 1990, he wrote a proposal for equipment that would enable him to do his job more seamlessly. With a Navigator braille display, a computer, a braille embosser, and two Kurzweil reading machines, the time required from human readers could be diminished and Jim Kracht could practice law with efficiency closer to that of his sighted colleagues. The cost was shared by the Miami-Dade County Attorney’s office, the appraiser’s office, and Jim himself. Again, productivity increased.

The American Council of the Blind held its convention in Miami in 1977, and Jim Kracht, then a young rising star in law, was asked to speak. That launched a long commitment to involvement with the American Blind Lawyers Association (today known as the Association of Visually Impaired Attorneys), as well as other affiliates within the national organization. Over the years he has held leadership roles not only among blind lawyers, but also blind library supporters, braille enthusiasts, and more. He has served as president of the Florida Council of the Blind and has been a director on the national ACB board since 2018.

For decades, his law work focused exclusively on the business of Miami-Dade, ultimately supervising others as head of the firm’s tax and finance department. In 2000, however, his passion for the civil rights of all Americans and his dismay at the inequity of the voting process for people with disabilities prompted him to get involved in another kind of law. With the approval and support of his law office, he became a leader in the work that has gone on for the last 20 years to secure access to independent and private voting for all Americans, including those who are blind.

Life After Law

After 37 successful years, Jim Kracht was surprised to find that when the time came, at age 62, he was actually ready to retire. His pace, however, didn't slow so much as get redirected.

When his career clearly required a long-term stay in Miami, Jim had promised his wife that she could choose where they would live once he retired. In 2012, he reminded her of that choice. They had made a home in Miami and neither wanted to leave it. But their two children, now married and with children of their own, had settled near Orlando. The result was that the couple now have two homes and divide time more or less equally between them.

Jim speaks glowingly of his children, his grandchildren, and their many and varied shared family experiences. He continues to be involved with the American Council of the Blind and disability rights, particularly in the area of accessible voting. And, if anyone wonders whether his long career was financially rewarding, they might take a look at his most recent passion: collecting music boxes.

If music boxes conjure for you miniature statues or snow globes with tinkling tunes that take up a few inches of shelf space, Jim Kracht’s music boxes will expand your imagination considerably. I had the privilege of actually visiting a few of his prized possessions shortly before the publication of this article, and words almost fail.

Imagine an enormous piece of furniture, the size, say, of a large chest of drawers or china cabinet. Now, put discs in it that are filled with a complex array of impressions and that measure some 24 inches in diameter. Perform a bit of magic, step back, and be bathed in sound that is gorgeous, large, and reminiscent of times we can only imagine. He has smaller ones as well—snuff boxes with musical notes bright and beautiful—and can tell you the history and mechanical details of every piece he has acquired.

He began acquiring music boxes shortly after his retirement, purchasing the first one in early 2013, and says he has promised Pat that the most recent, purchased earlier this year, is his last. There are 48 in all, ranging in price from $1,200 to $30,000, and some dating back two hundred years. He has traveled the United States and abroad to see them and buy them, and has countless charming anecdotes to share about his pursuits.

Clearly, a man who collects music boxes is a bit of a romantic, and this truth was best illustrated by the surprise he planned for his wife on their fiftieth wedding anniversary. In an article published in a music box journal, he captures the tale of building the music box that was her spectacular surprise on that occasion. First, there was the beautifully handcrafted jewelry box. Then, an arranger was found who could put their favorite song from their courting days on a music box cylinder. The music box was inserted, a plaque engraved and, of course, an exquisite diamond ring designed to be housed in the box. Perhaps saying that he is a bit of a romantic is an understatement!

In addition to writing articles for other enthusiasts about his music box journeys, he now serves as vice chair of the southeast chapter of the Music Box Society International. He carries his passion and expertise in the realm of advocacy into this avocation, too, since he has been working steadily to obtain an accessible version of the Encyclopedia of Music Boxes, a directory published by a Canadian author.

He reads and writes daily on a HIMS Polaris and a Brailliant BI 40X from HumanWare.

Family is of paramount significance to Jim. He is always ready with fresh tales of adventures with his children and grandchildren, recounting Easter egg hunts or birthday parties or just time spent hanging out together.

It is not at all unusual to hear him comment on how blessed and beautiful his life continues to be.

When I asked him about advice for young people coming into the workforce, his response was a typically simple, unvarnished insight.

“I’ve had tremendous opportunities in my life,” he said, “but they didn’t just happen. I chose to make them happen. Even when it looked like I wouldn’t find work after law school, I knew I would. Because I had to work, to make a living.”

Jim Kracht has made far more than a living. This mentor for many has built a beautiful and exemplary life of family, friendships, vocation, and recreation—and richly deserves the place he has been assigned among Florida’s five legal faces of change. We can all learn from his warmth and his wisdom.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.


The Power of LiDAR Comes to an iPhone Near You

Judy Dixon

"Nine, eight, seven, six..."—my iPhone cheerfully counted down the number of feet between me and the person ahead as I moved up in line at the COVID-19 vaccination site. Of all places, I was so happy to be able to independently socially distance here. What made this electronic assessment of the distance between me and the next person possible is the latest amazing feature to come to an iPhone, LiDAR.

What is LiDAR?

In the fall of 2020, Apple released four new iPhone models. The two higher-end models, the iPhone 12 Pro and the iPhone 12 Pro Max, as well as the 2020 iPad Pro, included a new, highly touted feature called LiDAR. It was a bit of a mystery at first. We knew it had something to do with the camera, but Apple's description, "AR at the speed of light," sounded pretty sensational and didn't do a great deal to tell us how it was going to change our lives.

But LiDAR is definitely life-changing. LiDAR stands for “light detection and ranging,” and works by bouncing lasers off objects to very accurately measure their distance based on how long it takes for light to return to the receiver. It's like radar except that instead of radio waves, it uses infrared light.
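
The arithmetic behind that ranging is pleasantly simple: distance is the speed of light multiplied by the round-trip time, halved because the pulse travels out and back. A quick sketch, where the 33-nanosecond figure is an illustrative value rather than a published specification:

    C = 299_792_458  # speed of light, in meters per second

    def distance_meters(round_trip_seconds):
        # The pulse travels to the object and back, so halve the total path.
        return C * round_trip_seconds / 2

    # A pulse returning after about 33 nanoseconds implies an object
    # roughly 5 meters away, about the iPhone scanner's maximum range.
    print(distance_meters(33e-9))  # ~4.95 meters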

LiDAR technology has been around since the 1960s. It has been used in industries such as robotics, surveying, meteorology, and even in the space program. LiDAR is often used to make high-resolution maps. In fact, LiDAR came to the attention of the public in 1971 when it was used by astronauts to map the surface of the moon. It is also used in self-driving cars to judge the distance to cyclists and pedestrians, and by the guidance systems of robot vacuums.

There are many types of LiDAR sensors. The scanning LiDAR systems used by iPhones and iPads fire thousands of laser pulses at different parts of a scene over a short period of time. This type of scanner has a range of up to five meters (about 16 feet). The LiDAR scanner's data is analyzed along with data from the cameras and motion sensor, then enhanced by computer vision algorithms for a more detailed understanding of the entire scene. 

This means the iPhone's LiDAR scanner is designed more for room-scale applications, like games or IKEA's Place app, which lets customers move prospective furniture purchases around in their own homes so they can see how it will look. The LiDAR on iPhones is currently not accurate enough to create 3D scans of individual objects.

The iPhone's LiDAR scanner can also help the phone take better pictures, especially at night when combined with Night Mode. By sensing the distance between your iPhone and the subject you’re taking a picture of, the camera can figure out at what distance it should focus to get the best result.

But even with distance and other limitations, the LiDAR on iPhones can have huge benefits for blind users. Let's have a look at Apple's People Detection feature and three apps that are using LiDAR to provide information to blind users.

People Detection

One of the first practical uses for LiDAR came along in November 2020 with the release of iOS 14.2. Apple added a People Detection feature. This is the one I used at the vaccine center. People Detection is part of the Magnifier app, which is on the phone by default. It uses the back-facing camera to alert you to people nearby. It can help you maintain a physical distance from the people around you by keeping you informed of how far away they are.

You can invoke People Detection in several ways:

  • Open the Control Center. By default, Magnifier is in the Control Center near the very bottom. Double tap to open it and double tap the People Detection button near the bottom right corner of the screen.
  • Perform a VoiceOver gesture. By default, the four-finger triple-tap gesture turns People Detection on. To assign a different gesture, go to Settings > Accessibility > VoiceOver > Commands > Touch Gestures. When People Detection is on, this same gesture takes the focus to the End button at the top left of the screen.
  • Add People Detection to the Accessibility Shortcut menu. Do this by going to Settings > Accessibility > Accessibility Shortcut. If you already use the Accessibility Shortcut to turn VoiceOver on and off, adding another option to this menu will make things a bit more complicated. Using the menu to start People Detection works well, but when you use it to turn VoiceOver off you will have no access to the menu when you try to turn VoiceOver back on. The only option here is to use Siri to turn VoiceOver on.
  • Add People Detection to Back Tap. This lets you turn it on with a double or triple tap on the back of your phone. Do this by going to Settings > Accessibility > Touch > Back Tap. If you do enable Back Tap to turn on People Detection, you will need to be very careful whenever you set your phone down on a hard surface.

If you prefer to have an actual Magnifier app, go to Settings > Accessibility; you will find Magnifier near the top of the list. Magnifier is off by default. Double tap it to bring up a screen where you can turn it on: double tap Magnifier Off and it will become Magnifier On. Turning Magnifier on in Settings will add the app to your App Library.

If you would like to have the Magnifier app on one of your home screens, go to the App Library and use the Search function to locate Magnifier. Once found, bring focus to Magnifier, be sure your rotor is on Actions, swipe down once and you will hear "Drag Magnifier." Double tap it. Now, do a three-finger swipe right to move to your home screens. Do this as many times as necessary to get to the screen where you would like to place Magnifier. Swipe down until you hear an instruction such as "Drop Magnifier before Notes" (or something similar), and double tap to place Magnifier in the location described.

If you do this, you have two additional ways to open Magnifier and get to People Detection. They are:

  • Launch Magnifier and double tap the People Detection button near the bottom right corner of the screen
  • Tell Siri "Open Magnifier," and double tap People Detection near the bottom right corner of the screen

The People Detection screen has a large Viewfinder in the center and an End button in the top left. When no one is visible, it says "no people detected" near the bottom of the screen. As soon as the app detects a person, it begins beeping and vibrating, and the number of feet away is spoken and displayed near the bottom of the screen. As you get closer to the person, the number of feet spoken decreases, and the speed of the beeps and vibrations increases. At times, it does seem to get confused by mirrors and photographs of people. When you press the End button, you are returned to the Magnifier app.
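
Apple has not published how People Detection maps distance to feedback, but the behavior just described is easy to model. Here is a purely illustrative sketch with made-up thresholds:

    def beep_interval(distance_ft, max_range_ft=16.0, fastest=0.1, slowest=1.0):
        """Seconds between beeps: the closer the person, the faster the beeps."""
        if distance_ft >= max_range_ft:
            return None  # no one in range, so stay silent
        fraction = distance_ft / max_range_ft
        return fastest + fraction * (slowest - fastest)

    for feet in (15, 10, 6, 2):
        print(feet, "feet:", round(beep_interval(feet), 2), "seconds between beeps")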

You can change settings for People Detection. Double tap the Settings button in the lower left corner of the main Magnifier screen. The first section is Customize Controls. It is divided into Primary Controls (always visible) and Secondary Controls. Here you can move People Detection to another location in the rows of buttons on the lower part of the screen.

Near the bottom of the screen is a heading for Other Controls. If you double tap the People Detection button under this heading, you will have the following options:

  • Units: Choose meters or feet.
  • Sound pitch distance: Swipe up or down on the adjustable control to adjust the distance. When people are detected within this distance, the pitch of the sound feedback increases. The default is six feet.
  • Feedback: Turn on any combination of Sounds, Speech, and Haptics. If you turn on Speech, iPhone speaks the distance between you and another person.

When you have adjusted the people detection settings to your liking, double tap the Back button in the upper left corner then the Done button in the upper right corner.

Other Apps Using LiDAR

There are at least three other apps specifically made for blind users that feature LiDAR. Let's have a look at Seeing AI, Super LiDAR, and LiDAR Sense.

Seeing AI

Seeing AI is a free app from Microsoft. They describe it as an app that "narrates the world around you." It uses multiple channels to perform many different tasks, such as reading short text and documents, scanning barcodes, describing people and scenes, and detecting color and light; it even takes a stab at reading handwritten text. It is available in Czech, Danish, Dutch, English, Finnish, French, German, Greek, Hungarian, Italian, Japanese, Polish, Spanish, Swedish, and Turkish.

In December 2020, Seeing AI added a World channel. This channel is only visible with devices that are equipped with a LiDAR scanner. As you scan the camera slowly around a room, the app will detect things in your environment such as doors, windows, furniture, cups, bottles, books, and even people. It will speak the distance to an item it detects when it first sees it, but it doesn't continuously change the distance as you move around in the way that People Detection does.

The World channel's main screen has the following controls: Menu, Proximity Sensor, and Quick Help buttons across the top, Spatial Summary and Place Beacon buttons near the bottom, and the adjustable Channel Selector across the bottom.

The Menu button brings up a list of general items for the app such as Help, Settings, Feedback, and About.

The Proximity Sensor causes the phone to vibrate as it detects items in your environment. The vibration becomes more intense as you get closer to the object.

Quick Help provides a summary of the functions of the app and reminds us that this feature is experimental so we must be cautious. At the bottom of the Quick Help screen is a Check Headphones button. Double tapping this causes the app to speak the words "left ear" then "right ear." This lets you ascertain that your headphones are set up properly.

Spatial Summary causes all the objects that have been detected in the room to speak their name. A special feature of this app is its ability to provide a spatial view using headphones. So, if you are wearing headphones when you request a spatial summary, each object's name will be spoken from its location.

Place Beacon lets you create a beacon to guide you to a specific object. It brings up a screen that lets you choose a detected object from a picker item. When you have selected your object, swipe right to a Start button and double tap it. A tone will sound. Turn to face the tone and walk in that direction. If the tone moves to one side or the other, you must turn to keep it in front of you. A different tone will sound when you have reached the object.

I tried this. I walked into my office and slowly scanned the room. I heard the app say "bottle." I double tapped Place Beacon, selected bottle from the list, and double tapped Start. The tone sounded in my right ear. I turned to face it, followed the tone, and walked right to my water bottle. Curiously, it detects my clear water bottle when there is water in it but not when it is empty.

Super LiDAR

Super LiDAR is an app from Mediate, the developer of Super Sense. It uses LiDAR to detect objects and people and verbally indicates the distance to them, as well as providing feedback with tones and vibration.

When you first launch the app, you will be greeted with a Welcome screen that contains the descriptive help text for the app. You are required to enter an email address before you can proceed. Once that is done, double tap the Get Started button and you will be presented with another block of text that describes the main screen of the app with an OK button at the bottom.

When you get into the app itself, it begins sounding a tone and vibrating. Both the tone and the vibration vary as you scan your environment, based on how close you are to whatever is in view. The higher the pitch, the farther away the object the app is seeing; the vibration becomes more intense the closer you are to an object.

The app verbally identifies specific objects that it knows about such as window, door, ceiling, table, seat, and so forth. At the moment, Super LiDAR can detect about fifteen different objects.

As soon as the app sees any part of a person, it starts saying "a person." When it sees the person's face, it will tell you if the person is wearing a mask. This app does keep updating the distance to an object or person as you move around, but it does so a bit erratically.

There are only two buttons on the screen. Open Menu is in the top left and Stop Detecting is in the bottom center. When you open the menu, you will find buttons to select the detection type. The options are Detect People, Detect Environment, or Detect All. Detect All is selected by default. There are also buttons for Help, Request a Call, Rate Super LiDAR, About, and Open Super Sense. The latter is Mediate’s companion app that can scan text, detect barcodes, identify currency, and much more.

If you stop detecting before you close this app, it will be off when you re-open it.

LiDAR Sense

LiDAR Sense is yet another app that can detect objects and people in your environment. But this one has very few features. It also sounds tones and vibrates.

When you first launch the app, you will get a brief description of how to use it. Double tapping the Continue button will bring up the camera permission screen, and then the app will be live.

The main screen has three buttons: Start/Stop in the center, with Settings and Information Menu below.

When you press Start, the app begins sounding tones and vibrating. It does not speak at all. It simply emits tones and vibrations of various strengths to indicate your proximity to nearby objects. The more intense the tone and vibration, the closer you are to an object.

Settings lets you disable vibration, disable sound, turn spatial audio on or off, and change the maximum range for vibration and sound. The default maximum range for vibration is 2.1 meters, and the default maximum range for sound is 5 meters. Spatial audio is only enabled when you are using headphones that support the feature. When I used this app with my AirPods, spatial audio was turned on automatically.

The Information Menu lets you Replay the instructions, provides an email address to contact the developer, and links to informational websites.

The Bottom Line

I have found that the LiDAR feature on the iPhone works better indoors than out. Sixteen feet is really not that far. In general, the apps are optimized for indoor use because that is where the objects they know can be found, but Seeing AI does recognize a car. The app's Proximity Sensor also does a nice job finding our rubbish bins after the collection people have tossed them about a bit. Both Super LiDAR and Seeing AI recognize my front door as a door if I remember to angle the phone up a lot because it is at the top of my front steps.

While LiDAR is far from perfect, it does represent another step for blind people toward getting accurate information about our surroundings. We can then use that information for many different practical and useful purposes, from finding misplaced items to following a line of people at the bank, airport, or vaccination center.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.


A New Day for TalkBack: Android Screen Reader Gets a Major Update

J.J. Meddaugh

TalkBack has been Google's built-in screen reader since nearly the beginning of Android, and is available on any Android phone purchased today. And while Android has a devoted fan base of screen-reader users, including the writer of this article, some major features have been missing from TalkBack, especially when compared to VoiceOver on the iPhone.

Version 9.1, released in late February, is a major step toward closing this gap, with the highlight being support for multi-finger gestures: commands activated using two, three, or four fingers. Below we'll talk about what's new and who can take advantage of these features.

First, a Bit of History

As most readers likely know, smartphone users who turn on a screen reader often interact with their devices using gestures. When a screen reader user taps on the screen, the text under their finger is spoken, but nothing is activated or opened. Typically, tapping twice on the screen is what actually opens or interacts with the spoken item. This is an example of a gesture, and the foundation for commands on VoiceOver or TalkBack.

On Android, these gestures have been mostly limited to using just one finger, which severely limits the number of motions that can be made to execute commands. Swiping right or left would move to the previous or next item, for instance, but if you wanted to perform a more complex action from a menu, you needed to use what have commonly been termed "angle gestures." These gestures are performed by moving your finger in one direction and then in another, all without lifting your finger. To open the TalkBack menu, you would need to draw an L on the screen: swipe down, then right. It was very important to draw both portions of the L distinctly, or TalkBack would often not recognize it.

Meanwhile, those using Samsung phones had access to an entirely different screen reader called Voice Assistant. Voice Assistant included some additional commands not available in TalkBack, but often would cause frustration for users who ended up with two screen readers on their phones.

If this all sounds a bit confusing to you, you are among the chorus of users begging for something more user friendly. With only single-finger gestures available, the number of possible commands was naturally limited.

A New TalkBack, for Some

Version 9.1 of TalkBack is perhaps the biggest change in the screen reader since the introduction of gestures in 2012. Most of the angled gestures have been replaced with multi-finger equivalents that will be easier for many, but you do need a supported modern Android phone to get these features.

TalkBack 9.1 is available on Google Pixel phones, the flagship devices released directly by the company. These are typically the first phones to gain any new Android features. In addition, Samsung and Google have collaborated on this version of TalkBack, so modern Samsung devices such as Galaxy phones now also work with this version of TalkBack, instead of using their own proprietary screen reader. Other devices that are likely to gain the features from this new version include phones in the Android One program, which Google certifies to receive the latest software updates as they are released. These include selected phones from Nokia, LG, Motorola, and Xiaomi.

If you have a supported device with automatic updates turned on or have recently updated the apps on your phone, you should have received the update by now. TalkBack 9.1 rolled out to most users of Pixel and Samsung devices at the beginning of March. One way to check to see if you have the new update is to tap once with three fingers on the screen. If you have the new update, the TalkBack menu will be displayed and spoken.

Getting to Know the New Gestures

For seasoned TalkBack users, many of the existing gestures remain intact. In fact, if you are one of the users who liked the original angle gestures, they are still available for your use. But there are now simpler ways to perform many actions, and a variety of new commands as well. We'll focus on the default gestures for each command, but be aware that any of these gestures can be customized to suit your needs.

First, there is now a new way to navigate between elements. Swiping right or left will now always move between each element on the screen, regardless of the navigation setting. This is your baseline level of navigation. If you wish to move by character, word, or link, for example, first choose that setting by swiping right or left with three fingers, and then swipe up or down to move by that element.

This gives us an opportunity to mention another new multi-finger gesture. To start reading at the current item, and continuously read items until the end of the screen or until you stop TalkBack from speaking, you can perform a triple tap with two fingers. TalkBack will move through each element one at a time. You can swipe right or left while it is reading to more quickly move forward or backward through the elements.

One of the most popular gestures on VoiceOver is what is commonly called the Magic Tap, which is tapping twice with two fingers on the screen. TalkBack now has a similar version of this command, which will start or stop media playback, or answer and hang up a phone call. You can also still turn on the feature to end a phone call by pressing the power button, which is an accessibility option that is independent of the screen reader.

As mentioned above, there is now a much simpler way to locate the TalkBack menu, by tapping once with three fingers. The TalkBack menu is where you will find a variety of advanced and lesser-used commands including reading or spelling the last spoken phrase, showing or hiding the screen, and changing the spoken language. While some of these commands also have assigned gestures, others do not. But you can assign just about any command to a gesture by opening TalkBack Settings from the TalkBack Menu and selecting Customize Gestures.

New Commands for Editing Text

TalkBack has always included a way to copy the last spoken text to the clipboard, but new commands make text editing a much simpler chore. As an example, you may want to do this to copy the contents of a note to a text message or email.

To select some text, first locate a box with editable text, such as a text message or note. Turn on text selection mode by double tapping and then holding with two fingers. In other words, you will tap on the screen twice using two fingers, but hold that second tap for about a second. TalkBack will speak "Selection Mode On" to let you know you have correctly performed the command. Then, swipe up or down with one finger to select or deselect text. TalkBack will speak each letter, word, or line as you pass it and let you know if it has been selected or deselected. Remember you can swipe right or left with three fingers to change the navigation mode between characters, words, or lines. When you have selected the desired text, perform the same two finger double tap and hold gesture. TalkBack will turn off selection mode and speak the text that was selected. If you were to swipe left or right, or otherwise move your focus away from the editable text, the selection mode will be turned off and text will not be selected.

Now that you have text selected, you can use one of the new three-finger gestures to work with this text. These include:

  • Three-finger double tap: Copy the selected text to the clipboard
  • Three-finger double tap and hold: Cut the selected text to the clipboard
  • Three-finger triple tap: Paste the contents of the clipboard at the current cursor position

Remember, you can still also copy the last spoken text to the clipboard. This is now easily done by activating the TalkBack menu by tapping once with three fingers and then selecting the Copy Last Spoken Phrase item.

Voice Commands

TalkBack can now be operated using a variety of voice commands. By default, this feature is available either through the TalkBack menu, or by swiping right and then up with one finger. Admittedly, if this is a command you plan to use regularly, you may want to assign it to an easier gesture from TalkBack settings.

You can use a host of voice commands to move quickly to items on the screen, read text, and control TalkBack features. For instance, you can say the name of an item on the screen and TalkBack will move focus to that item. You can say "speak faster" or "speak slower" to control the rate of speech. You can say "first" or "last" to move to the first or last item on the screen, or "select all," "copy," or "paste" to manipulate text. Say "help" to get a complete list of commands.

Getting Help

With all of these new commands, there's now more to learn. Thankfully TalkBack has added a new command to enable a gesture learning mode. Tap once with four fingers, and then try any of the gestures mentioned above, or other combinations of single, double, or triple tapping with one to four fingers. To exit this mode, swipe right until you hear the Finish button, and then double tap.

The TalkBack tutorial has also been updated to reflect these new features. To start the tutorial, tap twice with four fingers.

In addition, Google has created a page with all of the TalkBack commands.

Feature Requests

TalkBack 9.1 is a huge step forward for Android accessibility, but like just about any app, there are always places that could be improved. For instance, having a quick way to start and stop text dictation, whether using the current two finger double tap gesture or another gesture, would make text input easier. You can use the voice command feature mentioned above, and then say “type bla bla bla” to type text, or locate the Voice Input button on the on-screen keyboard, but a dedicated gesture would still be helpful for many.

There are several gestures that are currently duplicated as the transition is made from single-finger to multi-finger commands, so hopefully some of these can be replaced with unique commands to broaden the number of functions that can be performed without the need to go into a menu. Finally, BrailleBack, Android's app to support reading and writing from a braille display, lacks many of these new capabilities and has greatly fallen behind in terms of its usefulness. Hopefully, attention can be paid to BrailleBack to give it the updates it deserves. This is of vital importance to those who rely on braille as a primary reading method.

All things considered, the fact that Google and Samsung are now combining their efforts to create a single high-quality screen reader for Android is an encouraging sign. As TalkBack 9.1 rolls out to more devices, this will hopefully improve the accessibility experience for more users, regardless of which type of phone they happen to be using, and we look forward to future developments for TalkBack and the Android Accessibility Suite.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.


Lookout App for Android: Google’s AI Swiss Army Knife

Steve Kelley

For those of us who have been waiting rather impatiently for Microsoft’s Seeing AI app to come to Android, the wait is over—sort of. Google’s Lookout App made its debut last year, and originally was only available on the Pixel phone and some select others. If you gave the earlier versions of the app a try and abandoned it for its early limitations, it’s time for another look.

Google’s Lookout App is now a more polished app that will run, according to the system requirements outlined in the Support documentation, on Android 6.0 (Marshmallow) or above. For this review it was running on a Galaxy S9 with Android 10. Download the free Lookout App from the Google Play Store. The first time the app opens, it asks for your Google account or prompts you to create one. This is followed by the generic request for permission to share data use. To use the app, of course, you grant permission.

Opening the App

By default, the app is in Text Mode when it opens. Text Mode is just one of the five modes available in Lookout. The various modes appear along the bottom of the screen, and include: Explore (listed as beta); Food Labels; Text; Documents; and Currency. On the top portion of the screen is the Change Language button, which offers the option of selecting any of 22 languages. To the right of this button is the Account Menu, which contains options for Settings, Help and Feedback, and Contact Info. Help and Feedback is a great place to get started with various support articles that explain the various modes and settings options.

Text Mode

Text Mode is designed to read any text the camera detects, whether it’s on a product label, newspaper, or computer screen. Text Mode is quick, and moving the camera around will jump from one item to the next, which can be confusing. Holding the camera steady and gradually moving it across the text proved the best way to read without a lot of jumping around.

This mode is great for reading short bits of text like an address on an envelope, and for scanning a pamphlet, business card, or medicine bottle. The default setting for the app is to use the device’s flashlight in low-light settings. For nearby observers, this can look like a strobe light as it goes off and on to light up the text. The flashlight can be disabled in the Settings menu for more discreet reading, and Lookout does pretty well in dim light, without the flashlight.

Reading in Text Mode is responsive, quick, and pretty accurate. The text-to-speech voice pitch and rate can be changed from the Settings menu.

Document Mode

For longer text recognition, the Document Mode will scan a page of text and convert it to digital text, accessible to TalkBack or Select to Speak. Document Mode offers nearly continuous hints to align the document properly before snapping the picture. You might hear, “Too close, move the device away,” or “Move device left,” etc. until the document is within the camera frame. Once framed, Lookout will report, “Hold still,” or “Try taking a snapshot.” The camera automatically takes a picture following “Hold still.” When prompted to take a snapshot, press or double tap the Take Snapshot button to manually take the picture.

Text recognition is super-fast and seems very accurate. During the review, Airplane Mode was turned on while using Document Mode, and this had no impact on text recognition, so optical character recognition (OCR) processing must be done on the device rather than in the cloud. In addition to being fast, it makes it so much more convenient to have text recognition available all the time, with or without internet access.
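
Lookout's recognition engine is proprietary, but OCR that keeps working in Airplane Mode simply means all of the processing happens locally. As a purely illustrative sketch of on-device OCR, here is the same idea using the open-source Tesseract engine through pytesseract; the library choice and file name are assumptions, not anything Lookout actually uses.

    from PIL import Image
    import pytesseract

    def read_document(image_path):
        # Everything runs on the local machine; no network round trip is needed.
        return pytesseract.image_to_string(Image.open(image_path))

    print(read_document("scanned_page.png"))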

Once a document is processed, the text can be read, copied and pasted, or shared using the Share Button in the top right of the display. Document text can be shared with other apps or email. What is missing here though, as with Seeing AI, is the ability to scan multiple docs and save them as a file for later use, like you can with an app like KNFB Reader. As long as the Lookout App remains open, you can open the Recents button, on the right side of the screen just above the Modes list, to open any recent scans. This list disappears once the app closes.

Explore Mode (Beta)

Explore Mode describes objects in the environment as the camera is moved around. In the kitchen, for example, it quickly identified cabinets, the oven, a microwave, coffee cup, tableware, window, door, etc. It identified the cat as a dog, and failed to ID the fridge and other smaller appliances. One of the more impressive identifications was identifying a backpack as a bag, and then reading the text on a business card in a clear plastic sleeve attached to the bag. At the time, the bag was about three feet away, in dim light.

The Explore Mode, like Text and Document, functioned in Airplane Mode, so this too must be working off device-based object identification. Although in beta, it certainly appears useful for identifying many of the major features in the immediate surroundings, such as the door, window, and furniture. It may be premature, however, to delete your Be My Eyes App or any other app using a human to identify objects or describe details in the environment around you.

Food Labels Mode

The first time Food Labels Mode was opened, an additional data file was downloaded, which took a couple minutes, and was specific to the United States. The US is just one of 20 countries available to select from. The onboard support article indicates that data files are not available for all countries. It also indicates these data files enable offline barcode recognition.

Like Document and Text modes, Food Label Mode is very responsive and fast. One of the most impressive features was the ability of the camera to locate the barcode on a wide variety of products. Following the directions provided in the support article for Food Labels, I slowly rotated products in front of the camera. In most cases, identification occurred quickly. The flashlight is used for dim lighting as in a kitchen cupboard or pantry. One of the best features in Food Label mode is the way that barcodes and text recognition are incorporated, so if the barcode is not immediately recognized, the text on the product label may be recognized first and read. Once the barcode is identified, it often provides more details about the product than what is read from the label. In addition to the quick response the on-device data provides, this feature will make Food Label Mode really handy while shopping in a grocery store, where access to the internet might be spotty.
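
Offline barcode decoding can be sketched the same way. Again, this is an illustration using the open-source pyzbar library, not Lookout's actual engine or data files.

    from PIL import Image
    from pyzbar.pyzbar import decode

    def read_barcodes(image_path):
        # decode() returns one result per barcode found in the image.
        return [(result.type, result.data.decode("utf-8"))
                for result in decode(Image.open(image_path))]

    # Each entry pairs a barcode type with its decoded digits.
    print(read_barcodes("product_label.jpg"))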

Currency Mode

Like the other modes, Currency Mode takes its settings by default from the phone, so if you are in the US, the currency will be US currency. At least 20 other countries are listed in Settings, so it's simple to change currencies. Like the Text and Food Labels modes, Currency Mode is really fast and accurate. It identified various bills in a variety of conditions, folded up, upside down, and so on, and in each case the response was accurate and nearly instantaneous. Because identification happens on-device rather than relying on a cloud service, this can be a really handy feature anywhere.

Conclusion

Like Seeing AI for iOS devices, Google’s Lookout for Android is a game changer. With text and barcode recognition on the device, processing seems nearly instantaneous in all five modes. Text, barcode, and currency accuracy is quite reliable, even with some of the stylized text that appears on package labeling. If you’ve been waiting for Microsoft to migrate Seeing AI to your Android device, just download Google’s Lookout from the Play Store; you may not need Seeing AI after all!

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.


Streaming Audio Services Part 2: The Accessibility of Apple Music

Janet Ingber

Apple Music is Apple’s very accessible music streaming service, launched on June 30, 2015. It offers more than 75 million songs, plus radio stations and music videos. There are no ads, and content can be downloaded to your device to play offline. Apple Music can be synced between devices. Siri can execute many Apple Music commands, such as "Play a song," "Play an album," "Play playlists," and "Display song lyrics."

Apple Music works on all Apple devices and many non-Apple devices. In this article, I will discuss Apple Music on iOS, macOS, watchOS, and Amazon’s Alexa. It is supported on other devices including PCs with the PC version of iTunes, Android devices with the Android Apple Music app, and Sonos.

Signing Up

Apple Music offers three basic subscriptions: individual, family, and student. The individual plan is $9.99 per month or $99 per year. The family membership costs $14.99 per month and allows up to six family members. Each family member has their own content. Student membership costs $4.99 per month. The student subscription has certain requirements. No matter which subscription you choose, you will have a free trial. Any subscriber or free trial user has access to all of Apple Music’s content. Begin the sign-up process by selecting the Music App on your iPad or iPhone. Another option is to visit this page. Next, select the Listen Now or For You tab. On the new screen select the trial offer and then select your subscription plan.

Next, sign in with the Apple ID and password that you use to make purchases. If you do not have one, select Create New Apple ID. You will be asked to confirm billing information and form of payment. Next, select the Join button. Once you have subscribed, you will be asked to choose your favorite artists and genres. The sooner the process is completed, the sooner Apple Music can make recommendations for you. Apple Music will continue to make recommendations based on the music you are playing.

Apple Music on an iOS device

At the top right of the iOS device screen is the My Account button. The button is there only when the Listen Now tab is selected. Options in this section include Redeem a Gift Card or Code, Set up Profile, and Notifications. At the bottom of the screen are five tabs: Listen Now, Browse, Radio, Library, and Search.

Listen Now

The Listen Now tab contains a lot of information including Recommended Music, Recently Added, New Releases, Made for You, and Recently Played. Headings navigation can be used to get to some of the sections. There is a scroll bar on the right side of the screen that brings you to the different pages in Listen Now.

Browse

The next tab is Browse. Use it to check out music charts, find playlists curated by Apple, and get information about new music. This tab also contains playlists based on activity or mood. Examples include Party, Fitness, and Feel Good. Select the button for the option you want. The new screen will have playlists for your selection.

Radio

The third tab is Radio. At the top is information about Apple’s three live radio stations: Apple Music (which plays new music), Apple Music Hits, and Apple Music Country. Underneath each listing is a Schedule button, which brings up a list of programs, hosts, and show times. Beneath the Schedule button is the Play button. The Radio tab also contains a list of radio stations that Apple thinks you will like, based on your listening habits. These radio stations are not live. You can also ask Apple Music to create a station for you.

Library

The next tab is Library. All your music can be found here, including music you have imported or purchased from the iTunes Store. Apple Music also lets you download songs. The advantage of downloading is that an internet or WiFi connection is not needed for playback; the disadvantage is that downloaded music takes up space on your device.

Once the Library tab is activated, at the top of the screen are buttons including Playlists, Artists, Albums, Songs, and Downloaded. In the upper right corner is an Edit button. Selecting this lets you re-order the buttons. VoiceOver gives clear instructions.

What is on your screen depends on which button is selected. For example, selecting Playlists brings up a list of all your playlists. At the top of the list is a New Playlist button; when you select it, you will be prompted to enter a name and, if you want, a description. Next is an Add Music button, which brings you to the search form. Below, I will describe other ways to add music from the Search tab and while playing content.

Search

The final tab is Search, with the search form at the top of the page. You can search by artist name, song title, song lyrics, a particular mood, or an activity. Radio buttons let you search either Apple Music or your Library. Results appear under the search form. If you go to the Search tab without entering a query, you will find a list of search categories including Music Genres, DJ Mixes, Essentials, and Charts.

If your search result is a compilation, such as an album, a list of songs by a particular artist, or a list compiled by Apple Music, you will have the option to play the music in order or to shuffle it. Flicking up or down on a particular track gives options to delete it, add it to your library, or activate it. Depending on your search, the results might include artists, albums, pre-made playlists, or individual songs. Select the result you want; VoiceOver speaks what you have selected, and you can go through the content with standard VoiceOver gestures.

The Now Playing Screen

When music starts playing, a MiniPlayer appears just above the five tabs, with buttons to play and to go to the next track. To the left of these buttons is an option called Album Artwork; if you flick left a couple of times and then flick back right, VoiceOver will say, “MiniPlayer Album Artwork.” Either way, activate the Album Artwork option and the full Now Playing screen will load. On this screen are the song title and artist, followed by a More button. Selecting it brings up options including Share, Show Album (if the content is on an album), Add to a Playlist, View Lyrics (for Apple Music songs only), and Create Station. When done, select the Dismiss Pop-Up button in the upper left corner to return to the Now Playing screen.

Under the More button are Track Position, Previous Track, Play, Next Track, and Volume. The remaining options are Auto Play and Playback Destination. At the top left corner is a button to dismiss the Now Playing screen.

Making a Playlist

Select the More button on the Now Playing Screen and then choose Add to Playlist. On the next screen is a list of your playlists. Select the one you want for the song that is playing.

Apple Music on the Mac

Apple Music can also be used on the Mac, though I prefer the interface on iOS devices. Open the Music app. The easiest way to learn the Mac layout is to navigate with VO + Left Arrow and VO + Right Arrow; Tab and Shift + Tab will get you to some, but not all, of the controls. The layout is different from iOS. For example, the player is near the top of the Music window, and the tabs and all of your music are in the sidebar.

Playing Music

The first playback option is a Shuffle button, followed by three buttons labeled Previous, Play, and Next. These are the basic controls for playing music.

The LCD Section

To the right of the player is the LCD section. When a song is playing, it shows information about the song, including title, artist, time elapsed, and total time. The LCD section also contains a More button, which brings up many options including Add to Library, Download, Add to Playlist, Create Station, and Share.

The Lyrics button is farther to the right. Once it is selected, go to the bottom of the window; you will be at the end of the lyrics. Use either Up Arrow or Shift + Tab to jump to the start of the lyrics.

The Sidebar

All your music is listed in the sidebar table. Some categories are already there, including Songs, Artists, Albums, and Recently Played. Your own playlists and Apple Music playlists also will be in the sidebar, as will Apple Music options including Listen Now, Browse, Radio, and Library. Check out the View menu for sorting options. There is a Search edit box at the top of the table. Enter your query and results will be under the edit box.

Creating a Playlist

To create a playlist, begin the process by typing Command-N and then entering a name for your playlist. If you want to add one song at a time, locate the song and bring up the contextual menu. Select Add to a Playlist. Find the name of your playlist and select it. To add a list of contiguous songs, follow the same procedure and select all the songs before using the contextual menu.

Apple Music on the Apple Watch

Music that you have recently played on Apple Music is automatically available on the Watch. Additional music can be synced either from your iPhone or from the Music app on the watch itself.

Syncing with the Apple Watch

Open the Music app on your watch, then select Library or Listen Now. Find the music you want to add and select it; flicking up or down reveals other options, including Download and Activate. Music is added to the watch only while the watch is on its charger and connected to WiFi.

Adding Music Using your iPhone

Open the Watch app on your iPhone and select the My Watch tab, then select the Music app. On the next screen, just below Playlists and Albums, is an Add Music button. Select it to get a list of categories including Albums, Artists, and Playlists. Select whatever you want to add, then select the Add to Playlist button in the upper right corner. This returns you to the screen with the Add Music button; under the button is a list of the content you have chosen to add. Content is not added until the watch is on its charger and near your iPhone.

Apple Music on Amazon Alexa

Alexa can play albums, artists, playlists, and genres. Alexa can also skip songs and play radio stations. Give Alexa a command such as “Play my Stevie Wonder playlist on Apple Music.” You can also make Apple Music your default music player. Once you do this, you just have to tell Alexa what you want to hear.

Begin the set-up process by opening the Alexa app and selecting the More tab at the bottom of the screen. Next, select Skills and Games. The next screen has a Search button in the upper right corner. Select it and an edit box will open. Enter Apple Music and select the Search button in the lower right corner. Results are under the edit box. Select “Apple Music. Music & Audio.” Follow the prompts to link your account.

You can make Apple Music the default player for Alexa. Select the More tab on the Home screen and then select Music and Podcasts. At the top of the screen is a Settings heading (note that focus may land at the bottom of the screen when it first loads). Under this heading is a “Tap to Change Default Services” link. Select it, and on the new screen choose Apple Music. Now you can tell Alexa what to play without having to add “on Apple Music.”

Conclusion

Apple Music is very accessible and Siri can perform many related tasks. Personally, I like Apple Music better on the iPhone than on the Mac. Having Apple Music on the Apple Watch is convenient especially when you do not have your iPhone with you. Having Apple Music on Amazon Alexa is a nice additional feature and makes the service more useful for those who don't have all their devices on one platform.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Comment on this article.

Related Articles:

More by this author:

Previous Article

Next Article

Back to Table of Contents

Untold RPG for iOS: Playability and Accessibility

Jamie Pauls

Over the past few years, I have reviewed many games for AccessWorld. Many if not most have been designed specifically for blind people, often by blind game designers. Some were created originally for sighted players, with accessibility for the blind coming later, and often the accessibility of these games is a mixed bag: some elements of game play might be accessible with a screen reader, while other elements are more difficult for blind players. It is a pleasure, then, to review a game developed for sighted people that truly takes advantage of accessibility tools, making game play not only possible but genuinely enjoyable for players who are blind.

One such game is Untold RPG, a role-playing game for iOS that is fully playable with Apple's VoiceOver screen reader. Unlike self-voicing games, it relies on VoiceOver itself, which makes for a smooth gaming experience. In this article, we will look at why this game works so well for blind players who wish to enjoy the same gaming experience as their sighted counterparts.

Playing the Game

In Untold RPG, you are a character who finds yourself on a beach with only fragmented recollections of how you got there. A strange brooch in the sand brings back a flood of associations, many of them bad, about your past. Over the course of the game you meet another person who shared your experience of being imprisoned on a ship and forced to do unspeakably bad things. Something has happened to your mind, and you must try to put the pieces of your memories, no matter how painful, back together again. You also seek vengeance on the crime lord who enslaved you and others, forcing you to do his bidding. You travel strange lands, are caught in the crossfire of warring tribes, and save a few lives in the bargain.

A rich musical score and sound effects flow through the entire game, but never get in the way of VoiceOver as you play. Battles, sailors at work, and conversations in taverns are all enriched by sound and music. As the story progresses, VoiceOver reads new screens automatically, allowing for a smooth playing experience. You can always read text again if you missed something, or simply want to better understand the unfolding story line as you progress through the game.

Most actions in the game are taken by selecting from a series of buttons. You may see choices such as "talk to the merchant," "explore the forest path," or "turn back." Once you have taken an action, the word "used" appears next to it to let you know you have previously acted on it. You can generally still take the same action again, but the "used" indicator makes it much harder to wander around in circles. Sometimes, there is only one action available to move the story along.

At the bottom of your screen are a series of tabs including character, quests, inventory, and notifications. Along your journey, you will collect many items such as gold, food, weapons, clothing, and medicine. It's a good idea to take a look at the clothing and armor you possess, as you may want to put on better gear as you acquire it; parts of the game actually require you to disguise yourself in certain attire. The importance of gold, food, and medicine is obvious, and not having enough of any of them can leave you stuck in the game and may even require you to start over. The game allows you to save multiple versions of your playing experience, and it even autosaves your progress from time to time. It can be difficult, though, to recall which saved game contains which inventory items and where each one leaves you in the story. Sometimes, simply starting over is the best course of action.

Your character receives a number of attribute points at the end of each level of game play that can be spent in various areas. If you spend all your character's attribute points on strength, you may be able to rip a gate off its hinges, but your low number of perception points may cause you to miss a secret passage you would find if that number were higher. A higher number of charisma points might allow you to talk your way into a building rather than needing to kill the guard to enter. Be prepared to do a lot of fighting in this game. Also, be prepared to die! Fights, once they begin, take place in real time. Your opponent attacks you, and you must manually fight back. Because the action happens in real time, I found it difficult to quickly look at the screen in order to determine my health points versus those of my enemy. Your character will collect many weapons from fallen opponents, so you will want to play around with various configurations of armor and weapons to see what works best. Fortunately, you can suspend fighting at times in order to change armor and weapons.

There are parts of this game that move in a linear direction, requiring you to take only one course of action. Other areas allow you to explore a number of possible solutions to a problem, with multiple outcomes. This is definitely one of those games where you can find yourself at a dead end: sometimes you can backtrack, and sometimes you must simply start over. At the time of this writing, I have not yet completed the game, although I believe I am close. I played so many hours, and restarted so many times, that I eventually tired of game play and needed to step away for a while. I have only recently picked the game up again.

One of the things I particularly like about this game is the ability to quickly move from one location to another using the game's map. Fast travel is restricted to your current region, however. In other words, if you have sailed across the ocean to an island of interest, you can move among locations on that island, but to explore elsewhere you must return to the harbor, board your ship, and sail back across the ocean (when the current level of the game allows it).

Another thing I particularly like is the ability to interact extensively with characters in the game. You can ask them questions and they will provide meaningful answers. Sometimes they will travel with you on part of your journey and help you fight the enemy. Be prepared, though. People you learn to care about in the game can and will die suddenly. Consuming the wrong foods or medicines can negatively affect your character as well.

The Bottom Line

Untold RPG is a well-crafted game with a rich plot and highly developed characters. VoiceOver accessibility is excellent, although some areas of the game, such as the inventory tab, will take some exploration. You may have to look around a bit to figure out how to expand and collapse the various inventory groups such as clothing, weapons, and armor. Once you learn the layout of the screen, you shouldn't have any problems.

The game is huge, both in terms of the plot and the number of places you will travel. You may find that you need to start the game over a few times before you really get the hang of game play and develop the inventory and character attributes that serve you best.

At $4.99, I highly recommend this game for anyone who doesn't mind a few references to magic and quite a bit of fighting.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Comment on this article.

Related Articles:

More by this author:

Previous Article

Next Article

Back to Table of Contents

AccessWorld News

AFB GAAD Podcast: Inform & Connect podcast with Sarah Herrlinger (Apple) and Christopher Patnoe (Google)

Happy Global Accessibility Awareness Day (#GAAD)! We are excited to have Sarah Herrlinger of Apple and Christopher Patnoe of Google on Inform & Connect, the American Foundation for the Blind’s podcast. Both Sarah and Christopher sit on AFB’s Board of Trustees. The conversation, led by AFB Major Gifts Specialist Melody Goodspeed, focuses on AFB’s centennial, technology and inclusion, testing products, and their role as AFB board members. They will also touch upon Teach Access, a collaboration among education, industry, and disability advocacy organizations to address the critical need to enhance students’ understanding of digital accessibility as they learn to design, develop, and build new technologies with the needs of people with disabilities in mind.

“I am thrilled to welcome Sarah and Christopher, especially with Global Accessibility Awareness Day fast approaching,” Melody said. “Not only are they fierce advocates, they are also fun and engaging! I also love how they are always thinking outside the box and driving innovation through inclusive technology.”

M-Enabling Summit to be Held October 4-6, 2021

The M-Enabling Summit Conference and Showcase, dedicated to promoting innovation in accessible and assistive technology for senior citizens and users of all abilities, will host its signature industry networking event October 4-6, 2021, in Washington, DC. The M-Enabling Summit is fully committed to providing a safe and productive in-person platform this fall, with virtual participation options for those still unable to travel or attend public events by October. Industry leaders, influencers, and advocates will address the current issues and strategies surrounding digital accessibility in light of the significant changes that have occurred since the 2019 M-Enabling Summit.

With participants’ health security as the top priority, conference logistics will reflect official health guidelines and best practices for conferences and public events, and updated details will be published on the M-Enabling Summit website. With the theme of “Digital Accessibility, a Driver for Inclusion Strategies,” key topics that will be explored at this year’s hybrid M-Enabling Summit include:

  • The acceleration of virtual environment adoption during COVID-19: what is a lasting change and what is not, with perspectives from industry and persons with disabilities.
  • Spotlight on the newest, most impactful accessibility features and their benefits for users.
  • Multi-modal gaming and virtual entertainment experiences for accessibility.
  • Balancing privacy and security with accessibility and assistive services requirements: the need for user choice.
  • The era of voice as a new platform for digital interaction: challenges and opportunities.
  • The emergence of neurotechnologies for advanced assistive solutions: risks and opportunities.
  • The state of accessibility among business, government, and academia.
  • The emergence of Strategic Leaders in Accessibility (SLiA) among large organizations.
  • Workplace accommodation success stories in a virtual environment.
  • The best education accommodation strategies in a virtual environment.
  • The rise of the accessibility profession: global footprint, professional development resources, and benefits for organizations.

For additional information on participating in the event, please visit the M-Enabling website.

Previous Article

Next Article

Back to Table of Contents

Letters to the Editor

In this section, we publish letters submitted by AccessWorld readers on a range of topics. If you would like to submit a letter to the editor, you can do so by sending an email to the Editor, Aaron Preece, at apreece@afb.org, or by activating the "Comment on this article" link at the bottom of any article.

Dear AccessWorld Editor,

My comment might be in that back row; I’m not at all young, and probably that’s where the problem lies. Every month I go to read AccessWorld, and I do read it, but the task of going from article to article seems more adventurous than it needs to be. I liked the format where we would read an article and at the end could choose previous or next article. Being honest here, I could not tell a friend how to read through AccessWorld today; I mess around until I get it. Well okay, I chafed at having to arrow through the Facebook reference, Twitter, what have you; I simply want to read the darned magazine. So today I asked JAWS to do the Links List: no help, it was all grand AFB material. While I am fully aware that many of us simply breeze through websites, even really hard ones, sorry to say I don’t. You mention a change for this issue? Hmmm, I hope to get it; so far, this time actually seems harder. I’m so behind on all the amazing new things on the internet that I’m almost always certain the problem is me, but I am a loyal reader, and I would love for the experience to be more straightforward. Demographically, if it helps at all, I’m 75, I’ve always been blind, and I use JAWS, all up to date. Thanks and keep up the good work.

Mike Cole from Berkeley, CA

Response from AccessWorld Editor, Aaron Preece

Unfortunately, AccessWorld is still running on our interim platform, so some features are currently unavailable until the updated AccessWorld website is ready for launch. We have been manually adding a link to the table of contents for the current issue at the bottom of each article, and starting with this issue, I have also begun to add previous and next article links at the bottom of each piece. Again, these are being added manually, so if you notice a missing link, please let me know and I can fix it.

As always, the feedback is much appreciated; it is invaluable as we aim to make the magazine as useful and user friendly as we can.

Previous Article

Back to Table of Contents