Full Issue: AccessWorld August 2018

Letters to the Editor

Dear AccessWorld Editor,

This letter is in response to Scott Davert's July article, Scratch That: A Review of Scratchpad, the New Notetaking Feature on the Focus Fifth Generation Braille Displays.

I recently purchased the Focus 14 5th Generation after reading about it in AccessWorld. I bought the Focus 14 because I wanted a compact, ergonomic, and efficient mechanism for typing in braille input mode on my mobile devices. I liked the ergonomic layout and found it more comfortable than the Orbit and Humanware Brailliant 14.

One major drawback with this device is that the Focus 5th Generation loses the Bluetooth connection with the iPhone after five minutes in Airplane mode. This happens each time. I do not have this problem when using the Humanware Brailliant. I reported the error to VFO and hope they will fix this issue.

Sincerely,

Mary Wilson

Dear AccessWorld Editor,

This letter is in response to Deborah Kendrick's July article, Book Review: Writing Your Way: Composing and Editing on an iPhone or iPad, by Judy Dixon.

I'm sold. I'm going to order this book on Monday.

Best,

Mary Hiland

Dear AccessWorld Editor,

This letter is in response to Jamie Pauls' February 2017 article, Easier-to-Use Cell Phone Options for People with Vision Loss.

Thank you for this article. It was a big help, as I have been searching for a smartphone my daughter can use. She has very low vision due to retinitis pigmentosa.

This article in AccessWorld is the best I have found in all my research. I appreciate your taking the time to compare the companies and phones. I will be reaching out to Verizon for help in choosing a phone for my daughter.

Before I read the article, it hadn't occurred to me that I should have the features I wanted in mind. I would just go into a store and ask what features the phones had, relying on the salesperson to know my needs.

Thank you for a great article.

Gail Vespie

Dear AccessWorld Editor,

I struggle with YouTube. I'd like to see AccessWorld publish an article on how to navigate it.

Here are the problems I have:

  1. Channels: how do I find the home page for a particular channel? And when I do, and I select a video from that channel, then another, how do I get back to its home page?
    Seems like every YouTube page has links to videos that it thinks are related, but aren't actually part of the channel set I want to watch. What does "subscribing" to a channel mean? Is there a way I can keep YouTube from showing me links to videos that aren't what I'm interested in?
  2. Search results: the same thing happens. I search for a topic, but YouTube starts showing me links to random things that aren't part of my search. How do I get it to stick with showing my search results?
  3. Pause, Rewind, and Fast-Forward: sometimes these buttons are easy to find and sometimes not. What is a reliable way to move through the video?
  4. Saving a link to a video: besides using Web favorites or bookmarks, is there a way I can easily store links to videos I want to watch again, for example an exercise or music video I will repeatedly play?
  5. Clutter: there is just so much of it on YouTube pages, it is hard to find the author's description of the video and visitor comments. How can I read the information relevant to the video I'm currently watching without all the other unrelated stuff?

Thanks for your time,

Debbie Armstrong

Dear AccessWorld Editor,

This letter is in response to Deborah Kendrick's October 2016 article, Orbit Reader 20 Review.

Though this article was written two years ago, I agree with it. I just purchased an Orbit Reader 20 and have been using it now these last four months or so. It's wonderful! I'm almost ready to bid adieu to bulky paper braille.

Thanks again for this fine article.

Best,

Duncan Holmes

AccessWorld News

U.S. AbilityOne Commission Designates AFB as an AbilityOne Authorized Central Nonprofit Agency: An Important Step Toward Encouraging Greater Inclusion in the Workforce of the Future

Statement from Kirk Adams, President and CEO, American Foundation for the Blind

The American Foundation for the Blind (AFB) has a bold new mission to create a world of no limits for millions of Americans with vision loss. That includes significantly improving the workforce participation rate among people who are blind, which persistently hovers around 35 percent compared to about 70 percent for people without disabilities.

On the anniversary of the Americans with Disabilities Act, we are excited to be designated a Central Nonprofit Agency (CNA) under the AbilityOne Program, one of the nation's largest sources of employment for people who are blind or have significant disabilities. AFB will have the opportunity to partner with nonprofits, government agencies, and corporations to identify innovative ways to maximize the power of the Javits-Wagner-O'Day Act, the law requiring federal agencies to purchase specified supplies and services from nonprofit agencies employing people who are blind or who have significant disabilities.

As a CNA, AFB's primary focus will be giving people with visual impairments exceptional new career development opportunities in the fast-growing industries of financial services, healthcare, and information technology, and creating pathways to mainstream, integrated employment in the public and private sectors.

Our first order of business is to launch an 18-month research phase to inform the design of this new model. AFB will conduct qualitative and quantitative research, including a literature review and market analysis. In addition, AFB will meet with key stakeholders engaged in procurement, as well as job training and placement.

It is essential that we address the unemployment and underemployment issue now. Technology is changing the way we work at an unprecedented pace, which opens a host of job opportunities for people who are blind so long as our systems are accessible and inclusive. Sixty-five percent of children entering primary school today will ultimately end up working in jobs that don't yet exist, according to the World Economic Forum's Future of Jobs report. That means we must stay current and ensure people with vision loss are developing the right skillsets to compete in the workforce today and in the future. Further, there are many private sector companies dedicated to hiring and promoting people with disabilities, recognizing that full inclusion strengthens company culture and improves the bottom line. AFB will connect these companies to qualified employees who are visually impaired.

We are thrilled about the possibilities to expand career options and increase workforce participation among people who are blind. Please stay tuned. We'll be keeping you apprised of our research findings as we move through Phase One of our work with the AbilityOne Commission.

For additional information, please contact:

Adrianna Montague
Chief Community Engagement Officer
212-502-7615

Senator Markey Hails Passage of International Treaty to Support Blind and Visually Impaired Individuals

Senator Edward J. Markey (D-Mass.), member of the Foreign Relations Committee and longtime champion for people with sensory disabilities, released the following statement after the U.S. Senate passed the Marrakesh Treaty Implementation Act and the resolution of advice and consent to ratification for the Marrakesh Treaty to Facilitate Access to Published Works by Visually Impaired Persons and Persons with Print Disabilities.

"I am thrilled that the United States will join the 35 other countries that have agreed to share braille books, audiobooks, and other published materials across borders and around the world," said Senator Markey. "By increasing the availability of such materials, this treaty will help millions of blind and visually impaired people afford and access books and other written works that capture our imagination, foster education, and support economic opportunity."

The Marrakesh Treaty has already been signed by Australia, Argentina, Botswana, Brazil, Burkina Faso, Canada, Chile, Costa Rica, Ecuador, El Salvador, Guatemala, Honduras, India, Israel, Kenya, Kyrgyzstan, Liberia, Malawi, Mali, Mexico, Moldova, Mongolia, Nigeria, North Korea, Panama, Paraguay, Peru, Russia, Saint Vincent and the Grenadines, Singapore, South Korea, Sri Lanka, Tunisia, United Arab Emirates, and Uruguay.

The Marrakesh Treaty is supported by The American Council of the Blind, American Foundation for the Blind, National Federation of the Blind, American Library Association, Association of College and Research Libraries, Association of Research Libraries, Authors Guild, Benetech, National Music Publishers Association, and the Perkins School for the Blind.

Senator Markey's 21st Century Communications and Video Accessibility Act (CVAA) mandates accessibility of devices and services for the millions of Americans with disabilities and enabled the use of a wide range of devices and services needed in the digital era.

Even Cheaper PCs for the Blind!

In the July issue of AccessWorld we alerted you to a new partnership between VFO and Computers for the Blind (CFTB), a Texas-based organization that refurbishes donated computers and offers them at low prices to people with visual impairments. For $130 you can purchase a fully equipped desktop computer. But now, along with the PC itself and an assortment of open source software, users also receive a one-year licensed copy of JAWS, MAGic, or Fusion. The version that comes with CFTB computers is a single-install version good for one full year. At the end of that year you have two choices. You can allow the version to expire and revert to a 90-minute demo version, or you can purchase an SMA and upgrade to a new full version. You can then continue to upgrade with additional SMAs, or end your participation and continue to use the last full version of the software.

Recently, the folks at CFTB received a $50,000 grant from the Dallas Association of the Blind that enables the organization to lower the price of a computer by $60 for recipients of SSI or SSDI due to blindness. The grant also covers parents of legally blind children. The required documentation is an e-mail from the TVI or caseworker stating that the child needs a computer, would benefit from it, and that purchasing a new computer with the expensive software would create a hardship for the family. No eye reports or income verification are necessary.

The discounted cost of a grant-covered CFTB desktop PC is $70. Only on-campus blind college students are eligible for a grant-discounted laptop at $125.

The Dallas Association of the Blind has closed its doors, so this grant is a one-time, limited offer. Learn more on the CFTB website, call 214-340-6328, or e-mail.

ATIA 2019 Registration Options

There are many registration options for ATIA 2019 to meet your needs, including group discounts. With these discounts, the bigger the group, the bigger the savings. Each full conference registration receives access to the Exhibit Hall, Opening Reception, and educational sessions. Check out the packages below:

  • 5 or more people: 5% off per person
  • 10 or more people: 10% off per person

To register and for a complete list of registration options, visit the conference website.

Call for Papers Opens

On Thursday, August 23, 2018, the call for papers for Volume 7 of the Journal on Technology & Persons with Disabilities and the Science/Research Track of the 34th CSUN Assistive Technology Conference will begin. The Science/Research Journal Call for Papers will conclude Tuesday, September 11, 2018 at 11:59 PM (PDT).

Additional information for the Science/Research Journal Call for Papers can be found on the Center's Journal and Journal Call for Papers pages. A working draft of Volume 7 of the Journal on Technology & Persons with Disabilities will be made available online prior to the 34th CSUN Assistive Technology Conference. The final version will be published on the CSUN ScholarWorks Open Access Repository in Spring 2019.

Please note, the information above is only relevant to the Science/Research Journal Call for Papers. Information for the General Call for Papers has already been sent to those who have expressed an interest in the Call for General Session Proposals. You may indicate your interest by subscribing to the CSUN conference mailing list on this page.

An Evaluation of OrCam MyEye 2.0

In the March 2016 issue of AccessWorld, I had the privilege of writing about several new products introduced at ATIA 2016. One product that particularly caught my interest was OrCam MyEye, a device that coupled a camera and standard eyeglasses to allow a blind person to recognize faces and read text, among other things. I was taken with the potential afforded by this piece of technology, and dreamed of being able to recognize faces of people around me simply by looking in their direction. In that article, I mused about how well this product would really work. In the February 2017 issue of AccessWorld, contributing author Bill Holton briefly took OrCam through its paces, and came away from that experience fairly impressed. He too had some concerns about the future of the product, but was willing to give the developers some time to continue its development.

When I had the opportunity to conduct a more extensive evaluation of the newest iteration of OrCam in April of this year, I was ecstatic. I would finally get the opportunity to have some of my questions answered. Would the unit stand up to the tests I planned to put it through? Would I come away from this experience wanting to own the product, or would I wait for yet another hardware release before taking the plunge?

What follows are some thoughts regarding my experiences with the product, and hopefully some comments from the developers themselves.

A Physical Description of OrCam MyEye 2.0

In discussing OrCam, it is important to be aware of what the product is and isn't. OrCam consists of a piece of hardware that measures 3 × 0.83 × 0.59 inches with a camera on one end and a speaker near the other end. The device mounts on the frame of any pair of eyeglasses via a magnetic mount. The box contains several mounts so you can use OrCam on more than one pair of glasses if desired. The underside of the OrCam device is flat, and has a power button near the speaker. The top side of the device is rounded and has a touch bar that is raised for easy location. The bar clicks when depressed, so it is easy to know when action has been taken. With that said, the small size of the device along with the use of a touch bar might present a challenge for anyone who has dexterity issues. It is possible to adjust the sensitivity of the touch bar, although I did not play with this feature at all during my evaluation. OrCam comes with a lanyard, which should be worn at all times because it's possible to accidentally bump the device and dislodge it from your glasses. On a couple of occasions, I would have launched OrCam across the room had I not been wearing the lanyard. Fortunately, the device never received even so much as a bump during these incidents.

OrCam does not allow a blind person to physically see their environment. Unfortunately, several sighted people in my community were under the impression that I was testing a device that actually allowed me to see. I had to repeatedly explain to people that OrCam is a camera that takes pictures, and reports feedback through a text-to-speech engine whose voice I could easily hear coming through its speaker, which was very near my right ear. Actual demonstrations of the device cleared up much of this confusion, although I constantly reminded people that I could not actually see anything through the glasses I was wearing.

During my trial of OrCam, I chose to use the English female Ivona Kendra voice that is the default, with the speech rate set to 220 words per minute. OrCam speaks many languages.

Setting Up OrCam MyEye 2.0

OrCam has so many settings that it's not possible to cover them all in this article. There are, however, a few things worth noting here. First, to get into the Setup menu, you must quickly press the Power button while swiping along the touch bar. Although I found this doable, it was definitely tricky at times. I often found myself placing the unit into suspend mode instead, which is accomplished by simply pressing the Power button. Another press of the button before the unit suspends will power it down, and I have had the frustration of doing that on more than one occasion.

Once in Setup mode, however, you move through various settings by swiping the touch bar to move from choice to choice, and tapping the touch bar to make a selection. Because of OrCam's small size, I found that instead of wearing the unit, holding it in my hand worked best. Also, I needed to take extra care when swiping and tapping the touch bar to avoid accidentally making choices I didn't want.

I found the menu structure and the voice prompts to be very intuitive, and I never had any trouble figuring out how to make any changes to OrCam's configuration settings. That said, I had two very significant issues during setup, one of which I was able to find a workaround for.

When setting the time, OrCam insisted on being five hours behind when I would restart the unit, even though I was very careful to make sure my time zone was set correctly. Eventually, I set the time five hours ahead, rebooted the unit, and was then presented with the correct time. OrCam representatives were not sure why I had this problem. It is possible to visit the set time page on a computer or mobile device and allow OrCam to read a command from the screen that will cause it to reset the time, but I never used this feature since my workaround did the trick for me. To hear the current time, simply hold your bare wrist in front of the camera as though you were looking at a watch.

The other issue I encountered involved connecting to my home Wi-Fi network. The OrCam user guide instructs you to visit a Wi-Fi set-up page where you are asked to enter the name of your Wi-Fi network along with the password. You then press a button to generate a QR code. From within Setup mode, while OrCam is connected to the charger, you point OrCam at your screen and wait for it to see the QR code and connect to your network. Unfortunately, I was never able to get this feature to work, even with assistance from my sighted wife. The only reason to connect OrCam to the Internet is to install updates to the device, something which I was unable to do during my evaluation. One OrCam representative told me that this process is often completed when the user receives the available one-on-one training, which is free to anyone who purchases the unit. Since the training I received was a couple hours away from my home, there was no way that the trainer could have assisted me in connecting to my home Wi-Fi network. Another representative from the company suggested that perhaps an app might be available in the future that would make this process easier to accomplish.

Unfortunately, I know of no way to update OrCam unless it is able to see a QR code containing the necessary information that will connect it to the Internet.
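For readers curious about what such a QR code actually encodes: many camera-based Wi-Fi setup flows use the widely adopted `WIFI:` payload convention popularized by barcode-scanning apps. Whether OrCam's set-up page emits this exact format is not stated in the article, so the sketch below, including the placeholder network name and password, is purely illustrative of the general mechanism.

```python
# Sketch of the common "WIFI:" QR payload convention. This is an
# assumption about the general technique, not OrCam's documented format;
# the SSID and password are placeholders.

def wifi_qr_payload(ssid: str, password: str, auth: str = "WPA") -> str:
    """Build the text that a Wi-Fi configuration QR code encodes."""
    def esc(value: str) -> str:
        # Backslash-escape the characters reserved by the format.
        for ch in ('\\', ';', ',', ':', '"'):
            value = value.replace(ch, '\\' + ch)
        return value
    return f"WIFI:T:{auth};S:{esc(ssid)};P:{esc(password)};;"

# A setup page would render this string as a QR image for the camera.
payload = wifi_qr_payload("HomeNetwork", "s3cret;pass")
```

A QR library then renders that string as an image; the device's camera decodes it back to the SSID and password without the user ever typing on the device itself.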

Light Requirements for OrCam MyEye 2.0

Whether you are reading a menu in a restaurant, learning a face for future recognition, or reading a church bulletin, OrCam requires a lot of light. Fortunately, OrCam will prompt you when not enough light is available for tasks such as face learning. Other tasks such as recognizing a piece of paper money in my hand were somewhat less clear to me. If I was having difficulty accomplishing a task, I always sought out more light before doing anything else.

Battery Life on OrCam MyEye 2.0: More, Please!

One significant change from the original OrCam device to the current one is that there is no longer a box that clips to your belt. Instead, OrCam is self-contained in a single piece of hardware. Unfortunately, the unit takes a significant hit when it comes to battery life due to this change. Instead of getting 4 or 5 hours from a single charge, you will now find yourself needing to recharge after only a couple hours of use—something that made it difficult for me to truly integrate OrCam into my daily workflow the way I would have liked. It only takes a half hour to recharge the battery, and it is possible to use the unit while it is connected to a small external battery charger, but I would honestly prefer to have a dedicated piece of hardware that contains a bigger battery and perhaps some easier-to-use controls rather than a self-contained unit with poor battery life. OrCam uses a standard micro USB charger, which makes it easy to recharge from anywhere.

Reading with OrCam MyEye 2.0

One of the things I enjoyed most about using OrCam was sorting mail. I was able to pick up an envelope, hold the clear window of the envelope in front of me, tap OrCam's touch bar, allow OrCam to take a picture, and then begin reading the text that was visible through the window. Although not perfect, I was almost always able to learn where the piece of mail was from and often its nature. While it's possible to perform this task with Microsoft's Seeing AI app using my iPhone 8 Plus, it was nice to be able to use both hands to steady the envelope without having to juggle the phone and the envelope at the same time.

If I desired, I could open the envelope and read the contents with OrCam. To begin reading from a sheet of paper, you simply hold it in front of you, tap the touch bar, and let OrCam snap a picture and begin reading. It is also possible to point at a piece of paper with your finger held straight up. OrCam sees your fingernail, and snaps a picture as soon as you move your finger away. This allows you to select various points in a document, and begin reading from that point.

Sometimes OrCam announces that there is unreadable text, and continues reading after that point. At other times, OrCam will tell you that there is more text below where it stops reading, but that it can't read that text. You then need to take another picture farther down the page.

Finally, you can tell OrCam to automatically begin reading a page if it sees three of the four sides of that page.

Using hand gestures or the touch bar, it is possible to stop reading, pause reading, or skim forward and backward through text.

Face Recognition with OrCam MyEye 2.0

Perhaps one of the most fascinating features of OrCam is face recognition. If you want, you can have OrCam tell you any time it sees a face whether it knows that person or not. I enjoyed sitting in a mall and hearing that a man was in front of me, a child was in front of me, a young woman was in front of me, and a woman was in front of me. Some friends of mine had a few good-natured choice words for OrCam regarding its distinction between "woman" and "young woman." On a more serious note, I was on at least one occasion embarrassed by OrCam's insistent announcement that a man was in front of me when, in fact, it was a woman. Fortunately, the woman in question was not aware of what I was doing with OrCam. I can only hope that she didn't even hear the messages I was getting in my ear.

It is possible to have OrCam only recognize known faces, which is the default setting out of the box. It is easy to have OrCam identify the fact that a face is in front of the camera by simply tapping the touch bar.

To learn a face, you hold down on the touch bar while the subject of interest turns their face slowly one direction, and then the other. It is even recommended that the person speak during this process. OrCam takes a series of pictures for up to 30 seconds. You then have the opportunity to record the person's name. Each time my wife's face came into view while I was evaluating OrCam, I heard my own voice say her name.

You can store up to 100 faces in OrCam's database. It cannot download pictures from social media, nor can it recognize faces from a photograph. OrCam needs to take the pictures in order to properly identify the face in the future. It is possible to modify or delete any or all faces in the database.

Currency Identification with OrCam MyEye 2.0

Another handy feature of OrCam is the ability to identify money. To be honest, I found it a bit awkward to hold money up to the camera and wait for OrCam to identify the bill in question. This didn't always work, and I blamed either poor lighting or a wrinkled bill for the inconsistent performance. It is possible that I wasn't always looking in the right place, something that is always a possibility when using OrCam. Since the camera is on the right side of your glasses, it is necessary to make some physical adjustments to ensure that the camera is pointing where you want it to. In one instance, when I was trying to read a plaque at a museum, OrCam told me that I was consistently pointing my finger too far to the right—a prompt that I found most helpful.

Although I like being able to identify money with OrCam, I find it much faster and less awkward to use the MoneyReader app in conjunction with my iPhone 8 Plus.

Additional Features

It is possible to learn and later identify objects with OrCam in the same way that you can with faces. It is necessary to take a picture of the object in question and name that object using your voice. It is my understanding that some people use this feature to identify phones, credit cards, and about anything else you can think of. To be honest, this wasn't a feature I could see myself making much use of, since there are so many alternatives available, such as making a braille label, or, in my case, using my I.D. Mate Galaxy to affix a bar code to the item in question. For this reason, I didn't really test this feature much.

Another feature that I tested with very limited success was the ability to identify bar codes using OrCam. The trick is to actually locate the bar code. Once this happens, OrCam automatically reads the desired information with no further action required on the part of the user.

Finally, I did not test the color identification feature of OrCam, since I live with a spouse who is sighted. It is my understanding that this feature is still very much in early testing, but perhaps others will find it useful in the future.

The Bottom Line

There was a lot to like about OrCam MyEye 2.0. For reading short blocks of text, it worked great. Menus in restaurants presented a challenge because they contain a lot of text and restaurants are often noisy and dimly lit.

While it's possible to perform tasks such as identifying money and reading bar codes with OrCam, low-cost or even free apps for your smartphone accomplish these same tasks very efficiently.

The battery life of OrCam is short enough to make it difficult to use the product without always placing it in suspend mode. If the unit is always sleeping, how will it identify the face of your boss when he walks silently past the door of your office? It is possible to carry around a small external battery charger while you use OrCam, but this defeats the purpose of a self-contained unit, in my opinion.

My inability to connect OrCam to the Internet meant that I couldn't download updates to the product. Whether OrCam simply didn't like something about my home network, or I was not pointing the camera at the QR code properly, the process was not successful. If I were a paying customer, this would present a serious challenge to me.

Finally, OrCam MyEye 2.0 costs $4,500. The cost of the original unit was $3,500. When version 2 of the product was released, customers were given a $1,500 discount, but they still paid $3,000 for the new hardware. A payment plan that would allow customers to purchase OrCam over time with the option of upgrading the hardware during this process would be helpful in easing the sticker shock of this device.

I enjoyed using OrCam, and it was certainly a conversation piece as I interacted with my sighted friends and family. I look forward to seeing what future versions of OrCam will bring to the table, and I hope to evaluate OrCam MyEye 3.0 for AccessWorld one day.

Product Information

Visit the OrCam website to purchase the product for $4,500, read the user guide, access various tutorials, and connect with the Israel-based company on social media.

Manufacturer's Comments

Comments from Gill Beeri, Business Development Director - North America

The next generation OrCam MyEye 2.0 is the world's most advanced wearable artificial vision device. Communicating vital visual information by audio, the intuitively operated breakthrough assistive technology is wireless, lightweight, and compacted into the size of a finger.

OrCam MyEye is meticulously designed with the needs of people who are blind, visually impaired, or have a reading disability in mind — and doesn't adapt, or rely on, a platform that isn't originally accessible.

OrCam MyEye 2.0 delivers increased independence by reading printed and digital text aloud — from any surface — and seamlessly recognizing faces, products, and more, all in real time.

Made from exceptionally durable, high impact plastic, OrCam MyEye magnetically mounts to either side of the wearer's eyeglasses or sunglasses frame.

A new hardware feature built into OrCam MyEye 2.0 is LED lights — positioned on either side of the camera. The LEDs automatically illuminate in low-light environments.

OrCam MyEye is the only wearable artificial vision tech that is activated by an intuitive pointing gesture or simply by following the wearer's gaze — allowing for hands-free use without the need for a smartphone or Wi-Fi. All of OrCam MyEye's operation is processed offline, without requiring an internet connection — resulting in real-time audio while ensuring data privacy.

OrCam periodically releases software updates to existing users that add new functionalities to the device.

OrCam MyEye is available in 20 languages and in over 30 countries.

70% of the device's battery charges within the first 15 minutes.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.


Microsoft Soundscape Adds a New Dimension to Accessible Wayfinding

Last year, Microsoft released its Seeing AI OCR and recognition app on the iOS platform, which received an overwhelmingly positive reception from the blind and visually impaired community. Recently, Microsoft has released a second app on the iOS platform aimed at aiding people with vision loss: a unique wayfinding app called Soundscape. The Soundscape app is novel in the wayfinding space as it provides information about points of interest and the streets around you using augmented reality by way of 3D sound. Since its release, the app has seen several updates and the development team is always seeking feedback from the community so that they can make the app as helpful as possible.

In this article, I will detail Soundscape's features and operation and also provide my thoughts and findings when using Soundscape in real world situations.

Getting Started and App Layout

After downloading Soundscape from the App Store and launching it for the first time, you will be presented with a welcome screen and asked to grant the app the permissions it will need to operate effectively. First, you will be asked to choose a voice that will be used to identify landmarks and points of interest. The default is a female UK voice, but two male voices (US and UK) and a female US voice are also available to download. After making your selection, you will navigate through a series of screens asking you to grant permissions, including location access and motion/fitness activity. Soundscape will describe why it needs these permissions and, once you grant permission, you will be presented with the standard iOS dialog for allowing the app access to this information.

After permissions have been granted, you will be shown a video demonstrating a Soundscape audio beacon as well as a point of interest callout. After the video, several of Soundscape's features are described; you can navigate through these by either changing the page in the Picker or by tapping the "Next Page" button. After the last page, you will be asked to agree to the Soundscape terms and conditions. Note that the terms and conditions are not displayed on this page; use the link located in the text before the check box to view them.

Once you have agreed to the Terms and Conditions, you are taken to the Soundscape Home screen. The Home screen has a simple layout and contains almost all of the controls that you would commonly use.

In the top left of the Home screen you will find a Menu button, where you can manage various settings and access help files and tutorials. Across from this button is a Callouts button, which allows you to turn the audio announcements provided by Soundscape on or off. Below these two buttons are two large buttons that span the width of the screen. First is the Set a Beacon button, which allows you to set an audio beacon on an address or point of interest, followed by an Expand Callout History button below it. This button produces a horizontally scrolling list of recent announcements from the text-to-speech (TTS) voice so that you can review them; you can close the history by activating the button again. At the bottom of the screen is a row of three buttons. From left to right these are My Location, Around Me, and Ahead of Me. These three buttons cause the TTS voice to announce the requested information about your surroundings. Note that controls on the Home screen often have additional options available through the "Actions" item in the VoiceOver rotor. For example, a point of interest displayed in the Callout History will have options to set a beacon on the location, repeat it, or gain more information about it.

3D Sound through Soundscape

The main feature that makes Soundscape unique is how it presents information. All information provided by the TTS voice, as well as the audio beacons, is presented in 3D audio. This means that if you are passing a point of interest on your left, it will sound from your left earphone. Depending on the earphones you are using, the sound seems, to a greater or lesser degree, to originate from the outside environment. To me, the text-to-speech sounds as if it is coming from somewhat above me, giving callouts the feel of audible signs. Beacons sound as a steady drumbeat coming from the direction of the beacon's location. If you are facing the beacon's direction so that it is in front of you, a steady beep plays along with the beat. Again, the beacon is presented as if it is coming from the environment and blends somewhat with the other outside sounds. To hear a demonstration of both callouts and beacons, see this YouTube video.

Callouts and Gaining Information

There are two primary ways of gaining information from Soundscape: automatic callouts, and requesting information directly using the buttons at the bottom of the Home screen. As you travel, Soundscape will automatically call out certain information. By default, Soundscape announces points of interest as you pass them, upcoming intersections, transportation information, and the distance to a beacon, if you have one set. You can manage what is spoken by selecting the Menu button and then the Manage Callouts item. In addition to the previously mentioned callouts, you can choose to have indoor Bluetooth beacons (where they are present) and current location information spoken.

As you walk, the callouts that you have enabled will automatically be announced. Points of interest are announced as you pass them, while intersection information is usually read shortly before you reach the intersection. As mentioned previously, the TTS voice speaks from the position of the point of interest. For example, if you are standing in front of a row of buildings, those to the extreme left and right would be spoken from the very far left and right, while points of interest closer to directly in front of you would be announced closer to the center of the stereo field. When approaching an intersection, the TTS will notify you which streets are left, right, and ahead. Again, streets to the left and right will be spoken from their respective points in the stereo field, while streets that continue ahead will be announced from the center of the field. Announcements of position information, such as "Facing southeast along Main Street," are spoken from the center of the audio field. If you turn, the position of a callout will remain in the direction of the point of interest. For example, if you are walking and hear a point of interest on your left and turn to face it while the voice is speaking, the speech will move through the stereo field so that it is in front of you when you are facing it.

To request information manually, you can use the three buttons across the bottom of the Home screen. The My Location button will provide your current compass direction as well as the street on which you are walking. The Around Me button will alert you to four nearby points of interest, one each behind, to your right, to your left, and in front of you. The Ahead of Me button will list five points of interest that are in front of you. Note that distance does not seem to be a factor in what is reported. For example, I was in an area with few points of interest, so I was provided information on locations up to half a mile away. Another useful aspect of the Ahead of Me information is that if you are unsure which points of interest reported by the Around Me button are ahead or behind you, you can use the Ahead of Me button to confirm. Note that you can gain more information about a specific callout, or interact with it, from the Callout History section on the Home screen.

Beacons

To set a beacon, select the Set a Beacon button on the Home screen. When you set a beacon for the first time, you will be presented with a tutorial that walks you through the process. Interestingly, the tutorial uses a beacon in your area as an example and references it in the instructions instead of using pre-scripted examples. If you choose to skip the tutorial, it remains available in the Help and Tutorials section of the menu.

When you place a beacon on a location or address, you will hear a steady beat from the direction of the beacon. Be aware that the beacon changes position in the stereo field depending on your orientation in relation to it, but it does not get louder or softer depending on your proximity. As you travel with a beacon active, the TTS will regularly provide updates on your distance to the beacon's location. Once you reach the beacon, the TTS will announce it without any distance information. The beacon should stop sounding at this point, but in my tests, I found that I had to stop walking soon after the beacon was announced or else the beacon would not automatically end. If I kept walking, such as when a beacon was not directly on the entrance of a large building, the beacon would continue to sound.

Soundscape in the Real World

Now that we've explored how Soundscape functions, let's look at how effective it is in real-world situations. I've found that in most situations Soundscape functions well, but it is highly reliant on the quality of the maps in your area. Soundscape uses OpenStreetMap (OSM) for its map data, which has variable accuracy depending on your location. In large cities, the data seems to be full featured and up to date, while in more rural areas there can be many points of interest missing or very old data. When using Soundscape in a large city, I found that I was given a great deal of information, and as long as the maps were up to date, Soundscape could identify points of interest with a high degree of accuracy. In my home city, there are few labeled points of interest, so it has been less useful for discovering new points of interest, but I have found it useful for identifying streets and intersections. Fortunately, Soundscape allows you to set a beacon on an address, so it is possible to use a beacon to navigate to a location even if it is not labeled. A feature called Markers is planned for the next release, which would allow you to mark points of interest, addresses, and your current location. This would allow you to fill in gaps in your local map data (only on your own device) and mark your own personal landmarks such as building entrances, park benches, or patches of grass for a guide dog. Keep an eye on AccessWorld News, as we will announce the availability of the feature there once it has been released to the App Store.

I have found Soundscape to be quite accurate in its positioning of callouts and beacons in the stereo field. In some situations I noticed beacons switching orientation even though I was not changing my direction, but in almost all cases, the beacon settled into the correct orientation as I continued walking. This generally occurred after making a turn, so it may have been due to compass calibration issues. In areas with many up-to-date points of interest, I have been able to use Soundscape beacons to locate my destination. In my home city, where points of interest are scarcer, I have still found address-based beacons to be very accurate. I've found Soundscape most useful paired with a turn-by-turn navigation app such as Google Maps; the turn-by-turn directions meshed well with the callouts provided by Soundscape. This would be particularly helpful if you are navigating in an unfamiliar area, where gaining intersection information is most important. I found that Soundscape could be a significant drain on my battery. For my testing I was using an iPhone SE, so newer or larger devices may fare better, but I found it necessary to close Soundscape from the App Switcher to be sure it wouldn't drain my battery after use.

The Bottom Line

Soundscape adds a unique and novel dimension to travel and presents a fairly seamless method for understanding your surroundings, particularly in areas with solid maps. Like any GPS app, it has some accuracy issues, but in my testing these errors were rare and corrected themselves relatively quickly. For anyone with an iOS device, I highly recommend giving Soundscape a try; the app is free, and even if you already have a solid navigation strategy, the unique nature of Soundscape's presentation is enjoyable to experience and you may find it a useful addition to your toolkit.

Product Information

Product: Microsoft Soundscape
Platform: iOS
Price: Free
Developer: Microsoft

Developer Comments

We recognize Soundscape is a new technology. As such, it does take a bit of getting used to. Its intention is to increase a person's awareness of where they are and what's around them in a pretty novel way. Part of our design ethos is to help people get into a position where they can exercise greater choice about how they get from one place to another, so they can start to connect with more of the world around them.

Because Soundscape is new and novel, we are very keen to receive feedback and learn what is working well and what isn't, so that we can work with our insiders and partners to figure out ways to improve the overall experience.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Comment on this article.


New Standards for Braille Displays are on the Horizon

On May 31, the Universal Serial Bus Implementers Forum (USB-IF) approved a new Human Interface Device (HID) standard for braille displays. USB-IF is a non-profit organization composed of many industry-leading companies such as Microsoft, Apple, and Google. Its goal is to advance USB technology to make it more usable for consumers. According to the USB-IF press release, the standard will "make it easier to use a braille display across operating systems and different types of hardware. It will also simplify development, removing the need for braille devices to have custom software and drivers created for a particular operating system or screen reader."

Apple and Microsoft are both among the companies working to achieve this standard. Sarah Herrlinger, director of Global Accessibility Policy and Initiatives at Apple, said: "We're proud to advance this new USB-IF standard, because we believe in improving the experience for all people who rely on braille displays to use their Apple products or any other device." Jeff Petty, lead program manager of Windows Accessibility, commented, "Developing a[n] HID standard for braille displays is one example of how we can work together, across the industry, to advance technology in a way that benefits society and ultimately improves the unemployment rate for people with disabilities." As of this writing, Google has not commented.

If the new standard is supported by manufacturers of mainstream technology, braille displays, and screen readers, you will be able to start a screen reader, plug a display in via USB, and have access to braille with no further effort required. It will be unnecessary to find and install drivers, connect via Bluetooth, or change settings before plugging your display into a USB port supporting the new standard.

Further, those devices that support the standard will have a common set of braille keyboard commands and functions.

People who are deaf-blind will no longer require assistance to pair a display with their mobile device for the first time or reconnect one that has come unpaired. At present, you have to use Bluetooth to use a braille display with an Android or iOS device. If you can't hear well enough to understand VoiceOver or the Android Accessibility Suite, and if you can't see well enough to interact with the screen, you can't establish a Bluetooth connection without assistance. This newly adopted standard will allow you to start your mobile device's screen reader and plug a braille display in via USB, making assistance from anyone else unnecessary.

At the time of this writing, the VFO group is the only braille display and screen reader manufacturer to confirm support is forthcoming for its current products. In a VFO blog post, the company stated:

At this time, we are unable to estimate when such an update will be available, but it is not imminent. Remember, it will also take some time for operating system manufacturers to include support for HID-compatible displays.

It is a complex task, because we are also committed to ensuring our hardware works with all existing operating systems currently supported by our displays, including operating systems that may not receive an update incorporating support for this new standard.

Apple and Microsoft did not provide timeframes for supporting this new standard. Stay tuned to AccessWorld for updates as they become available.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Comment on this article.


Letters, We Get Letters: Receiving Digital Scans of Your Mail Envelopes Using Informed Delivery

Do you use Microsoft's free Seeing AI app or KNFB Reader to scan, sort, and read your mail? Well, the US Postal Service may now be ready to save you at least one of those steps: scanning your envelopes so you can quickly sort through the deluge and select what's important to actually open and read. The service is called Informed Delivery, and it's currently available to residential consumers and eligible personal PO Box addresses throughout most of the United States.

For over 20 years the USPS has been creating a scanned copy of the outside of every piece of mail processed—over 149 billion pieces of mail per year. These scans help with sorting, but I suspect that at least one of the reasons for saving these digital scans may have something to do with the Patriot Act, though note that I have no evidence that this is true.

Whatever the reason, the US Postal Service is now sharing these digital images with recipients, via email, a Web portal, or both.

To sign up for the service, log on to the Informed Delivery website. A link is also prominently featured on the USPS.com homepage. You'll need to either sign up for a USPS account or sign in to your current account. After filling out an Informed Delivery application, you are asked a few credit bureau questions, such as which employer or home address you've been associated with, to establish your bona fides.

After completing my application, I began receiving daily emails with envelope images attached. I am told by a postal representative that up to ten images can be included in a daily email, but all are available from the website. The email also contains a link so you can quickly log on and view your images there. I usually received the emails by 8 am, long before my normal 4 pm delivery.

When I open these emails on my iPhone, each piece of mail is labeled "Scanned image of your mail piece." I can double tap and hold to call up Mail's Share sheet and select Recognize with Seeing AI. Alternatively, I could use KNFB Reader, or the OCR capabilities of JAWS to do the same on my Windows PC.

Today, I had a single piece listed in my daily email, and a quick OCR showed me that all I could expect in the day's delivery was junk mail. Yesterday I could tell by the outside address that one of my four pieces of mail was payment for a previous AccessWorld article. I know the shape of an AFB envelope, which made it easy to pick out of the pile when I walked to the bottom of my driveway to fetch the day's offerings. Another was an insurance form for my wife.

Apparently, once I signed up with my address, I would henceforth be notified of all letter- and postcard-size mail processed on USPS automation equipment arriving at my address. Images of flats such as magazines and catalogues are not captured, nor are images of packages; in emails, package information is presented as status updates when it's available. In addition, some products, like Every Door Direct Mail (EDDM) and saturation mail (mail addressed using mailing lists), are also not available at this time.

Granted, receiving images of the day's mail is not a huge time saver, but if you use a post office box to collect your mail, or if you're waiting for something important, it's an easy way to check whether it's on its way or was lost in the mail. The service recommends waiting at least seven days before reporting a missing letter. On the positive side, if you are missing that letter, you can determine whether it went out for delivery and was lost then, or whether it was never sent. No more "the check's in the mail."

The reason for scanning mail is to use optical character recognition to facilitate delivery. With the resources of the USPS, it's not unreasonable to assume these are high-quality scan recognitions, far more accurate than you can do when you "try this at home." Perhaps with the proper lobbying, this could be the beginning of making snail mail universally accessible. How difficult could it be, after all, to offer a "Send my images with text" OCR option on an Informed Delivery account?

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Comment on this article.


Outstanding Summer Showcases for Technology: ACB and NFB Annual Conventions

For blind consumers of assistive technology and the vendors who manufacture and distribute that technology, two of the most significant events of the entire year occur on Independence Day, the Fourth of July. Those are, of course, the annual gatherings of the American Council of the Blind and the National Federation of the Blind, held this year in St. Louis, Missouri and Orlando, Florida, respectively. If you attended one or both of these events, you no doubt saw exciting new products, heard about thrilling innovations, and had the stimulating sense that your personal batteries were charged by the energy and camaraderie you encountered. If you were not able to attend, this article is our attempt to bring you a vicarious overview of what took place in those venues. We cannot replicate the feel of the crowded exhibit halls, the sound of cheers in the general sessions, or the up-close look at a new piece of hardware in a breakout session, but we will try to get some adrenaline pumping just the same.

Overview

The 57th annual American Council of the Blind Conference and Convention was held June 29 to July 6 at the Union Station Hotel in St. Louis, Missouri. This former train-station-turned-hotel has plenty of charm, and plenty of confusion and staircases. (Yes, there really were two 1st floors and one floor that was both the 2nd and 3rd! And then there was the ubiquity of staircases, in sometimes surprising locations.)

There were restaurants in and near the hotel, an outdoor pool, and a bar that ran the length of the lobby. You could, in other words, go to both general and breakout sessions, canvass the exhibit hall, and have plenty of food and drink without ever leaving the premises.

The weather was sizzling hot!

The National Federation of the Blind's 78th annual convention was held at the Rosen Shingle Creek Hotel. An Orlando, Florida, resort and conference facility, the hotel offered several restaurants and bars, a resort-style pool area, and abundant meeting and exhibit space. Distance from one event to another was such that just participating fully in the convention could serve as ample fitness training for the workout enthusiast.

And yes, like St. Louis, the weather was sizzling hot!

Where technology is concerned, there were far more similarities than differences at these two leading events. Here, then, are some of the highlights garnering the most attention.

Smart Glasses, Smart Glasses Everywhere

The concept of using smart glasses of one iteration or another has exploded in the field of assistive technology and all of it is exciting. Some products, OrCam for instance, offer greater success to the wearer with some vision. None attracted as much attention, however, as Aira, the smart glasses that enable the blind or low vision user to use a camera and Internet connection to share and discuss the environment around her with a trained agent.

Aira offered free access at both conventions, in the host hotels and the nearby airports, and offered special discounts to new subscribers. The company also announced new features in the Aira app such as messaging when you need to have a silent conversation with an Aira agent (in a meeting or classroom, for example) at times when a phone chat would be disruptive.

Smartphone apps that provide sight to blind users, such as Be My Eyes and Envision, were also present and demonstrating.

All About That Braille

There was no shortage of products for the braille lover at either convention.

For the first time in braille display history, there were two displays in both exhibit halls priced at under $500. The Orbit Reader from the American Printing House for the Blind and the Braille Me from National Braille Press are both 20-cell displays designed primarily for reading, with limited note-taking capabilities. Some popular displays from Freedom Scientific (VFO) and HumanWare were deeply discounted for convention attendees. HIMS Inc. was demonstrating its new Q-Braille, a 40-cell display combining the familiar eight-dot braille keyboard with all the special keys from a QWERTY keyboard. Although pre-orders were taken, the Q-Braille is still under final development, and will be featured in a future issue of AccessWorld.

The most innovative refreshable tactile displays were the Graphiti and the Canute, demonstrated by the American Printing House for the Blind. The Graphiti will be a tremendous asset in educational and other settings, with its capacity to display tactile images sent to it from computers or mobile devices, or drawn freehand by a blind or sighted user. The Canute, with its nine lines of braille (360 cells in all), will also be welcome in both educational and employment settings. Both products are still under development.

Hardcopy braille was also widely available in both hotels, including the daily convention newspaper in braille at ACB and voluminous amounts of braille speeches, reports, articles, and calendars at NFB. Both, of course, also provided convention programs, restaurant menus, exhibit hall guides, and lists of area restaurants in braille.

Prime Time for Television Audio Description

As the amount of audio description on major networks and streaming services increases, people with visual impairments are increasingly being perceived as an important target audience. Charter Communications, in particular, was more than a little behind the curve in catching on to the requirements put in place by the 21st Century Communications and Video Accessibility Act, but the company is making some impressive strides toward remedying the situation. Charter's accessibility team expanded from one employee to 21 in just the past year, and some of those employees are themselves people with disabilities. Currently, customers with visual impairments can receive a Roku, which provides some accessibility, but the company was proudly exhibiting the solution that is scheduled to roll out in markets across the country over the next year. The new accessibility solution from Charter promises to speak onscreen information such as channel information, program schedules, and more. While the Spectrum phone app is somewhat accessible, representatives admitted that the audio description track is not passed through on the app. Whether that will be remedied in the upcoming solution remains to be seen.

Amazon also proudly announced advances in television accessibility. To its existing Fire Stick and Fire TV Stick products, Amazon has added the Fire TV Cube, a completely hands-free solution for controlling your TV and accompanying components. All three have Amazon Alexa built in. Even more exciting was the new Fire TV, a collaboration among Amazon, Toshiba, and Best Buy, which promises to be a completely accessible device.

Speaking of Amazon, the company also demonstrated at both conventions the new Amazon locker. Intended as an alternative for Amazon shoppers who would rather pick up their packages than have them delivered to an empty porch, the Amazon locker features complete voice guidance accessibility. Finally, one company is offering a kiosk that provides equal access to blind customers!

Who Cares About Accessibility?

One unspoken measure of success for people with visual impairments is the prevalence of mainstream companies visible at these two major gatherings. In addition to the scores of vendors marketing specifically to consumers with visual impairments, many of the major players in the larger, mainstream tech world were present as sponsors, presenters, and exhibitors. Microsoft, Amazon, Google, Apple, Lyft, and Uber were among the companies playing key roles in both St. Louis and Orlando. While there is still plenty of room to grow in this area, forward motion with regard to incorporating accessibility into mainstream technology products is unmistakable.

Next year, the American Council of the Blind convention will be held in Rochester, New York, and the National Federation of the Blind will be in Las Vegas, Nevada. If you want to get your hands on some of the newest technology products, be on hand for demonstrations of new programs, and simply bask in an environment of peers who care about technology, you might try to check one of these events out for yourself. Of course, if that is not possible, we at AccessWorld will again do our best to bring you the highlights.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Comment on this article.


Reader Feedback Influences Change at AccessWorld

Lee Huffman

Dear AccessWorld readers,

Seven years ago, at this time, AccessWorld implemented a "Comment on this article" link at the end of each article to bring your comments, questions, and ideas right to my inbox. Since that time, hundreds of you have written to share your thoughts, many of which have been shared in the monthly "Letters to the Editor" column. Your feedback has been extremely valuable to the AccessWorld team and has helped us to grow, enrich our content, and better understand your access interests and challenges. I thank you for taking the time to share your thoughts with us.

For those who still haven't taken the opportunity, I encourage you to send me your comments on articles and your thoughts on any topics you would like to see addressed in AccessWorld. Based on your previous feedback, decisions have recently been made to build out additional AccessWorld content in the areas of Employment, Technologies on the Horizon, and Technology Case Studies at major technology companies. We are also working to include an AccessWorld podcast. Stay tuned over the coming months to see how these new initiatives unfold.

We hope you enjoyed the July 2018 Back-to-School issue and gained information to help with getting ready for the upcoming school year.

Sincerely,
Lee Huffman, AccessWorld Editor-in-Chief
American Foundation for the Blind