Full Issue: AccessWorld September 2016

Letters to the Editor

Dear AccessWorld Editor,

In response to The Accessible Kitchen: Using the Instant Pot Smart Bluetooth-Enabled Multifunctional Pressure Cooker by Bill Holton:

This sounds like an interesting device. Does Bill come with the device to show you in person what to do? Otherwise, I must first learn an iPhone and find that often elusive "sighted help."

Bill says: "The approximately 6-inch by 4-inch digital touchpad could be marked with raised dots, or a Braille overlay could have been created, but why bother when there is an accessible app for that?"

I say, bother because not everyone has an iPhone. We aren't all pros at touch and swipe yet. I'm not sure I want my life run from an iPhone. What fun a hacker could have.

Regards,

David

Dear AccessWorld Editor,

In response to Described Video via Blindy TV: "Taking the Vision out of Television" by Bill Holton:

Frankly, I would rather see market and consumer demand spur cable companies and streaming services to act rather than government regulation alone, since government regulations can be overturned.

I am hopeful blind consumers will purchase audio described content when available to show distributors that we are a market worth catering to.

As the population ages, disabilities will become more prevalent, and making content accessible is just the right thing to do!

Dear AccessWorld Editor,

Thank you for your article, The Accessible Kitchen: Using the Instant Pot Smart Bluetooth-Enabled Multifunctional Pressure Cooker, by Bill Holton.

I was so intrigued that I went online to buy one for myself, only to find out that some versions of the product sold before June 1 of last year are under recall, according to the manufacturer. I feel readers need to know this if they decide to purchase one.

See the recall at http://instantpot.com/instant-pot-smart-recall/

Thanks for the great work.

Zuhair

Dear AccessWorld Editor,

Great review of the glasses in the article, A Review of Sonar Glasses from G-Technology Group by Janet Ingber.

I'm actually very seriously thinking about getting a pair, so I checked out their website. A couple of points, although I know it's too late to get the word out:

The price for basic glasses is now $299.

They say the range is adjustable, but it no longer looks like there are two model options.

It also looks like the only feedback is vibration; there is no mention of tonal feedback.

That's all I have for now, but if interested, I can write more if/when I go through the process of actually buying a pair.

Thanks,

Pete De Vasto

Dear AccessWorld Editor,

Described Video via Blindy TV: "Taking the Vision out of Television" by Bill Holton is a good article on Blindy TV. I would add a couple of things. The new edition of the Victor Reader Stream and the HIMS Braille Sense will capture and play all the Blindy channels. May I also recommend the satirical commercials played at the end of each hour; the content can be a little rough, but they are funny. Audio description is minimal, and nonexistent for some programs.

Mike Cole from Berkeley

Dear AccessWorld Editor,

Thank you very much, Bill Holton, for bringing Blindy TV to our attention in the August issue of AccessWorld. It is heartwarming to learn that in our highly business-driven and profit-oriented world, there are still individuals and organizations whose intentions are purely altruistic. Bravo, Blindy TV!

Reading about Blindy TV has incentivized me to write to AccessWorld about my own "project of love" — Karablind (www.karablind.com).

I was diagnosed with retinitis pigmentosa and am now totally blind. In my part of the world (Hong Kong), karaoke is still very much a part of the cultural make-up, and people still love popping in to the nearest karaoke bar to do a bit of crooning; although with the proliferation of karaoke apps in the cyber world, people are channeling their inner Adele at home, much to the consternation of neighbors. I have always liked doing a spot of karaoke myself, but with the loss of my eyesight, I am unable to read the words to the songs on the screen. Hence, the creation of Karablind.

I described Karablind as a project of love because the person who read and recorded each and every single word to the almost 250 songs in my personal library of song lyrics, and who put together the Karablind website so that we may share our end-product with all visually-impaired people all over the world, is the kindest and gentlest man I have the honor of calling my husband.

Most of the songs featured on the Karablind website are in Cantonese or Mandarin, but there are a few English ones too. Please take a moment to read the "About Karablind" section. I know that the majority of AccessWorld's subscribers are not interested in Canto or Mando-pop, but, hey, it's never too late to learn something new! Who knows, we may be able to start up a challenge, rather like the Ice Bucket Challenge, and get a non-Chinese-speaking person to learn a Canto or Mando-pop song featured on Karablind, do it live on YouTube, and then get people to donate to their favorite blind charities or research. Just a thought!

Best regards,

Karen

Dear AccessWorld Editor,

I have recently been reading some of the back issues of AccessWorld, in particular the articles about the Trekker Breeze GPS device.

I use this device and have watched it develop over the years. The reason I am writing is that I feel you should publish a more up-to-date review of the device, as some of the information in the earlier reviews is no longer accurate. One such fact was that the device takes up to three minutes to pick up the GPS signal. However, with the recent upgrade, which Humanware calls the Trekker Breeze+, the device is now able to get a signal in as little as ten seconds.

I enjoy reading AccessWorld, as I am from the UK and it is interesting to hear about some of the technology that is available in the US but not available here.

With kind regards

Craig Slater UK

Dear AccessWorld Editor,

In response to The Accessible Kitchen: Using the Instant Pot Smart Bluetooth-Enabled Multifunctional Pressure Cooker by Bill Holton:

I wanted to let you know that I have created smart scripts for many of the pressure cooker recipes on my website, which you can download into the Smart app. You can find all of the scripts here:

http://www.hippressurecooking.com/category/smartcooker/

Very nice write-up on the Instant Pot SMART!

Ciao,

Laura D.A. Pazzaglia

Dear AccessWorld Editor,

We are looking for a voice medication reminder that also speaks the time when asked. We saw the ROSIE REMINDER on a website. Any advice or recommendations from someone with your experience would be greatly appreciated.

Thank You,

John Hanifin

Response from AccessWorld Associate, Aaron Preece:

Thank you for contacting the American Foundation for the Blind. The following site sells the talking medication alarm system you referenced in your message: epill.com/talk. The device can be programmed with 25 alarms per day, and users can ask for the time and date to be reported. The site also sells other talking alarm systems that have fewer features but are less expensive. In a recent issue of our AccessWorld magazine, we included an article that discussed various technologies that are useful in daily life for people with vision loss, including a section on talking medication systems: A Day in the Life: Technology that Assists a Visually Impaired Person through the Day by Bill Holton.

You may also find the talking clock/alarm systems from LS&S useful; they can be found here: lssproducts.com.

AccessWorld News

AFB Encourages You to Learn NVDA with Free, Online Video Tutorials

The American Foundation for the Blind recently announced the availability of Learn NVDA, a series of free, online video tutorials designed to help people who are blind or visually impaired learn how to use the computer and improve their computer skills.

The Learn NVDA tutorials feature the use of NVDA (Non Visual Desktop Access), a free and fully featured screen reader. The Learn NVDA tutorials allow a person who is blind or visually impaired and entirely new to NVDA to independently install the program and learn how to use it. Learn NVDA teaches:

  • How to install NVDA on a computer
  • How to navigate Microsoft Windows with NVDA
  • How to use NVDA Hotkeys
  • How to install and use the Firefox internet browser
  • How to use Microsoft Word and Excel with NVDA

Each tutorial contains step-by-step instructions with audio of a presenter using NVDA and video of the computer screen. Additional tutorials from AFB will be available soon.

To learn more or to share information about Learn NVDA with people who may be interested in using the tutorials, visit the AFB Learn NVDA website.

Learn NVDA was made possible with support from the Consumer Technology Association Foundation. AFB is also pleased to partner with Lions Clubs International Foundation to support technology literacy. Together, we are working to create a more accessible, inclusive world for people with vision loss.

American Foundation for the Blind Accepting Nominations for the AFB Access Award

The annual American Foundation for the Blind Access Awards honor individuals, organizations, or companies who have improved the lives of people with vision loss by enhancing access to information, the environment, technology, education, or employment, including making mainstream products and services accessible. If you would like to submit a nomination, we encourage you to do so.

All AFB Access Award nominations should be submitted online for review by the AFB Access Awards Committee. Please follow the nomination guidelines. All nominations are due by September 30, 2016.

Please be sure to look at the list of past winners to see if your nominee has already received an Access Award.

Applicable supporting materials, such as articles, brochures, and press releases, are welcome. E-mail material to Ike Presley. Letters of support for nominations are also welcome and should be sent to Ike Presley.

Please mail any materials that cannot be easily converted to accessible electronic format to:

Ike Presley, National Project Manager
American Foundation for the Blind
739 W. Peachtree St. N.W., Suite 250
Atlanta, GA 30308

NYU ABILITY Project Honored by NYC Mayor's Office for People with Disabilities

Mayor Bill de Blasio and the New York City Mayor's Office for People with Disabilities recently presented the NYU ABILITY Project with the ADA Sapolin Award in a ceremony at Gracie Mansion.

Each year, the Mayor and Mayor's Office for People with Disabilities present four ADA Sapolin Awards, named after Matthew Sapolin, the late commissioner of the Mayor's Office for People with Disabilities. The awards recognize individuals and organizations that have made significant contributions to increasing accessibility for people with disabilities under the titles of the Americans with Disabilities Act.

The NYU ABILITY Project is an interdisciplinary research center dedicated to the development of adaptive and assistive technologies for people with disabilities. It fosters collaboration between engineers, designers, educators, speech and occupational therapists, and individuals with disabilities to create opportunities for teaching, learning, and research. For example, through the ABILITY Project, NYU offered a course this spring on vision-related assistive technologies that was co-taught by two assistive technology specialists, one of whom is blind.

In 2015, the ABILITY Project partnered with AT&T to lead the ConnectAbility Challenge, a three-month technology challenge designed to spur innovation for people with physical, social, emotional, and cognitive disabilities. The competition, which coincided with the 25th anniversary of the Americans with Disabilities Act, resulted in the submission of 63 software, wearable, and other technology solutions from developers in 16 states and 15 countries aimed at enhancing the lives of people with disabilities.

2nd Annual Research Proposal Competition Open to Multidisciplinary Teams at 2016 Envision Conference

Recently, Envision announced the return of the Research Proposal Competition to the 2016 Envision Conference, September 7–10, 2016, at the Grand Hyatt in Denver. It is sponsored by the Envision Research Institute (ERI). As part of the competition, an intensive workshop will be offered to assist attendees with developing successful research proposals. One team of workshop participants will receive up to $10,000 in seed funding and access to ERI resources to launch the proposed study.

The Envision Conference gives professionals who work with individuals with low vision in a variety of fields — optometrists, rehabilitation therapists, occupational therapists, special education teachers, researchers — the chance to receive updates and collaborate with each other on the latest ideas and advancements in vision rehabilitation, research, practice, and technology.

This year's Research Proposal Competition workshop has been divided into three sessions:

In the first session, Dr. Walker will discuss "Forming a Translational Research Team."

In the second session, "Statistics Boot Camp for Clinicians," Aaron Johnson, Ph.D., will discuss the meaning of current statistics used in studying low vision as well as the best methods for calculating them. Topics to be covered include traditional statistics analysis as well as the new statistics that are becoming requirements in most scientific and medical journals.

In the third session, "Research Study Design & Proposal Writing," Dr. Walker will walk participants through how to design a study and draft a research proposal, following which each team will be invited to submit a letter of intent (LOI) to participate in the Research Proposal Competition. The ERI will provide specific feedback on each LOI to guide development of full research proposals.

The sessions are staggered throughout the Envision Conference agenda to allow time for contest participants to process earlier lessons and get them thinking about new opportunities for research as they attend other sessions.

Earlier this year, Envision announced the winner of its 2015 Research Proposal Competition, and awarded a research prize of $10,000 to a multidisciplinary team led by Ava Bittner, O.D., Ph.D., FAAO (Dipl.) of Nova Southeastern University (NSU). The prize funded a one-year investigation into the preliminary efficacy of telerehabilitation, i.e., using HIPAA-secure videoconferencing to deliver follow-up low vision services to visually impaired individuals who live in remote areas or have difficulty getting to a specialist's office. Dr. Bittner will present her project at the 2016 Envision Conference.

The complete Envision Conference program is available at the Envision Conference 2016 website.

FCC Announces Test of Emergency Alert System and Seeks Your Feedback

When the FCC last conducted a nationwide test of the Emergency Alert System in 2011, it determined that the system fell well short of meeting the needs of Americans who are deaf, hard of hearing, deafblind, blind, or low-vision.

In the intervening time, the FCC has implemented a new set of technical rules intended to ensure that all Americans receive the information they need to seek shelter and safety in the face of an imminent emergency, such as a tornado. In order to assess the implementation of these improvements, the FCC and FEMA will conduct a new test over television and radio on September 28 at 2:20 pm ET.

However, the FCC cannot, by itself, effectively assess the nationwide implementation of the test for all Americans who are blind or have low vision. Thus, the FCC wants to hear from you about your experience with the test. They are putting together a simple form on their website that will ask for your contact information (to determine geographical region), the source to which you were listening (broadcast TV, cable, radio, satellite), any complications you may have experienced, and specific feedback on your experience.

The FCC particularly hopes to learn whether the audio is clear, distinct, and informative. The FCC also needs to know whether the text crawl is broadcast with sufficient contrast, whether the text is large enough to see, and whether the crawl moves at an effective and understandable speed for viewers with low vision. For those with both usable vision and hearing, the FCC is further interested in whether the text and audio have parity. The FCC also wants to know whether Spanish-language channels broadcast a Spanish-language alert.

For a day or two after the test on September 28, you can provide your feedback on the FCC Consumer Help Center website under the Emergency Communications heading.

Book Review: Getting Started with the iPhone and iOS 9: Step-by-Step Instructions for Blind Users, by Anna Dresner

There are two characteristics, in my view, that are absolutely essential for any nonfiction book worth reading. One is a solid foundation of researched material. The other is the ability to present that material in straightforward, comprehensible language.

Anna Dresner's latest offering in her series of manuals guiding users with visual impairments through the often convoluted waters of using an iPhone contains both of these elements in abundance. The result, Getting Started with the iPhone and iOS 9: Step-by-Step Instructions for Blind Users, is a book that every blind and low vision iPhone user, whether neophyte or veteran, will want to own.

The book is wonderfully organized and blissfully clear. Dresner delivers solid step-by-step instructions in as few words as possible.

Useful Information for New and Experienced iPhone Users with Visual Impairments

This book is of value to both the iOS newcomer and the seasoned user. The first section of the book gives clear and concise information that can guide a new or experienced user through the process of selecting and purchasing an iPhone, case, screen protector, and/or extra battery.

Dresner provides detailed instruction and explanation for enabling the phone's accessibility features, followed by all of the elements in the sequence for setting up a new iPhone. She explains the basic VoiceOver gestures in a manner that is clear enough for the first-time user to follow and yet not so detailed that the experienced user would find it tedious.

Next up is an excellent group of instructions for the various ways you can set up and back up your iPhone. There is, of course, a fantastic explanation of navigating and using iTunes, as well as information on other ways to transfer content to your phone. Wondering about new versions versus older versions of apps? Wondering how to sync the Calendars and Contacts lists on your phone and your computer? All of it is here.

Once the phone is set up satisfactorily and VoiceOver is running, the book works through a commendably large number of the functions the iPhone presents. The new user will learn how to navigate home screens, and will receive gentle encouragement from the author to expect mistakes and not fear them.

Remember It's a Phone

The smartphone craze has turned so many of us into humans with an extra appendage. Though you can accomplish amazing tasks with the iPhone, it is, after all, also a phone! With that in mind, the author provides a thorough discussion of initiating, accepting, and ignoring phone calls before leading the reader off into explorations of some slightly more complex features. With all features, if there is more than one way to accomplish a task (such as typing a phone number or using voice dialing to place a call), those options are clearly presented. Also included are explanations of how to access and make use of the Control Center, Notification Center, and Spotlight.

Especially useful is the detailed discussion of the many gestures available to the VoiceOver user for moving around the screen and through apps, turning pages, starting and stopping reading and music, and more. Even the most seasoned user is likely to find some hidden or overlooked nugget of value in the discussion of Settings, particularly with regard to the rotor and other accessibility features.

Take Note

When it comes to text-related tasks alone, such as sending and receiving e-mail messages and iMessages or using the native Notes app, so many approaches are available that multiple books could be written on several of them. In this book, you can learn about manipulating text (composing, erasing, selecting, copying, cutting, and pasting), along with the many and varied ways in which you can get that text onto your iPhone. The author does a thorough job of discussing the onscreen keyboard, the braille onscreen keyboard, and Siri, all of which are available out of the box on every iOS device. She also covers third-party apps for entering and manipulating text, such as MBraille and Fleksy.

And then there are the external devices for those users who are more comfortable with physical buttons to press. The book covers Bluetooth keyboards and braille displays with sufficient explanation to get any user, experienced or not, up and running. From pairing the Bluetooth keyboard or braille display with your iPhone to experimenting with specific keystrokes, all necessary information is included.

For many readers, there will likely be one particular section or another that will render the book invaluable. The author tells readers how to explore the action connected with individual keystrokes by going into VoiceOver Practice. In the spirit of thoroughness, however, she also has included lists of key commands germane to a given external accessory. There is a list of keystrokes for using a wireless keyboard with the iPhone and another list for using a braille device for input and output. There is a list of keystrokes required for entering special symbols. For those who use a variety of tools for different situations, such as sometimes using the onscreen keyboard, sometimes a braille display, and sometimes a Bluetooth QWERTY keyboard, the opportunity to have such lists in one convenient location will definitely be welcome.

In addition to the lists included in the body of the book, a few lists at the back of the book are equally well thought out and succinctly constructed. Some readers will undoubtedly find themselves referring again and again to one or more of the four appendices appearing at the end of the book. Appendix A provides a quick reference describing the various VoiceOver gestures and the location and function of the buttons on the iPhone. Appendix B is a list of frequently asked questions. Appendix C gives links to those apps mentioned in the book, and Appendix D is an excellent compilation of additional resources.

Bottom Line

National Braille Press and Anna Dresner have, once again, produced a manual that many blind and low vision iOS device users will find indispensable as a guide for learning or refining their understanding of the iPhone and iOS 9. As Dresner herself points out, the whole world of iOS (and all technology, for that matter) is a moving target, but even if small nuances have shifted by the time you read this book, you will nevertheless find it a highly useful tool. Read it from cover to cover the first time, and then keep it on your desk for dipping into one section or another, one list or another, or just to look up an isolated resource.

The book is available in a variety of formats.

Product Information

Book: Getting Started with the iPhone and iOS 9: Step-by-Step Instructions for Blind Users, by Anna Dresner
Price: $24.00
Available from: National Braille Press, 88 St. Stephen St., Boston, MA 02115; 800-548-7323, Ext. 520.
Formats: print, braille, DAISY text, and ePub

The Aware Audible Proximity Solution Navigation App: An Interview with Rasha Said, App Creator and Founder of Sensible Innovations

GPS systems have become so advanced that they can easily and accurately provide location and navigation instructions when traveling outdoors. When navigating indoors, however, these systems are of little use. There have been many attempts to produce an indoor navigation system for people with vision loss, but in many cases these projects never leave the prototype stage, so there are currently very few options for indoor navigation on the market. The Aware app is a new indoor navigation solution from Sensible Innovations. To learn more about the app's development, I spoke with Rasha Said, the founder of Sensible Innovations and the mind behind the Aware app. At the end of this article, you'll also get a user's perspective on the app from Albert Rizzi of My Blind Spot.

The Aware App

The Aware app is an indoor navigation app for iOS and Android. The app can be downloaded free on both platforms and is one part of the overall system. To assist users in navigating indoors, the Aware app connects to specially configured iBeacons that provide position information. iBeacon technology was developed by Apple and uses Bluetooth Low Energy to alert capable devices to the beacons' locations; the beacons can also send information to a user's device. iBeacons are traditionally used to send advertisements and promotions to users' devices, but the Aware app makes use of their capabilities to provide proximity-based information and detailed navigation instructions. With the Aware app active, nearby beacons communicate with the app, which then alerts users to the closest beacon. In essence, the app uses beacons to provide information about nearby points of interest, serving as an audible sign. The app can also provide step-by-step instructions to move from one location to another, delivered in segments. For example, once the user selects a destination, the app will say something like: "Move forward past the store on your right and turn left at the corner." When the app detects a beacon at the corner, it provides the next segment of the instructions. In this way, the app can give the user navigation instructions based on where they are in real time.
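For readers curious about the underlying mechanics: Sensible Innovations has not published Aware's source code, but on iOS, detecting the nearest beacon is typically done with the beacon-ranging API in Apple's Core Location framework. The short Swift sketch below is purely illustrative; the UUID, region identifier, and class name are hypothetical stand-ins, not Aware's actual values or code.

    import CoreLocation

    // Illustrative sketch of iBeacon ranging; not Aware's actual code.
    // Requires the NSLocationWhenInUseUsageDescription key in Info.plist.
    class BeaconAnnouncer: NSObject, CLLocationManagerDelegate {
        let manager = CLLocationManager()
        // A real deployment uses the UUID programmed into its own beacons.
        let region = CLBeaconRegion(
            proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
            identifier: "com.example.venue")

        override init() {
            super.init()
            manager.delegate = self
            manager.requestWhenInUseAuthorization()
            manager.startRangingBeacons(in: region)
        }

        // Called about once per second with the beacons currently in range,
        // ordered by estimated proximity. An app like Aware would look up the
        // nearest beacon's major/minor values in its back-end data and speak
        // the matching point of interest.
        func locationManager(_ manager: CLLocationManager,
                             didRangeBeacons beacons: [CLBeacon],
                             in region: CLBeaconRegion) {
            guard let nearest = beacons.first else { return }
            print("Nearest beacon: major \(nearest.major), minor \(nearest.minor)")
        }
    }

Note that a beacon broadcasts only an identifying UUID plus "major" and "minor" numbers; the human-readable descriptions and routes have to come from somewhere else, which is why, as Said explains below, all of that information lives in Aware's back-end system.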

Interview with Rasha Said

NOTE: This interview has been edited for length and clarity.

A. Preece: Could you tell me a little about yourself, your background, and any development you've done in the past?

R. Said: I have a Bachelor of Science in mathematics and a Master's in Business Administration with a concentration in finance, and I minored in computer science as an undergrad. I passed a few of the actuarial science exams and worked as an actuarial analyst and financial analyst for a while, working a lot on financial models and databases. I did some programming, though not apps, and I have good exposure and good hands-on experience reading code. I actually quit my job last November. I did not develop the Aware app myself—I did not code it, I commissioned a software company…LRS [to do so].

AP: What gave you the idea to develop the Aware app?

RS: I have a son who is legally blind. … He is thriving; he is all honors, a very smart person. The thing we always struggle with is that when we go places, there's no way to accommodate how he gets the information we get through signs. So, that's how this whole thing started. … I believe in technology; I believe in wireless technology. I know my son uses the iPhone. We get printed signs; how can I get that to him in audio? So, I started searching. That was actually three years ago, before iBeacons were even out. [When] they started announcing iBeacons, saying that these are electronic tags that you can put in place to push advertisements to your phone, and that the phone can detect them, I thought, that's it, that's what I'm looking for. All I would need is to go and program an app that tells me about the place instead of giving me a coupon. We ended up creating the front end and the back end. Other companies who are trying to use the iBeacon for the same purpose are not doing that. But we created the whole solution.

AP: What are Aware's capabilities and how does a user interact with the app?

RS: You download it free from the App Store or Google Play. In the roaming mode, you just slip your phone in your pocket and Aware will recite the names of the places you pass as you walk. Now if you're interested in one of the points of interest, there is a More Info button at the top right [that] will give you more information about the place. So, let's say I'm walking in the mall, and let's say the app says "Gap." When I push the More Info button it will say "Gap is a clothing store" and it will give me a simple layout description. The description may include where the cashier is; it may tell me that the boys' section is on the right and maybe there's an escalator. It's not an obstacle-avoiding app by any means… It's meant more for information than anything else. If there is a promo, you'll hear the promo through the More Info button as well; like if there is a sales offer or anything like that.

The Take Me To button on the main screen is a navigation feature. I'm at Gap, and now I want to go somewhere else in the mall. If you're at a certain location, push Take Me To and it will list everything at that site. It will pop up a directory and then I can scroll through the list, or I can skip through the list by category. I'll select my destination and then it will automatically tell me the first segment. Like it will say, "Keep Gap on your right, continue straight, passing Vale's Jewelers" for example, "and then turn right at the hallway intersection." Once I reach Vale's, there's another beacon that will catch me. The app will announce Vale's, so now you know that at least you're on the expected path, and then it will give me the next turn. All of this is information that we put in our back end system. There is a Map button that allows me to upload a map, a visual map, as well. What I do is take the map to a graphic designer, and I enhance it, make it bold, stuff like that. There is a Directory button that allows you to pick another point of interest, different from your current location, from a list and read More Info [about that selected location]. Now these three buttons…pop up whenever you are close to a beacon. Another three buttons on the main screen are more manual [because] if you are not next to a beacon, and the app cannot detect your current location, you will lose those three automatic button options. [So] if you are outside or if you are away from a beacon, [you can still] navigate the place manually and learn about it. So the app has six buttons, three that pop up when a beacon is detected…and three that are always there. [One of those buttons is Venues.] It will list all the venues that have Aware, and [you can] choose a venue, and choose a directory, and it will guide you through more menus. [It's included because] I think that sometimes people like to learn about a venue before they go.

AP: Do you have any plans for the future of the Aware app, any features you want to add?

RS: There is actually another app I am working on, a visual one that will be out very soon, that pulls information from the same back end system. This is coming very soon, maybe next month. You can look at the map visually, you can see the routes visually, stuff like that. Same descriptions, same stuff. I'm trying to get a do-it-yourself packet, with a very easy user interface on the back end that's accessible so you can be your own advocate, buy your beacons, and make your own route. It could be for Orientation and Mobility teachers, or for a small area, or to let students learn their schedule. It could be for a mom who would want to put them in a few places for her child; a do-it-yourself sort of packet to make it easy to deploy a few beacons.

AP: If an organization wanted to set up this system, what would that process be like?

RS: Typically, what I ask for is the floor plan…and the functions of the place. If there is a big ballroom that only has one function, you only need one beacon, but if the ballroom has booths, now I need several beacons, since I want [to provide information for] each booth. At the beginning I can give an estimate just by looking at the floor plan and the exits and the stairs and elevators, all of which should be covered by beacons. Then we'll either meet or do conference calls, and exchange information through e-mail that I need to put in the system. [Finally,] I'll take the beacons on site and place them,…test them, and make sure all the information is in [the system]. So it's like a three-step process. We give each venue their own login credentials so they can log into the system. They log into their own venue and they can change the information. … I do the initial setup to help the venue and then we train them on how to use the software on the back end.

User Thoughts on the Aware App: Albert Rizzi

Albert Rizzi, whom you may remember from his previous interview in the June 2014 issue of AccessWorld, has provided his thoughts on the Aware app:

A. Preece: Where do you make use of the Aware app?

A. Rizzi: My first experience using the app was at the CSUN conference. It was wonderful to be aware of my surroundings and know what booth I was walking up to, or even walking past for that matter. The app provided information on each vendor, which helped me to determine, independently, if I wanted to stop and learn more about the products or offerings at the booth. Too often I am asking what a vendor does, or what company is at this booth. Knowing in advance is a much better way to navigate a conference floor, and I am looking forward to the day this becomes the norm rather than a happenstance. I can see how valuable this would be in an airport, a shopping mall, or even a museum, among other public forums.

AP: Which features have been the most useful to you?

AR: The most useful features are the announcements of the booths or locations I am walking by. Being totally blind, I know I am missing a wealth of possibilities [because I'm] not getting a visual or audible cue about what is around me. [The Aware app] just eliminates all of that and puts the information I need in the palm of my hand. [It] opens up a world of possibilities for me to consider as I walk through a hotel, a conference, or any indoor forum.

AP: How do you interact with the app? Braille display, VoiceOver, voice command?

AR: I use VoiceOver since I am totally blind. I am still trying to find the right amount of information or verbosity, as well as learning how to manipulate the app as seamlessly as I need to. Learning how to use the app is no different than when I had to learn how to use the iPhone.

AP: Has the app changed the way that you travel in indoor spaces?

AR: It absolutely has. Being able to know, without depending on others, in advance, or even in hindsight, what storefronts or vendor tables I am approaching or passing […] opens up a world of possibilities that directly impacts my independence and confidence to navigate large public forums that are currently such daunting undertakings. In the past, I would rely on the kindness of others to alert me to what vendors or offerings were on the floor, and now I am able to choose, with complete confidence and independence, to walk the floor of any conference or convention where the Aware app is being used.

AP: Are there any features that you would like to see in future releases?

AR: I would need to think about that since I am still processing the current wonders that it holds for my independence. But if anything comes to mind I will be sure to share my thoughts and suggestions with Rasha. [Having] an option for searching what is around would be a great feature without question. I can see the potential marketing options here for places like restaurants or retailers alerting the user to sales or even inviting them into their store or restaurant. Sort of like a virtual visual reminder or alert to what is around you.

The Bottom Line

As it has been described, the Aware app could be an excellent solution for providing indoor navigation to people with vision loss. Rasha Said has made the effort to make the app as user-friendly and intuitive as possible, and the use of iBeacons allows for unobtrusive setup in many locations.

Product Information

Product: Aware Audible Proximity Solution app
Manufacturer: Sensible Innovations
Price: Free

Choose the Right Electronic Magnifier, Part 3: Handheld Magnifiers

Editor's Note: This is the third in a three-part series of articles covering the state of electronic magnification options, and offering advice to readers who want to acquire one.

You encounter text everywhere during the day—on restaurant menus, on appliance labels, and in handwritten notes. If you have low vision and rely on magnification to work with text, photos, and objects, these situations can present a challenge. As these examples attest, the text you encounter in your environment can't necessarily be placed under a desktop magnifier's camera or scanned with OCR. If you do a lot of "spot" reading or simply need a solution that's always with you and easy to use, there are many ways to achieve a combination of magnification and portability. You can buy simple optical magnifiers, some designed with the needs of low vision users in mind, some aimed at users who need less magnification. Even your cell phone camera, assuming it's a good one and offers zoom and a flash, can meet many magnification needs. But as we discussed in the first article of this series, there are also a great many electronic devices designed specifically for the needs of people with low vision. In this article, we'll focus on handheld electronic magnifiers, with the goal of helping you identify the features you need and answering the question: given so many options, who needs a standalone electronic magnifier, anyway?

The Overall Range of Handheld Magnifiers

In our first two articles, we described desktop and portable magnifiers with long lists of features, including high-quality cameras, large displays, connectivity to computers, and sturdy stands and tables for arranging documents or objects. Like these larger devices, handheld magnifiers address the needs of users with low vision by providing powerful zoom options, color and contrast adjustments, and easier-to-see controls. Handheld magnifiers are built and sold by many of the same companies that produce desktop models. Of course, handhelds are smaller, one-piece devices. They're easier to carry, with built-in displays. Though most are not cheap, they are typically less than half the price of a desktop unit.

Let's take a look at what constitutes a "handheld" magnifier. It might first be helpful to categorize the devices according to screen size. Handheld screens range in size from a diminutive 3 inches to a generous 7 inches, the largest screen size that can reasonably be held, and used, in your hand for any period of time. For a bit of perspective, a 3-inch magnifier fits in most pockets, while a 7-inch magnifier matches the size of a small tablet computer, like Apple's iPad Mini or the Google Nexus 7. In the middle are 5-inch devices, still pocket-sized in some cases and easily stored in almost any purse. A few devices don't fit into these three main size categories (some 4-inch units are still on the market, for example). One reason that product sizing is so consistent across manufacturers is that companies often source screens, along with cameras and other components, from the same equipment manufacturers.

Larger magnifiers generally have higher-quality cameras. Almost all 7-inch devices, and most current 5-inch models, boast HD quality. Again, this has a lot to do with components available to assistive technology companies, but it's also true you will benefit more from a high-quality display when using a 7-inch magnifier than you will when working with a 3-inch model.

Finding the Right Size of Handheld Magnifier

Because portability is a key feature of all handheld magnifiers, choosing a size that meets your needs is especially important—perhaps more so than you might think. The difference between a palm-sized 3-inch device and a tablet-sized 7-inch magnifier is particularly noticeable at high levels of magnification, since the smaller screen can't display as much of what you're looking at as the larger screen can. Conversely, a tiny magnifier fits easily into a purse, or even a front pocket (be careful), and will thus travel more easily anywhere you go. The weight of the device is less of a concern, since handheld devices aren't particularly heavy (they range between 5 ounces and 1.5 pounds). Larger devices offer more flexible positioning options: they rest on stands or bases, which can free your hands as you read or work with crafts or objects. If you want to read your TV serial number, however, or that wall-mounted sign in the break room, you'll find it easier to maneuver a smaller device into position, or to hold it in one hand. Even with limited screen real estate, a 3-inch handheld could change your life. If you've been using a small optical magnifier and upgrade to a small handheld, you will gain greater magnification and better lighting (both on what you're viewing and on your magnified view of it).

As is so often the case, a middle-sized handheld magnifier offers the best of both worlds for anyone who values portability, affordability, and an extensive feature set. Many 5-inch models mirror or closely resemble their larger siblings, but with less bulk and lower cost. The size of your hands, and your need to use a magnifier one-handed, will also impact your choice. Mid-size devices, weighing in at under 10 ounces, can easily be held in one hand by a person with typical strength and dexterity. One-handed operation is made easier if your chosen device provides a handle, as many small and mid-size units do. You can certainly use a 7-inch device one-handed, but you will likely become fatigued relatively quickly.

High Definition Handheld Magnifiers: Check Camera Megapixels

As we mentioned in the second article of this series, HD quality isn't always expressed in terms of precise resolution. Indeed, handheld devices that claim HD resolution list their specs less often than larger devices do, and you won't see numbers like "1080p" that appear in monitor and TV fact sheets. More often, the best gauge you have of a handheld's display quality is the number of camera megapixels. Magnifiers don't use the high-end lenses of dedicated digital cameras. Handheld magnifier cameras usually range from 2 to 8 megapixels, with larger devices usually, but not always, providing more. A better camera means crisper text, especially at high magnification levels, and more detail for crafts and near-distance viewing, too.

Handheld Magnifier Form Factor Considerations

The basic layout of a handheld magnifier is a metal or plastic rectangle, with the display and controls facing the user, and a camera and light on the opposite surface. Many manufacturers mix things up a bit, providing a folding handle, or a base that allows the magnifier's camera to slant down toward what you're viewing. Handles extend the reach of your magnifier, as well as make it easier to hold with one hand. A base that places the display and camera at an angle allows you to use the magnifier hands-free. If handle and stand options are important to the way you plan to use a magnifier, be sure to examine models you're considering before you make a purchase.

Zoom, Color, and Contrast Options in Handheld Magnifiers

Handheld magnifiers rely on digital zoom to magnify an image. Maximum zoom level for handhelds is usually around 20x, less than that of a typical desktop magnifier. That's partly because the smaller screen sizes make it impractical to zoom to, say, 50x or 60x. The greatest zoom we found was 24x, available in 7-inch models from Freedom Scientific and Optelec. Larger devices tend to offer greater zoom, but not by much.

Like larger magnifiers, most handhelds support readability by allowing you to freeze images, choose color schemes, and, in a few cases, add masks or orientation lines to help you follow text as you read. Any good quality magnifier will give you the ability to change the color and contrast settings of the display, providing options to reverse video, use grayscale, or alter the background and text colors. You can find as many as 20 color and contrast combinations on some devices, though most devices, even the smallest ones, offer 8 to 12. If you find a particular color combination easier to read, check to see that any handheld you are considering supports that combination.

Freeze frame is another near-universal feature in handheld magnifiers. Once you've zoomed in on something you want to see, select freeze frame, and you no longer need to keep pointing the magnifier at the object. Zoom in on and freeze-frame a label on a can of food, for example, and then you can put the can down, adjust your grip on your magnifier and its position, and take a careful look at the label. Many magnifiers allow you to save a few images or, if there's a USB port, copy what you've saved to a computer.

Masking or windowing allows you to hide a portion of the display, or place a horizontal line under a line of text. These features can make it easier to find a specific bit of writing, or to more easily maintain your place while reading. You won't find these features on all handhelds, and they're frankly not as important on a 3-inch device as they are when you're trying to read a lot of text on a 7-inch screen. It follows, then, that guides are most often found on higher quality magnifiers.

Controls, Connectivity, and Special Features to Consider when Selecting a Handheld Magnifier

An important challenge for manufacturers of handheld magnifiers is to ensure the buttons and dials that control zoom, color, and lighting are easy to identify and use. Controls should be clearly labeled with icons so a user with low vision can easily identify each one. Controls should contrast with the device bezel, and the buttons and icons should be large enough for a person with low vision to either see or differentiate tactilely. To save bezel space, some devices put controls on one side of the device, or on the underside. Like other aesthetic features, the variation in placement and type of controls is a good reason to hold and test a magnifier before you buy one. If you're getting a device for someone else, pay special attention to the size and tactile nature of the controls.

A surprising number of magnifiers offer some kind of connectivity to other devices. Most handhelds have an HDMI out port, which allows you to display what the magnifier camera sees on a separate monitor. This vastly increases the size of the display, though text or photos may look quite different when "blown up" to TV size. Be aware, too, that not all HDMI out ports are created equal: you might not be able to connect your magnifier to a computer monitor, for example. You'll also find USB ports on many handhelds. If the magnifier supports saving images you capture with the camera, you can connect the device to a computer and copy them over. Most magnifiers don't store a large number of images, though a few do. If you plan to save magnifier images, be sure you choose a device that can save and copy them in a way that suits your needs.

After You've Selected a Handheld Magnifier

Magnifier packages are not typically chock-full of accessories, though there are a few you should look for. A carrying case protects your screen and camera. Lots of magnifiers offer a strap, which is great if you're moving around a store, reading prices or labels. Many packages also provide a cleaning cloth. If yours doesn't, it might be worth getting one and tucking it into the carrying case.

In most cases, you won't spend a lot of time reading magnifier documentation. As we pointed out earlier, controls and settings should be easy to see and understand. Documentation is, however, a quick window into the care a vendor has taken during product development. Is documentation laid out in an easy-to-understand and visually accessible way? Do English language instructions appear to have been written by a native speaker? Is there a troubleshooting guide and information about how long the built-in battery should last? Is it easy to find the tech support phone number, e-mail, and social media links? Is it clear that you can contact support staff in your own country and language? The answers to these questions will identify those companies that are committed to quality service and support and those that may simply be offering a good deal.

Cost of Handheld Magnifiers

Handheld magnifiers, like a lot of assistive technology products, come at a high cost. As we discussed in the earlier articles in this series, cost has a lot to do with the special features required to support users with low vision, and the small number of units sold. Assuming you want and need a magnifier, though, it's helpful to have a realistic understanding of what you can expect to pay. The three size ranges are a good starting point for comparison. And as we've discussed throughout this article, keep in mind that feature sets, like prices, tend to correlate to display sizes, so bigger devices will have more features and be more expensive. Another important pricing caveat is that devices that use older technology will be less expensive than the newest HD magnifier. Unlike computers or phones that need the latest hardware to continue working as software improves, a magnifier using older technology will still magnify effectively, and vendors will continue to sell older devices until they run out of them. This isn't a bad thing. Each product's specs are (or should be) available on the company website, allowing you to compare features and technology for yourself. If a vendor does offer older devices, chances are they also have a shiny new one with more features and a correspondingly higher price. The choice is yours.

Determining the Necessity of a Dedicated Handheld Magnifier

We began this article by pointing out that you have many options when it comes to handheld magnification. A dedicated handheld magnifier isn't right for everyone, and, after reading about them, you might wonder if your phone camera or an optical magnifier would work just as well. A smartphone with a camera is probably the greatest existential threat to the makers of handheld electronic magnifiers. Because today's phones are generally accessible and have great cameras, many people with low vision choose to use the device that's already in their pocket when they need to enlarge text or objects. Apps that specialize in magnification are plentiful for iOS and Android phones and tablets. The Magnifier feature in Apple's upcoming iOS 10 even offers color filters and freeze-frame, two marquee features of dedicated handhelds. So far this is a unique phone feature, and it still won't provide masking or reading guides. Two reasons your phone isn't an ideal magnifier replacement are its lack of a stand and the drain that using the camera places on the battery. These issues may not matter if you only need magnification occasionally, but if you rely on your handheld magnifier throughout the day, especially for reading, a phone won't cut it.

Optical magnification is a budget-conscious and flexible alternative to the electronic kind. Many devices with high-quality optics include a stand and even a light, and even those designed specifically for people with low vision can be had for under $100. Optical magnification tops out at 10x in most cases, and these devices are often quite small, especially at higher power levels. They're a good fit if you don't need digital zoom or contrast adjustment features.

The Choice is Yours

No two people with low vision have exactly the same magnification needs. Even if two people have the same eye condition, they each live different kinds of lives and interact differently with the world. You should approach selecting a magnification option, whether it's your cell phone, a pair of reading glasses, an optical device, or an electronic handheld magnifier, as a very personal decision. Remember, too, that these devices last a long time, so it's smart to buy the best made device with the most features and greatest flexibility that you can afford.

The Current State of ChromeVox Next (Beta), a Screen Reader for Chrome OS

In the January 2016 issue of AccessWorld we reviewed a Chromebook and its built-in screen reader, ChromeVox. Google has been in the process of overhauling the screen reader, and has dubbed the upcoming version ChromeVox Next. For a number of months, a beta version of ChromeVox Next has been available to users of the stable channel of Chrome OS. Though it is still in beta, ChromeVox Next may be useful for Chromebook users.

In this article, I will detail the differences between the original version of ChromeVox (dubbed ChromeVox Classic) and ChromeVox Next, as well as describe how the beta version of ChromeVox Next performs in comparison to its predecessor. For this review, I used an Asus C201 Chromebook running Chrome OS version 52.0.2743.116, released August 3, 2016, on the Stable Channel. ChromeVox Next is available natively on Chromebooks that run modern versions of the operating system.

After activating ChromeVox Classic with the command CTRL+ALT+Z, you can press the combination SEARCH+SHIFT+Q and then press Q again to activate ChromeVox Next. To return to ChromeVox Classic, press SEARCH+Q. ChromeVox will state "Next" or "Classic" to alert you to the version that you have activated.

Essential Differences Between ChromeVox Classic and ChromeVox Next

In the Classic version, ChromeVox injected code into each page in order to read it. This allowed ChromeVox to provide access to pages but caused some issues. One major issue with this method was that the full capabilities of the screen reader were not available when the user was doing something other than viewing a webpage. In addition, in my use of ChromeVox I found that the screen reader would often lag on large pages and on pages with many refreshing ads. ChromeVox Next has been designed as a separate application so that it can operate throughout the operating system. The main consequence of this change is that ChromeVox Next commands work outside webpages, in areas where the screen reader had previously struggled, such as the Files app. ChromeVox commands such as those used to navigate by headings or links are now available outside of a webpage, and basic navigation—by objects, lines, or words—is available there as well. It is now possible, for example, to navigate the visual Back, Forward, and Reload buttons using these basic navigation commands, where before they were only available when navigating a webpage. Because of this change, if you use Sticky Mode (which allows access to ChromeVox commands without the need to press the Search key with each combination), you must turn it off when entering text into an edit field or else you will begin triggering ChromeVox commands.

New Earcons in ChromeVox Next

ChromeVox Classic has always contained earcons, sounds that are played to alert the user to various elements and events. These have all changed in ChromeVox Next. One notable difference is that links now produce a sharp thump, similar to that of a struck drum, instead of the soft tone that was used for links in ChromeVox Classic. When a page loads, a constant clock-like ticking sound is played that tapers off in volume until it falls silent when the page has finished loading. A guide to these sounds does not exist, so you must learn them through trial and error. Most take the form of a thump similar to the one played for links, simply at varying pitches.

Altered Keyboard Commands

Keyboard commands in ChromeVox Next are similar to, but not the same as, those found in ChromeVox Classic. The ChromeVox key, which was SHIFT+SEARCH in ChromeVox Classic, has been replaced by the SEARCH key alone. You can still activate Sticky Mode by pressing the SEARCH key twice quickly. One major difference is that you no longer need to press either the P or N key before issuing a jump command, such as for a heading or link. Now, you only need to press the ChromeVox key plus a single key to move forward, or the ChromeVox key plus SHIFT plus the element's key to move backward.

With this change, in Sticky Mode you can navigate the web much as you would when using a Windows screen reader. As noted on the ChromeVox Next page, some jump commands have not yet been implemented. Below, I have listed the elements that are included and the keyboard key associated with each:

  • Button: B
  • Checkbox: X
  • Combo box: C
  • Editable text area: E
  • Form field: F
  • Heading: H
  • Link: L
  • Table: T
  • Visited link: V

Basic screen navigation with the Arrow keys has changed somewhat in ChromeVox Next. ChromeVox Classic required you to select a navigation level for the Up and Down Arrow keys, with the Left and Right Arrow keys moving by the next level down. Now, with the ChromeVox key held down or Sticky Mode enabled, the Up and Down Arrow keys navigate by line and the Left and Right Arrow keys move by object. Other key commands are used in conjunction with the Arrow keys to move by word or character, as well as to jump to the top or bottom of the page. I have listed these commands below:

  • Move by word: SEARCH+SHIFT+CTRL+LEFT to move backward and SEARCH+SHIFT+CTRL+RIGHT to move forward
  • Move by character: SEARCH+SHIFT+LEFT to move backward and SEARCH+SHIFT+RIGHT to move forward
  • Move to the top of the page: SEARCH+CTRL+LEFT
  • Move to the bottom of the page: SEARCH+CTRL+RIGHT

It is also possible to activate the item that has focus with SEARCH+SPACE, the same combination used in ChromeVox Classic. A particularly useful new addition is the ability to simulate a right-click and invoke the context menu with SEARCH+M.

ChromeVox Next on Standard Webpages

ChromeVox Next demonstrates very polished performance on traditional webpages. Basic navigation with the Up and Down Arrows is similar to navigating in ChromeVox Classic with the level set to Line. The commands that differ most in ChromeVox Next are those for navigating by character and word, and those for jumping to the top or bottom of the page. Most other commands have changed little or mirror familiar commands from other screen readers.

Using Sticky Mode and the new jump commands in ChromeVox Next will feel very familiar to Windows screen reader users, since the element hotkeys are nearly the same. One issue that I encountered is that certain combo boxes are not recognized by the combo box jump command in ChromeVox Next. To find these combo boxes, the user must use the form field jump command instead. An example of a combo box that ChromeVox Next does not recognize as one is the Filter combo box under the heading "The Most Recent Additions and Updates to the AppleVis Site" on the AppleVis homepage. ChromeVox Next similarly misidentified most other combo boxes I encountered, with the exception of the combined combo box/edit field on Google's search page, which it identified correctly.

Both ChromeVox Classic and Windows screen readers can recognize combo boxes even when they are part of a form. ChromeVox Next, however, treats combo boxes inside a form only as form fields. Note that ChromeVox Next refers to these fields not as combo boxes, but as popup buttons.
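The announcement appears to reflect how Chrome's own accessibility tree labels these controls. As a hedged illustration, again TypeScript against the chrome.automation API rather than anything from ChromeVox itself, an extension could look for that role directly:

    // Illustrative only: query the active tab's accessibility tree for
    // controls exposed with the 'popUpButton' role, the label ChromeVox Next
    // uses when announcing these combo boxes.
    chrome.automation.getTree((root) => {
      const popups = root.findAll({ role: 'popUpButton' });
      popups.forEach((popup) => console.log(`popUpButton: ${popup.name}`));
    });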

Two major functions have yet to be included in ChromeVox Next. One is the ability to select text on a webpage. In ChromeVox Classic, you can press SEARCH+S to begin a selection, navigate through the text you want to select, and then press the same command to end the selection. This command has not yet been implemented in ChromeVox Next. The other missing feature is ChromeVox Classic's Table Mode, which allows users to navigate cell by cell through a table and gather information about it. At this time, it is possible to jump to a table with a jump command, but not to navigate the table with the level of detail found in ChromeVox Classic.

ChromeVox Next and Google Web Apps

I reviewed Google Drive, Google Docs, and the Files app to determine their usability with the current iteration of ChromeVox Next. In my testing, Google Drive works very well with ChromeVox Next. Tabs are read correctly, and it is possible to navigate through files and folders as well. Menus and dialog boxes were also read correctly; I did not encounter any issues that would make it difficult to use Drive with ChromeVox Next.

At this time, there is an issue with Google Docs that may make it difficult to use effectively with ChromeVox Next: when navigating within a document, words, characters, and lines are often not read. For example, if you move through a word character by character using the Left and Right Arrow keys, many characters are not spoken. There is no set pattern to these omissions; in some cases, a letter is read once but not again when navigated past later. The issue occurs most commonly with characters, less commonly with words, and very infrequently when navigating by line. Otherwise, menus and dialogs behaved as expected, although ChromeVox Next did not announce dialogs when they were launched, whereas ChromeVox Classic does so by saying "Entered Dialog." At this time, ChromeVox Classic provides the better experience in Google Docs.

ChromeVox's behavior in the Files app has shifted as the Chromebook has been updated; at the time of this writing, ChromeVox Next and Classic act similarly there. With both versions, it is possible to navigate menus, buttons, and lists of files and folders without using ChromeVox-specific commands, but neither reads the list of drives and folders unless ChromeVox-specific commands are used. When moving from one file or folder to another, ChromeVox Next reads the previously selected item before the one currently selected, which means you must wait through a repeat before hearing your current selection. If you navigate using ChromeVox Classic, or using ChromeVox Next-specific commands, only the newly focused file or folder is read. One major benefit of ChromeVox Next in the Files app is access to the context menu for items. This menu is not announced when it is launched, but it is possible to navigate and operate the menu once it has been invoked. Because of this, ChromeVox Next does provide some benefit over ChromeVox Classic in the Files app.

The Bottom Line

Even in its beta state, ChromeVox Next provides a more fluid experience when navigating traditional webpages than ChromeVox Classic. Some needed commands are not yet implemented and there are issues with some controls and applications in this beta version, but even in its current state ChromeVox Next can enhance the Chromebook user experience for people with vision loss. Since you can switch between ChromeVox versions with a few keystrokes, you have the freedom to use ChromeVox Next in the areas where it currently excels and return to Classic for those areas where ChromeVox Next is incomplete or is experiencing issues.

Product Information

Product: ChromeVox Next (Beta)
Manufacturer: Google
Price: Free


The Fire TV with VoiceView from Amazon: An Accessibility Review

In the July 2016 issue of AccessWorld, we described how Amazon brought its VoiceView touch screen reader to the Amazon Kindle Paperwhite using the Audio Adapter. Already, the company has taken screen reader accessibility another step further with the release of VoiceView Over Bluetooth on the Kindle (8th Generation), which enables VoiceView access over a Bluetooth speaker or pair of earbuds, and costs less than the Paperwhite to boot.

The VoiceView touch screen reader is a ground-up rewrite of what was originally a modified version of the Android TalkBack screen reader. One advantage of writing VoiceView from scratch is that a preview version has now been added to the Amazon Fire TV. Recently, I obtained a $49.99 Amazon Fire TV Stick with Voice Remote. Keeping in mind that this is a preview version, here's what I discovered.

The Amazon Fire TV

Similar in function if not content to the Apple TV, the Fire TV is a set-top device that enables streaming of both audio and video content through a television's HDMI port. Most flat screen TVs include at least one HDMI port; most include multiple ports.

The Fire TV comes in three versions. The current configuration of the original $99.99 Amazon Fire TV features a quad-core processor, 2 GB of memory, expandable flash storage via an SD card slot (up to 200 GB), Wi-Fi 802.11a/b/g/n/ac, and wired Ethernet access. These last two features increase the throughput enough to support 4K ultra-high-definition video.

The $49.99 Fire TV Stick with Voice Remote has a dual-core processor, 1 GB of memory, Wi-Fi 802.11a/b/g/n, and support for 1080p video. There is also a $39 version of the Fire TV Stick that does not include voice control, but as you will see, the extra $10 will more than pay for itself in convenience.

Since I don't plan to store a lot of applications and content on the device itself, and I don't own a 4K TV, I went with the Fire TV Stick with Voice Remote. Along with the device itself, the box included a USB charging cable, a wall adapter, and the voice remote (batteries included). There was also a male-to-female HDMI extender cable, in case the stick does not fit behind your set and needs to be connected at the far end of an HDMI cable.

The stick resembles a thick package of gum, with a male HDMI connector at one end and a USB charging port along a side edge. The remote is smaller than a standard TV remote, but somewhat larger than the Apple TV remote. There are only a few buttons, so their functions and positions are easy to memorize. At the top center is the Voice Command button (the version without voice control does not include this button). Press and hold this button to issue voice commands. Directly beneath is the Select button, which is surrounded by a standard navigation ring with controls for up, down, left, and right. Completing the layout are two rows of three buttons each. The upper row, from left to right: Back, Home, and Menu. The bottom row, also from left to right: Rewind, Play/Pause, and Fast Forward.

Getting Started

VoiceView Preview will automatically update over the air, so if you already have a Fire TV, the chances are good it has been updated to the latest version. To check, simply press and hold the Back and Menu buttons for a few seconds to start VoiceView.

Not every new Fire TV will arrive with the latest software updates already installed. Brand new Fire TV owners will require sighted help to set up the unit. Although mine came with my Amazon credentials pre-entered, I did need sighted assistance to help select a language and Wi-Fi network. Once this was done, the Fire TV auto-played a brief instructional video outlining basic commands used to operate the device.

Unfortunately, the software did not automatically update to the latest version at this point. I didn't know you could manually initiate a software update via the Settings/Device/About menu, so I simply turned off the TV and waited overnight to see if it would update on its own. It did. In the morning, when I pressed and held the Back and Menu buttons, VoiceView began speaking almost instantly.

Using VoiceView

In order to save network bandwidth, the over-the-air update that includes VoiceView Preview doesn't include the high-quality IVONA voice. Instead, the preview began with the lower-quality Android SVOX Pico voice, which offered a brief tutorial on how to use VoiceView. Midway through the tutorial, the IVONA voice kicked in. This is the same text-to-speech voice used in VoiceView for Fire tablets and the new VoiceView for Kindle. The IVONA voice is clear, crisp, and easy to understand, even if you've never used a screen reader before.

One of the controls missing from this preview version is the ability to change the VoiceView volume. The preset volume is acceptable, but while viewing one program with an especially quiet soundtrack, I turned my TV volume up so high that VoiceView blared uncomfortably loudly.

The Left and Right controls on the navigation ring move the user forward and backward through the various menus and listings, offering screen-reader-like feedback along the way. The Select button confirms your choices. Additional information unrelated to menu structure, such as program descriptions, is often presented on the screen; pressing Up and Down provides access to these points of interest. Note that some information, such as user reviews, is not currently accessible; VoiceView announces "Disabled" when it encounters it.

For a sighted user, the Fire TV menus do not always run from left to right (some run top to bottom), but VoiceView rearranges them for easier navigation. This is done through the screen reader's Enhanced Navigation mode. To toggle between Standard and Enhanced navigation, press and hold the Menu button.

For sighted users, pressing the Menu button once opens a control menu with options such as Jump to Time and Play from Beginning. When VoiceView is running, the first press of the Menu button speaks a context-sensitive help screen. Pressing the Menu button twice calls up the secondary menu.

Accessing Media on the Fire TV

You can use the Amazon Fire TV to purchase and play movies, TV shows, music, and other content directly from Amazon, or to play content already in your Amazon library. Amazon Prime members also have access to Prime Video and Prime Music.

Use VoiceView to navigate through the various category menus and listings until you locate your desired title, then use the Select button to play, purchase, or add the title to your watch list. Titles available for free play via Prime Video are announced as such after their title names are voiced.

Alternatively, you can use the Search box to jump directly to a title. Users of the original Apple TV will be familiar with the style of keyboard used. It consists mainly of a single row of letters and numbers that you navigate using the Left and Right keys. Press Select to choose a character. There is an audible click when you press the Select button, but you will not hear audible confirmation of the letter you have typed. I found these clicks to be hit and miss: sometimes I would hear them, sometimes not. If I didn't hear the click, I would reenter the character, only to discover that I had now entered it two or three times. I could not find a way to review what I'd typed. The Backspace key is at the far bottom right of the keyboard, and in the preview version of VoiceView, at least, you are not able to review what you delete.

When I entered the letter "B" and arrowed to the end of the keyboard, the first listing I found was "Batman vs. Superman," which I'm guessing is currently the most popular movie beginning with the letter B. Other titles followed. I do wish at least the first two or three of these titles would auto-voice as I type. The Up and Down keys enable rapid navigation through the keyboard, so they cannot be used here to voice otherwise inaccessible screen content.

Some Fire TV mobile apps include a keyboard. I used the iPhone app, and was pleasantly surprised when it recognized my Fire TV and VoiceView announced the PIN I needed to enter to activate the app. I found that the iPhone app was a much easier way to enter text into the search box, although I did have to return to the Fire remote to confirm my selection. You can also choose to bypass the keyboard altogether using Voice Search.

Voice Search

If there is a specific movie, TV program, or audio track title you are looking for, you don't have to cramp your fingers navigating menus or typing in the name. Instead, press and hold the Voice button at the top center of the remote. You will be prompted with a sound to begin speaking. When you're done, release the button. I found this feature to be quick and easy to use and extremely accurate.

You can also use many Alexa commands by prefacing your request with the name "Alexa." For example, "Alexa, what's the weather?" or "Alexa, play the Beatles." Basically, purchasing a Fire TV with Voice Remote is a bit like getting many features of the Amazon Tap, which we reviewed in the May 2016 issue of AccessWorld, for free.

Third-Party Content

Along with games, which are currently mostly inaccessible using VoiceView, the Amazon Fire TV also offers access to a growing number of third-party content providers, including Netflix, Hulu, Spotify, and Pandora. CNN, CBS, and NBC News are also available. For competitive reasons, iTunes is not available, just as Amazon Video is unavailable on the Apple TV.

As of this writing, very few of these third-party apps have been optimized to work with VoiceView. I was able to install many of them, but then I would encounter a non-speaking screen and have to back my way out. Amazon does not require third-party vendors to make their services accessible, though they do offer developer resources to help get the job done. A little lobbying on behalf of Fire TV VoiceView users might also speed the process. We shall see.

Final Thoughts

If you own an Apple TV and wish you could also accessibly stream Prime Video on your home TV, the $40 cost of a Fire TV Stick makes it an affordable option. Amazon Prime membership provides access to any number of movies and TV shows that are not available on other streaming services such as Netflix.

The Fire TV VoiceView Preview is by no means a finished product. But it does most of what it needs to do to allow you to watch Prime Videos or accessibly purchase and play a movie or TV episode with the rest of your family.

As with the Kindle Paperwhite Audio Adapter, the proof of Amazon's commitment to accessibility will come in the form of software updates. For now, I commend Amazon for releasing VoiceView as a preview, instead of spending many months assuring the blind community that something was in the works but not—if you will excuse the wordplay—ready for prime time.

I have enjoyed many of the Prime Video exclusives, such as the TV series Catastrophe. But in all honesty, I prefer to watch Netflix for the simple reason that Netflix offers audio description, and Amazon does not. However, considering the resources the company has dedicated thus far toward making their various products speech accessible, I have to think that audio description will be included sooner rather than later. Stay tuned.


Pokemon GO: Elevating the Conversation around Accessible Gaming

Lee Huffman

Dear AccessWorld readers,

AccessWorld has noted an increase in reader interest in gaming, and we have responded by increasing the number of gaming-related articles over the past year.

Recently, AFB published a blog post regarding the accessibility of the Pokemon GO app for the Android and iOS operating systems. The game has been incredibly popular, allowing players to capture the titular Pokemon while moving throughout the real world. Our blog post on adding accessibility to the game sparked a vigorous online discussion with gamers, game developers, and programmers about the details of how access could be included in the app and the obstacles that could make accessibility challenging. AFB's original blog post was even picked up and discussed on Micah Curtis's YouTube video blog.

In this month's Editor's Page, we would like to further explore this topic, sharing some of the insights we have received from those who have participated in this discussion.

For many apps, adding accessibility consists of simply adding text labels to elements. However, as we have learned, adding access to elements in Pokemon GO is not quite that easy. Pokemon GO uses the Unity engine, a very popular cross-platform engine for developing video games. The Unity engine streamlines game development in various ways, making it a go-to engine for game developers.

Unfortunately, Unity does not appear to natively support Apple's VoiceOver protocols for communicating element information. To add accessibility to Pokemon GO, Niantic, the company that created the game, would apparently need to add VoiceOver compatibility from scratch. This process would not be completely unprecedented; there have been efforts by independent developers to add VoiceOver compatibility to Unity. Adding VoiceOver accessibility to the Pokemon GO app may require more time and resources than it would for an app that uses native controls, but it is possible.
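To see why this retrofit is harder than labeling native controls, note that Unity draws its interface into a rendered scene that screen readers cannot inspect, much like a web game drawn on a canvas. The TypeScript sketch below is a rough web analogy of my own, not Unity or Niantic code: a canvas-based game can expose its controls by maintaining a parallel set of labeled elements that assistive technology can find and activate.

    // Hypothetical web analogy: a game rendered to <canvas> is invisible to a
    // screen reader, so each on-screen control gets a labeled proxy element.
    function exposeControl(label: string, onActivate: () => void): HTMLButtonElement {
      const proxy = document.createElement('button');
      proxy.textContent = label; // this is what the screen reader announces
      proxy.addEventListener('click', onActivate);
      document.body.appendChild(proxy);
      return proxy;
    }

    // Example: surface a hypothetical "Throw Pokeball" action.
    exposeControl('Throw Pokeball', () => {
      // forward the activation into the game's input handling
    });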

Adding access to on-screen elements would make the majority of Pokemon GO accessible to people with visual impairments. However, two of the app's more interactive portions would be more challenging to make accessible in a way that gives a person with vision loss the same experience as sighted peers. One of these areas is capturing Pokemon.

When a Pokemon has been discovered, a player must throw a Pokeball at the Pokemon to capture it, but the Pokemon doesn't make this easy. When targeting the Pokemon, the player must both strike it with the Pokeball and time the throw so that a shrinking ring around the Pokemon is at its smallest. This process could be made accessible without altering the capture method by providing sounds to indicate direction, as sketched below. Many audio games already use this method to identify the position of objects in a soundscape.
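As a brief Web Audio sketch in TypeScript of this idea, the tone below is panned toward the target, and its pitch rises as the ring shrinks, so a player could both aim and time the throw by ear. The pan and pitch mappings are my own assumptions, not anything proposed by Niantic.

    // Illustrative targeting cue: pan places the target left or right, and
    // pitch rises as the ring shrinks toward the ideal moment to throw.
    const cueCtx = new AudioContext();

    function playTargetCue(pan: number, ringSize: number): void {
      const osc = cueCtx.createOscillator();
      const panner = new StereoPannerNode(cueCtx, { pan }); // -1 left, 1 right
      osc.frequency.value = 220 + (1 - ringSize) * 660; // smaller ring, higher tone
      osc.connect(panner).connect(cueCtx.destination);
      osc.start();
      osc.stop(cueCtx.currentTime + 0.1); // a short, repeatable blip
    }

    // Example: target slightly right of center, ring nearly at its smallest.
    playTargetCue(0.4, 0.2);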

The other active aspect of Pokemon GO is Gym battles. Gyms are located at various landmarks. Once a player reaches a certain level, he or she can choose a team and begin to attempt to capture gyms for that team. Once a gym has been captured, players can leave their Pokemon to defend it from challengers. When a player challenges a gym, he or she must compete in a Pokemon battle against the Pokemon that occupy the gym.

The battles in Pokemon GO are very interactive. Each Pokemon has two attacks, a standard attack and a special attack. When in combat, a player can tap the screen to make standard attacks against an opponent, and once a displayed gauge has been filled, the player can touch and hold to launch their Pokemon's special attack.

When attacking an opponent, a player fights the opponent's Pokemon in turn. To keep from taking damage, the player can swipe left or right to dodge attacks from the opponent. These on-screen movements could also be made accessible through sound: a tone could indicate the state of the special attack gauge, and stereo-positioned sounds could signal the position of an enemy attack. With these adjustments, a player with a visual impairment could use his or her ears to compete much as a sighted peer does.
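A similar sketch covers these battle cues: a tone whose pitch tracks the special attack gauge, and a hard-panned blip marking the side an incoming attack comes from. As before, the specific mappings are hypothetical.

    // Illustrative battle cues, using the same Web Audio approach as the
    // capture sketch above. Mappings are hypothetical.
    const battleCtx = new AudioContext();

    // Pitch rises as the special attack gauge fills (fillLevel runs 0 to 1).
    function playGaugeCue(fillLevel: number): void {
      const osc = battleCtx.createOscillator();
      osc.frequency.value = 220 + fillLevel * 440;
      osc.connect(battleCtx.destination);
      osc.start();
      osc.stop(battleCtx.currentTime + 0.05);
    }

    // A blip panned hard left or right tells the player which way to dodge.
    function playIncomingAttackCue(side: -1 | 1): void {
      const osc = battleCtx.createOscillator();
      const panner = new StereoPannerNode(battleCtx, { pan: side });
      osc.connect(panner).connect(battleCtx.destination);
      osc.start();
      osc.stop(battleCtx.currentTime + 0.15);
    }

    playGaugeCue(0.75);        // gauge three-quarters full
    playIncomingAttackCue(-1); // attack from the left: swipe away from it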

Though making the Pokemon GO app accessible would not be a simple process, it is not impossible. Niantic has an excellent opportunity to make Pokemon GO accessible to people with vision loss and provide an experience similar to that of sighted players. Judging by the response of gamers who are visually impaired to our recent blog post and in other online forums, there is a great deal of support for accessibility, and Niantic can be a leader in this new technology by ensuring that all possible players are included. Future iterations of Pokemon GO could even be used in other settings, such as a tool for teaching orientation and mobility skills.

You can support these advances by helping us continue this online conversation, to show Niantic support for accessibility and help outline technology solutions. Thank you to all of our readers who have participated in the dialogue so far. You are helping to raise the profile of accessibility and showing others how to imagine what can be possible.

Taking AccessWorld's theme of accessibility in a different direction: as our regular readers know, AccessWorld has covered home appliance accessibility in years past. It has been brought to my attention several times recently that readers would, once again, like more information on this topic. In direct response to those comments, and to the interest in the August article by Bill Holton entitled The Accessible Kitchen: Using the Instant Pot Smart Bluetooth-Enabled Multifunctional Pressure Cooker, AccessWorld will be looking at home appliance accessibility later this year and next year from the perspectives of people who are blind and people who have low vision.

We will also be taking steps to update all sections of the AccessWorld Home Appliance Accessibility Guide, which is located on the AFB main website under the Living with Vision Loss tab.

We will investigate and cover features such as tactilely discernible controls, audible tones, font size and style of control labeling, color contrast, glare, and the positioning of controls. We hope this will provide additional useful information for our readers, along with practical guidance when purchasing home appliances. So, stay tuned.

As I'm sure you have all noticed, the days are now growing noticeably shorter. Students have returned to school, and it's now a logical time to begin thinking about work and careers. October is National Disability Employment Awareness Month, and next month AccessWorld will recognize its observance by taking a closer look at employment resources for people with vision loss as well as by revisiting tried and true job search strategies. Of course, we will also be looking at technology to support and enhance your career and work life.

The AccessWorld team hopes you will read each article in this and every issue to gain as much access information as possible. As technology is always advancing, we encourage you to stay proactive in seeking out new access strategies that may better meet your particular situations at home, at school, and at work.

Sincerely,
Lee Huffman
AccessWorld Editor-in-Chief
American Foundation for the Blind
and
Aaron Preece
AccessWorld Associate
American Foundation for the Blind