Full Issue: AccessWorld April 2018

Highlights from The 2018 American Foundation for the Blind Leadership Conference

Each year, the American Foundation for the Blind Leadership Conference (AFBLC) brings together individuals from across the vision loss field. In addition to educational sessions, the conference also serves as a networking space and includes the presentation of many of AFB's annual awards. The information presented at the conference covers a range of topics, from aging with vision loss to public policy. For this article, I will be focusing primarily on the technology content presented at the conference.

AccessWorld Technology Summit

The AccessWorld Technology Summit occurs each year and is a full-day session that showcases new technologies and technological research of interest to people with vision loss. Presenters include mainstream and assistive technology companies as well as other organizations active in the technology arena.

New Developments in Accessibility for Google Products

The first session of the AccessWorld summit was presented by members of the Google accessibility team. Members included Kyndra LoCoco, Accessibility Community Manager, Victor Tsaran, Technical Program Manager for Android, Laura Palmaro, Program Manager for Chrome, and Roger Benz, Program Manager for G Suite. Each presenter discussed the accessibility features of their respective products with a focus on what accessibility features have been included recently.

Various accessibility improvements have been included in Android Oreo (Android 8.0). Over the past several years, the team has focused on bringing productivity improvements to TalkBack, the Android screen reader. One aspect of this is the inclusion of extensive keyboard shortcuts for TalkBack that provide access to most features of the screen reader and allow TalkBack to be used similarly to a desktop screen reader. Other improvements include the addition of an "Accessibility" button on the navigation bar, where the "Back," "Home," and "Overview" buttons are located, which gives the user the ability to launch a specific accessibility tool. Users can also assign a TalkBack gesture to the fingerprint sensor, if present.

The redesigned screen reader for the ChromeOS operating system is now simply titled ChromeVox, and the previous version has been retired. This version of the screen reader has redesigned keyboard shortcuts, new earcons (audio cues used to identify on-screen elements), and the ability to be used consistently across the operating system. Another key feature recently introduced is support for USB-capable braille displays on ChromeOS. Support for Bluetooth displays is in development.

Several accessibility advances are in development for G Suite apps (Docs, Drive, Sheets, etc.), with some available on certain combinations of platforms and screen readers. Magnification has been built into Google Docs; this feature is available on ChromeOS and Mac, with Windows support via ZoomText coming soon. Braille support has also been added to Google Sheets, with access currently available on ChromeOS.

Google has introduced a new video conferencing system titled Hangouts Meet. In addition to software, specially designed hardware has been produced for use with the system so that it can be used in physical meeting rooms. The hardware includes a touch panel for controlling the system that is accessible using the ChromeVox screen reader.

An Introduction to Oath

Oath is a Verizon company that incorporates various Web properties including Yahoo, AOL, HuffPost, TechCrunch, and Flickr. The presentation was given by Darren Burton, Accessibility Specialist for Oath's Yahoo property. He explained that, much like other parent companies (such as VFO and Alphabet), Oath will mostly keep the properties under its umbrella distinct and separate. In addition to working with development teams at Oath to promote app accessibility, the accessibility team provides new hires with information on accessibility processes during initial orientation. Oath has also prepared a set of learning modules explaining methods for implementing the most common forms of accessibility, giving developers guidance when the accessibility team is not available. Recently, Oath has been developing a second usability experience lab in the New York City area.

Samsung

Youngsun Shin, Principal Designer for user experience for digital appliances at Samsung, discussed recent efforts to bring accessibility to Samsung appliances. One recent enhancement is the introduction of rising and falling tones to indicate temperature changes on Samsung refrigerators. Refrigerators with speakers will also announce the temperature; the tones were added so that refrigerators with only a buzzer still have a method for providing the information in an accessible form. Samsung is also beginning to add braille to certain appliances; where this is not possible, the company will generally have tactile stickers produced to provide tactile feedback for controls.

The team has also recently designed a voice-activated washer for the Korean market with the assistance of an employee with a visual impairment. The technology is in development for a global release. Samsung also provided the voice-enabled washer used in the athletes' village at the Paralympics, which allowed the team to receive further feedback on the accessible washer from people with visual impairments.

Audio Description Research

David Vialard of Illinois State University discussed research he is conducting on audio description. More audio description is available than ever before for people who are visually impaired, though research into best practices is scarce. The International Collection of Child Art at Illinois State University served as a pilot project for providing audio description for images. Because a variety of individuals submitted descriptions for the images, researchers can see which items in an image are commonly described, as well as differences among describers. Vialard also uses eye-tracking technology to determine where individuals focus when looking at images. Data on focus patterns and the duration of a viewer's focus on specific areas are collected for analysis. This information can be combined and displayed to show focus patterns for multiple individuals so that similarities and differences can be more easily detected and studied.

Bluetooth Beacons for Indoor Navigation

Mike May of Envision, Inc. discussed the current state of indoor wayfinding for people with vision loss, current concerns, and future developments in this space. The most common form of indoor navigation uses Bluetooth beacons: mainstream devices that can be placed in a space to communicate information to a smartphone app when a user comes within a certain distance of the beacon. Many companies currently deploy beacons in public spaces such as schools, malls, and airports. At the moment, most beacons are associated with a single app, which means a user must have multiple indoor wayfinding apps on their device and know which one to use in a given venue. To alleviate this issue, several indoor wayfinding providers formed a coalition to share beacon data. Companies involved in the effort include Sendero (maker of the Seeing Eye GPS app), BlindSquare, and Indoo.rs, as well as Radius Networks, which makes beacons and maintains a wiki-style database of deployed beacons. The goal of the coalition is to share beacon data across apps so that everyone can benefit from the beacons, regardless of who placed them.
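As a rough illustration of the shared-beacon idea, the sketch below (with invented beacon IDs and labels, not any shipping app's code) shows how a wayfinding app might pick the strongest nearby beacon from a Bluetooth scan and resolve it against a registry that any app in the coalition could consult:

```python
# Hypothetical shared registry mapping beacon IDs to venue labels.
SHARED_REGISTRY = {
    "beacon-101": "Gate B12",
    "beacon-102": "Restroom, Concourse B",
    "beacon-103": "Information Desk",
}

def nearest_beacon(readings):
    """Return the ID of the strongest (roughly, closest) beacon.

    readings: dict of beacon ID -> RSSI in dBm (closer to 0 = stronger).
    """
    if not readings:
        return None
    return max(readings, key=readings.get)

def announce(readings):
    """Return the label a wayfinding app might speak to the user."""
    beacon = nearest_beacon(readings)
    if beacon is None:
        return "No beacons in range"
    # Any app sharing the registry can resolve the same beacon ID.
    return SHARED_REGISTRY.get(beacon, "Unknown location")

# Example scan: beacon-102 has the strongest signal.
scan = {"beacon-101": -82, "beacon-102": -60, "beacon-103": -75}
print(announce(scan))  # -> Restroom, Concourse B
```

Real deployments typically estimate distance from calibrated transmit power rather than raw signal strength, which is noisy; that noise is one source of the imprecision that limits beacon accuracy.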

One disadvantage of beacon technology is its limited accuracy, though a new technology may provide a solution. Wi-Fi fingerprinting takes into account all available signals and their strengths at a device's specific location. The distinct combination of signals and signal strengths provides a virtual fingerprint for a given location. This could provide accuracy to within a foot and could lead to accurate turn-by-turn directions for indoor spaces.
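A minimal sketch of the fingerprinting idea, with invented access point names and locations: each surveyed spot stores the signal strengths it sees, and a new reading is matched to the closest stored fingerprint.

```python
import math

# Hypothetical survey: each location's fingerprint is the set of
# visible access points and their signal strengths in dBm.
FINGERPRINTS = {
    "Lobby":     {"ap-1": -40, "ap-2": -70, "ap-3": -85},
    "Elevator":  {"ap-1": -65, "ap-2": -50, "ap-3": -80},
    "Cafeteria": {"ap-1": -85, "ap-2": -75, "ap-3": -45},
}

MISSING = -100  # assumed strength for an access point not observed

def distance(obs, ref):
    """Euclidean distance between two signal-strength vectors."""
    aps = set(obs) | set(ref)
    return math.sqrt(sum((obs.get(ap, MISSING) - ref.get(ap, MISSING)) ** 2
                         for ap in aps))

def locate(observation):
    """Return the surveyed location whose fingerprint best matches."""
    return min(FINGERPRINTS,
               key=lambda loc: distance(observation, FINGERPRINTS[loc]))

# A reading taken near the elevator matches the Elevator fingerprint.
print(locate({"ap-1": -68, "ap-2": -52, "ap-3": -79}))  # -> Elevator
```

Production systems refine this nearest-neighbor matching with denser surveys, averaging over several neighbors, and filtering over time, which is how sub-meter accuracy becomes plausible.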

Verizon and the Development of 5G Wireless Technology

Zachary Bastian of Verizon discussed Verizon's research into 5G wireless technology, its benefits over current technologies, and deployment challenges. 5G technology has demonstrated speeds of 1 gigabit per second for both upload and download, while current 4G and 4G LTE technology achieves speeds of around 12 megabits per second. The technology is still in the development stages, as the way 5G waves interact with objects is still being studied.
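A quick back-of-envelope calculation using the speeds quoted above (the 1.5 GB file size is an arbitrary example) shows how large that gap is in practice:

```python
# Time to transfer a 1.5 GB file at the quoted 5G and 4G speeds.
FILE_BITS = 1.5 * 8 * 10**9          # 1.5 gigabytes expressed in bits

time_5g = FILE_BITS / (1 * 10**9)    # seconds at 1 gigabit per second
time_4g = FILE_BITS / (12 * 10**6)   # seconds at 12 megabits per second

print(f"5G: {time_5g:.0f} s, 4G: {time_4g:.0f} s")  # 5G: 12 s, 4G: 1000 s
```

In other words, a download that takes roughly a quarter of an hour over 4G would finish in seconds over 5G, ignoring real-world overhead.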

5G wireless will require many more cells than current wireless data technologies. 4G service can cover a wide area using a single tower, where 5G may need hundreds of smaller towers, which means 5G will most likely appear in urban areas before being deployed in more rural locations. In addition, the antennas used with the technology are larger than those currently used, so 5G will most likely appear first as a replacement for home internet before being used elsewhere.

HumanWare

Peter Tucic of HumanWare spoke about new developments in the company's braille products. The Brailliant 14 is a 14-cell braille display that can connect to five Bluetooth-capable devices at one time. The device contains a basic notetaker, and notes can be synced to an iOS device using the Brailliant Sync app. The app is currently only available on iOS, but there are plans to develop an Android version in the future. Instead of traditional cursor routing keys, the Brailliant has touch sensors above each cell that perform the same function, a design choice intended in part to improve the durability of these controls.

KeySoft version 5 has been released for HumanWare's BrailleNote devices. Key additions in this version are one-handed operation and improvements to math content. It is now possible to review a visual graph using the braille display: the user can scroll through the graph, with the X and Y axes represented by specific braille symbols and the equation being graphed appearing as braille dots forming the line. It is also possible to emboss a hardcopy tactile graphic of the equation being graphed, giving the user an overview of the graph.

Aira

Paul Schroeder of Aira discussed the Aira technology, demonstrated the service, and detailed recent developments for the company. The Aira service operates by connecting visually impaired customers (Explorers) with trained sighted individuals (Agents) who can provide assistance with a variety of tasks. Though a user can use a smartphone's camera to provide video for an agent, the Aira service generally relies on a set of camera-equipped smart glasses. The recently released Horizon smart glasses have up to 7 hours of battery life and provide a wider field of view for the Aira agent than earlier models. In addition, the picture quality transmitted by the camera has improved, which makes it faster for agents to discern small details, such as when reading small print.

Aira also announced that Explorers who use Aira for job seeking tasks, such as editing a resume or picking out an outfit for an interview, will receive those minutes free. Aira is also partnering with airports to provide free access to travelers when inside terminals. In addition, Aira is reaching out to other public sites about providing access to visitors with visual impairments.

General Sessions

Each day at AFBLC begins with a general session discussing pressing issues in the field of vision loss. This year, the issues discussed ranged from technology to employment. The first general session brought together representatives from mainstream technology firms, who discussed how their companies have improved product accessibility, supported employees with vision loss, and improved opportunities for people with visual impairments. The panel consisted of Sarah Herrlinger of Apple, Mark Lapole of eBay, Megan Lawrence of Microsoft, and Jeffrey Wieland of Facebook. Highlights included braille display support for Apple TV and emoji support on braille displays from Apple; the Facebook Navigation Assistant, which makes Facebook easier to navigate and use for blind or low vision users; the recently released Soundscape wayfinding app from Microsoft; and eBay MIND Patterns, a compendium of accessibility examples that can be applied to various situations when developing for accessibility. All of the companies also described the importance of accessibility knowledge among students and their efforts in this area.

The second panel featured representatives of mainstream technology companies as well as individuals with visual impairments who have successful careers. The goal of the panel was to determine what challenges currently face people with vision loss when attempting to find employment, and what methods companies are employing to increase opportunities for people with visual impairments. The panel consisted of Jennison Asuncion of LinkedIn, Dina Grilo of JPMorgan Chase, Jen Guadagno of Microsoft, and Megan Mauney of Florida Blue. The companies represented have accessible hiring processes as well as supports for individuals with vision loss after hiring. Interestingly, the panelists explained that their companies find it useful if an applicant discloses their vision loss early in the application process, as it allows the company to make sure all aspects of hiring are accessible. Disclosing a disability early also allows companies to be better prepared with accommodations, such as assistive technologies, once an applicant is hired.

The final general session was presented by Bryan Bashin, CEO of the San Francisco LightHouse. Using available data and extrapolation, he detailed how people with visual impairments are using technology to learn new skills, complete tasks, and network in greater numbers than ever before. He compared the time spent using these technologies and learning from one another with the time spent learning from or receiving services from traditional visual impairment organizations, with the former far outnumbering the latter. In addition, he described how the general public's understanding of and experience with visual impairment is now shaped much more by individuals than by organizations. For example, YouTube videos about visual impairment created by individuals unaffiliated with any organization have been viewed in greater numbers than those produced by visual impairment organizations.

Awards

AFBLC also serves as a forum where AFB presents various awards to those who have achieved much in the field of visual impairment. For the first time, the AFB Helen Keller Achievement Awards, which are usually hosted at their own venue, were presented at the conference. This year marked the 22nd Helen Keller Achievement Awards, with Microsoft and Facebook receiving awards for improving the accessibility of their existing products and developing new accessible products and services. Haben Girma, a disability rights lawyer and the first deafblind individual to graduate from Harvard Law School, also received an award for her dedication to improving opportunities for people with visual impairments.

The Corinne Kirchner Research Award is named for AFB's former staff member Corinne Kirchner, a longtime leader in research in the field of visual impairment. Recipients are honored for their contributions to the field through valuable research that reveals ways of improving the lives of people with visual impairments. This year's recipient was the Envision Research Institute, which conducts research on a range of topics related to vision loss, from improving rehabilitation for people with visual impairments to accessible wayfinding. Mike May, formerly CEO of the Sendero Group, received the award on behalf of the organization.

The Stephen Garff Marriott Award was named for Stephen Garff Marriott, who after losing his vision rose through the ranks at Marriott and served as a role model for others with vision loss. The award honors those who have been extraordinary mentors or have been extremely successful in their careers. This year's award was presented to Jennison Asuncion, who is the Engineering Manager—Accessibility at LinkedIn.

The Migel Medal is named for AFB's first chairperson, M.C. Migel, who established the award in 1937 to recognize those whose work has significantly contributed to improvements in the lives of individuals with vision loss. The first recipient was Larry Campbell, formerly the president of the International Council for Education of People with Visual Impairment (ICEVI) who has also long served in the field of education of people with visual impairments. The second recipient was Ted Henter, who was the key developer of the JAWS screen reader.

The Bottom Line

This year's AFB Leadership Conference was filled with information from across the field of blindness and visual impairment. Key mainstream and assistive technology companies took part, providing details not only on their new products but also on possible future developments in technology as a whole that will benefit individuals who are blind or have low vision. Next year, AFBLC returns to Arlington, Virginia, with the conference being held February 28–March 1, 2019.


Bringing the Conferences to You

Lee Huffman

Dear AccessWorld readers,

The AFB Leadership Conference (AFBLC) 2018 was held April 5–7 in Oakland, CA. I am excited to say the conference attracted well over 400 established and emerging leaders in the blindness field, making this the most highly attended AFBLC to date. Conference attendees included technology experts, corporate representatives, university professors, teachers of students with visual impairments, orientation and mobility instructors, rehabilitation professionals, and parents. They came from diverse organizations and institutions spanning the public and private sectors, including school districts, schools for the blind, Veterans Administration facilities, hospitals, private agencies, and universities.

As in years past, this year's AFBLC focused on technology, leadership, employment, education, seniors experiencing vision loss, orientation and mobility, and rehabilitation.

AFB and the AccessWorld team would like to thank our generous conference hosts, San Francisco's LightHouse for the Blind and Visually Impaired and Northern California AER, and our sponsors: JPMorgan Chase & Co., Delta Gamma, Hewlett Packard Enterprise, Facebook, Google, Lee Hecht Harrison, Vanda Pharmaceuticals, Microsoft, APH, Sprint, Verizon, Aira, AT&T, Canon, Oath, NIB, HumanWare, and T-Mobile.

In addition, this year's Aging Track was sponsored in part by the National Research and Training Center (NRTC) on Blindness and Low Vision at Mississippi State University and the Older Individuals Who are Blind Training and Technical Assistance Center, U.S. Department of Education RSA Grant #H177Z150003.

It's not too early to mark your calendars and save the date for the 2019 AFB Leadership Conference. Join us as we envision a future with no limits at the 2019 AFB Leadership Conference, which will be held February 28 through March 1, 2019, in Washington, DC.

The 33rd Annual International Technology and Persons with Disabilities Conference, otherwise known as CSUN 2018, was also held in late March in sunny San Diego, California. It was impossible to take in all the pre-conference workshops, educational sessions, forums, technology exhibits, and group meetings, but the American Foundation for the Blind was there, doing its best to experience as much of CSUN 2018 as possible!

AFB staff members were involved in several meetings with national leaders in the mainstream and access technology arenas. To help keep AccessWorld readers up-to-date with the goings-on at CSUN, AFB was, once again, proud to sponsor the Blind Bargains podcast coverage of CSUN 2018. The AccessWorld team encourages you to log on to the Blind Bargains Audio Content page, which features great interviews, presentations, and updates on the latest in technology news from the conference. In addition to listening to the podcasts, you can read the accompanying transcripts on the Blind Bargains website.

As part of AccessWorld's special CSUN coverage, please be sure to read the articles by Shelly Brisbin and J.J. Meddaugh in this issue, which also highlight the technology shown in the CSUN exhibit hall.

It's not too early to mark your calendars and save the date for the 34th annual CSUN conference. Next year, the venue will move from the Manchester Grand Hyatt in San Diego, to the Anaheim Marriott, in Anaheim, California, and will take place March 11–March 15, 2019. As CSUN continues to grow, the AccessWorld staff wishes the CSUN conference planners a smooth venue transition and successful conference in 2019.

Sincerely,
Lee Huffman
AccessWorld Editor-in-Chief
American Foundation for the Blind

Letters to the Editor

Dear AccessWorld Editor,

This message is in response to Bill Holton's March 2018 article, New Accessibility Support Options Courtesy of Google, Microsoft, and Be My Eyes.

This is a great article. My only disappointment is the almost nonexistent support BrailleBack offers in the Android environment. Grade Two braille is not supported, and navigation and screen review commands are very limited. To make matters worse, the application doesn't even use a standard command set like those used by other operating systems such as Windows and iOS. I am an accessibility tester; I review websites and mobile apps every day. I have never gotten BrailleBack to work for more than a minute or so before it crashes, and I am using a fairly new phone and a variety of braille displays. I tried installing a new beta version of BrailleBack, but kept getting errors from the Google Play Store despite the fact that I have a Google account and am a member of several Google groups. I finally gave up on BrailleBack and am using a standard Bluetooth keyboard to enter text and navigate the screen.

When I raise issues about braille support on computer mailing lists for the blind, I get the typical response that if I don't like braille support in the Android environment, then I should use iOS. I don't have a choice in the matter, as Android is one of the officially supported platforms I am required to use. I have noticed that many blind Android users who post to computer mailing lists and Google groups have a cavalier attitude toward braille. Last week someone posted a question to a computer list where I have a subscription, asking for advice about whether they should use iOS or Android. I responded by pointing out what I considered valid reservations about Android accessibility. People then accused me of spreading rumors and claimed that learning the Android environment wasn't that difficult.

I appreciate it that AFB supports braille by having many great articles about braille displays. I hope someone can get Google's attention and encourage them to improve the situation. It's unacceptable that Google solely relies on an open source application to do what Google should be doing in the first place.

Dan

Dear AccessWorld Editor,

This message is in response to Bill Holton's March 2018 article, New Accessibility Support Options Courtesy of Google, Microsoft, and Be My Eyes.

Well, I tried to get some relief using BrailleBack with a supported braille display for input. I did get a response quickly from Google. However, it only told me that the question was being passed on to someone else for investigation. I explained the issue fully, but a response like the one I received simply wasn't helpful.

David Allen

Dear AccessWorld Editor,

This message is in response to Deborah Gessler's letter in the March Letters to the Editor section related to the Amazon Echo. You asked about getting a second Echo, and whether or not you would be hearing what is going on in both the living room and your son's room. You can choose from several different names for the device so you are only activating the one you want. We have two: one is Alexa and the other is Echo. If you use the name of the one you want, it only wakes up that one, which then functions independently. If they have the same name, they can function together, depending on the setup. We put one in the bedroom to use as an alarm, and before we changed the name, the one in the living room, which has very big ears, was responding. We have an Echo and a Dot; the sound quality and volume are much better on the Echo if you want to use it for music. I was talking to someone recently who said you can also set it up with your BARD account from the National Library Service.

BJ LeJeune

Dear AccessWorld Editor,

I fully agree with Mike from Berkeley. His recent letter to the editor in the March 2018 issue of AccessWorld explained how all this technology is very useful but takes hours of time to learn and keep up with. I'm slipping behind, I can tell. This is partly because of a lack of training where I live. We have no Lighthouse that offers classes and support groups. I feel I'm slipping into the Dark Ages.

David in Louisiana

AccessWorld News

2018 M-Enabling Summit June 11–13, 2018

The M-Enabling Summit, the conference and showcase promoting accessible technology and environments for seniors and users of all abilities, will be held on June 11–13, 2018, at the Renaissance Arlington Capital View Hotel, in Washington, DC. It is the annual meeting place for all who create and contribute to accessible ICT products, services and consumer technologies.

With its 2018 theme of "Accessible and Assistive Technologies Innovations: New Frontiers for Independent Living," the M-Enabling Summit sets the stage for focusing on next-generation innovation and breakthrough solutions for all in the accessibility field. It also offers a platform to network with accessibility professionals, organizations, and decision makers seeking to address compliance challenges and market development opportunities.

The Summit's program will focus on accessibility innovations, with over 120 speakers, private sector leaders, developers, policy makers, mobile accessibility experts, and disability advocates sharing their knowledge and experience. Confirmed presenters include representatives of leading organizations facilitating the accessible technology market.

Dr. Jeff Jaffe, CEO of the World Wide Web Consortium (W3C), will deliver the opening morning keynote on June 12, 2018. The M-Enabling Summit will also host the FCC Chairman's Awards for Advancement in Accessibility (Chairman's AAA) on June 12th. FCC Chairman Ajit Pai will address Summit participants in his evening keynote.

Highlighted Sessions and Topics:

  • Artificial Intelligence (AI) and Robotics
  • Augmented and Virtual Reality (AR/VR)
  • Aging in Place: Connected Health and Big Data for Activity Monitoring
  • Accessible Security, Identification and Privacy Protection
  • Digital Assistants
  • Autonomous Mobility
  • Consumer Technology Products and IoT for Independent Living
  • Accessible Smart Cities, Higher Education & Workplaces

Due to the successful addition of the International Association of Accessibility Professionals (IAAP) Pre-Conference Session at last year's M-Enabling Summit, the IAAP Annual Conference has been integrated into the 2018 Summit, where it will host technical and organizational training tracks throughout the three-day event.

View the complete agenda at this link.

The M-Enabling Summit offers a discounted early bird registration rate through April 27. You can register at this link.

Bosma Enterprises Brings BlindSquare Technology to National Nonprofit in New York

The installation will help Alphapointe's employees adapt to its new 140,000 square foot facility in Queens.

Last March, Bosma Enterprises became one of the first nonprofit organizations in the nation to install BlindSquare, a mobile navigation technology designed to create a more accessible environment for people with visual impairments. After hearing about the positive impact the technology has had on Bosma's employees, Alphapointe, a fellow National Industries for the Blind (NIB) agency, decided to implement the wayfinding technology into its new facility with hopes of a similar outcome.

Headquartered in Kansas City, Missouri, Alphapointe has served people who are blind and visually impaired since 1911. Its mission is to empower people with vision loss to achieve their own goals and aspirations. Alphapointe operates in nine locations, in four states, with over 400 employees. The organization is the third largest employer of people who are blind in the US and also provides rehabilitation, education, outreach, referrals and job placement to nearly 5,000 people a year.

Due to escalating rent, Alphapointe recently decided to move its Brooklyn operations to a new, wholly owned industrial space in Queens' Richmond Hill neighborhood. The facility will house multiple lines of business including its warehouse operations, janitorial supplies production, plastics division and contact services. The move will create major savings and financial stability, sustaining the future of Alphapointe.

To make the move as easy as possible for the 230 employees who will work at the new location, Alphapointe called on Bosma for its help and expertise to install BlindSquare. Because the new 140,000 square foot location consists of 19 buildings, it represents a potentially difficult environment for people who are blind or visually impaired to adapt to. With the help of Bosma and BlindSquare, transition into the new facility will be made easier and more accessible for all employees.

Nationally, people who are blind or visually impaired face an estimated 70 percent unemployment rate. Bosma and Alphapointe hope that by showing employers that adapting the workplace can be relatively easy and inexpensive, they may create additional employment opportunities for this overlooked population in the future.

BlindSquare is a mobile app for iOS devices developed by Finland-based MIPsoft. In outdoor environments, BlindSquare uses a mobile phone's GPS in conjunction with data from third-party services like Foursquare to describe the environment and announce points of interest and street intersections. The indoor version of the program, called BlindSquare Event, interfaces with a Beacon Positioning System (BPS) consisting of iBeacons located throughout the building. These are small, low-energy Bluetooth devices that provide data on specific navigation markers, such as restrooms or meeting rooms, while guiding the user with audio cues about their surroundings.

"Being able to serve as a national model for how technology can create job opportunities and accessible workplaces for people without vision is something that Bosma takes huge pride in," said Lou Moneymaker, CEO, Bosma Enterprises. "We are excited to help another nonprofit like ourselves become a leader in advocating for technologies for those who are blind."

For both Bosma and Alphapointe, giving those who are blind the help and support they need in order to live independently is their main goal. Sharing this mission and advocating for policies that level the playing field for individuals who are blind or visually impaired makes for an impactful partnership and can serve as a model for other nonprofits.

"Having an organization like Bosma help us install this innovative technology makes us that much more confident in the installation process," said Clay Berry, director of rehabilitation and education for Alphapointe. "They get it. They're seeing success with the technology and learning what works and what doesn't, giving us high hopes for the potential success it can have in our new facility."

Canadian Vision Teachers' Conference 2018: Seeing Beyond the Horizon. Visit Nisku, Alberta, Canada, May 3–5 for the Canadian Vision Teachers' Conference.

Get more information at the conference site.

  • Hotel reservation – Special $99 room rate (see venue section of the website)
  • An array of concurrent presentations from fellow professionals in the field

Pre-conference Sessions on May 2

  • Matt Tietjen, MEd — What's the Complexity Framework? (working with students with CVI)
  • Peter Tucic and Michel Pepin (Humanware) — BrailleNote Touch: the hands-on experience
  • David Wilkinson (HIMS) — Up Close and Personal with BrailleSense Polaris

Keynote Speakers

  • Dr. Kevin Stewart of York Region District School Board in Ontario
  • Diane Brauner, creator of Paths to Literacy
  • Molly Burke, motivational speaker, YouTuber, and spokesperson … who happens to be blind
  • Banquet Speaker: Lowell & Julie Taylor, formerly from The Amazing Race Canada

An Interview with Jonathan Mosen, Popular Broadcaster, Advocate, and Teacher

You may recognize Jonathan Mosen's name from his Blind Side podcast, Mushroom FM, or his work with Freedom Scientific. Based in New Zealand, he has been a major advocate for people who are blind or visually impaired his entire adult life. Especially through his podcasts and Freedom Scientific, he has helped people who are blind learn new technologies. His accomplishments are many.

Personal Information

Mosen, 48, lives in Wellington. He is 5.35 feet tall. He remarked, "This surprises some people because they think I sound taller." He used to have ginger hair, but it is now a brownish color. He has always been totally blind with no light perception.

He has been married to Bonnie, who is also blind, since 2015. She was a regular listener to his radio show, "The Mosen Explosion," before they met. She was living in Massachusetts and he went there to do a Mosen Explosion episode. He invited her to come and cohost the show.

He said, "The rapport was great between us. After my previous marriage ended, our friendship developed into something more. She still listens to my show, I'm pleased to say, but now from upstairs. She also participates in it."

Mosen added that Bonnie is a valued part of Mosen Consulting. "She provides consultancy services on vocational related matters, and has published a popular book, It's Off to Work We Go. She also coordinates the Mosen Consulting job club, which provides advice to blind jobseekers."

Mosen has four children, Heidi, Richard, David, and Nicola. Heidi sometimes helps with the Blind Side podcast and Richard has a classic rock show on Mushroom FM. "My four children are known worldwide," he says. "Heidi also appears on my podcast, The Blind Side, from time to time. When we do iOS analysis, for example, after a WWDC event or other Apple keynote, Heidi is constantly taking screenshots of the slides, so that on The Blind Side, we can describe to blind people some of the visual elements of the Apple presentation that weren't covered on stage, including a physical description of new products."

Mosen currently uses a white cane for mobility. While working in the government relations field he did use a guide dog. When he started doing more international travel, he switched to the cane. He explained, "New Zealand has very strict quarantine laws because there are many diseases that haven't made it to our shores. So the constant leaving and re-entering New Zealand made having a guide dog difficult, plus I don't think it's terribly fair on the dog to subject it to very long flights on a regular basis. I don't do very much international travel now, so could get another dog, except I'm primarily home-office-based, so the dog wouldn't get a lot of work."

Education

Mosen attended a school for the blind from age 5 to age 11. After that, he was mainstreamed. He spoke about the importance of learning braille, saying, "I received thorough braille instruction, and without braille, I wouldn't have been able to do many of the jobs that I've held in my life."

In 1991, he received a BA degree in History and Political Studies from the University of Auckland. In 1999, he received a Master's in Public Policy from the Victoria University of Wellington.

On the Radio

Mosen began his radio career at a very young age. He explained:

Like many blind people, I was fascinated by the radio. I also loved to talk. Still do, funnily enough. My mother tells me I was talking fluently from about 18 months. So, one day when I was four, I decided to give the radio station a call during a talk show. They seemed quite pleased to hear from me, so I kept calling back.

One day not long before Christmas, my parents received a telegram from the radio station. And I appreciate there will be many younger readers who have no clue what a telegram is. The telegram asked for one of them to call the manager at the radio station. My mum and dad had quite the discussion about who would make the call, because they thought the radio station was going to ask them to ensure their precocious son ceases and desists. In the end, my dad made the call. They wanted to know if I would be allowed to come into the studio and host a show just before Christmas for children. I loved every minute of it. It was a very popular thing that ran for over a decade, until I became a cynical teenager. Then it really didn't work anymore. Fortunately, by that stage, I was exploring how to set up my own radio station, which I duly did.

While still in college, Mosen began his commercial broadcasting career at Counties Radio, which served the Counties Manukau area. He then moved to Auckland 1476 where he hosted a morning current affairs show, interviewing politicians and newsmakers. He also worked at Today FM and then Q96 FM, where he was the program director.

Advocacy

In 1994, Mosen was appointed Manager of Government Relations for the Royal New Zealand Foundation for the Blind. His accomplishments in his five years in this position included changes to the copyright act making printed material available to people with visual impairments, securing the right for blind people to serve on a jury, and saving the right to free postage for braille and talking books. Mosen explained, "In New Zealand, a decision had been made to end the postal concession. It took a full-on campaign to get that decision reversed."

He was also instrumental in the name change of the Foundation. It is now the Royal New Zealand Foundation of the Blind.

Mosen served two terms as president of the consumer group, the Association of Blind Citizens of New Zealand, from 1997 until 2001. He was the youngest person to hold that position. He said, "The thing I'm most proud of in that role is overturning the old, custodial, institutionalized governance structure of the Foundation for the Blind. Prior to the reforms in which I played a key part, the Foundation's board was appointed in the main either by volunteers or government. Now, blind people elect the board. That's as it should be. But it was a massive struggle that took several years to complete, in the face of a lot of resistance from the establishment." From 2002 to 2003 he served as the Foundation's chairman of the board.

Mosen was awarded the Association of Blind Citizens of New Zealand's Beamish Memorial medal in 2003, their most prestigious award, for his services to the blind of New Zealand.

More Radio

American Council of the Blind (ACB) Radio began on December 1, 1999 with Mosen at the helm. By the time he left, in 2003, the station had four concurrent streams and listeners in over 70 countries.

In 2001, ACB presented Mosen with the Vernon Henley award for his positive representation of blind people through the media.

In 2010, Mosen started the popular Mushroom FM internet radio station. It now has listeners in more than 115 countries. Here is a schedule of their programming.

Assistive Technology

In 2003, Mosen became the Blindness Product Manager for Pulse Data, now known as HumanWare. In 2006, he joined Freedom Scientific as Vice President for Blindness Hardware Management, remaining in that position until 2013. He still works for Freedom Scientific as Director of Blindness Communications. His duties include hosting Freedom Scientific's FSCast podcast and managing their blog.

Mosen's first piece of access technology, in the early 1980s, was an Apple IIe, running Braille Edit from Raised Dot Computing, and a VersaBraille from TSI. He also used the original Keynote device from Pulse Data International in 1986.

Today he uses a Windows computer with JAWS and an iPhone. He does have other devices including the Amazon Echo and the Apple TV. At Mosen Consulting, he has access to many other devices for accessibility testing and training.

When asked what he thought was the greatest technological invention for the blind he responded, "Without a doubt, my answer is braille. Braille is the most priceless gift blind people have ever been given in history. It is our path to literacy. And its story is a parable. Many sighted people tried to slow the adoption of braille, to stop it in its tracks before it had really begun. It just goes to show that the best people to solve the problems and challenges of blind people are blind people ourselves."

Mosen Consulting

On May 1, 2013, Mosen started his own company, Mosen Consulting. The company provides technology training. They sell a variety of books such as iOS Without the Eye and Become an Amadeus Maestro, both written by Mosen. Books by other authors are also available including My Mac Pages, by Anne and Archie Robertson. Mosen Consulting can also work with businesses to make their websites more accessible. They also produce The Blind Side podcast. Visit the website here.

When asked what made him start his own business, Mosen answered:

When I had cause to think about the future, I did a lot of soul-searching to work out what exactly it is that motivates me. What gets me bouncing out of bed every morning feeling good about life? I realized that it's making a difference for blind people. If I can end my day with a clear answer to the question, 'What difference have you made today?' then I feel a sense of fulfillment. The trouble is, I like to make a difference in a variety of ways. I love the entertainment aspect of the radio show I do, and indeed all that Mushroom FM, with its exceptional team of broadcasters, represents. I like to help make sense of technology for people who struggle with it or just have other things to think about, and want someone else to do all the research for them. I'm a passionate accessibility and disability rights advocate. I want to make the world a more accessible place for blind people. So, I decided, the best way to do that was to start my own company, and do it all. The work is never dull, and there is a wide variety. I love every second of it.

Comment on this article.

Related articles:

More by this author:

Focusing on Braille: a Review of the Focus 5th Generation Braille Displays From the VFO Group

Many braille displays have come and gone over the years. Others, such as the Focus line from the VFO Group, keep evolving to meet the changing needs of the consumer. The original line of Focus displays debuted over 15 years ago and was available in 44, 70, or 84 cells. Those displays used USB or a 9-pin serial cable, had audio in and out jacks and only three controls on the front, and were intended for desktop use. The most recent line of Focus displays supports multiple Bluetooth connections, uses USB C cables, has a clock and calendar, and supports a wider range of technology. This article examines the latest Focus braille displays, the Focus 14 and the Focus 40.

What's In The Box?

Either a Focus 14 or Focus 40 Blue, an AC adapter, a USB C cable, power plug adapters for various regions, both braille and print user manuals, a registration card, a companion CD, and a carrying case. Unlike other companies, VFO includes its user manual in braille with the Focus.

Physical Description

The first thing that stands out about the Focus 5th Generation to me is the solid construction. Unlike the previous iteration of Focus displays, the 5th Generation is made of a very solid aluminum that seems like it could withstand quite a bit of rough handling. Not only does it have a stronger housing, there are bumpers on each side of the device to help absorb the impact of a drop.

Orienting the Focus 14 so the series of buttons on the front is closest to you, you will find the following controls from left to right: Left Selector button, Left Rocker bar, Left Panning button, Left Shift button, Right Shift button, Right Panning button, Right Rocker bar, and Right Selector button. The same controls are on the Focus 40, though in a slightly different configuration. From left to right, you have the following: Left Panning button, Left Rocker bar, Left Selector button, Left Shift button, Right Shift button, Right Selector button, Right Rocker bar, and Right Panning button. This layout is the same as that seen on the previous generation of the Focus display.

Along the left side of the Focus, you will find the left bumper. Below that, almost up against the bumper, is the Micro SD card slot, with the Power button behind it and the USB C port furthest away. At this time, the Micro SD card slot does not let you do anything with a memory card, but the VFO Group has stated that a firmware upgrade in the spring will add the ability to read and write BRF content. There isn't anything on the right side of the Focus beyond the right bumper. On the surface of the display, the control closest to you is the Spacebar, which is more pronounced than it was on the previous generation. Behind the Spacebar are 14 or 40 cells of braille, with a corresponding Cursor Routing button behind each cell. Located at the left and right ends of the display are the Nav Rocker bars, with a Nav Mode button behind each. Continuing to explore the surface, you have an eight-dot Perkins-style keyboard, with a Menu button located between dots 1 and 4. The Menu button is the only new control on the surface. All other controls discussed on the front and surface have different functionality depending on the screen reader in use. The newest generation is slightly longer than the previous one, mainly because of the bumpers on either side.

The Case

The case for the 5th Generation Focus displays is very well designed. With the fourth generation, the 14-cell display was a challenge to get into its case; with the 5th generation, this is no longer an issue. The case that came with the 4th Generation Focus 40 also did not permit access to the controls on the device while it was being carried. The latest line of Focus displays comes with cases that are very similar to those made by Executive Products. There is also a zipper pocket on the top of the case, where I was able to store my iPhone 8. The one downside to the design is that it can be somewhat of a challenge to plug in the USB C cable while the Focus is in the case, because the leather partially covers the port.

Some Things Don't Change

Study Mode, the way in which Status Cells are handled, Diagnostics, and Battery Info Mode are the same as they were in the previous generation of Focus displays. All have a purpose, and those purposes continue to be met equally in the 5th Generation when compared to older models.

New Features and Menu Options

When first powering on the Focus 5th Generation, you will see one of the new features: a clock. On the Focus 40, you will also see the status of the battery and the active connection, if there is one. On the 14, you will see the time, but must pan to the right to see this other information. Once the status line is displayed, you can then go into the Configuration menu by pressing the Menu button a second time. Once you have entered the Configuration menu, use either Rocker bar to move through the list of options and select an option by pressing Enter. The options include: Repeat, Rest, Clock, Calendar, Firmness, Connections, and Language. If you are in an active connection, pressing the Menu button twice will bring you to the Configuration menu. To exit the Configuration menu, press the Left Selector button.

The Repeat and Rest options were available in the previous generation of Focus displays, but the rest of the options are new. The Repeat option controls how quickly the Nav Bars will scroll through elements on a screen, and the Rest option controls the duration of time of inactivity that must occur before the Focus goes to sleep. Each of the new functions will be discussed briefly below. For more details on each feature, please see the manual.

Setting the time and date is explained clearly in the manual. If you are looking for a full application in either the Clock or the Calendar, you will not find one. The purpose of the Clock is to let you quickly check the time. You may also need to readjust it from time to time, as I noticed the clock very slowly falls behind the actual time. You can also quickly check the date by pressing the Menu button with the letter D. There are no options to set alarms, schedule appointments, and so on.

Firmness on the previous line of Focus displays was controlled only by the screen reader; if the screen reader did not offer a dot-firmness option, the setting was simply left wherever it had been. JAWS still allows the user to customize dot firmness, but firmness is now also something the user can set within the Configuration menu of the Focus. Again, for specifics on how to configure this setting, please consult the manual.

Connections is another new option which allows the user to delete and configure connections to other devices. The 5th Generation of Focus displays allows up to 5 Bluetooth connections and one USB. This option will let the user choose any already connected devices or give the user the ability to delete connections. The Language option allows the user to put the menu in one of nine different languages.

Connecting to Devices

Addressing each screen reader and its use in depth is beyond the scope of an AccessWorld article, but I tried the Focus with every option I had access to. The Focus 5th Generation is compatible with JAWS on Windows, NVDA, BrailleBack on Android, iOS (version 11 or later), and macOS. It is not currently compatible with Fire OS or iOS versions before 11. Once more than one connection is established, the user can jump between the connected devices by holding down the Menu button and pressing dot 1, 2, 3, 4, or 5 to jump to the corresponding Bluetooth channel; Menu with dot 8 puts the user on the USB connection. It's very convenient to be able to switch among the connected devices, though it would be helpful if, when switching to a channel that has a paired device on it, the Focus could specify which device has been assigned to that channel. If the device connected on a particular channel is not active, you will only see the Bluetooth name, and there is no confirmation that you have switched channels.
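
The channel-switching scheme described above is easy to model. The following is a purely illustrative sketch: the dot assignments come from this article, but the dictionary and function names are my own invention, not any official VFO interface.

```python
# Hypothetical model of the Focus 5th Generation's connection switching.
# Menu + dots 1-5 select Bluetooth channels 1-5; Menu + dot 8 selects USB.
CHANNEL_FOR_DOT = {
    1: "Bluetooth 1", 2: "Bluetooth 2", 3: "Bluetooth 3",
    4: "Bluetooth 4", 5: "Bluetooth 5", 8: "USB",
}

def switch_connection(dot: int) -> str:
    """Return the connection selected by holding Menu and pressing a dot key."""
    if dot not in CHANNEL_FOR_DOT:
        raise ValueError(f"Menu + dot {dot} is not a connection command")
    return CHANNEL_FOR_DOT[dot]
```

Notice that a mapping like this only tells you which channel you landed on, not which paired device lives there, which is exactly the gap in feedback noted in my testing.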

JAWS

Like all previous models of the Focus, the 5th Generation is compatible with JAWS for Windows. Setting up with JAWS 13 and later is simple: plug the Focus in, let the Windows drivers install, and then restart JAWS. You should then begin seeing braille output.

The Focus works extremely well with JAWS, whether you wish to use the braille keyboard or a QWERTY keyboard as your input method. There is an option to lock the braille keys if you plan to use another input method and do not want to worry about bumping the braille keyboard. However, learning the various keyboard combinations can eliminate the need for a QWERTY keyboard altogether. Commands on the braille keyboard allow the user to simulate many different QWERTY key presses. For example, pressing the right Shift key with dot 4 simulates the Windows key, while the right Shift key pressed with the Spacebar and dot 2 simulates the Applications key. There are commands for just about every key on the QWERTY keyboard, and it is also possible to perform modifier key combinations, such as dot 4 with dot 8 with the Spacebar (simulating the Windows key) followed immediately by the letter D to get to your desktop. Other commands cover Microsoft Word, the Navigation Quick Keys, text selection, several JAWS-specific functions, and much more. To see a complete list of all of the options available, please see the Braille Display Input Commands guide on the Freedom Scientific website.
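
As a rough illustration of how chorded input works, the two examples above (right Shift with dot 4 for the Windows key, and right Shift with Spacebar and dot 2 for the Applications key) can be modeled as a lookup from a set of held keys to a simulated QWERTY key. This is my own sketch, not Freedom Scientific's implementation.

```python
# Illustrative chord-to-key table using the two examples from the article.
# A chord is the set of keys held down together, so order doesn't matter;
# frozenset captures that.
CHORD_TO_KEY = {
    frozenset({"right shift", "dot 4"}): "Windows",
    frozenset({"right shift", "space", "dot 2"}): "Applications",
}

def simulate_key(*held_keys: str) -> str:
    """Return the QWERTY key a braille chord simulates, if any."""
    return CHORD_TO_KEY.get(frozenset(held_keys), "unassigned")
```

The real JAWS command set extends this idea to nearly the whole QWERTY keyboard.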

Overall, for my personal use, I find the experience of using JAWS with the Focus displays to be a very smooth and efficient process. With a wealth of keyboard command options and the ability to type very quickly in contracted braille accurately, JAWS and the Focus combine for a wonderful braille experience. The VFO group has clearly explained each of the commands and has provided a comprehensive list of all the keyboard commands available.

NVDA

The process of connecting to NVDA with the Focus requires an additional step. Once you establish a Bluetooth or USB connection with your PC, you must then install the appropriate drivers from the companion CD or download and install the drivers from the Freedom Scientific website. After you have followed the prompts to install the display, you will then need to go into NVDA preferences and select the Focus display. This is also where you set your preferences such as the braille tables, cursor tethering options, and so on. NVDA 2018.1 was used to conduct this evaluation. When typing rapidly in contracted braille, I found that letters were missed. When I rapidly typed this sentence, "Typing in contracted braille is more fun with JAWS," what resulted was: "Typing in like so more fun JAWS." Further development is required to improve rapid contracted braille input support between NVDA and the Focus, though this seems true regardless of the display used. The translator also does not take context into account. For example, if you type a word that you decide to make plural, when you go back and add an s, NVDA will interpret that as the word "so" instead of adding the letter s. I did notice that braille input is more responsive when compared to NVDA 2017.4, and also that contracted braille input no longer freezes occasionally. It is my hope that improvements will continue as NVDA braille input matures.
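
The context problem described above is easy to demonstrate. In contracted (Grade 2) braille, a single letter standing alone doubles as a wordsign, so "s" on its own means "so" and "l" means "like." A back-translator that looks at each cell in isolation, rather than at the word being edited, reproduces exactly the behavior I observed. The sketch below is deliberately naive and is mine, not NVDA's actual code.

```python
# A few real Grade 2 alphabetic wordsigns: a letter standing alone
# represents a whole word.
WORDSIGNS = {"s": "so", "l": "like", "b": "but"}

def naive_backtranslate(cell: str, *, standalone: bool) -> str:
    """Translate one braille cell without considering the surrounding word."""
    if standalone:
        return WORDSIGNS.get(cell, cell)
    return cell
```

Appending "s" to an existing word should take the non-standalone path, but a translator that has lost track of context treats the cell as standalone and produces "so."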

There are a few options for simulating key presses, but the list is much smaller than what is available for JAWS. You can press the Spacebar with W to activate the Windows key, but there is no way to, for example, press the Windows key followed by the letter R for the Run dialog. This is also true of the Alt key. While JAWS can use the Quick Navigation keys on websites, these do not function in braille with NVDA. There are other options for moving around documents, and it is possible to simulate Arrow key presses. For more detailed information, consult section 14.1 of the NVDA User Guide under the Help menu, or read it online here.

iOS

The Focus 5th Generation displays are only compatible with iOS 11 and later. Connect using the conventional method in Braille Settings. You must enter the pin code 0000 to pair. iOS 11.2.6, which was the latest public release at the time of writing, was used to conduct this evaluation.

iOS makes use of the front controls on the Focus and also allows the user to program their own functions for the different controls. When writing longer documents, or when replying to a large thread of emails, I found that the cursor sometimes behaved erratically. This is not an issue specific to the Focus, but one that is happening with all braille displays when connected to an iOS device.

As the unit I use left the VFO Group before the latest firmware update, I found that chorded commands were a challenge to use on iOS. Pressing the Spacebar in conjunction with a letter would often result in the command not being recognized. If this is happening to you, you should install the latest firmware. As long as you have a computer that can run Windows, you will be able to carry out the firmware update. To verify that your firmware is out of date, consult the above webpage.

If you don't have access to a Windows computer, or if you are uncomfortable performing the firmware update, you can do what I did before I knew of the update. I programmed a bunch of the commands I commonly use to work with dot 7 instead of the spacebar. This means that if you were to ever pick up my iPhone with the Focus connected, dot 7 with H would take you to the Home screen, for example. It would also mean that you are nosey. Other than these points, the Focus works the same way in which most other braille displays do when connected to iOS.

Conclusion

The latest line of Focus displays from the VFO Group demonstrates that the company continues to respond to user feedback. My complaints about the 4th Generation were that it broke very easily when handled roughly and that the case was not adequate for portable use. Both of these issues have been addressed in the latest model with its more rugged design and high-quality case. Still sporting a battery life of about 20 hours on Bluetooth and the same key mapping and control configuration, the Focus 5th Generation combines the good points of the previous line of Focus displays with a more modern look and feel.

Product Information

Name: Focus 14 Blue - 5th Generation
Company: VFO Group
Price: $1295

Name: Focus 40 Blue - 5th generation
Company: VFO Group
Price: $2995

Comment on this article.

Related articles:

More by this author:

Accessible Dictation with Open Source DictationBridge

If you're old enough, you may have used a typewriter in your life. If you're ancient, like me, you may even have learned to type using a manual Royal or Smith Corona. If so, you may still be banging away on your computer keyboard, pounding those keys a lot harder than necessary—so hard that the constant beating has taken its toll and you've developed a repetitive stress injury, otherwise known as an RSI.

Using a screen reader can also contribute. "Everything we do on the computer is keyboard based, and some of those three- and four-key screen reader commands can involve some genuine finger gymnastics," says Lucia Greco, an Assistive Technology Specialist at the University of California, Berkeley.

Even an iPhone or other touch screen device can be harder on the hands of people with visual impairments than those with sight. "To access a button a sighted person needs merely to give it a single tap, but for a VoiceOver or TalkBack user, tapping that same button can require tapping the screen to begin, then performing a number of swipe gestures to navigate to the button, and then double tapping," notes Greco.

Greco herself was diagnosed with a serious RSI several years ago. So was her friend, Pranav Lal, a Cyber Security Specialist in New Delhi, India.

"We've both begun using dictation as much as possible," says Greco. Their phones come with built-in dictation. Their Windows PCs also offer dictation via the Ease of Access Windows Speech Recognition feature, but there are issues. "[It doesn't have a] built-in feedback mechanism to hear what you've just dictated using your screen reader," says Lal. "The corrections dialogue boxes are also inaccessible out of the box."

Both Greco and Lal began using dictation with Dragon Naturally Speaking paired with J-Say, a set of JAWS scripts produced by Hartgen Consultancy that offers screen reader feedback and other Dragon accessibility enhancements.

But the package is expensive, approximately $700, along with the additional costs of software maintenance agreements. Greco's employer covered the cost, but Lal wound up paying for the software himself.

They were lucky. "There are so many people who could truly benefit from dictation on their PCs but who can't afford to make it accessible," says Greco.

Lal has been using the free NVDA screen reader "ever since they began supporting track changes." Greco also says she finds herself relying on the open source screen reader more and more instead of JAWS.

Both have become vocal advocates for the open source software model, which is the model used by the NVDA screen reader. "There are so many people around the world who can't afford a thousand dollars for a screen reader. Even those of us who can should consider using and donating to the project to help blind people around the world to participate on a level playing field."

The two decided to work toward an accessible bridge between dictation software and the NVDA screen reader. Lal started with an open source Python scripting environment for NaturallySpeaking, but this solution was extremely cumbersome, especially after Python was updated to a new version. That's when Matt Campbell joined the team and collaborated with Lal to write a separate add-on using the C programming language.

About a year ago, the team released their first public beta of DictationBridge for NVDA. Users began submitting bug reports and feature requests.

"The most frequent request we received was for us to make DictationBridge compatible with other screen readers, which, at the time, included JAWS and Window-Eyes," says Greco.

That was going to take funding. "We had to bring more developers onboard, and respect their professionalism and at least offer them a token payment," Greco says.

The group raised $20,000 in an Indiegogo campaign, and continued their work. By then Window-Eyes had been absorbed by VFO, so they focused their attention on improving NVDA performance and making DictationBridge work with JAWS.

"The going was slow," recalls Greco. "The developers were part time—they had their regular jobs—and if Pranav, in India, say, had a question for Matt or another developer here in the US on Tuesday, he had to consider the time difference, along with the fact the US developer might not log onto the GitHub system until the weekend.

Recently, DictationBridge Version 1.0 was released, with different versions configured for NVDA and JAWS. They've also taken advantage of some significant improvements in Windows 10 dictation, so that either screen reader can work with either dictation platform.

I tried DictationBridge with all four possible configurations. Here are some initial impressions.

Getting Started with DictationBridge

There are separate downloadable versions of DictationBridge available for use with NVDA and JAWS. Each offers two choices for speech recognition. Windows 10 comes with Windows Speech Recognition (WSR) built in, so it's free. Dragon Naturally Speaking from Nuance Communications retails for about $250. Versions 14 and 15 have been tested. Version 15 does not require voice training.

DictationBridge is a 1.0 release, and as such, it's no surprise it can be, shall we say, a bit less than user friendly to install. Novice users will likely need help from an experienced computer user, and even experienced users would be wise to review the online guide before attempting the installation. (Note: A downloadable version of the documentation is offered, but at the time of this writing the link gave a "can't reach this page" error.)

The first step toward accessible text and command dictation involves setting up your dictation engine of choice. Needless to say, you will need a working microphone for this, the higher quality the better. For Dragon, follow the installation process just as you would for any other software install. For Windows 10 WSR, press the Windows key, and type "recognition." That will offer the desktop app, Windows Speech Recognition. Open this app and enable dictation. Do not train the app with your voice at this point beyond the initial setup sentence, however. I will explain why shortly.

This is where things get a bit difficult for both screen readers. As an example, here are the additional steps you must take to get DictationBridge running with NVDA:

Before you can use DictationBridge for NVDA with WSR to execute screen reader commands, you first need to download and install Windows Speech Recognition Macros. The software is supposed to create a similarly named folder in your Documents folder. This folder was never created for me, even after I had uninstalled and reinstalled the software three times. I eventually went ahead and created the folder myself.

Now, locate and run the WSR Macros Utility, a step not mentioned in the documentation. This utility links a set of macros to the WSR engine, enabling screen reader commands via dictation and other features.

Next, open the NVDA menu and select Tools. Select "Install commands for DictationBridge," and, next, "Install commands for Windows Speech Recognition."

Now, locate and open the Windows Speech Recognition Macros Utility in your system tray. Tab once and find the new NVDA macros, then Tab again to access and confirm the "sign macro" option, because WSR only works with signed macros to prevent malware or virus attacks.

Restart NVDA. Now you are ready to dictate.

The three other installations—WSR with JAWS, Dragon with NVDA, and Dragon with JAWS—are equally complex. The documentation leaves much to be desired, but I feel certain this will change in time.

Speech to Text and Back Again!

With everything up and running, saying "Start listening" causes WSR to do just that, while Dragon uses "Wake up." "Stop listening" seemed to work for both.

If you are using WSR, this is an excellent time to return and do some voice training. Again, start Windows Speech Recognition and access the "Improve speech recognition" option. DictationBridge does an excellent job of voicing the practice sentences, and if you need to hear one again, simply press the grave accent (`) key and your screen reader will repeat it. Here, I cannot help but feel that Microsoft is lagging far behind other dictation engines. Both Apple and Google offer much higher quality dictation without training, although you do need a data connection for the full experience.

DictationBridge doesn't just echo your dictation. It also enables Windows commands, such as "Open Notepad," and screen reader commands (at the time of this writing for NVDA only), such as "Say all," and "Previous line." I am told JAWS commands will be coming soon.

DictationBridge also makes the correction boxes accessible to make word and spelling corrections. For me, this worked much better in NVDA using Dragon than any of the other options. Here I need to remind you that this is a 1.0 release, and that in my opinion it should have more appropriately been designated a .75 beta release. The app can be rather buggy, in my experience, and it's not necessarily the fault of the developers.

Greco doesn't excuse DictationBridge's shortfalls, but she does explain them. "We did a lot of outreach to get users to test the beta and join in on the discussion on our beta list," she says. "Unfortunately, we didn't get the involvement we were hoping for. In fact, just prior to the 1.0 release there were only two JAWS users participating, and when they stopped sending in bug reports there wasn't much we could do to test the app with different hardware and software configurations."

That said, developers are currently hard at work resolving issues. Many of the bugs I experienced may be fixed by the time you read this, so I won't describe them in great detail. Most of them involved extra spoken characters, such as the four repeats of the word "delete" that began too many sentences but did not affect the text. It was also difficult to correct a sentence where there were multiple words grouped into shorter or longer strings, such as when "Let's dictate some text," was recognized as "Let's start a syntax."

Community Involvement

Raising $20,000 to fund an open source project is no mean feat. Apparently, however, getting people who might benefit from the project to spend a little time helping out is.

In my opinion, the DictationBridge project is of great potential value to thousands of people with visual impairments around the world. Do you currently have an RSI, or feel the beginnings of one coming on? Especially if you worry you might one day have trouble keying in those obscure screen reader commands, you should probably find a way to get involved. You can start by subscribing to the DictationBridge news list or following them on Twitter @dictationbridge.

Comment on this article.

Related articles:

More by this author:

A Review of "Stress Less, Browse Happy: Your Guide to More Easily and Effectively Navigating the Internet with a Screen Reader," an audio tutorial from Mystic Access

Readers of AccessWorld who have used computers for many years will remember the days before the internet existed. Work was done from within programs such as WordPerfect for writing and editing documents, or a money management program. At the end of the day, the computer was shut down and the process started all over again the next morning. When the average computer user actually began to spend time online, activities consisted of chatting and checking email. Searching online for information yielded just a handful of results that took mere minutes to read.

Today, the experience is quite different. Computers are almost always connected to the Internet, and it can sometimes be difficult to determine whether an application you are using is working online or offline. Browsing the Web has become an integral part of every computing experience, and the internet is more like a large continent than a small village. There are lots of places to visit, and a lot to see while you're traveling.

Finding one's way around on the Web can be especially daunting for someone who uses a screen reader. A sighted person is able to glance at a computer screen and determine what parts of a webpage are of importance to them. The blind computer user must take much more time to explore the available content, and to develop strategies for returning to places of particular interest. More blind people are using computers than ever before, and a live trainer is often not available to teach users about the complexities of using a computer and assistive technology.

Fortunately, many excellent resources have become available over the past several years, including audio tutorials that allow a blind person to learn key concepts at their own pace, and to repeat the learning experience as many times as needed. One company that has earned a place of respect in the area of high-quality audio tutorials is Mystic Access. Anyone who has used tutorials produced by this company will most certainly recognize the voice of Kim Loftis, Director of Product Development. Loftis's warm, engaging style coupled with her thorough grasp of the subject matter in question make for tutorials that are easy to follow and stand the test of repeated listens. Topics covered include learning to use the Victor Reader Stream and BrailleNote Touch, both from Humanware, and comprehensive discussions of the Amazon Echo and Google Home products.

While most Mystic Access audio tutorials cover a single product or suite of products, the company's recent tutorial on browsing the internet takes a different approach. "Stress Less, Browse Happy: Your Guide to More Easily and Effectively Navigating the Internet with a Screen Reader," available in both DAISY and MP3 files for $24.97, provides a plethora of common-sense strategies for navigating the internet regardless of which screen reader or browser you're using.

Coming in at just under three hours, the tutorial is easy to listen through in a single day. Loftis suggests that the student first listen through the tutorial without following along at the computer, a suggestion rarely made in tutorials. It doesn't take long to understand why she makes it, however. The "Stress Less" tutorial is more of a friendly conversation than a classroom lesson. Loftis shares tips and tricks that work for her as a blind person when it comes to surfing the internet, and gives frequent demonstrations of what to expect when surfing a variety of websites including Amazon, and, not surprisingly, the Mystic Access site itself.

Mindset Matters

The "Stress Less" tutorial is divided into three parts. In the first part, Loftis tackles the importance of keeping a positive mindset when learning to browse the Web effectively as a blind person. She compares the internet to a city, and talks about the various ways a traveler might approach visiting a new location. If one has time, leisurely sightseeing can be of great benefit and enjoyment. At other times, focusing on key locations of particular interest is the most advantageous strategy.

Loftis takes the time to address a couple of points that are perhaps not stated as often as they should be. She points out that, while visiting a website for the first time can be much slower for a blind person than for a sighted person, familiarity with a given site will eventually make it possible to move around very quickly, perhaps even allowing a blind person to navigate a webpage faster than their sighted counterparts. This is because screen readers take advantage of keystrokes that allow the user to move to various elements of a page such as headings, form fields, and links very rapidly. Loftis also talks about the importance of keeping the voice rate of one's screen reader at a comfortable level. While one person may be able to listen at a high rate of speed, others will need to slow their screen reader down considerably in order to feel comfortable and to process the content most effectively.

Navigating: No Problem

The real meat of the "Stress Less" tutorial is found in Part 2, where discussion turns to navigating webpages. Loftis does not give any recommendations for which browser to use, and, in fact, does not even tell the student which browser she's using. Through the course of the tutorial, it becomes apparent that all demonstrations are done using the NVDA screen reader, although commands are given for the NVDA and JAWS screen readers for Windows, and VoiceOver on the Mac. A brief discussion of webpage navigation on mobile devices is saved for Part 3 of the tutorial, with no actual demonstrations provided.

Loftis takes the student through the simplest of navigation techniques including arrowing up and down a page, moving to links on a webpage, finding headings, lists, form fields, and much more. She does not cover topics such as setting place markers on a webpage, or other techniques that are unique to certain screen readers. All of her tips will be beneficial to everyone, regardless of the operating system, browser, or screen reader being used.

Loftis also discusses more advanced topics, such as moving to various landmarks or regions on a webpage—something that is possible using all screen readers. She provides examples of webpages whose content changes dynamically, so that a new page is not loaded each time a link is activated. She talks about sites that require you to take specific actions such as clicking an "OK" button before you can move anywhere else on the page. These are just a few examples of advanced topics covered in the tutorial.

Putting It All Together

Through all of her discussion, Loftis takes care to remind the student there is no single correct way to navigate a website. There are many tools available, and the user must decide what works best for them. This point is important enough that it is restated at the beginning of Part 3 of the tutorial. The third part of the tutorial is also where a discussion of mobile browsing resides.

The Bottom Line

Learning to use a computer can be daunting for many people, and those with a visual impairment can find their frustrations compounded many times over. For anyone who learns best when they are talked through a task, "Stress Less, Browse Happy: Your Guide to More Easily and Effectively Navigating the Internet with a Screen Reader" may be just the tutorial you are looking for. Loftis presents information in a clear, concise manner and gives plenty of examples throughout.

Product Information

"Stress Less, Browse Happy: Your Guide to More Easily and Effectively Navigating the Internet with a Screen Reader" is available from Mystic Access for $24.97, and comes in both DAISY and MP3 formats for one price.

Comment on this article.

Related articles:

More by this author:

CSUN 2018 Heralds The Year of Wearables—Unless It Doesn't

When reporting on an event like the annual CSUN Assistive Technology Conference, it's tempting to try to sum it up with a single narrative. This is my third year covering the trade show portion of CSUN. In addition to asking questions for AccessWorld, I was part of the Blind Bargains podcast team. You'll find links to some of our interviews in this article.

A couple of years ago, Jamie Pauls wrote that CSUN and other accessibility-focused trade shows inaugurated the year of braille. This year, a number of the products announced in 2016 are available to purchase, some have even been updated, and a few still aren't available at all. Most of these items exemplify important improvements in function, price, or both.

This year, a different group of devices dominated the news. And despite my own desire to be a little contrarian, it seems appropriate to declare another named milestone: 2018 is the year of wearables. It's fair to say that moniker rolls off the tongue a bit more easily than "the year of head-worn cameras," though the latter is a bit more accurate, since watches, with the exception of the Dot braille smartwatch, weren't as present this year as they have been in the past. And in a few cases, wearables are more than expensive pieces of hardware: they come with services intended to provide live guidance to the wearer.

Glasses that magnify or enhance the environment for people with low vision aren't new. They range from optical telescopes in glasses frames all the way to head-worn cameras connected to a computer. Mobile operating systems have expanded that category, both in terms of helping people with low vision see better, and offering remote guidance for those with blindness and low vision. Another new technology that has found its way to low-vision eyewear is the virtual-reality headset, employed along with Android-based computers to give wearables more horsepower and better engagement with the camera.

Android Turns to Low Vision

Before I describe some new and updated wearables and services, it's worth pointing out that what makes many, but not all, of these devices work is the availability of an adaptable mobile operating system. Android can power your smartphone, or it can power a low-vision wearable. Sometimes, it's doing both at once. As usual, Google had a significant presence at CSUN, touting the accessibility of all its software, including Android. The company provided a preview of accessibility features in Android 9, also known as Android P. Several changes benefit low-vision users. There's an improved accessibility shortcut that adds the ability to turn color-related features on and off. With the updated Select to Speak feature, you can take a picture of text with your Android device's camera, and have the text read to you. It works whether you're using the TalkBack screen reader or not, and even when you don't have an internet connection.

When asked whether BrailleBack would be updated, Google representatives said they did not have any announcements to make, but that those interested should "stay tuned." The answer didn't go over well with the audience at the Android session.

Aira Covers the Field

Aira isn't a new product. In fact, the eyewear and service combo has been available for almost a year. CSUN marked the debut of new Aira glasses, as well as access to the service for non-customers attending CSUN. Here's how Aira works. When you sign up for the service you receive a pair of glasses that contain a camera, along with a MiFi-equipped computer that facilitates communication between you and a human Aira agent. Using the camera image, the agent provides guidance with tasks as simple as reading a recipe, or as complex as navigating in an unfamiliar city. Explorers (the term Aira uses for its users) pay for the service by the minute, starting at $89 a month for 100 minutes. The glasses are provided at no additional cost.
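Aira's effective per-minute rate follows directly from the plan described above. This small sketch uses only the $89-per-month, 100-minute figure mentioned in this article; the helper name is mine, and other plan tiers are not covered.

```python
# Effective per-minute cost of Aira's entry plan, as described in the
# article: $89 per month for 100 minutes of agent time.
def cost_per_minute(monthly_fee, included_minutes):
    return monthly_fee / included_minutes

print(f"${cost_per_minute(89, 100):.2f} per minute")  # $0.89 per minute
```

In other words, an Explorer on the entry plan pays just under a dollar for each minute of live agent guidance.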

Until now, Aira used Google Glass devices. The new Horizon glasses, which were built by Aira, debuted at CSUN. The glasses have a wider field of view in all directions, so users don't need to turn their head as often to show an agent their surroundings. The camera has been enhanced, too. The company claims that Explorers should be able to capture images in real time more often, rather than having to send a photo to an agent to have text read. Battery life for the Horizon units is up to seven hours. Instead of a MiFi device for cellular communication, the Horizon model uses a Samsung J7 smartphone as a controller. Its interface is just four buttons. The "smartness" in the Aira controller is a custom artificial intelligence assistant, called Chloe. You summon Chloe as you would any voice assistant, and it can read and speak text.

CSUN attendees who aren't Aira customers had the chance to try the service, using a smartphone instead of glasses. After a short on-boarding process, agents guided attendees around the Manchester Grand Hyatt, where the conference was held, through the exhibit hall, or elsewhere in San Diego. Users I talked to said they enjoyed the experience.

Virtual Reality Glasses are a Reality for Low Vision

If you've never seen one, a virtual reality (VR) headset resembles a pair of ski goggles. The large plastic unit fits around your head, has a single piece of glass or clear plastic on the front, and is secured by a strap that goes around the back of your head. Two CSUN vendors, IrisVision and Patriot Vision, showed off low-vision eyewear based on these headsets. Don't be confused by the VR label. These devices use Samsung Gear VR headsets, but they're designed to help you see more of the actual world, not a virtual one.

Each product uses an Android phone to project images onto the inside of the headset. You see what the phone's camera sees, with options to magnify your view, change the color scheme, or look at images from the phone screen, rather than the camera's view. Controls are found on the side of the headset, so changing settings is a matter of swiping, or pressing buttons.

Both vendors say their devices are well-suited to users with macular degeneration as they compensate for central vision loss. Vendors say users with glaucoma, retinitis pigmentosa, and other conditions can also use the devices. Just like a typical video magnifier, you can use a VR headset to magnify images, change the color scheme, and scan text with OCR.

Because headset-based wearables fill your entire view with the camera image, rather than confining your view to a tiny screen in one lens, the experience of wearing them is much more like watching a movie than looking through a pair of glasses. This also accounts for the variety of vision conditions for which they are appropriate. They're not exactly fashion-forward, and many users may find their bulk makes it difficult to wear the device for long periods of time, but it's also likely that many will use these headsets in a task-oriented way, rather than as a replacement for glasses.

The Patriot ViewPoint is Patriot Vision's first wearable product, though the company offers a line of tablets. It costs $2,995. Learn more in the Blind Bargains Patriot Vision audio interview.

Updated OrCam

Last year, AccessWorld reviewed the OrCam MyReader and MyEye. Unlike other devices in this roundup, OrCam isn't intended to improve vision. It identifies text, objects, and people, and can be used by people with blindness or low vision. At CSUN, the company showed off the MyEye 2.0, a more compact camera that eliminates the cable and external battery found in earlier versions of the product. OrCam MyEye 2.0 is $4,500, including a personal training session for new users. If you have an older OrCam unit, you can upgrade to 2.0 for $3,000. Learn more in the Blind Bargains OrCam audio interview.

NuEyes: Lower Profile, Higher Price

NuEyes isn't a headset. It's a small camera and Android-based computer, mounted in a standard pair of glasses. The company says the glasses work best for those with retinitis pigmentosa or Stargardt disease, and who have at least 20/500 vision. There's a 30-degree field of view, making NuEyes more effective than other devices for people with peripheral vision loss. The unit provides up to 12× magnification, OCR and bar code scanning, and can be controlled by voice. The company says Android allows NuEyes to add more apps as they're developed. NuEyes costs $5,995. Learn more in the Blind Bargains NuEyes audio interview.

QD Laser Smart Glasses

QD Laser is a Japanese company that showed a prototype pair of smart glasses that project images directly onto the wearer's retina. The glasses are intended for those whose eye conditions affect the anterior part of the eye, but whose retinas are either undamaged or have some function. The company says the current version offers two hours of battery life, with improvements expected in an upcoming version. The company says the glasses will be available in Japan later this year, and in 2019 or 2020 in the U.S. The unit is currently expected to cost $5,000 or more. Learn more in the Blind Bargains QD Laser audio interview.

Summing Up the Newness

The range of wearables continues to grow, and it's unclear which ones will be useful to people with which kinds of vision loss. But one thing is clear: the rate at which technology made for a mainstream audience is being applied to devices aimed at those with vision loss is accelerating.

Comment on this article.

Related articles:

More from this author:

What's New in Braille and Beyond: A Report from the CSUN Assistive Technology Conference

For the ninth and last time in San Diego, a crowd of thousands descended upon the largest technology conference for people with disabilities. The CSUN Assistive Technology Conference moves to Anaheim in 2019, but more about that later.

Shelly Brisbin was one of many to call 2016 the year of braille in her CSUN wrap-up in the April 2016 issue of AccessWorld. While I certainly agreed with that sentiment, 2018 appears to be the year that many of those promised braille devices will actually be available for sale. From high-tech to low-tech, many devices and technologies caught our attention at this year's conference.

For the sixth consecutive year, AFB sponsored exhibit hall interview coverage on Blind Bargains. This year's coverage included over 30 podcasts from companies large and small. Visit the CSUN 2018 audio page on Blind Bargains to hear these podcasts or read text transcripts.

The Canute Grows Up into a Full-page Braille Machine

One of Shelly's mentions in that 2016 article was the Canute from Bristol Braille Technology, a multiline braille display that has been called a "Kindle for the blind." Two years and five prototypes later, the display has grown to 9 lines with 40 cells each, or 360 cells total. Think of the Canute as a braille embosser without paper. Canute prints braille on its display line by line from top to bottom, making it easy to read a page as the text is being produced. I was shown several demos of the display's versatility in presenting multiline information that a typical braille display renders poorly, such as a calendar, sports statistics in a table, braille music, and math problems.

Canute accepts memory cards and may also be controllable from a computer in the future. It's not ideal for quickly reading through menus and screens, but works great for reading books, articles, or other long-form material.

The price is the other amazing feature of the Canute, with expected retail of just a few dollars per cell. Hear a demo with Ed Rogers, Founder and Managing Director of Bristol Braille Technology and Dave Williams, the Chair of the nonprofit group the Braillists, in this Blind Bargains podcast.
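To put "a few dollars per cell" in perspective, here is a back-of-the-envelope estimate. The 9-line, 40-cell geometry comes from this article; the $2–$4 per-cell range is my own illustrative assumption, not a quoted price.

```python
# Rough price estimate for the Canute's 9-line, 40-cell display,
# assuming a hypothetical $2-$4 per braille cell ("a few dollars per cell").
lines, cells_per_line = 9, 40
total_cells = lines * cells_per_line          # 360 cells
low, high = 2 * total_cells, 4 * total_cells  # $720 to $1,440
print(f"{total_cells} cells -> ${low}-${high} at $2-$4 per cell")
```

Even at the high end of that assumed range, the Canute would cost a fraction of what a single-line 40-cell display typically sells for today.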

More Low-Cost Braille Displays

Many of our readers are likely familiar with the Orbit Reader 20, the low-cost 20-cell braille display from APH and Orbit Research. While APH hopes to return the Orbit to retail by the summer, another display from Innovision has snuck onto the scene.

The BrailleMe

The BrailleMe is a 20-cell display that will be sold in the same under-$500 price range as the Orbit Reader 20. The BrailleMe includes cursor routing keys that can be used to move the cursor to a cell while editing, a feature the Orbit lacks. It also uses 6-dot braille, making it potentially more difficult to follow a cursor or enter special characters. (The Orbit Reader 20 features traditional 8-dot braille.) The BrailleMe works with the latest versions of iOS, Android, VoiceOver on the Mac, and NVDA for Windows. It also includes a translator to convert loaded files on the fly.
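The 6-dot versus 8-dot tradeoff mentioned above comes down to how many distinct patterns a cell can show: 8-dot displays typically reserve dots 7 and 8 for cursor marking and capitalization, which a 6-dot cell cannot render. The sketch below is my own illustration, not from the article; the `cell` helper follows the Unicode braille block (U+2800), where bits 0–5 are dots 1–6 and bits 6–7 are dots 7 and 8.

```python
# Dot count determines how many distinct cell patterns a display can show.
print(2 ** 6)   # 64 patterns with 6 dots
print(2 ** 8)   # 256 patterns with 8 dots

def cell(dots):
    """Return the Unicode braille character for a set of dot numbers (1-8)."""
    return chr(0x2800 + sum(1 << (d - 1) for d in dots))

print(cell({1, 3, 4, 5}))     # dots 1-3-4-5: the letter "n"
print(cell({1, 3, 4, 5, 7}))  # the same cell with dot 7 added (8-dot only)
```

The last two lines show why cursor tracking is harder on a 6-dot device: the underlined variant of a character simply has no 6-dot equivalent.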

The best way for me to describe the braille of the BrailleMe is that it's similar to signage braille. The dots are very well defined and pronounced and give a different feeling from traditional displays. It's not a bad thing in my opinion, just not what most users would be accustomed to feeling. As more affordable braille technologies develop, I'd expect to see, or feel, more variance in the types of braille these displays offer. Expect the BrailleMe to hit US shores this summer.

The Tiny and Flexible Brailliant BI14

For those who prefer a more traditional style of braille cell, Humanware is now taking orders for the Brailliant BI14, their new 14-cell braille display that sells for under $1,000. It includes many of the features you'd expect on a modern display including the ability to pair to multiple devices, a stopwatch and clock, and a built-in notes app. The notes app is intriguing because these notes can then be synced with your iPhone using the free Brailliant Sync app. The unit features 20 hours of battery life and weighs just over a half pound.

The Tiny and Flexible BrailleSense Polaris Mini

Speaking of small devices, HIMS was showing off a smaller brother to the BrailleSense Polaris Android-based notetaker. The 20-cell BrailleSense Polaris Mini includes the same software and features as its larger counterpart, along with modern conveniences like a USB-C port for charging and a Micro-HDMI port for sending the screen to an external device. It's also lighter, weighing in at under a pound. The only tradeoff will be the battery life, which is about 12 hours on the Polaris Mini, compared with 20 for the 32-cell Polaris. It's a marked improvement compared with the battery life of the previous-generation U2 Mini, however. It's available now for a preorder price of $3,995, $200 off the normal price for the unit and about $1,800 less than the larger Polaris model. Damian Pickering and Jenny Axler from HIMS discuss the Polaris Mini in a Blind Bargains interview.

An Update to the Slate and Stylus

Sometimes the coolest innovations are hidden in the back of the hall. A company called Overflow Biz was displaying a new take on the slate and stylus—an erasable, paperless slate called the Versa. The contraption includes four rows of text and operates like a normal slate but with one major difference. Instead of punching holes into paper, the slate creates braille dots on its reverse side, which can be felt by flipping it over. Buttons on the side of each row are pressed to erase what was written. Think of it as a braille whiteboard: a simple way to jot down names, phone numbers, or other items you need to remember for a short period of time or until you are able to transfer them to a computer or phone. The slate is a prototype for now and the hope is to sell it for under $100.

Dot Returns with an Improved Watch Model

We've seen the Dot, a 4-cell braille smartwatch, evolve over the past couple of years at the conference. This year the company has added some of the most requested and needed features to the unit including water resistance, a find-my-phone feature, and the ability to view notifications on the watch. In my limited testing, I still noticed problems with the braille dots, especially when keeping my fingers on the watch face. The dots often do not appear if fingers are covering the holes, even if the fingers are later moved away. This is problematic for features such as the watch's built-in timer, a feature that often requires constant viewing of the display. The company is also working on a full-page braille display called the Dot Pad, a model that packs over 20 lines of braille into a braille-paper-sized surface. The Dot Pad appeared to be an early prototype, and work will need to be done to bring it to market. Still, we are very encouraged by the multiple options for more affordable braille displays and their potential to reverse the downward trend of braille literacy. Find the Blind Bargains Dot interview here.

More Access in the Workplace

Cisco and HP were two companies I was surprised to find in the exhibit hall, but both were there for similar exciting reasons. Cisco is bringing accessibility to their IP Phone 8800 Series enterprise telephones through a free software update. These phones are typically used in offices and conference rooms and have become increasingly complex over the past several years. The update will be free for existing owners of these phones. The text-to-speech features will be available to all users of these devices and can be enabled independently by a blind employee once the software update has been installed.

Moving across the office, HP has created a box that can be connected to many of its enterprise-level printers. Like telephones, printers now generally feature rather involved menu systems that can be used to change parameters or check the health of the printer. HP's device won't be free but should be available soon to its enterprise customers.

Access to employment is often limited by inaccessible tools in the office, so it's encouraging to see mainstream companies step up with viable solutions that could lead to wider employment by people with visual impairments.

Accessible Package Delivery from Amazon Lockers

While much of the focus from Amazon's booth and suite was placed on their accessible Kindles, Fire TV products, and the Amazon Echo family of assistants, the company also demonstrated a way for customers who are blind to audibly locate their packages. Amazon Lockers are available in thousands of locations in many major cities and are a way for people to receive packages when they are not home. They're ideal for those living in apartment complexes or houses where package delivery is unreliable.

Amazon is rolling out voice guidance features to their lockers, which will give both instructions on how to retrieve your package and auditory guidance toward the correct locker. Similar to an ATM, the voice prompts are accessed through headphones, while the keypads have also been enhanced with braille and tactile markings. Currently, when searching for a locker on Amazon's website, there is no way to tell if a locker has been made accessible. Hopefully, this change can be made to ensure that customers who need this feature will be able to find lockers they can use until all lockers are accessible. Expect accessibility features to start rolling out across Amazon's network of lockers this summer. Shelly recorded an extensive interview with Amazon's Peter Korn that goes into more details on this and other company initiatives.

Conclusion

The CSUN Assistive Technology Conference has evolved into an event that serves the needs of website designers, accessibility consultants, access technology companies, and users of products and services for people with disabilities. It has gone more mainstream with the likes of Google, Amazon, and Microsoft having prominent presences. For 2019, the conference moves north to the Anaheim Marriott, a location that should give better access for international travelers. We'll see many of you there from March 11–15, and of course we'll write about the latest in technology right here in AccessWorld.

Comment on this article.

Related articles:

More from this author: