During the past few years, the community of people with visual impairments has become increasingly excited over the accessibility prospects of Google Glass, a pair of smart glasses that incorporates a camera, a heads-up display, and a live data connection to enhance the wearer's ability to interact with his or her environment. What many are not as familiar with, however, are the great strides already made by researchers working on similar technologies, and how those technologies can help people with visual impairments navigate and interact with their worlds on a more equal footing with their sighted contemporaries. This article discusses two such efforts: a pair of smart glasses under development by a team of researchers led by Oxford University's Stephen Hicks, and the digital eyewear, available for purchase as of October 2013, from Ottawa-based eSight Corporation.
eSight Corporation
For nearly 30 years, Canadian electrical engineer Conrad Lewis has made a point of keeping up with the latest access technologies. His two sisters, Julia and Anne, were both diagnosed in their 20s with Stargardt disease, an early-onset form of macular degeneration. Prompted by their diagnoses, Lewis—who began his professional career as a business executive and is now a venture investor—began bringing home new gadgets and pieces of access software he'd come across at trade shows and through his growing network of professional connections.
In the middle of the last decade, Lewis took note of the growing convergence of mobile processing power and lightweight, high-resolution video displays. Perhaps, he thought, he could leverage that convergence into a workable product that would enable his sisters to use their limited eyesight more effectively.
In 2007, Lewis founded eSight Corporation with the help of US and Canadian angel investors, along with grants from various foundations and government agencies. "Others had previously worked on head-mounted displays for the visually impaired, but they were too large and heavy, and didn't allow people to be mobile—not at all what Conrad had in mind," says Kevin Rankin, president and CEO of eSight Corporation, where Conrad Lewis currently serves as Chairman of the Board.
The device Lewis envisioned would also require much faster image processing than was available at the time of the company's founding. So he and his team of engineers set about writing and optimizing software, testing and customizing components, and building prototypes for two generations of eSight glasses. They completed their first pre-production model in mid-2012, and in October of 2013 began offering their eSight glasses for sale in the US and Canada.
eSight Glasses: How They Work
eSight glasses are about the size of a pair of wraparound sunglasses. They enable a user to magnify and view objects as close as 12 inches away or as far away as across the room, across the street, or across a field. A high-resolution video camera with zoom capabilities is built into the bridge, and a cable runs from one of the earpieces down to a hip-carried processing unit and power source. "The glasses are custom made using lenses ground to the wearer's own prescription," says Rankin. "These lenses are then overlaid with a transparent OLED (organic light emitting diode) display that can be user adjusted to fill their entire field of view, or just the upper portion, while allowing use of peripheral vision and awareness, and most importantly mobility." Think of a pair of bifocals: depending on whether the wearer directs his or her gaze through the upper or lower half of the lenses, he or she can choose between magnified, contrast-enhanced video and regular vision for any activity of daily living.
The eSight camera captures what's ahead and sends it to the processing unit, which is about the size of a large-screen smartphone and about twice as thick. There the images are processed frame by frame in real time. "The unit allows the user to adapt to their personal preferences and needs with two easy-to-use dial controls, including an up to fourteen times zoom, contrast, and various color adjustments to make the real-time image easier to see and enjoy," Rankin explains.
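Conceptually, that per-frame pipeline (digital zoom, contrast gain, and color adjustment applied to live video) can be sketched in a few lines. The Python/OpenCV sketch below is purely illustrative: the function name, parameter ranges, and webcam source are assumptions for demonstration, not eSight's proprietary implementation.

```python
import cv2
import numpy as np

def enhance_frame(frame, zoom=2.0, contrast=1.5, invert=False):
    """Apply zoom, contrast, and color adjustment to a single video frame.

    Hypothetical pipeline for illustration only; eSight's actual
    processing is proprietary. zoom emulates the up-to-14x zoom,
    contrast is a gain applied around mid-gray, and invert yields
    white letters on a black background for reading.
    """
    h, w = frame.shape[:2]
    # Digital zoom: crop the center 1/zoom of the frame, then scale back up.
    ch, cw = int(h / zoom), int(w / zoom)
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    zoomed = cv2.resize(frame[y0:y0 + ch, x0:x0 + cw], (w, h),
                        interpolation=cv2.INTER_LINEAR)
    # Contrast: scale pixel values about mid-gray and clip to the 8-bit range.
    adjusted = np.clip((zoomed.astype(np.float32) - 128.0) * contrast + 128.0,
                       0, 255).astype(np.uint8)
    # Color adjustment: optional inversion for high-contrast reading.
    return cv2.bitwise_not(adjusted) if invert else adjusted

cap = cv2.VideoCapture(0)  # a webcam stands in for the bridge-mounted camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("display", enhance_frame(frame, zoom=4.0, contrast=2.0))
    if cv2.waitKey(1) == 27:  # Esc exits
        break
cap.release()
cv2.destroyAllWindows()
```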
Lewis did not want his sisters and other users to have to constantly switch back and forth between their standard prescription glasses and their eSight digital eyewear. "The way we designed them, a wearer could rely on their own prescription lenses to navigate their living room or other familiar surroundings, switch to half screen mode with some magnification and enhanced contrast to watch television, or choose a full-magnification, full-screen mode to read a book with white letters on a black background," he says, adding, "eSight users are now sharing amazing stories of actually seeing all of the important details while shopping, walking through airports, being at school, and at work."
One User's Perspective
The eSight glasses went on sale last October, priced at $14,950. One of the first purchasers was Yvonne Felix, who lives with her husband and their two young sons in Hamilton, Ontario.
Yvonne was diagnosed at age 7 with Stargardt disease. By age 15 she could no longer see the drawings filled with fairies and unicorns she loved to create. "I'd have to finish them in a single sitting," she recalls. "Otherwise I'd lose my place."
In high school, teachers would not allow Yvonne to attend art class because they didn't know how to grade her work. They also discouraged her from assembling a portfolio and applying to art college. At 25 she applied anyway, was accepted, and after graduation became a community artist with two public installations to her credit: a public conversation area and a large magnifying glass that doubles as a sundial.
Yvonne read about the glasses in a Foundation Fighting Blindness newsletter and purchased a pair with the help of a generous private donor and several public fundraisers. "They brought them to my home to try," she relates. Yvonne did not wear prescription lenses, so her first test used full-screen magnification. The results were startling. "The very first thing I saw was my husband and my boys," she remembers. "They were beautiful. They looked just like I had always imagined."
At the time, Yvonne was completing a painting for a charity auction—an abstract depicting her blind spot. "When I saw it through the glasses, I wanted to redo the entire canvas," she says. "My mind's eye and my new eyes had a lot of getting to know each other to do."
Yvonne's vision was improved even more with the addition of prescription lenses. "Sometimes it's like my blind spot isn't even there, anymore," she says. "I can see the dials on the oven, and these days when the house gets dirty I notice it, which is a mixed blessing."
Yvonne's brother, William, also has Stargardt disease and is in the process of getting a pair of eSight glasses for himself. Her elder son, Noah, has also tried on Yvonne's glasses. "To him it's like a magic trick that lets me read print books to him at night," she says.
Smart Glasses from Assisted Vision
The benefits of eSight eyewear are limited mostly to people with partial sight between 20/60 and 20/400, which leaves out a considerable swath of individuals with much lower visual acuity. Happily, a small team of British researchers led by Stephen Hicks, PhD, Research Fellow in Visual Prosthetics in the Nuffield Department of Clinical Neuroscience at the University of Oxford, is well on its way to producing a different kind of device, one that could help individuals with usable vision below 20/400 identify objects and explore and navigate their environment more safely.
How Assisted Vision Smart Glasses Work
Like Conrad Lewis, Hicks saw the benefit of working with off-the-shelf technology. "It occurred to me an excellent starting point might be to pair object recognition software with a heads-up display," he explains.
In 2010 Hicks began working with LabVIEW, a graphical development environment from National Instruments that includes machine vision tools; the company later gave him an award for innovative use of its product. "Traffic signs were fairly easy to recognize, and they were also easy to set up in a lab," he says.
Hicks created signs the size of CD jewel cases and hung them on a wall about 4 meters—a bit more than 13 feet—from several individuals with vision less than 20/600. "Without enhancement, none of the subjects could pick out the signs," he recounts. "We trained a video camera on the wall and used the object recognition software to spot the sign. The image was processed and enhanced, then projected onto the heads-up display of a gaming helmet. Every one of the subjects could now see a patch of brightness in the direction of the wall where the sign was located."
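The lab setup reduces to a locate-then-highlight loop: find the sign in the camera frame, then render nothing but a bright patch at its location. Here is a minimal Python sketch of that idea, using OpenCV template matching as a stand-in for the LabVIEW-based recognition the team actually used; the function name and threshold are assumptions.

```python
import cv2
import numpy as np

def highlight_sign(frame, sign_template, threshold=0.7):
    """Locate a known sign in a camera frame; return a display image that
    is black everywhere except a bright patch at the sign's position.

    A stand-in for the recognition pipeline Hicks describes: template
    matching scores every location against a reference image of the
    sign, and the best match is shown only if it clears the threshold.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    scores = cv2.matchTemplate(gray, sign_template, cv2.TM_CCOEFF_NORMED)
    _, best, _, (x, y) = cv2.minMaxLoc(scores)
    display = np.zeros_like(gray)
    if best >= threshold:
        th, tw = sign_template.shape
        display[y:y + th, x:x + tw] = 255  # patch of brightness at the sign
    return display
```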
Hicks's proof-of-concept model used a single video camera, so there was no way to distinguish distant objects from nearby ones. In a happy happenstance, however, Microsoft introduced the Kinect at about that time: a gaming device that creates a 3D map of a room and identifies game players, tracking their movements and gestures. The Kinect uses a single video camera, but it also projects thousands of tiny infrared dots and uses their reflections, along with a complex set of algorithms, to calculate depth, much like radar or sonar.
"With Kinect we could create a 3D map of objects up to 20 feet away," says Hicks. "But we now had too much information. We not only had to identify objects, we had to figure out which objects were important and which were just background."
Hicks and his team solved the problem by taking a giant step backwards. "We stopped trying to identify the objects," he explains. "Instead of trying to pick out that table three feet ahead and tagging it as a table, we began simply presenting that table as an area of brightness, the closer the brighter." Hicks also simplified the image by removing the far-away back half and adding enhanced contrast controls. "Often all it takes is a tiny hint of where something fairly close is located to find a door or orient yourself inside a room," he says.
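That depth-to-brightness mapping is simple to express in code. The sketch below is a minimal Python/NumPy illustration, assuming a Kinect-style depth image in millimeters with zero marking invalid pixels; the 3-meter cutoff and the linear near-equals-bright ramp are illustrative choices, not the Oxford team's actual parameters.

```python
import numpy as np

def depth_to_brightness(depth_mm, max_range_mm=3000):
    """Render a depth map as brightness: the closer an object, the brighter.

    depth_mm: 2D array of per-pixel distances in millimeters, with 0
    marking pixels where no depth reading was obtained.
    max_range_mm: everything beyond this distance (the "back half" of
    the scene) is dropped to black, simplifying the image.
    """
    d = depth_mm.astype(np.float32)
    valid = (d > 0) & (d <= max_range_mm)
    # Invert distance so near = bright and far = dark, on a 0-255 scale.
    brightness = np.zeros_like(d)
    brightness[valid] = 255.0 * (1.0 - d[valid] / max_range_mm)
    return brightness.astype(np.uint8)

# A nearby "table" at 1 m against a wall at 4 m: the table renders at
# roughly two-thirds brightness, and the wall disappears entirely.
scene = np.full((240, 320), 4000, dtype=np.uint16)
scene[120:200, 100:220] = 1000
out = depth_to_brightness(scene)
print(out[160, 160], out[0, 0])  # -> 170 0
```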
Hicks and his team continued to refine and improve their smart glasses. "We didn't want to replace anyone's usable vision, we wanted to enhance it," Hicks says. Toward that end, they assembled a different sort of heads-up display using even more off-the-shelf technology. This new display projected the visual enhancements onto a transparent OLED screen. The wearer can use as much of his or her remaining sight as possible to identify that table, helped along by the device's brightness, contrast, and edge enhancements. "Hold a hand in front of your face and the image would show through the glasses, but with an aura of brightness at the edges to help identify it," Hicks explains.
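One common way to produce such an aura is standard edge detection followed by dilation to thicken the outline. The Python/OpenCV sketch below illustrates the idea under that assumption; it is not the prototype's actual method, and the Canny thresholds and kernel size are arbitrary.

```python
import cv2
import numpy as np

def edge_aura(frame):
    """Render only a bright halo along object edges, for a see-through display.

    Illustrative: Canny edge detection plus dilation stands in for
    whatever edge enhancement the Oxford prototype performs. On a
    transparent OLED, black pixels emit nothing, so sending only the
    thickened edges leaves the wearer's remaining vision unobstructed
    while outlining nearby objects, such as a hand held before the face.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)   # thin, one-pixel-wide edges
    kernel = np.ones((5, 5), np.uint8)
    return cv2.dilate(edges, kernel)   # widen the edges into a visible "aura"
```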
One User's Story
In August of 2013, University of London lecturer in French Dr. Hannah Thompson spent two hours with Hicks and his team testing out the glasses. "When I put them on I felt like a character in a science fiction novel," she relates in a blog post. "I was suddenly seeing the world in a completely different way. Objects which would have been impossible for me to see shone before my eyes in shades of pink and white.
"I found the glasses incredibly easy to use, and within minutes I was happily navigating my way around a series of obstacles. I would find these glasses especially useful at night, in glaring sunlight or in dappled shade. They would not only stop me from walking into things, they would also help me keep a watchful eye on my children, who are often the first things to disappear when light conditions affect my vision."
Indeed, light conditions are one of the few remaining hurdles Hicks and his team must overcome before they turn the device over to the engineers to miniaturize the components and incorporate them into an attractive, comfortable pair of eyeglass frames. "The infrared dots work well inside to fix position, but as soon as you step out into bright sunlight they wash out and become increasingly useless," Hicks explains. Hicks has engaged a British camera company to create an image-processing unit that works in bright light, generates real-time 3D maps, and is still small enough to fit on the bridge of a pair of glasses. "We could actually perform all of the processing on the glasses themselves," he adds, "but we will still need to use a separate power supply, because adding a battery would make the smart glasses too heavy for comfort."
Hicks has many other enhancements planned for the near future. "We've circled back around to our starting point with image recognition, which we could use to identify faces, signs, even headline text. Unfortunately, we can't use color as markers for identified objects, because many persons with extremely limited vision have lost their color perception. We could create blinking patterns, however, or play sound cues through headphones—perhaps bone-conducting headphones so we don't interfere with environmental sound cues."
Hicks is currently in the final months of a four-year pilot study, and by the end of 2014 he hopes to begin manufacturing and marketing his glasses through a startup company named Assisted Vision for approximately £500, a bit more than $800.
Contact Information
eSight eyewear
info@esightcorp.com
Phone: 855-837-4448
Assisted Vision
info@smart-specs.com