Envision glasses for the blind can read documents, scan faces, aid navigation

Touching the side of the Envision glasses opens a menu of options, such as scanning text or making a video call.


For Mike May, who is blind, navigating new spaces can be a challenge. A few weeks ago, he went to a work event at a brewery and had trouble figuring out where to go.

Fortunately, he had on him a pair of Envision smart glasses, which use artificial intelligence to help people who are blind or visually impaired better understand their surroundings. Using a small camera on the side, the glasses can scan objects, people and text, then relay that information through a small built-in speaker. Envision can tell you if someone is approaching, for example, or describe what’s in a room.

May was using a feature on the glasses called Ally, which lets him start video calls with friends and family for help.

“I called up a colleague of mine, Evelyn, and I said, ‘What do you see?’ and she described the surroundings to me,” said May, chief evangelist of accessible navigation company Goodmaps. “She told me where the tables were and just gave me the lay of the land.”

Envision glasses are based on the enterprise edition of Google Glass. (Yes, Google Glass is still alive.) Google first unveiled its smart glasses in 2013, pitching them as a way to take calls, send texts, take photos and view maps, among other things, directly from the headset. But after a limited, and unsuccessful, release, they never hit store shelves.

A few years later, Google started working on an enterprise edition of the glasses, which Envision is based on. Their portable design makes them ideal for capturing and relaying information as a user would see it.

“What Envision Glasses essentially does is take whatever visual information there is, try to process that information, and then tell it to the user,” says Karthik Kannan, co-founder of Envision.

There are a handful of other apps designed to help people who are blind or visually impaired, including Google’s Lookout app, which can identify food labels, find objects in a room, and scan documents and money. Be My Eyes is another app that connects blind or visually impaired people with sighted volunteers, who can help them get around via live chat.

But Envision’s goal is to make those experiences more intuitive. The headset design frees your hands so you can more easily hold a cane or walk a dog, and the camera sits right next to your eyes, so you don’t have to hold up a phone to scan your surroundings.

“It’s about non-stop narration when you’re walking down a busy street about signs on the side of a bus or taxi or on the side of a building or on the ground,” says May. “There’s this whole flow of information.”


Envision’s AI can read documents and letters.


The Envision glasses cost $3,500 and you can order them from the company’s website or from a distributor. Alternatively, you can choose to use the Envision app, which also analyzes text and informs you of your surroundings using your phone’s camera. The app costs $20 for a one-year subscription or $99 for a lifetime subscription.

To use the glasses, you'll need to open the Envision app and pair the glasses via Bluetooth, then connect them to Wi-Fi. You only have to do this once; after that, you won't even need to carry your phone around for the glasses to work. To teach Envision to recognize faces, ask people to take selfies in the app and then enter their names. From then on, the glasses speak a person's name out loud whenever that person is in frame.
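Conceptually, that face-teaching step pairs a stored face representation from the selfie with a name, then matches new camera frames against the stored entries. Below is a minimal, hypothetical sketch of that idea, not Envision's actual code: the embedding vectors stand in for what a real face-recognition model would compute from photos, and the class name and threshold are invented for illustration.

```python
import numpy as np

class FaceRegistry:
    """Hypothetical sketch of a face-teaching flow: enroll (name, embedding)
    pairs, then match new frames by cosine similarity."""

    def __init__(self, threshold=0.8):
        self.people = []          # list of (name, unit-normalized embedding)
        self.threshold = threshold

    def enroll(self, name, embedding):
        """Store a person's name with the embedding from their selfie."""
        v = np.asarray(embedding, dtype=float)
        self.people.append((name, v / np.linalg.norm(v)))

    def identify(self, embedding):
        """Return the best-matching name, or None if no match is close enough."""
        v = np.asarray(embedding, dtype=float)
        v = v / np.linalg.norm(v)
        best_name, best_score = None, self.threshold
        for name, known in self.people:
            score = float(np.dot(known, v))   # cosine similarity
            if score > best_score:
                best_name, best_score = name, score
        return best_name

registry = FaceRegistry()
registry.enroll("Evelyn", [0.9, 0.1, 0.2])
registry.enroll("Mike", [0.1, 0.95, 0.1])
print(registry.identify([0.88, 0.12, 0.21]))  # a frame resembling Evelyn
```

In a real system the embeddings would come from a face-recognition model run on the selfie and on each camera frame; the registry logic shown here only illustrates the enroll-then-match pattern the glasses rely on.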

The company’s goal is to bring other apps to the glasses like Aira, a service that connects people to trained agents who can see their surroundings using their phone’s camera. An integration with Envision would mean users could connect to Aira directly from the glasses instead. Envision is also in talks with navigation apps to try bringing their services to the glasses.

“Anyone who is in the assistive technology space and building apps can easily access Envision glasses and create as well,” says Kannan.

May, who says technology makes him feel like “a kid in a candy store,” says he loves the independence Envision provides.

“I really like the feeling that a kid gets, which is, ‘Aha, I did it myself.’”
