Mike May, who is blind, finds navigating new environments difficult. When he went to a company event at a brewery a few weeks ago, he couldn’t figure out where to go.
Fortunately, he was wearing a pair of Envision smart glasses, which use artificial intelligence to help blind and visually impaired people better perceive their surroundings. The glasses scan objects, people, and text with a small camera on the side, then relay the information through a small built-in speaker. Envision can, for example, tell you if someone is approaching or describe what’s in a room.
At the brewery, May called for assistance via Ally, a feature on the glasses that lets him start video calls with friends and family.
May, chief evangelist at the accessible-navigation company GoodMaps, said, “I called up one of my colleagues, Evelyn, and said, ‘What do you see?’ and she explained the surroundings to me. She told me where the tables were and just gave me the lay of the land.”
Envision Glasses are built on Google Glass Enterprise Edition. (Yes, Google Glass is still alive and well.) Google first announced the smart glasses in 2013, pitching them as a way for users to make calls, send texts, take photos, and look at maps, all from the headset. After a limited and unsuccessful rollout, they never reached store shelves.
A few years later, Google began developing an enterprise edition of the glasses, which is what Envision builds on. Their wearable form factor makes them well suited to capturing and relaying information exactly as a user would see it.
“What Envision Glasses really does is take in all of the visual information that’s around, tries to process it, and then speaks it out to the user,” explains Karthik Kannan, co-founder of Envision.
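Kannan’s description maps onto a simple capture-process-speak loop. The minimal sketch below is purely illustrative, not Envision’s actual implementation: it assumes off-the-shelf libraries (opencv-python for the camera, pytesseract for OCR, and pyttsx3 for text-to-speech) to show the general shape of such a pipeline.

```python
# Illustrative capture -> process -> speak loop. This is NOT Envision's code;
# it is a sketch assuming opencv-python, pytesseract (with the Tesseract
# binary installed), and pyttsx3 are available.
import cv2
import pytesseract
import pyttsx3


def describe_frame(frame) -> str:
    """Extract any readable text from a camera frame via OCR."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    text = pytesseract.image_to_string(gray).strip()
    return text if text else "No readable text in view."


def main():
    camera = cv2.VideoCapture(0)   # stand-in for the glasses' side-mounted camera
    speaker = pyttsx3.init()       # stand-in for the built-in speaker
    ok, frame = camera.read()      # "take in the visual information"
    if ok:
        description = describe_frame(frame)  # "process it"
        speaker.say(description)             # "speak it out to the user"
        speaker.runAndWait()
    camera.release()


if __name__ == "__main__":
    main()
```

A production system would run this continuously and add object and face recognition alongside OCR, but the same three stages apply.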