Today, Envision announced that it has packed its high-tech spectacles with brand spankin’ new, eye-catching features designed to enhance the day-to-day lives of low-vision users.


Envision updates its AI-powered smart glasses custom-made for low-vision people

Envision’s AI-powered smart glasses, custom-made for low-vision people, are built on Google Glass Enterprise Edition 2. Since the device debuted in 2020, Envision claims its high-tech spectacles have changed the lives of hundreds of blind and visually impaired people worldwide.

I know what you’re thinking: “How do the Envision AI-powered smart glasses work?” As the name suggests, they rely on artificial intelligence to extract information from targeted images and text, then tell the user what they “see.” This lets visually impaired users read work documents, recognize loved ones, find nearby personal belongings, use public transportation and more.

The AI spectacles pair with a companion app on Android and iOS. The glasses can read and translate any type of text, handwritten or digital, from any surface, whether it’s a computer display, a timetable, a food label or a poster. They also recognize objects, colors and faces. Heck, they can even describe scenes for users. For example, if a birthday cake with burning candles is placed in front of the user, the AI-powered smart glasses will describe what they “see.”
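For the technically curious, the core loop is conceptually simple: capture an image, run optical character recognition (OCR) on it, then speak the result aloud. The minimal Python sketch below illustrates that general pipeline using the open-source Tesseract engine (via pytesseract) and the pyttsx3 text-to-speech library. It is a generic illustration only, not Envision’s own software, whose code isn’t public.

```python
# A minimal, generic sketch of the "read text aloud" idea described above.
# NOT Envision's actual software or SDK; it uses open-source Tesseract OCR
# (via pytesseract) and pyttsx3 purely to show the image -> text -> speech flow.

from PIL import Image
import pytesseract  # pip install pytesseract (requires the Tesseract binary)
import pyttsx3      # pip install pyttsx3

def read_image_aloud(image_path: str) -> str:
    """Extract text from an image and speak it back to the user."""
    # 1. Capture: load the frame the camera grabbed (here, a saved file).
    frame = Image.open(image_path)

    # 2. Recognize: run OCR to turn pixels into text.
    text = pytesseract.image_to_string(frame)

    # 3. Speak: hand the recognized text to a text-to-speech engine.
    engine = pyttsx3.init()
    engine.say(text if text.strip() else "No readable text found.")
    engine.runAndWait()
    return text

if __name__ == "__main__":
    read_image_aloud("food_label.jpg")  # hypothetical example image
```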

“Our mission is to improve the lives of the world’s two billion people who are blind or visually impaired by providing them with life-changing assistive technologies, products and services,” said Karthik Kannan, co-founder of Envision. “By analyzing real time user data and direct feedback from across our communities, we are able to constantly enrich the Envision experience and innovate our products.”

As mentioned, Envision announced the next generation of its AI-powered smart glasses. Here are the new and updated features added to the assistive, high-tech spectacles:

Document Guidance for Accurate Capture – Eliminates the frustration of taking several images to capture a document’s entire text. Enhanced document guidance offers verbal instructions that help users position documents for optimal scanning, so a document can be captured in a single motion.

Layout Detection – The Envision smart glasses now put documents into context for users via verbal guidance, recognizing photo captions, headers and more (a rough sketch of the idea follows this list).

Enhanced Offline Language Capabilities – Envision added four additional languages: Japanese, Hindi, Chinese and Korean. That brings the total number of languages supported offline to 26; when connected, the figure climbs to over 60.

Third-Party App Development Support – Developers can now participate in Envision’s third-party ecosystem, building apps and other services that add value to the AI-powered smart glasses. Thanks to a partnership with the Cash Reader app, Envision can now recognize over 100 currencies.

Building Ally – The Ally function, the smart glasses’ most popular feature, lets users communicate with trusted contacts via video conferencing. The new-and-improved version is now optimized for mobile networks and WiFi hotspots.

Optimizing Optical Character Recognition (OCR) – Envision significantly improved image capture and interpretation accuracy.
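As a rough illustration of how layout detection differs from plain text extraction, the sketch below (again built on open-source Tesseract rather than Envision’s own code) groups OCR output by layout block, so a heading, a photo caption and a body paragraph can each be announced as its own region:

```python
# A rough illustration of the layout-detection idea using open-source
# Tesseract via pytesseract (not Envision's software): rather than reading a
# page as one undifferentiated blob, OCR output is grouped into layout blocks
# so each region (heading, caption, body paragraph) can be announced in order.

from PIL import Image
import pytesseract
from pytesseract import Output

def blocks_in_reading_order(image_path: str) -> list[str]:
    """Return recognized text grouped by layout block, top to bottom."""
    data = pytesseract.image_to_data(Image.open(image_path),
                                     output_type=Output.DICT)

    blocks: dict[int, list[str]] = {}
    for block_num, word in zip(data["block_num"], data["text"]):
        if word.strip():
            blocks.setdefault(block_num, []).append(word)

    # Block numbers come back roughly in reading order.
    return [" ".join(words) for _, words in sorted(blocks.items())]

if __name__ == "__main__":
    for i, block in enumerate(blocks_in_reading_order("timetable.jpg"), 1):
        print(f"Region {i}: {block}")  # a screen reader could speak each region
```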

The AI-powered smart glasses have a suggested retail price of $3,500; they can be purchased directly from Envision or via its global distributor network.
